Dec 15 01:42:15 localhost kernel: Linux version 5.14.0-284.11.1.el9_2.x86_64 (mockbuild@x86-vm-09.build.eng.bos.redhat.com) (gcc (GCC) 11.3.1 20221121 (Red Hat 11.3.1-4), GNU ld version 2.35.2-37.el9) #1 SMP PREEMPT_DYNAMIC Wed Apr 12 10:45:03 EDT 2023
Dec 15 01:42:15 localhost kernel: The list of certified hardware and cloud instances for Red Hat Enterprise Linux 9 can be viewed at the Red Hat Ecosystem Catalog, https://catalog.redhat.com.
Dec 15 01:42:15 localhost kernel: Command line: BOOT_IMAGE=(hd0,gpt3)/vmlinuz-5.14.0-284.11.1.el9_2.x86_64 root=UUID=a3dd82de-ffc6-4652-88b9-80e003b8f20a console=tty0 console=ttyS0,115200n8 no_timer_check net.ifnames=0 crashkernel=1G-4G:192M,4G-64G:256M,64G-:512M
Dec 15 01:42:15 localhost kernel: x86/fpu: Supporting XSAVE feature 0x001: 'x87 floating point registers'
Dec 15 01:42:15 localhost kernel: x86/fpu: Supporting XSAVE feature 0x002: 'SSE registers'
Dec 15 01:42:15 localhost kernel: x86/fpu: Supporting XSAVE feature 0x004: 'AVX registers'
Dec 15 01:42:15 localhost kernel: x86/fpu: xstate_offset[2]: 576, xstate_sizes[2]: 256
Dec 15 01:42:15 localhost kernel: x86/fpu: Enabled xstate features 0x7, context size is 832 bytes, using 'standard' format.
Dec 15 01:42:15 localhost kernel: signal: max sigframe size: 1776
Dec 15 01:42:15 localhost kernel: BIOS-provided physical RAM map:
Dec 15 01:42:15 localhost kernel: BIOS-e820: [mem 0x0000000000000000-0x000000000009fbff] usable
Dec 15 01:42:15 localhost kernel: BIOS-e820: [mem 0x000000000009fc00-0x000000000009ffff] reserved
Dec 15 01:42:15 localhost kernel: BIOS-e820: [mem 0x00000000000f0000-0x00000000000fffff] reserved
Dec 15 01:42:15 localhost kernel: BIOS-e820: [mem 0x0000000000100000-0x00000000bffdafff] usable
Dec 15 01:42:15 localhost kernel: BIOS-e820: [mem 0x00000000bffdb000-0x00000000bfffffff] reserved
Dec 15 01:42:15 localhost kernel: BIOS-e820: [mem 0x00000000feffc000-0x00000000feffffff] reserved
Dec 15 01:42:15 localhost kernel: BIOS-e820: [mem 0x00000000fffc0000-0x00000000ffffffff] reserved
Dec 15 01:42:15 localhost kernel: BIOS-e820: [mem 0x0000000100000000-0x000000043fffffff] usable
Dec 15 01:42:15 localhost kernel: NX (Execute Disable) protection: active
Dec 15 01:42:15 localhost kernel: SMBIOS 2.8 present.
Dec 15 01:42:15 localhost kernel: DMI: OpenStack Foundation OpenStack Nova, BIOS 1.15.0-1 04/01/2014
Dec 15 01:42:15 localhost kernel: Hypervisor detected: KVM
Dec 15 01:42:15 localhost kernel: kvm-clock: Using msrs 4b564d01 and 4b564d00
Dec 15 01:42:15 localhost kernel: kvm-clock: using sched offset of 1910978031 cycles
Dec 15 01:42:15 localhost kernel: clocksource: kvm-clock: mask: 0xffffffffffffffff max_cycles: 0x1cd42e4dffb, max_idle_ns: 881590591483 ns
Dec 15 01:42:15 localhost kernel: tsc: Detected 2799.998 MHz processor
Dec 15 01:42:15 localhost kernel: last_pfn = 0x440000 max_arch_pfn = 0x400000000
Dec 15 01:42:15 localhost kernel: x86/PAT: Configuration [0-7]: WB WC UC- UC WB WP UC- WT
Dec 15 01:42:15 localhost kernel: last_pfn = 0xbffdb max_arch_pfn = 0x400000000
Dec 15 01:42:15 localhost kernel: found SMP MP-table at [mem 0x000f5ae0-0x000f5aef]
Dec 15 01:42:15 localhost kernel: Using GB pages for direct mapping
Dec 15 01:42:15 localhost kernel: RAMDISK: [mem 0x2eef4000-0x33771fff]
Dec 15 01:42:15 localhost kernel: ACPI: Early table checksum verification disabled
Dec 15 01:42:15 localhost kernel: ACPI: RSDP 0x00000000000F5AA0 000014 (v00 BOCHS )
Dec 15 01:42:15 localhost kernel: ACPI: RSDT 0x00000000BFFE16BD 000030 (v01 BOCHS BXPC 00000001 BXPC 00000001)
Dec 15 01:42:15 localhost kernel: ACPI: FACP 0x00000000BFFE1571 000074 (v01 BOCHS BXPC 00000001 BXPC 00000001)
Dec 15 01:42:15 localhost kernel: ACPI: DSDT 0x00000000BFFDFC80 0018F1 (v01 BOCHS BXPC 00000001 BXPC 00000001)
Dec 15 01:42:15 localhost kernel: ACPI: FACS 0x00000000BFFDFC40 000040
Dec 15 01:42:15 localhost kernel: ACPI: APIC 0x00000000BFFE15E5 0000B0 (v01 BOCHS BXPC 00000001 BXPC 00000001)
Dec 15 01:42:15 localhost kernel: ACPI: WAET 0x00000000BFFE1695 000028 (v01 BOCHS BXPC 00000001 BXPC 00000001)
Dec 15 01:42:15 localhost kernel: ACPI: Reserving FACP table memory at [mem 0xbffe1571-0xbffe15e4]
Dec 15 01:42:15 localhost kernel: ACPI: Reserving DSDT table memory at [mem 0xbffdfc80-0xbffe1570]
Dec 15 01:42:15 localhost kernel: ACPI: Reserving FACS table memory at [mem 0xbffdfc40-0xbffdfc7f]
Dec 15 01:42:15 localhost kernel: ACPI: Reserving APIC table memory at [mem 0xbffe15e5-0xbffe1694]
Dec 15 01:42:15 localhost kernel: ACPI: Reserving WAET table memory at [mem 0xbffe1695-0xbffe16bc]
Dec 15 01:42:15 localhost kernel: No NUMA configuration found
Dec 15 01:42:15 localhost kernel: Faking a node at [mem 0x0000000000000000-0x000000043fffffff]
Dec 15 01:42:15 localhost kernel: NODE_DATA(0) allocated [mem 0x43ffd3000-0x43fffdfff]
Dec 15 01:42:15 localhost kernel: Reserving 256MB of memory at 2800MB for crashkernel (System RAM: 16383MB)
Dec 15 01:42:15 localhost kernel: Zone ranges:
Dec 15 01:42:15 localhost kernel: DMA [mem 0x0000000000001000-0x0000000000ffffff]
Dec 15 01:42:15 localhost kernel: DMA32 [mem 0x0000000001000000-0x00000000ffffffff]
Dec 15 01:42:15 localhost kernel: Normal [mem 0x0000000100000000-0x000000043fffffff]
Dec 15 01:42:15 localhost kernel: Device empty
Dec 15 01:42:15 localhost kernel: Movable zone start for each node
Dec 15 01:42:15 localhost kernel: Early memory node ranges
Dec 15 01:42:15 localhost kernel: node 0: [mem 0x0000000000001000-0x000000000009efff]
Dec 15 01:42:15 localhost kernel: node 0: [mem 0x0000000000100000-0x00000000bffdafff]
Dec 15 01:42:15 localhost kernel: node 0: [mem 0x0000000100000000-0x000000043fffffff]
Dec 15 01:42:15 localhost kernel: Initmem setup node 0 [mem 0x0000000000001000-0x000000043fffffff]
Dec 15 01:42:15 localhost kernel: On node 0, zone DMA: 1 pages in unavailable ranges
Dec 15 01:42:15 localhost kernel: On node 0, zone DMA: 97 pages in unavailable ranges
Dec 15 01:42:15 localhost kernel: On node 0, zone Normal: 37 pages in unavailable ranges
Dec 15 01:42:15 localhost kernel: ACPI: PM-Timer IO Port: 0x608
Dec 15 01:42:15 localhost kernel: ACPI: LAPIC_NMI (acpi_id[0xff] dfl dfl lint[0x1])
Dec 15 01:42:15 localhost kernel: IOAPIC[0]: apic_id 0, version 17, address 0xfec00000, GSI 0-23
Dec 15 01:42:15 localhost kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 0 global_irq 2 dfl dfl)
Dec 15 01:42:15 localhost kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 5 global_irq 5 high level)
Dec 15 01:42:15 localhost kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 9 global_irq 9 high level)
Dec 15 01:42:15 localhost kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 10 global_irq 10 high level)
Dec 15 01:42:15 localhost kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 11 global_irq 11 high level)
Dec 15 01:42:15 localhost kernel: ACPI: Using ACPI (MADT) for SMP configuration information
Dec 15 01:42:15 localhost kernel: TSC deadline timer available
Dec 15 01:42:15 localhost kernel: smpboot: Allowing 8 CPUs, 0 hotplug CPUs
Dec 15 01:42:15 localhost kernel: PM: hibernation: Registered nosave memory: [mem 0x00000000-0x00000fff]
Dec 15 01:42:15 localhost kernel: PM: hibernation: Registered nosave memory: [mem 0x0009f000-0x0009ffff]
Dec 15 01:42:15 localhost kernel: PM: hibernation: Registered nosave memory: [mem 0x000a0000-0x000effff]
Dec 15 01:42:15 localhost kernel: PM: hibernation: Registered nosave memory: [mem 0x000f0000-0x000fffff]
Dec 15 01:42:15 localhost kernel: PM: hibernation: Registered nosave memory: [mem 0xbffdb000-0xbfffffff]
Dec 15 01:42:15 localhost kernel: PM: hibernation: Registered nosave memory: [mem 0xc0000000-0xfeffbfff]
Dec 15 01:42:15 localhost kernel: PM: hibernation: Registered nosave memory: [mem 0xfeffc000-0xfeffffff]
Dec 15 01:42:15 localhost kernel: PM: hibernation: Registered nosave memory: [mem 0xff000000-0xfffbffff]
Dec 15 01:42:15 localhost kernel: PM: hibernation: Registered nosave memory: [mem 0xfffc0000-0xffffffff]
Dec 15 01:42:15 localhost kernel: [mem 0xc0000000-0xfeffbfff] available for PCI devices
Dec 15 01:42:15 localhost kernel: Booting paravirtualized kernel on KVM
Dec 15 01:42:15 localhost kernel: clocksource: refined-jiffies: mask: 0xffffffff max_cycles: 0xffffffff, max_idle_ns: 1910969940391419 ns
Dec 15 01:42:15 localhost kernel: setup_percpu: NR_CPUS:8192 nr_cpumask_bits:8 nr_cpu_ids:8 nr_node_ids:1
Dec 15 01:42:15 localhost kernel: percpu: Embedded 55 pages/cpu s188416 r8192 d28672 u262144
Dec 15 01:42:15 localhost kernel: kvm-guest: PV spinlocks disabled, no host support
Dec 15 01:42:15 localhost kernel: Fallback order for Node 0: 0
Dec 15 01:42:15 localhost kernel: Built 1 zonelists, mobility grouping on. Total pages: 4128475
Dec 15 01:42:15 localhost kernel: Policy zone: Normal
Dec 15 01:42:15 localhost kernel: Kernel command line: BOOT_IMAGE=(hd0,gpt3)/vmlinuz-5.14.0-284.11.1.el9_2.x86_64 root=UUID=a3dd82de-ffc6-4652-88b9-80e003b8f20a console=tty0 console=ttyS0,115200n8 no_timer_check net.ifnames=0 crashkernel=1G-4G:192M,4G-64G:256M,64G-:512M
Dec 15 01:42:15 localhost kernel: Unknown kernel command line parameters "BOOT_IMAGE=(hd0,gpt3)/vmlinuz-5.14.0-284.11.1.el9_2.x86_64", will be passed to user space.
Dec 15 01:42:15 localhost kernel: Dentry cache hash table entries: 2097152 (order: 12, 16777216 bytes, linear)
Dec 15 01:42:15 localhost kernel: Inode-cache hash table entries: 1048576 (order: 11, 8388608 bytes, linear)
Dec 15 01:42:15 localhost kernel: mem auto-init: stack:off, heap alloc:off, heap free:off
Dec 15 01:42:15 localhost kernel: software IO TLB: area num 8.
Dec 15 01:42:15 localhost kernel: Memory: 2826284K/16776676K available (14342K kernel code, 5536K rwdata, 10180K rodata, 2792K init, 7524K bss, 741268K reserved, 0K cma-reserved)
Dec 15 01:42:15 localhost kernel: random: get_random_u64 called from kmem_cache_open+0x1e/0x210 with crng_init=0
Dec 15 01:42:15 localhost kernel: SLUB: HWalign=64, Order=0-3, MinObjects=0, CPUs=8, Nodes=1
Dec 15 01:42:15 localhost kernel: ftrace: allocating 44803 entries in 176 pages
Dec 15 01:42:15 localhost kernel: ftrace: allocated 176 pages with 3 groups
Dec 15 01:42:15 localhost kernel: Dynamic Preempt: voluntary
Dec 15 01:42:15 localhost kernel: rcu: Preemptible hierarchical RCU implementation.
Dec 15 01:42:15 localhost kernel: rcu: #011RCU restricting CPUs from NR_CPUS=8192 to nr_cpu_ids=8.
Dec 15 01:42:15 localhost kernel: #011Trampoline variant of Tasks RCU enabled.
Dec 15 01:42:15 localhost kernel: #011Rude variant of Tasks RCU enabled.
Dec 15 01:42:15 localhost kernel: #011Tracing variant of Tasks RCU enabled.
Dec 15 01:42:15 localhost kernel: rcu: RCU calculated value of scheduler-enlistment delay is 100 jiffies.
Dec 15 01:42:15 localhost kernel: rcu: Adjusting geometry for rcu_fanout_leaf=16, nr_cpu_ids=8
Dec 15 01:42:15 localhost kernel: NR_IRQS: 524544, nr_irqs: 488, preallocated irqs: 16
Dec 15 01:42:15 localhost kernel: rcu: srcu_init: Setting srcu_struct sizes based on contention.
Dec 15 01:42:15 localhost kernel: kfence: initialized - using 2097152 bytes for 255 objects at 0x(____ptrval____)-0x(____ptrval____)
Dec 15 01:42:15 localhost kernel: random: crng init done (trusting CPU's manufacturer)
Dec 15 01:42:15 localhost kernel: Console: colour VGA+ 80x25
Dec 15 01:42:15 localhost kernel: printk: console [tty0] enabled
Dec 15 01:42:15 localhost kernel: printk: console [ttyS0] enabled
Dec 15 01:42:15 localhost kernel: ACPI: Core revision 20211217
Dec 15 01:42:15 localhost kernel: APIC: Switch to symmetric I/O mode setup
Dec 15 01:42:15 localhost kernel: x2apic enabled
Dec 15 01:42:15 localhost kernel: Switched APIC routing to physical x2apic.
Dec 15 01:42:15 localhost kernel: tsc: Marking TSC unstable due to TSCs unsynchronized
Dec 15 01:42:15 localhost kernel: Calibrating delay loop (skipped) preset value.. 5599.99 BogoMIPS (lpj=2799998)
Dec 15 01:42:15 localhost kernel: pid_max: default: 32768 minimum: 301
Dec 15 01:42:15 localhost kernel: LSM: Security Framework initializing
Dec 15 01:42:15 localhost kernel: Yama: becoming mindful.
Dec 15 01:42:15 localhost kernel: SELinux: Initializing.
Dec 15 01:42:15 localhost kernel: LSM support for eBPF active
Dec 15 01:42:15 localhost kernel: Mount-cache hash table entries: 32768 (order: 6, 262144 bytes, linear)
Dec 15 01:42:15 localhost kernel: Mountpoint-cache hash table entries: 32768 (order: 6, 262144 bytes, linear)
Dec 15 01:42:15 localhost kernel: x86/cpu: User Mode Instruction Prevention (UMIP) activated
Dec 15 01:42:15 localhost kernel: Last level iTLB entries: 4KB 512, 2MB 255, 4MB 127
Dec 15 01:42:15 localhost kernel: Last level dTLB entries: 4KB 512, 2MB 255, 4MB 127, 1GB 0
Dec 15 01:42:15 localhost kernel: Spectre V1 : Mitigation: usercopy/swapgs barriers and __user pointer sanitization
Dec 15 01:42:15 localhost kernel: Spectre V2 : Mitigation: Retpolines
Dec 15 01:42:15 localhost kernel: Spectre V2 : Spectre v2 / SpectreRSB mitigation: Filling RSB on context switch
Dec 15 01:42:15 localhost kernel: Spectre V2 : Spectre v2 / SpectreRSB : Filling RSB on VMEXIT
Dec 15 01:42:15 localhost kernel: Spectre V2 : Enabling Speculation Barrier for firmware calls
Dec 15 01:42:15 localhost kernel: RETBleed: Mitigation: untrained return thunk
Dec 15 01:42:15 localhost kernel: Spectre V2 : mitigation: Enabling conditional Indirect Branch Prediction Barrier
Dec 15 01:42:15 localhost kernel: Speculative Store Bypass: Mitigation: Speculative Store Bypass disabled via prctl
Dec 15 01:42:15 localhost kernel: Freeing SMP alternatives memory: 36K
Dec 15 01:42:15 localhost kernel: smpboot: CPU0: AMD EPYC-Rome Processor (family: 0x17, model: 0x31, stepping: 0x0)
Dec 15 01:42:15 localhost kernel: cblist_init_generic: Setting adjustable number of callback queues.
Dec 15 01:42:15 localhost kernel: cblist_init_generic: Setting shift to 3 and lim to 1.
Dec 15 01:42:15 localhost kernel: cblist_init_generic: Setting shift to 3 and lim to 1.
Dec 15 01:42:15 localhost kernel: cblist_init_generic: Setting shift to 3 and lim to 1.
Dec 15 01:42:15 localhost kernel: Performance Events: Fam17h+ core perfctr, AMD PMU driver.
Dec 15 01:42:15 localhost kernel: ... version: 0
Dec 15 01:42:15 localhost kernel: ... bit width: 48
Dec 15 01:42:15 localhost kernel: ... generic registers: 6
Dec 15 01:42:15 localhost kernel: ... value mask: 0000ffffffffffff
Dec 15 01:42:15 localhost kernel: ... max period: 00007fffffffffff
Dec 15 01:42:15 localhost kernel: ... fixed-purpose events: 0
Dec 15 01:42:15 localhost kernel: ... event mask: 000000000000003f
Dec 15 01:42:15 localhost kernel: rcu: Hierarchical SRCU implementation.
Dec 15 01:42:15 localhost kernel: rcu: #011Max phase no-delay instances is 400.
Dec 15 01:42:15 localhost kernel: smp: Bringing up secondary CPUs ...
Dec 15 01:42:15 localhost kernel: x86: Booting SMP configuration:
Dec 15 01:42:15 localhost kernel: .... node #0, CPUs: #1 #2 #3 #4 #5 #6 #7
Dec 15 01:42:15 localhost kernel: smp: Brought up 1 node, 8 CPUs
Dec 15 01:42:15 localhost kernel: smpboot: Max logical packages: 8
Dec 15 01:42:15 localhost kernel: smpboot: Total of 8 processors activated (44799.96 BogoMIPS)
Dec 15 01:42:15 localhost kernel: node 0 deferred pages initialised in 21ms
Dec 15 01:42:15 localhost kernel: devtmpfs: initialized
Dec 15 01:42:15 localhost kernel: x86/mm: Memory block size: 128MB
Dec 15 01:42:15 localhost kernel: clocksource: jiffies: mask: 0xffffffff max_cycles: 0xffffffff, max_idle_ns: 1911260446275000 ns
Dec 15 01:42:15 localhost kernel: futex hash table entries: 2048 (order: 5, 131072 bytes, linear)
Dec 15 01:42:15 localhost kernel: pinctrl core: initialized pinctrl subsystem
Dec 15 01:42:15 localhost kernel: NET: Registered PF_NETLINK/PF_ROUTE protocol family
Dec 15 01:42:15 localhost kernel: DMA: preallocated 2048 KiB GFP_KERNEL pool for atomic allocations
Dec 15 01:42:15 localhost kernel: DMA: preallocated 2048 KiB GFP_KERNEL|GFP_DMA pool for atomic allocations
Dec 15 01:42:15 localhost kernel: DMA: preallocated 2048 KiB GFP_KERNEL|GFP_DMA32 pool for atomic allocations
Dec 15 01:42:15 localhost kernel: audit: initializing netlink subsys (disabled)
Dec 15 01:42:15 localhost kernel: audit: type=2000 audit(1765780933.628:1): state=initialized audit_enabled=0 res=1
Dec 15 01:42:15 localhost kernel: thermal_sys: Registered thermal governor 'fair_share'
Dec 15 01:42:15 localhost kernel: thermal_sys: Registered thermal governor 'step_wise'
Dec 15 01:42:15 localhost kernel: thermal_sys: Registered thermal governor 'user_space'
Dec 15 01:42:15 localhost kernel: cpuidle: using governor menu
Dec 15 01:42:15 localhost kernel: HugeTLB: can optimize 4095 vmemmap pages for hugepages-1048576kB
Dec 15 01:42:15 localhost kernel: acpiphp: ACPI Hot Plug PCI Controller Driver version: 0.5
Dec 15 01:42:15 localhost kernel: PCI: Using configuration type 1 for base access
Dec 15 01:42:15 localhost kernel: PCI: Using configuration type 1 for extended access
Dec 15 01:42:15 localhost kernel: kprobes: kprobe jump-optimization is enabled. All kprobes are optimized if possible.
Dec 15 01:42:15 localhost kernel: HugeTLB: can optimize 7 vmemmap pages for hugepages-2048kB
Dec 15 01:42:15 localhost kernel: HugeTLB registered 1.00 GiB page size, pre-allocated 0 pages
Dec 15 01:42:15 localhost kernel: HugeTLB registered 2.00 MiB page size, pre-allocated 0 pages
Dec 15 01:42:15 localhost kernel: cryptd: max_cpu_qlen set to 1000
Dec 15 01:42:15 localhost kernel: ACPI: Added _OSI(Module Device)
Dec 15 01:42:15 localhost kernel: ACPI: Added _OSI(Processor Device)
Dec 15 01:42:15 localhost kernel: ACPI: Added _OSI(3.0 _SCP Extensions)
Dec 15 01:42:15 localhost kernel: ACPI: Added _OSI(Processor Aggregator Device)
Dec 15 01:42:15 localhost kernel: ACPI: Added _OSI(Linux-Dell-Video)
Dec 15 01:42:15 localhost kernel: ACPI: Added _OSI(Linux-Lenovo-NV-HDMI-Audio)
Dec 15 01:42:15 localhost kernel: ACPI: Added _OSI(Linux-HPI-Hybrid-Graphics)
Dec 15 01:42:15 localhost kernel: ACPI: 1 ACPI AML tables successfully acquired and loaded
Dec 15 01:42:15 localhost kernel: ACPI: Interpreter enabled
Dec 15 01:42:15 localhost kernel: ACPI: PM: (supports S0 S3 S4 S5)
Dec 15 01:42:15 localhost kernel: ACPI: Using IOAPIC for interrupt routing
Dec 15 01:42:15 localhost kernel: PCI: Using host bridge windows from ACPI; if necessary, use "pci=nocrs" and report a bug
Dec 15 01:42:15 localhost kernel: PCI: Using E820 reservations for host bridge windows
Dec 15 01:42:15 localhost kernel: ACPI: Enabled 2 GPEs in block 00 to 0F
Dec 15 01:42:15 localhost kernel: ACPI: PCI Root Bridge [PCI0] (domain 0000 [bus 00-ff])
Dec 15 01:42:15 localhost kernel: acpi PNP0A03:00: _OSC: OS supports [ExtendedConfig ASPM ClockPM Segments MSI EDR HPX-Type3]
Dec 15 01:42:15 localhost kernel: acpiphp: Slot [3] registered
Dec 15 01:42:15 localhost kernel: acpiphp: Slot [4] registered
Dec 15 01:42:15 localhost kernel: acpiphp: Slot [5] registered
Dec 15 01:42:15 localhost kernel: acpiphp: Slot [6] registered
Dec 15 01:42:15 localhost kernel: acpiphp: Slot [7] registered
Dec 15 01:42:15 localhost kernel: acpiphp: Slot [8] registered
Dec 15 01:42:15 localhost kernel: acpiphp: Slot [9] registered
Dec 15 01:42:15 localhost kernel: acpiphp: Slot [10] registered
Dec 15 01:42:15 localhost kernel: acpiphp: Slot [11] registered
Dec 15 01:42:15 localhost kernel: acpiphp: Slot [12] registered
Dec 15 01:42:15 localhost kernel: acpiphp: Slot [13] registered
Dec 15 01:42:15 localhost kernel: acpiphp: Slot [14] registered
Dec 15 01:42:15 localhost kernel: acpiphp: Slot [15] registered
Dec 15 01:42:15 localhost kernel: acpiphp: Slot [16] registered
Dec 15 01:42:15 localhost kernel: acpiphp: Slot [17] registered
Dec 15 01:42:15 localhost kernel: acpiphp: Slot [18] registered
Dec 15 01:42:15 localhost kernel: acpiphp: Slot [19] registered
Dec 15 01:42:15 localhost kernel: acpiphp: Slot [20] registered
Dec 15 01:42:15 localhost kernel: acpiphp: Slot [21] registered
Dec 15 01:42:15 localhost kernel: acpiphp: Slot [22] registered
Dec 15 01:42:15 localhost kernel: acpiphp: Slot [23] registered
Dec 15 01:42:15 localhost kernel: acpiphp: Slot [24] registered
Dec 15 01:42:15 localhost kernel: acpiphp: Slot [25] registered
Dec 15 01:42:15 localhost kernel: acpiphp: Slot [26] registered
Dec 15 01:42:15 localhost kernel: acpiphp: Slot [27] registered
Dec 15 01:42:15 localhost kernel: acpiphp: Slot [28] registered
Dec 15 01:42:15 localhost kernel: acpiphp: Slot [29] registered
Dec 15 01:42:15 localhost kernel: acpiphp: Slot [30] registered
Dec 15 01:42:15 localhost kernel: acpiphp: Slot [31] registered
Dec 15 01:42:15 localhost kernel: PCI host bridge to bus 0000:00
Dec 15 01:42:15 localhost kernel: pci_bus 0000:00: root bus resource [io 0x0000-0x0cf7 window]
Dec 15 01:42:15 localhost kernel: pci_bus 0000:00: root bus resource [io 0x0d00-0xffff window]
Dec 15 01:42:15 localhost kernel: pci_bus 0000:00: root bus resource [mem 0x000a0000-0x000bffff window]
Dec 15 01:42:15 localhost kernel: pci_bus 0000:00: root bus resource [mem 0xc0000000-0xfebfffff window]
Dec 15 01:42:15 localhost kernel: pci_bus 0000:00: root bus resource [mem 0x440000000-0x4bfffffff window]
Dec 15 01:42:15 localhost kernel: pci_bus 0000:00: root bus resource [bus 00-ff]
Dec 15 01:42:15 localhost kernel: pci 0000:00:00.0: [8086:1237] type 00 class 0x060000
Dec 15 01:42:15 localhost kernel: pci 0000:00:01.0: [8086:7000] type 00 class 0x060100
Dec 15 01:42:15 localhost kernel: pci 0000:00:01.1: [8086:7010] type 00 class 0x010180
Dec 15 01:42:15 localhost kernel: pci 0000:00:01.1: reg 0x20: [io 0xc140-0xc14f]
Dec 15 01:42:15 localhost kernel: pci 0000:00:01.1: legacy IDE quirk: reg 0x10: [io 0x01f0-0x01f7]
Dec 15 01:42:15 localhost kernel: pci 0000:00:01.1: legacy IDE quirk: reg 0x14: [io 0x03f6]
Dec 15 01:42:15 localhost kernel: pci 0000:00:01.1: legacy IDE quirk: reg 0x18: [io 0x0170-0x0177]
Dec 15 01:42:15 localhost kernel: pci 0000:00:01.1: legacy IDE quirk: reg 0x1c: [io 0x0376]
Dec 15 01:42:15 localhost kernel: pci 0000:00:01.2: [8086:7020] type 00 class 0x0c0300
Dec 15 01:42:15 localhost kernel: pci 0000:00:01.2: reg 0x20: [io 0xc100-0xc11f]
Dec 15 01:42:15 localhost kernel: pci 0000:00:01.3: [8086:7113] type 00 class 0x068000
Dec 15 01:42:15 localhost kernel: pci 0000:00:01.3: quirk: [io 0x0600-0x063f] claimed by PIIX4 ACPI
Dec 15 01:42:15 localhost kernel: pci 0000:00:01.3: quirk: [io 0x0700-0x070f] claimed by PIIX4 SMB
Dec 15 01:42:15 localhost kernel: pci 0000:00:02.0: [1af4:1050] type 00 class 0x030000
Dec 15 01:42:15 localhost kernel: pci 0000:00:02.0: reg 0x10: [mem 0xfe000000-0xfe7fffff pref]
Dec 15 01:42:15 localhost kernel: pci 0000:00:02.0: reg 0x18: [mem 0xfe800000-0xfe803fff 64bit pref]
Dec 15 01:42:15 localhost kernel: pci 0000:00:02.0: reg 0x20: [mem 0xfeb90000-0xfeb90fff]
Dec 15 01:42:15 localhost kernel: pci 0000:00:02.0: reg 0x30: [mem 0xfeb80000-0xfeb8ffff pref]
Dec 15 01:42:15 localhost kernel: pci 0000:00:02.0: Video device with shadowed ROM at [mem 0x000c0000-0x000dffff]
Dec 15 01:42:15 localhost kernel: pci 0000:00:03.0: [1af4:1000] type 00 class 0x020000
Dec 15 01:42:15 localhost kernel: pci 0000:00:03.0: reg 0x10: [io 0xc080-0xc0bf]
Dec 15 01:42:15 localhost kernel: pci 0000:00:03.0: reg 0x14: [mem 0xfeb91000-0xfeb91fff]
Dec 15 01:42:15 localhost kernel: pci 0000:00:03.0: reg 0x20: [mem 0xfe804000-0xfe807fff 64bit pref]
Dec 15 01:42:15 localhost kernel: pci 0000:00:03.0: reg 0x30: [mem 0xfeb00000-0xfeb7ffff pref]
Dec 15 01:42:15 localhost kernel: pci 0000:00:04.0: [1af4:1001] type 00 class 0x010000
Dec 15 01:42:15 localhost kernel: pci 0000:00:04.0: reg 0x10: [io 0xc000-0xc07f]
Dec 15 01:42:15 localhost kernel: pci 0000:00:04.0: reg 0x14: [mem 0xfeb92000-0xfeb92fff]
Dec 15 01:42:15 localhost kernel: pci 0000:00:04.0: reg 0x20: [mem 0xfe808000-0xfe80bfff 64bit pref]
Dec 15 01:42:15 localhost kernel: pci 0000:00:05.0: [1af4:1002] type 00 class 0x00ff00
Dec 15 01:42:15 localhost kernel: pci 0000:00:05.0: reg 0x10: [io 0xc0c0-0xc0ff]
Dec 15 01:42:15 localhost kernel: pci 0000:00:05.0: reg 0x20: [mem 0xfe80c000-0xfe80ffff 64bit pref]
Dec 15 01:42:15 localhost kernel: pci 0000:00:06.0: [1af4:1005] type 00 class 0x00ff00
Dec 15 01:42:15 localhost kernel: pci 0000:00:06.0: reg 0x10: [io 0xc120-0xc13f]
Dec 15 01:42:15 localhost kernel: pci 0000:00:06.0: reg 0x20: [mem 0xfe810000-0xfe813fff 64bit pref]
Dec 15 01:42:15 localhost kernel: ACPI: PCI: Interrupt link LNKA configured for IRQ 10
Dec 15 01:42:15 localhost kernel: ACPI: PCI: Interrupt link LNKB configured for IRQ 10
Dec 15 01:42:15 localhost kernel: ACPI: PCI: Interrupt link LNKC configured for IRQ 11
Dec 15 01:42:15 localhost kernel: ACPI: PCI: Interrupt link LNKD configured for IRQ 11
Dec 15 01:42:15 localhost kernel: ACPI: PCI: Interrupt link LNKS configured for IRQ 9
Dec 15 01:42:15 localhost kernel: iommu: Default domain type: Translated
Dec 15 01:42:15 localhost kernel: iommu: DMA domain TLB invalidation policy: lazy mode
Dec 15 01:42:15 localhost kernel: SCSI subsystem initialized
Dec 15 01:42:15 localhost kernel: ACPI: bus type USB registered
Dec 15 01:42:15 localhost kernel: usbcore: registered new interface driver usbfs
Dec 15 01:42:15 localhost kernel: usbcore: registered new interface driver hub
Dec 15 01:42:15 localhost kernel: usbcore: registered new device driver usb
Dec 15 01:42:15 localhost kernel: pps_core: LinuxPPS API ver. 1 registered
Dec 15 01:42:15 localhost kernel: pps_core: Software ver. 5.3.6 - Copyright 2005-2007 Rodolfo Giometti
Dec 15 01:42:15 localhost kernel: PTP clock support registered
Dec 15 01:42:15 localhost kernel: EDAC MC: Ver: 3.0.0
Dec 15 01:42:15 localhost kernel: NetLabel: Initializing
Dec 15 01:42:15 localhost kernel: NetLabel: domain hash size = 128
Dec 15 01:42:15 localhost kernel: NetLabel: protocols = UNLABELED CIPSOv4 CALIPSO
Dec 15 01:42:15 localhost kernel: NetLabel: unlabeled traffic allowed by default
Dec 15 01:42:15 localhost kernel: PCI: Using ACPI for IRQ routing
Dec 15 01:42:15 localhost kernel: pci 0000:00:02.0: vgaarb: setting as boot VGA device
Dec 15 01:42:15 localhost kernel: pci 0000:00:02.0: vgaarb: bridge control possible
Dec 15 01:42:15 localhost kernel: pci 0000:00:02.0: vgaarb: VGA device added: decodes=io+mem,owns=io+mem,locks=none
Dec 15 01:42:15 localhost kernel: vgaarb: loaded
Dec 15 01:42:15 localhost kernel: clocksource: Switched to clocksource kvm-clock
Dec 15 01:42:15 localhost kernel: VFS: Disk quotas dquot_6.6.0
Dec 15 01:42:15 localhost kernel: VFS: Dquot-cache hash table entries: 512 (order 0, 4096 bytes)
Dec 15 01:42:15 localhost kernel: pnp: PnP ACPI init
Dec 15 01:42:15 localhost kernel: pnp: PnP ACPI: found 5 devices
Dec 15 01:42:15 localhost kernel: clocksource: acpi_pm: mask: 0xffffff max_cycles: 0xffffff, max_idle_ns: 2085701024 ns
Dec 15 01:42:15 localhost kernel: NET: Registered PF_INET protocol family
Dec 15 01:42:15 localhost kernel: IP idents hash table entries: 262144 (order: 9, 2097152 bytes, linear)
Dec 15 01:42:15 localhost kernel: tcp_listen_portaddr_hash hash table entries: 8192 (order: 5, 131072 bytes, linear)
Dec 15 01:42:15 localhost kernel: Table-perturb hash table entries: 65536 (order: 6, 262144 bytes, linear)
Dec 15 01:42:15 localhost kernel: TCP established hash table entries: 131072 (order: 8, 1048576 bytes, linear)
Dec 15 01:42:15 localhost kernel: TCP bind hash table entries: 65536 (order: 8, 1048576 bytes, linear)
Dec 15 01:42:15 localhost kernel: TCP: Hash tables configured (established 131072 bind 65536)
Dec 15 01:42:15 localhost kernel: MPTCP token hash table entries: 16384 (order: 6, 393216 bytes, linear)
Dec 15 01:42:15 localhost kernel: UDP hash table entries: 8192 (order: 6, 262144 bytes, linear)
Dec 15 01:42:15 localhost kernel: UDP-Lite hash table entries: 8192 (order: 6, 262144 bytes, linear)
Dec 15 01:42:15 localhost kernel: NET: Registered PF_UNIX/PF_LOCAL protocol family
Dec 15 01:42:15 localhost kernel: NET: Registered PF_XDP protocol family
Dec 15 01:42:15 localhost kernel: pci_bus 0000:00: resource 4 [io 0x0000-0x0cf7 window]
Dec 15 01:42:15 localhost kernel: pci_bus 0000:00: resource 5 [io 0x0d00-0xffff window]
Dec 15 01:42:15 localhost kernel: pci_bus 0000:00: resource 6 [mem 0x000a0000-0x000bffff window]
Dec 15 01:42:15 localhost kernel: pci_bus 0000:00: resource 7 [mem 0xc0000000-0xfebfffff window]
Dec 15 01:42:15 localhost kernel: pci_bus 0000:00: resource 8 [mem 0x440000000-0x4bfffffff window]
Dec 15 01:42:15 localhost kernel: pci 0000:00:01.0: PIIX3: Enabling Passive Release
Dec 15 01:42:15 localhost kernel: pci 0000:00:00.0: Limiting direct PCI/PCI transfers
Dec 15 01:42:15 localhost kernel: ACPI: \_SB_.LNKD: Enabled at IRQ 11
Dec 15 01:42:15 localhost kernel: pci 0000:00:01.2: quirk_usb_early_handoff+0x0/0x140 took 27251 usecs
Dec 15 01:42:15 localhost kernel: PCI: CLS 0 bytes, default 64
Dec 15 01:42:15 localhost kernel: PCI-DMA: Using software bounce buffering for IO (SWIOTLB)
Dec 15 01:42:15 localhost kernel: Trying to unpack rootfs image as initramfs...
Dec 15 01:42:15 localhost kernel: software IO TLB: mapped [mem 0x00000000ab000000-0x00000000af000000] (64MB)
Dec 15 01:42:15 localhost kernel: ACPI: bus type thunderbolt registered
Dec 15 01:42:15 localhost kernel: Initialise system trusted keyrings
Dec 15 01:42:15 localhost kernel: Key type blacklist registered
Dec 15 01:42:15 localhost kernel: workingset: timestamp_bits=36 max_order=22 bucket_order=0
Dec 15 01:42:15 localhost kernel: zbud: loaded
Dec 15 01:42:15 localhost kernel: integrity: Platform Keyring initialized
Dec 15 01:42:15 localhost kernel: NET: Registered PF_ALG protocol family
Dec 15 01:42:15 localhost kernel: xor: automatically using best checksumming function avx
Dec 15 01:42:15 localhost kernel: Key type asymmetric registered
Dec 15 01:42:15 localhost kernel: Asymmetric key parser 'x509' registered
Dec 15 01:42:15 localhost kernel: Running certificate verification selftests
Dec 15 01:42:15 localhost kernel: Loaded X.509 cert 'Certificate verification self-testing key: f58703bb33ce1b73ee02eccdee5b8817518fe3db'
Dec 15 01:42:15 localhost kernel: Block layer SCSI generic (bsg) driver version 0.4 loaded (major 246)
Dec 15 01:42:15 localhost kernel: io scheduler mq-deadline registered
Dec 15 01:42:15 localhost kernel: io scheduler kyber registered
Dec 15 01:42:15 localhost kernel: io scheduler bfq registered
Dec 15 01:42:15 localhost kernel: atomic64_test: passed for x86-64 platform with CX8 and with SSE
Dec 15 01:42:15 localhost kernel: shpchp: Standard Hot Plug PCI Controller Driver version: 0.4
Dec 15 01:42:15 localhost kernel: input: Power Button as /devices/LNXSYSTM:00/LNXPWRBN:00/input/input0
Dec 15 01:42:15 localhost kernel: ACPI: button: Power Button [PWRF]
Dec 15 01:42:15 localhost kernel: ACPI: \_SB_.LNKB: Enabled at IRQ 10
Dec 15 01:42:15 localhost kernel: ACPI: \_SB_.LNKC: Enabled at IRQ 11
Dec 15 01:42:15 localhost kernel: ACPI: \_SB_.LNKA: Enabled at IRQ 10
Dec 15 01:42:15 localhost kernel: Serial: 8250/16550 driver, 4 ports, IRQ sharing enabled
Dec 15 01:42:15 localhost kernel: 00:00: ttyS0 at I/O 0x3f8 (irq = 4, base_baud = 115200) is a 16550A
Dec 15 01:42:15 localhost kernel: Non-volatile memory driver v1.3
Dec 15 01:42:15 localhost kernel: rdac: device handler registered
Dec 15 01:42:15 localhost kernel: hp_sw: device handler registered
Dec 15 01:42:15 localhost kernel: emc: device handler registered
Dec 15 01:42:15 localhost kernel: alua: device handler registered
Dec 15 01:42:15 localhost kernel: libphy: Fixed MDIO Bus: probed
Dec 15 01:42:15 localhost kernel: ehci_hcd: USB 2.0 'Enhanced' Host Controller (EHCI) Driver
Dec 15 01:42:15 localhost kernel: ehci-pci: EHCI PCI platform driver
Dec 15 01:42:15 localhost kernel: ohci_hcd: USB 1.1 'Open' Host Controller (OHCI) Driver
Dec 15 01:42:15 localhost kernel: ohci-pci: OHCI PCI platform driver
Dec 15 01:42:15 localhost kernel: uhci_hcd: USB Universal Host Controller Interface driver
Dec 15 01:42:15 localhost kernel: uhci_hcd 0000:00:01.2: UHCI Host Controller
Dec 15 01:42:15 localhost kernel: uhci_hcd 0000:00:01.2: new USB bus registered, assigned bus number 1
Dec 15 01:42:15 localhost kernel: uhci_hcd 0000:00:01.2: detected 2 ports
Dec 15 01:42:15 localhost kernel: uhci_hcd 0000:00:01.2: irq 11, io port 0x0000c100
Dec 15 01:42:15 localhost kernel: usb usb1: New USB device found, idVendor=1d6b, idProduct=0001, bcdDevice= 5.14
Dec 15 01:42:15 localhost kernel: usb usb1: New USB device strings: Mfr=3, Product=2, SerialNumber=1
Dec 15 01:42:15 localhost kernel: usb usb1: Product: UHCI Host Controller
Dec 15 01:42:15 localhost kernel: usb usb1: Manufacturer: Linux 5.14.0-284.11.1.el9_2.x86_64 uhci_hcd
Dec 15 01:42:15 localhost kernel: usb usb1: SerialNumber: 0000:00:01.2
Dec 15 01:42:15 localhost kernel: hub 1-0:1.0: USB hub found
Dec 15 01:42:15 localhost kernel: hub 1-0:1.0: 2 ports detected
Dec 15 01:42:15 localhost kernel: usbcore: registered new interface driver usbserial_generic
Dec 15 01:42:15 localhost kernel: usbserial: USB Serial support registered for generic
Dec 15 01:42:15 localhost kernel: i8042: PNP: PS/2 Controller [PNP0303:KBD,PNP0f13:MOU] at 0x60,0x64 irq 1,12
Dec 15 01:42:15 localhost kernel: serio: i8042 KBD port at 0x60,0x64 irq 1
Dec 15 01:42:15 localhost kernel: serio: i8042 AUX port at 0x60,0x64 irq 12
Dec 15 01:42:15 localhost kernel: mousedev: PS/2 mouse device common for all mice
Dec 15 01:42:15 localhost kernel: rtc_cmos 00:04: RTC can wake from S4
Dec 15 01:42:15 localhost kernel: rtc_cmos 00:04: registered as rtc0
Dec 15 01:42:15 localhost kernel: input: AT Translated Set 2 keyboard as /devices/platform/i8042/serio0/input/input1
Dec 15 01:42:15 localhost kernel: rtc_cmos 00:04: setting system clock to 2025-12-15T06:42:14 UTC (1765780934)
Dec 15 01:42:15 localhost kernel: input: VirtualPS/2 VMware VMMouse as /devices/platform/i8042/serio1/input/input4
Dec 15 01:42:15 localhost kernel: rtc_cmos 00:04: alarms up to one day, y3k, 242 bytes nvram
Dec 15 01:42:15 localhost kernel: input: VirtualPS/2 VMware VMMouse as /devices/platform/i8042/serio1/input/input3
Dec 15 01:42:15 localhost kernel: hid: raw HID events driver (C) Jiri Kosina
Dec 15 01:42:15 localhost kernel: usbcore: registered new interface driver usbhid
Dec 15 01:42:15 localhost kernel: usbhid: USB HID core driver
Dec 15 01:42:15 localhost kernel: drop_monitor: Initializing network drop monitor service
Dec 15 01:42:15 localhost kernel: Initializing XFRM netlink socket
Dec 15 01:42:15 localhost kernel: NET: Registered PF_INET6 protocol family
Dec 15 01:42:15 localhost kernel: Segment Routing with IPv6
Dec 15 01:42:15 localhost kernel: NET: Registered PF_PACKET protocol family
Dec 15 01:42:15 localhost kernel: mpls_gso: MPLS GSO support
Dec 15 01:42:15 localhost kernel: IPI shorthand broadcast: enabled
Dec 15 01:42:15 localhost kernel: AVX2 version of gcm_enc/dec engaged.
Dec 15 01:42:15 localhost kernel: AES CTR mode by8 optimization enabled
Dec 15 01:42:15 localhost kernel: sched_clock: Marking stable (730783604, 190145148)->(1056738258, -135809506)
Dec 15 01:42:15 localhost kernel: registered taskstats version 1
Dec 15 01:42:15 localhost kernel: Loading compiled-in X.509 certificates
Dec 15 01:42:15 localhost kernel: Loaded X.509 cert 'Red Hat Enterprise Linux kernel signing key: aaec4b640ef162b54684864066c7d4ffd428cd72'
Dec 15 01:42:15 localhost kernel: Loaded X.509 cert 'Red Hat Enterprise Linux Driver Update Program (key 3): bf57f3e87362bc7229d9f465321773dfd1f77a80'
Dec 15 01:42:15 localhost kernel: Loaded X.509 cert 'Red Hat Enterprise Linux kpatch signing key: 4d38fd864ebe18c5f0b72e3852e2014c3a676fc8'
Dec 15 01:42:15 localhost kernel: zswap: loaded using pool lzo/zbud
Dec 15 01:42:15 localhost kernel: page_owner is disabled
Dec 15 01:42:15 localhost kernel: Key type big_key registered
Dec 15 01:42:15 localhost kernel: Freeing initrd memory: 74232K
Dec 15 01:42:15 localhost kernel: usb 1-1: new full-speed USB device number 2 using uhci_hcd
Dec 15 01:42:15 localhost kernel: Key type encrypted registered
Dec 15 01:42:15 localhost kernel: ima: No TPM chip found, activating TPM-bypass!
Dec 15 01:42:15 localhost kernel: Loading compiled-in module X.509 certificates
Dec 15 01:42:15 localhost kernel: Loaded X.509 cert 'Red Hat Enterprise Linux kernel signing key: aaec4b640ef162b54684864066c7d4ffd428cd72'
Dec 15 01:42:15 localhost kernel: ima: Allocated hash algorithm: sha256
Dec 15 01:42:15 localhost kernel: ima: No architecture policies found
Dec 15 01:42:15 localhost kernel: evm: Initialising EVM extended attributes:
Dec 15 01:42:15 localhost kernel: evm: security.selinux
Dec 15 01:42:15 localhost kernel: evm: security.SMACK64 (disabled)
Dec 15 01:42:15 localhost kernel: evm: security.SMACK64EXEC (disabled)
Dec 15 01:42:15 localhost kernel: evm: security.SMACK64TRANSMUTE (disabled)
Dec 15 01:42:15 localhost kernel: evm: security.SMACK64MMAP (disabled)
Dec 15 01:42:15 localhost kernel: evm: security.apparmor (disabled)
Dec 15 01:42:15 localhost kernel: evm: security.ima
Dec 15 01:42:15 localhost kernel: evm: security.capability
Dec 15 01:42:15 localhost kernel: evm: HMAC attrs: 0x1
Dec 15 01:42:15 localhost kernel: usb 1-1: New USB device found, idVendor=0627, idProduct=0001, bcdDevice= 0.00
Dec 15 01:42:15 localhost kernel: usb 1-1: New USB device strings: Mfr=1, Product=3, SerialNumber=10
Dec 15 01:42:15 localhost kernel: usb 1-1: Product: QEMU USB Tablet
Dec 15 01:42:15 localhost kernel: usb 1-1: Manufacturer: QEMU
Dec 15 01:42:15 localhost kernel: usb 1-1: SerialNumber: 28754-0000:00:01.2-1
Dec 15 01:42:15 localhost kernel: input: QEMU QEMU USB Tablet as /devices/pci0000:00/0000:00:01.2/usb1/1-1/1-1:1.0/0003:0627:0001.0001/input/input5
Dec 15 01:42:15 localhost kernel: hid-generic 0003:0627:0001.0001: input,hidraw0: USB HID v0.01 Mouse [QEMU QEMU USB Tablet] on usb-0000:00:01.2-1/input0
Dec 15 01:42:15 localhost kernel: Freeing unused decrypted memory: 2036K
Dec 15 01:42:15 localhost kernel: Freeing unused kernel image (initmem) memory: 2792K
Dec 15 01:42:15 localhost kernel: Write protecting the kernel read-only data: 26624k
Dec 15 01:42:15 localhost kernel: Freeing unused kernel image (text/rodata gap) memory: 2040K
Dec 15 01:42:15 localhost kernel: Freeing unused kernel image (rodata/data gap) memory: 60K
Dec 15 01:42:15 localhost kernel: x86/mm: Checked W+X mappings: passed, no W+X pages found.
Dec 15 01:42:15 localhost kernel: Run /init as init process
Dec 15 01:42:15 localhost systemd[1]: systemd 252-13.el9_2 running in system mode (+PAM +AUDIT +SELINUX -APPARMOR +IMA +SMACK +SECCOMP +GCRYPT +GNUTLS +OPENSSL +ACL +BLKID +CURL +ELFUTILS -FIDO2 +IDN2 -IDN -IPTC +KMOD +LIBCRYPTSETUP +LIBFDISK +PCRE2 -PWQUALITY +P11KIT -QRENCODE +TPM2 +BZIP2 +LZ4 +XZ +ZLIB +ZSTD -BPF_FRAMEWORK +XKBCOMMON +UTMP +SYSVINIT default-hierarchy=unified)
Dec 15 01:42:15 localhost systemd[1]: Detected virtualization kvm.
Dec 15 01:42:15 localhost systemd[1]: Detected architecture x86-64.
Dec 15 01:42:15 localhost systemd[1]: Running in initrd.
Dec 15 01:42:15 localhost systemd[1]: No hostname configured, using default hostname.
Dec 15 01:42:15 localhost systemd[1]: Hostname set to .
Dec 15 01:42:15 localhost systemd[1]: Initializing machine ID from VM UUID.
Dec 15 01:42:15 localhost systemd[1]: Queued start job for default target Initrd Default Target.
Dec 15 01:42:15 localhost systemd[1]: Started Dispatch Password Requests to Console Directory Watch.
Dec 15 01:42:15 localhost systemd[1]: Reached target Local Encrypted Volumes.
Dec 15 01:42:15 localhost systemd[1]: Reached target Initrd /usr File System.
Dec 15 01:42:15 localhost systemd[1]: Reached target Local File Systems.
Dec 15 01:42:15 localhost systemd[1]: Reached target Path Units.
Dec 15 01:42:15 localhost systemd[1]: Reached target Slice Units.
Dec 15 01:42:15 localhost systemd[1]: Reached target Swaps.
Dec 15 01:42:15 localhost systemd[1]: Reached target Timer Units.
Dec 15 01:42:15 localhost systemd[1]: Listening on D-Bus System Message Bus Socket.
Dec 15 01:42:15 localhost systemd[1]: Listening on Journal Socket (/dev/log).
Dec 15 01:42:15 localhost systemd[1]: Listening on Journal Socket.
Dec 15 01:42:15 localhost systemd[1]: Listening on udev Control Socket.
Dec 15 01:42:15 localhost systemd[1]: Listening on udev Kernel Socket.
Dec 15 01:42:15 localhost systemd[1]: Reached target Socket Units.
Dec 15 01:42:15 localhost systemd[1]: Starting Create List of Static Device Nodes...
Dec 15 01:42:15 localhost systemd[1]: Starting Journal Service...
Dec 15 01:42:15 localhost systemd[1]: Starting Load Kernel Modules...
Dec 15 01:42:15 localhost systemd[1]: Starting Create System Users...
Dec 15 01:42:15 localhost systemd[1]: Starting Setup Virtual Console...
Dec 15 01:42:15 localhost systemd[1]: Finished Create List of Static Device Nodes.
Dec 15 01:42:15 localhost systemd[1]: Finished Load Kernel Modules.
Dec 15 01:42:15 localhost systemd-journald[284]: Journal started
Dec 15 01:42:15 localhost systemd-journald[284]: Runtime Journal (/run/log/journal/12c7b5898d2b44b680e11f4b0f34f69b) is 8.0M, max 314.7M, 306.7M free.
Dec 15 01:42:15 localhost systemd-modules-load[285]: Module 'msr' is built in
Dec 15 01:42:15 localhost systemd[1]: Started Journal Service.
Dec 15 01:42:15 localhost systemd[1]: Finished Setup Virtual Console.
Dec 15 01:42:15 localhost systemd[1]: dracut ask for additional cmdline parameters was skipped because no trigger condition checks were met.
Dec 15 01:42:15 localhost systemd[1]: Starting dracut cmdline hook...
Dec 15 01:42:15 localhost systemd[1]: Starting Apply Kernel Variables...
Dec 15 01:42:15 localhost systemd-sysusers[286]: Creating group 'sgx' with GID 997.
Dec 15 01:42:15 localhost systemd-sysusers[286]: Creating group 'users' with GID 100.
Dec 15 01:42:15 localhost systemd-sysusers[286]: Creating group 'dbus' with GID 81.
Dec 15 01:42:15 localhost systemd-sysusers[286]: Creating user 'dbus' (System Message Bus) with UID 81 and GID 81.
Dec 15 01:42:15 localhost systemd[1]: Finished Create System Users.
Dec 15 01:42:15 localhost systemd[1]: Starting Create Static Device Nodes in /dev...
Dec 15 01:42:15 localhost dracut-cmdline[289]: dracut-9.2 (Plow) dracut-057-21.git20230214.el9
Dec 15 01:42:15 localhost systemd[1]: Starting Create Volatile Files and Directories...
Dec 15 01:42:15 localhost dracut-cmdline[289]: Using kernel command line parameters: BOOT_IMAGE=(hd0,gpt3)/vmlinuz-5.14.0-284.11.1.el9_2.x86_64 root=UUID=a3dd82de-ffc6-4652-88b9-80e003b8f20a console=tty0 console=ttyS0,115200n8 no_timer_check net.ifnames=0 crashkernel=1G-4G:192M,4G-64G:256M,64G-:512M
Dec 15 01:42:15 localhost systemd[1]: Finished Apply Kernel Variables.
Dec 15 01:42:15 localhost systemd[1]: Finished Create Static Device Nodes in /dev.
Dec 15 01:42:15 localhost systemd[1]: Finished Create Volatile Files and Directories.
Dec 15 01:42:15 localhost systemd[1]: Finished dracut cmdline hook.
Dec 15 01:42:15 localhost systemd[1]: Starting dracut pre-udev hook...
Dec 15 01:42:15 localhost kernel: device-mapper: core: CONFIG_IMA_DISABLE_HTABLE is disabled. Duplicate IMA measurements will not be recorded in the IMA log.
Dec 15 01:42:15 localhost kernel: device-mapper: uevent: version 1.0.3
Dec 15 01:42:15 localhost kernel: device-mapper: ioctl: 4.47.0-ioctl (2022-07-28) initialised: dm-devel@redhat.com
Dec 15 01:42:15 localhost kernel: RPC: Registered named UNIX socket transport module.
Dec 15 01:42:15 localhost kernel: RPC: Registered udp transport module.
Dec 15 01:42:15 localhost kernel: RPC: Registered tcp transport module.
Dec 15 01:42:15 localhost kernel: RPC: Registered tcp NFSv4.1 backchannel transport module.
Dec 15 01:42:15 localhost rpc.statd[405]: Version 2.5.4 starting
Dec 15 01:42:15 localhost rpc.statd[405]: Initializing NSM state
Dec 15 01:42:15 localhost rpc.idmapd[410]: Setting log level to 0
Dec 15 01:42:15 localhost systemd[1]: Finished dracut pre-udev hook.
Dec 15 01:42:15 localhost systemd[1]: Starting Rule-based Manager for Device Events and Files...
Dec 15 01:42:15 localhost systemd-udevd[423]: Using default interface naming scheme 'rhel-9.0'.
Dec 15 01:42:15 localhost systemd[1]: Started Rule-based Manager for Device Events and Files.
Dec 15 01:42:15 localhost systemd[1]: Starting dracut pre-trigger hook...
Dec 15 01:42:15 localhost systemd[1]: Finished dracut pre-trigger hook.
Dec 15 01:42:15 localhost systemd[1]: Starting Coldplug All udev Devices...
Dec 15 01:42:15 localhost systemd[1]: Finished Coldplug All udev Devices.
Dec 15 01:42:15 localhost systemd[1]: Reached target System Initialization.
Dec 15 01:42:15 localhost systemd[1]: Reached target Basic System.
Dec 15 01:42:15 localhost systemd[1]: nm-initrd.service was skipped because of an unmet condition check (ConditionPathExists=/run/NetworkManager/initrd/neednet).
Dec 15 01:42:15 localhost systemd[1]: Reached target Network.
Dec 15 01:42:15 localhost systemd[1]: nm-wait-online-initrd.service was skipped because of an unmet condition check (ConditionPathExists=/run/NetworkManager/initrd/neednet).
Dec 15 01:42:15 localhost systemd[1]: Starting dracut initqueue hook...
Dec 15 01:42:15 localhost kernel: scsi host0: ata_piix
Dec 15 01:42:15 localhost kernel: virtio_blk virtio2: [vda] 838860800 512-byte logical blocks (429 GB/400 GiB)
Dec 15 01:42:15 localhost kernel: scsi host1: ata_piix
Dec 15 01:42:15 localhost kernel: GPT:Primary header thinks Alt. header is not at the end of the disk.
Dec 15 01:42:15 localhost kernel: GPT:20971519 != 838860799
Dec 15 01:42:15 localhost kernel: GPT:Alternate GPT header not at the end of the disk.
Dec 15 01:42:15 localhost kernel: GPT:20971519 != 838860799
Dec 15 01:42:15 localhost kernel: GPT: Use GNU Parted to correct GPT errors.
Dec 15 01:42:15 localhost kernel: vda: vda1 vda2 vda3 vda4
Dec 15 01:42:15 localhost kernel: ata1: PATA max MWDMA2 cmd 0x1f0 ctl 0x3f6 bmdma 0xc140 irq 14
Dec 15 01:42:15 localhost kernel: ata2: PATA max MWDMA2 cmd 0x170 ctl 0x376 bmdma 0xc148 irq 15
Dec 15 01:42:15 localhost systemd-udevd[427]: Network interface NamePolicy= disabled on kernel command line.
Dec 15 01:42:15 localhost systemd[1]: Found device /dev/disk/by-uuid/a3dd82de-ffc6-4652-88b9-80e003b8f20a.
Dec 15 01:42:15 localhost systemd[1]: Reached target Initrd Root Device.
Dec 15 01:42:16 localhost kernel: ata1: found unknown device (class 0)
Dec 15 01:42:16 localhost kernel: ata1.00: ATAPI: QEMU DVD-ROM, 2.5+, max UDMA/100
Dec 15 01:42:16 localhost kernel: scsi 0:0:0:0: CD-ROM QEMU QEMU DVD-ROM 2.5+ PQ: 0 ANSI: 5
Dec 15 01:42:16 localhost kernel: scsi 0:0:0:0: Attached scsi generic sg0 type 5
Dec 15 01:42:16 localhost kernel: sr 0:0:0:0: [sr0] scsi3-mmc drive: 4x/4x cd/rw xa/form2 tray
Dec 15 01:42:16 localhost kernel: cdrom: Uniform CD-ROM driver Revision: 3.20
Dec 15 01:42:16 localhost systemd[1]: Finished dracut initqueue hook.
Dec 15 01:42:16 localhost systemd[1]: Reached target Preparation for Remote File Systems.
Dec 15 01:42:16 localhost systemd[1]: Reached target Remote Encrypted Volumes.
Dec 15 01:42:16 localhost systemd[1]: Reached target Remote File Systems.
Dec 15 01:42:16 localhost systemd[1]: Starting dracut pre-mount hook...
Dec 15 01:42:16 localhost systemd[1]: Finished dracut pre-mount hook.
Dec 15 01:42:16 localhost systemd[1]: Starting File System Check on /dev/disk/by-uuid/a3dd82de-ffc6-4652-88b9-80e003b8f20a...
Dec 15 01:42:16 localhost systemd-fsck[512]: /usr/sbin/fsck.xfs: XFS file system.
Dec 15 01:42:16 localhost systemd[1]: Finished File System Check on /dev/disk/by-uuid/a3dd82de-ffc6-4652-88b9-80e003b8f20a.
Dec 15 01:42:16 localhost systemd[1]: Mounting /sysroot...
Dec 15 01:42:16 localhost kernel: SGI XFS with ACLs, security attributes, scrub, quota, no debug enabled
Dec 15 01:42:16 localhost kernel: XFS (vda4): Mounting V5 Filesystem
Dec 15 01:42:16 localhost kernel: XFS (vda4): Ending clean mount
Dec 15 01:42:16 localhost systemd[1]: Mounted /sysroot.
Dec 15 01:42:16 localhost systemd[1]: Reached target Initrd Root File System.
Dec 15 01:42:16 localhost systemd[1]: Starting Mountpoints Configured in the Real Root...
Dec 15 01:42:16 localhost systemd[1]: initrd-parse-etc.service: Deactivated successfully.
Dec 15 01:42:16 localhost systemd[1]: Finished Mountpoints Configured in the Real Root.
Dec 15 01:42:16 localhost systemd[1]: Reached target Initrd File Systems.
Dec 15 01:42:16 localhost systemd[1]: Reached target Initrd Default Target.
Dec 15 01:42:16 localhost systemd[1]: Starting dracut mount hook...
Dec 15 01:42:16 localhost systemd[1]: Finished dracut mount hook.
Dec 15 01:42:16 localhost systemd[1]: Starting dracut pre-pivot and cleanup hook...
Dec 15 01:42:16 localhost rpc.idmapd[410]: exiting on signal 15
Dec 15 01:42:16 localhost systemd[1]: var-lib-nfs-rpc_pipefs.mount: Deactivated successfully.
Dec 15 01:42:16 localhost systemd[1]: Finished dracut pre-pivot and cleanup hook.
Dec 15 01:42:16 localhost systemd[1]: Starting Cleaning Up and Shutting Down Daemons...
Dec 15 01:42:16 localhost systemd[1]: Stopped target Network.
Dec 15 01:42:16 localhost systemd[1]: Stopped target Remote Encrypted Volumes.
Dec 15 01:42:16 localhost systemd[1]: Stopped target Timer Units.
Dec 15 01:42:16 localhost systemd[1]: dbus.socket: Deactivated successfully.
Dec 15 01:42:16 localhost systemd[1]: Closed D-Bus System Message Bus Socket.
Dec 15 01:42:16 localhost systemd[1]: dracut-pre-pivot.service: Deactivated successfully.
Dec 15 01:42:16 localhost systemd[1]: Stopped dracut pre-pivot and cleanup hook.
Dec 15 01:42:16 localhost systemd[1]: Stopped target Initrd Default Target.
Dec 15 01:42:16 localhost systemd[1]: Stopped target Basic System.
Dec 15 01:42:16 localhost systemd[1]: Stopped target Initrd Root Device.
Dec 15 01:42:16 localhost systemd[1]: Stopped target Initrd /usr File System.
Dec 15 01:42:16 localhost systemd[1]: Stopped target Path Units.
Dec 15 01:42:16 localhost systemd[1]: Stopped target Remote File Systems.
Dec 15 01:42:16 localhost systemd[1]: Stopped target Preparation for Remote File Systems.
Dec 15 01:42:16 localhost systemd[1]: Stopped target Slice Units.
Dec 15 01:42:16 localhost systemd[1]: Stopped target Socket Units.
Dec 15 01:42:16 localhost systemd[1]: Stopped target System Initialization.
Dec 15 01:42:16 localhost systemd[1]: Stopped target Local File Systems.
Dec 15 01:42:16 localhost systemd[1]: Stopped target Swaps.
Dec 15 01:42:16 localhost systemd[1]: dracut-mount.service: Deactivated successfully.
Dec 15 01:42:16 localhost systemd[1]: Stopped dracut mount hook.
Dec 15 01:42:16 localhost systemd[1]: dracut-pre-mount.service: Deactivated successfully.
Dec 15 01:42:16 localhost systemd[1]: Stopped dracut pre-mount hook.
Dec 15 01:42:16 localhost systemd[1]: Stopped target Local Encrypted Volumes.
Dec 15 01:42:16 localhost systemd[1]: systemd-ask-password-console.path: Deactivated successfully.
Dec 15 01:42:16 localhost systemd[1]: Stopped Dispatch Password Requests to Console Directory Watch.
Dec 15 01:42:16 localhost systemd[1]: dracut-initqueue.service: Deactivated successfully.
Dec 15 01:42:16 localhost systemd[1]: Stopped dracut initqueue hook.
Dec 15 01:42:16 localhost systemd[1]: systemd-sysctl.service: Deactivated successfully.
Dec 15 01:42:16 localhost systemd[1]: Stopped Apply Kernel Variables.
Dec 15 01:42:16 localhost systemd[1]: systemd-modules-load.service: Deactivated successfully.
Dec 15 01:42:16 localhost systemd[1]: Stopped Load Kernel Modules.
Dec 15 01:42:16 localhost systemd[1]: systemd-tmpfiles-setup.service: Deactivated successfully.
Dec 15 01:42:16 localhost systemd[1]: Stopped Create Volatile Files and Directories.
Dec 15 01:42:16 localhost systemd[1]: systemd-udev-trigger.service: Deactivated successfully.
Dec 15 01:42:16 localhost systemd[1]: Stopped Coldplug All udev Devices.
Dec 15 01:42:16 localhost systemd[1]: dracut-pre-trigger.service: Deactivated successfully.
Dec 15 01:42:16 localhost systemd[1]: Stopped dracut pre-trigger hook.
Dec 15 01:42:16 localhost systemd[1]: Stopping Rule-based Manager for Device Events and Files...
Dec 15 01:42:16 localhost systemd[1]: systemd-vconsole-setup.service: Deactivated successfully.
Dec 15 01:42:16 localhost systemd[1]: Stopped Setup Virtual Console.
Dec 15 01:42:16 localhost systemd[1]: run-credentials-systemd\x2dtmpfiles\x2dsetup.service.mount: Deactivated successfully.
Dec 15 01:42:16 localhost systemd[1]: run-credentials-systemd\x2dsysctl.service.mount: Deactivated successfully.
Dec 15 01:42:16 localhost systemd[1]: initrd-cleanup.service: Deactivated successfully.
Dec 15 01:42:16 localhost systemd[1]: Finished Cleaning Up and Shutting Down Daemons.
Dec 15 01:42:16 localhost systemd[1]: systemd-udevd.service: Deactivated successfully.
Dec 15 01:42:16 localhost systemd[1]: Stopped Rule-based Manager for Device Events and Files.
Dec 15 01:42:16 localhost systemd[1]: systemd-udevd-control.socket: Deactivated successfully.
Dec 15 01:42:16 localhost systemd[1]: Closed udev Control Socket.
Dec 15 01:42:16 localhost systemd[1]: systemd-udevd-kernel.socket: Deactivated successfully.
Dec 15 01:42:16 localhost systemd[1]: Closed udev Kernel Socket.
Dec 15 01:42:16 localhost systemd[1]: dracut-pre-udev.service: Deactivated successfully.
Dec 15 01:42:16 localhost systemd[1]: Stopped dracut pre-udev hook.
Dec 15 01:42:16 localhost systemd[1]: dracut-cmdline.service: Deactivated successfully.
Dec 15 01:42:16 localhost systemd[1]: Stopped dracut cmdline hook.
Dec 15 01:42:16 localhost systemd[1]: Starting Cleanup udev Database...
Dec 15 01:42:16 localhost systemd[1]: systemd-tmpfiles-setup-dev.service: Deactivated successfully.
Dec 15 01:42:16 localhost systemd[1]: Stopped Create Static Device Nodes in /dev.
Dec 15 01:42:16 localhost systemd[1]: kmod-static-nodes.service: Deactivated successfully.
Dec 15 01:42:16 localhost systemd[1]: Stopped Create List of Static Device Nodes.
Dec 15 01:42:16 localhost systemd[1]: systemd-sysusers.service: Deactivated successfully.
Dec 15 01:42:16 localhost systemd[1]: Stopped Create System Users.
Dec 15 01:42:16 localhost systemd[1]: initrd-udevadm-cleanup-db.service: Deactivated successfully.
Dec 15 01:42:16 localhost systemd[1]: Finished Cleanup udev Database.
Dec 15 01:42:16 localhost systemd[1]: Reached target Switch Root.
Dec 15 01:42:16 localhost systemd[1]: Starting Switch Root...
Dec 15 01:42:16 localhost systemd[1]: Switching root.
Dec 15 01:42:16 localhost systemd-journald[284]: Journal stopped
Dec 15 01:42:17 localhost systemd-journald[284]: Received SIGTERM from PID 1 (systemd).
Dec 15 01:42:17 localhost kernel: audit: type=1404 audit(1765780937.006:2): enforcing=1 old_enforcing=0 auid=4294967295 ses=4294967295 enabled=1 old-enabled=1 lsm=selinux res=1
Dec 15 01:42:17 localhost kernel: SELinux: policy capability network_peer_controls=1
Dec 15 01:42:17 localhost kernel: SELinux: policy capability open_perms=1
Dec 15 01:42:17 localhost kernel: SELinux: policy capability extended_socket_class=1
Dec 15 01:42:17 localhost kernel: SELinux: policy capability always_check_network=0
Dec 15 01:42:17 localhost kernel: SELinux: policy capability cgroup_seclabel=1
Dec 15 01:42:17 localhost kernel: SELinux: policy capability nnp_nosuid_transition=1
Dec 15 01:42:17 localhost kernel: SELinux: policy capability genfs_seclabel_symlinks=1
Dec 15 01:42:17 localhost kernel: audit: type=1403 audit(1765780937.096:3): auid=4294967295 ses=4294967295 lsm=selinux res=1
Dec 15 01:42:17 localhost systemd[1]: Successfully loaded SELinux policy in 92.316ms.
Dec 15 01:42:17 localhost systemd[1]: Relabelled /dev, /dev/shm, /run, /sys/fs/cgroup in 23.869ms.
Dec 15 01:42:17 localhost systemd[1]: systemd 252-13.el9_2 running in system mode (+PAM +AUDIT +SELINUX -APPARMOR +IMA +SMACK +SECCOMP +GCRYPT +GNUTLS +OPENSSL +ACL +BLKID +CURL +ELFUTILS -FIDO2 +IDN2 -IDN -IPTC +KMOD +LIBCRYPTSETUP +LIBFDISK +PCRE2 -PWQUALITY +P11KIT -QRENCODE +TPM2 +BZIP2 +LZ4 +XZ +ZLIB +ZSTD -BPF_FRAMEWORK +XKBCOMMON +UTMP +SYSVINIT default-hierarchy=unified)
Dec 15 01:42:17 localhost systemd[1]: Detected virtualization kvm.
Dec 15 01:42:17 localhost systemd[1]: Detected architecture x86-64.
Dec 15 01:42:17 localhost systemd-rc-local-generator[582]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 15 01:42:17 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Dec 15 01:42:17 localhost systemd[1]: initrd-switch-root.service: Deactivated successfully.
Dec 15 01:42:17 localhost systemd[1]: Stopped Switch Root.
Dec 15 01:42:17 localhost systemd[1]: systemd-journald.service: Scheduled restart job, restart counter is at 1.
Dec 15 01:42:17 localhost systemd[1]: Created slice Slice /system/getty.
Dec 15 01:42:17 localhost systemd[1]: Created slice Slice /system/modprobe.
Dec 15 01:42:17 localhost systemd[1]: Created slice Slice /system/serial-getty.
Dec 15 01:42:17 localhost systemd[1]: Created slice Slice /system/sshd-keygen.
Dec 15 01:42:17 localhost systemd[1]: Created slice Slice /system/systemd-fsck.
Dec 15 01:42:17 localhost systemd[1]: Created slice User and Session Slice.
Dec 15 01:42:17 localhost systemd[1]: Started Dispatch Password Requests to Console Directory Watch.
Dec 15 01:42:17 localhost systemd[1]: Started Forward Password Requests to Wall Directory Watch.
Dec 15 01:42:17 localhost systemd[1]: Set up automount Arbitrary Executable File Formats File System Automount Point.
Dec 15 01:42:17 localhost systemd[1]: Reached target Local Encrypted Volumes.
Dec 15 01:42:17 localhost systemd[1]: Stopped target Switch Root.
Dec 15 01:42:17 localhost systemd[1]: Stopped target Initrd File Systems.
Dec 15 01:42:17 localhost systemd[1]: Stopped target Initrd Root File System.
Dec 15 01:42:17 localhost systemd[1]: Reached target Local Integrity Protected Volumes.
Dec 15 01:42:17 localhost systemd[1]: Reached target Path Units.
Dec 15 01:42:17 localhost systemd[1]: Reached target rpc_pipefs.target.
Dec 15 01:42:17 localhost systemd[1]: Reached target Slice Units.
Dec 15 01:42:17 localhost systemd[1]: Reached target Swaps.
Dec 15 01:42:17 localhost systemd[1]: Reached target Local Verity Protected Volumes.
Dec 15 01:42:17 localhost systemd[1]: Listening on RPCbind Server Activation Socket.
Dec 15 01:42:17 localhost systemd[1]: Reached target RPC Port Mapper.
Dec 15 01:42:17 localhost systemd[1]: Listening on Process Core Dump Socket.
Dec 15 01:42:17 localhost systemd[1]: Listening on initctl Compatibility Named Pipe.
Dec 15 01:42:17 localhost systemd[1]: Listening on udev Control Socket.
Dec 15 01:42:17 localhost systemd[1]: Listening on udev Kernel Socket.
Dec 15 01:42:17 localhost systemd[1]: Mounting Huge Pages File System...
Dec 15 01:42:17 localhost systemd[1]: Mounting POSIX Message Queue File System...
Dec 15 01:42:17 localhost systemd[1]: Mounting Kernel Debug File System...
Dec 15 01:42:17 localhost systemd[1]: Mounting Kernel Trace File System...
Dec 15 01:42:17 localhost systemd[1]: Kernel Module supporting RPCSEC_GSS was skipped because of an unmet condition check (ConditionPathExists=/etc/krb5.keytab).
Dec 15 01:42:17 localhost systemd[1]: Starting Create List of Static Device Nodes...
Dec 15 01:42:17 localhost systemd[1]: Starting Load Kernel Module configfs...
Dec 15 01:42:17 localhost systemd[1]: Starting Load Kernel Module drm...
Dec 15 01:42:17 localhost systemd[1]: Starting Load Kernel Module fuse...
Dec 15 01:42:17 localhost systemd[1]: Starting Read and set NIS domainname from /etc/sysconfig/network...
Dec 15 01:42:17 localhost systemd[1]: systemd-fsck-root.service: Deactivated successfully.
Dec 15 01:42:17 localhost systemd[1]: Stopped File System Check on Root Device.
Dec 15 01:42:17 localhost systemd[1]: Stopped Journal Service.
Dec 15 01:42:17 localhost kernel: fuse: init (API version 7.36)
Dec 15 01:42:17 localhost systemd[1]: Starting Journal Service...
Dec 15 01:42:17 localhost systemd[1]: Starting Load Kernel Modules...
Dec 15 01:42:17 localhost systemd[1]: Starting Generate network units from Kernel command line...
Dec 15 01:42:17 localhost systemd[1]: Starting Remount Root and Kernel File Systems...
Dec 15 01:42:17 localhost systemd-journald[618]: Journal started
Dec 15 01:42:17 localhost systemd-journald[618]: Runtime Journal (/run/log/journal/738a39f68bc78fb81032e509449fb759) is 8.0M, max 314.7M, 306.7M free.
Dec 15 01:42:17 localhost systemd[1]: Queued start job for default target Multi-User System.
Dec 15 01:42:17 localhost systemd[1]: systemd-journald.service: Deactivated successfully.
Dec 15 01:42:17 localhost systemd-modules-load[619]: Module 'msr' is built in
Dec 15 01:42:17 localhost systemd[1]: Repartition Root Disk was skipped because no trigger condition checks were met.
Dec 15 01:42:17 localhost systemd[1]: Starting Coldplug All udev Devices...
Dec 15 01:42:17 localhost kernel: xfs filesystem being remounted at / supports timestamps until 2038 (0x7fffffff)
Dec 15 01:42:17 localhost systemd[1]: Started Journal Service.
Dec 15 01:42:17 localhost systemd[1]: Mounted Huge Pages File System.
Dec 15 01:42:17 localhost systemd[1]: Mounted POSIX Message Queue File System.
Dec 15 01:42:17 localhost systemd[1]: Mounted Kernel Debug File System.
Dec 15 01:42:17 localhost systemd[1]: Mounted Kernel Trace File System.
Dec 15 01:42:17 localhost systemd[1]: Finished Create List of Static Device Nodes.
Dec 15 01:42:17 localhost systemd[1]: modprobe@configfs.service: Deactivated successfully.
Dec 15 01:42:17 localhost systemd[1]: Finished Load Kernel Module configfs.
Dec 15 01:42:17 localhost systemd[1]: modprobe@fuse.service: Deactivated successfully.
Dec 15 01:42:17 localhost systemd[1]: Finished Load Kernel Module fuse.
Dec 15 01:42:17 localhost kernel: ACPI: bus type drm_connector registered
Dec 15 01:42:17 localhost systemd[1]: Finished Read and set NIS domainname from /etc/sysconfig/network.
Dec 15 01:42:17 localhost systemd[1]: modprobe@drm.service: Deactivated successfully.
Dec 15 01:42:17 localhost systemd[1]: Finished Load Kernel Module drm.
Dec 15 01:42:17 localhost systemd[1]: Finished Load Kernel Modules.
Dec 15 01:42:17 localhost systemd[1]: Finished Generate network units from Kernel command line.
Dec 15 01:42:17 localhost systemd[1]: Finished Remount Root and Kernel File Systems.
Dec 15 01:42:17 localhost systemd[1]: Mounting FUSE Control File System...
Dec 15 01:42:17 localhost systemd[1]: Mounting Kernel Configuration File System...
Dec 15 01:42:17 localhost systemd[1]: First Boot Wizard was skipped because of an unmet condition check (ConditionFirstBoot=yes).
Dec 15 01:42:17 localhost systemd[1]: Starting Rebuild Hardware Database...
Dec 15 01:42:17 localhost systemd[1]: Starting Flush Journal to Persistent Storage...
Dec 15 01:42:17 localhost systemd[1]: Starting Load/Save Random Seed...
Dec 15 01:42:17 localhost systemd[1]: Starting Apply Kernel Variables...
Dec 15 01:42:17 localhost systemd[1]: Starting Create System Users...
Dec 15 01:42:17 localhost systemd[1]: Mounted FUSE Control File System.
Dec 15 01:42:17 localhost systemd-journald[618]: Runtime Journal (/run/log/journal/738a39f68bc78fb81032e509449fb759) is 8.0M, max 314.7M, 306.7M free.
Dec 15 01:42:17 localhost systemd-journald[618]: Received client request to flush runtime journal.
Dec 15 01:42:17 localhost systemd[1]: Mounted Kernel Configuration File System.
Dec 15 01:42:17 localhost systemd[1]: Finished Flush Journal to Persistent Storage.
Dec 15 01:42:17 localhost systemd[1]: Finished Load/Save Random Seed.
Dec 15 01:42:17 localhost systemd[1]: First Boot Complete was skipped because of an unmet condition check (ConditionFirstBoot=yes).
Dec 15 01:42:17 localhost systemd[1]: Finished Apply Kernel Variables.
Dec 15 01:42:17 localhost systemd-sysusers[630]: Creating group 'sgx' with GID 989.
Dec 15 01:42:17 localhost systemd-sysusers[630]: Creating group 'systemd-oom' with GID 988.
Dec 15 01:42:17 localhost systemd-sysusers[630]: Creating user 'systemd-oom' (systemd Userspace OOM Killer) with UID 988 and GID 988.
Dec 15 01:42:17 localhost systemd[1]: Finished Create System Users.
Dec 15 01:42:17 localhost systemd[1]: Starting Create Static Device Nodes in /dev...
Dec 15 01:42:17 localhost systemd[1]: Finished Coldplug All udev Devices.
Dec 15 01:42:17 localhost systemd[1]: Finished Create Static Device Nodes in /dev.
Dec 15 01:42:17 localhost systemd[1]: Reached target Preparation for Local File Systems.
Dec 15 01:42:17 localhost systemd[1]: Set up automount EFI System Partition Automount.
Dec 15 01:42:18 localhost systemd[1]: Finished Rebuild Hardware Database.
Dec 15 01:42:18 localhost systemd[1]: Starting Rule-based Manager for Device Events and Files...
Dec 15 01:42:18 localhost systemd-udevd[635]: Using default interface naming scheme 'rhel-9.0'.
Dec 15 01:42:18 localhost systemd[1]: Started Rule-based Manager for Device Events and Files.
Dec 15 01:42:18 localhost systemd[1]: Starting Load Kernel Module configfs...
Dec 15 01:42:18 localhost systemd[1]: modprobe@configfs.service: Deactivated successfully.
Dec 15 01:42:18 localhost systemd[1]: Finished Load Kernel Module configfs.
Dec 15 01:42:18 localhost systemd[1]: Condition check resulted in /dev/ttyS0 being skipped.
Dec 15 01:42:18 localhost systemd-udevd[639]: Network interface NamePolicy= disabled on kernel command line.
Dec 15 01:42:18 localhost systemd[1]: Condition check resulted in /dev/disk/by-uuid/b141154b-6a70-437a-a97f-d160c9ba37eb being skipped.
Dec 15 01:42:18 localhost systemd[1]: Condition check resulted in /dev/disk/by-uuid/7B77-95E7 being skipped.
Dec 15 01:42:18 localhost systemd[1]: Starting File System Check on /dev/disk/by-uuid/7B77-95E7...
Dec 15 01:42:18 localhost systemd-fsck[677]: fsck.fat 4.2 (2021-01-31)
Dec 15 01:42:18 localhost systemd-fsck[677]: /dev/vda2: 12 files, 1782/51145 clusters
Dec 15 01:42:18 localhost systemd[1]: Finished File System Check on /dev/disk/by-uuid/7B77-95E7.
Dec 15 01:42:18 localhost kernel: piix4_smbus 0000:00:01.3: SMBus Host Controller at 0x700, revision 0
Dec 15 01:42:18 localhost kernel: input: PC Speaker as /devices/platform/pcspkr/input/input6
Dec 15 01:42:18 localhost kernel: [drm] pci: virtio-vga detected at 0000:00:02.0
Dec 15 01:42:18 localhost kernel: virtio-pci 0000:00:02.0: vgaarb: deactivate vga console
Dec 15 01:42:18 localhost kernel: Console: switching to colour dummy device 80x25
Dec 15 01:42:18 localhost kernel: [drm] features: -virgl +edid -resource_blob -host_visible
Dec 15 01:42:18 localhost kernel: [drm] features: -context_init
Dec 15 01:42:18 localhost kernel: [drm] number of scanouts: 1
Dec 15 01:42:18 localhost kernel: [drm] number of cap sets: 0
Dec 15 01:42:18 localhost kernel: [drm] Initialized virtio_gpu 0.1.0 0 for virtio0 on minor 0
Dec 15 01:42:18 localhost kernel: virtio_gpu virtio0: [drm] drm_plane_enable_fb_damage_clips() not called
Dec 15 01:42:18 localhost kernel: Console: switching to colour frame buffer device 128x48
Dec 15 01:42:18 localhost kernel: SVM: TSC scaling supported
Dec 15 01:42:18 localhost kernel: kvm: Nested Virtualization enabled
Dec 15 01:42:18 localhost kernel: SVM: kvm: Nested Paging enabled
Dec 15 01:42:18 localhost kernel: SVM: LBR virtualization supported
Dec 15 01:42:18 localhost kernel: virtio_gpu virtio0: [drm] fb0: virtio_gpudrmfb frame buffer device
Dec 15 01:42:18 localhost systemd[1]: Mounting /boot...
Dec 15 01:42:18 localhost kernel: XFS (vda3): Mounting V5 Filesystem
Dec 15 01:42:18 localhost kernel: XFS (vda3): Ending clean mount
Dec 15 01:42:18 localhost kernel: xfs filesystem being mounted at /boot supports timestamps until 2038 (0x7fffffff)
Dec 15 01:42:18 localhost systemd[1]: Mounted /boot.
Dec 15 01:42:18 localhost systemd[1]: Mounting /boot/efi...
Dec 15 01:42:18 localhost systemd[1]: Mounted /boot/efi.
Dec 15 01:42:18 localhost systemd[1]: Reached target Local File Systems.
Dec 15 01:42:18 localhost systemd[1]: Starting Rebuild Dynamic Linker Cache...
Dec 15 01:42:18 localhost systemd[1]: Mark the need to relabel after reboot was skipped because of an unmet condition check (ConditionSecurity=!selinux).
Dec 15 01:42:18 localhost systemd[1]: Set Up Additional Binary Formats was skipped because no trigger condition checks were met.
Dec 15 01:42:18 localhost systemd[1]: Store a System Token in an EFI Variable was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/LoaderFeatures-4a67b082-0a4c-41cf-b6c7-440b29bb8c4f).
Dec 15 01:42:18 localhost systemd[1]: Starting Automatic Boot Loader Update...
Dec 15 01:42:18 localhost systemd[1]: Commit a transient machine-id on disk was skipped because of an unmet condition check (ConditionPathIsMountPoint=/etc/machine-id).
Dec 15 01:42:18 localhost systemd[1]: Starting Create Volatile Files and Directories...
Dec 15 01:42:18 localhost systemd[1]: efi.automount: Got automount request for /efi, triggered by 715 (bootctl)
Dec 15 01:42:18 localhost systemd[1]: Starting File System Check on /dev/vda2...
Dec 15 01:42:18 localhost systemd[1]: Finished File System Check on /dev/vda2.
Dec 15 01:42:18 localhost systemd[1]: Mounting EFI System Partition Automount...
Dec 15 01:42:18 localhost systemd[1]: Mounted EFI System Partition Automount.
Dec 15 01:42:18 localhost systemd[1]: Finished Automatic Boot Loader Update.
Dec 15 01:42:18 localhost systemd[1]: Finished Create Volatile Files and Directories.
Dec 15 01:42:18 localhost systemd[1]: Finished Rebuild Dynamic Linker Cache.
Dec 15 01:42:18 localhost systemd[1]: Starting Security Auditing Service...
Dec 15 01:42:18 localhost systemd[1]: Starting RPC Bind...
Dec 15 01:42:18 localhost systemd[1]: Starting Rebuild Journal Catalog...
Dec 15 01:42:18 localhost auditd[726]: audit dispatcher initialized with q_depth=1200 and 1 active plugins
Dec 15 01:42:18 localhost auditd[726]: Init complete, auditd 3.0.7 listening for events (startup state enable)
Dec 15 01:42:18 localhost systemd[1]: Started RPC Bind.
Dec 15 01:42:18 localhost systemd[1]: Finished Rebuild Journal Catalog.
Dec 15 01:42:18 localhost systemd[1]: Starting Update is Completed...
Dec 15 01:42:18 localhost systemd[1]: Finished Update is Completed.
Dec 15 01:42:18 localhost augenrules[731]: /sbin/augenrules: No change
Dec 15 01:42:18 localhost augenrules[742]: No rules
Dec 15 01:42:18 localhost augenrules[742]: enabled 1
Dec 15 01:42:18 localhost augenrules[742]: failure 1
Dec 15 01:42:18 localhost augenrules[742]: pid 726
Dec 15 01:42:18 localhost augenrules[742]: rate_limit 0
Dec 15 01:42:18 localhost augenrules[742]: backlog_limit 8192
Dec 15 01:42:18 localhost augenrules[742]: lost 0
Dec 15 01:42:18 localhost augenrules[742]: backlog 0
Dec 15 01:42:18 localhost augenrules[742]: backlog_wait_time 60000
Dec 15 01:42:18 localhost augenrules[742]: backlog_wait_time_actual 0
Dec 15 01:42:18 localhost augenrules[742]: enabled 1
Dec 15 01:42:18 localhost augenrules[742]: failure 1
Dec 15 01:42:18 localhost augenrules[742]: pid 726
Dec 15 01:42:18 localhost augenrules[742]: rate_limit 0
Dec 15 01:42:18 localhost augenrules[742]: backlog_limit 8192
Dec 15 01:42:18 localhost augenrules[742]: lost 0
Dec 15 01:42:18 localhost augenrules[742]: backlog 4
Dec 15 01:42:18 localhost augenrules[742]: backlog_wait_time 60000
Dec 15 01:42:18 localhost augenrules[742]: backlog_wait_time_actual 0
Dec 15 01:42:18 localhost augenrules[742]: enabled 1
Dec 15 01:42:18 localhost augenrules[742]: failure 1
Dec 15 01:42:18 localhost augenrules[742]: pid 726
Dec 15 01:42:18 localhost augenrules[742]: rate_limit 0
Dec 15 01:42:18 localhost augenrules[742]: backlog_limit 8192
Dec 15 01:42:18 localhost augenrules[742]: lost 0
Dec 15 01:42:18 localhost augenrules[742]: backlog 3
Dec 15 01:42:18 localhost augenrules[742]: backlog_wait_time 60000
Dec 15 01:42:18 localhost augenrules[742]: backlog_wait_time_actual 0
Dec 15 01:42:18 localhost systemd[1]: Started Security Auditing Service.
Dec 15 01:42:18 localhost systemd[1]: Starting Record System Boot/Shutdown in UTMP...
Dec 15 01:42:18 localhost systemd[1]: Finished Record System Boot/Shutdown in UTMP.
Dec 15 01:42:18 localhost systemd[1]: Reached target System Initialization.
Dec 15 01:42:18 localhost systemd[1]: Started dnf makecache --timer.
Dec 15 01:42:18 localhost systemd[1]: Started Daily rotation of log files.
Dec 15 01:42:18 localhost systemd[1]: Started Daily Cleanup of Temporary Directories.
Dec 15 01:42:18 localhost systemd[1]: Reached target Timer Units.
Dec 15 01:42:18 localhost systemd[1]: Listening on D-Bus System Message Bus Socket.
Dec 15 01:42:18 localhost systemd[1]: Listening on SSSD Kerberos Cache Manager responder socket.
Dec 15 01:42:18 localhost systemd[1]: Reached target Socket Units.
Dec 15 01:42:18 localhost systemd[1]: Starting Initial cloud-init job (pre-networking)...
Dec 15 01:42:18 localhost systemd[1]: Starting D-Bus System Message Bus...
Dec 15 01:42:18 localhost systemd[1]: TPM2 PCR Barrier (Initialization) was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/StubPcrKernelImage-4a67b082-0a4c-41cf-b6c7-440b29bb8c4f).
Dec 15 01:42:18 localhost systemd[1]: Started D-Bus System Message Bus.
Dec 15 01:42:18 localhost systemd[1]: Reached target Basic System.
Dec 15 01:42:18 localhost journal[751]: Ready
Dec 15 01:42:19 localhost systemd[1]: Starting NTP client/server...
Dec 15 01:42:19 localhost systemd[1]: Starting Restore /run/initramfs on shutdown...
Dec 15 01:42:19 localhost systemd[1]: Started irqbalance daemon.
Dec 15 01:42:19 localhost systemd[1]: Load CPU microcode update was skipped because of an unmet condition check (ConditionPathExists=/sys/devices/system/cpu/microcode/reload).
Dec 15 01:42:19 localhost systemd[1]: Starting System Logging Service...
Dec 15 01:42:19 localhost systemd[1]: OpenSSH ecdsa Server Key Generation was skipped because of an unmet condition check (ConditionPathExists=!/run/systemd/generator.early/multi-user.target.wants/cloud-init.target).
Dec 15 01:42:19 localhost systemd[1]: OpenSSH ed25519 Server Key Generation was skipped because of an unmet condition check (ConditionPathExists=!/run/systemd/generator.early/multi-user.target.wants/cloud-init.target).
Dec 15 01:42:19 localhost systemd[1]: OpenSSH rsa Server Key Generation was skipped because of an unmet condition check (ConditionPathExists=!/run/systemd/generator.early/multi-user.target.wants/cloud-init.target).
Dec 15 01:42:19 localhost systemd[1]: Reached target sshd-keygen.target.
Dec 15 01:42:19 localhost systemd[1]: System Security Services Daemon was skipped because no trigger condition checks were met.
Dec 15 01:42:19 localhost systemd[1]: Reached target User and Group Name Lookups.
Dec 15 01:42:19 localhost rsyslogd[759]: [origin software="rsyslogd" swVersion="8.2102.0-111.el9" x-pid="759" x-info="https://www.rsyslog.com"] start
Dec 15 01:42:19 localhost rsyslogd[759]: imjournal: No statefile exists, /var/lib/rsyslog/imjournal.state will be created (ignore if this is first run): No such file or directory [v8.2102.0-111.el9 try https://www.rsyslog.com/e/2040 ]
Dec 15 01:42:19 localhost systemd[1]: Starting User Login Management...
Dec 15 01:42:19 localhost systemd[1]: Started System Logging Service.
Dec 15 01:42:19 localhost systemd[1]: Finished Restore /run/initramfs on shutdown.
Dec 15 01:42:19 localhost chronyd[766]: chronyd version 4.3 starting (+CMDMON +NTP +REFCLOCK +RTC +PRIVDROP +SCFILTER +SIGND +ASYNCDNS +NTS +SECHASH +IPV6 +DEBUG)
Dec 15 01:42:19 localhost chronyd[766]: Using right/UTC timezone to obtain leap second data
Dec 15 01:42:19 localhost chronyd[766]: Loaded seccomp filter (level 2)
Dec 15 01:42:19 localhost systemd[1]: Started NTP client/server.
Dec 15 01:42:19 localhost rsyslogd[759]: imjournal: journal files changed, reloading... [v8.2102.0-111.el9 try https://www.rsyslog.com/e/0 ]
Dec 15 01:42:19 localhost systemd-logind[763]: New seat seat0.
Dec 15 01:42:19 localhost systemd-logind[763]: Watching system buttons on /dev/input/event0 (Power Button)
Dec 15 01:42:19 localhost systemd-logind[763]: Watching system buttons on /dev/input/event1 (AT Translated Set 2 keyboard)
Dec 15 01:42:19 localhost systemd[1]: Started User Login Management.
Dec 15 01:42:19 localhost cloud-init[770]: Cloud-init v. 22.1-9.el9 running 'init-local' at Mon, 15 Dec 2025 06:42:19 +0000. Up 5.49 seconds.
Dec 15 01:42:19 localhost systemd[1]: Starting Hostname Service...
Dec 15 01:42:19 localhost systemd[1]: Started Hostname Service.
Dec 15 01:42:19 localhost systemd-hostnamed[784]: Hostname set to (static)
Dec 15 01:42:19 localhost systemd[1]: Finished Initial cloud-init job (pre-networking).
Dec 15 01:42:19 localhost systemd[1]: Reached target Preparation for Network.
Dec 15 01:42:19 localhost systemd[1]: Starting Network Manager...
Dec 15 01:42:19 localhost NetworkManager[789]: [1765780939.8005] NetworkManager (version 1.42.2-1.el9) is starting... (boot:0d1fb014-fa04-4bbc-9dbb-cbf7146bb597)
Dec 15 01:42:19 localhost NetworkManager[789]: [1765780939.8011] Read config: /etc/NetworkManager/NetworkManager.conf (run: 15-carrier-timeout.conf)
Dec 15 01:42:19 localhost systemd[1]: Started Network Manager.
Dec 15 01:42:19 localhost NetworkManager[789]: [1765780939.8074] bus-manager: acquired D-Bus service "org.freedesktop.NetworkManager"
Dec 15 01:42:19 localhost systemd[1]: Reached target Network.
Dec 15 01:42:19 localhost systemd[1]: Starting Network Manager Wait Online...
Dec 15 01:42:19 localhost NetworkManager[789]: [1765780939.8156] manager[0x55d3abfec020]: monitoring kernel firmware directory '/lib/firmware'.
Dec 15 01:42:19 localhost NetworkManager[789]: [1765780939.8203] hostname: hostname: using hostnamed
Dec 15 01:42:19 localhost NetworkManager[789]: [1765780939.8204] hostname: static hostname changed from (none) to "np0005559462.novalocal"
Dec 15 01:42:19 localhost systemd[1]: Starting GSSAPI Proxy Daemon...
Dec 15 01:42:19 localhost NetworkManager[789]: [1765780939.8211] dns-mgr: init: dns=default,systemd-resolved rc-manager=symlink (auto)
Dec 15 01:42:19 localhost systemd[1]: Starting Enable periodic update of entitlement certificates....
Dec 15 01:42:19 localhost NetworkManager[789]: [1765780939.8318] manager[0x55d3abfec020]: rfkill: Wi-Fi hardware radio set enabled
Dec 15 01:42:19 localhost NetworkManager[789]: [1765780939.8319] manager[0x55d3abfec020]: rfkill: WWAN hardware radio set enabled
Dec 15 01:42:19 localhost systemd[1]: Starting Dynamic System Tuning Daemon...
Dec 15 01:42:19 localhost NetworkManager[789]: [1765780939.8374] Loaded device plugin: NMTeamFactory (/usr/lib64/NetworkManager/1.42.2-1.el9/libnm-device-plugin-team.so)
Dec 15 01:42:19 localhost NetworkManager[789]: [1765780939.8375] manager: rfkill: Wi-Fi enabled by radio killswitch; enabled by state file
Dec 15 01:42:19 localhost NetworkManager[789]: [1765780939.8378] manager: rfkill: WWAN enabled by radio killswitch; enabled by state file
Dec 15 01:42:19 localhost NetworkManager[789]: [1765780939.8379] manager: Networking is enabled by state file
Dec 15 01:42:19 localhost systemd[1]: Started Enable periodic update of entitlement certificates..
Dec 15 01:42:19 localhost NetworkManager[789]: [1765780939.8395] settings: Loaded settings plugin: ifcfg-rh ("/usr/lib64/NetworkManager/1.42.2-1.el9/libnm-settings-plugin-ifcfg-rh.so")
Dec 15 01:42:19 localhost NetworkManager[789]: [1765780939.8395] settings: Loaded settings plugin: keyfile (internal)
Dec 15 01:42:19 localhost NetworkManager[789]: [1765780939.8429] dhcp: init: Using DHCP client 'internal'
Dec 15 01:42:19 localhost NetworkManager[789]: [1765780939.8434] manager: (lo): new Loopback device (/org/freedesktop/NetworkManager/Devices/1)
Dec 15 01:42:19 localhost systemd[1]: Started GSSAPI Proxy Daemon.
Dec 15 01:42:19 localhost NetworkManager[789]: [1765780939.8455] device (lo): state change: unmanaged -> unavailable (reason 'connection-assumed', sys-iface-state: 'external')
Dec 15 01:42:19 localhost NetworkManager[789]: [1765780939.8465] device (lo): state change: unavailable -> disconnected (reason 'connection-assumed', sys-iface-state: 'external')
Dec 15 01:42:19 localhost NetworkManager[789]: [1765780939.8478] device (lo): Activation: starting connection 'lo' (3f7c3e97-b35c-4041-9026-128161f4820a)
Dec 15 01:42:19 localhost NetworkManager[789]: [1765780939.8492] manager: (eth0): new Ethernet device (/org/freedesktop/NetworkManager/Devices/2)
Dec 15 01:42:19 localhost NetworkManager[789]: [1765780939.8498] device (eth0): state change: unmanaged -> unavailable (reason 'managed', sys-iface-state: 'external')
Dec 15 01:42:19 localhost systemd[1]: Listening on Load/Save RF Kill Switch Status /dev/rfkill Watch.
Dec 15 01:42:19 localhost systemd[1]: Starting Network Manager Script Dispatcher Service...
Dec 15 01:42:19 localhost NetworkManager[789]: [1765780939.8568] device (lo): state change: disconnected -> prepare (reason 'none', sys-iface-state: 'external')
Dec 15 01:42:19 localhost NetworkManager[789]: [1765780939.8573] device (lo): state change: prepare -> config (reason 'none', sys-iface-state: 'external')
Dec 15 01:42:19 localhost NetworkManager[789]: [1765780939.8578] device (lo): state change: config -> ip-config (reason 'none', sys-iface-state: 'external')
Dec 15 01:42:19 localhost NetworkManager[789]: [1765780939.8585] device (eth0): carrier: link connected
Dec 15 01:42:19 localhost NetworkManager[789]: [1765780939.8590] device (lo): state change: ip-config -> ip-check (reason 'none', sys-iface-state: 'external')
Dec 15 01:42:19 localhost systemd[1]: RPC security service for NFS client and server was skipped because of an unmet condition check (ConditionPathExists=/etc/krb5.keytab).
Dec 15 01:42:19 localhost systemd[1]: Reached target NFS client services.
Dec 15 01:42:19 localhost NetworkManager[789]: [1765780939.8601] device (eth0): state change: unavailable -> disconnected (reason 'carrier-changed', sys-iface-state: 'managed')
Dec 15 01:42:19 localhost systemd[1]: Reached target Preparation for Remote File Systems.
Dec 15 01:42:19 localhost NetworkManager[789]: [1765780939.8645] policy: auto-activating connection 'System eth0' (5fb06bd0-0bb0-7ffb-45f1-d6edd65f3e03)
Dec 15 01:42:19 localhost systemd[1]: Reached target Remote File Systems.
Dec 15 01:42:19 localhost NetworkManager[789]: [1765780939.8654] device (eth0): Activation: starting connection 'System eth0' (5fb06bd0-0bb0-7ffb-45f1-d6edd65f3e03)
Dec 15 01:42:19 localhost NetworkManager[789]: [1765780939.8656] device (eth0): state change: disconnected -> prepare (reason 'none', sys-iface-state: 'managed')
Dec 15 01:42:19 localhost NetworkManager[789]: [1765780939.8663] manager: NetworkManager state is now CONNECTING
Dec 15 01:42:19 localhost NetworkManager[789]: [1765780939.8666] device (eth0): state change: prepare -> config (reason 'none', sys-iface-state: 'managed')
Dec 15 01:42:19 localhost systemd[1]: TPM2 PCR Barrier (User) was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/StubPcrKernelImage-4a67b082-0a4c-41cf-b6c7-440b29bb8c4f).
Dec 15 01:42:19 localhost NetworkManager[789]: [1765780939.8680] device (eth0): state change: config -> ip-config (reason 'none', sys-iface-state: 'managed')
Dec 15 01:42:19 localhost NetworkManager[789]: [1765780939.8686] dhcp4 (eth0): activation: beginning transaction (timeout in 45 seconds)
Dec 15 01:42:19 localhost NetworkManager[789]: [1765780939.8747] dhcp4 (eth0): state changed new lease, address=38.102.83.173
Dec 15 01:42:19 localhost systemd[1]: Started Network Manager Script Dispatcher Service.
Dec 15 01:42:19 localhost NetworkManager[789]: [1765780939.8754] policy: set 'System eth0' (eth0) as default for IPv4 routing and DNS
Dec 15 01:42:19 localhost NetworkManager[789]: [1765780939.8788] device (eth0): state change: ip-config -> ip-check (reason 'none', sys-iface-state: 'managed')
Dec 15 01:42:19 localhost NetworkManager[789]: [1765780939.8799] device (lo): state change: ip-check -> secondaries (reason 'none', sys-iface-state: 'external')
Dec 15 01:42:19 localhost NetworkManager[789]: [1765780939.8802] device (lo): state change: secondaries -> activated (reason 'none', sys-iface-state: 'external')
Dec 15 01:42:19 localhost NetworkManager[789]: [1765780939.8808] device (lo): Activation: successful, device activated.
Dec 15 01:42:19 localhost NetworkManager[789]: [1765780939.8826] device (eth0): state change: ip-check -> secondaries (reason 'none', sys-iface-state: 'managed')
Dec 15 01:42:19 localhost NetworkManager[789]: [1765780939.8828] device (eth0): state change: secondaries -> activated (reason 'none', sys-iface-state: 'managed')
Dec 15 01:42:19 localhost NetworkManager[789]: [1765780939.8834] manager: NetworkManager state is now CONNECTED_SITE
Dec 15 01:42:19 localhost NetworkManager[789]: [1765780939.8838] device (eth0): Activation: successful, device activated.
Dec 15 01:42:19 localhost NetworkManager[789]: [1765780939.8845] manager: NetworkManager state is now CONNECTED_GLOBAL
Dec 15 01:42:19 localhost NetworkManager[789]: [1765780939.8849] manager: startup complete
Dec 15 01:42:19 localhost systemd[1]: Finished Network Manager Wait Online.
Dec 15 01:42:19 localhost systemd[1]: Starting Initial cloud-init job (metadata service crawler)...
Dec 15 01:42:20 localhost cloud-init[986]: Cloud-init v. 22.1-9.el9 running 'init' at Mon, 15 Dec 2025 06:42:20 +0000. Up 6.29 seconds.
Dec 15 01:42:20 localhost cloud-init[986]: ci-info: +++++++++++++++++++++++++++++++++++++++Net device info+++++++++++++++++++++++++++++++++++++++
Dec 15 01:42:20 localhost cloud-init[986]: ci-info: +--------+------+------------------------------+---------------+--------+-------------------+
Dec 15 01:42:20 localhost cloud-init[986]: ci-info: | Device |  Up  |           Address            |      Mask     | Scope  |     Hw-Address    |
Dec 15 01:42:20 localhost cloud-init[986]: ci-info: +--------+------+------------------------------+---------------+--------+-------------------+
Dec 15 01:42:20 localhost cloud-init[986]: ci-info: |  eth0  | True |        38.102.83.173         | 255.255.255.0 | global | fa:16:3e:ba:ca:0f |
Dec 15 01:42:20 localhost cloud-init[986]: ci-info: |  eth0  | True | fe80::f816:3eff:feba:ca0f/64 |       .       |  link  | fa:16:3e:ba:ca:0f |
Dec 15 01:42:20 localhost cloud-init[986]: ci-info: |   lo   | True |          127.0.0.1           |   255.0.0.0   |  host  |         .         |
Dec 15 01:42:20 localhost cloud-init[986]: ci-info: |   lo   | True |           ::1/128            |       .       |  host  |         .         |
Dec 15 01:42:20 localhost cloud-init[986]: ci-info: +--------+------+------------------------------+---------------+--------+-------------------+
Dec 15 01:42:20 localhost cloud-init[986]: ci-info: +++++++++++++++++++++++++++++++++Route IPv4 info+++++++++++++++++++++++++++++++++
Dec 15 01:42:20 localhost cloud-init[986]: ci-info: +-------+-----------------+---------------+-----------------+-----------+-------+
Dec 15 01:42:20 localhost cloud-init[986]: ci-info: | Route |   Destination   |    Gateway    |     Genmask     | Interface | Flags |
Dec 15 01:42:20 localhost cloud-init[986]: ci-info: +-------+-----------------+---------------+-----------------+-----------+-------+
Dec 15 01:42:20 localhost cloud-init[986]: ci-info: |   0   |     0.0.0.0     |  38.102.83.1  |     0.0.0.0     |    eth0   |   UG  |
Dec 15 01:42:20 localhost cloud-init[986]: ci-info: |   1   |   38.102.83.0   |    0.0.0.0    |  255.255.255.0  |    eth0   |   U   |
Dec 15 01:42:20 localhost cloud-init[986]: ci-info: |   2   | 169.254.169.254 | 38.102.83.126 | 255.255.255.255 |    eth0   |  UGH  |
Dec 15 01:42:20 localhost cloud-init[986]: ci-info: +-------+-----------------+---------------+-----------------+-----------+-------+
Dec 15 01:42:20 localhost cloud-init[986]: ci-info: +++++++++++++++++++Route IPv6 info+++++++++++++++++++
Dec 15 01:42:20 localhost cloud-init[986]: ci-info: +-------+-------------+---------+-----------+-------+
Dec 15 01:42:20 localhost cloud-init[986]: ci-info: | Route | Destination | Gateway | Interface | Flags |
Dec 15 01:42:20 localhost cloud-init[986]: ci-info: +-------+-------------+---------+-----------+-------+
Dec 15 01:42:20 localhost cloud-init[986]: ci-info: |   1   |  fe80::/64  |    ::   |    eth0   |   U   |
Dec 15 01:42:20 localhost cloud-init[986]: ci-info: |   3   |  multicast  |    ::   |    eth0   |   U   |
Dec 15 01:42:20 localhost cloud-init[986]: ci-info: +-------+-------------+---------+-----------+-------+
Dec 15 01:42:20 localhost systemd[1]: Starting Authorization Manager...
Dec 15 01:42:20 localhost polkitd[1036]: Started polkitd version 0.117
Dec 15 01:42:20 localhost systemd[1]: Started Dynamic System Tuning Daemon.
Dec 15 01:42:20 localhost systemd[1]: Started Authorization Manager.
Dec 15 01:42:22 localhost cloud-init[986]: Generating public/private rsa key pair.
Dec 15 01:42:22 localhost cloud-init[986]: Your identification has been saved in /etc/ssh/ssh_host_rsa_key
Dec 15 01:42:22 localhost cloud-init[986]: Your public key has been saved in /etc/ssh/ssh_host_rsa_key.pub
Dec 15 01:42:22 localhost cloud-init[986]: The key fingerprint is:
Dec 15 01:42:22 localhost cloud-init[986]: SHA256:zDcS9F3YpHgeq6c6DXbZK6Nyv3gqQ+VLrFf21XoZOXk root@np0005559462.novalocal
Dec 15 01:42:22 localhost cloud-init[986]: The key's randomart image is:
Dec 15 01:42:22 localhost cloud-init[986]: +---[RSA 3072]----+
Dec 15 01:42:22 localhost cloud-init[986]: | . +o |
Dec 15 01:42:22 localhost cloud-init[986]: | . . o.o. |
Dec 15 01:42:22 localhost cloud-init[986]: | . o = |
Dec 15 01:42:22 localhost cloud-init[986]: | o.. o o |
Dec 15 01:42:22 localhost cloud-init[986]: | +S ooo .o|
Dec 15 01:42:22 localhost cloud-init[986]: | . *o=o. .=E|
Dec 15 01:42:22 localhost cloud-init[986]: | . + B...o .=|
Dec 15 01:42:22 localhost cloud-init[986]: | = =.=oo ...|
Dec 15 01:42:22 localhost cloud-init[986]: | *=B++ . |
Dec 15 01:42:22 localhost cloud-init[986]: +----[SHA256]-----+
Dec 15 01:42:22 localhost cloud-init[986]: Generating public/private ecdsa key pair.
Dec 15 01:42:22 localhost cloud-init[986]: Your identification has been saved in /etc/ssh/ssh_host_ecdsa_key
Dec 15 01:42:22 localhost cloud-init[986]: Your public key has been saved in /etc/ssh/ssh_host_ecdsa_key.pub
Dec 15 01:42:22 localhost cloud-init[986]: The key fingerprint is:
Dec 15 01:42:22 localhost cloud-init[986]: SHA256:TFCpqEMD8gNj3k8lKjMc1d8Z35jSwh8Ag+Rk3tub6hk root@np0005559462.novalocal
Dec 15 01:42:22 localhost cloud-init[986]: The key's randomart image is:
Dec 15 01:42:22 localhost cloud-init[986]: +---[ECDSA 256]---+
Dec 15 01:42:22 localhost cloud-init[986]: | ..o+o+o. |
Dec 15 01:42:22 localhost cloud-init[986]: |=o =+.ooo |
Dec 15 01:42:22 localhost cloud-init[986]: |==o .++o+ * + |
Dec 15 01:42:22 localhost cloud-init[986]: | *=o...+o* * . |
Dec 15 01:42:22 localhost cloud-init[986]: | .++o .S.+ . |
Dec 15 01:42:22 localhost cloud-init[986]: | o . o. |
Dec 15 01:42:22 localhost cloud-init[986]: | . E o |
Dec 15 01:42:22 localhost cloud-init[986]: | + |
Dec 15 01:42:22 localhost cloud-init[986]: | .+ |
Dec 15 01:42:22 localhost cloud-init[986]: +----[SHA256]-----+
Dec 15 01:42:22 localhost cloud-init[986]: Generating public/private ed25519 key pair.
Dec 15 01:42:22 localhost cloud-init[986]: Your identification has been saved in /etc/ssh/ssh_host_ed25519_key
Dec 15 01:42:22 localhost cloud-init[986]: Your public key has been saved in /etc/ssh/ssh_host_ed25519_key.pub
Dec 15 01:42:22 localhost cloud-init[986]: The key fingerprint is:
Dec 15 01:42:22 localhost cloud-init[986]: SHA256:Q0EUYrhTfCVLxnc8fyDBI5padYoVlff+2qFF1UKLomA root@np0005559462.novalocal
Dec 15 01:42:22 localhost cloud-init[986]: The key's randomart image is:
Dec 15 01:42:22 localhost cloud-init[986]: +--[ED25519 256]--+
Dec 15 01:42:22 localhost cloud-init[986]: | oo+Ooo=oo. |
Dec 15 01:42:22 localhost cloud-init[986]: | ..o+o+= Xoo..|
Dec 15 01:42:22 localhost cloud-init[986]: | oE.oB.=.*ooo|
Dec 15 01:42:22 localhost cloud-init[986]: | o. o=... ..+|
Dec 15 01:42:22 localhost cloud-init[986]: | . oS + |
Dec 15 01:42:22 localhost cloud-init[986]: | . . . .|
Dec 15 01:42:22 localhost cloud-init[986]: | o.|
Dec 15 01:42:22 localhost cloud-init[986]: | o.o|
Dec 15 01:42:22 localhost cloud-init[986]: | ....|
Dec 15 01:42:22 localhost cloud-init[986]: +----[SHA256]-----+
Dec 15 01:42:22 localhost systemd[1]: Finished Initial cloud-init job (metadata service crawler).
Dec 15 01:42:22 localhost systemd[1]: Reached target Cloud-config availability.
Dec 15 01:42:22 localhost systemd[1]: Reached target Network is Online.
Dec 15 01:42:22 localhost systemd[1]: Starting Apply the settings specified in cloud-config...
Dec 15 01:42:22 localhost systemd[1]: Run Insights Client at boot was skipped because of an unmet condition check (ConditionPathExists=/etc/insights-client/.run_insights_client_next_boot).
Dec 15 01:42:22 localhost systemd[1]: Starting Crash recovery kernel arming...
Dec 15 01:42:22 localhost systemd[1]: Starting Notify NFS peers of a restart...
Dec 15 01:42:22 localhost systemd[1]: Starting OpenSSH server daemon...
Dec 15 01:42:22 localhost sm-notify[1128]: Version 2.5.4 starting
Dec 15 01:42:22 localhost systemd[1]: Starting Permit User Sessions...
Dec 15 01:42:22 localhost systemd[1]: Started Notify NFS peers of a restart.
Dec 15 01:42:22 localhost systemd[1]: Finished Permit User Sessions.
Dec 15 01:42:22 localhost systemd[1]: Started Command Scheduler.
Dec 15 01:42:22 localhost sshd[1129]: main: sshd: ssh-rsa algorithm is disabled
Dec 15 01:42:22 localhost systemd[1]: Started Getty on tty1.
Dec 15 01:42:22 localhost systemd[1]: Started Serial Getty on ttyS0.
Dec 15 01:42:22 localhost systemd[1]: Reached target Login Prompts.
Dec 15 01:42:22 localhost systemd[1]: Started OpenSSH server daemon.
Dec 15 01:42:22 localhost systemd[1]: Reached target Multi-User System.
Dec 15 01:42:22 localhost systemd[1]: Starting Record Runlevel Change in UTMP...
Dec 15 01:42:22 localhost sshd[1140]: main: sshd: ssh-rsa algorithm is disabled
Dec 15 01:42:22 localhost systemd[1]: systemd-update-utmp-runlevel.service: Deactivated successfully.
Dec 15 01:42:22 localhost systemd[1]: Finished Record Runlevel Change in UTMP.
Dec 15 01:42:22 localhost sshd[1160]: main: sshd: ssh-rsa algorithm is disabled
Dec 15 01:42:22 localhost sshd[1167]: main: sshd: ssh-rsa algorithm is disabled
Dec 15 01:42:22 localhost sshd[1173]: main: sshd: ssh-rsa algorithm is disabled
Dec 15 01:42:22 localhost kdumpctl[1133]: kdump: No kdump initial ramdisk found.
Dec 15 01:42:22 localhost kdumpctl[1133]: kdump: Rebuilding /boot/initramfs-5.14.0-284.11.1.el9_2.x86_64kdump.img
Dec 15 01:42:22 localhost sshd[1191]: main: sshd: ssh-rsa algorithm is disabled
Dec 15 01:42:22 localhost sshd[1195]: main: sshd: ssh-rsa algorithm is disabled
Dec 15 01:42:22 localhost sshd[1211]: main: sshd: ssh-rsa algorithm is disabled
Dec 15 01:42:22 localhost sshd[1228]: main: sshd: ssh-rsa algorithm is disabled
Dec 15 01:42:22 localhost sshd[1245]: main: sshd: ssh-rsa algorithm is disabled
Dec 15 01:42:22 localhost cloud-init[1283]: Cloud-init v. 22.1-9.el9 running 'modules:config' at Mon, 15 Dec 2025 06:42:22 +0000. Up 8.61 seconds.
Dec 15 01:42:22 localhost systemd[1]: Finished Apply the settings specified in cloud-config.
Dec 15 01:42:22 localhost systemd[1]: Starting Execute cloud user/final scripts...
Dec 15 01:42:22 localhost dracut[1431]: dracut-057-21.git20230214.el9
Dec 15 01:42:22 localhost dracut[1433]: Executing: /usr/bin/dracut --add kdumpbase --quiet --hostonly --hostonly-cmdline --hostonly-i18n --hostonly-mode strict --hostonly-nics -o "plymouth resume ifcfg earlykdump" --mount "/dev/disk/by-uuid/a3dd82de-ffc6-4652-88b9-80e003b8f20a /sysroot xfs rw,relatime,seclabel,attr2,inode64,logbufs=8,logbsize=32k,noquota" --squash-compressor zstd --no-hostonly-default-device -f /boot/initramfs-5.14.0-284.11.1.el9_2.x86_64kdump.img 5.14.0-284.11.1.el9_2.x86_64
Dec 15 01:42:22 localhost cloud-init[1494]: Cloud-init v. 22.1-9.el9 running 'modules:final' at Mon, 15 Dec 2025 06:42:22 +0000. Up 8.97 seconds.
Dec 15 01:42:22 localhost dracut[1433]: dracut module 'systemd-networkd' will not be installed, because command 'networkctl' could not be found!
Dec 15 01:42:22 localhost dracut[1433]: dracut module 'systemd-networkd' will not be installed, because command '/usr/lib/systemd/systemd-networkd' could not be found!
Dec 15 01:42:22 localhost dracut[1433]: dracut module 'systemd-networkd' will not be installed, because command '/usr/lib/systemd/systemd-networkd-wait-online' could not be found!
Dec 15 01:42:22 localhost dracut[1433]: dracut module 'systemd-resolved' will not be installed, because command 'resolvectl' could not be found!
Dec 15 01:42:22 localhost dracut[1433]: dracut module 'systemd-resolved' will not be installed, because command '/usr/lib/systemd/systemd-resolved' could not be found!
Dec 15 01:42:22 localhost dracut[1433]: dracut module 'systemd-timesyncd' will not be installed, because command '/usr/lib/systemd/systemd-timesyncd' could not be found!
Dec 15 01:42:22 localhost dracut[1433]: dracut module 'systemd-timesyncd' will not be installed, because command '/usr/lib/systemd/systemd-time-wait-sync' could not be found!
Dec 15 01:42:22 localhost cloud-init[1579]: #############################################################
Dec 15 01:42:22 localhost cloud-init[1582]: -----BEGIN SSH HOST KEY FINGERPRINTS-----
Dec 15 01:42:22 localhost dracut[1433]: dracut module 'busybox' will not be installed, because command 'busybox' could not be found!
Dec 15 01:42:22 localhost cloud-init[1593]: 256 SHA256:TFCpqEMD8gNj3k8lKjMc1d8Z35jSwh8Ag+Rk3tub6hk root@np0005559462.novalocal (ECDSA)
Dec 15 01:42:22 localhost dracut[1433]: dracut module 'dbus-daemon' will not be installed, because command 'dbus-daemon' could not be found!
Dec 15 01:42:22 localhost dracut[1433]: dracut module 'rngd' will not be installed, because command 'rngd' could not be found!
Dec 15 01:42:22 localhost cloud-init[1602]: 256 SHA256:Q0EUYrhTfCVLxnc8fyDBI5padYoVlff+2qFF1UKLomA root@np0005559462.novalocal (ED25519)
Dec 15 01:42:22 localhost cloud-init[1607]: 3072 SHA256:zDcS9F3YpHgeq6c6DXbZK6Nyv3gqQ+VLrFf21XoZOXk root@np0005559462.novalocal (RSA)
Dec 15 01:42:22 localhost cloud-init[1609]: -----END SSH HOST KEY FINGERPRINTS-----
Dec 15 01:42:22 localhost cloud-init[1612]: #############################################################
Dec 15 01:42:22 localhost dracut[1433]: dracut module 'connman' will not be installed, because command 'connmand' could not be found!
Dec 15 01:42:22 localhost dracut[1433]: dracut module 'connman' will not be installed, because command 'connmanctl' could not be found!
Dec 15 01:42:22 localhost dracut[1433]: dracut module 'connman' will not be installed, because command 'connmand-wait-online' could not be found!
Dec 15 01:42:22 localhost dracut[1433]: dracut module 'network-wicked' will not be installed, because command 'wicked' could not be found!
Dec 15 01:42:22 localhost dracut[1433]: 62bluetooth: Could not find any command of '/usr/lib/bluetooth/bluetoothd /usr/libexec/bluetooth/bluetoothd'!
Dec 15 01:42:22 localhost dracut[1433]: dracut module 'lvmmerge' will not be installed, because command 'lvm' could not be found!
Dec 15 01:42:23 localhost dracut[1433]: dracut module 'lvmthinpool-monitor' will not be installed, because command 'lvm' could not be found!
Dec 15 01:42:23 localhost dracut[1433]: dracut module 'btrfs' will not be installed, because command 'btrfs' could not be found!
Dec 15 01:42:23 localhost cloud-init[1494]: Cloud-init v. 22.1-9.el9 finished at Mon, 15 Dec 2025 06:42:23 +0000. Datasource DataSourceConfigDrive [net,ver=2][source=/dev/sr0]. Up 9.19 seconds
Dec 15 01:42:23 localhost dracut[1433]: dracut module 'dmraid' will not be installed, because command 'dmraid' could not be found!
Dec 15 01:42:23 localhost dracut[1433]: dracut module 'lvm' will not be installed, because command 'lvm' could not be found!
Dec 15 01:42:23 localhost dracut[1433]: dracut module 'mdraid' will not be installed, because command 'mdadm' could not be found!
Dec 15 01:42:23 localhost dracut[1433]: dracut module 'pcsc' will not be installed, because command 'pcscd' could not be found!
Dec 15 01:42:23 localhost dracut[1433]: dracut module 'tpm2-tss' will not be installed, because command 'tpm2' could not be found!
Dec 15 01:42:23 localhost dracut[1433]: dracut module 'cifs' will not be installed, because command 'mount.cifs' could not be found!
Dec 15 01:42:23 localhost systemd[1]: Reloading Network Manager...
Dec 15 01:42:23 localhost NetworkManager[789]: [1765780943.0993] audit: op="reload" arg="0" pid=1701 uid=0 result="success"
Dec 15 01:42:23 localhost NetworkManager[789]: [1765780943.1005] config: signal: SIGHUP (no changes from disk)
Dec 15 01:42:23 localhost systemd[1]: Reloaded Network Manager.
Dec 15 01:42:23 localhost systemd[1]: Finished Execute cloud user/final scripts.
Dec 15 01:42:23 localhost systemd[1]: Reached target Cloud-init target.
Dec 15 01:42:23 localhost dracut[1433]: dracut module 'iscsi' will not be installed, because command 'iscsi-iname' could not be found!
Dec 15 01:42:23 localhost dracut[1433]: dracut module 'iscsi' will not be installed, because command 'iscsiadm' could not be found!
Dec 15 01:42:23 localhost dracut[1433]: dracut module 'iscsi' will not be installed, because command 'iscsid' could not be found!
Dec 15 01:42:23 localhost dracut[1433]: dracut module 'nvmf' will not be installed, because command 'nvme' could not be found!
Dec 15 01:42:23 localhost dracut[1433]: dracut module 'biosdevname' will not be installed, because command 'biosdevname' could not be found!
Dec 15 01:42:23 localhost dracut[1433]: dracut module 'memstrack' will not be installed, because command 'memstrack' could not be found!
Dec 15 01:42:23 localhost dracut[1433]: memstrack is not available
Dec 15 01:42:23 localhost dracut[1433]: If you need to use rd.memdebug>=4, please install memstrack and procps-ng
Dec 15 01:42:23 localhost dracut[1433]: dracut module 'systemd-resolved' will not be installed, because command 'resolvectl' could not be found!
Dec 15 01:42:23 localhost dracut[1433]: dracut module 'systemd-resolved' will not be installed, because command '/usr/lib/systemd/systemd-resolved' could not be found!
Dec 15 01:42:23 localhost dracut[1433]: dracut module 'systemd-timesyncd' will not be installed, because command '/usr/lib/systemd/systemd-timesyncd' could not be found!
Dec 15 01:42:23 localhost dracut[1433]: dracut module 'systemd-timesyncd' will not be installed, because command '/usr/lib/systemd/systemd-time-wait-sync' could not be found!
Dec 15 01:42:23 localhost dracut[1433]: dracut module 'busybox' will not be installed, because command 'busybox' could not be found!
Dec 15 01:42:23 localhost dracut[1433]: dracut module 'dbus-daemon' will not be installed, because command 'dbus-daemon' could not be found!
Dec 15 01:42:23 localhost dracut[1433]: dracut module 'rngd' will not be installed, because command 'rngd' could not be found!
Dec 15 01:42:23 localhost dracut[1433]: dracut module 'connman' will not be installed, because command 'connmand' could not be found!
Dec 15 01:42:23 localhost dracut[1433]: dracut module 'connman' will not be installed, because command 'connmanctl' could not be found!
Dec 15 01:42:23 localhost dracut[1433]: dracut module 'connman' will not be installed, because command 'connmand-wait-online' could not be found!
Dec 15 01:42:23 localhost dracut[1433]: dracut module 'network-wicked' will not be installed, because command 'wicked' could not be found!
Dec 15 01:42:23 localhost dracut[1433]: 62bluetooth: Could not find any command of '/usr/lib/bluetooth/bluetoothd /usr/libexec/bluetooth/bluetoothd'!
Dec 15 01:42:23 localhost dracut[1433]: dracut module 'lvmmerge' will not be installed, because command 'lvm' could not be found!
Dec 15 01:42:23 localhost dracut[1433]: dracut module 'lvmthinpool-monitor' will not be installed, because command 'lvm' could not be found!
Dec 15 01:42:23 localhost dracut[1433]: dracut module 'btrfs' will not be installed, because command 'btrfs' could not be found!
Dec 15 01:42:23 localhost dracut[1433]: dracut module 'dmraid' will not be installed, because command 'dmraid' could not be found!
Dec 15 01:42:23 localhost dracut[1433]: dracut module 'lvm' will not be installed, because command 'lvm' could not be found!
Dec 15 01:42:23 localhost dracut[1433]: dracut module 'mdraid' will not be installed, because command 'mdadm' could not be found!
Dec 15 01:42:23 localhost dracut[1433]: dracut module 'pcsc' will not be installed, because command 'pcscd' could not be found!
Dec 15 01:42:23 localhost dracut[1433]: dracut module 'tpm2-tss' will not be installed, because command 'tpm2' could not be found!
Dec 15 01:42:23 localhost dracut[1433]: dracut module 'cifs' will not be installed, because command 'mount.cifs' could not be found!
Dec 15 01:42:23 localhost dracut[1433]: dracut module 'iscsi' will not be installed, because command 'iscsi-iname' could not be found!
Dec 15 01:42:23 localhost dracut[1433]: dracut module 'iscsi' will not be installed, because command 'iscsiadm' could not be found!
Dec 15 01:42:23 localhost dracut[1433]: dracut module 'iscsi' will not be installed, because command 'iscsid' could not be found!
Dec 15 01:42:23 localhost dracut[1433]: dracut module 'nvmf' will not be installed, because command 'nvme' could not be found!
Dec 15 01:42:23 localhost dracut[1433]: dracut module 'memstrack' will not be installed, because command 'memstrack' could not be found!
Dec 15 01:42:23 localhost dracut[1433]: memstrack is not available
Dec 15 01:42:23 localhost dracut[1433]: If you need to use rd.memdebug>=4, please install memstrack and procps-ng
Dec 15 01:42:23 localhost dracut[1433]: *** Including module: systemd ***
Dec 15 01:42:23 localhost dracut[1433]: *** Including module: systemd-initrd ***
Dec 15 01:42:23 localhost dracut[1433]: *** Including module: i18n ***
Dec 15 01:42:24 localhost dracut[1433]: No KEYMAP configured.
Dec 15 01:42:24 localhost dracut[1433]: *** Including module: drm ***
Dec 15 01:42:24 localhost dracut[1433]: *** Including module: prefixdevname ***
Dec 15 01:42:24 localhost dracut[1433]: *** Including module: kernel-modules ***
Dec 15 01:42:24 localhost chronyd[766]: Selected source 149.56.19.163 (2.rhel.pool.ntp.org)
Dec 15 01:42:24 localhost chronyd[766]: System clock TAI offset set to 37 seconds
Dec 15 01:42:24 localhost dracut[1433]: *** Including module: kernel-modules-extra ***
Dec 15 01:42:25 localhost dracut[1433]: *** Including module: qemu ***
Dec 15 01:42:25 localhost dracut[1433]: *** Including module: fstab-sys ***
Dec 15 01:42:25 localhost dracut[1433]: *** Including module: rootfs-block ***
Dec 15 01:42:25 localhost dracut[1433]: *** Including module: terminfo ***
Dec 15 01:42:25 localhost dracut[1433]: *** Including module: udev-rules ***
Dec 15 01:42:25 localhost dracut[1433]: Skipping udev rule: 91-permissions.rules
Dec 15 01:42:25 localhost dracut[1433]: Skipping udev rule: 80-drivers-modprobe.rules
Dec 15 01:42:25 localhost dracut[1433]: *** Including module: virtiofs ***
Dec 15 01:42:25 localhost dracut[1433]: *** Including module: dracut-systemd ***
Dec 15 01:42:25 localhost dracut[1433]: *** Including module: usrmount ***
Dec 15 01:42:25 localhost dracut[1433]: *** Including module: base ***
Dec 15 01:42:25 localhost dracut[1433]: *** Including module: fs-lib ***
Dec 15 01:42:25 localhost dracut[1433]: *** Including module: kdumpbase ***
Dec 15 01:42:26 localhost chronyd[766]: Selected source 54.39.23.64 (2.rhel.pool.ntp.org)
Dec 15 01:42:26 localhost dracut[1433]: *** Including module: microcode_ctl-fw_dir_override ***
Dec 15 01:42:26 localhost dracut[1433]: microcode_ctl module: mangling fw_dir
Dec 15 01:42:26 localhost dracut[1433]: microcode_ctl: processing data directory "/usr/share/microcode_ctl/ucode_with_caveats/intel"...
Dec 15 01:42:26 localhost dracut[1433]: microcode_ctl: configuration "intel" is ignored
Dec 15 01:42:26 localhost dracut[1433]: microcode_ctl: processing data directory "/usr/share/microcode_ctl/ucode_with_caveats/intel-06-2d-07"...
Dec 15 01:42:26 localhost dracut[1433]: microcode_ctl: configuration "intel-06-2d-07" is ignored
Dec 15 01:42:26 localhost dracut[1433]: microcode_ctl: processing data directory "/usr/share/microcode_ctl/ucode_with_caveats/intel-06-4e-03"...
Dec 15 01:42:26 localhost dracut[1433]: microcode_ctl: configuration "intel-06-4e-03" is ignored
Dec 15 01:42:26 localhost dracut[1433]: microcode_ctl: processing data directory "/usr/share/microcode_ctl/ucode_with_caveats/intel-06-4f-01"...
Dec 15 01:42:26 localhost dracut[1433]: microcode_ctl: configuration "intel-06-4f-01" is ignored
Dec 15 01:42:26 localhost dracut[1433]: microcode_ctl: processing data directory "/usr/share/microcode_ctl/ucode_with_caveats/intel-06-55-04"...
Dec 15 01:42:26 localhost dracut[1433]: microcode_ctl: configuration "intel-06-55-04" is ignored
Dec 15 01:42:26 localhost dracut[1433]: microcode_ctl: processing data directory "/usr/share/microcode_ctl/ucode_with_caveats/intel-06-5e-03"...
Dec 15 01:42:26 localhost dracut[1433]: microcode_ctl: configuration "intel-06-5e-03" is ignored
Dec 15 01:42:26 localhost dracut[1433]: microcode_ctl: processing data directory "/usr/share/microcode_ctl/ucode_with_caveats/intel-06-8c-01"...
Dec 15 01:42:26 localhost dracut[1433]: microcode_ctl: configuration "intel-06-8c-01" is ignored
Dec 15 01:42:26 localhost dracut[1433]: microcode_ctl: processing data directory "/usr/share/microcode_ctl/ucode_with_caveats/intel-06-8e-9e-0x-0xca"...
Dec 15 01:42:26 localhost dracut[1433]: microcode_ctl: configuration "intel-06-8e-9e-0x-0xca" is ignored
Dec 15 01:42:26 localhost dracut[1433]: microcode_ctl: processing data directory "/usr/share/microcode_ctl/ucode_with_caveats/intel-06-8e-9e-0x-dell"...
Dec 15 01:42:26 localhost dracut[1433]: microcode_ctl: configuration "intel-06-8e-9e-0x-dell" is ignored
Dec 15 01:42:26 localhost dracut[1433]: microcode_ctl: final fw_dir: "/lib/firmware/updates/5.14.0-284.11.1.el9_2.x86_64 /lib/firmware/updates /lib/firmware/5.14.0-284.11.1.el9_2.x86_64 /lib/firmware"
Dec 15 01:42:26 localhost dracut[1433]: *** Including module: shutdown ***
Dec 15 01:42:26 localhost dracut[1433]: *** Including module: squash ***
Dec 15 01:42:26 localhost dracut[1433]: *** Including modules done ***
Dec 15 01:42:26 localhost dracut[1433]: *** Installing kernel module dependencies ***
Dec 15 01:42:27 localhost dracut[1433]: *** Installing kernel module dependencies done ***
Dec 15 01:42:27 localhost dracut[1433]: *** Resolving executable dependencies ***
Dec 15 01:42:28 localhost dracut[1433]: *** Resolving executable dependencies done ***
Dec 15 01:42:28 localhost dracut[1433]: *** Hardlinking files ***
Dec 15 01:42:28 localhost dracut[1433]: Mode: real
Dec 15 01:42:28 localhost dracut[1433]: Files: 1099
Dec 15 01:42:28 localhost dracut[1433]: Linked: 3 files
Dec 15 01:42:28 localhost dracut[1433]: Compared: 0 xattrs
Dec 15 01:42:28 localhost dracut[1433]: Compared: 373 files
Dec 15 01:42:28 localhost dracut[1433]: Saved: 61.04 KiB
Dec 15 01:42:28 localhost dracut[1433]: Duration: 0.026307 seconds
Dec 15 01:42:28 localhost dracut[1433]: *** Hardlinking files done ***
Dec 15 01:42:28 localhost dracut[1433]: Could not find 'strip'. Not stripping the initramfs.
Dec 15 01:42:28 localhost dracut[1433]: *** Generating early-microcode cpio image ***
Dec 15 01:42:28 localhost dracut[1433]: *** Constructing AuthenticAMD.bin ***
Dec 15 01:42:28 localhost dracut[1433]: *** Store current command line parameters ***
Dec 15 01:42:28 localhost dracut[1433]: Stored kernel commandline:
Dec 15 01:42:28 localhost dracut[1433]: No dracut internal kernel commandline stored in the initramfs
Dec 15 01:42:28 localhost dracut[1433]: *** Install squash loader ***
Dec 15 01:42:29 localhost dracut[1433]: *** Squashing the files inside the initramfs ***
Dec 15 01:42:29 localhost systemd[1]: NetworkManager-dispatcher.service: Deactivated successfully.
Dec 15 01:42:30 localhost dracut[1433]: *** Squashing the files inside the initramfs done ***
Dec 15 01:42:30 localhost dracut[1433]: *** Creating image file '/boot/initramfs-5.14.0-284.11.1.el9_2.x86_64kdump.img' ***
Dec 15 01:42:30 localhost dracut[1433]: *** Creating initramfs image file '/boot/initramfs-5.14.0-284.11.1.el9_2.x86_64kdump.img' done ***
Dec 15 01:42:30 localhost kdumpctl[1133]: kdump: kexec: loaded kdump kernel
Dec 15 01:42:30 localhost kdumpctl[1133]: kdump: Starting kdump: [OK]
Dec 15 01:42:30 localhost systemd[1]: Finished Crash recovery kernel arming.
Dec 15 01:42:30 localhost systemd[1]: Startup finished in 1.280s (kernel) + 1.916s (initrd) + 13.801s (userspace) = 16.998s.
Dec 15 01:42:49 localhost systemd[1]: systemd-hostnamed.service: Deactivated successfully.
Dec 15 01:42:53 localhost sshd[4172]: main: sshd: ssh-rsa algorithm is disabled
Dec 15 01:42:53 localhost systemd[1]: Created slice User Slice of UID 1000.
Dec 15 01:42:53 localhost systemd[1]: Starting User Runtime Directory /run/user/1000...
Dec 15 01:42:53 localhost systemd-logind[763]: New session 1 of user zuul.
Dec 15 01:42:53 localhost systemd[1]: Finished User Runtime Directory /run/user/1000.
Dec 15 01:42:53 localhost systemd[1]: Starting User Manager for UID 1000...
Dec 15 01:42:53 localhost systemd[4176]: Queued start job for default target Main User Target.
Dec 15 01:42:53 localhost systemd[4176]: Created slice User Application Slice.
Dec 15 01:42:53 localhost systemd[4176]: Started Mark boot as successful after the user session has run 2 minutes.
Dec 15 01:42:53 localhost systemd[4176]: Started Daily Cleanup of User's Temporary Directories.
Dec 15 01:42:53 localhost systemd[4176]: Reached target Paths.
Dec 15 01:42:53 localhost systemd[4176]: Reached target Timers.
Dec 15 01:42:53 localhost systemd[4176]: Starting D-Bus User Message Bus Socket...
Dec 15 01:42:53 localhost systemd[4176]: Starting Create User's Volatile Files and Directories...
Dec 15 01:42:53 localhost systemd[4176]: Finished Create User's Volatile Files and Directories.
Dec 15 01:42:53 localhost systemd[4176]: Listening on D-Bus User Message Bus Socket.
Dec 15 01:42:53 localhost systemd[4176]: Reached target Sockets.
Dec 15 01:42:53 localhost systemd[4176]: Reached target Basic System.
Dec 15 01:42:53 localhost systemd[4176]: Reached target Main User Target.
Dec 15 01:42:53 localhost systemd[4176]: Startup finished in 133ms.
Dec 15 01:42:53 localhost systemd[1]: Started User Manager for UID 1000.
Dec 15 01:42:53 localhost systemd[1]: Started Session 1 of User zuul.
Dec 15 01:42:54 localhost python3[4228]: ansible-setup Invoked with gather_subset=['!all'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec 15 01:43:02 localhost python3[4247]: ansible-ansible.legacy.setup Invoked with gather_subset=['all'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec 15 01:43:10 localhost python3[4300]: ansible-setup Invoked with gather_subset=['network'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec 15 01:43:12 localhost python3[4330]: ansible-zuul_console Invoked with path=/tmp/console-{log_uuid}.log port=19885 state=present
Dec 15 01:43:15 localhost python3[4346]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQC1ko8xh0bPiR6+NCG4km3etin5rm3hZZmVHXuDDaTrTWq3PUz5sEmSbiDQOJj4mOpsouNXjYHkjXSuRLTQ5dqF1BWU5bgiOkTIqccwZdxkqfM2VFXFj/Ej621HUBRYHf7PK5zkl+8G1g2RkkiSd886DSw6I1J+2uT/e+4/0G1vsACTaNArP3/JSOh0hdwu+fnjybrp4sauiJsWaQvwbWao/txJqznQhymNwHZVFRMhFy+x+oDr4ry7w+X2JuGz2ydbUojBUG0REWTKmU4EZsyDsx77GzIJwfHsUUuJ0t4DcDalVqz20D+LXTug8AfgSovuNhZTz8AqRXfCOZK9haLb+3tJwKExMBdnj0cacNw6O23jZTIMbJK+qoxGN4mzqr8RNFVPPLVhL5tcfb2MvN7TWs7+oT0K5xmxDkEnr7iSpmPV0d8TD/wNwfJmtu2W6TWL5zgA6U3GyfnTH2+nhrFk5Ou+yaLES2+Wx8kb0NLWgDnlCcYy9dw1eP3GSc3GIzM= zuul-build-sshkey manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Dec 15 01:43:15 localhost python3[4360]: ansible-file Invoked with state=directory path=/home/zuul/.ssh mode=448 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 15 01:43:16 localhost python3[4419]: ansible-ansible.legacy.stat Invoked with path=/home/zuul/.ssh/id_rsa follow=False get_checksum=False checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Dec 15 01:43:17 localhost python3[4460]: ansible-ansible.legacy.copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1765780996.5628016-387-4413214872735/source dest=/home/zuul/.ssh/id_rsa mode=384 force=False _original_basename=590a0ff85d6248fba2903531d2905bb8_id_rsa follow=False checksum=b38504d433636763ead663645f5f650975b888ea backup=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 15 01:43:18 localhost python3[4533]: ansible-ansible.legacy.stat Invoked with path=/home/zuul/.ssh/id_rsa.pub follow=False get_checksum=False checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Dec 15 01:43:18 localhost python3[4574]: ansible-ansible.legacy.copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1765780998.220916-489-270316771610431/source dest=/home/zuul/.ssh/id_rsa.pub mode=420 force=False _original_basename=590a0ff85d6248fba2903531d2905bb8_id_rsa.pub follow=False checksum=2e3e36d0fa1c6d0defe218a502cdeab9c5909fc5 backup=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 15 01:43:20 localhost python3[4602]: ansible-ping Invoked with data=pong
Dec 15 01:43:22 localhost python3[4616]: ansible-setup Invoked with gather_subset=['all'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec 15 01:43:25 localhost python3[4669]: ansible-zuul_debug_info Invoked with ipv4_route_required=False ipv6_route_required=False image_manifest_files=['/etc/dib-builddate.txt', '/etc/image-hostname.txt'] image_manifest=None traceroute_host=None
Dec 15 01:43:28 localhost python3[4691]: ansible-file Invoked with path=/home/zuul/zuul-output/logs state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 15 01:43:28 localhost python3[4705]: ansible-file Invoked with path=/home/zuul/zuul-output/artifacts state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 15 01:43:28 localhost python3[4719]: ansible-file Invoked with path=/home/zuul/zuul-output/docs state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 15 01:43:29 localhost python3[4733]: ansible-file Invoked with path=/home/zuul/zuul-output/logs state=directory mode=493 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 15 01:43:30 localhost python3[4747]: ansible-file Invoked with path=/home/zuul/zuul-output/artifacts state=directory mode=493 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 15 01:43:30 localhost python3[4761]: ansible-file Invoked with path=/home/zuul/zuul-output/docs state=directory mode=493 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 15 01:43:32 localhost python3[4777]: ansible-file Invoked with path=/etc/ci state=directory owner=root group=root mode=493 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 15 01:43:34 localhost python3[4825]: ansible-ansible.legacy.stat Invoked with path=/etc/ci/mirror_info.sh follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Dec 15 01:43:34 localhost python3[4868]: ansible-ansible.legacy.copy Invoked with dest=/etc/ci/mirror_info.sh owner=root group=root mode=420 src=/home/zuul/.ansible/tmp/ansible-tmp-1765781014.0168908-94-59913479160857/source follow=False _original_basename=mirror_info.sh.j2 checksum=92d92a03afdddee82732741071f662c729080c35 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 15 01:43:42 localhost python3[4897]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAABIwAAAQEA4Z/c9osaGGtU6X8fgELwfj/yayRurfcKA0HMFfdpPxev2dbwljysMuzoVp4OZmW1gvGtyYPSNRvnzgsaabPNKNo2ym5NToCP6UM+KSe93aln4BcM/24mXChYAbXJQ5Bqq/pIzsGs/pKetQN+vwvMxLOwTvpcsCJBXaa981RKML6xj9l/UZ7IIq1HSEKMvPLxZMWdu0Ut8DkCd5F4nOw9Wgml2uYpDCj5LLCrQQ9ChdOMz8hz6SighhNlRpPkvPaet3OXxr/ytFMu7j7vv06CaEnuMMiY2aTWN1Imin9eHAylIqFHta/3gFfQSWt9jXM7owkBLKL7ATzhaAn+fjNupw== arxcruz@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Dec 15 01:43:42 localhost python3[4911]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABAQDS4Fn6k4deCnIlOtLWqZJyksbepjQt04j8Ed8CGx9EKkj0fKiAxiI4TadXQYPuNHMixZy4Nevjb6aDhL5Z906TfvNHKUrjrG7G26a0k8vdc61NEQ7FmcGMWRLwwc6ReDO7lFpzYKBMk4YqfWgBuGU/K6WLKiVW2cVvwIuGIaYrE1OiiX0iVUUk7KApXlDJMXn7qjSYynfO4mF629NIp8FJal38+Kv+HA+0QkE5Y2xXnzD4Lar5+keymiCHRntPppXHeLIRzbt0gxC7v3L72hpQ3BTBEzwHpeS8KY+SX1y5lRMN45thCHfJqGmARJREDjBvWG8JXOPmVIKQtZmVcD5b mandreou@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Dec 15 01:43:42 localhost python3[4925]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABAQC9MiLfy30deHA7xPOAlew5qUq3UP2gmRMYJi8PtkjFB20/DKeWwWNnkZPqP9AayruRoo51SIiVg870gbZE2jYl+Ncx/FYDe56JeC3ySZsXoAVkC9bP7gkOGqOmJjirvAgPMI7bogVz8i+66Q4Ar7OKTp3762G4IuWPPEg4ce4Y7lx9qWocZapHYq4cYKMxrOZ7SEbFSATBbe2bPZAPKTw8do/Eny+Hq/LkHFhIeyra6cqTFQYShr+zPln0Cr+ro/pDX3bB+1ubFgTpjpkkkQsLhDfR6cCdCWM2lgnS3BTtYj5Ct9/JRPR5YOphqZz+uB+OEu2IL68hmU9vNTth1KeX rlandy@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Dec 15 01:43:42 localhost python3[4939]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIFCbgz8gdERiJlk2IKOtkjQxEXejrio6ZYMJAVJYpOIp raukadah@gmail.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Dec 15 01:43:43 localhost python3[4953]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIBqb3Q/9uDf4LmihQ7xeJ9gA/STIQUFPSfyyV0m8AoQi bshewale@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Dec 15 01:43:43 localhost python3[4967]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABAQC0I8QqQx0Az2ysJt2JuffucLijhBqnsXKEIx5GyHwxVULROa8VtNFXUDH6ZKZavhiMcmfHB2+TBTda+lDP4FldYj06dGmzCY+IYGa+uDRdxHNGYjvCfLFcmLlzRK6fNbTcui+KlUFUdKe0fb9CRoGKyhlJD5GRkM1Dv+Yb6Bj+RNnmm1fVGYxzmrD2utvffYEb0SZGWxq2R9gefx1q/3wCGjeqvufEV+AskPhVGc5T7t9eyZ4qmslkLh1/nMuaIBFcr9AUACRajsvk6mXrAN1g3HlBf2gQlhi1UEyfbqIQvzzFtsbLDlSum/KmKjy818GzvWjERfQ0VkGzCd9bSLVL dviroel@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Dec 15 01:43:43 localhost python3[4981]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQDLOQd4ZLtkZXQGY6UwAr/06ppWQK4fDO3HaqxPk98csyOCBXsliSKK39Bso828+5srIXiW7aI6aC9P5mwi4mUZlGPfJlQbfrcGvY+b/SocuvaGK+1RrHLoJCT52LBhwgrzlXio2jeksZeein8iaTrhsPrOAs7KggIL/rB9hEiB3NaOPWhhoCP4vlW6MEMExGcqB/1FVxXFBPnLkEyW0Lk7ycVflZl2ocRxbfjZi0+tI1Wlinp8PvSQSc/WVrAcDgKjc/mB4ODPOyYy3G8FHgfMsrXSDEyjBKgLKMsdCrAUcqJQWjkqXleXSYOV4q3pzL+9umK+q/e3P/bIoSFQzmJKTU1eDfuvPXmow9F5H54fii/Da7ezlMJ+wPGHJrRAkmzvMbALy7xwswLhZMkOGNtRcPqaKYRmIBKpw3o6bCTtcNUHOtOQnzwY8JzrM2eBWJBXAANYw+9/ho80JIiwhg29CFNpVBuHbql2YxJQNrnl90guN65rYNpDxdIluweyUf8= anbanerj@kaermorhen manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Dec 15 01:43:43 localhost python3[4995]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQC3VwV8Im9kRm49lt3tM36hj4Zv27FxGo4C1Q/0jqhzFmHY7RHbmeRr8ObhwWoHjXSozKWg8FL5ER0z3hTwL0W6lez3sL7hUaCmSuZmG5Hnl3x4vTSxDI9JZ/Y65rtYiiWQo2fC5xJhU/4+0e5e/pseCm8cKRSu+SaxhO+sd6FDojA2x1BzOzKiQRDy/1zWGp/cZkxcEuB1wHI5LMzN03c67vmbu+fhZRAUO4dQkvcnj2LrhQtpa+ytvnSjr8icMDosf1OsbSffwZFyHB/hfWGAfe0eIeSA2XPraxiPknXxiPKx2MJsaUTYbsZcm3EjFdHBBMumw5rBI74zLrMRvCO9GwBEmGT4rFng1nP+yw5DB8sn2zqpOsPg1LYRwCPOUveC13P6pgsZZPh812e8v5EKnETct+5XI3dVpdw6CnNiLwAyVAF15DJvBGT/u1k0Myg/bQn+Gv9k2MSj6LvQmf6WbZu2Wgjm30z3FyCneBqTL7mLF19YXzeC0ufHz5pnO1E= dasm@fedora manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Dec 15 01:43:44 localhost python3[5009]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIHUnwjB20UKmsSed9X73eGNV5AOEFccQ3NYrRW776pEk cjeanner manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Dec 15 01:43:44 localhost python3[5023]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIDercCMGn8rW1C4P67tHgtflPdTeXlpyUJYH+6XDd2lR jgilaber@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Dec 15 01:43:44 localhost python3[5037]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIAMI6kkg9Wg0sG7jIJmyZemEBwUn1yzNpQQd3gnulOmZ adrianfuscoarnejo@gmail.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Dec 15 01:43:44 localhost python3[5051]: ansible-authorized_key Invoked with user=zuul state=present key=ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBPijwpQu/3jhhhBZInXNOLEH57DrknPc3PLbsRvYyJIFzwYjX+WD4a7+nGnMYS42MuZk6TJcVqgnqofVx4isoD4= ramishra@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Dec 15 01:43:45 localhost python3[5065]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIGpU/BepK3qX0NRf5Np+dOBDqzQEefhNrw2DCZaH3uWW rebtoor@monolith manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Dec 15 01:43:45 localhost python3[5079]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIDK0iKdi8jQTpQrDdLVH/AAgLVYyTXF7AQ1gjc/5uT3t ykarel@yatinkarel manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Dec 15 01:43:45 localhost python3[5093]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIF/V/cLotA6LZeO32VL45Hd78skuA2lJA425Sm2LlQeZ fmount@horcrux manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Dec 15 01:43:45 localhost python3[5107]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIDa7QCjuDMVmRPo1rREbGwzYeBCYVN+Ou/3WKXZEC6Sr manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Dec 15 01:43:46 localhost python3[5121]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAACAQCfNtF7NvKl915TGsGGoseUb06Hj8L/S4toWf0hExeY+F00woL6NvBlJD0nDct+P5a22I4EhvoQCRQ8reaPCm1lybR3uiRIJsj+8zkVvLwby9LXzfZorlNG9ofjd00FEmB09uW/YvTl6Q9XwwwX6tInzIOv3TMqTHHGOL74ibbj8J/FJR0cFEyj0z4WQRvtkh32xAHl83gbuINryMt0sqRI+clj2381NKL55DRLQrVw0gsfqqxiHAnXg21qWmc4J+b9e9kiuAFQjcjwTVkwJCcg3xbPwC/qokYRby/Y5S40UUd7/jEARGXT7RZgpzTuDd1oZiCVrnrqJNPaMNdVv5MLeFdf1B7iIe5aa/fGouX7AO4SdKhZUdnJmCFAGvjC6S3JMZ2wAcUl+OHnssfmdj7XL50cLo27vjuzMtLAgSqi6N99m92WCF2s8J9aVzszX7Xz9OKZCeGsiVJp3/NdABKzSEAyM9xBD/5Vho894Sav+otpySHe3p6RUTgbB5Zu8VyZRZ/UtB3ueXxyo764yrc6qWIDqrehm84Xm9g+/jpIBzGPl07NUNJpdt/6Sgf9RIKXw/7XypO5yZfUcuFNGTxLfqjTNrtgLZNcjfav6sSdVXVcMPL//XNuRdKmVFaO76eV/oGMQGr1fGcCD+N+CpI7+Q+fCNB6VFWG4nZFuI/Iuw== averdagu@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Dec 15 01:43:46 localhost python3[5135]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQDq8l27xI+QlQVdS4djp9ogSoyrNE2+Ox6vKPdhSNL1J3PE5w+WCSvMz9A5gnNuH810zwbekEApbxTze/gLQJwBHA52CChfURpXrFaxY7ePXRElwKAL3mJfzBWY/c5jnNL9TCVmFJTGZkFZP3Nh+BMgZvL6xBkt3WKm6Uq18qzd9XeKcZusrA+O+uLv1fVeQnadY9RIqOCyeFYCzLWrUfTyE8x/XG0hAWIM7qpnF2cALQS2h9n4hW5ybiUN790H08wf9hFwEf5nxY9Z9dVkPFQiTSGKNBzmnCXU9skxS/xhpFjJ5duGSZdtAHe9O+nGZm9c67hxgtf8e5PDuqAdXEv2cf6e3VBAt+Bz8EKI3yosTj0oZHfwr42Yzb1l/SKy14Rggsrc9KAQlrGXan6+u2jcQqqx7l+SWmnpFiWTV9u5cWj2IgOhApOitmRBPYqk9rE2usfO0hLn/Pj/R/Nau4803e1/EikdLE7Ps95s9mX5jRDjAoUa2JwFF5RsVFyL910= ashigupt@ashigupt.remote.csb manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Dec 15 01:43:46 localhost python3[5149]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIOKLl0NYKwoZ/JY5KeZU8VwRAggeOxqQJeoqp3dsAaY9 manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Dec 15 01:43:46 localhost python3[5163]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIASASQOH2BcOyLKuuDOdWZlPi2orcjcA8q4400T73DLH evallesp@fedora manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Dec 15 01:43:47 localhost python3[5177]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAILeBWlamUph+jRKV2qrx1PGU7vWuGIt5+z9k96I8WehW amsinha@amsinha-mac manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Dec 15 01:43:47 localhost python3[5191]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIANvVgvJBlK3gb1yz5uef/JqIGq4HLEmY2dYA8e37swb morenod@redhat-laptop manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Dec 15 01:43:47 localhost python3[5205]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAACAQDZdI7t1cxYx65heVI24HTV4F7oQLW1zyfxHreL2TIJKxjyrUUKIFEUmTutcBlJRLNT2Eoix6x1sOw9YrchloCLcn//SGfTElr9mSc5jbjb7QXEU+zJMhtxyEJ1Po3CUGnj7ckiIXw7wcawZtrEOAQ9pH3ExYCJcEMiyNjRQZCxT3tPK+S4B95EWh5Fsrz9CkwpjNRPPH7LigCeQTM3Wc7r97utAslBUUvYceDSLA7rMgkitJE38b7rZBeYzsGQ8YYUBjTCtehqQXxCRjizbHWaaZkBU+N3zkKB6n/iCNGIO690NK7A/qb6msTijiz1PeuM8ThOsi9qXnbX5v0PoTpcFSojV7NHAQ71f0XXuS43FhZctT+Dcx44dT8Fb5vJu2cJGrk+qF8ZgJYNpRS7gPg0EG2EqjK7JMf9ULdjSu0r+KlqIAyLvtzT4eOnQipoKlb/WG5D/0ohKv7OMQ352ggfkBFIQsRXyyTCT98Ft9juqPuahi3CAQmP4H9dyE+7+Kz437PEtsxLmfm6naNmWi7Ee1DqWPwS8rEajsm4sNM4wW9gdBboJQtc0uZw0DfLj1I9r3Mc8Ol0jYtz0yNQDSzVLrGCaJlC311trU70tZ+ZkAVV6Mn8lOhSbj1cK0lvSr6ZK4dgqGl3I1eTZJJhbLNdg7UOVaiRx9543+C/p/As7w== brjackma@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Dec 15 01:43:48 localhost python3[5219]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIKwedoZ0TWPJX/z/4TAbO/kKcDZOQVgRH0hAqrL5UCI1
vcastell@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None Dec 15 01:43:48 localhost python3[5233]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIEmv8sE8GCk6ZTPIqF0FQrttBdL3mq7rCm/IJy0xDFh7 michburk@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None Dec 15 01:43:48 localhost python3[5247]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAICy6GpGEtwevXEEn4mmLR5lmSLe23dGgAvzkB9DMNbkf rsafrono@rsafrono manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None Dec 15 01:43:49 localhost python3[5263]: ansible-community.general.timezone Invoked with name=UTC hwclock=None Dec 15 01:43:49 localhost systemd[1]: Starting Time & Date Service... Dec 15 01:43:49 localhost systemd[1]: Started Time & Date Service. Dec 15 01:43:49 localhost systemd-timedated[5265]: Changed time zone to 'UTC' (UTC). 
Dec 15 01:43:51 localhost python3[5284]: ansible-file Invoked with path=/etc/nodepool state=directory mode=511 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 15 01:43:52 localhost python3[5330]: ansible-ansible.legacy.stat Invoked with path=/etc/nodepool/sub_nodes follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Dec 15 01:43:53 localhost python3[5371]: ansible-ansible.legacy.copy Invoked with dest=/etc/nodepool/sub_nodes src=/home/zuul/.ansible/tmp/ansible-tmp-1765781032.5028057-491-68842945668779/source _original_basename=tmpt14io7sh follow=False checksum=da39a3ee5e6b4b0d3255bfef95601890afd80709 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 15 01:43:54 localhost python3[5431]: ansible-ansible.legacy.stat Invoked with path=/etc/nodepool/sub_nodes_private follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Dec 15 01:43:55 localhost python3[5472]: ansible-ansible.legacy.copy Invoked with dest=/etc/nodepool/sub_nodes_private src=/home/zuul/.ansible/tmp/ansible-tmp-1765781034.368605-581-261152562326226/source _original_basename=tmpb40f1yvd follow=False checksum=da39a3ee5e6b4b0d3255bfef95601890afd80709 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 15 01:43:56 localhost python3[5534]: ansible-ansible.legacy.stat Invoked with path=/etc/nodepool/node_private follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Dec 15 01:43:57 localhost python3[5577]: ansible-ansible.legacy.copy Invoked with dest=/etc/nodepool/node_private src=/home/zuul/.ansible/tmp/ansible-tmp-1765781036.6500542-726-221559410491673/source _original_basename=tmpb3q2i42h follow=False checksum=2e7e63ba56c9b487ea71081bee61c12a1e9cb2fe backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 15 01:43:58 localhost python3[5605]: ansible-ansible.legacy.command Invoked with _raw_params=cp .ssh/id_rsa /etc/nodepool/id_rsa zuul_log_id=in-loop-ignore zuul_ansible_split_streams=False _uses_shell=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 15 01:43:58 localhost python3[5621]: ansible-ansible.legacy.command Invoked with _raw_params=cp .ssh/id_rsa.pub /etc/nodepool/id_rsa.pub zuul_log_id=in-loop-ignore zuul_ansible_split_streams=False _uses_shell=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 15 01:44:00 localhost python3[5671]: ansible-ansible.legacy.stat Invoked with path=/etc/sudoers.d/zuul-sudo-grep follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Dec 15 01:44:00 localhost python3[5714]: ansible-ansible.legacy.copy Invoked with dest=/etc/sudoers.d/zuul-sudo-grep mode=288 src=/home/zuul/.ansible/tmp/ansible-tmp-1765781039.993897-849-205900310711860/source _original_basename=tmpza_1cocd follow=False checksum=bdca1a77493d00fb51567671791f4aa30f66c2f0 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 15 01:44:01 localhost python3[5745]: ansible-ansible.legacy.command Invoked with _raw_params=/usr/sbin/visudo -c zuul_log_id=fa163efc-24cc-890b-dfab-000000000023-1-overcloudnovacompute0 zuul_ansible_split_streams=False _uses_shell=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 15 01:44:12 localhost python3[5763]: ansible-ansible.legacy.command Invoked with executable=/bin/bash _raw_params=env#012 _uses_shell=True zuul_log_id=fa163efc-24cc-890b-dfab-000000000024-1-overcloudnovacompute0 zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None creates=None removes=None stdin=None
Dec 15 01:44:19 localhost systemd[1]: systemd-timedated.service: Deactivated successfully.
Dec 15 01:44:24 localhost python3[5784]: ansible-file Invoked with path=/home/zuul/workspace state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 15 01:44:44 localhost python3[5800]: ansible-ansible.builtin.file Invoked with path=/etc/ci/env state=directory mode=0755 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 15 01:45:38 localhost systemd[4176]: Starting Mark boot as successful...
Dec 15 01:45:38 localhost systemd[4176]: Finished Mark boot as successful.
Dec 15 01:45:40 localhost chronyd[766]: Selected source 149.56.19.163 (2.rhel.pool.ntp.org)
Dec 15 01:45:44 localhost systemd-logind[763]: Session 1 logged out. Waiting for processes to exit.
Dec 15 01:46:20 localhost systemd[1]: Unmounting EFI System Partition Automount...
Dec 15 01:46:20 localhost systemd[1]: efi.mount: Deactivated successfully.
Dec 15 01:46:20 localhost systemd[1]: Unmounted EFI System Partition Automount.
Dec 15 01:47:26 localhost sshd[5806]: main: sshd: ssh-rsa algorithm is disabled
Dec 15 01:47:53 localhost kernel: pci 0000:00:07.0: [1af4:1000] type 00 class 0x020000
Dec 15 01:47:53 localhost kernel: pci 0000:00:07.0: reg 0x10: [io 0x0000-0x003f]
Dec 15 01:47:53 localhost kernel: pci 0000:00:07.0: reg 0x14: [mem 0x00000000-0x00000fff]
Dec 15 01:47:53 localhost kernel: pci 0000:00:07.0: reg 0x20: [mem 0x00000000-0x00003fff 64bit pref]
Dec 15 01:47:53 localhost kernel: pci 0000:00:07.0: reg 0x30: [mem 0x00000000-0x0007ffff pref]
Dec 15 01:47:53 localhost kernel: pci 0000:00:07.0: BAR 6: assigned [mem 0xc0000000-0xc007ffff pref]
Dec 15 01:47:53 localhost kernel: pci 0000:00:07.0: BAR 4: assigned [mem 0x440000000-0x440003fff 64bit pref]
Dec 15 01:47:53 localhost kernel: pci 0000:00:07.0: BAR 1: assigned [mem 0xc0080000-0xc0080fff]
Dec 15 01:47:53 localhost kernel: pci 0000:00:07.0: BAR 0: assigned [io 0x1000-0x103f]
Dec 15 01:47:53 localhost kernel: virtio-pci 0000:00:07.0: enabling device (0000 -> 0003)
Dec 15 01:47:53 localhost NetworkManager[789]: [1765781273.2084] manager: (eth1): new Ethernet device (/org/freedesktop/NetworkManager/Devices/3)
Dec 15 01:47:53 localhost systemd-udevd[5808]: Network interface NamePolicy= disabled on kernel command line.
Dec 15 01:47:53 localhost NetworkManager[789]: [1765781273.2238] device (eth1): state change: unmanaged -> unavailable (reason 'managed', sys-iface-state: 'external')
Dec 15 01:47:53 localhost NetworkManager[789]: [1765781273.2275] settings: (eth1): created default wired connection 'Wired connection 1'
Dec 15 01:47:53 localhost NetworkManager[789]: [1765781273.2281] device (eth1): carrier: link connected
Dec 15 01:47:53 localhost NetworkManager[789]: [1765781273.2285] device (eth1): state change: unavailable -> disconnected (reason 'carrier-changed', sys-iface-state: 'managed')
Dec 15 01:47:53 localhost NetworkManager[789]: [1765781273.2292] policy: auto-activating connection 'Wired connection 1' (e0aaa872-9ccb-3176-949a-8b25e013a1ac)
Dec 15 01:47:53 localhost NetworkManager[789]: [1765781273.2298] device (eth1): Activation: starting connection 'Wired connection 1' (e0aaa872-9ccb-3176-949a-8b25e013a1ac)
Dec 15 01:47:53 localhost NetworkManager[789]: [1765781273.2299] device (eth1): state change: disconnected -> prepare (reason 'none', sys-iface-state: 'managed')
Dec 15 01:47:53 localhost NetworkManager[789]: [1765781273.2305] device (eth1): state change: prepare -> config (reason 'none', sys-iface-state: 'managed')
Dec 15 01:47:53 localhost NetworkManager[789]: [1765781273.2313] device (eth1): state change: config -> ip-config (reason 'none', sys-iface-state: 'managed')
Dec 15 01:47:53 localhost NetworkManager[789]: [1765781273.2317] dhcp4 (eth1): activation: beginning transaction (timeout in 45 seconds)
Dec 15 01:47:53 localhost sshd[5812]: main: sshd: ssh-rsa algorithm is disabled
Dec 15 01:47:53 localhost systemd-logind[763]: New session 3 of user zuul.
Dec 15 01:47:53 localhost systemd[1]: Started Session 3 of User zuul.
Dec 15 01:47:54 localhost python3[5829]: ansible-ansible.legacy.command Invoked with _raw_params=ip -j link zuul_log_id=fa163efc-24cc-ebad-a3c9-00000000039b-0-controller zuul_ansible_split_streams=False _uses_shell=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 15 01:47:54 localhost kernel: IPv6: ADDRCONF(NETDEV_CHANGE): eth1: link becomes ready
Dec 15 01:48:07 localhost python3[5879]: ansible-ansible.legacy.stat Invoked with path=/etc/NetworkManager/system-connections/ci-private-network.nmconnection follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Dec 15 01:48:07 localhost python3[5922]: ansible-ansible.legacy.copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1765781286.9686584-435-41297697617026/source dest=/etc/NetworkManager/system-connections/ci-private-network.nmconnection mode=0600 owner=root group=root follow=False _original_basename=bootstrap-ci-network-nm-connection.nmconnection.j2 checksum=60863936a1cae802fa1dbfabd1ffa89e5c6d910f backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 15 01:48:08 localhost python3[5952]: ansible-ansible.builtin.systemd Invoked with name=NetworkManager state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Dec 15 01:48:08 localhost systemd[1]: NetworkManager-wait-online.service: Deactivated successfully.
Dec 15 01:48:08 localhost systemd[1]: Stopped Network Manager Wait Online.
Dec 15 01:48:08 localhost systemd[1]: Stopping Network Manager Wait Online...
Dec 15 01:48:08 localhost NetworkManager[789]: [1765781288.1714] caught SIGTERM, shutting down normally.
Dec 15 01:48:08 localhost systemd[1]: Stopping Network Manager...
Dec 15 01:48:08 localhost NetworkManager[789]: [1765781288.1780] dhcp4 (eth0): canceled DHCP transaction
Dec 15 01:48:08 localhost NetworkManager[789]: [1765781288.1781] dhcp4 (eth0): activation: beginning transaction (timeout in 45 seconds)
Dec 15 01:48:08 localhost NetworkManager[789]: [1765781288.1781] dhcp4 (eth0): state changed no lease
Dec 15 01:48:08 localhost NetworkManager[789]: [1765781288.1784] manager: NetworkManager state is now CONNECTING
Dec 15 01:48:08 localhost NetworkManager[789]: [1765781288.1861] dhcp4 (eth1): canceled DHCP transaction
Dec 15 01:48:08 localhost NetworkManager[789]: [1765781288.1861] dhcp4 (eth1): state changed no lease
Dec 15 01:48:08 localhost systemd[1]: Starting Network Manager Script Dispatcher Service...
Dec 15 01:48:08 localhost NetworkManager[789]: [1765781288.1924] exiting (success)
Dec 15 01:48:08 localhost systemd[1]: Started Network Manager Script Dispatcher Service.
Dec 15 01:48:08 localhost systemd[1]: NetworkManager.service: Deactivated successfully.
Dec 15 01:48:08 localhost systemd[1]: Stopped Network Manager.
Dec 15 01:48:08 localhost systemd[1]: NetworkManager.service: Consumed 1.884s CPU time.
Dec 15 01:48:08 localhost systemd[1]: Starting Network Manager...
Dec 15 01:48:08 localhost NetworkManager[5963]: [1765781288.2434] NetworkManager (version 1.42.2-1.el9) is starting... (after a restart, boot:0d1fb014-fa04-4bbc-9dbb-cbf7146bb597)
Dec 15 01:48:08 localhost NetworkManager[5963]: [1765781288.2437] Read config: /etc/NetworkManager/NetworkManager.conf (run: 15-carrier-timeout.conf)
Dec 15 01:48:08 localhost systemd[1]: Started Network Manager.
Dec 15 01:48:08 localhost NetworkManager[5963]: [1765781288.2459] bus-manager: acquired D-Bus service "org.freedesktop.NetworkManager"
Dec 15 01:48:08 localhost systemd[1]: Starting Network Manager Wait Online...
Dec 15 01:48:08 localhost NetworkManager[5963]: [1765781288.2511] manager[0x560ba772f090]: monitoring kernel firmware directory '/lib/firmware'.
Dec 15 01:48:08 localhost systemd[1]: Starting Hostname Service...
Dec 15 01:48:08 localhost systemd[1]: Started Hostname Service.
Dec 15 01:48:08 localhost NetworkManager[5963]: [1765781288.3300] hostname: hostname: using hostnamed
Dec 15 01:48:08 localhost NetworkManager[5963]: [1765781288.3301] hostname: static hostname changed from (none) to "np0005559462.novalocal"
Dec 15 01:48:08 localhost NetworkManager[5963]: [1765781288.3307] dns-mgr: init: dns=default,systemd-resolved rc-manager=symlink (auto)
Dec 15 01:48:08 localhost NetworkManager[5963]: [1765781288.3313] manager[0x560ba772f090]: rfkill: Wi-Fi hardware radio set enabled
Dec 15 01:48:08 localhost NetworkManager[5963]: [1765781288.3313] manager[0x560ba772f090]: rfkill: WWAN hardware radio set enabled
Dec 15 01:48:08 localhost NetworkManager[5963]: [1765781288.3349] Loaded device plugin: NMTeamFactory (/usr/lib64/NetworkManager/1.42.2-1.el9/libnm-device-plugin-team.so)
Dec 15 01:48:08 localhost NetworkManager[5963]: [1765781288.3350] manager: rfkill: Wi-Fi enabled by radio killswitch; enabled by state file
Dec 15 01:48:08 localhost NetworkManager[5963]: [1765781288.3351] manager: rfkill: WWAN enabled by radio killswitch; enabled by state file
Dec 15 01:48:08 localhost NetworkManager[5963]: [1765781288.3352] manager: Networking is enabled by state file
Dec 15 01:48:08 localhost NetworkManager[5963]: [1765781288.3359] settings: Loaded settings plugin: ifcfg-rh ("/usr/lib64/NetworkManager/1.42.2-1.el9/libnm-settings-plugin-ifcfg-rh.so")
Dec 15 01:48:08 localhost NetworkManager[5963]: [1765781288.3360] settings: Loaded settings plugin: keyfile (internal)
Dec 15 01:48:08 localhost NetworkManager[5963]: [1765781288.3401] dhcp: init: Using DHCP client 'internal'
Dec 15 01:48:08 localhost NetworkManager[5963]: [1765781288.3405] manager: (lo): new Loopback device (/org/freedesktop/NetworkManager/Devices/1)
Dec 15 01:48:08 localhost NetworkManager[5963]: [1765781288.3413] device (lo): state change: unmanaged -> unavailable (reason 'connection-assumed', sys-iface-state: 'external')
Dec 15 01:48:08 localhost NetworkManager[5963]: [1765781288.3419] device (lo): state change: unavailable -> disconnected (reason 'connection-assumed', sys-iface-state: 'external')
Dec 15 01:48:08 localhost NetworkManager[5963]: [1765781288.3431] device (lo): Activation: starting connection 'lo' (3f7c3e97-b35c-4041-9026-128161f4820a)
Dec 15 01:48:08 localhost NetworkManager[5963]: [1765781288.3440] device (eth0): carrier: link connected
Dec 15 01:48:08 localhost NetworkManager[5963]: [1765781288.3446] manager: (eth0): new Ethernet device (/org/freedesktop/NetworkManager/Devices/2)
Dec 15 01:48:08 localhost NetworkManager[5963]: [1765781288.3452] manager: (eth0): assume: will attempt to assume matching connection 'System eth0' (5fb06bd0-0bb0-7ffb-45f1-d6edd65f3e03) (indicated)
Dec 15 01:48:08 localhost NetworkManager[5963]: [1765781288.3453] device (eth0): state change: unmanaged -> unavailable (reason 'connection-assumed', sys-iface-state: 'assume')
Dec 15 01:48:08 localhost NetworkManager[5963]: [1765781288.3462] device (eth0): state change: unavailable -> disconnected (reason 'connection-assumed', sys-iface-state: 'assume')
Dec 15 01:48:08 localhost NetworkManager[5963]: [1765781288.3474] device (eth0): Activation: starting connection 'System eth0' (5fb06bd0-0bb0-7ffb-45f1-d6edd65f3e03)
Dec 15 01:48:08 localhost NetworkManager[5963]: [1765781288.3484] device (eth1): carrier: link connected
Dec 15 01:48:08 localhost NetworkManager[5963]: [1765781288.3491] manager: (eth1): new Ethernet device (/org/freedesktop/NetworkManager/Devices/3)
Dec 15 01:48:08 localhost NetworkManager[5963]: [1765781288.3498] manager: (eth1): assume: will attempt to assume matching connection 'Wired connection 1' (e0aaa872-9ccb-3176-949a-8b25e013a1ac) (indicated)
Dec 15 01:48:08 localhost NetworkManager[5963]: [1765781288.3498] device (eth1): state change: unmanaged -> unavailable (reason 'connection-assumed', sys-iface-state: 'assume')
Dec 15 01:48:08 localhost NetworkManager[5963]: [1765781288.3505] device (eth1): state change: unavailable -> disconnected (reason 'connection-assumed', sys-iface-state: 'assume')
Dec 15 01:48:08 localhost NetworkManager[5963]: [1765781288.3514] device (eth1): Activation: starting connection 'Wired connection 1' (e0aaa872-9ccb-3176-949a-8b25e013a1ac)
Dec 15 01:48:08 localhost NetworkManager[5963]: [1765781288.3538] device (lo): state change: disconnected -> prepare (reason 'none', sys-iface-state: 'external')
Dec 15 01:48:08 localhost NetworkManager[5963]: [1765781288.3543] device (lo): state change: prepare -> config (reason 'none', sys-iface-state: 'external')
Dec 15 01:48:08 localhost NetworkManager[5963]: [1765781288.3546] device (lo): state change: config -> ip-config (reason 'none', sys-iface-state: 'external')
Dec 15 01:48:08 localhost NetworkManager[5963]: [1765781288.3549] device (eth0): state change: disconnected -> prepare (reason 'none', sys-iface-state: 'assume')
Dec 15 01:48:08 localhost NetworkManager[5963]: [1765781288.3555] device (eth0): state change: prepare -> config (reason 'none', sys-iface-state: 'assume')
Dec 15 01:48:08 localhost NetworkManager[5963]: [1765781288.3558] device (eth1): state change: disconnected -> prepare (reason 'none', sys-iface-state: 'assume')
Dec 15 01:48:08 localhost NetworkManager[5963]: [1765781288.3561] device (eth1): state change: prepare -> config (reason 'none', sys-iface-state: 'assume')
Dec 15 01:48:08 localhost NetworkManager[5963]: [1765781288.3564] device (lo): state change: ip-config -> ip-check (reason 'none', sys-iface-state: 'external')
Dec 15 01:48:08 localhost NetworkManager[5963]: [1765781288.3570] device (eth0): state change: config -> ip-config (reason 'none', sys-iface-state: 'assume')
Dec 15 01:48:08 localhost NetworkManager[5963]: [1765781288.3573] dhcp4 (eth0): activation: beginning transaction (timeout in 45 seconds)
Dec 15 01:48:08 localhost NetworkManager[5963]: [1765781288.3584] device (eth1): state change: config -> ip-config (reason 'none', sys-iface-state: 'assume')
Dec 15 01:48:08 localhost NetworkManager[5963]: [1765781288.3587] dhcp4 (eth1): activation: beginning transaction (timeout in 45 seconds)
Dec 15 01:48:08 localhost NetworkManager[5963]: [1765781288.3647] device (lo): state change: ip-check -> secondaries (reason 'none', sys-iface-state: 'external')
Dec 15 01:48:08 localhost NetworkManager[5963]: [1765781288.3656] device (lo): state change: secondaries -> activated (reason 'none', sys-iface-state: 'external')
Dec 15 01:48:08 localhost NetworkManager[5963]: [1765781288.3665] device (lo): Activation: successful, device activated.
Dec 15 01:48:08 localhost NetworkManager[5963]: [1765781288.3676] dhcp4 (eth0): state changed new lease, address=38.102.83.173
Dec 15 01:48:08 localhost NetworkManager[5963]: [1765781288.3683] policy: set 'System eth0' (eth0) as default for IPv4 routing and DNS
Dec 15 01:48:08 localhost NetworkManager[5963]: [1765781288.3773] device (eth0): state change: ip-config -> ip-check (reason 'none', sys-iface-state: 'assume')
Dec 15 01:48:08 localhost NetworkManager[5963]: [1765781288.3815] device (eth0): state change: ip-check -> secondaries (reason 'none', sys-iface-state: 'assume')
Dec 15 01:48:08 localhost NetworkManager[5963]: [1765781288.3818] device (eth0): state change: secondaries -> activated (reason 'none', sys-iface-state: 'assume')
Dec 15 01:48:08 localhost NetworkManager[5963]: [1765781288.3825] manager: NetworkManager state is now CONNECTED_SITE
Dec 15 01:48:08 localhost NetworkManager[5963]: [1765781288.3832] device (eth0): Activation: successful, device activated.
Dec 15 01:48:08 localhost NetworkManager[5963]: [1765781288.3840] manager: NetworkManager state is now CONNECTED_GLOBAL
Dec 15 01:48:08 localhost python3[6020]: ansible-ansible.legacy.command Invoked with _raw_params=ip route zuul_log_id=fa163efc-24cc-ebad-a3c9-000000000120-0-controller zuul_ansible_split_streams=False _uses_shell=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 15 01:48:18 localhost systemd[1]: NetworkManager-dispatcher.service: Deactivated successfully.
Dec 15 01:48:38 localhost systemd[1]: systemd-hostnamed.service: Deactivated successfully.
Dec 15 01:48:38 localhost systemd[4176]: Created slice User Background Tasks Slice.
Dec 15 01:48:38 localhost systemd[4176]: Starting Cleanup of User's Temporary Files and Directories...
Dec 15 01:48:38 localhost systemd[4176]: Finished Cleanup of User's Temporary Files and Directories.
Dec 15 01:48:53 localhost NetworkManager[5963]: [1765781333.8082] device (eth1): state change: ip-config -> ip-check (reason 'none', sys-iface-state: 'assume')
Dec 15 01:48:53 localhost systemd[1]: Starting Network Manager Script Dispatcher Service...
Dec 15 01:48:53 localhost systemd[1]: Started Network Manager Script Dispatcher Service.
Dec 15 01:48:53 localhost NetworkManager[5963]: [1765781333.8294] device (eth1): state change: ip-check -> secondaries (reason 'none', sys-iface-state: 'assume')
Dec 15 01:48:53 localhost NetworkManager[5963]: [1765781333.8297] device (eth1): state change: secondaries -> activated (reason 'none', sys-iface-state: 'assume')
Dec 15 01:48:53 localhost NetworkManager[5963]: [1765781333.8305] device (eth1): Activation: successful, device activated.
Dec 15 01:48:53 localhost NetworkManager[5963]: [1765781333.8313] manager: startup complete
Dec 15 01:48:53 localhost systemd[1]: Finished Network Manager Wait Online.
Dec 15 01:49:03 localhost systemd[1]: NetworkManager-dispatcher.service: Deactivated successfully.
Dec 15 01:49:08 localhost systemd[1]: session-3.scope: Deactivated successfully.
Dec 15 01:49:08 localhost systemd[1]: session-3.scope: Consumed 1.440s CPU time.
Dec 15 01:49:08 localhost systemd-logind[763]: Session 3 logged out. Waiting for processes to exit.
Dec 15 01:49:08 localhost systemd-logind[763]: Removed session 3.
Dec 15 01:50:17 localhost sshd[6056]: main: sshd: ssh-rsa algorithm is disabled
Dec 15 01:50:17 localhost systemd-logind[763]: New session 4 of user zuul.
Dec 15 01:50:17 localhost systemd[1]: Started Session 4 of User zuul.
Dec 15 01:50:18 localhost python3[6107]: ansible-ansible.legacy.stat Invoked with path=/etc/ci/env/networking-info.yml follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Dec 15 01:50:18 localhost python3[6150]: ansible-ansible.legacy.copy Invoked with dest=/etc/ci/env/networking-info.yml owner=root group=root mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1765781417.9094977-628-50488059083037/source _original_basename=tmp9cbi59mq follow=False checksum=f39adcc6f25af9b77c3197b5fbe1050e570817b5 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 15 01:50:22 localhost systemd[1]: session-4.scope: Deactivated successfully.
Dec 15 01:50:22 localhost systemd-logind[763]: Session 4 logged out. Waiting for processes to exit.
Dec 15 01:50:22 localhost systemd-logind[763]: Removed session 4.
Dec 15 01:56:52 localhost sshd[6167]: main: sshd: ssh-rsa algorithm is disabled
Dec 15 01:56:52 localhost systemd-logind[763]: New session 5 of user zuul.
Dec 15 01:56:52 localhost systemd[1]: Started Session 5 of User zuul.
Dec 15 01:56:52 localhost python3[6186]: ansible-ansible.legacy.command Invoked with _raw_params=lsblk -nd -o MAJ:MIN /dev/vda#012 _uses_shell=True zuul_log_id=fa163efc-24cc-ed3d-53fe-000000001fa3-1-overcloudnovacompute0 zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 15 01:57:04 localhost python3[6206]: ansible-ansible.builtin.file Invoked with path=/sys/fs/cgroup/init.scope state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 15 01:57:04 localhost python3[6222]: ansible-ansible.builtin.file Invoked with path=/sys/fs/cgroup/machine.slice state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 15 01:57:04 localhost python3[6238]: ansible-ansible.builtin.file Invoked with path=/sys/fs/cgroup/system.slice state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 15 01:57:05 localhost python3[6254]: ansible-ansible.builtin.file Invoked with path=/sys/fs/cgroup/user.slice state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 15 01:57:05 localhost python3[6270]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system.conf.d state=directory mode=0755 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 15 01:57:07 localhost python3[6318]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system.conf.d/override.conf follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Dec 15 01:57:07 localhost python3[6361]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system.conf.d/override.conf mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1765781826.8977656-649-167266762267459/source _original_basename=tmpujqu7fha follow=False checksum=a05098bd3d2321238ea1169d0e6f135b35b392d4 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 15 01:57:09 localhost python3[6391]: ansible-ansible.builtin.systemd_service Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Dec 15 01:57:09 localhost systemd[1]: Reloading.
Dec 15 01:57:09 localhost systemd-rc-local-generator[6409]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 15 01:57:09 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Dec 15 01:57:10 localhost python3[6438]: ansible-ansible.builtin.wait_for Invoked with path=/sys/fs/cgroup/system.slice/io.max state=present timeout=30 host=127.0.0.1 connect_timeout=5 delay=0 active_connection_states=['ESTABLISHED', 'FIN_WAIT1', 'FIN_WAIT2', 'SYN_RECV', 'SYN_SENT', 'TIME_WAIT'] sleep=1 port=None search_regex=None exclude_hosts=None msg=None
Dec 15 01:57:12 localhost python3[6454]: ansible-ansible.legacy.command Invoked with _raw_params=echo "252:0 riops=18000 wiops=18000 rbps=262144000 wbps=262144000" > /sys/fs/cgroup/init.scope/io.max#012 _uses_shell=True zuul_log_id=in-loop-ignore zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 15 01:57:12 localhost python3[6472]: ansible-ansible.legacy.command Invoked with _raw_params=echo "252:0 riops=18000 wiops=18000 rbps=262144000 wbps=262144000" > /sys/fs/cgroup/machine.slice/io.max#012 _uses_shell=True zuul_log_id=in-loop-ignore zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 15 01:57:12 localhost python3[6490]: ansible-ansible.legacy.command Invoked with _raw_params=echo "252:0 riops=18000 wiops=18000 rbps=262144000 wbps=262144000" > /sys/fs/cgroup/system.slice/io.max#012 _uses_shell=True zuul_log_id=in-loop-ignore zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 15 01:57:12 localhost python3[6508]: ansible-ansible.legacy.command Invoked with _raw_params=echo "252:0 riops=18000 wiops=18000 rbps=262144000 wbps=262144000" > /sys/fs/cgroup/user.slice/io.max#012 _uses_shell=True zuul_log_id=in-loop-ignore zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 15 01:57:13 localhost python3[6525]: ansible-ansible.legacy.command Invoked with _raw_params=echo "init"; cat /sys/fs/cgroup/init.scope/io.max; echo "machine"; cat /sys/fs/cgroup/machine.slice/io.max; echo "system"; cat /sys/fs/cgroup/system.slice/io.max; echo "user"; cat /sys/fs/cgroup/user.slice/io.max;#012 _uses_shell=True zuul_log_id=fa163efc-24cc-ed3d-53fe-000000001faa-1-overcloudnovacompute0 zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 15 01:57:24 localhost python3[6545]: ansible-ansible.builtin.stat Invoked with path=/sys/fs/cgroup/kubepods.slice/io.max follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Dec 15 01:57:24 localhost systemd[1]: Starting Cleanup of Temporary Directories...
Dec 15 01:57:24 localhost systemd[1]: systemd-tmpfiles-clean.service: Deactivated successfully.
Dec 15 01:57:24 localhost systemd[1]: Finished Cleanup of Temporary Directories.
Dec 15 01:57:24 localhost systemd[1]: run-credentials-systemd\x2dtmpfiles\x2dclean.service.mount: Deactivated successfully.
Dec 15 01:57:27 localhost systemd[1]: session-5.scope: Deactivated successfully.
Dec 15 01:57:27 localhost systemd[1]: session-5.scope: Consumed 3.857s CPU time.
Dec 15 01:57:27 localhost systemd-logind[763]: Session 5 logged out. Waiting for processes to exit.
Dec 15 01:57:27 localhost systemd-logind[763]: Removed session 5.
Dec 15 01:58:41 localhost sshd[6554]: main: sshd: ssh-rsa algorithm is disabled
Dec 15 01:58:41 localhost systemd-logind[763]: New session 6 of user zuul.
Dec 15 01:58:41 localhost systemd[1]: Started Session 6 of User zuul.
Dec 15 01:58:41 localhost systemd[1]: Starting RHSM dbus service...
Dec 15 01:58:42 localhost systemd[1]: Started RHSM dbus service.
Dec 15 01:58:42 localhost rhsm-service[6578]: INFO [subscription_manager.i18n:169] Could not import locale for C: [Errno 2] No translation file found for domain: 'rhsm'
Dec 15 01:58:42 localhost rhsm-service[6578]: INFO [subscription_manager.i18n:139] Could not import locale either for C_C: [Errno 2] No translation file found for domain: 'rhsm'
Dec 15 01:58:42 localhost rhsm-service[6578]: INFO [subscription_manager.i18n:169] Could not import locale for C: [Errno 2] No translation file found for domain: 'rhsm'
Dec 15 01:58:42 localhost rhsm-service[6578]: INFO [subscription_manager.i18n:139] Could not import locale either for C_C: [Errno 2] No translation file found for domain: 'rhsm'
Dec 15 01:58:46 localhost rhsm-service[6578]: INFO [subscription_manager.managerlib:90] Consumer created: np0005559462.novalocal (d8870b96-e9e0-4007-a17e-134952a74633)
Dec 15 01:58:46 localhost subscription-manager[6578]: Registered system with identity: d8870b96-e9e0-4007-a17e-134952a74633
Dec 15 01:58:47 localhost rhsm-service[6578]: INFO [subscription_manager.entcertlib:131] certs updated:
Dec 15 01:58:47 localhost rhsm-service[6578]: Total updates: 1
Dec 15 01:58:47 localhost rhsm-service[6578]: Found (local) serial# []
Dec 15 01:58:47 localhost rhsm-service[6578]: Expected (UEP) serial# [1144770777762329781]
Dec 15 01:58:47 localhost rhsm-service[6578]: Added (new)
Dec 15 01:58:47 localhost rhsm-service[6578]: [sn:1144770777762329781 ( Content Access,) @ /etc/pki/entitlement/1144770777762329781.pem]
Dec 15 01:58:47 localhost rhsm-service[6578]: Deleted (rogue):
Dec 15 01:58:47 localhost rhsm-service[6578]:
Dec 15 01:58:47 localhost subscription-manager[6578]: Added subscription for 'Content Access' contract 'None'
Dec 15 01:58:47 localhost subscription-manager[6578]: Added subscription for product ' Content Access'
Dec 15 01:58:48 localhost rhsm-service[6578]: INFO [subscription_manager.i18n:169] Could not import locale for C: [Errno 2] No translation file found for domain: 'rhsm'
Dec 15 01:58:48 localhost rhsm-service[6578]: INFO [subscription_manager.i18n:139] Could not import locale either for C_C: [Errno 2] No translation file found for domain: 'rhsm'
Dec 15 01:58:48 localhost rhsm-service[6578]: WARNING [subscription_manager.cert_sorter:194] Installed product 479 not present in response from server.
Dec 15 01:58:48 localhost rhsm-service[6578]: WARNING [subscription_manager.cert_sorter:194] Installed product 479 not present in response from server.
Dec 15 01:58:49 localhost rhsm-service[6578]: WARNING [subscription_manager.cert_sorter:194] Installed product 479 not present in response from server.
Dec 15 01:58:49 localhost rhsm-service[6578]: WARNING [subscription_manager.cert_sorter:194] Installed product 479 not present in response from server.
Dec 15 01:58:49 localhost rhsm-service[6578]: WARNING [subscription_manager.cert_sorter:194] Installed product 479 not present in response from server.
Dec 15 01:58:54 localhost python3[6670]: ansible-ansible.legacy.command Invoked with _raw_params=cat /etc/redhat-release zuul_log_id=fa163efc-24cc-1d36-02be-00000000000d-1-overcloudnovacompute0 zuul_ansible_split_streams=False _uses_shell=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 15 01:58:55 localhost python3[6689]: ansible-ansible.legacy.dnf Invoked with name=['podman'] state=present allow_downgrade=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 allowerasing=False nobest=False use_backend=auto conf_file=None disable_excludes=None download_dir=None list=None releasever=None
Dec 15 01:59:25 localhost setsebool[6764]: The virt_use_nfs policy boolean was changed to 1 by root
Dec 15 01:59:25 localhost setsebool[6764]: The virt_sandbox_use_all_caps policy boolean was changed to 1 by root
Dec 15 01:59:33 localhost kernel: SELinux: Converting 406 SID table entries...
Dec 15 01:59:33 localhost kernel: SELinux: policy capability network_peer_controls=1
Dec 15 01:59:33 localhost kernel: SELinux: policy capability open_perms=1
Dec 15 01:59:33 localhost kernel: SELinux: policy capability extended_socket_class=1
Dec 15 01:59:33 localhost kernel: SELinux: policy capability always_check_network=0
Dec 15 01:59:33 localhost kernel: SELinux: policy capability cgroup_seclabel=1
Dec 15 01:59:33 localhost kernel: SELinux: policy capability nnp_nosuid_transition=1
Dec 15 01:59:33 localhost kernel: SELinux: policy capability genfs_seclabel_symlinks=1
Dec 15 01:59:45 localhost dbus-broker-launch[755]: avc: op=load_policy lsm=selinux seqno=3 res=1
Dec 15 01:59:45 localhost systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Dec 15 01:59:45 localhost systemd[1]: Starting man-db-cache-update.service...
Dec 15 01:59:45 localhost systemd[1]: Reloading.
Dec 15 01:59:46 localhost systemd-rc-local-generator[7618]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 15 01:59:46 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Dec 15 01:59:46 localhost systemd[1]: Queuing reload/restart jobs for marked units…
Dec 15 01:59:47 localhost rhsm-service[6578]: WARNING [subscription_manager.cert_sorter:194] Installed product 479 not present in response from server.
Dec 15 01:59:53 localhost systemd[1]: man-db-cache-update.service: Deactivated successfully.
Dec 15 01:59:53 localhost systemd[1]: Finished man-db-cache-update.service.
Dec 15 01:59:53 localhost systemd[1]: man-db-cache-update.service: Consumed 9.535s CPU time.
Dec 15 01:59:53 localhost systemd[1]: run-r62a9a729c6ce4f0ebfdda05310f3cd13.service: Deactivated successfully.
Dec 15 02:00:42 localhost systemd[1]: var-lib-containers-storage-overlay-metacopy\x2dcheck130122334-merged.mount: Deactivated successfully.
Dec 15 02:00:42 localhost podman[18356]: 2025-12-15 07:00:42.505332548 +0000 UTC m=+0.089541050 system refresh
Dec 15 02:00:43 localhost systemd[4176]: Starting D-Bus User Message Bus...
Dec 15 02:00:43 localhost dbus-broker-launch[18416]: Policy to allow eavesdropping in /usr/share/dbus-1/session.conf +31: Eavesdropping is deprecated and ignored
Dec 15 02:00:43 localhost dbus-broker-launch[18416]: Policy to allow eavesdropping in /usr/share/dbus-1/session.conf +33: Eavesdropping is deprecated and ignored
Dec 15 02:00:43 localhost systemd[4176]: Started D-Bus User Message Bus.
Dec 15 02:00:43 localhost journal[18416]: Ready
Dec 15 02:00:43 localhost systemd[4176]: selinux: avc: op=load_policy lsm=selinux seqno=3 res=1
Dec 15 02:00:43 localhost systemd[4176]: Created slice Slice /user.
Dec 15 02:00:43 localhost systemd[4176]: podman-18398.scope: unit configures an IP firewall, but not running as root.
Dec 15 02:00:43 localhost systemd[4176]: (This warning is only shown for the first unit using IP firewalling.)
Dec 15 02:00:43 localhost systemd[4176]: Started podman-18398.scope.
Dec 15 02:00:43 localhost systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Dec 15 02:00:43 localhost systemd[4176]: Started podman-pause-952f357d.scope.
Dec 15 02:00:45 localhost systemd[1]: session-6.scope: Deactivated successfully.
Dec 15 02:00:45 localhost systemd[1]: session-6.scope: Consumed 49.233s CPU time.
Dec 15 02:00:46 localhost systemd-logind[763]: Session 6 logged out. Waiting for processes to exit.
Dec 15 02:00:46 localhost systemd-logind[763]: Removed session 6.
Dec 15 02:00:59 localhost sshd[18420]: main: sshd: ssh-rsa algorithm is disabled
Dec 15 02:00:59 localhost sshd[18422]: main: sshd: ssh-rsa algorithm is disabled
Dec 15 02:00:59 localhost sshd[18419]: main: sshd: ssh-rsa algorithm is disabled
Dec 15 02:00:59 localhost sshd[18418]: main: sshd: ssh-rsa algorithm is disabled
Dec 15 02:00:59 localhost sshd[18421]: main: sshd: ssh-rsa algorithm is disabled
Dec 15 02:01:04 localhost sshd[18443]: main: sshd: ssh-rsa algorithm is disabled
Dec 15 02:01:04 localhost systemd-logind[763]: New session 7 of user zuul.
Dec 15 02:01:04 localhost systemd[1]: Started Session 7 of User zuul.
Dec 15 02:01:04 localhost python3[18460]: ansible-ansible.posix.authorized_key Invoked with user=zuul key=ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBM3Y8v/z0kR7LEyMxrsWGV7LCNYhqetMZn0OXa73Em2Xlp7LHfpdEosyf/OxH7TYvSJXu+bdInEjXLIH/dz7A0k= zuul@np0005559456.novalocal#012 manage_dir=True state=present exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Dec 15 02:01:05 localhost python3[18476]: ansible-ansible.posix.authorized_key Invoked with user=root key=ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBM3Y8v/z0kR7LEyMxrsWGV7LCNYhqetMZn0OXa73Em2Xlp7LHfpdEosyf/OxH7TYvSJXu+bdInEjXLIH/dz7A0k= zuul@np0005559456.novalocal#012 manage_dir=True state=present exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Dec 15 02:01:06 localhost systemd[1]: session-7.scope: Deactivated successfully.
Dec 15 02:01:06 localhost systemd-logind[763]: Session 7 logged out. Waiting for processes to exit.
Dec 15 02:01:06 localhost systemd-logind[763]: Removed session 7.
Dec 15 02:02:40 localhost sshd[18478]: main: sshd: ssh-rsa algorithm is disabled
Dec 15 02:02:40 localhost systemd-logind[763]: New session 8 of user zuul.
Dec 15 02:02:40 localhost systemd[1]: Started Session 8 of User zuul.
Dec 15 02:02:40 localhost python3[18497]: ansible-authorized_key Invoked with user=root manage_dir=True key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQC1ko8xh0bPiR6+NCG4km3etin5rm3hZZmVHXuDDaTrTWq3PUz5sEmSbiDQOJj4mOpsouNXjYHkjXSuRLTQ5dqF1BWU5bgiOkTIqccwZdxkqfM2VFXFj/Ej621HUBRYHf7PK5zkl+8G1g2RkkiSd886DSw6I1J+2uT/e+4/0G1vsACTaNArP3/JSOh0hdwu+fnjybrp4sauiJsWaQvwbWao/txJqznQhymNwHZVFRMhFy+x+oDr4ry7w+X2JuGz2ydbUojBUG0REWTKmU4EZsyDsx77GzIJwfHsUUuJ0t4DcDalVqz20D+LXTug8AfgSovuNhZTz8AqRXfCOZK9haLb+3tJwKExMBdnj0cacNw6O23jZTIMbJK+qoxGN4mzqr8RNFVPPLVhL5tcfb2MvN7TWs7+oT0K5xmxDkEnr7iSpmPV0d8TD/wNwfJmtu2W6TWL5zgA6U3GyfnTH2+nhrFk5Ou+yaLES2+Wx8kb0NLWgDnlCcYy9dw1eP3GSc3GIzM= zuul-build-sshkey state=present exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Dec 15 02:02:41 localhost python3[18513]: ansible-user Invoked with name=root state=present non_unique=False force=False remove=False create_home=True system=False move_home=False append=False ssh_key_bits=0 ssh_key_type=rsa ssh_key_comment=ansible-generated on np0005559462.novalocal update_password=always uid=None group=None groups=None comment=None home=None shell=None password=NOT_LOGGING_PARAMETER login_class=None password_expire_max=None password_expire_min=None hidden=None seuser=None skeleton=None generate_ssh_key=None ssh_key_file=None ssh_key_passphrase=NOT_LOGGING_PARAMETER expires=None password_lock=None local=None profile=None authorization=None role=None umask=None
Dec 15 02:02:43 localhost python3[18563]: ansible-ansible.legacy.stat Invoked with path=/root/.ssh/id_rsa follow=False get_checksum=False checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Dec 15 02:02:43 localhost python3[18606]: ansible-ansible.legacy.copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1765782162.9071033-132-194421688159220/source dest=/root/.ssh/id_rsa mode=384 owner=root force=False _original_basename=590a0ff85d6248fba2903531d2905bb8_id_rsa follow=False checksum=b38504d433636763ead663645f5f650975b888ea backup=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 15 02:02:44 localhost python3[18668]: ansible-ansible.legacy.stat Invoked with path=/root/.ssh/id_rsa.pub follow=False get_checksum=False checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Dec 15 02:02:45 localhost python3[18711]: ansible-ansible.legacy.copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1765782164.578159-224-2124682842225/source dest=/root/.ssh/id_rsa.pub mode=420 owner=root force=False _original_basename=590a0ff85d6248fba2903531d2905bb8_id_rsa.pub follow=False checksum=2e3e36d0fa1c6d0defe218a502cdeab9c5909fc5 backup=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 15 02:02:47 localhost python3[18741]: ansible-ansible.builtin.file Invoked with path=/etc/nodepool state=directory mode=0777 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 15 02:02:48 localhost python3[18787]: ansible-ansible.legacy.stat Invoked with path=/etc/nodepool/sub_nodes follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Dec 15 02:02:48 localhost python3[18803]: ansible-ansible.legacy.file Invoked with dest=/etc/nodepool/sub_nodes _original_basename=tmpvw7ezh8i recurse=False state=file path=/etc/nodepool/sub_nodes force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 15 02:02:49 localhost python3[18863]: ansible-ansible.legacy.stat Invoked with path=/etc/nodepool/sub_nodes_private follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Dec 15 02:02:49 localhost python3[18879]: ansible-ansible.legacy.file Invoked with dest=/etc/nodepool/sub_nodes_private _original_basename=tmpxsmvm8ew recurse=False state=file path=/etc/nodepool/sub_nodes_private force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 15 02:02:51 localhost python3[18939]: ansible-ansible.legacy.stat Invoked with path=/etc/nodepool/node_private follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Dec 15 02:02:51 localhost python3[18955]: ansible-ansible.legacy.file Invoked with dest=/etc/nodepool/node_private _original_basename=tmpf7gn0ohp recurse=False state=file path=/etc/nodepool/node_private force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 15 02:02:52 localhost systemd[1]: session-8.scope: Deactivated successfully.
Dec 15 02:02:52 localhost systemd[1]: session-8.scope: Consumed 3.576s CPU time.
Dec 15 02:02:52 localhost systemd-logind[763]: Session 8 logged out. Waiting for processes to exit.
Dec 15 02:02:52 localhost systemd-logind[763]: Removed session 8.
Dec 15 02:04:57 localhost sshd[18973]: main: sshd: ssh-rsa algorithm is disabled
Dec 15 02:04:57 localhost systemd[1]: Started Session 9 of User zuul.
Dec 15 02:04:57 localhost systemd-logind[763]: New session 9 of user zuul.
Dec 15 02:04:57 localhost python3[19019]: ansible-ansible.legacy.command Invoked with _raw_params=hostname _uses_shell=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 15 02:08:28 localhost systemd[1]: Starting dnf makecache...
Dec 15 02:08:28 localhost dnf[19021]: Updating Subscription Management repositories.
Dec 15 02:08:30 localhost dnf[19021]: Failed determining last makecache time.
Dec 15 02:08:30 localhost dnf[19021]: Red Hat Enterprise Linux 9 for x86_64 - BaseOS 22 kB/s | 4.1 kB 00:00
Dec 15 02:08:30 localhost dnf[19021]: Red Hat Enterprise Linux 9 for x86_64 - BaseOS 45 kB/s | 4.1 kB 00:00
Dec 15 02:08:31 localhost dnf[19021]: Red Hat Enterprise Linux 9 for x86_64 - AppStre 47 kB/s | 4.5 kB 00:00
Dec 15 02:08:31 localhost dnf[19021]: Red Hat Enterprise Linux 9 for x86_64 - AppStre 54 kB/s | 4.5 kB 00:00
Dec 15 02:08:31 localhost dnf[19021]: Metadata cache created.
Dec 15 02:08:31 localhost systemd[1]: dnf-makecache.service: Deactivated successfully.
Dec 15 02:08:31 localhost systemd[1]: Finished dnf makecache.
Dec 15 02:08:31 localhost systemd[1]: dnf-makecache.service: Consumed 2.505s CPU time.
Dec 15 02:09:57 localhost systemd[1]: session-9.scope: Deactivated successfully.
Dec 15 02:09:57 localhost systemd-logind[763]: Session 9 logged out. Waiting for processes to exit.
Dec 15 02:09:57 localhost systemd-logind[763]: Removed session 9.
Dec 15 02:16:25 localhost sshd[19031]: main: sshd: ssh-rsa algorithm is disabled
Dec 15 02:16:25 localhost systemd-logind[763]: New session 10 of user zuul.
Dec 15 02:16:25 localhost systemd[1]: Started Session 10 of User zuul.
Dec 15 02:16:26 localhost python3[19048]: ansible-ansible.legacy.command Invoked with _raw_params=cat /etc/redhat-release zuul_log_id=fa163efc-24cc-4281-54ac-00000000000c-1-overcloudnovacompute0 zuul_ansible_split_streams=False _uses_shell=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 15 02:16:27 localhost python3[19068]: ansible-ansible.legacy.command Invoked with _raw_params=yum clean all zuul_log_id=fa163efc-24cc-4281-54ac-00000000000d-1-overcloudnovacompute0 zuul_ansible_split_streams=False _uses_shell=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 15 02:16:32 localhost python3[19088]: ansible-community.general.rhsm_repository Invoked with name=['rhel-9-for-x86_64-baseos-eus-rpms'] state=enabled purge=False
Dec 15 02:16:35 localhost rhsm-service[6578]: WARNING [subscription_manager.cert_sorter:194] Installed product 479 not present in response from server.
Dec 15 02:17:29 localhost python3[19245]: ansible-community.general.rhsm_repository Invoked with name=['rhel-9-for-x86_64-appstream-eus-rpms'] state=enabled purge=False
Dec 15 02:17:32 localhost rhsm-service[6578]: WARNING [subscription_manager.cert_sorter:194] Installed product 479 not present in response from server.
Dec 15 02:17:32 localhost rhsm-service[6578]: WARNING [subscription_manager.cert_sorter:194] Installed product 479 not present in response from server.
Dec 15 02:17:41 localhost python3[19386]: ansible-community.general.rhsm_repository Invoked with name=['rhel-9-for-x86_64-highavailability-eus-rpms'] state=enabled purge=False
Dec 15 02:17:43 localhost rhsm-service[6578]: WARNING [subscription_manager.cert_sorter:194] Installed product 479 not present in response from server.
Dec 15 02:17:43 localhost rhsm-service[6578]: WARNING [subscription_manager.cert_sorter:194] Installed product 479 not present in response from server.
Dec 15 02:17:48 localhost rhsm-service[6578]: WARNING [subscription_manager.cert_sorter:194] Installed product 479 not present in response from server.
Dec 15 02:18:11 localhost python3[19661]: ansible-community.general.rhsm_repository Invoked with name=['fast-datapath-for-rhel-9-x86_64-rpms'] state=enabled purge=False
Dec 15 02:18:14 localhost rhsm-service[6578]: WARNING [subscription_manager.cert_sorter:194] Installed product 479 not present in response from server.
Dec 15 02:18:14 localhost rhsm-service[6578]: WARNING [subscription_manager.cert_sorter:194] Installed product 479 not present in response from server.
Dec 15 02:18:20 localhost rhsm-service[6578]: WARNING [subscription_manager.cert_sorter:194] Installed product 479 not present in response from server.
Dec 15 02:18:20 localhost rhsm-service[6578]: WARNING [subscription_manager.cert_sorter:194] Installed product 479 not present in response from server.
Dec 15 02:18:43 localhost python3[19997]: ansible-community.general.rhsm_repository Invoked with name=['openstack-17.1-for-rhel-9-x86_64-rpms'] state=enabled purge=False
Dec 15 02:18:45 localhost rhsm-service[6578]: WARNING [subscription_manager.cert_sorter:194] Installed product 479 not present in response from server.
Dec 15 02:18:46 localhost rhsm-service[6578]: WARNING [subscription_manager.cert_sorter:194] Installed product 479 not present in response from server.
Dec 15 02:18:51 localhost rhsm-service[6578]: WARNING [subscription_manager.cert_sorter:194] Installed product 479 not present in response from server.
Dec 15 02:18:51 localhost rhsm-service[6578]: WARNING [subscription_manager.cert_sorter:194] Installed product 479 not present in response from server.
Dec 15 02:19:16 localhost python3[20276]: ansible-ansible.legacy.command Invoked with _raw_params=yum repolist --enabled#012 _uses_shell=True zuul_log_id=fa163efc-24cc-4281-54ac-000000000013-1-overcloudnovacompute0 zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 15 02:19:21 localhost python3[20295]: ansible-ansible.legacy.dnf Invoked with name=['openvswitch', 'os-net-config', 'ansible-core'] state=present update_cache=True allow_downgrade=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_only=False validate_certs=True sslverify=True lock_timeout=30 allowerasing=False nobest=False use_backend=auto conf_file=None disable_excludes=None download_dir=None list=None releasever=None
Dec 15 02:19:32 localhost systemd[1]: Started daily update of the root trust anchor for DNSSEC.
Dec 15 02:19:40 localhost kernel: SELinux: Converting 499 SID table entries...
Dec 15 02:19:40 localhost kernel: SELinux: policy capability network_peer_controls=1
Dec 15 02:19:40 localhost kernel: SELinux: policy capability open_perms=1
Dec 15 02:19:40 localhost kernel: SELinux: policy capability extended_socket_class=1
Dec 15 02:19:40 localhost kernel: SELinux: policy capability always_check_network=0
Dec 15 02:19:40 localhost kernel: SELinux: policy capability cgroup_seclabel=1
Dec 15 02:19:40 localhost kernel: SELinux: policy capability nnp_nosuid_transition=1
Dec 15 02:19:40 localhost kernel: SELinux: policy capability genfs_seclabel_symlinks=1
Dec 15 02:19:44 localhost dbus-broker-launch[755]: avc: op=load_policy lsm=selinux seqno=4 res=1
Dec 15 02:19:44 localhost systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Dec 15 02:19:44 localhost systemd[1]: Starting man-db-cache-update.service...
Dec 15 02:19:44 localhost systemd[1]: Reloading.
Dec 15 02:19:44 localhost systemd-rc-local-generator[20969]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 15 02:19:44 localhost systemd-sysv-generator[20972]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 15 02:19:44 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Dec 15 02:19:44 localhost systemd[1]: Queuing reload/restart jobs for marked units…
Dec 15 02:19:45 localhost systemd[1]: man-db-cache-update.service: Deactivated successfully.
Dec 15 02:19:45 localhost systemd[1]: Finished man-db-cache-update.service.
Dec 15 02:19:45 localhost systemd[1]: run-r6383e3863b2349f2998f794e5f911333.service: Deactivated successfully.
Dec 15 02:19:45 localhost rhsm-service[6578]: WARNING [subscription_manager.cert_sorter:194] Installed product 479 not present in response from server.
Dec 15 02:19:46 localhost rhsm-service[6578]: WARNING [subscription_manager.cert_sorter:194] Installed product 479 not present in response from server.
Dec 15 02:20:12 localhost python3[21641]: ansible-ansible.legacy.command Invoked with _raw_params=ansible-galaxy collection install ansible.posix#012 _uses_shell=True zuul_log_id=fa163efc-24cc-4281-54ac-000000000015-1-overcloudnovacompute0 zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 15 02:20:39 localhost python3[21661]: ansible-ansible.builtin.file Invoked with path=/etc/os-net-config state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 15 02:20:40 localhost python3[21709]: ansible-ansible.legacy.stat Invoked with path=/etc/os-net-config/tripleo_config.yaml follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Dec 15 02:20:40 localhost python3[21752]: ansible-ansible.legacy.copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1765783239.8812582-291-108421211914073/source dest=/etc/os-net-config/tripleo_config.yaml mode=None follow=False _original_basename=overcloud_net_config.j2 checksum=3358dfc6c6ce646155135d0cad900026cb34ba08 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 15 02:20:42 localhost python3[21782]: ansible-community.general.nmcli Invoked with conn_name=ci-private-network state=absent ignore_unsupported_suboptions=False autoconnect=True gw4_ignore_auto=False never_default4=False dns4_ignore_auto=False may_fail4=True gw6_ignore_auto=False dns6_ignore_auto=False mode=balance-rr stp=True priority=128 slavepriority=32 forwarddelay=15 hellotime=2 maxage=20 ageingtime=300 hairpin=False path_cost=100 runner=roundrobin master=None slave_type=None ifname=None type=None ip4=None gw4=None routes4=None routes4_extended=None route_metric4=None routing_rules4=None dns4=None dns4_search=None dns4_options=None method4=None dhcp_client_id=None ip6=None gw6=None dns6=None dns6_search=None dns6_options=None routes6=None routes6_extended=None route_metric6=None method6=None ip_privacy6=None addr_gen_mode6=None miimon=None downdelay=None updelay=None xmit_hash_policy=None arp_interval=None arp_ip_target=None primary=None mtu=None mac=None zone=None runner_hwaddr_policy=None runner_fast_rate=None vlanid=None vlandev=None flags=None ingress=None egress=None vxlan_id=None vxlan_local=None vxlan_remote=None ip_tunnel_dev=None ip_tunnel_local=None ip_tunnel_remote=None ip_tunnel_input_key=NOT_LOGGING_PARAMETER ip_tunnel_output_key=NOT_LOGGING_PARAMETER ssid=None wifi=None wifi_sec=NOT_LOGGING_PARAMETER gsm=None macvlan=None wireguard=None vpn=None transport_mode=None
Dec 15 02:20:42 localhost systemd-journald[618]: Field hash table of /run/log/journal/738a39f68bc78fb81032e509449fb759/system.journal has a fill level at 89.2 (297 of 333 items), suggesting rotation.
Dec 15 02:20:42 localhost systemd-journald[618]: /run/log/journal/738a39f68bc78fb81032e509449fb759/system.journal: Journal header limits reached or header out-of-date, rotating.
Dec 15 02:20:42 localhost rsyslogd[759]: imjournal: journal files changed, reloading... [v8.2102.0-111.el9 try https://www.rsyslog.com/e/0 ]
Dec 15 02:20:42 localhost rsyslogd[759]: imjournal: journal files changed, reloading... [v8.2102.0-111.el9 try https://www.rsyslog.com/e/0 ]
Dec 15 02:20:42 localhost python3[21803]: ansible-community.general.nmcli Invoked with conn_name=ci-private-network-20 state=absent ignore_unsupported_suboptions=False autoconnect=True gw4_ignore_auto=False never_default4=False dns4_ignore_auto=False may_fail4=True gw6_ignore_auto=False dns6_ignore_auto=False mode=balance-rr stp=True priority=128 slavepriority=32 forwarddelay=15 hellotime=2 maxage=20 ageingtime=300 hairpin=False path_cost=100 runner=roundrobin master=None slave_type=None ifname=None type=None ip4=None gw4=None routes4=None routes4_extended=None route_metric4=None routing_rules4=None dns4=None dns4_search=None dns4_options=None method4=None dhcp_client_id=None ip6=None gw6=None dns6=None dns6_search=None dns6_options=None routes6=None routes6_extended=None route_metric6=None method6=None ip_privacy6=None addr_gen_mode6=None miimon=None downdelay=None updelay=None xmit_hash_policy=None arp_interval=None arp_ip_target=None primary=None mtu=None mac=None zone=None runner_hwaddr_policy=None runner_fast_rate=None vlanid=None vlandev=None flags=None ingress=None egress=None vxlan_id=None vxlan_local=None vxlan_remote=None ip_tunnel_dev=None ip_tunnel_local=None ip_tunnel_remote=None ip_tunnel_input_key=NOT_LOGGING_PARAMETER ip_tunnel_output_key=NOT_LOGGING_PARAMETER ssid=None wifi=None wifi_sec=NOT_LOGGING_PARAMETER gsm=None macvlan=None wireguard=None vpn=None transport_mode=None
Dec 15 02:20:42 localhost python3[21823]: ansible-community.general.nmcli Invoked with conn_name=ci-private-network-21 state=absent ignore_unsupported_suboptions=False autoconnect=True gw4_ignore_auto=False never_default4=False dns4_ignore_auto=False may_fail4=True gw6_ignore_auto=False dns6_ignore_auto=False mode=balance-rr stp=True priority=128 slavepriority=32 forwarddelay=15 hellotime=2 maxage=20 ageingtime=300 hairpin=False path_cost=100 runner=roundrobin master=None slave_type=None ifname=None type=None ip4=None gw4=None routes4=None routes4_extended=None route_metric4=None routing_rules4=None dns4=None dns4_search=None dns4_options=None method4=None dhcp_client_id=None ip6=None gw6=None dns6=None dns6_search=None dns6_options=None routes6=None routes6_extended=None route_metric6=None method6=None ip_privacy6=None addr_gen_mode6=None miimon=None downdelay=None updelay=None xmit_hash_policy=None arp_interval=None arp_ip_target=None primary=None mtu=None mac=None zone=None runner_hwaddr_policy=None runner_fast_rate=None vlanid=None vlandev=None flags=None ingress=None egress=None vxlan_id=None vxlan_local=None vxlan_remote=None ip_tunnel_dev=None ip_tunnel_local=None ip_tunnel_remote=None ip_tunnel_input_key=NOT_LOGGING_PARAMETER ip_tunnel_output_key=NOT_LOGGING_PARAMETER ssid=None wifi=None wifi_sec=NOT_LOGGING_PARAMETER gsm=None macvlan=None wireguard=None vpn=None transport_mode=None
Dec 15 02:20:42 localhost python3[21843]: ansible-community.general.nmcli Invoked with conn_name=ci-private-network-22 state=absent ignore_unsupported_suboptions=False autoconnect=True gw4_ignore_auto=False never_default4=False dns4_ignore_auto=False may_fail4=True gw6_ignore_auto=False dns6_ignore_auto=False mode=balance-rr stp=True priority=128 slavepriority=32 forwarddelay=15 hellotime=2 maxage=20 ageingtime=300 hairpin=False path_cost=100 runner=roundrobin master=None slave_type=None ifname=None type=None ip4=None gw4=None routes4=None routes4_extended=None route_metric4=None routing_rules4=None dns4=None dns4_search=None dns4_options=None method4=None dhcp_client_id=None ip6=None gw6=None dns6=None dns6_search=None dns6_options=None routes6=None routes6_extended=None route_metric6=None method6=None ip_privacy6=None addr_gen_mode6=None miimon=None downdelay=None updelay=None xmit_hash_policy=None arp_interval=None arp_ip_target=None primary=None mtu=None mac=None zone=None runner_hwaddr_policy=None runner_fast_rate=None vlanid=None vlandev=None flags=None ingress=None egress=None vxlan_id=None vxlan_local=None vxlan_remote=None ip_tunnel_dev=None ip_tunnel_local=None ip_tunnel_remote=None ip_tunnel_input_key=NOT_LOGGING_PARAMETER ip_tunnel_output_key=NOT_LOGGING_PARAMETER ssid=None wifi=None wifi_sec=NOT_LOGGING_PARAMETER gsm=None macvlan=None wireguard=None vpn=None transport_mode=None
Dec 15 02:20:43 localhost python3[21863]: ansible-community.general.nmcli Invoked with conn_name=ci-private-network-23 state=absent ignore_unsupported_suboptions=False autoconnect=True gw4_ignore_auto=False never_default4=False dns4_ignore_auto=False may_fail4=True gw6_ignore_auto=False dns6_ignore_auto=False mode=balance-rr stp=True priority=128 slavepriority=32 forwarddelay=15 hellotime=2 maxage=20 ageingtime=300 hairpin=False path_cost=100 runner=roundrobin master=None slave_type=None ifname=None type=None ip4=None gw4=None routes4=None routes4_extended=None route_metric4=None routing_rules4=None dns4=None dns4_search=None dns4_options=None method4=None dhcp_client_id=None ip6=None gw6=None dns6=None dns6_search=None dns6_options=None routes6=None routes6_extended=None route_metric6=None method6=None ip_privacy6=None addr_gen_mode6=None miimon=None downdelay=None updelay=None xmit_hash_policy=None arp_interval=None arp_ip_target=None primary=None mtu=None mac=None zone=None runner_hwaddr_policy=None runner_fast_rate=None vlanid=None vlandev=None flags=None ingress=None egress=None vxlan_id=None vxlan_local=None vxlan_remote=None ip_tunnel_dev=None ip_tunnel_local=None ip_tunnel_remote=None ip_tunnel_input_key=NOT_LOGGING_PARAMETER ip_tunnel_output_key=NOT_LOGGING_PARAMETER ssid=None wifi=None wifi_sec=NOT_LOGGING_PARAMETER gsm=None macvlan=None wireguard=None vpn=None transport_mode=None
Dec 15 02:20:46 localhost python3[21883]: ansible-ansible.builtin.systemd Invoked with name=network state=started daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Dec 15 02:20:47 localhost systemd[1]: Starting LSB: Bring up/down
networking... Dec 15 02:20:47 localhost network[21886]: WARN : [network] You are using 'network' service provided by 'network-scripts', which are now deprecated. Dec 15 02:20:47 localhost network[21897]: You are using 'network' service provided by 'network-scripts', which are now deprecated. Dec 15 02:20:47 localhost network[21886]: WARN : [network] 'network-scripts' will be removed from distribution in near future. Dec 15 02:20:47 localhost network[21898]: 'network-scripts' will be removed from distribution in near future. Dec 15 02:20:47 localhost network[21886]: WARN : [network] It is advised to switch to 'NetworkManager' instead for network management. Dec 15 02:20:47 localhost network[21899]: It is advised to switch to 'NetworkManager' instead for network management. Dec 15 02:20:47 localhost NetworkManager[5963]: [1765783247.8049] audit: op="connections-reload" pid=21927 uid=0 result="success" Dec 15 02:20:47 localhost network[21886]: Bringing up loopback interface: [ OK ] Dec 15 02:20:48 localhost NetworkManager[5963]: [1765783248.0061] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-eth0" pid=22015 uid=0 result="success" Dec 15 02:20:48 localhost network[21886]: Bringing up interface eth0: [ OK ] Dec 15 02:20:48 localhost systemd[1]: Started LSB: Bring up/down networking. Dec 15 02:20:48 localhost python3[22056]: ansible-ansible.builtin.systemd Invoked with name=openvswitch state=started daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None Dec 15 02:20:49 localhost systemd[1]: Starting Open vSwitch Database Unit... Dec 15 02:20:49 localhost chown[22060]: /usr/bin/chown: cannot access '/run/openvswitch': No such file or directory Dec 15 02:20:49 localhost ovs-ctl[22065]: /etc/openvswitch/conf.db does not exist ... (warning). 
Dec 15 02:20:49 localhost ovs-ctl[22065]: Creating empty database /etc/openvswitch/conf.db [ OK ] Dec 15 02:20:49 localhost ovs-ctl[22065]: Starting ovsdb-server [ OK ] Dec 15 02:20:49 localhost ovs-vsctl[22115]: ovs|00001|vsctl|INFO|Called as ovs-vsctl --no-wait -- init -- set Open_vSwitch . db-version=8.5.1 Dec 15 02:20:49 localhost ovs-vsctl[22135]: ovs|00001|vsctl|INFO|Called as ovs-vsctl --no-wait set Open_vSwitch . ovs-version=3.3.6-141.el9fdp "external-ids:system-id=\"12d96d64-e862-4f68-81e5-8d9ec5d3a5e2\"" "external-ids:rundir=\"/var/run/openvswitch\"" "system-type=\"rhel\"" "system-version=\"9.2\"" Dec 15 02:20:49 localhost ovs-ctl[22065]: Configuring Open vSwitch system IDs [ OK ] Dec 15 02:20:49 localhost ovs-ctl[22065]: Enabling remote OVSDB managers [ OK ] Dec 15 02:20:49 localhost ovs-vsctl[22141]: ovs|00001|vsctl|INFO|Called as ovs-vsctl --no-wait add Open_vSwitch . external-ids hostname=np0005559462.novalocal Dec 15 02:20:49 localhost systemd[1]: Started Open vSwitch Database Unit. Dec 15 02:20:49 localhost systemd[1]: Starting Open vSwitch Delete Transient Ports... Dec 15 02:20:49 localhost systemd[1]: Finished Open vSwitch Delete Transient Ports. Dec 15 02:20:49 localhost systemd[1]: Starting Open vSwitch Forwarding Unit... Dec 15 02:20:50 localhost kernel: openvswitch: Open vSwitch switching datapath Dec 15 02:20:50 localhost ovs-ctl[22186]: Inserting openvswitch module [ OK ] Dec 15 02:20:50 localhost ovs-ctl[22154]: Starting ovs-vswitchd [ OK ] Dec 15 02:20:50 localhost ovs-vsctl[22204]: ovs|00001|vsctl|INFO|Called as ovs-vsctl --no-wait add Open_vSwitch . external-ids hostname=np0005559462.novalocal Dec 15 02:20:50 localhost ovs-ctl[22154]: Enabling remote OVSDB managers [ OK ] Dec 15 02:20:50 localhost systemd[1]: Started Open vSwitch Forwarding Unit. Dec 15 02:20:50 localhost systemd[1]: Starting Open vSwitch... Dec 15 02:20:50 localhost systemd[1]: Finished Open vSwitch. 
Dec 15 02:20:50 localhost python3[22222]: ansible-ansible.legacy.command Invoked with _raw_params=os-net-config -c /etc/os-net-config/tripleo_config.yaml#012 _uses_shell=True zuul_log_id=fa163efc-24cc-4281-54ac-00000000001a-1-overcloudnovacompute0 zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None Dec 15 02:20:51 localhost NetworkManager[5963]: [1765783251.7080] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-br-ex" pid=22380 uid=0 result="success" Dec 15 02:20:51 localhost ifup[22381]: You are using 'ifup' script provided by 'network-scripts', which are now deprecated. Dec 15 02:20:51 localhost ifup[22382]: 'network-scripts' will be removed from distribution in near future. Dec 15 02:20:51 localhost ifup[22383]: It is advised to switch to 'NetworkManager' instead - it provides 'ifup/ifdown' scripts as well. Dec 15 02:20:51 localhost NetworkManager[5963]: [1765783251.7352] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-br-ex" pid=22389 uid=0 result="success" Dec 15 02:20:51 localhost ovs-vsctl[22391]: ovs|00001|vsctl|INFO|Called as ovs-vsctl -t 10 -- --may-exist add-br br-ex -- set bridge br-ex other-config:mac-table-size=50000 -- set bridge br-ex other-config:hwaddr=fa:16:3e:b3:bb:e3 -- set bridge br-ex fail_mode=standalone -- del-controller br-ex Dec 15 02:20:51 localhost kernel: device ovs-system entered promiscuous mode Dec 15 02:20:51 localhost systemd-udevd[22130]: Network interface NamePolicy= disabled on kernel command line. 
Dec 15 02:20:51 localhost NetworkManager[5963]: [1765783251.7653] manager: (ovs-system): new Generic device (/org/freedesktop/NetworkManager/Devices/4) Dec 15 02:20:51 localhost kernel: Timeout policy base is empty Dec 15 02:20:51 localhost kernel: Failed to associated timeout policy `ovs_test_tp' Dec 15 02:20:51 localhost kernel: device br-ex entered promiscuous mode Dec 15 02:20:51 localhost NetworkManager[5963]: [1765783251.8063] manager: (br-ex): new Generic device (/org/freedesktop/NetworkManager/Devices/5) Dec 15 02:20:51 localhost NetworkManager[5963]: [1765783251.8330] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-br-ex" pid=22417 uid=0 result="success" Dec 15 02:20:51 localhost NetworkManager[5963]: [1765783251.8535] device (br-ex): carrier: link connected Dec 15 02:20:54 localhost NetworkManager[5963]: [1765783254.9124] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-br-ex" pid=22446 uid=0 result="success" Dec 15 02:20:54 localhost NetworkManager[5963]: [1765783254.9591] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-br-ex" pid=22461 uid=0 result="success" Dec 15 02:20:55 localhost NET[22486]: /etc/sysconfig/network-scripts/ifup-post : updated /etc/resolv.conf Dec 15 02:20:55 localhost NetworkManager[5963]: [1765783255.0451] device (eth1): state change: activated -> unmanaged (reason 'unmanaged', sys-iface-state: 'managed') Dec 15 02:20:55 localhost NetworkManager[5963]: [1765783255.0590] dhcp4 (eth1): canceled DHCP transaction Dec 15 02:20:55 localhost NetworkManager[5963]: [1765783255.0591] dhcp4 (eth1): activation: beginning transaction (timeout in 45 seconds) Dec 15 02:20:55 localhost NetworkManager[5963]: [1765783255.0592] dhcp4 (eth1): state changed no lease Dec 15 02:20:55 localhost NetworkManager[5963]: [1765783255.0644] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-eth1" pid=22495 uid=0 result="success" Dec 15 02:20:55 localhost ifup[22496]: You 
are using 'ifup' script provided by 'network-scripts', which are now deprecated. Dec 15 02:20:55 localhost ifup[22497]: 'network-scripts' will be removed from distribution in near future. Dec 15 02:20:55 localhost systemd[1]: Starting Network Manager Script Dispatcher Service... Dec 15 02:20:55 localhost ifup[22499]: It is advised to switch to 'NetworkManager' instead - it provides 'ifup/ifdown' scripts as well. Dec 15 02:20:55 localhost systemd[1]: Started Network Manager Script Dispatcher Service. Dec 15 02:20:55 localhost NetworkManager[5963]: [1765783255.1005] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-eth1" pid=22512 uid=0 result="success" Dec 15 02:20:55 localhost NetworkManager[5963]: [1765783255.1453] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-eth1" pid=22523 uid=0 result="success" Dec 15 02:20:55 localhost NetworkManager[5963]: [1765783255.1522] device (eth1): carrier: link connected Dec 15 02:20:55 localhost NetworkManager[5963]: [1765783255.1742] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-eth1" pid=22532 uid=0 result="success" Dec 15 02:20:55 localhost ipv6_wait_tentative[22544]: Waiting for interface eth1 IPv6 address(es) to leave the 'tentative' state Dec 15 02:20:56 localhost ipv6_wait_tentative[22549]: Waiting for interface eth1 IPv6 address(es) to leave the 'tentative' state Dec 15 02:20:58 localhost NetworkManager[5963]: [1765783258.1768] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-eth1" pid=22558 uid=0 result="success" Dec 15 02:20:58 localhost ovs-vsctl[22573]: ovs|00001|vsctl|INFO|Called as ovs-vsctl -t 10 -- --if-exists del-port br-ex eth1 -- add-port br-ex eth1 Dec 15 02:20:58 localhost kernel: device eth1 entered promiscuous mode Dec 15 02:20:58 localhost NetworkManager[5963]: [1765783258.3036] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-br-ex" pid=22581 uid=0 result="success" Dec 15 02:20:58 localhost 
ifup[22582]: You are using 'ifup' script provided by 'network-scripts', which are now deprecated. Dec 15 02:20:58 localhost ifup[22583]: 'network-scripts' will be removed from distribution in near future. Dec 15 02:20:58 localhost ifup[22584]: It is advised to switch to 'NetworkManager' instead - it provides 'ifup/ifdown' scripts as well. Dec 15 02:20:58 localhost NetworkManager[5963]: [1765783258.3381] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-br-ex" pid=22590 uid=0 result="success" Dec 15 02:20:58 localhost NetworkManager[5963]: [1765783258.3857] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan21" pid=22600 uid=0 result="success" Dec 15 02:20:58 localhost ifup[22601]: You are using 'ifup' script provided by 'network-scripts', which are now deprecated. Dec 15 02:20:58 localhost ifup[22602]: 'network-scripts' will be removed from distribution in near future. Dec 15 02:20:58 localhost ifup[22603]: It is advised to switch to 'NetworkManager' instead - it provides 'ifup/ifdown' scripts as well. Dec 15 02:20:58 localhost NetworkManager[5963]: [1765783258.4181] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan21" pid=22609 uid=0 result="success" Dec 15 02:20:58 localhost ovs-vsctl[22612]: ovs|00001|vsctl|INFO|Called as ovs-vsctl -t 10 -- --if-exists del-port br-ex vlan21 -- add-port br-ex vlan21 tag=21 -- set Interface vlan21 type=internal Dec 15 02:20:58 localhost kernel: device vlan21 entered promiscuous mode Dec 15 02:20:58 localhost NetworkManager[5963]: [1765783258.4587] manager: (vlan21): new Generic device (/org/freedesktop/NetworkManager/Devices/6) Dec 15 02:20:58 localhost systemd-udevd[22614]: Network interface NamePolicy= disabled on kernel command line. 
Dec 15 02:20:58 localhost NetworkManager[5963]: [1765783258.4826] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan21" pid=22623 uid=0 result="success" Dec 15 02:20:58 localhost NetworkManager[5963]: [1765783258.5024] device (vlan21): carrier: link connected Dec 15 02:21:01 localhost NetworkManager[5963]: [1765783261.5572] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan21" pid=22653 uid=0 result="success" Dec 15 02:21:01 localhost NetworkManager[5963]: [1765783261.6051] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan21" pid=22668 uid=0 result="success" Dec 15 02:21:01 localhost NetworkManager[5963]: [1765783261.6677] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan23" pid=22689 uid=0 result="success" Dec 15 02:21:01 localhost ifup[22690]: You are using 'ifup' script provided by 'network-scripts', which are now deprecated. Dec 15 02:21:01 localhost ifup[22691]: 'network-scripts' will be removed from distribution in near future. Dec 15 02:21:01 localhost ifup[22692]: It is advised to switch to 'NetworkManager' instead - it provides 'ifup/ifdown' scripts as well. Dec 15 02:21:01 localhost NetworkManager[5963]: [1765783261.7000] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan23" pid=22698 uid=0 result="success" Dec 15 02:21:01 localhost ovs-vsctl[22701]: ovs|00001|vsctl|INFO|Called as ovs-vsctl -t 10 -- --if-exists del-port br-ex vlan23 -- add-port br-ex vlan23 tag=23 -- set Interface vlan23 type=internal Dec 15 02:21:01 localhost kernel: device vlan23 entered promiscuous mode Dec 15 02:21:01 localhost NetworkManager[5963]: [1765783261.7746] manager: (vlan23): new Generic device (/org/freedesktop/NetworkManager/Devices/7) Dec 15 02:21:01 localhost systemd-udevd[22703]: Network interface NamePolicy= disabled on kernel command line. 
Dec 15 02:21:01 localhost NetworkManager[5963]: [1765783261.8006] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan23" pid=22713 uid=0 result="success" Dec 15 02:21:01 localhost NetworkManager[5963]: [1765783261.8251] device (vlan23): carrier: link connected Dec 15 02:21:04 localhost NetworkManager[5963]: [1765783264.8814] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan23" pid=22743 uid=0 result="success" Dec 15 02:21:04 localhost NetworkManager[5963]: [1765783264.9268] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan23" pid=22758 uid=0 result="success" Dec 15 02:21:04 localhost NetworkManager[5963]: [1765783264.9774] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan22" pid=22779 uid=0 result="success" Dec 15 02:21:04 localhost ifup[22780]: You are using 'ifup' script provided by 'network-scripts', which are now deprecated. Dec 15 02:21:04 localhost ifup[22781]: 'network-scripts' will be removed from distribution in near future. Dec 15 02:21:04 localhost ifup[22782]: It is advised to switch to 'NetworkManager' instead - it provides 'ifup/ifdown' scripts as well. Dec 15 02:21:05 localhost NetworkManager[5963]: [1765783265.0018] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan22" pid=22788 uid=0 result="success" Dec 15 02:21:05 localhost ovs-vsctl[22791]: ovs|00001|vsctl|INFO|Called as ovs-vsctl -t 10 -- --if-exists del-port br-ex vlan22 -- add-port br-ex vlan22 tag=22 -- set Interface vlan22 type=internal Dec 15 02:21:05 localhost kernel: device vlan22 entered promiscuous mode Dec 15 02:21:05 localhost NetworkManager[5963]: [1765783265.0402] manager: (vlan22): new Generic device (/org/freedesktop/NetworkManager/Devices/8) Dec 15 02:21:05 localhost systemd-udevd[22793]: Network interface NamePolicy= disabled on kernel command line. 
Dec 15 02:21:05 localhost NetworkManager[5963]: [1765783265.0640] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan22" pid=22803 uid=0 result="success" Dec 15 02:21:05 localhost NetworkManager[5963]: [1765783265.0856] device (vlan22): carrier: link connected Dec 15 02:21:05 localhost systemd[1]: NetworkManager-dispatcher.service: Deactivated successfully. Dec 15 02:21:08 localhost NetworkManager[5963]: [1765783268.1417] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan22" pid=22833 uid=0 result="success" Dec 15 02:21:08 localhost NetworkManager[5963]: [1765783268.1903] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan22" pid=22848 uid=0 result="success" Dec 15 02:21:08 localhost NetworkManager[5963]: [1765783268.2516] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan20" pid=22869 uid=0 result="success" Dec 15 02:21:08 localhost ifup[22870]: You are using 'ifup' script provided by 'network-scripts', which are now deprecated. Dec 15 02:21:08 localhost ifup[22871]: 'network-scripts' will be removed from distribution in near future. Dec 15 02:21:08 localhost ifup[22872]: It is advised to switch to 'NetworkManager' instead - it provides 'ifup/ifdown' scripts as well. Dec 15 02:21:08 localhost NetworkManager[5963]: [1765783268.2836] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan20" pid=22878 uid=0 result="success" Dec 15 02:21:08 localhost ovs-vsctl[22881]: ovs|00001|vsctl|INFO|Called as ovs-vsctl -t 10 -- --if-exists del-port br-ex vlan20 -- add-port br-ex vlan20 tag=20 -- set Interface vlan20 type=internal Dec 15 02:21:08 localhost kernel: device vlan20 entered promiscuous mode Dec 15 02:21:08 localhost NetworkManager[5963]: [1765783268.3651] manager: (vlan20): new Generic device (/org/freedesktop/NetworkManager/Devices/9) Dec 15 02:21:08 localhost systemd-udevd[22883]: Network interface NamePolicy= disabled on kernel command line. 
Dec 15 02:21:08 localhost NetworkManager[5963]: [1765783268.3908] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan20" pid=22893 uid=0 result="success" Dec 15 02:21:08 localhost NetworkManager[5963]: [1765783268.4126] device (vlan20): carrier: link connected Dec 15 02:21:11 localhost NetworkManager[5963]: [1765783271.4664] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan20" pid=22923 uid=0 result="success" Dec 15 02:21:11 localhost NetworkManager[5963]: [1765783271.5123] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan20" pid=22938 uid=0 result="success" Dec 15 02:21:11 localhost NetworkManager[5963]: [1765783271.5748] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan44" pid=22959 uid=0 result="success" Dec 15 02:21:11 localhost ifup[22960]: You are using 'ifup' script provided by 'network-scripts', which are now deprecated. Dec 15 02:21:11 localhost ifup[22961]: 'network-scripts' will be removed from distribution in near future. Dec 15 02:21:11 localhost ifup[22962]: It is advised to switch to 'NetworkManager' instead - it provides 'ifup/ifdown' scripts as well. Dec 15 02:21:11 localhost NetworkManager[5963]: [1765783271.6078] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan44" pid=22968 uid=0 result="success" Dec 15 02:21:11 localhost ovs-vsctl[22971]: ovs|00001|vsctl|INFO|Called as ovs-vsctl -t 10 -- --if-exists del-port br-ex vlan44 -- add-port br-ex vlan44 tag=44 -- set Interface vlan44 type=internal Dec 15 02:21:11 localhost kernel: device vlan44 entered promiscuous mode Dec 15 02:21:11 localhost systemd-udevd[22973]: Network interface NamePolicy= disabled on kernel command line. 
Dec 15 02:21:11 localhost NetworkManager[5963]: [1765783271.6539] manager: (vlan44): new Generic device (/org/freedesktop/NetworkManager/Devices/10) Dec 15 02:21:11 localhost NetworkManager[5963]: [1765783271.6783] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan44" pid=22983 uid=0 result="success" Dec 15 02:21:11 localhost NetworkManager[5963]: [1765783271.6985] device (vlan44): carrier: link connected Dec 15 02:21:14 localhost NetworkManager[5963]: [1765783274.8048] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan44" pid=23013 uid=0 result="success" Dec 15 02:21:14 localhost NetworkManager[5963]: [1765783274.8549] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan44" pid=23028 uid=0 result="success" Dec 15 02:21:14 localhost NetworkManager[5963]: [1765783274.9179] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan44" pid=23049 uid=0 result="success" Dec 15 02:21:14 localhost ifup[23050]: You are using 'ifup' script provided by 'network-scripts', which are now deprecated. Dec 15 02:21:14 localhost ifup[23051]: 'network-scripts' will be removed from distribution in near future. Dec 15 02:21:14 localhost ifup[23052]: It is advised to switch to 'NetworkManager' instead - it provides 'ifup/ifdown' scripts as well. 
Dec 15 02:21:14 localhost NetworkManager[5963]: [1765783274.9510] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan44" pid=23058 uid=0 result="success" Dec 15 02:21:14 localhost ovs-vsctl[23061]: ovs|00001|vsctl|INFO|Called as ovs-vsctl -t 10 -- --if-exists del-port br-ex vlan44 -- add-port br-ex vlan44 tag=44 -- set Interface vlan44 type=internal Dec 15 02:21:15 localhost NetworkManager[5963]: [1765783275.0087] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan44" pid=23068 uid=0 result="success" Dec 15 02:21:16 localhost NetworkManager[5963]: [1765783276.0712] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan44" pid=23095 uid=0 result="success" Dec 15 02:21:16 localhost NetworkManager[5963]: [1765783276.1222] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan44" pid=23110 uid=0 result="success" Dec 15 02:21:16 localhost NetworkManager[5963]: [1765783276.1861] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan20" pid=23131 uid=0 result="success" Dec 15 02:21:16 localhost ifup[23132]: You are using 'ifup' script provided by 'network-scripts', which are now deprecated. Dec 15 02:21:16 localhost ifup[23133]: 'network-scripts' will be removed from distribution in near future. Dec 15 02:21:16 localhost ifup[23134]: It is advised to switch to 'NetworkManager' instead - it provides 'ifup/ifdown' scripts as well. 
Dec 15 02:21:16 localhost NetworkManager[5963]: [1765783276.2197] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan20" pid=23140 uid=0 result="success" Dec 15 02:21:16 localhost ovs-vsctl[23143]: ovs|00001|vsctl|INFO|Called as ovs-vsctl -t 10 -- --if-exists del-port br-ex vlan20 -- add-port br-ex vlan20 tag=20 -- set Interface vlan20 type=internal Dec 15 02:21:16 localhost NetworkManager[5963]: [1765783276.2853] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan20" pid=23150 uid=0 result="success" Dec 15 02:21:17 localhost NetworkManager[5963]: [1765783277.3518] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan20" pid=23178 uid=0 result="success" Dec 15 02:21:17 localhost NetworkManager[5963]: [1765783277.3984] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan20" pid=23193 uid=0 result="success" Dec 15 02:21:17 localhost NetworkManager[5963]: [1765783277.4546] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan21" pid=23214 uid=0 result="success" Dec 15 02:21:17 localhost ifup[23215]: You are using 'ifup' script provided by 'network-scripts', which are now deprecated. Dec 15 02:21:17 localhost ifup[23216]: 'network-scripts' will be removed from distribution in near future. Dec 15 02:21:17 localhost ifup[23217]: It is advised to switch to 'NetworkManager' instead - it provides 'ifup/ifdown' scripts as well. 
Dec 15 02:21:17 localhost NetworkManager[5963]: [1765783277.4861] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan21" pid=23223 uid=0 result="success" Dec 15 02:21:17 localhost ovs-vsctl[23226]: ovs|00001|vsctl|INFO|Called as ovs-vsctl -t 10 -- --if-exists del-port br-ex vlan21 -- add-port br-ex vlan21 tag=21 -- set Interface vlan21 type=internal Dec 15 02:21:17 localhost NetworkManager[5963]: [1765783277.5441] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan21" pid=23233 uid=0 result="success" Dec 15 02:21:18 localhost NetworkManager[5963]: [1765783278.6081] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan21" pid=23261 uid=0 result="success" Dec 15 02:21:18 localhost NetworkManager[5963]: [1765783278.6508] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan21" pid=23276 uid=0 result="success" Dec 15 02:21:18 localhost NetworkManager[5963]: [1765783278.7000] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan23" pid=23297 uid=0 result="success" Dec 15 02:21:18 localhost ifup[23298]: You are using 'ifup' script provided by 'network-scripts', which are now deprecated. Dec 15 02:21:18 localhost ifup[23299]: 'network-scripts' will be removed from distribution in near future. Dec 15 02:21:18 localhost ifup[23300]: It is advised to switch to 'NetworkManager' instead - it provides 'ifup/ifdown' scripts as well. 
Dec 15 02:21:18 localhost NetworkManager[5963]: [1765783278.7188] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan23" pid=23306 uid=0 result="success" Dec 15 02:21:18 localhost ovs-vsctl[23309]: ovs|00001|vsctl|INFO|Called as ovs-vsctl -t 10 -- --if-exists del-port br-ex vlan23 -- add-port br-ex vlan23 tag=23 -- set Interface vlan23 type=internal Dec 15 02:21:18 localhost NetworkManager[5963]: [1765783278.7714] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan23" pid=23316 uid=0 result="success" Dec 15 02:21:19 localhost NetworkManager[5963]: [1765783279.8350] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan23" pid=23344 uid=0 result="success" Dec 15 02:21:19 localhost NetworkManager[5963]: [1765783279.8883] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan23" pid=23359 uid=0 result="success" Dec 15 02:21:19 localhost NetworkManager[5963]: [1765783279.9603] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan22" pid=23380 uid=0 result="success" Dec 15 02:21:19 localhost ifup[23381]: You are using 'ifup' script provided by 'network-scripts', which are now deprecated. Dec 15 02:21:19 localhost ifup[23382]: 'network-scripts' will be removed from distribution in near future. Dec 15 02:21:19 localhost ifup[23383]: It is advised to switch to 'NetworkManager' instead - it provides 'ifup/ifdown' scripts as well. 
Dec 15 02:21:19 localhost NetworkManager[5963]: [1765783279.9991] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan22" pid=23389 uid=0 result="success"
Dec 15 02:21:20 localhost ovs-vsctl[23392]: ovs|00001|vsctl|INFO|Called as ovs-vsctl -t 10 -- --if-exists del-port br-ex vlan22 -- add-port br-ex vlan22 tag=22 -- set Interface vlan22 type=internal
Dec 15 02:21:20 localhost NetworkManager[5963]: [1765783280.0583] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan22" pid=23399 uid=0 result="success"
Dec 15 02:21:21 localhost NetworkManager[5963]: [1765783281.1198] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan22" pid=23427 uid=0 result="success"
Dec 15 02:21:21 localhost NetworkManager[5963]: [1765783281.1669] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan22" pid=23442 uid=0 result="success"
Dec 15 02:22:13 localhost python3[23474]: ansible-ansible.legacy.command Invoked with _raw_params=ip a#012ping -c 2 -W 2 192.168.122.10#012ping -c 2 -W 2 192.168.122.11#012 _uses_shell=True zuul_log_id=fa163efc-24cc-4281-54ac-00000000001b-1-overcloudnovacompute0 zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 15 02:22:19 localhost python3[23493]: ansible-ansible.posix.authorized_key Invoked with user=zuul key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQC1ko8xh0bPiR6+NCG4km3etin5rm3hZZmVHXuDDaTrTWq3PUz5sEmSbiDQOJj4mOpsouNXjYHkjXSuRLTQ5dqF1BWU5bgiOkTIqccwZdxkqfM2VFXFj/Ej621HUBRYHf7PK5zkl+8G1g2RkkiSd886DSw6I1J+2uT/e+4/0G1vsACTaNArP3/JSOh0hdwu+fnjybrp4sauiJsWaQvwbWao/txJqznQhymNwHZVFRMhFy+x+oDr4ry7w+X2JuGz2ydbUojBUG0REWTKmU4EZsyDsx77GzIJwfHsUUuJ0t4DcDalVqz20D+LXTug8AfgSovuNhZTz8AqRXfCOZK9haLb+3tJwKExMBdnj0cacNw6O23jZTIMbJK+qoxGN4mzqr8RNFVPPLVhL5tcfb2MvN7TWs7+oT0K5xmxDkEnr7iSpmPV0d8TD/wNwfJmtu2W6TWL5zgA6U3GyfnTH2+nhrFk5Ou+yaLES2+Wx8kb0NLWgDnlCcYy9dw1eP3GSc3GIzM= zuul-build-sshkey manage_dir=True state=present exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Dec 15 02:22:19 localhost python3[23509]: ansible-ansible.posix.authorized_key Invoked with user=root key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQC1ko8xh0bPiR6+NCG4km3etin5rm3hZZmVHXuDDaTrTWq3PUz5sEmSbiDQOJj4mOpsouNXjYHkjXSuRLTQ5dqF1BWU5bgiOkTIqccwZdxkqfM2VFXFj/Ej621HUBRYHf7PK5zkl+8G1g2RkkiSd886DSw6I1J+2uT/e+4/0G1vsACTaNArP3/JSOh0hdwu+fnjybrp4sauiJsWaQvwbWao/txJqznQhymNwHZVFRMhFy+x+oDr4ry7w+X2JuGz2ydbUojBUG0REWTKmU4EZsyDsx77GzIJwfHsUUuJ0t4DcDalVqz20D+LXTug8AfgSovuNhZTz8AqRXfCOZK9haLb+3tJwKExMBdnj0cacNw6O23jZTIMbJK+qoxGN4mzqr8RNFVPPLVhL5tcfb2MvN7TWs7+oT0K5xmxDkEnr7iSpmPV0d8TD/wNwfJmtu2W6TWL5zgA6U3GyfnTH2+nhrFk5Ou+yaLES2+Wx8kb0NLWgDnlCcYy9dw1eP3GSc3GIzM= zuul-build-sshkey manage_dir=True state=present exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Dec 15 02:22:21 localhost python3[23523]: ansible-ansible.posix.authorized_key Invoked with user=zuul key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQC1ko8xh0bPiR6+NCG4km3etin5rm3hZZmVHXuDDaTrTWq3PUz5sEmSbiDQOJj4mOpsouNXjYHkjXSuRLTQ5dqF1BWU5bgiOkTIqccwZdxkqfM2VFXFj/Ej621HUBRYHf7PK5zkl+8G1g2RkkiSd886DSw6I1J+2uT/e+4/0G1vsACTaNArP3/JSOh0hdwu+fnjybrp4sauiJsWaQvwbWao/txJqznQhymNwHZVFRMhFy+x+oDr4ry7w+X2JuGz2ydbUojBUG0REWTKmU4EZsyDsx77GzIJwfHsUUuJ0t4DcDalVqz20D+LXTug8AfgSovuNhZTz8AqRXfCOZK9haLb+3tJwKExMBdnj0cacNw6O23jZTIMbJK+qoxGN4mzqr8RNFVPPLVhL5tcfb2MvN7TWs7+oT0K5xmxDkEnr7iSpmPV0d8TD/wNwfJmtu2W6TWL5zgA6U3GyfnTH2+nhrFk5Ou+yaLES2+Wx8kb0NLWgDnlCcYy9dw1eP3GSc3GIzM= zuul-build-sshkey manage_dir=True state=present exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Dec 15 02:22:21 localhost python3[23539]: ansible-ansible.posix.authorized_key Invoked with user=root key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQC1ko8xh0bPiR6+NCG4km3etin5rm3hZZmVHXuDDaTrTWq3PUz5sEmSbiDQOJj4mOpsouNXjYHkjXSuRLTQ5dqF1BWU5bgiOkTIqccwZdxkqfM2VFXFj/Ej621HUBRYHf7PK5zkl+8G1g2RkkiSd886DSw6I1J+2uT/e+4/0G1vsACTaNArP3/JSOh0hdwu+fnjybrp4sauiJsWaQvwbWao/txJqznQhymNwHZVFRMhFy+x+oDr4ry7w+X2JuGz2ydbUojBUG0REWTKmU4EZsyDsx77GzIJwfHsUUuJ0t4DcDalVqz20D+LXTug8AfgSovuNhZTz8AqRXfCOZK9haLb+3tJwKExMBdnj0cacNw6O23jZTIMbJK+qoxGN4mzqr8RNFVPPLVhL5tcfb2MvN7TWs7+oT0K5xmxDkEnr7iSpmPV0d8TD/wNwfJmtu2W6TWL5zgA6U3GyfnTH2+nhrFk5Ou+yaLES2+Wx8kb0NLWgDnlCcYy9dw1eP3GSc3GIzM= zuul-build-sshkey manage_dir=True state=present exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Dec 15 02:22:22 localhost python3[23553]: ansible-ansible.builtin.slurp Invoked with path=/etc/hostname src=/etc/hostname
Dec 15 02:22:23 localhost python3[23568]: ansible-ansible.legacy.command Invoked with _raw_params=hostname="np0005559462.novalocal"#012hostname_str_array=(${hostname//./ })#012echo ${hostname_str_array[0]} > /home/zuul/ansible_hostname#012 _uses_shell=True zuul_log_id=fa163efc-24cc-4281-54ac-000000000022-1-overcloudnovacompute0 zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 15 02:22:24 localhost python3[23588]: ansible-ansible.legacy.command Invoked with _raw_params=hostname=$(cat /home/zuul/ansible_hostname)#012hostnamectl hostname "$hostname.localdomain"#012 _uses_shell=True zuul_log_id=fa163efc-24cc-4281-54ac-000000000023-1-overcloudnovacompute0 zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 15 02:22:24 localhost systemd[1]: Starting Hostname Service...
Dec 15 02:22:24 localhost systemd[1]: Started Hostname Service.
Dec 15 02:22:24 localhost systemd-hostnamed[23592]: Hostname set to (static)
Dec 15 02:22:24 localhost NetworkManager[5963]: [1765783344.4035] hostname: static hostname changed from "np0005559462.novalocal" to "np0005559462.localdomain"
Dec 15 02:22:24 localhost systemd[1]: Starting Network Manager Script Dispatcher Service...
Dec 15 02:22:24 localhost systemd[1]: Started Network Manager Script Dispatcher Service.
Dec 15 02:22:25 localhost systemd-logind[763]: Session 10 logged out. Waiting for processes to exit.
Dec 15 02:22:25 localhost systemd[1]: session-10.scope: Deactivated successfully.
Dec 15 02:22:25 localhost systemd[1]: session-10.scope: Consumed 1min 43.178s CPU time.
Dec 15 02:22:25 localhost systemd-logind[763]: Removed session 10.
Dec 15 02:22:28 localhost sshd[23603]: main: sshd: ssh-rsa algorithm is disabled
Dec 15 02:22:28 localhost systemd-logind[763]: New session 11 of user zuul.
Dec 15 02:22:28 localhost systemd[1]: Started Session 11 of User zuul.
Dec 15 02:22:28 localhost python3[23620]: ansible-ansible.builtin.slurp Invoked with path=/home/zuul/ansible_hostname src=/home/zuul/ansible_hostname
Dec 15 02:22:31 localhost systemd[1]: session-11.scope: Deactivated successfully.
Dec 15 02:22:31 localhost systemd-logind[763]: Session 11 logged out. Waiting for processes to exit.
Dec 15 02:22:31 localhost systemd-logind[763]: Removed session 11.
Dec 15 02:22:34 localhost systemd[1]: NetworkManager-dispatcher.service: Deactivated successfully.
Dec 15 02:22:54 localhost systemd[1]: systemd-hostnamed.service: Deactivated successfully.
Dec 15 02:23:11 localhost sshd[23624]: main: sshd: ssh-rsa algorithm is disabled
Dec 15 02:23:11 localhost systemd-logind[763]: New session 12 of user zuul.
Dec 15 02:23:11 localhost systemd[1]: Started Session 12 of User zuul.
Dec 15 02:23:12 localhost python3[23643]: ansible-ansible.legacy.dnf Invoked with name=['lvm2', 'jq'] state=present allow_downgrade=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 allowerasing=False nobest=False use_backend=auto conf_file=None disable_excludes=None download_dir=None list=None releasever=None
Dec 15 02:23:15 localhost systemd[1]: Reloading.
Dec 15 02:23:15 localhost systemd-rc-local-generator[23682]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 15 02:23:15 localhost systemd-sysv-generator[23685]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 15 02:23:15 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Dec 15 02:23:16 localhost systemd[1]: Listening on Device-mapper event daemon FIFOs.
Dec 15 02:23:16 localhost systemd[1]: Reloading.
Dec 15 02:23:16 localhost systemd-rc-local-generator[23725]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 15 02:23:16 localhost systemd-sysv-generator[23731]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 15 02:23:16 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Dec 15 02:23:16 localhost systemd[1]: Starting Monitoring of LVM2 mirrors, snapshots etc. using dmeventd or progress polling...
Dec 15 02:23:16 localhost systemd[1]: Finished Monitoring of LVM2 mirrors, snapshots etc. using dmeventd or progress polling.
Dec 15 02:23:16 localhost systemd[1]: Reloading.
Dec 15 02:23:16 localhost systemd-sysv-generator[23771]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 15 02:23:16 localhost systemd-rc-local-generator[23765]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 15 02:23:16 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Dec 15 02:23:16 localhost systemd[1]: Listening on LVM2 poll daemon socket.
Dec 15 02:23:17 localhost systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Dec 15 02:23:17 localhost systemd[1]: Starting man-db-cache-update.service...
Dec 15 02:23:17 localhost systemd[1]: Reloading.
Dec 15 02:23:17 localhost systemd-rc-local-generator[23825]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 15 02:23:17 localhost systemd-sysv-generator[23828]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 15 02:23:17 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Dec 15 02:23:17 localhost systemd[1]: Queuing reload/restart jobs for marked units…
Dec 15 02:23:17 localhost systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Dec 15 02:23:17 localhost systemd[1]: man-db-cache-update.service: Deactivated successfully.
Dec 15 02:23:17 localhost systemd[1]: Finished man-db-cache-update.service.
Dec 15 02:23:17 localhost systemd[1]: run-rad41e78d5af248fb81c3d21aa5a331ef.service: Deactivated successfully.
Dec 15 02:23:17 localhost systemd[1]: run-r6550bc9bc93c4d22bf35fe19dfd8b631.service: Deactivated successfully.
Dec 15 02:24:18 localhost systemd[1]: session-12.scope: Deactivated successfully.
Dec 15 02:24:18 localhost systemd[1]: session-12.scope: Consumed 4.713s CPU time.
Dec 15 02:24:18 localhost systemd-logind[763]: Session 12 logged out. Waiting for processes to exit.
Dec 15 02:24:18 localhost systemd-logind[763]: Removed session 12.
Dec 15 02:40:17 localhost sshd[24421]: main: sshd: ssh-rsa algorithm is disabled
Dec 15 02:40:18 localhost systemd-logind[763]: New session 13 of user zuul.
Dec 15 02:40:18 localhost systemd[1]: Started Session 13 of User zuul.
Dec 15 02:40:18 localhost python3[24469]: ansible-ansible.legacy.setup Invoked with gather_subset=['all'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec 15 02:40:20 localhost python3[24556]: ansible-ansible.builtin.dnf Invoked with name=['util-linux', 'lvm2', 'jq', 'podman'] state=present allow_downgrade=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 allowerasing=False nobest=False conf_file=None disable_excludes=None download_dir=None list=None releasever=None
Dec 15 02:40:23 localhost python3[24573]: ansible-ansible.builtin.stat Invoked with path=/dev/loop3 follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Dec 15 02:40:24 localhost python3[24590]: ansible-ansible.legacy.command Invoked with _raw_params=dd if=/dev/zero of=/var/lib/ceph-osd-0.img bs=1 count=0 seek=7G#012losetup /dev/loop3 /var/lib/ceph-osd-0.img#012lsblk _uses_shell=True stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 15 02:40:24 localhost kernel: loop: module loaded
Dec 15 02:40:24 localhost kernel: loop3: detected capacity change from 0 to 14680064
Dec 15 02:40:24 localhost python3[24616]: ansible-ansible.legacy.command Invoked with _raw_params=pvcreate /dev/loop3#012vgcreate ceph_vg0 /dev/loop3#012lvcreate -n ceph_lv0 -l +100%FREE ceph_vg0#012lvs _uses_shell=True stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 15 02:40:24 localhost lvm[24619]: PV /dev/loop3 not used.
Dec 15 02:40:24 localhost lvm[24621]: PV /dev/loop3 online, VG ceph_vg0 is complete.
Dec 15 02:40:24 localhost systemd[1]: Started /usr/sbin/lvm vgchange -aay --autoactivation event ceph_vg0.
Dec 15 02:40:24 localhost lvm[24627]: 1 logical volume(s) in volume group "ceph_vg0" now active
Dec 15 02:40:24 localhost lvm[24631]: PV /dev/loop3 online, VG ceph_vg0 is complete.
Dec 15 02:40:24 localhost lvm[24631]: VG ceph_vg0 finished
Dec 15 02:40:24 localhost systemd[1]: lvm-activate-ceph_vg0.service: Deactivated successfully.
Dec 15 02:40:25 localhost python3[24680]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/ceph-osd-losetup-0.service follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Dec 15 02:40:26 localhost python3[24723]: ansible-ansible.legacy.copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1765784425.2734456-53803-47580916701773/source dest=/etc/systemd/system/ceph-osd-losetup-0.service mode=0644 force=True follow=False _original_basename=ceph-osd-losetup.service.j2 checksum=427b1db064a970126b729b07acf99fa7d0eecb9c backup=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 15 02:40:26 localhost python3[24753]: ansible-ansible.builtin.systemd Invoked with state=started enabled=True name=ceph-osd-losetup-0.service daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 15 02:40:26 localhost systemd[1]: Reloading.
Dec 15 02:40:27 localhost systemd-sysv-generator[24785]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 15 02:40:27 localhost systemd-rc-local-generator[24779]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 15 02:40:27 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Dec 15 02:40:27 localhost systemd[1]: Starting Ceph OSD losetup...
Dec 15 02:40:27 localhost bash[24794]: /dev/loop3: [64516]:8399529 (/var/lib/ceph-osd-0.img)
Dec 15 02:40:27 localhost systemd[1]: Finished Ceph OSD losetup.
Dec 15 02:40:27 localhost lvm[24795]: PV /dev/loop3 online, VG ceph_vg0 is complete.
Dec 15 02:40:27 localhost lvm[24795]: VG ceph_vg0 finished
Dec 15 02:40:27 localhost python3[24811]: ansible-ansible.builtin.dnf Invoked with name=['util-linux', 'lvm2', 'jq', 'podman'] state=present allow_downgrade=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 allowerasing=False nobest=False conf_file=None disable_excludes=None download_dir=None list=None releasever=None
Dec 15 02:40:30 localhost python3[24828]: ansible-ansible.builtin.stat Invoked with path=/dev/loop4 follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Dec 15 02:40:31 localhost python3[24844]: ansible-ansible.legacy.command Invoked with _raw_params=dd if=/dev/zero of=/var/lib/ceph-osd-1.img bs=1 count=0 seek=7G#012losetup /dev/loop4 /var/lib/ceph-osd-1.img#012lsblk _uses_shell=True stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 15 02:40:31 localhost kernel: loop4: detected capacity change from 0 to 14680064
Dec 15 02:40:31 localhost python3[24866]: ansible-ansible.legacy.command Invoked with _raw_params=pvcreate /dev/loop4#012vgcreate ceph_vg1 /dev/loop4#012lvcreate -n ceph_lv1 -l +100%FREE ceph_vg1#012lvs _uses_shell=True stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 15 02:40:31 localhost lvm[24869]: PV /dev/loop4 not used.
Dec 15 02:40:32 localhost lvm[24879]: PV /dev/loop4 online, VG ceph_vg1 is complete.
Dec 15 02:40:32 localhost systemd[1]: Started /usr/sbin/lvm vgchange -aay --autoactivation event ceph_vg1.
Dec 15 02:40:32 localhost lvm[24881]: 1 logical volume(s) in volume group "ceph_vg1" now active
Dec 15 02:40:32 localhost systemd[1]: lvm-activate-ceph_vg1.service: Deactivated successfully.
Dec 15 02:40:32 localhost python3[24929]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/ceph-osd-losetup-1.service follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Dec 15 02:40:33 localhost python3[24972]: ansible-ansible.legacy.copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1765784432.3509822-53909-108681929498519/source dest=/etc/systemd/system/ceph-osd-losetup-1.service mode=0644 force=True follow=False _original_basename=ceph-osd-losetup.service.j2 checksum=19612168ea279db4171b94ee1f8625de1ec44b58 backup=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 15 02:40:33 localhost python3[25002]: ansible-ansible.builtin.systemd Invoked with state=started enabled=True name=ceph-osd-losetup-1.service daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 15 02:40:33 localhost systemd[1]: Reloading.
Dec 15 02:40:33 localhost systemd-rc-local-generator[25031]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 15 02:40:33 localhost systemd-sysv-generator[25035]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 15 02:40:33 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Dec 15 02:40:33 localhost systemd[1]: Starting Ceph OSD losetup...
Dec 15 02:40:33 localhost bash[25043]: /dev/loop4: [64516]:8400144 (/var/lib/ceph-osd-1.img)
Dec 15 02:40:33 localhost systemd[1]: Finished Ceph OSD losetup.
Dec 15 02:40:34 localhost lvm[25044]: PV /dev/loop4 online, VG ceph_vg1 is complete.
Dec 15 02:40:34 localhost lvm[25044]: VG ceph_vg1 finished
Dec 15 02:40:44 localhost python3[25089]: ansible-ansible.legacy.setup Invoked with gather_subset=['!all', 'min'] gather_timeout=45 filter=[] fact_path=/etc/ansible/facts.d
Dec 15 02:40:45 localhost python3[25109]: ansible-hostname Invoked with name=np0005559462.localdomain use=None
Dec 15 02:40:45 localhost systemd[1]: Starting Hostname Service...
Dec 15 02:40:45 localhost systemd[1]: Started Hostname Service.
Dec 15 02:40:47 localhost python3[25132]: ansible-tempfile Invoked with state=file suffix=tmphosts prefix=ansible. path=None
Dec 15 02:40:48 localhost python3[25180]: ansible-ansible.legacy.copy Invoked with remote_src=True src=/etc/hosts dest=/tmp/ansible.v3p1if2ftmphosts mode=preserve backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 15 02:40:48 localhost python3[25210]: ansible-blockinfile Invoked with state=absent path=/tmp/ansible.v3p1if2ftmphosts block= marker=# {mark} marker_begin=HEAT_HOSTS_START - Do not edit manually within this section! marker_end=HEAT_HOSTS_END create=False backup=False unsafe_writes=False insertafter=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 15 02:40:49 localhost python3[25226]: ansible-blockinfile Invoked with create=True path=/tmp/ansible.v3p1if2ftmphosts insertbefore=BOF block=192.168.122.106 np0005559462.localdomain np0005559462#012192.168.122.106 np0005559462.ctlplane.localdomain np0005559462.ctlplane#012192.168.122.107 np0005559463.localdomain np0005559463#012192.168.122.107 np0005559463.ctlplane.localdomain np0005559463.ctlplane#012192.168.122.108 np0005559464.localdomain np0005559464#012192.168.122.108 np0005559464.ctlplane.localdomain np0005559464.ctlplane#012192.168.122.103 np0005559459.localdomain np0005559459#012192.168.122.103 np0005559459.ctlplane.localdomain np0005559459.ctlplane#012192.168.122.104 np0005559460.localdomain np0005559460#012192.168.122.104 np0005559460.ctlplane.localdomain np0005559460.ctlplane#012192.168.122.105 np0005559461.localdomain np0005559461#012192.168.122.105 np0005559461.ctlplane.localdomain np0005559461.ctlplane#012#012192.168.122.100 undercloud.ctlplane.localdomain undercloud.ctlplane#012 marker=# {mark} marker_begin=START_HOST_ENTRIES_FOR_STACK: overcloud marker_end=END_HOST_ENTRIES_FOR_STACK: overcloud state=present backup=False unsafe_writes=False insertafter=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 15 02:40:49 localhost python3[25242]: ansible-ansible.legacy.command Invoked with _raw_params=cp "/tmp/ansible.v3p1if2ftmphosts" "/etc/hosts" _uses_shell=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 15 02:40:50 localhost python3[25259]: ansible-file Invoked with path=/tmp/ansible.v3p1if2ftmphosts state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 15 02:40:52 localhost python3[25275]: ansible-ansible.legacy.command Invoked with _raw_params=systemctl is-active ntpd.service || systemctl is-enabled ntpd.service _uses_shell=True stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 15 02:40:53 localhost python3[25293]: ansible-ansible.legacy.dnf Invoked with name=['chrony'] state=present allow_downgrade=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 allowerasing=False nobest=False conf_file=None disable_excludes=None download_dir=None list=None releasever=None
Dec 15 02:40:57 localhost python3[25342]: ansible-ansible.legacy.stat Invoked with path=/etc/chrony.conf follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Dec 15 02:40:58 localhost python3[25387]: ansible-ansible.legacy.copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1765784457.2989535-54824-162525052076866/source dest=/etc/chrony.conf owner=root group=root mode=420 follow=False _original_basename=chrony.conf.j2 checksum=4fd4fbbb2de00c70a54478b7feb8ef8adf6a3362 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 15 02:40:59 localhost python3[25417]: ansible-ansible.legacy.systemd Invoked with enabled=True name=chronyd state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 15 02:41:01 localhost python3[25435]: ansible-ansible.legacy.systemd Invoked with name=chronyd state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Dec 15 02:41:01 localhost chronyd[766]: chronyd exiting
Dec 15 02:41:01 localhost systemd[1]: Stopping NTP client/server...
Dec 15 02:41:01 localhost systemd[1]: chronyd.service: Deactivated successfully.
Dec 15 02:41:01 localhost systemd[1]: Stopped NTP client/server.
Dec 15 02:41:01 localhost systemd[1]: chronyd.service: Consumed 109ms CPU time, read 1.9M from disk, written 0B to disk.
Dec 15 02:41:01 localhost systemd[1]: Starting NTP client/server...
Dec 15 02:41:01 localhost chronyd[25442]: chronyd version 4.3 starting (+CMDMON +NTP +REFCLOCK +RTC +PRIVDROP +SCFILTER +SIGND +ASYNCDNS +NTS +SECHASH +IPV6 +DEBUG)
Dec 15 02:41:01 localhost chronyd[25442]: Frequency -26.059 +/- 0.112 ppm read from /var/lib/chrony/drift
Dec 15 02:41:01 localhost chronyd[25442]: Loaded seccomp filter (level 2)
Dec 15 02:41:01 localhost systemd[1]: Started NTP client/server.
Dec 15 02:41:02 localhost python3[25491]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/chrony-online.service follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Dec 15 02:41:02 localhost python3[25534]: ansible-ansible.legacy.copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1765784461.7887173-54990-63755061131228/source dest=/etc/systemd/system/chrony-online.service _original_basename=chrony-online.service follow=False checksum=d4d85e046d61f558ac7ec8178c6d529d893e81e1 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 15 02:41:03 localhost python3[25564]: ansible-systemd Invoked with state=started name=chrony-online.service enabled=True daemon-reload=True daemon_reload=True daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 15 02:41:03 localhost systemd[1]: Reloading.
Dec 15 02:41:03 localhost systemd-rc-local-generator[25591]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 15 02:41:03 localhost systemd-sysv-generator[25595]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 15 02:41:03 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Dec 15 02:41:03 localhost systemd[1]: Reloading.
Dec 15 02:41:03 localhost systemd-rc-local-generator[25631]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 15 02:41:03 localhost systemd-sysv-generator[25635]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 15 02:41:03 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Dec 15 02:41:03 localhost systemd[1]: Starting chronyd online sources service...
Dec 15 02:41:03 localhost chronyc[25641]: 200 OK
Dec 15 02:41:03 localhost systemd[1]: chrony-online.service: Deactivated successfully.
Dec 15 02:41:03 localhost systemd[1]: Finished chronyd online sources service.
Dec 15 02:41:04 localhost python3[25657]: ansible-ansible.legacy.command Invoked with _raw_params=chronyc makestep _uses_shell=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 15 02:41:04 localhost chronyd[25442]: System clock was stepped by 0.000000 seconds
Dec 15 02:41:04 localhost python3[25674]: ansible-ansible.legacy.command Invoked with _raw_params=chronyc waitsync 30 _uses_shell=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 15 02:41:05 localhost chronyd[25442]: Selected source 23.133.168.245 (pool.ntp.org)
Dec 15 02:41:15 localhost python3[25691]: ansible-timezone Invoked with name=UTC hwclock=None
Dec 15 02:41:15 localhost systemd[1]: Starting Time & Date Service...
Dec 15 02:41:15 localhost systemd[1]: Started Time & Date Service.
Dec 15 02:41:15 localhost systemd[1]: systemd-hostnamed.service: Deactivated successfully.
Dec 15 02:41:16 localhost python3[25713]: ansible-ansible.legacy.systemd Invoked with name=chronyd state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Dec 15 02:41:16 localhost chronyd[25442]: chronyd exiting
Dec 15 02:41:16 localhost systemd[1]: Stopping NTP client/server...
Dec 15 02:41:16 localhost systemd[1]: chronyd.service: Deactivated successfully.
Dec 15 02:41:16 localhost systemd[1]: Stopped NTP client/server.
Dec 15 02:41:16 localhost systemd[1]: Starting NTP client/server...
Dec 15 02:41:16 localhost chronyd[25720]: chronyd version 4.3 starting (+CMDMON +NTP +REFCLOCK +RTC +PRIVDROP +SCFILTER +SIGND +ASYNCDNS +NTS +SECHASH +IPV6 +DEBUG)
Dec 15 02:41:16 localhost chronyd[25720]: Frequency -26.059 +/- 0.112 ppm read from /var/lib/chrony/drift
Dec 15 02:41:16 localhost chronyd[25720]: Loaded seccomp filter (level 2)
Dec 15 02:41:16 localhost systemd[1]: Started NTP client/server.
Dec 15 02:41:20 localhost chronyd[25720]: Selected source 162.159.200.1 (pool.ntp.org)
Dec 15 02:41:45 localhost systemd[1]: systemd-timedated.service: Deactivated successfully.
Dec 15 02:43:13 localhost sshd[25920]: main: sshd: ssh-rsa algorithm is disabled
Dec 15 02:43:14 localhost systemd[1]: Created slice User Slice of UID 1002.
Dec 15 02:43:14 localhost systemd[1]: Starting User Runtime Directory /run/user/1002...
Dec 15 02:43:14 localhost systemd-logind[763]: New session 14 of user ceph-admin.
Dec 15 02:43:14 localhost systemd[1]: Finished User Runtime Directory /run/user/1002.
Dec 15 02:43:14 localhost systemd[1]: Starting User Manager for UID 1002...
Dec 15 02:43:14 localhost sshd[25937]: main: sshd: ssh-rsa algorithm is disabled
Dec 15 02:43:14 localhost systemd[25924]: Queued start job for default target Main User Target.
Dec 15 02:43:14 localhost systemd[25924]: Created slice User Application Slice.
Dec 15 02:43:14 localhost systemd[25924]: Started Mark boot as successful after the user session has run 2 minutes.
Dec 15 02:43:14 localhost systemd[25924]: Started Daily Cleanup of User's Temporary Directories.
Dec 15 02:43:14 localhost systemd[25924]: Reached target Paths.
Dec 15 02:43:14 localhost systemd[25924]: Reached target Timers.
Dec 15 02:43:14 localhost systemd[25924]: Starting D-Bus User Message Bus Socket...
Dec 15 02:43:14 localhost systemd[25924]: Starting Create User's Volatile Files and Directories...
Dec 15 02:43:14 localhost systemd[25924]: Finished Create User's Volatile Files and Directories.
Dec 15 02:43:14 localhost systemd[25924]: Listening on D-Bus User Message Bus Socket.
Dec 15 02:43:14 localhost systemd[25924]: Reached target Sockets.
Dec 15 02:43:14 localhost systemd[25924]: Reached target Basic System.
Dec 15 02:43:14 localhost systemd[25924]: Reached target Main User Target.
Dec 15 02:43:14 localhost systemd[25924]: Startup finished in 115ms.
Dec 15 02:43:14 localhost systemd[1]: Started User Manager for UID 1002.
Dec 15 02:43:14 localhost systemd[1]: Started Session 14 of User ceph-admin.
Dec 15 02:43:14 localhost systemd-logind[763]: New session 16 of user ceph-admin.
Dec 15 02:43:14 localhost systemd[1]: Started Session 16 of User ceph-admin.
Dec 15 02:43:14 localhost sshd[25959]: main: sshd: ssh-rsa algorithm is disabled
Dec 15 02:43:14 localhost systemd-logind[763]: New session 17 of user ceph-admin.
Dec 15 02:43:14 localhost systemd[1]: Started Session 17 of User ceph-admin.
Dec 15 02:43:15 localhost sshd[25978]: main: sshd: ssh-rsa algorithm is disabled
Dec 15 02:43:15 localhost systemd-logind[763]: New session 18 of user ceph-admin.
Dec 15 02:43:15 localhost systemd[1]: Started Session 18 of User ceph-admin.
Dec 15 02:43:15 localhost sshd[25997]: main: sshd: ssh-rsa algorithm is disabled
Dec 15 02:43:15 localhost systemd-logind[763]: New session 19 of user ceph-admin.
Dec 15 02:43:15 localhost systemd[1]: Started Session 19 of User ceph-admin.
Dec 15 02:43:15 localhost sshd[26016]: main: sshd: ssh-rsa algorithm is disabled
Dec 15 02:43:15 localhost systemd-logind[763]: New session 20 of user ceph-admin.
Dec 15 02:43:15 localhost systemd[1]: Started Session 20 of User ceph-admin.
Dec 15 02:43:16 localhost sshd[26035]: main: sshd: ssh-rsa algorithm is disabled
Dec 15 02:43:16 localhost systemd-logind[763]: New session 21 of user ceph-admin.
Dec 15 02:43:16 localhost systemd[1]: Started Session 21 of User ceph-admin.
Dec 15 02:43:16 localhost sshd[26054]: main: sshd: ssh-rsa algorithm is disabled
Dec 15 02:43:16 localhost systemd-logind[763]: New session 22 of user ceph-admin.
Dec 15 02:43:16 localhost systemd[1]: Started Session 22 of User ceph-admin.
Dec 15 02:43:16 localhost sshd[26073]: main: sshd: ssh-rsa algorithm is disabled
Dec 15 02:43:17 localhost systemd-logind[763]: New session 23 of user ceph-admin.
Dec 15 02:43:17 localhost systemd[1]: Started Session 23 of User ceph-admin.
Dec 15 02:43:17 localhost sshd[26092]: main: sshd: ssh-rsa algorithm is disabled
Dec 15 02:43:17 localhost systemd-logind[763]: New session 24 of user ceph-admin.
Dec 15 02:43:17 localhost systemd[1]: Started Session 24 of User ceph-admin.
Dec 15 02:43:17 localhost sshd[26109]: main: sshd: ssh-rsa algorithm is disabled
Dec 15 02:43:18 localhost systemd-logind[763]: New session 25 of user ceph-admin.
Dec 15 02:43:18 localhost systemd[1]: Started Session 25 of User ceph-admin.
Dec 15 02:43:18 localhost sshd[26128]: main: sshd: ssh-rsa algorithm is disabled
Dec 15 02:43:18 localhost systemd-logind[763]: New session 26 of user ceph-admin.
Dec 15 02:43:18 localhost systemd[1]: Started Session 26 of User ceph-admin.
Dec 15 02:43:18 localhost systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Dec 15 02:43:43 localhost systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Dec 15 02:43:44 localhost systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Dec 15 02:43:44 localhost systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Dec 15 02:43:44 localhost systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Dec 15 02:43:44 localhost systemd[1]: proc-sys-fs-binfmt_misc.automount: Got automount request for /proc/sys/fs/binfmt_misc, triggered by 26344 (sysctl)
Dec 15 02:43:44 localhost systemd[1]: Mounting Arbitrary Executable File Formats File System...
Dec 15 02:43:44 localhost systemd[1]: Mounted Arbitrary Executable File Formats File System.
Dec 15 02:43:45 localhost systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Dec 15 02:43:46 localhost systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Dec 15 02:43:50 localhost kernel: VFS: idmapped mount is not enabled.
Dec 15 02:44:11 localhost podman[26485]:
Dec 15 02:44:11 localhost podman[26485]: 2025-12-15 07:44:11.716213196 +0000 UTC m=+25.225080223 container create 022001357d3eb02307b7bd57d6d5212b00ac5bbee9ac0daa5b34bcede4430420 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=pedantic_tesla, release=1763362218, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, io.k8s.description=Red Hat Ceph Storage 7, io.openshift.tags=rhceph ceph, vcs-type=git, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, url=https://catalog.redhat.com/en/search?searchType=containers, GIT_BRANCH=main, vendor=Red Hat, Inc., ceph=True, description=Red Hat Ceph Storage 7, build-date=2025-11-26T19:44:28Z, version=7, GIT_REPO=https://github.com/ceph/ceph-container.git, io.openshift.expose-services=, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, CEPH_POINT_RELEASE=, distribution-scope=public, architecture=x86_64, maintainer=Guillaume Abrioux , summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.buildah.version=1.41.4, GIT_CLEAN=True, name=rhceph, RELEASE=main, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, com.redhat.component=rhceph-container)
Dec 15 02:44:11 localhost podman[26485]: 2025-12-15 07:43:46.531976733 +0000 UTC m=+0.040843820 image pull registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Dec 15 02:44:11 localhost systemd[1]: Created slice Slice /machine.
Dec 15 02:44:11 localhost systemd[1]: Started libpod-conmon-022001357d3eb02307b7bd57d6d5212b00ac5bbee9ac0daa5b34bcede4430420.scope.
Dec 15 02:44:11 localhost systemd[1]: Started libcrun container.
Dec 15 02:44:11 localhost podman[26485]: 2025-12-15 07:44:11.84127981 +0000 UTC m=+25.350146857 container init 022001357d3eb02307b7bd57d6d5212b00ac5bbee9ac0daa5b34bcede4430420 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=pedantic_tesla, release=1763362218, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, name=rhceph, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, distribution-scope=public, url=https://catalog.redhat.com/en/search?searchType=containers, GIT_BRANCH=main, vendor=Red Hat, Inc., build-date=2025-11-26T19:44:28Z, architecture=x86_64, com.redhat.component=rhceph-container, GIT_REPO=https://github.com/ceph/ceph-container.git, maintainer=Guillaume Abrioux , RELEASE=main, GIT_CLEAN=True, CEPH_POINT_RELEASE=, io.k8s.description=Red Hat Ceph Storage 7, vcs-type=git, io.openshift.expose-services=, ceph=True, io.buildah.version=1.41.4, description=Red Hat Ceph Storage 7, io.openshift.tags=rhceph ceph, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, version=7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image.)
Dec 15 02:44:11 localhost podman[26485]: 2025-12-15 07:44:11.85114554 +0000 UTC m=+25.360012597 container start 022001357d3eb02307b7bd57d6d5212b00ac5bbee9ac0daa5b34bcede4430420 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=pedantic_tesla, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, ceph=True, io.buildah.version=1.41.4, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, name=rhceph, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, RELEASE=main, io.openshift.expose-services=, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, version=7, distribution-scope=public, GIT_CLEAN=True, release=1763362218, url=https://catalog.redhat.com/en/search?searchType=containers, maintainer=Guillaume Abrioux , architecture=x86_64, io.k8s.description=Red Hat Ceph Storage 7, GIT_REPO=https://github.com/ceph/ceph-container.git, vendor=Red Hat, Inc., summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., CEPH_POINT_RELEASE=, io.openshift.tags=rhceph ceph, com.redhat.component=rhceph-container, build-date=2025-11-26T19:44:28Z, description=Red Hat Ceph Storage 7, GIT_BRANCH=main)
Dec 15 02:44:11 localhost podman[26485]: 2025-12-15 07:44:11.851487059 +0000 UTC m=+25.360354146 container attach 022001357d3eb02307b7bd57d6d5212b00ac5bbee9ac0daa5b34bcede4430420 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=pedantic_tesla, build-date=2025-11-26T19:44:28Z, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, release=1763362218, distribution-scope=public, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vcs-type=git, GIT_REPO=https://github.com/ceph/ceph-container.git, io.openshift.expose-services=, url=https://catalog.redhat.com/en/search?searchType=containers, ceph=True, io.k8s.description=Red Hat Ceph Storage 7, com.redhat.component=rhceph-container, GIT_BRANCH=main, RELEASE=main, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., version=7, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, name=rhceph, vendor=Red Hat, Inc., io.buildah.version=1.41.4, GIT_CLEAN=True, io.openshift.tags=rhceph ceph, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, maintainer=Guillaume Abrioux , description=Red Hat Ceph Storage 7, CEPH_POINT_RELEASE=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64)
Dec 15 02:44:11 localhost pedantic_tesla[26605]: 167 167
Dec 15 02:44:11 localhost podman[26485]: 2025-12-15 07:44:11.85488779 +0000 UTC m=+25.363754857 container died 022001357d3eb02307b7bd57d6d5212b00ac5bbee9ac0daa5b34bcede4430420 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=pedantic_tesla, distribution-scope=public, url=https://catalog.redhat.com/en/search?searchType=containers, vendor=Red Hat, Inc., org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, build-date=2025-11-26T19:44:28Z, description=Red Hat Ceph Storage 7, release=1763362218, version=7, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, io.k8s.description=Red Hat Ceph Storage 7, io.openshift.expose-services=, ceph=True, io.buildah.version=1.41.4, CEPH_POINT_RELEASE=, architecture=x86_64, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vcs-type=git, GIT_BRANCH=main, com.redhat.component=rhceph-container, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=Guillaume Abrioux , GIT_CLEAN=True, io.openshift.tags=rhceph ceph, name=rhceph, GIT_REPO=https://github.com/ceph/ceph-container.git, RELEASE=main)
Dec 15 02:44:11 localhost systemd[1]: libpod-022001357d3eb02307b7bd57d6d5212b00ac5bbee9ac0daa5b34bcede4430420.scope: Deactivated successfully.
Dec 15 02:44:11 localhost podman[26610]: 2025-12-15 07:44:11.974164569 +0000 UTC m=+0.101485961 container remove 022001357d3eb02307b7bd57d6d5212b00ac5bbee9ac0daa5b34bcede4430420 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=pedantic_tesla, io.openshift.expose-services=, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., maintainer=Guillaume Abrioux , GIT_CLEAN=True, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, GIT_REPO=https://github.com/ceph/ceph-container.git, io.buildah.version=1.41.4, RELEASE=main, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, GIT_BRANCH=main, com.redhat.component=rhceph-container, release=1763362218, io.openshift.tags=rhceph ceph, ceph=True, architecture=x86_64, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, CEPH_POINT_RELEASE=, io.k8s.description=Red Hat Ceph Storage 7, build-date=2025-11-26T19:44:28Z, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vendor=Red Hat, Inc., description=Red Hat Ceph Storage 7, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, version=7, name=rhceph)
Dec 15 02:44:11 localhost systemd[1]: libpod-conmon-022001357d3eb02307b7bd57d6d5212b00ac5bbee9ac0daa5b34bcede4430420.scope: Deactivated successfully.
Dec 15 02:44:12 localhost podman[26631]:
Dec 15 02:44:12 localhost podman[26631]: 2025-12-15 07:44:12.189472416 +0000 UTC m=+0.053830372 image pull registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Dec 15 02:44:12 localhost podman[26631]: 2025-12-15 07:44:12.420349303 +0000 UTC m=+0.284707239 container create ad7422b204667eb6ad676a19e20c4dd7d8f22f355eaef11fa5ace19c30aa7ef7 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=zealous_jang, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-11-26T19:44:28Z, CEPH_POINT_RELEASE=, GIT_BRANCH=main, description=Red Hat Ceph Storage 7, ceph=True, maintainer=Guillaume Abrioux , vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, vendor=Red Hat, Inc., GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, com.redhat.component=rhceph-container, io.openshift.tags=rhceph ceph, name=rhceph, version=7, io.buildah.version=1.41.4, distribution-scope=public, GIT_CLEAN=True, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_REPO=https://github.com/ceph/ceph-container.git, RELEASE=main, architecture=x86_64, io.k8s.description=Red Hat Ceph Storage 7, io.openshift.expose-services=, release=1763362218)
Dec 15 02:44:12 localhost systemd[1]: Started libpod-conmon-ad7422b204667eb6ad676a19e20c4dd7d8f22f355eaef11fa5ace19c30aa7ef7.scope.
Dec 15 02:44:12 localhost systemd[1]: Started libcrun container.
Dec 15 02:44:12 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/4afdbad2e2d6c1edcf4f7b647ae8fda48b530251f3970ef49b3022e7d7a285d7/merged/rootfs supports timestamps until 2038 (0x7fffffff)
Dec 15 02:44:12 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/4afdbad2e2d6c1edcf4f7b647ae8fda48b530251f3970ef49b3022e7d7a285d7/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Dec 15 02:44:12 localhost systemd[1]: var-lib-containers-storage-overlay-244e070303c4968f924b9f59d05f0e7e0628d8b76a49fe29acb112838b858272-merged.mount: Deactivated successfully.
Dec 15 02:44:16 localhost podman[26631]: 2025-12-15 07:44:16.320994016 +0000 UTC m=+4.185351942 container init ad7422b204667eb6ad676a19e20c4dd7d8f22f355eaef11fa5ace19c30aa7ef7 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=zealous_jang, GIT_CLEAN=True, io.openshift.expose-services=, vendor=Red Hat, Inc., GIT_REPO=https://github.com/ceph/ceph-container.git, GIT_BRANCH=main, architecture=x86_64, description=Red Hat Ceph Storage 7, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, io.openshift.tags=rhceph ceph, CEPH_POINT_RELEASE=, com.redhat.component=rhceph-container, RELEASE=main, name=rhceph, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, io.k8s.description=Red Hat Ceph Storage 7, distribution-scope=public, io.buildah.version=1.41.4, version=7, release=1763362218, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, url=https://catalog.redhat.com/en/search?searchType=containers, ceph=True, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=Guillaume Abrioux , vcs-type=git, build-date=2025-11-26T19:44:28Z)
Dec 15 02:44:16 localhost podman[26631]: 2025-12-15 07:44:16.466419088 +0000 UTC m=+4.330777024 container start ad7422b204667eb6ad676a19e20c4dd7d8f22f355eaef11fa5ace19c30aa7ef7 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=zealous_jang, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, GIT_BRANCH=main, name=rhceph, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, io.buildah.version=1.41.4, GIT_CLEAN=True, vcs-type=git, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, com.redhat.component=rhceph-container, url=https://catalog.redhat.com/en/search?searchType=containers, description=Red Hat Ceph Storage 7, ceph=True, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, GIT_REPO=https://github.com/ceph/ceph-container.git, build-date=2025-11-26T19:44:28Z, CEPH_POINT_RELEASE=, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., RELEASE=main, vendor=Red Hat, Inc., version=7, io.k8s.description=Red Hat Ceph Storage 7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, maintainer=Guillaume Abrioux , distribution-scope=public, release=1763362218, architecture=x86_64, io.openshift.tags=rhceph ceph)
Dec 15 02:44:16 localhost podman[26631]: 2025-12-15 07:44:16.466931081 +0000 UTC m=+4.331289037 container attach ad7422b204667eb6ad676a19e20c4dd7d8f22f355eaef11fa5ace19c30aa7ef7 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=zealous_jang, io.buildah.version=1.41.4, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, description=Red Hat Ceph Storage 7, io.openshift.tags=rhceph ceph, GIT_CLEAN=True, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., RELEASE=main, name=rhceph, CEPH_POINT_RELEASE=, com.redhat.component=rhceph-container, release=1763362218,
io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.k8s.description=Red Hat Ceph Storage 7, version=7, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, GIT_BRANCH=main, ceph=True, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, build-date=2025-11-26T19:44:28Z, vendor=Red Hat, Inc., io.openshift.expose-services=, GIT_REPO=https://github.com/ceph/ceph-container.git, maintainer=Guillaume Abrioux , vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, architecture=x86_64)
Dec 15 02:44:17 localhost zealous_jang[26652]: [
Dec 15 02:44:17 localhost zealous_jang[26652]: {
Dec 15 02:44:17 localhost zealous_jang[26652]: "available": false,
Dec 15 02:44:17 localhost zealous_jang[26652]: "ceph_device": false,
Dec 15 02:44:17 localhost zealous_jang[26652]: "device_id": "QEMU_DVD-ROM_QM00001",
Dec 15 02:44:17 localhost zealous_jang[26652]: "lsm_data": {},
Dec 15 02:44:17 localhost zealous_jang[26652]: "lvs": [],
Dec 15 02:44:17 localhost zealous_jang[26652]: "path": "/dev/sr0",
Dec 15 02:44:17 localhost zealous_jang[26652]: "rejected_reasons": [
Dec 15 02:44:17 localhost zealous_jang[26652]: "Has a FileSystem",
Dec 15 02:44:17 localhost zealous_jang[26652]: "Insufficient space (<5GB)"
Dec 15 02:44:17 localhost zealous_jang[26652]: ],
Dec 15 02:44:17 localhost zealous_jang[26652]: "sys_api": {
Dec 15 02:44:17 localhost zealous_jang[26652]: "actuators": null,
Dec 15 02:44:17 localhost zealous_jang[26652]: "device_nodes": "sr0",
Dec 15 02:44:17 localhost zealous_jang[26652]: "human_readable_size": "482.00 KB",
Dec 15 02:44:17 localhost zealous_jang[26652]: "id_bus": "ata",
Dec 15 02:44:17 localhost zealous_jang[26652]: "model": "QEMU DVD-ROM",
Dec 15 02:44:17 localhost zealous_jang[26652]: "nr_requests": "2",
Dec 15 02:44:17 localhost zealous_jang[26652]: "partitions": {},
Dec 15 02:44:17 localhost zealous_jang[26652]: "path": "/dev/sr0",
Dec 15 02:44:17 localhost zealous_jang[26652]: "removable": "1",
Dec 15 02:44:17 localhost zealous_jang[26652]: "rev": "2.5+",
Dec 15 02:44:17 localhost zealous_jang[26652]: "ro": "0",
Dec 15 02:44:17 localhost zealous_jang[26652]: "rotational": "1",
Dec 15 02:44:17 localhost zealous_jang[26652]: "sas_address": "",
Dec 15 02:44:17 localhost zealous_jang[26652]: "sas_device_handle": "",
Dec 15 02:44:17 localhost zealous_jang[26652]: "scheduler_mode": "mq-deadline",
Dec 15 02:44:17 localhost zealous_jang[26652]: "sectors": 0,
Dec 15 02:44:17 localhost zealous_jang[26652]: "sectorsize": "2048",
Dec 15 02:44:17 localhost zealous_jang[26652]: "size": 493568.0,
Dec 15 02:44:17 localhost zealous_jang[26652]: "support_discard": "0",
Dec 15 02:44:17 localhost zealous_jang[26652]: "type": "disk",
Dec 15 02:44:17 localhost zealous_jang[26652]: "vendor": "QEMU"
Dec 15 02:44:17 localhost zealous_jang[26652]: }
Dec 15 02:44:17 localhost zealous_jang[26652]: }
Dec 15 02:44:17 localhost zealous_jang[26652]: ]
Dec 15 02:44:17 localhost systemd[1]: libpod-ad7422b204667eb6ad676a19e20c4dd7d8f22f355eaef11fa5ace19c30aa7ef7.scope: Deactivated successfully.
Dec 15 02:44:17 localhost podman[26631]: 2025-12-15 07:44:17.199107549 +0000 UTC m=+5.063465455 container died ad7422b204667eb6ad676a19e20c4dd7d8f22f355eaef11fa5ace19c30aa7ef7 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=zealous_jang, GIT_REPO=https://github.com/ceph/ceph-container.git, com.redhat.component=rhceph-container, release=1763362218, RELEASE=main, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.buildah.version=1.41.4, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, vcs-type=git, GIT_BRANCH=main, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., ceph=True, io.openshift.expose-services=, version=7, description=Red Hat Ceph Storage 7, distribution-scope=public, architecture=x86_64, vendor=Red Hat, Inc., name=rhceph, io.openshift.tags=rhceph ceph, maintainer=Guillaume Abrioux , io.k8s.description=Red Hat Ceph Storage 7, build-date=2025-11-26T19:44:28Z, CEPH_POINT_RELEASE=, url=https://catalog.redhat.com/en/search?searchType=containers, GIT_CLEAN=True, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0)
Dec 15 02:44:17 localhost systemd[1]: var-lib-containers-storage-overlay-4afdbad2e2d6c1edcf4f7b647ae8fda48b530251f3970ef49b3022e7d7a285d7-merged.mount: Deactivated successfully.
Dec 15 02:44:17 localhost podman[28118]: 2025-12-15 07:44:17.291352126 +0000 UTC m=+0.081494444 container remove ad7422b204667eb6ad676a19e20c4dd7d8f22f355eaef11fa5ace19c30aa7ef7 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=zealous_jang, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.k8s.description=Red Hat Ceph Storage 7, maintainer=Guillaume Abrioux , architecture=x86_64, CEPH_POINT_RELEASE=, distribution-scope=public, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, version=7, name=rhceph, io.openshift.tags=rhceph ceph, release=1763362218, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.openshift.expose-services=, GIT_REPO=https://github.com/ceph/ceph-container.git, vcs-type=git, url=https://catalog.redhat.com/en/search?searchType=containers, GIT_BRANCH=main, GIT_CLEAN=True, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vendor=Red Hat, Inc., description=Red Hat Ceph Storage 7, io.buildah.version=1.41.4, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, RELEASE=main, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, ceph=True, build-date=2025-11-26T19:44:28Z, com.redhat.component=rhceph-container)
Dec 15 02:44:17 localhost systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Dec 15 02:44:17 localhost systemd[1]: libpod-conmon-ad7422b204667eb6ad676a19e20c4dd7d8f22f355eaef11fa5ace19c30aa7ef7.scope: Deactivated successfully.
Dec 15 02:44:17 localhost systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Dec 15 02:44:17 localhost systemd[1]: systemd-coredump.socket: Deactivated successfully.
Dec 15 02:44:17 localhost systemd[1]: Closed Process Core Dump Socket.
Dec 15 02:44:17 localhost systemd[1]: Stopping Process Core Dump Socket...
Dec 15 02:44:17 localhost systemd[1]: Listening on Process Core Dump Socket.
Dec 15 02:44:17 localhost systemd[1]: Reloading.
Dec 15 02:44:18 localhost systemd-rc-local-generator[28203]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 15 02:44:18 localhost systemd-sysv-generator[28206]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 15 02:44:18 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Dec 15 02:44:18 localhost systemd[1]: Reloading.
Dec 15 02:44:18 localhost systemd-sysv-generator[28239]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 15 02:44:18 localhost systemd-rc-local-generator[28235]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 15 02:44:18 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Dec 15 02:44:38 localhost systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Dec 15 02:44:38 localhost systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Dec 15 02:44:38 localhost podman[28321]:
Dec 15 02:44:38 localhost podman[28321]: 2025-12-15 07:44:38.926346119 +0000 UTC m=+0.064802557 container create 8986fb57c106aa1ea68776e436af912315e975161d208710bd5fd935f5137f9f (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=focused_gates, RELEASE=main, io.openshift.expose-services=, CEPH_POINT_RELEASE=, io.openshift.tags=rhceph ceph, maintainer=Guillaume Abrioux , GIT_REPO=https://github.com/ceph/ceph-container.git, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.component=rhceph-container, GIT_CLEAN=True, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, build-date=2025-11-26T19:44:28Z, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, release=1763362218, distribution-scope=public, io.buildah.version=1.41.4, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, ceph=True, description=Red Hat Ceph Storage 7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, vendor=Red Hat, Inc., version=7, name=rhceph, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vcs-type=git, io.k8s.description=Red Hat Ceph Storage 7, GIT_BRANCH=main)
Dec 15 02:44:38 localhost systemd[1]: Started libpod-conmon-8986fb57c106aa1ea68776e436af912315e975161d208710bd5fd935f5137f9f.scope.
Dec 15 02:44:38 localhost systemd[1]: Started libcrun container.
Dec 15 02:44:38 localhost podman[28321]: 2025-12-15 07:44:38.893481083 +0000 UTC m=+0.031937521 image pull registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Dec 15 02:44:39 localhost podman[28321]: 2025-12-15 07:44:39.005172226 +0000 UTC m=+0.143628684 container init 8986fb57c106aa1ea68776e436af912315e975161d208710bd5fd935f5137f9f (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=focused_gates, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., url=https://catalog.redhat.com/en/search?searchType=containers, GIT_REPO=https://github.com/ceph/ceph-container.git, io.k8s.description=Red Hat Ceph Storage 7, RELEASE=main, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, distribution-scope=public, GIT_CLEAN=True, name=rhceph, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, io.buildah.version=1.41.4, maintainer=Guillaume Abrioux , ceph=True, description=Red Hat Ceph Storage 7, io.openshift.tags=rhceph ceph, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., build-date=2025-11-26T19:44:28Z, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, com.redhat.component=rhceph-container, CEPH_POINT_RELEASE=, architecture=x86_64, version=7, release=1763362218, io.openshift.expose-services=, GIT_BRANCH=main, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9)
Dec 15 02:44:39 localhost podman[28321]: 2025-12-15 07:44:39.015518492 +0000 UTC m=+0.153974920 container start 8986fb57c106aa1ea68776e436af912315e975161d208710bd5fd935f5137f9f (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=focused_gates, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_BRANCH=main, name=rhceph, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, vcs-type=git, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, RELEASE=main, description=Red Hat Ceph Storage 7, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, CEPH_POINT_RELEASE=, io.openshift.tags=rhceph ceph, build-date=2025-11-26T19:44:28Z, url=https://catalog.redhat.com/en/search?searchType=containers, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vendor=Red Hat, Inc., GIT_CLEAN=True, maintainer=Guillaume Abrioux , GIT_REPO=https://github.com/ceph/ceph-container.git, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, ceph=True, distribution-scope=public, com.redhat.component=rhceph-container, io.buildah.version=1.41.4, release=1763362218, version=7, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, io.k8s.description=Red Hat Ceph Storage 7, io.openshift.expose-services=)
Dec 15 02:44:39 localhost podman[28321]: 2025-12-15 07:44:39.01583097 +0000 UTC m=+0.154287448 container attach 8986fb57c106aa1ea68776e436af912315e975161d208710bd5fd935f5137f9f (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=focused_gates, name=rhceph, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.k8s.description=Red Hat Ceph Storage 7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat Ceph Storage 7, maintainer=Guillaume Abrioux , release=1763362218, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, GIT_REPO=https://github.com/ceph/ceph-container.git, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.openshift.expose-services=, distribution-scope=public, GIT_CLEAN=True, CEPH_POINT_RELEASE=, version=7, io.openshift.tags=rhceph ceph, vcs-type=git, architecture=x86_64, vendor=Red Hat, Inc., build-date=2025-11-26T19:44:28Z, GIT_BRANCH=main, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, io.buildah.version=1.41.4, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, RELEASE=main, ceph=True, com.redhat.component=rhceph-container)
Dec 15 02:44:39 localhost focused_gates[28336]: 167 167
Dec 15 02:44:39 localhost systemd[1]: libpod-8986fb57c106aa1ea68776e436af912315e975161d208710bd5fd935f5137f9f.scope: Deactivated successfully.
Dec 15 02:44:39 localhost podman[28321]: 2025-12-15 07:44:39.020369227 +0000 UTC m=+0.158825685 container died 8986fb57c106aa1ea68776e436af912315e975161d208710bd5fd935f5137f9f (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=focused_gates, io.openshift.tags=rhceph ceph, com.redhat.component=rhceph-container, GIT_REPO=https://github.com/ceph/ceph-container.git, ceph=True, vendor=Red Hat, Inc., CEPH_POINT_RELEASE=, io.buildah.version=1.41.4, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_BRANCH=main, release=1763362218, name=rhceph, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, RELEASE=main, io.openshift.expose-services=, architecture=x86_64, GIT_CLEAN=True, version=7, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat Ceph Storage 7, maintainer=Guillaume Abrioux , cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, url=https://catalog.redhat.com/en/search?searchType=containers, build-date=2025-11-26T19:44:28Z, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.k8s.description=Red Hat Ceph Storage 7, distribution-scope=public)
Dec 15 02:44:39 localhost podman[28341]: 2025-12-15 07:44:39.111054349 +0000 UTC m=+0.079957357 container remove 8986fb57c106aa1ea68776e436af912315e975161d208710bd5fd935f5137f9f (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=focused_gates, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, build-date=2025-11-26T19:44:28Z, distribution-scope=public, version=7, maintainer=Guillaume Abrioux , summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, GIT_BRANCH=main, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, io.buildah.version=1.41.4, GIT_REPO=https://github.com/ceph/ceph-container.git, release=1763362218, io.openshift.tags=rhceph ceph, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, ceph=True, CEPH_POINT_RELEASE=, com.redhat.component=rhceph-container, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhceph, url=https://catalog.redhat.com/en/search?searchType=containers, vendor=Red Hat, Inc., io.k8s.description=Red Hat Ceph Storage 7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, RELEASE=main, description=Red Hat Ceph Storage 7, vcs-type=git, io.openshift.expose-services=, GIT_CLEAN=True)
Dec 15 02:44:39 localhost systemd[1]: libpod-conmon-8986fb57c106aa1ea68776e436af912315e975161d208710bd5fd935f5137f9f.scope: Deactivated successfully.
Dec 15 02:44:39 localhost systemd[1]: Reloading.
Dec 15 02:44:39 localhost systemd-rc-local-generator[28383]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 15 02:44:39 localhost systemd-sysv-generator[28386]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 15 02:44:39 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Dec 15 02:44:39 localhost systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Dec 15 02:44:39 localhost systemd[1]: Reloading. Dec 15 02:44:39 localhost systemd-rc-local-generator[28422]: /etc/rc.d/rc.local is not marked executable, skipping. Dec 15 02:44:39 localhost systemd-sysv-generator[28427]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. Dec 15 02:44:39 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Dec 15 02:44:39 localhost systemd[1]: Reached target All Ceph clusters and services. Dec 15 02:44:39 localhost systemd[1]: Reloading. Dec 15 02:44:39 localhost systemd-rc-local-generator[28459]: /etc/rc.d/rc.local is not marked executable, skipping. Dec 15 02:44:39 localhost systemd-sysv-generator[28464]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. Dec 15 02:44:39 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Dec 15 02:44:39 localhost systemd[1]: Reached target Ceph cluster bce17446-41b5-5408-a23e-0b011906b44a. Dec 15 02:44:39 localhost systemd[1]: Reloading. Dec 15 02:44:39 localhost systemd-rc-local-generator[28497]: /etc/rc.d/rc.local is not marked executable, skipping. Dec 15 02:44:39 localhost systemd-sysv-generator[28502]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. 
Dec 15 02:44:40 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Dec 15 02:44:40 localhost systemd[1]: Reloading. Dec 15 02:44:40 localhost systemd-sysv-generator[28543]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. Dec 15 02:44:40 localhost systemd-rc-local-generator[28538]: /etc/rc.d/rc.local is not marked executable, skipping. Dec 15 02:44:40 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Dec 15 02:44:40 localhost systemd[1]: Created slice Slice /system/ceph-bce17446-41b5-5408-a23e-0b011906b44a. Dec 15 02:44:40 localhost systemd[1]: Reached target System Time Set. Dec 15 02:44:40 localhost systemd[1]: Reached target System Time Synchronized. Dec 15 02:44:40 localhost systemd[1]: Starting Ceph crash.np0005559462 for bce17446-41b5-5408-a23e-0b011906b44a... Dec 15 02:44:40 localhost systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully. Dec 15 02:44:40 localhost systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully. 
Dec 15 02:44:40 localhost podman[28603]: Dec 15 02:44:40 localhost podman[28603]: 2025-12-15 07:44:40.753369205 +0000 UTC m=+0.075605065 container create 8dcda56b365b42dc8758aab77a9ec80db304780e449052738f7e4e648ae1ecaf (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-bce17446-41b5-5408-a23e-0b011906b44a-crash-np0005559462, CEPH_POINT_RELEASE=, RELEASE=main, GIT_CLEAN=True, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vendor=Red Hat, Inc., summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., distribution-scope=public, maintainer=Guillaume Abrioux , description=Red Hat Ceph Storage 7, architecture=x86_64, GIT_REPO=https://github.com/ceph/ceph-container.git, io.k8s.description=Red Hat Ceph Storage 7, version=7, vcs-type=git, com.redhat.component=rhceph-container, url=https://catalog.redhat.com/en/search?searchType=containers, release=1763362218, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhceph, build-date=2025-11-26T19:44:28Z, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, GIT_BRANCH=main, ceph=True, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.openshift.tags=rhceph ceph, io.buildah.version=1.41.4, io.openshift.expose-services=) Dec 15 02:44:40 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/c8293189f718bb84f24a4c153d24bff1cf16e60149d40cc8c50faa9e9e6e5a3c/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff) Dec 15 02:44:40 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/c8293189f718bb84f24a4c153d24bff1cf16e60149d40cc8c50faa9e9e6e5a3c/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff) Dec 15 02:44:40 localhost podman[28603]: 2025-12-15 07:44:40.725737465 +0000 UTC m=+0.047973325 image 
pull registry.redhat.io/rhceph/rhceph-7-rhel9:latest Dec 15 02:44:40 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/c8293189f718bb84f24a4c153d24bff1cf16e60149d40cc8c50faa9e9e6e5a3c/merged/etc/ceph/ceph.client.crash.np0005559462.keyring supports timestamps until 2038 (0x7fffffff) Dec 15 02:44:40 localhost podman[28603]: 2025-12-15 07:44:40.843700269 +0000 UTC m=+0.165936139 container init 8dcda56b365b42dc8758aab77a9ec80db304780e449052738f7e4e648ae1ecaf (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-bce17446-41b5-5408-a23e-0b011906b44a-crash-np0005559462, vcs-type=git, GIT_CLEAN=True, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, io.buildah.version=1.41.4, build-date=2025-11-26T19:44:28Z, io.openshift.tags=rhceph ceph, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.openshift.expose-services=, distribution-scope=public, CEPH_POINT_RELEASE=, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, GIT_REPO=https://github.com/ceph/ceph-container.git, GIT_BRANCH=main, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, RELEASE=main, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.k8s.description=Red Hat Ceph Storage 7, vendor=Red Hat, Inc., name=rhceph, maintainer=Guillaume Abrioux , com.redhat.component=rhceph-container, ceph=True, version=7, architecture=x86_64, description=Red Hat Ceph Storage 7, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, url=https://catalog.redhat.com/en/search?searchType=containers, release=1763362218) Dec 15 02:44:40 localhost podman[28603]: 2025-12-15 07:44:40.852761522 +0000 UTC m=+0.174997382 container start 8dcda56b365b42dc8758aab77a9ec80db304780e449052738f7e4e648ae1ecaf (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, 
name=ceph-bce17446-41b5-5408-a23e-0b011906b44a-crash-np0005559462, vcs-type=git, release=1763362218, GIT_REPO=https://github.com/ceph/ceph-container.git, io.openshift.tags=rhceph ceph, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, name=rhceph, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, io.buildah.version=1.41.4, ceph=True, url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, com.redhat.component=rhceph-container, CEPH_POINT_RELEASE=, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, build-date=2025-11-26T19:44:28Z, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=7, GIT_BRANCH=main, io.openshift.expose-services=, architecture=x86_64, description=Red Hat Ceph Storage 7, GIT_CLEAN=True, vendor=Red Hat, Inc., distribution-scope=public, RELEASE=main, io.k8s.description=Red Hat Ceph Storage 7, maintainer=Guillaume Abrioux ) Dec 15 02:44:40 localhost bash[28603]: 8dcda56b365b42dc8758aab77a9ec80db304780e449052738f7e4e648ae1ecaf Dec 15 02:44:40 localhost systemd[1]: Started Ceph crash.np0005559462 for bce17446-41b5-5408-a23e-0b011906b44a. 
Dec 15 02:44:40 localhost ceph-bce17446-41b5-5408-a23e-0b011906b44a-crash-np0005559462[28617]: INFO:ceph-crash:pinging cluster to exercise our key
Dec 15 02:44:41 localhost ceph-bce17446-41b5-5408-a23e-0b011906b44a-crash-np0005559462[28617]: 2025-12-15T07:44:41.027+0000 7fe6a48fc640 -1 auth: unable to find a keyring on /etc/ceph/ceph.client.admin.keyring,/etc/ceph/ceph.keyring,/etc/ceph/keyring,/etc/ceph/keyring.bin: (2) No such file or directory
Dec 15 02:44:41 localhost ceph-bce17446-41b5-5408-a23e-0b011906b44a-crash-np0005559462[28617]: 2025-12-15T07:44:41.027+0000 7fe6a48fc640 -1 AuthRegistry(0x7fe6a00680d0) no keyring found at /etc/ceph/ceph.client.admin.keyring,/etc/ceph/ceph.keyring,/etc/ceph/keyring,/etc/ceph/keyring.bin, disabling cephx
Dec 15 02:44:41 localhost ceph-bce17446-41b5-5408-a23e-0b011906b44a-crash-np0005559462[28617]: 2025-12-15T07:44:41.028+0000 7fe6a48fc640 -1 auth: unable to find a keyring on /etc/ceph/ceph.client.admin.keyring,/etc/ceph/ceph.keyring,/etc/ceph/keyring,/etc/ceph/keyring.bin: (2) No such file or directory
Dec 15 02:44:41 localhost ceph-bce17446-41b5-5408-a23e-0b011906b44a-crash-np0005559462[28617]: 2025-12-15T07:44:41.028+0000 7fe6a48fc640 -1 AuthRegistry(0x7fe6a48fb000) no keyring found at /etc/ceph/ceph.client.admin.keyring,/etc/ceph/ceph.keyring,/etc/ceph/keyring,/etc/ceph/keyring.bin, disabling cephx
Dec 15 02:44:41 localhost ceph-bce17446-41b5-5408-a23e-0b011906b44a-crash-np0005559462[28617]: 2025-12-15T07:44:41.039+0000 7fe69effd640 -1 monclient(hunting): handle_auth_bad_method server allowed_methods [2] but i only support [1]
Dec 15 02:44:41 localhost ceph-bce17446-41b5-5408-a23e-0b011906b44a-crash-np0005559462[28617]: 2025-12-15T07:44:41.039+0000 7fe69f7fe640 -1 monclient(hunting): handle_auth_bad_method server allowed_methods [2] but i only support [1]
Dec 15 02:44:41 localhost ceph-bce17446-41b5-5408-a23e-0b011906b44a-crash-np0005559462[28617]: 2025-12-15T07:44:41.040+0000 7fe69ffff640 -1 monclient(hunting): handle_auth_bad_method server allowed_methods [2] but i only support [1]
Dec 15 02:44:41 localhost ceph-bce17446-41b5-5408-a23e-0b011906b44a-crash-np0005559462[28617]: 2025-12-15T07:44:41.040+0000 7fe6a48fc640 -1 monclient: authenticate NOTE: no keyring found; disabled cephx authentication
Dec 15 02:44:41 localhost ceph-bce17446-41b5-5408-a23e-0b011906b44a-crash-np0005559462[28617]: [errno 13] RADOS permission denied (error connecting to the cluster)
Dec 15 02:44:41 localhost ceph-bce17446-41b5-5408-a23e-0b011906b44a-crash-np0005559462[28617]: INFO:ceph-crash:monitoring path /var/lib/ceph/crash, delay 600s
Dec 15 02:44:49 localhost podman[28722]:
Dec 15 02:44:49 localhost podman[28722]: 2025-12-15 07:44:49.742718568 +0000 UTC m=+0.075205215 container create ce00982bfdfbc49d704372f1c15613643e8a280f5da509fafae7ea7c48a6a8b6 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=pedantic_heisenberg, com.redhat.component=rhceph-container, io.openshift.expose-services=, GIT_BRANCH=main, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, io.k8s.description=Red Hat Ceph Storage 7, CEPH_POINT_RELEASE=, io.buildah.version=1.41.4, RELEASE=main, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, io.openshift.tags=rhceph ceph, GIT_CLEAN=True, ceph=True, maintainer=Guillaume Abrioux , release=1763362218, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., cpe=cpe:/a:redhat:enterprise_linux:9::appstream, version=7, distribution-scope=public, description=Red Hat Ceph Storage 7, build-date=2025-11-26T19:44:28Z, vcs-type=git, architecture=x86_64, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_REPO=https://github.com/ceph/ceph-container.git, vendor=Red Hat, Inc., GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, name=rhceph)
Dec 15 02:44:49 localhost systemd[1]: Started libpod-conmon-ce00982bfdfbc49d704372f1c15613643e8a280f5da509fafae7ea7c48a6a8b6.scope. Dec 15 02:44:49 localhost systemd[1]: Started libcrun container. Dec 15 02:44:49 localhost podman[28722]: 2025-12-15 07:44:49.712424049 +0000 UTC m=+0.044910696 image pull registry.redhat.io/rhceph/rhceph-7-rhel9:latest Dec 15 02:44:49 localhost podman[28722]: 2025-12-15 07:44:49.82053267 +0000 UTC m=+0.153019327 container init ce00982bfdfbc49d704372f1c15613643e8a280f5da509fafae7ea7c48a6a8b6 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=pedantic_heisenberg, io.buildah.version=1.41.4, RELEASE=main, ceph=True, GIT_REPO=https://github.com/ceph/ceph-container.git, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, description=Red Hat Ceph Storage 7, build-date=2025-11-26T19:44:28Z, GIT_CLEAN=True, name=rhceph, architecture=x86_64, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vendor=Red Hat, Inc., vcs-type=git, url=https://catalog.redhat.com/en/search?searchType=containers, maintainer=Guillaume Abrioux , com.redhat.component=rhceph-container, io.openshift.tags=rhceph ceph, release=1763362218, CEPH_POINT_RELEASE=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat Ceph Storage 7, io.openshift.expose-services=, version=7, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, GIT_BRANCH=main, distribution-scope=public) Dec 15 02:44:49 localhost podman[28722]: 2025-12-15 07:44:49.831489922 +0000 UTC m=+0.163976569 container start ce00982bfdfbc49d704372f1c15613643e8a280f5da509fafae7ea7c48a6a8b6 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=pedantic_heisenberg, 
GIT_REPO=https://github.com/ceph/ceph-container.git, maintainer=Guillaume Abrioux , io.openshift.expose-services=, description=Red Hat Ceph Storage 7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=7, name=rhceph, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., ceph=True, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, vcs-type=git, GIT_CLEAN=True, release=1763362218, io.buildah.version=1.41.4, architecture=x86_64, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, url=https://catalog.redhat.com/en/search?searchType=containers, RELEASE=main, io.k8s.description=Red Hat Ceph Storage 7, CEPH_POINT_RELEASE=, GIT_BRANCH=main, io.openshift.tags=rhceph ceph, vendor=Red Hat, Inc., com.redhat.component=rhceph-container, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, build-date=2025-11-26T19:44:28Z, distribution-scope=public, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0) Dec 15 02:44:49 localhost podman[28722]: 2025-12-15 07:44:49.831759429 +0000 UTC m=+0.164246076 container attach ce00982bfdfbc49d704372f1c15613643e8a280f5da509fafae7ea7c48a6a8b6 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=pedantic_heisenberg, release=1763362218, architecture=x86_64, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_CLEAN=True, maintainer=Guillaume Abrioux , GIT_BRANCH=main, vcs-type=git, io.openshift.tags=rhceph ceph, ceph=True, vendor=Red Hat, Inc., io.buildah.version=1.41.4, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, RELEASE=main, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_REPO=https://github.com/ceph/ceph-container.git, io.openshift.expose-services=, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, distribution-scope=public, 
url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.k8s.description=Red Hat Ceph Storage 7, com.redhat.component=rhceph-container, CEPH_POINT_RELEASE=, build-date=2025-11-26T19:44:28Z, version=7, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat Ceph Storage 7, name=rhceph) Dec 15 02:44:49 localhost pedantic_heisenberg[28738]: 167 167 Dec 15 02:44:49 localhost systemd[1]: libpod-ce00982bfdfbc49d704372f1c15613643e8a280f5da509fafae7ea7c48a6a8b6.scope: Deactivated successfully. Dec 15 02:44:49 localhost podman[28722]: 2025-12-15 07:44:49.835808623 +0000 UTC m=+0.168295340 container died ce00982bfdfbc49d704372f1c15613643e8a280f5da509fafae7ea7c48a6a8b6 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=pedantic_heisenberg, GIT_BRANCH=main, name=rhceph, vcs-type=git, distribution-scope=public, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, ceph=True, io.openshift.tags=rhceph ceph, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=rhceph-container, description=Red Hat Ceph Storage 7, architecture=x86_64, io.openshift.expose-services=, build-date=2025-11-26T19:44:28Z, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, RELEASE=main, maintainer=Guillaume Abrioux , cpe=cpe:/a:redhat:enterprise_linux:9::appstream, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, url=https://catalog.redhat.com/en/search?searchType=containers, GIT_REPO=https://github.com/ceph/ceph-container.git, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vendor=Red Hat, Inc., CEPH_POINT_RELEASE=, version=7, io.buildah.version=1.41.4, io.k8s.description=Red Hat Ceph Storage 7, GIT_CLEAN=True, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, 
release=1763362218) Dec 15 02:44:49 localhost systemd[1]: tmp-crun.ZsAHt2.mount: Deactivated successfully. Dec 15 02:44:49 localhost podman[28743]: 2025-12-15 07:44:49.924340929 +0000 UTC m=+0.079534916 container remove ce00982bfdfbc49d704372f1c15613643e8a280f5da509fafae7ea7c48a6a8b6 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=pedantic_heisenberg, GIT_CLEAN=True, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, RELEASE=main, io.k8s.description=Red Hat Ceph Storage 7, description=Red Hat Ceph Storage 7, name=rhceph, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, ceph=True, build-date=2025-11-26T19:44:28Z, url=https://catalog.redhat.com/en/search?searchType=containers, architecture=x86_64, CEPH_POINT_RELEASE=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=rhceph-container, io.buildah.version=1.41.4, io.openshift.expose-services=, io.openshift.tags=rhceph ceph, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, GIT_REPO=https://github.com/ceph/ceph-container.git, maintainer=Guillaume Abrioux , vendor=Red Hat, Inc., GIT_BRANCH=main, release=1763362218, vcs-type=git, version=7, distribution-scope=public, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image.) Dec 15 02:44:49 localhost systemd[1]: libpod-conmon-ce00982bfdfbc49d704372f1c15613643e8a280f5da509fafae7ea7c48a6a8b6.scope: Deactivated successfully. 
Dec 15 02:44:50 localhost podman[28765]: Dec 15 02:44:50 localhost podman[28765]: 2025-12-15 07:44:50.13121022 +0000 UTC m=+0.071959062 container create 431729d43bf09b5ba3e3605055f3fd36204b259f14bc9cc129667b271c3eaa8a (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=musing_wu, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, build-date=2025-11-26T19:44:28Z, version=7, io.openshift.tags=rhceph ceph, com.redhat.component=rhceph-container, ceph=True, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, vendor=Red Hat, Inc., GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.buildah.version=1.41.4, RELEASE=main, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., url=https://catalog.redhat.com/en/search?searchType=containers, architecture=x86_64, name=rhceph, GIT_REPO=https://github.com/ceph/ceph-container.git, GIT_BRANCH=main, maintainer=Guillaume Abrioux , GIT_CLEAN=True, io.openshift.expose-services=, io.k8s.description=Red Hat Ceph Storage 7, release=1763362218, distribution-scope=public, CEPH_POINT_RELEASE=, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, description=Red Hat Ceph Storage 7) Dec 15 02:44:50 localhost systemd[1]: Started libpod-conmon-431729d43bf09b5ba3e3605055f3fd36204b259f14bc9cc129667b271c3eaa8a.scope. Dec 15 02:44:50 localhost systemd[1]: Started libcrun container. 
Dec 15 02:44:50 localhost podman[28765]: 2025-12-15 07:44:50.102902122 +0000 UTC m=+0.043650954 image pull registry.redhat.io/rhceph/rhceph-7-rhel9:latest Dec 15 02:44:50 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/5db372addc250f318907d40a951b885de9c412563f57e5863c0ff0d7ad3f9d04/merged/rootfs supports timestamps until 2038 (0x7fffffff) Dec 15 02:44:50 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/5db372addc250f318907d40a951b885de9c412563f57e5863c0ff0d7ad3f9d04/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff) Dec 15 02:44:50 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/5db372addc250f318907d40a951b885de9c412563f57e5863c0ff0d7ad3f9d04/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff) Dec 15 02:44:50 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/5db372addc250f318907d40a951b885de9c412563f57e5863c0ff0d7ad3f9d04/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff) Dec 15 02:44:50 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/5db372addc250f318907d40a951b885de9c412563f57e5863c0ff0d7ad3f9d04/merged/var/lib/ceph/bootstrap-osd/ceph.keyring supports timestamps until 2038 (0x7fffffff) Dec 15 02:44:50 localhost podman[28765]: 2025-12-15 07:44:50.261298235 +0000 UTC m=+0.202047067 container init 431729d43bf09b5ba3e3605055f3fd36204b259f14bc9cc129667b271c3eaa8a (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=musing_wu, url=https://catalog.redhat.com/en/search?searchType=containers, CEPH_POINT_RELEASE=, distribution-scope=public, io.k8s.description=Red Hat Ceph Storage 7, GIT_BRANCH=main, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, architecture=x86_64, io.openshift.tags=rhceph ceph, description=Red Hat Ceph Storage 7, version=7, release=1763362218, 
org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, build-date=2025-11-26T19:44:28Z, vcs-type=git, GIT_CLEAN=True, vendor=Red Hat, Inc., ceph=True, RELEASE=main, maintainer=Guillaume Abrioux , com.redhat.component=rhceph-container, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., name=rhceph, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.openshift.expose-services=, GIT_REPO=https://github.com/ceph/ceph-container.git) Dec 15 02:44:50 localhost podman[28765]: 2025-12-15 07:44:50.271250701 +0000 UTC m=+0.211999533 container start 431729d43bf09b5ba3e3605055f3fd36204b259f14bc9cc129667b271c3eaa8a (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=musing_wu, CEPH_POINT_RELEASE=, ceph=True, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vcs-type=git, architecture=x86_64, io.openshift.tags=rhceph ceph, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, GIT_REPO=https://github.com/ceph/ceph-container.git, io.openshift.expose-services=, GIT_BRANCH=main, name=rhceph, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, com.redhat.component=rhceph-container, build-date=2025-11-26T19:44:28Z, release=1763362218, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat Ceph Storage 7, RELEASE=main, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.buildah.version=1.41.4, url=https://catalog.redhat.com/en/search?searchType=containers, vendor=Red Hat, Inc., distribution-scope=public, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_CLEAN=True, io.k8s.description=Red Hat Ceph Storage 7, maintainer=Guillaume 
Abrioux , version=7) Dec 15 02:44:50 localhost podman[28765]: 2025-12-15 07:44:50.27158801 +0000 UTC m=+0.212336902 container attach 431729d43bf09b5ba3e3605055f3fd36204b259f14bc9cc129667b271c3eaa8a (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=musing_wu, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, distribution-scope=public, RELEASE=main, url=https://catalog.redhat.com/en/search?searchType=containers, maintainer=Guillaume Abrioux , org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, io.k8s.description=Red Hat Ceph Storage 7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, architecture=x86_64, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, GIT_CLEAN=True, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.openshift.expose-services=, description=Red Hat Ceph Storage 7, version=7, CEPH_POINT_RELEASE=, release=1763362218, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., name=rhceph, GIT_REPO=https://github.com/ceph/ceph-container.git, vcs-type=git, build-date=2025-11-26T19:44:28Z, ceph=True, io.buildah.version=1.41.4, com.redhat.component=rhceph-container, GIT_BRANCH=main, vendor=Red Hat, Inc., io.openshift.tags=rhceph ceph) Dec 15 02:44:50 localhost musing_wu[28781]: --> passed data devices: 0 physical, 2 LVM Dec 15 02:44:50 localhost musing_wu[28781]: --> relative data size: 1.0 Dec 15 02:44:50 localhost systemd[1]: var-lib-containers-storage-overlay-0d2c56941e07e22b8761f4683c71fbf9410144c4b147d66d10c6d76249397629-merged.mount: Deactivated successfully. 
Dec 15 02:44:50 localhost musing_wu[28781]: Running command: /usr/bin/ceph-authtool --gen-print-key Dec 15 02:44:50 localhost musing_wu[28781]: Running command: /usr/bin/ceph --cluster ceph --name client.bootstrap-osd --keyring /var/lib/ceph/bootstrap-osd/ceph.keyring -i - osd new 6d4c9d22-e303-4075-8ccd-2bb4bc620212 Dec 15 02:44:51 localhost lvm[28835]: PV /dev/loop3 online, VG ceph_vg0 is complete. Dec 15 02:44:51 localhost lvm[28835]: VG ceph_vg0 finished Dec 15 02:44:51 localhost musing_wu[28781]: Running command: /usr/bin/ceph-authtool --gen-print-key Dec 15 02:44:51 localhost musing_wu[28781]: Running command: /usr/bin/mount -t tmpfs tmpfs /var/lib/ceph/osd/ceph-0 Dec 15 02:44:51 localhost musing_wu[28781]: Running command: /usr/bin/chown -h ceph:ceph /dev/ceph_vg0/ceph_lv0 Dec 15 02:44:51 localhost musing_wu[28781]: Running command: /usr/bin/chown -R ceph:ceph /dev/dm-0 Dec 15 02:44:51 localhost musing_wu[28781]: Running command: /usr/bin/ln -s /dev/ceph_vg0/ceph_lv0 /var/lib/ceph/osd/ceph-0/block Dec 15 02:44:51 localhost musing_wu[28781]: Running command: /usr/bin/ceph --cluster ceph --name client.bootstrap-osd --keyring /var/lib/ceph/bootstrap-osd/ceph.keyring mon getmap -o /var/lib/ceph/osd/ceph-0/activate.monmap Dec 15 02:44:51 localhost musing_wu[28781]: stderr: got monmap epoch 3 Dec 15 02:44:51 localhost musing_wu[28781]: --> Creating keyring file for osd.0 Dec 15 02:44:51 localhost musing_wu[28781]: Running command: /usr/bin/chown -R ceph:ceph /var/lib/ceph/osd/ceph-0/keyring Dec 15 02:44:51 localhost musing_wu[28781]: Running command: /usr/bin/chown -R ceph:ceph /var/lib/ceph/osd/ceph-0/ Dec 15 02:44:51 localhost musing_wu[28781]: Running command: /usr/bin/ceph-osd --cluster ceph --osd-objectstore bluestore --mkfs -i 0 --monmap /var/lib/ceph/osd/ceph-0/activate.monmap --keyfile - --osdspec-affinity default_drive_group --osd-data /var/lib/ceph/osd/ceph-0/ --osd-uuid 6d4c9d22-e303-4075-8ccd-2bb4bc620212 --setuser ceph --setgroup ceph Dec 15 02:44:54 
localhost musing_wu[28781]: stderr: 2025-12-15T07:44:51.952+0000 7f9af0452a80 -1 bluestore(/var/lib/ceph/osd/ceph-0//block) _read_bdev_label unable to decode label at offset 102: void bluestore_bdev_label_t::decode(ceph::buffer::v15_2_0::list::const_iterator&) decode past end of struct encoding: Malformed input [buffer:3] Dec 15 02:44:54 localhost musing_wu[28781]: stderr: 2025-12-15T07:44:51.952+0000 7f9af0452a80 -1 bluestore(/var/lib/ceph/osd/ceph-0/) _read_fsid unparsable uuid Dec 15 02:44:54 localhost musing_wu[28781]: --> ceph-volume lvm prepare successful for: ceph_vg0/ceph_lv0 Dec 15 02:44:54 localhost musing_wu[28781]: Running command: /usr/bin/chown -R ceph:ceph /var/lib/ceph/osd/ceph-0 Dec 15 02:44:54 localhost musing_wu[28781]: Running command: /usr/bin/ceph-bluestore-tool --cluster=ceph prime-osd-dir --dev /dev/ceph_vg0/ceph_lv0 --path /var/lib/ceph/osd/ceph-0 --no-mon-config Dec 15 02:44:54 localhost musing_wu[28781]: Running command: /usr/bin/ln -snf /dev/ceph_vg0/ceph_lv0 /var/lib/ceph/osd/ceph-0/block Dec 15 02:44:54 localhost musing_wu[28781]: Running command: /usr/bin/chown -h ceph:ceph /var/lib/ceph/osd/ceph-0/block Dec 15 02:44:54 localhost musing_wu[28781]: Running command: /usr/bin/chown -R ceph:ceph /dev/dm-0 Dec 15 02:44:54 localhost musing_wu[28781]: Running command: /usr/bin/chown -R ceph:ceph /var/lib/ceph/osd/ceph-0 Dec 15 02:44:54 localhost musing_wu[28781]: --> ceph-volume lvm activate successful for osd ID: 0 Dec 15 02:44:54 localhost musing_wu[28781]: --> ceph-volume lvm create successful for: ceph_vg0/ceph_lv0 Dec 15 02:44:54 localhost musing_wu[28781]: Running command: /usr/bin/ceph-authtool --gen-print-key Dec 15 02:44:54 localhost musing_wu[28781]: Running command: /usr/bin/ceph --cluster ceph --name client.bootstrap-osd --keyring /var/lib/ceph/bootstrap-osd/ceph.keyring -i - osd new 9639648d-7992-48ee-ae84-b668fb65e316 Dec 15 02:44:55 localhost lvm[29779]: PV /dev/loop4 online, VG ceph_vg1 is complete. 
Dec 15 02:44:55 localhost lvm[29779]: VG ceph_vg1 finished
Dec 15 02:44:55 localhost musing_wu[28781]: Running command: /usr/bin/ceph-authtool --gen-print-key
Dec 15 02:44:55 localhost musing_wu[28781]: Running command: /usr/bin/mount -t tmpfs tmpfs /var/lib/ceph/osd/ceph-3
Dec 15 02:44:55 localhost musing_wu[28781]: Running command: /usr/bin/chown -h ceph:ceph /dev/ceph_vg1/ceph_lv1
Dec 15 02:44:55 localhost musing_wu[28781]: Running command: /usr/bin/chown -R ceph:ceph /dev/dm-1
Dec 15 02:44:55 localhost musing_wu[28781]: Running command: /usr/bin/ln -s /dev/ceph_vg1/ceph_lv1 /var/lib/ceph/osd/ceph-3/block
Dec 15 02:44:55 localhost musing_wu[28781]: Running command: /usr/bin/ceph --cluster ceph --name client.bootstrap-osd --keyring /var/lib/ceph/bootstrap-osd/ceph.keyring mon getmap -o /var/lib/ceph/osd/ceph-3/activate.monmap
Dec 15 02:44:55 localhost musing_wu[28781]: stderr: got monmap epoch 3
Dec 15 02:44:55 localhost musing_wu[28781]: --> Creating keyring file for osd.3
Dec 15 02:44:55 localhost musing_wu[28781]: Running command: /usr/bin/chown -R ceph:ceph /var/lib/ceph/osd/ceph-3/keyring
Dec 15 02:44:55 localhost musing_wu[28781]: Running command: /usr/bin/chown -R ceph:ceph /var/lib/ceph/osd/ceph-3/
Dec 15 02:44:55 localhost musing_wu[28781]: Running command: /usr/bin/ceph-osd --cluster ceph --osd-objectstore bluestore --mkfs -i 3 --monmap /var/lib/ceph/osd/ceph-3/activate.monmap --keyfile - --osdspec-affinity default_drive_group --osd-data /var/lib/ceph/osd/ceph-3/ --osd-uuid 9639648d-7992-48ee-ae84-b668fb65e316 --setuser ceph --setgroup ceph
Dec 15 02:44:58 localhost musing_wu[28781]: stderr: 2025-12-15T07:44:55.733+0000 7f8f937a5a80 -1 bluestore(/var/lib/ceph/osd/ceph-3//block) _read_bdev_label unable to decode label at offset 102: void bluestore_bdev_label_t::decode(ceph::buffer::v15_2_0::list::const_iterator&) decode past end of struct encoding: Malformed input [buffer:3]
Dec 15 02:44:58 localhost musing_wu[28781]: stderr: 2025-12-15T07:44:55.733+0000 7f8f937a5a80 -1 bluestore(/var/lib/ceph/osd/ceph-3/) _read_fsid unparsable uuid
Dec 15 02:44:58 localhost musing_wu[28781]: --> ceph-volume lvm prepare successful for: ceph_vg1/ceph_lv1
Dec 15 02:44:58 localhost musing_wu[28781]: Running command: /usr/bin/chown -R ceph:ceph /var/lib/ceph/osd/ceph-3
Dec 15 02:44:58 localhost musing_wu[28781]: Running command: /usr/bin/ceph-bluestore-tool --cluster=ceph prime-osd-dir --dev /dev/ceph_vg1/ceph_lv1 --path /var/lib/ceph/osd/ceph-3 --no-mon-config
Dec 15 02:44:58 localhost musing_wu[28781]: Running command: /usr/bin/ln -snf /dev/ceph_vg1/ceph_lv1 /var/lib/ceph/osd/ceph-3/block
Dec 15 02:44:58 localhost musing_wu[28781]: Running command: /usr/bin/chown -h ceph:ceph /var/lib/ceph/osd/ceph-3/block
Dec 15 02:44:58 localhost musing_wu[28781]: Running command: /usr/bin/chown -R ceph:ceph /dev/dm-1
Dec 15 02:44:58 localhost musing_wu[28781]: Running command: /usr/bin/chown -R ceph:ceph /var/lib/ceph/osd/ceph-3
Dec 15 02:44:58 localhost musing_wu[28781]: --> ceph-volume lvm activate successful for osd ID: 3
Dec 15 02:44:58 localhost musing_wu[28781]: --> ceph-volume lvm create successful for: ceph_vg1/ceph_lv1
Dec 15 02:44:58 localhost systemd[1]: libpod-431729d43bf09b5ba3e3605055f3fd36204b259f14bc9cc129667b271c3eaa8a.scope: Deactivated successfully.
Dec 15 02:44:58 localhost systemd[1]: libpod-431729d43bf09b5ba3e3605055f3fd36204b259f14bc9cc129667b271c3eaa8a.scope: Consumed 3.776s CPU time.
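When auditing an OSD-creation run like the one above, it is often useful to pull just the `Running command:` entries out of the journal. A minimal sketch (the sample lines are abbreviated from the log above; the regex is an assumption about the journal's `MMM DD HH:MM:SS host tag[pid]: message` layout):

```python
import re

# Abbreviated sample copied from the journal excerpt above.
LOG = """\
Dec 15 02:44:55 localhost musing_wu[28781]: Running command: /usr/bin/ceph-authtool --gen-print-key
Dec 15 02:44:55 localhost musing_wu[28781]: Running command: /usr/bin/mount -t tmpfs tmpfs /var/lib/ceph/osd/ceph-3
Dec 15 02:44:55 localhost musing_wu[28781]: stderr: got monmap epoch 3
"""

# Prefix: month, day, time, hostname, then "tag[pid]:".
CMD_RE = re.compile(r"^\w{3} +\d+ [\d:]+ \S+ \S+\[\d+\]: Running command: (.+)$")

def running_commands(text: str) -> list[str]:
    """Return the command line of every 'Running command:' journal entry."""
    return [m.group(1) for line in text.splitlines() if (m := CMD_RE.match(line))]

print(running_commands(LOG))
```

This keeps only the two `Running command:` entries from the sample and drops the `stderr:` line.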
Dec 15 02:44:58 localhost podman[28765]: 2025-12-15 07:44:58.341471297 +0000 UTC m=+8.282220159 container died 431729d43bf09b5ba3e3605055f3fd36204b259f14bc9cc129667b271c3eaa8a (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=musing_wu, GIT_REPO=https://github.com/ceph/ceph-container.git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., CEPH_POINT_RELEASE=, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.openshift.expose-services=, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, distribution-scope=public, GIT_CLEAN=True, build-date=2025-11-26T19:44:28Z, io.k8s.description=Red Hat Ceph Storage 7, vcs-type=git, url=https://catalog.redhat.com/en/search?searchType=containers, io.buildah.version=1.41.4, architecture=x86_64, name=rhceph, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, description=Red Hat Ceph Storage 7, ceph=True, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, maintainer=Guillaume Abrioux , io.openshift.tags=rhceph ceph, GIT_BRANCH=main, release=1763362218, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, com.redhat.component=rhceph-container, RELEASE=main, vendor=Red Hat, Inc., version=7) Dec 15 02:44:58 localhost systemd[1]: var-lib-containers-storage-overlay-5db372addc250f318907d40a951b885de9c412563f57e5863c0ff0d7ad3f9d04-merged.mount: Deactivated successfully. 
Dec 15 02:44:58 localhost podman[30693]: 2025-12-15 07:44:58.425950729 +0000 UTC m=+0.075216965 container remove 431729d43bf09b5ba3e3605055f3fd36204b259f14bc9cc129667b271c3eaa8a (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=musing_wu, description=Red Hat Ceph Storage 7, vcs-type=git, distribution-scope=public, RELEASE=main, GIT_BRANCH=main, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, GIT_CLEAN=True, com.redhat.component=rhceph-container, io.openshift.tags=rhceph ceph, io.openshift.expose-services=, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, build-date=2025-11-26T19:44:28Z, url=https://catalog.redhat.com/en/search?searchType=containers, ceph=True, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, release=1763362218, name=rhceph, vendor=Red Hat, Inc., org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, io.k8s.description=Red Hat Ceph Storage 7, maintainer=Guillaume Abrioux , architecture=x86_64, GIT_REPO=https://github.com/ceph/ceph-container.git, version=7, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, io.buildah.version=1.41.4, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., CEPH_POINT_RELEASE=) Dec 15 02:44:58 localhost systemd[1]: libpod-conmon-431729d43bf09b5ba3e3605055f3fd36204b259f14bc9cc129667b271c3eaa8a.scope: Deactivated successfully. 
Dec 15 02:44:59 localhost podman[30776]: Dec 15 02:44:59 localhost podman[30776]: 2025-12-15 07:44:59.227631336 +0000 UTC m=+0.072246009 container create 1acb1cb18a67433795adc66a7bb5b8786412e32e45dfedc05ac150922dba4845 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=adoring_hertz, RELEASE=main, version=7, ceph=True, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, CEPH_POINT_RELEASE=, build-date=2025-11-26T19:44:28Z, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.openshift.tags=rhceph ceph, io.openshift.expose-services=, io.buildah.version=1.41.4, com.redhat.component=rhceph-container, release=1763362218, GIT_CLEAN=True, description=Red Hat Ceph Storage 7, url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, vcs-type=git, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_BRANCH=main, name=rhceph, distribution-scope=public, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, io.k8s.description=Red Hat Ceph Storage 7, vendor=Red Hat, Inc., maintainer=Guillaume Abrioux , GIT_REPO=https://github.com/ceph/ceph-container.git, cpe=cpe:/a:redhat:enterprise_linux:9::appstream) Dec 15 02:44:59 localhost systemd[1]: Started libpod-conmon-1acb1cb18a67433795adc66a7bb5b8786412e32e45dfedc05ac150922dba4845.scope. Dec 15 02:44:59 localhost systemd[1]: Started libcrun container. 
Dec 15 02:44:59 localhost podman[30776]: 2025-12-15 07:44:59.296521358 +0000 UTC m=+0.141136021 container init 1acb1cb18a67433795adc66a7bb5b8786412e32e45dfedc05ac150922dba4845 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=adoring_hertz, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, io.buildah.version=1.41.4, release=1763362218, io.k8s.description=Red Hat Ceph Storage 7, url=https://catalog.redhat.com/en/search?searchType=containers, description=Red Hat Ceph Storage 7, RELEASE=main, CEPH_POINT_RELEASE=, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, distribution-scope=public, GIT_BRANCH=main, architecture=x86_64, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, vcs-type=git, maintainer=Guillaume Abrioux , com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhceph, GIT_CLEAN=True, io.openshift.tags=rhceph ceph, version=7, vendor=Red Hat, Inc., GIT_REPO=https://github.com/ceph/ceph-container.git, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., build-date=2025-11-26T19:44:28Z, io.openshift.expose-services=, com.redhat.component=rhceph-container, ceph=True) Dec 15 02:44:59 localhost podman[30776]: 2025-12-15 07:44:59.199223576 +0000 UTC m=+0.043838249 image pull registry.redhat.io/rhceph/rhceph-7-rhel9:latest Dec 15 02:44:59 localhost podman[30776]: 2025-12-15 07:44:59.308886346 +0000 UTC m=+0.153501009 container start 1acb1cb18a67433795adc66a7bb5b8786412e32e45dfedc05ac150922dba4845 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=adoring_hertz, distribution-scope=public, GIT_REPO=https://github.com/ceph/ceph-container.git, version=7, io.openshift.expose-services=, com.redhat.component=rhceph-container, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, 
org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, maintainer=Guillaume Abrioux , io.openshift.tags=rhceph ceph, GIT_BRANCH=main, io.k8s.description=Red Hat Ceph Storage 7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, ceph=True, release=1763362218, name=rhceph, RELEASE=main, architecture=x86_64, CEPH_POINT_RELEASE=, url=https://catalog.redhat.com/en/search?searchType=containers, description=Red Hat Ceph Storage 7, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, vcs-type=git, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_CLEAN=True, build-date=2025-11-26T19:44:28Z, vendor=Red Hat, Inc., summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.buildah.version=1.41.4) Dec 15 02:44:59 localhost podman[30776]: 2025-12-15 07:44:59.309346738 +0000 UTC m=+0.153961401 container attach 1acb1cb18a67433795adc66a7bb5b8786412e32e45dfedc05ac150922dba4845 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=adoring_hertz, architecture=x86_64, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.buildah.version=1.41.4, GIT_BRANCH=main, release=1763362218, url=https://catalog.redhat.com/en/search?searchType=containers, build-date=2025-11-26T19:44:28Z, version=7, distribution-scope=public, RELEASE=main, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., maintainer=Guillaume Abrioux , name=rhceph, com.redhat.component=rhceph-container, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, ceph=True, io.k8s.description=Red Hat Ceph Storage 7, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, GIT_REPO=https://github.com/ceph/ceph-container.git, CEPH_POINT_RELEASE=, GIT_CLEAN=True, description=Red Hat 
Ceph Storage 7, vendor=Red Hat, Inc., io.openshift.expose-services=, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.openshift.tags=rhceph ceph) Dec 15 02:44:59 localhost adoring_hertz[30791]: 167 167 Dec 15 02:44:59 localhost systemd[1]: libpod-1acb1cb18a67433795adc66a7bb5b8786412e32e45dfedc05ac150922dba4845.scope: Deactivated successfully. Dec 15 02:44:59 localhost podman[30776]: 2025-12-15 07:44:59.31370156 +0000 UTC m=+0.158316223 container died 1acb1cb18a67433795adc66a7bb5b8786412e32e45dfedc05ac150922dba4845 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=adoring_hertz, CEPH_POINT_RELEASE=, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, distribution-scope=public, description=Red Hat Ceph Storage 7, RELEASE=main, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.openshift.expose-services=, release=1763362218, io.k8s.description=Red Hat Ceph Storage 7, io.openshift.tags=rhceph ceph, io.buildah.version=1.41.4, ceph=True, GIT_CLEAN=True, maintainer=Guillaume Abrioux , build-date=2025-11-26T19:44:28Z, name=rhceph, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, url=https://catalog.redhat.com/en/search?searchType=containers, version=7, vcs-type=git, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, architecture=x86_64, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, GIT_REPO=https://github.com/ceph/ceph-container.git, GIT_BRANCH=main, com.redhat.component=rhceph-container) Dec 15 02:44:59 localhost systemd[1]: var-lib-containers-storage-overlay-83fc3490d7905934a799757b840f6fa78c2ca5c674d946183f4d5dc357d1f0b7-merged.mount: Deactivated successfully. 
Dec 15 02:44:59 localhost podman[30796]: 2025-12-15 07:44:59.404162756 +0000 UTC m=+0.081222950 container remove 1acb1cb18a67433795adc66a7bb5b8786412e32e45dfedc05ac150922dba4845 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=adoring_hertz, ceph=True, url=https://catalog.redhat.com/en/search?searchType=containers, RELEASE=main, GIT_REPO=https://github.com/ceph/ceph-container.git, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, release=1763362218, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, maintainer=Guillaume Abrioux , summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_BRANCH=main, io.openshift.expose-services=, build-date=2025-11-26T19:44:28Z, CEPH_POINT_RELEASE=, architecture=x86_64, GIT_CLEAN=True, vendor=Red Hat, Inc., io.k8s.description=Red Hat Ceph Storage 7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, distribution-scope=public, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, com.redhat.component=rhceph-container, io.buildah.version=1.41.4, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat Ceph Storage 7, version=7, name=rhceph, io.openshift.tags=rhceph ceph) Dec 15 02:44:59 localhost systemd[1]: libpod-conmon-1acb1cb18a67433795adc66a7bb5b8786412e32e45dfedc05ac150922dba4845.scope: Deactivated successfully. 
Dec 15 02:44:59 localhost podman[30819]: Dec 15 02:44:59 localhost podman[30819]: 2025-12-15 07:44:59.621312051 +0000 UTC m=+0.077857533 container create 315ea68346a2ffd987d9b2e753898e52e9310b0375d4093ff5f1c0979e73d065 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=priceless_lovelace, io.openshift.tags=rhceph ceph, name=rhceph, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat Ceph Storage 7, CEPH_POINT_RELEASE=, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_BRANCH=main, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, maintainer=Guillaume Abrioux , GIT_CLEAN=True, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., url=https://catalog.redhat.com/en/search?searchType=containers, distribution-scope=public, version=7, ceph=True, vcs-type=git, vendor=Red Hat, Inc., GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, com.redhat.component=rhceph-container, io.openshift.expose-services=, description=Red Hat Ceph Storage 7, RELEASE=main, GIT_REPO=https://github.com/ceph/ceph-container.git, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.buildah.version=1.41.4, release=1763362218, architecture=x86_64, build-date=2025-11-26T19:44:28Z) Dec 15 02:44:59 localhost systemd[1]: Started libpod-conmon-315ea68346a2ffd987d9b2e753898e52e9310b0375d4093ff5f1c0979e73d065.scope. Dec 15 02:44:59 localhost systemd[1]: Started libcrun container. 
Dec 15 02:44:59 localhost podman[30819]: 2025-12-15 07:44:59.589041591 +0000 UTC m=+0.045587063 image pull registry.redhat.io/rhceph/rhceph-7-rhel9:latest Dec 15 02:44:59 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/ed506bed244d2cf26d3ce0492733c7a1ea670dbe900e552a8a2ae25e812ae143/merged/rootfs supports timestamps until 2038 (0x7fffffff) Dec 15 02:44:59 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/ed506bed244d2cf26d3ce0492733c7a1ea670dbe900e552a8a2ae25e812ae143/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff) Dec 15 02:44:59 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/ed506bed244d2cf26d3ce0492733c7a1ea670dbe900e552a8a2ae25e812ae143/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff) Dec 15 02:44:59 localhost podman[30819]: 2025-12-15 07:44:59.720908602 +0000 UTC m=+0.177454084 container init 315ea68346a2ffd987d9b2e753898e52e9310b0375d4093ff5f1c0979e73d065 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=priceless_lovelace, com.redhat.component=rhceph-container, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, RELEASE=main, release=1763362218, GIT_BRANCH=main, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=Guillaume Abrioux , CEPH_POINT_RELEASE=, io.k8s.description=Red Hat Ceph Storage 7, version=7, io.openshift.expose-services=, description=Red Hat Ceph Storage 7, GIT_CLEAN=True, url=https://catalog.redhat.com/en/search?searchType=containers, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, distribution-scope=public, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, 
build-date=2025-11-26T19:44:28Z, ceph=True, io.buildah.version=1.41.4, architecture=x86_64, io.openshift.tags=rhceph ceph, GIT_REPO=https://github.com/ceph/ceph-container.git, vendor=Red Hat, Inc., vcs-type=git, name=rhceph) Dec 15 02:44:59 localhost podman[30819]: 2025-12-15 07:44:59.738111754 +0000 UTC m=+0.194657196 container start 315ea68346a2ffd987d9b2e753898e52e9310b0375d4093ff5f1c0979e73d065 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=priceless_lovelace, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.openshift.tags=rhceph ceph, io.buildah.version=1.41.4, url=https://catalog.redhat.com/en/search?searchType=containers, vendor=Red Hat, Inc., vcs-type=git, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.k8s.description=Red Hat Ceph Storage 7, CEPH_POINT_RELEASE=, RELEASE=main, GIT_REPO=https://github.com/ceph/ceph-container.git, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., distribution-scope=public, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, GIT_BRANCH=main, ceph=True, description=Red Hat Ceph Storage 7, version=7, architecture=x86_64, build-date=2025-11-26T19:44:28Z, name=rhceph, GIT_CLEAN=True, com.redhat.component=rhceph-container, maintainer=Guillaume Abrioux , release=1763362218, io.openshift.expose-services=) Dec 15 02:44:59 localhost podman[30819]: 2025-12-15 07:44:59.73834659 +0000 UTC m=+0.194892092 container attach 315ea68346a2ffd987d9b2e753898e52e9310b0375d4093ff5f1c0979e73d065 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=priceless_lovelace, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, com.redhat.component=rhceph-container, GIT_REPO=https://github.com/ceph/ceph-container.git, 
org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, io.openshift.expose-services=, GIT_CLEAN=True, io.openshift.tags=rhceph ceph, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., build-date=2025-11-26T19:44:28Z, vendor=Red Hat, Inc., vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, release=1763362218, io.k8s.description=Red Hat Ceph Storage 7, vcs-type=git, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, CEPH_POINT_RELEASE=, RELEASE=main, architecture=x86_64, GIT_BRANCH=main, ceph=True, description=Red Hat Ceph Storage 7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, url=https://catalog.redhat.com/en/search?searchType=containers, distribution-scope=public, name=rhceph, io.buildah.version=1.41.4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=Guillaume Abrioux , version=7)
Dec 15 02:45:00 localhost priceless_lovelace[30835]: {
Dec 15 02:45:00 localhost priceless_lovelace[30835]: "0": [
Dec 15 02:45:00 localhost priceless_lovelace[30835]: {
Dec 15 02:45:00 localhost priceless_lovelace[30835]: "devices": [
Dec 15 02:45:00 localhost priceless_lovelace[30835]: "/dev/loop3"
Dec 15 02:45:00 localhost priceless_lovelace[30835]: ],
Dec 15 02:45:00 localhost priceless_lovelace[30835]: "lv_name": "ceph_lv0",
Dec 15 02:45:00 localhost priceless_lovelace[30835]: "lv_path": "/dev/ceph_vg0/ceph_lv0",
Dec 15 02:45:00 localhost priceless_lovelace[30835]: "lv_size": "7511998464",
Dec 15 02:45:00 localhost priceless_lovelace[30835]: "lv_tags": "ceph.block_device=/dev/ceph_vg0/ceph_lv0,ceph.block_uuid=BLnE1h-OYRC-ey9E-mmCR-VvO4-LvIk-2I930R,ceph.cephx_lockbox_secret=,ceph.cluster_fsid=bce17446-41b5-5408-a23e-0b011906b44a,ceph.cluster_name=ceph,ceph.crush_device_class=,ceph.encrypted=0,ceph.osd_fsid=6d4c9d22-e303-4075-8ccd-2bb4bc620212,ceph.osd_id=0,ceph.osdspec_affinity=default_drive_group,ceph.type=block,ceph.vdo=0",
Dec 15 02:45:00 localhost priceless_lovelace[30835]: "lv_uuid": "BLnE1h-OYRC-ey9E-mmCR-VvO4-LvIk-2I930R",
Dec 15 02:45:00 localhost priceless_lovelace[30835]: "name": "ceph_lv0",
Dec 15 02:45:00 localhost priceless_lovelace[30835]: "path": "/dev/ceph_vg0/ceph_lv0",
Dec 15 02:45:00 localhost priceless_lovelace[30835]: "tags": {
Dec 15 02:45:00 localhost priceless_lovelace[30835]: "ceph.block_device": "/dev/ceph_vg0/ceph_lv0",
Dec 15 02:45:00 localhost priceless_lovelace[30835]: "ceph.block_uuid": "BLnE1h-OYRC-ey9E-mmCR-VvO4-LvIk-2I930R",
Dec 15 02:45:00 localhost priceless_lovelace[30835]: "ceph.cephx_lockbox_secret": "",
Dec 15 02:45:00 localhost priceless_lovelace[30835]: "ceph.cluster_fsid": "bce17446-41b5-5408-a23e-0b011906b44a",
Dec 15 02:45:00 localhost priceless_lovelace[30835]: "ceph.cluster_name": "ceph",
Dec 15 02:45:00 localhost priceless_lovelace[30835]: "ceph.crush_device_class": "",
Dec 15 02:45:00 localhost priceless_lovelace[30835]: "ceph.encrypted": "0",
Dec 15 02:45:00 localhost priceless_lovelace[30835]: "ceph.osd_fsid": "6d4c9d22-e303-4075-8ccd-2bb4bc620212",
Dec 15 02:45:00 localhost priceless_lovelace[30835]: "ceph.osd_id": "0",
Dec 15 02:45:00 localhost priceless_lovelace[30835]: "ceph.osdspec_affinity": "default_drive_group",
Dec 15 02:45:00 localhost priceless_lovelace[30835]: "ceph.type": "block",
Dec 15 02:45:00 localhost priceless_lovelace[30835]: "ceph.vdo": "0"
Dec 15 02:45:00 localhost priceless_lovelace[30835]: },
Dec 15 02:45:00 localhost priceless_lovelace[30835]: "type": "block",
Dec 15 02:45:00 localhost priceless_lovelace[30835]: "vg_name": "ceph_vg0"
Dec 15 02:45:00 localhost priceless_lovelace[30835]: }
Dec 15 02:45:00 localhost priceless_lovelace[30835]: ],
Dec 15 02:45:00 localhost priceless_lovelace[30835]: "3": [
Dec 15 02:45:00 localhost priceless_lovelace[30835]: {
Dec 15 02:45:00 localhost priceless_lovelace[30835]: "devices": [
Dec 15 02:45:00 localhost priceless_lovelace[30835]: "/dev/loop4"
Dec 15 02:45:00 localhost priceless_lovelace[30835]: ],
Dec 15 02:45:00 localhost priceless_lovelace[30835]: "lv_name": "ceph_lv1",
Dec 15 02:45:00 localhost priceless_lovelace[30835]: "lv_path": "/dev/ceph_vg1/ceph_lv1",
Dec 15 02:45:00 localhost priceless_lovelace[30835]: "lv_size": "7511998464",
Dec 15 02:45:00 localhost priceless_lovelace[30835]: "lv_tags": "ceph.block_device=/dev/ceph_vg1/ceph_lv1,ceph.block_uuid=aGsFOg-Zf2m-s0ap-uI2s-K7HN-BXbo-32X8cP,ceph.cephx_lockbox_secret=,ceph.cluster_fsid=bce17446-41b5-5408-a23e-0b011906b44a,ceph.cluster_name=ceph,ceph.crush_device_class=,ceph.encrypted=0,ceph.osd_fsid=9639648d-7992-48ee-ae84-b668fb65e316,ceph.osd_id=3,ceph.osdspec_affinity=default_drive_group,ceph.type=block,ceph.vdo=0",
Dec 15 02:45:00 localhost priceless_lovelace[30835]: "lv_uuid": "aGsFOg-Zf2m-s0ap-uI2s-K7HN-BXbo-32X8cP",
Dec 15 02:45:00 localhost priceless_lovelace[30835]: "name": "ceph_lv1",
Dec 15 02:45:00 localhost priceless_lovelace[30835]: "path": "/dev/ceph_vg1/ceph_lv1",
Dec 15 02:45:00 localhost priceless_lovelace[30835]: "tags": {
Dec 15 02:45:00 localhost priceless_lovelace[30835]: "ceph.block_device": "/dev/ceph_vg1/ceph_lv1",
Dec 15 02:45:00 localhost priceless_lovelace[30835]: "ceph.block_uuid": "aGsFOg-Zf2m-s0ap-uI2s-K7HN-BXbo-32X8cP",
Dec 15 02:45:00 localhost priceless_lovelace[30835]: "ceph.cephx_lockbox_secret": "",
Dec 15 02:45:00 localhost priceless_lovelace[30835]: "ceph.cluster_fsid": "bce17446-41b5-5408-a23e-0b011906b44a",
Dec 15 02:45:00 localhost priceless_lovelace[30835]: "ceph.cluster_name": "ceph",
Dec 15 02:45:00 localhost priceless_lovelace[30835]: "ceph.crush_device_class": "",
Dec 15 02:45:00 localhost priceless_lovelace[30835]: "ceph.encrypted": "0",
Dec 15 02:45:00 localhost priceless_lovelace[30835]: "ceph.osd_fsid": "9639648d-7992-48ee-ae84-b668fb65e316",
Dec 15 02:45:00 localhost priceless_lovelace[30835]: "ceph.osd_id": "3",
Dec 15 02:45:00 localhost priceless_lovelace[30835]: "ceph.osdspec_affinity": "default_drive_group",
Dec 15 02:45:00 localhost priceless_lovelace[30835]: "ceph.type": "block",
Dec 15 02:45:00 localhost priceless_lovelace[30835]: "ceph.vdo": "0"
Dec 15 02:45:00 localhost priceless_lovelace[30835]: },
Dec 15 02:45:00 localhost priceless_lovelace[30835]: "type": "block",
Dec 15 02:45:00 localhost priceless_lovelace[30835]: "vg_name": "ceph_vg1"
Dec 15 02:45:00 localhost priceless_lovelace[30835]: }
Dec 15 02:45:00 localhost priceless_lovelace[30835]: ]
Dec 15 02:45:00 localhost priceless_lovelace[30835]: }
Dec 15 02:45:00 localhost systemd[1]: libpod-315ea68346a2ffd987d9b2e753898e52e9310b0375d4093ff5f1c0979e73d065.scope: Deactivated successfully.
Dec 15 02:45:00 localhost podman[30819]: 2025-12-15 07:45:00.086596687 +0000 UTC m=+0.543142189 container died 315ea68346a2ffd987d9b2e753898e52e9310b0375d4093ff5f1c0979e73d065 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=priceless_lovelace, io.buildah.version=1.41.4, GIT_CLEAN=True, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, build-date=2025-11-26T19:44:28Z, description=Red Hat Ceph Storage 7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, RELEASE=main, GIT_REPO=https://github.com/ceph/ceph-container.git, vendor=Red Hat, Inc., GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_BRANCH=main, version=7, ceph=True, io.openshift.expose-services=, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, com.redhat.component=rhceph-container, io.openshift.tags=rhceph ceph, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, maintainer=Guillaume Abrioux , io.k8s.description=Red Hat Ceph Storage 7, architecture=x86_64, vcs-type=git, name=rhceph, release=1763362218, CEPH_POINT_RELEASE=, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., url=https://catalog.redhat.com/en/search?searchType=containers, distribution-scope=public, cpe=cpe:/a:redhat:enterprise_linux:9::appstream)
Dec 15
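The `lv_tags` string in the listing above is the flat, comma-separated form of the `tags` object printed next to it. A minimal sketch of that split, assuming (as holds for every tag shown above) that values contain no commas:

```python
def parse_lv_tags(lv_tags: str) -> dict[str, str]:
    """Split a ceph-volume lv_tags string ('k=v,k=v,...') into a dict.

    Empty values such as 'ceph.crush_device_class=' are kept as ''.
    """
    return dict(item.split("=", 1) for item in lv_tags.split(",") if item)

# lv_tags copied verbatim from the osd.3 (ceph_vg1/ceph_lv1) entry above.
tags = parse_lv_tags(
    "ceph.block_device=/dev/ceph_vg1/ceph_lv1"
    ",ceph.block_uuid=aGsFOg-Zf2m-s0ap-uI2s-K7HN-BXbo-32X8cP"
    ",ceph.cephx_lockbox_secret="
    ",ceph.cluster_fsid=bce17446-41b5-5408-a23e-0b011906b44a"
    ",ceph.cluster_name=ceph"
    ",ceph.crush_device_class="
    ",ceph.encrypted=0"
    ",ceph.osd_fsid=9639648d-7992-48ee-ae84-b668fb65e316"
    ",ceph.osd_id=3"
    ",ceph.osdspec_affinity=default_drive_group"
    ",ceph.type=block"
    ",ceph.vdo=0"
)
print(tags["ceph.osd_id"], tags["ceph.type"])  # → 3 block
```

The resulting dict matches the `tags` object in the log entry field for field.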
02:45:00 localhost podman[30844]: 2025-12-15 07:45:00.154570105 +0000 UTC m=+0.061632926 container remove 315ea68346a2ffd987d9b2e753898e52e9310b0375d4093ff5f1c0979e73d065 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=priceless_lovelace, description=Red Hat Ceph Storage 7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, name=rhceph, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.openshift.tags=rhceph ceph, io.openshift.expose-services=, ceph=True, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, RELEASE=main, GIT_BRANCH=main, distribution-scope=public, version=7, GIT_CLEAN=True, architecture=x86_64, com.redhat.component=rhceph-container, release=1763362218, vendor=Red Hat, Inc., build-date=2025-11-26T19:44:28Z, io.buildah.version=1.41.4, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, CEPH_POINT_RELEASE=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, GIT_REPO=https://github.com/ceph/ceph-container.git, url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.description=Red Hat Ceph Storage 7, maintainer=Guillaume Abrioux ) Dec 15 02:45:00 localhost systemd[1]: libpod-conmon-315ea68346a2ffd987d9b2e753898e52e9310b0375d4093ff5f1c0979e73d065.scope: Deactivated successfully. Dec 15 02:45:00 localhost systemd[1]: tmp-crun.fjsWQS.mount: Deactivated successfully. Dec 15 02:45:00 localhost systemd[1]: var-lib-containers-storage-overlay-ed506bed244d2cf26d3ce0492733c7a1ea670dbe900e552a8a2ae25e812ae143-merged.mount: Deactivated successfully. 
Dec 15 02:45:00 localhost podman[30931]: Dec 15 02:45:00 localhost podman[30931]: 2025-12-15 07:45:00.892980505 +0000 UTC m=+0.066542662 container create a33e5cb2407e1c976bbc7ada5ab7bee682e2ff30625116c680d2173685fe201a (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=wonderful_chatterjee, CEPH_POINT_RELEASE=, io.openshift.tags=rhceph ceph, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, RELEASE=main, description=Red Hat Ceph Storage 7, version=7, com.redhat.component=rhceph-container, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_CLEAN=True, name=rhceph, release=1763362218, GIT_BRANCH=main, io.openshift.expose-services=, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, GIT_REPO=https://github.com/ceph/ceph-container.git, io.k8s.description=Red Hat Ceph Storage 7, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, vendor=Red Hat, Inc., maintainer=Guillaume Abrioux , build-date=2025-11-26T19:44:28Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, url=https://catalog.redhat.com/en/search?searchType=containers, ceph=True, vcs-type=git, io.buildah.version=1.41.4, distribution-scope=public) Dec 15 02:45:00 localhost systemd[1]: Started libpod-conmon-a33e5cb2407e1c976bbc7ada5ab7bee682e2ff30625116c680d2173685fe201a.scope. Dec 15 02:45:00 localhost systemd[1]: Started libcrun container. 
Dec 15 02:45:00 localhost podman[30931]: 2025-12-15 07:45:00.860089889 +0000 UTC m=+0.033652046 image pull registry.redhat.io/rhceph/rhceph-7-rhel9:latest Dec 15 02:45:00 localhost podman[30931]: 2025-12-15 07:45:00.961416905 +0000 UTC m=+0.134979082 container init a33e5cb2407e1c976bbc7ada5ab7bee682e2ff30625116c680d2173685fe201a (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=wonderful_chatterjee, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, RELEASE=main, vcs-type=git, io.buildah.version=1.41.4, maintainer=Guillaume Abrioux , summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, description=Red Hat Ceph Storage 7, com.redhat.component=rhceph-container, ceph=True, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, release=1763362218, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, version=7, GIT_REPO=https://github.com/ceph/ceph-container.git, GIT_CLEAN=True, io.k8s.description=Red Hat Ceph Storage 7, io.openshift.expose-services=, url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.tags=rhceph ceph, name=rhceph, vendor=Red Hat, Inc., CEPH_POINT_RELEASE=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, GIT_BRANCH=main, architecture=x86_64, build-date=2025-11-26T19:44:28Z) Dec 15 02:45:00 localhost podman[30931]: 2025-12-15 07:45:00.971257808 +0000 UTC m=+0.144819975 container start a33e5cb2407e1c976bbc7ada5ab7bee682e2ff30625116c680d2173685fe201a (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=wonderful_chatterjee, description=Red Hat Ceph Storage 7, GIT_BRANCH=main, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., name=rhceph, com.redhat.component=rhceph-container, io.openshift.tags=rhceph 
ceph, maintainer=Guillaume Abrioux , build-date=2025-11-26T19:44:28Z, release=1763362218, version=7, vcs-type=git, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.openshift.expose-services=, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, io.k8s.description=Red Hat Ceph Storage 7, url=https://catalog.redhat.com/en/search?searchType=containers, RELEASE=main, GIT_REPO=https://github.com/ceph/ceph-container.git, distribution-scope=public, GIT_CLEAN=True, ceph=True, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, architecture=x86_64, vendor=Red Hat, Inc., CEPH_POINT_RELEASE=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4) Dec 15 02:45:00 localhost podman[30931]: 2025-12-15 07:45:00.971551876 +0000 UTC m=+0.145114043 container attach a33e5cb2407e1c976bbc7ada5ab7bee682e2ff30625116c680d2173685fe201a (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=wonderful_chatterjee, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, description=Red Hat Ceph Storage 7, architecture=x86_64, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., version=7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, io.openshift.expose-services=, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, build-date=2025-11-26T19:44:28Z, vendor=Red Hat, Inc., vcs-type=git, io.buildah.version=1.41.4, distribution-scope=public, GIT_CLEAN=True, ceph=True, release=1763362218, maintainer=Guillaume Abrioux , GIT_REPO=https://github.com/ceph/ceph-container.git, GIT_BRANCH=main, CEPH_POINT_RELEASE=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, 
url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.description=Red Hat Ceph Storage 7, RELEASE=main, com.redhat.component=rhceph-container, io.openshift.tags=rhceph ceph, name=rhceph) Dec 15 02:45:00 localhost wonderful_chatterjee[30947]: 167 167 Dec 15 02:45:00 localhost systemd[1]: libpod-a33e5cb2407e1c976bbc7ada5ab7bee682e2ff30625116c680d2173685fe201a.scope: Deactivated successfully. Dec 15 02:45:00 localhost podman[30931]: 2025-12-15 07:45:00.976977655 +0000 UTC m=+0.150539812 container died a33e5cb2407e1c976bbc7ada5ab7bee682e2ff30625116c680d2173685fe201a (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=wonderful_chatterjee, RELEASE=main, maintainer=Guillaume Abrioux , build-date=2025-11-26T19:44:28Z, GIT_CLEAN=True, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, description=Red Hat Ceph Storage 7, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, com.redhat.component=rhceph-container, ceph=True, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, version=7, distribution-scope=public, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.k8s.description=Red Hat Ceph Storage 7, io.openshift.expose-services=, GIT_REPO=https://github.com/ceph/ceph-container.git, GIT_BRANCH=main, CEPH_POINT_RELEASE=, name=rhceph, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, architecture=x86_64, url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.tags=rhceph ceph, release=1763362218, vcs-type=git, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9) Dec 15 02:45:01 localhost podman[30952]: 2025-12-15 07:45:01.066195599 +0000 UTC m=+0.080121601 container remove a33e5cb2407e1c976bbc7ada5ab7bee682e2ff30625116c680d2173685fe201a (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, 
name=wonderful_chatterjee, release=1763362218, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, CEPH_POINT_RELEASE=, version=7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, GIT_CLEAN=True, maintainer=Guillaume Abrioux , vcs-type=git, io.openshift.tags=rhceph ceph, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, description=Red Hat Ceph Storage 7, build-date=2025-11-26T19:44:28Z, com.redhat.component=rhceph-container, GIT_REPO=https://github.com/ceph/ceph-container.git, io.k8s.description=Red Hat Ceph Storage 7, url=https://catalog.redhat.com/en/search?searchType=containers, io.buildah.version=1.41.4, RELEASE=main, io.openshift.expose-services=, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, ceph=True, vendor=Red Hat, Inc., name=rhceph, GIT_BRANCH=main, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0) Dec 15 02:45:01 localhost systemd[1]: libpod-conmon-a33e5cb2407e1c976bbc7ada5ab7bee682e2ff30625116c680d2173685fe201a.scope: Deactivated successfully. 
Dec 15 02:45:01 localhost podman[30981]: Dec 15 02:45:01 localhost podman[30981]: 2025-12-15 07:45:01.370339221 +0000 UTC m=+0.071430918 container create 3b30a8f881be948aab8ea8f2e7e66953c6b5b376162f6f9ca186780839fb4301 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-bce17446-41b5-5408-a23e-0b011906b44a-osd-0-activate-test, description=Red Hat Ceph Storage 7, GIT_BRANCH=main, distribution-scope=public, architecture=x86_64, io.k8s.description=Red Hat Ceph Storage 7, CEPH_POINT_RELEASE=, ceph=True, build-date=2025-11-26T19:44:28Z, GIT_CLEAN=True, url=https://catalog.redhat.com/en/search?searchType=containers, release=1763362218, com.redhat.component=rhceph-container, vendor=Red Hat, Inc., version=7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.openshift.tags=rhceph ceph, maintainer=Guillaume Abrioux , org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, GIT_REPO=https://github.com/ceph/ceph-container.git, io.openshift.expose-services=, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, RELEASE=main, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, vcs-type=git, name=rhceph, io.buildah.version=1.41.4, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image.) Dec 15 02:45:01 localhost systemd[1]: var-lib-containers-storage-overlay-99622ada9b1b19b49b3a08253b600063b27be3b0ad8d23e8eb0c09909368710f-merged.mount: Deactivated successfully. Dec 15 02:45:01 localhost systemd[1]: Started libpod-conmon-3b30a8f881be948aab8ea8f2e7e66953c6b5b376162f6f9ca186780839fb4301.scope. Dec 15 02:45:01 localhost systemd[1]: Started libcrun container. 
Dec 15 02:45:01 localhost podman[30981]: 2025-12-15 07:45:01.340827522 +0000 UTC m=+0.041919229 image pull registry.redhat.io/rhceph/rhceph-7-rhel9:latest Dec 15 02:45:01 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/696824e58df06eccfb756f5a52a39ae5309144222a4894330aef8c0a033de587/merged/rootfs supports timestamps until 2038 (0x7fffffff) Dec 15 02:45:01 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/696824e58df06eccfb756f5a52a39ae5309144222a4894330aef8c0a033de587/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff) Dec 15 02:45:01 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/696824e58df06eccfb756f5a52a39ae5309144222a4894330aef8c0a033de587/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff) Dec 15 02:45:01 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/696824e58df06eccfb756f5a52a39ae5309144222a4894330aef8c0a033de587/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff) Dec 15 02:45:01 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/696824e58df06eccfb756f5a52a39ae5309144222a4894330aef8c0a033de587/merged/var/lib/ceph/osd/ceph-0 supports timestamps until 2038 (0x7fffffff) Dec 15 02:45:01 localhost podman[30981]: 2025-12-15 07:45:01.49979524 +0000 UTC m=+0.200886947 container init 3b30a8f881be948aab8ea8f2e7e66953c6b5b376162f6f9ca186780839fb4301 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-bce17446-41b5-5408-a23e-0b011906b44a-osd-0-activate-test, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vcs-type=git, io.openshift.tags=rhceph ceph, GIT_CLEAN=True, description=Red Hat Ceph Storage 7, io.buildah.version=1.41.4, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, GIT_REPO=https://github.com/ceph/ceph-container.git, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, maintainer=Guillaume Abrioux , io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, url=https://catalog.redhat.com/en/search?searchType=containers, RELEASE=main, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vendor=Red Hat, Inc., distribution-scope=public, build-date=2025-11-26T19:44:28Z, architecture=x86_64, com.redhat.component=rhceph-container, GIT_BRANCH=main, ceph=True, name=rhceph, io.k8s.description=Red Hat Ceph Storage 7, CEPH_POINT_RELEASE=, release=1763362218, io.openshift.expose-services=, version=7) Dec 15 02:45:01 localhost podman[30981]: 2025-12-15 07:45:01.510499846 +0000 UTC m=+0.211591563 container start 3b30a8f881be948aab8ea8f2e7e66953c6b5b376162f6f9ca186780839fb4301 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-bce17446-41b5-5408-a23e-0b011906b44a-osd-0-activate-test, com.redhat.component=rhceph-container, build-date=2025-11-26T19:44:28Z, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_BRANCH=main, GIT_REPO=https://github.com/ceph/ceph-container.git, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., version=7, GIT_CLEAN=True, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.openshift.expose-services=, vendor=Red Hat, Inc., vcs-type=git, name=rhceph, io.openshift.tags=rhceph ceph, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, architecture=x86_64, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, CEPH_POINT_RELEASE=, RELEASE=main, release=1763362218, 
io.k8s.description=Red Hat Ceph Storage 7, description=Red Hat Ceph Storage 7, maintainer=Guillaume Abrioux , ceph=True) Dec 15 02:45:01 localhost podman[30981]: 2025-12-15 07:45:01.511312906 +0000 UTC m=+0.212404613 container attach 3b30a8f881be948aab8ea8f2e7e66953c6b5b376162f6f9ca186780839fb4301 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-bce17446-41b5-5408-a23e-0b011906b44a-osd-0-activate-test, GIT_REPO=https://github.com/ceph/ceph-container.git, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, maintainer=Guillaume Abrioux , com.redhat.component=rhceph-container, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, version=7, GIT_CLEAN=True, RELEASE=main, GIT_BRANCH=main, ceph=True, vcs-type=git, io.k8s.description=Red Hat Ceph Storage 7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhceph, CEPH_POINT_RELEASE=, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.openshift.expose-services=, vendor=Red Hat, Inc., architecture=x86_64, io.openshift.tags=rhceph ceph, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, url=https://catalog.redhat.com/en/search?searchType=containers, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.buildah.version=1.41.4, build-date=2025-11-26T19:44:28Z, description=Red Hat Ceph Storage 7, distribution-scope=public, release=1763362218) Dec 15 02:45:01 localhost ceph-bce17446-41b5-5408-a23e-0b011906b44a-osd-0-activate-test[30996]: usage: ceph-volume activate [-h] [--osd-id OSD_ID] [--osd-uuid OSD_UUID] Dec 15 02:45:01 localhost ceph-bce17446-41b5-5408-a23e-0b011906b44a-osd-0-activate-test[30996]: [--no-systemd] [--no-tmpfs] Dec 15 02:45:01 localhost ceph-bce17446-41b5-5408-a23e-0b011906b44a-osd-0-activate-test[30996]: ceph-volume activate: error: unrecognized arguments: --bad-option Dec 15 02:45:01 localhost systemd[1]: 
libpod-3b30a8f881be948aab8ea8f2e7e66953c6b5b376162f6f9ca186780839fb4301.scope: Deactivated successfully. Dec 15 02:45:01 localhost podman[30981]: 2025-12-15 07:45:01.731760756 +0000 UTC m=+0.432852433 container died 3b30a8f881be948aab8ea8f2e7e66953c6b5b376162f6f9ca186780839fb4301 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-bce17446-41b5-5408-a23e-0b011906b44a-osd-0-activate-test, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_REPO=https://github.com/ceph/ceph-container.git, distribution-scope=public, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, description=Red Hat Ceph Storage 7, RELEASE=main, architecture=x86_64, io.buildah.version=1.41.4, io.openshift.expose-services=, ceph=True, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, vendor=Red Hat, Inc., CEPH_POINT_RELEASE=, url=https://catalog.redhat.com/en/search?searchType=containers, release=1763362218, GIT_BRANCH=main, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.k8s.description=Red Hat Ceph Storage 7, vcs-type=git, name=rhceph, io.openshift.tags=rhceph ceph, build-date=2025-11-26T19:44:28Z, maintainer=Guillaume Abrioux , com.redhat.component=rhceph-container, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, version=7, GIT_CLEAN=True) Dec 15 02:45:01 localhost podman[31001]: 2025-12-15 07:45:01.804481566 +0000 UTC m=+0.066800909 container remove 3b30a8f881be948aab8ea8f2e7e66953c6b5b376162f6f9ca186780839fb4301 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-bce17446-41b5-5408-a23e-0b011906b44a-osd-0-activate-test, io.k8s.description=Red Hat Ceph Storage 7, CEPH_POINT_RELEASE=, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, build-date=2025-11-26T19:44:28Z, io.openshift.tags=rhceph ceph, 
GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, maintainer=Guillaume Abrioux , release=1763362218, ceph=True, distribution-scope=public, vendor=Red Hat, Inc., architecture=x86_64, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., url=https://catalog.redhat.com/en/search?searchType=containers, description=Red Hat Ceph Storage 7, io.buildah.version=1.41.4, GIT_REPO=https://github.com/ceph/ceph-container.git, RELEASE=main, name=rhceph, GIT_BRANCH=main, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, com.redhat.component=rhceph-container, GIT_CLEAN=True, io.openshift.expose-services=, version=7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git) Dec 15 02:45:01 localhost systemd-journald[618]: Field hash table of /run/log/journal/738a39f68bc78fb81032e509449fb759/system.journal has a fill level at 75.1 (250 of 333 items), suggesting rotation. Dec 15 02:45:01 localhost systemd-journald[618]: /run/log/journal/738a39f68bc78fb81032e509449fb759/system.journal: Journal header limits reached or header out-of-date, rotating. Dec 15 02:45:01 localhost rsyslogd[759]: imjournal: journal files changed, reloading... [v8.2102.0-111.el9 try https://www.rsyslog.com/e/0 ] Dec 15 02:45:01 localhost systemd[1]: libpod-conmon-3b30a8f881be948aab8ea8f2e7e66953c6b5b376162f6f9ca186780839fb4301.scope: Deactivated successfully. Dec 15 02:45:01 localhost rsyslogd[759]: imjournal: journal files changed, reloading... [v8.2102.0-111.el9 try https://www.rsyslog.com/e/0 ] Dec 15 02:45:02 localhost systemd[1]: Reloading. Dec 15 02:45:02 localhost systemd-rc-local-generator[31058]: /etc/rc.d/rc.local is not marked executable, skipping. 
Dec 15 02:45:02 localhost systemd-sysv-generator[31062]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. Dec 15 02:45:02 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Dec 15 02:45:02 localhost systemd[1]: var-lib-containers-storage-overlay-696824e58df06eccfb756f5a52a39ae5309144222a4894330aef8c0a033de587-merged.mount: Deactivated successfully. Dec 15 02:45:02 localhost systemd[1]: Reloading. Dec 15 02:45:02 localhost systemd-rc-local-generator[31100]: /etc/rc.d/rc.local is not marked executable, skipping. Dec 15 02:45:02 localhost systemd-sysv-generator[31106]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. Dec 15 02:45:02 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Dec 15 02:45:02 localhost systemd[1]: Starting Ceph osd.0 for bce17446-41b5-5408-a23e-0b011906b44a... 
Dec 15 02:45:03 localhost podman[31166]: Dec 15 02:45:03 localhost podman[31166]: 2025-12-15 07:45:03.007986427 +0000 UTC m=+0.047647136 container create dda9e8459583475bff3e2d545ae3d6cfc8681586e3bc4791610eebe739dd8fde (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-bce17446-41b5-5408-a23e-0b011906b44a-osd-0-activate, ceph=True, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, url=https://catalog.redhat.com/en/search?searchType=containers, io.buildah.version=1.41.4, CEPH_POINT_RELEASE=, build-date=2025-11-26T19:44:28Z, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, io.openshift.tags=rhceph ceph, RELEASE=main, description=Red Hat Ceph Storage 7, vendor=Red Hat, Inc., maintainer=Guillaume Abrioux , architecture=x86_64, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., name=rhceph, GIT_BRANCH=main, io.k8s.description=Red Hat Ceph Storage 7, release=1763362218, version=7, distribution-scope=public, GIT_CLEAN=True, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, GIT_REPO=https://github.com/ceph/ceph-container.git, vcs-type=git, com.redhat.component=rhceph-container, io.openshift.expose-services=) Dec 15 02:45:03 localhost systemd[1]: tmp-crun.zmcpAa.mount: Deactivated successfully. Dec 15 02:45:03 localhost systemd[1]: Started libcrun container. 
Dec 15 02:45:03 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/6aac20ef032ee3c8c82713a354909535c2a0059b40b2dda9e2be9a7df426ed7a/merged/rootfs supports timestamps until 2038 (0x7fffffff) Dec 15 02:45:03 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/6aac20ef032ee3c8c82713a354909535c2a0059b40b2dda9e2be9a7df426ed7a/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff) Dec 15 02:45:03 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/6aac20ef032ee3c8c82713a354909535c2a0059b40b2dda9e2be9a7df426ed7a/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff) Dec 15 02:45:03 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/6aac20ef032ee3c8c82713a354909535c2a0059b40b2dda9e2be9a7df426ed7a/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff) Dec 15 02:45:03 localhost podman[31166]: 2025-12-15 07:45:02.991257217 +0000 UTC m=+0.030917966 image pull registry.redhat.io/rhceph/rhceph-7-rhel9:latest Dec 15 02:45:03 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/6aac20ef032ee3c8c82713a354909535c2a0059b40b2dda9e2be9a7df426ed7a/merged/var/lib/ceph/osd/ceph-0 supports timestamps until 2038 (0x7fffffff) Dec 15 02:45:03 localhost podman[31166]: 2025-12-15 07:45:03.106945302 +0000 UTC m=+0.146606021 container init dda9e8459583475bff3e2d545ae3d6cfc8681586e3bc4791610eebe739dd8fde (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-bce17446-41b5-5408-a23e-0b011906b44a-osd-0-activate, maintainer=Guillaume Abrioux , vendor=Red Hat, Inc., com.redhat.component=rhceph-container, GIT_CLEAN=True, distribution-scope=public, io.openshift.expose-services=, GIT_REPO=https://github.com/ceph/ceph-container.git, CEPH_POINT_RELEASE=, release=1763362218, url=https://catalog.redhat.com/en/search?searchType=containers, RELEASE=main, io.openshift.tags=rhceph 
ceph, name=rhceph, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, GIT_BRANCH=main, io.buildah.version=1.41.4, vcs-type=git, build-date=2025-11-26T19:44:28Z, version=7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., architecture=x86_64, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, ceph=True, description=Red Hat Ceph Storage 7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.k8s.description=Red Hat Ceph Storage 7) Dec 15 02:45:03 localhost podman[31166]: 2025-12-15 07:45:03.118800127 +0000 UTC m=+0.158460836 container start dda9e8459583475bff3e2d545ae3d6cfc8681586e3bc4791610eebe739dd8fde (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-bce17446-41b5-5408-a23e-0b011906b44a-osd-0-activate, vendor=Red Hat, Inc., io.buildah.version=1.41.4, url=https://catalog.redhat.com/en/search?searchType=containers, GIT_BRANCH=main, vcs-type=git, build-date=2025-11-26T19:44:28Z, com.redhat.component=rhceph-container, distribution-scope=public, description=Red Hat Ceph Storage 7, GIT_CLEAN=True, CEPH_POINT_RELEASE=, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., ceph=True, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, GIT_REPO=https://github.com/ceph/ceph-container.git, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, architecture=x86_64, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.openshift.tags=rhceph ceph, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, release=1763362218, name=rhceph, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.openshift.expose-services=, 
maintainer=Guillaume Abrioux , io.k8s.description=Red Hat Ceph Storage 7, RELEASE=main, version=7) Dec 15 02:45:03 localhost podman[31166]: 2025-12-15 07:45:03.119088524 +0000 UTC m=+0.158749273 container attach dda9e8459583475bff3e2d545ae3d6cfc8681586e3bc4791610eebe739dd8fde (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-bce17446-41b5-5408-a23e-0b011906b44a-osd-0-activate, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, build-date=2025-11-26T19:44:28Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=7, name=rhceph, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, RELEASE=main, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_BRANCH=main, GIT_REPO=https://github.com/ceph/ceph-container.git, io.buildah.version=1.41.4, distribution-scope=public, io.openshift.expose-services=, vcs-type=git, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, architecture=x86_64, description=Red Hat Ceph Storage 7, url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.tags=rhceph ceph, ceph=True, com.redhat.component=rhceph-container, GIT_CLEAN=True, io.k8s.description=Red Hat Ceph Storage 7, release=1763362218, maintainer=Guillaume Abrioux , CEPH_POINT_RELEASE=) Dec 15 02:45:03 localhost systemd[1]: tmp-crun.oBKqNg.mount: Deactivated successfully. 
Dec 15 02:45:03 localhost ceph-bce17446-41b5-5408-a23e-0b011906b44a-osd-0-activate[31180]: Running command: /usr/bin/chown -R ceph:ceph /var/lib/ceph/osd/ceph-0 Dec 15 02:45:03 localhost bash[31166]: Running command: /usr/bin/chown -R ceph:ceph /var/lib/ceph/osd/ceph-0 Dec 15 02:45:03 localhost ceph-bce17446-41b5-5408-a23e-0b011906b44a-osd-0-activate[31180]: Running command: /usr/bin/ceph-bluestore-tool prime-osd-dir --path /var/lib/ceph/osd/ceph-0 --no-mon-config --dev /dev/mapper/ceph_vg0-ceph_lv0 Dec 15 02:45:03 localhost bash[31166]: Running command: /usr/bin/ceph-bluestore-tool prime-osd-dir --path /var/lib/ceph/osd/ceph-0 --no-mon-config --dev /dev/mapper/ceph_vg0-ceph_lv0 Dec 15 02:45:03 localhost ceph-bce17446-41b5-5408-a23e-0b011906b44a-osd-0-activate[31180]: Running command: /usr/bin/chown -h ceph:ceph /dev/mapper/ceph_vg0-ceph_lv0 Dec 15 02:45:03 localhost bash[31166]: Running command: /usr/bin/chown -h ceph:ceph /dev/mapper/ceph_vg0-ceph_lv0 Dec 15 02:45:03 localhost ceph-bce17446-41b5-5408-a23e-0b011906b44a-osd-0-activate[31180]: Running command: /usr/bin/chown -R ceph:ceph /dev/dm-0 Dec 15 02:45:03 localhost bash[31166]: Running command: /usr/bin/chown -R ceph:ceph /dev/dm-0 Dec 15 02:45:03 localhost ceph-bce17446-41b5-5408-a23e-0b011906b44a-osd-0-activate[31180]: Running command: /usr/bin/ln -s /dev/mapper/ceph_vg0-ceph_lv0 /var/lib/ceph/osd/ceph-0/block Dec 15 02:45:03 localhost bash[31166]: Running command: /usr/bin/ln -s /dev/mapper/ceph_vg0-ceph_lv0 /var/lib/ceph/osd/ceph-0/block Dec 15 02:45:03 localhost ceph-bce17446-41b5-5408-a23e-0b011906b44a-osd-0-activate[31180]: Running command: /usr/bin/chown -R ceph:ceph /var/lib/ceph/osd/ceph-0 Dec 15 02:45:03 localhost bash[31166]: Running command: /usr/bin/chown -R ceph:ceph /var/lib/ceph/osd/ceph-0 Dec 15 02:45:03 localhost ceph-bce17446-41b5-5408-a23e-0b011906b44a-osd-0-activate[31180]: --> ceph-volume raw activate successful for osd ID: 0 Dec 15 02:45:03 localhost bash[31166]: --> ceph-volume raw 
activate successful for osd ID: 0 Dec 15 02:45:03 localhost systemd[1]: libpod-dda9e8459583475bff3e2d545ae3d6cfc8681586e3bc4791610eebe739dd8fde.scope: Deactivated successfully. Dec 15 02:45:03 localhost podman[31166]: 2025-12-15 07:45:03.826637371 +0000 UTC m=+0.866298170 container died dda9e8459583475bff3e2d545ae3d6cfc8681586e3bc4791610eebe739dd8fde (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-bce17446-41b5-5408-a23e-0b011906b44a-osd-0-activate, release=1763362218, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., name=rhceph, GIT_CLEAN=True, vcs-type=git, RELEASE=main, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, maintainer=Guillaume Abrioux , description=Red Hat Ceph Storage 7, com.redhat.component=rhceph-container, io.buildah.version=1.41.4, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, GIT_BRANCH=main, GIT_REPO=https://github.com/ceph/ceph-container.git, io.openshift.tags=rhceph ceph, io.k8s.description=Red Hat Ceph Storage 7, ceph=True, version=7, vendor=Red Hat, Inc., io.openshift.expose-services=, distribution-scope=public, architecture=x86_64, url=https://catalog.redhat.com/en/search?searchType=containers, CEPH_POINT_RELEASE=, build-date=2025-11-26T19:44:28Z, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0) Dec 15 02:45:03 localhost systemd[1]: var-lib-containers-storage-overlay-6aac20ef032ee3c8c82713a354909535c2a0059b40b2dda9e2be9a7df426ed7a-merged.mount: Deactivated successfully. 
Dec 15 02:45:03 localhost podman[31295]: 2025-12-15 07:45:03.924964749 +0000 UTC m=+0.083474517 container remove dda9e8459583475bff3e2d545ae3d6cfc8681586e3bc4791610eebe739dd8fde (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-bce17446-41b5-5408-a23e-0b011906b44a-osd-0-activate, version=7, architecture=x86_64, vendor=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, GIT_CLEAN=True, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, io.k8s.description=Red Hat Ceph Storage 7, name=rhceph, com.redhat.component=rhceph-container, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.openshift.tags=rhceph ceph, CEPH_POINT_RELEASE=, vcs-type=git, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, RELEASE=main, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, build-date=2025-11-26T19:44:28Z, maintainer=Guillaume Abrioux , description=Red Hat Ceph Storage 7, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.openshift.expose-services=, GIT_BRANCH=main, GIT_REPO=https://github.com/ceph/ceph-container.git, ceph=True, release=1763362218) Dec 15 02:45:04 localhost podman[31356]: Dec 15 02:45:04 localhost podman[31356]: 2025-12-15 07:45:04.262464538 +0000 UTC m=+0.075131082 container create 6d224b41e65d71a9ec97754e5384ed3a0b595e3062ce65002e54faa196e92ceb (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-bce17446-41b5-5408-a23e-0b011906b44a-osd-0, description=Red Hat Ceph Storage 7, maintainer=Guillaume Abrioux , vcs-type=git, ceph=True, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_CLEAN=True, com.redhat.component=rhceph-container, build-date=2025-11-26T19:44:28Z, io.k8s.description=Red Hat Ceph Storage 7, vendor=Red 
Hat, Inc., name=rhceph, GIT_BRANCH=main, distribution-scope=public, io.buildah.version=1.41.4, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, release=1763362218, CEPH_POINT_RELEASE=, RELEASE=main, architecture=x86_64, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.expose-services=, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, GIT_REPO=https://github.com/ceph/ceph-container.git, version=7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.openshift.tags=rhceph ceph) Dec 15 02:45:04 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/55bf56500dd77908899ed0fcb47c8d18fcdfa1b7dcda01ec99573c005b5111fd/merged/rootfs supports timestamps until 2038 (0x7fffffff) Dec 15 02:45:04 localhost podman[31356]: 2025-12-15 07:45:04.232370935 +0000 UTC m=+0.045037479 image pull registry.redhat.io/rhceph/rhceph-7-rhel9:latest Dec 15 02:45:04 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/55bf56500dd77908899ed0fcb47c8d18fcdfa1b7dcda01ec99573c005b5111fd/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff) Dec 15 02:45:04 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/55bf56500dd77908899ed0fcb47c8d18fcdfa1b7dcda01ec99573c005b5111fd/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff) Dec 15 02:45:04 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/55bf56500dd77908899ed0fcb47c8d18fcdfa1b7dcda01ec99573c005b5111fd/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff) Dec 15 02:45:04 localhost kernel: xfs filesystem being remounted at 
/var/lib/containers/storage/overlay/55bf56500dd77908899ed0fcb47c8d18fcdfa1b7dcda01ec99573c005b5111fd/merged/var/lib/ceph/osd/ceph-0 supports timestamps until 2038 (0x7fffffff) Dec 15 02:45:04 localhost podman[31356]: 2025-12-15 07:45:04.383203543 +0000 UTC m=+0.195870087 container init 6d224b41e65d71a9ec97754e5384ed3a0b595e3062ce65002e54faa196e92ceb (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-bce17446-41b5-5408-a23e-0b011906b44a-osd-0, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat Ceph Storage 7, description=Red Hat Ceph Storage 7, name=rhceph, maintainer=Guillaume Abrioux , GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, architecture=x86_64, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, io.openshift.expose-services=, CEPH_POINT_RELEASE=, RELEASE=main, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., ceph=True, GIT_CLEAN=True, io.openshift.tags=rhceph ceph, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, vendor=Red Hat, Inc., com.redhat.component=rhceph-container, io.buildah.version=1.41.4, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, GIT_BRANCH=main, build-date=2025-11-26T19:44:28Z, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, GIT_REPO=https://github.com/ceph/ceph-container.git, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, release=1763362218, version=7) Dec 15 02:45:04 localhost podman[31356]: 2025-12-15 07:45:04.392552745 +0000 UTC m=+0.205219289 container start 6d224b41e65d71a9ec97754e5384ed3a0b595e3062ce65002e54faa196e92ceb (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-bce17446-41b5-5408-a23e-0b011906b44a-osd-0, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat Ceph Storage 7, GIT_CLEAN=True, 
name=rhceph, com.redhat.component=rhceph-container, CEPH_POINT_RELEASE=, architecture=x86_64, description=Red Hat Ceph Storage 7, build-date=2025-11-26T19:44:28Z, vcs-type=git, distribution-scope=public, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.openshift.expose-services=, release=1763362218, io.openshift.tags=rhceph ceph, url=https://catalog.redhat.com/en/search?searchType=containers, GIT_BRANCH=main, ceph=True, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_REPO=https://github.com/ceph/ceph-container.git, vendor=Red Hat, Inc., vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, version=7, io.buildah.version=1.41.4, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, RELEASE=main, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., maintainer=Guillaume Abrioux , io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9) Dec 15 02:45:04 localhost bash[31356]: 6d224b41e65d71a9ec97754e5384ed3a0b595e3062ce65002e54faa196e92ceb Dec 15 02:45:04 localhost systemd[1]: Started Ceph osd.0 for bce17446-41b5-5408-a23e-0b011906b44a. 
Dec 15 02:45:04 localhost ceph-osd[31375]: set uid:gid to 167:167 (ceph:ceph) Dec 15 02:45:04 localhost ceph-osd[31375]: ceph version 18.2.1-361.el9cp (439dcd6094d413840eb2ec590fe2194ec616687f) reef (stable), process ceph-osd, pid 2 Dec 15 02:45:04 localhost ceph-osd[31375]: pidfile_write: ignore empty --pid-file Dec 15 02:45:04 localhost ceph-osd[31375]: bdev(0x5568990eee00 /var/lib/ceph/osd/ceph-0/block) open path /var/lib/ceph/osd/ceph-0/block Dec 15 02:45:04 localhost ceph-osd[31375]: bdev(0x5568990eee00 /var/lib/ceph/osd/ceph-0/block) ioctl(F_SET_FILE_RW_HINT) on /var/lib/ceph/osd/ceph-0/block failed: (22) Invalid argument Dec 15 02:45:04 localhost ceph-osd[31375]: bdev(0x5568990eee00 /var/lib/ceph/osd/ceph-0/block) open size 7511998464 (0x1bfc00000, 7.0 GiB) block_size 4096 (4 KiB) rotational device, discard supported Dec 15 02:45:04 localhost ceph-osd[31375]: bluestore(/var/lib/ceph/osd/ceph-0) _set_cache_sizes cache_size 1073741824 meta 0.45 kv 0.45 data 0.06 Dec 15 02:45:04 localhost ceph-osd[31375]: bdev(0x5568990ef180 /var/lib/ceph/osd/ceph-0/block) open path /var/lib/ceph/osd/ceph-0/block Dec 15 02:45:04 localhost ceph-osd[31375]: bdev(0x5568990ef180 /var/lib/ceph/osd/ceph-0/block) ioctl(F_SET_FILE_RW_HINT) on /var/lib/ceph/osd/ceph-0/block failed: (22) Invalid argument Dec 15 02:45:04 localhost ceph-osd[31375]: bdev(0x5568990ef180 /var/lib/ceph/osd/ceph-0/block) open size 7511998464 (0x1bfc00000, 7.0 GiB) block_size 4096 (4 KiB) rotational device, discard supported Dec 15 02:45:04 localhost ceph-osd[31375]: bluefs add_block_device bdev 1 path /var/lib/ceph/osd/ceph-0/block size 7.0 GiB Dec 15 02:45:04 localhost ceph-osd[31375]: bdev(0x5568990ef180 /var/lib/ceph/osd/ceph-0/block) close Dec 15 02:45:04 localhost ceph-osd[31375]: bdev(0x5568990eee00 /var/lib/ceph/osd/ceph-0/block) close Dec 15 02:45:04 localhost ceph-osd[31375]: starting osd.0 osd_data /var/lib/ceph/osd/ceph-0 /var/lib/ceph/osd/ceph-0/journal Dec 15 02:45:04 localhost ceph-osd[31375]: 
load: jerasure load: lrc Dec 15 02:45:04 localhost ceph-osd[31375]: bdev(0x5568990eee00 /var/lib/ceph/osd/ceph-0/block) open path /var/lib/ceph/osd/ceph-0/block Dec 15 02:45:04 localhost ceph-osd[31375]: bdev(0x5568990eee00 /var/lib/ceph/osd/ceph-0/block) ioctl(F_SET_FILE_RW_HINT) on /var/lib/ceph/osd/ceph-0/block failed: (22) Invalid argument Dec 15 02:45:04 localhost ceph-osd[31375]: bdev(0x5568990eee00 /var/lib/ceph/osd/ceph-0/block) open size 7511998464 (0x1bfc00000, 7.0 GiB) block_size 4096 (4 KiB) rotational device, discard supported Dec 15 02:45:04 localhost ceph-osd[31375]: bluestore(/var/lib/ceph/osd/ceph-0) _set_cache_sizes cache_size 1073741824 meta 0.45 kv 0.45 data 0.06 Dec 15 02:45:04 localhost ceph-osd[31375]: bdev(0x5568990eee00 /var/lib/ceph/osd/ceph-0/block) close Dec 15 02:45:05 localhost podman[31464]: Dec 15 02:45:05 localhost ceph-osd[31375]: bdev(0x5568990eee00 /var/lib/ceph/osd/ceph-0/block) open path /var/lib/ceph/osd/ceph-0/block Dec 15 02:45:05 localhost ceph-osd[31375]: bdev(0x5568990eee00 /var/lib/ceph/osd/ceph-0/block) ioctl(F_SET_FILE_RW_HINT) on /var/lib/ceph/osd/ceph-0/block failed: (22) Invalid argument Dec 15 02:45:05 localhost ceph-osd[31375]: bdev(0x5568990eee00 /var/lib/ceph/osd/ceph-0/block) open size 7511998464 (0x1bfc00000, 7.0 GiB) block_size 4096 (4 KiB) rotational device, discard supported Dec 15 02:45:05 localhost ceph-osd[31375]: bluestore(/var/lib/ceph/osd/ceph-0) _set_cache_sizes cache_size 1073741824 meta 0.45 kv 0.45 data 0.06 Dec 15 02:45:05 localhost ceph-osd[31375]: bdev(0x5568990eee00 /var/lib/ceph/osd/ceph-0/block) close Dec 15 02:45:05 localhost podman[31464]: 2025-12-15 07:45:05.245723105 +0000 UTC m=+0.061877322 container create f46f7173f60a4d743ebe63a164393a79e0eae18c3e6aa61fea66b693498c09be (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=nervous_kowalevski, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, ceph=True, distribution-scope=public, version=7, io.buildah.version=1.41.4, 
build-date=2025-11-26T19:44:28Z, CEPH_POINT_RELEASE=, description=Red Hat Ceph Storage 7, RELEASE=main, com.redhat.component=rhceph-container, GIT_CLEAN=True, url=https://catalog.redhat.com/en/search?searchType=containers, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, maintainer=Guillaume Abrioux , io.openshift.expose-services=, release=1763362218, vendor=Red Hat, Inc., vcs-type=git, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, GIT_BRANCH=main, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat Ceph Storage 7, GIT_REPO=https://github.com/ceph/ceph-container.git, name=rhceph, architecture=x86_64, io.openshift.tags=rhceph ceph, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9) Dec 15 02:45:05 localhost systemd[1]: Started libpod-conmon-f46f7173f60a4d743ebe63a164393a79e0eae18c3e6aa61fea66b693498c09be.scope. Dec 15 02:45:05 localhost systemd[1]: Started libcrun container. 
Dec 15 02:45:05 localhost podman[31464]: 2025-12-15 07:45:05.312519843 +0000 UTC m=+0.128674090 container init f46f7173f60a4d743ebe63a164393a79e0eae18c3e6aa61fea66b693498c09be (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=nervous_kowalevski, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, io.openshift.expose-services=, version=7, build-date=2025-11-26T19:44:28Z, name=rhceph, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.openshift.tags=rhceph ceph, url=https://catalog.redhat.com/en/search?searchType=containers, architecture=x86_64, GIT_REPO=https://github.com/ceph/ceph-container.git, com.redhat.component=rhceph-container, CEPH_POINT_RELEASE=, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, RELEASE=main, ceph=True, io.k8s.description=Red Hat Ceph Storage 7, GIT_BRANCH=main, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.buildah.version=1.41.4, vcs-type=git, maintainer=Guillaume Abrioux , GIT_CLEAN=True, distribution-scope=public, vendor=Red Hat, Inc., release=1763362218, description=Red Hat Ceph Storage 7) Dec 15 02:45:05 localhost podman[31464]: 2025-12-15 07:45:05.214534603 +0000 UTC m=+0.030688880 image pull registry.redhat.io/rhceph/rhceph-7-rhel9:latest Dec 15 02:45:05 localhost podman[31464]: 2025-12-15 07:45:05.328455903 +0000 UTC m=+0.144610150 container start f46f7173f60a4d743ebe63a164393a79e0eae18c3e6aa61fea66b693498c09be (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=nervous_kowalevski, maintainer=Guillaume Abrioux , architecture=x86_64, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, CEPH_POINT_RELEASE=, GIT_CLEAN=True, io.buildah.version=1.41.4, com.redhat.component=rhceph-container, release=1763362218, 
io.openshift.tags=rhceph ceph, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, description=Red Hat Ceph Storage 7, vendor=Red Hat, Inc., GIT_BRANCH=main, name=rhceph, vcs-type=git, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., version=7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-11-26T19:44:28Z, ceph=True, RELEASE=main, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.openshift.expose-services=, url=https://catalog.redhat.com/en/search?searchType=containers, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, io.k8s.description=Red Hat Ceph Storage 7, distribution-scope=public, GIT_REPO=https://github.com/ceph/ceph-container.git) Dec 15 02:45:05 localhost podman[31464]: 2025-12-15 07:45:05.32872466 +0000 UTC m=+0.144878907 container attach f46f7173f60a4d743ebe63a164393a79e0eae18c3e6aa61fea66b693498c09be (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=nervous_kowalevski, io.openshift.expose-services=, url=https://catalog.redhat.com/en/search?searchType=containers, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, RELEASE=main, name=rhceph, com.redhat.component=rhceph-container, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., maintainer=Guillaume Abrioux , release=1763362218, architecture=x86_64, io.openshift.tags=rhceph ceph, version=7, description=Red Hat Ceph Storage 7, CEPH_POINT_RELEASE=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat Ceph Storage 7, distribution-scope=public, GIT_REPO=https://github.com/ceph/ceph-container.git, io.buildah.version=1.41.4, 
build-date=2025-11-26T19:44:28Z, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, ceph=True, GIT_BRANCH=main, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, vendor=Red Hat, Inc., vcs-type=git, GIT_CLEAN=True) Dec 15 02:45:05 localhost systemd[1]: libpod-f46f7173f60a4d743ebe63a164393a79e0eae18c3e6aa61fea66b693498c09be.scope: Deactivated successfully. Dec 15 02:45:05 localhost nervous_kowalevski[31483]: 167 167 Dec 15 02:45:05 localhost podman[31464]: 2025-12-15 07:45:05.332147278 +0000 UTC m=+0.148301525 container died f46f7173f60a4d743ebe63a164393a79e0eae18c3e6aa61fea66b693498c09be (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=nervous_kowalevski, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, RELEASE=main, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, build-date=2025-11-26T19:44:28Z, vendor=Red Hat, Inc., GIT_BRANCH=main, io.openshift.tags=rhceph ceph, version=7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_REPO=https://github.com/ceph/ceph-container.git, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., release=1763362218, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, com.redhat.component=rhceph-container, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, name=rhceph, vcs-type=git, ceph=True, maintainer=Guillaume Abrioux , description=Red Hat Ceph Storage 7, io.k8s.description=Red Hat Ceph Storage 7, GIT_CLEAN=True, url=https://catalog.redhat.com/en/search?searchType=containers, distribution-scope=public, io.buildah.version=1.41.4, CEPH_POINT_RELEASE=) Dec 15 02:45:05 localhost systemd[1]: tmp-crun.60Z8EO.mount: Deactivated successfully. 
Dec 15 02:45:05 localhost systemd[1]: var-lib-containers-storage-overlay-44a9d20e36f74c792f297b57d46d13d3cd5f614e5ff98fcc95eeb0243f5c2713-merged.mount: Deactivated successfully. Dec 15 02:45:05 localhost podman[31488]: 2025-12-15 07:45:05.424876943 +0000 UTC m=+0.079296410 container remove f46f7173f60a4d743ebe63a164393a79e0eae18c3e6aa61fea66b693498c09be (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=nervous_kowalevski, version=7, CEPH_POINT_RELEASE=, release=1763362218, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, io.k8s.description=Red Hat Ceph Storage 7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, architecture=x86_64, GIT_BRANCH=main, GIT_REPO=https://github.com/ceph/ceph-container.git, io.buildah.version=1.41.4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, vcs-type=git, vendor=Red Hat, Inc., GIT_CLEAN=True, description=Red Hat Ceph Storage 7, io.openshift.expose-services=, RELEASE=main, com.redhat.component=rhceph-container, distribution-scope=public, name=rhceph, io.openshift.tags=rhceph ceph, ceph=True, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., build-date=2025-11-26T19:44:28Z, maintainer=Guillaume Abrioux , org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0) Dec 15 02:45:05 localhost systemd[1]: libpod-conmon-f46f7173f60a4d743ebe63a164393a79e0eae18c3e6aa61fea66b693498c09be.scope: Deactivated successfully. 
Dec 15 02:45:05 localhost ceph-osd[31375]: mClockScheduler: set_osd_capacity_params_from_config: osd_bandwidth_cost_per_io: 499321.90 bytes/io, osd_bandwidth_capacity_per_shard 157286400.00 bytes/second Dec 15 02:45:05 localhost ceph-osd[31375]: osd.0:0.OSDShard using op scheduler mclock_scheduler, cutoff=196 Dec 15 02:45:05 localhost ceph-osd[31375]: bdev(0x5568990eee00 /var/lib/ceph/osd/ceph-0/block) open path /var/lib/ceph/osd/ceph-0/block Dec 15 02:45:05 localhost ceph-osd[31375]: bdev(0x5568990eee00 /var/lib/ceph/osd/ceph-0/block) ioctl(F_SET_FILE_RW_HINT) on /var/lib/ceph/osd/ceph-0/block failed: (22) Invalid argument Dec 15 02:45:05 localhost ceph-osd[31375]: bdev(0x5568990eee00 /var/lib/ceph/osd/ceph-0/block) open size 7511998464 (0x1bfc00000, 7.0 GiB) block_size 4096 (4 KiB) rotational device, discard supported Dec 15 02:45:05 localhost ceph-osd[31375]: bluestore(/var/lib/ceph/osd/ceph-0) _set_cache_sizes cache_size 1073741824 meta 0.45 kv 0.45 data 0.06 Dec 15 02:45:05 localhost ceph-osd[31375]: bdev(0x5568990ef180 /var/lib/ceph/osd/ceph-0/block) open path /var/lib/ceph/osd/ceph-0/block Dec 15 02:45:05 localhost ceph-osd[31375]: bdev(0x5568990ef180 /var/lib/ceph/osd/ceph-0/block) ioctl(F_SET_FILE_RW_HINT) on /var/lib/ceph/osd/ceph-0/block failed: (22) Invalid argument Dec 15 02:45:05 localhost ceph-osd[31375]: bdev(0x5568990ef180 /var/lib/ceph/osd/ceph-0/block) open size 7511998464 (0x1bfc00000, 7.0 GiB) block_size 4096 (4 KiB) rotational device, discard supported Dec 15 02:45:05 localhost ceph-osd[31375]: bluefs add_block_device bdev 1 path /var/lib/ceph/osd/ceph-0/block size 7.0 GiB Dec 15 02:45:05 localhost ceph-osd[31375]: bluefs mount Dec 15 02:45:05 localhost ceph-osd[31375]: bluefs _init_alloc shared, id 1, capacity 0x1bfc00000, block size 0x10000 Dec 15 02:45:05 localhost ceph-osd[31375]: bluefs mount shared_bdev_used = 0 Dec 15 02:45:05 localhost ceph-osd[31375]: bluestore(/var/lib/ceph/osd/ceph-0) _prepare_db_environment set db_paths to 
db,7136398540 db.slow,7136398540 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: RocksDB version: 7.9.2 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Git sha 0 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Compile date 2025-09-23 00:00:00 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: DB SUMMARY Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: DB Session ID: GVF0X3DWF49JJ2HDW5TD Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: CURRENT file: CURRENT Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: IDENTITY file: IDENTITY Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: MANIFEST file: MANIFEST-000032 size: 1007 Bytes Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: SST files in db dir, Total Num: 1, files: 000030.sst Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: SST files in db.slow dir, Total Num: 0, files: Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Write Ahead Log file in db.wal: 000031.log size: 5093 ; Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.error_if_exists: 0 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.create_if_missing: 0 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.paranoid_checks: 1 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.flush_verify_memtable_count: 1 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.track_and_verify_wals_in_manifest: 0 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.verify_sst_unique_id_in_manifest: 1 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.env: 0x556899382cb0 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.fs: LegacyFileSystem Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.info_log: 0x55689a08cd00 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.max_file_opening_threads: 16 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.statistics: (nil) Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.use_fsync: 0 Dec 15 02:45:05 
localhost ceph-osd[31375]: rocksdb: Options.max_log_file_size: 0 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.max_manifest_file_size: 1073741824 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.log_file_time_to_roll: 0 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.keep_log_file_num: 1000 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.recycle_log_file_num: 0 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.allow_fallocate: 1 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.allow_mmap_reads: 0 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.allow_mmap_writes: 0 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.use_direct_reads: 0 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.use_direct_io_for_flush_and_compaction: 0 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.create_missing_column_families: 0 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.db_log_dir: Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.wal_dir: db.wal Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.table_cache_numshardbits: 6 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.WAL_ttl_seconds: 0 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.WAL_size_limit_MB: 0 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.max_write_batch_group_size_bytes: 1048576 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.manifest_preallocation_size: 4194304 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.is_fd_close_on_exec: 1 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.advise_random_on_open: 1 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.db_write_buffer_size: 0 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.write_buffer_manager: 0x5568990d8140 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.access_hint_on_compaction_start: 1 Dec 15 02:45:05 localhost 
ceph-osd[31375]: rocksdb: Options.random_access_max_buffer_size: 1048576 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.use_adaptive_mutex: 0 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.rate_limiter: (nil) Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.sst_file_manager.rate_bytes_per_sec: 0 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.wal_recovery_mode: 2 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.enable_thread_tracking: 0 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.enable_pipelined_write: 0 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.unordered_write: 0 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.allow_concurrent_memtable_write: 1 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.enable_write_thread_adaptive_yield: 1 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.write_thread_max_yield_usec: 100 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.write_thread_slow_yield_usec: 3 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.row_cache: None Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.wal_filter: None Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.avoid_flush_during_recovery: 0 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.allow_ingest_behind: 0 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.two_write_queues: 0 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.manual_wal_flush: 0 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.wal_compression: 0 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.atomic_flush: 0 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.avoid_unnecessary_blocking_io: 0 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.persist_stats_to_disk: 0 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.write_dbid_to_manifest: 0 Dec 15 02:45:05 localhost ceph-osd[31375]: 
rocksdb: Options.log_readahead_size: 0 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.file_checksum_gen_factory: Unknown Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.best_efforts_recovery: 0 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.max_bgerror_resume_count: 2147483647 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.bgerror_resume_retry_interval: 1000000 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.allow_data_in_errors: 0 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.db_host_id: __hostname__ Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.enforce_single_del_contracts: true Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.max_background_jobs: 4 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.max_background_compactions: -1 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.max_subcompactions: 1 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.avoid_flush_during_shutdown: 0 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.writable_file_max_buffer_size: 0 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.delayed_write_rate : 16777216 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.max_total_wal_size: 1073741824 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.delete_obsolete_files_period_micros: 21600000000 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.stats_dump_period_sec: 600 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.stats_persist_period_sec: 600 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.stats_history_buffer_size: 1048576 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.max_open_files: -1 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.bytes_per_sync: 0 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.wal_bytes_per_sync: 0 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: 
Options.strict_bytes_per_sync: 0 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.compaction_readahead_size: 2097152 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.max_background_flushes: -1 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Compression algorithms supported: Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: #011kZSTD supported: 0 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: #011kXpressCompression supported: 0 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: #011kBZip2Compression supported: 0 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: #011kZSTDNotFinalCompression supported: 0 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: #011kLZ4Compression supported: 1 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: #011kZlibCompression supported: 1 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: #011kLZ4HCCompression supported: 1 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: #011kSnappyCompression supported: 1 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Fast CRC32 supported: Supported on x86 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: DMutex implementation: pthread_mutex_t Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: [db/db_impl/db_impl_readonly.cc:25] Opening the db in read only mode Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: [db/version_set.cc:5527] Recovering from manifest file: db/MANIFEST-000032 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: [db/column_family.cc:578] Failed to register data paths of column family (id: 0, name: default) Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [default]: Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.comparator: leveldb.BytewiseComparator Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.merge_operator: .T:int64_array.b:bitwise_xor Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.compaction_filter: None Dec 15 
02:45:05 localhost ceph-osd[31375]: rocksdb: Options.compaction_filter_factory: None Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.sst_partitioner_factory: None Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.memtable_factory: SkipListFactory Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.table_factory: BlockBasedTable Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: table_factory options: flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x55689a08cec0)#012 cache_index_and_filter_blocks: 1#012 cache_index_and_filter_blocks_with_high_priority: 0#012 pin_l0_filter_and_index_blocks_in_cache: 0#012 pin_top_level_index_and_filter: 1#012 index_type: 0#012 data_block_index_type: 0#012 index_shortening: 1#012 data_block_hash_table_util_ratio: 0.750000#012 checksum: 4#012 no_block_cache: 0#012 block_cache: 0x5568990c6850#012 block_cache_name: BinnedLRUCache#012 block_cache_options:#012 capacity : 483183820#012 num_shard_bits : 4#012 strict_capacity_limit : 0#012 high_pri_pool_ratio: 0.000#012 block_cache_compressed: (nil)#012 persistent_cache: (nil)#012 block_size: 4096#012 block_size_deviation: 10#012 block_restart_interval: 16#012 index_block_restart_interval: 1#012 metadata_block_size: 4096#012 partition_filters: 0#012 use_delta_encoding: 1#012 filter_policy: bloomfilter#012 whole_key_filtering: 1#012 verify_compression: 0#012 read_amp_bytes_per_bit: 0#012 format_version: 5#012 enable_index_compression: 1#012 block_align: 0#012 max_auto_readahead_size: 262144#012 prepopulate_block_cache: 0#012 initial_auto_readahead_size: 8192#012 num_file_reads_for_auto_readahead: 2 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.write_buffer_size: 16777216 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.max_write_buffer_number: 64 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.compression: LZ4 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.bottommost_compression: Disabled Dec 15 
02:45:05 localhost ceph-osd[31375]: rocksdb: Options.prefix_extractor: nullptr Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.memtable_insert_with_hint_prefix_extractor: nullptr Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.num_levels: 7 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.min_write_buffer_number_to_merge: 6 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.max_write_buffer_number_to_maintain: 0 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.max_write_buffer_size_to_maintain: 0 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.bottommost_compression_opts.window_bits: -14 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.bottommost_compression_opts.level: 32767 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.bottommost_compression_opts.strategy: 0 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.bottommost_compression_opts.max_dict_bytes: 0 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.bottommost_compression_opts.zstd_max_train_bytes: 0 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.bottommost_compression_opts.parallel_threads: 1 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.bottommost_compression_opts.enabled: false Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.bottommost_compression_opts.max_dict_buffer_bytes: 0 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.bottommost_compression_opts.use_zstd_dict_trainer: true Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.compression_opts.window_bits: -14 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.compression_opts.level: 32767 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.compression_opts.strategy: 0 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.compression_opts.max_dict_bytes: 0 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.compression_opts.zstd_max_train_bytes: 0 Dec 15 
02:45:05 localhost ceph-osd[31375]: rocksdb: Options.compression_opts.use_zstd_dict_trainer: true Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.compression_opts.parallel_threads: 1 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.compression_opts.enabled: false Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.compression_opts.max_dict_buffer_bytes: 0 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.level0_file_num_compaction_trigger: 8 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.level0_slowdown_writes_trigger: 20 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.level0_stop_writes_trigger: 36 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.target_file_size_base: 67108864 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.target_file_size_multiplier: 1 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.max_bytes_for_level_base: 1073741824 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.max_bytes_for_level_multiplier: 8.000000 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.max_sequential_skip_in_iterations: 8 Dec 15 02:45:05 localhost 
ceph-osd[31375]: rocksdb: Options.max_compaction_bytes: 1677721600 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.ignore_max_compaction_bytes_for_input: true Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.arena_block_size: 1048576 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.soft_pending_compaction_bytes_limit: 68719476736 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.hard_pending_compaction_bytes_limit: 274877906944 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.disable_auto_compactions: 0 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.compaction_style: kCompactionStyleLevel Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.compaction_pri: kMinOverlappingRatio Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.compaction_options_universal.size_ratio: 1 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.compaction_options_universal.min_merge_width: 2 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.table_properties_collectors: Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.inplace_update_support: 0 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.inplace_update_num_locks: 10000 Dec 15 02:45:05 localhost ceph-osd[31375]: 
rocksdb: Options.memtable_prefix_bloom_size_ratio: 0.000000 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.memtable_whole_key_filtering: 0 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.memtable_huge_page_size: 0 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.bloom_locality: 0 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.max_successive_merges: 0 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.optimize_filters_for_hits: 0 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.paranoid_file_checks: 0 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.force_consistency_checks: 1 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.report_bg_io_stats: 0 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.ttl: 2592000 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.periodic_compaction_seconds: 0 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.preclude_last_level_data_seconds: 0 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.preserve_internal_time_seconds: 0 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.enable_blob_files: false Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.min_blob_size: 0 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.blob_file_size: 268435456 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.blob_compression_type: NoCompression Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.enable_blob_garbage_collection: false Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.blob_garbage_collection_age_cutoff: 0.250000 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.blob_compaction_readahead_size: 0 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.blob_file_starting_level: 0 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: 
Options.experimental_mempurge_threshold: 0.000000 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: [db/column_family.cc:578] Failed to register data paths of column family (id: 1, name: m-0) Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [m-0]: Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.comparator: leveldb.BytewiseComparator Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.merge_operator: None Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.compaction_filter: None Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.compaction_filter_factory: None Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.sst_partitioner_factory: None Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.memtable_factory: SkipListFactory Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.table_factory: BlockBasedTable Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: table_factory options: flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x55689a08cec0)#012 cache_index_and_filter_blocks: 1#012 cache_index_and_filter_blocks_with_high_priority: 0#012 pin_l0_filter_and_index_blocks_in_cache: 0#012 pin_top_level_index_and_filter: 1#012 index_type: 0#012 data_block_index_type: 0#012 index_shortening: 1#012 data_block_hash_table_util_ratio: 0.750000#012 checksum: 4#012 no_block_cache: 0#012 block_cache: 0x5568990c6850#012 block_cache_name: BinnedLRUCache#012 block_cache_options:#012 capacity : 483183820#012 num_shard_bits : 4#012 strict_capacity_limit : 0#012 high_pri_pool_ratio: 0.000#012 block_cache_compressed: (nil)#012 persistent_cache: (nil)#012 block_size: 4096#012 block_size_deviation: 10#012 block_restart_interval: 16#012 index_block_restart_interval: 1#012 metadata_block_size: 4096#012 partition_filters: 0#012 use_delta_encoding: 1#012 filter_policy: bloomfilter#012 whole_key_filtering: 1#012 verify_compression: 0#012 
read_amp_bytes_per_bit: 0#012 format_version: 5#012 enable_index_compression: 1#012 block_align: 0#012 max_auto_readahead_size: 262144#012 prepopulate_block_cache: 0#012 initial_auto_readahead_size: 8192#012 num_file_reads_for_auto_readahead: 2 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.write_buffer_size: 16777216 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.max_write_buffer_number: 64 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.compression: LZ4 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.bottommost_compression: Disabled Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.prefix_extractor: nullptr Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.memtable_insert_with_hint_prefix_extractor: nullptr Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.num_levels: 7 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.min_write_buffer_number_to_merge: 6 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.max_write_buffer_number_to_maintain: 0 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.max_write_buffer_size_to_maintain: 0 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.bottommost_compression_opts.window_bits: -14 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.bottommost_compression_opts.level: 32767 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.bottommost_compression_opts.strategy: 0 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.bottommost_compression_opts.max_dict_bytes: 0 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.bottommost_compression_opts.zstd_max_train_bytes: 0 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.bottommost_compression_opts.parallel_threads: 1 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.bottommost_compression_opts.enabled: false Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: 
Options.bottommost_compression_opts.max_dict_buffer_bytes: 0 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.bottommost_compression_opts.use_zstd_dict_trainer: true Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.compression_opts.window_bits: -14 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.compression_opts.level: 32767 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.compression_opts.strategy: 0 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.compression_opts.max_dict_bytes: 0 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.compression_opts.zstd_max_train_bytes: 0 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.compression_opts.use_zstd_dict_trainer: true Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.compression_opts.parallel_threads: 1 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.compression_opts.enabled: false Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.compression_opts.max_dict_buffer_bytes: 0 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.level0_file_num_compaction_trigger: 8 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.level0_slowdown_writes_trigger: 20 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.level0_stop_writes_trigger: 36 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.target_file_size_base: 67108864 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.target_file_size_multiplier: 1 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.max_bytes_for_level_base: 1073741824 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.max_bytes_for_level_multiplier: 8.000000 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: 
Options.max_bytes_for_level_multiplier_addtl[1]: 1 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.max_sequential_skip_in_iterations: 8 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.max_compaction_bytes: 1677721600 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.ignore_max_compaction_bytes_for_input: true Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.arena_block_size: 1048576 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.soft_pending_compaction_bytes_limit: 68719476736 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.hard_pending_compaction_bytes_limit: 274877906944 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.disable_auto_compactions: 0 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.compaction_style: kCompactionStyleLevel Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.compaction_pri: kMinOverlappingRatio Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.compaction_options_universal.size_ratio: 1 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.compaction_options_universal.min_merge_width: 2 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: 
Options.compaction_options_universal.compression_size_percent: -1 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0); Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.inplace_update_support: 0 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.inplace_update_num_locks: 10000 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.memtable_prefix_bloom_size_ratio: 0.000000 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.memtable_whole_key_filtering: 0 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.memtable_huge_page_size: 0 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.bloom_locality: 0 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.max_successive_merges: 0 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.optimize_filters_for_hits: 0 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.paranoid_file_checks: 0 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.force_consistency_checks: 1 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.report_bg_io_stats: 0 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.ttl: 2592000 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.periodic_compaction_seconds: 0 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.preclude_last_level_data_seconds: 0 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.preserve_internal_time_seconds: 0 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: 
Options.enable_blob_files: false Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.min_blob_size: 0 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.blob_file_size: 268435456 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.blob_compression_type: NoCompression Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.enable_blob_garbage_collection: false Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.blob_garbage_collection_age_cutoff: 0.250000 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.blob_compaction_readahead_size: 0 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.blob_file_starting_level: 0 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.experimental_mempurge_threshold: 0.000000 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: [db/column_family.cc:578] Failed to register data paths of column family (id: 2, name: m-1) Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [m-1]: Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.comparator: leveldb.BytewiseComparator Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.merge_operator: None Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.compaction_filter: None Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.compaction_filter_factory: None Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.sst_partitioner_factory: None Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.memtable_factory: SkipListFactory Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.table_factory: BlockBasedTable Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: table_factory options: flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x55689a08cec0)#012 cache_index_and_filter_blocks: 1#012 
cache_index_and_filter_blocks_with_high_priority: 0#012 pin_l0_filter_and_index_blocks_in_cache: 0#012 pin_top_level_index_and_filter: 1#012 index_type: 0#012 data_block_index_type: 0#012 index_shortening: 1#012 data_block_hash_table_util_ratio: 0.750000#012 checksum: 4#012 no_block_cache: 0#012 block_cache: 0x5568990c6850#012 block_cache_name: BinnedLRUCache#012 block_cache_options:#012 capacity : 483183820#012 num_shard_bits : 4#012 strict_capacity_limit : 0#012 high_pri_pool_ratio: 0.000#012 block_cache_compressed: (nil)#012 persistent_cache: (nil)#012 block_size: 4096#012 block_size_deviation: 10#012 block_restart_interval: 16#012 index_block_restart_interval: 1#012 metadata_block_size: 4096#012 partition_filters: 0#012 use_delta_encoding: 1#012 filter_policy: bloomfilter#012 whole_key_filtering: 1#012 verify_compression: 0#012 read_amp_bytes_per_bit: 0#012 format_version: 5#012 enable_index_compression: 1#012 block_align: 0#012 max_auto_readahead_size: 262144#012 prepopulate_block_cache: 0#012 initial_auto_readahead_size: 8192#012 num_file_reads_for_auto_readahead: 2 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.write_buffer_size: 16777216 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.max_write_buffer_number: 64 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.compression: LZ4 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.bottommost_compression: Disabled Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.prefix_extractor: nullptr Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.memtable_insert_with_hint_prefix_extractor: nullptr Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.num_levels: 7 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.min_write_buffer_number_to_merge: 6 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.max_write_buffer_number_to_maintain: 0 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.max_write_buffer_size_to_maintain: 0 
Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.bottommost_compression_opts.window_bits: -14 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.bottommost_compression_opts.level: 32767 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.bottommost_compression_opts.strategy: 0 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.bottommost_compression_opts.max_dict_bytes: 0 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.bottommost_compression_opts.zstd_max_train_bytes: 0 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.bottommost_compression_opts.parallel_threads: 1 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.bottommost_compression_opts.enabled: false Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.bottommost_compression_opts.max_dict_buffer_bytes: 0 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.bottommost_compression_opts.use_zstd_dict_trainer: true Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.compression_opts.window_bits: -14 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.compression_opts.level: 32767 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.compression_opts.strategy: 0 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.compression_opts.max_dict_bytes: 0 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.compression_opts.zstd_max_train_bytes: 0 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.compression_opts.use_zstd_dict_trainer: true Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.compression_opts.parallel_threads: 1 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.compression_opts.enabled: false Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.compression_opts.max_dict_buffer_bytes: 0 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.level0_file_num_compaction_trigger: 8 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: 
Options.level0_slowdown_writes_trigger: 20 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.level0_stop_writes_trigger: 36 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.target_file_size_base: 67108864 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.target_file_size_multiplier: 1 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.max_bytes_for_level_base: 1073741824 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.max_bytes_for_level_multiplier: 8.000000 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.max_sequential_skip_in_iterations: 8 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.max_compaction_bytes: 1677721600 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.ignore_max_compaction_bytes_for_input: true Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.arena_block_size: 1048576 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.soft_pending_compaction_bytes_limit: 68719476736 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.hard_pending_compaction_bytes_limit: 274877906944 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: 
Options.disable_auto_compactions: 0 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.compaction_style: kCompactionStyleLevel Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.compaction_pri: kMinOverlappingRatio Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.compaction_options_universal.size_ratio: 1 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.compaction_options_universal.min_merge_width: 2 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0); Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.inplace_update_support: 0 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.inplace_update_num_locks: 10000 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.memtable_prefix_bloom_size_ratio: 0.000000 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.memtable_whole_key_filtering: 0 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.memtable_huge_page_size: 0 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.bloom_locality: 0 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.max_successive_merges: 0 Dec 15 02:45:05 localhost 
ceph-osd[31375]: rocksdb: Options.optimize_filters_for_hits: 0 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.paranoid_file_checks: 0 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.force_consistency_checks: 1 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.report_bg_io_stats: 0 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.ttl: 2592000 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.periodic_compaction_seconds: 0 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.preclude_last_level_data_seconds: 0 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.preserve_internal_time_seconds: 0 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.enable_blob_files: false Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.min_blob_size: 0 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.blob_file_size: 268435456 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.blob_compression_type: NoCompression Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.enable_blob_garbage_collection: false Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.blob_garbage_collection_age_cutoff: 0.250000 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.blob_compaction_readahead_size: 0 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.blob_file_starting_level: 0 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.experimental_mempurge_threshold: 0.000000 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: [db/column_family.cc:578] Failed to register data paths of column family (id: 3, name: m-2) Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [m-2]: Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.comparator: leveldb.BytewiseComparator Dec 15 02:45:05 
localhost ceph-osd[31375]: rocksdb: Options.merge_operator: None Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.compaction_filter: None Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.compaction_filter_factory: None Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.sst_partitioner_factory: None Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.memtable_factory: SkipListFactory Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.table_factory: BlockBasedTable Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: table_factory options: flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x55689a08cec0)#012 cache_index_and_filter_blocks: 1#012 cache_index_and_filter_blocks_with_high_priority: 0#012 pin_l0_filter_and_index_blocks_in_cache: 0#012 pin_top_level_index_and_filter: 1#012 index_type: 0#012 data_block_index_type: 0#012 index_shortening: 1#012 data_block_hash_table_util_ratio: 0.750000#012 checksum: 4#012 no_block_cache: 0#012 block_cache: 0x5568990c6850#012 block_cache_name: BinnedLRUCache#012 block_cache_options:#012 capacity : 483183820#012 num_shard_bits : 4#012 strict_capacity_limit : 0#012 high_pri_pool_ratio: 0.000#012 block_cache_compressed: (nil)#012 persistent_cache: (nil)#012 block_size: 4096#012 block_size_deviation: 10#012 block_restart_interval: 16#012 index_block_restart_interval: 1#012 metadata_block_size: 4096#012 partition_filters: 0#012 use_delta_encoding: 1#012 filter_policy: bloomfilter#012 whole_key_filtering: 1#012 verify_compression: 0#012 read_amp_bytes_per_bit: 0#012 format_version: 5#012 enable_index_compression: 1#012 block_align: 0#012 max_auto_readahead_size: 262144#012 prepopulate_block_cache: 0#012 initial_auto_readahead_size: 8192#012 num_file_reads_for_auto_readahead: 2 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.write_buffer_size: 16777216 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.max_write_buffer_number: 64 Dec 15 02:45:05 localhost 
ceph-osd[31375]: rocksdb: Options.compression: LZ4 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.bottommost_compression: Disabled Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.prefix_extractor: nullptr Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.memtable_insert_with_hint_prefix_extractor: nullptr Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.num_levels: 7 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.min_write_buffer_number_to_merge: 6 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.max_write_buffer_number_to_maintain: 0 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.max_write_buffer_size_to_maintain: 0 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.bottommost_compression_opts.window_bits: -14 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.bottommost_compression_opts.level: 32767 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.bottommost_compression_opts.strategy: 0 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.bottommost_compression_opts.max_dict_bytes: 0 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.bottommost_compression_opts.zstd_max_train_bytes: 0 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.bottommost_compression_opts.parallel_threads: 1 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.bottommost_compression_opts.enabled: false Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.bottommost_compression_opts.max_dict_buffer_bytes: 0 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.bottommost_compression_opts.use_zstd_dict_trainer: true Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.compression_opts.window_bits: -14 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.compression_opts.level: 32767 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.compression_opts.strategy: 0 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: 
Options.compression_opts.max_dict_bytes: 0 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.compression_opts.zstd_max_train_bytes: 0 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.compression_opts.use_zstd_dict_trainer: true Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.compression_opts.parallel_threads: 1 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.compression_opts.enabled: false Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.compression_opts.max_dict_buffer_bytes: 0 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.level0_file_num_compaction_trigger: 8 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.level0_slowdown_writes_trigger: 20 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.level0_stop_writes_trigger: 36 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.target_file_size_base: 67108864 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.target_file_size_multiplier: 1 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.max_bytes_for_level_base: 1073741824 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.max_bytes_for_level_multiplier: 8.000000 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: 
Options.max_bytes_for_level_multiplier_addtl[6]: 1 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.max_sequential_skip_in_iterations: 8 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.max_compaction_bytes: 1677721600 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.ignore_max_compaction_bytes_for_input: true Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.arena_block_size: 1048576 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.soft_pending_compaction_bytes_limit: 68719476736 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.hard_pending_compaction_bytes_limit: 274877906944 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.disable_auto_compactions: 0 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.compaction_style: kCompactionStyleLevel Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.compaction_pri: kMinOverlappingRatio Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.compaction_options_universal.size_ratio: 1 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.compaction_options_universal.min_merge_width: 2 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 
32768 Deletion trigger = 16384 Deletion ratio = 0); Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.inplace_update_support: 0 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.inplace_update_num_locks: 10000 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.memtable_prefix_bloom_size_ratio: 0.000000 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.memtable_whole_key_filtering: 0 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.memtable_huge_page_size: 0 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.bloom_locality: 0 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.max_successive_merges: 0 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.optimize_filters_for_hits: 0 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.paranoid_file_checks: 0 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.force_consistency_checks: 1 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.report_bg_io_stats: 0 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.ttl: 2592000 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.periodic_compaction_seconds: 0 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.preclude_last_level_data_seconds: 0 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.preserve_internal_time_seconds: 0 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.enable_blob_files: false Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.min_blob_size: 0 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.blob_file_size: 268435456 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.blob_compression_type: NoCompression Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.enable_blob_garbage_collection: false Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.blob_garbage_collection_age_cutoff: 0.250000 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: 
Options.blob_garbage_collection_force_threshold: 1.000000 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.blob_compaction_readahead_size: 0 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.blob_file_starting_level: 0 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.experimental_mempurge_threshold: 0.000000 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: [db/column_family.cc:578] Failed to register data paths of column family (id: 4, name: p-0) Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [p-0]: Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.comparator: leveldb.BytewiseComparator Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.merge_operator: None Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.compaction_filter: None Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.compaction_filter_factory: None Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.sst_partitioner_factory: None Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.memtable_factory: SkipListFactory Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.table_factory: BlockBasedTable Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: table_factory options: flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x55689a08cec0)#012 cache_index_and_filter_blocks: 1#012 cache_index_and_filter_blocks_with_high_priority: 0#012 pin_l0_filter_and_index_blocks_in_cache: 0#012 pin_top_level_index_and_filter: 1#012 index_type: 0#012 data_block_index_type: 0#012 index_shortening: 1#012 data_block_hash_table_util_ratio: 0.750000#012 checksum: 4#012 no_block_cache: 0#012 block_cache: 0x5568990c6850#012 block_cache_name: BinnedLRUCache#012 block_cache_options:#012 capacity : 483183820#012 num_shard_bits : 4#012 strict_capacity_limit : 0#012 high_pri_pool_ratio: 0.000#012 block_cache_compressed: (nil)#012 persistent_cache: (nil)#012 
block_size: 4096#012 block_size_deviation: 10#012 block_restart_interval: 16#012 index_block_restart_interval: 1#012 metadata_block_size: 4096#012 partition_filters: 0#012 use_delta_encoding: 1#012 filter_policy: bloomfilter#012 whole_key_filtering: 1#012 verify_compression: 0#012 read_amp_bytes_per_bit: 0#012 format_version: 5#012 enable_index_compression: 1#012 block_align: 0#012 max_auto_readahead_size: 262144#012 prepopulate_block_cache: 0#012 initial_auto_readahead_size: 8192#012 num_file_reads_for_auto_readahead: 2 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.write_buffer_size: 16777216 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.max_write_buffer_number: 64 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.compression: LZ4 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.bottommost_compression: Disabled Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.prefix_extractor: nullptr Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.memtable_insert_with_hint_prefix_extractor: nullptr Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.num_levels: 7 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.min_write_buffer_number_to_merge: 6 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.max_write_buffer_number_to_maintain: 0 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.max_write_buffer_size_to_maintain: 0 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.bottommost_compression_opts.window_bits: -14 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.bottommost_compression_opts.level: 32767 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.bottommost_compression_opts.strategy: 0 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.bottommost_compression_opts.max_dict_bytes: 0 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.bottommost_compression_opts.zstd_max_train_bytes: 0 Dec 15 02:45:05 localhost 
ceph-osd[31375]: rocksdb: Options.bottommost_compression_opts.parallel_threads: 1 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.bottommost_compression_opts.enabled: false Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.bottommost_compression_opts.max_dict_buffer_bytes: 0 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.bottommost_compression_opts.use_zstd_dict_trainer: true Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.compression_opts.window_bits: -14 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.compression_opts.level: 32767 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.compression_opts.strategy: 0 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.compression_opts.max_dict_bytes: 0 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.compression_opts.zstd_max_train_bytes: 0 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.compression_opts.use_zstd_dict_trainer: true Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.compression_opts.parallel_threads: 1 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.compression_opts.enabled: false Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.compression_opts.max_dict_buffer_bytes: 0 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.level0_file_num_compaction_trigger: 8 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.level0_slowdown_writes_trigger: 20 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.level0_stop_writes_trigger: 36 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.target_file_size_base: 67108864 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.target_file_size_multiplier: 1 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.max_bytes_for_level_base: 1073741824 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: 
Options.max_bytes_for_level_multiplier: 8.000000 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.max_sequential_skip_in_iterations: 8 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.max_compaction_bytes: 1677721600 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.ignore_max_compaction_bytes_for_input: true Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.arena_block_size: 1048576 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.soft_pending_compaction_bytes_limit: 68719476736 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.hard_pending_compaction_bytes_limit: 274877906944 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.disable_auto_compactions: 0 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.compaction_style: kCompactionStyleLevel Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.compaction_pri: kMinOverlappingRatio Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.compaction_options_universal.size_ratio: 1 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.compaction_options_universal.min_merge_width: 2 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295 Dec 15 02:45:05 
localhost ceph-osd[31375]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0); Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.inplace_update_support: 0 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.inplace_update_num_locks: 10000 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.memtable_prefix_bloom_size_ratio: 0.000000 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.memtable_whole_key_filtering: 0 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.memtable_huge_page_size: 0 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.bloom_locality: 0 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.max_successive_merges: 0 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.optimize_filters_for_hits: 0 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.paranoid_file_checks: 0 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.force_consistency_checks: 1 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.report_bg_io_stats: 0 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.ttl: 2592000 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.periodic_compaction_seconds: 0 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.preclude_last_level_data_seconds: 0 Dec 
15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.preserve_internal_time_seconds: 0 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.enable_blob_files: false Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.min_blob_size: 0 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.blob_file_size: 268435456 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.blob_compression_type: NoCompression Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.enable_blob_garbage_collection: false Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.blob_garbage_collection_age_cutoff: 0.250000 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.blob_compaction_readahead_size: 0 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.blob_file_starting_level: 0 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.experimental_mempurge_threshold: 0.000000 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: [db/column_family.cc:578] Failed to register data paths of column family (id: 5, name: p-1) Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [p-1]: Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.comparator: leveldb.BytewiseComparator Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.merge_operator: None Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.compaction_filter: None Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.compaction_filter_factory: None Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.sst_partitioner_factory: None Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.memtable_factory: SkipListFactory Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.table_factory: BlockBasedTable Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: table_factory 
options: flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x55689a08cec0)#012 cache_index_and_filter_blocks: 1#012 cache_index_and_filter_blocks_with_high_priority: 0#012 pin_l0_filter_and_index_blocks_in_cache: 0#012 pin_top_level_index_and_filter: 1#012 index_type: 0#012 data_block_index_type: 0#012 index_shortening: 1#012 data_block_hash_table_util_ratio: 0.750000#012 checksum: 4#012 no_block_cache: 0#012 block_cache: 0x5568990c6850#012 block_cache_name: BinnedLRUCache#012 block_cache_options:#012 capacity : 483183820#012 num_shard_bits : 4#012 strict_capacity_limit : 0#012 high_pri_pool_ratio: 0.000#012 block_cache_compressed: (nil)#012 persistent_cache: (nil)#012 block_size: 4096#012 block_size_deviation: 10#012 block_restart_interval: 16#012 index_block_restart_interval: 1#012 metadata_block_size: 4096#012 partition_filters: 0#012 use_delta_encoding: 1#012 filter_policy: bloomfilter#012 whole_key_filtering: 1#012 verify_compression: 0#012 read_amp_bytes_per_bit: 0#012 format_version: 5#012 enable_index_compression: 1#012 block_align: 0#012 max_auto_readahead_size: 262144#012 prepopulate_block_cache: 0#012 initial_auto_readahead_size: 8192#012 num_file_reads_for_auto_readahead: 2 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.write_buffer_size: 16777216 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.max_write_buffer_number: 64 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.compression: LZ4 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.bottommost_compression: Disabled Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.prefix_extractor: nullptr Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.memtable_insert_with_hint_prefix_extractor: nullptr Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.num_levels: 7 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.min_write_buffer_number_to_merge: 6 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: 
Options.max_write_buffer_number_to_maintain: 0 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.max_write_buffer_size_to_maintain: 0 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.bottommost_compression_opts.window_bits: -14 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.bottommost_compression_opts.level: 32767 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.bottommost_compression_opts.strategy: 0 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.bottommost_compression_opts.max_dict_bytes: 0 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.bottommost_compression_opts.zstd_max_train_bytes: 0 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.bottommost_compression_opts.parallel_threads: 1 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.bottommost_compression_opts.enabled: false Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.bottommost_compression_opts.max_dict_buffer_bytes: 0 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.bottommost_compression_opts.use_zstd_dict_trainer: true Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.compression_opts.window_bits: -14 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.compression_opts.level: 32767 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.compression_opts.strategy: 0 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.compression_opts.max_dict_bytes: 0 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.compression_opts.zstd_max_train_bytes: 0 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.compression_opts.use_zstd_dict_trainer: true Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.compression_opts.parallel_threads: 1 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.compression_opts.enabled: false Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.compression_opts.max_dict_buffer_bytes: 0 Dec 15 02:45:05 localhost 
ceph-osd[31375]: rocksdb: Options.level0_file_num_compaction_trigger: 8 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.level0_slowdown_writes_trigger: 20 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.level0_stop_writes_trigger: 36 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.target_file_size_base: 67108864 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.target_file_size_multiplier: 1 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.max_bytes_for_level_base: 1073741824 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.max_bytes_for_level_multiplier: 8.000000 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.max_sequential_skip_in_iterations: 8 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.max_compaction_bytes: 1677721600 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.ignore_max_compaction_bytes_for_input: true Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.arena_block_size: 1048576 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.soft_pending_compaction_bytes_limit: 68719476736 Dec 15 02:45:05 localhost ceph-osd[31375]: 
rocksdb: Options.hard_pending_compaction_bytes_limit: 274877906944 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.disable_auto_compactions: 0 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.compaction_style: kCompactionStyleLevel Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.compaction_pri: kMinOverlappingRatio Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.compaction_options_universal.size_ratio: 1 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.compaction_options_universal.min_merge_width: 2 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0); Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.inplace_update_support: 0 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.inplace_update_num_locks: 10000 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.memtable_prefix_bloom_size_ratio: 0.000000 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.memtable_whole_key_filtering: 0 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.memtable_huge_page_size: 0 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: 
Options.bloom_locality: 0 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.max_successive_merges: 0 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.optimize_filters_for_hits: 0 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.paranoid_file_checks: 0 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.force_consistency_checks: 1 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.report_bg_io_stats: 0 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.ttl: 2592000 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.periodic_compaction_seconds: 0 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.preclude_last_level_data_seconds: 0 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.preserve_internal_time_seconds: 0 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.enable_blob_files: false Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.min_blob_size: 0 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.blob_file_size: 268435456 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.blob_compression_type: NoCompression Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.enable_blob_garbage_collection: false Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.blob_garbage_collection_age_cutoff: 0.250000 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.blob_compaction_readahead_size: 0 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.blob_file_starting_level: 0 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.experimental_mempurge_threshold: 0.000000 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: [db/column_family.cc:578] Failed to register data paths of column family (id: 6, name: p-2) Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: [db/column_family.cc:630] --------------- Options for 
column family [p-2]: Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.comparator: leveldb.BytewiseComparator Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.merge_operator: None Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.compaction_filter: None Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.compaction_filter_factory: None Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.sst_partitioner_factory: None Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.memtable_factory: SkipListFactory Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.table_factory: BlockBasedTable Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: table_factory options: flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x55689a08cec0)#012 cache_index_and_filter_blocks: 1#012 cache_index_and_filter_blocks_with_high_priority: 0#012 pin_l0_filter_and_index_blocks_in_cache: 0#012 pin_top_level_index_and_filter: 1#012 index_type: 0#012 data_block_index_type: 0#012 index_shortening: 1#012 data_block_hash_table_util_ratio: 0.750000#012 checksum: 4#012 no_block_cache: 0#012 block_cache: 0x5568990c6850#012 block_cache_name: BinnedLRUCache#012 block_cache_options:#012 capacity : 483183820#012 num_shard_bits : 4#012 strict_capacity_limit : 0#012 high_pri_pool_ratio: 0.000#012 block_cache_compressed: (nil)#012 persistent_cache: (nil)#012 block_size: 4096#012 block_size_deviation: 10#012 block_restart_interval: 16#012 index_block_restart_interval: 1#012 metadata_block_size: 4096#012 partition_filters: 0#012 use_delta_encoding: 1#012 filter_policy: bloomfilter#012 whole_key_filtering: 1#012 verify_compression: 0#012 read_amp_bytes_per_bit: 0#012 format_version: 5#012 enable_index_compression: 1#012 block_align: 0#012 max_auto_readahead_size: 262144#012 prepopulate_block_cache: 0#012 initial_auto_readahead_size: 8192#012 num_file_reads_for_auto_readahead: 2 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: 
Options.write_buffer_size: 16777216 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.max_write_buffer_number: 64 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.compression: LZ4 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.bottommost_compression: Disabled Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.prefix_extractor: nullptr Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.memtable_insert_with_hint_prefix_extractor: nullptr Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.num_levels: 7 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.min_write_buffer_number_to_merge: 6 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.max_write_buffer_number_to_maintain: 0 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.max_write_buffer_size_to_maintain: 0 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.bottommost_compression_opts.window_bits: -14 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.bottommost_compression_opts.level: 32767 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.bottommost_compression_opts.strategy: 0 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.bottommost_compression_opts.max_dict_bytes: 0 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.bottommost_compression_opts.zstd_max_train_bytes: 0 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.bottommost_compression_opts.parallel_threads: 1 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.bottommost_compression_opts.enabled: false Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.bottommost_compression_opts.max_dict_buffer_bytes: 0 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.bottommost_compression_opts.use_zstd_dict_trainer: true Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.compression_opts.window_bits: -14 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.compression_opts.level: 
32767 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.compression_opts.strategy: 0 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.compression_opts.max_dict_bytes: 0 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.compression_opts.zstd_max_train_bytes: 0 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.compression_opts.use_zstd_dict_trainer: true Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.compression_opts.parallel_threads: 1 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.compression_opts.enabled: false Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.compression_opts.max_dict_buffer_bytes: 0 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.level0_file_num_compaction_trigger: 8 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.level0_slowdown_writes_trigger: 20 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.level0_stop_writes_trigger: 36 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.target_file_size_base: 67108864 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.target_file_size_multiplier: 1 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.max_bytes_for_level_base: 1073741824 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.max_bytes_for_level_multiplier: 8.000000 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1 Dec 15 02:45:05 localhost 
ceph-osd[31375]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.max_sequential_skip_in_iterations: 8 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.max_compaction_bytes: 1677721600 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.ignore_max_compaction_bytes_for_input: true Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.arena_block_size: 1048576 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.soft_pending_compaction_bytes_limit: 68719476736 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.hard_pending_compaction_bytes_limit: 274877906944 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.disable_auto_compactions: 0 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.compaction_style: kCompactionStyleLevel Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.compaction_pri: kMinOverlappingRatio Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.compaction_options_universal.size_ratio: 1 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.compaction_options_universal.min_merge_width: 2 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0 Dec 15 
02:45:05 localhost ceph-osd[31375]: rocksdb: Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0); Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.inplace_update_support: 0 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.inplace_update_num_locks: 10000 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.memtable_prefix_bloom_size_ratio: 0.000000 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.memtable_whole_key_filtering: 0 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.memtable_huge_page_size: 0 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.bloom_locality: 0 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.max_successive_merges: 0 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.optimize_filters_for_hits: 0 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.paranoid_file_checks: 0 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.force_consistency_checks: 1 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.report_bg_io_stats: 0 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.ttl: 2592000 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.periodic_compaction_seconds: 0 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.preclude_last_level_data_seconds: 0 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.preserve_internal_time_seconds: 0 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.enable_blob_files: false Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.min_blob_size: 0 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.blob_file_size: 268435456 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.blob_compression_type: NoCompression Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.enable_blob_garbage_collection: false Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: 
Options.blob_garbage_collection_age_cutoff: 0.250000 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.blob_compaction_readahead_size: 0 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.blob_file_starting_level: 0 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.experimental_mempurge_threshold: 0.000000 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: [db/column_family.cc:578] Failed to register data paths of column family (id: 7, name: O-0) Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [O-0]: Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.comparator: leveldb.BytewiseComparator Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.merge_operator: None Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.compaction_filter: None Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.compaction_filter_factory: None Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.sst_partitioner_factory: None Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.memtable_factory: SkipListFactory Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.table_factory: BlockBasedTable Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: table_factory options: flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x55689a08d0e0)#012 cache_index_and_filter_blocks: 1#012 cache_index_and_filter_blocks_with_high_priority: 0#012 pin_l0_filter_and_index_blocks_in_cache: 0#012 pin_top_level_index_and_filter: 1#012 index_type: 0#012 data_block_index_type: 0#012 index_shortening: 1#012 data_block_hash_table_util_ratio: 0.750000#012 checksum: 4#012 no_block_cache: 0#012 block_cache: 0x5568990c62d0#012 block_cache_name: BinnedLRUCache#012 block_cache_options:#012 capacity : 536870912#012 num_shard_bits : 4#012 
strict_capacity_limit : 0#012 high_pri_pool_ratio: 0.000#012 block_cache_compressed: (nil)#012 persistent_cache: (nil)#012 block_size: 4096#012 block_size_deviation: 10#012 block_restart_interval: 16#012 index_block_restart_interval: 1#012 metadata_block_size: 4096#012 partition_filters: 0#012 use_delta_encoding: 1#012 filter_policy: bloomfilter#012 whole_key_filtering: 1#012 verify_compression: 0#012 read_amp_bytes_per_bit: 0#012 format_version: 5#012 enable_index_compression: 1#012 block_align: 0#012 max_auto_readahead_size: 262144#012 prepopulate_block_cache: 0#012 initial_auto_readahead_size: 8192#012 num_file_reads_for_auto_readahead: 2 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.write_buffer_size: 16777216 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.max_write_buffer_number: 64 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.compression: LZ4 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.bottommost_compression: Disabled Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.prefix_extractor: nullptr Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.memtable_insert_with_hint_prefix_extractor: nullptr Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.num_levels: 7 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.min_write_buffer_number_to_merge: 6 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.max_write_buffer_number_to_maintain: 0 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.max_write_buffer_size_to_maintain: 0 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.bottommost_compression_opts.window_bits: -14 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.bottommost_compression_opts.level: 32767 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.bottommost_compression_opts.strategy: 0 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.bottommost_compression_opts.max_dict_bytes: 0 Dec 15 02:45:05 localhost 
ceph-osd[31375]: rocksdb: Options.bottommost_compression_opts.zstd_max_train_bytes: 0 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.bottommost_compression_opts.parallel_threads: 1 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.bottommost_compression_opts.enabled: false Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.bottommost_compression_opts.max_dict_buffer_bytes: 0 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.bottommost_compression_opts.use_zstd_dict_trainer: true Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.compression_opts.window_bits: -14 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.compression_opts.level: 32767 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.compression_opts.strategy: 0 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.compression_opts.max_dict_bytes: 0 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.compression_opts.zstd_max_train_bytes: 0 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.compression_opts.use_zstd_dict_trainer: true Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.compression_opts.parallel_threads: 1 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.compression_opts.enabled: false Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.compression_opts.max_dict_buffer_bytes: 0 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.level0_file_num_compaction_trigger: 8 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.level0_slowdown_writes_trigger: 20 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.level0_stop_writes_trigger: 36 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.target_file_size_base: 67108864 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.target_file_size_multiplier: 1 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.max_bytes_for_level_base: 1073741824 Dec 15 02:45:05 localhost ceph-osd[31375]: 
rocksdb: Options.level_compaction_dynamic_level_bytes: 0 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.max_bytes_for_level_multiplier: 8.000000 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.max_sequential_skip_in_iterations: 8 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.max_compaction_bytes: 1677721600 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.ignore_max_compaction_bytes_for_input: true Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.arena_block_size: 1048576 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.soft_pending_compaction_bytes_limit: 68719476736 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.hard_pending_compaction_bytes_limit: 274877906944 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.disable_auto_compactions: 0 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.compaction_style: kCompactionStyleLevel Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.compaction_pri: kMinOverlappingRatio Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.compaction_options_universal.size_ratio: 1 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.compaction_options_universal.min_merge_width: 2 Dec 15 02:45:05 localhost 
ceph-osd[31375]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0); Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.inplace_update_support: 0 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.inplace_update_num_locks: 10000 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.memtable_prefix_bloom_size_ratio: 0.000000 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.memtable_whole_key_filtering: 0 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.memtable_huge_page_size: 0 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.bloom_locality: 0 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.max_successive_merges: 0 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.optimize_filters_for_hits: 0 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.paranoid_file_checks: 0 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.force_consistency_checks: 1 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.report_bg_io_stats: 0 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.ttl: 2592000 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: 
Options.periodic_compaction_seconds: 0 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.preclude_last_level_data_seconds: 0 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.preserve_internal_time_seconds: 0 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.enable_blob_files: false Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.min_blob_size: 0 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.blob_file_size: 268435456 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.blob_compression_type: NoCompression Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.enable_blob_garbage_collection: false Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.blob_garbage_collection_age_cutoff: 0.250000 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.blob_compaction_readahead_size: 0 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.blob_file_starting_level: 0 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.experimental_mempurge_threshold: 0.000000 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: [db/column_family.cc:578] Failed to register data paths of column family (id: 8, name: O-1) Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [O-1]: Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.comparator: leveldb.BytewiseComparator Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.merge_operator: None Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.compaction_filter: None Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.compaction_filter_factory: None Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.sst_partitioner_factory: None Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.memtable_factory: SkipListFactory Dec 15 02:45:05 
localhost ceph-osd[31375]: rocksdb: Options.table_factory: BlockBasedTable Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: table_factory options: flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x55689a08d0e0)#012 cache_index_and_filter_blocks: 1#012 cache_index_and_filter_blocks_with_high_priority: 0#012 pin_l0_filter_and_index_blocks_in_cache: 0#012 pin_top_level_index_and_filter: 1#012 index_type: 0#012 data_block_index_type: 0#012 index_shortening: 1#012 data_block_hash_table_util_ratio: 0.750000#012 checksum: 4#012 no_block_cache: 0#012 block_cache: 0x5568990c62d0#012 block_cache_name: BinnedLRUCache#012 block_cache_options:#012 capacity : 536870912#012 num_shard_bits : 4#012 strict_capacity_limit : 0#012 high_pri_pool_ratio: 0.000#012 block_cache_compressed: (nil)#012 persistent_cache: (nil)#012 block_size: 4096#012 block_size_deviation: 10#012 block_restart_interval: 16#012 index_block_restart_interval: 1#012 metadata_block_size: 4096#012 partition_filters: 0#012 use_delta_encoding: 1#012 filter_policy: bloomfilter#012 whole_key_filtering: 1#012 verify_compression: 0#012 read_amp_bytes_per_bit: 0#012 format_version: 5#012 enable_index_compression: 1#012 block_align: 0#012 max_auto_readahead_size: 262144#012 prepopulate_block_cache: 0#012 initial_auto_readahead_size: 8192#012 num_file_reads_for_auto_readahead: 2 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.write_buffer_size: 16777216 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.max_write_buffer_number: 64 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.compression: LZ4 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.bottommost_compression: Disabled Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.prefix_extractor: nullptr Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.memtable_insert_with_hint_prefix_extractor: nullptr Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.num_levels: 7 Dec 15 02:45:05 localhost 
ceph-osd[31375]: rocksdb: Options.min_write_buffer_number_to_merge: 6 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.max_write_buffer_number_to_maintain: 0 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.max_write_buffer_size_to_maintain: 0 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.bottommost_compression_opts.window_bits: -14 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.bottommost_compression_opts.level: 32767 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.bottommost_compression_opts.strategy: 0 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.bottommost_compression_opts.max_dict_bytes: 0 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.bottommost_compression_opts.zstd_max_train_bytes: 0 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.bottommost_compression_opts.parallel_threads: 1 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.bottommost_compression_opts.enabled: false Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.bottommost_compression_opts.max_dict_buffer_bytes: 0 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.bottommost_compression_opts.use_zstd_dict_trainer: true Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.compression_opts.window_bits: -14 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.compression_opts.level: 32767 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.compression_opts.strategy: 0 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.compression_opts.max_dict_bytes: 0 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.compression_opts.zstd_max_train_bytes: 0 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.compression_opts.use_zstd_dict_trainer: true Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.compression_opts.parallel_threads: 1 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.compression_opts.enabled: false Dec 15 
02:45:05 localhost ceph-osd[31375]: rocksdb: Options.compression_opts.max_dict_buffer_bytes: 0 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.level0_file_num_compaction_trigger: 8 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.level0_slowdown_writes_trigger: 20 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.level0_stop_writes_trigger: 36 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.target_file_size_base: 67108864 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.target_file_size_multiplier: 1 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.max_bytes_for_level_base: 1073741824 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.max_bytes_for_level_multiplier: 8.000000 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.max_sequential_skip_in_iterations: 8 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.max_compaction_bytes: 1677721600 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.ignore_max_compaction_bytes_for_input: true Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.arena_block_size: 1048576 Dec 15 02:45:05 localhost 
ceph-osd[31375]: rocksdb: Options.soft_pending_compaction_bytes_limit: 68719476736 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.hard_pending_compaction_bytes_limit: 274877906944 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.disable_auto_compactions: 0 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.compaction_style: kCompactionStyleLevel Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.compaction_pri: kMinOverlappingRatio Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.compaction_options_universal.size_ratio: 1 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.compaction_options_universal.min_merge_width: 2 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0); Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.inplace_update_support: 0 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.inplace_update_num_locks: 10000 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.memtable_prefix_bloom_size_ratio: 0.000000 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.memtable_whole_key_filtering: 0 Dec 15 02:45:05 localhost 
ceph-osd[31375]: rocksdb: Options.memtable_huge_page_size: 0 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.bloom_locality: 0 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.max_successive_merges: 0 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.optimize_filters_for_hits: 0 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.paranoid_file_checks: 0 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.force_consistency_checks: 1 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.report_bg_io_stats: 0 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.ttl: 2592000 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.periodic_compaction_seconds: 0 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.preclude_last_level_data_seconds: 0 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.preserve_internal_time_seconds: 0 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.enable_blob_files: false Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.min_blob_size: 0 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.blob_file_size: 268435456 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.blob_compression_type: NoCompression Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.enable_blob_garbage_collection: false Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.blob_garbage_collection_age_cutoff: 0.250000 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.blob_compaction_readahead_size: 0 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.blob_file_starting_level: 0 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.experimental_mempurge_threshold: 0.000000 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: [db/column_family.cc:578] Failed to register data paths of column family (id: 9, 
name: O-2) Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [O-2]: Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.comparator: leveldb.BytewiseComparator Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.merge_operator: None Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.compaction_filter: None Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.compaction_filter_factory: None Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.sst_partitioner_factory: None Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.memtable_factory: SkipListFactory Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.table_factory: BlockBasedTable Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: table_factory options: flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x55689a08d0e0)#012 cache_index_and_filter_blocks: 1#012 cache_index_and_filter_blocks_with_high_priority: 0#012 pin_l0_filter_and_index_blocks_in_cache: 0#012 pin_top_level_index_and_filter: 1#012 index_type: 0#012 data_block_index_type: 0#012 index_shortening: 1#012 data_block_hash_table_util_ratio: 0.750000#012 checksum: 4#012 no_block_cache: 0#012 block_cache: 0x5568990c62d0#012 block_cache_name: BinnedLRUCache#012 block_cache_options:#012 capacity : 536870912#012 num_shard_bits : 4#012 strict_capacity_limit : 0#012 high_pri_pool_ratio: 0.000#012 block_cache_compressed: (nil)#012 persistent_cache: (nil)#012 block_size: 4096#012 block_size_deviation: 10#012 block_restart_interval: 16#012 index_block_restart_interval: 1#012 metadata_block_size: 4096#012 partition_filters: 0#012 use_delta_encoding: 1#012 filter_policy: bloomfilter#012 whole_key_filtering: 1#012 verify_compression: 0#012 read_amp_bytes_per_bit: 0#012 format_version: 5#012 enable_index_compression: 1#012 block_align: 0#012 max_auto_readahead_size: 262144#012 prepopulate_block_cache: 0#012 
initial_auto_readahead_size: 8192#012 num_file_reads_for_auto_readahead: 2 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.write_buffer_size: 16777216 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.max_write_buffer_number: 64 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.compression: LZ4 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.bottommost_compression: Disabled Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.prefix_extractor: nullptr Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.memtable_insert_with_hint_prefix_extractor: nullptr Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.num_levels: 7 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.min_write_buffer_number_to_merge: 6 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.max_write_buffer_number_to_maintain: 0 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.max_write_buffer_size_to_maintain: 0 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.bottommost_compression_opts.window_bits: -14 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.bottommost_compression_opts.level: 32767 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.bottommost_compression_opts.strategy: 0 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.bottommost_compression_opts.max_dict_bytes: 0 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.bottommost_compression_opts.zstd_max_train_bytes: 0 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.bottommost_compression_opts.parallel_threads: 1 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.bottommost_compression_opts.enabled: false Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.bottommost_compression_opts.max_dict_buffer_bytes: 0 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.bottommost_compression_opts.use_zstd_dict_trainer: true Dec 15 02:45:05 localhost ceph-osd[31375]: 
rocksdb: Options.compression_opts.window_bits: -14 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.compression_opts.level: 32767 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.compression_opts.strategy: 0 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.compression_opts.max_dict_bytes: 0 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.compression_opts.zstd_max_train_bytes: 0 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.compression_opts.use_zstd_dict_trainer: true Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.compression_opts.parallel_threads: 1 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.compression_opts.enabled: false Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.compression_opts.max_dict_buffer_bytes: 0 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.level0_file_num_compaction_trigger: 8 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.level0_slowdown_writes_trigger: 20 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.level0_stop_writes_trigger: 36 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.target_file_size_base: 67108864 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.target_file_size_multiplier: 1 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.max_bytes_for_level_base: 1073741824 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.max_bytes_for_level_multiplier: 8.000000 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1 
Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.max_sequential_skip_in_iterations: 8 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.max_compaction_bytes: 1677721600 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.ignore_max_compaction_bytes_for_input: true Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.arena_block_size: 1048576 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.soft_pending_compaction_bytes_limit: 68719476736 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.hard_pending_compaction_bytes_limit: 274877906944 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.disable_auto_compactions: 0 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.compaction_style: kCompactionStyleLevel Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.compaction_pri: kMinOverlappingRatio Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.compaction_options_universal.size_ratio: 1 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.compaction_options_universal.min_merge_width: 2 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: 
Options.compaction_options_fifo.max_table_files_size: 1073741824 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0); Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.inplace_update_support: 0 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.inplace_update_num_locks: 10000 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.memtable_prefix_bloom_size_ratio: 0.000000 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.memtable_whole_key_filtering: 0 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.memtable_huge_page_size: 0 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.bloom_locality: 0 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.max_successive_merges: 0 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.optimize_filters_for_hits: 0 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.paranoid_file_checks: 0 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.force_consistency_checks: 1 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.report_bg_io_stats: 0 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.ttl: 2592000 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.periodic_compaction_seconds: 0 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.preclude_last_level_data_seconds: 0 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.preserve_internal_time_seconds: 0 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.enable_blob_files: false Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.min_blob_size: 0 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.blob_file_size: 268435456 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: 
Options.blob_compression_type: NoCompression Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.enable_blob_garbage_collection: false Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.blob_garbage_collection_age_cutoff: 0.250000 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.blob_compaction_readahead_size: 0 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.blob_file_starting_level: 0 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.experimental_mempurge_threshold: 0.000000 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: [db/column_family.cc:578] Failed to register data paths of column family (id: 10, name: L) Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: [db/column_family.cc:635] #011(skipping printing options) Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: [db/column_family.cc:578] Failed to register data paths of column family (id: 11, name: P) Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: [db/column_family.cc:635] #011(skipping printing options) Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: [db/version_set.cc:5566] Recovered from manifest file:db/MANIFEST-000032 succeeded,manifest_file_number is 32, next_file_number is 34, last_sequence is 12, log_number is 5,prev_log_number is 0,max_column_family is 11,min_log_number_to_keep is 5 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: [db/version_set.cc:5581] Column family [default] (ID 0), log number is 5 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: [db/version_set.cc:5581] Column family [m-0] (ID 1), log number is 5 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: [db/version_set.cc:5581] Column family [m-1] (ID 2), log number is 5 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: [db/version_set.cc:5581] Column family [m-2] (ID 3), log number is 5 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: 
[db/version_set.cc:5581] Column family [p-0] (ID 4), log number is 5 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: [db/version_set.cc:5581] Column family [p-1] (ID 5), log number is 5 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: [db/version_set.cc:5581] Column family [p-2] (ID 6), log number is 5 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: [db/version_set.cc:5581] Column family [O-0] (ID 7), log number is 5 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: [db/version_set.cc:5581] Column family [O-1] (ID 8), log number is 5 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: [db/version_set.cc:5581] Column family [O-2] (ID 9), log number is 5 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: [db/version_set.cc:5581] Column family [L] (ID 10), log number is 5 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: [db/version_set.cc:5581] Column family [P] (ID 11), log number is 5 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: [db/db_impl/db_impl_open.cc:539] DB ID: abd90ad5-9c07-431d-a96e-b08c1474ad86 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: EVENT_LOG_v1 {"time_micros": 1765784705531726, "job": 1, "event": "recovery_started", "wal_files": [31]} Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: [db/db_impl/db_impl_open.cc:1043] Recovering log #31 mode 2 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: EVENT_LOG_v1 {"time_micros": 1765784705532005, "job": 1, "event": "recovery_finished"} Dec 15 02:45:05 localhost ceph-osd[31375]: bluestore(/var/lib/ceph/osd/ceph-0) _open_db opened rocksdb path db options compression=kLZ4Compression,max_write_buffer_number=64,min_write_buffer_number_to_merge=6,compaction_style=kCompactionStyleLevel,write_buffer_size=16777216,max_background_jobs=4,level0_file_num_compaction_trigger=8,max_bytes_for_level_base=1073741824,max_bytes_for_level_multiplier=8,compaction_readahead_size=2MB,max_total_wal_size=1073741824,writable_file_max_buffer_size=0 Dec 15 02:45:05 localhost ceph-osd[31375]: 
bluestore(/var/lib/ceph/osd/ceph-0) _open_super_meta old nid_max 1025 Dec 15 02:45:05 localhost ceph-osd[31375]: bluestore(/var/lib/ceph/osd/ceph-0) _open_super_meta old blobid_max 10240 Dec 15 02:45:05 localhost ceph-osd[31375]: bluestore(/var/lib/ceph/osd/ceph-0) _open_super_meta ondisk_format 4 compat_ondisk_format 3 Dec 15 02:45:05 localhost ceph-osd[31375]: bluestore(/var/lib/ceph/osd/ceph-0) _open_super_meta min_alloc_size 0x1000 Dec 15 02:45:05 localhost ceph-osd[31375]: freelist init Dec 15 02:45:05 localhost ceph-osd[31375]: freelist _read_cfg Dec 15 02:45:05 localhost ceph-osd[31375]: bluestore(/var/lib/ceph/osd/ceph-0) _init_alloc loaded 7.0 GiB in 2 extents, allocator type hybrid, capacity 0x1bfc00000, block size 0x1000, free 0x1bfbfd000, fragmentation 5.5e-07 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: [db/db_impl/db_impl.cc:496] Shutdown: canceling all background work Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: [db/db_impl/db_impl.cc:704] Shutdown complete Dec 15 02:45:05 localhost ceph-osd[31375]: bluefs umount Dec 15 02:45:05 localhost ceph-osd[31375]: bdev(0x5568990ef180 /var/lib/ceph/osd/ceph-0/block) close Dec 15 02:45:05 localhost podman[31709]: Dec 15 02:45:05 localhost podman[31709]: 2025-12-15 07:45:05.750745854 +0000 UTC m=+0.061197185 container create 984f25c455f8a3e234a3e117b823a7cd12a3eda1735512cc69a518d5304e5471 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-bce17446-41b5-5408-a23e-0b011906b44a-osd-3-activate-test, vcs-type=git, io.buildah.version=1.41.4, io.k8s.description=Red Hat Ceph Storage 7, url=https://catalog.redhat.com/en/search?searchType=containers, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, distribution-scope=public, CEPH_POINT_RELEASE=, build-date=2025-11-26T19:44:28Z, description=Red Hat Ceph Storage 7, com.redhat.component=rhceph-container, name=rhceph, GIT_CLEAN=True, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, vendor=Red Hat, Inc., version=7, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., RELEASE=main, architecture=x86_64, io.openshift.tags=rhceph ceph, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, GIT_BRANCH=main, ceph=True, release=1763362218, io.openshift.expose-services=, maintainer=Guillaume Abrioux , GIT_REPO=https://github.com/ceph/ceph-container.git, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9) Dec 15 02:45:05 localhost ceph-osd[31375]: bdev(0x5568990ef180 /var/lib/ceph/osd/ceph-0/block) open path /var/lib/ceph/osd/ceph-0/block Dec 15 02:45:05 localhost ceph-osd[31375]: bdev(0x5568990ef180 /var/lib/ceph/osd/ceph-0/block) ioctl(F_SET_FILE_RW_HINT) on /var/lib/ceph/osd/ceph-0/block failed: (22) Invalid argument Dec 15 02:45:05 localhost ceph-osd[31375]: bdev(0x5568990ef180 /var/lib/ceph/osd/ceph-0/block) open size 7511998464 (0x1bfc00000, 7.0 GiB) block_size 4096 (4 KiB) rotational device, discard supported Dec 15 02:45:05 localhost ceph-osd[31375]: bluefs add_block_device bdev 1 path /var/lib/ceph/osd/ceph-0/block size 7.0 GiB Dec 15 02:45:05 localhost ceph-osd[31375]: bluefs mount Dec 15 02:45:05 localhost ceph-osd[31375]: bluefs _init_alloc shared, id 1, capacity 0x1bfc00000, block size 0x10000 Dec 15 02:45:05 localhost ceph-osd[31375]: bluefs mount shared_bdev_used = 4718592 Dec 15 02:45:05 localhost ceph-osd[31375]: bluestore(/var/lib/ceph/osd/ceph-0) _prepare_db_environment set db_paths to db,7136398540 db.slow,7136398540 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: RocksDB version: 7.9.2 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Git sha 0 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Compile date 2025-09-23 00:00:00 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: DB SUMMARY Dec 15 02:45:05 localhost ceph-osd[31375]: 
rocksdb: DB Session ID: GVF0X3DWF49JJ2HDW5TC Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: CURRENT file: CURRENT Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: IDENTITY file: IDENTITY Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: MANIFEST file: MANIFEST-000032 size: 1007 Bytes Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: SST files in db dir, Total Num: 1, files: 000030.sst Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: SST files in db.slow dir, Total Num: 0, files: Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Write Ahead Log file in db.wal: 000031.log size: 5093 ; Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.error_if_exists: 0 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.create_if_missing: 0 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.paranoid_checks: 1 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.flush_verify_memtable_count: 1 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.track_and_verify_wals_in_manifest: 0 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.verify_sst_unique_id_in_manifest: 1 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.env: 0x55689917a310 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.fs: LegacyFileSystem Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.info_log: 0x55689a0cf720 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.max_file_opening_threads: 16 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.statistics: (nil) Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.use_fsync: 0 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.max_log_file_size: 0 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.max_manifest_file_size: 1073741824 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.log_file_time_to_roll: 0 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.keep_log_file_num: 1000 Dec 15 02:45:05 localhost 
ceph-osd[31375]: rocksdb: Options.recycle_log_file_num: 0 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.allow_fallocate: 1 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.allow_mmap_reads: 0 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.allow_mmap_writes: 0 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.use_direct_reads: 0 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.use_direct_io_for_flush_and_compaction: 0 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.create_missing_column_families: 0 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.db_log_dir: Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.wal_dir: db.wal Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.table_cache_numshardbits: 6 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.WAL_ttl_seconds: 0 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.WAL_size_limit_MB: 0 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.max_write_batch_group_size_bytes: 1048576 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.manifest_preallocation_size: 4194304 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.is_fd_close_on_exec: 1 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.advise_random_on_open: 1 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.db_write_buffer_size: 0 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.write_buffer_manager: 0x5568990d9540 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.access_hint_on_compaction_start: 1 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.random_access_max_buffer_size: 1048576 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.use_adaptive_mutex: 0 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.rate_limiter: (nil) Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.sst_file_manager.rate_bytes_per_sec: 0 Dec 15 02:45:05 localhost 
ceph-osd[31375]: rocksdb: Options.wal_recovery_mode: 2 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.enable_thread_tracking: 0 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.enable_pipelined_write: 0 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.unordered_write: 0 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.allow_concurrent_memtable_write: 1 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.enable_write_thread_adaptive_yield: 1 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.write_thread_max_yield_usec: 100 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.write_thread_slow_yield_usec: 3 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.row_cache: None Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.wal_filter: None Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.avoid_flush_during_recovery: 0 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.allow_ingest_behind: 0 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.two_write_queues: 0 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.manual_wal_flush: 0 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.wal_compression: 0 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.atomic_flush: 0 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.avoid_unnecessary_blocking_io: 0 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.persist_stats_to_disk: 0 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.write_dbid_to_manifest: 0 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.log_readahead_size: 0 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.file_checksum_gen_factory: Unknown Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.best_efforts_recovery: 0 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.max_bgerror_resume_count: 2147483647 Dec 15 02:45:05 localhost ceph-osd[31375]: 
rocksdb: Options.bgerror_resume_retry_interval: 1000000 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.allow_data_in_errors: 0 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.db_host_id: __hostname__ Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.enforce_single_del_contracts: true Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.max_background_jobs: 4 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.max_background_compactions: -1 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.max_subcompactions: 1 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.avoid_flush_during_shutdown: 0 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.writable_file_max_buffer_size: 0 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.delayed_write_rate : 16777216 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.max_total_wal_size: 1073741824 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.delete_obsolete_files_period_micros: 21600000000 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.stats_dump_period_sec: 600 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.stats_persist_period_sec: 600 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.stats_history_buffer_size: 1048576 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.max_open_files: -1 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.bytes_per_sync: 0 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.wal_bytes_per_sync: 0 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.strict_bytes_per_sync: 0 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.compaction_readahead_size: 2097152 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.max_background_flushes: -1 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Compression algorithms supported: Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: #011kZSTD supported: 0 Dec 15 
02:45:05 localhost ceph-osd[31375]: rocksdb: #011kXpressCompression supported: 0 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: #011kBZip2Compression supported: 0 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: #011kZSTDNotFinalCompression supported: 0 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: #011kLZ4Compression supported: 1 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: #011kZlibCompression supported: 1 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: #011kLZ4HCCompression supported: 1 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: #011kSnappyCompression supported: 1 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Fast CRC32 supported: Supported on x86 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: DMutex implementation: pthread_mutex_t Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: [db/version_set.cc:5527] Recovering from manifest file: db/MANIFEST-000032 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: [db/column_family.cc:578] Failed to register data paths of column family (id: 0, name: default) Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [default]: Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.comparator: leveldb.BytewiseComparator Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.merge_operator: .T:int64_array.b:bitwise_xor Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.compaction_filter: None Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.compaction_filter_factory: None Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.sst_partitioner_factory: None Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.memtable_factory: SkipListFactory Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.table_factory: BlockBasedTable Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: table_factory options: flush_block_policy_factory: FlushBlockBySizePolicyFactory 
(0x55689a0ce160)#012 cache_index_and_filter_blocks: 1#012 cache_index_and_filter_blocks_with_high_priority: 0#012 pin_l0_filter_and_index_blocks_in_cache: 0#012 pin_top_level_index_and_filter: 1#012 index_type: 0#012 data_block_index_type: 0#012 index_shortening: 1#012 data_block_hash_table_util_ratio: 0.750000#012 checksum: 4#012 no_block_cache: 0#012 block_cache: 0x5568990c62d0#012 block_cache_name: BinnedLRUCache#012 block_cache_options:#012 capacity : 483183820#012 num_shard_bits : 4#012 strict_capacity_limit : 0#012 high_pri_pool_ratio: 0.000#012 block_cache_compressed: (nil)#012 persistent_cache: (nil)#012 block_size: 4096#012 block_size_deviation: 10#012 block_restart_interval: 16#012 index_block_restart_interval: 1#012 metadata_block_size: 4096#012 partition_filters: 0#012 use_delta_encoding: 1#012 filter_policy: bloomfilter#012 whole_key_filtering: 1#012 verify_compression: 0#012 read_amp_bytes_per_bit: 0#012 format_version: 5#012 enable_index_compression: 1#012 block_align: 0#012 max_auto_readahead_size: 262144#012 prepopulate_block_cache: 0#012 initial_auto_readahead_size: 8192#012 num_file_reads_for_auto_readahead: 2 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.write_buffer_size: 16777216 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.max_write_buffer_number: 64 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.compression: LZ4 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.bottommost_compression: Disabled Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.prefix_extractor: nullptr Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.memtable_insert_with_hint_prefix_extractor: nullptr Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.num_levels: 7 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.min_write_buffer_number_to_merge: 6 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.max_write_buffer_number_to_maintain: 0 Dec 15 02:45:05 localhost 
ceph-osd[31375]: rocksdb: Options.max_write_buffer_size_to_maintain: 0 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.bottommost_compression_opts.window_bits: -14 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.bottommost_compression_opts.level: 32767 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.bottommost_compression_opts.strategy: 0 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.bottommost_compression_opts.max_dict_bytes: 0 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.bottommost_compression_opts.zstd_max_train_bytes: 0 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.bottommost_compression_opts.parallel_threads: 1 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.bottommost_compression_opts.enabled: false Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.bottommost_compression_opts.max_dict_buffer_bytes: 0 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.bottommost_compression_opts.use_zstd_dict_trainer: true Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.compression_opts.window_bits: -14 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.compression_opts.level: 32767 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.compression_opts.strategy: 0 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.compression_opts.max_dict_bytes: 0 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.compression_opts.zstd_max_train_bytes: 0 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.compression_opts.use_zstd_dict_trainer: true Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.compression_opts.parallel_threads: 1 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.compression_opts.enabled: false Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.compression_opts.max_dict_buffer_bytes: 0 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.level0_file_num_compaction_trigger: 8 
Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.level0_slowdown_writes_trigger: 20 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.level0_stop_writes_trigger: 36 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.target_file_size_base: 67108864 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.target_file_size_multiplier: 1 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.max_bytes_for_level_base: 1073741824 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.max_bytes_for_level_multiplier: 8.000000 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.max_sequential_skip_in_iterations: 8 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.max_compaction_bytes: 1677721600 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.ignore_max_compaction_bytes_for_input: true Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.arena_block_size: 1048576 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.soft_pending_compaction_bytes_limit: 68719476736 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.hard_pending_compaction_bytes_limit: 274877906944 Dec 15 
02:45:05 localhost ceph-osd[31375]: rocksdb: Options.disable_auto_compactions: 0 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.compaction_style: kCompactionStyleLevel Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.compaction_pri: kMinOverlappingRatio Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.compaction_options_universal.size_ratio: 1 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.compaction_options_universal.min_merge_width: 2 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.table_properties_collectors: Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.inplace_update_support: 0 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.inplace_update_num_locks: 10000 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.memtable_prefix_bloom_size_ratio: 0.000000 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.memtable_whole_key_filtering: 0 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.memtable_huge_page_size: 0 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.bloom_locality: 0 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.max_successive_merges: 0 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.optimize_filters_for_hits: 
0 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.paranoid_file_checks: 0 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.force_consistency_checks: 1 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.report_bg_io_stats: 0 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.ttl: 2592000 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.periodic_compaction_seconds: 0 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.preclude_last_level_data_seconds: 0 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.preserve_internal_time_seconds: 0 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.enable_blob_files: false Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.min_blob_size: 0 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.blob_file_size: 268435456 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.blob_compression_type: NoCompression Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.enable_blob_garbage_collection: false Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.blob_garbage_collection_age_cutoff: 0.250000 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.blob_compaction_readahead_size: 0 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.blob_file_starting_level: 0 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.experimental_mempurge_threshold: 0.000000 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: [db/column_family.cc:578] Failed to register data paths of column family (id: 1, name: m-0) Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [m-0]: Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.comparator: leveldb.BytewiseComparator Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.merge_operator: 
None Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.compaction_filter: None Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.compaction_filter_factory: None Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.sst_partitioner_factory: None Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.memtable_factory: SkipListFactory Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.table_factory: BlockBasedTable Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: table_factory options: flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x55689a0ce160)#012 cache_index_and_filter_blocks: 1#012 cache_index_and_filter_blocks_with_high_priority: 0#012 pin_l0_filter_and_index_blocks_in_cache: 0#012 pin_top_level_index_and_filter: 1#012 index_type: 0#012 data_block_index_type: 0#012 index_shortening: 1#012 data_block_hash_table_util_ratio: 0.750000#012 checksum: 4#012 no_block_cache: 0#012 block_cache: 0x5568990c62d0#012 block_cache_name: BinnedLRUCache#012 block_cache_options:#012 capacity : 483183820#012 num_shard_bits : 4#012 strict_capacity_limit : 0#012 high_pri_pool_ratio: 0.000#012 block_cache_compressed: (nil)#012 persistent_cache: (nil)#012 block_size: 4096#012 block_size_deviation: 10#012 block_restart_interval: 16#012 index_block_restart_interval: 1#012 metadata_block_size: 4096#012 partition_filters: 0#012 use_delta_encoding: 1#012 filter_policy: bloomfilter#012 whole_key_filtering: 1#012 verify_compression: 0#012 read_amp_bytes_per_bit: 0#012 format_version: 5#012 enable_index_compression: 1#012 block_align: 0#012 max_auto_readahead_size: 262144#012 prepopulate_block_cache: 0#012 initial_auto_readahead_size: 8192#012 num_file_reads_for_auto_readahead: 2 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.write_buffer_size: 16777216 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.max_write_buffer_number: 64 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.compression: LZ4 Dec 15 
02:45:05 localhost ceph-osd[31375]: rocksdb: Options.bottommost_compression: Disabled Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.prefix_extractor: nullptr Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.memtable_insert_with_hint_prefix_extractor: nullptr Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.num_levels: 7 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.min_write_buffer_number_to_merge: 6 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.max_write_buffer_number_to_maintain: 0 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.max_write_buffer_size_to_maintain: 0 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.bottommost_compression_opts.window_bits: -14 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.bottommost_compression_opts.level: 32767 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.bottommost_compression_opts.strategy: 0 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.bottommost_compression_opts.max_dict_bytes: 0 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.bottommost_compression_opts.zstd_max_train_bytes: 0 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.bottommost_compression_opts.parallel_threads: 1 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.bottommost_compression_opts.enabled: false Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.bottommost_compression_opts.max_dict_buffer_bytes: 0 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.bottommost_compression_opts.use_zstd_dict_trainer: true Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.compression_opts.window_bits: -14 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.compression_opts.level: 32767 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.compression_opts.strategy: 0 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.compression_opts.max_dict_bytes: 0 Dec 15 02:45:05 
localhost ceph-osd[31375]: rocksdb: Options.compression_opts.zstd_max_train_bytes: 0
Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.compression_opts.use_zstd_dict_trainer: true
Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.compression_opts.parallel_threads: 1
Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.compression_opts.enabled: false
Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.compression_opts.max_dict_buffer_bytes: 0
Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.level0_file_num_compaction_trigger: 8
Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.level0_slowdown_writes_trigger: 20
Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.level0_stop_writes_trigger: 36
Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.target_file_size_base: 67108864
Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.target_file_size_multiplier: 1
Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.max_bytes_for_level_base: 1073741824
Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.max_bytes_for_level_multiplier: 8.000000
Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.max_sequential_skip_in_iterations: 8
Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.max_compaction_bytes: 1677721600
Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.ignore_max_compaction_bytes_for_input: true
Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.arena_block_size: 1048576
Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.soft_pending_compaction_bytes_limit: 68719476736
Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.hard_pending_compaction_bytes_limit: 274877906944
Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.disable_auto_compactions: 0
Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.compaction_style: kCompactionStyleLevel
Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.compaction_pri: kMinOverlappingRatio
Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.inplace_update_support: 0
Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.inplace_update_num_locks: 10000
Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.memtable_prefix_bloom_size_ratio: 0.000000
Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.memtable_whole_key_filtering: 0
Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.memtable_huge_page_size: 0
Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.bloom_locality: 0
Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.max_successive_merges: 0
Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.optimize_filters_for_hits: 0
Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.paranoid_file_checks: 0
Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.force_consistency_checks: 1
Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.report_bg_io_stats: 0
Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.ttl: 2592000
Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.periodic_compaction_seconds: 0
Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.preclude_last_level_data_seconds: 0
Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.preserve_internal_time_seconds: 0
Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.enable_blob_files: false
Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.min_blob_size: 0
Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.blob_file_size: 268435456
Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.blob_compression_type: NoCompression
Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.enable_blob_garbage_collection: false
Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.blob_garbage_collection_age_cutoff: 0.250000
Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.blob_compaction_readahead_size: 0
Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.blob_file_starting_level: 0
Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: [db/column_family.cc:578] Failed to register data paths of column family (id: 2, name: m-1)
Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [m-1]:
Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.comparator: leveldb.BytewiseComparator
Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.merge_operator: None
Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.compaction_filter: None
Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.compaction_filter_factory: None
Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.sst_partitioner_factory: None
Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.memtable_factory: SkipListFactory
Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.table_factory: BlockBasedTable
Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: table_factory options: flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x55689a0ce160)#012 cache_index_and_filter_blocks: 1#012 cache_index_and_filter_blocks_with_high_priority: 0#012 pin_l0_filter_and_index_blocks_in_cache: 0#012 pin_top_level_index_and_filter: 1#012 index_type: 0#012 data_block_index_type: 0#012 index_shortening: 1#012 data_block_hash_table_util_ratio: 0.750000#012 checksum: 4#012 no_block_cache: 0#012 block_cache: 0x5568990c62d0#012 block_cache_name: BinnedLRUCache#012 block_cache_options:#012 capacity : 483183820#012 num_shard_bits : 4#012 strict_capacity_limit : 0#012 high_pri_pool_ratio: 0.000#012 block_cache_compressed: (nil)#012 persistent_cache: (nil)#012 block_size: 4096#012 block_size_deviation: 10#012 block_restart_interval: 16#012 index_block_restart_interval: 1#012 metadata_block_size: 4096#012 partition_filters: 0#012 use_delta_encoding: 1#012 filter_policy: bloomfilter#012 whole_key_filtering: 1#012 verify_compression: 0#012 read_amp_bytes_per_bit: 0#012 format_version: 5#012 enable_index_compression: 1#012 block_align: 0#012 max_auto_readahead_size: 262144#012 prepopulate_block_cache: 0#012 initial_auto_readahead_size: 8192#012 num_file_reads_for_auto_readahead: 2
Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.write_buffer_size: 16777216
Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.max_write_buffer_number: 64
Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.compression: LZ4
Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.bottommost_compression: Disabled
Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.prefix_extractor: nullptr
Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.memtable_insert_with_hint_prefix_extractor: nullptr
Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.num_levels: 7
Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.min_write_buffer_number_to_merge: 6
Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.max_write_buffer_number_to_maintain: 0
Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.max_write_buffer_size_to_maintain: 0
Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.bottommost_compression_opts.window_bits: -14
Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.bottommost_compression_opts.level: 32767
Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.bottommost_compression_opts.strategy: 0
Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.bottommost_compression_opts.max_dict_bytes: 0
Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.bottommost_compression_opts.parallel_threads: 1
Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.bottommost_compression_opts.enabled: false
Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.compression_opts.window_bits: -14
Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.compression_opts.level: 32767
Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.compression_opts.strategy: 0
Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.compression_opts.max_dict_bytes: 0
Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.compression_opts.zstd_max_train_bytes: 0
Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.compression_opts.use_zstd_dict_trainer: true
Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.compression_opts.parallel_threads: 1
Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.compression_opts.enabled: false
Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.compression_opts.max_dict_buffer_bytes: 0
Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.level0_file_num_compaction_trigger: 8
Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.level0_slowdown_writes_trigger: 20
Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.level0_stop_writes_trigger: 36
Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.target_file_size_base: 67108864
Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.target_file_size_multiplier: 1
Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.max_bytes_for_level_base: 1073741824
Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.max_bytes_for_level_multiplier: 8.000000
Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.max_sequential_skip_in_iterations: 8
Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.max_compaction_bytes: 1677721600
Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.ignore_max_compaction_bytes_for_input: true
Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.arena_block_size: 1048576
Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.soft_pending_compaction_bytes_limit: 68719476736
Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.hard_pending_compaction_bytes_limit: 274877906944
Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.disable_auto_compactions: 0
Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.compaction_style: kCompactionStyleLevel
Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.compaction_pri: kMinOverlappingRatio
Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.inplace_update_support: 0
Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.inplace_update_num_locks: 10000
Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.memtable_prefix_bloom_size_ratio: 0.000000
Dec 15 02:45:05 localhost systemd[1]: Started libpod-conmon-984f25c455f8a3e234a3e117b823a7cd12a3eda1735512cc69a518d5304e5471.scope.
Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.memtable_whole_key_filtering: 0
Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.memtable_huge_page_size: 0
Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.bloom_locality: 0
Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.max_successive_merges: 0
Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.optimize_filters_for_hits: 0
Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.paranoid_file_checks: 0
Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.force_consistency_checks: 1
Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.report_bg_io_stats: 0
Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.ttl: 2592000
Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.periodic_compaction_seconds: 0
Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.preclude_last_level_data_seconds: 0
Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.preserve_internal_time_seconds: 0
Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.enable_blob_files: false
Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.min_blob_size: 0
Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.blob_file_size: 268435456
Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.blob_compression_type: NoCompression
Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.enable_blob_garbage_collection: false
Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.blob_garbage_collection_age_cutoff: 0.250000
Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.blob_compaction_readahead_size: 0
Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.blob_file_starting_level: 0
Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: [db/column_family.cc:578] Failed to register data paths of column family (id: 4, name: p-0)
Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [p-0]:
Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.comparator: leveldb.BytewiseComparator
Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.merge_operator: None
Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.compaction_filter: None
Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.compaction_filter_factory: None
Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.sst_partitioner_factory: None
Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.memtable_factory: SkipListFactory
Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.table_factory: BlockBasedTable
Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: table_factory options: flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x55689a0ce160)#012 cache_index_and_filter_blocks: 1#012 cache_index_and_filter_blocks_with_high_priority: 0#012 pin_l0_filter_and_index_blocks_in_cache: 0#012 pin_top_level_index_and_filter: 1#012 index_type: 0#012 data_block_index_type: 0#012 index_shortening: 1#012 data_block_hash_table_util_ratio: 0.750000#012 checksum: 4#012 no_block_cache: 0#012 block_cache: 0x5568990c62d0#012 block_cache_name: BinnedLRUCache#012 block_cache_options:#012 capacity : 483183820#012 num_shard_bits : 4#012 strict_capacity_limit : 0#012 high_pri_pool_ratio: 0.000#012 block_cache_compressed: (nil)#012 persistent_cache: (nil)#012 block_size: 4096#012 block_size_deviation: 10#012 block_restart_interval: 16#012 index_block_restart_interval: 1#012 metadata_block_size: 4096#012 partition_filters: 0#012 use_delta_encoding: 1#012 filter_policy: bloomfilter#012 whole_key_filtering: 1#012 verify_compression: 0#012 read_amp_bytes_per_bit: 0#012 format_version: 5#012 enable_index_compression: 1#012 block_align: 0#012 max_auto_readahead_size: 262144#012 prepopulate_block_cache: 0#012 initial_auto_readahead_size: 8192#012 num_file_reads_for_auto_readahead: 2
Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.write_buffer_size: 16777216
Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.max_write_buffer_number: 64
Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.compression: LZ4
Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.bottommost_compression: Disabled
Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.prefix_extractor: nullptr
Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.memtable_insert_with_hint_prefix_extractor: nullptr
Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.num_levels: 7
Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.min_write_buffer_number_to_merge: 6
Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.max_write_buffer_number_to_maintain: 0
Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.max_write_buffer_size_to_maintain: 0
Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.bottommost_compression_opts.window_bits: -14
Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.bottommost_compression_opts.level: 32767
Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.bottommost_compression_opts.strategy: 0
Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.bottommost_compression_opts.max_dict_bytes: 0
Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.bottommost_compression_opts.parallel_threads: 1
Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.bottommost_compression_opts.enabled: false
Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.compression_opts.window_bits: -14
Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.compression_opts.level: 32767
Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.compression_opts.strategy: 0
Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.compression_opts.max_dict_bytes: 0
Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.compression_opts.zstd_max_train_bytes: 0
Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.compression_opts.use_zstd_dict_trainer: true
Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.compression_opts.parallel_threads: 1
Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.compression_opts.enabled: false
Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.compression_opts.max_dict_buffer_bytes: 0
Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.level0_file_num_compaction_trigger: 8
Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.level0_slowdown_writes_trigger: 20
Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.level0_stop_writes_trigger: 36
Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.target_file_size_base: 67108864
Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.target_file_size_multiplier: 1
Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.max_bytes_for_level_base: 1073741824
Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.max_bytes_for_level_multiplier: 8.000000
Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.max_sequential_skip_in_iterations: 8
Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.max_compaction_bytes: 1677721600
Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.ignore_max_compaction_bytes_for_input: true
Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.arena_block_size: 1048576
Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.soft_pending_compaction_bytes_limit: 68719476736
Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.hard_pending_compaction_bytes_limit: 274877906944
Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.disable_auto_compactions: 0
Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.compaction_style: kCompactionStyleLevel
Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.compaction_pri: kMinOverlappingRatio
Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.inplace_update_support: 0
Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.inplace_update_num_locks: 10000
Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.memtable_prefix_bloom_size_ratio: 0.000000
Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.memtable_whole_key_filtering: 0
Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.memtable_huge_page_size: 0
Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.bloom_locality: 0
Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.max_successive_merges: 0
Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.optimize_filters_for_hits: 0
Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.paranoid_file_checks: 0
Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.force_consistency_checks: 1
Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.report_bg_io_stats: 0
Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.ttl: 2592000
Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.periodic_compaction_seconds: 0
Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.preclude_last_level_data_seconds: 0
Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.preserve_internal_time_seconds: 0
Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.enable_blob_files: false
Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.min_blob_size: 0
Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.blob_file_size: 268435456
Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.blob_compression_type: NoCompression
Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.enable_blob_garbage_collection: false
Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.blob_garbage_collection_age_cutoff: 0.250000
Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.blob_compaction_readahead_size: 0
Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.blob_file_starting_level: 0
Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: [db/column_family.cc:578] Failed to register data paths of column family (id: 5, name: p-1)
Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [p-1]:
Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.comparator: leveldb.BytewiseComparator
Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.merge_operator: None
Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.compaction_filter: None
Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.compaction_filter_factory: None
Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.sst_partitioner_factory: None
Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.memtable_factory: SkipListFactory
Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.table_factory: BlockBasedTable
Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: table_factory options: flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x55689a0ce160)#012 cache_index_and_filter_blocks: 1#012 cache_index_and_filter_blocks_with_high_priority: 0#012 pin_l0_filter_and_index_blocks_in_cache: 0#012 pin_top_level_index_and_filter: 1#012 index_type:
0#012 data_block_index_type: 0#012 index_shortening: 1#012 data_block_hash_table_util_ratio: 0.750000#012 checksum: 4#012 no_block_cache: 0#012 block_cache: 0x5568990c62d0#012 block_cache_name: BinnedLRUCache#012 block_cache_options:#012 capacity : 483183820#012 num_shard_bits : 4#012 strict_capacity_limit : 0#012 high_pri_pool_ratio: 0.000#012 block_cache_compressed: (nil)#012 persistent_cache: (nil)#012 block_size: 4096#012 block_size_deviation: 10#012 block_restart_interval: 16#012 index_block_restart_interval: 1#012 metadata_block_size: 4096#012 partition_filters: 0#012 use_delta_encoding: 1#012 filter_policy: bloomfilter#012 whole_key_filtering: 1#012 verify_compression: 0#012 read_amp_bytes_per_bit: 0#012 format_version: 5#012 enable_index_compression: 1#012 block_align: 0#012 max_auto_readahead_size: 262144#012 prepopulate_block_cache: 0#012 initial_auto_readahead_size: 8192#012 num_file_reads_for_auto_readahead: 2 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.write_buffer_size: 16777216 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.max_write_buffer_number: 64 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.compression: LZ4 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.bottommost_compression: Disabled Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.prefix_extractor: nullptr Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.memtable_insert_with_hint_prefix_extractor: nullptr Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.num_levels: 7 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.min_write_buffer_number_to_merge: 6 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.max_write_buffer_number_to_maintain: 0 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.max_write_buffer_size_to_maintain: 0 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.bottommost_compression_opts.window_bits: -14 Dec 15 02:45:05 localhost ceph-osd[31375]: 
rocksdb: Options.bottommost_compression_opts.level: 32767 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.bottommost_compression_opts.strategy: 0 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.bottommost_compression_opts.max_dict_bytes: 0 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.bottommost_compression_opts.zstd_max_train_bytes: 0 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.bottommost_compression_opts.parallel_threads: 1 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.bottommost_compression_opts.enabled: false Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.bottommost_compression_opts.max_dict_buffer_bytes: 0 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.bottommost_compression_opts.use_zstd_dict_trainer: true Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.compression_opts.window_bits: -14 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.compression_opts.level: 32767 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.compression_opts.strategy: 0 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.compression_opts.max_dict_bytes: 0 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.compression_opts.zstd_max_train_bytes: 0 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.compression_opts.use_zstd_dict_trainer: true Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.compression_opts.parallel_threads: 1 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.compression_opts.enabled: false Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.compression_opts.max_dict_buffer_bytes: 0 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.level0_file_num_compaction_trigger: 8 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.level0_slowdown_writes_trigger: 20 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.level0_stop_writes_trigger: 36 Dec 15 02:45:05 localhost 
ceph-osd[31375]: rocksdb: Options.target_file_size_base: 67108864 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.target_file_size_multiplier: 1 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.max_bytes_for_level_base: 1073741824 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.max_bytes_for_level_multiplier: 8.000000 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.max_sequential_skip_in_iterations: 8 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.max_compaction_bytes: 1677721600 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.ignore_max_compaction_bytes_for_input: true Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.arena_block_size: 1048576 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.soft_pending_compaction_bytes_limit: 68719476736 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.hard_pending_compaction_bytes_limit: 274877906944 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.disable_auto_compactions: 0 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.compaction_style: kCompactionStyleLevel Dec 15 02:45:05 localhost 
ceph-osd[31375]: rocksdb: Options.compaction_pri: kMinOverlappingRatio Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.compaction_options_universal.size_ratio: 1 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.compaction_options_universal.min_merge_width: 2 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0); Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.inplace_update_support: 0 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.inplace_update_num_locks: 10000 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.memtable_prefix_bloom_size_ratio: 0.000000 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.memtable_whole_key_filtering: 0 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.memtable_huge_page_size: 0 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.bloom_locality: 0 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.max_successive_merges: 0 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.optimize_filters_for_hits: 0 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.paranoid_file_checks: 0 Dec 15 02:45:05 
localhost ceph-osd[31375]: rocksdb: Options.force_consistency_checks: 1 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.report_bg_io_stats: 0 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.ttl: 2592000 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.periodic_compaction_seconds: 0 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.preclude_last_level_data_seconds: 0 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.preserve_internal_time_seconds: 0 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.enable_blob_files: false Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.min_blob_size: 0 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.blob_file_size: 268435456 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.blob_compression_type: NoCompression Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.enable_blob_garbage_collection: false Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.blob_garbage_collection_age_cutoff: 0.250000 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.blob_compaction_readahead_size: 0 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.blob_file_starting_level: 0 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.experimental_mempurge_threshold: 0.000000 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: [db/column_family.cc:578] Failed to register data paths of column family (id: 6, name: p-2) Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [p-2]: Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.comparator: leveldb.BytewiseComparator Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.merge_operator: None Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.compaction_filter: None Dec 15 
02:45:05 localhost ceph-osd[31375]: rocksdb: Options.compaction_filter_factory: None Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.sst_partitioner_factory: None Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.memtable_factory: SkipListFactory Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.table_factory: BlockBasedTable Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: table_factory options: flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x55689a0ce160)#012 cache_index_and_filter_blocks: 1#012 cache_index_and_filter_blocks_with_high_priority: 0#012 pin_l0_filter_and_index_blocks_in_cache: 0#012 pin_top_level_index_and_filter: 1#012 index_type: 0#012 data_block_index_type: 0#012 index_shortening: 1#012 data_block_hash_table_util_ratio: 0.750000#012 checksum: 4#012 no_block_cache: 0#012 block_cache: 0x5568990c62d0#012 block_cache_name: BinnedLRUCache#012 block_cache_options:#012 capacity : 483183820#012 num_shard_bits : 4#012 strict_capacity_limit : 0#012 high_pri_pool_ratio: 0.000#012 block_cache_compressed: (nil)#012 persistent_cache: (nil)#012 block_size: 4096#012 block_size_deviation: 10#012 block_restart_interval: 16#012 index_block_restart_interval: 1#012 metadata_block_size: 4096#012 partition_filters: 0#012 use_delta_encoding: 1#012 filter_policy: bloomfilter#012 whole_key_filtering: 1#012 verify_compression: 0#012 read_amp_bytes_per_bit: 0#012 format_version: 5#012 enable_index_compression: 1#012 block_align: 0#012 max_auto_readahead_size: 262144#012 prepopulate_block_cache: 0#012 initial_auto_readahead_size: 8192#012 num_file_reads_for_auto_readahead: 2 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.write_buffer_size: 16777216 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.max_write_buffer_number: 64 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.compression: LZ4 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.bottommost_compression: Disabled Dec 15 
02:45:05 localhost ceph-osd[31375]: rocksdb: Options.prefix_extractor: nullptr Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.memtable_insert_with_hint_prefix_extractor: nullptr Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.num_levels: 7 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.min_write_buffer_number_to_merge: 6 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.max_write_buffer_number_to_maintain: 0 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.max_write_buffer_size_to_maintain: 0 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.bottommost_compression_opts.window_bits: -14 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.bottommost_compression_opts.level: 32767 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.bottommost_compression_opts.strategy: 0 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.bottommost_compression_opts.max_dict_bytes: 0 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.bottommost_compression_opts.zstd_max_train_bytes: 0 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.bottommost_compression_opts.parallel_threads: 1 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.bottommost_compression_opts.enabled: false Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.bottommost_compression_opts.max_dict_buffer_bytes: 0 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.bottommost_compression_opts.use_zstd_dict_trainer: true Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.compression_opts.window_bits: -14 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.compression_opts.level: 32767 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.compression_opts.strategy: 0 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.compression_opts.max_dict_bytes: 0 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.compression_opts.zstd_max_train_bytes: 0 Dec 15 
02:45:05 localhost ceph-osd[31375]: rocksdb: Options.compression_opts.use_zstd_dict_trainer: true Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.compression_opts.parallel_threads: 1 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.compression_opts.enabled: false Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.compression_opts.max_dict_buffer_bytes: 0 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.level0_file_num_compaction_trigger: 8 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.level0_slowdown_writes_trigger: 20 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.level0_stop_writes_trigger: 36 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.target_file_size_base: 67108864 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.target_file_size_multiplier: 1 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.max_bytes_for_level_base: 1073741824 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.max_bytes_for_level_multiplier: 8.000000 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.max_sequential_skip_in_iterations: 8 Dec 15 02:45:05 localhost 
ceph-osd[31375]: rocksdb: Options.max_compaction_bytes: 1677721600 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.ignore_max_compaction_bytes_for_input: true Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.arena_block_size: 1048576 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.soft_pending_compaction_bytes_limit: 68719476736 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.hard_pending_compaction_bytes_limit: 274877906944 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.disable_auto_compactions: 0 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.compaction_style: kCompactionStyleLevel Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.compaction_pri: kMinOverlappingRatio Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.compaction_options_universal.size_ratio: 1 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.compaction_options_universal.min_merge_width: 2 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0); Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.inplace_update_support: 0 Dec 15 02:45:05 localhost 
ceph-osd[31375]: rocksdb: Options.inplace_update_num_locks: 10000 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.memtable_prefix_bloom_size_ratio: 0.000000 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.memtable_whole_key_filtering: 0 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.memtable_huge_page_size: 0 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.bloom_locality: 0 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.max_successive_merges: 0 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.optimize_filters_for_hits: 0 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.paranoid_file_checks: 0 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.force_consistency_checks: 1 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.report_bg_io_stats: 0 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.ttl: 2592000 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.periodic_compaction_seconds: 0 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.preclude_last_level_data_seconds: 0 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.preserve_internal_time_seconds: 0 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.enable_blob_files: false Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.min_blob_size: 0 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.blob_file_size: 268435456 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.blob_compression_type: NoCompression Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.enable_blob_garbage_collection: false Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.blob_garbage_collection_age_cutoff: 0.250000 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.blob_compaction_readahead_size: 0 Dec 15 02:45:05 localhost 
ceph-osd[31375]: rocksdb: Options.blob_file_starting_level: 0 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.experimental_mempurge_threshold: 0.000000 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: [db/column_family.cc:578] Failed to register data paths of column family (id: 7, name: O-0) Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [O-0]: Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.comparator: leveldb.BytewiseComparator Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.merge_operator: None Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.compaction_filter: None Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.compaction_filter_factory: None Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.sst_partitioner_factory: None Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.memtable_factory: SkipListFactory Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.table_factory: BlockBasedTable Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: table_factory options: flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x55689a0cf3a0)#012 cache_index_and_filter_blocks: 1#012 cache_index_and_filter_blocks_with_high_priority: 0#012 pin_l0_filter_and_index_blocks_in_cache: 0#012 pin_top_level_index_and_filter: 1#012 index_type: 0#012 data_block_index_type: 0#012 index_shortening: 1#012 data_block_hash_table_util_ratio: 0.750000#012 checksum: 4#012 no_block_cache: 0#012 block_cache: 0x5568990c7610#012 block_cache_name: BinnedLRUCache#012 block_cache_options:#012 capacity : 536870912#012 num_shard_bits : 4#012 strict_capacity_limit : 0#012 high_pri_pool_ratio: 0.000#012 block_cache_compressed: (nil)#012 persistent_cache: (nil)#012 block_size: 4096#012 block_size_deviation: 10#012 block_restart_interval: 16#012 index_block_restart_interval: 1#012 metadata_block_size: 4096#012 partition_filters: 0#012 
use_delta_encoding: 1#012 filter_policy: bloomfilter#012 whole_key_filtering: 1#012 verify_compression: 0#012 read_amp_bytes_per_bit: 0#012 format_version: 5#012 enable_index_compression: 1#012 block_align: 0#012 max_auto_readahead_size: 262144#012 prepopulate_block_cache: 0#012 initial_auto_readahead_size: 8192#012 num_file_reads_for_auto_readahead: 2 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.write_buffer_size: 16777216 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.max_write_buffer_number: 64 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.compression: LZ4 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.bottommost_compression: Disabled Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.prefix_extractor: nullptr Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.memtable_insert_with_hint_prefix_extractor: nullptr Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.num_levels: 7 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.min_write_buffer_number_to_merge: 6 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.max_write_buffer_number_to_maintain: 0 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.max_write_buffer_size_to_maintain: 0 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.bottommost_compression_opts.window_bits: -14 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.bottommost_compression_opts.level: 32767 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.bottommost_compression_opts.strategy: 0 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.bottommost_compression_opts.max_dict_bytes: 0 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.bottommost_compression_opts.zstd_max_train_bytes: 0 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.bottommost_compression_opts.parallel_threads: 1 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.bottommost_compression_opts.enabled: false 
Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.bottommost_compression_opts.max_dict_buffer_bytes: 0 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.bottommost_compression_opts.use_zstd_dict_trainer: true Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.compression_opts.window_bits: -14 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.compression_opts.level: 32767 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.compression_opts.strategy: 0 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.compression_opts.max_dict_bytes: 0 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.compression_opts.zstd_max_train_bytes: 0 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.compression_opts.use_zstd_dict_trainer: true Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.compression_opts.parallel_threads: 1 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.compression_opts.enabled: false Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.compression_opts.max_dict_buffer_bytes: 0 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.level0_file_num_compaction_trigger: 8 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.level0_slowdown_writes_trigger: 20 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.level0_stop_writes_trigger: 36 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.target_file_size_base: 67108864 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.target_file_size_multiplier: 1 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.max_bytes_for_level_base: 1073741824 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.max_bytes_for_level_multiplier: 8.000000 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1 Dec 15 02:45:05 localhost 
ceph-osd[31375]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.max_sequential_skip_in_iterations: 8 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.max_compaction_bytes: 1677721600 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.ignore_max_compaction_bytes_for_input: true Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.arena_block_size: 1048576 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.soft_pending_compaction_bytes_limit: 68719476736 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.hard_pending_compaction_bytes_limit: 274877906944 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.disable_auto_compactions: 0 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.compaction_style: kCompactionStyleLevel Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.compaction_pri: kMinOverlappingRatio Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.compaction_options_universal.size_ratio: 1 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.compaction_options_universal.min_merge_width: 2 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: 
Options.compaction_options_universal.compression_size_percent: -1 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0); Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.inplace_update_support: 0 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.inplace_update_num_locks: 10000 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.memtable_prefix_bloom_size_ratio: 0.000000 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.memtable_whole_key_filtering: 0 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.memtable_huge_page_size: 0 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.bloom_locality: 0 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.max_successive_merges: 0 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.optimize_filters_for_hits: 0 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.paranoid_file_checks: 0 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.force_consistency_checks: 1 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.report_bg_io_stats: 0 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.ttl: 2592000 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.periodic_compaction_seconds: 0 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.preclude_last_level_data_seconds: 0 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.preserve_internal_time_seconds: 0 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: 
Options.enable_blob_files: false Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.min_blob_size: 0 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.blob_file_size: 268435456 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.blob_compression_type: NoCompression Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.enable_blob_garbage_collection: false Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.blob_garbage_collection_age_cutoff: 0.250000 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.blob_compaction_readahead_size: 0 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.blob_file_starting_level: 0 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.experimental_mempurge_threshold: 0.000000 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: [db/column_family.cc:578] Failed to register data paths of column family (id: 8, name: O-1) Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [O-1]: Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.comparator: leveldb.BytewiseComparator Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.merge_operator: None Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.compaction_filter: None Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.compaction_filter_factory: None Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.sst_partitioner_factory: None Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.memtable_factory: SkipListFactory Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.table_factory: BlockBasedTable Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: table_factory options: flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x55689a0cf3a0)#012 cache_index_and_filter_blocks: 1#012 
cache_index_and_filter_blocks_with_high_priority: 0#012 pin_l0_filter_and_index_blocks_in_cache: 0#012 pin_top_level_index_and_filter: 1#012 index_type: 0#012 data_block_index_type: 0#012 index_shortening: 1#012 data_block_hash_table_util_ratio: 0.750000#012 checksum: 4#012 no_block_cache: 0#012 block_cache: 0x5568990c7610#012 block_cache_name: BinnedLRUCache#012 block_cache_options:#012 capacity : 536870912#012 num_shard_bits : 4#012 strict_capacity_limit : 0#012 high_pri_pool_ratio: 0.000#012 block_cache_compressed: (nil)#012 persistent_cache: (nil)#012 block_size: 4096#012 block_size_deviation: 10#012 block_restart_interval: 16#012 index_block_restart_interval: 1#012 metadata_block_size: 4096#012 partition_filters: 0#012 use_delta_encoding: 1#012 filter_policy: bloomfilter#012 whole_key_filtering: 1#012 verify_compression: 0#012 read_amp_bytes_per_bit: 0#012 format_version: 5#012 enable_index_compression: 1#012 block_align: 0#012 max_auto_readahead_size: 262144#012 prepopulate_block_cache: 0#012 initial_auto_readahead_size: 8192#012 num_file_reads_for_auto_readahead: 2 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.write_buffer_size: 16777216 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.max_write_buffer_number: 64 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.compression: LZ4 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.bottommost_compression: Disabled Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.prefix_extractor: nullptr Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.memtable_insert_with_hint_prefix_extractor: nullptr Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.num_levels: 7 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.min_write_buffer_number_to_merge: 6 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.max_write_buffer_number_to_maintain: 0 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.max_write_buffer_size_to_maintain: 0 
Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.bottommost_compression_opts.window_bits: -14 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.bottommost_compression_opts.level: 32767 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.bottommost_compression_opts.strategy: 0 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.bottommost_compression_opts.max_dict_bytes: 0 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.bottommost_compression_opts.zstd_max_train_bytes: 0 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.bottommost_compression_opts.parallel_threads: 1 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.bottommost_compression_opts.enabled: false Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.bottommost_compression_opts.max_dict_buffer_bytes: 0 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.bottommost_compression_opts.use_zstd_dict_trainer: true Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.compression_opts.window_bits: -14 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.compression_opts.level: 32767 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.compression_opts.strategy: 0 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.compression_opts.max_dict_bytes: 0 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.compression_opts.zstd_max_train_bytes: 0 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.compression_opts.use_zstd_dict_trainer: true Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.compression_opts.parallel_threads: 1 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.compression_opts.enabled: false Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.compression_opts.max_dict_buffer_bytes: 0 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.level0_file_num_compaction_trigger: 8 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: 
Options.level0_slowdown_writes_trigger: 20 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.level0_stop_writes_trigger: 36 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.target_file_size_base: 67108864 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.target_file_size_multiplier: 1 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.max_bytes_for_level_base: 1073741824 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.max_bytes_for_level_multiplier: 8.000000 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.max_sequential_skip_in_iterations: 8 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.max_compaction_bytes: 1677721600 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.ignore_max_compaction_bytes_for_input: true Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.arena_block_size: 1048576 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.soft_pending_compaction_bytes_limit: 68719476736 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.hard_pending_compaction_bytes_limit: 274877906944 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: 
Options.disable_auto_compactions: 0 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.compaction_style: kCompactionStyleLevel Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.compaction_pri: kMinOverlappingRatio Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.compaction_options_universal.size_ratio: 1 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.compaction_options_universal.min_merge_width: 2 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0); Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.inplace_update_support: 0 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.inplace_update_num_locks: 10000 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.memtable_prefix_bloom_size_ratio: 0.000000 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.memtable_whole_key_filtering: 0 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.memtable_huge_page_size: 0 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.bloom_locality: 0 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.max_successive_merges: 0 Dec 15 02:45:05 localhost 
ceph-osd[31375]: rocksdb: Options.optimize_filters_for_hits: 0 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.paranoid_file_checks: 0 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.force_consistency_checks: 1 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.report_bg_io_stats: 0 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.ttl: 2592000 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.periodic_compaction_seconds: 0 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.preclude_last_level_data_seconds: 0 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.preserve_internal_time_seconds: 0 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.enable_blob_files: false Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.min_blob_size: 0 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.blob_file_size: 268435456 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.blob_compression_type: NoCompression Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.enable_blob_garbage_collection: false Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.blob_garbage_collection_age_cutoff: 0.250000 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.blob_compaction_readahead_size: 0 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.blob_file_starting_level: 0 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.experimental_mempurge_threshold: 0.000000 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: [db/column_family.cc:578] Failed to register data paths of column family (id: 9, name: O-2) Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [O-2]: Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.comparator: leveldb.BytewiseComparator Dec 15 02:45:05 
localhost ceph-osd[31375]: rocksdb: Options.merge_operator: None Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.compaction_filter: None Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.compaction_filter_factory: None Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.sst_partitioner_factory: None Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.memtable_factory: SkipListFactory Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.table_factory: BlockBasedTable Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: table_factory options: flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x55689a0cf3a0)#012 cache_index_and_filter_blocks: 1#012 cache_index_and_filter_blocks_with_high_priority: 0#012 pin_l0_filter_and_index_blocks_in_cache: 0#012 pin_top_level_index_and_filter: 1#012 index_type: 0#012 data_block_index_type: 0#012 index_shortening: 1#012 data_block_hash_table_util_ratio: 0.750000#012 checksum: 4#012 no_block_cache: 0#012 block_cache: 0x5568990c7610#012 block_cache_name: BinnedLRUCache#012 block_cache_options:#012 capacity : 536870912#012 num_shard_bits : 4#012 strict_capacity_limit : 0#012 high_pri_pool_ratio: 0.000#012 block_cache_compressed: (nil)#012 persistent_cache: (nil)#012 block_size: 4096#012 block_size_deviation: 10#012 block_restart_interval: 16#012 index_block_restart_interval: 1#012 metadata_block_size: 4096#012 partition_filters: 0#012 use_delta_encoding: 1#012 filter_policy: bloomfilter#012 whole_key_filtering: 1#012 verify_compression: 0#012 read_amp_bytes_per_bit: 0#012 format_version: 5#012 enable_index_compression: 1#012 block_align: 0#012 max_auto_readahead_size: 262144#012 prepopulate_block_cache: 0#012 initial_auto_readahead_size: 8192#012 num_file_reads_for_auto_readahead: 2 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.write_buffer_size: 16777216 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.max_write_buffer_number: 64 Dec 15 02:45:05 localhost 
ceph-osd[31375]: rocksdb: Options.compression: LZ4 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.bottommost_compression: Disabled Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.prefix_extractor: nullptr Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.memtable_insert_with_hint_prefix_extractor: nullptr Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.num_levels: 7 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.min_write_buffer_number_to_merge: 6 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.max_write_buffer_number_to_maintain: 0 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.max_write_buffer_size_to_maintain: 0 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.bottommost_compression_opts.window_bits: -14 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.bottommost_compression_opts.level: 32767 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.bottommost_compression_opts.strategy: 0 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.bottommost_compression_opts.max_dict_bytes: 0 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.bottommost_compression_opts.zstd_max_train_bytes: 0 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.bottommost_compression_opts.parallel_threads: 1 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.bottommost_compression_opts.enabled: false Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.bottommost_compression_opts.max_dict_buffer_bytes: 0 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.bottommost_compression_opts.use_zstd_dict_trainer: true Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.compression_opts.window_bits: -14 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.compression_opts.level: 32767 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.compression_opts.strategy: 0 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: 
Options.compression_opts.max_dict_bytes: 0 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.compression_opts.zstd_max_train_bytes: 0 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.compression_opts.use_zstd_dict_trainer: true Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.compression_opts.parallel_threads: 1 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.compression_opts.enabled: false Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.compression_opts.max_dict_buffer_bytes: 0 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.level0_file_num_compaction_trigger: 8 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.level0_slowdown_writes_trigger: 20 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.level0_stop_writes_trigger: 36 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.target_file_size_base: 67108864 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.target_file_size_multiplier: 1 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.max_bytes_for_level_base: 1073741824 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.max_bytes_for_level_multiplier: 8.000000 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: 
Options.max_bytes_for_level_multiplier_addtl[6]: 1 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.max_sequential_skip_in_iterations: 8 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.max_compaction_bytes: 1677721600 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.ignore_max_compaction_bytes_for_input: true Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.arena_block_size: 1048576 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.soft_pending_compaction_bytes_limit: 68719476736 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.hard_pending_compaction_bytes_limit: 274877906944 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.disable_auto_compactions: 0 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.compaction_style: kCompactionStyleLevel Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.compaction_pri: kMinOverlappingRatio Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.compaction_options_universal.size_ratio: 1 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.compaction_options_universal.min_merge_width: 2 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 
32768 Deletion trigger = 16384 Deletion ratio = 0); Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.inplace_update_support: 0 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.inplace_update_num_locks: 10000 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.memtable_prefix_bloom_size_ratio: 0.000000 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.memtable_whole_key_filtering: 0 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.memtable_huge_page_size: 0 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.bloom_locality: 0 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.max_successive_merges: 0 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.optimize_filters_for_hits: 0 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.paranoid_file_checks: 0 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.force_consistency_checks: 1 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.report_bg_io_stats: 0 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.ttl: 2592000 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.periodic_compaction_seconds: 0 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.preclude_last_level_data_seconds: 0 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.preserve_internal_time_seconds: 0 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.enable_blob_files: false Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.min_blob_size: 0 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.blob_file_size: 268435456 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.blob_compression_type: NoCompression Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.enable_blob_garbage_collection: false Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.blob_garbage_collection_age_cutoff: 0.250000 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: 
Options.blob_garbage_collection_force_threshold: 1.000000 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.blob_compaction_readahead_size: 0 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.blob_file_starting_level: 0 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: Options.experimental_mempurge_threshold: 0.000000 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: [db/column_family.cc:578] Failed to register data paths of column family (id: 10, name: L) Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: [db/column_family.cc:635] #011(skipping printing options) Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: [db/column_family.cc:578] Failed to register data paths of column family (id: 11, name: P) Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: [db/column_family.cc:635] #011(skipping printing options) Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: [db/version_set.cc:5566] Recovered from manifest file:db/MANIFEST-000032 succeeded,manifest_file_number is 32, next_file_number is 34, last_sequence is 12, log_number is 5,prev_log_number is 0,max_column_family is 11,min_log_number_to_keep is 5 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: [db/version_set.cc:5581] Column family [default] (ID 0), log number is 5 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: [db/version_set.cc:5581] Column family [m-0] (ID 1), log number is 5 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: [db/version_set.cc:5581] Column family [m-1] (ID 2), log number is 5 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: [db/version_set.cc:5581] Column family [m-2] (ID 3), log number is 5 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: [db/version_set.cc:5581] Column family [p-0] (ID 4), log number is 5 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: [db/version_set.cc:5581] Column family [p-1] (ID 5), log number is 5 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: [db/version_set.cc:5581] Column family [p-2] (ID 6), log number is 5 Dec 
15 02:45:05 localhost ceph-osd[31375]: rocksdb: [db/version_set.cc:5581] Column family [O-0] (ID 7), log number is 5 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: [db/version_set.cc:5581] Column family [O-1] (ID 8), log number is 5 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: [db/version_set.cc:5581] Column family [O-2] (ID 9), log number is 5 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: [db/version_set.cc:5581] Column family [L] (ID 10), log number is 5 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: [db/version_set.cc:5581] Column family [P] (ID 11), log number is 5 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: [db/db_impl/db_impl_open.cc:539] DB ID: abd90ad5-9c07-431d-a96e-b08c1474ad86 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: EVENT_LOG_v1 {"time_micros": 1765784705798489, "job": 1, "event": "recovery_started", "wal_files": [31]} Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: [db/db_impl/db_impl_open.cc:1043] Recovering log #31 mode 2 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: EVENT_LOG_v1 {"time_micros": 1765784705805439, "cf_name": "default", "job": 1, "event": "table_file_creation", "file_number": 35, "file_size": 1261, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 13, "largest_seqno": 21, "table_properties": {"data_size": 128, "index_size": 27, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 69, "raw_key_size": 87, "raw_average_key_size": 17, "raw_value_size": 82, "raw_average_value_size": 16, "num_data_blocks": 1, "num_entries": 5, "num_filter_entries": 5, "num_deletions": 0, "num_merge_operands": 2, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": ".T:int64_array.b:bitwise_xor", "prefix_extractor_name": "nullptr", 
"property_collectors": "[]", "compression": "LZ4", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1765784705, "oldest_key_time": 0, "file_creation_time": 0, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "abd90ad5-9c07-431d-a96e-b08c1474ad86", "db_session_id": "GVF0X3DWF49JJ2HDW5TC", "orig_file_number": 35, "seqno_to_time_mapping": "N/A"}} Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: EVENT_LOG_v1 {"time_micros": 1765784705808851, "cf_name": "p-0", "job": 1, "event": "table_file_creation", "file_number": 36, "file_size": 1609, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 14, "largest_seqno": 15, "table_properties": {"data_size": 468, "index_size": 39, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 69, "raw_key_size": 72, "raw_average_key_size": 36, "raw_value_size": 567, "raw_average_value_size": 283, "num_data_blocks": 1, "num_entries": 2, "num_filter_entries": 2, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "p-0", "column_family_id": 4, "comparator": "leveldb.BytewiseComparator", "merge_operator": "nullptr", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "LZ4", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1765784705, "oldest_key_time": 0, "file_creation_time": 0, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "abd90ad5-9c07-431d-a96e-b08c1474ad86", "db_session_id": "GVF0X3DWF49JJ2HDW5TC", 
"orig_file_number": 36, "seqno_to_time_mapping": "N/A"}} Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: EVENT_LOG_v1 {"time_micros": 1765784705812433, "cf_name": "O-2", "job": 1, "event": "table_file_creation", "file_number": 37, "file_size": 1290, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 16, "largest_seqno": 16, "table_properties": {"data_size": 121, "index_size": 64, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 69, "raw_key_size": 55, "raw_average_key_size": 55, "raw_value_size": 50, "raw_average_value_size": 50, "num_data_blocks": 1, "num_entries": 1, "num_filter_entries": 1, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "O-2", "column_family_id": 9, "comparator": "leveldb.BytewiseComparator", "merge_operator": "nullptr", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "LZ4", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1765784705, "oldest_key_time": 0, "file_creation_time": 0, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "abd90ad5-9c07-431d-a96e-b08c1474ad86", "db_session_id": "GVF0X3DWF49JJ2HDW5TC", "orig_file_number": 37, "seqno_to_time_mapping": "N/A"}} Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: [db/db_impl/db_impl_open.cc:1432] Failed to truncate log #31: IO error: No such file or directory: While open a file for appending: db.wal/000031.log: No such file or directory Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: EVENT_LOG_v1 {"time_micros": 1765784705816137, "job": 1, "event": "recovery_finished"} Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: 
[db/version_set.cc:5047] Creating manifest 40 Dec 15 02:45:05 localhost podman[31709]: 2025-12-15 07:45:05.722399085 +0000 UTC m=+0.032850416 image pull registry.redhat.io/rhceph/rhceph-7-rhel9:latest Dec 15 02:45:05 localhost systemd[1]: Started libcrun container. Dec 15 02:45:05 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/ac11ac63870e6031c6e27fc2ecda4e00be78c3c93205a03b9d16e3474a27e3a5/merged/rootfs supports timestamps until 2038 (0x7fffffff) Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: [db/db_impl/db_impl_open.cc:1987] SstFileManager instance 0x55689918e380 Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: DB pointer 0x556899feba00 Dec 15 02:45:05 localhost ceph-osd[31375]: bluestore(/var/lib/ceph/osd/ceph-0) _open_db opened rocksdb path db options compression=kLZ4Compression,max_write_buffer_number=64,min_write_buffer_number_to_merge=6,compaction_style=kCompactionStyleLevel,write_buffer_size=16777216,max_background_jobs=4,level0_file_num_compaction_trigger=8,max_bytes_for_level_base=1073741824,max_bytes_for_level_multiplier=8,compaction_readahead_size=2MB,max_total_wal_size=1073741824,writable_file_max_buffer_size=0 Dec 15 02:45:05 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/ac11ac63870e6031c6e27fc2ecda4e00be78c3c93205a03b9d16e3474a27e3a5/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff) Dec 15 02:45:05 localhost ceph-osd[31375]: bluestore(/var/lib/ceph/osd/ceph-0) _upgrade_super from 4, latest 4 Dec 15 02:45:05 localhost ceph-osd[31375]: bluestore(/var/lib/ceph/osd/ceph-0) _upgrade_super done Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS ------- Dec 15 02:45:05 localhost ceph-osd[31375]: rocksdb: [db/db_impl/db_impl.cc:1111] #012** DB Stats **#012Uptime(secs): 0.1 total, 0.1 interval#012Cumulative writes: 0 writes, 0 keys, 0 commit groups, 0.0 writes per commit group, ingest: 0.00 GB, 0.00 
MB/s#012Cumulative WAL: 0 writes, 0 syncs, 0.00 writes per sync, written: 0.00 GB, 0.00 MB/s#012Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent#012Interval writes: 0 writes, 0 keys, 0 commit groups, 0.0 writes per commit group, ingest: 0.00 MB, 0.00 MB/s#012Interval WAL: 0 writes, 0 syncs, 0.00 writes per sync, written: 0.00 GB, 0.00 MB/s#012Interval stall: 00:00:0.000 H:M:S, 0.0 percent#012#012** Compaction Stats [default] **#012Level Files Size Score Read(GB) Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012 L0 2/0 2.61 KB 0.2 0.0 0.0 0.0 0.0 0.0 0.0 1.0 0.0 0.2 0.01 0.00 1 0.007 0 0 0.0 0.0#012 Sum 2/0 2.61 KB 0.0 0.0 0.0 0.0 0.0 0.0 0.0 1.0 0.0 0.2 0.01 0.00 1 0.007 0 0 0.0 0.0#012 Int 0/0 0.00 KB 0.0 0.0 0.0 0.0 0.0 0.0 0.0 1.0 0.0 0.2 0.01 0.00 1 0.007 0 0 0.0 0.0#012#012** Compaction Stats [default] **#012Priority Files Size Score Read(GB) Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012User 0/0 0.00 KB 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.2 0.01 0.00 1 0.007 0 0 0.0 0.0#012#012Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0#012#012Uptime(secs): 0.1 total, 0.1 interval#012Flush(GB): cumulative 0.000, interval 0.000#012AddFile(GB): cumulative 0.000, interval 0.000#012AddFile(Total Files): cumulative 0, interval 0#012AddFile(L0 Files): cumulative 0, interval 0#012AddFile(Keys): cumulative 0, interval 0#012Cumulative compaction: 0.00 GB 
write, 0.02 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds#012Interval compaction: 0.00 GB write, 0.02 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds#012Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count#012Block cache BinnedLRUCache@0x5568990c62d0#2 capacity: 460.80 MB usage: 0.94 KB table_size: 0 occupancy: 18446744073709551615 collections: 1 last_copies: 8 last_secs: 7.7e-05 secs_since: 0#012Block cache entry stats(count,size,portion): DataBlock(2,0.72 KB,0.000152323%) FilterBlock(3,0.33 KB,6.95388e-05%) IndexBlock(3,0.34 KB,7.28501e-05%) Misc(1,0.00 KB,0%)#012#012** File Read Latency Histogram By Level [default] **#012#012** Compaction Stats [m-0] **#012Level Files Size Score Read(GB) Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012 Sum 0/0 0.00 KB 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.00 0.00 0 0.000 0 0 0.0 0.0#012 Int 0/0 0.00 KB 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.00 0.00 0 0.000 0 0 0.0 0.0#012#012** Compaction Stats [m-0] **#012Priority Files Size Score Read(GB) Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012#012Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0#012#012Uptime(secs): 0.1 total, 0.1 
interval#012Flush(GB): cumulative 0.000, interval 0.000#012AddFile(GB): cumulative 0.000, interval 0.000#012AddFile(Total Files): cumulative 0, interval 0#012AddFile(L0 Files): cumulative 0, interval 0#012AddFile(Keys): cumulative 0, interval 0#012Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds#012Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds#012Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count#012Block cache BinnedLRUCache@0x5568990c62d0#2 capacity: 460.80 MB usage: 0.94 KB table_size: 0 occupancy: 18446744073709551615 collections: 1 last_copies: 8 last_secs: 7.7e-05 secs_since: 0#012Block cache entry stats(count,size,portion): DataBlock(2,0.72 KB,0.000152323%) FilterBlock(3,0.33 KB,6.95388e-05%) IndexBlock(3,0.34 KB,7.28501e-05%) Misc(1,0.00 KB,0%)#012#012** File Read Latency Histogram By Level [m-0] **#012#012** Compaction Stats [m-1] **#012Level Files Size Score Read(GB) Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012 Sum 0/0 0.00 KB 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.00 0.00 0 0.000 0 0 0.0 0.0#012 Int 0/0 0.00 KB 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.00 0.00 0 0.000 0 0 0.0 0.0#012#012** Compaction Stats [m-1] **#012Priority Files Size Score Read(GB) Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) 
Wblob(GB)#012---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012#012Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0#012#012Uptime(secs): 0.1 total, 0.1 interval#012Flush(GB): cumulative 0.000, interval 0.000#012AddFile(GB): cumulative 0.000, interval 0.000#012AddFile(Total Files): cumulative 0, interval 0#012AddFile(L0 Files): cumulative 0, interval 0#012AddFile(Keys): cumulative 0, interval 0#012Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds#012Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds#012Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count#012 Dec 15 02:45:05 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/ac11ac63870e6031c6e27fc2ecda4e00be78c3c93205a03b9d16e3474a27e3a5/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff) Dec 15 02:45:05 localhost ceph-osd[31375]: /builddir/build/BUILD/ceph-18.2.1/src/cls/cephfs/cls_cephfs.cc:201: loading cephfs Dec 15 02:45:05 localhost ceph-osd[31375]: /builddir/build/BUILD/ceph-18.2.1/src/cls/hello/cls_hello.cc:316: loading cls_hello Dec 15 02:45:05 localhost ceph-osd[31375]: _get_class not permitted to load lua Dec 15 02:45:05 localhost ceph-osd[31375]: _get_class not permitted to load sdk Dec 15 02:45:05 localhost ceph-osd[31375]: _get_class not permitted to load test_remote_reads Dec 15 02:45:05 localhost ceph-osd[31375]: osd.0 0 crush map has features 288232575208783872, adjusting msgr requires for clients Dec 15 02:45:05 localhost ceph-osd[31375]: osd.0 0 crush map has features 
288232575208783872 was 8705, adjusting msgr requires for mons Dec 15 02:45:05 localhost ceph-osd[31375]: osd.0 0 crush map has features 288232575208783872, adjusting msgr requires for osds Dec 15 02:45:05 localhost ceph-osd[31375]: osd.0 0 check_osdmap_features enabling on-disk ERASURE CODES compat feature Dec 15 02:45:05 localhost ceph-osd[31375]: osd.0 0 load_pgs Dec 15 02:45:05 localhost ceph-osd[31375]: osd.0 0 load_pgs opened 0 pgs Dec 15 02:45:05 localhost ceph-osd[31375]: osd.0 0 log_to_monitors true Dec 15 02:45:05 localhost ceph-bce17446-41b5-5408-a23e-0b011906b44a-osd-0[31370]: 2025-12-15T07:45:05.854+0000 7fec5f366a80 -1 osd.0 0 log_to_monitors true Dec 15 02:45:05 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/ac11ac63870e6031c6e27fc2ecda4e00be78c3c93205a03b9d16e3474a27e3a5/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff) Dec 15 02:45:05 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/ac11ac63870e6031c6e27fc2ecda4e00be78c3c93205a03b9d16e3474a27e3a5/merged/var/lib/ceph/osd/ceph-3 supports timestamps until 2038 (0x7fffffff) Dec 15 02:45:05 localhost podman[31709]: 2025-12-15 07:45:05.87930607 +0000 UTC m=+0.189757411 container init 984f25c455f8a3e234a3e117b823a7cd12a3eda1735512cc69a518d5304e5471 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-bce17446-41b5-5408-a23e-0b011906b44a-osd-3-activate-test, vcs-type=git, RELEASE=main, release=1763362218, vendor=Red Hat, Inc., version=7, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, distribution-scope=public, io.k8s.description=Red Hat Ceph Storage 7, GIT_BRANCH=main, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., com.redhat.component=rhceph-container, url=https://catalog.redhat.com/en/search?searchType=containers, 
io.buildah.version=1.41.4, maintainer=Guillaume Abrioux , name=rhceph, description=Red Hat Ceph Storage 7, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, ceph=True, GIT_REPO=https://github.com/ceph/ceph-container.git, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, CEPH_POINT_RELEASE=, build-date=2025-11-26T19:44:28Z, io.openshift.expose-services=, GIT_CLEAN=True, io.openshift.tags=rhceph ceph) Dec 15 02:45:05 localhost podman[31709]: 2025-12-15 07:45:05.890687713 +0000 UTC m=+0.201139054 container start 984f25c455f8a3e234a3e117b823a7cd12a3eda1735512cc69a518d5304e5471 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-bce17446-41b5-5408-a23e-0b011906b44a-osd-3-activate-test, GIT_BRANCH=main, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, distribution-scope=public, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, build-date=2025-11-26T19:44:28Z, vendor=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, com.redhat.component=rhceph-container, version=7, maintainer=Guillaume Abrioux , architecture=x86_64, io.buildah.version=1.41.4, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.openshift.tags=rhceph ceph, io.k8s.description=Red Hat Ceph Storage 7, vcs-type=git, name=rhceph, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_CLEAN=True, description=Red Hat Ceph Storage 7, release=1763362218, CEPH_POINT_RELEASE=, RELEASE=main, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, ceph=True, GIT_REPO=https://github.com/ceph/ceph-container.git, cpe=cpe:/a:redhat:enterprise_linux:9::appstream) Dec 15 02:45:05 localhost podman[31709]: 2025-12-15 07:45:05.891131154 +0000 
UTC m=+0.201582505 container attach 984f25c455f8a3e234a3e117b823a7cd12a3eda1735512cc69a518d5304e5471 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-bce17446-41b5-5408-a23e-0b011906b44a-osd-3-activate-test, version=7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_REPO=https://github.com/ceph/ceph-container.git, maintainer=Guillaume Abrioux , GIT_BRANCH=main, ceph=True, GIT_CLEAN=True, build-date=2025-11-26T19:44:28Z, RELEASE=main, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, description=Red Hat Ceph Storage 7, CEPH_POINT_RELEASE=, distribution-scope=public, name=rhceph, vcs-type=git, architecture=x86_64, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.openshift.expose-services=, io.k8s.description=Red Hat Ceph Storage 7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, com.redhat.component=rhceph-container, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1763362218, io.buildah.version=1.41.4, io.openshift.tags=rhceph ceph, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, vendor=Red Hat, Inc.) Dec 15 02:45:06 localhost ceph-bce17446-41b5-5408-a23e-0b011906b44a-osd-3-activate-test[31906]: usage: ceph-volume activate [-h] [--osd-id OSD_ID] [--osd-uuid OSD_UUID] Dec 15 02:45:06 localhost ceph-bce17446-41b5-5408-a23e-0b011906b44a-osd-3-activate-test[31906]: [--no-systemd] [--no-tmpfs] Dec 15 02:45:06 localhost ceph-bce17446-41b5-5408-a23e-0b011906b44a-osd-3-activate-test[31906]: ceph-volume activate: error: unrecognized arguments: --bad-option Dec 15 02:45:06 localhost systemd[1]: libpod-984f25c455f8a3e234a3e117b823a7cd12a3eda1735512cc69a518d5304e5471.scope: Deactivated successfully. 
Dec 15 02:45:06 localhost podman[31709]: 2025-12-15 07:45:06.09862386 +0000 UTC m=+0.409075211 container died 984f25c455f8a3e234a3e117b823a7cd12a3eda1735512cc69a518d5304e5471 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-bce17446-41b5-5408-a23e-0b011906b44a-osd-3-activate-test, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhceph ceph, io.buildah.version=1.41.4, version=7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, architecture=x86_64, distribution-scope=public, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, name=rhceph, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_CLEAN=True, vendor=Red Hat, Inc., RELEASE=main, description=Red Hat Ceph Storage 7, maintainer=Guillaume Abrioux , GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, ceph=True, io.openshift.expose-services=, GIT_REPO=https://github.com/ceph/ceph-container.git, CEPH_POINT_RELEASE=, com.redhat.component=rhceph-container, build-date=2025-11-26T19:44:28Z, io.k8s.description=Red Hat Ceph Storage 7, GIT_BRANCH=main, release=1763362218) Dec 15 02:45:06 localhost podman[31944]: 2025-12-15 07:45:06.18107102 +0000 UTC m=+0.073722117 container remove 984f25c455f8a3e234a3e117b823a7cd12a3eda1735512cc69a518d5304e5471 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-bce17446-41b5-5408-a23e-0b011906b44a-osd-3-activate-test, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, GIT_CLEAN=True, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, url=https://catalog.redhat.com/en/search?searchType=containers, GIT_BRANCH=main, maintainer=Guillaume Abrioux , 
io.openshift.tags=rhceph ceph, GIT_REPO=https://github.com/ceph/ceph-container.git, ceph=True, io.buildah.version=1.41.4, com.redhat.component=rhceph-container, io.k8s.description=Red Hat Ceph Storage 7, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, name=rhceph, io.openshift.expose-services=, version=7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., cpe=cpe:/a:redhat:enterprise_linux:9::appstream, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, build-date=2025-11-26T19:44:28Z, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, distribution-scope=public, CEPH_POINT_RELEASE=, RELEASE=main, release=1763362218, vcs-type=git, description=Red Hat Ceph Storage 7) Dec 15 02:45:06 localhost systemd[1]: libpod-conmon-984f25c455f8a3e234a3e117b823a7cd12a3eda1735512cc69a518d5304e5471.scope: Deactivated successfully. Dec 15 02:45:06 localhost systemd[1]: var-lib-containers-storage-overlay-ac11ac63870e6031c6e27fc2ecda4e00be78c3c93205a03b9d16e3474a27e3a5-merged.mount: Deactivated successfully. Dec 15 02:45:06 localhost systemd[1]: Reloading. Dec 15 02:45:06 localhost systemd-rc-local-generator[31991]: /etc/rc.d/rc.local is not marked executable, skipping. Dec 15 02:45:06 localhost systemd-sysv-generator[32000]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. Dec 15 02:45:06 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Dec 15 02:45:06 localhost systemd[1]: Reloading. Dec 15 02:45:06 localhost systemd-sysv-generator[32040]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. 
Please update package to include a native systemd unit file, in order to make it more safe and robust. Dec 15 02:45:06 localhost systemd-rc-local-generator[32036]: /etc/rc.d/rc.local is not marked executable, skipping. Dec 15 02:45:06 localhost ceph-osd[31375]: log_channel(cluster) log [DBG] : purged_snaps scrub starts Dec 15 02:45:06 localhost ceph-osd[31375]: log_channel(cluster) log [DBG] : purged_snaps scrub ok Dec 15 02:45:06 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Dec 15 02:45:06 localhost systemd[1]: Starting Ceph osd.3 for bce17446-41b5-5408-a23e-0b011906b44a... Dec 15 02:45:07 localhost podman[32105]: Dec 15 02:45:07 localhost podman[32105]: 2025-12-15 07:45:07.31728254 +0000 UTC m=+0.079103935 container create 0f2e6dd0cedf1ff36e9af7fc2aec3f8d8905669ddcff10e7069e45b5c388b3e0 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-bce17446-41b5-5408-a23e-0b011906b44a-osd-3-activate, io.k8s.description=Red Hat Ceph Storage 7, io.openshift.tags=rhceph ceph, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., com.redhat.component=rhceph-container, vendor=Red Hat, Inc., org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, ceph=True, description=Red Hat Ceph Storage 7, architecture=x86_64, name=rhceph, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, version=7, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, vcs-type=git, io.openshift.expose-services=, distribution-scope=public, release=1763362218, build-date=2025-11-26T19:44:28Z, GIT_REPO=https://github.com/ceph/ceph-container.git, GIT_BRANCH=main, RELEASE=main, maintainer=Guillaume Abrioux , io.buildah.version=1.41.4, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, 
GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, url=https://catalog.redhat.com/en/search?searchType=containers, GIT_CLEAN=True, CEPH_POINT_RELEASE=) Dec 15 02:45:07 localhost systemd[1]: tmp-crun.xX7xVo.mount: Deactivated successfully. Dec 15 02:45:07 localhost systemd[1]: Started libcrun container. Dec 15 02:45:07 localhost podman[32105]: 2025-12-15 07:45:07.28460068 +0000 UTC m=+0.046422085 image pull registry.redhat.io/rhceph/rhceph-7-rhel9:latest Dec 15 02:45:07 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/a68fcd6f493ce1efb6ed18bc2f7f6002fada8bbf84dc7a68b2fa4889a2942726/merged/rootfs supports timestamps until 2038 (0x7fffffff) Dec 15 02:45:07 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/a68fcd6f493ce1efb6ed18bc2f7f6002fada8bbf84dc7a68b2fa4889a2942726/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff) Dec 15 02:45:07 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/a68fcd6f493ce1efb6ed18bc2f7f6002fada8bbf84dc7a68b2fa4889a2942726/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff) Dec 15 02:45:07 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/a68fcd6f493ce1efb6ed18bc2f7f6002fada8bbf84dc7a68b2fa4889a2942726/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff) Dec 15 02:45:07 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/a68fcd6f493ce1efb6ed18bc2f7f6002fada8bbf84dc7a68b2fa4889a2942726/merged/var/lib/ceph/osd/ceph-3 supports timestamps until 2038 (0x7fffffff) Dec 15 02:45:07 localhost podman[32105]: 2025-12-15 07:45:07.445604891 +0000 UTC m=+0.207426296 container init 0f2e6dd0cedf1ff36e9af7fc2aec3f8d8905669ddcff10e7069e45b5c388b3e0 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-bce17446-41b5-5408-a23e-0b011906b44a-osd-3-activate, architecture=x86_64, 
org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, GIT_BRANCH=main, GIT_CLEAN=True, build-date=2025-11-26T19:44:28Z, url=https://catalog.redhat.com/en/search?searchType=containers, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, CEPH_POINT_RELEASE=, release=1763362218, com.redhat.component=rhceph-container, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, io.openshift.expose-services=, ceph=True, vcs-type=git, version=7, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, io.k8s.description=Red Hat Ceph Storage 7, io.openshift.tags=rhceph ceph, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., distribution-scope=public, maintainer=Guillaume Abrioux , RELEASE=main, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, description=Red Hat Ceph Storage 7, name=rhceph, vendor=Red Hat, Inc., GIT_REPO=https://github.com/ceph/ceph-container.git) Dec 15 02:45:07 localhost podman[32105]: 2025-12-15 07:45:07.454877029 +0000 UTC m=+0.216698424 container start 0f2e6dd0cedf1ff36e9af7fc2aec3f8d8905669ddcff10e7069e45b5c388b3e0 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-bce17446-41b5-5408-a23e-0b011906b44a-osd-3-activate, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, description=Red Hat Ceph Storage 7, RELEASE=main, GIT_CLEAN=True, architecture=x86_64, com.redhat.component=rhceph-container, GIT_REPO=https://github.com/ceph/ceph-container.git, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, build-date=2025-11-26T19:44:28Z, version=7, url=https://catalog.redhat.com/en/search?searchType=containers, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, release=1763362218, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base 
image., GIT_BRANCH=main, distribution-scope=public, ceph=True, vcs-type=git, name=rhceph, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.openshift.expose-services=, CEPH_POINT_RELEASE=, maintainer=Guillaume Abrioux , io.buildah.version=1.41.4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat Ceph Storage 7, io.openshift.tags=rhceph ceph, vendor=Red Hat, Inc.) Dec 15 02:45:07 localhost podman[32105]: 2025-12-15 07:45:07.455146156 +0000 UTC m=+0.216967611 container attach 0f2e6dd0cedf1ff36e9af7fc2aec3f8d8905669ddcff10e7069e45b5c388b3e0 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-bce17446-41b5-5408-a23e-0b011906b44a-osd-3-activate, vendor=Red Hat, Inc., ceph=True, vcs-type=git, io.openshift.tags=rhceph ceph, maintainer=Guillaume Abrioux , cpe=cpe:/a:redhat:enterprise_linux:9::appstream, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, architecture=x86_64, GIT_CLEAN=True, build-date=2025-11-26T19:44:28Z, io.k8s.description=Red Hat Ceph Storage 7, io.buildah.version=1.41.4, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.openshift.expose-services=, CEPH_POINT_RELEASE=, name=rhceph, GIT_REPO=https://github.com/ceph/ceph-container.git, GIT_BRANCH=main, com.redhat.component=rhceph-container, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, distribution-scope=public, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., RELEASE=main, release=1763362218, version=7, description=Red Hat Ceph Storage 7) Dec 15 02:45:07 localhost ceph-osd[31375]: osd.0 0 done with init, starting boot process Dec 15 02:45:07 localhost ceph-osd[31375]: osd.0 0 start_boot Dec 15 02:45:07 localhost 
ceph-osd[31375]: osd.0 0 maybe_override_options_for_qos osd_max_backfills set to 1 Dec 15 02:45:07 localhost ceph-osd[31375]: osd.0 0 maybe_override_options_for_qos osd_recovery_max_active set to 0 Dec 15 02:45:07 localhost ceph-osd[31375]: osd.0 0 maybe_override_options_for_qos osd_recovery_max_active_hdd set to 3 Dec 15 02:45:07 localhost ceph-osd[31375]: osd.0 0 maybe_override_options_for_qos osd_recovery_max_active_ssd set to 10 Dec 15 02:45:07 localhost ceph-osd[31375]: osd.0 0 bench count 12288000 bsize 4 KiB Dec 15 02:45:08 localhost ceph-bce17446-41b5-5408-a23e-0b011906b44a-osd-3-activate[32119]: Running command: /usr/bin/chown -R ceph:ceph /var/lib/ceph/osd/ceph-3 Dec 15 02:45:08 localhost bash[32105]: Running command: /usr/bin/chown -R ceph:ceph /var/lib/ceph/osd/ceph-3 Dec 15 02:45:08 localhost ceph-bce17446-41b5-5408-a23e-0b011906b44a-osd-3-activate[32119]: Running command: /usr/bin/ceph-bluestore-tool prime-osd-dir --path /var/lib/ceph/osd/ceph-3 --no-mon-config --dev /dev/mapper/ceph_vg1-ceph_lv1 Dec 15 02:45:08 localhost bash[32105]: Running command: /usr/bin/ceph-bluestore-tool prime-osd-dir --path /var/lib/ceph/osd/ceph-3 --no-mon-config --dev /dev/mapper/ceph_vg1-ceph_lv1 Dec 15 02:45:08 localhost ceph-bce17446-41b5-5408-a23e-0b011906b44a-osd-3-activate[32119]: Running command: /usr/bin/chown -h ceph:ceph /dev/mapper/ceph_vg1-ceph_lv1 Dec 15 02:45:08 localhost bash[32105]: Running command: /usr/bin/chown -h ceph:ceph /dev/mapper/ceph_vg1-ceph_lv1 Dec 15 02:45:08 localhost ceph-bce17446-41b5-5408-a23e-0b011906b44a-osd-3-activate[32119]: Running command: /usr/bin/chown -R ceph:ceph /dev/dm-1 Dec 15 02:45:08 localhost bash[32105]: Running command: /usr/bin/chown -R ceph:ceph /dev/dm-1 Dec 15 02:45:08 localhost ceph-bce17446-41b5-5408-a23e-0b011906b44a-osd-3-activate[32119]: Running command: /usr/bin/ln -s /dev/mapper/ceph_vg1-ceph_lv1 /var/lib/ceph/osd/ceph-3/block Dec 15 02:45:08 localhost bash[32105]: Running command: /usr/bin/ln -s 
/dev/mapper/ceph_vg1-ceph_lv1 /var/lib/ceph/osd/ceph-3/block Dec 15 02:45:08 localhost ceph-bce17446-41b5-5408-a23e-0b011906b44a-osd-3-activate[32119]: Running command: /usr/bin/chown -R ceph:ceph /var/lib/ceph/osd/ceph-3 Dec 15 02:45:08 localhost bash[32105]: Running command: /usr/bin/chown -R ceph:ceph /var/lib/ceph/osd/ceph-3 Dec 15 02:45:08 localhost ceph-bce17446-41b5-5408-a23e-0b011906b44a-osd-3-activate[32119]: --> ceph-volume raw activate successful for osd ID: 3 Dec 15 02:45:08 localhost bash[32105]: --> ceph-volume raw activate successful for osd ID: 3 Dec 15 02:45:08 localhost systemd[1]: libpod-0f2e6dd0cedf1ff36e9af7fc2aec3f8d8905669ddcff10e7069e45b5c388b3e0.scope: Deactivated successfully. Dec 15 02:45:08 localhost podman[32105]: 2025-12-15 07:45:08.128813241 +0000 UTC m=+0.890634666 container died 0f2e6dd0cedf1ff36e9af7fc2aec3f8d8905669ddcff10e7069e45b5c388b3e0 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-bce17446-41b5-5408-a23e-0b011906b44a-osd-3-activate, vcs-type=git, url=https://catalog.redhat.com/en/search?searchType=containers, GIT_REPO=https://github.com/ceph/ceph-container.git, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.k8s.description=Red Hat Ceph Storage 7, com.redhat.component=rhceph-container, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.buildah.version=1.41.4, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.openshift.expose-services=, build-date=2025-11-26T19:44:28Z, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vendor=Red Hat, Inc., org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, release=1763362218, architecture=x86_64, GIT_BRANCH=main, GIT_CLEAN=True, description=Red Hat Ceph Storage 7, distribution-scope=public, CEPH_POINT_RELEASE=, io.openshift.tags=rhceph ceph, RELEASE=main, version=7, maintainer=Guillaume Abrioux , name=rhceph, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, ceph=True) Dec 15 02:45:08 localhost systemd[1]: tmp-crun.AVGaNk.mount: Deactivated successfully. Dec 15 02:45:08 localhost systemd[1]: var-lib-containers-storage-overlay-a68fcd6f493ce1efb6ed18bc2f7f6002fada8bbf84dc7a68b2fa4889a2942726-merged.mount: Deactivated successfully. Dec 15 02:45:08 localhost podman[32234]: 2025-12-15 07:45:08.25633493 +0000 UTC m=+0.118861367 container remove 0f2e6dd0cedf1ff36e9af7fc2aec3f8d8905669ddcff10e7069e45b5c388b3e0 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-bce17446-41b5-5408-a23e-0b011906b44a-osd-3-activate, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, description=Red Hat Ceph Storage 7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat Ceph Storage 7, name=rhceph, GIT_BRANCH=main, RELEASE=main, vendor=Red Hat, Inc., ceph=True, io.openshift.expose-services=, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, GIT_REPO=https://github.com/ceph/ceph-container.git, build-date=2025-11-26T19:44:28Z, architecture=x86_64, io.buildah.version=1.41.4, io.openshift.tags=rhceph ceph, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, release=1763362218, version=7, CEPH_POINT_RELEASE=, maintainer=Guillaume Abrioux , distribution-scope=public, com.redhat.component=rhceph-container, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, GIT_CLEAN=True) Dec 15 02:45:08 localhost podman[32292]: Dec 15 02:45:08 localhost podman[32292]: 2025-12-15 07:45:08.626808288 +0000 UTC m=+0.086000872 container create b98390a15dab4d7f67e756b890fc396f94112afc3efc051058266f2ad9d8ce0c 
(image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-bce17446-41b5-5408-a23e-0b011906b44a-osd-3, io.k8s.description=Red Hat Ceph Storage 7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, ceph=True, description=Red Hat Ceph Storage 7, version=7, distribution-scope=public, GIT_BRANCH=main, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.openshift.expose-services=, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.buildah.version=1.41.4, build-date=2025-11-26T19:44:28Z, com.redhat.component=rhceph-container, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_REPO=https://github.com/ceph/ceph-container.git, url=https://catalog.redhat.com/en/search?searchType=containers, architecture=x86_64, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, maintainer=Guillaume Abrioux , RELEASE=main, CEPH_POINT_RELEASE=, name=rhceph, GIT_CLEAN=True, release=1763362218, vcs-type=git, io.openshift.tags=rhceph ceph) Dec 15 02:45:08 localhost podman[32292]: 2025-12-15 07:45:08.592497925 +0000 UTC m=+0.051690499 image pull registry.redhat.io/rhceph/rhceph-7-rhel9:latest Dec 15 02:45:08 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/535feadadf4623b579741715b1292a3ef7180ef19d036fb26679d8292e89d6c7/merged/rootfs supports timestamps until 2038 (0x7fffffff) Dec 15 02:45:08 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/535feadadf4623b579741715b1292a3ef7180ef19d036fb26679d8292e89d6c7/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff) Dec 15 02:45:08 localhost kernel: xfs filesystem being remounted at 
/var/lib/containers/storage/overlay/535feadadf4623b579741715b1292a3ef7180ef19d036fb26679d8292e89d6c7/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff) Dec 15 02:45:08 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/535feadadf4623b579741715b1292a3ef7180ef19d036fb26679d8292e89d6c7/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff) Dec 15 02:45:08 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/535feadadf4623b579741715b1292a3ef7180ef19d036fb26679d8292e89d6c7/merged/var/lib/ceph/osd/ceph-3 supports timestamps until 2038 (0x7fffffff) Dec 15 02:45:08 localhost podman[32292]: 2025-12-15 07:45:08.760814124 +0000 UTC m=+0.220006668 container init b98390a15dab4d7f67e756b890fc396f94112afc3efc051058266f2ad9d8ce0c (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-bce17446-41b5-5408-a23e-0b011906b44a-osd-3, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, distribution-scope=public, description=Red Hat Ceph Storage 7, architecture=x86_64, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.openshift.expose-services=, io.k8s.description=Red Hat Ceph Storage 7, GIT_REPO=https://github.com/ceph/ceph-container.git, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, ceph=True, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vendor=Red Hat, Inc., name=rhceph, GIT_CLEAN=True, build-date=2025-11-26T19:44:28Z, RELEASE=main, vcs-type=git, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, maintainer=Guillaume Abrioux , release=1763362218, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.component=rhceph-container, CEPH_POINT_RELEASE=, version=7, io.openshift.tags=rhceph ceph, 
GIT_BRANCH=main) Dec 15 02:45:08 localhost podman[32292]: 2025-12-15 07:45:08.786767291 +0000 UTC m=+0.245959875 container start b98390a15dab4d7f67e756b890fc396f94112afc3efc051058266f2ad9d8ce0c (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-bce17446-41b5-5408-a23e-0b011906b44a-osd-3, url=https://catalog.redhat.com/en/search?searchType=containers, name=rhceph, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, build-date=2025-11-26T19:44:28Z, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., cpe=cpe:/a:redhat:enterprise_linux:9::appstream, architecture=x86_64, CEPH_POINT_RELEASE=, GIT_CLEAN=True, ceph=True, GIT_BRANCH=main, io.openshift.tags=rhceph ceph, vcs-type=git, io.buildah.version=1.41.4, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, version=7, RELEASE=main, io.openshift.expose-services=, maintainer=Guillaume Abrioux , release=1763362218, description=Red Hat Ceph Storage 7, com.redhat.component=rhceph-container, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, io.k8s.description=Red Hat Ceph Storage 7, GIT_REPO=https://github.com/ceph/ceph-container.git, vendor=Red Hat, Inc., org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0) Dec 15 02:45:08 localhost bash[32292]: b98390a15dab4d7f67e756b890fc396f94112afc3efc051058266f2ad9d8ce0c Dec 15 02:45:08 localhost systemd[1]: Started Ceph osd.3 for bce17446-41b5-5408-a23e-0b011906b44a. 
Dec 15 02:45:08 localhost ceph-osd[32311]: set uid:gid to 167:167 (ceph:ceph)
Dec 15 02:45:08 localhost ceph-osd[32311]: ceph version 18.2.1-361.el9cp (439dcd6094d413840eb2ec590fe2194ec616687f) reef (stable), process ceph-osd, pid 2
Dec 15 02:45:08 localhost ceph-osd[32311]: pidfile_write: ignore empty --pid-file
Dec 15 02:45:08 localhost ceph-osd[32311]: bdev(0x564edfd22e00 /var/lib/ceph/osd/ceph-3/block) open path /var/lib/ceph/osd/ceph-3/block
Dec 15 02:45:08 localhost ceph-osd[32311]: bdev(0x564edfd22e00 /var/lib/ceph/osd/ceph-3/block) ioctl(F_SET_FILE_RW_HINT) on /var/lib/ceph/osd/ceph-3/block failed: (22) Invalid argument
Dec 15 02:45:08 localhost ceph-osd[32311]: bdev(0x564edfd22e00 /var/lib/ceph/osd/ceph-3/block) open size 7511998464 (0x1bfc00000, 7.0 GiB) block_size 4096 (4 KiB) rotational device, discard supported
Dec 15 02:45:08 localhost ceph-osd[32311]: bluestore(/var/lib/ceph/osd/ceph-3) _set_cache_sizes cache_size 1073741824 meta 0.45 kv 0.45 data 0.06
Dec 15 02:45:08 localhost ceph-osd[32311]: bdev(0x564edfd23180 /var/lib/ceph/osd/ceph-3/block) open path /var/lib/ceph/osd/ceph-3/block
Dec 15 02:45:08 localhost ceph-osd[32311]: bdev(0x564edfd23180 /var/lib/ceph/osd/ceph-3/block) ioctl(F_SET_FILE_RW_HINT) on /var/lib/ceph/osd/ceph-3/block failed: (22) Invalid argument
Dec 15 02:45:08 localhost ceph-osd[32311]: bdev(0x564edfd23180 /var/lib/ceph/osd/ceph-3/block) open size 7511998464 (0x1bfc00000, 7.0 GiB) block_size 4096 (4 KiB) rotational device, discard supported
Dec 15 02:45:08 localhost ceph-osd[32311]: bluefs add_block_device bdev 1 path /var/lib/ceph/osd/ceph-3/block size 7.0 GiB
Dec 15 02:45:08 localhost ceph-osd[32311]: bdev(0x564edfd23180 /var/lib/ceph/osd/ceph-3/block) close
Dec 15 02:45:08 localhost ceph-osd[32311]: bdev(0x564edfd22e00 /var/lib/ceph/osd/ceph-3/block) close
Dec 15 02:45:09 localhost ceph-osd[32311]: starting osd.3 osd_data /var/lib/ceph/osd/ceph-3 /var/lib/ceph/osd/ceph-3/journal
Dec 15 02:45:09 localhost ceph-osd[32311]:
load: jerasure load: lrc
Dec 15 02:45:09 localhost ceph-osd[32311]: bdev(0x564edfd22e00 /var/lib/ceph/osd/ceph-3/block) open path /var/lib/ceph/osd/ceph-3/block
Dec 15 02:45:09 localhost ceph-osd[32311]: bdev(0x564edfd22e00 /var/lib/ceph/osd/ceph-3/block) ioctl(F_SET_FILE_RW_HINT) on /var/lib/ceph/osd/ceph-3/block failed: (22) Invalid argument
Dec 15 02:45:09 localhost ceph-osd[32311]: bdev(0x564edfd22e00 /var/lib/ceph/osd/ceph-3/block) open size 7511998464 (0x1bfc00000, 7.0 GiB) block_size 4096 (4 KiB) rotational device, discard supported
Dec 15 02:45:09 localhost ceph-osd[32311]: bluestore(/var/lib/ceph/osd/ceph-3) _set_cache_sizes cache_size 1073741824 meta 0.45 kv 0.45 data 0.06
Dec 15 02:45:09 localhost ceph-osd[32311]: bdev(0x564edfd22e00 /var/lib/ceph/osd/ceph-3/block) close
Dec 15 02:45:09 localhost ceph-osd[32311]: bdev(0x564edfd22e00 /var/lib/ceph/osd/ceph-3/block) open path /var/lib/ceph/osd/ceph-3/block
Dec 15 02:45:09 localhost ceph-osd[32311]: bdev(0x564edfd22e00 /var/lib/ceph/osd/ceph-3/block) ioctl(F_SET_FILE_RW_HINT) on /var/lib/ceph/osd/ceph-3/block failed: (22) Invalid argument
Dec 15 02:45:09 localhost ceph-osd[32311]: bdev(0x564edfd22e00 /var/lib/ceph/osd/ceph-3/block) open size 7511998464 (0x1bfc00000, 7.0 GiB) block_size 4096 (4 KiB) rotational device, discard supported
Dec 15 02:45:09 localhost ceph-osd[32311]: bluestore(/var/lib/ceph/osd/ceph-3) _set_cache_sizes cache_size 1073741824 meta 0.45 kv 0.45 data 0.06
Dec 15 02:45:09 localhost ceph-osd[32311]: bdev(0x564edfd22e00 /var/lib/ceph/osd/ceph-3/block) close
Dec 15 02:45:09 localhost podman[32406]:
Dec 15 02:45:09 localhost podman[32406]: 2025-12-15 07:45:09.628634482 +0000 UTC m=+0.086543417 container create 0f490476795a09baac10199b73c8aa8db647d75425267f4a166b445a21c3f46e (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=amazing_tharp, distribution-scope=public, io.openshift.tags=rhceph ceph, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, version=7,
io.buildah.version=1.41.4, vcs-type=git, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, CEPH_POINT_RELEASE=, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, build-date=2025-11-26T19:44:28Z, name=rhceph, RELEASE=main, ceph=True, release=1763362218, maintainer=Guillaume Abrioux , architecture=x86_64, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., io.k8s.description=Red Hat Ceph Storage 7, GIT_BRANCH=main, com.redhat.component=rhceph-container, GIT_REPO=https://github.com/ceph/ceph-container.git, GIT_CLEAN=True, description=Red Hat Ceph Storage 7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.openshift.expose-services=, url=https://catalog.redhat.com/en/search?searchType=containers, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0)
Dec 15 02:45:09 localhost ceph-osd[32311]: mClockScheduler: set_osd_capacity_params_from_config: osd_bandwidth_cost_per_io: 499321.90 bytes/io, osd_bandwidth_capacity_per_shard 157286400.00 bytes/second
Dec 15 02:45:09 localhost ceph-osd[32311]: osd.3:0.OSDShard using op scheduler mclock_scheduler, cutoff=196
Dec 15 02:45:09 localhost ceph-osd[32311]: bdev(0x564edfd22e00 /var/lib/ceph/osd/ceph-3/block) open path /var/lib/ceph/osd/ceph-3/block
Dec 15 02:45:09 localhost ceph-osd[32311]: bdev(0x564edfd22e00 /var/lib/ceph/osd/ceph-3/block) ioctl(F_SET_FILE_RW_HINT) on /var/lib/ceph/osd/ceph-3/block failed: (22) Invalid argument
Dec 15 02:45:09 localhost ceph-osd[32311]: bdev(0x564edfd22e00 /var/lib/ceph/osd/ceph-3/block) open size 7511998464 (0x1bfc00000, 7.0 GiB) block_size 4096 (4 KiB) rotational device, discard supported
Dec 15 02:45:09 localhost ceph-osd[32311]: bluestore(/var/lib/ceph/osd/ceph-3) _set_cache_sizes cache_size 1073741824 meta 0.45 kv 0.45 data 0.06
Dec 15 02:45:09 localhost ceph-osd[32311]:
bdev(0x564edfd23180 /var/lib/ceph/osd/ceph-3/block) open path /var/lib/ceph/osd/ceph-3/block
Dec 15 02:45:09 localhost ceph-osd[32311]: bdev(0x564edfd23180 /var/lib/ceph/osd/ceph-3/block) ioctl(F_SET_FILE_RW_HINT) on /var/lib/ceph/osd/ceph-3/block failed: (22) Invalid argument
Dec 15 02:45:09 localhost ceph-osd[32311]: bdev(0x564edfd23180 /var/lib/ceph/osd/ceph-3/block) open size 7511998464 (0x1bfc00000, 7.0 GiB) block_size 4096 (4 KiB) rotational device, discard supported
Dec 15 02:45:09 localhost ceph-osd[32311]: bluefs add_block_device bdev 1 path /var/lib/ceph/osd/ceph-3/block size 7.0 GiB
Dec 15 02:45:09 localhost ceph-osd[32311]: bluefs mount
Dec 15 02:45:09 localhost ceph-osd[32311]: bluefs _init_alloc shared, id 1, capacity 0x1bfc00000, block size 0x10000
Dec 15 02:45:09 localhost ceph-osd[32311]: bluefs mount shared_bdev_used = 0
Dec 15 02:45:09 localhost ceph-osd[32311]: bluestore(/var/lib/ceph/osd/ceph-3) _prepare_db_environment set db_paths to db,7136398540 db.slow,7136398540
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: RocksDB version: 7.9.2
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Git sha 0
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Compile date 2025-09-23 00:00:00
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: DB SUMMARY
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: DB Session ID: OTGNSRAE5SWL4LSU2B73
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: CURRENT file: CURRENT
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: IDENTITY file: IDENTITY
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: MANIFEST file: MANIFEST-000032 size: 1007 Bytes
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: SST files in db dir, Total Num: 1, files: 000030.sst
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: SST files in db.slow dir, Total Num: 0, files:
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Write Ahead Log file in db.wal: 000031.log size: 5093 ;
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb:
Options.error_if_exists: 0
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.create_if_missing: 0
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.paranoid_checks: 1
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.flush_verify_memtable_count: 1
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.track_and_verify_wals_in_manifest: 0
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.verify_sst_unique_id_in_manifest: 1
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.env: 0x564edffb6cb0
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.fs: LegacyFileSystem
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.info_log: 0x564ee0cbeb80
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.max_file_opening_threads: 16
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.statistics: (nil)
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.use_fsync: 0
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.max_log_file_size: 0
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.max_manifest_file_size: 1073741824
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.log_file_time_to_roll: 0
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.keep_log_file_num: 1000
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.recycle_log_file_num: 0
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.allow_fallocate: 1
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.allow_mmap_reads: 0
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.allow_mmap_writes: 0
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.use_direct_reads: 0
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.use_direct_io_for_flush_and_compaction: 0
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.create_missing_column_families: 0
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.db_log_dir:
Dec 15 02:45:09 localhost
ceph-osd[32311]: rocksdb: Options.wal_dir: db.wal
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.table_cache_numshardbits: 6
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.WAL_ttl_seconds: 0
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.WAL_size_limit_MB: 0
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.max_write_batch_group_size_bytes: 1048576
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.manifest_preallocation_size: 4194304
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.is_fd_close_on_exec: 1
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.advise_random_on_open: 1
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.db_write_buffer_size: 0
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.write_buffer_manager: 0x564edfd0c140
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.access_hint_on_compaction_start: 1
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.random_access_max_buffer_size: 1048576
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.use_adaptive_mutex: 0
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.rate_limiter: (nil)
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.sst_file_manager.rate_bytes_per_sec: 0
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.wal_recovery_mode: 2
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.enable_thread_tracking: 0
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.enable_pipelined_write: 0
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.unordered_write: 0
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.allow_concurrent_memtable_write: 1
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.enable_write_thread_adaptive_yield: 1
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.write_thread_max_yield_usec: 100
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb:
Options.write_thread_slow_yield_usec: 3
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.row_cache: None
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.wal_filter: None
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.avoid_flush_during_recovery: 0
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.allow_ingest_behind: 0
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.two_write_queues: 0
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.manual_wal_flush: 0
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.wal_compression: 0
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.atomic_flush: 0
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.avoid_unnecessary_blocking_io: 0
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.persist_stats_to_disk: 0
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.write_dbid_to_manifest: 0
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.log_readahead_size: 0
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.file_checksum_gen_factory: Unknown
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.best_efforts_recovery: 0
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.max_bgerror_resume_count: 2147483647
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.bgerror_resume_retry_interval: 1000000
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.allow_data_in_errors: 0
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.db_host_id: __hostname__
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.enforce_single_del_contracts: true
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.max_background_jobs: 4
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.max_background_compactions: -1
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.max_subcompactions: 1
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb:
Options.avoid_flush_during_shutdown: 0
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.writable_file_max_buffer_size: 0
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.delayed_write_rate : 16777216
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.max_total_wal_size: 1073741824
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.delete_obsolete_files_period_micros: 21600000000
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.stats_dump_period_sec: 600
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.stats_persist_period_sec: 600
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.stats_history_buffer_size: 1048576
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.max_open_files: -1
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.bytes_per_sync: 0
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.wal_bytes_per_sync: 0
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.strict_bytes_per_sync: 0
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.compaction_readahead_size: 2097152
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.max_background_flushes: -1
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Compression algorithms supported:
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: #011kZSTD supported: 0
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: #011kXpressCompression supported: 0
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: #011kBZip2Compression supported: 0
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: #011kZSTDNotFinalCompression supported: 0
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: #011kLZ4Compression supported: 1
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: #011kZlibCompression supported: 1
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: #011kLZ4HCCompression supported: 1
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: #011kSnappyCompression supported: 1
Dec 15 02:45:09
localhost ceph-osd[32311]: rocksdb: Fast CRC32 supported: Supported on x86
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: DMutex implementation: pthread_mutex_t
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: [db/db_impl/db_impl_readonly.cc:25] Opening the db in read only mode
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: [db/version_set.cc:5527] Recovering from manifest file: db/MANIFEST-000032
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: [db/column_family.cc:578] Failed to register data paths of column family (id: 0, name: default)
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [default]:
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.comparator: leveldb.BytewiseComparator
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.merge_operator: .T:int64_array.b:bitwise_xor
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.compaction_filter: None
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.compaction_filter_factory: None
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.sst_partitioner_factory: None
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.memtable_factory: SkipListFactory
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.table_factory: BlockBasedTable
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: table_factory options: flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x564ee0cbed40)#012 cache_index_and_filter_blocks: 1#012 cache_index_and_filter_blocks_with_high_priority: 0#012 pin_l0_filter_and_index_blocks_in_cache: 0#012 pin_top_level_index_and_filter: 1#012 index_type: 0#012 data_block_index_type: 0#012 index_shortening: 1#012 data_block_hash_table_util_ratio: 0.750000#012 checksum: 4#012 no_block_cache: 0#012 block_cache: 0x564edfcfa850#012 block_cache_name: BinnedLRUCache#012 block_cache_options:#012 capacity : 483183820#012 num_shard_bits : 4#012 strict_capacity_limit
: 0#012 high_pri_pool_ratio: 0.000#012 block_cache_compressed: (nil)#012 persistent_cache: (nil)#012 block_size: 4096#012 block_size_deviation: 10#012 block_restart_interval: 16#012 index_block_restart_interval: 1#012 metadata_block_size: 4096#012 partition_filters: 0#012 use_delta_encoding: 1#012 filter_policy: bloomfilter#012 whole_key_filtering: 1#012 verify_compression: 0#012 read_amp_bytes_per_bit: 0#012 format_version: 5#012 enable_index_compression: 1#012 block_align: 0#012 max_auto_readahead_size: 262144#012 prepopulate_block_cache: 0#012 initial_auto_readahead_size: 8192#012 num_file_reads_for_auto_readahead: 2
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.write_buffer_size: 16777216
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.max_write_buffer_number: 64
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.compression: LZ4
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.bottommost_compression: Disabled
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.prefix_extractor: nullptr
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.memtable_insert_with_hint_prefix_extractor: nullptr
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.num_levels: 7
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.min_write_buffer_number_to_merge: 6
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.max_write_buffer_number_to_maintain: 0
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.max_write_buffer_size_to_maintain: 0
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.bottommost_compression_opts.window_bits: -14
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.bottommost_compression_opts.level: 32767
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.bottommost_compression_opts.strategy: 0
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.bottommost_compression_opts.max_dict_bytes: 0
Dec 15 02:45:09 localhost ceph-osd[32311]:
rocksdb: Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.bottommost_compression_opts.parallel_threads: 1
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.bottommost_compression_opts.enabled: false
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.compression_opts.window_bits: -14
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.compression_opts.level: 32767
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.compression_opts.strategy: 0
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.compression_opts.max_dict_bytes: 0
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.compression_opts.zstd_max_train_bytes: 0
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.compression_opts.use_zstd_dict_trainer: true
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.compression_opts.parallel_threads: 1
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.compression_opts.enabled: false
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.compression_opts.max_dict_buffer_bytes: 0
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.level0_file_num_compaction_trigger: 8
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.level0_slowdown_writes_trigger: 20
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.level0_stop_writes_trigger: 36
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.target_file_size_base: 67108864
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.target_file_size_multiplier: 1
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.max_bytes_for_level_base: 1073741824
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb:
Options.level_compaction_dynamic_level_bytes: 0
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.max_bytes_for_level_multiplier: 8.000000
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.max_sequential_skip_in_iterations: 8
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.max_compaction_bytes: 1677721600
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.ignore_max_compaction_bytes_for_input: true
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.arena_block_size: 1048576
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.soft_pending_compaction_bytes_limit: 68719476736
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.hard_pending_compaction_bytes_limit: 274877906944
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.disable_auto_compactions: 0
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.compaction_style: kCompactionStyleLevel
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.compaction_pri: kMinOverlappingRatio
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Dec 15 02:45:09 localhost
ceph-osd[32311]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.table_properties_collectors:
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.inplace_update_support: 0
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.inplace_update_num_locks: 10000
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.memtable_prefix_bloom_size_ratio: 0.000000
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.memtable_whole_key_filtering: 0
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.memtable_huge_page_size: 0
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.bloom_locality: 0
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.max_successive_merges: 0
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.optimize_filters_for_hits: 0
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.paranoid_file_checks: 0
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.force_consistency_checks: 1
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.report_bg_io_stats: 0
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.ttl: 2592000
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.periodic_compaction_seconds: 0
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.preclude_last_level_data_seconds:
0
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.preserve_internal_time_seconds: 0
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.enable_blob_files: false
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.min_blob_size: 0
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.blob_file_size: 268435456
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.blob_compression_type: NoCompression
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.enable_blob_garbage_collection: false
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.blob_garbage_collection_age_cutoff: 0.250000
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.blob_compaction_readahead_size: 0
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.blob_file_starting_level: 0
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: [db/column_family.cc:578] Failed to register data paths of column family (id: 1, name: m-0)
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [m-0]:
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.comparator: leveldb.BytewiseComparator
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.merge_operator: None
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.compaction_filter: None
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.compaction_filter_factory: None
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.sst_partitioner_factory: None
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.memtable_factory: SkipListFactory
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.table_factory: BlockBasedTable
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb:
table_factory options: flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x564ee0cbed40)#012 cache_index_and_filter_blocks: 1#012 cache_index_and_filter_blocks_with_high_priority: 0#012 pin_l0_filter_and_index_blocks_in_cache: 0#012 pin_top_level_index_and_filter: 1#012 index_type: 0#012 data_block_index_type: 0#012 index_shortening: 1#012 data_block_hash_table_util_ratio: 0.750000#012 checksum: 4#012 no_block_cache: 0#012 block_cache: 0x564edfcfa850#012 block_cache_name: BinnedLRUCache#012 block_cache_options:#012 capacity : 483183820#012 num_shard_bits : 4#012 strict_capacity_limit : 0#012 high_pri_pool_ratio: 0.000#012 block_cache_compressed: (nil)#012 persistent_cache: (nil)#012 block_size: 4096#012 block_size_deviation: 10#012 block_restart_interval: 16#012 index_block_restart_interval: 1#012 metadata_block_size: 4096#012 partition_filters: 0#012 use_delta_encoding: 1#012 filter_policy: bloomfilter#012 whole_key_filtering: 1#012 verify_compression: 0#012 read_amp_bytes_per_bit: 0#012 format_version: 5#012 enable_index_compression: 1#012 block_align: 0#012 max_auto_readahead_size: 262144#012 prepopulate_block_cache: 0#012 initial_auto_readahead_size: 8192#012 num_file_reads_for_auto_readahead: 2
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.write_buffer_size: 16777216
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.max_write_buffer_number: 64
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.compression: LZ4
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.bottommost_compression: Disabled
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.prefix_extractor: nullptr
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.memtable_insert_with_hint_prefix_extractor: nullptr
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.num_levels: 7
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.min_write_buffer_number_to_merge: 6
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb:
Options.max_write_buffer_number_to_maintain: 0
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.max_write_buffer_size_to_maintain: 0
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.bottommost_compression_opts.window_bits: -14
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.bottommost_compression_opts.level: 32767
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.bottommost_compression_opts.strategy: 0
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.bottommost_compression_opts.max_dict_bytes: 0
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.bottommost_compression_opts.parallel_threads: 1
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.bottommost_compression_opts.enabled: false
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.compression_opts.window_bits: -14
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.compression_opts.level: 32767
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.compression_opts.strategy: 0
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.compression_opts.max_dict_bytes: 0
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.compression_opts.zstd_max_train_bytes: 0
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.compression_opts.use_zstd_dict_trainer: true
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.compression_opts.parallel_threads: 1
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.compression_opts.enabled: false
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.compression_opts.max_dict_buffer_bytes: 0
Dec 15 02:45:09 localhost
ceph-osd[32311]: rocksdb: Options.level0_file_num_compaction_trigger: 8
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.level0_slowdown_writes_trigger: 20
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.level0_stop_writes_trigger: 36
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.target_file_size_base: 67108864
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.target_file_size_multiplier: 1
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.max_bytes_for_level_base: 1073741824
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.max_bytes_for_level_multiplier: 8.000000
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.max_sequential_skip_in_iterations: 8
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.max_compaction_bytes: 1677721600
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.ignore_max_compaction_bytes_for_input: true
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.arena_block_size: 1048576
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.soft_pending_compaction_bytes_limit: 68719476736
Dec 15 02:45:09 localhost ceph-osd[32311]:
rocksdb: Options.hard_pending_compaction_bytes_limit: 274877906944 Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.disable_auto_compactions: 0 Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.compaction_style: kCompactionStyleLevel Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.compaction_pri: kMinOverlappingRatio Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.compaction_options_universal.size_ratio: 1 Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.compaction_options_universal.min_merge_width: 2 Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295 Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200 Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1 Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824 Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0 Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0); Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.inplace_update_support: 0 Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.inplace_update_num_locks: 10000 Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.memtable_prefix_bloom_size_ratio: 0.000000 Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.memtable_whole_key_filtering: 0 Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.memtable_huge_page_size: 0 Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: 
Options.bloom_locality: 0 Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.max_successive_merges: 0 Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.optimize_filters_for_hits: 0 Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.paranoid_file_checks: 0 Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.force_consistency_checks: 1 Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.report_bg_io_stats: 0 Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.ttl: 2592000 Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.periodic_compaction_seconds: 0 Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.preclude_last_level_data_seconds: 0 Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.preserve_internal_time_seconds: 0 Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.enable_blob_files: false Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.min_blob_size: 0 Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.blob_file_size: 268435456 Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.blob_compression_type: NoCompression Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.enable_blob_garbage_collection: false Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.blob_garbage_collection_age_cutoff: 0.250000 Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000 Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.blob_compaction_readahead_size: 0 Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.blob_file_starting_level: 0 Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.experimental_mempurge_threshold: 0.000000 Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: [db/column_family.cc:578] Failed to register data paths of column family (id: 2, name: m-1) Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: [db/column_family.cc:630] --------------- Options for 
column family [m-1]: Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.comparator: leveldb.BytewiseComparator Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.merge_operator: None Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.compaction_filter: None Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.compaction_filter_factory: None Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.sst_partitioner_factory: None Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.memtable_factory: SkipListFactory Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.table_factory: BlockBasedTable Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: table_factory options: flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x564ee0cbed40)#012 cache_index_and_filter_blocks: 1#012 cache_index_and_filter_blocks_with_high_priority: 0#012 pin_l0_filter_and_index_blocks_in_cache: 0#012 pin_top_level_index_and_filter: 1#012 index_type: 0#012 data_block_index_type: 0#012 index_shortening: 1#012 data_block_hash_table_util_ratio: 0.750000#012 checksum: 4#012 no_block_cache: 0#012 block_cache: 0x564edfcfa850#012 block_cache_name: BinnedLRUCache#012 block_cache_options:#012 capacity : 483183820#012 num_shard_bits : 4#012 strict_capacity_limit : 0#012 high_pri_pool_ratio: 0.000#012 block_cache_compressed: (nil)#012 persistent_cache: (nil)#012 block_size: 4096#012 block_size_deviation: 10#012 block_restart_interval: 16#012 index_block_restart_interval: 1#012 metadata_block_size: 4096#012 partition_filters: 0#012 use_delta_encoding: 1#012 filter_policy: bloomfilter#012 whole_key_filtering: 1#012 verify_compression: 0#012 read_amp_bytes_per_bit: 0#012 format_version: 5#012 enable_index_compression: 1#012 block_align: 0#012 max_auto_readahead_size: 262144#012 prepopulate_block_cache: 0#012 initial_auto_readahead_size: 8192#012 num_file_reads_for_auto_readahead: 2 Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: 
Options.write_buffer_size: 16777216 Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.max_write_buffer_number: 64 Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.compression: LZ4 Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.bottommost_compression: Disabled Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.prefix_extractor: nullptr Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.memtable_insert_with_hint_prefix_extractor: nullptr Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.num_levels: 7 Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.min_write_buffer_number_to_merge: 6 Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.max_write_buffer_number_to_maintain: 0 Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.max_write_buffer_size_to_maintain: 0 Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.bottommost_compression_opts.window_bits: -14 Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.bottommost_compression_opts.level: 32767 Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.bottommost_compression_opts.strategy: 0 Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.bottommost_compression_opts.max_dict_bytes: 0 Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.bottommost_compression_opts.zstd_max_train_bytes: 0 Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.bottommost_compression_opts.parallel_threads: 1 Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.bottommost_compression_opts.enabled: false Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.bottommost_compression_opts.max_dict_buffer_bytes: 0 Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.bottommost_compression_opts.use_zstd_dict_trainer: true Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.compression_opts.window_bits: -14 Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.compression_opts.level: 
32767 Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.compression_opts.strategy: 0 Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.compression_opts.max_dict_bytes: 0 Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.compression_opts.zstd_max_train_bytes: 0 Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.compression_opts.use_zstd_dict_trainer: true Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.compression_opts.parallel_threads: 1 Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.compression_opts.enabled: false Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.compression_opts.max_dict_buffer_bytes: 0 Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.level0_file_num_compaction_trigger: 8 Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.level0_slowdown_writes_trigger: 20 Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.level0_stop_writes_trigger: 36 Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.target_file_size_base: 67108864 Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.target_file_size_multiplier: 1 Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.max_bytes_for_level_base: 1073741824 Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0 Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.max_bytes_for_level_multiplier: 8.000000 Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1 Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1 Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1 Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1 Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1 Dec 15 02:45:09 localhost 
ceph-osd[32311]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1 Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1 Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.max_sequential_skip_in_iterations: 8 Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.max_compaction_bytes: 1677721600 Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.ignore_max_compaction_bytes_for_input: true Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.arena_block_size: 1048576 Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.soft_pending_compaction_bytes_limit: 68719476736 Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.hard_pending_compaction_bytes_limit: 274877906944 Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.disable_auto_compactions: 0 Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.compaction_style: kCompactionStyleLevel Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.compaction_pri: kMinOverlappingRatio Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.compaction_options_universal.size_ratio: 1 Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.compaction_options_universal.min_merge_width: 2 Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295 Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200 Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1 Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824 Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0 Dec 15 
02:45:09 localhost ceph-osd[32311]: rocksdb: Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0); Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.inplace_update_support: 0 Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.inplace_update_num_locks: 10000 Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.memtable_prefix_bloom_size_ratio: 0.000000 Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.memtable_whole_key_filtering: 0 Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.memtable_huge_page_size: 0 Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.bloom_locality: 0 Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.max_successive_merges: 0 Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.optimize_filters_for_hits: 0 Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.paranoid_file_checks: 0 Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.force_consistency_checks: 1 Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.report_bg_io_stats: 0 Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.ttl: 2592000 Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.periodic_compaction_seconds: 0 Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.preclude_last_level_data_seconds: 0 Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.preserve_internal_time_seconds: 0 Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.enable_blob_files: false Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.min_blob_size: 0 Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.blob_file_size: 268435456 Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.blob_compression_type: NoCompression Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.enable_blob_garbage_collection: false Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: 
Options.blob_garbage_collection_age_cutoff: 0.250000 Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000 Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.blob_compaction_readahead_size: 0 Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.blob_file_starting_level: 0 Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.experimental_mempurge_threshold: 0.000000 Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: [db/column_family.cc:578] Failed to register data paths of column family (id: 3, name: m-2) Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [m-2]: Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.comparator: leveldb.BytewiseComparator Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.merge_operator: None Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.compaction_filter: None Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.compaction_filter_factory: None Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.sst_partitioner_factory: None Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.memtable_factory: SkipListFactory Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.table_factory: BlockBasedTable Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: table_factory options: flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x564ee0cbed40)#012 cache_index_and_filter_blocks: 1#012 cache_index_and_filter_blocks_with_high_priority: 0#012 pin_l0_filter_and_index_blocks_in_cache: 0#012 pin_top_level_index_and_filter: 1#012 index_type: 0#012 data_block_index_type: 0#012 index_shortening: 1#012 data_block_hash_table_util_ratio: 0.750000#012 checksum: 4#012 no_block_cache: 0#012 block_cache: 0x564edfcfa850#012 block_cache_name: BinnedLRUCache#012 block_cache_options:#012 capacity : 483183820#012 num_shard_bits : 4#012 
strict_capacity_limit : 0#012 high_pri_pool_ratio: 0.000#012 block_cache_compressed: (nil)#012 persistent_cache: (nil)#012 block_size: 4096#012 block_size_deviation: 10#012 block_restart_interval: 16#012 index_block_restart_interval: 1#012 metadata_block_size: 4096#012 partition_filters: 0#012 use_delta_encoding: 1#012 filter_policy: bloomfilter#012 whole_key_filtering: 1#012 verify_compression: 0#012 read_amp_bytes_per_bit: 0#012 format_version: 5#012 enable_index_compression: 1#012 block_align: 0#012 max_auto_readahead_size: 262144#012 prepopulate_block_cache: 0#012 initial_auto_readahead_size: 8192#012 num_file_reads_for_auto_readahead: 2 Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.write_buffer_size: 16777216 Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.max_write_buffer_number: 64 Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.compression: LZ4 Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.bottommost_compression: Disabled Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.prefix_extractor: nullptr Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.memtable_insert_with_hint_prefix_extractor: nullptr Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.num_levels: 7 Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.min_write_buffer_number_to_merge: 6 Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.max_write_buffer_number_to_maintain: 0 Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.max_write_buffer_size_to_maintain: 0 Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.bottommost_compression_opts.window_bits: -14 Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.bottommost_compression_opts.level: 32767 Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.bottommost_compression_opts.strategy: 0 Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.bottommost_compression_opts.max_dict_bytes: 0 Dec 15 02:45:09 localhost 
ceph-osd[32311]: rocksdb: Options.bottommost_compression_opts.zstd_max_train_bytes: 0 Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.bottommost_compression_opts.parallel_threads: 1 Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.bottommost_compression_opts.enabled: false Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.bottommost_compression_opts.max_dict_buffer_bytes: 0 Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.bottommost_compression_opts.use_zstd_dict_trainer: true Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.compression_opts.window_bits: -14 Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.compression_opts.level: 32767 Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.compression_opts.strategy: 0 Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.compression_opts.max_dict_bytes: 0 Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.compression_opts.zstd_max_train_bytes: 0 Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.compression_opts.use_zstd_dict_trainer: true Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.compression_opts.parallel_threads: 1 Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.compression_opts.enabled: false Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.compression_opts.max_dict_buffer_bytes: 0 Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.level0_file_num_compaction_trigger: 8 Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.level0_slowdown_writes_trigger: 20 Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.level0_stop_writes_trigger: 36 Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.target_file_size_base: 67108864 Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.target_file_size_multiplier: 1 Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.max_bytes_for_level_base: 1073741824 Dec 15 02:45:09 localhost ceph-osd[32311]: 
rocksdb: Options.level_compaction_dynamic_level_bytes: 0 Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.max_bytes_for_level_multiplier: 8.000000 Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1 Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1 Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1 Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1 Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1 Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1 Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1 Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.max_sequential_skip_in_iterations: 8 Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.max_compaction_bytes: 1677721600 Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.ignore_max_compaction_bytes_for_input: true Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.arena_block_size: 1048576 Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.soft_pending_compaction_bytes_limit: 68719476736 Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.hard_pending_compaction_bytes_limit: 274877906944 Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.disable_auto_compactions: 0 Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.compaction_style: kCompactionStyleLevel Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.compaction_pri: kMinOverlappingRatio Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.compaction_options_universal.size_ratio: 1 Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.compaction_options_universal.min_merge_width: 2 Dec 15 02:45:09 localhost 
ceph-osd[32311]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295 Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200 Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1 Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824 Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0 Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0); Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.inplace_update_support: 0 Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.inplace_update_num_locks: 10000 Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.memtable_prefix_bloom_size_ratio: 0.000000 Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.memtable_whole_key_filtering: 0 Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.memtable_huge_page_size: 0 Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.bloom_locality: 0 Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.max_successive_merges: 0 Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.optimize_filters_for_hits: 0 Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.paranoid_file_checks: 0 Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.force_consistency_checks: 1 Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.report_bg_io_stats: 0 Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.ttl: 2592000 Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: 
Options.periodic_compaction_seconds: 0 Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.preclude_last_level_data_seconds: 0 Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.preserve_internal_time_seconds: 0 Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.enable_blob_files: false Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.min_blob_size: 0 Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.blob_file_size: 268435456 Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.blob_compression_type: NoCompression Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.enable_blob_garbage_collection: false Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.blob_garbage_collection_age_cutoff: 0.250000 Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000 Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.blob_compaction_readahead_size: 0 Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.blob_file_starting_level: 0 Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.experimental_mempurge_threshold: 0.000000 Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: [db/column_family.cc:578] Failed to register data paths of column family (id: 4, name: p-0) Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [p-0]: Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.comparator: leveldb.BytewiseComparator Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.merge_operator: None Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.compaction_filter: None Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.compaction_filter_factory: None Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.sst_partitioner_factory: None Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.memtable_factory: SkipListFactory Dec 15 02:45:09 
localhost ceph-osd[32311]: rocksdb: Options.table_factory: BlockBasedTable Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: table_factory options: flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x564ee0cbed40)#012 cache_index_and_filter_blocks: 1#012 cache_index_and_filter_blocks_with_high_priority: 0#012 pin_l0_filter_and_index_blocks_in_cache: 0#012 pin_top_level_index_and_filter: 1#012 index_type: 0#012 data_block_index_type: 0#012 index_shortening: 1#012 data_block_hash_table_util_ratio: 0.750000#012 checksum: 4#012 no_block_cache: 0#012 block_cache: 0x564edfcfa850#012 block_cache_name: BinnedLRUCache#012 block_cache_options:#012 capacity : 483183820#012 num_shard_bits : 4#012 strict_capacity_limit : 0#012 high_pri_pool_ratio: 0.000#012 block_cache_compressed: (nil)#012 persistent_cache: (nil)#012 block_size: 4096#012 block_size_deviation: 10#012 block_restart_interval: 16#012 index_block_restart_interval: 1#012 metadata_block_size: 4096#012 partition_filters: 0#012 use_delta_encoding: 1#012 filter_policy: bloomfilter#012 whole_key_filtering: 1#012 verify_compression: 0#012 read_amp_bytes_per_bit: 0#012 format_version: 5#012 enable_index_compression: 1#012 block_align: 0#012 max_auto_readahead_size: 262144#012 prepopulate_block_cache: 0#012 initial_auto_readahead_size: 8192#012 num_file_reads_for_auto_readahead: 2 Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.write_buffer_size: 16777216 Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.max_write_buffer_number: 64 Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.compression: LZ4 Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.bottommost_compression: Disabled Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.prefix_extractor: nullptr Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.memtable_insert_with_hint_prefix_extractor: nullptr Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.num_levels: 7 Dec 15 02:45:09 localhost 
ceph-osd[32311]: rocksdb: Options.min_write_buffer_number_to_merge: 6 Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.max_write_buffer_number_to_maintain: 0 Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.max_write_buffer_size_to_maintain: 0 Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.bottommost_compression_opts.window_bits: -14 Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.bottommost_compression_opts.level: 32767 Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.bottommost_compression_opts.strategy: 0 Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.bottommost_compression_opts.max_dict_bytes: 0 Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.bottommost_compression_opts.zstd_max_train_bytes: 0 Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.bottommost_compression_opts.parallel_threads: 1 Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.bottommost_compression_opts.enabled: false Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.bottommost_compression_opts.max_dict_buffer_bytes: 0 Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.bottommost_compression_opts.use_zstd_dict_trainer: true Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.compression_opts.window_bits: -14 Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.compression_opts.level: 32767 Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.compression_opts.strategy: 0 Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.compression_opts.max_dict_bytes: 0 Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.compression_opts.zstd_max_train_bytes: 0 Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.compression_opts.use_zstd_dict_trainer: true Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.compression_opts.parallel_threads: 1 Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.compression_opts.enabled: false Dec 15 
02:45:09 localhost ceph-osd[32311]: rocksdb: Options.compression_opts.max_dict_buffer_bytes: 0
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.level0_file_num_compaction_trigger: 8
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.level0_slowdown_writes_trigger: 20
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.level0_stop_writes_trigger: 36
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.target_file_size_base: 67108864
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.target_file_size_multiplier: 1
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.max_bytes_for_level_base: 1073741824
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.max_bytes_for_level_multiplier: 8.000000
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.max_sequential_skip_in_iterations: 8
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.max_compaction_bytes: 1677721600
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.ignore_max_compaction_bytes_for_input: true
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.arena_block_size: 1048576
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.soft_pending_compaction_bytes_limit: 68719476736
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.hard_pending_compaction_bytes_limit: 274877906944
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.disable_auto_compactions: 0
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.compaction_style: kCompactionStyleLevel
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.compaction_pri: kMinOverlappingRatio
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.inplace_update_support: 0
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.inplace_update_num_locks: 10000
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.memtable_prefix_bloom_size_ratio: 0.000000
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.memtable_whole_key_filtering: 0
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.memtable_huge_page_size: 0
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.bloom_locality: 0
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.max_successive_merges: 0
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.optimize_filters_for_hits: 0
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.paranoid_file_checks: 0
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.force_consistency_checks: 1
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.report_bg_io_stats: 0
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.ttl: 2592000
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.periodic_compaction_seconds: 0
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.preclude_last_level_data_seconds: 0
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.preserve_internal_time_seconds: 0
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.enable_blob_files: false
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.min_blob_size: 0
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.blob_file_size: 268435456
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.blob_compression_type: NoCompression
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.enable_blob_garbage_collection: false
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.blob_garbage_collection_age_cutoff: 0.250000
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.blob_compaction_readahead_size: 0
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.blob_file_starting_level: 0
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: [db/column_family.cc:578] Failed to register data paths of column family (id: 5, name: p-1)
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [p-1]:
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.comparator: leveldb.BytewiseComparator
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.merge_operator: None
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.compaction_filter: None
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.compaction_filter_factory: None
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.sst_partitioner_factory: None
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.memtable_factory: SkipListFactory
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.table_factory: BlockBasedTable
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: table_factory options: flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x564ee0cbed40)#012 cache_index_and_filter_blocks: 1#012 cache_index_and_filter_blocks_with_high_priority: 0#012 pin_l0_filter_and_index_blocks_in_cache: 0#012 pin_top_level_index_and_filter: 1#012 index_type: 0#012 data_block_index_type: 0#012 index_shortening: 1#012 data_block_hash_table_util_ratio: 0.750000#012 checksum: 4#012 no_block_cache: 0#012 block_cache: 0x564edfcfa850#012 block_cache_name: BinnedLRUCache#012 block_cache_options:#012 capacity : 483183820#012 num_shard_bits : 4#012 strict_capacity_limit : 0#012 high_pri_pool_ratio: 0.000#012 block_cache_compressed: (nil)#012 persistent_cache: (nil)#012 block_size: 4096#012 block_size_deviation: 10#012 block_restart_interval: 16#012 index_block_restart_interval: 1#012 metadata_block_size: 4096#012 partition_filters: 0#012 use_delta_encoding: 1#012 filter_policy: bloomfilter#012 whole_key_filtering: 1#012 verify_compression: 0#012 read_amp_bytes_per_bit: 0#012 format_version: 5#012 enable_index_compression: 1#012 block_align: 0#012 max_auto_readahead_size: 262144#012 prepopulate_block_cache: 0#012 initial_auto_readahead_size: 8192#012 num_file_reads_for_auto_readahead: 2
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.write_buffer_size: 16777216
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.max_write_buffer_number: 64
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.compression: LZ4
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.bottommost_compression: Disabled
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.prefix_extractor: nullptr
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.memtable_insert_with_hint_prefix_extractor: nullptr
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.num_levels: 7
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.min_write_buffer_number_to_merge: 6
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.max_write_buffer_number_to_maintain: 0
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.max_write_buffer_size_to_maintain: 0
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.bottommost_compression_opts.window_bits: -14
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.bottommost_compression_opts.level: 32767
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.bottommost_compression_opts.strategy: 0
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.bottommost_compression_opts.max_dict_bytes: 0
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.bottommost_compression_opts.parallel_threads: 1
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.bottommost_compression_opts.enabled: false
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.compression_opts.window_bits: -14
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.compression_opts.level: 32767
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.compression_opts.strategy: 0
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.compression_opts.max_dict_bytes: 0
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.compression_opts.zstd_max_train_bytes: 0
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.compression_opts.use_zstd_dict_trainer: true
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.compression_opts.parallel_threads: 1
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.compression_opts.enabled: false
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.compression_opts.max_dict_buffer_bytes: 0
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.level0_file_num_compaction_trigger: 8
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.level0_slowdown_writes_trigger: 20
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.level0_stop_writes_trigger: 36
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.target_file_size_base: 67108864
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.target_file_size_multiplier: 1
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.max_bytes_for_level_base: 1073741824
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.max_bytes_for_level_multiplier: 8.000000
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.max_sequential_skip_in_iterations: 8
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.max_compaction_bytes: 1677721600
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.ignore_max_compaction_bytes_for_input: true
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.arena_block_size: 1048576
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.soft_pending_compaction_bytes_limit: 68719476736
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.hard_pending_compaction_bytes_limit: 274877906944
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.disable_auto_compactions: 0
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.compaction_style: kCompactionStyleLevel
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.compaction_pri: kMinOverlappingRatio
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Dec 15 02:45:09 localhost podman[32406]: 2025-12-15 07:45:09.591270232 +0000 UTC m=+0.049179187 image pull registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.inplace_update_support: 0
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.inplace_update_num_locks: 10000
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.memtable_prefix_bloom_size_ratio: 0.000000
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.memtable_whole_key_filtering: 0
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.memtable_huge_page_size: 0
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.bloom_locality: 0
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.max_successive_merges: 0
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.optimize_filters_for_hits: 0
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.paranoid_file_checks: 0
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.force_consistency_checks: 1
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.report_bg_io_stats: 0
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.ttl: 2592000
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.periodic_compaction_seconds: 0
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.preclude_last_level_data_seconds: 0
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.preserve_internal_time_seconds: 0
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.enable_blob_files: false
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.min_blob_size: 0
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.blob_file_size: 268435456
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.blob_compression_type: NoCompression
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.enable_blob_garbage_collection: false
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.blob_garbage_collection_age_cutoff: 0.250000
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.blob_compaction_readahead_size: 0
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.blob_file_starting_level: 0
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: [db/column_family.cc:578] Failed to register data paths of column family (id: 6, name: p-2)
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [p-2]:
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.comparator: leveldb.BytewiseComparator
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.merge_operator: None
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.compaction_filter: None
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.compaction_filter_factory: None
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.sst_partitioner_factory: None
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.memtable_factory: SkipListFactory
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.table_factory: BlockBasedTable
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: table_factory options: flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x564ee0cbed40)#012 cache_index_and_filter_blocks: 1#012 cache_index_and_filter_blocks_with_high_priority: 0#012 pin_l0_filter_and_index_blocks_in_cache: 0#012 pin_top_level_index_and_filter: 1#012 index_type: 0#012 data_block_index_type: 0#012 index_shortening: 1#012 data_block_hash_table_util_ratio: 0.750000#012 checksum: 4#012 no_block_cache: 0#012 block_cache: 0x564edfcfa850#012 block_cache_name: BinnedLRUCache#012 block_cache_options:#012 capacity : 483183820#012 num_shard_bits : 4#012 strict_capacity_limit : 0#012 high_pri_pool_ratio: 0.000#012 block_cache_compressed: (nil)#012 persistent_cache: (nil)#012 block_size: 4096#012 block_size_deviation: 10#012 block_restart_interval: 16#012 index_block_restart_interval: 1#012 metadata_block_size: 4096#012 partition_filters: 0#012 use_delta_encoding: 1#012 filter_policy: bloomfilter#012 whole_key_filtering: 1#012 verify_compression: 0#012 read_amp_bytes_per_bit: 0#012 format_version: 5#012 enable_index_compression: 1#012 block_align: 0#012 max_auto_readahead_size: 262144#012 prepopulate_block_cache: 0#012 initial_auto_readahead_size: 8192#012 num_file_reads_for_auto_readahead: 2
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.write_buffer_size: 16777216
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.max_write_buffer_number: 64
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.compression: LZ4
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.bottommost_compression: Disabled
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.prefix_extractor: nullptr
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.memtable_insert_with_hint_prefix_extractor: nullptr
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.num_levels: 7
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.min_write_buffer_number_to_merge: 6
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.max_write_buffer_number_to_maintain: 0
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.max_write_buffer_size_to_maintain: 0
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.bottommost_compression_opts.window_bits: -14
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.bottommost_compression_opts.level: 32767
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.bottommost_compression_opts.strategy: 0
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.bottommost_compression_opts.max_dict_bytes: 0
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.bottommost_compression_opts.parallel_threads: 1
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.bottommost_compression_opts.enabled: false
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.compression_opts.window_bits: -14
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.compression_opts.level: 32767
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.compression_opts.strategy: 0
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.compression_opts.max_dict_bytes: 0
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.compression_opts.zstd_max_train_bytes: 0
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.compression_opts.use_zstd_dict_trainer: true
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.compression_opts.parallel_threads: 1
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.compression_opts.enabled: false
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.compression_opts.max_dict_buffer_bytes: 0
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.level0_file_num_compaction_trigger: 8
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.level0_slowdown_writes_trigger: 20
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.level0_stop_writes_trigger: 36
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.target_file_size_base: 67108864
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.target_file_size_multiplier: 1
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.max_bytes_for_level_base: 1073741824
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.max_bytes_for_level_multiplier: 8.000000
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.max_sequential_skip_in_iterations: 8
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.max_compaction_bytes: 1677721600
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.ignore_max_compaction_bytes_for_input: true
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.arena_block_size: 1048576
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.soft_pending_compaction_bytes_limit: 68719476736
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.hard_pending_compaction_bytes_limit: 274877906944
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.disable_auto_compactions: 0
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.compaction_style: kCompactionStyleLevel
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.compaction_pri: kMinOverlappingRatio
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.inplace_update_support: 0
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.inplace_update_num_locks: 10000
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.memtable_prefix_bloom_size_ratio: 0.000000
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.memtable_whole_key_filtering: 0
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.memtable_huge_page_size: 0
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.bloom_locality: 0
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.max_successive_merges: 0
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.optimize_filters_for_hits: 0
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.paranoid_file_checks: 0
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.force_consistency_checks: 1
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.report_bg_io_stats: 0
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.ttl: 2592000
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.periodic_compaction_seconds: 0
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.preclude_last_level_data_seconds: 0
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.preserve_internal_time_seconds: 0
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.enable_blob_files: false
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.min_blob_size: 0
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.blob_file_size: 268435456
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.blob_compression_type: NoCompression
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.enable_blob_garbage_collection: false
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.blob_garbage_collection_age_cutoff: 0.250000
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.blob_compaction_readahead_size: 0
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.blob_file_starting_level: 0
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: [db/column_family.cc:578] Failed to register data paths of column family (id: 7, name: O-0)
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [O-0]:
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.comparator: leveldb.BytewiseComparator
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.merge_operator: None
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.compaction_filter: None
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.compaction_filter_factory: None
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.sst_partitioner_factory: None
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.memtable_factory: SkipListFactory
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.table_factory: BlockBasedTable
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: table_factory options: flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x564ee0cbef60)#012 cache_index_and_filter_blocks: 1#012 cache_index_and_filter_blocks_with_high_priority: 0#012 pin_l0_filter_and_index_blocks_in_cache: 0#012 pin_top_level_index_and_filter: 1#012 index_type: 0#012 data_block_index_type: 0#012 index_shortening: 1#012 data_block_hash_table_util_ratio: 0.750000#012 checksum: 4#012 no_block_cache: 0#012 block_cache: 0x564edfcfa2d0#012 block_cache_name: BinnedLRUCache#012 block_cache_options:#012 capacity : 536870912#012 num_shard_bits : 4#012 strict_capacity_limit : 0#012 high_pri_pool_ratio: 0.000#012 block_cache_compressed: (nil)#012 persistent_cache: (nil)#012 block_size: 4096#012 block_size_deviation: 10#012 block_restart_interval: 16#012 index_block_restart_interval: 1#012 metadata_block_size: 4096#012 partition_filters: 0#012 use_delta_encoding: 1#012 filter_policy: bloomfilter#012 whole_key_filtering: 1#012 verify_compression: 0#012 read_amp_bytes_per_bit: 0#012 format_version: 5#012 enable_index_compression: 1#012 block_align: 0#012 max_auto_readahead_size: 262144#012 prepopulate_block_cache: 0#012 initial_auto_readahead_size: 8192#012 num_file_reads_for_auto_readahead: 2
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.write_buffer_size: 16777216
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.max_write_buffer_number: 64
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.compression: LZ4
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.bottommost_compression: Disabled
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.prefix_extractor: nullptr
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.memtable_insert_with_hint_prefix_extractor: nullptr
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.num_levels: 7
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.min_write_buffer_number_to_merge: 6
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.max_write_buffer_number_to_maintain: 0
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.max_write_buffer_size_to_maintain: 0
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.bottommost_compression_opts.window_bits: -14
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.bottommost_compression_opts.level: 32767
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.bottommost_compression_opts.strategy: 0
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.bottommost_compression_opts.max_dict_bytes: 0
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.bottommost_compression_opts.parallel_threads: 1
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.bottommost_compression_opts.enabled: false
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.compression_opts.window_bits: -14
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.compression_opts.level: 32767
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.compression_opts.strategy: 0
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.compression_opts.max_dict_bytes: 0
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.compression_opts.zstd_max_train_bytes: 0
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.compression_opts.use_zstd_dict_trainer: true
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.compression_opts.parallel_threads: 1
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.compression_opts.enabled: false
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.compression_opts.max_dict_buffer_bytes: 0
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.level0_file_num_compaction_trigger: 8
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.level0_slowdown_writes_trigger: 20
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.level0_stop_writes_trigger: 36
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.target_file_size_base: 67108864
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.target_file_size_multiplier: 1
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.max_bytes_for_level_base: 1073741824
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.max_bytes_for_level_multiplier: 8.000000
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.max_sequential_skip_in_iterations: 8
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.max_compaction_bytes: 1677721600
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.ignore_max_compaction_bytes_for_input: true
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.arena_block_size: 1048576
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.soft_pending_compaction_bytes_limit: 68719476736
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.hard_pending_compaction_bytes_limit: 274877906944
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.disable_auto_compactions: 0
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.compaction_style: kCompactionStyleLevel
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.compaction_pri: kMinOverlappingRatio
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.inplace_update_support: 0
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.inplace_update_num_locks: 10000
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.memtable_prefix_bloom_size_ratio: 0.000000
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.memtable_whole_key_filtering: 0
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.memtable_huge_page_size: 0
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.bloom_locality: 0
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.max_successive_merges: 0
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.optimize_filters_for_hits: 0
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.paranoid_file_checks: 0
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.force_consistency_checks: 1
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.report_bg_io_stats: 0
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.ttl: 2592000
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.periodic_compaction_seconds: 0
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.preclude_last_level_data_seconds: 0
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.preserve_internal_time_seconds: 0
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.enable_blob_files: false
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.min_blob_size: 0
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.blob_file_size: 268435456
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.blob_compression_type: NoCompression
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.enable_blob_garbage_collection: false
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.blob_garbage_collection_age_cutoff: 0.250000
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.blob_compaction_readahead_size: 0
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.blob_file_starting_level: 0
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: [db/column_family.cc:578] Failed to register data paths of column family (id: 8, name: O-1)
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [O-1]:
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.comparator: leveldb.BytewiseComparator
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.merge_operator: None
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.compaction_filter: None
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.compaction_filter_factory: None
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.sst_partitioner_factory: None
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.memtable_factory: SkipListFactory
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.table_factory: BlockBasedTable
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: table_factory options: flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x564ee0cbef60)#012 cache_index_and_filter_blocks: 1#012 cache_index_and_filter_blocks_with_high_priority: 0#012 pin_l0_filter_and_index_blocks_in_cache: 0#012 pin_top_level_index_and_filter: 1#012 index_type: 0#012 data_block_index_type: 0#012 index_shortening: 1#012 data_block_hash_table_util_ratio: 0.750000#012 checksum: 4#012 no_block_cache: 0#012 block_cache: 0x564edfcfa2d0#012 block_cache_name: BinnedLRUCache#012 block_cache_options:#012 capacity : 536870912#012 num_shard_bits : 4#012 strict_capacity_limit : 0#012 high_pri_pool_ratio: 0.000#012 block_cache_compressed: (nil)#012 persistent_cache: (nil)#012 block_size: 4096#012 block_size_deviation: 10#012 block_restart_interval: 16#012 index_block_restart_interval: 1#012 metadata_block_size: 4096#012 partition_filters: 0#012 use_delta_encoding: 1#012 filter_policy: bloomfilter#012 whole_key_filtering: 1#012 verify_compression: 0#012 read_amp_bytes_per_bit: 0#012 format_version: 5#012 enable_index_compression: 1#012 block_align: 0#012 max_auto_readahead_size: 262144#012 prepopulate_block_cache: 0#012 initial_auto_readahead_size: 8192#012 num_file_reads_for_auto_readahead: 2
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.write_buffer_size: 16777216
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.max_write_buffer_number: 64
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.compression: LZ4
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.bottommost_compression: Disabled
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.prefix_extractor: nullptr
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.memtable_insert_with_hint_prefix_extractor: nullptr
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.num_levels: 7
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.min_write_buffer_number_to_merge: 6
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.max_write_buffer_number_to_maintain: 0
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.max_write_buffer_size_to_maintain: 0
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.bottommost_compression_opts.window_bits: -14
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.bottommost_compression_opts.level: 32767
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.bottommost_compression_opts.strategy: 0
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.bottommost_compression_opts.max_dict_bytes: 0
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.bottommost_compression_opts.parallel_threads: 1
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.bottommost_compression_opts.enabled: false
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.compression_opts.window_bits: -14
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.compression_opts.level: 32767
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.compression_opts.strategy: 0
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.compression_opts.max_dict_bytes: 0
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.compression_opts.zstd_max_train_bytes: 0
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.compression_opts.use_zstd_dict_trainer: true
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.compression_opts.parallel_threads: 1
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.compression_opts.enabled: false
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.compression_opts.max_dict_buffer_bytes: 0
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.level0_file_num_compaction_trigger: 8
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.level0_slowdown_writes_trigger: 20
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.level0_stop_writes_trigger: 36
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.target_file_size_base: 67108864
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.target_file_size_multiplier: 1
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.max_bytes_for_level_base: 1073741824
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.max_bytes_for_level_multiplier: 8.000000
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.max_sequential_skip_in_iterations: 8
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.max_compaction_bytes: 1677721600
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.ignore_max_compaction_bytes_for_input: true
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.arena_block_size: 1048576
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.soft_pending_compaction_bytes_limit: 68719476736
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.hard_pending_compaction_bytes_limit: 274877906944
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.disable_auto_compactions: 0
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.compaction_style: kCompactionStyleLevel
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.compaction_pri: kMinOverlappingRatio
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.inplace_update_support: 0
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.inplace_update_num_locks: 10000
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.memtable_prefix_bloom_size_ratio: 0.000000
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.memtable_whole_key_filtering: 0
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.memtable_huge_page_size: 0
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.bloom_locality: 0
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.max_successive_merges: 0
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.optimize_filters_for_hits: 0
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.paranoid_file_checks: 0
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.force_consistency_checks: 1
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.report_bg_io_stats: 0
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.ttl: 2592000
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.periodic_compaction_seconds: 0
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.preclude_last_level_data_seconds: 0
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.preserve_internal_time_seconds: 0
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.enable_blob_files: false
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.min_blob_size: 0
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.blob_file_size: 268435456
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.blob_compression_type: NoCompression
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.enable_blob_garbage_collection: false
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.blob_garbage_collection_age_cutoff: 0.250000
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.blob_compaction_readahead_size: 0
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.blob_file_starting_level: 0
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: [db/column_family.cc:578] Failed to register data paths of column family (id: 9, name: O-2)
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [O-2]:
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.comparator: leveldb.BytewiseComparator
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.merge_operator: None
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.compaction_filter: None
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.compaction_filter_factory: None
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.sst_partitioner_factory: None
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.memtable_factory: SkipListFactory
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.table_factory: BlockBasedTable
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: table_factory options: flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x564ee0cbef60)#012 cache_index_and_filter_blocks: 1#012 cache_index_and_filter_blocks_with_high_priority: 0#012 pin_l0_filter_and_index_blocks_in_cache: 0#012 pin_top_level_index_and_filter: 1#012 index_type: 0#012 data_block_index_type: 0#012 index_shortening: 1#012 data_block_hash_table_util_ratio: 0.750000#012 checksum: 4#012 no_block_cache: 0#012 block_cache: 0x564edfcfa2d0#012 block_cache_name: BinnedLRUCache#012 block_cache_options:#012 capacity : 536870912#012 num_shard_bits : 4#012 strict_capacity_limit : 0#012 high_pri_pool_ratio: 0.000#012 block_cache_compressed: (nil)#012 persistent_cache: (nil)#012 block_size: 4096#012 block_size_deviation: 10#012 block_restart_interval: 16#012 index_block_restart_interval: 1#012 metadata_block_size: 4096#012 partition_filters: 0#012 use_delta_encoding: 1#012 filter_policy: bloomfilter#012 whole_key_filtering: 1#012 verify_compression: 0#012 read_amp_bytes_per_bit: 0#012 format_version: 5#012 enable_index_compression: 1#012 block_align: 0#012 max_auto_readahead_size: 262144#012 prepopulate_block_cache: 0#012 initial_auto_readahead_size: 8192#012 num_file_reads_for_auto_readahead: 2
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.write_buffer_size: 16777216
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.max_write_buffer_number: 64
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.compression: LZ4
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.bottommost_compression: Disabled
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.prefix_extractor: nullptr
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.memtable_insert_with_hint_prefix_extractor: nullptr
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.num_levels: 7
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.min_write_buffer_number_to_merge: 6
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.max_write_buffer_number_to_maintain: 0
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.max_write_buffer_size_to_maintain: 0
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.bottommost_compression_opts.window_bits: -14
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.bottommost_compression_opts.level: 32767
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.bottommost_compression_opts.strategy: 0
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.bottommost_compression_opts.max_dict_bytes: 0
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.bottommost_compression_opts.parallel_threads: 1
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.bottommost_compression_opts.enabled: false
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.compression_opts.window_bits: -14
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.compression_opts.level: 32767
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.compression_opts.strategy: 0
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.compression_opts.max_dict_bytes: 0
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.compression_opts.zstd_max_train_bytes: 0
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.compression_opts.use_zstd_dict_trainer: true
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.compression_opts.parallel_threads: 1
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.compression_opts.enabled: false
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.compression_opts.max_dict_buffer_bytes: 0
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.level0_file_num_compaction_trigger: 8
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.level0_slowdown_writes_trigger: 20
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.level0_stop_writes_trigger: 36
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.target_file_size_base: 67108864
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.target_file_size_multiplier: 1
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.max_bytes_for_level_base: 1073741824
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.max_bytes_for_level_multiplier: 8.000000
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.max_sequential_skip_in_iterations: 8
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.max_compaction_bytes: 1677721600
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.ignore_max_compaction_bytes_for_input: true
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.arena_block_size: 1048576
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.soft_pending_compaction_bytes_limit: 68719476736
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.hard_pending_compaction_bytes_limit: 274877906944
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.disable_auto_compactions: 0
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.compaction_style: kCompactionStyleLevel
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.compaction_pri: kMinOverlappingRatio
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.inplace_update_support: 0
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.inplace_update_num_locks: 10000
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.memtable_prefix_bloom_size_ratio: 0.000000
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.memtable_whole_key_filtering: 0
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.memtable_huge_page_size: 0
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.bloom_locality: 0
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.max_successive_merges: 0
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.optimize_filters_for_hits: 0
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.paranoid_file_checks: 0
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.force_consistency_checks: 1
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.report_bg_io_stats: 0
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.ttl: 2592000
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.periodic_compaction_seconds: 0
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.preclude_last_level_data_seconds: 0
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.preserve_internal_time_seconds: 0
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.enable_blob_files: false
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.min_blob_size: 0
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.blob_file_size: 268435456
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.blob_compression_type: NoCompression
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.enable_blob_garbage_collection: false
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.blob_garbage_collection_age_cutoff: 0.250000
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.blob_compaction_readahead_size: 0
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.blob_file_starting_level: 0
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: [db/column_family.cc:578] Failed to register data paths of column family (id: 10, name: L)
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: [db/column_family.cc:635] #011(skipping printing options)
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: [db/column_family.cc:578] Failed to register data paths of column family (id: 11, name: P)
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: [db/column_family.cc:635] #011(skipping printing options)
Dec 15 02:45:09 localhost systemd[1]: Started libpod-conmon-0f490476795a09baac10199b73c8aa8db647d75425267f4a166b445a21c3f46e.scope.
Dec 15 02:45:09 localhost systemd[1]: tmp-crun.LnRpdP.mount: Deactivated successfully.
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: [db/version_set.cc:5566] Recovered from manifest file:db/MANIFEST-000032 succeeded,manifest_file_number is 32, next_file_number is 34, last_sequence is 12, log_number is 5,prev_log_number is 0,max_column_family is 11,min_log_number_to_keep is 5
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: [db/version_set.cc:5581] Column family [default] (ID 0), log number is 5
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: [db/version_set.cc:5581] Column family [m-0] (ID 1), log number is 5
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: [db/version_set.cc:5581] Column family [m-1] (ID 2), log number is 5
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: [db/version_set.cc:5581] Column family [m-2] (ID 3), log number is 5
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: [db/version_set.cc:5581] Column family [p-0] (ID 4), log number is 5
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: [db/version_set.cc:5581] Column family [p-1] (ID 5), log number is 5
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: [db/version_set.cc:5581] Column family [p-2] (ID 6), log number is 5
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: [db/version_set.cc:5581] Column family [O-0] (ID 7), log number is 5
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: [db/version_set.cc:5581] Column family [O-1] (ID 8), log number is 5
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: [db/version_set.cc:5581] Column family [O-2] (ID 9), log number is 5
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: [db/version_set.cc:5581] Column family [L] (ID 10), log number is 5
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: [db/version_set.cc:5581] Column family [P] (ID 11), log number is 5
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: [db/db_impl/db_impl_open.cc:539] DB ID: ace239a5-421c-4fff-b74c-5c7a1464e415
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: EVENT_LOG_v1 {"time_micros": 1765784709715982, "job": 1, "event": "recovery_started", "wal_files": [31]}
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: [db/db_impl/db_impl_open.cc:1043] Recovering log #31 mode 2
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: EVENT_LOG_v1 {"time_micros": 1765784709716331, "job": 1, "event": "recovery_finished"}
Dec 15 02:45:09 localhost ceph-osd[32311]: bluestore(/var/lib/ceph/osd/ceph-3) _open_db opened rocksdb path db options compression=kLZ4Compression,max_write_buffer_number=64,min_write_buffer_number_to_merge=6,compaction_style=kCompactionStyleLevel,write_buffer_size=16777216,max_background_jobs=4,level0_file_num_compaction_trigger=8,max_bytes_for_level_base=1073741824,max_bytes_for_level_multiplier=8,compaction_readahead_size=2MB,max_total_wal_size=1073741824,writable_file_max_buffer_size=0
Dec 15 02:45:09 localhost ceph-osd[32311]: bluestore(/var/lib/ceph/osd/ceph-3) _open_super_meta old nid_max 1025
Dec 15 02:45:09 localhost ceph-osd[32311]: bluestore(/var/lib/ceph/osd/ceph-3) _open_super_meta old blobid_max 10240
Dec 15 02:45:09 localhost ceph-osd[32311]: bluestore(/var/lib/ceph/osd/ceph-3) _open_super_meta ondisk_format 4 compat_ondisk_format 3
Dec 15 02:45:09 localhost ceph-osd[32311]: bluestore(/var/lib/ceph/osd/ceph-3) _open_super_meta min_alloc_size 0x1000
Dec 15 02:45:09 localhost ceph-osd[32311]: freelist init
Dec 15 02:45:09 localhost ceph-osd[32311]: freelist _read_cfg
Dec 15 02:45:09 localhost ceph-osd[32311]: bluestore(/var/lib/ceph/osd/ceph-3) _init_alloc loaded 7.0 GiB in 2 extents, allocator type hybrid, capacity 0x1bfc00000, block size 0x1000, free 0x1bfbfd000, fragmentation 5.5e-07
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: [db/db_impl/db_impl.cc:496] Shutdown: canceling all background work
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: [db/db_impl/db_impl.cc:704] Shutdown complete
Dec 15 02:45:09 localhost ceph-osd[32311]: bluefs umount
Dec 15 02:45:09 localhost ceph-osd[32311]: bdev(0x564edfd23180 /var/lib/ceph/osd/ceph-3/block) close
Dec 15 02:45:09 localhost systemd[1]: Started libcrun container.
Dec 15 02:45:09 localhost podman[32406]: 2025-12-15 07:45:09.742939382 +0000 UTC m=+0.200848317 container init 0f490476795a09baac10199b73c8aa8db647d75425267f4a166b445a21c3f46e (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=amazing_tharp, distribution-scope=public, architecture=x86_64, name=rhceph, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, GIT_REPO=https://github.com/ceph/ceph-container.git, GIT_CLEAN=True, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, com.redhat.component=rhceph-container, RELEASE=main, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.k8s.description=Red Hat Ceph Storage 7, io.openshift.expose-services=, url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.tags=rhceph ceph, vcs-type=git, vendor=Red Hat, Inc., version=7, maintainer=Guillaume Abrioux , org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, GIT_BRANCH=main, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, ceph=True, release=1763362218, build-date=2025-11-26T19:44:28Z, io.buildah.version=1.41.4, CEPH_POINT_RELEASE=, description=Red Hat Ceph Storage 7)
Dec 15 02:45:09 localhost amazing_tharp[32508]: 167 167
Dec 15 02:45:09 localhost systemd[1]: libpod-0f490476795a09baac10199b73c8aa8db647d75425267f4a166b445a21c3f46e.scope: Deactivated successfully.
Dec 15 02:45:09 localhost podman[32406]: 2025-12-15 07:45:09.776791143 +0000 UTC m=+0.234700038 container start 0f490476795a09baac10199b73c8aa8db647d75425267f4a166b445a21c3f46e (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=amazing_tharp, build-date=2025-11-26T19:44:28Z, architecture=x86_64, CEPH_POINT_RELEASE=, io.k8s.description=Red Hat Ceph Storage 7, GIT_REPO=https://github.com/ceph/ceph-container.git, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.openshift.expose-services=, name=rhceph, com.redhat.component=rhceph-container, url=https://catalog.redhat.com/en/search?searchType=containers, version=7, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_BRANCH=main, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, RELEASE=main, io.openshift.tags=rhceph ceph, release=1763362218, maintainer=Guillaume Abrioux , ceph=True, vcs-type=git, GIT_CLEAN=True, vendor=Red Hat, Inc., GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.buildah.version=1.41.4, distribution-scope=public, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, description=Red Hat Ceph Storage 7, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0)
Dec 15 02:45:09 localhost podman[32406]: 2025-12-15 07:45:09.777652944 +0000 UTC m=+0.235561899 container attach 0f490476795a09baac10199b73c8aa8db647d75425267f4a166b445a21c3f46e (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=amazing_tharp, com.redhat.component=rhceph-container, GIT_REPO=https://github.com/ceph/ceph-container.git, ceph=True, name=rhceph, description=Red Hat Ceph Storage 7, io.openshift.tags=rhceph ceph, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, GIT_CLEAN=True, maintainer=Guillaume Abrioux , build-date=2025-11-26T19:44:28Z, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, distribution-scope=public, CEPH_POINT_RELEASE=, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, io.k8s.description=Red Hat Ceph Storage 7, GIT_BRANCH=main, vendor=Red Hat, Inc., GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.openshift.expose-services=, version=7, release=1763362218, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.buildah.version=1.41.4, RELEASE=main)
Dec 15 02:45:09 localhost podman[32406]: 2025-12-15 07:45:09.780517978 +0000 UTC m=+0.238426893 container died 0f490476795a09baac10199b73c8aa8db647d75425267f4a166b445a21c3f46e (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=amazing_tharp, GIT_REPO=https://github.com/ceph/ceph-container.git, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., ceph=True, vendor=Red Hat, Inc., build-date=2025-11-26T19:44:28Z, io.openshift.tags=rhceph ceph, GIT_CLEAN=True, description=Red Hat Ceph Storage 7, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, version=7, architecture=x86_64, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, distribution-scope=public, RELEASE=main, CEPH_POINT_RELEASE=, vcs-type=git, io.buildah.version=1.41.4, io.k8s.description=Red Hat Ceph Storage 7, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=rhceph-container, release=1763362218, name=rhceph, GIT_BRANCH=main, maintainer=Guillaume Abrioux , io.openshift.expose-services=)
Dec 15 02:45:09 localhost podman[32619]: 2025-12-15 07:45:09.890284601 +0000 UTC m=+0.120859119 container remove 0f490476795a09baac10199b73c8aa8db647d75425267f4a166b445a21c3f46e (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=amazing_tharp, name=rhceph, io.openshift.expose-services=, ceph=True, io.buildah.version=1.41.4, description=Red Hat Ceph Storage 7, RELEASE=main, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.component=rhceph-container, build-date=2025-11-26T19:44:28Z, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, vendor=Red Hat, Inc., CEPH_POINT_RELEASE=, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, version=7, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, distribution-scope=public, vcs-type=git, maintainer=Guillaume Abrioux , GIT_BRANCH=main, GIT_CLEAN=True, release=1763362218, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_REPO=https://github.com/ceph/ceph-container.git, io.k8s.description=Red Hat Ceph Storage 7, io.openshift.tags=rhceph ceph, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0)
Dec 15 02:45:09 localhost systemd[1]: libpod-conmon-0f490476795a09baac10199b73c8aa8db647d75425267f4a166b445a21c3f46e.scope: Deactivated successfully.
Dec 15 02:45:09 localhost ceph-osd[32311]: bdev(0x564edfd23180 /var/lib/ceph/osd/ceph-3/block) open path /var/lib/ceph/osd/ceph-3/block
Dec 15 02:45:09 localhost ceph-osd[32311]: bdev(0x564edfd23180 /var/lib/ceph/osd/ceph-3/block) ioctl(F_SET_FILE_RW_HINT) on /var/lib/ceph/osd/ceph-3/block failed: (22) Invalid argument
Dec 15 02:45:09 localhost ceph-osd[32311]: bdev(0x564edfd23180 /var/lib/ceph/osd/ceph-3/block) open size 7511998464 (0x1bfc00000, 7.0 GiB) block_size 4096 (4 KiB) rotational device, discard supported
Dec 15 02:45:09 localhost ceph-osd[32311]: bluefs add_block_device bdev 1 path /var/lib/ceph/osd/ceph-3/block size 7.0 GiB
Dec 15 02:45:09 localhost ceph-osd[32311]: bluefs mount
Dec 15 02:45:09 localhost ceph-osd[32311]: bluefs _init_alloc shared, id 1, capacity 0x1bfc00000, block size 0x10000
Dec 15 02:45:09 localhost ceph-osd[32311]: bluefs mount shared_bdev_used = 4718592
Dec 15 02:45:09 localhost ceph-osd[32311]: bluestore(/var/lib/ceph/osd/ceph-3) _prepare_db_environment set db_paths to db,7136398540 db.slow,7136398540
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: RocksDB version: 7.9.2
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Git sha 0
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Compile date 2025-09-23 00:00:00
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: DB SUMMARY
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: DB Session ID: OTGNSRAE5SWL4LSU2B72
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: CURRENT file: CURRENT
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: IDENTITY file: IDENTITY
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: MANIFEST file: MANIFEST-000032 size: 1007 Bytes
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: SST files in db dir, Total Num: 1, files: 000030.sst
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: SST files in db.slow dir, Total Num: 0, files:
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Write Ahead Log file in db.wal: 000031.log size: 5093 ;
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.error_if_exists: 0
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.create_if_missing: 0
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.paranoid_checks: 1
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.flush_verify_memtable_count: 1
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.track_and_verify_wals_in_manifest: 0
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.verify_sst_unique_id_in_manifest: 1
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.env: 0x564edfe485b0
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.fs: LegacyFileSystem
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.info_log: 0x564ee0d22f80
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.max_file_opening_threads: 16
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.statistics: (nil)
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.use_fsync: 0
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.max_log_file_size: 0
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.max_manifest_file_size: 1073741824
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.log_file_time_to_roll: 0
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.keep_log_file_num: 1000
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.recycle_log_file_num: 0
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.allow_fallocate: 1
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.allow_mmap_reads: 0
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.allow_mmap_writes: 0
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.use_direct_reads: 0
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.use_direct_io_for_flush_and_compaction: 0
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.create_missing_column_families: 0
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.db_log_dir:
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.wal_dir: db.wal
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.table_cache_numshardbits: 6
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.WAL_ttl_seconds: 0
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.WAL_size_limit_MB: 0
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.max_write_batch_group_size_bytes: 1048576
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.manifest_preallocation_size: 4194304
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.is_fd_close_on_exec: 1
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.advise_random_on_open: 1
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.db_write_buffer_size: 0
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.write_buffer_manager: 0x564edfd0d5e0
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.access_hint_on_compaction_start: 1
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.random_access_max_buffer_size: 1048576
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.use_adaptive_mutex: 0
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.rate_limiter: (nil)
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.sst_file_manager.rate_bytes_per_sec: 0
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.wal_recovery_mode: 2
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.enable_thread_tracking: 0
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.enable_pipelined_write: 0
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.unordered_write: 0
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.allow_concurrent_memtable_write: 1
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.enable_write_thread_adaptive_yield: 1
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.write_thread_max_yield_usec: 100
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.write_thread_slow_yield_usec: 3
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.row_cache: None
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.wal_filter: None
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.avoid_flush_during_recovery: 0
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.allow_ingest_behind: 0
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.two_write_queues: 0
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.manual_wal_flush: 0
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.wal_compression: 0
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.atomic_flush: 0
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.avoid_unnecessary_blocking_io: 0
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.persist_stats_to_disk: 0
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.write_dbid_to_manifest: 0
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.log_readahead_size: 0
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.file_checksum_gen_factory: Unknown
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.best_efforts_recovery: 0
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.max_bgerror_resume_count: 2147483647
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.bgerror_resume_retry_interval: 1000000
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.allow_data_in_errors: 0
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.db_host_id: __hostname__
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.enforce_single_del_contracts: true
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.max_background_jobs: 4
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.max_background_compactions: -1
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.max_subcompactions: 1
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.avoid_flush_during_shutdown: 0
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.writable_file_max_buffer_size: 0
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.delayed_write_rate : 16777216
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.max_total_wal_size: 1073741824
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.delete_obsolete_files_period_micros: 21600000000
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.stats_dump_period_sec: 600
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.stats_persist_period_sec: 600
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.stats_history_buffer_size: 1048576
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.max_open_files: -1
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.bytes_per_sync: 0
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.wal_bytes_per_sync: 0
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.strict_bytes_per_sync: 0
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.compaction_readahead_size: 2097152
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.max_background_flushes: -1
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Compression algorithms supported:
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: #011kZSTD supported: 0
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: #011kXpressCompression supported: 0
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: #011kBZip2Compression supported: 0
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: #011kZSTDNotFinalCompression supported: 0
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: #011kLZ4Compression supported: 1
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: #011kZlibCompression supported: 1
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: #011kLZ4HCCompression supported: 1
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: #011kSnappyCompression supported: 1
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Fast CRC32 supported: Supported on x86
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: DMutex implementation: pthread_mutex_t
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: [db/version_set.cc:5527] Recovering from manifest file: db/MANIFEST-000032
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: [db/column_family.cc:578] Failed to register data paths of column family (id: 0, name: default)
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [default]:
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.comparator: leveldb.BytewiseComparator
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.merge_operator: .T:int64_array.b:bitwise_xor
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.compaction_filter: None
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.compaction_filter_factory: None
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.sst_partitioner_factory: None
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.memtable_factory: SkipListFactory
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.table_factory: BlockBasedTable
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: table_factory options: flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x564ee0d23180)#012 cache_index_and_filter_blocks: 1#012 cache_index_and_filter_blocks_with_high_priority: 0#012 pin_l0_filter_and_index_blocks_in_cache: 0#012 pin_top_level_index_and_filter: 1#012 index_type: 0#012 data_block_index_type: 0#012 index_shortening: 1#012 data_block_hash_table_util_ratio: 0.750000#012 checksum: 4#012 no_block_cache: 0#012 block_cache: 0x564edfcfa2d0#012 block_cache_name: BinnedLRUCache#012 block_cache_options:#012 capacity : 483183820#012 num_shard_bits : 4#012 strict_capacity_limit : 0#012 high_pri_pool_ratio: 0.000#012 block_cache_compressed: (nil)#012 persistent_cache: (nil)#012 block_size: 4096#012 block_size_deviation: 10#012 block_restart_interval: 16#012 index_block_restart_interval: 1#012 metadata_block_size: 4096#012 partition_filters: 0#012 use_delta_encoding: 1#012 filter_policy: bloomfilter#012 whole_key_filtering: 1#012 verify_compression: 0#012 read_amp_bytes_per_bit: 0#012 format_version: 5#012 enable_index_compression: 1#012 block_align: 0#012 max_auto_readahead_size: 262144#012 prepopulate_block_cache: 0#012 initial_auto_readahead_size: 8192#012 num_file_reads_for_auto_readahead: 2
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.write_buffer_size: 16777216
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.max_write_buffer_number: 64
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.compression: LZ4
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.bottommost_compression: Disabled
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.prefix_extractor: nullptr
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.memtable_insert_with_hint_prefix_extractor: nullptr
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.num_levels: 7
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.min_write_buffer_number_to_merge: 6
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.max_write_buffer_number_to_maintain: 0
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.max_write_buffer_size_to_maintain: 0
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.bottommost_compression_opts.window_bits: -14
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.bottommost_compression_opts.level: 32767
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.bottommost_compression_opts.strategy: 0
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.bottommost_compression_opts.max_dict_bytes: 0
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.bottommost_compression_opts.parallel_threads: 1
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.bottommost_compression_opts.enabled: false
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.compression_opts.window_bits: -14
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.compression_opts.level: 32767
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.compression_opts.strategy: 0
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.compression_opts.max_dict_bytes: 0
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.compression_opts.zstd_max_train_bytes: 0
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.compression_opts.use_zstd_dict_trainer: true
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.compression_opts.parallel_threads: 1
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.compression_opts.enabled: false
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.compression_opts.max_dict_buffer_bytes: 0
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.level0_file_num_compaction_trigger: 8
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.level0_slowdown_writes_trigger: 20
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.level0_stop_writes_trigger: 36
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.target_file_size_base: 67108864
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.target_file_size_multiplier: 1
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.max_bytes_for_level_base: 1073741824
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.max_bytes_for_level_multiplier: 8.000000
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.max_sequential_skip_in_iterations: 8
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.max_compaction_bytes: 1677721600
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.ignore_max_compaction_bytes_for_input: true
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.arena_block_size: 1048576
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.soft_pending_compaction_bytes_limit: 68719476736
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.hard_pending_compaction_bytes_limit: 274877906944
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.disable_auto_compactions: 0
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.compaction_style: kCompactionStyleLevel
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.compaction_pri: kMinOverlappingRatio
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.table_properties_collectors:
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.inplace_update_support: 0
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.inplace_update_num_locks: 10000
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.memtable_prefix_bloom_size_ratio: 0.000000
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.memtable_whole_key_filtering: 0
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.memtable_huge_page_size: 0
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.bloom_locality: 0
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.max_successive_merges: 0
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.optimize_filters_for_hits: 0
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.paranoid_file_checks: 0
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.force_consistency_checks: 1
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.report_bg_io_stats: 0
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.ttl: 2592000
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.periodic_compaction_seconds: 0
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.preclude_last_level_data_seconds: 0
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.preserve_internal_time_seconds: 0
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.enable_blob_files: false
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.min_blob_size: 0
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.blob_file_size: 268435456
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.blob_compression_type: NoCompression
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.enable_blob_garbage_collection: false
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.blob_garbage_collection_age_cutoff: 0.250000
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.blob_compaction_readahead_size: 0
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.blob_file_starting_level: 0
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: [db/column_family.cc:578] Failed to register data paths of column family (id: 1, name: m-0)
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [m-0]:
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.comparator: leveldb.BytewiseComparator
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.merge_operator: None
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.compaction_filter: None
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.compaction_filter_factory: None
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.sst_partitioner_factory: None
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.memtable_factory: SkipListFactory
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.table_factory: BlockBasedTable
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: table_factory options: flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x564ee0d23180)#012 cache_index_and_filter_blocks: 1#012 cache_index_and_filter_blocks_with_high_priority: 0#012 pin_l0_filter_and_index_blocks_in_cache: 0#012 pin_top_level_index_and_filter: 1#012 index_type: 0#012 data_block_index_type: 0#012 index_shortening: 1#012 data_block_hash_table_util_ratio: 0.750000#012 checksum: 4#012 no_block_cache: 0#012 block_cache: 0x564edfcfa2d0#012 block_cache_name: BinnedLRUCache#012 block_cache_options:#012 capacity : 483183820#012 num_shard_bits : 4#012 strict_capacity_limit : 0#012 high_pri_pool_ratio: 0.000#012 block_cache_compressed: (nil)#012 persistent_cache: (nil)#012 block_size: 4096#012 block_size_deviation: 10#012 block_restart_interval: 16#012 index_block_restart_interval: 1#012 metadata_block_size: 4096#012 partition_filters: 0#012 use_delta_encoding: 1#012 filter_policy: bloomfilter#012 whole_key_filtering: 1#012 verify_compression: 0#012 read_amp_bytes_per_bit: 0#012 format_version: 5#012 enable_index_compression: 1#012 block_align: 0#012 max_auto_readahead_size: 262144#012 prepopulate_block_cache: 0#012 initial_auto_readahead_size: 8192#012 num_file_reads_for_auto_readahead: 2
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.write_buffer_size: 16777216
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.max_write_buffer_number: 64
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.compression: LZ4
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.bottommost_compression: Disabled
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.prefix_extractor: nullptr
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.memtable_insert_with_hint_prefix_extractor: nullptr
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.num_levels: 7
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.min_write_buffer_number_to_merge: 6
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.max_write_buffer_number_to_maintain: 0
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.max_write_buffer_size_to_maintain: 0
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.bottommost_compression_opts.window_bits: -14
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.bottommost_compression_opts.level: 32767
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.bottommost_compression_opts.strategy: 0
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.bottommost_compression_opts.max_dict_bytes: 0
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.bottommost_compression_opts.parallel_threads: 1
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.bottommost_compression_opts.enabled: false
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.compression_opts.window_bits: -14
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.compression_opts.level: 32767
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.compression_opts.strategy: 0
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.compression_opts.max_dict_bytes: 0
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.compression_opts.zstd_max_train_bytes: 0
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.compression_opts.use_zstd_dict_trainer: true
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.compression_opts.parallel_threads: 1
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.compression_opts.enabled: false
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.compression_opts.max_dict_buffer_bytes: 0
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.level0_file_num_compaction_trigger: 8
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.level0_slowdown_writes_trigger: 20
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.level0_stop_writes_trigger: 36
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.target_file_size_base: 67108864
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.target_file_size_multiplier: 1
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.max_bytes_for_level_base: 1073741824
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.max_bytes_for_level_multiplier: 8.000000
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.max_sequential_skip_in_iterations: 8
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.max_compaction_bytes: 1677721600
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.ignore_max_compaction_bytes_for_input: true
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.arena_block_size: 1048576
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.soft_pending_compaction_bytes_limit: 68719476736
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.hard_pending_compaction_bytes_limit: 274877906944
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.disable_auto_compactions: 0
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.compaction_style: kCompactionStyleLevel
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.compaction_pri: kMinOverlappingRatio
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.inplace_update_support: 0
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.inplace_update_num_locks: 10000
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.memtable_prefix_bloom_size_ratio: 0.000000
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.memtable_whole_key_filtering: 0
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.memtable_huge_page_size: 0
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.bloom_locality: 0
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.max_successive_merges: 0
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.optimize_filters_for_hits: 0
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.paranoid_file_checks: 0
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.force_consistency_checks: 1
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.report_bg_io_stats: 0
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.ttl: 2592000
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.periodic_compaction_seconds: 0
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.preclude_last_level_data_seconds: 0
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.preserve_internal_time_seconds: 0
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.enable_blob_files: false
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.min_blob_size: 0
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.blob_file_size: 268435456
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.blob_compression_type: NoCompression
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.enable_blob_garbage_collection: false
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.blob_garbage_collection_age_cutoff: 0.250000
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.blob_compaction_readahead_size: 0
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.blob_file_starting_level: 0
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: [db/column_family.cc:578] Failed to register data paths of column family (id: 2, name: m-1)
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [m-1]:
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.comparator: leveldb.BytewiseComparator
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.merge_operator: None
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.compaction_filter: None
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.compaction_filter_factory: None
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.sst_partitioner_factory: None
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.memtable_factory: SkipListFactory
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.table_factory: BlockBasedTable
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: table_factory options: flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x564ee0d23180)#012 cache_index_and_filter_blocks: 1#012 cache_index_and_filter_blocks_with_high_priority: 0#012 pin_l0_filter_and_index_blocks_in_cache: 0#012 pin_top_level_index_and_filter: 1#012 index_type: 0#012 data_block_index_type: 0#012 index_shortening: 1#012 data_block_hash_table_util_ratio: 0.750000#012 checksum: 4#012 no_block_cache: 0#012 block_cache: 0x564edfcfa2d0#012 block_cache_name: BinnedLRUCache#012 block_cache_options:#012 capacity : 483183820#012 num_shard_bits : 4#012 strict_capacity_limit : 0#012 high_pri_pool_ratio: 0.000#012 block_cache_compressed: (nil)#012 persistent_cache: (nil)#012 block_size: 4096#012 block_size_deviation: 10#012 block_restart_interval: 16#012 index_block_restart_interval: 1#012 metadata_block_size: 4096#012 partition_filters: 0#012 use_delta_encoding: 1#012 filter_policy: bloomfilter#012 whole_key_filtering: 1#012 verify_compression: 0#012 read_amp_bytes_per_bit: 0#012 format_version: 5#012 enable_index_compression: 1#012 block_align: 0#012 max_auto_readahead_size: 262144#012 prepopulate_block_cache: 0#012 initial_auto_readahead_size: 8192#012 num_file_reads_for_auto_readahead: 2
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.write_buffer_size: 16777216
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.max_write_buffer_number: 64
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.compression: LZ4
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.bottommost_compression: Disabled
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.prefix_extractor: nullptr
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.memtable_insert_with_hint_prefix_extractor: nullptr
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.num_levels: 7
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.min_write_buffer_number_to_merge: 6
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.max_write_buffer_number_to_maintain: 0
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.max_write_buffer_size_to_maintain: 0
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.bottommost_compression_opts.window_bits: -14
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.bottommost_compression_opts.level: 32767
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.bottommost_compression_opts.strategy: 0
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.bottommost_compression_opts.max_dict_bytes: 0
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.bottommost_compression_opts.parallel_threads: 1
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.bottommost_compression_opts.enabled: false
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.compression_opts.window_bits: -14
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.compression_opts.level: 32767
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.compression_opts.strategy: 0
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.compression_opts.max_dict_bytes: 0
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.compression_opts.zstd_max_train_bytes: 0
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.compression_opts.use_zstd_dict_trainer: true
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.compression_opts.parallel_threads: 1
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.compression_opts.enabled: false
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.compression_opts.max_dict_buffer_bytes: 0
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.level0_file_num_compaction_trigger: 8
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.level0_slowdown_writes_trigger: 20
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.level0_stop_writes_trigger: 36
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.target_file_size_base: 67108864
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.target_file_size_multiplier: 1
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.max_bytes_for_level_base: 1073741824
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.max_bytes_for_level_multiplier: 8.000000
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb:
Options.max_bytes_for_level_multiplier_addtl[5]: 1 Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1 Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.max_sequential_skip_in_iterations: 8 Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.max_compaction_bytes: 1677721600 Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.ignore_max_compaction_bytes_for_input: true Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.arena_block_size: 1048576 Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.soft_pending_compaction_bytes_limit: 68719476736 Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.hard_pending_compaction_bytes_limit: 274877906944 Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.disable_auto_compactions: 0 Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.compaction_style: kCompactionStyleLevel Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.compaction_pri: kMinOverlappingRatio Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.compaction_options_universal.size_ratio: 1 Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.compaction_options_universal.min_merge_width: 2 Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295 Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200 Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1 Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824 Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0 Dec 15 02:45:09 localhost 
ceph-osd[32311]: rocksdb: Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0); Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.inplace_update_support: 0 Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.inplace_update_num_locks: 10000 Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.memtable_prefix_bloom_size_ratio: 0.000000 Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.memtable_whole_key_filtering: 0 Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.memtable_huge_page_size: 0 Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.bloom_locality: 0 Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.max_successive_merges: 0 Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.optimize_filters_for_hits: 0 Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.paranoid_file_checks: 0 Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.force_consistency_checks: 1 Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.report_bg_io_stats: 0 Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.ttl: 2592000 Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.periodic_compaction_seconds: 0 Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.preclude_last_level_data_seconds: 0 Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.preserve_internal_time_seconds: 0 Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.enable_blob_files: false Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.min_blob_size: 0 Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.blob_file_size: 268435456 Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.blob_compression_type: NoCompression Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.enable_blob_garbage_collection: false Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: 
Options.blob_garbage_collection_age_cutoff: 0.250000 Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000 Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.blob_compaction_readahead_size: 0 Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.blob_file_starting_level: 0 Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.experimental_mempurge_threshold: 0.000000 Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: [db/column_family.cc:578] Failed to register data paths of column family (id: 3, name: m-2) Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [m-2]: Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.comparator: leveldb.BytewiseComparator Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.merge_operator: None Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.compaction_filter: None Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.compaction_filter_factory: None Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.sst_partitioner_factory: None Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.memtable_factory: SkipListFactory Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.table_factory: BlockBasedTable Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: table_factory options: flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x564ee0d23180)#012 cache_index_and_filter_blocks: 1#012 cache_index_and_filter_blocks_with_high_priority: 0#012 pin_l0_filter_and_index_blocks_in_cache: 0#012 pin_top_level_index_and_filter: 1#012 index_type: 0#012 data_block_index_type: 0#012 index_shortening: 1#012 data_block_hash_table_util_ratio: 0.750000#012 checksum: 4#012 no_block_cache: 0#012 block_cache: 0x564edfcfa2d0#012 block_cache_name: BinnedLRUCache#012 block_cache_options:#012 capacity : 483183820#012 num_shard_bits : 4#012 
strict_capacity_limit : 0#012 high_pri_pool_ratio: 0.000#012 block_cache_compressed: (nil)#012 persistent_cache: (nil)#012 block_size: 4096#012 block_size_deviation: 10#012 block_restart_interval: 16#012 index_block_restart_interval: 1#012 metadata_block_size: 4096#012 partition_filters: 0#012 use_delta_encoding: 1#012 filter_policy: bloomfilter#012 whole_key_filtering: 1#012 verify_compression: 0#012 read_amp_bytes_per_bit: 0#012 format_version: 5#012 enable_index_compression: 1#012 block_align: 0#012 max_auto_readahead_size: 262144#012 prepopulate_block_cache: 0#012 initial_auto_readahead_size: 8192#012 num_file_reads_for_auto_readahead: 2 Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.write_buffer_size: 16777216 Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.max_write_buffer_number: 64 Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.compression: LZ4 Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.bottommost_compression: Disabled Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.prefix_extractor: nullptr Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.memtable_insert_with_hint_prefix_extractor: nullptr Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.num_levels: 7 Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.min_write_buffer_number_to_merge: 6 Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.max_write_buffer_number_to_maintain: 0 Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.max_write_buffer_size_to_maintain: 0 Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.bottommost_compression_opts.window_bits: -14 Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.bottommost_compression_opts.level: 32767 Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.bottommost_compression_opts.strategy: 0 Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.bottommost_compression_opts.max_dict_bytes: 0 Dec 15 02:45:09 localhost 
ceph-osd[32311]: rocksdb: Options.bottommost_compression_opts.zstd_max_train_bytes: 0 Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.bottommost_compression_opts.parallel_threads: 1 Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.bottommost_compression_opts.enabled: false Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.bottommost_compression_opts.max_dict_buffer_bytes: 0 Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.bottommost_compression_opts.use_zstd_dict_trainer: true Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.compression_opts.window_bits: -14 Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.compression_opts.level: 32767 Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.compression_opts.strategy: 0 Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.compression_opts.max_dict_bytes: 0 Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.compression_opts.zstd_max_train_bytes: 0 Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.compression_opts.use_zstd_dict_trainer: true Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.compression_opts.parallel_threads: 1 Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.compression_opts.enabled: false Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.compression_opts.max_dict_buffer_bytes: 0 Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.level0_file_num_compaction_trigger: 8 Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.level0_slowdown_writes_trigger: 20 Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.level0_stop_writes_trigger: 36 Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.target_file_size_base: 67108864 Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.target_file_size_multiplier: 1 Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.max_bytes_for_level_base: 1073741824 Dec 15 02:45:09 localhost ceph-osd[32311]: 
rocksdb: Options.level_compaction_dynamic_level_bytes: 0 Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.max_bytes_for_level_multiplier: 8.000000 Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1 Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1 Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1 Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1 Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1 Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1 Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1 Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.max_sequential_skip_in_iterations: 8 Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.max_compaction_bytes: 1677721600 Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.ignore_max_compaction_bytes_for_input: true Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.arena_block_size: 1048576 Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.soft_pending_compaction_bytes_limit: 68719476736 Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.hard_pending_compaction_bytes_limit: 274877906944 Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.disable_auto_compactions: 0 Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.compaction_style: kCompactionStyleLevel Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.compaction_pri: kMinOverlappingRatio Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.compaction_options_universal.size_ratio: 1 Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.compaction_options_universal.min_merge_width: 2 Dec 15 02:45:09 localhost 
ceph-osd[32311]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295 Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200 Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1 Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824 Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0 Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0); Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.inplace_update_support: 0 Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.inplace_update_num_locks: 10000 Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.memtable_prefix_bloom_size_ratio: 0.000000 Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.memtable_whole_key_filtering: 0 Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.memtable_huge_page_size: 0 Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.bloom_locality: 0 Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.max_successive_merges: 0 Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.optimize_filters_for_hits: 0 Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.paranoid_file_checks: 0 Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.force_consistency_checks: 1 Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.report_bg_io_stats: 0 Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.ttl: 2592000 Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: 
Options.periodic_compaction_seconds: 0 Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.preclude_last_level_data_seconds: 0 Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.preserve_internal_time_seconds: 0 Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.enable_blob_files: false Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.min_blob_size: 0 Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.blob_file_size: 268435456 Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.blob_compression_type: NoCompression Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.enable_blob_garbage_collection: false Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.blob_garbage_collection_age_cutoff: 0.250000 Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000 Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.blob_compaction_readahead_size: 0 Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.blob_file_starting_level: 0 Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.experimental_mempurge_threshold: 0.000000 Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: [db/column_family.cc:578] Failed to register data paths of column family (id: 4, name: p-0) Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [p-0]: Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.comparator: leveldb.BytewiseComparator Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.merge_operator: None Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.compaction_filter: None Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.compaction_filter_factory: None Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.sst_partitioner_factory: None Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.memtable_factory: SkipListFactory Dec 15 02:45:09 
localhost ceph-osd[32311]: rocksdb: Options.table_factory: BlockBasedTable Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: table_factory options: flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x564ee0d23180)#012 cache_index_and_filter_blocks: 1#012 cache_index_and_filter_blocks_with_high_priority: 0#012 pin_l0_filter_and_index_blocks_in_cache: 0#012 pin_top_level_index_and_filter: 1#012 index_type: 0#012 data_block_index_type: 0#012 index_shortening: 1#012 data_block_hash_table_util_ratio: 0.750000#012 checksum: 4#012 no_block_cache: 0#012 block_cache: 0x564edfcfa2d0#012 block_cache_name: BinnedLRUCache#012 block_cache_options:#012 capacity : 483183820#012 num_shard_bits : 4#012 strict_capacity_limit : 0#012 high_pri_pool_ratio: 0.000#012 block_cache_compressed: (nil)#012 persistent_cache: (nil)#012 block_size: 4096#012 block_size_deviation: 10#012 block_restart_interval: 16#012 index_block_restart_interval: 1#012 metadata_block_size: 4096#012 partition_filters: 0#012 use_delta_encoding: 1#012 filter_policy: bloomfilter#012 whole_key_filtering: 1#012 verify_compression: 0#012 read_amp_bytes_per_bit: 0#012 format_version: 5#012 enable_index_compression: 1#012 block_align: 0#012 max_auto_readahead_size: 262144#012 prepopulate_block_cache: 0#012 initial_auto_readahead_size: 8192#012 num_file_reads_for_auto_readahead: 2 Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.write_buffer_size: 16777216 Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.max_write_buffer_number: 64 Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.compression: LZ4 Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.bottommost_compression: Disabled Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.prefix_extractor: nullptr Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.memtable_insert_with_hint_prefix_extractor: nullptr Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.num_levels: 7 Dec 15 02:45:09 localhost 
ceph-osd[32311]: rocksdb: Options.min_write_buffer_number_to_merge: 6 Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.max_write_buffer_number_to_maintain: 0 Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.max_write_buffer_size_to_maintain: 0 Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.bottommost_compression_opts.window_bits: -14 Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.bottommost_compression_opts.level: 32767 Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.bottommost_compression_opts.strategy: 0 Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.bottommost_compression_opts.max_dict_bytes: 0 Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.bottommost_compression_opts.zstd_max_train_bytes: 0 Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.bottommost_compression_opts.parallel_threads: 1 Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.bottommost_compression_opts.enabled: false Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.bottommost_compression_opts.max_dict_buffer_bytes: 0 Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.bottommost_compression_opts.use_zstd_dict_trainer: true Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.compression_opts.window_bits: -14 Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.compression_opts.level: 32767 Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.compression_opts.strategy: 0 Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.compression_opts.max_dict_bytes: 0 Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.compression_opts.zstd_max_train_bytes: 0 Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.compression_opts.use_zstd_dict_trainer: true Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.compression_opts.parallel_threads: 1 Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.compression_opts.enabled: false Dec 15 
02:45:09 localhost ceph-osd[32311]: rocksdb: Options.compression_opts.max_dict_buffer_bytes: 0 Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.level0_file_num_compaction_trigger: 8 Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.level0_slowdown_writes_trigger: 20 Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.level0_stop_writes_trigger: 36 Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.target_file_size_base: 67108864 Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.target_file_size_multiplier: 1 Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.max_bytes_for_level_base: 1073741824 Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0 Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.max_bytes_for_level_multiplier: 8.000000 Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1 Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1 Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1 Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1 Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1 Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1 Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1 Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.max_sequential_skip_in_iterations: 8 Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.max_compaction_bytes: 1677721600 Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.ignore_max_compaction_bytes_for_input: true Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.arena_block_size: 1048576 Dec 15 02:45:09 localhost 
ceph-osd[32311]: rocksdb: Options.soft_pending_compaction_bytes_limit: 68719476736 Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.hard_pending_compaction_bytes_limit: 274877906944 Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.disable_auto_compactions: 0 Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.compaction_style: kCompactionStyleLevel Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.compaction_pri: kMinOverlappingRatio Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.compaction_options_universal.size_ratio: 1 Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.compaction_options_universal.min_merge_width: 2 Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295 Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200 Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1 Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824 Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0 Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0); Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.inplace_update_support: 0 Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.inplace_update_num_locks: 10000 Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.memtable_prefix_bloom_size_ratio: 0.000000 Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.memtable_whole_key_filtering: 0 Dec 15 02:45:09 localhost 
ceph-osd[32311]: rocksdb: Options.memtable_huge_page_size: 0 Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.bloom_locality: 0 Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.max_successive_merges: 0 Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.optimize_filters_for_hits: 0 Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.paranoid_file_checks: 0 Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.force_consistency_checks: 1 Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.report_bg_io_stats: 0 Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.ttl: 2592000 Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.periodic_compaction_seconds: 0 Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.preclude_last_level_data_seconds: 0 Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.preserve_internal_time_seconds: 0 Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.enable_blob_files: false Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.min_blob_size: 0 Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.blob_file_size: 268435456 Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.blob_compression_type: NoCompression Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.enable_blob_garbage_collection: false Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.blob_garbage_collection_age_cutoff: 0.250000 Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000 Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.blob_compaction_readahead_size: 0 Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.blob_file_starting_level: 0 Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.experimental_mempurge_threshold: 0.000000 Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: [db/column_family.cc:578] Failed to register data paths of column family (id: 5, 
name: p-1) Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [p-1]: Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.comparator: leveldb.BytewiseComparator Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.merge_operator: None Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.compaction_filter: None Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.compaction_filter_factory: None Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.sst_partitioner_factory: None Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.memtable_factory: SkipListFactory Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.table_factory: BlockBasedTable Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: table_factory options: flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x564ee0d23180)#012 cache_index_and_filter_blocks: 1#012 cache_index_and_filter_blocks_with_high_priority: 0#012 pin_l0_filter_and_index_blocks_in_cache: 0#012 pin_top_level_index_and_filter: 1#012 index_type: 0#012 data_block_index_type: 0#012 index_shortening: 1#012 data_block_hash_table_util_ratio: 0.750000#012 checksum: 4#012 no_block_cache: 0#012 block_cache: 0x564edfcfa2d0#012 block_cache_name: BinnedLRUCache#012 block_cache_options:#012 capacity : 483183820#012 num_shard_bits : 4#012 strict_capacity_limit : 0#012 high_pri_pool_ratio: 0.000#012 block_cache_compressed: (nil)#012 persistent_cache: (nil)#012 block_size: 4096#012 block_size_deviation: 10#012 block_restart_interval: 16#012 index_block_restart_interval: 1#012 metadata_block_size: 4096#012 partition_filters: 0#012 use_delta_encoding: 1#012 filter_policy: bloomfilter#012 whole_key_filtering: 1#012 verify_compression: 0#012 read_amp_bytes_per_bit: 0#012 format_version: 5#012 enable_index_compression: 1#012 block_align: 0#012 max_auto_readahead_size: 262144#012 prepopulate_block_cache: 0#012 
initial_auto_readahead_size: 8192#012 num_file_reads_for_auto_readahead: 2 Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.write_buffer_size: 16777216 Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.max_write_buffer_number: 64 Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.compression: LZ4 Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.bottommost_compression: Disabled Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.prefix_extractor: nullptr Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.memtable_insert_with_hint_prefix_extractor: nullptr Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.num_levels: 7 Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.min_write_buffer_number_to_merge: 6 Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.max_write_buffer_number_to_maintain: 0 Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.max_write_buffer_size_to_maintain: 0 Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.bottommost_compression_opts.window_bits: -14 Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.bottommost_compression_opts.level: 32767 Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.bottommost_compression_opts.strategy: 0 Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.bottommost_compression_opts.max_dict_bytes: 0 Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.bottommost_compression_opts.zstd_max_train_bytes: 0 Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.bottommost_compression_opts.parallel_threads: 1 Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.bottommost_compression_opts.enabled: false Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.bottommost_compression_opts.max_dict_buffer_bytes: 0 Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.bottommost_compression_opts.use_zstd_dict_trainer: true Dec 15 02:45:09 localhost ceph-osd[32311]: 
rocksdb: Options.compression_opts.window_bits: -14 Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.compression_opts.level: 32767 Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.compression_opts.strategy: 0 Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.compression_opts.max_dict_bytes: 0 Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.compression_opts.zstd_max_train_bytes: 0 Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.compression_opts.use_zstd_dict_trainer: true Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.compression_opts.parallel_threads: 1 Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.compression_opts.enabled: false Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.compression_opts.max_dict_buffer_bytes: 0 Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.level0_file_num_compaction_trigger: 8 Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.level0_slowdown_writes_trigger: 20 Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.level0_stop_writes_trigger: 36 Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.target_file_size_base: 67108864 Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.target_file_size_multiplier: 1 Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.max_bytes_for_level_base: 1073741824 Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0 Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.max_bytes_for_level_multiplier: 8.000000 Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1 Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1 Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1 Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1 
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1 Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1 Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1 Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.max_sequential_skip_in_iterations: 8 Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.max_compaction_bytes: 1677721600 Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.ignore_max_compaction_bytes_for_input: true Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.arena_block_size: 1048576 Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.soft_pending_compaction_bytes_limit: 68719476736 Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.hard_pending_compaction_bytes_limit: 274877906944 Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.disable_auto_compactions: 0 Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.compaction_style: kCompactionStyleLevel Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.compaction_pri: kMinOverlappingRatio Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.compaction_options_universal.size_ratio: 1 Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.compaction_options_universal.min_merge_width: 2 Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295 Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200 Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1 Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: 
Options.compaction_options_fifo.max_table_files_size: 1073741824 Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0 Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0); Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.inplace_update_support: 0 Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.inplace_update_num_locks: 10000 Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.memtable_prefix_bloom_size_ratio: 0.000000 Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.memtable_whole_key_filtering: 0 Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.memtable_huge_page_size: 0 Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.bloom_locality: 0 Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.max_successive_merges: 0 Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.optimize_filters_for_hits: 0 Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.paranoid_file_checks: 0 Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.force_consistency_checks: 1 Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.report_bg_io_stats: 0 Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.ttl: 2592000 Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.periodic_compaction_seconds: 0 Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.preclude_last_level_data_seconds: 0 Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.preserve_internal_time_seconds: 0 Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.enable_blob_files: false Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.min_blob_size: 0 Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.blob_file_size: 268435456 Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: 
Options.blob_compression_type: NoCompression Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.enable_blob_garbage_collection: false Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.blob_garbage_collection_age_cutoff: 0.250000 Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000 Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.blob_compaction_readahead_size: 0 Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.blob_file_starting_level: 0 Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.experimental_mempurge_threshold: 0.000000 Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: [db/column_family.cc:578] Failed to register data paths of column family (id: 6, name: p-2) Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [p-2]: Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.comparator: leveldb.BytewiseComparator Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.merge_operator: None Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.compaction_filter: None Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.compaction_filter_factory: None Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.sst_partitioner_factory: None Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.memtable_factory: SkipListFactory Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.table_factory: BlockBasedTable Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: table_factory options: flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x564ee0d23180)#012 cache_index_and_filter_blocks: 1#012 cache_index_and_filter_blocks_with_high_priority: 0#012 pin_l0_filter_and_index_blocks_in_cache: 0#012 pin_top_level_index_and_filter: 1#012 index_type: 0#012 data_block_index_type: 0#012 index_shortening: 1#012 data_block_hash_table_util_ratio: 0.750000#012 
checksum: 4#012 no_block_cache: 0#012 block_cache: 0x564edfcfa2d0#012 block_cache_name: BinnedLRUCache#012 block_cache_options:#012 capacity : 483183820#012 num_shard_bits : 4#012 strict_capacity_limit : 0#012 high_pri_pool_ratio: 0.000#012 block_cache_compressed: (nil)#012 persistent_cache: (nil)#012 block_size: 4096#012 block_size_deviation: 10#012 block_restart_interval: 16#012 index_block_restart_interval: 1#012 metadata_block_size: 4096#012 partition_filters: 0#012 use_delta_encoding: 1#012 filter_policy: bloomfilter#012 whole_key_filtering: 1#012 verify_compression: 0#012 read_amp_bytes_per_bit: 0#012 format_version: 5#012 enable_index_compression: 1#012 block_align: 0#012 max_auto_readahead_size: 262144#012 prepopulate_block_cache: 0#012 initial_auto_readahead_size: 8192#012 num_file_reads_for_auto_readahead: 2 Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.write_buffer_size: 16777216 Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.max_write_buffer_number: 64 Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.compression: LZ4 Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.bottommost_compression: Disabled Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.prefix_extractor: nullptr Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.memtable_insert_with_hint_prefix_extractor: nullptr Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.num_levels: 7 Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.min_write_buffer_number_to_merge: 6 Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.max_write_buffer_number_to_maintain: 0 Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.max_write_buffer_size_to_maintain: 0 Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.bottommost_compression_opts.window_bits: -14 Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.bottommost_compression_opts.level: 32767 Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: 
Options.bottommost_compression_opts.strategy: 0 Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.bottommost_compression_opts.max_dict_bytes: 0 Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.bottommost_compression_opts.zstd_max_train_bytes: 0 Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.bottommost_compression_opts.parallel_threads: 1 Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.bottommost_compression_opts.enabled: false Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.bottommost_compression_opts.max_dict_buffer_bytes: 0 Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.bottommost_compression_opts.use_zstd_dict_trainer: true Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.compression_opts.window_bits: -14 Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.compression_opts.level: 32767 Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.compression_opts.strategy: 0 Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.compression_opts.max_dict_bytes: 0 Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.compression_opts.zstd_max_train_bytes: 0 Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.compression_opts.use_zstd_dict_trainer: true Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.compression_opts.parallel_threads: 1 Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.compression_opts.enabled: false Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.compression_opts.max_dict_buffer_bytes: 0 Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.level0_file_num_compaction_trigger: 8 Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.level0_slowdown_writes_trigger: 20 Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.level0_stop_writes_trigger: 36 Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.target_file_size_base: 67108864 Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: 
Options.target_file_size_multiplier: 1 Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.max_bytes_for_level_base: 1073741824 Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0 Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.max_bytes_for_level_multiplier: 8.000000 Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1 Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1 Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1 Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1 Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1 Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1 Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1 Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.max_sequential_skip_in_iterations: 8 Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.max_compaction_bytes: 1677721600 Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.ignore_max_compaction_bytes_for_input: true Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.arena_block_size: 1048576 Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.soft_pending_compaction_bytes_limit: 68719476736 Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.hard_pending_compaction_bytes_limit: 274877906944 Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.disable_auto_compactions: 0 Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.compaction_style: kCompactionStyleLevel Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.compaction_pri: kMinOverlappingRatio Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: 
Options.compaction_options_universal.size_ratio: 1 Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.compaction_options_universal.min_merge_width: 2 Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295 Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200 Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1 Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824 Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0 Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0); Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.inplace_update_support: 0 Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.inplace_update_num_locks: 10000 Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.memtable_prefix_bloom_size_ratio: 0.000000 Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.memtable_whole_key_filtering: 0 Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.memtable_huge_page_size: 0 Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.bloom_locality: 0 Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.max_successive_merges: 0 Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.optimize_filters_for_hits: 0 Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.paranoid_file_checks: 0 Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.force_consistency_checks: 1 Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: 
Options.report_bg_io_stats: 0 Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.ttl: 2592000 Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.periodic_compaction_seconds: 0 Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.preclude_last_level_data_seconds: 0 Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.preserve_internal_time_seconds: 0 Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.enable_blob_files: false Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.min_blob_size: 0 Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.blob_file_size: 268435456 Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.blob_compression_type: NoCompression Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.enable_blob_garbage_collection: false Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.blob_garbage_collection_age_cutoff: 0.250000 Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000 Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.blob_compaction_readahead_size: 0 Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.blob_file_starting_level: 0 Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.experimental_mempurge_threshold: 0.000000 Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: [db/column_family.cc:578] Failed to register data paths of column family (id: 7, name: O-0) Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [O-0]: Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.comparator: leveldb.BytewiseComparator Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.merge_operator: None Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.compaction_filter: None Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.compaction_filter_factory: None Dec 15 02:45:09 localhost ceph-osd[32311]: 
rocksdb: Options.sst_partitioner_factory: None Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.memtable_factory: SkipListFactory Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.table_factory: BlockBasedTable Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: table_factory options: flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x564ee0d232c0)#012 cache_index_and_filter_blocks: 1#012 cache_index_and_filter_blocks_with_high_priority: 0#012 pin_l0_filter_and_index_blocks_in_cache: 0#012 pin_top_level_index_and_filter: 1#012 index_type: 0#012 data_block_index_type: 0#012 index_shortening: 1#012 data_block_hash_table_util_ratio: 0.750000#012 checksum: 4#012 no_block_cache: 0#012 block_cache: 0x564edfcfb610#012 block_cache_name: BinnedLRUCache#012 block_cache_options:#012 capacity : 536870912#012 num_shard_bits : 4#012 strict_capacity_limit : 0#012 high_pri_pool_ratio: 0.000#012 block_cache_compressed: (nil)#012 persistent_cache: (nil)#012 block_size: 4096#012 block_size_deviation: 10#012 block_restart_interval: 16#012 index_block_restart_interval: 1#012 metadata_block_size: 4096#012 partition_filters: 0#012 use_delta_encoding: 1#012 filter_policy: bloomfilter#012 whole_key_filtering: 1#012 verify_compression: 0#012 read_amp_bytes_per_bit: 0#012 format_version: 5#012 enable_index_compression: 1#012 block_align: 0#012 max_auto_readahead_size: 262144#012 prepopulate_block_cache: 0#012 initial_auto_readahead_size: 8192#012 num_file_reads_for_auto_readahead: 2 Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.write_buffer_size: 16777216 Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.max_write_buffer_number: 64 Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.compression: LZ4 Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.bottommost_compression: Disabled Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.prefix_extractor: nullptr Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: 
Options.memtable_insert_with_hint_prefix_extractor: nullptr Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.num_levels: 7 Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.min_write_buffer_number_to_merge: 6 Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.max_write_buffer_number_to_maintain: 0 Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.max_write_buffer_size_to_maintain: 0 Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.bottommost_compression_opts.window_bits: -14 Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.bottommost_compression_opts.level: 32767 Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.bottommost_compression_opts.strategy: 0 Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.bottommost_compression_opts.max_dict_bytes: 0 Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.bottommost_compression_opts.zstd_max_train_bytes: 0 Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.bottommost_compression_opts.parallel_threads: 1 Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.bottommost_compression_opts.enabled: false Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.bottommost_compression_opts.max_dict_buffer_bytes: 0 Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.bottommost_compression_opts.use_zstd_dict_trainer: true Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.compression_opts.window_bits: -14 Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.compression_opts.level: 32767 Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.compression_opts.strategy: 0 Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.compression_opts.max_dict_bytes: 0 Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.compression_opts.zstd_max_train_bytes: 0 Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.compression_opts.use_zstd_dict_trainer: true Dec 15 02:45:09 localhost 
ceph-osd[32311]: rocksdb: Options.compression_opts.parallel_threads: 1 Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.compression_opts.enabled: false Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.compression_opts.max_dict_buffer_bytes: 0 Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.level0_file_num_compaction_trigger: 8 Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.level0_slowdown_writes_trigger: 20 Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.level0_stop_writes_trigger: 36 Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.target_file_size_base: 67108864 Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.target_file_size_multiplier: 1 Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.max_bytes_for_level_base: 1073741824 Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0 Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.max_bytes_for_level_multiplier: 8.000000 Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1 Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1 Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1 Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1 Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1 Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1 Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1 Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.max_sequential_skip_in_iterations: 8 Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.max_compaction_bytes: 1677721600 Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: 
Options.ignore_max_compaction_bytes_for_input: true Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.arena_block_size: 1048576 Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.soft_pending_compaction_bytes_limit: 68719476736 Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.hard_pending_compaction_bytes_limit: 274877906944 Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.disable_auto_compactions: 0 Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.compaction_style: kCompactionStyleLevel Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.compaction_pri: kMinOverlappingRatio Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.compaction_options_universal.size_ratio: 1 Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.compaction_options_universal.min_merge_width: 2 Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295 Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200 Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1 Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824 Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0 Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0); Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.inplace_update_support: 0 Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.inplace_update_num_locks: 10000 Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: 
Options.memtable_prefix_bloom_size_ratio: 0.000000 Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.memtable_whole_key_filtering: 0 Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.memtable_huge_page_size: 0 Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.bloom_locality: 0 Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.max_successive_merges: 0 Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.optimize_filters_for_hits: 0 Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.paranoid_file_checks: 0 Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.force_consistency_checks: 1 Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.report_bg_io_stats: 0 Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.ttl: 2592000 Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.periodic_compaction_seconds: 0 Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.preclude_last_level_data_seconds: 0 Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.preserve_internal_time_seconds: 0 Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.enable_blob_files: false Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.min_blob_size: 0 Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.blob_file_size: 268435456 Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.blob_compression_type: NoCompression Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.enable_blob_garbage_collection: false Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.blob_garbage_collection_age_cutoff: 0.250000 Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000 Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.blob_compaction_readahead_size: 0 Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.blob_file_starting_level: 0 Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: 
Options.experimental_mempurge_threshold: 0.000000 Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: [db/column_family.cc:578] Failed to register data paths of column family (id: 8, name: O-1) Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [O-1]: Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.comparator: leveldb.BytewiseComparator Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.merge_operator: None Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.compaction_filter: None Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.compaction_filter_factory: None Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.sst_partitioner_factory: None Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.memtable_factory: SkipListFactory Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.table_factory: BlockBasedTable Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: table_factory options: flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x564ee0d232c0)#012 cache_index_and_filter_blocks: 1#012 cache_index_and_filter_blocks_with_high_priority: 0#012 pin_l0_filter_and_index_blocks_in_cache: 0#012 pin_top_level_index_and_filter: 1#012 index_type: 0#012 data_block_index_type: 0#012 index_shortening: 1#012 data_block_hash_table_util_ratio: 0.750000#012 checksum: 4#012 no_block_cache: 0#012 block_cache: 0x564edfcfb610#012 block_cache_name: BinnedLRUCache#012 block_cache_options:#012 capacity : 536870912#012 num_shard_bits : 4#012 strict_capacity_limit : 0#012 high_pri_pool_ratio: 0.000#012 block_cache_compressed: (nil)#012 persistent_cache: (nil)#012 block_size: 4096#012 block_size_deviation: 10#012 block_restart_interval: 16#012 index_block_restart_interval: 1#012 metadata_block_size: 4096#012 partition_filters: 0#012 use_delta_encoding: 1#012 filter_policy: bloomfilter#012 whole_key_filtering: 1#012 verify_compression: 0#012 
read_amp_bytes_per_bit: 0#012 format_version: 5#012 enable_index_compression: 1#012 block_align: 0#012 max_auto_readahead_size: 262144#012 prepopulate_block_cache: 0#012 initial_auto_readahead_size: 8192#012 num_file_reads_for_auto_readahead: 2 Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.write_buffer_size: 16777216 Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.max_write_buffer_number: 64 Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.compression: LZ4 Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.bottommost_compression: Disabled Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.prefix_extractor: nullptr Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.memtable_insert_with_hint_prefix_extractor: nullptr Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.num_levels: 7 Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.min_write_buffer_number_to_merge: 6 Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.max_write_buffer_number_to_maintain: 0 Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.max_write_buffer_size_to_maintain: 0 Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.bottommost_compression_opts.window_bits: -14 Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.bottommost_compression_opts.level: 32767 Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.bottommost_compression_opts.strategy: 0 Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.bottommost_compression_opts.max_dict_bytes: 0 Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.bottommost_compression_opts.zstd_max_train_bytes: 0 Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.bottommost_compression_opts.parallel_threads: 1 Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.bottommost_compression_opts.enabled: false Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: 
Options.bottommost_compression_opts.max_dict_buffer_bytes: 0 Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.bottommost_compression_opts.use_zstd_dict_trainer: true Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.compression_opts.window_bits: -14 Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.compression_opts.level: 32767 Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.compression_opts.strategy: 0 Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.compression_opts.max_dict_bytes: 0 Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.compression_opts.zstd_max_train_bytes: 0 Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.compression_opts.use_zstd_dict_trainer: true Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.compression_opts.parallel_threads: 1 Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.compression_opts.enabled: false Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.compression_opts.max_dict_buffer_bytes: 0 Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.level0_file_num_compaction_trigger: 8 Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.level0_slowdown_writes_trigger: 20 Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.level0_stop_writes_trigger: 36 Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.target_file_size_base: 67108864 Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.target_file_size_multiplier: 1 Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.max_bytes_for_level_base: 1073741824 Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0 Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.max_bytes_for_level_multiplier: 8.000000 Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1 Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: 
Options.max_bytes_for_level_multiplier_addtl[1]: 1 Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1 Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1 Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1 Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1 Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1 Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.max_sequential_skip_in_iterations: 8 Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.max_compaction_bytes: 1677721600 Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.ignore_max_compaction_bytes_for_input: true Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.arena_block_size: 1048576 Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.soft_pending_compaction_bytes_limit: 68719476736 Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.hard_pending_compaction_bytes_limit: 274877906944 Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.disable_auto_compactions: 0 Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.compaction_style: kCompactionStyleLevel Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.compaction_pri: kMinOverlappingRatio Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.compaction_options_universal.size_ratio: 1 Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.compaction_options_universal.min_merge_width: 2 Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295 Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200 Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: 
Options.compaction_options_universal.compression_size_percent: -1 Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824 Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0 Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0); Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.inplace_update_support: 0 Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.inplace_update_num_locks: 10000 Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.memtable_prefix_bloom_size_ratio: 0.000000 Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.memtable_whole_key_filtering: 0 Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.memtable_huge_page_size: 0 Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.bloom_locality: 0 Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.max_successive_merges: 0 Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.optimize_filters_for_hits: 0 Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.paranoid_file_checks: 0 Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.force_consistency_checks: 1 Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.report_bg_io_stats: 0 Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.ttl: 2592000 Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.periodic_compaction_seconds: 0 Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.preclude_last_level_data_seconds: 0 Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.preserve_internal_time_seconds: 0 Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: 
Options.enable_blob_files: false Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.min_blob_size: 0 Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.blob_file_size: 268435456 Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.blob_compression_type: NoCompression Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.enable_blob_garbage_collection: false Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.blob_garbage_collection_age_cutoff: 0.250000 Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000 Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.blob_compaction_readahead_size: 0 Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.blob_file_starting_level: 0 Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.experimental_mempurge_threshold: 0.000000 Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: [db/column_family.cc:578] Failed to register data paths of column family (id: 9, name: O-2) Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [O-2]: Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.comparator: leveldb.BytewiseComparator Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.merge_operator: None Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.compaction_filter: None Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.compaction_filter_factory: None Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.sst_partitioner_factory: None Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.memtable_factory: SkipListFactory Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.table_factory: BlockBasedTable Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: table_factory options: flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x564ee0d232c0)#012 cache_index_and_filter_blocks: 1#012 
cache_index_and_filter_blocks_with_high_priority: 0#012 pin_l0_filter_and_index_blocks_in_cache: 0#012 pin_top_level_index_and_filter: 1#012 index_type: 0#012 data_block_index_type: 0#012 index_shortening: 1#012 data_block_hash_table_util_ratio: 0.750000#012 checksum: 4#012 no_block_cache: 0#012 block_cache: 0x564edfcfb610#012 block_cache_name: BinnedLRUCache#012 block_cache_options:#012 capacity : 536870912#012 num_shard_bits : 4#012 strict_capacity_limit : 0#012 high_pri_pool_ratio: 0.000#012 block_cache_compressed: (nil)#012 persistent_cache: (nil)#012 block_size: 4096#012 block_size_deviation: 10#012 block_restart_interval: 16#012 index_block_restart_interval: 1#012 metadata_block_size: 4096#012 partition_filters: 0#012 use_delta_encoding: 1#012 filter_policy: bloomfilter#012 whole_key_filtering: 1#012 verify_compression: 0#012 read_amp_bytes_per_bit: 0#012 format_version: 5#012 enable_index_compression: 1#012 block_align: 0#012 max_auto_readahead_size: 262144#012 prepopulate_block_cache: 0#012 initial_auto_readahead_size: 8192#012 num_file_reads_for_auto_readahead: 2 Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.write_buffer_size: 16777216 Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.max_write_buffer_number: 64 Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.compression: LZ4 Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.bottommost_compression: Disabled Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.prefix_extractor: nullptr Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.memtable_insert_with_hint_prefix_extractor: nullptr Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.num_levels: 7 Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.min_write_buffer_number_to_merge: 6 Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.max_write_buffer_number_to_maintain: 0 Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.max_write_buffer_size_to_maintain: 0 
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.bottommost_compression_opts.window_bits: -14 Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.bottommost_compression_opts.level: 32767 Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.bottommost_compression_opts.strategy: 0 Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.bottommost_compression_opts.max_dict_bytes: 0 Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.bottommost_compression_opts.zstd_max_train_bytes: 0 Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.bottommost_compression_opts.parallel_threads: 1 Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.bottommost_compression_opts.enabled: false Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.bottommost_compression_opts.max_dict_buffer_bytes: 0 Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.bottommost_compression_opts.use_zstd_dict_trainer: true Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.compression_opts.window_bits: -14 Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.compression_opts.level: 32767 Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.compression_opts.strategy: 0 Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.compression_opts.max_dict_bytes: 0 Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.compression_opts.zstd_max_train_bytes: 0 Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.compression_opts.use_zstd_dict_trainer: true Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.compression_opts.parallel_threads: 1 Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.compression_opts.enabled: false Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.compression_opts.max_dict_buffer_bytes: 0 Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.level0_file_num_compaction_trigger: 8 Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: 
Options.level0_slowdown_writes_trigger: 20 Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.level0_stop_writes_trigger: 36 Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.target_file_size_base: 67108864 Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.target_file_size_multiplier: 1 Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.max_bytes_for_level_base: 1073741824 Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0 Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.max_bytes_for_level_multiplier: 8.000000 Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1 Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1 Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1 Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1 Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1 Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1 Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1 Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.max_sequential_skip_in_iterations: 8 Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.max_compaction_bytes: 1677721600 Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.ignore_max_compaction_bytes_for_input: true Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.arena_block_size: 1048576 Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.soft_pending_compaction_bytes_limit: 68719476736 Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.hard_pending_compaction_bytes_limit: 274877906944 Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: 
Options.disable_auto_compactions: 0 Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.compaction_style: kCompactionStyleLevel Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.compaction_pri: kMinOverlappingRatio Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.compaction_options_universal.size_ratio: 1 Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.compaction_options_universal.min_merge_width: 2 Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295 Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200 Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1 Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824 Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0 Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0); Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.inplace_update_support: 0 Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.inplace_update_num_locks: 10000 Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.memtable_prefix_bloom_size_ratio: 0.000000 Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.memtable_whole_key_filtering: 0 Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.memtable_huge_page_size: 0 Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.bloom_locality: 0 Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.max_successive_merges: 0 Dec 15 02:45:09 localhost 
ceph-osd[32311]: rocksdb: Options.optimize_filters_for_hits: 0 Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.paranoid_file_checks: 0 Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.force_consistency_checks: 1 Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.report_bg_io_stats: 0 Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.ttl: 2592000 Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.periodic_compaction_seconds: 0 Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.preclude_last_level_data_seconds: 0 Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.preserve_internal_time_seconds: 0 Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.enable_blob_files: false Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.min_blob_size: 0 Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.blob_file_size: 268435456 Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.blob_compression_type: NoCompression Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.enable_blob_garbage_collection: false Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.blob_garbage_collection_age_cutoff: 0.250000 Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000 Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.blob_compaction_readahead_size: 0 Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.blob_file_starting_level: 0 Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: Options.experimental_mempurge_threshold: 0.000000 Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: [db/column_family.cc:578] Failed to register data paths of column family (id: 10, name: L) Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: [db/column_family.cc:635] #011(skipping printing options) Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: [db/column_family.cc:578] Failed to register data paths of column family (id: 11, 
name: P) Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: [db/column_family.cc:635] #011(skipping printing options) Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: [db/version_set.cc:5566] Recovered from manifest file:db/MANIFEST-000032 succeeded,manifest_file_number is 32, next_file_number is 34, last_sequence is 12, log_number is 5,prev_log_number is 0,max_column_family is 11,min_log_number_to_keep is 5 Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: [db/version_set.cc:5581] Column family [default] (ID 0), log number is 5 Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: [db/version_set.cc:5581] Column family [m-0] (ID 1), log number is 5 Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: [db/version_set.cc:5581] Column family [m-1] (ID 2), log number is 5 Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: [db/version_set.cc:5581] Column family [m-2] (ID 3), log number is 5 Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: [db/version_set.cc:5581] Column family [p-0] (ID 4), log number is 5 Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: [db/version_set.cc:5581] Column family [p-1] (ID 5), log number is 5 Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: [db/version_set.cc:5581] Column family [p-2] (ID 6), log number is 5 Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: [db/version_set.cc:5581] Column family [O-0] (ID 7), log number is 5 Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: [db/version_set.cc:5581] Column family [O-1] (ID 8), log number is 5 Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: [db/version_set.cc:5581] Column family [O-2] (ID 9), log number is 5 Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: [db/version_set.cc:5581] Column family [L] (ID 10), log number is 5 Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: [db/version_set.cc:5581] Column family [P] (ID 11), log number is 5 Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: [db/db_impl/db_impl_open.cc:539] DB ID: ace239a5-421c-4fff-b74c-5c7a1464e415 
Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: EVENT_LOG_v1 {"time_micros": 1765784709958482, "job": 1, "event": "recovery_started", "wal_files": [31]} Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: [db/db_impl/db_impl_open.cc:1043] Recovering log #31 mode 2 Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: EVENT_LOG_v1 {"time_micros": 1765784709963357, "cf_name": "default", "job": 1, "event": "table_file_creation", "file_number": 35, "file_size": 1261, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 13, "largest_seqno": 21, "table_properties": {"data_size": 128, "index_size": 27, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 69, "raw_key_size": 87, "raw_average_key_size": 17, "raw_value_size": 82, "raw_average_value_size": 16, "num_data_blocks": 1, "num_entries": 5, "num_filter_entries": 5, "num_deletions": 0, "num_merge_operands": 2, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": ".T:int64_array.b:bitwise_xor", "prefix_extractor_name": "nullptr", "property_collectors": "[]", "compression": "LZ4", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1765784709, "oldest_key_time": 0, "file_creation_time": 0, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "ace239a5-421c-4fff-b74c-5c7a1464e415", "db_session_id": "OTGNSRAE5SWL4LSU2B72", "orig_file_number": 35, "seqno_to_time_mapping": "N/A"}} Dec 15 02:45:09 localhost ceph-osd[32311]: rocksdb: EVENT_LOG_v1 {"time_micros": 1765784709970092, "cf_name": "p-0", "job": 1, "event": "table_file_creation", "file_number": 36, "file_size": 1607, 
"file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 14, "largest_seqno": 15, "table_properties": {"data_size": 466, "index_size": 39, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 69, "raw_key_size": 72, "raw_average_key_size": 36, "raw_value_size": 567, "raw_average_value_size": 283, "num_data_blocks": 1, "num_entries": 2, "num_filter_entries": 2, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "p-0", "column_family_id": 4, "comparator": "leveldb.BytewiseComparator", "merge_operator": "nullptr", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "LZ4", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1765784709, "oldest_key_time": 0, "file_creation_time": 0, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "ace239a5-421c-4fff-b74c-5c7a1464e415", "db_session_id": "OTGNSRAE5SWL4LSU2B72", "orig_file_number": 36, "seqno_to_time_mapping": "N/A"}} Dec 15 02:45:10 localhost ceph-osd[32311]: rocksdb: EVENT_LOG_v1 {"time_micros": 1765784710002524, "cf_name": "O-2", "job": 1, "event": "table_file_creation", "file_number": 37, "file_size": 1290, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 16, "largest_seqno": 16, "table_properties": {"data_size": 121, "index_size": 64, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 69, "raw_key_size": 55, "raw_average_key_size": 55, "raw_value_size": 50, "raw_average_value_size": 50, "num_data_blocks": 1, "num_entries": 1, "num_filter_entries": 1, "num_deletions": 0, 
"num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "O-2", "column_family_id": 9, "comparator": "leveldb.BytewiseComparator", "merge_operator": "nullptr", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "LZ4", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1765784709, "oldest_key_time": 0, "file_creation_time": 0, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "ace239a5-421c-4fff-b74c-5c7a1464e415", "db_session_id": "OTGNSRAE5SWL4LSU2B72", "orig_file_number": 37, "seqno_to_time_mapping": "N/A"}} Dec 15 02:45:10 localhost ceph-osd[32311]: rocksdb: [db/db_impl/db_impl_open.cc:1432] Failed to truncate log #31: IO error: No such file or directory: While open a file for appending: db.wal/000031.log: No such file or directory Dec 15 02:45:10 localhost ceph-osd[32311]: rocksdb: EVENT_LOG_v1 {"time_micros": 1765784710009640, "job": 1, "event": "recovery_finished"} Dec 15 02:45:10 localhost ceph-osd[32311]: rocksdb: [db/version_set.cc:5047] Creating manifest 40 Dec 15 02:45:10 localhost podman[32824]: Dec 15 02:45:10 localhost podman[32824]: 2025-12-15 07:45:10.122488043 +0000 UTC m=+0.097097748 container create 3995bb57fcc062ef2de653ea4c3a5391b6faf6fca44fc2c3916141b89e782d61 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=hopeful_feistel, url=https://catalog.redhat.com/en/search?searchType=containers, description=Red Hat Ceph Storage 7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, CEPH_POINT_RELEASE=, GIT_CLEAN=True, GIT_REPO=https://github.com/ceph/ceph-container.git, RELEASE=main, 
GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, distribution-scope=public, release=1763362218, io.openshift.expose-services=, maintainer=Guillaume Abrioux , vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., architecture=x86_64, io.buildah.version=1.41.4, name=rhceph, vendor=Red Hat, Inc., GIT_BRANCH=main, io.k8s.description=Red Hat Ceph Storage 7, build-date=2025-11-26T19:44:28Z, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, version=7, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, ceph=True, vcs-type=git, com.redhat.component=rhceph-container, io.openshift.tags=rhceph ceph) Dec 15 02:45:10 localhost ceph-osd[32311]: rocksdb: [db/db_impl/db_impl_open.cc:1987] SstFileManager instance 0x564ee0d58380 Dec 15 02:45:10 localhost ceph-osd[32311]: rocksdb: DB pointer 0x564ee0c1ba00 Dec 15 02:45:10 localhost ceph-osd[32311]: bluestore(/var/lib/ceph/osd/ceph-3) _open_db opened rocksdb path db options compression=kLZ4Compression,max_write_buffer_number=64,min_write_buffer_number_to_merge=6,compaction_style=kCompactionStyleLevel,write_buffer_size=16777216,max_background_jobs=4,level0_file_num_compaction_trigger=8,max_bytes_for_level_base=1073741824,max_bytes_for_level_multiplier=8,compaction_readahead_size=2MB,max_total_wal_size=1073741824,writable_file_max_buffer_size=0 Dec 15 02:45:10 localhost ceph-osd[32311]: bluestore(/var/lib/ceph/osd/ceph-3) _upgrade_super from 4, latest 4 Dec 15 02:45:10 localhost ceph-osd[32311]: bluestore(/var/lib/ceph/osd/ceph-3) _upgrade_super done Dec 15 02:45:10 localhost ceph-osd[32311]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS ------- Dec 15 02:45:10 localhost ceph-osd[32311]: rocksdb: [db/db_impl/db_impl.cc:1111] #012** DB Stats **#012Uptime(secs): 0.2 total, 0.2 interval#012Cumulative writes: 0 writes, 0 keys, 0 commit groups, 0.0 writes per commit group, ingest: 0.00 GB, 0.00 
MB/s#012Cumulative WAL: 0 writes, 0 syncs, 0.00 writes per sync, written: 0.00 GB, 0.00 MB/s#012Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent#012Interval writes: 0 writes, 0 keys, 0 commit groups, 0.0 writes per commit group, ingest: 0.00 MB, 0.00 MB/s#012Interval WAL: 0 writes, 0 syncs, 0.00 writes per sync, written: 0.00 GB, 0.00 MB/s#012Interval stall: 00:00:0.000 H:M:S, 0.0 percent#012#012** Compaction Stats [default] **#012Level Files Size Score Read(GB) Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012 L0 2/0 2.61 KB 0.2 0.0 0.0 0.0 0.0 0.0 0.0 1.0 0.0 0.3 0.00 0.00 1 0.005 0 0 0.0 0.0#012 Sum 2/0 2.61 KB 0.0 0.0 0.0 0.0 0.0 0.0 0.0 1.0 0.0 0.3 0.00 0.00 1 0.005 0 0 0.0 0.0#012 Int 0/0 0.00 KB 0.0 0.0 0.0 0.0 0.0 0.0 0.0 1.0 0.0 0.3 0.00 0.00 1 0.005 0 0 0.0 0.0#012#012** Compaction Stats [default] **#012Priority Files Size Score Read(GB) Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012User 0/0 0.00 KB 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.3 0.00 0.00 1 0.005 0 0 0.0 0.0#012#012Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0#012#012Uptime(secs): 0.2 total, 0.2 interval#012Flush(GB): cumulative 0.000, interval 0.000#012AddFile(GB): cumulative 0.000, interval 0.000#012AddFile(Total Files): cumulative 0, interval 0#012AddFile(L0 Files): cumulative 0, interval 0#012AddFile(Keys): cumulative 0, interval 0#012Cumulative compaction: 0.00 GB 
write, 0.01 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds#012Interval compaction: 0.00 GB write, 0.01 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds#012Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count#012Block cache BinnedLRUCache@0x564edfcfa2d0#2 capacity: 460.80 MB usage: 0.67 KB table_size: 0 occupancy: 18446744073709551615 collections: 1 last_copies: 8 last_secs: 4.2e-05 secs_since: 0#012Block cache entry stats(count,size,portion): FilterBlock(3,0.33 KB,6.95388e-05%) IndexBlock(3,0.34 KB,7.28501e-05%) Misc(1,0.00 KB,0%)#012#012** File Read Latency Histogram By Level [default] **#012#012** Compaction Stats [m-0] **#012Level Files Size Score Read(GB) Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012 Sum 0/0 0.00 KB 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.00 0.00 0 0.000 0 0 0.0 0.0#012 Int 0/0 0.00 KB 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.00 0.00 0 0.000 0 0 0.0 0.0#012#012** Compaction Stats [m-0] **#012Priority Files Size Score Read(GB) Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012#012Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0#012#012Uptime(secs): 0.2 total, 0.2 interval#012Flush(GB): cumulative 
0.000, interval 0.000#012AddFile(GB): cumulative 0.000, interval 0.000#012AddFile(Total Files): cumulative 0, interval 0#012AddFile(L0 Files): cumulative 0, interval 0#012AddFile(Keys): cumulative 0, interval 0#012Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds#012Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds#012Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count#012Block cache BinnedLRUCache@0x564edfcfa2d0#2 capacity: 460.80 MB usage: 0.67 KB table_size: 0 occupancy: 18446744073709551615 collections: 1 last_copies: 8 last_secs: 4.2e-05 secs_since: 0#012Block cache entry stats(count,size,portion): FilterBlock(3,0.33 KB,6.95388e-05%) IndexBlock(3,0.34 KB,7.28501e-05%) Misc(1,0.00 KB,0%)#012#012** File Read Latency Histogram By Level [m-0] **#012#012** Compaction Stats [m-1] **#012Level Files Size Score Read(GB) Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012 Sum 0/0 0.00 KB 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.00 0.00 0 0.000 0 0 0.0 0.0#012 Int 0/0 0.00 KB 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.00 0.00 0 0.000 0 0 0.0 0.0#012#012** Compaction Stats [m-1] **#012Priority Files Size Score Read(GB) Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) 
Wblob(GB)#012---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012#012Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0#012#012Uptime(secs): 0.2 total, 0.2 interval#012Flush(GB): cumulative 0.000, interval 0.000#012AddFile(GB): cumulative 0.000, interval 0.000#012AddFile(Total Files): cumulative 0, interval 0#012AddFile(L0 Files): cumulative 0, interval 0#012AddFile(Keys): cumulative 0, interval 0#012Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds#012Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds#012Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count#012Block cache BinnedLRUCache@0x564edfcfa2d0#2 capacity: 460.80 MB usag Dec 15 02:45:10 localhost ceph-osd[32311]: /builddir/build/BUILD/ceph-18.2.1/src/cls/cephfs/cls_cephfs.cc:201: loading cephfs Dec 15 02:45:10 localhost ceph-osd[32311]: /builddir/build/BUILD/ceph-18.2.1/src/cls/hello/cls_hello.cc:316: loading cls_hello Dec 15 02:45:10 localhost ceph-osd[32311]: _get_class not permitted to load lua Dec 15 02:45:10 localhost ceph-osd[32311]: _get_class not permitted to load sdk Dec 15 02:45:10 localhost ceph-osd[32311]: _get_class not permitted to load test_remote_reads Dec 15 02:45:10 localhost ceph-osd[32311]: osd.3 0 crush map has features 288232575208783872, adjusting msgr requires for clients Dec 15 02:45:10 localhost ceph-osd[32311]: osd.3 0 crush map has features 288232575208783872 was 8705, adjusting msgr requires for mons Dec 15 02:45:10 localhost ceph-osd[32311]: osd.3 0 crush map has features 288232575208783872, adjusting msgr 
requires for osds
Dec 15 02:45:10 localhost ceph-osd[32311]: osd.3 0 check_osdmap_features enabling on-disk ERASURE CODES compat feature
Dec 15 02:45:10 localhost ceph-osd[32311]: osd.3 0 load_pgs
Dec 15 02:45:10 localhost ceph-osd[32311]: osd.3 0 load_pgs opened 0 pgs
Dec 15 02:45:10 localhost ceph-osd[32311]: osd.3 0 log_to_monitors true
Dec 15 02:45:10 localhost ceph-bce17446-41b5-5408-a23e-0b011906b44a-osd-3[32307]: 2025-12-15T07:45:10.136+0000 7fd1536a7a80 -1 osd.3 0 log_to_monitors true
Dec 15 02:45:10 localhost podman[32824]: 2025-12-15 07:45:10.06368334 +0000 UTC m=+0.038293075 image pull registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Dec 15 02:45:10 localhost systemd[1]: Started libpod-conmon-3995bb57fcc062ef2de653ea4c3a5391b6faf6fca44fc2c3916141b89e782d61.scope.
Dec 15 02:45:10 localhost systemd[1]: Started libcrun container.
Dec 15 02:45:10 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/1e288801f3638f95a1f9adf3c12c51ab0cefe9432dcf5162a4bc0c60df213fc7/merged/rootfs supports timestamps until 2038 (0x7fffffff)
Dec 15 02:45:10 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/1e288801f3638f95a1f9adf3c12c51ab0cefe9432dcf5162a4bc0c60df213fc7/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Dec 15 02:45:10 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/1e288801f3638f95a1f9adf3c12c51ab0cefe9432dcf5162a4bc0c60df213fc7/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff)
Dec 15 02:45:10 localhost podman[32824]: 2025-12-15 07:45:10.259279621 +0000 UTC m=+0.233889356 container init 3995bb57fcc062ef2de653ea4c3a5391b6faf6fca44fc2c3916141b89e782d61 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=hopeful_feistel, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, ceph=True, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, name=rhceph, CEPH_POINT_RELEASE=,
vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, vcs-type=git, architecture=x86_64, description=Red Hat Ceph Storage 7, release=1763362218, GIT_REPO=https://github.com/ceph/ceph-container.git, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, version=7, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.component=rhceph-container, io.openshift.tags=rhceph ceph, maintainer=Guillaume Abrioux , io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat Ceph Storage 7, io.buildah.version=1.41.4, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.openshift.expose-services=, distribution-scope=public, build-date=2025-11-26T19:44:28Z, RELEASE=main, GIT_CLEAN=True, GIT_BRANCH=main) Dec 15 02:45:10 localhost podman[32824]: 2025-12-15 07:45:10.26590226 +0000 UTC m=+0.240511995 container start 3995bb57fcc062ef2de653ea4c3a5391b6faf6fca44fc2c3916141b89e782d61 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=hopeful_feistel, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, version=7, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, architecture=x86_64, maintainer=Guillaume Abrioux , name=rhceph, vendor=Red Hat, Inc., vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, GIT_REPO=https://github.com/ceph/ceph-container.git, CEPH_POINT_RELEASE=, io.buildah.version=1.41.4, build-date=2025-11-26T19:44:28Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_BRANCH=main, vcs-type=git, url=https://catalog.redhat.com/en/search?searchType=containers, description=Red Hat Ceph Storage 7, io.k8s.description=Red Hat Ceph 
Storage 7, com.redhat.component=rhceph-container, io.openshift.tags=rhceph ceph, distribution-scope=public, ceph=True, RELEASE=main, release=1763362218, GIT_CLEAN=True, io.openshift.expose-services=, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9) Dec 15 02:45:10 localhost podman[32824]: 2025-12-15 07:45:10.266133316 +0000 UTC m=+0.240743081 container attach 3995bb57fcc062ef2de653ea4c3a5391b6faf6fca44fc2c3916141b89e782d61 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=hopeful_feistel, maintainer=Guillaume Abrioux , io.openshift.tags=rhceph ceph, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, description=Red Hat Ceph Storage 7, ceph=True, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, architecture=x86_64, distribution-scope=public, vendor=Red Hat, Inc., vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, name=rhceph, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_CLEAN=True, com.redhat.component=rhceph-container, release=1763362218, version=7, GIT_REPO=https://github.com/ceph/ceph-container.git, CEPH_POINT_RELEASE=, RELEASE=main, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.k8s.description=Red Hat Ceph Storage 7, io.buildah.version=1.41.4, build-date=2025-11-26T19:44:28Z, io.openshift.expose-services=, GIT_BRANCH=main) Dec 15 02:45:10 localhost systemd[1]: var-lib-containers-storage-overlay-4bd6a4398e3a50749140f40deb1964514ce58392915f6dbbf45b419b69dd4422-merged.mount: Deactivated successfully. 
Dec 15 02:45:10 localhost hopeful_feistel[32872]: {
Dec 15 02:45:10 localhost hopeful_feistel[32872]: "6d4c9d22-e303-4075-8ccd-2bb4bc620212": {
Dec 15 02:45:10 localhost hopeful_feistel[32872]: "ceph_fsid": "bce17446-41b5-5408-a23e-0b011906b44a",
Dec 15 02:45:10 localhost hopeful_feistel[32872]: "device": "/dev/mapper/ceph_vg0-ceph_lv0",
Dec 15 02:45:10 localhost hopeful_feistel[32872]: "osd_id": 0,
Dec 15 02:45:10 localhost hopeful_feistel[32872]: "osd_uuid": "6d4c9d22-e303-4075-8ccd-2bb4bc620212",
Dec 15 02:45:10 localhost hopeful_feistel[32872]: "type": "bluestore"
Dec 15 02:45:10 localhost hopeful_feistel[32872]: },
Dec 15 02:45:10 localhost hopeful_feistel[32872]: "9639648d-7992-48ee-ae84-b668fb65e316": {
Dec 15 02:45:10 localhost hopeful_feistel[32872]: "ceph_fsid": "bce17446-41b5-5408-a23e-0b011906b44a",
Dec 15 02:45:10 localhost hopeful_feistel[32872]: "device": "/dev/mapper/ceph_vg1-ceph_lv1",
Dec 15 02:45:10 localhost hopeful_feistel[32872]: "osd_id": 3,
Dec 15 02:45:10 localhost hopeful_feistel[32872]: "osd_uuid": "9639648d-7992-48ee-ae84-b668fb65e316",
Dec 15 02:45:10 localhost hopeful_feistel[32872]: "type": "bluestore"
Dec 15 02:45:10 localhost hopeful_feistel[32872]: }
Dec 15 02:45:10 localhost hopeful_feistel[32872]: }
Dec 15 02:45:10 localhost systemd[1]: libpod-3995bb57fcc062ef2de653ea4c3a5391b6faf6fca44fc2c3916141b89e782d61.scope: Deactivated successfully.
Dec 15 02:45:10 localhost podman[32824]: 2025-12-15 07:45:10.828097049 +0000 UTC m=+0.802706784 container died 3995bb57fcc062ef2de653ea4c3a5391b6faf6fca44fc2c3916141b89e782d61 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=hopeful_feistel, url=https://catalog.redhat.com/en/search?searchType=containers, GIT_REPO=https://github.com/ceph/ceph-container.git, vcs-type=git, name=rhceph, version=7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat Ceph Storage 7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.openshift.tags=rhceph ceph, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, ceph=True, CEPH_POINT_RELEASE=, architecture=x86_64, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_CLEAN=True, io.openshift.expose-services=, GIT_BRANCH=main, release=1763362218, description=Red Hat Ceph Storage 7, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, com.redhat.component=rhceph-container, build-date=2025-11-26T19:44:28Z, vendor=Red Hat, Inc., io.buildah.version=1.41.4, maintainer=Guillaume Abrioux , RELEASE=main, distribution-scope=public) Dec 15 02:45:10 localhost systemd[1]: tmp-crun.30iH4k.mount: Deactivated successfully. Dec 15 02:45:10 localhost systemd[1]: var-lib-containers-storage-overlay-1e288801f3638f95a1f9adf3c12c51ab0cefe9432dcf5162a4bc0c60df213fc7-merged.mount: Deactivated successfully. 
Dec 15 02:45:10 localhost podman[32908]: 2025-12-15 07:45:10.934105295 +0000 UTC m=+0.098454792 container remove 3995bb57fcc062ef2de653ea4c3a5391b6faf6fca44fc2c3916141b89e782d61 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=hopeful_feistel, release=1763362218, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vendor=Red Hat, Inc., maintainer=Guillaume Abrioux , CEPH_POINT_RELEASE=, architecture=x86_64, ceph=True, io.openshift.tags=rhceph ceph, GIT_REPO=https://github.com/ceph/ceph-container.git, io.k8s.description=Red Hat Ceph Storage 7, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, distribution-scope=public, io.buildah.version=1.41.4, RELEASE=main, build-date=2025-11-26T19:44:28Z, GIT_CLEAN=True, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.openshift.expose-services=, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, description=Red Hat Ceph Storage 7, com.redhat.component=rhceph-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, name=rhceph, version=7, GIT_BRANCH=main, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., url=https://catalog.redhat.com/en/search?searchType=containers) Dec 15 02:45:10 localhost systemd[1]: libpod-conmon-3995bb57fcc062ef2de653ea4c3a5391b6faf6fca44fc2c3916141b89e782d61.scope: Deactivated successfully. 
Dec 15 02:45:11 localhost ceph-osd[32311]: log_channel(cluster) log [DBG] : purged_snaps scrub starts
Dec 15 02:45:11 localhost ceph-osd[32311]: log_channel(cluster) log [DBG] : purged_snaps scrub ok
Dec 15 02:45:11 localhost ceph-osd[31375]: osd.0 0 maybe_override_max_osd_capacity_for_qos osd bench result - bandwidth (MiB/sec): 21.267 iops: 5444.341 elapsed_sec: 0.551
Dec 15 02:45:11 localhost ceph-osd[31375]: log_channel(cluster) log [WRN] : OSD bench result of 5444.340879 IOPS is not within the threshold limit range of 50.000000 IOPS and 500.000000 IOPS for osd.0. IOPS capacity is unchanged at 315.000000 IOPS. The recommendation is to establish the osd's IOPS capacity using other benchmark tools (e.g. Fio) and then override osd_mclock_max_capacity_iops_[hdd|ssd].
Dec 15 02:45:11 localhost ceph-osd[31375]: osd.0 0 waiting for initial osdmap
Dec 15 02:45:11 localhost ceph-bce17446-41b5-5408-a23e-0b011906b44a-osd-0[31370]: 2025-12-15T07:45:11.721+0000 7fec5b2e5640 -1 osd.0 0 waiting for initial osdmap
Dec 15 02:45:11 localhost ceph-osd[32311]: osd.3 0 done with init, starting boot process
Dec 15 02:45:11 localhost ceph-osd[32311]: osd.3 0 start_boot
Dec 15 02:45:11 localhost ceph-osd[32311]: osd.3 0 maybe_override_options_for_qos osd_max_backfills set to 1
Dec 15 02:45:11 localhost ceph-osd[32311]: osd.3 0 maybe_override_options_for_qos osd_recovery_max_active set to 0
Dec 15 02:45:11 localhost ceph-osd[32311]: osd.3 0 maybe_override_options_for_qos osd_recovery_max_active_hdd set to 3
Dec 15 02:45:11 localhost ceph-osd[32311]: osd.3 0 maybe_override_options_for_qos osd_recovery_max_active_ssd set to 10
Dec 15 02:45:11 localhost ceph-osd[32311]: osd.3 0 bench count 12288000 bsize 4 KiB
Dec 15 02:45:11 localhost ceph-osd[31375]: osd.0 12 crush map has features 288514050185494528, adjusting msgr requires for clients
Dec 15 02:45:11 localhost ceph-osd[31375]: osd.0 12 crush map has features 288514050185494528 was 288232575208792577, adjusting msgr requires for mons
Dec 15 02:45:11 localhost ceph-osd[31375]: osd.0 12 crush map has features 3314932999778484224, adjusting msgr requires for osds
Dec 15 02:45:11 localhost ceph-osd[31375]: osd.0 12 check_osdmap_features require_osd_release unknown -> reef
Dec 15 02:45:11 localhost ceph-osd[31375]: osd.0 12 set_numa_affinity unable to identify public interface '' numa node: (2) No such file or directory
Dec 15 02:45:11 localhost ceph-osd[31375]: osd.0 12 set_numa_affinity not setting numa affinity
Dec 15 02:45:11 localhost ceph-osd[31375]: osd.0 12 _collect_metadata loop3: no unique device id for loop3: fallback method has no model nor serial
Dec 15 02:45:11 localhost ceph-bce17446-41b5-5408-a23e-0b011906b44a-osd-0[31370]: 2025-12-15T07:45:11.774+0000 7fec5690f640 -1 osd.0 12 set_numa_affinity unable to identify public interface '' numa node: (2) No such file or directory
Dec 15 02:45:12 localhost ceph-osd[31375]: osd.0 12 tick checking mon for new map
Dec 15 02:45:12 localhost podman[33036]: 2025-12-15 07:45:12.73814391 +0000 UTC m=+0.087822249 container exec 8dcda56b365b42dc8758aab77a9ec80db304780e449052738f7e4e648ae1ecaf (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-bce17446-41b5-5408-a23e-0b011906b44a-crash-np0005559462, maintainer=Guillaume Abrioux , GIT_REPO=https://github.com/ceph/ceph-container.git, build-date=2025-11-26T19:44:28Z, io.openshift.expose-services=, release=1763362218, io.k8s.description=Red Hat Ceph Storage 7, RELEASE=main, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, ceph=True, distribution-scope=public, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:enterprise_linux:9::appstream, GIT_BRANCH=main, url=https://catalog.redhat.com/en/search?searchType=containers, io.buildah.version=1.41.4, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., CEPH_POINT_RELEASE=, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, architecture=x86_64, description=Red Hat Ceph Storage 7, name=rhceph,
vcs-type=git, version=7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.openshift.tags=rhceph ceph, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, com.redhat.component=rhceph-container, GIT_CLEAN=True) Dec 15 02:45:12 localhost ceph-osd[31375]: osd.0 13 state: booting -> active Dec 15 02:45:12 localhost podman[33036]: 2025-12-15 07:45:12.844483885 +0000 UTC m=+0.194162234 container exec_died 8dcda56b365b42dc8758aab77a9ec80db304780e449052738f7e4e648ae1ecaf (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-bce17446-41b5-5408-a23e-0b011906b44a-crash-np0005559462, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, ceph=True, io.openshift.expose-services=, url=https://catalog.redhat.com/en/search?searchType=containers, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, version=7, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.openshift.tags=rhceph ceph, io.k8s.description=Red Hat Ceph Storage 7, vcs-type=git, name=rhceph, GIT_REPO=https://github.com/ceph/ceph-container.git, build-date=2025-11-26T19:44:28Z, GIT_BRANCH=main, architecture=x86_64, vendor=Red Hat, Inc., GIT_CLEAN=True, RELEASE=main, maintainer=Guillaume Abrioux , summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., com.redhat.component=rhceph-container, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, description=Red Hat Ceph Storage 7, distribution-scope=public, CEPH_POINT_RELEASE=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1763362218) Dec 15 02:45:13 localhost ceph-osd[31375]: osd.0 14 crush map has features 288514051259236352, adjusting msgr requires for clients Dec 15 02:45:13 localhost ceph-osd[31375]: osd.0 14 crush map has 
features 288514051259236352 was 288514050185503233, adjusting msgr requires for mons Dec 15 02:45:13 localhost ceph-osd[31375]: osd.0 14 crush map has features 3314933000852226048, adjusting msgr requires for osds Dec 15 02:45:14 localhost podman[33234]: Dec 15 02:45:14 localhost podman[33234]: 2025-12-15 07:45:14.740402743 +0000 UTC m=+0.090966851 container create 6cfb59ea60d860f05d7739ea9b58442152d73a845cefa3155f41389dea565599 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=naughty_joliot, version=7, CEPH_POINT_RELEASE=, com.redhat.component=rhceph-container, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhceph, distribution-scope=public, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, ceph=True, description=Red Hat Ceph Storage 7, io.openshift.expose-services=, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, vendor=Red Hat, Inc., summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_BRANCH=main, build-date=2025-11-26T19:44:28Z, release=1763362218, vcs-type=git, io.buildah.version=1.41.4, RELEASE=main, GIT_REPO=https://github.com/ceph/ceph-container.git, maintainer=Guillaume Abrioux , url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.tags=rhceph ceph, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_CLEAN=True, io.k8s.description=Red Hat Ceph Storage 7, architecture=x86_64) Dec 15 02:45:14 localhost systemd[25924]: Starting Mark boot as successful... Dec 15 02:45:14 localhost systemd[25924]: Finished Mark boot as successful. Dec 15 02:45:14 localhost systemd[1]: Started libpod-conmon-6cfb59ea60d860f05d7739ea9b58442152d73a845cefa3155f41389dea565599.scope. 
Dec 15 02:45:14 localhost podman[33234]: 2025-12-15 07:45:14.699749637 +0000 UTC m=+0.050313795 image pull registry.redhat.io/rhceph/rhceph-7-rhel9:latest Dec 15 02:45:14 localhost systemd[1]: Started libcrun container. Dec 15 02:45:14 localhost podman[33234]: 2025-12-15 07:45:14.831213208 +0000 UTC m=+0.181777316 container init 6cfb59ea60d860f05d7739ea9b58442152d73a845cefa3155f41389dea565599 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=naughty_joliot, distribution-scope=public, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.buildah.version=1.41.4, GIT_CLEAN=True, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, build-date=2025-11-26T19:44:28Z, release=1763362218, architecture=x86_64, ceph=True, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, com.redhat.component=rhceph-container, maintainer=Guillaume Abrioux , RELEASE=main, GIT_REPO=https://github.com/ceph/ceph-container.git, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, GIT_BRANCH=main, version=7, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, vcs-type=git, name=rhceph, CEPH_POINT_RELEASE=, vendor=Red Hat, Inc., io.k8s.description=Red Hat Ceph Storage 7, description=Red Hat Ceph Storage 7, io.openshift.expose-services=, io.openshift.tags=rhceph ceph) Dec 15 02:45:14 localhost naughty_joliot[33250]: 167 167 Dec 15 02:45:14 localhost systemd[1]: libpod-6cfb59ea60d860f05d7739ea9b58442152d73a845cefa3155f41389dea565599.scope: Deactivated successfully. 
Dec 15 02:45:14 localhost podman[33234]: 2025-12-15 07:45:14.871422172 +0000 UTC m=+0.221986250 container start 6cfb59ea60d860f05d7739ea9b58442152d73a845cefa3155f41389dea565599 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=naughty_joliot, maintainer=Guillaume Abrioux , name=rhceph, RELEASE=main, build-date=2025-11-26T19:44:28Z, vendor=Red Hat, Inc., ceph=True, url=https://catalog.redhat.com/en/search?searchType=containers, version=7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.buildah.version=1.41.4, GIT_BRANCH=main, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.k8s.description=Red Hat Ceph Storage 7, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, vcs-type=git, io.openshift.expose-services=, io.openshift.tags=rhceph ceph, release=1763362218, GIT_REPO=https://github.com/ceph/ceph-container.git, distribution-scope=public, architecture=x86_64, description=Red Hat Ceph Storage 7, GIT_CLEAN=True, CEPH_POINT_RELEASE=, com.redhat.component=rhceph-container, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9) Dec 15 02:45:14 localhost podman[33234]: 2025-12-15 07:45:14.871757891 +0000 UTC m=+0.222321979 container attach 6cfb59ea60d860f05d7739ea9b58442152d73a845cefa3155f41389dea565599 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=naughty_joliot, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_REPO=https://github.com/ceph/ceph-container.git, vcs-type=git, architecture=x86_64, url=https://catalog.redhat.com/en/search?searchType=containers, description=Red Hat Ceph Storage 7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a 
fully featured and supported base image., vendor=Red Hat, Inc., com.redhat.component=rhceph-container, build-date=2025-11-26T19:44:28Z, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, ceph=True, maintainer=Guillaume Abrioux , io.openshift.tags=rhceph ceph, distribution-scope=public, io.k8s.description=Red Hat Ceph Storage 7, io.buildah.version=1.41.4, io.openshift.expose-services=, GIT_CLEAN=True, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, GIT_BRANCH=main, release=1763362218, name=rhceph, RELEASE=main, CEPH_POINT_RELEASE=, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, version=7) Dec 15 02:45:14 localhost podman[33234]: 2025-12-15 07:45:14.873352802 +0000 UTC m=+0.223916930 container died 6cfb59ea60d860f05d7739ea9b58442152d73a845cefa3155f41389dea565599 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=naughty_joliot, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, version=7, com.redhat.component=rhceph-container, maintainer=Guillaume Abrioux , GIT_BRANCH=main, name=rhceph, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, RELEASE=main, vendor=Red Hat, Inc., architecture=x86_64, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.openshift.expose-services=, io.openshift.tags=rhceph ceph, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., url=https://catalog.redhat.com/en/search?searchType=containers, build-date=2025-11-26T19:44:28Z, release=1763362218, io.k8s.description=Red Hat Ceph Storage 7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.buildah.version=1.41.4, description=Red Hat Ceph Storage 7, vcs-type=git, CEPH_POINT_RELEASE=, ceph=True, GIT_CLEAN=True, GIT_REPO=https://github.com/ceph/ceph-container.git, 
vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0)
Dec 15 02:45:14 localhost systemd[1]: tmp-crun.fy9ixa.mount: Deactivated successfully.
Dec 15 02:45:14 localhost ceph-osd[31375]: osd.0 pg_epoch: 14 pg[1.0( empty local-lis/les=0/0 n=0 ec=14/14 lis/c=0/0 les/c/f=0/0/0 sis=14) [2,0] r=1 lpr=14 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Stray
Dec 15 02:45:15 localhost podman[33255]: 2025-12-15 07:45:15.007649105 +0000 UTC m=+0.145308278 container remove 6cfb59ea60d860f05d7739ea9b58442152d73a845cefa3155f41389dea565599 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=naughty_joliot, ceph=True, io.buildah.version=1.41.4, maintainer=Guillaume Abrioux , RELEASE=main, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, url=https://catalog.redhat.com/en/search?searchType=containers, CEPH_POINT_RELEASE=, vendor=Red Hat, Inc., GIT_BRANCH=main, distribution-scope=public, name=rhceph, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, version=7, vcs-type=git, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, com.redhat.component=rhceph-container, description=Red Hat Ceph Storage 7, io.openshift.expose-services=, architecture=x86_64, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_REPO=https://github.com/ceph/ceph-container.git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, build-date=2025-11-26T19:44:28Z, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.k8s.description=Red Hat Ceph Storage 7, io.openshift.tags=rhceph ceph, GIT_CLEAN=True, release=1763362218)
Dec 15 02:45:15 localhost systemd[1]: libpod-conmon-6cfb59ea60d860f05d7739ea9b58442152d73a845cefa3155f41389dea565599.scope: Deactivated successfully.
Dec 15 02:45:15 localhost podman[33277]: Dec 15 02:45:15 localhost podman[33277]: 2025-12-15 07:45:15.240514524 +0000 UTC m=+0.087586943 container create a2eefc882a038541b644b7d7fca1696db0f46be3134abc5f3e860da34ccb372a (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=great_bouman, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, io.openshift.expose-services=, build-date=2025-11-26T19:44:28Z, distribution-scope=public, io.openshift.tags=rhceph ceph, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, maintainer=Guillaume Abrioux , release=1763362218, GIT_REPO=https://github.com/ceph/ceph-container.git, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., description=Red Hat Ceph Storage 7, GIT_BRANCH=main, RELEASE=main, GIT_CLEAN=True, io.k8s.description=Red Hat Ceph Storage 7, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, com.redhat.component=rhceph-container, vendor=Red Hat, Inc., io.buildah.version=1.41.4, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, architecture=x86_64, vcs-type=git, ceph=True, version=7, CEPH_POINT_RELEASE=, url=https://catalog.redhat.com/en/search?searchType=containers, name=rhceph, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI) Dec 15 02:45:15 localhost systemd[1]: Started libpod-conmon-a2eefc882a038541b644b7d7fca1696db0f46be3134abc5f3e860da34ccb372a.scope. Dec 15 02:45:15 localhost podman[33277]: 2025-12-15 07:45:15.186840614 +0000 UTC m=+0.033913023 image pull registry.redhat.io/rhceph/rhceph-7-rhel9:latest Dec 15 02:45:15 localhost systemd[1]: Started libcrun container. 
Dec 15 02:45:15 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/5a7407e9ecad5906df1a30337b352af418c5d1797ade979699c876446d95199b/merged/rootfs supports timestamps until 2038 (0x7fffffff) Dec 15 02:45:15 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/5a7407e9ecad5906df1a30337b352af418c5d1797ade979699c876446d95199b/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff) Dec 15 02:45:15 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/5a7407e9ecad5906df1a30337b352af418c5d1797ade979699c876446d95199b/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff) Dec 15 02:45:15 localhost podman[33277]: 2025-12-15 07:45:15.355402819 +0000 UTC m=+0.202475238 container init a2eefc882a038541b644b7d7fca1696db0f46be3134abc5f3e860da34ccb372a (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=great_bouman, RELEASE=main, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, url=https://catalog.redhat.com/en/search?searchType=containers, architecture=x86_64, vendor=Red Hat, Inc., GIT_BRANCH=main, build-date=2025-11-26T19:44:28Z, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, io.openshift.tags=rhceph ceph, ceph=True, GIT_REPO=https://github.com/ceph/ceph-container.git, description=Red Hat Ceph Storage 7, distribution-scope=public, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, CEPH_POINT_RELEASE=, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, name=rhceph, io.k8s.description=Red Hat Ceph Storage 7, com.redhat.component=rhceph-container, release=1763362218, io.buildah.version=1.41.4, version=7, vcs-type=git, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, maintainer=Guillaume Abrioux , summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and 
supported base image., GIT_CLEAN=True) Dec 15 02:45:15 localhost podman[33277]: 2025-12-15 07:45:15.369316596 +0000 UTC m=+0.216389025 container start a2eefc882a038541b644b7d7fca1696db0f46be3134abc5f3e860da34ccb372a (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=great_bouman, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_CLEAN=True, version=7, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.buildah.version=1.41.4, distribution-scope=public, vcs-type=git, ceph=True, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, GIT_BRANCH=main, build-date=2025-11-26T19:44:28Z, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., maintainer=Guillaume Abrioux , RELEASE=main, url=https://catalog.redhat.com/en/search?searchType=containers, CEPH_POINT_RELEASE=, name=rhceph, io.openshift.expose-services=, com.redhat.component=rhceph-container, io.openshift.tags=rhceph ceph, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, release=1763362218, io.k8s.description=Red Hat Ceph Storage 7, GIT_REPO=https://github.com/ceph/ceph-container.git, description=Red Hat Ceph Storage 7, cpe=cpe:/a:redhat:enterprise_linux:9::appstream) Dec 15 02:45:15 localhost podman[33277]: 2025-12-15 07:45:15.369706146 +0000 UTC m=+0.216778615 container attach a2eefc882a038541b644b7d7fca1696db0f46be3134abc5f3e860da34ccb372a (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=great_bouman, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vendor=Red Hat, Inc., GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_CLEAN=True, url=https://catalog.redhat.com/en/search?searchType=containers, version=7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.buildah.version=1.41.4, 
GIT_REPO=https://github.com/ceph/ceph-container.git, release=1763362218, architecture=x86_64, GIT_BRANCH=main, name=rhceph, build-date=2025-11-26T19:44:28Z, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, com.redhat.component=rhceph-container, CEPH_POINT_RELEASE=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, io.openshift.tags=rhceph ceph, vcs-type=git, io.k8s.description=Red Hat Ceph Storage 7, io.openshift.expose-services=, description=Red Hat Ceph Storage 7, RELEASE=main, ceph=True, distribution-scope=public, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, maintainer=Guillaume Abrioux )
Dec 15 02:45:15 localhost systemd[1]: var-lib-containers-storage-overlay-ffa678b328dc12a6a81f7151c786d74c8a999abbb1e49b4a0c50f41ae2f1a693-merged.mount: Deactivated successfully.
Dec 15 02:45:16 localhost great_bouman[33292]: [
Dec 15 02:45:16 localhost great_bouman[33292]: {
Dec 15 02:45:16 localhost great_bouman[33292]: "available": false,
Dec 15 02:45:16 localhost great_bouman[33292]: "ceph_device": false,
Dec 15 02:45:16 localhost great_bouman[33292]: "device_id": "QEMU_DVD-ROM_QM00001",
Dec 15 02:45:16 localhost great_bouman[33292]: "lsm_data": {},
Dec 15 02:45:16 localhost great_bouman[33292]: "lvs": [],
Dec 15 02:45:16 localhost great_bouman[33292]: "path": "/dev/sr0",
Dec 15 02:45:16 localhost great_bouman[33292]: "rejected_reasons": [
Dec 15 02:45:16 localhost great_bouman[33292]: "Has a FileSystem",
Dec 15 02:45:16 localhost great_bouman[33292]: "Insufficient space (<5GB)"
Dec 15 02:45:16 localhost great_bouman[33292]: ],
Dec 15 02:45:16 localhost great_bouman[33292]: "sys_api": {
Dec 15 02:45:16 localhost great_bouman[33292]: "actuators": null,
Dec 15 02:45:16 localhost great_bouman[33292]: "device_nodes": "sr0",
Dec 15 02:45:16 localhost great_bouman[33292]: "human_readable_size": "482.00 KB",
Dec 15 02:45:16 localhost great_bouman[33292]: "id_bus": "ata",
Dec 15 02:45:16 localhost great_bouman[33292]: "model": "QEMU DVD-ROM",
Dec 15 02:45:16 localhost great_bouman[33292]: "nr_requests": "2",
Dec 15 02:45:16 localhost great_bouman[33292]: "partitions": {},
Dec 15 02:45:16 localhost great_bouman[33292]: "path": "/dev/sr0",
Dec 15 02:45:16 localhost great_bouman[33292]: "removable": "1",
Dec 15 02:45:16 localhost great_bouman[33292]: "rev": "2.5+",
Dec 15 02:45:16 localhost great_bouman[33292]: "ro": "0",
Dec 15 02:45:16 localhost great_bouman[33292]: "rotational": "1",
Dec 15 02:45:16 localhost great_bouman[33292]: "sas_address": "",
Dec 15 02:45:16 localhost great_bouman[33292]: "sas_device_handle": "",
Dec 15 02:45:16 localhost great_bouman[33292]: "scheduler_mode": "mq-deadline",
Dec 15 02:45:16 localhost great_bouman[33292]: "sectors": 0,
Dec 15 02:45:16 localhost great_bouman[33292]: "sectorsize": "2048",
Dec 15 02:45:16 localhost great_bouman[33292]: "size": 493568.0,
Dec 15 02:45:16 localhost great_bouman[33292]: "support_discard": "0",
Dec 15 02:45:16 localhost great_bouman[33292]: "type": "disk",
Dec 15 02:45:16 localhost great_bouman[33292]: "vendor": "QEMU"
Dec 15 02:45:16 localhost great_bouman[33292]: }
Dec 15 02:45:16 localhost great_bouman[33292]: }
Dec 15 02:45:16 localhost great_bouman[33292]: ]
Dec 15 02:45:16 localhost systemd[1]: libpod-a2eefc882a038541b644b7d7fca1696db0f46be3134abc5f3e860da34ccb372a.scope: Deactivated successfully.
Dec 15 02:45:16 localhost podman[33277]: 2025-12-15 07:45:16.394148013 +0000 UTC m=+1.241220392 container died a2eefc882a038541b644b7d7fca1696db0f46be3134abc5f3e860da34ccb372a (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=great_bouman, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, version=7, name=rhceph, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.k8s.description=Red Hat Ceph Storage 7, description=Red Hat Ceph Storage 7, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, maintainer=Guillaume Abrioux , vcs-type=git, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.openshift.tags=rhceph ceph, release=1763362218, ceph=True, architecture=x86_64, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_CLEAN=True, io.openshift.expose-services=, com.redhat.component=rhceph-container, CEPH_POINT_RELEASE=, url=https://catalog.redhat.com/en/search?searchType=containers, GIT_REPO=https://github.com/ceph/ceph-container.git, build-date=2025-11-26T19:44:28Z, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, RELEASE=main, io.buildah.version=1.41.4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, vendor=Red Hat, Inc., GIT_BRANCH=main) Dec 15 02:45:16 localhost systemd[1]: var-lib-containers-storage-overlay-5a7407e9ecad5906df1a30337b352af418c5d1797ade979699c876446d95199b-merged.mount: Deactivated successfully. 
Dec 15 02:45:16 localhost podman[34765]: 2025-12-15 07:45:16.515677478 +0000 UTC m=+0.111536049 container remove a2eefc882a038541b644b7d7fca1696db0f46be3134abc5f3e860da34ccb372a (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=great_bouman, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.buildah.version=1.41.4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat Ceph Storage 7, vcs-type=git, RELEASE=main, name=rhceph, ceph=True, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_REPO=https://github.com/ceph/ceph-container.git, distribution-scope=public, maintainer=Guillaume Abrioux , build-date=2025-11-26T19:44:28Z, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, GIT_CLEAN=True, GIT_BRANCH=main, description=Red Hat Ceph Storage 7, io.openshift.tags=rhceph ceph, vendor=Red Hat, Inc., com.redhat.component=rhceph-container, io.openshift.expose-services=, CEPH_POINT_RELEASE=, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., url=https://catalog.redhat.com/en/search?searchType=containers, version=7, release=1763362218, architecture=x86_64, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0) Dec 15 02:45:16 localhost systemd[1]: libpod-conmon-a2eefc882a038541b644b7d7fca1696db0f46be3134abc5f3e860da34ccb372a.scope: Deactivated successfully. Dec 15 02:45:16 localhost ceph-osd[32311]: osd.3 0 maybe_override_max_osd_capacity_for_qos osd bench result - bandwidth (MiB/sec): 19.020 iops: 4869.193 elapsed_sec: 0.616 Dec 15 02:45:16 localhost ceph-osd[32311]: log_channel(cluster) log [WRN] : OSD bench result of 4869.193301 IOPS is not within the threshold limit range of 50.000000 IOPS and 500.000000 IOPS for osd.3. IOPS capacity is unchanged at 315.000000 IOPS. The recommendation is to establish the osd's IOPS capacity using other benchmark tools (e.g. 
Fio) and then override osd_mclock_max_capacity_iops_[hdd|ssd]. Dec 15 02:45:16 localhost ceph-osd[32311]: osd.3 0 waiting for initial osdmap Dec 15 02:45:16 localhost ceph-bce17446-41b5-5408-a23e-0b011906b44a-osd-3[32307]: 2025-12-15T07:45:16.546+0000 7fd14fe3b640 -1 osd.3 0 waiting for initial osdmap Dec 15 02:45:16 localhost ceph-osd[32311]: osd.3 16 crush map has features 288514051259236352, adjusting msgr requires for clients Dec 15 02:45:16 localhost ceph-osd[32311]: osd.3 16 crush map has features 288514051259236352 was 288232575208792577, adjusting msgr requires for mons Dec 15 02:45:16 localhost ceph-osd[32311]: osd.3 16 crush map has features 3314933000852226048, adjusting msgr requires for osds Dec 15 02:45:16 localhost ceph-osd[32311]: osd.3 16 check_osdmap_features require_osd_release unknown -> reef Dec 15 02:45:16 localhost ceph-osd[32311]: osd.3 16 set_numa_affinity unable to identify public interface '' numa node: (2) No such file or directory Dec 15 02:45:16 localhost ceph-osd[32311]: osd.3 16 set_numa_affinity not setting numa affinity Dec 15 02:45:16 localhost ceph-bce17446-41b5-5408-a23e-0b011906b44a-osd-3[32307]: 2025-12-15T07:45:16.674+0000 7fd14ac50640 -1 osd.3 16 set_numa_affinity unable to identify public interface '' numa node: (2) No such file or directory Dec 15 02:45:16 localhost ceph-osd[32311]: osd.3 16 _collect_metadata loop4: no unique device id for loop4: fallback method has no model nor serial Dec 15 02:45:17 localhost ceph-osd[32311]: osd.3 17 state: booting -> active Dec 15 02:45:18 localhost ceph-osd[31375]: osd.0 pg_epoch: 18 pg[1.0( v 17'76 (0'0,17'76] local-lis/les=14/15 n=2 ec=14/14 lis/c=14/0 les/c/f=15/0/0 sis=18 pruub=12.844082832s) [2,0,4] r=1 lpr=18 pi=[14,18)/1 luod=0'0 lua=0'0 crt=17'76 lcod 17'75 mlcod 0'0 active pruub 25.203254700s@ mbc={}] start_peering_interval up [2,0] -> [2,0,4], acting [2,0] -> [2,0,4], acting_primary 2 -> 2, up_primary 2 -> 2, role 1 -> 1, features acting 4540138322906710015 upacting 
4540138322906710015 Dec 15 02:45:18 localhost ceph-osd[31375]: osd.0 pg_epoch: 18 pg[1.0( v 17'76 (0'0,17'76] local-lis/les=14/15 n=2 ec=14/14 lis/c=14/0 les/c/f=15/0/0 sis=18 pruub=12.843963623s) [2,0,4] r=1 lpr=18 pi=[14,18)/1 crt=17'76 lcod 17'75 mlcod 0'0 unknown NOTIFY pruub 25.203254700s@ mbc={}] state: transitioning to Stray Dec 15 02:45:25 localhost podman[34897]: 2025-12-15 07:45:25.865355747 +0000 UTC m=+0.118165300 container exec 8dcda56b365b42dc8758aab77a9ec80db304780e449052738f7e4e648ae1ecaf (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-bce17446-41b5-5408-a23e-0b011906b44a-crash-np0005559462, build-date=2025-11-26T19:44:28Z, name=rhceph, io.buildah.version=1.41.4, RELEASE=main, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., release=1763362218, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.openshift.expose-services=, maintainer=Guillaume Abrioux , io.openshift.tags=rhceph ceph, com.redhat.component=rhceph-container, GIT_CLEAN=True, description=Red Hat Ceph Storage 7, ceph=True, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, architecture=x86_64, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_BRANCH=main, GIT_REPO=https://github.com/ceph/ceph-container.git, distribution-scope=public, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, url=https://catalog.redhat.com/en/search?searchType=containers, vendor=Red Hat, Inc., CEPH_POINT_RELEASE=, vcs-type=git, version=7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat Ceph Storage 7) Dec 15 02:45:25 localhost podman[34897]: 2025-12-15 07:45:25.983364772 +0000 UTC m=+0.236174265 container exec_died 8dcda56b365b42dc8758aab77a9ec80db304780e449052738f7e4e648ae1ecaf (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, 
name=ceph-bce17446-41b5-5408-a23e-0b011906b44a-crash-np0005559462, com.redhat.component=rhceph-container, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, RELEASE=main, version=7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat Ceph Storage 7, description=Red Hat Ceph Storage 7, io.openshift.expose-services=, distribution-scope=public, CEPH_POINT_RELEASE=, GIT_CLEAN=True, build-date=2025-11-26T19:44:28Z, vcs-type=git, GIT_BRANCH=main, vendor=Red Hat, Inc., maintainer=Guillaume Abrioux , summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.openshift.tags=rhceph ceph, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, architecture=x86_64, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, name=rhceph, release=1763362218, ceph=True, GIT_REPO=https://github.com/ceph/ceph-container.git, url=https://catalog.redhat.com/en/search?searchType=containers) Dec 15 02:46:27 localhost podman[35075]: 2025-12-15 07:46:27.805889051 +0000 UTC m=+0.083159413 container exec 8dcda56b365b42dc8758aab77a9ec80db304780e449052738f7e4e648ae1ecaf (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-bce17446-41b5-5408-a23e-0b011906b44a-crash-np0005559462, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, CEPH_POINT_RELEASE=, maintainer=Guillaume Abrioux , com.redhat.component=rhceph-container, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.openshift.tags=rhceph ceph, architecture=x86_64, GIT_BRANCH=main, vendor=Red Hat, Inc., RELEASE=main, version=7, io.buildah.version=1.41.4, io.k8s.description=Red Hat Ceph Storage 7, ceph=True, GIT_CLEAN=True, distribution-scope=public, 
url=https://catalog.redhat.com/en/search?searchType=containers, release=1763362218, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, build-date=2025-11-26T19:44:28Z, description=Red Hat Ceph Storage 7, GIT_REPO=https://github.com/ceph/ceph-container.git, io.openshift.expose-services=, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., name=rhceph) Dec 15 02:46:27 localhost podman[35075]: 2025-12-15 07:46:27.939744981 +0000 UTC m=+0.217015323 container exec_died 8dcda56b365b42dc8758aab77a9ec80db304780e449052738f7e4e648ae1ecaf (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-bce17446-41b5-5408-a23e-0b011906b44a-crash-np0005559462, GIT_BRANCH=main, release=1763362218, build-date=2025-11-26T19:44:28Z, io.buildah.version=1.41.4, RELEASE=main, io.openshift.expose-services=, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, url=https://catalog.redhat.com/en/search?searchType=containers, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, vcs-type=git, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, maintainer=Guillaume Abrioux , org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, GIT_CLEAN=True, vendor=Red Hat, Inc., ceph=True, com.redhat.component=rhceph-container, CEPH_POINT_RELEASE=, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, GIT_REPO=https://github.com/ceph/ceph-container.git, description=Red Hat Ceph Storage 7, io.openshift.tags=rhceph ceph, version=7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.k8s.description=Red Hat Ceph Storage 7, name=rhceph, distribution-scope=public) Dec 15 02:46:37 localhost systemd[1]: session-13.scope: Deactivated 
successfully. Dec 15 02:46:37 localhost systemd[1]: session-13.scope: Consumed 21.135s CPU time. Dec 15 02:46:37 localhost systemd-logind[763]: Session 13 logged out. Waiting for processes to exit. Dec 15 02:46:37 localhost systemd-logind[763]: Removed session 13. Dec 15 02:48:38 localhost systemd[25924]: Created slice User Background Tasks Slice. Dec 15 02:48:38 localhost systemd[25924]: Starting Cleanup of User's Temporary Files and Directories... Dec 15 02:48:38 localhost systemd[25924]: Finished Cleanup of User's Temporary Files and Directories. Dec 15 02:50:15 localhost sshd[35451]: main: sshd: ssh-rsa algorithm is disabled Dec 15 02:50:15 localhost systemd-logind[763]: New session 27 of user zuul. Dec 15 02:50:15 localhost systemd[1]: Started Session 27 of User zuul. Dec 15 02:50:15 localhost python3[35499]: ansible-ansible.legacy.ping Invoked with data=pong Dec 15 02:50:16 localhost python3[35544]: ansible-setup Invoked with gather_subset=['!facter', '!ohai'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d Dec 15 02:50:17 localhost python3[35564]: ansible-user Invoked with name=tripleo-admin generate_ssh_key=False state=present non_unique=False force=False remove=False create_home=True system=False move_home=False append=False ssh_key_bits=0 ssh_key_type=rsa ssh_key_comment=ansible-generated on np0005559462.localdomain update_password=always uid=None group=None groups=None comment=None home=None shell=None password=NOT_LOGGING_PARAMETER login_class=None password_expire_max=None password_expire_min=None hidden=None seuser=None skeleton=None ssh_key_file=None ssh_key_passphrase=NOT_LOGGING_PARAMETER expires=None password_lock=None local=None profile=None authorization=None role=None umask=None Dec 15 02:50:17 localhost python3[35620]: ansible-ansible.legacy.stat Invoked with path=/etc/sudoers.d/tripleo-admin follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True Dec 15 02:50:18 localhost 
python3[35663]: ansible-ansible.legacy.copy Invoked with dest=/etc/sudoers.d/tripleo-admin mode=288 owner=root group=root src=/home/zuul/.ansible/tmp/ansible-tmp-1765785017.6472328-65470-235903460734408/source _original_basename=tmpcqncfrd8 follow=False checksum=b3e7ecdcc699d217c6b083a91b07208207813d93 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None Dec 15 02:50:18 localhost python3[35693]: ansible-file Invoked with path=/home/tripleo-admin state=directory owner=tripleo-admin group=tripleo-admin mode=448 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None Dec 15 02:50:19 localhost python3[35709]: ansible-file Invoked with path=/home/tripleo-admin/.ssh state=directory owner=tripleo-admin group=tripleo-admin mode=448 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None Dec 15 02:50:19 localhost python3[35725]: ansible-file Invoked with path=/home/tripleo-admin/.ssh/authorized_keys state=touch owner=tripleo-admin group=tripleo-admin mode=384 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None Dec 15 02:50:20 localhost python3[35741]: ansible-lineinfile Invoked with path=/home/tripleo-admin/.ssh/authorized_keys line=ssh-rsa 
AAAAB3NzaC1yc2EAAAADAQABAAABgQC1ko8xh0bPiR6+NCG4km3etin5rm3hZZmVHXuDDaTrTWq3PUz5sEmSbiDQOJj4mOpsouNXjYHkjXSuRLTQ5dqF1BWU5bgiOkTIqccwZdxkqfM2VFXFj/Ej621HUBRYHf7PK5zkl+8G1g2RkkiSd886DSw6I1J+2uT/e+4/0G1vsACTaNArP3/JSOh0hdwu+fnjybrp4sauiJsWaQvwbWao/txJqznQhymNwHZVFRMhFy+x+oDr4ry7w+X2JuGz2ydbUojBUG0REWTKmU4EZsyDsx77GzIJwfHsUUuJ0t4DcDalVqz20D+LXTug8AfgSovuNhZTz8AqRXfCOZK9haLb+3tJwKExMBdnj0cacNw6O23jZTIMbJK+qoxGN4mzqr8RNFVPPLVhL5tcfb2MvN7TWs7+oT0K5xmxDkEnr7iSpmPV0d8TD/wNwfJmtu2W6TWL5zgA6U3GyfnTH2+nhrFk5Ou+yaLES2+Wx8kb0NLWgDnlCcYy9dw1eP3GSc3GIzM= zuul-build-sshkey#012 regexp=Generated by TripleO state=present backrefs=False create=False backup=False firstmatch=False unsafe_writes=False search_string=None insertafter=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Dec 15 02:50:20 localhost python3[35755]: ansible-ping Invoked with data=pong Dec 15 02:50:32 localhost sshd[35756]: main: sshd: ssh-rsa algorithm is disabled Dec 15 02:50:32 localhost systemd-logind[763]: New session 28 of user tripleo-admin. Dec 15 02:50:32 localhost systemd[1]: Created slice User Slice of UID 1003. Dec 15 02:50:32 localhost systemd[1]: Starting User Runtime Directory /run/user/1003... Dec 15 02:50:32 localhost systemd[1]: Finished User Runtime Directory /run/user/1003. Dec 15 02:50:32 localhost systemd[1]: Starting User Manager for UID 1003... Dec 15 02:50:32 localhost systemd[35760]: Queued start job for default target Main User Target. Dec 15 02:50:32 localhost systemd[35760]: Created slice User Application Slice. Dec 15 02:50:32 localhost systemd[35760]: Started Mark boot as successful after the user session has run 2 minutes. Dec 15 02:50:32 localhost systemd[35760]: Started Daily Cleanup of User's Temporary Directories. Dec 15 02:50:32 localhost systemd[35760]: Reached target Paths. Dec 15 02:50:32 localhost systemd[35760]: Reached target Timers. 
Dec 15 02:50:32 localhost systemd[35760]: Starting D-Bus User Message Bus Socket... Dec 15 02:50:32 localhost systemd[35760]: Starting Create User's Volatile Files and Directories... Dec 15 02:50:32 localhost systemd[35760]: Finished Create User's Volatile Files and Directories. Dec 15 02:50:32 localhost systemd[35760]: Listening on D-Bus User Message Bus Socket. Dec 15 02:50:32 localhost systemd[35760]: Reached target Sockets. Dec 15 02:50:32 localhost systemd[35760]: Reached target Basic System. Dec 15 02:50:32 localhost systemd[35760]: Reached target Main User Target. Dec 15 02:50:32 localhost systemd[35760]: Startup finished in 116ms. Dec 15 02:50:32 localhost systemd[1]: Started User Manager for UID 1003. Dec 15 02:50:32 localhost systemd[1]: Started Session 28 of User tripleo-admin. Dec 15 02:50:33 localhost python3[35821]: ansible-ansible.legacy.setup Invoked with gather_subset=['!all', 'min'] gather_timeout=45 filter=[] fact_path=/etc/ansible/facts.d Dec 15 02:50:38 localhost python3[35917]: ansible-selinux Invoked with policy=targeted state=enforcing configfile=/etc/selinux/config Dec 15 02:50:39 localhost python3[35933]: ansible-tempfile Invoked with state=file suffix=tmphosts prefix=ansible. path=None Dec 15 02:50:40 localhost python3[35981]: ansible-ansible.legacy.copy Invoked with remote_src=True src=/etc/hosts dest=/tmp/ansible.epxz5dectmphosts mode=preserve backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Dec 15 02:50:40 localhost python3[36011]: ansible-blockinfile Invoked with state=absent path=/tmp/ansible.epxz5dectmphosts block= marker=# {mark} marker_begin=HEAT_HOSTS_START - Do not edit manually within this section! 
marker_end=HEAT_HOSTS_END create=False backup=False unsafe_writes=False insertafter=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Dec 15 02:50:41 localhost python3[36027]: ansible-blockinfile Invoked with create=True path=/tmp/ansible.epxz5dectmphosts insertbefore=BOF block=172.17.0.106 np0005559462.localdomain np0005559462#012172.18.0.106 np0005559462.storage.localdomain np0005559462.storage#012172.20.0.106 np0005559462.storagemgmt.localdomain np0005559462.storagemgmt#012172.17.0.106 np0005559462.internalapi.localdomain np0005559462.internalapi#012172.19.0.106 np0005559462.tenant.localdomain np0005559462.tenant#012192.168.122.106 np0005559462.ctlplane.localdomain np0005559462.ctlplane#012172.17.0.107 np0005559463.localdomain np0005559463#012172.18.0.107 np0005559463.storage.localdomain np0005559463.storage#012172.20.0.107 np0005559463.storagemgmt.localdomain np0005559463.storagemgmt#012172.17.0.107 np0005559463.internalapi.localdomain np0005559463.internalapi#012172.19.0.107 np0005559463.tenant.localdomain np0005559463.tenant#012192.168.122.107 np0005559463.ctlplane.localdomain np0005559463.ctlplane#012172.17.0.108 np0005559464.localdomain np0005559464#012172.18.0.108 np0005559464.storage.localdomain np0005559464.storage#012172.20.0.108 np0005559464.storagemgmt.localdomain np0005559464.storagemgmt#012172.17.0.108 np0005559464.internalapi.localdomain np0005559464.internalapi#012172.19.0.108 np0005559464.tenant.localdomain np0005559464.tenant#012192.168.122.108 np0005559464.ctlplane.localdomain np0005559464.ctlplane#012172.17.0.103 np0005559459.localdomain np0005559459#012172.18.0.103 np0005559459.storage.localdomain np0005559459.storage#012172.20.0.103 np0005559459.storagemgmt.localdomain np0005559459.storagemgmt#012172.17.0.103 np0005559459.internalapi.localdomain np0005559459.internalapi#012172.19.0.103 np0005559459.tenant.localdomain np0005559459.tenant#012192.168.122.103 
np0005559459.ctlplane.localdomain np0005559459.ctlplane#012172.17.0.104 np0005559460.localdomain np0005559460#012172.18.0.104 np0005559460.storage.localdomain np0005559460.storage#012172.20.0.104 np0005559460.storagemgmt.localdomain np0005559460.storagemgmt#012172.17.0.104 np0005559460.internalapi.localdomain np0005559460.internalapi#012172.19.0.104 np0005559460.tenant.localdomain np0005559460.tenant#012192.168.122.104 np0005559460.ctlplane.localdomain np0005559460.ctlplane#012172.17.0.105 np0005559461.localdomain np0005559461#012172.18.0.105 np0005559461.storage.localdomain np0005559461.storage#012172.20.0.105 np0005559461.storagemgmt.localdomain np0005559461.storagemgmt#012172.17.0.105 np0005559461.internalapi.localdomain np0005559461.internalapi#012172.19.0.105 np0005559461.tenant.localdomain np0005559461.tenant#012192.168.122.105 np0005559461.ctlplane.localdomain np0005559461.ctlplane#012#012192.168.122.100 undercloud.ctlplane.localdomain undercloud.ctlplane#012192.168.122.99 overcloud.ctlplane.localdomain#012172.18.0.159 overcloud.storage.localdomain#012172.20.0.162 overcloud.storagemgmt.localdomain#012172.17.0.121 overcloud.internalapi.localdomain#012172.21.0.190 overcloud.localdomain#012 marker=# {mark} marker_begin=START_HOST_ENTRIES_FOR_STACK: overcloud marker_end=END_HOST_ENTRIES_FOR_STACK: overcloud state=present backup=False unsafe_writes=False insertafter=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Dec 15 02:50:42 localhost python3[36043]: ansible-ansible.legacy.command Invoked with _raw_params=cp "/tmp/ansible.epxz5dectmphosts" "/etc/hosts" _uses_shell=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None Dec 15 02:50:42 localhost python3[36060]: ansible-file Invoked with path=/tmp/ansible.epxz5dectmphosts state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S 
access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Dec 15 02:50:43 localhost python3[36076]: ansible-ansible.legacy.command Invoked with _raw_params=rpm -q --whatprovides rhosp-release _uses_shell=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None Dec 15 02:50:45 localhost python3[36093]: ansible-ansible.legacy.dnf Invoked with name=['rhosp-release'] state=present allow_downgrade=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 allowerasing=False nobest=False conf_file=None disable_excludes=None download_dir=None list=None releasever=None Dec 15 02:50:49 localhost python3[36113]: ansible-ansible.legacy.command Invoked with _raw_params=rpm -q --whatprovides driverctl lvm2 jq nftables openvswitch openstack-heat-agents openstack-selinux os-net-config python3-libselinux python3-pyyaml puppet-tripleo rsync tmpwatch sysstat iproute-tc _uses_shell=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None Dec 15 02:50:50 localhost python3[36130]: ansible-ansible.legacy.dnf Invoked with name=['driverctl', 'lvm2', 'jq', 'nftables', 'openvswitch', 'openstack-heat-agents', 'openstack-selinux', 'os-net-config', 'python3-libselinux', 'python3-pyyaml', 'puppet-tripleo', 'rsync', 'tmpwatch', 'sysstat', 'iproute-tc'] state=present allow_downgrade=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] 
download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 allowerasing=False nobest=False conf_file=None disable_excludes=None download_dir=None list=None releasever=None Dec 15 02:51:59 localhost kernel: SELinux: Converting 2700 SID table entries... Dec 15 02:51:59 localhost kernel: SELinux: policy capability network_peer_controls=1 Dec 15 02:51:59 localhost kernel: SELinux: policy capability open_perms=1 Dec 15 02:51:59 localhost kernel: SELinux: policy capability extended_socket_class=1 Dec 15 02:51:59 localhost kernel: SELinux: policy capability always_check_network=0 Dec 15 02:51:59 localhost kernel: SELinux: policy capability cgroup_seclabel=1 Dec 15 02:51:59 localhost kernel: SELinux: policy capability nnp_nosuid_transition=1 Dec 15 02:51:59 localhost kernel: SELinux: policy capability genfs_seclabel_symlinks=1 Dec 15 02:51:59 localhost dbus-broker-launch[755]: avc: op=load_policy lsm=selinux seqno=6 res=1 Dec 15 02:51:59 localhost systemd[1]: Started /usr/bin/systemctl start man-db-cache-update. Dec 15 02:51:59 localhost systemd[1]: Starting man-db-cache-update.service... Dec 15 02:51:59 localhost systemd[1]: Reloading. Dec 15 02:51:59 localhost systemd-rc-local-generator[36994]: /etc/rc.d/rc.local is not marked executable, skipping. Dec 15 02:51:59 localhost systemd-sysv-generator[36999]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. Dec 15 02:51:59 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. 
Dec 15 02:52:00 localhost systemd[1]: Queuing reload/restart jobs for marked units… Dec 15 02:52:00 localhost systemd[1]: man-db-cache-update.service: Deactivated successfully. Dec 15 02:52:00 localhost systemd[1]: Finished man-db-cache-update.service. Dec 15 02:52:00 localhost systemd[1]: run-r3e96c031650241538ed2ff666bd76f38.service: Deactivated successfully. Dec 15 02:52:04 localhost python3[37429]: ansible-ansible.legacy.command Invoked with _raw_params=rpm -V driverctl lvm2 jq nftables openvswitch openstack-heat-agents openstack-selinux os-net-config python3-libselinux python3-pyyaml puppet-tripleo rsync tmpwatch sysstat iproute-tc _uses_shell=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None Dec 15 02:52:06 localhost python3[37568]: ansible-ansible.legacy.systemd Invoked with name=openvswitch enabled=True state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None Dec 15 02:52:06 localhost systemd[1]: Reloading. Dec 15 02:52:06 localhost systemd-rc-local-generator[37599]: /etc/rc.d/rc.local is not marked executable, skipping. Dec 15 02:52:06 localhost systemd-sysv-generator[37602]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. Dec 15 02:52:06 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. 
Dec 15 02:52:07 localhost python3[37623]: ansible-file Invoked with path=/var/lib/heat-config/tripleo-config-download state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Dec 15 02:52:07 localhost python3[37639]: ansible-ansible.legacy.command Invoked with _raw_params=rpm -q --whatprovides openstack-network-scripts _uses_shell=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None Dec 15 02:52:08 localhost python3[37656]: ansible-systemd Invoked with name=NetworkManager enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None masked=None Dec 15 02:52:09 localhost python3[37674]: ansible-ini_file Invoked with path=/etc/NetworkManager/NetworkManager.conf state=present no_extra_spaces=True section=main option=dns value=none backup=True exclusive=True allow_no_value=False create=True unsafe_writes=False values=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Dec 15 02:52:10 localhost python3[37692]: ansible-ini_file Invoked with path=/etc/NetworkManager/NetworkManager.conf state=present no_extra_spaces=True section=main option=rc-manager value=unmanaged backup=True exclusive=True allow_no_value=False create=True unsafe_writes=False values=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Dec 15 02:52:10 localhost python3[37710]: ansible-ansible.legacy.systemd Invoked with name=NetworkManager state=reloaded daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None Dec 15 02:52:10 localhost systemd[1]: Reloading Network Manager... 
Dec 15 02:52:10 localhost NetworkManager[5963]: [1765785130.8440] audit: op="reload" arg="0" pid=37713 uid=0 result="success"
Dec 15 02:52:10 localhost NetworkManager[5963]: [1765785130.8454] config: signal: SIGHUP,config-files,values,values-user,no-auto-default,dns-mode,rc-manager (/etc/NetworkManager/NetworkManager.conf (lib: 00-server.conf) (run: 15-carrier-timeout.conf))
Dec 15 02:52:10 localhost NetworkManager[5963]: [1765785130.8455] dns-mgr: init: dns=none,systemd-resolved rc-manager=unmanaged
Dec 15 02:52:10 localhost systemd[1]: Reloaded Network Manager.
Dec 15 02:52:11 localhost python3[37729]: ansible-ansible.legacy.command Invoked with _raw_params=ln -f -s /usr/share/openstack-puppet/modules/* /etc/puppet/modules/ _uses_shell=True stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 15 02:52:11 localhost python3[37746]: ansible-stat Invoked with path=/usr/bin/ansible-playbook follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Dec 15 02:52:12 localhost python3[37764]: ansible-stat Invoked with path=/usr/bin/ansible-playbook-3 follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Dec 15 02:52:12 localhost python3[37780]: ansible-file Invoked with state=link src=/usr/bin/ansible-playbook path=/usr/bin/ansible-playbook-3 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 15 02:52:13 localhost python3[37796]: ansible-tempfile Invoked with state=file prefix=ansible.
suffix= path=None Dec 15 02:52:13 localhost python3[37812]: ansible-stat Invoked with path=/etc/ssh/ssh_known_hosts follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1 Dec 15 02:52:14 localhost python3[37828]: ansible-blockinfile Invoked with path=/tmp/ansible.pq5ap1pr block=[192.168.122.106]*,[np0005559462.ctlplane.localdomain]*,[172.17.0.106]*,[np0005559462.internalapi.localdomain]*,[172.18.0.106]*,[np0005559462.storage.localdomain]*,[172.20.0.106]*,[np0005559462.storagemgmt.localdomain]*,[172.19.0.106]*,[np0005559462.tenant.localdomain]*,[np0005559462.localdomain]*,[np0005559462]* ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQDrQzXLeVaQJEU2ztjn0fNzYVokp78uisG99XwWlgQg7ZpJT+WTsYqcKv35fw1GN/zQy59gWCk8kYF10TYn8QaRCgF0ZXGY8H0LjB7U4x/HUzMWF03yBBRXZjcF39ubxRGSmMaVWBpOYp9M08b2RBwNDJCTjdrYyyaRMnS+jyA8nFsXD/p8n8I625s6JjDL5pU06+6Urj5IJp/9WFWZUfPJZz31ZXhK/se/5c44GzjneCEn99dfU+Brux9+D6WQpI1HZxEuPPoUTRG/+dTwx0e2VrbxdqAxoMfXS+fB2l5XEQMrQD1Q6aG7K7ndtd+6BQYvLmFakcX/UevQngJOuz08tcgdea0gRXmOIr3JbqmFn3bcOP4ozZ9R23Hs4fMenHDW8Ivw57xe1oyPHF82POHh2HreGMWqVlcsWcHhZLEzEOHlLBfZEBrBLzP6Zck0gXqh3zgzip+dF70qUxiGtenul3aCJIJGIf/tUoMkGYM51NwDylsw0We2cO7tkD36uvc=#012[192.168.122.107]*,[np0005559463.ctlplane.localdomain]*,[172.17.0.107]*,[np0005559463.internalapi.localdomain]*,[172.18.0.107]*,[np0005559463.storage.localdomain]*,[172.20.0.107]*,[np0005559463.storagemgmt.localdomain]*,[172.19.0.107]*,[np0005559463.tenant.localdomain]*,[np0005559463.localdomain]*,[np0005559463]* ssh-rsa 
AAAAB3NzaC1yc2EAAAADAQABAAABgQC0zv1YwbYydDOC13IxABIAlv2+N4/64PF3ADLPVluzLQONerjQ0gjtgR7OfrcoFTZTiqc2tJKpMFSw9qOZAUqHG8sT8FCcrTiwVoKv8yyUzcZh7iheOk4d4FMyUCuxj+VD7lbHDRSPnCafc91T/BKNKvMV2R/h+zdHoyg8u8zng++18JeHuYdtavYz5Uz7sgK6EZ4JPLao3nLM8Vbzl2lk0yfZQvVRn3dXvPf4jwvZJl+kNum2ZqrtrHACqFvq+zbh0hFCZYHmoc7dvmrl8pBj8e3Qs4iXW8vkYf0TKHtCCcPz3o9WDH946bmi2pAWCYgejR5Qg13HBcKQb0sKuXYSk6F7s6pzrOSAHajY3SLA0xVgpmbac0NyWJgnAdXxqspsXnW7z2bwgTDhxzRDh4QHTpWSlhr1PFnY/HvzvCJdo9RZ5D3xFEen1YQPXPJzWmsAQzvASxYUmweNC1xO3Vb5cnd51AiQOVpBJRgF64lKKCOnjqxOOKOvLD9L82F+Gsc=#012[192.168.122.108]*,[np0005559464.ctlplane.localdomain]*,[172.17.0.108]*,[np0005559464.internalapi.localdomain]*,[172.18.0.108]*,[np0005559464.storage.localdomain]*,[172.20.0.108]*,[np0005559464.storagemgmt.localdomain]*,[172.19.0.108]*,[np0005559464.tenant.localdomain]*,[np0005559464.localdomain]*,[np0005559464]* ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQCxqkgoIIgxeHEDa634HpTkDNt9TKXrhBXtKApvA2HDOYJC7R2lrHf0hxkAuU/MCkIZse6a6pP4n4JPp29gByMqzGBNcoo1iSRmKkSHDpaUeu0f+9fH8BDL9pdOwo8IiwHn9kk8ffoq3gqVhWKEdD4Td38/we+YLKMNXqM7yyrIiXSbLSPLJxR66ZRY+JXYFKRs8IJSMRyuXnNporymS5NtzgTxuYebROnCEG+mNONzDWnqPSBB0oSEi76oKKTqJRq4kv+8V3ZTMIMTs42VntiTD5hBgzPa2ZhmY0wKdz7vI1xGZ+2SAuZkzRwv7YXF6J0pqlKgHxE7TrvWUj+EXb/kgQ/tgtB+wXbu9pLw91/8L3hgAWNrMzcWheIEHXcH3btf9HTSxLtgs1xB8EgPw/nmSUDppsScqfPEUPtHoZOZ0O3wevinvnBqo9dmxOcZWSPmrujK3TvByP2omhYn8MWSctYg2sER10rd8Wg3JArPwiEdcp3UA/hkrYGAJWYNeBM=#012[192.168.122.103]*,[np0005559459.ctlplane.localdomain]*,[172.17.0.103]*,[np0005559459.internalapi.localdomain]*,[172.18.0.103]*,[np0005559459.storage.localdomain]*,[172.20.0.103]*,[np0005559459.storagemgmt.localdomain]*,[172.19.0.103]*,[np0005559459.tenant.localdomain]*,[np0005559459.localdomain]*,[np0005559459]* ssh-rsa 
AAAAB3NzaC1yc2EAAAADAQABAAABgQCqlnRqscQHmiXC9F0uBpTeOoqxw0/HUjdhUjI49Q/PK0NsH6XeoHwFn3RUyYvUTYQrRCPNFeNH+THx5uA5O7mjJBaHg0RbtpQ0qSufn0fFyDFIjuMW8u2Bs7DA7daecXfzweFHqWsZzRksCCZdvGUK61zPvmhxuPkzYaME/JuZ0RxpAMyb0YhyvzL++niWIi+OBpDYpAnbynsPE428f5U5GJ87eUDZ4g0Iy3+4HC98k/DBchi4w22zg1nU4O3vtPhLAgEWX7z3/nz/9St6ifZhXW2xurdbBr9nPb8xSGeAN7a0aERqAI4tYkZo9rXzKOTGB85OeKP1XoYiWdhexZjV+j4RwAwNWVAIVACfkw8ZuW5WmFuCSVjT8A+EmQo9PLeg3RGBjRbTO8orQ59hNelHoqnK60HO4/JO6x3VAcn/EpCSS5a8xsMQFQDCMbaPrmNpTA0Na1qpq+yu92glOEGwff5HKWN2vDfVqGQZhWwabrKtbVf5cKMi2jhmecCNj9E=#012[192.168.122.104]*,[np0005559460.ctlplane.localdomain]*,[172.17.0.104]*,[np0005559460.internalapi.localdomain]*,[172.18.0.104]*,[np0005559460.storage.localdomain]*,[172.20.0.104]*,[np0005559460.storagemgmt.localdomain]*,[172.19.0.104]*,[np0005559460.tenant.localdomain]*,[np0005559460.localdomain]*,[np0005559460]* ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQDEKLYeWo8M0KKNWeR21MbsfAxuubj3NQ9RoSnLNJB8iaQd0VMr5Mr6IxEco83PyPUfuz+0BCKzHjMUmEFroQLP0uJAbkmRhN+pbImyPvqY0LdluyDf40PwNKyXGbWGbW76YLzHwp+CUSV3BAQ6cLJzrMQn3GFmepexVzDNvziyEloO16lObV6r/1mKXEMM4qkqPbDNNznwKBL8jS8nXglEUPTE0ATtyp/4/tvv8tUTQ5uQyxF6QwBmzzMgrhyA/L0CQC1kYCjUtxAau88hJ6XT06rJ+bGWUojpdI+hYlKRtcd/5x8+9LD8kQ29s4AnLPg72qZglIFa46JfZAPFBabQoMDtnA2uZlJ+AWYEcvCyIlLPRRiXaSDVMkBhzYT3FVwLqmdmpcurZzs1WS8HEVbLY4ZJchb2gL1PVZMazpBH8tEH/n4fmy7p2t1G5z2xT45grhWyr5xE9fQtfes2N8l1gMa02U9vCGr8lBhSGic6KL09+XWZtdqGCz1IfoRt0tM=#012[192.168.122.105]*,[np0005559461.ctlplane.localdomain]*,[172.17.0.105]*,[np0005559461.internalapi.localdomain]*,[172.18.0.105]*,[np0005559461.storage.localdomain]*,[172.20.0.105]*,[np0005559461.storagemgmt.localdomain]*,[172.19.0.105]*,[np0005559461.tenant.localdomain]*,[np0005559461.localdomain]*,[np0005559461]* ssh-rsa 
AAAAB3NzaC1yc2EAAAADAQABAAABgQC+pZDj8/JotpJwzMLuGO3sGR9qelkKNigZ2dHBVONU8Te2pVOOlBjGecT3+MT3PMotPbB8TwWgRvbJE0Z12178pRNQX61gnd2TtITG7EvEsL9j+LZHo+AJC2eSsdTlWMhCOlRy/TEUYfAJFXRawnsSsEU377lC5qTLesYFzCdgb3aC3pme1bP38Fpx2QDE8XZjl9wq7C1isruKuTifALk4kS2NnVU6XKllWAemqz4vf0UJUCG1qI2HmxPP/miVK//pk1ZdZzZk1kvbQYbaxXcsVJ7DHR+tTWPp/56OlKngz91Qt0xidMlJHxn8bf5rZChk4a0HLBbae2/ksxutNZb7i/LZ9B3Q41/Lq8bcvQPLkvYcW7tkMxHbR2MfKCFFfjxsJV03L3HrgbdsctrXW+58VS4sFRWRdKkOSRkesSF1+KDxG5GqFKFAhRp76OESCiv81XJPXGO5ElNpxnkajHwO/ts/neF3vlUr5z5BOPZ+hLohivjXlIFEQrFF9EqUau8=#012 create=True state=present marker=# {mark} ANSIBLE MANAGED BLOCK backup=False marker_begin=BEGIN marker_end=END unsafe_writes=False insertafter=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Dec 15 02:52:15 localhost python3[37844]: ansible-ansible.legacy.command Invoked with _raw_params=cat '/tmp/ansible.pq5ap1pr' > /etc/ssh/ssh_known_hosts _uses_shell=True stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None Dec 15 02:52:15 localhost python3[37862]: ansible-file Invoked with path=/tmp/ansible.pq5ap1pr state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Dec 15 02:52:16 localhost python3[37878]: ansible-file Invoked with path=/var/log/journal state=directory mode=0750 owner=root group=root setype=var_log_t recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None Dec 15 02:52:16 localhost 
python3[37894]: ansible-ansible.legacy.command Invoked with _raw_params=systemctl is-active cloud-init.service || systemctl is-enabled cloud-init.service _uses_shell=True stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 15 02:52:17 localhost python3[37912]: ansible-ansible.legacy.command Invoked with _raw_params=cat /proc/cmdline | grep -q cloud-init=disabled _uses_shell=True stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 15 02:52:17 localhost python3[37931]: ansible-community.general.cloud_init_data_facts Invoked with filter=status
Dec 15 02:52:20 localhost python3[38068]: ansible-ansible.legacy.command Invoked with _raw_params=rpm -q --whatprovides tuned tuned-profiles-cpu-partitioning _uses_shell=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 15 02:52:20 localhost python3[38085]: ansible-ansible.legacy.dnf Invoked with name=['tuned', 'tuned-profiles-cpu-partitioning'] state=present allow_downgrade=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 allowerasing=False nobest=False conf_file=None disable_excludes=None download_dir=None list=None releasever=None
Dec 15 02:52:23 localhost dbus-broker-launch[751]: Noticed file-system modification, trigger reload.
Dec 15 02:52:23 localhost dbus-broker-launch[751]: Noticed file-system modification, trigger reload.
Dec 15 02:52:24 localhost systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Dec 15 02:52:24 localhost systemd[1]: Starting man-db-cache-update.service...
Dec 15 02:52:24 localhost systemd[1]: Reloading.
Dec 15 02:52:24 localhost systemd-sysv-generator[38153]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 15 02:52:24 localhost systemd-rc-local-generator[38150]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 15 02:52:24 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Dec 15 02:52:24 localhost systemd[1]: Queuing reload/restart jobs for marked units…
Dec 15 02:52:24 localhost systemd[1]: Stopping Dynamic System Tuning Daemon...
Dec 15 02:52:24 localhost systemd[1]: tuned.service: Deactivated successfully.
Dec 15 02:52:24 localhost systemd[1]: Stopped Dynamic System Tuning Daemon.
Dec 15 02:52:24 localhost systemd[1]: tuned.service: Consumed 1.713s CPU time.
Dec 15 02:52:24 localhost systemd[1]: Starting Dynamic System Tuning Daemon...
Dec 15 02:52:24 localhost systemd[1]: man-db-cache-update.service: Deactivated successfully.
Dec 15 02:52:24 localhost systemd[1]: Finished man-db-cache-update.service.
Dec 15 02:52:24 localhost systemd[1]: run-rf90dd15db5ba43bd929331183d688a24.service: Deactivated successfully.
Dec 15 02:52:25 localhost systemd[1]: Started Dynamic System Tuning Daemon.
Dec 15 02:52:25 localhost systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Dec 15 02:52:25 localhost systemd[1]: Starting man-db-cache-update.service...
Dec 15 02:52:26 localhost systemd[1]: man-db-cache-update.service: Deactivated successfully.
Dec 15 02:52:26 localhost systemd[1]: Finished man-db-cache-update.service.
Dec 15 02:52:26 localhost systemd[1]: run-r25dfbdea0a69491dae4c2ddde287b5bd.service: Deactivated successfully.
Dec 15 02:52:27 localhost python3[38521]: ansible-systemd Invoked with name=tuned state=restarted enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 15 02:52:27 localhost systemd[1]: Stopping Dynamic System Tuning Daemon...
Dec 15 02:52:27 localhost systemd[1]: tuned.service: Deactivated successfully.
Dec 15 02:52:27 localhost systemd[1]: Stopped Dynamic System Tuning Daemon.
Dec 15 02:52:27 localhost systemd[1]: Starting Dynamic System Tuning Daemon...
Dec 15 02:52:28 localhost systemd[1]: Started Dynamic System Tuning Daemon.
Dec 15 02:52:28 localhost python3[38716]: ansible-ansible.legacy.command Invoked with _raw_params=which tuned-adm _uses_shell=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 15 02:52:29 localhost python3[38733]: ansible-slurp Invoked with src=/etc/tuned/active_profile
Dec 15 02:52:30 localhost python3[38749]: ansible-stat Invoked with path=/etc/tuned/throughput-performance-variables.conf follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Dec 15 02:52:30 localhost python3[38765]: ansible-ansible.legacy.command Invoked with _raw_params=tuned-adm profile throughput-performance _uses_shell=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 15 02:52:32 localhost python3[38785]: ansible-ansible.legacy.command Invoked with _raw_params=cat /proc/cmdline _uses_shell=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 15 02:52:33 localhost python3[38802]: ansible-stat Invoked with path=/var/lib/config-data/puppet-generated/nova_libvirt/etc/nova/nova.conf follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Dec 15 02:52:35 localhost python3[38818]: ansible-replace
Invoked with regexp=TRIPLEO_HEAT_TEMPLATE_KERNEL_ARGS dest=/etc/default/grub replace= path=/etc/default/grub backup=False encoding=utf-8 unsafe_writes=False after=None before=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Dec 15 02:52:38 localhost systemd[35760]: Starting Mark boot as successful... Dec 15 02:52:38 localhost systemd[35760]: Finished Mark boot as successful. Dec 15 02:52:41 localhost python3[38912]: ansible-file Invoked with path=/etc/puppet/hieradata state=directory mode=448 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Dec 15 02:52:41 localhost python3[38960]: ansible-ansible.legacy.stat Invoked with path=/etc/puppet/hiera.yaml follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True Dec 15 02:52:41 localhost python3[39005]: ansible-ansible.legacy.copy Invoked with mode=384 dest=/etc/puppet/hiera.yaml src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1765785161.1872184-70219-261940119051199/source _original_basename=tmpvni8wh84 follow=False checksum=aaf3699defba931d532f4955ae152f505046749a backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Dec 15 02:52:42 localhost python3[39035]: ansible-file Invoked with src=/etc/puppet/hiera.yaml dest=/etc/hiera.yaml state=link force=True path=/etc/hiera.yaml recurse=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None modification_time=None access_time=None mode=None 
owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Dec 15 02:52:43 localhost python3[39083]: ansible-ansible.legacy.stat Invoked with path=/etc/puppet/hieradata/all_nodes.json follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True Dec 15 02:52:43 localhost python3[39126]: ansible-ansible.legacy.copy Invoked with src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1765785162.742903-70309-210756500044441/source dest=/etc/puppet/hieradata/all_nodes.json _original_basename=overcloud.json follow=False checksum=47c055675afa20d285c542bd0688919d2e1f93aa backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Dec 15 02:52:43 localhost python3[39188]: ansible-ansible.legacy.stat Invoked with path=/etc/puppet/hieradata/bootstrap_node.json follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True Dec 15 02:52:44 localhost python3[39231]: ansible-ansible.legacy.copy Invoked with src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1765785163.609677-70365-25894118795666/source dest=/etc/puppet/hieradata/bootstrap_node.json mode=None follow=False _original_basename=bootstrap_node.j2 checksum=207e865c503d2a98f256eec665982e962798df13 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Dec 15 02:52:44 localhost python3[39293]: ansible-ansible.legacy.stat Invoked with path=/etc/puppet/hieradata/vip_data.json follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True Dec 15 02:52:45 localhost python3[39336]: ansible-ansible.legacy.copy Invoked with 
src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1765785164.484262-70365-18582580448556/source dest=/etc/puppet/hieradata/vip_data.json mode=None follow=False _original_basename=vip_data.j2 checksum=d2a0393bdef25c443c6a7710a9d6ec4e72829e4a backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Dec 15 02:52:45 localhost python3[39398]: ansible-ansible.legacy.stat Invoked with path=/etc/puppet/hieradata/net_ip_map.json follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True Dec 15 02:52:46 localhost python3[39441]: ansible-ansible.legacy.copy Invoked with src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1765785165.4274538-70365-256789938024412/source dest=/etc/puppet/hieradata/net_ip_map.json mode=None follow=False _original_basename=net_ip_map.j2 checksum=68b5a56a66cb10764ef3288009ad5e9b7e8faf12 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Dec 15 02:52:46 localhost python3[39503]: ansible-ansible.legacy.stat Invoked with path=/etc/puppet/hieradata/cloud_domain.json follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True Dec 15 02:52:47 localhost python3[39546]: ansible-ansible.legacy.copy Invoked with src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1765785166.4263887-70365-128699154842789/source dest=/etc/puppet/hieradata/cloud_domain.json mode=None follow=False _original_basename=cloud_domain.j2 checksum=5dd835a63e6a03d74797c2e2eadf4bea1cecd9d9 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None 
serole=None selevel=None setype=None attributes=None Dec 15 02:52:47 localhost python3[39608]: ansible-ansible.legacy.stat Invoked with path=/etc/puppet/hieradata/fqdn.json follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True Dec 15 02:52:47 localhost python3[39651]: ansible-ansible.legacy.copy Invoked with src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1765785167.2253354-70365-74441326509867/source dest=/etc/puppet/hieradata/fqdn.json mode=None follow=False _original_basename=fqdn.j2 checksum=fbe347e593b93384e0cec0bd7744e823eb5444b0 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Dec 15 02:52:48 localhost python3[39713]: ansible-ansible.legacy.stat Invoked with path=/etc/puppet/hieradata/service_names.json follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True Dec 15 02:52:48 localhost python3[39756]: ansible-ansible.legacy.copy Invoked with src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1765785168.1516001-70365-6843702041948/source dest=/etc/puppet/hieradata/service_names.json mode=None follow=False _original_basename=service_names.j2 checksum=ff586b96402d8ae133745cf06f17e772b2f22d52 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Dec 15 02:52:49 localhost python3[39818]: ansible-ansible.legacy.stat Invoked with path=/etc/puppet/hieradata/service_configs.json follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True Dec 15 02:52:49 localhost python3[39861]: ansible-ansible.legacy.copy Invoked with 
src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1765785168.9801702-70365-2890463168958/source dest=/etc/puppet/hieradata/service_configs.json mode=None follow=False _original_basename=service_configs.j2 checksum=8338108adce7037f2efca0df2b750fcb66fbd809 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Dec 15 02:52:50 localhost python3[39923]: ansible-ansible.legacy.stat Invoked with path=/etc/puppet/hieradata/extraconfig.json follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True Dec 15 02:52:50 localhost python3[39966]: ansible-ansible.legacy.copy Invoked with src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1765785169.8276412-70365-68415342632680/source dest=/etc/puppet/hieradata/extraconfig.json mode=None follow=False _original_basename=extraconfig.j2 checksum=5f36b2ea290645ee34d943220a14b54ee5ea5be5 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Dec 15 02:52:50 localhost python3[40028]: ansible-ansible.legacy.stat Invoked with path=/etc/puppet/hieradata/role_extraconfig.json follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True Dec 15 02:52:51 localhost python3[40071]: ansible-ansible.legacy.copy Invoked with src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1765785170.6567116-70365-113693248107882/source dest=/etc/puppet/hieradata/role_extraconfig.json mode=None follow=False _original_basename=role_extraconfig.j2 checksum=34875968bf996542162e620523f9dcfb3deac331 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None 
owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Dec 15 02:52:51 localhost python3[40133]: ansible-ansible.legacy.stat Invoked with path=/etc/puppet/hieradata/ovn_chassis_mac_map.json follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True Dec 15 02:52:52 localhost python3[40176]: ansible-ansible.legacy.copy Invoked with src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1765785171.5408943-70365-127152074144189/source dest=/etc/puppet/hieradata/ovn_chassis_mac_map.json mode=None follow=False _original_basename=ovn_chassis_mac_map.j2 checksum=6f2e564cc8c8523d19f2e16f438a571c80aea80b backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Dec 15 02:52:52 localhost python3[40206]: ansible-stat Invoked with path={'src': '/etc/puppet/hieradata/ansible_managed.json'} follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1 Dec 15 02:52:53 localhost python3[40254]: ansible-ansible.legacy.stat Invoked with path=/etc/puppet/hieradata/ansible_managed.json follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True Dec 15 02:52:54 localhost python3[40297]: ansible-ansible.legacy.copy Invoked with dest=/etc/puppet/hieradata/ansible_managed.json owner=root group=root mode=0644 src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1765785173.443413-70988-6403911328982/source _original_basename=tmpqiv88973 follow=False checksum=bf21a9e8fbc5a3846fb05b4fa0859e0917b2202f backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None Dec 15 02:52:59 localhost python3[40327]: ansible-setup Invoked with 
gather_subset=['!all', '!min', 'network'] filter=['ansible_default_ipv4'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Dec 15 02:52:59 localhost python3[40388]: ansible-ansible.legacy.command Invoked with _raw_params=ping -w 10 -c 5 38.102.83.1 _uses_shell=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 15 02:53:04 localhost python3[40405]: ansible-ansible.legacy.command Invoked with _raw_params=ping -w 10 -c 5 192.168.122.10 _uses_shell=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 15 02:53:09 localhost python3[40422]: ansible-ansible.legacy.command Invoked with _raw_params=INT=$(ip ro get 192.168.122.106 | head -1 | sed -nr "s/.* dev (\w+) .*/\1/p")#012MTU=$(cat /sys/class/net/${INT}/mtu 2>/dev/null || echo "0")#012echo "$INT $MTU"#012 _uses_shell=True stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 15 02:53:10 localhost python3[40445]: ansible-ansible.legacy.command Invoked with _raw_params=INT=$(ip ro get 172.18.0.106 | head -1 | sed -nr "s/.* dev (\w+) .*/\1/p")#012MTU=$(cat /sys/class/net/${INT}/mtu 2>/dev/null || echo "0")#012echo "$INT $MTU"#012 _uses_shell=True stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 15 02:53:10 localhost python3[40468]: ansible-ansible.legacy.command Invoked with _raw_params=INT=$(ip ro get 172.20.0.106 | head -1 | sed -nr "s/.* dev (\w+) .*/\1/p")#012MTU=$(cat /sys/class/net/${INT}/mtu 2>/dev/null || echo "0")#012echo "$INT $MTU"#012 _uses_shell=True stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 15 02:53:11 localhost python3[40491]: ansible-ansible.legacy.command Invoked with _raw_params=INT=$(ip ro get 172.17.0.106 | head -1 | sed -nr "s/.*
dev (\w+) .*/\1/p")#012MTU=$(cat /sys/class/net/${INT}/mtu 2>/dev/null || echo "0")#012echo "$INT $MTU"#012 _uses_shell=True stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 15 02:53:12 localhost python3[40514]: ansible-ansible.legacy.command Invoked with _raw_params=INT=$(ip ro get 172.19.0.106 | head -1 | sed -nr "s/.* dev (\w+) .*/\1/p")#012MTU=$(cat /sys/class/net/${INT}/mtu 2>/dev/null || echo "0")#012echo "$INT $MTU"#012 _uses_shell=True stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 15 02:53:43 localhost sshd[40599]: main: sshd: ssh-rsa algorithm is disabled
Dec 15 02:53:49 localhost sshd[40600]: main: sshd: ssh-rsa algorithm is disabled
Dec 15 02:53:53 localhost python3[40616]: ansible-file Invoked with path=/etc/puppet/hieradata state=directory mode=448 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 15 02:53:53 localhost python3[40664]: ansible-ansible.legacy.stat Invoked with path=/etc/puppet/hiera.yaml follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Dec 15 02:53:53 localhost python3[40682]: ansible-ansible.legacy.file Invoked with mode=384 dest=/etc/puppet/hiera.yaml _original_basename=tmph2o22cr6 recurse=False state=file path=/etc/puppet/hiera.yaml force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 15 02:53:54 localhost python3[40712]: ansible-file Invoked with src=/etc/puppet/hiera.yaml dest=/etc/hiera.yaml state=link force=True path=/etc/hiera.yaml recurse=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 15 02:53:54 localhost python3[40760]: ansible-ansible.legacy.stat Invoked with path=/etc/puppet/hieradata/all_nodes.json follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Dec 15 02:53:55 localhost python3[40778]: ansible-ansible.legacy.file Invoked with dest=/etc/puppet/hieradata/all_nodes.json _original_basename=overcloud.json recurse=False state=file path=/etc/puppet/hieradata/all_nodes.json force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 15 02:53:55 localhost python3[40840]: ansible-ansible.legacy.stat Invoked with path=/etc/puppet/hieradata/bootstrap_node.json follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Dec 15 02:53:56 localhost python3[40858]: ansible-ansible.legacy.file Invoked with mode=None dest=/etc/puppet/hieradata/bootstrap_node.json _original_basename=bootstrap_node.j2 recurse=False state=file path=/etc/puppet/hieradata/bootstrap_node.json force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 15 02:53:56 localhost python3[40920]: ansible-ansible.legacy.stat Invoked with path=/etc/puppet/hieradata/vip_data.json follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Dec 15 02:53:56 localhost python3[40938]: ansible-ansible.legacy.file Invoked with mode=None dest=/etc/puppet/hieradata/vip_data.json _original_basename=vip_data.j2 recurse=False state=file path=/etc/puppet/hieradata/vip_data.json force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 15 02:53:57 localhost python3[41000]: ansible-ansible.legacy.stat Invoked with path=/etc/puppet/hieradata/net_ip_map.json follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Dec 15 02:53:58 localhost python3[41018]: ansible-ansible.legacy.file Invoked with mode=None dest=/etc/puppet/hieradata/net_ip_map.json _original_basename=net_ip_map.j2 recurse=False state=file path=/etc/puppet/hieradata/net_ip_map.json force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 15 02:53:58 localhost python3[41080]: ansible-ansible.legacy.stat Invoked with path=/etc/puppet/hieradata/cloud_domain.json follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Dec 15 02:53:58 localhost python3[41098]: ansible-ansible.legacy.file Invoked with mode=None dest=/etc/puppet/hieradata/cloud_domain.json _original_basename=cloud_domain.j2 recurse=False state=file path=/etc/puppet/hieradata/cloud_domain.json force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 15 02:53:59 localhost python3[41160]: ansible-ansible.legacy.stat Invoked with path=/etc/puppet/hieradata/fqdn.json follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Dec 15 02:53:59 localhost python3[41178]: ansible-ansible.legacy.file Invoked with mode=None dest=/etc/puppet/hieradata/fqdn.json _original_basename=fqdn.j2 recurse=False state=file path=/etc/puppet/hieradata/fqdn.json force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 15 02:54:00 localhost python3[41240]: ansible-ansible.legacy.stat Invoked with path=/etc/puppet/hieradata/service_names.json follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Dec 15 02:54:00 localhost python3[41258]: ansible-ansible.legacy.file Invoked with mode=None dest=/etc/puppet/hieradata/service_names.json _original_basename=service_names.j2 recurse=False state=file path=/etc/puppet/hieradata/service_names.json force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 15 02:54:00 localhost python3[41320]: ansible-ansible.legacy.stat Invoked with path=/etc/puppet/hieradata/service_configs.json follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Dec 15 02:54:01 localhost python3[41338]: ansible-ansible.legacy.file Invoked with mode=None dest=/etc/puppet/hieradata/service_configs.json _original_basename=service_configs.j2 recurse=False state=file path=/etc/puppet/hieradata/service_configs.json force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 15 02:54:01 localhost python3[41400]: ansible-ansible.legacy.stat Invoked with path=/etc/puppet/hieradata/extraconfig.json follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Dec 15 02:54:01 localhost python3[41418]: ansible-ansible.legacy.file Invoked with mode=None dest=/etc/puppet/hieradata/extraconfig.json _original_basename=extraconfig.j2 recurse=False state=file path=/etc/puppet/hieradata/extraconfig.json force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 15 02:54:02 localhost python3[41480]: ansible-ansible.legacy.stat Invoked with path=/etc/puppet/hieradata/role_extraconfig.json follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Dec 15 02:54:02 localhost python3[41498]: ansible-ansible.legacy.file Invoked with mode=None dest=/etc/puppet/hieradata/role_extraconfig.json _original_basename=role_extraconfig.j2 recurse=False state=file path=/etc/puppet/hieradata/role_extraconfig.json force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 15 02:54:02 localhost python3[41560]: ansible-ansible.legacy.stat Invoked with path=/etc/puppet/hieradata/ovn_chassis_mac_map.json follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Dec 15 02:54:03 localhost python3[41578]: ansible-ansible.legacy.file Invoked with mode=None dest=/etc/puppet/hieradata/ovn_chassis_mac_map.json _original_basename=ovn_chassis_mac_map.j2 recurse=False state=file path=/etc/puppet/hieradata/ovn_chassis_mac_map.json force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 15 02:54:03 localhost python3[41608]: ansible-stat Invoked with path={'src': '/etc/puppet/hieradata/ansible_managed.json'} follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Dec 15 02:54:04 localhost python3[41656]: ansible-ansible.legacy.stat Invoked with path=/etc/puppet/hieradata/ansible_managed.json follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Dec 15 02:54:04 localhost python3[41674]: ansible-ansible.legacy.file Invoked with owner=root group=root mode=0644 dest=/etc/puppet/hieradata/ansible_managed.json _original_basename=tmptebj6wca recurse=False state=file path=/etc/puppet/hieradata/ansible_managed.json force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 15 02:54:07 localhost python3[41704]: ansible-dnf Invoked with name=['firewalld'] state=absent allow_downgrade=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 allowerasing=False nobest=False conf_file=None disable_excludes=None download_dir=None list=None releasever=None
Dec 15 02:54:12 localhost python3[41721]: ansible-ansible.builtin.systemd Invoked with name=iptables.service state=stopped enabled=False daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 15 02:54:12 localhost python3[41739]: ansible-ansible.builtin.systemd Invoked with name=ip6tables.service state=stopped enabled=False daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 15 02:54:13 localhost python3[41757]: ansible-ansible.builtin.systemd Invoked with name=nftables state=started enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 15 02:54:13 localhost systemd[1]: Reloading.
Dec 15 02:54:13 localhost systemd-rc-local-generator[41783]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 15 02:54:13 localhost systemd-sysv-generator[41787]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 15 02:54:13 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Dec 15 02:54:13 localhost systemd[1]: Starting Netfilter Tables...
Dec 15 02:54:13 localhost systemd[1]: Finished Netfilter Tables.
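The `ansible.legacy.command` entries in this run log multi-line shell payloads with rsyslog's control-character escaping, in which `#012` stands for a newline (octal 012). A minimal sketch of decoding a logged payload back into readable script form, using the interface/MTU probe logged at 02:53:12 as input (requires GNU or busybox sed for `\n` in the replacement):

```shell
# rsyslog escapes embedded newlines as "#012" (octal 012 = "\n").
# Decode the interface/MTU probe from the 02:53:12 log entry:
logline='INT=$(ip ro get 172.19.0.106 | head -1 | sed -nr "s/.* dev (\w+) .*/\1/p")#012MTU=$(cat /sys/class/net/${INT}/mtu 2>/dev/null || echo "0")#012echo "$INT $MTU"#012'
decoded=$(printf '%s' "$logline" | sed 's/#012/\n/g')
printf '%s\n' "$decoded"
```

The decoded output is the three-line script the playbook actually ran: capture the outbound device for 172.19.0.106 from `ip ro get`, read that device's MTU (falling back to "0"), and echo both.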
Dec 15 02:54:14 localhost python3[41847]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/iptables.nft follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Dec 15 02:54:14 localhost python3[41890]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/iptables.nft src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1765785253.8130581-73922-153199485217970/source _original_basename=iptables.nft follow=False checksum=ede9860c99075946a7bc827210247aac639bc84a backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 15 02:54:14 localhost python3[41920]: ansible-ansible.legacy.command Invoked with _raw_params=nft -f /etc/nftables/iptables.nft _uses_shell=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 15 02:54:15 localhost python3[41938]: ansible-ansible.legacy.command Invoked with _raw_params=nft -j list ruleset _uses_shell=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 15 02:54:16 localhost python3[41987]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/tripleo-jumps.nft follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Dec 15 02:54:16 localhost python3[42030]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/tripleo-jumps.nft src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1765785255.8070073-74032-80431657326507/source mode=None follow=False _original_basename=jump-chain.j2 checksum=eec306c3276262a27663d76bd0ea526457445afa backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 15 02:54:17 localhost python3[42092]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/tripleo-update-jumps.nft follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Dec 15 02:54:17 localhost python3[42135]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/tripleo-update-jumps.nft src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1765785256.7773912-74092-213811059542139/source mode=None follow=False _original_basename=jump-chain.j2 checksum=eec306c3276262a27663d76bd0ea526457445afa backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 15 02:54:18 localhost python3[42197]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/tripleo-flushes.nft follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Dec 15 02:54:18 localhost python3[42240]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/tripleo-flushes.nft src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1765785257.834902-74163-113383974111235/source mode=None follow=False _original_basename=flush-chain.j2 checksum=e8e7b8db0d61a7fe393441cc91613f470eb34a6e backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 15 02:54:19 localhost python3[42302]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/tripleo-chains.nft follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Dec 15 02:54:19 localhost python3[42345]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/tripleo-chains.nft src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1765785258.769477-74220-47807794643319/source mode=None follow=False _original_basename=chains.j2 checksum=e60ee651f5014e83924f4e901ecc8e25b1906610 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 15 02:54:20 localhost python3[42407]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/tripleo-rules.nft follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Dec 15 02:54:20 localhost python3[42450]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/tripleo-rules.nft src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1765785259.6564314-74265-238090807322705/source mode=None follow=False _original_basename=ruleset.j2 checksum=0444e4206083f91e2fb2aabfa2928244c2db35ed backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 15 02:54:21 localhost python3[42480]: ansible-ansible.legacy.command Invoked with _raw_params=cat /etc/nftables/tripleo-chains.nft /etc/nftables/tripleo-flushes.nft /etc/nftables/tripleo-rules.nft /etc/nftables/tripleo-update-jumps.nft /etc/nftables/tripleo-jumps.nft | nft -c -f - _uses_shell=True stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 15 02:54:21 localhost python3[42545]: ansible-ansible.builtin.blockinfile Invoked with path=/etc/sysconfig/nftables.conf backup=False validate=nft -c -f %s block=include "/etc/nftables/iptables.nft"#012include "/etc/nftables/tripleo-chains.nft"#012include "/etc/nftables/tripleo-rules.nft"#012include "/etc/nftables/tripleo-jumps.nft"#012 state=present marker=# {mark} ANSIBLE MANAGED BLOCK create=False marker_begin=BEGIN marker_end=END unsafe_writes=False insertafter=None insertbefore=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 15 02:54:22 localhost python3[42562]: ansible-ansible.legacy.command Invoked with _raw_params=nft -f /etc/nftables/tripleo-chains.nft _uses_shell=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 15 02:54:22 localhost python3[42579]: ansible-ansible.legacy.command Invoked with _raw_params=cat /etc/nftables/tripleo-flushes.nft /etc/nftables/tripleo-rules.nft /etc/nftables/tripleo-update-jumps.nft | nft -f - _uses_shell=True stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 15 02:54:23 localhost python3[42598]: ansible-file Invoked with mode=0750 path=/var/log/containers/collectd setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Dec 15 02:54:23 localhost python3[42614]: ansible-file Invoked with mode=0755 path=/var/lib/container-user-scripts/ setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Dec 15 02:54:23 localhost python3[42630]: ansible-file Invoked with mode=0750 path=/var/log/containers/ceilometer setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Dec 15 02:54:24 localhost python3[42646]: ansible-seboolean Invoked with name=virt_sandbox_use_netlink persistent=True state=True ignore_selinux_state=False
Dec 15 02:54:25 localhost dbus-broker-launch[755]: avc: op=load_policy lsm=selinux seqno=7 res=1
Dec 15 02:54:25 localhost python3[42666]: ansible-community.general.sefcontext Invoked with setype=container_file_t state=present target=/etc/iscsi(/.*)? ignore_selinux_state=False ftype=a reload=True seuser=None selevel=None
Dec 15 02:54:26 localhost kernel: SELinux: Converting 2704 SID table entries...
Dec 15 02:54:26 localhost kernel: SELinux: policy capability network_peer_controls=1
Dec 15 02:54:26 localhost kernel: SELinux: policy capability open_perms=1
Dec 15 02:54:26 localhost kernel: SELinux: policy capability extended_socket_class=1
Dec 15 02:54:26 localhost kernel: SELinux: policy capability always_check_network=0
Dec 15 02:54:26 localhost kernel: SELinux: policy capability cgroup_seclabel=1
Dec 15 02:54:26 localhost kernel: SELinux: policy capability nnp_nosuid_transition=1
Dec 15 02:54:26 localhost kernel: SELinux: policy capability genfs_seclabel_symlinks=1
Dec 15 02:54:26 localhost dbus-broker-launch[755]: avc: op=load_policy lsm=selinux seqno=8 res=1
Dec 15 02:54:27 localhost python3[42687]: ansible-community.general.sefcontext Invoked with setype=container_file_t state=present target=/etc/target(/.*)? ignore_selinux_state=False ftype=a reload=True seuser=None selevel=None
Dec 15 02:54:27 localhost kernel: SELinux: Converting 2704 SID table entries...
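The nftables tasks logged above stage the firewall in pieces: the chain definitions are loaded first (`nft -f /etc/nftables/tripleo-chains.nft`), then the flushes, rules, and jump updates are concatenated and piped to `nft -f -` as one transaction, after a `nft -c -f -` dry-run check of the same concatenation. A minimal reproduction of that check-then-apply pattern follows; only the file names and command shapes come from the log, while the chain name and rule bodies are toy assumptions:

```shell
# Stage rule files the way the playbook does (contents here are toy examples).
dir=$(mktemp -d)
cat > "$dir/tripleo-chains.nft" <<'EOF'
table inet filter {
    chain TRIPLEO_INPUT { }
}
EOF
cat > "$dir/tripleo-rules.nft" <<'EOF'
table inet filter {
    chain TRIPLEO_INPUT {
        ct state established,related accept
    }
}
EOF
# Dry-run check of the concatenated ruleset before anything goes live; the
# real run then pipes the same concatenation to `nft -f -` to apply it
# atomically, so a half-written ruleset is never installed.
if command -v nft >/dev/null 2>&1 \
   && cat "$dir"/tripleo-chains.nft "$dir"/tripleo-rules.nft | nft -c -f - 2>/dev/null; then
    result="ruleset OK"
else
    result="nft check skipped (nft missing or insufficient privileges)"
fi
echo "$result"
```

Splitting chains from rules is what makes the later incremental update at 02:54:22 (flush + re-add rules) safe to repeat without ever deleting the chains that jump targets depend on.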
Dec 15 02:54:27 localhost kernel: SELinux: policy capability network_peer_controls=1
Dec 15 02:54:27 localhost kernel: SELinux: policy capability open_perms=1
Dec 15 02:54:27 localhost kernel: SELinux: policy capability extended_socket_class=1
Dec 15 02:54:27 localhost kernel: SELinux: policy capability always_check_network=0
Dec 15 02:54:27 localhost kernel: SELinux: policy capability cgroup_seclabel=1
Dec 15 02:54:27 localhost kernel: SELinux: policy capability nnp_nosuid_transition=1
Dec 15 02:54:27 localhost kernel: SELinux: policy capability genfs_seclabel_symlinks=1
Dec 15 02:54:28 localhost dbus-broker-launch[755]: avc: op=load_policy lsm=selinux seqno=9 res=1
Dec 15 02:54:28 localhost python3[42708]: ansible-community.general.sefcontext Invoked with setype=container_file_t state=present target=/var/lib/iscsi(/.*)? ignore_selinux_state=False ftype=a reload=True seuser=None selevel=None
Dec 15 02:54:29 localhost kernel: SELinux: Converting 2704 SID table entries...
Dec 15 02:54:29 localhost kernel: SELinux: policy capability network_peer_controls=1
Dec 15 02:54:29 localhost kernel: SELinux: policy capability open_perms=1
Dec 15 02:54:29 localhost kernel: SELinux: policy capability extended_socket_class=1
Dec 15 02:54:29 localhost kernel: SELinux: policy capability always_check_network=0
Dec 15 02:54:29 localhost kernel: SELinux: policy capability cgroup_seclabel=1
Dec 15 02:54:29 localhost kernel: SELinux: policy capability nnp_nosuid_transition=1
Dec 15 02:54:29 localhost kernel: SELinux: policy capability genfs_seclabel_symlinks=1
Dec 15 02:54:29 localhost dbus-broker-launch[755]: avc: op=load_policy lsm=selinux seqno=10 res=1
Dec 15 02:54:29 localhost python3[42733]: ansible-file Invoked with path=/etc/iscsi setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Dec 15 02:54:30 localhost python3[42749]: ansible-file Invoked with path=/etc/target setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Dec 15 02:54:30 localhost python3[42765]: ansible-file Invoked with path=/var/lib/iscsi setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Dec 15 02:54:30 localhost python3[42781]: ansible-stat Invoked with path=/lib/systemd/system/iscsid.socket follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Dec 15 02:54:31 localhost python3[42797]: ansible-ansible.legacy.command Invoked with _raw_params=systemctl is-enabled --quiet iscsi.service _uses_shell=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 15 02:54:31 localhost python3[42814]: ansible-ansible.legacy.dnf Invoked with name=['dracut-config-generic'] state=absent allow_downgrade=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 allowerasing=False nobest=False conf_file=None disable_excludes=None download_dir=None list=None releasever=None
Dec 15 02:54:35 localhost python3[42831]: ansible-file Invoked with path=/etc/modules-load.d state=directory mode=493 owner=root group=root setype=etc_t recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec 15 02:54:35 localhost python3[42879]: ansible-ansible.legacy.stat Invoked with path=/etc/modules-load.d/99-tripleo.conf follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Dec 15 02:54:36 localhost python3[42922]: ansible-ansible.legacy.copy Invoked with src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1765785275.55635-75098-46051083424121/source dest=/etc/modules-load.d/99-tripleo.conf mode=420 owner=root group=root setype=etc_t follow=False _original_basename=tripleo-modprobe.conf.j2 checksum=8021efe01721d8fa8cab46b95c00ec1be6dbb9d0 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None seuser=None serole=None selevel=None attributes=None
Dec 15 02:54:36 localhost python3[42952]: ansible-systemd Invoked with name=systemd-modules-load.service state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Dec 15 02:54:36 localhost systemd[1]: systemd-modules-load.service: Deactivated successfully.
Dec 15 02:54:36 localhost systemd[1]: Stopped Load Kernel Modules.
Dec 15 02:54:36 localhost systemd[1]: Stopping Load Kernel Modules...
Dec 15 02:54:36 localhost systemd[1]: Starting Load Kernel Modules...
Dec 15 02:54:36 localhost kernel: bridge: filtering via arp/ip/ip6tables is no longer available by default. Update your scripts to load br_netfilter if you need this.
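The copy task at 02:54:36 templates `/etc/modules-load.d/99-tripleo.conf` and then restarts `systemd-modules-load.service` to apply it without a reboot. A plausible reconstruction of that file follows; the two module names are taken from the adjacent `systemd-modules-load` messages ("Inserted module 'br_netfilter'", "Module 'msr' is built in"), while the exact template content is an assumption. The sketch writes to a temp file so it does not touch the real system:

```shell
# Plausible reconstruction of /etc/modules-load.d/99-tripleo.conf (module
# names from the log; exact template content is assumed).
conf=$(mktemp)
cat > "$conf" <<'EOF'
# modules loaded at boot by systemd-modules-load(8)
br_netfilter
msr
EOF
# The playbook applies it immediately with:
#   systemctl restart systemd-modules-load.service
grep -cv '^#' "$conf"
```

Loading `br_netfilter` here is a prerequisite for the `net.bridge.bridge-nf-call-*` sysctls set a few seconds later: those keys only exist once the module is in the kernel.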
Dec 15 02:54:36 localhost kernel: Bridge firewalling registered
Dec 15 02:54:36 localhost systemd-modules-load[42955]: Inserted module 'br_netfilter'
Dec 15 02:54:36 localhost systemd-modules-load[42955]: Module 'msr' is built in
Dec 15 02:54:36 localhost systemd[1]: Finished Load Kernel Modules.
Dec 15 02:54:37 localhost python3[43006]: ansible-ansible.legacy.stat Invoked with path=/etc/sysctl.d/99-tripleo.conf follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Dec 15 02:54:37 localhost python3[43049]: ansible-ansible.legacy.copy Invoked with src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1765785276.9659765-75280-248499774480026/source dest=/etc/sysctl.d/99-tripleo.conf mode=420 owner=root group=root setype=etc_t follow=False _original_basename=tripleo-sysctl.conf.j2 checksum=cddb9401fdafaaf28a4a94b98448f98ae93c94c9 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None seuser=None serole=None selevel=None attributes=None
Dec 15 02:54:38 localhost python3[43079]: ansible-sysctl Invoked with name=fs.aio-max-nr value=1048576 sysctl_set=True state=present sysctl_file=/etc/sysctl.d/99-tripleo.conf reload=False ignoreerrors=False
Dec 15 02:54:38 localhost python3[43096]: ansible-sysctl Invoked with name=fs.inotify.max_user_instances value=1024 sysctl_set=True state=present sysctl_file=/etc/sysctl.d/99-tripleo.conf reload=False ignoreerrors=False
Dec 15 02:54:38 localhost python3[43114]: ansible-sysctl Invoked with name=kernel.pid_max value=1048576 sysctl_set=True state=present sysctl_file=/etc/sysctl.d/99-tripleo.conf reload=False ignoreerrors=False
Dec 15 02:54:39 localhost python3[43132]: ansible-sysctl Invoked with name=net.bridge.bridge-nf-call-arptables value=1 sysctl_set=True state=present sysctl_file=/etc/sysctl.d/99-tripleo.conf reload=False ignoreerrors=False
Dec 15 02:54:39 localhost python3[43149]: ansible-sysctl Invoked with name=net.bridge.bridge-nf-call-ip6tables value=1 sysctl_set=True state=present sysctl_file=/etc/sysctl.d/99-tripleo.conf reload=False ignoreerrors=False
Dec 15 02:54:39 localhost python3[43166]: ansible-sysctl Invoked with name=net.bridge.bridge-nf-call-iptables value=1 sysctl_set=True state=present sysctl_file=/etc/sysctl.d/99-tripleo.conf reload=False ignoreerrors=False
Dec 15 02:54:40 localhost python3[43183]: ansible-sysctl Invoked with name=net.ipv4.conf.all.rp_filter value=1 sysctl_set=True state=present sysctl_file=/etc/sysctl.d/99-tripleo.conf reload=False ignoreerrors=False
Dec 15 02:54:40 localhost python3[43201]: ansible-sysctl Invoked with name=net.ipv4.ip_forward value=1 sysctl_set=True state=present sysctl_file=/etc/sysctl.d/99-tripleo.conf reload=False ignoreerrors=False
Dec 15 02:54:40 localhost python3[43219]: ansible-sysctl Invoked with name=net.ipv4.ip_local_reserved_ports value=35357,49000-49001 sysctl_set=True state=present sysctl_file=/etc/sysctl.d/99-tripleo.conf reload=False ignoreerrors=False
Dec 15 02:54:41 localhost python3[43237]: ansible-sysctl Invoked with name=net.ipv4.ip_nonlocal_bind value=1 sysctl_set=True state=present sysctl_file=/etc/sysctl.d/99-tripleo.conf reload=False ignoreerrors=False
Dec 15 02:54:41 localhost python3[43255]: ansible-sysctl Invoked with name=net.ipv4.neigh.default.gc_thresh1 value=1024 sysctl_set=True state=present sysctl_file=/etc/sysctl.d/99-tripleo.conf reload=False ignoreerrors=False
Dec 15 02:54:41 localhost python3[43273]: ansible-sysctl Invoked with name=net.ipv4.neigh.default.gc_thresh2 value=2048 sysctl_set=True state=present sysctl_file=/etc/sysctl.d/99-tripleo.conf reload=False ignoreerrors=False
Dec 15 02:54:42 localhost python3[43291]: ansible-sysctl Invoked with name=net.ipv4.neigh.default.gc_thresh3 value=4096 sysctl_set=True state=present sysctl_file=/etc/sysctl.d/99-tripleo.conf reload=False ignoreerrors=False
Dec 15 02:54:42 localhost python3[43339]: ansible-sysctl Invoked with name=net.ipv6.conf.all.disable_ipv6 value=0 sysctl_set=True state=present sysctl_file=/etc/sysctl.d/99-tripleo.conf reload=False ignoreerrors=False
Dec 15 02:54:42 localhost python3[43378]: ansible-sysctl Invoked with name=net.ipv6.conf.all.forwarding value=0 sysctl_set=True state=present sysctl_file=/etc/sysctl.d/99-tripleo.conf reload=False ignoreerrors=False
Dec 15 02:54:43 localhost python3[43425]: ansible-sysctl Invoked with name=net.ipv6.conf.default.disable_ipv6 value=0 sysctl_set=True state=present sysctl_file=/etc/sysctl.d/99-tripleo.conf reload=False ignoreerrors=False
Dec 15 02:54:43 localhost python3[43454]: ansible-sysctl Invoked with name=net.ipv6.conf.lo.disable_ipv6 value=0 sysctl_set=True state=present sysctl_file=/etc/sysctl.d/99-tripleo.conf reload=False ignoreerrors=False
Dec 15 02:54:43 localhost python3[43490]: ansible-sysctl Invoked with name=net.ipv6.ip_nonlocal_bind value=1 sysctl_set=True state=present sysctl_file=/etc/sysctl.d/99-tripleo.conf reload=False ignoreerrors=False
Dec 15 02:54:44 localhost python3[43508]: ansible-systemd Invoked with name=systemd-sysctl.service state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Dec 15 02:54:44 localhost systemd[1]: systemd-sysctl.service: Deactivated successfully.
Dec 15 02:54:44 localhost systemd[1]: Stopped Apply Kernel Variables.
Dec 15 02:54:44 localhost systemd[1]: Stopping Apply Kernel Variables...
Dec 15 02:54:44 localhost systemd[1]: Starting Apply Kernel Variables...
Dec 15 02:54:44 localhost systemd[1]: run-credentials-systemd\x2dsysctl.service.mount: Deactivated successfully.
Dec 15 02:54:44 localhost systemd[1]: Finished Apply Kernel Variables.
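The run of `ansible-sysctl` invocations above writes each tunable into `/etc/sysctl.d/99-tripleo.conf` (with `reload=False`) and then applies the whole set at once by restarting `systemd-sysctl.service`. A sketch of the resulting file, with every key/value pair taken from the logged module arguments; the assembled layout is a reconstruction, written to a temp file here:

```shell
# Reconstruction of /etc/sysctl.d/99-tripleo.conf from the logged
# ansible-sysctl arguments (the file layout itself is assumed).
f=$(mktemp)
cat > "$f" <<'EOF'
fs.aio-max-nr = 1048576
fs.inotify.max_user_instances = 1024
kernel.pid_max = 1048576
net.bridge.bridge-nf-call-arptables = 1
net.bridge.bridge-nf-call-ip6tables = 1
net.bridge.bridge-nf-call-iptables = 1
net.ipv4.conf.all.rp_filter = 1
net.ipv4.ip_forward = 1
net.ipv4.ip_local_reserved_ports = 35357,49000-49001
net.ipv4.ip_nonlocal_bind = 1
net.ipv4.neigh.default.gc_thresh1 = 1024
net.ipv4.neigh.default.gc_thresh2 = 2048
net.ipv4.neigh.default.gc_thresh3 = 4096
net.ipv6.conf.all.disable_ipv6 = 0
net.ipv6.conf.all.forwarding = 0
net.ipv6.conf.default.disable_ipv6 = 0
net.ipv6.conf.lo.disable_ipv6 = 0
net.ipv6.ip_nonlocal_bind = 1
EOF
# Applied in one shot, as the log shows, by restarting systemd-sysctl.service
# (equivalent to `sysctl -p /etc/sysctl.d/99-tripleo.conf` for this file).
wc -l < "$f"
```

Batching the writes and applying once keeps the node from flapping through intermediate states; `ip_nonlocal_bind=1` and the reserved ports (35357, 49000-49001) are typical for hosts that may carry HAProxy-managed VIPs and fixed service ports.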
Dec 15 02:54:44 localhost python3[43543]: ansible-file Invoked with mode=0750 path=/var/log/containers/metrics_qdr setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Dec 15 02:54:44 localhost python3[43559]: ansible-file Invoked with path=/var/lib/metrics_qdr setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Dec 15 02:54:45 localhost python3[43575]: ansible-file Invoked with mode=0750 path=/var/log/containers/nova setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Dec 15 02:54:45 localhost python3[43591]: ansible-stat Invoked with path=/var/lib/nova/instances follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Dec 15 02:54:45 localhost python3[43607]: ansible-file Invoked with path=/var/lib/nova setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Dec 15 02:54:46 localhost python3[43623]: ansible-file Invoked with path=/var/lib/_nova_secontext setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Dec 15 02:54:46 localhost python3[43639]: ansible-file Invoked with path=/var/lib/nova/instances setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Dec 15 02:54:46 localhost python3[43655]: ansible-file Invoked with path=/var/lib/libvirt setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Dec 15 02:54:47 localhost python3[43671]: ansible-file Invoked with path=/etc/tmpfiles.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 15 02:54:47 localhost python3[43719]: ansible-ansible.legacy.stat Invoked with path=/etc/tmpfiles.d/run-nova.conf follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Dec 15 02:54:47 localhost python3[43762]: ansible-ansible.legacy.copy Invoked with dest=/etc/tmpfiles.d/run-nova.conf src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1765785287.2161193-75676-201190502955759/source _original_basename=tmp1h2w6jxp follow=False checksum=f834349098718ec09c7562bcb470b717a83ff411 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 15 02:54:48 localhost python3[43792]: ansible-ansible.legacy.command Invoked with _raw_params=systemd-tmpfiles --create _uses_shell=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 15 02:54:49 localhost python3[43809]: ansible-file Invoked with path=/var/lib/tripleo-config/ceph state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 15 02:54:49 localhost python3[43857]: ansible-ansible.legacy.stat Invoked with path=/var/lib/nova/delay-nova-compute follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Dec 15 02:54:50 localhost python3[43900]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/nova/delay-nova-compute mode=493 src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1765785289.620018-75768-94313720919379/source _original_basename=tmpsk541nac follow=False checksum=f07ad3e8cf3766b3b3b07ae8278826a0ef3bb5e3 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 15 02:54:50 localhost python3[43930]: ansible-file Invoked with mode=0750 path=/var/log/containers/libvirt setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Dec 15 02:54:51 localhost python3[43946]: ansible-file Invoked with path=/etc/libvirt setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Dec 15 02:54:51 localhost python3[43962]: ansible-file Invoked with path=/etc/libvirt/secrets setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Dec 15 02:54:52 localhost python3[43978]: ansible-file Invoked with path=/etc/libvirt/qemu setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Dec 15 02:54:52 localhost python3[43994]: ansible-file Invoked with path=/var/lib/libvirt setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Dec 15 02:54:52 localhost python3[44010]: ansible-file Invoked with path=/var/cache/libvirt state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 15 02:54:52 localhost python3[44026]: ansible-file Invoked with path=/var/lib/nova setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Dec 15 02:54:53 localhost python3[44042]: ansible-file Invoked with path=/run/libvirt state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 15 02:54:53 localhost python3[44058]: ansible-file Invoked with mode=0770 path=/var/log/containers/libvirt/swtpm setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Dec 15 02:54:54 localhost python3[44074]: ansible-group Invoked with gid=107 name=qemu state=present system=False local=False non_unique=False
Dec 15 02:54:54 localhost python3[44096]: ansible-user Invoked with comment=qemu user group=qemu name=qemu shell=/sbin/nologin state=present uid=107 non_unique=False force=False remove=False create_home=True system=False move_home=False append=False ssh_key_bits=0 ssh_key_type=rsa ssh_key_comment=ansible-generated on np0005559462.localdomain update_password=always groups=None home=None password=NOT_LOGGING_PARAMETER login_class=None password_expire_max=None password_expire_min=None hidden=None seuser=None skeleton=None generate_ssh_key=None ssh_key_file=None ssh_key_passphrase=NOT_LOGGING_PARAMETER expires=None password_lock=None local=None profile=None authorization=None role=None umask=None
Dec 15 02:54:55 localhost python3[44120]: ansible-file Invoked with group=qemu owner=qemu path=/var/lib/vhost_sockets setype=virt_cache_t seuser=system_u state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None serole=None selevel=None attributes=None
Dec 15 02:54:55 localhost python3[44136]: ansible-ansible.legacy.command Invoked with _raw_params=/usr/bin/rpm -q libvirt-daemon _uses_shell=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 15 02:54:56 localhost python3[44185]: ansible-ansible.legacy.stat Invoked with path=/etc/tmpfiles.d/run-libvirt.conf follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Dec 15 02:54:56 localhost python3[44228]: ansible-ansible.legacy.copy Invoked with dest=/etc/tmpfiles.d/run-libvirt.conf src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1765785295.8984451-76102-15913644731034/source _original_basename=tmpgvy_evd5 follow=False checksum=57f3ff94c666c6aae69ae22e23feb750cf9e8b13 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 15 02:54:56 localhost python3[44258]: ansible-seboolean Invoked with name=os_enable_vtpm persistent=True state=True ignore_selinux_state=False
Dec 15 02:54:57 localhost dbus-broker-launch[755]: avc: op=load_policy lsm=selinux seqno=11 res=1
Dec 15 02:54:58 localhost python3[44278]: ansible-file Invoked with path=/etc/crypto-policies/local.d/gnutls-qemu.config state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 15 02:54:58 localhost python3[44294]: ansible-file Invoked with path=/run/libvirt setype=virt_var_run_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Dec 15 02:54:58 localhost python3[44310]: ansible-seboolean Invoked with name=logrotate_read_inside_containers persistent=True state=True ignore_selinux_state=False
Dec 15 02:55:00 localhost dbus-broker-launch[755]: avc: op=load_policy lsm=selinux seqno=12 res=1
Dec 15 02:55:00 localhost python3[44330]: ansible-ansible.legacy.dnf Invoked with name=['podman'] state=present allow_downgrade=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 allowerasing=False nobest=False conf_file=None disable_excludes=None download_dir=None list=None releasever=None
Dec 15 02:55:03 localhost python3[44347]: ansible-setup Invoked with gather_subset=['!all', '!min', 'network'] filter=['ansible_interfaces'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Dec 15 02:55:04 localhost python3[44408]: ansible-file Invoked with path=/etc/containers/networks state=directory recurse=True mode=493 owner=root group=root force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 15 02:55:04 localhost python3[44424]: ansible-ansible.legacy.command Invoked with _raw_params=podman network inspect podman#012 _uses_shell=True stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 15 02:55:05 localhost python3[44484]: ansible-ansible.legacy.stat Invoked with path=/etc/containers/networks/podman.json follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Dec 15 02:55:05 localhost python3[44527]: ansible-ansible.legacy.copy Invoked with src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1765785304.5735762-76379-118120825200018/source dest=/etc/containers/networks/podman.json mode=0644 owner=root group=root follow=False _original_basename=podman_network_config.j2 checksum=673f4a9ed48471ad56b066f7c680abac526c9b0c backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 15 02:55:05 localhost ceph-osd[31375]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING
STATS ------- Dec 15 02:55:05 localhost ceph-osd[31375]: rocksdb: [db/db_impl/db_impl.cc:1111] #012** DB Stats **#012Uptime(secs): 600.1 total, 600.0 interval#012Cumulative writes: 3398 writes, 16K keys, 3398 commit groups, 1.0 writes per commit group, ingest: 0.01 GB, 0.03 MB/s#012Cumulative WAL: 3398 writes, 202 syncs, 16.82 writes per sync, written: 0.01 GB, 0.03 MB/s#012Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent#012Interval writes: 3398 writes, 16K keys, 3398 commit groups, 1.0 writes per commit group, ingest: 15.31 MB, 0.03 MB/s#012Interval WAL: 3398 writes, 202 syncs, 16.82 writes per sync, written: 0.01 GB, 0.03 MB/s#012Interval stall: 00:00:0.000 H:M:S, 0.0 percent#012#012** Compaction Stats [default] **#012Level Files Size Score Read(GB) Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012 L0 2/0 2.61 KB 0.2 0.0 0.0 0.0 0.0 0.0 0.0 1.0 0.0 0.2 0.01 0.00 1 0.007 0 0 0.0 0.0#012 Sum 2/0 2.61 KB 0.0 0.0 0.0 0.0 0.0 0.0 0.0 1.0 0.0 0.2 0.01 0.00 1 0.007 0 0 0.0 0.0#012 Int 0/0 0.00 KB 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.00 0.00 0 0.000 0 0 0.0 0.0#012#012** Compaction Stats [default] **#012Priority Files Size Score Read(GB) Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012User 0/0 0.00 KB 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.2 0.01 0.00 1 0.007 0 0 0.0 0.0#012#012Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0#012#012Uptime(secs): 
600.1 total, 600.0 interval#012Flush(GB): cumulative 0.000, interval 0.000#012AddFile(GB): cumulative 0.000, interval 0.000#012AddFile(Total Files): cumulative 0, interval 0#012AddFile(L0 Files): cumulative 0, interval 0#012AddFile(Keys): cumulative 0, interval 0#012Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds#012Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds#012Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count#012Block cache BinnedLRUCache@0x5568990c62d0#2 capacity: 1.62 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 2 last_copies: 8 last_secs: 3.7e-05 secs_since: 0#012Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,8.34465e-05%) FilterBlock(3,0.33 KB,1.92569e-05%) IndexBlock(3,0.34 KB,2.01739e-05%) Misc(1,0.00 KB,0%)#012#012** File Read Latency Histogram By Level [default] **#012#012** Compaction Stats [m-0] **#012Level Files Size Score Read(GB) Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012 Sum 0/0 0.00 KB 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.00 0.00 0 0.000 0 0 0.0 0.0#012 Int 0/0 0.00 KB 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.00 0.00 0 0.000 0 0 0.0 0.0#012#012** Compaction Stats [m-0] **#012Priority Files Size Score Read(GB) Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) 
Wblob(GB)#012---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012#012Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0#012#012Uptime(secs): 600.1 total, 600.0 interval#012Flush(GB): cumulative 0.000, interval 0.000#012AddFile(GB): cumulative 0.000, interval 0.000#012AddFile(Total Files): cumulative 0, interval 0#012AddFile(L0 Files): cumulative 0, interval 0#012AddFile(Keys): cumulative 0, interval 0#012Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds#012Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds#012Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count#012Block cache BinnedLRUCache@0x5568990c62d0#2 capacity: 1.62 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 2 last_copies: 8 last_secs: 3.7e-05 secs_since: 0#012Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,8.34465e-05%) FilterBlock(3,0.33 KB,1.92569e-05%) IndexBlock(3,0.34 KB,2.01739e-05%) Misc(1,0.00 KB,0%)#012#012** File Read Latency Histogram By Level [m-0] **#012#012** Compaction Stats [m-1] **#012Level Files Size Score Read(GB) Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012 Sum 0/0 0.00 KB 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.00 0.00 0 0.000 0 0 0.0 0.0#012 Int 0/0 0.00 KB 0.0 0.0 
0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.00 0.00 0 0.000 0 0 0.0 0.0#012#012** Compaction Stats [m-1] **#012Priority Files Size Score Read(GB) Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012#012Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0#012#012Uptime(secs): 600.1 total, 600.0 interval#012Flush(GB): cumulative 0.000, interval 0.000#012AddFile(GB): cumulative 0.000, interval 0.000#012AddFile(Total Files): cumulative 0, interval 0#012AddFile(L0 Files): cumulative 0, interval 0#012AddFile(Keys): cumulative 0, interval 0#012Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds#012Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds#012Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memt
Dec 15 02:55:05 localhost python3[44589]: ansible-ansible.legacy.stat Invoked with path=/etc/containers/registries.conf follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Dec 15 02:55:06 localhost python3[44634]: ansible-ansible.legacy.copy Invoked with src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1765785305.5950963-76417-176578087682793/source dest=/etc/containers/registries.conf owner=root group=root setype=etc_t mode=0644 follow=False _original_basename=registries.conf.j2 checksum=710a00cfb11a4c3eba9c028ef1984a9fea9ba83a backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None
local_follow=None seuser=None serole=None selevel=None attributes=None
Dec 15 02:55:06 localhost python3[44664]: ansible-ini_file Invoked with path=/etc/containers/containers.conf owner=root group=root setype=etc_t mode=0644 create=True section=containers option=pids_limit value=4096 backup=False state=present exclusive=True no_extra_spaces=False allow_no_value=False unsafe_writes=False values=None seuser=None serole=None selevel=None attributes=None
Dec 15 02:55:07 localhost python3[44680]: ansible-ini_file Invoked with path=/etc/containers/containers.conf owner=root group=root setype=etc_t mode=0644 create=True section=engine option=events_logger value="journald" backup=False state=present exclusive=True no_extra_spaces=False allow_no_value=False unsafe_writes=False values=None seuser=None serole=None selevel=None attributes=None
Dec 15 02:55:07 localhost python3[44696]: ansible-ini_file Invoked with path=/etc/containers/containers.conf owner=root group=root setype=etc_t mode=0644 create=True section=engine option=runtime value="crun" backup=False state=present exclusive=True no_extra_spaces=False allow_no_value=False unsafe_writes=False values=None seuser=None serole=None selevel=None attributes=None
Dec 15 02:55:07 localhost python3[44712]: ansible-ini_file Invoked with path=/etc/containers/containers.conf owner=root group=root setype=etc_t mode=0644 create=True section=network option=network_backend value="netavark" backup=False state=present exclusive=True no_extra_spaces=False allow_no_value=False unsafe_writes=False values=None seuser=None serole=None selevel=None attributes=None
Dec 15 02:55:08 localhost python3[44760]: ansible-ansible.legacy.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Dec 15 02:55:08 localhost python3[44803]: ansible-ansible.legacy.copy Invoked with dest=/etc/sysconfig/podman_drop_in src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1765785308.0729403-76778-256406930055097/source _original_basename=tmpce7k761o follow=False checksum=0bfbc70e9a4740c9004b9947da681f723d529c83 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 15 02:55:09 localhost python3[44833]: ansible-file Invoked with mode=0750 path=/var/log/containers/rsyslog setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Dec 15 02:55:09 localhost python3[44849]: ansible-file Invoked with path=/var/lib/rsyslog.container setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Dec 15 02:55:10 localhost ceph-osd[32311]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Dec 15 02:55:10 localhost ceph-osd[32311]: rocksdb: [db/db_impl/db_impl.cc:1111] #012** DB Stats **#012Uptime(secs): 600.2 total, 600.0 interval#012Cumulative writes: 3249 writes, 16K keys, 3249 commit groups, 1.0 writes per commit group, ingest: 0.01 GB, 0.02 MB/s#012Cumulative WAL: 3249 writes, 140 syncs, 23.21 writes per sync, written: 0.01 GB, 0.02 MB/s#012Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent#012Interval writes: 3249 writes, 16K keys, 3249 commit groups, 1.0 writes per commit group, ingest: 14.64 MB, 0.02 MB/s#012Interval WAL: 3249 writes, 140
syncs, 23.21 writes per sync, written: 0.01 GB, 0.02 MB/s#012Interval stall: 00:00:0.000 H:M:S, 0.0 percent#012#012** Compaction Stats [default] **#012Level Files Size Score Read(GB) Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012 L0 2/0 2.61 KB 0.2 0.0 0.0 0.0 0.0 0.0 0.0 1.0 0.0 0.3 0.00 0.00 1 0.005 0 0 0.0 0.0#012 Sum 2/0 2.61 KB 0.0 0.0 0.0 0.0 0.0 0.0 0.0 1.0 0.0 0.3 0.00 0.00 1 0.005 0 0 0.0 0.0#012 Int 0/0 0.00 KB 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.00 0.00 0 0.000 0 0 0.0 0.0#012#012** Compaction Stats [default] **#012Priority Files Size Score Read(GB) Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012User 0/0 0.00 KB 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.3 0.00 0.00 1 0.005 0 0 0.0 0.0#012#012Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0#012#012Uptime(secs): 600.2 total, 600.0 interval#012Flush(GB): cumulative 0.000, interval 0.000#012AddFile(GB): cumulative 0.000, interval 0.000#012AddFile(Total Files): cumulative 0, interval 0#012AddFile(L0 Files): cumulative 0, interval 0#012AddFile(Keys): cumulative 0, interval 0#012Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds#012Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds#012Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 
level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count#012Block cache BinnedLRUCache@0x564edfcfa2d0#2 capacity: 1.62 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 2 last_copies: 8 last_secs: 3.3e-05 secs_since: 0#012Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,8.34465e-05%) FilterBlock(3,0.33 KB,1.92569e-05%) IndexBlock(3,0.34 KB,2.01739e-05%) Misc(1,0.00 KB,0%)#012#012** File Read Latency Histogram By Level [default] **#012#012** Compaction Stats [m-0] **#012Level Files Size Score Read(GB) Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012 Sum 0/0 0.00 KB 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.00 0.00 0 0.000 0 0 0.0 0.0#012 Int 0/0 0.00 KB 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.00 0.00 0 0.000 0 0 0.0 0.0#012#012** Compaction Stats [m-0] **#012Priority Files Size Score Read(GB) Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012#012Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0#012#012Uptime(secs): 600.2 total, 600.0 interval#012Flush(GB): cumulative 0.000, interval 0.000#012AddFile(GB): cumulative 0.000, interval 0.000#012AddFile(Total Files): cumulative 0, interval 0#012AddFile(L0 Files): cumulative 0, interval 0#012AddFile(Keys): cumulative 0, interval 
0#012Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds#012Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds#012Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count#012Block cache BinnedLRUCache@0x564edfcfa2d0#2 capacity: 1.62 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 2 last_copies: 8 last_secs: 3.3e-05 secs_since: 0#012Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,8.34465e-05%) FilterBlock(3,0.33 KB,1.92569e-05%) IndexBlock(3,0.34 KB,2.01739e-05%) Misc(1,0.00 KB,0%)#012#012** File Read Latency Histogram By Level [m-0] **#012#012** Compaction Stats [m-1] **#012Level Files Size Score Read(GB) Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012 Sum 0/0 0.00 KB 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.00 0.00 0 0.000 0 0 0.0 0.0#012 Int 0/0 0.00 KB 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.00 0.00 0 0.000 0 0 0.0 0.0#012#012** Compaction Stats [m-1] **#012Priority Files Size Score Read(GB) Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012#012Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 
0.0#012#012Uptime(secs): 600.2 total, 600.0 interval#012Flush(GB): cumulative 0.000, interval 0.000#012AddFile(GB): cumulative 0.000, interval 0.000#012AddFile(Total Files): cumulative 0, interval 0#012AddFile(L0 Files): cumulative 0, interval 0#012AddFile(Keys): cumulative 0, interval 0#012Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds#012Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds#012Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memt
Dec 15 02:55:10 localhost python3[44865]: ansible-ansible.legacy.dnf Invoked with name=['openssh-server'] state=present allow_downgrade=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 allowerasing=False nobest=False conf_file=None disable_excludes=None download_dir=None list=None releasever=None
Dec 15 02:55:13 localhost python3[44914]: ansible-ansible.legacy.stat Invoked with path=/etc/ssh/sshd_config follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Dec 15 02:55:14 localhost python3[44959]: ansible-ansible.legacy.copy Invoked with dest=/etc/ssh/sshd_config src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1765785313.3783321-77051-216467922741561/source validate=/usr/sbin/sshd -T -f %s mode=None follow=False _original_basename=sshd_config_block.j2 checksum=913c99ed7d5c33615bfb07a6792a4ef143dcfd2b backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER directory_mode=None remote_src=None
local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Dec 15 02:55:14 localhost python3[44990]: ansible-systemd Invoked with name=sshd state=restarted enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None Dec 15 02:55:14 localhost systemd[1]: Stopping OpenSSH server daemon... Dec 15 02:55:14 localhost systemd[1]: sshd.service: Deactivated successfully. Dec 15 02:55:14 localhost systemd[1]: Stopped OpenSSH server daemon. Dec 15 02:55:14 localhost systemd[1]: sshd.service: Consumed 2.018s CPU time, read 2.1M from disk, written 0B to disk. Dec 15 02:55:14 localhost systemd[1]: Stopped target sshd-keygen.target. Dec 15 02:55:14 localhost systemd[1]: Stopping sshd-keygen.target... Dec 15 02:55:14 localhost systemd[1]: OpenSSH ecdsa Server Key Generation was skipped because of an unmet condition check (ConditionPathExists=!/run/systemd/generator.early/multi-user.target.wants/cloud-init.target). Dec 15 02:55:14 localhost systemd[1]: OpenSSH ed25519 Server Key Generation was skipped because of an unmet condition check (ConditionPathExists=!/run/systemd/generator.early/multi-user.target.wants/cloud-init.target). Dec 15 02:55:14 localhost systemd[1]: OpenSSH rsa Server Key Generation was skipped because of an unmet condition check (ConditionPathExists=!/run/systemd/generator.early/multi-user.target.wants/cloud-init.target). Dec 15 02:55:14 localhost systemd[1]: Reached target sshd-keygen.target. Dec 15 02:55:14 localhost systemd[1]: Starting OpenSSH server daemon... Dec 15 02:55:14 localhost sshd[44994]: main: sshd: ssh-rsa algorithm is disabled Dec 15 02:55:14 localhost systemd[1]: Started OpenSSH server daemon. 
Dec 15 02:55:15 localhost python3[45010]: ansible-ansible.legacy.command Invoked with _raw_params=systemctl is-active ntpd.service || systemctl is-enabled ntpd.service _uses_shell=True stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None Dec 15 02:55:16 localhost python3[45028]: ansible-ansible.legacy.command Invoked with _raw_params=systemctl is-active ntpd.service || systemctl is-enabled ntpd.service _uses_shell=True stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None Dec 15 02:55:16 localhost python3[45046]: ansible-ansible.legacy.dnf Invoked with name=['chrony'] state=present allow_downgrade=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 allowerasing=False nobest=False conf_file=None disable_excludes=None download_dir=None list=None releasever=None Dec 15 02:55:20 localhost python3[45095]: ansible-ansible.legacy.stat Invoked with path=/etc/chrony.conf follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True Dec 15 02:55:20 localhost python3[45113]: ansible-ansible.legacy.file Invoked with owner=root group=root mode=420 dest=/etc/chrony.conf _original_basename=chrony.conf.j2 recurse=False state=file path=/etc/chrony.conf force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None Dec 15 02:55:21 localhost python3[45143]: ansible-ansible.legacy.systemd Invoked with enabled=True name=chronyd state=started 
daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None Dec 15 02:55:21 localhost python3[45193]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/chrony-online.service follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True Dec 15 02:55:22 localhost python3[45211]: ansible-ansible.legacy.file Invoked with dest=/etc/systemd/system/chrony-online.service _original_basename=chrony-online.service recurse=False state=file path=/etc/systemd/system/chrony-online.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Dec 15 02:55:22 localhost python3[45241]: ansible-systemd Invoked with state=started name=chrony-online.service enabled=True daemon-reload=True daemon_reload=True daemon_reexec=False scope=system no_block=False force=None masked=None Dec 15 02:55:22 localhost systemd[1]: Reloading. Dec 15 02:55:22 localhost systemd-sysv-generator[45271]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. Dec 15 02:55:22 localhost systemd-rc-local-generator[45268]: /etc/rc.d/rc.local is not marked executable, skipping. Dec 15 02:55:22 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Dec 15 02:55:22 localhost systemd[1]: Starting chronyd online sources service... Dec 15 02:55:23 localhost chronyc[45280]: 200 OK Dec 15 02:55:23 localhost systemd[1]: chrony-online.service: Deactivated successfully. 
Dec 15 02:55:23 localhost systemd[1]: Finished chronyd online sources service. Dec 15 02:55:23 localhost python3[45296]: ansible-ansible.legacy.command Invoked with _raw_params=chronyc makestep _uses_shell=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None Dec 15 02:55:23 localhost chronyd[25720]: System clock was stepped by -0.000029 seconds Dec 15 02:55:23 localhost python3[45313]: ansible-ansible.legacy.command Invoked with _raw_params=chronyc waitsync 30 _uses_shell=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None Dec 15 02:55:24 localhost python3[45330]: ansible-ansible.legacy.command Invoked with _raw_params=chronyc makestep _uses_shell=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None Dec 15 02:55:24 localhost chronyd[25720]: System clock was stepped by 0.000000 seconds Dec 15 02:55:24 localhost python3[45347]: ansible-ansible.legacy.command Invoked with _raw_params=chronyc waitsync 30 _uses_shell=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None Dec 15 02:55:25 localhost python3[45364]: ansible-timezone Invoked with name=UTC hwclock=None Dec 15 02:55:25 localhost systemd[1]: Starting Time & Date Service... Dec 15 02:55:25 localhost systemd[1]: Started Time & Date Service. 
Dec 15 02:55:26 localhost python3[45384]: ansible-ansible.legacy.command Invoked with _raw_params=rpm -q --whatprovides tuned tuned-profiles-cpu-partitioning _uses_shell=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None Dec 15 02:55:26 localhost python3[45401]: ansible-ansible.legacy.command Invoked with _raw_params=which tuned-adm _uses_shell=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None Dec 15 02:55:27 localhost python3[45418]: ansible-slurp Invoked with src=/etc/tuned/active_profile Dec 15 02:55:27 localhost python3[45434]: ansible-stat Invoked with path=/etc/tuned/throughput-performance-variables.conf follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1 Dec 15 02:55:28 localhost python3[45450]: ansible-file Invoked with mode=0750 path=/var/log/containers/openvswitch setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None attributes=None Dec 15 02:55:28 localhost python3[45466]: ansible-file Invoked with path=/var/lib/openvswitch/ovn setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None Dec 15 02:55:29 localhost python3[45514]: ansible-ansible.legacy.stat Invoked with path=/usr/libexec/neutron-cleanup follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True Dec 15 02:55:29 
localhost python3[45557]: ansible-ansible.legacy.copy Invoked with dest=/usr/libexec/neutron-cleanup force=True mode=0755 src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1765785328.8742437-77918-151543153269745/source _original_basename=tmplaouzm6s follow=False checksum=f9cc7d1e91fbae49caa7e35eb2253bba146a73b4 backup=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Dec 15 02:55:30 localhost python3[45619]: ansible-ansible.legacy.stat Invoked with path=/usr/lib/systemd/system/neutron-cleanup.service follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True Dec 15 02:55:30 localhost python3[45662]: ansible-ansible.legacy.copy Invoked with dest=/usr/lib/systemd/system/neutron-cleanup.service force=True src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1765785329.855398-77979-173513438326367/source _original_basename=tmpf_g01bzi follow=False checksum=6b6cd9f074903a28d054eb530a10c7235d0c39fc backup=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Dec 15 02:55:31 localhost python3[45692]: ansible-ansible.legacy.systemd Invoked with enabled=True name=neutron-cleanup daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None masked=None Dec 15 02:55:31 localhost systemd[1]: Reloading. Dec 15 02:55:31 localhost systemd-rc-local-generator[45716]: /etc/rc.d/rc.local is not marked executable, skipping. Dec 15 02:55:31 localhost systemd-sysv-generator[45721]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. 
Please update package to include a native systemd unit file, in order to make it more safe and robust. Dec 15 02:55:31 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Dec 15 02:55:31 localhost python3[45746]: ansible-file Invoked with mode=0750 path=/var/log/containers/neutron setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None attributes=None Dec 15 02:55:32 localhost python3[45762]: ansible-ansible.legacy.command Invoked with _raw_params=ip netns add ns_temp _uses_shell=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None Dec 15 02:55:32 localhost python3[45779]: ansible-ansible.legacy.command Invoked with _raw_params=ip netns delete ns_temp _uses_shell=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None Dec 15 02:55:32 localhost systemd[1]: run-netns-ns_temp.mount: Deactivated successfully. 
Dec 15 02:55:32 localhost python3[45796]: ansible-file Invoked with path=/var/lib/neutron setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None Dec 15 02:55:33 localhost python3[45812]: ansible-file Invoked with path=/var/lib/neutron/kill_scripts state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Dec 15 02:55:33 localhost python3[45860]: ansible-ansible.legacy.stat Invoked with path=/var/lib/neutron/kill_scripts/haproxy-kill follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True Dec 15 02:55:34 localhost python3[45903]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/neutron/kill_scripts/haproxy-kill mode=493 src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1765785333.3605561-78179-147100974735147/source _original_basename=tmplv0v2xt5 follow=False checksum=2f369fbe8f83639cdfd4efc53e7feb4ee77d1ed7 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Dec 15 02:55:38 localhost systemd[35760]: Created slice User Background Tasks Slice. Dec 15 02:55:38 localhost systemd[35760]: Starting Cleanup of User's Temporary Files and Directories... Dec 15 02:55:38 localhost systemd[35760]: Finished Cleanup of User's Temporary Files and Directories. 
Dec 15 02:55:55 localhost systemd[1]: systemd-timedated.service: Deactivated successfully. Dec 15 02:55:58 localhost python3[46013]: ansible-file Invoked with path=/var/log/containers state=directory setype=container_file_t selevel=s0 mode=488 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None attributes=None Dec 15 02:55:59 localhost python3[46029]: ansible-file Invoked with path=/var/log/containers/stdouts state=directory selevel=s0 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None setype=None attributes=None Dec 15 02:55:59 localhost python3[46045]: ansible-file Invoked with path=/var/lib/tripleo-config state=directory setype=container_file_t selevel=s0 recurse=True force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None attributes=None Dec 15 02:55:59 localhost python3[46061]: ansible-file Invoked with path=/var/lib/container-startup-configs.json state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Dec 15 02:56:00 localhost python3[46077]: ansible-file Invoked with path=/var/lib/docker-container-startup-configs.json state=absent recurse=False force=False 
follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Dec 15 02:56:00 localhost python3[46093]: ansible-community.general.sefcontext Invoked with target=/var/lib/container-config-scripts(/.*)? setype=container_file_t state=present ignore_selinux_state=False ftype=a reload=True seuser=None selevel=None Dec 15 02:56:01 localhost kernel: SELinux: Converting 2707 SID table entries... Dec 15 02:56:01 localhost kernel: SELinux: policy capability network_peer_controls=1 Dec 15 02:56:01 localhost kernel: SELinux: policy capability open_perms=1 Dec 15 02:56:01 localhost kernel: SELinux: policy capability extended_socket_class=1 Dec 15 02:56:01 localhost kernel: SELinux: policy capability always_check_network=0 Dec 15 02:56:01 localhost kernel: SELinux: policy capability cgroup_seclabel=1 Dec 15 02:56:01 localhost kernel: SELinux: policy capability nnp_nosuid_transition=1 Dec 15 02:56:01 localhost kernel: SELinux: policy capability genfs_seclabel_symlinks=1 Dec 15 02:56:01 localhost dbus-broker-launch[755]: avc: op=load_policy lsm=selinux seqno=13 res=1 Dec 15 02:56:01 localhost python3[46115]: ansible-file Invoked with path=/var/lib/container-config-scripts state=directory setype=container_file_t recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None Dec 15 02:56:03 localhost python3[46252]: ansible-container_startup_config Invoked with config_base_dir=/var/lib/tripleo-config/container-startup-config config_data={'step_1': {'metrics_qdr': {'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 
'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, 'metrics_qdr_init_logs': {'command': ['/bin/bash', '-c', 'chown -R qdrouterd:qdrouterd /var/log/qdrouterd'], 'detach': False, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'none', 'privileged': False, 'start_order': 0, 'user': 'root', 'volumes': ['/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}}, 'step_2': {'create_haproxy_wrapper': {'command': ['/container_puppet_apply.sh', '4', 'file', 'include ::tripleo::profile::base::neutron::ovn_metadata_agent_wrappers'], 'detach': False, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'start_order': 1, 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/container-config-scripts/container_puppet_apply.sh:/container_puppet_apply.sh:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z']}, 'create_virtlogd_wrapper': {'cgroupns': 'host', 'command': ['/container_puppet_apply.sh', '4', 'file', 'include ::tripleo::profile::base::nova::virtlogd_wrapper'], 'detach': False, 'environment': {'TRIPLEO_DEPLOY_IDENTIFIER': '1765784752'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'start_order': 1, 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/container-config-scripts/container_puppet_apply.sh:/container_puppet_apply.sh:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/container-config-scripts:/var/lib/container-config-scripts:shared,z']}, 'nova_compute_init_log': {'command': ['/bin/bash', '-c', 'chown -R nova:nova /var/log/nova'], 'environment': {'TRIPLEO_DEPLOY_IDENTIFIER': '1765784752'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'none', 'privileged': False, 'user': 'root', 'volumes': ['/var/log/containers/nova:/var/log/nova:z']}, 'nova_virtqemud_init_logs': {'command': ['/bin/bash', '-c', 'chown -R tss:tss /var/log/swtpm'], 'environment': {'TRIPLEO_DEPLOY_IDENTIFIER': '1765784752'}, 'image': 
'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'none', 'privileged': True, 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'user': 'root', 'volumes': ['/var/log/containers/libvirt/swtpm:/var/log/swtpm:shared,z']}}, 'step_3': {'ceilometer_init_log': {'command': ['/bin/bash', '-c', 'chown -R ceilometer:ceilometer /var/log/ceilometer'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'none', 'start_order': 0, 'user': 'root', 'volumes': ['/var/log/containers/ceilometer:/var/log/ceilometer:z']}, 'collectd': {'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, 'iscsid': {'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 
'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, 'nova_statedir_owner': {'command': '/container-config-scripts/pyshim.sh /container-config-scripts/nova_statedir_ownership.py', 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': 'triliovault-mounts', 'TRIPLEO_DEPLOY_IDENTIFIER': '1765784752', '__OS_DEBUG': 'true'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'none', 'privileged': False, 'security_opt': ['label=disable'], 'user': 'root', 'volumes': ['/var/lib/nova:/var/lib/nova:shared', '/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', '/var/lib/container-config-scripts:/container-config-scripts:z']}, 'nova_virtlogd_wrapper': {'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 0, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': [ Dec 15 02:56:03 localhost rsyslogd[759]: message 
too long (31243) with configured size 8096, begin of message is: ansible-container_startup_config Invoked with config_base_dir=/var/lib/tripleo-c [v8.2102.0-111.el9 try https://www.rsyslog.com/e/2445 ] Dec 15 02:56:03 localhost python3[46268]: ansible-file Invoked with path=/var/lib/kolla/config_files state=directory setype=container_file_t selevel=s0 recurse=True force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None attributes=None Dec 15 02:56:04 localhost python3[46284]: ansible-file Invoked with path=/var/lib/config-data mode=493 state=directory setype=container_file_t selevel=s0 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None attributes=None Dec 15 02:56:04 localhost python3[46300]: ansible-tripleo_container_configs Invoked with config_data={'/var/lib/kolla/config_files/ceilometer-agent-ipmi.json': {'command': '/usr/bin/ceilometer-polling --polling-namespaces ipmi --logfile /var/log/ceilometer/ipmi.log', 'config_files': [{'dest': '/', 'merge': True, 'preserve_properties': True, 'source': '/var/lib/kolla/config_files/src/*'}]}, '/var/lib/kolla/config_files/ceilometer_agent_compute.json': {'command': '/usr/bin/ceilometer-polling --polling-namespaces compute --logfile /var/log/ceilometer/compute.log', 'config_files': [{'dest': '/', 'merge': True, 'preserve_properties': True, 'source': '/var/lib/kolla/config_files/src/*'}]}, '/var/lib/kolla/config_files/collectd.json': {'command': '/usr/sbin/collectd -f', 'config_files': [{'dest': '/', 'merge': True, 'preserve_properties': True, 'source': '/var/lib/kolla/config_files/src/*'}, {'dest': '/etc/', 
'merge': False, 'preserve_properties': True, 'source': '/var/lib/kolla/config_files/src/etc/collectd.d'}], 'permissions': [{'owner': 'collectd:collectd', 'path': '/var/log/collectd', 'recurse': True}, {'owner': 'collectd:collectd', 'path': '/scripts', 'recurse': True}, {'owner': 'collectd:collectd', 'path': '/config-scripts', 'recurse': True}]}, '/var/lib/kolla/config_files/iscsid.json': {'command': '/usr/sbin/iscsid -f', 'config_files': [{'dest': '/etc/iscsi/', 'merge': True, 'preserve_properties': True, 'source': '/var/lib/kolla/config_files/src-iscsid/'}]}, '/var/lib/kolla/config_files/logrotate-crond.json': {'command': '/usr/sbin/crond -s -n', 'config_files': [{'dest': '/', 'merge': True, 'preserve_properties': True, 'source': '/var/lib/kolla/config_files/src/*'}]}, '/var/lib/kolla/config_files/metrics_qdr.json': {'command': '/usr/sbin/qdrouterd -c /etc/qpid-dispatch/qdrouterd.conf', 'config_files': [{'dest': '/', 'merge': True, 'preserve_properties': True, 'source': '/var/lib/kolla/config_files/src/*'}, {'dest': '/', 'merge': True, 'optional': True, 'preserve_properties': True, 'source': '/var/lib/kolla/config_files/src-tls/*'}], 'permissions': [{'owner': 'qdrouterd:qdrouterd', 'path': '/var/lib/qdrouterd', 'recurse': True}, {'optional': True, 'owner': 'qdrouterd:qdrouterd', 'path': '/etc/pki/tls/certs/metrics_qdr.crt'}, {'optional': True, 'owner': 'qdrouterd:qdrouterd', 'path': '/etc/pki/tls/private/metrics_qdr.key'}]}, '/var/lib/kolla/config_files/nova-migration-target.json': {'command': 'dumb-init --single-child -- /usr/sbin/sshd -D -p 2022', 'config_files': [{'dest': '/', 'merge': True, 'preserve_properties': True, 'source': '/var/lib/kolla/config_files/src/*'}, {'dest': '/etc/ssh/', 'owner': 'root', 'perm': '0600', 'source': '/host-ssh/ssh_host_*_key'}]}, '/var/lib/kolla/config_files/nova_compute.json': {'command': '/var/lib/nova/delay-nova-compute --delay 180 --nova-binary /usr/bin/nova-compute ', 'config_files': [{'dest': '/', 'merge': True, 
'preserve_properties': True, 'source': '/var/lib/kolla/config_files/src/*'}, {'dest': '/etc/iscsi/', 'merge': True, 'preserve_properties': True, 'source': '/var/lib/kolla/config_files/src-iscsid/*'}, {'dest': '/etc/ceph/', 'merge': True, 'preserve_properties': True, 'source': '/var/lib/kolla/config_files/src-ceph/'}], 'permissions': [{'owner': 'nova:nova', 'path': '/var/log/nova', 'recurse': True}, {'owner': 'nova:nova', 'path': '/etc/ceph/ceph.client.openstack.keyring', 'perm': '0600'}]}, '/var/lib/kolla/config_files/nova_compute_wait_for_compute_service.json': {'command': '/container-config-scripts/pyshim.sh /container-config-scripts/nova_wait_for_compute_service.py', 'config_files': [{'dest': '/', 'merge': True, 'preserve_properties': True, 'source': '/var/lib/kolla/config_files/src/*'}], 'permissions': [{'owner': 'nova:nova', 'path': '/var/log/nova', 'recurse': True}]}, '/var/lib/kolla/config_files/nova_virtlogd.json': {'command': '/usr/local/bin/virtlogd_wrapper', 'config_files': [{'dest': '/', 'merge': True, 'preserve_properties': True, 'source': '/var/lib/kolla/config_files/src/*'}, {'dest': '/etc/ceph/', 'merge': True, 'preserve_properties': True, 'source': '/var/lib/kolla/config_files/src-ceph/'}], 'permissions': [{'owner': 'nova:nova', 'path': '/etc/ceph/ceph.client.openstack.keyring', 'perm': '0600'}]}, '/var/lib/kolla/config_files/nova_virtnodedevd.json': {'command': '/usr/sbin/virtnodedevd --config /etc/libvirt/virtnodedevd.conf', 'config_files': [{'dest': '/', 'merge': True, 'preserve_properties': True, 'source': '/var/lib/kolla/config_files/src/*'}, {'dest': '/etc/ceph/', 'merge': True, 'preserve_properties': True, 'source': '/var/lib/kolla/config_files/src-ceph/'}], 'permissions': [{'owner': 'nova:nova', 'path': '/etc/ceph/ceph.client.openstack.keyring', 'perm': '0600'}]}, '/var/lib/kolla/config_files/nova_virtproxyd.json': {'command': '/usr/sbin/virtproxyd --config /etc/libvirt/virtproxyd.conf', 'config_files': [{'dest': '/', 'merge': True, 
'preserve_properties': True, 'source': '/var/lib/kolla/config_files/src/*'}, {'dest': '/etc/ceph/', 'merge': True, 'preserve_properties': True, 'source': '/var/lib/kolla/config_files/src-ceph/'}], 'permissions': [{'owner': 'nova:nova', 'path': '/etc/ceph/ceph.client.openstack.keyring', 'perm': '0600'}]}, '/var/lib/kolla/config_files/nova_virtqemud.json': {'command': '/usr/sbin/virtqemud --config /etc/libvirt/virtqemud.conf', 'config_files': [{'dest': '/', 'merge': True, 'preserve_properties': True, 'source': '/var/lib/kolla/config_files/src/*'}, {'dest': '/etc/ceph/', 'merge': True, 'preserve_properties': True, 'source': '/var/lib/kolla/config_files/src-ceph/'}], 'permissions': [{'owner': 'nova:nova', 'path': '/etc/ceph/ceph.client.openstack.keyring', 'perm': '0600'}]}, '/var/lib/kolla/config_files/nova_virtsecretd.json': {'command': '/usr/sbin/virtsecretd --config /etc/libvirt/virtsecretd.conf', 'config_files': [{'dest': '/', 'merge': True, 'preserve_properties': True, 'source': '/var/lib/kolla/config_files/src/*'}, {'dest': '/etc/ceph/', 'merge': True, 'preserve_properties': True, 'source': '/var/lib/kolla/config_files/src-ceph/'}], 'permissions': [{'owner': 'nova:nova', 'path': '/etc/ceph/ceph.client.openstack.keyring', 'perm': '0600'}]}, '/var/lib/kolla/config_files/nova_virtstoraged.json': {'command': '/usr/sbin/virtstoraged --config /etc/libvirt/virtstoraged.conf', 'config_files': [{'dest': '/', 'merge': True, 'preserve_properties': True, 'source': '/var/lib/kolla/config_files/src/*'}, {'dest': '/etc/ceph/', 'merge': True, 'preserve_properties': True, 'source': '/var/lib/kolla/config_files/src-ceph/'}], 'permissions': [{'owner': 'nova:nova', 'path': '/etc/ceph/ceph.client.openstack.keyring', 'perm': '0600'}]}, '/var/lib/kolla/config_files/ovn_controller.json': {'command': '/usr/bin/ovn-controller --pidfile --log-file unix:/run/openvswitch/db.sock ', 'permissions': [{'owner': 'root:root', 'path': '/var/log/openvswitch', 'recurse': True}, {'owner': 'root:root', 
'path': '/var/log/ovn', 'recurse': True}]}, '/var/lib/kolla/config_files/ovn_metadata_agent.json': {'command': '/usr/bin/networking-ovn-metadata-agent --config-file /etc/neutron/neutron.conf --config-file /etc/neutron/plugins/networking-ovn/networking-ovn-metadata-agent.ini --log-file=/var/log/neutron/ovn-metadata-agent.log', 'config_files': [{'dest': '/', 'merge': True, 'preserve_properties': True, 'source': '/var/lib/kolla/config_files/src/*'}], 'permissions': [{'owner': 'neutron:neutron', 'path': '/var/log/neutron', 'recurse': True}, {'owner': 'neutron:neutron', 'path': '/var/lib/neutron', 'recurse': True}, {'optional': True, 'owner': 'neutron:neutron', 'path': '/etc/pki/tls/certs/ovn_metadata.crt', 'perm': '0644'}, {'optional': True, 'owner': 'neutron:neutron', 'path': '/etc/pki/tls/private/ovn_metadata.key', 'perm': '0644'}]}, '/var/lib/kolla/config_files/rsyslog.json': {'command': '/usr/sbin/rsyslogd -n', 'config_files': [{'dest': '/', 'merge': True, 'preserve_properties': True, 'source': '/var/lib/kolla/config_files/src/*'}], 'permissions': [{'owner': 'root:root', 'path': '/var/lib/rsyslog', 'recurse': True}, {'owner': 'root:root', 'path': '/var/log/rsyslog', 'recurse': True}]}} Dec 15 02:56:09 localhost python3[46348]: ansible-ansible.legacy.stat Invoked with path=/etc/puppet/hieradata/config_step.json follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True Dec 15 02:56:10 localhost python3[46391]: ansible-ansible.legacy.copy Invoked with dest=/etc/puppet/hieradata/config_step.json force=True mode=0600 src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1765785369.6795099-79920-106159246774283/source _original_basename=tmpwh3zdqzt follow=False checksum=dfdcc7695edd230e7a2c06fc7b739bfa56506d8f backup=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Dec 15 
02:56:10 localhost python3[46421]: ansible-stat Invoked with path=/var/lib/tripleo-config/container-startup-config/step_1 follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1 Dec 15 02:56:12 localhost python3[46544]: ansible-file Invoked with path=/var/lib/container-puppet state=directory setype=container_file_t selevel=s0 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None attributes=None Dec 15 02:56:14 localhost python3[46665]: ansible-container_puppet_config Invoked with update_config_hash_only=True no_archive=True check_mode=False config_vol_prefix=/var/lib/config-data debug=False net_host=True puppet_config= short_hostname= step=6 Dec 15 02:56:16 localhost python3[46682]: ansible-ansible.legacy.command Invoked with _raw_params=rpm -q lvm2 _uses_shell=True stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None Dec 15 02:56:17 localhost python3[46699]: ansible-ansible.legacy.dnf Invoked with name=['systemd-container'] state=present allow_downgrade=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 allowerasing=False nobest=False conf_file=None disable_excludes=None download_dir=None list=None releasever=None Dec 15 02:56:21 localhost dbus-broker-launch[751]: Noticed file-system modification, trigger reload. Dec 15 02:56:21 localhost dbus-broker-launch[18416]: Noticed file-system modification, trigger reload. 
Dec 15 02:56:21 localhost dbus-broker-launch[751]: Noticed file-system modification, trigger reload. Dec 15 02:56:21 localhost dbus-broker-launch[18416]: Policy to allow eavesdropping in /usr/share/dbus-1/session.conf +31: Eavesdropping is deprecated and ignored Dec 15 02:56:21 localhost dbus-broker-launch[18416]: Policy to allow eavesdropping in /usr/share/dbus-1/session.conf +33: Eavesdropping is deprecated and ignored Dec 15 02:56:21 localhost systemd[1]: Reexecuting. Dec 15 02:56:21 localhost systemd[1]: systemd 252-14.el9_2.8 running in system mode (+PAM +AUDIT +SELINUX -APPARMOR +IMA +SMACK +SECCOMP +GCRYPT +GNUTLS +OPENSSL +ACL +BLKID +CURL +ELFUTILS -FIDO2 +IDN2 -IDN -IPTC +KMOD +LIBCRYPTSETUP +LIBFDISK +PCRE2 -PWQUALITY +P11KIT -QRENCODE +TPM2 +BZIP2 +LZ4 +XZ +ZLIB +ZSTD -BPF_FRAMEWORK +XKBCOMMON +UTMP +SYSVINIT default-hierarchy=unified) Dec 15 02:56:21 localhost systemd[1]: Detected virtualization kvm. Dec 15 02:56:21 localhost systemd[1]: Detected architecture x86-64. Dec 15 02:56:21 localhost systemd-rc-local-generator[46752]: /etc/rc.d/rc.local is not marked executable, skipping. Dec 15 02:56:21 localhost systemd-sysv-generator[46757]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. Dec 15 02:56:21 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Dec 15 02:56:29 localhost kernel: SELinux: Converting 2707 SID table entries... 
Dec 15 02:56:29 localhost kernel: SELinux: policy capability network_peer_controls=1 Dec 15 02:56:29 localhost kernel: SELinux: policy capability open_perms=1 Dec 15 02:56:29 localhost kernel: SELinux: policy capability extended_socket_class=1 Dec 15 02:56:29 localhost kernel: SELinux: policy capability always_check_network=0 Dec 15 02:56:29 localhost kernel: SELinux: policy capability cgroup_seclabel=1 Dec 15 02:56:29 localhost kernel: SELinux: policy capability nnp_nosuid_transition=1 Dec 15 02:56:29 localhost kernel: SELinux: policy capability genfs_seclabel_symlinks=1 Dec 15 02:56:29 localhost dbus-broker-launch[751]: Noticed file-system modification, trigger reload. Dec 15 02:56:29 localhost dbus-broker-launch[755]: avc: op=load_policy lsm=selinux seqno=14 res=1 Dec 15 02:56:29 localhost dbus-broker-launch[751]: Noticed file-system modification, trigger reload. Dec 15 02:56:31 localhost systemd[1]: Started /usr/bin/systemctl start man-db-cache-update. Dec 15 02:56:31 localhost systemd[1]: Starting man-db-cache-update.service... Dec 15 02:56:31 localhost systemd[1]: Reloading. Dec 15 02:56:31 localhost systemd-rc-local-generator[46831]: /etc/rc.d/rc.local is not marked executable, skipping. Dec 15 02:56:31 localhost systemd-sysv-generator[46835]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. Dec 15 02:56:31 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Dec 15 02:56:31 localhost systemd[1]: Started /usr/bin/systemctl start man-db-cache-update. 
Dec 15 02:56:31 localhost systemd[1]: Queuing reload/restart jobs for marked units… Dec 15 02:56:31 localhost systemd-journald[618]: Journal stopped Dec 15 02:56:31 localhost systemd-journald[618]: Received SIGTERM from PID 1 (systemd). Dec 15 02:56:31 localhost systemd[1]: Stopping Journal Service... Dec 15 02:56:31 localhost systemd[1]: Stopping Rule-based Manager for Device Events and Files... Dec 15 02:56:31 localhost systemd[1]: systemd-journald.service: Deactivated successfully. Dec 15 02:56:31 localhost systemd[1]: Stopped Journal Service. Dec 15 02:56:31 localhost systemd[1]: systemd-journald.service: Consumed 1.773s CPU time. Dec 15 02:56:31 localhost systemd[1]: Starting Journal Service... Dec 15 02:56:31 localhost systemd[1]: systemd-udevd.service: Deactivated successfully. Dec 15 02:56:31 localhost systemd[1]: Stopped Rule-based Manager for Device Events and Files. Dec 15 02:56:31 localhost systemd[1]: systemd-udevd.service: Consumed 3.047s CPU time. Dec 15 02:56:31 localhost systemd[1]: Starting Rule-based Manager for Device Events and Files... Dec 15 02:56:31 localhost systemd-journald[47230]: Journal started Dec 15 02:56:31 localhost systemd-journald[47230]: Runtime Journal (/run/log/journal/738a39f68bc78fb81032e509449fb759) is 12.1M, max 314.7M, 302.6M free. Dec 15 02:56:31 localhost systemd[1]: Started Journal Service. Dec 15 02:56:31 localhost systemd-journald[47230]: Field hash table of /run/log/journal/738a39f68bc78fb81032e509449fb759/system.journal has a fill level at 75.4 (251 of 333 items), suggesting rotation. Dec 15 02:56:31 localhost systemd-journald[47230]: /run/log/journal/738a39f68bc78fb81032e509449fb759/system.journal: Journal header limits reached or header out-of-date, rotating. Dec 15 02:56:31 localhost rsyslogd[759]: imjournal: journal files changed, reloading... [v8.2102.0-111.el9 try https://www.rsyslog.com/e/0 ] Dec 15 02:56:31 localhost systemd-udevd[47236]: Using default interface naming scheme 'rhel-9.0'. 
Dec 15 02:56:31 localhost systemd[1]: Started Rule-based Manager for Device Events and Files. Dec 15 02:56:31 localhost rsyslogd[759]: imjournal: journal files changed, reloading... [v8.2102.0-111.el9 try https://www.rsyslog.com/e/0 ] Dec 15 02:56:31 localhost systemd[1]: Reloading. Dec 15 02:56:31 localhost systemd-sysv-generator[47786]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. Dec 15 02:56:31 localhost systemd-rc-local-generator[47782]: /etc/rc.d/rc.local is not marked executable, skipping. Dec 15 02:56:31 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Dec 15 02:56:32 localhost systemd[1]: Queuing reload/restart jobs for marked units… Dec 15 02:56:32 localhost systemd[1]: man-db-cache-update.service: Deactivated successfully. Dec 15 02:56:32 localhost systemd[1]: Finished man-db-cache-update.service. Dec 15 02:56:32 localhost systemd[1]: man-db-cache-update.service: Consumed 1.275s CPU time. Dec 15 02:56:32 localhost systemd[1]: run-r6207361023aa4c26b8a2ebcc5b95e20f.service: Deactivated successfully. Dec 15 02:56:32 localhost systemd[1]: run-r62b90b6e2b034f05baa8e53eab024187.service: Deactivated successfully. 
Dec 15 02:56:33 localhost python3[48185]: ansible-sysctl Invoked with name=vm.unprivileged_userfaultfd reload=True state=present sysctl_file=/etc/sysctl.d/99-tripleo-postcopy.conf sysctl_set=True value=1 ignoreerrors=False Dec 15 02:56:34 localhost python3[48204]: ansible-ansible.legacy.command Invoked with _raw_params=systemctl is-active ksm.service || systemctl is-enabled ksm.service _uses_shell=True stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None Dec 15 02:56:35 localhost python3[48222]: ansible-containers.podman.podman_image Invoked with force=True name=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1 validate_certs=False tag=latest pull=True push=False state=present executable=podman build={'force_rm': False, 'format': 'oci', 'cache': True, 'rm': True, 'annotation': None, 'volume': None, 'extra_args': None} push_args={'compress': None, 'format': None, 'remove_signatures': None, 'sign_by': None, 'dest': None, 'transport': None} path=None auth_file=None username=None password=NOT_LOGGING_PARAMETER ca_cert_dir=None Dec 15 02:56:35 localhost python3[48222]: ansible-containers.podman.podman_image PODMAN-IMAGE-DEBUG: /bin/podman image ls registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1 --format json Dec 15 02:56:35 localhost python3[48222]: ansible-containers.podman.podman_image PODMAN-IMAGE-DEBUG: /bin/podman pull registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1 -q --tls-verify=false Dec 15 02:56:43 localhost podman[48235]: 2025-12-15 07:56:35.621064752 +0000 UTC m=+0.043328688 image pull registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1 Dec 15 02:56:43 localhost python3[48222]: ansible-containers.podman.podman_image PODMAN-IMAGE-DEBUG: /bin/podman inspect bac901955dcf7a32a493c6ef724c092009bbc18467858aa8c55e916b8c2b2b8f --format json Dec 15 02:56:43 localhost python3[48336]: ansible-containers.podman.podman_image Invoked with force=True 
name=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1 validate_certs=False tag=latest pull=True push=False state=present executable=podman build={'force_rm': False, 'format': 'oci', 'cache': True, 'rm': True, 'annotation': None, 'volume': None, 'extra_args': None} push_args={'compress': None, 'format': None, 'remove_signatures': None, 'sign_by': None, 'dest': None, 'transport': None} path=None auth_file=None username=None password=NOT_LOGGING_PARAMETER ca_cert_dir=None Dec 15 02:56:43 localhost python3[48336]: ansible-containers.podman.podman_image PODMAN-IMAGE-DEBUG: /bin/podman image ls registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1 --format json Dec 15 02:56:43 localhost python3[48336]: ansible-containers.podman.podman_image PODMAN-IMAGE-DEBUG: /bin/podman pull registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1 -q --tls-verify=false Dec 15 02:56:50 localhost podman[48349]: 2025-12-15 07:56:43.93164332 +0000 UTC m=+0.043807501 image pull registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1 Dec 15 02:56:50 localhost python3[48336]: ansible-containers.podman.podman_image PODMAN-IMAGE-DEBUG: /bin/podman inspect 44feaf8d87c1d40487578230316b622680576d805efdb45dfeea6aad464b41f1 --format json Dec 15 02:56:51 localhost podman[48544]: 2025-12-15 07:56:51.183184686 +0000 UTC m=+0.074304435 container exec 8dcda56b365b42dc8758aab77a9ec80db304780e449052738f7e4e648ae1ecaf (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-bce17446-41b5-5408-a23e-0b011906b44a-crash-np0005559462, io.openshift.expose-services=, description=Red Hat Ceph Storage 7, vendor=Red Hat, Inc., CEPH_POINT_RELEASE=, vcs-type=git, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, architecture=x86_64, io.buildah.version=1.41.4, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, GIT_BRANCH=main, io.k8s.description=Red Hat Ceph Storage 7, summary=Provides the latest Red Hat Ceph 
Storage 7 on RHEL 9 in a fully featured and supported base image., maintainer=Guillaume Abrioux , url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.component=rhceph-container, name=rhceph, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, RELEASE=main, GIT_REPO=https://github.com/ceph/ceph-container.git, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.openshift.tags=rhceph ceph, GIT_CLEAN=True, version=7, release=1763362218, build-date=2025-11-26T19:44:28Z, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, distribution-scope=public, ceph=True) Dec 15 02:56:51 localhost podman[48544]: 2025-12-15 07:56:51.2574791 +0000 UTC m=+0.148598859 container exec_died 8dcda56b365b42dc8758aab77a9ec80db304780e449052738f7e4e648ae1ecaf (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-bce17446-41b5-5408-a23e-0b011906b44a-crash-np0005559462, GIT_REPO=https://github.com/ceph/ceph-container.git, vendor=Red Hat, Inc., name=rhceph, io.openshift.tags=rhceph ceph, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_BRANCH=main, maintainer=Guillaume Abrioux , io.openshift.expose-services=, description=Red Hat Ceph Storage 7, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.buildah.version=1.41.4, CEPH_POINT_RELEASE=, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat Ceph Storage 7, GIT_CLEAN=True, version=7, vcs-type=git, build-date=2025-11-26T19:44:28Z, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, RELEASE=main, architecture=x86_64, distribution-scope=public, 
release=1763362218, ceph=True, com.redhat.component=rhceph-container) Dec 15 02:56:51 localhost python3[48565]: ansible-containers.podman.podman_image Invoked with force=True name=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1 validate_certs=False tag=latest pull=True push=False state=present executable=podman build={'force_rm': False, 'format': 'oci', 'cache': True, 'rm': True, 'annotation': None, 'volume': None, 'extra_args': None} push_args={'compress': None, 'format': None, 'remove_signatures': None, 'sign_by': None, 'dest': None, 'transport': None} path=None auth_file=None username=None password=NOT_LOGGING_PARAMETER ca_cert_dir=None Dec 15 02:56:51 localhost python3[48565]: ansible-containers.podman.podman_image PODMAN-IMAGE-DEBUG: /bin/podman image ls registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1 --format json Dec 15 02:56:51 localhost python3[48565]: ansible-containers.podman.podman_image PODMAN-IMAGE-DEBUG: /bin/podman pull registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1 -q --tls-verify=false Dec 15 02:56:56 localhost sshd[48756]: main: sshd: ssh-rsa algorithm is disabled Dec 15 02:57:12 localhost podman[48606]: 2025-12-15 07:56:51.389227709 +0000 UTC m=+0.041908480 image pull registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1 Dec 15 02:57:12 localhost python3[48565]: ansible-containers.podman.podman_image PODMAN-IMAGE-DEBUG: /bin/podman inspect 3a088c12511c977065fdc5f1594cba7b1a79f163578a6ffd0ac4a475b8e67938 --format json Dec 15 02:57:12 localhost python3[49596]: ansible-containers.podman.podman_image Invoked with force=True name=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1 validate_certs=False tag=latest pull=True push=False state=present executable=podman build={'force_rm': False, 'format': 'oci', 'cache': True, 'rm': True, 'annotation': None, 'volume': None, 'extra_args': None} push_args={'compress': None, 'format': None, 'remove_signatures': None, 'sign_by': None, 'dest': None, 'transport': 
None} path=None auth_file=None username=None password=NOT_LOGGING_PARAMETER ca_cert_dir=None Dec 15 02:57:12 localhost python3[49596]: ansible-containers.podman.podman_image PODMAN-IMAGE-DEBUG: /bin/podman image ls registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1 --format json Dec 15 02:57:12 localhost python3[49596]: ansible-containers.podman.podman_image PODMAN-IMAGE-DEBUG: /bin/podman pull registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1 -q --tls-verify=false Dec 15 02:57:23 localhost podman[49609]: 2025-12-15 07:57:12.554332093 +0000 UTC m=+0.039061594 image pull registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1 Dec 15 02:57:23 localhost python3[49596]: ansible-containers.podman.podman_image PODMAN-IMAGE-DEBUG: /bin/podman inspect 514d439186251360cf734cbc6d4a44c834664891872edf3798a653dfaacf10c0 --format json Dec 15 02:57:24 localhost python3[49686]: ansible-containers.podman.podman_image Invoked with force=True name=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1 validate_certs=False tag=latest pull=True push=False state=present executable=podman build={'force_rm': False, 'format': 'oci', 'cache': True, 'rm': True, 'annotation': None, 'volume': None, 'extra_args': None} push_args={'compress': None, 'format': None, 'remove_signatures': None, 'sign_by': None, 'dest': None, 'transport': None} path=None auth_file=None username=None password=NOT_LOGGING_PARAMETER ca_cert_dir=None Dec 15 02:57:24 localhost python3[49686]: ansible-containers.podman.podman_image PODMAN-IMAGE-DEBUG: /bin/podman image ls registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1 --format json Dec 15 02:57:24 localhost python3[49686]: ansible-containers.podman.podman_image PODMAN-IMAGE-DEBUG: /bin/podman pull registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1 -q --tls-verify=false Dec 15 02:57:27 localhost podman[49699]: 2025-12-15 07:57:24.465191088 +0000 UTC m=+0.043003150 image pull 
registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1 Dec 15 02:57:27 localhost python3[49686]: ansible-containers.podman.podman_image PODMAN-IMAGE-DEBUG: /bin/podman inspect a9dd7a2ac6f35cb086249f87f74e2f8e74e7e2ad5141ce2228263be6faedce26 --format json Dec 15 02:57:31 localhost python3[49790]: ansible-containers.podman.podman_image Invoked with force=True name=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1 validate_certs=False tag=latest pull=True push=False state=present executable=podman build={'force_rm': False, 'format': 'oci', 'cache': True, 'rm': True, 'annotation': None, 'volume': None, 'extra_args': None} push_args={'compress': None, 'format': None, 'remove_signatures': None, 'sign_by': None, 'dest': None, 'transport': None} path=None auth_file=None username=None password=NOT_LOGGING_PARAMETER ca_cert_dir=None Dec 15 02:57:31 localhost python3[49790]: ansible-containers.podman.podman_image PODMAN-IMAGE-DEBUG: /bin/podman image ls registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1 --format json Dec 15 02:57:31 localhost python3[49790]: ansible-containers.podman.podman_image PODMAN-IMAGE-DEBUG: /bin/podman pull registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1 -q --tls-verify=false Dec 15 02:57:37 localhost podman[49852]: 2025-12-15 07:57:31.185767154 +0000 UTC m=+0.044795417 image pull registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1 Dec 15 02:57:37 localhost python3[49790]: ansible-containers.podman.podman_image PODMAN-IMAGE-DEBUG: /bin/podman inspect 24976907b2c2553304119aba5731a800204d664feed24ca9eb7f2b4c7d81016b --format json Dec 15 02:57:37 localhost python3[50069]: ansible-containers.podman.podman_image Invoked with force=True name=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1 validate_certs=False tag=latest pull=True push=False state=present executable=podman build={'force_rm': False, 'format': 'oci', 'cache': True, 'rm': True, 'annotation': None, 'volume': None, 'extra_args': None} push_args={'compress': None, 
'format': None, 'remove_signatures': None, 'sign_by': None, 'dest': None, 'transport': None} path=None auth_file=None username=None password=NOT_LOGGING_PARAMETER ca_cert_dir=None Dec 15 02:57:37 localhost python3[50069]: ansible-containers.podman.podman_image PODMAN-IMAGE-DEBUG: /bin/podman image ls registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1 --format json Dec 15 02:57:37 localhost python3[50069]: ansible-containers.podman.podman_image PODMAN-IMAGE-DEBUG: /bin/podman pull registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1 -q --tls-verify=false Dec 15 02:57:39 localhost podman[50082]: 2025-12-15 07:57:37.950458422 +0000 UTC m=+0.045103306 image pull registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1 Dec 15 02:57:39 localhost python3[50069]: ansible-containers.podman.podman_image PODMAN-IMAGE-DEBUG: /bin/podman inspect 57163a7b21fdbb804a27897cb6e6052a5e5c7a339c45d663e80b52375a760dcf --format json Dec 15 02:57:40 localhost python3[50158]: ansible-containers.podman.podman_image Invoked with force=True name=registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1 validate_certs=False tag=latest pull=True push=False state=present executable=podman build={'force_rm': False, 'format': 'oci', 'cache': True, 'rm': True, 'annotation': None, 'volume': None, 'extra_args': None} push_args={'compress': None, 'format': None, 'remove_signatures': None, 'sign_by': None, 'dest': None, 'transport': None} path=None auth_file=None username=None password=NOT_LOGGING_PARAMETER ca_cert_dir=None Dec 15 02:57:40 localhost python3[50158]: ansible-containers.podman.podman_image PODMAN-IMAGE-DEBUG: /bin/podman image ls registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1 --format json Dec 15 02:57:40 localhost python3[50158]: ansible-containers.podman.podman_image PODMAN-IMAGE-DEBUG: /bin/podman pull registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1 -q --tls-verify=false Dec 15 02:57:42 localhost podman[50170]: 2025-12-15 07:57:40.331456297 +0000 UTC m=+0.041988284 image pull 
registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1 Dec 15 02:57:42 localhost python3[50158]: ansible-containers.podman.podman_image PODMAN-IMAGE-DEBUG: /bin/podman inspect 076d82a27d63c8328729ed27ceb4291585ae18d017befe6fe353df7aa11715ae --format json Dec 15 02:57:42 localhost python3[50248]: ansible-containers.podman.podman_image Invoked with force=True name=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1 validate_certs=False tag=latest pull=True push=False state=present executable=podman build={'force_rm': False, 'format': 'oci', 'cache': True, 'rm': True, 'annotation': None, 'volume': None, 'extra_args': None} push_args={'compress': None, 'format': None, 'remove_signatures': None, 'sign_by': None, 'dest': None, 'transport': None} path=None auth_file=None username=None password=NOT_LOGGING_PARAMETER ca_cert_dir=None Dec 15 02:57:42 localhost python3[50248]: ansible-containers.podman.podman_image PODMAN-IMAGE-DEBUG: /bin/podman image ls registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1 --format json Dec 15 02:57:42 localhost python3[50248]: ansible-containers.podman.podman_image PODMAN-IMAGE-DEBUG: /bin/podman pull registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1 -q --tls-verify=false Dec 15 02:57:45 localhost podman[50260]: 2025-12-15 07:57:42.684435475 +0000 UTC m=+0.032761773 image pull registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1 Dec 15 02:57:45 localhost python3[50248]: ansible-containers.podman.podman_image PODMAN-IMAGE-DEBUG: /bin/podman inspect d0dbcb95546840a8d088df044347a7877ad5ea45a2ddba0578e9bb5de4ab0da5 --format json Dec 15 02:57:45 localhost python3[50336]: ansible-containers.podman.podman_image Invoked with force=True name=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1 validate_certs=False tag=latest pull=True push=False state=present executable=podman build={'force_rm': False, 'format': 'oci', 'cache': True, 'rm': True, 'annotation': None, 'volume': None, 
'extra_args': None} push_args={'compress': None, 'format': None, 'remove_signatures': None, 'sign_by': None, 'dest': None, 'transport': None} path=None auth_file=None username=None password=NOT_LOGGING_PARAMETER ca_cert_dir=None Dec 15 02:57:45 localhost python3[50336]: ansible-containers.podman.podman_image PODMAN-IMAGE-DEBUG: /bin/podman image ls registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1 --format json Dec 15 02:57:45 localhost python3[50336]: ansible-containers.podman.podman_image PODMAN-IMAGE-DEBUG: /bin/podman pull registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1 -q --tls-verify=false Dec 15 02:57:49 localhost podman[50350]: 2025-12-15 07:57:45.761907934 +0000 UTC m=+0.045636399 image pull registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1 Dec 15 02:57:49 localhost python3[50336]: ansible-containers.podman.podman_image PODMAN-IMAGE-DEBUG: /bin/podman inspect e6e981540e553415b2d6eda490d7683db07164af2e7a0af8245623900338a4d6 --format json Dec 15 02:57:49 localhost python3[50440]: ansible-containers.podman.podman_image Invoked with force=True name=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1 validate_certs=False tag=latest pull=True push=False state=present executable=podman build={'force_rm': False, 'format': 'oci', 'cache': True, 'rm': True, 'annotation': None, 'volume': None, 'extra_args': None} push_args={'compress': None, 'format': None, 'remove_signatures': None, 'sign_by': None, 'dest': None, 'transport': None} path=None auth_file=None username=None password=NOT_LOGGING_PARAMETER ca_cert_dir=None Dec 15 02:57:49 localhost python3[50440]: ansible-containers.podman.podman_image PODMAN-IMAGE-DEBUG: /bin/podman image ls registry.redhat.io/rhosp-rhel9/openstack-cron:17.1 --format json Dec 15 02:57:49 localhost python3[50440]: ansible-containers.podman.podman_image PODMAN-IMAGE-DEBUG: /bin/podman pull registry.redhat.io/rhosp-rhel9/openstack-cron:17.1 -q --tls-verify=false Dec 15 02:57:51 localhost podman[50452]: 
2025-12-15 07:57:49.56566772 +0000 UTC m=+0.041911893 image pull registry.redhat.io/rhosp-rhel9/openstack-cron:17.1 Dec 15 02:57:51 localhost python3[50440]: ansible-containers.podman.podman_image PODMAN-IMAGE-DEBUG: /bin/podman inspect 87ee88cbf01fb42e0b22747072843bcca6130a90eda4de6e74b3ccd847bb4040 --format json Dec 15 02:57:52 localhost python3[50530]: ansible-stat Invoked with path=/var/lib/tripleo-config/container-startup-config/step_1 follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1 Dec 15 02:57:53 localhost ansible-async_wrapper.py[50765]: Invoked with 817992767663 3600 /home/tripleo-admin/.ansible/tmp/ansible-tmp-1765785473.4034476-82615-279243627817583/AnsiballZ_command.py _ Dec 15 02:57:53 localhost ansible-async_wrapper.py[50768]: Starting module and watcher Dec 15 02:57:53 localhost ansible-async_wrapper.py[50768]: Start watching 50769 (3600) Dec 15 02:57:53 localhost ansible-async_wrapper.py[50769]: Start module (50769) Dec 15 02:57:53 localhost ansible-async_wrapper.py[50765]: Return async_wrapper task started. Dec 15 02:57:54 localhost python3[50789]: ansible-ansible.legacy.async_status Invoked with jid=817992767663.50765 mode=status _async_dir=/tmp/.ansible_async Dec 15 02:57:57 localhost puppet-user[50780]: Warning: /etc/puppet/hiera.yaml: Use of 'hiera.yaml' version 3 is deprecated. It should be converted to version 5 Dec 15 02:57:57 localhost puppet-user[50780]: (file: /etc/puppet/hiera.yaml) Dec 15 02:57:57 localhost puppet-user[50780]: Warning: Undefined variable '::deploy_config_name'; Dec 15 02:57:57 localhost puppet-user[50780]: (file & line not available) Dec 15 02:57:57 localhost puppet-user[50780]: Warning: The function 'hiera' is deprecated in favor of using 'lookup'. 
See https://puppet.com/docs/puppet/7.10/deprecated_language.html Dec 15 02:57:57 localhost puppet-user[50780]: (file & line not available) Dec 15 02:57:57 localhost puppet-user[50780]: Warning: Unknown variable: '::deployment_type'. (file: /etc/puppet/modules/tripleo/manifests/profile/base/database/mysql/client.pp, line: 89, column: 8) Dec 15 02:57:57 localhost puppet-user[50780]: Warning: Unknown variable: '::deployment_type'. (file: /etc/puppet/modules/tripleo/manifests/packages.pp, line: 39, column: 69) Dec 15 02:57:57 localhost puppet-user[50780]: Notice: Compiled catalog for np0005559462.localdomain in environment production in 0.12 seconds Dec 15 02:57:57 localhost puppet-user[50780]: Notice: /Stage[main]/Tripleo::Profile::Base::Database::Mysql::Client/Exec[directory-create-etc-my.cnf.d]/returns: executed successfully Dec 15 02:57:57 localhost puppet-user[50780]: Notice: /Stage[main]/Tripleo::Profile::Base::Database::Mysql::Client/File[/etc/my.cnf.d/tripleo.cnf]/ensure: created Dec 15 02:57:57 localhost puppet-user[50780]: Notice: /Stage[main]/Tripleo::Profile::Base::Database::Mysql::Client/Augeas[tripleo-mysql-client-conf]/returns: executed successfully Dec 15 02:57:57 localhost puppet-user[50780]: Notice: Applied catalog in 0.11 seconds Dec 15 02:57:57 localhost puppet-user[50780]: Application: Dec 15 02:57:57 localhost puppet-user[50780]: Initial environment: production Dec 15 02:57:57 localhost puppet-user[50780]: Converged environment: production Dec 15 02:57:57 localhost puppet-user[50780]: Run mode: user Dec 15 02:57:57 localhost puppet-user[50780]: Changes: Dec 15 02:57:57 localhost puppet-user[50780]: Total: 3 Dec 15 02:57:57 localhost puppet-user[50780]: Events: Dec 15 02:57:57 localhost puppet-user[50780]: Success: 3 Dec 15 02:57:57 localhost puppet-user[50780]: Total: 3 Dec 15 02:57:57 localhost puppet-user[50780]: Resources: Dec 15 02:57:57 localhost puppet-user[50780]: Changed: 3 Dec 15 02:57:57 localhost puppet-user[50780]: Out of sync: 3 Dec 
15 02:57:57 localhost puppet-user[50780]: Total: 10 Dec 15 02:57:57 localhost puppet-user[50780]: Time: Dec 15 02:57:57 localhost puppet-user[50780]: Filebucket: 0.00 Dec 15 02:57:57 localhost puppet-user[50780]: Schedule: 0.00 Dec 15 02:57:57 localhost puppet-user[50780]: File: 0.00 Dec 15 02:57:57 localhost puppet-user[50780]: Exec: 0.02 Dec 15 02:57:57 localhost puppet-user[50780]: Augeas: 0.07 Dec 15 02:57:57 localhost puppet-user[50780]: Transaction evaluation: 0.10 Dec 15 02:57:57 localhost puppet-user[50780]: Catalog application: 0.11 Dec 15 02:57:57 localhost puppet-user[50780]: Config retrieval: 0.16 Dec 15 02:57:57 localhost puppet-user[50780]: Last run: 1765785477 Dec 15 02:57:57 localhost puppet-user[50780]: Total: 0.11 Dec 15 02:57:57 localhost puppet-user[50780]: Version: Dec 15 02:57:57 localhost puppet-user[50780]: Config: 1765785477 Dec 15 02:57:57 localhost puppet-user[50780]: Puppet: 7.10.0 Dec 15 02:57:57 localhost ansible-async_wrapper.py[50769]: Module complete (50769) Dec 15 02:57:58 localhost ansible-async_wrapper.py[50768]: Done in kid B. 
Dec 15 02:58:04 localhost python3[50932]: ansible-ansible.legacy.async_status Invoked with jid=817992767663.50765 mode=status _async_dir=/tmp/.ansible_async Dec 15 02:58:05 localhost python3[50948]: ansible-file Invoked with path=/var/lib/container-puppet/puppetlabs state=directory setype=svirt_sandbox_file_t selevel=s0 recurse=True force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None attributes=None Dec 15 02:58:05 localhost python3[50964]: ansible-stat Invoked with path=/var/lib/container-puppet/puppetlabs/facter.conf follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1 Dec 15 02:58:06 localhost python3[51012]: ansible-ansible.legacy.stat Invoked with path=/var/lib/container-puppet/puppetlabs/facter.conf follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True Dec 15 02:58:06 localhost python3[51055]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/container-puppet/puppetlabs/facter.conf setype=svirt_sandbox_file_t selevel=s0 src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1765785485.9224215-82788-141456105047426/source _original_basename=tmpmd8jluvk follow=False checksum=53908622cb869db5e2e2a68e737aa2ab1a872111 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None mode=None owner=None group=None seuser=None serole=None attributes=None Dec 15 02:58:07 localhost python3[51085]: ansible-file Invoked with path=/opt/puppetlabs/facter state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None 
owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Dec 15 02:58:08 localhost python3[51188]: ansible-ansible.posix.synchronize Invoked with src=/opt/puppetlabs/ dest=/var/lib/container-puppet/puppetlabs/ _local_rsync_path=rsync _local_rsync_password=NOT_LOGGING_PARAMETER rsync_path=None delete=False _substitute_controller=False archive=True checksum=False compress=True existing_only=False dirs=False copy_links=False set_remote_user=True rsync_timeout=0 rsync_opts=[] ssh_connection_multiplexing=False partial=False verify_host=False mode=push dest_port=None private_key=None recursive=None links=None perms=None times=None owner=None group=None ssh_args=None link_dest=None Dec 15 02:58:08 localhost python3[51207]: ansible-file Invoked with path=/var/lib/tripleo-config/container-puppet-config mode=448 recurse=True setype=container_file_t force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False state=None _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None attributes=None Dec 15 02:58:09 localhost python3[51223]: ansible-container_puppet_config Invoked with check_mode=False config_vol_prefix=/var/lib/config-data debug=True net_host=True no_archive=False puppet_config=/var/lib/container-puppet/container-puppet.json short_hostname=np0005559462 step=1 update_config_hash_only=False Dec 15 02:58:09 localhost python3[51239]: ansible-file Invoked with path=/var/log/containers/stdouts state=directory owner=root group=root recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None setype=None attributes=None Dec 15 02:58:10 localhost python3[51255]: ansible-container_config_data Invoked 
with config_path=/var/lib/tripleo-config/container-puppet-config/step_1 config_pattern=container-puppet-*.json config_overrides={} debug=True Dec 15 02:58:11 localhost python3[51271]: ansible-containers.podman.podman_container_info Invoked with executable=podman name=None Dec 15 02:58:12 localhost python3[51312]: ansible-tripleo_container_manage Invoked with config_id=tripleo_puppet_step1 config_dir=/var/lib/tripleo-config/container-puppet-config/step_1 config_patterns=container-puppet-*.json config_overrides={} concurrency=6 log_base_path=/var/log/containers/stdouts debug=False Dec 15 02:58:12 localhost podman[51486]: 2025-12-15 07:58:12.336790167 +0000 UTC m=+0.062396015 container create 78945b55310ce128de7c360c749c4caec3fd3259feb54fbae2253ebadd2f6d1a (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=container-puppet-nova_libvirt, managed_by=tripleo_ansible, build-date=2025-11-19T00:35:22Z, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, konflux.additional-tags=17.1.12 17.1_20251118.1, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.12, config_id=tripleo_puppet_step1, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005559462', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron,nova_config,libvirtd_config,virtlogd_config,virtproxyd_config,virtqemud_config,virtnodedevd_config,virtsecretd_config,virtstoraged_config,nova_config,file,libvirt_tls_password,libvirtd_config,nova_config,file,libvirt_tls_password', 'NAME': 'nova_libvirt', 'STEP_CONFIG': "include ::tripleo::packages\n# TODO(emilien): figure how to deal with libvirt profile.\n# We'll probably treat it like we do with Neutron plugins.\n# Until then, just include it in the 
default nova-compute role.\ninclude tripleo::profile::base::nova::compute::libvirt\n\ninclude tripleo::profile::base::nova::libvirt\n\ninclude tripleo::profile::base::nova::compute::libvirt_guests\n\ninclude tripleo::profile::base::sshd\ninclude tripleo::profile::base::nova::migration::target"}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, vcs-type=git, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, url=https://www.redhat.com, com.redhat.component=openstack-nova-libvirt-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, io.buildah.version=1.41.4, container_name=container-puppet-nova_libvirt, batch=17.1_20251118.1, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 nova-libvirt, name=rhosp17/openstack-nova-libvirt, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 nova-libvirt, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, distribution-scope=public, tcib_managed=true, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, 
io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, release=1761123044) Dec 15 02:58:12 localhost podman[51487]: 2025-12-15 07:58:12.36568481 +0000 UTC m=+0.090568069 container create 46b6269168ffd58e0b42dc9c29693e3761e13556befa8caaaeedb8def86ca41b (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=container-puppet-metrics_qdr, build-date=2025-11-18T22:49:46Z, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vendor=Red Hat, Inc., com.redhat.component=openstack-qdrouterd-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, container_name=container-puppet-metrics_qdr, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005559462', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron', 'NAME': 'metrics_qdr', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::metrics::qdr\n'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', 
'/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, description=Red Hat OpenStack Platform 17.1 qdrouterd, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 qdrouterd, distribution-scope=public, batch=17.1_20251118.1, konflux.additional-tags=17.1.12 17.1_20251118.1, release=1761123044, managed_by=tripleo_ansible, architecture=x86_64, url=https://www.redhat.com, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, vcs-type=git, name=rhosp17/openstack-qdrouterd, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, config_id=tripleo_puppet_step1) Dec 15 02:58:12 localhost podman[51488]: 2025-12-15 07:58:12.389987772 +0000 UTC m=+0.113509295 container create eb715ee44032c1981e0162895debe9cbad3eb8d4c06106251f6c892310ccf334 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=container-puppet-iscsid, io.buildah.version=1.41.4, vendor=Red Hat, Inc., konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team, container_name=container-puppet-iscsid, managed_by=tripleo_ansible, version=17.1.12, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, architecture=x86_64, batch=17.1_20251118.1, vcs-type=git, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, config_id=tripleo_puppet_step1, com.redhat.component=openstack-iscsid-container, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 
'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005559462', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron,iscsid_config', 'NAME': 'iscsid', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::iscsid\n'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/iscsi:/tmp/iscsi.host:z', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, summary=Red Hat OpenStack Platform 17.1 iscsid, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, name=rhosp17/openstack-iscsid, url=https://www.redhat.com, build-date=2025-11-18T23:44:13Z, release=1761123044) Dec 15 02:58:12 localhost podman[51486]: 2025-12-15 07:58:12.299649281 +0000 UTC m=+0.025255129 image pull registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1 Dec 15 02:58:12 localhost systemd[1]: Started 
libpod-conmon-78945b55310ce128de7c360c749c4caec3fd3259feb54fbae2253ebadd2f6d1a.scope. Dec 15 02:58:12 localhost podman[51487]: 2025-12-15 07:58:12.305692308 +0000 UTC m=+0.030575607 image pull registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1 Dec 15 02:58:12 localhost systemd[1]: Started libpod-conmon-46b6269168ffd58e0b42dc9c29693e3761e13556befa8caaaeedb8def86ca41b.scope. Dec 15 02:58:12 localhost systemd[1]: Started libcrun container. Dec 15 02:58:12 localhost systemd[1]: Started libpod-conmon-eb715ee44032c1981e0162895debe9cbad3eb8d4c06106251f6c892310ccf334.scope. Dec 15 02:58:12 localhost systemd[1]: Started libcrun container. Dec 15 02:58:12 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/fb7bf143538b6ac117a23468fe01aaa41031d6332839335f81ce26797b9a28dd/merged/var/lib/config-data supports timestamps until 2038 (0x7fffffff) Dec 15 02:58:12 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/8386ac9ef0e341b40941113adbcd0de64d383dd53b6c975b3c29a443c4fff823/merged/var/lib/config-data supports timestamps until 2038 (0x7fffffff) Dec 15 02:58:12 localhost systemd[1]: Started libcrun container. 
Dec 15 02:58:12 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/fc8d0c3725ac03a6b54ddb2ddc35b1ea5dcbf08234ce27c5f96260191136f0c8/merged/tmp/iscsi.host supports timestamps until 2038 (0x7fffffff) Dec 15 02:58:12 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/fc8d0c3725ac03a6b54ddb2ddc35b1ea5dcbf08234ce27c5f96260191136f0c8/merged/var/lib/config-data supports timestamps until 2038 (0x7fffffff) Dec 15 02:58:12 localhost podman[51486]: 2025-12-15 07:58:12.428339861 +0000 UTC m=+0.153945739 container init 78945b55310ce128de7c360c749c4caec3fd3259feb54fbae2253ebadd2f6d1a (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=container-puppet-nova_libvirt, config_id=tripleo_puppet_step1, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.openshift.expose-services=, name=rhosp17/openstack-nova-libvirt, container_name=container-puppet-nova_libvirt, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005559462', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron,nova_config,libvirtd_config,virtlogd_config,virtproxyd_config,virtqemud_config,virtnodedevd_config,virtsecretd_config,virtstoraged_config,nova_config,file,libvirt_tls_password,libvirtd_config,nova_config,file,libvirt_tls_password', 'NAME': 'nova_libvirt', 'STEP_CONFIG': "include ::tripleo::packages\n# TODO(emilien): figure how to deal with libvirt profile.\n# We'll probably treat it like we do with Neutron plugins.\n# Until then, just include it in the default nova-compute role.\ninclude 
tripleo::profile::base::nova::compute::libvirt\n\ninclude tripleo::profile::base::nova::libvirt\n\ninclude tripleo::profile::base::nova::compute::libvirt_guests\n\ninclude tripleo::profile::base::sshd\ninclude tripleo::profile::base::nova::migration::target"}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, release=1761123044, maintainer=OpenStack TripleO Team, build-date=2025-11-19T00:35:22Z, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, vcs-type=git, version=17.1.12, distribution-scope=public, konflux.additional-tags=17.1.12 17.1_20251118.1, url=https://www.redhat.com, io.buildah.version=1.41.4, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, batch=17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, description=Red Hat OpenStack Platform 17.1 nova-libvirt, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, architecture=x86_64, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, com.redhat.component=openstack-nova-libvirt-container, 
cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream) Dec 15 02:58:12 localhost podman[51488]: 2025-12-15 07:58:12.331691684 +0000 UTC m=+0.055213277 image pull registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1 Dec 15 02:58:12 localhost podman[51519]: 2025-12-15 07:58:12.333143923 +0000 UTC m=+0.028257597 image pull registry.redhat.io/rhosp-rhel9/openstack-cron:17.1 Dec 15 02:58:12 localhost podman[51519]: 2025-12-15 07:58:12.432237722 +0000 UTC m=+0.127351396 container create 915a0fea2b170626fac9e25557b760cddd96bae4ec862148cf4294a9463929b0 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=container-puppet-crond, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, version=17.1.12, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005559462', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron', 'NAME': 'crond', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::logging::logrotate'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', 
'/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, config_id=tripleo_puppet_step1, description=Red Hat OpenStack Platform 17.1 cron, com.redhat.component=openstack-cron-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, io.buildah.version=1.41.4, container_name=container-puppet-crond, vendor=Red Hat, Inc., name=rhosp17/openstack-cron, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, konflux.additional-tags=17.1.12 17.1_20251118.1, url=https://www.redhat.com, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, summary=Red Hat OpenStack Platform 17.1 cron, vcs-type=git, release=1761123044, tcib_managed=true, build-date=2025-11-18T22:49:32Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20251118.1, architecture=x86_64, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, managed_by=tripleo_ansible) Dec 15 02:58:12 localhost podman[51486]: 2025-12-15 07:58:12.443216998 +0000 UTC m=+0.168822876 container start 78945b55310ce128de7c360c749c4caec3fd3259feb54fbae2253ebadd2f6d1a (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=container-puppet-nova_libvirt, release=1761123044, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, name=rhosp17/openstack-nova-libvirt, maintainer=OpenStack TripleO Team, batch=17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, architecture=x86_64, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 nova-libvirt, vcs-type=git, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, 
com.redhat.component=openstack-nova-libvirt-container, managed_by=tripleo_ansible, config_id=tripleo_puppet_step1, build-date=2025-11-19T00:35:22Z, container_name=container-puppet-nova_libvirt, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.openshift.expose-services=, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005559462', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron,nova_config,libvirtd_config,virtlogd_config,virtproxyd_config,virtqemud_config,virtnodedevd_config,virtsecretd_config,virtstoraged_config,nova_config,file,libvirt_tls_password,libvirtd_config,nova_config,file,libvirt_tls_password', 'NAME': 'nova_libvirt', 'STEP_CONFIG': "include ::tripleo::packages\n# TODO(emilien): figure how to deal with libvirt profile.\n# We'll probably treat it like we do with Neutron plugins.\n# Until then, just include it in the default nova-compute role.\ninclude tripleo::profile::base::nova::compute::libvirt\n\ninclude tripleo::profile::base::nova::libvirt\n\ninclude tripleo::profile::base::nova::compute::libvirt_guests\n\ninclude tripleo::profile::base::sshd\ninclude tripleo::profile::base::nova::migration::target"}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', 
'/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, version=17.1.12, io.buildah.version=1.41.4, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true) Dec 15 02:58:12 localhost podman[51486]: 2025-12-15 07:58:12.443510745 +0000 UTC m=+0.169116663 container attach 78945b55310ce128de7c360c749c4caec3fd3259feb54fbae2253ebadd2f6d1a (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=container-puppet-nova_libvirt, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, build-date=2025-11-19T00:35:22Z, io.buildah.version=1.41.4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=container-puppet-nova_libvirt, com.redhat.component=openstack-nova-libvirt-container, name=rhosp17/openstack-nova-libvirt, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, url=https://www.redhat.com, konflux.additional-tags=17.1.12 17.1_20251118.1, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005559462', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 
'file,file_line,concat,augeas,cron,nova_config,libvirtd_config,virtlogd_config,virtproxyd_config,virtqemud_config,virtnodedevd_config,virtsecretd_config,virtstoraged_config,nova_config,file,libvirt_tls_password,libvirtd_config,nova_config,file,libvirt_tls_password', 'NAME': 'nova_libvirt', 'STEP_CONFIG': "include ::tripleo::packages\n# TODO(emilien): figure how to deal with libvirt profile.\n# We'll probably treat it like we do with Neutron plugins.\n# Until then, just include it in the default nova-compute role.\ninclude tripleo::profile::base::nova::compute::libvirt\n\ninclude tripleo::profile::base::nova::libvirt\n\ninclude tripleo::profile::base::nova::compute::libvirt_guests\n\ninclude tripleo::profile::base::sshd\ninclude tripleo::profile::base::nova::migration::target"}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, architecture=x86_64, vcs-type=git, description=Red Hat OpenStack Platform 17.1 nova-libvirt, tcib_managed=true, batch=17.1_20251118.1, distribution-scope=public, 
config_id=tripleo_puppet_step1, release=1761123044, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, version=17.1.12) Dec 15 02:58:12 localhost systemd[1]: Started libpod-conmon-915a0fea2b170626fac9e25557b760cddd96bae4ec862148cf4294a9463929b0.scope. Dec 15 02:58:12 localhost systemd[1]: Started libcrun container. Dec 15 02:58:12 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/9451c1a019d581a9391f5464645028b763edcb5db90a2f23bed97e93e14a6922/merged/var/lib/config-data supports timestamps until 2038 (0x7fffffff) Dec 15 02:58:13 localhost podman[51487]: 2025-12-15 07:58:13.814464087 +0000 UTC m=+1.539347346 container init 46b6269168ffd58e0b42dc9c29693e3761e13556befa8caaaeedb8def86ca41b (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=container-puppet-metrics_qdr, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, container_name=container-puppet-metrics_qdr, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, io.openshift.expose-services=, distribution-scope=public, maintainer=OpenStack TripleO Team, config_id=tripleo_puppet_step1, architecture=x86_64, build-date=2025-11-18T22:49:46Z, konflux.additional-tags=17.1.12 17.1_20251118.1, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 qdrouterd, name=rhosp17/openstack-qdrouterd, batch=17.1_20251118.1, tcib_managed=true, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 qdrouterd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, release=1761123044, io.buildah.version=1.41.4, com.redhat.component=openstack-qdrouterd-container, version=17.1.12, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005559462', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron', 'NAME': 'metrics_qdr', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::metrics::qdr\n'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, vendor=Red Hat, Inc., managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd) Dec 15 02:58:13 localhost podman[51487]: 2025-12-15 07:58:13.826318675 +0000 UTC m=+1.551201974 container start 46b6269168ffd58e0b42dc9c29693e3761e13556befa8caaaeedb8def86ca41b (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=container-puppet-metrics_qdr, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-qdrouterd-container, description=Red Hat OpenStack Platform 17.1 qdrouterd, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005559462', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron', 'NAME': 'metrics_qdr', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::metrics::qdr\n'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, release=1761123044, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, managed_by=tripleo_ansible, url=https://www.redhat.com, build-date=2025-11-18T22:49:46Z, summary=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, distribution-scope=public, io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack 
Platform 17.1 qdrouterd, vcs-type=git, version=17.1.12, vendor=Red Hat, Inc., container_name=container-puppet-metrics_qdr, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, konflux.additional-tags=17.1.12 17.1_20251118.1, config_id=tripleo_puppet_step1, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-qdrouterd, batch=17.1_20251118.1, architecture=x86_64) Dec 15 02:58:13 localhost podman[51487]: 2025-12-15 07:58:13.827665031 +0000 UTC m=+1.552548320 container attach 46b6269168ffd58e0b42dc9c29693e3761e13556befa8caaaeedb8def86ca41b (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=container-puppet-metrics_qdr, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, name=rhosp17/openstack-qdrouterd, vendor=Red Hat, Inc., managed_by=tripleo_ansible, url=https://www.redhat.com, version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, build-date=2025-11-18T22:49:46Z, vcs-type=git, io.buildah.version=1.41.4, container_name=container-puppet-metrics_qdr, summary=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 qdrouterd, maintainer=OpenStack TripleO Team, release=1761123044, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005559462', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron', 'NAME': 'metrics_qdr', 'STEP_CONFIG': 
'include ::tripleo::packages\ninclude tripleo::profile::base::metrics::qdr\n'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.component=openstack-qdrouterd-container, batch=17.1_20251118.1, tcib_managed=true, io.openshift.expose-services=, config_id=tripleo_puppet_step1, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, distribution-scope=public) Dec 15 02:58:13 localhost podman[51488]: 2025-12-15 07:58:13.843536433 +0000 UTC m=+1.567057986 container init eb715ee44032c1981e0162895debe9cbad3eb8d4c06106251f6c892310ccf334 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=container-puppet-iscsid, config_id=tripleo_puppet_step1, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 iscsid, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, version=17.1.12, io.openshift.expose-services=, name=rhosp17/openstack-iscsid, container_name=container-puppet-iscsid, architecture=x86_64, batch=17.1_20251118.1, vcs-type=git, description=Red Hat OpenStack Platform 17.1 iscsid, 
vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, build-date=2025-11-18T23:44:13Z, url=https://www.redhat.com, com.redhat.component=openstack-iscsid-container, distribution-scope=public, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, release=1761123044, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, tcib_managed=true, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005559462', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron,iscsid_config', 'NAME': 'iscsid', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::iscsid\n'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/iscsi:/tmp/iscsi.host:z', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, konflux.additional-tags=17.1.12 
17.1_20251118.1, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.4) Dec 15 02:58:13 localhost podman[51519]: 2025-12-15 07:58:13.850729181 +0000 UTC m=+1.545842875 container init 915a0fea2b170626fac9e25557b760cddd96bae4ec862148cf4294a9463929b0 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=container-puppet-crond, io.buildah.version=1.41.4, version=17.1.12, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, container_name=container-puppet-crond, release=1761123044, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vendor=Red Hat, Inc., name=rhosp17/openstack-cron, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005559462', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron', 'NAME': 'crond', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::logging::logrotate'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', 
'/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 cron, batch=17.1_20251118.1, maintainer=OpenStack TripleO Team, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, build-date=2025-11-18T22:49:32Z, config_id=tripleo_puppet_step1, url=https://www.redhat.com, io.openshift.expose-services=, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, summary=Red Hat OpenStack Platform 17.1 cron, com.redhat.component=openstack-cron-container) Dec 15 02:58:13 localhost podman[51519]: 2025-12-15 07:58:13.869184731 +0000 UTC m=+1.564298425 container start 915a0fea2b170626fac9e25557b760cddd96bae4ec862148cf4294a9463929b0 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=container-puppet-crond, batch=17.1_20251118.1, io.buildah.version=1.41.4, konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team, release=1761123044, distribution-scope=public, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 cron, com.redhat.component=openstack-cron-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, container_name=container-puppet-crond, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 
'true', 'HOSTNAME': 'np0005559462', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron', 'NAME': 'crond', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::logging::logrotate'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, architecture=x86_64, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 cron, name=rhosp17/openstack-cron, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, version=17.1.12, tcib_managed=true, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, build-date=2025-11-18T22:49:32Z, config_id=tripleo_puppet_step1, managed_by=tripleo_ansible) Dec 15 02:58:13 localhost podman[51519]: 2025-12-15 07:58:13.871087521 +0000 UTC m=+1.566201265 container attach 915a0fea2b170626fac9e25557b760cddd96bae4ec862148cf4294a9463929b0 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, 
name=container-puppet-crond, com.redhat.component=openstack-cron-container, io.openshift.expose-services=, build-date=2025-11-18T22:49:32Z, config_id=tripleo_puppet_step1, maintainer=OpenStack TripleO Team, container_name=container-puppet-crond, summary=Red Hat OpenStack Platform 17.1 cron, vcs-type=git, name=rhosp17/openstack-cron, architecture=x86_64, url=https://www.redhat.com, tcib_managed=true, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible, release=1761123044, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005559462', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron', 'NAME': 'crond', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::logging::logrotate'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, version=17.1.12, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 cron, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a) Dec 15 02:58:13 localhost podman[51544]: 2025-12-15 07:58:13.904607144 +0000 UTC m=+1.564629175 container create 852e52b2369027fc0dd0127533f2fdf68dc31346981a5170e50cafc5ac2724f3 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=container-puppet-collectd, io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, release=1761123044, version=17.1.12, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, io.openshift.expose-services=, vendor=Red Hat, Inc., managed_by=tripleo_ansible, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, name=rhosp17/openstack-collectd, summary=Red Hat OpenStack Platform 17.1 collectd, architecture=x86_64, konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-11-18T22:51:28Z, vcs-type=git, batch=17.1_20251118.1, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, distribution-scope=public, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, container_name=container-puppet-collectd, description=Red Hat OpenStack Platform 17.1 collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': 
'/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005559462', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron,collectd_client_config,exec', 'NAME': 'collectd', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::metrics::collectd'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, config_id=tripleo_puppet_step1, tcib_managed=true, com.redhat.component=openstack-collectd-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, url=https://www.redhat.com) Dec 15 02:58:13 localhost systemd[1]: Started libpod-conmon-852e52b2369027fc0dd0127533f2fdf68dc31346981a5170e50cafc5ac2724f3.scope. Dec 15 02:58:13 localhost podman[51544]: 2025-12-15 07:58:13.840848403 +0000 UTC m=+1.500870494 image pull registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1 Dec 15 02:58:13 localhost systemd[1]: Started libcrun container. 
Dec 15 02:58:13 localhost podman[51488]: 2025-12-15 07:58:13.955096108 +0000 UTC m=+1.678617661 container start eb715ee44032c1981e0162895debe9cbad3eb8d4c06106251f6c892310ccf334 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=container-puppet-iscsid, url=https://www.redhat.com, build-date=2025-11-18T23:44:13Z, io.buildah.version=1.41.4, description=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.expose-services=, managed_by=tripleo_ansible, name=rhosp17/openstack-iscsid, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_puppet_step1, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005559462', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron,iscsid_config', 'NAME': 'iscsid', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::iscsid\n'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/iscsi:/tmp/iscsi.host:z', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', 
'/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, tcib_managed=true, konflux.additional-tags=17.1.12 17.1_20251118.1, version=17.1.12, maintainer=OpenStack TripleO Team, vcs-type=git, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, summary=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, vendor=Red Hat, Inc., batch=17.1_20251118.1, distribution-scope=public, container_name=container-puppet-iscsid, com.redhat.component=openstack-iscsid-container, release=1761123044, architecture=x86_64) Dec 15 02:58:13 localhost podman[51488]: 2025-12-15 07:58:13.955397656 +0000 UTC m=+1.678919219 container attach eb715ee44032c1981e0162895debe9cbad3eb8d4c06106251f6c892310ccf334 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=container-puppet-iscsid, io.openshift.expose-services=, container_name=container-puppet-iscsid, config_id=tripleo_puppet_step1, summary=Red Hat OpenStack Platform 17.1 iscsid, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 iscsid, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, distribution-scope=public, batch=17.1_20251118.1, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, konflux.additional-tags=17.1.12 17.1_20251118.1, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005559462', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 
'file,file_line,concat,augeas,cron,iscsid_config', 'NAME': 'iscsid', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::iscsid\n'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/iscsi:/tmp/iscsi.host:z', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, io.buildah.version=1.41.4, com.redhat.component=openstack-iscsid-container, architecture=x86_64, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, url=https://www.redhat.com, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, name=rhosp17/openstack-iscsid, build-date=2025-11-18T23:44:13Z, vendor=Red Hat, Inc., release=1761123044, version=17.1.12, vcs-type=git, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream) Dec 15 02:58:13 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/2895a7b3952df41cbaaca10bd32cfede98398660650c6484c46e17b8b9bc3a09/merged/var/lib/config-data supports timestamps until 2038 (0x7fffffff) Dec 15 02:58:13 localhost podman[51544]: 2025-12-15 
07:58:13.967582003 +0000 UTC m=+1.627604034 container init 852e52b2369027fc0dd0127533f2fdf68dc31346981a5170e50cafc5ac2724f3 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=container-puppet-collectd, vendor=Red Hat, Inc., vcs-type=git, konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-collectd, batch=17.1_20251118.1, url=https://www.redhat.com, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_id=tripleo_puppet_step1, com.redhat.component=openstack-collectd-container, managed_by=tripleo_ansible, release=1761123044, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, io.openshift.expose-services=, architecture=x86_64, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, container_name=container-puppet-collectd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, version=17.1.12, build-date=2025-11-18T22:51:28Z, description=Red Hat OpenStack Platform 17.1 collectd, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005559462', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron,collectd_client_config,exec', 'NAME': 'collectd', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::metrics::collectd'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, summary=Red Hat OpenStack Platform 17.1 collectd, distribution-scope=public) Dec 15 02:58:13 localhost podman[51544]: 2025-12-15 07:58:13.981601968 +0000 UTC m=+1.641623989 container start 852e52b2369027fc0dd0127533f2fdf68dc31346981a5170e50cafc5ac2724f3 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=container-puppet-collectd, vcs-type=git, container_name=container-puppet-collectd, com.redhat.component=openstack-collectd-container, io.openshift.expose-services=, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, tcib_managed=true, konflux.additional-tags=17.1.12 17.1_20251118.1, version=17.1.12, name=rhosp17/openstack-collectd, summary=Red Hat OpenStack Platform 17.1 collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.buildah.version=1.41.4, distribution-scope=public, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_id=tripleo_puppet_step1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, 
description=Red Hat OpenStack Platform 17.1 collectd, vendor=Red Hat, Inc., batch=17.1_20251118.1, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005559462', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron,collectd_client_config,exec', 'NAME': 'collectd', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::metrics::collectd'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, release=1761123044, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, architecture=x86_64, managed_by=tripleo_ansible, build-date=2025-11-18T22:51:28Z) Dec 15 02:58:13 localhost podman[51544]: 2025-12-15 07:58:13.981815534 +0000 UTC m=+1.641837575 container attach 852e52b2369027fc0dd0127533f2fdf68dc31346981a5170e50cafc5ac2724f3 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=container-puppet-collectd, config_data={'security_opt': ['label=disable'], 
'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005559462', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron,collectd_client_config,exec', 'NAME': 'collectd', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::metrics::collectd'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 collectd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, distribution-scope=public, name=rhosp17/openstack-collectd, release=1761123044, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, maintainer=OpenStack TripleO Team, build-date=2025-11-18T22:51:28Z, vendor=Red Hat, Inc., batch=17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, 
config_id=tripleo_puppet_step1, com.redhat.component=openstack-collectd-container, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-type=git, architecture=x86_64, io.buildah.version=1.41.4, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, url=https://www.redhat.com, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, container_name=container-puppet-collectd, version=17.1.12, io.openshift.expose-services=) Dec 15 02:58:14 localhost systemd[1]: tmp-crun.s2Thwr.mount: Deactivated successfully. Dec 15 02:58:14 localhost podman[51385]: 2025-12-15 07:58:12.20357996 +0000 UTC m=+0.044495210 image pull registry.redhat.io/rhosp-rhel9/openstack-ceilometer-central:17.1 Dec 15 02:58:15 localhost podman[51721]: 2025-12-15 07:58:15.158155308 +0000 UTC m=+0.077622831 container create 9442ecf14a185aadd8c1e5a63c3a3da4947fba358ea4939cc1577bdba9cad4db (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-central:17.1, name=container-puppet-ceilometer, konflux.additional-tags=17.1.12 17.1_20251118.1, container_name=container-puppet-ceilometer, build-date=2025-11-19T00:11:59Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005559462', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron,ceilometer_config,ceilometer_config', 'NAME': 'ceilometer', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::ceilometer::agent::polling\ninclude tripleo::profile::base::ceilometer::agent::polling\n'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-central:17.1', 'volumes': 
['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, tcib_managed=true, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-central, com.redhat.component=openstack-ceilometer-central-container, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 ceilometer-central, name=rhosp17/openstack-ceilometer-central, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-central, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, summary=Red Hat OpenStack Platform 17.1 ceilometer-central, batch=17.1_20251118.1, distribution-scope=public, config_id=tripleo_puppet_step1, managed_by=tripleo_ansible, vendor=Red Hat, Inc., architecture=x86_64, io.buildah.version=1.41.4, version=17.1.12, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-central, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.expose-services=, release=1761123044, vcs-type=git) Dec 15 02:58:15 localhost systemd[1]: Started libpod-conmon-9442ecf14a185aadd8c1e5a63c3a3da4947fba358ea4939cc1577bdba9cad4db.scope. 
Dec 15 02:58:15 localhost systemd[1]: Started libcrun container. Dec 15 02:58:15 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/8ebb8b8f9ef66ab5decfc0c9ea7632e995655bc9c0e31d590dc91d11091f3784/merged/var/lib/config-data supports timestamps until 2038 (0x7fffffff) Dec 15 02:58:15 localhost podman[51721]: 2025-12-15 07:58:15.216179119 +0000 UTC m=+0.135646622 container init 9442ecf14a185aadd8c1e5a63c3a3da4947fba358ea4939cc1577bdba9cad4db (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-central:17.1, name=container-puppet-ceilometer, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, description=Red Hat OpenStack Platform 17.1 ceilometer-central, release=1761123044, url=https://www.redhat.com, vendor=Red Hat, Inc., architecture=x86_64, io.openshift.expose-services=, container_name=container-puppet-ceilometer, maintainer=OpenStack TripleO Team, batch=17.1_20251118.1, tcib_managed=true, distribution-scope=public, version=17.1.12, summary=Red Hat OpenStack Platform 17.1 ceilometer-central, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005559462', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron,ceilometer_config,ceilometer_config', 'NAME': 'ceilometer', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::ceilometer::agent::polling\ninclude tripleo::profile::base::ceilometer::agent::polling\n'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-central:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-ceilometer-central-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-central, build-date=2025-11-19T00:11:59Z, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-central, name=rhosp17/openstack-ceilometer-central, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-central, io.buildah.version=1.41.4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, managed_by=tripleo_ansible, config_id=tripleo_puppet_step1) Dec 15 02:58:15 localhost podman[51721]: 2025-12-15 07:58:15.122768657 +0000 UTC m=+0.042236170 image pull registry.redhat.io/rhosp-rhel9/openstack-ceilometer-central:17.1 Dec 15 02:58:15 localhost podman[51721]: 2025-12-15 07:58:15.225934303 +0000 UTC m=+0.145401806 container start 9442ecf14a185aadd8c1e5a63c3a3da4947fba358ea4939cc1577bdba9cad4db (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-central:17.1, name=container-puppet-ceilometer, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-central, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-central, url=https://www.redhat.com, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, managed_by=tripleo_ansible, container_name=container-puppet-ceilometer, summary=Red Hat OpenStack Platform 17.1 ceilometer-central, build-date=2025-11-19T00:11:59Z, io.buildah.version=1.41.4, io.openshift.expose-services=, batch=17.1_20251118.1, distribution-scope=public, config_id=tripleo_puppet_step1, name=rhosp17/openstack-ceilometer-central, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, tcib_managed=true, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005559462', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron,ceilometer_config,ceilometer_config', 'NAME': 'ceilometer', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::ceilometer::agent::polling\ninclude tripleo::profile::base::ceilometer::agent::polling\n'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-central:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', 
'/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, description=Red Hat OpenStack Platform 17.1 ceilometer-central, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, com.redhat.component=openstack-ceilometer-central-container, version=17.1.12, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-central, release=1761123044, vcs-type=git, konflux.additional-tags=17.1.12 17.1_20251118.1, vendor=Red Hat, Inc.) Dec 15 02:58:15 localhost podman[51721]: 2025-12-15 07:58:15.226108608 +0000 UTC m=+0.145576111 container attach 9442ecf14a185aadd8c1e5a63c3a3da4947fba358ea4939cc1577bdba9cad4db (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-central:17.1, name=container-puppet-ceilometer, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-central, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-central, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-central, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, version=17.1.12, name=rhosp17/openstack-ceilometer-central, batch=17.1_20251118.1, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005559462', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron,ceilometer_config,ceilometer_config', 'NAME': 'ceilometer', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::ceilometer::agent::polling\ninclude tripleo::profile::base::ceilometer::agent::polling\n'}, 'net': ['host'], 'image': 
'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-central:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, vcs-type=git, konflux.additional-tags=17.1.12 17.1_20251118.1, container_name=container-puppet-ceilometer, maintainer=OpenStack TripleO Team, architecture=x86_64, com.redhat.component=openstack-ceilometer-central-container, description=Red Hat OpenStack Platform 17.1 ceilometer-central, build-date=2025-11-19T00:11:59Z, config_id=tripleo_puppet_step1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, summary=Red Hat OpenStack Platform 17.1 ceilometer-central, release=1761123044, managed_by=tripleo_ansible, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, vendor=Red Hat, Inc.) Dec 15 02:58:15 localhost ovs-vsctl[51764]: ovs|00001|db_ctl_base|ERR|unix:/var/run/openvswitch/db.sock: database connection failed (No such file or directory) Dec 15 02:58:15 localhost puppet-user[51634]: Warning: /etc/puppet/hiera.yaml: Use of 'hiera.yaml' version 3 is deprecated. 
It should be converted to version 5 Dec 15 02:58:15 localhost puppet-user[51634]: (file: /etc/puppet/hiera.yaml) Dec 15 02:58:15 localhost puppet-user[51634]: Warning: Undefined variable '::deploy_config_name'; Dec 15 02:58:15 localhost puppet-user[51634]: (file & line not available) Dec 15 02:58:15 localhost puppet-user[51614]: Warning: /etc/puppet/hiera.yaml: Use of 'hiera.yaml' version 3 is deprecated. It should be converted to version 5 Dec 15 02:58:15 localhost puppet-user[51614]: (file: /etc/puppet/hiera.yaml) Dec 15 02:58:15 localhost puppet-user[51614]: Warning: Undefined variable '::deploy_config_name'; Dec 15 02:58:15 localhost puppet-user[51614]: (file & line not available) Dec 15 02:58:15 localhost puppet-user[51634]: Warning: The function 'hiera' is deprecated in favor of using 'lookup'. See https://puppet.com/docs/puppet/7.10/deprecated_language.html Dec 15 02:58:15 localhost puppet-user[51634]: (file & line not available) Dec 15 02:58:15 localhost puppet-user[51651]: Warning: /etc/puppet/hiera.yaml: Use of 'hiera.yaml' version 3 is deprecated. It should be converted to version 5 Dec 15 02:58:15 localhost puppet-user[51651]: (file: /etc/puppet/hiera.yaml) Dec 15 02:58:15 localhost puppet-user[51651]: Warning: Undefined variable '::deploy_config_name'; Dec 15 02:58:15 localhost puppet-user[51651]: (file & line not available) Dec 15 02:58:15 localhost puppet-user[51614]: Warning: The function 'hiera' is deprecated in favor of using 'lookup'. See https://puppet.com/docs/puppet/7.10/deprecated_language.html Dec 15 02:58:15 localhost puppet-user[51614]: (file & line not available) Dec 15 02:58:15 localhost puppet-user[51651]: Warning: The function 'hiera' is deprecated in favor of using 'lookup'. 
See https://puppet.com/docs/puppet/7.10/deprecated_language.html Dec 15 02:58:15 localhost puppet-user[51651]: (file & line not available) Dec 15 02:58:15 localhost puppet-user[51634]: Notice: Accepting previously invalid value for target type 'Integer' Dec 15 02:58:15 localhost puppet-user[51674]: Warning: /etc/puppet/hiera.yaml: Use of 'hiera.yaml' version 3 is deprecated. It should be converted to version 5 Dec 15 02:58:15 localhost puppet-user[51674]: (file: /etc/puppet/hiera.yaml) Dec 15 02:58:15 localhost puppet-user[51674]: Warning: Undefined variable '::deploy_config_name'; Dec 15 02:58:15 localhost puppet-user[51674]: (file & line not available) Dec 15 02:58:15 localhost puppet-user[51634]: Notice: Compiled catalog for np0005559462.localdomain in environment production in 0.12 seconds Dec 15 02:58:15 localhost puppet-user[51656]: Warning: /etc/puppet/hiera.yaml: Use of 'hiera.yaml' version 3 is deprecated. It should be converted to version 5 Dec 15 02:58:15 localhost puppet-user[51656]: (file: /etc/puppet/hiera.yaml) Dec 15 02:58:15 localhost puppet-user[51656]: Warning: Undefined variable '::deploy_config_name'; Dec 15 02:58:15 localhost puppet-user[51656]: (file & line not available) Dec 15 02:58:15 localhost puppet-user[51651]: Notice: Compiled catalog for np0005559462.localdomain in environment production in 0.10 seconds Dec 15 02:58:15 localhost puppet-user[51674]: Warning: The function 'hiera' is deprecated in favor of using 'lookup'. 
See https://puppet.com/docs/puppet/7.10/deprecated_language.html Dec 15 02:58:15 localhost puppet-user[51674]: (file & line not available) Dec 15 02:58:15 localhost puppet-user[51634]: Notice: /Stage[main]/Qdr::Config/File[/var/lib/qdrouterd]/owner: owner changed 'qdrouterd' to 'root' Dec 15 02:58:15 localhost puppet-user[51634]: Notice: /Stage[main]/Qdr::Config/File[/var/lib/qdrouterd]/group: group changed 'qdrouterd' to 'root' Dec 15 02:58:15 localhost puppet-user[51634]: Notice: /Stage[main]/Qdr::Config/File[/var/lib/qdrouterd]/mode: mode changed '0700' to '0755' Dec 15 02:58:15 localhost puppet-user[51634]: Notice: /Stage[main]/Qdr::Config/File[/etc/qpid-dispatch/ssl]/ensure: created Dec 15 02:58:15 localhost puppet-user[51656]: Warning: The function 'hiera' is deprecated in favor of using 'lookup'. See https://puppet.com/docs/puppet/7.10/deprecated_language.html Dec 15 02:58:15 localhost puppet-user[51656]: (file & line not available) Dec 15 02:58:15 localhost puppet-user[51634]: Notice: /Stage[main]/Qdr::Config/File[qdrouterd.conf]/content: content changed '{sha256}89e10d8896247f992c5f0baf027c25a8ca5d0441be46d8859d9db2067ea74cd3' to '{sha256}9be4acfe195ae062cf0702491cb39017e76baf44e5cc66e659f05043194b2e7e' Dec 15 02:58:15 localhost puppet-user[51634]: Notice: /Stage[main]/Qdr::Config/File[/var/log/qdrouterd]/ensure: created Dec 15 02:58:15 localhost puppet-user[51634]: Notice: /Stage[main]/Qdr::Config/File[/var/log/qdrouterd/metrics_qdr.log]/ensure: created Dec 15 02:58:15 localhost puppet-user[51634]: Notice: Applied catalog in 0.03 seconds Dec 15 02:58:15 localhost puppet-user[51634]: Application: Dec 15 02:58:15 localhost puppet-user[51634]: Initial environment: production Dec 15 02:58:15 localhost puppet-user[51634]: Converged environment: production Dec 15 02:58:15 localhost puppet-user[51634]: Run mode: user Dec 15 02:58:15 localhost puppet-user[51634]: Changes: Dec 15 02:58:15 localhost puppet-user[51634]: Total: 7 Dec 15 02:58:15 localhost 
puppet-user[51634]: Events: Dec 15 02:58:15 localhost puppet-user[51634]: Success: 7 Dec 15 02:58:15 localhost puppet-user[51634]: Total: 7 Dec 15 02:58:15 localhost puppet-user[51634]: Resources: Dec 15 02:58:15 localhost puppet-user[51634]: Skipped: 13 Dec 15 02:58:15 localhost puppet-user[51634]: Changed: 5 Dec 15 02:58:15 localhost puppet-user[51634]: Out of sync: 5 Dec 15 02:58:15 localhost puppet-user[51634]: Total: 20 Dec 15 02:58:15 localhost puppet-user[51634]: Time: Dec 15 02:58:15 localhost puppet-user[51634]: File: 0.01 Dec 15 02:58:15 localhost puppet-user[51634]: Transaction evaluation: 0.02 Dec 15 02:58:15 localhost puppet-user[51634]: Catalog application: 0.03 Dec 15 02:58:15 localhost puppet-user[51634]: Config retrieval: 0.16 Dec 15 02:58:15 localhost puppet-user[51634]: Last run: 1765785495 Dec 15 02:58:15 localhost puppet-user[51634]: Total: 0.03 Dec 15 02:58:15 localhost puppet-user[51634]: Version: Dec 15 02:58:15 localhost puppet-user[51634]: Config: 1765785495 Dec 15 02:58:15 localhost puppet-user[51634]: Puppet: 7.10.0 Dec 15 02:58:15 localhost puppet-user[51651]: Notice: /Stage[main]/Tripleo::Profile::Base::Iscsid/Exec[reset-iscsi-initiator-name]/returns: executed successfully Dec 15 02:58:15 localhost puppet-user[51651]: Notice: /Stage[main]/Tripleo::Profile::Base::Iscsid/File[/etc/iscsi/.initiator_reset]/ensure: created Dec 15 02:58:15 localhost puppet-user[51656]: Notice: Compiled catalog for np0005559462.localdomain in environment production in 0.08 seconds Dec 15 02:58:15 localhost puppet-user[51651]: Notice: /Stage[main]/Tripleo::Profile::Base::Iscsid/Exec[sync-iqn-to-host]/returns: executed successfully Dec 15 02:58:15 localhost puppet-user[51656]: Notice: /Stage[main]/Tripleo::Profile::Base::Logging::Logrotate/File[/etc/logrotate-crond.conf]/ensure: defined content as '{sha256}1c3202f58bd2ae16cb31badcbb7f0d4e6697157b987d1887736ad96bb73d70b0' Dec 15 02:58:15 localhost puppet-user[51656]: Notice: 
/Stage[main]/Tripleo::Profile::Base::Logging::Logrotate/Cron[logrotate-crond]/ensure: created Dec 15 02:58:15 localhost puppet-user[51656]: Notice: Applied catalog in 0.04 seconds Dec 15 02:58:15 localhost puppet-user[51656]: Application: Dec 15 02:58:15 localhost puppet-user[51656]: Initial environment: production Dec 15 02:58:15 localhost puppet-user[51656]: Converged environment: production Dec 15 02:58:15 localhost puppet-user[51656]: Run mode: user Dec 15 02:58:15 localhost puppet-user[51656]: Changes: Dec 15 02:58:15 localhost puppet-user[51656]: Total: 2 Dec 15 02:58:15 localhost puppet-user[51656]: Events: Dec 15 02:58:15 localhost puppet-user[51656]: Success: 2 Dec 15 02:58:15 localhost puppet-user[51656]: Total: 2 Dec 15 02:58:15 localhost puppet-user[51656]: Resources: Dec 15 02:58:15 localhost puppet-user[51656]: Changed: 2 Dec 15 02:58:15 localhost puppet-user[51656]: Out of sync: 2 Dec 15 02:58:15 localhost puppet-user[51656]: Skipped: 7 Dec 15 02:58:15 localhost puppet-user[51656]: Total: 9 Dec 15 02:58:15 localhost puppet-user[51656]: Time: Dec 15 02:58:15 localhost puppet-user[51656]: Cron: 0.01 Dec 15 02:58:15 localhost puppet-user[51656]: File: 0.01 Dec 15 02:58:15 localhost puppet-user[51656]: Transaction evaluation: 0.03 Dec 15 02:58:15 localhost puppet-user[51656]: Catalog application: 0.04 Dec 15 02:58:15 localhost puppet-user[51656]: Config retrieval: 0.11 Dec 15 02:58:15 localhost puppet-user[51656]: Last run: 1765785495 Dec 15 02:58:15 localhost puppet-user[51656]: Total: 0.04 Dec 15 02:58:15 localhost puppet-user[51656]: Version: Dec 15 02:58:15 localhost puppet-user[51656]: Config: 1765785495 Dec 15 02:58:15 localhost puppet-user[51656]: Puppet: 7.10.0 Dec 15 02:58:15 localhost puppet-user[51614]: Warning: Scope(Class[Nova]): The os_region_name parameter is deprecated and will be removed \ Dec 15 02:58:15 localhost puppet-user[51614]: in a future release. 
Use nova::cinder::os_region_name instead Dec 15 02:58:15 localhost puppet-user[51614]: Warning: Scope(Class[Nova]): The catalog_info parameter is deprecated and will be removed \ Dec 15 02:58:15 localhost puppet-user[51614]: in a future release. Use nova::cinder::catalog_info instead Dec 15 02:58:16 localhost systemd[1]: libpod-46b6269168ffd58e0b42dc9c29693e3761e13556befa8caaaeedb8def86ca41b.scope: Deactivated successfully. Dec 15 02:58:16 localhost systemd[1]: libpod-46b6269168ffd58e0b42dc9c29693e3761e13556befa8caaaeedb8def86ca41b.scope: Consumed 2.111s CPU time. Dec 15 02:58:16 localhost puppet-user[51614]: Warning: Unknown variable: '::nova::compute::verify_glance_signatures'. (file: /etc/puppet/modules/nova/manifests/glance.pp, line: 62, column: 41) Dec 15 02:58:16 localhost podman[51487]: 2025-12-15 07:58:16.051342721 +0000 UTC m=+3.776225990 container died 46b6269168ffd58e0b42dc9c29693e3761e13556befa8caaaeedb8def86ca41b (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=container-puppet-metrics_qdr, konflux.additional-tags=17.1.12 17.1_20251118.1, distribution-scope=public, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005559462', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron', 'NAME': 'metrics_qdr', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::metrics::qdr\n'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, vendor=Red Hat, Inc., batch=17.1_20251118.1, tcib_managed=true, io.openshift.expose-services=, build-date=2025-11-18T22:49:46Z, summary=Red Hat OpenStack Platform 17.1 qdrouterd, maintainer=OpenStack TripleO Team, vcs-type=git, url=https://www.redhat.com, version=17.1.12, name=rhosp17/openstack-qdrouterd, io.buildah.version=1.41.4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, description=Red Hat OpenStack Platform 17.1 qdrouterd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, container_name=container-puppet-metrics_qdr, com.redhat.component=openstack-qdrouterd-container, managed_by=tripleo_ansible, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_id=tripleo_puppet_step1, release=1761123044) Dec 15 02:58:16 localhost puppet-user[51614]: Warning: Unknown variable: '::nova::compute::libvirt::remove_unused_base_images'. 
(file: /etc/puppet/modules/nova/manifests/compute/image_cache.pp, line: 44, column: 5) Dec 15 02:58:16 localhost puppet-user[51614]: Warning: Unknown variable: '::nova::compute::libvirt::remove_unused_original_minimum_age_seconds'. (file: /etc/puppet/modules/nova/manifests/compute/image_cache.pp, line: 48, column: 5) Dec 15 02:58:16 localhost puppet-user[51614]: Warning: Unknown variable: '::nova::compute::libvirt::remove_unused_resized_minimum_age_seconds'. (file: /etc/puppet/modules/nova/manifests/compute/image_cache.pp, line: 52, column: 5) Dec 15 02:58:16 localhost puppet-user[51674]: Notice: Compiled catalog for np0005559462.localdomain in environment production in 0.39 seconds Dec 15 02:58:16 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-46b6269168ffd58e0b42dc9c29693e3761e13556befa8caaaeedb8def86ca41b-userdata-shm.mount: Deactivated successfully. Dec 15 02:58:16 localhost systemd[1]: var-lib-containers-storage-overlay-8386ac9ef0e341b40941113adbcd0de64d383dd53b6c975b3c29a443c4fff823-merged.mount: Deactivated successfully. Dec 15 02:58:16 localhost puppet-user[51614]: Warning: Scope(Class[Tripleo::Profile::Base::Nova::Compute]): The keymgr_backend parameter has been deprecated Dec 15 02:58:16 localhost puppet-user[51614]: Warning: Scope(Class[Nova::Compute]): vcpu_pin_set is deprecated, instead use cpu_dedicated_set or cpu_shared_set. Dec 15 02:58:16 localhost puppet-user[51614]: Warning: Scope(Class[Nova::Compute]): verify_glance_signatures is deprecated. Use the same parameter in nova::glance Dec 15 02:58:16 localhost systemd[1]: libpod-915a0fea2b170626fac9e25557b760cddd96bae4ec862148cf4294a9463929b0.scope: Deactivated successfully. Dec 15 02:58:16 localhost systemd[1]: libpod-915a0fea2b170626fac9e25557b760cddd96bae4ec862148cf4294a9463929b0.scope: Consumed 2.138s CPU time. 
Dec 15 02:58:16 localhost podman[52119]: 2025-12-15 07:58:16.158801699 +0000 UTC m=+0.097419697 container cleanup 46b6269168ffd58e0b42dc9c29693e3761e13556befa8caaaeedb8def86ca41b (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=container-puppet-metrics_qdr, vendor=Red Hat, Inc., container_name=container-puppet-metrics_qdr, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005559462', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron', 'NAME': 'metrics_qdr', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::metrics::qdr\n'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, konflux.additional-tags=17.1.12 17.1_20251118.1, io.buildah.version=1.41.4, distribution-scope=public, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, summary=Red Hat 
OpenStack Platform 17.1 qdrouterd, architecture=x86_64, build-date=2025-11-18T22:49:46Z, name=rhosp17/openstack-qdrouterd, batch=17.1_20251118.1, vcs-type=git, com.redhat.component=openstack-qdrouterd-container, config_id=tripleo_puppet_step1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, managed_by=tripleo_ansible, io.openshift.expose-services=, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, version=17.1.12, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, release=1761123044, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 qdrouterd) Dec 15 02:58:16 localhost systemd[1]: libpod-conmon-46b6269168ffd58e0b42dc9c29693e3761e13556befa8caaaeedb8def86ca41b.scope: Deactivated successfully. Dec 15 02:58:16 localhost python3[51312]: ansible-tripleo_container_manage PODMAN-CONTAINER-DEBUG: podman run --name container-puppet-metrics_qdr --conmon-pidfile /run/container-puppet-metrics_qdr.pid --detach=False --entrypoint /var/lib/container-puppet/container-puppet.sh --env STEP=6 --env NET_HOST=true --env DEBUG=true --env HOSTNAME=np0005559462 --env NO_ARCHIVE= --env PUPPET_TAGS=file,file_line,concat,augeas,cron --env NAME=metrics_qdr --env STEP_CONFIG=include ::tripleo::packages#012include tripleo::profile::base::metrics::qdr#012 --label config_id=tripleo_puppet_step1 --label container_name=container-puppet-metrics_qdr --label managed_by=tripleo_ansible --label config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005559462', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron', 'NAME': 'metrics_qdr', 
'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::metrics::qdr\n'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']} --log-driver k8s-file --log-opt path=/var/log/containers/stdouts/container-puppet-metrics_qdr.log --network host --security-opt label=disable --user 0 --volume /dev/log:/dev/log:rw --volume /etc/hosts:/etc/hosts:ro --volume /etc/localtime:/etc/localtime:ro --volume /etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro --volume /etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro --volume /etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro --volume /etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro --volume /etc/puppet:/tmp/puppet-etc:ro --volume /usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro --volume /var/lib/config-data:/var/lib/config-data:rw --volume /var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro --volume /var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro --volume /var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro 
registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1 Dec 15 02:58:16 localhost puppet-user[51651]: Notice: /Stage[main]/Tripleo::Profile::Base::Iscsid/Augeas[chap_algs in /etc/iscsi/iscsid.conf]/returns: executed successfully Dec 15 02:58:16 localhost podman[51519]: 2025-12-15 07:58:16.207163568 +0000 UTC m=+3.902277262 container died 915a0fea2b170626fac9e25557b760cddd96bae4ec862148cf4294a9463929b0 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=container-puppet-crond, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-cron-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vendor=Red Hat, Inc., org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.buildah.version=1.41.4, release=1761123044, io.openshift.expose-services=, tcib_managed=true, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005559462', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron', 'NAME': 'crond', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::logging::logrotate'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', 
'/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_id=tripleo_puppet_step1, batch=17.1_20251118.1, vcs-type=git, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 cron, konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 cron, name=rhosp17/openstack-cron, container_name=container-puppet-crond, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, build-date=2025-11-18T22:49:32Z) Dec 15 02:58:16 localhost puppet-user[51674]: Notice: /Stage[main]/Collectd::Config/File[collectd.conf]/content: content changed '{sha256}aea388a73ebafc7e07a81ddb930a91099211f660eee55fbf92c13007a77501e5' to '{sha256}2523d01ee9c3022c0e9f61d896b1474a168e18472aee141cc278e69fe13f41c1' Dec 15 02:58:16 localhost puppet-user[51674]: Notice: /Stage[main]/Collectd::Config/File[collectd.conf]/owner: owner changed 'collectd' to 'root' Dec 15 02:58:16 localhost puppet-user[51651]: Notice: Applied catalog in 0.45 seconds Dec 15 02:58:16 localhost puppet-user[51674]: Notice: /Stage[main]/Collectd::Config/File[collectd.conf]/group: group changed 'collectd' to 'root' Dec 15 02:58:16 localhost puppet-user[51651]: Application: Dec 15 02:58:16 localhost puppet-user[51651]: Initial environment: production Dec 15 02:58:16 localhost puppet-user[51651]: Converged environment: production Dec 15 02:58:16 localhost puppet-user[51651]: Run mode: user Dec 15 02:58:16 localhost puppet-user[51651]: 
Changes: Dec 15 02:58:16 localhost puppet-user[51651]: Total: 4 Dec 15 02:58:16 localhost puppet-user[51651]: Events: Dec 15 02:58:16 localhost puppet-user[51651]: Success: 4 Dec 15 02:58:16 localhost puppet-user[51651]: Total: 4 Dec 15 02:58:16 localhost puppet-user[51651]: Resources: Dec 15 02:58:16 localhost puppet-user[51651]: Changed: 4 Dec 15 02:58:16 localhost puppet-user[51651]: Out of sync: 4 Dec 15 02:58:16 localhost puppet-user[51651]: Skipped: 8 Dec 15 02:58:16 localhost puppet-user[51651]: Total: 13 Dec 15 02:58:16 localhost puppet-user[51651]: Time: Dec 15 02:58:16 localhost puppet-user[51651]: File: 0.00 Dec 15 02:58:16 localhost puppet-user[51651]: Exec: 0.03 Dec 15 02:58:16 localhost puppet-user[51651]: Config retrieval: 0.13 Dec 15 02:58:16 localhost puppet-user[51651]: Augeas: 0.41 Dec 15 02:58:16 localhost puppet-user[51651]: Transaction evaluation: 0.45 Dec 15 02:58:16 localhost puppet-user[51651]: Catalog application: 0.45 Dec 15 02:58:16 localhost puppet-user[51651]: Last run: 1765785496 Dec 15 02:58:16 localhost puppet-user[51651]: Total: 0.45 Dec 15 02:58:16 localhost puppet-user[51651]: Version: Dec 15 02:58:16 localhost puppet-user[51651]: Config: 1765785495 Dec 15 02:58:16 localhost puppet-user[51651]: Puppet: 7.10.0 Dec 15 02:58:16 localhost puppet-user[51674]: Notice: /Stage[main]/Collectd::Config/File[collectd.conf]/mode: mode changed '0644' to '0640' Dec 15 02:58:16 localhost podman[52142]: 2025-12-15 07:58:16.253323989 +0000 UTC m=+0.089042838 container cleanup 915a0fea2b170626fac9e25557b760cddd96bae4ec862148cf4294a9463929b0 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=container-puppet-crond, io.openshift.expose-services=, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.buildah.version=1.41.4, tcib_managed=true, vcs-type=git, container_name=container-puppet-crond, summary=Red Hat 
OpenStack Platform 17.1 cron, version=17.1.12, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 cron, com.redhat.component=openstack-cron-container, batch=17.1_20251118.1, name=rhosp17/openstack-cron, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, konflux.additional-tags=17.1.12 17.1_20251118.1, release=1761123044, config_id=tripleo_puppet_step1, architecture=x86_64, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-11-18T22:49:32Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, url=https://www.redhat.com, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005559462', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron', 'NAME': 'crond', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::logging::logrotate'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', 
'/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, vendor=Red Hat, Inc.) Dec 15 02:58:16 localhost systemd[1]: libpod-conmon-915a0fea2b170626fac9e25557b760cddd96bae4ec862148cf4294a9463929b0.scope: Deactivated successfully. Dec 15 02:58:16 localhost puppet-user[51674]: Notice: /Stage[main]/Collectd::Config/File[collectd.d]/owner: owner changed 'collectd' to 'root' Dec 15 02:58:16 localhost puppet-user[51674]: Notice: /Stage[main]/Collectd::Config/File[collectd.d]/group: group changed 'collectd' to 'root' Dec 15 02:58:16 localhost puppet-user[51674]: Notice: /Stage[main]/Collectd::Config/File[collectd.d]/mode: mode changed '0755' to '0750' Dec 15 02:58:16 localhost puppet-user[51674]: Notice: /Stage[main]/Collectd::Config/File[/etc/collectd.d/90-default-plugins-cpu.conf]/ensure: removed Dec 15 02:58:16 localhost puppet-user[51674]: Notice: /Stage[main]/Collectd::Config/File[/etc/collectd.d/90-default-plugins-interface.conf]/ensure: removed Dec 15 02:58:16 localhost puppet-user[51674]: Notice: /Stage[main]/Collectd::Config/File[/etc/collectd.d/90-default-plugins-load.conf]/ensure: removed Dec 15 02:58:16 localhost puppet-user[51674]: Notice: /Stage[main]/Collectd::Config/File[/etc/collectd.d/90-default-plugins-memory.conf]/ensure: removed Dec 15 02:58:16 localhost puppet-user[51674]: Notice: /Stage[main]/Collectd::Config/File[/etc/collectd.d/90-default-plugins-syslog.conf]/ensure: removed Dec 15 02:58:16 localhost puppet-user[51674]: Notice: /Stage[main]/Collectd::Config/File[/etc/collectd.d/apache.conf]/ensure: removed Dec 15 02:58:16 localhost puppet-user[51674]: Notice: /Stage[main]/Collectd::Config/File[/etc/collectd.d/dns.conf]/ensure: removed Dec 15 02:58:16 localhost puppet-user[51674]: Notice: /Stage[main]/Collectd::Config/File[/etc/collectd.d/ipmi.conf]/ensure: 
removed Dec 15 02:58:16 localhost puppet-user[51674]: Notice: /Stage[main]/Collectd::Config/File[/etc/collectd.d/mcelog.conf]/ensure: removed Dec 15 02:58:16 localhost python3[51312]: ansible-tripleo_container_manage PODMAN-CONTAINER-DEBUG: podman run --name container-puppet-crond --conmon-pidfile /run/container-puppet-crond.pid --detach=False --entrypoint /var/lib/container-puppet/container-puppet.sh --env STEP=6 --env NET_HOST=true --env DEBUG=true --env HOSTNAME=np0005559462 --env NO_ARCHIVE= --env PUPPET_TAGS=file,file_line,concat,augeas,cron --env NAME=crond --env STEP_CONFIG=include ::tripleo::packages#012include tripleo::profile::base::logging::logrotate --label config_id=tripleo_puppet_step1 --label container_name=container-puppet-crond --label managed_by=tripleo_ansible --label config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005559462', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron', 'NAME': 'crond', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::logging::logrotate'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', 
'/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']} --log-driver k8s-file --log-opt path=/var/log/containers/stdouts/container-puppet-crond.log --network host --security-opt label=disable --user 0 --volume /dev/log:/dev/log:rw --volume /etc/hosts:/etc/hosts:ro --volume /etc/localtime:/etc/localtime:ro --volume /etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro --volume /etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro --volume /etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro --volume /etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro --volume /etc/puppet:/tmp/puppet-etc:ro --volume /usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro --volume /var/lib/config-data:/var/lib/config-data:rw --volume /var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro --volume /var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro --volume /var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro registry.redhat.io/rhosp-rhel9/openstack-cron:17.1 Dec 15 02:58:16 localhost puppet-user[51674]: Notice: /Stage[main]/Collectd::Config/File[/etc/collectd.d/mysql.conf]/ensure: removed Dec 15 02:58:16 localhost puppet-user[51674]: Notice: /Stage[main]/Collectd::Config/File[/etc/collectd.d/ovs-events.conf]/ensure: removed Dec 15 02:58:16 localhost puppet-user[51674]: Notice: /Stage[main]/Collectd::Config/File[/etc/collectd.d/ovs-stats.conf]/ensure: removed Dec 15 02:58:16 localhost puppet-user[51674]: Notice: /Stage[main]/Collectd::Config/File[/etc/collectd.d/ping.conf]/ensure: removed Dec 15 02:58:16 localhost puppet-user[51674]: Notice: /Stage[main]/Collectd::Config/File[/etc/collectd.d/pmu.conf]/ensure: removed Dec 15 02:58:16 localhost puppet-user[51674]: Notice: /Stage[main]/Collectd::Config/File[/etc/collectd.d/rdt.conf]/ensure: removed Dec 15 
02:58:16 localhost puppet-user[51674]: Notice: /Stage[main]/Collectd::Config/File[/etc/collectd.d/sensors.conf]/ensure: removed Dec 15 02:58:16 localhost puppet-user[51674]: Notice: /Stage[main]/Collectd::Config/File[/etc/collectd.d/snmp.conf]/ensure: removed Dec 15 02:58:16 localhost puppet-user[51674]: Notice: /Stage[main]/Collectd::Config/File[/etc/collectd.d/write_prometheus.conf]/ensure: removed Dec 15 02:58:16 localhost puppet-user[51674]: Notice: /Stage[main]/Collectd::Plugin::Python/File[/usr/lib/python3.9/site-packages]/mode: mode changed '0755' to '0750' Dec 15 02:58:16 localhost puppet-user[51674]: Notice: /Stage[main]/Collectd::Plugin::Python/Collectd::Plugin[python]/File[python.load]/ensure: defined content as '{sha256}0163924a0099dd43fe39cb85e836df147fd2cfee8197dc6866d3c384539eb6ee' Dec 15 02:58:16 localhost systemd[1]: tmp-crun.JwMvdg.mount: Deactivated successfully. Dec 15 02:58:16 localhost systemd[1]: var-lib-containers-storage-overlay-9451c1a019d581a9391f5464645028b763edcb5db90a2f23bed97e93e14a6922-merged.mount: Deactivated successfully. Dec 15 02:58:16 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-915a0fea2b170626fac9e25557b760cddd96bae4ec862148cf4294a9463929b0-userdata-shm.mount: Deactivated successfully. 
Dec 15 02:58:16 localhost puppet-user[51674]: Notice: /Stage[main]/Collectd::Plugin::Python/Concat[/etc/collectd.d/python-config.conf]/File[/etc/collectd.d/python-config.conf]/ensure: defined content as '{sha256}2e5fb20e60b30f84687fc456a37fc62451000d2d85f5bbc1b3fca3a5eac9deeb' Dec 15 02:58:16 localhost puppet-user[51674]: Notice: /Stage[main]/Collectd::Plugin::Logfile/Collectd::Plugin[logfile]/File[logfile.load]/ensure: defined content as '{sha256}07bbda08ef9b824089500bdc6ac5a86e7d1ef2ae3ed4ed423c0559fe6361e5af' Dec 15 02:58:16 localhost puppet-user[51674]: Notice: /Stage[main]/Collectd::Plugin::Amqp1/Collectd::Plugin[amqp1]/File[amqp1.load]/ensure: defined content as '{sha256}dee3f10cb1ff461ac3f1e743a5ef3f06993398c6c829895de1dae7f242a64b39' Dec 15 02:58:16 localhost puppet-user[51614]: Warning: Scope(Class[Nova::Compute::Libvirt]): nova::compute::libvirt::images_type will be required if rbd ephemeral storage is used. Dec 15 02:58:16 localhost puppet-user[51674]: Notice: /Stage[main]/Collectd::Plugin::Ceph/Collectd::Plugin[ceph]/File[ceph.load]/ensure: defined content as '{sha256}c796abffda2e860875295b4fc11cc95c6032b4e13fa8fb128e839a305aa1676c' Dec 15 02:58:16 localhost puppet-user[51674]: Notice: /Stage[main]/Collectd::Plugin::Cpu/Collectd::Plugin[cpu]/File[cpu.load]/ensure: defined content as '{sha256}67d4c8bf6bf5785f4cb6b596712204d9eacbcebbf16fe289907195d4d3cb0e34' Dec 15 02:58:16 localhost puppet-user[51674]: Notice: /Stage[main]/Collectd::Plugin::Df/Collectd::Plugin[df]/File[df.load]/ensure: defined content as '{sha256}edeb4716d96fc9dca2c6adfe07bae70ba08c6af3944a3900581cba0f08f3c4ba' Dec 15 02:58:16 localhost puppet-user[51674]: Notice: /Stage[main]/Collectd::Plugin::Disk/Collectd::Plugin[disk]/File[disk.load]/ensure: defined content as '{sha256}1d0cb838278f3226fcd381f0fc2e0e1abaf0d590f4ba7bcb2fc6ec113d3ebde7' Dec 15 02:58:16 localhost puppet-user[51674]: Notice: /Stage[main]/Collectd::Plugin::Hugepages/Collectd::Plugin[hugepages]/File[hugepages.load]/ensure: 
defined content as '{sha256}9b9f35b65a73da8d4037e4355a23b678f2cf61997ccf7a5e1adf2a7ce6415827' Dec 15 02:58:16 localhost puppet-user[51674]: Notice: /Stage[main]/Collectd::Plugin::Hugepages/Collectd::Plugin[hugepages]/File[older_hugepages.load]/ensure: removed Dec 15 02:58:16 localhost puppet-user[51674]: Notice: /Stage[main]/Collectd::Plugin::Interface/Collectd::Plugin[interface]/File[interface.load]/ensure: defined content as '{sha256}b76b315dc312e398940fe029c6dbc5c18d2b974ff7527469fc7d3617b5222046' Dec 15 02:58:16 localhost puppet-user[51674]: Notice: /Stage[main]/Collectd::Plugin::Load/Collectd::Plugin[load]/File[load.load]/ensure: defined content as '{sha256}af2403f76aebd2f10202d66d2d55e1a8d987eed09ced5a3e3873a4093585dc31' Dec 15 02:58:16 localhost puppet-user[51674]: Notice: /Stage[main]/Collectd::Plugin::Memory/Collectd::Plugin[memory]/File[memory.load]/ensure: defined content as '{sha256}0f270425ee6b05fc9440ee32b9afd1010dcbddd9b04ca78ff693858f7ecb9d0e' Dec 15 02:58:16 localhost puppet-user[51674]: Notice: /Stage[main]/Collectd::Plugin::Unixsock/Collectd::Plugin[unixsock]/File[unixsock.load]/ensure: defined content as '{sha256}9d1ec1c51ba386baa6f62d2e019dbd6998ad924bf868b3edc2d24d3dc3c63885' Dec 15 02:58:16 localhost puppet-user[51674]: Notice: /Stage[main]/Collectd::Plugin::Uptime/Collectd::Plugin[uptime]/File[uptime.load]/ensure: defined content as '{sha256}f7a26c6369f904d0ca1af59627ebea15f5e72160bcacdf08d217af282b42e5c0' Dec 15 02:58:16 localhost puppet-user[51674]: Notice: /Stage[main]/Collectd::Plugin::Virt/Collectd::Plugin[virt]/File[virt.load]/ensure: defined content as '{sha256}9a2bcf913f6bf8a962a0ff351a9faea51ae863cc80af97b77f63f8ab68941c62' Dec 15 02:58:16 localhost puppet-user[51674]: Notice: /Stage[main]/Collectd::Plugin::Virt/Collectd::Plugin[virt]/File[older_virt.load]/ensure: removed Dec 15 02:58:16 localhost puppet-user[51674]: Notice: Applied catalog in 0.27 seconds Dec 15 02:58:16 localhost puppet-user[51674]: Application: Dec 15 02:58:16 
localhost puppet-user[51674]: Initial environment: production Dec 15 02:58:16 localhost puppet-user[51674]: Converged environment: production Dec 15 02:58:16 localhost puppet-user[51674]: Run mode: user Dec 15 02:58:16 localhost puppet-user[51674]: Changes: Dec 15 02:58:16 localhost puppet-user[51674]: Total: 43 Dec 15 02:58:16 localhost puppet-user[51674]: Events: Dec 15 02:58:16 localhost puppet-user[51674]: Success: 43 Dec 15 02:58:16 localhost puppet-user[51674]: Total: 43 Dec 15 02:58:16 localhost puppet-user[51674]: Resources: Dec 15 02:58:16 localhost puppet-user[51674]: Skipped: 14 Dec 15 02:58:16 localhost puppet-user[51674]: Changed: 38 Dec 15 02:58:16 localhost puppet-user[51674]: Out of sync: 38 Dec 15 02:58:16 localhost puppet-user[51674]: Total: 82 Dec 15 02:58:16 localhost puppet-user[51674]: Time: Dec 15 02:58:16 localhost puppet-user[51674]: Concat file: 0.00 Dec 15 02:58:16 localhost puppet-user[51674]: Concat fragment: 0.00 Dec 15 02:58:16 localhost puppet-user[51674]: File: 0.12 Dec 15 02:58:16 localhost puppet-user[51674]: Transaction evaluation: 0.27 Dec 15 02:58:16 localhost puppet-user[51674]: Catalog application: 0.27 Dec 15 02:58:16 localhost puppet-user[51674]: Config retrieval: 0.47 Dec 15 02:58:16 localhost puppet-user[51674]: Last run: 1765785496 Dec 15 02:58:16 localhost puppet-user[51674]: Total: 0.27 Dec 15 02:58:16 localhost puppet-user[51674]: Version: Dec 15 02:58:16 localhost puppet-user[51674]: Config: 1765785495 Dec 15 02:58:16 localhost puppet-user[51674]: Puppet: 7.10.0 Dec 15 02:58:16 localhost systemd[1]: libpod-eb715ee44032c1981e0162895debe9cbad3eb8d4c06106251f6c892310ccf334.scope: Deactivated successfully. Dec 15 02:58:16 localhost systemd[1]: libpod-eb715ee44032c1981e0162895debe9cbad3eb8d4c06106251f6c892310ccf334.scope: Consumed 2.517s CPU time. 
Dec 15 02:58:16 localhost podman[51488]: 2025-12-15 07:58:16.512812755 +0000 UTC m=+4.236334308 container died eb715ee44032c1981e0162895debe9cbad3eb8d4c06106251f6c892310ccf334 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=container-puppet-iscsid, maintainer=OpenStack TripleO Team, vcs-type=git, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, release=1761123044, version=17.1.12, io.openshift.expose-services=, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005559462', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron,iscsid_config', 'NAME': 'iscsid', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::iscsid\n'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/iscsi:/tmp/iscsi.host:z', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, batch=17.1_20251118.1, managed_by=tripleo_ansible, config_id=tripleo_puppet_step1, 
build-date=2025-11-18T23:44:13Z, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=container-puppet-iscsid, name=rhosp17/openstack-iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.component=openstack-iscsid-container, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, vendor=Red Hat, Inc., tcib_managed=true, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, io.buildah.version=1.41.4) Dec 15 02:58:16 localhost podman[52262]: 2025-12-15 07:58:16.528547795 +0000 UTC m=+0.063374111 container create adb686c846280011080046965b56fe97bcd01ac827087ebfa44d52fe93616ff5 (image=registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1, name=container-puppet-rsyslog, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, container_name=container-puppet-rsyslog, vcs-type=git, batch=17.1_20251118.1, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.component=openstack-rsyslog-container, vendor=Red Hat, Inc., architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-rsyslog, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, managed_by=tripleo_ansible, io.openshift.expose-services=, version=17.1.12, summary=Red Hat OpenStack Platform 17.1 rsyslog, build-date=2025-11-18T22:49:49Z, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005559462', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 
'file,file_line,concat,augeas,cron,rsyslog::generate_concat,concat::fragment', 'NAME': 'rsyslog', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::logging::rsyslog'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 rsyslog, url=https://www.redhat.com, distribution-scope=public, release=1761123044, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, tcib_managed=true, config_id=tripleo_puppet_step1, io.k8s.description=Red Hat OpenStack Platform 17.1 rsyslog, description=Red Hat OpenStack Platform 17.1 rsyslog, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-rsyslog, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05) Dec 15 02:58:16 localhost systemd[1]: Started libpod-conmon-adb686c846280011080046965b56fe97bcd01ac827087ebfa44d52fe93616ff5.scope. Dec 15 02:58:16 localhost systemd[1]: Started libcrun container. 
Dec 15 02:58:16 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/ea8b9b9472adf5d6d950d327572433ab4db7c804ff790266ce1ebb10d8ac319e/merged/var/lib/config-data supports timestamps until 2038 (0x7fffffff) Dec 15 02:58:16 localhost podman[52262]: 2025-12-15 07:58:16.590605711 +0000 UTC m=+0.125432027 container init adb686c846280011080046965b56fe97bcd01ac827087ebfa44d52fe93616ff5 (image=registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1, name=container-puppet-rsyslog, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-rsyslog, release=1761123044, managed_by=tripleo_ansible, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 rsyslog, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.expose-services=, name=rhosp17/openstack-rsyslog, architecture=x86_64, build-date=2025-11-18T22:49:49Z, maintainer=OpenStack TripleO Team, vcs-type=git, config_id=tripleo_puppet_step1, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 rsyslog, description=Red Hat OpenStack Platform 17.1 rsyslog, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005559462', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron,rsyslog::generate_concat,concat::fragment', 'NAME': 'rsyslog', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::logging::rsyslog'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1', 
'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, container_name=container-puppet-rsyslog, tcib_managed=true, version=17.1.12, batch=17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 rsyslog, com.redhat.component=openstack-rsyslog-container, konflux.additional-tags=17.1.12 17.1_20251118.1, io.buildah.version=1.41.4, url=https://www.redhat.com) Dec 15 02:58:16 localhost podman[52291]: 2025-12-15 07:58:16.597749746 +0000 UTC m=+0.076390949 container cleanup eb715ee44032c1981e0162895debe9cbad3eb8d4c06106251f6c892310ccf334 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=container-puppet-iscsid, maintainer=OpenStack TripleO Team, version=17.1.12, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, build-date=2025-11-18T23:44:13Z, vendor=Red Hat, Inc., vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, container_name=container-puppet-iscsid, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, batch=17.1_20251118.1, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 iscsid, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-type=git, 
io.openshift.expose-services=, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.buildah.version=1.41.4, release=1761123044, com.redhat.component=openstack-iscsid-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, config_id=tripleo_puppet_step1, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005559462', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron,iscsid_config', 'NAME': 'iscsid', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::iscsid\n'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/iscsi:/tmp/iscsi.host:z', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, name=rhosp17/openstack-iscsid, tcib_managed=true) Dec 15 02:58:16 localhost podman[52262]: 2025-12-15 07:58:16.498705158 +0000 UTC 
m=+0.033531494 image pull registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1 Dec 15 02:58:16 localhost systemd[1]: libpod-conmon-eb715ee44032c1981e0162895debe9cbad3eb8d4c06106251f6c892310ccf334.scope: Deactivated successfully. Dec 15 02:58:16 localhost python3[51312]: ansible-tripleo_container_manage PODMAN-CONTAINER-DEBUG: podman run --name container-puppet-iscsid --conmon-pidfile /run/container-puppet-iscsid.pid --detach=False --entrypoint /var/lib/container-puppet/container-puppet.sh --env STEP=6 --env NET_HOST=true --env DEBUG=true --env HOSTNAME=np0005559462 --env NO_ARCHIVE= --env PUPPET_TAGS=file,file_line,concat,augeas,cron,iscsid_config --env NAME=iscsid --env STEP_CONFIG=include ::tripleo::packages#012include tripleo::profile::base::iscsid#012 --label config_id=tripleo_puppet_step1 --label container_name=container-puppet-iscsid --label managed_by=tripleo_ansible --label config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005559462', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron,iscsid_config', 'NAME': 'iscsid', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::iscsid\n'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/iscsi:/tmp/iscsi.host:z', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', 
'/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']} --log-driver k8s-file --log-opt path=/var/log/containers/stdouts/container-puppet-iscsid.log --network host --security-opt label=disable --user 0 --volume /dev/log:/dev/log:rw --volume /etc/hosts:/etc/hosts:ro --volume /etc/iscsi:/tmp/iscsi.host:z --volume /etc/localtime:/etc/localtime:ro --volume /etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro --volume /etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro --volume /etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro --volume /etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro --volume /etc/puppet:/tmp/puppet-etc:ro --volume /usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro --volume /var/lib/config-data:/var/lib/config-data:rw --volume /var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro --volume /var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro --volume /var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1 Dec 15 02:58:16 localhost podman[52262]: 2025-12-15 07:58:16.650385107 +0000 UTC m=+0.185211433 container start adb686c846280011080046965b56fe97bcd01ac827087ebfa44d52fe93616ff5 (image=registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1, name=container-puppet-rsyslog, io.openshift.expose-services=, version=17.1.12, konflux.additional-tags=17.1.12 17.1_20251118.1, container_name=container-puppet-rsyslog, description=Red Hat OpenStack Platform 17.1 rsyslog, build-date=2025-11-18T22:49:49Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-rsyslog, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': 
'/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005559462', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron,rsyslog::generate_concat,concat::fragment', 'NAME': 'rsyslog', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::logging::rsyslog'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, config_id=tripleo_puppet_step1, name=rhosp17/openstack-rsyslog, io.k8s.description=Red Hat OpenStack Platform 17.1 rsyslog, release=1761123044, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vcs-type=git, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, tcib_managed=true, com.redhat.component=openstack-rsyslog-container, summary=Red Hat OpenStack Platform 17.1 rsyslog, architecture=x86_64, 
io.k8s.display-name=Red Hat OpenStack Platform 17.1 rsyslog, vendor=Red Hat, Inc., distribution-scope=public, io.buildah.version=1.41.4, batch=17.1_20251118.1, managed_by=tripleo_ansible) Dec 15 02:58:16 localhost podman[52262]: 2025-12-15 07:58:16.650550971 +0000 UTC m=+0.185377287 container attach adb686c846280011080046965b56fe97bcd01ac827087ebfa44d52fe93616ff5 (image=registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1, name=container-puppet-rsyslog, io.buildah.version=1.41.4, release=1761123044, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005559462', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron,rsyslog::generate_concat,concat::fragment', 'NAME': 'rsyslog', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::logging::rsyslog'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, 
io.k8s.display-name=Red Hat OpenStack Platform 17.1 rsyslog, config_id=tripleo_puppet_step1, com.redhat.component=openstack-rsyslog-container, url=https://www.redhat.com, version=17.1.12, batch=17.1_20251118.1, distribution-scope=public, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, managed_by=tripleo_ansible, name=rhosp17/openstack-rsyslog, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-rsyslog, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, container_name=container-puppet-rsyslog, summary=Red Hat OpenStack Platform 17.1 rsyslog, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 rsyslog, build-date=2025-11-18T22:49:49Z, konflux.additional-tags=17.1.12 17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 rsyslog, vcs-type=git, io.openshift.expose-services=, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a) Dec 15 02:58:16 localhost podman[52328]: 2025-12-15 07:58:16.671585919 +0000 UTC m=+0.074740627 container create 17bca7f1ea4867411d6ff448cba20a6d46a34d6867615471aea7b38bf7c820f4 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=container-puppet-ovn_controller, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, release=1761123044, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, io.openshift.expose-services=, architecture=x86_64, managed_by=tripleo_ansible, io.buildah.version=1.41.4, vcs-type=git, batch=17.1_20251118.1, konflux.additional-tags=17.1.12 17.1_20251118.1, build-date=2025-11-18T23:34:05Z, distribution-scope=public, 
org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, description=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=container-puppet-ovn_controller, tcib_managed=true, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.component=openstack-ovn-controller-container, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, config_id=tripleo_puppet_step1, name=rhosp17/openstack-ovn-controller, version=17.1.12, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005559462', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron,vs_config,exec', 'NAME': 'ovn_controller', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::neutron::agents::ovn\n'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/etc/sysconfig/modules:/etc/sysconfig/modules', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', 
'/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, summary=Red Hat OpenStack Platform 17.1 ovn-controller, url=https://www.redhat.com) Dec 15 02:58:16 localhost systemd[1]: Started libpod-conmon-17bca7f1ea4867411d6ff448cba20a6d46a34d6867615471aea7b38bf7c820f4.scope. Dec 15 02:58:16 localhost systemd[1]: Started libcrun container. Dec 15 02:58:16 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/d37fb345fe4e85b30ebfa0aa6234403fca41e6174e3eeb88a2e8baf1f4bd0d92/merged/etc/sysconfig/modules supports timestamps until 2038 (0x7fffffff) Dec 15 02:58:16 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/d37fb345fe4e85b30ebfa0aa6234403fca41e6174e3eeb88a2e8baf1f4bd0d92/merged/var/lib/config-data supports timestamps until 2038 (0x7fffffff) Dec 15 02:58:16 localhost podman[52328]: 2025-12-15 07:58:16.723383167 +0000 UTC m=+0.126537875 container init 17bca7f1ea4867411d6ff448cba20a6d46a34d6867615471aea7b38bf7c820f4 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=container-puppet-ovn_controller, com.redhat.component=openstack-ovn-controller-container, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005559462', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron,vs_config,exec', 'NAME': 'ovn_controller', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::neutron::agents::ovn\n'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/etc/sysconfig/modules:/etc/sysconfig/modules', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, io.buildah.version=1.41.4, config_id=tripleo_puppet_step1, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-type=git, container_name=container-puppet-ovn_controller, release=1761123044, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, tcib_managed=true, vendor=Red Hat, Inc., name=rhosp17/openstack-ovn-controller, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, build-date=2025-11-18T23:34:05Z, summary=Red Hat OpenStack Platform 17.1 ovn-controller, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, architecture=x86_64, managed_by=tripleo_ansible, konflux.additional-tags=17.1.12 17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, version=17.1.12, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, batch=17.1_20251118.1) Dec 15 02:58:16 localhost podman[52328]: 2025-12-15 07:58:16.632609414 +0000 UTC 
m=+0.035764152 image pull registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1 Dec 15 02:58:16 localhost podman[52328]: 2025-12-15 07:58:16.752107145 +0000 UTC m=+0.155261863 container start 17bca7f1ea4867411d6ff448cba20a6d46a34d6867615471aea7b38bf7c820f4 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=container-puppet-ovn_controller, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 ovn-controller, tcib_managed=true, distribution-scope=public, vendor=Red Hat, Inc., release=1761123044, managed_by=tripleo_ansible, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005559462', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron,vs_config,exec', 'NAME': 'ovn_controller', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::neutron::agents::ovn\n'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/etc/sysconfig/modules:/etc/sysconfig/modules', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', 
'/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, build-date=2025-11-18T23:34:05Z, config_id=tripleo_puppet_step1, konflux.additional-tags=17.1.12 17.1_20251118.1, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, name=rhosp17/openstack-ovn-controller, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 ovn-controller, io.buildah.version=1.41.4, container_name=container-puppet-ovn_controller, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, com.redhat.component=openstack-ovn-controller-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, batch=17.1_20251118.1, version=17.1.12) Dec 15 02:58:16 localhost podman[52328]: 2025-12-15 07:58:16.755032941 +0000 UTC m=+0.158187659 container attach 17bca7f1ea4867411d6ff448cba20a6d46a34d6867615471aea7b38bf7c820f4 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=container-puppet-ovn_controller, maintainer=OpenStack TripleO Team, architecture=x86_64, name=rhosp17/openstack-ovn-controller, url=https://www.redhat.com, tcib_managed=true, config_id=tripleo_puppet_step1, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, container_name=container-puppet-ovn_controller, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public, 
org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, description=Red Hat OpenStack Platform 17.1 ovn-controller, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, managed_by=tripleo_ansible, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005559462', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron,vs_config,exec', 'NAME': 'ovn_controller', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::neutron::agents::ovn\n'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/etc/sysconfig/modules:/etc/sysconfig/modules', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, summary=Red Hat OpenStack Platform 17.1 ovn-controller, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, version=17.1.12, io.openshift.expose-services=, build-date=2025-11-18T23:34:05Z, vendor=Red Hat, Inc., release=1761123044, 
batch=17.1_20251118.1, com.redhat.component=openstack-ovn-controller-container, io.buildah.version=1.41.4, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI) Dec 15 02:58:16 localhost systemd[1]: libpod-852e52b2369027fc0dd0127533f2fdf68dc31346981a5170e50cafc5ac2724f3.scope: Deactivated successfully. Dec 15 02:58:16 localhost systemd[1]: libpod-852e52b2369027fc0dd0127533f2fdf68dc31346981a5170e50cafc5ac2724f3.scope: Consumed 2.573s CPU time. Dec 15 02:58:16 localhost podman[51544]: 2025-12-15 07:58:16.802212569 +0000 UTC m=+4.462234600 container died 852e52b2369027fc0dd0127533f2fdf68dc31346981a5170e50cafc5ac2724f3 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=container-puppet-collectd, release=1761123044, batch=17.1_20251118.1, version=17.1.12, architecture=x86_64, container_name=container-puppet-collectd, name=rhosp17/openstack-collectd, summary=Red Hat OpenStack Platform 17.1 collectd, url=https://www.redhat.com, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, description=Red Hat OpenStack Platform 17.1 collectd, tcib_managed=true, managed_by=tripleo_ansible, vcs-type=git, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.buildah.version=1.41.4, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.component=openstack-collectd-container, build-date=2025-11-18T22:51:28Z, vendor=Red Hat, Inc., io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005559462', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron,collectd_client_config,exec', 
'NAME': 'collectd', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::metrics::collectd'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, distribution-scope=public, config_id=tripleo_puppet_step1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, maintainer=OpenStack TripleO Team) Dec 15 02:58:16 localhost podman[52439]: 2025-12-15 07:58:16.870380134 +0000 UTC m=+0.062675023 container cleanup 852e52b2369027fc0dd0127533f2fdf68dc31346981a5170e50cafc5ac2724f3 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=container-puppet-collectd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, version=17.1.12, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 collectd, tcib_managed=true, release=1761123044, architecture=x86_64, distribution-scope=public, 
org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, build-date=2025-11-18T22:51:28Z, konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible, name=rhosp17/openstack-collectd, batch=17.1_20251118.1, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 collectd, config_id=tripleo_puppet_step1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.component=openstack-collectd-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vendor=Red Hat, Inc., config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005559462', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron,collectd_client_config,exec', 'NAME': 'collectd', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::metrics::collectd'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', 
'/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, vcs-type=git, io.buildah.version=1.41.4, container_name=container-puppet-collectd) Dec 15 02:58:16 localhost systemd[1]: libpod-conmon-852e52b2369027fc0dd0127533f2fdf68dc31346981a5170e50cafc5ac2724f3.scope: Deactivated successfully. Dec 15 02:58:16 localhost python3[51312]: ansible-tripleo_container_manage PODMAN-CONTAINER-DEBUG: podman run --name container-puppet-collectd --conmon-pidfile /run/container-puppet-collectd.pid --detach=False --entrypoint /var/lib/container-puppet/container-puppet.sh --env STEP=6 --env NET_HOST=true --env DEBUG=true --env HOSTNAME=np0005559462 --env NO_ARCHIVE= --env PUPPET_TAGS=file,file_line,concat,augeas,cron,collectd_client_config,exec --env NAME=collectd --env STEP_CONFIG=include ::tripleo::packages#012include tripleo::profile::base::metrics::collectd --label config_id=tripleo_puppet_step1 --label container_name=container-puppet-collectd --label managed_by=tripleo_ansible --label config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005559462', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron,collectd_client_config,exec', 'NAME': 'collectd', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::metrics::collectd'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']} --log-driver k8s-file --log-opt path=/var/log/containers/stdouts/container-puppet-collectd.log --network host --security-opt label=disable --user 0 --volume /dev/log:/dev/log:rw --volume /etc/hosts:/etc/hosts:ro --volume /etc/localtime:/etc/localtime:ro --volume /etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro --volume /etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro --volume /etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro --volume /etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro --volume /etc/puppet:/tmp/puppet-etc:ro --volume /usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro --volume /var/lib/config-data:/var/lib/config-data:rw --volume /var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro --volume /var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro --volume /var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1 Dec 15 02:58:16 localhost puppet-user[51614]: Notice: Compiled catalog for np0005559462.localdomain in environment production in 1.29 seconds Dec 15 02:58:17 localhost puppet-user[51614]: Notice: /Stage[main]/Tripleo::Profile::Base::Nova::Migration::Client/File[/etc/nova/migration/identity]/content: content changed '{sha256}86610d84e745a3992358ae0b747297805d075492e5114c666fa08f8aecce7da0' to '{sha256}9f78c17629c8f6efb1d230ee9ca39117806f1c34f281506912faa65c5b79e2a3' Dec 15 02:58:17 localhost puppet-user[51614]: Notice: 
/Stage[main]/Tripleo::Profile::Base::Nova::Migration::Client/File_line[nova_ssh_port]/ensure: created Dec 15 02:58:17 localhost puppet-user[51614]: Notice: /Stage[main]/Tripleo::Profile::Base::Nova::Libvirt/File[/etc/sasl2/libvirt.conf]/content: content changed '{sha256}78510a0d6f14b269ddeb9f9638dfdfba9f976d370ee2ec04ba25352a8af6df35' to '{sha256}6d7bcae773217a30c0772f75d0d1b6d21f5d64e72853f5e3d91bb47799dbb7fe' Dec 15 02:58:17 localhost puppet-user[51614]: Warning: Empty environment setting 'TLS_PASSWORD' Dec 15 02:58:17 localhost puppet-user[51614]: (file: /etc/puppet/modules/tripleo/manifests/profile/base/nova/libvirt.pp, line: 182) Dec 15 02:58:17 localhost puppet-user[51614]: Notice: /Stage[main]/Tripleo::Profile::Base::Nova::Libvirt/Exec[set libvirt sasl credentials]/returns: executed successfully Dec 15 02:58:17 localhost puppet-user[51614]: Notice: /Stage[main]/Tripleo::Profile::Base::Nova::Migration::Target/File[/etc/nova/migration/authorized_keys]/content: content changed '{sha256}0d05a8832f36c0517b84e9c3ad11069d531c7d2be5297661e5552fd29e3a5e47' to '{sha256}63c7fd7679075ee8ac4fa768ddfe79573f0dc88eaeaad49fc71732a9505b83e0' Dec 15 02:58:17 localhost puppet-user[51614]: Notice: /Stage[main]/Tripleo::Profile::Base::Nova::Migration::Target/File_line[nova_migration_logindefs]/ensure: created Dec 15 02:58:17 localhost puppet-user[51776]: Warning: /etc/puppet/hiera.yaml: Use of 'hiera.yaml' version 3 is deprecated. It should be converted to version 5 Dec 15 02:58:17 localhost puppet-user[51776]: (file: /etc/puppet/hiera.yaml) Dec 15 02:58:17 localhost puppet-user[51776]: Warning: Undefined variable '::deploy_config_name'; Dec 15 02:58:17 localhost puppet-user[51776]: (file & line not available) Dec 15 02:58:17 localhost systemd[1]: var-lib-containers-storage-overlay-2895a7b3952df41cbaaca10bd32cfede98398660650c6484c46e17b8b9bc3a09-merged.mount: Deactivated successfully. 
Dec 15 02:58:17 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-852e52b2369027fc0dd0127533f2fdf68dc31346981a5170e50cafc5ac2724f3-userdata-shm.mount: Deactivated successfully. Dec 15 02:58:17 localhost systemd[1]: var-lib-containers-storage-overlay-fc8d0c3725ac03a6b54ddb2ddc35b1ea5dcbf08234ce27c5f96260191136f0c8-merged.mount: Deactivated successfully. Dec 15 02:58:17 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-eb715ee44032c1981e0162895debe9cbad3eb8d4c06106251f6c892310ccf334-userdata-shm.mount: Deactivated successfully. Dec 15 02:58:17 localhost puppet-user[51776]: Warning: The function 'hiera' is deprecated in favor of using 'lookup'. See https://puppet.com/docs/puppet/7.10/deprecated_language.html Dec 15 02:58:17 localhost puppet-user[51776]: (file & line not available) Dec 15 02:58:17 localhost puppet-user[51614]: Notice: /Stage[main]/Nova::Workarounds/Nova_config[workarounds/never_download_image_if_on_rbd]/ensure: created Dec 15 02:58:17 localhost puppet-user[51614]: Notice: /Stage[main]/Nova::Workarounds/Nova_config[workarounds/disable_compute_service_check_for_ffu]/ensure: created Dec 15 02:58:17 localhost puppet-user[51614]: Notice: /Stage[main]/Nova/Nova_config[DEFAULT/ssl_only]/ensure: created Dec 15 02:58:17 localhost puppet-user[51614]: Notice: /Stage[main]/Nova/Nova_config[DEFAULT/my_ip]/ensure: created Dec 15 02:58:17 localhost puppet-user[51614]: Notice: /Stage[main]/Nova/Nova_config[DEFAULT/host]/ensure: created Dec 15 02:58:17 localhost puppet-user[51614]: Notice: /Stage[main]/Nova/Nova_config[DEFAULT/cpu_allocation_ratio]/ensure: created Dec 15 02:58:17 localhost puppet-user[51776]: Warning: Unknown variable: '::ceilometer::cache_backend'. (file: /etc/puppet/modules/ceilometer/manifests/cache.pp, line: 145, column: 39) Dec 15 02:58:17 localhost puppet-user[51776]: Warning: Unknown variable: '::ceilometer::memcache_servers'. 
(file: /etc/puppet/modules/ceilometer/manifests/cache.pp, line: 146, column: 39) Dec 15 02:58:17 localhost puppet-user[51776]: Warning: Unknown variable: '::ceilometer::cache_tls_enabled'. (file: /etc/puppet/modules/ceilometer/manifests/cache.pp, line: 147, column: 39) Dec 15 02:58:17 localhost puppet-user[51776]: Warning: Unknown variable: '::ceilometer::cache_tls_cafile'. (file: /etc/puppet/modules/ceilometer/manifests/cache.pp, line: 148, column: 39) Dec 15 02:58:17 localhost puppet-user[51776]: Warning: Unknown variable: '::ceilometer::cache_tls_certfile'. (file: /etc/puppet/modules/ceilometer/manifests/cache.pp, line: 149, column: 39) Dec 15 02:58:17 localhost puppet-user[51776]: Warning: Unknown variable: '::ceilometer::cache_tls_keyfile'. (file: /etc/puppet/modules/ceilometer/manifests/cache.pp, line: 150, column: 39) Dec 15 02:58:17 localhost puppet-user[51776]: Warning: Unknown variable: '::ceilometer::cache_tls_allowed_ciphers'. (file: /etc/puppet/modules/ceilometer/manifests/cache.pp, line: 151, column: 39) Dec 15 02:58:17 localhost puppet-user[51776]: Warning: Unknown variable: '::ceilometer::manage_backend_package'. (file: /etc/puppet/modules/ceilometer/manifests/cache.pp, line: 152, column: 39) Dec 15 02:58:17 localhost puppet-user[51614]: Notice: /Stage[main]/Nova/Nova_config[DEFAULT/ram_allocation_ratio]/ensure: created Dec 15 02:58:17 localhost puppet-user[51614]: Notice: /Stage[main]/Nova/Nova_config[DEFAULT/disk_allocation_ratio]/ensure: created Dec 15 02:58:17 localhost puppet-user[51614]: Notice: /Stage[main]/Nova/Nova_config[DEFAULT/dhcp_domain]/ensure: created Dec 15 02:58:17 localhost puppet-user[51614]: Notice: /Stage[main]/Nova/Nova_config[vif_plug_ovs/ovsdb_connection]/ensure: created Dec 15 02:58:17 localhost puppet-user[51776]: Warning: Unknown variable: '::ceilometer::agent::auth::auth_password'. 
(file: /etc/puppet/modules/ceilometer/manifests/agent/service_credentials.pp, line: 63, column: 25) Dec 15 02:58:17 localhost puppet-user[51776]: Warning: Unknown variable: '::ceilometer::agent::auth::auth_url'. (file: /etc/puppet/modules/ceilometer/manifests/agent/service_credentials.pp, line: 68, column: 25) Dec 15 02:58:17 localhost puppet-user[51776]: Warning: Unknown variable: '::ceilometer::agent::auth::auth_region'. (file: /etc/puppet/modules/ceilometer/manifests/agent/service_credentials.pp, line: 69, column: 28) Dec 15 02:58:17 localhost puppet-user[51776]: Warning: Unknown variable: '::ceilometer::agent::auth::auth_user'. (file: /etc/puppet/modules/ceilometer/manifests/agent/service_credentials.pp, line: 70, column: 25) Dec 15 02:58:17 localhost puppet-user[51776]: Warning: Unknown variable: '::ceilometer::agent::auth::auth_tenant_name'. (file: /etc/puppet/modules/ceilometer/manifests/agent/service_credentials.pp, line: 71, column: 29) Dec 15 02:58:17 localhost puppet-user[51776]: Warning: Unknown variable: '::ceilometer::agent::auth::auth_cacert'. (file: /etc/puppet/modules/ceilometer/manifests/agent/service_credentials.pp, line: 72, column: 23) Dec 15 02:58:17 localhost puppet-user[51776]: Warning: Unknown variable: '::ceilometer::agent::auth::auth_endpoint_type'. (file: /etc/puppet/modules/ceilometer/manifests/agent/service_credentials.pp, line: 73, column: 26) Dec 15 02:58:17 localhost puppet-user[51776]: Warning: Unknown variable: '::ceilometer::agent::auth::auth_user_domain_name'. (file: /etc/puppet/modules/ceilometer/manifests/agent/service_credentials.pp, line: 74, column: 33) Dec 15 02:58:17 localhost puppet-user[51614]: Notice: /Stage[main]/Nova/Nova_config[notifications/notification_format]/ensure: created Dec 15 02:58:17 localhost puppet-user[51776]: Warning: Unknown variable: '::ceilometer::agent::auth::auth_project_domain_name'. 
(file: /etc/puppet/modules/ceilometer/manifests/agent/service_credentials.pp, line: 75, column: 36) Dec 15 02:58:17 localhost puppet-user[51776]: Warning: Unknown variable: '::ceilometer::agent::auth::auth_type'. (file: /etc/puppet/modules/ceilometer/manifests/agent/service_credentials.pp, line: 76, column: 26) Dec 15 02:58:17 localhost puppet-user[51614]: Notice: /Stage[main]/Nova/Nova_config[DEFAULT/state_path]/ensure: created Dec 15 02:58:17 localhost puppet-user[51614]: Notice: /Stage[main]/Nova/Nova_config[DEFAULT/service_down_time]/ensure: created Dec 15 02:58:17 localhost puppet-user[51614]: Notice: /Stage[main]/Nova/Nova_config[DEFAULT/rootwrap_config]/ensure: created Dec 15 02:58:17 localhost puppet-user[51614]: Notice: /Stage[main]/Nova/Nova_config[DEFAULT/report_interval]/ensure: created Dec 15 02:58:17 localhost puppet-user[51614]: Notice: /Stage[main]/Nova/Nova_config[notifications/notify_on_state_change]/ensure: created Dec 15 02:58:17 localhost puppet-user[51614]: Notice: /Stage[main]/Nova/Nova_config[cinder/cross_az_attach]/ensure: created Dec 15 02:58:17 localhost puppet-user[51776]: Notice: Compiled catalog for np0005559462.localdomain in environment production in 0.39 seconds Dec 15 02:58:17 localhost puppet-user[51614]: Notice: /Stage[main]/Nova::Glance/Nova_config[glance/valid_interfaces]/ensure: created Dec 15 02:58:17 localhost puppet-user[51776]: Notice: /Stage[main]/Ceilometer/Ceilometer_config[DEFAULT/http_timeout]/ensure: created Dec 15 02:58:17 localhost puppet-user[51776]: Notice: /Stage[main]/Ceilometer/Ceilometer_config[DEFAULT/host]/ensure: created Dec 15 02:58:17 localhost puppet-user[51776]: Notice: /Stage[main]/Ceilometer/Ceilometer_config[publisher/telemetry_secret]/ensure: created Dec 15 02:58:17 localhost puppet-user[51776]: Notice: /Stage[main]/Ceilometer/Ceilometer_config[hardware/readonly_user_name]/ensure: created Dec 15 02:58:17 localhost puppet-user[51776]: Notice: 
/Stage[main]/Ceilometer/Ceilometer_config[hardware/readonly_user_password]/ensure: created Dec 15 02:58:17 localhost puppet-user[51614]: Notice: /Stage[main]/Nova::Placement/Nova_config[placement/auth_type]/ensure: created Dec 15 02:58:17 localhost puppet-user[51776]: Notice: /Stage[main]/Ceilometer::Agent::Service_credentials/Ceilometer_config[service_credentials/auth_url]/ensure: created Dec 15 02:58:17 localhost puppet-user[51776]: Notice: /Stage[main]/Ceilometer::Agent::Service_credentials/Ceilometer_config[service_credentials/region_name]/ensure: created Dec 15 02:58:17 localhost puppet-user[51776]: Notice: /Stage[main]/Ceilometer::Agent::Service_credentials/Ceilometer_config[service_credentials/username]/ensure: created Dec 15 02:58:17 localhost puppet-user[51614]: Notice: /Stage[main]/Nova::Placement/Nova_config[placement/auth_url]/ensure: created Dec 15 02:58:17 localhost puppet-user[51776]: Notice: /Stage[main]/Ceilometer::Agent::Service_credentials/Ceilometer_config[service_credentials/password]/ensure: created Dec 15 02:58:17 localhost puppet-user[51776]: Notice: /Stage[main]/Ceilometer::Agent::Service_credentials/Ceilometer_config[service_credentials/project_name]/ensure: created Dec 15 02:58:17 localhost puppet-user[51776]: Notice: /Stage[main]/Ceilometer::Agent::Service_credentials/Ceilometer_config[service_credentials/interface]/ensure: created Dec 15 02:58:17 localhost puppet-user[51614]: Notice: /Stage[main]/Nova::Placement/Nova_config[placement/password]/ensure: created Dec 15 02:58:17 localhost puppet-user[51776]: Notice: /Stage[main]/Ceilometer::Agent::Service_credentials/Ceilometer_config[service_credentials/user_domain_name]/ensure: created Dec 15 02:58:17 localhost puppet-user[51776]: Notice: /Stage[main]/Ceilometer::Agent::Service_credentials/Ceilometer_config[service_credentials/project_domain_name]/ensure: created Dec 15 02:58:17 localhost puppet-user[51776]: Notice: 
/Stage[main]/Ceilometer::Agent::Service_credentials/Ceilometer_config[service_credentials/auth_type]/ensure: created Dec 15 02:58:17 localhost puppet-user[51614]: Notice: /Stage[main]/Nova::Placement/Nova_config[placement/project_domain_name]/ensure: created Dec 15 02:58:17 localhost puppet-user[51776]: Notice: /Stage[main]/Ceilometer::Agent::Polling/Ceilometer_config[compute/instance_discovery_method]/ensure: created Dec 15 02:58:17 localhost puppet-user[51776]: Notice: /Stage[main]/Ceilometer::Agent::Polling/Ceilometer_config[DEFAULT/polling_namespaces]/ensure: created Dec 15 02:58:17 localhost puppet-user[51614]: Notice: /Stage[main]/Nova::Placement/Nova_config[placement/project_name]/ensure: created Dec 15 02:58:17 localhost puppet-user[51776]: Notice: /Stage[main]/Ceilometer::Agent::Polling/Ceilometer_config[polling/tenant_name_discovery]/ensure: created Dec 15 02:58:17 localhost puppet-user[51776]: Notice: /Stage[main]/Ceilometer::Agent::Polling/Ceilometer_config[coordination/backend_url]/ensure: created Dec 15 02:58:17 localhost puppet-user[51614]: Notice: /Stage[main]/Nova::Placement/Nova_config[placement/user_domain_name]/ensure: created Dec 15 02:58:17 localhost puppet-user[51614]: Notice: /Stage[main]/Nova::Placement/Nova_config[placement/username]/ensure: created Dec 15 02:58:17 localhost puppet-user[51614]: Notice: /Stage[main]/Nova::Placement/Nova_config[placement/region_name]/ensure: created Dec 15 02:58:17 localhost puppet-user[51776]: Notice: /Stage[main]/Ceilometer::Cache/Oslo::Cache[ceilometer_config]/Ceilometer_config[cache/backend]/ensure: created Dec 15 02:58:17 localhost puppet-user[51614]: Notice: /Stage[main]/Nova::Placement/Nova_config[placement/valid_interfaces]/ensure: created Dec 15 02:58:17 localhost puppet-user[51614]: Notice: /Stage[main]/Nova::Cinder/Nova_config[cinder/password]/ensure: created Dec 15 02:58:17 localhost puppet-user[51776]: Notice: 
/Stage[main]/Ceilometer::Cache/Oslo::Cache[ceilometer_config]/Ceilometer_config[cache/enabled]/ensure: created Dec 15 02:58:17 localhost puppet-user[51776]: Notice: /Stage[main]/Ceilometer::Cache/Oslo::Cache[ceilometer_config]/Ceilometer_config[cache/memcache_servers]/ensure: created Dec 15 02:58:17 localhost puppet-user[51614]: Notice: /Stage[main]/Nova::Cinder/Nova_config[cinder/auth_type]/ensure: created Dec 15 02:58:17 localhost puppet-user[51614]: Notice: /Stage[main]/Nova::Cinder/Nova_config[cinder/auth_url]/ensure: created Dec 15 02:58:17 localhost puppet-user[51776]: Notice: /Stage[main]/Ceilometer::Cache/Oslo::Cache[ceilometer_config]/Ceilometer_config[cache/tls_enabled]/ensure: created Dec 15 02:58:17 localhost puppet-user[51614]: Notice: /Stage[main]/Nova::Cinder/Nova_config[cinder/region_name]/ensure: created Dec 15 02:58:17 localhost puppet-user[51614]: Notice: /Stage[main]/Nova::Cinder/Nova_config[cinder/project_name]/ensure: created Dec 15 02:58:17 localhost puppet-user[51776]: Notice: /Stage[main]/Ceilometer/Oslo::Messaging::Rabbit[ceilometer_config]/Ceilometer_config[oslo_messaging_rabbit/heartbeat_in_pthread]/ensure: created Dec 15 02:58:17 localhost puppet-user[51614]: Notice: /Stage[main]/Nova::Cinder/Nova_config[cinder/project_domain_name]/ensure: created Dec 15 02:58:17 localhost puppet-user[51614]: Notice: /Stage[main]/Nova::Cinder/Nova_config[cinder/username]/ensure: created Dec 15 02:58:17 localhost puppet-user[51614]: Notice: /Stage[main]/Nova::Cinder/Nova_config[cinder/user_domain_name]/ensure: created Dec 15 02:58:17 localhost puppet-user[51614]: Notice: /Stage[main]/Nova::Cinder/Nova_config[cinder/os_region_name]/ensure: created Dec 15 02:58:17 localhost puppet-user[51776]: Notice: /Stage[main]/Ceilometer/Oslo::Messaging::Amqp[ceilometer_config]/Ceilometer_config[oslo_messaging_amqp/rpc_address_prefix]/ensure: created Dec 15 02:58:17 localhost puppet-user[51776]: Notice: 
/Stage[main]/Ceilometer/Oslo::Messaging::Amqp[ceilometer_config]/Ceilometer_config[oslo_messaging_amqp/notify_address_prefix]/ensure: created Dec 15 02:58:17 localhost puppet-user[51614]: Notice: /Stage[main]/Nova::Cinder/Nova_config[cinder/catalog_info]/ensure: created Dec 15 02:58:18 localhost puppet-user[51614]: Notice: /Stage[main]/Nova::Compute::Image_cache/Nova_config[image_cache/manager_interval]/ensure: created Dec 15 02:58:18 localhost puppet-user[51614]: Notice: /Stage[main]/Nova::Compute::Image_cache/Nova_config[image_cache/remove_unused_base_images]/ensure: created Dec 15 02:58:18 localhost puppet-user[51614]: Notice: /Stage[main]/Nova::Compute::Image_cache/Nova_config[image_cache/remove_unused_original_minimum_age_seconds]/ensure: created Dec 15 02:58:18 localhost puppet-user[51614]: Notice: /Stage[main]/Nova::Compute::Image_cache/Nova_config[image_cache/remove_unused_resized_minimum_age_seconds]/ensure: created Dec 15 02:58:18 localhost puppet-user[51614]: Notice: /Stage[main]/Nova::Compute::Image_cache/Nova_config[image_cache/precache_concurrency]/ensure: created Dec 15 02:58:18 localhost puppet-user[51776]: Notice: /Stage[main]/Ceilometer/Oslo::Messaging::Notifications[ceilometer_config]/Ceilometer_config[oslo_messaging_notifications/driver]/ensure: created Dec 15 02:58:18 localhost puppet-user[51776]: Notice: /Stage[main]/Ceilometer/Oslo::Messaging::Notifications[ceilometer_config]/Ceilometer_config[oslo_messaging_notifications/transport_url]/ensure: created Dec 15 02:58:18 localhost puppet-user[51776]: Notice: /Stage[main]/Ceilometer/Oslo::Messaging::Notifications[ceilometer_config]/Ceilometer_config[oslo_messaging_notifications/topics]/ensure: created Dec 15 02:58:18 localhost puppet-user[51776]: Notice: /Stage[main]/Ceilometer/Oslo::Messaging::Default[ceilometer_config]/Ceilometer_config[DEFAULT/transport_url]/ensure: created Dec 15 02:58:18 localhost puppet-user[51776]: Notice: 
/Stage[main]/Ceilometer::Logging/Oslo::Log[ceilometer_config]/Ceilometer_config[DEFAULT/debug]/ensure: created Dec 15 02:58:18 localhost puppet-user[51776]: Notice: /Stage[main]/Ceilometer::Logging/Oslo::Log[ceilometer_config]/Ceilometer_config[DEFAULT/log_dir]/ensure: created Dec 15 02:58:18 localhost puppet-user[51614]: Notice: /Stage[main]/Nova::Vendordata/Nova_config[vendordata_dynamic_auth/project_domain_name]/ensure: created Dec 15 02:58:18 localhost puppet-user[51614]: Notice: /Stage[main]/Nova::Vendordata/Nova_config[vendordata_dynamic_auth/user_domain_name]/ensure: created Dec 15 02:58:18 localhost puppet-user[51614]: Notice: /Stage[main]/Nova::Compute::Provider/Nova_config[compute/provider_config_location]/ensure: created Dec 15 02:58:18 localhost puppet-user[51614]: Notice: /Stage[main]/Nova::Compute::Provider/File[/etc/nova/provider_config]/ensure: created Dec 15 02:58:18 localhost puppet-user[51776]: Notice: Applied catalog in 0.42 seconds Dec 15 02:58:18 localhost puppet-user[51776]: Application: Dec 15 02:58:18 localhost puppet-user[51776]: Initial environment: production Dec 15 02:58:18 localhost puppet-user[51776]: Converged environment: production Dec 15 02:58:18 localhost puppet-user[51776]: Run mode: user Dec 15 02:58:18 localhost puppet-user[51776]: Changes: Dec 15 02:58:18 localhost puppet-user[51776]: Total: 31 Dec 15 02:58:18 localhost puppet-user[51776]: Events: Dec 15 02:58:18 localhost puppet-user[51776]: Success: 31 Dec 15 02:58:18 localhost puppet-user[51776]: Total: 31 Dec 15 02:58:18 localhost puppet-user[51776]: Resources: Dec 15 02:58:18 localhost puppet-user[51776]: Skipped: 22 Dec 15 02:58:18 localhost puppet-user[51776]: Changed: 31 Dec 15 02:58:18 localhost puppet-user[51776]: Out of sync: 31 Dec 15 02:58:18 localhost puppet-user[51776]: Total: 151 Dec 15 02:58:18 localhost puppet-user[51776]: Time: Dec 15 02:58:18 localhost puppet-user[51776]: Package: 0.02 Dec 15 02:58:18 localhost puppet-user[51776]: Ceilometer config: 0.33 
Dec 15 02:58:18 localhost puppet-user[51776]: Transaction evaluation: 0.41 Dec 15 02:58:18 localhost puppet-user[51776]: Catalog application: 0.42 Dec 15 02:58:18 localhost puppet-user[51776]: Config retrieval: 0.46 Dec 15 02:58:18 localhost puppet-user[51776]: Last run: 1765785498 Dec 15 02:58:18 localhost puppet-user[51776]: Resources: 0.00 Dec 15 02:58:18 localhost puppet-user[51776]: Total: 0.42 Dec 15 02:58:18 localhost puppet-user[51776]: Version: Dec 15 02:58:18 localhost puppet-user[51776]: Config: 1765785497 Dec 15 02:58:18 localhost puppet-user[51776]: Puppet: 7.10.0 Dec 15 02:58:18 localhost puppet-user[51614]: Notice: /Stage[main]/Nova::Compute/Nova_config[DEFAULT/use_cow_images]/ensure: created Dec 15 02:58:18 localhost puppet-user[51614]: Notice: /Stage[main]/Nova::Compute/Nova_config[DEFAULT/mkisofs_cmd]/ensure: created Dec 15 02:58:18 localhost puppet-user[51614]: Notice: /Stage[main]/Nova::Compute/Nova_config[DEFAULT/force_raw_images]/ensure: created Dec 15 02:58:18 localhost puppet-user[51614]: Notice: /Stage[main]/Nova::Compute/Nova_config[DEFAULT/reserved_host_memory_mb]/ensure: created Dec 15 02:58:18 localhost puppet-user[52360]: Warning: /etc/puppet/hiera.yaml: Use of 'hiera.yaml' version 3 is deprecated. It should be converted to version 5 Dec 15 02:58:18 localhost puppet-user[52360]: (file: /etc/puppet/hiera.yaml) Dec 15 02:58:18 localhost puppet-user[52360]: Warning: Undefined variable '::deploy_config_name'; Dec 15 02:58:18 localhost puppet-user[52360]: (file & line not available) Dec 15 02:58:18 localhost puppet-user[51614]: Notice: /Stage[main]/Nova::Compute/Nova_config[DEFAULT/reserved_huge_pages]/ensure: created Dec 15 02:58:18 localhost puppet-user[52360]: Warning: The function 'hiera' is deprecated in favor of using 'lookup'. 
See https://puppet.com/docs/puppet/7.10/deprecated_language.html Dec 15 02:58:18 localhost puppet-user[52360]: (file & line not available) Dec 15 02:58:18 localhost puppet-user[51614]: Notice: /Stage[main]/Nova::Compute/Nova_config[DEFAULT/resume_guests_state_on_host_boot]/ensure: created Dec 15 02:58:18 localhost puppet-user[51614]: Notice: /Stage[main]/Nova::Compute/Nova_config[key_manager/backend]/ensure: created Dec 15 02:58:18 localhost puppet-user[51614]: Notice: /Stage[main]/Nova::Compute/Nova_config[DEFAULT/sync_power_state_interval]/ensure: created Dec 15 02:58:18 localhost puppet-user[51614]: Notice: /Stage[main]/Nova::Compute/Nova_config[compute/consecutive_build_service_disable_threshold]/ensure: created Dec 15 02:58:18 localhost puppet-user[51614]: Notice: /Stage[main]/Nova::Compute/Nova_config[compute/live_migration_wait_for_vif_plug]/ensure: created Dec 15 02:58:18 localhost puppet-user[51614]: Notice: /Stage[main]/Nova::Compute/Nova_config[compute/max_disk_devices_to_attach]/ensure: created Dec 15 02:58:18 localhost puppet-user[52360]: Notice: Compiled catalog for np0005559462.localdomain in environment production in 0.21 seconds Dec 15 02:58:18 localhost puppet-user[51614]: Notice: /Stage[main]/Nova::Vncproxy::Common/Nova_config[vnc/novncproxy_base_url]/ensure: created Dec 15 02:58:18 localhost puppet-user[51614]: Notice: /Stage[main]/Nova::Compute/Nova_config[vnc/server_proxyclient_address]/ensure: created Dec 15 02:58:18 localhost puppet-user[51614]: Notice: /Stage[main]/Nova::Compute/Nova_config[vnc/enabled]/ensure: created Dec 15 02:58:18 localhost systemd[1]: libpod-9442ecf14a185aadd8c1e5a63c3a3da4947fba358ea4939cc1577bdba9cad4db.scope: Deactivated successfully. Dec 15 02:58:18 localhost systemd[1]: libpod-9442ecf14a185aadd8c1e5a63c3a3da4947fba358ea4939cc1577bdba9cad4db.scope: Consumed 2.943s CPU time. Dec 15 02:58:18 localhost puppet-user[52438]: Warning: /etc/puppet/hiera.yaml: Use of 'hiera.yaml' version 3 is deprecated. 
It should be converted to version 5 Dec 15 02:58:18 localhost puppet-user[52438]: (file: /etc/puppet/hiera.yaml) Dec 15 02:58:18 localhost puppet-user[52438]: Warning: Undefined variable '::deploy_config_name'; Dec 15 02:58:18 localhost puppet-user[52438]: (file & line not available) Dec 15 02:58:18 localhost puppet-user[51614]: Notice: /Stage[main]/Nova::Compute/Nova_config[spice/enabled]/ensure: created Dec 15 02:58:18 localhost podman[51721]: 2025-12-15 07:58:18.57413163 +0000 UTC m=+3.493599193 container died 9442ecf14a185aadd8c1e5a63c3a3da4947fba358ea4939cc1577bdba9cad4db (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-central:17.1, name=container-puppet-ceilometer, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-ceilometer-central, summary=Red Hat OpenStack Platform 17.1 ceilometer-central, build-date=2025-11-19T00:11:59Z, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1761123044, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, config_id=tripleo_puppet_step1, vcs-type=git, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-central, io.buildah.version=1.41.4, tcib_managed=true, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 ceilometer-central, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-central, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, url=https://www.redhat.com, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.component=openstack-ceilometer-central-container, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 
'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005559462', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron,ceilometer_config,ceilometer_config', 'NAME': 'ceilometer', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::ceilometer::agent::polling\ninclude tripleo::profile::base::ceilometer::agent::polling\n'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-central:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, version=17.1.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-central, vendor=Red Hat, Inc., container_name=container-puppet-ceilometer, architecture=x86_64, batch=17.1_20251118.1) Dec 15 02:58:18 localhost puppet-user[51614]: Notice: /Stage[main]/Nova::Compute/Nova_config[DEFAULT/instance_usage_audit]/ensure: created Dec 15 02:58:18 localhost puppet-user[52360]: Notice: /Stage[main]/Rsyslog::Base/File[/etc/rsyslog.conf]/content: content changed '{sha256}d6f679f6a4eb6f33f9fc20c846cb30bef93811e1c86bc4da1946dc3100b826c3' to '{sha256}7963bd801fadd49a17561f4d3f80738c3f504b413b11c443432d8303138041f2' Dec 15 02:58:18 localhost puppet-user[52438]: Warning: The function 'hiera' is deprecated in favor of using 'lookup'. 
See https://puppet.com/docs/puppet/7.10/deprecated_language.html Dec 15 02:58:18 localhost puppet-user[52438]: (file & line not available) Dec 15 02:58:18 localhost puppet-user[51614]: Notice: /Stage[main]/Nova::Compute/Nova_config[DEFAULT/instance_usage_audit_period]/ensure: created Dec 15 02:58:18 localhost puppet-user[52360]: Notice: /Stage[main]/Rsyslog::Config::Global/Rsyslog::Component::Global_config[MaxMessageSize]/Rsyslog::Generate_concat[rsyslog::concat::global_config::MaxMessageSize]/Concat[/etc/rsyslog.d/00_rsyslog.conf]/File[/etc/rsyslog.d/00_rsyslog.conf]/ensure: defined content as '{sha256}a291d5cc6d5884a978161f4c7b5831d43edd07797cc590bae366e7f150b8643b' Dec 15 02:58:18 localhost puppet-user[52360]: Notice: /Stage[main]/Rsyslog::Config::Templates/Rsyslog::Component::Template[rsyslog-node-index]/Rsyslog::Generate_concat[rsyslog::concat::template::rsyslog-node-index]/Concat[/etc/rsyslog.d/50_openstack_logs.conf]/File[/etc/rsyslog.d/50_openstack_logs.conf]/ensure: defined content as '{sha256}29ce8d7de27601380eff8e1a040ab90e58405ea6b535cbaacadaaa5f458ec3ee' Dec 15 02:58:18 localhost puppet-user[52360]: Notice: Applied catalog in 0.10 seconds Dec 15 02:58:18 localhost puppet-user[52360]: Application: Dec 15 02:58:18 localhost puppet-user[52360]: Initial environment: production Dec 15 02:58:18 localhost puppet-user[52360]: Converged environment: production Dec 15 02:58:18 localhost puppet-user[52360]: Run mode: user Dec 15 02:58:18 localhost puppet-user[52360]: Changes: Dec 15 02:58:18 localhost puppet-user[52360]: Total: 3 Dec 15 02:58:18 localhost puppet-user[52360]: Events: Dec 15 02:58:18 localhost puppet-user[52360]: Success: 3 Dec 15 02:58:18 localhost puppet-user[52360]: Total: 3 Dec 15 02:58:18 localhost puppet-user[52360]: Resources: Dec 15 02:58:18 localhost puppet-user[52360]: Skipped: 11 Dec 15 02:58:18 localhost puppet-user[52360]: Changed: 3 Dec 15 02:58:18 localhost puppet-user[52360]: Out of sync: 3 Dec 15 02:58:18 localhost 
puppet-user[52360]: Total: 25 Dec 15 02:58:18 localhost puppet-user[52360]: Time: Dec 15 02:58:18 localhost puppet-user[52360]: Concat file: 0.00 Dec 15 02:58:18 localhost puppet-user[52360]: Concat fragment: 0.00 Dec 15 02:58:18 localhost puppet-user[52360]: File: 0.02 Dec 15 02:58:18 localhost puppet-user[52360]: Transaction evaluation: 0.10 Dec 15 02:58:18 localhost puppet-user[52360]: Catalog application: 0.10 Dec 15 02:58:18 localhost puppet-user[52360]: Config retrieval: 0.25 Dec 15 02:58:18 localhost puppet-user[52360]: Last run: 1765785498 Dec 15 02:58:18 localhost puppet-user[52360]: Total: 0.10 Dec 15 02:58:18 localhost puppet-user[52360]: Version: Dec 15 02:58:18 localhost puppet-user[52360]: Config: 1765785498 Dec 15 02:58:18 localhost puppet-user[52360]: Puppet: 7.10.0 Dec 15 02:58:18 localhost puppet-user[51614]: Notice: /Stage[main]/Nova::Network::Neutron/Nova_config[DEFAULT/vif_plugging_is_fatal]/ensure: created Dec 15 02:58:18 localhost puppet-user[51614]: Notice: /Stage[main]/Nova::Network::Neutron/Nova_config[DEFAULT/vif_plugging_timeout]/ensure: created Dec 15 02:58:18 localhost systemd[1]: tmp-crun.yrIklR.mount: Deactivated successfully. Dec 15 02:58:18 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-9442ecf14a185aadd8c1e5a63c3a3da4947fba358ea4939cc1577bdba9cad4db-userdata-shm.mount: Deactivated successfully. Dec 15 02:58:18 localhost puppet-user[51614]: Notice: /Stage[main]/Nova::Network::Neutron/Nova_config[neutron/default_floating_pool]/ensure: created Dec 15 02:58:18 localhost systemd[1]: var-lib-containers-storage-overlay-8ebb8b8f9ef66ab5decfc0c9ea7632e995655bc9c0e31d590dc91d11091f3784-merged.mount: Deactivated successfully. 
Dec 15 02:58:18 localhost puppet-user[51614]: Notice: /Stage[main]/Nova::Network::Neutron/Nova_config[neutron/timeout]/ensure: created
Dec 15 02:58:18 localhost puppet-user[51614]: Notice: /Stage[main]/Nova::Network::Neutron/Nova_config[neutron/project_name]/ensure: created
Dec 15 02:58:18 localhost puppet-user[51614]: Notice: /Stage[main]/Nova::Network::Neutron/Nova_config[neutron/project_domain_name]/ensure: created
Dec 15 02:58:18 localhost podman[52702]: 2025-12-15 07:58:18.692241014 +0000 UTC m=+0.109918882 container cleanup 9442ecf14a185aadd8c1e5a63c3a3da4947fba358ea4939cc1577bdba9cad4db (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-central:17.1, name=container-puppet-ceilometer, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, tcib_managed=true, config_id=tripleo_puppet_step1, summary=Red Hat OpenStack Platform 17.1 ceilometer-central, io.buildah.version=1.41.4, name=rhosp17/openstack-ceilometer-central, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, version=17.1.12, release=1761123044, architecture=x86_64, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-central, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-central, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 ceilometer-central, com.redhat.component=openstack-ceilometer-central-container, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, io.openshift.expose-services=, url=https://www.redhat.com, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005559462', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron,ceilometer_config,ceilometer_config', 'NAME': 'ceilometer', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::ceilometer::agent::polling\ninclude tripleo::profile::base::ceilometer::agent::polling\n'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-central:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-central, maintainer=OpenStack TripleO Team, container_name=container-puppet-ceilometer, managed_by=tripleo_ansible, build-date=2025-11-19T00:11:59Z, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, batch=17.1_20251118.1)
Dec 15 02:58:18 localhost systemd[1]: libpod-conmon-9442ecf14a185aadd8c1e5a63c3a3da4947fba358ea4939cc1577bdba9cad4db.scope: Deactivated successfully.
Dec 15 02:58:18 localhost puppet-user[51614]: Notice: /Stage[main]/Nova::Network::Neutron/Nova_config[neutron/region_name]/ensure: created
Dec 15 02:58:18 localhost python3[51312]: ansible-tripleo_container_manage PODMAN-CONTAINER-DEBUG: podman run --name container-puppet-ceilometer --conmon-pidfile /run/container-puppet-ceilometer.pid --detach=False --entrypoint /var/lib/container-puppet/container-puppet.sh --env STEP=6 --env NET_HOST=true --env DEBUG=true --env HOSTNAME=np0005559462 --env NO_ARCHIVE= --env PUPPET_TAGS=file,file_line,concat,augeas,cron,ceilometer_config,ceilometer_config --env NAME=ceilometer --env STEP_CONFIG=include ::tripleo::packages#012include tripleo::profile::base::ceilometer::agent::polling#012include tripleo::profile::base::ceilometer::agent::polling#012 --label config_id=tripleo_puppet_step1 --label container_name=container-puppet-ceilometer --label managed_by=tripleo_ansible --label config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005559462', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron,ceilometer_config,ceilometer_config', 'NAME': 'ceilometer', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::ceilometer::agent::polling\ninclude tripleo::profile::base::ceilometer::agent::polling\n'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-central:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']} --log-driver k8s-file --log-opt path=/var/log/containers/stdouts/container-puppet-ceilometer.log --network host --security-opt label=disable --user 0 --volume /dev/log:/dev/log:rw --volume /etc/hosts:/etc/hosts:ro --volume /etc/localtime:/etc/localtime:ro --volume /etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro --volume /etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro --volume /etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro --volume /etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro --volume /etc/puppet:/tmp/puppet-etc:ro --volume /usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro --volume /var/lib/config-data:/var/lib/config-data:rw --volume /var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro --volume /var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro --volume /var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro registry.redhat.io/rhosp-rhel9/openstack-ceilometer-central:17.1
Dec 15 02:58:18 localhost puppet-user[51614]: Notice: /Stage[main]/Nova::Network::Neutron/Nova_config[neutron/username]/ensure: created
Dec 15 02:58:18 localhost puppet-user[51614]: Notice: /Stage[main]/Nova::Network::Neutron/Nova_config[neutron/user_domain_name]/ensure: created
Dec 15 02:58:18 localhost puppet-user[51614]: Notice: /Stage[main]/Nova::Network::Neutron/Nova_config[neutron/password]/ensure: created
Dec 15 02:58:18 localhost puppet-user[51614]: Notice: /Stage[main]/Nova::Network::Neutron/Nova_config[neutron/auth_url]/ensure: created
Dec 15 02:58:18 localhost puppet-user[52438]: Notice: Compiled catalog for np0005559462.localdomain in environment production in 0.25 seconds
Dec 15 02:58:18 localhost puppet-user[51614]: Notice: /Stage[main]/Nova::Network::Neutron/Nova_config[neutron/valid_interfaces]/ensure: created
Dec 15 02:58:18 localhost puppet-user[51614]: Notice: /Stage[main]/Nova::Network::Neutron/Nova_config[neutron/ovs_bridge]/ensure: created
Dec 15 02:58:18 localhost puppet-user[51614]: Notice: /Stage[main]/Nova::Network::Neutron/Nova_config[neutron/extension_sync_interval]/ensure: created
Dec 15 02:58:18 localhost ovs-vsctl[52785]: ovs|00001|vsctl|INFO|Called as /usr/bin/ovs-vsctl set Open_vSwitch . external_ids:ovn-remote=tcp:172.17.0.103:6642,tcp:172.17.0.104:6642,tcp:172.17.0.105:6642
Dec 15 02:58:18 localhost puppet-user[52438]: Notice: /Stage[main]/Ovn::Controller/Vs_config[external_ids:ovn-remote]/ensure: created
Dec 15 02:58:18 localhost puppet-user[51614]: Notice: /Stage[main]/Nova::Network::Neutron/Nova_config[neutron/auth_type]/ensure: created
Dec 15 02:58:18 localhost puppet-user[51614]: Notice: /Stage[main]/Nova::Migration::Libvirt/Nova_config[libvirt/live_migration_uri]/ensure: created
Dec 15 02:58:18 localhost ovs-vsctl[52787]: ovs|00001|vsctl|INFO|Called as /usr/bin/ovs-vsctl set Open_vSwitch . external_ids:ovn-encap-type=geneve
Dec 15 02:58:18 localhost puppet-user[52438]: Notice: /Stage[main]/Ovn::Controller/Vs_config[external_ids:ovn-encap-type]/ensure: created
Dec 15 02:58:18 localhost ovs-vsctl[52794]: ovs|00001|vsctl|INFO|Called as /usr/bin/ovs-vsctl set Open_vSwitch . external_ids:ovn-encap-ip=172.19.0.106
Dec 15 02:58:18 localhost systemd[1]: libpod-adb686c846280011080046965b56fe97bcd01ac827087ebfa44d52fe93616ff5.scope: Deactivated successfully.
Dec 15 02:58:18 localhost puppet-user[52438]: Notice: /Stage[main]/Ovn::Controller/Vs_config[external_ids:ovn-encap-ip]/ensure: created
Dec 15 02:58:18 localhost systemd[1]: libpod-adb686c846280011080046965b56fe97bcd01ac827087ebfa44d52fe93616ff5.scope: Consumed 2.224s CPU time.
Dec 15 02:58:18 localhost podman[52262]: 2025-12-15 07:58:18.925160698 +0000 UTC m=+2.459987114 container died adb686c846280011080046965b56fe97bcd01ac827087ebfa44d52fe93616ff5 (image=registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1, name=container-puppet-rsyslog, vcs-type=git, vendor=Red Hat, Inc., config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005559462', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron,rsyslog::generate_concat,concat::fragment', 'NAME': 'rsyslog', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::logging::rsyslog'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, io.k8s.description=Red Hat OpenStack Platform 17.1 rsyslog, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 rsyslog, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.display-name=Red Hat OpenStack Platform 17.1 rsyslog, com.redhat.component=openstack-rsyslog-container, batch=17.1_20251118.1, konflux.additional-tags=17.1.12 17.1_20251118.1, architecture=x86_64, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, container_name=container-puppet-rsyslog, io.openshift.expose-services=, distribution-scope=public, version=17.1.12, io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-rsyslog, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, name=rhosp17/openstack-rsyslog, release=1761123044, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-11-18T22:49:49Z, tcib_managed=true, url=https://www.redhat.com, config_id=tripleo_puppet_step1, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 rsyslog, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream)
Dec 15 02:58:18 localhost puppet-user[51614]: Notice: /Stage[main]/Nova::Migration::Libvirt/Nova_config[libvirt/live_migration_tunnelled]/ensure: created
Dec 15 02:58:18 localhost ovs-vsctl[52813]: ovs|00001|vsctl|INFO|Called as /usr/bin/ovs-vsctl set Open_vSwitch . external_ids:hostname=np0005559462.localdomain
Dec 15 02:58:18 localhost puppet-user[52438]: Notice: /Stage[main]/Ovn::Controller/Vs_config[external_ids:hostname]/value: value changed 'np0005559462.novalocal' to 'np0005559462.localdomain'
Dec 15 02:58:18 localhost puppet-user[51614]: Notice: /Stage[main]/Nova::Migration::Libvirt/Nova_config[libvirt/live_migration_inbound_addr]/ensure: created
Dec 15 02:58:18 localhost puppet-user[51614]: Notice: /Stage[main]/Nova::Migration::Libvirt/Nova_config[libvirt/live_migration_permit_post_copy]/ensure: created
Dec 15 02:58:18 localhost ovs-vsctl[52816]: ovs|00001|vsctl|INFO|Called as /usr/bin/ovs-vsctl set Open_vSwitch . external_ids:ovn-bridge=br-int
Dec 15 02:58:18 localhost puppet-user[52438]: Notice: /Stage[main]/Ovn::Controller/Vs_config[external_ids:ovn-bridge]/ensure: created
Dec 15 02:58:18 localhost puppet-user[51614]: Notice: /Stage[main]/Nova::Migration::Libvirt/Nova_config[libvirt/live_migration_permit_auto_converge]/ensure: created
Dec 15 02:58:18 localhost puppet-user[51614]: Notice: /Stage[main]/Nova::Migration::Libvirt/Virtproxyd_config[listen_tls]/ensure: created
Dec 15 02:58:18 localhost puppet-user[51614]: Notice: /Stage[main]/Nova::Migration::Libvirt/Virtproxyd_config[listen_tcp]/ensure: created
Dec 15 02:58:18 localhost ovs-vsctl[52819]: ovs|00001|vsctl|INFO|Called as /usr/bin/ovs-vsctl set Open_vSwitch . external_ids:ovn-remote-probe-interval=60000
Dec 15 02:58:19 localhost puppet-user[52438]: Notice: /Stage[main]/Ovn::Controller/Vs_config[external_ids:ovn-remote-probe-interval]/ensure: created
Dec 15 02:58:19 localhost puppet-user[51614]: Notice: /Stage[main]/Nova::Compute::Rbd/Nova_config[libvirt/rbd_user]/ensure: created
Dec 15 02:58:19 localhost podman[52802]: 2025-12-15 07:58:19.011541307 +0000 UTC m=+0.078392902 container cleanup adb686c846280011080046965b56fe97bcd01ac827087ebfa44d52fe93616ff5 (image=registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1, name=container-puppet-rsyslog, version=17.1.12, build-date=2025-11-18T22:49:49Z, description=Red Hat OpenStack Platform 17.1 rsyslog, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005559462', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron,rsyslog::generate_concat,concat::fragment', 'NAME': 'rsyslog', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::logging::rsyslog'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, managed_by=tripleo_ansible, konflux.additional-tags=17.1.12 17.1_20251118.1, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, batch=17.1_20251118.1, name=rhosp17/openstack-rsyslog, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.component=openstack-rsyslog-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-rsyslog, container_name=container-puppet-rsyslog, tcib_managed=true, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 rsyslog, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_puppet_step1, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 rsyslog, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 rsyslog, release=1761123044, url=https://www.redhat.com, vendor=Red Hat, Inc., io.openshift.expose-services=, maintainer=OpenStack TripleO Team)
Dec 15 02:58:19 localhost puppet-user[51614]: Notice: /Stage[main]/Nova::Compute::Rbd/Nova_config[libvirt/rbd_secret_uuid]/ensure: created
Dec 15 02:58:19 localhost ovs-vsctl[52821]: ovs|00001|vsctl|INFO|Called as /usr/bin/ovs-vsctl set Open_vSwitch . external_ids:ovn-openflow-probe-interval=60
Dec 15 02:58:19 localhost systemd[1]: libpod-conmon-adb686c846280011080046965b56fe97bcd01ac827087ebfa44d52fe93616ff5.scope: Deactivated successfully.
Dec 15 02:58:19 localhost puppet-user[51614]: Notice: /Stage[main]/Nova::Compute::Rbd/File[/etc/nova/secret.xml]/ensure: defined content as '{sha256}4ff80b650a815084430bff109fdd9b388ec4737a2fdc296d8a59f3a03178361a'
Dec 15 02:58:19 localhost puppet-user[52438]: Notice: /Stage[main]/Ovn::Controller/Vs_config[external_ids:ovn-openflow-probe-interval]/ensure: created
Dec 15 02:58:19 localhost python3[51312]: ansible-tripleo_container_manage PODMAN-CONTAINER-DEBUG: podman run --name container-puppet-rsyslog --conmon-pidfile /run/container-puppet-rsyslog.pid --detach=False --entrypoint /var/lib/container-puppet/container-puppet.sh --env STEP=6 --env NET_HOST=true --env DEBUG=true --env HOSTNAME=np0005559462 --env NO_ARCHIVE= --env PUPPET_TAGS=file,file_line,concat,augeas,cron,rsyslog::generate_concat,concat::fragment --env NAME=rsyslog --env STEP_CONFIG=include ::tripleo::packages#012include tripleo::profile::base::logging::rsyslog --label config_id=tripleo_puppet_step1 --label container_name=container-puppet-rsyslog --label managed_by=tripleo_ansible --label config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005559462', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron,rsyslog::generate_concat,concat::fragment', 'NAME': 'rsyslog', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::logging::rsyslog'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']} --log-driver k8s-file --log-opt path=/var/log/containers/stdouts/container-puppet-rsyslog.log --network host --security-opt label=disable --user 0 --volume /dev/log:/dev/log:rw --volume /etc/hosts:/etc/hosts:ro --volume /etc/localtime:/etc/localtime:ro --volume /etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro --volume /etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro --volume /etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro --volume /etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro --volume /etc/puppet:/tmp/puppet-etc:ro --volume /usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro --volume /var/lib/config-data:/var/lib/config-data:rw --volume /var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro --volume /var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro --volume /var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1
Dec 15 02:58:19 localhost puppet-user[51614]: Notice: /Stage[main]/Nova::Compute::Rbd/Nova_config[libvirt/images_type]/ensure: created
Dec 15 02:58:19 localhost puppet-user[51614]: Notice: /Stage[main]/Nova::Compute::Rbd/Nova_config[libvirt/images_rbd_pool]/ensure: created
Dec 15 02:58:19 localhost ovs-vsctl[52824]: ovs|00001|vsctl|INFO|Called as /usr/bin/ovs-vsctl set Open_vSwitch . external_ids:ovn-monitor-all=true
Dec 15 02:58:19 localhost puppet-user[52438]: Notice: /Stage[main]/Ovn::Controller/Vs_config[external_ids:ovn-monitor-all]/ensure: created
Dec 15 02:58:19 localhost puppet-user[51614]: Notice: /Stage[main]/Nova::Compute::Rbd/Nova_config[libvirt/images_rbd_ceph_conf]/ensure: created
Dec 15 02:58:19 localhost puppet-user[51614]: Notice: /Stage[main]/Nova::Compute::Rbd/Nova_config[libvirt/images_rbd_glance_store_name]/ensure: created
Dec 15 02:58:19 localhost ovs-vsctl[52835]: ovs|00001|vsctl|INFO|Called as /usr/bin/ovs-vsctl set Open_vSwitch . external_ids:ovn-ofctrl-wait-before-clear=8000
Dec 15 02:58:19 localhost puppet-user[52438]: Notice: /Stage[main]/Ovn::Controller/Vs_config[external_ids:ovn-ofctrl-wait-before-clear]/ensure: created
Dec 15 02:58:19 localhost puppet-user[51614]: Notice: /Stage[main]/Nova::Compute::Rbd/Nova_config[libvirt/images_rbd_glance_copy_poll_interval]/ensure: created
Dec 15 02:58:19 localhost puppet-user[51614]: Notice: /Stage[main]/Nova::Compute::Rbd/Nova_config[libvirt/images_rbd_glance_copy_timeout]/ensure: created
Dec 15 02:58:19 localhost ovs-vsctl[52837]: ovs|00001|vsctl|INFO|Called as /usr/bin/ovs-vsctl set Open_vSwitch . external_ids:ovn-encap-tos=0
Dec 15 02:58:19 localhost puppet-user[52438]: Notice: /Stage[main]/Ovn::Controller/Vs_config[external_ids:ovn-encap-tos]/ensure: created
Dec 15 02:58:19 localhost ovs-vsctl[52841]: ovs|00001|vsctl|INFO|Called as /usr/bin/ovs-vsctl set Open_vSwitch . external_ids:ovn-chassis-mac-mappings=datacentre:fa:16:3e:8c:45:74
Dec 15 02:58:19 localhost puppet-user[51614]: Notice: /Stage[main]/Nova::Compute::Libvirt/Nova_config[DEFAULT/compute_driver]/ensure: created
Dec 15 02:58:19 localhost puppet-user[52438]: Notice: /Stage[main]/Ovn::Controller/Vs_config[external_ids:ovn-chassis-mac-mappings]/ensure: created
Dec 15 02:58:19 localhost puppet-user[51614]: Notice: /Stage[main]/Nova::Compute::Libvirt/Nova_config[DEFAULT/preallocate_images]/ensure: created
Dec 15 02:58:19 localhost ovs-vsctl[52848]: ovs|00001|vsctl|INFO|Called as /usr/bin/ovs-vsctl set Open_vSwitch . external_ids:ovn-bridge-mappings=datacentre:br-ex
Dec 15 02:58:19 localhost puppet-user[52438]: Notice: /Stage[main]/Ovn::Controller/Vs_config[external_ids:ovn-bridge-mappings]/ensure: created
Dec 15 02:58:19 localhost puppet-user[51614]: Notice: /Stage[main]/Nova::Compute::Libvirt/Nova_config[vnc/server_listen]/ensure: created
Dec 15 02:58:19 localhost ovs-vsctl[52853]: ovs|00001|vsctl|INFO|Called as /usr/bin/ovs-vsctl set Open_vSwitch . external_ids:ovn-match-northd-version=false
Dec 15 02:58:19 localhost puppet-user[51614]: Notice: /Stage[main]/Nova::Compute::Libvirt/Nova_config[libvirt/virt_type]/ensure: created
Dec 15 02:58:19 localhost puppet-user[52438]: Notice: /Stage[main]/Ovn::Controller/Vs_config[external_ids:ovn-match-northd-version]/ensure: created
Dec 15 02:58:19 localhost puppet-user[51614]: Notice: /Stage[main]/Nova::Compute::Libvirt/Nova_config[libvirt/cpu_mode]/ensure: created
Dec 15 02:58:19 localhost ovs-vsctl[52855]: ovs|00001|vsctl|INFO|Called as /usr/bin/ovs-vsctl set Open_vSwitch . external_ids:garp-max-timeout-sec=0
Dec 15 02:58:19 localhost puppet-user[52438]: Notice: /Stage[main]/Ovn::Controller/Vs_config[external_ids:garp-max-timeout-sec]/ensure: created
Dec 15 02:58:19 localhost puppet-user[51614]: Notice: /Stage[main]/Nova::Compute::Libvirt/Nova_config[libvirt/inject_password]/ensure: created
Dec 15 02:58:19 localhost puppet-user[51614]: Notice: /Stage[main]/Nova::Compute::Libvirt/Nova_config[libvirt/inject_key]/ensure: created
Dec 15 02:58:19 localhost puppet-user[51614]: Notice: /Stage[main]/Nova::Compute::Libvirt/Nova_config[libvirt/inject_partition]/ensure: created
Dec 15 02:58:19 localhost puppet-user[51614]: Notice: /Stage[main]/Nova::Compute::Libvirt/Nova_config[libvirt/hw_disk_discard]/ensure: created
Dec 15 02:58:19 localhost puppet-user[52438]: Notice: Applied catalog in 0.39 seconds
Dec 15 02:58:19 localhost puppet-user[52438]: Application:
Dec 15 02:58:19 localhost puppet-user[52438]: Initial environment: production
Dec 15 02:58:19 localhost puppet-user[52438]: Converged environment: production
Dec 15 02:58:19 localhost puppet-user[52438]: Run mode: user
Dec 15 02:58:19 localhost puppet-user[52438]: Changes:
Dec 15 02:58:19 localhost puppet-user[52438]: Total: 14
Dec 15 02:58:19 localhost puppet-user[52438]: Events:
Dec 15 02:58:19 localhost puppet-user[52438]: Success: 14
Dec 15 02:58:19 localhost puppet-user[52438]: Total: 14
Dec 15 02:58:19 localhost puppet-user[52438]: Resources:
Dec 15 02:58:19 localhost puppet-user[52438]: Skipped: 12
Dec 15 02:58:19 localhost puppet-user[52438]: Changed: 14
Dec 15 02:58:19 localhost puppet-user[52438]: Out of sync: 14
Dec 15 02:58:19 localhost puppet-user[52438]: Total: 29
Dec 15 02:58:19 localhost puppet-user[52438]: Time:
Dec 15 02:58:19 localhost puppet-user[52438]: Exec: 0.02
Dec 15 02:58:19 localhost puppet-user[52438]: Config retrieval: 0.29
Dec 15 02:58:19 localhost puppet-user[52438]: Vs config: 0.33
Dec 15 02:58:19 localhost puppet-user[52438]: Transaction evaluation: 0.39
Dec 15 02:58:19 localhost puppet-user[52438]: Catalog application: 0.39
Dec 15 02:58:19 localhost puppet-user[52438]: Last run: 1765785499
Dec 15 02:58:19 localhost puppet-user[52438]: Total: 0.39
Dec 15 02:58:19 localhost puppet-user[52438]: Version:
Dec 15 02:58:19 localhost puppet-user[52438]: Config: 1765785498
Dec 15 02:58:19 localhost puppet-user[52438]: Puppet: 7.10.0
Dec 15 02:58:19 localhost puppet-user[51614]: Notice: /Stage[main]/Nova::Compute::Libvirt/Nova_config[libvirt/hw_machine_type]/ensure: created
Dec 15 02:58:19 localhost puppet-user[51614]: Notice: /Stage[main]/Nova::Compute::Libvirt/Nova_config[libvirt/enabled_perf_events]/ensure: created
Dec 15 02:58:19 localhost puppet-user[51614]: Notice: /Stage[main]/Nova::Compute::Libvirt/Nova_config[libvirt/rx_queue_size]/ensure: created
Dec 15 02:58:19 localhost puppet-user[51614]: Notice: /Stage[main]/Nova::Compute::Libvirt/Nova_config[libvirt/tx_queue_size]/ensure: created
Dec 15 02:58:19 localhost puppet-user[51614]: Notice: /Stage[main]/Nova::Compute::Libvirt/Nova_config[libvirt/file_backed_memory]/ensure: created
Dec 15 02:58:19 localhost puppet-user[51614]: Notice: /Stage[main]/Nova::Compute::Libvirt/Nova_config[libvirt/volume_use_multipath]/ensure: created
Dec 15 02:58:19 localhost puppet-user[51614]: Notice: /Stage[main]/Nova::Compute::Libvirt/Nova_config[libvirt/num_pcie_ports]/ensure: created
Dec 15 02:58:19 localhost puppet-user[51614]: Notice: /Stage[main]/Nova::Compute::Libvirt/Nova_config[libvirt/mem_stats_period_seconds]/ensure: created
Dec 15 02:58:19 localhost puppet-user[51614]: Notice: /Stage[main]/Nova::Compute::Libvirt/Nova_config[libvirt/pmem_namespaces]/ensure: created
Dec 15 02:58:19 localhost puppet-user[51614]: Notice: /Stage[main]/Nova::Compute::Libvirt/Nova_config[libvirt/swtpm_enabled]/ensure: created
Dec 15 02:58:19 localhost puppet-user[51614]: Notice: /Stage[main]/Nova::Compute::Libvirt/Nova_config[libvirt/cpu_model_extra_flags]/ensure: created
Dec 15 02:58:19 localhost puppet-user[51614]: Notice: /Stage[main]/Nova::Compute::Libvirt/Nova_config[libvirt/disk_cachemodes]/ensure: created
Dec 15 02:58:19 localhost puppet-user[51614]: Notice: /Stage[main]/Nova::Compute::Libvirt::Virtlogd/Virtlogd_config[log_filters]/ensure: created
Dec 15 02:58:19 localhost puppet-user[51614]: Notice: /Stage[main]/Nova::Compute::Libvirt::Virtlogd/Virtlogd_config[log_outputs]/ensure: created
Dec 15 02:58:19 localhost puppet-user[51614]: Notice: /Stage[main]/Nova::Compute::Libvirt::Virtproxyd/Virtproxyd_config[log_filters]/ensure: created
Dec 15 02:58:19 localhost puppet-user[51614]: Notice: /Stage[main]/Nova::Compute::Libvirt::Virtproxyd/Virtproxyd_config[log_outputs]/ensure: created
Dec 15 02:58:19 localhost puppet-user[51614]: Notice: /Stage[main]/Nova::Compute::Libvirt::Virtqemud/Virtqemud_config[log_filters]/ensure: created
Dec 15 02:58:19 localhost puppet-user[51614]: Notice: /Stage[main]/Nova::Compute::Libvirt::Virtqemud/Virtqemud_config[log_outputs]/ensure: created
Dec 15 02:58:19 localhost puppet-user[51614]: Notice: /Stage[main]/Nova::Compute::Libvirt::Virtnodedevd/Virtnodedevd_config[log_filters]/ensure: created
Dec 15 02:58:19 localhost puppet-user[51614]: Notice: /Stage[main]/Nova::Compute::Libvirt::Virtnodedevd/Virtnodedevd_config[log_outputs]/ensure: created
Dec 15 02:58:19 localhost puppet-user[51614]: Notice: /Stage[main]/Nova::Compute::Libvirt::Virtstoraged/Virtstoraged_config[log_filters]/ensure: created
Dec 15 02:58:19 localhost puppet-user[51614]: Notice: /Stage[main]/Nova::Compute::Libvirt::Virtstoraged/Virtstoraged_config[log_outputs]/ensure: created
Dec 15 02:58:19 localhost puppet-user[51614]: Notice: /Stage[main]/Nova::Compute::Libvirt::Virtsecretd/Virtsecretd_config[log_filters]/ensure: created
Dec 15 02:58:19 localhost puppet-user[51614]: Notice: /Stage[main]/Nova::Compute::Libvirt::Virtsecretd/Virtsecretd_config[log_outputs]/ensure: created
Dec 15 02:58:19 localhost puppet-user[51614]: Notice: /Stage[main]/Nova::Compute::Libvirt::Config/Virtnodedevd_config[unix_sock_group]/ensure: created
Dec 15 02:58:19 localhost puppet-user[51614]: Notice: /Stage[main]/Nova::Compute::Libvirt::Config/Virtnodedevd_config[auth_unix_ro]/ensure: created
Dec 15 02:58:19 localhost puppet-user[51614]: Notice: /Stage[main]/Nova::Compute::Libvirt::Config/Virtnodedevd_config[auth_unix_rw]/ensure: created
Dec 15 02:58:19 localhost puppet-user[51614]: Notice: /Stage[main]/Nova::Compute::Libvirt::Config/Virtnodedevd_config[unix_sock_ro_perms]/ensure: created
Dec 15 02:58:19 localhost puppet-user[51614]: Notice: /Stage[main]/Nova::Compute::Libvirt::Config/Virtnodedevd_config[unix_sock_rw_perms]/ensure: created
Dec 15 02:58:19 localhost puppet-user[51614]: Notice: /Stage[main]/Nova::Compute::Libvirt::Config/Virtproxyd_config[unix_sock_group]/ensure: created
Dec 15 02:58:19 localhost puppet-user[51614]: Notice: /Stage[main]/Nova::Compute::Libvirt::Config/Virtproxyd_config[auth_unix_ro]/ensure: created
Dec 15 02:58:19 localhost puppet-user[51614]: Notice: /Stage[main]/Nova::Compute::Libvirt::Config/Virtproxyd_config[auth_unix_rw]/ensure: created
Dec 15 02:58:19 localhost puppet-user[51614]: Notice: /Stage[main]/Nova::Compute::Libvirt::Config/Virtproxyd_config[unix_sock_ro_perms]/ensure: created
Dec 15 02:58:19 localhost puppet-user[51614]: Notice: /Stage[main]/Nova::Compute::Libvirt::Config/Virtproxyd_config[unix_sock_rw_perms]/ensure: created
Dec 15 02:58:19 localhost puppet-user[51614]: Notice: /Stage[main]/Nova::Compute::Libvirt::Config/Virtqemud_config[unix_sock_group]/ensure: created
Dec 15 02:58:19 localhost puppet-user[51614]: Notice: /Stage[main]/Nova::Compute::Libvirt::Config/Virtqemud_config[auth_unix_ro]/ensure: created
Dec 15 02:58:19 localhost puppet-user[51614]: Notice: /Stage[main]/Nova::Compute::Libvirt::Config/Virtqemud_config[auth_unix_rw]/ensure: created
Dec 15 02:58:19 localhost puppet-user[51614]: Notice: /Stage[main]/Nova::Compute::Libvirt::Config/Virtqemud_config[unix_sock_ro_perms]/ensure: created
Dec 15 02:58:19 localhost puppet-user[51614]: Notice: /Stage[main]/Nova::Compute::Libvirt::Config/Virtqemud_config[unix_sock_rw_perms]/ensure: created
Dec 15 02:58:19 localhost puppet-user[51614]: Notice: /Stage[main]/Nova::Compute::Libvirt::Config/Virtsecretd_config[unix_sock_group]/ensure: created
Dec 15 02:58:19 localhost puppet-user[51614]: Notice: /Stage[main]/Nova::Compute::Libvirt::Config/Virtsecretd_config[auth_unix_ro]/ensure: created
Dec 15 02:58:19 localhost puppet-user[51614]: Notice: /Stage[main]/Nova::Compute::Libvirt::Config/Virtsecretd_config[auth_unix_rw]/ensure: created
Dec 15 02:58:19 localhost puppet-user[51614]: Notice: /Stage[main]/Nova::Compute::Libvirt::Config/Virtsecretd_config[unix_sock_ro_perms]/ensure: created
Dec 15 02:58:19 localhost puppet-user[51614]: Notice: /Stage[main]/Nova::Compute::Libvirt::Config/Virtsecretd_config[unix_sock_rw_perms]/ensure: created
Dec 15 02:58:19 localhost puppet-user[51614]: Notice: /Stage[main]/Nova::Compute::Libvirt::Config/Virtstoraged_config[unix_sock_group]/ensure: created
Dec 15 02:58:19 localhost puppet-user[51614]: Notice: /Stage[main]/Nova::Compute::Libvirt::Config/Virtstoraged_config[auth_unix_ro]/ensure: created
Dec 15 02:58:19 localhost puppet-user[51614]: Notice: /Stage[main]/Nova::Compute::Libvirt::Config/Virtstoraged_config[auth_unix_rw]/ensure: created
Dec 15 02:58:19 localhost puppet-user[51614]: Notice: /Stage[main]/Nova::Compute::Libvirt::Config/Virtstoraged_config[unix_sock_ro_perms]/ensure: created
Dec 15 02:58:19 localhost puppet-user[51614]: Notice: /Stage[main]/Nova::Compute::Libvirt::Config/Virtstoraged_config[unix_sock_rw_perms]/ensure: created
Dec 15 02:58:19 localhost systemd[1]: libpod-17bca7f1ea4867411d6ff448cba20a6d46a34d6867615471aea7b38bf7c820f4.scope: Deactivated successfully.
Dec 15 02:58:19 localhost systemd[1]: libpod-17bca7f1ea4867411d6ff448cba20a6d46a34d6867615471aea7b38bf7c820f4.scope: Consumed 2.688s CPU time. Dec 15 02:58:19 localhost podman[52328]: 2025-12-15 07:58:19.63700105 +0000 UTC m=+3.040155768 container died 17bca7f1ea4867411d6ff448cba20a6d46a34d6867615471aea7b38bf7c820f4 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=container-puppet-ovn_controller, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, container_name=container-puppet-ovn_controller, summary=Red Hat OpenStack Platform 17.1 ovn-controller, url=https://www.redhat.com, version=17.1.12, managed_by=tripleo_ansible, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, vendor=Red Hat, Inc., name=rhosp17/openstack-ovn-controller, com.redhat.component=openstack-ovn-controller-container, description=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, maintainer=OpenStack TripleO Team, release=1761123044, build-date=2025-11-18T23:34:05Z, tcib_managed=true, batch=17.1_20251118.1, konflux.additional-tags=17.1.12 17.1_20251118.1, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, io.buildah.version=1.41.4, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005559462', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron,vs_config,exec', 'NAME': 'ovn_controller', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::neutron::agents::ovn\n'}, 'net': ['host'], 
'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/etc/sysconfig/modules:/etc/sysconfig/modules', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, config_id=tripleo_puppet_step1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, distribution-scope=public) Dec 15 02:58:19 localhost systemd[1]: tmp-crun.VU5KXP.mount: Deactivated successfully. Dec 15 02:58:19 localhost systemd[1]: var-lib-containers-storage-overlay-ea8b9b9472adf5d6d950d327572433ab4db7c804ff790266ce1ebb10d8ac319e-merged.mount: Deactivated successfully. Dec 15 02:58:19 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-adb686c846280011080046965b56fe97bcd01ac827087ebfa44d52fe93616ff5-userdata-shm.mount: Deactivated successfully. Dec 15 02:58:19 localhost puppet-user[51614]: Notice: /Stage[main]/Nova::Compute::Libvirt::Qemu/Augeas[qemu-conf-limits]/returns: executed successfully Dec 15 02:58:20 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-17bca7f1ea4867411d6ff448cba20a6d46a34d6867615471aea7b38bf7c820f4-userdata-shm.mount: Deactivated successfully. 
Dec 15 02:58:20 localhost systemd[1]: var-lib-containers-storage-overlay-d37fb345fe4e85b30ebfa0aa6234403fca41e6174e3eeb88a2e8baf1f4bd0d92-merged.mount: Deactivated successfully. Dec 15 02:58:20 localhost puppet-user[51614]: Notice: /Stage[main]/Nova::Migration::Qemu/Augeas[qemu-conf-migration-ports]/returns: executed successfully Dec 15 02:58:20 localhost puppet-user[51614]: Notice: /Stage[main]/Nova::Logging/Oslo::Log[nova_config]/Nova_config[DEFAULT/debug]/ensure: created Dec 15 02:58:20 localhost puppet-user[51614]: Notice: /Stage[main]/Nova::Logging/Oslo::Log[nova_config]/Nova_config[DEFAULT/log_dir]/ensure: created Dec 15 02:58:20 localhost puppet-user[51614]: Notice: /Stage[main]/Nova::Cache/Oslo::Cache[nova_config]/Nova_config[cache/backend]/ensure: created Dec 15 02:58:20 localhost puppet-user[51614]: Notice: /Stage[main]/Nova::Cache/Oslo::Cache[nova_config]/Nova_config[cache/enabled]/ensure: created Dec 15 02:58:20 localhost puppet-user[51614]: Notice: /Stage[main]/Nova::Cache/Oslo::Cache[nova_config]/Nova_config[cache/memcache_servers]/ensure: created Dec 15 02:58:20 localhost podman[52895]: 2025-12-15 07:58:20.948681877 +0000 UTC m=+1.294404039 container cleanup 17bca7f1ea4867411d6ff448cba20a6d46a34d6867615471aea7b38bf7c820f4 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=container-puppet-ovn_controller, release=1761123044, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 ovn-controller, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, batch=17.1_20251118.1, version=17.1.12, architecture=x86_64, container_name=container-puppet-ovn_controller, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, url=https://www.redhat.com, distribution-scope=public, maintainer=OpenStack TripleO Team, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, 
name=rhosp17/openstack-ovn-controller, build-date=2025-11-18T23:34:05Z, config_id=tripleo_puppet_step1, konflux.additional-tags=17.1.12 17.1_20251118.1, io.buildah.version=1.41.4, vendor=Red Hat, Inc., config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005559462', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron,vs_config,exec', 'NAME': 'ovn_controller', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::neutron::agents::ovn\n'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/etc/sysconfig/modules:/etc/sysconfig/modules', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, vcs-type=git, managed_by=tripleo_ansible, com.redhat.component=openstack-ovn-controller-container, description=Red Hat OpenStack Platform 17.1 ovn-controller, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272) Dec 15 02:58:20 localhost python3[51312]: ansible-tripleo_container_manage PODMAN-CONTAINER-DEBUG: podman run --name container-puppet-ovn_controller --conmon-pidfile /run/container-puppet-ovn_controller.pid --detach=False --entrypoint /var/lib/container-puppet/container-puppet.sh --env STEP=6 --env NET_HOST=true --env DEBUG=true --env HOSTNAME=np0005559462 --env NO_ARCHIVE= --env PUPPET_TAGS=file,file_line,concat,augeas,cron,vs_config,exec --env NAME=ovn_controller --env STEP_CONFIG=include ::tripleo::packages#012include tripleo::profile::base::neutron::agents::ovn#012 --label config_id=tripleo_puppet_step1 --label container_name=container-puppet-ovn_controller --label managed_by=tripleo_ansible --label config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005559462', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron,vs_config,exec', 'NAME': 'ovn_controller', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::neutron::agents::ovn\n'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/etc/sysconfig/modules:/etc/sysconfig/modules', '/lib/modules:/lib/modules:ro', 
'/run/openvswitch:/run/openvswitch:shared,z', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']} --log-driver k8s-file --log-opt path=/var/log/containers/stdouts/container-puppet-ovn_controller.log --network host --security-opt label=disable --user 0 --volume /dev/log:/dev/log:rw --volume /etc/hosts:/etc/hosts:ro --volume /etc/localtime:/etc/localtime:ro --volume /etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro --volume /etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro --volume /etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro --volume /etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro --volume /etc/puppet:/tmp/puppet-etc:ro --volume /etc/sysconfig/modules:/etc/sysconfig/modules --volume /lib/modules:/lib/modules:ro --volume /run/openvswitch:/run/openvswitch:shared,z --volume /usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro --volume /var/lib/config-data:/var/lib/config-data:rw --volume /var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro --volume /var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro --volume /var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1 Dec 15 02:58:20 localhost systemd[1]: libpod-conmon-17bca7f1ea4867411d6ff448cba20a6d46a34d6867615471aea7b38bf7c820f4.scope: Deactivated successfully. 
Dec 15 02:58:20 localhost podman[52427]: 2025-12-15 07:58:16.830552677 +0000 UTC m=+0.028992476 image pull registry.redhat.io/rhosp-rhel9/openstack-neutron-server:17.1 Dec 15 02:58:21 localhost puppet-user[51614]: Notice: /Stage[main]/Nova::Cache/Oslo::Cache[nova_config]/Nova_config[cache/tls_enabled]/ensure: created Dec 15 02:58:21 localhost puppet-user[51614]: Notice: /Stage[main]/Nova/Oslo::Messaging::Rabbit[nova_config]/Nova_config[oslo_messaging_rabbit/heartbeat_in_pthread]/ensure: created Dec 15 02:58:21 localhost puppet-user[51614]: Notice: /Stage[main]/Nova/Oslo::Messaging::Rabbit[nova_config]/Nova_config[oslo_messaging_rabbit/heartbeat_timeout_threshold]/ensure: created Dec 15 02:58:21 localhost puppet-user[51614]: Notice: /Stage[main]/Nova/Oslo::Messaging::Rabbit[nova_config]/Nova_config[oslo_messaging_rabbit/ssl]/ensure: created Dec 15 02:58:21 localhost podman[53126]: 2025-12-15 07:58:21.258896493 +0000 UTC m=+0.126428812 container create aa7843baae392ce5332cf47666421d9cbad18ab5a866dc9d29c1ada16ad7c2c2 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-server:17.1, name=container-puppet-neutron, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-server, architecture=x86_64, managed_by=tripleo_ansible, container_name=container-puppet-neutron, io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, config_id=tripleo_puppet_step1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, summary=Red Hat OpenStack Platform 17.1 neutron-server, vcs-type=git, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 neutron-server, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-server, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-neutron-server, distribution-scope=public, io.k8s.description=Red Hat 
OpenStack Platform 17.1 neutron-server, version=17.1.12, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., io.buildah.version=1.41.4, com.redhat.component=openstack-neutron-server-container, batch=17.1_20251118.1, release=1761123044, build-date=2025-11-19T00:23:27Z, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005559462', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron,neutron_config,ovn_metadata_agent_config', 'NAME': 'neutron', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::neutron::ovn_metadata\n'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-server:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}) Dec 15 02:58:21 localhost podman[53126]: 2025-12-15 07:58:21.164753353 +0000 UTC m=+0.032285712 image pull 
registry.redhat.io/rhosp-rhel9/openstack-neutron-server:17.1 Dec 15 02:58:21 localhost systemd[1]: Started libpod-conmon-aa7843baae392ce5332cf47666421d9cbad18ab5a866dc9d29c1ada16ad7c2c2.scope. Dec 15 02:58:21 localhost systemd[1]: Started libcrun container. Dec 15 02:58:21 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/3b50fc9e8c36464873157655c311974075d15360c3eaaa7584c489f72ca2e57a/merged/var/lib/config-data supports timestamps until 2038 (0x7fffffff) Dec 15 02:58:21 localhost podman[53126]: 2025-12-15 07:58:21.333868995 +0000 UTC m=+0.201401324 container init aa7843baae392ce5332cf47666421d9cbad18ab5a866dc9d29c1ada16ad7c2c2 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-server:17.1, name=container-puppet-neutron, build-date=2025-11-19T00:23:27Z, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, batch=17.1_20251118.1, tcib_managed=true, com.redhat.component=openstack-neutron-server-container, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 neutron-server, release=1761123044, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-server, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, maintainer=OpenStack TripleO Team, distribution-scope=public, vcs-type=git, io.buildah.version=1.41.4, name=rhosp17/openstack-neutron-server, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005559462', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron,neutron_config,ovn_metadata_agent_config', 'NAME': 'neutron', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::neutron::ovn_metadata\n'}, 
'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-server:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, container_name=container-puppet-neutron, url=https://www.redhat.com, version=17.1.12, konflux.additional-tags=17.1.12 17.1_20251118.1, vendor=Red Hat, Inc., config_id=tripleo_puppet_step1, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-server, description=Red Hat OpenStack Platform 17.1 neutron-server, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-server, managed_by=tripleo_ansible) Dec 15 02:58:21 localhost podman[53126]: 2025-12-15 07:58:21.343834575 +0000 UTC m=+0.211366884 container start aa7843baae392ce5332cf47666421d9cbad18ab5a866dc9d29c1ada16ad7c2c2 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-server:17.1, name=container-puppet-neutron, tcib_managed=true, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 
'HOSTNAME': 'np0005559462', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron,neutron_config,ovn_metadata_agent_config', 'NAME': 'neutron', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::neutron::ovn_metadata\n'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-server:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, batch=17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-19T00:23:27Z, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-server, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 neutron-server, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.component=openstack-neutron-server-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-server, config_id=tripleo_puppet_step1, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, distribution-scope=public, name=rhosp17/openstack-neutron-server, release=1761123044, url=https://www.redhat.com, vcs-type=git, 
container_name=container-puppet-neutron, io.buildah.version=1.41.4, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 neutron-server, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-server, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, vendor=Red Hat, Inc., version=17.1.12, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI) Dec 15 02:58:21 localhost podman[53126]: 2025-12-15 07:58:21.345131619 +0000 UTC m=+0.212663938 container attach aa7843baae392ce5332cf47666421d9cbad18ab5a866dc9d29c1ada16ad7c2c2 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-server:17.1, name=container-puppet-neutron, url=https://www.redhat.com, name=rhosp17/openstack-neutron-server, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-server, managed_by=tripleo_ansible, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-type=git, distribution-scope=public, build-date=2025-11-19T00:23:27Z, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, tcib_managed=true, container_name=container-puppet-neutron, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 neutron-server, maintainer=OpenStack TripleO Team, config_id=tripleo_puppet_step1, io.buildah.version=1.41.4, summary=Red Hat OpenStack Platform 17.1 neutron-server, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-server, com.redhat.component=openstack-neutron-server-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-server, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, version=17.1.12, konflux.additional-tags=17.1.12 17.1_20251118.1, batch=17.1_20251118.1, release=1761123044, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., 
config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005559462', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron,neutron_config,ovn_metadata_agent_config', 'NAME': 'neutron', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::neutron::ovn_metadata\n'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-server:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, io.openshift.expose-services=) Dec 15 02:58:21 localhost puppet-user[51614]: Notice: /Stage[main]/Nova/Oslo::Messaging::Default[nova_config]/Nova_config[DEFAULT/transport_url]/ensure: created Dec 15 02:58:21 localhost puppet-user[51614]: Notice: /Stage[main]/Nova/Oslo::Messaging::Notifications[nova_config]/Nova_config[oslo_messaging_notifications/driver]/ensure: created Dec 15 02:58:21 localhost puppet-user[51614]: Notice: /Stage[main]/Nova/Oslo::Messaging::Notifications[nova_config]/Nova_config[oslo_messaging_notifications/transport_url]/ensure: created Dec 15 
02:58:21 localhost puppet-user[51614]: Notice: /Stage[main]/Nova/Oslo::Concurrency[nova_config]/Nova_config[oslo_concurrency/lock_path]/ensure: created Dec 15 02:58:21 localhost puppet-user[51614]: Notice: /Stage[main]/Nova::Keystone::Service_user/Keystone::Resource::Service_user[nova_config]/Nova_config[service_user/auth_type]/ensure: created Dec 15 02:58:21 localhost puppet-user[51614]: Notice: /Stage[main]/Nova::Keystone::Service_user/Keystone::Resource::Service_user[nova_config]/Nova_config[service_user/region_name]/ensure: created Dec 15 02:58:21 localhost puppet-user[51614]: Notice: /Stage[main]/Nova::Keystone::Service_user/Keystone::Resource::Service_user[nova_config]/Nova_config[service_user/auth_url]/ensure: created Dec 15 02:58:21 localhost puppet-user[51614]: Notice: /Stage[main]/Nova::Keystone::Service_user/Keystone::Resource::Service_user[nova_config]/Nova_config[service_user/username]/ensure: created Dec 15 02:58:21 localhost puppet-user[51614]: Notice: /Stage[main]/Nova::Keystone::Service_user/Keystone::Resource::Service_user[nova_config]/Nova_config[service_user/password]/ensure: created Dec 15 02:58:21 localhost puppet-user[51614]: Notice: /Stage[main]/Nova::Keystone::Service_user/Keystone::Resource::Service_user[nova_config]/Nova_config[service_user/user_domain_name]/ensure: created Dec 15 02:58:21 localhost puppet-user[51614]: Notice: /Stage[main]/Nova::Keystone::Service_user/Keystone::Resource::Service_user[nova_config]/Nova_config[service_user/project_name]/ensure: created Dec 15 02:58:21 localhost puppet-user[51614]: Notice: /Stage[main]/Nova::Keystone::Service_user/Keystone::Resource::Service_user[nova_config]/Nova_config[service_user/project_domain_name]/ensure: created Dec 15 02:58:21 localhost puppet-user[51614]: Notice: /Stage[main]/Nova::Keystone::Service_user/Keystone::Resource::Service_user[nova_config]/Nova_config[service_user/send_service_user_token]/ensure: created Dec 15 02:58:21 localhost puppet-user[51614]: Notice: 
/Stage[main]/Ssh::Server::Config/Concat[/etc/ssh/sshd_config]/File[/etc/ssh/sshd_config]/ensure: defined content as '{sha256}3ccd56cc76ec60fa08fd698d282c9c89b1e8c485a00f47d57569ed8f6f8a16e4' Dec 15 02:58:21 localhost puppet-user[51614]: Notice: Applied catalog in 4.61 seconds Dec 15 02:58:21 localhost puppet-user[51614]: Application: Dec 15 02:58:21 localhost puppet-user[51614]: Initial environment: production Dec 15 02:58:21 localhost puppet-user[51614]: Converged environment: production Dec 15 02:58:21 localhost puppet-user[51614]: Run mode: user Dec 15 02:58:21 localhost puppet-user[51614]: Changes: Dec 15 02:58:21 localhost puppet-user[51614]: Total: 183 Dec 15 02:58:21 localhost puppet-user[51614]: Events: Dec 15 02:58:21 localhost puppet-user[51614]: Success: 183 Dec 15 02:58:21 localhost puppet-user[51614]: Total: 183 Dec 15 02:58:21 localhost puppet-user[51614]: Resources: Dec 15 02:58:21 localhost puppet-user[51614]: Changed: 183 Dec 15 02:58:21 localhost puppet-user[51614]: Out of sync: 183 Dec 15 02:58:21 localhost puppet-user[51614]: Skipped: 57 Dec 15 02:58:21 localhost puppet-user[51614]: Total: 487 Dec 15 02:58:21 localhost puppet-user[51614]: Time: Dec 15 02:58:21 localhost puppet-user[51614]: Concat fragment: 0.00 Dec 15 02:58:21 localhost puppet-user[51614]: Anchor: 0.00 Dec 15 02:58:21 localhost puppet-user[51614]: File line: 0.00 Dec 15 02:58:21 localhost puppet-user[51614]: Virtlogd config: 0.00 Dec 15 02:58:21 localhost puppet-user[51614]: Package: 0.01 Dec 15 02:58:21 localhost puppet-user[51614]: Virtstoraged config: 0.01 Dec 15 02:58:21 localhost puppet-user[51614]: Virtqemud config: 0.01 Dec 15 02:58:21 localhost puppet-user[51614]: Virtsecretd config: 0.01 Dec 15 02:58:21 localhost puppet-user[51614]: Virtnodedevd config: 0.01 Dec 15 02:58:21 localhost puppet-user[51614]: Exec: 0.02 Dec 15 02:58:21 localhost puppet-user[51614]: File: 0.03 Dec 15 02:58:21 localhost puppet-user[51614]: Virtproxyd config: 0.03 Dec 15 02:58:21 localhost 
puppet-user[51614]: Augeas: 1.11 Dec 15 02:58:21 localhost puppet-user[51614]: Config retrieval: 1.57 Dec 15 02:58:21 localhost puppet-user[51614]: Last run: 1765785501 Dec 15 02:58:21 localhost puppet-user[51614]: Nova config: 3.14 Dec 15 02:58:21 localhost puppet-user[51614]: Transaction evaluation: 4.60 Dec 15 02:58:21 localhost puppet-user[51614]: Catalog application: 4.61 Dec 15 02:58:21 localhost puppet-user[51614]: Resources: 0.00 Dec 15 02:58:21 localhost puppet-user[51614]: Concat file: 0.00 Dec 15 02:58:21 localhost puppet-user[51614]: Total: 4.61 Dec 15 02:58:21 localhost puppet-user[51614]: Version: Dec 15 02:58:21 localhost puppet-user[51614]: Config: 1765785495 Dec 15 02:58:21 localhost puppet-user[51614]: Puppet: 7.10.0 Dec 15 02:58:22 localhost systemd[1]: libpod-78945b55310ce128de7c360c749c4caec3fd3259feb54fbae2253ebadd2f6d1a.scope: Deactivated successfully. Dec 15 02:58:22 localhost systemd[1]: libpod-78945b55310ce128de7c360c749c4caec3fd3259feb54fbae2253ebadd2f6d1a.scope: Consumed 8.510s CPU time. 
Dec 15 02:58:22 localhost podman[51486]: 2025-12-15 07:58:22.68305151 +0000 UTC m=+10.408657388 container died 78945b55310ce128de7c360c749c4caec3fd3259feb54fbae2253ebadd2f6d1a (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=container-puppet-nova_libvirt, vcs-type=git, konflux.additional-tags=17.1.12 17.1_20251118.1, build-date=2025-11-19T00:35:22Z, container_name=container-puppet-nova_libvirt, batch=17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, release=1761123044, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-nova-libvirt-container, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 nova-libvirt, managed_by=tripleo_ansible, architecture=x86_64, version=17.1.12, io.buildah.version=1.41.4, tcib_managed=true, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, config_id=tripleo_puppet_step1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005559462', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron,nova_config,libvirtd_config,virtlogd_config,virtproxyd_config,virtqemud_config,virtnodedevd_config,virtsecretd_config,virtstoraged_config,nova_config,file,libvirt_tls_password,libvirtd_config,nova_config,file,libvirt_tls_password', 'NAME': 'nova_libvirt', 'STEP_CONFIG': "include ::tripleo::packages\n# TODO(emilien): figure how to deal with libvirt profile.\n# We'll probably treat it like we do with Neutron plugins.\n# Until then, just 
include it in the default nova-compute role.\ninclude tripleo::profile::base::nova::compute::libvirt\n\ninclude tripleo::profile::base::nova::libvirt\n\ninclude tripleo::profile::base::nova::compute::libvirt_guests\n\ninclude tripleo::profile::base::sshd\ninclude tripleo::profile::base::nova::migration::target"}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, url=https://www.redhat.com, name=rhosp17/openstack-nova-libvirt, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05) Dec 15 02:58:22 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-78945b55310ce128de7c360c749c4caec3fd3259feb54fbae2253ebadd2f6d1a-userdata-shm.mount: Deactivated successfully. Dec 15 02:58:22 localhost systemd[1]: var-lib-containers-storage-overlay-fb7bf143538b6ac117a23468fe01aaa41031d6332839335f81ce26797b9a28dd-merged.mount: Deactivated successfully. 
Dec 15 02:58:22 localhost podman[53197]: 2025-12-15 07:58:22.851369642 +0000 UTC m=+0.161662430 container cleanup 78945b55310ce128de7c360c749c4caec3fd3259feb54fbae2253ebadd2f6d1a (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=container-puppet-nova_libvirt, com.redhat.component=openstack-nova-libvirt-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, container_name=container-puppet-nova_libvirt, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, name=rhosp17/openstack-nova-libvirt, version=17.1.12, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005559462', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron,nova_config,libvirtd_config,virtlogd_config,virtproxyd_config,virtqemud_config,virtnodedevd_config,virtsecretd_config,virtstoraged_config,nova_config,file,libvirt_tls_password,libvirtd_config,nova_config,file,libvirt_tls_password', 'NAME': 'nova_libvirt', 'STEP_CONFIG': "include ::tripleo::packages\n# TODO(emilien): figure how to deal with libvirt profile.\n# We'll probably treat it like we do with Neutron plugins.\n# Until then, just include it in the default nova-compute role.\ninclude tripleo::profile::base::nova::compute::libvirt\n\ninclude tripleo::profile::base::nova::libvirt\n\ninclude tripleo::profile::base::nova::compute::libvirt_guests\n\ninclude tripleo::profile::base::sshd\ninclude tripleo::profile::base::nova::migration::target"}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, config_id=tripleo_puppet_step1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, architecture=x86_64, url=https://www.redhat.com, vcs-type=git, io.openshift.expose-services=, distribution-scope=public, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, release=1761123044, build-date=2025-11-19T00:35:22Z, description=Red Hat OpenStack Platform 17.1 nova-libvirt, vendor=Red Hat, Inc., konflux.additional-tags=17.1.12 17.1_20251118.1, batch=17.1_20251118.1, io.buildah.version=1.41.4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, tcib_managed=true, managed_by=tripleo_ansible, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d) Dec 15 02:58:22 localhost systemd[1]: libpod-conmon-78945b55310ce128de7c360c749c4caec3fd3259feb54fbae2253ebadd2f6d1a.scope: Deactivated successfully. 
Dec 15 02:58:22 localhost python3[51312]: ansible-tripleo_container_manage PODMAN-CONTAINER-DEBUG: podman run --name container-puppet-nova_libvirt --conmon-pidfile /run/container-puppet-nova_libvirt.pid --detach=False --entrypoint /var/lib/container-puppet/container-puppet.sh --env STEP=6 --env NET_HOST=true --env DEBUG=true --env HOSTNAME=np0005559462 --env NO_ARCHIVE= --env PUPPET_TAGS=file,file_line,concat,augeas,cron,nova_config,libvirtd_config,virtlogd_config,virtproxyd_config,virtqemud_config,virtnodedevd_config,virtsecretd_config,virtstoraged_config,nova_config,file,libvirt_tls_password,libvirtd_config,nova_config,file,libvirt_tls_password --env NAME=nova_libvirt --env STEP_CONFIG=include ::tripleo::packages#012# TODO(emilien): figure how to deal with libvirt profile.#012# We'll probably treat it like we do with Neutron plugins.#012# Until then, just include it in the default nova-compute role.#012include tripleo::profile::base::nova::compute::libvirt#012#012include tripleo::profile::base::nova::libvirt#012#012include tripleo::profile::base::nova::compute::libvirt_guests#012#012include tripleo::profile::base::sshd#012include tripleo::profile::base::nova::migration::target --label config_id=tripleo_puppet_step1 --label container_name=container-puppet-nova_libvirt --label managed_by=tripleo_ansible --label config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005559462', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron,nova_config,libvirtd_config,virtlogd_config,virtproxyd_config,virtqemud_config,virtnodedevd_config,virtsecretd_config,virtstoraged_config,nova_config,file,libvirt_tls_password,libvirtd_config,nova_config,file,libvirt_tls_password', 'NAME': 'nova_libvirt', 'STEP_CONFIG': "include ::tripleo::packages\n# TODO(emilien): figure how to deal with libvirt 
profile.\n# We'll probably treat it like we do with Neutron plugins.\n# Until then, just include it in the default nova-compute role.\ninclude tripleo::profile::base::nova::compute::libvirt\n\ninclude tripleo::profile::base::nova::libvirt\n\ninclude tripleo::profile::base::nova::compute::libvirt_guests\n\ninclude tripleo::profile::base::sshd\ninclude tripleo::profile::base::nova::migration::target"}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']} --log-driver k8s-file --log-opt path=/var/log/containers/stdouts/container-puppet-nova_libvirt.log --network host --security-opt label=disable --user 0 --volume /dev/log:/dev/log:rw --volume /etc/hosts:/etc/hosts:ro --volume /etc/localtime:/etc/localtime:ro --volume /etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro --volume /etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro --volume /etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro --volume /etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro --volume /etc/puppet:/tmp/puppet-etc:ro --volume /usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro --volume /var/lib/config-data:/var/lib/config-data:rw 
--volume /var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro --volume /var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro --volume /var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1 Dec 15 02:58:23 localhost puppet-user[53157]: Error: Facter: error while resolving custom fact "haproxy_version": undefined method `strip' for nil:NilClass Dec 15 02:58:23 localhost puppet-user[53157]: Warning: /etc/puppet/hiera.yaml: Use of 'hiera.yaml' version 3 is deprecated. It should be converted to version 5 Dec 15 02:58:23 localhost puppet-user[53157]: (file: /etc/puppet/hiera.yaml) Dec 15 02:58:23 localhost puppet-user[53157]: Warning: Undefined variable '::deploy_config_name'; Dec 15 02:58:23 localhost puppet-user[53157]: (file & line not available) Dec 15 02:58:23 localhost puppet-user[53157]: Warning: The function 'hiera' is deprecated in favor of using 'lookup'. See https://puppet.com/docs/puppet/7.10/deprecated_language.html Dec 15 02:58:23 localhost puppet-user[53157]: (file & line not available) Dec 15 02:58:23 localhost puppet-user[53157]: Warning: Unknown variable: 'dhcp_agents_per_net'. 
(file: /etc/puppet/modules/tripleo/manifests/profile/base/neutron.pp, line: 154, column: 37) Dec 15 02:58:23 localhost puppet-user[53157]: Notice: Compiled catalog for np0005559462.localdomain in environment production in 0.59 seconds Dec 15 02:58:24 localhost puppet-user[53157]: Notice: /Stage[main]/Neutron/Neutron_config[DEFAULT/auth_strategy]/ensure: created Dec 15 02:58:24 localhost puppet-user[53157]: Notice: /Stage[main]/Neutron/Neutron_config[DEFAULT/core_plugin]/ensure: created Dec 15 02:58:24 localhost puppet-user[53157]: Notice: /Stage[main]/Neutron/Neutron_config[DEFAULT/host]/ensure: created Dec 15 02:58:24 localhost puppet-user[53157]: Notice: /Stage[main]/Neutron/Neutron_config[DEFAULT/dns_domain]/ensure: created Dec 15 02:58:24 localhost puppet-user[53157]: Notice: /Stage[main]/Neutron/Neutron_config[DEFAULT/dhcp_agent_notification]/ensure: created Dec 15 02:58:24 localhost puppet-user[53157]: Notice: /Stage[main]/Neutron/Neutron_config[DEFAULT/allow_overlapping_ips]/ensure: created Dec 15 02:58:24 localhost puppet-user[53157]: Notice: /Stage[main]/Neutron/Neutron_config[DEFAULT/global_physnet_mtu]/ensure: created Dec 15 02:58:24 localhost puppet-user[53157]: Notice: /Stage[main]/Neutron/Neutron_config[DEFAULT/vlan_transparent]/ensure: created Dec 15 02:58:24 localhost puppet-user[53157]: Notice: /Stage[main]/Neutron/Neutron_config[agent/root_helper]/ensure: created Dec 15 02:58:24 localhost puppet-user[53157]: Notice: /Stage[main]/Neutron/Neutron_config[agent/report_interval]/ensure: created Dec 15 02:58:24 localhost puppet-user[53157]: Notice: /Stage[main]/Neutron/Neutron_config[DEFAULT/service_plugins]/ensure: created Dec 15 02:58:24 localhost puppet-user[53157]: Notice: /Stage[main]/Neutron::Agents::Ovn_metadata/Ovn_metadata_agent_config[DEFAULT/debug]/ensure: created Dec 15 02:58:24 localhost puppet-user[53157]: Notice: /Stage[main]/Neutron::Agents::Ovn_metadata/Ovn_metadata_agent_config[DEFAULT/nova_metadata_host]/ensure: created Dec 15 
02:58:24 localhost puppet-user[53157]: Notice: /Stage[main]/Neutron::Agents::Ovn_metadata/Ovn_metadata_agent_config[DEFAULT/nova_metadata_protocol]/ensure: created Dec 15 02:58:24 localhost puppet-user[53157]: Notice: /Stage[main]/Neutron::Agents::Ovn_metadata/Ovn_metadata_agent_config[DEFAULT/metadata_proxy_shared_secret]/ensure: created Dec 15 02:58:24 localhost puppet-user[53157]: Notice: /Stage[main]/Neutron::Agents::Ovn_metadata/Ovn_metadata_agent_config[DEFAULT/metadata_workers]/ensure: created Dec 15 02:58:24 localhost puppet-user[53157]: Notice: /Stage[main]/Neutron::Agents::Ovn_metadata/Ovn_metadata_agent_config[DEFAULT/state_path]/ensure: created Dec 15 02:58:24 localhost puppet-user[53157]: Notice: /Stage[main]/Neutron::Agents::Ovn_metadata/Ovn_metadata_agent_config[DEFAULT/hwol_qos_enabled]/ensure: created Dec 15 02:58:24 localhost puppet-user[53157]: Notice: /Stage[main]/Neutron::Agents::Ovn_metadata/Ovn_metadata_agent_config[agent/root_helper]/ensure: created Dec 15 02:58:24 localhost puppet-user[53157]: Notice: /Stage[main]/Neutron::Agents::Ovn_metadata/Ovn_metadata_agent_config[ovs/ovsdb_connection]/ensure: created Dec 15 02:58:24 localhost puppet-user[53157]: Notice: /Stage[main]/Neutron::Agents::Ovn_metadata/Ovn_metadata_agent_config[ovs/ovsdb_connection_timeout]/ensure: created Dec 15 02:58:24 localhost puppet-user[53157]: Notice: /Stage[main]/Neutron::Agents::Ovn_metadata/Ovn_metadata_agent_config[ovn/ovsdb_probe_interval]/ensure: created Dec 15 02:58:24 localhost puppet-user[53157]: Notice: /Stage[main]/Neutron::Agents::Ovn_metadata/Ovn_metadata_agent_config[ovn/ovn_nb_connection]/ensure: created Dec 15 02:58:24 localhost puppet-user[53157]: Notice: /Stage[main]/Neutron::Agents::Ovn_metadata/Ovn_metadata_agent_config[ovn/ovn_sb_connection]/ensure: created Dec 15 02:58:24 localhost puppet-user[53157]: Notice: /Stage[main]/Neutron/Oslo::Messaging::Default[neutron_config]/Neutron_config[DEFAULT/transport_url]/ensure: created Dec 15 02:58:24 
localhost puppet-user[53157]: Notice: /Stage[main]/Neutron/Oslo::Messaging::Default[neutron_config]/Neutron_config[DEFAULT/control_exchange]/ensure: created Dec 15 02:58:24 localhost puppet-user[53157]: Notice: /Stage[main]/Neutron/Oslo::Concurrency[neutron_config]/Neutron_config[oslo_concurrency/lock_path]/ensure: created Dec 15 02:58:24 localhost puppet-user[53157]: Notice: /Stage[main]/Neutron/Oslo::Messaging::Notifications[neutron_config]/Neutron_config[oslo_messaging_notifications/driver]/ensure: created Dec 15 02:58:24 localhost puppet-user[53157]: Notice: /Stage[main]/Neutron/Oslo::Messaging::Notifications[neutron_config]/Neutron_config[oslo_messaging_notifications/transport_url]/ensure: created Dec 15 02:58:24 localhost puppet-user[53157]: Notice: /Stage[main]/Neutron/Oslo::Messaging::Rabbit[neutron_config]/Neutron_config[oslo_messaging_rabbit/heartbeat_in_pthread]/ensure: created Dec 15 02:58:24 localhost puppet-user[53157]: Notice: /Stage[main]/Neutron/Oslo::Messaging::Rabbit[neutron_config]/Neutron_config[oslo_messaging_rabbit/heartbeat_timeout_threshold]/ensure: created Dec 15 02:58:24 localhost puppet-user[53157]: Notice: /Stage[main]/Neutron::Logging/Oslo::Log[neutron_config]/Neutron_config[DEFAULT/debug]/ensure: created Dec 15 02:58:24 localhost puppet-user[53157]: Notice: /Stage[main]/Neutron::Logging/Oslo::Log[neutron_config]/Neutron_config[DEFAULT/log_dir]/ensure: created Dec 15 02:58:24 localhost puppet-user[53157]: Notice: Applied catalog in 0.43 seconds Dec 15 02:58:24 localhost puppet-user[53157]: Application: Dec 15 02:58:24 localhost puppet-user[53157]: Initial environment: production Dec 15 02:58:24 localhost puppet-user[53157]: Converged environment: production Dec 15 02:58:24 localhost puppet-user[53157]: Run mode: user Dec 15 02:58:24 localhost puppet-user[53157]: Changes: Dec 15 02:58:24 localhost puppet-user[53157]: Total: 33 Dec 15 02:58:24 localhost puppet-user[53157]: Events: Dec 15 02:58:24 localhost puppet-user[53157]: Success: 33 
Dec 15 02:58:24 localhost puppet-user[53157]: Total: 33 Dec 15 02:58:24 localhost puppet-user[53157]: Resources: Dec 15 02:58:24 localhost puppet-user[53157]: Skipped: 21 Dec 15 02:58:24 localhost puppet-user[53157]: Changed: 33 Dec 15 02:58:24 localhost puppet-user[53157]: Out of sync: 33 Dec 15 02:58:24 localhost puppet-user[53157]: Total: 155 Dec 15 02:58:24 localhost puppet-user[53157]: Time: Dec 15 02:58:24 localhost puppet-user[53157]: Resources: 0.00 Dec 15 02:58:24 localhost puppet-user[53157]: Ovn metadata agent config: 0.01 Dec 15 02:58:24 localhost puppet-user[53157]: Neutron config: 0.35 Dec 15 02:58:24 localhost puppet-user[53157]: Transaction evaluation: 0.42 Dec 15 02:58:24 localhost puppet-user[53157]: Catalog application: 0.43 Dec 15 02:58:24 localhost puppet-user[53157]: Config retrieval: 0.66 Dec 15 02:58:24 localhost puppet-user[53157]: Last run: 1765785504 Dec 15 02:58:24 localhost puppet-user[53157]: Total: 0.43 Dec 15 02:58:24 localhost puppet-user[53157]: Version: Dec 15 02:58:24 localhost puppet-user[53157]: Config: 1765785503 Dec 15 02:58:24 localhost puppet-user[53157]: Puppet: 7.10.0 Dec 15 02:58:24 localhost systemd[1]: libpod-aa7843baae392ce5332cf47666421d9cbad18ab5a866dc9d29c1ada16ad7c2c2.scope: Deactivated successfully. Dec 15 02:58:24 localhost systemd[1]: libpod-aa7843baae392ce5332cf47666421d9cbad18ab5a866dc9d29c1ada16ad7c2c2.scope: Consumed 3.507s CPU time. 
Dec 15 02:58:24 localhost podman[53126]: 2025-12-15 07:58:24.909332808 +0000 UTC m=+3.776865127 container died aa7843baae392ce5332cf47666421d9cbad18ab5a866dc9d29c1ada16ad7c2c2 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-server:17.1, name=container-puppet-neutron, batch=17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.buildah.version=1.41.4, com.redhat.component=openstack-neutron-server-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-server, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005559462', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron,neutron_config,ovn_metadata_agent_config', 'NAME': 'neutron', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::neutron::ovn_metadata\n'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-server:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', 
'/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-server, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, config_id=tripleo_puppet_step1, vcs-type=git, tcib_managed=true, release=1761123044, io.openshift.expose-services=, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-server, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-neutron-server, url=https://www.redhat.com, konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible, build-date=2025-11-19T00:23:27Z, description=Red Hat OpenStack Platform 17.1 neutron-server, summary=Red Hat OpenStack Platform 17.1 neutron-server, vendor=Red Hat, Inc., container_name=container-puppet-neutron, architecture=x86_64, version=17.1.12) Dec 15 02:58:24 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-aa7843baae392ce5332cf47666421d9cbad18ab5a866dc9d29c1ada16ad7c2c2-userdata-shm.mount: Deactivated successfully. Dec 15 02:58:24 localhost systemd[1]: var-lib-containers-storage-overlay-3b50fc9e8c36464873157655c311974075d15360c3eaaa7584c489f72ca2e57a-merged.mount: Deactivated successfully. 
Dec 15 02:58:25 localhost podman[53340]: 2025-12-15 07:58:25.019880777 +0000 UTC m=+0.104881663 container cleanup aa7843baae392ce5332cf47666421d9cbad18ab5a866dc9d29c1ada16ad7c2c2 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-server:17.1, name=container-puppet-neutron, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-19T00:23:27Z, name=rhosp17/openstack-neutron-server, com.redhat.component=openstack-neutron-server-container, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005559462', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron,neutron_config,ovn_metadata_agent_config', 'NAME': 'neutron', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::neutron::ovn_metadata\n'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-server:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, url=https://www.redhat.com, 
io.buildah.version=1.41.4, batch=17.1_20251118.1, config_id=tripleo_puppet_step1, description=Red Hat OpenStack Platform 17.1 neutron-server, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-server, version=17.1.12, container_name=container-puppet-neutron, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-server, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 neutron-server, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-server, tcib_managed=true, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, vcs-type=git, architecture=x86_64, distribution-scope=public, release=1761123044, io.openshift.expose-services=, maintainer=OpenStack TripleO Team) Dec 15 02:58:25 localhost systemd[1]: libpod-conmon-aa7843baae392ce5332cf47666421d9cbad18ab5a866dc9d29c1ada16ad7c2c2.scope: Deactivated successfully. 
Dec 15 02:58:25 localhost python3[51312]: ansible-tripleo_container_manage PODMAN-CONTAINER-DEBUG: podman run --name container-puppet-neutron --conmon-pidfile /run/container-puppet-neutron.pid --detach=False --entrypoint /var/lib/container-puppet/container-puppet.sh --env STEP=6 --env NET_HOST=true --env DEBUG=true --env HOSTNAME=np0005559462 --env NO_ARCHIVE= --env PUPPET_TAGS=file,file_line,concat,augeas,cron,neutron_config,ovn_metadata_agent_config --env NAME=neutron --env STEP_CONFIG=include ::tripleo::packages#012include tripleo::profile::base::neutron::ovn_metadata#012 --label config_id=tripleo_puppet_step1 --label container_name=container-puppet-neutron --label managed_by=tripleo_ansible --label config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005559462', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron,neutron_config,ovn_metadata_agent_config', 'NAME': 'neutron', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::neutron::ovn_metadata\n'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-server:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', 
'/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']} --log-driver k8s-file --log-opt path=/var/log/containers/stdouts/container-puppet-neutron.log --network host --security-opt label=disable --user 0 --volume /dev/log:/dev/log:rw --volume /etc/hosts:/etc/hosts:ro --volume /etc/localtime:/etc/localtime:ro --volume /etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro --volume /etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro --volume /etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro --volume /etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro --volume /etc/puppet:/tmp/puppet-etc:ro --volume /lib/modules:/lib/modules:ro --volume /run/openvswitch:/run/openvswitch:shared,z --volume /usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro --volume /var/lib/config-data:/var/lib/config-data:rw --volume /var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro --volume /var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro --volume /var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro registry.redhat.io/rhosp-rhel9/openstack-neutron-server:17.1 Dec 15 02:58:26 localhost python3[53392]: ansible-file Invoked with path=/var/log/containers/stdouts state=directory owner=root group=root recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None setype=None attributes=None Dec 15 02:58:26 localhost python3[53424]: ansible-stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1 Dec 15 02:58:27 localhost python3[53474]: ansible-ansible.legacy.stat 
Invoked with path=/usr/libexec/tripleo-container-shutdown follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True Dec 15 02:58:27 localhost python3[53517]: ansible-ansible.legacy.copy Invoked with src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1765785507.2542653-83427-126206875713090/source dest=/usr/libexec/tripleo-container-shutdown mode=0700 owner=root group=root _original_basename=tripleo-container-shutdown follow=False checksum=7d67b1986212f5548057505748cd74cfcf9c0d35 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None Dec 15 02:58:28 localhost python3[53579]: ansible-ansible.legacy.stat Invoked with path=/usr/libexec/tripleo-start-podman-container follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True Dec 15 02:58:28 localhost python3[53622]: ansible-ansible.legacy.copy Invoked with src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1765785508.0959373-83427-149658441247260/source dest=/usr/libexec/tripleo-start-podman-container mode=0700 owner=root group=root _original_basename=tripleo-start-podman-container follow=False checksum=536965633b8d3b1ce794269ffb07be0105a560a0 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None Dec 15 02:58:29 localhost python3[53684]: ansible-ansible.legacy.stat Invoked with path=/usr/lib/systemd/system/tripleo-container-shutdown.service follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True Dec 15 02:58:29 localhost python3[53727]: ansible-ansible.legacy.copy Invoked with src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1765785509.0298736-83487-10976110708049/source 
dest=/usr/lib/systemd/system/tripleo-container-shutdown.service mode=0644 owner=root group=root _original_basename=tripleo-container-shutdown-service follow=False checksum=66c1d41406ba8714feb9ed0a35259a7a57ef9707 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None Dec 15 02:58:30 localhost python3[53789]: ansible-ansible.legacy.stat Invoked with path=/usr/lib/systemd/system-preset/91-tripleo-container-shutdown.preset follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True Dec 15 02:58:30 localhost python3[53832]: ansible-ansible.legacy.copy Invoked with src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1765785509.906013-83514-228449688465329/source dest=/usr/lib/systemd/system-preset/91-tripleo-container-shutdown.preset mode=0644 owner=root group=root _original_basename=91-tripleo-container-shutdown-preset follow=False checksum=bccb1207dcbcfaa5ca05f83c8f36ce4c2460f081 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None Dec 15 02:58:31 localhost python3[53862]: ansible-systemd Invoked with name=tripleo-container-shutdown state=started enabled=True daemon_reload=True daemon_reexec=False scope=system no_block=False force=None masked=None Dec 15 02:58:31 localhost systemd[1]: Reloading. Dec 15 02:58:31 localhost systemd-rc-local-generator[53879]: /etc/rc.d/rc.local is not marked executable, skipping. Dec 15 02:58:31 localhost systemd-sysv-generator[53882]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. 
Dec 15 02:58:31 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Dec 15 02:58:31 localhost systemd[1]: Reloading. Dec 15 02:58:31 localhost systemd-sysv-generator[53931]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. Dec 15 02:58:31 localhost systemd-rc-local-generator[53928]: /etc/rc.d/rc.local is not marked executable, skipping. Dec 15 02:58:31 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Dec 15 02:58:31 localhost systemd[1]: Starting TripleO Container Shutdown... Dec 15 02:58:31 localhost systemd[1]: Finished TripleO Container Shutdown. Dec 15 02:58:32 localhost python3[53986]: ansible-ansible.legacy.stat Invoked with path=/usr/lib/systemd/system/netns-placeholder.service follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True Dec 15 02:58:32 localhost python3[54029]: ansible-ansible.legacy.copy Invoked with src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1765785511.893064-83648-61477707831107/source dest=/usr/lib/systemd/system/netns-placeholder.service mode=0644 owner=root group=root _original_basename=netns-placeholder-service follow=False checksum=8e9c6d5ce3a6e7f71c18780ec899f32f23de4c71 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None Dec 15 02:58:33 localhost python3[54091]: ansible-ansible.legacy.stat Invoked with path=/usr/lib/systemd/system-preset/91-netns-placeholder.preset follow=False get_checksum=True 
checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True Dec 15 02:58:33 localhost python3[54134]: ansible-ansible.legacy.copy Invoked with src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1765785512.7979956-83675-108880310572775/source dest=/usr/lib/systemd/system-preset/91-netns-placeholder.preset mode=0644 owner=root group=root _original_basename=91-netns-placeholder-preset follow=False checksum=28b7b9aa893525d134a1eeda8a0a48fb25b736b9 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None Dec 15 02:58:33 localhost python3[54164]: ansible-systemd Invoked with name=netns-placeholder state=started enabled=True daemon_reload=True daemon_reexec=False scope=system no_block=False force=None masked=None Dec 15 02:58:34 localhost systemd[1]: Reloading. Dec 15 02:58:34 localhost systemd-sysv-generator[54194]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. Dec 15 02:58:34 localhost systemd-rc-local-generator[54191]: /etc/rc.d/rc.local is not marked executable, skipping. Dec 15 02:58:34 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Dec 15 02:58:35 localhost systemd[1]: Reloading. Dec 15 02:58:35 localhost systemd-rc-local-generator[54224]: /etc/rc.d/rc.local is not marked executable, skipping. Dec 15 02:58:35 localhost systemd-sysv-generator[54229]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. 
Dec 15 02:58:35 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Dec 15 02:58:35 localhost systemd[1]: Starting Create netns directory... Dec 15 02:58:35 localhost systemd[1]: run-netns-placeholder.mount: Deactivated successfully. Dec 15 02:58:35 localhost systemd[1]: netns-placeholder.service: Deactivated successfully. Dec 15 02:58:35 localhost systemd[1]: Finished Create netns directory. Dec 15 02:58:36 localhost python3[54257]: ansible-container_puppet_config Invoked with update_config_hash_only=True no_archive=True check_mode=False config_vol_prefix=/var/lib/config-data debug=False net_host=True puppet_config= short_hostname= step=6 Dec 15 02:58:36 localhost python3[54257]: ansible-container_puppet_config [WARNING] Config change detected for metrics_qdr, new hash: 29d07da103bbd44a9ed3e29999314b03 Dec 15 02:58:36 localhost python3[54257]: ansible-container_puppet_config [WARNING] Config change detected for collectd, new hash: da9a0dc7b40588672419e3ce10063e21 Dec 15 02:58:36 localhost python3[54257]: ansible-container_puppet_config [WARNING] Config change detected for iscsid, new hash: 182e509007ab5e6e5b2500a552cbd5ba Dec 15 02:58:36 localhost python3[54257]: ansible-container_puppet_config [WARNING] Config change detected for nova_virtlogd_wrapper, new hash: 879500e96bf8dfb93687004bd86f2317 Dec 15 02:58:36 localhost python3[54257]: ansible-container_puppet_config [WARNING] Config change detected for nova_virtnodedevd, new hash: 879500e96bf8dfb93687004bd86f2317 Dec 15 02:58:36 localhost python3[54257]: ansible-container_puppet_config [WARNING] Config change detected for nova_virtproxyd, new hash: 879500e96bf8dfb93687004bd86f2317 Dec 15 02:58:36 localhost python3[54257]: ansible-container_puppet_config [WARNING] Config change detected for nova_virtqemud, new hash: 879500e96bf8dfb93687004bd86f2317 Dec 15 02:58:36 localhost 
python3[54257]: ansible-container_puppet_config [WARNING] Config change detected for nova_virtsecretd, new hash: 879500e96bf8dfb93687004bd86f2317 Dec 15 02:58:36 localhost python3[54257]: ansible-container_puppet_config [WARNING] Config change detected for nova_virtstoraged, new hash: 879500e96bf8dfb93687004bd86f2317 Dec 15 02:58:36 localhost python3[54257]: ansible-container_puppet_config [WARNING] Config change detected for rsyslog, new hash: 605429f322a7b034ef9794ac46c40b29 Dec 15 02:58:36 localhost python3[54257]: ansible-container_puppet_config [WARNING] Config change detected for ceilometer_agent_compute, new hash: fcee5a4a91f85471fca7b61211375646 Dec 15 02:58:36 localhost python3[54257]: ansible-container_puppet_config [WARNING] Config change detected for ceilometer_agent_ipmi, new hash: fcee5a4a91f85471fca7b61211375646 Dec 15 02:58:36 localhost python3[54257]: ansible-container_puppet_config [WARNING] Config change detected for logrotate_crond, new hash: 53ed83bb0cae779ff95edb2002262c6f Dec 15 02:58:36 localhost python3[54257]: ansible-container_puppet_config [WARNING] Config change detected for nova_libvirt_init_secret, new hash: 879500e96bf8dfb93687004bd86f2317 Dec 15 02:58:36 localhost python3[54257]: ansible-container_puppet_config [WARNING] Config change detected for nova_migration_target, new hash: 879500e96bf8dfb93687004bd86f2317 Dec 15 02:58:36 localhost python3[54257]: ansible-container_puppet_config [WARNING] Config change detected for ovn_metadata_agent, new hash: a56a6f14b467cd9064e40c03defa5ed7 Dec 15 02:58:36 localhost python3[54257]: ansible-container_puppet_config [WARNING] Config change detected for nova_compute, new hash: 182e509007ab5e6e5b2500a552cbd5ba-879500e96bf8dfb93687004bd86f2317 Dec 15 02:58:36 localhost python3[54257]: ansible-container_puppet_config [WARNING] Config change detected for nova_wait_for_compute_service, new hash: 879500e96bf8dfb93687004bd86f2317 Dec 15 02:58:36 localhost sshd[54258]: main: sshd: ssh-rsa algorithm is 
disabled Dec 15 02:58:37 localhost python3[54316]: ansible-tripleo_container_manage Invoked with config_id=tripleo_step1 config_dir=/var/lib/tripleo-config/container-startup-config/step_1 config_patterns=*.json config_overrides={} concurrency=5 log_base_path=/var/log/containers/stdouts debug=False Dec 15 02:58:37 localhost podman[54352]: 2025-12-15 07:58:37.874968151 +0000 UTC m=+0.082254313 container create 6dcc7b6e1af3209b0c629d52a5c1bcc3aa9f9e89b1416f1a3b8815afca8e8840 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr_init_logs, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-18T22:49:46Z, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, release=1761123044, io.openshift.expose-services=, tcib_managed=true, name=rhosp17/openstack-qdrouterd, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, url=https://www.redhat.com, managed_by=tripleo_ansible, io.buildah.version=1.41.4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_data={'command': ['/bin/bash', '-c', 'chown -R qdrouterd:qdrouterd /var/log/qdrouterd'], 'detach': False, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'none', 'privileged': False, 'start_order': 0, 'user': 'root', 'volumes': ['/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, com.redhat.component=openstack-qdrouterd-container, batch=17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, konflux.additional-tags=17.1.12 17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 qdrouterd, version=17.1.12, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_id=tripleo_step1, architecture=x86_64, vcs-type=git, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=metrics_qdr_init_logs, summary=Red Hat OpenStack Platform 17.1 qdrouterd) Dec 15 02:58:37 localhost systemd[1]: Started libpod-conmon-6dcc7b6e1af3209b0c629d52a5c1bcc3aa9f9e89b1416f1a3b8815afca8e8840.scope. Dec 15 02:58:37 localhost podman[54352]: 2025-12-15 07:58:37.833339027 +0000 UTC m=+0.040625219 image pull registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1 Dec 15 02:58:37 localhost systemd[1]: Started libcrun container. Dec 15 02:58:37 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/9e88ec9f46b32b3c9a9299bb19cf61d3058917e906f113fa2506a712d1aba379/merged/var/log/qdrouterd supports timestamps until 2038 (0x7fffffff) Dec 15 02:58:37 localhost podman[54352]: 2025-12-15 07:58:37.947764796 +0000 UTC m=+0.155050958 container init 6dcc7b6e1af3209b0c629d52a5c1bcc3aa9f9e89b1416f1a3b8815afca8e8840 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr_init_logs, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.expose-services=, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 qdrouterd, version=17.1.12, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, url=https://www.redhat.com, batch=17.1_20251118.1, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, io.buildah.version=1.41.4, managed_by=tripleo_ansible, vendor=Red Hat, Inc., com.redhat.component=openstack-qdrouterd-container, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vcs-type=git, container_name=metrics_qdr_init_logs, config_data={'command': ['/bin/bash', '-c', 'chown -R qdrouterd:qdrouterd /var/log/qdrouterd'], 'detach': False, 'image': 
'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'none', 'privileged': False, 'start_order': 0, 'user': 'root', 'volumes': ['/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team, config_id=tripleo_step1, release=1761123044, build-date=2025-11-18T22:49:46Z, name=rhosp17/openstack-qdrouterd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, summary=Red Hat OpenStack Platform 17.1 qdrouterd) Dec 15 02:58:37 localhost podman[54352]: 2025-12-15 07:58:37.958069204 +0000 UTC m=+0.165355366 container start 6dcc7b6e1af3209b0c629d52a5c1bcc3aa9f9e89b1416f1a3b8815afca8e8840 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr_init_logs, vendor=Red Hat, Inc., com.redhat.component=openstack-qdrouterd-container, name=rhosp17/openstack-qdrouterd, build-date=2025-11-18T22:49:46Z, description=Red Hat OpenStack Platform 17.1 qdrouterd, batch=17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 qdrouterd, tcib_managed=true, maintainer=OpenStack TripleO Team, release=1761123044, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, io.buildah.version=1.41.4, url=https://www.redhat.com, config_data={'command': ['/bin/bash', '-c', 'chown -R qdrouterd:qdrouterd /var/log/qdrouterd'], 'detach': False, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'none', 'privileged': False, 'start_order': 0, 'user': 'root', 'volumes': ['/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, config_id=tripleo_step1, architecture=x86_64, io.k8s.description=Red Hat 
OpenStack Platform 17.1 qdrouterd, version=17.1.12, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.expose-services=, managed_by=tripleo_ansible, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, container_name=metrics_qdr_init_logs, konflux.additional-tags=17.1.12 17.1_20251118.1) Dec 15 02:58:37 localhost podman[54352]: 2025-12-15 07:58:37.958421083 +0000 UTC m=+0.165707255 container attach 6dcc7b6e1af3209b0c629d52a5c1bcc3aa9f9e89b1416f1a3b8815afca8e8840 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr_init_logs, batch=17.1_20251118.1, managed_by=tripleo_ansible, io.buildah.version=1.41.4, summary=Red Hat OpenStack Platform 17.1 qdrouterd, maintainer=OpenStack TripleO Team, config_id=tripleo_step1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, config_data={'command': ['/bin/bash', '-c', 'chown -R qdrouterd:qdrouterd /var/log/qdrouterd'], 'detach': False, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'none', 'privileged': False, 'start_order': 0, 'user': 'root', 'volumes': ['/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, version=17.1.12, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.expose-services=, url=https://www.redhat.com, com.redhat.component=openstack-qdrouterd-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, tcib_managed=true, distribution-scope=public, container_name=metrics_qdr_init_logs, build-date=2025-11-18T22:49:46Z, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 qdrouterd, vendor=Red Hat, Inc., architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, 
name=rhosp17/openstack-qdrouterd, vcs-type=git, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1761123044, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a) Dec 15 02:58:37 localhost systemd[1]: libpod-6dcc7b6e1af3209b0c629d52a5c1bcc3aa9f9e89b1416f1a3b8815afca8e8840.scope: Deactivated successfully. Dec 15 02:58:37 localhost podman[54352]: 2025-12-15 07:58:37.963953487 +0000 UTC m=+0.171239659 container died 6dcc7b6e1af3209b0c629d52a5c1bcc3aa9f9e89b1416f1a3b8815afca8e8840 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr_init_logs, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, summary=Red Hat OpenStack Platform 17.1 qdrouterd, vendor=Red Hat, Inc., build-date=2025-11-18T22:49:46Z, managed_by=tripleo_ansible, com.redhat.component=openstack-qdrouterd-container, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, distribution-scope=public, architecture=x86_64, release=1761123044, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, tcib_managed=true, maintainer=OpenStack TripleO Team, vcs-type=git, konflux.additional-tags=17.1.12 17.1_20251118.1, version=17.1.12, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, description=Red Hat OpenStack Platform 17.1 qdrouterd, config_data={'command': ['/bin/bash', '-c', 'chown -R qdrouterd:qdrouterd /var/log/qdrouterd'], 'detach': False, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'none', 'privileged': False, 'start_order': 0, 'user': 'root', 'volumes': ['/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, config_id=tripleo_step1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, 
container_name=metrics_qdr_init_logs, io.openshift.expose-services=, url=https://www.redhat.com, batch=17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, io.buildah.version=1.41.4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, name=rhosp17/openstack-qdrouterd) Dec 15 02:58:38 localhost podman[54371]: 2025-12-15 07:58:38.040777767 +0000 UTC m=+0.070305831 container cleanup 6dcc7b6e1af3209b0c629d52a5c1bcc3aa9f9e89b1416f1a3b8815afca8e8840 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr_init_logs, url=https://www.redhat.com, architecture=x86_64, managed_by=tripleo_ansible, batch=17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.component=openstack-qdrouterd-container, maintainer=OpenStack TripleO Team, distribution-scope=public, name=rhosp17/openstack-qdrouterd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, tcib_managed=true, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-type=git, io.openshift.expose-services=, container_name=metrics_qdr_init_logs, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, version=17.1.12, io.buildah.version=1.41.4, release=1761123044, config_data={'command': ['/bin/bash', '-c', 'chown -R qdrouterd:qdrouterd /var/log/qdrouterd'], 'detach': False, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'none', 'privileged': False, 'start_order': 0, 'user': 'root', 'volumes': 
['/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, config_id=tripleo_step1, build-date=2025-11-18T22:49:46Z, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a) Dec 15 02:58:38 localhost systemd[1]: libpod-conmon-6dcc7b6e1af3209b0c629d52a5c1bcc3aa9f9e89b1416f1a3b8815afca8e8840.scope: Deactivated successfully. Dec 15 02:58:38 localhost python3[54316]: ansible-tripleo_container_manage PODMAN-CONTAINER-DEBUG: podman run --name metrics_qdr_init_logs --conmon-pidfile /run/metrics_qdr_init_logs.pid --detach=False --label config_id=tripleo_step1 --label container_name=metrics_qdr_init_logs --label managed_by=tripleo_ansible --label config_data={'command': ['/bin/bash', '-c', 'chown -R qdrouterd:qdrouterd /var/log/qdrouterd'], 'detach': False, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'none', 'privileged': False, 'start_order': 0, 'user': 'root', 'volumes': ['/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']} --log-driver k8s-file --log-opt path=/var/log/containers/stdouts/metrics_qdr_init_logs.log --network none --privileged=False --user root --volume /var/log/containers/metrics_qdr:/var/log/qdrouterd:z registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1 /bin/bash -c chown -R qdrouterd:qdrouterd /var/log/qdrouterd Dec 15 02:58:38 localhost podman[54445]: 2025-12-15 07:58:38.50683343 +0000 UTC m=+0.079577872 container create 6278d648aec9c9f12e0efaafa0d6ae5653eeb34459516699e562eeb9c8d23b3c (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, name=rhosp17/openstack-qdrouterd, com.redhat.component=openstack-qdrouterd-container, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '29d07da103bbd44a9ed3e29999314b03'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': 
['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, version=17.1.12, url=https://www.redhat.com, container_name=metrics_qdr, config_id=tripleo_step1, vendor=Red Hat, Inc., vcs-type=git, managed_by=tripleo_ansible, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, release=1761123044, distribution-scope=public, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, tcib_managed=true, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 qdrouterd, io.buildah.version=1.41.4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, batch=17.1_20251118.1, konflux.additional-tags=17.1.12 17.1_20251118.1, architecture=x86_64, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, summary=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-18T22:49:46Z) Dec 15 02:58:38 localhost systemd[1]: Started 
libpod-conmon-6278d648aec9c9f12e0efaafa0d6ae5653eeb34459516699e562eeb9c8d23b3c.scope. Dec 15 02:58:38 localhost systemd[1]: Started libcrun container. Dec 15 02:58:38 localhost podman[54445]: 2025-12-15 07:58:38.46377433 +0000 UTC m=+0.036518782 image pull registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1 Dec 15 02:58:38 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/e75ee441979af5ea9ed0c5146d1f659562a9f9f9039bdfa8b70f6f9c6eebd6bb/merged/var/log/qdrouterd supports timestamps until 2038 (0x7fffffff) Dec 15 02:58:38 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/e75ee441979af5ea9ed0c5146d1f659562a9f9f9039bdfa8b70f6f9c6eebd6bb/merged/var/lib/qdrouterd supports timestamps until 2038 (0x7fffffff) Dec 15 02:58:38 localhost systemd[1]: Started /usr/bin/podman healthcheck run 6278d648aec9c9f12e0efaafa0d6ae5653eeb34459516699e562eeb9c8d23b3c. Dec 15 02:58:38 localhost podman[54445]: 2025-12-15 07:58:38.596729251 +0000 UTC m=+0.169473683 container init 6278d648aec9c9f12e0efaafa0d6ae5653eeb34459516699e562eeb9c8d23b3c (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, release=1761123044, konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '29d07da103bbd44a9ed3e29999314b03'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, distribution-scope=public, com.redhat.component=openstack-qdrouterd-container, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, name=rhosp17/openstack-qdrouterd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 qdrouterd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, config_id=tripleo_step1, vcs-type=git, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, container_name=metrics_qdr, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vendor=Red Hat, Inc., architecture=x86_64, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 qdrouterd, io.buildah.version=1.41.4, batch=17.1_20251118.1, build-date=2025-11-18T22:49:46Z, version=17.1.12) Dec 15 02:58:38 localhost systemd[1]: Started /usr/bin/podman healthcheck run 6278d648aec9c9f12e0efaafa0d6ae5653eeb34459516699e562eeb9c8d23b3c. 
Dec 15 02:58:38 localhost podman[54445]: 2025-12-15 07:58:38.632492192 +0000 UTC m=+0.205236614 container start 6278d648aec9c9f12e0efaafa0d6ae5653eeb34459516699e562eeb9c8d23b3c (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-type=git, architecture=x86_64, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '29d07da103bbd44a9ed3e29999314b03'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, build-date=2025-11-18T22:49:46Z, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, name=rhosp17/openstack-qdrouterd, com.redhat.component=openstack-qdrouterd-container, managed_by=tripleo_ansible, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, container_name=metrics_qdr, config_id=tripleo_step1, vendor=Red Hat, Inc., tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20251118.1, distribution-scope=public, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 qdrouterd, release=1761123044, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, io.buildah.version=1.41.4, version=17.1.12, url=https://www.redhat.com) Dec 15 02:58:38 localhost python3[54316]: ansible-tripleo_container_manage PODMAN-CONTAINER-DEBUG: podman run --name metrics_qdr --conmon-pidfile /run/metrics_qdr.pid --detach=True --env KOLLA_CONFIG_STRATEGY=COPY_ALWAYS --env TRIPLEO_CONFIG_HASH=29d07da103bbd44a9ed3e29999314b03 --healthcheck-command /openstack/healthcheck --label config_id=tripleo_step1 --label container_name=metrics_qdr --label managed_by=tripleo_ansible --label config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '29d07da103bbd44a9ed3e29999314b03'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']} --log-driver k8s-file --log-opt path=/var/log/containers/stdouts/metrics_qdr.log --network host --privileged=False --user qdrouterd --volume /etc/hosts:/etc/hosts:ro --volume /etc/localtime:/etc/localtime:ro --volume /etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro --volume /etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro --volume /etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro --volume /etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro --volume /etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro --volume /dev/log:/dev/log --volume /etc/puppet:/etc/puppet:ro --volume /var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro --volume /var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro --volume /var/lib/metrics_qdr:/var/lib/qdrouterd:z --volume /var/log/containers/metrics_qdr:/var/log/qdrouterd:z registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1 Dec 15 02:58:38 localhost podman[54468]: 2025-12-15 07:58:38.732954997 +0000 UTC m=+0.091598006 container health_status 6278d648aec9c9f12e0efaafa0d6ae5653eeb34459516699e562eeb9c8d23b3c (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=starting, version=17.1.12, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, release=1761123044, url=https://www.redhat.com, container_name=metrics_qdr, summary=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.expose-services=, com.redhat.component=openstack-qdrouterd-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, distribution-scope=public, name=rhosp17/openstack-qdrouterd, konflux.additional-tags=17.1.12 
17.1_20251118.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '29d07da103bbd44a9ed3e29999314b03'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, vendor=Red Hat, Inc., batch=17.1_20251118.1, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, config_id=tripleo_step1, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, managed_by=tripleo_ansible, build-date=2025-11-18T22:49:46Z, io.buildah.version=1.41.4) Dec 15 02:58:38 localhost systemd[1]: var-lib-containers-storage-overlay-9e88ec9f46b32b3c9a9299bb19cf61d3058917e906f113fa2506a712d1aba379-merged.mount: Deactivated successfully. 
Dec 15 02:58:38 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-6dcc7b6e1af3209b0c629d52a5c1bcc3aa9f9e89b1416f1a3b8815afca8e8840-userdata-shm.mount: Deactivated successfully. Dec 15 02:58:38 localhost podman[54468]: 2025-12-15 07:58:38.960734377 +0000 UTC m=+0.319377346 container exec_died 6278d648aec9c9f12e0efaafa0d6ae5653eeb34459516699e562eeb9c8d23b3c (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, architecture=x86_64, io.buildah.version=1.41.4, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, maintainer=OpenStack TripleO Team, tcib_managed=true, container_name=metrics_qdr, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_id=tripleo_step1, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, vendor=Red Hat, Inc., release=1761123044, summary=Red Hat OpenStack Platform 17.1 qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '29d07da103bbd44a9ed3e29999314b03'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, distribution-scope=public, version=17.1.12, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, batch=17.1_20251118.1, build-date=2025-11-18T22:49:46Z, name=rhosp17/openstack-qdrouterd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, url=https://www.redhat.com, com.redhat.component=openstack-qdrouterd-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 qdrouterd, konflux.additional-tags=17.1.12 17.1_20251118.1, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a) Dec 15 02:58:38 localhost systemd[1]: 6278d648aec9c9f12e0efaafa0d6ae5653eeb34459516699e562eeb9c8d23b3c.service: Deactivated successfully. Dec 15 02:58:39 localhost python3[54538]: ansible-file Invoked with path=/etc/systemd/system/tripleo_metrics_qdr.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Dec 15 02:58:39 localhost python3[54554]: ansible-stat Invoked with path=/etc/systemd/system/tripleo_metrics_qdr_healthcheck.timer follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1 Dec 15 02:58:40 localhost python3[54615]: ansible-copy Invoked with src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1765785519.5948136-83792-15098659974579/source dest=/etc/systemd/system/tripleo_metrics_qdr.service mode=0644 owner=root group=root backup=False force=True follow=False unsafe_writes=False 
_original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None Dec 15 02:58:40 localhost python3[54631]: ansible-systemd Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None Dec 15 02:58:40 localhost systemd[1]: Reloading. Dec 15 02:58:40 localhost systemd-rc-local-generator[54652]: /etc/rc.d/rc.local is not marked executable, skipping. Dec 15 02:58:40 localhost systemd-sysv-generator[54658]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. Dec 15 02:58:40 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Dec 15 02:58:41 localhost python3[54683]: ansible-systemd Invoked with state=restarted name=tripleo_metrics_qdr.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None Dec 15 02:58:41 localhost systemd[1]: Reloading. Dec 15 02:58:41 localhost systemd-rc-local-generator[54710]: /etc/rc.d/rc.local is not marked executable, skipping. Dec 15 02:58:41 localhost systemd-sysv-generator[54714]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. Dec 15 02:58:41 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Dec 15 02:58:41 localhost systemd[1]: Starting metrics_qdr container... 
Dec 15 02:58:41 localhost systemd[1]: Started metrics_qdr container. Dec 15 02:58:42 localhost python3[54764]: ansible-file Invoked with path=/var/lib/container-puppet/container-puppet-tasks1.json state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Dec 15 02:58:43 localhost python3[54885]: ansible-container_puppet_config Invoked with check_mode=False config_vol_prefix=/var/lib/config-data debug=True net_host=True no_archive=True puppet_config=/var/lib/container-puppet/container-puppet-tasks1.json short_hostname=np0005559462 step=1 update_config_hash_only=False Dec 15 02:58:44 localhost python3[54901]: ansible-file Invoked with path=/var/log/containers/stdouts state=directory owner=root group=root recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None setype=None attributes=None Dec 15 02:58:44 localhost python3[54917]: ansible-container_config_data Invoked with config_path=/var/lib/tripleo-config/container-puppet-config/step_1 config_pattern=container-puppet-*.json config_overrides={} debug=True Dec 15 02:59:09 localhost systemd[1]: Started /usr/bin/podman healthcheck run 6278d648aec9c9f12e0efaafa0d6ae5653eeb34459516699e562eeb9c8d23b3c. Dec 15 02:59:09 localhost systemd[1]: tmp-crun.Kr0v1Q.mount: Deactivated successfully. 
Dec 15 02:59:09 localhost podman[54995]: 2025-12-15 07:59:09.760176203 +0000 UTC m=+0.088235739 container health_status 6278d648aec9c9f12e0efaafa0d6ae5653eeb34459516699e562eeb9c8d23b3c (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, name=rhosp17/openstack-qdrouterd, io.openshift.expose-services=, architecture=x86_64, release=1761123044, vendor=Red Hat, Inc., distribution-scope=public, description=Red Hat OpenStack Platform 17.1 qdrouterd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vcs-type=git, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, build-date=2025-11-18T22:49:46Z, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.component=openstack-qdrouterd-container, container_name=metrics_qdr, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, config_id=tripleo_step1, version=17.1.12, url=https://www.redhat.com, managed_by=tripleo_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '29d07da103bbd44a9ed3e29999314b03'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, io.buildah.version=1.41.4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, batch=17.1_20251118.1) Dec 15 02:59:09 localhost podman[54995]: 2025-12-15 07:59:09.957034117 +0000 UTC m=+0.285093633 container exec_died 6278d648aec9c9f12e0efaafa0d6ae5653eeb34459516699e562eeb9c8d23b3c (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, vcs-type=git, name=rhosp17/openstack-qdrouterd, managed_by=tripleo_ansible, vendor=Red Hat, Inc., io.buildah.version=1.41.4, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step1, com.redhat.component=openstack-qdrouterd-container, url=https://www.redhat.com, tcib_managed=true, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '29d07da103bbd44a9ed3e29999314b03'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, version=17.1.12, description=Red Hat OpenStack Platform 17.1 qdrouterd, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, distribution-scope=public, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, build-date=2025-11-18T22:49:46Z, release=1761123044, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, container_name=metrics_qdr, maintainer=OpenStack TripleO Team, batch=17.1_20251118.1) Dec 15 02:59:09 localhost systemd[1]: 6278d648aec9c9f12e0efaafa0d6ae5653eeb34459516699e562eeb9c8d23b3c.service: Deactivated successfully. Dec 15 02:59:25 localhost sshd[55025]: main: sshd: ssh-rsa algorithm is disabled Dec 15 02:59:27 localhost sshd[55026]: main: sshd: ssh-rsa algorithm is disabled Dec 15 02:59:40 localhost systemd[1]: Started /usr/bin/podman healthcheck run 6278d648aec9c9f12e0efaafa0d6ae5653eeb34459516699e562eeb9c8d23b3c. Dec 15 02:59:40 localhost systemd[1]: tmp-crun.tk941c.mount: Deactivated successfully. 
Dec 15 02:59:40 localhost podman[55028]: 2025-12-15 07:59:40.753132011 +0000 UTC m=+0.080404553 container health_status 6278d648aec9c9f12e0efaafa0d6ae5653eeb34459516699e562eeb9c8d23b3c (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, konflux.additional-tags=17.1.12 17.1_20251118.1, container_name=metrics_qdr, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, description=Red Hat OpenStack Platform 17.1 qdrouterd, release=1761123044, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, com.redhat.component=openstack-qdrouterd-container, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-11-18T22:49:46Z, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '29d07da103bbd44a9ed3e29999314b03'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, io.buildah.version=1.41.4, vendor=Red Hat, Inc., org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, summary=Red Hat OpenStack Platform 17.1 qdrouterd, version=17.1.12, batch=17.1_20251118.1, config_id=tripleo_step1, distribution-scope=public, tcib_managed=true, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, name=rhosp17/openstack-qdrouterd) Dec 15 02:59:40 localhost podman[55028]: 2025-12-15 07:59:40.947139933 +0000 UTC m=+0.274412445 container exec_died 6278d648aec9c9f12e0efaafa0d6ae5653eeb34459516699e562eeb9c8d23b3c (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, tcib_managed=true, version=17.1.12, batch=17.1_20251118.1, architecture=x86_64, konflux.additional-tags=17.1.12 17.1_20251118.1, container_name=metrics_qdr, summary=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.component=openstack-qdrouterd-container, description=Red Hat OpenStack Platform 17.1 qdrouterd, config_id=tripleo_step1, build-date=2025-11-18T22:49:46Z, io.buildah.version=1.41.4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, maintainer=OpenStack TripleO Team, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, 
name=rhosp17/openstack-qdrouterd, release=1761123044, vcs-type=git, io.openshift.expose-services=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '29d07da103bbd44a9ed3e29999314b03'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream) Dec 15 02:59:40 localhost systemd[1]: 6278d648aec9c9f12e0efaafa0d6ae5653eeb34459516699e562eeb9c8d23b3c.service: Deactivated successfully. Dec 15 02:59:43 localhost sshd[55058]: main: sshd: ssh-rsa algorithm is disabled Dec 15 03:00:11 localhost systemd[1]: Started /usr/bin/podman healthcheck run 6278d648aec9c9f12e0efaafa0d6ae5653eeb34459516699e562eeb9c8d23b3c. 
Dec 15 03:00:11 localhost podman[55136]: 2025-12-15 08:00:11.906581589 +0000 UTC m=+0.236504345 container health_status 6278d648aec9c9f12e0efaafa0d6ae5653eeb34459516699e562eeb9c8d23b3c (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, url=https://www.redhat.com, release=1761123044, name=rhosp17/openstack-qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '29d07da103bbd44a9ed3e29999314b03'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=metrics_qdr, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.expose-services=, architecture=x86_64, 
com.redhat.component=openstack-qdrouterd-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, distribution-scope=public, version=17.1.12, config_id=tripleo_step1, batch=17.1_20251118.1, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 qdrouterd, io.buildah.version=1.41.4, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, build-date=2025-11-18T22:49:46Z, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-type=git, konflux.additional-tags=17.1.12 17.1_20251118.1, vendor=Red Hat, Inc.) Dec 15 03:00:12 localhost podman[55136]: 2025-12-15 08:00:12.109371002 +0000 UTC m=+0.439293748 container exec_died 6278d648aec9c9f12e0efaafa0d6ae5653eeb34459516699e562eeb9c8d23b3c (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, maintainer=OpenStack TripleO Team, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, batch=17.1_20251118.1, config_id=tripleo_step1, io.buildah.version=1.41.4, tcib_managed=true, build-date=2025-11-18T22:49:46Z, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 qdrouterd, managed_by=tripleo_ansible, architecture=x86_64, url=https://www.redhat.com, vendor=Red Hat, Inc., container_name=metrics_qdr, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.expose-services=, com.redhat.component=openstack-qdrouterd-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, vcs-type=git, release=1761123044, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 qdrouterd, version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, name=rhosp17/openstack-qdrouterd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 
qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '29d07da103bbd44a9ed3e29999314b03'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream) Dec 15 03:00:12 localhost systemd[1]: 6278d648aec9c9f12e0efaafa0d6ae5653eeb34459516699e562eeb9c8d23b3c.service: Deactivated successfully. Dec 15 03:00:42 localhost sshd[55165]: main: sshd: ssh-rsa algorithm is disabled Dec 15 03:00:42 localhost systemd[1]: Started /usr/bin/podman healthcheck run 6278d648aec9c9f12e0efaafa0d6ae5653eeb34459516699e562eeb9c8d23b3c. 
Dec 15 03:00:42 localhost podman[55167]: 2025-12-15 08:00:42.753805964 +0000 UTC m=+0.083900125 container health_status 6278d648aec9c9f12e0efaafa0d6ae5653eeb34459516699e562eeb9c8d23b3c (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, build-date=2025-11-18T22:49:46Z, io.openshift.expose-services=, com.redhat.component=openstack-qdrouterd-container, description=Red Hat OpenStack Platform 17.1 qdrouterd, container_name=metrics_qdr, batch=17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, version=17.1.12, architecture=x86_64, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, vendor=Red Hat, Inc., config_id=tripleo_step1, konflux.additional-tags=17.1.12 17.1_20251118.1, distribution-scope=public, io.buildah.version=1.41.4, name=rhosp17/openstack-qdrouterd, maintainer=OpenStack TripleO Team, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '29d07da103bbd44a9ed3e29999314b03'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, summary=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, release=1761123044, url=https://www.redhat.com, vcs-type=git) Dec 15 03:00:42 localhost podman[55167]: 2025-12-15 08:00:42.954332167 +0000 UTC m=+0.284426348 container exec_died 6278d648aec9c9f12e0efaafa0d6ae5653eeb34459516699e562eeb9c8d23b3c (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, description=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, managed_by=tripleo_ansible, batch=17.1_20251118.1, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.expose-services=, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, version=17.1.12, container_name=metrics_qdr, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, maintainer=OpenStack TripleO Team, release=1761123044, build-date=2025-11-18T22:49:46Z, com.redhat.component=openstack-qdrouterd-container, konflux.additional-tags=17.1.12 17.1_20251118.1, vendor=Red Hat, Inc., config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '29d07da103bbd44a9ed3e29999314b03'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 
'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, io.buildah.version=1.41.4, config_id=tripleo_step1, summary=Red Hat OpenStack Platform 17.1 qdrouterd, url=https://www.redhat.com, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, name=rhosp17/openstack-qdrouterd) Dec 15 03:00:42 localhost systemd[1]: 6278d648aec9c9f12e0efaafa0d6ae5653eeb34459516699e562eeb9c8d23b3c.service: Deactivated successfully. Dec 15 03:01:13 localhost systemd[1]: Started /usr/bin/podman healthcheck run 6278d648aec9c9f12e0efaafa0d6ae5653eeb34459516699e562eeb9c8d23b3c. Dec 15 03:01:13 localhost systemd[1]: tmp-crun.LDt9Ys.mount: Deactivated successfully. 
Dec 15 03:01:13 localhost podman[55283]: 2025-12-15 08:01:13.759430335 +0000 UTC m=+0.088099316 container health_status 6278d648aec9c9f12e0efaafa0d6ae5653eeb34459516699e562eeb9c8d23b3c (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, version=17.1.12, managed_by=tripleo_ansible, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-type=git, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '29d07da103bbd44a9ed3e29999314b03'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, distribution-scope=public, io.buildah.version=1.41.4, com.redhat.component=openstack-qdrouterd-container, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, description=Red Hat OpenStack Platform 17.1 qdrouterd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, 
url=https://www.redhat.com, vendor=Red Hat, Inc., release=1761123044, build-date=2025-11-18T22:49:46Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-qdrouterd, batch=17.1_20251118.1, config_id=tripleo_step1, konflux.additional-tags=17.1.12 17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, architecture=x86_64, container_name=metrics_qdr, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream) Dec 15 03:01:13 localhost podman[55283]: 2025-12-15 08:01:13.953216927 +0000 UTC m=+0.281885828 container exec_died 6278d648aec9c9f12e0efaafa0d6ae5653eeb34459516699e562eeb9c8d23b3c (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, build-date=2025-11-18T22:49:46Z, maintainer=OpenStack TripleO Team, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, name=rhosp17/openstack-qdrouterd, config_id=tripleo_step1, io.buildah.version=1.41.4, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, version=17.1.12, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, konflux.additional-tags=17.1.12 17.1_20251118.1, container_name=metrics_qdr, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public, vendor=Red Hat, Inc., managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, architecture=x86_64, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '29d07da103bbd44a9ed3e29999314b03'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 
'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, com.redhat.component=openstack-qdrouterd-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, url=https://www.redhat.com, tcib_managed=true, batch=17.1_20251118.1, release=1761123044, summary=Red Hat OpenStack Platform 17.1 qdrouterd) Dec 15 03:01:13 localhost systemd[1]: 6278d648aec9c9f12e0efaafa0d6ae5653eeb34459516699e562eeb9c8d23b3c.service: Deactivated successfully. Dec 15 03:01:44 localhost systemd[1]: Started /usr/bin/podman healthcheck run 6278d648aec9c9f12e0efaafa0d6ae5653eeb34459516699e562eeb9c8d23b3c. Dec 15 03:01:44 localhost systemd[1]: tmp-crun.HDVzu7.mount: Deactivated successfully. 
Dec 15 03:01:44 localhost podman[55312]: 2025-12-15 08:01:44.790801628 +0000 UTC m=+0.121334276 container health_status 6278d648aec9c9f12e0efaafa0d6ae5653eeb34459516699e562eeb9c8d23b3c (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 qdrouterd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, vcs-type=git, container_name=metrics_qdr, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, konflux.additional-tags=17.1.12 17.1_20251118.1, architecture=x86_64, batch=17.1_20251118.1, config_id=tripleo_step1, version=17.1.12, build-date=2025-11-18T22:49:46Z, com.redhat.component=openstack-qdrouterd-container, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, name=rhosp17/openstack-qdrouterd, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, release=1761123044, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '29d07da103bbd44a9ed3e29999314b03'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', 
'/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, url=https://www.redhat.com, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, tcib_managed=true, io.openshift.expose-services=) Dec 15 03:01:44 localhost podman[55312]: 2025-12-15 08:01:44.992157572 +0000 UTC m=+0.322690160 container exec_died 6278d648aec9c9f12e0efaafa0d6ae5653eeb34459516699e562eeb9c8d23b3c (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '29d07da103bbd44a9ed3e29999314b03'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, architecture=x86_64, vcs-type=git, 
build-date=2025-11-18T22:49:46Z, version=17.1.12, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.expose-services=, url=https://www.redhat.com, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, batch=17.1_20251118.1, container_name=metrics_qdr, release=1761123044, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.component=openstack-qdrouterd-container, config_id=tripleo_step1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.12 17.1_20251118.1, vendor=Red Hat, Inc., io.buildah.version=1.41.4, tcib_managed=true, name=rhosp17/openstack-qdrouterd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, description=Red Hat OpenStack Platform 17.1 qdrouterd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05) Dec 15 03:01:45 localhost systemd[1]: 6278d648aec9c9f12e0efaafa0d6ae5653eeb34459516699e562eeb9c8d23b3c.service: Deactivated successfully. Dec 15 03:01:51 localhost sshd[55340]: main: sshd: ssh-rsa algorithm is disabled Dec 15 03:02:08 localhost sshd[55421]: main: sshd: ssh-rsa algorithm is disabled Dec 15 03:02:15 localhost systemd[1]: Started /usr/bin/podman healthcheck run 6278d648aec9c9f12e0efaafa0d6ae5653eeb34459516699e562eeb9c8d23b3c. 
Dec 15 03:02:15 localhost podman[55422]: 2025-12-15 08:02:15.771692823 +0000 UTC m=+0.099612966 container health_status 6278d648aec9c9f12e0efaafa0d6ae5653eeb34459516699e562eeb9c8d23b3c (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '29d07da103bbd44a9ed3e29999314b03'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, io.openshift.expose-services=, com.redhat.component=openstack-qdrouterd-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, summary=Red Hat OpenStack Platform 17.1 qdrouterd, architecture=x86_64, name=rhosp17/openstack-qdrouterd, distribution-scope=public, release=1761123044, tcib_managed=true, maintainer=OpenStack TripleO Team, vcs-type=git, container_name=metrics_qdr, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., config_id=tripleo_step1, description=Red Hat OpenStack Platform 17.1 qdrouterd, version=17.1.12, managed_by=tripleo_ansible, batch=17.1_20251118.1, konflux.additional-tags=17.1.12 17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-18T22:49:46Z, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, io.buildah.version=1.41.4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream) Dec 15 03:02:15 localhost podman[55422]: 2025-12-15 08:02:15.983963589 +0000 UTC m=+0.311883732 container exec_died 6278d648aec9c9f12e0efaafa0d6ae5653eeb34459516699e562eeb9c8d23b3c (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, vendor=Red Hat, Inc., build-date=2025-11-18T22:49:46Z, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, konflux.additional-tags=17.1.12 17.1_20251118.1, architecture=x86_64, config_id=tripleo_step1, release=1761123044, name=rhosp17/openstack-qdrouterd, distribution-scope=public, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, com.redhat.component=openstack-qdrouterd-container, vcs-type=git, version=17.1.12, io.buildah.version=1.41.4, description=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.expose-services=, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, managed_by=tripleo_ansible, container_name=metrics_qdr, 
summary=Red Hat OpenStack Platform 17.1 qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '29d07da103bbd44a9ed3e29999314b03'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1) Dec 15 03:02:15 localhost systemd[1]: 6278d648aec9c9f12e0efaafa0d6ae5653eeb34459516699e562eeb9c8d23b3c.service: Deactivated successfully. Dec 15 03:02:46 localhost systemd[1]: Started /usr/bin/podman healthcheck run 6278d648aec9c9f12e0efaafa0d6ae5653eeb34459516699e562eeb9c8d23b3c. 
Dec 15 03:02:46 localhost podman[55451]: 2025-12-15 08:02:46.758513461 +0000 UTC m=+0.089010080 container health_status 6278d648aec9c9f12e0efaafa0d6ae5653eeb34459516699e562eeb9c8d23b3c (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '29d07da103bbd44a9ed3e29999314b03'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, konflux.additional-tags=17.1.12 17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_id=tripleo_step1, batch=17.1_20251118.1, tcib_managed=true, com.redhat.component=openstack-qdrouterd-container, summary=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, architecture=x86_64, io.buildah.version=1.41.4, release=1761123044, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, 
url=https://www.redhat.com, managed_by=tripleo_ansible, build-date=2025-11-18T22:49:46Z, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, version=17.1.12, vcs-type=git, name=rhosp17/openstack-qdrouterd, distribution-scope=public, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.expose-services=, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=metrics_qdr) Dec 15 03:02:46 localhost podman[55451]: 2025-12-15 08:02:46.941209226 +0000 UTC m=+0.271705745 container exec_died 6278d648aec9c9f12e0efaafa0d6ae5653eeb34459516699e562eeb9c8d23b3c (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '29d07da103bbd44a9ed3e29999314b03'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, summary=Red 
Hat OpenStack Platform 17.1 qdrouterd, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, konflux.additional-tags=17.1.12 17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.component=openstack-qdrouterd-container, description=Red Hat OpenStack Platform 17.1 qdrouterd, container_name=metrics_qdr, vcs-type=git, managed_by=tripleo_ansible, io.buildah.version=1.41.4, release=1761123044, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, build-date=2025-11-18T22:49:46Z, version=17.1.12, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, batch=17.1_20251118.1, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, url=https://www.redhat.com, name=rhosp17/openstack-qdrouterd, tcib_managed=true, io.openshift.expose-services=, config_id=tripleo_step1) Dec 15 03:02:46 localhost systemd[1]: 6278d648aec9c9f12e0efaafa0d6ae5653eeb34459516699e562eeb9c8d23b3c.service: Deactivated successfully. Dec 15 03:02:58 localhost sshd[55481]: main: sshd: ssh-rsa algorithm is disabled Dec 15 03:03:17 localhost systemd[1]: Started /usr/bin/podman healthcheck run 6278d648aec9c9f12e0efaafa0d6ae5653eeb34459516699e562eeb9c8d23b3c. Dec 15 03:03:17 localhost systemd[1]: tmp-crun.ygNyEm.mount: Deactivated successfully. 
Dec 15 03:03:17 localhost podman[55560]: 2025-12-15 08:03:17.771019854 +0000 UTC m=+0.098672511 container health_status 6278d648aec9c9f12e0efaafa0d6ae5653eeb34459516699e562eeb9c8d23b3c (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_id=tripleo_step1, container_name=metrics_qdr, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, tcib_managed=true, build-date=2025-11-18T22:49:46Z, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '29d07da103bbd44a9ed3e29999314b03'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, summary=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, distribution-scope=public, com.redhat.component=openstack-qdrouterd-container, managed_by=tripleo_ansible, 
io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, batch=17.1_20251118.1, name=rhosp17/openstack-qdrouterd, vcs-type=git, release=1761123044, description=Red Hat OpenStack Platform 17.1 qdrouterd, konflux.additional-tags=17.1.12 17.1_20251118.1, version=17.1.12, url=https://www.redhat.com, io.openshift.expose-services=, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vendor=Red Hat, Inc., io.buildah.version=1.41.4, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd) Dec 15 03:03:18 localhost podman[55560]: 2025-12-15 08:03:18.003370622 +0000 UTC m=+0.331023219 container exec_died 6278d648aec9c9f12e0efaafa0d6ae5653eeb34459516699e562eeb9c8d23b3c (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, description=Red Hat OpenStack Platform 17.1 qdrouterd, batch=17.1_20251118.1, url=https://www.redhat.com, io.openshift.expose-services=, vcs-type=git, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '29d07da103bbd44a9ed3e29999314b03'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, release=1761123044, vendor=Red Hat, Inc., io.buildah.version=1.41.4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_id=tripleo_step1, distribution-scope=public, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, architecture=x86_64, tcib_managed=true, build-date=2025-11-18T22:49:46Z, name=rhosp17/openstack-qdrouterd, summary=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, managed_by=tripleo_ansible, container_name=metrics_qdr, com.redhat.component=openstack-qdrouterd-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, version=17.1.12) Dec 15 03:03:18 localhost systemd[1]: 6278d648aec9c9f12e0efaafa0d6ae5653eeb34459516699e562eeb9c8d23b3c.service: Deactivated successfully. Dec 15 03:03:47 localhost ceph-osd[32311]: osd.3 pg_epoch: 20 pg[2.0( empty local-lis/les=0/0 n=0 ec=20/20 lis/c=0/0 les/c/f=0/0/0 sis=20) [2,3,1] r=1 lpr=20 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Stray Dec 15 03:03:48 localhost systemd[1]: Started /usr/bin/podman healthcheck run 6278d648aec9c9f12e0efaafa0d6ae5653eeb34459516699e562eeb9c8d23b3c. Dec 15 03:03:48 localhost systemd[1]: tmp-crun.a7EimH.mount: Deactivated successfully. 
Dec 15 03:03:48 localhost podman[55593]: 2025-12-15 08:03:48.749069983 +0000 UTC m=+0.078038483 container health_status 6278d648aec9c9f12e0efaafa0d6ae5653eeb34459516699e562eeb9c8d23b3c (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, vendor=Red Hat, Inc., config_id=tripleo_step1, vcs-type=git, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, managed_by=tripleo_ansible, distribution-scope=public, io.buildah.version=1.41.4, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1761123044, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '29d07da103bbd44a9ed3e29999314b03'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, container_name=metrics_qdr, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, 
summary=Red Hat OpenStack Platform 17.1 qdrouterd, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 qdrouterd, version=17.1.12, batch=17.1_20251118.1, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.expose-services=, name=rhosp17/openstack-qdrouterd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.component=openstack-qdrouterd-container, build-date=2025-11-18T22:49:46Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, architecture=x86_64, maintainer=OpenStack TripleO Team) Dec 15 03:03:48 localhost ceph-osd[32311]: osd.3 pg_epoch: 22 pg[3.0( empty local-lis/les=0/0 n=0 ec=22/22 lis/c=0/0 les/c/f=0/0/0 sis=22) [3,2,1] r=0 lpr=22 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Primary Dec 15 03:03:48 localhost podman[55593]: 2025-12-15 08:03:48.968310637 +0000 UTC m=+0.297279137 container exec_died 6278d648aec9c9f12e0efaafa0d6ae5653eeb34459516699e562eeb9c8d23b3c (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.component=openstack-qdrouterd-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.expose-services=, name=rhosp17/openstack-qdrouterd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, managed_by=tripleo_ansible, vendor=Red Hat, Inc., version=17.1.12, maintainer=OpenStack TripleO Team, container_name=metrics_qdr, distribution-scope=public, config_id=tripleo_step1, vcs-type=git, build-date=2025-11-18T22:49:46Z, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '29d07da103bbd44a9ed3e29999314b03'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, architecture=x86_64, tcib_managed=true, batch=17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 qdrouterd, release=1761123044, io.buildah.version=1.41.4, konflux.additional-tags=17.1.12 17.1_20251118.1) Dec 15 03:03:48 localhost systemd[1]: 6278d648aec9c9f12e0efaafa0d6ae5653eeb34459516699e562eeb9c8d23b3c.service: Deactivated successfully. 
Dec 15 03:03:49 localhost ceph-osd[32311]: osd.3 pg_epoch: 23 pg[3.0( empty local-lis/les=22/23 n=0 ec=22/22 lis/c=0/0 les/c/f=0/0/0 sis=22) [3,2,1] r=0 lpr=22 crt=0'0 mlcod 0'0 active mbc={}] state: react AllReplicasActivated Activating complete Dec 15 03:03:52 localhost ceph-osd[32311]: osd.3 pg_epoch: 24 pg[4.0( empty local-lis/les=0/0 n=0 ec=24/24 lis/c=0/0 les/c/f=0/0/0 sis=24) [4,5,3] r=2 lpr=24 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Stray Dec 15 03:03:53 localhost ceph-osd[32311]: osd.3 pg_epoch: 26 pg[5.0( empty local-lis/les=0/0 n=0 ec=26/26 lis/c=0/0 les/c/f=0/0/0 sis=26) [3,4,2] r=0 lpr=26 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Primary Dec 15 03:03:54 localhost ceph-osd[32311]: osd.3 pg_epoch: 27 pg[5.0( empty local-lis/les=26/27 n=0 ec=26/26 lis/c=0/0 les/c/f=0/0/0 sis=26) [3,4,2] r=0 lpr=26 crt=0'0 mlcod 0'0 active mbc={}] state: react AllReplicasActivated Activating complete Dec 15 03:04:07 localhost ceph-osd[32311]: osd.3 pg_epoch: 32 pg[6.0( empty local-lis/les=0/0 n=0 ec=32/32 lis/c=0/0 les/c/f=0/0/0 sis=32) [4,5,3] r=2 lpr=32 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Stray Dec 15 03:04:08 localhost ceph-osd[31375]: osd.0 pg_epoch: 33 pg[7.0( empty local-lis/les=0/0 n=0 ec=33/33 lis/c=0/0 les/c/f=0/0/0 sis=33) [5,0,4] r=1 lpr=33 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Stray Dec 15 03:04:10 localhost sshd[55622]: main: sshd: ssh-rsa algorithm is disabled Dec 15 03:04:19 localhost systemd[1]: Started /usr/bin/podman healthcheck run 6278d648aec9c9f12e0efaafa0d6ae5653eeb34459516699e562eeb9c8d23b3c. 
Dec 15 03:04:19 localhost podman[55731]: 2025-12-15 08:04:19.758595085 +0000 UTC m=+0.090529018 container health_status 6278d648aec9c9f12e0efaafa0d6ae5653eeb34459516699e562eeb9c8d23b3c (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '29d07da103bbd44a9ed3e29999314b03'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, maintainer=OpenStack TripleO Team, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, version=17.1.12, architecture=x86_64, distribution-scope=public, build-date=2025-11-18T22:49:46Z, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_id=tripleo_step1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-type=git, name=rhosp17/openstack-qdrouterd, summary=Red Hat OpenStack Platform 17.1 qdrouterd, managed_by=tripleo_ansible, 
vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, description=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.expose-services=, url=https://www.redhat.com, batch=17.1_20251118.1, container_name=metrics_qdr, vendor=Red Hat, Inc., release=1761123044, com.redhat.component=openstack-qdrouterd-container) Dec 15 03:04:19 localhost podman[55731]: 2025-12-15 08:04:19.967602536 +0000 UTC m=+0.299536519 container exec_died 6278d648aec9c9f12e0efaafa0d6ae5653eeb34459516699e562eeb9c8d23b3c (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, summary=Red Hat OpenStack Platform 17.1 qdrouterd, container_name=metrics_qdr, com.redhat.component=openstack-qdrouterd-container, name=rhosp17/openstack-qdrouterd, build-date=2025-11-18T22:49:46Z, url=https://www.redhat.com, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.buildah.version=1.41.4, managed_by=tripleo_ansible, version=17.1.12, description=Red Hat OpenStack Platform 17.1 qdrouterd, distribution-scope=public, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '29d07da103bbd44a9ed3e29999314b03'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 
'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-type=git, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, release=1761123044, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_id=tripleo_step1, batch=17.1_20251118.1) Dec 15 03:04:19 localhost systemd[1]: 6278d648aec9c9f12e0efaafa0d6ae5653eeb34459516699e562eeb9c8d23b3c.service: Deactivated successfully. 
Dec 15 03:04:34 localhost ceph-osd[32311]: osd.3 pg_epoch: 37 pg[2.0( empty local-lis/les=20/21 n=0 ec=20/20 lis/c=20/20 les/c/f=21/21/0 sis=37 pruub=9.199455261s) [2,3,1] r=1 lpr=37 pi=[20,37)/1 crt=0'0 mlcod 0'0 active pruub 1173.753540039s@ mbc={}] start_peering_interval up [2,3,1] -> [2,3,1], acting [2,3,1] -> [2,3,1], acting_primary 2 -> 2, up_primary 2 -> 2, role 1 -> 1, features acting 4540138322906710015 upacting 4540138322906710015 Dec 15 03:04:34 localhost ceph-osd[32311]: osd.3 pg_epoch: 37 pg[2.0( empty local-lis/les=20/21 n=0 ec=20/20 lis/c=20/20 les/c/f=21/21/0 sis=37 pruub=9.196267128s) [2,3,1] r=1 lpr=37 pi=[20,37)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1173.753540039s@ mbc={}] state: transitioning to Stray Dec 15 03:04:36 localhost ceph-osd[32311]: osd.3 pg_epoch: 38 pg[2.15( empty local-lis/les=20/21 n=0 ec=37/20 lis/c=20/20 les/c/f=21/21/0 sis=37) [2,3,1] r=1 lpr=37 pi=[20,37)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state: transitioning to Stray Dec 15 03:04:36 localhost ceph-osd[32311]: osd.3 pg_epoch: 38 pg[2.18( empty local-lis/les=20/21 n=0 ec=37/20 lis/c=20/20 les/c/f=21/21/0 sis=37) [2,3,1] r=1 lpr=37 pi=[20,37)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state: transitioning to Stray Dec 15 03:04:36 localhost ceph-osd[32311]: osd.3 pg_epoch: 38 pg[2.16( empty local-lis/les=20/21 n=0 ec=37/20 lis/c=20/20 les/c/f=21/21/0 sis=37) [2,3,1] r=1 lpr=37 pi=[20,37)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state: transitioning to Stray Dec 15 03:04:36 localhost ceph-osd[32311]: osd.3 pg_epoch: 38 pg[2.14( empty local-lis/les=20/21 n=0 ec=37/20 lis/c=20/20 les/c/f=21/21/0 sis=37) [2,3,1] r=1 lpr=37 pi=[20,37)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state: transitioning to Stray Dec 15 03:04:36 localhost ceph-osd[32311]: osd.3 pg_epoch: 38 pg[2.13( empty local-lis/les=20/21 n=0 ec=37/20 lis/c=20/20 les/c/f=21/21/0 sis=37) [2,3,1] r=1 lpr=37 pi=[20,37)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state: transitioning to Stray Dec 15 03:04:36 
localhost ceph-osd[32311]: osd.3 pg_epoch: 38 pg[2.12( empty local-lis/les=20/21 n=0 ec=37/20 lis/c=20/20 les/c/f=21/21/0 sis=37) [2,3,1] r=1 lpr=37 pi=[20,37)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state: transitioning to Stray Dec 15 03:04:36 localhost ceph-osd[32311]: osd.3 pg_epoch: 38 pg[2.f( empty local-lis/les=20/21 n=0 ec=37/20 lis/c=20/20 les/c/f=21/21/0 sis=37) [2,3,1] r=1 lpr=37 pi=[20,37)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state: transitioning to Stray Dec 15 03:04:36 localhost ceph-osd[32311]: osd.3 pg_epoch: 38 pg[2.10( empty local-lis/les=20/21 n=0 ec=37/20 lis/c=20/20 les/c/f=21/21/0 sis=37) [2,3,1] r=1 lpr=37 pi=[20,37)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state: transitioning to Stray Dec 15 03:04:36 localhost ceph-osd[32311]: osd.3 pg_epoch: 38 pg[2.11( empty local-lis/les=20/21 n=0 ec=37/20 lis/c=20/20 les/c/f=21/21/0 sis=37) [2,3,1] r=1 lpr=37 pi=[20,37)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state: transitioning to Stray Dec 15 03:04:36 localhost ceph-osd[32311]: osd.3 pg_epoch: 38 pg[2.17( empty local-lis/les=20/21 n=0 ec=37/20 lis/c=20/20 les/c/f=21/21/0 sis=37) [2,3,1] r=1 lpr=37 pi=[20,37)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state: transitioning to Stray Dec 15 03:04:36 localhost ceph-osd[32311]: osd.3 pg_epoch: 38 pg[2.e( empty local-lis/les=20/21 n=0 ec=37/20 lis/c=20/20 les/c/f=21/21/0 sis=37) [2,3,1] r=1 lpr=37 pi=[20,37)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state: transitioning to Stray Dec 15 03:04:36 localhost ceph-osd[32311]: osd.3 pg_epoch: 38 pg[2.c( empty local-lis/les=20/21 n=0 ec=37/20 lis/c=20/20 les/c/f=21/21/0 sis=37) [2,3,1] r=1 lpr=37 pi=[20,37)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state: transitioning to Stray Dec 15 03:04:36 localhost ceph-osd[32311]: osd.3 pg_epoch: 38 pg[2.d( empty local-lis/les=20/21 n=0 ec=37/20 lis/c=20/20 les/c/f=21/21/0 sis=37) [2,3,1] r=1 lpr=37 pi=[20,37)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state: transitioning to Stray Dec 15 03:04:36 localhost 
ceph-osd[32311]: osd.3 pg_epoch: 38 pg[2.9( empty local-lis/les=20/21 n=0 ec=37/20 lis/c=20/20 les/c/f=21/21/0 sis=37) [2,3,1] r=1 lpr=37 pi=[20,37)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state: transitioning to Stray Dec 15 03:04:36 localhost ceph-osd[32311]: osd.3 pg_epoch: 38 pg[2.a( empty local-lis/les=20/21 n=0 ec=37/20 lis/c=20/20 les/c/f=21/21/0 sis=37) [2,3,1] r=1 lpr=37 pi=[20,37)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state: transitioning to Stray Dec 15 03:04:36 localhost ceph-osd[32311]: osd.3 pg_epoch: 38 pg[2.b( empty local-lis/les=20/21 n=0 ec=37/20 lis/c=20/20 les/c/f=21/21/0 sis=37) [2,3,1] r=1 lpr=37 pi=[20,37)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state: transitioning to Stray Dec 15 03:04:36 localhost ceph-osd[32311]: osd.3 pg_epoch: 38 pg[2.1( empty local-lis/les=20/21 n=0 ec=37/20 lis/c=20/20 les/c/f=21/21/0 sis=37) [2,3,1] r=1 lpr=37 pi=[20,37)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state: transitioning to Stray Dec 15 03:04:36 localhost ceph-osd[32311]: osd.3 pg_epoch: 38 pg[2.6( empty local-lis/les=20/21 n=0 ec=37/20 lis/c=20/20 les/c/f=21/21/0 sis=37) [2,3,1] r=1 lpr=37 pi=[20,37)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state: transitioning to Stray Dec 15 03:04:36 localhost ceph-osd[32311]: osd.3 pg_epoch: 38 pg[2.19( empty local-lis/les=20/21 n=0 ec=37/20 lis/c=20/20 les/c/f=21/21/0 sis=37) [2,3,1] r=1 lpr=37 pi=[20,37)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state: transitioning to Stray Dec 15 03:04:36 localhost ceph-osd[32311]: osd.3 pg_epoch: 38 pg[3.0( empty local-lis/les=22/23 n=0 ec=22/22 lis/c=22/22 les/c/f=23/23/0 sis=38 pruub=9.950901985s) [3,2,1] r=0 lpr=38 pi=[22,38)/1 crt=0'0 mlcod 0'0 active pruub 1175.838989258s@ mbc={}] start_peering_interval up [3,2,1] -> [3,2,1], acting [3,2,1] -> [3,2,1], acting_primary 3 -> 3, up_primary 3 -> 3, role 0 -> 0, features acting 4540138322906710015 upacting 4540138322906710015 Dec 15 03:04:36 localhost ceph-osd[32311]: osd.3 pg_epoch: 38 pg[2.7( empty 
local-lis/les=20/21 n=0 ec=37/20 lis/c=20/20 les/c/f=21/21/0 sis=37) [2,3,1] r=1 lpr=37 pi=[20,37)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state: transitioning to Stray Dec 15 03:04:36 localhost ceph-osd[32311]: osd.3 pg_epoch: 38 pg[2.2( empty local-lis/les=20/21 n=0 ec=37/20 lis/c=20/20 les/c/f=21/21/0 sis=37) [2,3,1] r=1 lpr=37 pi=[20,37)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state: transitioning to Stray Dec 15 03:04:36 localhost ceph-osd[32311]: osd.3 pg_epoch: 38 pg[2.4( empty local-lis/les=20/21 n=0 ec=37/20 lis/c=20/20 les/c/f=21/21/0 sis=37) [2,3,1] r=1 lpr=37 pi=[20,37)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state: transitioning to Stray Dec 15 03:04:36 localhost ceph-osd[32311]: osd.3 pg_epoch: 38 pg[2.3( empty local-lis/les=20/21 n=0 ec=37/20 lis/c=20/20 les/c/f=21/21/0 sis=37) [2,3,1] r=1 lpr=37 pi=[20,37)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state: transitioning to Stray Dec 15 03:04:36 localhost ceph-osd[32311]: osd.3 pg_epoch: 38 pg[2.5( empty local-lis/les=20/21 n=0 ec=37/20 lis/c=20/20 les/c/f=21/21/0 sis=37) [2,3,1] r=1 lpr=37 pi=[20,37)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state: transitioning to Stray Dec 15 03:04:36 localhost ceph-osd[32311]: osd.3 pg_epoch: 38 pg[2.8( empty local-lis/les=20/21 n=0 ec=37/20 lis/c=20/20 les/c/f=21/21/0 sis=37) [2,3,1] r=1 lpr=37 pi=[20,37)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state: transitioning to Stray Dec 15 03:04:36 localhost ceph-osd[32311]: osd.3 pg_epoch: 38 pg[2.1a( empty local-lis/les=20/21 n=0 ec=37/20 lis/c=20/20 les/c/f=21/21/0 sis=37) [2,3,1] r=1 lpr=37 pi=[20,37)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state: transitioning to Stray Dec 15 03:04:36 localhost ceph-osd[32311]: osd.3 pg_epoch: 38 pg[2.1b( empty local-lis/les=20/21 n=0 ec=37/20 lis/c=20/20 les/c/f=21/21/0 sis=37) [2,3,1] r=1 lpr=37 pi=[20,37)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state: transitioning to Stray Dec 15 03:04:36 localhost ceph-osd[32311]: osd.3 pg_epoch: 38 pg[2.1c( empty 
local-lis/les=20/21 n=0 ec=37/20 lis/c=20/20 les/c/f=21/21/0 sis=37) [2,3,1] r=1 lpr=37 pi=[20,37)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state: transitioning to Stray Dec 15 03:04:36 localhost ceph-osd[32311]: osd.3 pg_epoch: 38 pg[2.1d( empty local-lis/les=20/21 n=0 ec=37/20 lis/c=20/20 les/c/f=21/21/0 sis=37) [2,3,1] r=1 lpr=37 pi=[20,37)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state: transitioning to Stray Dec 15 03:04:36 localhost ceph-osd[32311]: osd.3 pg_epoch: 38 pg[2.1f( empty local-lis/les=20/21 n=0 ec=37/20 lis/c=20/20 les/c/f=21/21/0 sis=37) [2,3,1] r=1 lpr=37 pi=[20,37)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state: transitioning to Stray Dec 15 03:04:36 localhost ceph-osd[32311]: osd.3 pg_epoch: 38 pg[2.1e( empty local-lis/les=20/21 n=0 ec=37/20 lis/c=20/20 les/c/f=21/21/0 sis=37) [2,3,1] r=1 lpr=37 pi=[20,37)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state: transitioning to Stray Dec 15 03:04:36 localhost ceph-osd[32311]: osd.3 pg_epoch: 38 pg[3.0( empty local-lis/les=22/23 n=0 ec=22/22 lis/c=22/22 les/c/f=23/23/0 sis=38 pruub=9.950901985s) [3,2,1] r=0 lpr=38 pi=[22,38)/1 crt=0'0 mlcod 0'0 unknown pruub 1175.838989258s@ mbc={}] state: transitioning to Primary Dec 15 03:04:37 localhost ceph-osd[32311]: osd.3 pg_epoch: 39 pg[3.1e( empty local-lis/les=22/23 n=0 ec=38/22 lis/c=22/22 les/c/f=23/23/0 sis=38) [3,2,1] r=0 lpr=38 pi=[22,38)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Primary Dec 15 03:04:37 localhost ceph-osd[32311]: osd.3 pg_epoch: 39 pg[3.1d( empty local-lis/les=22/23 n=0 ec=38/22 lis/c=22/22 les/c/f=23/23/0 sis=38) [3,2,1] r=0 lpr=38 pi=[22,38)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Primary Dec 15 03:04:37 localhost ceph-osd[32311]: osd.3 pg_epoch: 39 pg[3.1b( empty local-lis/les=22/23 n=0 ec=38/22 lis/c=22/22 les/c/f=23/23/0 sis=38) [3,2,1] r=0 lpr=38 pi=[22,38)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Primary Dec 15 03:04:37 localhost ceph-osd[32311]: osd.3 pg_epoch: 39 
pg[3.1f( empty local-lis/les=22/23 n=0 ec=38/22 lis/c=22/22 les/c/f=23/23/0 sis=38) [3,2,1] r=0 lpr=38 pi=[22,38)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Primary Dec 15 03:04:37 localhost ceph-osd[32311]: osd.3 pg_epoch: 39 pg[3.18( empty local-lis/les=22/23 n=0 ec=38/22 lis/c=22/22 les/c/f=23/23/0 sis=38) [3,2,1] r=0 lpr=38 pi=[22,38)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Primary Dec 15 03:04:37 localhost ceph-osd[32311]: osd.3 pg_epoch: 39 pg[3.9( empty local-lis/les=22/23 n=0 ec=38/22 lis/c=22/22 les/c/f=23/23/0 sis=38) [3,2,1] r=0 lpr=38 pi=[22,38)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Primary Dec 15 03:04:37 localhost ceph-osd[32311]: osd.3 pg_epoch: 39 pg[3.1c( empty local-lis/les=22/23 n=0 ec=38/22 lis/c=22/22 les/c/f=23/23/0 sis=38) [3,2,1] r=0 lpr=38 pi=[22,38)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Primary Dec 15 03:04:37 localhost ceph-osd[32311]: osd.3 pg_epoch: 39 pg[3.4( empty local-lis/les=22/23 n=0 ec=38/22 lis/c=22/22 les/c/f=23/23/0 sis=38) [3,2,1] r=0 lpr=38 pi=[22,38)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Primary Dec 15 03:04:37 localhost ceph-osd[32311]: osd.3 pg_epoch: 39 pg[3.5( empty local-lis/les=22/23 n=0 ec=38/22 lis/c=22/22 les/c/f=23/23/0 sis=38) [3,2,1] r=0 lpr=38 pi=[22,38)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Primary Dec 15 03:04:37 localhost ceph-osd[32311]: osd.3 pg_epoch: 39 pg[3.2( empty local-lis/les=22/23 n=0 ec=38/22 lis/c=22/22 les/c/f=23/23/0 sis=38) [3,2,1] r=0 lpr=38 pi=[22,38)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Primary Dec 15 03:04:37 localhost ceph-osd[32311]: osd.3 pg_epoch: 39 pg[3.6( empty local-lis/les=22/23 n=0 ec=38/22 lis/c=22/22 les/c/f=23/23/0 sis=38) [3,2,1] r=0 lpr=38 pi=[22,38)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Primary Dec 15 03:04:37 localhost ceph-osd[32311]: osd.3 pg_epoch: 39 pg[3.7( empty local-lis/les=22/23 n=0 ec=38/22 
lis/c=22/22 les/c/f=23/23/0 sis=38) [3,2,1] r=0 lpr=38 pi=[22,38)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Primary Dec 15 03:04:37 localhost ceph-osd[32311]: osd.3 pg_epoch: 39 pg[3.8( empty local-lis/les=22/23 n=0 ec=38/22 lis/c=22/22 les/c/f=23/23/0 sis=38) [3,2,1] r=0 lpr=38 pi=[22,38)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Primary Dec 15 03:04:37 localhost ceph-osd[32311]: osd.3 pg_epoch: 39 pg[3.1( empty local-lis/les=22/23 n=0 ec=38/22 lis/c=22/22 les/c/f=23/23/0 sis=38) [3,2,1] r=0 lpr=38 pi=[22,38)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Primary Dec 15 03:04:37 localhost ceph-osd[32311]: osd.3 pg_epoch: 39 pg[3.1a( empty local-lis/les=22/23 n=0 ec=38/22 lis/c=22/22 les/c/f=23/23/0 sis=38) [3,2,1] r=0 lpr=38 pi=[22,38)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Primary Dec 15 03:04:37 localhost ceph-osd[32311]: osd.3 pg_epoch: 39 pg[3.b( empty local-lis/les=22/23 n=0 ec=38/22 lis/c=22/22 les/c/f=23/23/0 sis=38) [3,2,1] r=0 lpr=38 pi=[22,38)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Primary Dec 15 03:04:37 localhost ceph-osd[32311]: osd.3 pg_epoch: 39 pg[3.3( empty local-lis/les=22/23 n=0 ec=38/22 lis/c=22/22 les/c/f=23/23/0 sis=38) [3,2,1] r=0 lpr=38 pi=[22,38)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Primary Dec 15 03:04:37 localhost ceph-osd[32311]: osd.3 pg_epoch: 39 pg[3.d( empty local-lis/les=22/23 n=0 ec=38/22 lis/c=22/22 les/c/f=23/23/0 sis=38) [3,2,1] r=0 lpr=38 pi=[22,38)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Primary Dec 15 03:04:37 localhost ceph-osd[32311]: osd.3 pg_epoch: 39 pg[3.c( empty local-lis/les=22/23 n=0 ec=38/22 lis/c=22/22 les/c/f=23/23/0 sis=38) [3,2,1] r=0 lpr=38 pi=[22,38)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Primary Dec 15 03:04:37 localhost ceph-osd[32311]: osd.3 pg_epoch: 39 pg[3.f( empty local-lis/les=22/23 n=0 ec=38/22 lis/c=22/22 les/c/f=23/23/0 sis=38) [3,2,1] r=0 lpr=38 
pi=[22,38)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Primary Dec 15 03:04:37 localhost ceph-osd[32311]: osd.3 pg_epoch: 39 pg[3.e( empty local-lis/les=22/23 n=0 ec=38/22 lis/c=22/22 les/c/f=23/23/0 sis=38) [3,2,1] r=0 lpr=38 pi=[22,38)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Primary Dec 15 03:04:37 localhost ceph-osd[32311]: osd.3 pg_epoch: 39 pg[3.11( empty local-lis/les=22/23 n=0 ec=38/22 lis/c=22/22 les/c/f=23/23/0 sis=38) [3,2,1] r=0 lpr=38 pi=[22,38)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Primary Dec 15 03:04:37 localhost ceph-osd[32311]: osd.3 pg_epoch: 39 pg[3.a( empty local-lis/les=22/23 n=0 ec=38/22 lis/c=22/22 les/c/f=23/23/0 sis=38) [3,2,1] r=0 lpr=38 pi=[22,38)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Primary Dec 15 03:04:37 localhost ceph-osd[32311]: osd.3 pg_epoch: 39 pg[3.10( empty local-lis/les=22/23 n=0 ec=38/22 lis/c=22/22 les/c/f=23/23/0 sis=38) [3,2,1] r=0 lpr=38 pi=[22,38)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Primary Dec 15 03:04:37 localhost ceph-osd[32311]: osd.3 pg_epoch: 39 pg[3.12( empty local-lis/les=22/23 n=0 ec=38/22 lis/c=22/22 les/c/f=23/23/0 sis=38) [3,2,1] r=0 lpr=38 pi=[22,38)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Primary Dec 15 03:04:37 localhost ceph-osd[32311]: osd.3 pg_epoch: 39 pg[3.15( empty local-lis/les=22/23 n=0 ec=38/22 lis/c=22/22 les/c/f=23/23/0 sis=38) [3,2,1] r=0 lpr=38 pi=[22,38)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Primary Dec 15 03:04:37 localhost ceph-osd[32311]: osd.3 pg_epoch: 39 pg[3.13( empty local-lis/les=22/23 n=0 ec=38/22 lis/c=22/22 les/c/f=23/23/0 sis=38) [3,2,1] r=0 lpr=38 pi=[22,38)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Primary Dec 15 03:04:37 localhost ceph-osd[32311]: osd.3 pg_epoch: 39 pg[3.17( empty local-lis/les=22/23 n=0 ec=38/22 lis/c=22/22 les/c/f=23/23/0 sis=38) [3,2,1] r=0 lpr=38 pi=[22,38)/1 crt=0'0 mlcod 0'0 unknown mbc={}] 
state: transitioning to Primary Dec 15 03:04:37 localhost ceph-osd[32311]: osd.3 pg_epoch: 39 pg[3.16( empty local-lis/les=22/23 n=0 ec=38/22 lis/c=22/22 les/c/f=23/23/0 sis=38) [3,2,1] r=0 lpr=38 pi=[22,38)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Primary Dec 15 03:04:37 localhost ceph-osd[32311]: osd.3 pg_epoch: 39 pg[3.19( empty local-lis/les=22/23 n=0 ec=38/22 lis/c=22/22 les/c/f=23/23/0 sis=38) [3,2,1] r=0 lpr=38 pi=[22,38)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Primary Dec 15 03:04:37 localhost ceph-osd[32311]: osd.3 pg_epoch: 39 pg[3.14( empty local-lis/les=22/23 n=0 ec=38/22 lis/c=22/22 les/c/f=23/23/0 sis=38) [3,2,1] r=0 lpr=38 pi=[22,38)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Primary Dec 15 03:04:37 localhost ceph-osd[32311]: osd.3 pg_epoch: 39 pg[3.0( empty local-lis/les=38/39 n=0 ec=22/22 lis/c=22/22 les/c/f=23/23/0 sis=38) [3,2,1] r=0 lpr=38 pi=[22,38)/1 crt=0'0 mlcod 0'0 active mbc={}] state: react AllReplicasActivated Activating complete Dec 15 03:04:37 localhost ceph-osd[32311]: osd.3 pg_epoch: 39 pg[3.1a( empty local-lis/les=38/39 n=0 ec=38/22 lis/c=22/22 les/c/f=23/23/0 sis=38) [3,2,1] r=0 lpr=38 pi=[22,38)/1 crt=0'0 mlcod 0'0 active mbc={}] state: react AllReplicasActivated Activating complete Dec 15 03:04:37 localhost ceph-osd[32311]: osd.3 pg_epoch: 39 pg[3.17( empty local-lis/les=38/39 n=0 ec=38/22 lis/c=22/22 les/c/f=23/23/0 sis=38) [3,2,1] r=0 lpr=38 pi=[22,38)/1 crt=0'0 mlcod 0'0 active mbc={}] state: react AllReplicasActivated Activating complete Dec 15 03:04:37 localhost ceph-osd[32311]: osd.3 pg_epoch: 39 pg[3.1b( empty local-lis/les=38/39 n=0 ec=38/22 lis/c=22/22 les/c/f=23/23/0 sis=38) [3,2,1] r=0 lpr=38 pi=[22,38)/1 crt=0'0 mlcod 0'0 active mbc={}] state: react AllReplicasActivated Activating complete Dec 15 03:04:37 localhost ceph-osd[32311]: osd.3 pg_epoch: 39 pg[3.18( empty local-lis/les=38/39 n=0 ec=38/22 lis/c=22/22 les/c/f=23/23/0 sis=38) [3,2,1] r=0 lpr=38 
pi=[22,38)/1 crt=0'0 mlcod 0'0 active mbc={}] state: react AllReplicasActivated Activating complete Dec 15 03:04:37 localhost ceph-osd[32311]: osd.3 pg_epoch: 39 pg[3.12( empty local-lis/les=38/39 n=0 ec=38/22 lis/c=22/22 les/c/f=23/23/0 sis=38) [3,2,1] r=0 lpr=38 pi=[22,38)/1 crt=0'0 mlcod 0'0 active mbc={}] state: react AllReplicasActivated Activating complete Dec 15 03:04:37 localhost ceph-osd[32311]: osd.3 pg_epoch: 39 pg[3.19( empty local-lis/les=38/39 n=0 ec=38/22 lis/c=22/22 les/c/f=23/23/0 sis=38) [3,2,1] r=0 lpr=38 pi=[22,38)/1 crt=0'0 mlcod 0'0 active mbc={}] state: react AllReplicasActivated Activating complete Dec 15 03:04:37 localhost ceph-osd[32311]: osd.3 pg_epoch: 39 pg[3.14( empty local-lis/les=38/39 n=0 ec=38/22 lis/c=22/22 les/c/f=23/23/0 sis=38) [3,2,1] r=0 lpr=38 pi=[22,38)/1 crt=0'0 mlcod 0'0 active mbc={}] state: react AllReplicasActivated Activating complete Dec 15 03:04:37 localhost ceph-osd[32311]: osd.3 pg_epoch: 39 pg[3.d( empty local-lis/les=38/39 n=0 ec=38/22 lis/c=22/22 les/c/f=23/23/0 sis=38) [3,2,1] r=0 lpr=38 pi=[22,38)/1 crt=0'0 mlcod 0'0 active mbc={}] state: react AllReplicasActivated Activating complete Dec 15 03:04:37 localhost ceph-osd[32311]: osd.3 pg_epoch: 39 pg[3.10( empty local-lis/les=38/39 n=0 ec=38/22 lis/c=22/22 les/c/f=23/23/0 sis=38) [3,2,1] r=0 lpr=38 pi=[22,38)/1 crt=0'0 mlcod 0'0 active mbc={}] state: react AllReplicasActivated Activating complete Dec 15 03:04:37 localhost ceph-osd[32311]: osd.3 pg_epoch: 39 pg[3.11( empty local-lis/les=38/39 n=0 ec=38/22 lis/c=22/22 les/c/f=23/23/0 sis=38) [3,2,1] r=0 lpr=38 pi=[22,38)/1 crt=0'0 mlcod 0'0 active mbc={}] state: react AllReplicasActivated Activating complete Dec 15 03:04:37 localhost ceph-osd[32311]: osd.3 pg_epoch: 39 pg[3.e( empty local-lis/les=38/39 n=0 ec=38/22 lis/c=22/22 les/c/f=23/23/0 sis=38) [3,2,1] r=0 lpr=38 pi=[22,38)/1 crt=0'0 mlcod 0'0 active mbc={}] state: react AllReplicasActivated Activating complete Dec 15 03:04:37 localhost ceph-osd[32311]: 
osd.3 pg_epoch: 39 pg[3.13( empty local-lis/les=38/39 n=0 ec=38/22 lis/c=22/22 les/c/f=23/23/0 sis=38) [3,2,1] r=0 lpr=38 pi=[22,38)/1 crt=0'0 mlcod 0'0 active mbc={}] state: react AllReplicasActivated Activating complete
Dec 15 03:04:37 localhost ceph-osd[32311]: osd.3 pg_epoch: 39 pg[3.c( empty local-lis/les=38/39 n=0 ec=38/22 lis/c=22/22 les/c/f=23/23/0 sis=38) [3,2,1] r=0 lpr=38 pi=[22,38)/1 crt=0'0 mlcod 0'0 active mbc={}] state: react AllReplicasActivated Activating complete
Dec 15 03:04:37 localhost ceph-osd[32311]: osd.3 pg_epoch: 39 pg[3.15( empty local-lis/les=38/39 n=0 ec=38/22 lis/c=22/22 les/c/f=23/23/0 sis=38) [3,2,1] r=0 lpr=38 pi=[22,38)/1 crt=0'0 mlcod 0'0 active mbc={}] state: react AllReplicasActivated Activating complete
Dec 15 03:04:37 localhost ceph-osd[32311]: osd.3 pg_epoch: 39 pg[3.1( empty local-lis/les=38/39 n=0 ec=38/22 lis/c=22/22 les/c/f=23/23/0 sis=38) [3,2,1] r=0 lpr=38 pi=[22,38)/1 crt=0'0 mlcod 0'0 active mbc={}] state: react AllReplicasActivated Activating complete
Dec 15 03:04:37 localhost ceph-osd[32311]: osd.3 pg_epoch: 39 pg[3.3( empty local-lis/les=38/39 n=0 ec=38/22 lis/c=22/22 les/c/f=23/23/0 sis=38) [3,2,1] r=0 lpr=38 pi=[22,38)/1 crt=0'0 mlcod 0'0 active mbc={}] state: react AllReplicasActivated Activating complete
Dec 15 03:04:37 localhost ceph-osd[32311]: osd.3 pg_epoch: 39 pg[3.4( empty local-lis/les=38/39 n=0 ec=38/22 lis/c=22/22 les/c/f=23/23/0 sis=38) [3,2,1] r=0 lpr=38 pi=[22,38)/1 crt=0'0 mlcod 0'0 active mbc={}] state: react AllReplicasActivated Activating complete
Dec 15 03:04:37 localhost ceph-osd[32311]: osd.3 pg_epoch: 39 pg[3.5( empty local-lis/les=38/39 n=0 ec=38/22 lis/c=22/22 les/c/f=23/23/0 sis=38) [3,2,1] r=0 lpr=38 pi=[22,38)/1 crt=0'0 mlcod 0'0 active mbc={}] state: react AllReplicasActivated Activating complete
Dec 15 03:04:37 localhost ceph-osd[32311]: osd.3 pg_epoch: 39 pg[3.6( empty local-lis/les=38/39 n=0 ec=38/22 lis/c=22/22 les/c/f=23/23/0 sis=38) [3,2,1] r=0 lpr=38 pi=[22,38)/1 crt=0'0 mlcod 0'0 active mbc={}] state: react AllReplicasActivated Activating complete
Dec 15 03:04:37 localhost ceph-osd[32311]: osd.3 pg_epoch: 39 pg[3.7( empty local-lis/les=38/39 n=0 ec=38/22 lis/c=22/22 les/c/f=23/23/0 sis=38) [3,2,1] r=0 lpr=38 pi=[22,38)/1 crt=0'0 mlcod 0'0 active mbc={}] state: react AllReplicasActivated Activating complete
Dec 15 03:04:37 localhost ceph-osd[32311]: osd.3 pg_epoch: 39 pg[3.f( empty local-lis/les=38/39 n=0 ec=38/22 lis/c=22/22 les/c/f=23/23/0 sis=38) [3,2,1] r=0 lpr=38 pi=[22,38)/1 crt=0'0 mlcod 0'0 active mbc={}] state: react AllReplicasActivated Activating complete
Dec 15 03:04:37 localhost ceph-osd[32311]: osd.3 pg_epoch: 39 pg[3.9( empty local-lis/les=38/39 n=0 ec=38/22 lis/c=22/22 les/c/f=23/23/0 sis=38) [3,2,1] r=0 lpr=38 pi=[22,38)/1 crt=0'0 mlcod 0'0 active mbc={}] state: react AllReplicasActivated Activating complete
Dec 15 03:04:37 localhost ceph-osd[32311]: osd.3 pg_epoch: 39 pg[3.a( empty local-lis/les=38/39 n=0 ec=38/22 lis/c=22/22 les/c/f=23/23/0 sis=38) [3,2,1] r=0 lpr=38 pi=[22,38)/1 crt=0'0 mlcod 0'0 active mbc={}] state: react AllReplicasActivated Activating complete
Dec 15 03:04:37 localhost ceph-osd[32311]: osd.3 pg_epoch: 39 pg[3.8( empty local-lis/les=38/39 n=0 ec=38/22 lis/c=22/22 les/c/f=23/23/0 sis=38) [3,2,1] r=0 lpr=38 pi=[22,38)/1 crt=0'0 mlcod 0'0 active mbc={}] state: react AllReplicasActivated Activating complete
Dec 15 03:04:37 localhost ceph-osd[32311]: osd.3 pg_epoch: 39 pg[3.2( empty local-lis/les=38/39 n=0 ec=38/22 lis/c=22/22 les/c/f=23/23/0 sis=38) [3,2,1] r=0 lpr=38 pi=[22,38)/1 crt=0'0 mlcod 0'0 active mbc={}] state: react AllReplicasActivated Activating complete
Dec 15 03:04:37 localhost ceph-osd[32311]: osd.3 pg_epoch: 39 pg[3.1d( empty local-lis/les=38/39 n=0 ec=38/22 lis/c=22/22 les/c/f=23/23/0 sis=38) [3,2,1] r=0 lpr=38 pi=[22,38)/1 crt=0'0 mlcod 0'0 active mbc={}] state: react AllReplicasActivated Activating complete
Dec 15 03:04:37 localhost ceph-osd[32311]: osd.3 pg_epoch: 39 pg[3.b( empty local-lis/les=38/39 n=0 ec=38/22 lis/c=22/22 les/c/f=23/23/0 sis=38) [3,2,1] r=0 lpr=38 pi=[22,38)/1 crt=0'0 mlcod 0'0 active mbc={}] state: react AllReplicasActivated Activating complete
Dec 15 03:04:37 localhost ceph-osd[32311]: osd.3 pg_epoch: 39 pg[3.1e( empty local-lis/les=38/39 n=0 ec=38/22 lis/c=22/22 les/c/f=23/23/0 sis=38) [3,2,1] r=0 lpr=38 pi=[22,38)/1 crt=0'0 mlcod 0'0 active mbc={}] state: react AllReplicasActivated Activating complete
Dec 15 03:04:37 localhost ceph-osd[32311]: osd.3 pg_epoch: 39 pg[3.1c( empty local-lis/les=38/39 n=0 ec=38/22 lis/c=22/22 les/c/f=23/23/0 sis=38) [3,2,1] r=0 lpr=38 pi=[22,38)/1 crt=0'0 mlcod 0'0 active mbc={}] state: react AllReplicasActivated Activating complete
Dec 15 03:04:37 localhost ceph-osd[32311]: osd.3 pg_epoch: 39 pg[3.1f( empty local-lis/les=38/39 n=0 ec=38/22 lis/c=22/22 les/c/f=23/23/0 sis=38) [3,2,1] r=0 lpr=38 pi=[22,38)/1 crt=0'0 mlcod 0'0 active mbc={}] state: react AllReplicasActivated Activating complete
Dec 15 03:04:37 localhost ceph-osd[32311]: osd.3 pg_epoch: 39 pg[3.16( empty local-lis/les=38/39 n=0 ec=38/22 lis/c=22/22 les/c/f=23/23/0 sis=38) [3,2,1] r=0 lpr=38 pi=[22,38)/1 crt=0'0 mlcod 0'0 active mbc={}] state: react AllReplicasActivated Activating complete
Dec 15 03:04:38 localhost ceph-osd[32311]: osd.3 pg_epoch: 40 pg[4.0( empty local-lis/les=24/25 n=0 ec=24/24 lis/c=24/24 les/c/f=25/25/0 sis=40 pruub=9.986693382s) [4,5,3] r=2 lpr=40 pi=[24,40)/1 crt=0'0 mlcod 0'0 active pruub 1177.900024414s@ mbc={}] start_peering_interval up [4,5,3] -> [4,5,3], acting [4,5,3] -> [4,5,3], acting_primary 4 -> 4, up_primary 4 -> 4, role 2 -> 2, features acting 4540138322906710015 upacting 4540138322906710015
Dec 15 03:04:38 localhost ceph-osd[32311]: osd.3 pg_epoch: 40 pg[5.0( empty local-lis/les=26/27 n=0 ec=26/26 lis/c=26/26 les/c/f=27/27/0 sis=40 pruub=12.353844643s) [3,4,2] r=0 lpr=40 pi=[26,40)/1 crt=0'0 mlcod 0'0 active pruub 1180.267333984s@ mbc={}] start_peering_interval up [3,4,2] -> [3,4,2], acting [3,4,2] -> [3,4,2], acting_primary 3 -> 3, up_primary 3 -> 3, role 0 -> 0, features acting 4540138322906710015 upacting 4540138322906710015
Dec 15 03:04:38 localhost ceph-osd[32311]: osd.3 pg_epoch: 40 pg[4.0( empty local-lis/les=24/25 n=0 ec=24/24 lis/c=24/24 les/c/f=25/25/0 sis=40 pruub=9.983009338s) [4,5,3] r=2 lpr=40 pi=[24,40)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1177.900024414s@ mbc={}] state: transitioning to Stray
Dec 15 03:04:38 localhost ceph-osd[32311]: osd.3 pg_epoch: 40 pg[5.0( empty local-lis/les=26/27 n=0 ec=26/26 lis/c=26/26 les/c/f=27/27/0 sis=40 pruub=12.353844643s) [3,4,2] r=0 lpr=40 pi=[26,40)/1 crt=0'0 mlcod 0'0 unknown pruub 1180.267333984s@ mbc={}] state: transitioning to Primary
Dec 15 03:04:39 localhost ceph-osd[32311]: osd.3 pg_epoch: 41 pg[4.19( empty local-lis/les=24/25 n=0 ec=40/24 lis/c=24/24 les/c/f=25/25/0 sis=40) [4,5,3] r=2 lpr=40 pi=[24,40)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state: transitioning to Stray
Dec 15 03:04:39 localhost ceph-osd[32311]: osd.3 pg_epoch: 41 pg[5.18( empty local-lis/les=26/27 n=0 ec=40/26 lis/c=26/26 les/c/f=27/27/0 sis=40) [3,4,2] r=0 lpr=40 pi=[26,40)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Primary
Dec 15 03:04:39 localhost ceph-osd[32311]: osd.3 pg_epoch: 41 pg[5.19( empty local-lis/les=26/27 n=0 ec=40/26 lis/c=26/26 les/c/f=27/27/0 sis=40) [3,4,2] r=0 lpr=40 pi=[26,40)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Primary
Dec 15 03:04:39 localhost ceph-osd[32311]: osd.3 pg_epoch: 41 pg[5.1a( empty local-lis/les=26/27 n=0 ec=40/26 lis/c=26/26 les/c/f=27/27/0 sis=40) [3,4,2] r=0 lpr=40 pi=[26,40)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Primary
Dec 15 03:04:39 localhost ceph-osd[32311]: osd.3 pg_epoch: 41 pg[4.1a( empty local-lis/les=24/25 n=0 ec=40/24 lis/c=24/24 les/c/f=25/25/0 sis=40) [4,5,3] r=2 lpr=40 pi=[24,40)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state: transitioning to Stray
Dec 15 03:04:39 localhost ceph-osd[32311]: osd.3 pg_epoch: 41 pg[5.1b( empty local-lis/les=26/27 n=0 ec=40/26 lis/c=26/26 les/c/f=27/27/0 sis=40) [3,4,2] r=0 lpr=40 pi=[26,40)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Primary
Dec 15 03:04:39 localhost ceph-osd[32311]: osd.3 pg_epoch: 41 pg[4.1b( empty local-lis/les=24/25 n=0 ec=40/24 lis/c=24/24 les/c/f=25/25/0 sis=40) [4,5,3] r=2 lpr=40 pi=[24,40)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state: transitioning to Stray
Dec 15 03:04:39 localhost ceph-osd[32311]: osd.3 pg_epoch: 41 pg[4.1d( empty local-lis/les=24/25 n=0 ec=40/24 lis/c=24/24 les/c/f=25/25/0 sis=40) [4,5,3] r=2 lpr=40 pi=[24,40)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state: transitioning to Stray
Dec 15 03:04:39 localhost ceph-osd[32311]: osd.3 pg_epoch: 41 pg[5.1c( empty local-lis/les=26/27 n=0 ec=40/26 lis/c=26/26 les/c/f=27/27/0 sis=40) [3,4,2] r=0 lpr=40 pi=[26,40)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Primary
Dec 15 03:04:39 localhost ceph-osd[32311]: osd.3 pg_epoch: 41 pg[4.1c( empty local-lis/les=24/25 n=0 ec=40/24 lis/c=24/24 les/c/f=25/25/0 sis=40) [4,5,3] r=2 lpr=40 pi=[24,40)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state: transitioning to Stray
Dec 15 03:04:39 localhost ceph-osd[32311]: osd.3 pg_epoch: 41 pg[5.1d( empty local-lis/les=26/27 n=0 ec=40/26 lis/c=26/26 les/c/f=27/27/0 sis=40) [3,4,2] r=0 lpr=40 pi=[26,40)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Primary
Dec 15 03:04:39 localhost ceph-osd[32311]: osd.3 pg_epoch: 41 pg[4.18( empty local-lis/les=24/25 n=0 ec=40/24 lis/c=24/24 les/c/f=25/25/0 sis=40) [4,5,3] r=2 lpr=40 pi=[24,40)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state: transitioning to Stray
Dec 15 03:04:39 localhost ceph-osd[32311]: osd.3 pg_epoch: 41 pg[5.1e( empty local-lis/les=26/27 n=0 ec=40/26 lis/c=26/26 les/c/f=27/27/0 sis=40) [3,4,2] r=0 lpr=40 pi=[26,40)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Primary
Dec 15 03:04:39 localhost ceph-osd[32311]: osd.3 pg_epoch: 41 pg[4.e( empty local-lis/les=24/25 n=0 ec=40/24 lis/c=24/24 les/c/f=25/25/0 sis=40) [4,5,3] r=2 lpr=40 pi=[24,40)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state: transitioning to Stray
Dec 15 03:04:39 localhost ceph-osd[32311]: osd.3 pg_epoch: 41 pg[4.1f( empty local-lis/les=24/25 n=0 ec=40/24 lis/c=24/24 les/c/f=25/25/0 sis=40) [4,5,3] r=2 lpr=40 pi=[24,40)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state: transitioning to Stray
Dec 15 03:04:39 localhost ceph-osd[32311]: osd.3 pg_epoch: 41 pg[5.f( empty local-lis/les=26/27 n=0 ec=40/26 lis/c=26/26 les/c/f=27/27/0 sis=40) [3,4,2] r=0 lpr=40 pi=[26,40)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Primary
Dec 15 03:04:39 localhost ceph-osd[32311]: osd.3 pg_epoch: 41 pg[4.3( empty local-lis/les=24/25 n=0 ec=40/24 lis/c=24/24 les/c/f=25/25/0 sis=40) [4,5,3] r=2 lpr=40 pi=[24,40)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state: transitioning to Stray
Dec 15 03:04:39 localhost ceph-osd[32311]: osd.3 pg_epoch: 41 pg[5.2( empty local-lis/les=26/27 n=0 ec=40/26 lis/c=26/26 les/c/f=27/27/0 sis=40) [3,4,2] r=0 lpr=40 pi=[26,40)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Primary
Dec 15 03:04:39 localhost ceph-osd[32311]: osd.3 pg_epoch: 41 pg[5.4( empty local-lis/les=26/27 n=0 ec=40/26 lis/c=26/26 les/c/f=27/27/0 sis=40) [3,4,2] r=0 lpr=40 pi=[26,40)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Primary
Dec 15 03:04:39 localhost ceph-osd[32311]: osd.3 pg_epoch: 41 pg[4.5( empty local-lis/les=24/25 n=0 ec=40/24 lis/c=24/24 les/c/f=25/25/0 sis=40) [4,5,3] r=2 lpr=40 pi=[24,40)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state: transitioning to Stray
Dec 15 03:04:39 localhost ceph-osd[32311]: osd.3 pg_epoch: 41 pg[5.5( empty local-lis/les=26/27 n=0 ec=40/26 lis/c=26/26 les/c/f=27/27/0 sis=40) [3,4,2] r=0 lpr=40 pi=[26,40)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Primary
Dec 15 03:04:39 localhost ceph-osd[32311]: osd.3 pg_epoch: 41 pg[4.4( empty local-lis/les=24/25 n=0 ec=40/24 lis/c=24/24 les/c/f=25/25/0 sis=40) [4,5,3] r=2 lpr=40 pi=[24,40)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state: transitioning to Stray
Dec 15 03:04:39 localhost ceph-osd[32311]: osd.3 pg_epoch: 41 pg[4.2( empty local-lis/les=24/25 n=0 ec=40/24 lis/c=24/24 les/c/f=25/25/0 sis=40) [4,5,3] r=2 lpr=40 pi=[24,40)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state: transitioning to Stray
Dec 15 03:04:39 localhost ceph-osd[32311]: osd.3 pg_epoch: 41 pg[5.3( empty local-lis/les=26/27 n=0 ec=40/26 lis/c=26/26 les/c/f=27/27/0 sis=40) [3,4,2] r=0 lpr=40 pi=[26,40)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Primary
Dec 15 03:04:39 localhost ceph-osd[32311]: osd.3 pg_epoch: 41 pg[4.1( empty local-lis/les=24/25 n=0 ec=40/24 lis/c=24/24 les/c/f=25/25/0 sis=40) [4,5,3] r=2 lpr=40 pi=[24,40)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state: transitioning to Stray
Dec 15 03:04:39 localhost ceph-osd[32311]: osd.3 pg_epoch: 41 pg[4.7( empty local-lis/les=24/25 n=0 ec=40/24 lis/c=24/24 les/c/f=25/25/0 sis=40) [4,5,3] r=2 lpr=40 pi=[24,40)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state: transitioning to Stray
Dec 15 03:04:39 localhost ceph-osd[32311]: osd.3 pg_epoch: 41 pg[5.1( empty local-lis/les=26/27 n=0 ec=40/26 lis/c=26/26 les/c/f=27/27/0 sis=40) [3,4,2] r=0 lpr=40 pi=[26,40)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Primary
Dec 15 03:04:39 localhost ceph-osd[32311]: osd.3 pg_epoch: 41 pg[5.6( empty local-lis/les=26/27 n=0 ec=40/26 lis/c=26/26 les/c/f=27/27/0 sis=40) [3,4,2] r=0 lpr=40 pi=[26,40)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Primary
Dec 15 03:04:39 localhost ceph-osd[32311]: osd.3 pg_epoch: 41 pg[4.6( empty local-lis/les=24/25 n=0 ec=40/24 lis/c=24/24 les/c/f=25/25/0 sis=40) [4,5,3] r=2 lpr=40 pi=[24,40)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state: transitioning to Stray
Dec 15 03:04:39 localhost ceph-osd[32311]: osd.3 pg_epoch: 41 pg[5.7( empty local-lis/les=26/27 n=0 ec=40/26 lis/c=26/26 les/c/f=27/27/0 sis=40) [3,4,2] r=0 lpr=40 pi=[26,40)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Primary
Dec 15 03:04:39 localhost ceph-osd[32311]: osd.3 pg_epoch: 41 pg[4.f( empty local-lis/les=24/25 n=0 ec=40/24 lis/c=24/24 les/c/f=25/25/0 sis=40) [4,5,3] r=2 lpr=40 pi=[24,40)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state: transitioning to Stray
Dec 15 03:04:39 localhost ceph-osd[32311]: osd.3 pg_epoch: 41 pg[4.c( empty local-lis/les=24/25 n=0 ec=40/24 lis/c=24/24 les/c/f=25/25/0 sis=40) [4,5,3] r=2 lpr=40 pi=[24,40)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state: transitioning to Stray
Dec 15 03:04:39 localhost ceph-osd[32311]: osd.3 pg_epoch: 41 pg[5.d( empty local-lis/les=26/27 n=0 ec=40/26 lis/c=26/26 les/c/f=27/27/0 sis=40) [3,4,2] r=0 lpr=40 pi=[26,40)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Primary
Dec 15 03:04:39 localhost ceph-osd[32311]: osd.3 pg_epoch: 41 pg[4.d( empty local-lis/les=24/25 n=0 ec=40/24 lis/c=24/24 les/c/f=25/25/0 sis=40) [4,5,3] r=2 lpr=40 pi=[24,40)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state: transitioning to Stray
Dec 15 03:04:39 localhost ceph-osd[32311]: osd.3 pg_epoch: 41 pg[5.c( empty local-lis/les=26/27 n=0 ec=40/26 lis/c=26/26 les/c/f=27/27/0 sis=40) [3,4,2] r=0 lpr=40 pi=[26,40)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Primary
Dec 15 03:04:39 localhost ceph-osd[32311]: osd.3 pg_epoch: 41 pg[4.a( empty local-lis/les=24/25 n=0 ec=40/24 lis/c=24/24 les/c/f=25/25/0 sis=40) [4,5,3] r=2 lpr=40 pi=[24,40)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state: transitioning to Stray
Dec 15 03:04:39 localhost ceph-osd[32311]: osd.3 pg_epoch: 41 pg[5.b( empty local-lis/les=26/27 n=0 ec=40/26 lis/c=26/26 les/c/f=27/27/0 sis=40) [3,4,2] r=0 lpr=40 pi=[26,40)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Primary
Dec 15 03:04:39 localhost ceph-osd[32311]: osd.3 pg_epoch: 41 pg[4.b( empty local-lis/les=24/25 n=0 ec=40/24 lis/c=24/24 les/c/f=25/25/0 sis=40) [4,5,3] r=2 lpr=40 pi=[24,40)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state: transitioning to Stray
Dec 15 03:04:39 localhost ceph-osd[32311]: osd.3 pg_epoch: 41 pg[5.a( empty local-lis/les=26/27 n=0 ec=40/26 lis/c=26/26 les/c/f=27/27/0 sis=40) [3,4,2] r=0 lpr=40 pi=[26,40)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Primary
Dec 15 03:04:39 localhost ceph-osd[32311]: osd.3 pg_epoch: 41 pg[5.9( empty local-lis/les=26/27 n=0 ec=40/26 lis/c=26/26 les/c/f=27/27/0 sis=40) [3,4,2] r=0 lpr=40 pi=[26,40)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Primary
Dec 15 03:04:39 localhost ceph-osd[32311]: osd.3 pg_epoch: 41 pg[4.9( empty local-lis/les=24/25 n=0 ec=40/24 lis/c=24/24 les/c/f=25/25/0 sis=40) [4,5,3] r=2 lpr=40 pi=[24,40)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state: transitioning to Stray
Dec 15 03:04:39 localhost ceph-osd[32311]: osd.3 pg_epoch: 41 pg[5.8( empty local-lis/les=26/27 n=0 ec=40/26 lis/c=26/26 les/c/f=27/27/0 sis=40) [3,4,2] r=0 lpr=40 pi=[26,40)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Primary
Dec 15 03:04:39 localhost ceph-osd[32311]: osd.3 pg_epoch: 41 pg[4.8( empty local-lis/les=24/25 n=0 ec=40/24 lis/c=24/24 les/c/f=25/25/0 sis=40) [4,5,3] r=2 lpr=40 pi=[24,40)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state: transitioning to Stray
Dec 15 03:04:39 localhost ceph-osd[32311]: osd.3 pg_epoch: 41 pg[4.16( empty local-lis/les=24/25 n=0 ec=40/24 lis/c=24/24 les/c/f=25/25/0 sis=40) [4,5,3] r=2 lpr=40 pi=[24,40)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state: transitioning to Stray
Dec 15 03:04:39 localhost ceph-osd[32311]: osd.3 pg_epoch: 41 pg[5.17( empty local-lis/les=26/27 n=0 ec=40/26 lis/c=26/26 les/c/f=27/27/0 sis=40) [3,4,2] r=0 lpr=40 pi=[26,40)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Primary
Dec 15 03:04:39 localhost ceph-osd[32311]: osd.3 pg_epoch: 41 pg[4.17( empty local-lis/les=24/25 n=0 ec=40/24 lis/c=24/24 les/c/f=25/25/0 sis=40) [4,5,3] r=2 lpr=40 pi=[24,40)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state: transitioning to Stray
Dec 15 03:04:39 localhost ceph-osd[32311]: osd.3 pg_epoch: 41 pg[5.e( empty local-lis/les=26/27 n=0 ec=40/26 lis/c=26/26 les/c/f=27/27/0 sis=40) [3,4,2] r=0 lpr=40 pi=[26,40)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Primary
Dec 15 03:04:39 localhost ceph-osd[32311]: osd.3 pg_epoch: 41 pg[5.16( empty local-lis/les=26/27 n=0 ec=40/26 lis/c=26/26 les/c/f=27/27/0 sis=40) [3,4,2] r=0 lpr=40 pi=[26,40)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Primary
Dec 15 03:04:39 localhost ceph-osd[32311]: osd.3 pg_epoch: 41 pg[4.14( empty local-lis/les=24/25 n=0 ec=40/24 lis/c=24/24 les/c/f=25/25/0 sis=40) [4,5,3] r=2 lpr=40 pi=[24,40)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state: transitioning to Stray
Dec 15 03:04:39 localhost ceph-osd[32311]: osd.3 pg_epoch: 41 pg[5.15( empty local-lis/les=26/27 n=0 ec=40/26 lis/c=26/26 les/c/f=27/27/0 sis=40) [3,4,2] r=0 lpr=40 pi=[26,40)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Primary
Dec 15 03:04:39 localhost ceph-osd[32311]: osd.3 pg_epoch: 41 pg[4.15( empty local-lis/les=24/25 n=0 ec=40/24 lis/c=24/24 les/c/f=25/25/0 sis=40) [4,5,3] r=2 lpr=40 pi=[24,40)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state: transitioning to Stray
Dec 15 03:04:39 localhost ceph-osd[32311]: osd.3 pg_epoch: 41 pg[5.14( empty local-lis/les=26/27 n=0 ec=40/26 lis/c=26/26 les/c/f=27/27/0 sis=40) [3,4,2] r=0 lpr=40 pi=[26,40)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Primary
Dec 15 03:04:39 localhost ceph-osd[32311]: osd.3 pg_epoch: 41 pg[4.12( empty local-lis/les=24/25 n=0 ec=40/24 lis/c=24/24 les/c/f=25/25/0 sis=40) [4,5,3] r=2 lpr=40 pi=[24,40)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state: transitioning to Stray
Dec 15 03:04:39 localhost ceph-osd[32311]: osd.3 pg_epoch: 41 pg[4.13( empty local-lis/les=24/25 n=0 ec=40/24 lis/c=24/24 les/c/f=25/25/0 sis=40) [4,5,3] r=2 lpr=40 pi=[24,40)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state: transitioning to Stray
Dec 15 03:04:39 localhost ceph-osd[32311]: osd.3 pg_epoch: 41 pg[5.12( empty local-lis/les=26/27 n=0 ec=40/26 lis/c=26/26 les/c/f=27/27/0 sis=40) [3,4,2] r=0 lpr=40 pi=[26,40)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Primary
Dec 15 03:04:39 localhost ceph-osd[32311]: osd.3 pg_epoch: 41 pg[4.10( empty local-lis/les=24/25 n=0 ec=40/24 lis/c=24/24 les/c/f=25/25/0 sis=40) [4,5,3] r=2 lpr=40 pi=[24,40)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state: transitioning to Stray
Dec 15 03:04:39 localhost ceph-osd[32311]: osd.3 pg_epoch: 41 pg[5.11( empty local-lis/les=26/27 n=0 ec=40/26 lis/c=26/26 les/c/f=27/27/0 sis=40) [3,4,2] r=0 lpr=40 pi=[26,40)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Primary
Dec 15 03:04:39 localhost ceph-osd[32311]: osd.3 pg_epoch: 41 pg[5.13( empty local-lis/les=26/27 n=0 ec=40/26 lis/c=26/26 les/c/f=27/27/0 sis=40) [3,4,2] r=0 lpr=40 pi=[26,40)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Primary
Dec 15 03:04:39 localhost ceph-osd[32311]: osd.3 pg_epoch: 41 pg[4.11( empty local-lis/les=24/25 n=0 ec=40/24 lis/c=24/24 les/c/f=25/25/0 sis=40) [4,5,3] r=2 lpr=40 pi=[24,40)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state: transitioning to Stray
Dec 15 03:04:39 localhost ceph-osd[32311]: osd.3 pg_epoch: 41 pg[5.10( empty local-lis/les=26/27 n=0 ec=40/26 lis/c=26/26 les/c/f=27/27/0 sis=40) [3,4,2] r=0 lpr=40 pi=[26,40)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Primary
Dec 15 03:04:39 localhost ceph-osd[32311]: osd.3 pg_epoch: 41 pg[5.1f( empty local-lis/les=26/27 n=0 ec=40/26 lis/c=26/26 les/c/f=27/27/0 sis=40) [3,4,2] r=0 lpr=40 pi=[26,40)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Primary
Dec 15 03:04:39 localhost ceph-osd[32311]: osd.3 pg_epoch: 41 pg[4.1e( empty local-lis/les=24/25 n=0 ec=40/24 lis/c=24/24 les/c/f=25/25/0 sis=40) [4,5,3] r=2 lpr=40 pi=[24,40)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state: transitioning to Stray
Dec 15 03:04:39 localhost ceph-osd[32311]: osd.3 pg_epoch: 41 pg[5.0( empty local-lis/les=40/41 n=0 ec=26/26 lis/c=26/26 les/c/f=27/27/0 sis=40) [3,4,2] r=0 lpr=40 pi=[26,40)/1 crt=0'0 mlcod 0'0 active mbc={}] state: react AllReplicasActivated Activating complete
Dec 15 03:04:39 localhost ceph-osd[32311]: osd.3 pg_epoch: 41 pg[5.1d( empty local-lis/les=40/41 n=0 ec=40/26 lis/c=26/26 les/c/f=27/27/0 sis=40) [3,4,2] r=0 lpr=40 pi=[26,40)/1 crt=0'0 mlcod 0'0 active mbc={}] state: react AllReplicasActivated Activating complete
Dec 15 03:04:39 localhost ceph-osd[32311]: osd.3 pg_epoch: 41 pg[5.1b( empty local-lis/les=40/41 n=0 ec=40/26 lis/c=26/26 les/c/f=27/27/0 sis=40) [3,4,2] r=0 lpr=40 pi=[26,40)/1 crt=0'0 mlcod 0'0 active mbc={}] state: react AllReplicasActivated Activating complete
Dec 15 03:04:39 localhost ceph-osd[32311]: osd.3 pg_epoch: 41 pg[5.1a( empty local-lis/les=40/41 n=0 ec=40/26 lis/c=26/26 les/c/f=27/27/0 sis=40) [3,4,2] r=0 lpr=40 pi=[26,40)/1 crt=0'0 mlcod 0'0 active mbc={}] state: react AllReplicasActivated Activating complete
Dec 15 03:04:39 localhost ceph-osd[32311]: osd.3 pg_epoch: 41 pg[5.18( empty local-lis/les=40/41 n=0 ec=40/26 lis/c=26/26 les/c/f=27/27/0 sis=40) [3,4,2] r=0 lpr=40 pi=[26,40)/1 crt=0'0 mlcod 0'0 active mbc={}] state: react AllReplicasActivated Activating complete
Dec 15 03:04:39 localhost ceph-osd[32311]: osd.3 pg_epoch: 41 pg[5.6( empty local-lis/les=40/41 n=0 ec=40/26 lis/c=26/26 les/c/f=27/27/0 sis=40) [3,4,2] r=0 lpr=40 pi=[26,40)/1 crt=0'0 mlcod 0'0 active mbc={}] state: react AllReplicasActivated Activating complete
Dec 15 03:04:39 localhost ceph-osd[32311]: osd.3 pg_epoch: 41 pg[5.9( empty local-lis/les=40/41 n=0 ec=40/26 lis/c=26/26 les/c/f=27/27/0 sis=40) [3,4,2] r=0 lpr=40 pi=[26,40)/1 crt=0'0 mlcod 0'0 active mbc={}] state: react AllReplicasActivated Activating complete
Dec 15 03:04:39 localhost ceph-osd[32311]: osd.3 pg_epoch: 41 pg[5.3( empty local-lis/les=40/41 n=0 ec=40/26 lis/c=26/26 les/c/f=27/27/0 sis=40) [3,4,2] r=0 lpr=40 pi=[26,40)/1 crt=0'0 mlcod 0'0 active mbc={}] state: react AllReplicasActivated Activating complete
Dec 15 03:04:39 localhost ceph-osd[32311]: osd.3 pg_epoch: 41 pg[5.1e( empty local-lis/les=40/41 n=0 ec=40/26 lis/c=26/26 les/c/f=27/27/0 sis=40) [3,4,2] r=0 lpr=40 pi=[26,40)/1 crt=0'0 mlcod 0'0 active mbc={}] state: react AllReplicasActivated Activating complete
Dec 15 03:04:39 localhost ceph-osd[32311]: osd.3 pg_epoch: 41 pg[5.2( empty local-lis/les=40/41 n=0 ec=40/26 lis/c=26/26 les/c/f=27/27/0 sis=40) [3,4,2] r=0 lpr=40 pi=[26,40)/1 crt=0'0 mlcod 0'0 active mbc={}] state: react AllReplicasActivated Activating complete
Dec 15 03:04:39 localhost ceph-osd[32311]: osd.3 pg_epoch: 41 pg[5.1( empty local-lis/les=40/41 n=0 ec=40/26 lis/c=26/26 les/c/f=27/27/0 sis=40) [3,4,2] r=0 lpr=40 pi=[26,40)/1 crt=0'0 mlcod 0'0 active mbc={}] state: react AllReplicasActivated Activating complete
Dec 15 03:04:39 localhost ceph-osd[32311]: osd.3 pg_epoch: 41 pg[5.5( empty local-lis/les=40/41 n=0 ec=40/26 lis/c=26/26 les/c/f=27/27/0 sis=40) [3,4,2] r=0 lpr=40 pi=[26,40)/1 crt=0'0 mlcod 0'0 active mbc={}] state: react AllReplicasActivated Activating complete
Dec 15 03:04:39 localhost ceph-osd[32311]: osd.3 pg_epoch: 41 pg[5.4( empty local-lis/les=40/41 n=0 ec=40/26 lis/c=26/26 les/c/f=27/27/0 sis=40) [3,4,2] r=0 lpr=40 pi=[26,40)/1 crt=0'0 mlcod 0'0 active mbc={}] state: react AllReplicasActivated Activating complete
Dec 15 03:04:39 localhost ceph-osd[32311]: osd.3 pg_epoch: 41 pg[5.b( empty local-lis/les=40/41 n=0 ec=40/26 lis/c=26/26 les/c/f=27/27/0 sis=40) [3,4,2] r=0 lpr=40 pi=[26,40)/1 crt=0'0 mlcod 0'0 active mbc={}] state: react AllReplicasActivated Activating complete
Dec 15 03:04:39 localhost ceph-osd[32311]: osd.3 pg_epoch: 41 pg[5.8( empty local-lis/les=40/41 n=0 ec=40/26 lis/c=26/26 les/c/f=27/27/0 sis=40) [3,4,2] r=0 lpr=40 pi=[26,40)/1 crt=0'0 mlcod 0'0 active mbc={}] state: react AllReplicasActivated Activating complete
Dec 15 03:04:39 localhost ceph-osd[32311]: osd.3 pg_epoch: 41 pg[5.a( empty local-lis/les=40/41 n=0 ec=40/26 lis/c=26/26 les/c/f=27/27/0 sis=40) [3,4,2] r=0 lpr=40 pi=[26,40)/1 crt=0'0 mlcod 0'0 active mbc={}] state: react AllReplicasActivated Activating complete
Dec 15 03:04:39 localhost ceph-osd[32311]: osd.3 pg_epoch: 41 pg[5.1c( empty local-lis/les=40/41 n=0 ec=40/26 lis/c=26/26 les/c/f=27/27/0 sis=40) [3,4,2] r=0 lpr=40 pi=[26,40)/1 crt=0'0 mlcod 0'0 active mbc={}] state: react AllReplicasActivated Activating complete
Dec 15 03:04:39 localhost ceph-osd[32311]: osd.3 pg_epoch: 41 pg[5.f( empty local-lis/les=40/41 n=0 ec=40/26 lis/c=26/26 les/c/f=27/27/0 sis=40) [3,4,2] r=0 lpr=40 pi=[26,40)/1 crt=0'0 mlcod 0'0 active mbc={}] state: react AllReplicasActivated Activating complete
Dec 15 03:04:39 localhost ceph-osd[32311]: osd.3 pg_epoch: 41 pg[5.c( empty local-lis/les=40/41 n=0 ec=40/26 lis/c=26/26 les/c/f=27/27/0 sis=40) [3,4,2] r=0 lpr=40 pi=[26,40)/1 crt=0'0 mlcod 0'0 active mbc={}] state: react AllReplicasActivated Activating complete
Dec 15 03:04:39 localhost ceph-osd[32311]: osd.3 pg_epoch: 41 pg[5.d( empty local-lis/les=40/41 n=0 ec=40/26 lis/c=26/26 les/c/f=27/27/0 sis=40) [3,4,2] r=0 lpr=40 pi=[26,40)/1 crt=0'0 mlcod 0'0 active mbc={}] state: react AllReplicasActivated Activating complete
Dec 15 03:04:39 localhost ceph-osd[32311]: osd.3 pg_epoch: 41 pg[5.17( empty local-lis/les=40/41 n=0 ec=40/26 lis/c=26/26 les/c/f=27/27/0 sis=40) [3,4,2] r=0 lpr=40 pi=[26,40)/1 crt=0'0 mlcod 0'0 active mbc={}] state: react AllReplicasActivated Activating complete
Dec 15 03:04:39 localhost ceph-osd[32311]: osd.3 pg_epoch: 41 pg[5.19( empty local-lis/les=40/41 n=0 ec=40/26 lis/c=26/26 les/c/f=27/27/0 sis=40) [3,4,2] r=0 lpr=40 pi=[26,40)/1 crt=0'0 mlcod 0'0 active mbc={}] state: react AllReplicasActivated Activating complete
Dec 15 03:04:39 localhost ceph-osd[32311]: osd.3 pg_epoch: 41 pg[5.14( empty local-lis/les=40/41 n=0 ec=40/26 lis/c=26/26 les/c/f=27/27/0 sis=40) [3,4,2] r=0 lpr=40 pi=[26,40)/1 crt=0'0 mlcod 0'0 active mbc={}] state: react AllReplicasActivated Activating complete
Dec 15 03:04:39 localhost ceph-osd[32311]: osd.3 pg_epoch: 41 pg[5.7( empty local-lis/les=40/41 n=0 ec=40/26 lis/c=26/26 les/c/f=27/27/0 sis=40) [3,4,2] r=0 lpr=40 pi=[26,40)/1 crt=0'0 mlcod 0'0 active mbc={}] state: react AllReplicasActivated Activating complete
Dec 15 03:04:39 localhost ceph-osd[32311]: osd.3 pg_epoch: 41 pg[5.16( empty local-lis/les=40/41 n=0 ec=40/26 lis/c=26/26 les/c/f=27/27/0 sis=40) [3,4,2] r=0 lpr=40 pi=[26,40)/1 crt=0'0 mlcod 0'0 active mbc={}] state: react AllReplicasActivated Activating complete
Dec 15 03:04:39 localhost ceph-osd[32311]: osd.3 pg_epoch: 41 pg[5.13( empty local-lis/les=40/41 n=0 ec=40/26 lis/c=26/26 les/c/f=27/27/0 sis=40) [3,4,2] r=0 lpr=40 pi=[26,40)/1 crt=0'0 mlcod 0'0 active mbc={}] state: react AllReplicasActivated Activating complete
Dec 15 03:04:39 localhost ceph-osd[32311]: osd.3 pg_epoch: 41 pg[5.15( empty local-lis/les=40/41 n=0 ec=40/26 lis/c=26/26 les/c/f=27/27/0 sis=40) [3,4,2] r=0 lpr=40 pi=[26,40)/1 crt=0'0 mlcod 0'0 active mbc={}] state: react AllReplicasActivated Activating complete
Dec 15 03:04:39 localhost ceph-osd[32311]: osd.3 pg_epoch: 41 pg[5.12( empty local-lis/les=40/41 n=0 ec=40/26 lis/c=26/26 les/c/f=27/27/0 sis=40) [3,4,2] r=0 lpr=40 pi=[26,40)/1 crt=0'0 mlcod 0'0 active mbc={}] state: react AllReplicasActivated Activating complete
Dec 15 03:04:39 localhost ceph-osd[32311]: osd.3 pg_epoch: 41 pg[5.e( empty local-lis/les=40/41 n=0 ec=40/26 lis/c=26/26 les/c/f=27/27/0 sis=40) [3,4,2] r=0 lpr=40 pi=[26,40)/1 crt=0'0 mlcod 0'0 active mbc={}] state: react AllReplicasActivated Activating complete
Dec 15 03:04:39 localhost ceph-osd[32311]: osd.3 pg_epoch: 41 pg[5.10( empty local-lis/les=40/41 n=0 ec=40/26 lis/c=26/26 les/c/f=27/27/0 sis=40) [3,4,2] r=0 lpr=40 pi=[26,40)/1 crt=0'0 mlcod 0'0 active mbc={}] state: react AllReplicasActivated Activating complete
Dec 15 03:04:39 localhost ceph-osd[32311]: osd.3 pg_epoch: 41 pg[5.1f( empty local-lis/les=40/41 n=0 ec=40/26 lis/c=26/26 les/c/f=27/27/0 sis=40) [3,4,2] r=0 lpr=40 pi=[26,40)/1 crt=0'0 mlcod 0'0 active mbc={}] state: react AllReplicasActivated Activating complete
Dec 15 03:04:39 localhost ceph-osd[32311]: osd.3 pg_epoch: 41 pg[5.11( empty local-lis/les=40/41 n=0 ec=40/26 lis/c=26/26 les/c/f=27/27/0 sis=40) [3,4,2] r=0 lpr=40 pi=[26,40)/1 crt=0'0 mlcod 0'0 active mbc={}] state: react AllReplicasActivated Activating complete
Dec 15 03:04:40 localhost ceph-osd[31375]: osd.0 pg_epoch: 42 pg[7.0( v 35'39 (0'0,35'39] local-lis/les=33/34 n=22 ec=33/33 lis/c=33/33 les/c/f=34/34/0 sis=42 pruub=8.493065834s) [5,0,4] r=1 lpr=42 pi=[33,42)/1 luod=0'0 lua=35'37 crt=35'39 lcod 35'38 mlcod 0'0 active pruub 1182.773315430s@ mbc={}] start_peering_interval up [5,0,4] -> [5,0,4], acting [5,0,4] -> [5,0,4], acting_primary 5 -> 5, up_primary 5 -> 5, role 1 -> 1, features acting 4540138322906710015 upacting 4540138322906710015
Dec 15 03:04:40 localhost ceph-osd[32311]: osd.3 pg_epoch: 42 pg[6.0( empty local-lis/les=32/33 n=0 ec=32/32 lis/c=32/32 les/c/f=33/33/0 sis=42 pruub=15.247041702s) [4,5,3] r=2 lpr=42 pi=[32,42)/1 crt=0'0 mlcod 0'0 active pruub 1185.242919922s@ mbc={}] start_peering_interval up [4,5,3] -> [4,5,3], acting [4,5,3] -> [4,5,3], acting_primary 4 -> 4, up_primary 4 -> 4, role 2 -> 2, features acting 4540138322906710015 upacting 4540138322906710015
Dec 15 03:04:40 localhost ceph-osd[31375]: osd.0 pg_epoch: 42 pg[7.0( v 35'39 lc 0'0 (0'0,35'39] local-lis/les=33/34 n=1 ec=33/33 lis/c=33/33 les/c/f=34/34/0 sis=42 pruub=8.492086411s) [5,0,4] r=1 lpr=42 pi=[33,42)/1 crt=35'39 lcod 35'38 mlcod 0'0 unknown NOTIFY pruub 1182.773315430s@ mbc={}] state: transitioning to Stray
Dec 15 03:04:40 localhost ceph-osd[32311]: osd.3 pg_epoch: 42 pg[6.0( empty local-lis/les=32/33 n=0 ec=32/32 lis/c=32/32 les/c/f=33/33/0 sis=42 pruub=15.241368294s) [4,5,3] r=2 lpr=42 pi=[32,42)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1185.242919922s@ mbc={}] state: transitioning to Stray
Dec 15 03:04:41 localhost ceph-osd[32311]: osd.3 pg_epoch: 43 pg[6.1b( empty local-lis/les=32/33 n=0 ec=42/32 lis/c=32/32 les/c/f=33/33/0 sis=42) [4,5,3] r=2 lpr=42 pi=[32,42)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state: transitioning to Stray
Dec 15 03:04:41 localhost ceph-osd[32311]: osd.3 pg_epoch: 43 pg[6.1a( empty local-lis/les=32/33 n=0 ec=42/32 lis/c=32/32 les/c/f=33/33/0 sis=42) [4,5,3] r=2 lpr=42 pi=[32,42)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state: transitioning to Stray
Dec 15 03:04:41 localhost ceph-osd[32311]: osd.3 pg_epoch: 43 pg[6.19( empty local-lis/les=32/33 n=0 ec=42/32 lis/c=32/32 les/c/f=33/33/0 sis=42) [4,5,3] r=2 lpr=42 pi=[32,42)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state: transitioning to Stray
Dec 15 03:04:41 localhost ceph-osd[32311]: osd.3 pg_epoch: 43 pg[6.18( empty local-lis/les=32/33 n=0 ec=42/32 lis/c=32/32 les/c/f=33/33/0 sis=42) [4,5,3] r=2 lpr=42 pi=[32,42)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state: transitioning to Stray
Dec 15 03:04:41 localhost ceph-osd[32311]: osd.3 pg_epoch: 43 pg[6.1f( empty local-lis/les=32/33 n=0 ec=42/32 lis/c=32/32 les/c/f=33/33/0 sis=42) [4,5,3] r=2 lpr=42 pi=[32,42)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state: transitioning to Stray
Dec 15 03:04:41 localhost ceph-osd[32311]: osd.3 pg_epoch: 43 pg[6.1e( empty local-lis/les=32/33 n=0 ec=42/32 lis/c=32/32 les/c/f=33/33/0 sis=42) [4,5,3] r=2 lpr=42 pi=[32,42)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state: transitioning to Stray
Dec 15 03:04:41 localhost ceph-osd[31375]: osd.0 pg_epoch: 43 pg[7.d( v 35'39 lc 0'0 (0'0,35'39] local-lis/les=33/34 n=1 ec=42/33 lis/c=33/33 les/c/f=34/34/0 sis=42) [5,0,4] r=1 lpr=42 pi=[33,42)/1 crt=35'39 mlcod 0'0 unknown NOTIFY mbc={}] state: transitioning to Stray
Dec 15 03:04:41 localhost ceph-osd[31375]: osd.0 pg_epoch: 43 pg[7.e( v 35'39 lc 0'0 (0'0,35'39] local-lis/les=33/34 n=1 ec=42/33 lis/c=33/33 les/c/f=34/34/0 sis=42) [5,0,4] r=1 lpr=42 pi=[33,42)/1 crt=35'39 mlcod 0'0 unknown NOTIFY mbc={}] state: transitioning to Stray
Dec 15 03:04:41 localhost ceph-osd[32311]: osd.3 pg_epoch: 43 pg[6.1d( empty local-lis/les=32/33 n=0 ec=42/32 lis/c=32/32 les/c/f=33/33/0 sis=42) [4,5,3] r=2 lpr=42 pi=[32,42)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state: transitioning to Stray
Dec 15 03:04:41 localhost ceph-osd[32311]: osd.3 pg_epoch: 43 pg[6.c( empty local-lis/les=32/33 n=0 ec=42/32 lis/c=32/32 les/c/f=33/33/0 sis=42) [4,5,3] r=2 lpr=42 pi=[32,42)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state: transitioning to Stray
Dec 15 03:04:41 localhost ceph-osd[31375]: osd.0 pg_epoch: 43 pg[7.6( v 35'39 lc 0'0 (0'0,35'39] local-lis/les=33/34 n=2 ec=42/33 lis/c=33/33 les/c/f=34/34/0 sis=42) [5,0,4] r=1 lpr=42 pi=[33,42)/1 crt=35'39 mlcod 0'0 unknown NOTIFY mbc={}] state: transitioning to Stray
Dec 15 03:04:41 localhost ceph-osd[32311]: osd.3 pg_epoch: 43 pg[6.1( empty local-lis/les=32/33 n=0 ec=42/32 lis/c=32/32 les/c/f=33/33/0 sis=42) [4,5,3] r=2 lpr=42 pi=[32,42)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state: transitioning to Stray
Dec 15 03:04:41 localhost ceph-osd[31375]: osd.0 pg_epoch: 43 pg[7.c( v 35'39 lc 0'0 (0'0,35'39] local-lis/les=33/34 n=1 ec=42/33 lis/c=33/33 les/c/f=34/34/0 sis=42) [5,0,4] r=1 lpr=42 pi=[33,42)/1 crt=35'39 mlcod 0'0 unknown NOTIFY mbc={}] state: transitioning to Stray
Dec 15 03:04:41 localhost ceph-osd[31375]: osd.0 pg_epoch: 43 pg[7.f( v 35'39 lc 0'0 (0'0,35'39] local-lis/les=33/34 n=1 ec=42/33 lis/c=33/33 les/c/f=34/34/0 sis=42) [5,0,4] r=1 lpr=42 pi=[33,42)/1 crt=35'39 mlcod 0'0 unknown NOTIFY mbc={}] state: transitioning to Stray
Dec 15 03:04:41 localhost ceph-osd[32311]: osd.3 pg_epoch: 43 pg[6.7( empty local-lis/les=32/33 n=0 ec=42/32 lis/c=32/32 les/c/f=33/33/0 sis=42) [4,5,3] r=2 lpr=42 pi=[32,42)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state: transitioning to Stray
Dec 15 03:04:41 localhost ceph-osd[31375]: osd.0 pg_epoch: 43 pg[7.9( v 35'39 lc 0'0 (0'0,35'39] local-lis/les=33/34 n=1 ec=42/33 lis/c=33/33 les/c/f=34/34/0 sis=42) [5,0,4] r=1 lpr=42 pi=[33,42)/1 crt=35'39 mlcod 0'0 unknown NOTIFY mbc={}] state: transitioning to Stray
Dec 15 03:04:41 localhost ceph-osd[31375]: osd.0 pg_epoch: 43 pg[7.1( v 35'39 (0'0,35'39] local-lis/les=33/34 n=2 ec=42/33 lis/c=33/33 les/c/f=34/34/0 sis=42) [5,0,4] r=1 lpr=42 pi=[33,42)/1 crt=35'39 lcod 0'0 mlcod 0'0 unknown NOTIFY mbc={}] state: transitioning to Stray
Dec 15 03:04:41 localhost ceph-osd[31375]: osd.0 pg_epoch: 43 pg[7.2( v 35'39 lc 0'0 (0'0,35'39] local-lis/les=33/34 n=2 ec=42/33 lis/c=33/33 les/c/f=34/34/0 sis=42) [5,0,4] r=1 lpr=42 pi=[33,42)/1 crt=35'39 mlcod 0'0 unknown NOTIFY mbc={}] state: transitioning to Stray
Dec 15 03:04:41 localhost ceph-osd[31375]: osd.0 pg_epoch: 43 pg[7.3( v 35'39 lc 0'0 (0'0,35'39] local-lis/les=33/34 n=2 ec=42/33 lis/c=33/33 les/c/f=34/34/0 sis=42) [5,0,4] r=1 lpr=42 pi=[33,42)/1 crt=35'39 mlcod 0'0 unknown NOTIFY mbc={}] state: transitioning to Stray
Dec 15 03:04:41 localhost ceph-osd[31375]: osd.0 pg_epoch: 43 pg[7.4( v 35'39 lc 0'0 (0'0,35'39] local-lis/les=33/34 n=2 ec=42/33 lis/c=33/33 les/c/f=34/34/0 sis=42) [5,0,4] r=1 lpr=42 pi=[33,42)/1 crt=35'39 mlcod 0'0 unknown NOTIFY mbc={}] state: transitioning to Stray
Dec 15 03:04:41 localhost ceph-osd[31375]: osd.0 pg_epoch: 43 pg[7.8( v 35'39 lc 0'0 (0'0,35'39] local-lis/les=33/34 n=1 ec=42/33 lis/c=33/33 les/c/f=34/34/0 sis=42) [5,0,4] r=1 lpr=42 pi=[33,42)/1 crt=35'39 mlcod 0'0 unknown NOTIFY mbc={}] state: transitioning to Stray
Dec 15 03:04:41 localhost ceph-osd[31375]: osd.0 pg_epoch: 43 pg[7.5( v 35'39 lc 0'0 (0'0,35'39] local-lis/les=33/34 n=2 ec=42/33 lis/c=33/33 les/c/f=34/34/0 sis=42) [5,0,4] r=1 lpr=42 pi=[33,42)/1 crt=35'39 mlcod 0'0 unknown NOTIFY mbc={}] state: transitioning to Stray
Dec 15 03:04:41 localhost ceph-osd[31375]: osd.0 pg_epoch: 43 pg[7.7( v 35'39 lc 0'0 (0'0,35'39] local-lis/les=33/34 n=1 ec=42/33 lis/c=33/33 les/c/f=34/34/0 sis=42) [5,0,4] r=1 lpr=42 pi=[33,42)/1 crt=35'39 mlcod 0'0 unknown NOTIFY mbc={}] state: transitioning to Stray
Dec 15 03:04:41 localhost ceph-osd[31375]: osd.0 pg_epoch: 43 pg[7.a( v 35'39 lc 0'0 (0'0,35'39] local-lis/les=33/34 n=1 ec=42/33 lis/c=33/33 les/c/f=34/34/0 sis=42) [5,0,4] r=1 lpr=42 pi=[33,42)/1 crt=35'39 mlcod 0'0 unknown NOTIFY mbc={}] state: transitioning to Stray
Dec 15 03:04:41 localhost ceph-osd[31375]: osd.0 pg_epoch: 43 pg[7.b( v 35'39 lc 0'0 (0'0,35'39] local-lis/les=33/34 n=1 ec=42/33 lis/c=33/33 les/c/f=34/34/0 sis=42) [5,0,4] r=1 lpr=42 pi=[33,42)/1 crt=35'39 mlcod 0'0 unknown NOTIFY mbc={}] state: transitioning to Stray
Dec 15 03:04:41 localhost ceph-osd[32311]: osd.3 pg_epoch: 43 pg[6.6( empty local-lis/les=32/33 n=0 ec=42/32 lis/c=32/32 les/c/f=33/33/0 sis=42) [4,5,3] r=2 lpr=42 pi=[32,42)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state: transitioning to Stray
Dec 15 03:04:41 localhost ceph-osd[32311]: osd.3 pg_epoch: 43 pg[6.3( empty local-lis/les=32/33 n=0 ec=42/32 lis/c=32/32 les/c/f=33/33/0 sis=42) [4,5,3] r=2 lpr=42 pi=[32,42)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state: transitioning to Stray
Dec 15 03:04:41 localhost ceph-osd[32311]: osd.3 pg_epoch: 43 pg[6.2( empty local-lis/les=32/33 n=0 ec=42/32 lis/c=32/32 les/c/f=33/33/0 sis=42) [4,5,3] r=2 lpr=42 pi=[32,42)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state: transitioning to Stray
Dec 15 03:04:41 localhost ceph-osd[32311]: osd.3 pg_epoch: 43 pg[6.5( empty local-lis/les=32/33 n=0 ec=42/32 lis/c=32/32 les/c/f=33/33/0 sis=42) [4,5,3] r=2 lpr=42 pi=[32,42)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state: transitioning to Stray
Dec 15 03:04:41 localhost ceph-osd[32311]: osd.3 pg_epoch: 43 pg[6.4( empty local-lis/les=32/33 n=0 ec=42/32 lis/c=32/32 les/c/f=33/33/0 sis=42) [4,5,3] r=2 lpr=42 pi=[32,42)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state: transitioning to Stray
Dec 15 03:04:41 localhost ceph-osd[32311]: osd.3 pg_epoch: 43 pg[6.e( empty local-lis/les=32/33 n=0 ec=42/32 lis/c=32/32 les/c/f=33/33/0 sis=42)
[4,5,3] r=2 lpr=42 pi=[32,42)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state: transitioning to Stray Dec 15 03:04:41 localhost ceph-osd[32311]: osd.3 pg_epoch: 43 pg[6.d( empty local-lis/les=32/33 n=0 ec=42/32 lis/c=32/32 les/c/f=33/33/0 sis=42) [4,5,3] r=2 lpr=42 pi=[32,42)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state: transitioning to Stray Dec 15 03:04:41 localhost ceph-osd[32311]: osd.3 pg_epoch: 43 pg[6.9( empty local-lis/les=32/33 n=0 ec=42/32 lis/c=32/32 les/c/f=33/33/0 sis=42) [4,5,3] r=2 lpr=42 pi=[32,42)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state: transitioning to Stray Dec 15 03:04:41 localhost ceph-osd[32311]: osd.3 pg_epoch: 43 pg[6.8( empty local-lis/les=32/33 n=0 ec=42/32 lis/c=32/32 les/c/f=33/33/0 sis=42) [4,5,3] r=2 lpr=42 pi=[32,42)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state: transitioning to Stray Dec 15 03:04:41 localhost ceph-osd[32311]: osd.3 pg_epoch: 43 pg[6.f( empty local-lis/les=32/33 n=0 ec=42/32 lis/c=32/32 les/c/f=33/33/0 sis=42) [4,5,3] r=2 lpr=42 pi=[32,42)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state: transitioning to Stray Dec 15 03:04:41 localhost ceph-osd[32311]: osd.3 pg_epoch: 43 pg[6.a( empty local-lis/les=32/33 n=0 ec=42/32 lis/c=32/32 les/c/f=33/33/0 sis=42) [4,5,3] r=2 lpr=42 pi=[32,42)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state: transitioning to Stray Dec 15 03:04:41 localhost ceph-osd[32311]: osd.3 pg_epoch: 43 pg[6.b( empty local-lis/les=32/33 n=0 ec=42/32 lis/c=32/32 les/c/f=33/33/0 sis=42) [4,5,3] r=2 lpr=42 pi=[32,42)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state: transitioning to Stray Dec 15 03:04:41 localhost ceph-osd[32311]: osd.3 pg_epoch: 43 pg[6.14( empty local-lis/les=32/33 n=0 ec=42/32 lis/c=32/32 les/c/f=33/33/0 sis=42) [4,5,3] r=2 lpr=42 pi=[32,42)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state: transitioning to Stray Dec 15 03:04:41 localhost ceph-osd[32311]: osd.3 pg_epoch: 43 pg[6.15( empty local-lis/les=32/33 n=0 ec=42/32 lis/c=32/32 les/c/f=33/33/0 sis=42) [4,5,3] r=2 
lpr=42 pi=[32,42)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state: transitioning to Stray Dec 15 03:04:41 localhost ceph-osd[32311]: osd.3 pg_epoch: 43 pg[6.16( empty local-lis/les=32/33 n=0 ec=42/32 lis/c=32/32 les/c/f=33/33/0 sis=42) [4,5,3] r=2 lpr=42 pi=[32,42)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state: transitioning to Stray Dec 15 03:04:41 localhost ceph-osd[32311]: osd.3 pg_epoch: 43 pg[6.17( empty local-lis/les=32/33 n=0 ec=42/32 lis/c=32/32 les/c/f=33/33/0 sis=42) [4,5,3] r=2 lpr=42 pi=[32,42)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state: transitioning to Stray Dec 15 03:04:41 localhost ceph-osd[32311]: osd.3 pg_epoch: 43 pg[6.10( empty local-lis/les=32/33 n=0 ec=42/32 lis/c=32/32 les/c/f=33/33/0 sis=42) [4,5,3] r=2 lpr=42 pi=[32,42)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state: transitioning to Stray Dec 15 03:04:41 localhost ceph-osd[32311]: osd.3 pg_epoch: 43 pg[6.12( empty local-lis/les=32/33 n=0 ec=42/32 lis/c=32/32 les/c/f=33/33/0 sis=42) [4,5,3] r=2 lpr=42 pi=[32,42)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state: transitioning to Stray Dec 15 03:04:41 localhost ceph-osd[32311]: osd.3 pg_epoch: 43 pg[6.13( empty local-lis/les=32/33 n=0 ec=42/32 lis/c=32/32 les/c/f=33/33/0 sis=42) [4,5,3] r=2 lpr=42 pi=[32,42)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state: transitioning to Stray Dec 15 03:04:41 localhost ceph-osd[32311]: osd.3 pg_epoch: 43 pg[6.1c( empty local-lis/les=32/33 n=0 ec=42/32 lis/c=32/32 les/c/f=33/33/0 sis=42) [4,5,3] r=2 lpr=42 pi=[32,42)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state: transitioning to Stray Dec 15 03:04:41 localhost ceph-osd[32311]: osd.3 pg_epoch: 43 pg[6.11( empty local-lis/les=32/33 n=0 ec=42/32 lis/c=32/32 les/c/f=33/33/0 sis=42) [4,5,3] r=2 lpr=42 pi=[32,42)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state: transitioning to Stray Dec 15 03:04:42 localhost ceph-osd[32311]: log_channel(cluster) log [DBG] : 3.0 scrub starts Dec 15 03:04:42 localhost python3[55775]: ansible-file Invoked with 
path=/var/lib/tripleo-config/ceph state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Dec 15 03:04:42 localhost ceph-osd[32311]: log_channel(cluster) log [DBG] : 3.0 scrub ok Dec 15 03:04:43 localhost python3[55791]: ansible-file Invoked with path=/var/lib/tripleo-config/ceph state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Dec 15 03:04:45 localhost ceph-osd[32311]: log_channel(cluster) log [DBG] : 3.1b scrub starts Dec 15 03:04:45 localhost ceph-osd[32311]: log_channel(cluster) log [DBG] : 3.1b scrub ok Dec 15 03:04:45 localhost ceph-osd[32311]: osd.3 pg_epoch: 44 pg[5.18( empty local-lis/les=40/41 n=0 ec=40/26 lis/c=40/40 les/c/f=41/41/0 sis=44 pruub=9.219707489s) [2,0,4] r=-1 lpr=44 pi=[40,44)/1 crt=0'0 mlcod 0'0 active pruub 1184.981201172s@ mbc={}] start_peering_interval up [3,4,2] -> [2,0,4], acting [3,4,2] -> [2,0,4], acting_primary 3 -> 2, up_primary 3 -> 2, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015 Dec 15 03:04:45 localhost ceph-osd[32311]: osd.3 pg_epoch: 44 pg[4.19( empty local-lis/les=40/41 n=0 ec=40/24 lis/c=40/40 les/c/f=41/41/0 sis=44 pruub=9.214246750s) [3,1,2] r=0 lpr=44 pi=[40,44)/1 crt=0'0 mlcod 0'0 active pruub 1184.975708008s@ mbc={}] start_peering_interval up [4,5,3] -> [3,1,2], acting [4,5,3] -> [3,1,2], acting_primary 4 -> 3, up_primary 4 -> 3, role 2 -> 0, features acting 4540138322906710015 upacting 4540138322906710015 Dec 15 03:04:45 localhost 
ceph-osd[32311]: osd.3 pg_epoch: 44 pg[5.18( empty local-lis/les=40/41 n=0 ec=40/26 lis/c=40/40 les/c/f=41/41/0 sis=44 pruub=9.219637871s) [2,0,4] r=-1 lpr=44 pi=[40,44)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1184.981201172s@ mbc={}] state: transitioning to Stray Dec 15 03:04:45 localhost ceph-osd[32311]: osd.3 pg_epoch: 44 pg[3.1e( empty local-lis/les=38/39 n=0 ec=38/22 lis/c=38/38 les/c/f=39/39/0 sis=44 pruub=15.179255486s) [4,3,5] r=1 lpr=44 pi=[38,44)/1 crt=0'0 mlcod 0'0 active pruub 1190.940917969s@ mbc={}] start_peering_interval up [3,2,1] -> [4,3,5], acting [3,2,1] -> [4,3,5], acting_primary 3 -> 4, up_primary 3 -> 4, role 0 -> 1, features acting 4540138322906710015 upacting 4540138322906710015 Dec 15 03:04:45 localhost ceph-osd[32311]: osd.3 pg_epoch: 44 pg[4.19( empty local-lis/les=40/41 n=0 ec=40/24 lis/c=40/40 les/c/f=41/41/0 sis=44 pruub=9.214246750s) [3,1,2] r=0 lpr=44 pi=[40,44)/1 crt=0'0 mlcod 0'0 unknown pruub 1184.975708008s@ mbc={}] state: transitioning to Primary Dec 15 03:04:45 localhost ceph-osd[32311]: osd.3 pg_epoch: 44 pg[6.1b( empty local-lis/les=42/43 n=0 ec=42/32 lis/c=42/42 les/c/f=43/43/0 sis=44 pruub=11.327329636s) [3,2,1] r=0 lpr=44 pi=[42,44)/1 crt=0'0 mlcod 0'0 active pruub 1187.088745117s@ mbc={}] start_peering_interval up [4,5,3] -> [3,2,1], acting [4,5,3] -> [3,2,1], acting_primary 4 -> 3, up_primary 4 -> 3, role 2 -> 0, features acting 4540138322906710015 upacting 4540138322906710015 Dec 15 03:04:45 localhost ceph-osd[32311]: osd.3 pg_epoch: 44 pg[3.1e( empty local-lis/les=38/39 n=0 ec=38/22 lis/c=38/38 les/c/f=39/39/0 sis=44 pruub=15.179189682s) [4,3,5] r=1 lpr=44 pi=[38,44)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1190.940917969s@ mbc={}] state: transitioning to Stray Dec 15 03:04:45 localhost ceph-osd[32311]: osd.3 pg_epoch: 44 pg[6.1b( empty local-lis/les=42/43 n=0 ec=42/32 lis/c=42/42 les/c/f=43/43/0 sis=44 pruub=11.327329636s) [3,2,1] r=0 lpr=44 pi=[42,44)/1 crt=0'0 mlcod 0'0 unknown pruub 1187.088745117s@ mbc={}] state: 
transitioning to Primary Dec 15 03:04:45 localhost ceph-osd[32311]: osd.3 pg_epoch: 44 pg[2.1f( empty local-lis/les=37/38 n=0 ec=37/20 lis/c=37/37 les/c/f=38/38/0 sis=44 pruub=14.146185875s) [4,2,3] r=2 lpr=44 pi=[37,44)/1 crt=0'0 mlcod 0'0 active pruub 1189.907714844s@ mbc={}] start_peering_interval up [2,3,1] -> [4,2,3], acting [2,3,1] -> [4,2,3], acting_primary 2 -> 4, up_primary 2 -> 4, role 1 -> 2, features acting 4540138322906710015 upacting 4540138322906710015 Dec 15 03:04:45 localhost ceph-osd[32311]: osd.3 pg_epoch: 44 pg[6.1a( empty local-lis/les=42/43 n=0 ec=42/32 lis/c=42/42 les/c/f=43/43/0 sis=44 pruub=11.318291664s) [5,0,1] r=-1 lpr=44 pi=[42,44)/1 crt=0'0 mlcod 0'0 active pruub 1187.080444336s@ mbc={}] start_peering_interval up [4,5,3] -> [5,0,1], acting [4,5,3] -> [5,0,1], acting_primary 4 -> 5, up_primary 4 -> 5, role 2 -> -1, features acting 4540138322906710015 upacting 4540138322906710015 Dec 15 03:04:45 localhost ceph-osd[32311]: osd.3 pg_epoch: 44 pg[6.1a( empty local-lis/les=42/43 n=0 ec=42/32 lis/c=42/42 les/c/f=43/43/0 sis=44 pruub=11.318244934s) [5,0,1] r=-1 lpr=44 pi=[42,44)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1187.080444336s@ mbc={}] state: transitioning to Stray Dec 15 03:04:45 localhost ceph-osd[32311]: osd.3 pg_epoch: 44 pg[4.18( empty local-lis/les=40/41 n=0 ec=40/24 lis/c=40/40 les/c/f=41/41/0 sis=44 pruub=9.219980240s) [2,1,0] r=-1 lpr=44 pi=[40,44)/1 crt=0'0 mlcod 0'0 active pruub 1184.982421875s@ mbc={}] start_peering_interval up [4,5,3] -> [2,1,0], acting [4,5,3] -> [2,1,0], acting_primary 4 -> 2, up_primary 4 -> 2, role 2 -> -1, features acting 4540138322906710015 upacting 4540138322906710015 Dec 15 03:04:45 localhost ceph-osd[32311]: osd.3 pg_epoch: 44 pg[4.18( empty local-lis/les=40/41 n=0 ec=40/24 lis/c=40/40 les/c/f=41/41/0 sis=44 pruub=9.219953537s) [2,1,0] r=-1 lpr=44 pi=[40,44)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1184.982421875s@ mbc={}] state: transitioning to Stray Dec 15 03:04:45 localhost ceph-osd[32311]: 
osd.3 pg_epoch: 44 pg[5.19( empty local-lis/les=40/41 n=0 ec=40/26 lis/c=40/40 les/c/f=41/41/0 sis=44 pruub=9.220302582s) [1,5,0] r=-1 lpr=44 pi=[40,44)/1 crt=0'0 mlcod 0'0 active pruub 1184.982788086s@ mbc={}] start_peering_interval up [3,4,2] -> [1,5,0], acting [3,4,2] -> [1,5,0], acting_primary 3 -> 1, up_primary 3 -> 1, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015 Dec 15 03:04:45 localhost ceph-osd[32311]: osd.3 pg_epoch: 44 pg[5.19( empty local-lis/les=40/41 n=0 ec=40/26 lis/c=40/40 les/c/f=41/41/0 sis=44 pruub=9.220265388s) [1,5,0] r=-1 lpr=44 pi=[40,44)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1184.982788086s@ mbc={}] state: transitioning to Stray Dec 15 03:04:45 localhost ceph-osd[32311]: osd.3 pg_epoch: 44 pg[6.19( empty local-lis/les=42/43 n=0 ec=42/32 lis/c=42/42 les/c/f=43/43/0 sis=44 pruub=11.326134682s) [5,1,3] r=2 lpr=44 pi=[42,44)/1 crt=0'0 mlcod 0'0 active pruub 1187.088867188s@ mbc={}] start_peering_interval up [4,5,3] -> [5,1,3], acting [4,5,3] -> [5,1,3], acting_primary 4 -> 5, up_primary 4 -> 5, role 2 -> 2, features acting 4540138322906710015 upacting 4540138322906710015 Dec 15 03:04:45 localhost ceph-osd[32311]: osd.3 pg_epoch: 44 pg[6.19( empty local-lis/les=42/43 n=0 ec=42/32 lis/c=42/42 les/c/f=43/43/0 sis=44 pruub=11.326096535s) [5,1,3] r=2 lpr=44 pi=[42,44)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1187.088867188s@ mbc={}] state: transitioning to Stray Dec 15 03:04:45 localhost ceph-osd[32311]: osd.3 pg_epoch: 44 pg[2.1f( empty local-lis/les=37/38 n=0 ec=37/20 lis/c=37/37 les/c/f=38/38/0 sis=44 pruub=14.145586967s) [4,2,3] r=2 lpr=44 pi=[37,44)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1189.907714844s@ mbc={}] state: transitioning to Stray Dec 15 03:04:45 localhost ceph-osd[32311]: osd.3 pg_epoch: 44 pg[2.1d( empty local-lis/les=37/38 n=0 ec=37/20 lis/c=37/37 les/c/f=38/38/0 sis=44 pruub=14.144844055s) [0,5,4] r=-1 lpr=44 pi=[37,44)/1 crt=0'0 mlcod 0'0 active pruub 1189.907714844s@ mbc={}] 
start_peering_interval up [2,3,1] -> [0,5,4], acting [2,3,1] -> [0,5,4], acting_primary 2 -> 0, up_primary 2 -> 0, role 1 -> -1, features acting 4540138322906710015 upacting 4540138322906710015 Dec 15 03:04:45 localhost ceph-osd[32311]: osd.3 pg_epoch: 44 pg[3.1c( empty local-lis/les=38/39 n=0 ec=38/22 lis/c=38/38 les/c/f=39/39/0 sis=44 pruub=15.177972794s) [5,4,3] r=2 lpr=44 pi=[38,44)/1 crt=0'0 mlcod 0'0 active pruub 1190.940917969s@ mbc={}] start_peering_interval up [3,2,1] -> [5,4,3], acting [3,2,1] -> [5,4,3], acting_primary 3 -> 5, up_primary 3 -> 5, role 0 -> 2, features acting 4540138322906710015 upacting 4540138322906710015 Dec 15 03:04:45 localhost ceph-osd[32311]: osd.3 pg_epoch: 44 pg[3.1c( empty local-lis/les=38/39 n=0 ec=38/22 lis/c=38/38 les/c/f=39/39/0 sis=44 pruub=15.177947998s) [5,4,3] r=2 lpr=44 pi=[38,44)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1190.940917969s@ mbc={}] state: transitioning to Stray Dec 15 03:04:45 localhost ceph-osd[31375]: osd.0 pg_epoch: 44 pg[7.f( v 35'39 (0'0,35'39] local-lis/les=42/43 n=1 ec=42/33 lis/c=42/42 les/c/f=43/43/0 sis=44 pruub=11.327595711s) [2,0,4] r=1 lpr=44 pi=[42,44)/1 luod=0'0 crt=35'39 lcod 0'0 mlcod 0'0 active pruub 1191.374267578s@ mbc={}] start_peering_interval up [5,0,4] -> [2,0,4], acting [5,0,4] -> [2,0,4], acting_primary 5 -> 2, up_primary 5 -> 2, role 1 -> 1, features acting 4540138322906710015 upacting 4540138322906710015 Dec 15 03:04:45 localhost ceph-osd[31375]: osd.0 pg_epoch: 44 pg[7.f( v 35'39 (0'0,35'39] local-lis/les=42/43 n=1 ec=42/33 lis/c=42/42 les/c/f=43/43/0 sis=44 pruub=11.327540398s) [2,0,4] r=1 lpr=44 pi=[42,44)/1 crt=35'39 lcod 0'0 mlcod 0'0 unknown NOTIFY pruub 1191.374267578s@ mbc={}] state: transitioning to Stray Dec 15 03:04:45 localhost ceph-osd[31375]: osd.0 pg_epoch: 44 pg[7.d( v 35'39 (0'0,35'39] local-lis/les=42/43 n=1 ec=42/33 lis/c=42/42 les/c/f=43/43/0 sis=44 pruub=11.327090263s) [2,0,4] r=1 lpr=44 pi=[42,44)/1 luod=0'0 crt=35'39 lcod 0'0 mlcod 0'0 active pruub 
1191.374023438s@ mbc={}] start_peering_interval up [5,0,4] -> [2,0,4], acting [5,0,4] -> [2,0,4], acting_primary 5 -> 2, up_primary 5 -> 2, role 1 -> 1, features acting 4540138322906710015 upacting 4540138322906710015 Dec 15 03:04:45 localhost ceph-osd[31375]: osd.0 pg_epoch: 44 pg[7.d( v 35'39 (0'0,35'39] local-lis/les=42/43 n=1 ec=42/33 lis/c=42/42 les/c/f=43/43/0 sis=44 pruub=11.327054024s) [2,0,4] r=1 lpr=44 pi=[42,44)/1 crt=35'39 lcod 0'0 mlcod 0'0 unknown NOTIFY pruub 1191.374023438s@ mbc={}] state: transitioning to Stray Dec 15 03:04:45 localhost ceph-osd[31375]: osd.0 pg_epoch: 44 pg[7.9( v 35'39 (0'0,35'39] local-lis/les=42/43 n=1 ec=42/33 lis/c=42/42 les/c/f=43/43/0 sis=44 pruub=11.329794884s) [2,0,4] r=1 lpr=44 pi=[42,44)/1 luod=0'0 crt=35'39 lcod 0'0 mlcod 0'0 active pruub 1191.376831055s@ mbc={}] start_peering_interval up [5,0,4] -> [2,0,4], acting [5,0,4] -> [2,0,4], acting_primary 5 -> 2, up_primary 5 -> 2, role 1 -> 1, features acting 4540138322906710015 upacting 4540138322906710015 Dec 15 03:04:45 localhost ceph-osd[31375]: osd.0 pg_epoch: 44 pg[7.9( v 35'39 (0'0,35'39] local-lis/les=42/43 n=1 ec=42/33 lis/c=42/42 les/c/f=43/43/0 sis=44 pruub=11.329773903s) [2,0,4] r=1 lpr=44 pi=[42,44)/1 crt=35'39 lcod 0'0 mlcod 0'0 unknown NOTIFY pruub 1191.376831055s@ mbc={}] state: transitioning to Stray Dec 15 03:04:45 localhost ceph-osd[31375]: osd.0 pg_epoch: 44 pg[7.3( v 35'39 (0'0,35'39] local-lis/les=42/43 n=2 ec=42/33 lis/c=42/42 les/c/f=43/43/0 sis=44 pruub=11.329401016s) [2,0,4] r=1 lpr=44 pi=[42,44)/1 luod=0'0 crt=35'39 lcod 0'0 mlcod 0'0 active pruub 1191.376586914s@ mbc={}] start_peering_interval up [5,0,4] -> [2,0,4], acting [5,0,4] -> [2,0,4], acting_primary 5 -> 2, up_primary 5 -> 2, role 1 -> 1, features acting 4540138322906710015 upacting 4540138322906710015 Dec 15 03:04:45 localhost ceph-osd[31375]: osd.0 pg_epoch: 44 pg[7.3( v 35'39 (0'0,35'39] local-lis/les=42/43 n=2 ec=42/33 lis/c=42/42 les/c/f=43/43/0 sis=44 pruub=11.329379082s) [2,0,4] 
r=1 lpr=44 pi=[42,44)/1 crt=35'39 lcod 0'0 mlcod 0'0 unknown NOTIFY pruub 1191.376586914s@ mbc={}] state: transitioning to Stray Dec 15 03:04:45 localhost ceph-osd[31375]: osd.0 pg_epoch: 44 pg[7.1( v 35'39 (0'0,35'39] local-lis/les=42/43 n=2 ec=42/33 lis/c=42/42 les/c/f=43/43/0 sis=44 pruub=11.322180748s) [2,0,4] r=1 lpr=44 pi=[42,44)/1 luod=0'0 crt=35'39 lcod 0'0 mlcod 0'0 active pruub 1191.369384766s@ mbc={}] start_peering_interval up [5,0,4] -> [2,0,4], acting [5,0,4] -> [2,0,4], acting_primary 5 -> 2, up_primary 5 -> 2, role 1 -> 1, features acting 4540138322906710015 upacting 4540138322906710015 Dec 15 03:04:45 localhost ceph-osd[31375]: osd.0 pg_epoch: 44 pg[7.5( v 35'39 (0'0,35'39] local-lis/les=42/43 n=2 ec=42/33 lis/c=42/42 les/c/f=43/43/0 sis=44 pruub=11.329564095s) [2,0,4] r=1 lpr=44 pi=[42,44)/1 luod=0'0 crt=35'39 lcod 0'0 mlcod 0'0 active pruub 1191.376831055s@ mbc={}] start_peering_interval up [5,0,4] -> [2,0,4], acting [5,0,4] -> [2,0,4], acting_primary 5 -> 2, up_primary 5 -> 2, role 1 -> 1, features acting 4540138322906710015 upacting 4540138322906710015 Dec 15 03:04:45 localhost ceph-osd[31375]: osd.0 pg_epoch: 44 pg[7.1( v 35'39 (0'0,35'39] local-lis/les=42/43 n=2 ec=42/33 lis/c=42/42 les/c/f=43/43/0 sis=44 pruub=11.322123528s) [2,0,4] r=1 lpr=44 pi=[42,44)/1 crt=35'39 lcod 0'0 mlcod 0'0 unknown NOTIFY pruub 1191.369384766s@ mbc={}] state: transitioning to Stray Dec 15 03:04:45 localhost ceph-osd[31375]: osd.0 pg_epoch: 44 pg[7.7( v 35'39 (0'0,35'39] local-lis/les=42/43 n=1 ec=42/33 lis/c=42/42 les/c/f=43/43/0 sis=44 pruub=11.329374313s) [2,0,4] r=1 lpr=44 pi=[42,44)/1 luod=0'0 crt=35'39 lcod 0'0 mlcod 0'0 active pruub 1191.376708984s@ mbc={}] start_peering_interval up [5,0,4] -> [2,0,4], acting [5,0,4] -> [2,0,4], acting_primary 5 -> 2, up_primary 5 -> 2, role 1 -> 1, features acting 4540138322906710015 upacting 4540138322906710015 Dec 15 03:04:45 localhost ceph-osd[31375]: osd.0 pg_epoch: 44 pg[7.7( v 35'39 (0'0,35'39] local-lis/les=42/43 n=1 
ec=42/33 lis/c=42/42 les/c/f=43/43/0 sis=44 pruub=11.329352379s) [2,0,4] r=1 lpr=44 pi=[42,44)/1 crt=35'39 lcod 0'0 mlcod 0'0 unknown NOTIFY pruub 1191.376708984s@ mbc={}] state: transitioning to Stray Dec 15 03:04:45 localhost ceph-osd[32311]: osd.3 pg_epoch: 44 pg[2.1d( empty local-lis/les=37/38 n=0 ec=37/20 lis/c=37/37 les/c/f=38/38/0 sis=44 pruub=14.144808769s) [0,5,4] r=-1 lpr=44 pi=[37,44)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1189.907714844s@ mbc={}] state: transitioning to Stray Dec 15 03:04:45 localhost ceph-osd[32311]: osd.3 pg_epoch: 44 pg[4.1b( empty local-lis/les=40/41 n=0 ec=40/24 lis/c=40/40 les/c/f=41/41/0 sis=44 pruub=9.212389946s) [2,1,3] r=2 lpr=44 pi=[40,44)/1 crt=0'0 mlcod 0'0 active pruub 1184.975585938s@ mbc={}] start_peering_interval up [4,5,3] -> [2,1,3], acting [4,5,3] -> [2,1,3], acting_primary 4 -> 2, up_primary 4 -> 2, role 2 -> 2, features acting 4540138322906710015 upacting 4540138322906710015 Dec 15 03:04:45 localhost ceph-osd[31375]: osd.0 pg_epoch: 44 pg[7.b( v 35'39 (0'0,35'39] local-lis/les=42/43 n=1 ec=42/33 lis/c=42/42 les/c/f=43/43/0 sis=44 pruub=11.329689026s) [2,0,4] r=1 lpr=44 pi=[42,44)/1 luod=0'0 crt=35'39 lcod 0'0 mlcod 0'0 active pruub 1191.377075195s@ mbc={}] start_peering_interval up [5,0,4] -> [2,0,4], acting [5,0,4] -> [2,0,4], acting_primary 5 -> 2, up_primary 5 -> 2, role 1 -> 1, features acting 4540138322906710015 upacting 4540138322906710015 Dec 15 03:04:45 localhost ceph-osd[31375]: osd.0 pg_epoch: 44 pg[7.b( v 35'39 (0'0,35'39] local-lis/les=42/43 n=1 ec=42/33 lis/c=42/42 les/c/f=43/43/0 sis=44 pruub=11.329668045s) [2,0,4] r=1 lpr=44 pi=[42,44)/1 crt=35'39 lcod 0'0 mlcod 0'0 unknown NOTIFY pruub 1191.377075195s@ mbc={}] state: transitioning to Stray Dec 15 03:04:45 localhost ceph-osd[32311]: osd.3 pg_epoch: 44 pg[4.1b( empty local-lis/les=40/41 n=0 ec=40/24 lis/c=40/40 les/c/f=41/41/0 sis=44 pruub=9.212314606s) [2,1,3] r=2 lpr=44 pi=[40,44)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1184.975585938s@ mbc={}] 
state: transitioning to Stray Dec 15 03:04:45 localhost ceph-osd[32311]: osd.3 pg_epoch: 44 pg[2.1e( empty local-lis/les=37/38 n=0 ec=37/20 lis/c=37/37 les/c/f=38/38/0 sis=44 pruub=14.145432472s) [4,0,5] r=-1 lpr=44 pi=[37,44)/1 crt=0'0 mlcod 0'0 active pruub 1189.907836914s@ mbc={}] start_peering_interval up [2,3,1] -> [4,0,5], acting [2,3,1] -> [4,0,5], acting_primary 2 -> 4, up_primary 2 -> 4, role 1 -> -1, features acting 4540138322906710015 upacting 4540138322906710015 Dec 15 03:04:45 localhost ceph-osd[32311]: osd.3 pg_epoch: 44 pg[3.1f( empty local-lis/les=38/39 n=0 ec=38/22 lis/c=38/38 les/c/f=39/39/0 sis=44 pruub=15.177580833s) [1,5,3] r=2 lpr=44 pi=[38,44)/1 crt=0'0 mlcod 0'0 active pruub 1190.941040039s@ mbc={}] start_peering_interval up [3,2,1] -> [1,5,3], acting [3,2,1] -> [1,5,3], acting_primary 3 -> 1, up_primary 3 -> 1, role 0 -> 2, features acting 4540138322906710015 upacting 4540138322906710015 Dec 15 03:04:45 localhost ceph-osd[32311]: osd.3 pg_epoch: 44 pg[2.1e( empty local-lis/les=37/38 n=0 ec=37/20 lis/c=37/37 les/c/f=38/38/0 sis=44 pruub=14.144385338s) [4,0,5] r=-1 lpr=44 pi=[37,44)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1189.907836914s@ mbc={}] state: transitioning to Stray Dec 15 03:04:45 localhost ceph-osd[32311]: osd.3 pg_epoch: 44 pg[6.18( empty local-lis/les=42/43 n=0 ec=42/32 lis/c=42/42 les/c/f=43/43/0 sis=44 pruub=11.324546814s) [4,2,3] r=2 lpr=44 pi=[42,44)/1 crt=0'0 mlcod 0'0 active pruub 1187.088012695s@ mbc={}] start_peering_interval up [4,5,3] -> [4,2,3], acting [4,5,3] -> [4,2,3], acting_primary 4 -> 4, up_primary 4 -> 4, role 2 -> 2, features acting 4540138322906710015 upacting 4540138322906710015 Dec 15 03:04:45 localhost ceph-osd[32311]: osd.3 pg_epoch: 44 pg[5.1a( empty local-lis/les=40/41 n=0 ec=40/26 lis/c=40/40 les/c/f=41/41/0 sis=44 pruub=9.217505455s) [0,5,4] r=-1 lpr=44 pi=[40,44)/1 crt=0'0 mlcod 0'0 active pruub 1184.981079102s@ mbc={}] start_peering_interval up [3,4,2] -> [0,5,4], acting [3,4,2] -> [0,5,4], 
acting_primary 3 -> 0, up_primary 3 -> 0, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015 Dec 15 03:04:45 localhost ceph-osd[32311]: osd.3 pg_epoch: 44 pg[6.18( empty local-lis/les=42/43 n=0 ec=42/32 lis/c=42/42 les/c/f=43/43/0 sis=44 pruub=11.324519157s) [4,2,3] r=2 lpr=44 pi=[42,44)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1187.088012695s@ mbc={}] state: transitioning to Stray Dec 15 03:04:45 localhost ceph-osd[32311]: osd.3 pg_epoch: 44 pg[3.1f( empty local-lis/les=38/39 n=0 ec=38/22 lis/c=38/38 les/c/f=39/39/0 sis=44 pruub=15.177515030s) [1,5,3] r=2 lpr=44 pi=[38,44)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1190.941040039s@ mbc={}] state: transitioning to Stray Dec 15 03:04:45 localhost ceph-osd[31375]: osd.0 pg_epoch: 44 pg[2.1d( empty local-lis/les=0/0 n=0 ec=37/20 lis/c=37/37 les/c/f=38/38/0 sis=44) [0,5,4] r=0 lpr=44 pi=[37,44)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Primary Dec 15 03:04:45 localhost ceph-osd[32311]: osd.3 pg_epoch: 44 pg[5.1b( empty local-lis/les=40/41 n=0 ec=40/26 lis/c=40/40 les/c/f=41/41/0 sis=44 pruub=9.216817856s) [2,4,3] r=2 lpr=44 pi=[40,44)/1 crt=0'0 mlcod 0'0 active pruub 1184.980957031s@ mbc={}] start_peering_interval up [3,4,2] -> [2,4,3], acting [3,4,2] -> [2,4,3], acting_primary 3 -> 2, up_primary 3 -> 2, role 0 -> 2, features acting 4540138322906710015 upacting 4540138322906710015 Dec 15 03:04:45 localhost ceph-osd[32311]: osd.3 pg_epoch: 44 pg[5.1a( empty local-lis/les=40/41 n=0 ec=40/26 lis/c=40/40 les/c/f=41/41/0 sis=44 pruub=9.217470169s) [0,5,4] r=-1 lpr=44 pi=[40,44)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1184.981079102s@ mbc={}] state: transitioning to Stray Dec 15 03:04:45 localhost ceph-osd[32311]: osd.3 pg_epoch: 44 pg[5.1b( empty local-lis/les=40/41 n=0 ec=40/26 lis/c=40/40 les/c/f=41/41/0 sis=44 pruub=9.216782570s) [2,4,3] r=2 lpr=44 pi=[40,44)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1184.980957031s@ mbc={}] state: transitioning to Stray Dec 15 03:04:45 
localhost ceph-osd[32311]: osd.3 pg_epoch: 44 pg[3.1a( empty local-lis/les=38/39 n=0 ec=38/22 lis/c=38/38 les/c/f=39/39/0 sis=44 pruub=15.167893410s) [0,1,2] r=-1 lpr=44 pi=[38,44)/1 crt=0'0 mlcod 0'0 active pruub 1190.932250977s@ mbc={}] start_peering_interval up [3,2,1] -> [0,1,2], acting [3,2,1] -> [0,1,2], acting_primary 3 -> 0, up_primary 3 -> 0, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015 Dec 15 03:04:45 localhost ceph-osd[32311]: osd.3 pg_epoch: 44 pg[4.1a( empty local-lis/les=40/41 n=0 ec=40/24 lis/c=40/40 les/c/f=41/41/0 sis=44 pruub=9.211261749s) [2,4,0] r=-1 lpr=44 pi=[40,44)/1 crt=0'0 mlcod 0'0 active pruub 1184.975585938s@ mbc={}] start_peering_interval up [4,5,3] -> [2,4,0], acting [4,5,3] -> [2,4,0], acting_primary 4 -> 2, up_primary 4 -> 2, role 2 -> -1, features acting 4540138322906710015 upacting 4540138322906710015 Dec 15 03:04:45 localhost ceph-osd[32311]: osd.3 pg_epoch: 44 pg[3.1a( empty local-lis/les=38/39 n=0 ec=38/22 lis/c=38/38 les/c/f=39/39/0 sis=44 pruub=15.167860985s) [0,1,2] r=-1 lpr=44 pi=[38,44)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1190.932250977s@ mbc={}] state: transitioning to Stray Dec 15 03:04:45 localhost ceph-osd[32311]: osd.3 pg_epoch: 44 pg[2.1c( empty local-lis/les=37/38 n=0 ec=37/20 lis/c=37/37 les/c/f=38/38/0 sis=44 pruub=14.143878937s) [0,2,1] r=-1 lpr=44 pi=[37,44)/1 crt=0'0 mlcod 0'0 active pruub 1189.907836914s@ mbc={}] start_peering_interval up [2,3,1] -> [0,2,1], acting [2,3,1] -> [0,2,1], acting_primary 2 -> 0, up_primary 2 -> 0, role 1 -> -1, features acting 4540138322906710015 upacting 4540138322906710015 Dec 15 03:04:45 localhost ceph-osd[32311]: osd.3 pg_epoch: 44 pg[6.1f( empty local-lis/les=42/43 n=0 ec=42/32 lis/c=42/42 les/c/f=43/43/0 sis=44 pruub=11.324327469s) [4,0,5] r=-1 lpr=44 pi=[42,44)/1 crt=0'0 mlcod 0'0 active pruub 1187.088745117s@ mbc={}] start_peering_interval up [4,5,3] -> [4,0,5], acting [4,5,3] -> [4,0,5], acting_primary 4 -> 4, up_primary 4 -> 4, role 2 
-> -1, features acting 4540138322906710015 upacting 4540138322906710015 Dec 15 03:04:45 localhost ceph-osd[32311]: osd.3 pg_epoch: 44 pg[4.1d( empty local-lis/les=40/41 n=0 ec=40/24 lis/c=40/40 les/c/f=41/41/0 sis=44 pruub=9.207446098s) [3,5,4] r=0 lpr=44 pi=[40,44)/1 crt=0'0 mlcod 0'0 active pruub 1184.972045898s@ mbc={}] start_peering_interval up [4,5,3] -> [3,5,4], acting [4,5,3] -> [3,5,4], acting_primary 4 -> 3, up_primary 4 -> 3, role 2 -> 0, features acting 4540138322906710015 upacting 4540138322906710015 Dec 15 03:04:45 localhost ceph-osd[32311]: osd.3 pg_epoch: 44 pg[2.1b( empty local-lis/les=37/38 n=0 ec=37/20 lis/c=37/37 les/c/f=38/38/0 sis=44 pruub=14.142888069s) [3,2,4] r=0 lpr=44 pi=[37,44)/1 crt=0'0 mlcod 0'0 active pruub 1189.907470703s@ mbc={}] start_peering_interval up [2,3,1] -> [3,2,4], acting [2,3,1] -> [3,2,4], acting_primary 2 -> 3, up_primary 2 -> 3, role 1 -> 0, features acting 4540138322906710015 upacting 4540138322906710015 Dec 15 03:04:45 localhost ceph-osd[32311]: osd.3 pg_epoch: 44 pg[4.1a( empty local-lis/les=40/41 n=0 ec=40/24 lis/c=40/40 les/c/f=41/41/0 sis=44 pruub=9.211217880s) [2,4,0] r=-1 lpr=44 pi=[40,44)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1184.975585938s@ mbc={}] state: transitioning to Stray Dec 15 03:04:45 localhost ceph-osd[32311]: osd.3 pg_epoch: 44 pg[6.1f( empty local-lis/les=42/43 n=0 ec=42/32 lis/c=42/42 les/c/f=43/43/0 sis=44 pruub=11.324125290s) [4,0,5] r=-1 lpr=44 pi=[42,44)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1187.088745117s@ mbc={}] state: transitioning to Stray Dec 15 03:04:45 localhost ceph-osd[32311]: osd.3 pg_epoch: 44 pg[4.1d( empty local-lis/les=40/41 n=0 ec=40/24 lis/c=40/40 les/c/f=41/41/0 sis=44 pruub=9.207446098s) [3,5,4] r=0 lpr=44 pi=[40,44)/1 crt=0'0 mlcod 0'0 unknown pruub 1184.972045898s@ mbc={}] state: transitioning to Primary Dec 15 03:04:45 localhost ceph-osd[32311]: osd.3 pg_epoch: 44 pg[6.1e( empty local-lis/les=42/43 n=0 ec=42/32 lis/c=42/42 les/c/f=43/43/0 sis=44 
pruub=11.323960304s) [3,5,4] r=0 lpr=44 pi=[42,44)/1 crt=0'0 mlcod 0'0 active pruub 1187.088989258s@ mbc={}] start_peering_interval up [4,5,3] -> [3,5,4], acting [4,5,3] -> [3,5,4], acting_primary 4 -> 3, up_primary 4 -> 3, role 2 -> 0, features acting 4540138322906710015 upacting 4540138322906710015 Dec 15 03:04:45 localhost ceph-osd[32311]: osd.3 pg_epoch: 44 pg[6.1e( empty local-lis/les=42/43 n=0 ec=42/32 lis/c=42/42 les/c/f=43/43/0 sis=44 pruub=11.323960304s) [3,5,4] r=0 lpr=44 pi=[42,44)/1 crt=0'0 mlcod 0'0 unknown pruub 1187.088989258s@ mbc={}] state: transitioning to Primary Dec 15 03:04:45 localhost ceph-osd[32311]: osd.3 pg_epoch: 44 pg[2.1c( empty local-lis/les=37/38 n=0 ec=37/20 lis/c=37/37 les/c/f=38/38/0 sis=44 pruub=14.142973900s) [0,2,1] r=-1 lpr=44 pi=[37,44)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1189.907836914s@ mbc={}] state: transitioning to Stray Dec 15 03:04:45 localhost ceph-osd[31375]: osd.0 pg_epoch: 44 pg[5.1a( empty local-lis/les=0/0 n=0 ec=40/26 lis/c=40/40 les/c/f=41/41/0 sis=44) [0,5,4] r=0 lpr=44 pi=[40,44)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Primary Dec 15 03:04:45 localhost ceph-osd[32311]: osd.3 pg_epoch: 44 pg[2.1a( empty local-lis/les=37/38 n=0 ec=37/20 lis/c=37/37 les/c/f=38/38/0 sis=44 pruub=14.142504692s) [2,0,1] r=-1 lpr=44 pi=[37,44)/1 crt=0'0 mlcod 0'0 active pruub 1189.907714844s@ mbc={}] start_peering_interval up [2,3,1] -> [2,0,1], acting [2,3,1] -> [2,0,1], acting_primary 2 -> 2, up_primary 2 -> 2, role 1 -> -1, features acting 4540138322906710015 upacting 4540138322906710015 Dec 15 03:04:45 localhost ceph-osd[31375]: osd.0 pg_epoch: 44 pg[3.1a( empty local-lis/les=0/0 n=0 ec=38/22 lis/c=38/38 les/c/f=39/39/0 sis=44) [0,1,2] r=0 lpr=44 pi=[38,44)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Primary Dec 15 03:04:45 localhost ceph-osd[32311]: osd.3 pg_epoch: 44 pg[3.1b( empty local-lis/les=38/39 n=0 ec=38/22 lis/c=38/38 les/c/f=39/39/0 sis=44 pruub=15.169972420s) [0,5,1] r=-1 
lpr=44 pi=[38,44)/1 crt=0'0 mlcod 0'0 active pruub 1190.935546875s@ mbc={}] start_peering_interval up [3,2,1] -> [0,5,1], acting [3,2,1] -> [0,5,1], acting_primary 3 -> 0, up_primary 3 -> 0, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015 Dec 15 03:04:45 localhost ceph-osd[32311]: osd.3 pg_epoch: 44 pg[5.1c( empty local-lis/les=40/41 n=0 ec=40/26 lis/c=40/40 les/c/f=41/41/0 sis=44 pruub=9.216858864s) [2,3,4] r=1 lpr=44 pi=[40,44)/1 crt=0'0 mlcod 0'0 active pruub 1184.982421875s@ mbc={}] start_peering_interval up [3,4,2] -> [2,3,4], acting [3,4,2] -> [2,3,4], acting_primary 3 -> 2, up_primary 3 -> 2, role 0 -> 1, features acting 4540138322906710015 upacting 4540138322906710015 Dec 15 03:04:45 localhost ceph-osd[32311]: osd.3 pg_epoch: 44 pg[2.1a( empty local-lis/les=37/38 n=0 ec=37/20 lis/c=37/37 les/c/f=38/38/0 sis=44 pruub=14.142071724s) [2,0,1] r=-1 lpr=44 pi=[37,44)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1189.907714844s@ mbc={}] state: transitioning to Stray Dec 15 03:04:45 localhost ceph-osd[32311]: osd.3 pg_epoch: 44 pg[3.1b( empty local-lis/les=38/39 n=0 ec=38/22 lis/c=38/38 les/c/f=39/39/0 sis=44 pruub=15.169946671s) [0,5,1] r=-1 lpr=44 pi=[38,44)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1190.935546875s@ mbc={}] state: transitioning to Stray Dec 15 03:04:45 localhost ceph-osd[32311]: osd.3 pg_epoch: 44 pg[2.1b( empty local-lis/les=37/38 n=0 ec=37/20 lis/c=37/37 les/c/f=38/38/0 sis=44 pruub=14.142888069s) [3,2,4] r=0 lpr=44 pi=[37,44)/1 crt=0'0 mlcod 0'0 unknown pruub 1189.907470703s@ mbc={}] state: transitioning to Primary Dec 15 03:04:45 localhost ceph-osd[32311]: osd.3 pg_epoch: 44 pg[5.1c( empty local-lis/les=40/41 n=0 ec=40/26 lis/c=40/40 les/c/f=41/41/0 sis=44 pruub=9.216783524s) [2,3,4] r=1 lpr=44 pi=[40,44)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1184.982421875s@ mbc={}] state: transitioning to Stray Dec 15 03:04:45 localhost ceph-osd[32311]: osd.3 pg_epoch: 44 pg[5.1d( empty local-lis/les=40/41 n=0 ec=40/26 
lis/c=40/40 les/c/f=41/41/0 sis=44 pruub=9.214881897s) [1,2,0] r=-1 lpr=44 pi=[40,44)/1 crt=0'0 mlcod 0'0 active pruub 1184.980834961s@ mbc={}] start_peering_interval up [3,4,2] -> [1,2,0], acting [3,4,2] -> [1,2,0], acting_primary 3 -> 1, up_primary 3 -> 1, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015 Dec 15 03:04:45 localhost ceph-osd[32311]: osd.3 pg_epoch: 44 pg[5.1d( empty local-lis/les=40/41 n=0 ec=40/26 lis/c=40/40 les/c/f=41/41/0 sis=44 pruub=9.214780807s) [1,2,0] r=-1 lpr=44 pi=[40,44)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1184.980834961s@ mbc={}] state: transitioning to Stray Dec 15 03:04:45 localhost ceph-osd[31375]: osd.0 pg_epoch: 44 pg[3.1b( empty local-lis/les=0/0 n=0 ec=38/22 lis/c=38/38 les/c/f=39/39/0 sis=44) [0,5,1] r=0 lpr=44 pi=[38,44)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Primary Dec 15 03:04:45 localhost ceph-osd[31375]: osd.0 pg_epoch: 44 pg[2.1c( empty local-lis/les=0/0 n=0 ec=37/20 lis/c=37/37 les/c/f=38/38/0 sis=44) [0,2,1] r=0 lpr=44 pi=[37,44)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Primary Dec 15 03:04:45 localhost ceph-osd[31375]: osd.0 pg_epoch: 44 pg[7.5( v 35'39 (0'0,35'39] local-lis/les=42/43 n=2 ec=42/33 lis/c=42/42 les/c/f=43/43/0 sis=44 pruub=11.329502106s) [2,0,4] r=1 lpr=44 pi=[42,44)/1 crt=35'39 lcod 0'0 mlcod 0'0 unknown NOTIFY pruub 1191.376831055s@ mbc={}] state: transitioning to Stray Dec 15 03:04:45 localhost ceph-osd[32311]: osd.3 pg_epoch: 44 pg[3.18( empty local-lis/les=38/39 n=0 ec=38/22 lis/c=38/38 les/c/f=39/39/0 sis=44 pruub=15.169285774s) [4,0,5] r=-1 lpr=44 pi=[38,44)/1 crt=0'0 mlcod 0'0 active pruub 1190.935546875s@ mbc={}] start_peering_interval up [3,2,1] -> [4,0,5], acting [3,2,1] -> [4,0,5], acting_primary 3 -> 4, up_primary 3 -> 4, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015 Dec 15 03:04:45 localhost ceph-osd[32311]: osd.3 pg_epoch: 44 pg[3.18( empty local-lis/les=38/39 n=0 ec=38/22 
lis/c=38/38 les/c/f=39/39/0 sis=44 pruub=15.169127464s) [4,0,5] r=-1 lpr=44 pi=[38,44)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1190.935546875s@ mbc={}] state: transitioning to Stray Dec 15 03:04:45 localhost ceph-osd[32311]: osd.3 pg_epoch: 44 pg[6.1d( empty local-lis/les=42/43 n=0 ec=42/32 lis/c=42/42 les/c/f=43/43/0 sis=44 pruub=11.322879791s) [4,0,5] r=-1 lpr=44 pi=[42,44)/1 crt=0'0 mlcod 0'0 active pruub 1187.088867188s@ mbc={}] start_peering_interval up [4,5,3] -> [4,0,5], acting [4,5,3] -> [4,0,5], acting_primary 4 -> 4, up_primary 4 -> 4, role 2 -> -1, features acting 4540138322906710015 upacting 4540138322906710015 Dec 15 03:04:45 localhost ceph-osd[32311]: osd.3 pg_epoch: 44 pg[2.19( empty local-lis/les=37/38 n=0 ec=37/20 lis/c=37/37 les/c/f=38/38/0 sis=44 pruub=14.141274452s) [4,2,3] r=2 lpr=44 pi=[37,44)/1 crt=0'0 mlcod 0'0 active pruub 1189.907348633s@ mbc={}] start_peering_interval up [2,3,1] -> [4,2,3], acting [2,3,1] -> [4,2,3], acting_primary 2 -> 4, up_primary 2 -> 4, role 1 -> 2, features acting 4540138322906710015 upacting 4540138322906710015 Dec 15 03:04:45 localhost ceph-osd[32311]: osd.3 pg_epoch: 44 pg[5.1e( empty local-lis/les=40/41 n=0 ec=40/26 lis/c=40/40 les/c/f=41/41/0 sis=44 pruub=9.214921951s) [1,2,3] r=2 lpr=44 pi=[40,44)/1 crt=0'0 mlcod 0'0 active pruub 1184.981445312s@ mbc={}] start_peering_interval up [3,4,2] -> [1,2,3], acting [3,4,2] -> [1,2,3], acting_primary 3 -> 1, up_primary 3 -> 1, role 0 -> 2, features acting 4540138322906710015 upacting 4540138322906710015 Dec 15 03:04:45 localhost ceph-osd[32311]: osd.3 pg_epoch: 44 pg[4.1c( empty local-lis/les=40/41 n=0 ec=40/24 lis/c=40/40 les/c/f=41/41/0 sis=44 pruub=9.208747864s) [0,1,2] r=-1 lpr=44 pi=[40,44)/1 crt=0'0 mlcod 0'0 active pruub 1184.975463867s@ mbc={}] start_peering_interval up [4,5,3] -> [0,1,2], acting [4,5,3] -> [0,1,2], acting_primary 4 -> 0, up_primary 4 -> 0, role 2 -> -1, features acting 4540138322906710015 upacting 4540138322906710015 Dec 15 03:04:45 localhost 
ceph-osd[32311]: osd.3 pg_epoch: 44 pg[4.1f( empty local-lis/les=40/41 n=0 ec=40/24 lis/c=40/40 les/c/f=41/41/0 sis=44 pruub=9.208885193s) [3,5,1] r=0 lpr=44 pi=[40,44)/1 crt=0'0 mlcod 0'0 active pruub 1184.975585938s@ mbc={}] start_peering_interval up [4,5,3] -> [3,5,1], acting [4,5,3] -> [3,5,1], acting_primary 4 -> 3, up_primary 4 -> 3, role 2 -> 0, features acting 4540138322906710015 upacting 4540138322906710015 Dec 15 03:04:45 localhost ceph-osd[32311]: osd.3 pg_epoch: 44 pg[4.1f( empty local-lis/les=40/41 n=0 ec=40/24 lis/c=40/40 les/c/f=41/41/0 sis=44 pruub=9.208885193s) [3,5,1] r=0 lpr=44 pi=[40,44)/1 crt=0'0 mlcod 0'0 unknown pruub 1184.975585938s@ mbc={}] state: transitioning to Primary Dec 15 03:04:45 localhost ceph-osd[32311]: osd.3 pg_epoch: 44 pg[6.1d( empty local-lis/les=42/43 n=0 ec=42/32 lis/c=42/42 les/c/f=43/43/0 sis=44 pruub=11.322170258s) [4,0,5] r=-1 lpr=44 pi=[42,44)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1187.088867188s@ mbc={}] state: transitioning to Stray Dec 15 03:04:45 localhost ceph-osd[32311]: osd.3 pg_epoch: 44 pg[5.1e( empty local-lis/les=40/41 n=0 ec=40/26 lis/c=40/40 les/c/f=41/41/0 sis=44 pruub=9.214732170s) [1,2,3] r=2 lpr=44 pi=[40,44)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1184.981445312s@ mbc={}] state: transitioning to Stray Dec 15 03:04:45 localhost ceph-osd[32311]: osd.3 pg_epoch: 44 pg[4.1c( empty local-lis/les=40/41 n=0 ec=40/24 lis/c=40/40 les/c/f=41/41/0 sis=44 pruub=9.208498955s) [0,1,2] r=-1 lpr=44 pi=[40,44)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1184.975463867s@ mbc={}] state: transitioning to Stray Dec 15 03:04:45 localhost ceph-osd[32311]: osd.3 pg_epoch: 44 pg[2.8( empty local-lis/les=37/38 n=0 ec=37/20 lis/c=37/37 les/c/f=38/38/0 sis=44 pruub=14.140679359s) [2,0,1] r=-1 lpr=44 pi=[37,44)/1 crt=0'0 mlcod 0'0 active pruub 1189.907714844s@ mbc={}] start_peering_interval up [2,3,1] -> [2,0,1], acting [2,3,1] -> [2,0,1], acting_primary 2 -> 2, up_primary 2 -> 2, role 1 -> -1, features acting 
4540138322906710015 upacting 4540138322906710015 Dec 15 03:04:45 localhost ceph-osd[32311]: osd.3 pg_epoch: 44 pg[6.c( empty local-lis/les=42/43 n=0 ec=42/32 lis/c=42/42 les/c/f=43/43/0 sis=44 pruub=11.314400673s) [1,2,3] r=2 lpr=44 pi=[42,44)/1 crt=0'0 mlcod 0'0 active pruub 1187.081542969s@ mbc={}] start_peering_interval up [4,5,3] -> [1,2,3], acting [4,5,3] -> [1,2,3], acting_primary 4 -> 1, up_primary 4 -> 1, role 2 -> 2, features acting 4540138322906710015 upacting 4540138322906710015 Dec 15 03:04:45 localhost ceph-osd[32311]: osd.3 pg_epoch: 44 pg[4.e( empty local-lis/les=40/41 n=0 ec=40/24 lis/c=40/40 les/c/f=41/41/0 sis=44 pruub=9.215451241s) [2,0,1] r=-1 lpr=44 pi=[40,44)/1 crt=0'0 mlcod 0'0 active pruub 1184.982666016s@ mbc={}] start_peering_interval up [4,5,3] -> [2,0,1], acting [4,5,3] -> [2,0,1], acting_primary 4 -> 2, up_primary 4 -> 2, role 2 -> -1, features acting 4540138322906710015 upacting 4540138322906710015 Dec 15 03:04:45 localhost ceph-osd[32311]: osd.3 pg_epoch: 44 pg[2.8( empty local-lis/les=37/38 n=0 ec=37/20 lis/c=37/37 les/c/f=38/38/0 sis=44 pruub=14.140590668s) [2,0,1] r=-1 lpr=44 pi=[37,44)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1189.907714844s@ mbc={}] state: transitioning to Stray Dec 15 03:04:45 localhost ceph-osd[31375]: osd.0 pg_epoch: 44 pg[2.15( empty local-lis/les=0/0 n=0 ec=37/20 lis/c=37/37 les/c/f=38/38/0 sis=44) [0,1,2] r=0 lpr=44 pi=[37,44)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Primary Dec 15 03:04:45 localhost ceph-osd[32311]: osd.3 pg_epoch: 44 pg[2.19( empty local-lis/les=37/38 n=0 ec=37/20 lis/c=37/37 les/c/f=38/38/0 sis=44 pruub=14.140834808s) [4,2,3] r=2 lpr=44 pi=[37,44)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1189.907348633s@ mbc={}] state: transitioning to Stray Dec 15 03:04:45 localhost ceph-osd[32311]: osd.3 pg_epoch: 44 pg[4.e( empty local-lis/les=40/41 n=0 ec=40/24 lis/c=40/40 les/c/f=41/41/0 sis=44 pruub=9.215398788s) [2,0,1] r=-1 lpr=44 pi=[40,44)/1 crt=0'0 mlcod 0'0 unknown NOTIFY 
pruub 1184.982666016s@ mbc={}] state: transitioning to Stray Dec 15 03:04:45 localhost ceph-osd[32311]: osd.3 pg_epoch: 44 pg[6.c( empty local-lis/les=42/43 n=0 ec=42/32 lis/c=42/42 les/c/f=43/43/0 sis=44 pruub=11.314369202s) [1,2,3] r=2 lpr=44 pi=[42,44)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1187.081542969s@ mbc={}] state: transitioning to Stray Dec 15 03:04:45 localhost ceph-osd[32311]: osd.3 pg_epoch: 44 pg[6.1( empty local-lis/les=42/43 n=0 ec=42/32 lis/c=42/42 les/c/f=43/43/0 sis=44 pruub=11.319514275s) [3,5,4] r=0 lpr=44 pi=[42,44)/1 crt=0'0 mlcod 0'0 active pruub 1187.087036133s@ mbc={}] start_peering_interval up [4,5,3] -> [3,5,4], acting [4,5,3] -> [3,5,4], acting_primary 4 -> 3, up_primary 4 -> 3, role 2 -> 0, features acting 4540138322906710015 upacting 4540138322906710015 Dec 15 03:04:45 localhost ceph-osd[32311]: osd.3 pg_epoch: 44 pg[6.1( empty local-lis/les=42/43 n=0 ec=42/32 lis/c=42/42 les/c/f=43/43/0 sis=44 pruub=11.319514275s) [3,5,4] r=0 lpr=44 pi=[42,44)/1 crt=0'0 mlcod 0'0 unknown pruub 1187.087036133s@ mbc={}] state: transitioning to Primary Dec 15 03:04:45 localhost ceph-osd[32311]: osd.3 pg_epoch: 44 pg[4.3( empty local-lis/les=40/41 n=0 ec=40/24 lis/c=40/40 les/c/f=41/41/0 sis=44 pruub=9.210042953s) [0,5,1] r=-1 lpr=44 pi=[40,44)/1 crt=0'0 mlcod 0'0 active pruub 1184.977661133s@ mbc={}] start_peering_interval up [4,5,3] -> [0,5,1], acting [4,5,3] -> [0,5,1], acting_primary 4 -> 0, up_primary 4 -> 0, role 2 -> -1, features acting 4540138322906710015 upacting 4540138322906710015 Dec 15 03:04:45 localhost ceph-osd[32311]: osd.3 pg_epoch: 44 pg[2.5( empty local-lis/les=37/38 n=0 ec=37/20 lis/c=37/37 les/c/f=38/38/0 sis=44 pruub=14.139812469s) [3,4,2] r=0 lpr=44 pi=[37,44)/1 crt=0'0 mlcod 0'0 active pruub 1189.907470703s@ mbc={}] start_peering_interval up [2,3,1] -> [3,4,2], acting [2,3,1] -> [3,4,2], acting_primary 2 -> 3, up_primary 2 -> 3, role 1 -> 0, features acting 4540138322906710015 upacting 4540138322906710015 Dec 15 03:04:45 
localhost ceph-osd[32311]: osd.3 pg_epoch: 44 pg[4.3( empty local-lis/les=40/41 n=0 ec=40/24 lis/c=40/40 les/c/f=41/41/0 sis=44 pruub=9.210004807s) [0,5,1] r=-1 lpr=44 pi=[40,44)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1184.977661133s@ mbc={}] state: transitioning to Stray Dec 15 03:04:45 localhost ceph-osd[32311]: osd.3 pg_epoch: 44 pg[5.f( empty local-lis/les=40/41 n=0 ec=40/26 lis/c=40/40 les/c/f=41/41/0 sis=44 pruub=9.215193748s) [5,0,4] r=-1 lpr=44 pi=[40,44)/1 crt=0'0 mlcod 0'0 active pruub 1184.982543945s@ mbc={}] start_peering_interval up [3,4,2] -> [5,0,4], acting [3,4,2] -> [5,0,4], acting_primary 3 -> 5, up_primary 3 -> 5, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015 Dec 15 03:04:45 localhost ceph-osd[32311]: osd.3 pg_epoch: 44 pg[2.5( empty local-lis/les=37/38 n=0 ec=37/20 lis/c=37/37 les/c/f=38/38/0 sis=44 pruub=14.139812469s) [3,4,2] r=0 lpr=44 pi=[37,44)/1 crt=0'0 mlcod 0'0 unknown pruub 1189.907470703s@ mbc={}] state: transitioning to Primary Dec 15 03:04:45 localhost ceph-osd[31375]: osd.0 pg_epoch: 44 pg[3.15( empty local-lis/les=0/0 n=0 ec=38/22 lis/c=38/38 les/c/f=39/39/0 sis=44) [0,2,4] r=0 lpr=44 pi=[38,44)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Primary Dec 15 03:04:45 localhost ceph-osd[32311]: osd.3 pg_epoch: 44 pg[5.2( empty local-lis/les=40/41 n=0 ec=40/26 lis/c=40/40 les/c/f=41/41/0 sis=44 pruub=9.213839531s) [5,1,3] r=2 lpr=44 pi=[40,44)/1 crt=0'0 mlcod 0'0 active pruub 1184.981689453s@ mbc={}] start_peering_interval up [3,4,2] -> [5,1,3], acting [3,4,2] -> [5,1,3], acting_primary 3 -> 5, up_primary 3 -> 5, role 0 -> 2, features acting 4540138322906710015 upacting 4540138322906710015 Dec 15 03:04:45 localhost ceph-osd[32311]: osd.3 pg_epoch: 44 pg[5.f( empty local-lis/les=40/41 n=0 ec=40/26 lis/c=40/40 les/c/f=41/41/0 sis=44 pruub=9.214599609s) [5,0,4] r=-1 lpr=44 pi=[40,44)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1184.982543945s@ mbc={}] state: transitioning to Stray Dec 15 
03:04:45 localhost ceph-osd[32311]: osd.3 pg_epoch: 44 pg[5.2( empty local-lis/les=40/41 n=0 ec=40/26 lis/c=40/40 les/c/f=41/41/0 sis=44 pruub=9.213812828s) [5,1,3] r=2 lpr=44 pi=[40,44)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1184.981689453s@ mbc={}] state: transitioning to Stray Dec 15 03:04:45 localhost ceph-osd[32311]: osd.3 pg_epoch: 44 pg[2.3( empty local-lis/les=37/38 n=0 ec=37/20 lis/c=37/37 les/c/f=38/38/0 sis=44 pruub=14.139468193s) [5,4,3] r=2 lpr=44 pi=[37,44)/1 crt=0'0 mlcod 0'0 active pruub 1189.907714844s@ mbc={}] start_peering_interval up [2,3,1] -> [5,4,3], acting [2,3,1] -> [5,4,3], acting_primary 2 -> 5, up_primary 2 -> 5, role 1 -> 2, features acting 4540138322906710015 upacting 4540138322906710015 Dec 15 03:04:45 localhost ceph-osd[32311]: osd.3 pg_epoch: 44 pg[2.3( empty local-lis/les=37/38 n=0 ec=37/20 lis/c=37/37 les/c/f=38/38/0 sis=44 pruub=14.139406204s) [5,4,3] r=2 lpr=44 pi=[37,44)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1189.907714844s@ mbc={}] state: transitioning to Stray Dec 15 03:04:45 localhost ceph-osd[31375]: osd.0 pg_epoch: 44 pg[2.13( empty local-lis/les=0/0 n=0 ec=37/20 lis/c=37/37 les/c/f=38/38/0 sis=44) [0,5,4] r=0 lpr=44 pi=[37,44)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Primary Dec 15 03:04:45 localhost ceph-osd[32311]: osd.3 pg_epoch: 44 pg[4.5( empty local-lis/les=40/41 n=0 ec=40/24 lis/c=40/40 les/c/f=41/41/0 sis=44 pruub=9.208250046s) [5,0,1] r=-1 lpr=44 pi=[40,44)/1 crt=0'0 mlcod 0'0 active pruub 1184.976562500s@ mbc={}] start_peering_interval up [4,5,3] -> [5,0,1], acting [4,5,3] -> [5,0,1], acting_primary 4 -> 5, up_primary 4 -> 5, role 2 -> -1, features acting 4540138322906710015 upacting 4540138322906710015 Dec 15 03:04:45 localhost ceph-osd[32311]: osd.3 pg_epoch: 44 pg[3.2( empty local-lis/les=38/39 n=0 ec=38/22 lis/c=38/38 les/c/f=39/39/0 sis=44 pruub=15.172355652s) [4,3,5] r=1 lpr=44 pi=[38,44)/1 crt=0'0 mlcod 0'0 active pruub 1190.940795898s@ mbc={}] start_peering_interval up [3,2,1] -> 
[4,3,5], acting [3,2,1] -> [4,3,5], acting_primary 3 -> 4, up_primary 3 -> 4, role 0 -> 1, features acting 4540138322906710015 upacting 4540138322906710015 Dec 15 03:04:45 localhost ceph-osd[32311]: osd.3 pg_epoch: 44 pg[5.4( empty local-lis/les=40/41 n=0 ec=40/26 lis/c=40/40 les/c/f=41/41/0 sis=44 pruub=9.213391304s) [0,1,5] r=-1 lpr=44 pi=[40,44)/1 crt=0'0 mlcod 0'0 active pruub 1184.981811523s@ mbc={}] start_peering_interval up [3,4,2] -> [0,1,5], acting [3,4,2] -> [0,1,5], acting_primary 3 -> 0, up_primary 3 -> 0, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015 Dec 15 03:04:45 localhost ceph-osd[32311]: osd.3 pg_epoch: 44 pg[4.5( empty local-lis/les=40/41 n=0 ec=40/24 lis/c=40/40 les/c/f=41/41/0 sis=44 pruub=9.208168030s) [5,0,1] r=-1 lpr=44 pi=[40,44)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1184.976562500s@ mbc={}] state: transitioning to Stray Dec 15 03:04:45 localhost ceph-osd[32311]: osd.3 pg_epoch: 44 pg[3.4( empty local-lis/les=38/39 n=0 ec=38/22 lis/c=38/38 les/c/f=39/39/0 sis=44 pruub=15.171573639s) [1,3,2] r=1 lpr=44 pi=[38,44)/1 crt=0'0 mlcod 0'0 active pruub 1190.939941406s@ mbc={}] start_peering_interval up [3,2,1] -> [1,3,2], acting [3,2,1] -> [1,3,2], acting_primary 3 -> 1, up_primary 3 -> 1, role 0 -> 1, features acting 4540138322906710015 upacting 4540138322906710015 Dec 15 03:04:45 localhost ceph-osd[32311]: osd.3 pg_epoch: 44 pg[5.4( empty local-lis/les=40/41 n=0 ec=40/26 lis/c=40/40 les/c/f=41/41/0 sis=44 pruub=9.213311195s) [0,1,5] r=-1 lpr=44 pi=[40,44)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1184.981811523s@ mbc={}] state: transitioning to Stray Dec 15 03:04:45 localhost ceph-osd[32311]: osd.3 pg_epoch: 44 pg[3.4( empty local-lis/les=38/39 n=0 ec=38/22 lis/c=38/38 les/c/f=39/39/0 sis=44 pruub=15.171372414s) [1,3,2] r=1 lpr=44 pi=[38,44)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1190.939941406s@ mbc={}] state: transitioning to Stray Dec 15 03:04:45 localhost ceph-osd[32311]: osd.3 pg_epoch: 44 pg[6.6( empty 
local-lis/les=42/43 n=0 ec=42/32 lis/c=42/42 les/c/f=43/43/0 sis=44 pruub=11.318981171s) [1,5,3] r=2 lpr=44 pi=[42,44)/1 crt=0'0 mlcod 0'0 active pruub 1187.087768555s@ mbc={}] start_peering_interval up [4,5,3] -> [1,5,3], acting [4,5,3] -> [1,5,3], acting_primary 4 -> 1, up_primary 4 -> 1, role 2 -> 2, features acting 4540138322906710015 upacting 4540138322906710015 Dec 15 03:04:45 localhost ceph-osd[32311]: osd.3 pg_epoch: 44 pg[6.6( empty local-lis/les=42/43 n=0 ec=42/32 lis/c=42/42 les/c/f=43/43/0 sis=44 pruub=11.318921089s) [1,5,3] r=2 lpr=44 pi=[42,44)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1187.087768555s@ mbc={}] state: transitioning to Stray Dec 15 03:04:45 localhost ceph-osd[32311]: osd.3 pg_epoch: 44 pg[3.2( empty local-lis/les=38/39 n=0 ec=38/22 lis/c=38/38 les/c/f=39/39/0 sis=44 pruub=15.171966553s) [4,3,5] r=1 lpr=44 pi=[38,44)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1190.940795898s@ mbc={}] state: transitioning to Stray Dec 15 03:04:45 localhost ceph-osd[32311]: osd.3 pg_epoch: 44 pg[6.7( empty local-lis/les=42/43 n=0 ec=42/32 lis/c=42/42 les/c/f=43/43/0 sis=44 pruub=11.319546700s) [5,1,3] r=2 lpr=44 pi=[42,44)/1 crt=0'0 mlcod 0'0 active pruub 1187.088256836s@ mbc={}] start_peering_interval up [4,5,3] -> [5,1,3], acting [4,5,3] -> [5,1,3], acting_primary 4 -> 5, up_primary 4 -> 5, role 2 -> 2, features acting 4540138322906710015 upacting 4540138322906710015 Dec 15 03:04:45 localhost ceph-osd[31375]: osd.0 pg_epoch: 44 pg[2.f( empty local-lis/les=0/0 n=0 ec=37/20 lis/c=37/37 les/c/f=38/38/0 sis=44) [0,2,4] r=0 lpr=44 pi=[37,44)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Primary Dec 15 03:04:45 localhost ceph-osd[32311]: osd.3 pg_epoch: 44 pg[4.4( empty local-lis/les=40/41 n=0 ec=40/24 lis/c=40/40 les/c/f=41/41/0 sis=44 pruub=9.206823349s) [1,3,2] r=1 lpr=44 pi=[40,44)/1 crt=0'0 mlcod 0'0 active pruub 1184.975830078s@ mbc={}] start_peering_interval up [4,5,3] -> [1,3,2], acting [4,5,3] -> [1,3,2], acting_primary 4 -> 1, up_primary 4 
-> 1, role 2 -> 1, features acting 4540138322906710015 upacting 4540138322906710015 Dec 15 03:04:45 localhost ceph-osd[32311]: osd.3 pg_epoch: 44 pg[2.2( empty local-lis/les=37/38 n=0 ec=37/20 lis/c=37/37 les/c/f=38/38/0 sis=44 pruub=14.138446808s) [5,1,0] r=-1 lpr=44 pi=[37,44)/1 crt=0'0 mlcod 0'0 active pruub 1189.907470703s@ mbc={}] start_peering_interval up [2,3,1] -> [5,1,0], acting [2,3,1] -> [5,1,0], acting_primary 2 -> 5, up_primary 2 -> 5, role 1 -> -1, features acting 4540138322906710015 upacting 4540138322906710015 Dec 15 03:04:45 localhost ceph-osd[32311]: osd.3 pg_epoch: 44 pg[5.5( empty local-lis/les=40/41 n=0 ec=40/26 lis/c=40/40 les/c/f=41/41/0 sis=44 pruub=9.212672234s) [4,2,0] r=-1 lpr=44 pi=[40,44)/1 crt=0'0 mlcod 0'0 active pruub 1184.981811523s@ mbc={}] start_peering_interval up [3,4,2] -> [4,2,0], acting [3,4,2] -> [4,2,0], acting_primary 3 -> 4, up_primary 3 -> 4, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015 Dec 15 03:04:45 localhost ceph-osd[32311]: osd.3 pg_epoch: 44 pg[3.3( empty local-lis/les=38/39 n=0 ec=38/22 lis/c=38/38 les/c/f=39/39/0 sis=44 pruub=15.171236038s) [2,4,3] r=2 lpr=44 pi=[38,44)/1 crt=0'0 mlcod 0'0 active pruub 1190.939941406s@ mbc={}] start_peering_interval up [3,2,1] -> [2,4,3], acting [3,2,1] -> [2,4,3], acting_primary 3 -> 2, up_primary 3 -> 2, role 0 -> 2, features acting 4540138322906710015 upacting 4540138322906710015 Dec 15 03:04:45 localhost ceph-osd[32311]: osd.3 pg_epoch: 44 pg[3.3( empty local-lis/les=38/39 n=0 ec=38/22 lis/c=38/38 les/c/f=39/39/0 sis=44 pruub=15.170708656s) [2,4,3] r=2 lpr=44 pi=[38,44)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1190.939941406s@ mbc={}] state: transitioning to Stray Dec 15 03:04:45 localhost ceph-osd[31375]: osd.0 pg_epoch: 44 pg[3.e( empty local-lis/les=0/0 n=0 ec=38/22 lis/c=38/38 les/c/f=39/39/0 sis=44) [0,5,1] r=0 lpr=44 pi=[38,44)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Primary Dec 15 03:04:45 localhost ceph-osd[32311]: 
osd.3 pg_epoch: 44 pg[6.7( empty local-lis/les=42/43 n=0 ec=42/32 lis/c=42/42 les/c/f=43/43/0 sis=44 pruub=11.319099426s) [5,1,3] r=2 lpr=44 pi=[42,44)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1187.088256836s@ mbc={}] state: transitioning to Stray Dec 15 03:04:45 localhost ceph-osd[31375]: osd.0 pg_epoch: 44 pg[2.d( empty local-lis/les=0/0 n=0 ec=37/20 lis/c=37/37 les/c/f=38/38/0 sis=44) [0,5,1] r=0 lpr=44 pi=[37,44)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Primary Dec 15 03:04:45 localhost ceph-osd[32311]: osd.3 pg_epoch: 44 pg[4.4( empty local-lis/les=40/41 n=0 ec=40/24 lis/c=40/40 les/c/f=41/41/0 sis=44 pruub=9.206172943s) [1,3,2] r=1 lpr=44 pi=[40,44)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1184.975830078s@ mbc={}] state: transitioning to Stray Dec 15 03:04:45 localhost ceph-osd[32311]: osd.3 pg_epoch: 44 pg[2.2( empty local-lis/les=37/38 n=0 ec=37/20 lis/c=37/37 les/c/f=38/38/0 sis=44 pruub=14.137796402s) [5,1,0] r=-1 lpr=44 pi=[37,44)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1189.907470703s@ mbc={}] state: transitioning to Stray Dec 15 03:04:45 localhost ceph-osd[32311]: osd.3 pg_epoch: 44 pg[2.4( empty local-lis/les=37/38 n=0 ec=37/20 lis/c=37/37 les/c/f=38/38/0 sis=44 pruub=14.137843132s) [1,0,2] r=-1 lpr=44 pi=[37,44)/1 crt=0'0 mlcod 0'0 active pruub 1189.907714844s@ mbc={}] start_peering_interval up [2,3,1] -> [1,0,2], acting [2,3,1] -> [1,0,2], acting_primary 2 -> 1, up_primary 2 -> 1, role 1 -> -1, features acting 4540138322906710015 upacting 4540138322906710015 Dec 15 03:04:45 localhost ceph-osd[32311]: osd.3 pg_epoch: 44 pg[4.2( empty local-lis/les=40/41 n=0 ec=40/24 lis/c=40/40 les/c/f=41/41/0 sis=44 pruub=9.206079483s) [3,5,4] r=0 lpr=44 pi=[40,44)/1 crt=0'0 mlcod 0'0 active pruub 1184.975952148s@ mbc={}] start_peering_interval up [4,5,3] -> [3,5,4], acting [4,5,3] -> [3,5,4], acting_primary 4 -> 3, up_primary 4 -> 3, role 2 -> 0, features acting 4540138322906710015 upacting 4540138322906710015 Dec 15 03:04:45 localhost 
ceph-osd[32311]: osd.3 pg_epoch: 44 pg[5.3( empty local-lis/les=40/41 n=0 ec=40/26 lis/c=40/40 les/c/f=41/41/0 sis=44 pruub=9.211530685s) [4,5,0] r=-1 lpr=44 pi=[40,44)/1 crt=0'0 mlcod 0'0 active pruub 1184.981445312s@ mbc={}] start_peering_interval up [3,4,2] -> [4,5,0], acting [3,4,2] -> [4,5,0], acting_primary 3 -> 4, up_primary 3 -> 4, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015 Dec 15 03:04:45 localhost ceph-osd[32311]: osd.3 pg_epoch: 44 pg[4.2( empty local-lis/les=40/41 n=0 ec=40/24 lis/c=40/40 les/c/f=41/41/0 sis=44 pruub=9.206079483s) [3,5,4] r=0 lpr=44 pi=[40,44)/1 crt=0'0 mlcod 0'0 unknown pruub 1184.975952148s@ mbc={}] state: transitioning to Primary Dec 15 03:04:45 localhost ceph-osd[32311]: osd.3 pg_epoch: 44 pg[5.3( empty local-lis/les=40/41 n=0 ec=40/26 lis/c=40/40 les/c/f=41/41/0 sis=44 pruub=9.211490631s) [4,5,0] r=-1 lpr=44 pi=[40,44)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1184.981445312s@ mbc={}] state: transitioning to Stray Dec 15 03:04:45 localhost ceph-osd[32311]: osd.3 pg_epoch: 44 pg[2.4( empty local-lis/les=37/38 n=0 ec=37/20 lis/c=37/37 les/c/f=38/38/0 sis=44 pruub=14.137792587s) [1,0,2] r=-1 lpr=44 pi=[37,44)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1189.907714844s@ mbc={}] state: transitioning to Stray Dec 15 03:04:45 localhost ceph-osd[32311]: osd.3 pg_epoch: 44 pg[6.3( empty local-lis/les=42/43 n=0 ec=42/32 lis/c=42/42 les/c/f=43/43/0 sis=44 pruub=11.316949844s) [5,3,1] r=1 lpr=44 pi=[42,44)/1 crt=0'0 mlcod 0'0 active pruub 1187.087280273s@ mbc={}] start_peering_interval up [4,5,3] -> [5,3,1], acting [4,5,3] -> [5,3,1], acting_primary 4 -> 5, up_primary 4 -> 5, role 2 -> 1, features acting 4540138322906710015 upacting 4540138322906710015 Dec 15 03:04:45 localhost ceph-osd[32311]: osd.3 pg_epoch: 44 pg[3.6( empty local-lis/les=38/39 n=0 ec=38/22 lis/c=38/38 les/c/f=39/39/0 sis=44 pruub=15.169899940s) [4,2,0] r=-1 lpr=44 pi=[38,44)/1 crt=0'0 mlcod 0'0 active pruub 1190.940063477s@ mbc={}] 
start_peering_interval up [3,2,1] -> [4,2,0], acting [3,2,1] -> [4,2,0], acting_primary 3 -> 4, up_primary 3 -> 4, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015 Dec 15 03:04:45 localhost ceph-osd[32311]: osd.3 pg_epoch: 44 pg[5.5( empty local-lis/les=40/41 n=0 ec=40/26 lis/c=40/40 les/c/f=41/41/0 sis=44 pruub=9.212569237s) [4,2,0] r=-1 lpr=44 pi=[40,44)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1184.981811523s@ mbc={}] state: transitioning to Stray Dec 15 03:04:45 localhost ceph-osd[32311]: osd.3 pg_epoch: 44 pg[6.3( empty local-lis/les=42/43 n=0 ec=42/32 lis/c=42/42 les/c/f=43/43/0 sis=44 pruub=11.316904068s) [5,3,1] r=1 lpr=44 pi=[42,44)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1187.087280273s@ mbc={}] state: transitioning to Stray Dec 15 03:04:45 localhost ceph-osd[32311]: osd.3 pg_epoch: 44 pg[3.6( empty local-lis/les=38/39 n=0 ec=38/22 lis/c=38/38 les/c/f=39/39/0 sis=44 pruub=15.169847488s) [4,2,0] r=-1 lpr=44 pi=[38,44)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1190.940063477s@ mbc={}] state: transitioning to Stray Dec 15 03:04:45 localhost ceph-osd[32311]: osd.3 pg_epoch: 44 pg[4.1( empty local-lis/les=40/41 n=0 ec=40/24 lis/c=40/40 les/c/f=41/41/0 sis=44 pruub=9.207597733s) [3,2,1] r=0 lpr=44 pi=[40,44)/1 crt=0'0 mlcod 0'0 active pruub 1184.978149414s@ mbc={}] start_peering_interval up [4,5,3] -> [3,2,1], acting [4,5,3] -> [3,2,1], acting_primary 4 -> 3, up_primary 4 -> 3, role 2 -> 0, features acting 4540138322906710015 upacting 4540138322906710015 Dec 15 03:04:45 localhost ceph-osd[32311]: osd.3 pg_epoch: 44 pg[4.1( empty local-lis/les=40/41 n=0 ec=40/24 lis/c=40/40 les/c/f=41/41/0 sis=44 pruub=9.207597733s) [3,2,1] r=0 lpr=44 pi=[40,44)/1 crt=0'0 mlcod 0'0 unknown pruub 1184.978149414s@ mbc={}] state: transitioning to Primary Dec 15 03:04:45 localhost ceph-osd[32311]: osd.3 pg_epoch: 44 pg[2.7( empty local-lis/les=37/38 n=0 ec=37/20 lis/c=37/37 les/c/f=38/38/0 sis=44 pruub=14.136903763s) [5,0,1] r=-1 lpr=44 pi=[37,44)/1 
crt=0'0 mlcod 0'0 active pruub 1189.907470703s@ mbc={}] start_peering_interval up [2,3,1] -> [5,0,1], acting [2,3,1] -> [5,0,1], acting_primary 2 -> 5, up_primary 2 -> 5, role 1 -> -1, features acting 4540138322906710015 upacting 4540138322906710015 Dec 15 03:04:45 localhost ceph-osd[32311]: osd.3 pg_epoch: 44 pg[3.5( empty local-lis/les=38/39 n=0 ec=38/22 lis/c=38/38 les/c/f=39/39/0 sis=44 pruub=15.169288635s) [5,4,3] r=2 lpr=44 pi=[38,44)/1 crt=0'0 mlcod 0'0 active pruub 1190.940063477s@ mbc={}] start_peering_interval up [3,2,1] -> [5,4,3], acting [3,2,1] -> [5,4,3], acting_primary 3 -> 5, up_primary 3 -> 5, role 0 -> 2, features acting 4540138322906710015 upacting 4540138322906710015 Dec 15 03:04:45 localhost ceph-osd[32311]: osd.3 pg_epoch: 44 pg[2.7( empty local-lis/les=37/38 n=0 ec=37/20 lis/c=37/37 les/c/f=38/38/0 sis=44 pruub=14.136816025s) [5,0,1] r=-1 lpr=44 pi=[37,44)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1189.907470703s@ mbc={}] state: transitioning to Stray Dec 15 03:04:45 localhost ceph-osd[32311]: osd.3 pg_epoch: 44 pg[3.7( empty local-lis/les=38/39 n=0 ec=38/22 lis/c=38/38 les/c/f=39/39/0 sis=44 pruub=15.169478416s) [4,3,2] r=1 lpr=44 pi=[38,44)/1 crt=0'0 mlcod 0'0 active pruub 1190.940185547s@ mbc={}] start_peering_interval up [3,2,1] -> [4,3,2], acting [3,2,1] -> [4,3,2], acting_primary 3 -> 4, up_primary 3 -> 4, role 0 -> 1, features acting 4540138322906710015 upacting 4540138322906710015 Dec 15 03:04:45 localhost ceph-osd[32311]: osd.3 pg_epoch: 44 pg[3.7( empty local-lis/les=38/39 n=0 ec=38/22 lis/c=38/38 les/c/f=39/39/0 sis=44 pruub=15.169430733s) [4,3,2] r=1 lpr=44 pi=[38,44)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1190.940185547s@ mbc={}] state: transitioning to Stray Dec 15 03:04:45 localhost ceph-osd[32311]: osd.3 pg_epoch: 44 pg[3.5( empty local-lis/les=38/39 n=0 ec=38/22 lis/c=38/38 les/c/f=39/39/0 sis=44 pruub=15.169192314s) [5,4,3] r=2 lpr=44 pi=[38,44)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1190.940063477s@ mbc={}] state: 
transitioning to Stray Dec 15 03:04:45 localhost ceph-osd[32311]: osd.3 pg_epoch: 44 pg[5.1( empty local-lis/les=40/41 n=0 ec=40/26 lis/c=40/40 les/c/f=41/41/0 sis=44 pruub=9.210433960s) [2,4,0] r=-1 lpr=44 pi=[40,44)/1 crt=0'0 mlcod 0'0 active pruub 1184.981567383s@ mbc={}] start_peering_interval up [3,4,2] -> [2,4,0], acting [3,4,2] -> [2,4,0], acting_primary 3 -> 2, up_primary 3 -> 2, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015 Dec 15 03:04:45 localhost ceph-osd[32311]: osd.3 pg_epoch: 44 pg[5.1( empty local-lis/les=40/41 n=0 ec=40/26 lis/c=40/40 les/c/f=41/41/0 sis=44 pruub=9.210385323s) [2,4,0] r=-1 lpr=44 pi=[40,44)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1184.981567383s@ mbc={}] state: transitioning to Stray Dec 15 03:04:45 localhost ceph-osd[32311]: osd.3 pg_epoch: 44 pg[6.5( empty local-lis/les=42/43 n=0 ec=42/32 lis/c=42/42 les/c/f=43/43/0 sis=44 pruub=11.317996979s) [5,3,1] r=1 lpr=44 pi=[42,44)/1 crt=0'0 mlcod 0'0 active pruub 1187.089233398s@ mbc={}] start_peering_interval up [4,5,3] -> [5,3,1], acting [4,5,3] -> [5,3,1], acting_primary 4 -> 5, up_primary 4 -> 5, role 2 -> 1, features acting 4540138322906710015 upacting 4540138322906710015 Dec 15 03:04:45 localhost ceph-osd[32311]: osd.3 pg_epoch: 44 pg[6.5( empty local-lis/les=42/43 n=0 ec=42/32 lis/c=42/42 les/c/f=43/43/0 sis=44 pruub=11.317941666s) [5,3,1] r=1 lpr=44 pi=[42,44)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1187.089233398s@ mbc={}] state: transitioning to Stray Dec 15 03:04:45 localhost ceph-osd[32311]: osd.3 pg_epoch: 44 pg[4.7( empty local-lis/les=40/41 n=0 ec=40/24 lis/c=40/40 les/c/f=41/41/0 sis=44 pruub=9.204507828s) [1,0,2] r=-1 lpr=44 pi=[40,44)/1 crt=0'0 mlcod 0'0 active pruub 1184.975952148s@ mbc={}] start_peering_interval up [4,5,3] -> [1,0,2], acting [4,5,3] -> [1,0,2], acting_primary 4 -> 1, up_primary 4 -> 1, role 2 -> -1, features acting 4540138322906710015 upacting 4540138322906710015 Dec 15 03:04:45 localhost ceph-osd[32311]: osd.3 
pg_epoch: 44 pg[2.6( empty local-lis/les=37/38 n=0 ec=37/20 lis/c=37/37 les/c/f=38/38/0 sis=44 pruub=14.135907173s) [1,0,5] r=-1 lpr=44 pi=[37,44)/1 crt=0'0 mlcod 0'0 active pruub 1189.907348633s@ mbc={}] start_peering_interval up [2,3,1] -> [1,0,5], acting [2,3,1] -> [1,0,5], acting_primary 2 -> 1, up_primary 2 -> 1, role 1 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Dec 15 03:04:45 localhost ceph-osd[32311]: osd.3 pg_epoch: 44 pg[4.7( empty local-lis/les=40/41 n=0 ec=40/24 lis/c=40/40 les/c/f=41/41/0 sis=44 pruub=9.204423904s) [1,0,2] r=-1 lpr=44 pi=[40,44)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1184.975952148s@ mbc={}] state<NotTrimming>: transitioning to Stray
Dec 15 03:04:45 localhost ceph-osd[32311]: osd.3 pg_epoch: 44 pg[2.1( empty local-lis/les=37/38 n=0 ec=37/20 lis/c=37/37 les/c/f=38/38/0 sis=44 pruub=14.135882378s) [4,3,5] r=1 lpr=44 pi=[37,44)/1 crt=0'0 mlcod 0'0 active pruub 1189.907348633s@ mbc={}] start_peering_interval up [2,3,1] -> [4,3,5], acting [2,3,1] -> [4,3,5], acting_primary 2 -> 4, up_primary 2 -> 4, role 1 -> 1, features acting 4540138322906710015 upacting 4540138322906710015
Dec 15 03:04:45 localhost ceph-osd[32311]: osd.3 pg_epoch: 44 pg[2.1( empty local-lis/les=37/38 n=0 ec=37/20 lis/c=37/37 les/c/f=38/38/0 sis=44 pruub=14.135849953s) [4,3,5] r=1 lpr=44 pi=[37,44)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1189.907348633s@ mbc={}] state: transitioning to Stray
Dec 15 03:04:45 localhost ceph-osd[32311]: osd.3 pg_epoch: 44 pg[3.1( empty local-lis/les=38/39 n=0 ec=38/22 lis/c=38/38 les/c/f=39/39/0 sis=44 pruub=15.165132523s) [4,2,3] r=2 lpr=44 pi=[38,44)/1 crt=0'0 mlcod 0'0 active pruub 1190.937133789s@ mbc={}] start_peering_interval up [3,2,1] -> [4,2,3], acting [3,2,1] -> [4,2,3], acting_primary 3 -> 4, up_primary 3 -> 4, role 0 -> 2, features acting 4540138322906710015 upacting 4540138322906710015
Dec 15 03:04:45 localhost ceph-osd[31375]: osd.0 pg_epoch: 44 pg[4.1c( empty local-lis/les=0/0 n=0 ec=40/24 lis/c=40/40 les/c/f=41/41/0 sis=44) [0,1,2] r=0 lpr=44 pi=[40,44)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Primary
Dec 15 03:04:45 localhost ceph-osd[32311]: osd.3 pg_epoch: 44 pg[3.1( empty local-lis/les=38/39 n=0 ec=38/22 lis/c=38/38 les/c/f=39/39/0 sis=44 pruub=15.165072441s) [4,2,3] r=2 lpr=44 pi=[38,44)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1190.937133789s@ mbc={}] state: transitioning to Stray
Dec 15 03:04:45 localhost ceph-osd[31375]: osd.0 pg_epoch: 44 pg[4.3( empty local-lis/les=0/0 n=0 ec=40/24 lis/c=40/40 les/c/f=41/41/0 sis=44) [0,5,1] r=0 lpr=44 pi=[40,44)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Primary
Dec 15 03:04:45 localhost ceph-osd[32311]: osd.3 pg_epoch: 44 pg[6.4( empty local-lis/les=42/43 n=0 ec=42/32 lis/c=42/42 les/c/f=43/43/0 sis=44 pruub=11.316514015s) [4,2,3] r=2 lpr=44 pi=[42,44)/1 crt=0'0 mlcod 0'0 active pruub 1187.088623047s@ mbc={}] start_peering_interval up [4,5,3] -> [4,2,3], acting [4,5,3] -> [4,2,3], acting_primary 4 -> 4, up_primary 4 -> 4, role 2 -> 2, features acting 4540138322906710015 upacting 4540138322906710015
Dec 15 03:04:45 localhost ceph-osd[32311]: osd.3 pg_epoch: 44 pg[2.6( empty local-lis/les=37/38 n=0 ec=37/20 lis/c=37/37 les/c/f=38/38/0 sis=44 pruub=14.135855675s) [1,0,5] r=-1 lpr=44 pi=[37,44)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1189.907348633s@ mbc={}] state: transitioning to Stray
Dec 15 03:04:45 localhost ceph-osd[32311]: osd.3 pg_epoch: 44 pg[6.2( empty local-lis/les=42/43 n=0 ec=42/32 lis/c=42/42 les/c/f=43/43/0 sis=44 pruub=11.314888954s) [5,4,0] r=-1 lpr=44 pi=[42,44)/1 crt=0'0 mlcod 0'0 active pruub 1187.087036133s@ mbc={}] start_peering_interval up [4,5,3] -> [5,4,0], acting [4,5,3] -> [5,4,0], acting_primary 4 -> 5, up_primary 4 -> 5, role 2 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Dec 15 03:04:45 localhost ceph-osd[32311]: osd.3 pg_epoch: 44 pg[6.4( empty local-lis/les=42/43 n=0 ec=42/32 lis/c=42/42 les/c/f=43/43/0 sis=44 pruub=11.316477776s) [4,2,3] r=2 lpr=44 pi=[42,44)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1187.088623047s@ mbc={}] state: transitioning to Stray
Dec 15 03:04:45 localhost ceph-osd[31375]: osd.0 pg_epoch: 44 pg[5.4( empty local-lis/les=0/0 n=0 ec=40/26 lis/c=40/40 les/c/f=41/41/0 sis=44) [0,1,5] r=0 lpr=44 pi=[40,44)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Primary
Dec 15 03:04:45 localhost ceph-osd[32311]: osd.3 pg_epoch: 44 pg[4.6( empty local-lis/les=40/41 n=0 ec=40/24 lis/c=40/40 les/c/f=41/41/0 sis=44 pruub=9.203640938s) [0,1,2] r=-1 lpr=44 pi=[40,44)/1 crt=0'0 mlcod 0'0 active pruub 1184.975830078s@ mbc={}] start_peering_interval up [4,5,3] -> [0,1,2], acting [4,5,3] -> [0,1,2], acting_primary 4 -> 0, up_primary 4 -> 0, role 2 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Dec 15 03:04:45 localhost ceph-osd[32311]: osd.3 pg_epoch: 44 pg[5.7( empty local-lis/les=40/41 n=0 ec=40/26 lis/c=40/40 les/c/f=41/41/0 sis=44 pruub=9.209972382s) [5,1,0] r=-1 lpr=44 pi=[40,44)/1 crt=0'0 mlcod 0'0 active pruub 1184.982910156s@ mbc={}] start_peering_interval up [3,4,2] -> [5,1,0], acting [3,4,2] -> [5,1,0], acting_primary 3 -> 5, up_primary 3 -> 5, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Dec 15 03:04:45 localhost ceph-osd[32311]: osd.3 pg_epoch: 44 pg[4.6( empty local-lis/les=40/41 n=0 ec=40/24 lis/c=40/40 les/c/f=41/41/0 sis=44 pruub=9.202970505s) [0,1,2] r=-1 lpr=44 pi=[40,44)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1184.975830078s@ mbc={}] state: transitioning to Stray
Dec 15 03:04:45 localhost ceph-osd[32311]: osd.3 pg_epoch: 44 pg[5.7( empty local-lis/les=40/41 n=0 ec=40/26 lis/c=40/40 les/c/f=41/41/0 sis=44 pruub=9.209930420s) [5,1,0] r=-1 lpr=44 pi=[40,44)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1184.982910156s@ mbc={}] state: transitioning to Stray
Dec 15 03:04:45 localhost ceph-osd[32311]: osd.3 pg_epoch: 44 pg[5.6( empty local-lis/les=40/41 n=0 ec=40/26 lis/c=40/40 les/c/f=41/41/0 sis=44 pruub=9.208740234s) [4,5,0] r=-1 lpr=44 pi=[40,44)/1 crt=0'0 mlcod 0'0 active pruub 1184.981323242s@ mbc={}] start_peering_interval up [3,4,2] -> [4,5,0], acting [3,4,2] -> [4,5,0], acting_primary 3 -> 4, up_primary 3 -> 4, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Dec 15 03:04:45 localhost ceph-osd[32311]: osd.3 pg_epoch: 44 pg[5.6( empty local-lis/les=40/41 n=0 ec=40/26 lis/c=40/40 les/c/f=41/41/0 sis=44 pruub=9.208299637s) [4,5,0] r=-1 lpr=44 pi=[40,44)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1184.981323242s@ mbc={}] state: transitioning to Stray
Dec 15 03:04:45 localhost ceph-osd[32311]: osd.3 pg_epoch: 44 pg[2.9( empty local-lis/les=37/38 n=0 ec=37/20 lis/c=37/37 les/c/f=38/38/0 sis=44 pruub=14.134285927s) [4,5,3] r=2 lpr=44 pi=[37,44)/1 crt=0'0 mlcod 0'0 active pruub 1189.907348633s@ mbc={}] start_peering_interval up [2,3,1] -> [4,5,3], acting [2,3,1] -> [4,5,3], acting_primary 2 -> 4, up_primary 2 -> 4, role 1 -> 2, features acting 4540138322906710015 upacting 4540138322906710015
Dec 15 03:04:45 localhost ceph-osd[31375]: osd.0 pg_epoch: 44 pg[2.a( empty local-lis/les=0/0 n=0 ec=37/20 lis/c=37/37 les/c/f=38/38/0 sis=44) [0,4,2] r=0 lpr=44 pi=[37,44)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Primary
Dec 15 03:04:45 localhost ceph-osd[32311]: osd.3 pg_epoch: 44 pg[6.2( empty local-lis/les=42/43 n=0 ec=42/32 lis/c=42/42 les/c/f=43/43/0 sis=44 pruub=11.314831734s) [5,4,0] r=-1 lpr=44 pi=[42,44)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1187.087036133s@ mbc={}] state: transitioning to Stray
Dec 15 03:04:45 localhost ceph-osd[32311]: osd.3 pg_epoch: 44 pg[2.9( empty local-lis/les=37/38 n=0 ec=37/20 lis/c=37/37 les/c/f=38/38/0 sis=44 pruub=14.134241104s) [4,5,3] r=2 lpr=44 pi=[37,44)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1189.907348633s@ mbc={}] state: transitioning to Stray
Dec 15 03:04:45 localhost ceph-osd[32311]: osd.3 pg_epoch: 44 pg[6.d( empty local-lis/les=42/43 n=0 ec=42/32 lis/c=42/42 les/c/f=43/43/0 sis=44 pruub=11.313735008s) [2,4,0] r=-1 lpr=44 pi=[42,44)/1 crt=0'0 mlcod 0'0 active pruub 1187.087036133s@ mbc={}] start_peering_interval up [4,5,3] -> [2,4,0], acting [4,5,3] -> [2,4,0], acting_primary 4 -> 2, up_primary 4 -> 2, role 2 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Dec 15 03:04:45 localhost ceph-osd[32311]: osd.3 pg_epoch: 44 pg[4.f( empty local-lis/les=40/41 n=0 ec=40/24 lis/c=40/40 les/c/f=41/41/0 sis=44 pruub=9.208865166s) [1,3,5] r=1 lpr=44 pi=[40,44)/1 crt=0'0 mlcod 0'0 active pruub 1184.982177734s@ mbc={}] start_peering_interval up [4,5,3] -> [1,3,5], acting [4,5,3] -> [1,3,5], acting_primary 4 -> 1, up_primary 4 -> 1, role 2 -> 1, features acting 4540138322906710015 upacting 4540138322906710015
Dec 15 03:04:45 localhost ceph-osd[32311]: osd.3 pg_epoch: 44 pg[4.f( empty local-lis/les=40/41 n=0 ec=40/24 lis/c=40/40 les/c/f=41/41/0 sis=44 pruub=9.208823204s) [1,3,5] r=1 lpr=44 pi=[40,44)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1184.982177734s@ mbc={}] state: transitioning to Stray
Dec 15 03:04:45 localhost ceph-osd[32311]: osd.3 pg_epoch: 44 pg[6.e( empty local-lis/les=42/43 n=0 ec=42/32 lis/c=42/42 les/c/f=43/43/0 sis=44 pruub=11.311120987s) [5,1,3] r=2 lpr=44 pi=[42,44)/1 crt=0'0 mlcod 0'0 active pruub 1187.084594727s@ mbc={}] start_peering_interval up [4,5,3] -> [5,1,3], acting [4,5,3] -> [5,1,3], acting_primary 4 -> 5, up_primary 4 -> 5, role 2 -> 2, features acting 4540138322906710015 upacting 4540138322906710015
Dec 15 03:04:45 localhost ceph-osd[32311]: osd.3 pg_epoch: 44 pg[6.e( empty local-lis/les=42/43 n=0 ec=42/32 lis/c=42/42 les/c/f=43/43/0 sis=44 pruub=11.311095238s) [5,1,3] r=2 lpr=44 pi=[42,44)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1187.084594727s@ mbc={}] state: transitioning to Stray
Dec 15 03:04:45 localhost ceph-osd[32311]: osd.3 pg_epoch: 44 pg[2.a( empty local-lis/les=37/38 n=0 ec=37/20 lis/c=37/37 les/c/f=38/38/0 sis=44 pruub=14.132903099s) [0,4,2] r=-1 lpr=44 pi=[37,44)/1 crt=0'0 mlcod 0'0 active pruub 1189.906616211s@ mbc={}] start_peering_interval up [2,3,1] -> [0,4,2], acting [2,3,1] -> [0,4,2], acting_primary 2 -> 0, up_primary 2 -> 0, role 1 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Dec 15 03:04:45 localhost ceph-osd[32311]: osd.3 pg_epoch: 44 pg[3.8( empty local-lis/les=38/39 n=0 ec=38/22 lis/c=38/38 les/c/f=39/39/0 sis=44 pruub=15.166744232s) [3,1,5] r=0 lpr=44 pi=[38,44)/1 crt=0'0 mlcod 0'0 active pruub 1190.940551758s@ mbc={}] start_peering_interval up [3,2,1] -> [3,1,5], acting [3,2,1] -> [3,1,5], acting_primary 3 -> 3, up_primary 3 -> 3, role 0 -> 0, features acting 4540138322906710015 upacting 4540138322906710015
Dec 15 03:04:45 localhost ceph-osd[32311]: osd.3 pg_epoch: 44 pg[2.a( empty local-lis/les=37/38 n=0 ec=37/20 lis/c=37/37 les/c/f=38/38/0 sis=44 pruub=14.132872581s) [0,4,2] r=-1 lpr=44 pi=[37,44)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1189.906616211s@ mbc={}] state: transitioning to Stray
Dec 15 03:04:45 localhost ceph-osd[32311]: osd.3 pg_epoch: 44 pg[4.c( empty local-lis/les=40/41 n=0 ec=40/24 lis/c=40/40 les/c/f=41/41/0 sis=44 pruub=9.208339691s) [5,4,0] r=-1 lpr=44 pi=[40,44)/1 crt=0'0 mlcod 0'0 active pruub 1184.982177734s@ mbc={}] start_peering_interval up [4,5,3] -> [5,4,0], acting [4,5,3] -> [5,4,0], acting_primary 4 -> 5, up_primary 4 -> 5, role 2 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Dec 15 03:04:45 localhost ceph-osd[32311]: osd.3 pg_epoch: 44 pg[3.8( empty local-lis/les=38/39 n=0 ec=38/22 lis/c=38/38 les/c/f=39/39/0 sis=44 pruub=15.166744232s) [3,1,5] r=0 lpr=44 pi=[38,44)/1 crt=0'0 mlcod 0'0 unknown pruub 1190.940551758s@ mbc={}] state: transitioning to Primary
Dec 15 03:04:45 localhost ceph-osd[31375]: osd.0 pg_epoch: 44 pg[4.6( empty local-lis/les=0/0 n=0 ec=40/24 lis/c=40/40 les/c/f=41/41/0 sis=44) [0,1,2] r=0 lpr=44 pi=[40,44)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Primary
Dec 15 03:04:45 localhost ceph-osd[32311]: osd.3 pg_epoch: 44 pg[6.d( empty local-lis/les=42/43 n=0 ec=42/32 lis/c=42/42 les/c/f=43/43/0 sis=44 pruub=11.313407898s) [2,4,0] r=-1 lpr=44 pi=[42,44)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1187.087036133s@ mbc={}] state: transitioning to Stray
Dec 15 03:04:45 localhost ceph-osd[32311]: osd.3 pg_epoch: 44 pg[6.f( empty local-lis/les=42/43 n=0 ec=42/32 lis/c=42/42 les/c/f=43/43/0 sis=44 pruub=11.310367584s) [4,3,5] r=1 lpr=44 pi=[42,44)/1 crt=0'0 mlcod 0'0 active pruub 1187.084716797s@ mbc={}] start_peering_interval up [4,5,3] -> [4,3,5], acting [4,5,3] -> [4,3,5], acting_primary 4 -> 4, up_primary 4 -> 4, role 2 -> 1, features acting 4540138322906710015 upacting 4540138322906710015
Dec 15 03:04:45 localhost ceph-osd[32311]: osd.3 pg_epoch: 44 pg[6.f( empty local-lis/les=42/43 n=0 ec=42/32 lis/c=42/42 les/c/f=43/43/0 sis=44 pruub=11.310334206s) [4,3,5] r=1 lpr=44 pi=[42,44)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1187.084716797s@ mbc={}] state: transitioning to Stray
Dec 15 03:04:45 localhost ceph-osd[32311]: osd.3 pg_epoch: 44 pg[3.a( empty local-lis/les=38/39 n=0 ec=38/22 lis/c=38/38 les/c/f=39/39/0 sis=44 pruub=15.166033745s) [5,1,3] r=2 lpr=44 pi=[38,44)/1 crt=0'0 mlcod 0'0 active pruub 1190.940551758s@ mbc={}] start_peering_interval up [3,2,1] -> [5,1,3], acting [3,2,1] -> [5,1,3], acting_primary 3 -> 5, up_primary 3 -> 5, role 0 -> 2, features acting 4540138322906710015 upacting 4540138322906710015
Dec 15 03:04:45 localhost ceph-osd[32311]: osd.3 pg_epoch: 44 pg[2.b( empty local-lis/les=37/38 n=0 ec=37/20 lis/c=37/37 les/c/f=38/38/0 sis=44 pruub=14.132564545s) [3,5,4] r=0 lpr=44 pi=[37,44)/1 crt=0'0 mlcod 0'0 active pruub 1189.907226562s@ mbc={}] start_peering_interval up [2,3,1] -> [3,5,4], acting [2,3,1] -> [3,5,4], acting_primary 2 -> 3, up_primary 2 -> 3, role 1 -> 0, features acting 4540138322906710015 upacting 4540138322906710015
Dec 15 03:04:45 localhost ceph-osd[32311]: osd.3 pg_epoch: 44 pg[5.d( empty local-lis/les=40/41 n=0 ec=40/26 lis/c=40/40 les/c/f=41/41/0 sis=44 pruub=9.207805634s) [3,5,4] r=0 lpr=44 pi=[40,44)/1 crt=0'0 mlcod 0'0 active pruub 1184.982543945s@ mbc={}] start_peering_interval up [3,4,2] -> [3,5,4], acting [3,4,2] -> [3,5,4], acting_primary 3 -> 3, up_primary 3 -> 3, role 0 -> 0, features acting 4540138322906710015 upacting 4540138322906710015
Dec 15 03:04:45 localhost ceph-osd[32311]: osd.3 pg_epoch: 44 pg[2.b( empty local-lis/les=37/38 n=0 ec=37/20 lis/c=37/37 les/c/f=38/38/0 sis=44 pruub=14.132564545s) [3,5,4] r=0 lpr=44 pi=[37,44)/1 crt=0'0 mlcod 0'0 unknown pruub 1189.907226562s@ mbc={}] state: transitioning to Primary
Dec 15 03:04:45 localhost ceph-osd[32311]: osd.3 pg_epoch: 44 pg[5.d( empty local-lis/les=40/41 n=0 ec=40/26 lis/c=40/40 les/c/f=41/41/0 sis=44 pruub=9.207805634s) [3,5,4] r=0 lpr=44 pi=[40,44)/1 crt=0'0 mlcod 0'0 unknown pruub 1184.982543945s@ mbc={}] state: transitioning to Primary
Dec 15 03:04:45 localhost ceph-osd[32311]: osd.3 pg_epoch: 44 pg[3.b( empty local-lis/les=38/39 n=0 ec=38/22 lis/c=38/38 les/c/f=39/39/0 sis=44 pruub=15.167176247s) [4,5,0] r=-1 lpr=44 pi=[38,44)/1 crt=0'0 mlcod 0'0 active pruub 1190.940917969s@ mbc={}] start_peering_interval up [3,2,1] -> [4,5,0], acting [3,2,1] -> [4,5,0], acting_primary 3 -> 4, up_primary 3 -> 4, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Dec 15 03:04:45 localhost ceph-osd[32311]: osd.3 pg_epoch: 44 pg[4.d( empty local-lis/les=40/41 n=0 ec=40/24 lis/c=40/40 les/c/f=41/41/0 sis=44 pruub=9.207342148s) [2,0,1] r=-1 lpr=44 pi=[40,44)/1 crt=0'0 mlcod 0'0 active pruub 1184.982177734s@ mbc={}] start_peering_interval up [4,5,3] -> [2,0,1], acting [4,5,3] -> [2,0,1], acting_primary 4 -> 2, up_primary 4 -> 2, role 2 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Dec 15 03:04:45 localhost ceph-osd[32311]: osd.3 pg_epoch: 44 pg[3.a( empty local-lis/les=38/39 n=0 ec=38/22 lis/c=38/38 les/c/f=39/39/0 sis=44 pruub=15.165976524s) [5,1,3] r=2 lpr=44 pi=[38,44)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1190.940551758s@ mbc={}] state: transitioning to Stray
Dec 15 03:04:45 localhost ceph-osd[32311]: osd.3 pg_epoch: 44 pg[4.c( empty local-lis/les=40/41 n=0 ec=40/24 lis/c=40/40 les/c/f=41/41/0 sis=44 pruub=9.208286285s) [5,4,0] r=-1 lpr=44 pi=[40,44)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1184.982177734s@ mbc={}] state: transitioning to Stray
Dec 15 03:04:45 localhost ceph-osd[32311]: osd.3 pg_epoch: 44 pg[4.d( empty local-lis/les=40/41 n=0 ec=40/24 lis/c=40/40 les/c/f=41/41/0 sis=44 pruub=9.207286835s) [2,0,1] r=-1 lpr=44 pi=[40,44)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1184.982177734s@ mbc={}] state: transitioning to Stray
Dec 15 03:04:45 localhost ceph-osd[32311]: osd.3 pg_epoch: 44 pg[3.b( empty local-lis/les=38/39 n=0 ec=38/22 lis/c=38/38 les/c/f=39/39/0 sis=44 pruub=15.165903091s) [4,5,0] r=-1 lpr=44 pi=[38,44)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1190.940917969s@ mbc={}] state: transitioning to Stray
Dec 15 03:04:45 localhost ceph-osd[32311]: osd.3 pg_epoch: 44 pg[5.c( empty local-lis/les=40/41 n=0 ec=40/26 lis/c=40/40 les/c/f=41/41/0 sis=44 pruub=9.207278252s) [1,2,3] r=2 lpr=44 pi=[40,44)/1 crt=0'0 mlcod 0'0 active pruub 1184.982543945s@ mbc={}] start_peering_interval up [3,4,2] -> [1,2,3], acting [3,4,2] -> [1,2,3], acting_primary 3 -> 1, up_primary 3 -> 1, role 0 -> 2, features acting 4540138322906710015 upacting 4540138322906710015
Dec 15 03:04:45 localhost ceph-osd[32311]: osd.3 pg_epoch: 44 pg[5.c( empty local-lis/les=40/41 n=0 ec=40/26 lis/c=40/40 les/c/f=41/41/0 sis=44 pruub=9.207225800s) [1,2,3] r=2 lpr=44 pi=[40,44)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1184.982543945s@ mbc={}] state: transitioning to Stray
Dec 15 03:04:45 localhost ceph-osd[32311]: osd.3 pg_epoch: 44 pg[6.8( empty local-lis/les=42/43 n=0 ec=42/32 lis/c=42/42 les/c/f=43/43/0 sis=44 pruub=11.313157082s) [2,3,4] r=1 lpr=44 pi=[42,44)/1 crt=0'0 mlcod 0'0 active pruub 1187.088378906s@ mbc={}] start_peering_interval up [4,5,3] -> [2,3,4], acting [4,5,3] -> [2,3,4], acting_primary 4 -> 2, up_primary 4 -> 2, role 2 -> 1, features acting 4540138322906710015 upacting 4540138322906710015
Dec 15 03:04:45 localhost ceph-osd[32311]: osd.3 pg_epoch: 44 pg[6.8( empty local-lis/les=42/43 n=0 ec=42/32 lis/c=42/42 les/c/f=43/43/0 sis=44 pruub=11.313119888s) [2,3,4] r=1 lpr=44 pi=[42,44)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1187.088378906s@ mbc={}] state: transitioning to Stray
Dec 15 03:04:45 localhost ceph-osd[32311]: osd.3 pg_epoch: 44 pg[3.d( empty local-lis/les=38/39 n=0 ec=38/22 lis/c=38/38 les/c/f=39/39/0 sis=44 pruub=15.160377502s) [5,3,4] r=1 lpr=44 pi=[38,44)/1 crt=0'0 mlcod 0'0 active pruub 1190.935791016s@ mbc={}] start_peering_interval up [3,2,1] -> [5,3,4], acting [3,2,1] -> [5,3,4], acting_primary 3 -> 5, up_primary 3 -> 5, role 0 -> 1, features acting 4540138322906710015 upacting 4540138322906710015
Dec 15 03:04:45 localhost ceph-osd[32311]: osd.3 pg_epoch: 44 pg[3.d( empty local-lis/les=38/39 n=0 ec=38/22 lis/c=38/38 les/c/f=39/39/0 sis=44 pruub=15.160347939s) [5,3,4] r=1 lpr=44 pi=[38,44)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1190.935791016s@ mbc={}] state: transitioning to Stray
Dec 15 03:04:45 localhost ceph-osd[32311]: osd.3 pg_epoch: 44 pg[2.c( empty local-lis/les=37/38 n=0 ec=37/20 lis/c=37/37 les/c/f=38/38/0 sis=44 pruub=14.131418228s) [3,1,5] r=0 lpr=44 pi=[37,44)/1 crt=0'0 mlcod 0'0 active pruub 1189.906982422s@ mbc={}] start_peering_interval up [2,3,1] -> [3,1,5], acting [2,3,1] -> [3,1,5], acting_primary 2 -> 3, up_primary 2 -> 3, role 1 -> 0, features acting 4540138322906710015 upacting 4540138322906710015
Dec 15 03:04:45 localhost ceph-osd[32311]: osd.3 pg_epoch: 44 pg[3.c( empty local-lis/les=38/39 n=0 ec=38/22 lis/c=38/38 les/c/f=39/39/0 sis=44 pruub=15.160145760s) [5,4,3] r=2 lpr=44 pi=[38,44)/1 crt=0'0 mlcod 0'0 active pruub 1190.936035156s@ mbc={}] start_peering_interval up [3,2,1] -> [5,4,3], acting [3,2,1] -> [5,4,3], acting_primary 3 -> 5, up_primary 3 -> 5, role 0 -> 2, features acting 4540138322906710015 upacting 4540138322906710015
Dec 15 03:04:45 localhost ceph-osd[32311]: osd.3 pg_epoch: 44 pg[6.9( empty local-lis/les=42/43 n=0 ec=42/32 lis/c=42/42 les/c/f=43/43/0 sis=44 pruub=11.311042786s) [4,3,5] r=1 lpr=44 pi=[42,44)/1 crt=0'0 mlcod 0'0 active pruub 1187.086914062s@ mbc={}] start_peering_interval up [4,5,3] -> [4,3,5], acting [4,5,3] -> [4,3,5], acting_primary 4 -> 4, up_primary 4 -> 4, role 2 -> 1, features acting 4540138322906710015 upacting 4540138322906710015
Dec 15 03:04:45 localhost ceph-osd[32311]: osd.3 pg_epoch: 44 pg[3.c( empty local-lis/les=38/39 n=0 ec=38/22 lis/c=38/38 les/c/f=39/39/0 sis=44 pruub=15.159887314s) [5,4,3] r=2 lpr=44 pi=[38,44)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1190.936035156s@ mbc={}] state: transitioning to Stray
Dec 15 03:04:45 localhost ceph-osd[32311]: osd.3 pg_epoch: 44 pg[6.9( empty local-lis/les=42/43 n=0 ec=42/32 lis/c=42/42 les/c/f=43/43/0 sis=44 pruub=11.310763359s) [4,3,5] r=1 lpr=44 pi=[42,44)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1187.086914062s@ mbc={}] state: transitioning to Stray
Dec 15 03:04:45 localhost ceph-osd[32311]: osd.3 pg_epoch: 44 pg[5.b( empty local-lis/les=40/41 n=0 ec=40/26 lis/c=40/40 les/c/f=41/41/0 sis=44 pruub=9.205998421s) [0,1,5] r=-1 lpr=44 pi=[40,44)/1 crt=0'0 mlcod 0'0 active pruub 1184.981933594s@ mbc={}] start_peering_interval up [3,4,2] -> [0,1,5], acting [3,4,2] -> [0,1,5], acting_primary 3 -> 0, up_primary 3 -> 0, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Dec 15 03:04:45 localhost ceph-osd[32311]: osd.3 pg_epoch: 44 pg[2.d( empty local-lis/les=37/38 n=0 ec=37/20 lis/c=37/37 les/c/f=38/38/0 sis=44 pruub=14.130382538s) [0,5,1] r=-1 lpr=44 pi=[37,44)/1 crt=0'0 mlcod 0'0 active pruub 1189.906738281s@ mbc={}] start_peering_interval up [2,3,1] -> [0,5,1], acting [2,3,1] -> [0,5,1], acting_primary 2 -> 0, up_primary 2 -> 0, role 1 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Dec 15 03:04:45 localhost ceph-osd[32311]: osd.3 pg_epoch: 44 pg[4.b( empty local-lis/les=40/41 n=0 ec=40/24 lis/c=40/40 les/c/f=41/41/0 sis=44 pruub=9.205700874s) [1,2,3] r=2 lpr=44 pi=[40,44)/1 crt=0'0 mlcod 0'0 active pruub 1184.982055664s@ mbc={}] start_peering_interval up [4,5,3] -> [1,2,3], acting [4,5,3] -> [1,2,3], acting_primary 4 -> 1, up_primary 4 -> 1, role 2 -> 2, features acting 4540138322906710015 upacting 4540138322906710015
Dec 15 03:04:45 localhost ceph-osd[32311]: osd.3 pg_epoch: 44 pg[4.b( empty local-lis/les=40/41 n=0 ec=40/24 lis/c=40/40 les/c/f=41/41/0 sis=44 pruub=9.205606461s) [1,2,3] r=2 lpr=44 pi=[40,44)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1184.982055664s@ mbc={}] state: transitioning to Stray
Dec 15 03:04:45 localhost ceph-osd[32311]: osd.3 pg_epoch: 44 pg[2.d( empty local-lis/les=37/38 n=0 ec=37/20 lis/c=37/37 les/c/f=38/38/0 sis=44 pruub=14.130258560s) [0,5,1] r=-1 lpr=44 pi=[37,44)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1189.906738281s@ mbc={}] state: transitioning to Stray
Dec 15 03:04:45 localhost ceph-osd[32311]: osd.3 pg_epoch: 44 pg[5.b( empty local-lis/les=40/41 n=0 ec=40/26 lis/c=40/40 les/c/f=41/41/0 sis=44 pruub=9.205246925s) [0,1,5] r=-1 lpr=44 pi=[40,44)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1184.981933594s@ mbc={}] state: transitioning to Stray
Dec 15 03:04:45 localhost ceph-osd[32311]: osd.3 pg_epoch: 44 pg[2.c( empty local-lis/les=37/38 n=0 ec=37/20 lis/c=37/37 les/c/f=38/38/0 sis=44 pruub=14.131418228s) [3,1,5] r=0 lpr=44 pi=[37,44)/1 crt=0'0 mlcod 0'0 unknown pruub 1189.906982422s@ mbc={}] state: transitioning to Primary
Dec 15 03:04:45 localhost ceph-osd[32311]: osd.3 pg_epoch: 44 pg[6.a( empty local-lis/les=42/43 n=0 ec=42/32 lis/c=42/42 les/c/f=43/43/0 sis=44 pruub=11.311461449s) [5,4,3] r=2 lpr=44 pi=[42,44)/1 crt=0'0 mlcod 0'0 active pruub 1187.088134766s@ mbc={}] start_peering_interval up [4,5,3] -> [5,4,3], acting [4,5,3] -> [5,4,3], acting_primary 4 -> 5, up_primary 4 -> 5, role 2 -> 2, features acting 4540138322906710015 upacting 4540138322906710015
Dec 15 03:04:45 localhost ceph-osd[32311]: osd.3 pg_epoch: 44 pg[6.a( empty local-lis/les=42/43 n=0 ec=42/32 lis/c=42/42 les/c/f=43/43/0 sis=44 pruub=11.311290741s) [5,4,3] r=2 lpr=44 pi=[42,44)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1187.088134766s@ mbc={}] state: transitioning to Stray
Dec 15 03:04:45 localhost ceph-osd[32311]: osd.3 pg_epoch: 44 pg[3.f( empty local-lis/les=38/39 n=0 ec=38/22 lis/c=38/38 les/c/f=39/39/0 sis=44 pruub=15.163398743s) [2,3,4] r=1 lpr=44 pi=[38,44)/1 crt=0'0 mlcod 0'0 active pruub 1190.940551758s@ mbc={}] start_peering_interval up [3,2,1] -> [2,3,4], acting [3,2,1] -> [2,3,4], acting_primary 3 -> 2, up_primary 3 -> 2, role 0 -> 1, features acting 4540138322906710015 upacting 4540138322906710015
Dec 15 03:04:45 localhost ceph-osd[32311]: osd.3 pg_epoch: 44 pg[3.f( empty local-lis/les=38/39 n=0 ec=38/22 lis/c=38/38 les/c/f=39/39/0 sis=44 pruub=15.163359642s) [2,3,4] r=1 lpr=44 pi=[38,44)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1190.940551758s@ mbc={}] state: transitioning to Stray
Dec 15 03:04:45 localhost ceph-osd[32311]: osd.3 pg_epoch: 44 pg[2.e( empty local-lis/les=37/38 n=0 ec=37/20 lis/c=37/37 les/c/f=38/38/0 sis=44 pruub=14.129570961s) [4,3,2] r=1 lpr=44 pi=[37,44)/1 crt=0'0 mlcod 0'0 active pruub 1189.906738281s@ mbc={}] start_peering_interval up [2,3,1] -> [4,3,2], acting [2,3,1] -> [4,3,2], acting_primary 2 -> 4, up_primary 2 -> 4, role 1 -> 1, features acting 4540138322906710015 upacting 4540138322906710015
Dec 15 03:04:45 localhost ceph-osd[32311]: osd.3 pg_epoch: 44 pg[2.e( empty local-lis/les=37/38 n=0 ec=37/20 lis/c=37/37 les/c/f=38/38/0 sis=44 pruub=14.129515648s) [4,3,2] r=1 lpr=44 pi=[37,44)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1189.906738281s@ mbc={}] state: transitioning to Stray
Dec 15 03:04:45 localhost ceph-osd[31375]: osd.0 pg_epoch: 44 pg[5.b( empty local-lis/les=0/0 n=0 ec=40/26 lis/c=40/40 les/c/f=41/41/0 sis=44) [0,1,5] r=0 lpr=44 pi=[40,44)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Primary
Dec 15 03:04:45 localhost ceph-osd[32311]: osd.3 pg_epoch: 44 pg[5.a( empty local-lis/les=40/41 n=0 ec=40/26 lis/c=40/40 les/c/f=41/41/0 sis=44 pruub=9.204446793s) [1,0,2] r=-1 lpr=44 pi=[40,44)/1 crt=0'0 mlcod 0'0 active pruub 1184.981933594s@ mbc={}] start_peering_interval up [3,4,2] -> [1,0,2], acting [3,4,2] -> [1,0,2], acting_primary 3 -> 1, up_primary 3 -> 1, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Dec 15 03:04:45 localhost ceph-osd[32311]: osd.3 pg_epoch: 44 pg[5.a( empty local-lis/les=40/41 n=0 ec=40/26 lis/c=40/40 les/c/f=41/41/0 sis=44 pruub=9.204398155s) [1,0,2] r=-1 lpr=44 pi=[40,44)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1184.981933594s@ mbc={}] state: transitioning to Stray
Dec 15 03:04:45 localhost ceph-osd[32311]: osd.3 pg_epoch: 44 pg[4.a( empty local-lis/les=40/41 n=0 ec=40/24 lis/c=40/40 les/c/f=41/41/0 sis=44 pruub=9.205801964s) [2,1,3] r=2 lpr=44 pi=[40,44)/1 crt=0'0 mlcod 0'0 active pruub 1184.981567383s@ mbc={}] start_peering_interval up [4,5,3] -> [2,1,3], acting [4,5,3] -> [2,1,3], acting_primary 4 -> 2, up_primary 4 -> 2, role 2 -> 2, features acting 4540138322906710015 upacting 4540138322906710015
Dec 15 03:04:45 localhost ceph-osd[32311]: osd.3 pg_epoch: 44 pg[5.9( empty local-lis/les=40/41 n=0 ec=40/26 lis/c=40/40 les/c/f=41/41/0 sis=44 pruub=9.203666687s) [5,0,1] r=-1 lpr=44 pi=[40,44)/1 crt=0'0 mlcod 0'0 active pruub 1184.981323242s@ mbc={}] start_peering_interval up [3,4,2] -> [5,0,1], acting [3,4,2] -> [5,0,1], acting_primary 3 -> 5, up_primary 3 -> 5, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Dec 15 03:04:45 localhost ceph-osd[32311]: osd.3 pg_epoch: 44 pg[4.a( empty local-lis/les=40/41 n=0 ec=40/24 lis/c=40/40 les/c/f=41/41/0 sis=44 pruub=9.203855515s) [2,1,3] r=2 lpr=44 pi=[40,44)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1184.981567383s@ mbc={}] state: transitioning to Stray
Dec 15 03:04:45 localhost ceph-osd[32311]: osd.3 pg_epoch: 44 pg[5.9( empty local-lis/les=40/41 n=0 ec=40/26 lis/c=40/40 les/c/f=41/41/0 sis=44 pruub=9.203570366s) [5,0,1] r=-1 lpr=44 pi=[40,44)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1184.981323242s@ mbc={}] state: transitioning to Stray
Dec 15 03:04:45 localhost ceph-osd[32311]: osd.3 pg_epoch: 44 pg[3.e( empty local-lis/les=38/39 n=0 ec=38/22 lis/c=38/38 les/c/f=39/39/0 sis=44 pruub=15.158180237s) [0,5,1] r=-1 lpr=44 pi=[38,44)/1 crt=0'0 mlcod 0'0 active pruub 1190.935913086s@ mbc={}] start_peering_interval up [3,2,1] -> [0,5,1], acting [3,2,1] -> [0,5,1], acting_primary 3 -> 0, up_primary 3 -> 0, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Dec 15 03:04:45 localhost ceph-osd[32311]: osd.3 pg_epoch: 44 pg[3.e( empty local-lis/les=38/39 n=0 ec=38/22 lis/c=38/38 les/c/f=39/39/0 sis=44 pruub=15.158140182s) [0,5,1] r=-1 lpr=44 pi=[38,44)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1190.935913086s@ mbc={}] state: transitioning to Stray
Dec 15 03:04:45 localhost ceph-osd[32311]: osd.3 pg_epoch: 44 pg[2.f( empty local-lis/les=37/38 n=0 ec=37/20 lis/c=37/37 les/c/f=38/38/0 sis=44 pruub=14.129042625s) [0,2,4] r=-1 lpr=44 pi=[37,44)/1 crt=0'0 mlcod 0'0 active pruub 1189.906982422s@ mbc={}] start_peering_interval up [2,3,1] -> [0,2,4], acting [2,3,1] -> [0,2,4], acting_primary 2 -> 0, up_primary 2 -> 0, role 1 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Dec 15 03:04:45 localhost ceph-osd[32311]: osd.3 pg_epoch: 44 pg[6.b( empty local-lis/les=42/43 n=0 ec=42/32 lis/c=42/42 les/c/f=43/43/0 sis=44 pruub=11.309571266s) [1,2,0] r=-1 lpr=44 pi=[42,44)/1 crt=0'0 mlcod 0'0 active pruub 1187.087646484s@ mbc={}] start_peering_interval up [4,5,3] -> [1,2,0], acting [4,5,3] -> [1,2,0], acting_primary 4 -> 1, up_primary 4 -> 1, role 2 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Dec 15 03:04:45 localhost ceph-osd[32311]: osd.3 pg_epoch: 44 pg[2.f( empty local-lis/les=37/38 n=0 ec=37/20 lis/c=37/37 les/c/f=38/38/0 sis=44 pruub=14.129009247s) [0,2,4] r=-1 lpr=44 pi=[37,44)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1189.906982422s@ mbc={}] state: transitioning to Stray
Dec 15 03:04:45 localhost ceph-osd[32311]: osd.3 pg_epoch: 44 pg[6.b( empty local-lis/les=42/43 n=0 ec=42/32 lis/c=42/42 les/c/f=43/43/0 sis=44 pruub=11.309528351s) [1,2,0] r=-1 lpr=44 pi=[42,44)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1187.087646484s@ mbc={}] state: transitioning to Stray
Dec 15 03:04:45 localhost ceph-osd[32311]: osd.3 pg_epoch: 44 pg[6.14( empty local-lis/les=42/43 n=0 ec=42/32 lis/c=42/42 les/c/f=43/43/0 sis=44 pruub=11.302661896s) [4,5,0] r=-1 lpr=44 pi=[42,44)/1 crt=0'0 mlcod 0'0 active pruub 1187.080932617s@ mbc={}] start_peering_interval up [4,5,3] -> [4,5,0], acting [4,5,3] -> [4,5,0], acting_primary 4 -> 4, up_primary 4 -> 4, role 2 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Dec 15 03:04:45 localhost ceph-osd[32311]: osd.3 pg_epoch: 44 pg[3.11( empty local-lis/les=38/39 n=0 ec=38/22 lis/c=38/38 les/c/f=39/39/0 sis=44 pruub=15.157587051s) [3,5,4] r=0 lpr=44 pi=[38,44)/1 crt=0'0 mlcod 0'0 active pruub 1190.935913086s@ mbc={}] start_peering_interval up [3,2,1] -> [3,5,4], acting [3,2,1] -> [3,5,4], acting_primary 3 -> 3, up_primary 3 -> 3, role 0 -> 0, features acting 4540138322906710015 upacting 4540138322906710015
Dec 15 03:04:45 localhost ceph-osd[32311]: osd.3 pg_epoch: 44 pg[6.14( empty local-lis/les=42/43 n=0 ec=42/32 lis/c=42/42 les/c/f=43/43/0 sis=44 pruub=11.302634239s) [4,5,0] r=-1 lpr=44 pi=[42,44)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1187.080932617s@ mbc={}] state: transitioning to Stray
Dec 15 03:04:45 localhost ceph-osd[32311]: osd.3 pg_epoch: 44 pg[3.11( empty local-lis/les=38/39 n=0 ec=38/22 lis/c=38/38 les/c/f=39/39/0 sis=44 pruub=15.157587051s) [3,5,4] r=0 lpr=44 pi=[38,44)/1 crt=0'0 mlcod 0'0 unknown pruub 1190.935913086s@ mbc={}] state: transitioning to Primary
Dec 15 03:04:45 localhost ceph-osd[32311]: osd.3 pg_epoch: 44 pg[2.10( empty local-lis/les=37/38 n=0 ec=37/20 lis/c=37/37 les/c/f=38/38/0 sis=44 pruub=14.128520012s) [3,1,5] r=0 lpr=44 pi=[37,44)/1 crt=0'0 mlcod 0'0 active pruub 1189.906982422s@ mbc={}] start_peering_interval up [2,3,1] -> [3,1,5], acting [2,3,1] -> [3,1,5], acting_primary 2 -> 3, up_primary 2 -> 3, role 1 -> 0, features acting 4540138322906710015 upacting 4540138322906710015
Dec 15 03:04:45 localhost ceph-osd[32311]: osd.3 pg_epoch: 44 pg[4.16( empty local-lis/les=40/41 n=0 ec=40/24 lis/c=40/40 les/c/f=41/41/0 sis=44 pruub=9.203870773s) [4,2,0] r=-1 lpr=44 pi=[40,44)/1 crt=0'0 mlcod 0'0 active pruub 1184.982421875s@ mbc={}] start_peering_interval up [4,5,3] -> [4,2,0], acting [4,5,3] -> [4,2,0], acting_primary 4 -> 4, up_primary 4 -> 4, role 2 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Dec 15 03:04:45 localhost ceph-osd[32311]: osd.3 pg_epoch: 44 pg[2.10( empty local-lis/les=37/38 n=0 ec=37/20 lis/c=37/37 les/c/f=38/38/0 sis=44 pruub=14.128520012s) [3,1,5] r=0 lpr=44 pi=[37,44)/1 crt=0'0 mlcod 0'0 unknown pruub 1189.906982422s@ mbc={}] state: transitioning to Primary
Dec 15 03:04:45 localhost ceph-osd[32311]: osd.3 pg_epoch: 44 pg[4.16( empty local-lis/les=40/41 n=0 ec=40/24 lis/c=40/40 les/c/f=41/41/0 sis=44 pruub=9.203843117s) [4,2,0] r=-1 lpr=44 pi=[40,44)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1184.982421875s@ mbc={}] state: transitioning to Stray
Dec 15 03:04:45 localhost ceph-osd[32311]: osd.3 pg_epoch: 44 pg[5.17( empty local-lis/les=40/41 n=0 ec=40/26 lis/c=40/40 les/c/f=41/41/0 sis=44 pruub=9.204018593s) [4,3,5] r=1 lpr=44 pi=[40,44)/1 crt=0'0 mlcod 0'0 active pruub 1184.982788086s@ mbc={}] start_peering_interval up [3,4,2] -> [4,3,5], acting [3,4,2] -> [4,3,5], acting_primary 3 -> 4, up_primary 3 -> 4, role 0 -> 1, features acting 4540138322906710015 upacting 4540138322906710015
Dec 15 03:04:45 localhost ceph-osd[32311]: osd.3 pg_epoch: 44 pg[5.17( empty local-lis/les=40/41 n=0 ec=40/26 lis/c=40/40 les/c/f=41/41/0 sis=44 pruub=9.203993797s) [4,3,5] r=1 lpr=44 pi=[40,44)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1184.982788086s@ mbc={}] state: transitioning to Stray
Dec 15 03:04:45 localhost ceph-osd[32311]: osd.3 pg_epoch: 44 pg[6.15( empty local-lis/les=42/43 n=0 ec=42/32 lis/c=42/42 les/c/f=43/43/0 sis=44 pruub=11.302441597s) [2,3,1] r=1 lpr=44 pi=[42,44)/1 crt=0'0 mlcod 0'0 active pruub 1187.081176758s@ mbc={}] start_peering_interval up [4,5,3] -> [2,3,1], acting [4,5,3] -> [2,3,1], acting_primary 4 -> 2, up_primary 4 -> 2, role 2 -> 1, features acting 4540138322906710015 upacting 4540138322906710015
Dec 15 03:04:45 localhost ceph-osd[32311]: osd.3 pg_epoch: 44 pg[6.15( empty local-lis/les=42/43 n=0 ec=42/32 lis/c=42/42 les/c/f=43/43/0 sis=44 pruub=11.302406311s) [2,3,1] r=1 lpr=44 pi=[42,44)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1187.081176758s@ mbc={}] state: transitioning to Stray
Dec 15 03:04:45 localhost ceph-osd[32311]: osd.3 pg_epoch: 44 pg[3.10( empty local-lis/les=38/39 n=0 ec=38/22 lis/c=38/38 les/c/f=39/39/0 sis=44 pruub=15.156974792s) [5,0,4] r=-1 lpr=44 pi=[38,44)/1 crt=0'0 mlcod 0'0 active pruub 1190.935913086s@ mbc={}] start_peering_interval up [3,2,1] -> [5,0,4], acting [3,2,1] -> [5,0,4], acting_primary 3 -> 5, up_primary 3 -> 5, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Dec 15 03:04:45 localhost ceph-osd[32311]: osd.3 pg_epoch: 44 pg[3.10( empty local-lis/les=38/39 n=0 ec=38/22 lis/c=38/38 les/c/f=39/39/0 sis=44 pruub=15.156931877s) [5,0,4] r=-1 lpr=44 pi=[38,44)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1190.935913086s@ mbc={}] state: transitioning to Stray
Dec 15 03:04:45 localhost ceph-osd[32311]: osd.3 pg_epoch: 44 pg[4.8( empty local-lis/les=40/41 n=0 ec=40/24 lis/c=40/40 les/c/f=41/41/0 sis=44 pruub=9.196644783s) [3,2,4] r=0 lpr=44 pi=[40,44)/1 crt=0'0 mlcod 0'0 active pruub 1184.975708008s@ mbc={}] start_peering_interval up [4,5,3] -> [3,2,4], acting [4,5,3] -> [3,2,4], acting_primary 4 -> 3, up_primary 4 -> 3, role 2 -> 0, features acting 4540138322906710015 upacting 4540138322906710015
Dec 15 03:04:45 localhost ceph-osd[32311]: osd.3 pg_epoch: 44 pg[2.11( empty local-lis/les=37/38 n=0 ec=37/20 lis/c=37/37 les/c/f=38/38/0 sis=44 pruub=14.127783775s) [5,1,3] r=2 lpr=44 pi=[37,44)/1 crt=0'0 mlcod 0'0 active pruub 1189.907104492s@ mbc={}] start_peering_interval up [2,3,1] -> [5,1,3], acting [2,3,1] -> [5,1,3], acting_primary 2 -> 5, up_primary 2 -> 5, role 1 -> 2, features acting 4540138322906710015 upacting 4540138322906710015
Dec 15 03:04:45 localhost ceph-osd[32311]: osd.3 pg_epoch: 44 pg[4.17( empty local-lis/les=40/41 n=0 ec=40/24 lis/c=40/40 les/c/f=41/41/0 sis=44 pruub=9.203273773s) [1,2,3] r=2 lpr=44 pi=[40,44)/1 crt=0'0 mlcod 0'0 active pruub 1184.982666016s@ mbc={}] start_peering_interval up [4,5,3] -> [1,2,3], acting [4,5,3] -> [1,2,3], acting_primary 4 -> 1, up_primary 4 -> 1, role 2 -> 2, features acting 4540138322906710015 upacting 4540138322906710015
Dec 15 03:04:45 localhost ceph-osd[32311]: osd.3 pg_epoch: 44 pg[4.8( empty local-lis/les=40/41 n=0 ec=40/24 lis/c=40/40 les/c/f=41/41/0 sis=44 pruub=9.196644783s) [3,2,4] r=0 lpr=44 pi=[40,44)/1 crt=0'0 mlcod 0'0 unknown pruub 1184.975708008s@ mbc={}] state: transitioning to Primary
Dec 15 03:04:45 localhost ceph-osd[32311]: osd.3 pg_epoch: 44 pg[4.17( empty local-lis/les=40/41 n=0 ec=40/24 lis/c=40/40 les/c/f=41/41/0 sis=44 pruub=9.203221321s) [1,2,3] r=2 lpr=44 pi=[40,44)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1184.982666016s@ mbc={}] state: transitioning to Stray
Dec 15 03:04:45 localhost ceph-osd[32311]: osd.3 pg_epoch: 44 pg[5.8( empty local-lis/les=40/41 n=0 ec=40/26 lis/c=40/40 les/c/f=41/41/0 sis=44 pruub=9.202423096s) [0,4,5] r=-1 lpr=44 pi=[40,44)/1 crt=0'0 mlcod 0'0 active pruub 1184.981933594s@ mbc={}] start_peering_interval up [3,4,2] -> [0,4,5], acting [3,4,2] -> [0,4,5], acting_primary 3 -> 0, up_primary 3 -> 0, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Dec 15 03:04:45 localhost ceph-osd[32311]: osd.3 pg_epoch: 44 pg[5.16( empty local-lis/les=40/41 n=0 ec=40/26 lis/c=40/40 les/c/f=41/41/0 sis=44 pruub=9.203338623s) [5,1,3] r=2 lpr=44 pi=[40,44)/1 crt=0'0 mlcod 0'0 active pruub 1184.982910156s@ mbc={}] start_peering_interval up [3,4,2] -> [5,1,3], acting [3,4,2] -> [5,1,3], acting_primary 3 -> 5, up_primary 3 -> 5, role 0 -> 2, features acting 4540138322906710015 upacting 4540138322906710015
Dec 15 03:04:45 localhost ceph-osd[32311]: osd.3 pg_epoch: 44 pg[5.8( empty local-lis/les=40/41 n=0 ec=40/26 lis/c=40/40 les/c/f=41/41/0 sis=44 pruub=9.202390671s) [0,4,5] r=-1 lpr=44 pi=[40,44)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1184.981933594s@ mbc={}] state: transitioning to Stray
Dec 15 03:04:45 localhost ceph-osd[32311]: osd.3 pg_epoch: 44 pg[5.16( empty local-lis/les=40/41 n=0 ec=40/26 lis/c=40/40 les/c/f=41/41/0 sis=44 pruub=9.203260422s) [5,1,3] r=2 lpr=44 pi=[40,44)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1184.982910156s@ mbc={}] state: transitioning to Stray
Dec 15 03:04:45 localhost ceph-osd[32311]: osd.3 pg_epoch: 44 pg[6.16( empty local-lis/les=42/43 n=0 ec=42/32 lis/c=42/42 les/c/f=43/43/0 sis=44 pruub=11.300722122s) [4,5,0] r=-1 lpr=44 pi=[42,44)/1 crt=0'0 mlcod 0'0 active pruub 1187.080444336s@ mbc={}] start_peering_interval up [4,5,3] -> [4,5,0], acting [4,5,3] -> [4,5,0], acting_primary 4 -> 4, up_primary 4 -> 4, role 2 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Dec 15 03:04:45 localhost ceph-osd[32311]: osd.3 pg_epoch: 44 pg[3.13( empty local-lis/les=38/39 n=0 ec=38/22 lis/c=38/38 les/c/f=39/39/0 sis=44 pruub=15.156126976s) [2,4,3] r=2 lpr=44 pi=[38,44)/1 crt=0'0 mlcod 0'0 active pruub 1190.936035156s@ mbc={}] start_peering_interval up [3,2,1] -> [2,4,3], acting [3,2,1] -> [2,4,3], acting_primary 3 -> 2, up_primary 3 -> 2, role 0 -> 2, features acting 4540138322906710015 upacting 4540138322906710015
Dec 15 03:04:45 localhost ceph-osd[32311]: osd.3 pg_epoch: 44 pg[2.12( empty local-lis/les=37/38 n=0 ec=37/20 lis/c=37/37 les/c/f=38/38/0 sis=44 pruub=14.127170563s) [3,1,2] r=0 lpr=44 pi=[37,44)/1 crt=0'0 mlcod 0'0 active pruub 1189.906982422s@ mbc={}] start_peering_interval up [2,3,1] -> [3,1,2], acting [2,3,1] -> [3,1,2], acting_primary 2 -> 3, up_primary 2 -> 3, role 1 -> 0, features acting 4540138322906710015 upacting 4540138322906710015
Dec 15 03:04:45 localhost ceph-osd[32311]: osd.3 pg_epoch: 44 pg[2.12( empty local-lis/les=37/38 n=0 ec=37/20 lis/c=37/37 les/c/f=38/38/0 sis=44 pruub=14.127170563s) [3,1,2] r=0 lpr=44 pi=[37,44)/1 crt=0'0 mlcod 0'0 unknown pruub 1189.906982422s@ mbc={}] state: transitioning to Primary
Dec 15 03:04:45 localhost ceph-osd[32311]: osd.3 pg_epoch: 44 pg[3.13( empty local-lis/les=38/39 n=0 ec=38/22 lis/c=38/38 les/c/f=39/39/0 sis=44 pruub=15.156070709s) [2,4,3] r=2 lpr=44 pi=[38,44)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1190.936035156s@ mbc={}] state: transitioning to Stray
Dec 15 03:04:45 localhost ceph-osd[32311]: osd.3 pg_epoch: 44 pg[4.14( empty local-lis/les=40/41 n=0 ec=40/24 lis/c=40/40 les/c/f=41/41/0 sis=44 pruub=9.202492714s) [3,1,5] r=0 lpr=44 pi=[40,44)/1 crt=0'0 mlcod 0'0 active pruub 1184.982543945s@ mbc={}] start_peering_interval up [4,5,3] -> [3,1,5], acting [4,5,3] -> [3,1,5], acting_primary 4 -> 3, up_primary 4 -> 3, role 2 -> 0, features acting 4540138322906710015 upacting 4540138322906710015
Dec 15 03:04:45 localhost ceph-osd[32311]: osd.3 pg_epoch: 44 pg[4.14( empty local-lis/les=40/41 n=0 ec=40/24 lis/c=40/40 les/c/f=41/41/0 sis=44 pruub=9.202492714s) [3,1,5] r=0 lpr=44 pi=[40,44)/1 crt=0'0 mlcod 0'0 unknown pruub 1184.982543945s@ mbc={}] state: transitioning to Primary
Dec 15 03:04:45 localhost ceph-osd[31375]: osd.0 pg_epoch: 44 pg[5.8( empty local-lis/les=0/0 n=0 ec=40/26 lis/c=40/40 les/c/f=41/41/0 sis=44) [0,4,5] r=0 lpr=44 pi=[40,44)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state:
transitioning to Primary Dec 15 03:04:45 localhost ceph-osd[32311]: osd.3 pg_epoch: 44 pg[6.16( empty local-lis/les=42/43 n=0 ec=42/32 lis/c=42/42 les/c/f=43/43/0 sis=44 pruub=11.300632477s) [4,5,0] r=-1 lpr=44 pi=[42,44)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1187.080444336s@ mbc={}] state: transitioning to Stray Dec 15 03:04:45 localhost ceph-osd[32311]: osd.3 pg_epoch: 44 pg[2.11( empty local-lis/les=37/38 n=0 ec=37/20 lis/c=37/37 les/c/f=38/38/0 sis=44 pruub=14.127741814s) [5,1,3] r=2 lpr=44 pi=[37,44)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1189.907104492s@ mbc={}] state: transitioning to Stray Dec 15 03:04:45 localhost ceph-osd[32311]: osd.3 pg_epoch: 44 pg[6.17( empty local-lis/les=42/43 n=0 ec=42/32 lis/c=42/42 les/c/f=43/43/0 sis=44 pruub=11.300384521s) [3,4,2] r=0 lpr=44 pi=[42,44)/1 crt=0'0 mlcod 0'0 active pruub 1187.080688477s@ mbc={}] start_peering_interval up [4,5,3] -> [3,4,2], acting [4,5,3] -> [3,4,2], acting_primary 4 -> 3, up_primary 4 -> 3, role 2 -> 0, features acting 4540138322906710015 upacting 4540138322906710015 Dec 15 03:04:45 localhost ceph-osd[32311]: osd.3 pg_epoch: 44 pg[6.17( empty local-lis/les=42/43 n=0 ec=42/32 lis/c=42/42 les/c/f=43/43/0 sis=44 pruub=11.300384521s) [3,4,2] r=0 lpr=44 pi=[42,44)/1 crt=0'0 mlcod 0'0 unknown pruub 1187.080688477s@ mbc={}] state: transitioning to Primary Dec 15 03:04:45 localhost ceph-osd[32311]: osd.3 pg_epoch: 44 pg[2.13( empty local-lis/les=37/38 n=0 ec=37/20 lis/c=37/37 les/c/f=38/38/0 sis=44 pruub=14.126435280s) [0,5,4] r=-1 lpr=44 pi=[37,44)/1 crt=0'0 mlcod 0'0 active pruub 1189.906982422s@ mbc={}] start_peering_interval up [2,3,1] -> [0,5,4], acting [2,3,1] -> [0,5,4], acting_primary 2 -> 0, up_primary 2 -> 0, role 1 -> -1, features acting 4540138322906710015 upacting 4540138322906710015 Dec 15 03:04:45 localhost ceph-osd[32311]: osd.3 pg_epoch: 44 pg[2.13( empty local-lis/les=37/38 n=0 ec=37/20 lis/c=37/37 les/c/f=38/38/0 sis=44 pruub=14.126409531s) [0,5,4] r=-1 lpr=44 pi=[37,44)/1 crt=0'0 
mlcod 0'0 unknown NOTIFY pruub 1189.906982422s@ mbc={}] state: transitioning to Stray Dec 15 03:04:45 localhost ceph-osd[32311]: osd.3 pg_epoch: 44 pg[5.15( empty local-lis/les=40/41 n=0 ec=40/26 lis/c=40/40 les/c/f=41/41/0 sis=44 pruub=9.208857536s) [5,4,0] r=-1 lpr=44 pi=[40,44)/1 crt=0'0 mlcod 0'0 active pruub 1184.989868164s@ mbc={}] start_peering_interval up [3,4,2] -> [5,4,0], acting [3,4,2] -> [5,4,0], acting_primary 3 -> 5, up_primary 3 -> 5, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015 Dec 15 03:04:45 localhost ceph-osd[32311]: osd.3 pg_epoch: 44 pg[5.14( empty local-lis/les=40/41 n=0 ec=40/26 lis/c=40/40 les/c/f=41/41/0 sis=44 pruub=9.201787949s) [4,3,5] r=1 lpr=44 pi=[40,44)/1 crt=0'0 mlcod 0'0 active pruub 1184.982788086s@ mbc={}] start_peering_interval up [3,4,2] -> [4,3,5], acting [3,4,2] -> [4,3,5], acting_primary 3 -> 4, up_primary 3 -> 4, role 0 -> 1, features acting 4540138322906710015 upacting 4540138322906710015 Dec 15 03:04:45 localhost ceph-osd[32311]: osd.3 pg_epoch: 44 pg[5.14( empty local-lis/les=40/41 n=0 ec=40/26 lis/c=40/40 les/c/f=41/41/0 sis=44 pruub=9.201762199s) [4,3,5] r=1 lpr=44 pi=[40,44)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1184.982788086s@ mbc={}] state: transitioning to Stray Dec 15 03:04:45 localhost ceph-osd[32311]: osd.3 pg_epoch: 44 pg[5.15( empty local-lis/les=40/41 n=0 ec=40/26 lis/c=40/40 les/c/f=41/41/0 sis=44 pruub=9.208799362s) [5,4,0] r=-1 lpr=44 pi=[40,44)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1184.989868164s@ mbc={}] state: transitioning to Stray Dec 15 03:04:45 localhost ceph-osd[32311]: osd.3 pg_epoch: 44 pg[4.9( empty local-lis/les=40/41 n=0 ec=40/24 lis/c=40/40 les/c/f=41/41/0 sis=44 pruub=9.196638107s) [0,1,2] r=-1 lpr=44 pi=[40,44)/1 crt=0'0 mlcod 0'0 active pruub 1184.977783203s@ mbc={}] start_peering_interval up [4,5,3] -> [0,1,2], acting [4,5,3] -> [0,1,2], acting_primary 4 -> 0, up_primary 4 -> 0, role 2 -> -1, features acting 4540138322906710015 upacting 
4540138322906710015 Dec 15 03:04:45 localhost ceph-osd[32311]: osd.3 pg_epoch: 44 pg[6.10( empty local-lis/les=42/43 n=0 ec=42/32 lis/c=42/42 les/c/f=43/43/0 sis=44 pruub=11.300108910s) [4,0,2] r=-1 lpr=44 pi=[42,44)/1 crt=0'0 mlcod 0'0 active pruub 1187.081176758s@ mbc={}] start_peering_interval up [4,5,3] -> [4,0,2], acting [4,5,3] -> [4,0,2], acting_primary 4 -> 4, up_primary 4 -> 4, role 2 -> -1, features acting 4540138322906710015 upacting 4540138322906710015 Dec 15 03:04:45 localhost ceph-osd[32311]: osd.3 pg_epoch: 44 pg[6.10( empty local-lis/les=42/43 n=0 ec=42/32 lis/c=42/42 les/c/f=43/43/0 sis=44 pruub=11.300055504s) [4,0,2] r=-1 lpr=44 pi=[42,44)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1187.081176758s@ mbc={}] state: transitioning to Stray Dec 15 03:04:45 localhost ceph-osd[32311]: osd.3 pg_epoch: 44 pg[4.9( empty local-lis/les=40/41 n=0 ec=40/24 lis/c=40/40 les/c/f=41/41/0 sis=44 pruub=9.196580887s) [0,1,2] r=-1 lpr=44 pi=[40,44)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1184.977783203s@ mbc={}] state: transitioning to Stray Dec 15 03:04:45 localhost ceph-osd[31375]: osd.0 pg_epoch: 44 pg[4.9( empty local-lis/les=0/0 n=0 ec=40/24 lis/c=40/40 les/c/f=41/41/0 sis=44) [0,1,2] r=0 lpr=44 pi=[40,44)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Primary Dec 15 03:04:45 localhost ceph-osd[32311]: osd.3 pg_epoch: 44 pg[3.15( empty local-lis/les=38/39 n=0 ec=38/22 lis/c=38/38 les/c/f=39/39/0 sis=44 pruub=15.155231476s) [0,2,4] r=-1 lpr=44 pi=[38,44)/1 crt=0'0 mlcod 0'0 active pruub 1190.936523438s@ mbc={}] start_peering_interval up [3,2,1] -> [0,2,4], acting [3,2,1] -> [0,2,4], acting_primary 3 -> 0, up_primary 3 -> 0, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015 Dec 15 03:04:45 localhost ceph-osd[32311]: osd.3 pg_epoch: 44 pg[4.12( empty local-lis/les=40/41 n=0 ec=40/24 lis/c=40/40 les/c/f=41/41/0 sis=44 pruub=9.201380730s) [1,3,2] r=1 lpr=44 pi=[40,44)/1 crt=0'0 mlcod 0'0 active pruub 1184.982788086s@ mbc={}] 
start_peering_interval up [4,5,3] -> [1,3,2], acting [4,5,3] -> [1,3,2], acting_primary 4 -> 1, up_primary 4 -> 1, role 2 -> 1, features acting 4540138322906710015 upacting 4540138322906710015 Dec 15 03:04:45 localhost ceph-osd[32311]: osd.3 pg_epoch: 44 pg[3.15( empty local-lis/les=38/39 n=0 ec=38/22 lis/c=38/38 les/c/f=39/39/0 sis=44 pruub=15.155195236s) [0,2,4] r=-1 lpr=44 pi=[38,44)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1190.936523438s@ mbc={}] state: transitioning to Stray Dec 15 03:04:45 localhost ceph-osd[32311]: osd.3 pg_epoch: 44 pg[5.13( empty local-lis/les=40/41 n=0 ec=40/26 lis/c=40/40 les/c/f=41/41/0 sis=44 pruub=9.208219528s) [3,1,5] r=0 lpr=44 pi=[40,44)/1 crt=0'0 mlcod 0'0 active pruub 1184.989746094s@ mbc={}] start_peering_interval up [3,4,2] -> [3,1,5], acting [3,4,2] -> [3,1,5], acting_primary 3 -> 3, up_primary 3 -> 3, role 0 -> 0, features acting 4540138322906710015 upacting 4540138322906710015 Dec 15 03:04:45 localhost ceph-osd[32311]: osd.3 pg_epoch: 44 pg[4.12( empty local-lis/les=40/41 n=0 ec=40/24 lis/c=40/40 les/c/f=41/41/0 sis=44 pruub=9.201348305s) [1,3,2] r=1 lpr=44 pi=[40,44)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1184.982788086s@ mbc={}] state: transitioning to Stray Dec 15 03:04:45 localhost ceph-osd[32311]: osd.3 pg_epoch: 44 pg[5.13( empty local-lis/les=40/41 n=0 ec=40/26 lis/c=40/40 les/c/f=41/41/0 sis=44 pruub=9.208219528s) [3,1,5] r=0 lpr=44 pi=[40,44)/1 crt=0'0 mlcod 0'0 unknown pruub 1184.989746094s@ mbc={}] state: transitioning to Primary Dec 15 03:04:45 localhost ceph-osd[32311]: osd.3 pg_epoch: 44 pg[6.11( empty local-lis/les=42/43 n=0 ec=42/32 lis/c=42/42 les/c/f=43/43/0 sis=44 pruub=11.299634933s) [4,3,2] r=1 lpr=44 pi=[42,44)/1 crt=0'0 mlcod 0'0 active pruub 1187.081176758s@ mbc={}] start_peering_interval up [4,5,3] -> [4,3,2], acting [4,5,3] -> [4,3,2], acting_primary 4 -> 4, up_primary 4 -> 4, role 2 -> 1, features acting 4540138322906710015 upacting 4540138322906710015 Dec 15 03:04:45 localhost ceph-osd[32311]: 
osd.3 pg_epoch: 44 pg[6.11( empty local-lis/les=42/43 n=0 ec=42/32 lis/c=42/42 les/c/f=43/43/0 sis=44 pruub=11.299606323s) [4,3,2] r=1 lpr=44 pi=[42,44)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1187.081176758s@ mbc={}] state: transitioning to Stray Dec 15 03:04:45 localhost ceph-osd[32311]: osd.3 pg_epoch: 44 pg[3.14( empty local-lis/les=38/39 n=0 ec=38/22 lis/c=38/38 les/c/f=39/39/0 sis=44 pruub=15.154173851s) [2,3,4] r=1 lpr=44 pi=[38,44)/1 crt=0'0 mlcod 0'0 active pruub 1190.935791016s@ mbc={}] start_peering_interval up [3,2,1] -> [2,3,4], acting [3,2,1] -> [2,3,4], acting_primary 3 -> 2, up_primary 3 -> 2, role 0 -> 1, features acting 4540138322906710015 upacting 4540138322906710015 Dec 15 03:04:45 localhost ceph-osd[32311]: osd.3 pg_epoch: 44 pg[2.15( empty local-lis/les=37/38 n=0 ec=37/20 lis/c=37/37 les/c/f=38/38/0 sis=44 pruub=14.125579834s) [0,1,2] r=-1 lpr=44 pi=[37,44)/1 crt=0'0 mlcod 0'0 active pruub 1189.907226562s@ mbc={}] start_peering_interval up [2,3,1] -> [0,1,2], acting [2,3,1] -> [0,1,2], acting_primary 2 -> 0, up_primary 2 -> 0, role 1 -> -1, features acting 4540138322906710015 upacting 4540138322906710015 Dec 15 03:04:45 localhost ceph-osd[32311]: osd.3 pg_epoch: 44 pg[3.14( empty local-lis/les=38/39 n=0 ec=38/22 lis/c=38/38 les/c/f=39/39/0 sis=44 pruub=15.154090881s) [2,3,4] r=1 lpr=44 pi=[38,44)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1190.935791016s@ mbc={}] state: transitioning to Stray Dec 15 03:04:45 localhost ceph-osd[32311]: osd.3 pg_epoch: 44 pg[2.15( empty local-lis/les=37/38 n=0 ec=37/20 lis/c=37/37 les/c/f=38/38/0 sis=44 pruub=14.125550270s) [0,1,2] r=-1 lpr=44 pi=[37,44)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1189.907226562s@ mbc={}] state: transitioning to Stray Dec 15 03:04:45 localhost ceph-osd[32311]: osd.3 pg_epoch: 44 pg[4.13( empty local-lis/les=40/41 n=0 ec=40/24 lis/c=40/40 les/c/f=41/41/0 sis=44 pruub=9.200812340s) [2,3,1] r=1 lpr=44 pi=[40,44)/1 crt=0'0 mlcod 0'0 active pruub 1184.982666016s@ mbc={}] 
start_peering_interval up [4,5,3] -> [2,3,1], acting [4,5,3] -> [2,3,1], acting_primary 4 -> 2, up_primary 4 -> 2, role 2 -> 1, features acting 4540138322906710015 upacting 4540138322906710015 Dec 15 03:04:45 localhost ceph-osd[32311]: osd.3 pg_epoch: 44 pg[6.12( empty local-lis/les=42/43 n=0 ec=42/32 lis/c=42/42 les/c/f=43/43/0 sis=44 pruub=11.299501419s) [0,2,1] r=-1 lpr=44 pi=[42,44)/1 crt=0'0 mlcod 0'0 active pruub 1187.081542969s@ mbc={}] start_peering_interval up [4,5,3] -> [0,2,1], acting [4,5,3] -> [0,2,1], acting_primary 4 -> 0, up_primary 4 -> 0, role 2 -> -1, features acting 4540138322906710015 upacting 4540138322906710015 Dec 15 03:04:45 localhost ceph-osd[32311]: osd.3 pg_epoch: 44 pg[5.12( empty local-lis/les=40/41 n=0 ec=40/26 lis/c=40/40 les/c/f=41/41/0 sis=44 pruub=9.207869530s) [3,5,4] r=0 lpr=44 pi=[40,44)/1 crt=0'0 mlcod 0'0 active pruub 1184.989868164s@ mbc={}] start_peering_interval up [3,4,2] -> [3,5,4], acting [3,4,2] -> [3,5,4], acting_primary 3 -> 3, up_primary 3 -> 3, role 0 -> 0, features acting 4540138322906710015 upacting 4540138322906710015 Dec 15 03:04:45 localhost ceph-osd[32311]: osd.3 pg_epoch: 44 pg[4.13( empty local-lis/les=40/41 n=0 ec=40/24 lis/c=40/40 les/c/f=41/41/0 sis=44 pruub=9.200747490s) [2,3,1] r=1 lpr=44 pi=[40,44)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1184.982666016s@ mbc={}] state: transitioning to Stray Dec 15 03:04:45 localhost ceph-osd[32311]: osd.3 pg_epoch: 44 pg[5.12( empty local-lis/les=40/41 n=0 ec=40/26 lis/c=40/40 les/c/f=41/41/0 sis=44 pruub=9.207869530s) [3,5,4] r=0 lpr=44 pi=[40,44)/1 crt=0'0 mlcod 0'0 unknown pruub 1184.989868164s@ mbc={}] state: transitioning to Primary Dec 15 03:04:45 localhost ceph-osd[32311]: osd.3 pg_epoch: 44 pg[3.12( empty local-lis/les=38/39 n=0 ec=38/22 lis/c=38/38 les/c/f=39/39/0 sis=44 pruub=15.153643608s) [1,5,0] r=-1 lpr=44 pi=[38,44)/1 crt=0'0 mlcod 0'0 active pruub 1190.935791016s@ mbc={}] start_peering_interval up [3,2,1] -> [1,5,0], acting [3,2,1] -> [1,5,0], 
acting_primary 3 -> 1, up_primary 3 -> 1, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015 Dec 15 03:04:45 localhost ceph-osd[32311]: osd.3 pg_epoch: 44 pg[2.16( empty local-lis/les=37/38 n=0 ec=37/20 lis/c=37/37 les/c/f=38/38/0 sis=44 pruub=14.123288155s) [5,3,1] r=1 lpr=44 pi=[37,44)/1 crt=0'0 mlcod 0'0 active pruub 1189.907226562s@ mbc={}] start_peering_interval up [2,3,1] -> [5,3,1], acting [2,3,1] -> [5,3,1], acting_primary 2 -> 5, up_primary 2 -> 5, role 1 -> 1, features acting 4540138322906710015 upacting 4540138322906710015 Dec 15 03:04:45 localhost ceph-osd[32311]: osd.3 pg_epoch: 44 pg[4.15( empty local-lis/les=40/41 n=0 ec=40/24 lis/c=40/40 les/c/f=41/41/0 sis=44 pruub=9.198446274s) [3,4,2] r=0 lpr=44 pi=[40,44)/1 crt=0'0 mlcod 0'0 active pruub 1184.982543945s@ mbc={}] start_peering_interval up [4,5,3] -> [3,4,2], acting [4,5,3] -> [3,4,2], acting_primary 4 -> 3, up_primary 4 -> 3, role 2 -> 0, features acting 4540138322906710015 upacting 4540138322906710015 Dec 15 03:04:45 localhost ceph-osd[32311]: osd.3 pg_epoch: 44 pg[3.17( empty local-lis/les=38/39 n=0 ec=38/22 lis/c=38/38 les/c/f=39/39/0 sis=44 pruub=15.151431084s) [1,5,0] r=-1 lpr=44 pi=[38,44)/1 crt=0'0 mlcod 0'0 active pruub 1190.935546875s@ mbc={}] start_peering_interval up [3,2,1] -> [1,5,0], acting [3,2,1] -> [1,5,0], acting_primary 3 -> 1, up_primary 3 -> 1, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015 Dec 15 03:04:45 localhost ceph-osd[32311]: osd.3 pg_epoch: 44 pg[4.15( empty local-lis/les=40/41 n=0 ec=40/24 lis/c=40/40 les/c/f=41/41/0 sis=44 pruub=9.198446274s) [3,4,2] r=0 lpr=44 pi=[40,44)/1 crt=0'0 mlcod 0'0 unknown pruub 1184.982543945s@ mbc={}] state: transitioning to Primary Dec 15 03:04:45 localhost ceph-osd[32311]: osd.3 pg_epoch: 44 pg[2.16( empty local-lis/les=37/38 n=0 ec=37/20 lis/c=37/37 les/c/f=38/38/0 sis=44 pruub=14.123103142s) [5,3,1] r=1 lpr=44 pi=[37,44)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 
1189.907226562s@ mbc={}] state: transitioning to Stray Dec 15 03:04:45 localhost ceph-osd[32311]: osd.3 pg_epoch: 44 pg[3.17( empty local-lis/les=38/39 n=0 ec=38/22 lis/c=38/38 les/c/f=39/39/0 sis=44 pruub=15.151355743s) [1,5,0] r=-1 lpr=44 pi=[38,44)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1190.935546875s@ mbc={}] state: transitioning to Stray Dec 15 03:04:45 localhost ceph-osd[32311]: osd.3 pg_epoch: 44 pg[5.11( empty local-lis/les=40/41 n=0 ec=40/26 lis/c=40/40 les/c/f=41/41/0 sis=44 pruub=9.205874443s) [2,3,4] r=1 lpr=44 pi=[40,44)/1 crt=0'0 mlcod 0'0 active pruub 1184.990356445s@ mbc={}] start_peering_interval up [3,4,2] -> [2,3,4], acting [3,4,2] -> [2,3,4], acting_primary 3 -> 2, up_primary 3 -> 2, role 0 -> 1, features acting 4540138322906710015 upacting 4540138322906710015 Dec 15 03:04:45 localhost ceph-osd[32311]: osd.3 pg_epoch: 44 pg[4.10( empty local-lis/les=40/41 n=0 ec=40/24 lis/c=40/40 les/c/f=41/41/0 sis=44 pruub=9.198137283s) [1,3,2] r=1 lpr=44 pi=[40,44)/1 crt=0'0 mlcod 0'0 active pruub 1184.982543945s@ mbc={}] start_peering_interval up [4,5,3] -> [1,3,2], acting [4,5,3] -> [1,3,2], acting_primary 4 -> 1, up_primary 4 -> 1, role 2 -> 1, features acting 4540138322906710015 upacting 4540138322906710015 Dec 15 03:04:45 localhost ceph-osd[32311]: osd.3 pg_epoch: 44 pg[4.10( empty local-lis/les=40/41 n=0 ec=40/24 lis/c=40/40 les/c/f=41/41/0 sis=44 pruub=9.197966576s) [1,3,2] r=1 lpr=44 pi=[40,44)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1184.982543945s@ mbc={}] state: transitioning to Stray Dec 15 03:04:45 localhost ceph-osd[32311]: osd.3 pg_epoch: 44 pg[6.13( empty local-lis/les=42/43 n=0 ec=42/32 lis/c=42/42 les/c/f=43/43/0 sis=44 pruub=11.296736717s) [1,0,2] r=-1 lpr=44 pi=[42,44)/1 crt=0'0 mlcod 0'0 active pruub 1187.081298828s@ mbc={}] start_peering_interval up [4,5,3] -> [1,0,2], acting [4,5,3] -> [1,0,2], acting_primary 4 -> 1, up_primary 4 -> 1, role 2 -> -1, features acting 4540138322906710015 upacting 4540138322906710015 Dec 15 03:04:45 
localhost ceph-osd[32311]: osd.3 pg_epoch: 44 pg[5.11( empty local-lis/les=40/41 n=0 ec=40/26 lis/c=40/40 les/c/f=41/41/0 sis=44 pruub=9.205844879s) [2,3,4] r=1 lpr=44 pi=[40,44)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1184.990356445s@ mbc={}] state: transitioning to Stray Dec 15 03:04:45 localhost ceph-osd[32311]: osd.3 pg_epoch: 44 pg[6.13( empty local-lis/les=42/43 n=0 ec=42/32 lis/c=42/42 les/c/f=43/43/0 sis=44 pruub=11.296700478s) [1,0,2] r=-1 lpr=44 pi=[42,44)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1187.081298828s@ mbc={}] state: transitioning to Stray Dec 15 03:04:45 localhost ceph-osd[32311]: osd.3 pg_epoch: 44 pg[3.16( empty local-lis/les=38/39 n=0 ec=38/22 lis/c=38/38 les/c/f=39/39/0 sis=44 pruub=15.156999588s) [2,1,3] r=2 lpr=44 pi=[38,44)/1 crt=0'0 mlcod 0'0 active pruub 1190.941650391s@ mbc={}] start_peering_interval up [3,2,1] -> [2,1,3], acting [3,2,1] -> [2,1,3], acting_primary 3 -> 2, up_primary 3 -> 2, role 0 -> 2, features acting 4540138322906710015 upacting 4540138322906710015 Dec 15 03:04:45 localhost ceph-osd[32311]: osd.3 pg_epoch: 44 pg[3.16( empty local-lis/les=38/39 n=0 ec=38/22 lis/c=38/38 les/c/f=39/39/0 sis=44 pruub=15.156947136s) [2,1,3] r=2 lpr=44 pi=[38,44)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1190.941650391s@ mbc={}] state: transitioning to Stray Dec 15 03:04:45 localhost ceph-osd[32311]: osd.3 pg_epoch: 44 pg[3.12( empty local-lis/les=38/39 n=0 ec=38/22 lis/c=38/38 les/c/f=39/39/0 sis=44 pruub=15.151435852s) [1,5,0] r=-1 lpr=44 pi=[38,44)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1190.935791016s@ mbc={}] state: transitioning to Stray Dec 15 03:04:45 localhost ceph-osd[32311]: osd.3 pg_epoch: 44 pg[5.10( empty local-lis/les=40/41 n=0 ec=40/26 lis/c=40/40 les/c/f=41/41/0 sis=44 pruub=9.205181122s) [2,3,1] r=1 lpr=44 pi=[40,44)/1 crt=0'0 mlcod 0'0 active pruub 1184.990112305s@ mbc={}] start_peering_interval up [3,4,2] -> [2,3,1], acting [3,4,2] -> [2,3,1], acting_primary 3 -> 2, up_primary 3 -> 2, role 0 -> 1, features acting 
4540138322906710015 upacting 4540138322906710015 Dec 15 03:04:45 localhost ceph-osd[32311]: osd.3 pg_epoch: 44 pg[4.11( empty local-lis/les=40/41 n=0 ec=40/24 lis/c=40/40 les/c/f=41/41/0 sis=44 pruub=9.198257446s) [4,5,0] r=-1 lpr=44 pi=[40,44)/1 crt=0'0 mlcod 0'0 active pruub 1184.983032227s@ mbc={}] start_peering_interval up [4,5,3] -> [4,5,0], acting [4,5,3] -> [4,5,0], acting_primary 4 -> 4, up_primary 4 -> 4, role 2 -> -1, features acting 4540138322906710015 upacting 4540138322906710015 Dec 15 03:04:45 localhost ceph-osd[32311]: osd.3 pg_epoch: 44 pg[4.11( empty local-lis/les=40/41 n=0 ec=40/24 lis/c=40/40 les/c/f=41/41/0 sis=44 pruub=9.198061943s) [4,5,0] r=-1 lpr=44 pi=[40,44)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1184.983032227s@ mbc={}] state: transitioning to Stray Dec 15 03:04:45 localhost ceph-osd[32311]: osd.3 pg_epoch: 44 pg[5.10( empty local-lis/les=40/41 n=0 ec=40/26 lis/c=40/40 les/c/f=41/41/0 sis=44 pruub=9.205095291s) [2,3,1] r=1 lpr=44 pi=[40,44)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1184.990112305s@ mbc={}] state: transitioning to Stray Dec 15 03:04:45 localhost ceph-osd[32311]: osd.3 pg_epoch: 44 pg[2.17( empty local-lis/les=37/38 n=0 ec=37/20 lis/c=37/37 les/c/f=38/38/0 sis=44 pruub=14.122072220s) [5,3,4] r=1 lpr=44 pi=[37,44)/1 crt=0'0 mlcod 0'0 active pruub 1189.907104492s@ mbc={}] start_peering_interval up [2,3,1] -> [5,3,4], acting [2,3,1] -> [5,3,4], acting_primary 2 -> 5, up_primary 2 -> 5, role 1 -> 1, features acting 4540138322906710015 upacting 4540138322906710015 Dec 15 03:04:45 localhost ceph-osd[32311]: osd.3 pg_epoch: 44 pg[2.17( empty local-lis/les=37/38 n=0 ec=37/20 lis/c=37/37 les/c/f=38/38/0 sis=44 pruub=14.121994019s) [5,3,4] r=1 lpr=44 pi=[37,44)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1189.907104492s@ mbc={}] state: transitioning to Stray Dec 15 03:04:45 localhost ceph-osd[32311]: osd.3 pg_epoch: 44 pg[5.1f( empty local-lis/les=40/41 n=0 ec=40/26 lis/c=40/40 les/c/f=41/41/0 sis=44 pruub=9.204827309s) [2,3,1] r=1 
lpr=44 pi=[40,44)/1 crt=0'0 mlcod 0'0 active pruub 1184.990112305s@ mbc={}] start_peering_interval up [3,4,2] -> [2,3,1], acting [3,4,2] -> [2,3,1], acting_primary 3 -> 2, up_primary 3 -> 2, role 0 -> 1, features acting 4540138322906710015 upacting 4540138322906710015 Dec 15 03:04:45 localhost ceph-osd[32311]: osd.3 pg_epoch: 44 pg[6.1c( empty local-lis/les=42/43 n=0 ec=42/32 lis/c=42/42 les/c/f=43/43/0 sis=44 pruub=11.303687096s) [3,1,5] r=0 lpr=44 pi=[42,44)/1 crt=0'0 mlcod 0'0 active pruub 1187.088989258s@ mbc={}] start_peering_interval up [4,5,3] -> [3,1,5], acting [4,5,3] -> [3,1,5], acting_primary 4 -> 3, up_primary 4 -> 3, role 2 -> 0, features acting 4540138322906710015 upacting 4540138322906710015 Dec 15 03:04:45 localhost ceph-osd[32311]: osd.3 pg_epoch: 44 pg[2.18( empty local-lis/les=37/38 n=0 ec=37/20 lis/c=37/37 les/c/f=38/38/0 sis=44 pruub=14.121828079s) [3,2,4] r=0 lpr=44 pi=[37,44)/1 crt=0'0 mlcod 0'0 active pruub 1189.907104492s@ mbc={}] start_peering_interval up [2,3,1] -> [3,2,4], acting [2,3,1] -> [3,2,4], acting_primary 2 -> 3, up_primary 2 -> 3, role 1 -> 0, features acting 4540138322906710015 upacting 4540138322906710015 Dec 15 03:04:45 localhost ceph-osd[32311]: osd.3 pg_epoch: 44 pg[2.18( empty local-lis/les=37/38 n=0 ec=37/20 lis/c=37/37 les/c/f=38/38/0 sis=44 pruub=14.121828079s) [3,2,4] r=0 lpr=44 pi=[37,44)/1 crt=0'0 mlcod 0'0 unknown pruub 1189.907104492s@ mbc={}] state: transitioning to Primary Dec 15 03:04:45 localhost ceph-osd[32311]: osd.3 pg_epoch: 44 pg[5.1f( empty local-lis/les=40/41 n=0 ec=40/26 lis/c=40/40 les/c/f=41/41/0 sis=44 pruub=9.204798698s) [2,3,1] r=1 lpr=44 pi=[40,44)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1184.990112305s@ mbc={}] state: transitioning to Stray Dec 15 03:04:45 localhost ceph-osd[32311]: osd.3 pg_epoch: 44 pg[6.1c( empty local-lis/les=42/43 n=0 ec=42/32 lis/c=42/42 les/c/f=43/43/0 sis=44 pruub=11.303687096s) [3,1,5] r=0 lpr=44 pi=[42,44)/1 crt=0'0 mlcod 0'0 unknown pruub 1187.088989258s@ mbc={}] 
state: transitioning to Primary Dec 15 03:04:45 localhost ceph-osd[32311]: osd.3 pg_epoch: 44 pg[4.1e( empty local-lis/les=40/41 n=0 ec=40/24 lis/c=40/40 les/c/f=41/41/0 sis=44 pruub=9.197497368s) [1,5,0] r=-1 lpr=44 pi=[40,44)/1 crt=0'0 mlcod 0'0 active pruub 1184.983032227s@ mbc={}] start_peering_interval up [4,5,3] -> [1,5,0], acting [4,5,3] -> [1,5,0], acting_primary 4 -> 1, up_primary 4 -> 1, role 2 -> -1, features acting 4540138322906710015 upacting 4540138322906710015 Dec 15 03:04:45 localhost ceph-osd[32311]: osd.3 pg_epoch: 44 pg[4.1e( empty local-lis/les=40/41 n=0 ec=40/24 lis/c=40/40 les/c/f=41/41/0 sis=44 pruub=9.197453499s) [1,5,0] r=-1 lpr=44 pi=[40,44)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1184.983032227s@ mbc={}] state: transitioning to Stray Dec 15 03:04:45 localhost ceph-osd[32311]: osd.3 pg_epoch: 44 pg[3.19( empty local-lis/les=38/39 n=0 ec=38/22 lis/c=38/38 les/c/f=39/39/0 sis=44 pruub=15.150171280s) [1,2,3] r=2 lpr=44 pi=[38,44)/1 crt=0'0 mlcod 0'0 active pruub 1190.935791016s@ mbc={}] start_peering_interval up [3,2,1] -> [1,2,3], acting [3,2,1] -> [1,2,3], acting_primary 3 -> 1, up_primary 3 -> 1, role 0 -> 2, features acting 4540138322906710015 upacting 4540138322906710015 Dec 15 03:04:45 localhost ceph-osd[32311]: osd.3 pg_epoch: 44 pg[3.19( empty local-lis/les=38/39 n=0 ec=38/22 lis/c=38/38 les/c/f=39/39/0 sis=44 pruub=15.150128365s) [1,2,3] r=2 lpr=44 pi=[38,44)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1190.935791016s@ mbc={}] state: transitioning to Stray Dec 15 03:04:45 localhost ceph-osd[32311]: osd.3 pg_epoch: 44 pg[6.12( empty local-lis/les=42/43 n=0 ec=42/32 lis/c=42/42 les/c/f=43/43/0 sis=44 pruub=11.298898697s) [0,2,1] r=-1 lpr=44 pi=[42,44)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1187.081542969s@ mbc={}] state: transitioning to Stray Dec 15 03:04:45 localhost ceph-osd[31375]: osd.0 pg_epoch: 44 pg[6.12( empty local-lis/les=0/0 n=0 ec=42/32 lis/c=42/42 les/c/f=43/43/0 sis=44) [0,2,1] r=0 lpr=44 pi=[42,44)/1 crt=0'0 mlcod 0'0 
unknown mbc={}] state: transitioning to Primary Dec 15 03:04:45 localhost python3[55807]: ansible-file Invoked with path=/var/lib/tripleo-config/ceph state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Dec 15 03:04:46 localhost ceph-osd[32311]: log_channel(cluster) log [DBG] : 3.9 scrub starts Dec 15 03:04:46 localhost ceph-osd[32311]: log_channel(cluster) log [DBG] : 3.9 scrub ok Dec 15 03:04:46 localhost ceph-osd[31375]: osd.0 pg_epoch: 44 pg[5.9( empty local-lis/les=0/0 n=0 ec=40/26 lis/c=40/40 les/c/f=41/41/0 sis=44) [5,0,1] r=1 lpr=44 pi=[40,44)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Stray Dec 15 03:04:46 localhost ceph-osd[31375]: osd.0 pg_epoch: 44 pg[6.b( empty local-lis/les=0/0 n=0 ec=42/32 lis/c=42/42 les/c/f=43/43/0 sis=44) [1,2,0] r=2 lpr=44 pi=[42,44)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Stray Dec 15 03:04:46 localhost ceph-osd[31375]: osd.0 pg_epoch: 44 pg[2.2( empty local-lis/les=0/0 n=0 ec=37/20 lis/c=37/37 les/c/f=38/38/0 sis=44) [5,1,0] r=2 lpr=44 pi=[37,44)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Stray Dec 15 03:04:46 localhost ceph-osd[31375]: osd.0 pg_epoch: 44 pg[5.a( empty local-lis/les=0/0 n=0 ec=40/26 lis/c=40/40 les/c/f=41/41/0 sis=44) [1,0,2] r=1 lpr=44 pi=[40,44)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Stray Dec 15 03:04:46 localhost ceph-osd[31375]: osd.0 pg_epoch: 44 pg[5.7( empty local-lis/les=0/0 n=0 ec=40/26 lis/c=40/40 les/c/f=41/41/0 sis=44) [5,1,0] r=2 lpr=44 pi=[40,44)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Stray Dec 15 03:04:46 localhost ceph-osd[31375]: osd.0 pg_epoch: 44 pg[4.7( empty local-lis/les=0/0 n=0 ec=40/24 lis/c=40/40 les/c/f=41/41/0 
sis=44) [1,0,2] r=1 lpr=44 pi=[40,44)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Stray Dec 15 03:04:46 localhost ceph-osd[31375]: osd.0 pg_epoch: 44 pg[5.1( empty local-lis/les=0/0 n=0 ec=40/26 lis/c=40/40 les/c/f=41/41/0 sis=44) [2,4,0] r=2 lpr=44 pi=[40,44)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Stray Dec 15 03:04:46 localhost ceph-osd[31375]: osd.0 pg_epoch: 44 pg[2.6( empty local-lis/les=0/0 n=0 ec=37/20 lis/c=37/37 les/c/f=38/38/0 sis=44) [1,0,5] r=1 lpr=44 pi=[37,44)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Stray Dec 15 03:04:46 localhost ceph-osd[31375]: osd.0 pg_epoch: 44 pg[6.2( empty local-lis/les=0/0 n=0 ec=42/32 lis/c=42/42 les/c/f=43/43/0 sis=44) [5,4,0] r=2 lpr=44 pi=[42,44)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Stray Dec 15 03:04:46 localhost ceph-osd[31375]: osd.0 pg_epoch: 44 pg[3.6( empty local-lis/les=0/0 n=0 ec=38/22 lis/c=38/38 les/c/f=39/39/0 sis=44) [4,2,0] r=2 lpr=44 pi=[38,44)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Stray Dec 15 03:04:46 localhost ceph-osd[31375]: osd.0 pg_epoch: 44 pg[2.7( empty local-lis/les=0/0 n=0 ec=37/20 lis/c=37/37 les/c/f=38/38/0 sis=44) [5,0,1] r=1 lpr=44 pi=[37,44)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Stray Dec 15 03:04:46 localhost ceph-osd[31375]: osd.0 pg_epoch: 44 pg[2.4( empty local-lis/les=0/0 n=0 ec=37/20 lis/c=37/37 les/c/f=38/38/0 sis=44) [1,0,2] r=1 lpr=44 pi=[37,44)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Stray Dec 15 03:04:46 localhost ceph-osd[31375]: osd.0 pg_epoch: 44 pg[4.5( empty local-lis/les=0/0 n=0 ec=40/24 lis/c=40/40 les/c/f=41/41/0 sis=44) [5,0,1] r=1 lpr=44 pi=[40,44)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Stray Dec 15 03:04:46 localhost ceph-osd[31375]: osd.0 pg_epoch: 44 pg[6.d( empty local-lis/les=0/0 n=0 ec=42/32 lis/c=42/42 les/c/f=43/43/0 sis=44) [2,4,0] r=2 lpr=44 pi=[42,44)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: 
transitioning to Stray
Dec 15 03:04:46 localhost ceph-osd[31375]: osd.0 pg_epoch: 44 pg[5.f( empty local-lis/les=0/0 n=0 ec=40/26 lis/c=40/40 les/c/f=41/41/0 sis=44) [5,0,4] r=1 lpr=44 pi=[40,44)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Stray
Dec 15 03:04:46 localhost ceph-osd[31375]: osd.0 pg_epoch: 44 pg[4.e( empty local-lis/les=0/0 n=0 ec=40/24 lis/c=40/40 les/c/f=41/41/0 sis=44) [2,0,1] r=1 lpr=44 pi=[40,44)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Stray
Dec 15 03:04:46 localhost ceph-osd[31375]: osd.0 pg_epoch: 44 pg[2.8( empty local-lis/les=0/0 n=0 ec=37/20 lis/c=37/37 les/c/f=38/38/0 sis=44) [2,0,1] r=1 lpr=44 pi=[37,44)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Stray
Dec 15 03:04:46 localhost ceph-osd[31375]: osd.0 pg_epoch: 44 pg[4.d( empty local-lis/les=0/0 n=0 ec=40/24 lis/c=40/40 les/c/f=41/41/0 sis=44) [2,0,1] r=1 lpr=44 pi=[40,44)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Stray
Dec 15 03:04:46 localhost ceph-osd[31375]: osd.0 pg_epoch: 44 pg[3.b( empty local-lis/les=0/0 n=0 ec=38/22 lis/c=38/38 les/c/f=39/39/0 sis=44) [4,5,0] r=2 lpr=44 pi=[38,44)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Stray
Dec 15 03:04:46 localhost ceph-osd[31375]: osd.0 pg_epoch: 44 pg[4.c( empty local-lis/les=0/0 n=0 ec=40/24 lis/c=40/40 les/c/f=41/41/0 sis=44) [5,4,0] r=2 lpr=44 pi=[40,44)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Stray
Dec 15 03:04:46 localhost ceph-osd[31375]: osd.0 pg_epoch: 44 pg[5.1d( empty local-lis/les=0/0 n=0 ec=40/26 lis/c=40/40 les/c/f=41/41/0 sis=44) [1,2,0] r=2 lpr=44 pi=[40,44)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Stray
Dec 15 03:04:46 localhost ceph-osd[31375]: osd.0 pg_epoch: 44 pg[6.1a( empty local-lis/les=0/0 n=0 ec=42/32 lis/c=42/42 les/c/f=43/43/0 sis=44) [5,0,1] r=1 lpr=44 pi=[42,44)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Stray
Dec 15 03:04:46 localhost ceph-osd[31375]: osd.0 pg_epoch: 44 pg[2.1e( empty local-lis/les=0/0 n=0 ec=37/20 lis/c=37/37 les/c/f=38/38/0 sis=44) [4,0,5] r=1 lpr=44 pi=[37,44)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Stray
Dec 15 03:04:46 localhost ceph-osd[31375]: osd.0 pg_epoch: 44 pg[5.19( empty local-lis/les=0/0 n=0 ec=40/26 lis/c=40/40 les/c/f=41/41/0 sis=44) [1,5,0] r=2 lpr=44 pi=[40,44)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Stray
Dec 15 03:04:46 localhost ceph-osd[31375]: osd.0 pg_epoch: 44 pg[6.10( empty local-lis/les=0/0 n=0 ec=42/32 lis/c=42/42 les/c/f=43/43/0 sis=44) [4,0,2] r=1 lpr=44 pi=[42,44)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Stray
Dec 15 03:04:46 localhost ceph-osd[31375]: osd.0 pg_epoch: 44 pg[3.12( empty local-lis/les=0/0 n=0 ec=38/22 lis/c=38/38 les/c/f=39/39/0 sis=44) [1,5,0] r=2 lpr=44 pi=[38,44)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Stray
Dec 15 03:04:46 localhost ceph-osd[31375]: osd.0 pg_epoch: 44 pg[6.13( empty local-lis/les=0/0 n=0 ec=42/32 lis/c=42/42 les/c/f=43/43/0 sis=44) [1,0,2] r=1 lpr=44 pi=[42,44)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Stray
Dec 15 03:04:46 localhost ceph-osd[31375]: osd.0 pg_epoch: 44 pg[4.11( empty local-lis/les=0/0 n=0 ec=40/24 lis/c=40/40 les/c/f=41/41/0 sis=44) [4,5,0] r=2 lpr=44 pi=[40,44)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Stray
Dec 15 03:04:46 localhost ceph-osd[31375]: osd.0 pg_epoch: 44 pg[4.1e( empty local-lis/les=0/0 n=0 ec=40/24 lis/c=40/40 les/c/f=41/41/0 sis=44) [1,5,0] r=2 lpr=44 pi=[40,44)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Stray
Dec 15 03:04:46 localhost ceph-osd[31375]: osd.0 pg_epoch: 44 pg[6.14( empty local-lis/les=0/0 n=0 ec=42/32 lis/c=42/42 les/c/f=43/43/0 sis=44) [4,5,0] r=2 lpr=44 pi=[42,44)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Stray
Dec 15 03:04:46 localhost ceph-osd[31375]: osd.0 pg_epoch: 44 pg[4.16( empty local-lis/les=0/0 n=0 ec=40/24 lis/c=40/40 les/c/f=41/41/0 sis=44) [4,2,0] r=2 lpr=44 pi=[40,44)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Stray
Dec 15 03:04:46 localhost ceph-osd[32311]: osd.3 pg_epoch: 45 pg[4.1( empty local-lis/les=44/45 n=0 ec=40/24 lis/c=40/40 les/c/f=41/41/0 sis=44) [3,2,1] r=0 lpr=44 pi=[40,44)/1 crt=0'0 mlcod 0'0 active mbc={}] state: react AllReplicasActivated Activating complete
Dec 15 03:04:46 localhost ceph-osd[32311]: osd.3 pg_epoch: 45 pg[2.5( empty local-lis/les=44/45 n=0 ec=37/20 lis/c=37/37 les/c/f=38/38/0 sis=44) [3,4,2] r=0 lpr=44 pi=[37,44)/1 crt=0'0 mlcod 0'0 active mbc={}] state: react AllReplicasActivated Activating complete
Dec 15 03:04:46 localhost ceph-osd[32311]: osd.3 pg_epoch: 45 pg[6.1b( empty local-lis/les=44/45 n=0 ec=42/32 lis/c=42/42 les/c/f=43/43/0 sis=44) [3,2,1] r=0 lpr=44 pi=[42,44)/1 crt=0'0 mlcod 0'0 active mbc={}] state: react AllReplicasActivated Activating complete
Dec 15 03:04:46 localhost ceph-osd[32311]: osd.3 pg_epoch: 45 pg[6.17( empty local-lis/les=44/45 n=0 ec=42/32 lis/c=42/42 les/c/f=43/43/0 sis=44) [3,4,2] r=0 lpr=44 pi=[42,44)/1 crt=0'0 mlcod 0'0 active mbc={}] state: react AllReplicasActivated Activating complete
Dec 15 03:04:46 localhost ceph-osd[32311]: osd.3 pg_epoch: 45 pg[4.19( empty local-lis/les=44/45 n=0 ec=40/24 lis/c=40/40 les/c/f=41/41/0 sis=44) [3,1,2] r=0 lpr=44 pi=[40,44)/1 crt=0'0 mlcod 0'0 active mbc={}] state: react AllReplicasActivated Activating complete
Dec 15 03:04:46 localhost ceph-osd[32311]: osd.3 pg_epoch: 45 pg[4.15( empty local-lis/les=44/45 n=0 ec=40/24 lis/c=40/40 les/c/f=41/41/0 sis=44) [3,4,2] r=0 lpr=44 pi=[40,44)/1 crt=0'0 mlcod 0'0 active mbc={}] state: react AllReplicasActivated Activating complete
Dec 15 03:04:46 localhost ceph-osd[31375]: osd.0 pg_epoch: 44 pg[6.16( empty local-lis/les=0/0 n=0 ec=42/32 lis/c=42/42 les/c/f=43/43/0 sis=44) [4,5,0] r=2 lpr=44 pi=[42,44)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Stray
Dec 15 03:04:46 localhost ceph-osd[31375]: osd.0 pg_epoch: 44 pg[5.15( empty local-lis/les=0/0 n=0 ec=40/26 lis/c=40/40 les/c/f=41/41/0 sis=44) [5,4,0] r=2 lpr=44 pi=[40,44)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Stray
Dec 15 03:04:46 localhost ceph-osd[32311]: osd.3 pg_epoch: 45 pg[2.12( empty local-lis/les=44/45 n=0 ec=37/20 lis/c=37/37 les/c/f=38/38/0 sis=44) [3,1,2] r=0 lpr=44 pi=[37,44)/1 crt=0'0 mlcod 0'0 active mbc={}] state: react AllReplicasActivated Activating complete
Dec 15 03:04:46 localhost ceph-osd[32311]: osd.3 pg_epoch: 45 pg[2.18( empty local-lis/les=44/45 n=0 ec=37/20 lis/c=37/37 les/c/f=38/38/0 sis=44) [3,2,4] r=0 lpr=44 pi=[37,44)/1 crt=0'0 mlcod 0'0 active mbc={}] state: react AllReplicasActivated Activating complete
Dec 15 03:04:46 localhost ceph-osd[31375]: osd.0 pg_epoch: 44 pg[3.17( empty local-lis/les=0/0 n=0 ec=38/22 lis/c=38/38 les/c/f=39/39/0 sis=44) [1,5,0] r=2 lpr=44 pi=[38,44)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Stray
Dec 15 03:04:46 localhost ceph-osd[32311]: osd.3 pg_epoch: 45 pg[2.1b( empty local-lis/les=44/45 n=0 ec=37/20 lis/c=37/37 les/c/f=38/38/0 sis=44) [3,2,4] r=0 lpr=44 pi=[37,44)/1 crt=0'0 mlcod 0'0 active mbc={}] state: react AllReplicasActivated Activating complete
Dec 15 03:04:46 localhost ceph-osd[31375]: osd.0 pg_epoch: 44 pg[5.3( empty local-lis/les=0/0 n=0 ec=40/26 lis/c=40/40 les/c/f=41/41/0 sis=44) [4,5,0] r=2 lpr=44 pi=[40,44)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Stray
Dec 15 03:04:46 localhost ceph-osd[31375]: osd.0 pg_epoch: 44 pg[5.5( empty local-lis/les=0/0 n=0 ec=40/26 lis/c=40/40 les/c/f=41/41/0 sis=44) [4,2,0] r=2 lpr=44 pi=[40,44)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Stray
Dec 15 03:04:46 localhost ceph-osd[31375]: osd.0 pg_epoch: 44 pg[5.6( empty local-lis/les=0/0 n=0 ec=40/26 lis/c=40/40 les/c/f=41/41/0 sis=44) [4,5,0] r=2 lpr=44 pi=[40,44)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Stray
Dec 15 03:04:46 localhost ceph-osd[32311]: osd.3 pg_epoch: 45 pg[4.8( empty local-lis/les=44/45 n=0 ec=40/24 lis/c=40/40 les/c/f=41/41/0 sis=44) [3,2,4] r=0 lpr=44 pi=[40,44)/1 crt=0'0 mlcod 0'0 active mbc={}] state: react AllReplicasActivated Activating complete
Dec 15 03:04:46 localhost ceph-osd[31375]: osd.0 pg_epoch: 44 pg[4.1a( empty local-lis/les=0/0 n=0 ec=40/24 lis/c=40/40 les/c/f=41/41/0 sis=44) [2,4,0] r=2 lpr=44 pi=[40,44)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Stray
Dec 15 03:04:46 localhost ceph-osd[31375]: osd.0 pg_epoch: 44 pg[2.1a( empty local-lis/les=0/0 n=0 ec=37/20 lis/c=37/37 les/c/f=38/38/0 sis=44) [2,0,1] r=1 lpr=44 pi=[37,44)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Stray
Dec 15 03:04:46 localhost ceph-osd[31375]: osd.0 pg_epoch: 44 pg[4.18( empty local-lis/les=0/0 n=0 ec=40/24 lis/c=40/40 les/c/f=41/41/0 sis=44) [2,1,0] r=2 lpr=44 pi=[40,44)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Stray
Dec 15 03:04:46 localhost ceph-osd[31375]: osd.0 pg_epoch: 44 pg[5.18( empty local-lis/les=0/0 n=0 ec=40/26 lis/c=40/40 les/c/f=41/41/0 sis=44) [2,0,4] r=1 lpr=44 pi=[40,44)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Stray
Dec 15 03:04:46 localhost ceph-osd[31375]: osd.0 pg_epoch: 44 pg[3.10( empty local-lis/les=0/0 n=0 ec=38/22 lis/c=38/38 les/c/f=39/39/0 sis=44) [5,0,4] r=1 lpr=44 pi=[38,44)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Stray
Dec 15 03:04:46 localhost ceph-osd[31375]: osd.0 pg_epoch: 44 pg[6.1f( empty local-lis/les=0/0 n=0 ec=42/32 lis/c=42/42 les/c/f=43/43/0 sis=44) [4,0,5] r=1 lpr=44 pi=[42,44)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Stray
Dec 15 03:04:46 localhost ceph-osd[31375]: osd.0 pg_epoch: 44 pg[3.18( empty local-lis/les=0/0 n=0 ec=38/22 lis/c=38/38 les/c/f=39/39/0 sis=44) [4,0,5] r=1 lpr=44 pi=[38,44)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Stray
Dec 15 03:04:46 localhost ceph-osd[31375]: osd.0 pg_epoch: 44 pg[6.1d( empty local-lis/les=0/0 n=0 ec=42/32 lis/c=42/42 les/c/f=43/43/0 sis=44) [4,0,5] r=1 lpr=44 pi=[42,44)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Stray
Dec 15 03:04:46 localhost ceph-osd[32311]: osd.3 pg_epoch: 45 pg[6.1e( empty local-lis/les=44/45 n=0 ec=42/32 lis/c=42/42 les/c/f=43/43/0 sis=44) [3,5,4] r=0 lpr=44 pi=[42,44)/1 crt=0'0 mlcod 0'0 active mbc={}] state: react AllReplicasActivated Activating complete
Dec 15 03:04:46 localhost ceph-osd[32311]: osd.3 pg_epoch: 45 pg[6.1( empty local-lis/les=44/45 n=0 ec=42/32 lis/c=42/42 les/c/f=43/43/0 sis=44) [3,5,4] r=0 lpr=44 pi=[42,44)/1 crt=0'0 mlcod 0'0 active mbc={}] state: react AllReplicasActivated Activating complete
Dec 15 03:04:46 localhost ceph-osd[32311]: osd.3 pg_epoch: 45 pg[4.1f( empty local-lis/les=44/45 n=0 ec=40/24 lis/c=40/40 les/c/f=41/41/0 sis=44) [3,5,1] r=0 lpr=44 pi=[40,44)/1 crt=0'0 mlcod 0'0 active mbc={}] state: react AllReplicasActivated Activating complete
Dec 15 03:04:46 localhost ceph-osd[32311]: osd.3 pg_epoch: 45 pg[4.1d( empty local-lis/les=44/45 n=0 ec=40/24 lis/c=40/40 les/c/f=41/41/0 sis=44) [3,5,4] r=0 lpr=44 pi=[40,44)/1 crt=0'0 mlcod 0'0 active mbc={}] state: react AllReplicasActivated Activating complete
Dec 15 03:04:46 localhost ceph-osd[32311]: osd.3 pg_epoch: 45 pg[4.2( empty local-lis/les=44/45 n=0 ec=40/24 lis/c=40/40 les/c/f=41/41/0 sis=44) [3,5,4] r=0 lpr=44 pi=[40,44)/1 crt=0'0 mlcod 0'0 active mbc={}] state: react AllReplicasActivated Activating complete
Dec 15 03:04:46 localhost ceph-osd[32311]: osd.3 pg_epoch: 45 pg[5.d( empty local-lis/les=44/45 n=0 ec=40/26 lis/c=40/40 les/c/f=41/41/0 sis=44) [3,5,4] r=0 lpr=44 pi=[40,44)/1 crt=0'0 mlcod 0'0 active mbc={}] state: react AllReplicasActivated Activating complete
Dec 15 03:04:46 localhost ceph-osd[32311]: osd.3 pg_epoch: 45 pg[3.8( empty local-lis/les=44/45 n=0 ec=38/22 lis/c=38/38 les/c/f=39/39/0 sis=44) [3,1,5] r=0 lpr=44 pi=[38,44)/1 crt=0'0 mlcod 0'0 active mbc={}] state: react AllReplicasActivated Activating complete
Dec 15 03:04:46 localhost ceph-osd[32311]: osd.3 pg_epoch: 45 pg[3.11( empty local-lis/les=44/45 n=0 ec=38/22 lis/c=38/38 les/c/f=39/39/0 sis=44) [3,5,4] r=0 lpr=44 pi=[38,44)/1 crt=0'0 mlcod 0'0 active mbc={}] state: react AllReplicasActivated Activating complete
Dec 15 03:04:46 localhost ceph-osd[32311]: osd.3 pg_epoch: 45 pg[2.b( empty local-lis/les=44/45 n=0 ec=37/20 lis/c=37/37 les/c/f=38/38/0 sis=44) [3,5,4] r=0 lpr=44 pi=[37,44)/1 crt=0'0 mlcod 0'0 active mbc={}] state: react AllReplicasActivated Activating complete
Dec 15 03:04:46 localhost ceph-osd[32311]: osd.3 pg_epoch: 45 pg[2.10( empty local-lis/les=44/45 n=0 ec=37/20 lis/c=37/37 les/c/f=38/38/0 sis=44) [3,1,5] r=0 lpr=44 pi=[37,44)/1 crt=0'0 mlcod 0'0 active mbc={}] state: react AllReplicasActivated Activating complete
Dec 15 03:04:46 localhost ceph-osd[32311]: osd.3 pg_epoch: 45 pg[4.14( empty local-lis/les=44/45 n=0 ec=40/24 lis/c=40/40 les/c/f=41/41/0 sis=44) [3,1,5] r=0 lpr=44 pi=[40,44)/1 crt=0'0 mlcod 0'0 active mbc={}] state: react AllReplicasActivated Activating complete
Dec 15 03:04:46 localhost ceph-osd[32311]: osd.3 pg_epoch: 45 pg[5.13( empty local-lis/les=44/45 n=0 ec=40/26 lis/c=40/40 les/c/f=41/41/0 sis=44) [3,1,5] r=0 lpr=44 pi=[40,44)/1 crt=0'0 mlcod 0'0 active mbc={}] state: react AllReplicasActivated Activating complete
Dec 15 03:04:46 localhost ceph-osd[32311]: osd.3 pg_epoch: 45 pg[2.c( empty local-lis/les=44/45 n=0 ec=37/20 lis/c=37/37 les/c/f=38/38/0 sis=44) [3,1,5] r=0 lpr=44 pi=[37,44)/1 crt=0'0 mlcod 0'0 active mbc={}] state: react AllReplicasActivated Activating complete
Dec 15 03:04:46 localhost ceph-osd[32311]: osd.3 pg_epoch: 45 pg[5.12( empty local-lis/les=44/45 n=0 ec=40/26 lis/c=40/40 les/c/f=41/41/0 sis=44) [3,5,4] r=0 lpr=44 pi=[40,44)/1 crt=0'0 mlcod 0'0 active mbc={}] state: react AllReplicasActivated Activating complete
Dec 15 03:04:46 localhost ceph-osd[32311]: osd.3 pg_epoch: 45 pg[6.1c( empty local-lis/les=44/45 n=0 ec=42/32 lis/c=42/42 les/c/f=43/43/0 sis=44) [3,1,5] r=0 lpr=44 pi=[42,44)/1 crt=0'0 mlcod 0'0 active mbc={}] state: react AllReplicasActivated Activating complete
Dec 15 03:04:46 localhost ceph-osd[31375]: osd.0 pg_epoch: 45 pg[4.9( empty local-lis/les=44/45 n=0 ec=40/24 lis/c=40/40 les/c/f=41/41/0 sis=44) [0,1,2] r=0 lpr=44 pi=[40,44)/1 crt=0'0 mlcod 0'0 active mbc={}] state: react AllReplicasActivated Activating complete
Dec 15 03:04:46 localhost ceph-osd[31375]: osd.0 pg_epoch: 45 pg[4.6( empty local-lis/les=44/45 n=0 ec=40/24 lis/c=40/40 les/c/f=41/41/0 sis=44) [0,1,2] r=0 lpr=44 pi=[40,44)/1 crt=0'0 mlcod 0'0 active mbc={}] state: react AllReplicasActivated Activating complete
Dec 15 03:04:46 localhost ceph-osd[31375]: osd.0 pg_epoch: 45 pg[2.f( empty local-lis/les=44/45 n=0 ec=37/20 lis/c=37/37 les/c/f=38/38/0 sis=44) [0,2,4] r=0 lpr=44 pi=[37,44)/1 crt=0'0 mlcod 0'0 active mbc={}] state: react AllReplicasActivated Activating complete
Dec 15 03:04:46 localhost ceph-osd[31375]: osd.0 pg_epoch: 45 pg[3.e( empty local-lis/les=44/45 n=0 ec=38/22 lis/c=38/38 les/c/f=39/39/0 sis=44) [0,5,1] r=0 lpr=44 pi=[38,44)/1 crt=0'0 mlcod 0'0 active mbc={}] state: react AllReplicasActivated Activating complete
Dec 15 03:04:46 localhost ceph-osd[31375]: osd.0 pg_epoch: 45 pg[5.8( empty local-lis/les=44/45 n=0 ec=40/26 lis/c=40/40 les/c/f=41/41/0 sis=44) [0,4,5] r=0 lpr=44 pi=[40,44)/1 crt=0'0 mlcod 0'0 active mbc={}] state: react AllReplicasActivated Activating complete
Dec 15 03:04:46 localhost ceph-osd[31375]: osd.0 pg_epoch: 45 pg[2.d( empty local-lis/les=44/45 n=0 ec=37/20 lis/c=37/37 les/c/f=38/38/0 sis=44) [0,5,1] r=0 lpr=44 pi=[37,44)/1 crt=0'0 mlcod 0'0 active mbc={}] state: react AllReplicasActivated Activating complete
Dec 15 03:04:46 localhost ceph-osd[31375]: osd.0 pg_epoch: 45 pg[2.1d( empty local-lis/les=44/45 n=0 ec=37/20 lis/c=37/37 les/c/f=38/38/0 sis=44) [0,5,4] r=0 lpr=44 pi=[37,44)/1 crt=0'0 mlcod 0'0 active mbc={}] state: react AllReplicasActivated Activating complete
Dec 15 03:04:46 localhost ceph-osd[31375]: osd.0 pg_epoch: 45 pg[5.1a( empty local-lis/les=44/45 n=0 ec=40/26 lis/c=40/40 les/c/f=41/41/0 sis=44) [0,5,4] r=0 lpr=44 pi=[40,44)/1 crt=0'0 mlcod 0'0 active mbc={}] state: react AllReplicasActivated Activating complete
Dec 15 03:04:46 localhost ceph-osd[31375]: osd.0 pg_epoch: 45 pg[4.3( empty local-lis/les=44/45 n=0 ec=40/24 lis/c=40/40 les/c/f=41/41/0 sis=44) [0,5,1] r=0 lpr=44 pi=[40,44)/1 crt=0'0 mlcod 0'0 active mbc={}] state: react AllReplicasActivated Activating complete
Dec 15 03:04:46 localhost ceph-osd[31375]: osd.0 pg_epoch: 45 pg[5.4( empty local-lis/les=44/45 n=0 ec=40/26 lis/c=40/40 les/c/f=41/41/0 sis=44) [0,1,5] r=0 lpr=44 pi=[40,44)/1 crt=0'0 mlcod 0'0 active mbc={}] state: react AllReplicasActivated Activating complete
Dec 15 03:04:46 localhost ceph-osd[31375]: osd.0 pg_epoch: 45 pg[2.1c( empty local-lis/les=44/45 n=0 ec=37/20 lis/c=37/37 les/c/f=38/38/0 sis=44) [0,2,1] r=0 lpr=44 pi=[37,44)/1 crt=0'0 mlcod 0'0 active mbc={}] state: react AllReplicasActivated Activating complete
Dec 15 03:04:46 localhost ceph-osd[31375]: osd.0 pg_epoch: 45 pg[3.1a( empty local-lis/les=44/45 n=0 ec=38/22 lis/c=38/38 les/c/f=39/39/0 sis=44) [0,1,2] r=0 lpr=44 pi=[38,44)/1 crt=0'0 mlcod 0'0 active mbc={}] state: react AllReplicasActivated Activating complete
Dec 15 03:04:46 localhost ceph-osd[31375]: osd.0 pg_epoch: 45 pg[3.1b( empty local-lis/les=44/45 n=0 ec=38/22 lis/c=38/38 les/c/f=39/39/0 sis=44) [0,5,1] r=0 lpr=44 pi=[38,44)/1 crt=0'0 mlcod 0'0 active mbc={}] state: react AllReplicasActivated Activating complete
Dec 15 03:04:46 localhost ceph-osd[31375]: osd.0 pg_epoch: 45 pg[4.1c( empty local-lis/les=44/45 n=0 ec=40/24 lis/c=40/40 les/c/f=41/41/0 sis=44) [0,1,2] r=0 lpr=44 pi=[40,44)/1 crt=0'0 mlcod 0'0 active mbc={}] state: react AllReplicasActivated Activating complete
Dec 15 03:04:46 localhost ceph-osd[31375]: osd.0 pg_epoch: 45 pg[5.b( empty local-lis/les=44/45 n=0 ec=40/26 lis/c=40/40 les/c/f=41/41/0 sis=44) [0,1,5] r=0 lpr=44 pi=[40,44)/1 crt=0'0 mlcod 0'0 active mbc={}] state: react AllReplicasActivated Activating complete
Dec 15 03:04:46 localhost ceph-osd[31375]: osd.0 pg_epoch: 45 pg[2.15( empty local-lis/les=44/45 n=0 ec=37/20 lis/c=37/37 les/c/f=38/38/0 sis=44) [0,1,2] r=0 lpr=44 pi=[37,44)/1 crt=0'0 mlcod 0'0 active mbc={}] state: react AllReplicasActivated Activating complete
Dec 15 03:04:46 localhost ceph-osd[31375]: osd.0 pg_epoch: 45 pg[6.12( empty local-lis/les=44/45 n=0 ec=42/32 lis/c=42/42 les/c/f=43/43/0 sis=44) [0,2,1] r=0 lpr=44 pi=[42,44)/1 crt=0'0 mlcod 0'0 active mbc={}] state: react AllReplicasActivated Activating complete
Dec 15 03:04:46 localhost ceph-osd[31375]: osd.0 pg_epoch: 45 pg[2.a( empty local-lis/les=44/45 n=0 ec=37/20 lis/c=37/37 les/c/f=38/38/0 sis=44) [0,4,2] r=0 lpr=44 pi=[37,44)/1 crt=0'0 mlcod 0'0 active mbc={}] state: react AllReplicasActivated Activating complete
Dec 15 03:04:46 localhost ceph-osd[31375]: osd.0 pg_epoch: 45 pg[3.15( empty local-lis/les=44/45 n=0 ec=38/22 lis/c=38/38 les/c/f=39/39/0 sis=44) [0,2,4] r=0 lpr=44 pi=[38,44)/1 crt=0'0 mlcod 0'0 active mbc={}] state: react AllReplicasActivated Activating complete
Dec 15 03:04:46 localhost ceph-osd[31375]: osd.0 pg_epoch: 45 pg[2.13( empty local-lis/les=44/45 n=0 ec=37/20 lis/c=37/37 les/c/f=38/38/0 sis=44) [0,5,4] r=0 lpr=44 pi=[37,44)/1 crt=0'0 mlcod 0'0 active mbc={}] state: react AllReplicasActivated Activating complete
Dec 15 03:04:48 localhost python3[55855]: ansible-ansible.legacy.stat Invoked with path=/var/lib/tripleo-config/ceph/ceph.client.openstack.keyring follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Dec 15 03:04:48 localhost python3[55898]: ansible-ansible.legacy.copy Invoked with src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1765785887.7463043-91277-78939664672513/source dest=/var/lib/tripleo-config/ceph/ceph.client.openstack.keyring mode=600 _original_basename=ceph.client.openstack.keyring follow=False checksum=12c95e7154f1a9ee6f3f3cf63bf30d2d8ea78471 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 15 03:04:50 localhost systemd[1]: Started /usr/bin/podman healthcheck run 6278d648aec9c9f12e0efaafa0d6ae5653eeb34459516699e562eeb9c8d23b3c.
Dec 15 03:04:50 localhost systemd[1]: tmp-crun.2lKSmv.mount: Deactivated successfully.
Dec 15 03:04:50 localhost podman[55913]: 2025-12-15 08:04:50.752023451 +0000 UTC m=+0.083545961 container health_status 6278d648aec9c9f12e0efaafa0d6ae5653eeb34459516699e562eeb9c8d23b3c (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, konflux.additional-tags=17.1.12 17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, batch=17.1_20251118.1, com.redhat.component=openstack-qdrouterd-container, vcs-type=git, config_id=tripleo_step1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '29d07da103bbd44a9ed3e29999314b03'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, container_name=metrics_qdr, description=Red Hat OpenStack Platform 17.1 qdrouterd, architecture=x86_64, io.buildah.version=1.41.4, maintainer=OpenStack TripleO Team, release=1761123044, name=rhosp17/openstack-qdrouterd, version=17.1.12, url=https://www.redhat.com, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, summary=Red Hat OpenStack Platform 17.1 qdrouterd, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, tcib_managed=true, build-date=2025-11-18T22:49:46Z)
Dec 15 03:04:50 localhost podman[55913]: 2025-12-15 08:04:50.942135256 +0000 UTC m=+0.273657746 container exec_died 6278d648aec9c9f12e0efaafa0d6ae5653eeb34459516699e562eeb9c8d23b3c (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, name=rhosp17/openstack-qdrouterd, vcs-type=git, config_id=tripleo_step1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, build-date=2025-11-18T22:49:46Z, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, description=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, tcib_managed=true, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, container_name=metrics_qdr, com.redhat.component=openstack-qdrouterd-container, distribution-scope=public, architecture=x86_64, url=https://www.redhat.com, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '29d07da103bbd44a9ed3e29999314b03'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, summary=Red Hat OpenStack Platform 17.1 qdrouterd, version=17.1.12, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, release=1761123044, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, managed_by=tripleo_ansible, konflux.additional-tags=17.1.12 17.1_20251118.1, batch=17.1_20251118.1)
Dec 15 03:04:50 localhost systemd[1]: 6278d648aec9c9f12e0efaafa0d6ae5653eeb34459516699e562eeb9c8d23b3c.service: Deactivated successfully.
Dec 15 03:04:53 localhost python3[55989]: ansible-ansible.legacy.stat Invoked with path=/var/lib/tripleo-config/ceph/ceph.client.manila.keyring follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Dec 15 03:04:53 localhost python3[56032]: ansible-ansible.legacy.copy Invoked with src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1765785893.2773085-91277-184913711877203/source dest=/var/lib/tripleo-config/ceph/ceph.client.manila.keyring mode=600 _original_basename=ceph.client.manila.keyring follow=False checksum=334bf10db749aaee0d7bad3af75c5ac4403d08c1 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 15 03:04:54 localhost ceph-osd[32311]: log_channel(cluster) log [DBG] : 3.1d scrub starts
Dec 15 03:04:54 localhost ceph-osd[31375]: log_channel(cluster) log [DBG] : 2.1c scrub starts
Dec 15 03:04:58 localhost python3[56094]: ansible-ansible.legacy.stat Invoked with path=/var/lib/tripleo-config/ceph/ceph.conf follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Dec 15 03:04:58 localhost ceph-osd[32311]: log_channel(cluster) log [DBG] : 3.1d scrub ok
Dec 15 03:04:59 localhost ceph-osd[32311]: log_channel(cluster) log [DBG] : 5.0 deep-scrub starts
Dec 15 03:04:59 localhost ceph-osd[32311]: log_channel(cluster) log [DBG] : 5.0 deep-scrub ok
Dec 15 03:04:59 localhost python3[56137]: ansible-ansible.legacy.copy Invoked with src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1765785898.5818276-91277-106586299936733/source dest=/var/lib/tripleo-config/ceph/ceph.conf mode=644 _original_basename=ceph.conf follow=False checksum=b64ca508f62c69b83b0b4d40f0fcd0c8fa8af3e1 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 15 03:05:00 localhost ceph-osd[31375]: osd.0 pg_epoch: 46 pg[7.2( v 35'39 (0'0,35'39] local-lis/les=42/43 n=2 ec=42/33 lis/c=42/42 les/c/f=43/43/0 sis=46 pruub=13.043006897s) [4,3,5] r=-1 lpr=46 pi=[42,46)/1 luod=0'0 crt=35'39 lcod 0'0 mlcod 0'0 active pruub 1207.376953125s@ mbc={}] start_peering_interval up [5,0,4] -> [4,3,5], acting [5,0,4] -> [4,3,5], acting_primary 5 -> 4, up_primary 5 -> 4, role 1 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Dec 15 03:05:00 localhost ceph-osd[31375]: osd.0 pg_epoch: 46 pg[7.6( v 35'39 (0'0,35'39] local-lis/les=42/43 n=2 ec=42/33 lis/c=42/42 les/c/f=43/43/0 sis=46 pruub=13.040727615s) [4,3,5] r=-1 lpr=46 pi=[42,46)/1 luod=0'0 crt=35'39 lcod 0'0 mlcod 0'0 active pruub 1207.374755859s@ mbc={}] start_peering_interval up [5,0,4] -> [4,3,5], acting [5,0,4] -> [4,3,5], acting_primary 5 -> 4, up_primary 5 -> 4, role 1 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Dec 15 03:05:00 localhost ceph-osd[31375]: osd.0 pg_epoch: 46 pg[7.2( v 35'39 (0'0,35'39] local-lis/les=42/43 n=2 ec=42/33 lis/c=42/42 les/c/f=43/43/0 sis=46 pruub=13.042916298s) [4,3,5] r=-1 lpr=46 pi=[42,46)/1 crt=35'39 lcod 0'0 mlcod 0'0 unknown NOTIFY pruub 1207.376953125s@ mbc={}] state: transitioning to Stray
Dec 15 03:05:00 localhost ceph-osd[31375]: osd.0 pg_epoch: 46 pg[7.6( v 35'39 (0'0,35'39] local-lis/les=42/43 n=2 ec=42/33 lis/c=42/42 les/c/f=43/43/0 sis=46 pruub=13.040649414s) [4,3,5] r=-1 lpr=46 pi=[42,46)/1 crt=35'39 lcod 0'0 mlcod 0'0 unknown NOTIFY pruub 1207.374755859s@ mbc={}] state: transitioning to Stray
Dec 15 03:05:00 localhost ceph-osd[31375]: osd.0 pg_epoch: 46 pg[7.a( v 35'39 (0'0,35'39] local-lis/les=42/43 n=1 ec=42/33 lis/c=42/42 les/c/f=43/43/0 sis=46 pruub=13.042865753s) [4,3,5] r=-1 lpr=46 pi=[42,46)/1 luod=0'0 crt=35'39 lcod 0'0 mlcod 0'0 active pruub 1207.377075195s@ mbc={}] start_peering_interval up [5,0,4] -> [4,3,5], acting [5,0,4] -> [4,3,5], acting_primary 5 -> 4, up_primary 5 -> 4, role 1 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Dec 15 03:05:00 localhost ceph-osd[31375]: osd.0 pg_epoch: 46 pg[7.e( v 35'39 (0'0,35'39] local-lis/les=42/43 n=1 ec=42/33 lis/c=42/42 les/c/f=43/43/0 sis=46 pruub=13.040831566s) [4,3,5] r=-1 lpr=46 pi=[42,46)/1 luod=0'0 crt=35'39 lcod 0'0 mlcod 0'0 active pruub 1207.374755859s@ mbc={}] start_peering_interval up [5,0,4] -> [4,3,5], acting [5,0,4] -> [4,3,5], acting_primary 5 -> 4, up_primary 5 -> 4, role 1 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Dec 15 03:05:00 localhost ceph-osd[31375]: osd.0 pg_epoch: 46 pg[7.a( v 35'39 (0'0,35'39] local-lis/les=42/43 n=1 ec=42/33 lis/c=42/42 les/c/f=43/43/0 sis=46 pruub=13.042243004s) [4,3,5] r=-1 lpr=46 pi=[42,46)/1 crt=35'39 lcod 0'0 mlcod 0'0 unknown NOTIFY pruub 1207.377075195s@ mbc={}] state: transitioning to Stray
Dec 15 03:05:00 localhost ceph-osd[31375]: osd.0 pg_epoch: 46 pg[7.e( v 35'39 (0'0,35'39] local-lis/les=42/43 n=1 ec=42/33 lis/c=42/42 les/c/f=43/43/0 sis=46 pruub=13.040070534s) [4,3,5] r=-1 lpr=46 pi=[42,46)/1 crt=35'39 lcod 0'0 mlcod 0'0 unknown NOTIFY pruub 1207.374755859s@ mbc={}] state: transitioning to Stray
Dec 15 03:05:01 localhost ceph-osd[32311]: osd.3 pg_epoch: 46 pg[7.2( empty local-lis/les=0/0 n=0 ec=42/33 lis/c=42/42 les/c/f=43/43/0 sis=46) [4,3,5] r=1 lpr=46 pi=[42,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Stray
Dec 15 03:05:01 localhost ceph-osd[32311]: osd.3 pg_epoch: 46 pg[7.6( empty local-lis/les=0/0 n=0 ec=42/33 lis/c=42/42 les/c/f=43/43/0 sis=46) [4,3,5] r=1 lpr=46 pi=[42,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Stray
Dec 15 03:05:01 localhost ceph-osd[32311]: osd.3 pg_epoch: 46 pg[7.a( empty local-lis/les=0/0 n=0 ec=42/33 lis/c=42/42 les/c/f=43/43/0 sis=46) [4,3,5] r=1 lpr=46 pi=[42,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Stray
Dec 15 03:05:01 localhost ceph-osd[32311]: osd.3 pg_epoch: 46 pg[7.e( empty local-lis/les=0/0 n=0 ec=42/33 lis/c=42/42 les/c/f=43/43/0 sis=46) [4,3,5] r=1 lpr=46 pi=[42,46)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Stray
Dec 15 03:05:02 localhost ceph-osd[31375]: osd.0 pg_epoch: 48 pg[7.f( v 35'39 (0'0,35'39] local-lis/les=44/45 n=1 ec=42/33 lis/c=44/44 les/c/f=45/45/0 sis=48 pruub=8.157066345s) [1,2,3] r=-1 lpr=48 pi=[44,48)/1 luod=0'0 crt=35'39 lcod 0'0 mlcod 0'0 active pruub 1205.141479492s@ mbc={}] start_peering_interval up [2,0,4] -> [1,2,3], acting [2,0,4] -> [1,2,3], acting_primary 2 -> 1, up_primary 2 -> 1, role 1 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Dec 15 03:05:02 localhost ceph-osd[31375]: osd.0 pg_epoch: 48 pg[7.7( v 35'39 (0'0,35'39] local-lis/les=44/45 n=1 ec=42/33 lis/c=44/44 les/c/f=45/45/0 sis=48 pruub=8.155953407s) [1,2,3] r=-1 lpr=48 pi=[44,48)/1 luod=0'0 crt=35'39 lcod 0'0 mlcod 0'0 active pruub 1205.140502930s@ mbc={}] start_peering_interval up [2,0,4] -> [1,2,3], acting [2,0,4] -> [1,2,3], acting_primary 2 -> 1, up_primary 2 -> 1, role 1 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Dec 15 03:05:02 localhost ceph-osd[31375]: osd.0 pg_epoch: 48 pg[7.f( v 35'39 (0'0,35'39] local-lis/les=44/45 n=1 ec=42/33 lis/c=44/44 les/c/f=45/45/0 sis=48 pruub=8.157003403s) [1,2,3] r=-1 lpr=48 pi=[44,48)/1 crt=35'39 lcod 0'0 mlcod 0'0 unknown NOTIFY pruub 1205.141479492s@ mbc={}] state: transitioning to Stray
Dec 15 03:05:02 localhost ceph-osd[31375]: osd.0 pg_epoch: 48 pg[7.3( v 35'39 (0'0,35'39] local-lis/les=44/45 n=2 ec=42/33 lis/c=44/44 les/c/f=45/45/0 sis=48 pruub=8.156343460s) [1,2,3] r=-1 lpr=48 pi=[44,48)/1 luod=0'0 crt=35'39 lcod 0'0 mlcod 0'0 active pruub 1205.140869141s@ mbc={}] start_peering_interval up [2,0,4] -> [1,2,3], acting [2,0,4] -> [1,2,3], acting_primary 2 -> 1, up_primary 2 -> 1, role 1 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Dec 15 03:05:02 localhost ceph-osd[31375]: osd.0 pg_epoch: 48 pg[7.b( v 35'39 (0'0,35'39] local-lis/les=44/45 n=1 ec=42/33 lis/c=44/44 les/c/f=45/45/0 sis=48 pruub=8.156255722s) [1,2,3] r=-1 lpr=48 pi=[44,48)/1 luod=0'0 crt=35'39 lcod 0'0 mlcod 0'0 active pruub 1205.140869141s@ mbc={}] start_peering_interval up [2,0,4] -> [1,2,3], acting [2,0,4] -> [1,2,3], acting_primary 2 -> 1, up_primary 2 -> 1, role 1 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Dec 15 03:05:02 localhost ceph-osd[31375]: osd.0 pg_epoch: 48 pg[7.7( v 35'39 (0'0,35'39] local-lis/les=44/45 n=1 ec=42/33 lis/c=44/44 les/c/f=45/45/0 sis=48 pruub=8.155858994s) [1,2,3] r=-1 lpr=48 pi=[44,48)/1 crt=35'39 lcod 0'0 mlcod 0'0 unknown NOTIFY pruub 1205.140502930s@ mbc={}] state: transitioning to Stray
Dec 15 03:05:02 localhost ceph-osd[31375]: osd.0 pg_epoch: 48 pg[7.3( v 35'39 (0'0,35'39] local-lis/les=44/45 n=2 ec=42/33 lis/c=44/44 les/c/f=45/45/0 sis=48 pruub=8.156260490s) [1,2,3] r=-1 lpr=48 pi=[44,48)/1 crt=35'39 lcod 0'0 mlcod 0'0 unknown NOTIFY pruub 1205.140869141s@ mbc={}] state: transitioning to Stray
Dec 15 03:05:02 localhost ceph-osd[31375]: osd.0 pg_epoch: 48 pg[7.b( v 35'39 (0'0,35'39] local-lis/les=44/45 n=1 ec=42/33 lis/c=44/44 les/c/f=45/45/0 sis=48 pruub=8.156189919s) [1,2,3] r=-1 lpr=48 pi=[44,48)/1 crt=35'39 lcod 0'0 mlcod 0'0 unknown NOTIFY pruub 1205.140869141s@ mbc={}] state: transitioning to Stray
Dec 15 03:05:03 localhost ceph-osd[32311]: log_channel(cluster) log [DBG] : 5.e scrub starts
Dec 15 03:05:03 localhost ceph-osd[32311]: log_channel(cluster) log [DBG] : 5.e scrub ok
Dec 15 03:05:04 localhost ceph-osd[32311]: log_channel(cluster) log [DBG] : 2.12 scrub starts
Dec 15 03:05:04 localhost ceph-osd[32311]: log_channel(cluster) log [DBG] : 2.12 scrub ok
Dec 15 03:05:04 localhost ceph-osd[31375]: osd.0 pg_epoch: 49 pg[7.c( v 35'39 (0'0,35'39] local-lis/les=42/43 n=1 ec=42/33 lis/c=42/42 les/c/f=43/43/0 sis=49 pruub=8.902082443s) [1,5,0] r=2 lpr=49 pi=[42,49)/1 luod=0'0 crt=35'39 lcod 0'0 mlcod 0'0 active pruub 1207.374389648s@ mbc={}] start_peering_interval up [5,0,4] -> [1,5,0], acting [5,0,4] -> [1,5,0], acting_primary 5 -> 1, up_primary 5 -> 1, role 1 -> 2, features acting 4540138322906710015 upacting 4540138322906710015
Dec 15 03:05:04 localhost ceph-osd[31375]: osd.0 pg_epoch: 49 pg[7.c( v 35'39 (0'0,35'39] local-lis/les=42/43 n=1 ec=42/33 lis/c=42/42 les/c/f=43/43/0 sis=49 pruub=8.902002335s) [1,5,0] r=2 lpr=49 pi=[42,49)/1 crt=35'39 lcod 0'0 mlcod 0'0 unknown NOTIFY pruub 1207.374389648s@ mbc={}] state: transitioning to Stray
Dec 15 03:05:04 localhost ceph-osd[31375]: osd.0 pg_epoch: 49 pg[7.4( v 35'39 (0'0,35'39] local-lis/les=42/43 n=2 ec=42/33 lis/c=42/42 les/c/f=43/43/0 sis=49 pruub=8.901888847s) [1,5,0] r=2 lpr=49 pi=[42,49)/1 luod=0'0 crt=35'39 lcod 0'0 mlcod 0'0 active pruub 1207.374389648s@ mbc={}] start_peering_interval up [5,0,4] -> [1,5,0], acting [5,0,4] -> [1,5,0], acting_primary 5 -> 1, up_primary 5 -> 1, role 1 -> 2, features acting 4540138322906710015 upacting 4540138322906710015
Dec 15 03:05:04 localhost ceph-osd[31375]: osd.0 pg_epoch: 49 pg[7.4( v 35'39 (0'0,35'39] local-lis/les=42/43 n=2 ec=42/33 lis/c=42/42 les/c/f=43/43/0 sis=49 pruub=8.901805878s) [1,5,0] r=2 lpr=49 pi=[42,49)/1 crt=35'39 lcod 0'0 mlcod 0'0 unknown NOTIFY pruub 1207.374389648s@ mbc={}] state: transitioning to Stray
Dec 15 03:05:04 localhost ceph-osd[32311]: osd.3 pg_epoch: 48 pg[7.b( empty local-lis/les=0/0 n=0 ec=42/33 lis/c=44/44 les/c/f=45/45/0 sis=48) [1,2,3] r=2 lpr=48 pi=[44,48)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Stray
Dec 15 03:05:04 localhost ceph-osd[32311]: osd.3 pg_epoch: 48 pg[7.f( empty local-lis/les=0/0 n=0 ec=42/33 lis/c=44/44 les/c/f=45/45/0 sis=48) [1,2,3] r=2 lpr=48 pi=[44,48)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Stray
Dec 15 03:05:04 localhost ceph-osd[32311]: osd.3 pg_epoch: 48 pg[7.3( empty local-lis/les=0/0 n=0 ec=42/33 lis/c=44/44 les/c/f=45/45/0 sis=48) [1,2,3] r=2 lpr=48 pi=[44,48)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Stray
Dec 15 03:05:04 localhost ceph-osd[32311]: osd.3 pg_epoch: 48 pg[7.7( empty local-lis/les=0/0 n=0 ec=42/33 lis/c=44/44 les/c/f=45/45/0 sis=48) [1,2,3] r=2 lpr=48 pi=[44,48)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Stray
Dec 15 03:05:05 localhost ceph-osd[32311]: log_channel(cluster) log [DBG] : 2.5 scrub starts
Dec 15 03:05:05 localhost python3[56199]: ansible-ansible.legacy.stat Invoked with path=/etc/puppet/hieradata/config_step.json follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Dec 15 03:05:05 localhost ceph-osd[32311]: log_channel(cluster) log [DBG] : 2.5 scrub ok
Dec 15 03:05:05 localhost ceph-osd[31375]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Dec 15 03:05:05 localhost ceph-osd[31375]: rocksdb: [db/db_impl/db_impl.cc:1111] #012** DB Stats **#012Uptime(secs): 1200.1 total, 600.0 interval#012Cumulative writes: 4015 writes, 19K keys, 4015 commit groups, 1.0 writes per commit group, ingest: 0.02 GB, 0.01 MB/s#012Cumulative WAL: 4015 writes, 309 syncs, 12.99 writes per sync, written: 0.02 GB, 0.01 MB/s#012Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent#012Interval writes: 617 writes, 2461 keys, 617 commit groups, 1.0 writes per commit group, ingest: 1.20 MB, 0.00 MB/s#012Interval WAL: 617 writes, 107 syncs, 5.77 writes per sync, written: 0.00 GB, 0.00 MB/s#012Interval stall: 00:00:0.000 H:M:S, 0.0 percent#012#012** Compaction Stats [default] **#012Level Files Size Score Read(GB) Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB)
Wblob(GB)#012------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012 L0 2/0 2.61 KB 0.2 0.0 0.0 0.0 0.0 0.0 0.0 1.0 0.0 0.2 0.01 0.00 1 0.007 0 0 0.0 0.0#012 Sum 2/0 2.61 KB 0.0 0.0 0.0 0.0 0.0 0.0 0.0 1.0 0.0 0.2 0.01 0.00 1 0.007 0 0 0.0 0.0#012 Int 0/0 0.00 KB 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.00 0.00 0 0.000 0 0 0.0 0.0#012#012** Compaction Stats [default] **#012Priority Files Size Score Read(GB) Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012User 0/0 0.00 KB 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.2 0.01 0.00 1 0.007 0 0 0.0 0.0#012#012Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0#012#012Uptime(secs): 1200.1 total, 600.0 interval#012Flush(GB): cumulative 0.000, interval 0.000#012AddFile(GB): cumulative 0.000, interval 0.000#012AddFile(Total Files): cumulative 0, interval 0#012AddFile(L0 Files): cumulative 0, interval 0#012AddFile(Keys): cumulative 0, interval 0#012Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds#012Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds#012Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count#012Block cache BinnedLRUCache@0x5568990c62d0#2 capacity: 1.62 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 3 last_copies: 8 last_secs: 
4.8e-05 secs_since: 0#012Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,8.34465e-05%) FilterBlock(3,0.33 KB,1.92569e-05%) IndexBlock(3,0.34 KB,2.01739e-05%) Misc(1,0.00 KB,0%)#012#012** File Read Latency Histogram By Level [default] **#012#012** Compaction Stats [m-0] **#012Level Files Size Score Read(GB) Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012 Sum 0/0 0.00 KB 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.00 0.00 0 0.000 0 0 0.0 0.0#012 Int 0/0 0.00 KB 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.00 0.00 0 0.000 0 0 0.0 0.0#012#012** Compaction Stats [m-0] **#012Priority Files Size Score Read(GB) Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012#012Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0#012#012Uptime(secs): 1200.1 total, 600.0 interval#012Flush(GB): cumulative 0.000, interval 0.000#012AddFile(GB): cumulative 0.000, interval 0.000#012AddFile(Total Files): cumulative 0, interval 0#012AddFile(L0 Files): cumulative 0, interval 0#012AddFile(Keys): cumulative 0, interval 0#012Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds#012Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds#012Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for 
pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count#012Block cache BinnedLRUCache@0x5568990c62d0#2 capacity: 1.62 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 3 last_copies: 8 last_secs: 4.8e-05 secs_since: 0#012Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,8.34465e-05%) FilterBlock(3,0.33 KB,1.92569e-05%) IndexBlock(3,0.34 KB,2.01739e-05%) Misc(1,0.00 KB,0%)#012#012** File Read Latency Histogram By Level [m-0] **#012#012** Compaction Stats [m-1] **#012Level Files Size Score Read(GB) Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012 Sum 0/0 0.00 KB 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.00 0.00 0 0.000 0 0 0.0 0.0#012 Int 0/0 0.00 KB 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.00 0.00 0 0.000 0 0 0.0 0.0#012#012** Compaction Stats [m-1] **#012Priority Files Size Score Read(GB) Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012#012Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0#012#012Uptime(secs): 1200.1 total, 600.0 interval#012Flush(GB): cumulative 0.000, interval 0.000#012AddFile(GB): cumulative 0.000, interval 0.000#012AddFile(Total Files): cumulative 0, interval 0#012AddFile(L0 Files): cumulative 0, interval 0#012AddFile(Keys): cumulative 0, interval 0#012Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 
0.00 GB read, 0.00 MB/s read, 0.0 seconds#012Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds#012Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memt Dec 15 03:05:05 localhost python3[56244]: ansible-ansible.legacy.copy Invoked with dest=/etc/puppet/hieradata/config_step.json force=True mode=0600 src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1765785905.2615113-91819-103381642870045/source _original_basename=tmpfivit5vm follow=False checksum=f17091ee142621a3c8290c8c96b5b52d67b3a864 backup=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Dec 15 03:05:07 localhost python3[56306]: ansible-ansible.legacy.stat Invoked with path=/usr/local/sbin/containers-tmpwatch follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True Dec 15 03:05:07 localhost ceph-osd[32311]: log_channel(cluster) log [DBG] : 6.1 deep-scrub starts Dec 15 03:05:07 localhost ceph-osd[32311]: log_channel(cluster) log [DBG] : 6.1 deep-scrub ok Dec 15 03:05:07 localhost python3[56349]: ansible-ansible.legacy.copy Invoked with dest=/usr/local/sbin/containers-tmpwatch group=root mode=493 owner=root src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1765785906.7599242-91906-117173874112941/source _original_basename=tmpyb0h1dtc follow=False checksum=84397b037dad9813fed388c4bcdd4871f384cd22 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None Dec 15 03:05:07 localhost python3[56379]: ansible-cron Invoked with 
job=/usr/local/sbin/containers-tmpwatch name=Remove old logs special_time=daily user=root state=present backup=False minute=* hour=* day=* month=* weekday=* disabled=False env=False cron_file=None insertafter=None insertbefore=None Dec 15 03:05:08 localhost python3[56397]: ansible-stat Invoked with path=/var/lib/tripleo-config/container-startup-config/step_2 follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1 Dec 15 03:05:09 localhost ceph-osd[32311]: log_channel(cluster) log [DBG] : 4.2 scrub starts Dec 15 03:05:09 localhost ceph-osd[32311]: log_channel(cluster) log [DBG] : 4.2 scrub ok Dec 15 03:05:09 localhost ansible-async_wrapper.py[56569]: Invoked with 178678115042 3600 /home/tripleo-admin/.ansible/tmp/ansible-tmp-1765785909.359368-91998-65390299406335/AnsiballZ_command.py _ Dec 15 03:05:09 localhost ansible-async_wrapper.py[56572]: Starting module and watcher Dec 15 03:05:09 localhost ansible-async_wrapper.py[56572]: Start watching 56573 (3600) Dec 15 03:05:09 localhost ansible-async_wrapper.py[56573]: Start module (56573) Dec 15 03:05:09 localhost ansible-async_wrapper.py[56569]: Return async_wrapper task started. 
Dec 15 03:05:10 localhost ceph-osd[32311]: osd.3 pg_epoch: 51 pg[7.5( empty local-lis/les=0/0 n=0 ec=42/33 lis/c=44/44 les/c/f=45/45/0 sis=51) [3,4,2] r=0 lpr=51 pi=[44,51)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Primary Dec 15 03:05:10 localhost ceph-osd[32311]: osd.3 pg_epoch: 51 pg[7.d( empty local-lis/les=0/0 n=0 ec=42/33 lis/c=44/44 les/c/f=45/45/0 sis=51) [3,4,2] r=0 lpr=51 pi=[44,51)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Primary Dec 15 03:05:10 localhost ceph-osd[32311]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS ------- Dec 15 03:05:10 localhost ceph-osd[32311]: rocksdb: [db/db_impl/db_impl.cc:1111] #012** DB Stats **#012Uptime(secs): 1200.2 total, 600.0 interval#012Cumulative writes: 4821 writes, 22K keys, 4821 commit groups, 1.0 writes per commit group, ingest: 0.02 GB, 0.01 MB/s#012Cumulative WAL: 4820 writes, 363 syncs, 13.28 writes per sync, written: 0.02 GB, 0.01 MB/s#012Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent#012Interval writes: 1572 writes, 6023 keys, 1572 commit groups, 1.0 writes per commit group, ingest: 2.15 MB, 0.00 MB/s#012Interval WAL: 1571 writes, 223 syncs, 7.04 writes per sync, written: 0.00 GB, 0.00 MB/s#012Interval stall: 00:00:0.000 H:M:S, 0.0 percent#012#012** Compaction Stats [default] **#012Level Files Size Score Read(GB) Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012 L0 2/0 2.61 KB 0.2 0.0 0.0 0.0 0.0 0.0 0.0 1.0 0.0 0.3 0.00 0.00 1 0.005 0 0 0.0 0.0#012 Sum 2/0 2.61 KB 0.0 0.0 0.0 0.0 0.0 0.0 0.0 1.0 0.0 0.3 0.00 0.00 1 0.005 0 0 0.0 0.0#012 Int 0/0 0.00 KB 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.00 0.00 0 0.000 0 0 0.0 0.0#012#012** Compaction Stats [default] 
**#012Priority Files Size Score Read(GB) Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012User 0/0 0.00 KB 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.3 0.00 0.00 1 0.005 0 0 0.0 0.0#012#012Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0#012#012Uptime(secs): 1200.2 total, 600.0 interval#012Flush(GB): cumulative 0.000, interval 0.000#012AddFile(GB): cumulative 0.000, interval 0.000#012AddFile(Total Files): cumulative 0, interval 0#012AddFile(L0 Files): cumulative 0, interval 0#012AddFile(Keys): cumulative 0, interval 0#012Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds#012Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds#012Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count#012Block cache BinnedLRUCache@0x564edfcfa2d0#2 capacity: 1.62 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 3 last_copies: 8 last_secs: 4e-05 secs_since: 0#012Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,8.34465e-05%) FilterBlock(3,0.33 KB,1.92569e-05%) IndexBlock(3,0.34 KB,2.01739e-05%) Misc(1,0.00 KB,0%)#012#012** File Read Latency Histogram By Level [default] **#012#012** Compaction Stats [m-0] **#012Level Files Size Score Read(GB) Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) 
Wblob(GB)#012------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012 Sum 0/0 0.00 KB 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.00 0.00 0 0.000 0 0 0.0 0.0#012 Int 0/0 0.00 KB 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.00 0.00 0 0.000 0 0 0.0 0.0#012#012** Compaction Stats [m-0] **#012Priority Files Size Score Read(GB) Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012#012Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0#012#012Uptime(secs): 1200.2 total, 600.0 interval#012Flush(GB): cumulative 0.000, interval 0.000#012AddFile(GB): cumulative 0.000, interval 0.000#012AddFile(Total Files): cumulative 0, interval 0#012AddFile(L0 Files): cumulative 0, interval 0#012AddFile(Keys): cumulative 0, interval 0#012Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds#012Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds#012Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count#012Block cache BinnedLRUCache@0x564edfcfa2d0#2 capacity: 1.62 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 3 last_copies: 8 last_secs: 4e-05 secs_since: 0#012Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,8.34465e-05%) FilterBlock(3,0.33 KB,1.92569e-05%) IndexBlock(3,0.34 KB,2.01739e-05%) Misc(1,0.00 
KB,0%)#012#012** File Read Latency Histogram By Level [m-0] **#012#012** Compaction Stats [m-1] **#012Level Files Size Score Read(GB) Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012 Sum 0/0 0.00 KB 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.00 0.00 0 0.000 0 0 0.0 0.0#012 Int 0/0 0.00 KB 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.00 0.00 0 0.000 0 0 0.0 0.0#012#012** Compaction Stats [m-1] **#012Priority Files Size Score Read(GB) Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012#012Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0#012#012Uptime(secs): 1200.2 total, 600.0 interval#012Flush(GB): cumulative 0.000, interval 0.000#012AddFile(GB): cumulative 0.000, interval 0.000#012AddFile(Total Files): cumulative 0, interval 0#012AddFile(L0 Files): cumulative 0, interval 0#012AddFile(Keys): cumulative 0, interval 0#012Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds#012Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds#012Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memta Dec 15 03:05:10 localhost ceph-osd[31375]: osd.0 pg_epoch: 51 pg[7.5( v 35'39 (0'0,35'39] local-lis/les=44/45 n=2 
ec=42/33 lis/c=44/44 les/c/f=45/45/0 sis=51 pruub=8.849245071s) [3,4,2] r=-1 lpr=51 pi=[44,51)/1 luod=0'0 crt=35'39 lcod 0'0 mlcod 0'0 active pruub 1213.142089844s@ mbc={}] start_peering_interval up [2,0,4] -> [3,4,2], acting [2,0,4] -> [3,4,2], acting_primary 2 -> 3, up_primary 2 -> 3, role 1 -> -1, features acting 4540138322906710015 upacting 4540138322906710015 Dec 15 03:05:10 localhost ceph-osd[31375]: osd.0 pg_epoch: 51 pg[7.d( v 35'39 (0'0,35'39] local-lis/les=44/45 n=1 ec=42/33 lis/c=44/44 les/c/f=45/45/0 sis=51 pruub=8.848574638s) [3,4,2] r=-1 lpr=51 pi=[44,51)/1 luod=0'0 crt=35'39 lcod 0'0 mlcod 0'0 active pruub 1213.141479492s@ mbc={}] start_peering_interval up [2,0,4] -> [3,4,2], acting [2,0,4] -> [3,4,2], acting_primary 2 -> 3, up_primary 2 -> 3, role 1 -> -1, features acting 4540138322906710015 upacting 4540138322906710015 Dec 15 03:05:10 localhost ceph-osd[31375]: osd.0 pg_epoch: 51 pg[7.5( v 35'39 (0'0,35'39] local-lis/les=44/45 n=2 ec=42/33 lis/c=44/44 les/c/f=45/45/0 sis=51 pruub=8.849136353s) [3,4,2] r=-1 lpr=51 pi=[44,51)/1 crt=35'39 lcod 0'0 mlcod 0'0 unknown NOTIFY pruub 1213.142089844s@ mbc={}] state: transitioning to Stray Dec 15 03:05:10 localhost ceph-osd[31375]: osd.0 pg_epoch: 51 pg[7.d( v 35'39 (0'0,35'39] local-lis/les=44/45 n=1 ec=42/33 lis/c=44/44 les/c/f=45/45/0 sis=51 pruub=8.848485947s) [3,4,2] r=-1 lpr=51 pi=[44,51)/1 crt=35'39 lcod 0'0 mlcod 0'0 unknown NOTIFY pruub 1213.141479492s@ mbc={}] state: transitioning to Stray Dec 15 03:05:10 localhost ceph-osd[32311]: log_channel(cluster) log [DBG] : 6.1b scrub starts Dec 15 03:05:10 localhost ceph-osd[32311]: log_channel(cluster) log [DBG] : 6.1b scrub ok Dec 15 03:05:10 localhost python3[56593]: ansible-ansible.legacy.async_status Invoked with jid=178678115042.56569 mode=status _async_dir=/tmp/.ansible_async Dec 15 03:05:11 localhost ceph-osd[32311]: log_channel(cluster) log [DBG] : 4.19 scrub starts Dec 15 03:05:11 localhost ceph-osd[32311]: log_channel(cluster) log [DBG] : 4.19 
scrub ok Dec 15 03:05:11 localhost ceph-osd[32311]: osd.3 pg_epoch: 52 pg[7.5( v 35'39 lc 35'9 (0'0,35'39] local-lis/les=51/52 n=2 ec=42/33 lis/c=44/44 les/c/f=45/45/0 sis=51) [3,4,2] r=0 lpr=51 pi=[44,51)/1 crt=35'39 lcod 0'0 mlcod 0'0 active+degraded m=2 mbc={255={(2+1)=2}}] state: react AllReplicasActivated Activating complete Dec 15 03:05:11 localhost ceph-osd[32311]: osd.3 pg_epoch: 52 pg[7.d( v 35'39 lc 35'10 (0'0,35'39] local-lis/les=51/52 n=1 ec=42/33 lis/c=44/44 les/c/f=45/45/0 sis=51) [3,4,2] r=0 lpr=51 pi=[44,51)/1 crt=35'39 lcod 0'0 mlcod 0'0 active+degraded m=2 mbc={255={(2+1)=2}}] state: react AllReplicasActivated Activating complete Dec 15 03:05:12 localhost ceph-osd[31375]: log_channel(cluster) log [DBG] : 3.1b deep-scrub starts Dec 15 03:05:12 localhost ceph-osd[31375]: log_channel(cluster) log [DBG] : 3.1b deep-scrub ok Dec 15 03:05:12 localhost ceph-osd[32311]: log_channel(cluster) log [DBG] : 2.1b scrub starts Dec 15 03:05:12 localhost ceph-osd[32311]: osd.3 pg_epoch: 53 pg[7.6( v 35'39 (0'0,35'39] local-lis/les=46/47 n=2 ec=42/33 lis/c=46/46 les/c/f=47/47/0 sis=53 pruub=12.922876358s) [1,2,3] r=2 lpr=53 pi=[46,53)/1 luod=0'0 crt=35'39 mlcod 0'0 active pruub 1215.199096680s@ mbc={}] start_peering_interval up [4,3,5] -> [1,2,3], acting [4,3,5] -> [1,2,3], acting_primary 4 -> 1, up_primary 4 -> 1, role 1 -> 2, features acting 4540138322906710015 upacting 4540138322906710015 Dec 15 03:05:12 localhost ceph-osd[32311]: osd.3 pg_epoch: 53 pg[7.6( v 35'39 (0'0,35'39] local-lis/les=46/47 n=2 ec=42/33 lis/c=46/46 les/c/f=47/47/0 sis=53 pruub=12.922755241s) [1,2,3] r=2 lpr=53 pi=[46,53)/1 crt=35'39 mlcod 0'0 unknown NOTIFY pruub 1215.199096680s@ mbc={}] state: transitioning to Stray Dec 15 03:05:12 localhost ceph-osd[32311]: osd.3 pg_epoch: 53 pg[7.e( v 35'39 (0'0,35'39] local-lis/les=46/47 n=1 ec=42/33 lis/c=46/46 les/c/f=47/47/0 sis=53 pruub=12.921726227s) [1,2,3] r=2 lpr=53 pi=[46,53)/1 luod=0'0 crt=35'39 mlcod 0'0 active pruub 1215.199096680s@ mbc={}] 
start_peering_interval up [4,3,5] -> [1,2,3], acting [4,3,5] -> [1,2,3], acting_primary 4 -> 1, up_primary 4 -> 1, role 1 -> 2, features acting 4540138322906710015 upacting 4540138322906710015 Dec 15 03:05:12 localhost ceph-osd[32311]: osd.3 pg_epoch: 53 pg[7.e( v 35'39 (0'0,35'39] local-lis/les=46/47 n=1 ec=42/33 lis/c=46/46 les/c/f=47/47/0 sis=53 pruub=12.921532631s) [1,2,3] r=2 lpr=53 pi=[46,53)/1 crt=35'39 mlcod 0'0 unknown NOTIFY pruub 1215.199096680s@ mbc={}] state: transitioning to Stray Dec 15 03:05:13 localhost puppet-user[56591]: Warning: /etc/puppet/hiera.yaml: Use of 'hiera.yaml' version 3 is deprecated. It should be converted to version 5 Dec 15 03:05:13 localhost puppet-user[56591]: (file: /etc/puppet/hiera.yaml) Dec 15 03:05:13 localhost puppet-user[56591]: Warning: Undefined variable '::deploy_config_name'; Dec 15 03:05:13 localhost puppet-user[56591]: (file & line not available) Dec 15 03:05:13 localhost puppet-user[56591]: Warning: The function 'hiera' is deprecated in favor of using 'lookup'. See https://puppet.com/docs/puppet/7.10/deprecated_language.html Dec 15 03:05:13 localhost puppet-user[56591]: (file & line not available) Dec 15 03:05:13 localhost puppet-user[56591]: Warning: Unknown variable: '::deployment_type'. (file: /etc/puppet/modules/tripleo/manifests/profile/base/database/mysql/client.pp, line: 89, column: 8) Dec 15 03:05:13 localhost puppet-user[56591]: Warning: Unknown variable: '::deployment_type'. 
(file: /etc/puppet/modules/tripleo/manifests/packages.pp, line: 39, column: 69) Dec 15 03:05:13 localhost puppet-user[56591]: Notice: Compiled catalog for np0005559462.localdomain in environment production in 0.14 seconds Dec 15 03:05:13 localhost puppet-user[56591]: Notice: Applied catalog in 0.04 seconds Dec 15 03:05:13 localhost puppet-user[56591]: Application: Dec 15 03:05:13 localhost puppet-user[56591]: Initial environment: production Dec 15 03:05:13 localhost puppet-user[56591]: Converged environment: production Dec 15 03:05:13 localhost puppet-user[56591]: Run mode: user Dec 15 03:05:13 localhost puppet-user[56591]: Changes: Dec 15 03:05:13 localhost puppet-user[56591]: Events: Dec 15 03:05:13 localhost puppet-user[56591]: Resources: Dec 15 03:05:13 localhost puppet-user[56591]: Total: 10 Dec 15 03:05:13 localhost puppet-user[56591]: Time: Dec 15 03:05:13 localhost puppet-user[56591]: Schedule: 0.00 Dec 15 03:05:13 localhost puppet-user[56591]: File: 0.00 Dec 15 03:05:13 localhost puppet-user[56591]: Exec: 0.01 Dec 15 03:05:13 localhost puppet-user[56591]: Augeas: 0.01 Dec 15 03:05:13 localhost puppet-user[56591]: Transaction evaluation: 0.03 Dec 15 03:05:13 localhost puppet-user[56591]: Catalog application: 0.04 Dec 15 03:05:13 localhost puppet-user[56591]: Config retrieval: 0.18 Dec 15 03:05:13 localhost puppet-user[56591]: Last run: 1765785913 Dec 15 03:05:13 localhost puppet-user[56591]: Filebucket: 0.00 Dec 15 03:05:13 localhost puppet-user[56591]: Total: 0.04 Dec 15 03:05:13 localhost puppet-user[56591]: Version: Dec 15 03:05:13 localhost puppet-user[56591]: Config: 1765785913 Dec 15 03:05:13 localhost puppet-user[56591]: Puppet: 7.10.0 Dec 15 03:05:13 localhost ansible-async_wrapper.py[56573]: Module complete (56573) Dec 15 03:05:14 localhost ceph-osd[32311]: log_channel(cluster) log [DBG] : 4.1d scrub starts Dec 15 03:05:14 localhost ceph-osd[32311]: osd.3 pg_epoch: 55 pg[7.7( v 35'39 (0'0,35'39] local-lis/les=48/49 n=1 ec=42/33 lis/c=48/48 
les/c/f=49/49/0 sis=55 pruub=13.906693459s) [2,0,1] r=-1 lpr=55 pi=[48,55)/1 luod=0'0 crt=35'39 mlcod 0'0 active pruub 1218.262084961s@ mbc={}] start_peering_interval up [1,2,3] -> [2,0,1], acting [1,2,3] -> [2,0,1], acting_primary 1 -> 2, up_primary 1 -> 2, role 2 -> -1, features acting 4540138322906710015 upacting 4540138322906710015 Dec 15 03:05:14 localhost ceph-osd[32311]: osd.3 pg_epoch: 55 pg[7.7( v 35'39 (0'0,35'39] local-lis/les=48/49 n=1 ec=42/33 lis/c=48/48 les/c/f=49/49/0 sis=55 pruub=13.906577110s) [2,0,1] r=-1 lpr=55 pi=[48,55)/1 crt=35'39 mlcod 0'0 unknown NOTIFY pruub 1218.262084961s@ mbc={}] state: transitioning to Stray Dec 15 03:05:14 localhost ceph-osd[32311]: osd.3 pg_epoch: 55 pg[7.f( v 35'39 (0'0,35'39] local-lis/les=48/49 n=1 ec=42/33 lis/c=48/48 les/c/f=49/49/0 sis=55 pruub=13.906200409s) [2,0,1] r=-1 lpr=55 pi=[48,55)/1 luod=0'0 crt=35'39 mlcod 0'0 active pruub 1218.261840820s@ mbc={}] start_peering_interval up [1,2,3] -> [2,0,1], acting [1,2,3] -> [2,0,1], acting_primary 1 -> 2, up_primary 1 -> 2, role 2 -> -1, features acting 4540138322906710015 upacting 4540138322906710015 Dec 15 03:05:14 localhost ceph-osd[32311]: osd.3 pg_epoch: 55 pg[7.f( v 35'39 (0'0,35'39] local-lis/les=48/49 n=1 ec=42/33 lis/c=48/48 les/c/f=49/49/0 sis=55 pruub=13.906121254s) [2,0,1] r=-1 lpr=55 pi=[48,55)/1 crt=35'39 mlcod 0'0 unknown NOTIFY pruub 1218.261840820s@ mbc={}] state: transitioning to Stray Dec 15 03:05:14 localhost ceph-osd[32311]: log_channel(cluster) log [DBG] : 4.1d scrub ok Dec 15 03:05:14 localhost ansible-async_wrapper.py[56572]: Done in kid B. 
Dec 15 03:05:16 localhost ceph-osd[31375]: osd.0 pg_epoch: 55 pg[7.7( empty local-lis/les=0/0 n=0 ec=42/33 lis/c=48/48 les/c/f=49/49/0 sis=55) [2,0,1] r=1 lpr=55 pi=[48,55)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Stray
Dec 15 03:05:16 localhost ceph-osd[31375]: osd.0 pg_epoch: 55 pg[7.f( empty local-lis/les=0/0 n=0 ec=42/33 lis/c=48/48 les/c/f=49/49/0 sis=55) [2,0,1] r=1 lpr=55 pi=[48,55)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Stray
Dec 15 03:05:16 localhost ceph-osd[31375]: osd.0 pg_epoch: 57 pg[7.8( v 35'39 (0'0,35'39] local-lis/les=42/43 n=1 ec=42/33 lis/c=42/42 les/c/f=43/43/0 sis=57 pruub=12.599314690s) [1,2,3] r=-1 lpr=57 pi=[42,57)/1 luod=0'0 crt=35'39 lcod 0'0 mlcod 0'0 active pruub 1223.374877930s@ mbc={}] start_peering_interval up [5,0,4] -> [1,2,3], acting [5,0,4] -> [1,2,3], acting_primary 5 -> 1, up_primary 5 -> 1, role 1 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Dec 15 03:05:16 localhost ceph-osd[31375]: osd.0 pg_epoch: 57 pg[7.8( v 35'39 (0'0,35'39] local-lis/les=42/43 n=1 ec=42/33 lis/c=42/42 les/c/f=43/43/0 sis=57 pruub=12.598666191s) [1,2,3] r=-1 lpr=57 pi=[42,57)/1 crt=35'39 lcod 0'0 mlcod 0'0 unknown NOTIFY pruub 1223.374877930s@ mbc={}] state: transitioning to Stray
Dec 15 03:05:17 localhost ceph-osd[32311]: osd.3 pg_epoch: 57 pg[7.8( empty local-lis/les=0/0 n=0 ec=42/33 lis/c=42/42 les/c/f=43/43/0 sis=57) [1,2,3] r=2 lpr=57 pi=[42,57)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Stray
Dec 15 03:05:18 localhost ceph-osd[31375]: osd.0 pg_epoch: 59 pg[7.9( v 35'39 (0'0,35'39] local-lis/les=44/45 n=1 ec=42/33 lis/c=44/44 les/c/f=45/45/0 sis=59 pruub=8.065738678s) [4,0,2] r=1 lpr=59 pi=[44,59)/1 luod=0'0 crt=35'39 lcod 0'0 mlcod 0'0 active pruub 1221.142089844s@ mbc={}] start_peering_interval up [2,0,4] -> [4,0,2], acting [2,0,4] -> [4,0,2], acting_primary 2 -> 4, up_primary 2 -> 4, role 1 -> 1, features acting 4540138322906710015 upacting 4540138322906710015
Dec 15 03:05:18 localhost ceph-osd[31375]: osd.0 pg_epoch: 59 pg[7.9( v 35'39 (0'0,35'39] local-lis/les=44/45 n=1 ec=42/33 lis/c=44/44 les/c/f=45/45/0 sis=59 pruub=8.065673828s) [4,0,2] r=1 lpr=59 pi=[44,59)/1 crt=35'39 lcod 0'0 mlcod 0'0 unknown NOTIFY pruub 1221.142089844s@ mbc={}] state: transitioning to Stray
Dec 15 03:05:20 localhost ceph-osd[32311]: log_channel(cluster) log [DBG] : 6.1e scrub starts
Dec 15 03:05:20 localhost python3[56846]: ansible-ansible.legacy.async_status Invoked with jid=178678115042.56569 mode=status _async_dir=/tmp/.ansible_async
Dec 15 03:05:21 localhost systemd[1]: Started /usr/bin/podman healthcheck run 6278d648aec9c9f12e0efaafa0d6ae5653eeb34459516699e562eeb9c8d23b3c.
Dec 15 03:05:21 localhost systemd[1]: tmp-crun.MEgBms.mount: Deactivated successfully.
Dec 15 03:05:21 localhost podman[56863]: 2025-12-15 08:05:21.177793902 +0000 UTC m=+0.095528191 container health_status 6278d648aec9c9f12e0efaafa0d6ae5653eeb34459516699e562eeb9c8d23b3c (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, name=rhosp17/openstack-qdrouterd, summary=Red Hat OpenStack Platform 17.1 qdrouterd, description=Red Hat OpenStack Platform 17.1 qdrouterd, io.buildah.version=1.41.4, io.openshift.expose-services=, managed_by=tripleo_ansible, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_id=tripleo_step1, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, konflux.additional-tags=17.1.12 17.1_20251118.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '29d07da103bbd44a9ed3e29999314b03'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, distribution-scope=public, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1761123044, vcs-type=git, build-date=2025-11-18T22:49:46Z, batch=17.1_20251118.1, version=17.1.12, container_name=metrics_qdr, com.redhat.component=openstack-qdrouterd-container, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., architecture=x86_64, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd)
Dec 15 03:05:21 localhost python3[56862]: ansible-file Invoked with path=/var/lib/container-puppet/puppetlabs state=directory setype=svirt_sandbox_file_t selevel=s0 recurse=True force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None attributes=None
Dec 15 03:05:21 localhost sshd[56891]: main: sshd: ssh-rsa algorithm is disabled
Dec 15 03:05:21 localhost podman[56863]: 2025-12-15 08:05:21.360815616 +0000 UTC m=+0.278549905 container exec_died 6278d648aec9c9f12e0efaafa0d6ae5653eeb34459516699e562eeb9c8d23b3c (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, url=https://www.redhat.com, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.component=openstack-qdrouterd-container, name=rhosp17/openstack-qdrouterd, description=Red Hat OpenStack Platform 17.1 qdrouterd, release=1761123044, batch=17.1_20251118.1, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, container_name=metrics_qdr, architecture=x86_64, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step1, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, vendor=Red Hat, Inc., vcs-type=git, build-date=2025-11-18T22:49:46Z, summary=Red Hat OpenStack Platform 17.1 qdrouterd, managed_by=tripleo_ansible, io.openshift.expose-services=, version=17.1.12, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '29d07da103bbd44a9ed3e29999314b03'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, tcib_managed=true)
Dec 15 03:05:21 localhost systemd[1]: 6278d648aec9c9f12e0efaafa0d6ae5653eeb34459516699e562eeb9c8d23b3c.service: Deactivated successfully.
Dec 15 03:05:21 localhost python3[56909]: ansible-stat Invoked with path=/var/lib/container-puppet/puppetlabs/facter.conf follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Dec 15 03:05:22 localhost ceph-osd[31375]: log_channel(cluster) log [DBG] : 3.1a scrub starts
Dec 15 03:05:22 localhost ceph-osd[31375]: log_channel(cluster) log [DBG] : 3.1a scrub ok
Dec 15 03:05:22 localhost python3[56960]: ansible-ansible.legacy.stat Invoked with path=/var/lib/container-puppet/puppetlabs/facter.conf follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Dec 15 03:05:22 localhost python3[56978]: ansible-ansible.legacy.file Invoked with setype=svirt_sandbox_file_t selevel=s0 dest=/var/lib/container-puppet/puppetlabs/facter.conf _original_basename=tmpnc1fu3yz recurse=False state=file path=/var/lib/container-puppet/puppetlabs/facter.conf force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None attributes=None
Dec 15 03:05:22 localhost python3[57008]: ansible-file Invoked with path=/opt/puppetlabs/facter state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 15 03:05:23 localhost ceph-osd[31375]: log_channel(cluster) log [DBG] : 2.15 scrub starts
Dec 15 03:05:23 localhost ceph-osd[31375]: log_channel(cluster) log [DBG] : 2.15 scrub ok
Dec 15 03:05:24 localhost python3[57112]: ansible-ansible.posix.synchronize Invoked with src=/opt/puppetlabs/ dest=/var/lib/container-puppet/puppetlabs/ _local_rsync_path=rsync _local_rsync_password=NOT_LOGGING_PARAMETER rsync_path=None delete=False _substitute_controller=False archive=True checksum=False compress=True existing_only=False dirs=False copy_links=False set_remote_user=True rsync_timeout=0 rsync_opts=[] ssh_connection_multiplexing=False partial=False verify_host=False mode=push dest_port=None private_key=None recursive=None links=None perms=None times=None owner=None group=None ssh_args=None link_dest=None
Dec 15 03:05:24 localhost ceph-osd[31375]: log_channel(cluster) log [DBG] : 2.13 scrub starts
Dec 15 03:05:24 localhost ceph-osd[31375]: log_channel(cluster) log [DBG] : 2.13 scrub ok
Dec 15 03:05:24 localhost ceph-osd[32311]: log_channel(cluster) log [DBG] : 4.1f deep-scrub starts
Dec 15 03:05:24 localhost ceph-osd[32311]: log_channel(cluster) log [DBG] : 4.1f deep-scrub ok
Dec 15 03:05:24 localhost python3[57131]: ansible-file Invoked with path=/var/log/containers/stdouts state=directory owner=root group=root recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 15 03:05:25 localhost python3[57163]: ansible-stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Dec 15 03:05:25 localhost ceph-osd[32311]: osd.3 pg_epoch: 61 pg[7.a( v 35'39 (0'0,35'39] local-lis/les=46/47 n=1 ec=42/33 lis/c=46/46 les/c/f=47/47/0 sis=61 pruub=15.481483459s) [3,4,5] r=0 lpr=61 pi=[46,61)/1 luod=0'0 crt=35'39 lcod 0'0 mlcod 0'0 active pruub 1231.199096680s@ mbc={}] start_peering_interval up [4,3,5] -> [3,4,5], acting [4,3,5] -> [3,4,5], acting_primary 4 -> 3, up_primary 4 -> 3, role 1 -> 0, features acting 4540138322906710015 upacting 4540138322906710015
Dec 15 03:05:25 localhost ceph-osd[32311]: osd.3 pg_epoch: 61 pg[7.a( v 35'39 (0'0,35'39] local-lis/les=46/47 n=1 ec=42/33 lis/c=46/46 les/c/f=47/47/0 sis=61 pruub=15.481483459s) [3,4,5] r=0 lpr=61 pi=[46,61)/1 crt=35'39 lcod 0'0 mlcod 0'0 unknown pruub 1231.199096680s@ mbc={}] state: transitioning to Primary
Dec 15 03:05:26 localhost ceph-osd[31375]: log_channel(cluster) log [DBG] : 2.a deep-scrub starts
Dec 15 03:05:26 localhost python3[57213]: ansible-ansible.legacy.stat Invoked with path=/usr/libexec/tripleo-container-shutdown follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Dec 15 03:05:26 localhost ceph-osd[31375]: log_channel(cluster) log [DBG] : 2.a deep-scrub ok
Dec 15 03:05:26 localhost ceph-osd[32311]: log_channel(cluster) log [DBG] : 6.17 scrub starts
Dec 15 03:05:26 localhost python3[57231]: ansible-ansible.legacy.file Invoked with mode=0700 owner=root group=root dest=/usr/libexec/tripleo-container-shutdown _original_basename=tripleo-container-shutdown recurse=False state=file path=/usr/libexec/tripleo-container-shutdown force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 15 03:05:26 localhost ceph-osd[32311]: log_channel(cluster) log [DBG] : 6.17 scrub ok
Dec 15 03:05:26 localhost ceph-osd[32311]: osd.3 pg_epoch: 62 pg[7.a( v 35'39 (0'0,35'39] local-lis/les=61/62 n=1 ec=42/33 lis/c=46/46 les/c/f=47/47/0 sis=61) [3,4,5] r=0 lpr=61 pi=[46,61)/1 crt=35'39 lcod 0'0 mlcod 0'0 active mbc={}] state: react AllReplicasActivated Activating complete
Dec 15 03:05:26 localhost python3[57293]: ansible-ansible.legacy.stat Invoked with path=/usr/libexec/tripleo-start-podman-container follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Dec 15 03:05:27 localhost python3[57311]: ansible-ansible.legacy.file Invoked with mode=0700 owner=root group=root dest=/usr/libexec/tripleo-start-podman-container _original_basename=tripleo-start-podman-container recurse=False state=file path=/usr/libexec/tripleo-start-podman-container force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 15 03:05:27 localhost python3[57373]: ansible-ansible.legacy.stat Invoked with path=/usr/lib/systemd/system/tripleo-container-shutdown.service follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Dec 15 03:05:27 localhost ceph-osd[32311]: osd.3 pg_epoch: 63 pg[7.b( v 35'39 (0'0,35'39] local-lis/les=48/49 n=1 ec=42/33 lis/c=48/48 les/c/f=49/49/0 sis=63 pruub=8.496160507s) [1,2,0] r=-1 lpr=63 pi=[48,63)/1 luod=0'0 crt=35'39 mlcod 0'0 active pruub 1226.257934570s@ mbc={}] start_peering_interval up [1,2,3] -> [1,2,0], acting [1,2,3] -> [1,2,0], acting_primary 1 -> 1, up_primary 1 -> 1, role 2 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Dec 15 03:05:27 localhost ceph-osd[32311]: osd.3 pg_epoch: 63 pg[7.b( v 35'39 (0'0,35'39] local-lis/les=48/49 n=1 ec=42/33 lis/c=48/48 les/c/f=49/49/0 sis=63 pruub=8.496042252s) [1,2,0] r=-1 lpr=63 pi=[48,63)/1 crt=35'39 mlcod 0'0 unknown NOTIFY pruub 1226.257934570s@ mbc={}] state: transitioning to Stray
Dec 15 03:05:27 localhost python3[57391]: ansible-ansible.legacy.file Invoked with mode=0644 owner=root group=root dest=/usr/lib/systemd/system/tripleo-container-shutdown.service _original_basename=tripleo-container-shutdown-service recurse=False state=file path=/usr/lib/systemd/system/tripleo-container-shutdown.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 15 03:05:28 localhost python3[57453]: ansible-ansible.legacy.stat Invoked with path=/usr/lib/systemd/system-preset/91-tripleo-container-shutdown.preset follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Dec 15 03:05:28 localhost python3[57471]: ansible-ansible.legacy.file Invoked with mode=0644 owner=root group=root dest=/usr/lib/systemd/system-preset/91-tripleo-container-shutdown.preset _original_basename=91-tripleo-container-shutdown-preset recurse=False state=file path=/usr/lib/systemd/system-preset/91-tripleo-container-shutdown.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 15 03:05:29 localhost ceph-osd[31375]: osd.0 pg_epoch: 63 pg[7.b( empty local-lis/les=0/0 n=0 ec=42/33 lis/c=48/48 les/c/f=49/49/0 sis=63) [1,2,0] r=2 lpr=63 pi=[48,63)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Stray
Dec 15 03:05:29 localhost python3[57501]: ansible-systemd Invoked with name=tripleo-container-shutdown state=started enabled=True daemon_reload=True daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 15 03:05:29 localhost systemd[1]: Reloading.
Dec 15 03:05:29 localhost systemd-rc-local-generator[57522]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 15 03:05:29 localhost systemd-sysv-generator[57527]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 15 03:05:29 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Dec 15 03:05:30 localhost python3[57587]: ansible-ansible.legacy.stat Invoked with path=/usr/lib/systemd/system/netns-placeholder.service follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Dec 15 03:05:30 localhost python3[57605]: ansible-ansible.legacy.file Invoked with mode=0644 owner=root group=root dest=/usr/lib/systemd/system/netns-placeholder.service _original_basename=netns-placeholder-service recurse=False state=file path=/usr/lib/systemd/system/netns-placeholder.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 15 03:05:30 localhost python3[57667]: ansible-ansible.legacy.stat Invoked with path=/usr/lib/systemd/system-preset/91-netns-placeholder.preset follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Dec 15 03:05:31 localhost python3[57685]: ansible-ansible.legacy.file Invoked with mode=0644 owner=root group=root dest=/usr/lib/systemd/system-preset/91-netns-placeholder.preset _original_basename=91-netns-placeholder-preset recurse=False state=file path=/usr/lib/systemd/system-preset/91-netns-placeholder.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 15 03:05:31 localhost ceph-osd[32311]: log_channel(cluster) log [DBG] : 4.14 scrub starts
Dec 15 03:05:31 localhost ceph-osd[32311]: log_channel(cluster) log [DBG] : 4.14 scrub ok
Dec 15 03:05:31 localhost python3[57715]: ansible-systemd Invoked with name=netns-placeholder state=started enabled=True daemon_reload=True daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 15 03:05:31 localhost systemd[1]: Reloading.
Dec 15 03:05:31 localhost systemd-sysv-generator[57743]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 15 03:05:31 localhost systemd-rc-local-generator[57739]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 15 03:05:31 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Dec 15 03:05:32 localhost systemd[1]: Starting Create netns directory...
Dec 15 03:05:32 localhost systemd[1]: run-netns-placeholder.mount: Deactivated successfully.
Dec 15 03:05:32 localhost systemd[1]: netns-placeholder.service: Deactivated successfully.
Dec 15 03:05:32 localhost systemd[1]: Finished Create netns directory.
Dec 15 03:05:32 localhost ceph-osd[32311]: log_channel(cluster) log [DBG] : 4.1 scrub starts
Dec 15 03:05:32 localhost ceph-osd[32311]: log_channel(cluster) log [DBG] : 4.1 scrub ok
Dec 15 03:05:32 localhost python3[57772]: ansible-container_puppet_config Invoked with update_config_hash_only=True no_archive=True check_mode=False config_vol_prefix=/var/lib/config-data debug=False net_host=True puppet_config= short_hostname= step=6
Dec 15 03:05:32 localhost ceph-osd[31375]: log_channel(cluster) log [DBG] : 2.f scrub starts
Dec 15 03:05:33 localhost ceph-osd[31375]: log_channel(cluster) log [DBG] : 2.f scrub ok
Dec 15 03:05:34 localhost python3[57829]: ansible-tripleo_container_manage Invoked with config_id=tripleo_step2 config_dir=/var/lib/tripleo-config/container-startup-config/step_2 config_patterns=*.json config_overrides={} concurrency=5 log_base_path=/var/log/containers/stdouts debug=False
Dec 15 03:05:34 localhost ceph-osd[32311]: log_channel(cluster) log [DBG] : 4.8 scrub starts
Dec 15 03:05:34 localhost ceph-osd[32311]: log_channel(cluster) log [DBG] : 4.8 scrub ok
Dec 15 03:05:34 localhost podman[57897]: 2025-12-15 08:05:34.575477897 +0000 UTC m=+0.101084840 container create 82e84ac99155092a2f6ba7ce86bb20e20ca2974ae9821a85a7f91ea04fff00bb (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute_init_log, config_id=tripleo_step2, summary=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, config_data={'command': ['/bin/bash', '-c', 'chown -R nova:nova /var/log/nova'], 'environment': {'TRIPLEO_DEPLOY_IDENTIFIER': '1765784752'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'none', 'privileged': False, 'user': 'root', 'volumes': ['/var/log/containers/nova:/var/log/nova:z']}, vendor=Red Hat, Inc., container_name=nova_compute_init_log, name=rhosp17/openstack-nova-compute, distribution-scope=public, tcib_managed=true, batch=17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, version=17.1.12, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, release=1761123044, io.buildah.version=1.41.4, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, url=https://www.redhat.com, com.redhat.component=openstack-nova-compute-container, build-date=2025-11-19T00:36:58Z, description=Red Hat OpenStack Platform 17.1 nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-type=git)
Dec 15 03:05:34 localhost podman[57915]: 2025-12-15 08:05:34.600507347 +0000 UTC m=+0.087104125 container create 491a19262c16ca4c262b2c9692375e3b36e2ba6534b49d3827928d6534415b13 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtqemud_init_logs, config_id=tripleo_step2, name=rhosp17/openstack-nova-libvirt, release=1761123044, config_data={'command': ['/bin/bash', '-c', 'chown -R tss:tss /var/log/swtpm'], 'environment': {'TRIPLEO_DEPLOY_IDENTIFIER': '1765784752'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'none', 'privileged': True, 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'user': 'root', 'volumes': ['/var/log/containers/libvirt/swtpm:/var/log/swtpm:shared,z']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, com.redhat.component=openstack-nova-libvirt-container, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, description=Red Hat OpenStack Platform 17.1 nova-libvirt, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, managed_by=tripleo_ansible, io.buildah.version=1.41.4, batch=17.1_20251118.1, tcib_managed=true, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-11-19T00:35:22Z, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.expose-services=, architecture=x86_64, distribution-scope=public, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, vcs-type=git, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, container_name=nova_virtqemud_init_logs, version=17.1.12)
Dec 15 03:05:34 localhost podman[57897]: 2025-12-15 08:05:34.521747766 +0000 UTC m=+0.047354769 image pull registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1
Dec 15 03:05:34 localhost systemd[1]: Started libpod-conmon-82e84ac99155092a2f6ba7ce86bb20e20ca2974ae9821a85a7f91ea04fff00bb.scope.
Dec 15 03:05:34 localhost systemd[1]: Started libpod-conmon-491a19262c16ca4c262b2c9692375e3b36e2ba6534b49d3827928d6534415b13.scope.
Dec 15 03:05:34 localhost systemd[1]: Started libcrun container.
Dec 15 03:05:34 localhost podman[57915]: 2025-12-15 08:05:34.545493193 +0000 UTC m=+0.032090041 image pull registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1
Dec 15 03:05:34 localhost systemd[1]: Started libcrun container.
Dec 15 03:05:34 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/c0eca1cbaf0df6dd2fb69cf60e02f6f367b7d2e448ccc3625823f47bdf01b658/merged/var/log/swtpm supports timestamps until 2038 (0x7fffffff)
Dec 15 03:05:34 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/b16e8358d4fe22ce856ff2b83b0d0fefb871e4963cf69a561b1e5fd5bd34452f/merged/var/log/nova supports timestamps until 2038 (0x7fffffff)
Dec 15 03:05:34 localhost podman[57915]: 2025-12-15 08:05:34.659613591 +0000 UTC m=+0.146210359 container init 491a19262c16ca4c262b2c9692375e3b36e2ba6534b49d3827928d6534415b13 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtqemud_init_logs, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.12, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_data={'command': ['/bin/bash', '-c', 'chown -R tss:tss /var/log/swtpm'], 'environment': {'TRIPLEO_DEPLOY_IDENTIFIER': '1765784752'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'none', 'privileged': True, 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'user': 'root', 'volumes': ['/var/log/containers/libvirt/swtpm:/var/log/swtpm:shared,z']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, distribution-scope=public, architecture=x86_64, release=1761123044, io.buildah.version=1.41.4, vcs-type=git, description=Red Hat OpenStack Platform 17.1 nova-libvirt, com.redhat.component=openstack-nova-libvirt-container, name=rhosp17/openstack-nova-libvirt, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, container_name=nova_virtqemud_init_logs, build-date=2025-11-19T00:35:22Z, tcib_managed=true, batch=17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, config_id=tripleo_step2, managed_by=tripleo_ansible, url=https://www.redhat.com)
Dec 15 03:05:34 localhost podman[57915]: 2025-12-15 08:05:34.666840104 +0000 UTC m=+0.153436882 container start 491a19262c16ca4c262b2c9692375e3b36e2ba6534b49d3827928d6534415b13 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtqemud_init_logs, tcib_managed=true, io.buildah.version=1.41.4, url=https://www.redhat.com, com.redhat.component=openstack-nova-libvirt-container, architecture=x86_64, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20251118.1, container_name=nova_virtqemud_init_logs, managed_by=tripleo_ansible, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, version=17.1.12, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, konflux.additional-tags=17.1.12 17.1_20251118.1, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, config_data={'command': ['/bin/bash', '-c', 'chown -R tss:tss /var/log/swtpm'], 'environment': {'TRIPLEO_DEPLOY_IDENTIFIER': '1765784752'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'none', 'privileged': True, 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'user': 'root', 'volumes': ['/var/log/containers/libvirt/swtpm:/var/log/swtpm:shared,z']}, release=1761123044, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-type=git, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 nova-libvirt, name=rhosp17/openstack-nova-libvirt, distribution-scope=public, build-date=2025-11-19T00:35:22Z, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, config_id=tripleo_step2)
Dec 15 03:05:34 localhost python3[57829]: ansible-tripleo_container_manage PODMAN-CONTAINER-DEBUG: podman run --name nova_virtqemud_init_logs --conmon-pidfile /run/nova_virtqemud_init_logs.pid --detach=True --env TRIPLEO_DEPLOY_IDENTIFIER=1765784752 --label config_id=tripleo_step2 --label container_name=nova_virtqemud_init_logs --label managed_by=tripleo_ansible --label config_data={'command': ['/bin/bash', '-c', 'chown -R tss:tss /var/log/swtpm'], 'environment': {'TRIPLEO_DEPLOY_IDENTIFIER': '1765784752'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'none', 'privileged': True, 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'user': 'root', 'volumes': ['/var/log/containers/libvirt/swtpm:/var/log/swtpm:shared,z']} --log-driver k8s-file --log-opt path=/var/log/containers/stdouts/nova_virtqemud_init_logs.log --network none --privileged=True --security-opt label=level:s0 --security-opt label=type:spc_t --security-opt label=filetype:container_file_t --user root --volume /var/log/containers/libvirt/swtpm:/var/log/swtpm:shared,z registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1 /bin/bash -c chown -R tss:tss /var/log/swtpm
Dec 15 03:05:34 localhost systemd[1]: libpod-491a19262c16ca4c262b2c9692375e3b36e2ba6534b49d3827928d6534415b13.scope: Deactivated successfully.
Dec 15 03:05:34 localhost podman[57897]: 2025-12-15 08:05:34.710930796 +0000 UTC m=+0.236537809 container init 82e84ac99155092a2f6ba7ce86bb20e20ca2974ae9821a85a7f91ea04fff00bb (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute_init_log, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step2, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp17/openstack-nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.41.4, tcib_managed=true, managed_by=tripleo_ansible, batch=17.1_20251118.1, container_name=nova_compute_init_log, architecture=x86_64, url=https://www.redhat.com, distribution-scope=public, version=17.1.12, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.openshift.expose-services=, com.redhat.component=openstack-nova-compute-container, config_data={'command': ['/bin/bash', '-c', 'chown -R nova:nova /var/log/nova'], 'environment': {'TRIPLEO_DEPLOY_IDENTIFIER': '1765784752'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'none', 'privileged': False, 'user': 'root', 'volumes': ['/var/log/containers/nova:/var/log/nova:z']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, vendor=Red Hat, Inc., build-date=2025-11-19T00:36:58Z, release=1761123044, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, vcs-type=git, maintainer=OpenStack TripleO Team)
Dec 15 03:05:34 localhost podman[57943]: 2025-12-15 08:05:34.72228248 +0000 UTC m=+0.040631279 container died 491a19262c16ca4c262b2c9692375e3b36e2ba6534b49d3827928d6534415b13 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtqemud_init_logs, description=Red Hat OpenStack Platform 17.1 nova-libvirt, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, release=1761123044, container_name=nova_virtqemud_init_logs, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, konflux.additional-tags=17.1.12 17.1_20251118.1, config_id=tripleo_step2, batch=17.1_20251118.1, config_data={'command': ['/bin/bash', '-c', 'chown -R tss:tss /var/log/swtpm'], 'environment': {'TRIPLEO_DEPLOY_IDENTIFIER': '1765784752'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'none', 'privileged': True, 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'user': 'root', 'volumes': ['/var/log/containers/libvirt/swtpm:/var/log/swtpm:shared,z']}, distribution-scope=public, com.redhat.component=openstack-nova-libvirt-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, url=https://www.redhat.com, build-date=2025-11-19T00:35:22Z, io.openshift.expose-services=, vendor=Red Hat, Inc., vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, name=rhosp17/openstack-nova-libvirt, maintainer=OpenStack TripleO Team, version=17.1.12, architecture=x86_64, io.buildah.version=1.41.4)
Dec 15 03:05:34 localhost systemd[1]: libpod-82e84ac99155092a2f6ba7ce86bb20e20ca2974ae9821a85a7f91ea04fff00bb.scope: Deactivated successfully.
Dec 15 03:05:34 localhost podman[57943]: 2025-12-15 08:05:34.75171932 +0000 UTC m=+0.070068119 container cleanup 491a19262c16ca4c262b2c9692375e3b36e2ba6534b49d3827928d6534415b13 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtqemud_init_logs, io.openshift.expose-services=, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, distribution-scope=public, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 nova-libvirt, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, konflux.additional-tags=17.1.12 17.1_20251118.1, batch=17.1_20251118.1, com.redhat.component=openstack-nova-libvirt-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, name=rhosp17/openstack-nova-libvirt, managed_by=tripleo_ansible, url=https://www.redhat.com, container_name=nova_virtqemud_init_logs, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, release=1761123044, build-date=2025-11-19T00:35:22Z, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, version=17.1.12, architecture=x86_64, config_data={'command': ['/bin/bash', '-c', 'chown -R tss:tss /var/log/swtpm'], 'environment': {'TRIPLEO_DEPLOY_IDENTIFIER': '1765784752'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'none', 'privileged': True, 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'user': 'root', 'volumes': ['/var/log/containers/libvirt/swtpm:/var/log/swtpm:shared,z']}, vcs-type=git, tcib_managed=true, io.buildah.version=1.41.4, config_id=tripleo_step2, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d) Dec 15 03:05:34 localhost systemd[1]: 
libpod-conmon-491a19262c16ca4c262b2c9692375e3b36e2ba6534b49d3827928d6534415b13.scope: Deactivated successfully. Dec 15 03:05:34 localhost podman[57897]: 2025-12-15 08:05:34.820168314 +0000 UTC m=+0.345775287 container start 82e84ac99155092a2f6ba7ce86bb20e20ca2974ae9821a85a7f91ea04fff00bb (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute_init_log, config_data={'command': ['/bin/bash', '-c', 'chown -R nova:nova /var/log/nova'], 'environment': {'TRIPLEO_DEPLOY_IDENTIFIER': '1765784752'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'none', 'privileged': False, 'user': 'root', 'volumes': ['/var/log/containers/nova:/var/log/nova:z']}, io.buildah.version=1.41.4, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, com.redhat.component=openstack-nova-compute-container, vendor=Red Hat, Inc., release=1761123044, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, managed_by=tripleo_ansible, batch=17.1_20251118.1, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, konflux.additional-tags=17.1.12 17.1_20251118.1, build-date=2025-11-19T00:36:58Z, summary=Red Hat OpenStack Platform 17.1 nova-compute, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, container_name=nova_compute_init_log, config_id=tripleo_step2, version=17.1.12, tcib_managed=true, name=rhosp17/openstack-nova-compute, distribution-scope=public, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream) Dec 15 03:05:34 localhost podman[57969]: 
2025-12-15 08:05:34.822413154 +0000 UTC m=+0.084881705 container died 82e84ac99155092a2f6ba7ce86bb20e20ca2974ae9821a85a7f91ea04fff00bb (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute_init_log, tcib_managed=true, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, version=17.1.12, description=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.12 17.1_20251118.1, name=rhosp17/openstack-nova-compute, architecture=x86_64, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, distribution-scope=public, vendor=Red Hat, Inc., container_name=nova_compute_init_log, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, build-date=2025-11-19T00:36:58Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20251118.1, managed_by=tripleo_ansible, com.redhat.component=openstack-nova-compute-container, release=1761123044, config_id=tripleo_step2, io.buildah.version=1.41.4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, config_data={'command': ['/bin/bash', '-c', 'chown -R nova:nova /var/log/nova'], 'environment': {'TRIPLEO_DEPLOY_IDENTIFIER': '1765784752'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'none', 'privileged': False, 'user': 'root', 'volumes': ['/var/log/containers/nova:/var/log/nova:z']}, vcs-type=git) Dec 15 03:05:34 localhost python3[57829]: ansible-tripleo_container_manage PODMAN-CONTAINER-DEBUG: podman run --name nova_compute_init_log --conmon-pidfile /run/nova_compute_init_log.pid 
--detach=True --env TRIPLEO_DEPLOY_IDENTIFIER=1765784752 --label config_id=tripleo_step2 --label container_name=nova_compute_init_log --label managed_by=tripleo_ansible --label config_data={'command': ['/bin/bash', '-c', 'chown -R nova:nova /var/log/nova'], 'environment': {'TRIPLEO_DEPLOY_IDENTIFIER': '1765784752'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'none', 'privileged': False, 'user': 'root', 'volumes': ['/var/log/containers/nova:/var/log/nova:z']} --log-driver k8s-file --log-opt path=/var/log/containers/stdouts/nova_compute_init_log.log --network none --privileged=False --user root --volume /var/log/containers/nova:/var/log/nova:z registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1 /bin/bash -c chown -R nova:nova /var/log/nova Dec 15 03:05:34 localhost podman[57969]: 2025-12-15 08:05:34.84426379 +0000 UTC m=+0.106732301 container cleanup 82e84ac99155092a2f6ba7ce86bb20e20ca2974ae9821a85a7f91ea04fff00bb (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute_init_log, com.redhat.component=openstack-nova-compute-container, vcs-type=git, io.buildah.version=1.41.4, release=1761123044, description=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, name=rhosp17/openstack-nova-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, tcib_managed=true, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.openshift.expose-services=, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.12, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, 
build-date=2025-11-19T00:36:58Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_id=tripleo_step2, managed_by=tripleo_ansible, config_data={'command': ['/bin/bash', '-c', 'chown -R nova:nova /var/log/nova'], 'environment': {'TRIPLEO_DEPLOY_IDENTIFIER': '1765784752'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'none', 'privileged': False, 'user': 'root', 'volumes': ['/var/log/containers/nova:/var/log/nova:z']}, architecture=x86_64, distribution-scope=public, container_name=nova_compute_init_log, url=https://www.redhat.com) Dec 15 03:05:34 localhost systemd[1]: libpod-conmon-82e84ac99155092a2f6ba7ce86bb20e20ca2974ae9821a85a7f91ea04fff00bb.scope: Deactivated successfully. Dec 15 03:05:35 localhost ceph-osd[31375]: log_channel(cluster) log [DBG] : 3.15 scrub starts Dec 15 03:05:35 localhost ceph-osd[31375]: log_channel(cluster) log [DBG] : 3.15 scrub ok Dec 15 03:05:35 localhost podman[58092]: 2025-12-15 08:05:35.263249548 +0000 UTC m=+0.082102841 container create 2bab57e62a1f2efe7f7cd0dadb6e5910c04532af23927d8fbd094c0587904967 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=create_haproxy_wrapper, config_id=tripleo_step2, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, tcib_managed=true, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, distribution-scope=public, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, url=https://www.redhat.com, version=17.1.12, batch=17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, container_name=create_haproxy_wrapper, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, release=1761123044, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red 
Hat, Inc., description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.buildah.version=1.41.4, name=rhosp17/openstack-neutron-metadata-agent-ovn, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.12 17.1_20251118.1, config_data={'command': ['/container_puppet_apply.sh', '4', 'file', 'include ::tripleo::profile::base::neutron::ovn_metadata_agent_wrappers'], 'detach': False, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'start_order': 1, 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/container-config-scripts/container_puppet_apply.sh:/container_puppet_apply.sh:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z']}, vcs-type=git, managed_by=tripleo_ansible, build-date=2025-11-19T00:14:25Z) Dec 15 03:05:35 localhost podman[58107]: 2025-12-15 08:05:35.292166063 +0000 UTC m=+0.083540899 container create a861cedb0dab51fafcbd9dcfc8359ca2e8b254ea7d9150233bd842397a343525 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=create_virtlogd_wrapper, io.k8s.description=Red Hat OpenStack Platform 17.1 
nova-libvirt, url=https://www.redhat.com, vcs-type=git, config_id=tripleo_step2, konflux.additional-tags=17.1.12 17.1_20251118.1, config_data={'cgroupns': 'host', 'command': ['/container_puppet_apply.sh', '4', 'file', 'include ::tripleo::profile::base::nova::virtlogd_wrapper'], 'detach': False, 'environment': {'TRIPLEO_DEPLOY_IDENTIFIER': '1765784752'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'start_order': 1, 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/container-config-scripts/container_puppet_apply.sh:/container_puppet_apply.sh:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/container-config-scripts:/var/lib/container-config-scripts:shared,z']}, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, name=rhosp17/openstack-nova-libvirt, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, architecture=x86_64, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, tcib_managed=true, release=1761123044, batch=17.1_20251118.1, distribution-scope=public, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 nova-libvirt, vendor=Red Hat, Inc., io.buildah.version=1.41.4, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, container_name=create_virtlogd_wrapper, 
com.redhat.component=openstack-nova-libvirt-container, managed_by=tripleo_ansible, io.openshift.expose-services=, version=17.1.12, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, build-date=2025-11-19T00:35:22Z) Dec 15 03:05:35 localhost systemd[1]: Started libpod-conmon-2bab57e62a1f2efe7f7cd0dadb6e5910c04532af23927d8fbd094c0587904967.scope. Dec 15 03:05:35 localhost podman[58092]: 2025-12-15 08:05:35.217145403 +0000 UTC m=+0.035998736 image pull registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1 Dec 15 03:05:35 localhost systemd[1]: Started libcrun container. Dec 15 03:05:35 localhost systemd[1]: Started libpod-conmon-a861cedb0dab51fafcbd9dcfc8359ca2e8b254ea7d9150233bd842397a343525.scope. Dec 15 03:05:35 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/a7a0fabd483d0287d5d447d66b01a81ddf0a08e391390a4f77973a583945daec/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff) Dec 15 03:05:35 localhost systemd[1]: Started libcrun container. 
Dec 15 03:05:35 localhost podman[58092]: 2025-12-15 08:05:35.33569662 +0000 UTC m=+0.154549923 container init 2bab57e62a1f2efe7f7cd0dadb6e5910c04532af23927d8fbd094c0587904967 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=create_haproxy_wrapper, config_id=tripleo_step2, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, url=https://www.redhat.com, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, name=rhosp17/openstack-neutron-metadata-agent-ovn, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, version=17.1.12, tcib_managed=true, vendor=Red Hat, Inc., konflux.additional-tags=17.1.12 17.1_20251118.1, config_data={'command': ['/container_puppet_apply.sh', '4', 'file', 'include ::tripleo::profile::base::neutron::ovn_metadata_agent_wrappers'], 'detach': False, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'start_order': 1, 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/container-config-scripts/container_puppet_apply.sh:/container_puppet_apply.sh:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z']}, managed_by=tripleo_ansible, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.expose-services=, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=create_haproxy_wrapper, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, build-date=2025-11-19T00:14:25Z, maintainer=OpenStack TripleO Team, batch=17.1_20251118.1, release=1761123044, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, architecture=x86_64, io.buildah.version=1.41.4, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, distribution-scope=public, vcs-type=git) Dec 15 03:05:35 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/e86956d3754c7245b49457025d0c769a2ba1dcff8718bcff3af6d874290efba1/merged/var/lib/container-config-scripts supports timestamps until 2038 (0x7fffffff) Dec 15 03:05:35 localhost podman[58092]: 2025-12-15 08:05:35.344789823 +0000 UTC m=+0.163643136 container start 2bab57e62a1f2efe7f7cd0dadb6e5910c04532af23927d8fbd094c0587904967 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=create_haproxy_wrapper, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, release=1761123044, config_data={'command': ['/container_puppet_apply.sh', '4', 'file', 'include ::tripleo::profile::base::neutron::ovn_metadata_agent_wrappers'], 'detach': False, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'start_order': 1, 'user': 'root', 'volumes': 
['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/container-config-scripts/container_puppet_apply.sh:/container_puppet_apply.sh:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z']}, io.buildah.version=1.41.4, konflux.additional-tags=17.1.12 17.1_20251118.1, version=17.1.12, managed_by=tripleo_ansible, tcib_managed=true, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, url=https://www.redhat.com, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, name=rhosp17/openstack-neutron-metadata-agent-ovn, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-type=git, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-11-19T00:14:25Z, distribution-scope=public, architecture=x86_64, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, batch=17.1_20251118.1, container_name=create_haproxy_wrapper, config_id=tripleo_step2) Dec 15 03:05:35 localhost podman[58092]: 2025-12-15 08:05:35.345117902 +0000 UTC m=+0.163971205 container attach 2bab57e62a1f2efe7f7cd0dadb6e5910c04532af23927d8fbd094c0587904967 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=create_haproxy_wrapper, 
build-date=2025-11-19T00:14:25Z, architecture=x86_64, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, maintainer=OpenStack TripleO Team, container_name=create_haproxy_wrapper, config_data={'command': ['/container_puppet_apply.sh', '4', 'file', 'include ::tripleo::profile::base::neutron::ovn_metadata_agent_wrappers'], 'detach': False, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'start_order': 1, 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/container-config-scripts/container_puppet_apply.sh:/container_puppet_apply.sh:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.buildah.version=1.41.4, version=17.1.12, name=rhosp17/openstack-neutron-metadata-agent-ovn, tcib_managed=true, 
org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, url=https://www.redhat.com, distribution-scope=public, vcs-type=git, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, batch=17.1_20251118.1, konflux.additional-tags=17.1.12 17.1_20251118.1, config_id=tripleo_step2, io.openshift.expose-services=, release=1761123044, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn) Dec 15 03:05:35 localhost podman[58107]: 2025-12-15 08:05:35.251577435 +0000 UTC m=+0.042952261 image pull registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1 Dec 15 03:05:35 localhost podman[58107]: 2025-12-15 08:05:35.351484453 +0000 UTC m=+0.142859279 container init a861cedb0dab51fafcbd9dcfc8359ca2e8b254ea7d9150233bd842397a343525 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=create_virtlogd_wrapper, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., config_data={'cgroupns': 'host', 'command': ['/container_puppet_apply.sh', '4', 'file', 'include ::tripleo::profile::base::nova::virtlogd_wrapper'], 'detach': False, 'environment': {'TRIPLEO_DEPLOY_IDENTIFIER': '1765784752'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'start_order': 1, 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/container-config-scripts/container_puppet_apply.sh:/container_puppet_apply.sh:ro', '/etc/puppet:/tmp/puppet-etc:ro', 
'/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/container-config-scripts:/var/lib/container-config-scripts:shared,z']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.expose-services=, tcib_managed=true, build-date=2025-11-19T00:35:22Z, distribution-scope=public, konflux.additional-tags=17.1.12 17.1_20251118.1, version=17.1.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, io.buildah.version=1.41.4, release=1761123044, config_id=tripleo_step2, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, url=https://www.redhat.com, batch=17.1_20251118.1, container_name=create_virtlogd_wrapper, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, com.redhat.component=openstack-nova-libvirt-container, description=Red Hat OpenStack Platform 17.1 nova-libvirt, vcs-type=git, name=rhosp17/openstack-nova-libvirt, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, architecture=x86_64, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d) Dec 15 03:05:35 localhost podman[58107]: 2025-12-15 08:05:35.36071982 +0000 UTC m=+0.152094656 container start a861cedb0dab51fafcbd9dcfc8359ca2e8b254ea7d9150233bd842397a343525 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=create_virtlogd_wrapper, konflux.additional-tags=17.1.12 17.1_20251118.1, vendor=Red Hat, Inc., io.openshift.expose-services=, com.redhat.component=openstack-nova-libvirt-container, build-date=2025-11-19T00:35:22Z, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, url=https://www.redhat.com, version=17.1.12, architecture=x86_64, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, name=rhosp17/openstack-nova-libvirt, 
maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 nova-libvirt, config_data={'cgroupns': 'host', 'command': ['/container_puppet_apply.sh', '4', 'file', 'include ::tripleo::profile::base::nova::virtlogd_wrapper'], 'detach': False, 'environment': {'TRIPLEO_DEPLOY_IDENTIFIER': '1765784752'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'start_order': 1, 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/container-config-scripts/container_puppet_apply.sh:/container_puppet_apply.sh:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/container-config-scripts:/var/lib/container-config-scripts:shared,z']}, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, io.buildah.version=1.41.4, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, container_name=create_virtlogd_wrapper, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, release=1761123044, config_id=tripleo_step2, batch=17.1_20251118.1, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, managed_by=tripleo_ansible) Dec 15 03:05:35 localhost podman[58107]: 2025-12-15 08:05:35.361023028 +0000 UTC m=+0.152397914 container attach 
a861cedb0dab51fafcbd9dcfc8359ca2e8b254ea7d9150233bd842397a343525 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=create_virtlogd_wrapper, com.redhat.component=openstack-nova-libvirt-container, vendor=Red Hat, Inc., io.buildah.version=1.41.4, config_id=tripleo_step2, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, architecture=x86_64, io.openshift.expose-services=, build-date=2025-11-19T00:35:22Z, name=rhosp17/openstack-nova-libvirt, container_name=create_virtlogd_wrapper, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_data={'cgroupns': 'host', 'command': ['/container_puppet_apply.sh', '4', 'file', 'include ::tripleo::profile::base::nova::virtlogd_wrapper'], 'detach': False, 'environment': {'TRIPLEO_DEPLOY_IDENTIFIER': '1765784752'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'start_order': 1, 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/container-config-scripts/container_puppet_apply.sh:/container_puppet_apply.sh:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/container-config-scripts:/var/lib/container-config-scripts:shared,z']}, batch=17.1_20251118.1, distribution-scope=public, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-type=git, version=17.1.12, release=1761123044, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, url=https://www.redhat.com, tcib_managed=true, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.openshift.tags=rhosp osp 
openstack osp-17.1 openstack-nova-libvirt, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 nova-libvirt) Dec 15 03:05:35 localhost systemd[1]: var-lib-containers-storage-overlay-c0eca1cbaf0df6dd2fb69cf60e02f6f367b7d2e448ccc3625823f47bdf01b658-merged.mount: Deactivated successfully. Dec 15 03:05:35 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-491a19262c16ca4c262b2c9692375e3b36e2ba6534b49d3827928d6534415b13-userdata-shm.mount: Deactivated successfully. Dec 15 03:05:35 localhost systemd[1]: var-lib-containers-storage-overlay-b16e8358d4fe22ce856ff2b83b0d0fefb871e4963cf69a561b1e5fd5bd34452f-merged.mount: Deactivated successfully. Dec 15 03:05:35 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-82e84ac99155092a2f6ba7ce86bb20e20ca2974ae9821a85a7f91ea04fff00bb-userdata-shm.mount: Deactivated successfully. 
Dec 15 03:05:35 localhost ceph-osd[31375]: osd.0 pg_epoch: 65 pg[7.c( v 35'39 (0'0,35'39] local-lis/les=49/50 n=1 ec=42/33 lis/c=49/49 les/c/f=50/50/0 sis=65 pruub=9.452054977s) [2,1,3] r=-1 lpr=65 pi=[49,65)/1 luod=0'0 crt=35'39 lcod 0'0 mlcod 0'0 active pruub 1239.549316406s@ mbc={}] start_peering_interval up [1,5,0] -> [2,1,3], acting [1,5,0] -> [2,1,3], acting_primary 1 -> 2, up_primary 1 -> 2, role 2 -> -1, features acting 4540138322906710015 upacting 4540138322906710015 Dec 15 03:05:35 localhost ceph-osd[31375]: osd.0 pg_epoch: 65 pg[7.c( v 35'39 (0'0,35'39] local-lis/les=49/50 n=1 ec=42/33 lis/c=49/49 les/c/f=50/50/0 sis=65 pruub=9.451436996s) [2,1,3] r=-1 lpr=65 pi=[49,65)/1 crt=35'39 lcod 0'0 mlcod 0'0 unknown NOTIFY pruub 1239.549316406s@ mbc={}] state: transitioning to Stray Dec 15 03:05:37 localhost ceph-osd[32311]: osd.3 pg_epoch: 65 pg[7.c( empty local-lis/les=0/0 n=0 ec=42/33 lis/c=49/49 les/c/f=50/50/0 sis=65) [2,1,3] r=2 lpr=65 pi=[49,65)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Stray Dec 15 03:05:37 localhost ovs-vsctl[58224]: ovs|00001|db_ctl_base|ERR|unix:/var/run/openvswitch/db.sock: database connection failed (No such file or directory) Dec 15 03:05:37 localhost systemd[1]: libpod-a861cedb0dab51fafcbd9dcfc8359ca2e8b254ea7d9150233bd842397a343525.scope: Deactivated successfully. Dec 15 03:05:37 localhost systemd[1]: libpod-a861cedb0dab51fafcbd9dcfc8359ca2e8b254ea7d9150233bd842397a343525.scope: Consumed 2.041s CPU time. 
Dec 15 03:05:37 localhost podman[58107]: 2025-12-15 08:05:37.397065194 +0000 UTC m=+2.188440030 container died a861cedb0dab51fafcbd9dcfc8359ca2e8b254ea7d9150233bd842397a343525 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=create_virtlogd_wrapper, distribution-scope=public, build-date=2025-11-19T00:35:22Z, managed_by=tripleo_ansible, name=rhosp17/openstack-nova-libvirt, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, io.buildah.version=1.41.4, config_data={'cgroupns': 'host', 'command': ['/container_puppet_apply.sh', '4', 'file', 'include ::tripleo::profile::base::nova::virtlogd_wrapper'], 'detach': False, 'environment': {'TRIPLEO_DEPLOY_IDENTIFIER': '1765784752'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'start_order': 1, 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/container-config-scripts/container_puppet_apply.sh:/container_puppet_apply.sh:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/container-config-scripts:/var/lib/container-config-scripts:shared,z']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, release=1761123044, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 nova-libvirt, 
version=17.1.12, container_name=create_virtlogd_wrapper, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-nova-libvirt-container, tcib_managed=true, url=https://www.redhat.com, io.openshift.expose-services=, vendor=Red Hat, Inc., vcs-type=git, batch=17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_id=tripleo_step2, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, konflux.additional-tags=17.1.12 17.1_20251118.1) Dec 15 03:05:37 localhost systemd[1]: tmp-crun.bTxLCn.mount: Deactivated successfully. Dec 15 03:05:37 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-a861cedb0dab51fafcbd9dcfc8359ca2e8b254ea7d9150233bd842397a343525-userdata-shm.mount: Deactivated successfully. Dec 15 03:05:37 localhost podman[58350]: 2025-12-15 08:05:37.491835944 +0000 UTC m=+0.081894356 container cleanup a861cedb0dab51fafcbd9dcfc8359ca2e8b254ea7d9150233bd842397a343525 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=create_virtlogd_wrapper, konflux.additional-tags=17.1.12 17.1_20251118.1, architecture=x86_64, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_data={'cgroupns': 'host', 'command': ['/container_puppet_apply.sh', '4', 'file', 'include ::tripleo::profile::base::nova::virtlogd_wrapper'], 'detach': False, 'environment': {'TRIPLEO_DEPLOY_IDENTIFIER': '1765784752'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'start_order': 1, 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/container-config-scripts/container_puppet_apply.sh:/container_puppet_apply.sh:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/container-config-scripts:/var/lib/container-config-scripts:shared,z']}, tcib_managed=true, name=rhosp17/openstack-nova-libvirt, url=https://www.redhat.com, io.openshift.expose-services=, version=17.1.12, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, batch=17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 nova-libvirt, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, io.buildah.version=1.41.4, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, build-date=2025-11-19T00:35:22Z, com.redhat.component=openstack-nova-libvirt-container, distribution-scope=public, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, container_name=create_virtlogd_wrapper, managed_by=tripleo_ansible, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, release=1761123044, vcs-type=git, config_id=tripleo_step2) Dec 15 03:05:37 localhost systemd[1]: libpod-conmon-a861cedb0dab51fafcbd9dcfc8359ca2e8b254ea7d9150233bd842397a343525.scope: Deactivated successfully. 
Dec 15 03:05:37 localhost python3[57829]: ansible-tripleo_container_manage PODMAN-CONTAINER-DEBUG: podman run --name create_virtlogd_wrapper --cgroupns=host --conmon-pidfile /run/create_virtlogd_wrapper.pid --detach=False --env TRIPLEO_DEPLOY_IDENTIFIER=1765784752 --label config_id=tripleo_step2 --label container_name=create_virtlogd_wrapper --label managed_by=tripleo_ansible --label config_data={'cgroupns': 'host', 'command': ['/container_puppet_apply.sh', '4', 'file', 'include ::tripleo::profile::base::nova::virtlogd_wrapper'], 'detach': False, 'environment': {'TRIPLEO_DEPLOY_IDENTIFIER': '1765784752'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'start_order': 1, 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/container-config-scripts/container_puppet_apply.sh:/container_puppet_apply.sh:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/container-config-scripts:/var/lib/container-config-scripts:shared,z']} --log-driver k8s-file --log-opt path=/var/log/containers/stdouts/create_virtlogd_wrapper.log --network host --pid host --user root --volume /etc/hosts:/etc/hosts:ro --volume /etc/localtime:/etc/localtime:ro --volume /etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro --volume /etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro --volume /etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro --volume /etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro 
--volume /etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro --volume /dev/log:/dev/log --volume /var/lib/container-config-scripts/container_puppet_apply.sh:/container_puppet_apply.sh:ro --volume /etc/puppet:/tmp/puppet-etc:ro --volume /usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro --volume /var/lib/container-config-scripts:/var/lib/container-config-scripts:shared,z registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1 /container_puppet_apply.sh 4 file include ::tripleo::profile::base::nova::virtlogd_wrapper Dec 15 03:05:37 localhost ceph-osd[32311]: log_channel(cluster) log [DBG] : 6.1c scrub starts Dec 15 03:05:37 localhost ceph-osd[32311]: log_channel(cluster) log [DBG] : 6.1c scrub ok Dec 15 03:05:38 localhost systemd[1]: libpod-2bab57e62a1f2efe7f7cd0dadb6e5910c04532af23927d8fbd094c0587904967.scope: Deactivated successfully. Dec 15 03:05:38 localhost systemd[1]: libpod-2bab57e62a1f2efe7f7cd0dadb6e5910c04532af23927d8fbd094c0587904967.scope: Consumed 2.007s CPU time. 
Dec 15 03:05:38 localhost podman[58092]: 2025-12-15 08:05:38.07640899 +0000 UTC m=+2.895262253 container died 2bab57e62a1f2efe7f7cd0dadb6e5910c04532af23927d8fbd094c0587904967 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=create_haproxy_wrapper, url=https://www.redhat.com, konflux.additional-tags=17.1.12 17.1_20251118.1, config_data={'command': ['/container_puppet_apply.sh', '4', 'file', 'include ::tripleo::profile::base::neutron::ovn_metadata_agent_wrappers'], 'detach': False, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'start_order': 1, 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/container-config-scripts/container_puppet_apply.sh:/container_puppet_apply.sh:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z']}, version=17.1.12, name=rhosp17/openstack-neutron-metadata-agent-ovn, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., batch=17.1_20251118.1, managed_by=tripleo_ansible, tcib_managed=true, distribution-scope=public, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_id=tripleo_step2, vcs-type=git, 
com.redhat.component=openstack-neutron-metadata-agent-ovn-container, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.expose-services=, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, io.buildah.version=1.41.4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, build-date=2025-11-19T00:14:25Z, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, container_name=create_haproxy_wrapper, release=1761123044, architecture=x86_64, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c) Dec 15 03:05:38 localhost podman[58390]: 2025-12-15 08:05:38.182061342 +0000 UTC m=+0.090698662 container cleanup 2bab57e62a1f2efe7f7cd0dadb6e5910c04532af23927d8fbd094c0587904967 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=create_haproxy_wrapper, managed_by=tripleo_ansible, vcs-type=git, build-date=2025-11-19T00:14:25Z, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_id=tripleo_step2, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, container_name=create_haproxy_wrapper, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, batch=17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, konflux.additional-tags=17.1.12 17.1_20251118.1, tcib_managed=true, version=17.1.12, release=1761123044, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, name=rhosp17/openstack-neutron-metadata-agent-ovn, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., architecture=x86_64, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_data={'command': ['/container_puppet_apply.sh', '4', 'file', 'include ::tripleo::profile::base::neutron::ovn_metadata_agent_wrappers'], 'detach': False, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'start_order': 1, 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/container-config-scripts/container_puppet_apply.sh:/container_puppet_apply.sh:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z']}, io.buildah.version=1.41.4, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, io.openshift.expose-services=) Dec 15 03:05:38 localhost systemd[1]: libpod-conmon-2bab57e62a1f2efe7f7cd0dadb6e5910c04532af23927d8fbd094c0587904967.scope: Deactivated successfully. 
Dec 15 03:05:38 localhost python3[57829]: ansible-tripleo_container_manage PODMAN-CONTAINER-DEBUG: podman run --name create_haproxy_wrapper --conmon-pidfile /run/create_haproxy_wrapper.pid --detach=False --label config_id=tripleo_step2 --label container_name=create_haproxy_wrapper --label managed_by=tripleo_ansible --label config_data={'command': ['/container_puppet_apply.sh', '4', 'file', 'include ::tripleo::profile::base::neutron::ovn_metadata_agent_wrappers'], 'detach': False, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'start_order': 1, 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/container-config-scripts/container_puppet_apply.sh:/container_puppet_apply.sh:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z']} --log-driver k8s-file --log-opt path=/var/log/containers/stdouts/create_haproxy_wrapper.log --network host --pid host --user root --volume /etc/hosts:/etc/hosts:ro --volume /etc/localtime:/etc/localtime:ro --volume /etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro --volume /etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro --volume /etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro --volume /etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro --volume /etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro --volume /dev/log:/dev/log --volume 
/var/lib/container-config-scripts/container_puppet_apply.sh:/container_puppet_apply.sh:ro --volume /etc/puppet:/tmp/puppet-etc:ro --volume /usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro --volume /run/openvswitch:/run/openvswitch:shared,z --volume /var/lib/neutron:/var/lib/neutron:shared,z registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1 /container_puppet_apply.sh 4 file include ::tripleo::profile::base::neutron::ovn_metadata_agent_wrappers Dec 15 03:05:38 localhost systemd[1]: var-lib-containers-storage-overlay-e86956d3754c7245b49457025d0c769a2ba1dcff8718bcff3af6d874290efba1-merged.mount: Deactivated successfully. Dec 15 03:05:38 localhost systemd[1]: var-lib-containers-storage-overlay-a7a0fabd483d0287d5d447d66b01a81ddf0a08e391390a4f77973a583945daec-merged.mount: Deactivated successfully. Dec 15 03:05:38 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-2bab57e62a1f2efe7f7cd0dadb6e5910c04532af23927d8fbd094c0587904967-userdata-shm.mount: Deactivated successfully. 
Dec 15 03:05:38 localhost ceph-osd[32311]: osd.3 pg_epoch: 67 pg[7.d( v 35'39 (0'0,35'39] local-lis/les=51/52 n=1 ec=42/33 lis/c=51/51 les/c/f=52/52/0 sis=67 pruub=12.771553040s) [2,4,0] r=-1 lpr=67 pi=[51,67)/1 crt=35'39 mlcod 0'0 active pruub 1241.187988281s@ mbc={255={}}] start_peering_interval up [3,4,2] -> [2,4,0], acting [3,4,2] -> [2,4,0], acting_primary 3 -> 2, up_primary 3 -> 2, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015 Dec 15 03:05:38 localhost ceph-osd[32311]: osd.3 pg_epoch: 67 pg[7.d( v 35'39 (0'0,35'39] local-lis/les=51/52 n=1 ec=42/33 lis/c=51/51 les/c/f=52/52/0 sis=67 pruub=12.771300316s) [2,4,0] r=-1 lpr=67 pi=[51,67)/1 crt=35'39 mlcod 0'0 unknown NOTIFY pruub 1241.187988281s@ mbc={}] state: transitioning to Stray Dec 15 03:05:38 localhost python3[58444]: ansible-file Invoked with path=/var/lib/container-puppet/container-puppet-tasks2.json state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Dec 15 03:05:39 localhost ceph-osd[31375]: osd.0 pg_epoch: 67 pg[7.d( empty local-lis/les=0/0 n=0 ec=42/33 lis/c=51/51 les/c/f=52/52/0 sis=67) [2,4,0] r=2 lpr=67 pi=[51,67)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Stray Dec 15 03:05:40 localhost python3[58565]: ansible-container_puppet_config Invoked with check_mode=False config_vol_prefix=/var/lib/config-data debug=True net_host=True no_archive=True puppet_config=/var/lib/container-puppet/container-puppet-tasks2.json short_hostname=np0005559462 step=2 update_config_hash_only=False Dec 15 03:05:40 localhost ceph-osd[32311]: log_channel(cluster) log [DBG] : 3.8 scrub starts Dec 15 03:05:40 localhost ceph-osd[32311]: osd.3 pg_epoch: 69 pg[7.e( v 35'39 (0'0,35'39] local-lis/les=53/54 n=1 
ec=42/33 lis/c=53/53 les/c/f=54/54/0 sis=69 pruub=12.963660240s) [1,0,5] r=-1 lpr=69 pi=[53,69)/1 luod=0'0 crt=35'39 mlcod 0'0 active pruub 1243.611938477s@ mbc={}] start_peering_interval up [1,2,3] -> [1,0,5], acting [1,2,3] -> [1,0,5], acting_primary 1 -> 1, up_primary 1 -> 1, role 2 -> -1, features acting 4540138322906710015 upacting 4540138322906710015 Dec 15 03:05:40 localhost ceph-osd[32311]: osd.3 pg_epoch: 69 pg[7.e( v 35'39 (0'0,35'39] local-lis/les=53/54 n=1 ec=42/33 lis/c=53/53 les/c/f=54/54/0 sis=69 pruub=12.963470459s) [1,0,5] r=-1 lpr=69 pi=[53,69)/1 crt=35'39 mlcod 0'0 unknown NOTIFY pruub 1243.611938477s@ mbc={}] state: transitioning to Stray Dec 15 03:05:41 localhost ceph-osd[31375]: log_channel(cluster) log [DBG] : 4.3 scrub starts Dec 15 03:05:41 localhost ceph-osd[32311]: log_channel(cluster) log [DBG] : 3.8 scrub ok Dec 15 03:05:41 localhost python3[58581]: ansible-file Invoked with path=/var/log/containers/stdouts state=directory owner=root group=root recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None setype=None attributes=None Dec 15 03:05:41 localhost python3[58597]: ansible-container_config_data Invoked with config_path=/var/lib/tripleo-config/container-puppet-config/step_2 config_pattern=container-puppet-*.json config_overrides={} debug=True Dec 15 03:05:42 localhost ceph-osd[31375]: osd.0 pg_epoch: 69 pg[7.e( empty local-lis/les=0/0 n=0 ec=42/33 lis/c=53/53 les/c/f=54/54/0 sis=69) [1,0,5] r=1 lpr=69 pi=[53,69)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Stray Dec 15 03:05:42 localhost ceph-osd[31375]: osd.0 pg_epoch: 70 pg[7.f( v 35'39 (0'0,35'39] local-lis/les=55/56 n=1 ec=42/33 lis/c=55/55 les/c/f=56/56/0 sis=70 pruub=13.298669815s) [1,3,5] r=-1 lpr=70 pi=[55,70)/1 luod=0'0 crt=35'39 mlcod 0'0 active pruub 
1250.203002930s@ mbc={}] start_peering_interval up [2,0,1] -> [1,3,5], acting [2,0,1] -> [1,3,5], acting_primary 2 -> 1, up_primary 2 -> 1, role 1 -> -1, features acting 4540138322906710015 upacting 4540138322906710015 Dec 15 03:05:42 localhost ceph-osd[31375]: osd.0 pg_epoch: 70 pg[7.f( v 35'39 (0'0,35'39] local-lis/les=55/56 n=1 ec=42/33 lis/c=55/55 les/c/f=56/56/0 sis=70 pruub=13.297309875s) [1,3,5] r=-1 lpr=70 pi=[55,70)/1 crt=35'39 mlcod 0'0 unknown NOTIFY pruub 1250.203002930s@ mbc={}] state: transitioning to Stray Dec 15 03:05:43 localhost ceph-osd[32311]: log_channel(cluster) log [DBG] : 5.d scrub starts Dec 15 03:05:43 localhost ceph-osd[32311]: log_channel(cluster) log [DBG] : 5.d scrub ok Dec 15 03:05:44 localhost ceph-osd[32311]: osd.3 pg_epoch: 70 pg[7.f( empty local-lis/les=0/0 n=0 ec=42/33 lis/c=55/55 les/c/f=56/56/0 sis=70) [1,3,5] r=1 lpr=70 pi=[55,70)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Stray Dec 15 03:05:45 localhost ceph-osd[31375]: log_channel(cluster) log [DBG] : 2.1d scrub starts Dec 15 03:05:45 localhost ceph-osd[31375]: log_channel(cluster) log [DBG] : 2.1d scrub ok Dec 15 03:05:47 localhost ceph-osd[31375]: log_channel(cluster) log [DBG] : 4.6 scrub starts Dec 15 03:05:48 localhost ceph-osd[31375]: log_channel(cluster) log [DBG] : 4.6 scrub ok Dec 15 03:05:51 localhost systemd[1]: Started /usr/bin/podman healthcheck run 6278d648aec9c9f12e0efaafa0d6ae5653eeb34459516699e562eeb9c8d23b3c. 
Dec 15 03:05:51 localhost podman[58598]: 2025-12-15 08:05:51.731228909 +0000 UTC m=+0.065640250 container health_status 6278d648aec9c9f12e0efaafa0d6ae5653eeb34459516699e562eeb9c8d23b3c (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, batch=17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp17/openstack-qdrouterd, konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible, build-date=2025-11-18T22:49:46Z, vendor=Red Hat, Inc., version=17.1.12, tcib_managed=true, architecture=x86_64, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, summary=Red Hat OpenStack Platform 17.1 qdrouterd, release=1761123044, description=Red Hat OpenStack Platform 17.1 qdrouterd, maintainer=OpenStack TripleO Team, config_id=tripleo_step1, container_name=metrics_qdr, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.component=openstack-qdrouterd-container, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '29d07da103bbd44a9ed3e29999314b03'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, io.buildah.version=1.41.4, distribution-scope=public, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, url=https://www.redhat.com, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream) Dec 15 03:05:51 localhost podman[58598]: 2025-12-15 08:05:51.987225579 +0000 UTC m=+0.321636920 container exec_died 6278d648aec9c9f12e0efaafa0d6ae5653eeb34459516699e562eeb9c8d23b3c (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, description=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step1, distribution-scope=public, architecture=x86_64, container_name=metrics_qdr, io.openshift.expose-services=, io.buildah.version=1.41.4, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 qdrouterd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 17.1_20251118.1, batch=17.1_20251118.1, com.redhat.component=openstack-qdrouterd-container, url=https://www.redhat.com, vendor=Red Hat, Inc., org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '29d07da103bbd44a9ed3e29999314b03'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 
'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, maintainer=OpenStack TripleO Team, build-date=2025-11-18T22:49:46Z, release=1761123044, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, name=rhosp17/openstack-qdrouterd, version=17.1.12, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd) Dec 15 03:05:52 localhost systemd[1]: 6278d648aec9c9f12e0efaafa0d6ae5653eeb34459516699e562eeb9c8d23b3c.service: Deactivated successfully. 
Dec 15 03:05:55 localhost ceph-osd[31375]: log_channel(cluster) log [DBG] : 4.1c scrub starts Dec 15 03:05:55 localhost ceph-osd[31375]: log_channel(cluster) log [DBG] : 4.1c scrub ok Dec 15 03:05:55 localhost ceph-osd[32311]: log_channel(cluster) log [DBG] : 2.b scrub starts Dec 15 03:05:55 localhost ceph-osd[32311]: log_channel(cluster) log [DBG] : 2.b scrub ok Dec 15 03:05:56 localhost ceph-osd[32311]: log_channel(cluster) log [DBG] : 3.11 deep-scrub starts Dec 15 03:05:56 localhost ceph-osd[32311]: log_channel(cluster) log [DBG] : 3.11 deep-scrub ok Dec 15 03:05:57 localhost ceph-osd[31375]: log_channel(cluster) log [DBG] : 5.1a scrub starts Dec 15 03:05:57 localhost ceph-osd[31375]: log_channel(cluster) log [DBG] : 5.1a scrub ok Dec 15 03:05:58 localhost ceph-osd[31375]: log_channel(cluster) log [DBG] : 3.e scrub starts Dec 15 03:05:59 localhost ceph-osd[31375]: log_channel(cluster) log [DBG] : 3.e scrub ok Dec 15 03:05:59 localhost ceph-osd[32311]: log_channel(cluster) log [DBG] : 2.10 scrub starts Dec 15 03:05:59 localhost ceph-osd[32311]: log_channel(cluster) log [DBG] : 2.10 scrub ok Dec 15 03:06:03 localhost ceph-osd[31375]: log_channel(cluster) log [DBG] : 2.d deep-scrub starts Dec 15 03:06:03 localhost ceph-osd[31375]: log_channel(cluster) log [DBG] : 2.d deep-scrub ok Dec 15 03:06:03 localhost ceph-osd[32311]: log_channel(cluster) log [DBG] : 2.c deep-scrub starts Dec 15 03:06:03 localhost ceph-osd[32311]: log_channel(cluster) log [DBG] : 2.c deep-scrub ok Dec 15 03:06:04 localhost ceph-osd[31375]: log_channel(cluster) log [DBG] : 5.4 scrub starts Dec 15 03:06:04 localhost ceph-osd[31375]: log_channel(cluster) log [DBG] : 5.4 scrub ok Dec 15 03:06:08 localhost ceph-osd[31375]: log_channel(cluster) log [DBG] : 5.b scrub starts Dec 15 03:06:08 localhost ceph-osd[31375]: log_channel(cluster) log [DBG] : 5.b scrub ok Dec 15 03:06:08 localhost ceph-osd[31375]: log_channel(cluster) log [DBG] : 5.8 scrub starts Dec 15 03:06:09 localhost ceph-osd[31375]: 
log_channel(cluster) log [DBG] : 5.8 scrub ok Dec 15 03:06:11 localhost ceph-osd[31375]: log_channel(cluster) log [DBG] : 4.9 scrub starts Dec 15 03:06:11 localhost ceph-osd[31375]: log_channel(cluster) log [DBG] : 4.9 scrub ok Dec 15 03:06:12 localhost ceph-osd[32311]: log_channel(cluster) log [DBG] : 5.13 scrub starts Dec 15 03:06:12 localhost ceph-osd[32311]: log_channel(cluster) log [DBG] : 5.13 scrub ok Dec 15 03:06:16 localhost ceph-osd[31375]: log_channel(cluster) log [DBG] : 6.12 scrub starts Dec 15 03:06:16 localhost ceph-osd[31375]: log_channel(cluster) log [DBG] : 6.12 scrub ok Dec 15 03:06:16 localhost ceph-osd[32311]: log_channel(cluster) log [DBG] : 4.15 scrub starts Dec 15 03:06:16 localhost ceph-osd[32311]: log_channel(cluster) log [DBG] : 4.15 scrub ok Dec 15 03:06:17 localhost ceph-osd[32311]: log_channel(cluster) log [DBG] : 5.12 scrub starts Dec 15 03:06:18 localhost ceph-osd[31375]: log_channel(cluster) log [DBG] : 2.1c scrub starts Dec 15 03:06:18 localhost ceph-osd[31375]: log_channel(cluster) log [DBG] : 2.1c scrub ok Dec 15 03:06:18 localhost ceph-osd[32311]: log_channel(cluster) log [DBG] : 5.12 scrub ok Dec 15 03:06:19 localhost ceph-osd[32311]: log_channel(cluster) log [DBG] : 2.18 scrub starts Dec 15 03:06:20 localhost ceph-osd[32311]: log_channel(cluster) log [DBG] : 2.18 scrub ok Dec 15 03:06:21 localhost ceph-osd[31375]: log_channel(cluster) log [DBG] : 4.3 scrub starts Dec 15 03:06:21 localhost ceph-osd[31375]: log_channel(cluster) log [DBG] : 4.3 scrub ok Dec 15 03:06:21 localhost ceph-osd[32311]: log_channel(cluster) log [DBG] : 7.5 scrub starts Dec 15 03:06:21 localhost ceph-osd[32311]: log_channel(cluster) log [DBG] : 7.5 scrub ok Dec 15 03:06:22 localhost systemd[1]: Started /usr/bin/podman healthcheck run 6278d648aec9c9f12e0efaafa0d6ae5653eeb34459516699e562eeb9c8d23b3c. 
Dec 15 03:06:22 localhost podman[58704]: 2025-12-15 08:06:22.751174216 +0000 UTC m=+0.084569582 container health_status 6278d648aec9c9f12e0efaafa0d6ae5653eeb34459516699e562eeb9c8d23b3c (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, distribution-scope=public, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 17.1_20251118.1, vendor=Red Hat, Inc., architecture=x86_64, com.redhat.component=openstack-qdrouterd-container, build-date=2025-11-18T22:49:46Z, managed_by=tripleo_ansible, release=1761123044, version=17.1.12, batch=17.1_20251118.1, tcib_managed=true, url=https://www.redhat.com, name=rhosp17/openstack-qdrouterd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, container_name=metrics_qdr, config_id=tripleo_step1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '29d07da103bbd44a9ed3e29999314b03'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, io.buildah.version=1.41.4, description=Red Hat OpenStack Platform 17.1 qdrouterd, summary=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.expose-services=)
Dec 15 03:06:22 localhost podman[58704]: 2025-12-15 08:06:22.995450778 +0000 UTC m=+0.328846084 container exec_died 6278d648aec9c9f12e0efaafa0d6ae5653eeb34459516699e562eeb9c8d23b3c (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, managed_by=tripleo_ansible, name=rhosp17/openstack-qdrouterd, vendor=Red Hat, Inc., vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, distribution-scope=public, url=https://www.redhat.com, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, version=17.1.12, vcs-type=git, batch=17.1_20251118.1, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, release=1761123044, description=Red Hat OpenStack Platform 17.1 qdrouterd, architecture=x86_64, build-date=2025-11-18T22:49:46Z, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '29d07da103bbd44a9ed3e29999314b03'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.component=openstack-qdrouterd-container, konflux.additional-tags=17.1.12 17.1_20251118.1, config_id=tripleo_step1, io.buildah.version=1.41.4, summary=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, maintainer=OpenStack TripleO Team, container_name=metrics_qdr, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd)
Dec 15 03:06:23 localhost systemd[1]: 6278d648aec9c9f12e0efaafa0d6ae5653eeb34459516699e562eeb9c8d23b3c.service: Deactivated successfully.
Dec 15 03:06:23 localhost ceph-osd[32311]: log_channel(cluster) log [DBG] : 7.a scrub starts
Dec 15 03:06:23 localhost ceph-osd[32311]: log_channel(cluster) log [DBG] : 7.a scrub ok
Dec 15 03:06:24 localhost ceph-osd[32311]: log_channel(cluster) log [DBG] : 2.1b scrub starts
Dec 15 03:06:24 localhost ceph-osd[32311]: log_channel(cluster) log [DBG] : 2.1b scrub ok
Dec 15 03:06:25 localhost ceph-osd[32311]: log_channel(cluster) log [DBG] : 6.1e scrub starts
Dec 15 03:06:25 localhost ceph-osd[32311]: log_channel(cluster) log [DBG] : 6.1e scrub ok
Dec 15 03:06:34 localhost ceph-osd[32311]: osd.3 72 crush map has features 432629239337189376, adjusting msgr requires for clients
Dec 15 03:06:34 localhost ceph-osd[32311]: osd.3 72 crush map has features 432629239337189376 was 288514051259245057, adjusting msgr requires for mons
Dec 15 03:06:34 localhost ceph-osd[32311]: osd.3 72 crush map has features 3314933000854323200, adjusting msgr requires for osds
Dec 15 03:06:34 localhost ceph-osd[32311]: osd.3 pg_epoch: 72 pg[3.9( empty local-lis/les=38/39 n=0 ec=38/22 lis/c=38/38 les/c/f=39/39/0 sis=72 pruub=10.996282578s) [0,2,1] r=-1 lpr=72 pi=[38,72)/1 crt=0'0 mlcod 0'0 active pruub 1294.943359375s@ mbc={}] start_peering_interval up [3,2,1] -> [0,2,1], acting [3,2,1] -> [0,2,1], acting_primary 3 -> 0, up_primary 3 -> 0, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Dec 15 03:06:34 localhost ceph-osd[32311]: osd.3 pg_epoch: 72 pg[4.f( empty local-lis/les=44/45 n=0 ec=40/24 lis/c=44/44 les/c/f=45/45/0 sis=72 pruub=12.902430534s) [4,3,5] r=1 lpr=72 pi=[44,72)/1 crt=0'0 mlcod 0'0 active pruub 1296.850219727s@ mbc={}] start_peering_interval up [1,3,5] -> [4,3,5], acting [1,3,5] -> [4,3,5], acting_primary 1 -> 4, up_primary 1 -> 4, role 1 -> 1, features acting 4540138322906710015 upacting 4540138322906710015
Dec 15 03:06:34 localhost ceph-osd[32311]: osd.3 pg_epoch: 72 pg[4.f( empty local-lis/les=44/45 n=0 ec=40/24 lis/c=44/44 les/c/f=45/45/0 sis=72 pruub=12.902276039s) [4,3,5] r=1 lpr=72 pi=[44,72)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1296.850219727s@ mbc={}] state<Start>: transitioning to Stray
Dec 15 03:06:34 localhost ceph-osd[32311]: osd.3 pg_epoch: 72 pg[3.9( empty local-lis/les=38/39 n=0 ec=38/22 lis/c=38/38 les/c/f=39/39/0 sis=72 pruub=10.995655060s) [0,2,1] r=-1 lpr=72 pi=[38,72)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1294.943359375s@ mbc={}] state<Start>: transitioning to Stray
Dec 15 03:06:34 localhost ceph-osd[31375]: osd.0 72 crush map has features 432629239337189376, adjusting msgr requires for clients
Dec 15 03:06:34 localhost ceph-osd[31375]: osd.0 72 crush map has features 432629239337189376 was 288514051259245057, adjusting msgr requires for mons
Dec 15 03:06:34 localhost ceph-osd[31375]: osd.0 72 crush map has features 3314933000854323200, adjusting msgr requires for osds
Dec 15 03:06:34 localhost ceph-osd[31375]: osd.0 pg_epoch: 72 pg[4.6( empty local-lis/les=44/45 n=0 ec=40/24 lis/c=44/44 les/c/f=45/45/0 sis=72 pruub=12.896583557s) [0,4,2] r=0 lpr=72 pi=[44,72)/1 crt=0'0 mlcod 0'0 active pruub 1301.138793945s@ mbc={}] start_peering_interval up [0,1,2] -> [0,4,2], acting [0,1,2] -> [0,4,2], acting_primary 0 -> 0, up_primary 0 -> 0, role 0 -> 0, features acting 4540138322906710015 upacting 4540138322906710015
Dec 15 03:06:34 localhost ceph-osd[31375]: osd.0 pg_epoch: 72 pg[4.6( empty local-lis/les=44/45 n=0 ec=40/24 lis/c=44/44 les/c/f=45/45/0 sis=72 pruub=12.896583557s) [0,4,2] r=0 lpr=72 pi=[44,72)/1 crt=0'0 mlcod 0'0 unknown pruub 1301.138793945s@ mbc={}] state: transitioning to Primary
Dec 15 03:06:34 localhost ceph-osd[31375]: osd.0 pg_epoch: 72 pg[3.9( empty local-lis/les=0/0 n=0 ec=38/22 lis/c=38/38 les/c/f=39/39/0 sis=72) [0,2,1] r=0 lpr=72 pi=[38,72)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state: transitioning to Primary
Dec 15 03:06:35 localhost ceph-osd[31375]: osd.0 pg_epoch: 73 pg[3.9( empty local-lis/les=72/73 n=0 ec=38/22 lis/c=38/38 les/c/f=39/39/0 sis=72) [0,2,1] r=0 lpr=72 pi=[38,72)/1 crt=0'0 mlcod 0'0 active mbc={}] state: react AllReplicasActivated Activating complete
Dec 15 03:06:35 localhost ceph-osd[31375]: osd.0 pg_epoch: 73 pg[4.6( empty local-lis/les=72/73 n=0 ec=40/24 lis/c=44/44 les/c/f=45/45/0 sis=72) [0,4,2] r=0 lpr=72 pi=[44,72)/1 crt=0'0 mlcod 0'0 active mbc={}] state: react AllReplicasActivated Activating complete
Dec 15 03:06:36 localhost ceph-osd[31375]: log_channel(cluster) log [DBG] : 3.9 scrub starts
Dec 15 03:06:37 localhost ceph-osd[31375]: log_channel(cluster) log [DBG] : 3.9 scrub ok
Dec 15 03:06:38 localhost sshd[58732]: main: sshd: ssh-rsa algorithm is disabled
Dec 15 03:06:53 localhost systemd[1]: Started /usr/bin/podman healthcheck run 6278d648aec9c9f12e0efaafa0d6ae5653eeb34459516699e562eeb9c8d23b3c.
Dec 15 03:06:53 localhost podman[58734]: 2025-12-15 08:06:53.753265132 +0000 UTC m=+0.086593704 container health_status 6278d648aec9c9f12e0efaafa0d6ae5653eeb34459516699e562eeb9c8d23b3c (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, summary=Red Hat OpenStack Platform 17.1 qdrouterd, maintainer=OpenStack TripleO Team, tcib_managed=true, vendor=Red Hat, Inc., managed_by=tripleo_ansible, build-date=2025-11-18T22:49:46Z, com.redhat.component=openstack-qdrouterd-container, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '29d07da103bbd44a9ed3e29999314b03'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, container_name=metrics_qdr, version=17.1.12, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-type=git, url=https://www.redhat.com, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, description=Red Hat OpenStack Platform 17.1 qdrouterd, release=1761123044, architecture=x86_64, name=rhosp17/openstack-qdrouterd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, batch=17.1_20251118.1, distribution-scope=public, config_id=tripleo_step1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.expose-services=, io.buildah.version=1.41.4)
Dec 15 03:06:53 localhost podman[58734]: 2025-12-15 08:06:53.979433938 +0000 UTC m=+0.312762530 container exec_died 6278d648aec9c9f12e0efaafa0d6ae5653eeb34459516699e562eeb9c8d23b3c (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, architecture=x86_64, config_id=tripleo_step1, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.component=openstack-qdrouterd-container, batch=17.1_20251118.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '29d07da103bbd44a9ed3e29999314b03'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, io.openshift.expose-services=, vendor=Red Hat, Inc., version=17.1.12, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, release=1761123044, description=Red Hat OpenStack Platform 17.1 qdrouterd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, distribution-scope=public, tcib_managed=true, url=https://www.redhat.com, io.buildah.version=1.41.4, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-qdrouterd, container_name=metrics_qdr, build-date=2025-11-18T22:49:46Z, summary=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Dec 15 03:06:53 localhost systemd[1]: 6278d648aec9c9f12e0efaafa0d6ae5653eeb34459516699e562eeb9c8d23b3c.service: Deactivated successfully.
Dec 15 03:07:21 localhost systemd[1]: tmp-crun.j5RAxy.mount: Deactivated successfully.
Dec 15 03:07:21 localhost podman[58866]: 2025-12-15 08:07:21.282386059 +0000 UTC m=+0.103108020 container exec 8dcda56b365b42dc8758aab77a9ec80db304780e449052738f7e4e648ae1ecaf (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-bce17446-41b5-5408-a23e-0b011906b44a-crash-np0005559462, GIT_BRANCH=main, architecture=x86_64, com.redhat.component=rhceph-container, build-date=2025-11-26T19:44:28Z, url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.expose-services=, distribution-scope=public, io.openshift.tags=rhceph ceph, vendor=Red Hat, Inc., GIT_REPO=https://github.com/ceph/ceph-container.git, description=Red Hat Ceph Storage 7, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, ceph=True, GIT_CLEAN=True, io.buildah.version=1.41.4, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, maintainer=Guillaume Abrioux , GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, name=rhceph, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.k8s.description=Red Hat Ceph Storage 7, release=1763362218, RELEASE=main, vcs-type=git, CEPH_POINT_RELEASE=, version=7, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0)
Dec 15 03:07:21 localhost podman[58866]: 2025-12-15 08:07:21.424559339 +0000 UTC m=+0.245281360 container exec_died 8dcda56b365b42dc8758aab77a9ec80db304780e449052738f7e4e648ae1ecaf (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-bce17446-41b5-5408-a23e-0b011906b44a-crash-np0005559462, io.openshift.expose-services=, release=1763362218, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., build-date=2025-11-26T19:44:28Z, GIT_CLEAN=True, RELEASE=main, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, io.buildah.version=1.41.4, GIT_REPO=https://github.com/ceph/ceph-container.git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, ceph=True, com.redhat.component=rhceph-container, io.k8s.description=Red Hat Ceph Storage 7, io.openshift.tags=rhceph ceph, url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, architecture=x86_64, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, distribution-scope=public, GIT_BRANCH=main, CEPH_POINT_RELEASE=, maintainer=Guillaume Abrioux , version=7, vendor=Red Hat, Inc., description=Red Hat Ceph Storage 7, name=rhceph)
Dec 15 03:07:24 localhost systemd[1]: Started /usr/bin/podman healthcheck run 6278d648aec9c9f12e0efaafa0d6ae5653eeb34459516699e562eeb9c8d23b3c.
Dec 15 03:07:24 localhost systemd[1]: tmp-crun.YpnjYh.mount: Deactivated successfully.
Dec 15 03:07:24 localhost podman[59009]: 2025-12-15 08:07:24.776314315 +0000 UTC m=+0.099290360 container health_status 6278d648aec9c9f12e0efaafa0d6ae5653eeb34459516699e562eeb9c8d23b3c (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, url=https://www.redhat.com, batch=17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, name=rhosp17/openstack-qdrouterd, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vendor=Red Hat, Inc., vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '29d07da103bbd44a9ed3e29999314b03'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, container_name=metrics_qdr, konflux.additional-tags=17.1.12 17.1_20251118.1, architecture=x86_64, distribution-scope=public, build-date=2025-11-18T22:49:46Z, description=Red Hat OpenStack Platform 17.1 qdrouterd, maintainer=OpenStack TripleO Team, version=17.1.12, release=1761123044, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, managed_by=tripleo_ansible, tcib_managed=true, config_id=tripleo_step1, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-qdrouterd-container)
Dec 15 03:07:24 localhost podman[59009]: 2025-12-15 08:07:24.943861994 +0000 UTC m=+0.266838029 container exec_died 6278d648aec9c9f12e0efaafa0d6ae5653eeb34459516699e562eeb9c8d23b3c (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, container_name=metrics_qdr, version=17.1.12, batch=17.1_20251118.1, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.expose-services=, vcs-type=git, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 qdrouterd, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-11-18T22:49:46Z, io.buildah.version=1.41.4, config_id=tripleo_step1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, name=rhosp17/openstack-qdrouterd, release=1761123044, managed_by=tripleo_ansible, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '29d07da103bbd44a9ed3e29999314b03'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vendor=Red Hat, Inc., architecture=x86_64, com.redhat.component=openstack-qdrouterd-container, summary=Red Hat OpenStack Platform 17.1 qdrouterd)
Dec 15 03:07:24 localhost systemd[1]: 6278d648aec9c9f12e0efaafa0d6ae5653eeb34459516699e562eeb9c8d23b3c.service: Deactivated successfully.
Dec 15 03:07:52 localhost sshd[59038]: main: sshd: ssh-rsa algorithm is disabled
Dec 15 03:07:55 localhost systemd[1]: Started /usr/bin/podman healthcheck run 6278d648aec9c9f12e0efaafa0d6ae5653eeb34459516699e562eeb9c8d23b3c.
Dec 15 03:07:55 localhost podman[59040]: 2025-12-15 08:07:55.75477535 +0000 UTC m=+0.077747442 container health_status 6278d648aec9c9f12e0efaafa0d6ae5653eeb34459516699e562eeb9c8d23b3c (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, maintainer=OpenStack TripleO Team, release=1761123044, io.buildah.version=1.41.4, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, batch=17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 qdrouterd, name=rhosp17/openstack-qdrouterd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vendor=Red Hat, Inc., konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=metrics_qdr, build-date=2025-11-18T22:49:46Z, url=https://www.redhat.com, architecture=x86_64, version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_id=tripleo_step1, description=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.component=openstack-qdrouterd-container, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, distribution-scope=public, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '29d07da103bbd44a9ed3e29999314b03'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, tcib_managed=true, managed_by=tripleo_ansible, vcs-type=git)
Dec 15 03:07:55 localhost podman[59040]: 2025-12-15 08:07:55.963428963 +0000 UTC m=+0.286401085 container exec_died 6278d648aec9c9f12e0efaafa0d6ae5653eeb34459516699e562eeb9c8d23b3c (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.4, distribution-scope=public, io.openshift.expose-services=, vcs-type=git, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 qdrouterd, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=metrics_qdr, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, batch=17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, release=1761123044, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, name=rhosp17/openstack-qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '29d07da103bbd44a9ed3e29999314b03'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, description=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.component=openstack-qdrouterd-container, version=17.1.12, build-date=2025-11-18T22:49:46Z, config_id=tripleo_step1)
Dec 15 03:07:55 localhost systemd[1]: 6278d648aec9c9f12e0efaafa0d6ae5653eeb34459516699e562eeb9c8d23b3c.service: Deactivated successfully.
Dec 15 03:08:26 localhost systemd[1]: Started /usr/bin/podman healthcheck run 6278d648aec9c9f12e0efaafa0d6ae5653eeb34459516699e562eeb9c8d23b3c.
Dec 15 03:08:26 localhost systemd[1]: tmp-crun.Nv7Cbg.mount: Deactivated successfully.
Dec 15 03:08:26 localhost podman[59147]: 2025-12-15 08:08:26.768718636 +0000 UTC m=+0.098942891 container health_status 6278d648aec9c9f12e0efaafa0d6ae5653eeb34459516699e562eeb9c8d23b3c (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, url=https://www.redhat.com, batch=17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, managed_by=tripleo_ansible, io.openshift.expose-services=, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.12, container_name=metrics_qdr, config_id=tripleo_step1, vcs-type=git, konflux.additional-tags=17.1.12 17.1_20251118.1, name=rhosp17/openstack-qdrouterd, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, architecture=x86_64, build-date=2025-11-18T22:49:46Z, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, io.buildah.version=1.41.4, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-qdrouterd-container, description=Red Hat OpenStack Platform 17.1 qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '29d07da103bbd44a9ed3e29999314b03'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, release=1761123044)
Dec 15 03:08:26 localhost podman[59147]: 2025-12-15 08:08:26.962249299 +0000 UTC m=+0.292473534 container exec_died 6278d648aec9c9f12e0efaafa0d6ae5653eeb34459516699e562eeb9c8d23b3c (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, vendor=Red Hat, Inc., config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '29d07da103bbd44a9ed3e29999314b03'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, tcib_managed=true, com.redhat.component=openstack-qdrouterd-container, description=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, version=17.1.12, release=1761123044, name=rhosp17/openstack-qdrouterd, url=https://www.redhat.com, architecture=x86_64, container_name=metrics_qdr, batch=17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 qdrouterd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, build-date=2025-11-18T22:49:46Z, config_id=tripleo_step1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.buildah.version=1.41.4)
Dec 15 03:08:26 localhost systemd[1]: 6278d648aec9c9f12e0efaafa0d6ae5653eeb34459516699e562eeb9c8d23b3c.service: Deactivated successfully.
Dec 15 03:08:57 localhost systemd[1]: Started /usr/bin/podman healthcheck run 6278d648aec9c9f12e0efaafa0d6ae5653eeb34459516699e562eeb9c8d23b3c.
Dec 15 03:08:57 localhost podman[59176]: 2025-12-15 08:08:57.743048828 +0000 UTC m=+0.075278082 container health_status 6278d648aec9c9f12e0efaafa0d6ae5653eeb34459516699e562eeb9c8d23b3c (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, io.buildah.version=1.41.4, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, konflux.additional-tags=17.1.12 17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, batch=17.1_20251118.1, name=rhosp17/openstack-qdrouterd, com.redhat.component=openstack-qdrouterd-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=metrics_qdr, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, tcib_managed=true, release=1761123044, vcs-type=git, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '29d07da103bbd44a9ed3e29999314b03'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', 
'/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, summary=Red Hat OpenStack Platform 17.1 qdrouterd, version=17.1.12, url=https://www.redhat.com, distribution-scope=public, config_id=tripleo_step1, build-date=2025-11-18T22:49:46Z, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 qdrouterd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05) Dec 15 03:08:57 localhost podman[59176]: 2025-12-15 08:08:57.940782643 +0000 UTC m=+0.273011957 container exec_died 6278d648aec9c9f12e0efaafa0d6ae5653eeb34459516699e562eeb9c8d23b3c (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, url=https://www.redhat.com, name=rhosp17/openstack-qdrouterd, release=1761123044, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.expose-services=, build-date=2025-11-18T22:49:46Z, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '29d07da103bbd44a9ed3e29999314b03'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, distribution-scope=public, version=17.1.12, summary=Red Hat OpenStack Platform 17.1 qdrouterd, tcib_managed=true, batch=17.1_20251118.1, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, container_name=metrics_qdr, vendor=Red Hat, Inc., config_id=tripleo_step1, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-qdrouterd-container, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, io.buildah.version=1.41.4) Dec 15 03:08:57 localhost systemd[1]: 6278d648aec9c9f12e0efaafa0d6ae5653eeb34459516699e562eeb9c8d23b3c.service: Deactivated successfully. Dec 15 03:09:02 localhost sshd[59206]: main: sshd: ssh-rsa algorithm is disabled Dec 15 03:09:28 localhost systemd[1]: Started /usr/bin/podman healthcheck run 6278d648aec9c9f12e0efaafa0d6ae5653eeb34459516699e562eeb9c8d23b3c. 
Dec 15 03:09:28 localhost podman[59284]: 2025-12-15 08:09:28.767574783 +0000 UTC m=+0.101186871 container health_status 6278d648aec9c9f12e0efaafa0d6ae5653eeb34459516699e562eeb9c8d23b3c (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, container_name=metrics_qdr, summary=Red Hat OpenStack Platform 17.1 qdrouterd, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, distribution-scope=public, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, io.openshift.expose-services=, name=rhosp17/openstack-qdrouterd, url=https://www.redhat.com, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, build-date=2025-11-18T22:49:46Z, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '29d07da103bbd44a9ed3e29999314b03'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, release=1761123044, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, config_id=tripleo_step1, io.buildah.version=1.41.4, vcs-type=git, version=17.1.12, batch=17.1_20251118.1, com.redhat.component=openstack-qdrouterd-container) Dec 15 03:09:28 localhost podman[59284]: 2025-12-15 08:09:28.965097983 +0000 UTC m=+0.298710031 container exec_died 6278d648aec9c9f12e0efaafa0d6ae5653eeb34459516699e562eeb9c8d23b3c (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, name=rhosp17/openstack-qdrouterd, io.buildah.version=1.41.4, tcib_managed=true, batch=17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 qdrouterd, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, config_id=tripleo_step1, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '29d07da103bbd44a9ed3e29999314b03'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, release=1761123044, io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1, build-date=2025-11-18T22:49:46Z, vendor=Red Hat, Inc., vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, url=https://www.redhat.com, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.component=openstack-qdrouterd-container, version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, container_name=metrics_qdr, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, distribution-scope=public, managed_by=tripleo_ansible, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 qdrouterd, architecture=x86_64) Dec 15 03:09:28 localhost systemd[1]: 6278d648aec9c9f12e0efaafa0d6ae5653eeb34459516699e562eeb9c8d23b3c.service: Deactivated successfully. Dec 15 03:09:59 localhost systemd[1]: Started /usr/bin/podman healthcheck run 6278d648aec9c9f12e0efaafa0d6ae5653eeb34459516699e562eeb9c8d23b3c. 
Dec 15 03:09:59 localhost podman[59315]: 2025-12-15 08:09:59.746907406 +0000 UTC m=+0.078197810 container health_status 6278d648aec9c9f12e0efaafa0d6ae5653eeb34459516699e562eeb9c8d23b3c (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, vendor=Red Hat, Inc., io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, release=1761123044, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=metrics_qdr, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, build-date=2025-11-18T22:49:46Z, vcs-type=git, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, distribution-scope=public, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.buildah.version=1.41.4, description=Red Hat OpenStack Platform 17.1 qdrouterd, tcib_managed=true, com.redhat.component=openstack-qdrouterd-container, summary=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '29d07da103bbd44a9ed3e29999314b03'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', 
'/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, version=17.1.12, konflux.additional-tags=17.1.12 17.1_20251118.1, batch=17.1_20251118.1, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, name=rhosp17/openstack-qdrouterd, config_id=tripleo_step1) Dec 15 03:09:59 localhost podman[59315]: 2025-12-15 08:09:59.958403688 +0000 UTC m=+0.289694052 container exec_died 6278d648aec9c9f12e0efaafa0d6ae5653eeb34459516699e562eeb9c8d23b3c (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, managed_by=tripleo_ansible, release=1761123044, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '29d07da103bbd44a9ed3e29999314b03'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, io.buildah.version=1.41.4, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, vendor=Red Hat, Inc., architecture=x86_64, config_id=tripleo_step1, container_name=metrics_qdr, name=rhosp17/openstack-qdrouterd, batch=17.1_20251118.1, vcs-type=git, com.redhat.component=openstack-qdrouterd-container, summary=Red Hat OpenStack Platform 17.1 qdrouterd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, description=Red Hat OpenStack Platform 17.1 qdrouterd, tcib_managed=true, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, version=17.1.12, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, build-date=2025-11-18T22:49:46Z, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd) Dec 15 03:10:00 localhost systemd[1]: 6278d648aec9c9f12e0efaafa0d6ae5653eeb34459516699e562eeb9c8d23b3c.service: Deactivated successfully. 
Dec 15 03:10:14 localhost sshd[59344]: main: sshd: ssh-rsa algorithm is disabled Dec 15 03:10:16 localhost python3[59393]: ansible-ansible.legacy.stat Invoked with path=/etc/puppet/hieradata/config_step.json follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True Dec 15 03:10:16 localhost python3[59438]: ansible-ansible.legacy.copy Invoked with dest=/etc/puppet/hieradata/config_step.json force=True mode=0600 src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1765786215.7564614-98122-168860866243036/source _original_basename=tmpm1t2t7qj follow=False checksum=62439dd24dde40c90e7a39f6a1b31cc6061fe59b backup=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Dec 15 03:10:17 localhost python3[59468]: ansible-stat Invoked with path=/var/lib/tripleo-config/container-startup-config/step_3 follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1 Dec 15 03:10:19 localhost ansible-async_wrapper.py[59640]: Invoked with 479494144978 3600 /home/tripleo-admin/.ansible/tmp/ansible-tmp-1765786218.586098-98283-133481435682823/AnsiballZ_command.py _ Dec 15 03:10:19 localhost ansible-async_wrapper.py[59643]: Starting module and watcher Dec 15 03:10:19 localhost ansible-async_wrapper.py[59643]: Start watching 59644 (3600) Dec 15 03:10:19 localhost ansible-async_wrapper.py[59644]: Start module (59644) Dec 15 03:10:19 localhost ansible-async_wrapper.py[59640]: Return async_wrapper task started. Dec 15 03:10:19 localhost python3[59664]: ansible-ansible.legacy.async_status Invoked with jid=479494144978.59640 mode=status _async_dir=/tmp/.ansible_async Dec 15 03:10:22 localhost puppet-user[59657]: Warning: /etc/puppet/hiera.yaml: Use of 'hiera.yaml' version 3 is deprecated. 
It should be converted to version 5 Dec 15 03:10:22 localhost puppet-user[59657]: (file: /etc/puppet/hiera.yaml) Dec 15 03:10:22 localhost puppet-user[59657]: Warning: Undefined variable '::deploy_config_name'; Dec 15 03:10:22 localhost puppet-user[59657]: (file & line not available) Dec 15 03:10:22 localhost puppet-user[59657]: Warning: The function 'hiera' is deprecated in favor of using 'lookup'. See https://puppet.com/docs/puppet/7.10/deprecated_language.html Dec 15 03:10:22 localhost puppet-user[59657]: (file & line not available) Dec 15 03:10:22 localhost puppet-user[59657]: Warning: Unknown variable: '::deployment_type'. (file: /etc/puppet/modules/tripleo/manifests/profile/base/database/mysql/client.pp, line: 89, column: 8) Dec 15 03:10:22 localhost puppet-user[59657]: Warning: Unknown variable: '::deployment_type'. (file: /etc/puppet/modules/tripleo/manifests/packages.pp, line: 39, column: 69) Dec 15 03:10:22 localhost puppet-user[59657]: Notice: Compiled catalog for np0005559462.localdomain in environment production in 0.11 seconds Dec 15 03:10:22 localhost puppet-user[59657]: Notice: Applied catalog in 0.04 seconds Dec 15 03:10:22 localhost puppet-user[59657]: Application: Dec 15 03:10:22 localhost puppet-user[59657]: Initial environment: production Dec 15 03:10:22 localhost puppet-user[59657]: Converged environment: production Dec 15 03:10:22 localhost puppet-user[59657]: Run mode: user Dec 15 03:10:22 localhost puppet-user[59657]: Changes: Dec 15 03:10:22 localhost puppet-user[59657]: Events: Dec 15 03:10:22 localhost puppet-user[59657]: Resources: Dec 15 03:10:22 localhost puppet-user[59657]: Total: 10 Dec 15 03:10:22 localhost puppet-user[59657]: Time: Dec 15 03:10:22 localhost puppet-user[59657]: Schedule: 0.00 Dec 15 03:10:22 localhost puppet-user[59657]: File: 0.00 Dec 15 03:10:22 localhost puppet-user[59657]: Exec: 0.01 Dec 15 03:10:22 localhost puppet-user[59657]: Augeas: 0.01 Dec 15 03:10:22 localhost puppet-user[59657]: Transaction evaluation: 
0.03 Dec 15 03:10:22 localhost puppet-user[59657]: Catalog application: 0.04 Dec 15 03:10:22 localhost puppet-user[59657]: Config retrieval: 0.14 Dec 15 03:10:22 localhost puppet-user[59657]: Last run: 1765786222 Dec 15 03:10:22 localhost puppet-user[59657]: Filebucket: 0.00 Dec 15 03:10:22 localhost puppet-user[59657]: Total: 0.04 Dec 15 03:10:22 localhost puppet-user[59657]: Version: Dec 15 03:10:22 localhost puppet-user[59657]: Config: 1765786222 Dec 15 03:10:22 localhost puppet-user[59657]: Puppet: 7.10.0 Dec 15 03:10:23 localhost ansible-async_wrapper.py[59644]: Module complete (59644) Dec 15 03:10:24 localhost ansible-async_wrapper.py[59643]: Done in kid B. Dec 15 03:10:29 localhost python3[59867]: ansible-ansible.legacy.async_status Invoked with jid=479494144978.59640 mode=status _async_dir=/tmp/.ansible_async Dec 15 03:10:30 localhost systemd[1]: Started /usr/bin/podman healthcheck run 6278d648aec9c9f12e0efaafa0d6ae5653eeb34459516699e562eeb9c8d23b3c. Dec 15 03:10:30 localhost systemd[1]: tmp-crun.VxNwUw.mount: Deactivated successfully. 
Dec 15 03:10:30 localhost podman[59884]: 2025-12-15 08:10:30.59756013 +0000 UTC m=+0.108260219 container health_status 6278d648aec9c9f12e0efaafa0d6ae5653eeb34459516699e562eeb9c8d23b3c (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, vendor=Red Hat, Inc., io.buildah.version=1.41.4, com.redhat.component=openstack-qdrouterd-container, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, tcib_managed=true, batch=17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, managed_by=tripleo_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '29d07da103bbd44a9ed3e29999314b03'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, name=rhosp17/openstack-qdrouterd, config_id=tripleo_step1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, summary=Red Hat OpenStack Platform 17.1 qdrouterd, konflux.additional-tags=17.1.12 17.1_20251118.1, version=17.1.12, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=metrics_qdr, distribution-scope=public, release=1761123044, build-date=2025-11-18T22:49:46Z) Dec 15 03:10:30 localhost python3[59883]: ansible-file Invoked with path=/var/lib/container-puppet/puppetlabs state=directory setype=svirt_sandbox_file_t selevel=s0 recurse=True force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None attributes=None Dec 15 03:10:30 localhost podman[59884]: 2025-12-15 08:10:30.805672531 +0000 UTC m=+0.316372600 container exec_died 6278d648aec9c9f12e0efaafa0d6ae5653eeb34459516699e562eeb9c8d23b3c (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, summary=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step1, konflux.additional-tags=17.1.12 17.1_20251118.1, io.buildah.version=1.41.4, distribution-scope=public, managed_by=tripleo_ansible, batch=17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, name=rhosp17/openstack-qdrouterd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.component=openstack-qdrouterd-container, url=https://www.redhat.com, 
vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, release=1761123044, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, build-date=2025-11-18T22:49:46Z, io.openshift.expose-services=, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, version=17.1.12, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, vcs-type=git, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '29d07da103bbd44a9ed3e29999314b03'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, architecture=x86_64, container_name=metrics_qdr, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 qdrouterd, tcib_managed=true) Dec 15 03:10:30 localhost systemd[1]: 6278d648aec9c9f12e0efaafa0d6ae5653eeb34459516699e562eeb9c8d23b3c.service: Deactivated successfully. 
Dec 15 03:10:31 localhost python3[59928]: ansible-stat Invoked with path=/var/lib/container-puppet/puppetlabs/facter.conf follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1 Dec 15 03:10:31 localhost python3[59978]: ansible-ansible.legacy.stat Invoked with path=/var/lib/container-puppet/puppetlabs/facter.conf follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True Dec 15 03:10:31 localhost python3[59996]: ansible-ansible.legacy.file Invoked with setype=svirt_sandbox_file_t selevel=s0 dest=/var/lib/container-puppet/puppetlabs/facter.conf _original_basename=tmpx4h20poj recurse=False state=file path=/var/lib/container-puppet/puppetlabs/facter.conf force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None attributes=None Dec 15 03:10:32 localhost python3[60026]: ansible-file Invoked with path=/opt/puppetlabs/facter state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Dec 15 03:10:33 localhost python3[60129]: ansible-ansible.posix.synchronize Invoked with src=/opt/puppetlabs/ dest=/var/lib/container-puppet/puppetlabs/ _local_rsync_path=rsync _local_rsync_password=NOT_LOGGING_PARAMETER rsync_path=None delete=False _substitute_controller=False archive=True checksum=False compress=True existing_only=False dirs=False copy_links=False set_remote_user=True rsync_timeout=0 rsync_opts=[] ssh_connection_multiplexing=False partial=False verify_host=False mode=push dest_port=None private_key=None recursive=None links=None perms=None 
times=None owner=None group=None ssh_args=None link_dest=None Dec 15 03:10:34 localhost python3[60148]: ansible-file Invoked with path=/var/log/containers/stdouts state=directory owner=root group=root recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None setype=None attributes=None Dec 15 03:10:35 localhost python3[60180]: ansible-stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1 Dec 15 03:10:36 localhost python3[60230]: ansible-ansible.legacy.stat Invoked with path=/usr/libexec/tripleo-container-shutdown follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True Dec 15 03:10:36 localhost python3[60248]: ansible-ansible.legacy.file Invoked with mode=0700 owner=root group=root dest=/usr/libexec/tripleo-container-shutdown _original_basename=tripleo-container-shutdown recurse=False state=file path=/usr/libexec/tripleo-container-shutdown force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None Dec 15 03:10:36 localhost python3[60310]: ansible-ansible.legacy.stat Invoked with path=/usr/libexec/tripleo-start-podman-container follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True Dec 15 03:10:37 localhost python3[60328]: ansible-ansible.legacy.file Invoked with mode=0700 owner=root group=root dest=/usr/libexec/tripleo-start-podman-container _original_basename=tripleo-start-podman-container recurse=False state=file path=/usr/libexec/tripleo-start-podman-container force=False 
follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None Dec 15 03:10:37 localhost python3[60390]: ansible-ansible.legacy.stat Invoked with path=/usr/lib/systemd/system/tripleo-container-shutdown.service follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True Dec 15 03:10:37 localhost python3[60408]: ansible-ansible.legacy.file Invoked with mode=0644 owner=root group=root dest=/usr/lib/systemd/system/tripleo-container-shutdown.service _original_basename=tripleo-container-shutdown-service recurse=False state=file path=/usr/lib/systemd/system/tripleo-container-shutdown.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None Dec 15 03:10:38 localhost python3[60470]: ansible-ansible.legacy.stat Invoked with path=/usr/lib/systemd/system-preset/91-tripleo-container-shutdown.preset follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True Dec 15 03:10:38 localhost python3[60488]: ansible-ansible.legacy.file Invoked with mode=0644 owner=root group=root dest=/usr/lib/systemd/system-preset/91-tripleo-container-shutdown.preset _original_basename=91-tripleo-container-shutdown-preset recurse=False state=file path=/usr/lib/systemd/system-preset/91-tripleo-container-shutdown.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None Dec 15 03:10:39 localhost python3[60518]: ansible-systemd Invoked with 
name=tripleo-container-shutdown state=started enabled=True daemon_reload=True daemon_reexec=False scope=system no_block=False force=None masked=None Dec 15 03:10:39 localhost systemd[1]: Reloading. Dec 15 03:10:39 localhost systemd-sysv-generator[60546]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. Dec 15 03:10:39 localhost systemd-rc-local-generator[60542]: /etc/rc.d/rc.local is not marked executable, skipping. Dec 15 03:10:39 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Dec 15 03:10:40 localhost python3[60604]: ansible-ansible.legacy.stat Invoked with path=/usr/lib/systemd/system/netns-placeholder.service follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True Dec 15 03:10:40 localhost python3[60622]: ansible-ansible.legacy.file Invoked with mode=0644 owner=root group=root dest=/usr/lib/systemd/system/netns-placeholder.service _original_basename=netns-placeholder-service recurse=False state=file path=/usr/lib/systemd/system/netns-placeholder.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None Dec 15 03:10:40 localhost python3[60684]: ansible-ansible.legacy.stat Invoked with path=/usr/lib/systemd/system-preset/91-netns-placeholder.preset follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True Dec 15 03:10:41 localhost python3[60702]: ansible-ansible.legacy.file Invoked with mode=0644 owner=root group=root 
dest=/usr/lib/systemd/system-preset/91-netns-placeholder.preset _original_basename=91-netns-placeholder-preset recurse=False state=file path=/usr/lib/systemd/system-preset/91-netns-placeholder.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None Dec 15 03:10:41 localhost python3[60732]: ansible-systemd Invoked with name=netns-placeholder state=started enabled=True daemon_reload=True daemon_reexec=False scope=system no_block=False force=None masked=None Dec 15 03:10:41 localhost systemd[1]: Reloading. Dec 15 03:10:41 localhost systemd-rc-local-generator[60756]: /etc/rc.d/rc.local is not marked executable, skipping. Dec 15 03:10:41 localhost systemd-sysv-generator[60759]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. Dec 15 03:10:41 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Dec 15 03:10:42 localhost systemd[1]: Starting Create netns directory... Dec 15 03:10:42 localhost systemd[1]: run-netns-placeholder.mount: Deactivated successfully. Dec 15 03:10:42 localhost systemd[1]: netns-placeholder.service: Deactivated successfully. Dec 15 03:10:42 localhost systemd[1]: Finished Create netns directory. 
Dec 15 03:10:42 localhost python3[60790]: ansible-container_puppet_config Invoked with update_config_hash_only=True no_archive=True check_mode=False config_vol_prefix=/var/lib/config-data debug=False net_host=True puppet_config= short_hostname= step=6 Dec 15 03:10:44 localhost python3[60846]: ansible-tripleo_container_manage Invoked with config_id=tripleo_step3 config_dir=/var/lib/tripleo-config/container-startup-config/step_3 config_patterns=*.json config_overrides={} concurrency=5 log_base_path=/var/log/containers/stdouts debug=False Dec 15 03:10:45 localhost podman[61002]: 2025-12-15 08:10:45.230821009 +0000 UTC m=+0.076797192 container create 52123f040652e4c049640fa84ee13fbb7e46e8e9c52436fdd815aa5c35fd7af7 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtlogd_wrapper, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 nova-libvirt, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-nova-libvirt-container, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, container_name=nova_virtlogd_wrapper, vcs-type=git, tcib_managed=true, name=rhosp17/openstack-nova-libvirt, version=17.1.12, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, url=https://www.redhat.com, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.buildah.version=1.41.4, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '879500e96bf8dfb93687004bd86f2317'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 
'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 0, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', '/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/var/lib/kolla/config_files/nova_virtlogd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/container-config-scripts/virtlogd_wrapper:/usr/local/bin/virtlogd_wrapper:ro']}, config_id=tripleo_step3, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, batch=17.1_20251118.1, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, release=1761123044, build-date=2025-11-19T00:35:22Z, managed_by=tripleo_ansible) Dec 15 03:10:45 localhost podman[61003]: 2025-12-15 08:10:45.257855648 +0000 UTC m=+0.100553314 container create 
85aa4d76ae5b313025fcf4e8088557251bf656b26e1a33d194b78bcba6781723 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_statedir_owner, com.redhat.component=openstack-nova-compute-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, url=https://www.redhat.com, io.buildah.version=1.41.4, vendor=Red Hat, Inc., distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp17/openstack-nova-compute, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, tcib_managed=true, batch=17.1_20251118.1, config_data={'command': '/container-config-scripts/pyshim.sh /container-config-scripts/nova_statedir_ownership.py', 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': 'triliovault-mounts', 'TRIPLEO_DEPLOY_IDENTIFIER': '1765784752', '__OS_DEBUG': 'true'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'none', 'privileged': False, 'security_opt': ['label=disable'], 'user': 'root', 'volumes': ['/var/lib/nova:/var/lib/nova:shared', '/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', '/var/lib/container-config-scripts:/container-config-scripts:z']}, build-date=2025-11-19T00:36:58Z, vcs-type=git, konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, release=1761123044, version=17.1.12, config_id=tripleo_step3, container_name=nova_statedir_owner, 
maintainer=OpenStack TripleO Team) Dec 15 03:10:45 localhost systemd[1]: Started libpod-conmon-52123f040652e4c049640fa84ee13fbb7e46e8e9c52436fdd815aa5c35fd7af7.scope. Dec 15 03:10:45 localhost podman[61042]: 2025-12-15 08:10:45.289894869 +0000 UTC m=+0.086260973 container create 165a6fd1f2b629d16da0798ec1c9bac41d302d530a0ffa668b2315a52d6aa04e (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, io.buildah.version=1.41.4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, 
io.openshift.expose-services=, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, konflux.additional-tags=17.1.12 17.1_20251118.1, vendor=Red Hat, Inc., container_name=collectd, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 collectd, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 collectd, build-date=2025-11-18T22:51:28Z, com.redhat.component=openstack-collectd-container, name=rhosp17/openstack-collectd, distribution-scope=public, vcs-type=git, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, release=1761123044, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, version=17.1.12, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20251118.1, managed_by=tripleo_ansible, config_id=tripleo_step3, architecture=x86_64) Dec 15 03:10:45 localhost systemd[1]: Started libcrun container. Dec 15 03:10:45 localhost podman[61002]: 2025-12-15 08:10:45.193740433 +0000 UTC m=+0.039716646 image pull registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1 Dec 15 03:10:45 localhost systemd[1]: Started libpod-conmon-85aa4d76ae5b313025fcf4e8088557251bf656b26e1a33d194b78bcba6781723.scope. 
Dec 15 03:10:45 localhost podman[61003]: 2025-12-15 08:10:45.196746273 +0000 UTC m=+0.039443959 image pull registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1 Dec 15 03:10:45 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/7838c919c03fee0f47a1131073f9538ccbf08471ed4a61eebb0a717da3a55c5c/merged/etc/libvirt supports timestamps until 2038 (0x7fffffff) Dec 15 03:10:45 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/7838c919c03fee0f47a1131073f9538ccbf08471ed4a61eebb0a717da3a55c5c/merged/var/lib/libvirt supports timestamps until 2038 (0x7fffffff) Dec 15 03:10:45 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/7838c919c03fee0f47a1131073f9538ccbf08471ed4a61eebb0a717da3a55c5c/merged/var/lib/vhost_sockets supports timestamps until 2038 (0x7fffffff) Dec 15 03:10:45 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/7838c919c03fee0f47a1131073f9538ccbf08471ed4a61eebb0a717da3a55c5c/merged/var/lib/nova supports timestamps until 2038 (0x7fffffff) Dec 15 03:10:45 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/7838c919c03fee0f47a1131073f9538ccbf08471ed4a61eebb0a717da3a55c5c/merged/var/cache/libvirt supports timestamps until 2038 (0x7fffffff) Dec 15 03:10:45 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/7838c919c03fee0f47a1131073f9538ccbf08471ed4a61eebb0a717da3a55c5c/merged/var/log/libvirt supports timestamps until 2038 (0x7fffffff) Dec 15 03:10:45 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/7838c919c03fee0f47a1131073f9538ccbf08471ed4a61eebb0a717da3a55c5c/merged/var/lib/kolla/config_files/src-ceph supports timestamps until 2038 (0x7fffffff) Dec 15 03:10:45 localhost podman[61020]: 2025-12-15 08:10:45.209262826 +0000 UTC m=+0.037986451 image pull 
registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1 Dec 15 03:10:45 localhost systemd[1]: Started libcrun container. Dec 15 03:10:45 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/b283fbdbefaea7e7733bc01aa84718fe19ad455a23e2ee1af5ac07ede0952e1b/merged/container-config-scripts supports timestamps until 2038 (0x7fffffff) Dec 15 03:10:45 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/b283fbdbefaea7e7733bc01aa84718fe19ad455a23e2ee1af5ac07ede0952e1b/merged/var/lib/_nova_secontext supports timestamps until 2038 (0x7fffffff) Dec 15 03:10:45 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/b283fbdbefaea7e7733bc01aa84718fe19ad455a23e2ee1af5ac07ede0952e1b/merged/var/lib/nova supports timestamps until 2038 (0x7fffffff) Dec 15 03:10:45 localhost systemd[1]: Started libpod-conmon-165a6fd1f2b629d16da0798ec1c9bac41d302d530a0ffa668b2315a52d6aa04e.scope. Dec 15 03:10:45 localhost podman[61037]: 2025-12-15 08:10:45.223908755 +0000 UTC m=+0.035154475 image pull registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1 Dec 15 03:10:45 localhost podman[61037]: 2025-12-15 08:10:45.327342705 +0000 UTC m=+0.138588405 container create 0e38c5f94ae7971c602d36252a3143b35e3d7e9f0725bde65bf045185c61e4e9 (image=registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1, name=rsyslog, description=Red Hat OpenStack Platform 17.1 rsyslog, url=https://www.redhat.com, tcib_managed=true, distribution-scope=public, batch=17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-11-18T22:49:49Z, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vcs-type=git, io.openshift.expose-services=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 
'605429f322a7b034ef9794ac46c40b29'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'security_opt': ['label=disable'], 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/rsyslog.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/rsyslog:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:ro', '/var/log/containers/rsyslog:/var/log/rsyslog:rw,z', '/var/log:/var/log/host:ro', '/var/lib/rsyslog.container:/var/lib/rsyslog:rw,z']}, maintainer=OpenStack TripleO Team, version=17.1.12, konflux.additional-tags=17.1.12 17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 rsyslog, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-rsyslog, com.redhat.component=openstack-rsyslog-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 rsyslog, architecture=x86_64, config_id=tripleo_step3, managed_by=tripleo_ansible, release=1761123044, container_name=rsyslog, name=rhosp17/openstack-rsyslog, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.description=Red Hat OpenStack Platform 17.1 rsyslog, vendor=Red Hat, Inc., io.buildah.version=1.41.4) Dec 15 03:10:45 localhost systemd[1]: Started libcrun container. 
Dec 15 03:10:45 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/3cbc8d0b1fd940058dd16189a8a0f2adea168e1862db457c5612a248dfd3b9d7/merged/scripts supports timestamps until 2038 (0x7fffffff) Dec 15 03:10:45 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/3cbc8d0b1fd940058dd16189a8a0f2adea168e1862db457c5612a248dfd3b9d7/merged/var/log/collectd supports timestamps until 2038 (0x7fffffff) Dec 15 03:10:45 localhost podman[61042]: 2025-12-15 08:10:45.24179693 +0000 UTC m=+0.038163034 image pull registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1 Dec 15 03:10:45 localhost podman[61002]: 2025-12-15 08:10:45.354054724 +0000 UTC m=+0.200030927 container init 52123f040652e4c049640fa84ee13fbb7e46e8e9c52436fdd815aa5c35fd7af7 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtlogd_wrapper, release=1761123044, name=rhosp17/openstack-nova-libvirt, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, config_id=tripleo_step3, architecture=x86_64, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.component=openstack-nova-libvirt-container, tcib_managed=true, build-date=2025-11-19T00:35:22Z, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '879500e96bf8dfb93687004bd86f2317'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 0, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', '/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/var/lib/kolla/config_files/nova_virtlogd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/container-config-scripts/virtlogd_wrapper:/usr/local/bin/virtlogd_wrapper:ro']}, version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.buildah.version=1.41.4, batch=17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, url=https://www.redhat.com, vcs-type=git, container_name=nova_virtlogd_wrapper, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 nova-libvirt, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, distribution-scope=public) Dec 15 03:10:45 localhost 
podman[61020]: 2025-12-15 08:10:45.355916744 +0000 UTC m=+0.184640359 container create 0e27ec486e781641265a9df8382ef1c718ab9fc4a19de8332d0d9a5e7c91faf0 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_init_log, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, managed_by=tripleo_ansible, konflux.additional-tags=17.1.12 17.1_20251118.1, release=1761123044, config_id=tripleo_step3, batch=17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-19T00:12:45Z, vcs-type=git, url=https://www.redhat.com, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, name=rhosp17/openstack-ceilometer-ipmi, container_name=ceilometer_init_log, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, distribution-scope=public, io.openshift.expose-services=, tcib_managed=true, version=17.1.12, architecture=x86_64, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_data={'command': ['/bin/bash', '-c', 'chown -R ceilometer:ceilometer /var/log/ceilometer'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'none', 'start_order': 0, 'user': 'root', 'volumes': ['/var/log/containers/ceilometer:/var/log/ceilometer:z']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-ceilometer-ipmi-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.buildah.version=1.41.4, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi) Dec 15 03:10:45 localhost podman[61002]: 2025-12-15 08:10:45.363165306 +0000 UTC m=+0.209141509 container start 52123f040652e4c049640fa84ee13fbb7e46e8e9c52436fdd815aa5c35fd7af7 
(image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtlogd_wrapper, batch=17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, description=Red Hat OpenStack Platform 17.1 nova-libvirt, com.redhat.component=openstack-nova-libvirt-container, config_id=tripleo_step3, distribution-scope=public, container_name=nova_virtlogd_wrapper, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '879500e96bf8dfb93687004bd86f2317'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 0, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', '/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', 
'/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/var/lib/kolla/config_files/nova_virtlogd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/container-config-scripts/virtlogd_wrapper:/usr/local/bin/virtlogd_wrapper:ro']}, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., managed_by=tripleo_ansible, version=17.1.12, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, name=rhosp17/openstack-nova-libvirt, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.buildah.version=1.41.4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, release=1761123044, konflux.additional-tags=17.1.12 17.1_20251118.1, url=https://www.redhat.com, io.openshift.expose-services=, tcib_managed=true, architecture=x86_64, build-date=2025-11-19T00:35:22Z) Dec 15 03:10:45 localhost python3[60846]: ansible-tripleo_container_manage PODMAN-CONTAINER-DEBUG: podman run --name nova_virtlogd_wrapper --cgroupns=host --conmon-pidfile /run/nova_virtlogd_wrapper.pid --detach=True --env KOLLA_CONFIG_STRATEGY=COPY_ALWAYS --env TRIPLEO_CONFIG_HASH=879500e96bf8dfb93687004bd86f2317 --label config_id=tripleo_step3 --label container_name=nova_virtlogd_wrapper --label managed_by=tripleo_ansible --label config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '879500e96bf8dfb93687004bd86f2317'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 
'label=filetype:container_file_t'], 'start_order': 0, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', '/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/var/lib/kolla/config_files/nova_virtlogd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/container-config-scripts/virtlogd_wrapper:/usr/local/bin/virtlogd_wrapper:ro']} --log-driver k8s-file --log-opt path=/var/log/containers/stdouts/nova_virtlogd_wrapper.log --network host --pid host --privileged=True --security-opt label=level:s0 --security-opt label=type:spc_t --security-opt label=filetype:container_file_t --ulimit nofile=131072 --ulimit nproc=126960 --volume /etc/hosts:/etc/hosts:ro --volume /etc/localtime:/etc/localtime:ro --volume /etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro --volume /etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro --volume 
/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro --volume /etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro --volume /etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro --volume /dev/log:/dev/log --volume /etc/puppet:/etc/puppet:ro --volume /var/log/containers/libvirt:/var/log/libvirt:shared,z --volume /lib/modules:/lib/modules:ro --volume /dev:/dev --volume /run:/run --volume /sys/fs/cgroup:/sys/fs/cgroup --volume /sys/fs/selinux:/sys/fs/selinux --volume /etc/selinux/config:/etc/selinux/config:ro --volume /etc/libvirt:/etc/libvirt:shared --volume /etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro --volume /run/libvirt:/run/libvirt:shared,z --volume /var/lib/nova:/var/lib/nova:shared --volume /var/lib/libvirt:/var/lib/libvirt:shared --volume /var/cache/libvirt:/var/cache/libvirt:shared --volume /var/lib/vhost_sockets:/var/lib/vhost_sockets --volume /var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro --volume /var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z --volume /var/lib/kolla/config_files/nova_virtlogd.json:/var/lib/kolla/config_files/config.json:ro --volume /var/lib/container-config-scripts/virtlogd_wrapper:/usr/local/bin/virtlogd_wrapper:ro registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1 Dec 15 03:10:45 localhost podman[61003]: 2025-12-15 08:10:45.37648483 +0000 UTC m=+0.219182496 container init 85aa4d76ae5b313025fcf4e8088557251bf656b26e1a33d194b78bcba6781723 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_statedir_owner, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, summary=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, release=1761123044, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.12, architecture=x86_64, managed_by=tripleo_ansible, 
config_id=tripleo_step3, container_name=nova_statedir_owner, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-type=git, com.redhat.component=openstack-nova-compute-container, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, url=https://www.redhat.com, io.openshift.expose-services=, build-date=2025-11-19T00:36:58Z, name=rhosp17/openstack-nova-compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vendor=Red Hat, Inc., config_data={'command': '/container-config-scripts/pyshim.sh /container-config-scripts/nova_statedir_ownership.py', 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': 'triliovault-mounts', 'TRIPLEO_DEPLOY_IDENTIFIER': '1765784752', '__OS_DEBUG': 'true'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'none', 'privileged': False, 'security_opt': ['label=disable'], 'user': 'root', 'volumes': ['/var/lib/nova:/var/lib/nova:shared', '/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', '/var/lib/container-config-scripts:/container-config-scripts:z']}, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4) Dec 15 03:10:45 localhost systemd[1]: Started libpod-conmon-0e38c5f94ae7971c602d36252a3143b35e3d7e9f0725bde65bf045185c61e4e9.scope. 
Dec 15 03:10:45 localhost podman[61003]: 2025-12-15 08:10:45.386435855 +0000 UTC m=+0.229133521 container start 85aa4d76ae5b313025fcf4e8088557251bf656b26e1a33d194b78bcba6781723 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_statedir_owner, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, release=1761123044, com.redhat.component=openstack-nova-compute-container, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, url=https://www.redhat.com, container_name=nova_statedir_owner, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 nova-compute, distribution-scope=public, name=rhosp17/openstack-nova-compute, vcs-type=git, konflux.additional-tags=17.1.12 17.1_20251118.1, io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, config_data={'command': '/container-config-scripts/pyshim.sh /container-config-scripts/nova_statedir_ownership.py', 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': 'triliovault-mounts', 'TRIPLEO_DEPLOY_IDENTIFIER': '1765784752', '__OS_DEBUG': 'true'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'none', 'privileged': False, 'security_opt': ['label=disable'], 'user': 'root', 'volumes': ['/var/lib/nova:/var/lib/nova:shared', '/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', '/var/lib/container-config-scripts:/container-config-scripts:z']}, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, batch=17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, maintainer=OpenStack TripleO Team, config_id=tripleo_step3, managed_by=tripleo_ansible, version=17.1.12, 
build-date=2025-11-19T00:36:58Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64) Dec 15 03:10:45 localhost podman[61003]: 2025-12-15 08:10:45.388746487 +0000 UTC m=+0.231444313 container attach 85aa4d76ae5b313025fcf4e8088557251bf656b26e1a33d194b78bcba6781723 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_statedir_owner, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, architecture=x86_64, config_id=tripleo_step3, config_data={'command': '/container-config-scripts/pyshim.sh /container-config-scripts/nova_statedir_ownership.py', 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': 'triliovault-mounts', 'TRIPLEO_DEPLOY_IDENTIFIER': '1765784752', '__OS_DEBUG': 'true'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'none', 'privileged': False, 'security_opt': ['label=disable'], 'user': 'root', 'volumes': ['/var/lib/nova:/var/lib/nova:shared', '/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', '/var/lib/container-config-scripts:/container-config-scripts:z']}, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, release=1761123044, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, com.redhat.component=openstack-nova-compute-container, version=17.1.12, url=https://www.redhat.com, vendor=Red Hat, Inc., io.openshift.expose-services=, container_name=nova_statedir_owner, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, tcib_managed=true, name=rhosp17/openstack-nova-compute, 
konflux.additional-tags=17.1.12 17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2025-11-19T00:36:58Z, distribution-scope=public, batch=17.1_20251118.1, managed_by=tripleo_ansible) Dec 15 03:10:45 localhost systemd[1]: Started libcrun container. Dec 15 03:10:45 localhost systemd[1]: Started libpod-conmon-0e27ec486e781641265a9df8382ef1c718ab9fc4a19de8332d0d9a5e7c91faf0.scope. Dec 15 03:10:45 localhost systemd[1]: Started /usr/bin/podman healthcheck run 165a6fd1f2b629d16da0798ec1c9bac41d302d530a0ffa668b2315a52d6aa04e. Dec 15 03:10:45 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/5f937d9d464184dc9258be95e010d108efe08788960fe016cf6a07726f0a4d55/merged/var/lib/rsyslog supports timestamps until 2038 (0x7fffffff) Dec 15 03:10:45 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/5f937d9d464184dc9258be95e010d108efe08788960fe016cf6a07726f0a4d55/merged/var/log/rsyslog supports timestamps until 2038 (0x7fffffff) Dec 15 03:10:45 localhost podman[61042]: 2025-12-15 08:10:45.399817211 +0000 UTC m=+0.196183335 container init 165a6fd1f2b629d16da0798ec1c9bac41d302d530a0ffa668b2315a52d6aa04e (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, container_name=collectd, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, name=rhosp17/openstack-collectd, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, vcs-type=git, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, summary=Red Hat OpenStack Platform 17.1 collectd, tcib_managed=true, com.redhat.component=openstack-collectd-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, architecture=x86_64, batch=17.1_20251118.1, build-date=2025-11-18T22:51:28Z, io.buildah.version=1.41.4, konflux.additional-tags=17.1.12 17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 collectd, distribution-scope=public, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, version=17.1.12, config_id=tripleo_step3, release=1761123044, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., url=https://www.redhat.com, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd) Dec 15 03:10:45 localhost podman[61037]: 
2025-12-15 08:10:45.411137532 +0000 UTC m=+0.222383242 container init 0e38c5f94ae7971c602d36252a3143b35e3d7e9f0725bde65bf045185c61e4e9 (image=registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1, name=rsyslog, config_id=tripleo_step3, batch=17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 rsyslog, io.openshift.expose-services=, com.redhat.component=openstack-rsyslog-container, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-18T22:49:49Z, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 rsyslog, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '605429f322a7b034ef9794ac46c40b29'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'security_opt': ['label=disable'], 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/rsyslog.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/rsyslog:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:ro', '/var/log/containers/rsyslog:/var/log/rsyslog:rw,z', '/var/log:/var/log/host:ro', '/var/lib/rsyslog.container:/var/lib/rsyslog:rw,z']}, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-type=git, vendor=Red Hat, Inc., 
cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, managed_by=tripleo_ansible, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.k8s.display-name=Red Hat OpenStack Platform 17.1 rsyslog, io.buildah.version=1.41.4, container_name=rsyslog, name=rhosp17/openstack-rsyslog, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-rsyslog, distribution-scope=public, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 rsyslog, version=17.1.12, tcib_managed=true, release=1761123044) Dec 15 03:10:45 localhost systemd-logind[763]: Existing logind session ID 28 used by new audit session, ignoring. Dec 15 03:10:45 localhost podman[61037]: 2025-12-15 08:10:45.4208842 +0000 UTC m=+0.232129910 container start 0e38c5f94ae7971c602d36252a3143b35e3d7e9f0725bde65bf045185c61e4e9 (image=registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1, name=rsyslog, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '605429f322a7b034ef9794ac46c40b29'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'security_opt': ['label=disable'], 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/rsyslog.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/rsyslog:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:ro', '/var/log/containers/rsyslog:/var/log/rsyslog:rw,z', 
'/var/log:/var/log/host:ro', '/var/lib/rsyslog.container:/var/lib/rsyslog:rw,z']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 rsyslog, summary=Red Hat OpenStack Platform 17.1 rsyslog, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 rsyslog, vendor=Red Hat, Inc., build-date=2025-11-18T22:49:49Z, io.openshift.expose-services=, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-rsyslog-container, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 rsyslog, name=rhosp17/openstack-rsyslog, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.buildah.version=1.41.4, batch=17.1_20251118.1, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-rsyslog, distribution-scope=public, container_name=rsyslog, config_id=tripleo_step3, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, release=1761123044, tcib_managed=true, architecture=x86_64, version=17.1.12, vcs-type=git) Dec 15 03:10:45 localhost systemd[1]: Started libcrun container. 
Dec 15 03:10:45 localhost python3[60846]: ansible-tripleo_container_manage PODMAN-CONTAINER-DEBUG: podman run --name rsyslog --conmon-pidfile /run/rsyslog.pid --detach=True --env KOLLA_CONFIG_STRATEGY=COPY_ALWAYS --env TRIPLEO_CONFIG_HASH=605429f322a7b034ef9794ac46c40b29 --label config_id=tripleo_step3 --label container_name=rsyslog --label managed_by=tripleo_ansible --label config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '605429f322a7b034ef9794ac46c40b29'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'security_opt': ['label=disable'], 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/rsyslog.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/rsyslog:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:ro', '/var/log/containers/rsyslog:/var/log/rsyslog:rw,z', '/var/log:/var/log/host:ro', '/var/lib/rsyslog.container:/var/lib/rsyslog:rw,z']} --log-driver k8s-file --log-opt path=/var/log/containers/stdouts/rsyslog.log --network host --privileged=True --security-opt label=disable --user root --volume /etc/hosts:/etc/hosts:ro --volume /etc/localtime:/etc/localtime:ro --volume /etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro --volume /etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro --volume /etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro --volume 
/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro --volume /etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro --volume /dev/log:/dev/log --volume /etc/puppet:/etc/puppet:ro --volume /var/lib/kolla/config_files/rsyslog.json:/var/lib/kolla/config_files/config.json:ro --volume /var/lib/config-data/puppet-generated/rsyslog:/var/lib/kolla/config_files/src:ro --volume /var/log/containers:/var/log/containers:ro --volume /var/log/containers/rsyslog:/var/log/rsyslog:rw,z --volume /var/log:/var/log/host:ro --volume /var/lib/rsyslog.container:/var/lib/rsyslog:rw,z registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1 Dec 15 03:10:45 localhost systemd[1]: Started /usr/bin/podman healthcheck run 165a6fd1f2b629d16da0798ec1c9bac41d302d530a0ffa668b2315a52d6aa04e. Dec 15 03:10:45 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/53b8a95516d46c09ab3d6aa5613b1755b13426c834b6bc7a5ba27a227397c635/merged/var/log/ceilometer supports timestamps until 2038 (0x7fffffff) Dec 15 03:10:45 localhost systemd[1]: Created slice User Slice of UID 0. 
Dec 15 03:10:45 localhost podman[61042]: 2025-12-15 08:10:45.441747396 +0000 UTC m=+0.238113500 container start 165a6fd1f2b629d16da0798ec1c9bac41d302d530a0ffa668b2315a52d6aa04e (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, batch=17.1_20251118.1, build-date=2025-11-18T22:51:28Z, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, config_id=tripleo_step3, summary=Red Hat OpenStack Platform 17.1 collectd, com.redhat.component=openstack-collectd-container, name=rhosp17/openstack-collectd, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', 
'/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, distribution-scope=public, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, tcib_managed=true, container_name=collectd, url=https://www.redhat.com, vcs-type=git, io.openshift.expose-services=, release=1761123044, io.buildah.version=1.41.4, konflux.additional-tags=17.1.12 17.1_20251118.1, version=17.1.12, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 collectd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a) Dec 15 03:10:45 localhost systemd-logind[763]: Existing logind session ID 28 used by new audit session, ignoring. Dec 15 03:10:45 localhost podman[61020]: 2025-12-15 08:10:45.445290189 +0000 UTC m=+0.274013804 container init 0e27ec486e781641265a9df8382ef1c718ab9fc4a19de8332d0d9a5e7c91faf0 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_init_log, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, container_name=ceilometer_init_log, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.expose-services=, name=rhosp17/openstack-ceilometer-ipmi, maintainer=OpenStack TripleO Team, tcib_managed=true, batch=17.1_20251118.1, konflux.additional-tags=17.1.12 17.1_20251118.1, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, release=1761123044, build-date=2025-11-19T00:12:45Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-ceilometer-ipmi-container, io.openshift.tags=rhosp osp openstack osp-17.1 
openstack-ceilometer-ipmi, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public, vcs-type=git, config_id=tripleo_step3, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.buildah.version=1.41.4, config_data={'command': ['/bin/bash', '-c', 'chown -R ceilometer:ceilometer /var/log/ceilometer'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'none', 'start_order': 0, 'user': 'root', 'volumes': ['/var/log/containers/ceilometer:/var/log/ceilometer:z']}, url=https://www.redhat.com, vendor=Red Hat, Inc., version=17.1.12, architecture=x86_64) Dec 15 03:10:45 localhost systemd[1]: Starting User Runtime Directory /run/user/0... Dec 15 03:10:45 localhost systemd[1]: libpod-85aa4d76ae5b313025fcf4e8088557251bf656b26e1a33d194b78bcba6781723.scope: Deactivated successfully. Dec 15 03:10:45 localhost podman[61020]: 2025-12-15 08:10:45.456332083 +0000 UTC m=+0.285055698 container start 0e27ec486e781641265a9df8382ef1c718ab9fc4a19de8332d0d9a5e7c91faf0 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_init_log, io.openshift.expose-services=, config_data={'command': ['/bin/bash', '-c', 'chown -R ceilometer:ceilometer /var/log/ceilometer'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'none', 'start_order': 0, 'user': 'root', 'volumes': ['/var/log/containers/ceilometer:/var/log/ceilometer:z']}, name=rhosp17/openstack-ceilometer-ipmi, config_id=tripleo_step3, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 
ceilometer-ipmi, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, tcib_managed=true, io.buildah.version=1.41.4, container_name=ceilometer_init_log, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.component=openstack-ceilometer-ipmi-container, architecture=x86_64, batch=17.1_20251118.1, managed_by=tripleo_ansible, distribution-scope=public, url=https://www.redhat.com, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, version=17.1.12, build-date=2025-11-19T00:12:45Z, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, release=1761123044, maintainer=OpenStack TripleO Team) Dec 15 03:10:45 localhost python3[60846]: ansible-tripleo_container_manage PODMAN-CONTAINER-DEBUG: podman run --name collectd --cap-add IPC_LOCK --conmon-pidfile /run/collectd.pid --detach=True --env KOLLA_CONFIG_STRATEGY=COPY_ALWAYS --env TRIPLEO_CONFIG_HASH=da9a0dc7b40588672419e3ce10063e21 --healthcheck-command /openstack/healthcheck --label config_id=tripleo_step3 --label container_name=collectd --label managed_by=tripleo_ansible --label config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']} --log-driver k8s-file --log-opt path=/var/log/containers/stdouts/collectd.log --memory 512m --network host --pid host --user root --volume /etc/hosts:/etc/hosts:ro --volume /etc/localtime:/etc/localtime:ro --volume /etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro --volume /etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro --volume /etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro --volume /etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro --volume /etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro --volume /dev/log:/dev/log --volume /etc/puppet:/etc/puppet:ro --volume /var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro --volume /var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro --volume /var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro --volume /var/log/containers/collectd:/var/log/collectd:rw,z --volume /var/lib/container-config-scripts:/config-scripts:ro --volume /var/lib/container-user-scripts:/scripts:z --volume /run:/run:rw --volume /sys/fs/cgroup:/sys/fs/cgroup:ro registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1 Dec 15 03:10:45 localhost podman[61003]: 2025-12-15 08:10:45.459641031 +0000 UTC m=+0.302338687 container 
died 85aa4d76ae5b313025fcf4e8088557251bf656b26e1a33d194b78bcba6781723 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_statedir_owner, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, batch=17.1_20251118.1, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, description=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, version=17.1.12, distribution-scope=public, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, release=1761123044, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, config_data={'command': '/container-config-scripts/pyshim.sh /container-config-scripts/nova_statedir_ownership.py', 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': 'triliovault-mounts', 'TRIPLEO_DEPLOY_IDENTIFIER': '1765784752', '__OS_DEBUG': 'true'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'none', 'privileged': False, 'security_opt': ['label=disable'], 'user': 'root', 'volumes': ['/var/lib/nova:/var/lib/nova:shared', '/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', '/var/lib/container-config-scripts:/container-config-scripts:z']}, container_name=nova_statedir_owner, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp17/openstack-nova-compute, build-date=2025-11-19T00:36:58Z, io.buildah.version=1.41.4, io.openshift.expose-services=, config_id=tripleo_step3, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, url=https://www.redhat.com, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-nova-compute-container, maintainer=OpenStack TripleO Team, 
architecture=x86_64) Dec 15 03:10:45 localhost python3[60846]: ansible-tripleo_container_manage PODMAN-CONTAINER-DEBUG: podman run --name ceilometer_init_log --conmon-pidfile /run/ceilometer_init_log.pid --detach=True --label config_id=tripleo_step3 --label container_name=ceilometer_init_log --label managed_by=tripleo_ansible --label config_data={'command': ['/bin/bash', '-c', 'chown -R ceilometer:ceilometer /var/log/ceilometer'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'none', 'start_order': 0, 'user': 'root', 'volumes': ['/var/log/containers/ceilometer:/var/log/ceilometer:z']} --log-driver k8s-file --log-opt path=/var/log/containers/stdouts/ceilometer_init_log.log --network none --user root --volume /var/log/containers/ceilometer:/var/log/ceilometer:z registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1 /bin/bash -c chown -R ceilometer:ceilometer /var/log/ceilometer Dec 15 03:10:45 localhost systemd[1]: libpod-0e27ec486e781641265a9df8382ef1c718ab9fc4a19de8332d0d9a5e7c91faf0.scope: Deactivated successfully. Dec 15 03:10:45 localhost systemd[1]: Finished User Runtime Directory /run/user/0. Dec 15 03:10:45 localhost systemd[1]: Starting User Manager for UID 0... Dec 15 03:10:45 localhost systemd[1]: libpod-0e38c5f94ae7971c602d36252a3143b35e3d7e9f0725bde65bf045185c61e4e9.scope: Deactivated successfully. 
Dec 15 03:10:45 localhost podman[61121]: 2025-12-15 08:10:45.515950467 +0000 UTC m=+0.066184790 container died 0e38c5f94ae7971c602d36252a3143b35e3d7e9f0725bde65bf045185c61e4e9 (image=registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1, name=rsyslog, io.k8s.description=Red Hat OpenStack Platform 17.1 rsyslog, batch=17.1_20251118.1, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1, url=https://www.redhat.com, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, description=Red Hat OpenStack Platform 17.1 rsyslog, com.redhat.component=openstack-rsyslog-container, io.buildah.version=1.41.4, release=1761123044, architecture=x86_64, name=rhosp17/openstack-rsyslog, build-date=2025-11-18T22:49:49Z, version=17.1.12, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '605429f322a7b034ef9794ac46c40b29'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'security_opt': ['label=disable'], 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/rsyslog.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/rsyslog:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:ro', '/var/log/containers/rsyslog:/var/log/rsyslog:rw,z', '/var/log:/var/log/host:ro', '/var/lib/rsyslog.container:/var/lib/rsyslog:rw,z']}, distribution-scope=public, 
vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-rsyslog, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, summary=Red Hat OpenStack Platform 17.1 rsyslog, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_id=tripleo_step3, io.k8s.display-name=Red Hat OpenStack Platform 17.1 rsyslog, container_name=rsyslog, vendor=Red Hat, Inc., tcib_managed=true, vcs-type=git, managed_by=tripleo_ansible) Dec 15 03:10:45 localhost podman[61139]: 2025-12-15 08:10:45.537729676 +0000 UTC m=+0.061554197 container died 0e27ec486e781641265a9df8382ef1c718ab9fc4a19de8332d0d9a5e7c91faf0 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_init_log, io.openshift.expose-services=, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, distribution-scope=public, build-date=2025-11-19T00:12:45Z, io.buildah.version=1.41.4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, version=17.1.12, com.redhat.component=openstack-ceilometer-ipmi-container, maintainer=OpenStack TripleO Team, release=1761123044, vendor=Red Hat, Inc., container_name=ceilometer_init_log, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'command': ['/bin/bash', '-c', 'chown -R ceilometer:ceilometer /var/log/ceilometer'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'none', 'start_order': 0, 'user': 'root', 'volumes': ['/var/log/containers/ceilometer:/var/log/ceilometer:z']}, konflux.additional-tags=17.1.12 17.1_20251118.1, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, 
name=rhosp17/openstack-ceilometer-ipmi, batch=17.1_20251118.1, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, architecture=x86_64, config_id=tripleo_step3, managed_by=tripleo_ansible) Dec 15 03:10:45 localhost systemd[61154]: Queued start job for default target Main User Target. Dec 15 03:10:45 localhost systemd[61154]: Created slice User Application Slice. Dec 15 03:10:45 localhost systemd[61154]: Mark boot as successful after the user session has run 2 minutes was skipped because of an unmet condition check (ConditionUser=!@system). Dec 15 03:10:45 localhost systemd[61154]: Started Daily Cleanup of User's Temporary Directories. Dec 15 03:10:45 localhost systemd[61154]: Reached target Paths. Dec 15 03:10:45 localhost systemd[61154]: Reached target Timers. Dec 15 03:10:45 localhost systemd[61154]: Starting D-Bus User Message Bus Socket... Dec 15 03:10:45 localhost systemd[61154]: Starting Create User's Volatile Files and Directories... Dec 15 03:10:45 localhost systemd[61154]: Finished Create User's Volatile Files and Directories. Dec 15 03:10:45 localhost systemd[61154]: Listening on D-Bus User Message Bus Socket. Dec 15 03:10:45 localhost systemd[61154]: Reached target Sockets. Dec 15 03:10:45 localhost systemd[61154]: Reached target Basic System. Dec 15 03:10:45 localhost systemd[61154]: Reached target Main User Target. Dec 15 03:10:45 localhost systemd[61154]: Startup finished in 109ms. Dec 15 03:10:45 localhost systemd[1]: Started User Manager for UID 0. Dec 15 03:10:45 localhost systemd[1]: Started Session c1 of User root. Dec 15 03:10:45 localhost systemd[1]: Started Session c2 of User root. 
Dec 15 03:10:45 localhost podman[61136]: 2025-12-15 08:10:45.664020213 +0000 UTC m=+0.184486725 container cleanup 85aa4d76ae5b313025fcf4e8088557251bf656b26e1a33d194b78bcba6781723 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_statedir_owner, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.expose-services=, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 nova-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, url=https://www.redhat.com, managed_by=tripleo_ansible, com.redhat.component=openstack-nova-compute-container, build-date=2025-11-19T00:36:58Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, name=rhosp17/openstack-nova-compute, release=1761123044, tcib_managed=true, batch=17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, config_data={'command': '/container-config-scripts/pyshim.sh /container-config-scripts/nova_statedir_ownership.py', 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': 'triliovault-mounts', 'TRIPLEO_DEPLOY_IDENTIFIER': '1765784752', '__OS_DEBUG': 'true'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'none', 'privileged': False, 'security_opt': ['label=disable'], 'user': 'root', 'volumes': ['/var/lib/nova:/var/lib/nova:shared', '/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', '/var/lib/container-config-scripts:/container-config-scripts:z']}, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, distribution-scope=public, version=17.1.12, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, architecture=x86_64, container_name=nova_statedir_owner, vendor=Red Hat, 
Inc., io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, config_id=tripleo_step3) Dec 15 03:10:45 localhost systemd[1]: libpod-conmon-85aa4d76ae5b313025fcf4e8088557251bf656b26e1a33d194b78bcba6781723.scope: Deactivated successfully. Dec 15 03:10:45 localhost python3[60846]: ansible-tripleo_container_manage PODMAN-CONTAINER-DEBUG: podman run --name nova_statedir_owner --conmon-pidfile /run/nova_statedir_owner.pid --detach=False --env NOVA_STATEDIR_OWNERSHIP_SKIP=triliovault-mounts --env TRIPLEO_DEPLOY_IDENTIFIER=1765784752 --env __OS_DEBUG=true --label config_id=tripleo_step3 --label container_name=nova_statedir_owner --label managed_by=tripleo_ansible --label config_data={'command': '/container-config-scripts/pyshim.sh /container-config-scripts/nova_statedir_ownership.py', 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': 'triliovault-mounts', 'TRIPLEO_DEPLOY_IDENTIFIER': '1765784752', '__OS_DEBUG': 'true'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'none', 'privileged': False, 'security_opt': ['label=disable'], 'user': 'root', 'volumes': ['/var/lib/nova:/var/lib/nova:shared', '/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', '/var/lib/container-config-scripts:/container-config-scripts:z']} --log-driver k8s-file --log-opt path=/var/log/containers/stdouts/nova_statedir_owner.log --network none --privileged=False --security-opt label=disable --user root --volume /var/lib/nova:/var/lib/nova:shared --volume /var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z --volume /var/lib/container-config-scripts:/container-config-scripts:z registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1 /container-config-scripts/pyshim.sh /container-config-scripts/nova_statedir_ownership.py Dec 15 03:10:45 localhost podman[61181]: 2025-12-15 08:10:45.682475353 +0000 UTC m=+0.166413724 container cleanup 0e38c5f94ae7971c602d36252a3143b35e3d7e9f0725bde65bf045185c61e4e9 
(image=registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1, name=rsyslog, name=rhosp17/openstack-rsyslog, release=1761123044, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.expose-services=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '605429f322a7b034ef9794ac46c40b29'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'security_opt': ['label=disable'], 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/rsyslog.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/rsyslog:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:ro', '/var/log/containers/rsyslog:/var/log/rsyslog:rw,z', '/var/log:/var/log/host:ro', '/var/lib/rsyslog.container:/var/lib/rsyslog:rw,z']}, vendor=Red Hat, Inc., io.buildah.version=1.41.4, konflux.additional-tags=17.1.12 17.1_20251118.1, architecture=x86_64, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, description=Red Hat OpenStack Platform 17.1 rsyslog, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step3, io.k8s.display-name=Red Hat OpenStack Platform 17.1 rsyslog, summary=Red Hat OpenStack Platform 17.1 rsyslog, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.k8s.description=Red Hat OpenStack Platform 17.1 rsyslog, 
com.redhat.component=openstack-rsyslog-container, vcs-type=git, version=17.1.12, container_name=rsyslog, managed_by=tripleo_ansible, tcib_managed=true, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, batch=17.1_20251118.1, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-rsyslog, build-date=2025-11-18T22:49:49Z, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream) Dec 15 03:10:45 localhost systemd[1]: libpod-conmon-0e38c5f94ae7971c602d36252a3143b35e3d7e9f0725bde65bf045185c61e4e9.scope: Deactivated successfully. Dec 15 03:10:45 localhost systemd[1]: session-c2.scope: Deactivated successfully. Dec 15 03:10:45 localhost systemd[1]: session-c1.scope: Deactivated successfully. Dec 15 03:10:45 localhost podman[61139]: 2025-12-15 08:10:45.728094386 +0000 UTC m=+0.251918897 container cleanup 0e27ec486e781641265a9df8382ef1c718ab9fc4a19de8332d0d9a5e7c91faf0 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_init_log, release=1761123044, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, maintainer=OpenStack TripleO Team, container_name=ceilometer_init_log, batch=17.1_20251118.1, architecture=x86_64, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, version=17.1.12, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, konflux.additional-tags=17.1.12 17.1_20251118.1, build-date=2025-11-19T00:12:45Z, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, name=rhosp17/openstack-ceilometer-ipmi, config_data={'command': ['/bin/bash', '-c', 'chown -R ceilometer:ceilometer /var/log/ceilometer'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'none', 'start_order': 0, 'user': 'root', 'volumes': ['/var/log/containers/ceilometer:/var/log/ceilometer:z']}, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.expose-services=, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, config_id=tripleo_step3, distribution-scope=public, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.component=openstack-ceilometer-ipmi-container, io.buildah.version=1.41.4, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05) Dec 15 03:10:45 localhost systemd[1]: libpod-conmon-0e27ec486e781641265a9df8382ef1c718ab9fc4a19de8332d0d9a5e7c91faf0.scope: Deactivated successfully. Dec 15 03:10:45 localhost podman[61120]: 2025-12-15 08:10:45.727041798 +0000 UTC m=+0.276579492 container health_status 165a6fd1f2b629d16da0798ec1c9bac41d302d530a0ffa668b2315a52d6aa04e (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=starting, maintainer=OpenStack TripleO Team, container_name=collectd, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, managed_by=tripleo_ansible, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.component=openstack-collectd-container, build-date=2025-11-18T22:51:28Z, tcib_managed=true, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, description=Red Hat OpenStack Platform 17.1 collectd, name=rhosp17/openstack-collectd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, io.buildah.version=1.41.4, version=17.1.12, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, konflux.additional-tags=17.1.12 17.1_20251118.1, vendor=Red Hat, Inc., 
release=1761123044, config_id=tripleo_step3, architecture=x86_64, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, batch=17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 collectd) Dec 15 03:10:45 localhost podman[61120]: 2025-12-15 08:10:45.861611744 +0000 UTC m=+0.411149438 container exec_died 165a6fd1f2b629d16da0798ec1c9bac41d302d530a0ffa668b2315a52d6aa04e (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, 
config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 collectd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, summary=Red Hat OpenStack Platform 17.1 collectd, container_name=collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, architecture=x86_64, batch=17.1_20251118.1, vcs-type=git, konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-collectd-container, config_id=tripleo_step3, io.buildah.version=1.41.4, release=1761123044, 
org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, version=17.1.12, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, managed_by=tripleo_ansible, io.openshift.expose-services=, vendor=Red Hat, Inc., build-date=2025-11-18T22:51:28Z, name=rhosp17/openstack-collectd, url=https://www.redhat.com, tcib_managed=true, distribution-scope=public) Dec 15 03:10:45 localhost podman[61120]: unhealthy Dec 15 03:10:45 localhost systemd[1]: 165a6fd1f2b629d16da0798ec1c9bac41d302d530a0ffa668b2315a52d6aa04e.service: Main process exited, code=exited, status=1/FAILURE Dec 15 03:10:45 localhost systemd[1]: 165a6fd1f2b629d16da0798ec1c9bac41d302d530a0ffa668b2315a52d6aa04e.service: Failed with result 'exit-code'. Dec 15 03:10:46 localhost podman[61360]: 2025-12-15 08:10:46.029756723 +0000 UTC m=+0.066011025 container create 92b280dfcbf579f50e259592e35912e53a86821f523ea224be339f770852945b (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtlogd, description=Red Hat OpenStack Platform 17.1 nova-libvirt, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, maintainer=OpenStack TripleO Team, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, com.redhat.component=openstack-nova-libvirt-container, release=1761123044, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, tcib_managed=true, name=rhosp17/openstack-nova-libvirt, architecture=x86_64, konflux.additional-tags=17.1.12 17.1_20251118.1, batch=17.1_20251118.1, version=17.1.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.buildah.version=1.41.4, io.openshift.expose-services=, 
org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, distribution-scope=public, vendor=Red Hat, Inc., build-date=2025-11-19T00:35:22Z) Dec 15 03:10:46 localhost systemd[1]: Started libpod-conmon-92b280dfcbf579f50e259592e35912e53a86821f523ea224be339f770852945b.scope. Dec 15 03:10:46 localhost systemd[1]: Started libcrun container. Dec 15 03:10:46 localhost podman[61360]: 2025-12-15 08:10:45.993058958 +0000 UTC m=+0.029313250 image pull registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1 Dec 15 03:10:46 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/87a1920e0c677ba0eff7b258724290a21d0d758cce1c2e79cf986e2254064ecd/merged/var/lib/nova supports timestamps until 2038 (0x7fffffff) Dec 15 03:10:46 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/87a1920e0c677ba0eff7b258724290a21d0d758cce1c2e79cf986e2254064ecd/merged/var/log/libvirt supports timestamps until 2038 (0x7fffffff) Dec 15 03:10:46 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/87a1920e0c677ba0eff7b258724290a21d0d758cce1c2e79cf986e2254064ecd/merged/var/lib/libvirt supports timestamps until 2038 (0x7fffffff) Dec 15 03:10:46 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/87a1920e0c677ba0eff7b258724290a21d0d758cce1c2e79cf986e2254064ecd/merged/var/log/swtpm/libvirt supports timestamps until 2038 (0x7fffffff) Dec 15 03:10:46 localhost podman[61360]: 2025-12-15 08:10:46.104343406 +0000 UTC m=+0.140597698 container init 92b280dfcbf579f50e259592e35912e53a86821f523ea224be339f770852945b (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtlogd, version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, maintainer=OpenStack TripleO Team, batch=17.1_20251118.1, summary=Red Hat 
OpenStack Platform 17.1 nova-libvirt, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 nova-libvirt, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, name=rhosp17/openstack-nova-libvirt, architecture=x86_64, konflux.additional-tags=17.1.12 17.1_20251118.1, release=1761123044, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, build-date=2025-11-19T00:35:22Z, io.buildah.version=1.41.4, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, url=https://www.redhat.com, com.redhat.component=openstack-nova-libvirt-container, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d) Dec 15 03:10:46 localhost podman[61360]: 2025-12-15 08:10:46.113009627 +0000 UTC m=+0.149263919 container start 92b280dfcbf579f50e259592e35912e53a86821f523ea224be339f770852945b (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtlogd, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 nova-libvirt, batch=17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, build-date=2025-11-19T00:35:22Z, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, architecture=x86_64, url=https://www.redhat.com, com.redhat.component=openstack-nova-libvirt-container, release=1761123044, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, version=17.1.12, name=rhosp17/openstack-nova-libvirt, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, vendor=Red Hat, Inc., io.buildah.version=1.41.4, io.openshift.expose-services=, 
cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team) Dec 15 03:10:46 localhost podman[61408]: 2025-12-15 08:10:46.234885706 +0000 UTC m=+0.096159547 container create 79c86d80f06802ead502598f2b91dbc5585606176da16f7812de83462116f050 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtsecretd, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, description=Red Hat OpenStack Platform 17.1 nova-libvirt, container_name=nova_virtsecretd, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, name=rhosp17/openstack-nova-libvirt, architecture=x86_64, config_id=tripleo_step3, build-date=2025-11-19T00:35:22Z, vcs-type=git, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.12 17.1_20251118.1, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, release=1761123044, batch=17.1_20251118.1, config_data={'cgroupns': 'host', 'depends_on': ['tripleo_nova_virtlogd_wrapper.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '879500e96bf8dfb93687004bd86f2317'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'pids_limit': 65536, 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 1, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', '/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/var/lib/kolla/config_files/nova_virtsecretd.json:/var/lib/kolla/config_files/config.json:ro']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, io.openshift.expose-services=, tcib_managed=true, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, vendor=Red Hat, Inc., version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, url=https://www.redhat.com, distribution-scope=public, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, com.redhat.component=openstack-nova-libvirt-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt) Dec 15 03:10:46 localhost systemd[1]: var-lib-containers-storage-overlay-b283fbdbefaea7e7733bc01aa84718fe19ad455a23e2ee1af5ac07ede0952e1b-merged.mount: Deactivated successfully. 
Dec 15 03:10:46 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-85aa4d76ae5b313025fcf4e8088557251bf656b26e1a33d194b78bcba6781723-userdata-shm.mount: Deactivated successfully. Dec 15 03:10:46 localhost podman[61408]: 2025-12-15 08:10:46.185686708 +0000 UTC m=+0.046960599 image pull registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1 Dec 15 03:10:46 localhost systemd[1]: Started libpod-conmon-79c86d80f06802ead502598f2b91dbc5585606176da16f7812de83462116f050.scope. Dec 15 03:10:46 localhost systemd[1]: Started libcrun container. Dec 15 03:10:46 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/8d291eddbcf4928f1b462d99ea0cfc000e1fa49a9a1356648b6b5b68c50a1cf0/merged/etc/libvirt supports timestamps until 2038 (0x7fffffff) Dec 15 03:10:46 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/8d291eddbcf4928f1b462d99ea0cfc000e1fa49a9a1356648b6b5b68c50a1cf0/merged/var/lib/vhost_sockets supports timestamps until 2038 (0x7fffffff) Dec 15 03:10:46 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/8d291eddbcf4928f1b462d99ea0cfc000e1fa49a9a1356648b6b5b68c50a1cf0/merged/var/cache/libvirt supports timestamps until 2038 (0x7fffffff) Dec 15 03:10:46 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/8d291eddbcf4928f1b462d99ea0cfc000e1fa49a9a1356648b6b5b68c50a1cf0/merged/var/lib/nova supports timestamps until 2038 (0x7fffffff) Dec 15 03:10:46 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/8d291eddbcf4928f1b462d99ea0cfc000e1fa49a9a1356648b6b5b68c50a1cf0/merged/var/lib/libvirt supports timestamps until 2038 (0x7fffffff) Dec 15 03:10:46 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/8d291eddbcf4928f1b462d99ea0cfc000e1fa49a9a1356648b6b5b68c50a1cf0/merged/var/log/libvirt supports timestamps until 2038 (0x7fffffff) Dec 15 03:10:46 
localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/8d291eddbcf4928f1b462d99ea0cfc000e1fa49a9a1356648b6b5b68c50a1cf0/merged/var/lib/kolla/config_files/src-ceph supports timestamps until 2038 (0x7fffffff) Dec 15 03:10:46 localhost podman[61408]: 2025-12-15 08:10:46.322025912 +0000 UTC m=+0.183299743 container init 79c86d80f06802ead502598f2b91dbc5585606176da16f7812de83462116f050 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtsecretd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, url=https://www.redhat.com, build-date=2025-11-19T00:35:22Z, konflux.additional-tags=17.1.12 17.1_20251118.1, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 nova-libvirt, io.buildah.version=1.41.4, config_id=tripleo_step3, batch=17.1_20251118.1, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, description=Red Hat OpenStack Platform 17.1 nova-libvirt, distribution-scope=public, version=17.1.12, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, container_name=nova_virtsecretd, tcib_managed=true, com.redhat.component=openstack-nova-libvirt-container, release=1761123044, name=rhosp17/openstack-nova-libvirt, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, config_data={'cgroupns': 'host', 'depends_on': ['tripleo_nova_virtlogd_wrapper.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '879500e96bf8dfb93687004bd86f2317'}, 'image': 
'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'pids_limit': 65536, 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 1, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', '/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/var/lib/kolla/config_files/nova_virtsecretd.json:/var/lib/kolla/config_files/config.json:ro']}) Dec 15 03:10:46 localhost podman[61408]: 2025-12-15 08:10:46.333017595 +0000 UTC m=+0.194291436 container start 79c86d80f06802ead502598f2b91dbc5585606176da16f7812de83462116f050 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtsecretd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, tcib_managed=true, 
vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, description=Red Hat OpenStack Platform 17.1 nova-libvirt, config_data={'cgroupns': 'host', 'depends_on': ['tripleo_nova_virtlogd_wrapper.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '879500e96bf8dfb93687004bd86f2317'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'pids_limit': 65536, 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 1, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', '/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', 
'/var/lib/kolla/config_files/nova_virtsecretd.json:/var/lib/kolla/config_files/config.json:ro']}, container_name=nova_virtsecretd, com.redhat.component=openstack-nova-libvirt-container, distribution-scope=public, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.openshift.expose-services=, build-date=2025-11-19T00:35:22Z, io.buildah.version=1.41.4, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, release=1761123044, vcs-type=git, architecture=x86_64, name=rhosp17/openstack-nova-libvirt, config_id=tripleo_step3, version=17.1.12, konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible, url=https://www.redhat.com, batch=17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team) Dec 15 03:10:46 localhost python3[60846]: ansible-tripleo_container_manage PODMAN-CONTAINER-DEBUG: podman run --name nova_virtsecretd --cgroupns=host --conmon-pidfile /run/nova_virtsecretd.pid --detach=True --env KOLLA_CONFIG_STRATEGY=COPY_ALWAYS --env TRIPLEO_CONFIG_HASH=879500e96bf8dfb93687004bd86f2317 --label config_id=tripleo_step3 --label container_name=nova_virtsecretd --label managed_by=tripleo_ansible --label config_data={'cgroupns': 'host', 'depends_on': ['tripleo_nova_virtlogd_wrapper.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '879500e96bf8dfb93687004bd86f2317'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'pids_limit': 65536, 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 1, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', '/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/var/lib/kolla/config_files/nova_virtsecretd.json:/var/lib/kolla/config_files/config.json:ro']} --log-driver k8s-file --log-opt path=/var/log/containers/stdouts/nova_virtsecretd.log --network host --pid host --pids-limit 65536 --privileged=True --security-opt label=level:s0 --security-opt label=type:spc_t --security-opt label=filetype:container_file_t --ulimit nofile=131072 --ulimit nproc=126960 --volume /etc/hosts:/etc/hosts:ro --volume /etc/localtime:/etc/localtime:ro --volume /etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro --volume /etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro --volume /etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro --volume /etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro --volume /etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro --volume /dev/log:/dev/log --volume /etc/puppet:/etc/puppet:ro 
--volume /var/log/containers/libvirt:/var/log/libvirt:shared,z --volume /lib/modules:/lib/modules:ro --volume /dev:/dev --volume /run:/run --volume /sys/fs/cgroup:/sys/fs/cgroup --volume /sys/fs/selinux:/sys/fs/selinux --volume /etc/selinux/config:/etc/selinux/config:ro --volume /etc/libvirt:/etc/libvirt:shared --volume /etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro --volume /run/libvirt:/run/libvirt:shared,z --volume /var/lib/nova:/var/lib/nova:shared --volume /var/lib/libvirt:/var/lib/libvirt:shared --volume /var/cache/libvirt:/var/cache/libvirt:shared --volume /var/lib/vhost_sockets:/var/lib/vhost_sockets --volume /var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro --volume /var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z --volume /var/lib/kolla/config_files/nova_virtsecretd.json:/var/lib/kolla/config_files/config.json:ro registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1 Dec 15 03:10:46 localhost systemd-logind[763]: Existing logind session ID 28 used by new audit session, ignoring. Dec 15 03:10:46 localhost systemd[1]: Started Session c3 of User root. Dec 15 03:10:46 localhost systemd[1]: session-c3.scope: Deactivated successfully. 
Dec 15 03:10:46 localhost podman[61547]: 2025-12-15 08:10:46.770757339 +0000 UTC m=+0.069360265 container create 2649c3aec9af2f04509e1d248b1050b202fccc3ed2b86421c89a2a65376db057 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, url=https://www.redhat.com, vcs-type=git, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, batch=17.1_20251118.1, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, distribution-scope=public, config_id=tripleo_step3, vendor=Red Hat, Inc., managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, tcib_managed=true, container_name=iscsid, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, summary=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '182e509007ab5e6e5b2500a552cbd5ba'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', 
'/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, com.redhat.component=openstack-iscsid-container, version=17.1.12, io.buildah.version=1.41.4, build-date=2025-11-18T23:44:13Z, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, name=rhosp17/openstack-iscsid, konflux.additional-tags=17.1.12 17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 iscsid, release=1761123044) Dec 15 03:10:46 localhost podman[61544]: 2025-12-15 08:10:46.805287026 +0000 UTC m=+0.110044845 container create defb2439fc676fe463bf8d7eb18e0214dacf68a5280036dda3b72a4334bffc50 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtnodedevd, managed_by=tripleo_ansible, com.redhat.component=openstack-nova-libvirt-container, release=1761123044, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 nova-libvirt, io.openshift.expose-services=, build-date=2025-11-19T00:35:22Z, url=https://www.redhat.com, version=17.1.12, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vendor=Red Hat, Inc., vcs-type=git, container_name=nova_virtnodedevd, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, name=rhosp17/openstack-nova-libvirt, konflux.additional-tags=17.1.12 17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, batch=17.1_20251118.1, config_id=tripleo_step3, tcib_managed=true, architecture=x86_64, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, 
org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, config_data={'cgroupns': 'host', 'depends_on': ['tripleo_nova_virtlogd_wrapper.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '879500e96bf8dfb93687004bd86f2317'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'pids_limit': 65536, 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 2, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', '/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/var/lib/kolla/config_files/nova_virtnodedevd.json:/var/lib/kolla/config_files/config.json:ro']}, io.buildah.version=1.41.4) Dec 15 03:10:46 localhost systemd[1]: Started 
libpod-conmon-2649c3aec9af2f04509e1d248b1050b202fccc3ed2b86421c89a2a65376db057.scope. Dec 15 03:10:46 localhost systemd[1]: Started libcrun container. Dec 15 03:10:46 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/d63ab03ca441fedc5f5fcdf51699b396e9401963b7839d4b0e700c4e4e1e58a9/merged/etc/target supports timestamps until 2038 (0x7fffffff) Dec 15 03:10:46 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/d63ab03ca441fedc5f5fcdf51699b396e9401963b7839d4b0e700c4e4e1e58a9/merged/var/lib/iscsi supports timestamps until 2038 (0x7fffffff) Dec 15 03:10:46 localhost systemd[1]: Started libpod-conmon-defb2439fc676fe463bf8d7eb18e0214dacf68a5280036dda3b72a4334bffc50.scope. Dec 15 03:10:46 localhost podman[61547]: 2025-12-15 08:10:46.730320544 +0000 UTC m=+0.028923490 image pull registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1 Dec 15 03:10:46 localhost podman[61544]: 2025-12-15 08:10:46.745058506 +0000 UTC m=+0.049816435 image pull registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1 Dec 15 03:10:46 localhost systemd[1]: Started libcrun container. 
Dec 15 03:10:46 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/114f79195e61e2cc174bba0adc80dff416857f21c27a337c4dfb908cace7e308/merged/etc/libvirt supports timestamps until 2038 (0x7fffffff) Dec 15 03:10:46 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/114f79195e61e2cc174bba0adc80dff416857f21c27a337c4dfb908cace7e308/merged/var/lib/vhost_sockets supports timestamps until 2038 (0x7fffffff) Dec 15 03:10:46 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/114f79195e61e2cc174bba0adc80dff416857f21c27a337c4dfb908cace7e308/merged/var/lib/nova supports timestamps until 2038 (0x7fffffff) Dec 15 03:10:46 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/114f79195e61e2cc174bba0adc80dff416857f21c27a337c4dfb908cace7e308/merged/var/log/libvirt supports timestamps until 2038 (0x7fffffff) Dec 15 03:10:46 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/114f79195e61e2cc174bba0adc80dff416857f21c27a337c4dfb908cace7e308/merged/var/lib/libvirt supports timestamps until 2038 (0x7fffffff) Dec 15 03:10:46 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/114f79195e61e2cc174bba0adc80dff416857f21c27a337c4dfb908cace7e308/merged/var/cache/libvirt supports timestamps until 2038 (0x7fffffff) Dec 15 03:10:46 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/114f79195e61e2cc174bba0adc80dff416857f21c27a337c4dfb908cace7e308/merged/var/lib/kolla/config_files/src-ceph supports timestamps until 2038 (0x7fffffff) Dec 15 03:10:46 localhost podman[61544]: 2025-12-15 08:10:46.854120155 +0000 UTC m=+0.158877964 container init defb2439fc676fe463bf8d7eb18e0214dacf68a5280036dda3b72a4334bffc50 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtnodedevd, config_id=tripleo_step3, container_name=nova_virtnodedevd, 
config_data={'cgroupns': 'host', 'depends_on': ['tripleo_nova_virtlogd_wrapper.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '879500e96bf8dfb93687004bd86f2317'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'pids_limit': 65536, 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 2, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', '/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/var/lib/kolla/config_files/nova_virtnodedevd.json:/var/lib/kolla/config_files/config.json:ro']}, io.buildah.version=1.41.4, url=https://www.redhat.com, name=rhosp17/openstack-nova-libvirt, release=1761123044, io.openshift.expose-services=, vcs-type=git, tcib_managed=true, maintainer=OpenStack TripleO Team, 
summary=Red Hat OpenStack Platform 17.1 nova-libvirt, com.redhat.component=openstack-nova-libvirt-container, build-date=2025-11-19T00:35:22Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, version=17.1.12, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, description=Red Hat OpenStack Platform 17.1 nova-libvirt, batch=17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, architecture=x86_64, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, distribution-scope=public, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream) Dec 15 03:10:46 localhost systemd[1]: Started /usr/bin/podman healthcheck run 2649c3aec9af2f04509e1d248b1050b202fccc3ed2b86421c89a2a65376db057. 
Dec 15 03:10:46 localhost podman[61547]: 2025-12-15 08:10:46.860376701 +0000 UTC m=+0.158979627 container init 2649c3aec9af2f04509e1d248b1050b202fccc3ed2b86421c89a2a65376db057 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, version=17.1.12, build-date=2025-11-18T23:44:13Z, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, summary=Red Hat OpenStack Platform 17.1 iscsid, maintainer=OpenStack TripleO Team, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 iscsid, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_id=tripleo_step3, release=1761123044, managed_by=tripleo_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '182e509007ab5e6e5b2500a552cbd5ba'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, url=https://www.redhat.com, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-type=git, container_name=iscsid, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, vendor=Red Hat, Inc., distribution-scope=public, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, io.buildah.version=1.41.4, com.redhat.component=openstack-iscsid-container, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, name=rhosp17/openstack-iscsid) Dec 15 03:10:46 localhost podman[61544]: 2025-12-15 08:10:46.864578272 +0000 UTC m=+0.169336091 container start defb2439fc676fe463bf8d7eb18e0214dacf68a5280036dda3b72a4334bffc50 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtnodedevd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, konflux.additional-tags=17.1.12 17.1_20251118.1, release=1761123044, com.redhat.component=openstack-nova-libvirt-container, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, vendor=Red Hat, Inc., name=rhosp17/openstack-nova-libvirt, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-type=git, build-date=2025-11-19T00:35:22Z, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 nova-libvirt, io.openshift.expose-services=, url=https://www.redhat.com, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, io.buildah.version=1.41.4, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, version=17.1.12, tcib_managed=true, container_name=nova_virtnodedevd, config_data={'cgroupns': 
'host', 'depends_on': ['tripleo_nova_virtlogd_wrapper.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '879500e96bf8dfb93687004bd86f2317'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'pids_limit': 65536, 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 2, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', '/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/var/lib/kolla/config_files/nova_virtnodedevd.json:/var/lib/kolla/config_files/config.json:ro']}, batch=17.1_20251118.1, config_id=tripleo_step3, architecture=x86_64, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream) Dec 15 03:10:46 localhost python3[60846]: ansible-tripleo_container_manage PODMAN-CONTAINER-DEBUG: podman run --name 
nova_virtnodedevd --cgroupns=host --conmon-pidfile /run/nova_virtnodedevd.pid --detach=True --env KOLLA_CONFIG_STRATEGY=COPY_ALWAYS --env TRIPLEO_CONFIG_HASH=879500e96bf8dfb93687004bd86f2317 --label config_id=tripleo_step3 --label container_name=nova_virtnodedevd --label managed_by=tripleo_ansible --label config_data={'cgroupns': 'host', 'depends_on': ['tripleo_nova_virtlogd_wrapper.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '879500e96bf8dfb93687004bd86f2317'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'pids_limit': 65536, 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 2, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', '/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', 
'/var/lib/kolla/config_files/nova_virtnodedevd.json:/var/lib/kolla/config_files/config.json:ro']} --log-driver k8s-file --log-opt path=/var/log/containers/stdouts/nova_virtnodedevd.log --network host --pid host --pids-limit 65536 --privileged=True --security-opt label=level:s0 --security-opt label=type:spc_t --security-opt label=filetype:container_file_t --ulimit nofile=131072 --ulimit nproc=126960 --volume /etc/hosts:/etc/hosts:ro --volume /etc/localtime:/etc/localtime:ro --volume /etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro --volume /etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro --volume /etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro --volume /etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro --volume /etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro --volume /dev/log:/dev/log --volume /etc/puppet:/etc/puppet:ro --volume /var/log/containers/libvirt:/var/log/libvirt:shared,z --volume /lib/modules:/lib/modules:ro --volume /dev:/dev --volume /run:/run --volume /sys/fs/cgroup:/sys/fs/cgroup --volume /sys/fs/selinux:/sys/fs/selinux --volume /etc/selinux/config:/etc/selinux/config:ro --volume /etc/libvirt:/etc/libvirt:shared --volume /etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro --volume /run/libvirt:/run/libvirt:shared,z --volume /var/lib/nova:/var/lib/nova:shared --volume /var/lib/libvirt:/var/lib/libvirt:shared --volume /var/cache/libvirt:/var/cache/libvirt:shared --volume /var/lib/vhost_sockets:/var/lib/vhost_sockets --volume /var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro --volume /var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z --volume /var/lib/kolla/config_files/nova_virtnodedevd.json:/var/lib/kolla/config_files/config.json:ro registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1 Dec 15 03:10:46 localhost systemd[1]: Started /usr/bin/podman healthcheck run 
2649c3aec9af2f04509e1d248b1050b202fccc3ed2b86421c89a2a65376db057. Dec 15 03:10:46 localhost podman[61547]: 2025-12-15 08:10:46.890248065 +0000 UTC m=+0.188851021 container start 2649c3aec9af2f04509e1d248b1050b202fccc3ed2b86421c89a2a65376db057 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 iscsid, vendor=Red Hat, Inc., vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, container_name=iscsid, tcib_managed=true, io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 iscsid, build-date=2025-11-18T23:44:13Z, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '182e509007ab5e6e5b2500a552cbd5ba'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, konflux.additional-tags=17.1.12 17.1_20251118.1, version=17.1.12, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_id=tripleo_step3, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, name=rhosp17/openstack-iscsid, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, managed_by=tripleo_ansible, com.redhat.component=openstack-iscsid-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, batch=17.1_20251118.1, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, io.openshift.expose-services=, release=1761123044, maintainer=OpenStack TripleO Team) Dec 15 03:10:46 localhost systemd-logind[763]: Existing logind session ID 28 used by new audit session, ignoring. Dec 15 03:10:46 localhost python3[60846]: ansible-tripleo_container_manage PODMAN-CONTAINER-DEBUG: podman run --name iscsid --conmon-pidfile /run/iscsid.pid --detach=True --env KOLLA_CONFIG_STRATEGY=COPY_ALWAYS --env TRIPLEO_CONFIG_HASH=182e509007ab5e6e5b2500a552cbd5ba --healthcheck-command /openstack/healthcheck --label config_id=tripleo_step3 --label container_name=iscsid --label managed_by=tripleo_ansible --label config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '182e509007ab5e6e5b2500a552cbd5ba'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']} --log-driver k8s-file --log-opt path=/var/log/containers/stdouts/iscsid.log --network host --privileged=True --volume /etc/hosts:/etc/hosts:ro --volume /etc/localtime:/etc/localtime:ro --volume /etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro --volume /etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro --volume /etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro --volume /etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro --volume /etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro --volume /dev/log:/dev/log --volume /etc/puppet:/etc/puppet:ro --volume /var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro --volume /dev:/dev --volume /run:/run --volume /sys:/sys --volume /lib/modules:/lib/modules:ro --volume /var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro --volume /etc/target:/etc/target:z --volume /var/lib/iscsi:/var/lib/iscsi:z registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1 Dec 15 03:10:46 localhost systemd-logind[763]: Existing logind session ID 28 used by new audit session, ignoring. Dec 15 03:10:46 localhost systemd[1]: Started Session c4 of User root. Dec 15 03:10:46 localhost systemd[1]: Started Session c5 of User root. Dec 15 03:10:47 localhost systemd[1]: session-c5.scope: Deactivated successfully. Dec 15 03:10:47 localhost systemd[1]: session-c4.scope: Deactivated successfully. 
Dec 15 03:10:47 localhost podman[61591]: 2025-12-15 08:10:47.041064083 +0000 UTC m=+0.141940453 container health_status 2649c3aec9af2f04509e1d248b1050b202fccc3ed2b86421c89a2a65376db057 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=starting, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=iscsid, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, tcib_managed=true, release=1761123044, name=rhosp17/openstack-iscsid, description=Red Hat OpenStack Platform 17.1 iscsid, batch=17.1_20251118.1, config_id=tripleo_step3, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.component=openstack-iscsid-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.expose-services=, architecture=x86_64, build-date=2025-11-18T23:44:13Z, io.buildah.version=1.41.4, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '182e509007ab5e6e5b2500a552cbd5ba'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', 
'/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, maintainer=OpenStack TripleO Team, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, url=https://www.redhat.com, vendor=Red Hat, Inc., distribution-scope=public, version=17.1.12, summary=Red Hat OpenStack Platform 17.1 iscsid, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, managed_by=tripleo_ansible) Dec 15 03:10:47 localhost kernel: Loading iSCSI transport class v2.0-870. Dec 15 03:10:47 localhost podman[61591]: 2025-12-15 08:10:47.084502098 +0000 UTC m=+0.185378508 container exec_died 2649c3aec9af2f04509e1d248b1050b202fccc3ed2b86421c89a2a65376db057 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, container_name=iscsid, release=1761123044, summary=Red Hat OpenStack Platform 17.1 iscsid, konflux.additional-tags=17.1.12 17.1_20251118.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '182e509007ab5e6e5b2500a552cbd5ba'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', 
'/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, config_id=tripleo_step3, io.openshift.expose-services=, build-date=2025-11-18T23:44:13Z, managed_by=tripleo_ansible, vendor=Red Hat, Inc., name=rhosp17/openstack-iscsid, version=17.1.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, architecture=x86_64, url=https://www.redhat.com, vcs-type=git, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-iscsid-container, description=Red Hat OpenStack Platform 17.1 iscsid, io.buildah.version=1.41.4, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, maintainer=OpenStack TripleO Team, distribution-scope=public, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, batch=17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid) Dec 15 03:10:47 localhost systemd[1]: 2649c3aec9af2f04509e1d248b1050b202fccc3ed2b86421c89a2a65376db057.service: Deactivated successfully. 
Dec 15 03:10:47 localhost podman[61726]: 2025-12-15 08:10:47.396971443 +0000 UTC m=+0.102044063 container create 68297132a91cfc5a6c8079435c9fe1de3a122b558a334b9752ca0b9c9ef20016 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtstoraged, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, io.buildah.version=1.41.4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., io.openshift.expose-services=, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-type=git, config_data={'cgroupns': 'host', 'depends_on': ['tripleo_nova_virtlogd_wrapper.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '879500e96bf8dfb93687004bd86f2317'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'pids_limit': 65536, 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 3, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', 
'/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/var/lib/kolla/config_files/nova_virtstoraged.json:/var/lib/kolla/config_files/config.json:ro']}, batch=17.1_20251118.1, build-date=2025-11-19T00:35:22Z, com.redhat.component=openstack-nova-libvirt-container, url=https://www.redhat.com, konflux.additional-tags=17.1.12 17.1_20251118.1, name=rhosp17/openstack-nova-libvirt, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, version=17.1.12, description=Red Hat OpenStack Platform 17.1 nova-libvirt, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, architecture=x86_64, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, tcib_managed=true, container_name=nova_virtstoraged, distribution-scope=public, release=1761123044, config_id=tripleo_step3) Dec 15 03:10:47 localhost podman[61726]: 2025-12-15 08:10:47.343950804 +0000 UTC m=+0.049023424 image pull registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1 Dec 15 03:10:47 localhost systemd[1]: Started libpod-conmon-68297132a91cfc5a6c8079435c9fe1de3a122b558a334b9752ca0b9c9ef20016.scope. Dec 15 03:10:47 localhost systemd[1]: Started libcrun container. 
Dec 15 03:10:47 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/287e8a5e9612465394db15161b4361526f72e331708cc6adfea20036aeff367f/merged/etc/libvirt supports timestamps until 2038 (0x7fffffff) Dec 15 03:10:47 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/287e8a5e9612465394db15161b4361526f72e331708cc6adfea20036aeff367f/merged/var/lib/nova supports timestamps until 2038 (0x7fffffff) Dec 15 03:10:47 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/287e8a5e9612465394db15161b4361526f72e331708cc6adfea20036aeff367f/merged/var/lib/vhost_sockets supports timestamps until 2038 (0x7fffffff) Dec 15 03:10:47 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/287e8a5e9612465394db15161b4361526f72e331708cc6adfea20036aeff367f/merged/var/log/libvirt supports timestamps until 2038 (0x7fffffff) Dec 15 03:10:47 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/287e8a5e9612465394db15161b4361526f72e331708cc6adfea20036aeff367f/merged/var/lib/libvirt supports timestamps until 2038 (0x7fffffff) Dec 15 03:10:47 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/287e8a5e9612465394db15161b4361526f72e331708cc6adfea20036aeff367f/merged/var/cache/libvirt supports timestamps until 2038 (0x7fffffff) Dec 15 03:10:47 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/287e8a5e9612465394db15161b4361526f72e331708cc6adfea20036aeff367f/merged/var/lib/kolla/config_files/src-ceph supports timestamps until 2038 (0x7fffffff) Dec 15 03:10:47 localhost podman[61726]: 2025-12-15 08:10:47.484018107 +0000 UTC m=+0.189090697 container init 68297132a91cfc5a6c8079435c9fe1de3a122b558a334b9752ca0b9c9ef20016 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtstoraged, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, 
com.redhat.component=openstack-nova-libvirt-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, io.buildah.version=1.41.4, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=nova_virtstoraged, release=1761123044, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public, build-date=2025-11-19T00:35:22Z, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-type=git, description=Red Hat OpenStack Platform 17.1 nova-libvirt, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, architecture=x86_64, config_data={'cgroupns': 'host', 'depends_on': ['tripleo_nova_virtlogd_wrapper.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '879500e96bf8dfb93687004bd86f2317'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'pids_limit': 65536, 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 3, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', '/etc/selinux/config:/etc/selinux/config:ro', 
'/etc/libvirt:/etc/libvirt:shared', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/var/lib/kolla/config_files/nova_virtstoraged.json:/var/lib/kolla/config_files/config.json:ro']}, vendor=Red Hat, Inc., name=rhosp17/openstack-nova-libvirt, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, config_id=tripleo_step3, batch=17.1_20251118.1, managed_by=tripleo_ansible, version=17.1.12, io.openshift.expose-services=) Dec 15 03:10:47 localhost podman[61726]: 2025-12-15 08:10:47.495240076 +0000 UTC m=+0.200312706 container start 68297132a91cfc5a6c8079435c9fe1de3a122b558a334b9752ca0b9c9ef20016 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtstoraged, konflux.additional-tags=17.1.12 17.1_20251118.1, release=1761123044, io.openshift.expose-services=, config_data={'cgroupns': 'host', 'depends_on': ['tripleo_nova_virtlogd_wrapper.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '879500e96bf8dfb93687004bd86f2317'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'pids_limit': 65536, 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 3, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', '/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/var/lib/kolla/config_files/nova_virtstoraged.json:/var/lib/kolla/config_files/config.json:ro']}, description=Red Hat OpenStack Platform 17.1 nova-libvirt, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, url=https://www.redhat.com, batch=17.1_20251118.1, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, config_id=tripleo_step3, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, build-date=2025-11-19T00:35:22Z, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, com.redhat.component=openstack-nova-libvirt-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, version=17.1.12, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, distribution-scope=public, vcs-type=git, 
container_name=nova_virtstoraged, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.buildah.version=1.41.4, vendor=Red Hat, Inc., name=rhosp17/openstack-nova-libvirt, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI) Dec 15 03:10:47 localhost python3[60846]: ansible-tripleo_container_manage PODMAN-CONTAINER-DEBUG: podman run --name nova_virtstoraged --cgroupns=host --conmon-pidfile /run/nova_virtstoraged.pid --detach=True --env KOLLA_CONFIG_STRATEGY=COPY_ALWAYS --env TRIPLEO_CONFIG_HASH=879500e96bf8dfb93687004bd86f2317 --label config_id=tripleo_step3 --label container_name=nova_virtstoraged --label managed_by=tripleo_ansible --label config_data={'cgroupns': 'host', 'depends_on': ['tripleo_nova_virtlogd_wrapper.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '879500e96bf8dfb93687004bd86f2317'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'pids_limit': 65536, 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 3, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', '/etc/selinux/config:/etc/selinux/config:ro', 
'/etc/libvirt:/etc/libvirt:shared', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/var/lib/kolla/config_files/nova_virtstoraged.json:/var/lib/kolla/config_files/config.json:ro']} --log-driver k8s-file --log-opt path=/var/log/containers/stdouts/nova_virtstoraged.log --network host --pid host --pids-limit 65536 --privileged=True --security-opt label=level:s0 --security-opt label=type:spc_t --security-opt label=filetype:container_file_t --ulimit nofile=131072 --ulimit nproc=126960 --volume /etc/hosts:/etc/hosts:ro --volume /etc/localtime:/etc/localtime:ro --volume /etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro --volume /etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro --volume /etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro --volume /etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro --volume /etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro --volume /dev/log:/dev/log --volume /etc/puppet:/etc/puppet:ro --volume /var/log/containers/libvirt:/var/log/libvirt:shared,z --volume /lib/modules:/lib/modules:ro --volume /dev:/dev --volume /run:/run --volume /sys/fs/cgroup:/sys/fs/cgroup --volume /sys/fs/selinux:/sys/fs/selinux --volume /etc/selinux/config:/etc/selinux/config:ro --volume /etc/libvirt:/etc/libvirt:shared --volume /etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro --volume /run/libvirt:/run/libvirt:shared,z --volume /var/lib/nova:/var/lib/nova:shared --volume /var/lib/libvirt:/var/lib/libvirt:shared --volume /var/cache/libvirt:/var/cache/libvirt:shared --volume 
/var/lib/vhost_sockets:/var/lib/vhost_sockets --volume /var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro --volume /var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z --volume /var/lib/kolla/config_files/nova_virtstoraged.json:/var/lib/kolla/config_files/config.json:ro registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1 Dec 15 03:10:47 localhost systemd-logind[763]: Existing logind session ID 28 used by new audit session, ignoring. Dec 15 03:10:47 localhost systemd[1]: Started Session c6 of User root. Dec 15 03:10:47 localhost systemd[1]: session-c6.scope: Deactivated successfully. Dec 15 03:10:47 localhost podman[61830]: 2025-12-15 08:10:47.980055291 +0000 UTC m=+0.088494773 container create 17f8fb766dee5ec6e0cf3c348d53f962ee388bdd6255520267152d5d12168616 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtqemud, batch=17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_data={'cgroupns': 'host', 'depends_on': ['tripleo_nova_virtlogd_wrapper.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '879500e96bf8dfb93687004bd86f2317'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'pids_limit': 65536, 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 4, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', '/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/var/lib/kolla/config_files/nova_virtqemud.json:/var/lib/kolla/config_files/config.json:ro', '/var/log/containers/libvirt/swtpm:/var/log/swtpm:z']}, architecture=x86_64, build-date=2025-11-19T00:35:22Z, name=rhosp17/openstack-nova-libvirt, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, description=Red Hat OpenStack Platform 17.1 nova-libvirt, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, config_id=tripleo_step3, release=1761123044, io.buildah.version=1.41.4, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, container_name=nova_virtqemud, vendor=Red Hat, Inc., io.openshift.expose-services=, managed_by=tripleo_ansible, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, url=https://www.redhat.com, konflux.additional-tags=17.1.12 17.1_20251118.1, 
com.redhat.component=openstack-nova-libvirt-container, version=17.1.12) Dec 15 03:10:48 localhost systemd[1]: Started libpod-conmon-17f8fb766dee5ec6e0cf3c348d53f962ee388bdd6255520267152d5d12168616.scope. Dec 15 03:10:48 localhost podman[61830]: 2025-12-15 08:10:47.937234923 +0000 UTC m=+0.045674425 image pull registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1 Dec 15 03:10:48 localhost systemd[1]: Started libcrun container. Dec 15 03:10:48 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/ca79fd7369f4c32cc46dcf947016bb386f5de0d0f04a624169cf5c85994521b4/merged/etc/libvirt supports timestamps until 2038 (0x7fffffff) Dec 15 03:10:48 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/ca79fd7369f4c32cc46dcf947016bb386f5de0d0f04a624169cf5c85994521b4/merged/var/log/libvirt supports timestamps until 2038 (0x7fffffff) Dec 15 03:10:48 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/ca79fd7369f4c32cc46dcf947016bb386f5de0d0f04a624169cf5c85994521b4/merged/var/cache/libvirt supports timestamps until 2038 (0x7fffffff) Dec 15 03:10:48 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/ca79fd7369f4c32cc46dcf947016bb386f5de0d0f04a624169cf5c85994521b4/merged/var/lib/libvirt supports timestamps until 2038 (0x7fffffff) Dec 15 03:10:48 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/ca79fd7369f4c32cc46dcf947016bb386f5de0d0f04a624169cf5c85994521b4/merged/var/log/swtpm supports timestamps until 2038 (0x7fffffff) Dec 15 03:10:48 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/ca79fd7369f4c32cc46dcf947016bb386f5de0d0f04a624169cf5c85994521b4/merged/var/lib/vhost_sockets supports timestamps until 2038 (0x7fffffff) Dec 15 03:10:48 localhost kernel: xfs filesystem being remounted at 
/var/lib/containers/storage/overlay/ca79fd7369f4c32cc46dcf947016bb386f5de0d0f04a624169cf5c85994521b4/merged/var/lib/nova supports timestamps until 2038 (0x7fffffff) Dec 15 03:10:48 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/ca79fd7369f4c32cc46dcf947016bb386f5de0d0f04a624169cf5c85994521b4/merged/var/lib/kolla/config_files/src-ceph supports timestamps until 2038 (0x7fffffff) Dec 15 03:10:48 localhost podman[61830]: 2025-12-15 08:10:48.055701142 +0000 UTC m=+0.164140624 container init 17f8fb766dee5ec6e0cf3c348d53f962ee388bdd6255520267152d5d12168616 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtqemud, release=1761123044, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, batch=17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, tcib_managed=true, architecture=x86_64, konflux.additional-tags=17.1.12 17.1_20251118.1, config_id=tripleo_step3, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, url=https://www.redhat.com, name=rhosp17/openstack-nova-libvirt, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, version=17.1.12, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, maintainer=OpenStack TripleO Team, distribution-scope=public, build-date=2025-11-19T00:35:22Z, vcs-type=git, description=Red Hat OpenStack Platform 17.1 nova-libvirt, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, io.openshift.expose-services=, io.buildah.version=1.41.4, config_data={'cgroupns': 'host', 'depends_on': ['tripleo_nova_virtlogd_wrapper.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 
'879500e96bf8dfb93687004bd86f2317'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'pids_limit': 65536, 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 4, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', '/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/var/lib/kolla/config_files/nova_virtqemud.json:/var/lib/kolla/config_files/config.json:ro', '/var/log/containers/libvirt/swtpm:/var/log/swtpm:z']}, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, container_name=nova_virtqemud, com.redhat.component=openstack-nova-libvirt-container) Dec 15 03:10:48 localhost podman[61830]: 2025-12-15 08:10:48.064169507 +0000 UTC m=+0.172608979 container start 17f8fb766dee5ec6e0cf3c348d53f962ee388bdd6255520267152d5d12168616 
(image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtqemud, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, batch=17.1_20251118.1, container_name=nova_virtqemud, com.redhat.component=openstack-nova-libvirt-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_data={'cgroupns': 'host', 'depends_on': ['tripleo_nova_virtlogd_wrapper.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '879500e96bf8dfb93687004bd86f2317'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'pids_limit': 65536, 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 4, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', '/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', 
'/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/var/lib/kolla/config_files/nova_virtqemud.json:/var/lib/kolla/config_files/config.json:ro', '/var/log/containers/libvirt/swtpm:/var/log/swtpm:z']}, io.buildah.version=1.41.4, description=Red Hat OpenStack Platform 17.1 nova-libvirt, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, distribution-scope=public, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, maintainer=OpenStack TripleO Team, vcs-type=git, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, vendor=Red Hat, Inc., release=1761123044, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, url=https://www.redhat.com, architecture=x86_64, version=17.1.12, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, config_id=tripleo_step3, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, tcib_managed=true, io.openshift.expose-services=, managed_by=tripleo_ansible, build-date=2025-11-19T00:35:22Z, konflux.additional-tags=17.1.12 17.1_20251118.1, name=rhosp17/openstack-nova-libvirt, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI) Dec 15 03:10:48 localhost python3[60846]: ansible-tripleo_container_manage PODMAN-CONTAINER-DEBUG: podman run --name nova_virtqemud --cgroupns=host --conmon-pidfile /run/nova_virtqemud.pid --detach=True --env KOLLA_CONFIG_STRATEGY=COPY_ALWAYS --env TRIPLEO_CONFIG_HASH=879500e96bf8dfb93687004bd86f2317 --label config_id=tripleo_step3 --label container_name=nova_virtqemud --label managed_by=tripleo_ansible --label config_data={'cgroupns': 'host', 'depends_on': ['tripleo_nova_virtlogd_wrapper.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '879500e96bf8dfb93687004bd86f2317'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'pids_limit': 
65536, 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 4, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', '/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/var/lib/kolla/config_files/nova_virtqemud.json:/var/lib/kolla/config_files/config.json:ro', '/var/log/containers/libvirt/swtpm:/var/log/swtpm:z']} --log-driver k8s-file --log-opt path=/var/log/containers/stdouts/nova_virtqemud.log --network host --pid host --pids-limit 65536 --privileged=True --security-opt label=level:s0 --security-opt label=type:spc_t --security-opt label=filetype:container_file_t --ulimit nofile=131072 --ulimit nproc=126960 --volume /etc/hosts:/etc/hosts:ro --volume /etc/localtime:/etc/localtime:ro --volume /etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro --volume 
/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro --volume /etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro --volume /etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro --volume /etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro --volume /dev/log:/dev/log --volume /etc/puppet:/etc/puppet:ro --volume /var/log/containers/libvirt:/var/log/libvirt:shared,z --volume /lib/modules:/lib/modules:ro --volume /dev:/dev --volume /run:/run --volume /sys/fs/cgroup:/sys/fs/cgroup --volume /sys/fs/selinux:/sys/fs/selinux --volume /etc/selinux/config:/etc/selinux/config:ro --volume /etc/libvirt:/etc/libvirt:shared --volume /etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro --volume /run/libvirt:/run/libvirt:shared,z --volume /var/lib/nova:/var/lib/nova:shared --volume /var/lib/libvirt:/var/lib/libvirt:shared --volume /var/cache/libvirt:/var/cache/libvirt:shared --volume /var/lib/vhost_sockets:/var/lib/vhost_sockets --volume /var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro --volume /var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z --volume /var/lib/kolla/config_files/nova_virtqemud.json:/var/lib/kolla/config_files/config.json:ro --volume /var/log/containers/libvirt/swtpm:/var/log/swtpm:z registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1 Dec 15 03:10:48 localhost systemd-logind[763]: Existing logind session ID 28 used by new audit session, ignoring. Dec 15 03:10:48 localhost systemd[1]: Started Session c7 of User root. Dec 15 03:10:48 localhost systemd[1]: session-c7.scope: Deactivated successfully. 
Dec 15 03:10:48 localhost podman[61934]: 2025-12-15 08:10:48.588418251 +0000 UTC m=+0.090481937 container create ddca038c8593b6a3bb2a52c20e9a406ff5bb4e8cd16a7b42ec23df21f0897120 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtproxyd, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, version=17.1.12, name=rhosp17/openstack-nova-libvirt, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, io.openshift.expose-services=, vendor=Red Hat, Inc., vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, url=https://www.redhat.com, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, release=1761123044, container_name=nova_virtproxyd, description=Red Hat OpenStack Platform 17.1 nova-libvirt, com.redhat.component=openstack-nova-libvirt-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.4, architecture=x86_64, config_data={'cgroupns': 'host', 'depends_on': ['tripleo_nova_virtlogd_wrapper.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '879500e96bf8dfb93687004bd86f2317'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'pids_limit': 65536, 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 5, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', '/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/var/lib/kolla/config_files/nova_virtproxyd.json:/var/lib/kolla/config_files/config.json:ro']}, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, batch=17.1_20251118.1, distribution-scope=public, config_id=tripleo_step3, build-date=2025-11-19T00:35:22Z, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt) Dec 15 03:10:48 localhost podman[61934]: 2025-12-15 08:10:48.538463533 +0000 UTC m=+0.040527279 image pull registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1 Dec 15 03:10:48 localhost systemd[1]: Started libpod-conmon-ddca038c8593b6a3bb2a52c20e9a406ff5bb4e8cd16a7b42ec23df21f0897120.scope. Dec 15 03:10:48 localhost systemd[1]: Started libcrun container. 
Dec 15 03:10:48 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/815b56c139a3f32508088ca9c999727b748310f25f30262c114b54984dd7dcbf/merged/etc/libvirt supports timestamps until 2038 (0x7fffffff) Dec 15 03:10:48 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/815b56c139a3f32508088ca9c999727b748310f25f30262c114b54984dd7dcbf/merged/var/lib/nova supports timestamps until 2038 (0x7fffffff) Dec 15 03:10:48 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/815b56c139a3f32508088ca9c999727b748310f25f30262c114b54984dd7dcbf/merged/var/cache/libvirt supports timestamps until 2038 (0x7fffffff) Dec 15 03:10:48 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/815b56c139a3f32508088ca9c999727b748310f25f30262c114b54984dd7dcbf/merged/var/lib/libvirt supports timestamps until 2038 (0x7fffffff) Dec 15 03:10:48 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/815b56c139a3f32508088ca9c999727b748310f25f30262c114b54984dd7dcbf/merged/var/log/libvirt supports timestamps until 2038 (0x7fffffff) Dec 15 03:10:48 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/815b56c139a3f32508088ca9c999727b748310f25f30262c114b54984dd7dcbf/merged/var/lib/vhost_sockets supports timestamps until 2038 (0x7fffffff) Dec 15 03:10:48 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/815b56c139a3f32508088ca9c999727b748310f25f30262c114b54984dd7dcbf/merged/var/lib/kolla/config_files/src-ceph supports timestamps until 2038 (0x7fffffff) Dec 15 03:10:48 localhost podman[61934]: 2025-12-15 08:10:48.672365042 +0000 UTC m=+0.174428728 container init ddca038c8593b6a3bb2a52c20e9a406ff5bb4e8cd16a7b42ec23df21f0897120 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtproxyd, konflux.additional-tags=17.1.12 17.1_20251118.1, 
version=17.1.12, name=rhosp17/openstack-nova-libvirt, build-date=2025-11-19T00:35:22Z, io.openshift.expose-services=, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, description=Red Hat OpenStack Platform 17.1 nova-libvirt, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, architecture=x86_64, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, container_name=nova_virtproxyd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20251118.1, config_data={'cgroupns': 'host', 'depends_on': ['tripleo_nova_virtlogd_wrapper.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '879500e96bf8dfb93687004bd86f2317'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'pids_limit': 65536, 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 5, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', '/etc/selinux/config:/etc/selinux/config:ro', 
'/etc/libvirt:/etc/libvirt:shared', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/var/lib/kolla/config_files/nova_virtproxyd.json:/var/lib/kolla/config_files/config.json:ro']}, distribution-scope=public, release=1761123044, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.component=openstack-nova-libvirt-container, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, config_id=tripleo_step3, url=https://www.redhat.com, tcib_managed=true) Dec 15 03:10:48 localhost podman[61934]: 2025-12-15 08:10:48.679644025 +0000 UTC m=+0.181707721 container start ddca038c8593b6a3bb2a52c20e9a406ff5bb4e8cd16a7b42ec23df21f0897120 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtproxyd, architecture=x86_64, batch=17.1_20251118.1, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, release=1761123044, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, io.buildah.version=1.41.4, com.redhat.component=openstack-nova-libvirt-container, name=rhosp17/openstack-nova-libvirt, version=17.1.12, container_name=nova_virtproxyd, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, vendor=Red Hat, Inc., build-date=2025-11-19T00:35:22Z, distribution-scope=public, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 nova-libvirt, io.k8s.description=Red Hat 
OpenStack Platform 17.1 nova-libvirt, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.expose-services=, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, vcs-type=git, config_data={'cgroupns': 'host', 'depends_on': ['tripleo_nova_virtlogd_wrapper.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '879500e96bf8dfb93687004bd86f2317'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'pids_limit': 65536, 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 5, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', '/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', 
'/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/var/lib/kolla/config_files/nova_virtproxyd.json:/var/lib/kolla/config_files/config.json:ro']}, maintainer=OpenStack TripleO Team, config_id=tripleo_step3, konflux.additional-tags=17.1.12 17.1_20251118.1) Dec 15 03:10:48 localhost python3[60846]: ansible-tripleo_container_manage PODMAN-CONTAINER-DEBUG: podman run --name nova_virtproxyd --cgroupns=host --conmon-pidfile /run/nova_virtproxyd.pid --detach=True --env KOLLA_CONFIG_STRATEGY=COPY_ALWAYS --env TRIPLEO_CONFIG_HASH=879500e96bf8dfb93687004bd86f2317 --label config_id=tripleo_step3 --label container_name=nova_virtproxyd --label managed_by=tripleo_ansible --label config_data={'cgroupns': 'host', 'depends_on': ['tripleo_nova_virtlogd_wrapper.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '879500e96bf8dfb93687004bd86f2317'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'pids_limit': 65536, 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 5, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', 
'/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/var/lib/kolla/config_files/nova_virtproxyd.json:/var/lib/kolla/config_files/config.json:ro']} --log-driver k8s-file --log-opt path=/var/log/containers/stdouts/nova_virtproxyd.log --network host --pid host --pids-limit 65536 --privileged=True --security-opt label=level:s0 --security-opt label=type:spc_t --security-opt label=filetype:container_file_t --ulimit nofile=131072 --ulimit nproc=126960 --volume /etc/hosts:/etc/hosts:ro --volume /etc/localtime:/etc/localtime:ro --volume /etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro --volume /etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro --volume /etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro --volume /etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro --volume /etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro --volume /dev/log:/dev/log --volume /etc/puppet:/etc/puppet:ro --volume /var/log/containers/libvirt:/var/log/libvirt:shared,z --volume /lib/modules:/lib/modules:ro --volume /dev:/dev --volume /run:/run --volume /sys/fs/cgroup:/sys/fs/cgroup --volume /sys/fs/selinux:/sys/fs/selinux --volume /etc/selinux/config:/etc/selinux/config:ro --volume /etc/libvirt:/etc/libvirt:shared --volume /etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro --volume /run/libvirt:/run/libvirt:shared,z --volume /var/lib/nova:/var/lib/nova:shared --volume /var/lib/libvirt:/var/lib/libvirt:shared --volume /var/cache/libvirt:/var/cache/libvirt:shared --volume 
/var/lib/vhost_sockets:/var/lib/vhost_sockets --volume /var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro --volume /var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z --volume /var/lib/kolla/config_files/nova_virtproxyd.json:/var/lib/kolla/config_files/config.json:ro registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1 Dec 15 03:10:48 localhost systemd-logind[763]: Existing logind session ID 28 used by new audit session, ignoring. Dec 15 03:10:48 localhost systemd[1]: Started Session c8 of User root. Dec 15 03:10:48 localhost systemd[1]: session-c8.scope: Deactivated successfully. Dec 15 03:10:49 localhost python3[62016]: ansible-file Invoked with path=/etc/systemd/system/tripleo_collectd.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Dec 15 03:10:49 localhost python3[62033]: ansible-file Invoked with path=/etc/systemd/system/tripleo_iscsid.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Dec 15 03:10:49 localhost python3[62049]: ansible-file Invoked with path=/etc/systemd/system/tripleo_nova_virtlogd_wrapper.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None 
Dec 15 03:10:50 localhost python3[62065]: ansible-file Invoked with path=/etc/systemd/system/tripleo_nova_virtnodedevd.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Dec 15 03:10:50 localhost python3[62081]: ansible-file Invoked with path=/etc/systemd/system/tripleo_nova_virtproxyd.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Dec 15 03:10:50 localhost python3[62097]: ansible-file Invoked with path=/etc/systemd/system/tripleo_nova_virtqemud.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Dec 15 03:10:50 localhost python3[62113]: ansible-file Invoked with path=/etc/systemd/system/tripleo_nova_virtsecretd.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Dec 15 03:10:51 localhost python3[62129]: ansible-file Invoked with path=/etc/systemd/system/tripleo_nova_virtstoraged.requires state=absent recurse=False force=False 
follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Dec 15 03:10:51 localhost python3[62145]: ansible-file Invoked with path=/etc/systemd/system/tripleo_rsyslog.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Dec 15 03:10:51 localhost python3[62161]: ansible-stat Invoked with path=/etc/systemd/system/tripleo_collectd_healthcheck.timer follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1 Dec 15 03:10:51 localhost python3[62177]: ansible-stat Invoked with path=/etc/systemd/system/tripleo_iscsid_healthcheck.timer follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1 Dec 15 03:10:52 localhost python3[62193]: ansible-stat Invoked with path=/etc/systemd/system/tripleo_nova_virtlogd_wrapper_healthcheck.timer follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1 Dec 15 03:10:52 localhost python3[62209]: ansible-stat Invoked with path=/etc/systemd/system/tripleo_nova_virtnodedevd_healthcheck.timer follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1 Dec 15 03:10:52 localhost python3[62225]: ansible-stat Invoked with path=/etc/systemd/system/tripleo_nova_virtproxyd_healthcheck.timer follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1 Dec 15 03:10:52 localhost python3[62241]: ansible-stat Invoked with 
path=/etc/systemd/system/tripleo_nova_virtqemud_healthcheck.timer follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1 Dec 15 03:10:53 localhost python3[62257]: ansible-stat Invoked with path=/etc/systemd/system/tripleo_nova_virtsecretd_healthcheck.timer follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1 Dec 15 03:10:53 localhost python3[62273]: ansible-stat Invoked with path=/etc/systemd/system/tripleo_nova_virtstoraged_healthcheck.timer follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1 Dec 15 03:10:53 localhost python3[62289]: ansible-stat Invoked with path=/etc/systemd/system/tripleo_rsyslog_healthcheck.timer follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1 Dec 15 03:10:54 localhost python3[62350]: ansible-copy Invoked with src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1765786253.8469589-99578-11481081201715/source dest=/etc/systemd/system/tripleo_collectd.service mode=0644 owner=root group=root backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None Dec 15 03:10:54 localhost python3[62379]: ansible-copy Invoked with src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1765786253.8469589-99578-11481081201715/source dest=/etc/systemd/system/tripleo_iscsid.service mode=0644 owner=root group=root backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None Dec 15 03:10:55 localhost python3[62408]: ansible-copy Invoked with 
src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1765786253.8469589-99578-11481081201715/source dest=/etc/systemd/system/tripleo_nova_virtlogd_wrapper.service mode=0644 owner=root group=root backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None Dec 15 03:10:55 localhost python3[62437]: ansible-copy Invoked with src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1765786253.8469589-99578-11481081201715/source dest=/etc/systemd/system/tripleo_nova_virtnodedevd.service mode=0644 owner=root group=root backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None Dec 15 03:10:56 localhost python3[62466]: ansible-copy Invoked with src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1765786253.8469589-99578-11481081201715/source dest=/etc/systemd/system/tripleo_nova_virtproxyd.service mode=0644 owner=root group=root backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None Dec 15 03:10:56 localhost python3[62495]: ansible-copy Invoked with src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1765786253.8469589-99578-11481081201715/source dest=/etc/systemd/system/tripleo_nova_virtqemud.service mode=0644 owner=root group=root backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None Dec 15 03:10:57 
localhost python3[62524]: ansible-copy Invoked with src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1765786253.8469589-99578-11481081201715/source dest=/etc/systemd/system/tripleo_nova_virtsecretd.service mode=0644 owner=root group=root backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None Dec 15 03:10:58 localhost python3[62553]: ansible-copy Invoked with src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1765786253.8469589-99578-11481081201715/source dest=/etc/systemd/system/tripleo_nova_virtstoraged.service mode=0644 owner=root group=root backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None Dec 15 03:10:58 localhost python3[62582]: ansible-copy Invoked with src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1765786253.8469589-99578-11481081201715/source dest=/etc/systemd/system/tripleo_rsyslog.service mode=0644 owner=root group=root backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None Dec 15 03:10:58 localhost systemd[1]: Stopping User Manager for UID 0... Dec 15 03:10:58 localhost systemd[61154]: Activating special unit Exit the Session... Dec 15 03:10:58 localhost systemd[61154]: Stopped target Main User Target. Dec 15 03:10:58 localhost systemd[61154]: Stopped target Basic System. Dec 15 03:10:58 localhost systemd[61154]: Stopped target Paths. Dec 15 03:10:58 localhost systemd[61154]: Stopped target Sockets. 
Dec 15 03:10:58 localhost systemd[61154]: Stopped target Timers. Dec 15 03:10:58 localhost systemd[61154]: Stopped Daily Cleanup of User's Temporary Directories. Dec 15 03:10:58 localhost systemd[61154]: Closed D-Bus User Message Bus Socket. Dec 15 03:10:58 localhost systemd[61154]: Stopped Create User's Volatile Files and Directories. Dec 15 03:10:58 localhost systemd[61154]: Removed slice User Application Slice. Dec 15 03:10:58 localhost systemd[61154]: Reached target Shutdown. Dec 15 03:10:58 localhost systemd[61154]: Finished Exit the Session. Dec 15 03:10:58 localhost systemd[61154]: Reached target Exit the Session. Dec 15 03:10:58 localhost systemd[1]: user@0.service: Deactivated successfully. Dec 15 03:10:58 localhost systemd[1]: Stopped User Manager for UID 0. Dec 15 03:10:58 localhost systemd[1]: Stopping User Runtime Directory /run/user/0... Dec 15 03:10:58 localhost systemd[1]: run-user-0.mount: Deactivated successfully. Dec 15 03:10:58 localhost systemd[1]: user-runtime-dir@0.service: Deactivated successfully. Dec 15 03:10:58 localhost systemd[1]: Stopped User Runtime Directory /run/user/0. Dec 15 03:10:58 localhost systemd[1]: Removed slice User Slice of UID 0. Dec 15 03:10:59 localhost python3[62599]: ansible-systemd Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None Dec 15 03:10:59 localhost systemd[1]: Reloading. Dec 15 03:10:59 localhost systemd-sysv-generator[62631]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. Dec 15 03:10:59 localhost systemd-rc-local-generator[62625]: /etc/rc.d/rc.local is not marked executable, skipping. Dec 15 03:10:59 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. 
Support for MemoryLimit= will be removed soon. Dec 15 03:11:00 localhost python3[62653]: ansible-systemd Invoked with state=restarted name=tripleo_collectd.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None Dec 15 03:11:00 localhost systemd[1]: Reloading. Dec 15 03:11:00 localhost systemd-rc-local-generator[62679]: /etc/rc.d/rc.local is not marked executable, skipping. Dec 15 03:11:00 localhost systemd-sysv-generator[62683]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. Dec 15 03:11:00 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Dec 15 03:11:00 localhost systemd[1]: Started /usr/bin/podman healthcheck run 6278d648aec9c9f12e0efaafa0d6ae5653eeb34459516699e562eeb9c8d23b3c. Dec 15 03:11:00 localhost systemd[1]: Starting collectd container... 
Dec 15 03:11:01 localhost podman[62693]: 2025-12-15 08:11:01.106163922 +0000 UTC m=+0.106081241 container health_status 6278d648aec9c9f12e0efaafa0d6ae5653eeb34459516699e562eeb9c8d23b3c (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, managed_by=tripleo_ansible, url=https://www.redhat.com, io.openshift.expose-services=, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.buildah.version=1.41.4, version=17.1.12, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, batch=17.1_20251118.1, container_name=metrics_qdr, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, tcib_managed=true, com.redhat.component=openstack-qdrouterd-container, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, distribution-scope=public, name=rhosp17/openstack-qdrouterd, konflux.additional-tags=17.1.12 17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 qdrouterd, release=1761123044, architecture=x86_64, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '29d07da103bbd44a9ed3e29999314b03'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, build-date=2025-11-18T22:49:46Z, config_id=tripleo_step1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, summary=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-type=git) Dec 15 03:11:01 localhost systemd[1]: Started collectd container. Dec 15 03:11:01 localhost podman[62693]: 2025-12-15 08:11:01.384357355 +0000 UTC m=+0.384274694 container exec_died 6278d648aec9c9f12e0efaafa0d6ae5653eeb34459516699e562eeb9c8d23b3c (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, tcib_managed=true, vcs-type=git, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '29d07da103bbd44a9ed3e29999314b03'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, summary=Red Hat OpenStack Platform 17.1 qdrouterd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, name=rhosp17/openstack-qdrouterd, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, release=1761123044, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, batch=17.1_20251118.1, container_name=metrics_qdr, io.buildah.version=1.41.4, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, build-date=2025-11-18T22:49:46Z, config_id=tripleo_step1, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.component=openstack-qdrouterd-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, distribution-scope=public, architecture=x86_64, version=17.1.12, konflux.additional-tags=17.1.12 17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 qdrouterd) Dec 15 03:11:01 localhost systemd[1]: 6278d648aec9c9f12e0efaafa0d6ae5653eeb34459516699e562eeb9c8d23b3c.service: Deactivated successfully. Dec 15 03:11:01 localhost python3[62749]: ansible-systemd Invoked with state=restarted name=tripleo_iscsid.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None Dec 15 03:11:01 localhost systemd[1]: Reloading. Dec 15 03:11:01 localhost systemd-sysv-generator[62780]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. 
Dec 15 03:11:01 localhost systemd-rc-local-generator[62775]: /etc/rc.d/rc.local is not marked executable, skipping. Dec 15 03:11:02 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Dec 15 03:11:02 localhost systemd[1]: Starting iscsid container... Dec 15 03:11:02 localhost systemd[1]: Started iscsid container. Dec 15 03:11:02 localhost python3[62816]: ansible-systemd Invoked with state=restarted name=tripleo_nova_virtlogd_wrapper.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None Dec 15 03:11:02 localhost systemd[1]: Reloading. Dec 15 03:11:03 localhost systemd-rc-local-generator[62843]: /etc/rc.d/rc.local is not marked executable, skipping. Dec 15 03:11:03 localhost systemd-sysv-generator[62846]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. Dec 15 03:11:03 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Dec 15 03:11:03 localhost systemd[1]: Starting nova_virtlogd_wrapper container... Dec 15 03:11:03 localhost systemd[1]: Started nova_virtlogd_wrapper container. Dec 15 03:11:03 localhost python3[62884]: ansible-systemd Invoked with state=restarted name=tripleo_nova_virtnodedevd.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None Dec 15 03:11:04 localhost systemd[1]: Reloading. Dec 15 03:11:04 localhost systemd-sysv-generator[62915]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. 
Please update package to include a native systemd unit file, in order to make it more safe and robust. Dec 15 03:11:04 localhost systemd-rc-local-generator[62909]: /etc/rc.d/rc.local is not marked executable, skipping. Dec 15 03:11:04 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Dec 15 03:11:04 localhost systemd[1]: Starting nova_virtnodedevd container... Dec 15 03:11:04 localhost tripleo-start-podman-container[62924]: Creating additional drop-in dependency for "nova_virtnodedevd" (defb2439fc676fe463bf8d7eb18e0214dacf68a5280036dda3b72a4334bffc50) Dec 15 03:11:04 localhost systemd[1]: Reloading. Dec 15 03:11:04 localhost systemd-sysv-generator[62988]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. Dec 15 03:11:04 localhost systemd-rc-local-generator[62985]: /etc/rc.d/rc.local is not marked executable, skipping. Dec 15 03:11:04 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Dec 15 03:11:04 localhost systemd[1]: Started nova_virtnodedevd container. Dec 15 03:11:05 localhost python3[63009]: ansible-systemd Invoked with state=restarted name=tripleo_nova_virtproxyd.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None Dec 15 03:11:05 localhost systemd[1]: Reloading. Dec 15 03:11:05 localhost systemd-sysv-generator[63041]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. 
Dec 15 03:11:05 localhost systemd-rc-local-generator[63038]: /etc/rc.d/rc.local is not marked executable, skipping. Dec 15 03:11:05 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Dec 15 03:11:06 localhost systemd[1]: Starting nova_virtproxyd container... Dec 15 03:11:06 localhost tripleo-start-podman-container[63048]: Creating additional drop-in dependency for "nova_virtproxyd" (ddca038c8593b6a3bb2a52c20e9a406ff5bb4e8cd16a7b42ec23df21f0897120) Dec 15 03:11:06 localhost systemd[1]: Reloading. Dec 15 03:11:06 localhost systemd-rc-local-generator[63104]: /etc/rc.d/rc.local is not marked executable, skipping. Dec 15 03:11:06 localhost systemd-sysv-generator[63109]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. Dec 15 03:11:06 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Dec 15 03:11:06 localhost systemd[1]: Started nova_virtproxyd container. Dec 15 03:11:07 localhost python3[63131]: ansible-systemd Invoked with state=restarted name=tripleo_nova_virtqemud.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None Dec 15 03:11:08 localhost systemd[1]: Reloading. Dec 15 03:11:08 localhost systemd-rc-local-generator[63161]: /etc/rc.d/rc.local is not marked executable, skipping. Dec 15 03:11:08 localhost systemd-sysv-generator[63164]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. 
Dec 15 03:11:08 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Dec 15 03:11:08 localhost systemd[1]: Starting dnf makecache... Dec 15 03:11:08 localhost systemd[1]: Starting nova_virtqemud container... Dec 15 03:11:08 localhost tripleo-start-podman-container[63172]: Creating additional drop-in dependency for "nova_virtqemud" (17f8fb766dee5ec6e0cf3c348d53f962ee388bdd6255520267152d5d12168616) Dec 15 03:11:08 localhost systemd[1]: Reloading. Dec 15 03:11:08 localhost dnf[63171]: Updating Subscription Management repositories. Dec 15 03:11:08 localhost systemd-rc-local-generator[63226]: /etc/rc.d/rc.local is not marked executable, skipping. Dec 15 03:11:08 localhost systemd-sysv-generator[63229]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. Dec 15 03:11:08 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Dec 15 03:11:09 localhost systemd[1]: Started nova_virtqemud container. Dec 15 03:11:09 localhost python3[63255]: ansible-systemd Invoked with state=restarted name=tripleo_nova_virtsecretd.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None Dec 15 03:11:09 localhost systemd[1]: Reloading. Dec 15 03:11:09 localhost systemd-rc-local-generator[63279]: /etc/rc.d/rc.local is not marked executable, skipping. Dec 15 03:11:09 localhost systemd-sysv-generator[63287]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. 
Please update package to include a native systemd unit file, in order to make it more safe and robust. Dec 15 03:11:09 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Dec 15 03:11:10 localhost systemd[1]: Starting nova_virtsecretd container... Dec 15 03:11:10 localhost tripleo-start-podman-container[63295]: Creating additional drop-in dependency for "nova_virtsecretd" (79c86d80f06802ead502598f2b91dbc5585606176da16f7812de83462116f050) Dec 15 03:11:10 localhost systemd[1]: Reloading. Dec 15 03:11:10 localhost systemd-rc-local-generator[63356]: /etc/rc.d/rc.local is not marked executable, skipping. Dec 15 03:11:10 localhost systemd-sysv-generator[63359]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. Dec 15 03:11:10 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Dec 15 03:11:10 localhost systemd[1]: Started nova_virtsecretd container. Dec 15 03:11:10 localhost dnf[63171]: Metadata cache refreshed recently. Dec 15 03:11:10 localhost systemd[1]: dnf-makecache.service: Deactivated successfully. Dec 15 03:11:10 localhost systemd[1]: Finished dnf makecache. Dec 15 03:11:10 localhost systemd[1]: dnf-makecache.service: Consumed 2.227s CPU time. Dec 15 03:11:11 localhost python3[63380]: ansible-systemd Invoked with state=restarted name=tripleo_nova_virtstoraged.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None Dec 15 03:11:11 localhost systemd[1]: Reloading. 
Dec 15 03:11:11 localhost systemd-sysv-generator[63410]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. Dec 15 03:11:11 localhost systemd-rc-local-generator[63407]: /etc/rc.d/rc.local is not marked executable, skipping. Dec 15 03:11:11 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Dec 15 03:11:11 localhost systemd[1]: Starting nova_virtstoraged container... Dec 15 03:11:11 localhost tripleo-start-podman-container[63420]: Creating additional drop-in dependency for "nova_virtstoraged" (68297132a91cfc5a6c8079435c9fe1de3a122b558a334b9752ca0b9c9ef20016) Dec 15 03:11:11 localhost systemd[1]: Reloading. Dec 15 03:11:11 localhost systemd-sysv-generator[63478]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. Dec 15 03:11:11 localhost systemd-rc-local-generator[63473]: /etc/rc.d/rc.local is not marked executable, skipping. Dec 15 03:11:11 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Dec 15 03:11:12 localhost systemd[1]: Started nova_virtstoraged container. Dec 15 03:11:12 localhost python3[63503]: ansible-systemd Invoked with state=restarted name=tripleo_rsyslog.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None Dec 15 03:11:12 localhost systemd[1]: Reloading. Dec 15 03:11:13 localhost systemd-rc-local-generator[63530]: /etc/rc.d/rc.local is not marked executable, skipping. 
Dec 15 03:11:13 localhost systemd-sysv-generator[63535]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. Dec 15 03:11:13 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Dec 15 03:11:13 localhost systemd[1]: Starting rsyslog container... Dec 15 03:11:13 localhost systemd[1]: Started libcrun container. Dec 15 03:11:13 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/5f937d9d464184dc9258be95e010d108efe08788960fe016cf6a07726f0a4d55/merged/var/lib/rsyslog supports timestamps until 2038 (0x7fffffff) Dec 15 03:11:13 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/5f937d9d464184dc9258be95e010d108efe08788960fe016cf6a07726f0a4d55/merged/var/log/rsyslog supports timestamps until 2038 (0x7fffffff) Dec 15 03:11:13 localhost podman[63543]: 2025-12-15 08:11:13.434111558 +0000 UTC m=+0.134929947 container init 0e38c5f94ae7971c602d36252a3143b35e3d7e9f0725bde65bf045185c61e4e9 (image=registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1, name=rsyslog, vcs-type=git, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 rsyslog, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20251118.1, io.openshift.expose-services=, container_name=rsyslog, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-rsyslog, io.buildah.version=1.41.4, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '605429f322a7b034ef9794ac46c40b29'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1', 'net': 'host', 'privileged': True, 'restart': 
'always', 'security_opt': ['label=disable'], 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/rsyslog.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/rsyslog:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:ro', '/var/log/containers/rsyslog:/var/log/rsyslog:rw,z', '/var/log:/var/log/host:ro', '/var/lib/rsyslog.container:/var/lib/rsyslog:rw,z']}, url=https://www.redhat.com, com.redhat.component=openstack-rsyslog-container, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 rsyslog, version=17.1.12, konflux.additional-tags=17.1.12 17.1_20251118.1, name=rhosp17/openstack-rsyslog, distribution-scope=public, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, release=1761123044, summary=Red Hat OpenStack Platform 17.1 rsyslog, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, description=Red Hat OpenStack Platform 17.1 rsyslog, config_id=tripleo_step3, build-date=2025-11-18T22:49:49Z, vendor=Red Hat, Inc., tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05) Dec 15 03:11:13 localhost podman[63543]: 2025-12-15 08:11:13.449738583 +0000 UTC m=+0.150556962 container start 0e38c5f94ae7971c602d36252a3143b35e3d7e9f0725bde65bf045185c61e4e9 (image=registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1, name=rsyslog, description=Red Hat OpenStack Platform 17.1 rsyslog, 
io.k8s.display-name=Red Hat OpenStack Platform 17.1 rsyslog, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, name=rhosp17/openstack-rsyslog, maintainer=OpenStack TripleO Team, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, konflux.additional-tags=17.1.12 17.1_20251118.1, distribution-scope=public, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '605429f322a7b034ef9794ac46c40b29'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'security_opt': ['label=disable'], 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/rsyslog.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/rsyslog:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:ro', '/var/log/containers/rsyslog:/var/log/rsyslog:rw,z', '/var/log:/var/log/host:ro', '/var/lib/rsyslog.container:/var/lib/rsyslog:rw,z']}, container_name=rsyslog, tcib_managed=true, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-rsyslog, io.buildah.version=1.41.4, release=1761123044, com.redhat.component=openstack-rsyslog-container, build-date=2025-11-18T22:49:49Z, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, url=https://www.redhat.com, vcs-type=git, vendor=Red Hat, Inc., config_id=tripleo_step3, managed_by=tripleo_ansible, io.openshift.expose-services=, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1, version=17.1.12, summary=Red Hat OpenStack Platform 17.1 rsyslog, io.k8s.description=Red Hat OpenStack Platform 17.1 rsyslog) Dec 15 03:11:13 localhost podman[63543]: rsyslog Dec 15 03:11:13 localhost systemd[1]: Started rsyslog container. Dec 15 03:11:13 localhost systemd[1]: libpod-0e38c5f94ae7971c602d36252a3143b35e3d7e9f0725bde65bf045185c61e4e9.scope: Deactivated successfully. Dec 15 03:11:13 localhost podman[63580]: 2025-12-15 08:11:13.593869344 +0000 UTC m=+0.035439433 container died 0e38c5f94ae7971c602d36252a3143b35e3d7e9f0725bde65bf045185c61e4e9 (image=registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1, name=rsyslog, description=Red Hat OpenStack Platform 17.1 rsyslog, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.component=openstack-rsyslog-container, release=1761123044, summary=Red Hat OpenStack Platform 17.1 rsyslog, distribution-scope=public, io.buildah.version=1.41.4, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '605429f322a7b034ef9794ac46c40b29'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'security_opt': ['label=disable'], 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/rsyslog.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/rsyslog:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:ro', '/var/log/containers/rsyslog:/var/log/rsyslog:rw,z', '/var/log:/var/log/host:ro', '/var/lib/rsyslog.container:/var/lib/rsyslog:rw,z']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-rsyslog, konflux.additional-tags=17.1.12 17.1_20251118.1, container_name=rsyslog, name=rhosp17/openstack-rsyslog, url=https://www.redhat.com, vendor=Red Hat, Inc., io.openshift.expose-services=, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, version=17.1.12, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 rsyslog, config_id=tripleo_step3, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, build-date=2025-11-18T22:49:49Z, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 rsyslog, batch=17.1_20251118.1, tcib_managed=true) Dec 15 03:11:13 localhost podman[63580]: 2025-12-15 08:11:13.618295284 +0000 UTC m=+0.059865363 container cleanup 0e38c5f94ae7971c602d36252a3143b35e3d7e9f0725bde65bf045185c61e4e9 (image=registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1, name=rsyslog, konflux.additional-tags=17.1.12 17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-rsyslog-container, maintainer=OpenStack TripleO Team, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, version=17.1.12, description=Red Hat OpenStack Platform 17.1 rsyslog, io.openshift.expose-services=, url=https://www.redhat.com, container_name=rsyslog, io.k8s.description=Red Hat 
OpenStack Platform 17.1 rsyslog, name=rhosp17/openstack-rsyslog, release=1761123044, vcs-type=git, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '605429f322a7b034ef9794ac46c40b29'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'security_opt': ['label=disable'], 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/rsyslog.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/rsyslog:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:ro', '/var/log/containers/rsyslog:/var/log/rsyslog:rw,z', '/var/log:/var/log/host:ro', '/var/lib/rsyslog.container:/var/lib/rsyslog:rw,z']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 rsyslog, io.buildah.version=1.41.4, batch=17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-rsyslog, config_id=tripleo_step3, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, distribution-scope=public, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, managed_by=tripleo_ansible, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 rsyslog, vendor=Red Hat, Inc., build-date=2025-11-18T22:49:49Z) Dec 15 03:11:13 localhost systemd[1]: tripleo_rsyslog.service: Main process exited, code=exited, status=1/FAILURE Dec 15 03:11:13 localhost podman[63595]: 2025-12-15 08:11:13.708965053 +0000 UTC 
m=+0.057845289 container cleanup 0e38c5f94ae7971c602d36252a3143b35e3d7e9f0725bde65bf045185c61e4e9 (image=registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1, name=rsyslog, container_name=rsyslog, name=rhosp17/openstack-rsyslog, batch=17.1_20251118.1, com.redhat.component=openstack-rsyslog-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.description=Red Hat OpenStack Platform 17.1 rsyslog, managed_by=tripleo_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '605429f322a7b034ef9794ac46c40b29'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'security_opt': ['label=disable'], 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/rsyslog.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/rsyslog:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:ro', '/var/log/containers/rsyslog:/var/log/rsyslog:rw,z', '/var/log:/var/log/host:ro', '/var/lib/rsyslog.container:/var/lib/rsyslog:rw,z']}, summary=Red Hat OpenStack Platform 17.1 rsyslog, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 rsyslog, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-type=git, config_id=tripleo_step3, distribution-scope=public, release=1761123044, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, 
vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, build-date=2025-11-18T22:49:49Z, io.openshift.expose-services=, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-rsyslog, architecture=x86_64, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 rsyslog, tcib_managed=true, io.buildah.version=1.41.4, version=17.1.12) Dec 15 03:11:13 localhost podman[63595]: rsyslog Dec 15 03:11:13 localhost systemd[1]: tripleo_rsyslog.service: Failed with result 'exit-code'. Dec 15 03:11:13 localhost systemd[1]: tripleo_rsyslog.service: Scheduled restart job, restart counter is at 1. Dec 15 03:11:13 localhost systemd[1]: Stopped rsyslog container. Dec 15 03:11:13 localhost systemd[1]: Starting rsyslog container... Dec 15 03:11:13 localhost python3[63621]: ansible-file Invoked with path=/var/lib/container-puppet/container-puppet-tasks3.json state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Dec 15 03:11:14 localhost systemd[1]: Started libcrun container. 
Dec 15 03:11:14 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/5f937d9d464184dc9258be95e010d108efe08788960fe016cf6a07726f0a4d55/merged/var/lib/rsyslog supports timestamps until 2038 (0x7fffffff) Dec 15 03:11:14 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/5f937d9d464184dc9258be95e010d108efe08788960fe016cf6a07726f0a4d55/merged/var/log/rsyslog supports timestamps until 2038 (0x7fffffff) Dec 15 03:11:14 localhost podman[63622]: 2025-12-15 08:11:14.025491536 +0000 UTC m=+0.098313424 container init 0e38c5f94ae7971c602d36252a3143b35e3d7e9f0725bde65bf045185c61e4e9 (image=registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1, name=rsyslog, batch=17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-rsyslog, architecture=x86_64, name=rhosp17/openstack-rsyslog, version=17.1.12, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, summary=Red Hat OpenStack Platform 17.1 rsyslog, maintainer=OpenStack TripleO Team, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, description=Red Hat OpenStack Platform 17.1 rsyslog, konflux.additional-tags=17.1.12 17.1_20251118.1, vendor=Red Hat, Inc., container_name=rsyslog, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-11-18T22:49:49Z, com.redhat.component=openstack-rsyslog-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 rsyslog, tcib_managed=true, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.buildah.version=1.41.4, managed_by=tripleo_ansible, url=https://www.redhat.com, io.openshift.expose-services=, release=1761123044, io.k8s.description=Red Hat OpenStack Platform 17.1 rsyslog, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '605429f322a7b034ef9794ac46c40b29'}, 'image': 
'registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'security_opt': ['label=disable'], 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/rsyslog.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/rsyslog:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:ro', '/var/log/containers/rsyslog:/var/log/rsyslog:rw,z', '/var/log:/var/log/host:ro', '/var/lib/rsyslog.container:/var/lib/rsyslog:rw,z']}, distribution-scope=public, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_id=tripleo_step3) Dec 15 03:11:14 localhost podman[63622]: 2025-12-15 08:11:14.036787266 +0000 UTC m=+0.109609154 container start 0e38c5f94ae7971c602d36252a3143b35e3d7e9f0725bde65bf045185c61e4e9 (image=registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1, name=rsyslog, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vcs-type=git, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, version=17.1.12, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, distribution-scope=public, io.buildah.version=1.41.4, name=rhosp17/openstack-rsyslog, release=1761123044, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 rsyslog, com.redhat.component=openstack-rsyslog-container, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 rsyslog, config_id=tripleo_step3, io.k8s.description=Red Hat OpenStack Platform 17.1 rsyslog, io.openshift.tags=rhosp osp 
openstack osp-17.1 openstack-rsyslog, konflux.additional-tags=17.1.12 17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 rsyslog, batch=17.1_20251118.1, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, architecture=x86_64, build-date=2025-11-18T22:49:49Z, io.openshift.expose-services=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '605429f322a7b034ef9794ac46c40b29'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'security_opt': ['label=disable'], 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/rsyslog.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/rsyslog:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:ro', '/var/log/containers/rsyslog:/var/log/rsyslog:rw,z', '/var/log:/var/log/host:ro', '/var/lib/rsyslog.container:/var/lib/rsyslog:rw,z']}, container_name=rsyslog, vendor=Red Hat, Inc.) Dec 15 03:11:14 localhost podman[63622]: rsyslog Dec 15 03:11:14 localhost systemd[1]: Started rsyslog container. Dec 15 03:11:14 localhost systemd[1]: libpod-0e38c5f94ae7971c602d36252a3143b35e3d7e9f0725bde65bf045185c61e4e9.scope: Deactivated successfully. 
Dec 15 03:11:14 localhost podman[63645]: 2025-12-15 08:11:14.177702591 +0000 UTC m=+0.056052260 container died 0e38c5f94ae7971c602d36252a3143b35e3d7e9f0725bde65bf045185c61e4e9 (image=registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1, name=rsyslog, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-rsyslog, io.openshift.expose-services=, managed_by=tripleo_ansible, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, batch=17.1_20251118.1, version=17.1.12, io.k8s.description=Red Hat OpenStack Platform 17.1 rsyslog, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, container_name=rsyslog, vendor=Red Hat, Inc., architecture=x86_64, vcs-type=git, build-date=2025-11-18T22:49:49Z, tcib_managed=true, url=https://www.redhat.com, release=1761123044, io.k8s.display-name=Red Hat OpenStack Platform 17.1 rsyslog, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.12 17.1_20251118.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '605429f322a7b034ef9794ac46c40b29'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'security_opt': ['label=disable'], 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/rsyslog.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/rsyslog:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:ro', '/var/log/containers/rsyslog:/var/log/rsyslog:rw,z', '/var/log:/var/log/host:ro', '/var/lib/rsyslog.container:/var/lib/rsyslog:rw,z']}, com.redhat.component=openstack-rsyslog-container, summary=Red Hat OpenStack Platform 17.1 rsyslog, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_id=tripleo_step3, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.buildah.version=1.41.4, description=Red Hat OpenStack Platform 17.1 rsyslog, name=rhosp17/openstack-rsyslog) Dec 15 03:11:14 localhost podman[63645]: 2025-12-15 08:11:14.203286651 +0000 UTC m=+0.081636290 container cleanup 0e38c5f94ae7971c602d36252a3143b35e3d7e9f0725bde65bf045185c61e4e9 (image=registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1, name=rsyslog, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 rsyslog, managed_by=tripleo_ansible, konflux.additional-tags=17.1.12 17.1_20251118.1, tcib_managed=true, vcs-type=git, build-date=2025-11-18T22:49:49Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 rsyslog, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.k8s.description=Red Hat OpenStack Platform 17.1 rsyslog, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-rsyslog, version=17.1.12, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.expose-services=, architecture=x86_64, io.buildah.version=1.41.4, container_name=rsyslog, name=rhosp17/openstack-rsyslog, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '605429f322a7b034ef9794ac46c40b29'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'security_opt': 
['label=disable'], 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/rsyslog.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/rsyslog:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:ro', '/var/log/containers/rsyslog:/var/log/rsyslog:rw,z', '/var/log:/var/log/host:ro', '/var/lib/rsyslog.container:/var/lib/rsyslog:rw,z']}, release=1761123044, batch=17.1_20251118.1, com.redhat.component=openstack-rsyslog-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 rsyslog, config_id=tripleo_step3, vendor=Red Hat, Inc., org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, maintainer=OpenStack TripleO Team) Dec 15 03:11:14 localhost systemd[1]: tripleo_rsyslog.service: Main process exited, code=exited, status=1/FAILURE Dec 15 03:11:14 localhost podman[63657]: 2025-12-15 08:11:14.297189748 +0000 UTC m=+0.066278683 container cleanup 0e38c5f94ae7971c602d36252a3143b35e3d7e9f0725bde65bf045185c61e4e9 (image=registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1, name=rsyslog, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '605429f322a7b034ef9794ac46c40b29'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'security_opt': ['label=disable'], 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/rsyslog.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/rsyslog:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:ro', '/var/log/containers/rsyslog:/var/log/rsyslog:rw,z', '/var/log:/var/log/host:ro', '/var/lib/rsyslog.container:/var/lib/rsyslog:rw,z']}, vcs-type=git, com.redhat.component=openstack-rsyslog-container, distribution-scope=public, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, version=17.1.12, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 rsyslog, io.openshift.expose-services=, container_name=rsyslog, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 rsyslog, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 rsyslog, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-rsyslog, name=rhosp17/openstack-rsyslog, config_id=tripleo_step3, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.buildah.version=1.41.4, konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible, url=https://www.redhat.com, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, release=1761123044, batch=17.1_20251118.1, build-date=2025-11-18T22:49:49Z, io.k8s.description=Red Hat OpenStack Platform 17.1 rsyslog, vendor=Red Hat, Inc.) 
Dec 15 03:11:14 localhost podman[63657]: rsyslog Dec 15 03:11:14 localhost systemd[1]: tripleo_rsyslog.service: Failed with result 'exit-code'. Dec 15 03:11:14 localhost systemd[1]: tmp-crun.CVZLII.mount: Deactivated successfully. Dec 15 03:11:14 localhost systemd[1]: var-lib-containers-storage-overlay-5f937d9d464184dc9258be95e010d108efe08788960fe016cf6a07726f0a4d55-merged.mount: Deactivated successfully. Dec 15 03:11:14 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-0e38c5f94ae7971c602d36252a3143b35e3d7e9f0725bde65bf045185c61e4e9-userdata-shm.mount: Deactivated successfully. Dec 15 03:11:14 localhost systemd[1]: tripleo_rsyslog.service: Scheduled restart job, restart counter is at 2. Dec 15 03:11:14 localhost systemd[1]: Stopped rsyslog container. Dec 15 03:11:14 localhost systemd[1]: Starting rsyslog container... Dec 15 03:11:14 localhost systemd[1]: Started libcrun container. Dec 15 03:11:14 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/5f937d9d464184dc9258be95e010d108efe08788960fe016cf6a07726f0a4d55/merged/var/lib/rsyslog supports timestamps until 2038 (0x7fffffff) Dec 15 03:11:14 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/5f937d9d464184dc9258be95e010d108efe08788960fe016cf6a07726f0a4d55/merged/var/log/rsyslog supports timestamps until 2038 (0x7fffffff) Dec 15 03:11:14 localhost podman[63719]: 2025-12-15 08:11:14.71753806 +0000 UTC m=+0.137296450 container init 0e38c5f94ae7971c602d36252a3143b35e3d7e9f0725bde65bf045185c61e4e9 (image=registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1, name=rsyslog, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., url=https://www.redhat.com, batch=17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 rsyslog, distribution-scope=public, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 
17.1 rsyslog, version=17.1.12, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 rsyslog, vcs-type=git, com.redhat.component=openstack-rsyslog-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '605429f322a7b034ef9794ac46c40b29'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'security_opt': ['label=disable'], 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/rsyslog.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/rsyslog:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:ro', '/var/log/containers/rsyslog:/var/log/rsyslog:rw,z', '/var/log:/var/log/host:ro', '/var/lib/rsyslog.container:/var/lib/rsyslog:rw,z']}, architecture=x86_64, name=rhosp17/openstack-rsyslog, release=1761123044, io.buildah.version=1.41.4, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, build-date=2025-11-18T22:49:49Z, konflux.additional-tags=17.1.12 17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-rsyslog, managed_by=tripleo_ansible, config_id=tripleo_step3, container_name=rsyslog, io.k8s.display-name=Red Hat OpenStack Platform 17.1 rsyslog) Dec 15 03:11:14 localhost 
podman[63719]: 2025-12-15 08:11:14.726292053 +0000 UTC m=+0.146050433 container start 0e38c5f94ae7971c602d36252a3143b35e3d7e9f0725bde65bf045185c61e4e9 (image=registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1, name=rsyslog, url=https://www.redhat.com, batch=17.1_20251118.1, build-date=2025-11-18T22:49:49Z, com.redhat.component=openstack-rsyslog-container, managed_by=tripleo_ansible, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-rsyslog, konflux.additional-tags=17.1.12 17.1_20251118.1, config_id=tripleo_step3, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, architecture=x86_64, container_name=rsyslog, description=Red Hat OpenStack Platform 17.1 rsyslog, distribution-scope=public, vcs-type=git, io.openshift.expose-services=, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 rsyslog, version=17.1.12, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, tcib_managed=true, name=rhosp17/openstack-rsyslog, release=1761123044, io.k8s.description=Red Hat OpenStack Platform 17.1 rsyslog, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '605429f322a7b034ef9794ac46c40b29'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'security_opt': ['label=disable'], 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/rsyslog.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/rsyslog:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:ro', '/var/log/containers/rsyslog:/var/log/rsyslog:rw,z', '/var/log:/var/log/host:ro', '/var/lib/rsyslog.container:/var/lib/rsyslog:rw,z']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 rsyslog) Dec 15 03:11:14 localhost podman[63719]: rsyslog Dec 15 03:11:14 localhost systemd[1]: Started rsyslog container. Dec 15 03:11:14 localhost systemd[1]: libpod-0e38c5f94ae7971c602d36252a3143b35e3d7e9f0725bde65bf045185c61e4e9.scope: Deactivated successfully. Dec 15 03:11:14 localhost podman[63754]: 2025-12-15 08:11:14.895363446 +0000 UTC m=+0.052983728 container died 0e38c5f94ae7971c602d36252a3143b35e3d7e9f0725bde65bf045185c61e4e9 (image=registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1, name=rsyslog, io.buildah.version=1.41.4, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 rsyslog, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '605429f322a7b034ef9794ac46c40b29'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'security_opt': ['label=disable'], 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/rsyslog.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/rsyslog:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:ro', '/var/log/containers/rsyslog:/var/log/rsyslog:rw,z', '/var/log:/var/log/host:ro', '/var/lib/rsyslog.container:/var/lib/rsyslog:rw,z']}, description=Red Hat OpenStack Platform 17.1 rsyslog, config_id=tripleo_step3, io.k8s.description=Red Hat OpenStack Platform 17.1 rsyslog, io.openshift.expose-services=, tcib_managed=true, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-11-18T22:49:49Z, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, architecture=x86_64, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.display-name=Red Hat OpenStack Platform 17.1 rsyslog, container_name=rsyslog, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-rsyslog, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.component=openstack-rsyslog-container, vcs-type=git, name=rhosp17/openstack-rsyslog, version=17.1.12, managed_by=tripleo_ansible, release=1761123044, url=https://www.redhat.com) Dec 15 03:11:14 localhost podman[63754]: 2025-12-15 08:11:14.922289822 +0000 UTC m=+0.079910114 container cleanup 0e38c5f94ae7971c602d36252a3143b35e3d7e9f0725bde65bf045185c61e4e9 (image=registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1, name=rsyslog, io.k8s.display-name=Red Hat OpenStack Platform 17.1 rsyslog, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.k8s.description=Red Hat OpenStack Platform 17.1 rsyslog, io.buildah.version=1.41.4, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 rsyslog, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public, vendor=Red Hat, Inc., version=17.1.12, io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1, tcib_managed=true, url=https://www.redhat.com, container_name=rsyslog, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, name=rhosp17/openstack-rsyslog, architecture=x86_64, batch=17.1_20251118.1, com.redhat.component=openstack-rsyslog-container, vcs-type=git, build-date=2025-11-18T22:49:49Z, summary=Red Hat OpenStack Platform 17.1 rsyslog, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '605429f322a7b034ef9794ac46c40b29'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'security_opt': ['label=disable'], 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/rsyslog.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/rsyslog:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:ro', '/var/log/containers/rsyslog:/var/log/rsyslog:rw,z', '/var/log:/var/log/host:ro', '/var/lib/rsyslog.container:/var/lib/rsyslog:rw,z']}, config_id=tripleo_step3, io.openshift.tags=rhosp osp 
openstack osp-17.1 openstack-rsyslog, release=1761123044) Dec 15 03:11:14 localhost systemd[1]: tripleo_rsyslog.service: Main process exited, code=exited, status=1/FAILURE Dec 15 03:11:15 localhost podman[63781]: 2025-12-15 08:11:15.007310512 +0000 UTC m=+0.059431691 container cleanup 0e38c5f94ae7971c602d36252a3143b35e3d7e9f0725bde65bf045185c61e4e9 (image=registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1, name=rsyslog, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-rsyslog, com.redhat.component=openstack-rsyslog-container, release=1761123044, io.k8s.description=Red Hat OpenStack Platform 17.1 rsyslog, tcib_managed=true, vendor=Red Hat, Inc., managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_id=tripleo_step3, io.k8s.display-name=Red Hat OpenStack Platform 17.1 rsyslog, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '605429f322a7b034ef9794ac46c40b29'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'security_opt': ['label=disable'], 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/rsyslog.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/rsyslog:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:ro', '/var/log/containers/rsyslog:/var/log/rsyslog:rw,z', '/var/log:/var/log/host:ro', 
'/var/lib/rsyslog.container:/var/lib/rsyslog:rw,z']}, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, konflux.additional-tags=17.1.12 17.1_20251118.1, io.buildah.version=1.41.4, batch=17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, url=https://www.redhat.com, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 rsyslog, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 rsyslog, container_name=rsyslog, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, build-date=2025-11-18T22:49:49Z, name=rhosp17/openstack-rsyslog, distribution-scope=public, maintainer=OpenStack TripleO Team, version=17.1.12) Dec 15 03:11:15 localhost podman[63781]: rsyslog Dec 15 03:11:15 localhost systemd[1]: tripleo_rsyslog.service: Failed with result 'exit-code'. Dec 15 03:11:15 localhost systemd[1]: tripleo_rsyslog.service: Scheduled restart job, restart counter is at 3. Dec 15 03:11:15 localhost systemd[1]: Stopped rsyslog container. Dec 15 03:11:15 localhost systemd[1]: Starting rsyslog container... Dec 15 03:11:15 localhost systemd[1]: Started libcrun container. 
Dec 15 03:11:15 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/5f937d9d464184dc9258be95e010d108efe08788960fe016cf6a07726f0a4d55/merged/var/lib/rsyslog supports timestamps until 2038 (0x7fffffff) Dec 15 03:11:15 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/5f937d9d464184dc9258be95e010d108efe08788960fe016cf6a07726f0a4d55/merged/var/log/rsyslog supports timestamps until 2038 (0x7fffffff) Dec 15 03:11:15 localhost podman[63808]: 2025-12-15 08:11:15.294200167 +0000 UTC m=+0.119945099 container init 0e38c5f94ae7971c602d36252a3143b35e3d7e9f0725bde65bf045185c61e4e9 (image=registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1, name=rsyslog, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 rsyslog, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, release=1761123044, build-date=2025-11-18T22:49:49Z, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-rsyslog, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '605429f322a7b034ef9794ac46c40b29'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'security_opt': ['label=disable'], 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/rsyslog.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/rsyslog:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:ro', '/var/log/containers/rsyslog:/var/log/rsyslog:rw,z', '/var/log:/var/log/host:ro', '/var/lib/rsyslog.container:/var/lib/rsyslog:rw,z']}, io.openshift.expose-services=, com.redhat.component=openstack-rsyslog-container, container_name=rsyslog, io.k8s.description=Red Hat OpenStack Platform 17.1 rsyslog, managed_by=tripleo_ansible, url=https://www.redhat.com, batch=17.1_20251118.1, name=rhosp17/openstack-rsyslog, maintainer=OpenStack TripleO Team, distribution-scope=public, tcib_managed=true, version=17.1.12, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, config_id=tripleo_step3, description=Red Hat OpenStack Platform 17.1 rsyslog, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, summary=Red Hat OpenStack Platform 17.1 rsyslog, io.buildah.version=1.41.4) Dec 15 03:11:15 localhost podman[63808]: 2025-12-15 08:11:15.305071007 +0000 UTC m=+0.130815939 container start 0e38c5f94ae7971c602d36252a3143b35e3d7e9f0725bde65bf045185c61e4e9 (image=registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1, name=rsyslog, name=rhosp17/openstack-rsyslog, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step3, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '605429f322a7b034ef9794ac46c40b29'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'security_opt': ['label=disable'], 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/rsyslog.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/rsyslog:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:ro', '/var/log/containers/rsyslog:/var/log/rsyslog:rw,z', '/var/log:/var/log/host:ro', '/var/lib/rsyslog.container:/var/lib/rsyslog:rw,z']}, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 rsyslog, vendor=Red Hat, Inc., url=https://www.redhat.com, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, container_name=rsyslog, io.k8s.description=Red Hat OpenStack Platform 17.1 rsyslog, managed_by=tripleo_ansible, build-date=2025-11-18T22:49:49Z, description=Red Hat OpenStack Platform 17.1 rsyslog, konflux.additional-tags=17.1.12 17.1_20251118.1, tcib_managed=true, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-rsyslog, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.component=openstack-rsyslog-container, batch=17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 rsyslog, io.openshift.expose-services=, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, version=17.1.12, release=1761123044, vcs-type=git) Dec 15 03:11:15 localhost podman[63808]: rsyslog Dec 15 03:11:15 localhost systemd[1]: Started rsyslog container. Dec 15 03:11:15 localhost systemd[1]: libpod-0e38c5f94ae7971c602d36252a3143b35e3d7e9f0725bde65bf045185c61e4e9.scope: Deactivated successfully. 
Dec 15 03:11:15 localhost podman[63858]: 2025-12-15 08:11:15.466916968 +0000 UTC m=+0.049301951 container died 0e38c5f94ae7971c602d36252a3143b35e3d7e9f0725bde65bf045185c61e4e9 (image=registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1, name=rsyslog, maintainer=OpenStack TripleO Team, version=17.1.12, config_id=tripleo_step3, name=rhosp17/openstack-rsyslog, container_name=rsyslog, batch=17.1_20251118.1, vendor=Red Hat, Inc., tcib_managed=true, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 rsyslog, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, description=Red Hat OpenStack Platform 17.1 rsyslog, vcs-type=git, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public, konflux.additional-tags=17.1.12 17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 rsyslog, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-rsyslog, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '605429f322a7b034ef9794ac46c40b29'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'security_opt': ['label=disable'], 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/rsyslog.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/rsyslog:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:ro', '/var/log/containers/rsyslog:/var/log/rsyslog:rw,z', '/var/log:/var/log/host:ro', '/var/lib/rsyslog.container:/var/lib/rsyslog:rw,z']}, com.redhat.component=openstack-rsyslog-container, managed_by=tripleo_ansible, url=https://www.redhat.com, io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 rsyslog, release=1761123044, io.openshift.expose-services=, build-date=2025-11-18T22:49:49Z) Dec 15 03:11:15 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-0e38c5f94ae7971c602d36252a3143b35e3d7e9f0725bde65bf045185c61e4e9-userdata-shm.mount: Deactivated successfully. Dec 15 03:11:15 localhost systemd[1]: var-lib-containers-storage-overlay-5f937d9d464184dc9258be95e010d108efe08788960fe016cf6a07726f0a4d55-merged.mount: Deactivated successfully. 
Dec 15 03:11:15 localhost podman[63858]: 2025-12-15 08:11:15.506082399 +0000 UTC m=+0.088467342 container cleanup 0e38c5f94ae7971c602d36252a3143b35e3d7e9f0725bde65bf045185c61e4e9 (image=registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1, name=rsyslog, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, managed_by=tripleo_ansible, url=https://www.redhat.com, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.component=openstack-rsyslog-container, build-date=2025-11-18T22:49:49Z, summary=Red Hat OpenStack Platform 17.1 rsyslog, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., architecture=x86_64, release=1761123044, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-rsyslog, io.buildah.version=1.41.4, distribution-scope=public, io.openshift.expose-services=, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, konflux.additional-tags=17.1.12 17.1_20251118.1, config_id=tripleo_step3, name=rhosp17/openstack-rsyslog, version=17.1.12, description=Red Hat OpenStack Platform 17.1 rsyslog, container_name=rsyslog, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '605429f322a7b034ef9794ac46c40b29'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'security_opt': ['label=disable'], 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', 
'/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/rsyslog.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/rsyslog:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:ro', '/var/log/containers/rsyslog:/var/log/rsyslog:rw,z', '/var/log:/var/log/host:ro', '/var/lib/rsyslog.container:/var/lib/rsyslog:rw,z']}, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 rsyslog, io.k8s.description=Red Hat OpenStack Platform 17.1 rsyslog, tcib_managed=true, batch=17.1_20251118.1, vcs-type=git) Dec 15 03:11:15 localhost systemd[1]: tripleo_rsyslog.service: Main process exited, code=exited, status=1/FAILURE Dec 15 03:11:15 localhost podman[63873]: 2025-12-15 08:11:15.591674764 +0000 UTC m=+0.058690391 container cleanup 0e38c5f94ae7971c602d36252a3143b35e3d7e9f0725bde65bf045185c61e4e9 (image=registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1, name=rsyslog, vcs-type=git, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '605429f322a7b034ef9794ac46c40b29'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'security_opt': ['label=disable'], 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/rsyslog.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/rsyslog:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:ro', 
'/var/log/containers/rsyslog:/var/log/rsyslog:rw,z', '/var/log:/var/log/host:ro', '/var/lib/rsyslog.container:/var/lib/rsyslog:rw,z']}, managed_by=tripleo_ansible, vendor=Red Hat, Inc., io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 rsyslog, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, version=17.1.12, summary=Red Hat OpenStack Platform 17.1 rsyslog, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-rsyslog, konflux.additional-tags=17.1.12 17.1_20251118.1, config_id=tripleo_step3, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, architecture=x86_64, com.redhat.component=openstack-rsyslog-container, io.buildah.version=1.41.4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, name=rhosp17/openstack-rsyslog, url=https://www.redhat.com, batch=17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 rsyslog, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 rsyslog, container_name=rsyslog, release=1761123044, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, build-date=2025-11-18T22:49:49Z, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream) Dec 15 03:11:15 localhost podman[63873]: rsyslog Dec 15 03:11:15 localhost systemd[1]: tripleo_rsyslog.service: Failed with result 'exit-code'. Dec 15 03:11:15 localhost python3[63865]: ansible-container_puppet_config Invoked with check_mode=False config_vol_prefix=/var/lib/config-data debug=True net_host=True no_archive=True puppet_config=/var/lib/container-puppet/container-puppet-tasks3.json short_hostname=np0005559462 step=3 update_config_hash_only=False Dec 15 03:11:15 localhost systemd[1]: tripleo_rsyslog.service: Scheduled restart job, restart counter is at 4. 
Dec 15 03:11:15 localhost systemd[1]: Started /usr/bin/podman healthcheck run 165a6fd1f2b629d16da0798ec1c9bac41d302d530a0ffa668b2315a52d6aa04e. Dec 15 03:11:15 localhost systemd[1]: Stopped rsyslog container. Dec 15 03:11:15 localhost systemd[1]: Starting rsyslog container... Dec 15 03:11:16 localhost podman[63886]: 2025-12-15 08:11:16.014077281 +0000 UTC m=+0.089696875 container health_status 165a6fd1f2b629d16da0798ec1c9bac41d302d530a0ffa668b2315a52d6aa04e (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=starting, config_id=tripleo_step3, architecture=x86_64, com.redhat.component=openstack-collectd-container, vendor=Red Hat, Inc., version=17.1.12, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 collectd, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, container_name=collectd, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', 
'/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, konflux.additional-tags=17.1.12 17.1_20251118.1, build-date=2025-11-18T22:51:28Z, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.buildah.version=1.41.4, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, name=rhosp17/openstack-collectd, description=Red Hat OpenStack Platform 17.1 collectd, release=1761123044, batch=17.1_20251118.1, distribution-scope=public, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, tcib_managed=true, managed_by=tripleo_ansible, url=https://www.redhat.com) Dec 15 03:11:16 localhost podman[63886]: 2025-12-15 08:11:16.022839314 +0000 UTC m=+0.098458898 container exec_died 165a6fd1f2b629d16da0798ec1c9bac41d302d530a0ffa668b2315a52d6aa04e (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, tcib_managed=true, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, version=17.1.12, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public, config_id=tripleo_step3, build-date=2025-11-18T22:51:28Z, architecture=x86_64, batch=17.1_20251118.1, name=rhosp17/openstack-collectd, description=Red Hat OpenStack Platform 17.1 collectd, com.redhat.component=openstack-collectd-container, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vcs-type=git, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, konflux.additional-tags=17.1.12 17.1_20251118.1, release=1761123044, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, io.buildah.version=1.41.4, container_name=collectd, summary=Red Hat OpenStack Platform 17.1 collectd, url=https://www.redhat.com, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, vendor=Red Hat, Inc.) 
Dec 15 03:11:16 localhost systemd[1]: 165a6fd1f2b629d16da0798ec1c9bac41d302d530a0ffa668b2315a52d6aa04e.service: Deactivated successfully. Dec 15 03:11:16 localhost systemd[1]: Started libcrun container. Dec 15 03:11:16 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/5f937d9d464184dc9258be95e010d108efe08788960fe016cf6a07726f0a4d55/merged/var/lib/rsyslog supports timestamps until 2038 (0x7fffffff) Dec 15 03:11:16 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/5f937d9d464184dc9258be95e010d108efe08788960fe016cf6a07726f0a4d55/merged/var/log/rsyslog supports timestamps until 2038 (0x7fffffff) Dec 15 03:11:16 localhost podman[63887]: 2025-12-15 08:11:16.144194169 +0000 UTC m=+0.217943004 container init 0e38c5f94ae7971c602d36252a3143b35e3d7e9f0725bde65bf045185c61e4e9 (image=registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1, name=rsyslog, batch=17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 rsyslog, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.k8s.display-name=Red Hat OpenStack Platform 17.1 rsyslog, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '605429f322a7b034ef9794ac46c40b29'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'security_opt': ['label=disable'], 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/rsyslog.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/rsyslog:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:ro', '/var/log/containers/rsyslog:/var/log/rsyslog:rw,z', '/var/log:/var/log/host:ro', '/var/lib/rsyslog.container:/var/lib/rsyslog:rw,z']}, konflux.additional-tags=17.1.12 17.1_20251118.1, release=1761123044, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 rsyslog, distribution-scope=public, tcib_managed=true, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-rsyslog, version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, url=https://www.redhat.com, container_name=rsyslog, maintainer=OpenStack TripleO Team, build-date=2025-11-18T22:49:49Z, vcs-type=git, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.buildah.version=1.41.4, managed_by=tripleo_ansible, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 rsyslog, config_id=tripleo_step3, name=rhosp17/openstack-rsyslog, com.redhat.component=openstack-rsyslog-container) Dec 15 03:11:16 localhost podman[63887]: 2025-12-15 08:11:16.152771897 +0000 UTC m=+0.226520732 container start 0e38c5f94ae7971c602d36252a3143b35e3d7e9f0725bde65bf045185c61e4e9 (image=registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1, name=rsyslog, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 rsyslog, version=17.1.12, managed_by=tripleo_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '605429f322a7b034ef9794ac46c40b29'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'security_opt': ['label=disable'], 'user': 'root', 'volumes': 
['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/rsyslog.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/rsyslog:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:ro', '/var/log/containers/rsyslog:/var/log/rsyslog:rw,z', '/var/log:/var/log/host:ro', '/var/lib/rsyslog.container:/var/lib/rsyslog:rw,z']}, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, tcib_managed=true, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 rsyslog, io.buildah.version=1.41.4, vcs-type=git, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vendor=Red Hat, Inc., io.openshift.expose-services=, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-rsyslog, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.display-name=Red Hat OpenStack Platform 17.1 rsyslog, name=rhosp17/openstack-rsyslog, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 rsyslog, com.redhat.component=openstack-rsyslog-container, release=1761123044, batch=17.1_20251118.1, build-date=2025-11-18T22:49:49Z, distribution-scope=public, config_id=tripleo_step3, container_name=rsyslog) Dec 15 03:11:16 localhost podman[63887]: rsyslog Dec 15 03:11:16 localhost systemd[1]: Started rsyslog container. 
Dec 15 03:11:16 localhost python3[63930]: ansible-file Invoked with path=/var/log/containers/stdouts state=directory owner=root group=root recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None setype=None attributes=None Dec 15 03:11:16 localhost systemd[1]: libpod-0e38c5f94ae7971c602d36252a3143b35e3d7e9f0725bde65bf045185c61e4e9.scope: Deactivated successfully. Dec 15 03:11:16 localhost podman[63945]: 2025-12-15 08:11:16.298715506 +0000 UTC m=+0.041218816 container died 0e38c5f94ae7971c602d36252a3143b35e3d7e9f0725bde65bf045185c61e4e9 (image=registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1, name=rsyslog, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '605429f322a7b034ef9794ac46c40b29'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'security_opt': ['label=disable'], 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/rsyslog.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/rsyslog:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:ro', '/var/log/containers/rsyslog:/var/log/rsyslog:rw,z', '/var/log:/var/log/host:ro', '/var/lib/rsyslog.container:/var/lib/rsyslog:rw,z']}, managed_by=tripleo_ansible, 
vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-rsyslog, tcib_managed=true, container_name=rsyslog, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 rsyslog, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vcs-type=git, config_id=tripleo_step3, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.buildah.version=1.41.4, vendor=Red Hat, Inc., io.openshift.expose-services=, release=1761123044, architecture=x86_64, build-date=2025-11-18T22:49:49Z, com.redhat.component=openstack-rsyslog-container, distribution-scope=public, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 17.1_20251118.1, version=17.1.12, description=Red Hat OpenStack Platform 17.1 rsyslog, io.k8s.display-name=Red Hat OpenStack Platform 17.1 rsyslog, name=rhosp17/openstack-rsyslog, batch=17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 rsyslog) Dec 15 03:11:16 localhost podman[63945]: 2025-12-15 08:11:16.32294485 +0000 UTC m=+0.065448080 container cleanup 0e38c5f94ae7971c602d36252a3143b35e3d7e9f0725bde65bf045185c61e4e9 (image=registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1, name=rsyslog, name=rhosp17/openstack-rsyslog, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, config_id=tripleo_step3, konflux.additional-tags=17.1.12 17.1_20251118.1, io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-rsyslog, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, build-date=2025-11-18T22:49:49Z, description=Red Hat OpenStack Platform 17.1 rsyslog, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 rsyslog, 
tcib_managed=true, architecture=x86_64, container_name=rsyslog, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vendor=Red Hat, Inc., distribution-scope=public, managed_by=tripleo_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '605429f322a7b034ef9794ac46c40b29'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'security_opt': ['label=disable'], 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/rsyslog.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/rsyslog:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:ro', '/var/log/containers/rsyslog:/var/log/rsyslog:rw,z', '/var/log:/var/log/host:ro', '/var/lib/rsyslog.container:/var/lib/rsyslog:rw,z']}, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.k8s.description=Red Hat OpenStack Platform 17.1 rsyslog, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, url=https://www.redhat.com, com.redhat.component=openstack-rsyslog-container, batch=17.1_20251118.1, release=1761123044, maintainer=OpenStack TripleO Team, version=17.1.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 rsyslog) Dec 15 03:11:16 localhost systemd[1]: tripleo_rsyslog.service: Main process exited, code=exited, status=1/FAILURE Dec 15 03:11:16 localhost podman[63961]: 2025-12-15 08:11:16.40758438 +0000 UTC m=+0.059640206 container cleanup 
0e38c5f94ae7971c602d36252a3143b35e3d7e9f0725bde65bf045185c61e4e9 (image=registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1, name=rsyslog, description=Red Hat OpenStack Platform 17.1 rsyslog, config_id=tripleo_step3, container_name=rsyslog, distribution-scope=public, batch=17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 rsyslog, com.redhat.component=openstack-rsyslog-container, version=17.1.12, summary=Red Hat OpenStack Platform 17.1 rsyslog, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '605429f322a7b034ef9794ac46c40b29'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'security_opt': ['label=disable'], 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/rsyslog.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/rsyslog:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:ro', '/var/log/containers/rsyslog:/var/log/rsyslog:rw,z', '/var/log:/var/log/host:ro', '/var/lib/rsyslog.container:/var/lib/rsyslog:rw,z']}, url=https://www.redhat.com, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, architecture=x86_64, name=rhosp17/openstack-rsyslog, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-rsyslog, 
build-date=2025-11-18T22:49:49Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.buildah.version=1.41.4, konflux.additional-tags=17.1.12 17.1_20251118.1, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, managed_by=tripleo_ansible, tcib_managed=true, release=1761123044, io.k8s.display-name=Red Hat OpenStack Platform 17.1 rsyslog, io.openshift.expose-services=) Dec 15 03:11:16 localhost podman[63961]: rsyslog Dec 15 03:11:16 localhost systemd[1]: tripleo_rsyslog.service: Failed with result 'exit-code'. Dec 15 03:11:16 localhost python3[63984]: ansible-container_config_data Invoked with config_path=/var/lib/tripleo-config/container-puppet-config/step_3 config_pattern=container-puppet-*.json config_overrides={} debug=True Dec 15 03:11:16 localhost systemd[1]: tripleo_rsyslog.service: Scheduled restart job, restart counter is at 5. Dec 15 03:11:16 localhost systemd[1]: Stopped rsyslog container. Dec 15 03:11:16 localhost systemd[1]: tripleo_rsyslog.service: Start request repeated too quickly. Dec 15 03:11:16 localhost systemd[1]: tripleo_rsyslog.service: Failed with result 'exit-code'. Dec 15 03:11:16 localhost systemd[1]: Failed to start rsyslog container. Dec 15 03:11:17 localhost systemd[1]: Started /usr/bin/podman healthcheck run 2649c3aec9af2f04509e1d248b1050b202fccc3ed2b86421c89a2a65376db057. 
Dec 15 03:11:17 localhost podman[63985]: 2025-12-15 08:11:17.742688575 +0000 UTC m=+0.078068106 container health_status 2649c3aec9af2f04509e1d248b1050b202fccc3ed2b86421c89a2a65376db057 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, batch=17.1_20251118.1, url=https://www.redhat.com, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 iscsid, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.component=openstack-iscsid-container, managed_by=tripleo_ansible, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.12 17.1_20251118.1, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, build-date=2025-11-18T23:44:13Z, io.buildah.version=1.41.4, version=17.1.12, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, vendor=Red Hat, Inc., name=rhosp17/openstack-iscsid, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, container_name=iscsid, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '182e509007ab5e6e5b2500a552cbd5ba'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', 
'/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, release=1761123044, tcib_managed=true, distribution-scope=public, config_id=tripleo_step3, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05) Dec 15 03:11:17 localhost podman[63985]: 2025-12-15 08:11:17.775095006 +0000 UTC m=+0.110474497 container exec_died 2649c3aec9af2f04509e1d248b1050b202fccc3ed2b86421c89a2a65376db057 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, name=rhosp17/openstack-iscsid, tcib_managed=true, com.redhat.component=openstack-iscsid-container, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, build-date=2025-11-18T23:44:13Z, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 iscsid, io.buildah.version=1.41.4, vcs-type=git, batch=17.1_20251118.1, container_name=iscsid, io.openshift.expose-services=, config_id=tripleo_step3, summary=Red Hat OpenStack Platform 17.1 iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '182e509007ab5e6e5b2500a552cbd5ba'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': 
['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, distribution-scope=public, konflux.additional-tags=17.1.12 17.1_20251118.1, release=1761123044, version=17.1.12, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d) Dec 15 03:11:17 localhost systemd[1]: 2649c3aec9af2f04509e1d248b1050b202fccc3ed2b86421c89a2a65376db057.service: Deactivated successfully. Dec 15 03:11:25 localhost sshd[64003]: main: sshd: ssh-rsa algorithm is disabled Dec 15 03:11:31 localhost systemd[1]: Started /usr/bin/podman healthcheck run 6278d648aec9c9f12e0efaafa0d6ae5653eeb34459516699e562eeb9c8d23b3c. Dec 15 03:11:31 localhost systemd[1]: tmp-crun.BFC4NC.mount: Deactivated successfully. 
Dec 15 03:11:31 localhost podman[64082]: 2025-12-15 08:11:31.757049476 +0000 UTC m=+0.090921388 container health_status 6278d648aec9c9f12e0efaafa0d6ae5653eeb34459516699e562eeb9c8d23b3c (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vendor=Red Hat, Inc., config_id=tripleo_step1, io.openshift.expose-services=, build-date=2025-11-18T22:49:46Z, distribution-scope=public, com.redhat.component=openstack-qdrouterd-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, tcib_managed=true, release=1761123044, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, vcs-type=git, description=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.buildah.version=1.41.4, architecture=x86_64, batch=17.1_20251118.1, konflux.additional-tags=17.1.12 17.1_20251118.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '29d07da103bbd44a9ed3e29999314b03'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, summary=Red Hat OpenStack Platform 17.1 qdrouterd, container_name=metrics_qdr, version=17.1.12, name=rhosp17/openstack-qdrouterd, managed_by=tripleo_ansible) Dec 15 03:11:31 localhost podman[64082]: 2025-12-15 08:11:31.946569513 +0000 UTC m=+0.280441495 container exec_died 6278d648aec9c9f12e0efaafa0d6ae5653eeb34459516699e562eeb9c8d23b3c (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, container_name=metrics_qdr, io.openshift.expose-services=, io.buildah.version=1.41.4, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, distribution-scope=public, konflux.additional-tags=17.1.12 17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '29d07da103bbd44a9ed3e29999314b03'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', 
'/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, name=rhosp17/openstack-qdrouterd, release=1761123044, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.component=openstack-qdrouterd-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.12, vcs-type=git, description=Red Hat OpenStack Platform 17.1 qdrouterd, url=https://www.redhat.com, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_id=tripleo_step1, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, architecture=x86_64, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, build-date=2025-11-18T22:49:46Z, maintainer=OpenStack TripleO Team, tcib_managed=true, batch=17.1_20251118.1) Dec 15 03:11:31 localhost systemd[1]: 6278d648aec9c9f12e0efaafa0d6ae5653eeb34459516699e562eeb9c8d23b3c.service: Deactivated successfully. Dec 15 03:11:46 localhost systemd[1]: Started /usr/bin/podman healthcheck run 165a6fd1f2b629d16da0798ec1c9bac41d302d530a0ffa668b2315a52d6aa04e. Dec 15 03:11:46 localhost systemd[1]: tmp-crun.HtWxAn.mount: Deactivated successfully. 
Dec 15 03:11:46 localhost podman[64111]: 2025-12-15 08:11:46.752582513 +0000 UTC m=+0.087362703 container health_status 165a6fd1f2b629d16da0798ec1c9bac41d302d530a0ffa668b2315a52d6aa04e (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, version=17.1.12, managed_by=tripleo_ansible, batch=17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 collectd, tcib_managed=true, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, architecture=x86_64, com.redhat.component=openstack-collectd-container, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-18T22:51:28Z, url=https://www.redhat.com, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', 
'/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, vendor=Red Hat, Inc., io.openshift.expose-services=, config_id=tripleo_step3, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, io.buildah.version=1.41.4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, release=1761123044, name=rhosp17/openstack-collectd, summary=Red Hat OpenStack Platform 17.1 collectd, container_name=collectd) Dec 15 03:11:46 localhost podman[64111]: 2025-12-15 08:11:46.766257206 +0000 UTC m=+0.101037436 container exec_died 165a6fd1f2b629d16da0798ec1c9bac41d302d530a0ffa668b2315a52d6aa04e (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, version=17.1.12, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vendor=Red Hat, Inc., vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, architecture=x86_64, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.expose-services=, release=1761123044, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, config_id=tripleo_step3, com.redhat.component=openstack-collectd-container, konflux.additional-tags=17.1.12 17.1_20251118.1, container_name=collectd, managed_by=tripleo_ansible, build-date=2025-11-18T22:51:28Z, name=rhosp17/openstack-collectd, summary=Red Hat OpenStack Platform 17.1 collectd, io.buildah.version=1.41.4, distribution-scope=public, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, description=Red Hat OpenStack Platform 17.1 collectd, vcs-type=git, batch=17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd) Dec 15 03:11:46 localhost systemd[1]: 165a6fd1f2b629d16da0798ec1c9bac41d302d530a0ffa668b2315a52d6aa04e.service: Deactivated successfully. Dec 15 03:11:48 localhost systemd[1]: Started /usr/bin/podman healthcheck run 2649c3aec9af2f04509e1d248b1050b202fccc3ed2b86421c89a2a65376db057. 
Dec 15 03:11:48 localhost systemd[1]: tmp-crun.LzSrd3.mount: Deactivated successfully. Dec 15 03:11:48 localhost podman[64133]: 2025-12-15 08:11:48.754734028 +0000 UTC m=+0.089635344 container health_status 2649c3aec9af2f04509e1d248b1050b202fccc3ed2b86421c89a2a65376db057 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, tcib_managed=true, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.component=openstack-iscsid-container, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, io.buildah.version=1.41.4, name=rhosp17/openstack-iscsid, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, vcs-type=git, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_id=tripleo_step3, distribution-scope=public, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20251118.1, container_name=iscsid, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-18T23:44:13Z, version=17.1.12, managed_by=tripleo_ansible, vendor=Red Hat, Inc., release=1761123044, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '182e509007ab5e6e5b2500a552cbd5ba'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d) Dec 15 03:11:48 localhost podman[64133]: 2025-12-15 08:11:48.765393132 +0000 UTC m=+0.100294368 container exec_died 2649c3aec9af2f04509e1d248b1050b202fccc3ed2b86421c89a2a65376db057 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, version=17.1.12, vendor=Red Hat, Inc., konflux.additional-tags=17.1.12 17.1_20251118.1, release=1761123044, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 iscsid, io.buildah.version=1.41.4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.component=openstack-iscsid-container, batch=17.1_20251118.1, config_id=tripleo_step3, tcib_managed=true, distribution-scope=public, maintainer=OpenStack TripleO Team, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '182e509007ab5e6e5b2500a552cbd5ba'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, name=rhosp17/openstack-iscsid, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 iscsid, build-date=2025-11-18T23:44:13Z, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, managed_by=tripleo_ansible, vcs-type=git, container_name=iscsid, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, architecture=x86_64) Dec 15 03:11:48 localhost systemd[1]: 2649c3aec9af2f04509e1d248b1050b202fccc3ed2b86421c89a2a65376db057.service: Deactivated successfully. Dec 15 03:12:02 localhost systemd[1]: Started /usr/bin/podman healthcheck run 6278d648aec9c9f12e0efaafa0d6ae5653eeb34459516699e562eeb9c8d23b3c. Dec 15 03:12:02 localhost systemd[1]: tmp-crun.Zg9j28.mount: Deactivated successfully. 
Dec 15 03:12:02 localhost podman[64152]: 2025-12-15 08:12:02.736802659 +0000 UTC m=+0.072382215 container health_status 6278d648aec9c9f12e0efaafa0d6ae5653eeb34459516699e562eeb9c8d23b3c (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, batch=17.1_20251118.1, architecture=x86_64, managed_by=tripleo_ansible, konflux.additional-tags=17.1.12 17.1_20251118.1, version=17.1.12, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '29d07da103bbd44a9ed3e29999314b03'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, url=https://www.redhat.com, distribution-scope=public, vendor=Red Hat, Inc., io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, tcib_managed=true, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, config_id=tripleo_step1, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-18T22:49:46Z, io.buildah.version=1.41.4, maintainer=OpenStack TripleO Team, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, description=Red Hat OpenStack Platform 17.1 qdrouterd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.component=openstack-qdrouterd-container, release=1761123044, summary=Red Hat OpenStack Platform 17.1 qdrouterd, container_name=metrics_qdr, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, name=rhosp17/openstack-qdrouterd) Dec 15 03:12:02 localhost podman[64152]: 2025-12-15 08:12:02.926401249 +0000 UTC m=+0.261980735 container exec_died 6278d648aec9c9f12e0efaafa0d6ae5653eeb34459516699e562eeb9c8d23b3c (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, tcib_managed=true, build-date=2025-11-18T22:49:46Z, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vcs-type=git, name=rhosp17/openstack-qdrouterd, batch=17.1_20251118.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '29d07da103bbd44a9ed3e29999314b03'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, konflux.additional-tags=17.1.12 17.1_20251118.1, distribution-scope=public, version=17.1.12, release=1761123044, summary=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, architecture=x86_64, config_id=tripleo_step1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.expose-services=, container_name=metrics_qdr, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 qdrouterd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.buildah.version=1.41.4, managed_by=tripleo_ansible, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.component=openstack-qdrouterd-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc.) Dec 15 03:12:02 localhost systemd[1]: 6278d648aec9c9f12e0efaafa0d6ae5653eeb34459516699e562eeb9c8d23b3c.service: Deactivated successfully. Dec 15 03:12:17 localhost systemd[1]: Started /usr/bin/podman healthcheck run 165a6fd1f2b629d16da0798ec1c9bac41d302d530a0ffa668b2315a52d6aa04e. 
Dec 15 03:12:17 localhost podman[64182]: 2025-12-15 08:12:17.769234528 +0000 UTC m=+0.094950275 container health_status 165a6fd1f2b629d16da0798ec1c9bac41d302d530a0ffa668b2315a52d6aa04e (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, tcib_managed=true, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, distribution-scope=public, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, build-date=2025-11-18T22:51:28Z, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, io.buildah.version=1.41.4, vcs-type=git, konflux.additional-tags=17.1.12 17.1_20251118.1, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, 
batch=17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 collectd, release=1761123044, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.12, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, config_id=tripleo_step3, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, managed_by=tripleo_ansible, name=rhosp17/openstack-collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, description=Red Hat OpenStack Platform 17.1 collectd, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-collectd-container, container_name=collectd) Dec 15 03:12:17 localhost podman[64182]: 2025-12-15 08:12:17.781203947 +0000 UTC m=+0.106919714 container exec_died 165a6fd1f2b629d16da0798ec1c9bac41d302d530a0ffa668b2315a52d6aa04e (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, managed_by=tripleo_ansible, konflux.additional-tags=17.1.12 17.1_20251118.1, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vendor=Red Hat, Inc., io.buildah.version=1.41.4, summary=Red Hat OpenStack Platform 17.1 collectd, maintainer=OpenStack TripleO Team, architecture=x86_64, name=rhosp17/openstack-collectd, container_name=collectd, description=Red Hat OpenStack Platform 17.1 collectd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, version=17.1.12, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, tcib_managed=true, com.redhat.component=openstack-collectd-container, release=1761123044, vcs-type=git, batch=17.1_20251118.1, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 
'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, config_id=tripleo_step3, build-date=2025-11-18T22:51:28Z, io.openshift.expose-services=, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI) Dec 15 03:12:17 localhost systemd[1]: 165a6fd1f2b629d16da0798ec1c9bac41d302d530a0ffa668b2315a52d6aa04e.service: Deactivated successfully. Dec 15 03:12:19 localhost systemd[1]: Started /usr/bin/podman healthcheck run 2649c3aec9af2f04509e1d248b1050b202fccc3ed2b86421c89a2a65376db057. Dec 15 03:12:19 localhost systemd[1]: tmp-crun.nxgTdA.mount: Deactivated successfully. 
Dec 15 03:12:19 localhost podman[64202]: 2025-12-15 08:12:19.743659567 +0000 UTC m=+0.081590470 container health_status 2649c3aec9af2f04509e1d248b1050b202fccc3ed2b86421c89a2a65376db057 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, batch=17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, io.openshift.expose-services=, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 iscsid, url=https://www.redhat.com, io.buildah.version=1.41.4, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-iscsid-container, vcs-type=git, release=1761123044, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-11-18T23:44:13Z, vendor=Red Hat, Inc., name=rhosp17/openstack-iscsid, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, distribution-scope=public, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '182e509007ab5e6e5b2500a552cbd5ba'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, summary=Red Hat OpenStack Platform 17.1 iscsid, version=17.1.12, config_id=tripleo_step3, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, container_name=iscsid, konflux.additional-tags=17.1.12 17.1_20251118.1) Dec 15 03:12:19 localhost podman[64202]: 2025-12-15 08:12:19.754540096 +0000 UTC m=+0.092470969 container exec_died 2649c3aec9af2f04509e1d248b1050b202fccc3ed2b86421c89a2a65376db057 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, io.openshift.expose-services=, managed_by=tripleo_ansible, com.redhat.component=openstack-iscsid-container, version=17.1.12, container_name=iscsid, build-date=2025-11-18T23:44:13Z, release=1761123044, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, name=rhosp17/openstack-iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '182e509007ab5e6e5b2500a552cbd5ba'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, batch=17.1_20251118.1, konflux.additional-tags=17.1.12 17.1_20251118.1, io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 iscsid, distribution-scope=public, config_id=tripleo_step3, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vendor=Red Hat, Inc., org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 iscsid, vcs-type=git) Dec 15 03:12:19 localhost systemd[1]: 2649c3aec9af2f04509e1d248b1050b202fccc3ed2b86421c89a2a65376db057.service: Deactivated successfully. Dec 15 03:12:33 localhost systemd[1]: Started /usr/bin/podman healthcheck run 6278d648aec9c9f12e0efaafa0d6ae5653eeb34459516699e562eeb9c8d23b3c. 
Dec 15 03:12:33 localhost podman[64300]: 2025-12-15 08:12:33.749667675 +0000 UTC m=+0.084756384 container health_status 6278d648aec9c9f12e0efaafa0d6ae5653eeb34459516699e562eeb9c8d23b3c (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, io.buildah.version=1.41.4, managed_by=tripleo_ansible, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.expose-services=, com.redhat.component=openstack-qdrouterd-container, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '29d07da103bbd44a9ed3e29999314b03'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, release=1761123044, batch=17.1_20251118.1, container_name=metrics_qdr, description=Red Hat OpenStack Platform 
17.1 qdrouterd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, name=rhosp17/openstack-qdrouterd, version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, architecture=x86_64, config_id=tripleo_step1, build-date=2025-11-18T22:49:46Z, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, konflux.additional-tags=17.1.12 17.1_20251118.1, tcib_managed=true, distribution-scope=public, vendor=Red Hat, Inc.) Dec 15 03:12:33 localhost podman[64300]: 2025-12-15 08:12:33.942632534 +0000 UTC m=+0.277721193 container exec_died 6278d648aec9c9f12e0efaafa0d6ae5653eeb34459516699e562eeb9c8d23b3c (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, release=1761123044, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=metrics_qdr, managed_by=tripleo_ansible, version=17.1.12, distribution-scope=public, tcib_managed=true, architecture=x86_64, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '29d07da103bbd44a9ed3e29999314b03'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, build-date=2025-11-18T22:49:46Z, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vcs-type=git, name=rhosp17/openstack-qdrouterd, com.redhat.component=openstack-qdrouterd-container, konflux.additional-tags=17.1.12 17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vendor=Red Hat, Inc., batch=17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 qdrouterd, io.buildah.version=1.41.4, description=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, config_id=tripleo_step1) Dec 15 03:12:33 localhost systemd[1]: 6278d648aec9c9f12e0efaafa0d6ae5653eeb34459516699e562eeb9c8d23b3c.service: Deactivated successfully. Dec 15 03:12:39 localhost sshd[64329]: main: sshd: ssh-rsa algorithm is disabled Dec 15 03:12:48 localhost systemd[1]: Started /usr/bin/podman healthcheck run 165a6fd1f2b629d16da0798ec1c9bac41d302d530a0ffa668b2315a52d6aa04e. 
Dec 15 03:12:48 localhost podman[64331]: 2025-12-15 08:12:48.753193921 +0000 UTC m=+0.083221451 container health_status 165a6fd1f2b629d16da0798ec1c9bac41d302d530a0ffa668b2315a52d6aa04e (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, batch=17.1_20251118.1, managed_by=tripleo_ansible, konflux.additional-tags=17.1.12 17.1_20251118.1, vendor=Red Hat, Inc., version=17.1.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, io.openshift.expose-services=, url=https://www.redhat.com, config_id=tripleo_step3, build-date=2025-11-18T22:51:28Z, vcs-type=git, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', 
'/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, container_name=collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, name=rhosp17/openstack-collectd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, summary=Red Hat OpenStack Platform 17.1 collectd, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, distribution-scope=public, io.buildah.version=1.41.4, release=1761123044, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-collectd-container, description=Red Hat OpenStack Platform 17.1 collectd) Dec 15 03:12:48 localhost podman[64331]: 2025-12-15 08:12:48.788781357 +0000 UTC m=+0.118808827 container exec_died 165a6fd1f2b629d16da0798ec1c9bac41d302d530a0ffa668b2315a52d6aa04e (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vendor=Red Hat, Inc., name=rhosp17/openstack-collectd, batch=17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, config_id=tripleo_step3, architecture=x86_64, io.buildah.version=1.41.4, version=17.1.12, container_name=collectd, com.redhat.component=openstack-collectd-container, managed_by=tripleo_ansible, url=https://www.redhat.com, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', 
'/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, summary=Red Hat OpenStack Platform 17.1 collectd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, release=1761123044, konflux.additional-tags=17.1.12 17.1_20251118.1, tcib_managed=true, io.openshift.expose-services=, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, build-date=2025-11-18T22:51:28Z, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 collectd) Dec 15 03:12:48 localhost systemd[1]: 165a6fd1f2b629d16da0798ec1c9bac41d302d530a0ffa668b2315a52d6aa04e.service: Deactivated successfully. Dec 15 03:12:50 localhost systemd[1]: Started /usr/bin/podman healthcheck run 2649c3aec9af2f04509e1d248b1050b202fccc3ed2b86421c89a2a65376db057. Dec 15 03:12:51 localhost systemd[1]: tmp-crun.hemqZT.mount: Deactivated successfully. 
Dec 15 03:12:51 localhost podman[64351]: 2025-12-15 08:12:51.274268892 +0000 UTC m=+0.602042806 container health_status 2649c3aec9af2f04509e1d248b1050b202fccc3ed2b86421c89a2a65376db057 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, com.redhat.component=openstack-iscsid-container, description=Red Hat OpenStack Platform 17.1 iscsid, vcs-type=git, konflux.additional-tags=17.1.12 17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, managed_by=tripleo_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '182e509007ab5e6e5b2500a552cbd5ba'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, architecture=x86_64, vendor=Red Hat, Inc., batch=17.1_20251118.1, version=17.1.12, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, 
io.buildah.version=1.41.4, tcib_managed=true, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.expose-services=, name=rhosp17/openstack-iscsid, container_name=iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, config_id=tripleo_step3, url=https://www.redhat.com, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, build-date=2025-11-18T23:44:13Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, release=1761123044, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid) Dec 15 03:12:51 localhost podman[64351]: 2025-12-15 08:12:51.285669543 +0000 UTC m=+0.613443447 container exec_died 2649c3aec9af2f04509e1d248b1050b202fccc3ed2b86421c89a2a65376db057 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, url=https://www.redhat.com, release=1761123044, com.redhat.component=openstack-iscsid-container, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 iscsid, konflux.additional-tags=17.1.12 17.1_20251118.1, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, batch=17.1_20251118.1, build-date=2025-11-18T23:44:13Z, name=rhosp17/openstack-iscsid, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 iscsid, version=17.1.12, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '182e509007ab5e6e5b2500a552cbd5ba'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, container_name=iscsid, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step3, architecture=x86_64, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, maintainer=OpenStack TripleO Team, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.buildah.version=1.41.4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.expose-services=, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d) Dec 15 03:12:51 localhost systemd[1]: 2649c3aec9af2f04509e1d248b1050b202fccc3ed2b86421c89a2a65376db057.service: Deactivated successfully. Dec 15 03:13:04 localhost systemd[1]: Started /usr/bin/podman healthcheck run 6278d648aec9c9f12e0efaafa0d6ae5653eeb34459516699e562eeb9c8d23b3c. 
Dec 15 03:13:04 localhost podman[64371]: 2025-12-15 08:13:04.756358308 +0000 UTC m=+0.086459177 container health_status 6278d648aec9c9f12e0efaafa0d6ae5653eeb34459516699e562eeb9c8d23b3c (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, distribution-scope=public, batch=17.1_20251118.1, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, description=Red Hat OpenStack Platform 17.1 qdrouterd, summary=Red Hat OpenStack Platform 17.1 qdrouterd, build-date=2025-11-18T22:49:46Z, url=https://www.redhat.com, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, config_id=tripleo_step1, io.openshift.expose-services=, vendor=Red Hat, Inc., io.buildah.version=1.41.4, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=metrics_qdr, konflux.additional-tags=17.1.12 17.1_20251118.1, version=17.1.12, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, com.redhat.component=openstack-qdrouterd-container, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, maintainer=OpenStack TripleO Team, vcs-type=git, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '29d07da103bbd44a9ed3e29999314b03'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, release=1761123044, name=rhosp17/openstack-qdrouterd) Dec 15 03:13:04 localhost podman[64371]: 2025-12-15 08:13:04.953860326 +0000 UTC m=+0.283961165 container exec_died 6278d648aec9c9f12e0efaafa0d6ae5653eeb34459516699e562eeb9c8d23b3c (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, version=17.1.12, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '29d07da103bbd44a9ed3e29999314b03'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', 
'/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, build-date=2025-11-18T22:49:46Z, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, batch=17.1_20251118.1, name=rhosp17/openstack-qdrouterd, konflux.additional-tags=17.1.12 17.1_20251118.1, url=https://www.redhat.com, com.redhat.component=openstack-qdrouterd-container, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, managed_by=tripleo_ansible, release=1761123044, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., io.openshift.expose-services=, container_name=metrics_qdr, summary=Red Hat OpenStack Platform 17.1 qdrouterd, architecture=x86_64, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.buildah.version=1.41.4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, description=Red Hat OpenStack Platform 17.1 qdrouterd, tcib_managed=true, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, distribution-scope=public, config_id=tripleo_step1) Dec 15 03:13:04 localhost systemd[1]: 6278d648aec9c9f12e0efaafa0d6ae5653eeb34459516699e562eeb9c8d23b3c.service: Deactivated successfully. Dec 15 03:13:19 localhost systemd[1]: Started /usr/bin/podman healthcheck run 165a6fd1f2b629d16da0798ec1c9bac41d302d530a0ffa668b2315a52d6aa04e. 
Dec 15 03:13:19 localhost podman[64399]: 2025-12-15 08:13:19.752540373 +0000 UTC m=+0.085084410 container health_status 165a6fd1f2b629d16da0798ec1c9bac41d302d530a0ffa668b2315a52d6aa04e (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, vendor=Red Hat, Inc., release=1761123044, url=https://www.redhat.com, name=rhosp17/openstack-collectd, distribution-scope=public, config_id=tripleo_step3, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, summary=Red Hat OpenStack Platform 17.1 collectd, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, managed_by=tripleo_ansible, io.buildah.version=1.41.4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.12, io.openshift.expose-services=, container_name=collectd, tcib_managed=true, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.component=openstack-collectd-container, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, build-date=2025-11-18T22:51:28Z, konflux.additional-tags=17.1.12 17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 collectd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, batch=17.1_20251118.1, vcs-type=git, architecture=x86_64) Dec 15 03:13:19 localhost podman[64399]: 2025-12-15 08:13:19.763376039 +0000 UTC m=+0.095920066 container exec_died 165a6fd1f2b629d16da0798ec1c9bac41d302d530a0ffa668b2315a52d6aa04e (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, summary=Red Hat OpenStack Platform 17.1 collectd, description=Red Hat OpenStack Platform 17.1 collectd, container_name=collectd, io.openshift.expose-services=, io.buildah.version=1.41.4, com.redhat.component=openstack-collectd-container, url=https://www.redhat.com, release=1761123044, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, konflux.additional-tags=17.1.12 17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-collectd, managed_by=tripleo_ansible, architecture=x86_64, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, build-date=2025-11-18T22:51:28Z, version=17.1.12, batch=17.1_20251118.1, vcs-type=git, vendor=Red Hat, Inc., distribution-scope=public, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, config_id=tripleo_step3, tcib_managed=true) Dec 15 03:13:19 localhost systemd[1]: 165a6fd1f2b629d16da0798ec1c9bac41d302d530a0ffa668b2315a52d6aa04e.service: Deactivated successfully. Dec 15 03:13:21 localhost systemd[1]: Started /usr/bin/podman healthcheck run 2649c3aec9af2f04509e1d248b1050b202fccc3ed2b86421c89a2a65376db057. 
Dec 15 03:13:21 localhost podman[64419]: 2025-12-15 08:13:21.763960622 +0000 UTC m=+0.088096529 container health_status 2649c3aec9af2f04509e1d248b1050b202fccc3ed2b86421c89a2a65376db057 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, version=17.1.12, container_name=iscsid, konflux.additional-tags=17.1.12 17.1_20251118.1, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, distribution-scope=public, release=1761123044, tcib_managed=true, batch=17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, description=Red Hat OpenStack Platform 17.1 iscsid, vcs-type=git, io.openshift.expose-services=, url=https://www.redhat.com, managed_by=tripleo_ansible, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, summary=Red Hat OpenStack Platform 17.1 iscsid, vendor=Red Hat, Inc., name=rhosp17/openstack-iscsid, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '182e509007ab5e6e5b2500a552cbd5ba'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, maintainer=OpenStack TripleO Team, config_id=tripleo_step3, com.redhat.component=openstack-iscsid-container, build-date=2025-11-18T23:44:13Z) Dec 15 03:13:21 localhost podman[64419]: 2025-12-15 08:13:21.798329257 +0000 UTC m=+0.122465144 container exec_died 2649c3aec9af2f04509e1d248b1050b202fccc3ed2b86421c89a2a65376db057 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, name=rhosp17/openstack-iscsid, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, batch=17.1_20251118.1, distribution-scope=public, release=1761123044, url=https://www.redhat.com, architecture=x86_64, version=17.1.12, container_name=iscsid, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-iscsid-container, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, description=Red Hat OpenStack Platform 17.1 iscsid, vendor=Red Hat, Inc., io.buildah.version=1.41.4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 
'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '182e509007ab5e6e5b2500a552cbd5ba'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, managed_by=tripleo_ansible, build-date=2025-11-18T23:44:13Z, config_id=tripleo_step3, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, tcib_managed=true) Dec 15 03:13:21 localhost systemd[1]: 2649c3aec9af2f04509e1d248b1050b202fccc3ed2b86421c89a2a65376db057.service: Deactivated successfully. Dec 15 03:13:35 localhost systemd[1]: Started /usr/bin/podman healthcheck run 6278d648aec9c9f12e0efaafa0d6ae5653eeb34459516699e562eeb9c8d23b3c. Dec 15 03:13:35 localhost systemd[1]: tmp-crun.n8TadR.mount: Deactivated successfully. 
Dec 15 03:13:35 localhost podman[64515]: 2025-12-15 08:13:35.765438028 +0000 UTC m=+0.091913939 container health_status 6278d648aec9c9f12e0efaafa0d6ae5653eeb34459516699e562eeb9c8d23b3c (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, konflux.additional-tags=17.1.12 17.1_20251118.1, version=17.1.12, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, maintainer=OpenStack TripleO Team, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '29d07da103bbd44a9ed3e29999314b03'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_id=tripleo_step1, tcib_managed=true, name=rhosp17/openstack-qdrouterd, architecture=x86_64, batch=17.1_20251118.1, io.openshift.expose-services=, release=1761123044, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.component=openstack-qdrouterd-container, distribution-scope=public, io.buildah.version=1.41.4, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=metrics_qdr, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, description=Red Hat OpenStack Platform 17.1 qdrouterd, build-date=2025-11-18T22:49:46Z, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 qdrouterd, url=https://www.redhat.com) Dec 15 03:13:35 localhost podman[64515]: 2025-12-15 08:13:35.985723827 +0000 UTC m=+0.312199668 container exec_died 6278d648aec9c9f12e0efaafa0d6ae5653eeb34459516699e562eeb9c8d23b3c (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, name=rhosp17/openstack-qdrouterd, tcib_managed=true, io.buildah.version=1.41.4, architecture=x86_64, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_id=tripleo_step1, build-date=2025-11-18T22:49:46Z, release=1761123044, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '29d07da103bbd44a9ed3e29999314b03'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, konflux.additional-tags=17.1.12 17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 qdrouterd, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, vcs-type=git, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1, com.redhat.component=openstack-qdrouterd-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.12, io.openshift.expose-services=, url=https://www.redhat.com, container_name=metrics_qdr, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, description=Red Hat OpenStack Platform 17.1 qdrouterd, maintainer=OpenStack TripleO Team) Dec 15 03:13:35 localhost systemd[1]: 6278d648aec9c9f12e0efaafa0d6ae5653eeb34459516699e562eeb9c8d23b3c.service: Deactivated successfully. Dec 15 03:13:50 localhost systemd[1]: Started /usr/bin/podman healthcheck run 165a6fd1f2b629d16da0798ec1c9bac41d302d530a0ffa668b2315a52d6aa04e. 
Dec 15 03:13:50 localhost podman[64544]: 2025-12-15 08:13:50.760109384 +0000 UTC m=+0.087896584 container health_status 165a6fd1f2b629d16da0798ec1c9bac41d302d530a0ffa668b2315a52d6aa04e (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, com.redhat.component=openstack-collectd-container, build-date=2025-11-18T22:51:28Z, maintainer=OpenStack TripleO Team, release=1761123044, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, version=17.1.12, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, konflux.additional-tags=17.1.12 17.1_20251118.1, config_id=tripleo_step3, io.openshift.expose-services=, tcib_managed=true, vendor=Red Hat, Inc., config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', 
'/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, container_name=collectd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.buildah.version=1.41.4, description=Red Hat OpenStack Platform 17.1 collectd, managed_by=tripleo_ansible, name=rhosp17/openstack-collectd, summary=Red Hat OpenStack Platform 17.1 collectd, distribution-scope=public, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, batch=17.1_20251118.1, architecture=x86_64, url=https://www.redhat.com) Dec 15 03:13:50 localhost podman[64544]: 2025-12-15 08:13:50.777324197 +0000 UTC m=+0.105111447 container exec_died 165a6fd1f2b629d16da0798ec1c9bac41d302d530a0ffa668b2315a52d6aa04e (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, com.redhat.component=openstack-collectd-container, architecture=x86_64, config_id=tripleo_step3, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, summary=Red Hat OpenStack Platform 17.1 collectd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 17.1_20251118.1, distribution-scope=public, version=17.1.12, container_name=collectd, io.buildah.version=1.41.4, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, maintainer=OpenStack TripleO Team, release=1761123044, name=rhosp17/openstack-collectd, 
io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 collectd, tcib_managed=true, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, managed_by=tripleo_ansible, build-date=2025-11-18T22:51:28Z, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vcs-type=git, vendor=Red Hat, Inc., batch=17.1_20251118.1) Dec 15 03:13:50 localhost systemd[1]: 165a6fd1f2b629d16da0798ec1c9bac41d302d530a0ffa668b2315a52d6aa04e.service: Deactivated successfully. Dec 15 03:13:52 localhost systemd[1]: Started /usr/bin/podman healthcheck run 2649c3aec9af2f04509e1d248b1050b202fccc3ed2b86421c89a2a65376db057. 
Dec 15 03:13:52 localhost podman[64565]: 2025-12-15 08:13:52.757938275 +0000 UTC m=+0.083494169 container health_status 2649c3aec9af2f04509e1d248b1050b202fccc3ed2b86421c89a2a65376db057 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.component=openstack-iscsid-container, io.openshift.expose-services=, distribution-scope=public, konflux.additional-tags=17.1.12 17.1_20251118.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '182e509007ab5e6e5b2500a552cbd5ba'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, tcib_managed=true, vcs-type=git, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_id=tripleo_step3, container_name=iscsid, vendor=Red Hat, Inc., 
url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 iscsid, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, summary=Red Hat OpenStack Platform 17.1 iscsid, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, version=17.1.12, build-date=2025-11-18T23:44:13Z, release=1761123044, batch=17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, name=rhosp17/openstack-iscsid, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI) Dec 15 03:13:52 localhost podman[64565]: 2025-12-15 08:13:52.793288106 +0000 UTC m=+0.118843950 container exec_died 2649c3aec9af2f04509e1d248b1050b202fccc3ed2b86421c89a2a65376db057 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, build-date=2025-11-18T23:44:13Z, batch=17.1_20251118.1, version=17.1.12, config_id=tripleo_step3, io.buildah.version=1.41.4, io.openshift.expose-services=, vendor=Red Hat, Inc., distribution-scope=public, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '182e509007ab5e6e5b2500a552cbd5ba'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', 
'/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, summary=Red Hat OpenStack Platform 17.1 iscsid, url=https://www.redhat.com, tcib_managed=true, com.redhat.component=openstack-iscsid-container, vcs-type=git, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=iscsid, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, konflux.additional-tags=17.1.12 17.1_20251118.1, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, description=Red Hat OpenStack Platform 17.1 iscsid, release=1761123044, name=rhosp17/openstack-iscsid, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d) Dec 15 03:13:52 localhost systemd[1]: 2649c3aec9af2f04509e1d248b1050b202fccc3ed2b86421c89a2a65376db057.service: Deactivated successfully. Dec 15 03:13:57 localhost sshd[64584]: main: sshd: ssh-rsa algorithm is disabled Dec 15 03:14:06 localhost systemd[1]: Started /usr/bin/podman healthcheck run 6278d648aec9c9f12e0efaafa0d6ae5653eeb34459516699e562eeb9c8d23b3c. 
Dec 15 03:14:06 localhost podman[64586]: 2025-12-15 08:14:06.764070934 +0000 UTC m=+0.091099429 container health_status 6278d648aec9c9f12e0efaafa0d6ae5653eeb34459516699e562eeb9c8d23b3c (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, name=rhosp17/openstack-qdrouterd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20251118.1, architecture=x86_64, build-date=2025-11-18T22:49:46Z, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_id=tripleo_step1, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 qdrouterd, version=17.1.12, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 qdrouterd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, container_name=metrics_qdr, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '29d07da103bbd44a9ed3e29999314b03'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, vcs-type=git, io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible, distribution-scope=public, vendor=Red Hat, Inc., release=1761123044, io.openshift.expose-services=, com.redhat.component=openstack-qdrouterd-container) Dec 15 03:14:06 localhost podman[64586]: 2025-12-15 08:14:06.9976323 +0000 UTC m=+0.324660755 container exec_died 6278d648aec9c9f12e0efaafa0d6ae5653eeb34459516699e562eeb9c8d23b3c (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, tcib_managed=true, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 17.1_20251118.1, release=1761123044, container_name=metrics_qdr, version=17.1.12, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-11-18T22:49:46Z, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.component=openstack-qdrouterd-container, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '29d07da103bbd44a9ed3e29999314b03'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, distribution-scope=public, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, summary=Red Hat OpenStack Platform 17.1 qdrouterd, batch=17.1_20251118.1, vendor=Red Hat, Inc., managed_by=tripleo_ansible, name=rhosp17/openstack-qdrouterd, description=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-type=git, io.buildah.version=1.41.4, config_id=tripleo_step1, io.openshift.expose-services=, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd) Dec 15 03:14:07 localhost systemd[1]: 6278d648aec9c9f12e0efaafa0d6ae5653eeb34459516699e562eeb9c8d23b3c.service: Deactivated successfully. Dec 15 03:14:21 localhost systemd[1]: Started /usr/bin/podman healthcheck run 165a6fd1f2b629d16da0798ec1c9bac41d302d530a0ffa668b2315a52d6aa04e. 
Dec 15 03:14:21 localhost podman[64615]: 2025-12-15 08:14:21.766666857 +0000 UTC m=+0.087868203 container health_status 165a6fd1f2b629d16da0798ec1c9bac41d302d530a0ffa668b2315a52d6aa04e (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, batch=17.1_20251118.1, release=1761123044, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.component=openstack-collectd-container, version=17.1.12, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, summary=Red Hat OpenStack Platform 17.1 collectd, description=Red Hat OpenStack Platform 17.1 collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', 
'/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, distribution-scope=public, name=rhosp17/openstack-collectd, vendor=Red Hat, Inc., container_name=collectd, architecture=x86_64, url=https://www.redhat.com, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vcs-type=git, config_id=tripleo_step3, managed_by=tripleo_ansible, build-date=2025-11-18T22:51:28Z, io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, maintainer=OpenStack TripleO Team, tcib_managed=true) Dec 15 03:14:21 localhost podman[64615]: 2025-12-15 08:14:21.777480962 +0000 UTC m=+0.098682368 container exec_died 165a6fd1f2b629d16da0798ec1c9bac41d302d530a0ffa668b2315a52d6aa04e (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, maintainer=OpenStack TripleO Team, release=1761123044, tcib_managed=true, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, managed_by=tripleo_ansible, config_id=tripleo_step3, build-date=2025-11-18T22:51:28Z, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, com.redhat.component=openstack-collectd-container, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.expose-services=, io.buildah.version=1.41.4, batch=17.1_20251118.1, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 
'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 collectd, summary=Red Hat OpenStack Platform 17.1 collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, container_name=collectd, architecture=x86_64, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, distribution-scope=public, version=17.1.12, name=rhosp17/openstack-collectd) Dec 15 03:14:21 localhost systemd[1]: 165a6fd1f2b629d16da0798ec1c9bac41d302d530a0ffa668b2315a52d6aa04e.service: Deactivated successfully. Dec 15 03:14:23 localhost systemd[1]: Started /usr/bin/podman healthcheck run 2649c3aec9af2f04509e1d248b1050b202fccc3ed2b86421c89a2a65376db057. 
Dec 15 03:14:23 localhost podman[64636]: 2025-12-15 08:14:23.771694288 +0000 UTC m=+0.098857703 container health_status 2649c3aec9af2f04509e1d248b1050b202fccc3ed2b86421c89a2a65376db057 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, config_id=tripleo_step3, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-11-18T23:44:13Z, tcib_managed=true, batch=17.1_20251118.1, version=17.1.12, managed_by=tripleo_ansible, com.redhat.component=openstack-iscsid-container, vcs-type=git, vendor=Red Hat, Inc., config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '182e509007ab5e6e5b2500a552cbd5ba'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, description=Red Hat OpenStack Platform 17.1 iscsid, architecture=x86_64, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, distribution-scope=public, url=https://www.redhat.com, container_name=iscsid, io.buildah.version=1.41.4, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, konflux.additional-tags=17.1.12 17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, release=1761123044, summary=Red Hat OpenStack Platform 17.1 iscsid, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, name=rhosp17/openstack-iscsid, maintainer=OpenStack TripleO Team) Dec 15 03:14:23 localhost podman[64636]: 2025-12-15 08:14:23.784260088 +0000 UTC m=+0.111423503 container exec_died 2649c3aec9af2f04509e1d248b1050b202fccc3ed2b86421c89a2a65376db057 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-iscsid-container, maintainer=OpenStack TripleO Team, distribution-scope=public, build-date=2025-11-18T23:44:13Z, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 iscsid, batch=17.1_20251118.1, vcs-type=git, io.openshift.expose-services=, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, config_id=tripleo_step3, release=1761123044, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, konflux.additional-tags=17.1.12 17.1_20251118.1, name=rhosp17/openstack-iscsid, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, container_name=iscsid, io.buildah.version=1.41.4, tcib_managed=true, url=https://www.redhat.com, version=17.1.12, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, config_data={'environment': 
{'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '182e509007ab5e6e5b2500a552cbd5ba'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, description=Red Hat OpenStack Platform 17.1 iscsid, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid) Dec 15 03:14:23 localhost systemd[1]: 2649c3aec9af2f04509e1d248b1050b202fccc3ed2b86421c89a2a65376db057.service: Deactivated successfully. Dec 15 03:14:37 localhost systemd[1]: Started /usr/bin/podman healthcheck run 6278d648aec9c9f12e0efaafa0d6ae5653eeb34459516699e562eeb9c8d23b3c. Dec 15 03:14:37 localhost systemd[1]: tmp-crun.cns8zR.mount: Deactivated successfully. 
Dec 15 03:14:37 localhost podman[64765]: 2025-12-15 08:14:37.761754072 +0000 UTC m=+0.092168057 container health_status 6278d648aec9c9f12e0efaafa0d6ae5653eeb34459516699e562eeb9c8d23b3c (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, container_name=metrics_qdr, maintainer=OpenStack TripleO Team, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.expose-services=, tcib_managed=true, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, version=17.1.12, vendor=Red Hat, Inc., build-date=2025-11-18T22:49:46Z, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '29d07da103bbd44a9ed3e29999314b03'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, distribution-scope=public, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, description=Red Hat OpenStack Platform 17.1 qdrouterd, io.buildah.version=1.41.4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, release=1761123044, batch=17.1_20251118.1, name=rhosp17/openstack-qdrouterd, url=https://www.redhat.com, konflux.additional-tags=17.1.12 17.1_20251118.1, config_id=tripleo_step1, summary=Red Hat OpenStack Platform 17.1 qdrouterd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vcs-type=git, managed_by=tripleo_ansible, com.redhat.component=openstack-qdrouterd-container) Dec 15 03:14:37 localhost podman[64765]: 2025-12-15 08:14:37.984066232 +0000 UTC m=+0.314480207 container exec_died 6278d648aec9c9f12e0efaafa0d6ae5653eeb34459516699e562eeb9c8d23b3c (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, distribution-scope=public, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, url=https://www.redhat.com, config_id=tripleo_step1, tcib_managed=true, version=17.1.12, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, io.buildah.version=1.41.4, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '29d07da103bbd44a9ed3e29999314b03'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, architecture=x86_64, container_name=metrics_qdr, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.component=openstack-qdrouterd-container, summary=Red Hat OpenStack Platform 17.1 qdrouterd, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, batch=17.1_20251118.1, name=rhosp17/openstack-qdrouterd, build-date=2025-11-18T22:49:46Z, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, description=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.expose-services=, release=1761123044, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd) Dec 15 03:14:37 localhost systemd[1]: 6278d648aec9c9f12e0efaafa0d6ae5653eeb34459516699e562eeb9c8d23b3c.service: Deactivated successfully. Dec 15 03:14:52 localhost systemd[1]: Started /usr/bin/podman healthcheck run 165a6fd1f2b629d16da0798ec1c9bac41d302d530a0ffa668b2315a52d6aa04e. 
Dec 15 03:14:52 localhost podman[64810]: 2025-12-15 08:14:52.768219229 +0000 UTC m=+0.095606247 container health_status 165a6fd1f2b629d16da0798ec1c9bac41d302d530a0ffa668b2315a52d6aa04e (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, com.redhat.component=openstack-collectd-container, version=17.1.12, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, release=1761123044, architecture=x86_64, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, io.openshift.expose-services=, batch=17.1_20251118.1, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-18T22:51:28Z, vcs-type=git, container_name=collectd, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 collectd, managed_by=tripleo_ansible, vendor=Red Hat, Inc., konflux.additional-tags=17.1.12 17.1_20251118.1, url=https://www.redhat.com, io.buildah.version=1.41.4, name=rhosp17/openstack-collectd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, summary=Red Hat OpenStack Platform 17.1 collectd, distribution-scope=public, config_id=tripleo_step3, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd) Dec 15 03:14:52 localhost podman[64810]: 2025-12-15 08:14:52.778612632 +0000 UTC m=+0.105999670 container exec_died 165a6fd1f2b629d16da0798ec1c9bac41d302d530a0ffa668b2315a52d6aa04e (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, architecture=x86_64, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, managed_by=tripleo_ansible, io.buildah.version=1.41.4, release=1761123044, description=Red Hat OpenStack Platform 17.1 collectd, url=https://www.redhat.com, distribution-scope=public, container_name=collectd, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp17/openstack-collectd, vendor=Red Hat, Inc., config_id=tripleo_step3, com.redhat.component=openstack-collectd-container, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, batch=17.1_20251118.1, maintainer=OpenStack TripleO Team, build-date=2025-11-18T22:51:28Z, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 collectd, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd) Dec 15 03:14:52 localhost systemd[1]: 165a6fd1f2b629d16da0798ec1c9bac41d302d530a0ffa668b2315a52d6aa04e.service: Deactivated successfully. Dec 15 03:14:54 localhost systemd[1]: Started /usr/bin/podman healthcheck run 2649c3aec9af2f04509e1d248b1050b202fccc3ed2b86421c89a2a65376db057. 
Dec 15 03:14:54 localhost podman[64830]: 2025-12-15 08:14:54.742538371 +0000 UTC m=+0.077087180 container health_status 2649c3aec9af2f04509e1d248b1050b202fccc3ed2b86421c89a2a65376db057 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, config_id=tripleo_step3, tcib_managed=true, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, container_name=iscsid, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team, release=1761123044, managed_by=tripleo_ansible, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 iscsid, architecture=x86_64, build-date=2025-11-18T23:44:13Z, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '182e509007ab5e6e5b2500a552cbd5ba'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, name=rhosp17/openstack-iscsid, batch=17.1_20251118.1, com.redhat.component=openstack-iscsid-container, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, version=17.1.12, distribution-scope=public) Dec 15 03:14:54 localhost podman[64830]: 2025-12-15 08:14:54.7774733 +0000 UTC m=+0.112022099 container exec_died 2649c3aec9af2f04509e1d248b1050b202fccc3ed2b86421c89a2a65376db057 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, distribution-scope=public, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, managed_by=tripleo_ansible, vcs-type=git, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '182e509007ab5e6e5b2500a552cbd5ba'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1, build-date=2025-11-18T23:44:13Z, container_name=iscsid, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, release=1761123044, com.redhat.component=openstack-iscsid-container, config_id=tripleo_step3, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 iscsid, tcib_managed=true, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, version=17.1.12, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, name=rhosp17/openstack-iscsid, description=Red Hat OpenStack Platform 17.1 iscsid) Dec 15 03:14:54 localhost systemd[1]: 2649c3aec9af2f04509e1d248b1050b202fccc3ed2b86421c89a2a65376db057.service: Deactivated successfully. 
Dec 15 03:15:05 localhost ceph-osd[31375]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS ------- Dec 15 03:15:05 localhost ceph-osd[31375]: rocksdb: [db/db_impl/db_impl.cc:1111] #012** DB Stats **#012Uptime(secs): 1800.1 total, 600.0 interval#012Cumulative writes: 4373 writes, 20K keys, 4373 commit groups, 1.0 writes per commit group, ingest: 0.02 GB, 0.01 MB/s#012Cumulative WAL: 4373 writes, 484 syncs, 9.04 writes per sync, written: 0.02 GB, 0.01 MB/s#012Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent#012Interval writes: 358 writes, 882 keys, 358 commit groups, 1.0 writes per commit group, ingest: 0.65 MB, 0.00 MB/s#012Interval WAL: 358 writes, 175 syncs, 2.05 writes per sync, written: 0.00 GB, 0.00 MB/s#012Interval stall: 00:00:0.000 H:M:S, 0.0 percent Dec 15 03:15:08 localhost systemd[1]: Started /usr/bin/podman healthcheck run 6278d648aec9c9f12e0efaafa0d6ae5653eeb34459516699e562eeb9c8d23b3c. Dec 15 03:15:08 localhost podman[64849]: 2025-12-15 08:15:08.75217077 +0000 UTC m=+0.083978612 container health_status 6278d648aec9c9f12e0efaafa0d6ae5653eeb34459516699e562eeb9c8d23b3c (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, url=https://www.redhat.com, version=17.1.12, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, managed_by=tripleo_ansible, io.openshift.expose-services=, com.redhat.component=openstack-qdrouterd-container, konflux.additional-tags=17.1.12 17.1_20251118.1, release=1761123044, name=rhosp17/openstack-qdrouterd, container_name=metrics_qdr, config_id=tripleo_step1, description=Red Hat OpenStack Platform 17.1 qdrouterd, summary=Red Hat OpenStack Platform 
17.1 qdrouterd, build-date=2025-11-18T22:49:46Z, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, batch=17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-type=git, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, architecture=x86_64, tcib_managed=true, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '29d07da103bbd44a9ed3e29999314b03'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, io.buildah.version=1.41.4) Dec 15 03:15:08 localhost podman[64849]: 2025-12-15 08:15:08.953489057 +0000 UTC m=+0.285296909 container exec_died 6278d648aec9c9f12e0efaafa0d6ae5653eeb34459516699e562eeb9c8d23b3c (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, build-date=2025-11-18T22:49:46Z, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.expose-services=, distribution-scope=public, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 
'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '29d07da103bbd44a9ed3e29999314b03'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, konflux.additional-tags=17.1.12 17.1_20251118.1, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, managed_by=tripleo_ansible, vcs-type=git, config_id=tripleo_step1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, name=rhosp17/openstack-qdrouterd, url=https://www.redhat.com, release=1761123044, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, architecture=x86_64, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 qdrouterd, version=17.1.12, batch=17.1_20251118.1, container_name=metrics_qdr, io.buildah.version=1.41.4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 qdrouterd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, 
tcib_managed=true, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.component=openstack-qdrouterd-container) Dec 15 03:15:08 localhost systemd[1]: 6278d648aec9c9f12e0efaafa0d6ae5653eeb34459516699e562eeb9c8d23b3c.service: Deactivated successfully. Dec 15 03:15:10 localhost ceph-osd[32311]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS ------- Dec 15 03:15:10 localhost ceph-osd[32311]: rocksdb: [db/db_impl/db_impl.cc:1111] #012** DB Stats **#012Uptime(secs): 1800.2 total, 600.0 interval#012Cumulative writes: 5246 writes, 23K keys, 5246 commit groups, 1.0 writes per commit group, ingest: 0.02 GB, 0.01 MB/s#012Cumulative WAL: 5246 writes, 573 syncs, 9.16 writes per sync, written: 0.02 GB, 0.01 MB/s#012Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent#012Interval writes: 425 writes, 944 keys, 425 commit groups, 1.0 writes per commit group, ingest: 0.66 MB, 0.00 MB/s#012Interval WAL: 426 writes, 210 syncs, 2.03 writes per sync, written: 0.00 GB, 0.00 MB/s#012Interval stall: 00:00:0.000 H:M:S, 0.0 percent Dec 15 03:15:17 localhost sshd[64878]: main: sshd: ssh-rsa algorithm is disabled Dec 15 03:15:23 localhost systemd[1]: Started /usr/bin/podman healthcheck run 165a6fd1f2b629d16da0798ec1c9bac41d302d530a0ffa668b2315a52d6aa04e. 
Dec 15 03:15:23 localhost podman[64880]: 2025-12-15 08:15:23.967681659 +0000 UTC m=+0.300446418 container health_status 165a6fd1f2b629d16da0798ec1c9bac41d302d530a0ffa668b2315a52d6aa04e (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, container_name=collectd, description=Red Hat OpenStack Platform 17.1 collectd, tcib_managed=true, version=17.1.12, config_id=tripleo_step3, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, name=rhosp17/openstack-collectd, vcs-type=git, vendor=Red Hat, Inc., distribution-scope=public, konflux.additional-tags=17.1.12 17.1_20251118.1, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, com.redhat.component=openstack-collectd-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, build-date=2025-11-18T22:51:28Z, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, batch=17.1_20251118.1, managed_by=tripleo_ansible, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 collectd, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.buildah.version=1.41.4, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, release=1761123044) Dec 15 03:15:24 localhost podman[64880]: 2025-12-15 08:15:24.086601028 +0000 UTC m=+0.419365737 container exec_died 165a6fd1f2b629d16da0798ec1c9bac41d302d530a0ffa668b2315a52d6aa04e (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, tcib_managed=true, name=rhosp17/openstack-collectd, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, managed_by=tripleo_ansible, config_id=tripleo_step3, com.redhat.component=openstack-collectd-container, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, url=https://www.redhat.com, release=1761123044, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, architecture=x86_64, distribution-scope=public, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 collectd, build-date=2025-11-18T22:51:28Z, summary=Red Hat OpenStack Platform 17.1 collectd, container_name=collectd, batch=17.1_20251118.1, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vcs-type=git) Dec 15 03:15:24 localhost systemd[1]: 165a6fd1f2b629d16da0798ec1c9bac41d302d530a0ffa668b2315a52d6aa04e.service: Deactivated successfully. Dec 15 03:15:25 localhost systemd[1]: Started /usr/bin/podman healthcheck run 2649c3aec9af2f04509e1d248b1050b202fccc3ed2b86421c89a2a65376db057. 
Dec 15 03:15:25 localhost podman[64900]: 2025-12-15 08:15:25.877138794 +0000 UTC m=+0.211233961 container health_status 2649c3aec9af2f04509e1d248b1050b202fccc3ed2b86421c89a2a65376db057 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, description=Red Hat OpenStack Platform 17.1 iscsid, io.buildah.version=1.41.4, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_id=tripleo_step3, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, managed_by=tripleo_ansible, com.redhat.component=openstack-iscsid-container, name=rhosp17/openstack-iscsid, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 iscsid, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, release=1761123044, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, build-date=2025-11-18T23:44:13Z, distribution-scope=public, io.openshift.expose-services=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '182e509007ab5e6e5b2500a552cbd5ba'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', 
'/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, vendor=Red Hat, Inc., architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, url=https://www.redhat.com, vcs-type=git, batch=17.1_20251118.1, container_name=iscsid, version=17.1.12) Dec 15 03:15:25 localhost podman[64900]: 2025-12-15 08:15:25.996150046 +0000 UTC m=+0.330245223 container exec_died 2649c3aec9af2f04509e1d248b1050b202fccc3ed2b86421c89a2a65376db057 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.buildah.version=1.41.4, build-date=2025-11-18T23:44:13Z, summary=Red Hat OpenStack Platform 17.1 iscsid, release=1761123044, architecture=x86_64, maintainer=OpenStack TripleO Team, version=17.1.12, description=Red Hat OpenStack Platform 17.1 iscsid, vcs-type=git, container_name=iscsid, tcib_managed=true, distribution-scope=public, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '182e509007ab5e6e5b2500a552cbd5ba'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 
'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, vendor=Red Hat, Inc., konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, com.redhat.component=openstack-iscsid-container, name=rhosp17/openstack-iscsid, config_id=tripleo_step3, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid) Dec 15 03:15:26 localhost systemd[1]: 2649c3aec9af2f04509e1d248b1050b202fccc3ed2b86421c89a2a65376db057.service: Deactivated successfully. Dec 15 03:15:39 localhost systemd[1]: Started /usr/bin/podman healthcheck run 6278d648aec9c9f12e0efaafa0d6ae5653eeb34459516699e562eeb9c8d23b3c. 
Dec 15 03:15:39 localhost podman[64919]: 2025-12-15 08:15:39.75142256 +0000 UTC m=+0.085531632 container health_status 6278d648aec9c9f12e0efaafa0d6ae5653eeb34459516699e562eeb9c8d23b3c (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, description=Red Hat OpenStack Platform 17.1 qdrouterd, build-date=2025-11-18T22:49:46Z, version=17.1.12, container_name=metrics_qdr, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, batch=17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.12 17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, summary=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.component=openstack-qdrouterd-container, url=https://www.redhat.com, vcs-type=git, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '29d07da103bbd44a9ed3e29999314b03'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, io.k8s.description=Red Hat OpenStack Platform 17.1 
qdrouterd, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, release=1761123044, architecture=x86_64, config_id=tripleo_step1, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.buildah.version=1.41.4, name=rhosp17/openstack-qdrouterd, tcib_managed=true, io.openshift.expose-services=) Dec 15 03:15:39 localhost podman[64919]: 2025-12-15 08:15:39.981533506 +0000 UTC m=+0.315642648 container exec_died 6278d648aec9c9f12e0efaafa0d6ae5653eeb34459516699e562eeb9c8d23b3c (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, build-date=2025-11-18T22:49:46Z, vcs-type=git, batch=17.1_20251118.1, com.redhat.component=openstack-qdrouterd-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, container_name=metrics_qdr, io.openshift.expose-services=, config_id=tripleo_step1, io.buildah.version=1.41.4, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '29d07da103bbd44a9ed3e29999314b03'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, summary=Red Hat OpenStack Platform 17.1 qdrouterd, vendor=Red Hat, Inc., org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, url=https://www.redhat.com, distribution-scope=public, architecture=x86_64, maintainer=OpenStack TripleO Team, version=17.1.12, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, konflux.additional-tags=17.1.12 17.1_20251118.1, name=rhosp17/openstack-qdrouterd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, release=1761123044, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 qdrouterd) Dec 15 03:15:39 localhost systemd[1]: 6278d648aec9c9f12e0efaafa0d6ae5653eeb34459516699e562eeb9c8d23b3c.service: Deactivated successfully. 
Dec 15 03:15:40 localhost python3[64995]: ansible-ansible.legacy.stat Invoked with path=/etc/puppet/hieradata/config_step.json follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True Dec 15 03:15:41 localhost python3[65040]: ansible-ansible.legacy.copy Invoked with dest=/etc/puppet/hieradata/config_step.json force=True mode=0600 src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1765786540.288681-106946-255416139186756/source _original_basename=tmphcd52hr5 follow=False checksum=ee48fb03297eb703b1954c8852d0f67fab51dac1 backup=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Dec 15 03:15:42 localhost python3[65154]: ansible-ansible.legacy.stat Invoked with path=/usr/libexec/recover_tripleo_nova_virtqemud.sh follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True Dec 15 03:15:42 localhost python3[65223]: ansible-ansible.legacy.copy Invoked with dest=/usr/libexec/recover_tripleo_nova_virtqemud.sh mode=0755 src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1765786542.040557-107034-268252511301035/source _original_basename=tmplezpb005 follow=False checksum=922b8aa8342176110bffc2e39abdccc2b39e53a9 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Dec 15 03:15:43 localhost python3[65320]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/tripleo_nova_virtqemud_recover.service follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True Dec 15 03:15:43 localhost python3[65395]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/tripleo_nova_virtqemud_recover.service mode=0644 
src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1765786543.0850925-107299-40125976239802/source _original_basename=tmp0_sn0zkr follow=False checksum=92f73544b703afc85885fa63ab07bdf8f8671554 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Dec 15 03:15:44 localhost podman[65464]: Dec 15 03:15:44 localhost podman[65464]: 2025-12-15 08:15:44.189086744 +0000 UTC m=+0.083123638 container create ec0336f61e6554c86ecd8ee455aeed65c8b910cf2cb2ad408d93ce788aaae889 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=thirsty_mendeleev, description=Red Hat Ceph Storage 7, io.openshift.expose-services=, maintainer=Guillaume Abrioux , com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1763362218, com.redhat.component=rhceph-container, vendor=Red Hat, Inc., ceph=True, architecture=x86_64, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, url=https://catalog.redhat.com/en/search?searchType=containers, io.buildah.version=1.41.4, build-date=2025-11-26T19:44:28Z, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, CEPH_POINT_RELEASE=, io.k8s.description=Red Hat Ceph Storage 7, vcs-type=git, GIT_CLEAN=True, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., RELEASE=main, version=7, io.openshift.tags=rhceph ceph, distribution-scope=public, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, name=rhceph, GIT_REPO=https://github.com/ceph/ceph-container.git, GIT_BRANCH=main) Dec 15 03:15:44 localhost systemd[1]: Started libpod-conmon-ec0336f61e6554c86ecd8ee455aeed65c8b910cf2cb2ad408d93ce788aaae889.scope. Dec 15 03:15:44 localhost systemd[1]: Started libcrun container. 
Dec 15 03:15:44 localhost podman[65464]: 2025-12-15 08:15:44.156363304 +0000 UTC m=+0.050400268 image pull registry.redhat.io/rhceph/rhceph-7-rhel9:latest Dec 15 03:15:44 localhost podman[65464]: 2025-12-15 08:15:44.267305083 +0000 UTC m=+0.161341997 container init ec0336f61e6554c86ecd8ee455aeed65c8b910cf2cb2ad408d93ce788aaae889 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=thirsty_mendeleev, vendor=Red Hat, Inc., description=Red Hat Ceph Storage 7, distribution-scope=public, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, architecture=x86_64, GIT_REPO=https://github.com/ceph/ceph-container.git, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, io.openshift.expose-services=, GIT_BRANCH=main, RELEASE=main, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., com.redhat.component=rhceph-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, url=https://catalog.redhat.com/en/search?searchType=containers, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, GIT_CLEAN=True, io.k8s.description=Red Hat Ceph Storage 7, io.buildah.version=1.41.4, maintainer=Guillaume Abrioux , version=7, name=rhceph, CEPH_POINT_RELEASE=, release=1763362218, ceph=True, io.openshift.tags=rhceph ceph, vcs-type=git, build-date=2025-11-26T19:44:28Z) Dec 15 03:15:44 localhost podman[65464]: 2025-12-15 08:15:44.278441166 +0000 UTC m=+0.172478090 container start ec0336f61e6554c86ecd8ee455aeed65c8b910cf2cb2ad408d93ce788aaae889 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=thirsty_mendeleev, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, maintainer=Guillaume Abrioux , org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, url=https://catalog.redhat.com/en/search?searchType=containers, distribution-scope=public, 
io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, com.redhat.component=rhceph-container, ceph=True, io.k8s.description=Red Hat Ceph Storage 7, description=Red Hat Ceph Storage 7, architecture=x86_64, name=rhceph, CEPH_POINT_RELEASE=, version=7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-11-26T19:44:28Z, GIT_BRANCH=main, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_CLEAN=True, release=1763362218, io.openshift.tags=rhceph ceph, io.buildah.version=1.41.4, vendor=Red Hat, Inc., GIT_REPO=https://github.com/ceph/ceph-container.git, RELEASE=main, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.openshift.expose-services=, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, vcs-type=git) Dec 15 03:15:44 localhost podman[65464]: 2025-12-15 08:15:44.278749184 +0000 UTC m=+0.172786108 container attach ec0336f61e6554c86ecd8ee455aeed65c8b910cf2cb2ad408d93ce788aaae889 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=thirsty_mendeleev, vendor=Red Hat, Inc., maintainer=Guillaume Abrioux , CEPH_POINT_RELEASE=, architecture=x86_64, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, build-date=2025-11-26T19:44:28Z, distribution-scope=public, RELEASE=main, ceph=True, GIT_BRANCH=main, com.redhat.component=rhceph-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, GIT_REPO=https://github.com/ceph/ceph-container.git, release=1763362218, version=7, vcs-type=git, io.k8s.description=Red Hat Ceph Storage 7, GIT_CLEAN=True, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, url=https://catalog.redhat.com/en/search?searchType=containers, name=rhceph, description=Red Hat Ceph Storage 7, io.openshift.expose-services=, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, 
org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, io.openshift.tags=rhceph ceph, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image.) Dec 15 03:15:44 localhost thirsty_mendeleev[65513]: 167 167 Dec 15 03:15:44 localhost systemd[1]: libpod-ec0336f61e6554c86ecd8ee455aeed65c8b910cf2cb2ad408d93ce788aaae889.scope: Deactivated successfully. Dec 15 03:15:44 localhost podman[65464]: 2025-12-15 08:15:44.283434297 +0000 UTC m=+0.177471211 container died ec0336f61e6554c86ecd8ee455aeed65c8b910cf2cb2ad408d93ce788aaae889 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=thirsty_mendeleev, release=1763362218, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, ceph=True, CEPH_POINT_RELEASE=, io.openshift.tags=rhceph ceph, build-date=2025-11-26T19:44:28Z, GIT_CLEAN=True, com.redhat.component=rhceph-container, url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.expose-services=, name=rhceph, vcs-type=git, description=Red Hat Ceph Storage 7, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., cpe=cpe:/a:redhat:enterprise_linux:9::appstream, maintainer=Guillaume Abrioux , GIT_REPO=https://github.com/ceph/ceph-container.git, io.buildah.version=1.41.4, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vendor=Red Hat, Inc., version=7, RELEASE=main, architecture=x86_64, io.k8s.description=Red Hat Ceph Storage 7, GIT_BRANCH=main, distribution-scope=public, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9) Dec 15 03:15:44 localhost podman[65519]: 2025-12-15 08:15:44.386522011 +0000 UTC m=+0.088968133 container remove ec0336f61e6554c86ecd8ee455aeed65c8b910cf2cb2ad408d93ce788aaae889 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, 
name=thirsty_mendeleev, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vcs-type=git, build-date=2025-11-26T19:44:28Z, url=https://catalog.redhat.com/en/search?searchType=containers, vendor=Red Hat, Inc., version=7, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, CEPH_POINT_RELEASE=, GIT_CLEAN=True, RELEASE=main, ceph=True, io.openshift.expose-services=, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.openshift.tags=rhceph ceph, io.buildah.version=1.41.4, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, architecture=x86_64, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, name=rhceph, com.redhat.component=rhceph-container, description=Red Hat Ceph Storage 7, io.k8s.description=Red Hat Ceph Storage 7, GIT_REPO=https://github.com/ceph/ceph-container.git, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=Guillaume Abrioux , release=1763362218, GIT_BRANCH=main, distribution-scope=public) Dec 15 03:15:44 localhost systemd[1]: libpod-conmon-ec0336f61e6554c86ecd8ee455aeed65c8b910cf2cb2ad408d93ce788aaae889.scope: Deactivated successfully. 
Dec 15 03:15:44 localhost python3[65516]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/tripleo_nova_virtqemud_recover.timer follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True Dec 15 03:15:44 localhost podman[65554]: Dec 15 03:15:44 localhost podman[65554]: 2025-12-15 08:15:44.592016539 +0000 UTC m=+0.066975524 container create 00546437df457b16877d2dee55d860e8b7dff015f5e79728b6e82029e355ba3f (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=tender_germain, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., name=rhceph, io.k8s.description=Red Hat Ceph Storage 7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, ceph=True, vendor=Red Hat, Inc., CEPH_POINT_RELEASE=, com.redhat.component=rhceph-container, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, vcs-type=git, release=1763362218, maintainer=Guillaume Abrioux , GIT_BRANCH=main, url=https://catalog.redhat.com/en/search?searchType=containers, distribution-scope=public, io.openshift.expose-services=, version=7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, architecture=x86_64, description=Red Hat Ceph Storage 7, io.buildah.version=1.41.4, build-date=2025-11-26T19:44:28Z, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, io.openshift.tags=rhceph ceph, RELEASE=main, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_CLEAN=True, GIT_REPO=https://github.com/ceph/ceph-container.git) Dec 15 03:15:44 localhost systemd[1]: Started libpod-conmon-00546437df457b16877d2dee55d860e8b7dff015f5e79728b6e82029e355ba3f.scope. Dec 15 03:15:44 localhost systemd[1]: Started libcrun container. 
Dec 15 03:15:44 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/07cc96bc9e9b5d947321842c08bb6463ddaf38b8dbb083c51f58cc798e26ef34/merged/rootfs supports timestamps until 2038 (0x7fffffff)
Dec 15 03:15:44 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/07cc96bc9e9b5d947321842c08bb6463ddaf38b8dbb083c51f58cc798e26ef34/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Dec 15 03:15:44 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/07cc96bc9e9b5d947321842c08bb6463ddaf38b8dbb083c51f58cc798e26ef34/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff)
Dec 15 03:15:44 localhost podman[65554]: 2025-12-15 08:15:44.561039174 +0000 UTC m=+0.035998159 image pull registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Dec 15 03:15:44 localhost podman[65554]: 2025-12-15 08:15:44.662329719 +0000 UTC m=+0.137288704 container init 00546437df457b16877d2dee55d860e8b7dff015f5e79728b6e82029e355ba3f (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=tender_germain, ceph=True, maintainer=Guillaume Abrioux , url=https://catalog.redhat.com/en/search?searchType=containers, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, release=1763362218, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, build-date=2025-11-26T19:44:28Z, com.redhat.component=rhceph-container, io.openshift.expose-services=, description=Red Hat Ceph Storage 7, RELEASE=main, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, CEPH_POINT_RELEASE=, architecture=x86_64, vendor=Red Hat, Inc., GIT_REPO=https://github.com/ceph/ceph-container.git, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, name=rhceph, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, vcs-type=git, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.openshift.tags=rhceph ceph, distribution-scope=public, io.buildah.version=1.41.4, version=7, io.k8s.description=Red Hat Ceph Storage 7, GIT_CLEAN=True, GIT_BRANCH=main)
Dec 15 03:15:44 localhost podman[65554]: 2025-12-15 08:15:44.673086263 +0000 UTC m=+0.148045248 container start 00546437df457b16877d2dee55d860e8b7dff015f5e79728b6e82029e355ba3f (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=tender_germain, architecture=x86_64, GIT_BRANCH=main, io.k8s.description=Red Hat Ceph Storage 7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat Ceph Storage 7, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, GIT_REPO=https://github.com/ceph/ceph-container.git, name=rhceph, io.openshift.expose-services=, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, vcs-type=git, release=1763362218, io.openshift.tags=rhceph ceph, version=7, vendor=Red Hat, Inc., GIT_CLEAN=True, CEPH_POINT_RELEASE=, maintainer=Guillaume Abrioux , io.buildah.version=1.41.4, ceph=True, distribution-scope=public, com.redhat.component=rhceph-container, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, build-date=2025-11-26T19:44:28Z, RELEASE=main, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0)
Dec 15 03:15:44 localhost podman[65554]: 2025-12-15 08:15:44.673495113 +0000 UTC m=+0.148454098 container attach 00546437df457b16877d2dee55d860e8b7dff015f5e79728b6e82029e355ba3f (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=tender_germain, release=1763362218, vendor=Red Hat, Inc., description=Red Hat Ceph Storage 7, io.buildah.version=1.41.4, build-date=2025-11-26T19:44:28Z, vcs-type=git, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.k8s.description=Red Hat Ceph Storage 7, io.openshift.tags=rhceph ceph, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=Guillaume Abrioux , GIT_BRANCH=main, GIT_CLEAN=True, com.redhat.component=rhceph-container, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, ceph=True, CEPH_POINT_RELEASE=, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, distribution-scope=public, url=https://catalog.redhat.com/en/search?searchType=containers, GIT_REPO=https://github.com/ceph/ceph-container.git, architecture=x86_64, name=rhceph, io.openshift.expose-services=, version=7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, RELEASE=main)
Dec 15 03:15:44 localhost python3[65604]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/tripleo_nova_virtqemud_recover.timer mode=0644 src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1765786544.090237-107370-229998439223957/source _original_basename=tmpd9a1asee follow=False checksum=c6e5f76a53c0d6ccaf46c4b48d813dc2891ad8e9 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 15 03:15:45 localhost systemd[1]: var-lib-containers-storage-overlay-334e6b64dc37c5550cf0c9a7832da5b5d346fc6a0becce3c43611fa21b900544-merged.mount: Deactivated successfully.
Dec 15 03:15:45 localhost python3[66112]: ansible-systemd Invoked with daemon_reload=True enabled=True name=tripleo_nova_virtqemud_recover.service daemon_reexec=False scope=system no_block=False state=None force=None masked=None
Dec 15 03:15:45 localhost tender_germain[65598]: [
Dec 15 03:15:45 localhost tender_germain[65598]: {
Dec 15 03:15:45 localhost tender_germain[65598]: "available": false,
Dec 15 03:15:45 localhost tender_germain[65598]: "ceph_device": false,
Dec 15 03:15:45 localhost tender_germain[65598]: "device_id": "QEMU_DVD-ROM_QM00001",
Dec 15 03:15:45 localhost tender_germain[65598]: "lsm_data": {},
Dec 15 03:15:45 localhost tender_germain[65598]: "lvs": [],
Dec 15 03:15:45 localhost tender_germain[65598]: "path": "/dev/sr0",
Dec 15 03:15:45 localhost tender_germain[65598]: "rejected_reasons": [
Dec 15 03:15:45 localhost tender_germain[65598]: "Insufficient space (<5GB)",
Dec 15 03:15:45 localhost tender_germain[65598]: "Has a FileSystem"
Dec 15 03:15:45 localhost tender_germain[65598]: ],
Dec 15 03:15:45 localhost tender_germain[65598]: "sys_api": {
Dec 15 03:15:45 localhost tender_germain[65598]: "actuators": null,
Dec 15 03:15:45 localhost tender_germain[65598]: "device_nodes": "sr0",
Dec 15 03:15:45 localhost tender_germain[65598]: "human_readable_size": "482.00 KB",
Dec 15 03:15:45 localhost tender_germain[65598]: "id_bus": "ata",
Dec 15 03:15:45 localhost tender_germain[65598]: "model": "QEMU DVD-ROM",
Dec 15 03:15:45 localhost tender_germain[65598]: "nr_requests": "2",
Dec 15 03:15:45 localhost tender_germain[65598]: "partitions": {},
Dec 15 03:15:45 localhost tender_germain[65598]: "path": "/dev/sr0",
Dec 15 03:15:45 localhost tender_germain[65598]: "removable": "1",
Dec 15 03:15:45 localhost tender_germain[65598]: "rev": "2.5+",
Dec 15 03:15:45 localhost tender_germain[65598]: "ro": "0",
Dec 15 03:15:45 localhost tender_germain[65598]: "rotational": "1",
Dec 15 03:15:45 localhost tender_germain[65598]: "sas_address": "",
Dec 15 03:15:45 localhost tender_germain[65598]: "sas_device_handle": "",
Dec 15 03:15:45 localhost tender_germain[65598]: "scheduler_mode": "mq-deadline",
Dec 15 03:15:45 localhost tender_germain[65598]: "sectors": 0,
Dec 15 03:15:45 localhost tender_germain[65598]: "sectorsize": "2048",
Dec 15 03:15:45 localhost tender_germain[65598]: "size": 493568.0,
Dec 15 03:15:45 localhost tender_germain[65598]: "support_discard": "0",
Dec 15 03:15:45 localhost tender_germain[65598]: "type": "disk",
Dec 15 03:15:45 localhost tender_germain[65598]: "vendor": "QEMU"
Dec 15 03:15:45 localhost tender_germain[65598]: }
Dec 15 03:15:45 localhost tender_germain[65598]: }
Dec 15 03:15:45 localhost tender_germain[65598]: ]
Dec 15 03:15:45 localhost systemd[1]: Reloading.
Dec 15 03:15:45 localhost podman[65554]: 2025-12-15 08:15:45.645726111 +0000 UTC m=+1.120685066 container died 00546437df457b16877d2dee55d860e8b7dff015f5e79728b6e82029e355ba3f (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=tender_germain, CEPH_POINT_RELEASE=, com.redhat.component=rhceph-container, url=https://catalog.redhat.com/en/search?searchType=containers, release=1763362218, io.openshift.expose-services=, version=7, vcs-type=git, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, GIT_REPO=https://github.com/ceph/ceph-container.git, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., distribution-scope=public, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.openshift.tags=rhceph ceph, name=rhceph, io.k8s.description=Red Hat Ceph Storage 7, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, RELEASE=main, io.buildah.version=1.41.4, ceph=True, maintainer=Guillaume Abrioux , GIT_BRANCH=main, description=Red Hat Ceph Storage 7, GIT_CLEAN=True, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, architecture=x86_64, build-date=2025-11-26T19:44:28Z, vendor=Red Hat, Inc.)
Dec 15 03:15:45 localhost systemd-sysv-generator[67397]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 15 03:15:45 localhost systemd-rc-local-generator[67393]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 15 03:15:45 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Dec 15 03:15:45 localhost systemd[1]: tmp-crun.2jxVdw.mount: Deactivated successfully.
Dec 15 03:15:45 localhost systemd[1]: libpod-00546437df457b16877d2dee55d860e8b7dff015f5e79728b6e82029e355ba3f.scope: Deactivated successfully.
Dec 15 03:15:45 localhost systemd[1]: var-lib-containers-storage-overlay-07cc96bc9e9b5d947321842c08bb6463ddaf38b8dbb083c51f58cc798e26ef34-merged.mount: Deactivated successfully.
Dec 15 03:15:45 localhost podman[67360]: 2025-12-15 08:15:45.926545023 +0000 UTC m=+0.271425725 container remove 00546437df457b16877d2dee55d860e8b7dff015f5e79728b6e82029e355ba3f (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=tender_germain, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, GIT_CLEAN=True, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, name=rhceph, release=1763362218, CEPH_POINT_RELEASE=, maintainer=Guillaume Abrioux , io.buildah.version=1.41.4, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.k8s.description=Red Hat Ceph Storage 7, io.openshift.tags=rhceph ceph, architecture=x86_64, url=https://catalog.redhat.com/en/search?searchType=containers, GIT_BRANCH=main, distribution-scope=public, vendor=Red Hat, Inc., GIT_REPO=https://github.com/ceph/ceph-container.git, io.openshift.expose-services=, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, ceph=True, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, build-date=2025-11-26T19:44:28Z, vcs-type=git, version=7, RELEASE=main, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., description=Red Hat Ceph Storage 7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=rhceph-container)
Dec 15 03:15:45 localhost systemd[1]: libpod-conmon-00546437df457b16877d2dee55d860e8b7dff015f5e79728b6e82029e355ba3f.scope: Deactivated successfully.
Dec 15 03:15:45 localhost systemd[1]: Reloading.
Dec 15 03:15:46 localhost systemd-rc-local-generator[67436]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 15 03:15:46 localhost systemd-sysv-generator[67441]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 15 03:15:46 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Dec 15 03:15:46 localhost python3[67478]: ansible-systemd Invoked with daemon_reload=True enabled=True name=tripleo_nova_virtqemud_recover.timer state=restarted daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 15 03:15:46 localhost systemd[1]: Reloading.
Dec 15 03:15:46 localhost systemd-rc-local-generator[67501]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 15 03:15:46 localhost systemd-sysv-generator[67504]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 15 03:15:46 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Dec 15 03:15:47 localhost systemd[1]: Reloading.
Dec 15 03:15:47 localhost systemd-sysv-generator[67546]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 15 03:15:47 localhost systemd-rc-local-generator[67539]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 15 03:15:47 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Dec 15 03:15:47 localhost systemd[1]: Started Check and recover tripleo_nova_virtqemud every 10m.
Dec 15 03:15:47 localhost python3[67568]: ansible-ansible.legacy.command Invoked with _raw_params=systemctl enable --now tripleo_nova_virtqemud_recover.timer _uses_shell=True stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 15 03:15:47 localhost systemd[1]: Reloading.
Dec 15 03:15:47 localhost systemd-sysv-generator[67600]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 15 03:15:47 localhost systemd-rc-local-generator[67595]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 15 03:15:47 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Dec 15 03:15:48 localhost python3[67652]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/tripleo_nova_libvirt.target follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Dec 15 03:15:49 localhost python3[67695]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/tripleo_nova_libvirt.target group=root mode=0644 owner=root src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1765786548.3200817-107596-273207470328134/source _original_basename=tmpfy6o7okt follow=False checksum=c064b4a8e7d3d1d7c62d1f80a09e350659996afd backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 15 03:15:49 localhost python3[67725]: ansible-systemd Invoked with daemon_reload=True enabled=True name=tripleo_nova_libvirt.target state=restarted daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 15 03:15:49 localhost systemd[1]: Reloading.
Dec 15 03:15:49 localhost systemd-sysv-generator[67756]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 15 03:15:49 localhost systemd-rc-local-generator[67753]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 15 03:15:49 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Dec 15 03:15:49 localhost systemd[1]: Reached target tripleo_nova_libvirt.target.
Dec 15 03:15:50 localhost python3[67780]: ansible-stat Invoked with path=/var/lib/tripleo-config/container-startup-config/step_4 follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Dec 15 03:15:51 localhost ansible-async_wrapper.py[67952]: Invoked with 748410551285 3600 /home/tripleo-admin/.ansible/tmp/ansible-tmp-1765786551.4166582-107680-231870656920935/AnsiballZ_command.py _
Dec 15 03:15:51 localhost ansible-async_wrapper.py[67955]: Starting module and watcher
Dec 15 03:15:51 localhost ansible-async_wrapper.py[67955]: Start watching 67956 (3600)
Dec 15 03:15:51 localhost ansible-async_wrapper.py[67956]: Start module (67956)
Dec 15 03:15:51 localhost ansible-async_wrapper.py[67952]: Return async_wrapper task started.
Dec 15 03:15:52 localhost python3[67976]: ansible-ansible.legacy.async_status Invoked with jid=748410551285.67952 mode=status _async_dir=/tmp/.ansible_async
Dec 15 03:15:54 localhost systemd[1]: Started /usr/bin/podman healthcheck run 165a6fd1f2b629d16da0798ec1c9bac41d302d530a0ffa668b2315a52d6aa04e.
Dec 15 03:15:54 localhost podman[68027]: 2025-12-15 08:15:54.662969567 +0000 UTC m=+0.094763146 container health_status 165a6fd1f2b629d16da0798ec1c9bac41d302d530a0ffa668b2315a52d6aa04e (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, tcib_managed=true, batch=17.1_20251118.1, name=rhosp17/openstack-collectd, vcs-type=git, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, url=https://www.redhat.com, managed_by=tripleo_ansible, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, io.openshift.expose-services=, io.buildah.version=1.41.4, config_id=tripleo_step3, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, release=1761123044, container_name=collectd, description=Red Hat OpenStack Platform 17.1 collectd, summary=Red Hat OpenStack Platform 17.1 collectd, version=17.1.12, vendor=Red Hat, Inc., com.redhat.component=openstack-collectd-container, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, maintainer=OpenStack TripleO Team, distribution-scope=public, architecture=x86_64, build-date=2025-11-18T22:51:28Z, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Dec 15 03:15:54 localhost podman[68027]: 2025-12-15 08:15:54.67256249 +0000 UTC m=+0.104356089 container exec_died 165a6fd1f2b629d16da0798ec1c9bac41d302d530a0ffa668b2315a52d6aa04e (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, release=1761123044, batch=17.1_20251118.1, config_id=tripleo_step3, io.openshift.expose-services=, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, maintainer=OpenStack TripleO Team, vcs-type=git, tcib_managed=true, managed_by=tripleo_ansible, com.redhat.component=openstack-collectd-container, summary=Red Hat OpenStack Platform 17.1 collectd, version=17.1.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, build-date=2025-11-18T22:51:28Z, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, container_name=collectd, konflux.additional-tags=17.1.12 17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 collectd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, url=https://www.redhat.com, name=rhosp17/openstack-collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, architecture=x86_64, vendor=Red Hat, Inc.)
Dec 15 03:15:54 localhost systemd[1]: 165a6fd1f2b629d16da0798ec1c9bac41d302d530a0ffa668b2315a52d6aa04e.service: Deactivated successfully.
Dec 15 03:15:55 localhost puppet-user[67975]: Warning: /etc/puppet/hiera.yaml: Use of 'hiera.yaml' version 3 is deprecated. It should be converted to version 5
Dec 15 03:15:55 localhost puppet-user[67975]: (file: /etc/puppet/hiera.yaml)
Dec 15 03:15:55 localhost puppet-user[67975]: Warning: Undefined variable '::deploy_config_name';
Dec 15 03:15:55 localhost puppet-user[67975]: (file & line not available)
Dec 15 03:15:55 localhost puppet-user[67975]: Warning: The function 'hiera' is deprecated in favor of using 'lookup'. See https://puppet.com/docs/puppet/7.10/deprecated_language.html
Dec 15 03:15:55 localhost puppet-user[67975]: (file & line not available)
Dec 15 03:15:55 localhost puppet-user[67975]: Warning: Unknown variable: '::deployment_type'. (file: /etc/puppet/modules/tripleo/manifests/profile/base/database/mysql/client.pp, line: 89, column: 8)
Dec 15 03:15:55 localhost puppet-user[67975]: Warning: This method is deprecated, please use match expressions with Stdlib::Compat::String instead. They are described at https://docs.puppet.com/puppet/latest/reference/lang_data_type.html#match-expressions. at ["/etc/puppet/modules/snmp/manifests/params.pp", 310]:["/var/lib/tripleo-config/puppet_step_config.pp", 4]
Dec 15 03:15:55 localhost puppet-user[67975]: (location: /etc/puppet/modules/stdlib/lib/puppet/functions/deprecation.rb:34:in `deprecation')
Dec 15 03:15:55 localhost puppet-user[67975]: Warning: This method is deprecated, please use the stdlib validate_legacy function,
Dec 15 03:15:55 localhost puppet-user[67975]: with Stdlib::Compat::Bool. There is further documentation for validate_legacy function in the README. at ["/etc/puppet/modules/snmp/manifests/init.pp", 358]:["/var/lib/tripleo-config/puppet_step_config.pp", 4]
Dec 15 03:15:55 localhost puppet-user[67975]: (location: /etc/puppet/modules/stdlib/lib/puppet/functions/deprecation.rb:34:in `deprecation')
Dec 15 03:15:55 localhost puppet-user[67975]: Warning: This method is deprecated, please use the stdlib validate_legacy function,
Dec 15 03:15:55 localhost puppet-user[67975]: with Stdlib::Compat::Array. There is further documentation for validate_legacy function in the README. at ["/etc/puppet/modules/snmp/manifests/init.pp", 367]:["/var/lib/tripleo-config/puppet_step_config.pp", 4]
Dec 15 03:15:55 localhost puppet-user[67975]: (location: /etc/puppet/modules/stdlib/lib/puppet/functions/deprecation.rb:34:in `deprecation')
Dec 15 03:15:55 localhost puppet-user[67975]: Warning: This method is deprecated, please use the stdlib validate_legacy function,
Dec 15 03:15:55 localhost puppet-user[67975]: with Stdlib::Compat::String. There is further documentation for validate_legacy function in the README. at ["/etc/puppet/modules/snmp/manifests/init.pp", 382]:["/var/lib/tripleo-config/puppet_step_config.pp", 4]
Dec 15 03:15:55 localhost puppet-user[67975]: (location: /etc/puppet/modules/stdlib/lib/puppet/functions/deprecation.rb:34:in `deprecation')
Dec 15 03:15:55 localhost puppet-user[67975]: Warning: This method is deprecated, please use the stdlib validate_legacy function,
Dec 15 03:15:55 localhost puppet-user[67975]: with Stdlib::Compat::Numeric. There is further documentation for validate_legacy function in the README. at ["/etc/puppet/modules/snmp/manifests/init.pp", 388]:["/var/lib/tripleo-config/puppet_step_config.pp", 4]
Dec 15 03:15:55 localhost puppet-user[67975]: (location: /etc/puppet/modules/stdlib/lib/puppet/functions/deprecation.rb:34:in `deprecation')
Dec 15 03:15:55 localhost puppet-user[67975]: Warning: This method is deprecated, please use the stdlib validate_legacy function,
Dec 15 03:15:55 localhost puppet-user[67975]: with Pattern[]. There is further documentation for validate_legacy function in the README. at ["/etc/puppet/modules/snmp/manifests/init.pp", 393]:["/var/lib/tripleo-config/puppet_step_config.pp", 4]
Dec 15 03:15:55 localhost puppet-user[67975]: (location: /etc/puppet/modules/stdlib/lib/puppet/functions/deprecation.rb:34:in `deprecation')
Dec 15 03:15:55 localhost puppet-user[67975]: Warning: Unknown variable: '::deployment_type'. (file: /etc/puppet/modules/tripleo/manifests/packages.pp, line: 39, column: 69)
Dec 15 03:15:55 localhost puppet-user[67975]: Notice: Compiled catalog for np0005559462.localdomain in environment production in 0.21 seconds
Dec 15 03:15:56 localhost systemd[1]: Started /usr/bin/podman healthcheck run 2649c3aec9af2f04509e1d248b1050b202fccc3ed2b86421c89a2a65376db057.
Dec 15 03:15:56 localhost systemd[1]: tmp-crun.FvExM4.mount: Deactivated successfully.
Dec 15 03:15:56 localhost podman[68115]: 2025-12-15 08:15:56.774588603 +0000 UTC m=+0.106367361 container health_status 2649c3aec9af2f04509e1d248b1050b202fccc3ed2b86421c89a2a65376db057 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, io.buildah.version=1.41.4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, summary=Red Hat OpenStack Platform 17.1 iscsid, version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-type=git, name=rhosp17/openstack-iscsid, build-date=2025-11-18T23:44:13Z, managed_by=tripleo_ansible, release=1761123044, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=iscsid, config_id=tripleo_step3, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, vendor=Red Hat, Inc., url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, batch=17.1_20251118.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '182e509007ab5e6e5b2500a552cbd5ba'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, com.redhat.component=openstack-iscsid-container, maintainer=OpenStack TripleO Team, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, architecture=x86_64, konflux.additional-tags=17.1.12 17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 iscsid, distribution-scope=public, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, tcib_managed=true)
Dec 15 03:15:56 localhost podman[68115]: 2025-12-15 08:15:56.812449879 +0000 UTC m=+0.144228687 container exec_died 2649c3aec9af2f04509e1d248b1050b202fccc3ed2b86421c89a2a65376db057 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, io.buildah.version=1.41.4, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, release=1761123044, distribution-scope=public, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_id=tripleo_step3, name=rhosp17/openstack-iscsid, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, build-date=2025-11-18T23:44:13Z, summary=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.component=openstack-iscsid-container, batch=17.1_20251118.1, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, container_name=iscsid, version=17.1.12, url=https://www.redhat.com, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '182e509007ab5e6e5b2500a552cbd5ba'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, konflux.additional-tags=17.1.12 17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 iscsid, tcib_managed=true)
Dec 15 03:15:56 localhost systemd[1]: 2649c3aec9af2f04509e1d248b1050b202fccc3ed2b86421c89a2a65376db057.service: Deactivated successfully.
Dec 15 03:15:56 localhost ansible-async_wrapper.py[67955]: 67956 still running (3600)
Dec 15 03:16:01 localhost ansible-async_wrapper.py[67955]: 67956 still running (3595)
Dec 15 03:16:02 localhost python3[68216]: ansible-ansible.legacy.async_status Invoked with jid=748410551285.67952 mode=status _async_dir=/tmp/.ansible_async
Dec 15 03:16:03 localhost systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Dec 15 03:16:03 localhost systemd[1]: Starting man-db-cache-update.service...
Dec 15 03:16:03 localhost systemd[1]: Reloading.
Dec 15 03:16:03 localhost systemd-sysv-generator[68293]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 15 03:16:03 localhost systemd-rc-local-generator[68289]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 15 03:16:03 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Dec 15 03:16:03 localhost systemd[1]: Queuing reload/restart jobs for marked units…
Dec 15 03:16:04 localhost systemd[1]: man-db-cache-update.service: Deactivated successfully.
Dec 15 03:16:04 localhost systemd[1]: Finished man-db-cache-update.service.
Dec 15 03:16:04 localhost systemd[1]: man-db-cache-update.service: Consumed 1.207s CPU time.
Dec 15 03:16:04 localhost systemd[1]: run-red7a4eef15bc44dba951fe1b9bbde39e.service: Deactivated successfully.
Dec 15 03:16:05 localhost puppet-user[67975]: Notice: /Stage[main]/Snmp/Package[snmpd]/ensure: created
Dec 15 03:16:05 localhost puppet-user[67975]: Notice: /Stage[main]/Snmp/File[snmpd.conf]/content: content changed '{sha256}2b743f970e80e2150759bfc66f2d8d0fbd8b31624f79e2991248d1a5ac57494e' to '{sha256}069ae0f986730bdbaceaf31830c661f9d2c06e5f60527d08e9651f0a2b8381f4'
Dec 15 03:16:05 localhost puppet-user[67975]: Notice: /Stage[main]/Snmp/File[snmpd.sysconfig]/content: content changed '{sha256}b63afb2dee7419b6834471f88581d981c8ae5c8b27b9d329ba67a02f3ddd8221' to '{sha256}3917ee8bbc680ad50d77186ad4a1d2705c2025c32fc32f823abbda7f2328dfbd'
Dec 15 03:16:05 localhost puppet-user[67975]: Notice: /Stage[main]/Snmp/File[snmptrapd.conf]/content: content changed '{sha256}2e1ca894d609ef337b6243909bf5623c87fd5df98ecbd00c7d4c12cf12f03c4e' to '{sha256}3ecf18da1ba84ea3932607f2b903ee6a038b6f9ac4e1e371e48f3ef61c5052ea'
Dec 15 03:16:05 localhost puppet-user[67975]: Notice: /Stage[main]/Snmp/File[snmptrapd.sysconfig]/content: content changed '{sha256}86ee5797ad10cb1ea0f631e9dfa6ae278ecf4f4d16f4c80f831cdde45601b23c' to '{sha256}2244553364afcca151958f8e2003e4c182f5e2ecfbe55405cec73fd818581e97'
Dec 15 03:16:05 localhost puppet-user[67975]: Notice: /Stage[main]/Snmp/Service[snmptrapd]: Triggered 'refresh' from 2 events
Dec 15 03:16:06 localhost ansible-async_wrapper.py[67955]: 67956 still running (3590)
Dec 15 03:16:10 localhost puppet-user[67975]: Notice: /Stage[main]/Tripleo::Profile::Base::Snmp/Snmp::Snmpv3_user[ro_snmp_user]/Exec[create-snmpv3-user-ro_snmp_user]/returns: executed successfully
Dec 15 03:16:10 localhost systemd[1]: Started /usr/bin/podman healthcheck run 6278d648aec9c9f12e0efaafa0d6ae5653eeb34459516699e562eeb9c8d23b3c.
Dec 15 03:16:10 localhost systemd[1]: Reloading.
Dec 15 03:16:10 localhost podman[69319]: 2025-12-15 08:16:10.452195474 +0000 UTC m=+0.096515371 container health_status 6278d648aec9c9f12e0efaafa0d6ae5653eeb34459516699e562eeb9c8d23b3c (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, release=1761123044, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '29d07da103bbd44a9ed3e29999314b03'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, description=Red Hat OpenStack Platform 17.1 qdrouterd, build-date=2025-11-18T22:49:46Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, url=https://www.redhat.com, vendor=Red Hat, Inc., konflux.additional-tags=17.1.12 17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 qdrouterd, container_name=metrics_qdr, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, batch=17.1_20251118.1, version=17.1.12, 
name=rhosp17/openstack-qdrouterd, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, architecture=x86_64, config_id=tripleo_step1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.expose-services=, com.redhat.component=openstack-qdrouterd-container, io.buildah.version=1.41.4, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vcs-type=git, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd) Dec 15 03:16:10 localhost systemd-rc-local-generator[69371]: /etc/rc.d/rc.local is not marked executable, skipping. Dec 15 03:16:10 localhost systemd-sysv-generator[69375]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. Dec 15 03:16:10 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. 
Dec 15 03:16:10 localhost podman[69319]: 2025-12-15 08:16:10.646848037 +0000 UTC m=+0.291167974 container exec_died 6278d648aec9c9f12e0efaafa0d6ae5653eeb34459516699e562eeb9c8d23b3c (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '29d07da103bbd44a9ed3e29999314b03'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, description=Red Hat OpenStack Platform 17.1 qdrouterd, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, config_id=tripleo_step1, io.openshift.expose-services=, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, container_name=metrics_qdr, com.redhat.component=openstack-qdrouterd-container, name=rhosp17/openstack-qdrouterd, vendor=Red Hat, Inc., build-date=2025-11-18T22:49:46Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, maintainer=OpenStack TripleO Team, 
managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, architecture=x86_64, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, io.buildah.version=1.41.4, batch=17.1_20251118.1, release=1761123044, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 qdrouterd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, url=https://www.redhat.com, version=17.1.12) Dec 15 03:16:10 localhost systemd[1]: 6278d648aec9c9f12e0efaafa0d6ae5653eeb34459516699e562eeb9c8d23b3c.service: Deactivated successfully. Dec 15 03:16:10 localhost systemd[1]: Starting Simple Network Management Protocol (SNMP) Daemon.... Dec 15 03:16:10 localhost snmpd[69387]: Can't find directory of RPM packages Dec 15 03:16:10 localhost snmpd[69387]: Duplicate IPv4 address detected, some interfaces may not be visible in IP-MIB Dec 15 03:16:10 localhost systemd[1]: Started Simple Network Management Protocol (SNMP) Daemon.. Dec 15 03:16:10 localhost systemd[1]: Reloading. Dec 15 03:16:11 localhost systemd-sysv-generator[69418]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. Dec 15 03:16:11 localhost systemd-rc-local-generator[69412]: /etc/rc.d/rc.local is not marked executable, skipping. Dec 15 03:16:11 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Dec 15 03:16:11 localhost systemd[1]: Reloading. Dec 15 03:16:11 localhost systemd-sysv-generator[69452]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. 
Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 15 03:16:11 localhost systemd-rc-local-generator[69448]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 15 03:16:11 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Dec 15 03:16:11 localhost puppet-user[67975]: Notice: /Stage[main]/Snmp/Service[snmpd]/ensure: ensure changed 'stopped' to 'running'
Dec 15 03:16:11 localhost puppet-user[67975]: Notice: Applied catalog in 15.81 seconds
Dec 15 03:16:11 localhost puppet-user[67975]: Application:
Dec 15 03:16:11 localhost puppet-user[67975]: Initial environment: production
Dec 15 03:16:11 localhost puppet-user[67975]: Converged environment: production
Dec 15 03:16:11 localhost puppet-user[67975]: Run mode: user
Dec 15 03:16:11 localhost puppet-user[67975]: Changes:
Dec 15 03:16:11 localhost puppet-user[67975]: Total: 8
Dec 15 03:16:11 localhost puppet-user[67975]: Events:
Dec 15 03:16:11 localhost puppet-user[67975]: Success: 8
Dec 15 03:16:11 localhost puppet-user[67975]: Total: 8
Dec 15 03:16:11 localhost puppet-user[67975]: Resources:
Dec 15 03:16:11 localhost puppet-user[67975]: Restarted: 1
Dec 15 03:16:11 localhost puppet-user[67975]: Changed: 8
Dec 15 03:16:11 localhost puppet-user[67975]: Out of sync: 8
Dec 15 03:16:11 localhost puppet-user[67975]: Total: 19
Dec 15 03:16:11 localhost puppet-user[67975]: Time:
Dec 15 03:16:11 localhost puppet-user[67975]: Schedule: 0.00
Dec 15 03:16:11 localhost puppet-user[67975]: Augeas: 0.01
Dec 15 03:16:11 localhost puppet-user[67975]: File: 0.06
Dec 15 03:16:11 localhost puppet-user[67975]: Config retrieval: 0.27
Dec 15 03:16:11 localhost puppet-user[67975]: Service: 1.30
Dec 15 03:16:11 localhost puppet-user[67975]: Transaction evaluation: 15.80
Dec 15 03:16:11 localhost puppet-user[67975]: Catalog application: 15.81
Dec 15 03:16:11 localhost puppet-user[67975]: Last run: 1765786571
Dec 15 03:16:11 localhost puppet-user[67975]: Exec: 5.07
Dec 15 03:16:11 localhost puppet-user[67975]: Filebucket: 0.00
Dec 15 03:16:11 localhost puppet-user[67975]: Package: 9.23
Dec 15 03:16:11 localhost puppet-user[67975]: Total: 15.81
Dec 15 03:16:11 localhost puppet-user[67975]: Version:
Dec 15 03:16:11 localhost puppet-user[67975]: Config: 1765786555
Dec 15 03:16:11 localhost puppet-user[67975]: Puppet: 7.10.0
Dec 15 03:16:11 localhost ansible-async_wrapper.py[67956]: Module complete (67956)
Dec 15 03:16:11 localhost ansible-async_wrapper.py[67955]: Done in kid B.
Dec 15 03:16:12 localhost python3[69476]: ansible-ansible.legacy.async_status Invoked with jid=748410551285.67952 mode=status _async_dir=/tmp/.ansible_async
Dec 15 03:16:13 localhost python3[69492]: ansible-file Invoked with path=/var/lib/container-puppet/puppetlabs state=directory setype=svirt_sandbox_file_t selevel=s0 recurse=True force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None attributes=None
Dec 15 03:16:14 localhost python3[69508]: ansible-stat Invoked with path=/var/lib/container-puppet/puppetlabs/facter.conf follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Dec 15 03:16:14 localhost python3[69558]: ansible-ansible.legacy.stat Invoked with path=/var/lib/container-puppet/puppetlabs/facter.conf follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Dec 15 03:16:14 localhost python3[69576]: ansible-ansible.legacy.file Invoked with setype=svirt_sandbox_file_t selevel=s0 dest=/var/lib/container-puppet/puppetlabs/facter.conf _original_basename=tmpfhp0ejj_ recurse=False state=file
path=/var/lib/container-puppet/puppetlabs/facter.conf force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None attributes=None
Dec 15 03:16:15 localhost python3[69606]: ansible-file Invoked with path=/opt/puppetlabs/facter state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 15 03:16:16 localhost python3[69710]: ansible-ansible.posix.synchronize Invoked with src=/opt/puppetlabs/ dest=/var/lib/container-puppet/puppetlabs/ _local_rsync_path=rsync _local_rsync_password=NOT_LOGGING_PARAMETER rsync_path=None delete=False _substitute_controller=False archive=True checksum=False compress=True existing_only=False dirs=False copy_links=False set_remote_user=True rsync_timeout=0 rsync_opts=[] ssh_connection_multiplexing=False partial=False verify_host=False mode=push dest_port=None private_key=None recursive=None links=None perms=None times=None owner=None group=None ssh_args=None link_dest=None
Dec 15 03:16:17 localhost python3[69729]: ansible-file Invoked with path=/var/log/containers/stdouts state=directory owner=root group=root recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 15 03:16:17 localhost python3[69761]: ansible-stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Dec 15 03:16:18 localhost python3[69811]: ansible-ansible.legacy.stat Invoked with path=/usr/libexec/tripleo-container-shutdown follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Dec 15 03:16:18 localhost python3[69829]: ansible-ansible.legacy.file Invoked with mode=0700 owner=root group=root dest=/usr/libexec/tripleo-container-shutdown _original_basename=tripleo-container-shutdown recurse=False state=file path=/usr/libexec/tripleo-container-shutdown force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 15 03:16:19 localhost python3[69891]: ansible-ansible.legacy.stat Invoked with path=/usr/libexec/tripleo-start-podman-container follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Dec 15 03:16:19 localhost python3[69909]: ansible-ansible.legacy.file Invoked with mode=0700 owner=root group=root dest=/usr/libexec/tripleo-start-podman-container _original_basename=tripleo-start-podman-container recurse=False state=file path=/usr/libexec/tripleo-start-podman-container force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 15 03:16:20 localhost python3[69971]: ansible-ansible.legacy.stat Invoked with path=/usr/lib/systemd/system/tripleo-container-shutdown.service follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Dec 15 03:16:20 localhost python3[69989]: ansible-ansible.legacy.file Invoked with mode=0644 owner=root group=root dest=/usr/lib/systemd/system/tripleo-container-shutdown.service _original_basename=tripleo-container-shutdown-service recurse=False state=file path=/usr/lib/systemd/system/tripleo-container-shutdown.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 15 03:16:20 localhost python3[70051]: ansible-ansible.legacy.stat Invoked with path=/usr/lib/systemd/system-preset/91-tripleo-container-shutdown.preset follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Dec 15 03:16:21 localhost python3[70069]: ansible-ansible.legacy.file Invoked with mode=0644 owner=root group=root dest=/usr/lib/systemd/system-preset/91-tripleo-container-shutdown.preset _original_basename=91-tripleo-container-shutdown-preset recurse=False state=file path=/usr/lib/systemd/system-preset/91-tripleo-container-shutdown.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 15 03:16:21 localhost python3[70099]: ansible-systemd Invoked with name=tripleo-container-shutdown state=started enabled=True daemon_reload=True daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 15 03:16:21 localhost systemd[1]: Reloading.
Dec 15 03:16:21 localhost systemd-rc-local-generator[70121]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 15 03:16:21 localhost systemd-sysv-generator[70125]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 15 03:16:21 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Dec 15 03:16:22 localhost python3[70184]: ansible-ansible.legacy.stat Invoked with path=/usr/lib/systemd/system/netns-placeholder.service follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Dec 15 03:16:22 localhost python3[70202]: ansible-ansible.legacy.file Invoked with mode=0644 owner=root group=root dest=/usr/lib/systemd/system/netns-placeholder.service _original_basename=netns-placeholder-service recurse=False state=file path=/usr/lib/systemd/system/netns-placeholder.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 15 03:16:23 localhost python3[70264]: ansible-ansible.legacy.stat Invoked with path=/usr/lib/systemd/system-preset/91-netns-placeholder.preset follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Dec 15 03:16:23 localhost python3[70282]: ansible-ansible.legacy.file Invoked with mode=0644 owner=root group=root dest=/usr/lib/systemd/system-preset/91-netns-placeholder.preset _original_basename=91-netns-placeholder-preset recurse=False state=file path=/usr/lib/systemd/system-preset/91-netns-placeholder.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 15 03:16:24 localhost python3[70312]: ansible-systemd Invoked with name=netns-placeholder state=started enabled=True daemon_reload=True daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 15 03:16:24 localhost systemd[1]: Reloading.
Dec 15 03:16:24 localhost systemd-sysv-generator[70340]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 15 03:16:24 localhost systemd-rc-local-generator[70335]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 15 03:16:24 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Dec 15 03:16:24 localhost systemd[1]: Starting Create netns directory...
Dec 15 03:16:24 localhost systemd[1]: run-netns-placeholder.mount: Deactivated successfully.
Dec 15 03:16:24 localhost systemd[1]: netns-placeholder.service: Deactivated successfully.
Dec 15 03:16:24 localhost systemd[1]: Finished Create netns directory.
Dec 15 03:16:24 localhost systemd[1]: Started /usr/bin/podman healthcheck run 165a6fd1f2b629d16da0798ec1c9bac41d302d530a0ffa668b2315a52d6aa04e.
Dec 15 03:16:25 localhost systemd[1]: tmp-crun.QGGUFh.mount: Deactivated successfully.
Dec 15 03:16:25 localhost podman[70370]: 2025-12-15 08:16:25.104667092 +0000 UTC m=+0.103948016 container health_status 165a6fd1f2b629d16da0798ec1c9bac41d302d530a0ffa668b2315a52d6aa04e (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, distribution-scope=public, batch=17.1_20251118.1, build-date=2025-11-18T22:51:28Z, managed_by=tripleo_ansible, vcs-type=git, description=Red Hat OpenStack Platform 17.1 collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.12, io.buildah.version=1.41.4, release=1761123044, summary=Red Hat OpenStack Platform 17.1 collectd, vendor=Red Hat, Inc., name=rhosp17/openstack-collectd, maintainer=OpenStack TripleO Team, config_id=tripleo_step3, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, com.redhat.component=openstack-collectd-container, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, container_name=collectd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, url=https://www.redhat.com, architecture=x86_64, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, io.openshift.expose-services=) Dec 15 03:16:25 localhost podman[70370]: 2025-12-15 08:16:25.118212639 +0000 UTC m=+0.117493563 container exec_died 165a6fd1f2b629d16da0798ec1c9bac41d302d530a0ffa668b2315a52d6aa04e (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, name=rhosp17/openstack-collectd, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 collectd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, batch=17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, com.redhat.component=openstack-collectd-container, konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible, build-date=2025-11-18T22:51:28Z, url=https://www.redhat.com, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 collectd, io.buildah.version=1.41.4, vendor=Red Hat, Inc., tcib_managed=true, maintainer=OpenStack TripleO Team, release=1761123044, version=17.1.12, distribution-scope=public, config_id=tripleo_step3, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, container_name=collectd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, architecture=x86_64) Dec 15 03:16:25 localhost systemd[1]: 165a6fd1f2b629d16da0798ec1c9bac41d302d530a0ffa668b2315a52d6aa04e.service: Deactivated successfully. 
Dec 15 03:16:25 localhost python3[70369]: ansible-container_puppet_config Invoked with update_config_hash_only=True no_archive=True check_mode=False config_vol_prefix=/var/lib/config-data debug=False net_host=True puppet_config= short_hostname= step=6 Dec 15 03:16:27 localhost python3[70450]: ansible-tripleo_container_manage Invoked with config_id=tripleo_step4 config_dir=/var/lib/tripleo-config/container-startup-config/step_4 config_patterns=*.json config_overrides={} concurrency=5 log_base_path=/var/log/containers/stdouts debug=False Dec 15 03:16:27 localhost systemd[1]: Started /usr/bin/podman healthcheck run 2649c3aec9af2f04509e1d248b1050b202fccc3ed2b86421c89a2a65376db057. Dec 15 03:16:27 localhost systemd[1]: tmp-crun.h9FIoB.mount: Deactivated successfully. Dec 15 03:16:27 localhost podman[70471]: 2025-12-15 08:16:27.206842849 +0000 UTC m=+0.093017129 container health_status 2649c3aec9af2f04509e1d248b1050b202fccc3ed2b86421c89a2a65376db057 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, distribution-scope=public, vendor=Red Hat, Inc., config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '182e509007ab5e6e5b2500a552cbd5ba'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', 
'/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, container_name=iscsid, maintainer=OpenStack TripleO Team, batch=17.1_20251118.1, name=rhosp17/openstack-iscsid, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, config_id=tripleo_step3, release=1761123044, architecture=x86_64, vcs-type=git, description=Red Hat OpenStack Platform 17.1 iscsid, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, tcib_managed=true, version=17.1.12, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, build-date=2025-11-18T23:44:13Z, com.redhat.component=openstack-iscsid-container, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d) Dec 15 03:16:27 localhost podman[70471]: 2025-12-15 08:16:27.220333124 +0000 UTC m=+0.106507424 container exec_died 2649c3aec9af2f04509e1d248b1050b202fccc3ed2b86421c89a2a65376db057 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, vendor=Red Hat, Inc., io.buildah.version=1.41.4, version=17.1.12, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, architecture=x86_64, distribution-scope=public, 
cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, managed_by=tripleo_ansible, build-date=2025-11-18T23:44:13Z, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '182e509007ab5e6e5b2500a552cbd5ba'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, config_id=tripleo_step3, com.redhat.component=openstack-iscsid-container, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, release=1761123044, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, summary=Red Hat OpenStack Platform 17.1 iscsid, container_name=iscsid, maintainer=OpenStack TripleO Team, batch=17.1_20251118.1, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, name=rhosp17/openstack-iscsid, description=Red Hat OpenStack Platform 17.1 iscsid, tcib_managed=true, vcs-type=git, url=https://www.redhat.com) Dec 15 03:16:27 localhost systemd[1]: 
2649c3aec9af2f04509e1d248b1050b202fccc3ed2b86421c89a2a65376db057.service: Deactivated successfully. Dec 15 03:16:27 localhost podman[70607]: 2025-12-15 08:16:27.436938554 +0000 UTC m=+0.113259331 container create 3698017634ba8d0e94860da3e45c9dd2f90f13373203a4c53126c29a97271b70 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_libvirt_init_secret, release=1761123044, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, batch=17.1_20251118.1, container_name=nova_libvirt_init_secret, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, config_id=tripleo_step4, tcib_managed=true, vendor=Red Hat, Inc., com.redhat.component=openstack-nova-libvirt-container, config_data={'cgroupns': 'host', 'command': '/nova_libvirt_init_secret.sh ceph:openstack', 'detach': False, 'environment': {'LIBVIRT_DEFAULT_URI': 'qemu:///system', 'TRIPLEO_CONFIG_HASH': '879500e96bf8dfb93687004bd86f2317'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'privileged': False, 'security_opt': ['label=disable'], 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/config-data/puppet-generated/nova_libvirt/etc/nova:/etc/nova', '/etc/libvirt:/etc/libvirt', 
'/run/libvirt:/run/libvirt:shared,z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/lib/container-config-scripts/nova_libvirt_init_secret.sh:/nova_libvirt_init_secret.sh:ro', '/var/lib/tripleo-config/ceph:/etc/ceph:ro']}, distribution-scope=public, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, version=17.1.12, build-date=2025-11-19T00:35:22Z, managed_by=tripleo_ansible, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, konflux.additional-tags=17.1.12 17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 nova-libvirt, io.buildah.version=1.41.4, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, name=rhosp17/openstack-nova-libvirt, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream) Dec 15 03:16:27 localhost podman[70627]: 2025-12-15 08:16:27.466601606 +0000 UTC m=+0.121344475 container create ac45612fab357b615b4684dbfa5bd000dd29eb1adfbaedb6db0802fc49e89549 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, description=Red Hat OpenStack Platform 17.1 cron, konflux.additional-tags=17.1.12 17.1_20251118.1, distribution-scope=public, version=17.1.12, io.buildah.version=1.41.4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, batch=17.1_20251118.1, release=1761123044, build-date=2025-11-18T22:49:32Z, container_name=logrotate_crond, managed_by=tripleo_ansible, name=rhosp17/openstack-cron, tcib_managed=true, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, summary=Red Hat OpenStack Platform 17.1 cron, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, config_id=tripleo_step4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-cron-container, url=https://www.redhat.com, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, architecture=x86_64, io.openshift.expose-services=) Dec 15 03:16:27 localhost podman[70607]: 2025-12-15 08:16:27.380052898 +0000 UTC m=+0.056373705 image pull registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1 Dec 15 03:16:27 localhost podman[70630]: 2025-12-15 08:16:27.492113427 +0000 UTC m=+0.132952590 container create 97115ff7c5a84ae7e17f79e696e94da3c63f9fe0051287fe9193bec282a03f99 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, io.buildah.version=1.41.4, vendor=Red Hat, Inc., vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-ceilometer-ipmi, 
tcib_managed=true, release=1761123044, managed_by=tripleo_ansible, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, version=17.1.12, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'fcee5a4a91f85471fca7b61211375646'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, distribution-scope=public, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, build-date=2025-11-19T00:12:45Z, url=https://www.redhat.com, config_id=tripleo_step4, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, io.openshift.expose-services=, batch=17.1_20251118.1, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.component=openstack-ceilometer-ipmi-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, container_name=ceilometer_agent_ipmi, architecture=x86_64, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05) Dec 15 03:16:27 localhost systemd[1]: Started libpod-conmon-3698017634ba8d0e94860da3e45c9dd2f90f13373203a4c53126c29a97271b70.scope. Dec 15 03:16:27 localhost podman[70627]: 2025-12-15 08:16:27.398622467 +0000 UTC m=+0.053365426 image pull registry.redhat.io/rhosp-rhel9/openstack-cron:17.1 Dec 15 03:16:27 localhost systemd[1]: Started libcrun container. Dec 15 03:16:27 localhost systemd[1]: Started libpod-conmon-ac45612fab357b615b4684dbfa5bd000dd29eb1adfbaedb6db0802fc49e89549.scope. Dec 15 03:16:27 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/ba4f6a51fa4a8ce51a3b8b83f42d30ba19f6e65ec47784832d65f45914b624c3/merged/etc/nova supports timestamps until 2038 (0x7fffffff) Dec 15 03:16:27 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/ba4f6a51fa4a8ce51a3b8b83f42d30ba19f6e65ec47784832d65f45914b624c3/merged/etc/libvirt supports timestamps until 2038 (0x7fffffff) Dec 15 03:16:27 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/ba4f6a51fa4a8ce51a3b8b83f42d30ba19f6e65ec47784832d65f45914b624c3/merged/var/lib/libvirt supports timestamps until 2038 (0x7fffffff) Dec 15 03:16:27 localhost podman[70675]: 2025-12-15 08:16:27.512288648 +0000 UTC m=+0.101757689 container create 8e2498e738e9cef942e7802bbcc12ec46a5fc25647d21ec915ced2edb246ff5c (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=configure_cms_options, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, architecture=x86_64, batch=17.1_20251118.1, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, config_data={'command': ['/bin/bash', '-c', 'CMS_OPTS=$(hiera ovn::controller::ovn_cms_options -c /etc/puppet/hiera.yaml); if [ 
X"$CMS_OPTS" != X ]; then ovs-vsctl set open . external_ids:ovn-cms-options=$CMS_OPTS;else ovs-vsctl remove open . external_ids ovn-cms-options; fi'], 'detach': False, 'environment': {'TRIPLEO_DEPLOY_IDENTIFIER': '1765784752'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'start_order': 0, 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z']}, version=17.1.12, config_id=tripleo_step4, tcib_managed=true, container_name=configure_cms_options, name=rhosp17/openstack-ovn-controller, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, io.buildah.version=1.41.4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, vendor=Red Hat, Inc., build-date=2025-11-18T23:34:05Z, description=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.component=openstack-ovn-controller-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, managed_by=tripleo_ansible, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 ovn-controller, release=1761123044) Dec 15 03:16:27 localhost systemd[1]: Started libcrun container. 
Dec 15 03:16:27 localhost podman[70630]: 2025-12-15 08:16:27.421090518 +0000 UTC m=+0.061929681 image pull registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1 Dec 15 03:16:27 localhost systemd[1]: Started libpod-conmon-97115ff7c5a84ae7e17f79e696e94da3c63f9fe0051287fe9193bec282a03f99.scope. Dec 15 03:16:27 localhost podman[70607]: 2025-12-15 08:16:27.524361416 +0000 UTC m=+0.200682193 container init 3698017634ba8d0e94860da3e45c9dd2f90f13373203a4c53126c29a97271b70 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_libvirt_init_secret, io.buildah.version=1.41.4, batch=17.1_20251118.1, distribution-scope=public, managed_by=tripleo_ansible, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, config_data={'cgroupns': 'host', 'command': '/nova_libvirt_init_secret.sh ceph:openstack', 'detach': False, 'environment': {'LIBVIRT_DEFAULT_URI': 'qemu:///system', 'TRIPLEO_CONFIG_HASH': '879500e96bf8dfb93687004bd86f2317'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'privileged': False, 'security_opt': ['label=disable'], 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/config-data/puppet-generated/nova_libvirt/etc/nova:/etc/nova', '/etc/libvirt:/etc/libvirt', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/lib/container-config-scripts/nova_libvirt_init_secret.sh:/nova_libvirt_init_secret.sh:ro', 
'/var/lib/tripleo-config/ceph:/etc/ceph:ro']}, io.openshift.expose-services=, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, description=Red Hat OpenStack Platform 17.1 nova-libvirt, config_id=tripleo_step4, release=1761123044, konflux.additional-tags=17.1.12 17.1_20251118.1, vendor=Red Hat, Inc., version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.component=openstack-nova-libvirt-container, container_name=nova_libvirt_init_secret, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, maintainer=OpenStack TripleO Team, build-date=2025-11-19T00:35:22Z, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, name=rhosp17/openstack-nova-libvirt, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 nova-libvirt) Dec 15 03:16:27 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/7b239ddaa407a2a969ac6b39fe52abf4714e7608ccfdf4f5f9e0e55a4123ba70/merged/var/log/containers supports timestamps until 2038 (0x7fffffff) Dec 15 03:16:27 localhost podman[70607]: 2025-12-15 08:16:27.532097019 +0000 UTC m=+0.208417796 container start 3698017634ba8d0e94860da3e45c9dd2f90f13373203a4c53126c29a97271b70 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_libvirt_init_secret, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, architecture=x86_64, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, batch=17.1_20251118.1, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, managed_by=tripleo_ansible, config_data={'cgroupns': 'host', 'command': 
'/nova_libvirt_init_secret.sh ceph:openstack', 'detach': False, 'environment': {'LIBVIRT_DEFAULT_URI': 'qemu:///system', 'TRIPLEO_CONFIG_HASH': '879500e96bf8dfb93687004bd86f2317'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'privileged': False, 'security_opt': ['label=disable'], 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/config-data/puppet-generated/nova_libvirt/etc/nova:/etc/nova', '/etc/libvirt:/etc/libvirt', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/lib/container-config-scripts/nova_libvirt_init_secret.sh:/nova_libvirt_init_secret.sh:ro', '/var/lib/tripleo-config/ceph:/etc/ceph:ro']}, description=Red Hat OpenStack Platform 17.1 nova-libvirt, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, release=1761123044, name=rhosp17/openstack-nova-libvirt, com.redhat.component=openstack-nova-libvirt-container, io.buildah.version=1.41.4, build-date=2025-11-19T00:35:22Z, distribution-scope=public, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, version=17.1.12, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, config_id=tripleo_step4, container_name=nova_libvirt_init_secret, 
url=https://www.redhat.com) Dec 15 03:16:27 localhost podman[70607]: 2025-12-15 08:16:27.532271314 +0000 UTC m=+0.208592091 container attach 3698017634ba8d0e94860da3e45c9dd2f90f13373203a4c53126c29a97271b70 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_libvirt_init_secret, batch=17.1_20251118.1, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, container_name=nova_libvirt_init_secret, name=rhosp17/openstack-nova-libvirt, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, description=Red Hat OpenStack Platform 17.1 nova-libvirt, version=17.1.12, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, url=https://www.redhat.com, vcs-type=git, architecture=x86_64, maintainer=OpenStack TripleO Team, distribution-scope=public, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, vendor=Red Hat, Inc., konflux.additional-tags=17.1.12 17.1_20251118.1, release=1761123044, com.redhat.component=openstack-nova-libvirt-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, io.buildah.version=1.41.4, config_data={'cgroupns': 'host', 'command': '/nova_libvirt_init_secret.sh ceph:openstack', 'detach': False, 'environment': {'LIBVIRT_DEFAULT_URI': 'qemu:///system', 'TRIPLEO_CONFIG_HASH': '879500e96bf8dfb93687004bd86f2317'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'privileged': False, 'security_opt': ['label=disable'], 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/config-data/puppet-generated/nova_libvirt/etc/nova:/etc/nova', '/etc/libvirt:/etc/libvirt', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/lib/container-config-scripts/nova_libvirt_init_secret.sh:/nova_libvirt_init_secret.sh:ro', '/var/lib/tripleo-config/ceph:/etc/ceph:ro']}, build-date=2025-11-19T00:35:22Z, io.openshift.expose-services=, config_id=tripleo_step4, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt) Dec 15 03:16:27 localhost systemd[1]: Started libcrun container. Dec 15 03:16:27 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/dcb3adb8ad1cc413fefbcfb3f7ed9e47d52e4bf4c16860b1049f35510ac414e2/merged/var/log/ceilometer supports timestamps until 2038 (0x7fffffff) Dec 15 03:16:27 localhost podman[70675]: 2025-12-15 08:16:27.457878326 +0000 UTC m=+0.047347387 image pull registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1 Dec 15 03:16:27 localhost systemd[1]: Started libpod-conmon-8e2498e738e9cef942e7802bbcc12ec46a5fc25647d21ec915ced2edb246ff5c.scope. 
Dec 15 03:16:27 localhost podman[70674]: 2025-12-15 08:16:27.576471537 +0000 UTC m=+0.171750361 container create d6041e5471624a495d5be42684f6049ccf71802a80d5760239330151d465d146 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-type=git, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, architecture=x86_64, batch=17.1_20251118.1, config_id=tripleo_step4, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'fcee5a4a91f85471fca7b61211375646'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.openshift.expose-services=, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, managed_by=tripleo_ansible, release=1761123044, version=17.1.12, distribution-scope=public, build-date=2025-11-19T00:11:48Z, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, name=rhosp17/openstack-ceilometer-compute, vendor=Red 
Hat, Inc., io.buildah.version=1.41.4, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, com.redhat.component=openstack-ceilometer-compute-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, container_name=ceilometer_agent_compute) Dec 15 03:16:27 localhost systemd[1]: Started libcrun container. Dec 15 03:16:27 localhost systemd[1]: Started /usr/bin/podman healthcheck run ac45612fab357b615b4684dbfa5bd000dd29eb1adfbaedb6db0802fc49e89549. Dec 15 03:16:27 localhost podman[70627]: 2025-12-15 08:16:27.585907996 +0000 UTC m=+0.240650865 container init ac45612fab357b615b4684dbfa5bd000dd29eb1adfbaedb6db0802fc49e89549 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, version=17.1.12, com.redhat.component=openstack-cron-container, description=Red Hat OpenStack Platform 17.1 cron, config_id=tripleo_step4, url=https://www.redhat.com, build-date=2025-11-18T22:49:32Z, container_name=logrotate_crond, release=1761123044, name=rhosp17/openstack-cron, konflux.additional-tags=17.1.12 17.1_20251118.1, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, vendor=Red Hat, Inc., tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': 
True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, batch=17.1_20251118.1, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, summary=Red Hat OpenStack Platform 17.1 cron, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, managed_by=tripleo_ansible, distribution-scope=public, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, vcs-type=git) Dec 15 03:16:27 localhost systemd[1]: Started /usr/bin/podman healthcheck run 97115ff7c5a84ae7e17f79e696e94da3c63f9fe0051287fe9193bec282a03f99. 
Dec 15 03:16:27 localhost podman[70675]: 2025-12-15 08:16:27.602752479 +0000 UTC m=+0.192221530 container init 8e2498e738e9cef942e7802bbcc12ec46a5fc25647d21ec915ced2edb246ff5c (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=configure_cms_options, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, vendor=Red Hat, Inc., tcib_managed=true, name=rhosp17/openstack-ovn-controller, release=1761123044, url=https://www.redhat.com, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, io.buildah.version=1.41.4, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-ovn-controller-container, description=Red Hat OpenStack Platform 17.1 ovn-controller, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, build-date=2025-11-18T23:34:05Z, config_id=tripleo_step4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, batch=17.1_20251118.1, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, io.openshift.expose-services=, version=17.1.12, summary=Red Hat OpenStack Platform 17.1 ovn-controller, container_name=configure_cms_options, managed_by=tripleo_ansible, config_data={'command': ['/bin/bash', '-c', 'CMS_OPTS=$(hiera ovn::controller::ovn_cms_options -c /etc/puppet/hiera.yaml); if [ X"$CMS_OPTS" != X ]; then ovs-vsctl set open . external_ids:ovn-cms-options=$CMS_OPTS;else ovs-vsctl remove open . 
external_ids ovn-cms-options; fi'], 'detach': False, 'environment': {'TRIPLEO_DEPLOY_IDENTIFIER': '1765784752'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'start_order': 0, 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z']}, distribution-scope=public) Dec 15 03:16:27 localhost podman[70630]: 2025-12-15 08:16:27.602672546 +0000 UTC m=+0.243511719 container init 97115ff7c5a84ae7e17f79e696e94da3c63f9fe0051287fe9193bec282a03f99 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, name=rhosp17/openstack-ceilometer-ipmi, url=https://www.redhat.com, config_id=tripleo_step4, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.component=openstack-ceilometer-ipmi-container, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'fcee5a4a91f85471fca7b61211375646'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.expose-services=, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, vendor=Red Hat, Inc., release=1761123044, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, distribution-scope=public, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, batch=17.1_20251118.1, version=17.1.12, io.buildah.version=1.41.4, vcs-type=git, managed_by=tripleo_ansible, container_name=ceilometer_agent_ipmi, tcib_managed=true, build-date=2025-11-19T00:12:45Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi) Dec 15 03:16:27 localhost systemd[1]: Started /usr/bin/podman healthcheck run ac45612fab357b615b4684dbfa5bd000dd29eb1adfbaedb6db0802fc49e89549. 
Dec 15 03:16:27 localhost podman[70675]: 2025-12-15 08:16:27.615115914 +0000 UTC m=+0.204584955 container start 8e2498e738e9cef942e7802bbcc12ec46a5fc25647d21ec915ced2edb246ff5c (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=configure_cms_options, managed_by=tripleo_ansible, tcib_managed=true, io.buildah.version=1.41.4, description=Red Hat OpenStack Platform 17.1 ovn-controller, config_data={'command': ['/bin/bash', '-c', 'CMS_OPTS=$(hiera ovn::controller::ovn_cms_options -c /etc/puppet/hiera.yaml); if [ X"$CMS_OPTS" != X ]; then ovs-vsctl set open . external_ids:ovn-cms-options=$CMS_OPTS;else ovs-vsctl remove open . external_ids ovn-cms-options; fi'], 'detach': False, 'environment': {'TRIPLEO_DEPLOY_IDENTIFIER': '1765784752'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'start_order': 0, 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, summary=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20251118.1, config_id=tripleo_step4, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.12 17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.description=Red Hat 
OpenStack Platform 17.1 ovn-controller, com.redhat.component=openstack-ovn-controller-container, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, name=rhosp17/openstack-ovn-controller, release=1761123044, url=https://www.redhat.com, build-date=2025-11-18T23:34:05Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, io.openshift.expose-services=, architecture=x86_64, vcs-type=git, vendor=Red Hat, Inc., container_name=configure_cms_options, version=17.1.12, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, distribution-scope=public) Dec 15 03:16:27 localhost podman[70675]: 2025-12-15 08:16:27.615519275 +0000 UTC m=+0.204988356 container attach 8e2498e738e9cef942e7802bbcc12ec46a5fc25647d21ec915ced2edb246ff5c (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=configure_cms_options, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.buildah.version=1.41.4, description=Red Hat OpenStack Platform 17.1 ovn-controller, maintainer=OpenStack TripleO Team, tcib_managed=true, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, summary=Red Hat OpenStack Platform 17.1 ovn-controller, konflux.additional-tags=17.1.12 17.1_20251118.1, batch=17.1_20251118.1, url=https://www.redhat.com, name=rhosp17/openstack-ovn-controller, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, managed_by=tripleo_ansible, com.redhat.component=openstack-ovn-controller-container, config_data={'command': ['/bin/bash', '-c', 'CMS_OPTS=$(hiera ovn::controller::ovn_cms_options -c /etc/puppet/hiera.yaml); if [ X"$CMS_OPTS" != X ]; then ovs-vsctl set open . external_ids:ovn-cms-options=$CMS_OPTS;else ovs-vsctl remove open . 
external_ids ovn-cms-options; fi'], 'detach': False, 'environment': {'TRIPLEO_DEPLOY_IDENTIFIER': '1765784752'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'start_order': 0, 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z']}, vendor=Red Hat, Inc., vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, container_name=configure_cms_options, io.openshift.expose-services=, release=1761123044, build-date=2025-11-18T23:34:05Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, version=17.1.12) Dec 15 03:16:27 localhost podman[70627]: 2025-12-15 08:16:27.622013025 +0000 UTC m=+0.276755894 container start ac45612fab357b615b4684dbfa5bd000dd29eb1adfbaedb6db0802fc49e89549 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, io.buildah.version=1.41.4, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, batch=17.1_20251118.1, release=1761123044, distribution-scope=public, vcs-type=git, version=17.1.12, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, architecture=x86_64, konflux.additional-tags=17.1.12 17.1_20251118.1, url=https://www.redhat.com, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, name=rhosp17/openstack-cron, tcib_managed=true, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, config_id=tripleo_step4, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, description=Red Hat OpenStack Platform 17.1 cron, container_name=logrotate_crond, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-11-18T22:49:32Z, vendor=Red Hat, Inc., com.redhat.component=openstack-cron-container, summary=Red Hat OpenStack Platform 17.1 cron, managed_by=tripleo_ansible) Dec 15 03:16:27 localhost systemd[1]: Started libpod-conmon-d6041e5471624a495d5be42684f6049ccf71802a80d5760239330151d465d146.scope. 
Dec 15 03:16:27 localhost python3[70450]: ansible-tripleo_container_manage PODMAN-CONTAINER-DEBUG: podman run --name logrotate_crond --conmon-pidfile /run/logrotate_crond.pid --detach=True --env KOLLA_CONFIG_STRATEGY=COPY_ALWAYS --env TRIPLEO_CONFIG_HASH=53ed83bb0cae779ff95edb2002262c6f --healthcheck-command /usr/share/openstack-tripleo-common/healthcheck/cron --label config_id=tripleo_step4 --label container_name=logrotate_crond --label managed_by=tripleo_ansible --label config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']} --log-driver k8s-file --log-opt path=/var/log/containers/stdouts/logrotate_crond.log --network none --pid host --privileged=True --user root --volume /etc/hosts:/etc/hosts:ro --volume /etc/localtime:/etc/localtime:ro --volume /etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro --volume /etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro --volume /etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro --volume 
/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro --volume /etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro --volume /dev/log:/dev/log --volume /etc/puppet:/etc/puppet:ro --volume /var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro --volume /var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro --volume /var/log/containers:/var/log/containers:z registry.redhat.io/rhosp-rhel9/openstack-cron:17.1 Dec 15 03:16:27 localhost systemd[1]: Started /usr/bin/podman healthcheck run 97115ff7c5a84ae7e17f79e696e94da3c63f9fe0051287fe9193bec282a03f99. Dec 15 03:16:27 localhost podman[70630]: 2025-12-15 08:16:27.637620267 +0000 UTC m=+0.278459420 container start 97115ff7c5a84ae7e17f79e696e94da3c63f9fe0051287fe9193bec282a03f99 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, distribution-scope=public, container_name=ceilometer_agent_ipmi, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, name=rhosp17/openstack-ceilometer-ipmi, build-date=2025-11-19T00:12:45Z, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, release=1761123044, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, batch=17.1_20251118.1, version=17.1.12, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, architecture=x86_64, vcs-type=git, maintainer=OpenStack TripleO Team, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 
'fcee5a4a91f85471fca7b61211375646'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.expose-services=, com.redhat.component=openstack-ceilometer-ipmi-container, io.buildah.version=1.41.4, tcib_managed=true, config_id=tripleo_step4) Dec 15 03:16:27 localhost podman[70674]: 2025-12-15 08:16:27.538863468 +0000 UTC m=+0.134142312 image pull registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1 Dec 15 03:16:27 localhost python3[70450]: ansible-tripleo_container_manage PODMAN-CONTAINER-DEBUG: podman run --name ceilometer_agent_ipmi --conmon-pidfile /run/ceilometer_agent_ipmi.pid --detach=True --env KOLLA_CONFIG_STRATEGY=COPY_ALWAYS --env TRIPLEO_CONFIG_HASH=fcee5a4a91f85471fca7b61211375646 --healthcheck-command /openstack/healthcheck --label config_id=tripleo_step4 --label container_name=ceilometer_agent_ipmi --label managed_by=tripleo_ansible --label config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'fcee5a4a91f85471fca7b61211375646'}, 'healthcheck': 
{'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']} --log-driver k8s-file --log-opt path=/var/log/containers/stdouts/ceilometer_agent_ipmi.log --network host --privileged=True --volume /etc/hosts:/etc/hosts:ro --volume /etc/localtime:/etc/localtime:ro --volume /etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro --volume /etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro --volume /etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro --volume /etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro --volume /etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro --volume /dev/log:/dev/log --volume /etc/puppet:/etc/puppet:ro --volume /var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro --volume /var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro --volume /var/log/containers/ceilometer:/var/log/ceilometer:z registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1 Dec 15 03:16:27 localhost systemd[1]: Started libcrun container. 
Dec 15 03:16:27 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/22a343801c56d43a44097fa6d3d71cdf7d4806c36e876e53679f0b5dcaee3588/merged/var/log/ceilometer supports timestamps until 2038 (0x7fffffff) Dec 15 03:16:27 localhost systemd[1]: Started /usr/bin/podman healthcheck run d6041e5471624a495d5be42684f6049ccf71802a80d5760239330151d465d146. Dec 15 03:16:27 localhost podman[70674]: 2025-12-15 08:16:27.700967104 +0000 UTC m=+0.296245948 container init d6041e5471624a495d5be42684f6049ccf71802a80d5760239330151d465d146 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, release=1761123044, version=17.1.12, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'fcee5a4a91f85471fca7b61211375646'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', 
'/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, name=rhosp17/openstack-ceilometer-compute, build-date=2025-11-19T00:11:48Z, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, batch=17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, container_name=ceilometer_agent_compute, config_id=tripleo_step4, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.buildah.version=1.41.4, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, konflux.additional-tags=17.1.12 17.1_20251118.1, url=https://www.redhat.com, distribution-scope=public, vendor=Red Hat, Inc., tcib_managed=true, com.redhat.component=openstack-ceilometer-compute-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, managed_by=tripleo_ansible, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, io.openshift.expose-services=) Dec 15 03:16:27 localhost ovs-vsctl[70792]: ovs|00001|vsctl|INFO|Called as ovs-vsctl remove open . external_ids ovn-cms-options Dec 15 03:16:27 localhost systemd[1]: libpod-8e2498e738e9cef942e7802bbcc12ec46a5fc25647d21ec915ced2edb246ff5c.scope: Deactivated successfully. Dec 15 03:16:27 localhost systemd[1]: libpod-3698017634ba8d0e94860da3e45c9dd2f90f13373203a4c53126c29a97271b70.scope: Deactivated successfully. 
Dec 15 03:16:27 localhost podman[70607]: 2025-12-15 08:16:27.727727518 +0000 UTC m=+0.404048285 container died 3698017634ba8d0e94860da3e45c9dd2f90f13373203a4c53126c29a97271b70 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_libvirt_init_secret, io.buildah.version=1.41.4, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.component=openstack-nova-libvirt-container, version=17.1.12, config_id=tripleo_step4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, config_data={'cgroupns': 'host', 'command': '/nova_libvirt_init_secret.sh ceph:openstack', 'detach': False, 'environment': {'LIBVIRT_DEFAULT_URI': 'qemu:///system', 'TRIPLEO_CONFIG_HASH': '879500e96bf8dfb93687004bd86f2317'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'privileged': False, 'security_opt': ['label=disable'], 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/config-data/puppet-generated/nova_libvirt/etc/nova:/etc/nova', '/etc/libvirt:/etc/libvirt', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/lib/container-config-scripts/nova_libvirt_init_secret.sh:/nova_libvirt_init_secret.sh:ro', '/var/lib/tripleo-config/ceph:/etc/ceph:ro']}, name=rhosp17/openstack-nova-libvirt, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, description=Red Hat OpenStack Platform 17.1 nova-libvirt, batch=17.1_20251118.1, 
distribution-scope=public, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, build-date=2025-11-19T00:35:22Z, container_name=nova_libvirt_init_secret, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, tcib_managed=true, release=1761123044, vcs-type=git, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, url=https://www.redhat.com, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d) Dec 15 03:16:27 localhost systemd[1]: Started /usr/bin/podman healthcheck run d6041e5471624a495d5be42684f6049ccf71802a80d5760239330151d465d146. Dec 15 03:16:27 localhost podman[70749]: 2025-12-15 08:16:27.736461288 +0000 UTC m=+0.097095626 container health_status 97115ff7c5a84ae7e17f79e696e94da3c63f9fe0051287fe9193bec282a03f99 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=starting, container_name=ceilometer_agent_ipmi, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, url=https://www.redhat.com, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.component=openstack-ceilometer-ipmi-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.expose-services=, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.buildah.version=1.41.4, managed_by=tripleo_ansible, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'fcee5a4a91f85471fca7b61211375646'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 
'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, config_id=tripleo_step4, batch=17.1_20251118.1, build-date=2025-11-19T00:12:45Z, distribution-scope=public, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, vendor=Red Hat, Inc., vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, vcs-type=git, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, version=17.1.12, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp17/openstack-ceilometer-ipmi, release=1761123044) Dec 15 03:16:27 localhost podman[70674]: 2025-12-15 08:16:27.747059777 +0000 UTC m=+0.342338601 container start d6041e5471624a495d5be42684f6049ccf71802a80d5760239330151d465d146 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, build-date=2025-11-19T00:11:48Z, distribution-scope=public, com.redhat.component=openstack-ceilometer-compute-container, vcs-type=git, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 
ceilometer-compute, release=1761123044, konflux.additional-tags=17.1.12 17.1_20251118.1, io.buildah.version=1.41.4, architecture=x86_64, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, name=rhosp17/openstack-ceilometer-compute, io.openshift.expose-services=, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, managed_by=tripleo_ansible, container_name=ceilometer_agent_compute, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, tcib_managed=true, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'fcee5a4a91f85471fca7b61211375646'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.tags=rhosp 
osp openstack osp-17.1 openstack-ceilometer-compute, batch=17.1_20251118.1, config_id=tripleo_step4, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, version=17.1.12) Dec 15 03:16:27 localhost python3[70450]: ansible-tripleo_container_manage PODMAN-CONTAINER-DEBUG: podman run --name ceilometer_agent_compute --conmon-pidfile /run/ceilometer_agent_compute.pid --detach=True --env KOLLA_CONFIG_STRATEGY=COPY_ALWAYS --env TRIPLEO_CONFIG_HASH=fcee5a4a91f85471fca7b61211375646 --healthcheck-command /openstack/healthcheck --label config_id=tripleo_step4 --label container_name=ceilometer_agent_compute --label managed_by=tripleo_ansible --label config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'fcee5a4a91f85471fca7b61211375646'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']} --log-driver k8s-file --log-opt path=/var/log/containers/stdouts/ceilometer_agent_compute.log --network host --privileged=False --volume /etc/hosts:/etc/hosts:ro --volume /etc/localtime:/etc/localtime:ro --volume 
/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro --volume /etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro --volume /etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro --volume /etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro --volume /etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro --volume /dev/log:/dev/log --volume /etc/puppet:/etc/puppet:ro --volume /var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro --volume /var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro --volume /run/libvirt:/run/libvirt:shared,z --volume /var/log/containers/ceilometer:/var/log/ceilometer:z registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1 Dec 15 03:16:27 localhost podman[70675]: 2025-12-15 08:16:27.820256883 +0000 UTC m=+0.409725944 container died 8e2498e738e9cef942e7802bbcc12ec46a5fc25647d21ec915ced2edb246ff5c (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=configure_cms_options, io.buildah.version=1.41.4, io.openshift.expose-services=, vendor=Red Hat, Inc., distribution-scope=public, architecture=x86_64, config_data={'command': ['/bin/bash', '-c', 'CMS_OPTS=$(hiera ovn::controller::ovn_cms_options -c /etc/puppet/hiera.yaml); if [ X"$CMS_OPTS" != X ]; then ovs-vsctl set open . external_ids:ovn-cms-options=$CMS_OPTS;else ovs-vsctl remove open . 
external_ids ovn-cms-options; fi'], 'detach': False, 'environment': {'TRIPLEO_DEPLOY_IDENTIFIER': '1765784752'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'start_order': 0, 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z']}, name=rhosp17/openstack-ovn-controller, managed_by=tripleo_ansible, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, version=17.1.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step4, release=1761123044, com.redhat.component=openstack-ovn-controller-container, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, summary=Red Hat OpenStack Platform 17.1 ovn-controller, batch=17.1_20251118.1, build-date=2025-11-18T23:34:05Z, tcib_managed=true, container_name=configure_cms_options, description=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, url=https://www.redhat.com) Dec 15 03:16:27 localhost podman[70740]: 2025-12-15 
08:16:27.803112122 +0000 UTC m=+0.178233072 container health_status ac45612fab357b615b4684dbfa5bd000dd29eb1adfbaedb6db0802fc49e89549 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=starting, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, version=17.1.12, com.redhat.component=openstack-cron-container, name=rhosp17/openstack-cron, url=https://www.redhat.com, architecture=x86_64, konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team, config_id=tripleo_step4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, batch=17.1_20251118.1, container_name=logrotate_crond, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vendor=Red Hat, Inc., config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, vcs-type=git, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 cron, 
cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, summary=Red Hat OpenStack Platform 17.1 cron, build-date=2025-11-18T22:49:32Z, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, managed_by=tripleo_ansible, release=1761123044, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05) Dec 15 03:16:27 localhost podman[70749]: 2025-12-15 08:16:27.873928696 +0000 UTC m=+0.234563044 container exec_died 97115ff7c5a84ae7e17f79e696e94da3c63f9fe0051287fe9193bec282a03f99 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, config_id=tripleo_step4, konflux.additional-tags=17.1.12 17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, build-date=2025-11-19T00:12:45Z, distribution-scope=public, maintainer=OpenStack TripleO Team, release=1761123044, architecture=x86_64, vcs-type=git, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, batch=17.1_20251118.1, container_name=ceilometer_agent_ipmi, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.expose-services=, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'fcee5a4a91f85471fca7b61211375646'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, name=rhosp17/openstack-ceilometer-ipmi, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, managed_by=tripleo_ansible, url=https://www.redhat.com, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, version=17.1.12, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.buildah.version=1.41.4, com.redhat.component=openstack-ceilometer-ipmi-container, tcib_managed=true) Dec 15 03:16:27 localhost podman[70749]: unhealthy Dec 15 03:16:27 localhost systemd[1]: 97115ff7c5a84ae7e17f79e696e94da3c63f9fe0051287fe9193bec282a03f99.service: Main process exited, code=exited, status=1/FAILURE Dec 15 03:16:27 localhost systemd[1]: 97115ff7c5a84ae7e17f79e696e94da3c63f9fe0051287fe9193bec282a03f99.service: Failed with result 'exit-code'. 
Dec 15 03:16:28 localhost podman[70803]: 2025-12-15 08:16:28.004043591 +0000 UTC m=+0.275315288 container cleanup 8e2498e738e9cef942e7802bbcc12ec46a5fc25647d21ec915ced2edb246ff5c (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=configure_cms_options, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, build-date=2025-11-18T23:34:05Z, distribution-scope=public, url=https://www.redhat.com, config_data={'command': ['/bin/bash', '-c', 'CMS_OPTS=$(hiera ovn::controller::ovn_cms_options -c /etc/puppet/hiera.yaml); if [ X"$CMS_OPTS" != X ]; then ovs-vsctl set open . external_ids:ovn-cms-options=$CMS_OPTS;else ovs-vsctl remove open . external_ids ovn-cms-options; fi'], 'detach': False, 'environment': {'TRIPLEO_DEPLOY_IDENTIFIER': '1765784752'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'start_order': 0, 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.buildah.version=1.41.4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, config_id=tripleo_step4, container_name=configure_cms_options, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, version=17.1.12, vcs-type=git, batch=17.1_20251118.1, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1761123044, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, name=rhosp17/openstack-ovn-controller, tcib_managed=true, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.component=openstack-ovn-controller-container, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, description=Red Hat OpenStack Platform 17.1 ovn-controller, vendor=Red Hat, Inc.) Dec 15 03:16:28 localhost podman[70813]: 2025-12-15 08:16:28.008658742 +0000 UTC m=+0.267529742 container health_status d6041e5471624a495d5be42684f6049ccf71802a80d5760239330151d465d146 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=starting, konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., vcs-type=git, release=1761123044, io.buildah.version=1.41.4, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, architecture=x86_64, url=https://www.redhat.com, version=17.1.12, io.openshift.expose-services=, name=rhosp17/openstack-ceilometer-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, com.redhat.component=openstack-ceilometer-compute-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, managed_by=tripleo_ansible, tcib_managed=true, batch=17.1_20251118.1, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, distribution-scope=public, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'fcee5a4a91f85471fca7b61211375646'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, container_name=ceilometer_agent_compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_id=tripleo_step4, build-date=2025-11-19T00:11:48Z) Dec 15 03:16:28 localhost systemd[1]: libpod-conmon-8e2498e738e9cef942e7802bbcc12ec46a5fc25647d21ec915ced2edb246ff5c.scope: Deactivated successfully. 
Dec 15 03:16:28 localhost podman[70813]: 2025-12-15 08:16:28.019581169 +0000 UTC m=+0.278452209 container exec_died d6041e5471624a495d5be42684f6049ccf71802a80d5760239330151d465d146 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, name=rhosp17/openstack-ceilometer-compute, vcs-type=git, batch=17.1_20251118.1, managed_by=tripleo_ansible, tcib_managed=true, com.redhat.component=openstack-ceilometer-compute-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-19T00:11:48Z, io.buildah.version=1.41.4, maintainer=OpenStack TripleO Team, config_id=tripleo_step4, konflux.additional-tags=17.1.12 17.1_20251118.1, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, release=1761123044, version=17.1.12, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'fcee5a4a91f85471fca7b61211375646'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, container_name=ceilometer_agent_compute, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.expose-services=) Dec 15 03:16:28 localhost podman[70813]: unhealthy Dec 15 03:16:28 localhost podman[70814]: 2025-12-15 08:16:28.034615965 +0000 UTC m=+0.290567288 container cleanup 3698017634ba8d0e94860da3e45c9dd2f90f13373203a4c53126c29a97271b70 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_libvirt_init_secret, config_data={'cgroupns': 'host', 'command': '/nova_libvirt_init_secret.sh ceph:openstack', 'detach': False, 'environment': {'LIBVIRT_DEFAULT_URI': 'qemu:///system', 'TRIPLEO_CONFIG_HASH': '879500e96bf8dfb93687004bd86f2317'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'privileged': False, 'security_opt': ['label=disable'], 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/config-data/puppet-generated/nova_libvirt/etc/nova:/etc/nova', '/etc/libvirt:/etc/libvirt', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/lib/container-config-scripts/nova_libvirt_init_secret.sh:/nova_libvirt_init_secret.sh:ro', '/var/lib/tripleo-config/ceph:/etc/ceph:ro']}, description=Red Hat OpenStack Platform 17.1 nova-libvirt, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, io.openshift.expose-services=, distribution-scope=public, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, architecture=x86_64, konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible, url=https://www.redhat.com, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, build-date=2025-11-19T00:35:22Z, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, vendor=Red Hat, Inc., name=rhosp17/openstack-nova-libvirt, batch=17.1_20251118.1, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step4, release=1761123044, io.buildah.version=1.41.4, com.redhat.component=openstack-nova-libvirt-container, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, tcib_managed=true, vcs-type=git, version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, container_name=nova_libvirt_init_secret, summary=Red Hat OpenStack Platform 17.1 nova-libvirt) Dec 15 03:16:28 localhost systemd[1]: d6041e5471624a495d5be42684f6049ccf71802a80d5760239330151d465d146.service: Main process exited, code=exited, status=1/FAILURE Dec 15 03:16:28 localhost systemd[1]: d6041e5471624a495d5be42684f6049ccf71802a80d5760239330151d465d146.service: Failed with result 'exit-code'. 
Dec 15 03:16:28 localhost systemd[1]: libpod-conmon-3698017634ba8d0e94860da3e45c9dd2f90f13373203a4c53126c29a97271b70.scope: Deactivated successfully. Dec 15 03:16:28 localhost python3[70450]: ansible-tripleo_container_manage PODMAN-CONTAINER-DEBUG: podman run --name nova_libvirt_init_secret --cgroupns=host --conmon-pidfile /run/nova_libvirt_init_secret.pid --detach=False --env LIBVIRT_DEFAULT_URI=qemu:///system --env TRIPLEO_CONFIG_HASH=879500e96bf8dfb93687004bd86f2317 --label config_id=tripleo_step4 --label container_name=nova_libvirt_init_secret --label managed_by=tripleo_ansible --label config_data={'cgroupns': 'host', 'command': '/nova_libvirt_init_secret.sh ceph:openstack', 'detach': False, 'environment': {'LIBVIRT_DEFAULT_URI': 'qemu:///system', 'TRIPLEO_CONFIG_HASH': '879500e96bf8dfb93687004bd86f2317'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'privileged': False, 'security_opt': ['label=disable'], 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/config-data/puppet-generated/nova_libvirt/etc/nova:/etc/nova', '/etc/libvirt:/etc/libvirt', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/lib/container-config-scripts/nova_libvirt_init_secret.sh:/nova_libvirt_init_secret.sh:ro', '/var/lib/tripleo-config/ceph:/etc/ceph:ro']} --log-driver k8s-file --log-opt path=/var/log/containers/stdouts/nova_libvirt_init_secret.log --network host --privileged=False --security-opt label=disable --user root --volume /etc/hosts:/etc/hosts:ro --volume 
/etc/localtime:/etc/localtime:ro --volume /etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro --volume /etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro --volume /etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro --volume /etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro --volume /etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro --volume /dev/log:/dev/log --volume /etc/puppet:/etc/puppet:ro --volume /var/lib/config-data/puppet-generated/nova_libvirt/etc/nova:/etc/nova --volume /etc/libvirt:/etc/libvirt --volume /run/libvirt:/run/libvirt:shared,z --volume /var/lib/libvirt:/var/lib/libvirt:shared --volume /var/lib/container-config-scripts/nova_libvirt_init_secret.sh:/nova_libvirt_init_secret.sh:ro --volume /var/lib/tripleo-config/ceph:/etc/ceph:ro registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1 /nova_libvirt_init_secret.sh ceph:openstack Dec 15 03:16:28 localhost python3[70450]: ansible-tripleo_container_manage PODMAN-CONTAINER-DEBUG: podman run --name configure_cms_options --conmon-pidfile /run/configure_cms_options.pid --detach=False --env TRIPLEO_DEPLOY_IDENTIFIER=1765784752 --label config_id=tripleo_step4 --label container_name=configure_cms_options --label managed_by=tripleo_ansible --label config_data={'command': ['/bin/bash', '-c', 'CMS_OPTS=$(hiera ovn::controller::ovn_cms_options -c /etc/puppet/hiera.yaml); if [ X"$CMS_OPTS" != X ]; then ovs-vsctl set open . external_ids:ovn-cms-options=$CMS_OPTS;else ovs-vsctl remove open . 
external_ids ovn-cms-options; fi'], 'detach': False, 'environment': {'TRIPLEO_DEPLOY_IDENTIFIER': '1765784752'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'start_order': 0, 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z']} --log-driver k8s-file --log-opt path=/var/log/containers/stdouts/configure_cms_options.log --network host --privileged=True --user root --volume /etc/hosts:/etc/hosts:ro --volume /etc/localtime:/etc/localtime:ro --volume /etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro --volume /etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro --volume /etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro --volume /etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro --volume /etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro --volume /dev/log:/dev/log --volume /etc/puppet:/etc/puppet:ro --volume /lib/modules:/lib/modules:ro --volume /run/openvswitch:/run/openvswitch:shared,z registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1 /bin/bash -c CMS_OPTS=$(hiera ovn::controller::ovn_cms_options -c /etc/puppet/hiera.yaml); if [ X"$CMS_OPTS" != X ]; then ovs-vsctl set open . external_ids:ovn-cms-options=$CMS_OPTS;else ovs-vsctl remove open . 
external_ids ovn-cms-options; fi Dec 15 03:16:28 localhost podman[70740]: 2025-12-15 08:16:28.083499892 +0000 UTC m=+0.458620842 container exec_died ac45612fab357b615b4684dbfa5bd000dd29eb1adfbaedb6db0802fc49e89549 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, tcib_managed=true, version=17.1.12, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, summary=Red Hat OpenStack Platform 17.1 cron, url=https://www.redhat.com, vcs-type=git, batch=17.1_20251118.1, distribution-scope=public, com.redhat.component=openstack-cron-container, name=rhosp17/openstack-cron, release=1761123044, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, container_name=logrotate_crond, architecture=x86_64, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.expose-services=, description=Red Hat 
OpenStack Platform 17.1 cron, managed_by=tripleo_ansible, config_id=tripleo_step4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., io.buildah.version=1.41.4, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, build-date=2025-11-18T22:49:32Z, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron) Dec 15 03:16:28 localhost systemd[1]: ac45612fab357b615b4684dbfa5bd000dd29eb1adfbaedb6db0802fc49e89549.service: Deactivated successfully. Dec 15 03:16:28 localhost systemd[1]: var-lib-containers-storage-overlay-ba4f6a51fa4a8ce51a3b8b83f42d30ba19f6e65ec47784832d65f45914b624c3-merged.mount: Deactivated successfully. Dec 15 03:16:28 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-3698017634ba8d0e94860da3e45c9dd2f90f13373203a4c53126c29a97271b70-userdata-shm.mount: Deactivated successfully. 
Dec 15 03:16:28 localhost podman[70988]: 2025-12-15 08:16:28.246574083 +0000 UTC m=+0.092468914 container create b8952a0fad4d24a09ed6a21aad65c114d84153cd41e6bc9d05b75a8964e8afff (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=setup_ovs_manager, version=17.1.12, config_data={'command': ['/container_puppet_apply.sh', '4', 'exec', 'include tripleo::profile::base::neutron::ovn_metadata'], 'detach': False, 'environment': {'TRIPLEO_DEPLOY_IDENTIFIER': '1765784752'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'privileged': True, 'start_order': 0, 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/container-config-scripts/container_puppet_apply.sh:/container_puppet_apply.sh:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z']}, konflux.additional-tags=17.1.12 17.1_20251118.1, url=https://www.redhat.com, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vendor=Red Hat, Inc., io.buildah.version=1.41.4, batch=17.1_20251118.1, config_id=tripleo_step4, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, architecture=x86_64, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, container_name=setup_ovs_manager, build-date=2025-11-19T00:14:25Z, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, summary=Red Hat OpenStack Platform 17.1 
neutron-metadata-agent-ovn, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, name=rhosp17/openstack-neutron-metadata-agent-ovn, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, managed_by=tripleo_ansible, distribution-scope=public, release=1761123044, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn) Dec 15 03:16:28 localhost systemd[1]: Started libpod-conmon-b8952a0fad4d24a09ed6a21aad65c114d84153cd41e6bc9d05b75a8964e8afff.scope. Dec 15 03:16:28 localhost podman[70988]: 2025-12-15 08:16:28.204683241 +0000 UTC m=+0.050578092 image pull registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1 Dec 15 03:16:28 localhost systemd[1]: Started libcrun container. 
Dec 15 03:16:28 localhost podman[70988]: 2025-12-15 08:16:28.361153709 +0000 UTC m=+0.207048520 container init b8952a0fad4d24a09ed6a21aad65c114d84153cd41e6bc9d05b75a8964e8afff (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=setup_ovs_manager, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, release=1761123044, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, container_name=setup_ovs_manager, config_id=tripleo_step4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, name=rhosp17/openstack-neutron-metadata-agent-ovn, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vendor=Red Hat, Inc., io.buildah.version=1.41.4, url=https://www.redhat.com, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, io.openshift.expose-services=, config_data={'command': ['/container_puppet_apply.sh', '4', 'exec', 'include tripleo::profile::base::neutron::ovn_metadata'], 'detach': False, 'environment': {'TRIPLEO_DEPLOY_IDENTIFIER': '1765784752'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'privileged': True, 'start_order': 0, 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/container-config-scripts/container_puppet_apply.sh:/container_puppet_apply.sh:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, version=17.1.12, architecture=x86_64, distribution-scope=public, build-date=2025-11-19T00:14:25Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, maintainer=OpenStack TripleO Team, batch=17.1_20251118.1, managed_by=tripleo_ansible) Dec 15 03:16:28 localhost podman[70988]: 2025-12-15 08:16:28.374333486 +0000 UTC m=+0.220228287 container start b8952a0fad4d24a09ed6a21aad65c114d84153cd41e6bc9d05b75a8964e8afff (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=setup_ovs_manager, container_name=setup_ovs_manager, tcib_managed=true, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, version=17.1.12, architecture=x86_64, build-date=2025-11-19T00:14:25Z, distribution-scope=public, vendor=Red Hat, Inc., com.redhat.component=openstack-neutron-metadata-agent-ovn-container, maintainer=OpenStack TripleO Team, config_id=tripleo_step4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, batch=17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, release=1761123044, 
vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, name=rhosp17/openstack-neutron-metadata-agent-ovn, io.buildah.version=1.41.4, vcs-type=git, managed_by=tripleo_ansible, konflux.additional-tags=17.1.12 17.1_20251118.1, config_data={'command': ['/container_puppet_apply.sh', '4', 'exec', 'include tripleo::profile::base::neutron::ovn_metadata'], 'detach': False, 'environment': {'TRIPLEO_DEPLOY_IDENTIFIER': '1765784752'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'privileged': True, 'start_order': 0, 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/container-config-scripts/container_puppet_apply.sh:/container_puppet_apply.sh:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn) Dec 15 03:16:28 localhost podman[70988]: 2025-12-15 08:16:28.374520761 +0000 UTC m=+0.220415572 container attach b8952a0fad4d24a09ed6a21aad65c114d84153cd41e6bc9d05b75a8964e8afff (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=setup_ovs_manager, konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible, architecture=x86_64, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, url=https://www.redhat.com, 
distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, build-date=2025-11-19T00:14:25Z, container_name=setup_ovs_manager, name=rhosp17/openstack-neutron-metadata-agent-ovn, version=17.1.12, batch=17.1_20251118.1, config_data={'command': ['/container_puppet_apply.sh', '4', 'exec', 'include tripleo::profile::base::neutron::ovn_metadata'], 'detach': False, 'environment': {'TRIPLEO_DEPLOY_IDENTIFIER': '1765784752'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'privileged': True, 'start_order': 0, 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/container-config-scripts/container_puppet_apply.sh:/container_puppet_apply.sh:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z']}, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, release=1761123044, vcs-type=git, vendor=Red Hat, Inc., io.buildah.version=1.41.4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, maintainer=OpenStack TripleO Team, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 
openstack-neutron-metadata-agent-ovn, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_id=tripleo_step4, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn) Dec 15 03:16:28 localhost podman[71037]: 2025-12-15 08:16:28.477328537 +0000 UTC m=+0.082438871 container create 4c3f871f4bdf287b1d3ce717956c1cf130617489217410eadf6fffcc440eb3b0 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, com.redhat.component=openstack-nova-compute-container, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, batch=17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '879500e96bf8dfb93687004bd86f2317'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, io.buildah.version=1.41.4, 
io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, config_id=tripleo_step4, build-date=2025-11-19T00:36:58Z, version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp17/openstack-nova-compute, tcib_managed=true, architecture=x86_64, container_name=nova_migration_target, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, distribution-scope=public, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, vendor=Red Hat, Inc., release=1761123044) Dec 15 03:16:28 localhost podman[71037]: 2025-12-15 08:16:28.432864787 +0000 UTC m=+0.037975111 image pull registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1 Dec 15 03:16:28 localhost systemd[1]: Started libpod-conmon-4c3f871f4bdf287b1d3ce717956c1cf130617489217410eadf6fffcc440eb3b0.scope. Dec 15 03:16:28 localhost systemd[1]: Started libcrun container. Dec 15 03:16:28 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/1925df910a0bd163709115d5c6434edae9eb72581a26c20b4795234cbdad634b/merged/var/lib/nova supports timestamps until 2038 (0x7fffffff) Dec 15 03:16:28 localhost systemd[1]: Started /usr/bin/podman healthcheck run 4c3f871f4bdf287b1d3ce717956c1cf130617489217410eadf6fffcc440eb3b0. 
Dec 15 03:16:28 localhost podman[71037]: 2025-12-15 08:16:28.578257484 +0000 UTC m=+0.183367848 container init 4c3f871f4bdf287b1d3ce717956c1cf130617489217410eadf6fffcc440eb3b0 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, batch=17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '879500e96bf8dfb93687004bd86f2317'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp17/openstack-nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, config_id=tripleo_step4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1761123044, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, managed_by=tripleo_ansible, konflux.additional-tags=17.1.12 17.1_20251118.1, tcib_managed=true, version=17.1.12, 
url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, container_name=nova_migration_target, io.buildah.version=1.41.4, architecture=x86_64, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 nova-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vendor=Red Hat, Inc., build-date=2025-11-19T00:36:58Z, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, vcs-type=git, com.redhat.component=openstack-nova-compute-container) Dec 15 03:16:28 localhost systemd[1]: Started /usr/bin/podman healthcheck run 4c3f871f4bdf287b1d3ce717956c1cf130617489217410eadf6fffcc440eb3b0. Dec 15 03:16:28 localhost podman[71037]: 2025-12-15 08:16:28.617491326 +0000 UTC m=+0.222601650 container start 4c3f871f4bdf287b1d3ce717956c1cf130617489217410eadf6fffcc440eb3b0 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, config_id=tripleo_step4, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.12, io.buildah.version=1.41.4, vendor=Red Hat, Inc., batch=17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, tcib_managed=true, maintainer=OpenStack TripleO Team, distribution-scope=public, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, release=1761123044, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, container_name=nova_migration_target, name=rhosp17/openstack-nova-compute, architecture=x86_64, 
cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, build-date=2025-11-19T00:36:58Z, url=https://www.redhat.com, vcs-type=git, com.redhat.component=openstack-nova-compute-container, description=Red Hat OpenStack Platform 17.1 nova-compute, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '879500e96bf8dfb93687004bd86f2317'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}) Dec 15 03:16:28 localhost python3[70450]: ansible-tripleo_container_manage PODMAN-CONTAINER-DEBUG: podman run --name nova_migration_target --conmon-pidfile /run/nova_migration_target.pid --detach=True --env KOLLA_CONFIG_STRATEGY=COPY_ALWAYS --env TRIPLEO_CONFIG_HASH=879500e96bf8dfb93687004bd86f2317 --healthcheck-command /openstack/healthcheck --label config_id=tripleo_step4 --label container_name=nova_migration_target --label managed_by=tripleo_ansible --label config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '879500e96bf8dfb93687004bd86f2317'}, 'healthcheck': 
{'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']} --log-driver k8s-file --log-opt path=/var/log/containers/stdouts/nova_migration_target.log --network host --privileged=True --user root --volume /etc/hosts:/etc/hosts:ro --volume /etc/localtime:/etc/localtime:ro --volume /etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro --volume /etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro --volume /etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro --volume /etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro --volume /etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro --volume /dev/log:/dev/log --volume /etc/puppet:/etc/puppet:ro --volume /var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro --volume /var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro --volume /etc/ssh:/host-ssh:ro --volume /run/libvirt:/run/libvirt:shared,z --volume /var/lib/nova:/var/lib/nova:shared registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1 Dec 15 03:16:28 localhost 
podman[71068]: 2025-12-15 08:16:28.700341956 +0000 UTC m=+0.074061060 container health_status 4c3f871f4bdf287b1d3ce717956c1cf130617489217410eadf6fffcc440eb3b0 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=starting, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, name=rhosp17/openstack-nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, com.redhat.component=openstack-nova-compute-container, io.buildah.version=1.41.4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2025-11-19T00:36:58Z, batch=17.1_20251118.1, url=https://www.redhat.com, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, maintainer=OpenStack TripleO Team, version=17.1.12, managed_by=tripleo_ansible, config_id=tripleo_step4, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, tcib_managed=true, container_name=nova_migration_target, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, vendor=Red Hat, Inc., config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '879500e96bf8dfb93687004bd86f2317'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, release=1761123044, architecture=x86_64) Dec 15 03:16:29 localhost podman[71068]: 2025-12-15 08:16:29.028176434 +0000 UTC m=+0.401895558 container exec_died 4c3f871f4bdf287b1d3ce717956c1cf130617489217410eadf6fffcc440eb3b0 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, managed_by=tripleo_ansible, build-date=2025-11-19T00:36:58Z, io.openshift.expose-services=, name=rhosp17/openstack-nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, version=17.1.12, tcib_managed=true, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20251118.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '879500e96bf8dfb93687004bd86f2317'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.component=openstack-nova-compute-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, architecture=x86_64, config_id=tripleo_step4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.buildah.version=1.41.4, release=1761123044, container_name=nova_migration_target, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute) Dec 15 03:16:29 localhost systemd[1]: 4c3f871f4bdf287b1d3ce717956c1cf130617489217410eadf6fffcc440eb3b0.service: Deactivated successfully. Dec 15 03:16:29 localhost kernel: capability: warning: `privsep-helper' uses deprecated v2 capabilities in a way that may be insecure Dec 15 03:16:29 localhost systemd[1]: tmp-crun.rumplo.mount: Deactivated successfully. Dec 15 03:16:31 localhost ovs-vsctl[71233]: ovs|00001|vsctl|INFO|Called as ovs-vsctl --timeout=5 --id=@manager -- create Manager "target=\"ptcp:6640:127.0.0.1\"" -- add Open_vSwitch . 
manager_options @manager Dec 15 03:16:31 localhost systemd[1]: libpod-b8952a0fad4d24a09ed6a21aad65c114d84153cd41e6bc9d05b75a8964e8afff.scope: Deactivated successfully. Dec 15 03:16:31 localhost systemd[1]: libpod-b8952a0fad4d24a09ed6a21aad65c114d84153cd41e6bc9d05b75a8964e8afff.scope: Consumed 3.152s CPU time. Dec 15 03:16:31 localhost podman[70988]: 2025-12-15 08:16:31.589847876 +0000 UTC m=+3.435742757 container died b8952a0fad4d24a09ed6a21aad65c114d84153cd41e6bc9d05b75a8964e8afff (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=setup_ovs_manager, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, batch=17.1_20251118.1, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, config_data={'command': ['/container_puppet_apply.sh', '4', 'exec', 'include tripleo::profile::base::neutron::ovn_metadata'], 'detach': False, 'environment': {'TRIPLEO_DEPLOY_IDENTIFIER': '1765784752'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'privileged': True, 'start_order': 0, 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/container-config-scripts/container_puppet_apply.sh:/container_puppet_apply.sh:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z']}, vendor=Red Hat, Inc., com.redhat.component=openstack-neutron-metadata-agent-ovn-container, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, config_id=tripleo_step4, build-date=2025-11-19T00:14:25Z, 
io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, container_name=setup_ovs_manager, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, io.openshift.expose-services=, release=1761123044, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, name=rhosp17/openstack-neutron-metadata-agent-ovn, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, version=17.1.12, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-type=git, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, tcib_managed=true) Dec 15 03:16:31 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-b8952a0fad4d24a09ed6a21aad65c114d84153cd41e6bc9d05b75a8964e8afff-userdata-shm.mount: Deactivated successfully. Dec 15 03:16:31 localhost systemd[1]: var-lib-containers-storage-overlay-021518f28c3ecffc119cd61d385817388e410377ea08d35485a54384d237f335-merged.mount: Deactivated successfully. 
Dec 15 03:16:31 localhost podman[71234]: 2025-12-15 08:16:31.703032874 +0000 UTC m=+0.097515088 container cleanup b8952a0fad4d24a09ed6a21aad65c114d84153cd41e6bc9d05b75a8964e8afff (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=setup_ovs_manager, container_name=setup_ovs_manager, release=1761123044, konflux.additional-tags=17.1.12 17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, managed_by=tripleo_ansible, io.buildah.version=1.41.4, batch=17.1_20251118.1, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, url=https://www.redhat.com, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.expose-services=, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, build-date=2025-11-19T00:14:25Z, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_id=tripleo_step4, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_data={'command': ['/container_puppet_apply.sh', '4', 'exec', 'include tripleo::profile::base::neutron::ovn_metadata'], 'detach': False, 'environment': {'TRIPLEO_DEPLOY_IDENTIFIER': '1765784752'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'privileged': True, 'start_order': 0, 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/container-config-scripts/container_puppet_apply.sh:/container_puppet_apply.sh:ro', '/etc/puppet:/tmp/puppet-etc:ro', 
'/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z']}, tcib_managed=true, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, version=17.1.12, name=rhosp17/openstack-neutron-metadata-agent-ovn, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-type=git, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, distribution-scope=public, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI) Dec 15 03:16:31 localhost systemd[1]: libpod-conmon-b8952a0fad4d24a09ed6a21aad65c114d84153cd41e6bc9d05b75a8964e8afff.scope: Deactivated successfully. Dec 15 03:16:31 localhost python3[70450]: ansible-tripleo_container_manage PODMAN-CONTAINER-DEBUG: podman run --name setup_ovs_manager --conmon-pidfile /run/setup_ovs_manager.pid --detach=False --env TRIPLEO_DEPLOY_IDENTIFIER=1765784752 --label config_id=tripleo_step4 --label container_name=setup_ovs_manager --label managed_by=tripleo_ansible --label config_data={'command': ['/container_puppet_apply.sh', '4', 'exec', 'include tripleo::profile::base::neutron::ovn_metadata'], 'detach': False, 'environment': {'TRIPLEO_DEPLOY_IDENTIFIER': '1765784752'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'privileged': True, 'start_order': 0, 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/container-config-scripts/container_puppet_apply.sh:/container_puppet_apply.sh:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z']} --log-driver k8s-file --log-opt path=/var/log/containers/stdouts/setup_ovs_manager.log --network host --privileged=True --user root --volume /etc/hosts:/etc/hosts:ro --volume /etc/localtime:/etc/localtime:ro --volume /etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro --volume /etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro --volume /etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro --volume /etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro --volume /etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro --volume /dev/log:/dev/log --volume /var/lib/container-config-scripts/container_puppet_apply.sh:/container_puppet_apply.sh:ro --volume /etc/puppet:/tmp/puppet-etc:ro --volume /usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro --volume /lib/modules:/lib/modules:ro --volume /run/openvswitch:/run/openvswitch:shared,z registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1 /container_puppet_apply.sh 4 exec include tripleo::profile::base::neutron::ovn_metadata Dec 15 03:16:32 localhost podman[71344]: 2025-12-15 08:16:32.21504611 +0000 UTC m=+0.080305625 container create 4d35b7dc3f3a4e3c60661e85d3511f50804e87244d74cab67af5c6e8c447d379 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, container_name=ovn_metadata_agent, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, distribution-scope=public, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, 
version=17.1.12, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'a56a6f14b467cd9064e40c03defa5ed7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, io.buildah.version=1.41.4, managed_by=tripleo_ansible, name=rhosp17/openstack-neutron-metadata-agent-ovn, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, config_id=tripleo_step4, 
io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, release=1761123044, vendor=Red Hat, Inc., build-date=2025-11-19T00:14:25Z, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, url=https://www.redhat.com, vcs-type=git, batch=17.1_20251118.1, com.redhat.component=openstack-neutron-metadata-agent-ovn-container) Dec 15 03:16:32 localhost podman[71345]: 2025-12-15 08:16:32.246651162 +0000 UTC m=+0.103706401 container create 2df8d20c6ba32f38bd0e6519c9c77a4146b632f21c81197d8c319b223aad81f1 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, tcib_managed=true, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, build-date=2025-11-18T23:34:05Z, batch=17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, description=Red Hat OpenStack Platform 17.1 ovn-controller, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, io.buildah.version=1.41.4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., version=17.1.12, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.12 17.1_20251118.1, config_id=tripleo_step4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, vcs-type=git, io.openshift.expose-services=, release=1761123044, com.redhat.component=openstack-ovn-controller-container, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': 
True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, name=rhosp17/openstack-ovn-controller, url=https://www.redhat.com, container_name=ovn_controller, summary=Red Hat OpenStack Platform 17.1 ovn-controller, managed_by=tripleo_ansible, distribution-scope=public) Dec 15 03:16:32 localhost systemd[1]: Started libpod-conmon-4d35b7dc3f3a4e3c60661e85d3511f50804e87244d74cab67af5c6e8c447d379.scope. Dec 15 03:16:32 localhost systemd[1]: Started libpod-conmon-2df8d20c6ba32f38bd0e6519c9c77a4146b632f21c81197d8c319b223aad81f1.scope. Dec 15 03:16:32 localhost podman[71344]: 2025-12-15 08:16:32.17285411 +0000 UTC m=+0.038113705 image pull registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1 Dec 15 03:16:32 localhost podman[71345]: 2025-12-15 08:16:32.192433545 +0000 UTC m=+0.049488844 image pull registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1 Dec 15 03:16:32 localhost systemd[1]: Started libcrun container. Dec 15 03:16:32 localhost systemd[1]: Started libcrun container. 
Dec 15 03:16:32 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/ad536c756d37421165059407ca1ade816c0cce3c0bd15797d12fce327284d9de/merged/run/ovn supports timestamps until 2038 (0x7fffffff) Dec 15 03:16:32 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/ad536c756d37421165059407ca1ade816c0cce3c0bd15797d12fce327284d9de/merged/var/log/ovn supports timestamps until 2038 (0x7fffffff) Dec 15 03:16:32 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/ad536c756d37421165059407ca1ade816c0cce3c0bd15797d12fce327284d9de/merged/var/log/openvswitch supports timestamps until 2038 (0x7fffffff) Dec 15 03:16:32 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/efe3dd72d150707edc7bd6bf365298295ccd8023b849757f6c8f1cb42ddbcc93/merged/var/log/neutron supports timestamps until 2038 (0x7fffffff) Dec 15 03:16:32 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/efe3dd72d150707edc7bd6bf365298295ccd8023b849757f6c8f1cb42ddbcc93/merged/etc/neutron/kill_scripts supports timestamps until 2038 (0x7fffffff) Dec 15 03:16:32 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/efe3dd72d150707edc7bd6bf365298295ccd8023b849757f6c8f1cb42ddbcc93/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff) Dec 15 03:16:32 localhost systemd[1]: Started /usr/bin/podman healthcheck run 2df8d20c6ba32f38bd0e6519c9c77a4146b632f21c81197d8c319b223aad81f1. 
Dec 15 03:16:32 localhost podman[71345]: 2025-12-15 08:16:32.350410982 +0000 UTC m=+0.207466241 container init 2df8d20c6ba32f38bd0e6519c9c77a4146b632f21c81197d8c319b223aad81f1 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, konflux.additional-tags=17.1.12 17.1_20251118.1, io.buildah.version=1.41.4, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, com.redhat.component=openstack-ovn-controller-container, architecture=x86_64, url=https://www.redhat.com, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, name=rhosp17/openstack-ovn-controller, build-date=2025-11-18T23:34:05Z, io.openshift.expose-services=, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, release=1761123044, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, batch=17.1_20251118.1, container_name=ovn_controller, version=17.1.12, distribution-scope=public, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 ovn-controller, vendor=Red Hat, Inc., 
managed_by=tripleo_ansible, config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller) Dec 15 03:16:32 localhost systemd[1]: Started /usr/bin/podman healthcheck run 2df8d20c6ba32f38bd0e6519c9c77a4146b632f21c81197d8c319b223aad81f1. Dec 15 03:16:32 localhost podman[71345]: 2025-12-15 08:16:32.392804388 +0000 UTC m=+0.249859647 container start 2df8d20c6ba32f38bd0e6519c9c77a4146b632f21c81197d8c319b223aad81f1 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.buildah.version=1.41.4, description=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.component=openstack-ovn-controller-container, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-ovn-controller, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, vcs-type=git, io.openshift.expose-services=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, vendor=Red Hat, Inc., release=1761123044, batch=17.1_20251118.1, url=https://www.redhat.com, container_name=ovn_controller, summary=Red Hat OpenStack Platform 17.1 ovn-controller, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, 
config_id=tripleo_step4, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, version=17.1.12, konflux.additional-tags=17.1.12 17.1_20251118.1, build-date=2025-11-18T23:34:05Z, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272) Dec 15 03:16:32 localhost systemd-logind[763]: Existing logind session ID 28 used by new audit session, ignoring. Dec 15 03:16:32 localhost python3[70450]: ansible-tripleo_container_manage PODMAN-CONTAINER-DEBUG: podman run --name ovn_controller --conmon-pidfile /run/ovn_controller.pid --detach=True --env KOLLA_CONFIG_STRATEGY=COPY_ALWAYS --healthcheck-command /openstack/healthcheck 6642 --label config_id=tripleo_step4 --label container_name=ovn_controller --label managed_by=tripleo_ansible --label config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']} --log-driver k8s-file --log-opt path=/var/log/containers/stdouts/ovn_controller.log --network host --privileged=True --user root --volume /var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro --volume /lib/modules:/lib/modules:ro --volume /run:/run --volume /var/lib/openvswitch/ovn:/run/ovn:shared,z --volume 
/var/log/containers/openvswitch:/var/log/openvswitch:z --volume /var/log/containers/openvswitch:/var/log/ovn:z registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1 Dec 15 03:16:32 localhost systemd[1]: Created slice User Slice of UID 0. Dec 15 03:16:32 localhost systemd[1]: Starting User Runtime Directory /run/user/0... Dec 15 03:16:32 localhost systemd[1]: Started /usr/bin/podman healthcheck run 4d35b7dc3f3a4e3c60661e85d3511f50804e87244d74cab67af5c6e8c447d379. Dec 15 03:16:32 localhost podman[71344]: 2025-12-15 08:16:32.416542833 +0000 UTC m=+0.281802358 container init 4d35b7dc3f3a4e3c60661e85d3511f50804e87244d74cab67af5c6e8c447d379 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, release=1761123044, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, build-date=2025-11-19T00:14:25Z, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, maintainer=OpenStack TripleO Team, vcs-type=git, architecture=x86_64, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, config_id=tripleo_step4, managed_by=tripleo_ansible, tcib_managed=true, name=rhosp17/openstack-neutron-metadata-agent-ovn, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, vendor=Red Hat, Inc., version=17.1.12, container_name=ovn_metadata_agent, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 
'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'a56a6f14b467cd9064e40c03defa5ed7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, io.buildah.version=1.41.4, url=https://www.redhat.com, distribution-scope=public, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.expose-services=, batch=17.1_20251118.1) Dec 15 03:16:32 localhost systemd[1]: Finished User Runtime Directory /run/user/0. Dec 15 03:16:32 localhost systemd[1]: Starting User Manager for UID 0... Dec 15 03:16:32 localhost systemd[1]: Started /usr/bin/podman healthcheck run 4d35b7dc3f3a4e3c60661e85d3511f50804e87244d74cab67af5c6e8c447d379. 
Dec 15 03:16:32 localhost podman[71382]: 2025-12-15 08:16:32.491076504 +0000 UTC m=+0.087694478 container health_status 2df8d20c6ba32f38bd0e6519c9c77a4146b632f21c81197d8c319b223aad81f1 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=starting, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, tcib_managed=true, vendor=Red Hat, Inc., url=https://www.redhat.com, architecture=x86_64, managed_by=tripleo_ansible, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, container_name=ovn_controller, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-type=git, description=Red Hat OpenStack Platform 17.1 ovn-controller, release=1761123044, io.openshift.expose-services=, version=17.1.12, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-11-18T23:34:05Z, config_id=tripleo_step4, io.buildah.version=1.41.4, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-ovn-controller-container, summary=Red Hat OpenStack Platform 17.1 ovn-controller, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, name=rhosp17/openstack-ovn-controller, batch=17.1_20251118.1, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', 
'/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}) Dec 15 03:16:32 localhost podman[71344]: 2025-12-15 08:16:32.512305764 +0000 UTC m=+0.377565289 container start 4d35b7dc3f3a4e3c60661e85d3511f50804e87244d74cab67af5c6e8c447d379 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, vcs-type=git, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, io.openshift.expose-services=, container_name=ovn_metadata_agent, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, io.buildah.version=1.41.4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, config_id=tripleo_step4, version=17.1.12, managed_by=tripleo_ansible, vendor=Red Hat, Inc., release=1761123044, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'a56a6f14b467cd9064e40c03defa5ed7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, batch=17.1_20251118.1, build-date=2025-11-19T00:14:25Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, distribution-scope=public, architecture=x86_64, url=https://www.redhat.com, name=rhosp17/openstack-neutron-metadata-agent-ovn, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn) Dec 15 03:16:32 localhost python3[70450]: ansible-tripleo_container_manage PODMAN-CONTAINER-DEBUG: podman run --name ovn_metadata_agent --cgroupns=host --conmon-pidfile /run/ovn_metadata_agent.pid --detach=True --env KOLLA_CONFIG_STRATEGY=COPY_ALWAYS --env TRIPLEO_CONFIG_HASH=a56a6f14b467cd9064e40c03defa5ed7 --healthcheck-command /openstack/healthcheck --label config_id=tripleo_step4 --label container_name=ovn_metadata_agent --label managed_by=tripleo_ansible --label config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'a56a6f14b467cd9064e40c03defa5ed7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 
'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']} --log-driver k8s-file --log-opt path=/var/log/containers/stdouts/ovn_metadata_agent.log --network host --pid host --privileged=True --volume /etc/hosts:/etc/hosts:ro --volume /etc/localtime:/etc/localtime:ro --volume /etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro --volume /etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro --volume /etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro --volume /etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro --volume /etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro --volume /dev/log:/dev/log --volume /etc/puppet:/etc/puppet:ro --volume /var/log/containers/neutron:/var/log/neutron:z --volume /var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro --volume /var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro --volume /lib/modules:/lib/modules:ro --volume 
/run/openvswitch:/run/openvswitch:shared,z --volume /var/lib/neutron:/var/lib/neutron:shared,z --volume /run/netns:/run/netns:shared --volume /var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z --volume /var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1 Dec 15 03:16:32 localhost podman[71409]: 2025-12-15 08:16:32.556060546 +0000 UTC m=+0.087104434 container health_status 4d35b7dc3f3a4e3c60661e85d3511f50804e87244d74cab67af5c6e8c447d379 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=starting, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, vcs-type=git, build-date=2025-11-19T00:14:25Z, konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible, url=https://www.redhat.com, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, release=1761123044, vendor=Red Hat, Inc., batch=17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=ovn_metadata_agent, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, maintainer=OpenStack TripleO Team, architecture=x86_64, config_id=tripleo_step4, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.buildah.version=1.41.4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, version=17.1.12, config_data={'cgroupns': 
'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'a56a6f14b467cd9064e40c03defa5ed7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, name=rhosp17/openstack-neutron-metadata-agent-ovn) Dec 15 03:16:32 localhost podman[71382]: 2025-12-15 08:16:32.580621031 +0000 UTC m=+0.177238955 container exec_died 2df8d20c6ba32f38bd0e6519c9c77a4146b632f21c81197d8c319b223aad81f1 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, distribution-scope=public, com.redhat.component=openstack-ovn-controller-container, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_data={'depends_on': ['openvswitch.service'], 
'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, release=1761123044, config_id=tripleo_step4, vendor=Red Hat, Inc., url=https://www.redhat.com, batch=17.1_20251118.1, container_name=ovn_controller, io.buildah.version=1.41.4, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 ovn-controller, description=Red Hat OpenStack Platform 17.1 ovn-controller, managed_by=tripleo_ansible, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, architecture=x86_64, name=rhosp17/openstack-ovn-controller, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, version=17.1.12, tcib_managed=true, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-11-18T23:34:05Z, konflux.additional-tags=17.1.12 17.1_20251118.1) Dec 15 03:16:32 localhost podman[71382]: unhealthy Dec 15 03:16:32 localhost systemd[1]: 2df8d20c6ba32f38bd0e6519c9c77a4146b632f21c81197d8c319b223aad81f1.service: Main process exited, code=exited, status=1/FAILURE Dec 15 03:16:32 localhost systemd[1]: 2df8d20c6ba32f38bd0e6519c9c77a4146b632f21c81197d8c319b223aad81f1.service: Failed with result 'exit-code'. 
Dec 15 03:16:32 localhost podman[71409]: 2025-12-15 08:16:32.596473569 +0000 UTC m=+0.127517497 container exec_died 4d35b7dc3f3a4e3c60661e85d3511f50804e87244d74cab67af5c6e8c447d379 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, tcib_managed=true, konflux.additional-tags=17.1.12 17.1_20251118.1, release=1761123044, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, distribution-scope=public, batch=17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, vendor=Red Hat, Inc., vcs-type=git, config_id=tripleo_step4, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, name=rhosp17/openstack-neutron-metadata-agent-ovn, io.buildah.version=1.41.4, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, version=17.1.12, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'a56a6f14b467cd9064e40c03defa5ed7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, build-date=2025-11-19T00:14:25Z, container_name=ovn_metadata_agent, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, url=https://www.redhat.com) Dec 15 03:16:32 localhost podman[71409]: unhealthy Dec 15 03:16:32 localhost systemd[1]: 4d35b7dc3f3a4e3c60661e85d3511f50804e87244d74cab67af5c6e8c447d379.service: Main process exited, code=exited, status=1/FAILURE Dec 15 03:16:32 localhost systemd[1]: 4d35b7dc3f3a4e3c60661e85d3511f50804e87244d74cab67af5c6e8c447d379.service: Failed with result 'exit-code'. Dec 15 03:16:32 localhost systemd[71403]: Queued start job for default target Main User Target. Dec 15 03:16:32 localhost systemd[71403]: Created slice User Application Slice. Dec 15 03:16:32 localhost systemd[71403]: Mark boot as successful after the user session has run 2 minutes was skipped because of an unmet condition check (ConditionUser=!@system). Dec 15 03:16:32 localhost systemd[71403]: Started Daily Cleanup of User's Temporary Directories. Dec 15 03:16:32 localhost systemd[71403]: Reached target Paths. Dec 15 03:16:32 localhost systemd[71403]: Reached target Timers. 
Dec 15 03:16:32 localhost systemd[71403]: Starting D-Bus User Message Bus Socket... Dec 15 03:16:32 localhost systemd[71403]: Starting Create User's Volatile Files and Directories... Dec 15 03:16:32 localhost systemd[71403]: Finished Create User's Volatile Files and Directories. Dec 15 03:16:32 localhost systemd[71403]: Listening on D-Bus User Message Bus Socket. Dec 15 03:16:32 localhost systemd[71403]: Reached target Sockets. Dec 15 03:16:32 localhost systemd[71403]: Reached target Basic System. Dec 15 03:16:32 localhost systemd[71403]: Reached target Main User Target. Dec 15 03:16:32 localhost systemd[71403]: Startup finished in 196ms. Dec 15 03:16:32 localhost systemd[1]: Started User Manager for UID 0. Dec 15 03:16:32 localhost systemd[1]: Started Session c9 of User root. Dec 15 03:16:32 localhost systemd[1]: session-c9.scope: Deactivated successfully. Dec 15 03:16:32 localhost kernel: device br-int entered promiscuous mode Dec 15 03:16:32 localhost NetworkManager[5963]: [1765786592.8208] manager: (br-int): new Generic device (/org/freedesktop/NetworkManager/Devices/11) Dec 15 03:16:32 localhost systemd-udevd[71495]: Network interface NamePolicy= disabled on kernel command line. 
Dec 15 03:16:33 localhost python3[71515]: ansible-file Invoked with path=/etc/systemd/system/tripleo_ceilometer_agent_compute.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Dec 15 03:16:33 localhost python3[71531]: ansible-file Invoked with path=/etc/systemd/system/tripleo_ceilometer_agent_ipmi.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Dec 15 03:16:33 localhost python3[71547]: ansible-file Invoked with path=/etc/systemd/system/tripleo_logrotate_crond.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Dec 15 03:16:33 localhost kernel: device genev_sys_6081 entered promiscuous mode Dec 15 03:16:33 localhost NetworkManager[5963]: [1765786593.8435] device (genev_sys_6081): carrier: link connected Dec 15 03:16:33 localhost NetworkManager[5963]: [1765786593.8440] manager: (genev_sys_6081): new Generic device (/org/freedesktop/NetworkManager/Devices/12) Dec 15 03:16:34 localhost python3[71566]: ansible-file Invoked with path=/etc/systemd/system/tripleo_nova_migration_target.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S 
unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Dec 15 03:16:34 localhost python3[71584]: ansible-file Invoked with path=/etc/systemd/system/tripleo_ovn_controller.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Dec 15 03:16:34 localhost python3[71602]: ansible-file Invoked with path=/etc/systemd/system/tripleo_ovn_metadata_agent.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Dec 15 03:16:34 localhost python3[71618]: ansible-stat Invoked with path=/etc/systemd/system/tripleo_ceilometer_agent_compute_healthcheck.timer follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1 Dec 15 03:16:35 localhost python3[71636]: ansible-stat Invoked with path=/etc/systemd/system/tripleo_ceilometer_agent_ipmi_healthcheck.timer follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1 Dec 15 03:16:35 localhost python3[71654]: ansible-stat Invoked with path=/etc/systemd/system/tripleo_logrotate_crond_healthcheck.timer follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1 Dec 15 03:16:35 localhost python3[71670]: ansible-stat Invoked with path=/etc/systemd/system/tripleo_nova_migration_target_healthcheck.timer 
follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1 Dec 15 03:16:35 localhost python3[71686]: ansible-stat Invoked with path=/etc/systemd/system/tripleo_ovn_controller_healthcheck.timer follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1 Dec 15 03:16:36 localhost python3[71702]: ansible-stat Invoked with path=/etc/systemd/system/tripleo_ovn_metadata_agent_healthcheck.timer follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1 Dec 15 03:16:36 localhost python3[71763]: ansible-copy Invoked with src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1765786596.132337-108910-70019724778612/source dest=/etc/systemd/system/tripleo_ceilometer_agent_compute.service mode=0644 owner=root group=root backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None Dec 15 03:16:37 localhost sshd[71777]: main: sshd: ssh-rsa algorithm is disabled Dec 15 03:16:37 localhost python3[71793]: ansible-copy Invoked with src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1765786596.132337-108910-70019724778612/source dest=/etc/systemd/system/tripleo_ceilometer_agent_ipmi.service mode=0644 owner=root group=root backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None Dec 15 03:16:37 localhost python3[71823]: ansible-copy Invoked with src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1765786596.132337-108910-70019724778612/source dest=/etc/systemd/system/tripleo_logrotate_crond.service mode=0644 owner=root group=root backup=False force=True follow=False 
unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None Dec 15 03:16:38 localhost python3[71852]: ansible-copy Invoked with src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1765786596.132337-108910-70019724778612/source dest=/etc/systemd/system/tripleo_nova_migration_target.service mode=0644 owner=root group=root backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None Dec 15 03:16:38 localhost python3[71881]: ansible-copy Invoked with src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1765786596.132337-108910-70019724778612/source dest=/etc/systemd/system/tripleo_ovn_controller.service mode=0644 owner=root group=root backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None Dec 15 03:16:39 localhost python3[71910]: ansible-copy Invoked with src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1765786596.132337-108910-70019724778612/source dest=/etc/systemd/system/tripleo_ovn_metadata_agent.service mode=0644 owner=root group=root backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None Dec 15 03:16:39 localhost python3[71926]: ansible-systemd Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None Dec 15 03:16:39 localhost systemd[1]: 
Reloading. Dec 15 03:16:39 localhost systemd-sysv-generator[71952]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. Dec 15 03:16:39 localhost systemd-rc-local-generator[71947]: /etc/rc.d/rc.local is not marked executable, skipping. Dec 15 03:16:39 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Dec 15 03:16:40 localhost python3[71977]: ansible-systemd Invoked with state=restarted name=tripleo_ceilometer_agent_compute.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None Dec 15 03:16:40 localhost systemd[1]: Started /usr/bin/podman healthcheck run 6278d648aec9c9f12e0efaafa0d6ae5653eeb34459516699e562eeb9c8d23b3c. Dec 15 03:16:40 localhost systemd[1]: Reloading. Dec 15 03:16:40 localhost systemd-sysv-generator[72022]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. Dec 15 03:16:40 localhost systemd-rc-local-generator[72018]: /etc/rc.d/rc.local is not marked executable, skipping. 
Dec 15 03:16:40 localhost podman[71981]: 2025-12-15 08:16:40.889219155 +0000 UTC m=+0.111844645 container health_status 6278d648aec9c9f12e0efaafa0d6ae5653eeb34459516699e562eeb9c8d23b3c (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, maintainer=OpenStack TripleO Team, version=17.1.12, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '29d07da103bbd44a9ed3e29999314b03'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, batch=17.1_20251118.1, io.openshift.expose-services=, config_id=tripleo_step1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.component=openstack-qdrouterd-container, container_name=metrics_qdr, vcs-type=git, konflux.additional-tags=17.1.12 17.1_20251118.1, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, name=rhosp17/openstack-qdrouterd, tcib_managed=true, 
url=https://www.redhat.com, build-date=2025-11-18T22:49:46Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.buildah.version=1.41.4, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, release=1761123044, summary=Red Hat OpenStack Platform 17.1 qdrouterd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, description=Red Hat OpenStack Platform 17.1 qdrouterd, managed_by=tripleo_ansible) Dec 15 03:16:40 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Dec 15 03:16:41 localhost podman[71981]: 2025-12-15 08:16:41.061306295 +0000 UTC m=+0.283931855 container exec_died 6278d648aec9c9f12e0efaafa0d6ae5653eeb34459516699e562eeb9c8d23b3c (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, description=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, architecture=x86_64, maintainer=OpenStack TripleO Team, version=17.1.12, build-date=2025-11-18T22:49:46Z, io.openshift.expose-services=, batch=17.1_20251118.1, tcib_managed=true, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vendor=Red Hat, Inc., io.buildah.version=1.41.4, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, release=1761123044, vcs-type=git, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 qdrouterd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '29d07da103bbd44a9ed3e29999314b03'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, url=https://www.redhat.com, com.redhat.component=openstack-qdrouterd-container, config_id=tripleo_step1, name=rhosp17/openstack-qdrouterd, container_name=metrics_qdr, konflux.additional-tags=17.1.12 17.1_20251118.1) Dec 15 03:16:41 localhost systemd[1]: 6278d648aec9c9f12e0efaafa0d6ae5653eeb34459516699e562eeb9c8d23b3c.service: Deactivated successfully. Dec 15 03:16:41 localhost systemd[1]: Starting ceilometer_agent_compute container... Dec 15 03:16:41 localhost tripleo-start-podman-container[72046]: Creating additional drop-in dependency for "ceilometer_agent_compute" (d6041e5471624a495d5be42684f6049ccf71802a80d5760239330151d465d146) Dec 15 03:16:41 localhost systemd[1]: Reloading. 
Dec 15 03:16:41 localhost systemd-sysv-generator[72106]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. Dec 15 03:16:41 localhost systemd-rc-local-generator[72101]: /etc/rc.d/rc.local is not marked executable, skipping. Dec 15 03:16:41 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Dec 15 03:16:41 localhost systemd[1]: Started ceilometer_agent_compute container. Dec 15 03:16:42 localhost python3[72129]: ansible-systemd Invoked with state=restarted name=tripleo_ceilometer_agent_ipmi.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None Dec 15 03:16:42 localhost systemd[1]: Reloading. Dec 15 03:16:42 localhost systemd-rc-local-generator[72156]: /etc/rc.d/rc.local is not marked executable, skipping. Dec 15 03:16:42 localhost systemd-sysv-generator[72159]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. Dec 15 03:16:42 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Dec 15 03:16:42 localhost systemd[1]: Starting ceilometer_agent_ipmi container... Dec 15 03:16:42 localhost systemd[1]: Started ceilometer_agent_ipmi container. Dec 15 03:16:42 localhost systemd[1]: Stopping User Manager for UID 0... Dec 15 03:16:42 localhost systemd[71403]: Activating special unit Exit the Session... Dec 15 03:16:42 localhost systemd[71403]: Stopped target Main User Target. 
Dec 15 03:16:42 localhost systemd[71403]: Stopped target Basic System. Dec 15 03:16:42 localhost systemd[71403]: Stopped target Paths. Dec 15 03:16:42 localhost systemd[71403]: Stopped target Sockets. Dec 15 03:16:42 localhost systemd[71403]: Stopped target Timers. Dec 15 03:16:42 localhost systemd[71403]: Stopped Daily Cleanup of User's Temporary Directories. Dec 15 03:16:42 localhost systemd[71403]: Closed D-Bus User Message Bus Socket. Dec 15 03:16:42 localhost systemd[71403]: Stopped Create User's Volatile Files and Directories. Dec 15 03:16:42 localhost systemd[71403]: Removed slice User Application Slice. Dec 15 03:16:42 localhost systemd[71403]: Reached target Shutdown. Dec 15 03:16:42 localhost systemd[71403]: Finished Exit the Session. Dec 15 03:16:42 localhost systemd[71403]: Reached target Exit the Session. Dec 15 03:16:42 localhost systemd[1]: user@0.service: Deactivated successfully. Dec 15 03:16:42 localhost systemd[1]: Stopped User Manager for UID 0. Dec 15 03:16:42 localhost systemd[1]: Stopping User Runtime Directory /run/user/0... Dec 15 03:16:42 localhost systemd[1]: run-user-0.mount: Deactivated successfully. Dec 15 03:16:42 localhost systemd[1]: user-runtime-dir@0.service: Deactivated successfully. Dec 15 03:16:42 localhost systemd[1]: Stopped User Runtime Directory /run/user/0. Dec 15 03:16:42 localhost systemd[1]: Removed slice User Slice of UID 0. Dec 15 03:16:43 localhost python3[72197]: ansible-systemd Invoked with state=restarted name=tripleo_logrotate_crond.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None Dec 15 03:16:43 localhost systemd[1]: Reloading. Dec 15 03:16:43 localhost systemd-sysv-generator[72225]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. 
Dec 15 03:16:43 localhost systemd-rc-local-generator[72221]: /etc/rc.d/rc.local is not marked executable, skipping. Dec 15 03:16:43 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Dec 15 03:16:43 localhost systemd[1]: Starting logrotate_crond container... Dec 15 03:16:43 localhost systemd[1]: Started logrotate_crond container. Dec 15 03:16:44 localhost python3[72263]: ansible-systemd Invoked with state=restarted name=tripleo_nova_migration_target.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None Dec 15 03:16:44 localhost systemd[1]: Reloading. Dec 15 03:16:44 localhost systemd-rc-local-generator[72287]: /etc/rc.d/rc.local is not marked executable, skipping. Dec 15 03:16:44 localhost systemd-sysv-generator[72291]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. Dec 15 03:16:44 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Dec 15 03:16:45 localhost systemd[1]: Starting nova_migration_target container... Dec 15 03:16:45 localhost systemd[1]: Started nova_migration_target container. Dec 15 03:16:45 localhost python3[72330]: ansible-systemd Invoked with state=restarted name=tripleo_ovn_controller.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None Dec 15 03:16:45 localhost systemd[1]: Reloading. Dec 15 03:16:45 localhost systemd-rc-local-generator[72354]: /etc/rc.d/rc.local is not marked executable, skipping. 
Dec 15 03:16:45 localhost systemd-sysv-generator[72360]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. Dec 15 03:16:46 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Dec 15 03:16:46 localhost systemd[1]: Starting ovn_controller container... Dec 15 03:16:46 localhost tripleo-start-podman-container[72369]: Creating additional drop-in dependency for "ovn_controller" (2df8d20c6ba32f38bd0e6519c9c77a4146b632f21c81197d8c319b223aad81f1) Dec 15 03:16:46 localhost systemd[1]: Reloading. Dec 15 03:16:46 localhost systemd-rc-local-generator[72424]: /etc/rc.d/rc.local is not marked executable, skipping. Dec 15 03:16:46 localhost systemd-sysv-generator[72428]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. Dec 15 03:16:46 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Dec 15 03:16:46 localhost systemd[1]: Started ovn_controller container. Dec 15 03:16:47 localhost python3[72493]: ansible-systemd Invoked with state=restarted name=tripleo_ovn_metadata_agent.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None Dec 15 03:16:47 localhost systemd[1]: Reloading. Dec 15 03:16:47 localhost systemd-sysv-generator[72543]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. 
Please update package to include a native systemd unit file, in order to make it more safe and robust. Dec 15 03:16:47 localhost systemd-rc-local-generator[72537]: /etc/rc.d/rc.local is not marked executable, skipping. Dec 15 03:16:47 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Dec 15 03:16:47 localhost systemd[1]: Starting ovn_metadata_agent container... Dec 15 03:16:47 localhost systemd[1]: Started ovn_metadata_agent container. Dec 15 03:16:48 localhost python3[72611]: ansible-file Invoked with path=/var/lib/container-puppet/container-puppet-tasks4.json state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Dec 15 03:16:49 localhost python3[72732]: ansible-container_puppet_config Invoked with check_mode=False config_vol_prefix=/var/lib/config-data debug=True net_host=True no_archive=True puppet_config=/var/lib/container-puppet/container-puppet-tasks4.json short_hostname=np0005559462 step=4 update_config_hash_only=False Dec 15 03:16:50 localhost python3[72748]: ansible-file Invoked with path=/var/log/containers/stdouts state=directory owner=root group=root recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None setype=None attributes=None Dec 15 03:16:50 localhost python3[72764]: ansible-container_config_data Invoked with config_path=/var/lib/tripleo-config/container-puppet-config/step_4 config_pattern=container-puppet-*.json config_overrides={} 
debug=True Dec 15 03:16:55 localhost systemd[1]: Started /usr/bin/podman healthcheck run 165a6fd1f2b629d16da0798ec1c9bac41d302d530a0ffa668b2315a52d6aa04e. Dec 15 03:16:55 localhost systemd[1]: tmp-crun.tF4nET.mount: Deactivated successfully. Dec 15 03:16:55 localhost podman[72766]: 2025-12-15 08:16:55.764107508 +0000 UTC m=+0.094446387 container health_status 165a6fd1f2b629d16da0798ec1c9bac41d302d530a0ffa668b2315a52d6aa04e (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, release=1761123044, com.redhat.component=openstack-collectd-container, vcs-type=git, architecture=x86_64, build-date=2025-11-18T22:51:28Z, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 collectd, url=https://www.redhat.com, version=17.1.12, konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team, container_name=collectd, description=Red Hat OpenStack Platform 17.1 collectd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, batch=17.1_20251118.1, io.buildah.version=1.41.4, io.openshift.expose-services=, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, name=rhosp17/openstack-collectd, distribution-scope=public, vendor=Red Hat, Inc., tcib_managed=true, config_id=tripleo_step3, managed_by=tripleo_ansible) Dec 15 03:16:55 localhost podman[72766]: 2025-12-15 08:16:55.801516103 +0000 UTC m=+0.131854952 container exec_died 165a6fd1f2b629d16da0798ec1c9bac41d302d530a0ffa668b2315a52d6aa04e (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, io.buildah.version=1.41.4, container_name=collectd, description=Red Hat OpenStack Platform 17.1 collectd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, config_id=tripleo_step3, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.12, batch=17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 collectd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, build-date=2025-11-18T22:51:28Z, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 
'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, architecture=x86_64, name=rhosp17/openstack-collectd, tcib_managed=true, release=1761123044, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.component=openstack-collectd-container, vcs-type=git, url=https://www.redhat.com, vendor=Red Hat, Inc., distribution-scope=public, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05) Dec 15 03:16:55 localhost systemd[1]: 165a6fd1f2b629d16da0798ec1c9bac41d302d530a0ffa668b2315a52d6aa04e.service: 
Deactivated successfully. Dec 15 03:16:57 localhost systemd[1]: Started /usr/bin/podman healthcheck run 2649c3aec9af2f04509e1d248b1050b202fccc3ed2b86421c89a2a65376db057. Dec 15 03:16:57 localhost podman[72786]: 2025-12-15 08:16:57.776177569 +0000 UTC m=+0.107997380 container health_status 2649c3aec9af2f04509e1d248b1050b202fccc3ed2b86421c89a2a65376db057 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, name=rhosp17/openstack-iscsid, distribution-scope=public, io.buildah.version=1.41.4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-18T23:44:13Z, vcs-type=git, batch=17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 iscsid, managed_by=tripleo_ansible, version=17.1.12, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, description=Red Hat OpenStack Platform 17.1 iscsid, url=https://www.redhat.com, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, com.redhat.component=openstack-iscsid-container, vendor=Red Hat, Inc., io.openshift.expose-services=, container_name=iscsid, release=1761123044, config_id=tripleo_step3, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '182e509007ab5e6e5b2500a552cbd5ba'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid) Dec 15 03:16:57 localhost podman[72786]: 2025-12-15 08:16:57.810238331 +0000 UTC m=+0.142058192 container exec_died 2649c3aec9af2f04509e1d248b1050b202fccc3ed2b86421c89a2a65376db057 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, vcs-type=git, container_name=iscsid, distribution-scope=public, konflux.additional-tags=17.1.12 17.1_20251118.1, url=https://www.redhat.com, io.openshift.expose-services=, release=1761123044, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 iscsid, managed_by=tripleo_ansible, build-date=2025-11-18T23:44:13Z, io.buildah.version=1.41.4, summary=Red Hat OpenStack Platform 17.1 iscsid, tcib_managed=true, version=17.1.12, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, com.redhat.component=openstack-iscsid-container, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '182e509007ab5e6e5b2500a552cbd5ba'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 
'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, config_id=tripleo_step3, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, name=rhosp17/openstack-iscsid, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d) Dec 15 03:16:57 localhost systemd[1]: 2649c3aec9af2f04509e1d248b1050b202fccc3ed2b86421c89a2a65376db057.service: Deactivated successfully. Dec 15 03:16:58 localhost systemd[1]: Started /usr/bin/podman healthcheck run 97115ff7c5a84ae7e17f79e696e94da3c63f9fe0051287fe9193bec282a03f99. Dec 15 03:16:58 localhost systemd[1]: Started /usr/bin/podman healthcheck run ac45612fab357b615b4684dbfa5bd000dd29eb1adfbaedb6db0802fc49e89549. Dec 15 03:16:58 localhost systemd[1]: Started /usr/bin/podman healthcheck run d6041e5471624a495d5be42684f6049ccf71802a80d5760239330151d465d146. 
Dec 15 03:16:58 localhost podman[72808]: 2025-12-15 08:16:58.758468948 +0000 UTC m=+0.076102286 container health_status d6041e5471624a495d5be42684f6049ccf71802a80d5760239330151d465d146 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=starting, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, url=https://www.redhat.com, io.openshift.expose-services=, version=17.1.12, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'fcee5a4a91f85471fca7b61211375646'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, container_name=ceilometer_agent_compute, release=1761123044, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, architecture=x86_64, config_id=tripleo_step4, com.redhat.component=openstack-ceilometer-compute-container, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-type=git, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, build-date=2025-11-19T00:11:48Z, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.buildah.version=1.41.4, name=rhosp17/openstack-ceilometer-compute, tcib_managed=true) Dec 15 03:16:58 localhost podman[72808]: 2025-12-15 08:16:58.794376789 +0000 UTC m=+0.112010057 container exec_died d6041e5471624a495d5be42684f6049ccf71802a80d5760239330151d465d146 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'fcee5a4a91f85471fca7b61211375646'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.openshift.expose-services=, release=1761123044, build-date=2025-11-19T00:11:48Z, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, version=17.1.12, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., com.redhat.component=openstack-ceilometer-compute-container, architecture=x86_64, batch=17.1_20251118.1, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, distribution-scope=public, maintainer=OpenStack TripleO Team, container_name=ceilometer_agent_compute, vcs-type=git, url=https://www.redhat.com, managed_by=tripleo_ansible, config_id=tripleo_step4, name=rhosp17/openstack-ceilometer-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.buildah.version=1.41.4, konflux.additional-tags=17.1.12 17.1_20251118.1, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute) Dec 15 03:16:58 localhost podman[72805]: 2025-12-15 08:16:58.805289311 +0000 UTC m=+0.130020610 container health_status 97115ff7c5a84ae7e17f79e696e94da3c63f9fe0051287fe9193bec282a03f99 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=starting, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, 
batch=17.1_20251118.1, vcs-type=git, release=1761123044, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, io.openshift.expose-services=, config_id=tripleo_step4, container_name=ceilometer_agent_ipmi, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'fcee5a4a91f85471fca7b61211375646'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, version=17.1.12, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, build-date=2025-11-19T00:12:45Z, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, konflux.additional-tags=17.1.12 17.1_20251118.1, vendor=Red Hat, Inc., tcib_managed=true, com.redhat.component=openstack-ceilometer-ipmi-container, 
io.buildah.version=1.41.4, name=rhosp17/openstack-ceilometer-ipmi, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, architecture=x86_64, distribution-scope=public) Dec 15 03:16:58 localhost systemd[1]: d6041e5471624a495d5be42684f6049ccf71802a80d5760239330151d465d146.service: Deactivated successfully. Dec 15 03:16:58 localhost podman[72807]: 2025-12-15 08:16:58.874723289 +0000 UTC m=+0.194566307 container health_status ac45612fab357b615b4684dbfa5bd000dd29eb1adfbaedb6db0802fc49e89549 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-type=git, tcib_managed=true, version=17.1.12, build-date=2025-11-18T22:49:32Z, io.buildah.version=1.41.4, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, description=Red Hat OpenStack Platform 17.1 cron, com.redhat.component=openstack-cron-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, maintainer=OpenStack TripleO Team, container_name=logrotate_crond, url=https://www.redhat.com, vendor=Red Hat, Inc., vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, release=1761123044, config_id=tripleo_step4, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp17/openstack-cron, io.openshift.expose-services=, batch=17.1_20251118.1, konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 cron) Dec 15 03:16:58 localhost podman[72807]: 2025-12-15 08:16:58.888515247 +0000 UTC m=+0.208358305 container exec_died ac45612fab357b615b4684dbfa5bd000dd29eb1adfbaedb6db0802fc49e89549 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, release=1761123044, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, tcib_managed=true, batch=17.1_20251118.1, config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 cron, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, summary=Red Hat OpenStack Platform 17.1 cron, io.openshift.expose-services=, vendor=Red Hat, Inc., com.redhat.component=openstack-cron-container, name=rhosp17/openstack-cron, maintainer=OpenStack TripleO Team, build-date=2025-11-18T22:49:32Z, version=17.1.12, architecture=x86_64, container_name=logrotate_crond, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, io.buildah.version=1.41.4, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, konflux.additional-tags=17.1.12 17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-type=git, distribution-scope=public) Dec 15 03:16:58 localhost systemd[1]: ac45612fab357b615b4684dbfa5bd000dd29eb1adfbaedb6db0802fc49e89549.service: Deactivated successfully. 
Dec 15 03:16:58 localhost podman[72805]: 2025-12-15 08:16:58.942101962 +0000 UTC m=+0.266833271 container exec_died 97115ff7c5a84ae7e17f79e696e94da3c63f9fe0051287fe9193bec282a03f99 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, release=1761123044, io.openshift.expose-services=, container_name=ceilometer_agent_ipmi, architecture=x86_64, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.component=openstack-ceilometer-ipmi-container, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-type=git, name=rhosp17/openstack-ceilometer-ipmi, version=17.1.12, batch=17.1_20251118.1, managed_by=tripleo_ansible, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-11-19T00:12:45Z, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'fcee5a4a91f85471fca7b61211375646'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, vendor=Red Hat, Inc., io.buildah.version=1.41.4, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, config_id=tripleo_step4) Dec 15 03:16:58 localhost systemd[1]: 97115ff7c5a84ae7e17f79e696e94da3c63f9fe0051287fe9193bec282a03f99.service: Deactivated successfully. Dec 15 03:16:59 localhost systemd[1]: Started /usr/bin/podman healthcheck run 4c3f871f4bdf287b1d3ce717956c1cf130617489217410eadf6fffcc440eb3b0. Dec 15 03:16:59 localhost podman[72875]: 2025-12-15 08:16:59.743506041 +0000 UTC m=+0.078082940 container health_status 4c3f871f4bdf287b1d3ce717956c1cf130617489217410eadf6fffcc440eb3b0 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2025-11-19T00:36:58Z, tcib_managed=true, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.12, release=1761123044, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team, architecture=x86_64, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.expose-services=, batch=17.1_20251118.1, vcs-type=git, container_name=nova_migration_target, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, url=https://www.redhat.com, com.redhat.component=openstack-nova-compute-container, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '879500e96bf8dfb93687004bd86f2317'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, config_id=tripleo_step4, io.buildah.version=1.41.4, name=rhosp17/openstack-nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, managed_by=tripleo_ansible) Dec 15 03:16:59 localhost systemd[1]: tmp-crun.fSGxcX.mount: Deactivated successfully. 
Dec 15 03:17:00 localhost podman[72875]: 2025-12-15 08:17:00.10047069 +0000 UTC m=+0.435047559 container exec_died 4c3f871f4bdf287b1d3ce717956c1cf130617489217410eadf6fffcc440eb3b0 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, config_id=tripleo_step4, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.component=openstack-nova-compute-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public, tcib_managed=true, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, url=https://www.redhat.com, release=1761123044, vcs-type=git, io.openshift.expose-services=, version=17.1.12, build-date=2025-11-19T00:36:58Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, architecture=x86_64, maintainer=OpenStack TripleO Team, batch=17.1_20251118.1, container_name=nova_migration_target, io.buildah.version=1.41.4, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 nova-compute, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '879500e96bf8dfb93687004bd86f2317'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, managed_by=tripleo_ansible, name=rhosp17/openstack-nova-compute) Dec 15 03:17:00 localhost systemd[1]: 4c3f871f4bdf287b1d3ce717956c1cf130617489217410eadf6fffcc440eb3b0.service: Deactivated successfully. Dec 15 03:17:02 localhost systemd[1]: Started /usr/bin/podman healthcheck run 2df8d20c6ba32f38bd0e6519c9c77a4146b632f21c81197d8c319b223aad81f1. Dec 15 03:17:02 localhost systemd[1]: Started /usr/bin/podman healthcheck run 4d35b7dc3f3a4e3c60661e85d3511f50804e87244d74cab67af5c6e8c447d379. 
Dec 15 03:17:02 localhost podman[72899]: 2025-12-15 08:17:02.737082227 +0000 UTC m=+0.070146838 container health_status 2df8d20c6ba32f38bd0e6519c9c77a4146b632f21c81197d8c319b223aad81f1 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=starting, distribution-scope=public, io.buildah.version=1.41.4, summary=Red Hat OpenStack Platform 17.1 ovn-controller, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, url=https://www.redhat.com, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, name=rhosp17/openstack-ovn-controller, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, version=17.1.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, batch=17.1_20251118.1, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, description=Red Hat OpenStack Platform 17.1 ovn-controller, architecture=x86_64, com.redhat.component=openstack-ovn-controller-container, managed_by=tripleo_ansible, tcib_managed=true, config_id=tripleo_step4, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, container_name=ovn_controller, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, 
org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, release=1761123044, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, build-date=2025-11-18T23:34:05Z) Dec 15 03:17:02 localhost podman[72899]: 2025-12-15 08:17:02.763297899 +0000 UTC m=+0.096362460 container exec_died 2df8d20c6ba32f38bd0e6519c9c77a4146b632f21c81197d8c319b223aad81f1 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 ovn-controller, config_id=tripleo_step4, distribution-scope=public, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, release=1761123044, name=rhosp17/openstack-ovn-controller, version=17.1.12, io.buildah.version=1.41.4, tcib_managed=true, build-date=2025-11-18T23:34:05Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, container_name=ovn_controller, com.redhat.component=openstack-ovn-controller-container, url=https://www.redhat.com, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, description=Red Hat OpenStack Platform 17.1 ovn-controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, managed_by=tripleo_ansible, architecture=x86_64, maintainer=OpenStack TripleO Team, vcs-type=git) Dec 15 03:17:02 localhost systemd[1]: tmp-crun.mcpHVZ.mount: Deactivated successfully. Dec 15 03:17:02 localhost podman[72900]: 2025-12-15 08:17:02.804219363 +0000 UTC m=+0.132354512 container health_status 4d35b7dc3f3a4e3c60661e85d3511f50804e87244d74cab67af5c6e8c447d379 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=starting, container_name=ovn_metadata_agent, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, build-date=2025-11-19T00:14:25Z, io.openshift.expose-services=, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.12 17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.buildah.version=1.41.4, name=rhosp17/openstack-neutron-metadata-agent-ovn, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, architecture=x86_64, batch=17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, distribution-scope=public, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, config_id=tripleo_step4, release=1761123044, 
tcib_managed=true, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'a56a6f14b467cd9064e40c03defa5ed7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, version=17.1.12, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-type=git, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c) Dec 15 03:17:02 localhost systemd[1]: 2df8d20c6ba32f38bd0e6519c9c77a4146b632f21c81197d8c319b223aad81f1.service: Deactivated successfully. 
Dec 15 03:17:02 localhost podman[72900]: 2025-12-15 08:17:02.860261122 +0000 UTC m=+0.188396231 container exec_died 4d35b7dc3f3a4e3c60661e85d3511f50804e87244d74cab67af5c6e8c447d379 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, distribution-scope=public, vcs-type=git, io.openshift.expose-services=, vendor=Red Hat, Inc., version=17.1.12, architecture=x86_64, name=rhosp17/openstack-neutron-metadata-agent-ovn, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, config_id=tripleo_step4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-19T00:14:25Z, release=1761123044, konflux.additional-tags=17.1.12 17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.buildah.version=1.41.4, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'a56a6f14b467cd9064e40c03defa5ed7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, batch=17.1_20251118.1, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, managed_by=tripleo_ansible, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, container_name=ovn_metadata_agent) Dec 15 03:17:02 localhost systemd[1]: 4d35b7dc3f3a4e3c60661e85d3511f50804e87244d74cab67af5c6e8c447d379.service: Deactivated successfully. Dec 15 03:17:10 localhost snmpd[69387]: empty variable list in _query Dec 15 03:17:10 localhost snmpd[69387]: empty variable list in _query Dec 15 03:17:11 localhost systemd[1]: Started /usr/bin/podman healthcheck run 6278d648aec9c9f12e0efaafa0d6ae5653eeb34459516699e562eeb9c8d23b3c. 
Dec 15 03:17:11 localhost podman[72945]: 2025-12-15 08:17:11.76220262 +0000 UTC m=+0.096004159 container health_status 6278d648aec9c9f12e0efaafa0d6ae5653eeb34459516699e562eeb9c8d23b3c (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '29d07da103bbd44a9ed3e29999314b03'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, batch=17.1_20251118.1, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vendor=Red Hat, Inc., config_id=tripleo_step1, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.component=openstack-qdrouterd-container, build-date=2025-11-18T22:49:46Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, container_name=metrics_qdr, 
org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, name=rhosp17/openstack-qdrouterd, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, release=1761123044, tcib_managed=true, version=17.1.12, io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 qdrouterd, summary=Red Hat OpenStack Platform 17.1 qdrouterd, architecture=x86_64) Dec 15 03:17:11 localhost podman[72945]: 2025-12-15 08:17:11.956352014 +0000 UTC m=+0.290153493 container exec_died 6278d648aec9c9f12e0efaafa0d6ae5653eeb34459516699e562eeb9c8d23b3c (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, io.buildah.version=1.41.4, container_name=metrics_qdr, name=rhosp17/openstack-qdrouterd, distribution-scope=public, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, batch=17.1_20251118.1, managed_by=tripleo_ansible, version=17.1.12, build-date=2025-11-18T22:49:46Z, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '29d07da103bbd44a9ed3e29999314b03'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, release=1761123044, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.component=openstack-qdrouterd-container, summary=Red Hat OpenStack Platform 17.1 qdrouterd, config_id=tripleo_step1, tcib_managed=true, vendor=Red Hat, Inc., vcs-type=git, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, architecture=x86_64, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd) Dec 15 03:17:11 localhost systemd[1]: 6278d648aec9c9f12e0efaafa0d6ae5653eeb34459516699e562eeb9c8d23b3c.service: Deactivated successfully. Dec 15 03:17:26 localhost systemd[1]: Started /usr/bin/podman healthcheck run 165a6fd1f2b629d16da0798ec1c9bac41d302d530a0ffa668b2315a52d6aa04e. 
Dec 15 03:17:26 localhost podman[72974]: 2025-12-15 08:17:26.760507635 +0000 UTC m=+0.095195028 container health_status 165a6fd1f2b629d16da0798ec1c9bac41d302d530a0ffa668b2315a52d6aa04e (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, vcs-type=git, io.buildah.version=1.41.4, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, summary=Red Hat OpenStack Platform 17.1 collectd, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, distribution-scope=public, maintainer=OpenStack TripleO Team, 
com.redhat.component=openstack-collectd-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_id=tripleo_step3, build-date=2025-11-18T22:51:28Z, konflux.additional-tags=17.1.12 17.1_20251118.1, name=rhosp17/openstack-collectd, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, version=17.1.12, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, tcib_managed=true, managed_by=tripleo_ansible, release=1761123044, url=https://www.redhat.com, batch=17.1_20251118.1, container_name=collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, description=Red Hat OpenStack Platform 17.1 collectd) Dec 15 03:17:26 localhost podman[72974]: 2025-12-15 08:17:26.79847016 +0000 UTC m=+0.133157543 container exec_died 165a6fd1f2b629d16da0798ec1c9bac41d302d530a0ffa668b2315a52d6aa04e (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, config_id=tripleo_step3, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, konflux.additional-tags=17.1.12 17.1_20251118.1, container_name=collectd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.buildah.version=1.41.4, com.redhat.component=openstack-collectd-container, vendor=Red Hat, Inc., build-date=2025-11-18T22:51:28Z, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, description=Red Hat OpenStack Platform 17.1 collectd, architecture=x86_64, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, release=1761123044, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, 
io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, vcs-type=git, url=https://www.redhat.com, version=17.1.12, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, summary=Red Hat OpenStack Platform 17.1 collectd, distribution-scope=public, name=rhosp17/openstack-collectd, batch=17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, tcib_managed=true) Dec 15 03:17:26 localhost systemd[1]: 165a6fd1f2b629d16da0798ec1c9bac41d302d530a0ffa668b2315a52d6aa04e.service: Deactivated successfully. Dec 15 03:17:28 localhost systemd[1]: Started /usr/bin/podman healthcheck run 2649c3aec9af2f04509e1d248b1050b202fccc3ed2b86421c89a2a65376db057. 
Dec 15 03:17:28 localhost podman[72994]: 2025-12-15 08:17:28.737084183 +0000 UTC m=+0.065950445 container health_status 2649c3aec9af2f04509e1d248b1050b202fccc3ed2b86421c89a2a65376db057 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1761123044, description=Red Hat OpenStack Platform 17.1 iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, architecture=x86_64, tcib_managed=true, batch=17.1_20251118.1, build-date=2025-11-18T23:44:13Z, managed_by=tripleo_ansible, name=rhosp17/openstack-iscsid, com.redhat.component=openstack-iscsid-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_id=tripleo_step3, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vendor=Red Hat, Inc., org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, io.buildah.version=1.41.4, distribution-scope=public, vcs-type=git, version=17.1.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, url=https://www.redhat.com, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '182e509007ab5e6e5b2500a552cbd5ba'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, container_name=iscsid, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d) Dec 15 03:17:28 localhost podman[72994]: 2025-12-15 08:17:28.774531866 +0000 UTC m=+0.103398138 container exec_died 2649c3aec9af2f04509e1d248b1050b202fccc3ed2b86421c89a2a65376db057 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, vcs-type=git, konflux.additional-tags=17.1.12 17.1_20251118.1, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, build-date=2025-11-18T23:44:13Z, config_id=tripleo_step3, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '182e509007ab5e6e5b2500a552cbd5ba'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, container_name=iscsid, name=rhosp17/openstack-iscsid, maintainer=OpenStack TripleO Team, distribution-scope=public, managed_by=tripleo_ansible, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 iscsid, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, release=1761123044, tcib_managed=true, url=https://www.redhat.com, batch=17.1_20251118.1, com.redhat.component=openstack-iscsid-container, version=17.1.12, summary=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, vendor=Red Hat, Inc., io.buildah.version=1.41.4) Dec 15 03:17:28 localhost systemd[1]: 2649c3aec9af2f04509e1d248b1050b202fccc3ed2b86421c89a2a65376db057.service: Deactivated successfully. Dec 15 03:17:29 localhost systemd[1]: Started /usr/bin/podman healthcheck run 97115ff7c5a84ae7e17f79e696e94da3c63f9fe0051287fe9193bec282a03f99. Dec 15 03:17:29 localhost systemd[1]: Started /usr/bin/podman healthcheck run ac45612fab357b615b4684dbfa5bd000dd29eb1adfbaedb6db0802fc49e89549. Dec 15 03:17:29 localhost systemd[1]: Started /usr/bin/podman healthcheck run d6041e5471624a495d5be42684f6049ccf71802a80d5760239330151d465d146. Dec 15 03:17:29 localhost systemd[1]: tmp-crun.LlW6JQ.mount: Deactivated successfully. 
Dec 15 03:17:29 localhost podman[73013]: 2025-12-15 08:17:29.760365829 +0000 UTC m=+0.085924010 container health_status 97115ff7c5a84ae7e17f79e696e94da3c63f9fe0051287fe9193bec282a03f99 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, version=17.1.12, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, batch=17.1_20251118.1, tcib_managed=true, vcs-type=git, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, architecture=x86_64, com.redhat.component=openstack-ceilometer-ipmi-container, config_id=tripleo_step4, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'fcee5a4a91f85471fca7b61211375646'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-11-19T00:12:45Z, managed_by=tripleo_ansible, name=rhosp17/openstack-ceilometer-ipmi, container_name=ceilometer_agent_ipmi, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, release=1761123044, io.openshift.expose-services=, url=https://www.redhat.com) Dec 15 03:17:29 localhost systemd[1]: tmp-crun.osF3kt.mount: Deactivated successfully. Dec 15 03:17:29 localhost podman[73013]: 2025-12-15 08:17:29.820937869 +0000 UTC m=+0.146496040 container exec_died 97115ff7c5a84ae7e17f79e696e94da3c63f9fe0051287fe9193bec282a03f99 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, architecture=x86_64, name=rhosp17/openstack-ceilometer-ipmi, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, url=https://www.redhat.com, managed_by=tripleo_ansible, version=17.1.12, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=ceilometer_agent_ipmi, distribution-scope=public, batch=17.1_20251118.1, config_id=tripleo_step4, release=1761123044, vcs-type=git, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., com.redhat.component=openstack-ceilometer-ipmi-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'fcee5a4a91f85471fca7b61211375646'}, 'healthcheck': {'test': 
'/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, tcib_managed=true, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.expose-services=, build-date=2025-11-19T00:12:45Z, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi) Dec 15 03:17:29 localhost systemd[1]: 97115ff7c5a84ae7e17f79e696e94da3c63f9fe0051287fe9193bec282a03f99.service: Deactivated successfully. 
Dec 15 03:17:29 localhost podman[73014]: 2025-12-15 08:17:29.823139558 +0000 UTC m=+0.146051828 container health_status ac45612fab357b615b4684dbfa5bd000dd29eb1adfbaedb6db0802fc49e89549 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, version=17.1.12, batch=17.1_20251118.1, release=1761123044, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, container_name=logrotate_crond, url=https://www.redhat.com, tcib_managed=true, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, vendor=Red Hat, Inc., config_id=tripleo_step4, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, 
vcs-type=git, build-date=2025-11-18T22:49:32Z, summary=Red Hat OpenStack Platform 17.1 cron, distribution-scope=public, io.buildah.version=1.41.4, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, name=rhosp17/openstack-cron, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, managed_by=tripleo_ansible, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 cron, com.redhat.component=openstack-cron-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron) Dec 15 03:17:29 localhost podman[73015]: 2025-12-15 08:17:29.906330704 +0000 UTC m=+0.226260024 container health_status d6041e5471624a495d5be42684f6049ccf71802a80d5760239330151d465d146 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, architecture=x86_64, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, container_name=ceilometer_agent_compute, url=https://www.redhat.com, batch=17.1_20251118.1, release=1761123044, config_id=tripleo_step4, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, tcib_managed=true, vendor=Red Hat, Inc., io.buildah.version=1.41.4, vcs-type=git, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, managed_by=tripleo_ansible, version=17.1.12, maintainer=OpenStack TripleO Team, build-date=2025-11-19T00:11:48Z, name=rhosp17/openstack-ceilometer-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, com.redhat.component=openstack-ceilometer-compute-container, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 
'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'fcee5a4a91f85471fca7b61211375646'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI) Dec 15 03:17:29 localhost podman[73015]: 2025-12-15 08:17:29.938390111 +0000 UTC m=+0.258319521 container exec_died d6041e5471624a495d5be42684f6049ccf71802a80d5760239330151d465d146 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, name=rhosp17/openstack-ceilometer-compute, batch=17.1_20251118.1, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'fcee5a4a91f85471fca7b61211375646'}, 'healthcheck': {'test': 
'/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, container_name=ceilometer_agent_compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, release=1761123044, distribution-scope=public, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, version=17.1.12, com.redhat.component=openstack-ceilometer-compute-container, url=https://www.redhat.com, io.buildah.version=1.41.4, vendor=Red Hat, Inc., tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, config_id=tripleo_step4, build-date=2025-11-19T00:11:48Z, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, managed_by=tripleo_ansible, vcs-type=git, architecture=x86_64) Dec 15 03:17:29 
localhost systemd[1]: d6041e5471624a495d5be42684f6049ccf71802a80d5760239330151d465d146.service: Deactivated successfully. Dec 15 03:17:29 localhost podman[73014]: 2025-12-15 08:17:29.958098429 +0000 UTC m=+0.281010759 container exec_died ac45612fab357b615b4684dbfa5bd000dd29eb1adfbaedb6db0802fc49e89549 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, description=Red Hat OpenStack Platform 17.1 cron, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, name=rhosp17/openstack-cron, summary=Red Hat OpenStack Platform 17.1 cron, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, batch=17.1_20251118.1, release=1761123044, maintainer=OpenStack TripleO Team, 
distribution-scope=public, architecture=x86_64, vcs-type=git, vendor=Red Hat, Inc., config_id=tripleo_step4, version=17.1.12, com.redhat.component=openstack-cron-container, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, io.buildah.version=1.41.4, io.openshift.expose-services=, build-date=2025-11-18T22:49:32Z, url=https://www.redhat.com, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, managed_by=tripleo_ansible, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=logrotate_crond) Dec 15 03:17:29 localhost systemd[1]: ac45612fab357b615b4684dbfa5bd000dd29eb1adfbaedb6db0802fc49e89549.service: Deactivated successfully. Dec 15 03:17:30 localhost systemd[1]: Started /usr/bin/podman healthcheck run 4c3f871f4bdf287b1d3ce717956c1cf130617489217410eadf6fffcc440eb3b0. Dec 15 03:17:30 localhost podman[73087]: 2025-12-15 08:17:30.748236987 +0000 UTC m=+0.078105761 container health_status 4c3f871f4bdf287b1d3ce717956c1cf130617489217410eadf6fffcc440eb3b0 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, konflux.additional-tags=17.1.12 17.1_20251118.1, io.buildah.version=1.41.4, vcs-type=git, config_id=tripleo_step4, io.openshift.expose-services=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '879500e96bf8dfb93687004bd86f2317'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, batch=17.1_20251118.1, version=17.1.12, com.redhat.component=openstack-nova-compute-container, summary=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, name=rhosp17/openstack-nova-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public, maintainer=OpenStack TripleO Team, release=1761123044, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-11-19T00:36:58Z, architecture=x86_64, container_name=nova_migration_target, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, url=https://www.redhat.com) Dec 15 03:17:31 localhost podman[73087]: 2025-12-15 08:17:31.090317128 +0000 UTC m=+0.420185882 container exec_died 4c3f871f4bdf287b1d3ce717956c1cf130617489217410eadf6fffcc440eb3b0 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 
nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-11-19T00:36:58Z, managed_by=tripleo_ansible, vcs-type=git, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 nova-compute, distribution-scope=public, tcib_managed=true, url=https://www.redhat.com, container_name=nova_migration_target, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, version=17.1.12, io.buildah.version=1.41.4, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '879500e96bf8dfb93687004bd86f2317'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, architecture=x86_64, name=rhosp17/openstack-nova-compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 
nova-compute, config_id=tripleo_step4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, com.redhat.component=openstack-nova-compute-container, release=1761123044, batch=17.1_20251118.1) Dec 15 03:17:31 localhost systemd[1]: 4c3f871f4bdf287b1d3ce717956c1cf130617489217410eadf6fffcc440eb3b0.service: Deactivated successfully. Dec 15 03:17:33 localhost systemd[1]: Started /usr/bin/podman healthcheck run 2df8d20c6ba32f38bd0e6519c9c77a4146b632f21c81197d8c319b223aad81f1. Dec 15 03:17:33 localhost systemd[1]: Started /usr/bin/podman healthcheck run 4d35b7dc3f3a4e3c60661e85d3511f50804e87244d74cab67af5c6e8c447d379. Dec 15 03:17:33 localhost podman[73112]: 2025-12-15 08:17:33.758147771 +0000 UTC m=+0.082762056 container health_status 2df8d20c6ba32f38bd0e6519c9c77a4146b632f21c81197d8c319b223aad81f1 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, com.redhat.component=openstack-ovn-controller-container, io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, io.openshift.expose-services=, version=17.1.12, name=rhosp17/openstack-ovn-controller, config_id=tripleo_step4, container_name=ovn_controller, url=https://www.redhat.com, release=1761123044, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, distribution-scope=public, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, tcib_managed=true, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, summary=Red Hat OpenStack Platform 17.1 ovn-controller, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, batch=17.1_20251118.1, build-date=2025-11-18T23:34:05Z, konflux.additional-tags=17.1.12 17.1_20251118.1, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, vcs-type=git, description=Red Hat OpenStack Platform 17.1 ovn-controller) Dec 15 03:17:33 localhost podman[73112]: 2025-12-15 08:17:33.783277913 +0000 UTC m=+0.107892188 container exec_died 2df8d20c6ba32f38bd0e6519c9c77a4146b632f21c81197d8c319b223aad81f1 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, build-date=2025-11-18T23:34:05Z, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, summary=Red Hat OpenStack Platform 17.1 ovn-controller, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, tcib_managed=true, config_id=tripleo_step4, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, name=rhosp17/openstack-ovn-controller, description=Red Hat OpenStack Platform 17.1 ovn-controller, distribution-scope=public, managed_by=tripleo_ansible, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 
'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, batch=17.1_20251118.1, io.openshift.expose-services=, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, io.buildah.version=1.41.4, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, container_name=ovn_controller, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, release=1761123044, version=17.1.12, url=https://www.redhat.com, com.redhat.component=openstack-ovn-controller-container, architecture=x86_64, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272) Dec 15 03:17:33 localhost systemd[1]: 2df8d20c6ba32f38bd0e6519c9c77a4146b632f21c81197d8c319b223aad81f1.service: Deactivated successfully. Dec 15 03:17:33 localhost systemd[1]: tmp-crun.CIrskV.mount: Deactivated successfully. 
Dec 15 03:17:33 localhost podman[73113]: 2025-12-15 08:17:33.873751053 +0000 UTC m=+0.193803155 container health_status 4d35b7dc3f3a4e3c60661e85d3511f50804e87244d74cab67af5c6e8c447d379 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, build-date=2025-11-19T00:14:25Z, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'a56a6f14b467cd9064e40c03defa5ed7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, tcib_managed=true, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, batch=17.1_20251118.1, architecture=x86_64, io.openshift.expose-services=, 
description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, url=https://www.redhat.com, konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vendor=Red Hat, Inc., version=17.1.12, io.buildah.version=1.41.4, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-type=git, container_name=ovn_metadata_agent, name=rhosp17/openstack-neutron-metadata-agent-ovn, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, release=1761123044, config_id=tripleo_step4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05) Dec 15 03:17:33 localhost podman[73113]: 2025-12-15 08:17:33.932363721 +0000 UTC m=+0.252415843 container exec_died 4d35b7dc3f3a4e3c60661e85d3511f50804e87244d74cab67af5c6e8c447d379 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, name=rhosp17/openstack-neutron-metadata-agent-ovn, release=1761123044, architecture=x86_64, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, container_name=ovn_metadata_agent, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, url=https://www.redhat.com, config_id=tripleo_step4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, 
cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, batch=17.1_20251118.1, vendor=Red Hat, Inc., tcib_managed=true, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, version=17.1.12, distribution-scope=public, build-date=2025-11-19T00:14:25Z, io.buildah.version=1.41.4, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'a56a6f14b467cd9064e40c03defa5ed7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, 
konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team, io.openshift.expose-services=) Dec 15 03:17:33 localhost systemd[1]: 4d35b7dc3f3a4e3c60661e85d3511f50804e87244d74cab67af5c6e8c447d379.service: Deactivated successfully. Dec 15 03:17:42 localhost systemd[1]: Started /usr/bin/podman healthcheck run 6278d648aec9c9f12e0efaafa0d6ae5653eeb34459516699e562eeb9c8d23b3c. Dec 15 03:17:42 localhost podman[73160]: 2025-12-15 08:17:42.758854925 +0000 UTC m=+0.087315027 container health_status 6278d648aec9c9f12e0efaafa0d6ae5653eeb34459516699e562eeb9c8d23b3c (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, managed_by=tripleo_ansible, batch=17.1_20251118.1, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '29d07da103bbd44a9ed3e29999314b03'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, url=https://www.redhat.com, io.openshift.expose-services=, tcib_managed=true, config_id=tripleo_step1, 
io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public, release=1761123044, build-date=2025-11-18T22:49:46Z, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, version=17.1.12, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, summary=Red Hat OpenStack Platform 17.1 qdrouterd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=metrics_qdr, description=Red Hat OpenStack Platform 17.1 qdrouterd, name=rhosp17/openstack-qdrouterd, architecture=x86_64, vcs-type=git, konflux.additional-tags=17.1.12 17.1_20251118.1, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.component=openstack-qdrouterd-container) Dec 15 03:17:42 localhost podman[73160]: 2025-12-15 08:17:42.977433182 +0000 UTC m=+0.305893254 container exec_died 6278d648aec9c9f12e0efaafa0d6ae5653eeb34459516699e562eeb9c8d23b3c (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, vcs-type=git, config_id=tripleo_step1, managed_by=tripleo_ansible, version=17.1.12, distribution-scope=public, container_name=metrics_qdr, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.component=openstack-qdrouterd-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, name=rhosp17/openstack-qdrouterd, summary=Red Hat OpenStack Platform 17.1 qdrouterd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., io.buildah.version=1.41.4, batch=17.1_20251118.1, build-date=2025-11-18T22:49:46Z, konflux.additional-tags=17.1.12 17.1_20251118.1, release=1761123044, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, description=Red Hat OpenStack Platform 17.1 qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '29d07da103bbd44a9ed3e29999314b03'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, architecture=x86_64, tcib_managed=true, io.openshift.expose-services=, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a) Dec 15 03:17:42 localhost systemd[1]: 6278d648aec9c9f12e0efaafa0d6ae5653eeb34459516699e562eeb9c8d23b3c.service: Deactivated successfully. Dec 15 03:17:49 localhost systemd[1]: tmp-crun.VFyK6y.mount: Deactivated successfully. 
Dec 15 03:17:49 localhost podman[73291]: 2025-12-15 08:17:49.328836259 +0000 UTC m=+0.097029727 container exec 8dcda56b365b42dc8758aab77a9ec80db304780e449052738f7e4e648ae1ecaf (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-bce17446-41b5-5408-a23e-0b011906b44a-crash-np0005559462, com.redhat.component=rhceph-container, io.buildah.version=1.41.4, distribution-scope=public, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, release=1763362218, name=rhceph, build-date=2025-11-26T19:44:28Z, RELEASE=main, version=7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.openshift.tags=rhceph ceph, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_REPO=https://github.com/ceph/ceph-container.git, io.k8s.description=Red Hat Ceph Storage 7, vendor=Red Hat, Inc., GIT_CLEAN=True, maintainer=Guillaume Abrioux , CEPH_POINT_RELEASE=, architecture=x86_64, ceph=True, url=https://catalog.redhat.com/en/search?searchType=containers, description=Red Hat Ceph Storage 7, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, io.openshift.expose-services=, GIT_BRANCH=main, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, vcs-type=git) Dec 15 03:17:49 localhost podman[73291]: 2025-12-15 08:17:49.453480564 +0000 UTC m=+0.221674022 container exec_died 8dcda56b365b42dc8758aab77a9ec80db304780e449052738f7e4e648ae1ecaf (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-bce17446-41b5-5408-a23e-0b011906b44a-crash-np0005559462, architecture=x86_64, io.openshift.tags=rhceph ceph, io.k8s.description=Red Hat Ceph Storage 7, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, GIT_CLEAN=True, GIT_BRANCH=main, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, maintainer=Guillaume Abrioux , release=1763362218, 
vcs-type=git, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, version=7, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.buildah.version=1.41.4, vendor=Red Hat, Inc., GIT_REPO=https://github.com/ceph/ceph-container.git, distribution-scope=public, ceph=True, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., CEPH_POINT_RELEASE=, RELEASE=main, build-date=2025-11-26T19:44:28Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, name=rhceph, description=Red Hat Ceph Storage 7, com.redhat.component=rhceph-container) Dec 15 03:17:57 localhost systemd[1]: Started /usr/bin/podman healthcheck run 165a6fd1f2b629d16da0798ec1c9bac41d302d530a0ffa668b2315a52d6aa04e. Dec 15 03:17:57 localhost systemd[1]: tmp-crun.dNugSg.mount: Deactivated successfully. 
Dec 15 03:17:57 localhost podman[73435]: 2025-12-15 08:17:57.773953621 +0000 UTC m=+0.101765064 container health_status 165a6fd1f2b629d16da0798ec1c9bac41d302d530a0ffa668b2315a52d6aa04e (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, description=Red Hat OpenStack Platform 17.1 collectd, release=1761123044, distribution-scope=public, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, version=17.1.12, name=rhosp17/openstack-collectd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_id=tripleo_step3, url=https://www.redhat.com, io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1, container_name=collectd, com.redhat.component=openstack-collectd-container, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, batch=17.1_20251118.1, vendor=Red Hat, Inc., io.buildah.version=1.41.4, build-date=2025-11-18T22:51:28Z, tcib_managed=true, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, summary=Red Hat OpenStack Platform 17.1 collectd, vcs-type=git, architecture=x86_64) Dec 15 03:17:57 localhost podman[73435]: 2025-12-15 08:17:57.791523431 +0000 UTC m=+0.119334884 container exec_died 165a6fd1f2b629d16da0798ec1c9bac41d302d530a0ffa668b2315a52d6aa04e (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, tcib_managed=true, batch=17.1_20251118.1, vendor=Red Hat, Inc., com.redhat.component=openstack-collectd-container, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, distribution-scope=public, version=17.1.12, name=rhosp17/openstack-collectd, architecture=x86_64, io.openshift.expose-services=, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', 
'/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 collectd, konflux.additional-tags=17.1.12 17.1_20251118.1, build-date=2025-11-18T22:51:28Z, vcs-type=git, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 collectd, config_id=tripleo_step3, container_name=collectd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, release=1761123044) Dec 15 03:17:57 localhost systemd[1]: 165a6fd1f2b629d16da0798ec1c9bac41d302d530a0ffa668b2315a52d6aa04e.service: Deactivated successfully. Dec 15 03:17:59 localhost systemd[1]: Started /usr/bin/podman healthcheck run 2649c3aec9af2f04509e1d248b1050b202fccc3ed2b86421c89a2a65376db057. 
Dec 15 03:17:59 localhost podman[73455]: 2025-12-15 08:17:59.752503413 +0000 UTC m=+0.081140392 container health_status 2649c3aec9af2f04509e1d248b1050b202fccc3ed2b86421c89a2a65376db057 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 iscsid, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, release=1761123044, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, version=17.1.12, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, architecture=x86_64, com.redhat.component=openstack-iscsid-container, name=rhosp17/openstack-iscsid, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, maintainer=OpenStack TripleO Team, distribution-scope=public, io.buildah.version=1.41.4, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, batch=17.1_20251118.1, build-date=2025-11-18T23:44:13Z, config_id=tripleo_step3, io.openshift.expose-services=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '182e509007ab5e6e5b2500a552cbd5ba'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, container_name=iscsid, url=https://www.redhat.com, tcib_managed=true) Dec 15 03:17:59 localhost podman[73455]: 2025-12-15 08:17:59.761408641 +0000 UTC m=+0.090045650 container exec_died 2649c3aec9af2f04509e1d248b1050b202fccc3ed2b86421c89a2a65376db057 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, config_id=tripleo_step3, url=https://www.redhat.com, version=17.1.12, release=1761123044, vendor=Red Hat, Inc., name=rhosp17/openstack-iscsid, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, build-date=2025-11-18T23:44:13Z, summary=Red Hat OpenStack Platform 17.1 iscsid, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, architecture=x86_64, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '182e509007ab5e6e5b2500a552cbd5ba'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, distribution-scope=public, io.buildah.version=1.41.4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, description=Red Hat OpenStack Platform 17.1 iscsid, batch=17.1_20251118.1, managed_by=tripleo_ansible, container_name=iscsid, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-iscsid-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid) Dec 15 03:17:59 localhost systemd[1]: 2649c3aec9af2f04509e1d248b1050b202fccc3ed2b86421c89a2a65376db057.service: Deactivated successfully. Dec 15 03:18:00 localhost sshd[73474]: main: sshd: ssh-rsa algorithm is disabled Dec 15 03:18:00 localhost systemd[1]: Started /usr/bin/podman healthcheck run 97115ff7c5a84ae7e17f79e696e94da3c63f9fe0051287fe9193bec282a03f99. Dec 15 03:18:00 localhost systemd[1]: Started /usr/bin/podman healthcheck run ac45612fab357b615b4684dbfa5bd000dd29eb1adfbaedb6db0802fc49e89549. Dec 15 03:18:00 localhost systemd[1]: Started /usr/bin/podman healthcheck run d6041e5471624a495d5be42684f6049ccf71802a80d5760239330151d465d146. 
Dec 15 03:18:00 localhost systemd[1]: tmp-crun.F8MQTw.mount: Deactivated successfully. Dec 15 03:18:00 localhost podman[73477]: 2025-12-15 08:18:00.773141778 +0000 UTC m=+0.092778863 container health_status d6041e5471624a495d5be42684f6049ccf71802a80d5760239330151d465d146 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, build-date=2025-11-19T00:11:48Z, container_name=ceilometer_agent_compute, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.12 17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, name=rhosp17/openstack-ceilometer-compute, batch=17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, vcs-type=git, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, release=1761123044, io.openshift.expose-services=, distribution-scope=public, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, version=17.1.12, config_id=tripleo_step4, vendor=Red Hat, Inc., config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'fcee5a4a91f85471fca7b61211375646'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.component=openstack-ceilometer-compute-container, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, tcib_managed=true, io.buildah.version=1.41.4) Dec 15 03:18:00 localhost podman[73477]: 2025-12-15 08:18:00.804448686 +0000 UTC m=+0.124085771 container exec_died d6041e5471624a495d5be42684f6049ccf71802a80d5760239330151d465d146 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, io.openshift.expose-services=, vendor=Red Hat, Inc., tcib_managed=true, batch=17.1_20251118.1, architecture=x86_64, build-date=2025-11-19T00:11:48Z, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, container_name=ceilometer_agent_compute, url=https://www.redhat.com, name=rhosp17/openstack-ceilometer-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, release=1761123044, konflux.additional-tags=17.1.12 
17.1_20251118.1, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-ceilometer-compute-container, config_id=tripleo_step4, io.buildah.version=1.41.4, maintainer=OpenStack TripleO Team, vcs-type=git, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'fcee5a4a91f85471fca7b61211375646'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, version=17.1.12, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676) Dec 15 03:18:00 localhost systemd[1]: d6041e5471624a495d5be42684f6049ccf71802a80d5760239330151d465d146.service: Deactivated successfully. Dec 15 03:18:01 localhost systemd[1]: tmp-crun.KDlufY.mount: Deactivated successfully. 
Dec 15 03:18:01 localhost podman[73475]: 2025-12-15 08:18:01.069917758 +0000 UTC m=+0.396394335 container health_status 97115ff7c5a84ae7e17f79e696e94da3c63f9fe0051287fe9193bec282a03f99 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, distribution-scope=public, url=https://www.redhat.com, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'fcee5a4a91f85471fca7b61211375646'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, version=17.1.12, config_id=tripleo_step4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.expose-services=, release=1761123044, batch=17.1_20251118.1, vendor=Red Hat, Inc., architecture=x86_64, 
name=rhosp17/openstack-ceilometer-ipmi, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-type=git, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, managed_by=tripleo_ansible, com.redhat.component=openstack-ceilometer-ipmi-container, build-date=2025-11-19T00:12:45Z, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, tcib_managed=true, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, container_name=ceilometer_agent_ipmi, maintainer=OpenStack TripleO Team) Dec 15 03:18:01 localhost systemd[1]: Started /usr/bin/podman healthcheck run 4c3f871f4bdf287b1d3ce717956c1cf130617489217410eadf6fffcc440eb3b0. Dec 15 03:18:01 localhost podman[73476]: 2025-12-15 08:18:01.130789577 +0000 UTC m=+0.452105266 container health_status ac45612fab357b615b4684dbfa5bd000dd29eb1adfbaedb6db0802fc49e89549 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, distribution-scope=public, url=https://www.redhat.com, com.redhat.component=openstack-cron-container, io.buildah.version=1.41.4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, maintainer=OpenStack TripleO Team, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, managed_by=tripleo_ansible, architecture=x86_64, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, konflux.additional-tags=17.1.12 17.1_20251118.1, tcib_managed=true, container_name=logrotate_crond, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, summary=Red Hat OpenStack Platform 17.1 cron, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step4, version=17.1.12, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, release=1761123044, build-date=2025-11-18T22:49:32Z, description=Red Hat OpenStack Platform 17.1 cron, batch=17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vendor=Red Hat, Inc., io.openshift.expose-services=, name=rhosp17/openstack-cron, vcs-type=git) Dec 15 03:18:01 localhost podman[73476]: 2025-12-15 08:18:01.139064138 +0000 UTC m=+0.460379827 container exec_died ac45612fab357b615b4684dbfa5bd000dd29eb1adfbaedb6db0802fc49e89549 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, url=https://www.redhat.com, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, container_name=logrotate_crond, tcib_managed=true, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, description=Red Hat OpenStack Platform 17.1 cron, name=rhosp17/openstack-cron, version=17.1.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, batch=17.1_20251118.1, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 cron, com.redhat.component=openstack-cron-container, io.buildah.version=1.41.4, config_id=tripleo_step4, maintainer=OpenStack TripleO Team, build-date=2025-11-18T22:49:32Z, vendor=Red Hat, Inc., io.openshift.expose-services=, vcs-type=git, release=1761123044, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}) Dec 15 03:18:01 localhost systemd[1]: ac45612fab357b615b4684dbfa5bd000dd29eb1adfbaedb6db0802fc49e89549.service: 
Deactivated successfully. Dec 15 03:18:01 localhost podman[73547]: 2025-12-15 08:18:01.22807142 +0000 UTC m=+0.085383166 container health_status 4c3f871f4bdf287b1d3ce717956c1cf130617489217410eadf6fffcc440eb3b0 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, version=17.1.12, distribution-scope=public, architecture=x86_64, config_id=tripleo_step4, url=https://www.redhat.com, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '879500e96bf8dfb93687004bd86f2317'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, build-date=2025-11-19T00:36:58Z, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1761123044, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, batch=17.1_20251118.1, 
konflux.additional-tags=17.1.12 17.1_20251118.1, container_name=nova_migration_target, description=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.component=openstack-nova-compute-container, tcib_managed=true, name=rhosp17/openstack-nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vendor=Red Hat, Inc., io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=) Dec 15 03:18:01 localhost podman[73475]: 2025-12-15 08:18:01.244131909 +0000 UTC m=+0.570608506 container exec_died 97115ff7c5a84ae7e17f79e696e94da3c63f9fe0051287fe9193bec282a03f99 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, version=17.1.12, konflux.additional-tags=17.1.12 17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, maintainer=OpenStack TripleO Team, tcib_managed=true, build-date=2025-11-19T00:12:45Z, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'fcee5a4a91f85471fca7b61211375646'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, com.redhat.component=openstack-ceilometer-ipmi-container, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, managed_by=tripleo_ansible, url=https://www.redhat.com, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, distribution-scope=public, name=rhosp17/openstack-ceilometer-ipmi, vcs-type=git, release=1761123044, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, batch=17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, architecture=x86_64, io.buildah.version=1.41.4, config_id=tripleo_step4, container_name=ceilometer_agent_ipmi, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi) Dec 15 03:18:01 localhost systemd[1]: 97115ff7c5a84ae7e17f79e696e94da3c63f9fe0051287fe9193bec282a03f99.service: Deactivated successfully. 
Dec 15 03:18:01 localhost podman[73547]: 2025-12-15 08:18:01.599255429 +0000 UTC m=+0.456567135 container exec_died 4c3f871f4bdf287b1d3ce717956c1cf130617489217410eadf6fffcc440eb3b0 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, summary=Red Hat OpenStack Platform 17.1 nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, url=https://www.redhat.com, tcib_managed=true, container_name=nova_migration_target, version=17.1.12, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, build-date=2025-11-19T00:36:58Z, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '879500e96bf8dfb93687004bd86f2317'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, 
name=rhosp17/openstack-nova-compute, config_id=tripleo_step4, maintainer=OpenStack TripleO Team, batch=17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., architecture=x86_64, managed_by=tripleo_ansible, com.redhat.component=openstack-nova-compute-container, vcs-type=git, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.buildah.version=1.41.4, distribution-scope=public, konflux.additional-tags=17.1.12 17.1_20251118.1, release=1761123044, description=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute) Dec 15 03:18:01 localhost systemd[1]: 4c3f871f4bdf287b1d3ce717956c1cf130617489217410eadf6fffcc440eb3b0.service: Deactivated successfully. Dec 15 03:18:04 localhost systemd[1]: Started /usr/bin/podman healthcheck run 2df8d20c6ba32f38bd0e6519c9c77a4146b632f21c81197d8c319b223aad81f1. Dec 15 03:18:04 localhost systemd[1]: Started /usr/bin/podman healthcheck run 4d35b7dc3f3a4e3c60661e85d3511f50804e87244d74cab67af5c6e8c447d379. 
Dec 15 03:18:04 localhost podman[73575]: 2025-12-15 08:18:04.73125689 +0000 UTC m=+0.060305234 container health_status 4d35b7dc3f3a4e3c60661e85d3511f50804e87244d74cab67af5c6e8c447d379 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, batch=17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.12 17.1_20251118.1, version=17.1.12, io.buildah.version=1.41.4, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'a56a6f14b467cd9064e40c03defa5ed7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', 
'/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, maintainer=OpenStack TripleO Team, build-date=2025-11-19T00:14:25Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, distribution-scope=public, name=rhosp17/openstack-neutron-metadata-agent-ovn, release=1761123044, io.openshift.expose-services=, container_name=ovn_metadata_agent, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, tcib_managed=true, architecture=x86_64, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, config_id=tripleo_step4, url=https://www.redhat.com, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn) Dec 15 03:18:04 localhost systemd[1]: tmp-crun.NLRRcP.mount: Deactivated successfully. 
Dec 15 03:18:04 localhost podman[73575]: 2025-12-15 08:18:04.78132516 +0000 UTC m=+0.110373514 container exec_died 4d35b7dc3f3a4e3c60661e85d3511f50804e87244d74cab67af5c6e8c447d379 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, batch=17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, name=rhosp17/openstack-neutron-metadata-agent-ovn, container_name=ovn_metadata_agent, io.buildah.version=1.41.4, tcib_managed=true, version=17.1.12, url=https://www.redhat.com, konflux.additional-tags=17.1.12 17.1_20251118.1, vendor=Red Hat, Inc., config_id=tripleo_step4, vcs-type=git, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, release=1761123044, distribution-scope=public, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, maintainer=OpenStack TripleO Team, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'a56a6f14b467cd9064e40c03defa5ed7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, build-date=2025-11-19T00:14:25Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05) Dec 15 03:18:04 localhost systemd[1]: 4d35b7dc3f3a4e3c60661e85d3511f50804e87244d74cab67af5c6e8c447d379.service: Deactivated successfully. 
Dec 15 03:18:04 localhost podman[73574]: 2025-12-15 08:18:04.859149682 +0000 UTC m=+0.186497001 container health_status 2df8d20c6ba32f38bd0e6519c9c77a4146b632f21c81197d8c319b223aad81f1 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, batch=17.1_20251118.1, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.buildah.version=1.41.4, config_id=tripleo_step4, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=ovn_controller, description=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.expose-services=, vcs-type=git, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 17.1_20251118.1, version=17.1.12, com.redhat.component=openstack-ovn-controller-container, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, tcib_managed=true, architecture=x86_64, build-date=2025-11-18T23:34:05Z, distribution-scope=public, managed_by=tripleo_ansible, name=rhosp17/openstack-ovn-controller, 
io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, release=1761123044, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, summary=Red Hat OpenStack Platform 17.1 ovn-controller) Dec 15 03:18:04 localhost podman[73574]: 2025-12-15 08:18:04.910438974 +0000 UTC m=+0.237786303 container exec_died 2df8d20c6ba32f38bd0e6519c9c77a4146b632f21c81197d8c319b223aad81f1 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, distribution-scope=public, vcs-type=git, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, io.buildah.version=1.41.4, config_id=tripleo_step4, name=rhosp17/openstack-ovn-controller, description=Red Hat OpenStack Platform 17.1 ovn-controller, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.expose-services=, container_name=ovn_controller, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, tcib_managed=true, url=https://www.redhat.com, architecture=x86_64, batch=17.1_20251118.1, managed_by=tripleo_ansible, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, 
konflux.additional-tags=17.1.12 17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 ovn-controller, release=1761123044, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, maintainer=OpenStack TripleO Team, build-date=2025-11-18T23:34:05Z, version=17.1.12, com.redhat.component=openstack-ovn-controller-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller) Dec 15 03:18:04 localhost systemd[1]: 2df8d20c6ba32f38bd0e6519c9c77a4146b632f21c81197d8c319b223aad81f1.service: Deactivated successfully. Dec 15 03:18:13 localhost systemd[1]: Started /usr/bin/podman healthcheck run 6278d648aec9c9f12e0efaafa0d6ae5653eeb34459516699e562eeb9c8d23b3c. Dec 15 03:18:13 localhost podman[73623]: 2025-12-15 08:18:13.747427316 +0000 UTC m=+0.076704244 container health_status 6278d648aec9c9f12e0efaafa0d6ae5653eeb34459516699e562eeb9c8d23b3c (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, summary=Red Hat OpenStack Platform 17.1 qdrouterd, description=Red Hat OpenStack Platform 17.1 qdrouterd, io.buildah.version=1.41.4, name=rhosp17/openstack-qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '29d07da103bbd44a9ed3e29999314b03'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, version=17.1.12, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, io.openshift.expose-services=, vendor=Red Hat, Inc., vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.component=openstack-qdrouterd-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, container_name=metrics_qdr, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, tcib_managed=true, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, release=1761123044, konflux.additional-tags=17.1.12 17.1_20251118.1, architecture=x86_64, vcs-type=git, config_id=tripleo_step1, batch=17.1_20251118.1, url=https://www.redhat.com, build-date=2025-11-18T22:49:46Z, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, managed_by=tripleo_ansible) Dec 15 03:18:13 localhost podman[73623]: 2025-12-15 08:18:13.912373198 +0000 UTC m=+0.241650216 container exec_died 6278d648aec9c9f12e0efaafa0d6ae5653eeb34459516699e562eeb9c8d23b3c (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, com.redhat.component=openstack-qdrouterd-container, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '29d07da103bbd44a9ed3e29999314b03'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 
'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, container_name=metrics_qdr, release=1761123044, vendor=Red Hat, Inc., vcs-type=git, build-date=2025-11-18T22:49:46Z, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, managed_by=tripleo_ansible, name=rhosp17/openstack-qdrouterd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, tcib_managed=true, konflux.additional-tags=17.1.12 17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 qdrouterd, io.buildah.version=1.41.4, config_id=tripleo_step1, version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, architecture=x86_64, distribution-scope=public, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 qdrouterd, batch=17.1_20251118.1, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a) Dec 15 03:18:13 localhost systemd[1]: 
6278d648aec9c9f12e0efaafa0d6ae5653eeb34459516699e562eeb9c8d23b3c.service: Deactivated successfully. Dec 15 03:18:28 localhost systemd[1]: Started /usr/bin/podman healthcheck run 165a6fd1f2b629d16da0798ec1c9bac41d302d530a0ffa668b2315a52d6aa04e. Dec 15 03:18:28 localhost systemd[1]: tmp-crun.oFGcyh.mount: Deactivated successfully. Dec 15 03:18:28 localhost podman[73652]: 2025-12-15 08:18:28.780289046 +0000 UTC m=+0.103324876 container health_status 165a6fd1f2b629d16da0798ec1c9bac41d302d530a0ffa668b2315a52d6aa04e (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, version=17.1.12, vendor=Red Hat, Inc., build-date=2025-11-18T22:51:28Z, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, io.openshift.expose-services=, release=1761123044, konflux.additional-tags=17.1.12 17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, architecture=x86_64, batch=17.1_20251118.1, managed_by=tripleo_ansible, com.redhat.component=openstack-collectd-container, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, tcib_managed=true, distribution-scope=public, url=https://www.redhat.com, config_id=tripleo_step3, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 collectd, container_name=collectd, vcs-type=git, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, summary=Red Hat OpenStack Platform 17.1 collectd, name=rhosp17/openstack-collectd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd) Dec 15 03:18:28 localhost podman[73652]: 2025-12-15 08:18:28.81111513 +0000 UTC m=+0.134150960 container exec_died 165a6fd1f2b629d16da0798ec1c9bac41d302d530a0ffa668b2315a52d6aa04e (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, build-date=2025-11-18T22:51:28Z, tcib_managed=true, io.buildah.version=1.41.4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, summary=Red Hat OpenStack Platform 17.1 collectd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_id=tripleo_step3, maintainer=OpenStack TripleO Team, version=17.1.12, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 collectd, distribution-scope=public, io.k8s.description=Red Hat 
OpenStack Platform 17.1 collectd, com.redhat.component=openstack-collectd-container, architecture=x86_64, vcs-type=git, konflux.additional-tags=17.1.12 17.1_20251118.1, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, managed_by=tripleo_ansible, release=1761123044, container_name=collectd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, name=rhosp17/openstack-collectd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, batch=17.1_20251118.1, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 
collectd) Dec 15 03:18:28 localhost systemd[1]: 165a6fd1f2b629d16da0798ec1c9bac41d302d530a0ffa668b2315a52d6aa04e.service: Deactivated successfully. Dec 15 03:18:30 localhost systemd[1]: Started /usr/bin/podman healthcheck run 2649c3aec9af2f04509e1d248b1050b202fccc3ed2b86421c89a2a65376db057. Dec 15 03:18:30 localhost podman[73673]: 2025-12-15 08:18:30.75218677 +0000 UTC m=+0.079805507 container health_status 2649c3aec9af2f04509e1d248b1050b202fccc3ed2b86421c89a2a65376db057 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, release=1761123044, config_id=tripleo_step3, konflux.additional-tags=17.1.12 17.1_20251118.1, url=https://www.redhat.com, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, batch=17.1_20251118.1, version=17.1.12, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '182e509007ab5e6e5b2500a552cbd5ba'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', 
'/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.expose-services=, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-11-18T23:44:13Z, com.redhat.component=openstack-iscsid-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, tcib_managed=true, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vendor=Red Hat, Inc., name=rhosp17/openstack-iscsid, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, io.buildah.version=1.41.4, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, distribution-scope=public, container_name=iscsid, description=Red Hat OpenStack Platform 17.1 iscsid) Dec 15 03:18:30 localhost podman[73673]: 2025-12-15 08:18:30.784797571 +0000 UTC m=+0.112416278 container exec_died 2649c3aec9af2f04509e1d248b1050b202fccc3ed2b86421c89a2a65376db057 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '182e509007ab5e6e5b2500a552cbd5ba'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, vendor=Red Hat, Inc., batch=17.1_20251118.1, vcs-type=git, url=https://www.redhat.com, config_id=tripleo_step3, version=17.1.12, name=rhosp17/openstack-iscsid, io.buildah.version=1.41.4, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 iscsid, release=1761123044, summary=Red Hat OpenStack Platform 17.1 iscsid, build-date=2025-11-18T23:44:13Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.12 17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, architecture=x86_64, managed_by=tripleo_ansible, com.redhat.component=openstack-iscsid-container, container_name=iscsid, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, distribution-scope=public) Dec 15 03:18:30 localhost systemd[1]: 2649c3aec9af2f04509e1d248b1050b202fccc3ed2b86421c89a2a65376db057.service: Deactivated successfully. Dec 15 03:18:31 localhost systemd[1]: Started /usr/bin/podman healthcheck run 4c3f871f4bdf287b1d3ce717956c1cf130617489217410eadf6fffcc440eb3b0. Dec 15 03:18:31 localhost systemd[1]: Started /usr/bin/podman healthcheck run 97115ff7c5a84ae7e17f79e696e94da3c63f9fe0051287fe9193bec282a03f99. 
Dec 15 03:18:31 localhost systemd[1]: Started /usr/bin/podman healthcheck run ac45612fab357b615b4684dbfa5bd000dd29eb1adfbaedb6db0802fc49e89549. Dec 15 03:18:31 localhost systemd[1]: Started /usr/bin/podman healthcheck run d6041e5471624a495d5be42684f6049ccf71802a80d5760239330151d465d146. Dec 15 03:18:31 localhost podman[73692]: 2025-12-15 08:18:31.767466061 +0000 UTC m=+0.093875642 container health_status 4c3f871f4bdf287b1d3ce717956c1cf130617489217410eadf6fffcc440eb3b0 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, container_name=nova_migration_target, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.buildah.version=1.41.4, batch=17.1_20251118.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '879500e96bf8dfb93687004bd86f2317'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, 
io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, version=17.1.12, summary=Red Hat OpenStack Platform 17.1 nova-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_id=tripleo_step4, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-nova-compute-container, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp17/openstack-nova-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, tcib_managed=true, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, description=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, vcs-type=git, distribution-scope=public, vendor=Red Hat, Inc., build-date=2025-11-19T00:36:58Z, release=1761123044) Dec 15 03:18:31 localhost systemd[1]: tmp-crun.s4IbOK.mount: Deactivated successfully. 
Dec 15 03:18:31 localhost podman[73693]: 2025-12-15 08:18:31.823369526 +0000 UTC m=+0.144994730 container health_status 97115ff7c5a84ae7e17f79e696e94da3c63f9fe0051287fe9193bec282a03f99 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, config_id=tripleo_step4, architecture=x86_64, vendor=Red Hat, Inc., org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, konflux.additional-tags=17.1.12 17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, release=1761123044, batch=17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.buildah.version=1.41.4, container_name=ceilometer_agent_ipmi, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-ceilometer-ipmi-container, tcib_managed=true, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, distribution-scope=public, name=rhosp17/openstack-ceilometer-ipmi, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'fcee5a4a91f85471fca7b61211375646'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, managed_by=tripleo_ansible, version=17.1.12, build-date=2025-11-19T00:12:45Z, url=https://www.redhat.com, vcs-type=git, io.openshift.expose-services=, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi) Dec 15 03:18:31 localhost podman[73693]: 2025-12-15 08:18:31.857459248 +0000 UTC m=+0.179084442 container exec_died 97115ff7c5a84ae7e17f79e696e94da3c63f9fe0051287fe9193bec282a03f99 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, version=17.1.12, name=rhosp17/openstack-ceilometer-ipmi, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, build-date=2025-11-19T00:12:45Z, config_id=tripleo_step4, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, architecture=x86_64, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, vcs-type=git, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'fcee5a4a91f85471fca7b61211375646'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, konflux.additional-tags=17.1.12 17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.component=openstack-ceilometer-ipmi-container, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, distribution-scope=public, release=1761123044, batch=17.1_20251118.1, managed_by=tripleo_ansible, io.buildah.version=1.41.4, container_name=ceilometer_agent_ipmi) Dec 15 03:18:31 localhost systemd[1]: 97115ff7c5a84ae7e17f79e696e94da3c63f9fe0051287fe9193bec282a03f99.service: Deactivated successfully. 
Dec 15 03:18:31 localhost podman[73694]: 2025-12-15 08:18:31.881142602 +0000 UTC m=+0.198117181 container health_status ac45612fab357b615b4684dbfa5bd000dd29eb1adfbaedb6db0802fc49e89549 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.buildah.version=1.41.4, config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 cron, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 cron, distribution-scope=public, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, com.redhat.component=openstack-cron-container, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', 
'/var/log/containers:/var/log/containers:z']}, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.12 17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-type=git, release=1761123044, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, version=17.1.12, build-date=2025-11-18T22:49:32Z, io.openshift.expose-services=, architecture=x86_64, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, batch=17.1_20251118.1, name=rhosp17/openstack-cron, container_name=logrotate_crond, io.k8s.description=Red Hat OpenStack Platform 17.1 cron) Dec 15 03:18:31 localhost podman[73694]: 2025-12-15 08:18:31.918529602 +0000 UTC m=+0.235504101 container exec_died ac45612fab357b615b4684dbfa5bd000dd29eb1adfbaedb6db0802fc49e89549 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, url=https://www.redhat.com, io.buildah.version=1.41.4, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 cron, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, container_name=logrotate_crond, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, batch=17.1_20251118.1, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, config_id=tripleo_step4, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 cron, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, konflux.additional-tags=17.1.12 17.1_20251118.1, version=17.1.12, build-date=2025-11-18T22:49:32Z, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, architecture=x86_64, vcs-type=git, name=rhosp17/openstack-cron, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, distribution-scope=public, io.openshift.expose-services=, com.redhat.component=openstack-cron-container, release=1761123044) Dec 15 03:18:31 localhost systemd[1]: ac45612fab357b615b4684dbfa5bd000dd29eb1adfbaedb6db0802fc49e89549.service: Deactivated successfully. 
Dec 15 03:18:31 localhost podman[73698]: 2025-12-15 08:18:31.934416158 +0000 UTC m=+0.249216499 container health_status d6041e5471624a495d5be42684f6049ccf71802a80d5760239330151d465d146 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, tcib_managed=true, url=https://www.redhat.com, vcs-type=git, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'fcee5a4a91f85471fca7b61211375646'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.openshift.expose-services=, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, batch=17.1_20251118.1, release=1761123044, name=rhosp17/openstack-ceilometer-compute, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, 
org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, config_id=tripleo_step4, container_name=ceilometer_agent_compute, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, version=17.1.12, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, architecture=x86_64, com.redhat.component=openstack-ceilometer-compute-container, io.buildah.version=1.41.4, distribution-scope=public, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, build-date=2025-11-19T00:11:48Z) Dec 15 03:18:31 localhost podman[73698]: 2025-12-15 08:18:31.962428467 +0000 UTC m=+0.277228878 container exec_died d6041e5471624a495d5be42684f6049ccf71802a80d5760239330151d465d146 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1761123044, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, container_name=ceilometer_agent_compute, konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team, batch=17.1_20251118.1, managed_by=tripleo_ansible, com.redhat.component=openstack-ceilometer-compute-container, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, config_id=tripleo_step4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.expose-services=, distribution-scope=public, build-date=2025-11-19T00:11:48Z, io.buildah.version=1.41.4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, tcib_managed=true, architecture=x86_64, summary=Red Hat OpenStack Platform 
17.1 ceilometer-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'fcee5a4a91f85471fca7b61211375646'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 ceilometer-compute, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, name=rhosp17/openstack-ceilometer-compute, version=17.1.12) Dec 15 03:18:31 localhost systemd[1]: d6041e5471624a495d5be42684f6049ccf71802a80d5760239330151d465d146.service: Deactivated successfully. 
Dec 15 03:18:32 localhost podman[73692]: 2025-12-15 08:18:32.137753207 +0000 UTC m=+0.464162808 container exec_died 4c3f871f4bdf287b1d3ce717956c1cf130617489217410eadf6fffcc440eb3b0 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, io.buildah.version=1.41.4, summary=Red Hat OpenStack Platform 17.1 nova-compute, distribution-scope=public, url=https://www.redhat.com, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, release=1761123044, batch=17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.12, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, name=rhosp17/openstack-nova-compute, tcib_managed=true, maintainer=OpenStack TripleO Team, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '879500e96bf8dfb93687004bd86f2317'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, container_name=nova_migration_target, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, architecture=x86_64, config_id=tripleo_step4, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.openshift.expose-services=, build-date=2025-11-19T00:36:58Z, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, com.redhat.component=openstack-nova-compute-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.12 17.1_20251118.1) Dec 15 03:18:32 localhost systemd[1]: 4c3f871f4bdf287b1d3ce717956c1cf130617489217410eadf6fffcc440eb3b0.service: Deactivated successfully. Dec 15 03:18:35 localhost systemd[1]: Started /usr/bin/podman healthcheck run 2df8d20c6ba32f38bd0e6519c9c77a4146b632f21c81197d8c319b223aad81f1. Dec 15 03:18:35 localhost systemd[1]: Started /usr/bin/podman healthcheck run 4d35b7dc3f3a4e3c60661e85d3511f50804e87244d74cab67af5c6e8c447d379. 
Dec 15 03:18:35 localhost podman[73789]: 2025-12-15 08:18:35.758397639 +0000 UTC m=+0.086917526 container health_status 4d35b7dc3f3a4e3c60661e85d3511f50804e87244d74cab67af5c6e8c447d379 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, managed_by=tripleo_ansible, tcib_managed=true, vendor=Red Hat, Inc., com.redhat.component=openstack-neutron-metadata-agent-ovn-container, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'a56a6f14b467cd9064e40c03defa5ed7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, io.buildah.version=1.41.4, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1761123044, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.expose-services=, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, batch=17.1_20251118.1, config_id=tripleo_step4, architecture=x86_64, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, version=17.1.12, build-date=2025-11-19T00:14:25Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-type=git, konflux.additional-tags=17.1.12 17.1_20251118.1, distribution-scope=public, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, container_name=ovn_metadata_agent, name=rhosp17/openstack-neutron-metadata-agent-ovn) Dec 15 03:18:35 localhost podman[73789]: 2025-12-15 08:18:35.808388646 +0000 UTC m=+0.136908473 container exec_died 4d35b7dc3f3a4e3c60661e85d3511f50804e87244d74cab67af5c6e8c447d379 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp17/openstack-neutron-metadata-agent-ovn, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, tcib_managed=true, vcs-type=git, container_name=ovn_metadata_agent, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, distribution-scope=public, 
io.buildah.version=1.41.4, version=17.1.12, build-date=2025-11-19T00:14:25Z, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, architecture=x86_64, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'a56a6f14b467cd9064e40c03defa5ed7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible, release=1761123044, config_id=tripleo_step4, vendor=Red Hat, Inc., org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, 
cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, batch=17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn) Dec 15 03:18:35 localhost systemd[1]: 4d35b7dc3f3a4e3c60661e85d3511f50804e87244d74cab67af5c6e8c447d379.service: Deactivated successfully. Dec 15 03:18:35 localhost podman[73788]: 2025-12-15 08:18:35.810342958 +0000 UTC m=+0.141779963 container health_status 2df8d20c6ba32f38bd0e6519c9c77a4146b632f21c81197d8c319b223aad81f1 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, managed_by=tripleo_ansible, distribution-scope=public, container_name=ovn_controller, maintainer=OpenStack TripleO Team, tcib_managed=true, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, architecture=x86_64, batch=17.1_20251118.1, io.buildah.version=1.41.4, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, summary=Red Hat OpenStack Platform 17.1 ovn-controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, release=1761123044, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, version=17.1.12, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, io.k8s.description=Red Hat OpenStack Platform 17.1 
ovn-controller, name=rhosp17/openstack-ovn-controller, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-ovn-controller-container, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, build-date=2025-11-18T23:34:05Z, config_id=tripleo_step4, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.expose-services=) Dec 15 03:18:35 localhost podman[73788]: 2025-12-15 08:18:35.889952658 +0000 UTC m=+0.221389693 container exec_died 2df8d20c6ba32f38bd0e6519c9c77a4146b632f21c81197d8c319b223aad81f1 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, tcib_managed=true, config_id=tripleo_step4, url=https://www.redhat.com, io.openshift.expose-services=, com.redhat.component=openstack-ovn-controller-container, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, build-date=2025-11-18T23:34:05Z, batch=17.1_20251118.1, vcs-type=git, io.buildah.version=1.41.4, version=17.1.12, konflux.additional-tags=17.1.12 17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 ovn-controller, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, summary=Red Hat OpenStack Platform 17.1 ovn-controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', 
'/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, release=1761123044, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, name=rhosp17/openstack-ovn-controller, vendor=Red Hat, Inc., container_name=ovn_controller, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, distribution-scope=public) Dec 15 03:18:35 localhost systemd[1]: 2df8d20c6ba32f38bd0e6519c9c77a4146b632f21c81197d8c319b223aad81f1.service: Deactivated successfully. Dec 15 03:18:44 localhost systemd[1]: Started /usr/bin/podman healthcheck run 6278d648aec9c9f12e0efaafa0d6ae5653eeb34459516699e562eeb9c8d23b3c. Dec 15 03:18:44 localhost podman[73838]: 2025-12-15 08:18:44.754940999 +0000 UTC m=+0.087843381 container health_status 6278d648aec9c9f12e0efaafa0d6ae5653eeb34459516699e562eeb9c8d23b3c (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, url=https://www.redhat.com, io.buildah.version=1.41.4, konflux.additional-tags=17.1.12 17.1_20251118.1, version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-18T22:49:46Z, maintainer=OpenStack TripleO Team, container_name=metrics_qdr, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, vendor=Red Hat, Inc., batch=17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 qdrouterd, summary=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vcs-type=git, architecture=x86_64, config_data={'environment': 
{'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '29d07da103bbd44a9ed3e29999314b03'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, config_id=tripleo_step1, distribution-scope=public, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.component=openstack-qdrouterd-container, name=rhosp17/openstack-qdrouterd, release=1761123044, managed_by=tripleo_ansible, tcib_managed=true, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a) Dec 15 03:18:44 localhost podman[73838]: 2025-12-15 08:18:44.987233423 +0000 UTC m=+0.320135785 container exec_died 6278d648aec9c9f12e0efaafa0d6ae5653eeb34459516699e562eeb9c8d23b3c (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack 
Platform 17.1 qdrouterd, batch=17.1_20251118.1, com.redhat.component=openstack-qdrouterd-container, io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, description=Red Hat OpenStack Platform 17.1 qdrouterd, version=17.1.12, build-date=2025-11-18T22:49:46Z, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, release=1761123044, konflux.additional-tags=17.1.12 17.1_20251118.1, name=rhosp17/openstack-qdrouterd, architecture=x86_64, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '29d07da103bbd44a9ed3e29999314b03'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, io.openshift.expose-services=, tcib_managed=true, config_id=tripleo_step1, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, container_name=metrics_qdr, vcs-type=git, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, 
cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vendor=Red Hat, Inc., org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, managed_by=tripleo_ansible) Dec 15 03:18:45 localhost systemd[1]: 6278d648aec9c9f12e0efaafa0d6ae5653eeb34459516699e562eeb9c8d23b3c.service: Deactivated successfully. Dec 15 03:18:59 localhost systemd[1]: Started /usr/bin/podman healthcheck run 165a6fd1f2b629d16da0798ec1c9bac41d302d530a0ffa668b2315a52d6aa04e. Dec 15 03:18:59 localhost podman[73943]: 2025-12-15 08:18:59.762635284 +0000 UTC m=+0.087706917 container health_status 165a6fd1f2b629d16da0798ec1c9bac41d302d530a0ffa668b2315a52d6aa04e (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', 
'/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, distribution-scope=public, managed_by=tripleo_ansible, name=rhosp17/openstack-collectd, version=17.1.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, summary=Red Hat OpenStack Platform 17.1 collectd, com.redhat.component=openstack-collectd-container, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, build-date=2025-11-18T22:51:28Z, description=Red Hat OpenStack Platform 17.1 collectd, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, io.buildah.version=1.41.4, io.openshift.expose-services=, release=1761123044, tcib_managed=true, architecture=x86_64, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, batch=17.1_20251118.1, maintainer=OpenStack TripleO Team, config_id=tripleo_step3, container_name=collectd, url=https://www.redhat.com) Dec 15 03:18:59 localhost podman[73943]: 2025-12-15 08:18:59.803429666 +0000 UTC m=+0.128501319 container exec_died 165a6fd1f2b629d16da0798ec1c9bac41d302d530a0ffa668b2315a52d6aa04e (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, vendor=Red Hat, Inc., name=rhosp17/openstack-collectd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 collectd, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, url=https://www.redhat.com, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 collectd, release=1761123044, distribution-scope=public, 
vcs-type=git, konflux.additional-tags=17.1.12 17.1_20251118.1, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, version=17.1.12, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.component=openstack-collectd-container, io.buildah.version=1.41.4, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, batch=17.1_20251118.1, maintainer=OpenStack TripleO Team, config_id=tripleo_step3, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, 
architecture=x86_64, build-date=2025-11-18T22:51:28Z, container_name=collectd) Dec 15 03:18:59 localhost systemd[1]: 165a6fd1f2b629d16da0798ec1c9bac41d302d530a0ffa668b2315a52d6aa04e.service: Deactivated successfully. Dec 15 03:19:01 localhost systemd[1]: Started /usr/bin/podman healthcheck run 2649c3aec9af2f04509e1d248b1050b202fccc3ed2b86421c89a2a65376db057. Dec 15 03:19:01 localhost podman[73962]: 2025-12-15 08:19:01.766112954 +0000 UTC m=+0.093676878 container health_status 2649c3aec9af2f04509e1d248b1050b202fccc3ed2b86421c89a2a65376db057 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, batch=17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, io.buildah.version=1.41.4, name=rhosp17/openstack-iscsid, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, config_id=tripleo_step3, build-date=2025-11-18T23:44:13Z, konflux.additional-tags=17.1.12 17.1_20251118.1, container_name=iscsid, description=Red Hat OpenStack Platform 17.1 iscsid, architecture=x86_64, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '182e509007ab5e6e5b2500a552cbd5ba'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, version=17.1.12, url=https://www.redhat.com, managed_by=tripleo_ansible, release=1761123044, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, com.redhat.component=openstack-iscsid-container, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, distribution-scope=public, tcib_managed=true, vendor=Red Hat, Inc., vcs-type=git) Dec 15 03:19:01 localhost podman[73962]: 2025-12-15 08:19:01.805401915 +0000 UTC m=+0.132965869 container exec_died 2649c3aec9af2f04509e1d248b1050b202fccc3ed2b86421c89a2a65376db057 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, release=1761123044, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 iscsid, description=Red Hat OpenStack Platform 17.1 iscsid, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, io.buildah.version=1.41.4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-type=git, url=https://www.redhat.com, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.12, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_id=tripleo_step3, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, com.redhat.component=openstack-iscsid-container, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '182e509007ab5e6e5b2500a552cbd5ba'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, io.openshift.expose-services=, tcib_managed=true, container_name=iscsid, build-date=2025-11-18T23:44:13Z, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, managed_by=tripleo_ansible, name=rhosp17/openstack-iscsid, batch=17.1_20251118.1) Dec 15 03:19:01 localhost systemd[1]: 2649c3aec9af2f04509e1d248b1050b202fccc3ed2b86421c89a2a65376db057.service: Deactivated successfully. Dec 15 03:19:02 localhost systemd[1]: Started /usr/bin/podman healthcheck run 4c3f871f4bdf287b1d3ce717956c1cf130617489217410eadf6fffcc440eb3b0. 
Dec 15 03:19:02 localhost systemd[1]: Started /usr/bin/podman healthcheck run 97115ff7c5a84ae7e17f79e696e94da3c63f9fe0051287fe9193bec282a03f99. Dec 15 03:19:02 localhost systemd[1]: Started /usr/bin/podman healthcheck run ac45612fab357b615b4684dbfa5bd000dd29eb1adfbaedb6db0802fc49e89549. Dec 15 03:19:02 localhost systemd[1]: Started /usr/bin/podman healthcheck run d6041e5471624a495d5be42684f6049ccf71802a80d5760239330151d465d146. Dec 15 03:19:02 localhost systemd[1]: tmp-crun.HqMc3Y.mount: Deactivated successfully. Dec 15 03:19:02 localhost podman[73984]: 2025-12-15 08:19:02.758910973 +0000 UTC m=+0.084285946 container health_status ac45612fab357b615b4684dbfa5bd000dd29eb1adfbaedb6db0802fc49e89549 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, version=17.1.12, name=rhosp17/openstack-cron, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, batch=17.1_20251118.1, com.redhat.component=openstack-cron-container, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, description=Red Hat OpenStack Platform 17.1 cron, release=1761123044, maintainer=OpenStack TripleO Team, tcib_managed=true, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, architecture=x86_64, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_id=tripleo_step4, summary=Red Hat OpenStack Platform 17.1 cron, container_name=logrotate_crond, vendor=Red Hat, Inc., config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 
'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, io.buildah.version=1.41.4, build-date=2025-11-18T22:49:32Z, io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-type=git) Dec 15 03:19:02 localhost podman[73984]: 2025-12-15 08:19:02.792326267 +0000 UTC m=+0.117701250 container exec_died ac45612fab357b615b4684dbfa5bd000dd29eb1adfbaedb6db0802fc49e89549 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 cron, konflux.additional-tags=17.1.12 17.1_20251118.1, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 
'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, release=1761123044, io.buildah.version=1.41.4, vcs-type=git, managed_by=tripleo_ansible, com.redhat.component=openstack-cron-container, name=rhosp17/openstack-cron, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 cron, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, version=17.1.12, container_name=logrotate_crond, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, architecture=x86_64, vendor=Red Hat, Inc., build-date=2025-11-18T22:49:32Z, batch=17.1_20251118.1, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.expose-services=, config_id=tripleo_step4, maintainer=OpenStack TripleO Team) Dec 15 03:19:02 localhost systemd[1]: ac45612fab357b615b4684dbfa5bd000dd29eb1adfbaedb6db0802fc49e89549.service: Deactivated successfully. 
Dec 15 03:19:02 localhost podman[73982]: 2025-12-15 08:19:02.811349556 +0000 UTC m=+0.140921401 container health_status 4c3f871f4bdf287b1d3ce717956c1cf130617489217410eadf6fffcc440eb3b0 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., container_name=nova_migration_target, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, config_id=tripleo_step4, distribution-scope=public, architecture=x86_64, name=rhosp17/openstack-nova-compute, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-19T00:36:58Z, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '879500e96bf8dfb93687004bd86f2317'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.component=openstack-nova-compute-container, konflux.additional-tags=17.1.12 17.1_20251118.1, batch=17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, maintainer=OpenStack TripleO Team, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, managed_by=tripleo_ansible, release=1761123044, description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, version=17.1.12) Dec 15 03:19:02 localhost systemd[1]: tmp-crun.WQ0vSM.mount: Deactivated successfully. Dec 15 03:19:02 localhost podman[73983]: 2025-12-15 08:19:02.873801497 +0000 UTC m=+0.199212580 container health_status 97115ff7c5a84ae7e17f79e696e94da3c63f9fe0051287fe9193bec282a03f99 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, vcs-type=git, release=1761123044, com.redhat.component=openstack-ceilometer-ipmi-container, konflux.additional-tags=17.1.12 17.1_20251118.1, name=rhosp17/openstack-ceilometer-ipmi, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'fcee5a4a91f85471fca7b61211375646'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', 
'/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, batch=17.1_20251118.1, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, maintainer=OpenStack TripleO Team, version=17.1.12, vendor=Red Hat, Inc., build-date=2025-11-19T00:12:45Z, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, distribution-scope=public, config_id=tripleo_step4, container_name=ceilometer_agent_ipmi, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, url=https://www.redhat.com, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.buildah.version=1.41.4) Dec 15 03:19:02 localhost podman[73983]: 2025-12-15 08:19:02.932578149 +0000 UTC m=+0.257989282 container exec_died 97115ff7c5a84ae7e17f79e696e94da3c63f9fe0051287fe9193bec282a03f99 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, vcs-type=git, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.component=openstack-ceilometer-ipmi-container, 
description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, konflux.additional-tags=17.1.12 17.1_20251118.1, container_name=ceilometer_agent_ipmi, batch=17.1_20251118.1, tcib_managed=true, maintainer=OpenStack TripleO Team, architecture=x86_64, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, release=1761123044, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, build-date=2025-11-19T00:12:45Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'fcee5a4a91f85471fca7b61211375646'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, config_id=tripleo_step4, version=17.1.12, vendor=Red Hat, Inc., name=rhosp17/openstack-ceilometer-ipmi, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, managed_by=tripleo_ansible, distribution-scope=public, io.openshift.expose-services=) Dec 15 03:19:02 localhost 
podman[73986]: 2025-12-15 08:19:02.937500041 +0000 UTC m=+0.258873057 container health_status d6041e5471624a495d5be42684f6049ccf71802a80d5760239330151d465d146 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, managed_by=tripleo_ansible, release=1761123044, tcib_managed=true, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'fcee5a4a91f85471fca7b61211375646'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, name=rhosp17/openstack-ceilometer-compute, architecture=x86_64, container_name=ceilometer_agent_compute, 
build-date=2025-11-19T00:11:48Z, version=17.1.12, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, com.redhat.component=openstack-ceilometer-compute-container, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, config_id=tripleo_step4, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.buildah.version=1.41.4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, batch=17.1_20251118.1, io.openshift.expose-services=) Dec 15 03:19:02 localhost podman[73986]: 2025-12-15 08:19:02.972479877 +0000 UTC m=+0.293852883 container exec_died d6041e5471624a495d5be42684f6049ccf71802a80d5760239330151d465d146 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, url=https://www.redhat.com, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, distribution-scope=public, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, tcib_managed=true, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'fcee5a4a91f85471fca7b61211375646'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, vcs-type=git, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, vendor=Red Hat, Inc., com.redhat.component=openstack-ceilometer-compute-container, konflux.additional-tags=17.1.12 17.1_20251118.1, build-date=2025-11-19T00:11:48Z, config_id=tripleo_step4, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp17/openstack-ceilometer-compute, io.buildah.version=1.41.4, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, release=1761123044, container_name=ceilometer_agent_compute, batch=17.1_20251118.1) Dec 15 03:19:02 localhost systemd[1]: d6041e5471624a495d5be42684f6049ccf71802a80d5760239330151d465d146.service: Deactivated successfully. Dec 15 03:19:03 localhost systemd[1]: 97115ff7c5a84ae7e17f79e696e94da3c63f9fe0051287fe9193bec282a03f99.service: Deactivated successfully. 
Dec 15 03:19:03 localhost podman[73982]: 2025-12-15 08:19:03.221573661 +0000 UTC m=+0.551145486 container exec_died 4c3f871f4bdf287b1d3ce717956c1cf130617489217410eadf6fffcc440eb3b0 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 nova-compute, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, release=1761123044, build-date=2025-11-19T00:36:58Z, batch=17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.41.4, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, config_id=tripleo_step4, konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team, vcs-type=git, name=rhosp17/openstack-nova-compute, container_name=nova_migration_target, distribution-scope=public, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.component=openstack-nova-compute-container, managed_by=tripleo_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '879500e96bf8dfb93687004bd86f2317'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, version=17.1.12, architecture=x86_64) Dec 15 03:19:03 localhost systemd[1]: 4c3f871f4bdf287b1d3ce717956c1cf130617489217410eadf6fffcc440eb3b0.service: Deactivated successfully. Dec 15 03:19:03 localhost systemd[1]: tmp-crun.gHk5cB.mount: Deactivated successfully. Dec 15 03:19:06 localhost systemd[1]: Started /usr/bin/podman healthcheck run 2df8d20c6ba32f38bd0e6519c9c77a4146b632f21c81197d8c319b223aad81f1. Dec 15 03:19:06 localhost systemd[1]: Started /usr/bin/podman healthcheck run 4d35b7dc3f3a4e3c60661e85d3511f50804e87244d74cab67af5c6e8c447d379. 
Dec 15 03:19:06 localhost podman[74076]: 2025-12-15 08:19:06.755066952 +0000 UTC m=+0.079821817 container health_status 2df8d20c6ba32f38bd0e6519c9c77a4146b632f21c81197d8c319b223aad81f1 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, io.openshift.expose-services=, managed_by=tripleo_ansible, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, konflux.additional-tags=17.1.12 17.1_20251118.1, config_id=tripleo_step4, name=rhosp17/openstack-ovn-controller, container_name=ovn_controller, summary=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, url=https://www.redhat.com, vcs-type=git, io.buildah.version=1.41.4, version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, description=Red Hat OpenStack Platform 17.1 ovn-controller, release=1761123044, batch=17.1_20251118.1, vendor=Red Hat, Inc., build-date=2025-11-18T23:34:05Z, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, tcib_managed=true, com.redhat.component=openstack-ovn-controller-container, distribution-scope=public, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller)
Dec 15 03:19:06 localhost podman[74077]: 2025-12-15 08:19:06.808796679 +0000 UTC m=+0.129160206 container health_status 4d35b7dc3f3a4e3c60661e85d3511f50804e87244d74cab67af5c6e8c447d379 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, architecture=x86_64, batch=17.1_20251118.1, io.buildah.version=1.41.4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, tcib_managed=true, version=17.1.12, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.expose-services=, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'a56a6f14b467cd9064e40c03defa5ed7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, build-date=2025-11-19T00:14:25Z, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, distribution-scope=public, release=1761123044, vendor=Red Hat, Inc., config_id=tripleo_step4, vcs-type=git, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, konflux.additional-tags=17.1.12 17.1_20251118.1, container_name=ovn_metadata_agent, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, name=rhosp17/openstack-neutron-metadata-agent-ovn, url=https://www.redhat.com)
Dec 15 03:19:06 localhost podman[74076]: 2025-12-15 08:19:06.830906481 +0000 UTC m=+0.155661376 container exec_died 2df8d20c6ba32f38bd0e6519c9c77a4146b632f21c81197d8c319b223aad81f1 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, config_id=tripleo_step4, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 ovn-controller, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, url=https://www.redhat.com, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, release=1761123044, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, com.redhat.component=openstack-ovn-controller-container, name=rhosp17/openstack-ovn-controller, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, konflux.additional-tags=17.1.12 17.1_20251118.1, container_name=ovn_controller, version=17.1.12, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, vendor=Red Hat, Inc., batch=17.1_20251118.1, io.buildah.version=1.41.4, io.openshift.expose-services=, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, distribution-scope=public, build-date=2025-11-18T23:34:05Z, vcs-type=git)
Dec 15 03:19:06 localhost systemd[1]: 2df8d20c6ba32f38bd0e6519c9c77a4146b632f21c81197d8c319b223aad81f1.service: Deactivated successfully.
Dec 15 03:19:06 localhost podman[74077]: 2025-12-15 08:19:06.882677715 +0000 UTC m=+0.203041232 container exec_died 4d35b7dc3f3a4e3c60661e85d3511f50804e87244d74cab67af5c6e8c447d379 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, vendor=Red Hat, Inc., url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, batch=17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, distribution-scope=public, tcib_managed=true, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'a56a6f14b467cd9064e40c03defa5ed7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, version=17.1.12, build-date=2025-11-19T00:14:25Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, release=1761123044, name=rhosp17/openstack-neutron-metadata-agent-ovn, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, io.openshift.expose-services=, managed_by=tripleo_ansible, architecture=x86_64, config_id=tripleo_step4, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.buildah.version=1.41.4, konflux.additional-tags=17.1.12 17.1_20251118.1, container_name=ovn_metadata_agent)
Dec 15 03:19:06 localhost systemd[1]: 4d35b7dc3f3a4e3c60661e85d3511f50804e87244d74cab67af5c6e8c447d379.service: Deactivated successfully.
Dec 15 03:19:15 localhost systemd[1]: Started /usr/bin/podman healthcheck run 6278d648aec9c9f12e0efaafa0d6ae5653eeb34459516699e562eeb9c8d23b3c.
Dec 15 03:19:15 localhost podman[74124]: 2025-12-15 08:19:15.759068773 +0000 UTC m=+0.089426094 container health_status 6278d648aec9c9f12e0efaafa0d6ae5653eeb34459516699e562eeb9c8d23b3c (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, release=1761123044, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, tcib_managed=true, config_id=tripleo_step1, vcs-type=git, name=rhosp17/openstack-qdrouterd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, managed_by=tripleo_ansible, architecture=x86_64, batch=17.1_20251118.1, build-date=2025-11-18T22:49:46Z, version=17.1.12, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.component=openstack-qdrouterd-container, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '29d07da103bbd44a9ed3e29999314b03'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, io.buildah.version=1.41.4, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, url=https://www.redhat.com, container_name=metrics_qdr, description=Red Hat OpenStack Platform 17.1 qdrouterd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, summary=Red Hat OpenStack Platform 17.1 qdrouterd, distribution-scope=public, vendor=Red Hat, Inc.)
Dec 15 03:19:15 localhost podman[74124]: 2025-12-15 08:19:15.946276811 +0000 UTC m=+0.276634082 container exec_died 6278d648aec9c9f12e0efaafa0d6ae5653eeb34459516699e562eeb9c8d23b3c (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, container_name=metrics_qdr, com.redhat.component=openstack-qdrouterd-container, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-11-18T22:49:46Z, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, url=https://www.redhat.com, konflux.additional-tags=17.1.12 17.1_20251118.1, vendor=Red Hat, Inc., vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, io.openshift.expose-services=, name=rhosp17/openstack-qdrouterd, tcib_managed=true, batch=17.1_20251118.1, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, description=Red Hat OpenStack Platform 17.1 qdrouterd, distribution-scope=public, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '29d07da103bbd44a9ed3e29999314b03'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, io.buildah.version=1.41.4, release=1761123044, config_id=tripleo_step1, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, summary=Red Hat OpenStack Platform 17.1 qdrouterd)
Dec 15 03:19:15 localhost systemd[1]: 6278d648aec9c9f12e0efaafa0d6ae5653eeb34459516699e562eeb9c8d23b3c.service: Deactivated successfully.
Dec 15 03:19:24 localhost sshd[74153]: main: sshd: ssh-rsa algorithm is disabled
Dec 15 03:19:27 localhost systemd[1]: Starting Check and recover tripleo_nova_virtqemud...
Dec 15 03:19:27 localhost recover_tripleo_nova_virtqemud[74156]: 61849
Dec 15 03:19:27 localhost systemd[1]: tripleo_nova_virtqemud_recover.service: Deactivated successfully.
Dec 15 03:19:27 localhost systemd[1]: Finished Check and recover tripleo_nova_virtqemud.
Dec 15 03:19:30 localhost systemd[1]: Started /usr/bin/podman healthcheck run 165a6fd1f2b629d16da0798ec1c9bac41d302d530a0ffa668b2315a52d6aa04e.
Dec 15 03:19:30 localhost podman[74157]: 2025-12-15 08:19:30.734471944 +0000 UTC m=+0.068958235 container health_status 165a6fd1f2b629d16da0798ec1c9bac41d302d530a0ffa668b2315a52d6aa04e (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, release=1761123044, name=rhosp17/openstack-collectd, summary=Red Hat OpenStack Platform 17.1 collectd, build-date=2025-11-18T22:51:28Z, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, tcib_managed=true, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, version=17.1.12, batch=17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 collectd, konflux.additional-tags=17.1.12 17.1_20251118.1, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.buildah.version=1.41.4, distribution-scope=public, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.component=openstack-collectd-container, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, container_name=collectd, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, vendor=Red Hat, Inc., architecture=x86_64, config_id=tripleo_step3)
Dec 15 03:19:30 localhost podman[74157]: 2025-12-15 08:19:30.7496317 +0000 UTC m=+0.084117951 container exec_died 165a6fd1f2b629d16da0798ec1c9bac41d302d530a0ffa668b2315a52d6aa04e (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, build-date=2025-11-18T22:51:28Z, name=rhosp17/openstack-collectd, batch=17.1_20251118.1, com.redhat.component=openstack-collectd-container, konflux.additional-tags=17.1.12 17.1_20251118.1, version=17.1.12, container_name=collectd, architecture=x86_64, config_id=tripleo_step3, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 collectd, io.buildah.version=1.41.4, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 collectd, release=1761123044, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, managed_by=tripleo_ansible, tcib_managed=true, io.openshift.expose-services=, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, url=https://www.redhat.com)
Dec 15 03:19:30 localhost systemd[1]: 165a6fd1f2b629d16da0798ec1c9bac41d302d530a0ffa668b2315a52d6aa04e.service: Deactivated successfully.
Dec 15 03:19:32 localhost systemd[1]: Started /usr/bin/podman healthcheck run 2649c3aec9af2f04509e1d248b1050b202fccc3ed2b86421c89a2a65376db057.
Dec 15 03:19:32 localhost podman[74177]: 2025-12-15 08:19:32.757207828 +0000 UTC m=+0.084931043 container health_status 2649c3aec9af2f04509e1d248b1050b202fccc3ed2b86421c89a2a65376db057 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, vendor=Red Hat, Inc., io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.component=openstack-iscsid-container, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, build-date=2025-11-18T23:44:13Z, container_name=iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, description=Red Hat OpenStack Platform 17.1 iscsid, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, config_id=tripleo_step3, name=rhosp17/openstack-iscsid, version=17.1.12, architecture=x86_64, batch=17.1_20251118.1, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, url=https://www.redhat.com, distribution-scope=public, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, vcs-type=git, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '182e509007ab5e6e5b2500a552cbd5ba'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1761123044, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 17.1_20251118.1, tcib_managed=true)
Dec 15 03:19:32 localhost podman[74177]: 2025-12-15 08:19:32.797431495 +0000 UTC m=+0.125154700 container exec_died 2649c3aec9af2f04509e1d248b1050b202fccc3ed2b86421c89a2a65376db057 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, release=1761123044, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, batch=17.1_20251118.1, url=https://www.redhat.com, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '182e509007ab5e6e5b2500a552cbd5ba'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, config_id=tripleo_step3, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 iscsid, version=17.1.12, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, container_name=iscsid, distribution-scope=public, build-date=2025-11-18T23:44:13Z, vendor=Red Hat, Inc., name=rhosp17/openstack-iscsid, com.redhat.component=openstack-iscsid-container, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, konflux.additional-tags=17.1.12 17.1_20251118.1, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, managed_by=tripleo_ansible, io.buildah.version=1.41.4)
Dec 15 03:19:32 localhost systemd[1]: 2649c3aec9af2f04509e1d248b1050b202fccc3ed2b86421c89a2a65376db057.service: Deactivated successfully.
Dec 15 03:19:32 localhost systemd[1]: Started /usr/bin/podman healthcheck run ac45612fab357b615b4684dbfa5bd000dd29eb1adfbaedb6db0802fc49e89549.
Dec 15 03:19:32 localhost podman[74197]: 2025-12-15 08:19:32.922411558 +0000 UTC m=+0.083128734 container health_status ac45612fab357b615b4684dbfa5bd000dd29eb1adfbaedb6db0802fc49e89549 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, architecture=x86_64, konflux.additional-tags=17.1.12 17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 cron, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, batch=17.1_20251118.1, build-date=2025-11-18T22:49:32Z, vendor=Red Hat, Inc., vcs-type=git, io.buildah.version=1.41.4, version=17.1.12, url=https://www.redhat.com, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, container_name=logrotate_crond, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, summary=Red Hat OpenStack Platform 17.1 cron, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.component=openstack-cron-container, config_id=tripleo_step4, release=1761123044, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, name=rhosp17/openstack-cron, tcib_managed=true)
Dec 15 03:19:32 localhost podman[74197]: 2025-12-15 08:19:32.93334311 +0000 UTC m=+0.094060286 container exec_died ac45612fab357b615b4684dbfa5bd000dd29eb1adfbaedb6db0802fc49e89549 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, url=https://www.redhat.com, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, release=1761123044, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, batch=17.1_20251118.1, io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, summary=Red Hat OpenStack Platform 17.1 cron, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 cron, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, build-date=2025-11-18T22:49:32Z, distribution-scope=public, architecture=x86_64, name=rhosp17/openstack-cron, version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, maintainer=OpenStack TripleO Team, vcs-type=git, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.component=openstack-cron-container, io.openshift.expose-services=, config_id=tripleo_step4, container_name=logrotate_crond)
Dec 15 03:19:32 localhost systemd[1]: ac45612fab357b615b4684dbfa5bd000dd29eb1adfbaedb6db0802fc49e89549.service: Deactivated successfully.
Dec 15 03:19:33 localhost systemd[1]: Started /usr/bin/podman healthcheck run 4c3f871f4bdf287b1d3ce717956c1cf130617489217410eadf6fffcc440eb3b0.
Dec 15 03:19:33 localhost systemd[1]: Started /usr/bin/podman healthcheck run 97115ff7c5a84ae7e17f79e696e94da3c63f9fe0051287fe9193bec282a03f99.
Dec 15 03:19:33 localhost systemd[1]: Started /usr/bin/podman healthcheck run d6041e5471624a495d5be42684f6049ccf71802a80d5760239330151d465d146.
Dec 15 03:19:33 localhost podman[74216]: 2025-12-15 08:19:33.764428375 +0000 UTC m=+0.093863733 container health_status 4c3f871f4bdf287b1d3ce717956c1cf130617489217410eadf6fffcc440eb3b0 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20251118.1, build-date=2025-11-19T00:36:58Z, io.openshift.expose-services=, container_name=nova_migration_target, managed_by=tripleo_ansible, com.redhat.component=openstack-nova-compute-container, config_id=tripleo_step4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 17.1_20251118.1, version=17.1.12, release=1761123044, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, architecture=x86_64, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp17/openstack-nova-compute, vendor=Red Hat, Inc., tcib_managed=true, io.buildah.version=1.41.4, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '879500e96bf8dfb93687004bd86f2317'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute) Dec 15 03:19:33 localhost systemd[1]: tmp-crun.d4Rkc9.mount: Deactivated successfully. Dec 15 03:19:33 localhost podman[74217]: 2025-12-15 08:19:33.859234291 +0000 UTC m=+0.182264317 container health_status 97115ff7c5a84ae7e17f79e696e94da3c63f9fe0051287fe9193bec282a03f99 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, distribution-scope=public, io.buildah.version=1.41.4, release=1761123044, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.12, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, build-date=2025-11-19T00:12:45Z, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, config_id=tripleo_step4, vendor=Red Hat, Inc., config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'fcee5a4a91f85471fca7b61211375646'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 
'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, architecture=x86_64, tcib_managed=true, konflux.additional-tags=17.1.12 17.1_20251118.1, container_name=ceilometer_agent_ipmi, batch=17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, name=rhosp17/openstack-ceilometer-ipmi, managed_by=tripleo_ansible, vcs-type=git, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.component=openstack-ceilometer-ipmi-container, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676) Dec 15 03:19:33 localhost podman[74218]: 2025-12-15 08:19:33.837199972 +0000 UTC m=+0.158005218 container health_status d6041e5471624a495d5be42684f6049ccf71802a80d5760239330151d465d146 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-ceilometer-compute-container, vendor=Red Hat, 
Inc., vcs-type=git, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, release=1761123044, io.buildah.version=1.41.4, version=17.1.12, distribution-scope=public, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'fcee5a4a91f85471fca7b61211375646'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, batch=17.1_20251118.1, tcib_managed=true, url=https://www.redhat.com, container_name=ceilometer_agent_compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, config_id=tripleo_step4, maintainer=OpenStack TripleO Team, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, build-date=2025-11-19T00:11:48Z, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, name=rhosp17/openstack-ceilometer-compute, konflux.additional-tags=17.1.12 17.1_20251118.1) Dec 15 03:19:33 localhost podman[74217]: 2025-12-15 08:19:33.888341809 +0000 UTC m=+0.211371855 container exec_died 97115ff7c5a84ae7e17f79e696e94da3c63f9fe0051287fe9193bec282a03f99 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'fcee5a4a91f85471fca7b61211375646'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, batch=17.1_20251118.1, tcib_managed=true, com.redhat.component=openstack-ceilometer-ipmi-container, distribution-scope=public, vcs-type=git, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, vendor=Red Hat, Inc., url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.k8s.description=Red 
Hat OpenStack Platform 17.1 ceilometer-ipmi, version=17.1.12, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.12 17.1_20251118.1, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, build-date=2025-11-19T00:12:45Z, io.openshift.expose-services=, container_name=ceilometer_agent_ipmi, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, config_id=tripleo_step4, managed_by=tripleo_ansible, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, io.buildah.version=1.41.4, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, release=1761123044, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-ceilometer-ipmi, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream) Dec 15 03:19:33 localhost podman[74218]: 2025-12-15 08:19:33.920438339 +0000 UTC m=+0.241243565 container exec_died d6041e5471624a495d5be42684f6049ccf71802a80d5760239330151d465d146 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, distribution-scope=public, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-ceilometer-compute-container, config_id=tripleo_step4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-type=git, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'fcee5a4a91f85471fca7b61211375646'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.buildah.version=1.41.4, architecture=x86_64, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, konflux.additional-tags=17.1.12 17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, url=https://www.redhat.com, version=17.1.12, build-date=2025-11-19T00:11:48Z, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, vendor=Red Hat, Inc., release=1761123044, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, batch=17.1_20251118.1, name=rhosp17/openstack-ceilometer-compute, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, managed_by=tripleo_ansible, container_name=ceilometer_agent_compute) Dec 15 03:19:33 localhost systemd[1]: d6041e5471624a495d5be42684f6049ccf71802a80d5760239330151d465d146.service: Deactivated successfully. Dec 15 03:19:33 localhost systemd[1]: 97115ff7c5a84ae7e17f79e696e94da3c63f9fe0051287fe9193bec282a03f99.service: Deactivated successfully. 
Dec 15 03:19:34 localhost podman[74216]: 2025-12-15 08:19:34.139428657 +0000 UTC m=+0.468864005 container exec_died 4c3f871f4bdf287b1d3ce717956c1cf130617489217410eadf6fffcc440eb3b0 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, com.redhat.component=openstack-nova-compute-container, name=rhosp17/openstack-nova-compute, vendor=Red Hat, Inc., io.openshift.expose-services=, version=17.1.12, container_name=nova_migration_target, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 nova-compute, url=https://www.redhat.com, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.buildah.version=1.41.4, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '879500e96bf8dfb93687004bd86f2317'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, summary=Red Hat OpenStack Platform 
17.1 nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.12 17.1_20251118.1, batch=17.1_20251118.1, release=1761123044, vcs-type=git, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public, maintainer=OpenStack TripleO Team, tcib_managed=true, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, config_id=tripleo_step4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2025-11-19T00:36:58Z) Dec 15 03:19:34 localhost systemd[1]: 4c3f871f4bdf287b1d3ce717956c1cf130617489217410eadf6fffcc440eb3b0.service: Deactivated successfully. Dec 15 03:19:35 localhost python3[74338]: ansible-ansible.legacy.stat Invoked with path=/etc/puppet/hieradata/config_step.json follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True Dec 15 03:19:36 localhost python3[74383]: ansible-ansible.legacy.copy Invoked with dest=/etc/puppet/hieradata/config_step.json force=True mode=0600 src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1765786775.3972151-113156-136293857562965/source _original_basename=tmpn00l68az follow=False checksum=039e0b234f00fbd1242930f0d5dc67e8b4c067fe backup=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Dec 15 03:19:37 localhost systemd[1]: Started /usr/bin/podman healthcheck run 2df8d20c6ba32f38bd0e6519c9c77a4146b632f21c81197d8c319b223aad81f1. Dec 15 03:19:37 localhost systemd[1]: Started /usr/bin/podman healthcheck run 4d35b7dc3f3a4e3c60661e85d3511f50804e87244d74cab67af5c6e8c447d379. 
Dec 15 03:19:37 localhost podman[74415]: 2025-12-15 08:19:37.237951301 +0000 UTC m=+0.086642479 container health_status 4d35b7dc3f3a4e3c60661e85d3511f50804e87244d74cab67af5c6e8c447d379 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, io.buildah.version=1.41.4, konflux.additional-tags=17.1.12 17.1_20251118.1, build-date=2025-11-19T00:14:25Z, batch=17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp17/openstack-neutron-metadata-agent-ovn, distribution-scope=public, io.openshift.expose-services=, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'a56a6f14b467cd9064e40c03defa5ed7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, container_name=ovn_metadata_agent, config_id=tripleo_step4, vcs-type=git, version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, tcib_managed=true, managed_by=tripleo_ansible, architecture=x86_64, maintainer=OpenStack TripleO Team, release=1761123044) Dec 15 03:19:37 localhost podman[74415]: 2025-12-15 08:19:37.289279444 +0000 UTC m=+0.137970522 container exec_died 4d35b7dc3f3a4e3c60661e85d3511f50804e87244d74cab67af5c6e8c447d379 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, release=1761123044, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, konflux.additional-tags=17.1.12 17.1_20251118.1, distribution-scope=public, name=rhosp17/openstack-neutron-metadata-agent-ovn, architecture=x86_64, version=17.1.12, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 
'TRIPLEO_CONFIG_HASH': 'a56a6f14b467cd9064e40c03defa5ed7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, url=https://www.redhat.com, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, vcs-type=git, container_name=ovn_metadata_agent, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, batch=17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.expose-services=, config_id=tripleo_step4, 
vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, build-date=2025-11-19T00:14:25Z) Dec 15 03:19:37 localhost systemd[1]: 4d35b7dc3f3a4e3c60661e85d3511f50804e87244d74cab67af5c6e8c447d379.service: Deactivated successfully. Dec 15 03:19:37 localhost python3[74414]: ansible-stat Invoked with path=/var/lib/tripleo-config/container-startup-config/step_5 follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1 Dec 15 03:19:37 localhost podman[74413]: 2025-12-15 08:19:37.289077939 +0000 UTC m=+0.142486953 container health_status 2df8d20c6ba32f38bd0e6519c9c77a4146b632f21c81197d8c319b223aad81f1 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, release=1761123044, managed_by=tripleo_ansible, name=rhosp17/openstack-ovn-controller, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, vcs-type=git, container_name=ovn_controller, summary=Red Hat OpenStack Platform 17.1 ovn-controller, vendor=Red Hat, Inc., com.redhat.component=openstack-ovn-controller-container, io.openshift.expose-services=, architecture=x86_64, 
version=17.1.12, description=Red Hat OpenStack Platform 17.1 ovn-controller, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public, batch=17.1_20251118.1, konflux.additional-tags=17.1.12 17.1_20251118.1, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, tcib_managed=true, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-11-18T23:34:05Z, config_id=tripleo_step4) Dec 15 03:19:37 localhost podman[74413]: 2025-12-15 08:19:37.374464443 +0000 UTC m=+0.227873427 container exec_died 2df8d20c6ba32f38bd0e6519c9c77a4146b632f21c81197d8c319b223aad81f1 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, description=Red Hat OpenStack Platform 17.1 ovn-controller, container_name=ovn_controller, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, tcib_managed=true, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.expose-services=, config_id=tripleo_step4, com.redhat.component=openstack-ovn-controller-container, release=1761123044, version=17.1.12, summary=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-type=git, io.buildah.version=1.41.4, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, build-date=2025-11-18T23:34:05Z, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, config_data={'depends_on': ['openvswitch.service'], 'environment': 
{'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, architecture=x86_64, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, managed_by=tripleo_ansible, name=rhosp17/openstack-ovn-controller, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1, konflux.additional-tags=17.1.12 17.1_20251118.1) Dec 15 03:19:37 localhost systemd[1]: 2df8d20c6ba32f38bd0e6519c9c77a4146b632f21c81197d8c319b223aad81f1.service: Deactivated successfully. Dec 15 03:19:39 localhost ansible-async_wrapper.py[74632]: Invoked with 217584232926 3600 /home/tripleo-admin/.ansible/tmp/ansible-tmp-1765786778.510489-113308-206891761685342/AnsiballZ_command.py _ Dec 15 03:19:39 localhost ansible-async_wrapper.py[74635]: Starting module and watcher Dec 15 03:19:39 localhost ansible-async_wrapper.py[74635]: Start watching 74636 (3600) Dec 15 03:19:39 localhost ansible-async_wrapper.py[74636]: Start module (74636) Dec 15 03:19:39 localhost ansible-async_wrapper.py[74632]: Return async_wrapper task started. Dec 15 03:19:39 localhost python3[74656]: ansible-ansible.legacy.async_status Invoked with jid=217584232926.74632 mode=status _async_dir=/tmp/.ansible_async Dec 15 03:19:43 localhost puppet-user[74655]: Warning: /etc/puppet/hiera.yaml: Use of 'hiera.yaml' version 3 is deprecated. 
It should be converted to version 5 Dec 15 03:19:43 localhost puppet-user[74655]: (file: /etc/puppet/hiera.yaml) Dec 15 03:19:43 localhost puppet-user[74655]: Warning: Undefined variable '::deploy_config_name'; Dec 15 03:19:43 localhost puppet-user[74655]: (file & line not available) Dec 15 03:19:43 localhost puppet-user[74655]: Warning: The function 'hiera' is deprecated in favor of using 'lookup'. See https://puppet.com/docs/puppet/7.10/deprecated_language.html Dec 15 03:19:43 localhost puppet-user[74655]: (file & line not available) Dec 15 03:19:43 localhost puppet-user[74655]: Warning: Unknown variable: '::deployment_type'. (file: /etc/puppet/modules/tripleo/manifests/profile/base/database/mysql/client.pp, line: 89, column: 8) Dec 15 03:19:43 localhost puppet-user[74655]: Warning: This method is deprecated, please use match expressions with Stdlib::Compat::String instead. They are described at https://docs.puppet.com/puppet/latest/reference/lang_data_type.html#match-expressions. at ["/etc/puppet/modules/snmp/manifests/params.pp", 310]:["/var/lib/tripleo-config/puppet_step_config.pp", 4] Dec 15 03:19:43 localhost puppet-user[74655]: (location: /etc/puppet/modules/stdlib/lib/puppet/functions/deprecation.rb:34:in `deprecation') Dec 15 03:19:43 localhost puppet-user[74655]: Warning: This method is deprecated, please use the stdlib validate_legacy function, Dec 15 03:19:43 localhost puppet-user[74655]: with Stdlib::Compat::Bool. There is further documentation for validate_legacy function in the README. at ["/etc/puppet/modules/snmp/manifests/init.pp", 358]:["/var/lib/tripleo-config/puppet_step_config.pp", 4] Dec 15 03:19:43 localhost puppet-user[74655]: (location: /etc/puppet/modules/stdlib/lib/puppet/functions/deprecation.rb:34:in `deprecation') Dec 15 03:19:43 localhost puppet-user[74655]: Warning: This method is deprecated, please use the stdlib validate_legacy function, Dec 15 03:19:43 localhost puppet-user[74655]: with Stdlib::Compat::Array. 
There is further documentation for validate_legacy function in the README. at ["/etc/puppet/modules/snmp/manifests/init.pp", 367]:["/var/lib/tripleo-config/puppet_step_config.pp", 4] Dec 15 03:19:43 localhost puppet-user[74655]: (location: /etc/puppet/modules/stdlib/lib/puppet/functions/deprecation.rb:34:in `deprecation') Dec 15 03:19:43 localhost puppet-user[74655]: Warning: This method is deprecated, please use the stdlib validate_legacy function, Dec 15 03:19:43 localhost puppet-user[74655]: with Stdlib::Compat::String. There is further documentation for validate_legacy function in the README. at ["/etc/puppet/modules/snmp/manifests/init.pp", 382]:["/var/lib/tripleo-config/puppet_step_config.pp", 4] Dec 15 03:19:43 localhost puppet-user[74655]: (location: /etc/puppet/modules/stdlib/lib/puppet/functions/deprecation.rb:34:in `deprecation') Dec 15 03:19:43 localhost puppet-user[74655]: Warning: This method is deprecated, please use the stdlib validate_legacy function, Dec 15 03:19:43 localhost puppet-user[74655]: with Stdlib::Compat::Numeric. There is further documentation for validate_legacy function in the README. at ["/etc/puppet/modules/snmp/manifests/init.pp", 388]:["/var/lib/tripleo-config/puppet_step_config.pp", 4] Dec 15 03:19:43 localhost puppet-user[74655]: (location: /etc/puppet/modules/stdlib/lib/puppet/functions/deprecation.rb:34:in `deprecation') Dec 15 03:19:43 localhost puppet-user[74655]: Warning: This method is deprecated, please use the stdlib validate_legacy function, Dec 15 03:19:43 localhost puppet-user[74655]: with Pattern[]. There is further documentation for validate_legacy function in the README. at ["/etc/puppet/modules/snmp/manifests/init.pp", 393]:["/var/lib/tripleo-config/puppet_step_config.pp", 4] Dec 15 03:19:43 localhost puppet-user[74655]: (location: /etc/puppet/modules/stdlib/lib/puppet/functions/deprecation.rb:34:in `deprecation') Dec 15 03:19:43 localhost puppet-user[74655]: Warning: Unknown variable: '::deployment_type'. 
(file: /etc/puppet/modules/tripleo/manifests/packages.pp, line: 39, column: 69) Dec 15 03:19:43 localhost puppet-user[74655]: Notice: Compiled catalog for np0005559462.localdomain in environment production in 0.32 seconds Dec 15 03:19:44 localhost puppet-user[74655]: Notice: Applied catalog in 0.26 seconds Dec 15 03:19:44 localhost puppet-user[74655]: Application: Dec 15 03:19:44 localhost puppet-user[74655]: Initial environment: production Dec 15 03:19:44 localhost puppet-user[74655]: Converged environment: production Dec 15 03:19:44 localhost puppet-user[74655]: Run mode: user Dec 15 03:19:44 localhost puppet-user[74655]: Changes: Dec 15 03:19:44 localhost puppet-user[74655]: Events: Dec 15 03:19:44 localhost puppet-user[74655]: Resources: Dec 15 03:19:44 localhost puppet-user[74655]: Total: 19 Dec 15 03:19:44 localhost puppet-user[74655]: Time: Dec 15 03:19:44 localhost puppet-user[74655]: Schedule: 0.00 Dec 15 03:19:44 localhost puppet-user[74655]: Package: 0.00 Dec 15 03:19:44 localhost puppet-user[74655]: Exec: 0.01 Dec 15 03:19:44 localhost puppet-user[74655]: Augeas: 0.01 Dec 15 03:19:44 localhost puppet-user[74655]: File: 0.03 Dec 15 03:19:44 localhost puppet-user[74655]: Service: 0.07 Dec 15 03:19:44 localhost puppet-user[74655]: Transaction evaluation: 0.25 Dec 15 03:19:44 localhost puppet-user[74655]: Catalog application: 0.26 Dec 15 03:19:44 localhost puppet-user[74655]: Config retrieval: 0.38 Dec 15 03:19:44 localhost puppet-user[74655]: Last run: 1765786784 Dec 15 03:19:44 localhost puppet-user[74655]: Filebucket: 0.00 Dec 15 03:19:44 localhost puppet-user[74655]: Total: 0.27 Dec 15 03:19:44 localhost puppet-user[74655]: Version: Dec 15 03:19:44 localhost puppet-user[74655]: Config: 1765786783 Dec 15 03:19:44 localhost puppet-user[74655]: Puppet: 7.10.0 Dec 15 03:19:44 localhost ansible-async_wrapper.py[74635]: 74636 still running (3600) Dec 15 03:19:44 localhost ansible-async_wrapper.py[74636]: Module complete (74636) Dec 15 03:19:46 localhost 
systemd[1]: Started /usr/bin/podman healthcheck run 6278d648aec9c9f12e0efaafa0d6ae5653eeb34459516699e562eeb9c8d23b3c. Dec 15 03:19:46 localhost podman[74780]: 2025-12-15 08:19:46.755009907 +0000 UTC m=+0.083322591 container health_status 6278d648aec9c9f12e0efaafa0d6ae5653eeb34459516699e562eeb9c8d23b3c (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, version=17.1.12, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, config_id=tripleo_step1, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, maintainer=OpenStack TripleO Team, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '29d07da103bbd44a9ed3e29999314b03'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, summary=Red Hat OpenStack Platform 17.1 qdrouterd, tcib_managed=true, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, managed_by=tripleo_ansible, com.redhat.component=openstack-qdrouterd-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, 
batch=17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, name=rhosp17/openstack-qdrouterd, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, architecture=x86_64, io.buildah.version=1.41.4, url=https://www.redhat.com, container_name=metrics_qdr, konflux.additional-tags=17.1.12 17.1_20251118.1, build-date=2025-11-18T22:49:46Z, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, release=1761123044, description=Red Hat OpenStack Platform 17.1 qdrouterd) Dec 15 03:19:46 localhost podman[74780]: 2025-12-15 08:19:46.970961774 +0000 UTC m=+0.299274398 container exec_died 6278d648aec9c9f12e0efaafa0d6ae5653eeb34459516699e562eeb9c8d23b3c (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, url=https://www.redhat.com, batch=17.1_20251118.1, io.openshift.expose-services=, build-date=2025-11-18T22:49:46Z, release=1761123044, com.redhat.component=openstack-qdrouterd-container, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, konflux.additional-tags=17.1.12 17.1_20251118.1, config_id=tripleo_step1, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, architecture=x86_64, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '29d07da103bbd44a9ed3e29999314b03'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': 
['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 qdrouterd, container_name=metrics_qdr, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 qdrouterd, tcib_managed=true, name=rhosp17/openstack-qdrouterd, version=17.1.12, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.buildah.version=1.41.4) Dec 15 03:19:46 localhost systemd[1]: 6278d648aec9c9f12e0efaafa0d6ae5653eeb34459516699e562eeb9c8d23b3c.service: Deactivated successfully. Dec 15 03:19:49 localhost ansible-async_wrapper.py[74635]: Done in kid B. 
Dec 15 03:19:49 localhost python3[74825]: ansible-ansible.legacy.async_status Invoked with jid=217584232926.74632 mode=status _async_dir=/tmp/.ansible_async Dec 15 03:19:50 localhost python3[74841]: ansible-file Invoked with path=/var/lib/container-puppet/puppetlabs state=directory setype=svirt_sandbox_file_t selevel=s0 recurse=True force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None attributes=None Dec 15 03:19:50 localhost python3[74857]: ansible-stat Invoked with path=/var/lib/container-puppet/puppetlabs/facter.conf follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1 Dec 15 03:19:51 localhost python3[74907]: ansible-ansible.legacy.stat Invoked with path=/var/lib/container-puppet/puppetlabs/facter.conf follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True Dec 15 03:19:51 localhost python3[74925]: ansible-ansible.legacy.file Invoked with setype=svirt_sandbox_file_t selevel=s0 dest=/var/lib/container-puppet/puppetlabs/facter.conf _original_basename=tmpwafnovyo recurse=False state=file path=/var/lib/container-puppet/puppetlabs/facter.conf force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None attributes=None Dec 15 03:19:51 localhost python3[74955]: ansible-file Invoked with path=/opt/puppetlabs/facter state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None 
selevel=None setype=None attributes=None Dec 15 03:19:53 localhost python3[75060]: ansible-ansible.posix.synchronize Invoked with src=/opt/puppetlabs/ dest=/var/lib/container-puppet/puppetlabs/ _local_rsync_path=rsync _local_rsync_password=NOT_LOGGING_PARAMETER rsync_path=None delete=False _substitute_controller=False archive=True checksum=False compress=True existing_only=False dirs=False copy_links=False set_remote_user=True rsync_timeout=0 rsync_opts=[] ssh_connection_multiplexing=False partial=False verify_host=False mode=push dest_port=None private_key=None recursive=None links=None perms=None times=None owner=None group=None ssh_args=None link_dest=None Dec 15 03:19:53 localhost python3[75109]: ansible-file Invoked with path=/var/log/containers/stdouts state=directory owner=root group=root recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None setype=None attributes=None Dec 15 03:19:54 localhost python3[75188]: ansible-stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1 Dec 15 03:19:55 localhost python3[75238]: ansible-ansible.legacy.stat Invoked with path=/usr/libexec/tripleo-container-shutdown follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True Dec 15 03:19:55 localhost python3[75256]: ansible-ansible.legacy.file Invoked with mode=0700 owner=root group=root dest=/usr/libexec/tripleo-container-shutdown _original_basename=tripleo-container-shutdown recurse=False state=file path=/usr/libexec/tripleo-container-shutdown force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None 
seuser=None serole=None selevel=None setype=None attributes=None Dec 15 03:19:56 localhost python3[75318]: ansible-ansible.legacy.stat Invoked with path=/usr/libexec/tripleo-start-podman-container follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True Dec 15 03:19:56 localhost python3[75336]: ansible-ansible.legacy.file Invoked with mode=0700 owner=root group=root dest=/usr/libexec/tripleo-start-podman-container _original_basename=tripleo-start-podman-container recurse=False state=file path=/usr/libexec/tripleo-start-podman-container force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None Dec 15 03:19:57 localhost python3[75398]: ansible-ansible.legacy.stat Invoked with path=/usr/lib/systemd/system/tripleo-container-shutdown.service follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True Dec 15 03:19:57 localhost python3[75416]: ansible-ansible.legacy.file Invoked with mode=0644 owner=root group=root dest=/usr/lib/systemd/system/tripleo-container-shutdown.service _original_basename=tripleo-container-shutdown-service recurse=False state=file path=/usr/lib/systemd/system/tripleo-container-shutdown.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None Dec 15 03:19:57 localhost python3[75478]: ansible-ansible.legacy.stat Invoked with path=/usr/lib/systemd/system-preset/91-tripleo-container-shutdown.preset follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True Dec 15 03:19:58 localhost python3[75496]: ansible-ansible.legacy.file Invoked with 
mode=0644 owner=root group=root dest=/usr/lib/systemd/system-preset/91-tripleo-container-shutdown.preset _original_basename=91-tripleo-container-shutdown-preset recurse=False state=file path=/usr/lib/systemd/system-preset/91-tripleo-container-shutdown.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None Dec 15 03:19:58 localhost python3[75526]: ansible-systemd Invoked with name=tripleo-container-shutdown state=started enabled=True daemon_reload=True daemon_reexec=False scope=system no_block=False force=None masked=None Dec 15 03:19:58 localhost systemd[1]: Reloading. Dec 15 03:19:58 localhost systemd-sysv-generator[75550]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. Dec 15 03:19:58 localhost systemd-rc-local-generator[75547]: /etc/rc.d/rc.local is not marked executable, skipping. Dec 15 03:19:58 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. 
Dec 15 03:19:59 localhost python3[75612]: ansible-ansible.legacy.stat Invoked with path=/usr/lib/systemd/system/netns-placeholder.service follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True Dec 15 03:20:00 localhost python3[75630]: ansible-ansible.legacy.file Invoked with mode=0644 owner=root group=root dest=/usr/lib/systemd/system/netns-placeholder.service _original_basename=netns-placeholder-service recurse=False state=file path=/usr/lib/systemd/system/netns-placeholder.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None Dec 15 03:20:00 localhost python3[75692]: ansible-ansible.legacy.stat Invoked with path=/usr/lib/systemd/system-preset/91-netns-placeholder.preset follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True Dec 15 03:20:00 localhost python3[75710]: ansible-ansible.legacy.file Invoked with mode=0644 owner=root group=root dest=/usr/lib/systemd/system-preset/91-netns-placeholder.preset _original_basename=91-netns-placeholder-preset recurse=False state=file path=/usr/lib/systemd/system-preset/91-netns-placeholder.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None Dec 15 03:20:01 localhost systemd[1]: Started /usr/bin/podman healthcheck run 165a6fd1f2b629d16da0798ec1c9bac41d302d530a0ffa668b2315a52d6aa04e. 
Dec 15 03:20:01 localhost podman[75741]: 2025-12-15 08:20:01.161144563 +0000 UTC m=+0.092051544 container health_status 165a6fd1f2b629d16da0798ec1c9bac41d302d530a0ffa668b2315a52d6aa04e (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, release=1761123044, name=rhosp17/openstack-collectd, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, build-date=2025-11-18T22:51:28Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', 
'/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, description=Red Hat OpenStack Platform 17.1 collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, tcib_managed=true, vendor=Red Hat, Inc., vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 collectd, container_name=collectd, distribution-scope=public, vcs-type=git, batch=17.1_20251118.1, io.buildah.version=1.41.4, com.redhat.component=openstack-collectd-container, url=https://www.redhat.com, version=17.1.12, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_id=tripleo_step3) Dec 15 03:20:01 localhost podman[75741]: 2025-12-15 08:20:01.203730912 +0000 UTC m=+0.134637853 container exec_died 165a6fd1f2b629d16da0798ec1c9bac41d302d530a0ffa668b2315a52d6aa04e (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, description=Red Hat OpenStack Platform 17.1 collectd, io.buildah.version=1.41.4, version=17.1.12, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, distribution-scope=public, konflux.additional-tags=17.1.12 17.1_20251118.1, url=https://www.redhat.com, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, managed_by=tripleo_ansible, architecture=x86_64, com.redhat.component=openstack-collectd-container, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, name=rhosp17/openstack-collectd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, 
config_id=tripleo_step3, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, summary=Red Hat OpenStack Platform 17.1 collectd, container_name=collectd, vcs-type=git, vendor=Red Hat, Inc., config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, release=1761123044, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, build-date=2025-11-18T22:51:28Z, batch=17.1_20251118.1) Dec 15 03:20:01 localhost systemd[1]: 165a6fd1f2b629d16da0798ec1c9bac41d302d530a0ffa668b2315a52d6aa04e.service: Deactivated successfully. 
Dec 15 03:20:01 localhost python3[75740]: ansible-systemd Invoked with name=netns-placeholder state=started enabled=True daemon_reload=True daemon_reexec=False scope=system no_block=False force=None masked=None Dec 15 03:20:01 localhost systemd[1]: Reloading. Dec 15 03:20:01 localhost systemd-sysv-generator[75790]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. Dec 15 03:20:01 localhost systemd-rc-local-generator[75785]: /etc/rc.d/rc.local is not marked executable, skipping. Dec 15 03:20:01 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Dec 15 03:20:01 localhost systemd[1]: Starting Create netns directory... Dec 15 03:20:01 localhost systemd[1]: run-netns-placeholder.mount: Deactivated successfully. Dec 15 03:20:01 localhost systemd[1]: netns-placeholder.service: Deactivated successfully. Dec 15 03:20:01 localhost systemd[1]: Finished Create netns directory. Dec 15 03:20:02 localhost python3[75818]: ansible-container_puppet_config Invoked with update_config_hash_only=True no_archive=True check_mode=False config_vol_prefix=/var/lib/config-data debug=False net_host=True puppet_config= short_hostname= step=6 Dec 15 03:20:03 localhost systemd[1]: Started /usr/bin/podman healthcheck run 2649c3aec9af2f04509e1d248b1050b202fccc3ed2b86421c89a2a65376db057. Dec 15 03:20:03 localhost systemd[1]: Started /usr/bin/podman healthcheck run ac45612fab357b615b4684dbfa5bd000dd29eb1adfbaedb6db0802fc49e89549. 
Dec 15 03:20:03 localhost podman[75863]: 2025-12-15 08:20:03.766499643 +0000 UTC m=+0.093667477 container health_status ac45612fab357b615b4684dbfa5bd000dd29eb1adfbaedb6db0802fc49e89549 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, version=17.1.12, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, build-date=2025-11-18T22:49:32Z, architecture=x86_64, io.buildah.version=1.41.4, name=rhosp17/openstack-cron, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, summary=Red Hat OpenStack Platform 17.1 cron, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step4, url=https://www.redhat.com, batch=17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 cron, release=1761123044, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., vcs-type=git, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, container_name=logrotate_crond, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.expose-services=, distribution-scope=public, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, com.redhat.component=openstack-cron-container, konflux.additional-tags=17.1.12 17.1_20251118.1) Dec 15 03:20:03 localhost systemd[1]: tmp-crun.4AAKq6.mount: Deactivated successfully. Dec 15 03:20:03 localhost podman[75862]: 2025-12-15 08:20:03.815949466 +0000 UTC m=+0.146249054 container health_status 2649c3aec9af2f04509e1d248b1050b202fccc3ed2b86421c89a2a65376db057 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, tcib_managed=true, release=1761123044, io.openshift.expose-services=, batch=17.1_20251118.1, vendor=Red Hat, Inc., version=17.1.12, io.buildah.version=1.41.4, architecture=x86_64, config_id=tripleo_step3, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, maintainer=OpenStack TripleO Team, vcs-type=git, container_name=iscsid, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, name=rhosp17/openstack-iscsid, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '182e509007ab5e6e5b2500a552cbd5ba'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, build-date=2025-11-18T23:44:13Z, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 iscsid, managed_by=tripleo_ansible, com.redhat.component=openstack-iscsid-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, description=Red Hat OpenStack Platform 17.1 iscsid, distribution-scope=public, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid) Dec 15 03:20:03 localhost podman[75863]: 2025-12-15 08:20:03.827263228 +0000 UTC m=+0.154431062 container exec_died ac45612fab357b615b4684dbfa5bd000dd29eb1adfbaedb6db0802fc49e89549 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, konflux.additional-tags=17.1.12 17.1_20251118.1, batch=17.1_20251118.1, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, description=Red Hat OpenStack Platform 17.1 cron, distribution-scope=public, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-11-18T22:49:32Z, architecture=x86_64, com.redhat.component=openstack-cron-container, container_name=logrotate_crond, managed_by=tripleo_ansible, vcs-type=git, tcib_managed=true, release=1761123044, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, vendor=Red Hat, Inc., io.buildah.version=1.41.4, version=17.1.12, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, name=rhosp17/openstack-cron, url=https://www.redhat.com, config_id=tripleo_step4, maintainer=OpenStack TripleO Team, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 cron, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream) Dec 15 03:20:03 localhost systemd[1]: ac45612fab357b615b4684dbfa5bd000dd29eb1adfbaedb6db0802fc49e89549.service: Deactivated successfully. 
Dec 15 03:20:03 localhost podman[75862]: 2025-12-15 08:20:03.851120487 +0000 UTC m=+0.181420065 container exec_died 2649c3aec9af2f04509e1d248b1050b202fccc3ed2b86421c89a2a65376db057 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, config_id=tripleo_step3, distribution-scope=public, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, build-date=2025-11-18T23:44:13Z, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '182e509007ab5e6e5b2500a552cbd5ba'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, name=rhosp17/openstack-iscsid, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.component=openstack-iscsid-container, vcs-type=git, 
konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, description=Red Hat OpenStack Platform 17.1 iscsid, tcib_managed=true, url=https://www.redhat.com, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, version=17.1.12, batch=17.1_20251118.1, release=1761123044, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, io.openshift.expose-services=, architecture=x86_64) Dec 15 03:20:03 localhost systemd[1]: 2649c3aec9af2f04509e1d248b1050b202fccc3ed2b86421c89a2a65376db057.service: Deactivated successfully. Dec 15 03:20:04 localhost systemd[1]: Started /usr/bin/podman healthcheck run 97115ff7c5a84ae7e17f79e696e94da3c63f9fe0051287fe9193bec282a03f99. Dec 15 03:20:04 localhost systemd[1]: Started /usr/bin/podman healthcheck run d6041e5471624a495d5be42684f6049ccf71802a80d5760239330151d465d146. Dec 15 03:20:04 localhost systemd[1]: Started /usr/bin/podman healthcheck run 4c3f871f4bdf287b1d3ce717956c1cf130617489217410eadf6fffcc440eb3b0. 
Dec 15 03:20:04 localhost podman[75916]: 2025-12-15 08:20:04.192028928 +0000 UTC m=+0.068329250 container health_status 97115ff7c5a84ae7e17f79e696e94da3c63f9fe0051287fe9193bec282a03f99 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, container_name=ceilometer_agent_ipmi, build-date=2025-11-19T00:12:45Z, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'fcee5a4a91f85471fca7b61211375646'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, tcib_managed=true, version=17.1.12, architecture=x86_64, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, managed_by=tripleo_ansible, release=1761123044, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, description=Red Hat 
OpenStack Platform 17.1 ceilometer-ipmi, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, name=rhosp17/openstack-ceilometer-ipmi, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., com.redhat.component=openstack-ceilometer-ipmi-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.expose-services=, config_id=tripleo_step4, batch=17.1_20251118.1, distribution-scope=public) Dec 15 03:20:04 localhost podman[75917]: 2025-12-15 08:20:04.244879981 +0000 UTC m=+0.113950099 container health_status d6041e5471624a495d5be42684f6049ccf71802a80d5760239330151d465d146 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, managed_by=tripleo_ansible, io.buildah.version=1.41.4, config_id=tripleo_step4, distribution-scope=public, batch=17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, io.openshift.expose-services=, build-date=2025-11-19T00:11:48Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-type=git, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, com.redhat.component=openstack-ceilometer-compute-container, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, container_name=ceilometer_agent_compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, architecture=x86_64, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, version=17.1.12, konflux.additional-tags=17.1.12 17.1_20251118.1, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 
'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'fcee5a4a91f85471fca7b61211375646'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, release=1761123044, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp17/openstack-ceilometer-compute, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, vendor=Red Hat, Inc.) 
Dec 15 03:20:04 localhost podman[75917]: 2025-12-15 08:20:04.272055488 +0000 UTC m=+0.141125596 container exec_died d6041e5471624a495d5be42684f6049ccf71802a80d5760239330151d465d146 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, managed_by=tripleo_ansible, build-date=2025-11-19T00:11:48Z, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.12 17.1_20251118.1, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'fcee5a4a91f85471fca7b61211375646'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, release=1761123044, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, version=17.1.12, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, config_id=tripleo_step4, com.redhat.component=openstack-ceilometer-compute-container, container_name=ceilometer_agent_compute, architecture=x86_64, vendor=Red Hat, Inc., distribution-scope=public, io.buildah.version=1.41.4, name=rhosp17/openstack-ceilometer-compute, batch=17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, url=https://www.redhat.com, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.expose-services=) Dec 15 03:20:04 localhost podman[75946]: 2025-12-15 08:20:04.293912973 +0000 UTC m=+0.077956507 container health_status 4c3f871f4bdf287b1d3ce717956c1cf130617489217410eadf6fffcc440eb3b0 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, io.buildah.version=1.41.4, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.component=openstack-nova-compute-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, architecture=x86_64, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '879500e96bf8dfb93687004bd86f2317'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, tcib_managed=true, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.openshift.expose-services=, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, vendor=Red Hat, Inc., container_name=nova_migration_target, release=1761123044, managed_by=tripleo_ansible, config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp17/openstack-nova-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, build-date=2025-11-19T00:36:58Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20251118.1, version=17.1.12) Dec 15 03:20:04 localhost python3[75915]: ansible-tripleo_container_manage Invoked with config_id=tripleo_step5 config_dir=/var/lib/tripleo-config/container-startup-config/step_5 config_patterns=*.json config_overrides={} concurrency=5 log_base_path=/var/log/containers/stdouts debug=False Dec 15 03:20:04 localhost systemd[1]: d6041e5471624a495d5be42684f6049ccf71802a80d5760239330151d465d146.service: Deactivated successfully. 
Dec 15 03:20:04 localhost podman[75916]: 2025-12-15 08:20:04.37414114 +0000 UTC m=+0.250441462 container exec_died 97115ff7c5a84ae7e17f79e696e94da3c63f9fe0051287fe9193bec282a03f99 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'fcee5a4a91f85471fca7b61211375646'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, config_id=tripleo_step4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, name=rhosp17/openstack-ceilometer-ipmi, version=17.1.12, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, vendor=Red Hat, Inc., build-date=2025-11-19T00:12:45Z, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, release=1761123044, 
url=https://www.redhat.com, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-type=git, distribution-scope=public, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-ceilometer-ipmi-container, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, batch=17.1_20251118.1, tcib_managed=true, io.openshift.expose-services=, architecture=x86_64, container_name=ceilometer_agent_ipmi) Dec 15 03:20:04 localhost systemd[1]: 97115ff7c5a84ae7e17f79e696e94da3c63f9fe0051287fe9193bec282a03f99.service: Deactivated successfully. Dec 15 03:20:04 localhost podman[76027]: 2025-12-15 08:20:04.63696719 +0000 UTC m=+0.091566420 container create 36a88083ed613f6871d2631ae462bca308a45ba583333f7cfe4d0b0c40a18ba5 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, container_name=nova_compute, description=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.12 17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 nova-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '182e509007ab5e6e5b2500a552cbd5ba-879500e96bf8dfb93687004bd86f2317'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, com.redhat.component=openstack-nova-compute-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, config_id=tripleo_step5, architecture=x86_64, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, distribution-scope=public, batch=17.1_20251118.1, version=17.1.12, name=rhosp17/openstack-nova-compute, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, release=1761123044, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.openshift.expose-services=, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, 
build-date=2025-11-19T00:36:58Z) Dec 15 03:20:04 localhost podman[75946]: 2025-12-15 08:20:04.682763196 +0000 UTC m=+0.466806750 container exec_died 4c3f871f4bdf287b1d3ce717956c1cf130617489217410eadf6fffcc440eb3b0 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, vendor=Red Hat, Inc., io.openshift.expose-services=, container_name=nova_migration_target, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, release=1761123044, io.buildah.version=1.41.4, maintainer=OpenStack TripleO Team, tcib_managed=true, konflux.additional-tags=17.1.12 17.1_20251118.1, distribution-scope=public, batch=17.1_20251118.1, build-date=2025-11-19T00:36:58Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '879500e96bf8dfb93687004bd86f2317'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, vcs-type=git, 
summary=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.12, name=rhosp17/openstack-nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, description=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.component=openstack-nova-compute-container, config_id=tripleo_step4, url=https://www.redhat.com, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream) Dec 15 03:20:04 localhost systemd[1]: Started libpod-conmon-36a88083ed613f6871d2631ae462bca308a45ba583333f7cfe4d0b0c40a18ba5.scope. Dec 15 03:20:04 localhost podman[76027]: 2025-12-15 08:20:04.593416426 +0000 UTC m=+0.048015656 image pull registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1 Dec 15 03:20:04 localhost systemd[1]: 4c3f871f4bdf287b1d3ce717956c1cf130617489217410eadf6fffcc440eb3b0.service: Deactivated successfully. Dec 15 03:20:04 localhost systemd[1]: Started libcrun container. 
Dec 15 03:20:04 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/52d2cac538be3ecf871864e9c47de12a2631999deebfb497d729204226e684f5/merged/var/lib/libvirt supports timestamps until 2038 (0x7fffffff) Dec 15 03:20:04 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/52d2cac538be3ecf871864e9c47de12a2631999deebfb497d729204226e684f5/merged/var/lib/iscsi supports timestamps until 2038 (0x7fffffff) Dec 15 03:20:04 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/52d2cac538be3ecf871864e9c47de12a2631999deebfb497d729204226e684f5/merged/var/log/nova supports timestamps until 2038 (0x7fffffff) Dec 15 03:20:04 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/52d2cac538be3ecf871864e9c47de12a2631999deebfb497d729204226e684f5/merged/var/lib/nova supports timestamps until 2038 (0x7fffffff) Dec 15 03:20:04 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/52d2cac538be3ecf871864e9c47de12a2631999deebfb497d729204226e684f5/merged/var/lib/kolla/config_files/src-ceph supports timestamps until 2038 (0x7fffffff) Dec 15 03:20:04 localhost systemd[1]: Started /usr/bin/podman healthcheck run 36a88083ed613f6871d2631ae462bca308a45ba583333f7cfe4d0b0c40a18ba5. 
Dec 15 03:20:04 localhost podman[76027]: 2025-12-15 08:20:04.755163163 +0000 UTC m=+0.209762343 container init 36a88083ed613f6871d2631ae462bca308a45ba583333f7cfe4d0b0c40a18ba5 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, managed_by=tripleo_ansible, name=rhosp17/openstack-nova-compute, tcib_managed=true, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '182e509007ab5e6e5b2500a552cbd5ba-879500e96bf8dfb93687004bd86f2317'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', 
'/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, release=1761123044, url=https://www.redhat.com, batch=17.1_20251118.1, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, build-date=2025-11-19T00:36:58Z, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, distribution-scope=public, konflux.additional-tags=17.1.12 17.1_20251118.1, version=17.1.12, vendor=Red Hat, Inc., container_name=nova_compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.buildah.version=1.41.4, config_id=tripleo_step5, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.component=openstack-nova-compute-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute) Dec 15 03:20:04 localhost systemd[1]: tmp-crun.Zfty30.mount: Deactivated successfully. Dec 15 03:20:04 localhost systemd[1]: Started /usr/bin/podman healthcheck run 36a88083ed613f6871d2631ae462bca308a45ba583333f7cfe4d0b0c40a18ba5. 
Dec 15 03:20:04 localhost podman[76027]: 2025-12-15 08:20:04.798962804 +0000 UTC m=+0.253561994 container start 36a88083ed613f6871d2631ae462bca308a45ba583333f7cfe4d0b0c40a18ba5 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, release=1761123044, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, container_name=nova_compute, description=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.12, name=rhosp17/openstack-nova-compute, build-date=2025-11-19T00:36:58Z, summary=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, vendor=Red Hat, Inc., io.buildah.version=1.41.4, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '182e509007ab5e6e5b2500a552cbd5ba-879500e96bf8dfb93687004bd86f2317'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, url=https://www.redhat.com, batch=17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible, config_id=tripleo_step5, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, com.redhat.component=openstack-nova-compute-container) Dec 15 03:20:04 localhost systemd-logind[763]: Existing logind session ID 28 used by new audit session, ignoring. 
Dec 15 03:20:04 localhost python3[75915]: ansible-tripleo_container_manage PODMAN-CONTAINER-DEBUG: podman run --name nova_compute --conmon-pidfile /run/nova_compute.pid --detach=True --env KOLLA_CONFIG_STRATEGY=COPY_ALWAYS --env LIBGUESTFS_BACKEND=direct --env TRIPLEO_CONFIG_HASH=182e509007ab5e6e5b2500a552cbd5ba-879500e96bf8dfb93687004bd86f2317 --healthcheck-command /openstack/healthcheck 5672 --ipc host --label config_id=tripleo_step5 --label container_name=nova_compute --label managed_by=tripleo_ansible --label config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '182e509007ab5e6e5b2500a552cbd5ba-879500e96bf8dfb93687004bd86f2317'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', 
'/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']} --log-driver k8s-file --log-opt path=/var/log/containers/stdouts/nova_compute.log --network host --privileged=True --ulimit nofile=131072 --ulimit memlock=67108864 --user nova --volume /etc/hosts:/etc/hosts:ro --volume /etc/localtime:/etc/localtime:ro --volume /etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro --volume /etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro --volume /etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro --volume /etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro --volume /etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro --volume /dev/log:/dev/log --volume /etc/puppet:/etc/puppet:ro --volume /var/log/containers/nova:/var/log/nova --volume /etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro --volume /var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro --volume /var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro --volume /var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro --volume /var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z --volume /dev:/dev --volume /lib/modules:/lib/modules:ro --volume /run:/run --volume /run/nova:/run/nova:z --volume /var/lib/iscsi:/var/lib/iscsi:z --volume /var/lib/libvirt:/var/lib/libvirt:shared --volume /sys/class/net:/sys/class/net --volume /sys/bus/pci:/sys/bus/pci --volume /boot:/boot:ro --volume /var/lib/nova:/var/lib/nova:shared registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1 Dec 15 03:20:04 localhost systemd[1]: Created slice User Slice of UID 0. Dec 15 03:20:04 localhost systemd[1]: Starting User Runtime Directory /run/user/0... 
Dec 15 03:20:04 localhost systemd[1]: Finished User Runtime Directory /run/user/0. Dec 15 03:20:04 localhost systemd[1]: Starting User Manager for UID 0... Dec 15 03:20:04 localhost podman[76049]: 2025-12-15 08:20:04.912194054 +0000 UTC m=+0.102375810 container health_status 36a88083ed613f6871d2631ae462bca308a45ba583333f7cfe4d0b0c40a18ba5 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=starting, url=https://www.redhat.com, io.openshift.expose-services=, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, distribution-scope=public, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, architecture=x86_64, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-19T00:36:58Z, vcs-type=git, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, name=rhosp17/openstack-nova-compute, tcib_managed=true, release=1761123044, summary=Red Hat OpenStack Platform 17.1 nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., konflux.additional-tags=17.1.12 17.1_20251118.1, config_id=tripleo_step5, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.component=openstack-nova-compute-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, container_name=nova_compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '182e509007ab5e6e5b2500a552cbd5ba-879500e96bf8dfb93687004bd86f2317'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 
'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, batch=17.1_20251118.1, version=17.1.12, io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute) Dec 15 03:20:04 localhost podman[76049]: 2025-12-15 08:20:04.967435171 +0000 UTC m=+0.157616947 container exec_died 36a88083ed613f6871d2631ae462bca308a45ba583333f7cfe4d0b0c40a18ba5 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, description=Red Hat OpenStack Platform 17.1 nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2025-11-19T00:36:58Z, version=17.1.12, architecture=x86_64, konflux.additional-tags=17.1.12 17.1_20251118.1, 
io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, vcs-type=git, io.buildah.version=1.41.4, io.openshift.expose-services=, batch=17.1_20251118.1, managed_by=tripleo_ansible, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '182e509007ab5e6e5b2500a552cbd5ba-879500e96bf8dfb93687004bd86f2317'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', 
'/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, config_id=tripleo_step5, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, distribution-scope=public, name=rhosp17/openstack-nova-compute, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1761123044, com.redhat.component=openstack-nova-compute-container, container_name=nova_compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute) Dec 15 03:20:04 localhost podman[76049]: unhealthy Dec 15 03:20:04 localhost systemd[1]: 36a88083ed613f6871d2631ae462bca308a45ba583333f7cfe4d0b0c40a18ba5.service: Main process exited, code=exited, status=1/FAILURE Dec 15 03:20:04 localhost systemd[1]: 36a88083ed613f6871d2631ae462bca308a45ba583333f7cfe4d0b0c40a18ba5.service: Failed with result 'exit-code'. Dec 15 03:20:05 localhost systemd[76066]: Queued start job for default target Main User Target. Dec 15 03:20:05 localhost systemd[76066]: Created slice User Application Slice. Dec 15 03:20:05 localhost systemd[76066]: Mark boot as successful after the user session has run 2 minutes was skipped because of an unmet condition check (ConditionUser=!@system). Dec 15 03:20:05 localhost systemd[76066]: Started Daily Cleanup of User's Temporary Directories. Dec 15 03:20:05 localhost systemd[76066]: Reached target Paths. Dec 15 03:20:05 localhost systemd[76066]: Reached target Timers. Dec 15 03:20:05 localhost systemd[76066]: Starting D-Bus User Message Bus Socket... Dec 15 03:20:05 localhost systemd[76066]: Starting Create User's Volatile Files and Directories... Dec 15 03:20:05 localhost systemd[76066]: Listening on D-Bus User Message Bus Socket. Dec 15 03:20:05 localhost systemd[76066]: Reached target Sockets. 
Dec 15 03:20:05 localhost systemd[76066]: Finished Create User's Volatile Files and Directories. Dec 15 03:20:05 localhost systemd[76066]: Reached target Basic System. Dec 15 03:20:05 localhost systemd[76066]: Reached target Main User Target. Dec 15 03:20:05 localhost systemd[76066]: Startup finished in 148ms. Dec 15 03:20:05 localhost systemd[1]: Started User Manager for UID 0. Dec 15 03:20:05 localhost systemd[1]: Started Session c10 of User root. Dec 15 03:20:05 localhost systemd[1]: session-c10.scope: Deactivated successfully. Dec 15 03:20:05 localhost podman[76150]: 2025-12-15 08:20:05.317023004 +0000 UTC m=+0.075781599 container create cd6d91a86e15fddc432716c2f4a78daffc735710046efa8e7d238a55268a3f33 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_wait_for_compute_service, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, container_name=nova_wait_for_compute_service, config_id=tripleo_step5, vcs-type=git, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, konflux.additional-tags=17.1.12 17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_data={'detach': False, 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', '__OS_DEBUG': 'true', 'TRIPLEO_CONFIG_HASH': '879500e96bf8dfb93687004bd86f2317'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'start_order': 4, 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/nova_compute_wait_for_compute_service.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/log/containers/nova:/var/log/nova', '/var/lib/container-config-scripts:/container-config-scripts']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, distribution-scope=public, vendor=Red Hat, Inc., com.redhat.component=openstack-nova-compute-container, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.41.4, io.openshift.expose-services=, version=17.1.12, batch=17.1_20251118.1, build-date=2025-11-19T00:36:58Z, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, release=1761123044, name=rhosp17/openstack-nova-compute, tcib_managed=true, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 nova-compute) Dec 15 03:20:05 localhost systemd[1]: Started libpod-conmon-cd6d91a86e15fddc432716c2f4a78daffc735710046efa8e7d238a55268a3f33.scope. Dec 15 03:20:05 localhost podman[76150]: 2025-12-15 08:20:05.272546274 +0000 UTC m=+0.031304879 image pull registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1 Dec 15 03:20:05 localhost systemd[1]: Started libcrun container. 
Dec 15 03:20:05 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/d46c1c020b0a74cb43cb11fd76641b6c77d04e774f05063e246f5f793ab911b3/merged/container-config-scripts supports timestamps until 2038 (0x7fffffff) Dec 15 03:20:05 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/d46c1c020b0a74cb43cb11fd76641b6c77d04e774f05063e246f5f793ab911b3/merged/var/log/nova supports timestamps until 2038 (0x7fffffff) Dec 15 03:20:05 localhost podman[76150]: 2025-12-15 08:20:05.396370177 +0000 UTC m=+0.155128772 container init cd6d91a86e15fddc432716c2f4a78daffc735710046efa8e7d238a55268a3f33 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_wait_for_compute_service, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, com.redhat.component=openstack-nova-compute-container, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, container_name=nova_wait_for_compute_service, distribution-scope=public, konflux.additional-tags=17.1.12 17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-nova-compute, config_data={'detach': False, 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', '__OS_DEBUG': 'true', 'TRIPLEO_CONFIG_HASH': '879500e96bf8dfb93687004bd86f2317'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'start_order': 4, 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova_compute_wait_for_compute_service.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/log/containers/nova:/var/log/nova', '/var/lib/container-config-scripts:/container-config-scripts']}, io.buildah.version=1.41.4, config_id=tripleo_step5, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, version=17.1.12, batch=17.1_20251118.1, build-date=2025-11-19T00:36:58Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, release=1761123044, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI) Dec 15 03:20:05 localhost podman[76150]: 2025-12-15 08:20:05.405196483 +0000 UTC m=+0.163955088 container start cd6d91a86e15fddc432716c2f4a78daffc735710046efa8e7d238a55268a3f33 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_wait_for_compute_service, architecture=x86_64, batch=17.1_20251118.1, build-date=2025-11-19T00:36:58Z, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, release=1761123044, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, version=17.1.12, config_id=tripleo_step5, distribution-scope=public, name=rhosp17/openstack-nova-compute, io.buildah.version=1.41.4, vendor=Red Hat, Inc., 
container_name=nova_wait_for_compute_service, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, com.redhat.component=openstack-nova-compute-container, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, description=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1, config_data={'detach': False, 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', '__OS_DEBUG': 'true', 'TRIPLEO_CONFIG_HASH': '879500e96bf8dfb93687004bd86f2317'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'start_order': 4, 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova_compute_wait_for_compute_service.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/log/containers/nova:/var/log/nova', '/var/lib/container-config-scripts:/container-config-scripts']}, url=https://www.redhat.com) Dec 15 03:20:05 localhost podman[76150]: 2025-12-15 08:20:05.405551013 +0000 UTC m=+0.164309688 container attach cd6d91a86e15fddc432716c2f4a78daffc735710046efa8e7d238a55268a3f33 
(image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_wait_for_compute_service, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, distribution-scope=public, container_name=nova_wait_for_compute_service, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, summary=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, com.redhat.component=openstack-nova-compute-container, managed_by=tripleo_ansible, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, name=rhosp17/openstack-nova-compute, config_data={'detach': False, 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', '__OS_DEBUG': 'true', 'TRIPLEO_CONFIG_HASH': '879500e96bf8dfb93687004bd86f2317'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'start_order': 4, 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova_compute_wait_for_compute_service.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/log/containers/nova:/var/log/nova', '/var/lib/container-config-scripts:/container-config-scripts']}, tcib_managed=true, vcs-type=git, description=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1, io.buildah.version=1.41.4, vendor=Red Hat, Inc., org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, version=17.1.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2025-11-19T00:36:58Z, config_id=tripleo_step5, release=1761123044) Dec 15 03:20:07 localhost systemd[1]: Started /usr/bin/podman healthcheck run 2df8d20c6ba32f38bd0e6519c9c77a4146b632f21c81197d8c319b223aad81f1. Dec 15 03:20:07 localhost systemd[1]: Started /usr/bin/podman healthcheck run 4d35b7dc3f3a4e3c60661e85d3511f50804e87244d74cab67af5c6e8c447d379. Dec 15 03:20:07 localhost podman[76175]: 2025-12-15 08:20:07.760943426 +0000 UTC m=+0.088610572 container health_status 4d35b7dc3f3a4e3c60661e85d3511f50804e87244d74cab67af5c6e8c447d379 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, vendor=Red Hat, Inc., vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, batch=17.1_20251118.1, url=https://www.redhat.com, container_name=ovn_metadata_agent, distribution-scope=public, release=1761123044, version=17.1.12, architecture=x86_64, io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-neutron-metadata-agent-ovn, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, config_id=tripleo_step4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, build-date=2025-11-19T00:14:25Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 
'TRIPLEO_CONFIG_HASH': 'a56a6f14b467cd9064e40c03defa5ed7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-type=git, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, konflux.additional-tags=17.1.12 17.1_20251118.1) Dec 15 03:20:07 localhost podman[76174]: 2025-12-15 08:20:07.808347534 +0000 UTC m=+0.135690051 container health_status 2df8d20c6ba32f38bd0e6519c9c77a4146b632f21c81197d8c319b223aad81f1 
(image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, build-date=2025-11-18T23:34:05Z, url=https://www.redhat.com, konflux.additional-tags=17.1.12 17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-type=git, com.redhat.component=openstack-ovn-controller-container, release=1761123044, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, config_id=tripleo_step4, maintainer=OpenStack TripleO Team, version=17.1.12, batch=17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 ovn-controller, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, managed_by=tripleo_ansible, distribution-scope=public, io.openshift.expose-services=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, name=rhosp17/openstack-ovn-controller, architecture=x86_64, container_name=ovn_controller, vendor=Red 
Hat, Inc.) Dec 15 03:20:07 localhost podman[76175]: 2025-12-15 08:20:07.812558877 +0000 UTC m=+0.140226093 container exec_died 4d35b7dc3f3a4e3c60661e85d3511f50804e87244d74cab67af5c6e8c447d379 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, architecture=x86_64, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, io.buildah.version=1.41.4, name=rhosp17/openstack-neutron-metadata-agent-ovn, batch=17.1_20251118.1, tcib_managed=true, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, release=1761123044, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-19T00:14:25Z, container_name=ovn_metadata_agent, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, version=17.1.12, io.openshift.expose-services=, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, config_id=tripleo_step4, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'a56a6f14b467cd9064e40c03defa5ed7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', 
'/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn) Dec 15 03:20:07 localhost systemd[1]: 4d35b7dc3f3a4e3c60661e85d3511f50804e87244d74cab67af5c6e8c447d379.service: Deactivated successfully. 
Dec 15 03:20:07 localhost podman[76174]: 2025-12-15 08:20:07.861512457 +0000 UTC m=+0.188854964 container exec_died 2df8d20c6ba32f38bd0e6519c9c77a4146b632f21c81197d8c319b223aad81f1 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, konflux.additional-tags=17.1.12 17.1_20251118.1, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, release=1761123044, build-date=2025-11-18T23:34:05Z, batch=17.1_20251118.1, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, io.buildah.version=1.41.4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, container_name=ovn_controller, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, architecture=x86_64, vcs-type=git, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, version=17.1.12, io.openshift.expose-services=, name=rhosp17/openstack-ovn-controller, url=https://www.redhat.com, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, tcib_managed=true, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, description=Red Hat OpenStack 
Platform 17.1 ovn-controller, summary=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.component=openstack-ovn-controller-container, config_id=tripleo_step4) Dec 15 03:20:07 localhost systemd[1]: 2df8d20c6ba32f38bd0e6519c9c77a4146b632f21c81197d8c319b223aad81f1.service: Deactivated successfully. Dec 15 03:20:15 localhost systemd[1]: Stopping User Manager for UID 0... Dec 15 03:20:15 localhost systemd[76066]: Activating special unit Exit the Session... Dec 15 03:20:15 localhost systemd[76066]: Stopped target Main User Target. Dec 15 03:20:15 localhost systemd[76066]: Stopped target Basic System. Dec 15 03:20:15 localhost systemd[76066]: Stopped target Paths. Dec 15 03:20:15 localhost systemd[76066]: Stopped target Sockets. Dec 15 03:20:15 localhost systemd[76066]: Stopped target Timers. Dec 15 03:20:15 localhost systemd[76066]: Stopped Daily Cleanup of User's Temporary Directories. Dec 15 03:20:15 localhost systemd[76066]: Closed D-Bus User Message Bus Socket. Dec 15 03:20:15 localhost systemd[76066]: Stopped Create User's Volatile Files and Directories. Dec 15 03:20:15 localhost systemd[76066]: Removed slice User Application Slice. Dec 15 03:20:15 localhost systemd[76066]: Reached target Shutdown. Dec 15 03:20:15 localhost systemd[76066]: Finished Exit the Session. Dec 15 03:20:15 localhost systemd[76066]: Reached target Exit the Session. Dec 15 03:20:15 localhost systemd[1]: user@0.service: Deactivated successfully. Dec 15 03:20:15 localhost systemd[1]: Stopped User Manager for UID 0. Dec 15 03:20:15 localhost systemd[1]: Stopping User Runtime Directory /run/user/0... Dec 15 03:20:15 localhost systemd[1]: run-user-0.mount: Deactivated successfully. Dec 15 03:20:15 localhost systemd[1]: user-runtime-dir@0.service: Deactivated successfully. Dec 15 03:20:15 localhost systemd[1]: Stopped User Runtime Directory /run/user/0. Dec 15 03:20:15 localhost systemd[1]: Removed slice User Slice of UID 0. 
Dec 15 03:20:17 localhost systemd[1]: Started /usr/bin/podman healthcheck run 6278d648aec9c9f12e0efaafa0d6ae5653eeb34459516699e562eeb9c8d23b3c. Dec 15 03:20:17 localhost podman[76222]: 2025-12-15 08:20:17.767033396 +0000 UTC m=+0.097370955 container health_status 6278d648aec9c9f12e0efaafa0d6ae5653eeb34459516699e562eeb9c8d23b3c (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, distribution-scope=public, managed_by=tripleo_ansible, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, batch=17.1_20251118.1, architecture=x86_64, io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.expose-services=, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp17/openstack-qdrouterd, version=17.1.12, container_name=metrics_qdr, config_id=tripleo_step1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '29d07da103bbd44a9ed3e29999314b03'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, release=1761123044, vcs-type=git, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 qdrouterd, build-date=2025-11-18T22:49:46Z, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.component=openstack-qdrouterd-container, tcib_managed=true, konflux.additional-tags=17.1.12 17.1_20251118.1, url=https://www.redhat.com) Dec 15 03:20:17 localhost podman[76222]: 2025-12-15 08:20:17.983806056 +0000 UTC m=+0.314143635 container exec_died 6278d648aec9c9f12e0efaafa0d6ae5653eeb34459516699e562eeb9c8d23b3c (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1761123044, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, container_name=metrics_qdr, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '29d07da103bbd44a9ed3e29999314b03'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, architecture=x86_64, config_id=tripleo_step1, build-date=2025-11-18T22:49:46Z, konflux.additional-tags=17.1.12 17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 qdrouterd, tcib_managed=true, io.buildah.version=1.41.4, batch=17.1_20251118.1, version=17.1.12, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.expose-services=, name=rhosp17/openstack-qdrouterd, vcs-type=git, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.component=openstack-qdrouterd-container, distribution-scope=public, managed_by=tripleo_ansible, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 qdrouterd) Dec 15 03:20:17 localhost systemd[1]: 6278d648aec9c9f12e0efaafa0d6ae5653eeb34459516699e562eeb9c8d23b3c.service: Deactivated successfully. Dec 15 03:20:21 localhost systemd[1]: session-27.scope: Deactivated successfully. Dec 15 03:20:21 localhost systemd[1]: session-27.scope: Consumed 2.977s CPU time. Dec 15 03:20:21 localhost systemd-logind[763]: Session 27 logged out. Waiting for processes to exit. Dec 15 03:20:21 localhost systemd-logind[763]: Removed session 27. Dec 15 03:20:31 localhost systemd[1]: Started /usr/bin/podman healthcheck run 165a6fd1f2b629d16da0798ec1c9bac41d302d530a0ffa668b2315a52d6aa04e. 
Dec 15 03:20:31 localhost podman[76252]: 2025-12-15 08:20:31.757115421 +0000 UTC m=+0.090596594 container health_status 165a6fd1f2b629d16da0798ec1c9bac41d302d530a0ffa668b2315a52d6aa04e (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, name=rhosp17/openstack-collectd, architecture=x86_64, build-date=2025-11-18T22:51:28Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., io.buildah.version=1.41.4, batch=17.1_20251118.1, distribution-scope=public, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-type=git, container_name=collectd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, com.redhat.component=openstack-collectd-container, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.expose-services=, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, config_id=tripleo_step3, description=Red Hat OpenStack Platform 17.1 collectd, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 collectd, version=17.1.12, managed_by=tripleo_ansible, konflux.additional-tags=17.1.12 17.1_20251118.1, tcib_managed=true, release=1761123044, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05) Dec 15 03:20:31 localhost podman[76252]: 2025-12-15 08:20:31.773576142 +0000 UTC m=+0.107057275 container exec_died 165a6fd1f2b629d16da0798ec1c9bac41d302d530a0ffa668b2315a52d6aa04e (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, name=rhosp17/openstack-collectd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, io.buildah.version=1.41.4, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.12 17.1_20251118.1, distribution-scope=public, release=1761123044, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, managed_by=tripleo_ansible, vendor=Red Hat, Inc., architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 collectd, version=17.1.12, batch=17.1_20251118.1, config_id=tripleo_step3, container_name=collectd, url=https://www.redhat.com, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 collectd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_data={'cap_add': ['IPC_LOCK'], 
'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, io.openshift.expose-services=, com.redhat.component=openstack-collectd-container, tcib_managed=true, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, build-date=2025-11-18T22:51:28Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git) Dec 15 03:20:31 localhost systemd[1]: 165a6fd1f2b629d16da0798ec1c9bac41d302d530a0ffa668b2315a52d6aa04e.service: Deactivated successfully. Dec 15 03:20:34 localhost systemd[1]: Started /usr/bin/podman healthcheck run 2649c3aec9af2f04509e1d248b1050b202fccc3ed2b86421c89a2a65376db057. 
Dec 15 03:20:34 localhost systemd[1]: Started /usr/bin/podman healthcheck run 97115ff7c5a84ae7e17f79e696e94da3c63f9fe0051287fe9193bec282a03f99. Dec 15 03:20:34 localhost systemd[1]: Started /usr/bin/podman healthcheck run ac45612fab357b615b4684dbfa5bd000dd29eb1adfbaedb6db0802fc49e89549. Dec 15 03:20:34 localhost systemd[1]: Started /usr/bin/podman healthcheck run d6041e5471624a495d5be42684f6049ccf71802a80d5760239330151d465d146. Dec 15 03:20:34 localhost systemd[1]: tmp-crun.8V3pMA.mount: Deactivated successfully. Dec 15 03:20:34 localhost podman[76273]: 2025-12-15 08:20:34.76832671 +0000 UTC m=+0.095708161 container health_status 97115ff7c5a84ae7e17f79e696e94da3c63f9fe0051287fe9193bec282a03f99 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, distribution-scope=public, config_id=tripleo_step4, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-type=git, batch=17.1_20251118.1, io.buildah.version=1.41.4, name=rhosp17/openstack-ceilometer-ipmi, version=17.1.12, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, managed_by=tripleo_ansible, architecture=x86_64, io.openshift.expose-services=, release=1761123044, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'fcee5a4a91f85471fca7b61211375646'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 
'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, vendor=Red Hat, Inc., org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, tcib_managed=true, com.redhat.component=openstack-ceilometer-ipmi-container, build-date=2025-11-19T00:12:45Z, container_name=ceilometer_agent_ipmi, url=https://www.redhat.com, maintainer=OpenStack TripleO Team) Dec 15 03:20:34 localhost systemd[1]: Started /usr/bin/podman healthcheck run 4c3f871f4bdf287b1d3ce717956c1cf130617489217410eadf6fffcc440eb3b0. 
Dec 15 03:20:34 localhost podman[76272]: 2025-12-15 08:20:34.819049657 +0000 UTC m=+0.150752713 container health_status 2649c3aec9af2f04509e1d248b1050b202fccc3ed2b86421c89a2a65376db057 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, batch=17.1_20251118.1, vendor=Red Hat, Inc., config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '182e509007ab5e6e5b2500a552cbd5ba'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, name=rhosp17/openstack-iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.component=openstack-iscsid-container, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, tcib_managed=true, container_name=iscsid, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 iscsid, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.expose-services=, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, distribution-scope=public, maintainer=OpenStack TripleO Team, release=1761123044, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, config_id=tripleo_step3, io.buildah.version=1.41.4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible, build-date=2025-11-18T23:44:13Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, version=17.1.12) Dec 15 03:20:34 localhost podman[76272]: 2025-12-15 08:20:34.832432455 +0000 UTC m=+0.164135551 container exec_died 2649c3aec9af2f04509e1d248b1050b202fccc3ed2b86421c89a2a65376db057 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, com.redhat.component=openstack-iscsid-container, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.expose-services=, version=17.1.12, name=rhosp17/openstack-iscsid, description=Red Hat OpenStack Platform 17.1 iscsid, vcs-type=git, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, vendor=Red Hat, Inc., distribution-scope=public, config_id=tripleo_step3, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, tcib_managed=true, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, batch=17.1_20251118.1, release=1761123044, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.buildah.version=1.41.4, container_name=iscsid, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, summary=Red Hat OpenStack Platform 17.1 iscsid, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, 
cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '182e509007ab5e6e5b2500a552cbd5ba'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.12 17.1_20251118.1, build-date=2025-11-18T23:44:13Z) Dec 15 03:20:34 localhost systemd[1]: 2649c3aec9af2f04509e1d248b1050b202fccc3ed2b86421c89a2a65376db057.service: Deactivated successfully. 
Dec 15 03:20:34 localhost podman[76273]: 2025-12-15 08:20:34.872051105 +0000 UTC m=+0.199432626 container exec_died 97115ff7c5a84ae7e17f79e696e94da3c63f9fe0051287fe9193bec282a03f99 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, architecture=x86_64, io.openshift.expose-services=, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'fcee5a4a91f85471fca7b61211375646'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, name=rhosp17/openstack-ceilometer-ipmi, build-date=2025-11-19T00:12:45Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, batch=17.1_20251118.1, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, managed_by=tripleo_ansible, distribution-scope=public, vcs-type=git, container_name=ceilometer_agent_ipmi, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, description=Red Hat OpenStack Platform 17.1 
ceilometer-ipmi, config_id=tripleo_step4, version=17.1.12, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.buildah.version=1.41.4, url=https://www.redhat.com, release=1761123044, com.redhat.component=openstack-ceilometer-ipmi-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, konflux.additional-tags=17.1.12 17.1_20251118.1, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team) Dec 15 03:20:34 localhost systemd[1]: 97115ff7c5a84ae7e17f79e696e94da3c63f9fe0051287fe9193bec282a03f99.service: Deactivated successfully. Dec 15 03:20:34 localhost podman[76274]: 2025-12-15 08:20:34.886169733 +0000 UTC m=+0.208332705 container health_status ac45612fab357b615b4684dbfa5bd000dd29eb1adfbaedb6db0802fc49e89549 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, architecture=x86_64, release=1761123044, com.redhat.component=openstack-cron-container, config_id=tripleo_step4, vendor=Red Hat, Inc., build-date=2025-11-18T22:49:32Z, vcs-type=git, container_name=logrotate_crond, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.expose-services=, batch=17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, distribution-scope=public, version=17.1.12, description=Red Hat OpenStack Platform 17.1 cron, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, name=rhosp17/openstack-cron, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 
'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, konflux.additional-tags=17.1.12 17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 cron, tcib_managed=true, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, url=https://www.redhat.com, maintainer=OpenStack TripleO Team) Dec 15 03:20:34 localhost podman[76274]: 2025-12-15 08:20:34.919937626 +0000 UTC m=+0.242100578 container exec_died ac45612fab357b615b4684dbfa5bd000dd29eb1adfbaedb6db0802fc49e89549 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, io.openshift.expose-services=, io.buildah.version=1.41.4, url=https://www.redhat.com, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 
'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 cron, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, konflux.additional-tags=17.1.12 17.1_20251118.1, name=rhosp17/openstack-cron, architecture=x86_64, com.redhat.component=openstack-cron-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step4, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 cron, vendor=Red Hat, Inc., batch=17.1_20251118.1, container_name=logrotate_crond, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, maintainer=OpenStack TripleO Team, release=1761123044, version=17.1.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, build-date=2025-11-18T22:49:32Z) Dec 15 03:20:34 localhost systemd[1]: ac45612fab357b615b4684dbfa5bd000dd29eb1adfbaedb6db0802fc49e89549.service: Deactivated successfully. 
Dec 15 03:20:34 localhost podman[76280]: 2025-12-15 08:20:34.940417134 +0000 UTC m=+0.256274317 container health_status d6041e5471624a495d5be42684f6049ccf71802a80d5760239330151d465d146 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, managed_by=tripleo_ansible, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, build-date=2025-11-19T00:11:48Z, container_name=ceilometer_agent_compute, vendor=Red Hat, Inc., vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, io.buildah.version=1.41.4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.component=openstack-ceilometer-compute-container, name=rhosp17/openstack-ceilometer-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-type=git, maintainer=OpenStack TripleO Team, release=1761123044, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, version=17.1.12, konflux.additional-tags=17.1.12 17.1_20251118.1, config_id=tripleo_step4, distribution-scope=public, url=https://www.redhat.com, batch=17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, tcib_managed=true, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'fcee5a4a91f85471fca7b61211375646'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': 
['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}) Dec 15 03:20:34 localhost podman[76280]: 2025-12-15 08:20:34.972817411 +0000 UTC m=+0.288674544 container exec_died d6041e5471624a495d5be42684f6049ccf71802a80d5760239330151d465d146 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, managed_by=tripleo_ansible, architecture=x86_64, vendor=Red Hat, Inc., io.buildah.version=1.41.4, konflux.additional-tags=17.1.12 17.1_20251118.1, container_name=ceilometer_agent_compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'fcee5a4a91f85471fca7b61211375646'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, build-date=2025-11-19T00:11:48Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, batch=17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.expose-services=, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, version=17.1.12, url=https://www.redhat.com, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, release=1761123044, config_id=tripleo_step4, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, tcib_managed=true, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-ceilometer-compute-container, name=rhosp17/openstack-ceilometer-compute, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute) Dec 15 03:20:34 localhost systemd[1]: d6041e5471624a495d5be42684f6049ccf71802a80d5760239330151d465d146.service: Deactivated successfully. Dec 15 03:20:35 localhost systemd[1]: Started /usr/bin/podman healthcheck run 36a88083ed613f6871d2631ae462bca308a45ba583333f7cfe4d0b0c40a18ba5. 
Dec 15 03:20:35 localhost podman[76375]: 2025-12-15 08:20:35.106137318 +0000 UTC m=+0.093451251 container health_status 36a88083ed613f6871d2631ae462bca308a45ba583333f7cfe4d0b0c40a18ba5 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=starting, maintainer=OpenStack TripleO Team, architecture=x86_64, name=rhosp17/openstack-nova-compute, release=1761123044, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, version=17.1.12, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=nova_compute, konflux.additional-tags=17.1.12 17.1_20251118.1, batch=17.1_20251118.1, com.redhat.component=openstack-nova-compute-container, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, url=https://www.redhat.com, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, vendor=Red Hat, Inc., config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '182e509007ab5e6e5b2500a552cbd5ba-879500e96bf8dfb93687004bd86f2317'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, build-date=2025-11-19T00:36:58Z, vcs-type=git, io.buildah.version=1.41.4, description=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, config_id=tripleo_step5) Dec 15 03:20:35 localhost podman[76321]: 2025-12-15 08:20:35.068136191 +0000 UTC m=+0.282898550 container health_status 4c3f871f4bdf287b1d3ce717956c1cf130617489217410eadf6fffcc440eb3b0 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '879500e96bf8dfb93687004bd86f2317'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, version=17.1.12, batch=17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1761123044, vcs-type=git, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, konflux.additional-tags=17.1.12 17.1_20251118.1, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, architecture=x86_64, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp17/openstack-nova-compute, io.openshift.expose-services=, com.redhat.component=openstack-nova-compute-container, container_name=nova_migration_target, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2025-11-19T00:36:58Z, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, distribution-scope=public, config_id=tripleo_step4, io.buildah.version=1.41.4, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 nova-compute) Dec 15 03:20:35 localhost podman[76375]: 2025-12-15 
08:20:35.162736062 +0000 UTC m=+0.150049965 container exec_died 36a88083ed613f6871d2631ae462bca308a45ba583333f7cfe4d0b0c40a18ba5 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, managed_by=tripleo_ansible, batch=17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=nova_compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, url=https://www.redhat.com, release=1761123044, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.12 17.1_20251118.1, build-date=2025-11-19T00:36:58Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, distribution-scope=public, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, io.buildah.version=1.41.4, com.redhat.component=openstack-nova-compute-container, io.openshift.expose-services=, vendor=Red Hat, Inc., config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '182e509007ab5e6e5b2500a552cbd5ba-879500e96bf8dfb93687004bd86f2317'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, config_id=tripleo_step5, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, name=rhosp17/openstack-nova-compute, version=17.1.12, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d) Dec 15 03:20:35 localhost podman[76375]: unhealthy Dec 15 03:20:35 localhost systemd[1]: 36a88083ed613f6871d2631ae462bca308a45ba583333f7cfe4d0b0c40a18ba5.service: Main process exited, code=exited, status=1/FAILURE Dec 15 03:20:35 localhost systemd[1]: 36a88083ed613f6871d2631ae462bca308a45ba583333f7cfe4d0b0c40a18ba5.service: Failed with result 'exit-code'. 
Dec 15 03:20:35 localhost podman[76321]: 2025-12-15 08:20:35.461894495 +0000 UTC m=+0.676656834 container exec_died 4c3f871f4bdf287b1d3ce717956c1cf130617489217410eadf6fffcc440eb3b0 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '879500e96bf8dfb93687004bd86f2317'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, container_name=nova_migration_target, com.redhat.component=openstack-nova-compute-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, name=rhosp17/openstack-nova-compute, url=https://www.redhat.com, tcib_managed=true, 
vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, release=1761123044, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-type=git, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 nova-compute, distribution-scope=public, io.openshift.expose-services=, managed_by=tripleo_ansible, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, config_id=tripleo_step4, batch=17.1_20251118.1, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, version=17.1.12, build-date=2025-11-19T00:36:58Z) Dec 15 03:20:35 localhost systemd[1]: 4c3f871f4bdf287b1d3ce717956c1cf130617489217410eadf6fffcc440eb3b0.service: Deactivated successfully. Dec 15 03:20:38 localhost systemd[1]: Started /usr/bin/podman healthcheck run 2df8d20c6ba32f38bd0e6519c9c77a4146b632f21c81197d8c319b223aad81f1. Dec 15 03:20:38 localhost systemd[1]: Started /usr/bin/podman healthcheck run 4d35b7dc3f3a4e3c60661e85d3511f50804e87244d74cab67af5c6e8c447d379. 
Dec 15 03:20:38 localhost podman[76406]: 2025-12-15 08:20:38.756392272 +0000 UTC m=+0.087641465 container health_status 2df8d20c6ba32f38bd0e6519c9c77a4146b632f21c81197d8c319b223aad81f1 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, build-date=2025-11-18T23:34:05Z, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., vcs-type=git, version=17.1.12, summary=Red Hat OpenStack Platform 17.1 ovn-controller, io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, description=Red Hat OpenStack Platform 17.1 ovn-controller, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, managed_by=tripleo_ansible, io.openshift.expose-services=, config_id=tripleo_step4, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, name=rhosp17/openstack-ovn-controller, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, distribution-scope=public, release=1761123044, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1, com.redhat.component=openstack-ovn-controller-container, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', 
'/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, architecture=x86_64, container_name=ovn_controller, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream) Dec 15 03:20:38 localhost podman[76406]: 2025-12-15 08:20:38.811351883 +0000 UTC m=+0.142601066 container exec_died 2df8d20c6ba32f38bd0e6519c9c77a4146b632f21c81197d8c319b223aad81f1 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=ovn_controller, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, summary=Red Hat OpenStack Platform 17.1 ovn-controller, name=rhosp17/openstack-ovn-controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, distribution-scope=public, config_id=tripleo_step4, release=1761123044, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, com.redhat.component=openstack-ovn-controller-container, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, konflux.additional-tags=17.1.12 17.1_20251118.1, architecture=x86_64, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, tcib_managed=true, vcs-type=git, managed_by=tripleo_ansible, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, batch=17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 ovn-controller, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, build-date=2025-11-18T23:34:05Z, io.openshift.expose-services=, version=17.1.12) Dec 15 03:20:38 localhost systemd[1]: 2df8d20c6ba32f38bd0e6519c9c77a4146b632f21c81197d8c319b223aad81f1.service: Deactivated successfully. Dec 15 03:20:38 localhost podman[76407]: 2025-12-15 08:20:38.811189697 +0000 UTC m=+0.136298996 container health_status 4d35b7dc3f3a4e3c60661e85d3511f50804e87244d74cab67af5c6e8c447d379 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.4, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, release=1761123044, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, architecture=x86_64, distribution-scope=public, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'a56a6f14b467cd9064e40c03defa5ed7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, tcib_managed=true, io.openshift.expose-services=, vendor=Red Hat, Inc., version=17.1.12, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_id=tripleo_step4, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, batch=17.1_20251118.1, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, build-date=2025-11-19T00:14:25Z, url=https://www.redhat.com, name=rhosp17/openstack-neutron-metadata-agent-ovn, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, konflux.additional-tags=17.1.12 17.1_20251118.1, container_name=ovn_metadata_agent) Dec 15 03:20:38 localhost podman[76407]: 2025-12-15 08:20:38.896536041 +0000 UTC m=+0.221645340 container exec_died 4d35b7dc3f3a4e3c60661e85d3511f50804e87244d74cab67af5c6e8c447d379 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, io.openshift.tags=rhosp osp openstack osp-17.1 
openstack-neutron-metadata-agent-ovn, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=ovn_metadata_agent, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, version=17.1.12, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, konflux.additional-tags=17.1.12 17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, url=https://www.redhat.com, vcs-type=git, config_id=tripleo_step4, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, release=1761123044, io.openshift.expose-services=, io.buildah.version=1.41.4, maintainer=OpenStack TripleO Team, distribution-scope=public, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, managed_by=tripleo_ansible, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'a56a6f14b467cd9064e40c03defa5ed7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', 
'/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, build-date=2025-11-19T00:14:25Z, batch=17.1_20251118.1, architecture=x86_64, name=rhosp17/openstack-neutron-metadata-agent-ovn) Dec 15 03:20:38 localhost systemd[1]: 4d35b7dc3f3a4e3c60661e85d3511f50804e87244d74cab67af5c6e8c447d379.service: Deactivated successfully. Dec 15 03:20:46 localhost sshd[76452]: main: sshd: ssh-rsa algorithm is disabled Dec 15 03:20:48 localhost systemd[1]: Started /usr/bin/podman healthcheck run 6278d648aec9c9f12e0efaafa0d6ae5653eeb34459516699e562eeb9c8d23b3c. 
Dec 15 03:20:48 localhost podman[76454]: 2025-12-15 08:20:48.772524231 +0000 UTC m=+0.099181324 container health_status 6278d648aec9c9f12e0efaafa0d6ae5653eeb34459516699e562eeb9c8d23b3c (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 qdrouterd, release=1761123044, name=rhosp17/openstack-qdrouterd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, url=https://www.redhat.com, architecture=x86_64, batch=17.1_20251118.1, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, container_name=metrics_qdr, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, version=17.1.12, description=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '29d07da103bbd44a9ed3e29999314b03'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, build-date=2025-11-18T22:49:46Z, tcib_managed=true, config_id=tripleo_step1, konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-qdrouterd-container, io.buildah.version=1.41.4, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, distribution-scope=public) Dec 15 03:20:48 localhost podman[76454]: 2025-12-15 08:20:48.971601816 +0000 UTC m=+0.298258839 container exec_died 6278d648aec9c9f12e0efaafa0d6ae5653eeb34459516699e562eeb9c8d23b3c (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, managed_by=tripleo_ansible, vcs-type=git, com.redhat.component=openstack-qdrouterd-container, io.buildah.version=1.41.4, tcib_managed=true, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, summary=Red Hat OpenStack Platform 17.1 qdrouterd, name=rhosp17/openstack-qdrouterd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, url=https://www.redhat.com, distribution-scope=public, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, release=1761123044, container_name=metrics_qdr, io.openshift.expose-services=, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_id=tripleo_step1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, build-date=2025-11-18T22:49:46Z, batch=17.1_20251118.1, konflux.additional-tags=17.1.12 17.1_20251118.1, architecture=x86_64, version=17.1.12, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 
'TRIPLEO_CONFIG_HASH': '29d07da103bbd44a9ed3e29999314b03'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, description=Red Hat OpenStack Platform 17.1 qdrouterd, vendor=Red Hat, Inc.) Dec 15 03:20:48 localhost systemd[1]: 6278d648aec9c9f12e0efaafa0d6ae5653eeb34459516699e562eeb9c8d23b3c.service: Deactivated successfully. Dec 15 03:21:02 localhost systemd[1]: Started /usr/bin/podman healthcheck run 165a6fd1f2b629d16da0798ec1c9bac41d302d530a0ffa668b2315a52d6aa04e. Dec 15 03:21:02 localhost systemd[1]: Starting Check and recover tripleo_nova_virtqemud... Dec 15 03:21:02 localhost recover_tripleo_nova_virtqemud[76561]: 61849 Dec 15 03:21:02 localhost systemd[1]: tripleo_nova_virtqemud_recover.service: Deactivated successfully. Dec 15 03:21:02 localhost systemd[1]: Finished Check and recover tripleo_nova_virtqemud. 
Dec 15 03:21:02 localhost podman[76559]: 2025-12-15 08:21:02.781062298 +0000 UTC m=+0.097280593 container health_status 165a6fd1f2b629d16da0798ec1c9bac41d302d530a0ffa668b2315a52d6aa04e (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, container_name=collectd, tcib_managed=true, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vendor=Red Hat, Inc., version=17.1.12, konflux.additional-tags=17.1.12 17.1_20251118.1, architecture=x86_64, io.buildah.version=1.41.4, batch=17.1_20251118.1, config_id=tripleo_step3, managed_by=tripleo_ansible, io.openshift.expose-services=, name=rhosp17/openstack-collectd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 collectd, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', 
'/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, description=Red Hat OpenStack Platform 17.1 collectd, distribution-scope=public, url=https://www.redhat.com, build-date=2025-11-18T22:51:28Z, com.redhat.component=openstack-collectd-container, release=1761123044, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a) Dec 15 03:21:02 localhost podman[76559]: 2025-12-15 08:21:02.792885554 +0000 UTC m=+0.109103859 container exec_died 165a6fd1f2b629d16da0798ec1c9bac41d302d530a0ffa668b2315a52d6aa04e (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, name=rhosp17/openstack-collectd, com.redhat.component=openstack-collectd-container, version=17.1.12, container_name=collectd, build-date=2025-11-18T22:51:28Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, architecture=x86_64, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, io.openshift.expose-services=, url=https://www.redhat.com, konflux.additional-tags=17.1.12 17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vendor=Red Hat, Inc., org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, tcib_managed=true, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20251118.1, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 collectd, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, description=Red Hat OpenStack Platform 17.1 collectd, release=1761123044, io.buildah.version=1.41.4, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, config_id=tripleo_step3, managed_by=tripleo_ansible) Dec 15 03:21:02 localhost systemd[1]: 165a6fd1f2b629d16da0798ec1c9bac41d302d530a0ffa668b2315a52d6aa04e.service: Deactivated successfully. 
Dec 15 03:21:05 localhost systemd[1]: Started /usr/bin/podman healthcheck run 2649c3aec9af2f04509e1d248b1050b202fccc3ed2b86421c89a2a65376db057. Dec 15 03:21:05 localhost systemd[1]: Started /usr/bin/podman healthcheck run 36a88083ed613f6871d2631ae462bca308a45ba583333f7cfe4d0b0c40a18ba5. Dec 15 03:21:05 localhost systemd[1]: Started /usr/bin/podman healthcheck run 4c3f871f4bdf287b1d3ce717956c1cf130617489217410eadf6fffcc440eb3b0. Dec 15 03:21:05 localhost systemd[1]: Started /usr/bin/podman healthcheck run 97115ff7c5a84ae7e17f79e696e94da3c63f9fe0051287fe9193bec282a03f99. Dec 15 03:21:05 localhost systemd[1]: Started /usr/bin/podman healthcheck run ac45612fab357b615b4684dbfa5bd000dd29eb1adfbaedb6db0802fc49e89549. Dec 15 03:21:05 localhost systemd[1]: Started /usr/bin/podman healthcheck run d6041e5471624a495d5be42684f6049ccf71802a80d5760239330151d465d146. Dec 15 03:21:05 localhost podman[76584]: 2025-12-15 08:21:05.76819249 +0000 UTC m=+0.088794235 container health_status 97115ff7c5a84ae7e17f79e696e94da3c63f9fe0051287fe9193bec282a03f99 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'fcee5a4a91f85471fca7b61211375646'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, tcib_managed=true, architecture=x86_64, distribution-scope=public, io.openshift.expose-services=, vcs-type=git, name=rhosp17/openstack-ceilometer-ipmi, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, maintainer=OpenStack TripleO Team, config_id=tripleo_step4, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, container_name=ceilometer_agent_ipmi, version=17.1.12, release=1761123044, io.buildah.version=1.41.4, com.redhat.component=openstack-ceilometer-ipmi-container, build-date=2025-11-19T00:12:45Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi) Dec 15 03:21:05 localhost systemd[1]: tmp-crun.8rQSVG.mount: Deactivated successfully. 
Dec 15 03:21:05 localhost podman[76584]: 2025-12-15 08:21:05.822160115 +0000 UTC m=+0.142761810 container exec_died 97115ff7c5a84ae7e17f79e696e94da3c63f9fe0051287fe9193bec282a03f99 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, name=rhosp17/openstack-ceilometer-ipmi, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_id=tripleo_step4, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, container_name=ceilometer_agent_ipmi, vcs-type=git, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'fcee5a4a91f85471fca7b61211375646'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, io.openshift.expose-services=, managed_by=tripleo_ansible, version=17.1.12, konflux.additional-tags=17.1.12 17.1_20251118.1, io.buildah.version=1.41.4, batch=17.1_20251118.1, 
com.redhat.component=openstack-ceilometer-ipmi-container, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, release=1761123044, distribution-scope=public, build-date=2025-11-19T00:12:45Z, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, architecture=x86_64, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi) Dec 15 03:21:05 localhost systemd[1]: 97115ff7c5a84ae7e17f79e696e94da3c63f9fe0051287fe9193bec282a03f99.service: Deactivated successfully. Dec 15 03:21:05 localhost podman[76583]: 2025-12-15 08:21:05.827096006 +0000 UTC m=+0.149609543 container health_status 4c3f871f4bdf287b1d3ce717956c1cf130617489217410eadf6fffcc440eb3b0 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, description=Red Hat OpenStack Platform 17.1 nova-compute, release=1761123044, tcib_managed=true, config_id=tripleo_step4, com.redhat.component=openstack-nova-compute-container, name=rhosp17/openstack-nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, build-date=2025-11-19T00:36:58Z, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 nova-compute, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '879500e96bf8dfb93687004bd86f2317'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.4, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20251118.1, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, version=17.1.12, container_name=nova_migration_target, distribution-scope=public, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, url=https://www.redhat.com, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d) Dec 15 03:21:05 localhost podman[76589]: 2025-12-15 08:21:05.885862979 +0000 UTC m=+0.199590561 container health_status ac45612fab357b615b4684dbfa5bd000dd29eb1adfbaedb6db0802fc49e89549 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, name=rhosp17/openstack-cron, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.component=openstack-cron-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 cron, url=https://www.redhat.com, distribution-scope=public, 
io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, release=1761123044, managed_by=tripleo_ansible, version=17.1.12, container_name=logrotate_crond, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, summary=Red Hat OpenStack Platform 17.1 cron, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, maintainer=OpenStack TripleO Team, tcib_managed=true, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, konflux.additional-tags=17.1.12 17.1_20251118.1, io.buildah.version=1.41.4, architecture=x86_64, config_id=tripleo_step4, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vendor=Red Hat, Inc., build-date=2025-11-18T22:49:32Z, io.openshift.expose-services=, vcs-type=git, batch=17.1_20251118.1) Dec 15 03:21:05 localhost podman[76589]: 2025-12-15 08:21:05.900054848 +0000 
UTC m=+0.213782410 container exec_died ac45612fab357b615b4684dbfa5bd000dd29eb1adfbaedb6db0802fc49e89549 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, maintainer=OpenStack TripleO Team, release=1761123044, batch=17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 cron, build-date=2025-11-18T22:49:32Z, description=Red Hat OpenStack Platform 17.1 cron, version=17.1.12, konflux.additional-tags=17.1.12 17.1_20251118.1, vendor=Red Hat, Inc., io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, distribution-scope=public, managed_by=tripleo_ansible, name=rhosp17/openstack-cron, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, com.redhat.component=openstack-cron-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, vcs-type=git, container_name=logrotate_crond, tcib_managed=true, config_id=tripleo_step4, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, architecture=x86_64, url=https://www.redhat.com) Dec 15 03:21:05 localhost systemd[1]: ac45612fab357b615b4684dbfa5bd000dd29eb1adfbaedb6db0802fc49e89549.service: Deactivated successfully. Dec 15 03:21:05 localhost podman[76582]: 2025-12-15 08:21:05.981286171 +0000 UTC m=+0.305376991 container health_status 36a88083ed613f6871d2631ae462bca308a45ba583333f7cfe4d0b0c40a18ba5 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=starting, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, url=https://www.redhat.com, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, container_name=nova_compute, com.redhat.component=openstack-nova-compute-container, name=rhosp17/openstack-nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, config_id=tripleo_step5, konflux.additional-tags=17.1.12 17.1_20251118.1, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.12, build-date=2025-11-19T00:36:58Z, vcs-type=git, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.expose-services=, vendor=Red Hat, Inc., config_data={'depends_on': 
['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '182e509007ab5e6e5b2500a552cbd5ba-879500e96bf8dfb93687004bd86f2317'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, architecture=x86_64, batch=17.1_20251118.1, release=1761123044, summary=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible) Dec 15 03:21:06 localhost podman[76595]: 2025-12-15 08:21:06.045591452 +0000 UTC m=+0.355515333 container health_status 
d6041e5471624a495d5be42684f6049ccf71802a80d5760239330151d465d146 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, vcs-type=git, tcib_managed=true, build-date=2025-11-19T00:11:48Z, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, com.redhat.component=openstack-ceilometer-compute-container, release=1761123044, config_id=tripleo_step4, name=rhosp17/openstack-ceilometer-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, version=17.1.12, container_name=ceilometer_agent_compute, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., io.openshift.expose-services=, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.buildah.version=1.41.4, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.12 17.1_20251118.1, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'fcee5a4a91f85471fca7b61211375646'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, managed_by=tripleo_ansible, architecture=x86_64, batch=17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute) Dec 15 03:21:06 localhost podman[76582]: 2025-12-15 08:21:06.063693747 +0000 UTC m=+0.387784547 container exec_died 36a88083ed613f6871d2631ae462bca308a45ba583333f7cfe4d0b0c40a18ba5 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, managed_by=tripleo_ansible, vcs-type=git, batch=17.1_20251118.1, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, name=rhosp17/openstack-nova-compute, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.openshift.expose-services=, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '182e509007ab5e6e5b2500a552cbd5ba-879500e96bf8dfb93687004bd86f2317'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': 
['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, io.buildah.version=1.41.4, tcib_managed=true, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, version=17.1.12, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, build-date=2025-11-19T00:36:58Z, architecture=x86_64, config_id=tripleo_step5, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.component=openstack-nova-compute-container, container_name=nova_compute, description=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, release=1761123044, url=https://www.redhat.com) Dec 15 03:21:06 localhost podman[76582]: unhealthy Dec 15 03:21:06 localhost systemd[1]: 
36a88083ed613f6871d2631ae462bca308a45ba583333f7cfe4d0b0c40a18ba5.service: Main process exited, code=exited, status=1/FAILURE Dec 15 03:21:06 localhost systemd[1]: 36a88083ed613f6871d2631ae462bca308a45ba583333f7cfe4d0b0c40a18ba5.service: Failed with result 'exit-code'. Dec 15 03:21:06 localhost podman[76595]: 2025-12-15 08:21:06.076257662 +0000 UTC m=+0.386181533 container exec_died d6041e5471624a495d5be42684f6049ccf71802a80d5760239330151d465d146 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-type=git, config_id=tripleo_step4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, version=17.1.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'fcee5a4a91f85471fca7b61211375646'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, maintainer=OpenStack TripleO Team, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, batch=17.1_20251118.1, name=rhosp17/openstack-ceilometer-compute, url=https://www.redhat.com, vendor=Red Hat, Inc., release=1761123044, io.buildah.version=1.41.4, managed_by=tripleo_ansible, build-date=2025-11-19T00:11:48Z, container_name=ceilometer_agent_compute, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, tcib_managed=true, com.redhat.component=openstack-ceilometer-compute-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=) Dec 15 03:21:06 localhost systemd[1]: d6041e5471624a495d5be42684f6049ccf71802a80d5760239330151d465d146.service: Deactivated successfully. 
Dec 15 03:21:06 localhost podman[76581]: 2025-12-15 08:21:06.082052557 +0000 UTC m=+0.410845712 container health_status 2649c3aec9af2f04509e1d248b1050b202fccc3ed2b86421c89a2a65376db057 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, io.buildah.version=1.41.4, maintainer=OpenStack TripleO Team, version=17.1.12, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, distribution-scope=public, batch=17.1_20251118.1, build-date=2025-11-18T23:44:13Z, url=https://www.redhat.com, vcs-type=git, com.redhat.component=openstack-iscsid-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, container_name=iscsid, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, name=rhosp17/openstack-iscsid, description=Red Hat OpenStack Platform 17.1 iscsid, konflux.additional-tags=17.1.12 17.1_20251118.1, vendor=Red Hat, Inc., config_id=tripleo_step3, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, architecture=x86_64, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, io.openshift.expose-services=, release=1761123044, summary=Red Hat OpenStack Platform 17.1 iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '182e509007ab5e6e5b2500a552cbd5ba'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, tcib_managed=true) Dec 15 03:21:06 localhost podman[76581]: 2025-12-15 08:21:06.161686938 +0000 UTC m=+0.490480103 container exec_died 2649c3aec9af2f04509e1d248b1050b202fccc3ed2b86421c89a2a65376db057 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-iscsid-container, config_id=tripleo_step3, url=https://www.redhat.com, distribution-scope=public, io.buildah.version=1.41.4, managed_by=tripleo_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '182e509007ab5e6e5b2500a552cbd5ba'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, konflux.additional-tags=17.1.12 17.1_20251118.1, tcib_managed=true, container_name=iscsid, name=rhosp17/openstack-iscsid, batch=17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, version=17.1.12, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 iscsid, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, build-date=2025-11-18T23:44:13Z, vendor=Red Hat, Inc., org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 iscsid, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, release=1761123044, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d) Dec 15 03:21:06 localhost systemd[1]: 2649c3aec9af2f04509e1d248b1050b202fccc3ed2b86421c89a2a65376db057.service: Deactivated successfully. 
Dec 15 03:21:06 localhost podman[76583]: 2025-12-15 08:21:06.223531012 +0000 UTC m=+0.546044549 container exec_died 4c3f871f4bdf287b1d3ce717956c1cf130617489217410eadf6fffcc440eb3b0 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, io.openshift.expose-services=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '879500e96bf8dfb93687004bd86f2317'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, architecture=x86_64, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, batch=17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=nova_migration_target, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vendor=Red Hat, Inc., 
vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, com.redhat.component=openstack-nova-compute-container, distribution-scope=public, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.12, build-date=2025-11-19T00:36:58Z, managed_by=tripleo_ansible, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, release=1761123044, name=rhosp17/openstack-nova-compute, config_id=tripleo_step4, io.buildah.version=1.41.4) Dec 15 03:21:06 localhost systemd[1]: 4c3f871f4bdf287b1d3ce717956c1cf130617489217410eadf6fffcc440eb3b0.service: Deactivated successfully. Dec 15 03:21:06 localhost systemd[1]: tmp-crun.3qkoff.mount: Deactivated successfully. Dec 15 03:21:09 localhost systemd[1]: Started /usr/bin/podman healthcheck run 2df8d20c6ba32f38bd0e6519c9c77a4146b632f21c81197d8c319b223aad81f1. Dec 15 03:21:09 localhost systemd[1]: Started /usr/bin/podman healthcheck run 4d35b7dc3f3a4e3c60661e85d3511f50804e87244d74cab67af5c6e8c447d379. 
Dec 15 03:21:09 localhost podman[76714]: 2025-12-15 08:21:09.738537038 +0000 UTC m=+0.074198306 container health_status 2df8d20c6ba32f38bd0e6519c9c77a4146b632f21c81197d8c319b223aad81f1 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, name=rhosp17/openstack-ovn-controller, konflux.additional-tags=17.1.12 17.1_20251118.1, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, vcs-type=git, maintainer=OpenStack TripleO Team, version=17.1.12, summary=Red Hat OpenStack Platform 17.1 ovn-controller, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, release=1761123044, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, container_name=ovn_controller, description=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20251118.1, io.openshift.expose-services=, architecture=x86_64, build-date=2025-11-18T23:34:05Z, managed_by=tripleo_ansible, io.buildah.version=1.41.4, tcib_managed=true, 
distribution-scope=public, config_id=tripleo_step4, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.component=openstack-ovn-controller-container) Dec 15 03:21:09 localhost podman[76715]: 2025-12-15 08:21:09.758676957 +0000 UTC m=+0.088749116 container health_status 4d35b7dc3f3a4e3c60661e85d3511f50804e87244d74cab67af5c6e8c447d379 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, container_name=ovn_metadata_agent, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.expose-services=, name=rhosp17/openstack-neutron-metadata-agent-ovn, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'a56a6f14b467cd9064e40c03defa5ed7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', 
'/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, managed_by=tripleo_ansible, url=https://www.redhat.com, batch=17.1_20251118.1, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, version=17.1.12, release=1761123044, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 17.1_20251118.1, build-date=2025-11-19T00:14:25Z, config_id=tripleo_step4, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, tcib_managed=true, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, io.buildah.version=1.41.4, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn) Dec 15 03:21:09 localhost podman[76714]: 2025-12-15 08:21:09.767739999 +0000 UTC m=+0.103401297 container exec_died 2df8d20c6ba32f38bd0e6519c9c77a4146b632f21c81197d8c319b223aad81f1 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, summary=Red Hat OpenStack Platform 17.1 ovn-controller, name=rhosp17/openstack-ovn-controller, distribution-scope=public, io.buildah.version=1.41.4, vendor=Red Hat, Inc., release=1761123044, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-type=git, container_name=ovn_controller, io.openshift.tags=rhosp osp openstack osp-17.1 
openstack-ovn-controller, url=https://www.redhat.com, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, batch=17.1_20251118.1, config_id=tripleo_step4, build-date=2025-11-18T23:34:05Z, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, com.redhat.component=openstack-ovn-controller-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 ovn-controller, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.12, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller) Dec 15 03:21:09 localhost systemd[1]: 2df8d20c6ba32f38bd0e6519c9c77a4146b632f21c81197d8c319b223aad81f1.service: Deactivated successfully. 
Dec 15 03:21:09 localhost podman[76715]: 2025-12-15 08:21:09.800870976 +0000 UTC m=+0.130943095 container exec_died 4d35b7dc3f3a4e3c60661e85d3511f50804e87244d74cab67af5c6e8c447d379 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, managed_by=tripleo_ansible, url=https://www.redhat.com, release=1761123044, vendor=Red Hat, Inc., tcib_managed=true, config_id=tripleo_step4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, container_name=ovn_metadata_agent, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, konflux.additional-tags=17.1.12 17.1_20251118.1, name=rhosp17/openstack-neutron-metadata-agent-ovn, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, version=17.1.12, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.expose-services=, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, io.buildah.version=1.41.4, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, build-date=2025-11-19T00:14:25Z, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'a56a6f14b467cd9064e40c03defa5ed7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, distribution-scope=public, architecture=x86_64, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, batch=17.1_20251118.1) Dec 15 03:21:09 localhost systemd[1]: 4d35b7dc3f3a4e3c60661e85d3511f50804e87244d74cab67af5c6e8c447d379.service: Deactivated successfully. Dec 15 03:21:19 localhost systemd[1]: Started /usr/bin/podman healthcheck run 6278d648aec9c9f12e0efaafa0d6ae5653eeb34459516699e562eeb9c8d23b3c. 
Dec 15 03:21:19 localhost podman[76765]: 2025-12-15 08:21:19.755205981 +0000 UTC m=+0.089402670 container health_status 6278d648aec9c9f12e0efaafa0d6ae5653eeb34459516699e562eeb9c8d23b3c (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, url=https://www.redhat.com, io.buildah.version=1.41.4, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, name=rhosp17/openstack-qdrouterd, config_id=tripleo_step1, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, distribution-scope=public, build-date=2025-11-18T22:49:46Z, batch=17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-qdrouterd-container, container_name=metrics_qdr, description=Red Hat OpenStack Platform 17.1 qdrouterd, release=1761123044, version=17.1.12, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vcs-type=git, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '29d07da103bbd44a9ed3e29999314b03'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, summary=Red Hat OpenStack Platform 17.1 qdrouterd, architecture=x86_64, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, managed_by=tripleo_ansible) Dec 15 03:21:20 localhost podman[76765]: 2025-12-15 08:21:20.01345075 +0000 UTC m=+0.347647409 container exec_died 6278d648aec9c9f12e0efaafa0d6ae5653eeb34459516699e562eeb9c8d23b3c (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, distribution-scope=public, vcs-type=git, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, vendor=Red Hat, Inc., config_id=tripleo_step1, description=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, name=rhosp17/openstack-qdrouterd, version=17.1.12, konflux.additional-tags=17.1.12 17.1_20251118.1, tcib_managed=true, container_name=metrics_qdr, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-18T22:49:46Z, com.redhat.component=openstack-qdrouterd-container, managed_by=tripleo_ansible, architecture=x86_64, maintainer=OpenStack TripleO Team, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, batch=17.1_20251118.1, release=1761123044, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '29d07da103bbd44a9ed3e29999314b03'}, 'healthcheck': {'test': 
'/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, io.openshift.expose-services=, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 qdrouterd, io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a) Dec 15 03:21:20 localhost systemd[1]: 6278d648aec9c9f12e0efaafa0d6ae5653eeb34459516699e562eeb9c8d23b3c.service: Deactivated successfully. Dec 15 03:21:33 localhost systemd[1]: Started /usr/bin/podman healthcheck run 165a6fd1f2b629d16da0798ec1c9bac41d302d530a0ffa668b2315a52d6aa04e. 
Dec 15 03:21:33 localhost podman[76793]: 2025-12-15 08:21:33.737543485 +0000 UTC m=+0.073015112 container health_status 165a6fd1f2b629d16da0798ec1c9bac41d302d530a0ffa668b2315a52d6aa04e (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, io.buildah.version=1.41.4, io.openshift.expose-services=, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp17/openstack-collectd, batch=17.1_20251118.1, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, container_name=collectd, konflux.additional-tags=17.1.12 17.1_20251118.1, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', 
'/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, summary=Red Hat OpenStack Platform 17.1 collectd, release=1761123044, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_id=tripleo_step3, build-date=2025-11-18T22:51:28Z, com.redhat.component=openstack-collectd-container, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, version=17.1.12, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 collectd) Dec 15 03:21:33 localhost podman[76793]: 2025-12-15 08:21:33.747510401 +0000 UTC m=+0.082982058 container exec_died 165a6fd1f2b629d16da0798ec1c9bac41d302d530a0ffa668b2315a52d6aa04e (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, version=17.1.12, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, name=rhosp17/openstack-collectd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, tcib_managed=true, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.4, com.redhat.component=openstack-collectd-container, description=Red Hat OpenStack Platform 17.1 collectd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, summary=Red Hat OpenStack Platform 17.1 collectd, batch=17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., release=1761123044, config_id=tripleo_step3, url=https://www.redhat.com, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-type=git, architecture=x86_64, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, build-date=2025-11-18T22:51:28Z, container_name=collectd, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05) Dec 15 03:21:33 localhost systemd[1]: 165a6fd1f2b629d16da0798ec1c9bac41d302d530a0ffa668b2315a52d6aa04e.service: Deactivated successfully. Dec 15 03:21:36 localhost systemd[1]: Started /usr/bin/podman healthcheck run 2649c3aec9af2f04509e1d248b1050b202fccc3ed2b86421c89a2a65376db057. 
Dec 15 03:21:36 localhost systemd[1]: Started /usr/bin/podman healthcheck run 36a88083ed613f6871d2631ae462bca308a45ba583333f7cfe4d0b0c40a18ba5. Dec 15 03:21:36 localhost systemd[1]: Started /usr/bin/podman healthcheck run 4c3f871f4bdf287b1d3ce717956c1cf130617489217410eadf6fffcc440eb3b0. Dec 15 03:21:36 localhost systemd[1]: Started /usr/bin/podman healthcheck run 97115ff7c5a84ae7e17f79e696e94da3c63f9fe0051287fe9193bec282a03f99. Dec 15 03:21:36 localhost systemd[1]: Started /usr/bin/podman healthcheck run ac45612fab357b615b4684dbfa5bd000dd29eb1adfbaedb6db0802fc49e89549. Dec 15 03:21:36 localhost systemd[1]: Started /usr/bin/podman healthcheck run d6041e5471624a495d5be42684f6049ccf71802a80d5760239330151d465d146. Dec 15 03:21:36 localhost podman[76814]: 2025-12-15 08:21:36.766954377 +0000 UTC m=+0.094386403 container health_status 2649c3aec9af2f04509e1d248b1050b202fccc3ed2b86421c89a2a65376db057 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '182e509007ab5e6e5b2500a552cbd5ba'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', 
'/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, vcs-type=git, description=Red Hat OpenStack Platform 17.1 iscsid, vendor=Red Hat, Inc., distribution-scope=public, version=17.1.12, managed_by=tripleo_ansible, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, release=1761123044, config_id=tripleo_step3, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, url=https://www.redhat.com, konflux.additional-tags=17.1.12 17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 iscsid, name=rhosp17/openstack-iscsid, tcib_managed=true, com.redhat.component=openstack-iscsid-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.buildah.version=1.41.4, batch=17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, architecture=x86_64, build-date=2025-11-18T23:44:13Z, io.openshift.expose-services=, container_name=iscsid, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid) Dec 15 03:21:36 localhost podman[76814]: 2025-12-15 08:21:36.805537458 +0000 UTC m=+0.132969494 container exec_died 2649c3aec9af2f04509e1d248b1050b202fccc3ed2b86421c89a2a65376db057 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, build-date=2025-11-18T23:44:13Z, summary=Red Hat OpenStack Platform 17.1 iscsid, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, konflux.additional-tags=17.1.12 17.1_20251118.1, container_name=iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 
'TRIPLEO_CONFIG_HASH': '182e509007ab5e6e5b2500a552cbd5ba'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, config_id=tripleo_step3, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, version=17.1.12, io.buildah.version=1.41.4, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 iscsid, url=https://www.redhat.com, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, vcs-type=git, release=1761123044, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.expose-services=, managed_by=tripleo_ansible, name=rhosp17/openstack-iscsid, com.redhat.component=openstack-iscsid-container, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, vendor=Red Hat, Inc., batch=17.1_20251118.1) Dec 15 03:21:36 localhost systemd[1]: 
2649c3aec9af2f04509e1d248b1050b202fccc3ed2b86421c89a2a65376db057.service: Deactivated successfully. Dec 15 03:21:36 localhost systemd[1]: tmp-crun.d9ACbP.mount: Deactivated successfully. Dec 15 03:21:36 localhost podman[76818]: 2025-12-15 08:21:36.832603551 +0000 UTC m=+0.148893559 container health_status ac45612fab357b615b4684dbfa5bd000dd29eb1adfbaedb6db0802fc49e89549 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, vcs-type=git, name=rhosp17/openstack-cron, tcib_managed=true, build-date=2025-11-18T22:49:32Z, container_name=logrotate_crond, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, config_id=tripleo_step4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, release=1761123044, version=17.1.12, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, io.k8s.display-name=Red Hat OpenStack Platform 17.1 
cron, description=Red Hat OpenStack Platform 17.1 cron, vendor=Red Hat, Inc., konflux.additional-tags=17.1.12 17.1_20251118.1, url=https://www.redhat.com, com.redhat.component=openstack-cron-container, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, io.buildah.version=1.41.4, batch=17.1_20251118.1, architecture=x86_64, managed_by=tripleo_ansible, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 cron, io.openshift.expose-services=) Dec 15 03:21:36 localhost podman[76816]: 2025-12-15 08:21:36.877186512 +0000 UTC m=+0.195021142 container health_status 4c3f871f4bdf287b1d3ce717956c1cf130617489217410eadf6fffcc440eb3b0 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, batch=17.1_20251118.1, version=17.1.12, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, distribution-scope=public, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '879500e96bf8dfb93687004bd86f2317'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, summary=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2025-11-19T00:36:58Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=nova_migration_target, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, konflux.additional-tags=17.1.12 17.1_20251118.1, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, vcs-type=git, io.buildah.version=1.41.4, url=https://www.redhat.com, name=rhosp17/openstack-nova-compute, com.redhat.component=openstack-nova-compute-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, release=1761123044, io.openshift.expose-services=, vendor=Red Hat, Inc., config_id=tripleo_step4) Dec 15 03:21:36 localhost podman[76818]: 2025-12-15 08:21:36.894453963 +0000 UTC m=+0.210743971 container exec_died ac45612fab357b615b4684dbfa5bd000dd29eb1adfbaedb6db0802fc49e89549 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, release=1761123044, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 
'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, tcib_managed=true, container_name=logrotate_crond, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.buildah.version=1.41.4, vendor=Red Hat, Inc., config_id=tripleo_step4, build-date=2025-11-18T22:49:32Z, com.redhat.component=openstack-cron-container, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 cron, version=17.1.12, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, summary=Red Hat OpenStack Platform 17.1 cron, architecture=x86_64, io.openshift.expose-services=, name=rhosp17/openstack-cron, url=https://www.redhat.com, batch=17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team, vcs-type=git, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a) Dec 15 03:21:36 localhost systemd[1]: 
ac45612fab357b615b4684dbfa5bd000dd29eb1adfbaedb6db0802fc49e89549.service: Deactivated successfully. Dec 15 03:21:36 localhost podman[76830]: 2025-12-15 08:21:36.990643693 +0000 UTC m=+0.302854971 container health_status d6041e5471624a495d5be42684f6049ccf71802a80d5760239330151d465d146 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, version=17.1.12, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.12 17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_id=tripleo_step4, com.redhat.component=openstack-ceilometer-compute-container, build-date=2025-11-19T00:11:48Z, url=https://www.redhat.com, io.openshift.expose-services=, architecture=x86_64, vendor=Red Hat, Inc., org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, distribution-scope=public, container_name=ceilometer_agent_compute, batch=17.1_20251118.1, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'fcee5a4a91f85471fca7b61211375646'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.4, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, name=rhosp17/openstack-ceilometer-compute, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, release=1761123044, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 ceilometer-compute) Dec 15 03:21:37 localhost podman[76817]: 2025-12-15 08:21:37.043162706 +0000 UTC m=+0.362061404 container health_status 97115ff7c5a84ae7e17f79e696e94da3c63f9fe0051287fe9193bec282a03f99 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, release=1761123044, io.buildah.version=1.41.4, tcib_managed=true, distribution-scope=public, url=https://www.redhat.com, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, vcs-type=git, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'fcee5a4a91f85471fca7b61211375646'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, container_name=ceilometer_agent_ipmi, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20251118.1, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, architecture=x86_64, konflux.additional-tags=17.1.12 17.1_20251118.1, version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vendor=Red Hat, Inc., name=rhosp17/openstack-ceilometer-ipmi, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, config_id=tripleo_step4, com.redhat.component=openstack-ceilometer-ipmi-container, build-date=2025-11-19T00:12:45Z) Dec 15 03:21:37 localhost podman[76830]: 2025-12-15 08:21:37.056478912 +0000 UTC m=+0.368690180 container exec_died d6041e5471624a495d5be42684f6049ccf71802a80d5760239330151d465d146 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, container_name=ceilometer_agent_compute, version=17.1.12, build-date=2025-11-19T00:11:48Z, batch=17.1_20251118.1, vendor=Red Hat, Inc., config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': 
{'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'fcee5a4a91f85471fca7b61211375646'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.buildah.version=1.41.4, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step4, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, release=1761123044, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team, 
com.redhat.component=openstack-ceilometer-compute-container, managed_by=tripleo_ansible, io.openshift.expose-services=, name=rhosp17/openstack-ceilometer-compute, architecture=x86_64) Dec 15 03:21:37 localhost systemd[1]: d6041e5471624a495d5be42684f6049ccf71802a80d5760239330151d465d146.service: Deactivated successfully. Dec 15 03:21:37 localhost podman[76817]: 2025-12-15 08:21:37.098448793 +0000 UTC m=+0.417347471 container exec_died 97115ff7c5a84ae7e17f79e696e94da3c63f9fe0051287fe9193bec282a03f99 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, release=1761123044, vendor=Red Hat, Inc., batch=17.1_20251118.1, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, name=rhosp17/openstack-ceilometer-ipmi, build-date=2025-11-19T00:12:45Z, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'fcee5a4a91f85471fca7b61211375646'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, 
com.redhat.component=openstack-ceilometer-ipmi-container, io.buildah.version=1.41.4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_id=tripleo_step4, konflux.additional-tags=17.1.12 17.1_20251118.1, tcib_managed=true, maintainer=OpenStack TripleO Team, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-type=git, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.12, distribution-scope=public, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, container_name=ceilometer_agent_ipmi) Dec 15 03:21:37 localhost systemd[1]: 97115ff7c5a84ae7e17f79e696e94da3c63f9fe0051287fe9193bec282a03f99.service: Deactivated successfully. 
Dec 15 03:21:37 localhost podman[76815]: 2025-12-15 08:21:37.158381004 +0000 UTC m=+0.483602390 container health_status 36a88083ed613f6871d2631ae462bca308a45ba583333f7cfe4d0b0c40a18ba5 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=unhealthy, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, name=rhosp17/openstack-nova-compute, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, container_name=nova_compute, description=Red Hat OpenStack Platform 17.1 nova-compute, config_id=tripleo_step5, io.buildah.version=1.41.4, maintainer=OpenStack TripleO Team, batch=17.1_20251118.1, architecture=x86_64, vendor=Red Hat, Inc., org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, summary=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2025-11-19T00:36:58Z, distribution-scope=public, release=1761123044, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '182e509007ab5e6e5b2500a552cbd5ba-879500e96bf8dfb93687004bd86f2317'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', 
'/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.expose-services=, vcs-type=git, tcib_managed=true, com.redhat.component=openstack-nova-compute-container, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, version=17.1.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute) Dec 15 03:21:37 localhost podman[76815]: 2025-12-15 08:21:37.215332076 +0000 UTC m=+0.540553512 container exec_died 36a88083ed613f6871d2631ae462bca308a45ba583333f7cfe4d0b0c40a18ba5 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step5, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, container_name=nova_compute, name=rhosp17/openstack-nova-compute, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, description=Red Hat OpenStack Platform 
17.1 nova-compute, build-date=2025-11-19T00:36:58Z, version=17.1.12, tcib_managed=true, com.redhat.component=openstack-nova-compute-container, distribution-scope=public, architecture=x86_64, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, managed_by=tripleo_ansible, konflux.additional-tags=17.1.12 17.1_20251118.1, batch=17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, io.buildah.version=1.41.4, release=1761123044, url=https://www.redhat.com, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '182e509007ab5e6e5b2500a552cbd5ba-879500e96bf8dfb93687004bd86f2317'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', 
'/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, maintainer=OpenStack TripleO Team) Dec 15 03:21:37 localhost podman[76815]: unhealthy Dec 15 03:21:37 localhost systemd[1]: 36a88083ed613f6871d2631ae462bca308a45ba583333f7cfe4d0b0c40a18ba5.service: Main process exited, code=exited, status=1/FAILURE Dec 15 03:21:37 localhost systemd[1]: 36a88083ed613f6871d2631ae462bca308a45ba583333f7cfe4d0b0c40a18ba5.service: Failed with result 'exit-code'. Dec 15 03:21:37 localhost podman[76816]: 2025-12-15 08:21:37.273474549 +0000 UTC m=+0.591309169 container exec_died 4c3f871f4bdf287b1d3ce717956c1cf130617489217410eadf6fffcc440eb3b0 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, com.redhat.component=openstack-nova-compute-container, build-date=2025-11-19T00:36:58Z, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, url=https://www.redhat.com, konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, config_id=tripleo_step4, container_name=nova_migration_target, release=1761123044, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, version=17.1.12, architecture=x86_64, batch=17.1_20251118.1, managed_by=tripleo_ansible, name=rhosp17/openstack-nova-compute, 
io.openshift.expose-services=, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '879500e96bf8dfb93687004bd86f2317'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, description=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, summary=Red Hat OpenStack Platform 17.1 nova-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public) Dec 15 03:21:37 localhost systemd[1]: 4c3f871f4bdf287b1d3ce717956c1cf130617489217410eadf6fffcc440eb3b0.service: Deactivated successfully. Dec 15 03:21:40 localhost systemd[1]: Started /usr/bin/podman healthcheck run 2df8d20c6ba32f38bd0e6519c9c77a4146b632f21c81197d8c319b223aad81f1. Dec 15 03:21:40 localhost systemd[1]: Started /usr/bin/podman healthcheck run 4d35b7dc3f3a4e3c60661e85d3511f50804e87244d74cab67af5c6e8c447d379. 
Dec 15 03:21:40 localhost podman[76949]: 2025-12-15 08:21:40.753105289 +0000 UTC m=+0.081671323 container health_status 4d35b7dc3f3a4e3c60661e85d3511f50804e87244d74cab67af5c6e8c447d379 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, konflux.additional-tags=17.1.12 17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, container_name=ovn_metadata_agent, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.4, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, name=rhosp17/openstack-neutron-metadata-agent-ovn, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vendor=Red Hat, Inc., build-date=2025-11-19T00:14:25Z, architecture=x86_64, url=https://www.redhat.com, managed_by=tripleo_ansible, version=17.1.12, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'a56a6f14b467cd9064e40c03defa5ed7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, release=1761123044, config_id=tripleo_step4, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, batch=17.1_20251118.1, tcib_managed=true) Dec 15 03:21:40 localhost podman[76948]: 2025-12-15 08:21:40.814510769 +0000 UTC m=+0.142639102 container health_status 2df8d20c6ba32f38bd0e6519c9c77a4146b632f21c81197d8c319b223aad81f1 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, build-date=2025-11-18T23:34:05Z, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, description=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-type=git, com.redhat.component=openstack-ovn-controller-container, io.openshift.expose-services=, managed_by=tripleo_ansible, release=1761123044, version=17.1.12, konflux.additional-tags=17.1.12 17.1_20251118.1, config_data={'depends_on': 
['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, config_id=tripleo_step4, vendor=Red Hat, Inc., name=rhosp17/openstack-ovn-controller, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, container_name=ovn_controller, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, maintainer=OpenStack TripleO Team, batch=17.1_20251118.1, tcib_managed=true, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 ovn-controller, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, url=https://www.redhat.com, io.buildah.version=1.41.4, architecture=x86_64) Dec 15 03:21:40 localhost podman[76949]: 2025-12-15 08:21:40.820286653 +0000 UTC m=+0.148852697 container exec_died 4d35b7dc3f3a4e3c60661e85d3511f50804e87244d74cab67af5c6e8c447d379 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, managed_by=tripleo_ansible, container_name=ovn_metadata_agent, io.openshift.expose-services=, version=17.1.12, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, build-date=2025-11-19T00:14:25Z, name=rhosp17/openstack-neutron-metadata-agent-ovn, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 
'TRIPLEO_CONFIG_HASH': 'a56a6f14b467cd9064e40c03defa5ed7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, release=1761123044, tcib_managed=true, config_id=tripleo_step4, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 
neutron-metadata-agent-ovn, konflux.additional-tags=17.1.12 17.1_20251118.1, url=https://www.redhat.com, vendor=Red Hat, Inc., distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.buildah.version=1.41.4, batch=17.1_20251118.1) Dec 15 03:21:40 localhost systemd[1]: 4d35b7dc3f3a4e3c60661e85d3511f50804e87244d74cab67af5c6e8c447d379.service: Deactivated successfully. Dec 15 03:21:40 localhost podman[76948]: 2025-12-15 08:21:40.866511888 +0000 UTC m=+0.194640221 container exec_died 2df8d20c6ba32f38bd0e6519c9c77a4146b632f21c81197d8c319b223aad81f1 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, distribution-scope=public, build-date=2025-11-18T23:34:05Z, managed_by=tripleo_ansible, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, url=https://www.redhat.com, io.buildah.version=1.41.4, vendor=Red Hat, Inc., tcib_managed=true, name=rhosp17/openstack-ovn-controller, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, config_id=tripleo_step4, vcs-type=git, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, release=1761123044, architecture=x86_64, com.redhat.component=openstack-ovn-controller-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, container_name=ovn_controller, konflux.additional-tags=17.1.12 17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 ovn-controller, summary=Red Hat OpenStack Platform 17.1 ovn-controller, batch=17.1_20251118.1, version=17.1.12, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, config_data={'depends_on': ['openvswitch.service'], 
'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}) Dec 15 03:21:40 localhost systemd[1]: 2df8d20c6ba32f38bd0e6519c9c77a4146b632f21c81197d8c319b223aad81f1.service: Deactivated successfully. Dec 15 03:21:50 localhost systemd[1]: Started /usr/bin/podman healthcheck run 6278d648aec9c9f12e0efaafa0d6ae5653eeb34459516699e562eeb9c8d23b3c. Dec 15 03:21:50 localhost systemd[1]: tmp-crun.Rvm5Sx.mount: Deactivated successfully. Dec 15 03:21:50 localhost podman[76997]: 2025-12-15 08:21:50.763173481 +0000 UTC m=+0.095439281 container health_status 6278d648aec9c9f12e0efaafa0d6ae5653eeb34459516699e562eeb9c8d23b3c (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, release=1761123044, managed_by=tripleo_ansible, vcs-type=git, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, description=Red Hat OpenStack Platform 17.1 qdrouterd, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_id=tripleo_step1, summary=Red Hat OpenStack Platform 17.1 qdrouterd, io.buildah.version=1.41.4, konflux.additional-tags=17.1.12 17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, name=rhosp17/openstack-qdrouterd, vendor=Red Hat, Inc., version=17.1.12, build-date=2025-11-18T22:49:46Z, io.k8s.display-name=Red Hat 
OpenStack Platform 17.1 qdrouterd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.component=openstack-qdrouterd-container, container_name=metrics_qdr, url=https://www.redhat.com, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '29d07da103bbd44a9ed3e29999314b03'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, batch=17.1_20251118.1, distribution-scope=public, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI) Dec 15 03:21:50 localhost podman[76997]: 2025-12-15 08:21:50.967969592 +0000 UTC m=+0.300235422 container exec_died 6278d648aec9c9f12e0efaafa0d6ae5653eeb34459516699e562eeb9c8d23b3c (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, url=https://www.redhat.com, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '29d07da103bbd44a9ed3e29999314b03'}, 
'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, release=1761123044, tcib_managed=true, vcs-type=git, name=rhosp17/openstack-qdrouterd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vendor=Red Hat, Inc., build-date=2025-11-18T22:49:46Z, architecture=x86_64, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, batch=17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, managed_by=tripleo_ansible, version=17.1.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, konflux.additional-tags=17.1.12 17.1_20251118.1, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.component=openstack-qdrouterd-container, container_name=metrics_qdr, description=Red Hat OpenStack Platform 17.1 qdrouterd, summary=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.expose-services=, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step1, maintainer=OpenStack TripleO Team, distribution-scope=public) Dec 15 03:21:50 localhost systemd[1]: 6278d648aec9c9f12e0efaafa0d6ae5653eeb34459516699e562eeb9c8d23b3c.service: Deactivated successfully. Dec 15 03:22:04 localhost systemd[1]: Started /usr/bin/podman healthcheck run 165a6fd1f2b629d16da0798ec1c9bac41d302d530a0ffa668b2315a52d6aa04e. Dec 15 03:22:04 localhost podman[77106]: 2025-12-15 08:22:04.766100084 +0000 UTC m=+0.092585716 container health_status 165a6fd1f2b629d16da0798ec1c9bac41d302d530a0ffa668b2315a52d6aa04e (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, name=rhosp17/openstack-collectd, architecture=x86_64, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', 
'/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, description=Red Hat OpenStack Platform 17.1 collectd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-11-18T22:51:28Z, com.redhat.component=openstack-collectd-container, io.buildah.version=1.41.4, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, version=17.1.12, summary=Red Hat OpenStack Platform 17.1 collectd, io.openshift.expose-services=, container_name=collectd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, tcib_managed=true, config_id=tripleo_step3, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, url=https://www.redhat.com, batch=17.1_20251118.1, release=1761123044, managed_by=tripleo_ansible, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, vcs-type=git, konflux.additional-tags=17.1.12 17.1_20251118.1) Dec 15 03:22:04 localhost podman[77106]: 2025-12-15 08:22:04.822593233 +0000 UTC m=+0.149078915 container exec_died 165a6fd1f2b629d16da0798ec1c9bac41d302d530a0ffa668b2315a52d6aa04e (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, name=rhosp17/openstack-collectd, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, url=https://www.redhat.com, 
managed_by=tripleo_ansible, container_name=collectd, summary=Red Hat OpenStack Platform 17.1 collectd, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, version=17.1.12, distribution-scope=public, release=1761123044, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 collectd, build-date=2025-11-18T22:51:28Z, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.component=openstack-collectd-container, config_id=tripleo_step3, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.buildah.version=1.41.4, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 
17.1 collectd, batch=17.1_20251118.1, io.openshift.expose-services=, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a) Dec 15 03:22:04 localhost systemd[1]: 165a6fd1f2b629d16da0798ec1c9bac41d302d530a0ffa668b2315a52d6aa04e.service: Deactivated successfully. Dec 15 03:22:05 localhost sshd[77126]: main: sshd: ssh-rsa algorithm is disabled Dec 15 03:22:07 localhost systemd[1]: Started /usr/bin/podman healthcheck run 2649c3aec9af2f04509e1d248b1050b202fccc3ed2b86421c89a2a65376db057. Dec 15 03:22:07 localhost systemd[1]: Started /usr/bin/podman healthcheck run 36a88083ed613f6871d2631ae462bca308a45ba583333f7cfe4d0b0c40a18ba5. Dec 15 03:22:07 localhost systemd[1]: Started /usr/bin/podman healthcheck run 4c3f871f4bdf287b1d3ce717956c1cf130617489217410eadf6fffcc440eb3b0. Dec 15 03:22:07 localhost systemd[1]: Started /usr/bin/podman healthcheck run 97115ff7c5a84ae7e17f79e696e94da3c63f9fe0051287fe9193bec282a03f99. Dec 15 03:22:07 localhost systemd[1]: Started /usr/bin/podman healthcheck run ac45612fab357b615b4684dbfa5bd000dd29eb1adfbaedb6db0802fc49e89549. Dec 15 03:22:07 localhost systemd[1]: Started /usr/bin/podman healthcheck run d6041e5471624a495d5be42684f6049ccf71802a80d5760239330151d465d146. 
Dec 15 03:22:07 localhost podman[77148]: 2025-12-15 08:22:07.578153488 +0000 UTC m=+0.085597987 container health_status ac45612fab357b615b4684dbfa5bd000dd29eb1adfbaedb6db0802fc49e89549 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, com.redhat.component=openstack-cron-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.buildah.version=1.41.4, version=17.1.12, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, batch=17.1_20251118.1, vendor=Red Hat, Inc., build-date=2025-11-18T22:49:32Z, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, konflux.additional-tags=17.1.12 17.1_20251118.1, distribution-scope=public, 
cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_id=tripleo_step4, url=https://www.redhat.com, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, vcs-type=git, description=Red Hat OpenStack Platform 17.1 cron, release=1761123044, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, summary=Red Hat OpenStack Platform 17.1 cron, tcib_managed=true, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, architecture=x86_64, container_name=logrotate_crond, name=rhosp17/openstack-cron) Dec 15 03:22:07 localhost podman[77148]: 2025-12-15 08:22:07.616351229 +0000 UTC m=+0.123795728 container exec_died ac45612fab357b615b4684dbfa5bd000dd29eb1adfbaedb6db0802fc49e89549 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, vendor=Red Hat, Inc., io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, container_name=logrotate_crond, batch=17.1_20251118.1, io.openshift.expose-services=, com.redhat.component=openstack-cron-container, tcib_managed=true, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, architecture=x86_64, config_id=tripleo_step4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, build-date=2025-11-18T22:49:32Z, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, managed_by=tripleo_ansible, release=1761123044, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, name=rhosp17/openstack-cron, version=17.1.12, description=Red Hat OpenStack Platform 17.1 cron, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 cron, distribution-scope=public, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron) Dec 15 03:22:07 localhost podman[77134]: 2025-12-15 08:22:07.630329302 +0000 UTC m=+0.142024855 container health_status 97115ff7c5a84ae7e17f79e696e94da3c63f9fe0051287fe9193bec282a03f99 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, konflux.additional-tags=17.1.12 17.1_20251118.1, batch=17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-type=git, config_id=tripleo_step4, architecture=x86_64, distribution-scope=public, url=https://www.redhat.com, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, release=1761123044, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, 
name=rhosp17/openstack-ceilometer-ipmi, managed_by=tripleo_ansible, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'fcee5a4a91f85471fca7b61211375646'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.openshift.expose-services=, vendor=Red Hat, Inc., container_name=ceilometer_agent_ipmi, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.buildah.version=1.41.4, build-date=2025-11-19T00:12:45Z, com.redhat.component=openstack-ceilometer-ipmi-container, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, version=17.1.12, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676) Dec 15 03:22:07 localhost systemd[1]: ac45612fab357b615b4684dbfa5bd000dd29eb1adfbaedb6db0802fc49e89549.service: Deactivated successfully. 
Dec 15 03:22:07 localhost podman[77129]: 2025-12-15 08:22:07.684364686 +0000 UTC m=+0.204093244 container health_status 36a88083ed613f6871d2631ae462bca308a45ba583333f7cfe4d0b0c40a18ba5 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=unhealthy, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 nova-compute, distribution-scope=public, managed_by=tripleo_ansible, name=rhosp17/openstack-nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2025-11-19T00:36:58Z, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, release=1761123044, config_id=tripleo_step5, io.buildah.version=1.41.4, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, version=17.1.12, batch=17.1_20251118.1, com.redhat.component=openstack-nova-compute-container, maintainer=OpenStack TripleO Team, architecture=x86_64, container_name=nova_compute, url=https://www.redhat.com, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '182e509007ab5e6e5b2500a552cbd5ba-879500e96bf8dfb93687004bd86f2317'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': 
['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, vcs-type=git) Dec 15 03:22:07 localhost podman[77149]: 2025-12-15 08:22:07.604331788 +0000 UTC m=+0.105794847 container health_status d6041e5471624a495d5be42684f6049ccf71802a80d5760239330151d465d146 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1761123044, name=rhosp17/openstack-ceilometer-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'fcee5a4a91f85471fca7b61211375646'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 
'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-type=git, maintainer=OpenStack TripleO Team, build-date=2025-11-19T00:11:48Z, version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, tcib_managed=true, vendor=Red Hat, Inc., distribution-scope=public, container_name=ceilometer_agent_compute, com.redhat.component=openstack-ceilometer-compute-container, io.buildah.version=1.41.4, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, managed_by=tripleo_ansible, config_id=tripleo_step4, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, url=https://www.redhat.com, batch=17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, 
architecture=x86_64) Dec 15 03:22:07 localhost podman[77134]: 2025-12-15 08:22:07.713736621 +0000 UTC m=+0.225432234 container exec_died 97115ff7c5a84ae7e17f79e696e94da3c63f9fe0051287fe9193bec282a03f99 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, io.buildah.version=1.41.4, com.redhat.component=openstack-ceilometer-ipmi-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, architecture=x86_64, url=https://www.redhat.com, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, build-date=2025-11-19T00:12:45Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, config_id=tripleo_step4, vcs-type=git, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, name=rhosp17/openstack-ceilometer-ipmi, batch=17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, container_name=ceilometer_agent_ipmi, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'fcee5a4a91f85471fca7b61211375646'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, release=1761123044, distribution-scope=public, managed_by=tripleo_ansible, tcib_managed=true, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.expose-services=, version=17.1.12, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi) Dec 15 03:22:07 localhost podman[77129]: 2025-12-15 08:22:07.724378735 +0000 UTC m=+0.244107363 container exec_died 36a88083ed613f6871d2631ae462bca308a45ba583333f7cfe4d0b0c40a18ba5 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-type=git, version=17.1.12, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, description=Red Hat OpenStack Platform 17.1 nova-compute, config_id=tripleo_step5, name=rhosp17/openstack-nova-compute, batch=17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.41.4, architecture=x86_64, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '182e509007ab5e6e5b2500a552cbd5ba-879500e96bf8dfb93687004bd86f2317'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, vendor=Red Hat, Inc., com.redhat.component=openstack-nova-compute-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, container_name=nova_compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, url=https://www.redhat.com, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 nova-compute, release=1761123044, managed_by=tripleo_ansible, io.openshift.expose-services=, build-date=2025-11-19T00:36:58Z, distribution-scope=public, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream) Dec 15 03:22:07 localhost systemd[1]: 97115ff7c5a84ae7e17f79e696e94da3c63f9fe0051287fe9193bec282a03f99.service: 
Deactivated successfully. Dec 15 03:22:07 localhost podman[77129]: unhealthy Dec 15 03:22:07 localhost systemd[1]: 36a88083ed613f6871d2631ae462bca308a45ba583333f7cfe4d0b0c40a18ba5.service: Main process exited, code=exited, status=1/FAILURE Dec 15 03:22:07 localhost systemd[1]: 36a88083ed613f6871d2631ae462bca308a45ba583333f7cfe4d0b0c40a18ba5.service: Failed with result 'exit-code'. Dec 15 03:22:07 localhost podman[77149]: 2025-12-15 08:22:07.743607238 +0000 UTC m=+0.245070327 container exec_died d6041e5471624a495d5be42684f6049ccf71802a80d5760239330151d465d146 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, version=17.1.12, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'fcee5a4a91f85471fca7b61211375646'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, 
distribution-scope=public, tcib_managed=true, com.redhat.component=openstack-ceilometer-compute-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.12 17.1_20251118.1, release=1761123044, container_name=ceilometer_agent_compute, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, url=https://www.redhat.com, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, config_id=tripleo_step4, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.expose-services=, batch=17.1_20251118.1, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-19T00:11:48Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, architecture=x86_64, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, managed_by=tripleo_ansible, name=rhosp17/openstack-ceilometer-compute) Dec 15 03:22:07 localhost systemd[1]: d6041e5471624a495d5be42684f6049ccf71802a80d5760239330151d465d146.service: Deactivated successfully. 
Dec 15 03:22:07 localhost podman[77128]: 2025-12-15 08:22:07.798480365 +0000 UTC m=+0.321892361 container health_status 2649c3aec9af2f04509e1d248b1050b202fccc3ed2b86421c89a2a65376db057 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, konflux.additional-tags=17.1.12 17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '182e509007ab5e6e5b2500a552cbd5ba'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, io.openshift.expose-services=, release=1761123044, name=rhosp17/openstack-iscsid, batch=17.1_20251118.1, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, build-date=2025-11-18T23:44:13Z, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, tcib_managed=true, url=https://www.redhat.com, vcs-type=git, config_id=tripleo_step3, description=Red Hat OpenStack Platform 17.1 iscsid, distribution-scope=public, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, com.redhat.component=openstack-iscsid-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, vendor=Red Hat, Inc., container_name=iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, managed_by=tripleo_ansible, version=17.1.12, architecture=x86_64) Dec 15 03:22:07 localhost podman[77128]: 2025-12-15 08:22:07.83645425 +0000 UTC m=+0.359866296 container exec_died 2649c3aec9af2f04509e1d248b1050b202fccc3ed2b86421c89a2a65376db057 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, io.buildah.version=1.41.4, com.redhat.component=openstack-iscsid-container, summary=Red Hat OpenStack Platform 17.1 iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '182e509007ab5e6e5b2500a552cbd5ba'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', 
'/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, distribution-scope=public, io.openshift.expose-services=, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, batch=17.1_20251118.1, url=https://www.redhat.com, build-date=2025-11-18T23:44:13Z, managed_by=tripleo_ansible, container_name=iscsid, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, konflux.additional-tags=17.1.12 17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, tcib_managed=true, version=17.1.12, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, description=Red Hat OpenStack Platform 17.1 iscsid, vcs-type=git, release=1761123044, vendor=Red Hat, Inc., architecture=x86_64, name=rhosp17/openstack-iscsid, config_id=tripleo_step3) Dec 15 03:22:07 localhost podman[77130]: 2025-12-15 08:22:07.841213616 +0000 UTC m=+0.360053419 container health_status 4c3f871f4bdf287b1d3ce717956c1cf130617489217410eadf6fffcc440eb3b0 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, vcs-type=git, release=1761123044, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, 
vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, distribution-scope=public, tcib_managed=true, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, version=17.1.12, build-date=2025-11-19T00:36:58Z, architecture=x86_64, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, container_name=nova_migration_target, name=rhosp17/openstack-nova-compute, io.buildah.version=1.41.4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '879500e96bf8dfb93687004bd86f2317'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, batch=17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.component=openstack-nova-compute-container) Dec 15 03:22:07 localhost systemd[1]: 
2649c3aec9af2f04509e1d248b1050b202fccc3ed2b86421c89a2a65376db057.service: Deactivated successfully. Dec 15 03:22:08 localhost podman[77130]: 2025-12-15 08:22:08.239158317 +0000 UTC m=+0.757998080 container exec_died 4c3f871f4bdf287b1d3ce717956c1cf130617489217410eadf6fffcc440eb3b0 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, build-date=2025-11-19T00:36:58Z, com.redhat.component=openstack-nova-compute-container, version=17.1.12, io.openshift.expose-services=, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 17.1_20251118.1, release=1761123044, architecture=x86_64, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, tcib_managed=true, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp17/openstack-nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, url=https://www.redhat.com, batch=17.1_20251118.1, managed_by=tripleo_ansible, io.buildah.version=1.41.4, config_id=tripleo_step4, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '879500e96bf8dfb93687004bd86f2317'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, description=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=nova_migration_target) Dec 15 03:22:08 localhost systemd[1]: 4c3f871f4bdf287b1d3ce717956c1cf130617489217410eadf6fffcc440eb3b0.service: Deactivated successfully. Dec 15 03:22:08 localhost systemd[1]: tmp-crun.tl62os.mount: Deactivated successfully. Dec 15 03:22:11 localhost systemd[1]: Started /usr/bin/podman healthcheck run 2df8d20c6ba32f38bd0e6519c9c77a4146b632f21c81197d8c319b223aad81f1. Dec 15 03:22:11 localhost systemd[1]: Started /usr/bin/podman healthcheck run 4d35b7dc3f3a4e3c60661e85d3511f50804e87244d74cab67af5c6e8c447d379. 
Dec 15 03:22:11 localhost podman[77263]: 2025-12-15 08:22:11.758783016 +0000 UTC m=+0.085995599 container health_status 4d35b7dc3f3a4e3c60661e85d3511f50804e87244d74cab67af5c6e8c447d379 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, build-date=2025-11-19T00:14:25Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.expose-services=, architecture=x86_64, name=rhosp17/openstack-neutron-metadata-agent-ovn, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'a56a6f14b467cd9064e40c03defa5ed7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, version=17.1.12, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.buildah.version=1.41.4, batch=17.1_20251118.1, vendor=Red Hat, Inc., com.redhat.component=openstack-neutron-metadata-agent-ovn-container, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.12 17.1_20251118.1, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, container_name=ovn_metadata_agent, tcib_managed=true, release=1761123044, config_id=tripleo_step4, url=https://www.redhat.com, managed_by=tripleo_ansible) Dec 15 03:22:11 localhost systemd[1]: tmp-crun.RavHI1.mount: Deactivated successfully. 
Dec 15 03:22:11 localhost podman[77262]: 2025-12-15 08:22:11.82520335 +0000 UTC m=+0.152336961 container health_status 2df8d20c6ba32f38bd0e6519c9c77a4146b632f21c81197d8c319b223aad81f1 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=ovn_controller, description=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, config_id=tripleo_step4, io.buildah.version=1.41.4, name=rhosp17/openstack-ovn-controller, io.openshift.expose-services=, batch=17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, release=1761123044, build-date=2025-11-18T23:34:05Z, url=https://www.redhat.com, vcs-type=git, version=17.1.12, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, managed_by=tripleo_ansible, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 ovn-controller, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat 
OpenStack Platform 17.1 ovn-controller, vendor=Red Hat, Inc., distribution-scope=public, com.redhat.component=openstack-ovn-controller-container, konflux.additional-tags=17.1.12 17.1_20251118.1) Dec 15 03:22:11 localhost podman[77263]: 2025-12-15 08:22:11.833366198 +0000 UTC m=+0.160578771 container exec_died 4d35b7dc3f3a4e3c60661e85d3511f50804e87244d74cab67af5c6e8c447d379 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, tcib_managed=true, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, url=https://www.redhat.com, version=17.1.12, name=rhosp17/openstack-neutron-metadata-agent-ovn, distribution-scope=public, io.buildah.version=1.41.4, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'a56a6f14b467cd9064e40c03defa5ed7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', 
'/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, vendor=Red Hat, Inc., architecture=x86_64, vcs-type=git, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.expose-services=, container_name=ovn_metadata_agent, konflux.additional-tags=17.1.12 17.1_20251118.1, build-date=2025-11-19T00:14:25Z, batch=17.1_20251118.1, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, release=1761123044, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_id=tripleo_step4) Dec 15 03:22:11 localhost systemd[1]: 4d35b7dc3f3a4e3c60661e85d3511f50804e87244d74cab67af5c6e8c447d379.service: Deactivated successfully. 
Dec 15 03:22:11 localhost podman[77262]: 2025-12-15 08:22:11.849653063 +0000 UTC m=+0.176786624 container exec_died 2df8d20c6ba32f38bd0e6519c9c77a4146b632f21c81197d8c319b223aad81f1 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, container_name=ovn_controller, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-type=git, io.buildah.version=1.41.4, description=Red Hat OpenStack Platform 17.1 ovn-controller, version=17.1.12, build-date=2025-11-18T23:34:05Z, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, summary=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.expose-services=, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, distribution-scope=public, config_id=tripleo_step4, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, vendor=Red Hat, Inc., com.redhat.component=openstack-ovn-controller-container, name=rhosp17/openstack-ovn-controller, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack 
Platform 17.1 ovn-controller, release=1761123044, managed_by=tripleo_ansible, batch=17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller) Dec 15 03:22:11 localhost systemd[1]: 2df8d20c6ba32f38bd0e6519c9c77a4146b632f21c81197d8c319b223aad81f1.service: Deactivated successfully. Dec 15 03:22:21 localhost systemd[1]: Started /usr/bin/podman healthcheck run 6278d648aec9c9f12e0efaafa0d6ae5653eeb34459516699e562eeb9c8d23b3c. Dec 15 03:22:21 localhost podman[77307]: 2025-12-15 08:22:21.750582279 +0000 UTC m=+0.085028032 container health_status 6278d648aec9c9f12e0efaafa0d6ae5653eeb34459516699e562eeb9c8d23b3c (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, config_id=tripleo_step1, vcs-type=git, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, container_name=metrics_qdr, version=17.1.12, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, distribution-scope=public, konflux.additional-tags=17.1.12 17.1_20251118.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '29d07da103bbd44a9ed3e29999314b03'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', 
'/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, batch=17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 qdrouterd, url=https://www.redhat.com, release=1761123044, tcib_managed=true, name=rhosp17/openstack-qdrouterd, architecture=x86_64, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vendor=Red Hat, Inc., managed_by=tripleo_ansible, io.openshift.expose-services=, build-date=2025-11-18T22:49:46Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, maintainer=OpenStack TripleO Team, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, description=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.component=openstack-qdrouterd-container) Dec 15 03:22:21 localhost podman[77307]: 2025-12-15 08:22:21.945466676 +0000 UTC m=+0.279912419 container exec_died 6278d648aec9c9f12e0efaafa0d6ae5653eeb34459516699e562eeb9c8d23b3c (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, release=1761123044, container_name=metrics_qdr, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, summary=Red Hat OpenStack Platform 17.1 qdrouterd, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, io.buildah.version=1.41.4, vcs-type=git, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, url=https://www.redhat.com, tcib_managed=true, config_id=tripleo_step1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, version=17.1.12, 
com.redhat.component=openstack-qdrouterd-container, build-date=2025-11-18T22:49:46Z, vendor=Red Hat, Inc., managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 qdrouterd, architecture=x86_64, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, name=rhosp17/openstack-qdrouterd, batch=17.1_20251118.1, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '29d07da103bbd44a9ed3e29999314b03'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd) Dec 15 03:22:21 localhost systemd[1]: 6278d648aec9c9f12e0efaafa0d6ae5653eeb34459516699e562eeb9c8d23b3c.service: Deactivated successfully. Dec 15 03:22:35 localhost systemd[1]: Started /usr/bin/podman healthcheck run 165a6fd1f2b629d16da0798ec1c9bac41d302d530a0ffa668b2315a52d6aa04e. 
Dec 15 03:22:35 localhost systemd[1]: Starting Check and recover tripleo_nova_virtqemud... Dec 15 03:22:35 localhost recover_tripleo_nova_virtqemud[77339]: 61849 Dec 15 03:22:35 localhost systemd[1]: tripleo_nova_virtqemud_recover.service: Deactivated successfully. Dec 15 03:22:35 localhost systemd[1]: Finished Check and recover tripleo_nova_virtqemud. Dec 15 03:22:35 localhost podman[77337]: 2025-12-15 08:22:35.757253703 +0000 UTC m=+0.085970227 container health_status 165a6fd1f2b629d16da0798ec1c9bac41d302d530a0ffa668b2315a52d6aa04e (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, description=Red Hat OpenStack Platform 17.1 collectd, batch=17.1_20251118.1, io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, com.redhat.component=openstack-collectd-container, vcs-type=git, name=rhosp17/openstack-collectd, maintainer=OpenStack TripleO Team, build-date=2025-11-18T22:51:28Z, konflux.additional-tags=17.1.12 17.1_20251118.1, distribution-scope=public, tcib_managed=true, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, url=https://www.redhat.com, architecture=x86_64, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, managed_by=tripleo_ansible, version=17.1.12, release=1761123044, container_name=collectd, config_id=tripleo_step3, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 collectd) Dec 15 03:22:35 localhost podman[77337]: 2025-12-15 08:22:35.768958485 +0000 UTC m=+0.097675019 container exec_died 165a6fd1f2b629d16da0798ec1c9bac41d302d530a0ffa668b2315a52d6aa04e (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, batch=17.1_20251118.1, name=rhosp17/openstack-collectd, tcib_managed=true, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 collectd, vcs-type=git, description=Red Hat OpenStack Platform 17.1 collectd, io.buildah.version=1.41.4, build-date=2025-11-18T22:51:28Z, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vendor=Red Hat, Inc., architecture=x86_64, container_name=collectd, url=https://www.redhat.com, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, version=17.1.12, config_data={'cap_add': 
['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, release=1761123044, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_id=tripleo_step3, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, managed_by=tripleo_ansible, com.redhat.component=openstack-collectd-container, io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1) Dec 15 03:22:35 localhost systemd[1]: 
165a6fd1f2b629d16da0798ec1c9bac41d302d530a0ffa668b2315a52d6aa04e.service: Deactivated successfully. Dec 15 03:22:37 localhost systemd[1]: Started /usr/bin/podman healthcheck run ac45612fab357b615b4684dbfa5bd000dd29eb1adfbaedb6db0802fc49e89549. Dec 15 03:22:37 localhost podman[77358]: 2025-12-15 08:22:37.738066351 +0000 UTC m=+0.073785721 container health_status ac45612fab357b615b4684dbfa5bd000dd29eb1adfbaedb6db0802fc49e89549 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, name=rhosp17/openstack-cron, vendor=Red Hat, Inc., build-date=2025-11-18T22:49:32Z, io.buildah.version=1.41.4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, com.redhat.component=openstack-cron-container, summary=Red Hat OpenStack Platform 17.1 cron, 
io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, batch=17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, vcs-type=git, container_name=logrotate_crond, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, tcib_managed=true, version=17.1.12, config_id=tripleo_step4, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, release=1761123044, description=Red Hat OpenStack Platform 17.1 cron, konflux.additional-tags=17.1.12 17.1_20251118.1, architecture=x86_64, maintainer=OpenStack TripleO Team) Dec 15 03:22:37 localhost podman[77358]: 2025-12-15 08:22:37.750324519 +0000 UTC m=+0.086043889 container exec_died ac45612fab357b615b4684dbfa5bd000dd29eb1adfbaedb6db0802fc49e89549 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, name=rhosp17/openstack-cron, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, release=1761123044, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vcs-type=git, io.buildah.version=1.41.4, batch=17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, container_name=logrotate_crond, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, url=https://www.redhat.com, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.component=openstack-cron-container, io.openshift.expose-services=, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 cron, version=17.1.12, architecture=x86_64, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, maintainer=OpenStack TripleO 
Team, tcib_managed=true, config_id=tripleo_step4, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-11-18T22:49:32Z, summary=Red Hat OpenStack Platform 17.1 cron) Dec 15 03:22:37 localhost systemd[1]: Started /usr/bin/podman healthcheck run 97115ff7c5a84ae7e17f79e696e94da3c63f9fe0051287fe9193bec282a03f99. Dec 15 03:22:37 localhost systemd[1]: Started /usr/bin/podman healthcheck run 36a88083ed613f6871d2631ae462bca308a45ba583333f7cfe4d0b0c40a18ba5. Dec 15 03:22:37 localhost systemd[1]: ac45612fab357b615b4684dbfa5bd000dd29eb1adfbaedb6db0802fc49e89549.service: Deactivated successfully. Dec 15 03:22:37 localhost systemd[1]: Started /usr/bin/podman healthcheck run d6041e5471624a495d5be42684f6049ccf71802a80d5760239330151d465d146. Dec 15 03:22:37 localhost systemd[1]: tmp-crun.iSGUlk.mount: Deactivated successfully. 
Dec 15 03:22:37 localhost podman[77377]: 2025-12-15 08:22:37.851349628 +0000 UTC m=+0.087151949 container health_status 97115ff7c5a84ae7e17f79e696e94da3c63f9fe0051287fe9193bec282a03f99 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, version=17.1.12, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, batch=17.1_20251118.1, vendor=Red Hat, Inc., config_id=tripleo_step4, distribution-scope=public, build-date=2025-11-19T00:12:45Z, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.buildah.version=1.41.4, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'fcee5a4a91f85471fca7b61211375646'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.component=openstack-ceilometer-ipmi-container, tcib_managed=true, vcs-type=git, architecture=x86_64, name=rhosp17/openstack-ceilometer-ipmi, maintainer=OpenStack TripleO Team, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, container_name=ceilometer_agent_ipmi, managed_by=tripleo_ansible, release=1761123044, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.expose-services=, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676) Dec 15 03:22:37 localhost podman[77377]: 2025-12-15 08:22:37.927518743 +0000 UTC m=+0.163321064 container exec_died 97115ff7c5a84ae7e17f79e696e94da3c63f9fe0051287fe9193bec282a03f99 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, release=1761123044, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp17/openstack-ceilometer-ipmi, architecture=x86_64, vendor=Red Hat, Inc., tcib_managed=true, io.openshift.expose-services=, batch=17.1_20251118.1, io.buildah.version=1.41.4, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-type=git, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'fcee5a4a91f85471fca7b61211375646'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-ceilometer-ipmi-container, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, container_name=ceilometer_agent_ipmi, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, config_id=tripleo_step4, distribution-scope=public, build-date=2025-11-19T00:12:45Z, version=17.1.12) Dec 15 03:22:37 localhost podman[77405]: 2025-12-15 08:22:37.935841405 +0000 UTC m=+0.081648382 container health_status d6041e5471624a495d5be42684f6049ccf71802a80d5760239330151d465d146 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, name=rhosp17/openstack-ceilometer-compute, url=https://www.redhat.com, managed_by=tripleo_ansible, tcib_managed=true, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, com.redhat.component=openstack-ceilometer-compute-container, version=17.1.12, container_name=ceilometer_agent_compute, vcs-type=git, maintainer=OpenStack TripleO Team, release=1761123044, batch=17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.expose-services=, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, build-date=2025-11-19T00:11:48Z, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'fcee5a4a91f85471fca7b61211375646'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, distribution-scope=public, konflux.additional-tags=17.1.12 17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, vendor=Red Hat, Inc., config_id=tripleo_step4) Dec 15 
03:22:37 localhost systemd[1]: 97115ff7c5a84ae7e17f79e696e94da3c63f9fe0051287fe9193bec282a03f99.service: Deactivated successfully. Dec 15 03:22:37 localhost podman[77378]: 2025-12-15 08:22:37.903050579 +0000 UTC m=+0.134725470 container health_status 36a88083ed613f6871d2631ae462bca308a45ba583333f7cfe4d0b0c40a18ba5 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=unhealthy, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '182e509007ab5e6e5b2500a552cbd5ba-879500e96bf8dfb93687004bd86f2317'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', 
'/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, io.buildah.version=1.41.4, version=17.1.12, batch=17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, release=1761123044, description=Red Hat OpenStack Platform 17.1 nova-compute, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, architecture=x86_64, container_name=nova_compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, build-date=2025-11-19T00:36:58Z, konflux.additional-tags=17.1.12 17.1_20251118.1, config_id=tripleo_step5, vcs-type=git, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, name=rhosp17/openstack-nova-compute, maintainer=OpenStack TripleO Team, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, com.redhat.component=openstack-nova-compute-container, managed_by=tripleo_ansible) Dec 15 03:22:37 localhost systemd[1]: Started /usr/bin/podman healthcheck run 2649c3aec9af2f04509e1d248b1050b202fccc3ed2b86421c89a2a65376db057. 
Dec 15 03:22:37 localhost podman[77378]: 2025-12-15 08:22:37.986452437 +0000 UTC m=+0.218127348 container exec_died 36a88083ed613f6871d2631ae462bca308a45ba583333f7cfe4d0b0c40a18ba5 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, config_id=tripleo_step5, version=17.1.12, container_name=nova_compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, tcib_managed=true, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, release=1761123044, vcs-type=git, io.openshift.expose-services=, build-date=2025-11-19T00:36:58Z, com.redhat.component=openstack-nova-compute-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.12 17.1_20251118.1, url=https://www.redhat.com, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '182e509007ab5e6e5b2500a552cbd5ba-879500e96bf8dfb93687004bd86f2317'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp17/openstack-nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, summary=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20251118.1, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, io.buildah.version=1.41.4, architecture=x86_64) Dec 15 03:22:37 localhost podman[77378]: unhealthy Dec 15 03:22:37 localhost systemd[1]: 36a88083ed613f6871d2631ae462bca308a45ba583333f7cfe4d0b0c40a18ba5.service: Main process exited, code=exited, status=1/FAILURE Dec 15 03:22:37 localhost systemd[1]: 36a88083ed613f6871d2631ae462bca308a45ba583333f7cfe4d0b0c40a18ba5.service: Failed with result 'exit-code'. 
Dec 15 03:22:38 localhost podman[77446]: 2025-12-15 08:22:38.030849814 +0000 UTC m=+0.062611624 container health_status 2649c3aec9af2f04509e1d248b1050b202fccc3ed2b86421c89a2a65376db057 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, maintainer=OpenStack TripleO Team, version=17.1.12, description=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, managed_by=tripleo_ansible, batch=17.1_20251118.1, build-date=2025-11-18T23:44:13Z, release=1761123044, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, tcib_managed=true, url=https://www.redhat.com, name=rhosp17/openstack-iscsid, config_id=tripleo_step3, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, container_name=iscsid, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '182e509007ab5e6e5b2500a552cbd5ba'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, distribution-scope=public, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.component=openstack-iscsid-container, vcs-type=git, io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid) Dec 15 03:22:38 localhost podman[77405]: 2025-12-15 08:22:38.042381172 +0000 UTC m=+0.188188189 container exec_died d6041e5471624a495d5be42684f6049ccf71802a80d5760239330151d465d146 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, release=1761123044, url=https://www.redhat.com, name=rhosp17/openstack-ceilometer-compute, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, com.redhat.component=openstack-ceilometer-compute-container, version=17.1.12, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'fcee5a4a91f85471fca7b61211375646'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, config_id=tripleo_step4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, distribution-scope=public, managed_by=tripleo_ansible, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.12 17.1_20251118.1, batch=17.1_20251118.1, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, container_name=ceilometer_agent_compute, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, io.openshift.expose-services=, vcs-type=git, build-date=2025-11-19T00:11:48Z) Dec 15 03:22:38 localhost systemd[1]: d6041e5471624a495d5be42684f6049ccf71802a80d5760239330151d465d146.service: Deactivated successfully. 
Dec 15 03:22:38 localhost podman[77446]: 2025-12-15 08:22:38.094834663 +0000 UTC m=+0.126596513 container exec_died 2649c3aec9af2f04509e1d248b1050b202fccc3ed2b86421c89a2a65376db057 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, url=https://www.redhat.com, build-date=2025-11-18T23:44:13Z, name=rhosp17/openstack-iscsid, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_id=tripleo_step3, maintainer=OpenStack TripleO Team, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, version=17.1.12, konflux.additional-tags=17.1.12 17.1_20251118.1, vendor=Red Hat, Inc., io.buildah.version=1.41.4, managed_by=tripleo_ansible, architecture=x86_64, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '182e509007ab5e6e5b2500a552cbd5ba'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', 
'/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 iscsid, container_name=iscsid, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, vcs-type=git, batch=17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, com.redhat.component=openstack-iscsid-container, release=1761123044) Dec 15 03:22:38 localhost systemd[1]: 2649c3aec9af2f04509e1d248b1050b202fccc3ed2b86421c89a2a65376db057.service: Deactivated successfully. Dec 15 03:22:38 localhost systemd[1]: Started /usr/bin/podman healthcheck run 4c3f871f4bdf287b1d3ce717956c1cf130617489217410eadf6fffcc440eb3b0. Dec 15 03:22:38 localhost systemd[1]: tmp-crun.4pRdPY.mount: Deactivated successfully. 
Dec 15 03:22:38 localhost podman[77471]: 2025-12-15 08:22:38.746485201 +0000 UTC m=+0.077448439 container health_status 4c3f871f4bdf287b1d3ce717956c1cf130617489217410eadf6fffcc440eb3b0 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, summary=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20251118.1, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.41.4, konflux.additional-tags=17.1.12 17.1_20251118.1, tcib_managed=true, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, architecture=x86_64, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, com.redhat.component=openstack-nova-compute-container, config_id=tripleo_step4, distribution-scope=public, io.openshift.expose-services=, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vendor=Red Hat, Inc., release=1761123044, build-date=2025-11-19T00:36:58Z, version=17.1.12, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 nova-compute, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '879500e96bf8dfb93687004bd86f2317'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, container_name=nova_migration_target, managed_by=tripleo_ansible, name=rhosp17/openstack-nova-compute) Dec 15 03:22:39 localhost podman[77471]: 2025-12-15 08:22:39.193106634 +0000 UTC m=+0.524069902 container exec_died 4c3f871f4bdf287b1d3ce717956c1cf130617489217410eadf6fffcc440eb3b0 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, url=https://www.redhat.com, version=17.1.12, name=rhosp17/openstack-nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, maintainer=OpenStack TripleO Team, container_name=nova_migration_target, vcs-type=git, io.openshift.expose-services=, io.buildah.version=1.41.4, batch=17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 nova-compute, release=1761123044, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '879500e96bf8dfb93687004bd86f2317'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, com.redhat.component=openstack-nova-compute-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible, distribution-scope=public, build-date=2025-11-19T00:36:58Z, config_id=tripleo_step4, architecture=x86_64, vendor=Red Hat, Inc.) Dec 15 03:22:39 localhost systemd[1]: 4c3f871f4bdf287b1d3ce717956c1cf130617489217410eadf6fffcc440eb3b0.service: Deactivated successfully. Dec 15 03:22:42 localhost systemd[1]: Started /usr/bin/podman healthcheck run 2df8d20c6ba32f38bd0e6519c9c77a4146b632f21c81197d8c319b223aad81f1. Dec 15 03:22:42 localhost systemd[1]: Started /usr/bin/podman healthcheck run 4d35b7dc3f3a4e3c60661e85d3511f50804e87244d74cab67af5c6e8c447d379. Dec 15 03:22:42 localhost systemd[1]: tmp-crun.ryldwT.mount: Deactivated successfully. 
Dec 15 03:22:42 localhost podman[77493]: 2025-12-15 08:22:42.751912599 +0000 UTC m=+0.083112630 container health_status 2df8d20c6ba32f38bd0e6519c9c77a4146b632f21c81197d8c319b223aad81f1 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, description=Red Hat OpenStack Platform 17.1 ovn-controller, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.buildah.version=1.41.4, container_name=ovn_controller, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, tcib_managed=true, com.redhat.component=openstack-ovn-controller-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, build-date=2025-11-18T23:34:05Z, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, version=17.1.12, architecture=x86_64, config_id=tripleo_step4, io.openshift.expose-services=, batch=17.1_20251118.1, managed_by=tripleo_ansible, konflux.additional-tags=17.1.12 17.1_20251118.1, summary=Red Hat OpenStack 
Platform 17.1 ovn-controller, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-type=git, release=1761123044, name=rhosp17/openstack-ovn-controller, distribution-scope=public) Dec 15 03:22:42 localhost podman[77493]: 2025-12-15 08:22:42.777369839 +0000 UTC m=+0.108569900 container exec_died 2df8d20c6ba32f38bd0e6519c9c77a4146b632f21c81197d8c319b223aad81f1 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, url=https://www.redhat.com, release=1761123044, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, name=rhosp17/openstack-ovn-controller, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, version=17.1.12, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, com.redhat.component=openstack-ovn-controller-container, build-date=2025-11-18T23:34:05Z, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.buildah.version=1.41.4, io.openshift.expose-services=, managed_by=tripleo_ansible, tcib_managed=true, architecture=x86_64, description=Red Hat OpenStack 
Platform 17.1 ovn-controller, summary=Red Hat OpenStack Platform 17.1 ovn-controller, config_id=tripleo_step4, distribution-scope=public, container_name=ovn_controller, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20251118.1, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272) Dec 15 03:22:42 localhost systemd[1]: 2df8d20c6ba32f38bd0e6519c9c77a4146b632f21c81197d8c319b223aad81f1.service: Deactivated successfully. Dec 15 03:22:42 localhost systemd[1]: tmp-crun.Xn5nw2.mount: Deactivated successfully. Dec 15 03:22:42 localhost podman[77494]: 2025-12-15 08:22:42.872588403 +0000 UTC m=+0.200244580 container health_status 4d35b7dc3f3a4e3c60661e85d3511f50804e87244d74cab67af5c6e8c447d379 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, release=1761123044, name=rhosp17/openstack-neutron-metadata-agent-ovn, batch=17.1_20251118.1, architecture=x86_64, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, version=17.1.12, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, konflux.additional-tags=17.1.12 17.1_20251118.1, distribution-scope=public, build-date=2025-11-19T00:14:25Z, vcs-type=git, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, container_name=ovn_metadata_agent, io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 
neutron-metadata-agent-ovn, url=https://www.redhat.com, config_id=tripleo_step4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, tcib_managed=true, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'a56a6f14b467cd9064e40c03defa5ed7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, maintainer=OpenStack TripleO Team) Dec 15 03:22:42 localhost podman[77494]: 2025-12-15 08:22:42.915254172 +0000 UTC m=+0.242910349 container exec_died 4d35b7dc3f3a4e3c60661e85d3511f50804e87244d74cab67af5c6e8c447d379 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, url=https://www.redhat.com, io.buildah.version=1.41.4, vendor=Red Hat, Inc., 
architecture=x86_64, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, release=1761123044, konflux.additional-tags=17.1.12 17.1_20251118.1, name=rhosp17/openstack-neutron-metadata-agent-ovn, batch=17.1_20251118.1, maintainer=OpenStack TripleO Team, version=17.1.12, config_id=tripleo_step4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'a56a6f14b467cd9064e40c03defa5ed7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', 
'/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, managed_by=tripleo_ansible, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, build-date=2025-11-19T00:14:25Z, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, container_name=ovn_metadata_agent) Dec 15 03:22:42 localhost systemd[1]: 4d35b7dc3f3a4e3c60661e85d3511f50804e87244d74cab67af5c6e8c447d379.service: Deactivated successfully. Dec 15 03:22:52 localhost systemd[1]: Started /usr/bin/podman healthcheck run 6278d648aec9c9f12e0efaafa0d6ae5653eeb34459516699e562eeb9c8d23b3c. Dec 15 03:22:52 localhost podman[77542]: 2025-12-15 08:22:52.74881117 +0000 UTC m=+0.082985799 container health_status 6278d648aec9c9f12e0efaafa0d6ae5653eeb34459516699e562eeb9c8d23b3c (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, io.openshift.expose-services=, url=https://www.redhat.com, name=rhosp17/openstack-qdrouterd, version=17.1.12, description=Red Hat OpenStack Platform 17.1 qdrouterd, container_name=metrics_qdr, config_id=tripleo_step1, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, konflux.additional-tags=17.1.12 17.1_20251118.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '29d07da103bbd44a9ed3e29999314b03'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, managed_by=tripleo_ansible, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, distribution-scope=public, vendor=Red Hat, Inc., vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, build-date=2025-11-18T22:49:46Z, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-qdrouterd-container, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, architecture=x86_64, batch=17.1_20251118.1, release=1761123044, summary=Red Hat OpenStack Platform 17.1 qdrouterd, io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd) Dec 15 03:22:52 localhost podman[77542]: 2025-12-15 08:22:52.944287782 +0000 UTC m=+0.278462371 container exec_died 6278d648aec9c9f12e0efaafa0d6ae5653eeb34459516699e562eeb9c8d23b3c (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., version=17.1.12, managed_by=tripleo_ansible, 
io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '29d07da103bbd44a9ed3e29999314b03'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, name=rhosp17/openstack-qdrouterd, container_name=metrics_qdr, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public, tcib_managed=true, build-date=2025-11-18T22:49:46Z, konflux.additional-tags=17.1.12 17.1_20251118.1, architecture=x86_64, batch=17.1_20251118.1, com.redhat.component=openstack-qdrouterd-container, io.buildah.version=1.41.4, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.k8s.display-name=Red Hat OpenStack 
Platform 17.1 qdrouterd, release=1761123044, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 qdrouterd, config_id=tripleo_step1) Dec 15 03:22:52 localhost systemd[1]: 6278d648aec9c9f12e0efaafa0d6ae5653eeb34459516699e562eeb9c8d23b3c.service: Deactivated successfully. Dec 15 03:23:06 localhost systemd[1]: Started /usr/bin/podman healthcheck run 165a6fd1f2b629d16da0798ec1c9bac41d302d530a0ffa668b2315a52d6aa04e. Dec 15 03:23:06 localhost systemd[1]: tmp-crun.CZiBHj.mount: Deactivated successfully. Dec 15 03:23:06 localhost podman[77650]: 2025-12-15 08:23:06.769854405 +0000 UTC m=+0.096142320 container health_status 165a6fd1f2b629d16da0798ec1c9bac41d302d530a0ffa668b2315a52d6aa04e (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, build-date=2025-11-18T22:51:28Z, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', 
'/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, description=Red Hat OpenStack Platform 17.1 collectd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, managed_by=tripleo_ansible, vcs-type=git, batch=17.1_20251118.1, maintainer=OpenStack TripleO Team, config_id=tripleo_step3, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, io.openshift.expose-services=, vendor=Red Hat, Inc., architecture=x86_64, io.buildah.version=1.41.4, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, release=1761123044, summary=Red Hat OpenStack Platform 17.1 collectd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, container_name=collectd, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, version=17.1.12, com.redhat.component=openstack-collectd-container, distribution-scope=public, name=rhosp17/openstack-collectd, url=https://www.redhat.com) Dec 15 03:23:06 localhost podman[77650]: 2025-12-15 08:23:06.812515564 +0000 UTC m=+0.138803509 container exec_died 165a6fd1f2b629d16da0798ec1c9bac41d302d530a0ffa668b2315a52d6aa04e (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, name=rhosp17/openstack-collectd, vcs-type=git, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, release=1761123044, config_id=tripleo_step3, description=Red Hat OpenStack Platform 17.1 collectd, build-date=2025-11-18T22:51:28Z, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.buildah.version=1.41.4, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 
'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, distribution-scope=public, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, url=https://www.redhat.com, container_name=collectd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, io.openshift.expose-services=, vendor=Red Hat, Inc., batch=17.1_20251118.1, architecture=x86_64, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, konflux.additional-tags=17.1.12 17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 collectd, tcib_managed=true, version=17.1.12, 
com.redhat.component=openstack-collectd-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI) Dec 15 03:23:06 localhost systemd[1]: 165a6fd1f2b629d16da0798ec1c9bac41d302d530a0ffa668b2315a52d6aa04e.service: Deactivated successfully. Dec 15 03:23:07 localhost systemd[1]: Started /usr/bin/podman healthcheck run ac45612fab357b615b4684dbfa5bd000dd29eb1adfbaedb6db0802fc49e89549. Dec 15 03:23:07 localhost systemd[1]: Started /usr/bin/podman healthcheck run 97115ff7c5a84ae7e17f79e696e94da3c63f9fe0051287fe9193bec282a03f99. Dec 15 03:23:08 localhost podman[77696]: 2025-12-15 08:23:08.031435728 +0000 UTC m=+0.080222264 container health_status ac45612fab357b615b4684dbfa5bd000dd29eb1adfbaedb6db0802fc49e89549 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible, url=https://www.redhat.com, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, container_name=logrotate_crond, description=Red Hat OpenStack Platform 17.1 cron, io.buildah.version=1.41.4, release=1761123044, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, version=17.1.12, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step4, tcib_managed=true, architecture=x86_64, name=rhosp17/openstack-cron, batch=17.1_20251118.1, com.redhat.component=openstack-cron-container, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 
'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, build-date=2025-11-18T22:49:32Z, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vcs-type=git, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 cron, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron) Dec 15 03:23:08 localhost systemd[1]: Started /usr/bin/podman healthcheck run 36a88083ed613f6871d2631ae462bca308a45ba583333f7cfe4d0b0c40a18ba5. 
Dec 15 03:23:08 localhost podman[77696]: 2025-12-15 08:23:08.078378203 +0000 UTC m=+0.127164739 container exec_died ac45612fab357b615b4684dbfa5bd000dd29eb1adfbaedb6db0802fc49e89549 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 cron, url=https://www.redhat.com, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 cron, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, release=1761123044, build-date=2025-11-18T22:49:32Z, name=rhosp17/openstack-cron, config_id=tripleo_step4, distribution-scope=public, container_name=logrotate_crond, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vendor=Red Hat, Inc., version=17.1.12, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.expose-services=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, batch=17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, com.redhat.component=openstack-cron-container) Dec 15 03:23:08 localhost systemd[1]: Started /usr/bin/podman healthcheck run d6041e5471624a495d5be42684f6049ccf71802a80d5760239330151d465d146. Dec 15 03:23:08 localhost systemd[1]: ac45612fab357b615b4684dbfa5bd000dd29eb1adfbaedb6db0802fc49e89549.service: Deactivated successfully. Dec 15 03:23:08 localhost podman[77697]: 2025-12-15 08:23:08.099060715 +0000 UTC m=+0.143283049 container health_status 97115ff7c5a84ae7e17f79e696e94da3c63f9fe0051287fe9193bec282a03f99 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, tcib_managed=true, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, distribution-scope=public, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, batch=17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'fcee5a4a91f85471fca7b61211375646'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, name=rhosp17/openstack-ceilometer-ipmi, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, version=17.1.12, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, release=1761123044, vcs-type=git, url=https://www.redhat.com, container_name=ceilometer_agent_ipmi, com.redhat.component=openstack-ceilometer-ipmi-container, io.buildah.version=1.41.4, architecture=x86_64, konflux.additional-tags=17.1.12 17.1_20251118.1, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.expose-services=, build-date=2025-11-19T00:12:45Z, config_id=tripleo_step4) Dec 15 03:23:08 localhost systemd[1]: Started /usr/bin/podman healthcheck run 2649c3aec9af2f04509e1d248b1050b202fccc3ed2b86421c89a2a65376db057. 
Dec 15 03:23:08 localhost podman[77697]: 2025-12-15 08:23:08.144669584 +0000 UTC m=+0.188891868 container exec_died 97115ff7c5a84ae7e17f79e696e94da3c63f9fe0051287fe9193bec282a03f99 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, config_id=tripleo_step4, release=1761123044, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'fcee5a4a91f85471fca7b61211375646'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, batch=17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, architecture=x86_64, url=https://www.redhat.com, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, container_name=ceilometer_agent_ipmi, managed_by=tripleo_ansible, com.redhat.component=openstack-ceilometer-ipmi-container, io.buildah.version=1.41.4, 
version=17.1.12, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, build-date=2025-11-19T00:12:45Z, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, konflux.additional-tags=17.1.12 17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, tcib_managed=true, distribution-scope=public, name=rhosp17/openstack-ceilometer-ipmi, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc.) Dec 15 03:23:08 localhost systemd[1]: 97115ff7c5a84ae7e17f79e696e94da3c63f9fe0051287fe9193bec282a03f99.service: Deactivated successfully. Dec 15 03:23:08 localhost podman[77771]: 2025-12-15 08:23:08.21188217 +0000 UTC m=+0.073995149 container health_status 2649c3aec9af2f04509e1d248b1050b202fccc3ed2b86421c89a2a65376db057 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, vendor=Red Hat, Inc., com.redhat.component=openstack-iscsid-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, distribution-scope=public, url=https://www.redhat.com, container_name=iscsid, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, konflux.additional-tags=17.1.12 17.1_20251118.1, release=1761123044, name=rhosp17/openstack-iscsid, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 iscsid, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, io.buildah.version=1.41.4, architecture=x86_64, config_id=tripleo_step3, managed_by=tripleo_ansible, version=17.1.12, batch=17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack 
Platform 17.1 iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '182e509007ab5e6e5b2500a552cbd5ba'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, tcib_managed=true, build-date=2025-11-18T23:44:13Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, io.openshift.expose-services=, maintainer=OpenStack TripleO Team) Dec 15 03:23:08 localhost podman[77734]: 2025-12-15 08:23:08.194687469 +0000 UTC m=+0.147809189 container health_status 36a88083ed613f6871d2631ae462bca308a45ba583333f7cfe4d0b0c40a18ba5 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=unhealthy, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, name=rhosp17/openstack-nova-compute, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, release=1761123044, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., 
container_name=nova_compute, build-date=2025-11-19T00:36:58Z, version=17.1.12, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, url=https://www.redhat.com, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, tcib_managed=true, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, summary=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.component=openstack-nova-compute-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '182e509007ab5e6e5b2500a552cbd5ba-879500e96bf8dfb93687004bd86f2317'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', 
'/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, io.openshift.expose-services=, vcs-type=git, distribution-scope=public, config_id=tripleo_step5, batch=17.1_20251118.1, managed_by=tripleo_ansible) Dec 15 03:23:08 localhost podman[77752]: 2025-12-15 08:23:08.281091268 +0000 UTC m=+0.175836639 container health_status d6041e5471624a495d5be42684f6049ccf71802a80d5760239330151d465d146 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, konflux.additional-tags=17.1.12 17.1_20251118.1, version=17.1.12, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, distribution-scope=public, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, com.redhat.component=openstack-ceilometer-compute-container, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'fcee5a4a91f85471fca7b61211375646'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, managed_by=tripleo_ansible, io.openshift.expose-services=, batch=17.1_20251118.1, build-date=2025-11-19T00:11:48Z, io.buildah.version=1.41.4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=ceilometer_agent_compute, release=1761123044, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, name=rhosp17/openstack-ceilometer-compute, tcib_managed=true, vcs-type=git, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, url=https://www.redhat.com, vendor=Red Hat, Inc., config_id=tripleo_step4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, architecture=x86_64) Dec 15 03:23:08 localhost podman[77771]: 2025-12-15 08:23:08.299712176 +0000 UTC m=+0.161825175 container exec_died 2649c3aec9af2f04509e1d248b1050b202fccc3ed2b86421c89a2a65376db057 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, build-date=2025-11-18T23:44:13Z, architecture=x86_64, distribution-scope=public, config_id=tripleo_step3, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_data={'environment': 
{'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '182e509007ab5e6e5b2500a552cbd5ba'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.expose-services=, url=https://www.redhat.com, name=rhosp17/openstack-iscsid, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, konflux.additional-tags=17.1.12 17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 iscsid, version=17.1.12, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, tcib_managed=true, batch=17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, maintainer=OpenStack TripleO Team, container_name=iscsid, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-iscsid-container, vendor=Red Hat, Inc., release=1761123044, io.buildah.version=1.41.4, managed_by=tripleo_ansible) Dec 15 03:23:08 localhost systemd[1]: 
2649c3aec9af2f04509e1d248b1050b202fccc3ed2b86421c89a2a65376db057.service: Deactivated successfully. Dec 15 03:23:08 localhost podman[77734]: 2025-12-15 08:23:08.325447793 +0000 UTC m=+0.278569573 container exec_died 36a88083ed613f6871d2631ae462bca308a45ba583333f7cfe4d0b0c40a18ba5 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, description=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, container_name=nova_compute, vendor=Red Hat, Inc., build-date=2025-11-19T00:36:58Z, name=rhosp17/openstack-nova-compute, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, config_id=tripleo_step5, managed_by=tripleo_ansible, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, batch=17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, release=1761123044, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '182e509007ab5e6e5b2500a552cbd5ba-879500e96bf8dfb93687004bd86f2317'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, tcib_managed=true, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, com.redhat.component=openstack-nova-compute-container, url=https://www.redhat.com, version=17.1.12) Dec 15 03:23:08 localhost podman[77752]: 2025-12-15 08:23:08.339698494 +0000 UTC m=+0.234443875 container exec_died d6041e5471624a495d5be42684f6049ccf71802a80d5760239330151d465d146 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, architecture=x86_64, url=https://www.redhat.com, vcs-type=git, maintainer=OpenStack TripleO Team, config_id=tripleo_step4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, managed_by=tripleo_ansible, name=rhosp17/openstack-ceilometer-compute, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, 
vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=ceilometer_agent_compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, release=1761123044, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'fcee5a4a91f85471fca7b61211375646'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, distribution-scope=public, io.buildah.version=1.41.4, io.openshift.expose-services=, version=17.1.12, vendor=Red Hat, Inc., batch=17.1_20251118.1, build-date=2025-11-19T00:11:48Z, 
summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, com.redhat.component=openstack-ceilometer-compute-container) Dec 15 03:23:08 localhost systemd[1]: 36a88083ed613f6871d2631ae462bca308a45ba583333f7cfe4d0b0c40a18ba5.service: Deactivated successfully. Dec 15 03:23:08 localhost systemd[1]: d6041e5471624a495d5be42684f6049ccf71802a80d5760239330151d465d146.service: Deactivated successfully. Dec 15 03:23:09 localhost systemd[1]: tmp-crun.xPdvRH.mount: Deactivated successfully. Dec 15 03:23:09 localhost systemd[1]: Started /usr/bin/podman healthcheck run 4c3f871f4bdf287b1d3ce717956c1cf130617489217410eadf6fffcc440eb3b0. Dec 15 03:23:09 localhost podman[77857]: 2025-12-15 08:23:09.746770364 +0000 UTC m=+0.081119188 container health_status 4c3f871f4bdf287b1d3ce717956c1cf130617489217410eadf6fffcc440eb3b0 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, build-date=2025-11-19T00:36:58Z, container_name=nova_migration_target, architecture=x86_64, io.buildah.version=1.41.4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, vcs-type=git, name=rhosp17/openstack-nova-compute, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '879500e96bf8dfb93687004bd86f2317'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, version=17.1.12, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, konflux.additional-tags=17.1.12 17.1_20251118.1, release=1761123044, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, config_id=tripleo_step4, summary=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.component=openstack-nova-compute-container, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 nova-compute) Dec 15 03:23:10 localhost podman[77857]: 2025-12-15 08:23:10.166806555 +0000 UTC m=+0.501155369 container exec_died 4c3f871f4bdf287b1d3ce717956c1cf130617489217410eadf6fffcc440eb3b0 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, name=rhosp17/openstack-nova-compute, container_name=nova_migration_target, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, distribution-scope=public, url=https://www.redhat.com, 
io.buildah.version=1.41.4, build-date=2025-11-19T00:36:58Z, managed_by=tripleo_ansible, architecture=x86_64, config_id=tripleo_step4, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '879500e96bf8dfb93687004bd86f2317'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, maintainer=OpenStack TripleO Team, batch=17.1_20251118.1, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-type=git, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, tcib_managed=true, com.redhat.component=openstack-nova-compute-container, vendor=Red Hat, Inc., release=1761123044, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, version=17.1.12, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream) Dec 15 03:23:10 localhost systemd[1]: 
4c3f871f4bdf287b1d3ce717956c1cf130617489217410eadf6fffcc440eb3b0.service: Deactivated successfully. Dec 15 03:23:13 localhost systemd[1]: Started /usr/bin/podman healthcheck run 2df8d20c6ba32f38bd0e6519c9c77a4146b632f21c81197d8c319b223aad81f1. Dec 15 03:23:13 localhost systemd[1]: Started /usr/bin/podman healthcheck run 4d35b7dc3f3a4e3c60661e85d3511f50804e87244d74cab67af5c6e8c447d379. Dec 15 03:23:13 localhost podman[77904]: 2025-12-15 08:23:13.756030964 +0000 UTC m=+0.089173963 container health_status 4d35b7dc3f3a4e3c60661e85d3511f50804e87244d74cab67af5c6e8c447d379 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, io.openshift.expose-services=, name=rhosp17/openstack-neutron-metadata-agent-ovn, batch=17.1_20251118.1, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'a56a6f14b467cd9064e40c03defa5ed7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', 
'/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, distribution-scope=public, version=17.1.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, build-date=2025-11-19T00:14:25Z, managed_by=tripleo_ansible, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-type=git, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, release=1761123044, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, container_name=ovn_metadata_agent, config_id=tripleo_step4, url=https://www.redhat.com, architecture=x86_64, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.buildah.version=1.41.4, maintainer=OpenStack TripleO Team, tcib_managed=true, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn) Dec 15 03:23:13 localhost podman[77903]: 2025-12-15 08:23:13.801466538 +0000 UTC m=+0.135794559 container health_status 2df8d20c6ba32f38bd0e6519c9c77a4146b632f21c81197d8c319b223aad81f1 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, tcib_managed=true, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-ovn-controller-container, io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, batch=17.1_20251118.1, 
summary=Red Hat OpenStack Platform 17.1 ovn-controller, config_id=tripleo_step4, version=17.1.12, build-date=2025-11-18T23:34:05Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, managed_by=tripleo_ansible, name=rhosp17/openstack-ovn-controller, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, container_name=ovn_controller, distribution-scope=public, url=https://www.redhat.com, vendor=Red Hat, Inc., release=1761123044, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, description=Red Hat OpenStack Platform 17.1 ovn-controller, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272) Dec 15 03:23:13 localhost podman[77904]: 2025-12-15 08:23:13.828580951 +0000 UTC m=+0.161723960 container exec_died 4d35b7dc3f3a4e3c60661e85d3511f50804e87244d74cab67af5c6e8c447d379 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, tcib_managed=true, io.k8s.description=Red Hat 
OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step4, konflux.additional-tags=17.1.12 17.1_20251118.1, version=17.1.12, io.openshift.expose-services=, url=https://www.redhat.com, distribution-scope=public, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, name=rhosp17/openstack-neutron-metadata-agent-ovn, managed_by=tripleo_ansible, io.buildah.version=1.41.4, build-date=2025-11-19T00:14:25Z, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'a56a6f14b467cd9064e40c03defa5ed7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, maintainer=OpenStack TripleO Team, release=1761123044, summary=Red Hat OpenStack 
Platform 17.1 neutron-metadata-agent-ovn, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, architecture=x86_64, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-type=git, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, vendor=Red Hat, Inc., batch=17.1_20251118.1, container_name=ovn_metadata_agent, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn) Dec 15 03:23:13 localhost systemd[1]: 4d35b7dc3f3a4e3c60661e85d3511f50804e87244d74cab67af5c6e8c447d379.service: Deactivated successfully. Dec 15 03:23:13 localhost podman[77903]: 2025-12-15 08:23:13.854450523 +0000 UTC m=+0.188778544 container exec_died 2df8d20c6ba32f38bd0e6519c9c77a4146b632f21c81197d8c319b223aad81f1 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp17/openstack-ovn-controller, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.expose-services=, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, version=17.1.12, managed_by=tripleo_ansible, build-date=2025-11-18T23:34:05Z, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, batch=17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 ovn-controller, tcib_managed=true, io.buildah.version=1.41.4, 
config_id=tripleo_step4, konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 ovn-controller, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-type=git, container_name=ovn_controller, url=https://www.redhat.com, com.redhat.component=openstack-ovn-controller-container, release=1761123044, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}) Dec 15 03:23:13 localhost systemd[1]: 2df8d20c6ba32f38bd0e6519c9c77a4146b632f21c81197d8c319b223aad81f1.service: Deactivated successfully. Dec 15 03:23:18 localhost systemd[1]: libpod-cd6d91a86e15fddc432716c2f4a78daffc735710046efa8e7d238a55268a3f33.scope: Deactivated successfully. 
Dec 15 03:23:18 localhost podman[76150]: 2025-12-15 08:23:18.117317417 +0000 UTC m=+192.876076062 container died cd6d91a86e15fddc432716c2f4a78daffc735710046efa8e7d238a55268a3f33 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_wait_for_compute_service, batch=17.1_20251118.1, build-date=2025-11-19T00:36:58Z, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.component=openstack-nova-compute-container, konflux.additional-tags=17.1.12 17.1_20251118.1, release=1761123044, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, config_data={'detach': False, 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', '__OS_DEBUG': 'true', 'TRIPLEO_CONFIG_HASH': '879500e96bf8dfb93687004bd86f2317'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'start_order': 4, 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova_compute_wait_for_compute_service.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/log/containers/nova:/var/log/nova', '/var/lib/container-config-scripts:/container-config-scripts']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, config_id=tripleo_step5, url=https://www.redhat.com, name=rhosp17/openstack-nova-compute, version=17.1.12, io.openshift.expose-services=, tcib_managed=true, io.k8s.description=Red Hat 
OpenStack Platform 17.1 nova-compute, architecture=x86_64, vcs-type=git, container_name=nova_wait_for_compute_service, maintainer=OpenStack TripleO Team, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.41.4, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 nova-compute) Dec 15 03:23:18 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-cd6d91a86e15fddc432716c2f4a78daffc735710046efa8e7d238a55268a3f33-userdata-shm.mount: Deactivated successfully. Dec 15 03:23:18 localhost systemd[1]: var-lib-containers-storage-overlay-d46c1c020b0a74cb43cb11fd76641b6c77d04e774f05063e246f5f793ab911b3-merged.mount: Deactivated successfully. Dec 15 03:23:18 localhost podman[77946]: 2025-12-15 08:23:18.214577915 +0000 UTC m=+0.083754578 container cleanup cd6d91a86e15fddc432716c2f4a78daffc735710046efa8e7d238a55268a3f33 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_wait_for_compute_service, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, description=Red Hat OpenStack Platform 17.1 nova-compute, container_name=nova_wait_for_compute_service, build-date=2025-11-19T00:36:58Z, name=rhosp17/openstack-nova-compute, vendor=Red Hat, Inc., batch=17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 nova-compute, release=1761123044, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.display-name=Red Hat OpenStack 
Platform 17.1 nova-compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, com.redhat.component=openstack-nova-compute-container, config_data={'detach': False, 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', '__OS_DEBUG': 'true', 'TRIPLEO_CONFIG_HASH': '879500e96bf8dfb93687004bd86f2317'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'start_order': 4, 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova_compute_wait_for_compute_service.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/log/containers/nova:/var/log/nova', '/var/lib/container-config-scripts:/container-config-scripts']}, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, architecture=x86_64, io.buildah.version=1.41.4, managed_by=tripleo_ansible, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step5, vcs-type=git, maintainer=OpenStack TripleO Team, version=17.1.12, tcib_managed=true, konflux.additional-tags=17.1.12 17.1_20251118.1) Dec 15 03:23:18 localhost systemd[1]: libpod-conmon-cd6d91a86e15fddc432716c2f4a78daffc735710046efa8e7d238a55268a3f33.scope: Deactivated successfully. 
Dec 15 03:23:18 localhost python3[75915]: ansible-tripleo_container_manage PODMAN-CONTAINER-DEBUG: podman run --name nova_wait_for_compute_service --conmon-pidfile /run/nova_wait_for_compute_service.pid --detach=False --env KOLLA_CONFIG_STRATEGY=COPY_ALWAYS --env __OS_DEBUG=true --env TRIPLEO_CONFIG_HASH=879500e96bf8dfb93687004bd86f2317 --label config_id=tripleo_step5 --label container_name=nova_wait_for_compute_service --label managed_by=tripleo_ansible --label config_data={'detach': False, 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', '__OS_DEBUG': 'true', 'TRIPLEO_CONFIG_HASH': '879500e96bf8dfb93687004bd86f2317'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'start_order': 4, 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova_compute_wait_for_compute_service.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/log/containers/nova:/var/log/nova', '/var/lib/container-config-scripts:/container-config-scripts']} --log-driver k8s-file --log-opt path=/var/log/containers/stdouts/nova_wait_for_compute_service.log --network host --user nova --volume /etc/hosts:/etc/hosts:ro --volume /etc/localtime:/etc/localtime:ro --volume /etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro --volume /etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro --volume /etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro --volume 
/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro --volume /etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro --volume /dev/log:/dev/log --volume /etc/puppet:/etc/puppet:ro --volume /var/lib/kolla/config_files/nova_compute_wait_for_compute_service.json:/var/lib/kolla/config_files/config.json:ro --volume /var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro --volume /var/log/containers/nova:/var/log/nova --volume /var/lib/container-config-scripts:/container-config-scripts registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1 Dec 15 03:23:18 localhost python3[78001]: ansible-file Invoked with path=/etc/systemd/system/tripleo_nova_compute.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Dec 15 03:23:19 localhost python3[78017]: ansible-stat Invoked with path=/etc/systemd/system/tripleo_nova_compute_healthcheck.timer follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1 Dec 15 03:23:19 localhost python3[78078]: ansible-copy Invoked with src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1765786999.1826105-117870-278403610497759/source dest=/etc/systemd/system/tripleo_nova_compute.service mode=0644 owner=root group=root backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None Dec 15 03:23:20 localhost python3[78094]: ansible-systemd Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None Dec 15 
03:23:20 localhost systemd[1]: Reloading. Dec 15 03:23:20 localhost systemd-sysv-generator[78124]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. Dec 15 03:23:20 localhost systemd-rc-local-generator[78120]: /etc/rc.d/rc.local is not marked executable, skipping. Dec 15 03:23:20 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Dec 15 03:23:21 localhost python3[78146]: ansible-systemd Invoked with state=restarted name=tripleo_nova_compute.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None Dec 15 03:23:21 localhost systemd[1]: Reloading. Dec 15 03:23:21 localhost systemd-rc-local-generator[78170]: /etc/rc.d/rc.local is not marked executable, skipping. Dec 15 03:23:21 localhost systemd-sysv-generator[78174]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. Dec 15 03:23:21 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Dec 15 03:23:21 localhost systemd[1]: Starting nova_compute container... Dec 15 03:23:21 localhost tripleo-start-podman-container[78186]: Creating additional drop-in dependency for "nova_compute" (36a88083ed613f6871d2631ae462bca308a45ba583333f7cfe4d0b0c40a18ba5) Dec 15 03:23:21 localhost systemd[1]: Reloading. Dec 15 03:23:21 localhost systemd-sysv-generator[78245]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. 
Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. Dec 15 03:23:21 localhost systemd-rc-local-generator[78242]: /etc/rc.d/rc.local is not marked executable, skipping. Dec 15 03:23:22 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Dec 15 03:23:22 localhost systemd[1]: Started nova_compute container. Dec 15 03:23:22 localhost python3[78283]: ansible-file Invoked with path=/var/lib/container-puppet/container-puppet-tasks5.json state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Dec 15 03:23:23 localhost systemd[1]: Started /usr/bin/podman healthcheck run 6278d648aec9c9f12e0efaafa0d6ae5653eeb34459516699e562eeb9c8d23b3c. 
Dec 15 03:23:23 localhost podman[78332]: 2025-12-15 08:23:23.268957855 +0000 UTC m=+0.081085148 container health_status 6278d648aec9c9f12e0efaafa0d6ae5653eeb34459516699e562eeb9c8d23b3c (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., com.redhat.component=openstack-qdrouterd-container, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, architecture=x86_64, managed_by=tripleo_ansible, tcib_managed=true, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, name=rhosp17/openstack-qdrouterd, version=17.1.12, description=Red Hat OpenStack Platform 17.1 qdrouterd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, container_name=metrics_qdr, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, build-date=2025-11-18T22:49:46Z, url=https://www.redhat.com, config_id=tripleo_step1, io.buildah.version=1.41.4, summary=Red Hat OpenStack Platform 17.1 qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '29d07da103bbd44a9ed3e29999314b03'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, release=1761123044, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.expose-services=, batch=17.1_20251118.1, distribution-scope=public) Dec 15 03:23:23 localhost podman[78332]: 2025-12-15 08:23:23.45856591 +0000 UTC m=+0.270693223 container exec_died 6278d648aec9c9f12e0efaafa0d6ae5653eeb34459516699e562eeb9c8d23b3c (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, version=17.1.12, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, name=rhosp17/openstack-qdrouterd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.component=openstack-qdrouterd-container, vcs-type=git, release=1761123044, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '29d07da103bbd44a9ed3e29999314b03'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, distribution-scope=public, managed_by=tripleo_ansible, batch=17.1_20251118.1, config_id=tripleo_step1, build-date=2025-11-18T22:49:46Z, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.buildah.version=1.41.4, container_name=metrics_qdr, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 qdrouterd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true) Dec 15 03:23:23 localhost systemd[1]: 6278d648aec9c9f12e0efaafa0d6ae5653eeb34459516699e562eeb9c8d23b3c.service: Deactivated successfully. 
Dec 15 03:23:23 localhost sshd[78389]: main: sshd: ssh-rsa algorithm is disabled Dec 15 03:23:24 localhost python3[78435]: ansible-container_puppet_config Invoked with check_mode=False config_vol_prefix=/var/lib/config-data debug=True net_host=True no_archive=True puppet_config=/var/lib/container-puppet/container-puppet-tasks5.json short_hostname=np0005559462 step=5 update_config_hash_only=False Dec 15 03:23:24 localhost python3[78451]: ansible-file Invoked with path=/var/log/containers/stdouts state=directory owner=root group=root recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None setype=None attributes=None Dec 15 03:23:25 localhost python3[78467]: ansible-container_config_data Invoked with config_path=/var/lib/tripleo-config/container-puppet-config/step_5 config_pattern=container-puppet-*.json config_overrides={} debug=True Dec 15 03:23:37 localhost systemd[1]: Started /usr/bin/podman healthcheck run 165a6fd1f2b629d16da0798ec1c9bac41d302d530a0ffa668b2315a52d6aa04e. Dec 15 03:23:37 localhost systemd[1]: tmp-crun.if0AnH.mount: Deactivated successfully. 
Dec 15 03:23:37 localhost podman[78468]: 2025-12-15 08:23:37.752116036 +0000 UTC m=+0.084151399 container health_status 165a6fd1f2b629d16da0798ec1c9bac41d302d530a0ffa668b2315a52d6aa04e (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, name=rhosp17/openstack-collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, description=Red Hat OpenStack Platform 17.1 collectd, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, distribution-scope=public, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, batch=17.1_20251118.1, io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, vcs-type=git, version=17.1.12, managed_by=tripleo_ansible, config_id=tripleo_step3, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 collectd, com.redhat.component=openstack-collectd-container, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', 
'/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, vendor=Red Hat, Inc., release=1761123044, build-date=2025-11-18T22:51:28Z, io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, container_name=collectd) Dec 15 03:23:37 localhost podman[78468]: 2025-12-15 08:23:37.788148269 +0000 UTC m=+0.120183672 container exec_died 165a6fd1f2b629d16da0798ec1c9bac41d302d530a0ffa668b2315a52d6aa04e (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, distribution-scope=public, config_id=tripleo_step3, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 collectd, release=1761123044, managed_by=tripleo_ansible, vendor=Red Hat, Inc., com.redhat.component=openstack-collectd-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, container_name=collectd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-type=git, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, 
io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, io.openshift.expose-services=, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, summary=Red Hat OpenStack Platform 17.1 collectd, batch=17.1_20251118.1, version=17.1.12, io.buildah.version=1.41.4, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, build-date=2025-11-18T22:51:28Z, tcib_managed=true) Dec 15 03:23:37 localhost systemd[1]: 165a6fd1f2b629d16da0798ec1c9bac41d302d530a0ffa668b2315a52d6aa04e.service: Deactivated successfully. Dec 15 03:23:38 localhost systemd[1]: Started /usr/bin/podman healthcheck run 2649c3aec9af2f04509e1d248b1050b202fccc3ed2b86421c89a2a65376db057. 
Dec 15 03:23:38 localhost systemd[1]: Started /usr/bin/podman healthcheck run 36a88083ed613f6871d2631ae462bca308a45ba583333f7cfe4d0b0c40a18ba5. Dec 15 03:23:38 localhost systemd[1]: Started /usr/bin/podman healthcheck run 97115ff7c5a84ae7e17f79e696e94da3c63f9fe0051287fe9193bec282a03f99. Dec 15 03:23:38 localhost systemd[1]: Started /usr/bin/podman healthcheck run ac45612fab357b615b4684dbfa5bd000dd29eb1adfbaedb6db0802fc49e89549. Dec 15 03:23:38 localhost systemd[1]: Started /usr/bin/podman healthcheck run d6041e5471624a495d5be42684f6049ccf71802a80d5760239330151d465d146. Dec 15 03:23:38 localhost podman[78488]: 2025-12-15 08:23:38.775082946 +0000 UTC m=+0.099737075 container health_status 2649c3aec9af2f04509e1d248b1050b202fccc3ed2b86421c89a2a65376db057 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, name=rhosp17/openstack-iscsid, version=17.1.12, com.redhat.component=openstack-iscsid-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1, architecture=x86_64, release=1761123044, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '182e509007ab5e6e5b2500a552cbd5ba'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, container_name=iscsid, config_id=tripleo_step3, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-type=git, io.buildah.version=1.41.4, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, build-date=2025-11-18T23:44:13Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 iscsid, managed_by=tripleo_ansible, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, url=https://www.redhat.com) Dec 15 03:23:38 localhost podman[78488]: 2025-12-15 08:23:38.808517489 +0000 UTC m=+0.133171608 container exec_died 2649c3aec9af2f04509e1d248b1050b202fccc3ed2b86421c89a2a65376db057 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, container_name=iscsid, architecture=x86_64, konflux.additional-tags=17.1.12 17.1_20251118.1, version=17.1.12, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '182e509007ab5e6e5b2500a552cbd5ba'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', 
'/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, vendor=Red Hat, Inc., com.redhat.component=openstack-iscsid-container, description=Red Hat OpenStack Platform 17.1 iscsid, release=1761123044, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, managed_by=tripleo_ansible, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, tcib_managed=true, io.openshift.expose-services=, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, maintainer=OpenStack TripleO Team, config_id=tripleo_step3, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, distribution-scope=public, build-date=2025-11-18T23:44:13Z, vcs-type=git, io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20251118.1, name=rhosp17/openstack-iscsid) Dec 15 03:23:38 localhost systemd[1]: 2649c3aec9af2f04509e1d248b1050b202fccc3ed2b86421c89a2a65376db057.service: Deactivated successfully. 
Dec 15 03:23:38 localhost podman[78490]: 2025-12-15 08:23:38.883845801 +0000 UTC m=+0.203870066 container health_status 97115ff7c5a84ae7e17f79e696e94da3c63f9fe0051287fe9193bec282a03f99 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, architecture=x86_64, container_name=ceilometer_agent_ipmi, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, release=1761123044, maintainer=OpenStack TripleO Team, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'fcee5a4a91f85471fca7b61211375646'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, konflux.additional-tags=17.1.12 17.1_20251118.1, version=17.1.12, build-date=2025-11-19T00:12:45Z, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1, com.redhat.component=openstack-ceilometer-ipmi-container, distribution-scope=public, vcs-type=git, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, name=rhosp17/openstack-ceilometer-ipmi, tcib_managed=true, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.buildah.version=1.41.4, url=https://www.redhat.com, vendor=Red Hat, Inc., config_id=tripleo_step4) Dec 15 03:23:38 localhost podman[78491]: 2025-12-15 08:23:38.930268312 +0000 UTC m=+0.246978248 container health_status ac45612fab357b615b4684dbfa5bd000dd29eb1adfbaedb6db0802fc49e89549 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, tcib_managed=true, version=17.1.12, name=rhosp17/openstack-cron, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, release=1761123044, build-date=2025-11-18T22:49:32Z, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 cron, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, batch=17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 cron, vendor=Red Hat, Inc., architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.expose-services=, io.buildah.version=1.41.4, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 
'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, container_name=logrotate_crond, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, com.redhat.component=openstack-cron-container) Dec 15 03:23:38 localhost podman[78491]: 2025-12-15 08:23:38.938470101 +0000 UTC m=+0.255179987 container exec_died ac45612fab357b615b4684dbfa5bd000dd29eb1adfbaedb6db0802fc49e89549 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, batch=17.1_20251118.1, build-date=2025-11-18T22:49:32Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, managed_by=tripleo_ansible, version=17.1.12, name=rhosp17/openstack-cron, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, config_id=tripleo_step4, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, distribution-scope=public, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, 
config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-type=git, tcib_managed=true, release=1761123044, vendor=Red Hat, Inc., architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, description=Red Hat OpenStack Platform 17.1 cron, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, container_name=logrotate_crond, com.redhat.component=openstack-cron-container, summary=Red Hat OpenStack Platform 17.1 cron, url=https://www.redhat.com) Dec 15 03:23:38 localhost systemd[1]: ac45612fab357b615b4684dbfa5bd000dd29eb1adfbaedb6db0802fc49e89549.service: Deactivated successfully. 
Dec 15 03:23:38 localhost podman[78489]: 2025-12-15 08:23:38.982332283 +0000 UTC m=+0.306530510 container health_status 36a88083ed613f6871d2631ae462bca308a45ba583333f7cfe4d0b0c40a18ba5 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, container_name=nova_compute, vcs-type=git, url=https://www.redhat.com, build-date=2025-11-19T00:36:58Z, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.12 17.1_20251118.1, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '182e509007ab5e6e5b2500a552cbd5ba-879500e96bf8dfb93687004bd86f2317'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', 
'/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, version=17.1.12, managed_by=tripleo_ansible, batch=17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, release=1761123044, io.openshift.expose-services=, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, summary=Red Hat OpenStack Platform 17.1 nova-compute, config_id=tripleo_step5, io.buildah.version=1.41.4, description=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.component=openstack-nova-compute-container, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp17/openstack-nova-compute, tcib_managed=true, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public) Dec 15 03:23:38 localhost podman[78490]: 2025-12-15 08:23:38.992919435 +0000 UTC m=+0.312943690 container exec_died 97115ff7c5a84ae7e17f79e696e94da3c63f9fe0051287fe9193bec282a03f99 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 
openstack-ceilometer-ipmi, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, build-date=2025-11-19T00:12:45Z, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, release=1761123044, name=rhosp17/openstack-ceilometer-ipmi, version=17.1.12, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, batch=17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'fcee5a4a91f85471fca7b61211375646'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, distribution-scope=public, tcib_managed=true, container_name=ceilometer_agent_ipmi, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_id=tripleo_step4, com.redhat.component=openstack-ceilometer-ipmi-container, io.buildah.version=1.41.4) Dec 15 03:23:39 localhost systemd[1]: 97115ff7c5a84ae7e17f79e696e94da3c63f9fe0051287fe9193bec282a03f99.service: Deactivated successfully. 
Dec 15 03:23:39 localhost podman[78489]: 2025-12-15 08:23:39.013946108 +0000 UTC m=+0.338144335 container exec_died 36a88083ed613f6871d2631ae462bca308a45ba583333f7cfe4d0b0c40a18ba5 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.component=openstack-nova-compute-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1761123044, architecture=x86_64, managed_by=tripleo_ansible, io.openshift.expose-services=, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '182e509007ab5e6e5b2500a552cbd5ba-879500e96bf8dfb93687004bd86f2317'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', 
'/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, description=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2025-11-19T00:36:58Z, batch=17.1_20251118.1, vendor=Red Hat, Inc., url=https://www.redhat.com, config_id=tripleo_step5, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, container_name=nova_compute, tcib_managed=true, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, name=rhosp17/openstack-nova-compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-type=git, maintainer=OpenStack TripleO Team, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, version=17.1.12) Dec 15 03:23:39 localhost systemd[1]: 36a88083ed613f6871d2631ae462bca308a45ba583333f7cfe4d0b0c40a18ba5.service: Deactivated successfully. 
Dec 15 03:23:39 localhost podman[78493]: 2025-12-15 08:23:38.934087454 +0000 UTC m=+0.246884406 container health_status d6041e5471624a495d5be42684f6049ccf71802a80d5760239330151d465d146 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, name=rhosp17/openstack-ceilometer-compute, io.buildah.version=1.41.4, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, konflux.additional-tags=17.1.12 17.1_20251118.1, version=17.1.12, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, container_name=ceilometer_agent_compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, build-date=2025-11-19T00:11:48Z, io.openshift.expose-services=, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, com.redhat.component=openstack-ceilometer-compute-container, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'fcee5a4a91f85471fca7b61211375646'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, vendor=Red Hat, Inc., url=https://www.redhat.com, batch=17.1_20251118.1, distribution-scope=public, vcs-type=git, config_id=tripleo_step4, maintainer=OpenStack TripleO Team, release=1761123044) Dec 15 03:23:39 localhost podman[78493]: 2025-12-15 08:23:39.067416216 +0000 UTC m=+0.380213148 container exec_died d6041e5471624a495d5be42684f6049ccf71802a80d5760239330151d465d146 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, vcs-type=git, io.buildah.version=1.41.4, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, container_name=ceilometer_agent_compute, batch=17.1_20251118.1, distribution-scope=public, name=rhosp17/openstack-ceilometer-compute, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, com.redhat.component=openstack-ceilometer-compute-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, vendor=Red Hat, Inc., tcib_managed=true, release=1761123044, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'fcee5a4a91f85471fca7b61211375646'}, 'healthcheck': {'test': 
'/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, architecture=x86_64, url=https://www.redhat.com, build-date=2025-11-19T00:11:48Z, config_id=tripleo_step4, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute) Dec 15 03:23:39 localhost systemd[1]: d6041e5471624a495d5be42684f6049ccf71802a80d5760239330151d465d146.service: Deactivated successfully. Dec 15 03:23:40 localhost systemd[1]: Started /usr/bin/podman healthcheck run 4c3f871f4bdf287b1d3ce717956c1cf130617489217410eadf6fffcc440eb3b0. 
Dec 15 03:23:40 localhost podman[78598]: 2025-12-15 08:23:40.760049885 +0000 UTC m=+0.088373231 container health_status 4c3f871f4bdf287b1d3ce717956c1cf130617489217410eadf6fffcc440eb3b0 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, vcs-type=git, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2025-11-19T00:36:58Z, version=17.1.12, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, batch=17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, tcib_managed=true, url=https://www.redhat.com, name=rhosp17/openstack-nova-compute, container_name=nova_migration_target, release=1761123044, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 nova-compute, config_id=tripleo_step4, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '879500e96bf8dfb93687004bd86f2317'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', 
'/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.component=openstack-nova-compute-container, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.buildah.version=1.41.4) Dec 15 03:23:41 localhost podman[78598]: 2025-12-15 08:23:41.156379034 +0000 UTC m=+0.484702380 container exec_died 4c3f871f4bdf287b1d3ce717956c1cf130617489217410eadf6fffcc440eb3b0 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, vendor=Red Hat, Inc., batch=17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, release=1761123044, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, build-date=2025-11-19T00:36:58Z, version=17.1.12, io.buildah.version=1.41.4, konflux.additional-tags=17.1.12 17.1_20251118.1, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, config_id=tripleo_step4, io.openshift.expose-services=, vcs-type=git, container_name=nova_migration_target, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, name=rhosp17/openstack-nova-compute, managed_by=tripleo_ansible, com.redhat.component=openstack-nova-compute-container, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '879500e96bf8dfb93687004bd86f2317'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 
'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, architecture=x86_64, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute) Dec 15 03:23:41 localhost systemd[1]: 4c3f871f4bdf287b1d3ce717956c1cf130617489217410eadf6fffcc440eb3b0.service: Deactivated successfully. Dec 15 03:23:44 localhost systemd[1]: Started /usr/bin/podman healthcheck run 2df8d20c6ba32f38bd0e6519c9c77a4146b632f21c81197d8c319b223aad81f1. Dec 15 03:23:44 localhost systemd[1]: Started /usr/bin/podman healthcheck run 4d35b7dc3f3a4e3c60661e85d3511f50804e87244d74cab67af5c6e8c447d379. 
Dec 15 03:23:44 localhost podman[78622]: 2025-12-15 08:23:44.755578206 +0000 UTC m=+0.085993357 container health_status 2df8d20c6ba32f38bd0e6519c9c77a4146b632f21c81197d8c319b223aad81f1 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, container_name=ovn_controller, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, release=1761123044, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, distribution-scope=public, build-date=2025-11-18T23:34:05Z, summary=Red Hat OpenStack Platform 17.1 ovn-controller, name=rhosp17/openstack-ovn-controller, config_id=tripleo_step4, vcs-type=git, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.component=openstack-ovn-controller-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.12, vendor=Red Hat, Inc., config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, io.openshift.expose-services=, 
io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, architecture=x86_64, tcib_managed=true, maintainer=OpenStack TripleO Team, batch=17.1_20251118.1, io.buildah.version=1.41.4) Dec 15 03:23:44 localhost podman[78622]: 2025-12-15 08:23:44.802159241 +0000 UTC m=+0.132574382 container exec_died 2df8d20c6ba32f38bd0e6519c9c77a4146b632f21c81197d8c319b223aad81f1 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, summary=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, konflux.additional-tags=17.1.12 17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 ovn-controller, io.buildah.version=1.41.4, tcib_managed=true, com.redhat.component=openstack-ovn-controller-container, container_name=ovn_controller, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, name=rhosp17/openstack-ovn-controller, release=1761123044, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, config_id=tripleo_step4, vendor=Red Hat, Inc., version=17.1.12, batch=17.1_20251118.1, 
url=https://www.redhat.com, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-type=git, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, build-date=2025-11-18T23:34:05Z) Dec 15 03:23:44 localhost systemd[1]: 2df8d20c6ba32f38bd0e6519c9c77a4146b632f21c81197d8c319b223aad81f1.service: Deactivated successfully. Dec 15 03:23:44 localhost podman[78623]: 2025-12-15 08:23:44.801946306 +0000 UTC m=+0.129503271 container health_status 4d35b7dc3f3a4e3c60661e85d3511f50804e87244d74cab67af5c6e8c447d379 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, vendor=Red Hat, Inc., io.buildah.version=1.41.4, release=1761123044, konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-neutron-metadata-agent-ovn, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, vcs-type=git, tcib_managed=true, batch=17.1_20251118.1, version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.expose-services=, url=https://www.redhat.com, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'a56a6f14b467cd9064e40c03defa5ed7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 
'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, distribution-scope=public, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, config_id=tripleo_step4, container_name=ovn_metadata_agent, architecture=x86_64, build-date=2025-11-19T00:14:25Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn) Dec 15 03:23:44 localhost podman[78623]: 2025-12-15 08:23:44.881685985 +0000 UTC m=+0.209242910 container exec_died 4d35b7dc3f3a4e3c60661e85d3511f50804e87244d74cab67af5c6e8c447d379 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, name=rhosp17/openstack-neutron-metadata-agent-ovn, release=1761123044, 
org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, batch=17.1_20251118.1, distribution-scope=public, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, container_name=ovn_metadata_agent, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, version=17.1.12, maintainer=OpenStack TripleO Team, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.buildah.version=1.41.4, build-date=2025-11-19T00:14:25Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.expose-services=, architecture=x86_64, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-type=git, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, managed_by=tripleo_ansible, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'a56a6f14b467cd9064e40c03defa5ed7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', 
'/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, config_id=tripleo_step4) Dec 15 03:23:44 localhost systemd[1]: 4d35b7dc3f3a4e3c60661e85d3511f50804e87244d74cab67af5c6e8c447d379.service: Deactivated successfully. Dec 15 03:23:48 localhost systemd[1]: Starting Check and recover tripleo_nova_virtqemud... Dec 15 03:23:48 localhost recover_tripleo_nova_virtqemud[78674]: 61849 Dec 15 03:23:48 localhost systemd[1]: tripleo_nova_virtqemud_recover.service: Deactivated successfully. Dec 15 03:23:48 localhost systemd[1]: Finished Check and recover tripleo_nova_virtqemud. Dec 15 03:23:50 localhost sshd[78675]: main: sshd: ssh-rsa algorithm is disabled Dec 15 03:23:50 localhost systemd-logind[763]: New session 33 of user zuul. Dec 15 03:23:50 localhost systemd[1]: Started Session 33 of User zuul. Dec 15 03:23:51 localhost python3[78784]: ansible-ansible.legacy.setup Invoked with gather_subset=['all'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d Dec 15 03:23:53 localhost systemd[1]: Started /usr/bin/podman healthcheck run 6278d648aec9c9f12e0efaafa0d6ae5653eeb34459516699e562eeb9c8d23b3c. 
Dec 15 03:23:53 localhost podman[78821]: 2025-12-15 08:23:53.73965033 +0000 UTC m=+0.075496618 container health_status 6278d648aec9c9f12e0efaafa0d6ae5653eeb34459516699e562eeb9c8d23b3c (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, com.redhat.component=openstack-qdrouterd-container, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, io.buildah.version=1.41.4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, distribution-scope=public, container_name=metrics_qdr, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '29d07da103bbd44a9ed3e29999314b03'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, 
build-date=2025-11-18T22:49:46Z, version=17.1.12, name=rhosp17/openstack-qdrouterd, description=Red Hat OpenStack Platform 17.1 qdrouterd, architecture=x86_64, url=https://www.redhat.com, konflux.additional-tags=17.1.12 17.1_20251118.1, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, batch=17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, release=1761123044, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 qdrouterd, config_id=tripleo_step1, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd) Dec 15 03:23:53 localhost podman[78821]: 2025-12-15 08:23:53.928628639 +0000 UTC m=+0.264474937 container exec_died 6278d648aec9c9f12e0efaafa0d6ae5653eeb34459516699e562eeb9c8d23b3c (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, konflux.additional-tags=17.1.12 17.1_20251118.1, container_name=metrics_qdr, distribution-scope=public, io.openshift.expose-services=, build-date=2025-11-18T22:49:46Z, io.buildah.version=1.41.4, url=https://www.redhat.com, batch=17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, architecture=x86_64, version=17.1.12, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '29d07da103bbd44a9ed3e29999314b03'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, managed_by=tripleo_ansible, com.redhat.component=openstack-qdrouterd-container, config_id=tripleo_step1, vendor=Red Hat, Inc., tcib_managed=true, vcs-type=git, description=Red Hat OpenStack Platform 17.1 qdrouterd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, maintainer=OpenStack TripleO Team, release=1761123044, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp17/openstack-qdrouterd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, summary=Red Hat OpenStack Platform 17.1 qdrouterd) Dec 15 03:23:53 localhost systemd[1]: 6278d648aec9c9f12e0efaafa0d6ae5653eeb34459516699e562eeb9c8d23b3c.service: Deactivated successfully. 
Dec 15 03:23:59 localhost python3[79077]: ansible-ansible.legacy.dnf Invoked with name=['iptables'] allow_downgrade=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 allowerasing=False nobest=False use_backend=auto conf_file=None disable_excludes=None download_dir=None list=None releasever=None state=None Dec 15 03:24:06 localhost python3[79248]: ansible-ansible.builtin.iptables Invoked with action=insert chain=INPUT comment=allow ssh access for zuul executor in_interface=eth0 jump=ACCEPT protocol=tcp source=38.102.83.114 table=filter state=present ip_version=ipv4 match=[] destination_ports=[] ctstate=[] syn=ignore flush=False chain_management=False numeric=False rule_num=None wait=None to_source=None destination=None to_destination=None tcp_flags=None gateway=None log_prefix=None log_level=None goto=None out_interface=None fragment=None set_counters=None source_port=None destination_port=None to_ports=None set_dscp_mark=None set_dscp_mark_class=None src_range=None dst_range=None match_set=None match_set_flags=None limit=None limit_burst=None uid_owner=None gid_owner=None reject_with=None icmp_type=None policy=None Dec 15 03:24:06 localhost kernel: Warning: Deprecated Driver is detected: nft_compat will not be maintained in a future major release and may be disabled Dec 15 03:24:06 localhost systemd-journald[47230]: Field hash table of /run/log/journal/738a39f68bc78fb81032e509449fb759/system.journal has a fill level at 81.1 (270 of 333 items), suggesting rotation. Dec 15 03:24:06 localhost systemd-journald[47230]: /run/log/journal/738a39f68bc78fb81032e509449fb759/system.journal: Journal header limits reached or header out-of-date, rotating. 
Dec 15 03:24:06 localhost rsyslogd[759]: imjournal: journal files changed, reloading... [v8.2102.0-111.el9 try https://www.rsyslog.com/e/0 ] Dec 15 03:24:06 localhost rsyslogd[759]: imjournal: journal files changed, reloading... [v8.2102.0-111.el9 try https://www.rsyslog.com/e/0 ] Dec 15 03:24:08 localhost systemd[1]: Started /usr/bin/podman healthcheck run 165a6fd1f2b629d16da0798ec1c9bac41d302d530a0ffa668b2315a52d6aa04e. Dec 15 03:24:08 localhost podman[79317]: 2025-12-15 08:24:08.744010657 +0000 UTC m=+0.073131264 container health_status 165a6fd1f2b629d16da0798ec1c9bac41d302d530a0ffa668b2315a52d6aa04e (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, description=Red Hat OpenStack Platform 17.1 collectd, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, container_name=collectd, summary=Red Hat OpenStack Platform 17.1 collectd, vendor=Red Hat, Inc., konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.expose-services=, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, release=1761123044, managed_by=tripleo_ansible, version=17.1.12, distribution-scope=public, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_id=tripleo_step3, com.redhat.component=openstack-collectd-container, url=https://www.redhat.com, io.buildah.version=1.41.4, batch=17.1_20251118.1, build-date=2025-11-18T22:51:28Z, architecture=x86_64, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, name=rhosp17/openstack-collectd, config_data={'cap_add': ['IPC_LOCK'], 'environment': 
{'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}) Dec 15 03:24:08 localhost podman[79317]: 2025-12-15 08:24:08.758069673 +0000 UTC m=+0.087190270 container exec_died 165a6fd1f2b629d16da0798ec1c9bac41d302d530a0ffa668b2315a52d6aa04e (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, name=rhosp17/openstack-collectd, batch=17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, release=1761123044, description=Red Hat OpenStack Platform 17.1 collectd, maintainer=OpenStack TripleO Team, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 collectd, konflux.additional-tags=17.1.12 17.1_20251118.1, vendor=Red Hat, Inc., vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, build-date=2025-11-18T22:51:28Z, io.openshift.expose-services=, 
version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, config_id=tripleo_step3, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, com.redhat.component=openstack-collectd-container, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, url=https://www.redhat.com, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, architecture=x86_64, distribution-scope=public, container_name=collectd, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, io.k8s.description=Red Hat 
OpenStack Platform 17.1 collectd) Dec 15 03:24:08 localhost systemd[1]: 165a6fd1f2b629d16da0798ec1c9bac41d302d530a0ffa668b2315a52d6aa04e.service: Deactivated successfully. Dec 15 03:24:09 localhost systemd[1]: Started /usr/bin/podman healthcheck run 2649c3aec9af2f04509e1d248b1050b202fccc3ed2b86421c89a2a65376db057. Dec 15 03:24:09 localhost systemd[1]: Started /usr/bin/podman healthcheck run 36a88083ed613f6871d2631ae462bca308a45ba583333f7cfe4d0b0c40a18ba5. Dec 15 03:24:09 localhost systemd[1]: Started /usr/bin/podman healthcheck run 97115ff7c5a84ae7e17f79e696e94da3c63f9fe0051287fe9193bec282a03f99. Dec 15 03:24:09 localhost systemd[1]: Started /usr/bin/podman healthcheck run ac45612fab357b615b4684dbfa5bd000dd29eb1adfbaedb6db0802fc49e89549. Dec 15 03:24:09 localhost systemd[1]: Started /usr/bin/podman healthcheck run d6041e5471624a495d5be42684f6049ccf71802a80d5760239330151d465d146. Dec 15 03:24:09 localhost podman[79338]: 2025-12-15 08:24:09.75284326 +0000 UTC m=+0.072383825 container health_status ac45612fab357b615b4684dbfa5bd000dd29eb1adfbaedb6db0802fc49e89549 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, summary=Red Hat OpenStack Platform 17.1 cron, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, batch=17.1_20251118.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, release=1761123044, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_id=tripleo_step4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, tcib_managed=true, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, description=Red Hat OpenStack Platform 17.1 cron, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-cron-container, version=17.1.12, vendor=Red Hat, Inc., name=rhosp17/openstack-cron, url=https://www.redhat.com, io.openshift.expose-services=, container_name=logrotate_crond, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vcs-type=git, io.buildah.version=1.41.4, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, build-date=2025-11-18T22:49:32Z, managed_by=tripleo_ansible) Dec 15 03:24:09 localhost podman[79337]: 2025-12-15 08:24:09.826697512 +0000 UTC m=+0.147217974 container health_status 97115ff7c5a84ae7e17f79e696e94da3c63f9fe0051287fe9193bec282a03f99 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, managed_by=tripleo_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'fcee5a4a91f85471fca7b61211375646'}, 'healthcheck': {'test': 
'/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, maintainer=OpenStack TripleO Team, container_name=ceilometer_agent_ipmi, release=1761123044, com.redhat.component=openstack-ceilometer-ipmi-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, config_id=tripleo_step4, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, konflux.additional-tags=17.1.12 17.1_20251118.1, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.buildah.version=1.41.4, batch=17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-19T00:12:45Z, url=https://www.redhat.com, tcib_managed=true, name=rhosp17/openstack-ceilometer-ipmi, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, 
io.openshift.expose-services=, version=17.1.12) Dec 15 03:24:09 localhost podman[79338]: 2025-12-15 08:24:09.83823359 +0000 UTC m=+0.157774155 container exec_died ac45612fab357b615b4684dbfa5bd000dd29eb1adfbaedb6db0802fc49e89549 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, vcs-type=git, config_id=tripleo_step4, tcib_managed=true, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, io.buildah.version=1.41.4, vendor=Red Hat, Inc., com.redhat.component=openstack-cron-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, release=1761123044, batch=17.1_20251118.1, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-cron, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 cron, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, container_name=logrotate_crond, description=Red Hat OpenStack Platform 17.1 cron, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.12 17.1_20251118.1, build-date=2025-11-18T22:49:32Z, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, version=17.1.12, url=https://www.redhat.com, architecture=x86_64) Dec 15 03:24:09 localhost systemd[1]: ac45612fab357b615b4684dbfa5bd000dd29eb1adfbaedb6db0802fc49e89549.service: Deactivated successfully. Dec 15 03:24:09 localhost podman[79337]: 2025-12-15 08:24:09.883705095 +0000 UTC m=+0.204225597 container exec_died 97115ff7c5a84ae7e17f79e696e94da3c63f9fe0051287fe9193bec282a03f99 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.12, tcib_managed=true, com.redhat.component=openstack-ceilometer-ipmi-container, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, batch=17.1_20251118.1, distribution-scope=public, vendor=Red Hat, Inc., build-date=2025-11-19T00:12:45Z, config_id=tripleo_step4, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, architecture=x86_64, container_name=ceilometer_agent_ipmi, release=1761123044, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, 
config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'fcee5a4a91f85471fca7b61211375646'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, konflux.additional-tags=17.1.12 17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, name=rhosp17/openstack-ceilometer-ipmi) Dec 15 03:24:09 localhost podman[79345]: 2025-12-15 08:24:09.791892943 +0000 UTC m=+0.102469959 container health_status d6041e5471624a495d5be42684f6049ccf71802a80d5760239330151d465d146 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, release=1761123044, name=rhosp17/openstack-ceilometer-compute, vcs-type=git, io.buildah.version=1.41.4, architecture=x86_64, com.redhat.component=openstack-ceilometer-compute-container, managed_by=tripleo_ansible, batch=17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, 
cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, url=https://www.redhat.com, config_id=tripleo_step4, container_name=ceilometer_agent_compute, maintainer=OpenStack TripleO Team, version=17.1.12, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'fcee5a4a91f85471fca7b61211375646'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, 
build-date=2025-11-19T00:11:48Z, io.openshift.expose-services=) Dec 15 03:24:09 localhost systemd[1]: 97115ff7c5a84ae7e17f79e696e94da3c63f9fe0051287fe9193bec282a03f99.service: Deactivated successfully. Dec 15 03:24:09 localhost podman[79345]: 2025-12-15 08:24:09.92167919 +0000 UTC m=+0.232256186 container exec_died d6041e5471624a495d5be42684f6049ccf71802a80d5760239330151d465d146 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, vcs-type=git, url=https://www.redhat.com, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step4, io.buildah.version=1.41.4, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, release=1761123044, konflux.additional-tags=17.1.12 17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, managed_by=tripleo_ansible, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, vendor=Red Hat, Inc., build-date=2025-11-19T00:11:48Z, batch=17.1_20251118.1, architecture=x86_64, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'fcee5a4a91f85471fca7b61211375646'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, com.redhat.component=openstack-ceilometer-compute-container, name=rhosp17/openstack-ceilometer-compute, distribution-scope=public, container_name=ceilometer_agent_compute, version=17.1.12, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute) Dec 15 03:24:09 localhost systemd[1]: d6041e5471624a495d5be42684f6049ccf71802a80d5760239330151d465d146.service: Deactivated successfully. 
Dec 15 03:24:09 localhost podman[79336]: 2025-12-15 08:24:09.969533838 +0000 UTC m=+0.290411920 container health_status 36a88083ed613f6871d2631ae462bca308a45ba583333f7cfe4d0b0c40a18ba5 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, io.buildah.version=1.41.4, url=https://www.redhat.com, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '182e509007ab5e6e5b2500a552cbd5ba-879500e96bf8dfb93687004bd86f2317'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', 
'/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, container_name=nova_compute, build-date=2025-11-19T00:36:58Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, description=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp17/openstack-nova-compute, com.redhat.component=openstack-nova-compute-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, managed_by=tripleo_ansible, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, batch=17.1_20251118.1, architecture=x86_64, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vendor=Red Hat, Inc., config_id=tripleo_step5, version=17.1.12, summary=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.12 17.1_20251118.1, distribution-scope=public, release=1761123044, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute) Dec 15 03:24:10 localhost podman[79336]: 2025-12-15 08:24:10.018314021 +0000 UTC m=+0.339192103 container exec_died 36a88083ed613f6871d2631ae462bca308a45ba583333f7cfe4d0b0c40a18ba5 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, url=https://www.redhat.com, tcib_managed=true, io.openshift.expose-services=, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, 
architecture=x86_64, config_id=tripleo_step5, distribution-scope=public, version=17.1.12, build-date=2025-11-19T00:36:58Z, io.buildah.version=1.41.4, name=rhosp17/openstack-nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, com.redhat.component=openstack-nova-compute-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 nova-compute, release=1761123044, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '182e509007ab5e6e5b2500a552cbd5ba-879500e96bf8dfb93687004bd86f2317'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, container_name=nova_compute) Dec 15 03:24:10 localhost podman[79335]: 2025-12-15 08:24:10.023939701 +0000 UTC m=+0.350892415 container health_status 2649c3aec9af2f04509e1d248b1050b202fccc3ed2b86421c89a2a65376db057 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, tcib_managed=true, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, build-date=2025-11-18T23:44:13Z, konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, release=1761123044, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 iscsid, config_id=tripleo_step3, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, vcs-type=git, description=Red Hat OpenStack Platform 17.1 iscsid, name=rhosp17/openstack-iscsid, container_name=iscsid, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '182e509007ab5e6e5b2500a552cbd5ba'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20251118.1, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, com.redhat.component=openstack-iscsid-container, version=17.1.12) Dec 15 03:24:10 localhost systemd[1]: 36a88083ed613f6871d2631ae462bca308a45ba583333f7cfe4d0b0c40a18ba5.service: Deactivated successfully. 
Dec 15 03:24:10 localhost podman[79335]: 2025-12-15 08:24:10.037275588 +0000 UTC m=+0.364228272 container exec_died 2649c3aec9af2f04509e1d248b1050b202fccc3ed2b86421c89a2a65376db057 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, konflux.additional-tags=17.1.12 17.1_20251118.1, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '182e509007ab5e6e5b2500a552cbd5ba'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, name=rhosp17/openstack-iscsid, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, vcs-type=git, managed_by=tripleo_ansible, io.buildah.version=1.41.4, summary=Red Hat OpenStack Platform 17.1 iscsid, container_name=iscsid, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.component=openstack-iscsid-container, description=Red Hat 
OpenStack Platform 17.1 iscsid, io.openshift.expose-services=, batch=17.1_20251118.1, release=1761123044, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, build-date=2025-11-18T23:44:13Z, maintainer=OpenStack TripleO Team, version=17.1.12, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, config_id=tripleo_step3, vendor=Red Hat, Inc.) Dec 15 03:24:10 localhost systemd[1]: 2649c3aec9af2f04509e1d248b1050b202fccc3ed2b86421c89a2a65376db057.service: Deactivated successfully. Dec 15 03:24:11 localhost systemd[1]: Started /usr/bin/podman healthcheck run 4c3f871f4bdf287b1d3ce717956c1cf130617489217410eadf6fffcc440eb3b0. Dec 15 03:24:11 localhost systemd[1]: tmp-crun.uDw2Lj.mount: Deactivated successfully. Dec 15 03:24:11 localhost podman[79450]: 2025-12-15 08:24:11.741287351 +0000 UTC m=+0.077873991 container health_status 4c3f871f4bdf287b1d3ce717956c1cf130617489217410eadf6fffcc440eb3b0 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, batch=17.1_20251118.1, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, name=rhosp17/openstack-nova-compute, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '879500e96bf8dfb93687004bd86f2317'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, release=1761123044, url=https://www.redhat.com, com.redhat.component=openstack-nova-compute-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, container_name=nova_migration_target, io.openshift.expose-services=, io.buildah.version=1.41.4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, build-date=2025-11-19T00:36:58Z, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.12, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, vcs-type=git, vendor=Red Hat, Inc., managed_by=tripleo_ansible, config_id=tripleo_step4) Dec 15 03:24:12 localhost podman[79450]: 2025-12-15 08:24:12.102638655 +0000 UTC m=+0.439225255 container exec_died 4c3f871f4bdf287b1d3ce717956c1cf130617489217410eadf6fffcc440eb3b0 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, summary=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.12, config_id=tripleo_step4, distribution-scope=public, 
name=rhosp17/openstack-nova-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.buildah.version=1.41.4, container_name=nova_migration_target, description=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2025-11-19T00:36:58Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, architecture=x86_64, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.component=openstack-nova-compute-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, konflux.additional-tags=17.1.12 17.1_20251118.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '879500e96bf8dfb93687004bd86f2317'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, tcib_managed=true, managed_by=tripleo_ansible, release=1761123044, io.openshift.expose-services=, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, 
url=https://www.redhat.com, batch=17.1_20251118.1, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc.) Dec 15 03:24:12 localhost systemd[1]: 4c3f871f4bdf287b1d3ce717956c1cf130617489217410eadf6fffcc440eb3b0.service: Deactivated successfully. Dec 15 03:24:15 localhost systemd[1]: Started /usr/bin/podman healthcheck run 2df8d20c6ba32f38bd0e6519c9c77a4146b632f21c81197d8c319b223aad81f1. Dec 15 03:24:15 localhost systemd[1]: Started /usr/bin/podman healthcheck run 4d35b7dc3f3a4e3c60661e85d3511f50804e87244d74cab67af5c6e8c447d379. Dec 15 03:24:15 localhost podman[79472]: 2025-12-15 08:24:15.757491286 +0000 UTC m=+0.084432526 container health_status 2df8d20c6ba32f38bd0e6519c9c77a4146b632f21c81197d8c319b223aad81f1 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, maintainer=OpenStack TripleO Team, architecture=x86_64, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.component=openstack-ovn-controller-container, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp17/openstack-ovn-controller, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, batch=17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 ovn-controller, release=1761123044, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, config_id=tripleo_step4, build-date=2025-11-18T23:34:05Z, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, container_name=ovn_controller, version=17.1.12, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 
'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, io.buildah.version=1.41.4, konflux.additional-tags=17.1.12 17.1_20251118.1, vendor=Red Hat, Inc., vcs-type=git, description=Red Hat OpenStack Platform 17.1 ovn-controller, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, tcib_managed=true, distribution-scope=public) Dec 15 03:24:15 localhost podman[79472]: 2025-12-15 08:24:15.811759305 +0000 UTC m=+0.138700475 container exec_died 2df8d20c6ba32f38bd0e6519c9c77a4146b632f21c81197d8c319b223aad81f1 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step4, io.openshift.expose-services=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, summary=Red Hat OpenStack Platform 17.1 ovn-controller, 
com.redhat.component=openstack-ovn-controller-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1, konflux.additional-tags=17.1.12 17.1_20251118.1, build-date=2025-11-18T23:34:05Z, name=rhosp17/openstack-ovn-controller, version=17.1.12, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 ovn-controller, architecture=x86_64, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, distribution-scope=public, release=1761123044, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, container_name=ovn_controller, io.buildah.version=1.41.4, vcs-type=git) Dec 15 03:24:15 localhost podman[79473]: 2025-12-15 08:24:15.820789467 +0000 UTC m=+0.142542518 container health_status 4d35b7dc3f3a4e3c60661e85d3511f50804e87244d74cab67af5c6e8c447d379 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, version=17.1.12, release=1761123044, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, io.buildah.version=1.41.4, name=rhosp17/openstack-neutron-metadata-agent-ovn, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, container_name=ovn_metadata_agent, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, konflux.additional-tags=17.1.12 17.1_20251118.1, 
architecture=x86_64, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'a56a6f14b467cd9064e40c03defa5ed7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, config_id=tripleo_step4, distribution-scope=public, build-date=2025-11-19T00:14:25Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-type=git, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, tcib_managed=true, 
cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vendor=Red Hat, Inc., batch=17.1_20251118.1) Dec 15 03:24:15 localhost systemd[1]: 2df8d20c6ba32f38bd0e6519c9c77a4146b632f21c81197d8c319b223aad81f1.service: Deactivated successfully. Dec 15 03:24:15 localhost podman[79473]: 2025-12-15 08:24:15.871718858 +0000 UTC m=+0.193471879 container exec_died 4d35b7dc3f3a4e3c60661e85d3511f50804e87244d74cab67af5c6e8c447d379 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, batch=17.1_20251118.1, vendor=Red Hat, Inc., io.openshift.expose-services=, io.buildah.version=1.41.4, distribution-scope=public, architecture=x86_64, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'a56a6f14b467cd9064e40c03defa5ed7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', 
'/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, name=rhosp17/openstack-neutron-metadata-agent-ovn, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, version=17.1.12, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-type=git, build-date=2025-11-19T00:14:25Z, release=1761123044, konflux.additional-tags=17.1.12 17.1_20251118.1, container_name=ovn_metadata_agent, tcib_managed=true, config_id=tripleo_step4) Dec 15 03:24:15 localhost systemd[1]: 4d35b7dc3f3a4e3c60661e85d3511f50804e87244d74cab67af5c6e8c447d379.service: Deactivated successfully. Dec 15 03:24:24 localhost systemd[1]: Started /usr/bin/podman healthcheck run 6278d648aec9c9f12e0efaafa0d6ae5653eeb34459516699e562eeb9c8d23b3c. Dec 15 03:24:24 localhost systemd[1]: tmp-crun.NV6Vtt.mount: Deactivated successfully. 
Dec 15 03:24:24 localhost podman[79518]: 2025-12-15 08:24:24.769279259 +0000 UTC m=+0.097912856 container health_status 6278d648aec9c9f12e0efaafa0d6ae5653eeb34459516699e562eeb9c8d23b3c (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, com.redhat.component=openstack-qdrouterd-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, managed_by=tripleo_ansible, container_name=metrics_qdr, description=Red Hat OpenStack Platform 17.1 qdrouterd, version=17.1.12, release=1761123044, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 qdrouterd, tcib_managed=true, build-date=2025-11-18T22:49:46Z, konflux.additional-tags=17.1.12 17.1_20251118.1, name=rhosp17/openstack-qdrouterd, io.buildah.version=1.41.4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, batch=17.1_20251118.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '29d07da103bbd44a9ed3e29999314b03'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, architecture=x86_64, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, distribution-scope=public, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, url=https://www.redhat.com, config_id=tripleo_step1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vendor=Red Hat, Inc.) Dec 15 03:24:25 localhost podman[79518]: 2025-12-15 08:24:25.003456476 +0000 UTC m=+0.332090053 container exec_died 6278d648aec9c9f12e0efaafa0d6ae5653eeb34459516699e562eeb9c8d23b3c (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, vcs-type=git, vendor=Red Hat, Inc., tcib_managed=true, config_id=tripleo_step1, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '29d07da103bbd44a9ed3e29999314b03'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public, url=https://www.redhat.com, version=17.1.12, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, io.buildah.version=1.41.4, build-date=2025-11-18T22:49:46Z, release=1761123044, com.redhat.component=openstack-qdrouterd-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, architecture=x86_64, batch=17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, summary=Red Hat OpenStack Platform 17.1 qdrouterd, container_name=metrics_qdr, name=rhosp17/openstack-qdrouterd, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 qdrouterd) Dec 15 03:24:25 localhost systemd[1]: 6278d648aec9c9f12e0efaafa0d6ae5653eeb34459516699e562eeb9c8d23b3c.service: Deactivated successfully. Dec 15 03:24:39 localhost systemd[1]: Started /usr/bin/podman healthcheck run 165a6fd1f2b629d16da0798ec1c9bac41d302d530a0ffa668b2315a52d6aa04e. Dec 15 03:24:39 localhost systemd[1]: tmp-crun.CoNghV.mount: Deactivated successfully. 
Dec 15 03:24:39 localhost podman[79547]: 2025-12-15 08:24:39.750741875 +0000 UTC m=+0.085625958 container health_status 165a6fd1f2b629d16da0798ec1c9bac41d302d530a0ffa668b2315a52d6aa04e (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, build-date=2025-11-18T22:51:28Z, maintainer=OpenStack TripleO Team, version=17.1.12, batch=17.1_20251118.1, com.redhat.component=openstack-collectd-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, description=Red Hat OpenStack Platform 17.1 collectd, vendor=Red Hat, Inc., release=1761123044, container_name=collectd, name=rhosp17/openstack-collectd, vcs-type=git, konflux.additional-tags=17.1.12 17.1_20251118.1, config_id=tripleo_step3, io.openshift.expose-services=, url=https://www.redhat.com, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, architecture=x86_64, managed_by=tripleo_ansible, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, summary=Red Hat OpenStack Platform 17.1 collectd, io.buildah.version=1.41.4, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, tcib_managed=true) Dec 15 03:24:39 localhost podman[79547]: 2025-12-15 08:24:39.789403718 +0000 UTC m=+0.124287801 container exec_died 165a6fd1f2b629d16da0798ec1c9bac41d302d530a0ffa668b2315a52d6aa04e (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, konflux.additional-tags=17.1.12 17.1_20251118.1, build-date=2025-11-18T22:51:28Z, config_id=tripleo_step3, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-collectd-container, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, batch=17.1_20251118.1, container_name=collectd, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, version=17.1.12, url=https://www.redhat.com, io.openshift.expose-services=, tcib_managed=true, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 collectd, io.buildah.version=1.41.4, vendor=Red Hat, Inc., vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, release=1761123044, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, summary=Red Hat OpenStack Platform 17.1 collectd, name=rhosp17/openstack-collectd, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-type=git, managed_by=tripleo_ansible) Dec 15 03:24:39 localhost systemd[1]: 165a6fd1f2b629d16da0798ec1c9bac41d302d530a0ffa668b2315a52d6aa04e.service: Deactivated successfully. Dec 15 03:24:40 localhost systemd[1]: Started /usr/bin/podman healthcheck run 2649c3aec9af2f04509e1d248b1050b202fccc3ed2b86421c89a2a65376db057. 
Dec 15 03:24:40 localhost systemd[1]: Started /usr/bin/podman healthcheck run 36a88083ed613f6871d2631ae462bca308a45ba583333f7cfe4d0b0c40a18ba5. Dec 15 03:24:40 localhost systemd[1]: Started /usr/bin/podman healthcheck run 97115ff7c5a84ae7e17f79e696e94da3c63f9fe0051287fe9193bec282a03f99. Dec 15 03:24:40 localhost systemd[1]: Started /usr/bin/podman healthcheck run ac45612fab357b615b4684dbfa5bd000dd29eb1adfbaedb6db0802fc49e89549. Dec 15 03:24:40 localhost systemd[1]: Started /usr/bin/podman healthcheck run d6041e5471624a495d5be42684f6049ccf71802a80d5760239330151d465d146. Dec 15 03:24:40 localhost systemd[1]: tmp-crun.Ex09uz.mount: Deactivated successfully. Dec 15 03:24:40 localhost podman[79569]: 2025-12-15 08:24:40.768884725 +0000 UTC m=+0.092257396 container health_status ac45612fab357b615b4684dbfa5bd000dd29eb1adfbaedb6db0802fc49e89549 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 cron, summary=Red Hat OpenStack Platform 17.1 cron, distribution-scope=public, url=https://www.redhat.com, vendor=Red Hat, Inc., config_id=tripleo_step4, io.buildah.version=1.41.4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-18T22:49:32Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.12, vcs-type=git, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, container_name=logrotate_crond, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': 
['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, architecture=x86_64, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, io.openshift.expose-services=, name=rhosp17/openstack-cron, release=1761123044, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, batch=17.1_20251118.1, com.redhat.component=openstack-cron-container, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, maintainer=OpenStack TripleO Team) Dec 15 03:24:40 localhost podman[79569]: 2025-12-15 08:24:40.781324778 +0000 UTC m=+0.104697449 container exec_died ac45612fab357b615b4684dbfa5bd000dd29eb1adfbaedb6db0802fc49e89549 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, io.openshift.expose-services=, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, config_id=tripleo_step4, konflux.additional-tags=17.1.12 17.1_20251118.1, batch=17.1_20251118.1, container_name=logrotate_crond, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 
'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, summary=Red Hat OpenStack Platform 17.1 cron, architecture=x86_64, distribution-scope=public, build-date=2025-11-18T22:49:32Z, io.buildah.version=1.41.4, name=rhosp17/openstack-cron, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, description=Red Hat OpenStack Platform 17.1 cron, com.redhat.component=openstack-cron-container, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, version=17.1.12, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, managed_by=tripleo_ansible, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, release=1761123044) Dec 15 03:24:40 localhost podman[79568]: 2025-12-15 08:24:40.812225843 +0000 UTC m=+0.137778971 container health_status 97115ff7c5a84ae7e17f79e696e94da3c63f9fe0051287fe9193bec282a03f99 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, 
name=ceilometer_agent_ipmi, health_status=healthy, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-ceilometer-ipmi-container, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, maintainer=OpenStack TripleO Team, container_name=ceilometer_agent_ipmi, konflux.additional-tags=17.1.12 17.1_20251118.1, build-date=2025-11-19T00:12:45Z, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, io.buildah.version=1.41.4, release=1761123044, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.expose-services=, batch=17.1_20251118.1, vcs-type=git, config_id=tripleo_step4, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vendor=Red Hat, Inc., url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, architecture=x86_64, managed_by=tripleo_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'fcee5a4a91f85471fca7b61211375646'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, version=17.1.12, name=rhosp17/openstack-ceilometer-ipmi) Dec 15 03:24:40 localhost podman[79568]: 2025-12-15 08:24:40.863104293 +0000 UTC m=+0.188657361 container exec_died 97115ff7c5a84ae7e17f79e696e94da3c63f9fe0051287fe9193bec282a03f99 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, config_id=tripleo_step4, version=17.1.12, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, tcib_managed=true, architecture=x86_64, distribution-scope=public, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, name=rhosp17/openstack-ceilometer-ipmi, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.buildah.version=1.41.4, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, url=https://www.redhat.com, build-date=2025-11-19T00:12:45Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, com.redhat.component=openstack-ceilometer-ipmi-container, vendor=Red Hat, Inc., batch=17.1_20251118.1, konflux.additional-tags=17.1.12 17.1_20251118.1, container_name=ceilometer_agent_ipmi, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_data={'environment': 
{'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'fcee5a4a91f85471fca7b61211375646'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, release=1761123044) Dec 15 03:24:40 localhost systemd[1]: 97115ff7c5a84ae7e17f79e696e94da3c63f9fe0051287fe9193bec282a03f99.service: Deactivated successfully. Dec 15 03:24:40 localhost systemd[1]: ac45612fab357b615b4684dbfa5bd000dd29eb1adfbaedb6db0802fc49e89549.service: Deactivated successfully. 
Dec 15 03:24:40 localhost podman[79566]: 2025-12-15 08:24:40.864970313 +0000 UTC m=+0.197262882 container health_status 2649c3aec9af2f04509e1d248b1050b202fccc3ed2b86421c89a2a65376db057 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, build-date=2025-11-18T23:44:13Z, managed_by=tripleo_ansible, konflux.additional-tags=17.1.12 17.1_20251118.1, url=https://www.redhat.com, container_name=iscsid, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.component=openstack-iscsid-container, io.buildah.version=1.41.4, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '182e509007ab5e6e5b2500a552cbd5ba'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, batch=17.1_20251118.1, tcib_managed=true, architecture=x86_64, 
name=rhosp17/openstack-iscsid, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 iscsid, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.expose-services=, distribution-scope=public, release=1761123044, vcs-type=git, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, config_id=tripleo_step3, description=Red Hat OpenStack Platform 17.1 iscsid, version=17.1.12, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team) Dec 15 03:24:40 localhost podman[79567]: 2025-12-15 08:24:40.928471709 +0000 UTC m=+0.257437739 container health_status 36a88083ed613f6871d2631ae462bca308a45ba583333f7cfe4d0b0c40a18ba5 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, vendor=Red Hat, Inc., distribution-scope=public, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '182e509007ab5e6e5b2500a552cbd5ba-879500e96bf8dfb93687004bd86f2317'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.12, name=rhosp17/openstack-nova-compute, container_name=nova_compute, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20251118.1, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, release=1761123044, architecture=x86_64, config_id=tripleo_step5, managed_by=tripleo_ansible, io.openshift.expose-services=, build-date=2025-11-19T00:36:58Z, io.buildah.version=1.41.4, vcs-type=git, description=Red Hat OpenStack Platform 17.1 nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.component=openstack-nova-compute-container, maintainer=OpenStack TripleO Team) Dec 15 03:24:40 localhost podman[79567]: 2025-12-15 08:24:40.955972423 +0000 UTC 
m=+0.284938483 container exec_died 36a88083ed613f6871d2631ae462bca308a45ba583333f7cfe4d0b0c40a18ba5 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '182e509007ab5e6e5b2500a552cbd5ba-879500e96bf8dfb93687004bd86f2317'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_id=tripleo_step5, container_name=nova_compute, name=rhosp17/openstack-nova-compute, maintainer=OpenStack TripleO Team, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.buildah.version=1.41.4, description=Red Hat OpenStack Platform 17.1 nova-compute, url=https://www.redhat.com, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, tcib_managed=true, build-date=2025-11-19T00:36:58Z, vendor=Red Hat, Inc., batch=17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, managed_by=tripleo_ansible, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, summary=Red Hat OpenStack Platform 17.1 nova-compute, release=1761123044, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.12, io.openshift.expose-services=, com.redhat.component=openstack-nova-compute-container, konflux.additional-tags=17.1.12 17.1_20251118.1) Dec 15 03:24:40 localhost systemd[1]: 36a88083ed613f6871d2631ae462bca308a45ba583333f7cfe4d0b0c40a18ba5.service: Deactivated successfully. 
Dec 15 03:24:40 localhost podman[79575]: 2025-12-15 08:24:40.971749485 +0000 UTC m=+0.291869729 container health_status d6041e5471624a495d5be42684f6049ccf71802a80d5760239330151d465d146 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, release=1761123044, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, build-date=2025-11-19T00:11:48Z, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-type=git, container_name=ceilometer_agent_compute, batch=17.1_20251118.1, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, tcib_managed=true, managed_by=tripleo_ansible, architecture=x86_64, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'fcee5a4a91f85471fca7b61211375646'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, version=17.1.12, name=rhosp17/openstack-ceilometer-compute, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, config_id=tripleo_step4, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.expose-services=, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, com.redhat.component=openstack-ceilometer-compute-container, distribution-scope=public) Dec 15 03:24:41 localhost podman[79566]: 2025-12-15 08:24:40.999704272 +0000 UTC m=+0.331996881 container exec_died 2649c3aec9af2f04509e1d248b1050b202fccc3ed2b86421c89a2a65376db057 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, io.buildah.version=1.41.4, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '182e509007ab5e6e5b2500a552cbd5ba'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', 
'/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, konflux.additional-tags=17.1.12 17.1_20251118.1, config_id=tripleo_step3, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, version=17.1.12, architecture=x86_64, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, name=rhosp17/openstack-iscsid, build-date=2025-11-18T23:44:13Z, container_name=iscsid, batch=17.1_20251118.1, distribution-scope=public, com.redhat.component=openstack-iscsid-container, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, release=1761123044, description=Red Hat OpenStack Platform 17.1 iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., tcib_managed=true, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d) Dec 15 03:24:41 localhost systemd[1]: 2649c3aec9af2f04509e1d248b1050b202fccc3ed2b86421c89a2a65376db057.service: Deactivated successfully. 
Dec 15 03:24:41 localhost podman[79575]: 2025-12-15 08:24:41.025493191 +0000 UTC m=+0.345613455 container exec_died d6041e5471624a495d5be42684f6049ccf71802a80d5760239330151d465d146 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, url=https://www.redhat.com, release=1761123044, config_id=tripleo_step4, batch=17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.component=openstack-ceilometer-compute-container, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, version=17.1.12, tcib_managed=true, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'fcee5a4a91f85471fca7b61211375646'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, container_name=ceilometer_agent_compute, 
maintainer=OpenStack TripleO Team, build-date=2025-11-19T00:11:48Z, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, vcs-type=git, managed_by=tripleo_ansible, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, name=rhosp17/openstack-ceilometer-compute, io.openshift.expose-services=, io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute) Dec 15 03:24:41 localhost systemd[1]: d6041e5471624a495d5be42684f6049ccf71802a80d5760239330151d465d146.service: Deactivated successfully. Dec 15 03:24:42 localhost systemd[1]: Started /usr/bin/podman healthcheck run 4c3f871f4bdf287b1d3ce717956c1cf130617489217410eadf6fffcc440eb3b0. Dec 15 03:24:42 localhost podman[79683]: 2025-12-15 08:24:42.749080567 +0000 UTC m=+0.080277696 container health_status 4c3f871f4bdf287b1d3ce717956c1cf130617489217410eadf6fffcc440eb3b0 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, name=rhosp17/openstack-nova-compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, description=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20251118.1, url=https://www.redhat.com, tcib_managed=true, container_name=nova_migration_target, config_id=tripleo_step4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, architecture=x86_64, com.redhat.component=openstack-nova-compute-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vendor=Red Hat, Inc., version=17.1.12, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, 
io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, distribution-scope=public, release=1761123044, summary=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible, build-date=2025-11-19T00:36:58Z, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '879500e96bf8dfb93687004bd86f2317'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.buildah.version=1.41.4, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute) Dec 15 03:24:43 localhost podman[79683]: 2025-12-15 08:24:43.145496037 +0000 UTC m=+0.476693176 container exec_died 4c3f871f4bdf287b1d3ce717956c1cf130617489217410eadf6fffcc440eb3b0 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, config_id=tripleo_step4, name=rhosp17/openstack-nova-compute, architecture=x86_64, 
container_name=nova_migration_target, distribution-scope=public, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2025-11-19T00:36:58Z, description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, batch=17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, release=1761123044, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, managed_by=tripleo_ansible, vendor=Red Hat, Inc., version=17.1.12, url=https://www.redhat.com, tcib_managed=true, com.redhat.component=openstack-nova-compute-container, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '879500e96bf8dfb93687004bd86f2317'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}) Dec 15 03:24:43 localhost systemd[1]: 4c3f871f4bdf287b1d3ce717956c1cf130617489217410eadf6fffcc440eb3b0.service: Deactivated successfully. Dec 15 03:24:44 localhost sshd[79704]: main: sshd: ssh-rsa algorithm is disabled Dec 15 03:24:46 localhost systemd[1]: Started /usr/bin/podman healthcheck run 2df8d20c6ba32f38bd0e6519c9c77a4146b632f21c81197d8c319b223aad81f1. Dec 15 03:24:46 localhost systemd[1]: Started /usr/bin/podman healthcheck run 4d35b7dc3f3a4e3c60661e85d3511f50804e87244d74cab67af5c6e8c447d379. Dec 15 03:24:46 localhost podman[79707]: 2025-12-15 08:24:46.758825458 +0000 UTC m=+0.088721920 container health_status 4d35b7dc3f3a4e3c60661e85d3511f50804e87244d74cab67af5c6e8c447d379 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, version=17.1.12, config_id=tripleo_step4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, build-date=2025-11-19T00:14:25Z, batch=17.1_20251118.1, release=1761123044, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp17/openstack-neutron-metadata-agent-ovn, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.buildah.version=1.41.4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.12 17.1_20251118.1, container_name=ovn_metadata_agent, io.k8s.display-name=Red Hat OpenStack Platform 17.1 
neutron-metadata-agent-ovn, tcib_managed=true, distribution-scope=public, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-type=git, architecture=x86_64, url=https://www.redhat.com, io.openshift.expose-services=, managed_by=tripleo_ansible, vendor=Red Hat, Inc., config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'a56a6f14b467cd9064e40c03defa5ed7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c) Dec 15 03:24:46 localhost podman[79707]: 2025-12-15 08:24:46.797896942 +0000 UTC m=+0.127793364 container exec_died 
4d35b7dc3f3a4e3c60661e85d3511f50804e87244d74cab67af5c6e8c447d379 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, io.buildah.version=1.41.4, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, architecture=x86_64, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.expose-services=, vcs-type=git, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'a56a6f14b467cd9064e40c03defa5ed7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, container_name=ovn_metadata_agent, build-date=2025-11-19T00:14:25Z, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, batch=17.1_20251118.1, name=rhosp17/openstack-neutron-metadata-agent-ovn, distribution-scope=public, release=1761123044, managed_by=tripleo_ansible, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, version=17.1.12) Dec 15 03:24:46 localhost podman[79706]: 2025-12-15 08:24:46.808193997 +0000 UTC m=+0.141523060 container health_status 2df8d20c6ba32f38bd0e6519c9c77a4146b632f21c81197d8c319b223aad81f1 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 ovn-controller, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, url=https://www.redhat.com, com.redhat.component=openstack-ovn-controller-container, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', 
'/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, io.buildah.version=1.41.4, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, config_id=tripleo_step4, build-date=2025-11-18T23:34:05Z, vcs-type=git, release=1761123044, version=17.1.12, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, tcib_managed=true, batch=17.1_20251118.1, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, name=rhosp17/openstack-ovn-controller, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, description=Red Hat OpenStack Platform 17.1 ovn-controller, container_name=ovn_controller, architecture=x86_64, konflux.additional-tags=17.1.12 17.1_20251118.1) Dec 15 03:24:46 localhost systemd[1]: 4d35b7dc3f3a4e3c60661e85d3511f50804e87244d74cab67af5c6e8c447d379.service: Deactivated successfully. 
Dec 15 03:24:46 localhost podman[79706]: 2025-12-15 08:24:46.857408753 +0000 UTC m=+0.190737756 container exec_died 2df8d20c6ba32f38bd0e6519c9c77a4146b632f21c81197d8c319b223aad81f1 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, maintainer=OpenStack TripleO Team, build-date=2025-11-18T23:34:05Z, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, vendor=Red Hat, Inc., name=rhosp17/openstack-ovn-controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, tcib_managed=true, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 ovn-controller, batch=17.1_20251118.1, architecture=x86_64, release=1761123044, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, config_id=tripleo_step4, distribution-scope=public, url=https://www.redhat.com, version=17.1.12, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, description=Red Hat OpenStack Platform 17.1 ovn-controller, container_name=ovn_controller, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, 
io.openshift.expose-services=, com.redhat.component=openstack-ovn-controller-container, io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller) Dec 15 03:24:46 localhost systemd[1]: 2df8d20c6ba32f38bd0e6519c9c77a4146b632f21c81197d8c319b223aad81f1.service: Deactivated successfully. Dec 15 03:24:55 localhost systemd[1]: Started /usr/bin/podman healthcheck run 6278d648aec9c9f12e0efaafa0d6ae5653eeb34459516699e562eeb9c8d23b3c. Dec 15 03:24:55 localhost podman[79754]: 2025-12-15 08:24:55.751934853 +0000 UTC m=+0.087828758 container health_status 6278d648aec9c9f12e0efaafa0d6ae5653eeb34459516699e562eeb9c8d23b3c (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '29d07da103bbd44a9ed3e29999314b03'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, konflux.additional-tags=17.1.12 17.1_20251118.1, release=1761123044, architecture=x86_64, vcs-type=git, 
com.redhat.component=openstack-qdrouterd-container, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, name=rhosp17/openstack-qdrouterd, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, tcib_managed=true, batch=17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 qdrouterd, container_name=metrics_qdr, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, config_id=tripleo_step1, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, description=Red Hat OpenStack Platform 17.1 qdrouterd, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public, io.buildah.version=1.41.4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, version=17.1.12, build-date=2025-11-18T22:49:46Z) Dec 15 03:24:55 localhost podman[79754]: 2025-12-15 08:24:55.971906389 +0000 UTC m=+0.307800234 container exec_died 6278d648aec9c9f12e0efaafa0d6ae5653eeb34459516699e562eeb9c8d23b3c (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, summary=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, url=https://www.redhat.com, name=rhosp17/openstack-qdrouterd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-type=git, release=1761123044, architecture=x86_64, com.redhat.component=openstack-qdrouterd-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_data={'environment': 
{'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '29d07da103bbd44a9ed3e29999314b03'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, container_name=metrics_qdr, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, io.buildah.version=1.41.4, batch=17.1_20251118.1, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 qdrouterd, build-date=2025-11-18T22:49:46Z, version=17.1.12, config_id=tripleo_step1, konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible, tcib_managed=true, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vendor=Red Hat, Inc.) Dec 15 03:24:55 localhost systemd[1]: 6278d648aec9c9f12e0efaafa0d6ae5653eeb34459516699e562eeb9c8d23b3c.service: Deactivated successfully. Dec 15 03:25:05 localhost systemd[1]: session-33.scope: Deactivated successfully. 
Dec 15 03:25:05 localhost systemd[1]: session-33.scope: Consumed 5.854s CPU time. Dec 15 03:25:05 localhost systemd-logind[763]: Session 33 logged out. Waiting for processes to exit. Dec 15 03:25:05 localhost systemd-logind[763]: Removed session 33. Dec 15 03:25:05 localhost ceph-osd[31375]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS ------- Dec 15 03:25:05 localhost ceph-osd[31375]: rocksdb: [db/db_impl/db_impl.cc:1111] #012** DB Stats **#012Uptime(secs): 2400.1 total, 600.0 interval#012Cumulative writes: 4373 writes, 20K keys, 4373 commit groups, 1.0 writes per commit group, ingest: 0.02 GB, 0.01 MB/s#012Cumulative WAL: 4373 writes, 484 syncs, 9.04 writes per sync, written: 0.02 GB, 0.01 MB/s#012Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent#012Interval writes: 0 writes, 0 keys, 0 commit groups, 0.0 writes per commit group, ingest: 0.00 MB, 0.00 MB/s#012Interval WAL: 0 writes, 0 syncs, 0.00 writes per sync, written: 0.00 GB, 0.00 MB/s#012Interval stall: 00:00:0.000 H:M:S, 0.0 percent Dec 15 03:25:10 localhost ceph-osd[32311]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS ------- Dec 15 03:25:10 localhost ceph-osd[32311]: rocksdb: [db/db_impl/db_impl.cc:1111] #012** DB Stats **#012Uptime(secs): 2400.2 total, 600.0 interval#012Cumulative writes: 5246 writes, 23K keys, 5246 commit groups, 1.0 writes per commit group, ingest: 0.02 GB, 0.01 MB/s#012Cumulative WAL: 5246 writes, 573 syncs, 9.16 writes per sync, written: 0.02 GB, 0.01 MB/s#012Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent#012Interval writes: 0 writes, 0 keys, 0 commit groups, 0.0 writes per commit group, ingest: 0.00 MB, 0.00 MB/s#012Interval WAL: 0 writes, 0 syncs, 0.00 writes per sync, written: 0.00 GB, 0.00 MB/s#012Interval stall: 00:00:0.000 H:M:S, 0.0 percent Dec 15 03:25:10 localhost systemd[1]: Started /usr/bin/podman healthcheck run 165a6fd1f2b629d16da0798ec1c9bac41d302d530a0ffa668b2315a52d6aa04e. 
Dec 15 03:25:10 localhost podman[79904]: 2025-12-15 08:25:10.753976659 +0000 UTC m=+0.086308617 container health_status 165a6fd1f2b629d16da0798ec1c9bac41d302d530a0ffa668b2315a52d6aa04e (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, summary=Red Hat OpenStack Platform 17.1 collectd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, version=17.1.12, konflux.additional-tags=17.1.12 17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 collectd, vendor=Red Hat, Inc., managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-18T22:51:28Z, vcs-type=git, release=1761123044, config_id=tripleo_step3, batch=17.1_20251118.1, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, distribution-scope=public, url=https://www.redhat.com, container_name=collectd, name=rhosp17/openstack-collectd, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', 
'/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.component=openstack-collectd-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, io.buildah.version=1.41.4) Dec 15 03:25:10 localhost podman[79904]: 2025-12-15 08:25:10.769352819 +0000 UTC m=+0.101684797 container exec_died 165a6fd1f2b629d16da0798ec1c9bac41d302d530a0ffa668b2315a52d6aa04e (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, io.openshift.expose-services=, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, managed_by=tripleo_ansible, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, summary=Red Hat OpenStack Platform 17.1 collectd, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, vcs-type=git, description=Red Hat OpenStack Platform 17.1 collectd, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, konflux.additional-tags=17.1.12 17.1_20251118.1, distribution-scope=public, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.buildah.version=1.41.4, batch=17.1_20251118.1, name=rhosp17/openstack-collectd, version=17.1.12, 
config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, com.redhat.component=openstack-collectd-container, release=1761123044, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., config_id=tripleo_step3, build-date=2025-11-18T22:51:28Z, container_name=collectd, url=https://www.redhat.com) Dec 15 03:25:10 localhost systemd[1]: 165a6fd1f2b629d16da0798ec1c9bac41d302d530a0ffa668b2315a52d6aa04e.service: Deactivated successfully. Dec 15 03:25:11 localhost systemd[1]: Started /usr/bin/podman healthcheck run 2649c3aec9af2f04509e1d248b1050b202fccc3ed2b86421c89a2a65376db057. 
Dec 15 03:25:11 localhost systemd[1]: Started /usr/bin/podman healthcheck run 36a88083ed613f6871d2631ae462bca308a45ba583333f7cfe4d0b0c40a18ba5. Dec 15 03:25:11 localhost systemd[1]: Started /usr/bin/podman healthcheck run 97115ff7c5a84ae7e17f79e696e94da3c63f9fe0051287fe9193bec282a03f99. Dec 15 03:25:11 localhost systemd[1]: Started /usr/bin/podman healthcheck run ac45612fab357b615b4684dbfa5bd000dd29eb1adfbaedb6db0802fc49e89549. Dec 15 03:25:11 localhost systemd[1]: Started /usr/bin/podman healthcheck run d6041e5471624a495d5be42684f6049ccf71802a80d5760239330151d465d146. Dec 15 03:25:11 localhost systemd[1]: tmp-crun.4my1hO.mount: Deactivated successfully. Dec 15 03:25:11 localhost podman[79924]: 2025-12-15 08:25:11.775131309 +0000 UTC m=+0.103648630 container health_status 2649c3aec9af2f04509e1d248b1050b202fccc3ed2b86421c89a2a65376db057 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, description=Red Hat OpenStack Platform 17.1 iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, managed_by=tripleo_ansible, build-date=2025-11-18T23:44:13Z, io.openshift.expose-services=, release=1761123044, io.buildah.version=1.41.4, version=17.1.12, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step3, name=rhosp17/openstack-iscsid, com.redhat.component=openstack-iscsid-container, container_name=iscsid, vendor=Red Hat, Inc., url=https://www.redhat.com, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 
'TRIPLEO_CONFIG_HASH': '182e509007ab5e6e5b2500a552cbd5ba'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, batch=17.1_20251118.1, vcs-type=git, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, tcib_managed=true) Dec 15 03:25:11 localhost podman[79924]: 2025-12-15 08:25:11.786263996 +0000 UTC m=+0.114781347 container exec_died 2649c3aec9af2f04509e1d248b1050b202fccc3ed2b86421c89a2a65376db057 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, batch=17.1_20251118.1, com.redhat.component=openstack-iscsid-container, managed_by=tripleo_ansible, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, container_name=iscsid, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, 
io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, io.buildah.version=1.41.4, konflux.additional-tags=17.1.12 17.1_20251118.1, config_id=tripleo_step3, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '182e509007ab5e6e5b2500a552cbd5ba'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 iscsid, name=rhosp17/openstack-iscsid, release=1761123044, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.12, summary=Red Hat OpenStack Platform 17.1 iscsid, build-date=2025-11-18T23:44:13Z, architecture=x86_64, url=https://www.redhat.com, io.openshift.expose-services=, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid) Dec 15 03:25:11 localhost systemd[1]: 2649c3aec9af2f04509e1d248b1050b202fccc3ed2b86421c89a2a65376db057.service: Deactivated successfully. 
Dec 15 03:25:11 localhost podman[79938]: 2025-12-15 08:25:11.824738274 +0000 UTC m=+0.138206823 container health_status d6041e5471624a495d5be42684f6049ccf71802a80d5760239330151d465d146 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, architecture=x86_64, url=https://www.redhat.com, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, vendor=Red Hat, Inc., name=rhosp17/openstack-ceilometer-compute, version=17.1.12, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'fcee5a4a91f85471fca7b61211375646'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, batch=17.1_20251118.1, io.buildah.version=1.41.4, config_id=tripleo_step4, build-date=2025-11-19T00:11:48Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, 
managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, tcib_managed=true, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, container_name=ceilometer_agent_compute, maintainer=OpenStack TripleO Team, release=1761123044, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.component=openstack-ceilometer-compute-container, distribution-scope=public, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute) Dec 15 03:25:11 localhost podman[79926]: 2025-12-15 08:25:11.883639968 +0000 UTC m=+0.205926532 container health_status 97115ff7c5a84ae7e17f79e696e94da3c63f9fe0051287fe9193bec282a03f99 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, container_name=ceilometer_agent_ipmi, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, release=1761123044, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'fcee5a4a91f85471fca7b61211375646'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, konflux.additional-tags=17.1.12 17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, version=17.1.12, vcs-type=git, build-date=2025-11-19T00:12:45Z, url=https://www.redhat.com, batch=17.1_20251118.1, managed_by=tripleo_ansible, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-ceilometer-ipmi, com.redhat.component=openstack-ceilometer-ipmi-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_id=tripleo_step4, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, architecture=x86_64, tcib_managed=true, io.openshift.expose-services=) Dec 15 03:25:11 localhost podman[79938]: 2025-12-15 08:25:11.901366332 +0000 UTC m=+0.214834931 container exec_died d6041e5471624a495d5be42684f6049ccf71802a80d5760239330151d465d146 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, config_id=tripleo_step4, build-date=2025-11-19T00:11:48Z, url=https://www.redhat.com, konflux.additional-tags=17.1.12 17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, 
com.redhat.component=openstack-ceilometer-compute-container, tcib_managed=true, vcs-type=git, maintainer=OpenStack TripleO Team, release=1761123044, io.buildah.version=1.41.4, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.expose-services=, architecture=x86_64, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, version=17.1.12, name=rhosp17/openstack-ceilometer-compute, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, container_name=ceilometer_agent_compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'fcee5a4a91f85471fca7b61211375646'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, batch=17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute) Dec 15 03:25:11 localhost systemd[1]: d6041e5471624a495d5be42684f6049ccf71802a80d5760239330151d465d146.service: Deactivated successfully. Dec 15 03:25:11 localhost podman[79926]: 2025-12-15 08:25:11.940458276 +0000 UTC m=+0.262744810 container exec_died 97115ff7c5a84ae7e17f79e696e94da3c63f9fe0051287fe9193bec282a03f99 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, name=rhosp17/openstack-ceilometer-ipmi, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, vcs-type=git, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, konflux.additional-tags=17.1.12 17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, container_name=ceilometer_agent_ipmi, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, config_id=tripleo_step4, build-date=2025-11-19T00:12:45Z, distribution-scope=public, maintainer=OpenStack TripleO Team, batch=17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, version=17.1.12, release=1761123044, url=https://www.redhat.com, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, tcib_managed=true, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.component=openstack-ceilometer-ipmi-container, architecture=x86_64, managed_by=tripleo_ansible, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'fcee5a4a91f85471fca7b61211375646'}, 'healthcheck': {'test': 
'/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, vendor=Red Hat, Inc.) Dec 15 03:25:11 localhost systemd[1]: 97115ff7c5a84ae7e17f79e696e94da3c63f9fe0051287fe9193bec282a03f99.service: Deactivated successfully. 
Dec 15 03:25:11 localhost podman[79925]: 2025-12-15 08:25:11.977889126 +0000 UTC m=+0.303341044 container health_status 36a88083ed613f6871d2631ae462bca308a45ba583333f7cfe4d0b0c40a18ba5 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.openshift.expose-services=, container_name=nova_compute, vendor=Red Hat, Inc., release=1761123044, tcib_managed=true, com.redhat.component=openstack-nova-compute-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp17/openstack-nova-compute, build-date=2025-11-19T00:36:58Z, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, config_id=tripleo_step5, io.buildah.version=1.41.4, distribution-scope=public, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '182e509007ab5e6e5b2500a552cbd5ba-879500e96bf8dfb93687004bd86f2317'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, version=17.1.12, konflux.additional-tags=17.1.12 17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 nova-compute) Dec 15 03:25:12 localhost podman[79925]: 2025-12-15 08:25:12.002637847 +0000 UTC m=+0.328089705 container exec_died 36a88083ed613f6871d2631ae462bca308a45ba583333f7cfe4d0b0c40a18ba5 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, container_name=nova_compute, build-date=2025-11-19T00:36:58Z, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.openshift.expose-services=, name=rhosp17/openstack-nova-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-type=git, vendor=Red Hat, Inc., config_id=tripleo_step5, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, 
url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, tcib_managed=true, batch=17.1_20251118.1, release=1761123044, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '182e509007ab5e6e5b2500a552cbd5ba-879500e96bf8dfb93687004bd86f2317'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', 
'/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, description=Red Hat OpenStack Platform 17.1 nova-compute, distribution-scope=public, architecture=x86_64, io.buildah.version=1.41.4, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, summary=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.component=openstack-nova-compute-container, konflux.additional-tags=17.1.12 17.1_20251118.1, version=17.1.12) Dec 15 03:25:12 localhost systemd[1]: 36a88083ed613f6871d2631ae462bca308a45ba583333f7cfe4d0b0c40a18ba5.service: Deactivated successfully. Dec 15 03:25:12 localhost podman[79930]: 2025-12-15 08:25:12.076598233 +0000 UTC m=+0.393489653 container health_status ac45612fab357b615b4684dbfa5bd000dd29eb1adfbaedb6db0802fc49e89549 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, vcs-type=git, io.buildah.version=1.41.4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, tcib_managed=true, version=17.1.12, config_id=tripleo_step4, io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, name=rhosp17/openstack-cron, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vendor=Red Hat, Inc., com.redhat.component=openstack-cron-container, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, release=1761123044, build-date=2025-11-18T22:49:32Z, batch=17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 cron, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 cron, architecture=x86_64, container_name=logrotate_crond, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron) Dec 15 03:25:12 localhost podman[79930]: 2025-12-15 08:25:12.083752885 +0000 UTC m=+0.400644325 container exec_died ac45612fab357b615b4684dbfa5bd000dd29eb1adfbaedb6db0802fc49e89549 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, com.redhat.component=openstack-cron-container, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, architecture=x86_64, version=17.1.12, maintainer=OpenStack TripleO Team, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_id=tripleo_step4, build-date=2025-11-18T22:49:32Z, vcs-type=git, distribution-scope=public, batch=17.1_20251118.1, io.buildah.version=1.41.4, tcib_managed=true, container_name=logrotate_crond, description=Red Hat OpenStack Platform 17.1 cron, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, summary=Red Hat OpenStack Platform 17.1 cron, managed_by=tripleo_ansible, io.openshift.expose-services=, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, konflux.additional-tags=17.1.12 17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, name=rhosp17/openstack-cron, url=https://www.redhat.com, release=1761123044, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI) Dec 15 03:25:12 localhost systemd[1]: ac45612fab357b615b4684dbfa5bd000dd29eb1adfbaedb6db0802fc49e89549.service: Deactivated successfully. 
Dec 15 03:25:13 localhost systemd[1]: Started /usr/bin/podman healthcheck run 4c3f871f4bdf287b1d3ce717956c1cf130617489217410eadf6fffcc440eb3b0. Dec 15 03:25:13 localhost podman[80039]: 2025-12-15 08:25:13.745224271 +0000 UTC m=+0.079390532 container health_status 4c3f871f4bdf287b1d3ce717956c1cf130617489217410eadf6fffcc440eb3b0 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '879500e96bf8dfb93687004bd86f2317'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, config_id=tripleo_step4, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.buildah.version=1.41.4, vcs-type=git, build-date=2025-11-19T00:36:58Z, release=1761123044, 
com.redhat.component=openstack-nova-compute-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.12, name=rhosp17/openstack-nova-compute, architecture=x86_64, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.12 17.1_20251118.1, url=https://www.redhat.com, vendor=Red Hat, Inc., container_name=nova_migration_target, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20251118.1, tcib_managed=true, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, distribution-scope=public) Dec 15 03:25:14 localhost podman[80039]: 2025-12-15 08:25:14.168061687 +0000 UTC m=+0.502227888 container exec_died 4c3f871f4bdf287b1d3ce717956c1cf130617489217410eadf6fffcc440eb3b0 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, config_id=tripleo_step4, container_name=nova_migration_target, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '879500e96bf8dfb93687004bd86f2317'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, release=1761123044, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, managed_by=tripleo_ansible, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1, version=17.1.12, summary=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, distribution-scope=public, io.buildah.version=1.41.4, build-date=2025-11-19T00:36:58Z, description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, name=rhosp17/openstack-nova-compute, com.redhat.component=openstack-nova-compute-container, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-type=git, io.openshift.expose-services=, architecture=x86_64) Dec 15 03:25:14 localhost systemd[1]: 4c3f871f4bdf287b1d3ce717956c1cf130617489217410eadf6fffcc440eb3b0.service: Deactivated successfully. Dec 15 03:25:14 localhost sshd[80062]: main: sshd: ssh-rsa algorithm is disabled Dec 15 03:25:14 localhost systemd-logind[763]: New session 34 of user zuul. Dec 15 03:25:14 localhost systemd[1]: Started Session 34 of User zuul. 
Dec 15 03:25:14 localhost python3[80081]: ansible-ansible.legacy.dnf Invoked with name=['systemd-container'] state=present allow_downgrade=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 allowerasing=False nobest=False use_backend=auto conf_file=None disable_excludes=None download_dir=None list=None releasever=None Dec 15 03:25:17 localhost systemd[1]: Started /usr/bin/podman healthcheck run 2df8d20c6ba32f38bd0e6519c9c77a4146b632f21c81197d8c319b223aad81f1. Dec 15 03:25:17 localhost systemd[1]: Started /usr/bin/podman healthcheck run 4d35b7dc3f3a4e3c60661e85d3511f50804e87244d74cab67af5c6e8c447d379. Dec 15 03:25:17 localhost podman[80084]: 2025-12-15 08:25:17.759533275 +0000 UTC m=+0.083223085 container health_status 4d35b7dc3f3a4e3c60661e85d3511f50804e87244d74cab67af5c6e8c447d379 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, build-date=2025-11-19T00:14:25Z, io.buildah.version=1.41.4, name=rhosp17/openstack-neutron-metadata-agent-ovn, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, container_name=ovn_metadata_agent, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'a56a6f14b467cd9064e40c03defa5ed7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, vendor=Red Hat, Inc., com.redhat.component=openstack-neutron-metadata-agent-ovn-container, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, tcib_managed=true, config_id=tripleo_step4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, release=1761123044, managed_by=tripleo_ansible, version=17.1.12, distribution-scope=public, vcs-type=git, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, konflux.additional-tags=17.1.12 17.1_20251118.1) Dec 15 03:25:17 localhost systemd[1]: 
tmp-crun.1K2dfl.mount: Deactivated successfully. Dec 15 03:25:17 localhost podman[80083]: 2025-12-15 08:25:17.817444291 +0000 UTC m=+0.139597389 container health_status 2df8d20c6ba32f38bd0e6519c9c77a4146b632f21c81197d8c319b223aad81f1 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 ovn-controller, release=1761123044, maintainer=OpenStack TripleO Team, build-date=2025-11-18T23:34:05Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.12 17.1_20251118.1, batch=17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.buildah.version=1.41.4, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 ovn-controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, version=17.1.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public, container_name=ovn_controller, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, managed_by=tripleo_ansible, 
io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, name=rhosp17/openstack-ovn-controller, io.openshift.expose-services=, config_id=tripleo_step4, com.redhat.component=openstack-ovn-controller-container, vcs-type=git, tcib_managed=true) Dec 15 03:25:17 localhost podman[80084]: 2025-12-15 08:25:17.834522697 +0000 UTC m=+0.158212487 container exec_died 4d35b7dc3f3a4e3c60661e85d3511f50804e87244d74cab67af5c6e8c447d379 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, build-date=2025-11-19T00:14:25Z, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.expose-services=, managed_by=tripleo_ansible, vendor=Red Hat, Inc., name=rhosp17/openstack-neutron-metadata-agent-ovn, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'a56a6f14b467cd9064e40c03defa5ed7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', 
'/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, url=https://www.redhat.com, konflux.additional-tags=17.1.12 17.1_20251118.1, container_name=ovn_metadata_agent, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, version=17.1.12, architecture=x86_64, config_id=tripleo_step4, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, tcib_managed=true, release=1761123044, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, batch=17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-type=git, distribution-scope=public, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn) Dec 15 03:25:17 localhost systemd[1]: 4d35b7dc3f3a4e3c60661e85d3511f50804e87244d74cab67af5c6e8c447d379.service: Deactivated successfully. 
Dec 15 03:25:17 localhost podman[80083]: 2025-12-15 08:25:17.87355621 +0000 UTC m=+0.195709318 container exec_died 2df8d20c6ba32f38bd0e6519c9c77a4146b632f21c81197d8c319b223aad81f1 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, batch=17.1_20251118.1, distribution-scope=public, vcs-type=git, description=Red Hat OpenStack Platform 17.1 ovn-controller, name=rhosp17/openstack-ovn-controller, vendor=Red Hat, Inc., build-date=2025-11-18T23:34:05Z, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, com.redhat.component=openstack-ovn-controller-container, version=17.1.12, release=1761123044, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, summary=Red Hat OpenStack Platform 17.1 ovn-controller, maintainer=OpenStack TripleO Team, config_id=tripleo_step4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, io.buildah.version=1.41.4, url=https://www.redhat.com, managed_by=tripleo_ansible, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, architecture=x86_64, container_name=ovn_controller, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, 
konflux.additional-tags=17.1.12 17.1_20251118.1, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05) Dec 15 03:25:17 localhost systemd[1]: 2df8d20c6ba32f38bd0e6519c9c77a4146b632f21c81197d8c319b223aad81f1.service: Deactivated successfully. Dec 15 03:25:26 localhost systemd[1]: Started /usr/bin/podman healthcheck run 6278d648aec9c9f12e0efaafa0d6ae5653eeb34459516699e562eeb9c8d23b3c. Dec 15 03:25:26 localhost systemd[1]: tmp-crun.8t9RW2.mount: Deactivated successfully. Dec 15 03:25:26 localhost podman[80131]: 2025-12-15 08:25:26.74966883 +0000 UTC m=+0.085047693 container health_status 6278d648aec9c9f12e0efaafa0d6ae5653eeb34459516699e562eeb9c8d23b3c (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-qdrouterd-container, summary=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, container_name=metrics_qdr, konflux.additional-tags=17.1.12 17.1_20251118.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '29d07da103bbd44a9ed3e29999314b03'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, release=1761123044, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-type=git, name=rhosp17/openstack-qdrouterd, architecture=x86_64, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, batch=17.1_20251118.1, distribution-scope=public, io.buildah.version=1.41.4, description=Red Hat OpenStack Platform 17.1 qdrouterd, build-date=2025-11-18T22:49:46Z, tcib_managed=true, version=17.1.12, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, config_id=tripleo_step1, io.openshift.expose-services=, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, managed_by=tripleo_ansible) Dec 15 03:25:26 localhost podman[80131]: 2025-12-15 08:25:26.972949795 +0000 UTC m=+0.308328628 container exec_died 6278d648aec9c9f12e0efaafa0d6ae5653eeb34459516699e562eeb9c8d23b3c (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, container_name=metrics_qdr, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, maintainer=OpenStack TripleO Team, version=17.1.12, batch=17.1_20251118.1, distribution-scope=public, com.redhat.component=openstack-qdrouterd-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, build-date=2025-11-18T22:49:46Z, konflux.additional-tags=17.1.12 17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.description=Red Hat OpenStack 
Platform 17.1 qdrouterd, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 qdrouterd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, release=1761123044, description=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, name=rhosp17/openstack-qdrouterd, vendor=Red Hat, Inc., vcs-type=git, tcib_managed=true, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, io.openshift.expose-services=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '29d07da103bbd44a9ed3e29999314b03'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, architecture=x86_64, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_id=tripleo_step1) Dec 15 03:25:26 localhost systemd[1]: 6278d648aec9c9f12e0efaafa0d6ae5653eeb34459516699e562eeb9c8d23b3c.service: Deactivated successfully. 
Dec 15 03:25:40 localhost python3[80176]: ansible-ansible.legacy.dnf Invoked with name=['sos'] state=latest allow_downgrade=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 allowerasing=False nobest=False use_backend=auto conf_file=None disable_excludes=None download_dir=None list=None releasever=None Dec 15 03:25:41 localhost systemd[1]: Started /usr/bin/podman healthcheck run 165a6fd1f2b629d16da0798ec1c9bac41d302d530a0ffa668b2315a52d6aa04e. Dec 15 03:25:41 localhost podman[80178]: 2025-12-15 08:25:41.748433801 +0000 UTC m=+0.078363653 container health_status 165a6fd1f2b629d16da0798ec1c9bac41d302d530a0ffa668b2315a52d6aa04e (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, konflux.additional-tags=17.1.12 17.1_20251118.1, name=rhosp17/openstack-collectd, io.buildah.version=1.41.4, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., version=17.1.12, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, summary=Red Hat OpenStack Platform 17.1 collectd, container_name=collectd, url=https://www.redhat.com, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_id=tripleo_step3, build-date=2025-11-18T22:51:28Z, vcs-type=git, release=1761123044, com.redhat.component=openstack-collectd-container, description=Red Hat OpenStack Platform 17.1 
collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.expose-services=, managed_by=tripleo_ansible, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, distribution-scope=public, architecture=x86_64, batch=17.1_20251118.1) Dec 15 03:25:41 localhost podman[80178]: 2025-12-15 08:25:41.760311838 +0000 UTC m=+0.090241720 container exec_died 165a6fd1f2b629d16da0798ec1c9bac41d302d530a0ffa668b2315a52d6aa04e (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, config_id=tripleo_step3, vendor=Red Hat, Inc., org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, maintainer=OpenStack TripleO Team, 
config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 collectd, vcs-type=git, com.redhat.component=openstack-collectd-container, container_name=collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, build-date=2025-11-18T22:51:28Z, konflux.additional-tags=17.1.12 17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1761123044, version=17.1.12, distribution-scope=public, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, batch=17.1_20251118.1, 
io.openshift.expose-services=, architecture=x86_64, name=rhosp17/openstack-collectd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, summary=Red Hat OpenStack Platform 17.1 collectd, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, tcib_managed=true, io.buildah.version=1.41.4) Dec 15 03:25:41 localhost systemd[1]: 165a6fd1f2b629d16da0798ec1c9bac41d302d530a0ffa668b2315a52d6aa04e.service: Deactivated successfully. Dec 15 03:25:42 localhost systemd[1]: Started /usr/bin/podman healthcheck run 2649c3aec9af2f04509e1d248b1050b202fccc3ed2b86421c89a2a65376db057. Dec 15 03:25:42 localhost systemd[1]: Started /usr/bin/podman healthcheck run 36a88083ed613f6871d2631ae462bca308a45ba583333f7cfe4d0b0c40a18ba5. Dec 15 03:25:42 localhost systemd[1]: Started /usr/bin/podman healthcheck run 97115ff7c5a84ae7e17f79e696e94da3c63f9fe0051287fe9193bec282a03f99. Dec 15 03:25:42 localhost systemd[1]: Started /usr/bin/podman healthcheck run ac45612fab357b615b4684dbfa5bd000dd29eb1adfbaedb6db0802fc49e89549. Dec 15 03:25:42 localhost systemd[1]: Started /usr/bin/podman healthcheck run d6041e5471624a495d5be42684f6049ccf71802a80d5760239330151d465d146. 
Dec 15 03:25:42 localhost podman[80198]: 2025-12-15 08:25:42.77051363 +0000 UTC m=+0.096271311 container health_status 2649c3aec9af2f04509e1d248b1050b202fccc3ed2b86421c89a2a65376db057 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, com.redhat.component=openstack-iscsid-container, summary=Red Hat OpenStack Platform 17.1 iscsid, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, batch=17.1_20251118.1, release=1761123044, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 17.1_20251118.1, url=https://www.redhat.com, architecture=x86_64, io.buildah.version=1.41.4, build-date=2025-11-18T23:44:13Z, vendor=Red Hat, Inc., config_id=tripleo_step3, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, io.openshift.expose-services=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '182e509007ab5e6e5b2500a552cbd5ba'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', 
'/var/lib/iscsi:/var/lib/iscsi:z']}, managed_by=tripleo_ansible, version=17.1.12, tcib_managed=true, container_name=iscsid, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, description=Red Hat OpenStack Platform 17.1 iscsid, name=rhosp17/openstack-iscsid, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, vcs-type=git, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05) Dec 15 03:25:42 localhost podman[80198]: 2025-12-15 08:25:42.782808888 +0000 UTC m=+0.108566609 container exec_died 2649c3aec9af2f04509e1d248b1050b202fccc3ed2b86421c89a2a65376db057 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, tcib_managed=true, architecture=x86_64, name=rhosp17/openstack-iscsid, io.openshift.expose-services=, url=https://www.redhat.com, io.buildah.version=1.41.4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step3, konflux.additional-tags=17.1.12 17.1_20251118.1, build-date=2025-11-18T23:44:13Z, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '182e509007ab5e6e5b2500a552cbd5ba'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, batch=17.1_20251118.1, container_name=iscsid, com.redhat.component=openstack-iscsid-container, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, description=Red Hat OpenStack Platform 17.1 iscsid, maintainer=OpenStack TripleO Team, version=17.1.12, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 iscsid, vendor=Red Hat, Inc., managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, release=1761123044, vcs-type=git, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream) Dec 15 03:25:42 localhost systemd[1]: tmp-crun.reALH1.mount: Deactivated successfully. Dec 15 03:25:42 localhost systemd[1]: 2649c3aec9af2f04509e1d248b1050b202fccc3ed2b86421c89a2a65376db057.service: Deactivated successfully. 
Dec 15 03:25:42 localhost podman[80199]: 2025-12-15 08:25:42.832703499 +0000 UTC m=+0.154783661 container health_status 36a88083ed613f6871d2631ae462bca308a45ba583333f7cfe4d0b0c40a18ba5 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, vcs-type=git, managed_by=tripleo_ansible, batch=17.1_20251118.1, release=1761123044, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, name=rhosp17/openstack-nova-compute, build-date=2025-11-19T00:36:58Z, io.openshift.expose-services=, version=17.1.12, config_id=tripleo_step5, description=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 nova-compute, container_name=nova_compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.4, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '182e509007ab5e6e5b2500a552cbd5ba-879500e96bf8dfb93687004bd86f2317'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, distribution-scope=public, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.component=openstack-nova-compute-container) Dec 15 03:25:42 localhost podman[80206]: 2025-12-15 08:25:42.794356736 +0000 UTC m=+0.106797552 container health_status ac45612fab357b615b4684dbfa5bd000dd29eb1adfbaedb6db0802fc49e89549 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, managed_by=tripleo_ansible, batch=17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, com.redhat.component=openstack-cron-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, name=rhosp17/openstack-cron, release=1761123044, config_id=tripleo_step4, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, version=17.1.12, 
build-date=2025-11-18T22:49:32Z, architecture=x86_64, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, summary=Red Hat OpenStack Platform 17.1 cron, vendor=Red Hat, Inc., io.buildah.version=1.41.4, maintainer=OpenStack TripleO Team, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 cron, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, container_name=logrotate_crond, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, url=https://www.redhat.com) Dec 15 03:25:42 localhost podman[80200]: 2025-12-15 08:25:42.867081457 +0000 UTC m=+0.184047033 container health_status 
97115ff7c5a84ae7e17f79e696e94da3c63f9fe0051287fe9193bec282a03f99 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, tcib_managed=true, vendor=Red Hat, Inc., url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, release=1761123044, vcs-type=git, version=17.1.12, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.expose-services=, managed_by=tripleo_ansible, com.redhat.component=openstack-ceilometer-ipmi-container, konflux.additional-tags=17.1.12 17.1_20251118.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'fcee5a4a91f85471fca7b61211375646'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.buildah.version=1.41.4, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, batch=17.1_20251118.1, maintainer=OpenStack TripleO Team, container_name=ceilometer_agent_ipmi, name=rhosp17/openstack-ceilometer-ipmi, build-date=2025-11-19T00:12:45Z) Dec 15 03:25:42 localhost podman[80199]: 2025-12-15 08:25:42.884383578 +0000 UTC m=+0.206463760 container exec_died 36a88083ed613f6871d2631ae462bca308a45ba583333f7cfe4d0b0c40a18ba5 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=nova_compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, name=rhosp17/openstack-nova-compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, com.redhat.component=openstack-nova-compute-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-type=git, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '182e509007ab5e6e5b2500a552cbd5ba-879500e96bf8dfb93687004bd86f2317'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', 
'/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, io.openshift.expose-services=, config_id=tripleo_step5, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, release=1761123044, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.41.4, architecture=x86_64, distribution-scope=public, version=17.1.12, tcib_managed=true, url=https://www.redhat.com, batch=17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, build-date=2025-11-19T00:36:58Z) Dec 15 03:25:42 localhost podman[80200]: 2025-12-15 08:25:42.892644619 +0000 UTC m=+0.209610175 container exec_died 97115ff7c5a84ae7e17f79e696e94da3c63f9fe0051287fe9193bec282a03f99 
(image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, batch=17.1_20251118.1, container_name=ceilometer_agent_ipmi, distribution-scope=public, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, vcs-type=git, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'fcee5a4a91f85471fca7b61211375646'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, com.redhat.component=openstack-ceilometer-ipmi-container, version=17.1.12, release=1761123044, name=rhosp17/openstack-ceilometer-ipmi, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, url=https://www.redhat.com, 
io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, build-date=2025-11-19T00:12:45Z, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, managed_by=tripleo_ansible, architecture=x86_64, config_id=tripleo_step4, io.buildah.version=1.41.4, tcib_managed=true, io.openshift.expose-services=) Dec 15 03:25:42 localhost systemd[1]: 36a88083ed613f6871d2631ae462bca308a45ba583333f7cfe4d0b0c40a18ba5.service: Deactivated successfully. Dec 15 03:25:42 localhost systemd[1]: 97115ff7c5a84ae7e17f79e696e94da3c63f9fe0051287fe9193bec282a03f99.service: Deactivated successfully. Dec 15 03:25:42 localhost podman[80206]: 2025-12-15 08:25:42.974651858 +0000 UTC m=+0.287092674 container exec_died ac45612fab357b615b4684dbfa5bd000dd29eb1adfbaedb6db0802fc49e89549 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.12 17.1_20251118.1, release=1761123044, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, architecture=x86_64, batch=17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 cron, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, container_name=logrotate_crond, io.openshift.expose-services=, vcs-type=git, config_id=tripleo_step4, url=https://www.redhat.com, io.buildah.version=1.41.4, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 
'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, name=rhosp17/openstack-cron, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, version=17.1.12, com.redhat.component=openstack-cron-container, distribution-scope=public, build-date=2025-11-18T22:49:32Z, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 cron, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron) Dec 15 03:25:42 localhost systemd[1]: ac45612fab357b615b4684dbfa5bd000dd29eb1adfbaedb6db0802fc49e89549.service: Deactivated successfully. 
Dec 15 03:25:43 localhost podman[80213]: 2025-12-15 08:25:43.034658619 +0000 UTC m=+0.343587141 container health_status d6041e5471624a495d5be42684f6049ccf71802a80d5760239330151d465d146 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, url=https://www.redhat.com, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'fcee5a4a91f85471fca7b61211375646'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-ceilometer-compute-container, build-date=2025-11-19T00:11:48Z, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-type=git, container_name=ceilometer_agent_compute, io.buildah.version=1.41.4, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_id=tripleo_step4, release=1761123044, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 
ceilometer-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.expose-services=, version=17.1.12, batch=17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, distribution-scope=public, managed_by=tripleo_ansible, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, name=rhosp17/openstack-ceilometer-compute, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc.) Dec 15 03:25:43 localhost podman[80213]: 2025-12-15 08:25:43.062293427 +0000 UTC m=+0.371221949 container exec_died d6041e5471624a495d5be42684f6049ccf71802a80d5760239330151d465d146 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, build-date=2025-11-19T00:11:48Z, version=17.1.12, vendor=Red Hat, Inc., architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, com.redhat.component=openstack-ceilometer-compute-container, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, config_id=tripleo_step4, io.buildah.version=1.41.4, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, 
url=https://www.redhat.com, konflux.additional-tags=17.1.12 17.1_20251118.1, container_name=ceilometer_agent_compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'fcee5a4a91f85471fca7b61211375646'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, batch=17.1_20251118.1, name=rhosp17/openstack-ceilometer-compute, vcs-type=git, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, release=1761123044) Dec 15 03:25:43 localhost systemd[1]: d6041e5471624a495d5be42684f6049ccf71802a80d5760239330151d465d146.service: Deactivated successfully. Dec 15 03:25:44 localhost systemd[1]: Started /usr/bin/podman healthcheck run 4c3f871f4bdf287b1d3ce717956c1cf130617489217410eadf6fffcc440eb3b0. Dec 15 03:25:44 localhost systemd[1]: Starting Check and recover tripleo_nova_virtqemud... Dec 15 03:25:44 localhost systemd[1]: Started /usr/bin/systemctl start man-db-cache-update. Dec 15 03:25:44 localhost systemd[1]: Starting man-db-cache-update.service... 
Dec 15 03:25:44 localhost recover_tripleo_nova_virtqemud[80323]: 61849 Dec 15 03:25:44 localhost systemd[1]: tripleo_nova_virtqemud_recover.service: Deactivated successfully. Dec 15 03:25:44 localhost systemd[1]: Finished Check and recover tripleo_nova_virtqemud. Dec 15 03:25:44 localhost systemd[1]: Started /usr/bin/systemctl start man-db-cache-update. Dec 15 03:25:44 localhost podman[80316]: 2025-12-15 08:25:44.41125142 +0000 UTC m=+0.101780478 container health_status 4c3f871f4bdf287b1d3ce717956c1cf130617489217410eadf6fffcc440eb3b0 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.expose-services=, vendor=Red Hat, Inc., io.buildah.version=1.41.4, name=rhosp17/openstack-nova-compute, config_id=tripleo_step4, managed_by=tripleo_ansible, version=17.1.12, batch=17.1_20251118.1, vcs-type=git, konflux.additional-tags=17.1.12 17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, architecture=x86_64, build-date=2025-11-19T00:36:58Z, container_name=nova_migration_target, release=1761123044, com.redhat.component=openstack-nova-compute-container, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '879500e96bf8dfb93687004bd86f2317'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team) Dec 15 03:25:44 localhost systemd[1]: man-db-cache-update.service: Deactivated successfully. Dec 15 03:25:44 localhost systemd[1]: Finished man-db-cache-update.service. Dec 15 03:25:44 localhost systemd[1]: run-ra1e5fa6889344dedb65f39d542e42f78.service: Deactivated successfully. Dec 15 03:25:44 localhost systemd[1]: run-r3d24c1871c274735845d264553d9782f.service: Deactivated successfully. 
Dec 15 03:25:44 localhost podman[80316]: 2025-12-15 08:25:44.839871739 +0000 UTC m=+0.530400727 container exec_died 4c3f871f4bdf287b1d3ce717956c1cf130617489217410eadf6fffcc440eb3b0 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, version=17.1.12, com.redhat.component=openstack-nova-compute-container, name=rhosp17/openstack-nova-compute, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '879500e96bf8dfb93687004bd86f2317'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, 
org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, description=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1, architecture=x86_64, vcs-type=git, build-date=2025-11-19T00:36:58Z, release=1761123044, io.openshift.expose-services=, io.buildah.version=1.41.4, config_id=tripleo_step4, distribution-scope=public, container_name=nova_migration_target) Dec 15 03:25:44 localhost systemd[1]: 4c3f871f4bdf287b1d3ce717956c1cf130617489217410eadf6fffcc440eb3b0.service: Deactivated successfully. Dec 15 03:25:48 localhost systemd[1]: Started /usr/bin/podman healthcheck run 2df8d20c6ba32f38bd0e6519c9c77a4146b632f21c81197d8c319b223aad81f1. Dec 15 03:25:48 localhost systemd[1]: Started /usr/bin/podman healthcheck run 4d35b7dc3f3a4e3c60661e85d3511f50804e87244d74cab67af5c6e8c447d379. Dec 15 03:25:48 localhost systemd[1]: tmp-crun.BSSwI2.mount: Deactivated successfully. 
Dec 15 03:25:48 localhost podman[80481]: 2025-12-15 08:25:48.761488014 +0000 UTC m=+0.088885064 container health_status 2df8d20c6ba32f38bd0e6519c9c77a4146b632f21c81197d8c319b223aad81f1 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, tcib_managed=true, batch=17.1_20251118.1, config_id=tripleo_step4, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, vendor=Red Hat, Inc., version=17.1.12, summary=Red Hat OpenStack Platform 17.1 ovn-controller, io.buildah.version=1.41.4, vcs-type=git, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 ovn-controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, release=1761123044, maintainer=OpenStack TripleO Team, build-date=2025-11-18T23:34:05Z, konflux.additional-tags=17.1.12 17.1_20251118.1, name=rhosp17/openstack-ovn-controller, distribution-scope=public, managed_by=tripleo_ansible, com.redhat.component=openstack-ovn-controller-container, url=https://www.redhat.com, 
container_name=ovn_controller, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller) Dec 15 03:25:48 localhost podman[80482]: 2025-12-15 08:25:48.804746809 +0000 UTC m=+0.130618848 container health_status 4d35b7dc3f3a4e3c60661e85d3511f50804e87244d74cab67af5c6e8c447d379 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, io.buildah.version=1.41.4, konflux.additional-tags=17.1.12 17.1_20251118.1, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'a56a6f14b467cd9064e40c03defa5ed7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', 
'/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, distribution-scope=public, config_id=tripleo_step4, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.expose-services=, build-date=2025-11-19T00:14:25Z, version=17.1.12, tcib_managed=true, release=1761123044, vcs-type=git, managed_by=tripleo_ansible, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, container_name=ovn_metadata_agent, name=rhosp17/openstack-neutron-metadata-agent-ovn, vendor=Red Hat, Inc., architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20251118.1, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c) Dec 15 03:25:48 localhost podman[80481]: 2025-12-15 08:25:48.809217558 +0000 UTC m=+0.136614598 container exec_died 2df8d20c6ba32f38bd0e6519c9c77a4146b632f21c81197d8c319b223aad81f1 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, description=Red Hat OpenStack Platform 17.1 ovn-controller, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 
'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, container_name=ovn_controller, name=rhosp17/openstack-ovn-controller, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, summary=Red Hat OpenStack Platform 17.1 ovn-controller, build-date=2025-11-18T23:34:05Z, architecture=x86_64, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, distribution-scope=public, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, version=17.1.12, vendor=Red Hat, Inc., io.buildah.version=1.41.4, maintainer=OpenStack TripleO Team, batch=17.1_20251118.1, url=https://www.redhat.com, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-type=git, release=1761123044, config_id=tripleo_step4, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, managed_by=tripleo_ansible, com.redhat.component=openstack-ovn-controller-container, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI) Dec 15 03:25:48 localhost systemd[1]: 2df8d20c6ba32f38bd0e6519c9c77a4146b632f21c81197d8c319b223aad81f1.service: Deactivated successfully. 
Dec 15 03:25:48 localhost podman[80482]: 2025-12-15 08:25:48.872123478 +0000 UTC m=+0.197995457 container exec_died 4d35b7dc3f3a4e3c60661e85d3511f50804e87244d74cab67af5c6e8c447d379 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, name=rhosp17/openstack-neutron-metadata-agent-ovn, konflux.additional-tags=17.1.12 17.1_20251118.1, version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, batch=17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, release=1761123044, distribution-scope=public, vcs-type=git, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, managed_by=tripleo_ansible, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'a56a6f14b467cd9064e40c03defa5ed7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, container_name=ovn_metadata_agent, tcib_managed=true, io.buildah.version=1.41.4, build-date=2025-11-19T00:14:25Z, io.openshift.expose-services=, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn) Dec 15 03:25:48 localhost systemd[1]: 4d35b7dc3f3a4e3c60661e85d3511f50804e87244d74cab67af5c6e8c447d379.service: Deactivated successfully. Dec 15 03:25:57 localhost systemd[1]: Started /usr/bin/podman healthcheck run 6278d648aec9c9f12e0efaafa0d6ae5653eeb34459516699e562eeb9c8d23b3c. 
Dec 15 03:25:57 localhost podman[80529]: 2025-12-15 08:25:57.748960416 +0000 UTC m=+0.084204029 container health_status 6278d648aec9c9f12e0efaafa0d6ae5653eeb34459516699e562eeb9c8d23b3c (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, io.openshift.expose-services=, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step1, distribution-scope=public, version=17.1.12, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, release=1761123044, vcs-type=git, com.redhat.component=openstack-qdrouterd-container, vendor=Red Hat, Inc., build-date=2025-11-18T22:49:46Z, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 qdrouterd, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, summary=Red Hat OpenStack Platform 17.1 qdrouterd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, name=rhosp17/openstack-qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '29d07da103bbd44a9ed3e29999314b03'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, io.buildah.version=1.41.4, batch=17.1_20251118.1, container_name=metrics_qdr) Dec 15 03:25:57 localhost podman[80529]: 2025-12-15 08:25:57.94435489 +0000 UTC m=+0.279598503 container exec_died 6278d648aec9c9f12e0efaafa0d6ae5653eeb34459516699e562eeb9c8d23b3c (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, name=rhosp17/openstack-qdrouterd, version=17.1.12, url=https://www.redhat.com, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vcs-type=git, managed_by=tripleo_ansible, io.openshift.expose-services=, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 qdrouterd, description=Red Hat OpenStack Platform 17.1 qdrouterd, build-date=2025-11-18T22:49:46Z, batch=17.1_20251118.1, container_name=metrics_qdr, konflux.additional-tags=17.1.12 17.1_20251118.1, release=1761123044, com.redhat.component=openstack-qdrouterd-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '29d07da103bbd44a9ed3e29999314b03'}, 'healthcheck': {'test': 
'/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, config_id=tripleo_step1, io.buildah.version=1.41.4, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05) Dec 15 03:25:57 localhost systemd[1]: 6278d648aec9c9f12e0efaafa0d6ae5653eeb34459516699e562eeb9c8d23b3c.service: Deactivated successfully. Dec 15 03:26:07 localhost sshd[80672]: main: sshd: ssh-rsa algorithm is disabled Dec 15 03:26:12 localhost systemd[1]: Started /usr/bin/podman healthcheck run 165a6fd1f2b629d16da0798ec1c9bac41d302d530a0ffa668b2315a52d6aa04e. Dec 15 03:26:12 localhost systemd[1]: tmp-crun.jhWhI3.mount: Deactivated successfully. 
Dec 15 03:26:12 localhost podman[80734]: 2025-12-15 08:26:12.762695943 +0000 UTC m=+0.093867486 container health_status 165a6fd1f2b629d16da0798ec1c9bac41d302d530a0ffa668b2315a52d6aa04e (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, batch=17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 collectd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.buildah.version=1.41.4, vcs-type=git, url=https://www.redhat.com, distribution-scope=public, name=rhosp17/openstack-collectd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', 
'/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, container_name=collectd, maintainer=OpenStack TripleO Team, release=1761123044, io.openshift.expose-services=, build-date=2025-11-18T22:51:28Z, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.component=openstack-collectd-container, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 collectd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, version=17.1.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, architecture=x86_64, config_id=tripleo_step3, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd) Dec 15 03:26:12 localhost podman[80734]: 2025-12-15 08:26:12.777317923 +0000 UTC m=+0.108489466 container exec_died 165a6fd1f2b629d16da0798ec1c9bac41d302d530a0ffa668b2315a52d6aa04e (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, container_name=collectd, architecture=x86_64, tcib_managed=true, url=https://www.redhat.com, vcs-type=git, com.redhat.component=openstack-collectd-container, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, build-date=2025-11-18T22:51:28Z, description=Red Hat OpenStack Platform 17.1 collectd, config_id=tripleo_step3, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, version=17.1.12, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1, release=1761123044, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1, distribution-scope=public, managed_by=tripleo_ansible, name=rhosp17/openstack-collectd, vendor=Red Hat, Inc., io.buildah.version=1.41.4, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, summary=Red Hat OpenStack Platform 17.1 collectd) Dec 15 03:26:12 localhost systemd[1]: 165a6fd1f2b629d16da0798ec1c9bac41d302d530a0ffa668b2315a52d6aa04e.service: Deactivated successfully. Dec 15 03:26:13 localhost systemd[1]: Started /usr/bin/podman healthcheck run 2649c3aec9af2f04509e1d248b1050b202fccc3ed2b86421c89a2a65376db057. 
Dec 15 03:26:13 localhost systemd[1]: Started /usr/bin/podman healthcheck run 36a88083ed613f6871d2631ae462bca308a45ba583333f7cfe4d0b0c40a18ba5. Dec 15 03:26:13 localhost systemd[1]: Started /usr/bin/podman healthcheck run 97115ff7c5a84ae7e17f79e696e94da3c63f9fe0051287fe9193bec282a03f99. Dec 15 03:26:13 localhost systemd[1]: Started /usr/bin/podman healthcheck run ac45612fab357b615b4684dbfa5bd000dd29eb1adfbaedb6db0802fc49e89549. Dec 15 03:26:13 localhost systemd[1]: Started /usr/bin/podman healthcheck run d6041e5471624a495d5be42684f6049ccf71802a80d5760239330151d465d146. Dec 15 03:26:13 localhost systemd[1]: tmp-crun.TVJosj.mount: Deactivated successfully. Dec 15 03:26:13 localhost podman[80764]: 2025-12-15 08:26:13.77533985 +0000 UTC m=+0.088843182 container health_status d6041e5471624a495d5be42684f6049ccf71802a80d5760239330151d465d146 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, vcs-type=git, architecture=x86_64, batch=17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vendor=Red Hat, Inc., release=1761123044, io.buildah.version=1.41.4, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, build-date=2025-11-19T00:11:48Z, managed_by=tripleo_ansible, tcib_managed=true, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, name=rhosp17/openstack-ceilometer-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'fcee5a4a91f85471fca7b61211375646'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 
'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, url=https://www.redhat.com, konflux.additional-tags=17.1.12 17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, com.redhat.component=openstack-ceilometer-compute-container, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, config_id=tripleo_step4, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, container_name=ceilometer_agent_compute, version=17.1.12) Dec 15 03:26:13 localhost podman[80764]: 2025-12-15 08:26:13.802905716 +0000 UTC m=+0.116409058 container exec_died d6041e5471624a495d5be42684f6049ccf71802a80d5760239330151d465d146 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, managed_by=tripleo_ansible, tcib_managed=true, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'fcee5a4a91f85471fca7b61211375646'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, config_id=tripleo_step4, architecture=x86_64, com.redhat.component=openstack-ceilometer-compute-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, version=17.1.12, release=1761123044, io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, batch=17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, name=rhosp17/openstack-ceilometer-compute, io.buildah.version=1.41.4, vcs-type=git, 
build-date=2025-11-19T00:11:48Z, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, container_name=ceilometer_agent_compute) Dec 15 03:26:13 localhost systemd[1]: d6041e5471624a495d5be42684f6049ccf71802a80d5760239330151d465d146.service: Deactivated successfully. Dec 15 03:26:13 localhost podman[80756]: 2025-12-15 08:26:13.853848665 +0000 UTC m=+0.173356617 container health_status 97115ff7c5a84ae7e17f79e696e94da3c63f9fe0051287fe9193bec282a03f99 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, com.redhat.component=openstack-ceilometer-ipmi-container, io.openshift.expose-services=, vcs-type=git, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, managed_by=tripleo_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'fcee5a4a91f85471fca7b61211375646'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', 
'/var/log/containers/ceilometer:/var/log/ceilometer:z']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, config_id=tripleo_step4, build-date=2025-11-19T00:12:45Z, vendor=Red Hat, Inc., tcib_managed=true, io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, url=https://www.redhat.com, version=17.1.12, distribution-scope=public, architecture=x86_64, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, name=rhosp17/openstack-ceilometer-ipmi, container_name=ceilometer_agent_ipmi, maintainer=OpenStack TripleO Team, release=1761123044, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi) Dec 15 03:26:13 localhost podman[80754]: 2025-12-15 08:26:13.755889682 +0000 UTC m=+0.084313952 container health_status 2649c3aec9af2f04509e1d248b1050b202fccc3ed2b86421c89a2a65376db057 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, com.redhat.component=openstack-iscsid-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, build-date=2025-11-18T23:44:13Z, name=rhosp17/openstack-iscsid, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, container_name=iscsid, tcib_managed=true, architecture=x86_64, batch=17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 
'182e509007ab5e6e5b2500a552cbd5ba'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, description=Red Hat OpenStack Platform 17.1 iscsid, config_id=tripleo_step3, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.buildah.version=1.41.4, distribution-scope=public, vcs-type=git, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 iscsid, url=https://www.redhat.com, release=1761123044, maintainer=OpenStack TripleO Team, version=17.1.12) Dec 15 03:26:13 localhost podman[80756]: 2025-12-15 08:26:13.906274475 +0000 UTC m=+0.225782427 container exec_died 97115ff7c5a84ae7e17f79e696e94da3c63f9fe0051287fe9193bec282a03f99 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'fcee5a4a91f85471fca7b61211375646'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 
'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, konflux.additional-tags=17.1.12 17.1_20251118.1, io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.component=openstack-ceilometer-ipmi-container, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, io.openshift.expose-services=, release=1761123044, managed_by=tripleo_ansible, vcs-type=git, url=https://www.redhat.com, distribution-scope=public, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, tcib_managed=true, version=17.1.12, name=rhosp17/openstack-ceilometer-ipmi, config_id=tripleo_step4, container_name=ceilometer_agent_ipmi, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, 
batch=17.1_20251118.1, build-date=2025-11-19T00:12:45Z, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vendor=Red Hat, Inc.) Dec 15 03:26:13 localhost podman[80755]: 2025-12-15 08:26:13.807266633 +0000 UTC m=+0.133559296 container health_status 36a88083ed613f6871d2631ae462bca308a45ba583333f7cfe4d0b0c40a18ba5 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, io.openshift.expose-services=, container_name=nova_compute, tcib_managed=true, release=1761123044, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-11-19T00:36:58Z, com.redhat.component=openstack-nova-compute-container, name=rhosp17/openstack-nova-compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, config_id=tripleo_step5, version=17.1.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '182e509007ab5e6e5b2500a552cbd5ba-879500e96bf8dfb93687004bd86f2317'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', 
'/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, managed_by=tripleo_ansible, architecture=x86_64, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 nova-compute, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, description=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, distribution-scope=public, io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute) Dec 15 03:26:13 localhost podman[80757]: 2025-12-15 08:26:13.916335304 +0000 UTC m=+0.233455943 container health_status ac45612fab357b615b4684dbfa5bd000dd29eb1adfbaedb6db0802fc49e89549 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, summary=Red Hat OpenStack Platform 17.1 cron, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp17/openstack-cron, version=17.1.12, config_id=tripleo_step4, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, release=1761123044, url=https://www.redhat.com, managed_by=tripleo_ansible, architecture=x86_64, distribution-scope=public, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.4, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, com.redhat.component=openstack-cron-container, description=Red Hat OpenStack Platform 17.1 cron, build-date=2025-11-18T22:49:32Z, io.openshift.expose-services=, container_name=logrotate_crond, batch=17.1_20251118.1, tcib_managed=true, vendor=Red Hat, Inc.) 
Dec 15 03:26:13 localhost podman[80754]: 2025-12-15 08:26:13.939848341 +0000 UTC m=+0.268272631 container exec_died 2649c3aec9af2f04509e1d248b1050b202fccc3ed2b86421c89a2a65376db057 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1, vendor=Red Hat, Inc., architecture=x86_64, managed_by=tripleo_ansible, batch=17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, url=https://www.redhat.com, build-date=2025-11-18T23:44:13Z, tcib_managed=true, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 iscsid, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, version=17.1.12, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, config_id=tripleo_step3, io.buildah.version=1.41.4, description=Red Hat OpenStack Platform 17.1 iscsid, maintainer=OpenStack TripleO Team, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '182e509007ab5e6e5b2500a552cbd5ba'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', 
'/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, distribution-scope=public, release=1761123044, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, container_name=iscsid, com.redhat.component=openstack-iscsid-container, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, name=rhosp17/openstack-iscsid) Dec 15 03:26:13 localhost systemd[1]: 2649c3aec9af2f04509e1d248b1050b202fccc3ed2b86421c89a2a65376db057.service: Deactivated successfully. Dec 15 03:26:13 localhost podman[80757]: 2025-12-15 08:26:13.952283943 +0000 UTC m=+0.269404572 container exec_died ac45612fab357b615b4684dbfa5bd000dd29eb1adfbaedb6db0802fc49e89549 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, release=1761123044, version=17.1.12, description=Red Hat OpenStack Platform 17.1 cron, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-cron-container, konflux.additional-tags=17.1.12 17.1_20251118.1, url=https://www.redhat.com, vcs-type=git, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, tcib_managed=true, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, batch=17.1_20251118.1, container_name=logrotate_crond, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, build-date=2025-11-18T22:49:32Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, config_id=tripleo_step4, name=rhosp17/openstack-cron, io.buildah.version=1.41.4, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, summary=Red Hat OpenStack Platform 17.1 cron, distribution-scope=public, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, architecture=x86_64) Dec 15 03:26:13 localhost systemd[1]: ac45612fab357b615b4684dbfa5bd000dd29eb1adfbaedb6db0802fc49e89549.service: Deactivated successfully. Dec 15 03:26:13 localhost systemd[1]: 97115ff7c5a84ae7e17f79e696e94da3c63f9fe0051287fe9193bec282a03f99.service: Deactivated successfully. 
Dec 15 03:26:13 localhost podman[80755]: 2025-12-15 08:26:13.991030086 +0000 UTC m=+0.317322789 container exec_died 36a88083ed613f6871d2631ae462bca308a45ba583333f7cfe4d0b0c40a18ba5 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, io.buildah.version=1.41.4, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, distribution-scope=public, url=https://www.redhat.com, com.redhat.component=openstack-nova-compute-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, name=rhosp17/openstack-nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, release=1761123044, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '182e509007ab5e6e5b2500a552cbd5ba-879500e96bf8dfb93687004bd86f2317'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', 
'/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, version=17.1.12, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 nova-compute, config_id=tripleo_step5, vendor=Red Hat, Inc., managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, build-date=2025-11-19T00:36:58Z, architecture=x86_64, batch=17.1_20251118.1, io.openshift.expose-services=, tcib_managed=true, container_name=nova_compute) Dec 15 03:26:14 localhost systemd[1]: 36a88083ed613f6871d2631ae462bca308a45ba583333f7cfe4d0b0c40a18ba5.service: Deactivated successfully. Dec 15 03:26:15 localhost systemd[1]: Started /usr/bin/podman healthcheck run 4c3f871f4bdf287b1d3ce717956c1cf130617489217410eadf6fffcc440eb3b0. Dec 15 03:26:15 localhost systemd[1]: tmp-crun.Z9SP4u.mount: Deactivated successfully. 
Dec 15 03:26:15 localhost podman[80868]: 2025-12-15 08:26:15.768117636 +0000 UTC m=+0.100937255 container health_status 4c3f871f4bdf287b1d3ce717956c1cf130617489217410eadf6fffcc440eb3b0 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '879500e96bf8dfb93687004bd86f2317'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, config_id=tripleo_step4, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.12, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.41.4, batch=17.1_20251118.1, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-type=git, architecture=x86_64, tcib_managed=true, com.redhat.component=openstack-nova-compute-container, 
io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1761123044, container_name=nova_migration_target, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2025-11-19T00:36:58Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, vendor=Red Hat, Inc., org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, name=rhosp17/openstack-nova-compute) Dec 15 03:26:16 localhost podman[80868]: 2025-12-15 08:26:16.092246348 +0000 UTC m=+0.425065807 container exec_died 4c3f871f4bdf287b1d3ce717956c1cf130617489217410eadf6fffcc440eb3b0 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, konflux.additional-tags=17.1.12 17.1_20251118.1, release=1761123044, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, batch=17.1_20251118.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '879500e96bf8dfb93687004bd86f2317'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', 
'/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, version=17.1.12, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, name=rhosp17/openstack-nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, config_id=tripleo_step4, build-date=2025-11-19T00:36:58Z, container_name=nova_migration_target, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.component=openstack-nova-compute-container, vcs-type=git, distribution-scope=public, tcib_managed=true, io.buildah.version=1.41.4, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, url=https://www.redhat.com, vendor=Red Hat, Inc.) Dec 15 03:26:16 localhost systemd[1]: 4c3f871f4bdf287b1d3ce717956c1cf130617489217410eadf6fffcc440eb3b0.service: Deactivated successfully. Dec 15 03:26:19 localhost systemd[1]: Started /usr/bin/podman healthcheck run 2df8d20c6ba32f38bd0e6519c9c77a4146b632f21c81197d8c319b223aad81f1. Dec 15 03:26:19 localhost systemd[1]: Started /usr/bin/podman healthcheck run 4d35b7dc3f3a4e3c60661e85d3511f50804e87244d74cab67af5c6e8c447d379. 
Dec 15 03:26:19 localhost podman[80892]: 2025-12-15 08:26:19.758908966 +0000 UTC m=+0.082384030 container health_status 4d35b7dc3f3a4e3c60661e85d3511f50804e87244d74cab67af5c6e8c447d379 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, architecture=x86_64, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, container_name=ovn_metadata_agent, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'a56a6f14b467cd9064e40c03defa5ed7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, 
url=https://www.redhat.com, vcs-type=git, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20251118.1, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, tcib_managed=true, build-date=2025-11-19T00:14:25Z, config_id=tripleo_step4, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, name=rhosp17/openstack-neutron-metadata-agent-ovn, version=17.1.12, io.buildah.version=1.41.4, release=1761123044, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, managed_by=tripleo_ansible, distribution-scope=public) Dec 15 03:26:19 localhost systemd[1]: tmp-crun.Pg1M2y.mount: Deactivated successfully. 
Dec 15 03:26:19 localhost podman[80891]: 2025-12-15 08:26:19.81933794 +0000 UTC m=+0.145686351 container health_status 2df8d20c6ba32f38bd0e6519c9c77a4146b632f21c81197d8c319b223aad81f1 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, architecture=x86_64, config_id=tripleo_step4, batch=17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, name=rhosp17/openstack-ovn-controller, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, tcib_managed=true, com.redhat.component=openstack-ovn-controller-container, version=17.1.12, container_name=ovn_controller, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, distribution-scope=public, release=1761123044, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.expose-services=, 
build-date=2025-11-18T23:34:05Z, io.buildah.version=1.41.4, vcs-type=git, description=Red Hat OpenStack Platform 17.1 ovn-controller, summary=Red Hat OpenStack Platform 17.1 ovn-controller, url=https://www.redhat.com) Dec 15 03:26:19 localhost podman[80892]: 2025-12-15 08:26:19.832651086 +0000 UTC m=+0.156126150 container exec_died 4d35b7dc3f3a4e3c60661e85d3511f50804e87244d74cab67af5c6e8c447d379 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, distribution-scope=public, tcib_managed=true, name=rhosp17/openstack-neutron-metadata-agent-ovn, container_name=ovn_metadata_agent, version=17.1.12, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, architecture=x86_64, io.openshift.expose-services=, vendor=Red Hat, Inc., konflux.additional-tags=17.1.12 17.1_20251118.1, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'a56a6f14b467cd9064e40c03defa5ed7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, batch=17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, release=1761123044, maintainer=OpenStack TripleO Team, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, io.buildah.version=1.41.4, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_id=tripleo_step4, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, url=https://www.redhat.com, build-date=2025-11-19T00:14:25Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn) Dec 15 03:26:19 localhost systemd[1]: 4d35b7dc3f3a4e3c60661e85d3511f50804e87244d74cab67af5c6e8c447d379.service: Deactivated successfully. 
Dec 15 03:26:19 localhost podman[80891]: 2025-12-15 08:26:19.872468248 +0000 UTC m=+0.198816609 container exec_died 2df8d20c6ba32f38bd0e6519c9c77a4146b632f21c81197d8c319b223aad81f1 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, vcs-type=git, distribution-scope=public, managed_by=tripleo_ansible, com.redhat.component=openstack-ovn-controller-container, tcib_managed=true, name=rhosp17/openstack-ovn-controller, release=1761123044, build-date=2025-11-18T23:34:05Z, batch=17.1_20251118.1, config_id=tripleo_step4, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 ovn-controller, summary=Red Hat OpenStack Platform 17.1 ovn-controller, url=https://www.redhat.com, konflux.additional-tags=17.1.12 17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, version=17.1.12, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', 
'/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, io.buildah.version=1.41.4, container_name=ovn_controller, vendor=Red Hat, Inc.) Dec 15 03:26:19 localhost systemd[1]: 2df8d20c6ba32f38bd0e6519c9c77a4146b632f21c81197d8c319b223aad81f1.service: Deactivated successfully. Dec 15 03:26:21 localhost python3[80955]: ansible-ansible.legacy.command Invoked with _raw_params=subscription-manager repos --disable rhel-9-for-x86_64-baseos-eus-rpms --disable rhel-9-for-x86_64-appstream-eus-rpms --disable rhel-9-for-x86_64-highavailability-eus-rpms --disable openstack-17.1-for-rhel-9-x86_64-rpms --disable fast-datapath-for-rhel-9-x86_64-rpms _uses_shell=True zuul_log_id=in-loop-ignore zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None Dec 15 03:26:25 localhost rhsm-service[6578]: WARNING [subscription_manager.cert_sorter:194] Installed product 479 not present in response from server. Dec 15 03:26:25 localhost rhsm-service[6578]: WARNING [subscription_manager.cert_sorter:194] Installed product 479 not present in response from server. Dec 15 03:26:28 localhost systemd[1]: Started /usr/bin/podman healthcheck run 6278d648aec9c9f12e0efaafa0d6ae5653eeb34459516699e562eeb9c8d23b3c. 
Dec 15 03:26:28 localhost podman[81086]: 2025-12-15 08:26:28.748295868 +0000 UTC m=+0.078789654 container health_status 6278d648aec9c9f12e0efaafa0d6ae5653eeb34459516699e562eeb9c8d23b3c (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, version=17.1.12, release=1761123044, summary=Red Hat OpenStack Platform 17.1 qdrouterd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1, io.buildah.version=1.41.4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, tcib_managed=true, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 qdrouterd, container_name=metrics_qdr, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-qdrouterd-container, architecture=x86_64, name=rhosp17/openstack-qdrouterd, distribution-scope=public, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, konflux.additional-tags=17.1.12 17.1_20251118.1, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-11-18T22:49:46Z, vendor=Red Hat, Inc., config_id=tripleo_step1, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, url=https://www.redhat.com, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, vcs-type=git, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '29d07da103bbd44a9ed3e29999314b03'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}) Dec 15 03:26:28 localhost podman[81086]: 2025-12-15 08:26:28.946422246 +0000 UTC m=+0.276916052 container exec_died 6278d648aec9c9f12e0efaafa0d6ae5653eeb34459516699e562eeb9c8d23b3c (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, io.buildah.version=1.41.4, vendor=Red Hat, Inc., tcib_managed=true, maintainer=OpenStack TripleO Team, config_id=tripleo_step1, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, batch=17.1_20251118.1, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, managed_by=tripleo_ansible, release=1761123044, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, container_name=metrics_qdr, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, summary=Red Hat OpenStack Platform 17.1 qdrouterd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.component=openstack-qdrouterd-container, name=rhosp17/openstack-qdrouterd, version=17.1.12, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.expose-services=, build-date=2025-11-18T22:49:46Z, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 
'TRIPLEO_CONFIG_HASH': '29d07da103bbd44a9ed3e29999314b03'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, description=Red Hat OpenStack Platform 17.1 qdrouterd, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, distribution-scope=public) Dec 15 03:26:28 localhost systemd[1]: 6278d648aec9c9f12e0efaafa0d6ae5653eeb34459516699e562eeb9c8d23b3c.service: Deactivated successfully. Dec 15 03:26:43 localhost systemd[1]: Started /usr/bin/podman healthcheck run 165a6fd1f2b629d16da0798ec1c9bac41d302d530a0ffa668b2315a52d6aa04e. 
Dec 15 03:26:43 localhost podman[81174]: 2025-12-15 08:26:43.748786663 +0000 UTC m=+0.082231396 container health_status 165a6fd1f2b629d16da0798ec1c9bac41d302d530a0ffa668b2315a52d6aa04e (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, architecture=x86_64, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 collectd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, vcs-type=git, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, version=17.1.12, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_id=tripleo_step3, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, tcib_managed=true, konflux.additional-tags=17.1.12 17.1_20251118.1, batch=17.1_20251118.1, release=1761123044, url=https://www.redhat.com, io.buildah.version=1.41.4, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, build-date=2025-11-18T22:51:28Z, name=rhosp17/openstack-collectd, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, container_name=collectd, summary=Red Hat OpenStack Platform 17.1 collectd, com.redhat.component=openstack-collectd-container, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}) Dec 15 03:26:43 localhost podman[81174]: 2025-12-15 08:26:43.765313814 +0000 UTC m=+0.098758587 container exec_died 165a6fd1f2b629d16da0798ec1c9bac41d302d530a0ffa668b2315a52d6aa04e (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, tcib_managed=true, io.buildah.version=1.41.4, description=Red Hat OpenStack Platform 17.1 collectd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, summary=Red Hat OpenStack Platform 17.1 collectd, com.redhat.component=openstack-collectd-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, container_name=collectd, distribution-scope=public, vendor=Red Hat, Inc., url=https://www.redhat.com, build-date=2025-11-18T22:51:28Z, name=rhosp17/openstack-collectd, version=17.1.12, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, release=1761123044, maintainer=OpenStack TripleO Team, architecture=x86_64, io.openshift.expose-services=, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.tags=rhosp osp openstack 
osp-17.1 openstack-collectd, batch=17.1_20251118.1, managed_by=tripleo_ansible, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, config_id=tripleo_step3, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a) Dec 15 03:26:43 localhost systemd[1]: 165a6fd1f2b629d16da0798ec1c9bac41d302d530a0ffa668b2315a52d6aa04e.service: Deactivated successfully. Dec 15 03:26:44 localhost systemd[1]: Started /usr/bin/podman healthcheck run 2649c3aec9af2f04509e1d248b1050b202fccc3ed2b86421c89a2a65376db057. 
Dec 15 03:26:44 localhost systemd[1]: Started /usr/bin/podman healthcheck run 36a88083ed613f6871d2631ae462bca308a45ba583333f7cfe4d0b0c40a18ba5. Dec 15 03:26:44 localhost systemd[1]: Started /usr/bin/podman healthcheck run 97115ff7c5a84ae7e17f79e696e94da3c63f9fe0051287fe9193bec282a03f99. Dec 15 03:26:44 localhost systemd[1]: Started /usr/bin/podman healthcheck run ac45612fab357b615b4684dbfa5bd000dd29eb1adfbaedb6db0802fc49e89549. Dec 15 03:26:44 localhost systemd[1]: Started /usr/bin/podman healthcheck run d6041e5471624a495d5be42684f6049ccf71802a80d5760239330151d465d146. Dec 15 03:26:44 localhost podman[81196]: 2025-12-15 08:26:44.762107458 +0000 UTC m=+0.092025768 container health_status 2649c3aec9af2f04509e1d248b1050b202fccc3ed2b86421c89a2a65376db057 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, distribution-scope=public, release=1761123044, summary=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20251118.1, name=rhosp17/openstack-iscsid, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '182e509007ab5e6e5b2500a552cbd5ba'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, vendor=Red Hat, Inc., build-date=2025-11-18T23:44:13Z, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, description=Red Hat OpenStack Platform 17.1 iscsid, vcs-type=git, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, container_name=iscsid, konflux.additional-tags=17.1.12 17.1_20251118.1, tcib_managed=true, version=17.1.12, com.redhat.component=openstack-iscsid-container, config_id=tripleo_step3, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, io.buildah.version=1.41.4) Dec 15 03:26:44 localhost podman[81196]: 2025-12-15 08:26:44.770132452 +0000 UTC m=+0.100050752 container exec_died 2649c3aec9af2f04509e1d248b1050b202fccc3ed2b86421c89a2a65376db057 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, config_id=tripleo_step3, io.openshift.expose-services=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '182e509007ab5e6e5b2500a552cbd5ba'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible, version=17.1.12, com.redhat.component=openstack-iscsid-container, batch=17.1_20251118.1, container_name=iscsid, vcs-type=git, description=Red Hat OpenStack Platform 17.1 iscsid, release=1761123044, name=rhosp17/openstack-iscsid, vendor=Red Hat, Inc., architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 iscsid, url=https://www.redhat.com, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.buildah.version=1.41.4, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, build-date=2025-11-18T23:44:13Z, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI) Dec 15 03:26:44 localhost systemd[1]: 2649c3aec9af2f04509e1d248b1050b202fccc3ed2b86421c89a2a65376db057.service: Deactivated successfully. 
Dec 15 03:26:44 localhost podman[81198]: 2025-12-15 08:26:44.814803193 +0000 UTC m=+0.137634124 container health_status 97115ff7c5a84ae7e17f79e696e94da3c63f9fe0051287fe9193bec282a03f99 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, tcib_managed=true, vcs-type=git, name=rhosp17/openstack-ceilometer-ipmi, release=1761123044, distribution-scope=public, container_name=ceilometer_agent_ipmi, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, io.buildah.version=1.41.4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, architecture=x86_64, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'fcee5a4a91f85471fca7b61211375646'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vendor=Red Hat, Inc., build-date=2025-11-19T00:12:45Z, io.openshift.expose-services=, batch=17.1_20251118.1, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, version=17.1.12, config_id=tripleo_step4, com.redhat.component=openstack-ceilometer-ipmi-container) Dec 15 03:26:44 localhost podman[81197]: 2025-12-15 08:26:44.86752454 +0000 UTC m=+0.192754695 container health_status 36a88083ed613f6871d2631ae462bca308a45ba583333f7cfe4d0b0c40a18ba5 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1761123044, container_name=nova_compute, version=17.1.12, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, name=rhosp17/openstack-nova-compute, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, url=https://www.redhat.com, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-type=git, io.buildah.version=1.41.4, description=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, config_id=tripleo_step5, vendor=Red Hat, Inc., managed_by=tripleo_ansible, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 
'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '182e509007ab5e6e5b2500a552cbd5ba-879500e96bf8dfb93687004bd86f2317'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, build-date=2025-11-19T00:36:58Z, distribution-scope=public, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.component=openstack-nova-compute-container) Dec 15 03:26:44 localhost podman[81198]: 2025-12-15 08:26:44.871388164 +0000 UTC m=+0.194219085 container exec_died 
97115ff7c5a84ae7e17f79e696e94da3c63f9fe0051287fe9193bec282a03f99 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, batch=17.1_20251118.1, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vendor=Red Hat, Inc., config_id=tripleo_step4, distribution-scope=public, io.buildah.version=1.41.4, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'fcee5a4a91f85471fca7b61211375646'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, release=1761123044, container_name=ceilometer_agent_ipmi, konflux.additional-tags=17.1.12 17.1_20251118.1, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, tcib_managed=true, architecture=x86_64, vcs-type=git, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, 
version=17.1.12, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, build-date=2025-11-19T00:12:45Z, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.component=openstack-ceilometer-ipmi-container, io.openshift.expose-services=, name=rhosp17/openstack-ceilometer-ipmi) Dec 15 03:26:44 localhost systemd[1]: 97115ff7c5a84ae7e17f79e696e94da3c63f9fe0051287fe9193bec282a03f99.service: Deactivated successfully. Dec 15 03:26:44 localhost podman[81208]: 2025-12-15 08:26:44.925812266 +0000 UTC m=+0.241861996 container health_status d6041e5471624a495d5be42684f6049ccf71802a80d5760239330151d465d146 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, build-date=2025-11-19T00:11:48Z, konflux.additional-tags=17.1.12 17.1_20251118.1, name=rhosp17/openstack-ceilometer-compute, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., url=https://www.redhat.com, com.redhat.component=openstack-ceilometer-compute-container, config_id=tripleo_step4, io.openshift.expose-services=, container_name=ceilometer_agent_compute, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.buildah.version=1.41.4, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1, managed_by=tripleo_ansible, release=1761123044, version=17.1.12, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 
'TRIPLEO_CONFIG_HASH': 'fcee5a4a91f85471fca7b61211375646'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, distribution-scope=public, tcib_managed=true) Dec 15 03:26:44 localhost podman[81204]: 2025-12-15 08:26:44.978075512 +0000 UTC m=+0.296300980 container health_status ac45612fab357b615b4684dbfa5bd000dd29eb1adfbaedb6db0802fc49e89549 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 
'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, tcib_managed=true, konflux.additional-tags=17.1.12 17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 cron, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, com.redhat.component=openstack-cron-container, release=1761123044, vcs-type=git, architecture=x86_64, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, build-date=2025-11-18T22:49:32Z, vendor=Red Hat, Inc., io.openshift.expose-services=, distribution-scope=public, io.buildah.version=1.41.4, batch=17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 cron, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, url=https://www.redhat.com, name=rhosp17/openstack-cron, container_name=logrotate_crond, maintainer=OpenStack TripleO Team, config_id=tripleo_step4) Dec 15 03:26:44 localhost podman[81197]: 2025-12-15 08:26:44.998139867 +0000 UTC m=+0.323370062 container exec_died 
36a88083ed613f6871d2631ae462bca308a45ba583333f7cfe4d0b0c40a18ba5 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, vcs-type=git, build-date=2025-11-19T00:36:58Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, description=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, batch=17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, container_name=nova_compute, version=17.1.12, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, io.buildah.version=1.41.4, com.redhat.component=openstack-nova-compute-container, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '182e509007ab5e6e5b2500a552cbd5ba-879500e96bf8dfb93687004bd86f2317'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.12 17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 nova-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, vendor=Red Hat, Inc., tcib_managed=true, release=1761123044, config_id=tripleo_step5, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, name=rhosp17/openstack-nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute) Dec 15 03:26:45 localhost systemd[1]: 36a88083ed613f6871d2631ae462bca308a45ba583333f7cfe4d0b0c40a18ba5.service: Deactivated successfully. 
Dec 15 03:26:45 localhost podman[81204]: 2025-12-15 08:26:45.01437974 +0000 UTC m=+0.332605188 container exec_died ac45612fab357b615b4684dbfa5bd000dd29eb1adfbaedb6db0802fc49e89549 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 cron, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, version=17.1.12, summary=Red Hat OpenStack Platform 17.1 cron, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, name=rhosp17/openstack-cron, konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, managed_by=tripleo_ansible, io.buildah.version=1.41.4, batch=17.1_20251118.1, build-date=2025-11-18T22:49:32Z, container_name=logrotate_crond, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, tcib_managed=true, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.expose-services=, config_id=tripleo_step4, vcs-type=git, com.redhat.component=openstack-cron-container, distribution-scope=public, vendor=Red Hat, Inc., architecture=x86_64, release=1761123044) Dec 15 03:26:45 localhost systemd[1]: ac45612fab357b615b4684dbfa5bd000dd29eb1adfbaedb6db0802fc49e89549.service: Deactivated successfully. Dec 15 03:26:45 localhost podman[81208]: 2025-12-15 08:26:45.052015784 +0000 UTC m=+0.368065575 container exec_died d6041e5471624a495d5be42684f6049ccf71802a80d5760239330151d465d146 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, config_id=tripleo_step4, container_name=ceilometer_agent_compute, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, vcs-type=git, konflux.additional-tags=17.1.12 17.1_20251118.1, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp17/openstack-ceilometer-compute, release=1761123044, build-date=2025-11-19T00:11:48Z, distribution-scope=public, vendor=Red Hat, Inc., com.redhat.component=openstack-ceilometer-compute-container, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.expose-services=, batch=17.1_20251118.1, architecture=x86_64, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 
ceilometer-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'fcee5a4a91f85471fca7b61211375646'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, version=17.1.12, managed_by=tripleo_ansible) Dec 15 03:26:45 localhost systemd[1]: d6041e5471624a495d5be42684f6049ccf71802a80d5760239330151d465d146.service: Deactivated successfully. Dec 15 03:26:46 localhost systemd[1]: Started /usr/bin/podman healthcheck run 4c3f871f4bdf287b1d3ce717956c1cf130617489217410eadf6fffcc440eb3b0. 
Dec 15 03:26:46 localhost podman[81308]: 2025-12-15 08:26:46.742709819 +0000 UTC m=+0.077985912 container health_status 4c3f871f4bdf287b1d3ce717956c1cf130617489217410eadf6fffcc440eb3b0 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, container_name=nova_migration_target, release=1761123044, name=rhosp17/openstack-nova-compute, vcs-type=git, version=17.1.12, batch=17.1_20251118.1, config_id=tripleo_step4, distribution-scope=public, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '879500e96bf8dfb93687004bd86f2317'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.component=openstack-nova-compute-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.openshift.tags=rhosp osp openstack osp-17.1 
openstack-nova-compute, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, build-date=2025-11-19T00:36:58Z, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, url=https://www.redhat.com, managed_by=tripleo_ansible, io.buildah.version=1.41.4, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute) Dec 15 03:26:47 localhost podman[81308]: 2025-12-15 08:26:47.142468518 +0000 UTC m=+0.477744621 container exec_died 4c3f871f4bdf287b1d3ce717956c1cf130617489217410eadf6fffcc440eb3b0 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, com.redhat.component=openstack-nova-compute-container, version=17.1.12, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-type=git, vendor=Red Hat, Inc., container_name=nova_migration_target, config_id=tripleo_step4, architecture=x86_64, build-date=2025-11-19T00:36:58Z, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20251118.1, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-nova-compute, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '879500e96bf8dfb93687004bd86f2317'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 
'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, distribution-scope=public, io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, url=https://www.redhat.com, release=1761123044, description=Red Hat OpenStack Platform 17.1 nova-compute) Dec 15 03:26:47 localhost systemd[1]: 4c3f871f4bdf287b1d3ce717956c1cf130617489217410eadf6fffcc440eb3b0.service: Deactivated successfully. Dec 15 03:26:50 localhost systemd[1]: Started /usr/bin/podman healthcheck run 2df8d20c6ba32f38bd0e6519c9c77a4146b632f21c81197d8c319b223aad81f1. Dec 15 03:26:50 localhost systemd[1]: Started /usr/bin/podman healthcheck run 4d35b7dc3f3a4e3c60661e85d3511f50804e87244d74cab67af5c6e8c447d379. 
Dec 15 03:26:50 localhost podman[81333]: 2025-12-15 08:26:50.752367905 +0000 UTC m=+0.080878440 container health_status 4d35b7dc3f3a4e3c60661e85d3511f50804e87244d74cab67af5c6e8c447d379 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, container_name=ovn_metadata_agent, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, release=1761123044, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-19T00:14:25Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, name=rhosp17/openstack-neutron-metadata-agent-ovn, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.expose-services=, io.buildah.version=1.41.4, url=https://www.redhat.com, architecture=x86_64, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, vcs-type=git, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, vendor=Red Hat, Inc., config_id=tripleo_step4, distribution-scope=public, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, version=17.1.12, tcib_managed=true, managed_by=tripleo_ansible, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'a56a6f14b467cd9064e40c03defa5ed7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': 
['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, maintainer=OpenStack TripleO Team, batch=17.1_20251118.1) Dec 15 03:26:50 localhost systemd[1]: tmp-crun.UljVZc.mount: Deactivated successfully. 
Dec 15 03:26:50 localhost podman[81332]: 2025-12-15 08:26:50.818200292 +0000 UTC m=+0.148755462 container health_status 2df8d20c6ba32f38bd0e6519c9c77a4146b632f21c81197d8c319b223aad81f1 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, name=rhosp17/openstack-ovn-controller, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.buildah.version=1.41.4, url=https://www.redhat.com, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.component=openstack-ovn-controller-container, version=17.1.12, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, distribution-scope=public, batch=17.1_20251118.1, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step4, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, summary=Red Hat OpenStack Platform 17.1 ovn-controller, description=Red Hat OpenStack Platform 17.1 ovn-controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, 
architecture=x86_64, build-date=2025-11-18T23:34:05Z, tcib_managed=true, container_name=ovn_controller, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, release=1761123044) Dec 15 03:26:50 localhost podman[81333]: 2025-12-15 08:26:50.822554198 +0000 UTC m=+0.151064723 container exec_died 4d35b7dc3f3a4e3c60661e85d3511f50804e87244d74cab67af5c6e8c447d379 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, batch=17.1_20251118.1, container_name=ovn_metadata_agent, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'a56a6f14b467cd9064e40c03defa5ed7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', 
'/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, release=1761123044, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, config_id=tripleo_step4, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, version=17.1.12, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, io.buildah.version=1.41.4, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.expose-services=, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-19T00:14:25Z, name=rhosp17/openstack-neutron-metadata-agent-ovn, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, distribution-scope=public, vcs-type=git) Dec 15 03:26:50 localhost systemd[1]: 4d35b7dc3f3a4e3c60661e85d3511f50804e87244d74cab67af5c6e8c447d379.service: Deactivated successfully. 
Dec 15 03:26:50 localhost podman[81332]: 2025-12-15 08:26:50.866431019 +0000 UTC m=+0.196986189 container exec_died 2df8d20c6ba32f38bd0e6519c9c77a4146b632f21c81197d8c319b223aad81f1 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, version=17.1.12, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, tcib_managed=true, name=rhosp17/openstack-ovn-controller, batch=17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, maintainer=OpenStack TripleO Team, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, vendor=Red Hat, Inc., config_id=tripleo_step4, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, container_name=ovn_controller, vcs-type=git, architecture=x86_64, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.expose-services=, url=https://www.redhat.com, com.redhat.component=openstack-ovn-controller-container, io.buildah.version=1.41.4, distribution-scope=public, summary=Red Hat OpenStack 
Platform 17.1 ovn-controller, release=1761123044, build-date=2025-11-18T23:34:05Z, description=Red Hat OpenStack Platform 17.1 ovn-controller, managed_by=tripleo_ansible) Dec 15 03:26:50 localhost systemd[1]: 2df8d20c6ba32f38bd0e6519c9c77a4146b632f21c81197d8c319b223aad81f1.service: Deactivated successfully. Dec 15 03:26:59 localhost systemd[1]: Started /usr/bin/podman healthcheck run 6278d648aec9c9f12e0efaafa0d6ae5653eeb34459516699e562eeb9c8d23b3c. Dec 15 03:26:59 localhost podman[81380]: 2025-12-15 08:26:59.742172488 +0000 UTC m=+0.077414788 container health_status 6278d648aec9c9f12e0efaafa0d6ae5653eeb34459516699e562eeb9c8d23b3c (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, tcib_managed=true, io.buildah.version=1.41.4, name=rhosp17/openstack-qdrouterd, version=17.1.12, container_name=metrics_qdr, io.openshift.expose-services=, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, description=Red Hat OpenStack Platform 17.1 qdrouterd, build-date=2025-11-18T22:49:46Z, com.redhat.component=openstack-qdrouterd-container, distribution-scope=public, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 qdrouterd, config_id=tripleo_step1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, konflux.additional-tags=17.1.12 17.1_20251118.1, release=1761123044, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '29d07da103bbd44a9ed3e29999314b03'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, architecture=x86_64, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-type=git, batch=17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd) Dec 15 03:26:59 localhost podman[81380]: 2025-12-15 08:26:59.958277085 +0000 UTC m=+0.293519275 container exec_died 6278d648aec9c9f12e0efaafa0d6ae5653eeb34459516699e562eeb9c8d23b3c (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, com.redhat.component=openstack-qdrouterd-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, container_name=metrics_qdr, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, managed_by=tripleo_ansible, konflux.additional-tags=17.1.12 17.1_20251118.1, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 qdrouterd, name=rhosp17/openstack-qdrouterd, architecture=x86_64, 
config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '29d07da103bbd44a9ed3e29999314b03'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, version=17.1.12, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-type=git, distribution-scope=public, io.openshift.expose-services=, config_id=tripleo_step1, build-date=2025-11-18T22:49:46Z, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.4, release=1761123044, description=Red Hat OpenStack Platform 17.1 qdrouterd, batch=17.1_20251118.1) Dec 15 03:26:59 localhost systemd[1]: 6278d648aec9c9f12e0efaafa0d6ae5653eeb34459516699e562eeb9c8d23b3c.service: Deactivated successfully. Dec 15 03:27:08 localhost systemd[1]: Starting Check and recover tripleo_nova_virtqemud... 
Dec 15 03:27:08 localhost recover_tripleo_nova_virtqemud[81425]: 61849 Dec 15 03:27:08 localhost systemd[1]: tripleo_nova_virtqemud_recover.service: Deactivated successfully. Dec 15 03:27:08 localhost systemd[1]: Finished Check and recover tripleo_nova_virtqemud. Dec 15 03:27:14 localhost systemd[1]: Started /usr/bin/podman healthcheck run 165a6fd1f2b629d16da0798ec1c9bac41d302d530a0ffa668b2315a52d6aa04e. Dec 15 03:27:14 localhost podman[81533]: 2025-12-15 08:27:14.762148932 +0000 UTC m=+0.090059034 container health_status 165a6fd1f2b629d16da0798ec1c9bac41d302d530a0ffa668b2315a52d6aa04e (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, name=rhosp17/openstack-collectd, build-date=2025-11-18T22:51:28Z, description=Red Hat OpenStack Platform 17.1 collectd, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 collectd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, release=1761123044, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=collectd, architecture=x86_64, vcs-type=git, distribution-scope=public, tcib_managed=true, io.openshift.expose-services=, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.component=openstack-collectd-container, io.buildah.version=1.41.4, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, version=17.1.12, config_id=tripleo_step3, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, batch=17.1_20251118.1, vendor=Red Hat, Inc., managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 
'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}) Dec 15 03:27:14 localhost podman[81533]: 2025-12-15 08:27:14.777398709 +0000 UTC m=+0.105308801 container exec_died 165a6fd1f2b629d16da0798ec1c9bac41d302d530a0ffa668b2315a52d6aa04e (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, vcs-type=git, build-date=2025-11-18T22:51:28Z, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vendor=Red Hat, Inc., com.redhat.component=openstack-collectd-container, io.buildah.version=1.41.4, konflux.additional-tags=17.1.12 17.1_20251118.1, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 collectd, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, release=1761123044, version=17.1.12, architecture=x86_64, io.openshift.expose-services=, name=rhosp17/openstack-collectd, container_name=collectd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, batch=17.1_20251118.1, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 collectd, distribution-scope=public, config_id=tripleo_step3, 
url=https://www.redhat.com, tcib_managed=true) Dec 15 03:27:14 localhost systemd[1]: 165a6fd1f2b629d16da0798ec1c9bac41d302d530a0ffa668b2315a52d6aa04e.service: Deactivated successfully. Dec 15 03:27:14 localhost systemd[1]: Started /usr/bin/podman healthcheck run 2649c3aec9af2f04509e1d248b1050b202fccc3ed2b86421c89a2a65376db057. Dec 15 03:27:14 localhost systemd[1]: tmp-crun.TYPmdB.mount: Deactivated successfully. Dec 15 03:27:14 localhost podman[81552]: 2025-12-15 08:27:14.908314273 +0000 UTC m=+0.082394280 container health_status 2649c3aec9af2f04509e1d248b1050b202fccc3ed2b86421c89a2a65376db057 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, distribution-scope=public, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, batch=17.1_20251118.1, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, vcs-type=git, name=rhosp17/openstack-iscsid, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.buildah.version=1.41.4, build-date=2025-11-18T23:44:13Z, config_id=tripleo_step3, release=1761123044, tcib_managed=true, version=17.1.12, container_name=iscsid, com.redhat.component=openstack-iscsid-container, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '182e509007ab5e6e5b2500a552cbd5ba'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', 
'/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, io.openshift.expose-services=, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, description=Red Hat OpenStack Platform 17.1 iscsid) Dec 15 03:27:14 localhost systemd[1]: Started /usr/bin/podman healthcheck run 97115ff7c5a84ae7e17f79e696e94da3c63f9fe0051287fe9193bec282a03f99. 
Dec 15 03:27:14 localhost podman[81552]: 2025-12-15 08:27:14.92541736 +0000 UTC m=+0.099497367 container exec_died 2649c3aec9af2f04509e1d248b1050b202fccc3ed2b86421c89a2a65376db057 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, vendor=Red Hat, Inc., tcib_managed=true, io.openshift.expose-services=, com.redhat.component=openstack-iscsid-container, config_id=tripleo_step3, release=1761123044, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-11-18T23:44:13Z, architecture=x86_64, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '182e509007ab5e6e5b2500a552cbd5ba'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, description=Red Hat OpenStack Platform 17.1 iscsid, 
managed_by=tripleo_ansible, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.buildah.version=1.41.4, batch=17.1_20251118.1, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, name=rhosp17/openstack-iscsid, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, version=17.1.12, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, vcs-type=git, container_name=iscsid, konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid) Dec 15 03:27:14 localhost systemd[1]: 2649c3aec9af2f04509e1d248b1050b202fccc3ed2b86421c89a2a65376db057.service: Deactivated successfully. Dec 15 03:27:15 localhost podman[81570]: 2025-12-15 08:27:15.02021748 +0000 UTC m=+0.085589456 container health_status 97115ff7c5a84ae7e17f79e696e94da3c63f9fe0051287fe9193bec282a03f99 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, konflux.additional-tags=17.1.12 17.1_20251118.1, release=1761123044, build-date=2025-11-19T00:12:45Z, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, url=https://www.redhat.com, vcs-type=git, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, io.openshift.expose-services=, vendor=Red Hat, Inc., architecture=x86_64, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.component=openstack-ceilometer-ipmi-container, managed_by=tripleo_ansible, batch=17.1_20251118.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'fcee5a4a91f85471fca7b61211375646'}, 'healthcheck': {'test': 
'/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, name=rhosp17/openstack-ceilometer-ipmi, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, tcib_managed=true, config_id=tripleo_step4, distribution-scope=public, maintainer=OpenStack TripleO Team, container_name=ceilometer_agent_ipmi, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, version=17.1.12) Dec 15 03:27:15 localhost systemd[1]: Started /usr/bin/podman healthcheck run 36a88083ed613f6871d2631ae462bca308a45ba583333f7cfe4d0b0c40a18ba5. Dec 15 03:27:15 localhost systemd[1]: Started /usr/bin/podman healthcheck run ac45612fab357b615b4684dbfa5bd000dd29eb1adfbaedb6db0802fc49e89549. Dec 15 03:27:15 localhost systemd[1]: Started /usr/bin/podman healthcheck run d6041e5471624a495d5be42684f6049ccf71802a80d5760239330151d465d146. 
Dec 15 03:27:15 localhost podman[81570]: 2025-12-15 08:27:15.083288153 +0000 UTC m=+0.148660129 container exec_died 97115ff7c5a84ae7e17f79e696e94da3c63f9fe0051287fe9193bec282a03f99 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.component=openstack-ceilometer-ipmi-container, version=17.1.12, distribution-scope=public, vcs-type=git, name=rhosp17/openstack-ceilometer-ipmi, build-date=2025-11-19T00:12:45Z, architecture=x86_64, batch=17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.expose-services=, managed_by=tripleo_ansible, tcib_managed=true, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'fcee5a4a91f85471fca7b61211375646'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, container_name=ceilometer_agent_ipmi, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, release=1761123044, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, config_id=tripleo_step4, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi) Dec 15 03:27:15 localhost systemd[1]: 97115ff7c5a84ae7e17f79e696e94da3c63f9fe0051287fe9193bec282a03f99.service: Deactivated successfully. Dec 15 03:27:15 localhost podman[81607]: 2025-12-15 08:27:15.164735416 +0000 UTC m=+0.085630175 container health_status ac45612fab357b615b4684dbfa5bd000dd29eb1adfbaedb6db0802fc49e89549 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, managed_by=tripleo_ansible, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, distribution-scope=public, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_id=tripleo_step4, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=logrotate_crond, com.redhat.component=openstack-cron-container, url=https://www.redhat.com, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-type=git, architecture=x86_64, konflux.additional-tags=17.1.12 17.1_20251118.1, build-date=2025-11-18T22:49:32Z, name=rhosp17/openstack-cron, summary=Red Hat OpenStack Platform 17.1 cron, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, io.buildah.version=1.41.4, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 
'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, io.openshift.expose-services=, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 cron, version=17.1.12, maintainer=OpenStack TripleO Team, release=1761123044, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1) Dec 15 03:27:15 localhost podman[81607]: 2025-12-15 08:27:15.201329633 +0000 UTC m=+0.122224412 container exec_died ac45612fab357b615b4684dbfa5bd000dd29eb1adfbaedb6db0802fc49e89549 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, version=17.1.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, batch=17.1_20251118.1, io.buildah.version=1.41.4, 
vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, url=https://www.redhat.com, container_name=logrotate_crond, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, build-date=2025-11-18T22:49:32Z, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 cron, managed_by=tripleo_ansible, distribution-scope=public, com.redhat.component=openstack-cron-container, architecture=x86_64, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, name=rhosp17/openstack-cron, summary=Red Hat OpenStack Platform 17.1 cron, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1, config_id=tripleo_step4, release=1761123044, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, vendor=Red Hat, Inc.) Dec 15 03:27:15 localhost systemd[1]: ac45612fab357b615b4684dbfa5bd000dd29eb1adfbaedb6db0802fc49e89549.service: Deactivated successfully. 
Dec 15 03:27:15 localhost podman[81590]: 2025-12-15 08:27:15.220833374 +0000 UTC m=+0.182000899 container health_status 36a88083ed613f6871d2631ae462bca308a45ba583333f7cfe4d0b0c40a18ba5 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, config_id=tripleo_step5, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 nova-compute, distribution-scope=public, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.openshift.expose-services=, name=rhosp17/openstack-nova-compute, build-date=2025-11-19T00:36:58Z, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20251118.1, release=1761123044, vcs-type=git, url=https://www.redhat.com, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, architecture=x86_64, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '182e509007ab5e6e5b2500a552cbd5ba-879500e96bf8dfb93687004bd86f2317'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', 
'/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, io.buildah.version=1.41.4, com.redhat.component=openstack-nova-compute-container, container_name=nova_compute) Dec 15 03:27:15 localhost podman[81590]: 2025-12-15 08:27:15.278477943 +0000 UTC m=+0.239645438 container exec_died 36a88083ed613f6871d2631ae462bca308a45ba583333f7cfe4d0b0c40a18ba5 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, com.redhat.component=openstack-nova-compute-container, distribution-scope=public, release=1761123044, managed_by=tripleo_ansible, architecture=x86_64, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '182e509007ab5e6e5b2500a552cbd5ba-879500e96bf8dfb93687004bd86f2317'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 
'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, vcs-type=git, name=rhosp17/openstack-nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, config_id=tripleo_step5, summary=Red Hat OpenStack Platform 17.1 nova-compute, container_name=nova_compute, description=Red 
Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., io.buildah.version=1.41.4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, build-date=2025-11-19T00:36:58Z, url=https://www.redhat.com, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, version=17.1.12, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, batch=17.1_20251118.1) Dec 15 03:27:15 localhost systemd[1]: 36a88083ed613f6871d2631ae462bca308a45ba583333f7cfe4d0b0c40a18ba5.service: Deactivated successfully. Dec 15 03:27:15 localhost podman[81608]: 2025-12-15 08:27:15.279825208 +0000 UTC m=+0.194257265 container health_status d6041e5471624a495d5be42684f6049ccf71802a80d5760239330151d465d146 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, vcs-type=git, build-date=2025-11-19T00:11:48Z, com.redhat.component=openstack-ceilometer-compute-container, name=rhosp17/openstack-ceilometer-compute, tcib_managed=true, io.buildah.version=1.41.4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step4, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vendor=Red Hat, Inc., release=1761123044, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, batch=17.1_20251118.1, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'fcee5a4a91f85471fca7b61211375646'}, 'healthcheck': {'test': 
'/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, version=17.1.12, url=https://www.redhat.com, architecture=x86_64, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 17.1_20251118.1, container_name=ceilometer_agent_compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.expose-services=, managed_by=tripleo_ansible) Dec 15 03:27:15 localhost podman[81608]: 2025-12-15 08:27:15.360368008 +0000 UTC m=+0.274800035 container exec_died d6041e5471624a495d5be42684f6049ccf71802a80d5760239330151d465d146 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, build-date=2025-11-19T00:11:48Z, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, batch=17.1_20251118.1, version=17.1.12, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, tcib_managed=true, io.buildah.version=1.41.4, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, maintainer=OpenStack TripleO Team, vcs-type=git, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, distribution-scope=public, url=https://www.redhat.com, container_name=ceilometer_agent_compute, name=rhosp17/openstack-ceilometer-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'fcee5a4a91f85471fca7b61211375646'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, config_id=tripleo_step4, com.redhat.component=openstack-ceilometer-compute-container, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.expose-services=, 
vendor=Red Hat, Inc., org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, release=1761123044) Dec 15 03:27:15 localhost systemd[1]: d6041e5471624a495d5be42684f6049ccf71802a80d5760239330151d465d146.service: Deactivated successfully. Dec 15 03:27:17 localhost systemd[1]: Started /usr/bin/podman healthcheck run 4c3f871f4bdf287b1d3ce717956c1cf130617489217410eadf6fffcc440eb3b0. Dec 15 03:27:17 localhost systemd[1]: tmp-crun.o2cIb7.mount: Deactivated successfully. Dec 15 03:27:17 localhost podman[81669]: 2025-12-15 08:27:17.748588509 +0000 UTC m=+0.086462450 container health_status 4c3f871f4bdf287b1d3ce717956c1cf130617489217410eadf6fffcc440eb3b0 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, io.buildah.version=1.41.4, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20251118.1, container_name=nova_migration_target, io.openshift.expose-services=, architecture=x86_64, config_id=tripleo_step4, build-date=2025-11-19T00:36:58Z, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 nova-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, maintainer=OpenStack TripleO Team, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '879500e96bf8dfb93687004bd86f2317'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, version=17.1.12, url=https://www.redhat.com, com.redhat.component=openstack-nova-compute-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, release=1761123044, vcs-type=git, vendor=Red Hat, Inc., name=rhosp17/openstack-nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, distribution-scope=public) Dec 15 03:27:17 localhost python3[81696]: ansible-ansible.legacy.command Invoked with _raw_params=subscription-manager repos --disable rhceph-7-tools-for-rhel-9-x86_64-rpms _uses_shell=True zuul_log_id=in-loop-ignore zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None Dec 15 03:27:18 localhost podman[81669]: 2025-12-15 08:27:18.120281038 +0000 UTC m=+0.458154979 container exec_died 4c3f871f4bdf287b1d3ce717956c1cf130617489217410eadf6fffcc440eb3b0 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, build-date=2025-11-19T00:36:58Z, vendor=Red Hat, Inc., io.buildah.version=1.41.4, batch=17.1_20251118.1, distribution-scope=public, tcib_managed=true, io.k8s.display-name=Red Hat 
OpenStack Platform 17.1 nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, maintainer=OpenStack TripleO Team, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '879500e96bf8dfb93687004bd86f2317'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, url=https://www.redhat.com, com.redhat.component=openstack-nova-compute-container, description=Red Hat OpenStack Platform 17.1 nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, managed_by=tripleo_ansible, config_id=tripleo_step4, summary=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, name=rhosp17/openstack-nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, release=1761123044, 
konflux.additional-tags=17.1.12 17.1_20251118.1, version=17.1.12, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, container_name=nova_migration_target) Dec 15 03:27:18 localhost systemd[1]: 4c3f871f4bdf287b1d3ce717956c1cf130617489217410eadf6fffcc440eb3b0.service: Deactivated successfully. Dec 15 03:27:20 localhost rhsm-service[6578]: WARNING [subscription_manager.cert_sorter:194] Installed product 479 not present in response from server. Dec 15 03:27:21 localhost rhsm-service[6578]: WARNING [subscription_manager.cert_sorter:194] Installed product 479 not present in response from server. Dec 15 03:27:21 localhost systemd[1]: Started /usr/bin/podman healthcheck run 2df8d20c6ba32f38bd0e6519c9c77a4146b632f21c81197d8c319b223aad81f1. Dec 15 03:27:21 localhost systemd[1]: Started /usr/bin/podman healthcheck run 4d35b7dc3f3a4e3c60661e85d3511f50804e87244d74cab67af5c6e8c447d379. Dec 15 03:27:21 localhost podman[81834]: 2025-12-15 08:27:21.766077723 +0000 UTC m=+0.090780745 container health_status 2df8d20c6ba32f38bd0e6519c9c77a4146b632f21c81197d8c319b223aad81f1 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, name=rhosp17/openstack-ovn-controller, konflux.additional-tags=17.1.12 17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, batch=17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.expose-services=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 
'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, container_name=ovn_controller, version=17.1.12, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 ovn-controller, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, vendor=Red Hat, Inc., com.redhat.component=openstack-ovn-controller-container, config_id=tripleo_step4, url=https://www.redhat.com, managed_by=tripleo_ansible, build-date=2025-11-18T23:34:05Z, release=1761123044, tcib_managed=true, vcs-type=git, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, distribution-scope=public) Dec 15 03:27:21 localhost podman[81835]: 2025-12-15 08:27:21.809615404 +0000 UTC m=+0.134391728 container health_status 4d35b7dc3f3a4e3c60661e85d3511f50804e87244d74cab67af5c6e8c447d379 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, container_name=ovn_metadata_agent, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, url=https://www.redhat.com, vcs-type=git, batch=17.1_20251118.1, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 
'a56a6f14b467cd9064e40c03defa5ed7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_id=tripleo_step4, version=17.1.12, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vendor=Red Hat, Inc., io.buildah.version=1.41.4, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, konflux.additional-tags=17.1.12 17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, maintainer=OpenStack TripleO Team, release=1761123044, name=rhosp17/openstack-neutron-metadata-agent-ovn, 
com.redhat.component=openstack-neutron-metadata-agent-ovn-container, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, distribution-scope=public, tcib_managed=true, build-date=2025-11-19T00:14:25Z, architecture=x86_64) Dec 15 03:27:21 localhost podman[81834]: 2025-12-15 08:27:21.818331698 +0000 UTC m=+0.143034770 container exec_died 2df8d20c6ba32f38bd0e6519c9c77a4146b632f21c81197d8c319b223aad81f1 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, version=17.1.12, io.openshift.expose-services=, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, description=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.component=openstack-ovn-controller-container, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, release=1761123044, io.buildah.version=1.41.4, config_id=tripleo_step4, summary=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, managed_by=tripleo_ansible, build-date=2025-11-18T23:34:05Z, batch=17.1_20251118.1, tcib_managed=true, vendor=Red Hat, Inc., 
maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, distribution-scope=public, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, container_name=ovn_controller, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, architecture=x86_64, name=rhosp17/openstack-ovn-controller) Dec 15 03:27:21 localhost systemd[1]: 2df8d20c6ba32f38bd0e6519c9c77a4146b632f21c81197d8c319b223aad81f1.service: Deactivated successfully. Dec 15 03:27:21 localhost podman[81835]: 2025-12-15 08:27:21.881474212 +0000 UTC m=+0.206250506 container exec_died 4d35b7dc3f3a4e3c60661e85d3511f50804e87244d74cab67af5c6e8c447d379 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, distribution-scope=public, tcib_managed=true, io.openshift.expose-services=, version=17.1.12, batch=17.1_20251118.1, architecture=x86_64, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'a56a6f14b467cd9064e40c03defa5ed7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', 
'/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, io.buildah.version=1.41.4, release=1761123044, vcs-type=git, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, build-date=2025-11-19T00:14:25Z, url=https://www.redhat.com, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp17/openstack-neutron-metadata-agent-ovn, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_id=tripleo_step4, container_name=ovn_metadata_agent, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn) Dec 15 03:27:21 localhost systemd[1]: 4d35b7dc3f3a4e3c60661e85d3511f50804e87244d74cab67af5c6e8c447d379.service: Deactivated successfully. Dec 15 03:27:30 localhost sshd[81940]: main: sshd: ssh-rsa algorithm is disabled Dec 15 03:27:30 localhost systemd[1]: Started /usr/bin/podman healthcheck run 6278d648aec9c9f12e0efaafa0d6ae5653eeb34459516699e562eeb9c8d23b3c. 
Dec 15 03:27:30 localhost podman[81942]: 2025-12-15 08:27:30.754937009 +0000 UTC m=+0.086704745 container health_status 6278d648aec9c9f12e0efaafa0d6ae5653eeb34459516699e562eeb9c8d23b3c (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, name=rhosp17/openstack-qdrouterd, vcs-type=git, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_id=tripleo_step1, tcib_managed=true, distribution-scope=public, managed_by=tripleo_ansible, release=1761123044, io.buildah.version=1.41.4, vendor=Red Hat, Inc., org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, summary=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20251118.1, container_name=metrics_qdr, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '29d07da103bbd44a9ed3e29999314b03'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, url=https://www.redhat.com, version=17.1.12, description=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, build-date=2025-11-18T22:49:46Z, com.redhat.component=openstack-qdrouterd-container, konflux.additional-tags=17.1.12 17.1_20251118.1) Dec 15 03:27:30 localhost podman[81942]: 2025-12-15 08:27:30.990181768 +0000 UTC m=+0.321949494 container exec_died 6278d648aec9c9f12e0efaafa0d6ae5653eeb34459516699e562eeb9c8d23b3c (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, build-date=2025-11-18T22:49:46Z, tcib_managed=true, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, version=17.1.12, batch=17.1_20251118.1, vcs-type=git, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '29d07da103bbd44a9ed3e29999314b03'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, description=Red Hat OpenStack Platform 17.1 qdrouterd, release=1761123044, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, url=https://www.redhat.com, container_name=metrics_qdr, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_id=tripleo_step1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, vendor=Red Hat, Inc., konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.component=openstack-qdrouterd-container, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, name=rhosp17/openstack-qdrouterd, architecture=x86_64, maintainer=OpenStack TripleO Team, distribution-scope=public) Dec 15 03:27:31 localhost systemd[1]: 6278d648aec9c9f12e0efaafa0d6ae5653eeb34459516699e562eeb9c8d23b3c.service: Deactivated successfully. Dec 15 03:27:45 localhost systemd[1]: Started /usr/bin/podman healthcheck run 165a6fd1f2b629d16da0798ec1c9bac41d302d530a0ffa668b2315a52d6aa04e. Dec 15 03:27:45 localhost systemd[1]: Started /usr/bin/podman healthcheck run 2649c3aec9af2f04509e1d248b1050b202fccc3ed2b86421c89a2a65376db057. Dec 15 03:27:45 localhost systemd[1]: Started /usr/bin/podman healthcheck run 36a88083ed613f6871d2631ae462bca308a45ba583333f7cfe4d0b0c40a18ba5. Dec 15 03:27:45 localhost systemd[1]: Started /usr/bin/podman healthcheck run 97115ff7c5a84ae7e17f79e696e94da3c63f9fe0051287fe9193bec282a03f99. 
Dec 15 03:27:45 localhost systemd[1]: Started /usr/bin/podman healthcheck run ac45612fab357b615b4684dbfa5bd000dd29eb1adfbaedb6db0802fc49e89549. Dec 15 03:27:45 localhost systemd[1]: Started /usr/bin/podman healthcheck run d6041e5471624a495d5be42684f6049ccf71802a80d5760239330151d465d146. Dec 15 03:27:45 localhost systemd[1]: tmp-crun.61snUg.mount: Deactivated successfully. Dec 15 03:27:45 localhost podman[81971]: 2025-12-15 08:27:45.760692754 +0000 UTC m=+0.088542414 container health_status 165a6fd1f2b629d16da0798ec1c9bac41d302d530a0ffa668b2315a52d6aa04e (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, batch=17.1_20251118.1, name=rhosp17/openstack-collectd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, release=1761123044, tcib_managed=true, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 collectd, vcs-type=git, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, io.openshift.expose-services=, build-date=2025-11-18T22:51:28Z, io.buildah.version=1.41.4, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 collectd, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, architecture=x86_64, com.redhat.component=openstack-collectd-container, config_id=tripleo_step3, managed_by=tripleo_ansible, version=17.1.12, container_name=collectd) Dec 15 03:27:45 localhost podman[81972]: 2025-12-15 08:27:45.741528802 +0000 UTC m=+0.070760169 container health_status 2649c3aec9af2f04509e1d248b1050b202fccc3ed2b86421c89a2a65376db057 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, com.redhat.component=openstack-iscsid-container, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, vendor=Red Hat, Inc., konflux.additional-tags=17.1.12 17.1_20251118.1, version=17.1.12, container_name=iscsid, tcib_managed=true, name=rhosp17/openstack-iscsid, 
cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, description=Red Hat OpenStack Platform 17.1 iscsid, managed_by=tripleo_ansible, url=https://www.redhat.com, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '182e509007ab5e6e5b2500a552cbd5ba'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, batch=17.1_20251118.1, distribution-scope=public, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-18T23:44:13Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, config_id=tripleo_step3, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, release=1761123044) Dec 15 03:27:45 localhost podman[81971]: 2025-12-15 08:27:45.798266367 +0000 UTC m=+0.126116037 container exec_died 
165a6fd1f2b629d16da0798ec1c9bac41d302d530a0ffa668b2315a52d6aa04e (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, build-date=2025-11-18T22:51:28Z, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, batch=17.1_20251118.1, architecture=x86_64, release=1761123044, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, managed_by=tripleo_ansible, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, distribution-scope=public, io.openshift.expose-services=, config_id=tripleo_step3, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, name=rhosp17/openstack-collectd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vendor=Red Hat, Inc., com.redhat.component=openstack-collectd-container, konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', 
'/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, description=Red Hat OpenStack Platform 17.1 collectd, url=https://www.redhat.com, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, container_name=collectd, summary=Red Hat OpenStack Platform 17.1 collectd) Dec 15 03:27:45 localhost systemd[1]: 165a6fd1f2b629d16da0798ec1c9bac41d302d530a0ffa668b2315a52d6aa04e.service: Deactivated successfully. Dec 15 03:27:45 localhost podman[81974]: 2025-12-15 08:27:45.819339459 +0000 UTC m=+0.138512618 container health_status 97115ff7c5a84ae7e17f79e696e94da3c63f9fe0051287fe9193bec282a03f99 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, distribution-scope=public, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.component=openstack-ceilometer-ipmi-container, version=17.1.12, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.expose-services=, batch=17.1_20251118.1, io.buildah.version=1.41.4, architecture=x86_64, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-type=git, maintainer=OpenStack TripleO Team, container_name=ceilometer_agent_ipmi, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.tags=rhosp 
osp openstack osp-17.1 openstack-ceilometer-ipmi, config_id=tripleo_step4, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'fcee5a4a91f85471fca7b61211375646'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, release=1761123044, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, url=https://www.redhat.com, name=rhosp17/openstack-ceilometer-ipmi, build-date=2025-11-19T00:12:45Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vendor=Red Hat, Inc.) 
Dec 15 03:27:45 localhost podman[81986]: 2025-12-15 08:27:45.861606347 +0000 UTC m=+0.176712018 container health_status d6041e5471624a495d5be42684f6049ccf71802a80d5760239330151d465d146 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, io.openshift.expose-services=, release=1761123044, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, konflux.additional-tags=17.1.12 17.1_20251118.1, config_id=tripleo_step4, batch=17.1_20251118.1, build-date=2025-11-19T00:11:48Z, version=17.1.12, maintainer=OpenStack TripleO Team, vcs-type=git, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, io.buildah.version=1.41.4, architecture=x86_64, distribution-scope=public, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, name=rhosp17/openstack-ceilometer-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, container_name=ceilometer_agent_compute, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'fcee5a4a91f85471fca7b61211375646'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, com.redhat.component=openstack-ceilometer-compute-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute) Dec 15 03:27:45 localhost podman[81973]: 2025-12-15 08:27:45.868504791 +0000 UTC m=+0.190242708 container health_status 36a88083ed613f6871d2631ae462bca308a45ba583333f7cfe4d0b0c40a18ba5 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, com.redhat.component=openstack-nova-compute-container, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-nova-compute, architecture=x86_64, distribution-scope=public, config_id=tripleo_step5, container_name=nova_compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.expose-services=, release=1761123044, build-date=2025-11-19T00:36:58Z, version=17.1.12, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-type=git, description=Red Hat OpenStack Platform 17.1 nova-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 
'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '182e509007ab5e6e5b2500a552cbd5ba-879500e96bf8dfb93687004bd86f2317'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, tcib_managed=true, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, url=https://www.redhat.com, io.buildah.version=1.41.4, batch=17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 
nova-compute) Dec 15 03:27:45 localhost podman[81973]: 2025-12-15 08:27:45.916117442 +0000 UTC m=+0.237855329 container exec_died 36a88083ed613f6871d2631ae462bca308a45ba583333f7cfe4d0b0c40a18ba5 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-type=git, release=1761123044, container_name=nova_compute, distribution-scope=public, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '182e509007ab5e6e5b2500a552cbd5ba-879500e96bf8dfb93687004bd86f2317'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', 
'/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20251118.1, vendor=Red Hat, Inc., io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.component=openstack-nova-compute-container, build-date=2025-11-19T00:36:58Z, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, description=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.12, name=rhosp17/openstack-nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, config_id=tripleo_step5, io.buildah.version=1.41.4, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream) Dec 15 03:27:45 localhost podman[81985]: 2025-12-15 08:27:45.929104288 +0000 UTC m=+0.244189558 container health_status ac45612fab357b615b4684dbfa5bd000dd29eb1adfbaedb6db0802fc49e89549 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, build-date=2025-11-18T22:49:32Z, description=Red Hat OpenStack Platform 17.1 cron, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, vcs-type=git, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 
'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, tcib_managed=true, release=1761123044, name=rhosp17/openstack-cron, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, summary=Red Hat OpenStack Platform 17.1 cron, url=https://www.redhat.com, version=17.1.12, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.expose-services=, config_id=tripleo_step4, distribution-scope=public, managed_by=tripleo_ansible, batch=17.1_20251118.1, com.redhat.component=openstack-cron-container, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, container_name=logrotate_crond, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron) Dec 15 03:27:45 localhost systemd[1]: 36a88083ed613f6871d2631ae462bca308a45ba583333f7cfe4d0b0c40a18ba5.service: Deactivated successfully. 
Dec 15 03:27:45 localhost podman[81985]: 2025-12-15 08:27:45.940297027 +0000 UTC m=+0.255382307 container exec_died ac45612fab357b615b4684dbfa5bd000dd29eb1adfbaedb6db0802fc49e89549 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, description=Red Hat OpenStack Platform 17.1 cron, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, release=1761123044, url=https://www.redhat.com, version=17.1.12, vcs-type=git, name=rhosp17/openstack-cron, batch=17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 cron, build-date=2025-11-18T22:49:32Z, container_name=logrotate_crond, 
managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, com.redhat.component=openstack-cron-container, io.buildah.version=1.41.4, tcib_managed=true, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, distribution-scope=public, konflux.additional-tags=17.1.12 17.1_20251118.1, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_id=tripleo_step4) Dec 15 03:27:45 localhost podman[81986]: 2025-12-15 08:27:45.942974708 +0000 UTC m=+0.258080419 container exec_died d6041e5471624a495d5be42684f6049ccf71802a80d5760239330151d465d146 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, name=rhosp17/openstack-ceilometer-compute, tcib_managed=true, container_name=ceilometer_agent_compute, build-date=2025-11-19T00:11:48Z, architecture=x86_64, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, batch=17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, config_id=tripleo_step4, vendor=Red Hat, Inc., vcs-type=git, release=1761123044, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, version=17.1.12, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.4, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, managed_by=tripleo_ansible, io.openshift.expose-services=, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'fcee5a4a91f85471fca7b61211375646'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 
'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, com.redhat.component=openstack-ceilometer-compute-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, konflux.additional-tags=17.1.12 17.1_20251118.1) Dec 15 03:27:45 localhost systemd[1]: d6041e5471624a495d5be42684f6049ccf71802a80d5760239330151d465d146.service: Deactivated successfully. Dec 15 03:27:46 localhost systemd[1]: ac45612fab357b615b4684dbfa5bd000dd29eb1adfbaedb6db0802fc49e89549.service: Deactivated successfully. 
Dec 15 03:27:46 localhost podman[81974]: 2025-12-15 08:27:46.044629501 +0000 UTC m=+0.363802690 container exec_died 97115ff7c5a84ae7e17f79e696e94da3c63f9fe0051287fe9193bec282a03f99 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'fcee5a4a91f85471fca7b61211375646'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, architecture=x86_64, vcs-type=git, distribution-scope=public, konflux.additional-tags=17.1.12 17.1_20251118.1, batch=17.1_20251118.1, io.buildah.version=1.41.4, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_id=tripleo_step4, build-date=2025-11-19T00:12:45Z, tcib_managed=true, version=17.1.12, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, container_name=ceilometer_agent_ipmi, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.component=openstack-ceilometer-ipmi-container, io.openshift.expose-services=, release=1761123044, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vendor=Red Hat, Inc., managed_by=tripleo_ansible, name=rhosp17/openstack-ceilometer-ipmi, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi) Dec 15 03:27:46 localhost systemd[1]: 97115ff7c5a84ae7e17f79e696e94da3c63f9fe0051287fe9193bec282a03f99.service: Deactivated successfully. Dec 15 03:27:46 localhost podman[81972]: 2025-12-15 08:27:46.128455208 +0000 UTC m=+0.457686645 container exec_died 2649c3aec9af2f04509e1d248b1050b202fccc3ed2b86421c89a2a65376db057 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, container_name=iscsid, name=rhosp17/openstack-iscsid, maintainer=OpenStack TripleO Team, batch=17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-iscsid-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.expose-services=, version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, io.buildah.version=1.41.4, vcs-type=git, vendor=Red Hat, Inc., build-date=2025-11-18T23:44:13Z, tcib_managed=true, distribution-scope=public, config_id=tripleo_step3, managed_by=tripleo_ansible, 
url=https://www.redhat.com, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '182e509007ab5e6e5b2500a552cbd5ba'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 iscsid, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, konflux.additional-tags=17.1.12 17.1_20251118.1, release=1761123044) Dec 15 03:27:46 localhost systemd[1]: 2649c3aec9af2f04509e1d248b1050b202fccc3ed2b86421c89a2a65376db057.service: Deactivated successfully. Dec 15 03:27:48 localhost systemd[1]: Started /usr/bin/podman healthcheck run 4c3f871f4bdf287b1d3ce717956c1cf130617489217410eadf6fffcc440eb3b0. 
Dec 15 03:27:48 localhost podman[82103]: 2025-12-15 08:27:48.746900153 +0000 UTC m=+0.081338252 container health_status 4c3f871f4bdf287b1d3ce717956c1cf130617489217410eadf6fffcc440eb3b0 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '879500e96bf8dfb93687004bd86f2317'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, tcib_managed=true, url=https://www.redhat.com, vcs-type=git, container_name=nova_migration_target, architecture=x86_64, name=rhosp17/openstack-nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, 
io.openshift.expose-services=, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, release=1761123044, com.redhat.component=openstack-nova-compute-container, distribution-scope=public, io.buildah.version=1.41.4, version=17.1.12, config_id=tripleo_step4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, build-date=2025-11-19T00:36:58Z, batch=17.1_20251118.1, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d) Dec 15 03:27:49 localhost podman[82103]: 2025-12-15 08:27:49.123437594 +0000 UTC m=+0.457875653 container exec_died 4c3f871f4bdf287b1d3ce717956c1cf130617489217410eadf6fffcc440eb3b0 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2025-11-19T00:36:58Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, version=17.1.12, io.buildah.version=1.41.4, config_id=tripleo_step4, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '879500e96bf8dfb93687004bd86f2317'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, container_name=nova_migration_target, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, url=https://www.redhat.com, vcs-type=git, io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, maintainer=OpenStack TripleO Team, release=1761123044, architecture=x86_64, com.redhat.component=openstack-nova-compute-container, batch=17.1_20251118.1, managed_by=tripleo_ansible, tcib_managed=true, name=rhosp17/openstack-nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, distribution-scope=public, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 nova-compute) Dec 15 03:27:49 localhost systemd[1]: 4c3f871f4bdf287b1d3ce717956c1cf130617489217410eadf6fffcc440eb3b0.service: Deactivated successfully. Dec 15 03:27:52 localhost systemd[1]: Started /usr/bin/podman healthcheck run 2df8d20c6ba32f38bd0e6519c9c77a4146b632f21c81197d8c319b223aad81f1. Dec 15 03:27:52 localhost systemd[1]: Started /usr/bin/podman healthcheck run 4d35b7dc3f3a4e3c60661e85d3511f50804e87244d74cab67af5c6e8c447d379. 
Dec 15 03:27:52 localhost podman[82126]: 2025-12-15 08:27:52.750552459 +0000 UTC m=+0.080900941 container health_status 2df8d20c6ba32f38bd0e6519c9c77a4146b632f21c81197d8c319b223aad81f1 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.expose-services=, name=rhosp17/openstack-ovn-controller, io.buildah.version=1.41.4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, release=1761123044, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 ovn-controller, batch=17.1_20251118.1, vendor=Red Hat, Inc., container_name=ovn_controller, version=17.1.12, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, config_id=tripleo_step4, konflux.additional-tags=17.1.12 17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, build-date=2025-11-18T23:34:05Z, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, io.k8s.display-name=Red Hat OpenStack 
Platform 17.1 ovn-controller, vcs-type=git, architecture=x86_64, tcib_managed=true, com.redhat.component=openstack-ovn-controller-container, summary=Red Hat OpenStack Platform 17.1 ovn-controller) Dec 15 03:27:52 localhost podman[82126]: 2025-12-15 08:27:52.799886766 +0000 UTC m=+0.130235248 container exec_died 2df8d20c6ba32f38bd0e6519c9c77a4146b632f21c81197d8c319b223aad81f1 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, summary=Red Hat OpenStack Platform 17.1 ovn-controller, release=1761123044, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, description=Red Hat OpenStack Platform 17.1 ovn-controller, name=rhosp17/openstack-ovn-controller, vcs-type=git, build-date=2025-11-18T23:34:05Z, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, io.openshift.expose-services=, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, version=17.1.12, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, io.buildah.version=1.41.4, com.redhat.component=openstack-ovn-controller-container, konflux.additional-tags=17.1.12 17.1_20251118.1, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, 
architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, config_id=tripleo_step4, batch=17.1_20251118.1, managed_by=tripleo_ansible, container_name=ovn_controller, maintainer=OpenStack TripleO Team) Dec 15 03:27:52 localhost systemd[1]: 2df8d20c6ba32f38bd0e6519c9c77a4146b632f21c81197d8c319b223aad81f1.service: Deactivated successfully. Dec 15 03:27:52 localhost podman[82127]: 2025-12-15 08:27:52.801519839 +0000 UTC m=+0.129861407 container health_status 4d35b7dc3f3a4e3c60661e85d3511f50804e87244d74cab67af5c6e8c447d379 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, container_name=ovn_metadata_agent, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, tcib_managed=true, vcs-type=git, io.openshift.expose-services=, vendor=Red Hat, Inc., build-date=2025-11-19T00:14:25Z, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, konflux.additional-tags=17.1.12 17.1_20251118.1, batch=17.1_20251118.1, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, name=rhosp17/openstack-neutron-metadata-agent-ovn, release=1761123044, url=https://www.redhat.com, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, managed_by=tripleo_ansible, config_id=tripleo_step4, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'a56a6f14b467cd9064e40c03defa5ed7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 
'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, architecture=x86_64, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, version=17.1.12, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn) Dec 15 03:27:52 localhost podman[82127]: 2025-12-15 08:27:52.885021477 +0000 UTC m=+0.213363015 container exec_died 4d35b7dc3f3a4e3c60661e85d3511f50804e87244d74cab67af5c6e8c447d379 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, release=1761123044, io.openshift.expose-services=, konflux.additional-tags=17.1.12 
17.1_20251118.1, version=17.1.12, architecture=x86_64, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-type=git, config_id=tripleo_step4, vendor=Red Hat, Inc., container_name=ovn_metadata_agent, io.buildah.version=1.41.4, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, name=rhosp17/openstack-neutron-metadata-agent-ovn, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'a56a6f14b467cd9064e40c03defa5ed7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', 
'/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, build-date=2025-11-19T00:14:25Z, maintainer=OpenStack TripleO Team, tcib_managed=true, batch=17.1_20251118.1) Dec 15 03:27:52 localhost systemd[1]: 4d35b7dc3f3a4e3c60661e85d3511f50804e87244d74cab67af5c6e8c447d379.service: Deactivated successfully. Dec 15 03:27:53 localhost python3[82184]: ansible-ansible.builtin.slurp Invoked with path=/home/zuul/ansible_hostname src=/home/zuul/ansible_hostname Dec 15 03:28:01 localhost systemd[1]: Started /usr/bin/podman healthcheck run 6278d648aec9c9f12e0efaafa0d6ae5653eeb34459516699e562eeb9c8d23b3c. 
Dec 15 03:28:01 localhost podman[82185]: 2025-12-15 08:28:01.751397036 +0000 UTC m=+0.085112722 container health_status 6278d648aec9c9f12e0efaafa0d6ae5653eeb34459516699e562eeb9c8d23b3c (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, com.redhat.component=openstack-qdrouterd-container, release=1761123044, container_name=metrics_qdr, io.buildah.version=1.41.4, summary=Red Hat OpenStack Platform 17.1 qdrouterd, vendor=Red Hat, Inc., konflux.additional-tags=17.1.12 17.1_20251118.1, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.expose-services=, tcib_managed=true, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '29d07da103bbd44a9ed3e29999314b03'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, name=rhosp17/openstack-qdrouterd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_id=tripleo_step1, batch=17.1_20251118.1, vcs-type=git, url=https://www.redhat.com, build-date=2025-11-18T22:49:46Z, version=17.1.12, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, managed_by=tripleo_ansible) Dec 15 03:28:01 localhost podman[82185]: 2025-12-15 08:28:01.947304755 +0000 UTC m=+0.281020381 container exec_died 6278d648aec9c9f12e0efaafa0d6ae5653eeb34459516699e562eeb9c8d23b3c (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, io.openshift.expose-services=, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.component=openstack-qdrouterd-container, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vendor=Red Hat, Inc., config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '29d07da103bbd44a9ed3e29999314b03'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, release=1761123044, description=Red Hat OpenStack Platform 17.1 qdrouterd, io.buildah.version=1.41.4, tcib_managed=true, url=https://www.redhat.com, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1, name=rhosp17/openstack-qdrouterd, summary=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-type=git, build-date=2025-11-18T22:49:46Z, version=17.1.12, config_id=tripleo_step1, container_name=metrics_qdr) Dec 15 03:28:01 localhost systemd[1]: 6278d648aec9c9f12e0efaafa0d6ae5653eeb34459516699e562eeb9c8d23b3c.service: Deactivated successfully. 
Dec 15 03:28:10 localhost podman[82331]: 2025-12-15 08:28:10.679485832 +0000 UTC m=+0.091458461 container exec 8dcda56b365b42dc8758aab77a9ec80db304780e449052738f7e4e648ae1ecaf (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-bce17446-41b5-5408-a23e-0b011906b44a-crash-np0005559462, io.buildah.version=1.41.4, vcs-type=git, ceph=True, release=1763362218, maintainer=Guillaume Abrioux , build-date=2025-11-26T19:44:28Z, com.redhat.component=rhceph-container, io.openshift.expose-services=, io.openshift.tags=rhceph ceph, GIT_REPO=https://github.com/ceph/ceph-container.git, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, architecture=x86_64, RELEASE=main, GIT_CLEAN=True, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.k8s.description=Red Hat Ceph Storage 7, CEPH_POINT_RELEASE=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, version=7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_BRANCH=main, url=https://catalog.redhat.com/en/search?searchType=containers, vendor=Red Hat, Inc., description=Red Hat Ceph Storage 7, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, name=rhceph) Dec 15 03:28:10 localhost podman[82331]: 2025-12-15 08:28:10.769307409 +0000 UTC m=+0.181279988 container exec_died 8dcda56b365b42dc8758aab77a9ec80db304780e449052738f7e4e648ae1ecaf (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-bce17446-41b5-5408-a23e-0b011906b44a-crash-np0005559462, distribution-scope=public, GIT_CLEAN=True, architecture=x86_64, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., build-date=2025-11-26T19:44:28Z, io.k8s.description=Red Hat Ceph Storage 7, CEPH_POINT_RELEASE=, 
vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, vcs-type=git, com.redhat.component=rhceph-container, description=Red Hat Ceph Storage 7, io.buildah.version=1.41.4, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, version=7, RELEASE=main, url=https://catalog.redhat.com/en/search?searchType=containers, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhceph ceph, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, ceph=True, io.openshift.expose-services=, name=rhceph, maintainer=Guillaume Abrioux , release=1763362218, GIT_REPO=https://github.com/ceph/ceph-container.git, GIT_BRANCH=main, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9) Dec 15 03:28:12 localhost systemd[1]: Starting Check and recover tripleo_nova_virtqemud... Dec 15 03:28:12 localhost recover_tripleo_nova_virtqemud[82500]: 61849 Dec 15 03:28:12 localhost systemd[1]: tripleo_nova_virtqemud_recover.service: Deactivated successfully. Dec 15 03:28:12 localhost systemd[1]: Finished Check and recover tripleo_nova_virtqemud. Dec 15 03:28:16 localhost systemd[1]: Started /usr/bin/podman healthcheck run 165a6fd1f2b629d16da0798ec1c9bac41d302d530a0ffa668b2315a52d6aa04e. Dec 15 03:28:16 localhost systemd[1]: Started /usr/bin/podman healthcheck run 2649c3aec9af2f04509e1d248b1050b202fccc3ed2b86421c89a2a65376db057. Dec 15 03:28:16 localhost systemd[1]: Started /usr/bin/podman healthcheck run 36a88083ed613f6871d2631ae462bca308a45ba583333f7cfe4d0b0c40a18ba5. Dec 15 03:28:16 localhost systemd[1]: Started /usr/bin/podman healthcheck run 97115ff7c5a84ae7e17f79e696e94da3c63f9fe0051287fe9193bec282a03f99. Dec 15 03:28:16 localhost systemd[1]: Started /usr/bin/podman healthcheck run ac45612fab357b615b4684dbfa5bd000dd29eb1adfbaedb6db0802fc49e89549. 
Dec 15 03:28:16 localhost systemd[1]: Started /usr/bin/podman healthcheck run d6041e5471624a495d5be42684f6049ccf71802a80d5760239330151d465d146. Dec 15 03:28:16 localhost podman[82503]: 2025-12-15 08:28:16.783496714 +0000 UTC m=+0.104947832 container health_status 36a88083ed613f6871d2631ae462bca308a45ba583333f7cfe4d0b0c40a18ba5 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, description=Red Hat OpenStack Platform 17.1 nova-compute, distribution-scope=public, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, config_id=tripleo_step5, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-type=git, io.buildah.version=1.41.4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-11-19T00:36:58Z, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '182e509007ab5e6e5b2500a552cbd5ba-879500e96bf8dfb93687004bd86f2317'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, release=1761123044, io.openshift.expose-services=, name=rhosp17/openstack-nova-compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, com.redhat.component=openstack-nova-compute-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, batch=17.1_20251118.1, tcib_managed=true, container_name=nova_compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.12) Dec 15 03:28:16 localhost podman[82503]: 2025-12-15 08:28:16.821460588 +0000 UTC m=+0.142911716 container exec_died 36a88083ed613f6871d2631ae462bca308a45ba583333f7cfe4d0b0c40a18ba5 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, com.redhat.component=openstack-nova-compute-container, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp17/openstack-nova-compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, batch=17.1_20251118.1, vendor=Red Hat, Inc., 
io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, build-date=2025-11-19T00:36:58Z, container_name=nova_compute, managed_by=tripleo_ansible, config_id=tripleo_step5, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-type=git, version=17.1.12, konflux.additional-tags=17.1.12 17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, distribution-scope=public, release=1761123044, io.openshift.expose-services=, io.buildah.version=1.41.4, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, url=https://www.redhat.com, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '182e509007ab5e6e5b2500a552cbd5ba-879500e96bf8dfb93687004bd86f2317'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', 
'/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, summary=Red Hat OpenStack Platform 17.1 nova-compute) Dec 15 03:28:16 localhost podman[82501]: 2025-12-15 08:28:16.834250849 +0000 UTC m=+0.158580334 container health_status 165a6fd1f2b629d16da0798ec1c9bac41d302d530a0ffa668b2315a52d6aa04e (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, version=17.1.12, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-type=git, konflux.additional-tags=17.1.12 17.1_20251118.1, name=rhosp17/openstack-collectd, url=https://www.redhat.com, architecture=x86_64, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, build-date=2025-11-18T22:51:28Z, com.redhat.component=openstack-collectd-container, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, io.buildah.version=1.41.4, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, release=1761123044, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, summary=Red Hat OpenStack Platform 17.1 collectd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, 
container_name=collectd, managed_by=tripleo_ansible, tcib_managed=true, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 collectd, batch=17.1_20251118.1, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, config_id=tripleo_step3) Dec 15 03:28:16 localhost systemd[1]: 36a88083ed613f6871d2631ae462bca308a45ba583333f7cfe4d0b0c40a18ba5.service: Deactivated successfully. 
Dec 15 03:28:16 localhost podman[82524]: 2025-12-15 08:28:16.889183415 +0000 UTC m=+0.195902680 container health_status d6041e5471624a495d5be42684f6049ccf71802a80d5760239330151d465d146 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, architecture=x86_64, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-ceilometer-compute-container, distribution-scope=public, batch=17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'fcee5a4a91f85471fca7b61211375646'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', 
'/var/log/containers/ceilometer:/var/log/ceilometer:z']}, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, container_name=ceilometer_agent_compute, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, config_id=tripleo_step4, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, io.openshift.expose-services=, name=rhosp17/openstack-ceilometer-compute, io.buildah.version=1.41.4, build-date=2025-11-19T00:11:48Z, version=17.1.12, release=1761123044, description=Red Hat OpenStack Platform 17.1 ceilometer-compute) Dec 15 03:28:16 localhost podman[82501]: 2025-12-15 08:28:16.899952043 +0000 UTC m=+0.224281528 container exec_died 165a6fd1f2b629d16da0798ec1c9bac41d302d530a0ffa668b2315a52d6aa04e (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, architecture=x86_64, container_name=collectd, version=17.1.12, com.redhat.component=openstack-collectd-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, name=rhosp17/openstack-collectd, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.expose-services=, batch=17.1_20251118.1, release=1761123044, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, build-date=2025-11-18T22:51:28Z, url=https://www.redhat.com, vcs-type=git, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, config_id=tripleo_step3, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, tcib_managed=true, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 collectd, summary=Red Hat OpenStack Platform 17.1 collectd) Dec 15 03:28:16 localhost systemd[1]: 165a6fd1f2b629d16da0798ec1c9bac41d302d530a0ffa668b2315a52d6aa04e.service: Deactivated successfully. 
Dec 15 03:28:16 localhost podman[82524]: 2025-12-15 08:28:16.945137378 +0000 UTC m=+0.251856613 container exec_died d6041e5471624a495d5be42684f6049ccf71802a80d5760239330151d465d146 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, vendor=Red Hat, Inc., container_name=ceilometer_agent_compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step4, konflux.additional-tags=17.1.12 17.1_20251118.1, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, io.openshift.expose-services=, tcib_managed=true, url=https://www.redhat.com, io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, name=rhosp17/openstack-ceilometer-compute, managed_by=tripleo_ansible, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, vcs-type=git, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, maintainer=OpenStack TripleO Team, version=17.1.12, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, release=1761123044, architecture=x86_64, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'fcee5a4a91f85471fca7b61211375646'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, com.redhat.component=openstack-ceilometer-compute-container, batch=17.1_20251118.1, build-date=2025-11-19T00:11:48Z, description=Red Hat OpenStack Platform 17.1 ceilometer-compute) Dec 15 03:28:16 localhost podman[82515]: 2025-12-15 08:28:16.800651902 +0000 UTC m=+0.107375767 container health_status ac45612fab357b615b4684dbfa5bd000dd29eb1adfbaedb6db0802fc49e89549 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, tcib_managed=true, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, vcs-type=git, config_id=tripleo_step4, com.redhat.component=openstack-cron-container, vendor=Red Hat, Inc., org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, konflux.additional-tags=17.1.12 17.1_20251118.1, container_name=logrotate_crond, name=rhosp17/openstack-cron, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 cron, distribution-scope=public, release=1761123044, io.openshift.expose-services=, io.buildah.version=1.41.4, description=Red Hat OpenStack Platform 17.1 cron, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, version=17.1.12, maintainer=OpenStack TripleO Team, batch=17.1_20251118.1, build-date=2025-11-18T22:49:32Z) Dec 15 03:28:16 localhost systemd[1]: d6041e5471624a495d5be42684f6049ccf71802a80d5760239330151d465d146.service: Deactivated successfully. 
Dec 15 03:28:16 localhost podman[82502]: 2025-12-15 08:28:16.986711658 +0000 UTC m=+0.309995315 container health_status 2649c3aec9af2f04509e1d248b1050b202fccc3ed2b86421c89a2a65376db057 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '182e509007ab5e6e5b2500a552cbd5ba'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, architecture=x86_64, container_name=iscsid, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, konflux.additional-tags=17.1.12 17.1_20251118.1, release=1761123044, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 iscsid, vendor=Red Hat, Inc., io.buildah.version=1.41.4, com.redhat.component=openstack-iscsid-container, 
org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, tcib_managed=true, url=https://www.redhat.com, config_id=tripleo_step3, description=Red Hat OpenStack Platform 17.1 iscsid, managed_by=tripleo_ansible, name=rhosp17/openstack-iscsid, maintainer=OpenStack TripleO Team, distribution-scope=public, version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, build-date=2025-11-18T23:44:13Z, batch=17.1_20251118.1, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d) Dec 15 03:28:16 localhost podman[82502]: 2025-12-15 08:28:16.996234502 +0000 UTC m=+0.319518159 container exec_died 2649c3aec9af2f04509e1d248b1050b202fccc3ed2b86421c89a2a65376db057 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-type=git, batch=17.1_20251118.1, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-11-18T23:44:13Z, container_name=iscsid, io.buildah.version=1.41.4, com.redhat.component=openstack-iscsid-container, architecture=x86_64, release=1761123044, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, url=https://www.redhat.com, config_id=tripleo_step3, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '182e509007ab5e6e5b2500a552cbd5ba'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': 
True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, description=Red Hat OpenStack Platform 17.1 iscsid, name=rhosp17/openstack-iscsid, io.openshift.expose-services=, version=17.1.12, distribution-scope=public, managed_by=tripleo_ansible, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 iscsid) Dec 15 03:28:17 localhost systemd[1]: 2649c3aec9af2f04509e1d248b1050b202fccc3ed2b86421c89a2a65376db057.service: Deactivated successfully. 
Dec 15 03:28:17 localhost podman[82504]: 2025-12-15 08:28:17.039724443 +0000 UTC m=+0.356257579 container health_status 97115ff7c5a84ae7e17f79e696e94da3c63f9fe0051287fe9193bec282a03f99 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, com.redhat.component=openstack-ceilometer-ipmi-container, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, container_name=ceilometer_agent_ipmi, name=rhosp17/openstack-ceilometer-ipmi, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, tcib_managed=true, batch=17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, url=https://www.redhat.com, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'fcee5a4a91f85471fca7b61211375646'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, vcs-type=git, config_id=tripleo_step4, io.buildah.version=1.41.4, 
io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-19T00:12:45Z, io.openshift.expose-services=, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1761123044, vendor=Red Hat, Inc., version=17.1.12, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, architecture=x86_64, konflux.additional-tags=17.1.12 17.1_20251118.1) Dec 15 03:28:17 localhost podman[82504]: 2025-12-15 08:28:17.071337617 +0000 UTC m=+0.387870793 container exec_died 97115ff7c5a84ae7e17f79e696e94da3c63f9fe0051287fe9193bec282a03f99 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, batch=17.1_20251118.1, release=1761123044, build-date=2025-11-19T00:12:45Z, managed_by=tripleo_ansible, vendor=Red Hat, Inc., url=https://www.redhat.com, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.buildah.version=1.41.4, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'fcee5a4a91f85471fca7b61211375646'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, version=17.1.12, name=rhosp17/openstack-ceilometer-ipmi, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, tcib_managed=true, konflux.additional-tags=17.1.12 17.1_20251118.1, config_id=tripleo_step4, io.openshift.expose-services=, container_name=ceilometer_agent_ipmi, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.component=openstack-ceilometer-ipmi-container) Dec 15 03:28:17 localhost systemd[1]: 97115ff7c5a84ae7e17f79e696e94da3c63f9fe0051287fe9193bec282a03f99.service: Deactivated successfully. 
Dec 15 03:28:17 localhost podman[82515]: 2025-12-15 08:28:17.089515831 +0000 UTC m=+0.396239726 container exec_died ac45612fab357b615b4684dbfa5bd000dd29eb1adfbaedb6db0802fc49e89549 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, batch=17.1_20251118.1, io.buildah.version=1.41.4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public, tcib_managed=true, release=1761123044, architecture=x86_64, config_id=tripleo_step4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.expose-services=, version=17.1.12, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-cron-container, url=https://www.redhat.com, build-date=2025-11-18T22:49:32Z, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, summary=Red Hat OpenStack Platform 17.1 cron, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, name=rhosp17/openstack-cron, container_name=logrotate_crond, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, vendor=Red Hat, Inc., config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, description=Red Hat OpenStack Platform 17.1 cron, konflux.additional-tags=17.1.12 17.1_20251118.1) Dec 15 03:28:17 localhost systemd[1]: ac45612fab357b615b4684dbfa5bd000dd29eb1adfbaedb6db0802fc49e89549.service: Deactivated successfully. Dec 15 03:28:19 localhost systemd[1]: Started /usr/bin/podman healthcheck run 4c3f871f4bdf287b1d3ce717956c1cf130617489217410eadf6fffcc440eb3b0. Dec 15 03:28:19 localhost podman[82639]: 2025-12-15 08:28:19.741377588 +0000 UTC m=+0.076430840 container health_status 4c3f871f4bdf287b1d3ce717956c1cf130617489217410eadf6fffcc440eb3b0 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, io.buildah.version=1.41.4, summary=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, build-date=2025-11-19T00:36:58Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '879500e96bf8dfb93687004bd86f2317'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, io.openshift.expose-services=, name=rhosp17/openstack-nova-compute, release=1761123044, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, distribution-scope=public, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, managed_by=tripleo_ansible, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20251118.1, config_id=tripleo_step4, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, maintainer=OpenStack TripleO Team, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, com.redhat.component=openstack-nova-compute-container, description=Red Hat OpenStack Platform 17.1 nova-compute, container_name=nova_migration_target, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, vendor=Red Hat, Inc., version=17.1.12) Dec 15 03:28:20 localhost podman[82639]: 2025-12-15 08:28:20.140167732 +0000 UTC m=+0.475220984 container exec_died 4c3f871f4bdf287b1d3ce717956c1cf130617489217410eadf6fffcc440eb3b0 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, description=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.12, io.buildah.version=1.41.4, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, url=https://www.redhat.com, architecture=x86_64, tcib_managed=true, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, container_name=nova_migration_target, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '879500e96bf8dfb93687004bd86f2317'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, name=rhosp17/openstack-nova-compute, com.redhat.component=openstack-nova-compute-container, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, vcs-type=git, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.expose-services=, vendor=Red Hat, Inc., batch=17.1_20251118.1, build-date=2025-11-19T00:36:58Z, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, 
vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, maintainer=OpenStack TripleO Team, release=1761123044, distribution-scope=public, config_id=tripleo_step4) Dec 15 03:28:20 localhost systemd[1]: 4c3f871f4bdf287b1d3ce717956c1cf130617489217410eadf6fffcc440eb3b0.service: Deactivated successfully. Dec 15 03:28:23 localhost systemd[1]: Started /usr/bin/podman healthcheck run 2df8d20c6ba32f38bd0e6519c9c77a4146b632f21c81197d8c319b223aad81f1. Dec 15 03:28:23 localhost systemd[1]: Started /usr/bin/podman healthcheck run 4d35b7dc3f3a4e3c60661e85d3511f50804e87244d74cab67af5c6e8c447d379. Dec 15 03:28:23 localhost podman[82663]: 2025-12-15 08:28:23.750542231 +0000 UTC m=+0.078931328 container health_status 4d35b7dc3f3a4e3c60661e85d3511f50804e87244d74cab67af5c6e8c447d379 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-type=git, konflux.additional-tags=17.1.12 17.1_20251118.1, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'a56a6f14b467cd9064e40c03defa5ed7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', 
'/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, name=rhosp17/openstack-neutron-metadata-agent-ovn, distribution-scope=public, config_id=tripleo_step4, io.openshift.expose-services=, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, tcib_managed=true, vendor=Red Hat, Inc., batch=17.1_20251118.1, container_name=ovn_metadata_agent, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, build-date=2025-11-19T00:14:25Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, version=17.1.12, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, release=1761123044, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn) Dec 15 03:28:23 localhost podman[82662]: 2025-12-15 08:28:23.812656298 +0000 UTC m=+0.142244247 container health_status 2df8d20c6ba32f38bd0e6519c9c77a4146b632f21c81197d8c319b223aad81f1 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, url=https://www.redhat.com, io.buildah.version=1.41.4, 
distribution-scope=public, architecture=x86_64, vcs-type=git, build-date=2025-11-18T23:34:05Z, batch=17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 ovn-controller, name=rhosp17/openstack-ovn-controller, managed_by=tripleo_ansible, container_name=ovn_controller, maintainer=OpenStack TripleO Team, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, config_id=tripleo_step4, com.redhat.component=openstack-ovn-controller-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, release=1761123044, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, version=17.1.12, summary=Red Hat OpenStack Platform 17.1 ovn-controller, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, tcib_managed=true) Dec 15 03:28:23 localhost podman[82663]: 2025-12-15 08:28:23.832634043 +0000 UTC m=+0.161023190 container exec_died 
4d35b7dc3f3a4e3c60661e85d3511f50804e87244d74cab67af5c6e8c447d379 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, tcib_managed=true, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, batch=17.1_20251118.1, config_id=tripleo_step4, release=1761123044, io.buildah.version=1.41.4, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, container_name=ovn_metadata_agent, vendor=Red Hat, Inc., config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'a56a6f14b467cd9064e40c03defa5ed7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', 
'/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, build-date=2025-11-19T00:14:25Z, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, name=rhosp17/openstack-neutron-metadata-agent-ovn, version=17.1.12, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.expose-services=, architecture=x86_64, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn) Dec 15 03:28:23 localhost systemd[1]: 4d35b7dc3f3a4e3c60661e85d3511f50804e87244d74cab67af5c6e8c447d379.service: Deactivated successfully. 
Dec 15 03:28:23 localhost podman[82662]: 2025-12-15 08:28:23.858285737 +0000 UTC m=+0.187873646 container exec_died 2df8d20c6ba32f38bd0e6519c9c77a4146b632f21c81197d8c319b223aad81f1 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, build-date=2025-11-18T23:34:05Z, managed_by=tripleo_ansible, vendor=Red Hat, Inc., konflux.additional-tags=17.1.12 17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.buildah.version=1.41.4, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 ovn-controller, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, container_name=ovn_controller, batch=17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, config_id=tripleo_step4, com.redhat.component=openstack-ovn-controller-container, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-type=git, release=1761123044, url=https://www.redhat.com, distribution-scope=public, version=17.1.12, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, name=rhosp17/openstack-ovn-controller, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64) Dec 15 03:28:23 localhost systemd[1]: 2df8d20c6ba32f38bd0e6519c9c77a4146b632f21c81197d8c319b223aad81f1.service: Deactivated successfully. Dec 15 03:28:32 localhost systemd[1]: Started /usr/bin/podman healthcheck run 6278d648aec9c9f12e0efaafa0d6ae5653eeb34459516699e562eeb9c8d23b3c. Dec 15 03:28:32 localhost systemd[1]: tmp-crun.y5MMPr.mount: Deactivated successfully. Dec 15 03:28:32 localhost podman[82709]: 2025-12-15 08:28:32.749391205 +0000 UTC m=+0.087363813 container health_status 6278d648aec9c9f12e0efaafa0d6ae5653eeb34459516699e562eeb9c8d23b3c (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, tcib_managed=true, name=rhosp17/openstack-qdrouterd, com.redhat.component=openstack-qdrouterd-container, description=Red Hat OpenStack Platform 17.1 qdrouterd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_id=tripleo_step1, vcs-type=git, managed_by=tripleo_ansible, url=https://www.redhat.com, container_name=metrics_qdr, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, version=17.1.12, 
architecture=x86_64, release=1761123044, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '29d07da103bbd44a9ed3e29999314b03'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, batch=17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 qdrouterd, build-date=2025-11-18T22:49:46Z, vendor=Red Hat, Inc.) 
Dec 15 03:28:32 localhost podman[82709]: 2025-12-15 08:28:32.946418693 +0000 UTC m=+0.284391281 container exec_died 6278d648aec9c9f12e0efaafa0d6ae5653eeb34459516699e562eeb9c8d23b3c (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, distribution-scope=public, vcs-type=git, build-date=2025-11-18T22:49:46Z, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, konflux.additional-tags=17.1.12 17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, container_name=metrics_qdr, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '29d07da103bbd44a9ed3e29999314b03'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, name=rhosp17/openstack-qdrouterd, 
com.redhat.component=openstack-qdrouterd-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, version=17.1.12, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, managed_by=tripleo_ansible, release=1761123044, summary=Red Hat OpenStack Platform 17.1 qdrouterd, tcib_managed=true, batch=17.1_20251118.1, config_id=tripleo_step1, io.openshift.expose-services=, io.buildah.version=1.41.4, description=Red Hat OpenStack Platform 17.1 qdrouterd) Dec 15 03:28:32 localhost systemd[1]: 6278d648aec9c9f12e0efaafa0d6ae5653eeb34459516699e562eeb9c8d23b3c.service: Deactivated successfully. Dec 15 03:28:47 localhost sshd[82738]: main: sshd: ssh-rsa algorithm is disabled Dec 15 03:28:47 localhost systemd[1]: Started /usr/bin/podman healthcheck run 165a6fd1f2b629d16da0798ec1c9bac41d302d530a0ffa668b2315a52d6aa04e. Dec 15 03:28:47 localhost systemd[1]: Started /usr/bin/podman healthcheck run 2649c3aec9af2f04509e1d248b1050b202fccc3ed2b86421c89a2a65376db057. Dec 15 03:28:47 localhost systemd[1]: Started /usr/bin/podman healthcheck run 36a88083ed613f6871d2631ae462bca308a45ba583333f7cfe4d0b0c40a18ba5. Dec 15 03:28:47 localhost systemd[1]: Started /usr/bin/podman healthcheck run 97115ff7c5a84ae7e17f79e696e94da3c63f9fe0051287fe9193bec282a03f99. Dec 15 03:28:47 localhost systemd[1]: Started /usr/bin/podman healthcheck run ac45612fab357b615b4684dbfa5bd000dd29eb1adfbaedb6db0802fc49e89549. Dec 15 03:28:47 localhost systemd[1]: Started /usr/bin/podman healthcheck run d6041e5471624a495d5be42684f6049ccf71802a80d5760239330151d465d146. 
Dec 15 03:28:47 localhost podman[82749]: 2025-12-15 08:28:47.794005076 +0000 UTC m=+0.097367340 container health_status ac45612fab357b615b4684dbfa5bd000dd29eb1adfbaedb6db0802fc49e89549 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, vendor=Red Hat, Inc., tcib_managed=true, container_name=logrotate_crond, summary=Red Hat OpenStack Platform 17.1 cron, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, io.openshift.expose-services=, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, managed_by=tripleo_ansible, url=https://www.redhat.com, release=1761123044, config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 cron, com.redhat.component=openstack-cron-container, io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, distribution-scope=public, maintainer=OpenStack TripleO Team, vcs-type=git, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, architecture=x86_64, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, build-date=2025-11-18T22:49:32Z, version=17.1.12, konflux.additional-tags=17.1.12 17.1_20251118.1, batch=17.1_20251118.1, name=rhosp17/openstack-cron) Dec 15 03:28:47 localhost podman[82742]: 2025-12-15 08:28:47.833708755 +0000 UTC m=+0.145107894 container health_status 36a88083ed613f6871d2631ae462bca308a45ba583333f7cfe4d0b0c40a18ba5 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.component=openstack-nova-compute-container, distribution-scope=public, io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp17/openstack-nova-compute, vcs-type=git, batch=17.1_20251118.1, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, container_name=nova_compute, build-date=2025-11-19T00:36:58Z, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, managed_by=tripleo_ansible, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '182e509007ab5e6e5b2500a552cbd5ba-879500e96bf8dfb93687004bd86f2317'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 
'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, architecture=x86_64, version=17.1.12, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1761123044, config_id=tripleo_step5, io.openshift.expose-services=, tcib_managed=true, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute) Dec 15 03:28:47 localhost podman[82748]: 2025-12-15 08:28:47.834136737 +0000 UTC m=+0.141786565 container health_status 
97115ff7c5a84ae7e17f79e696e94da3c63f9fe0051287fe9193bec282a03f99 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'fcee5a4a91f85471fca7b61211375646'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, version=17.1.12, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, architecture=x86_64, batch=17.1_20251118.1, io.buildah.version=1.41.4, tcib_managed=true, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, com.redhat.component=openstack-ceilometer-ipmi-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, 
io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, build-date=2025-11-19T00:12:45Z, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, url=https://www.redhat.com, name=rhosp17/openstack-ceilometer-ipmi, container_name=ceilometer_agent_ipmi, release=1761123044, config_id=tripleo_step4, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, maintainer=OpenStack TripleO Team) Dec 15 03:28:47 localhost podman[82741]: 2025-12-15 08:28:47.884882591 +0000 UTC m=+0.197350648 container health_status 2649c3aec9af2f04509e1d248b1050b202fccc3ed2b86421c89a2a65376db057 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, name=rhosp17/openstack-iscsid, url=https://www.redhat.com, architecture=x86_64, batch=17.1_20251118.1, tcib_managed=true, io.buildah.version=1.41.4, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, konflux.additional-tags=17.1.12 17.1_20251118.1, container_name=iscsid, description=Red Hat OpenStack Platform 17.1 iscsid, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, release=1761123044, vcs-type=git, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '182e509007ab5e6e5b2500a552cbd5ba'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, config_id=tripleo_step3, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, summary=Red Hat OpenStack Platform 17.1 iscsid, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, version=17.1.12, distribution-scope=public, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-18T23:44:13Z, com.redhat.component=openstack-iscsid-container) Dec 15 03:28:47 localhost podman[82742]: 2025-12-15 08:28:47.887503801 +0000 UTC m=+0.198902930 container exec_died 36a88083ed613f6871d2631ae462bca308a45ba583333f7cfe4d0b0c40a18ba5 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, container_name=nova_compute, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, distribution-scope=public, vcs-type=git, description=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp17/openstack-nova-compute, architecture=x86_64, io.buildah.version=1.41.4, io.openshift.expose-services=, 
version=17.1.12, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '182e509007ab5e6e5b2500a552cbd5ba-879500e96bf8dfb93687004bd86f2317'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, batch=17.1_20251118.1, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, release=1761123044, managed_by=tripleo_ansible, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, 
build-date=2025-11-19T00:36:58Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, com.redhat.component=openstack-nova-compute-container, konflux.additional-tags=17.1.12 17.1_20251118.1, config_id=tripleo_step5, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, tcib_managed=true) Dec 15 03:28:47 localhost systemd[1]: 36a88083ed613f6871d2631ae462bca308a45ba583333f7cfe4d0b0c40a18ba5.service: Deactivated successfully. Dec 15 03:28:47 localhost podman[82754]: 2025-12-15 08:28:47.944597015 +0000 UTC m=+0.241722613 container health_status d6041e5471624a495d5be42684f6049ccf71802a80d5760239330151d465d146 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, tcib_managed=true, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, release=1761123044, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-type=git, version=17.1.12, build-date=2025-11-19T00:11:48Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, container_name=ceilometer_agent_compute, name=rhosp17/openstack-ceilometer-compute, architecture=x86_64, io.openshift.expose-services=, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, config_id=tripleo_step4, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.buildah.version=1.41.4, 
com.redhat.component=openstack-ceilometer-compute-container, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'fcee5a4a91f85471fca7b61211375646'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI) Dec 15 03:28:47 localhost podman[82741]: 2025-12-15 08:28:47.966241502 +0000 UTC m=+0.278709539 container exec_died 2649c3aec9af2f04509e1d248b1050b202fccc3ed2b86421c89a2a65376db057 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, vcs-type=git, version=17.1.12, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '182e509007ab5e6e5b2500a552cbd5ba'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': 
['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, url=https://www.redhat.com, release=1761123044, container_name=iscsid, io.openshift.expose-services=, build-date=2025-11-18T23:44:13Z, distribution-scope=public, managed_by=tripleo_ansible, batch=17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 iscsid, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.component=openstack-iscsid-container, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, tcib_managed=true, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, summary=Red Hat OpenStack Platform 17.1 iscsid, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, name=rhosp17/openstack-iscsid, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.12 17.1_20251118.1, config_id=tripleo_step3, io.buildah.version=1.41.4, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05) Dec 15 03:28:47 localhost podman[82748]: 2025-12-15 08:28:47.968513723 +0000 UTC m=+0.276163511 container exec_died 
97115ff7c5a84ae7e17f79e696e94da3c63f9fe0051287fe9193bec282a03f99 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, release=1761123044, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, tcib_managed=true, name=rhosp17/openstack-ceilometer-ipmi, batch=17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'fcee5a4a91f85471fca7b61211375646'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, config_id=tripleo_step4, io.buildah.version=1.41.4, container_name=ceilometer_agent_ipmi, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, architecture=x86_64, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, vcs-type=git, managed_by=tripleo_ansible, version=17.1.12, distribution-scope=public, build-date=2025-11-19T00:12:45Z, com.redhat.component=openstack-ceilometer-ipmi-container, url=https://www.redhat.com, vendor=Red Hat, Inc.) Dec 15 03:28:47 localhost systemd[1]: 2649c3aec9af2f04509e1d248b1050b202fccc3ed2b86421c89a2a65376db057.service: Deactivated successfully. Dec 15 03:28:47 localhost podman[82749]: 2025-12-15 08:28:47.987458499 +0000 UTC m=+0.290820803 container exec_died ac45612fab357b615b4684dbfa5bd000dd29eb1adfbaedb6db0802fc49e89549 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, release=1761123044, vcs-type=git, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_id=tripleo_step4, io.buildah.version=1.41.4, url=https://www.redhat.com, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, build-date=2025-11-18T22:49:32Z, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, summary=Red Hat OpenStack Platform 17.1 cron, architecture=x86_64, batch=17.1_20251118.1, container_name=logrotate_crond, distribution-scope=public, com.redhat.component=openstack-cron-container, version=17.1.12, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 cron, name=rhosp17/openstack-cron, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team) Dec 15 03:28:48 localhost podman[82754]: 2025-12-15 08:28:48.004484903 +0000 UTC m=+0.301610461 container exec_died d6041e5471624a495d5be42684f6049ccf71802a80d5760239330151d465d146 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, url=https://www.redhat.com, version=17.1.12, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, com.redhat.component=openstack-ceilometer-compute-container, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vendor=Red Hat, Inc., io.buildah.version=1.41.4, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': 
{'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'fcee5a4a91f85471fca7b61211375646'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, tcib_managed=true, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, config_id=tripleo_step4, vcs-type=git, name=rhosp17/openstack-ceilometer-compute, batch=17.1_20251118.1, managed_by=tripleo_ansible, build-date=2025-11-19T00:11:48Z, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, release=1761123044, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, container_name=ceilometer_agent_compute, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, architecture=x86_64, konflux.additional-tags=17.1.12 17.1_20251118.1) Dec 15 03:28:48 localhost systemd[1]: 
ac45612fab357b615b4684dbfa5bd000dd29eb1adfbaedb6db0802fc49e89549.service: Deactivated successfully. Dec 15 03:28:48 localhost systemd[1]: d6041e5471624a495d5be42684f6049ccf71802a80d5760239330151d465d146.service: Deactivated successfully. Dec 15 03:28:48 localhost podman[82740]: 2025-12-15 08:28:47.765069754 +0000 UTC m=+0.084575859 container health_status 165a6fd1f2b629d16da0798ec1c9bac41d302d530a0ffa668b2315a52d6aa04e (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, config_id=tripleo_step3, url=https://www.redhat.com, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 collectd, build-date=2025-11-18T22:51:28Z, name=rhosp17/openstack-collectd, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, io.buildah.version=1.41.4, konflux.additional-tags=17.1.12 17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, version=17.1.12, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.component=openstack-collectd-container, architecture=x86_64, release=1761123044, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, batch=17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, summary=Red Hat OpenStack Platform 17.1 collectd, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 
'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, container_name=collectd, tcib_managed=true) Dec 15 03:28:48 localhost podman[82740]: 2025-12-15 08:28:48.046449113 +0000 UTC m=+0.365955268 container exec_died 165a6fd1f2b629d16da0798ec1c9bac41d302d530a0ffa668b2315a52d6aa04e (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, maintainer=OpenStack TripleO Team, version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, io.openshift.expose-services=, distribution-scope=public, com.redhat.component=openstack-collectd-container, description=Red Hat OpenStack Platform 17.1 collectd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 17.1_20251118.1, name=rhosp17/openstack-collectd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_id=tripleo_step3, container_name=collectd, tcib_managed=true, vendor=Red Hat, 
Inc., managed_by=tripleo_ansible, architecture=x86_64, batch=17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, vcs-type=git, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, build-date=2025-11-18T22:51:28Z, url=https://www.redhat.com, release=1761123044) Dec 15 03:28:48 localhost systemd[1]: 165a6fd1f2b629d16da0798ec1c9bac41d302d530a0ffa668b2315a52d6aa04e.service: Deactivated successfully. 
Dec 15 03:28:48 localhost systemd[1]: 97115ff7c5a84ae7e17f79e696e94da3c63f9fe0051287fe9193bec282a03f99.service: Deactivated successfully. Dec 15 03:28:50 localhost sshd[82872]: main: sshd: ssh-rsa algorithm is disabled Dec 15 03:28:50 localhost systemd[1]: Started /usr/bin/podman healthcheck run 4c3f871f4bdf287b1d3ce717956c1cf130617489217410eadf6fffcc440eb3b0. Dec 15 03:28:50 localhost systemd[1]: tmp-crun.UfOCqg.mount: Deactivated successfully. Dec 15 03:28:50 localhost podman[82874]: 2025-12-15 08:28:50.729446751 +0000 UTC m=+0.066207199 container health_status 4c3f871f4bdf287b1d3ce717956c1cf130617489217410eadf6fffcc440eb3b0 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_id=tripleo_step4, build-date=2025-11-19T00:36:58Z, url=https://www.redhat.com, version=17.1.12, tcib_managed=true, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, release=1761123044, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '879500e96bf8dfb93687004bd86f2317'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', 
'/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, container_name=nova_migration_target, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, name=rhosp17/openstack-nova-compute, architecture=x86_64, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, io.buildah.version=1.41.4, summary=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.component=openstack-nova-compute-container, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, batch=17.1_20251118.1) Dec 15 03:28:51 localhost podman[82874]: 2025-12-15 08:28:51.075536717 +0000 UTC m=+0.412297105 container exec_died 4c3f871f4bdf287b1d3ce717956c1cf130617489217410eadf6fffcc440eb3b0 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, konflux.additional-tags=17.1.12 17.1_20251118.1, name=rhosp17/openstack-nova-compute, io.buildah.version=1.41.4, com.redhat.component=openstack-nova-compute-container, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '879500e96bf8dfb93687004bd86f2317'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, description=Red Hat OpenStack Platform 17.1 nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step4, build-date=2025-11-19T00:36:58Z, release=1761123044, architecture=x86_64, container_name=nova_migration_target, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public, tcib_managed=true, vendor=Red Hat, Inc., batch=17.1_20251118.1, vcs-type=git, version=17.1.12) Dec 15 03:28:51 localhost systemd[1]: 4c3f871f4bdf287b1d3ce717956c1cf130617489217410eadf6fffcc440eb3b0.service: Deactivated successfully. 
Dec 15 03:28:52 localhost sshd[82895]: main: sshd: ssh-rsa algorithm is disabled Dec 15 03:28:53 localhost sshd[82897]: main: sshd: ssh-rsa algorithm is disabled Dec 15 03:28:54 localhost systemd[1]: session-34.scope: Deactivated successfully. Dec 15 03:28:54 localhost systemd[1]: session-34.scope: Consumed 19.383s CPU time. Dec 15 03:28:54 localhost systemd-logind[763]: Session 34 logged out. Waiting for processes to exit. Dec 15 03:28:54 localhost systemd[1]: Started /usr/bin/podman healthcheck run 2df8d20c6ba32f38bd0e6519c9c77a4146b632f21c81197d8c319b223aad81f1. Dec 15 03:28:54 localhost systemd[1]: Started /usr/bin/podman healthcheck run 4d35b7dc3f3a4e3c60661e85d3511f50804e87244d74cab67af5c6e8c447d379. Dec 15 03:28:54 localhost systemd-logind[763]: Removed session 34. Dec 15 03:28:54 localhost systemd[1]: tmp-crun.cb6CY8.mount: Deactivated successfully. Dec 15 03:28:54 localhost podman[82900]: 2025-12-15 08:28:54.158258934 +0000 UTC m=+0.089897540 container health_status 2df8d20c6ba32f38bd0e6519c9c77a4146b632f21c81197d8c319b223aad81f1 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, name=rhosp17/openstack-ovn-controller, com.redhat.component=openstack-ovn-controller-container, tcib_managed=true, konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, version=17.1.12, summary=Red Hat OpenStack Platform 17.1 ovn-controller, container_name=ovn_controller, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 ovn-controller, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-type=git, io.buildah.version=1.41.4, architecture=x86_64, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, release=1761123044, config_id=tripleo_step4, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20251118.1, build-date=2025-11-18T23:34:05Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, url=https://www.redhat.com, distribution-scope=public) Dec 15 03:28:54 localhost podman[82900]: 2025-12-15 08:28:54.207058786 +0000 UTC m=+0.138697412 container exec_died 2df8d20c6ba32f38bd0e6519c9c77a4146b632f21c81197d8c319b223aad81f1 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, architecture=x86_64, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, release=1761123044, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 ovn-controller, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, container_name=ovn_controller, version=17.1.12, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 ovn-controller, batch=17.1_20251118.1, 
org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, io.buildah.version=1.41.4, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team, vcs-type=git, name=rhosp17/openstack-ovn-controller, com.redhat.component=openstack-ovn-controller-container, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, managed_by=tripleo_ansible, config_id=tripleo_step4, build-date=2025-11-18T23:34:05Z, vendor=Red Hat, Inc.) 
Dec 15 03:28:54 localhost podman[82901]: 2025-12-15 08:28:54.204089827 +0000 UTC m=+0.132779665 container health_status 4d35b7dc3f3a4e3c60661e85d3511f50804e87244d74cab67af5c6e8c447d379 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.4, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, config_id=tripleo_step4, build-date=2025-11-19T00:14:25Z, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, tcib_managed=true, batch=17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, container_name=ovn_metadata_agent, konflux.additional-tags=17.1.12 17.1_20251118.1, version=17.1.12, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'a56a6f14b467cd9064e40c03defa5ed7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, release=1761123044, distribution-scope=public, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, url=https://www.redhat.com, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.expose-services=, vcs-type=git, name=rhosp17/openstack-neutron-metadata-agent-ovn, vendor=Red Hat, Inc.) Dec 15 03:28:54 localhost systemd[1]: 2df8d20c6ba32f38bd0e6519c9c77a4146b632f21c81197d8c319b223aad81f1.service: Deactivated successfully. 
Dec 15 03:28:54 localhost podman[82901]: 2025-12-15 08:28:54.286685002 +0000 UTC m=+0.215374890 container exec_died 4d35b7dc3f3a4e3c60661e85d3511f50804e87244d74cab67af5c6e8c447d379 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, version=17.1.12, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, io.openshift.expose-services=, tcib_managed=true, container_name=ovn_metadata_agent, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-type=git, url=https://www.redhat.com, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, name=rhosp17/openstack-neutron-metadata-agent-ovn, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, config_id=tripleo_step4, release=1761123044, build-date=2025-11-19T00:14:25Z, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, io.buildah.version=1.41.4, distribution-scope=public, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'a56a6f14b467cd9064e40c03defa5ed7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn) Dec 15 03:28:54 localhost systemd[1]: 4d35b7dc3f3a4e3c60661e85d3511f50804e87244d74cab67af5c6e8c447d379.service: Deactivated successfully. Dec 15 03:28:56 localhost sshd[82947]: main: sshd: ssh-rsa algorithm is disabled Dec 15 03:28:58 localhost sshd[82949]: main: sshd: ssh-rsa algorithm is disabled Dec 15 03:29:01 localhost sshd[82951]: main: sshd: ssh-rsa algorithm is disabled Dec 15 03:29:03 localhost systemd[1]: Started /usr/bin/podman healthcheck run 6278d648aec9c9f12e0efaafa0d6ae5653eeb34459516699e562eeb9c8d23b3c. 
Dec 15 03:29:03 localhost podman[82953]: 2025-12-15 08:29:03.752909709 +0000 UTC m=+0.091084864 container health_status 6278d648aec9c9f12e0efaafa0d6ae5653eeb34459516699e562eeb9c8d23b3c (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, url=https://www.redhat.com, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.buildah.version=1.41.4, name=rhosp17/openstack-qdrouterd, config_id=tripleo_step1, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.component=openstack-qdrouterd-container, maintainer=OpenStack TripleO Team, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '29d07da103bbd44a9ed3e29999314b03'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, version=17.1.12, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, 
org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, distribution-scope=public, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, architecture=x86_64, release=1761123044, container_name=metrics_qdr, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, description=Red Hat OpenStack Platform 17.1 qdrouterd, vendor=Red Hat, Inc., batch=17.1_20251118.1, managed_by=tripleo_ansible, build-date=2025-11-18T22:49:46Z, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-type=git) Dec 15 03:29:03 localhost podman[82953]: 2025-12-15 08:29:03.981622203 +0000 UTC m=+0.319797298 container exec_died 6278d648aec9c9f12e0efaafa0d6ae5653eeb34459516699e562eeb9c8d23b3c (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, name=rhosp17/openstack-qdrouterd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 qdrouterd, maintainer=OpenStack TripleO Team, batch=17.1_20251118.1, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.expose-services=, url=https://www.redhat.com, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, description=Red Hat OpenStack Platform 17.1 qdrouterd, vendor=Red Hat, Inc., build-date=2025-11-18T22:49:46Z, container_name=metrics_qdr, version=17.1.12, com.redhat.component=openstack-qdrouterd-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, distribution-scope=public, io.buildah.version=1.41.4, konflux.additional-tags=17.1.12 17.1_20251118.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 
'TRIPLEO_CONFIG_HASH': '29d07da103bbd44a9ed3e29999314b03'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, tcib_managed=true, vcs-type=git, config_id=tripleo_step1, release=1761123044, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd) Dec 15 03:29:03 localhost systemd[1]: 6278d648aec9c9f12e0efaafa0d6ae5653eeb34459516699e562eeb9c8d23b3c.service: Deactivated successfully. Dec 15 03:29:05 localhost sshd[82982]: main: sshd: ssh-rsa algorithm is disabled Dec 15 03:29:08 localhost sshd[82984]: main: sshd: ssh-rsa algorithm is disabled Dec 15 03:29:12 localhost sshd[83006]: main: sshd: ssh-rsa algorithm is disabled Dec 15 03:29:15 localhost sshd[83111]: main: sshd: ssh-rsa algorithm is disabled Dec 15 03:29:17 localhost sshd[83113]: main: sshd: ssh-rsa algorithm is disabled Dec 15 03:29:18 localhost systemd[1]: Started /usr/bin/podman healthcheck run 165a6fd1f2b629d16da0798ec1c9bac41d302d530a0ffa668b2315a52d6aa04e. 
Dec 15 03:29:18 localhost systemd[1]: Started /usr/bin/podman healthcheck run 2649c3aec9af2f04509e1d248b1050b202fccc3ed2b86421c89a2a65376db057. Dec 15 03:29:18 localhost systemd[1]: Started /usr/bin/podman healthcheck run 36a88083ed613f6871d2631ae462bca308a45ba583333f7cfe4d0b0c40a18ba5. Dec 15 03:29:18 localhost systemd[1]: Started /usr/bin/podman healthcheck run 97115ff7c5a84ae7e17f79e696e94da3c63f9fe0051287fe9193bec282a03f99. Dec 15 03:29:18 localhost systemd[1]: Started /usr/bin/podman healthcheck run ac45612fab357b615b4684dbfa5bd000dd29eb1adfbaedb6db0802fc49e89549. Dec 15 03:29:18 localhost systemd[1]: Started /usr/bin/podman healthcheck run d6041e5471624a495d5be42684f6049ccf71802a80d5760239330151d465d146. Dec 15 03:29:18 localhost podman[83116]: 2025-12-15 08:29:18.821502371 +0000 UTC m=+0.146025918 container health_status 2649c3aec9af2f04509e1d248b1050b202fccc3ed2b86421c89a2a65376db057 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.expose-services=, managed_by=tripleo_ansible, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, konflux.additional-tags=17.1.12 17.1_20251118.1, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, config_id=tripleo_step3, summary=Red Hat OpenStack Platform 17.1 iscsid, maintainer=OpenStack TripleO Team, batch=17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, build-date=2025-11-18T23:44:13Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., name=rhosp17/openstack-iscsid, description=Red Hat OpenStack Platform 17.1 iscsid, release=1761123044, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, 
distribution-scope=public, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '182e509007ab5e6e5b2500a552cbd5ba'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, container_name=iscsid, com.redhat.component=openstack-iscsid-container, io.buildah.version=1.41.4, tcib_managed=true, vcs-type=git) Dec 15 03:29:18 localhost podman[83126]: 2025-12-15 08:29:18.784423061 +0000 UTC m=+0.099142287 container health_status d6041e5471624a495d5be42684f6049ccf71802a80d5760239330151d465d146 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, io.openshift.expose-services=, vcs-type=git, konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, 
io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, architecture=x86_64, distribution-scope=public, vendor=Red Hat, Inc., release=1761123044, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'fcee5a4a91f85471fca7b61211375646'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, batch=17.1_20251118.1, url=https://www.redhat.com, tcib_managed=true, container_name=ceilometer_agent_compute, io.buildah.version=1.41.4, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, config_id=tripleo_step4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-ceilometer-compute, version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.component=openstack-ceilometer-compute-container, build-date=2025-11-19T00:11:48Z, 
vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676) Dec 15 03:29:18 localhost podman[83115]: 2025-12-15 08:29:18.749497169 +0000 UTC m=+0.077156910 container health_status 165a6fd1f2b629d16da0798ec1c9bac41d302d530a0ffa668b2315a52d6aa04e (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, io.buildah.version=1.41.4, version=17.1.12, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, build-date=2025-11-18T22:51:28Z, vendor=Red Hat, Inc., io.openshift.expose-services=, tcib_managed=true, com.redhat.component=openstack-collectd-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step3, release=1761123044, summary=Red Hat OpenStack Platform 17.1 collectd, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, maintainer=OpenStack TripleO Team, container_name=collectd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, managed_by=tripleo_ansible, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, name=rhosp17/openstack-collectd, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 collectd, batch=17.1_20251118.1, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, vcs-type=git, konflux.additional-tags=17.1.12 17.1_20251118.1) Dec 15 03:29:18 localhost podman[83118]: 2025-12-15 08:29:18.80420678 +0000 UTC m=+0.125308066 container health_status 97115ff7c5a84ae7e17f79e696e94da3c63f9fe0051287fe9193bec282a03f99 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, io.openshift.expose-services=, distribution-scope=public, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, release=1761123044, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'fcee5a4a91f85471fca7b61211375646'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': 
True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, build-date=2025-11-19T00:12:45Z, name=rhosp17/openstack-ceilometer-ipmi, version=17.1.12, container_name=ceilometer_agent_ipmi, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, tcib_managed=true, com.redhat.component=openstack-ceilometer-ipmi-container, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, architecture=x86_64, io.buildah.version=1.41.4, batch=17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, config_id=tripleo_step4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, konflux.additional-tags=17.1.12 17.1_20251118.1) Dec 15 03:29:18 localhost podman[83116]: 2025-12-15 08:29:18.858626782 +0000 UTC m=+0.183150339 container exec_died 2649c3aec9af2f04509e1d248b1050b202fccc3ed2b86421c89a2a65376db057 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, description=Red Hat OpenStack Platform 17.1 iscsid, tcib_managed=true, batch=17.1_20251118.1, name=rhosp17/openstack-iscsid, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 iscsid, url=https://www.redhat.com, 
build-date=2025-11-18T23:44:13Z, architecture=x86_64, container_name=iscsid, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.expose-services=, com.redhat.component=openstack-iscsid-container, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, vcs-type=git, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, release=1761123044, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '182e509007ab5e6e5b2500a552cbd5ba'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, io.buildah.version=1.41.4, maintainer=OpenStack TripleO Team, version=17.1.12, config_id=tripleo_step3, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, 
distribution-scope=public, konflux.additional-tags=17.1.12 17.1_20251118.1) Dec 15 03:29:18 localhost podman[83126]: 2025-12-15 08:29:18.867410706 +0000 UTC m=+0.182129962 container exec_died d6041e5471624a495d5be42684f6049ccf71802a80d5760239330151d465d146 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, io.buildah.version=1.41.4, architecture=x86_64, batch=17.1_20251118.1, distribution-scope=public, version=17.1.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, release=1761123044, vcs-type=git, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.component=openstack-ceilometer-compute-container, build-date=2025-11-19T00:11:48Z, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'fcee5a4a91f85471fca7b61211375646'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, url=https://www.redhat.com, name=rhosp17/openstack-ceilometer-compute, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=ceilometer_agent_compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, config_id=tripleo_step4) Dec 15 03:29:18 localhost systemd[1]: 2649c3aec9af2f04509e1d248b1050b202fccc3ed2b86421c89a2a65376db057.service: Deactivated successfully. Dec 15 03:29:18 localhost podman[83115]: 2025-12-15 08:29:18.883382283 +0000 UTC m=+0.211042064 container exec_died 165a6fd1f2b629d16da0798ec1c9bac41d302d530a0ffa668b2315a52d6aa04e (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, summary=Red Hat OpenStack Platform 17.1 collectd, container_name=collectd, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_id=tripleo_step3, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, description=Red Hat OpenStack Platform 17.1 collectd, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-collectd, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 
'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, konflux.additional-tags=17.1.12 17.1_20251118.1, build-date=2025-11-18T22:51:28Z, url=https://www.redhat.com, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, version=17.1.12, com.redhat.component=openstack-collectd-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1761123044, vcs-type=git, io.openshift.expose-services=, tcib_managed=true, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, batch=17.1_20251118.1, io.buildah.version=1.41.4, vendor=Red Hat, Inc.) 
Dec 15 03:29:18 localhost podman[83118]: 2025-12-15 08:29:18.883751023 +0000 UTC m=+0.204852319 container exec_died 97115ff7c5a84ae7e17f79e696e94da3c63f9fe0051287fe9193bec282a03f99 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.buildah.version=1.41.4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'fcee5a4a91f85471fca7b61211375646'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.expose-services=, name=rhosp17/openstack-ceilometer-ipmi, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, release=1761123044, build-date=2025-11-19T00:12:45Z, container_name=ceilometer_agent_ipmi, vcs-type=git, batch=17.1_20251118.1, konflux.additional-tags=17.1.12 17.1_20251118.1, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 
openstack-ceilometer-ipmi, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, architecture=x86_64, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, version=17.1.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.component=openstack-ceilometer-ipmi-container, managed_by=tripleo_ansible) Dec 15 03:29:18 localhost systemd[1]: d6041e5471624a495d5be42684f6049ccf71802a80d5760239330151d465d146.service: Deactivated successfully. Dec 15 03:29:18 localhost systemd[1]: 97115ff7c5a84ae7e17f79e696e94da3c63f9fe0051287fe9193bec282a03f99.service: Deactivated successfully. Dec 15 03:29:18 localhost podman[83119]: 2025-12-15 08:29:18.867232331 +0000 UTC m=+0.183387955 container health_status ac45612fab357b615b4684dbfa5bd000dd29eb1adfbaedb6db0802fc49e89549 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.buildah.version=1.41.4, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-cron-container, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 cron, architecture=x86_64, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, description=Red Hat OpenStack Platform 17.1 cron, container_name=logrotate_crond, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, tcib_managed=true, build-date=2025-11-18T22:49:32Z, url=https://www.redhat.com, config_id=tripleo_step4, distribution-scope=public, version=17.1.12, batch=17.1_20251118.1, name=rhosp17/openstack-cron, summary=Red Hat OpenStack Platform 17.1 cron, release=1761123044) Dec 15 03:29:18 localhost podman[83117]: 2025-12-15 08:29:18.932946345 +0000 UTC m=+0.253236719 container health_status 36a88083ed613f6871d2631ae462bca308a45ba583333f7cfe4d0b0c40a18ba5 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, build-date=2025-11-19T00:36:58Z, distribution-scope=public, vendor=Red Hat, Inc., name=rhosp17/openstack-nova-compute, version=17.1.12, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, 
batch=17.1_20251118.1, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '182e509007ab5e6e5b2500a552cbd5ba-879500e96bf8dfb93687004bd86f2317'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, summary=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 
17.1 nova-compute, container_name=nova_compute, io.buildah.version=1.41.4, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-nova-compute-container, maintainer=OpenStack TripleO Team, release=1761123044, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, managed_by=tripleo_ansible, io.openshift.expose-services=, config_id=tripleo_step5) Dec 15 03:29:18 localhost systemd[1]: 165a6fd1f2b629d16da0798ec1c9bac41d302d530a0ffa668b2315a52d6aa04e.service: Deactivated successfully. Dec 15 03:29:18 localhost podman[83119]: 2025-12-15 08:29:18.946124757 +0000 UTC m=+0.262280391 container exec_died ac45612fab357b615b4684dbfa5bd000dd29eb1adfbaedb6db0802fc49e89549 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, config_id=tripleo_step4, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vendor=Red Hat, Inc., io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, konflux.additional-tags=17.1.12 17.1_20251118.1, name=rhosp17/openstack-cron, release=1761123044, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, summary=Red Hat OpenStack Platform 17.1 cron, container_name=logrotate_crond, vcs-type=git, distribution-scope=public, io.openshift.expose-services=, architecture=x86_64, 
com.redhat.component=openstack-cron-container, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, description=Red Hat OpenStack Platform 17.1 cron, build-date=2025-11-18T22:49:32Z, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, version=17.1.12, batch=17.1_20251118.1, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, managed_by=tripleo_ansible, tcib_managed=true) Dec 15 03:29:18 localhost systemd[1]: ac45612fab357b615b4684dbfa5bd000dd29eb1adfbaedb6db0802fc49e89549.service: Deactivated successfully. 
Dec 15 03:29:19 localhost podman[83117]: 2025-12-15 08:29:19.014163693 +0000 UTC m=+0.334454057 container exec_died 36a88083ed613f6871d2631ae462bca308a45ba583333f7cfe4d0b0c40a18ba5 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '182e509007ab5e6e5b2500a552cbd5ba-879500e96bf8dfb93687004bd86f2317'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, 
name=rhosp17/openstack-nova-compute, release=1761123044, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.component=openstack-nova-compute-container, vcs-type=git, description=Red Hat OpenStack Platform 17.1 nova-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 17.1_20251118.1, io.buildah.version=1.41.4, url=https://www.redhat.com, batch=17.1_20251118.1, config_id=tripleo_step5, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 nova-compute, container_name=nova_compute, tcib_managed=true, architecture=x86_64, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, version=17.1.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2025-11-19T00:36:58Z, distribution-scope=public, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05) Dec 15 03:29:19 localhost systemd[1]: 36a88083ed613f6871d2631ae462bca308a45ba583333f7cfe4d0b0c40a18ba5.service: Deactivated successfully. Dec 15 03:29:21 localhost sshd[83253]: main: sshd: ssh-rsa algorithm is disabled Dec 15 03:29:21 localhost systemd[1]: Started /usr/bin/podman healthcheck run 4c3f871f4bdf287b1d3ce717956c1cf130617489217410eadf6fffcc440eb3b0. Dec 15 03:29:21 localhost systemd[1]: tmp-crun.h7XxaN.mount: Deactivated successfully. 
Dec 15 03:29:21 localhost podman[83255]: 2025-12-15 08:29:21.758874347 +0000 UTC m=+0.089079479 container health_status 4c3f871f4bdf287b1d3ce717956c1cf130617489217410eadf6fffcc440eb3b0 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, release=1761123044, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, com.redhat.component=openstack-nova-compute-container, container_name=nova_migration_target, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, version=17.1.12, build-date=2025-11-19T00:36:58Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, summary=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp17/openstack-nova-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, config_id=tripleo_step4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, url=https://www.redhat.com, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '879500e96bf8dfb93687004bd86f2317'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, batch=17.1_20251118.1, vendor=Red Hat, Inc., architecture=x86_64, description=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.41.4, distribution-scope=public, tcib_managed=true, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d) Dec 15 03:29:22 localhost podman[83255]: 2025-12-15 08:29:22.154564719 +0000 UTC m=+0.484769861 container exec_died 4c3f871f4bdf287b1d3ce717956c1cf130617489217410eadf6fffcc440eb3b0 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, version=17.1.12, batch=17.1_20251118.1, com.redhat.component=openstack-nova-compute-container, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '879500e96bf8dfb93687004bd86f2317'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', 
'/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, url=https://www.redhat.com, release=1761123044, managed_by=tripleo_ansible, container_name=nova_migration_target, vendor=Red Hat, Inc., distribution-scope=public, build-date=2025-11-19T00:36:58Z, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, config_id=tripleo_step4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, io.buildah.version=1.41.4, tcib_managed=true, name=rhosp17/openstack-nova-compute) Dec 15 03:29:22 localhost systemd[1]: 4c3f871f4bdf287b1d3ce717956c1cf130617489217410eadf6fffcc440eb3b0.service: Deactivated successfully. Dec 15 03:29:24 localhost sshd[83278]: main: sshd: ssh-rsa algorithm is disabled Dec 15 03:29:24 localhost systemd[1]: Started /usr/bin/podman healthcheck run 2df8d20c6ba32f38bd0e6519c9c77a4146b632f21c81197d8c319b223aad81f1. Dec 15 03:29:24 localhost systemd[1]: Started /usr/bin/podman healthcheck run 4d35b7dc3f3a4e3c60661e85d3511f50804e87244d74cab67af5c6e8c447d379. 
Dec 15 03:29:24 localhost podman[83280]: 2025-12-15 08:29:24.771472402 +0000 UTC m=+0.098578293 container health_status 2df8d20c6ba32f38bd0e6519c9c77a4146b632f21c81197d8c319b223aad81f1 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-11-18T23:34:05Z, name=rhosp17/openstack-ovn-controller, vendor=Red Hat, Inc., managed_by=tripleo_ansible, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, konflux.additional-tags=17.1.12 17.1_20251118.1, io.buildah.version=1.41.4, config_id=tripleo_step4, release=1761123044, container_name=ovn_controller, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 ovn-controller, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, version=17.1.12, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, description=Red Hat OpenStack Platform 17.1 ovn-controller, distribution-scope=public, io.openshift.expose-services=, com.redhat.component=openstack-ovn-controller-container, 
vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, tcib_managed=true, url=https://www.redhat.com, architecture=x86_64, batch=17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-type=git) Dec 15 03:29:24 localhost podman[83281]: 2025-12-15 08:29:24.820932401 +0000 UTC m=+0.145151405 container health_status 4d35b7dc3f3a4e3c60661e85d3511f50804e87244d74cab67af5c6e8c447d379 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, version=17.1.12, release=1761123044, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'a56a6f14b467cd9064e40c03defa5ed7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', 
'/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, vcs-type=git, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.expose-services=, build-date=2025-11-19T00:14:25Z, config_id=tripleo_step4, container_name=ovn_metadata_agent, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1, distribution-scope=public, tcib_managed=true, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, url=https://www.redhat.com, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, konflux.additional-tags=17.1.12 17.1_20251118.1, name=rhosp17/openstack-neutron-metadata-agent-ovn) Dec 15 03:29:24 localhost podman[83280]: 2025-12-15 08:29:24.849241887 +0000 UTC m=+0.176347778 container exec_died 2df8d20c6ba32f38bd0e6519c9c77a4146b632f21c81197d8c319b223aad81f1 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, vcs-type=git, vendor=Red Hat, Inc., tcib_managed=true, konflux.additional-tags=17.1.12 17.1_20251118.1, io.buildah.version=1.41.4, maintainer=OpenStack TripleO Team, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, build-date=2025-11-18T23:34:05Z, architecture=x86_64, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1761123044, version=17.1.12, 
io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, summary=Red Hat OpenStack Platform 17.1 ovn-controller, config_id=tripleo_step4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1, managed_by=tripleo_ansible, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, description=Red Hat OpenStack Platform 17.1 ovn-controller, name=rhosp17/openstack-ovn-controller, url=https://www.redhat.com, com.redhat.component=openstack-ovn-controller-container, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, container_name=ovn_controller, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream) Dec 15 03:29:24 localhost systemd[1]: 2df8d20c6ba32f38bd0e6519c9c77a4146b632f21c81197d8c319b223aad81f1.service: Deactivated successfully. 
Dec 15 03:29:24 localhost podman[83281]: 2025-12-15 08:29:24.892226015 +0000 UTC m=+0.216444989 container exec_died 4d35b7dc3f3a4e3c60661e85d3511f50804e87244d74cab67af5c6e8c447d379 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, release=1761123044, version=17.1.12, tcib_managed=true, container_name=ovn_metadata_agent, vendor=Red Hat, Inc., batch=17.1_20251118.1, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, build-date=2025-11-19T00:14:25Z, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'a56a6f14b467cd9064e40c03defa5ed7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', 
'/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, architecture=x86_64, config_id=tripleo_step4, io.buildah.version=1.41.4, name=rhosp17/openstack-neutron-metadata-agent-ovn, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, managed_by=tripleo_ansible) Dec 15 03:29:24 localhost systemd[1]: 4d35b7dc3f3a4e3c60661e85d3511f50804e87244d74cab67af5c6e8c447d379.service: Deactivated successfully. Dec 15 03:29:26 localhost systemd[1]: Starting Check and recover tripleo_nova_virtqemud... Dec 15 03:29:26 localhost recover_tripleo_nova_virtqemud[83331]: 61849 Dec 15 03:29:26 localhost systemd[1]: tripleo_nova_virtqemud_recover.service: Deactivated successfully. Dec 15 03:29:26 localhost systemd[1]: Finished Check and recover tripleo_nova_virtqemud. Dec 15 03:29:26 localhost sshd[83332]: main: sshd: ssh-rsa algorithm is disabled Dec 15 03:29:29 localhost sshd[83334]: main: sshd: ssh-rsa algorithm is disabled Dec 15 03:29:33 localhost sshd[83336]: main: sshd: ssh-rsa algorithm is disabled Dec 15 03:29:34 localhost systemd[1]: Started /usr/bin/podman healthcheck run 6278d648aec9c9f12e0efaafa0d6ae5653eeb34459516699e562eeb9c8d23b3c. Dec 15 03:29:34 localhost systemd[1]: tmp-crun.ANR64B.mount: Deactivated successfully. 
Dec 15 03:29:34 localhost podman[83338]: 2025-12-15 08:29:34.770369546 +0000 UTC m=+0.098609633 container health_status 6278d648aec9c9f12e0efaafa0d6ae5653eeb34459516699e562eeb9c8d23b3c (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, url=https://www.redhat.com, vcs-type=git, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, maintainer=OpenStack TripleO Team, tcib_managed=true, container_name=metrics_qdr, architecture=x86_64, build-date=2025-11-18T22:49:46Z, description=Red Hat OpenStack Platform 17.1 qdrouterd, summary=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.expose-services=, release=1761123044, managed_by=tripleo_ansible, batch=17.1_20251118.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '29d07da103bbd44a9ed3e29999314b03'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', 
'/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.component=openstack-qdrouterd-container, distribution-scope=public, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.buildah.version=1.41.4, name=rhosp17/openstack-qdrouterd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.12 17.1_20251118.1, version=17.1.12, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_id=tripleo_step1) Dec 15 03:29:34 localhost podman[83338]: 2025-12-15 08:29:34.972683975 +0000 UTC m=+0.300924062 container exec_died 6278d648aec9c9f12e0efaafa0d6ae5653eeb34459516699e562eeb9c8d23b3c (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, version=17.1.12, url=https://www.redhat.com, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '29d07da103bbd44a9ed3e29999314b03'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', 
'/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, com.redhat.component=openstack-qdrouterd-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, build-date=2025-11-18T22:49:46Z, batch=17.1_20251118.1, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 qdrouterd, container_name=metrics_qdr, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 qdrouterd, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, release=1761123044, architecture=x86_64, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, maintainer=OpenStack TripleO Team, config_id=tripleo_step1, name=rhosp17/openstack-qdrouterd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.buildah.version=1.41.4, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a) Dec 15 03:29:34 localhost systemd[1]: 6278d648aec9c9f12e0efaafa0d6ae5653eeb34459516699e562eeb9c8d23b3c.service: Deactivated successfully. Dec 15 03:29:35 localhost sshd[83367]: main: sshd: ssh-rsa algorithm is disabled Dec 15 03:29:39 localhost sshd[83369]: main: sshd: ssh-rsa algorithm is disabled Dec 15 03:29:42 localhost sshd[83371]: main: sshd: ssh-rsa algorithm is disabled Dec 15 03:29:44 localhost sshd[83373]: main: sshd: ssh-rsa algorithm is disabled Dec 15 03:29:48 localhost sshd[83375]: main: sshd: ssh-rsa algorithm is disabled Dec 15 03:29:49 localhost systemd[1]: Started /usr/bin/podman healthcheck run 165a6fd1f2b629d16da0798ec1c9bac41d302d530a0ffa668b2315a52d6aa04e. 
Dec 15 03:29:49 localhost systemd[1]: Started /usr/bin/podman healthcheck run 2649c3aec9af2f04509e1d248b1050b202fccc3ed2b86421c89a2a65376db057. Dec 15 03:29:49 localhost systemd[1]: Started /usr/bin/podman healthcheck run 36a88083ed613f6871d2631ae462bca308a45ba583333f7cfe4d0b0c40a18ba5. Dec 15 03:29:49 localhost systemd[1]: Started /usr/bin/podman healthcheck run 97115ff7c5a84ae7e17f79e696e94da3c63f9fe0051287fe9193bec282a03f99. Dec 15 03:29:49 localhost systemd[1]: Started /usr/bin/podman healthcheck run ac45612fab357b615b4684dbfa5bd000dd29eb1adfbaedb6db0802fc49e89549. Dec 15 03:29:49 localhost systemd[1]: Started /usr/bin/podman healthcheck run d6041e5471624a495d5be42684f6049ccf71802a80d5760239330151d465d146. Dec 15 03:29:49 localhost podman[83377]: 2025-12-15 08:29:49.769229977 +0000 UTC m=+0.096587749 container health_status 165a6fd1f2b629d16da0798ec1c9bac41d302d530a0ffa668b2315a52d6aa04e (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, config_id=tripleo_step3, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, io.buildah.version=1.41.4, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, summary=Red Hat OpenStack Platform 17.1 collectd, container_name=collectd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, batch=17.1_20251118.1, vendor=Red Hat, Inc., vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 
'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, release=1761123044, tcib_managed=true, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 collectd, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-type=git, build-date=2025-11-18T22:51:28Z, version=17.1.12, managed_by=tripleo_ansible, name=rhosp17/openstack-collectd, com.redhat.component=openstack-collectd-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd) Dec 15 03:29:49 localhost podman[83377]: 2025-12-15 08:29:49.778814243 +0000 UTC m=+0.106172055 container exec_died 165a6fd1f2b629d16da0798ec1c9bac41d302d530a0ffa668b2315a52d6aa04e (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, description=Red Hat OpenStack Platform 17.1 collectd, architecture=x86_64, version=17.1.12, url=https://www.redhat.com, io.buildah.version=1.41.4, 
vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, maintainer=OpenStack TripleO Team, release=1761123044, distribution-scope=public, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, build-date=2025-11-18T22:51:28Z, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_id=tripleo_step3, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 collectd, com.redhat.component=openstack-collectd-container, tcib_managed=true, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', 
'/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, konflux.additional-tags=17.1.12 17.1_20251118.1, batch=17.1_20251118.1, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, container_name=collectd, name=rhosp17/openstack-collectd) Dec 15 03:29:49 localhost systemd[1]: 165a6fd1f2b629d16da0798ec1c9bac41d302d530a0ffa668b2315a52d6aa04e.service: Deactivated successfully. Dec 15 03:29:49 localhost systemd[1]: tmp-crun.eyyjok.mount: Deactivated successfully. Dec 15 03:29:49 localhost podman[83379]: 2025-12-15 08:29:49.864746536 +0000 UTC m=+0.183922711 container health_status 36a88083ed613f6871d2631ae462bca308a45ba583333f7cfe4d0b0c40a18ba5 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, io.buildah.version=1.41.4, managed_by=tripleo_ansible, architecture=x86_64, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20251118.1, vendor=Red Hat, Inc., org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, build-date=2025-11-19T00:36:58Z, summary=Red Hat OpenStack Platform 17.1 nova-compute, config_id=tripleo_step5, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-nova-compute-container, version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, description=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, name=rhosp17/openstack-nova-compute, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, tcib_managed=true, distribution-scope=public, container_name=nova_compute, release=1761123044, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.k8s.description=Red Hat 
OpenStack Platform 17.1 nova-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '182e509007ab5e6e5b2500a552cbd5ba-879500e96bf8dfb93687004bd86f2317'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, konflux.additional-tags=17.1.12 17.1_20251118.1) Dec 15 03:29:49 localhost podman[83379]: 2025-12-15 08:29:49.919777975 +0000 UTC m=+0.238954120 container exec_died 36a88083ed613f6871d2631ae462bca308a45ba583333f7cfe4d0b0c40a18ba5 
(image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '182e509007ab5e6e5b2500a552cbd5ba-879500e96bf8dfb93687004bd86f2317'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.4, url=https://www.redhat.com, config_id=tripleo_step5, konflux.additional-tags=17.1.12 17.1_20251118.1, distribution-scope=public, 
com.redhat.component=openstack-nova-compute-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, version=17.1.12, name=rhosp17/openstack-nova-compute, io.openshift.expose-services=, batch=17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, summary=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-19T00:36:58Z, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1761123044, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, container_name=nova_compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d) Dec 15 03:29:49 localhost systemd[1]: 36a88083ed613f6871d2631ae462bca308a45ba583333f7cfe4d0b0c40a18ba5.service: Deactivated successfully. 
Dec 15 03:29:49 localhost podman[83384]: 2025-12-15 08:29:49.969301686 +0000 UTC m=+0.284470023 container health_status 97115ff7c5a84ae7e17f79e696e94da3c63f9fe0051287fe9193bec282a03f99 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'fcee5a4a91f85471fca7b61211375646'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.component=openstack-ceilometer-ipmi-container, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, config_id=tripleo_step4, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, managed_by=tripleo_ansible, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 
17.1_20251118.1, container_name=ceilometer_agent_ipmi, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, version=17.1.12, release=1761123044, vcs-type=git, io.buildah.version=1.41.4, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, name=rhosp17/openstack-ceilometer-ipmi, url=https://www.redhat.com, batch=17.1_20251118.1, build-date=2025-11-19T00:12:45Z, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi) Dec 15 03:29:49 localhost podman[83391]: 2025-12-15 08:29:49.92258772 +0000 UTC m=+0.236048842 container health_status ac45612fab357b615b4684dbfa5bd000dd29eb1adfbaedb6db0802fc49e89549 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, tcib_managed=true, managed_by=tripleo_ansible, batch=17.1_20251118.1, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.12 17.1_20251118.1, name=rhosp17/openstack-cron, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, container_name=logrotate_crond, release=1761123044, version=17.1.12, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.component=openstack-cron-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vendor=Red Hat, Inc., build-date=2025-11-18T22:49:32Z, description=Red Hat OpenStack Platform 17.1 cron, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': 
'/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, vcs-type=git, config_id=tripleo_step4, maintainer=OpenStack TripleO Team, architecture=x86_64, distribution-scope=public, io.openshift.expose-services=, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 cron, io.buildah.version=1.41.4) Dec 15 03:29:49 localhost podman[83393]: 2025-12-15 08:29:49.838804974 +0000 UTC m=+0.148771702 container health_status d6041e5471624a495d5be42684f6049ccf71802a80d5760239330151d465d146 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, release=1761123044, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'fcee5a4a91f85471fca7b61211375646'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 
'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.buildah.version=1.41.4, architecture=x86_64, konflux.additional-tags=17.1.12 17.1_20251118.1, tcib_managed=true, name=rhosp17/openstack-ceilometer-compute, com.redhat.component=openstack-ceilometer-compute-container, batch=17.1_20251118.1, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, container_name=ceilometer_agent_compute, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 ceilometer-compute, build-date=2025-11-19T00:11:48Z, config_id=tripleo_step4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, url=https://www.redhat.com, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, vcs-type=git, version=17.1.12) Dec 15 03:29:50 localhost podman[83391]: 2025-12-15 08:29:50.002297907 +0000 UTC m=+0.315759039 container exec_died 
ac45612fab357b615b4684dbfa5bd000dd29eb1adfbaedb6db0802fc49e89549 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, managed_by=tripleo_ansible, vendor=Red Hat, Inc., version=17.1.12, description=Red Hat OpenStack Platform 17.1 cron, build-date=2025-11-18T22:49:32Z, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, com.redhat.component=openstack-cron-container, io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, maintainer=OpenStack TripleO Team, container_name=logrotate_crond, io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1, name=rhosp17/openstack-cron, architecture=x86_64, vcs-type=git, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, tcib_managed=true, config_id=tripleo_step4, url=https://www.redhat.com, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, summary=Red Hat OpenStack Platform 17.1 cron, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20251118.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, release=1761123044) Dec 15 03:29:50 localhost systemd[1]: ac45612fab357b615b4684dbfa5bd000dd29eb1adfbaedb6db0802fc49e89549.service: Deactivated successfully. Dec 15 03:29:50 localhost podman[83393]: 2025-12-15 08:29:50.019380883 +0000 UTC m=+0.329347631 container exec_died d6041e5471624a495d5be42684f6049ccf71802a80d5760239330151d465d146 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, batch=17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, managed_by=tripleo_ansible, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'fcee5a4a91f85471fca7b61211375646'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, name=rhosp17/openstack-ceilometer-compute, config_id=tripleo_step4, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, vcs-type=git, build-date=2025-11-19T00:11:48Z, url=https://www.redhat.com, architecture=x86_64, konflux.additional-tags=17.1.12 17.1_20251118.1, tcib_managed=true, io.buildah.version=1.41.4, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, com.redhat.component=openstack-ceilometer-compute-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, vendor=Red Hat, Inc., container_name=ceilometer_agent_compute, release=1761123044, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, version=17.1.12, maintainer=OpenStack TripleO Team) Dec 15 03:29:50 localhost systemd[1]: d6041e5471624a495d5be42684f6049ccf71802a80d5760239330151d465d146.service: Deactivated successfully. 
Dec 15 03:29:50 localhost podman[83384]: 2025-12-15 08:29:50.069717747 +0000 UTC m=+0.384886094 container exec_died 97115ff7c5a84ae7e17f79e696e94da3c63f9fe0051287fe9193bec282a03f99 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, com.redhat.component=openstack-ceilometer-ipmi-container, build-date=2025-11-19T00:12:45Z, release=1761123044, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, tcib_managed=true, io.buildah.version=1.41.4, vendor=Red Hat, Inc., batch=17.1_20251118.1, container_name=ceilometer_agent_ipmi, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, name=rhosp17/openstack-ceilometer-ipmi, managed_by=tripleo_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'fcee5a4a91f85471fca7b61211375646'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, 
architecture=x86_64, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, konflux.additional-tags=17.1.12 17.1_20251118.1, url=https://www.redhat.com, vcs-type=git, version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676) Dec 15 03:29:50 localhost systemd[1]: 97115ff7c5a84ae7e17f79e696e94da3c63f9fe0051287fe9193bec282a03f99.service: Deactivated successfully. Dec 15 03:29:50 localhost podman[83378]: 2025-12-15 08:29:50.0743122 +0000 UTC m=+0.398585960 container health_status 2649c3aec9af2f04509e1d248b1050b202fccc3ed2b86421c89a2a65376db057 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, name=rhosp17/openstack-iscsid, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, url=https://www.redhat.com, config_id=tripleo_step3, tcib_managed=true, batch=17.1_20251118.1, container_name=iscsid, release=1761123044, konflux.additional-tags=17.1.12 17.1_20251118.1, vendor=Red Hat, Inc., vcs-type=git, version=17.1.12, summary=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.component=openstack-iscsid-container, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 iscsid, distribution-scope=public, managed_by=tripleo_ansible, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, io.buildah.version=1.41.4, build-date=2025-11-18T23:44:13Z, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, maintainer=OpenStack TripleO Team, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '182e509007ab5e6e5b2500a552cbd5ba'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}) Dec 15 03:29:50 localhost podman[83378]: 2025-12-15 08:29:50.160347226 +0000 UTC m=+0.484620976 container exec_died 2649c3aec9af2f04509e1d248b1050b202fccc3ed2b86421c89a2a65376db057 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '182e509007ab5e6e5b2500a552cbd5ba'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, container_name=iscsid, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 iscsid, build-date=2025-11-18T23:44:13Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, release=1761123044, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 iscsid, managed_by=tripleo_ansible, distribution-scope=public, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, io.buildah.version=1.41.4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.12, com.redhat.component=openstack-iscsid-container, config_id=tripleo_step3, name=rhosp17/openstack-iscsid, maintainer=OpenStack TripleO Team, batch=17.1_20251118.1, io.openshift.expose-services=, architecture=x86_64, tcib_managed=true, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d) Dec 15 03:29:50 localhost systemd[1]: 
2649c3aec9af2f04509e1d248b1050b202fccc3ed2b86421c89a2a65376db057.service: Deactivated successfully. Dec 15 03:29:51 localhost sshd[83549]: main: sshd: ssh-rsa algorithm is disabled Dec 15 03:29:52 localhost systemd[1]: Started /usr/bin/podman healthcheck run 4c3f871f4bdf287b1d3ce717956c1cf130617489217410eadf6fffcc440eb3b0. Dec 15 03:29:52 localhost systemd[1]: tmp-crun.QkOUIV.mount: Deactivated successfully. Dec 15 03:29:52 localhost podman[83591]: 2025-12-15 08:29:52.757691498 +0000 UTC m=+0.086146750 container health_status 4c3f871f4bdf287b1d3ce717956c1cf130617489217410eadf6fffcc440eb3b0 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-nova-compute-container, config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp17/openstack-nova-compute, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, tcib_managed=true, konflux.additional-tags=17.1.12 17.1_20251118.1, release=1761123044, managed_by=tripleo_ansible, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 nova-compute, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '879500e96bf8dfb93687004bd86f2317'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.41.4, io.openshift.expose-services=, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, batch=17.1_20251118.1, container_name=nova_migration_target, build-date=2025-11-19T00:36:58Z, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, url=https://www.redhat.com, version=17.1.12) Dec 15 03:29:53 localhost podman[83591]: 2025-12-15 08:29:53.087400107 +0000 UTC m=+0.415855359 container exec_died 4c3f871f4bdf287b1d3ce717956c1cf130617489217410eadf6fffcc440eb3b0 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, maintainer=OpenStack TripleO Team, architecture=x86_64, batch=17.1_20251118.1, release=1761123044, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, build-date=2025-11-19T00:36:58Z, summary=Red Hat OpenStack Platform 17.1 nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, com.redhat.component=openstack-nova-compute-container, 
url=https://www.redhat.com, version=17.1.12, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step4, io.openshift.expose-services=, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public, vendor=Red Hat, Inc., config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '879500e96bf8dfb93687004bd86f2317'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, container_name=nova_migration_target, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, name=rhosp17/openstack-nova-compute, managed_by=tripleo_ansible) Dec 15 03:29:53 localhost systemd[1]: 4c3f871f4bdf287b1d3ce717956c1cf130617489217410eadf6fffcc440eb3b0.service: Deactivated successfully. 
Dec 15 03:29:54 localhost sshd[83666]: main: sshd: ssh-rsa algorithm is disabled Dec 15 03:29:55 localhost systemd[1]: Started /usr/bin/podman healthcheck run 2df8d20c6ba32f38bd0e6519c9c77a4146b632f21c81197d8c319b223aad81f1. Dec 15 03:29:55 localhost systemd[1]: Started /usr/bin/podman healthcheck run 4d35b7dc3f3a4e3c60661e85d3511f50804e87244d74cab67af5c6e8c447d379. Dec 15 03:29:55 localhost podman[83668]: 2025-12-15 08:29:55.756770601 +0000 UTC m=+0.082510722 container health_status 2df8d20c6ba32f38bd0e6519c9c77a4146b632f21c81197d8c319b223aad81f1 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, description=Red Hat OpenStack Platform 17.1 ovn-controller, summary=Red Hat OpenStack Platform 17.1 ovn-controller, container_name=ovn_controller, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, vendor=Red Hat, Inc., io.openshift.expose-services=, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, version=17.1.12, io.buildah.version=1.41.4, com.redhat.component=openstack-ovn-controller-container, name=rhosp17/openstack-ovn-controller, managed_by=tripleo_ansible, release=1761123044, distribution-scope=public, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', 
'/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, tcib_managed=true, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, konflux.additional-tags=17.1.12 17.1_20251118.1, architecture=x86_64, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, url=https://www.redhat.com, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, batch=17.1_20251118.1, build-date=2025-11-18T23:34:05Z, config_id=tripleo_step4) Dec 15 03:29:55 localhost podman[83669]: 2025-12-15 08:29:55.828910967 +0000 UTC m=+0.150417936 container health_status 4d35b7dc3f3a4e3c60661e85d3511f50804e87244d74cab67af5c6e8c447d379 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, vcs-type=git, maintainer=OpenStack TripleO Team, container_name=ovn_metadata_agent, name=rhosp17/openstack-neutron-metadata-agent-ovn, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, version=17.1.12, release=1761123044, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, config_id=tripleo_step4, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, build-date=2025-11-19T00:14:25Z, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vendor=Red Hat, Inc., managed_by=tripleo_ansible, batch=17.1_20251118.1, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'a56a6f14b467cd9064e40c03defa5ed7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 
'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, io.buildah.version=1.41.4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, tcib_managed=true, url=https://www.redhat.com) Dec 15 03:29:55 localhost podman[83668]: 2025-12-15 08:29:55.833199132 +0000 UTC m=+0.158939293 container exec_died 2df8d20c6ba32f38bd0e6519c9c77a4146b632f21c81197d8c319b223aad81f1 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, config_data={'depends_on': 
['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, release=1761123044, io.buildah.version=1.41.4, url=https://www.redhat.com, config_id=tripleo_step4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, summary=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.component=openstack-ovn-controller-container, io.openshift.expose-services=, build-date=2025-11-18T23:34:05Z, name=rhosp17/openstack-ovn-controller, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, vcs-type=git, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, version=17.1.12, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, batch=17.1_20251118.1, managed_by=tripleo_ansible, architecture=x86_64, konflux.additional-tags=17.1.12 17.1_20251118.1, tcib_managed=true, vendor=Red Hat, Inc., container_name=ovn_controller, description=Red Hat OpenStack Platform 17.1 ovn-controller, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05) Dec 15 03:29:55 localhost systemd[1]: 2df8d20c6ba32f38bd0e6519c9c77a4146b632f21c81197d8c319b223aad81f1.service: Deactivated 
successfully. Dec 15 03:29:55 localhost podman[83669]: 2025-12-15 08:29:55.908377197 +0000 UTC m=+0.229884146 container exec_died 4d35b7dc3f3a4e3c60661e85d3511f50804e87244d74cab67af5c6e8c447d379 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'a56a6f14b467cd9064e40c03defa5ed7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, config_id=tripleo_step4, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, release=1761123044, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, 
build-date=2025-11-19T00:14:25Z, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, architecture=x86_64, io.buildah.version=1.41.4, version=17.1.12, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, managed_by=tripleo_ansible, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, batch=17.1_20251118.1, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, url=https://www.redhat.com, container_name=ovn_metadata_agent, name=rhosp17/openstack-neutron-metadata-agent-ovn, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, distribution-scope=public, io.openshift.expose-services=) Dec 15 03:29:55 localhost systemd[1]: 4d35b7dc3f3a4e3c60661e85d3511f50804e87244d74cab67af5c6e8c447d379.service: Deactivated successfully. Dec 15 03:29:57 localhost sshd[83922]: main: sshd: ssh-rsa algorithm is disabled Dec 15 03:29:58 localhost systemd-logind[763]: Existing logind session ID 28 used by new audit session, ignoring. Dec 15 03:29:58 localhost systemd[1]: Created slice User Slice of UID 0. Dec 15 03:29:58 localhost systemd[1]: Starting User Runtime Directory /run/user/0... Dec 15 03:29:58 localhost systemd[1]: Finished User Runtime Directory /run/user/0. Dec 15 03:29:58 localhost systemd[1]: Starting User Manager for UID 0... Dec 15 03:29:58 localhost systemd[83969]: Queued start job for default target Main User Target. Dec 15 03:29:58 localhost systemd[83969]: Created slice User Application Slice. 
Dec 15 03:29:58 localhost systemd[83969]: Mark boot as successful after the user session has run 2 minutes was skipped because of an unmet condition check (ConditionUser=!@system). Dec 15 03:29:58 localhost systemd[83969]: Started Daily Cleanup of User's Temporary Directories. Dec 15 03:29:58 localhost systemd[83969]: Reached target Paths. Dec 15 03:29:58 localhost systemd[83969]: Reached target Timers. Dec 15 03:29:58 localhost systemd[83969]: Starting D-Bus User Message Bus Socket... Dec 15 03:29:58 localhost systemd[83969]: Starting Create User's Volatile Files and Directories... Dec 15 03:29:58 localhost systemd[83969]: Finished Create User's Volatile Files and Directories. Dec 15 03:29:58 localhost systemd[83969]: Listening on D-Bus User Message Bus Socket. Dec 15 03:29:58 localhost systemd[83969]: Reached target Sockets. Dec 15 03:29:58 localhost systemd[83969]: Reached target Basic System. Dec 15 03:29:58 localhost systemd[83969]: Reached target Main User Target. Dec 15 03:29:58 localhost systemd[83969]: Startup finished in 146ms. Dec 15 03:29:58 localhost systemd[1]: Started User Manager for UID 0. Dec 15 03:29:58 localhost systemd[1]: Started Session c11 of User root. Dec 15 03:30:00 localhost kernel: tun: Universal TUN/TAP device driver, 1.6 Dec 15 03:30:00 localhost kernel: device tap03ef8889-32 entered promiscuous mode Dec 15 03:30:00 localhost NetworkManager[5963]: [1765787400.2184] manager: (tap03ef8889-32): new Tun device (/org/freedesktop/NetworkManager/Devices/13) Dec 15 03:30:00 localhost systemd-udevd[84005]: Network interface NamePolicy= disabled on kernel command line. 
Dec 15 03:30:00 localhost NetworkManager[5963]: [1765787400.2345] device (tap03ef8889-32): state change: unmanaged -> unavailable (reason 'connection-assumed', sys-iface-state: 'external') Dec 15 03:30:00 localhost NetworkManager[5963]: [1765787400.2350] device (tap03ef8889-32): state change: unavailable -> disconnected (reason 'none', sys-iface-state: 'external') Dec 15 03:30:00 localhost systemd[1]: Virtual Machine and Container Storage (Compatibility) was skipped because of an unmet condition check (ConditionPathExists=/var/lib/machines.raw). Dec 15 03:30:00 localhost systemd[1]: Starting Virtual Machine and Container Registration Service... Dec 15 03:30:00 localhost systemd[1]: Started Virtual Machine and Container Registration Service. Dec 15 03:30:00 localhost systemd-machined[84011]: New machine qemu-1-instance-00000002. Dec 15 03:30:00 localhost systemd[1]: Started Virtual Machine qemu-1-instance-00000002. Dec 15 03:30:00 localhost NetworkManager[5963]: [1765787400.5406] manager: (tapbefb7a72-10): new Veth device (/org/freedesktop/NetworkManager/Devices/14) Dec 15 03:30:00 localhost systemd-udevd[84004]: Network interface NamePolicy= disabled on kernel command line. Dec 15 03:30:00 localhost kernel: IPv6: ADDRCONF(NETDEV_CHANGE): tapbefb7a72-11: link becomes ready Dec 15 03:30:00 localhost kernel: IPv6: ADDRCONF(NETDEV_CHANGE): tapbefb7a72-10: link becomes ready Dec 15 03:30:00 localhost NetworkManager[5963]: [1765787400.5822] device (tapbefb7a72-10): carrier: link connected Dec 15 03:30:00 localhost kernel: device tapbefb7a72-10 entered promiscuous mode Dec 15 03:30:01 localhost sshd[84098]: main: sshd: ssh-rsa algorithm is disabled Dec 15 03:30:02 localhost systemd[1]: Starting SETroubleshoot daemon for processing new SELinux denial logs... Dec 15 03:30:02 localhost systemd[1]: Started SETroubleshoot daemon for processing new SELinux denial logs. 
Dec 15 03:30:02 localhost podman[84137]: 2025-12-15 08:30:02.388628432 +0000 UTC m=+0.092670524 container create 9e7442899f892fc0ac4ff30dac8e7344498d50f9a93adf21950447c50c9a3168 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=neutron-haproxy-ovnmeta-befb7a72-17a9-4bcb-b561-84b8f626685a, release=1761123044, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, version=17.1.12, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-11-19T00:14:25Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.expose-services=, name=rhosp17/openstack-neutron-metadata-agent-ovn, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.buildah.version=1.41.4, distribution-scope=public, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, tcib_managed=true, url=https://www.redhat.com, batch=17.1_20251118.1, vcs-type=git, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.component=openstack-neutron-metadata-agent-ovn-container) Dec 15 03:30:02 localhost systemd[1]: Started libpod-conmon-9e7442899f892fc0ac4ff30dac8e7344498d50f9a93adf21950447c50c9a3168.scope. Dec 15 03:30:02 localhost podman[84137]: 2025-12-15 08:30:02.342382728 +0000 UTC m=+0.046424880 image pull registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1 Dec 15 03:30:02 localhost systemd[1]: Started libcrun container. 
Dec 15 03:30:02 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/666eed5dd2f565d13149f1b7f3b5648936bfa7703003f9a7dc35b189f28bc0ec/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff) Dec 15 03:30:02 localhost systemd[1]: Created slice Slice /system/dbus-:1.1-org.fedoraproject.SetroubleshootPrivileged. Dec 15 03:30:02 localhost systemd[1]: Started dbus-:1.1-org.fedoraproject.SetroubleshootPrivileged@0.service. Dec 15 03:30:02 localhost podman[84137]: 2025-12-15 08:30:02.474091383 +0000 UTC m=+0.178133515 container init 9e7442899f892fc0ac4ff30dac8e7344498d50f9a93adf21950447c50c9a3168 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=neutron-haproxy-ovnmeta-befb7a72-17a9-4bcb-b561-84b8f626685a, url=https://www.redhat.com, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, tcib_managed=true, vendor=Red Hat, Inc., release=1761123044, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, version=17.1.12, distribution-scope=public, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, name=rhosp17/openstack-neutron-metadata-agent-ovn, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-type=git, build-date=2025-11-19T00:14:25Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, 
io.buildah.version=1.41.4, batch=17.1_20251118.1) Dec 15 03:30:02 localhost podman[84137]: 2025-12-15 08:30:02.483869364 +0000 UTC m=+0.187911506 container start 9e7442899f892fc0ac4ff30dac8e7344498d50f9a93adf21950447c50c9a3168 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=neutron-haproxy-ovnmeta-befb7a72-17a9-4bcb-b561-84b8f626685a, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, architecture=x86_64, version=17.1.12, io.openshift.expose-services=, url=https://www.redhat.com, build-date=2025-11-19T00:14:25Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, konflux.additional-tags=17.1.12 17.1_20251118.1, distribution-scope=public, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-neutron-metadata-agent-ovn, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, batch=17.1_20251118.1, io.buildah.version=1.41.4, release=1761123044) Dec 15 03:30:03 localhost sshd[84170]: main: sshd: ssh-rsa algorithm is disabled Dec 15 03:30:03 localhost setroubleshoot[84114]: SELinux is preventing /usr/libexec/qemu-kvm from read access on the file max_map_count. 
For complete SELinux messages run: sealert -l efd43a6a-fefb-4317-a644-4d61e31ab16b Dec 15 03:30:03 localhost setroubleshoot[84114]: SELinux is preventing /usr/libexec/qemu-kvm from read access on the file max_map_count.#012#012***** Plugin qemu_file_image (98.8 confidence) suggests *******************#012#012If max_map_count is a virtualization target#012Then you need to change the label on max_map_count'#012Do#012# semanage fcontext -a -t virt_image_t 'max_map_count'#012# restorecon -v 'max_map_count'#012#012***** Plugin catchall (2.13 confidence) suggests **************************#012#012If you believe that qemu-kvm should be allowed read access on the max_map_count file by default.#012Then you should report this as a bug.#012You can generate a local policy module to allow this access.#012Do#012allow this access for now by executing:#012# ausearch -c 'qemu-kvm' --raw | audit2allow -M my-qemukvm#012# semodule -X 300 -i my-qemukvm.pp#012 Dec 15 03:30:05 localhost systemd[1]: Started /usr/bin/podman healthcheck run 6278d648aec9c9f12e0efaafa0d6ae5653eeb34459516699e562eeb9c8d23b3c. Dec 15 03:30:05 localhost systemd[1]: tmp-crun.grZs5X.mount: Deactivated successfully. 
Dec 15 03:30:05 localhost podman[84173]: 2025-12-15 08:30:05.320486211 +0000 UTC m=+0.117147337 container health_status 6278d648aec9c9f12e0efaafa0d6ae5653eeb34459516699e562eeb9c8d23b3c (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, container_name=metrics_qdr, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, version=17.1.12, url=https://www.redhat.com, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '29d07da103bbd44a9ed3e29999314b03'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, distribution-scope=public, batch=17.1_20251118.1, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.expose-services=, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 qdrouterd, summary=Red Hat 
OpenStack Platform 17.1 qdrouterd, architecture=x86_64, build-date=2025-11-18T22:49:46Z, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.buildah.version=1.41.4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-type=git, name=rhosp17/openstack-qdrouterd, com.redhat.component=openstack-qdrouterd-container, config_id=tripleo_step1, release=1761123044, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, tcib_managed=true) Dec 15 03:30:05 localhost podman[84173]: 2025-12-15 08:30:05.502840118 +0000 UTC m=+0.299501184 container exec_died 6278d648aec9c9f12e0efaafa0d6ae5653eeb34459516699e562eeb9c8d23b3c (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, batch=17.1_20251118.1, container_name=metrics_qdr, io.buildah.version=1.41.4, version=17.1.12, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '29d07da103bbd44a9ed3e29999314b03'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', 
'/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.component=openstack-qdrouterd-container, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, config_id=tripleo_step1, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, name=rhosp17/openstack-qdrouterd, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 qdrouterd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, description=Red Hat OpenStack Platform 17.1 qdrouterd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, managed_by=tripleo_ansible, vcs-type=git, io.openshift.expose-services=, url=https://www.redhat.com, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, release=1761123044, build-date=2025-11-18T22:49:46Z, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true) Dec 15 03:30:05 localhost systemd[1]: 6278d648aec9c9f12e0efaafa0d6ae5653eeb34459516699e562eeb9c8d23b3c.service: Deactivated successfully. Dec 15 03:30:06 localhost sshd[84202]: main: sshd: ssh-rsa algorithm is disabled Dec 15 03:30:10 localhost sshd[84204]: main: sshd: ssh-rsa algorithm is disabled Dec 15 03:30:10 localhost snmpd[69387]: empty variable list in _query Dec 15 03:30:10 localhost snmpd[69387]: empty variable list in _query Dec 15 03:30:12 localhost sshd[84208]: main: sshd: ssh-rsa algorithm is disabled Dec 15 03:30:12 localhost systemd[1]: dbus-:1.1-org.fedoraproject.SetroubleshootPrivileged@0.service: Deactivated successfully. Dec 15 03:30:13 localhost systemd[1]: setroubleshootd.service: Deactivated successfully. 
Dec 15 03:30:16 localhost sshd[84332]: main: sshd: ssh-rsa algorithm is disabled Dec 15 03:30:17 localhost sshd[84334]: main: sshd: ssh-rsa algorithm is disabled Dec 15 03:30:19 localhost sshd[84336]: main: sshd: ssh-rsa algorithm is disabled Dec 15 03:30:20 localhost systemd[1]: Started /usr/bin/podman healthcheck run 165a6fd1f2b629d16da0798ec1c9bac41d302d530a0ffa668b2315a52d6aa04e. Dec 15 03:30:20 localhost systemd[1]: Started /usr/bin/podman healthcheck run 2649c3aec9af2f04509e1d248b1050b202fccc3ed2b86421c89a2a65376db057. Dec 15 03:30:20 localhost systemd[1]: Started /usr/bin/podman healthcheck run 36a88083ed613f6871d2631ae462bca308a45ba583333f7cfe4d0b0c40a18ba5. Dec 15 03:30:20 localhost systemd[1]: Started /usr/bin/podman healthcheck run 97115ff7c5a84ae7e17f79e696e94da3c63f9fe0051287fe9193bec282a03f99. Dec 15 03:30:20 localhost systemd[1]: Started /usr/bin/podman healthcheck run ac45612fab357b615b4684dbfa5bd000dd29eb1adfbaedb6db0802fc49e89549. Dec 15 03:30:20 localhost systemd[1]: Started /usr/bin/podman healthcheck run d6041e5471624a495d5be42684f6049ccf71802a80d5760239330151d465d146. Dec 15 03:30:20 localhost systemd[1]: tmp-crun.7SV7W0.mount: Deactivated successfully. 
Dec 15 03:30:20 localhost podman[84354]: 2025-12-15 08:30:20.784906077 +0000 UTC m=+0.094711169 container health_status d6041e5471624a495d5be42684f6049ccf71802a80d5760239330151d465d146 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'fcee5a4a91f85471fca7b61211375646'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, architecture=x86_64, konflux.additional-tags=17.1.12 17.1_20251118.1, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, tcib_managed=true, release=1761123044, managed_by=tripleo_ansible, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, name=rhosp17/openstack-ceilometer-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, version=17.1.12, vcs-type=git, io.openshift.expose-services=, batch=17.1_20251118.1, distribution-scope=public, build-date=2025-11-19T00:11:48Z, 
org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, container_name=ceilometer_agent_compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, com.redhat.component=openstack-ceilometer-compute-container, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05) Dec 15 03:30:20 localhost podman[84351]: 2025-12-15 08:30:20.797428901 +0000 UTC m=+0.115063562 container health_status ac45612fab357b615b4684dbfa5bd000dd29eb1adfbaedb6db0802fc49e89549 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, io.openshift.expose-services=, name=rhosp17/openstack-cron, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, maintainer=OpenStack TripleO Team, build-date=2025-11-18T22:49:32Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, container_name=logrotate_crond, managed_by=tripleo_ansible, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, url=https://www.redhat.com, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 
'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 cron, batch=17.1_20251118.1, konflux.additional-tags=17.1.12 17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 cron, io.buildah.version=1.41.4, vendor=Red Hat, Inc., vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, architecture=x86_64, com.redhat.component=openstack-cron-container, tcib_managed=true, version=17.1.12, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, release=1761123044, config_id=tripleo_step4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron) Dec 15 03:30:20 localhost podman[84339]: 2025-12-15 08:30:20.761181193 +0000 UTC m=+0.087571198 container health_status 2649c3aec9af2f04509e1d248b1050b202fccc3ed2b86421c89a2a65376db057 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, architecture=x86_64, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '182e509007ab5e6e5b2500a552cbd5ba'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', 
'/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, vcs-type=git, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 iscsid, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, vendor=Red Hat, Inc., url=https://www.redhat.com, config_id=tripleo_step3, tcib_managed=true, com.redhat.component=openstack-iscsid-container, managed_by=tripleo_ansible, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, name=rhosp17/openstack-iscsid, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, build-date=2025-11-18T23:44:13Z, container_name=iscsid, description=Red Hat OpenStack Platform 17.1 iscsid, version=17.1.12, io.buildah.version=1.41.4, release=1761123044, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, batch=17.1_20251118.1, distribution-scope=public, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream) Dec 15 03:30:20 localhost podman[84354]: 2025-12-15 08:30:20.838287022 +0000 UTC m=+0.148092124 container exec_died 
d6041e5471624a495d5be42684f6049ccf71802a80d5760239330151d465d146 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, container_name=ceilometer_agent_compute, release=1761123044, config_id=tripleo_step4, version=17.1.12, konflux.additional-tags=17.1.12 17.1_20251118.1, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, io.openshift.expose-services=, build-date=2025-11-19T00:11:48Z, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-type=git, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, name=rhosp17/openstack-ceilometer-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, batch=17.1_20251118.1, com.redhat.component=openstack-ceilometer-compute-container, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'fcee5a4a91f85471fca7b61211375646'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', 
'/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, distribution-scope=public, url=https://www.redhat.com, tcib_managed=true, vendor=Red Hat, Inc.) Dec 15 03:30:20 localhost podman[84339]: 2025-12-15 08:30:20.845233207 +0000 UTC m=+0.171623222 container exec_died 2649c3aec9af2f04509e1d248b1050b202fccc3ed2b86421c89a2a65376db057 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, managed_by=tripleo_ansible, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '182e509007ab5e6e5b2500a552cbd5ba'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', 
'/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, distribution-scope=public, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, version=17.1.12, build-date=2025-11-18T23:44:13Z, container_name=iscsid, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp17/openstack-iscsid, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-iscsid-container, architecture=x86_64, io.buildah.version=1.41.4, description=Red Hat OpenStack Platform 17.1 iscsid, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1, vendor=Red Hat, Inc., release=1761123044, config_id=tripleo_step3, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid) Dec 15 03:30:20 localhost systemd[1]: d6041e5471624a495d5be42684f6049ccf71802a80d5760239330151d465d146.service: Deactivated successfully. Dec 15 03:30:20 localhost systemd[1]: 2649c3aec9af2f04509e1d248b1050b202fccc3ed2b86421c89a2a65376db057.service: Deactivated successfully. 
Dec 15 03:30:20 localhost podman[84351]: 2025-12-15 08:30:20.86332322 +0000 UTC m=+0.180957861 container exec_died ac45612fab357b615b4684dbfa5bd000dd29eb1adfbaedb6db0802fc49e89549 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, io.buildah.version=1.41.4, build-date=2025-11-18T22:49:32Z, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible, vendor=Red Hat, Inc., name=rhosp17/openstack-cron, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, description=Red Hat OpenStack Platform 17.1 cron, summary=Red Hat OpenStack Platform 17.1 cron, maintainer=OpenStack TripleO Team, distribution-scope=public, release=1761123044, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, batch=17.1_20251118.1, version=17.1.12, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', 
'/var/log/containers:/var/log/containers:z']}, config_id=tripleo_step4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, tcib_managed=true, container_name=logrotate_crond, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, url=https://www.redhat.com, vcs-type=git, com.redhat.component=openstack-cron-container, architecture=x86_64) Dec 15 03:30:20 localhost systemd[1]: ac45612fab357b615b4684dbfa5bd000dd29eb1adfbaedb6db0802fc49e89549.service: Deactivated successfully. Dec 15 03:30:20 localhost podman[84338]: 2025-12-15 08:30:20.84161487 +0000 UTC m=+0.167226164 container health_status 165a6fd1f2b629d16da0798ec1c9bac41d302d530a0ffa668b2315a52d6aa04e (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible, build-date=2025-11-18T22:51:28Z, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, name=rhosp17/openstack-collectd, io.buildah.version=1.41.4, maintainer=OpenStack TripleO Team, version=17.1.12, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 collectd, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, distribution-scope=public, com.redhat.component=openstack-collectd-container, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, description=Red Hat OpenStack Platform 17.1 collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, release=1761123044, vcs-type=git, container_name=collectd, io.openshift.expose-services=, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, url=https://www.redhat.com, config_id=tripleo_step3, batch=17.1_20251118.1) Dec 15 03:30:20 localhost podman[84340]: 2025-12-15 08:30:20.906109532 +0000 UTC m=+0.227455692 container health_status 36a88083ed613f6871d2631ae462bca308a45ba583333f7cfe4d0b0c40a18ba5 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, description=Red Hat OpenStack Platform 17.1 nova-compute, release=1761123044, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, 
vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., url=https://www.redhat.com, tcib_managed=true, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2025-11-19T00:36:58Z, name=rhosp17/openstack-nova-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '182e509007ab5e6e5b2500a552cbd5ba-879500e96bf8dfb93687004bd86f2317'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', 
'/var/lib/nova:/var/lib/nova:shared']}, vcs-type=git, batch=17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, version=17.1.12, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, container_name=nova_compute, io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1, config_id=tripleo_step5, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, com.redhat.component=openstack-nova-compute-container, io.buildah.version=1.41.4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream) Dec 15 03:30:20 localhost podman[84338]: 2025-12-15 08:30:20.924301627 +0000 UTC m=+0.249912931 container exec_died 165a6fd1f2b629d16da0798ec1c9bac41d302d530a0ffa668b2315a52d6aa04e (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, description=Red Hat OpenStack Platform 17.1 collectd, version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, container_name=collectd, maintainer=OpenStack TripleO Team, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, url=https://www.redhat.com, config_id=tripleo_step3, tcib_managed=true, io.openshift.expose-services=, distribution-scope=public, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, vendor=Red Hat, Inc., batch=17.1_20251118.1, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, build-date=2025-11-18T22:51:28Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 collectd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-type=git, com.redhat.component=openstack-collectd-container, 
konflux.additional-tags=17.1.12 17.1_20251118.1, name=rhosp17/openstack-collectd, architecture=x86_64, managed_by=tripleo_ansible, release=1761123044, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd) Dec 15 03:30:20 localhost systemd[1]: 165a6fd1f2b629d16da0798ec1c9bac41d302d530a0ffa668b2315a52d6aa04e.service: Deactivated successfully. 
Dec 15 03:30:20 localhost podman[84340]: 2025-12-15 08:30:20.959342812 +0000 UTC m=+0.280688952 container exec_died 36a88083ed613f6871d2631ae462bca308a45ba583333f7cfe4d0b0c40a18ba5 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, vcs-type=git, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, url=https://www.redhat.com, konflux.additional-tags=17.1.12 17.1_20251118.1, architecture=x86_64, version=17.1.12, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, name=rhosp17/openstack-nova-compute, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vendor=Red Hat, Inc., com.redhat.component=openstack-nova-compute-container, io.buildah.version=1.41.4, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, config_id=tripleo_step5, release=1761123044, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 nova-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 nova-compute, container_name=nova_compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '182e509007ab5e6e5b2500a552cbd5ba-879500e96bf8dfb93687004bd86f2317'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20251118.1, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, build-date=2025-11-19T00:36:58Z) Dec 15 03:30:20 localhost systemd[1]: 36a88083ed613f6871d2631ae462bca308a45ba583333f7cfe4d0b0c40a18ba5.service: Deactivated successfully. 
Dec 15 03:30:21 localhost podman[84341]: 2025-12-15 08:30:21.014956647 +0000 UTC m=+0.331339465 container health_status 97115ff7c5a84ae7e17f79e696e94da3c63f9fe0051287fe9193bec282a03f99 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, distribution-scope=public, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, vcs-type=git, io.buildah.version=1.41.4, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'fcee5a4a91f85471fca7b61211375646'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, config_id=tripleo_step4, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, managed_by=tripleo_ansible, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, com.redhat.component=openstack-ceilometer-ipmi-container, maintainer=OpenStack TripleO Team, release=1761123044, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, konflux.additional-tags=17.1.12 17.1_20251118.1, batch=17.1_20251118.1, build-date=2025-11-19T00:12:45Z, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, name=rhosp17/openstack-ceilometer-ipmi, container_name=ceilometer_agent_ipmi, version=17.1.12, vendor=Red Hat, Inc., tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi) Dec 15 03:30:21 localhost podman[84341]: 2025-12-15 08:30:21.071379073 +0000 UTC m=+0.387761871 container exec_died 97115ff7c5a84ae7e17f79e696e94da3c63f9fe0051287fe9193bec282a03f99 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, batch=17.1_20251118.1, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.buildah.version=1.41.4, com.redhat.component=openstack-ceilometer-ipmi-container, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'fcee5a4a91f85471fca7b61211375646'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, release=1761123044, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_id=tripleo_step4, version=17.1.12, build-date=2025-11-19T00:12:45Z, container_name=ceilometer_agent_ipmi, vendor=Red Hat, Inc., io.openshift.expose-services=, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-ceilometer-ipmi, distribution-scope=public) Dec 15 03:30:21 localhost systemd[1]: 97115ff7c5a84ae7e17f79e696e94da3c63f9fe0051287fe9193bec282a03f99.service: Deactivated successfully. 
Dec 15 03:30:21 localhost haproxy-metadata-proxy-befb7a72-17a9-4bcb-b561-84b8f626685a[84166]: 192.168.0.201:33168 [15/Dec/2025:08:30:20.539] listener listener/metadata 0/0/0/1320/1320 200 130 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/instance-id HTTP/1.1" Dec 15 03:30:21 localhost haproxy-metadata-proxy-befb7a72-17a9-4bcb-b561-84b8f626685a[84166]: 192.168.0.201:33180 [15/Dec/2025:08:30:21.934] listener listener/metadata 0/0/0/11/11 404 281 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/public-keys HTTP/1.1" Dec 15 03:30:22 localhost haproxy-metadata-proxy-befb7a72-17a9-4bcb-b561-84b8f626685a[84166]: 192.168.0.201:33188 [15/Dec/2025:08:30:21.988] listener listener/metadata 0/0/0/13/13 200 130 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/instance-id HTTP/1.1" Dec 15 03:30:22 localhost haproxy-metadata-proxy-befb7a72-17a9-4bcb-b561-84b8f626685a[84166]: 192.168.0.201:33198 [15/Dec/2025:08:30:22.040] listener listener/metadata 0/0/0/9/9 200 120 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/ami-launch-index HTTP/1.1" Dec 15 03:30:22 localhost haproxy-metadata-proxy-befb7a72-17a9-4bcb-b561-84b8f626685a[84166]: 192.168.0.201:33214 [15/Dec/2025:08:30:22.091] listener listener/metadata 0/0/0/12/12 200 127 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/instance-type HTTP/1.1" Dec 15 03:30:22 localhost haproxy-metadata-proxy-befb7a72-17a9-4bcb-b561-84b8f626685a[84166]: 192.168.0.201:33230 [15/Dec/2025:08:30:22.141] listener listener/metadata 0/0/0/13/13 200 133 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/local-ipv4 HTTP/1.1" Dec 15 03:30:22 localhost haproxy-metadata-proxy-befb7a72-17a9-4bcb-b561-84b8f626685a[84166]: 192.168.0.201:33246 [15/Dec/2025:08:30:22.193] listener listener/metadata 0/0/0/10/10 200 134 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/public-ipv4 HTTP/1.1" Dec 15 03:30:22 localhost haproxy-metadata-proxy-befb7a72-17a9-4bcb-b561-84b8f626685a[84166]: 192.168.0.201:33256 [15/Dec/2025:08:30:22.242] listener listener/metadata 
0/0/0/12/12 200 123 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/hostname HTTP/1.1" Dec 15 03:30:22 localhost sshd[84475]: main: sshd: ssh-rsa algorithm is disabled Dec 15 03:30:22 localhost haproxy-metadata-proxy-befb7a72-17a9-4bcb-b561-84b8f626685a[84166]: 192.168.0.201:33268 [15/Dec/2025:08:30:22.293] listener listener/metadata 0/0/0/12/12 200 123 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/local-hostname HTTP/1.1" Dec 15 03:30:22 localhost haproxy-metadata-proxy-befb7a72-17a9-4bcb-b561-84b8f626685a[84166]: 192.168.0.201:33272 [15/Dec/2025:08:30:22.344] listener listener/metadata 0/0/0/10/10 404 281 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/user-data HTTP/1.1" Dec 15 03:30:22 localhost haproxy-metadata-proxy-befb7a72-17a9-4bcb-b561-84b8f626685a[84166]: 192.168.0.201:33284 [15/Dec/2025:08:30:22.392] listener listener/metadata 0/0/0/10/10 200 139 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/block-device-mapping HTTP/1.1" Dec 15 03:30:22 localhost haproxy-metadata-proxy-befb7a72-17a9-4bcb-b561-84b8f626685a[84166]: 192.168.0.201:33288 [15/Dec/2025:08:30:22.430] listener listener/metadata 0/0/0/12/12 200 122 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/block-device-mapping/ami HTTP/1.1" Dec 15 03:30:22 localhost haproxy-metadata-proxy-befb7a72-17a9-4bcb-b561-84b8f626685a[84166]: 192.168.0.201:33292 [15/Dec/2025:08:30:22.469] listener listener/metadata 0/0/0/10/10 200 127 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/block-device-mapping/ephemeral0 HTTP/1.1" Dec 15 03:30:22 localhost haproxy-metadata-proxy-befb7a72-17a9-4bcb-b561-84b8f626685a[84166]: 192.168.0.201:33304 [15/Dec/2025:08:30:22.507] listener listener/metadata 0/0/0/11/11 200 127 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/block-device-mapping/root HTTP/1.1" Dec 15 03:30:22 localhost haproxy-metadata-proxy-befb7a72-17a9-4bcb-b561-84b8f626685a[84166]: 192.168.0.201:33308 [15/Dec/2025:08:30:22.559] listener listener/metadata 0/0/0/11/11 200 123 - - ---- 1/1/0/0/0 0/0 "GET 
/2009-04-04/meta-data/public-hostname HTTP/1.1" Dec 15 03:30:22 localhost haproxy-metadata-proxy-befb7a72-17a9-4bcb-b561-84b8f626685a[84166]: 192.168.0.201:33316 [15/Dec/2025:08:30:22.608] listener listener/metadata 0/0/0/13/13 200 123 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/placement/availability-zone HTTP/1.1" Dec 15 03:30:23 localhost systemd[1]: Started /usr/bin/podman healthcheck run 4c3f871f4bdf287b1d3ce717956c1cf130617489217410eadf6fffcc440eb3b0. Dec 15 03:30:23 localhost systemd[1]: tmp-crun.BwDiAg.mount: Deactivated successfully. Dec 15 03:30:23 localhost podman[84477]: 2025-12-15 08:30:23.667593104 +0000 UTC m=+0.076925585 container health_status 4c3f871f4bdf287b1d3ce717956c1cf130617489217410eadf6fffcc440eb3b0 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, com.redhat.component=openstack-nova-compute-container, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.12, vendor=Red Hat, Inc., architecture=x86_64, description=Red Hat OpenStack Platform 17.1 nova-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.expose-services=, managed_by=tripleo_ansible, distribution-scope=public, konflux.additional-tags=17.1.12 17.1_20251118.1, batch=17.1_20251118.1, config_id=tripleo_step4, vcs-type=git, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '879500e96bf8dfb93687004bd86f2317'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1761123044, build-date=2025-11-19T00:36:58Z, io.buildah.version=1.41.4, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp17/openstack-nova-compute, tcib_managed=true, container_name=nova_migration_target, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d) Dec 15 03:30:24 localhost podman[84477]: 2025-12-15 08:30:24.081585782 +0000 UTC m=+0.490918253 container exec_died 4c3f871f4bdf287b1d3ce717956c1cf130617489217410eadf6fffcc440eb3b0 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, batch=17.1_20251118.1, name=rhosp17/openstack-nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1761123044, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, distribution-scope=public, com.redhat.component=openstack-nova-compute-container, container_name=nova_migration_target, konflux.additional-tags=17.1.12 17.1_20251118.1, vendor=Red Hat, Inc., io.buildah.version=1.41.4, url=https://www.redhat.com, 
maintainer=OpenStack TripleO Team, build-date=2025-11-19T00:36:58Z, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.12, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, io.openshift.expose-services=, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, summary=Red Hat OpenStack Platform 17.1 nova-compute, config_id=tripleo_step4, vcs-type=git, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '879500e96bf8dfb93687004bd86f2317'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, architecture=x86_64, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d) Dec 15 03:30:24 localhost systemd[1]: 4c3f871f4bdf287b1d3ce717956c1cf130617489217410eadf6fffcc440eb3b0.service: Deactivated successfully. 
Dec 15 03:30:24 localhost sshd[84503]: main: sshd: ssh-rsa algorithm is disabled Dec 15 03:30:26 localhost systemd[1]: Started /usr/bin/podman healthcheck run 2df8d20c6ba32f38bd0e6519c9c77a4146b632f21c81197d8c319b223aad81f1. Dec 15 03:30:26 localhost systemd[1]: Started /usr/bin/podman healthcheck run 4d35b7dc3f3a4e3c60661e85d3511f50804e87244d74cab67af5c6e8c447d379. Dec 15 03:30:26 localhost systemd[1]: tmp-crun.OWCnOm.mount: Deactivated successfully. Dec 15 03:30:26 localhost podman[84505]: 2025-12-15 08:30:26.599932776 +0000 UTC m=+0.079624506 container health_status 2df8d20c6ba32f38bd0e6519c9c77a4146b632f21c81197d8c319b223aad81f1 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, container_name=ovn_controller, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 ovn-controller, build-date=2025-11-18T23:34:05Z, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, maintainer=OpenStack TripleO Team, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, io.buildah.version=1.41.4, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, 
distribution-scope=public, version=17.1.12, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.component=openstack-ovn-controller-container, batch=17.1_20251118.1, name=rhosp17/openstack-ovn-controller, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, config_id=tripleo_step4, release=1761123044, description=Red Hat OpenStack Platform 17.1 ovn-controller, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, url=https://www.redhat.com, konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible, vcs-type=git) Dec 15 03:30:26 localhost podman[84505]: 2025-12-15 08:30:26.669513663 +0000 UTC m=+0.149205393 container exec_died 2df8d20c6ba32f38bd0e6519c9c77a4146b632f21c81197d8c319b223aad81f1 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, summary=Red Hat OpenStack Platform 17.1 ovn-controller, version=17.1.12, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, container_name=ovn_controller, description=Red Hat OpenStack Platform 17.1 ovn-controller, build-date=2025-11-18T23:34:05Z, distribution-scope=public, tcib_managed=true, release=1761123044, batch=17.1_20251118.1, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, name=rhosp17/openstack-ovn-controller, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': 
['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, vcs-type=git, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, vendor=Red Hat, Inc., io.openshift.expose-services=, config_id=tripleo_step4, com.redhat.component=openstack-ovn-controller-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.buildah.version=1.41.4) Dec 15 03:30:26 localhost systemd[1]: tmp-crun.wdQubm.mount: Deactivated successfully. Dec 15 03:30:26 localhost systemd[1]: 2df8d20c6ba32f38bd0e6519c9c77a4146b632f21c81197d8c319b223aad81f1.service: Deactivated successfully. 
Dec 15 03:30:26 localhost podman[84506]: 2025-12-15 08:30:26.68401755 +0000 UTC m=+0.161160922 container health_status 4d35b7dc3f3a4e3c60661e85d3511f50804e87244d74cab67af5c6e8c447d379 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, name=rhosp17/openstack-neutron-metadata-agent-ovn, container_name=ovn_metadata_agent, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, config_id=tripleo_step4, architecture=x86_64, version=17.1.12, release=1761123044, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-19T00:14:25Z, batch=17.1_20251118.1, io.buildah.version=1.41.4, io.openshift.expose-services=, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'a56a6f14b467cd9064e40c03defa5ed7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', 
'/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, konflux.additional-tags=17.1.12 17.1_20251118.1, tcib_managed=true, maintainer=OpenStack TripleO Team, distribution-scope=public, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-type=git, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c) Dec 15 03:30:26 localhost podman[84506]: 2025-12-15 08:30:26.733336817 +0000 UTC m=+0.210480199 container exec_died 4d35b7dc3f3a4e3c60661e85d3511f50804e87244d74cab67af5c6e8c447d379 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, version=17.1.12, batch=17.1_20251118.1, architecture=x86_64, distribution-scope=public, name=rhosp17/openstack-neutron-metadata-agent-ovn, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, url=https://www.redhat.com, container_name=ovn_metadata_agent, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, 
config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'a56a6f14b467cd9064e40c03defa5ed7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, tcib_managed=true, config_id=tripleo_step4, vcs-type=git, build-date=2025-11-19T00:14:25Z, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1761123044, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, konflux.additional-tags=17.1.12 17.1_20251118.1, vendor=Red Hat, Inc.)
Dec 15 03:30:26 localhost systemd[1]: 4d35b7dc3f3a4e3c60661e85d3511f50804e87244d74cab67af5c6e8c447d379.service: Deactivated successfully.
Dec 15 03:30:29 localhost ceph-osd[31375]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [L] New memtable created with log file: #43. Immutable memtables: 0.
Dec 15 03:30:35 localhost systemd[1]: Started /usr/bin/podman healthcheck run 6278d648aec9c9f12e0efaafa0d6ae5653eeb34459516699e562eeb9c8d23b3c.
Dec 15 03:30:35 localhost podman[84549]: 2025-12-15 08:30:35.757881745 +0000 UTC m=+0.082502033 container health_status 6278d648aec9c9f12e0efaafa0d6ae5653eeb34459516699e562eeb9c8d23b3c (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, konflux.additional-tags=17.1.12 17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 qdrouterd, description=Red Hat OpenStack Platform 17.1 qdrouterd, name=rhosp17/openstack-qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '29d07da103bbd44a9ed3e29999314b03'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, distribution-scope=public, architecture=x86_64, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.buildah.version=1.41.4, batch=17.1_20251118.1, config_id=tripleo_step1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, tcib_managed=true, version=17.1.12, vcs-type=git, container_name=metrics_qdr, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, release=1761123044, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, build-date=2025-11-18T22:49:46Z, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, url=https://www.redhat.com, com.redhat.component=openstack-qdrouterd-container)
Dec 15 03:30:35 localhost podman[84549]: 2025-12-15 08:30:35.947845575 +0000 UTC m=+0.272465903 container exec_died 6278d648aec9c9f12e0efaafa0d6ae5653eeb34459516699e562eeb9c8d23b3c (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, com.redhat.component=openstack-qdrouterd-container, url=https://www.redhat.com, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, konflux.additional-tags=17.1.12 17.1_20251118.1, name=rhosp17/openstack-qdrouterd, release=1761123044, vcs-type=git, io.openshift.expose-services=, vendor=Red Hat, Inc., batch=17.1_20251118.1, architecture=x86_64, config_id=tripleo_step1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '29d07da103bbd44a9ed3e29999314b03'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, managed_by=tripleo_ansible, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, container_name=metrics_qdr, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, distribution-scope=public, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 qdrouterd, version=17.1.12, build-date=2025-11-18T22:49:46Z, description=Red Hat OpenStack Platform 17.1 qdrouterd)
Dec 15 03:30:35 localhost systemd[1]: 6278d648aec9c9f12e0efaafa0d6ae5653eeb34459516699e562eeb9c8d23b3c.service: Deactivated successfully.
Dec 15 03:30:48 localhost systemd[1]: Starting Check and recover tripleo_nova_virtqemud...
Dec 15 03:30:48 localhost recover_tripleo_nova_virtqemud[84579]: 61849
Dec 15 03:30:48 localhost systemd[1]: tripleo_nova_virtqemud_recover.service: Deactivated successfully.
Dec 15 03:30:48 localhost systemd[1]: Finished Check and recover tripleo_nova_virtqemud.
Dec 15 03:30:51 localhost systemd[1]: Started /usr/bin/podman healthcheck run 165a6fd1f2b629d16da0798ec1c9bac41d302d530a0ffa668b2315a52d6aa04e.
Dec 15 03:30:51 localhost systemd[1]: Started /usr/bin/podman healthcheck run 2649c3aec9af2f04509e1d248b1050b202fccc3ed2b86421c89a2a65376db057.
Dec 15 03:30:51 localhost systemd[1]: Started /usr/bin/podman healthcheck run 36a88083ed613f6871d2631ae462bca308a45ba583333f7cfe4d0b0c40a18ba5.
Dec 15 03:30:51 localhost systemd[1]: Started /usr/bin/podman healthcheck run 97115ff7c5a84ae7e17f79e696e94da3c63f9fe0051287fe9193bec282a03f99.
Dec 15 03:30:51 localhost systemd[1]: Started /usr/bin/podman healthcheck run ac45612fab357b615b4684dbfa5bd000dd29eb1adfbaedb6db0802fc49e89549.
Dec 15 03:30:51 localhost systemd[1]: Started /usr/bin/podman healthcheck run d6041e5471624a495d5be42684f6049ccf71802a80d5760239330151d465d146.
Dec 15 03:30:51 localhost systemd[1]: tmp-crun.Xm69M3.mount: Deactivated successfully.
Dec 15 03:30:51 localhost systemd[1]: tmp-crun.0lgA83.mount: Deactivated successfully.
Dec 15 03:30:51 localhost podman[84600]: 2025-12-15 08:30:51.793169096 +0000 UTC m=+0.097957475 container health_status d6041e5471624a495d5be42684f6049ccf71802a80d5760239330151d465d146 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, batch=17.1_20251118.1, io.openshift.expose-services=, vendor=Red Hat, Inc., release=1761123044, version=17.1.12, vcs-type=git, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-11-19T00:11:48Z, konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible, config_id=tripleo_step4, io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, container_name=ceilometer_agent_compute, distribution-scope=public, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-ceilometer-compute, architecture=x86_64, com.redhat.component=openstack-ceilometer-compute-container, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'fcee5a4a91f85471fca7b61211375646'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute)
Dec 15 03:30:51 localhost podman[84600]: 2025-12-15 08:30:51.823348001 +0000 UTC m=+0.128136370 container exec_died d6041e5471624a495d5be42684f6049ccf71802a80d5760239330151d465d146 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, konflux.additional-tags=17.1.12 17.1_20251118.1, vendor=Red Hat, Inc., distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, com.redhat.component=openstack-ceilometer-compute-container, container_name=ceilometer_agent_compute, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1761123044, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, version=17.1.12, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, url=https://www.redhat.com, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'fcee5a4a91f85471fca7b61211375646'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, maintainer=OpenStack TripleO Team, architecture=x86_64, io.openshift.expose-services=, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, build-date=2025-11-19T00:11:48Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, batch=17.1_20251118.1, config_id=tripleo_step4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.buildah.version=1.41.4, name=rhosp17/openstack-ceilometer-compute, vcs-type=git, tcib_managed=true)
Dec 15 03:30:51 localhost systemd[1]: d6041e5471624a495d5be42684f6049ccf71802a80d5760239330151d465d146.service: Deactivated successfully.
Dec 15 03:30:51 localhost podman[84580]: 2025-12-15 08:30:51.806853641 +0000 UTC m=+0.131489220 container health_status 165a6fd1f2b629d16da0798ec1c9bac41d302d530a0ffa668b2315a52d6aa04e (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, distribution-scope=public, io.buildah.version=1.41.4, release=1761123044, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, config_id=tripleo_step3, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, com.redhat.component=openstack-collectd-container, container_name=collectd, url=https://www.redhat.com, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.expose-services=, name=rhosp17/openstack-collectd, konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 collectd, version=17.1.12, build-date=2025-11-18T22:51:28Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 collectd)
Dec 15 03:30:51 localhost podman[84581]: 2025-12-15 08:30:51.858842909 +0000 UTC m=+0.182626965 container health_status 2649c3aec9af2f04509e1d248b1050b202fccc3ed2b86421c89a2a65376db057 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, release=1761123044, url=https://www.redhat.com, tcib_managed=true, build-date=2025-11-18T23:44:13Z, distribution-scope=public, architecture=x86_64, com.redhat.component=openstack-iscsid-container, konflux.additional-tags=17.1.12 17.1_20251118.1, version=17.1.12, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, summary=Red Hat OpenStack Platform 17.1 iscsid, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, vendor=Red Hat, Inc., config_id=tripleo_step3, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '182e509007ab5e6e5b2500a552cbd5ba'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, managed_by=tripleo_ansible, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, batch=17.1_20251118.1, vcs-type=git, container_name=iscsid, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 iscsid, io.buildah.version=1.41.4, name=rhosp17/openstack-iscsid, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Dec 15 03:30:51 localhost podman[84582]: 2025-12-15 08:30:51.768622311 +0000 UTC m=+0.090504397 container health_status 36a88083ed613f6871d2631ae462bca308a45ba583333f7cfe4d0b0c40a18ba5 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, io.openshift.expose-services=, vcs-type=git, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '182e509007ab5e6e5b2500a552cbd5ba-879500e96bf8dfb93687004bd86f2317'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, summary=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, name=rhosp17/openstack-nova-compute, url=https://www.redhat.com, build-date=2025-11-19T00:36:58Z, config_id=tripleo_step5, io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., container_name=nova_compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, konflux.additional-tags=17.1.12 17.1_20251118.1, release=1761123044, version=17.1.12, tcib_managed=true, batch=17.1_20251118.1, com.redhat.component=openstack-nova-compute-container, description=Red Hat OpenStack Platform 17.1 nova-compute, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, architecture=x86_64, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05)
Dec 15 03:30:51 localhost podman[84583]: 2025-12-15 08:30:51.8371543 +0000 UTC m=+0.151636058 container health_status 97115ff7c5a84ae7e17f79e696e94da3c63f9fe0051287fe9193bec282a03f99 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, container_name=ceilometer_agent_ipmi, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'fcee5a4a91f85471fca7b61211375646'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, release=1761123044, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vendor=Red Hat, Inc., org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, managed_by=tripleo_ansible, konflux.additional-tags=17.1.12 17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, name=rhosp17/openstack-ceilometer-ipmi, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, batch=17.1_20251118.1, build-date=2025-11-19T00:12:45Z, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, maintainer=OpenStack TripleO Team, tcib_managed=true, io.buildah.version=1.41.4, vcs-type=git, distribution-scope=public, architecture=x86_64, com.redhat.component=openstack-ceilometer-ipmi-container)
Dec 15 03:30:51 localhost podman[84580]: 2025-12-15 08:30:51.888454439 +0000 UTC m=+0.213090018 container exec_died 165a6fd1f2b629d16da0798ec1c9bac41d302d530a0ffa668b2315a52d6aa04e (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, io.buildah.version=1.41.4, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, version=17.1.12, container_name=collectd, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 collectd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, release=1761123044, vendor=Red Hat, Inc., com.redhat.component=openstack-collectd-container, build-date=2025-11-18T22:51:28Z, maintainer=OpenStack TripleO Team, config_id=tripleo_step3, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, vcs-type=git, managed_by=tripleo_ansible, batch=17.1_20251118.1, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 collectd, konflux.additional-tags=17.1.12 17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, name=rhosp17/openstack-collectd, url=https://www.redhat.com)
Dec 15 03:30:51 localhost systemd[1]: 165a6fd1f2b629d16da0798ec1c9bac41d302d530a0ffa668b2315a52d6aa04e.service: Deactivated successfully.
Dec 15 03:30:51 localhost podman[84582]: 2025-12-15 08:30:51.903227484 +0000 UTC m=+0.225109500 container exec_died 36a88083ed613f6871d2631ae462bca308a45ba583333f7cfe4d0b0c40a18ba5 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, container_name=nova_compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.expose-services=, release=1761123044, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, architecture=x86_64, version=17.1.12, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, vendor=Red Hat, Inc., batch=17.1_20251118.1, build-date=2025-11-19T00:36:58Z, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.4, vcs-type=git, managed_by=tripleo_ansible, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp17/openstack-nova-compute, url=https://www.redhat.com, tcib_managed=true, config_id=tripleo_step5, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.component=openstack-nova-compute-container, description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '182e509007ab5e6e5b2500a552cbd5ba-879500e96bf8dfb93687004bd86f2317'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']})
Dec 15 03:30:51 localhost systemd[1]: 36a88083ed613f6871d2631ae462bca308a45ba583333f7cfe4d0b0c40a18ba5.service: Deactivated successfully.
Dec 15 03:30:51 localhost podman[84583]: 2025-12-15 08:30:51.971370722 +0000 UTC m=+0.285852470 container exec_died 97115ff7c5a84ae7e17f79e696e94da3c63f9fe0051287fe9193bec282a03f99 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, version=17.1.12, com.redhat.component=openstack-ceilometer-ipmi-container, tcib_managed=true, config_id=tripleo_step4, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'fcee5a4a91f85471fca7b61211375646'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, release=1761123044, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public, managed_by=tripleo_ansible, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-19T00:12:45Z, io.buildah.version=1.41.4, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, batch=17.1_20251118.1, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, container_name=ceilometer_agent_ipmi, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, name=rhosp17/openstack-ceilometer-ipmi, vcs-type=git, konflux.additional-tags=17.1.12 17.1_20251118.1)
Dec 15 03:30:51 localhost systemd[1]: 97115ff7c5a84ae7e17f79e696e94da3c63f9fe0051287fe9193bec282a03f99.service: Deactivated successfully.
Dec 15 03:30:51 localhost podman[84581]: 2025-12-15 08:30:51.994598832 +0000 UTC m=+0.318382978 container exec_died 2649c3aec9af2f04509e1d248b1050b202fccc3ed2b86421c89a2a65376db057 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '182e509007ab5e6e5b2500a552cbd5ba'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, build-date=2025-11-18T23:44:13Z, io.buildah.version=1.41.4, url=https://www.redhat.com, config_id=tripleo_step3, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, batch=17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, summary=Red Hat OpenStack Platform 17.1 iscsid, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, release=1761123044, description=Red Hat OpenStack Platform 17.1 iscsid, container_name=iscsid, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-iscsid-container, name=rhosp17/openstack-iscsid, io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1, version=17.1.12, tcib_managed=true, vendor=Red Hat, Inc.)
Dec 15 03:30:52 localhost systemd[1]: 2649c3aec9af2f04509e1d248b1050b202fccc3ed2b86421c89a2a65376db057.service: Deactivated successfully.
Dec 15 03:30:52 localhost podman[84597]: 2025-12-15 08:30:51.974457884 +0000 UTC m=+0.286654921 container health_status ac45612fab357b615b4684dbfa5bd000dd29eb1adfbaedb6db0802fc49e89549 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, name=rhosp17/openstack-cron, description=Red Hat OpenStack Platform 17.1 cron, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, vcs-type=git, architecture=x86_64, io.openshift.expose-services=, build-date=2025-11-18T22:49:32Z, io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, release=1761123044, version=17.1.12, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, distribution-scope=public, container_name=logrotate_crond, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vendor=Red Hat, Inc., config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']},
maintainer=OpenStack TripleO Team, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 cron, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, batch=17.1_20251118.1, config_id=tripleo_step4, com.redhat.component=openstack-cron-container, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI) Dec 15 03:30:52 localhost podman[84597]: 2025-12-15 08:30:52.05373341 +0000 UTC m=+0.365930437 container exec_died ac45612fab357b615b4684dbfa5bd000dd29eb1adfbaedb6db0802fc49e89549 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., build-date=2025-11-18T22:49:32Z, vcs-type=git, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.component=openstack-cron-container, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, config_id=tripleo_step4, io.buildah.version=1.41.4, io.openshift.expose-services=, managed_by=tripleo_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, konflux.additional-tags=17.1.12 17.1_20251118.1, container_name=logrotate_crond, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, batch=17.1_20251118.1, release=1761123044, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 cron, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, summary=Red Hat OpenStack Platform 17.1 cron, name=rhosp17/openstack-cron, version=17.1.12) Dec 15 03:30:52 localhost systemd[1]: ac45612fab357b615b4684dbfa5bd000dd29eb1adfbaedb6db0802fc49e89549.service: Deactivated successfully. Dec 15 03:30:54 localhost systemd[1]: Started /usr/bin/podman healthcheck run 4c3f871f4bdf287b1d3ce717956c1cf130617489217410eadf6fffcc440eb3b0. 
Dec 15 03:30:54 localhost podman[84716]: 2025-12-15 08:30:54.74507037 +0000 UTC m=+0.078632399 container health_status 4c3f871f4bdf287b1d3ce717956c1cf130617489217410eadf6fffcc440eb3b0 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '879500e96bf8dfb93687004bd86f2317'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, release=1761123044, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, name=rhosp17/openstack-nova-compute, com.redhat.component=openstack-nova-compute-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, maintainer=OpenStack TripleO Team, batch=17.1_20251118.1, build-date=2025-11-19T00:36:58Z, config_id=tripleo_step4, 
architecture=x86_64, container_name=nova_migration_target, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, summary=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, managed_by=tripleo_ansible, io.buildah.version=1.41.4, konflux.additional-tags=17.1.12 17.1_20251118.1, tcib_managed=true, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, distribution-scope=public, url=https://www.redhat.com, version=17.1.12) Dec 15 03:30:55 localhost podman[84716]: 2025-12-15 08:30:55.083225306 +0000 UTC m=+0.416787255 container exec_died 4c3f871f4bdf287b1d3ce717956c1cf130617489217410eadf6fffcc440eb3b0 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, summary=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.buildah.version=1.41.4, vcs-type=git, release=1761123044, distribution-scope=public, version=17.1.12, config_id=tripleo_step4, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '879500e96bf8dfb93687004bd86f2317'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, name=rhosp17/openstack-nova-compute, build-date=2025-11-19T00:36:58Z, container_name=nova_migration_target, vendor=Red Hat, Inc., managed_by=tripleo_ansible, batch=17.1_20251118.1, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-nova-compute-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI) Dec 15 03:30:55 localhost systemd[1]: 4c3f871f4bdf287b1d3ce717956c1cf130617489217410eadf6fffcc440eb3b0.service: Deactivated successfully. Dec 15 03:30:57 localhost systemd[1]: Started /usr/bin/podman healthcheck run 2df8d20c6ba32f38bd0e6519c9c77a4146b632f21c81197d8c319b223aad81f1. Dec 15 03:30:57 localhost systemd[1]: Started /usr/bin/podman healthcheck run 4d35b7dc3f3a4e3c60661e85d3511f50804e87244d74cab67af5c6e8c447d379. Dec 15 03:30:57 localhost systemd[1]: tmp-crun.eMuHrJ.mount: Deactivated successfully. 
Dec 15 03:30:57 localhost podman[84739]: 2025-12-15 08:30:57.761412905 +0000 UTC m=+0.086132970 container health_status 2df8d20c6ba32f38bd0e6519c9c77a4146b632f21c81197d8c319b223aad81f1 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, build-date=2025-11-18T23:34:05Z, summary=Red Hat OpenStack Platform 17.1 ovn-controller, architecture=x86_64, distribution-scope=public, name=rhosp17/openstack-ovn-controller, io.buildah.version=1.41.4, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, batch=17.1_20251118.1, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-type=git, io.openshift.expose-services=, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, version=17.1.12, config_id=tripleo_step4, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, url=https://www.redhat.com, com.redhat.component=openstack-ovn-controller-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, konflux.additional-tags=17.1.12 17.1_20251118.1, release=1761123044, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, container_name=ovn_controller, description=Red Hat OpenStack Platform 17.1 ovn-controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', 
'/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}) Dec 15 03:30:57 localhost podman[84740]: 2025-12-15 08:30:57.807554946 +0000 UTC m=+0.129606979 container health_status 4d35b7dc3f3a4e3c60661e85d3511f50804e87244d74cab67af5c6e8c447d379 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, batch=17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, version=17.1.12, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'a56a6f14b467cd9064e40c03defa5ed7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', 
'/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, container_name=ovn_metadata_agent, io.buildah.version=1.41.4, io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1, release=1761123044, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, maintainer=OpenStack TripleO Team, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, tcib_managed=true, distribution-scope=public, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, name=rhosp17/openstack-neutron-metadata-agent-ovn, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, build-date=2025-11-19T00:14:25Z, config_id=tripleo_step4, vendor=Red Hat, Inc., managed_by=tripleo_ansible, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05) Dec 15 03:30:57 localhost podman[84739]: 2025-12-15 08:30:57.83990383 +0000 UTC m=+0.164623895 container exec_died 2df8d20c6ba32f38bd0e6519c9c77a4146b632f21c81197d8c319b223aad81f1 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, release=1761123044, url=https://www.redhat.com, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, vcs-type=git, 
batch=17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, build-date=2025-11-18T23:34:05Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-ovn-controller-container, io.buildah.version=1.41.4, name=rhosp17/openstack-ovn-controller, distribution-scope=public, container_name=ovn_controller, konflux.additional-tags=17.1.12 17.1_20251118.1, tcib_managed=true, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, description=Red Hat OpenStack Platform 17.1 ovn-controller, architecture=x86_64, maintainer=OpenStack TripleO Team, version=17.1.12, config_id=tripleo_step4, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 ovn-controller, managed_by=tripleo_ansible, vendor=Red Hat, Inc., config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}) Dec 15 03:30:57 localhost systemd[1]: 2df8d20c6ba32f38bd0e6519c9c77a4146b632f21c81197d8c319b223aad81f1.service: Deactivated successfully. 
Dec 15 03:30:57 localhost podman[84740]: 2025-12-15 08:30:57.881743047 +0000 UTC m=+0.203795050 container exec_died 4d35b7dc3f3a4e3c60661e85d3511f50804e87244d74cab67af5c6e8c447d379 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, release=1761123044, build-date=2025-11-19T00:14:25Z, architecture=x86_64, io.buildah.version=1.41.4, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'a56a6f14b467cd9064e40c03defa5ed7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, managed_by=tripleo_ansible, io.openshift.expose-services=, batch=17.1_20251118.1, vcs-type=git, config_id=tripleo_step4, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, konflux.additional-tags=17.1.12 17.1_20251118.1, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, tcib_managed=true, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, name=rhosp17/openstack-neutron-metadata-agent-ovn, version=17.1.12, container_name=ovn_metadata_agent, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05) Dec 15 03:30:57 localhost systemd[1]: 4d35b7dc3f3a4e3c60661e85d3511f50804e87244d74cab67af5c6e8c447d379.service: Deactivated successfully. Dec 15 03:31:06 localhost systemd[1]: Started /usr/bin/podman healthcheck run 6278d648aec9c9f12e0efaafa0d6ae5653eeb34459516699e562eeb9c8d23b3c. Dec 15 03:31:06 localhost systemd[1]: tmp-crun.caDXGn.mount: Deactivated successfully. 
Dec 15 03:31:06 localhost podman[84788]: 2025-12-15 08:31:06.773620035 +0000 UTC m=+0.101707045 container health_status 6278d648aec9c9f12e0efaafa0d6ae5653eeb34459516699e562eeb9c8d23b3c (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, konflux.additional-tags=17.1.12 17.1_20251118.1, io.buildah.version=1.41.4, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, build-date=2025-11-18T22:49:46Z, architecture=x86_64, io.openshift.expose-services=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '29d07da103bbd44a9ed3e29999314b03'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, release=1761123044, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, batch=17.1_20251118.1, distribution-scope=public, vcs-type=git, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-qdrouterd, 
config_id=tripleo_step1, com.redhat.component=openstack-qdrouterd-container, container_name=metrics_qdr, description=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, version=17.1.12, vendor=Red Hat, Inc., managed_by=tripleo_ansible, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream) Dec 15 03:31:06 localhost podman[84788]: 2025-12-15 08:31:06.973390396 +0000 UTC m=+0.301477456 container exec_died 6278d648aec9c9f12e0efaafa0d6ae5653eeb34459516699e562eeb9c8d23b3c (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, com.redhat.component=openstack-qdrouterd-container, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, summary=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, tcib_managed=true, container_name=metrics_qdr, url=https://www.redhat.com, name=rhosp17/openstack-qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '29d07da103bbd44a9ed3e29999314b03'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, release=1761123044, vcs-type=git, build-date=2025-11-18T22:49:46Z, architecture=x86_64, vendor=Red Hat, Inc., batch=17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 qdrouterd, distribution-scope=public, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, version=17.1.12, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, config_id=tripleo_step1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05) Dec 15 03:31:06 localhost systemd[1]: 6278d648aec9c9f12e0efaafa0d6ae5653eeb34459516699e562eeb9c8d23b3c.service: Deactivated successfully. Dec 15 03:31:22 localhost systemd[1]: Started /usr/bin/podman healthcheck run 165a6fd1f2b629d16da0798ec1c9bac41d302d530a0ffa668b2315a52d6aa04e. Dec 15 03:31:22 localhost systemd[1]: Started /usr/bin/podman healthcheck run 2649c3aec9af2f04509e1d248b1050b202fccc3ed2b86421c89a2a65376db057. Dec 15 03:31:22 localhost systemd[1]: Started /usr/bin/podman healthcheck run 36a88083ed613f6871d2631ae462bca308a45ba583333f7cfe4d0b0c40a18ba5. Dec 15 03:31:22 localhost systemd[1]: Started /usr/bin/podman healthcheck run 97115ff7c5a84ae7e17f79e696e94da3c63f9fe0051287fe9193bec282a03f99. 
Dec 15 03:31:22 localhost systemd[1]: Started /usr/bin/podman healthcheck run ac45612fab357b615b4684dbfa5bd000dd29eb1adfbaedb6db0802fc49e89549. Dec 15 03:31:22 localhost systemd[1]: Started /usr/bin/podman healthcheck run d6041e5471624a495d5be42684f6049ccf71802a80d5760239330151d465d146. Dec 15 03:31:22 localhost systemd[1]: tmp-crun.HLCDaC.mount: Deactivated successfully. Dec 15 03:31:22 localhost podman[84943]: 2025-12-15 08:31:22.775098735 +0000 UTC m=+0.096698312 container health_status 36a88083ed613f6871d2631ae462bca308a45ba583333f7cfe4d0b0c40a18ba5 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, url=https://www.redhat.com, vcs-type=git, config_id=tripleo_step5, konflux.additional-tags=17.1.12 17.1_20251118.1, batch=17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=nova_compute, build-date=2025-11-19T00:36:58Z, maintainer=OpenStack TripleO Team, distribution-scope=public, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.component=openstack-nova-compute-container, io.buildah.version=1.41.4, io.openshift.expose-services=, release=1761123044, summary=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp17/openstack-nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 nova-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': 
'182e509007ab5e6e5b2500a552cbd5ba-879500e96bf8dfb93687004bd86f2317'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, version=17.1.12, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, architecture=x86_64) Dec 15 03:31:22 localhost podman[84962]: 2025-12-15 08:31:22.795210492 +0000 UTC m=+0.102931019 container health_status d6041e5471624a495d5be42684f6049ccf71802a80d5760239330151d465d146 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, 
health_status=healthy, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'fcee5a4a91f85471fca7b61211375646'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, vcs-type=git, architecture=x86_64, name=rhosp17/openstack-ceilometer-compute, container_name=ceilometer_agent_compute, vendor=Red Hat, Inc., vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, version=17.1.12, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, batch=17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, 
config_id=tripleo_step4, konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible, tcib_managed=true, release=1761123044, com.redhat.component=openstack-ceilometer-compute-container, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, io.openshift.expose-services=, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, build-date=2025-11-19T00:11:48Z) Dec 15 03:31:22 localhost podman[84943]: 2025-12-15 08:31:22.811280261 +0000 UTC m=+0.132879868 container exec_died 36a88083ed613f6871d2631ae462bca308a45ba583333f7cfe4d0b0c40a18ba5 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, konflux.additional-tags=17.1.12 17.1_20251118.1, vendor=Red Hat, Inc., io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.component=openstack-nova-compute-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, tcib_managed=true, io.buildah.version=1.41.4, release=1761123044, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2025-11-19T00:36:58Z, config_id=tripleo_step5, version=17.1.12, managed_by=tripleo_ansible, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, batch=17.1_20251118.1, name=rhosp17/openstack-nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 
'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '182e509007ab5e6e5b2500a552cbd5ba-879500e96bf8dfb93687004bd86f2317'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, container_name=nova_compute, url=https://www.redhat.com) Dec 15 03:31:22 localhost systemd[1]: 36a88083ed613f6871d2631ae462bca308a45ba583333f7cfe4d0b0c40a18ba5.service: Deactivated successfully. 
Dec 15 03:31:22 localhost podman[84962]: 2025-12-15 08:31:22.851533295 +0000 UTC m=+0.159253842 container exec_died d6041e5471624a495d5be42684f6049ccf71802a80d5760239330151d465d146 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, name=rhosp17/openstack-ceilometer-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, tcib_managed=true, build-date=2025-11-19T00:11:48Z, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, version=17.1.12, release=1761123044, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, architecture=x86_64, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.buildah.version=1.41.4, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'fcee5a4a91f85471fca7b61211375646'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', 
'/var/log/containers/ceilometer:/var/log/ceilometer:z']}, vcs-type=git, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20251118.1, io.openshift.expose-services=, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, distribution-scope=public, config_id=tripleo_step4, container_name=ceilometer_agent_compute, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.component=openstack-ceilometer-compute-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vendor=Red Hat, Inc.) Dec 15 03:31:22 localhost systemd[1]: d6041e5471624a495d5be42684f6049ccf71802a80d5760239330151d465d146.service: Deactivated successfully. Dec 15 03:31:22 localhost podman[84942]: 2025-12-15 08:31:22.867610404 +0000 UTC m=+0.193862865 container health_status 2649c3aec9af2f04509e1d248b1050b202fccc3ed2b86421c89a2a65376db057 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, container_name=iscsid, konflux.additional-tags=17.1.12 17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 iscsid, managed_by=tripleo_ansible, release=1761123044, io.buildah.version=1.41.4, distribution-scope=public, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, build-date=2025-11-18T23:44:13Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, tcib_managed=true, url=https://www.redhat.com, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, batch=17.1_20251118.1, 
name=rhosp17/openstack-iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.component=openstack-iscsid-container, vcs-type=git, architecture=x86_64, maintainer=OpenStack TripleO Team, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '182e509007ab5e6e5b2500a552cbd5ba'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, version=17.1.12, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step3) Dec 15 03:31:22 localhost podman[84942]: 2025-12-15 08:31:22.905353861 +0000 UTC m=+0.231606372 container exec_died 2649c3aec9af2f04509e1d248b1050b202fccc3ed2b86421c89a2a65376db057 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, tcib_managed=true, release=1761123044, config_id=tripleo_step3, build-date=2025-11-18T23:44:13Z, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '182e509007ab5e6e5b2500a552cbd5ba'}, 'healthcheck': {'test': 
'/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, container_name=iscsid, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1, vendor=Red Hat, Inc., version=17.1.12, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, description=Red Hat OpenStack Platform 17.1 iscsid, architecture=x86_64, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.component=openstack-iscsid-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, io.buildah.version=1.41.4, managed_by=tripleo_ansible, name=rhosp17/openstack-iscsid, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, distribution-scope=public, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 
17.1 iscsid) Dec 15 03:31:22 localhost podman[84941]: 2025-12-15 08:31:22.916476868 +0000 UTC m=+0.241907767 container health_status 165a6fd1f2b629d16da0798ec1c9bac41d302d530a0ffa668b2315a52d6aa04e (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 collectd, io.buildah.version=1.41.4, container_name=collectd, vendor=Red Hat, Inc., architecture=x86_64, vcs-type=git, konflux.additional-tags=17.1.12 17.1_20251118.1, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, com.redhat.component=openstack-collectd-container, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, url=https://www.redhat.com, io.openshift.expose-services=, 
io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 collectd, release=1761123044, version=17.1.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1, config_id=tripleo_step3, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-11-18T22:51:28Z, distribution-scope=public, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, name=rhosp17/openstack-collectd) Dec 15 03:31:22 localhost systemd[1]: 2649c3aec9af2f04509e1d248b1050b202fccc3ed2b86421c89a2a65376db057.service: Deactivated successfully. Dec 15 03:31:22 localhost podman[84941]: 2025-12-15 08:31:22.931302254 +0000 UTC m=+0.256733183 container exec_died 165a6fd1f2b629d16da0798ec1c9bac41d302d530a0ffa668b2315a52d6aa04e (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, build-date=2025-11-18T22:51:28Z, config_id=tripleo_step3, konflux.additional-tags=17.1.12 17.1_20251118.1, batch=17.1_20251118.1, io.buildah.version=1.41.4, architecture=x86_64, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, summary=Red Hat OpenStack Platform 17.1 collectd, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, io.openshift.expose-services=, name=rhosp17/openstack-collectd, com.redhat.component=openstack-collectd-container, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, vcs-type=git, release=1761123044, managed_by=tripleo_ansible, vendor=Red 
Hat, Inc., description=Red Hat OpenStack Platform 17.1 collectd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, version=17.1.12, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, container_name=collectd) Dec 15 03:31:22 localhost systemd[1]: 165a6fd1f2b629d16da0798ec1c9bac41d302d530a0ffa668b2315a52d6aa04e.service: Deactivated successfully. 
Dec 15 03:31:23 localhost podman[84956]: 2025-12-15 08:31:23.005809472 +0000 UTC m=+0.315974673 container health_status ac45612fab357b615b4684dbfa5bd000dd29eb1adfbaedb6db0802fc49e89549 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1761123044, io.buildah.version=1.41.4, build-date=2025-11-18T22:49:32Z, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vcs-type=git, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 cron, batch=17.1_20251118.1, config_id=tripleo_step4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, name=rhosp17/openstack-cron, vendor=Red Hat, Inc., io.openshift.expose-services=, maintainer=OpenStack TripleO Team, tcib_managed=true, com.redhat.component=openstack-cron-container, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, version=17.1.12, container_name=logrotate_crond, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, summary=Red Hat OpenStack Platform 17.1 cron, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, architecture=x86_64) Dec 15 03:31:23 localhost podman[84956]: 2025-12-15 08:31:23.01735248 +0000 UTC m=+0.327517651 container exec_died ac45612fab357b615b4684dbfa5bd000dd29eb1adfbaedb6db0802fc49e89549 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', 
'/var/log/containers:/var/log/containers:z']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1, release=1761123044, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible, version=17.1.12, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, architecture=x86_64, vcs-type=git, container_name=logrotate_crond, name=rhosp17/openstack-cron, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 cron, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 cron, io.openshift.expose-services=, io.buildah.version=1.41.4, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, build-date=2025-11-18T22:49:32Z, com.redhat.component=openstack-cron-container, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron) Dec 15 03:31:23 localhost systemd[1]: ac45612fab357b615b4684dbfa5bd000dd29eb1adfbaedb6db0802fc49e89549.service: Deactivated successfully. 
Dec 15 03:31:23 localhost podman[84944]: 2025-12-15 08:31:22.972547934 +0000 UTC m=+0.289622670 container health_status 97115ff7c5a84ae7e17f79e696e94da3c63f9fe0051287fe9193bec282a03f99 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, konflux.additional-tags=17.1.12 17.1_20251118.1, config_id=tripleo_step4, distribution-scope=public, vendor=Red Hat, Inc., vcs-type=git, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, batch=17.1_20251118.1, version=17.1.12, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, architecture=x86_64, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'fcee5a4a91f85471fca7b61211375646'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=ceilometer_agent_ipmi, build-date=2025-11-19T00:12:45Z, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, 
io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, com.redhat.component=openstack-ceilometer-ipmi-container, name=rhosp17/openstack-ceilometer-ipmi, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, io.buildah.version=1.41.4, release=1761123044) Dec 15 03:31:23 localhost podman[84944]: 2025-12-15 08:31:23.103576332 +0000 UTC m=+0.420651008 container exec_died 97115ff7c5a84ae7e17f79e696e94da3c63f9fe0051287fe9193bec282a03f99 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, container_name=ceilometer_agent_ipmi, release=1761123044, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, url=https://www.redhat.com, config_id=tripleo_step4, com.redhat.component=openstack-ceilometer-ipmi-container, io.buildah.version=1.41.4, build-date=2025-11-19T00:12:45Z, tcib_managed=true, architecture=x86_64, maintainer=OpenStack TripleO Team, distribution-scope=public, version=17.1.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.expose-services=, batch=17.1_20251118.1, konflux.additional-tags=17.1.12 17.1_20251118.1, name=rhosp17/openstack-ceilometer-ipmi, vcs-type=git, config_data={'environment': 
{'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'fcee5a4a91f85471fca7b61211375646'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi) Dec 15 03:31:23 localhost systemd[1]: 97115ff7c5a84ae7e17f79e696e94da3c63f9fe0051287fe9193bec282a03f99.service: Deactivated successfully. Dec 15 03:31:23 localhost systemd[1]: tmp-crun.zQMA8C.mount: Deactivated successfully. Dec 15 03:31:25 localhost systemd[1]: Started /usr/bin/podman healthcheck run 4c3f871f4bdf287b1d3ce717956c1cf130617489217410eadf6fffcc440eb3b0. 
Dec 15 03:31:25 localhost podman[85080]: 2025-12-15 08:31:25.747138557 +0000 UTC m=+0.081247040 container health_status 4c3f871f4bdf287b1d3ce717956c1cf130617489217410eadf6fffcc440eb3b0 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, vcs-type=git, url=https://www.redhat.com, vendor=Red Hat, Inc., managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.41.4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, name=rhosp17/openstack-nova-compute, maintainer=OpenStack TripleO Team, distribution-scope=public, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.expose-services=, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, description=Red Hat OpenStack Platform 17.1 nova-compute, release=1761123044, tcib_managed=true, build-date=2025-11-19T00:36:58Z, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, version=17.1.12, container_name=nova_migration_target, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '879500e96bf8dfb93687004bd86f2317'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', 
'/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, com.redhat.component=openstack-nova-compute-container, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, config_id=tripleo_step4, batch=17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute) Dec 15 03:31:26 localhost podman[85080]: 2025-12-15 08:31:26.153648136 +0000 UTC m=+0.487756619 container exec_died 4c3f871f4bdf287b1d3ce717956c1cf130617489217410eadf6fffcc440eb3b0 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, summary=Red Hat OpenStack Platform 17.1 nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp17/openstack-nova-compute, io.buildah.version=1.41.4, managed_by=tripleo_ansible, build-date=2025-11-19T00:36:58Z, vendor=Red Hat, Inc., container_name=nova_migration_target, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, vcs-type=git, config_id=tripleo_step4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, distribution-scope=public, com.redhat.component=openstack-nova-compute-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, url=https://www.redhat.com, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, tcib_managed=true, 
org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, version=17.1.12, architecture=x86_64, konflux.additional-tags=17.1.12 17.1_20251118.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '879500e96bf8dfb93687004bd86f2317'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, batch=17.1_20251118.1, release=1761123044) Dec 15 03:31:26 localhost systemd[1]: 4c3f871f4bdf287b1d3ce717956c1cf130617489217410eadf6fffcc440eb3b0.service: Deactivated successfully. Dec 15 03:31:28 localhost systemd[1]: Started /usr/bin/podman healthcheck run 2df8d20c6ba32f38bd0e6519c9c77a4146b632f21c81197d8c319b223aad81f1. Dec 15 03:31:28 localhost systemd[1]: Started /usr/bin/podman healthcheck run 4d35b7dc3f3a4e3c60661e85d3511f50804e87244d74cab67af5c6e8c447d379. 
Dec 15 03:31:28 localhost podman[85104]: 2025-12-15 08:31:28.75779811 +0000 UTC m=+0.088585666 container health_status 2df8d20c6ba32f38bd0e6519c9c77a4146b632f21c81197d8c319b223aad81f1 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.component=openstack-ovn-controller-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, tcib_managed=true, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., config_id=tripleo_step4, konflux.additional-tags=17.1.12 17.1_20251118.1, io.buildah.version=1.41.4, release=1761123044, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, name=rhosp17/openstack-ovn-controller, description=Red Hat OpenStack Platform 17.1 ovn-controller, managed_by=tripleo_ansible, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, batch=17.1_20251118.1, url=https://www.redhat.com, version=17.1.12, container_name=ovn_controller, distribution-scope=public, build-date=2025-11-18T23:34:05Z, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.expose-services=, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', 
'/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, summary=Red Hat OpenStack Platform 17.1 ovn-controller) Dec 15 03:31:28 localhost podman[85105]: 2025-12-15 08:31:28.813085425 +0000 UTC m=+0.140834660 container health_status 4d35b7dc3f3a4e3c60661e85d3511f50804e87244d74cab67af5c6e8c447d379 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, release=1761123044, maintainer=OpenStack TripleO Team, batch=17.1_20251118.1, name=rhosp17/openstack-neutron-metadata-agent-ovn, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, container_name=ovn_metadata_agent, io.buildah.version=1.41.4, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible, url=https://www.redhat.com, tcib_managed=true, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'a56a6f14b467cd9064e40c03defa5ed7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, vendor=Red Hat, Inc., com.redhat.component=openstack-neutron-metadata-agent-ovn-container, build-date=2025-11-19T00:14:25Z, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, version=17.1.12, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.expose-services=, vcs-type=git, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_id=tripleo_step4, distribution-scope=public, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn) Dec 15 03:31:28 localhost podman[85104]: 2025-12-15 08:31:28.831545648 +0000 UTC m=+0.162333264 container exec_died 2df8d20c6ba32f38bd0e6519c9c77a4146b632f21c81197d8c319b223aad81f1 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, version=17.1.12, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-type=git, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 
'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, release=1761123044, batch=17.1_20251118.1, io.openshift.expose-services=, build-date=2025-11-18T23:34:05Z, tcib_managed=true, name=rhosp17/openstack-ovn-controller, managed_by=tripleo_ansible, config_id=tripleo_step4, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vendor=Red Hat, Inc., vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, com.redhat.component=openstack-ovn-controller-container, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 ovn-controller, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, distribution-scope=public, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, container_name=ovn_controller, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller) Dec 15 03:31:28 localhost systemd[1]: 2df8d20c6ba32f38bd0e6519c9c77a4146b632f21c81197d8c319b223aad81f1.service: Deactivated successfully. 
Dec 15 03:31:28 localhost podman[85105]: 2025-12-15 08:31:28.887704586 +0000 UTC m=+0.215453821 container exec_died 4d35b7dc3f3a4e3c60661e85d3511f50804e87244d74cab67af5c6e8c447d379 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-neutron-metadata-agent-ovn, version=17.1.12, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.buildah.version=1.41.4, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, vcs-type=git, architecture=x86_64, release=1761123044, vendor=Red Hat, Inc., konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, tcib_managed=true, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'a56a6f14b467cd9064e40c03defa5ed7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, container_name=ovn_metadata_agent, io.openshift.expose-services=, build-date=2025-11-19T00:14:25Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, batch=17.1_20251118.1, url=https://www.redhat.com, config_id=tripleo_step4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn) Dec 15 03:31:28 localhost systemd[1]: 4d35b7dc3f3a4e3c60661e85d3511f50804e87244d74cab67af5c6e8c447d379.service: Deactivated successfully. Dec 15 03:31:37 localhost systemd[1]: Started /usr/bin/podman healthcheck run 6278d648aec9c9f12e0efaafa0d6ae5653eeb34459516699e562eeb9c8d23b3c. 
Dec 15 03:31:37 localhost podman[85153]: 2025-12-15 08:31:37.756048987 +0000 UTC m=+0.083846309 container health_status 6278d648aec9c9f12e0efaafa0d6ae5653eeb34459516699e562eeb9c8d23b3c (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, konflux.additional-tags=17.1.12 17.1_20251118.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '29d07da103bbd44a9ed3e29999314b03'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, com.redhat.component=openstack-qdrouterd-container, version=17.1.12, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, release=1761123044, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, vendor=Red Hat, Inc., config_id=tripleo_step1, distribution-scope=public, vcs-type=git, 
org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, description=Red Hat OpenStack Platform 17.1 qdrouterd, managed_by=tripleo_ansible, url=https://www.redhat.com, container_name=metrics_qdr, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, summary=Red Hat OpenStack Platform 17.1 qdrouterd, name=rhosp17/openstack-qdrouterd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, batch=17.1_20251118.1, io.buildah.version=1.41.4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, tcib_managed=true, build-date=2025-11-18T22:49:46Z) Dec 15 03:31:37 localhost podman[85153]: 2025-12-15 08:31:37.953380753 +0000 UTC m=+0.281178135 container exec_died 6278d648aec9c9f12e0efaafa0d6ae5653eeb34459516699e562eeb9c8d23b3c (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '29d07da103bbd44a9ed3e29999314b03'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, com.redhat.component=openstack-qdrouterd-container, url=https://www.redhat.com, batch=17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.expose-services=, build-date=2025-11-18T22:49:46Z, release=1761123044, container_name=metrics_qdr, architecture=x86_64, tcib_managed=true, vendor=Red Hat, Inc., config_id=tripleo_step1, vcs-type=git, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, summary=Red Hat OpenStack Platform 17.1 qdrouterd, version=17.1.12, name=rhosp17/openstack-qdrouterd, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.12 17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.buildah.version=1.41.4) Dec 15 03:31:37 localhost systemd[1]: 6278d648aec9c9f12e0efaafa0d6ae5653eeb34459516699e562eeb9c8d23b3c.service: Deactivated successfully. Dec 15 03:31:44 localhost sshd[85182]: main: sshd: ssh-rsa algorithm is disabled Dec 15 03:31:53 localhost systemd[1]: Started /usr/bin/podman healthcheck run 165a6fd1f2b629d16da0798ec1c9bac41d302d530a0ffa668b2315a52d6aa04e. Dec 15 03:31:53 localhost systemd[1]: Started /usr/bin/podman healthcheck run 2649c3aec9af2f04509e1d248b1050b202fccc3ed2b86421c89a2a65376db057. Dec 15 03:31:53 localhost systemd[1]: Started /usr/bin/podman healthcheck run 36a88083ed613f6871d2631ae462bca308a45ba583333f7cfe4d0b0c40a18ba5. 
Dec 15 03:31:53 localhost systemd[1]: Started /usr/bin/podman healthcheck run 97115ff7c5a84ae7e17f79e696e94da3c63f9fe0051287fe9193bec282a03f99. Dec 15 03:31:53 localhost systemd[1]: Started /usr/bin/podman healthcheck run ac45612fab357b615b4684dbfa5bd000dd29eb1adfbaedb6db0802fc49e89549. Dec 15 03:31:53 localhost systemd[1]: Started /usr/bin/podman healthcheck run d6041e5471624a495d5be42684f6049ccf71802a80d5760239330151d465d146. Dec 15 03:31:53 localhost systemd[1]: Starting Check and recover tripleo_nova_virtqemud... Dec 15 03:31:53 localhost recover_tripleo_nova_virtqemud[85223]: 61849 Dec 15 03:31:53 localhost systemd[1]: tripleo_nova_virtqemud_recover.service: Deactivated successfully. Dec 15 03:31:53 localhost systemd[1]: Finished Check and recover tripleo_nova_virtqemud. Dec 15 03:31:53 localhost podman[85186]: 2025-12-15 08:31:53.78924758 +0000 UTC m=+0.105988899 container health_status 36a88083ed613f6871d2631ae462bca308a45ba583333f7cfe4d0b0c40a18ba5 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, version=17.1.12, url=https://www.redhat.com, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, managed_by=tripleo_ansible, architecture=x86_64, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, distribution-scope=public, io.openshift.expose-services=, com.redhat.component=openstack-nova-compute-container, io.buildah.version=1.41.4, description=Red Hat OpenStack Platform 17.1 nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': 
'182e509007ab5e6e5b2500a552cbd5ba-879500e96bf8dfb93687004bd86f2317'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp17/openstack-nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, config_id=tripleo_step5, tcib_managed=true, release=1761123044, 
container_name=nova_compute, build-date=2025-11-19T00:36:58Z, vendor=Red Hat, Inc.) Dec 15 03:31:53 localhost podman[85186]: 2025-12-15 08:31:53.825370145 +0000 UTC m=+0.142111514 container exec_died 36a88083ed613f6871d2631ae462bca308a45ba583333f7cfe4d0b0c40a18ba5 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, description=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20251118.1, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '182e509007ab5e6e5b2500a552cbd5ba-879500e96bf8dfb93687004bd86f2317'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', 
'/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, com.redhat.component=openstack-nova-compute-container, build-date=2025-11-19T00:36:58Z, summary=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., container_name=nova_compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, url=https://www.redhat.com, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.4, name=rhosp17/openstack-nova-compute, vcs-type=git, architecture=x86_64, config_id=tripleo_step5, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, version=17.1.12, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, tcib_managed=true, release=1761123044, distribution-scope=public, io.openshift.expose-services=) Dec 15 03:31:53 localhost systemd[1]: 36a88083ed613f6871d2631ae462bca308a45ba583333f7cfe4d0b0c40a18ba5.service: Deactivated successfully. 
Dec 15 03:31:53 localhost podman[85184]: 2025-12-15 08:31:53.843468787 +0000 UTC m=+0.163442993 container health_status 165a6fd1f2b629d16da0798ec1c9bac41d302d530a0ffa668b2315a52d6aa04e (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, summary=Red Hat OpenStack Platform 17.1 collectd, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, tcib_managed=true, batch=17.1_20251118.1, architecture=x86_64, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.4, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-type=git, build-date=2025-11-18T22:51:28Z, name=rhosp17/openstack-collectd, release=1761123044, 
distribution-scope=public, url=https://www.redhat.com, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, description=Red Hat OpenStack Platform 17.1 collectd, config_id=tripleo_step3, version=17.1.12, com.redhat.component=openstack-collectd-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, managed_by=tripleo_ansible, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, container_name=collectd) Dec 15 03:31:53 localhost podman[85205]: 2025-12-15 08:31:53.906254594 +0000 UTC m=+0.209763770 container health_status d6041e5471624a495d5be42684f6049ccf71802a80d5760239330151d465d146 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, url=https://www.redhat.com, managed_by=tripleo_ansible, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, konflux.additional-tags=17.1.12 17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, build-date=2025-11-19T00:11:48Z, com.redhat.component=openstack-ceilometer-compute-container, maintainer=OpenStack TripleO Team, vcs-type=git, config_data={'depends_on': 
['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'fcee5a4a91f85471fca7b61211375646'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, batch=17.1_20251118.1, io.openshift.expose-services=, version=17.1.12, release=1761123044, name=rhosp17/openstack-ceilometer-compute, distribution-scope=public, config_id=tripleo_step4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, vendor=Red Hat, Inc., container_name=ceilometer_agent_compute, io.buildah.version=1.41.4) Dec 15 03:31:53 localhost podman[85184]: 2025-12-15 08:31:53.959680989 +0000 UTC m=+0.279655235 container exec_died 165a6fd1f2b629d16da0798ec1c9bac41d302d530a0ffa668b2315a52d6aa04e (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, distribution-scope=public, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, summary=Red Hat OpenStack Platform 17.1 collectd, 
cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, tcib_managed=true, vcs-type=git, batch=17.1_20251118.1, config_id=tripleo_step3, build-date=2025-11-18T22:51:28Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, konflux.additional-tags=17.1.12 17.1_20251118.1, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 collectd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, io.buildah.version=1.41.4, managed_by=tripleo_ansible, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, name=rhosp17/openstack-collectd, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', 
'/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, version=17.1.12, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, vendor=Red Hat, Inc., container_name=collectd, url=https://www.redhat.com, release=1761123044, com.redhat.component=openstack-collectd-container, io.openshift.expose-services=) Dec 15 03:31:53 localhost systemd[1]: 165a6fd1f2b629d16da0798ec1c9bac41d302d530a0ffa668b2315a52d6aa04e.service: Deactivated successfully. Dec 15 03:31:53 localhost podman[85205]: 2025-12-15 08:31:53.9773083 +0000 UTC m=+0.280817426 container exec_died d6041e5471624a495d5be42684f6049ccf71802a80d5760239330151d465d146 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, config_id=tripleo_step4, vcs-type=git, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, tcib_managed=true, distribution-scope=public, container_name=ceilometer_agent_compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, batch=17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, name=rhosp17/openstack-ceilometer-compute, architecture=x86_64, build-date=2025-11-19T00:11:48Z, com.redhat.component=openstack-ceilometer-compute-container, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, version=17.1.12, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.description=Red Hat OpenStack Platform 17.1 
ceilometer-compute, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, vendor=Red Hat, Inc., release=1761123044, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'fcee5a4a91f85471fca7b61211375646'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}) Dec 15 03:31:53 localhost systemd[1]: d6041e5471624a495d5be42684f6049ccf71802a80d5760239330151d465d146.service: Deactivated successfully. 
Dec 15 03:31:54 localhost podman[85187]: 2025-12-15 08:31:54.028475435 +0000 UTC m=+0.342687817 container health_status 97115ff7c5a84ae7e17f79e696e94da3c63f9fe0051287fe9193bec282a03f99 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, com.redhat.component=openstack-ceilometer-ipmi-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, build-date=2025-11-19T00:12:45Z, batch=17.1_20251118.1, container_name=ceilometer_agent_ipmi, config_id=tripleo_step4, io.buildah.version=1.41.4, architecture=x86_64, vendor=Red Hat, Inc., url=https://www.redhat.com, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, release=1761123044, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.expose-services=, version=17.1.12, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'fcee5a4a91f85471fca7b61211375646'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, name=rhosp17/openstack-ceilometer-ipmi, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05) Dec 15 03:31:54 localhost podman[85187]: 2025-12-15 08:31:54.083052552 +0000 UTC m=+0.397264954 container exec_died 97115ff7c5a84ae7e17f79e696e94da3c63f9fe0051287fe9193bec282a03f99 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, tcib_managed=true, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.buildah.version=1.41.4, build-date=2025-11-19T00:12:45Z, distribution-scope=public, com.redhat.component=openstack-ceilometer-ipmi-container, konflux.additional-tags=17.1.12 17.1_20251118.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'fcee5a4a91f85471fca7b61211375646'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, version=17.1.12, architecture=x86_64, vendor=Red Hat, Inc., io.openshift.expose-services=, config_id=tripleo_step4, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, release=1761123044, name=rhosp17/openstack-ceilometer-ipmi, managed_by=tripleo_ansible, batch=17.1_20251118.1, container_name=ceilometer_agent_ipmi, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05) Dec 15 03:31:54 localhost systemd[1]: 97115ff7c5a84ae7e17f79e696e94da3c63f9fe0051287fe9193bec282a03f99.service: Deactivated successfully. 
Dec 15 03:31:54 localhost podman[85185]: 2025-12-15 08:31:54.141152453 +0000 UTC m=+0.461553370 container health_status 2649c3aec9af2f04509e1d248b1050b202fccc3ed2b86421c89a2a65376db057 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, config_id=tripleo_step3, url=https://www.redhat.com, com.redhat.component=openstack-iscsid-container, distribution-scope=public, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, summary=Red Hat OpenStack Platform 17.1 iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '182e509007ab5e6e5b2500a552cbd5ba'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, io.openshift.expose-services=, vcs-type=git, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, name=rhosp17/openstack-iscsid, vendor=Red Hat, Inc., io.buildah.version=1.41.4, batch=17.1_20251118.1, release=1761123044, description=Red Hat OpenStack Platform 17.1 iscsid, container_name=iscsid, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, version=17.1.12, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, managed_by=tripleo_ansible, konflux.additional-tags=17.1.12 17.1_20251118.1, build-date=2025-11-18T23:44:13Z, architecture=x86_64) Dec 15 03:31:54 localhost podman[85185]: 2025-12-15 08:31:54.179263789 +0000 UTC m=+0.499664696 container exec_died 2649c3aec9af2f04509e1d248b1050b202fccc3ed2b86421c89a2a65376db057 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, batch=17.1_20251118.1, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, vcs-type=git, build-date=2025-11-18T23:44:13Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, container_name=iscsid, name=rhosp17/openstack-iscsid, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, url=https://www.redhat.com, architecture=x86_64, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 iscsid, version=17.1.12, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 17.1_20251118.1, release=1761123044, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.component=openstack-iscsid-container, tcib_managed=true, config_data={'environment': 
{'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '182e509007ab5e6e5b2500a552cbd5ba'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, vendor=Red Hat, Inc., vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, io.buildah.version=1.41.4, managed_by=tripleo_ansible, config_id=tripleo_step3) Dec 15 03:31:54 localhost systemd[1]: 2649c3aec9af2f04509e1d248b1050b202fccc3ed2b86421c89a2a65376db057.service: Deactivated successfully. 
Dec 15 03:31:54 localhost podman[85191]: 2025-12-15 08:31:54.089929775 +0000 UTC m=+0.398286491 container health_status ac45612fab357b615b4684dbfa5bd000dd29eb1adfbaedb6db0802fc49e89549 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, distribution-scope=public, build-date=2025-11-18T22:49:32Z, description=Red Hat OpenStack Platform 17.1 cron, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-cron-container, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, release=1761123044, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, konflux.additional-tags=17.1.12 17.1_20251118.1, name=rhosp17/openstack-cron, maintainer=OpenStack TripleO Team, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, summary=Red Hat OpenStack Platform 17.1 cron, vcs-type=git, io.k8s.description=Red Hat 
OpenStack Platform 17.1 cron, batch=17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, io.buildah.version=1.41.4, architecture=x86_64, container_name=logrotate_crond, version=17.1.12, url=https://www.redhat.com, config_id=tripleo_step4, io.openshift.expose-services=, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, tcib_managed=true, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, managed_by=tripleo_ansible) Dec 15 03:31:54 localhost podman[85191]: 2025-12-15 08:31:54.223314355 +0000 UTC m=+0.531671091 container exec_died ac45612fab357b615b4684dbfa5bd000dd29eb1adfbaedb6db0802fc49e89549 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, vendor=Red Hat, Inc., com.redhat.component=openstack-cron-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, config_id=tripleo_step4, managed_by=tripleo_ansible, konflux.additional-tags=17.1.12 17.1_20251118.1, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', 
'/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, architecture=x86_64, maintainer=OpenStack TripleO Team, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, container_name=logrotate_crond, build-date=2025-11-18T22:49:32Z, version=17.1.12, summary=Red Hat OpenStack Platform 17.1 cron, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.buildah.version=1.41.4, io.openshift.expose-services=, batch=17.1_20251118.1, name=rhosp17/openstack-cron, tcib_managed=true, release=1761123044, description=Red Hat OpenStack Platform 17.1 cron, url=https://www.redhat.com, vcs-type=git, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream) Dec 15 03:31:54 localhost systemd[1]: ac45612fab357b615b4684dbfa5bd000dd29eb1adfbaedb6db0802fc49e89549.service: Deactivated successfully. Dec 15 03:31:54 localhost systemd[1]: tmp-crun.eHQomE.mount: Deactivated successfully. Dec 15 03:31:56 localhost systemd[1]: Started /usr/bin/podman healthcheck run 4c3f871f4bdf287b1d3ce717956c1cf130617489217410eadf6fffcc440eb3b0. 
Dec 15 03:31:56 localhost podman[85318]: 2025-12-15 08:31:56.745823959 +0000 UTC m=+0.078485525 container health_status 4c3f871f4bdf287b1d3ce717956c1cf130617489217410eadf6fffcc440eb3b0 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, url=https://www.redhat.com, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, summary=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2025-11-19T00:36:58Z, architecture=x86_64, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-nova-compute-container, release=1761123044, tcib_managed=true, version=17.1.12, vendor=Red Hat, Inc., batch=17.1_20251118.1, config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, container_name=nova_migration_target, io.buildah.version=1.41.4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '879500e96bf8dfb93687004bd86f2317'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, distribution-scope=public, konflux.additional-tags=17.1.12 17.1_20251118.1, name=rhosp17/openstack-nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute) Dec 15 03:31:57 localhost podman[85318]: 2025-12-15 08:31:57.13737553 +0000 UTC m=+0.470037106 container exec_died 4c3f871f4bdf287b1d3ce717956c1cf130617489217410eadf6fffcc440eb3b0 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, tcib_managed=true, container_name=nova_migration_target, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, version=17.1.12, build-date=2025-11-19T00:36:58Z, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, release=1761123044, io.buildah.version=1.41.4, vendor=Red Hat, Inc., batch=17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 nova-compute, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '879500e96bf8dfb93687004bd86f2317'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 
'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-nova-compute, com.redhat.component=openstack-nova-compute-container, managed_by=tripleo_ansible, io.openshift.expose-services=, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, distribution-scope=public, konflux.additional-tags=17.1.12 17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 nova-compute, config_id=tripleo_step4, url=https://www.redhat.com, vcs-type=git) Dec 15 03:31:57 localhost systemd[1]: 4c3f871f4bdf287b1d3ce717956c1cf130617489217410eadf6fffcc440eb3b0.service: Deactivated successfully. Dec 15 03:31:59 localhost systemd[1]: Started /usr/bin/podman healthcheck run 2df8d20c6ba32f38bd0e6519c9c77a4146b632f21c81197d8c319b223aad81f1. Dec 15 03:31:59 localhost systemd[1]: Started /usr/bin/podman healthcheck run 4d35b7dc3f3a4e3c60661e85d3511f50804e87244d74cab67af5c6e8c447d379. 
Dec 15 03:31:59 localhost podman[85343]: 2025-12-15 08:31:59.758441115 +0000 UTC m=+0.087472345 container health_status 4d35b7dc3f3a4e3c60661e85d3511f50804e87244d74cab67af5c6e8c447d379 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.expose-services=, architecture=x86_64, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, build-date=2025-11-19T00:14:25Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, config_id=tripleo_step4, maintainer=OpenStack TripleO Team, vcs-type=git, managed_by=tripleo_ansible, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vendor=Red Hat, Inc., konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, version=17.1.12, release=1761123044, container_name=ovn_metadata_agent, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'a56a6f14b467cd9064e40c03defa5ed7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.buildah.version=1.41.4, batch=17.1_20251118.1, name=rhosp17/openstack-neutron-metadata-agent-ovn, distribution-scope=public, tcib_managed=true) Dec 15 03:31:59 localhost podman[85342]: 2025-12-15 08:31:59.802247103 +0000 UTC m=+0.134785488 container health_status 2df8d20c6ba32f38bd0e6519c9c77a4146b632f21c81197d8c319b223aad81f1 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, distribution-scope=public, io.openshift.expose-services=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', 
'/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, release=1761123044, name=rhosp17/openstack-ovn-controller, com.redhat.component=openstack-ovn-controller-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, konflux.additional-tags=17.1.12 17.1_20251118.1, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, vcs-type=git, description=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-11-18T23:34:05Z, tcib_managed=true, container_name=ovn_controller, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, version=17.1.12, io.buildah.version=1.41.4, vendor=Red Hat, Inc., managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 ovn-controller, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_id=tripleo_step4, batch=17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller) Dec 15 03:31:59 localhost podman[85343]: 2025-12-15 08:31:59.840590887 +0000 UTC m=+0.169622097 container exec_died 4d35b7dc3f3a4e3c60661e85d3511f50804e87244d74cab67af5c6e8c447d379 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'a56a6f14b467cd9064e40c03defa5ed7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, io.buildah.version=1.41.4, batch=17.1_20251118.1, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-neutron-metadata-agent-ovn, managed_by=tripleo_ansible, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 
17.1 neutron-metadata-agent-ovn, config_id=tripleo_step4, version=17.1.12, konflux.additional-tags=17.1.12 17.1_20251118.1, release=1761123044, vendor=Red Hat, Inc., container_name=ovn_metadata_agent, vcs-type=git, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, build-date=2025-11-19T00:14:25Z, tcib_managed=true, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c) Dec 15 03:31:59 localhost systemd[1]: 4d35b7dc3f3a4e3c60661e85d3511f50804e87244d74cab67af5c6e8c447d379.service: Deactivated successfully. Dec 15 03:31:59 localhost podman[85342]: 2025-12-15 08:31:59.854868128 +0000 UTC m=+0.187406513 container exec_died 2df8d20c6ba32f38bd0e6519c9c77a4146b632f21c81197d8c319b223aad81f1 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, summary=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20251118.1, build-date=2025-11-18T23:34:05Z, maintainer=OpenStack TripleO Team, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, config_id=tripleo_step4, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, architecture=x86_64, vcs-type=git, container_name=ovn_controller, name=rhosp17/openstack-ovn-controller, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, io.k8s.description=Red Hat OpenStack 
Platform 17.1 ovn-controller, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.expose-services=, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, tcib_managed=true, com.redhat.component=openstack-ovn-controller-container, url=https://www.redhat.com, vendor=Red Hat, Inc., release=1761123044, distribution-scope=public, managed_by=tripleo_ansible, version=17.1.12, description=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, io.buildah.version=1.41.4, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller) Dec 15 03:31:59 localhost systemd[1]: 2df8d20c6ba32f38bd0e6519c9c77a4146b632f21c81197d8c319b223aad81f1.service: Deactivated successfully. Dec 15 03:32:08 localhost systemd[1]: Started /usr/bin/podman healthcheck run 6278d648aec9c9f12e0efaafa0d6ae5653eeb34459516699e562eeb9c8d23b3c. Dec 15 03:32:08 localhost systemd[1]: tmp-crun.C6UrpI.mount: Deactivated successfully. 
Dec 15 03:32:08 localhost podman[85388]: 2025-12-15 08:32:08.771055435 +0000 UTC m=+0.099641921 container health_status 6278d648aec9c9f12e0efaafa0d6ae5653eeb34459516699e562eeb9c8d23b3c (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, build-date=2025-11-18T22:49:46Z, konflux.additional-tags=17.1.12 17.1_20251118.1, release=1761123044, managed_by=tripleo_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '29d07da103bbd44a9ed3e29999314b03'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, container_name=metrics_qdr, com.redhat.component=openstack-qdrouterd-container, summary=Red Hat OpenStack Platform 17.1 qdrouterd, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 qdrouterd, config_id=tripleo_step1, maintainer=OpenStack TripleO Team, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, tcib_managed=true, 
architecture=x86_64, vcs-type=git, name=rhosp17/openstack-qdrouterd, io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, distribution-scope=public, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.expose-services=, version=17.1.12, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, batch=17.1_20251118.1) Dec 15 03:32:08 localhost podman[85388]: 2025-12-15 08:32:08.982352824 +0000 UTC m=+0.310939300 container exec_died 6278d648aec9c9f12e0efaafa0d6ae5653eeb34459516699e562eeb9c8d23b3c (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, io.openshift.expose-services=, name=rhosp17/openstack-qdrouterd, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 qdrouterd, summary=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.component=openstack-qdrouterd-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.12 17.1_20251118.1, tcib_managed=true, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, url=https://www.redhat.com, config_id=tripleo_step1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '29d07da103bbd44a9ed3e29999314b03'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 
'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, release=1761123044, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, architecture=x86_64, build-date=2025-11-18T22:49:46Z, version=17.1.12, vcs-type=git, vendor=Red Hat, Inc., container_name=metrics_qdr, distribution-scope=public, batch=17.1_20251118.1, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.4, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a) Dec 15 03:32:08 localhost systemd[1]: 6278d648aec9c9f12e0efaafa0d6ae5653eeb34459516699e562eeb9c8d23b3c.service: Deactivated successfully. Dec 15 03:32:24 localhost systemd[1]: Started /usr/bin/podman healthcheck run 165a6fd1f2b629d16da0798ec1c9bac41d302d530a0ffa668b2315a52d6aa04e. Dec 15 03:32:24 localhost systemd[1]: Started /usr/bin/podman healthcheck run 2649c3aec9af2f04509e1d248b1050b202fccc3ed2b86421c89a2a65376db057. Dec 15 03:32:24 localhost systemd[1]: Started /usr/bin/podman healthcheck run 36a88083ed613f6871d2631ae462bca308a45ba583333f7cfe4d0b0c40a18ba5. Dec 15 03:32:24 localhost systemd[1]: Started /usr/bin/podman healthcheck run 97115ff7c5a84ae7e17f79e696e94da3c63f9fe0051287fe9193bec282a03f99. 
Dec 15 03:32:24 localhost systemd[1]: Started /usr/bin/podman healthcheck run ac45612fab357b615b4684dbfa5bd000dd29eb1adfbaedb6db0802fc49e89549. Dec 15 03:32:24 localhost systemd[1]: Started /usr/bin/podman healthcheck run d6041e5471624a495d5be42684f6049ccf71802a80d5760239330151d465d146. Dec 15 03:32:24 localhost podman[85543]: 2025-12-15 08:32:24.813655012 +0000 UTC m=+0.109640417 container health_status 97115ff7c5a84ae7e17f79e696e94da3c63f9fe0051287fe9193bec282a03f99 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, container_name=ceilometer_agent_ipmi, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vendor=Red Hat, Inc., version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'fcee5a4a91f85471fca7b61211375646'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, vcs-type=git, io.k8s.description=Red Hat OpenStack 
Platform 17.1 ceilometer-ipmi, url=https://www.redhat.com, io.buildah.version=1.41.4, batch=17.1_20251118.1, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1761123044, name=rhosp17/openstack-ceilometer-ipmi, tcib_managed=true, konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-ceilometer-ipmi-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public, build-date=2025-11-19T00:12:45Z, managed_by=tripleo_ansible, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, io.openshift.expose-services=, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, config_id=tripleo_step4) Dec 15 03:32:24 localhost systemd[1]: tmp-crun.cMBDn6.mount: Deactivated successfully. Dec 15 03:32:24 localhost podman[85540]: 2025-12-15 08:32:24.857142162 +0000 UTC m=+0.156793556 container health_status 165a6fd1f2b629d16da0798ec1c9bac41d302d530a0ffa668b2315a52d6aa04e (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, release=1761123044, summary=Red Hat OpenStack Platform 17.1 collectd, version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, tcib_managed=true, distribution-scope=public, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 collectd, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 
'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, managed_by=tripleo_ansible, vcs-type=git, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.buildah.version=1.41.4, com.redhat.component=openstack-collectd-container, config_id=tripleo_step3, container_name=collectd, url=https://www.redhat.com, vendor=Red Hat, Inc., batch=17.1_20251118.1, build-date=2025-11-18T22:51:28Z, architecture=x86_64, name=rhosp17/openstack-collectd) Dec 15 03:32:24 localhost podman[85540]: 2025-12-15 08:32:24.868226178 +0000 UTC m=+0.167877562 container exec_died 
165a6fd1f2b629d16da0798ec1c9bac41d302d530a0ffa668b2315a52d6aa04e (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, summary=Red Hat OpenStack Platform 17.1 collectd, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.component=openstack-collectd-container, url=https://www.redhat.com, vendor=Red Hat, Inc., release=1761123044, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, tcib_managed=true, vcs-type=git, build-date=2025-11-18T22:51:28Z, io.openshift.expose-services=, name=rhosp17/openstack-collectd, description=Red Hat OpenStack Platform 17.1 collectd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, config_id=tripleo_step3, container_name=collectd, distribution-scope=public, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, batch=17.1_20251118.1, version=17.1.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, architecture=x86_64, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, konflux.additional-tags=17.1.12 17.1_20251118.1, io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05) Dec 15 03:32:24 localhost systemd[1]: 165a6fd1f2b629d16da0798ec1c9bac41d302d530a0ffa668b2315a52d6aa04e.service: Deactivated successfully. Dec 15 03:32:24 localhost podman[85542]: 2025-12-15 08:32:24.906381856 +0000 UTC m=+0.203154293 container health_status 36a88083ed613f6871d2631ae462bca308a45ba583333f7cfe4d0b0c40a18ba5 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, batch=17.1_20251118.1, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, managed_by=tripleo_ansible, vcs-type=git, description=Red Hat OpenStack Platform 17.1 nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, release=1761123044, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, com.redhat.component=openstack-nova-compute-container, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, version=17.1.12, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '182e509007ab5e6e5b2500a552cbd5ba-879500e96bf8dfb93687004bd86f2317'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, architecture=x86_64, vendor=Red Hat, Inc., container_name=nova_compute, build-date=2025-11-19T00:36:58Z, konflux.additional-tags=17.1.12 
17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, tcib_managed=true, config_id=tripleo_step5, io.openshift.expose-services=, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp17/openstack-nova-compute) Dec 15 03:32:24 localhost podman[85543]: 2025-12-15 08:32:24.919539557 +0000 UTC m=+0.215524972 container exec_died 97115ff7c5a84ae7e17f79e696e94da3c63f9fe0051287fe9193bec282a03f99 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.buildah.version=1.41.4, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, release=1761123044, batch=17.1_20251118.1, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, vendor=Red Hat, Inc., config_id=tripleo_step4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 17.1_20251118.1, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, tcib_managed=true, com.redhat.component=openstack-ceilometer-ipmi-container, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'fcee5a4a91f85471fca7b61211375646'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, build-date=2025-11-19T00:12:45Z, name=rhosp17/openstack-ceilometer-ipmi, vcs-type=git, version=17.1.12, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, container_name=ceilometer_agent_ipmi, url=https://www.redhat.com) Dec 15 03:32:24 localhost systemd[1]: 97115ff7c5a84ae7e17f79e696e94da3c63f9fe0051287fe9193bec282a03f99.service: Deactivated successfully. 
Dec 15 03:32:24 localhost podman[85542]: 2025-12-15 08:32:24.967377305 +0000 UTC m=+0.264149742 container exec_died 36a88083ed613f6871d2631ae462bca308a45ba583333f7cfe4d0b0c40a18ba5 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, com.redhat.component=openstack-nova-compute-container, config_id=tripleo_step5, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 nova-compute, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp17/openstack-nova-compute, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, tcib_managed=true, io.buildah.version=1.41.4, io.openshift.expose-services=, vendor=Red Hat, Inc., release=1761123044, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '182e509007ab5e6e5b2500a552cbd5ba-879500e96bf8dfb93687004bd86f2317'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, konflux.additional-tags=17.1.12 17.1_20251118.1, batch=17.1_20251118.1, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, container_name=nova_compute, vcs-type=git, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, build-date=2025-11-19T00:36:58Z, architecture=x86_64, version=17.1.12) Dec 15 03:32:24 localhost systemd[1]: 36a88083ed613f6871d2631ae462bca308a45ba583333f7cfe4d0b0c40a18ba5.service: Deactivated successfully. 
Dec 15 03:32:25 localhost podman[85544]: 2025-12-15 08:32:24.971537795 +0000 UTC m=+0.260217146 container health_status ac45612fab357b615b4684dbfa5bd000dd29eb1adfbaedb6db0802fc49e89549 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, config_id=tripleo_step4, konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible, tcib_managed=true, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, distribution-scope=public, name=rhosp17/openstack-cron, vcs-type=git, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, build-date=2025-11-18T22:49:32Z, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, summary=Red Hat OpenStack Platform 17.1 cron, 
description=Red Hat OpenStack Platform 17.1 cron, com.redhat.component=openstack-cron-container, container_name=logrotate_crond, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vendor=Red Hat, Inc., release=1761123044, batch=17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, version=17.1.12, architecture=x86_64, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a) Dec 15 03:32:25 localhost podman[85541]: 2025-12-15 08:32:25.028258329 +0000 UTC m=+0.326048563 container health_status 2649c3aec9af2f04509e1d248b1050b202fccc3ed2b86421c89a2a65376db057 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, io.buildah.version=1.41.4, release=1761123044, name=rhosp17/openstack-iscsid, managed_by=tripleo_ansible, config_id=tripleo_step3, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, io.openshift.expose-services=, container_name=iscsid, batch=17.1_20251118.1, build-date=2025-11-18T23:44:13Z, konflux.additional-tags=17.1.12 17.1_20251118.1, architecture=x86_64, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, com.redhat.component=openstack-iscsid-container, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, summary=Red Hat OpenStack Platform 17.1 iscsid, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, url=https://www.redhat.com, vcs-type=git, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, version=17.1.12, 
config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '182e509007ab5e6e5b2500a552cbd5ba'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, distribution-scope=public) Dec 15 03:32:25 localhost podman[85541]: 2025-12-15 08:32:25.041332418 +0000 UTC m=+0.339122612 container exec_died 2649c3aec9af2f04509e1d248b1050b202fccc3ed2b86421c89a2a65376db057 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible, release=1761123044, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '182e509007ab5e6e5b2500a552cbd5ba'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, url=https://www.redhat.com, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, batch=17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, name=rhosp17/openstack-iscsid, container_name=iscsid, description=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.component=openstack-iscsid-container, version=17.1.12, vcs-type=git, tcib_managed=true, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, summary=Red Hat OpenStack Platform 17.1 iscsid, config_id=tripleo_step3, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, architecture=x86_64, build-date=2025-11-18T23:44:13Z, io.openshift.expose-services=, io.buildah.version=1.41.4, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, distribution-scope=public) Dec 15 03:32:25 localhost systemd[1]: 2649c3aec9af2f04509e1d248b1050b202fccc3ed2b86421c89a2a65376db057.service: Deactivated successfully. 
Dec 15 03:32:25 localhost podman[85544]: 2025-12-15 08:32:25.056342788 +0000 UTC m=+0.345022179 container exec_died ac45612fab357b615b4684dbfa5bd000dd29eb1adfbaedb6db0802fc49e89549 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, architecture=x86_64, managed_by=tripleo_ansible, name=rhosp17/openstack-cron, description=Red Hat OpenStack Platform 17.1 cron, konflux.additional-tags=17.1.12 17.1_20251118.1, release=1761123044, build-date=2025-11-18T22:49:32Z, distribution-scope=public, com.redhat.component=openstack-cron-container, config_id=tripleo_step4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, vcs-type=git, io.buildah.version=1.41.4, io.openshift.expose-services=, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.12, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, 
io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, container_name=logrotate_crond, tcib_managed=true, batch=17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vendor=Red Hat, Inc., url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 cron, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, maintainer=OpenStack TripleO Team) Dec 15 03:32:25 localhost systemd[1]: ac45612fab357b615b4684dbfa5bd000dd29eb1adfbaedb6db0802fc49e89549.service: Deactivated successfully. Dec 15 03:32:25 localhost podman[85557]: 2025-12-15 08:32:25.131382471 +0000 UTC m=+0.415237543 container health_status d6041e5471624a495d5be42684f6049ccf71802a80d5760239330151d465d146 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, io.buildah.version=1.41.4, konflux.additional-tags=17.1.12 17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-type=git, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'fcee5a4a91f85471fca7b61211375646'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, url=https://www.redhat.com, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, release=1761123044, com.redhat.component=openstack-ceilometer-compute-container, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, build-date=2025-11-19T00:11:48Z, batch=17.1_20251118.1, architecture=x86_64, container_name=ceilometer_agent_compute, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, tcib_managed=true, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vendor=Red Hat, Inc., managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.12, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-ceilometer-compute, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, config_id=tripleo_step4) Dec 15 03:32:25 localhost podman[85557]: 2025-12-15 08:32:25.189749569 +0000 UTC m=+0.473604611 container exec_died d6041e5471624a495d5be42684f6049ccf71802a80d5760239330151d465d146 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, architecture=x86_64, name=rhosp17/openstack-ceilometer-compute, vcs-type=git, release=1761123044, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 
ceilometer-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, build-date=2025-11-19T00:11:48Z, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_id=tripleo_step4, url=https://www.redhat.com, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, container_name=ceilometer_agent_compute, io.buildah.version=1.41.4, com.redhat.component=openstack-ceilometer-compute-container, managed_by=tripleo_ansible, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, tcib_managed=true, maintainer=OpenStack TripleO Team, version=17.1.12, konflux.additional-tags=17.1.12 17.1_20251118.1, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'fcee5a4a91f85471fca7b61211375646'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', 
'/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, batch=17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream) Dec 15 03:32:25 localhost systemd[1]: d6041e5471624a495d5be42684f6049ccf71802a80d5760239330151d465d146.service: Deactivated successfully. Dec 15 03:32:27 localhost systemd[1]: Started /usr/bin/podman healthcheck run 4c3f871f4bdf287b1d3ce717956c1cf130617489217410eadf6fffcc440eb3b0. Dec 15 03:32:27 localhost podman[85675]: 2025-12-15 08:32:27.757768479 +0000 UTC m=+0.086056038 container health_status 4c3f871f4bdf287b1d3ce717956c1cf130617489217410eadf6fffcc440eb3b0 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20251118.1, container_name=nova_migration_target, com.redhat.component=openstack-nova-compute-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, release=1761123044, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, distribution-scope=public, konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, url=https://www.redhat.com, build-date=2025-11-19T00:36:58Z, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '879500e96bf8dfb93687004bd86f2317'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, tcib_managed=true, config_id=tripleo_step4, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.buildah.version=1.41.4, version=17.1.12, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, name=rhosp17/openstack-nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.openshift.expose-services=) Dec 15 03:32:28 localhost podman[85675]: 2025-12-15 08:32:28.153261594 +0000 UTC m=+0.481549113 container exec_died 4c3f871f4bdf287b1d3ce717956c1cf130617489217410eadf6fffcc440eb3b0 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, tcib_managed=true, config_id=tripleo_step4, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, version=17.1.12, description=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, io.buildah.version=1.41.4, com.redhat.component=openstack-nova-compute-container, architecture=x86_64, managed_by=tripleo_ansible, container_name=nova_migration_target, 
vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, name=rhosp17/openstack-nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, url=https://www.redhat.com, batch=17.1_20251118.1, konflux.additional-tags=17.1.12 17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, release=1761123044, vendor=Red Hat, Inc., build-date=2025-11-19T00:36:58Z, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '879500e96bf8dfb93687004bd86f2317'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}) Dec 15 03:32:28 localhost systemd[1]: 4c3f871f4bdf287b1d3ce717956c1cf130617489217410eadf6fffcc440eb3b0.service: Deactivated successfully. 
Dec 15 03:32:30 localhost systemd[1]: Started /usr/bin/podman healthcheck run 2df8d20c6ba32f38bd0e6519c9c77a4146b632f21c81197d8c319b223aad81f1. Dec 15 03:32:30 localhost systemd[1]: Started /usr/bin/podman healthcheck run 4d35b7dc3f3a4e3c60661e85d3511f50804e87244d74cab67af5c6e8c447d379. Dec 15 03:32:30 localhost systemd[1]: tmp-crun.6XGd9b.mount: Deactivated successfully. Dec 15 03:32:30 localhost podman[85698]: 2025-12-15 08:32:30.765461662 +0000 UTC m=+0.097014839 container health_status 2df8d20c6ba32f38bd0e6519c9c77a4146b632f21c81197d8c319b223aad81f1 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, container_name=ovn_controller, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, description=Red Hat OpenStack Platform 17.1 ovn-controller, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 ovn-controller, managed_by=tripleo_ansible, name=rhosp17/openstack-ovn-controller, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, io.buildah.version=1.41.4, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, batch=17.1_20251118.1, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, build-date=2025-11-18T23:34:05Z, vcs-type=git, com.redhat.component=openstack-ovn-controller-container, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 
'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, maintainer=OpenStack TripleO Team, version=17.1.12, config_id=tripleo_step4, release=1761123044, tcib_managed=true, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.expose-services=) Dec 15 03:32:30 localhost podman[85699]: 2025-12-15 08:32:30.797663042 +0000 UTC m=+0.127416282 container health_status 4d35b7dc3f3a4e3c60661e85d3511f50804e87244d74cab67af5c6e8c447d379 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, version=17.1.12, release=1761123044, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vendor=Red Hat, Inc., container_name=ovn_metadata_agent, konflux.additional-tags=17.1.12 17.1_20251118.1, batch=17.1_20251118.1, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'a56a6f14b467cd9064e40c03defa5ed7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 
'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, managed_by=tripleo_ansible, build-date=2025-11-19T00:14:25Z, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_id=tripleo_step4, distribution-scope=public, tcib_managed=true, io.openshift.expose-services=, name=rhosp17/openstack-neutron-metadata-agent-ovn, maintainer=OpenStack TripleO Team, architecture=x86_64, com.redhat.component=openstack-neutron-metadata-agent-ovn-container) Dec 15 03:32:30 localhost podman[85698]: 2025-12-15 08:32:30.816482364 +0000 UTC m=+0.148035501 container exec_died 2df8d20c6ba32f38bd0e6519c9c77a4146b632f21c81197d8c319b223aad81f1 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, version=17.1.12, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, 
cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 17.1_20251118.1, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, vendor=Red Hat, Inc., config_id=tripleo_step4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, container_name=ovn_controller, maintainer=OpenStack TripleO Team, architecture=x86_64, io.buildah.version=1.41.4, com.redhat.component=openstack-ovn-controller-container, name=rhosp17/openstack-ovn-controller, tcib_managed=true, batch=17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 ovn-controller, release=1761123044, managed_by=tripleo_ansible, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 ovn-controller, url=https://www.redhat.com, build-date=2025-11-18T23:34:05Z, io.openshift.expose-services=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller) Dec 15 03:32:30 localhost systemd[1]: 2df8d20c6ba32f38bd0e6519c9c77a4146b632f21c81197d8c319b223aad81f1.service: Deactivated successfully. 
Dec 15 03:32:30 localhost podman[85699]: 2025-12-15 08:32:30.848392726 +0000 UTC m=+0.178145986 container exec_died 4d35b7dc3f3a4e3c60661e85d3511f50804e87244d74cab67af5c6e8c447d379 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.expose-services=, vcs-type=git, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, version=17.1.12, io.buildah.version=1.41.4, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_id=tripleo_step4, release=1761123044, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, build-date=2025-11-19T00:14:25Z, architecture=x86_64, name=rhosp17/openstack-neutron-metadata-agent-ovn, batch=17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, container_name=ovn_metadata_agent, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'a56a6f14b467cd9064e40c03defa5ed7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn) Dec 15 03:32:30 localhost systemd[1]: 4d35b7dc3f3a4e3c60661e85d3511f50804e87244d74cab67af5c6e8c447d379.service: Deactivated successfully. Dec 15 03:32:39 localhost systemd[1]: Started /usr/bin/podman healthcheck run 6278d648aec9c9f12e0efaafa0d6ae5653eeb34459516699e562eeb9c8d23b3c. 
Dec 15 03:32:39 localhost podman[85746]: 2025-12-15 08:32:39.751079541 +0000 UTC m=+0.085495284 container health_status 6278d648aec9c9f12e0efaafa0d6ae5653eeb34459516699e562eeb9c8d23b3c (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, batch=17.1_20251118.1, build-date=2025-11-18T22:49:46Z, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, version=17.1.12, tcib_managed=true, distribution-scope=public, vcs-type=git, maintainer=OpenStack TripleO Team, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, container_name=metrics_qdr, managed_by=tripleo_ansible, io.buildah.version=1.41.4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, name=rhosp17/openstack-qdrouterd, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.component=openstack-qdrouterd-container, url=https://www.redhat.com, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '29d07da103bbd44a9ed3e29999314b03'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, config_id=tripleo_step1, summary=Red Hat OpenStack Platform 17.1 qdrouterd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, description=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, konflux.additional-tags=17.1.12 17.1_20251118.1, release=1761123044, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=) Dec 15 03:32:39 localhost podman[85746]: 2025-12-15 08:32:39.930048938 +0000 UTC m=+0.264464610 container exec_died 6278d648aec9c9f12e0efaafa0d6ae5653eeb34459516699e562eeb9c8d23b3c (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, io.openshift.expose-services=, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-qdrouterd-container, maintainer=OpenStack TripleO Team, config_id=tripleo_step1, io.buildah.version=1.41.4, summary=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, name=rhosp17/openstack-qdrouterd, vcs-type=git, konflux.additional-tags=17.1.12 17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '29d07da103bbd44a9ed3e29999314b03'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', 
'/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, release=1761123044, managed_by=tripleo_ansible, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, distribution-scope=public, url=https://www.redhat.com, architecture=x86_64, version=17.1.12, batch=17.1_20251118.1, container_name=metrics_qdr, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, build-date=2025-11-18T22:49:46Z, description=Red Hat OpenStack Platform 17.1 qdrouterd, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05) Dec 15 03:32:39 localhost systemd[1]: 6278d648aec9c9f12e0efaafa0d6ae5653eeb34459516699e562eeb9c8d23b3c.service: Deactivated successfully. Dec 15 03:32:55 localhost systemd[1]: Started /usr/bin/podman healthcheck run 165a6fd1f2b629d16da0798ec1c9bac41d302d530a0ffa668b2315a52d6aa04e. Dec 15 03:32:55 localhost systemd[1]: Started /usr/bin/podman healthcheck run 2649c3aec9af2f04509e1d248b1050b202fccc3ed2b86421c89a2a65376db057. Dec 15 03:32:55 localhost systemd[1]: Started /usr/bin/podman healthcheck run 36a88083ed613f6871d2631ae462bca308a45ba583333f7cfe4d0b0c40a18ba5. Dec 15 03:32:55 localhost systemd[1]: Started /usr/bin/podman healthcheck run 97115ff7c5a84ae7e17f79e696e94da3c63f9fe0051287fe9193bec282a03f99. 
Dec 15 03:32:55 localhost systemd[1]: Started /usr/bin/podman healthcheck run ac45612fab357b615b4684dbfa5bd000dd29eb1adfbaedb6db0802fc49e89549. Dec 15 03:32:55 localhost systemd[1]: Started /usr/bin/podman healthcheck run d6041e5471624a495d5be42684f6049ccf71802a80d5760239330151d465d146. Dec 15 03:32:55 localhost podman[85778]: 2025-12-15 08:32:55.755279494 +0000 UTC m=+0.074226912 container health_status 97115ff7c5a84ae7e17f79e696e94da3c63f9fe0051287fe9193bec282a03f99 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.component=openstack-ceilometer-ipmi-container, architecture=x86_64, vendor=Red Hat, Inc., name=rhosp17/openstack-ceilometer-ipmi, release=1761123044, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, version=17.1.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, build-date=2025-11-19T00:12:45Z, url=https://www.redhat.com, io.buildah.version=1.41.4, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, distribution-scope=public, config_id=tripleo_step4, batch=17.1_20251118.1, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, konflux.additional-tags=17.1.12 17.1_20251118.1, container_name=ceilometer_agent_ipmi, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'fcee5a4a91f85471fca7b61211375646'}, 'healthcheck': {'test': 
'/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.openshift.expose-services=) Dec 15 03:32:55 localhost podman[85778]: 2025-12-15 08:32:55.809937193 +0000 UTC m=+0.128884621 container exec_died 97115ff7c5a84ae7e17f79e696e94da3c63f9fe0051287fe9193bec282a03f99 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, build-date=2025-11-19T00:12:45Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'fcee5a4a91f85471fca7b61211375646'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, vendor=Red Hat, Inc., io.buildah.version=1.41.4, tcib_managed=true, url=https://www.redhat.com, release=1761123044, container_name=ceilometer_agent_ipmi, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, config_id=tripleo_step4, com.redhat.component=openstack-ceilometer-ipmi-container, distribution-scope=public, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, version=17.1.12, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.expose-services=, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, batch=17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-type=git, name=rhosp17/openstack-ceilometer-ipmi) Dec 15 03:32:55 localhost systemd[1]: 97115ff7c5a84ae7e17f79e696e94da3c63f9fe0051287fe9193bec282a03f99.service: Deactivated successfully. 
Dec 15 03:32:55 localhost podman[85777]: 2025-12-15 08:32:55.816038335 +0000 UTC m=+0.135775464 container health_status 36a88083ed613f6871d2631ae462bca308a45ba583333f7cfe4d0b0c40a18ba5 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, release=1761123044, version=17.1.12, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, com.redhat.component=openstack-nova-compute-container, config_id=tripleo_step5, name=rhosp17/openstack-nova-compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.openshift.expose-services=, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vendor=Red Hat, Inc., vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, description=Red Hat OpenStack Platform 17.1 nova-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '182e509007ab5e6e5b2500a552cbd5ba-879500e96bf8dfb93687004bd86f2317'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', 
'/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, tcib_managed=true, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, summary=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, container_name=nova_compute, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, io.buildah.version=1.41.4, batch=17.1_20251118.1, build-date=2025-11-19T00:36:58Z) Dec 15 03:32:55 localhost podman[85775]: 2025-12-15 08:32:55.886318552 +0000 UTC m=+0.210808128 container health_status 165a6fd1f2b629d16da0798ec1c9bac41d302d530a0ffa668b2315a52d6aa04e (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, url=https://www.redhat.com, architecture=x86_64, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, summary=Red Hat OpenStack Platform 17.1 
collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-collectd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, container_name=collectd, build-date=2025-11-18T22:51:28Z, io.buildah.version=1.41.4, version=17.1.12, config_id=tripleo_step3, release=1761123044, batch=17.1_20251118.1, managed_by=tripleo_ansible, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.component=openstack-collectd-container, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', 
'/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, distribution-scope=public) Dec 15 03:32:55 localhost podman[85801]: 2025-12-15 08:32:55.934050006 +0000 UTC m=+0.237840769 container health_status d6041e5471624a495d5be42684f6049ccf71802a80d5760239330151d465d146 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, batch=17.1_20251118.1, com.redhat.component=openstack-ceilometer-compute-container, io.buildah.version=1.41.4, container_name=ceilometer_agent_compute, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, managed_by=tripleo_ansible, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'fcee5a4a91f85471fca7b61211375646'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.k8s.display-name=Red Hat OpenStack Platform 
17.1 ceilometer-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, distribution-scope=public, vcs-type=git, io.openshift.expose-services=, release=1761123044, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., architecture=x86_64, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, name=rhosp17/openstack-ceilometer-compute, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, config_id=tripleo_step4, build-date=2025-11-19T00:11:48Z, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream) Dec 15 03:32:55 localhost podman[85775]: 2025-12-15 08:32:55.947618557 +0000 UTC m=+0.272108153 container exec_died 165a6fd1f2b629d16da0798ec1c9bac41d302d530a0ffa668b2315a52d6aa04e (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 collectd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, tcib_managed=true, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, release=1761123044, version=17.1.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, distribution-scope=public, name=rhosp17/openstack-collectd, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 collectd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, container_name=collectd, url=https://www.redhat.com, vendor=Red Hat, Inc., config_id=tripleo_step3, build-date=2025-11-18T22:51:28Z, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, batch=17.1_20251118.1, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, io.openshift.expose-services=, com.redhat.component=openstack-collectd-container) Dec 15 03:32:55 localhost podman[85789]: 2025-12-15 08:32:55.979616022 +0000 UTC m=+0.291887242 container health_status ac45612fab357b615b4684dbfa5bd000dd29eb1adfbaedb6db0802fc49e89549 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, summary=Red Hat OpenStack 
Platform 17.1 cron, container_name=logrotate_crond, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, name=rhosp17/openstack-cron, build-date=2025-11-18T22:49:32Z, release=1761123044, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, managed_by=tripleo_ansible, io.openshift.expose-services=, version=17.1.12, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vendor=Red Hat, Inc., url=https://www.redhat.com, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, description=Red Hat OpenStack Platform 17.1 cron, com.redhat.component=openstack-cron-container, distribution-scope=public, maintainer=OpenStack TripleO Team, 
io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, config_id=tripleo_step4, io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, batch=17.1_20251118.1, architecture=x86_64) Dec 15 03:32:55 localhost podman[85789]: 2025-12-15 08:32:55.991386855 +0000 UTC m=+0.303658075 container exec_died ac45612fab357b615b4684dbfa5bd000dd29eb1adfbaedb6db0802fc49e89549 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, tcib_managed=true, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.component=openstack-cron-container, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, vcs-type=git, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, container_name=logrotate_crond, release=1761123044, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 cron, managed_by=tripleo_ansible, vendor=Red Hat, Inc., build-date=2025-11-18T22:49:32Z, io.buildah.version=1.41.4, konflux.additional-tags=17.1.12 17.1_20251118.1, version=17.1.12, summary=Red Hat OpenStack Platform 17.1 cron, batch=17.1_20251118.1, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, url=https://www.redhat.com, name=rhosp17/openstack-cron, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': 
['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, config_id=tripleo_step4, distribution-scope=public) Dec 15 03:32:56 localhost systemd[1]: ac45612fab357b615b4684dbfa5bd000dd29eb1adfbaedb6db0802fc49e89549.service: Deactivated successfully. Dec 15 03:32:56 localhost systemd[1]: 165a6fd1f2b629d16da0798ec1c9bac41d302d530a0ffa668b2315a52d6aa04e.service: Deactivated successfully. 
Dec 15 03:32:56 localhost podman[85801]: 2025-12-15 08:32:56.04775563 +0000 UTC m=+0.351546393 container exec_died d6041e5471624a495d5be42684f6049ccf71802a80d5760239330151d465d146 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, com.redhat.component=openstack-ceilometer-compute-container, maintainer=OpenStack TripleO Team, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, url=https://www.redhat.com, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'fcee5a4a91f85471fca7b61211375646'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, batch=17.1_20251118.1, release=1761123044, io.buildah.version=1.41.4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 
ceilometer-compute, container_name=ceilometer_agent_compute, distribution-scope=public, config_id=tripleo_step4, managed_by=tripleo_ansible, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, version=17.1.12, io.openshift.expose-services=, build-date=2025-11-19T00:11:48Z, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, name=rhosp17/openstack-ceilometer-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vendor=Red Hat, Inc., architecture=x86_64) Dec 15 03:32:56 localhost systemd[1]: d6041e5471624a495d5be42684f6049ccf71802a80d5760239330151d465d146.service: Deactivated successfully. Dec 15 03:32:56 localhost podman[85776]: 2025-12-15 08:32:56.140746112 +0000 UTC m=+0.464271112 container health_status 2649c3aec9af2f04509e1d248b1050b202fccc3ed2b86421c89a2a65376db057 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, container_name=iscsid, io.buildah.version=1.41.4, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 iscsid, batch=17.1_20251118.1, release=1761123044, architecture=x86_64, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 iscsid, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, version=17.1.12, vendor=Red Hat, Inc., konflux.additional-tags=17.1.12 17.1_20251118.1, config_id=tripleo_step3, com.redhat.component=openstack-iscsid-container, 
config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '182e509007ab5e6e5b2500a552cbd5ba'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, name=rhosp17/openstack-iscsid, managed_by=tripleo_ansible, build-date=2025-11-18T23:44:13Z, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid) Dec 15 03:32:56 localhost podman[85777]: 2025-12-15 08:32:56.152518987 +0000 UTC m=+0.472256156 container exec_died 36a88083ed613f6871d2631ae462bca308a45ba583333f7cfe4d0b0c40a18ba5 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, tcib_managed=true, vcs-type=git, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '182e509007ab5e6e5b2500a552cbd5ba-879500e96bf8dfb93687004bd86f2317'}, 
'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, konflux.additional-tags=17.1.12 17.1_20251118.1, name=rhosp17/openstack-nova-compute, batch=17.1_20251118.1, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, release=1761123044, build-date=2025-11-19T00:36:58Z, com.redhat.component=openstack-nova-compute-container, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, architecture=x86_64, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, io.openshift.expose-services=, io.buildah.version=1.41.4, description=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, summary=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., config_id=tripleo_step5, container_name=nova_compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute) Dec 15 03:32:56 localhost systemd[1]: 36a88083ed613f6871d2631ae462bca308a45ba583333f7cfe4d0b0c40a18ba5.service: Deactivated successfully. Dec 15 03:32:56 localhost podman[85776]: 2025-12-15 08:32:56.204258477 +0000 UTC m=+0.527783527 container exec_died 2649c3aec9af2f04509e1d248b1050b202fccc3ed2b86421c89a2a65376db057 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, container_name=iscsid, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, maintainer=OpenStack TripleO Team, tcib_managed=true, architecture=x86_64, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '182e509007ab5e6e5b2500a552cbd5ba'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, summary=Red Hat OpenStack Platform 17.1 iscsid, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-11-18T23:44:13Z, name=rhosp17/openstack-iscsid, config_id=tripleo_step3, io.buildah.version=1.41.4, url=https://www.redhat.com, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 iscsid, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, vendor=Red Hat, Inc., com.redhat.component=openstack-iscsid-container, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, konflux.additional-tags=17.1.12 17.1_20251118.1, version=17.1.12, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, batch=17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, release=1761123044, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, managed_by=tripleo_ansible) Dec 15 03:32:56 localhost systemd[1]: 2649c3aec9af2f04509e1d248b1050b202fccc3ed2b86421c89a2a65376db057.service: Deactivated successfully. Dec 15 03:32:56 localhost systemd[1]: tmp-crun.BPtadG.mount: Deactivated successfully. Dec 15 03:32:58 localhost systemd[1]: Started /usr/bin/podman healthcheck run 4c3f871f4bdf287b1d3ce717956c1cf130617489217410eadf6fffcc440eb3b0. Dec 15 03:32:58 localhost systemd[1]: Starting Check and recover tripleo_nova_virtqemud... 
Dec 15 03:32:58 localhost recover_tripleo_nova_virtqemud[85921]: 61849 Dec 15 03:32:58 localhost systemd[1]: tripleo_nova_virtqemud_recover.service: Deactivated successfully. Dec 15 03:32:58 localhost systemd[1]: Finished Check and recover tripleo_nova_virtqemud. Dec 15 03:32:58 localhost systemd[1]: tmp-crun.ehX80K.mount: Deactivated successfully. Dec 15 03:32:58 localhost podman[85914]: 2025-12-15 08:32:58.75857666 +0000 UTC m=+0.089480639 container health_status 4c3f871f4bdf287b1d3ce717956c1cf130617489217410eadf6fffcc440eb3b0 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, name=rhosp17/openstack-nova-compute, url=https://www.redhat.com, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 17.1_20251118.1, release=1761123044, io.openshift.expose-services=, vendor=Red Hat, Inc., config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '879500e96bf8dfb93687004bd86f2317'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, build-date=2025-11-19T00:36:58Z, 
container_name=nova_migration_target, batch=17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_id=tripleo_step4, summary=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, architecture=x86_64, com.redhat.component=openstack-nova-compute-container, version=17.1.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, managed_by=tripleo_ansible, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 nova-compute) Dec 15 03:32:59 localhost podman[85914]: 2025-12-15 08:32:59.125355099 +0000 UTC m=+0.456259088 container exec_died 4c3f871f4bdf287b1d3ce717956c1cf130617489217410eadf6fffcc440eb3b0 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, distribution-scope=public, io.buildah.version=1.41.4, container_name=nova_migration_target, io.openshift.expose-services=, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, batch=17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.12, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., release=1761123044, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, tcib_managed=true, 
build-date=2025-11-19T00:36:58Z, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '879500e96bf8dfb93687004bd86f2317'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, com.redhat.component=openstack-nova-compute-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, description=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp17/openstack-nova-compute, managed_by=tripleo_ansible, url=https://www.redhat.com, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, summary=Red Hat OpenStack Platform 17.1 nova-compute, config_id=tripleo_step4, vcs-type=git) Dec 15 03:32:59 localhost systemd[1]: 4c3f871f4bdf287b1d3ce717956c1cf130617489217410eadf6fffcc440eb3b0.service: Deactivated successfully. Dec 15 03:33:01 localhost systemd[1]: Started /usr/bin/podman healthcheck run 2df8d20c6ba32f38bd0e6519c9c77a4146b632f21c81197d8c319b223aad81f1. 
Dec 15 03:33:01 localhost systemd[1]: Started /usr/bin/podman healthcheck run 4d35b7dc3f3a4e3c60661e85d3511f50804e87244d74cab67af5c6e8c447d379. Dec 15 03:33:01 localhost systemd[1]: tmp-crun.uZDwSj.mount: Deactivated successfully. Dec 15 03:33:01 localhost systemd[1]: tmp-crun.SND3cm.mount: Deactivated successfully. Dec 15 03:33:01 localhost podman[85941]: 2025-12-15 08:33:01.807636636 +0000 UTC m=+0.134019268 container health_status 2df8d20c6ba32f38bd0e6519c9c77a4146b632f21c81197d8c319b223aad81f1 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, release=1761123044, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, container_name=ovn_controller, summary=Red Hat OpenStack Platform 17.1 ovn-controller, io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, batch=17.1_20251118.1, managed_by=tripleo_ansible, name=rhosp17/openstack-ovn-controller, com.redhat.component=openstack-ovn-controller-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, vendor=Red Hat, Inc., 
vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, maintainer=OpenStack TripleO Team, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 ovn-controller, url=https://www.redhat.com, version=17.1.12, build-date=2025-11-18T23:34:05Z, konflux.additional-tags=17.1.12 17.1_20251118.1, tcib_managed=true, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, config_id=tripleo_step4) Dec 15 03:33:01 localhost podman[85942]: 2025-12-15 08:33:01.774936104 +0000 UTC m=+0.101080268 container health_status 4d35b7dc3f3a4e3c60661e85d3511f50804e87244d74cab67af5c6e8c447d379 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, architecture=x86_64, konflux.additional-tags=17.1.12 17.1_20251118.1, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_id=tripleo_step4, build-date=2025-11-19T00:14:25Z, vendor=Red Hat, Inc., batch=17.1_20251118.1, tcib_managed=true, version=17.1.12, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, distribution-scope=public, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, name=rhosp17/openstack-neutron-metadata-agent-ovn, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, release=1761123044, 
config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'a56a6f14b467cd9064e40c03defa5ed7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, container_name=ovn_metadata_agent, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c) Dec 15 03:33:01 localhost podman[85941]: 2025-12-15 08:33:01.827282501 +0000 UTC m=+0.153665073 container exec_died 2df8d20c6ba32f38bd0e6519c9c77a4146b632f21c81197d8c319b223aad81f1 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, vendor=Red Hat, Inc., 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_id=tripleo_step4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, com.redhat.component=openstack-ovn-controller-container, url=https://www.redhat.com, release=1761123044, tcib_managed=true, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, io.buildah.version=1.41.4, managed_by=tripleo_ansible, build-date=2025-11-18T23:34:05Z, summary=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, name=rhosp17/openstack-ovn-controller, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.12, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 ovn-controller, architecture=x86_64, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, container_name=ovn_controller, batch=17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public, vcs-type=git) Dec 15 03:33:01 localhost podman[85942]: 2025-12-15 08:33:01.860272761 +0000 UTC m=+0.186416925 container exec_died 
4d35b7dc3f3a4e3c60661e85d3511f50804e87244d74cab67af5c6e8c447d379 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, url=https://www.redhat.com, io.openshift.expose-services=, version=17.1.12, konflux.additional-tags=17.1.12 17.1_20251118.1, container_name=ovn_metadata_agent, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, tcib_managed=true, release=1761123044, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, architecture=x86_64, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step4, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, name=rhosp17/openstack-neutron-metadata-agent-ovn, build-date=2025-11-19T00:14:25Z, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.4, batch=17.1_20251118.1, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'a56a6f14b467cd9064e40c03defa5ed7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, vendor=Red Hat, Inc.) Dec 15 03:33:01 localhost systemd[1]: 4d35b7dc3f3a4e3c60661e85d3511f50804e87244d74cab67af5c6e8c447d379.service: Deactivated successfully. Dec 15 03:33:01 localhost systemd[1]: 2df8d20c6ba32f38bd0e6519c9c77a4146b632f21c81197d8c319b223aad81f1.service: Deactivated successfully. Dec 15 03:33:10 localhost systemd[1]: Started /usr/bin/podman healthcheck run 6278d648aec9c9f12e0efaafa0d6ae5653eeb34459516699e562eeb9c8d23b3c. 
Dec 15 03:33:10 localhost podman[85990]: 2025-12-15 08:33:10.745654766 +0000 UTC m=+0.080327305 container health_status 6278d648aec9c9f12e0efaafa0d6ae5653eeb34459516699e562eeb9c8d23b3c (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, summary=Red Hat OpenStack Platform 17.1 qdrouterd, version=17.1.12, container_name=metrics_qdr, url=https://www.redhat.com, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_id=tripleo_step1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, batch=17.1_20251118.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '29d07da103bbd44a9ed3e29999314b03'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, 
release=1761123044, maintainer=OpenStack TripleO Team, vcs-type=git, io.openshift.expose-services=, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, distribution-scope=public, com.redhat.component=openstack-qdrouterd-container, description=Red Hat OpenStack Platform 17.1 qdrouterd, architecture=x86_64, build-date=2025-11-18T22:49:46Z, name=rhosp17/openstack-qdrouterd, tcib_managed=true, konflux.additional-tags=17.1.12 17.1_20251118.1, io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd) Dec 15 03:33:10 localhost podman[85990]: 2025-12-15 08:33:10.931473455 +0000 UTC m=+0.266146004 container exec_died 6278d648aec9c9f12e0efaafa0d6ae5653eeb34459516699e562eeb9c8d23b3c (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, batch=17.1_20251118.1, io.buildah.version=1.41.4, release=1761123044, konflux.additional-tags=17.1.12 17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 qdrouterd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, build-date=2025-11-18T22:49:46Z, distribution-scope=public, architecture=x86_64, container_name=metrics_qdr, name=rhosp17/openstack-qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '29d07da103bbd44a9ed3e29999314b03'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', 
'/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, maintainer=OpenStack TripleO Team, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-qdrouterd-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, url=https://www.redhat.com, version=17.1.12, vendor=Red Hat, Inc., managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_id=tripleo_step1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a) Dec 15 03:33:10 localhost systemd[1]: 6278d648aec9c9f12e0efaafa0d6ae5653eeb34459516699e562eeb9c8d23b3c.service: Deactivated successfully. Dec 15 03:33:12 localhost sshd[86019]: main: sshd: ssh-rsa algorithm is disabled Dec 15 03:33:26 localhost systemd[1]: Started /usr/bin/podman healthcheck run 165a6fd1f2b629d16da0798ec1c9bac41d302d530a0ffa668b2315a52d6aa04e. Dec 15 03:33:26 localhost systemd[1]: Started /usr/bin/podman healthcheck run 2649c3aec9af2f04509e1d248b1050b202fccc3ed2b86421c89a2a65376db057. Dec 15 03:33:26 localhost systemd[1]: Started /usr/bin/podman healthcheck run 36a88083ed613f6871d2631ae462bca308a45ba583333f7cfe4d0b0c40a18ba5. Dec 15 03:33:26 localhost systemd[1]: Started /usr/bin/podman healthcheck run 97115ff7c5a84ae7e17f79e696e94da3c63f9fe0051287fe9193bec282a03f99. 
Dec 15 03:33:26 localhost systemd[1]: Started /usr/bin/podman healthcheck run ac45612fab357b615b4684dbfa5bd000dd29eb1adfbaedb6db0802fc49e89549. Dec 15 03:33:26 localhost systemd[1]: Started /usr/bin/podman healthcheck run d6041e5471624a495d5be42684f6049ccf71802a80d5760239330151d465d146. Dec 15 03:33:26 localhost systemd[1]: tmp-crun.7CDw5C.mount: Deactivated successfully. Dec 15 03:33:26 localhost podman[86166]: 2025-12-15 08:33:26.815823607 +0000 UTC m=+0.116862520 container health_status d6041e5471624a495d5be42684f6049ccf71802a80d5760239330151d465d146 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, release=1761123044, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'fcee5a4a91f85471fca7b61211375646'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, tcib_managed=true, container_name=ceilometer_agent_compute, io.buildah.version=1.41.4, 
com.redhat.component=openstack-ceilometer-compute-container, konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-type=git, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, batch=17.1_20251118.1, build-date=2025-11-19T00:11:48Z, name=rhosp17/openstack-ceilometer-compute, version=17.1.12, vendor=Red Hat, Inc., distribution-scope=public, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676) Dec 15 03:33:26 localhost podman[86144]: 2025-12-15 08:33:26.833290794 +0000 UTC m=+0.155167933 container health_status 165a6fd1f2b629d16da0798ec1c9bac41d302d530a0ffa668b2315a52d6aa04e (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.expose-services=, com.redhat.component=openstack-collectd-container, konflux.additional-tags=17.1.12 17.1_20251118.1, batch=17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 collectd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public, version=17.1.12, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, io.buildah.version=1.41.4, build-date=2025-11-18T22:51:28Z, container_name=collectd, 
io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 collectd, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, release=1761123044, tcib_managed=true, config_id=tripleo_step3, vendor=Red Hat, Inc., vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp17/openstack-collectd, vcs-type=git, architecture=x86_64) Dec 15 03:33:26 localhost podman[86146]: 2025-12-15 
08:33:26.796211433 +0000 UTC m=+0.115926814 container health_status 36a88083ed613f6871d2631ae462bca308a45ba583333f7cfe4d0b0c40a18ba5 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, name=rhosp17/openstack-nova-compute, release=1761123044, vendor=Red Hat, Inc., container_name=nova_compute, url=https://www.redhat.com, build-date=2025-11-19T00:36:58Z, version=17.1.12, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '182e509007ab5e6e5b2500a552cbd5ba-879500e96bf8dfb93687004bd86f2317'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', 
'/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, com.redhat.component=openstack-nova-compute-container, io.buildah.version=1.41.4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, batch=17.1_20251118.1, config_id=tripleo_step5, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05) Dec 15 03:33:26 localhost podman[86144]: 2025-12-15 08:33:26.865174004 +0000 UTC m=+0.187051143 container exec_died 165a6fd1f2b629d16da0798ec1c9bac41d302d530a0ffa668b2315a52d6aa04e (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, vcs-type=git, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, maintainer=OpenStack TripleO Team, tcib_managed=true, config_id=tripleo_step3, version=17.1.12, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, managed_by=tripleo_ansible, konflux.additional-tags=17.1.12 17.1_20251118.1, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vendor=Red Hat, Inc., 
release=1761123044, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, architecture=x86_64, com.redhat.component=openstack-collectd-container, container_name=collectd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, url=https://www.redhat.com, name=rhosp17/openstack-collectd, batch=17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 collectd, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, description=Red Hat OpenStack Platform 17.1 collectd, build-date=2025-11-18T22:51:28Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, io.openshift.expose-services=) Dec 
15 03:33:26 localhost systemd[1]: 165a6fd1f2b629d16da0798ec1c9bac41d302d530a0ffa668b2315a52d6aa04e.service: Deactivated successfully. Dec 15 03:33:26 localhost podman[86145]: 2025-12-15 08:33:26.878225773 +0000 UTC m=+0.200870253 container health_status 2649c3aec9af2f04509e1d248b1050b202fccc3ed2b86421c89a2a65376db057 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, vcs-type=git, name=rhosp17/openstack-iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, release=1761123044, version=17.1.12, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20251118.1, com.redhat.component=openstack-iscsid-container, konflux.additional-tags=17.1.12 17.1_20251118.1, distribution-scope=public, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, managed_by=tripleo_ansible, build-date=2025-11-18T23:44:13Z, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, architecture=x86_64, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '182e509007ab5e6e5b2500a552cbd5ba'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', 
'/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, vendor=Red Hat, Inc., url=https://www.redhat.com, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 iscsid, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, container_name=iscsid, io.buildah.version=1.41.4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_id=tripleo_step3, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid) Dec 15 03:33:26 localhost podman[86145]: 2025-12-15 08:33:26.913655258 +0000 UTC m=+0.236299748 container exec_died 2649c3aec9af2f04509e1d248b1050b202fccc3ed2b86421c89a2a65376db057 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 iscsid, io.buildah.version=1.41.4, url=https://www.redhat.com, vcs-type=git, container_name=iscsid, com.redhat.component=openstack-iscsid-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1761123044, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, architecture=x86_64, build-date=2025-11-18T23:44:13Z, batch=17.1_20251118.1, managed_by=tripleo_ansible, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, config_id=tripleo_step3, vendor=Red Hat, Inc., konflux.additional-tags=17.1.12 17.1_20251118.1, version=17.1.12, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 
'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '182e509007ab5e6e5b2500a552cbd5ba'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, name=rhosp17/openstack-iscsid, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, tcib_managed=true) Dec 15 03:33:26 localhost systemd[1]: 2649c3aec9af2f04509e1d248b1050b202fccc3ed2b86421c89a2a65376db057.service: Deactivated successfully. 
Dec 15 03:33:26 localhost podman[86159]: 2025-12-15 08:33:26.928098053 +0000 UTC m=+0.237674033 container health_status ac45612fab357b615b4684dbfa5bd000dd29eb1adfbaedb6db0802fc49e89549 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, description=Red Hat OpenStack Platform 17.1 cron, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, com.redhat.component=openstack-cron-container, distribution-scope=public, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, io.openshift.expose-services=, container_name=logrotate_crond, vendor=Red Hat, Inc., vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vcs-type=git, batch=17.1_20251118.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 cron, 
org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, release=1761123044, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, tcib_managed=true, build-date=2025-11-18T22:49:32Z, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.12 17.1_20251118.1, io.buildah.version=1.41.4, name=rhosp17/openstack-cron, architecture=x86_64, version=17.1.12, config_id=tripleo_step4, url=https://www.redhat.com) Dec 15 03:33:26 localhost podman[86159]: 2025-12-15 08:33:26.938279875 +0000 UTC m=+0.247855835 container exec_died ac45612fab357b615b4684dbfa5bd000dd29eb1adfbaedb6db0802fc49e89549 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, release=1761123044, description=Red Hat OpenStack Platform 17.1 cron, config_id=tripleo_step4, architecture=x86_64, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, com.redhat.component=openstack-cron-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, container_name=logrotate_crond, distribution-scope=public, konflux.additional-tags=17.1.12 17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 cron, build-date=2025-11-18T22:49:32Z, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vendor=Red Hat, Inc., batch=17.1_20251118.1, io.buildah.version=1.41.4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, tcib_managed=true, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, maintainer=OpenStack TripleO Team, version=17.1.12, name=rhosp17/openstack-cron, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a) Dec 15 03:33:26 localhost systemd[1]: ac45612fab357b615b4684dbfa5bd000dd29eb1adfbaedb6db0802fc49e89549.service: Deactivated successfully. 
Dec 15 03:33:26 localhost podman[86147]: 2025-12-15 08:33:26.85639149 +0000 UTC m=+0.154261348 container health_status 97115ff7c5a84ae7e17f79e696e94da3c63f9fe0051287fe9193bec282a03f99 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, distribution-scope=public, io.openshift.expose-services=, container_name=ceilometer_agent_ipmi, version=17.1.12, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.12 17.1_20251118.1, config_id=tripleo_step4, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, build-date=2025-11-19T00:12:45Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, managed_by=tripleo_ansible, tcib_managed=true, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, vendor=Red Hat, Inc., url=https://www.redhat.com, release=1761123044, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.component=openstack-ceilometer-ipmi-container, io.buildah.version=1.41.4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, name=rhosp17/openstack-ceilometer-ipmi, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20251118.1, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'fcee5a4a91f85471fca7b61211375646'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}) Dec 15 03:33:26 localhost podman[86146]: 2025-12-15 08:33:26.980951694 +0000 UTC m=+0.300667005 container exec_died 36a88083ed613f6871d2631ae462bca308a45ba583333f7cfe4d0b0c40a18ba5 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, io.buildah.version=1.41.4, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, tcib_managed=true, distribution-scope=public, com.redhat.component=openstack-nova-compute-container, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '182e509007ab5e6e5b2500a552cbd5ba-879500e96bf8dfb93687004bd86f2317'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, build-date=2025-11-19T00:36:58Z, container_name=nova_compute, url=https://www.redhat.com, release=1761123044, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, batch=17.1_20251118.1, name=rhosp17/openstack-nova-compute, config_id=tripleo_step5, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.12) Dec 15 03:33:26 localhost podman[86147]: 2025-12-15 08:33:26.989367758 +0000 UTC m=+0.287237626 container exec_died 
97115ff7c5a84ae7e17f79e696e94da3c63f9fe0051287fe9193bec282a03f99 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, com.redhat.component=openstack-ceilometer-ipmi-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, name=rhosp17/openstack-ceilometer-ipmi, io.openshift.expose-services=, managed_by=tripleo_ansible, container_name=ceilometer_agent_ipmi, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-type=git, build-date=2025-11-19T00:12:45Z, maintainer=OpenStack TripleO Team, config_id=tripleo_step4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, architecture=x86_64, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'fcee5a4a91f85471fca7b61211375646'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, tcib_managed=true, url=https://www.redhat.com, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, batch=17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, description=Red Hat OpenStack 
Platform 17.1 ceilometer-ipmi, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, release=1761123044, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, version=17.1.12) Dec 15 03:33:26 localhost systemd[1]: 36a88083ed613f6871d2631ae462bca308a45ba583333f7cfe4d0b0c40a18ba5.service: Deactivated successfully. Dec 15 03:33:27 localhost podman[86166]: 2025-12-15 08:33:27.001463902 +0000 UTC m=+0.302502795 container exec_died d6041e5471624a495d5be42684f6049ccf71802a80d5760239330151d465d146 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, architecture=x86_64, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.buildah.version=1.41.4, container_name=ceilometer_agent_compute, name=rhosp17/openstack-ceilometer-compute, maintainer=OpenStack TripleO Team, config_id=tripleo_step4, release=1761123044, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, distribution-scope=public, build-date=2025-11-19T00:11:48Z, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, version=17.1.12, io.openshift.expose-services=, vendor=Red Hat, Inc., tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, batch=17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, summary=Red Hat OpenStack 
Platform 17.1 ceilometer-compute, url=https://www.redhat.com, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'fcee5a4a91f85471fca7b61211375646'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-type=git, com.redhat.component=openstack-ceilometer-compute-container) Dec 15 03:33:27 localhost systemd[1]: 97115ff7c5a84ae7e17f79e696e94da3c63f9fe0051287fe9193bec282a03f99.service: Deactivated successfully. Dec 15 03:33:27 localhost systemd[1]: d6041e5471624a495d5be42684f6049ccf71802a80d5760239330151d465d146.service: Deactivated successfully. Dec 15 03:33:29 localhost systemd[1]: Started /usr/bin/podman healthcheck run 4c3f871f4bdf287b1d3ce717956c1cf130617489217410eadf6fffcc440eb3b0. Dec 15 03:33:29 localhost systemd[1]: tmp-crun.WpL3EX.mount: Deactivated successfully. 
Dec 15 03:33:29 localhost podman[86279]: 2025-12-15 08:33:29.76882379 +0000 UTC m=+0.088716228 container health_status 4c3f871f4bdf287b1d3ce717956c1cf130617489217410eadf6fffcc440eb3b0 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '879500e96bf8dfb93687004bd86f2317'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, io.openshift.expose-services=, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20251118.1, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp17/openstack-nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.41.4, url=https://www.redhat.com, build-date=2025-11-19T00:36:58Z, 
vendor=Red Hat, Inc., managed_by=tripleo_ansible, vcs-type=git, maintainer=OpenStack TripleO Team, release=1761123044, distribution-scope=public, container_name=nova_migration_target, config_id=tripleo_step4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.component=openstack-nova-compute-container, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.12) Dec 15 03:33:30 localhost podman[86279]: 2025-12-15 08:33:30.119733216 +0000 UTC m=+0.439625694 container exec_died 4c3f871f4bdf287b1d3ce717956c1cf130617489217410eadf6fffcc440eb3b0 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, distribution-scope=public, architecture=x86_64, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.component=openstack-nova-compute-container, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, name=rhosp17/openstack-nova-compute, managed_by=tripleo_ansible, batch=17.1_20251118.1, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-19T00:36:58Z, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '879500e96bf8dfb93687004bd86f2317'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': 
['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, vendor=Red Hat, Inc., release=1761123044, io.buildah.version=1.41.4, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.12, container_name=nova_migration_target, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, config_id=tripleo_step4) Dec 15 03:33:30 localhost systemd[1]: 4c3f871f4bdf287b1d3ce717956c1cf130617489217410eadf6fffcc440eb3b0.service: Deactivated successfully. Dec 15 03:33:32 localhost systemd[1]: Started /usr/bin/podman healthcheck run 2df8d20c6ba32f38bd0e6519c9c77a4146b632f21c81197d8c319b223aad81f1. Dec 15 03:33:32 localhost systemd[1]: Started /usr/bin/podman healthcheck run 4d35b7dc3f3a4e3c60661e85d3511f50804e87244d74cab67af5c6e8c447d379. Dec 15 03:33:32 localhost systemd[1]: tmp-crun.DzGrMX.mount: Deactivated successfully. 
Dec 15 03:33:32 localhost podman[86303]: 2025-12-15 08:33:32.760002283 +0000 UTC m=+0.084980969 container health_status 2df8d20c6ba32f38bd0e6519c9c77a4146b632f21c81197d8c319b223aad81f1 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, batch=17.1_20251118.1, io.buildah.version=1.41.4, architecture=x86_64, container_name=ovn_controller, config_id=tripleo_step4, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, url=https://www.redhat.com, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, summary=Red Hat OpenStack Platform 17.1 ovn-controller, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, tcib_managed=true, name=rhosp17/openstack-ovn-controller, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, release=1761123044, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, build-date=2025-11-18T23:34:05Z, com.redhat.component=openstack-ovn-controller-container, vendor=Red Hat, Inc., 
io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, description=Red Hat OpenStack Platform 17.1 ovn-controller, konflux.additional-tags=17.1.12 17.1_20251118.1, version=17.1.12) Dec 15 03:33:32 localhost podman[86304]: 2025-12-15 08:33:32.819203813 +0000 UTC m=+0.140194893 container health_status 4d35b7dc3f3a4e3c60661e85d3511f50804e87244d74cab67af5c6e8c447d379 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, build-date=2025-11-19T00:14:25Z, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, tcib_managed=true, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'a56a6f14b467cd9064e40c03defa5ed7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', 
'/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, distribution-scope=public, name=rhosp17/openstack-neutron-metadata-agent-ovn, url=https://www.redhat.com, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, architecture=x86_64, container_name=ovn_metadata_agent, io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, version=17.1.12, batch=17.1_20251118.1, release=1761123044, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, konflux.additional-tags=17.1.12 17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.expose-services=, config_id=tripleo_step4, com.redhat.component=openstack-neutron-metadata-agent-ovn-container) Dec 15 03:33:32 localhost podman[86303]: 2025-12-15 08:33:32.833434823 +0000 UTC m=+0.158413559 container exec_died 2df8d20c6ba32f38bd0e6519c9c77a4146b632f21c81197d8c319b223aad81f1 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, version=17.1.12, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, container_name=ovn_controller, release=1761123044, config_id=tripleo_step4, io.openshift.expose-services=, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 ovn-controller, batch=17.1_20251118.1, url=https://www.redhat.com, 
cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 ovn-controller, managed_by=tripleo_ansible, io.buildah.version=1.41.4, com.redhat.component=openstack-ovn-controller-container, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, tcib_managed=true, maintainer=OpenStack TripleO Team, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, build-date=2025-11-18T23:34:05Z, name=rhosp17/openstack-ovn-controller, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, distribution-scope=public, konflux.additional-tags=17.1.12 17.1_20251118.1) Dec 15 03:33:32 localhost systemd[1]: 2df8d20c6ba32f38bd0e6519c9c77a4146b632f21c81197d8c319b223aad81f1.service: Deactivated successfully. 
Dec 15 03:33:32 localhost podman[86304]: 2025-12-15 08:33:32.898460089 +0000 UTC m=+0.219451189 container exec_died 4d35b7dc3f3a4e3c60661e85d3511f50804e87244d74cab67af5c6e8c447d379 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, architecture=x86_64, io.openshift.expose-services=, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'a56a6f14b467cd9064e40c03defa5ed7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, release=1761123044, konflux.additional-tags=17.1.12 17.1_20251118.1, io.buildah.version=1.41.4, 
url=https://www.redhat.com, build-date=2025-11-19T00:14:25Z, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, batch=17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, name=rhosp17/openstack-neutron-metadata-agent-ovn, version=17.1.12, vendor=Red Hat, Inc., managed_by=tripleo_ansible, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, container_name=ovn_metadata_agent, vcs-type=git, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_id=tripleo_step4) Dec 15 03:33:32 localhost systemd[1]: 4d35b7dc3f3a4e3c60661e85d3511f50804e87244d74cab67af5c6e8c447d379.service: Deactivated successfully. Dec 15 03:33:33 localhost systemd[1]: tmp-crun.94bjI0.mount: Deactivated successfully. Dec 15 03:33:41 localhost systemd[1]: Started /usr/bin/podman healthcheck run 6278d648aec9c9f12e0efaafa0d6ae5653eeb34459516699e562eeb9c8d23b3c. Dec 15 03:33:41 localhost systemd[1]: tmp-crun.S2rBfO.mount: Deactivated successfully. 
Dec 15 03:33:41 localhost podman[86351]: 2025-12-15 08:33:41.765420854 +0000 UTC m=+0.096550979 container health_status 6278d648aec9c9f12e0efaafa0d6ae5653eeb34459516699e562eeb9c8d23b3c (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, io.openshift.expose-services=, version=17.1.12, batch=17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, url=https://www.redhat.com, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '29d07da103bbd44a9ed3e29999314b03'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, summary=Red Hat OpenStack Platform 17.1 qdrouterd, release=1761123044, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, name=rhosp17/openstack-qdrouterd, container_name=metrics_qdr, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, vendor=Red Hat, Inc., com.redhat.component=openstack-qdrouterd-container, konflux.additional-tags=17.1.12 17.1_20251118.1, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-type=git, description=Red Hat OpenStack Platform 17.1 qdrouterd, maintainer=OpenStack TripleO Team, tcib_managed=true, io.buildah.version=1.41.4, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, build-date=2025-11-18T22:49:46Z, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, distribution-scope=public, config_id=tripleo_step1) Dec 15 03:33:41 localhost podman[86351]: 2025-12-15 08:33:41.96277157 +0000 UTC m=+0.293901685 container exec_died 6278d648aec9c9f12e0efaafa0d6ae5653eeb34459516699e562eeb9c8d23b3c (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, com.redhat.component=openstack-qdrouterd-container, konflux.additional-tags=17.1.12 17.1_20251118.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '29d07da103bbd44a9ed3e29999314b03'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', 
'/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-18T22:49:46Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, vendor=Red Hat, Inc., release=1761123044, config_id=tripleo_step1, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, version=17.1.12, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, container_name=metrics_qdr, description=Red Hat OpenStack Platform 17.1 qdrouterd, io.buildah.version=1.41.4, io.openshift.expose-services=, batch=17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 qdrouterd, url=https://www.redhat.com, name=rhosp17/openstack-qdrouterd, tcib_managed=true, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd) Dec 15 03:33:41 localhost systemd[1]: 6278d648aec9c9f12e0efaafa0d6ae5653eeb34459516699e562eeb9c8d23b3c.service: Deactivated successfully. Dec 15 03:33:57 localhost systemd[1]: Started /usr/bin/podman healthcheck run 165a6fd1f2b629d16da0798ec1c9bac41d302d530a0ffa668b2315a52d6aa04e. Dec 15 03:33:57 localhost systemd[1]: Started /usr/bin/podman healthcheck run 2649c3aec9af2f04509e1d248b1050b202fccc3ed2b86421c89a2a65376db057. Dec 15 03:33:57 localhost systemd[1]: Started /usr/bin/podman healthcheck run 36a88083ed613f6871d2631ae462bca308a45ba583333f7cfe4d0b0c40a18ba5. Dec 15 03:33:57 localhost systemd[1]: Started /usr/bin/podman healthcheck run 97115ff7c5a84ae7e17f79e696e94da3c63f9fe0051287fe9193bec282a03f99. 
Dec 15 03:33:57 localhost systemd[1]: Started /usr/bin/podman healthcheck run ac45612fab357b615b4684dbfa5bd000dd29eb1adfbaedb6db0802fc49e89549. Dec 15 03:33:57 localhost systemd[1]: Started /usr/bin/podman healthcheck run d6041e5471624a495d5be42684f6049ccf71802a80d5760239330151d465d146. Dec 15 03:33:57 localhost systemd[1]: tmp-crun.y7Nsdn.mount: Deactivated successfully. Dec 15 03:33:57 localhost podman[86396]: 2025-12-15 08:33:57.781157582 +0000 UTC m=+0.095127830 container health_status d6041e5471624a495d5be42684f6049ccf71802a80d5760239330151d465d146 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, vendor=Red Hat, Inc., config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'fcee5a4a91f85471fca7b61211375646'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, build-date=2025-11-19T00:11:48Z, com.redhat.component=openstack-ceilometer-compute-container, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, 
version=17.1.12, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, vcs-type=git, config_id=tripleo_step4, container_name=ceilometer_agent_compute, release=1761123044, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, tcib_managed=true, url=https://www.redhat.com, architecture=x86_64, io.buildah.version=1.41.4, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.12 17.1_20251118.1, batch=17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.expose-services=, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, name=rhosp17/openstack-ceilometer-compute, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute) Dec 15 03:33:57 localhost podman[86381]: 2025-12-15 08:33:57.810122874 +0000 UTC m=+0.138043805 container health_status 165a6fd1f2b629d16da0798ec1c9bac41d302d530a0ffa668b2315a52d6aa04e (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 collectd, container_name=collectd, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., build-date=2025-11-18T22:51:28Z, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, com.redhat.component=openstack-collectd-container, distribution-scope=public, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1, name=rhosp17/openstack-collectd, version=17.1.12, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, architecture=x86_64, config_id=tripleo_step3, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, vcs-type=git, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 collectd, konflux.additional-tags=17.1.12 17.1_20251118.1, release=1761123044, managed_by=tripleo_ansible, io.openshift.expose-services=) Dec 15 03:33:57 localhost podman[86381]: 2025-12-15 
08:33:57.817532422 +0000 UTC m=+0.145453383 container exec_died 165a6fd1f2b629d16da0798ec1c9bac41d302d530a0ffa668b2315a52d6aa04e (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, config_id=tripleo_step3, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, batch=17.1_20251118.1, konflux.additional-tags=17.1.12 17.1_20251118.1, release=1761123044, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 
17.1 collectd, description=Red Hat OpenStack Platform 17.1 collectd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, managed_by=tripleo_ansible, container_name=collectd, version=17.1.12, architecture=x86_64, com.redhat.component=openstack-collectd-container, build-date=2025-11-18T22:51:28Z, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, summary=Red Hat OpenStack Platform 17.1 collectd, io.buildah.version=1.41.4, name=rhosp17/openstack-collectd, tcib_managed=true, io.openshift.expose-services=, vcs-type=git, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a) Dec 15 03:33:57 localhost systemd[1]: 165a6fd1f2b629d16da0798ec1c9bac41d302d530a0ffa668b2315a52d6aa04e.service: Deactivated successfully. Dec 15 03:33:57 localhost podman[86382]: 2025-12-15 08:33:57.760686026 +0000 UTC m=+0.087373414 container health_status 2649c3aec9af2f04509e1d248b1050b202fccc3ed2b86421c89a2a65376db057 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, batch=17.1_20251118.1, build-date=2025-11-18T23:44:13Z, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, architecture=x86_64, io.buildah.version=1.41.4, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-type=git, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.component=openstack-iscsid-container, summary=Red Hat OpenStack Platform 17.1 iscsid, managed_by=tripleo_ansible, version=17.1.12, container_name=iscsid, distribution-scope=public, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '182e509007ab5e6e5b2500a552cbd5ba'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 
'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, description=Red Hat OpenStack Platform 17.1 iscsid, name=rhosp17/openstack-iscsid, config_id=tripleo_step3, release=1761123044, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI) Dec 15 03:33:57 localhost podman[86396]: 2025-12-15 08:33:57.86241146 +0000 UTC m=+0.176381668 container exec_died d6041e5471624a495d5be42684f6049ccf71802a80d5760239330151d465d146 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, batch=17.1_20251118.1, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'fcee5a4a91f85471fca7b61211375646'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 
'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, build-date=2025-11-19T00:11:48Z, architecture=x86_64, io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, container_name=ceilometer_agent_compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, release=1761123044, vcs-type=git, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, tcib_managed=true, managed_by=tripleo_ansible, com.redhat.component=openstack-ceilometer-compute-container, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, config_id=tripleo_step4, name=rhosp17/openstack-ceilometer-compute, vendor=Red Hat, Inc., io.openshift.expose-services=, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, 
version=17.1.12, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI) Dec 15 03:33:57 localhost systemd[1]: d6041e5471624a495d5be42684f6049ccf71802a80d5760239330151d465d146.service: Deactivated successfully. Dec 15 03:33:57 localhost podman[86383]: 2025-12-15 08:33:57.922432572 +0000 UTC m=+0.245675618 container health_status 36a88083ed613f6871d2631ae462bca308a45ba583333f7cfe4d0b0c40a18ba5 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, tcib_managed=true, version=17.1.12, build-date=2025-11-19T00:36:58Z, vcs-type=git, url=https://www.redhat.com, architecture=x86_64, konflux.additional-tags=17.1.12 17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 nova-compute, container_name=nova_compute, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-nova-compute-container, name=rhosp17/openstack-nova-compute, io.openshift.expose-services=, batch=17.1_20251118.1, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '182e509007ab5e6e5b2500a552cbd5ba-879500e96bf8dfb93687004bd86f2317'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 
'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, distribution-scope=public, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, release=1761123044, io.buildah.version=1.41.4, config_id=tripleo_step5) Dec 15 03:33:57 localhost podman[86390]: 2025-12-15 08:33:57.864092385 +0000 UTC m=+0.180620722 container health_status ac45612fab357b615b4684dbfa5bd000dd29eb1adfbaedb6db0802fc49e89549 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, release=1761123044, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 
'53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, description=Red Hat OpenStack Platform 17.1 cron, batch=17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, name=rhosp17/openstack-cron, maintainer=OpenStack TripleO Team, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, version=17.1.12, url=https://www.redhat.com, vcs-type=git, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_id=tripleo_step4, konflux.additional-tags=17.1.12 17.1_20251118.1, container_name=logrotate_crond, vendor=Red Hat, Inc., com.redhat.component=openstack-cron-container, summary=Red Hat OpenStack Platform 17.1 cron, 
tcib_managed=true, managed_by=tripleo_ansible, io.openshift.expose-services=, build-date=2025-11-18T22:49:32Z, io.buildah.version=1.41.4) Dec 15 03:33:57 localhost podman[86384]: 2025-12-15 08:33:57.973451994 +0000 UTC m=+0.291203773 container health_status 97115ff7c5a84ae7e17f79e696e94da3c63f9fe0051287fe9193bec282a03f99 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, build-date=2025-11-19T00:12:45Z, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, managed_by=tripleo_ansible, url=https://www.redhat.com, name=rhosp17/openstack-ceilometer-ipmi, architecture=x86_64, konflux.additional-tags=17.1.12 17.1_20251118.1, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, batch=17.1_20251118.1, com.redhat.component=openstack-ceilometer-ipmi-container, io.buildah.version=1.41.4, version=17.1.12, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'fcee5a4a91f85471fca7b61211375646'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, config_id=tripleo_step4, container_name=ceilometer_agent_ipmi, release=1761123044, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI) Dec 15 03:33:57 localhost podman[86382]: 2025-12-15 08:33:57.996302084 +0000 UTC m=+0.322989522 container exec_died 2649c3aec9af2f04509e1d248b1050b202fccc3ed2b86421c89a2a65376db057 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, config_id=tripleo_step3, url=https://www.redhat.com, name=rhosp17/openstack-iscsid, vcs-type=git, version=17.1.12, konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team, release=1761123044, io.buildah.version=1.41.4, container_name=iscsid, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, managed_by=tripleo_ansible, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, com.redhat.component=openstack-iscsid-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, description=Red Hat OpenStack Platform 17.1 iscsid, build-date=2025-11-18T23:44:13Z, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, summary=Red Hat OpenStack Platform 17.1 iscsid, vendor=Red Hat, Inc., 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, architecture=x86_64, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '182e509007ab5e6e5b2500a552cbd5ba'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, batch=17.1_20251118.1, tcib_managed=true) Dec 15 03:33:58 localhost podman[86383]: 2025-12-15 08:33:58.004566324 +0000 UTC m=+0.327809450 container exec_died 36a88083ed613f6871d2631ae462bca308a45ba583333f7cfe4d0b0c40a18ba5 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, url=https://www.redhat.com, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '182e509007ab5e6e5b2500a552cbd5ba-879500e96bf8dfb93687004bd86f2317'}, 'healthcheck': {'test': '/openstack/healthcheck 
5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, description=Red Hat OpenStack Platform 17.1 nova-compute, container_name=nova_compute, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, vcs-type=git, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, 
com.redhat.component=openstack-nova-compute-container, vendor=Red Hat, Inc., name=rhosp17/openstack-nova-compute, version=17.1.12, build-date=2025-11-19T00:36:58Z, release=1761123044, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, config_id=tripleo_step5, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.buildah.version=1.41.4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.12 17.1_20251118.1, tcib_managed=true, io.openshift.expose-services=) Dec 15 03:33:58 localhost systemd[1]: 2649c3aec9af2f04509e1d248b1050b202fccc3ed2b86421c89a2a65376db057.service: Deactivated successfully. Dec 15 03:33:58 localhost systemd[1]: 36a88083ed613f6871d2631ae462bca308a45ba583333f7cfe4d0b0c40a18ba5.service: Deactivated successfully. Dec 15 03:33:58 localhost podman[86390]: 2025-12-15 08:33:58.048753443 +0000 UTC m=+0.365281810 container exec_died ac45612fab357b615b4684dbfa5bd000dd29eb1adfbaedb6db0802fc49e89549 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, description=Red Hat OpenStack Platform 17.1 cron, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, container_name=logrotate_crond, io.openshift.expose-services=, version=17.1.12, distribution-scope=public, io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1761123044, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-cron-container, build-date=2025-11-18T22:49:32Z, url=https://www.redhat.com, 
vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, batch=17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_id=tripleo_step4, managed_by=tripleo_ansible, name=rhosp17/openstack-cron, tcib_managed=true, vcs-type=git, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, konflux.additional-tags=17.1.12 17.1_20251118.1, architecture=x86_64, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 cron) Dec 15 03:33:58 localhost podman[86384]: 2025-12-15 08:33:58.057389474 +0000 UTC m=+0.375141243 container exec_died 97115ff7c5a84ae7e17f79e696e94da3c63f9fe0051287fe9193bec282a03f99 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, distribution-scope=public, batch=17.1_20251118.1, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, 
url=https://www.redhat.com, vcs-type=git, managed_by=tripleo_ansible, container_name=ceilometer_agent_ipmi, config_id=tripleo_step4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.component=openstack-ceilometer-ipmi-container, io.buildah.version=1.41.4, release=1761123044, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, maintainer=OpenStack TripleO Team, tcib_managed=true, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, io.openshift.expose-services=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'fcee5a4a91f85471fca7b61211375646'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, name=rhosp17/openstack-ceilometer-ipmi, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, build-date=2025-11-19T00:12:45Z, version=17.1.12, vendor=Red Hat, Inc., 
architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi) Dec 15 03:33:58 localhost systemd[1]: ac45612fab357b615b4684dbfa5bd000dd29eb1adfbaedb6db0802fc49e89549.service: Deactivated successfully. Dec 15 03:33:58 localhost systemd[1]: 97115ff7c5a84ae7e17f79e696e94da3c63f9fe0051287fe9193bec282a03f99.service: Deactivated successfully. Dec 15 03:34:00 localhost systemd[1]: Started /usr/bin/podman healthcheck run 4c3f871f4bdf287b1d3ce717956c1cf130617489217410eadf6fffcc440eb3b0. Dec 15 03:34:00 localhost podman[86518]: 2025-12-15 08:34:00.730910028 +0000 UTC m=+0.068964382 container health_status 4c3f871f4bdf287b1d3ce717956c1cf130617489217410eadf6fffcc440eb3b0 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '879500e96bf8dfb93687004bd86f2317'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', 
'/var/lib/nova:/var/lib/nova:shared']}, managed_by=tripleo_ansible, url=https://www.redhat.com, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., com.redhat.component=openstack-nova-compute-container, konflux.additional-tags=17.1.12 17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2025-11-19T00:36:58Z, container_name=nova_migration_target, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, config_id=tripleo_step4, io.openshift.expose-services=, tcib_managed=true, version=17.1.12, description=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, batch=17.1_20251118.1, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-nova-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, release=1761123044, io.buildah.version=1.41.4, vcs-type=git, architecture=x86_64) Dec 15 03:34:01 localhost podman[86518]: 2025-12-15 08:34:01.09028375 +0000 UTC m=+0.428338154 container exec_died 4c3f871f4bdf287b1d3ce717956c1cf130617489217410eadf6fffcc440eb3b0 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, tcib_managed=true, release=1761123044, managed_by=tripleo_ansible, build-date=2025-11-19T00:36:58Z, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20251118.1, architecture=x86_64, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, version=17.1.12, url=https://www.redhat.com, io.buildah.version=1.41.4, config_id=tripleo_step4, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, 
io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '879500e96bf8dfb93687004bd86f2317'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, vendor=Red Hat, Inc., distribution-scope=public, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-nova-compute-container, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, container_name=nova_migration_target, summary=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp17/openstack-nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI) Dec 15 03:34:01 localhost systemd[1]: 4c3f871f4bdf287b1d3ce717956c1cf130617489217410eadf6fffcc440eb3b0.service: Deactivated successfully. 
Dec 15 03:34:03 localhost systemd[1]: Started /usr/bin/podman healthcheck run 2df8d20c6ba32f38bd0e6519c9c77a4146b632f21c81197d8c319b223aad81f1. Dec 15 03:34:03 localhost systemd[1]: Started /usr/bin/podman healthcheck run 4d35b7dc3f3a4e3c60661e85d3511f50804e87244d74cab67af5c6e8c447d379. Dec 15 03:34:03 localhost podman[86543]: 2025-12-15 08:34:03.758299707 +0000 UTC m=+0.085002700 container health_status 4d35b7dc3f3a4e3c60661e85d3511f50804e87244d74cab67af5c6e8c447d379 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'a56a6f14b467cd9064e40c03defa5ed7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, 
org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., managed_by=tripleo_ansible, build-date=2025-11-19T00:14:25Z, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, io.openshift.expose-services=, batch=17.1_20251118.1, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, container_name=ovn_metadata_agent, config_id=tripleo_step4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, release=1761123044, architecture=x86_64, konflux.additional-tags=17.1.12 17.1_20251118.1, distribution-scope=public, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, version=17.1.12, io.buildah.version=1.41.4, name=rhosp17/openstack-neutron-metadata-agent-ovn, tcib_managed=true, vcs-type=git) Dec 15 03:34:03 localhost podman[86542]: 2025-12-15 08:34:03.80637761 +0000 UTC m=+0.135627751 container health_status 2df8d20c6ba32f38bd0e6519c9c77a4146b632f21c81197d8c319b223aad81f1 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, version=17.1.12, com.redhat.component=openstack-ovn-controller-container, io.buildah.version=1.41.4, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, 
org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.expose-services=, release=1761123044, container_name=ovn_controller, vendor=Red Hat, Inc., vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, build-date=2025-11-18T23:34:05Z, vcs-type=git, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, konflux.additional-tags=17.1.12 17.1_20251118.1, batch=17.1_20251118.1, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step4, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-ovn-controller, summary=Red Hat OpenStack Platform 17.1 ovn-controller, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller) Dec 15 03:34:03 localhost podman[86543]: 2025-12-15 08:34:03.826850717 +0000 UTC m=+0.153553730 container exec_died 4d35b7dc3f3a4e3c60661e85d3511f50804e87244d74cab67af5c6e8c447d379 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, config_id=tripleo_step4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, 
distribution-scope=public, tcib_managed=true, vcs-type=git, vendor=Red Hat, Inc., architecture=x86_64, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, version=17.1.12, build-date=2025-11-19T00:14:25Z, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'a56a6f14b467cd9064e40c03defa5ed7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, name=rhosp17/openstack-neutron-metadata-agent-ovn, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, batch=17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 
openstack-neutron-metadata-agent-ovn, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, container_name=ovn_metadata_agent, konflux.additional-tags=17.1.12 17.1_20251118.1, release=1761123044, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.buildah.version=1.41.4, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, url=https://www.redhat.com) Dec 15 03:34:03 localhost systemd[1]: 4d35b7dc3f3a4e3c60661e85d3511f50804e87244d74cab67af5c6e8c447d379.service: Deactivated successfully. Dec 15 03:34:03 localhost podman[86542]: 2025-12-15 08:34:03.883878868 +0000 UTC m=+0.213129039 container exec_died 2df8d20c6ba32f38bd0e6519c9c77a4146b632f21c81197d8c319b223aad81f1 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, config_id=tripleo_step4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, version=17.1.12, build-date=2025-11-18T23:34:05Z, description=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20251118.1, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': 
['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, name=rhosp17/openstack-ovn-controller, summary=Red Hat OpenStack Platform 17.1 ovn-controller, distribution-scope=public, url=https://www.redhat.com, com.redhat.component=openstack-ovn-controller-container, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, container_name=ovn_controller, vcs-type=git, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, maintainer=OpenStack TripleO Team, release=1761123044, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.expose-services=, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, architecture=x86_64, io.buildah.version=1.41.4, tcib_managed=true, managed_by=tripleo_ansible) Dec 15 03:34:03 localhost systemd[1]: 2df8d20c6ba32f38bd0e6519c9c77a4146b632f21c81197d8c319b223aad81f1.service: Deactivated successfully. Dec 15 03:34:12 localhost systemd[1]: Started /usr/bin/podman healthcheck run 6278d648aec9c9f12e0efaafa0d6ae5653eeb34459516699e562eeb9c8d23b3c. 
Dec 15 03:34:12 localhost podman[86590]: 2025-12-15 08:34:12.752416233 +0000 UTC m=+0.083477124 container health_status 6278d648aec9c9f12e0efaafa0d6ae5653eeb34459516699e562eeb9c8d23b3c (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, distribution-scope=public, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.buildah.version=1.41.4, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, konflux.additional-tags=17.1.12 17.1_20251118.1, version=17.1.12, build-date=2025-11-18T22:49:46Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '29d07da103bbd44a9ed3e29999314b03'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, name=rhosp17/openstack-qdrouterd, com.redhat.component=openstack-qdrouterd-container, release=1761123044, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, container_name=metrics_qdr, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 qdrouterd, batch=17.1_20251118.1, config_id=tripleo_step1, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 qdrouterd, url=https://www.redhat.com, vcs-type=git) Dec 15 03:34:12 localhost podman[86590]: 2025-12-15 08:34:12.943145604 +0000 UTC m=+0.274206465 container exec_died 6278d648aec9c9f12e0efaafa0d6ae5653eeb34459516699e562eeb9c8d23b3c (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, managed_by=tripleo_ansible, container_name=metrics_qdr, architecture=x86_64, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '29d07da103bbd44a9ed3e29999314b03'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', 
'/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, name=rhosp17/openstack-qdrouterd, version=17.1.12, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, tcib_managed=true, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, release=1761123044, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-qdrouterd-container, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, description=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.expose-services=, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 qdrouterd, build-date=2025-11-18T22:49:46Z, config_id=tripleo_step1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, distribution-scope=public, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, vcs-type=git, batch=17.1_20251118.1) Dec 15 03:34:12 localhost systemd[1]: 6278d648aec9c9f12e0efaafa0d6ae5653eeb34459516699e562eeb9c8d23b3c.service: Deactivated successfully. Dec 15 03:34:28 localhost systemd[1]: Started /usr/bin/podman healthcheck run 165a6fd1f2b629d16da0798ec1c9bac41d302d530a0ffa668b2315a52d6aa04e. Dec 15 03:34:28 localhost systemd[1]: Started /usr/bin/podman healthcheck run 2649c3aec9af2f04509e1d248b1050b202fccc3ed2b86421c89a2a65376db057. Dec 15 03:34:28 localhost systemd[1]: Started /usr/bin/podman healthcheck run 36a88083ed613f6871d2631ae462bca308a45ba583333f7cfe4d0b0c40a18ba5. Dec 15 03:34:28 localhost systemd[1]: Started /usr/bin/podman healthcheck run 97115ff7c5a84ae7e17f79e696e94da3c63f9fe0051287fe9193bec282a03f99. 
Dec 15 03:34:28 localhost systemd[1]: Started /usr/bin/podman healthcheck run ac45612fab357b615b4684dbfa5bd000dd29eb1adfbaedb6db0802fc49e89549. Dec 15 03:34:28 localhost systemd[1]: Started /usr/bin/podman healthcheck run d6041e5471624a495d5be42684f6049ccf71802a80d5760239330151d465d146. Dec 15 03:34:28 localhost systemd[1]: Starting Check and recover tripleo_nova_virtqemud... Dec 15 03:34:28 localhost recover_tripleo_nova_virtqemud[86778]: 61849 Dec 15 03:34:28 localhost systemd[1]: tripleo_nova_virtqemud_recover.service: Deactivated successfully. Dec 15 03:34:28 localhost systemd[1]: Finished Check and recover tripleo_nova_virtqemud. Dec 15 03:34:28 localhost systemd[1]: tmp-crun.yl1O2P.mount: Deactivated successfully. Dec 15 03:34:28 localhost podman[86741]: 2025-12-15 08:34:28.770494206 +0000 UTC m=+0.087269865 container health_status 36a88083ed613f6871d2631ae462bca308a45ba583333f7cfe4d0b0c40a18ba5 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, managed_by=tripleo_ansible, io.openshift.expose-services=, release=1761123044, com.redhat.component=openstack-nova-compute-container, config_id=tripleo_step5, vcs-type=git, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.12, vendor=Red Hat, Inc., name=rhosp17/openstack-nova-compute, io.buildah.version=1.41.4, container_name=nova_compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, url=https://www.redhat.com, architecture=x86_64, maintainer=OpenStack TripleO Team, distribution-scope=public, batch=17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 
'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '182e509007ab5e6e5b2500a552cbd5ba-879500e96bf8dfb93687004bd86f2317'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, build-date=2025-11-19T00:36:58Z) 
Dec 15 03:34:28 localhost podman[86741]: 2025-12-15 08:34:28.824435429 +0000 UTC m=+0.141211098 container exec_died 36a88083ed613f6871d2631ae462bca308a45ba583333f7cfe4d0b0c40a18ba5 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.component=openstack-nova-compute-container, config_id=tripleo_step5, version=17.1.12, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20251118.1, vcs-type=git, release=1761123044, architecture=x86_64, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, name=rhosp17/openstack-nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2025-11-19T00:36:58Z, konflux.additional-tags=17.1.12 17.1_20251118.1, url=https://www.redhat.com, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '182e509007ab5e6e5b2500a552cbd5ba-879500e96bf8dfb93687004bd86f2317'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 
'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, container_name=nova_compute) Dec 15 03:34:28 localhost podman[86739]: 2025-12-15 08:34:28.832756831 +0000 UTC m=+0.157285127 container health_status 165a6fd1f2b629d16da0798ec1c9bac41d302d530a0ffa668b2315a52d6aa04e (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, io.openshift.expose-services=, vcs-type=git, build-date=2025-11-18T22:51:28Z, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-collectd, vendor=Red Hat, Inc., com.redhat.component=openstack-collectd-container, io.buildah.version=1.41.4, url=https://www.redhat.com, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 
'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, distribution-scope=public, config_id=tripleo_step3, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, version=17.1.12, release=1761123044, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, architecture=x86_64, batch=17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 collectd, summary=Red Hat OpenStack Platform 17.1 collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, konflux.additional-tags=17.1.12 17.1_20251118.1, 
tcib_managed=true, container_name=collectd) Dec 15 03:34:28 localhost systemd[1]: 36a88083ed613f6871d2631ae462bca308a45ba583333f7cfe4d0b0c40a18ba5.service: Deactivated successfully. Dec 15 03:34:28 localhost podman[86739]: 2025-12-15 08:34:28.87156903 +0000 UTC m=+0.196097436 container exec_died 165a6fd1f2b629d16da0798ec1c9bac41d302d530a0ffa668b2315a52d6aa04e (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, distribution-scope=public, com.redhat.component=openstack-collectd-container, architecture=x86_64, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, summary=Red Hat OpenStack Platform 17.1 collectd, tcib_managed=true, version=17.1.12, release=1761123044, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, batch=17.1_20251118.1, vcs-type=git, build-date=2025-11-18T22:51:28Z, config_id=tripleo_step3, description=Red Hat OpenStack Platform 17.1 collectd, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.12 17.1_20251118.1, name=rhosp17/openstack-collectd, managed_by=tripleo_ansible, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vendor=Red Hat, Inc., io.buildah.version=1.41.4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, container_name=collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': 
['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, io.openshift.expose-services=) Dec 15 03:34:28 localhost systemd[1]: 165a6fd1f2b629d16da0798ec1c9bac41d302d530a0ffa668b2315a52d6aa04e.service: Deactivated successfully. 
Dec 15 03:34:28 localhost podman[86761]: 2025-12-15 08:34:28.893034924 +0000 UTC m=+0.198482080 container health_status d6041e5471624a495d5be42684f6049ccf71802a80d5760239330151d465d146 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'fcee5a4a91f85471fca7b61211375646'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, container_name=ceilometer_agent_compute, io.buildah.version=1.41.4, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, version=17.1.12, managed_by=tripleo_ansible, release=1761123044, tcib_managed=true, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, vendor=Red Hat, Inc., vcs-type=git, build-date=2025-11-19T00:11:48Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, batch=17.1_20251118.1, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, distribution-scope=public, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, name=rhosp17/openstack-ceilometer-compute, com.redhat.component=openstack-ceilometer-compute-container, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute) Dec 15 03:34:28 localhost podman[86761]: 2025-12-15 08:34:28.920032636 +0000 UTC m=+0.225479822 container exec_died d6041e5471624a495d5be42684f6049ccf71802a80d5760239330151d465d146 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, com.redhat.component=openstack-ceilometer-compute-container, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, managed_by=tripleo_ansible, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'fcee5a4a91f85471fca7b61211375646'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-type=git, version=17.1.12, release=1761123044, build-date=2025-11-19T00:11:48Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, name=rhosp17/openstack-ceilometer-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, tcib_managed=true, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.expose-services=, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, io.buildah.version=1.41.4, config_id=tripleo_step4, url=https://www.redhat.com, batch=17.1_20251118.1, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, architecture=x86_64, container_name=ceilometer_agent_compute) Dec 15 03:34:28 localhost systemd[1]: d6041e5471624a495d5be42684f6049ccf71802a80d5760239330151d465d146.service: Deactivated successfully. 
Dec 15 03:34:28 localhost podman[86740]: 2025-12-15 08:34:28.939543848 +0000 UTC m=+0.256412639 container health_status 2649c3aec9af2f04509e1d248b1050b202fccc3ed2b86421c89a2a65376db057 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, config_id=tripleo_step3, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.expose-services=, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, tcib_managed=true, com.redhat.component=openstack-iscsid-container, io.buildah.version=1.41.4, container_name=iscsid, version=17.1.12, konflux.additional-tags=17.1.12 17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 iscsid, distribution-scope=public, batch=17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '182e509007ab5e6e5b2500a552cbd5ba'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', 
'/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, maintainer=OpenStack TripleO Team, release=1761123044, name=rhosp17/openstack-iscsid, build-date=2025-11-18T23:44:13Z, managed_by=tripleo_ansible, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, summary=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, url=https://www.redhat.com, architecture=x86_64, vcs-type=git) Dec 15 03:34:28 localhost podman[86740]: 2025-12-15 08:34:28.950341016 +0000 UTC m=+0.267209777 container exec_died 2649c3aec9af2f04509e1d248b1050b202fccc3ed2b86421c89a2a65376db057 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, com.redhat.component=openstack-iscsid-container, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, config_id=tripleo_step3, vendor=Red Hat, Inc., org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, io.openshift.expose-services=, batch=17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, distribution-scope=public, managed_by=tripleo_ansible, io.buildah.version=1.41.4, release=1761123044, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.12, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, url=https://www.redhat.com, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '182e509007ab5e6e5b2500a552cbd5ba'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 
'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, container_name=iscsid, name=rhosp17/openstack-iscsid, vcs-type=git, build-date=2025-11-18T23:44:13Z, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, tcib_managed=true, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid) Dec 15 03:34:28 localhost podman[86748]: 2025-12-15 08:34:28.981452909 +0000 UTC m=+0.294492078 container health_status 97115ff7c5a84ae7e17f79e696e94da3c63f9fe0051287fe9193bec282a03f99 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, name=rhosp17/openstack-ceilometer-ipmi, release=1761123044, io.openshift.expose-services=, tcib_managed=true, container_name=ceilometer_agent_ipmi, io.buildah.version=1.41.4, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'fcee5a4a91f85471fca7b61211375646'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 
'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, version=17.1.12, architecture=x86_64, konflux.additional-tags=17.1.12 17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, config_id=tripleo_step4, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, build-date=2025-11-19T00:12:45Z, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, batch=17.1_20251118.1, url=https://www.redhat.com, com.redhat.component=openstack-ceilometer-ipmi-container, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676) Dec 15 03:34:29 localhost podman[86748]: 2025-12-15 08:34:29.008256756 +0000 UTC m=+0.321295945 container exec_died 
97115ff7c5a84ae7e17f79e696e94da3c63f9fe0051287fe9193bec282a03f99 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, version=17.1.12, config_id=tripleo_step4, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, tcib_managed=true, konflux.additional-tags=17.1.12 17.1_20251118.1, release=1761123044, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, io.buildah.version=1.41.4, com.redhat.component=openstack-ceilometer-ipmi-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, batch=17.1_20251118.1, url=https://www.redhat.com, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'fcee5a4a91f85471fca7b61211375646'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, maintainer=OpenStack TripleO Team, 
managed_by=tripleo_ansible, vcs-type=git, architecture=x86_64, build-date=2025-11-19T00:12:45Z, container_name=ceilometer_agent_ipmi, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, vendor=Red Hat, Inc., io.openshift.expose-services=, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, name=rhosp17/openstack-ceilometer-ipmi, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676) Dec 15 03:34:29 localhost systemd[1]: 2649c3aec9af2f04509e1d248b1050b202fccc3ed2b86421c89a2a65376db057.service: Deactivated successfully. Dec 15 03:34:29 localhost systemd[1]: 97115ff7c5a84ae7e17f79e696e94da3c63f9fe0051287fe9193bec282a03f99.service: Deactivated successfully. Dec 15 03:34:29 localhost podman[86749]: 2025-12-15 08:34:29.090117755 +0000 UTC m=+0.391262815 container health_status ac45612fab357b615b4684dbfa5bd000dd29eb1adfbaedb6db0802fc49e89549 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 cron, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.buildah.version=1.41.4, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, description=Red Hat OpenStack Platform 17.1 cron, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-type=git, maintainer=OpenStack TripleO Team, batch=17.1_20251118.1, build-date=2025-11-18T22:49:32Z, vendor=Red Hat, Inc., config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 
'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, com.redhat.component=openstack-cron-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, release=1761123044, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_id=tripleo_step4, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, container_name=logrotate_crond, name=rhosp17/openstack-cron, version=17.1.12, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream) Dec 15 03:34:29 localhost podman[86749]: 2025-12-15 08:34:29.126313923 +0000 UTC m=+0.427458973 container exec_died ac45612fab357b615b4684dbfa5bd000dd29eb1adfbaedb6db0802fc49e89549 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, url=https://www.redhat.com, build-date=2025-11-18T22:49:32Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 
'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 cron, description=Red Hat OpenStack Platform 17.1 cron, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, tcib_managed=true, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.component=openstack-cron-container, container_name=logrotate_crond, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, io.buildah.version=1.41.4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, maintainer=OpenStack TripleO Team, release=1761123044, version=17.1.12, config_id=tripleo_step4, batch=17.1_20251118.1, architecture=x86_64, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, distribution-scope=public, name=rhosp17/openstack-cron, io.openshift.expose-services=) Dec 15 03:34:29 localhost systemd[1]: ac45612fab357b615b4684dbfa5bd000dd29eb1adfbaedb6db0802fc49e89549.service: Deactivated successfully. 
Dec 15 03:34:29 localhost systemd[1]: tmp-crun.vrXeIV.mount: Deactivated successfully. Dec 15 03:34:31 localhost systemd[1]: Started /usr/bin/podman healthcheck run 4c3f871f4bdf287b1d3ce717956c1cf130617489217410eadf6fffcc440eb3b0. Dec 15 03:34:31 localhost systemd[1]: tmp-crun.Dn52Fv.mount: Deactivated successfully. Dec 15 03:34:31 localhost podman[86874]: 2025-12-15 08:34:31.774141142 +0000 UTC m=+0.105288967 container health_status 4c3f871f4bdf287b1d3ce717956c1cf130617489217410eadf6fffcc440eb3b0 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, config_id=tripleo_step4, managed_by=tripleo_ansible, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '879500e96bf8dfb93687004bd86f2317'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, summary=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, 
container_name=nova_migration_target, description=Red Hat OpenStack Platform 17.1 nova-compute, release=1761123044, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, maintainer=OpenStack TripleO Team, batch=17.1_20251118.1, tcib_managed=true, name=rhosp17/openstack-nova-compute, io.buildah.version=1.41.4, io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1, url=https://www.redhat.com, vcs-type=git, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vendor=Red Hat, Inc., com.redhat.component=openstack-nova-compute-container, distribution-scope=public, build-date=2025-11-19T00:36:58Z, version=17.1.12, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute) Dec 15 03:34:32 localhost podman[86874]: 2025-12-15 08:34:32.154507675 +0000 UTC m=+0.485655460 container exec_died 4c3f871f4bdf287b1d3ce717956c1cf130617489217410eadf6fffcc440eb3b0 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, architecture=x86_64, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, name=rhosp17/openstack-nova-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, container_name=nova_migration_target, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20251118.1, vendor=Red Hat, Inc., org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.buildah.version=1.41.4, io.openshift.expose-services=, tcib_managed=true, release=1761123044, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.component=openstack-nova-compute-container, maintainer=OpenStack TripleO Team, 
build-date=2025-11-19T00:36:58Z, description=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_id=tripleo_step4, managed_by=tripleo_ansible, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '879500e96bf8dfb93687004bd86f2317'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 nova-compute, distribution-scope=public, vcs-type=git, version=17.1.12) Dec 15 03:34:32 localhost systemd[1]: 4c3f871f4bdf287b1d3ce717956c1cf130617489217410eadf6fffcc440eb3b0.service: Deactivated successfully. Dec 15 03:34:34 localhost systemd[1]: Started /usr/bin/podman healthcheck run 2df8d20c6ba32f38bd0e6519c9c77a4146b632f21c81197d8c319b223aad81f1. 
Dec 15 03:34:34 localhost systemd[1]: Started /usr/bin/podman healthcheck run 4d35b7dc3f3a4e3c60661e85d3511f50804e87244d74cab67af5c6e8c447d379. Dec 15 03:34:34 localhost systemd[1]: tmp-crun.TuUeDI.mount: Deactivated successfully. Dec 15 03:34:34 localhost podman[86899]: 2025-12-15 08:34:34.743469971 +0000 UTC m=+0.078287625 container health_status 2df8d20c6ba32f38bd0e6519c9c77a4146b632f21c81197d8c319b223aad81f1 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, url=https://www.redhat.com, build-date=2025-11-18T23:34:05Z, distribution-scope=public, release=1761123044, batch=17.1_20251118.1, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, konflux.additional-tags=17.1.12 17.1_20251118.1, container_name=ovn_controller, config_id=tripleo_step4, name=rhosp17/openstack-ovn-controller, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.component=openstack-ovn-controller-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.buildah.version=1.41.4, vcs-type=git, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, version=17.1.12, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 
'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller) Dec 15 03:34:34 localhost podman[86900]: 2025-12-15 08:34:34.759631183 +0000 UTC m=+0.086872714 container health_status 4d35b7dc3f3a4e3c60661e85d3511f50804e87244d74cab67af5c6e8c447d379 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, container_name=ovn_metadata_agent, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, version=17.1.12, batch=17.1_20251118.1, name=rhosp17/openstack-neutron-metadata-agent-ovn, io.buildah.version=1.41.4, architecture=x86_64, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, distribution-scope=public, url=https://www.redhat.com, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.expose-services=, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'a56a6f14b467cd9064e40c03defa5ed7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_id=tripleo_step4, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, tcib_managed=true, build-date=2025-11-19T00:14:25Z, release=1761123044, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, managed_by=tripleo_ansible) Dec 15 03:34:34 localhost podman[86899]: 2025-12-15 08:34:34.793330624 +0000 UTC m=+0.128148258 container exec_died 2df8d20c6ba32f38bd0e6519c9c77a4146b632f21c81197d8c319b223aad81f1 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, version=17.1.12, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, 
description=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 ovn-controller, distribution-scope=public, maintainer=OpenStack TripleO Team, container_name=ovn_controller, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-ovn-controller-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, name=rhosp17/openstack-ovn-controller, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.buildah.version=1.41.4, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, managed_by=tripleo_ansible, release=1761123044, batch=17.1_20251118.1, vendor=Red Hat, Inc., vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, tcib_managed=true, config_id=tripleo_step4, build-date=2025-11-18T23:34:05Z, url=https://www.redhat.com, architecture=x86_64) Dec 15 03:34:34 localhost podman[86900]: 2025-12-15 08:34:34.804254937 +0000 UTC m=+0.131496478 container exec_died 4d35b7dc3f3a4e3c60661e85d3511f50804e87244d74cab67af5c6e8c447d379 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.expose-services=, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, release=1761123044, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, distribution-scope=public, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, config_id=tripleo_step4, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, version=17.1.12, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.12 17.1_20251118.1, vendor=Red Hat, Inc., build-date=2025-11-19T00:14:25Z, container_name=ovn_metadata_agent, io.buildah.version=1.41.4, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, architecture=x86_64, batch=17.1_20251118.1, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'a56a6f14b467cd9064e40c03defa5ed7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, managed_by=tripleo_ansible, tcib_managed=true, name=rhosp17/openstack-neutron-metadata-agent-ovn) Dec 15 03:34:34 localhost systemd[1]: 2df8d20c6ba32f38bd0e6519c9c77a4146b632f21c81197d8c319b223aad81f1.service: Deactivated successfully. Dec 15 03:34:34 localhost systemd[1]: 4d35b7dc3f3a4e3c60661e85d3511f50804e87244d74cab67af5c6e8c447d379.service: Deactivated successfully. Dec 15 03:34:43 localhost sshd[86948]: main: sshd: ssh-rsa algorithm is disabled Dec 15 03:34:43 localhost systemd[1]: Started /usr/bin/podman healthcheck run 6278d648aec9c9f12e0efaafa0d6ae5653eeb34459516699e562eeb9c8d23b3c. 
Dec 15 03:34:43 localhost podman[86949]: 2025-12-15 08:34:43.779248434 +0000 UTC m=+0.111878523 container health_status 6278d648aec9c9f12e0efaafa0d6ae5653eeb34459516699e562eeb9c8d23b3c (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, io.openshift.expose-services=, batch=17.1_20251118.1, release=1761123044, vcs-type=git, config_id=tripleo_step1, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, distribution-scope=public, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, io.buildah.version=1.41.4, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '29d07da103bbd44a9ed3e29999314b03'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.component=openstack-qdrouterd-container, vendor=Red Hat, Inc., tcib_managed=true, 
cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, container_name=metrics_qdr, build-date=2025-11-18T22:49:46Z, managed_by=tripleo_ansible, name=rhosp17/openstack-qdrouterd, version=17.1.12, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 qdrouterd, summary=Red Hat OpenStack Platform 17.1 qdrouterd, url=https://www.redhat.com, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd) Dec 15 03:34:43 localhost podman[86949]: 2025-12-15 08:34:43.982582252 +0000 UTC m=+0.315212391 container exec_died 6278d648aec9c9f12e0efaafa0d6ae5653eeb34459516699e562eeb9c8d23b3c (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, managed_by=tripleo_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '29d07da103bbd44a9ed3e29999314b03'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', 
'/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, vendor=Red Hat, Inc., io.openshift.expose-services=, container_name=metrics_qdr, description=Red Hat OpenStack Platform 17.1 qdrouterd, architecture=x86_64, release=1761123044, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-11-18T22:49:46Z, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, url=https://www.redhat.com, batch=17.1_20251118.1, config_id=tripleo_step1, name=rhosp17/openstack-qdrouterd, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 qdrouterd, version=17.1.12, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, vcs-type=git, io.buildah.version=1.41.4, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-qdrouterd-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 17.1_20251118.1) Dec 15 03:34:43 localhost systemd[1]: 6278d648aec9c9f12e0efaafa0d6ae5653eeb34459516699e562eeb9c8d23b3c.service: Deactivated successfully. Dec 15 03:34:59 localhost systemd[1]: Started /usr/bin/podman healthcheck run 165a6fd1f2b629d16da0798ec1c9bac41d302d530a0ffa668b2315a52d6aa04e. Dec 15 03:34:59 localhost systemd[1]: Started /usr/bin/podman healthcheck run 2649c3aec9af2f04509e1d248b1050b202fccc3ed2b86421c89a2a65376db057. Dec 15 03:34:59 localhost systemd[1]: Started /usr/bin/podman healthcheck run 36a88083ed613f6871d2631ae462bca308a45ba583333f7cfe4d0b0c40a18ba5. Dec 15 03:34:59 localhost systemd[1]: Started /usr/bin/podman healthcheck run 97115ff7c5a84ae7e17f79e696e94da3c63f9fe0051287fe9193bec282a03f99. 
Dec 15 03:34:59 localhost systemd[1]: Started /usr/bin/podman healthcheck run ac45612fab357b615b4684dbfa5bd000dd29eb1adfbaedb6db0802fc49e89549. Dec 15 03:34:59 localhost systemd[1]: Started /usr/bin/podman healthcheck run d6041e5471624a495d5be42684f6049ccf71802a80d5760239330151d465d146. Dec 15 03:34:59 localhost systemd[83969]: Created slice User Background Tasks Slice. Dec 15 03:34:59 localhost systemd[83969]: Starting Cleanup of User's Temporary Files and Directories... Dec 15 03:34:59 localhost systemd[83969]: Finished Cleanup of User's Temporary Files and Directories. Dec 15 03:34:59 localhost systemd[1]: tmp-crun.7StYNs.mount: Deactivated successfully. Dec 15 03:34:59 localhost podman[86985]: 2025-12-15 08:34:59.828522197 +0000 UTC m=+0.143074267 container health_status ac45612fab357b615b4684dbfa5bd000dd29eb1adfbaedb6db0802fc49e89549 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, build-date=2025-11-18T22:49:32Z, com.redhat.component=openstack-cron-container, name=rhosp17/openstack-cron, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, managed_by=tripleo_ansible, version=17.1.12, vcs-type=git, description=Red Hat OpenStack Platform 17.1 cron, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, vendor=Red Hat, Inc., config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, io.buildah.version=1.41.4, architecture=x86_64, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, batch=17.1_20251118.1, maintainer=OpenStack TripleO Team, config_id=tripleo_step4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, summary=Red Hat OpenStack Platform 17.1 cron, container_name=logrotate_crond, tcib_managed=true, release=1761123044) Dec 15 03:34:59 localhost podman[86985]: 2025-12-15 08:34:59.861555751 +0000 UTC m=+0.176107781 container exec_died ac45612fab357b615b4684dbfa5bd000dd29eb1adfbaedb6db0802fc49e89549 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, release=1761123044, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, managed_by=tripleo_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 
'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, name=rhosp17/openstack-cron, vcs-type=git, description=Red Hat OpenStack Platform 17.1 cron, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, container_name=logrotate_crond, tcib_managed=true, vendor=Red Hat, Inc., build-date=2025-11-18T22:49:32Z, io.buildah.version=1.41.4, url=https://www.redhat.com, konflux.additional-tags=17.1.12 17.1_20251118.1, batch=17.1_20251118.1, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-cron-container, summary=Red Hat OpenStack Platform 17.1 cron, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, distribution-scope=public, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, architecture=x86_64, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, version=17.1.12, config_id=tripleo_step4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream) Dec 15 03:34:59 localhost podman[86979]: 2025-12-15 08:34:59.872123363 +0000 UTC m=+0.196107146 container health_status 165a6fd1f2b629d16da0798ec1c9bac41d302d530a0ffa668b2315a52d6aa04e (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, 
health_status=healthy, com.redhat.component=openstack-collectd-container, name=rhosp17/openstack-collectd, build-date=2025-11-18T22:51:28Z, vcs-type=git, container_name=collectd, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_id=tripleo_step3, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, konflux.additional-tags=17.1.12 17.1_20251118.1, release=1761123044, batch=17.1_20251118.1, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 collectd, distribution-scope=public, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, version=17.1.12, io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, architecture=x86_64, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', 
'/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, tcib_managed=true, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 collectd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, managed_by=tripleo_ansible) Dec 15 03:34:59 localhost systemd[1]: ac45612fab357b615b4684dbfa5bd000dd29eb1adfbaedb6db0802fc49e89549.service: Deactivated successfully. Dec 15 03:34:59 localhost podman[86994]: 2025-12-15 08:34:59.791063995 +0000 UTC m=+0.101490265 container health_status d6041e5471624a495d5be42684f6049ccf71802a80d5760239330151d465d146 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20251118.1, distribution-scope=public, config_id=tripleo_step4, com.redhat.component=openstack-ceilometer-compute-container, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, architecture=x86_64, io.openshift.expose-services=, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'fcee5a4a91f85471fca7b61211375646'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, build-date=2025-11-19T00:11:48Z, container_name=ceilometer_agent_compute, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, io.buildah.version=1.41.4, tcib_managed=true, version=17.1.12, url=https://www.redhat.com, vcs-type=git, name=rhosp17/openstack-ceilometer-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, managed_by=tripleo_ansible, konflux.additional-tags=17.1.12 17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, release=1761123044, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute) Dec 15 03:34:59 localhost podman[86994]: 2025-12-15 08:34:59.924370531 +0000 UTC m=+0.234797011 container exec_died d6041e5471624a495d5be42684f6049ccf71802a80d5760239330151d465d146 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, maintainer=OpenStack TripleO Team, release=1761123044, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'fcee5a4a91f85471fca7b61211375646'}, 'healthcheck': 
{'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, build-date=2025-11-19T00:11:48Z, container_name=ceilometer_agent_compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, architecture=x86_64, vendor=Red Hat, Inc., org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.buildah.version=1.41.4, name=rhosp17/openstack-ceilometer-compute, version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, config_id=tripleo_step4, url=https://www.redhat.com, managed_by=tripleo_ansible, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, 
com.redhat.component=openstack-ceilometer-compute-container, distribution-scope=public, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, vcs-type=git, konflux.additional-tags=17.1.12 17.1_20251118.1, batch=17.1_20251118.1) Dec 15 03:34:59 localhost podman[86982]: 2025-12-15 08:34:59.930837283 +0000 UTC m=+0.248928899 container health_status 97115ff7c5a84ae7e17f79e696e94da3c63f9fe0051287fe9193bec282a03f99 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, io.buildah.version=1.41.4, tcib_managed=true, com.redhat.component=openstack-ceilometer-ipmi-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.expose-services=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'fcee5a4a91f85471fca7b61211375646'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, batch=17.1_20251118.1, config_id=tripleo_step4, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 
ceilometer-ipmi, managed_by=tripleo_ansible, vcs-type=git, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, container_name=ceilometer_agent_ipmi, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, version=17.1.12, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, release=1761123044, build-date=2025-11-19T00:12:45Z, name=rhosp17/openstack-ceilometer-ipmi, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI) Dec 15 03:34:59 localhost systemd[1]: d6041e5471624a495d5be42684f6049ccf71802a80d5760239330151d465d146.service: Deactivated successfully. Dec 15 03:34:59 localhost podman[86980]: 2025-12-15 08:34:59.965265985 +0000 UTC m=+0.289631968 container health_status 2649c3aec9af2f04509e1d248b1050b202fccc3ed2b86421c89a2a65376db057 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, url=https://www.redhat.com, vendor=Red Hat, Inc., batch=17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, config_id=tripleo_step3, version=17.1.12, io.openshift.expose-services=, architecture=x86_64, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 iscsid, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, name=rhosp17/openstack-iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.component=openstack-iscsid-container, maintainer=OpenStack TripleO Team, distribution-scope=public, io.buildah.version=1.41.4, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, managed_by=tripleo_ansible, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, container_name=iscsid, konflux.additional-tags=17.1.12 17.1_20251118.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '182e509007ab5e6e5b2500a552cbd5ba'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, release=1761123044, build-date=2025-11-18T23:44:13Z) Dec 15 03:34:59 localhost podman[86982]: 2025-12-15 08:34:59.966251811 +0000 UTC m=+0.284343427 container exec_died 97115ff7c5a84ae7e17f79e696e94da3c63f9fe0051287fe9193bec282a03f99 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, version=17.1.12, 
com.redhat.component=openstack-ceilometer-ipmi-container, name=rhosp17/openstack-ceilometer-ipmi, managed_by=tripleo_ansible, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'fcee5a4a91f85471fca7b61211375646'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, distribution-scope=public, konflux.additional-tags=17.1.12 17.1_20251118.1, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, tcib_managed=true, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-11-19T00:12:45Z, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, release=1761123044, config_id=tripleo_step4, container_name=ceilometer_agent_ipmi, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, 
url=https://www.redhat.com, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.4, architecture=x86_64, batch=17.1_20251118.1, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi) Dec 15 03:34:59 localhost podman[86980]: 2025-12-15 08:34:59.974284096 +0000 UTC m=+0.298650039 container exec_died 2649c3aec9af2f04509e1d248b1050b202fccc3ed2b86421c89a2a65376db057 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, distribution-scope=public, com.redhat.component=openstack-iscsid-container, container_name=iscsid, release=1761123044, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, vcs-type=git, description=Red Hat OpenStack Platform 17.1 iscsid, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp17/openstack-iscsid, io.buildah.version=1.41.4, tcib_managed=true, config_id=tripleo_step3, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, version=17.1.12, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, managed_by=tripleo_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '182e509007ab5e6e5b2500a552cbd5ba'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, summary=Red Hat OpenStack Platform 17.1 iscsid, batch=17.1_20251118.1, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, vendor=Red Hat, Inc., architecture=x86_64, build-date=2025-11-18T23:44:13Z) Dec 15 03:34:59 localhost systemd[1]: 2649c3aec9af2f04509e1d248b1050b202fccc3ed2b86421c89a2a65376db057.service: Deactivated successfully. Dec 15 03:35:00 localhost podman[86979]: 2025-12-15 08:35:00.009635801 +0000 UTC m=+0.333619514 container exec_died 165a6fd1f2b629d16da0798ec1c9bac41d302d530a0ffa668b2315a52d6aa04e (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, description=Red Hat OpenStack Platform 17.1 collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, summary=Red Hat OpenStack Platform 17.1 collectd, batch=17.1_20251118.1, vcs-type=git, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, build-date=2025-11-18T22:51:28Z, tcib_managed=true, config_id=tripleo_step3, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, vendor=Red Hat, Inc., io.openshift.expose-services=, version=17.1.12, com.redhat.component=openstack-collectd-container, release=1761123044, config_data={'cap_add': ['IPC_LOCK'], 'environment': 
{'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, distribution-scope=public, url=https://www.redhat.com, io.buildah.version=1.41.4, container_name=collectd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, name=rhosp17/openstack-collectd, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, maintainer=OpenStack TripleO Team, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI) Dec 15 03:35:00 localhost systemd[1]: 165a6fd1f2b629d16da0798ec1c9bac41d302d530a0ffa668b2315a52d6aa04e.service: Deactivated successfully. 
Dec 15 03:35:00 localhost systemd[1]: 97115ff7c5a84ae7e17f79e696e94da3c63f9fe0051287fe9193bec282a03f99.service: Deactivated successfully. Dec 15 03:35:00 localhost podman[86981]: 2025-12-15 08:34:59.977786329 +0000 UTC m=+0.297132418 container health_status 36a88083ed613f6871d2631ae462bca308a45ba583333f7cfe4d0b0c40a18ba5 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, build-date=2025-11-19T00:36:58Z, config_id=tripleo_step5, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '182e509007ab5e6e5b2500a552cbd5ba-879500e96bf8dfb93687004bd86f2317'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', 
'/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, url=https://www.redhat.com, release=1761123044, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, version=17.1.12, konflux.additional-tags=17.1.12 17.1_20251118.1, vendor=Red Hat, Inc., batch=17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=nova_compute, distribution-scope=public, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.component=openstack-nova-compute-container, io.buildah.version=1.41.4, name=rhosp17/openstack-nova-compute, vcs-type=git, description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d) Dec 15 03:35:00 localhost podman[86981]: 2025-12-15 08:35:00.060334248 +0000 UTC m=+0.379680287 container exec_died 36a88083ed613f6871d2631ae462bca308a45ba583333f7cfe4d0b0c40a18ba5 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, release=1761123044, vendor=Red Hat, Inc., batch=17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp17/openstack-nova-compute, config_id=tripleo_step5, 
io.buildah.version=1.41.4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, description=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, vcs-type=git, managed_by=tripleo_ansible, container_name=nova_compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.component=openstack-nova-compute-container, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 nova-compute, url=https://www.redhat.com, distribution-scope=public, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, build-date=2025-11-19T00:36:58Z, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, architecture=x86_64, konflux.additional-tags=17.1.12 17.1_20251118.1, tcib_managed=true, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '182e509007ab5e6e5b2500a552cbd5ba-879500e96bf8dfb93687004bd86f2317'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', 
'/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, version=17.1.12) Dec 15 03:35:00 localhost systemd[1]: 36a88083ed613f6871d2631ae462bca308a45ba583333f7cfe4d0b0c40a18ba5.service: Deactivated successfully. Dec 15 03:35:02 localhost systemd[1]: Started /usr/bin/podman healthcheck run 4c3f871f4bdf287b1d3ce717956c1cf130617489217410eadf6fffcc440eb3b0. Dec 15 03:35:02 localhost podman[87116]: 2025-12-15 08:35:02.739420393 +0000 UTC m=+0.072111230 container health_status 4c3f871f4bdf287b1d3ce717956c1cf130617489217410eadf6fffcc440eb3b0 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, batch=17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., build-date=2025-11-19T00:36:58Z, tcib_managed=true, com.redhat.component=openstack-nova-compute-container, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.41.4, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, release=1761123044, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 
openstack-nova-compute, vcs-type=git, name=rhosp17/openstack-nova-compute, config_id=tripleo_step4, konflux.additional-tags=17.1.12 17.1_20251118.1, architecture=x86_64, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, version=17.1.12, description=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, distribution-scope=public, io.openshift.expose-services=, container_name=nova_migration_target, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '879500e96bf8dfb93687004bd86f2317'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}) Dec 15 03:35:03 localhost podman[87116]: 2025-12-15 08:35:03.111505805 +0000 UTC m=+0.444196682 container exec_died 4c3f871f4bdf287b1d3ce717956c1cf130617489217410eadf6fffcc440eb3b0 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, batch=17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.component=openstack-nova-compute-container, config_id=tripleo_step4, release=1761123044, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vendor=Red Hat, Inc., name=rhosp17/openstack-nova-compute, io.openshift.expose-services=, architecture=x86_64, distribution-scope=public, url=https://www.redhat.com, container_name=nova_migration_target, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, managed_by=tripleo_ansible, konflux.additional-tags=17.1.12 17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-19T00:36:58Z, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.4, vcs-type=git, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '879500e96bf8dfb93687004bd86f2317'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute) Dec 15 03:35:03 localhost systemd[1]: 4c3f871f4bdf287b1d3ce717956c1cf130617489217410eadf6fffcc440eb3b0.service: Deactivated successfully. Dec 15 03:35:05 localhost systemd[1]: Started /usr/bin/podman healthcheck run 2df8d20c6ba32f38bd0e6519c9c77a4146b632f21c81197d8c319b223aad81f1. Dec 15 03:35:05 localhost systemd[1]: Started /usr/bin/podman healthcheck run 4d35b7dc3f3a4e3c60661e85d3511f50804e87244d74cab67af5c6e8c447d379. Dec 15 03:35:05 localhost systemd[1]: tmp-crun.Nup6sw.mount: Deactivated successfully. Dec 15 03:35:05 localhost podman[87139]: 2025-12-15 08:35:05.766066215 +0000 UTC m=+0.086285979 container health_status 2df8d20c6ba32f38bd0e6519c9c77a4146b632f21c81197d8c319b223aad81f1 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, release=1761123044, container_name=ovn_controller, io.buildah.version=1.41.4, name=rhosp17/openstack-ovn-controller, version=17.1.12, build-date=2025-11-18T23:34:05Z, url=https://www.redhat.com, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, config_id=tripleo_step4, distribution-scope=public, architecture=x86_64, com.redhat.component=openstack-ovn-controller-container, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, tcib_managed=true, batch=17.1_20251118.1, config_data={'depends_on': 
['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team) Dec 15 03:35:05 localhost systemd[1]: tmp-crun.JskT9L.mount: Deactivated successfully. 
Dec 15 03:35:05 localhost podman[87139]: 2025-12-15 08:35:05.824615561 +0000 UTC m=+0.144835285 container exec_died 2df8d20c6ba32f38bd0e6519c9c77a4146b632f21c81197d8c319b223aad81f1 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, distribution-scope=public, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, batch=17.1_20251118.1, io.openshift.expose-services=, vcs-type=git, build-date=2025-11-18T23:34:05Z, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, konflux.additional-tags=17.1.12 17.1_20251118.1, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.component=openstack-ovn-controller-container, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 ovn-controller, name=rhosp17/openstack-ovn-controller, architecture=x86_64, io.buildah.version=1.41.4, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 ovn-controller, config_id=tripleo_step4, release=1761123044, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, 
vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, container_name=ovn_controller, version=17.1.12, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI) Dec 15 03:35:05 localhost systemd[1]: 2df8d20c6ba32f38bd0e6519c9c77a4146b632f21c81197d8c319b223aad81f1.service: Deactivated successfully. Dec 15 03:35:05 localhost ceph-osd[31375]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS ------- Dec 15 03:35:05 localhost ceph-osd[31375]: rocksdb: [db/db_impl/db_impl.cc:1111] #012** DB Stats **#012Uptime(secs): 3000.1 total, 600.0 interval#012Cumulative writes: 4815 writes, 21K keys, 4815 commit groups, 1.0 writes per commit group, ingest: 0.02 GB, 0.01 MB/s#012Cumulative WAL: 4815 writes, 628 syncs, 7.67 writes per sync, written: 0.02 GB, 0.01 MB/s#012Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent#012Interval writes: 442 writes, 1855 keys, 442 commit groups, 1.0 writes per commit group, ingest: 2.50 MB, 0.00 MB/s#012Interval WAL: 442 writes, 144 syncs, 3.07 writes per sync, written: 0.00 GB, 0.00 MB/s#012Interval stall: 00:00:0.000 H:M:S, 0.0 percent Dec 15 03:35:05 localhost podman[87140]: 2025-12-15 08:35:05.830118308 +0000 UTC m=+0.143096708 container health_status 4d35b7dc3f3a4e3c60661e85d3511f50804e87244d74cab67af5c6e8c447d379 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, version=17.1.12, url=https://www.redhat.com, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'a56a6f14b467cd9064e40c03defa5ed7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 
'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_id=tripleo_step4, release=1761123044, io.openshift.expose-services=, vendor=Red Hat, Inc., build-date=2025-11-19T00:14:25Z, tcib_managed=true, container_name=ovn_metadata_agent, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, batch=17.1_20251118.1, io.buildah.version=1.41.4, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, managed_by=tripleo_ansible, konflux.additional-tags=17.1.12 17.1_20251118.1, name=rhosp17/openstack-neutron-metadata-agent-ovn, 
vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn) Dec 15 03:35:05 localhost podman[87140]: 2025-12-15 08:35:05.909304836 +0000 UTC m=+0.222283286 container exec_died 4d35b7dc3f3a4e3c60661e85d3511f50804e87244d74cab67af5c6e8c447d379 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, build-date=2025-11-19T00:14:25Z, tcib_managed=true, name=rhosp17/openstack-neutron-metadata-agent-ovn, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, architecture=x86_64, container_name=ovn_metadata_agent, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_id=tripleo_step4, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, konflux.additional-tags=17.1.12 17.1_20251118.1, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, batch=17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, release=1761123044, version=17.1.12, distribution-scope=public, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, io.buildah.version=1.41.4, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'a56a6f14b467cd9064e40c03defa5ed7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 
'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn) Dec 15 03:35:05 localhost systemd[1]: 4d35b7dc3f3a4e3c60661e85d3511f50804e87244d74cab67af5c6e8c447d379.service: Deactivated successfully. 
Dec 15 03:35:10 localhost ceph-osd[32311]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS ------- Dec 15 03:35:10 localhost ceph-osd[32311]: rocksdb: [db/db_impl/db_impl.cc:1111] #012** DB Stats **#012Uptime(secs): 3000.2 total, 600.0 interval#012Cumulative writes: 5745 writes, 25K keys, 5745 commit groups, 1.0 writes per commit group, ingest: 0.02 GB, 0.01 MB/s#012Cumulative WAL: 5745 writes, 763 syncs, 7.53 writes per sync, written: 0.02 GB, 0.01 MB/s#012Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent#012Interval writes: 499 writes, 1913 keys, 499 commit groups, 1.0 writes per commit group, ingest: 2.26 MB, 0.00 MB/s#012Interval WAL: 499 writes, 190 syncs, 2.63 writes per sync, written: 0.00 GB, 0.00 MB/s#012Interval stall: 00:00:0.000 H:M:S, 0.0 percent Dec 15 03:35:14 localhost systemd[1]: Started /usr/bin/podman healthcheck run 6278d648aec9c9f12e0efaafa0d6ae5653eeb34459516699e562eeb9c8d23b3c. Dec 15 03:35:14 localhost systemd[1]: tmp-crun.fD712e.mount: Deactivated successfully. 
Dec 15 03:35:14 localhost podman[87186]: 2025-12-15 08:35:14.764777517 +0000 UTC m=+0.094076588 container health_status 6278d648aec9c9f12e0efaafa0d6ae5653eeb34459516699e562eeb9c8d23b3c (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, managed_by=tripleo_ansible, architecture=x86_64, distribution-scope=public, container_name=metrics_qdr, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, build-date=2025-11-18T22:49:46Z, com.redhat.component=openstack-qdrouterd-container, konflux.additional-tags=17.1.12 17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, version=17.1.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, summary=Red Hat OpenStack Platform 17.1 qdrouterd, name=rhosp17/openstack-qdrouterd, url=https://www.redhat.com, io.buildah.version=1.41.4, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '29d07da103bbd44a9ed3e29999314b03'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', 
'/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, batch=17.1_20251118.1, io.openshift.expose-services=, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, description=Red Hat OpenStack Platform 17.1 qdrouterd, tcib_managed=true, config_id=tripleo_step1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, vcs-type=git, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vendor=Red Hat, Inc., release=1761123044) Dec 15 03:35:14 localhost podman[87186]: 2025-12-15 08:35:14.977173917 +0000 UTC m=+0.306472918 container exec_died 6278d648aec9c9f12e0efaafa0d6ae5653eeb34459516699e562eeb9c8d23b3c (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vendor=Red Hat, Inc., config_id=tripleo_step1, io.buildah.version=1.41.4, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, version=17.1.12, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, release=1761123044, vcs-type=git, com.redhat.component=openstack-qdrouterd-container, batch=17.1_20251118.1, io.openshift.expose-services=, container_name=metrics_qdr, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, description=Red Hat OpenStack Platform 17.1 qdrouterd, summary=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, architecture=x86_64, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, managed_by=tripleo_ansible, tcib_managed=true, build-date=2025-11-18T22:49:46Z, 
vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '29d07da103bbd44a9ed3e29999314b03'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, konflux.additional-tags=17.1.12 17.1_20251118.1, name=rhosp17/openstack-qdrouterd) Dec 15 03:35:14 localhost systemd[1]: 6278d648aec9c9f12e0efaafa0d6ae5653eeb34459516699e562eeb9c8d23b3c.service: Deactivated successfully. Dec 15 03:35:30 localhost systemd[1]: Started /usr/bin/podman healthcheck run 165a6fd1f2b629d16da0798ec1c9bac41d302d530a0ffa668b2315a52d6aa04e. Dec 15 03:35:30 localhost systemd[1]: Started /usr/bin/podman healthcheck run 2649c3aec9af2f04509e1d248b1050b202fccc3ed2b86421c89a2a65376db057. Dec 15 03:35:30 localhost systemd[1]: Started /usr/bin/podman healthcheck run 36a88083ed613f6871d2631ae462bca308a45ba583333f7cfe4d0b0c40a18ba5. Dec 15 03:35:30 localhost systemd[1]: Started /usr/bin/podman healthcheck run 97115ff7c5a84ae7e17f79e696e94da3c63f9fe0051287fe9193bec282a03f99. 
Dec 15 03:35:30 localhost systemd[1]: Started /usr/bin/podman healthcheck run ac45612fab357b615b4684dbfa5bd000dd29eb1adfbaedb6db0802fc49e89549. Dec 15 03:35:30 localhost systemd[1]: Started /usr/bin/podman healthcheck run d6041e5471624a495d5be42684f6049ccf71802a80d5760239330151d465d146. Dec 15 03:35:30 localhost systemd[1]: tmp-crun.sa4AxD.mount: Deactivated successfully. Dec 15 03:35:30 localhost podman[87339]: 2025-12-15 08:35:30.809631195 +0000 UTC m=+0.115003987 container health_status 97115ff7c5a84ae7e17f79e696e94da3c63f9fe0051287fe9193bec282a03f99 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, architecture=x86_64, config_id=tripleo_step4, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, release=1761123044, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, container_name=ceilometer_agent_ipmi, konflux.additional-tags=17.1.12 17.1_20251118.1, io.buildah.version=1.41.4, managed_by=tripleo_ansible, build-date=2025-11-19T00:12:45Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, version=17.1.12, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, vcs-type=git, com.redhat.component=openstack-ceilometer-ipmi-container, batch=17.1_20251118.1, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 
'fcee5a4a91f85471fca7b61211375646'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, url=https://www.redhat.com, name=rhosp17/openstack-ceilometer-ipmi) Dec 15 03:35:30 localhost podman[87339]: 2025-12-15 08:35:30.827456451 +0000 UTC m=+0.132829233 container exec_died 97115ff7c5a84ae7e17f79e696e94da3c63f9fe0051287fe9193bec282a03f99 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, config_id=tripleo_step4, distribution-scope=public, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, vcs-type=git, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'fcee5a4a91f85471fca7b61211375646'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, release=1761123044, com.redhat.component=openstack-ceilometer-ipmi-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, tcib_managed=true, batch=17.1_20251118.1, container_name=ceilometer_agent_ipmi, name=rhosp17/openstack-ceilometer-ipmi, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, konflux.additional-tags=17.1.12 17.1_20251118.1, version=17.1.12, build-date=2025-11-19T00:12:45Z, architecture=x86_64, managed_by=tripleo_ansible, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.buildah.version=1.41.4, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi) Dec 15 03:35:30 localhost systemd[1]: 97115ff7c5a84ae7e17f79e696e94da3c63f9fe0051287fe9193bec282a03f99.service: Deactivated successfully. 
Dec 15 03:35:30 localhost podman[87337]: 2025-12-15 08:35:30.869143676 +0000 UTC m=+0.186024776 container health_status 2649c3aec9af2f04509e1d248b1050b202fccc3ed2b86421c89a2a65376db057 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, summary=Red Hat OpenStack Platform 17.1 iscsid, description=Red Hat OpenStack Platform 17.1 iscsid, managed_by=tripleo_ansible, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 17.1_20251118.1, name=rhosp17/openstack-iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '182e509007ab5e6e5b2500a552cbd5ba'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, 
distribution-scope=public, architecture=x86_64, vendor=Red Hat, Inc., com.redhat.component=openstack-iscsid-container, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, version=17.1.12, config_id=tripleo_step3, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, container_name=iscsid, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, tcib_managed=true, build-date=2025-11-18T23:44:13Z, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, release=1761123044, vcs-type=git) Dec 15 03:35:30 localhost podman[87337]: 2025-12-15 08:35:30.876539154 +0000 UTC m=+0.193420274 container exec_died 2649c3aec9af2f04509e1d248b1050b202fccc3ed2b86421c89a2a65376db057 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, io.buildah.version=1.41.4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, name=rhosp17/openstack-iscsid, build-date=2025-11-18T23:44:13Z, description=Red Hat OpenStack Platform 17.1 iscsid, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, release=1761123044, tcib_managed=true, container_name=iscsid, io.openshift.expose-services=, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, managed_by=tripleo_ansible, vcs-type=git, version=17.1.12, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.component=openstack-iscsid-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '182e509007ab5e6e5b2500a552cbd5ba'}, 'healthcheck': {'test': 
'/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, vendor=Red Hat, Inc., batch=17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step3, url=https://www.redhat.com, architecture=x86_64) Dec 15 03:35:30 localhost podman[87353]: 2025-12-15 08:35:30.882687659 +0000 UTC m=+0.181784393 container health_status d6041e5471624a495d5be42684f6049ccf71802a80d5760239330151d465d146 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, tcib_managed=true, name=rhosp17/openstack-ceilometer-compute, com.redhat.component=openstack-ceilometer-compute-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-type=git, distribution-scope=public, 
io.openshift.expose-services=, build-date=2025-11-19T00:11:48Z, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.buildah.version=1.41.4, version=17.1.12, konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'fcee5a4a91f85471fca7b61211375646'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, architecture=x86_64, managed_by=tripleo_ansible, vendor=Red Hat, Inc., container_name=ceilometer_agent_compute, batch=17.1_20251118.1, release=1761123044, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05) Dec 15 03:35:30 localhost systemd[1]: 
2649c3aec9af2f04509e1d248b1050b202fccc3ed2b86421c89a2a65376db057.service: Deactivated successfully. Dec 15 03:35:30 localhost podman[87336]: 2025-12-15 08:35:30.789289221 +0000 UTC m=+0.108395810 container health_status 165a6fd1f2b629d16da0798ec1c9bac41d302d530a0ffa668b2315a52d6aa04e (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, vendor=Red Hat, Inc., distribution-scope=public, batch=17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, managed_by=tripleo_ansible, build-date=2025-11-18T22:51:28Z, version=17.1.12, architecture=x86_64, vcs-type=git, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, release=1761123044, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, container_name=collectd, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-collectd-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, konflux.additional-tags=17.1.12 17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 collectd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_id=tripleo_step3, name=rhosp17/openstack-collectd, url=https://www.redhat.com, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 collectd, io.buildah.version=1.41.4) Dec 15 03:35:30 localhost podman[87353]: 2025-12-15 08:35:30.909232009 +0000 UTC m=+0.208328693 container exec_died d6041e5471624a495d5be42684f6049ccf71802a80d5760239330151d465d146 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, container_name=ceilometer_agent_compute, konflux.additional-tags=17.1.12 17.1_20251118.1, tcib_managed=true, release=1761123044, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, name=rhosp17/openstack-ceilometer-compute, build-date=2025-11-19T00:11:48Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, distribution-scope=public, managed_by=tripleo_ansible, config_id=tripleo_step4, io.buildah.version=1.41.4, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': 
{'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'fcee5a4a91f85471fca7b61211375646'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, com.redhat.component=openstack-ceilometer-compute-container, version=17.1.12, maintainer=OpenStack TripleO Team, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, description=Red Hat OpenStack Platform 17.1 ceilometer-compute) Dec 15 03:35:30 localhost podman[87336]: 2025-12-15 08:35:30.918117307 +0000 UTC m=+0.237223986 container exec_died 165a6fd1f2b629d16da0798ec1c9bac41d302d530a0ffa668b2315a52d6aa04e (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, container_name=collectd, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 
collectd, vcs-type=git, io.buildah.version=1.41.4, batch=17.1_20251118.1, build-date=2025-11-18T22:51:28Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, name=rhosp17/openstack-collectd, version=17.1.12, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, release=1761123044, description=Red Hat OpenStack Platform 17.1 collectd, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, url=https://www.redhat.com, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, konflux.additional-tags=17.1.12 17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_id=tripleo_step3, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, architecture=x86_64, tcib_managed=true, com.redhat.component=openstack-collectd-container, managed_by=tripleo_ansible, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vendor=Red Hat, Inc.) Dec 15 03:35:30 localhost systemd[1]: d6041e5471624a495d5be42684f6049ccf71802a80d5760239330151d465d146.service: Deactivated successfully. Dec 15 03:35:30 localhost systemd[1]: 165a6fd1f2b629d16da0798ec1c9bac41d302d530a0ffa668b2315a52d6aa04e.service: Deactivated successfully. Dec 15 03:35:30 localhost podman[87351]: 2025-12-15 08:35:30.957448939 +0000 UTC m=+0.263488079 container health_status ac45612fab357b615b4684dbfa5bd000dd29eb1adfbaedb6db0802fc49e89549 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, summary=Red Hat OpenStack Platform 17.1 cron, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, distribution-scope=public, io.openshift.expose-services=, tcib_managed=true, name=rhosp17/openstack-cron, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, build-date=2025-11-18T22:49:32Z, description=Red Hat OpenStack Platform 17.1 cron, io.buildah.version=1.41.4, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, managed_by=tripleo_ansible, config_id=tripleo_step4, version=17.1.12, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': 
True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, container_name=logrotate_crond, maintainer=OpenStack TripleO Team, architecture=x86_64, com.redhat.component=openstack-cron-container, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, batch=17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, release=1761123044, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI) Dec 15 03:35:30 localhost podman[87338]: 2025-12-15 08:35:30.984744948 +0000 UTC m=+0.290522661 container health_status 36a88083ed613f6871d2631ae462bca308a45ba583333f7cfe4d0b0c40a18ba5 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, batch=17.1_20251118.1, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vendor=Red Hat, Inc., org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, url=https://www.redhat.com, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_id=tripleo_step5, container_name=nova_compute, architecture=x86_64, version=17.1.12, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '182e509007ab5e6e5b2500a552cbd5ba-879500e96bf8dfb93687004bd86f2317'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, maintainer=OpenStack TripleO Team, 
konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, release=1761123044, build-date=2025-11-19T00:36:58Z, name=rhosp17/openstack-nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, vcs-type=git, distribution-scope=public, io.buildah.version=1.41.4, summary=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.component=openstack-nova-compute-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, io.openshift.expose-services=) Dec 15 03:35:30 localhost podman[87351]: 2025-12-15 08:35:30.988683494 +0000 UTC m=+0.294722664 container exec_died ac45612fab357b615b4684dbfa5bd000dd29eb1adfbaedb6db0802fc49e89549 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 cron, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.component=openstack-cron-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, architecture=x86_64, io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, release=1761123044, build-date=2025-11-18T22:49:32Z, url=https://www.redhat.com, container_name=logrotate_crond, batch=17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vendor=Red Hat, Inc., vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, maintainer=OpenStack TripleO Team, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 cron, konflux.additional-tags=17.1.12 17.1_20251118.1, config_id=tripleo_step4, vcs-type=git, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': 
'/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, io.openshift.expose-services=, name=rhosp17/openstack-cron, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, version=17.1.12, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI) Dec 15 03:35:31 localhost systemd[1]: ac45612fab357b615b4684dbfa5bd000dd29eb1adfbaedb6db0802fc49e89549.service: Deactivated successfully. 
Dec 15 03:35:31 localhost podman[87338]: 2025-12-15 08:35:31.01023169 +0000 UTC m=+0.316009403 container exec_died 36a88083ed613f6871d2631ae462bca308a45ba583333f7cfe4d0b0c40a18ba5 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2025-11-19T00:36:58Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 nova-compute, distribution-scope=public, architecture=x86_64, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, batch=17.1_20251118.1, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '182e509007ab5e6e5b2500a552cbd5ba-879500e96bf8dfb93687004bd86f2317'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', 
'/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, maintainer=OpenStack TripleO Team, container_name=nova_compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, managed_by=tripleo_ansible, vendor=Red Hat, Inc., config_id=tripleo_step5, io.buildah.version=1.41.4, konflux.additional-tags=17.1.12 17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp17/openstack-nova-compute, url=https://www.redhat.com, io.openshift.expose-services=, com.redhat.component=openstack-nova-compute-container, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, version=17.1.12, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, release=1761123044) Dec 15 03:35:31 localhost systemd[1]: 36a88083ed613f6871d2631ae462bca308a45ba583333f7cfe4d0b0c40a18ba5.service: Deactivated successfully. Dec 15 03:35:33 localhost systemd[1]: Started /usr/bin/podman healthcheck run 4c3f871f4bdf287b1d3ce717956c1cf130617489217410eadf6fffcc440eb3b0. 
Dec 15 03:35:33 localhost podman[87467]: 2025-12-15 08:35:33.761203977 +0000 UTC m=+0.089368131 container health_status 4c3f871f4bdf287b1d3ce717956c1cf130617489217410eadf6fffcc440eb3b0 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.expose-services=, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, name=rhosp17/openstack-nova-compute, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '879500e96bf8dfb93687004bd86f2317'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, version=17.1.12, summary=Red Hat OpenStack Platform 17.1 nova-compute, config_id=tripleo_step4, batch=17.1_20251118.1, tcib_managed=true, io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, architecture=x86_64, build-date=2025-11-19T00:36:58Z, release=1761123044, distribution-scope=public, com.redhat.component=openstack-nova-compute-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, container_name=nova_migration_target, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 nova-compute) Dec 15 03:35:34 localhost podman[87467]: 2025-12-15 08:35:34.132853997 +0000 UTC m=+0.461018111 container exec_died 4c3f871f4bdf287b1d3ce717956c1cf130617489217410eadf6fffcc440eb3b0 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, managed_by=tripleo_ansible, io.openshift.expose-services=, url=https://www.redhat.com, config_id=tripleo_step4, name=rhosp17/openstack-nova-compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, konflux.additional-tags=17.1.12 17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.component=openstack-nova-compute-container, vcs-type=git, batch=17.1_20251118.1, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '879500e96bf8dfb93687004bd86f2317'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 
'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, container_name=nova_migration_target, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, distribution-scope=public, build-date=2025-11-19T00:36:58Z, tcib_managed=true, release=1761123044, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.12, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc.) Dec 15 03:35:34 localhost systemd[1]: 4c3f871f4bdf287b1d3ce717956c1cf130617489217410eadf6fffcc440eb3b0.service: Deactivated successfully. Dec 15 03:35:36 localhost systemd[1]: Started /usr/bin/podman healthcheck run 2df8d20c6ba32f38bd0e6519c9c77a4146b632f21c81197d8c319b223aad81f1. Dec 15 03:35:36 localhost systemd[1]: Started /usr/bin/podman healthcheck run 4d35b7dc3f3a4e3c60661e85d3511f50804e87244d74cab67af5c6e8c447d379. 
Dec 15 03:35:36 localhost podman[87490]: 2025-12-15 08:35:36.755665318 +0000 UTC m=+0.059433002 container health_status 2df8d20c6ba32f38bd0e6519c9c77a4146b632f21c81197d8c319b223aad81f1 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, architecture=x86_64, url=https://www.redhat.com, config_id=tripleo_step4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, build-date=2025-11-18T23:34:05Z, vcs-type=git, version=17.1.12, release=1761123044, summary=Red Hat OpenStack Platform 17.1 ovn-controller, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 17.1_20251118.1, batch=17.1_20251118.1, vendor=Red Hat, Inc., name=rhosp17/openstack-ovn-controller, distribution-scope=public, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=ovn_controller, io.buildah.version=1.41.4, com.redhat.component=openstack-ovn-controller-container, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, tcib_managed=true, maintainer=OpenStack TripleO Team, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', 
'/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, description=Red Hat OpenStack Platform 17.1 ovn-controller, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272) Dec 15 03:35:36 localhost podman[87490]: 2025-12-15 08:35:36.78233577 +0000 UTC m=+0.086103524 container exec_died 2df8d20c6ba32f38bd0e6519c9c77a4146b632f21c81197d8c319b223aad81f1 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, config_id=tripleo_step4, url=https://www.redhat.com, vendor=Red Hat, Inc., com.redhat.component=openstack-ovn-controller-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, summary=Red Hat OpenStack Platform 17.1 ovn-controller, name=rhosp17/openstack-ovn-controller, batch=17.1_20251118.1, io.openshift.expose-services=, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 ovn-controller, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, build-date=2025-11-18T23:34:05Z, managed_by=tripleo_ansible, release=1761123044, io.buildah.version=1.41.4, tcib_managed=true, architecture=x86_64, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', 
'/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, container_name=ovn_controller, konflux.additional-tags=17.1.12 17.1_20251118.1, version=17.1.12, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272) Dec 15 03:35:36 localhost systemd[1]: 2df8d20c6ba32f38bd0e6519c9c77a4146b632f21c81197d8c319b223aad81f1.service: Deactivated successfully. Dec 15 03:35:36 localhost systemd[1]: tmp-crun.mH1zQF.mount: Deactivated successfully. Dec 15 03:35:36 localhost podman[87491]: 2025-12-15 08:35:36.884785121 +0000 UTC m=+0.179278026 container health_status 4d35b7dc3f3a4e3c60661e85d3511f50804e87244d74cab67af5c6e8c447d379 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, name=rhosp17/openstack-neutron-metadata-agent-ovn, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, build-date=2025-11-19T00:14:25Z, vendor=Red Hat, Inc., architecture=x86_64, config_id=tripleo_step4, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, container_name=ovn_metadata_agent, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1761123044, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'a56a6f14b467cd9064e40c03defa5ed7'}, 
'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, io.openshift.expose-services=, io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, version=17.1.12, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, konflux.additional-tags=17.1.12 17.1_20251118.1, tcib_managed=true, distribution-scope=public, batch=17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream) Dec 15 03:35:36 localhost podman[87491]: 2025-12-15 08:35:36.933489583 +0000 UTC m=+0.227982488 container exec_died 4d35b7dc3f3a4e3c60661e85d3511f50804e87244d74cab67af5c6e8c447d379 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, 
url=https://www.redhat.com, tcib_managed=true, distribution-scope=public, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, maintainer=OpenStack TripleO Team, architecture=x86_64, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, managed_by=tripleo_ansible, release=1761123044, build-date=2025-11-19T00:14:25Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, batch=17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_id=tripleo_step4, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, name=rhosp17/openstack-neutron-metadata-agent-ovn, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'a56a6f14b467cd9064e40c03defa5ed7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, container_name=ovn_metadata_agent, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.expose-services=, io.buildah.version=1.41.4, konflux.additional-tags=17.1.12 17.1_20251118.1, version=17.1.12) Dec 15 03:35:36 localhost systemd[1]: 4d35b7dc3f3a4e3c60661e85d3511f50804e87244d74cab67af5c6e8c447d379.service: Deactivated successfully. Dec 15 03:35:45 localhost systemd[1]: Started /usr/bin/podman healthcheck run 6278d648aec9c9f12e0efaafa0d6ae5653eeb34459516699e562eeb9c8d23b3c. 
Dec 15 03:35:45 localhost podman[87538]: 2025-12-15 08:35:45.756467962 +0000 UTC m=+0.088853597 container health_status 6278d648aec9c9f12e0efaafa0d6ae5653eeb34459516699e562eeb9c8d23b3c (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, name=rhosp17/openstack-qdrouterd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, vcs-type=git, batch=17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.component=openstack-qdrouterd-container, managed_by=tripleo_ansible, version=17.1.12, config_id=tripleo_step1, release=1761123044, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '29d07da103bbd44a9ed3e29999314b03'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, architecture=x86_64, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 qdrouterd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, url=https://www.redhat.com, build-date=2025-11-18T22:49:46Z, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, tcib_managed=true, io.buildah.version=1.41.4, distribution-scope=public, container_name=metrics_qdr, io.openshift.expose-services=, maintainer=OpenStack TripleO Team) Dec 15 03:35:45 localhost podman[87538]: 2025-12-15 08:35:45.946565547 +0000 UTC m=+0.278951202 container exec_died 6278d648aec9c9f12e0efaafa0d6ae5653eeb34459516699e562eeb9c8d23b3c (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, release=1761123044, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, batch=17.1_20251118.1, build-date=2025-11-18T22:49:46Z, container_name=metrics_qdr, version=17.1.12, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vendor=Red Hat, Inc., konflux.additional-tags=17.1.12 17.1_20251118.1, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '29d07da103bbd44a9ed3e29999314b03'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', 
'/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, name=rhosp17/openstack-qdrouterd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, summary=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.component=openstack-qdrouterd-container, description=Red Hat OpenStack Platform 17.1 qdrouterd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, maintainer=OpenStack TripleO Team, architecture=x86_64, io.buildah.version=1.41.4, config_id=tripleo_step1, url=https://www.redhat.com, vcs-type=git, distribution-scope=public, managed_by=tripleo_ansible, io.openshift.expose-services=) Dec 15 03:35:45 localhost systemd[1]: 6278d648aec9c9f12e0efaafa0d6ae5653eeb34459516699e562eeb9c8d23b3c.service: Deactivated successfully. Dec 15 03:35:58 localhost systemd[1]: Starting Check and recover tripleo_nova_virtqemud... Dec 15 03:35:58 localhost recover_tripleo_nova_virtqemud[87569]: 61849 Dec 15 03:35:58 localhost systemd[1]: tripleo_nova_virtqemud_recover.service: Deactivated successfully. Dec 15 03:35:58 localhost systemd[1]: Finished Check and recover tripleo_nova_virtqemud. Dec 15 03:36:01 localhost systemd[1]: Started /usr/bin/podman healthcheck run 165a6fd1f2b629d16da0798ec1c9bac41d302d530a0ffa668b2315a52d6aa04e. 
Dec 15 03:36:01 localhost systemd[1]: Started /usr/bin/podman healthcheck run 2649c3aec9af2f04509e1d248b1050b202fccc3ed2b86421c89a2a65376db057. Dec 15 03:36:01 localhost systemd[1]: Started /usr/bin/podman healthcheck run 36a88083ed613f6871d2631ae462bca308a45ba583333f7cfe4d0b0c40a18ba5. Dec 15 03:36:01 localhost systemd[1]: Started /usr/bin/podman healthcheck run 97115ff7c5a84ae7e17f79e696e94da3c63f9fe0051287fe9193bec282a03f99. Dec 15 03:36:01 localhost systemd[1]: Started /usr/bin/podman healthcheck run ac45612fab357b615b4684dbfa5bd000dd29eb1adfbaedb6db0802fc49e89549. Dec 15 03:36:01 localhost systemd[1]: Started /usr/bin/podman healthcheck run d6041e5471624a495d5be42684f6049ccf71802a80d5760239330151d465d146. Dec 15 03:36:01 localhost systemd[1]: tmp-crun.zLHZoD.mount: Deactivated successfully. Dec 15 03:36:01 localhost podman[87572]: 2025-12-15 08:36:01.782328105 +0000 UTC m=+0.103258803 container health_status 36a88083ed613f6871d2631ae462bca308a45ba583333f7cfe4d0b0c40a18ba5 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, vcs-type=git, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, distribution-scope=public, managed_by=tripleo_ansible, architecture=x86_64, tcib_managed=true, build-date=2025-11-19T00:36:58Z, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.12, name=rhosp17/openstack-nova-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '182e509007ab5e6e5b2500a552cbd5ba-879500e96bf8dfb93687004bd86f2317'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 
'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, batch=17.1_20251118.1, config_id=tripleo_step5, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, container_name=nova_compute, io.buildah.version=1.41.4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, com.redhat.component=openstack-nova-compute-container, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, release=1761123044, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, 
org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d) Dec 15 03:36:01 localhost systemd[1]: tmp-crun.BIiih8.mount: Deactivated successfully. Dec 15 03:36:01 localhost podman[87573]: 2025-12-15 08:36:01.834203673 +0000 UTC m=+0.149106250 container health_status 97115ff7c5a84ae7e17f79e696e94da3c63f9fe0051287fe9193bec282a03f99 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, com.redhat.component=openstack-ceilometer-ipmi-container, io.buildah.version=1.41.4, url=https://www.redhat.com, container_name=ceilometer_agent_ipmi, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, distribution-scope=public, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, build-date=2025-11-19T00:12:45Z, architecture=x86_64, release=1761123044, batch=17.1_20251118.1, managed_by=tripleo_ansible, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, name=rhosp17/openstack-ceilometer-ipmi, vcs-type=git, version=17.1.12, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'fcee5a4a91f85471fca7b61211375646'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, config_id=tripleo_step4, vendor=Red Hat, Inc., org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.expose-services=) Dec 15 03:36:01 localhost podman[87572]: 2025-12-15 08:36:01.858940215 +0000 UTC m=+0.179870963 container exec_died 36a88083ed613f6871d2631ae462bca308a45ba583333f7cfe4d0b0c40a18ba5 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, build-date=2025-11-19T00:36:58Z, description=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, maintainer=OpenStack TripleO Team, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '182e509007ab5e6e5b2500a552cbd5ba-879500e96bf8dfb93687004bd86f2317'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 
'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, release=1761123044, version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, distribution-scope=public, name=rhosp17/openstack-nova-compute, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step5, com.redhat.component=openstack-nova-compute-container, architecture=x86_64, vendor=Red Hat, Inc., io.buildah.version=1.41.4, summary=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, container_name=nova_compute, 
batch=17.1_20251118.1, url=https://www.redhat.com) Dec 15 03:36:01 localhost systemd[1]: 36a88083ed613f6871d2631ae462bca308a45ba583333f7cfe4d0b0c40a18ba5.service: Deactivated successfully. Dec 15 03:36:01 localhost podman[87573]: 2025-12-15 08:36:01.891404172 +0000 UTC m=+0.206306769 container exec_died 97115ff7c5a84ae7e17f79e696e94da3c63f9fe0051287fe9193bec282a03f99 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, managed_by=tripleo_ansible, batch=17.1_20251118.1, architecture=x86_64, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-ceilometer-ipmi-container, release=1761123044, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, version=17.1.12, container_name=ceilometer_agent_ipmi, build-date=2025-11-19T00:12:45Z, distribution-scope=public, name=rhosp17/openstack-ceilometer-ipmi, vendor=Red Hat, Inc., vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, vcs-type=git, config_id=tripleo_step4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'fcee5a4a91f85471fca7b61211375646'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, url=https://www.redhat.com, tcib_managed=true, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.expose-services=) Dec 15 03:36:01 localhost systemd[1]: 97115ff7c5a84ae7e17f79e696e94da3c63f9fe0051287fe9193bec282a03f99.service: Deactivated successfully. Dec 15 03:36:01 localhost podman[87571]: 2025-12-15 08:36:01.930085997 +0000 UTC m=+0.253455920 container health_status 2649c3aec9af2f04509e1d248b1050b202fccc3ed2b86421c89a2a65376db057 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, tcib_managed=true, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, vcs-type=git, vendor=Red Hat, Inc., name=rhosp17/openstack-iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '182e509007ab5e6e5b2500a552cbd5ba'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, konflux.additional-tags=17.1.12 17.1_20251118.1, release=1761123044, build-date=2025-11-18T23:44:13Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, url=https://www.redhat.com, com.redhat.component=openstack-iscsid-container, version=17.1.12, io.buildah.version=1.41.4, maintainer=OpenStack TripleO Team, batch=17.1_20251118.1, container_name=iscsid, description=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.expose-services=, managed_by=tripleo_ansible, architecture=x86_64, config_id=tripleo_step3, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05) Dec 15 03:36:01 localhost podman[87571]: 2025-12-15 08:36:01.945215571 +0000 UTC m=+0.268585504 container exec_died 2649c3aec9af2f04509e1d248b1050b202fccc3ed2b86421c89a2a65376db057 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, description=Red Hat OpenStack Platform 17.1 iscsid, config_id=tripleo_step3, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vendor=Red Hat, Inc., org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, vcs-type=git, 
name=rhosp17/openstack-iscsid, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.component=openstack-iscsid-container, summary=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.expose-services=, architecture=x86_64, batch=17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, container_name=iscsid, io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, maintainer=OpenStack TripleO Team, release=1761123044, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, build-date=2025-11-18T23:44:13Z, managed_by=tripleo_ansible, distribution-scope=public, konflux.additional-tags=17.1.12 17.1_20251118.1, version=17.1.12, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '182e509007ab5e6e5b2500a552cbd5ba'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com) Dec 15 03:36:01 localhost systemd[1]: 2649c3aec9af2f04509e1d248b1050b202fccc3ed2b86421c89a2a65376db057.service: Deactivated successfully. Dec 15 03:36:01 localhost podman[87570]: 2025-12-15 08:36:01.986241659 +0000 UTC m=+0.311478002 container health_status 165a6fd1f2b629d16da0798ec1c9bac41d302d530a0ffa668b2315a52d6aa04e (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, batch=17.1_20251118.1, io.buildah.version=1.41.4, description=Red Hat OpenStack Platform 17.1 collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, release=1761123044, com.redhat.component=openstack-collectd-container, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, build-date=2025-11-18T22:51:28Z, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, konflux.additional-tags=17.1.12 17.1_20251118.1, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 collectd, container_name=collectd, config_id=tripleo_step3, io.openshift.expose-services=, version=17.1.12, name=rhosp17/openstack-collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, tcib_managed=true, architecture=x86_64, distribution-scope=public, vcs-type=git) Dec 15 03:36:01 localhost podman[87580]: 2025-12-15 08:36:01.994187812 +0000 UTC m=+0.304602788 container health_status ac45612fab357b615b4684dbfa5bd000dd29eb1adfbaedb6db0802fc49e89549 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, container_name=logrotate_crond, distribution-scope=public, tcib_managed=true, konflux.additional-tags=17.1.12 17.1_20251118.1, config_id=tripleo_step4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, build-date=2025-11-18T22:49:32Z, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': 
'/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, description=Red Hat OpenStack Platform 17.1 cron, summary=Red Hat OpenStack Platform 17.1 cron, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, version=17.1.12, vcs-type=git, com.redhat.component=openstack-cron-container, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, release=1761123044, vendor=Red Hat, Inc., managed_by=tripleo_ansible, name=rhosp17/openstack-cron, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, io.openshift.expose-services=, batch=17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05) Dec 15 03:36:01 localhost podman[87589]: 2025-12-15 08:36:01.946114586 +0000 UTC m=+0.250817780 container health_status d6041e5471624a495d5be42684f6049ccf71802a80d5760239330151d465d146 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, release=1761123044, 
io.openshift.expose-services=, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, name=rhosp17/openstack-ceilometer-compute, config_id=tripleo_step4, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, batch=17.1_20251118.1, container_name=ceilometer_agent_compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, architecture=x86_64, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, version=17.1.12, url=https://www.redhat.com, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'fcee5a4a91f85471fca7b61211375646'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, 
konflux.additional-tags=17.1.12 17.1_20251118.1, build-date=2025-11-19T00:11:48Z, vendor=Red Hat, Inc., com.redhat.component=openstack-ceilometer-compute-container, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-type=git, io.buildah.version=1.41.4) Dec 15 03:36:02 localhost podman[87580]: 2025-12-15 08:36:02.030475622 +0000 UTC m=+0.340890598 container exec_died ac45612fab357b615b4684dbfa5bd000dd29eb1adfbaedb6db0802fc49e89549 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, batch=17.1_20251118.1, config_id=tripleo_step4, url=https://www.redhat.com, build-date=2025-11-18T22:49:32Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, managed_by=tripleo_ansible, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, container_name=logrotate_crond, vcs-type=git, konflux.additional-tags=17.1.12 17.1_20251118.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, io.buildah.version=1.41.4, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, description=Red Hat OpenStack Platform 17.1 cron, summary=Red Hat OpenStack Platform 17.1 cron, distribution-scope=public, io.openshift.expose-services=, release=1761123044, architecture=x86_64, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-cron-container, version=17.1.12, name=rhosp17/openstack-cron) Dec 15 03:36:02 localhost systemd[1]: ac45612fab357b615b4684dbfa5bd000dd29eb1adfbaedb6db0802fc49e89549.service: Deactivated successfully. 
Dec 15 03:36:02 localhost podman[87570]: 2025-12-15 08:36:02.046257354 +0000 UTC m=+0.371493757 container exec_died 165a6fd1f2b629d16da0798ec1c9bac41d302d530a0ffa668b2315a52d6aa04e (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, managed_by=tripleo_ansible, url=https://www.redhat.com, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, 
cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, name=rhosp17/openstack-collectd, release=1761123044, com.redhat.component=openstack-collectd-container, description=Red Hat OpenStack Platform 17.1 collectd, build-date=2025-11-18T22:51:28Z, maintainer=OpenStack TripleO Team, config_id=tripleo_step3, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, summary=Red Hat OpenStack Platform 17.1 collectd, vendor=Red Hat, Inc., io.openshift.expose-services=, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, version=17.1.12, vcs-type=git, tcib_managed=true, container_name=collectd, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, io.buildah.version=1.41.4, batch=17.1_20251118.1) Dec 15 03:36:02 localhost systemd[1]: 165a6fd1f2b629d16da0798ec1c9bac41d302d530a0ffa668b2315a52d6aa04e.service: Deactivated successfully. Dec 15 03:36:02 localhost podman[87589]: 2025-12-15 08:36:02.08161332 +0000 UTC m=+0.386316514 container exec_died d6041e5471624a495d5be42684f6049ccf71802a80d5760239330151d465d146 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, architecture=x86_64, batch=17.1_20251118.1, io.buildah.version=1.41.4, vcs-type=git, container_name=ceilometer_agent_compute, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, vendor=Red Hat, Inc., io.openshift.expose-services=, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_id=tripleo_step4, maintainer=OpenStack TripleO Team, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'fcee5a4a91f85471fca7b61211375646'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 
'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, name=rhosp17/openstack-ceilometer-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.12, url=https://www.redhat.com, konflux.additional-tags=17.1.12 17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, build-date=2025-11-19T00:11:48Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, tcib_managed=true, managed_by=tripleo_ansible, com.redhat.component=openstack-ceilometer-compute-container, distribution-scope=public, release=1761123044) Dec 15 03:36:02 localhost systemd[1]: d6041e5471624a495d5be42684f6049ccf71802a80d5760239330151d465d146.service: Deactivated successfully. Dec 15 03:36:04 localhost systemd[1]: Started /usr/bin/podman healthcheck run 4c3f871f4bdf287b1d3ce717956c1cf130617489217410eadf6fffcc440eb3b0. 
Dec 15 03:36:04 localhost systemd[1]: tmp-crun.62RU9q.mount: Deactivated successfully. Dec 15 03:36:04 localhost podman[87702]: 2025-12-15 08:36:04.754817377 +0000 UTC m=+0.087793299 container health_status 4c3f871f4bdf287b1d3ce717956c1cf130617489217410eadf6fffcc440eb3b0 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, tcib_managed=true, managed_by=tripleo_ansible, release=1761123044, com.redhat.component=openstack-nova-compute-container, config_id=tripleo_step4, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, version=17.1.12, konflux.additional-tags=17.1.12 17.1_20251118.1, distribution-scope=public, build-date=2025-11-19T00:36:58Z, batch=17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, vendor=Red Hat, Inc., io.openshift.expose-services=, url=https://www.redhat.com, io.buildah.version=1.41.4, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, container_name=nova_migration_target, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '879500e96bf8dfb93687004bd86f2317'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d) Dec 15 03:36:05 localhost podman[87702]: 2025-12-15 08:36:05.16057358 +0000 UTC m=+0.493549492 container exec_died 4c3f871f4bdf287b1d3ce717956c1cf130617489217410eadf6fffcc440eb3b0 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.buildah.version=1.41.4, release=1761123044, konflux.additional-tags=17.1.12 17.1_20251118.1, tcib_managed=true, name=rhosp17/openstack-nova-compute, distribution-scope=public, version=17.1.12, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=nova_migration_target, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '879500e96bf8dfb93687004bd86f2317'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', 
'/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, build-date=2025-11-19T00:36:58Z, batch=17.1_20251118.1, config_id=tripleo_step4, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, com.redhat.component=openstack-nova-compute-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, maintainer=OpenStack TripleO Team, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d) Dec 15 03:36:05 localhost systemd[1]: 4c3f871f4bdf287b1d3ce717956c1cf130617489217410eadf6fffcc440eb3b0.service: Deactivated successfully. Dec 15 03:36:07 localhost systemd[1]: Started /usr/bin/podman healthcheck run 2df8d20c6ba32f38bd0e6519c9c77a4146b632f21c81197d8c319b223aad81f1. Dec 15 03:36:07 localhost systemd[1]: Started /usr/bin/podman healthcheck run 4d35b7dc3f3a4e3c60661e85d3511f50804e87244d74cab67af5c6e8c447d379. Dec 15 03:36:07 localhost systemd[1]: tmp-crun.G5tNcq.mount: Deactivated successfully. 
Dec 15 03:36:07 localhost podman[87726]: 2025-12-15 08:36:07.761170605 +0000 UTC m=+0.087970154 container health_status 2df8d20c6ba32f38bd0e6519c9c77a4146b632f21c81197d8c319b223aad81f1 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, container_name=ovn_controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team, distribution-scope=public, io.buildah.version=1.41.4, vendor=Red Hat, Inc., io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 ovn-controller, batch=17.1_20251118.1, build-date=2025-11-18T23:34:05Z, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, architecture=x86_64, name=rhosp17/openstack-ovn-controller, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, version=17.1.12, release=1761123044, managed_by=tripleo_ansible, config_id=tripleo_step4, 
com.redhat.component=openstack-ovn-controller-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, summary=Red Hat OpenStack Platform 17.1 ovn-controller, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, tcib_managed=true) Dec 15 03:36:07 localhost podman[87726]: 2025-12-15 08:36:07.783285946 +0000 UTC m=+0.110085435 container exec_died 2df8d20c6ba32f38bd0e6519c9c77a4146b632f21c81197d8c319b223aad81f1 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, io.buildah.version=1.41.4, summary=Red Hat OpenStack Platform 17.1 ovn-controller, release=1761123044, com.redhat.component=openstack-ovn-controller-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, description=Red Hat OpenStack Platform 17.1 ovn-controller, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1, io.openshift.expose-services=, url=https://www.redhat.com, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, name=rhosp17/openstack-ovn-controller, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., config_id=tripleo_step4, build-date=2025-11-18T23:34:05Z, version=17.1.12, tcib_managed=true, vcs-type=git, maintainer=OpenStack TripleO Team, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, container_name=ovn_controller, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, managed_by=tripleo_ansible) Dec 15 03:36:07 localhost podman[87727]: 2025-12-15 08:36:07.797138427 +0000 UTC m=+0.120962646 container health_status 4d35b7dc3f3a4e3c60661e85d3511f50804e87244d74cab67af5c6e8c447d379 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, version=17.1.12, vcs-type=git, maintainer=OpenStack TripleO Team, tcib_managed=true, konflux.additional-tags=17.1.12 17.1_20251118.1, container_name=ovn_metadata_agent, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, distribution-scope=public, build-date=2025-11-19T00:14:25Z, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, io.openshift.expose-services=, config_id=tripleo_step4, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'a56a6f14b467cd9064e40c03defa5ed7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, release=1761123044, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, architecture=x86_64, name=rhosp17/openstack-neutron-metadata-agent-ovn, batch=17.1_20251118.1, vendor=Red Hat, Inc., org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c) Dec 15 03:36:07 localhost systemd[1]: 2df8d20c6ba32f38bd0e6519c9c77a4146b632f21c81197d8c319b223aad81f1.service: Deactivated successfully. 
Dec 15 03:36:07 localhost podman[87727]: 2025-12-15 08:36:07.836632473 +0000 UTC m=+0.160456732 container exec_died 4d35b7dc3f3a4e3c60661e85d3511f50804e87244d74cab67af5c6e8c447d379 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, io.buildah.version=1.41.4, managed_by=tripleo_ansible, build-date=2025-11-19T00:14:25Z, architecture=x86_64, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, batch=17.1_20251118.1, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.expose-services=, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'a56a6f14b467cd9064e40c03defa5ed7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', 
'/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, url=https://www.redhat.com, container_name=ovn_metadata_agent, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, version=17.1.12, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, tcib_managed=true, name=rhosp17/openstack-neutron-metadata-agent-ovn, config_id=tripleo_step4, release=1761123044, konflux.additional-tags=17.1.12 17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, maintainer=OpenStack TripleO Team) Dec 15 03:36:07 localhost systemd[1]: 4d35b7dc3f3a4e3c60661e85d3511f50804e87244d74cab67af5c6e8c447d379.service: Deactivated successfully. Dec 15 03:36:15 localhost sshd[87771]: main: sshd: ssh-rsa algorithm is disabled Dec 15 03:36:16 localhost systemd[1]: Started /usr/bin/podman healthcheck run 6278d648aec9c9f12e0efaafa0d6ae5653eeb34459516699e562eeb9c8d23b3c. 
Dec 15 03:36:16 localhost podman[87773]: 2025-12-15 08:36:16.762917584 +0000 UTC m=+0.088099478 container health_status 6278d648aec9c9f12e0efaafa0d6ae5653eeb34459516699e562eeb9c8d23b3c (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, com.redhat.component=openstack-qdrouterd-container, distribution-scope=public, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., name=rhosp17/openstack-qdrouterd, vcs-type=git, container_name=metrics_qdr, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '29d07da103bbd44a9ed3e29999314b03'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, config_id=tripleo_step1, url=https://www.redhat.com, build-date=2025-11-18T22:49:46Z, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, 
org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, tcib_managed=true, architecture=x86_64, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, release=1761123044, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, version=17.1.12, description=Red Hat OpenStack Platform 17.1 qdrouterd, managed_by=tripleo_ansible, batch=17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 qdrouterd, io.buildah.version=1.41.4, io.openshift.expose-services=) Dec 15 03:36:16 localhost podman[87773]: 2025-12-15 08:36:16.979437575 +0000 UTC m=+0.304619479 container exec_died 6278d648aec9c9f12e0efaafa0d6ae5653eeb34459516699e562eeb9c8d23b3c (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, batch=17.1_20251118.1, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 qdrouterd, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, release=1761123044, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.expose-services=, build-date=2025-11-18T22:49:46Z, version=17.1.12, container_name=metrics_qdr, config_id=tripleo_step1, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-qdrouterd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.12 17.1_20251118.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '29d07da103bbd44a9ed3e29999314b03'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 
'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, com.redhat.component=openstack-qdrouterd-container, vendor=Red Hat, Inc., distribution-scope=public, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vcs-type=git, architecture=x86_64) Dec 15 03:36:16 localhost systemd[1]: 6278d648aec9c9f12e0efaafa0d6ae5653eeb34459516699e562eeb9c8d23b3c.service: Deactivated successfully. Dec 15 03:36:32 localhost sshd[87977]: main: sshd: ssh-rsa algorithm is disabled Dec 15 03:36:32 localhost systemd[1]: Started /usr/bin/podman healthcheck run 165a6fd1f2b629d16da0798ec1c9bac41d302d530a0ffa668b2315a52d6aa04e. Dec 15 03:36:32 localhost systemd[1]: Started /usr/bin/podman healthcheck run 2649c3aec9af2f04509e1d248b1050b202fccc3ed2b86421c89a2a65376db057. Dec 15 03:36:32 localhost systemd[1]: Started /usr/bin/podman healthcheck run 36a88083ed613f6871d2631ae462bca308a45ba583333f7cfe4d0b0c40a18ba5. 
Dec 15 03:36:32 localhost systemd[1]: Started /usr/bin/podman healthcheck run 97115ff7c5a84ae7e17f79e696e94da3c63f9fe0051287fe9193bec282a03f99. Dec 15 03:36:32 localhost systemd[1]: Started /usr/bin/podman healthcheck run ac45612fab357b615b4684dbfa5bd000dd29eb1adfbaedb6db0802fc49e89549. Dec 15 03:36:32 localhost systemd[1]: Started /usr/bin/podman healthcheck run d6041e5471624a495d5be42684f6049ccf71802a80d5760239330151d465d146. Dec 15 03:36:32 localhost podman[87978]: 2025-12-15 08:36:32.795126025 +0000 UTC m=+0.112727686 container health_status 165a6fd1f2b629d16da0798ec1c9bac41d302d530a0ffa668b2315a52d6aa04e (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, architecture=x86_64, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, distribution-scope=public, io.buildah.version=1.41.4, name=rhosp17/openstack-collectd, managed_by=tripleo_ansible, release=1761123044, build-date=2025-11-18T22:51:28Z, summary=Red Hat OpenStack Platform 17.1 collectd, vcs-type=git, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, com.redhat.component=openstack-collectd-container, url=https://www.redhat.com, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, container_name=collectd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_id=tripleo_step3, description=Red Hat OpenStack Platform 17.1 collectd, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.12 17.1_20251118.1, batch=17.1_20251118.1, io.openshift.expose-services=, tcib_managed=true, version=17.1.12, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd) Dec 15 03:36:32 localhost podman[87979]: 2025-12-15 08:36:32.836674737 +0000 UTC m=+0.150469986 container health_status 2649c3aec9af2f04509e1d248b1050b202fccc3ed2b86421c89a2a65376db057 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-11-18T23:44:13Z, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, config_id=tripleo_step3, konflux.additional-tags=17.1.12 
17.1_20251118.1, com.redhat.component=openstack-iscsid-container, architecture=x86_64, name=rhosp17/openstack-iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, vcs-type=git, io.openshift.expose-services=, release=1761123044, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, container_name=iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '182e509007ab5e6e5b2500a552cbd5ba'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, distribution-scope=public, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 iscsid, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, tcib_managed=true, io.buildah.version=1.41.4, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, batch=17.1_20251118.1) Dec 15 03:36:32 localhost podman[87978]: 2025-12-15 08:36:32.855544861 +0000 UTC m=+0.173146512 container exec_died 
165a6fd1f2b629d16da0798ec1c9bac41d302d530a0ffa668b2315a52d6aa04e (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, release=1761123044, vcs-type=git, description=Red Hat OpenStack Platform 17.1 collectd, distribution-scope=public, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, tcib_managed=true, version=17.1.12, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, url=https://www.redhat.com, konflux.additional-tags=17.1.12 17.1_20251118.1, 
batch=17.1_20251118.1, config_id=tripleo_step3, com.redhat.component=openstack-collectd-container, container_name=collectd, summary=Red Hat OpenStack Platform 17.1 collectd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, managed_by=tripleo_ansible, build-date=2025-11-18T22:51:28Z, name=rhosp17/openstack-collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., io.openshift.expose-services=, architecture=x86_64) Dec 15 03:36:32 localhost systemd[1]: 165a6fd1f2b629d16da0798ec1c9bac41d302d530a0ffa668b2315a52d6aa04e.service: Deactivated successfully. Dec 15 03:36:32 localhost podman[87993]: 2025-12-15 08:36:32.948763765 +0000 UTC m=+0.254200230 container health_status d6041e5471624a495d5be42684f6049ccf71802a80d5760239330151d465d146 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1761123044, tcib_managed=true, konflux.additional-tags=17.1.12 17.1_20251118.1, version=17.1.12, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_id=tripleo_step4, container_name=ceilometer_agent_compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, architecture=x86_64, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, url=https://www.redhat.com, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 
'TRIPLEO_CONFIG_HASH': 'fcee5a4a91f85471fca7b61211375646'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.buildah.version=1.41.4, com.redhat.component=openstack-ceilometer-compute-container, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, name=rhosp17/openstack-ceilometer-compute, vcs-type=git, distribution-scope=public, build-date=2025-11-19T00:11:48Z, vendor=Red Hat, Inc., managed_by=tripleo_ansible, batch=17.1_20251118.1) Dec 15 03:36:32 localhost podman[87982]: 2025-12-15 08:36:32.994315323 +0000 UTC m=+0.300911380 container health_status ac45612fab357b615b4684dbfa5bd000dd29eb1adfbaedb6db0802fc49e89549 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, container_name=logrotate_crond, summary=Red Hat OpenStack Platform 17.1 cron, managed_by=tripleo_ansible, build-date=2025-11-18T22:49:32Z, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vendor=Red Hat, Inc., version=17.1.12, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, release=1761123044, description=Red Hat OpenStack Platform 17.1 cron, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, name=rhosp17/openstack-cron, batch=17.1_20251118.1, com.redhat.component=openstack-cron-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step4, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, io.buildah.version=1.41.4, konflux.additional-tags=17.1.12 17.1_20251118.1, distribution-scope=public, 
tcib_managed=true, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, url=https://www.redhat.com) Dec 15 03:36:33 localhost podman[87993]: 2025-12-15 08:36:33.005247845 +0000 UTC m=+0.310684320 container exec_died d6041e5471624a495d5be42684f6049ccf71802a80d5760239330151d465d146 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, name=rhosp17/openstack-ceilometer-compute, batch=17.1_20251118.1, tcib_managed=true, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'fcee5a4a91f85471fca7b61211375646'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 
ceilometer-compute, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1761123044, build-date=2025-11-19T00:11:48Z, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, konflux.additional-tags=17.1.12 17.1_20251118.1, io.buildah.version=1.41.4, managed_by=tripleo_ansible, version=17.1.12, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, config_id=tripleo_step4, container_name=ceilometer_agent_compute, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-type=git, com.redhat.component=openstack-ceilometer-compute-container, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, architecture=x86_64) Dec 15 03:36:33 localhost podman[87982]: 2025-12-15 08:36:33.005526963 +0000 UTC m=+0.312123030 container exec_died ac45612fab357b615b4684dbfa5bd000dd29eb1adfbaedb6db0802fc49e89549 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, config_id=tripleo_step4, container_name=logrotate_crond, com.redhat.component=openstack-cron-container, io.openshift.expose-services=, build-date=2025-11-18T22:49:32Z, architecture=x86_64, vendor=Red Hat, Inc., io.buildah.version=1.41.4, konflux.additional-tags=17.1.12 17.1_20251118.1, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1761123044, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 cron, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, batch=17.1_20251118.1, name=rhosp17/openstack-cron, 
url=https://www.redhat.com, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, vcs-type=git, description=Red Hat OpenStack Platform 17.1 cron, version=17.1.12) Dec 15 03:36:33 localhost podman[87980]: 2025-12-15 08:36:32.912977177 +0000 UTC m=+0.229240632 container health_status 36a88083ed613f6871d2631ae462bca308a45ba583333f7cfe4d0b0c40a18ba5 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_id=tripleo_step5, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, distribution-scope=public, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, vcs-type=git, managed_by=tripleo_ansible, tcib_managed=true, 
io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '182e509007ab5e6e5b2500a552cbd5ba-879500e96bf8dfb93687004bd86f2317'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, io.openshift.expose-services=, com.redhat.component=openstack-nova-compute-container, description=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.12, vendor=Red Hat, Inc., 
org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, build-date=2025-11-19T00:36:58Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1, container_name=nova_compute, release=1761123044, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp17/openstack-nova-compute, architecture=x86_64) Dec 15 03:36:33 localhost systemd[1]: d6041e5471624a495d5be42684f6049ccf71802a80d5760239330151d465d146.service: Deactivated successfully. Dec 15 03:36:33 localhost podman[87979]: 2025-12-15 08:36:33.025072636 +0000 UTC m=+0.338867905 container exec_died 2649c3aec9af2f04509e1d248b1050b202fccc3ed2b86421c89a2a65376db057 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, container_name=iscsid, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '182e509007ab5e6e5b2500a552cbd5ba'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, io.buildah.version=1.41.4, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-iscsid, konflux.additional-tags=17.1.12 17.1_20251118.1, release=1761123044, build-date=2025-11-18T23:44:13Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-type=git, managed_by=tripleo_ansible, architecture=x86_64, config_id=tripleo_step3, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, batch=17.1_20251118.1, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, version=17.1.12, io.openshift.expose-services=, com.redhat.component=openstack-iscsid-container) Dec 15 03:36:33 localhost systemd[1]: 2649c3aec9af2f04509e1d248b1050b202fccc3ed2b86421c89a2a65376db057.service: Deactivated successfully. 
Dec 15 03:36:33 localhost podman[87980]: 2025-12-15 08:36:33.047704531 +0000 UTC m=+0.363967996 container exec_died 36a88083ed613f6871d2631ae462bca308a45ba583333f7cfe4d0b0c40a18ba5 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_id=tripleo_step5, summary=Red Hat OpenStack Platform 17.1 nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, container_name=nova_compute, url=https://www.redhat.com, managed_by=tripleo_ansible, konflux.additional-tags=17.1.12 17.1_20251118.1, name=rhosp17/openstack-nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '182e509007ab5e6e5b2500a552cbd5ba-879500e96bf8dfb93687004bd86f2317'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', 
'/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, com.redhat.component=openstack-nova-compute-container, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2025-11-19T00:36:58Z, vcs-type=git, version=17.1.12, release=1761123044, vendor=Red Hat, Inc., io.buildah.version=1.41.4, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, batch=17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, tcib_managed=true, architecture=x86_64, distribution-scope=public) Dec 15 03:36:33 localhost systemd[1]: 36a88083ed613f6871d2631ae462bca308a45ba583333f7cfe4d0b0c40a18ba5.service: Deactivated successfully. Dec 15 03:36:33 localhost systemd[1]: ac45612fab357b615b4684dbfa5bd000dd29eb1adfbaedb6db0802fc49e89549.service: Deactivated successfully. 
Dec 15 03:36:33 localhost podman[87981]: 2025-12-15 08:36:33.155246497 +0000 UTC m=+0.464106554 container health_status 97115ff7c5a84ae7e17f79e696e94da3c63f9fe0051287fe9193bec282a03f99 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, url=https://www.redhat.com, konflux.additional-tags=17.1.12 17.1_20251118.1, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, batch=17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, name=rhosp17/openstack-ceilometer-ipmi, architecture=x86_64, vendor=Red Hat, Inc., io.openshift.expose-services=, com.redhat.component=openstack-ceilometer-ipmi-container, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, maintainer=OpenStack TripleO Team, config_id=tripleo_step4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, distribution-scope=public, container_name=ceilometer_agent_ipmi, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, managed_by=tripleo_ansible, io.buildah.version=1.41.4, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'fcee5a4a91f85471fca7b61211375646'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, release=1761123044, build-date=2025-11-19T00:12:45Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, version=17.1.12, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git) Dec 15 03:36:33 localhost podman[87981]: 2025-12-15 08:36:33.231350372 +0000 UTC m=+0.540210409 container exec_died 97115ff7c5a84ae7e17f79e696e94da3c63f9fe0051287fe9193bec282a03f99 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, container_name=ceilometer_agent_ipmi, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vendor=Red Hat, Inc., config_id=tripleo_step4, distribution-scope=public, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'fcee5a4a91f85471fca7b61211375646'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, release=1761123044, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, konflux.additional-tags=17.1.12 17.1_20251118.1, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, architecture=x86_64, version=17.1.12, build-date=2025-11-19T00:12:45Z, tcib_managed=true, name=rhosp17/openstack-ceilometer-ipmi, com.redhat.component=openstack-ceilometer-ipmi-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-type=git, io.buildah.version=1.41.4, batch=17.1_20251118.1, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676) Dec 15 03:36:33 localhost systemd[1]: 97115ff7c5a84ae7e17f79e696e94da3c63f9fe0051287fe9193bec282a03f99.service: Deactivated successfully. Dec 15 03:36:33 localhost systemd[1]: tmp-crun.Z2BOCl.mount: Deactivated successfully. Dec 15 03:36:35 localhost systemd[1]: Started /usr/bin/podman healthcheck run 4c3f871f4bdf287b1d3ce717956c1cf130617489217410eadf6fffcc440eb3b0. Dec 15 03:36:35 localhost systemd[1]: tmp-crun.Atwa5F.mount: Deactivated successfully. 
Dec 15 03:36:35 localhost podman[88115]: 2025-12-15 08:36:35.769051587 +0000 UTC m=+0.095342001 container health_status 4c3f871f4bdf287b1d3ce717956c1cf130617489217410eadf6fffcc440eb3b0 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, config_id=tripleo_step4, version=17.1.12, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-nova-compute-container, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, batch=17.1_20251118.1, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, build-date=2025-11-19T00:36:58Z, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, name=rhosp17/openstack-nova-compute, release=1761123044, io.buildah.version=1.41.4, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, managed_by=tripleo_ansible, container_name=nova_migration_target, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, distribution-scope=public, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 nova-compute, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '879500e96bf8dfb93687004bd86f2317'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream) Dec 15 03:36:36 localhost podman[88115]: 2025-12-15 08:36:36.157580289 +0000 UTC m=+0.483870703 container exec_died 4c3f871f4bdf287b1d3ce717956c1cf130617489217410eadf6fffcc440eb3b0 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, version=17.1.12, architecture=x86_64, batch=17.1_20251118.1, managed_by=tripleo_ansible, url=https://www.redhat.com, vcs-type=git, config_id=tripleo_step4, release=1761123044, konflux.additional-tags=17.1.12 17.1_20251118.1, vendor=Red Hat, Inc., build-date=2025-11-19T00:36:58Z, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public, container_name=nova_migration_target, description=Red Hat OpenStack Platform 17.1 nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.component=openstack-nova-compute-container, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, name=rhosp17/openstack-nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '879500e96bf8dfb93687004bd86f2317'}, 'healthcheck': 
{'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, io.buildah.version=1.41.4, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d) Dec 15 03:36:36 localhost systemd[1]: 4c3f871f4bdf287b1d3ce717956c1cf130617489217410eadf6fffcc440eb3b0.service: Deactivated successfully. Dec 15 03:36:38 localhost systemd[1]: Started /usr/bin/podman healthcheck run 2df8d20c6ba32f38bd0e6519c9c77a4146b632f21c81197d8c319b223aad81f1. Dec 15 03:36:38 localhost systemd[1]: Started /usr/bin/podman healthcheck run 4d35b7dc3f3a4e3c60661e85d3511f50804e87244d74cab67af5c6e8c447d379. 
Dec 15 03:36:38 localhost podman[88138]: 2025-12-15 08:36:38.752251826 +0000 UTC m=+0.078874881 container health_status 2df8d20c6ba32f38bd0e6519c9c77a4146b632f21c81197d8c319b223aad81f1 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, io.buildah.version=1.41.4, summary=Red Hat OpenStack Platform 17.1 ovn-controller, config_id=tripleo_step4, version=17.1.12, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 ovn-controller, vendor=Red Hat, Inc., architecture=x86_64, distribution-scope=public, name=rhosp17/openstack-ovn-controller, managed_by=tripleo_ansible, konflux.additional-tags=17.1.12 17.1_20251118.1, batch=17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, build-date=2025-11-18T23:34:05Z, com.redhat.component=openstack-ovn-controller-container, url=https://www.redhat.com, release=1761123044, 
tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, container_name=ovn_controller, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-type=git) Dec 15 03:36:38 localhost podman[88138]: 2025-12-15 08:36:38.777243124 +0000 UTC m=+0.103866169 container exec_died 2df8d20c6ba32f38bd0e6519c9c77a4146b632f21c81197d8c319b223aad81f1 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, architecture=x86_64, url=https://www.redhat.com, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-ovn-controller-container, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 ovn-controller, config_id=tripleo_step4, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, description=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, batch=17.1_20251118.1, io.buildah.version=1.41.4, release=1761123044, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-type=git, container_name=ovn_controller, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, vendor=Red Hat, Inc., 
io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, tcib_managed=true, version=17.1.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, build-date=2025-11-18T23:34:05Z, konflux.additional-tags=17.1.12 17.1_20251118.1, name=rhosp17/openstack-ovn-controller) Dec 15 03:36:38 localhost systemd[1]: 2df8d20c6ba32f38bd0e6519c9c77a4146b632f21c81197d8c319b223aad81f1.service: Deactivated successfully. Dec 15 03:36:38 localhost systemd[1]: tmp-crun.5RhrG4.mount: Deactivated successfully. Dec 15 03:36:38 localhost podman[88139]: 2025-12-15 08:36:38.868953437 +0000 UTC m=+0.192650923 container health_status 4d35b7dc3f3a4e3c60661e85d3511f50804e87244d74cab67af5c6e8c447d379 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, io.buildah.version=1.41.4, release=1761123044, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, batch=17.1_20251118.1, config_id=tripleo_step4, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, maintainer=OpenStack TripleO Team, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'a56a6f14b467cd9064e40c03defa5ed7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, vendor=Red Hat, Inc., io.openshift.expose-services=, tcib_managed=true, konflux.additional-tags=17.1.12 17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-type=git, managed_by=tripleo_ansible, build-date=2025-11-19T00:14:25Z, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, container_name=ovn_metadata_agent, version=17.1.12, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, url=https://www.redhat.com, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, architecture=x86_64, distribution-scope=public, name=rhosp17/openstack-neutron-metadata-agent-ovn) Dec 15 03:36:38 localhost podman[88139]: 2025-12-15 08:36:38.918692868 +0000 UTC m=+0.242390364 container exec_died 4d35b7dc3f3a4e3c60661e85d3511f50804e87244d74cab67af5c6e8c447d379 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, summary=Red Hat OpenStack Platform 17.1 
neutron-metadata-agent-ovn, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, name=rhosp17/openstack-neutron-metadata-agent-ovn, version=17.1.12, maintainer=OpenStack TripleO Team, build-date=2025-11-19T00:14:25Z, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, url=https://www.redhat.com, architecture=x86_64, vcs-type=git, io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.expose-services=, tcib_managed=true, distribution-scope=public, konflux.additional-tags=17.1.12 17.1_20251118.1, release=1761123044, container_name=ovn_metadata_agent, config_id=tripleo_step4, managed_by=tripleo_ansible, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'a56a6f14b467cd9064e40c03defa5ed7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, batch=17.1_20251118.1, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn) Dec 15 03:36:38 localhost systemd[1]: 4d35b7dc3f3a4e3c60661e85d3511f50804e87244d74cab67af5c6e8c447d379.service: Deactivated successfully. Dec 15 03:36:47 localhost systemd[1]: Started /usr/bin/podman healthcheck run 6278d648aec9c9f12e0efaafa0d6ae5653eeb34459516699e562eeb9c8d23b3c. Dec 15 03:36:47 localhost systemd[1]: tmp-crun.yrt0QA.mount: Deactivated successfully. 
Dec 15 03:36:47 localhost podman[88186]: 2025-12-15 08:36:47.771379924 +0000 UTC m=+0.100957771 container health_status 6278d648aec9c9f12e0efaafa0d6ae5653eeb34459516699e562eeb9c8d23b3c (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, com.redhat.component=openstack-qdrouterd-container, io.buildah.version=1.41.4, vendor=Red Hat, Inc., vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, tcib_managed=true, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, build-date=2025-11-18T22:49:46Z, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, summary=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, architecture=x86_64, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, container_name=metrics_qdr, url=https://www.redhat.com, config_id=tripleo_step1, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 qdrouterd, release=1761123044, version=17.1.12, batch=17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, name=rhosp17/openstack-qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '29d07da103bbd44a9ed3e29999314b03'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, konflux.additional-tags=17.1.12 17.1_20251118.1) Dec 15 03:36:47 localhost podman[88186]: 2025-12-15 08:36:47.992476418 +0000 UTC m=+0.322054275 container exec_died 6278d648aec9c9f12e0efaafa0d6ae5653eeb34459516699e562eeb9c8d23b3c (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, name=rhosp17/openstack-qdrouterd, architecture=x86_64, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, summary=Red Hat OpenStack Platform 17.1 qdrouterd, managed_by=tripleo_ansible, release=1761123044, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '29d07da103bbd44a9ed3e29999314b03'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, distribution-scope=public, tcib_managed=true, batch=17.1_20251118.1, url=https://www.redhat.com, config_id=tripleo_step1, maintainer=OpenStack TripleO Team, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, version=17.1.12, build-date=2025-11-18T22:49:46Z, description=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, container_name=metrics_qdr, io.buildah.version=1.41.4, com.redhat.component=openstack-qdrouterd-container) Dec 15 03:36:48 localhost systemd[1]: 6278d648aec9c9f12e0efaafa0d6ae5653eeb34459516699e562eeb9c8d23b3c.service: Deactivated successfully. Dec 15 03:37:03 localhost systemd[1]: Started /usr/bin/podman healthcheck run 165a6fd1f2b629d16da0798ec1c9bac41d302d530a0ffa668b2315a52d6aa04e. Dec 15 03:37:03 localhost systemd[1]: Started /usr/bin/podman healthcheck run 2649c3aec9af2f04509e1d248b1050b202fccc3ed2b86421c89a2a65376db057. Dec 15 03:37:03 localhost systemd[1]: Started /usr/bin/podman healthcheck run 36a88083ed613f6871d2631ae462bca308a45ba583333f7cfe4d0b0c40a18ba5. Dec 15 03:37:03 localhost systemd[1]: Started /usr/bin/podman healthcheck run 97115ff7c5a84ae7e17f79e696e94da3c63f9fe0051287fe9193bec282a03f99. 
Dec 15 03:37:03 localhost systemd[1]: Started /usr/bin/podman healthcheck run ac45612fab357b615b4684dbfa5bd000dd29eb1adfbaedb6db0802fc49e89549. Dec 15 03:37:03 localhost systemd[1]: Started /usr/bin/podman healthcheck run d6041e5471624a495d5be42684f6049ccf71802a80d5760239330151d465d146. Dec 15 03:37:03 localhost systemd[1]: Starting Check and recover tripleo_nova_virtqemud... Dec 15 03:37:03 localhost recover_tripleo_nova_virtqemud[88251]: 61849 Dec 15 03:37:03 localhost systemd[1]: tripleo_nova_virtqemud_recover.service: Deactivated successfully. Dec 15 03:37:03 localhost systemd[1]: Finished Check and recover tripleo_nova_virtqemud. Dec 15 03:37:03 localhost systemd[1]: tmp-crun.XpvLN0.mount: Deactivated successfully. Dec 15 03:37:03 localhost podman[88216]: 2025-12-15 08:37:03.774895684 +0000 UTC m=+0.094999601 container health_status 36a88083ed613f6871d2631ae462bca308a45ba583333f7cfe4d0b0c40a18ba5 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible, version=17.1.12, vcs-type=git, build-date=2025-11-19T00:36:58Z, distribution-scope=public, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, batch=17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, vendor=Red Hat, Inc., release=1761123044, io.buildah.version=1.41.4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, container_name=nova_compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 
'direct', 'TRIPLEO_CONFIG_HASH': '182e509007ab5e6e5b2500a552cbd5ba-879500e96bf8dfb93687004bd86f2317'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, url=https://www.redhat.com, name=rhosp17/openstack-nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.component=openstack-nova-compute-container, config_id=tripleo_step5, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, tcib_managed=true, architecture=x86_64, maintainer=OpenStack TripleO Team, io.openshift.expose-services=) Dec 
15 03:37:03 localhost systemd[1]: tmp-crun.KyiOLL.mount: Deactivated successfully. Dec 15 03:37:03 localhost podman[88214]: 2025-12-15 08:37:03.801050644 +0000 UTC m=+0.123610417 container health_status 165a6fd1f2b629d16da0798ec1c9bac41d302d530a0ffa668b2315a52d6aa04e (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, version=17.1.12, io.buildah.version=1.41.4, maintainer=OpenStack TripleO Team, vcs-type=git, managed_by=tripleo_ansible, url=https://www.redhat.com, container_name=collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, distribution-scope=public, name=rhosp17/openstack-collectd, batch=17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, config_id=tripleo_step3, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, summary=Red Hat OpenStack Platform 17.1 collectd, build-date=2025-11-18T22:51:28Z, architecture=x86_64, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.expose-services=, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, release=1761123044, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, konflux.additional-tags=17.1.12 17.1_20251118.1, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 collectd, tcib_managed=true, com.redhat.component=openstack-collectd-container) Dec 15 03:37:03 localhost podman[88216]: 2025-12-15 08:37:03.829739211 +0000 UTC m=+0.149843118 container exec_died 36a88083ed613f6871d2631ae462bca308a45ba583333f7cfe4d0b0c40a18ba5 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, vcs-type=git, description=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vendor=Red Hat, Inc., konflux.additional-tags=17.1.12 17.1_20251118.1, batch=17.1_20251118.1, maintainer=OpenStack TripleO Team, container_name=nova_compute, name=rhosp17/openstack-nova-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '182e509007ab5e6e5b2500a552cbd5ba-879500e96bf8dfb93687004bd86f2317'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 
'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, com.redhat.component=openstack-nova-compute-container, io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, url=https://www.redhat.com, version=17.1.12, io.openshift.expose-services=, build-date=2025-11-19T00:36:58Z, release=1761123044, 
config_id=tripleo_step5, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, summary=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64) Dec 15 03:37:03 localhost systemd[1]: 36a88083ed613f6871d2631ae462bca308a45ba583333f7cfe4d0b0c40a18ba5.service: Deactivated successfully. Dec 15 03:37:03 localhost podman[88214]: 2025-12-15 08:37:03.864892462 +0000 UTC m=+0.187452235 container exec_died 165a6fd1f2b629d16da0798ec1c9bac41d302d530a0ffa668b2315a52d6aa04e (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, description=Red Hat OpenStack Platform 17.1 collectd, io.buildah.version=1.41.4, vcs-type=git, build-date=2025-11-18T22:51:28Z, url=https://www.redhat.com, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, tcib_managed=true, config_id=tripleo_step3, release=1761123044, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, batch=17.1_20251118.1, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-collectd-container, name=rhosp17/openstack-collectd, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, managed_by=tripleo_ansible, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, io.openshift.expose-services=, container_name=collectd, summary=Red Hat OpenStack Platform 17.1 collectd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public, version=17.1.12, konflux.additional-tags=17.1.12 17.1_20251118.1) Dec 15 03:37:03 localhost systemd[1]: 165a6fd1f2b629d16da0798ec1c9bac41d302d530a0ffa668b2315a52d6aa04e.service: Deactivated successfully. 
Dec 15 03:37:03 localhost podman[88235]: 2025-12-15 08:37:03.832781012 +0000 UTC m=+0.139021139 container health_status d6041e5471624a495d5be42684f6049ccf71802a80d5760239330151d465d146 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, version=17.1.12, com.redhat.component=openstack-ceilometer-compute-container, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'fcee5a4a91f85471fca7b61211375646'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.buildah.version=1.41.4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, config_id=tripleo_step4, architecture=x86_64, batch=17.1_20251118.1, distribution-scope=public, container_name=ceilometer_agent_compute, url=https://www.redhat.com, vcs-type=git, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, io.k8s.display-name=Red Hat OpenStack 
Platform 17.1 ceilometer-compute, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, vendor=Red Hat, Inc., konflux.additional-tags=17.1.12 17.1_20251118.1, name=rhosp17/openstack-ceilometer-compute, release=1761123044, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, build-date=2025-11-19T00:11:48Z, maintainer=OpenStack TripleO Team) Dec 15 03:37:03 localhost podman[88235]: 2025-12-15 08:37:03.914444417 +0000 UTC m=+0.220684604 container exec_died d6041e5471624a495d5be42684f6049ccf71802a80d5760239330151d465d146 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, config_id=tripleo_step4, tcib_managed=true, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, container_name=ceilometer_agent_compute, maintainer=OpenStack TripleO Team, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, name=rhosp17/openstack-ceilometer-compute, url=https://www.redhat.com, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.expose-services=, distribution-scope=public, vcs-type=git, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, batch=17.1_20251118.1, release=1761123044, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': 
{'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'fcee5a4a91f85471fca7b61211375646'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, version=17.1.12, com.redhat.component=openstack-ceilometer-compute-container, build-date=2025-11-19T00:11:48Z, managed_by=tripleo_ansible, vendor=Red Hat, Inc., konflux.additional-tags=17.1.12 17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute) Dec 15 03:37:03 localhost podman[88234]: 2025-12-15 08:37:03.815381157 +0000 UTC m=+0.125410775 container health_status ac45612fab357b615b4684dbfa5bd000dd29eb1adfbaedb6db0802fc49e89549 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, version=17.1.12, name=rhosp17/openstack-cron, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 
'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, config_id=tripleo_step4, distribution-scope=public, io.buildah.version=1.41.4, maintainer=OpenStack TripleO Team, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, summary=Red Hat OpenStack Platform 17.1 cron, io.openshift.expose-services=, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, container_name=logrotate_crond, release=1761123044, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 cron, konflux.additional-tags=17.1.12 17.1_20251118.1, build-date=2025-11-18T22:49:32Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, architecture=x86_64, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, tcib_managed=true, vendor=Red Hat, Inc., com.redhat.component=openstack-cron-container, url=https://www.redhat.com) Dec 15 
03:37:03 localhost systemd[1]: d6041e5471624a495d5be42684f6049ccf71802a80d5760239330151d465d146.service: Deactivated successfully. Dec 15 03:37:03 localhost podman[88215]: 2025-12-15 08:37:03.927536347 +0000 UTC m=+0.246173985 container health_status 2649c3aec9af2f04509e1d248b1050b202fccc3ed2b86421c89a2a65376db057 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, batch=17.1_20251118.1, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.component=openstack-iscsid-container, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, release=1761123044, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-type=git, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, managed_by=tripleo_ansible, version=17.1.12, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, vendor=Red Hat, Inc., build-date=2025-11-18T23:44:13Z, description=Red Hat OpenStack Platform 17.1 iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '182e509007ab5e6e5b2500a552cbd5ba'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', 
'/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, architecture=x86_64, name=rhosp17/openstack-iscsid, io.openshift.expose-services=, url=https://www.redhat.com, container_name=iscsid, config_id=tripleo_step3, io.buildah.version=1.41.4) Dec 15 03:37:03 localhost podman[88215]: 2025-12-15 08:37:03.963367356 +0000 UTC m=+0.282005014 container exec_died 2649c3aec9af2f04509e1d248b1050b202fccc3ed2b86421c89a2a65376db057 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, release=1761123044, url=https://www.redhat.com, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '182e509007ab5e6e5b2500a552cbd5ba'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', 
'/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, build-date=2025-11-18T23:44:13Z, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public, vcs-type=git, container_name=iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, tcib_managed=true, com.redhat.component=openstack-iscsid-container, architecture=x86_64, io.buildah.version=1.41.4, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible, name=rhosp17/openstack-iscsid, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, config_id=tripleo_step3, description=Red Hat OpenStack Platform 17.1 iscsid, batch=17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, version=17.1.12, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid) Dec 15 03:37:03 localhost systemd[1]: 2649c3aec9af2f04509e1d248b1050b202fccc3ed2b86421c89a2a65376db057.service: Deactivated successfully. 
Dec 15 03:37:04 localhost podman[88222]: 2025-12-15 08:37:03.86670814 +0000 UTC m=+0.177588191 container health_status 97115ff7c5a84ae7e17f79e696e94da3c63f9fe0051287fe9193bec282a03f99 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, version=17.1.12, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, url=https://www.redhat.com, build-date=2025-11-19T00:12:45Z, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, container_name=ceilometer_agent_ipmi, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.buildah.version=1.41.4, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, tcib_managed=true, batch=17.1_20251118.1, com.redhat.component=openstack-ceilometer-ipmi-container, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'fcee5a4a91f85471fca7b61211375646'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, 
io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, architecture=x86_64, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, name=rhosp17/openstack-ceilometer-ipmi, release=1761123044, config_id=tripleo_step4, vcs-type=git, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05) Dec 15 03:37:04 localhost podman[88222]: 2025-12-15 08:37:04.050326051 +0000 UTC m=+0.361206122 container exec_died 97115ff7c5a84ae7e17f79e696e94da3c63f9fe0051287fe9193bec282a03f99 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, url=https://www.redhat.com, container_name=ceilometer_agent_ipmi, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'fcee5a4a91f85471fca7b61211375646'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, release=1761123044, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.buildah.version=1.41.4, io.openshift.expose-services=, batch=17.1_20251118.1, com.redhat.component=openstack-ceilometer-ipmi-container, distribution-scope=public, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, name=rhosp17/openstack-ceilometer-ipmi, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, tcib_managed=true, version=17.1.12, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.12 17.1_20251118.1, build-date=2025-11-19T00:12:45Z, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, config_id=tripleo_step4, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream) Dec 15 03:37:04 localhost systemd[1]: 97115ff7c5a84ae7e17f79e696e94da3c63f9fe0051287fe9193bec282a03f99.service: Deactivated successfully. 
Dec 15 03:37:04 localhost podman[88234]: 2025-12-15 08:37:04.099586538 +0000 UTC m=+0.409616106 container exec_died ac45612fab357b615b4684dbfa5bd000dd29eb1adfbaedb6db0802fc49e89549 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, summary=Red Hat OpenStack Platform 17.1 cron, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, release=1761123044, description=Red Hat OpenStack Platform 17.1 cron, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-cron, config_id=tripleo_step4, batch=17.1_20251118.1, com.redhat.component=openstack-cron-container, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, build-date=2025-11-18T22:49:32Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, version=17.1.12, url=https://www.redhat.com, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, container_name=logrotate_crond, vcs-type=git, vendor=Red Hat, Inc., architecture=x86_64, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, io.openshift.expose-services=, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream) Dec 15 03:37:04 localhost systemd[1]: ac45612fab357b615b4684dbfa5bd000dd29eb1adfbaedb6db0802fc49e89549.service: Deactivated successfully. Dec 15 03:37:06 localhost systemd[1]: Started /usr/bin/podman healthcheck run 4c3f871f4bdf287b1d3ce717956c1cf130617489217410eadf6fffcc440eb3b0. Dec 15 03:37:06 localhost systemd[1]: tmp-crun.4lqb3B.mount: Deactivated successfully. Dec 15 03:37:06 localhost podman[88353]: 2025-12-15 08:37:06.751458148 +0000 UTC m=+0.086965087 container health_status 4c3f871f4bdf287b1d3ce717956c1cf130617489217410eadf6fffcc440eb3b0 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2025-11-19T00:36:58Z, release=1761123044, io.openshift.expose-services=, io.buildah.version=1.41.4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, konflux.additional-tags=17.1.12 17.1_20251118.1, distribution-scope=public, tcib_managed=true, container_name=nova_migration_target, summary=Red Hat OpenStack Platform 17.1 
nova-compute, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '879500e96bf8dfb93687004bd86f2317'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, vcs-type=git, architecture=x86_64, batch=17.1_20251118.1, managed_by=tripleo_ansible, vendor=Red Hat, Inc., com.redhat.component=openstack-nova-compute-container, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, name=rhosp17/openstack-nova-compute, config_id=tripleo_step4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, version=17.1.12) Dec 15 03:37:07 localhost podman[88353]: 2025-12-15 08:37:07.129622862 +0000 UTC m=+0.465129791 container exec_died 4c3f871f4bdf287b1d3ce717956c1cf130617489217410eadf6fffcc440eb3b0 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 nova-compute, 
url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, architecture=x86_64, release=1761123044, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '879500e96bf8dfb93687004bd86f2317'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, name=rhosp17/openstack-nova-compute, config_id=tripleo_step4, build-date=2025-11-19T00:36:58Z, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, container_name=nova_migration_target, managed_by=tripleo_ansible, distribution-scope=public, 
io.buildah.version=1.41.4, tcib_managed=true, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, vendor=Red Hat, Inc., vcs-type=git, com.redhat.component=openstack-nova-compute-container, version=17.1.12) Dec 15 03:37:07 localhost systemd[1]: 4c3f871f4bdf287b1d3ce717956c1cf130617489217410eadf6fffcc440eb3b0.service: Deactivated successfully. Dec 15 03:37:09 localhost systemd[1]: Started /usr/bin/podman healthcheck run 2df8d20c6ba32f38bd0e6519c9c77a4146b632f21c81197d8c319b223aad81f1. Dec 15 03:37:09 localhost systemd[1]: Started /usr/bin/podman healthcheck run 4d35b7dc3f3a4e3c60661e85d3511f50804e87244d74cab67af5c6e8c447d379. Dec 15 03:37:09 localhost systemd[1]: tmp-crun.tS9dUx.mount: Deactivated successfully. Dec 15 03:37:09 localhost podman[88376]: 2025-12-15 08:37:09.755070834 +0000 UTC m=+0.084423099 container health_status 2df8d20c6ba32f38bd0e6519c9c77a4146b632f21c81197d8c319b223aad81f1 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, config_id=tripleo_step4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, konflux.additional-tags=17.1.12 17.1_20251118.1, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, url=https://www.redhat.com, container_name=ovn_controller, 
release=1761123044, io.buildah.version=1.41.4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, version=17.1.12, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 ovn-controller, architecture=x86_64, vcs-type=git, managed_by=tripleo_ansible, batch=17.1_20251118.1, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-ovn-controller, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, com.redhat.component=openstack-ovn-controller-container, io.openshift.expose-services=, build-date=2025-11-18T23:34:05Z, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, summary=Red Hat OpenStack Platform 17.1 ovn-controller) Dec 15 03:37:09 localhost podman[88376]: 2025-12-15 08:37:09.771617046 +0000 UTC m=+0.100969391 container exec_died 2df8d20c6ba32f38bd0e6519c9c77a4146b632f21c81197d8c319b223aad81f1 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, konflux.additional-tags=17.1.12 17.1_20251118.1, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, managed_by=tripleo_ansible, build-date=2025-11-18T23:34:05Z, com.redhat.component=openstack-ovn-controller-container, architecture=x86_64, tcib_managed=true, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, vendor=Red Hat, Inc., container_name=ovn_controller, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, 
distribution-scope=public, release=1761123044, summary=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.expose-services=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, name=rhosp17/openstack-ovn-controller, description=Red Hat OpenStack Platform 17.1 ovn-controller, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, batch=17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, maintainer=OpenStack TripleO Team, vcs-type=git, version=17.1.12, config_id=tripleo_step4) Dec 15 03:37:09 localhost systemd[1]: 2df8d20c6ba32f38bd0e6519c9c77a4146b632f21c81197d8c319b223aad81f1.service: Deactivated successfully. 
Dec 15 03:37:09 localhost podman[88377]: 2025-12-15 08:37:09.85737461 +0000 UTC m=+0.183354545 container health_status 4d35b7dc3f3a4e3c60661e85d3511f50804e87244d74cab67af5c6e8c447d379 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, tcib_managed=true, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, version=17.1.12, konflux.additional-tags=17.1.12 17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_id=tripleo_step4, managed_by=tripleo_ansible, batch=17.1_20251118.1, distribution-scope=public, container_name=ovn_metadata_agent, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, vendor=Red Hat, Inc., vcs-type=git, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, architecture=x86_64, name=rhosp17/openstack-neutron-metadata-agent-ovn, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'a56a6f14b467cd9064e40c03defa5ed7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, io.openshift.expose-services=, release=1761123044, url=https://www.redhat.com, io.buildah.version=1.41.4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, maintainer=OpenStack TripleO Team, build-date=2025-11-19T00:14:25Z) Dec 15 03:37:09 localhost podman[88377]: 2025-12-15 08:37:09.91458953 +0000 UTC m=+0.240569455 container exec_died 4d35b7dc3f3a4e3c60661e85d3511f50804e87244d74cab67af5c6e8c447d379 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.expose-services=, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'a56a6f14b467cd9064e40c03defa5ed7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, version=17.1.12, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.12 17.1_20251118.1, tcib_managed=true, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, architecture=x86_64, batch=17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, distribution-scope=public, vcs-type=git, config_id=tripleo_step4, vendor=Red Hat, Inc., url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, release=1761123044, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, 
build-date=2025-11-19T00:14:25Z, managed_by=tripleo_ansible, name=rhosp17/openstack-neutron-metadata-agent-ovn, container_name=ovn_metadata_agent) Dec 15 03:37:09 localhost systemd[1]: 4d35b7dc3f3a4e3c60661e85d3511f50804e87244d74cab67af5c6e8c447d379.service: Deactivated successfully. Dec 15 03:37:18 localhost systemd[1]: Started /usr/bin/podman healthcheck run 6278d648aec9c9f12e0efaafa0d6ae5653eeb34459516699e562eeb9c8d23b3c. Dec 15 03:37:18 localhost systemd[1]: tmp-crun.7sORLB.mount: Deactivated successfully. Dec 15 03:37:18 localhost podman[88425]: 2025-12-15 08:37:18.758151212 +0000 UTC m=+0.088505239 container health_status 6278d648aec9c9f12e0efaafa0d6ae5653eeb34459516699e562eeb9c8d23b3c (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, description=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, tcib_managed=true, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, container_name=metrics_qdr, summary=Red Hat OpenStack Platform 17.1 qdrouterd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, io.openshift.expose-services=, com.redhat.component=openstack-qdrouterd-container, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, managed_by=tripleo_ansible, konflux.additional-tags=17.1.12 17.1_20251118.1, release=1761123044, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '29d07da103bbd44a9ed3e29999314b03'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, name=rhosp17/openstack-qdrouterd, io.buildah.version=1.41.4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20251118.1, vendor=Red Hat, Inc., config_id=tripleo_step1, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, distribution-scope=public, architecture=x86_64, vcs-type=git, url=https://www.redhat.com, build-date=2025-11-18T22:49:46Z, version=17.1.12) Dec 15 03:37:18 localhost podman[88425]: 2025-12-15 08:37:18.975689 +0000 UTC m=+0.306042997 container exec_died 6278d648aec9c9f12e0efaafa0d6ae5653eeb34459516699e562eeb9c8d23b3c (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, description=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '29d07da103bbd44a9ed3e29999314b03'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 
'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, name=rhosp17/openstack-qdrouterd, com.redhat.component=openstack-qdrouterd-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_id=tripleo_step1, maintainer=OpenStack TripleO Team, tcib_managed=true, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 qdrouterd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, url=https://www.redhat.com, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, managed_by=tripleo_ansible, konflux.additional-tags=17.1.12 17.1_20251118.1, container_name=metrics_qdr, build-date=2025-11-18T22:49:46Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, version=17.1.12, release=1761123044, architecture=x86_64, batch=17.1_20251118.1, vendor=Red Hat, Inc., distribution-scope=public) Dec 15 03:37:18 localhost systemd[1]: 6278d648aec9c9f12e0efaafa0d6ae5653eeb34459516699e562eeb9c8d23b3c.service: Deactivated successfully. 
Dec 15 03:37:34 localhost systemd[1]: Started /usr/bin/podman healthcheck run 165a6fd1f2b629d16da0798ec1c9bac41d302d530a0ffa668b2315a52d6aa04e. Dec 15 03:37:34 localhost systemd[1]: Started /usr/bin/podman healthcheck run 2649c3aec9af2f04509e1d248b1050b202fccc3ed2b86421c89a2a65376db057. Dec 15 03:37:34 localhost systemd[1]: Started /usr/bin/podman healthcheck run 36a88083ed613f6871d2631ae462bca308a45ba583333f7cfe4d0b0c40a18ba5. Dec 15 03:37:34 localhost systemd[1]: Started /usr/bin/podman healthcheck run 97115ff7c5a84ae7e17f79e696e94da3c63f9fe0051287fe9193bec282a03f99. Dec 15 03:37:34 localhost systemd[1]: Started /usr/bin/podman healthcheck run ac45612fab357b615b4684dbfa5bd000dd29eb1adfbaedb6db0802fc49e89549. Dec 15 03:37:34 localhost systemd[1]: Started /usr/bin/podman healthcheck run d6041e5471624a495d5be42684f6049ccf71802a80d5760239330151d465d146. Dec 15 03:37:34 localhost systemd[1]: tmp-crun.09EPn2.mount: Deactivated successfully. Dec 15 03:37:34 localhost systemd[1]: tmp-crun.EHGZDB.mount: Deactivated successfully. 
Dec 15 03:37:34 localhost podman[88595]: 2025-12-15 08:37:34.876532875 +0000 UTC m=+0.181063344 container health_status d6041e5471624a495d5be42684f6049ccf71802a80d5760239330151d465d146 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, vendor=Red Hat, Inc., konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, config_id=tripleo_step4, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, build-date=2025-11-19T00:11:48Z, batch=17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.component=openstack-ceilometer-compute-container, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'fcee5a4a91f85471fca7b61211375646'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.buildah.version=1.41.4, name=rhosp17/openstack-ceilometer-compute, version=17.1.12, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, managed_by=tripleo_ansible, container_name=ceilometer_agent_compute, release=1761123044, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-type=git, io.openshift.expose-services=, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676) Dec 15 03:37:34 localhost podman[88576]: 2025-12-15 08:37:34.840055559 +0000 UTC m=+0.156718882 container health_status 2649c3aec9af2f04509e1d248b1050b202fccc3ed2b86421c89a2a65376db057 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, architecture=x86_64, vcs-type=git, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, maintainer=OpenStack TripleO Team, container_name=iscsid, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, tcib_managed=true, com.redhat.component=openstack-iscsid-container, io.openshift.expose-services=, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, release=1761123044, batch=17.1_20251118.1, version=17.1.12, config_id=tripleo_step3, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public, description=Red Hat OpenStack 
Platform 17.1 iscsid, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, build-date=2025-11-18T23:44:13Z, summary=Red Hat OpenStack Platform 17.1 iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '182e509007ab5e6e5b2500a552cbd5ba'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, io.buildah.version=1.41.4, name=rhosp17/openstack-iscsid) Dec 15 03:37:34 localhost podman[88579]: 2025-12-15 08:37:34.934494084 +0000 UTC m=+0.241279664 container health_status ac45612fab357b615b4684dbfa5bd000dd29eb1adfbaedb6db0802fc49e89549 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, batch=17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.12, description=Red Hat OpenStack Platform 17.1 cron, io.buildah.version=1.41.4, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.expose-services=, vendor=Red Hat, Inc., release=1761123044, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, container_name=logrotate_crond, build-date=2025-11-18T22:49:32Z, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, name=rhosp17/openstack-cron, architecture=x86_64, distribution-scope=public, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, url=https://www.redhat.com, vcs-type=git, konflux.additional-tags=17.1.12 17.1_20251118.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, config_id=tripleo_step4, com.redhat.component=openstack-cron-container, summary=Red Hat OpenStack Platform 17.1 cron, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, 
managed_by=tripleo_ansible, tcib_managed=true) Dec 15 03:37:34 localhost podman[88579]: 2025-12-15 08:37:34.9676085 +0000 UTC m=+0.274394080 container exec_died ac45612fab357b615b4684dbfa5bd000dd29eb1adfbaedb6db0802fc49e89549 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, io.openshift.expose-services=, batch=17.1_20251118.1, build-date=2025-11-18T22:49:32Z, release=1761123044, com.redhat.component=openstack-cron-container, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, config_id=tripleo_step4, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, vcs-type=git, description=Red Hat OpenStack Platform 17.1 cron, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vendor=Red Hat, Inc., konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible, architecture=x86_64, distribution-scope=public, container_name=logrotate_crond, summary=Red Hat OpenStack Platform 17.1 cron, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, url=https://www.redhat.com, name=rhosp17/openstack-cron, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.4, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, tcib_managed=true, version=17.1.12) Dec 15 03:37:34 localhost podman[88575]: 2025-12-15 08:37:34.976778435 +0000 UTC m=+0.293412617 container health_status 165a6fd1f2b629d16da0798ec1c9bac41d302d530a0ffa668b2315a52d6aa04e (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', 
'/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, tcib_managed=true, com.redhat.component=openstack-collectd-container, container_name=collectd, batch=17.1_20251118.1, build-date=2025-11-18T22:51:28Z, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, distribution-scope=public, maintainer=OpenStack TripleO Team, release=1761123044, vcs-type=git, description=Red Hat OpenStack Platform 17.1 collectd, summary=Red Hat OpenStack Platform 17.1 collectd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vendor=Red Hat, Inc., managed_by=tripleo_ansible, url=https://www.redhat.com, name=rhosp17/openstack-collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, version=17.1.12, config_id=tripleo_step3, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1) Dec 15 03:37:34 localhost systemd[1]: ac45612fab357b615b4684dbfa5bd000dd29eb1adfbaedb6db0802fc49e89549.service: Deactivated successfully. 
Dec 15 03:37:34 localhost podman[88575]: 2025-12-15 08:37:34.984650646 +0000 UTC m=+0.301284808 container exec_died 165a6fd1f2b629d16da0798ec1c9bac41d302d530a0ffa668b2315a52d6aa04e (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, config_id=tripleo_step3, summary=Red Hat OpenStack Platform 17.1 collectd, url=https://www.redhat.com, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, architecture=x86_64, name=rhosp17/openstack-collectd, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, build-date=2025-11-18T22:51:28Z, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, 
release=1761123044, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, batch=17.1_20251118.1, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 17.1_20251118.1, distribution-scope=public, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, version=17.1.12, vcs-type=git, managed_by=tripleo_ansible, com.redhat.component=openstack-collectd-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, container_name=collectd, io.buildah.version=1.41.4) Dec 15 03:37:35 localhost podman[88595]: 2025-12-15 08:37:35.002923595 +0000 UTC m=+0.307454134 container exec_died d6041e5471624a495d5be42684f6049ccf71802a80d5760239330151d465d146 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, version=17.1.12, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, container_name=ceilometer_agent_compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'fcee5a4a91f85471fca7b61211375646'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, distribution-scope=public, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, name=rhosp17/openstack-ceilometer-compute, config_id=tripleo_step4, com.redhat.component=openstack-ceilometer-compute-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20251118.1, maintainer=OpenStack TripleO Team, vcs-type=git, release=1761123044, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.buildah.version=1.41.4, io.openshift.expose-services=, vendor=Red Hat, Inc., konflux.additional-tags=17.1.12 17.1_20251118.1, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, build-date=2025-11-19T00:11:48Z) Dec 15 03:37:35 localhost systemd[1]: 165a6fd1f2b629d16da0798ec1c9bac41d302d530a0ffa668b2315a52d6aa04e.service: Deactivated successfully. Dec 15 03:37:35 localhost systemd[1]: d6041e5471624a495d5be42684f6049ccf71802a80d5760239330151d465d146.service: Deactivated successfully. 
Dec 15 03:37:35 localhost podman[88576]: 2025-12-15 08:37:35.02029747 +0000 UTC m=+0.336960823 container exec_died 2649c3aec9af2f04509e1d248b1050b202fccc3ed2b86421c89a2a65376db057 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '182e509007ab5e6e5b2500a552cbd5ba'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1, build-date=2025-11-18T23:44:13Z, com.redhat.component=openstack-iscsid-container, distribution-scope=public, io.openshift.expose-services=, vendor=Red Hat, Inc., org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, io.k8s.description=Red Hat OpenStack Platform 17.1 
iscsid, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.12, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, release=1761123044, io.buildah.version=1.41.4, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, name=rhosp17/openstack-iscsid, container_name=iscsid, vcs-type=git, description=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, konflux.additional-tags=17.1.12 17.1_20251118.1, config_id=tripleo_step3) Dec 15 03:37:35 localhost systemd[1]: 2649c3aec9af2f04509e1d248b1050b202fccc3ed2b86421c89a2a65376db057.service: Deactivated successfully. Dec 15 03:37:35 localhost podman[88577]: 2025-12-15 08:37:35.087257021 +0000 UTC m=+0.404309614 container health_status 36a88083ed613f6871d2631ae462bca308a45ba583333f7cfe4d0b0c40a18ba5 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, version=17.1.12, vcs-type=git, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '182e509007ab5e6e5b2500a552cbd5ba-879500e96bf8dfb93687004bd86f2317'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 nova-compute, release=1761123044, batch=17.1_20251118.1, managed_by=tripleo_ansible, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, name=rhosp17/openstack-nova-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2025-11-19T00:36:58Z, io.buildah.version=1.41.4, config_id=tripleo_step5, container_name=nova_compute, com.redhat.component=openstack-nova-compute-container, tcib_managed=true, vendor=Red Hat, 
Inc.) Dec 15 03:37:35 localhost podman[88578]: 2025-12-15 08:37:34.806138362 +0000 UTC m=+0.113728033 container health_status 97115ff7c5a84ae7e17f79e696e94da3c63f9fe0051287fe9193bec282a03f99 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'fcee5a4a91f85471fca7b61211375646'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, version=17.1.12, io.openshift.expose-services=, tcib_managed=true, architecture=x86_64, url=https://www.redhat.com, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, maintainer=OpenStack TripleO Team, config_id=tripleo_step4, release=1761123044, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, name=rhosp17/openstack-ceilometer-ipmi, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.buildah.version=1.41.4, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, managed_by=tripleo_ansible, 
container_name=ceilometer_agent_ipmi, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.component=openstack-ceilometer-ipmi-container, vendor=Red Hat, Inc., batch=17.1_20251118.1, build-date=2025-11-19T00:12:45Z, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, vcs-type=git, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi) Dec 15 03:37:35 localhost podman[88577]: 2025-12-15 08:37:35.115405793 +0000 UTC m=+0.432458436 container exec_died 36a88083ed613f6871d2631ae462bca308a45ba583333f7cfe4d0b0c40a18ba5 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, architecture=x86_64, batch=17.1_20251118.1, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 nova-compute, distribution-scope=public, release=1761123044, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, vcs-type=git, managed_by=tripleo_ansible, name=rhosp17/openstack-nova-compute, build-date=2025-11-19T00:36:58Z, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '182e509007ab5e6e5b2500a552cbd5ba-879500e96bf8dfb93687004bd86f2317'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, io.buildah.version=1.41.4, config_id=tripleo_step5, container_name=nova_compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, version=17.1.12, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, com.redhat.component=openstack-nova-compute-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.12 17.1_20251118.1) Dec 15 03:37:35 localhost systemd[1]: 
36a88083ed613f6871d2631ae462bca308a45ba583333f7cfe4d0b0c40a18ba5.service: Deactivated successfully. Dec 15 03:37:35 localhost podman[88578]: 2025-12-15 08:37:35.136826207 +0000 UTC m=+0.444415898 container exec_died 97115ff7c5a84ae7e17f79e696e94da3c63f9fe0051287fe9193bec282a03f99 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, config_id=tripleo_step4, io.buildah.version=1.41.4, batch=17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, name=rhosp17/openstack-ceilometer-ipmi, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., managed_by=tripleo_ansible, vcs-type=git, version=17.1.12, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-ceilometer-ipmi-container, url=https://www.redhat.com, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'fcee5a4a91f85471fca7b61211375646'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, build-date=2025-11-19T00:12:45Z, container_name=ceilometer_agent_ipmi, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, release=1761123044, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, distribution-scope=public, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.expose-services=) Dec 15 03:37:35 localhost systemd[1]: 97115ff7c5a84ae7e17f79e696e94da3c63f9fe0051287fe9193bec282a03f99.service: Deactivated successfully. Dec 15 03:37:35 localhost systemd[1]: tmp-crun.yACgZx.mount: Deactivated successfully. Dec 15 03:37:37 localhost systemd[1]: Started /usr/bin/podman healthcheck run 4c3f871f4bdf287b1d3ce717956c1cf130617489217410eadf6fffcc440eb3b0. Dec 15 03:37:37 localhost systemd[1]: tmp-crun.sAjU4U.mount: Deactivated successfully. 
Dec 15 03:37:37 localhost podman[88706]: 2025-12-15 08:37:37.765165395 +0000 UTC m=+0.093699487 container health_status 4c3f871f4bdf287b1d3ce717956c1cf130617489217410eadf6fffcc440eb3b0 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, batch=17.1_20251118.1, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp17/openstack-nova-compute, version=17.1.12, vcs-type=git, description=Red Hat OpenStack Platform 17.1 nova-compute, distribution-scope=public, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, container_name=nova_migration_target, io.openshift.expose-services=, build-date=2025-11-19T00:36:58Z, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '879500e96bf8dfb93687004bd86f2317'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, vendor=Red Hat, 
Inc., managed_by=tripleo_ansible, config_id=tripleo_step4, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.component=openstack-nova-compute-container, url=https://www.redhat.com, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, tcib_managed=true, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, release=1761123044) Dec 15 03:37:38 localhost podman[88706]: 2025-12-15 08:37:38.137644858 +0000 UTC m=+0.466179000 container exec_died 4c3f871f4bdf287b1d3ce717956c1cf130617489217410eadf6fffcc440eb3b0 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_id=tripleo_step4, io.buildah.version=1.41.4, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.12, io.openshift.expose-services=, name=rhosp17/openstack-nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20251118.1, com.redhat.component=openstack-nova-compute-container, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, release=1761123044, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, 
build-date=2025-11-19T00:36:58Z, architecture=x86_64, distribution-scope=public, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '879500e96bf8dfb93687004bd86f2317'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, tcib_managed=true, vendor=Red Hat, Inc., container_name=nova_migration_target) Dec 15 03:37:38 localhost systemd[1]: 4c3f871f4bdf287b1d3ce717956c1cf130617489217410eadf6fffcc440eb3b0.service: Deactivated successfully. Dec 15 03:37:40 localhost systemd[1]: Started /usr/bin/podman healthcheck run 2df8d20c6ba32f38bd0e6519c9c77a4146b632f21c81197d8c319b223aad81f1. Dec 15 03:37:40 localhost systemd[1]: Started /usr/bin/podman healthcheck run 4d35b7dc3f3a4e3c60661e85d3511f50804e87244d74cab67af5c6e8c447d379. 
Dec 15 03:37:40 localhost podman[88730]: 2025-12-15 08:37:40.754614521 +0000 UTC m=+0.079996961 container health_status 4d35b7dc3f3a4e3c60661e85d3511f50804e87244d74cab67af5c6e8c447d379 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, release=1761123044, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, maintainer=OpenStack TripleO Team, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'a56a6f14b467cd9064e40c03defa5ed7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', 
'/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, distribution-scope=public, name=rhosp17/openstack-neutron-metadata-agent-ovn, build-date=2025-11-19T00:14:25Z, tcib_managed=true, vendor=Red Hat, Inc., version=17.1.12, io.openshift.expose-services=, io.buildah.version=1.41.4, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-type=git, batch=17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, container_name=ovn_metadata_agent, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, config_id=tripleo_step4, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream) Dec 15 03:37:40 localhost podman[88730]: 2025-12-15 08:37:40.794441166 +0000 UTC m=+0.119823606 container exec_died 4d35b7dc3f3a4e3c60661e85d3511f50804e87244d74cab67af5c6e8c447d379 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, konflux.additional-tags=17.1.12 17.1_20251118.1, tcib_managed=true, maintainer=OpenStack TripleO Team, vcs-type=git, io.buildah.version=1.41.4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.12, build-date=2025-11-19T00:14:25Z, config_data={'cgroupns': 'host', 
'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'a56a6f14b467cd9064e40c03defa5ed7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, release=1761123044, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_id=tripleo_step4, container_name=ovn_metadata_agent, vendor=Red Hat, Inc., url=https://www.redhat.com, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, name=rhosp17/openstack-neutron-metadata-agent-ovn, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, 
managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, batch=17.1_20251118.1) Dec 15 03:37:40 localhost systemd[1]: 4d35b7dc3f3a4e3c60661e85d3511f50804e87244d74cab67af5c6e8c447d379.service: Deactivated successfully. Dec 15 03:37:40 localhost podman[88729]: 2025-12-15 08:37:40.804221398 +0000 UTC m=+0.133665646 container health_status 2df8d20c6ba32f38bd0e6519c9c77a4146b632f21c81197d8c319b223aad81f1 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, summary=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, name=rhosp17/openstack-ovn-controller, distribution-scope=public, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, konflux.additional-tags=17.1.12 17.1_20251118.1, build-date=2025-11-18T23:34:05Z, description=Red Hat OpenStack Platform 17.1 ovn-controller, config_id=tripleo_step4, version=17.1.12, maintainer=OpenStack TripleO Team, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, vcs-type=git, container_name=ovn_controller, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, io.buildah.version=1.41.4, tcib_managed=true, com.redhat.component=openstack-ovn-controller-container, managed_by=tripleo_ansible, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, batch=17.1_20251118.1, io.openshift.expose-services=, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, release=1761123044, url=https://www.redhat.com, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI) Dec 15 03:37:40 localhost podman[88729]: 2025-12-15 08:37:40.887676599 +0000 UTC m=+0.217120767 container exec_died 2df8d20c6ba32f38bd0e6519c9c77a4146b632f21c81197d8c319b223aad81f1 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-18T23:34:05Z, description=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, architecture=x86_64, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, release=1761123044, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, com.redhat.component=openstack-ovn-controller-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.buildah.version=1.41.4, 
io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, config_id=tripleo_step4, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, vcs-type=git, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 ovn-controller, version=17.1.12, name=rhosp17/openstack-ovn-controller, tcib_managed=true, url=https://www.redhat.com, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, maintainer=OpenStack TripleO Team, container_name=ovn_controller, batch=17.1_20251118.1) Dec 15 03:37:40 localhost systemd[1]: 2df8d20c6ba32f38bd0e6519c9c77a4146b632f21c81197d8c319b223aad81f1.service: Deactivated successfully. Dec 15 03:37:49 localhost systemd[1]: Started /usr/bin/podman healthcheck run 6278d648aec9c9f12e0efaafa0d6ae5653eeb34459516699e562eeb9c8d23b3c. Dec 15 03:37:49 localhost systemd[1]: tmp-crun.3zew45.mount: Deactivated successfully. 
Dec 15 03:37:49 localhost podman[88775]: 2025-12-15 08:37:49.769762861 +0000 UTC m=+0.104059145 container health_status 6278d648aec9c9f12e0efaafa0d6ae5653eeb34459516699e562eeb9c8d23b3c (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, release=1761123044, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, architecture=x86_64, url=https://www.redhat.com, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, tcib_managed=true, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 qdrouterd, version=17.1.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.12 17.1_20251118.1, io.buildah.version=1.41.4, com.redhat.component=openstack-qdrouterd-container, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '29d07da103bbd44a9ed3e29999314b03'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, batch=17.1_20251118.1, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_id=tripleo_step1, container_name=metrics_qdr, name=rhosp17/openstack-qdrouterd, summary=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-type=git, io.openshift.expose-services=, build-date=2025-11-18T22:49:46Z, managed_by=tripleo_ansible) Dec 15 03:37:50 localhost podman[88775]: 2025-12-15 08:37:50.083526963 +0000 UTC m=+0.417823177 container exec_died 6278d648aec9c9f12e0efaafa0d6ae5653eeb34459516699e562eeb9c8d23b3c (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, name=rhosp17/openstack-qdrouterd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '29d07da103bbd44a9ed3e29999314b03'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, config_id=tripleo_step1, io.openshift.expose-services=, tcib_managed=true, version=17.1.12, architecture=x86_64, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.component=openstack-qdrouterd-container, io.buildah.version=1.41.4, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, container_name=metrics_qdr, vendor=Red Hat, Inc., distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, summary=Red Hat OpenStack Platform 17.1 qdrouterd, batch=17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, release=1761123044, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, build-date=2025-11-18T22:49:46Z) Dec 15 03:37:50 localhost systemd[1]: 6278d648aec9c9f12e0efaafa0d6ae5653eeb34459516699e562eeb9c8d23b3c.service: Deactivated successfully. Dec 15 03:37:52 localhost sshd[88804]: main: sshd: ssh-rsa algorithm is disabled Dec 15 03:38:05 localhost systemd[1]: Started /usr/bin/podman healthcheck run 165a6fd1f2b629d16da0798ec1c9bac41d302d530a0ffa668b2315a52d6aa04e. Dec 15 03:38:05 localhost systemd[1]: Started /usr/bin/podman healthcheck run 2649c3aec9af2f04509e1d248b1050b202fccc3ed2b86421c89a2a65376db057. Dec 15 03:38:05 localhost systemd[1]: Started /usr/bin/podman healthcheck run 36a88083ed613f6871d2631ae462bca308a45ba583333f7cfe4d0b0c40a18ba5. 
Dec 15 03:38:05 localhost systemd[1]: Started /usr/bin/podman healthcheck run 97115ff7c5a84ae7e17f79e696e94da3c63f9fe0051287fe9193bec282a03f99. Dec 15 03:38:05 localhost systemd[1]: Started /usr/bin/podman healthcheck run ac45612fab357b615b4684dbfa5bd000dd29eb1adfbaedb6db0802fc49e89549. Dec 15 03:38:05 localhost systemd[1]: Started /usr/bin/podman healthcheck run d6041e5471624a495d5be42684f6049ccf71802a80d5760239330151d465d146. Dec 15 03:38:05 localhost podman[88809]: 2025-12-15 08:38:05.769145111 +0000 UTC m=+0.084309536 container health_status 36a88083ed613f6871d2631ae462bca308a45ba583333f7cfe4d0b0c40a18ba5 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '182e509007ab5e6e5b2500a552cbd5ba-879500e96bf8dfb93687004bd86f2317'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', 
'/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, version=17.1.12, distribution-scope=public, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_id=tripleo_step5, container_name=nova_compute, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.component=openstack-nova-compute-container, batch=17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 nova-compute, release=1761123044, name=rhosp17/openstack-nova-compute, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, vcs-type=git, build-date=2025-11-19T00:36:58Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, io.buildah.version=1.41.4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, vendor=Red Hat, Inc.) 
Dec 15 03:38:05 localhost podman[88818]: 2025-12-15 08:38:05.804330763 +0000 UTC m=+0.110943979 container health_status d6041e5471624a495d5be42684f6049ccf71802a80d5760239330151d465d146 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'fcee5a4a91f85471fca7b61211375646'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, architecture=x86_64, version=17.1.12, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, build-date=2025-11-19T00:11:48Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.expose-services=, vendor=Red Hat, Inc., io.buildah.version=1.41.4, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, 
tcib_managed=true, name=rhosp17/openstack-ceilometer-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, maintainer=OpenStack TripleO Team, batch=17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, release=1761123044, config_id=tripleo_step4, konflux.additional-tags=17.1.12 17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.component=openstack-ceilometer-compute-container, container_name=ceilometer_agent_compute, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-type=git) Dec 15 03:38:05 localhost podman[88808]: 2025-12-15 08:38:05.827004459 +0000 UTC m=+0.142382539 container health_status 2649c3aec9af2f04509e1d248b1050b202fccc3ed2b86421c89a2a65376db057 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, tcib_managed=true, build-date=2025-11-18T23:44:13Z, com.redhat.component=openstack-iscsid-container, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, vendor=Red Hat, Inc., distribution-scope=public, release=1761123044, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, konflux.additional-tags=17.1.12 17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, architecture=x86_64, batch=17.1_20251118.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '182e509007ab5e6e5b2500a552cbd5ba'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, managed_by=tripleo_ansible, version=17.1.12, config_id=tripleo_step3, url=https://www.redhat.com, name=rhosp17/openstack-iscsid, io.buildah.version=1.41.4, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, container_name=iscsid, description=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 iscsid, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d) Dec 15 03:38:05 localhost podman[88807]: 2025-12-15 08:38:05.882058702 +0000 UTC m=+0.203206347 container health_status 165a6fd1f2b629d16da0798ec1c9bac41d302d530a0ffa668b2315a52d6aa04e (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.4, konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible, config_id=tripleo_step3, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, container_name=collectd, io.openshift.expose-services=, vcs-type=git, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 
'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1761123044, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, version=17.1.12, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, name=rhosp17/openstack-collectd, build-date=2025-11-18T22:51:28Z, summary=Red Hat OpenStack Platform 17.1 collectd, url=https://www.redhat.com, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, tcib_managed=true, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, 
batch=17.1_20251118.1, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 collectd, com.redhat.component=openstack-collectd-container) Dec 15 03:38:05 localhost podman[88807]: 2025-12-15 08:38:05.898382618 +0000 UTC m=+0.219530273 container exec_died 165a6fd1f2b629d16da0798ec1c9bac41d302d530a0ffa668b2315a52d6aa04e (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.buildah.version=1.41.4, vendor=Red Hat, Inc., architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, build-date=2025-11-18T22:51:28Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, url=https://www.redhat.com, name=rhosp17/openstack-collectd, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.12 17.1_20251118.1, version=17.1.12, description=Red Hat OpenStack Platform 17.1 collectd, io.openshift.expose-services=, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, config_id=tripleo_step3, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, maintainer=OpenStack TripleO Team, release=1761123044, distribution-scope=public, batch=17.1_20251118.1, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 collectd, container_name=collectd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.component=openstack-collectd-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vcs-type=git) Dec 15 03:38:05 localhost podman[88808]: 2025-12-15 08:38:05.911496819 +0000 UTC m=+0.226874929 container exec_died 2649c3aec9af2f04509e1d248b1050b202fccc3ed2b86421c89a2a65376db057 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, com.redhat.component=openstack-iscsid-container, name=rhosp17/openstack-iscsid, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '182e509007ab5e6e5b2500a552cbd5ba'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, version=17.1.12, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, architecture=x86_64, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, url=https://www.redhat.com, io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, batch=17.1_20251118.1, io.buildah.version=1.41.4, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 iscsid, tcib_managed=true, build-date=2025-11-18T23:44:13Z, distribution-scope=public, release=1761123044, container_name=iscsid, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_id=tripleo_step3, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 iscsid) Dec 15 03:38:05 localhost systemd[1]: 165a6fd1f2b629d16da0798ec1c9bac41d302d530a0ffa668b2315a52d6aa04e.service: Deactivated successfully. 
Dec 15 03:38:05 localhost podman[88818]: 2025-12-15 08:38:05.940452533 +0000 UTC m=+0.247065769 container exec_died d6041e5471624a495d5be42684f6049ccf71802a80d5760239330151d465d146 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., batch=17.1_20251118.1, release=1761123044, build-date=2025-11-19T00:11:48Z, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'fcee5a4a91f85471fca7b61211375646'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, tcib_managed=true, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, konflux.additional-tags=17.1.12 17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, container_name=ceilometer_agent_compute, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, io.openshift.tags=rhosp osp openstack 
osp-17.1 openstack-ceilometer-compute, io.openshift.expose-services=, io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, com.redhat.component=openstack-ceilometer-compute-container, vcs-type=git, version=17.1.12, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, managed_by=tripleo_ansible, config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, name=rhosp17/openstack-ceilometer-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, architecture=x86_64, url=https://www.redhat.com) Dec 15 03:38:05 localhost systemd[1]: 2649c3aec9af2f04509e1d248b1050b202fccc3ed2b86421c89a2a65376db057.service: Deactivated successfully. Dec 15 03:38:05 localhost systemd[1]: d6041e5471624a495d5be42684f6049ccf71802a80d5760239330151d465d146.service: Deactivated successfully. Dec 15 03:38:05 localhost podman[88809]: 2025-12-15 08:38:05.958886236 +0000 UTC m=+0.274050701 container exec_died 36a88083ed613f6871d2631ae462bca308a45ba583333f7cfe4d0b0c40a18ba5 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, io.buildah.version=1.41.4, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1, container_name=nova_compute, version=17.1.12, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, distribution-scope=public, config_id=tripleo_step5, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': 
{'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '182e509007ab5e6e5b2500a552cbd5ba-879500e96bf8dfb93687004bd86f2317'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, architecture=x86_64, vendor=Red Hat, Inc., com.redhat.component=openstack-nova-compute-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, url=https://www.redhat.com, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 
nova-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, name=rhosp17/openstack-nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2025-11-19T00:36:58Z, tcib_managed=true, release=1761123044) Dec 15 03:38:05 localhost systemd[1]: 36a88083ed613f6871d2631ae462bca308a45ba583333f7cfe4d0b0c40a18ba5.service: Deactivated successfully. Dec 15 03:38:06 localhost podman[88810]: 2025-12-15 08:38:06.045241066 +0000 UTC m=+0.355562881 container health_status 97115ff7c5a84ae7e17f79e696e94da3c63f9fe0051287fe9193bec282a03f99 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1761123044, vendor=Red Hat, Inc., distribution-scope=public, container_name=ceilometer_agent_ipmi, vcs-type=git, maintainer=OpenStack TripleO Team, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, konflux.additional-tags=17.1.12 17.1_20251118.1, batch=17.1_20251118.1, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.expose-services=, config_id=tripleo_step4, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'fcee5a4a91f85471fca7b61211375646'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.buildah.version=1.41.4, url=https://www.redhat.com, build-date=2025-11-19T00:12:45Z, version=17.1.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, tcib_managed=true, architecture=x86_64, com.redhat.component=openstack-ceilometer-ipmi-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, name=rhosp17/openstack-ceilometer-ipmi) Dec 15 03:38:06 localhost podman[88816]: 2025-12-15 08:38:06.096905778 +0000 UTC m=+0.405248070 container health_status ac45612fab357b615b4684dbfa5bd000dd29eb1adfbaedb6db0802fc49e89549 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, release=1761123044, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, managed_by=tripleo_ansible, vendor=Red Hat, Inc., com.redhat.component=openstack-cron-container, konflux.additional-tags=17.1.12 17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 cron, config_id=tripleo_step4, io.openshift.expose-services=, architecture=x86_64, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, version=17.1.12, 
tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, vcs-type=git, description=Red Hat OpenStack Platform 17.1 cron, name=rhosp17/openstack-cron, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, url=https://www.redhat.com, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, batch=17.1_20251118.1, container_name=logrotate_crond, io.buildah.version=1.41.4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, build-date=2025-11-18T22:49:32Z) Dec 15 03:38:06 localhost podman[88816]: 2025-12-15 08:38:06.104881681 +0000 UTC m=+0.413223973 container exec_died ac45612fab357b615b4684dbfa5bd000dd29eb1adfbaedb6db0802fc49e89549 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, maintainer=OpenStack TripleO Team, vcs-type=git, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, version=17.1.12, 
cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, build-date=2025-11-18T22:49:32Z, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, name=rhosp17/openstack-cron, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.component=openstack-cron-container, release=1761123044, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, batch=17.1_20251118.1, konflux.additional-tags=17.1.12 17.1_20251118.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, summary=Red Hat OpenStack Platform 17.1 cron, container_name=logrotate_crond, url=https://www.redhat.com, 
vendor=Red Hat, Inc., config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 cron, io.openshift.expose-services=) Dec 15 03:38:06 localhost systemd[1]: ac45612fab357b615b4684dbfa5bd000dd29eb1adfbaedb6db0802fc49e89549.service: Deactivated successfully. Dec 15 03:38:06 localhost podman[88810]: 2025-12-15 08:38:06.156227354 +0000 UTC m=+0.466549209 container exec_died 97115ff7c5a84ae7e17f79e696e94da3c63f9fe0051287fe9193bec282a03f99 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.expose-services=, vendor=Red Hat, Inc., vcs-type=git, batch=17.1_20251118.1, release=1761123044, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, config_id=tripleo_step4, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, name=rhosp17/openstack-ceilometer-ipmi, com.redhat.component=openstack-ceilometer-ipmi-container, version=17.1.12, url=https://www.redhat.com, architecture=x86_64, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, io.buildah.version=1.41.4, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'fcee5a4a91f85471fca7b61211375646'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, container_name=ceilometer_agent_ipmi, build-date=2025-11-19T00:12:45Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi) Dec 15 03:38:06 localhost systemd[1]: 97115ff7c5a84ae7e17f79e696e94da3c63f9fe0051287fe9193bec282a03f99.service: Deactivated successfully. Dec 15 03:38:06 localhost systemd[1]: tmp-crun.WPn0oW.mount: Deactivated successfully. Dec 15 03:38:08 localhost systemd[1]: Started /usr/bin/podman healthcheck run 4c3f871f4bdf287b1d3ce717956c1cf130617489217410eadf6fffcc440eb3b0. 
Dec 15 03:38:08 localhost podman[88941]: 2025-12-15 08:38:08.752107413 +0000 UTC m=+0.083697640 container health_status 4c3f871f4bdf287b1d3ce717956c1cf130617489217410eadf6fffcc440eb3b0 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, io.openshift.expose-services=, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.12 17.1_20251118.1, tcib_managed=true, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, version=17.1.12, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp17/openstack-nova-compute, container_name=nova_migration_target, description=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.41.4, build-date=2025-11-19T00:36:58Z, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20251118.1, architecture=x86_64, config_id=tripleo_step4, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 nova-compute, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '879500e96bf8dfb93687004bd86f2317'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, com.redhat.component=openstack-nova-compute-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, release=1761123044, url=https://www.redhat.com, distribution-scope=public) Dec 15 03:38:09 localhost podman[88941]: 2025-12-15 08:38:09.191484235 +0000 UTC m=+0.523074462 container exec_died 4c3f871f4bdf287b1d3ce717956c1cf130617489217410eadf6fffcc440eb3b0 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, vendor=Red Hat, Inc., name=rhosp17/openstack-nova-compute, release=1761123044, config_id=tripleo_step4, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.component=openstack-nova-compute-container, description=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, tcib_managed=true, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20251118.1, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.buildah.version=1.41.4, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.expose-services=, build-date=2025-11-19T00:36:58Z, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 
'879500e96bf8dfb93687004bd86f2317'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, summary=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.12, architecture=x86_64, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, container_name=nova_migration_target, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d) Dec 15 03:38:09 localhost systemd[1]: 4c3f871f4bdf287b1d3ce717956c1cf130617489217410eadf6fffcc440eb3b0.service: Deactivated successfully. Dec 15 03:38:11 localhost systemd[1]: Started /usr/bin/podman healthcheck run 2df8d20c6ba32f38bd0e6519c9c77a4146b632f21c81197d8c319b223aad81f1. Dec 15 03:38:11 localhost systemd[1]: Started /usr/bin/podman healthcheck run 4d35b7dc3f3a4e3c60661e85d3511f50804e87244d74cab67af5c6e8c447d379. 
Dec 15 03:38:11 localhost podman[88965]: 2025-12-15 08:38:11.757443424 +0000 UTC m=+0.087392079 container health_status 2df8d20c6ba32f38bd0e6519c9c77a4146b632f21c81197d8c319b223aad81f1 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, konflux.additional-tags=17.1.12 17.1_20251118.1, distribution-scope=public, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 ovn-controller, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.component=openstack-ovn-controller-container, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, version=17.1.12, container_name=ovn_controller, tcib_managed=true, url=https://www.redhat.com, config_id=tripleo_step4, release=1761123044, description=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.expose-services=, managed_by=tripleo_ansible, io.buildah.version=1.41.4, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, maintainer=OpenStack TripleO Team, batch=17.1_20251118.1, architecture=x86_64, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-18T23:34:05Z, name=rhosp17/openstack-ovn-controller, vcs-type=git) Dec 15 03:38:11 localhost podman[88966]: 2025-12-15 08:38:11.810872263 +0000 UTC m=+0.137962581 container health_status 4d35b7dc3f3a4e3c60661e85d3511f50804e87244d74cab67af5c6e8c447d379 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-type=git, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, vendor=Red Hat, Inc., container_name=ovn_metadata_agent, tcib_managed=true, io.buildah.version=1.41.4, version=17.1.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, batch=17.1_20251118.1, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'a56a6f14b467cd9064e40c03defa5ed7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', 
'/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, io.openshift.expose-services=, name=rhosp17/openstack-neutron-metadata-agent-ovn, config_id=tripleo_step4, release=1761123044, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, architecture=x86_64, build-date=2025-11-19T00:14:25Z, distribution-scope=public, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.12 17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, managed_by=tripleo_ansible) Dec 15 03:38:11 localhost podman[88965]: 2025-12-15 08:38:11.866636924 +0000 UTC m=+0.196585529 container exec_died 2df8d20c6ba32f38bd0e6519c9c77a4146b632f21c81197d8c319b223aad81f1 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, version=17.1.12, com.redhat.component=openstack-ovn-controller-container, url=https://www.redhat.com, config_id=tripleo_step4, release=1761123044, managed_by=tripleo_ansible, architecture=x86_64, name=rhosp17/openstack-ovn-controller, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, vcs-type=git, container_name=ovn_controller, io.buildah.version=1.41.4, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, io.k8s.description=Red Hat OpenStack Platform 17.1 
ovn-controller, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, io.openshift.expose-services=, build-date=2025-11-18T23:34:05Z, batch=17.1_20251118.1, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 ovn-controller, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 17.1_20251118.1, tcib_managed=true, vendor=Red Hat, Inc., config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, description=Red Hat OpenStack Platform 17.1 ovn-controller) Dec 15 03:38:11 localhost systemd[1]: 2df8d20c6ba32f38bd0e6519c9c77a4146b632f21c81197d8c319b223aad81f1.service: Deactivated successfully. 
Dec 15 03:38:11 localhost podman[88966]: 2025-12-15 08:38:11.883442204 +0000 UTC m=+0.210532592 container exec_died 4d35b7dc3f3a4e3c60661e85d3511f50804e87244d74cab67af5c6e8c447d379 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, vcs-type=git, konflux.additional-tags=17.1.12 17.1_20251118.1, tcib_managed=true, io.buildah.version=1.41.4, version=17.1.12, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, maintainer=OpenStack TripleO Team, container_name=ovn_metadata_agent, build-date=2025-11-19T00:14:25Z, name=rhosp17/openstack-neutron-metadata-agent-ovn, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.expose-services=, batch=17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, managed_by=tripleo_ansible, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, architecture=x86_64, url=https://www.redhat.com, config_id=tripleo_step4, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'a56a6f14b467cd9064e40c03defa5ed7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, release=1761123044, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream) Dec 15 03:38:11 localhost systemd[1]: 4d35b7dc3f3a4e3c60661e85d3511f50804e87244d74cab67af5c6e8c447d379.service: Deactivated successfully. Dec 15 03:38:20 localhost systemd[1]: Started /usr/bin/podman healthcheck run 6278d648aec9c9f12e0efaafa0d6ae5653eeb34459516699e562eeb9c8d23b3c. Dec 15 03:38:20 localhost systemd[1]: tmp-crun.SZPgWF.mount: Deactivated successfully. 
Dec 15 03:38:20 localhost podman[89034]: 2025-12-15 08:38:20.77504788 +0000 UTC m=+0.102738889 container health_status 6278d648aec9c9f12e0efaafa0d6ae5653eeb34459516699e562eeb9c8d23b3c (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, com.redhat.component=openstack-qdrouterd-container, konflux.additional-tags=17.1.12 17.1_20251118.1, release=1761123044, tcib_managed=true, vendor=Red Hat, Inc., io.buildah.version=1.41.4, batch=17.1_20251118.1, managed_by=tripleo_ansible, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, version=17.1.12, config_id=tripleo_step1, summary=Red Hat OpenStack Platform 17.1 qdrouterd, description=Red Hat OpenStack Platform 17.1 qdrouterd, container_name=metrics_qdr, vcs-type=git, distribution-scope=public, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '29d07da103bbd44a9ed3e29999314b03'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', 
'/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, name=rhosp17/openstack-qdrouterd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, architecture=x86_64, build-date=2025-11-18T22:49:46Z) Dec 15 03:38:20 localhost podman[89034]: 2025-12-15 08:38:20.951306364 +0000 UTC m=+0.278997413 container exec_died 6278d648aec9c9f12e0efaafa0d6ae5653eeb34459516699e562eeb9c8d23b3c (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.component=openstack-qdrouterd-container, batch=17.1_20251118.1, release=1761123044, architecture=x86_64, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 17.1_20251118.1, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.expose-services=, managed_by=tripleo_ansible, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, container_name=metrics_qdr, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, version=17.1.12, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, summary=Red Hat OpenStack Platform 17.1 qdrouterd, name=rhosp17/openstack-qdrouterd, io.buildah.version=1.41.4, maintainer=OpenStack TripleO Team, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '29d07da103bbd44a9ed3e29999314b03'}, 'healthcheck': {'test': 
'/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, vendor=Red Hat, Inc., build-date=2025-11-18T22:49:46Z, config_id=tripleo_step1, distribution-scope=public, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a) Dec 15 03:38:20 localhost systemd[1]: 6278d648aec9c9f12e0efaafa0d6ae5653eeb34459516699e562eeb9c8d23b3c.service: Deactivated successfully. Dec 15 03:38:29 localhost systemd[1]: Starting Check and recover tripleo_nova_virtqemud... Dec 15 03:38:29 localhost recover_tripleo_nova_virtqemud[89081]: 61849 Dec 15 03:38:29 localhost systemd[1]: tripleo_nova_virtqemud_recover.service: Deactivated successfully. Dec 15 03:38:29 localhost systemd[1]: Finished Check and recover tripleo_nova_virtqemud. 
Dec 15 03:38:30 localhost podman[89169]: 2025-12-15 08:38:30.34904643 +0000 UTC m=+0.092966348 container exec 8dcda56b365b42dc8758aab77a9ec80db304780e449052738f7e4e648ae1ecaf (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-bce17446-41b5-5408-a23e-0b011906b44a-crash-np0005559462, GIT_REPO=https://github.com/ceph/ceph-container.git, ceph=True, io.openshift.tags=rhceph ceph, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., version=7, GIT_BRANCH=main, release=1763362218, url=https://catalog.redhat.com/en/search?searchType=containers, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.k8s.description=Red Hat Ceph Storage 7, name=rhceph, io.buildah.version=1.41.4, build-date=2025-11-26T19:44:28Z, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, RELEASE=main, CEPH_POINT_RELEASE=, GIT_CLEAN=True, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, distribution-scope=public, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, com.redhat.component=rhceph-container, maintainer=Guillaume Abrioux , io.openshift.expose-services=, vendor=Red Hat, Inc., description=Red Hat Ceph Storage 7, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0) Dec 15 03:38:30 localhost podman[89169]: 2025-12-15 08:38:30.472384159 +0000 UTC m=+0.216304107 container exec_died 8dcda56b365b42dc8758aab77a9ec80db304780e449052738f7e4e648ae1ecaf (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-bce17446-41b5-5408-a23e-0b011906b44a-crash-np0005559462, io.openshift.expose-services=, release=1763362218, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat Ceph Storage 7, version=7, GIT_BRANCH=main, description=Red Hat Ceph Storage 7, 
io.buildah.version=1.41.4, vcs-type=git, RELEASE=main, architecture=x86_64, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, com.redhat.component=rhceph-container, url=https://catalog.redhat.com/en/search?searchType=containers, ceph=True, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, distribution-scope=public, GIT_CLEAN=True, vendor=Red Hat, Inc., org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, maintainer=Guillaume Abrioux , GIT_REPO=https://github.com/ceph/ceph-container.git, io.openshift.tags=rhceph ceph, CEPH_POINT_RELEASE=, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, name=rhceph, build-date=2025-11-26T19:44:28Z) Dec 15 03:38:36 localhost systemd[1]: Started /usr/bin/podman healthcheck run 165a6fd1f2b629d16da0798ec1c9bac41d302d530a0ffa668b2315a52d6aa04e. Dec 15 03:38:36 localhost systemd[1]: Started /usr/bin/podman healthcheck run 2649c3aec9af2f04509e1d248b1050b202fccc3ed2b86421c89a2a65376db057. Dec 15 03:38:36 localhost systemd[1]: Started /usr/bin/podman healthcheck run 36a88083ed613f6871d2631ae462bca308a45ba583333f7cfe4d0b0c40a18ba5. Dec 15 03:38:36 localhost systemd[1]: Started /usr/bin/podman healthcheck run 97115ff7c5a84ae7e17f79e696e94da3c63f9fe0051287fe9193bec282a03f99. Dec 15 03:38:36 localhost systemd[1]: Started /usr/bin/podman healthcheck run ac45612fab357b615b4684dbfa5bd000dd29eb1adfbaedb6db0802fc49e89549. Dec 15 03:38:36 localhost systemd[1]: Started /usr/bin/podman healthcheck run d6041e5471624a495d5be42684f6049ccf71802a80d5760239330151d465d146. Dec 15 03:38:36 localhost systemd[1]: tmp-crun.04TXMt.mount: Deactivated successfully. 
Dec 15 03:38:36 localhost podman[89314]: 2025-12-15 08:38:36.772155001 +0000 UTC m=+0.094127488 container health_status 2649c3aec9af2f04509e1d248b1050b202fccc3ed2b86421c89a2a65376db057 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 iscsid, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, batch=17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.expose-services=, name=rhosp17/openstack-iscsid, build-date=2025-11-18T23:44:13Z, io.buildah.version=1.41.4, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '182e509007ab5e6e5b2500a552cbd5ba'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, container_name=iscsid, tcib_managed=true, architecture=x86_64, release=1761123044, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, description=Red Hat OpenStack Platform 17.1 iscsid, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.component=openstack-iscsid-container, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, config_id=tripleo_step3, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, distribution-scope=public, vcs-type=git, vendor=Red Hat, Inc., version=17.1.12) Dec 15 03:38:36 localhost podman[89315]: 2025-12-15 08:38:36.833773159 +0000 UTC m=+0.155858669 container health_status 36a88083ed613f6871d2631ae462bca308a45ba583333f7cfe4d0b0c40a18ba5 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, name=rhosp17/openstack-nova-compute, vcs-type=git, container_name=nova_compute, com.redhat.component=openstack-nova-compute-container, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, config_id=tripleo_step5, description=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, release=1761123044, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., batch=17.1_20251118.1, managed_by=tripleo_ansible, build-date=2025-11-19T00:36:58Z, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, 
version=17.1.12, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.41.4, distribution-scope=public, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, url=https://www.redhat.com, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '182e509007ab5e6e5b2500a552cbd5ba-879500e96bf8dfb93687004bd86f2317'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}) Dec 15 03:38:36 localhost podman[89316]: 
2025-12-15 08:38:36.887119046 +0000 UTC m=+0.204252514 container health_status 97115ff7c5a84ae7e17f79e696e94da3c63f9fe0051287fe9193bec282a03f99 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, name=rhosp17/openstack-ceilometer-ipmi, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'fcee5a4a91f85471fca7b61211375646'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, architecture=x86_64, url=https://www.redhat.com, tcib_managed=true, version=17.1.12, vcs-type=git, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, release=1761123044, distribution-scope=public, build-date=2025-11-19T00:12:45Z, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, config_id=tripleo_step4, com.redhat.component=openstack-ceilometer-ipmi-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, managed_by=tripleo_ansible, batch=17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, 
konflux.additional-tags=17.1.12 17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., container_name=ceilometer_agent_ipmi, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.buildah.version=1.41.4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05) Dec 15 03:38:36 localhost podman[89314]: 2025-12-15 08:38:36.912129915 +0000 UTC m=+0.234102402 container exec_died 2649c3aec9af2f04509e1d248b1050b202fccc3ed2b86421c89a2a65376db057 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, name=rhosp17/openstack-iscsid, io.buildah.version=1.41.4, konflux.additional-tags=17.1.12 17.1_20251118.1, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 iscsid, batch=17.1_20251118.1, container_name=iscsid, config_id=tripleo_step3, description=Red Hat OpenStack Platform 17.1 iscsid, url=https://www.redhat.com, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '182e509007ab5e6e5b2500a552cbd5ba'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, tcib_managed=true, com.redhat.component=openstack-iscsid-container, version=17.1.12, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., vcs-type=git, release=1761123044, managed_by=tripleo_ansible, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, distribution-scope=public, build-date=2025-11-18T23:44:13Z, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, maintainer=OpenStack TripleO Team, io.openshift.expose-services=) Dec 15 03:38:36 localhost podman[89331]: 2025-12-15 08:38:36.8083929 +0000 UTC m=+0.110197548 container health_status d6041e5471624a495d5be42684f6049ccf71802a80d5760239330151d465d146 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, io.buildah.version=1.41.4, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'fcee5a4a91f85471fca7b61211375646'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, vcs-type=git, architecture=x86_64, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, maintainer=OpenStack TripleO Team, build-date=2025-11-19T00:11:48Z, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, url=https://www.redhat.com, config_id=tripleo_step4, batch=17.1_20251118.1, managed_by=tripleo_ansible, name=rhosp17/openstack-ceilometer-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, vendor=Red Hat, Inc., com.redhat.component=openstack-ceilometer-compute-container, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, version=17.1.12, distribution-scope=public, release=1761123044, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, container_name=ceilometer_agent_compute, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, tcib_managed=true, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute) Dec 15 03:38:36 localhost systemd[1]: 
2649c3aec9af2f04509e1d248b1050b202fccc3ed2b86421c89a2a65376db057.service: Deactivated successfully. Dec 15 03:38:36 localhost podman[89317]: 2025-12-15 08:38:36.988744074 +0000 UTC m=+0.300287513 container health_status ac45612fab357b615b4684dbfa5bd000dd29eb1adfbaedb6db0802fc49e89549 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, description=Red Hat OpenStack Platform 17.1 cron, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.12, url=https://www.redhat.com, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.expose-services=, tcib_managed=true, config_id=tripleo_step4, container_name=logrotate_crond, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, architecture=x86_64, konflux.additional-tags=17.1.12 17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, batch=17.1_20251118.1, com.redhat.component=openstack-cron-container, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, release=1761123044, managed_by=tripleo_ansible, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, name=rhosp17/openstack-cron, vcs-type=git, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, build-date=2025-11-18T22:49:32Z, summary=Red Hat OpenStack Platform 17.1 cron, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, maintainer=OpenStack TripleO Team) Dec 15 03:38:37 localhost podman[89315]: 2025-12-15 08:38:37.018640193 +0000 UTC m=+0.340725783 container exec_died 36a88083ed613f6871d2631ae462bca308a45ba583333f7cfe4d0b0c40a18ba5 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, config_id=tripleo_step5, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, vcs-type=git, release=1761123044, konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible, container_name=nova_compute, version=17.1.12, io.openshift.expose-services=, url=https://www.redhat.com, batch=17.1_20251118.1, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '182e509007ab5e6e5b2500a552cbd5ba-879500e96bf8dfb93687004bd86f2317'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 
'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, summary=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp17/openstack-nova-compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.buildah.version=1.41.4, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-19T00:36:58Z, com.redhat.component=openstack-nova-compute-container, architecture=x86_64, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, maintainer=OpenStack TripleO Team) Dec 15 03:38:37 localhost podman[89316]: 2025-12-15 08:38:37.018968052 +0000 UTC m=+0.336101520 container 
exec_died 97115ff7c5a84ae7e17f79e696e94da3c63f9fe0051287fe9193bec282a03f99 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, batch=17.1_20251118.1, config_id=tripleo_step4, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, build-date=2025-11-19T00:12:45Z, name=rhosp17/openstack-ceilometer-ipmi, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, release=1761123044, distribution-scope=public, version=17.1.12, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'fcee5a4a91f85471fca7b61211375646'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.buildah.version=1.41.4, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, 
managed_by=tripleo_ansible, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.expose-services=, vcs-type=git, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, container_name=ceilometer_agent_ipmi, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, com.redhat.component=openstack-ceilometer-ipmi-container) Dec 15 03:38:37 localhost podman[89317]: 2025-12-15 08:38:37.026320059 +0000 UTC m=+0.337863468 container exec_died ac45612fab357b615b4684dbfa5bd000dd29eb1adfbaedb6db0802fc49e89549 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, managed_by=tripleo_ansible, release=1761123044, summary=Red Hat OpenStack Platform 17.1 cron, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, vcs-type=git, architecture=x86_64, konflux.additional-tags=17.1.12 17.1_20251118.1, 
io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, url=https://www.redhat.com, build-date=2025-11-18T22:49:32Z, name=rhosp17/openstack-cron, description=Red Hat OpenStack Platform 17.1 cron, config_id=tripleo_step4, com.redhat.component=openstack-cron-container, version=17.1.12, tcib_managed=true, container_name=logrotate_crond, batch=17.1_20251118.1, io.buildah.version=1.41.4, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.expose-services=, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream) Dec 15 03:38:37 localhost systemd[1]: 36a88083ed613f6871d2631ae462bca308a45ba583333f7cfe4d0b0c40a18ba5.service: Deactivated successfully. Dec 15 03:38:37 localhost systemd[1]: ac45612fab357b615b4684dbfa5bd000dd29eb1adfbaedb6db0802fc49e89549.service: Deactivated successfully. 
Dec 15 03:38:37 localhost podman[89331]: 2025-12-15 08:38:37.046379935 +0000 UTC m=+0.348184583 container exec_died d6041e5471624a495d5be42684f6049ccf71802a80d5760239330151d465d146 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, batch=17.1_20251118.1, com.redhat.component=openstack-ceilometer-compute-container, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, distribution-scope=public, vendor=Red Hat, Inc., config_id=tripleo_step4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, io.buildah.version=1.41.4, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'fcee5a4a91f85471fca7b61211375646'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', 
'/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, release=1761123044, container_name=ceilometer_agent_compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, name=rhosp17/openstack-ceilometer-compute, build-date=2025-11-19T00:11:48Z, tcib_managed=true, io.openshift.expose-services=, vcs-type=git, version=17.1.12, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, managed_by=tripleo_ansible) Dec 15 03:38:37 localhost systemd[1]: d6041e5471624a495d5be42684f6049ccf71802a80d5760239330151d465d146.service: Deactivated successfully. Dec 15 03:38:37 localhost systemd[1]: 97115ff7c5a84ae7e17f79e696e94da3c63f9fe0051287fe9193bec282a03f99.service: Deactivated successfully. Dec 15 03:38:37 localhost podman[89313]: 2025-12-15 08:38:37.093315211 +0000 UTC m=+0.404739136 container health_status 165a6fd1f2b629d16da0798ec1c9bac41d302d530a0ffa668b2315a52d6aa04e (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, config_id=tripleo_step3, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, release=1761123044, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.component=openstack-collectd-container, build-date=2025-11-18T22:51:28Z, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, name=rhosp17/openstack-collectd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.12 17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 collectd, 
url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, managed_by=tripleo_ansible, batch=17.1_20251118.1, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, summary=Red Hat OpenStack Platform 17.1 collectd, tcib_managed=true, container_name=collectd, architecture=x86_64, version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-type=git, io.openshift.expose-services=, io.buildah.version=1.41.4, distribution-scope=public) Dec 15 03:38:37 localhost podman[89313]: 2025-12-15 08:38:37.106513284 +0000 UTC m=+0.417937199 container exec_died 165a6fd1f2b629d16da0798ec1c9bac41d302d530a0ffa668b2315a52d6aa04e 
(image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 collectd, io.openshift.expose-services=, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-type=git, managed_by=tripleo_ansible, config_id=tripleo_step3, build-date=2025-11-18T22:51:28Z, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, com.redhat.component=openstack-collectd-container, 
container_name=collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, batch=17.1_20251118.1, release=1761123044, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, url=https://www.redhat.com, tcib_managed=true, io.buildah.version=1.41.4, vendor=Red Hat, Inc., name=rhosp17/openstack-collectd, summary=Red Hat OpenStack Platform 17.1 collectd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, distribution-scope=public, version=17.1.12) Dec 15 03:38:37 localhost systemd[1]: 165a6fd1f2b629d16da0798ec1c9bac41d302d530a0ffa668b2315a52d6aa04e.service: Deactivated successfully. Dec 15 03:38:39 localhost systemd[1]: Started /usr/bin/podman healthcheck run 4c3f871f4bdf287b1d3ce717956c1cf130617489217410eadf6fffcc440eb3b0. Dec 15 03:38:39 localhost podman[89450]: 2025-12-15 08:38:39.748197779 +0000 UTC m=+0.083064453 container health_status 4c3f871f4bdf287b1d3ce717956c1cf130617489217410eadf6fffcc440eb3b0 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, release=1761123044, container_name=nova_migration_target, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.12 17.1_20251118.1, build-date=2025-11-19T00:36:58Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_id=tripleo_step4, managed_by=tripleo_ansible, name=rhosp17/openstack-nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '879500e96bf8dfb93687004bd86f2317'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 
'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, architecture=x86_64, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, batch=17.1_20251118.1, io.buildah.version=1.41.4, com.redhat.component=openstack-nova-compute-container, description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, version=17.1.12, tcib_managed=true, url=https://www.redhat.com, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, distribution-scope=public, vendor=Red Hat, Inc.) 
Dec 15 03:38:40 localhost podman[89450]: 2025-12-15 08:38:40.148412443 +0000 UTC m=+0.483279077 container exec_died 4c3f871f4bdf287b1d3ce717956c1cf130617489217410eadf6fffcc440eb3b0 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp17/openstack-nova-compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, summary=Red Hat OpenStack Platform 17.1 nova-compute, config_id=tripleo_step4, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.4, build-date=2025-11-19T00:36:58Z, vcs-type=git, distribution-scope=public, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '879500e96bf8dfb93687004bd86f2317'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', 
'/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, architecture=x86_64, batch=17.1_20251118.1, version=17.1.12, container_name=nova_migration_target, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.component=openstack-nova-compute-container, release=1761123044, vendor=Red Hat, Inc., konflux.additional-tags=17.1.12 17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05) Dec 15 03:38:40 localhost systemd[1]: 4c3f871f4bdf287b1d3ce717956c1cf130617489217410eadf6fffcc440eb3b0.service: Deactivated successfully. Dec 15 03:38:42 localhost systemd[1]: Started /usr/bin/podman healthcheck run 2df8d20c6ba32f38bd0e6519c9c77a4146b632f21c81197d8c319b223aad81f1. Dec 15 03:38:42 localhost systemd[1]: Started /usr/bin/podman healthcheck run 4d35b7dc3f3a4e3c60661e85d3511f50804e87244d74cab67af5c6e8c447d379. 
Dec 15 03:38:42 localhost podman[89473]: 2025-12-15 08:38:42.746235775 +0000 UTC m=+0.079081316 container health_status 2df8d20c6ba32f38bd0e6519c9c77a4146b632f21c81197d8c319b223aad81f1 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, architecture=x86_64, com.redhat.component=openstack-ovn-controller-container, name=rhosp17/openstack-ovn-controller, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, summary=Red Hat OpenStack Platform 17.1 ovn-controller, url=https://www.redhat.com, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, io.buildah.version=1.41.4, version=17.1.12, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, container_name=ovn_controller, konflux.additional-tags=17.1.12 17.1_20251118.1, config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 ovn-controller, release=1761123044, vendor=Red Hat, Inc., tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, managed_by=tripleo_ansible, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack 
Platform 17.1 ovn-controller, build-date=2025-11-18T23:34:05Z, batch=17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-type=git, distribution-scope=public) Dec 15 03:38:42 localhost podman[89474]: 2025-12-15 08:38:42.804888914 +0000 UTC m=+0.134038626 container health_status 4d35b7dc3f3a4e3c60661e85d3511f50804e87244d74cab67af5c6e8c447d379 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, release=1761123044, tcib_managed=true, container_name=ovn_metadata_agent, vendor=Red Hat, Inc., url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, distribution-scope=public, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'a56a6f14b467cd9064e40c03defa5ed7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', 
'/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, name=rhosp17/openstack-neutron-metadata-agent-ovn, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.buildah.version=1.41.4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, konflux.additional-tags=17.1.12 17.1_20251118.1, version=17.1.12, config_id=tripleo_step4, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, batch=17.1_20251118.1, build-date=2025-11-19T00:14:25Z, architecture=x86_64, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, managed_by=tripleo_ansible, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c) Dec 15 03:38:42 localhost podman[89473]: 2025-12-15 08:38:42.818965321 +0000 UTC m=+0.151810812 container exec_died 2df8d20c6ba32f38bd0e6519c9c77a4146b632f21c81197d8c319b223aad81f1 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, tcib_managed=true, io.buildah.version=1.41.4, build-date=2025-11-18T23:34:05Z, com.redhat.component=openstack-ovn-controller-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, container_name=ovn_controller, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, version=17.1.12, maintainer=OpenStack TripleO Team, 
managed_by=tripleo_ansible, release=1761123044, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, description=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, name=rhosp17/openstack-ovn-controller, distribution-scope=public, url=https://www.redhat.com, batch=17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.expose-services=, config_id=tripleo_step4, summary=Red Hat OpenStack Platform 17.1 ovn-controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, konflux.additional-tags=17.1.12 17.1_20251118.1, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, architecture=x86_64) Dec 15 03:38:42 localhost systemd[1]: 2df8d20c6ba32f38bd0e6519c9c77a4146b632f21c81197d8c319b223aad81f1.service: Deactivated successfully. 
Dec 15 03:38:42 localhost podman[89474]: 2025-12-15 08:38:42.856326059 +0000 UTC m=+0.185475771 container exec_died 4d35b7dc3f3a4e3c60661e85d3511f50804e87244d74cab67af5c6e8c447d379 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, name=rhosp17/openstack-neutron-metadata-agent-ovn, tcib_managed=true, distribution-scope=public, managed_by=tripleo_ansible, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-11-19T00:14:25Z, io.openshift.expose-services=, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, config_id=tripleo_step4, io.buildah.version=1.41.4, release=1761123044, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'a56a6f14b467cd9064e40c03defa5ed7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', 
'/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, version=17.1.12, container_name=ovn_metadata_agent, batch=17.1_20251118.1, url=https://www.redhat.com, vcs-type=git, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, konflux.additional-tags=17.1.12 17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05) Dec 15 03:38:42 localhost systemd[1]: 4d35b7dc3f3a4e3c60661e85d3511f50804e87244d74cab67af5c6e8c447d379.service: Deactivated successfully. Dec 15 03:38:51 localhost systemd[1]: Started /usr/bin/podman healthcheck run 6278d648aec9c9f12e0efaafa0d6ae5653eeb34459516699e562eeb9c8d23b3c. Dec 15 03:38:51 localhost systemd[1]: tmp-crun.Ifd93I.mount: Deactivated successfully. 
Dec 15 03:38:51 localhost podman[89520]: 2025-12-15 08:38:51.756691698 +0000 UTC m=+0.091417065 container health_status 6278d648aec9c9f12e0efaafa0d6ae5653eeb34459516699e562eeb9c8d23b3c (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, konflux.additional-tags=17.1.12 17.1_20251118.1, url=https://www.redhat.com, distribution-scope=public, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.buildah.version=1.41.4, name=rhosp17/openstack-qdrouterd, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, managed_by=tripleo_ansible, release=1761123044, batch=17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=metrics_qdr, com.redhat.component=openstack-qdrouterd-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, version=17.1.12, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_id=tripleo_step1, vendor=Red Hat, Inc., architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 qdrouterd, maintainer=OpenStack TripleO Team, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '29d07da103bbd44a9ed3e29999314b03'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, build-date=2025-11-18T22:49:46Z, tcib_managed=true) Dec 15 03:38:51 localhost podman[89520]: 2025-12-15 08:38:51.965437132 +0000 UTC m=+0.300162459 container exec_died 6278d648aec9c9f12e0efaafa0d6ae5653eeb34459516699e562eeb9c8d23b3c (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, build-date=2025-11-18T22:49:46Z, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, tcib_managed=true, name=rhosp17/openstack-qdrouterd, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.component=openstack-qdrouterd-container, managed_by=tripleo_ansible, config_id=tripleo_step1, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, batch=17.1_20251118.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '29d07da103bbd44a9ed3e29999314b03'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, description=Red Hat OpenStack Platform 17.1 qdrouterd, vendor=Red Hat, Inc., release=1761123044, summary=Red Hat OpenStack Platform 17.1 qdrouterd, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, vcs-type=git, io.buildah.version=1.41.4, version=17.1.12, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, container_name=metrics_qdr, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05) Dec 15 03:38:51 localhost systemd[1]: 6278d648aec9c9f12e0efaafa0d6ae5653eeb34459516699e562eeb9c8d23b3c.service: Deactivated successfully. Dec 15 03:39:07 localhost systemd[1]: Started /usr/bin/podman healthcheck run 165a6fd1f2b629d16da0798ec1c9bac41d302d530a0ffa668b2315a52d6aa04e. Dec 15 03:39:07 localhost systemd[1]: Started /usr/bin/podman healthcheck run 2649c3aec9af2f04509e1d248b1050b202fccc3ed2b86421c89a2a65376db057. Dec 15 03:39:07 localhost systemd[1]: Started /usr/bin/podman healthcheck run 36a88083ed613f6871d2631ae462bca308a45ba583333f7cfe4d0b0c40a18ba5. 
Dec 15 03:39:07 localhost systemd[1]: Started /usr/bin/podman healthcheck run 97115ff7c5a84ae7e17f79e696e94da3c63f9fe0051287fe9193bec282a03f99. Dec 15 03:39:07 localhost systemd[1]: Started /usr/bin/podman healthcheck run ac45612fab357b615b4684dbfa5bd000dd29eb1adfbaedb6db0802fc49e89549. Dec 15 03:39:07 localhost systemd[1]: Started /usr/bin/podman healthcheck run d6041e5471624a495d5be42684f6049ccf71802a80d5760239330151d465d146. Dec 15 03:39:07 localhost systemd[1]: tmp-crun.qL4W9i.mount: Deactivated successfully. Dec 15 03:39:07 localhost systemd[1]: tmp-crun.5E8kih.mount: Deactivated successfully. Dec 15 03:39:07 localhost podman[89557]: 2025-12-15 08:39:07.832785433 +0000 UTC m=+0.152844140 container health_status 97115ff7c5a84ae7e17f79e696e94da3c63f9fe0051287fe9193bec282a03f99 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, build-date=2025-11-19T00:12:45Z, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, io.openshift.expose-services=, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, version=17.1.12, konflux.additional-tags=17.1.12 17.1_20251118.1, config_id=tripleo_step4, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'fcee5a4a91f85471fca7b61211375646'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, container_name=ceilometer_agent_ipmi, io.buildah.version=1.41.4, tcib_managed=true, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, com.redhat.component=openstack-ceilometer-ipmi-container, release=1761123044, architecture=x86_64, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, name=rhosp17/openstack-ceilometer-ipmi, maintainer=OpenStack TripleO Team, batch=17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, url=https://www.redhat.com) Dec 15 03:39:07 localhost podman[89549]: 2025-12-15 08:39:07.767042824 +0000 UTC m=+0.099439560 container health_status 165a6fd1f2b629d16da0798ec1c9bac41d302d530a0ffa668b2315a52d6aa04e (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, container_name=collectd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, tcib_managed=true, distribution-scope=public, io.openshift.expose-services=, name=rhosp17/openstack-collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, managed_by=tripleo_ansible, architecture=x86_64, release=1761123044, konflux.additional-tags=17.1.12 17.1_20251118.1, 
description=Red Hat OpenStack Platform 17.1 collectd, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, com.redhat.component=openstack-collectd-container, io.buildah.version=1.41.4, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, batch=17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.12, vendor=Red Hat, Inc., build-date=2025-11-18T22:51:28Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, summary=Red Hat OpenStack Platform 17.1 collectd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vcs-type=git, maintainer=OpenStack TripleO Team, config_id=tripleo_step3, 
org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, url=https://www.redhat.com) Dec 15 03:39:07 localhost podman[89557]: 2025-12-15 08:39:07.893370913 +0000 UTC m=+0.213429600 container exec_died 97115ff7c5a84ae7e17f79e696e94da3c63f9fe0051287fe9193bec282a03f99 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, com.redhat.component=openstack-ceilometer-ipmi-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, architecture=x86_64, konflux.additional-tags=17.1.12 17.1_20251118.1, release=1761123044, version=17.1.12, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, batch=17.1_20251118.1, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-type=git, tcib_managed=true, container_name=ceilometer_agent_ipmi, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.buildah.version=1.41.4, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, build-date=2025-11-19T00:12:45Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, distribution-scope=public, managed_by=tripleo_ansible, url=https://www.redhat.com, name=rhosp17/openstack-ceilometer-ipmi, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'fcee5a4a91f85471fca7b61211375646'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, config_id=tripleo_step4, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676) Dec 15 03:39:07 localhost systemd[1]: 97115ff7c5a84ae7e17f79e696e94da3c63f9fe0051287fe9193bec282a03f99.service: Deactivated successfully. Dec 15 03:39:07 localhost podman[89549]: 2025-12-15 08:39:07.919262646 +0000 UTC m=+0.251659372 container exec_died 165a6fd1f2b629d16da0798ec1c9bac41d302d530a0ffa668b2315a52d6aa04e (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, maintainer=OpenStack TripleO Team, version=17.1.12, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_id=tripleo_step3, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, io.openshift.expose-services=, name=rhosp17/openstack-collectd, io.buildah.version=1.41.4, distribution-scope=public, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 
'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, container_name=collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-18T22:51:28Z, batch=17.1_20251118.1, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, description=Red Hat OpenStack Platform 17.1 collectd, architecture=x86_64, konflux.additional-tags=17.1.12 17.1_20251118.1, release=1761123044, managed_by=tripleo_ansible, com.redhat.component=openstack-collectd-container, summary=Red Hat OpenStack Platform 17.1 collectd) Dec 15 03:39:07 localhost systemd[1]: 165a6fd1f2b629d16da0798ec1c9bac41d302d530a0ffa668b2315a52d6aa04e.service: Deactivated successfully. 
Dec 15 03:39:07 localhost podman[89569]: 2025-12-15 08:39:07.800092348 +0000 UTC m=+0.110255070 container health_status d6041e5471624a495d5be42684f6049ccf71802a80d5760239330151d465d146 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, container_name=ceilometer_agent_compute, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, com.redhat.component=openstack-ceilometer-compute-container, release=1761123044, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-type=git, version=17.1.12, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'fcee5a4a91f85471fca7b61211375646'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', 
'/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, distribution-scope=public, io.buildah.version=1.41.4, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, name=rhosp17/openstack-ceilometer-compute, build-date=2025-11-19T00:11:48Z, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, batch=17.1_20251118.1, config_id=tripleo_step4, maintainer=OpenStack TripleO Team, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vendor=Red Hat, Inc., url=https://www.redhat.com) Dec 15 03:39:07 localhost podman[89551]: 2025-12-15 08:39:07.818561072 +0000 UTC m=+0.143416256 container health_status 36a88083ed613f6871d2631ae462bca308a45ba583333f7cfe4d0b0c40a18ba5 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, konflux.additional-tags=17.1.12 17.1_20251118.1, version=17.1.12, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '182e509007ab5e6e5b2500a552cbd5ba-879500e96bf8dfb93687004bd86f2317'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, config_id=tripleo_step5, batch=17.1_20251118.1, name=rhosp17/openstack-nova-compute, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, tcib_managed=true, url=https://www.redhat.com, com.redhat.component=openstack-nova-compute-container, release=1761123044, maintainer=OpenStack TripleO Team, vcs-type=git, distribution-scope=public, build-date=2025-11-19T00:36:58Z, description=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., managed_by=tripleo_ansible, container_name=nova_compute, io.buildah.version=1.41.4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, summary=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, 
architecture=x86_64) Dec 15 03:39:07 localhost podman[89565]: 2025-12-15 08:39:07.960995552 +0000 UTC m=+0.275290755 container health_status ac45612fab357b615b4684dbfa5bd000dd29eb1adfbaedb6db0802fc49e89549 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, managed_by=tripleo_ansible, container_name=logrotate_crond, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-cron-container, description=Red Hat OpenStack Platform 17.1 cron, io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1, build-date=2025-11-18T22:49:32Z, name=rhosp17/openstack-cron, batch=17.1_20251118.1, version=17.1.12, summary=Red Hat OpenStack Platform 17.1 cron, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 cron, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, io.buildah.version=1.41.4, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, tcib_managed=true, release=1761123044, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, vcs-type=git, config_id=tripleo_step4, url=https://www.redhat.com) Dec 15 03:39:07 localhost podman[89569]: 2025-12-15 08:39:07.965208294 +0000 UTC m=+0.275370966 container exec_died d6041e5471624a495d5be42684f6049ccf71802a80d5760239330151d465d146 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, url=https://www.redhat.com, com.redhat.component=openstack-ceilometer-compute-container, build-date=2025-11-19T00:11:48Z, vendor=Red Hat, Inc., config_id=tripleo_step4, architecture=x86_64, container_name=ceilometer_agent_compute, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, version=17.1.12, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, name=rhosp17/openstack-ceilometer-compute, distribution-scope=public, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.buildah.version=1.41.4, batch=17.1_20251118.1, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 
'TRIPLEO_CONFIG_HASH': 'fcee5a4a91f85471fca7b61211375646'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, vcs-type=git, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, tcib_managed=true, release=1761123044, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute) Dec 15 03:39:07 localhost systemd[1]: d6041e5471624a495d5be42684f6049ccf71802a80d5760239330151d465d146.service: Deactivated successfully. 
Dec 15 03:39:08 localhost podman[89551]: 2025-12-15 08:39:08.015329305 +0000 UTC m=+0.340184509 container exec_died 36a88083ed613f6871d2631ae462bca308a45ba583333f7cfe4d0b0c40a18ba5 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, url=https://www.redhat.com, config_id=tripleo_step5, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, com.redhat.component=openstack-nova-compute-container, version=17.1.12, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '182e509007ab5e6e5b2500a552cbd5ba-879500e96bf8dfb93687004bd86f2317'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', 
'/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, architecture=x86_64, vendor=Red Hat, Inc., container_name=nova_compute, description=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, tcib_managed=true, release=1761123044, konflux.additional-tags=17.1.12 17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 nova-compute, distribution-scope=public, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-type=git, managed_by=tripleo_ansible, name=rhosp17/openstack-nova-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.expose-services=, io.buildah.version=1.41.4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20251118.1, build-date=2025-11-19T00:36:58Z) Dec 15 03:39:08 localhost systemd[1]: 36a88083ed613f6871d2631ae462bca308a45ba583333f7cfe4d0b0c40a18ba5.service: Deactivated successfully. 
Dec 15 03:39:08 localhost podman[89565]: 2025-12-15 08:39:08.04954714 +0000 UTC m=+0.363842373 container exec_died ac45612fab357b615b4684dbfa5bd000dd29eb1adfbaedb6db0802fc49e89549 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, architecture=x86_64, release=1761123044, konflux.additional-tags=17.1.12 17.1_20251118.1, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, build-date=2025-11-18T22:49:32Z, name=rhosp17/openstack-cron, config_id=tripleo_step4, url=https://www.redhat.com, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vendor=Red Hat, Inc., vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, batch=17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=logrotate_crond, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, distribution-scope=public, io.openshift.expose-services=, version=17.1.12, tcib_managed=true, io.buildah.version=1.41.4, description=Red Hat OpenStack Platform 17.1 cron, summary=Red Hat OpenStack Platform 17.1 cron, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, com.redhat.component=openstack-cron-container, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 cron) Dec 15 03:39:08 localhost systemd[1]: ac45612fab357b615b4684dbfa5bd000dd29eb1adfbaedb6db0802fc49e89549.service: Deactivated successfully. Dec 15 03:39:08 localhost podman[89550]: 2025-12-15 08:39:08.020405051 +0000 UTC m=+0.347907826 container health_status 2649c3aec9af2f04509e1d248b1050b202fccc3ed2b86421c89a2a65376db057 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, tcib_managed=true, io.openshift.expose-services=, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, build-date=2025-11-18T23:44:13Z, config_id=tripleo_step3, batch=17.1_20251118.1, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, architecture=x86_64, name=rhosp17/openstack-iscsid, version=17.1.12, com.redhat.component=openstack-iscsid-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, summary=Red Hat OpenStack Platform 17.1 iscsid, konflux.additional-tags=17.1.12 17.1_20251118.1, vendor=Red Hat, Inc., url=https://www.redhat.com, vcs-type=git, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '182e509007ab5e6e5b2500a552cbd5ba'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 
'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, description=Red Hat OpenStack Platform 17.1 iscsid, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, io.buildah.version=1.41.4, container_name=iscsid, release=1761123044) Dec 15 03:39:08 localhost podman[89550]: 2025-12-15 08:39:08.105312941 +0000 UTC m=+0.432815746 container exec_died 2649c3aec9af2f04509e1d248b1050b202fccc3ed2b86421c89a2a65376db057 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, managed_by=tripleo_ansible, konflux.additional-tags=17.1.12 17.1_20251118.1, tcib_managed=true, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, architecture=x86_64, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, config_id=tripleo_step3, maintainer=OpenStack TripleO Team, version=17.1.12, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20251118.1, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, container_name=iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, io.buildah.version=1.41.4, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, name=rhosp17/openstack-iscsid, distribution-scope=public, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '182e509007ab5e6e5b2500a552cbd5ba'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, release=1761123044, description=Red Hat OpenStack Platform 17.1 iscsid, build-date=2025-11-18T23:44:13Z, vendor=Red Hat, Inc., com.redhat.component=openstack-iscsid-container, url=https://www.redhat.com) Dec 15 03:39:08 localhost systemd[1]: 
2649c3aec9af2f04509e1d248b1050b202fccc3ed2b86421c89a2a65376db057.service: Deactivated successfully. Dec 15 03:39:10 localhost systemd[1]: Started /usr/bin/podman healthcheck run 4c3f871f4bdf287b1d3ce717956c1cf130617489217410eadf6fffcc440eb3b0. Dec 15 03:39:10 localhost podman[89682]: 2025-12-15 08:39:10.74335909 +0000 UTC m=+0.074389731 container health_status 4c3f871f4bdf287b1d3ce717956c1cf130617489217410eadf6fffcc440eb3b0 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, tcib_managed=true, config_id=tripleo_step4, build-date=2025-11-19T00:36:58Z, name=rhosp17/openstack-nova-compute, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, architecture=x86_64, container_name=nova_migration_target, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.41.4, maintainer=OpenStack TripleO Team, version=17.1.12, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, url=https://www.redhat.com, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '879500e96bf8dfb93687004bd86f2317'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, konflux.additional-tags=17.1.12 17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, vendor=Red Hat, Inc., distribution-scope=public, batch=17.1_20251118.1, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, release=1761123044, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.component=openstack-nova-compute-container) Dec 15 03:39:11 localhost podman[89682]: 2025-12-15 08:39:11.076291164 +0000 UTC m=+0.407321775 container exec_died 4c3f871f4bdf287b1d3ce717956c1cf130617489217410eadf6fffcc440eb3b0 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, managed_by=tripleo_ansible, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '879500e96bf8dfb93687004bd86f2317'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', 
'/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, version=17.1.12, config_id=tripleo_step4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-type=git, name=rhosp17/openstack-nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, build-date=2025-11-19T00:36:58Z, summary=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-nova-compute-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.buildah.version=1.41.4, release=1761123044, distribution-scope=public, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, container_name=nova_migration_target, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, url=https://www.redhat.com, batch=17.1_20251118.1) Dec 15 03:39:11 localhost systemd[1]: 4c3f871f4bdf287b1d3ce717956c1cf130617489217410eadf6fffcc440eb3b0.service: Deactivated successfully. Dec 15 03:39:13 localhost systemd[1]: Started /usr/bin/podman healthcheck run 2df8d20c6ba32f38bd0e6519c9c77a4146b632f21c81197d8c319b223aad81f1. Dec 15 03:39:13 localhost systemd[1]: Started /usr/bin/podman healthcheck run 4d35b7dc3f3a4e3c60661e85d3511f50804e87244d74cab67af5c6e8c447d379. 
Dec 15 03:39:13 localhost systemd[1]: tmp-crun.rpiXHh.mount: Deactivated successfully. Dec 15 03:39:13 localhost podman[89706]: 2025-12-15 08:39:13.754079615 +0000 UTC m=+0.085175779 container health_status 2df8d20c6ba32f38bd0e6519c9c77a4146b632f21c81197d8c319b223aad81f1 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, version=17.1.12, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, vendor=Red Hat, Inc., com.redhat.component=openstack-ovn-controller-container, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.expose-services=, tcib_managed=true, io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, managed_by=tripleo_ansible, distribution-scope=public, konflux.additional-tags=17.1.12 17.1_20251118.1, batch=17.1_20251118.1, build-date=2025-11-18T23:34:05Z, architecture=x86_64, vcs-type=git, config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 ovn-controller, url=https://www.redhat.com, name=rhosp17/openstack-ovn-controller, 
release=1761123044, container_name=ovn_controller, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, summary=Red Hat OpenStack Platform 17.1 ovn-controller) Dec 15 03:39:13 localhost podman[89706]: 2025-12-15 08:39:13.806365233 +0000 UTC m=+0.137461437 container exec_died 2df8d20c6ba32f38bd0e6519c9c77a4146b632f21c81197d8c319b223aad81f1 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, build-date=2025-11-18T23:34:05Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, container_name=ovn_controller, release=1761123044, konflux.additional-tags=17.1.12 17.1_20251118.1, batch=17.1_20251118.1, name=rhosp17/openstack-ovn-controller, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.component=openstack-ovn-controller-container, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, vcs-type=git, config_id=tripleo_step4, vendor=Red Hat, Inc., 
description=Red Hat OpenStack Platform 17.1 ovn-controller, version=17.1.12, architecture=x86_64, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, summary=Red Hat OpenStack Platform 17.1 ovn-controller, managed_by=tripleo_ansible, distribution-scope=public) Dec 15 03:39:13 localhost podman[89707]: 2025-12-15 08:39:13.80626945 +0000 UTC m=+0.133992054 container health_status 4d35b7dc3f3a4e3c60661e85d3511f50804e87244d74cab67af5c6e8c447d379 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.4, config_id=tripleo_step4, version=17.1.12, name=rhosp17/openstack-neutron-metadata-agent-ovn, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, tcib_managed=true, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, distribution-scope=public, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, batch=17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, container_name=ovn_metadata_agent, io.openshift.expose-services=, vcs-type=git, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, 
url=https://www.redhat.com, release=1761123044, build-date=2025-11-19T00:14:25Z, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'a56a6f14b467cd9064e40c03defa5ed7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, com.redhat.component=openstack-neutron-metadata-agent-ovn-container) Dec 15 03:39:13 localhost systemd[1]: 2df8d20c6ba32f38bd0e6519c9c77a4146b632f21c81197d8c319b223aad81f1.service: Deactivated successfully. 
Dec 15 03:39:13 localhost podman[89707]: 2025-12-15 08:39:13.890328359 +0000 UTC m=+0.218050993 container exec_died 4d35b7dc3f3a4e3c60661e85d3511f50804e87244d74cab67af5c6e8c447d379 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, vcs-type=git, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, config_id=tripleo_step4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, tcib_managed=true, io.openshift.expose-services=, name=rhosp17/openstack-neutron-metadata-agent-ovn, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, version=17.1.12, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, url=https://www.redhat.com, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'a56a6f14b467cd9064e40c03defa5ed7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', 
'/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, container_name=ovn_metadata_agent, release=1761123044, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, managed_by=tripleo_ansible, batch=17.1_20251118.1, konflux.additional-tags=17.1.12 17.1_20251118.1, io.buildah.version=1.41.4, build-date=2025-11-19T00:14:25Z) Dec 15 03:39:13 localhost systemd[1]: 4d35b7dc3f3a4e3c60661e85d3511f50804e87244d74cab67af5c6e8c447d379.service: Deactivated successfully. Dec 15 03:39:22 localhost systemd[1]: Started /usr/bin/podman healthcheck run 6278d648aec9c9f12e0efaafa0d6ae5653eeb34459516699e562eeb9c8d23b3c. Dec 15 03:39:22 localhost systemd[1]: tmp-crun.CyvjhP.mount: Deactivated successfully. 
Dec 15 03:39:22 localhost podman[89777]: 2025-12-15 08:39:22.759872843 +0000 UTC m=+0.094364424 container health_status 6278d648aec9c9f12e0efaafa0d6ae5653eeb34459516699e562eeb9c8d23b3c (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, release=1761123044, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, description=Red Hat OpenStack Platform 17.1 qdrouterd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 17.1_20251118.1, architecture=x86_64, com.redhat.component=openstack-qdrouterd-container, tcib_managed=true, build-date=2025-11-18T22:49:46Z, vendor=Red Hat, Inc., batch=17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 qdrouterd, container_name=metrics_qdr, managed_by=tripleo_ansible, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, name=rhosp17/openstack-qdrouterd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, maintainer=OpenStack TripleO Team, version=17.1.12, config_id=tripleo_step1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '29d07da103bbd44a9ed3e29999314b03'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, io.buildah.version=1.41.4) Dec 15 03:39:22 localhost podman[89777]: 2025-12-15 08:39:22.953262375 +0000 UTC m=+0.287753956 container exec_died 6278d648aec9c9f12e0efaafa0d6ae5653eeb34459516699e562eeb9c8d23b3c (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, summary=Red Hat OpenStack Platform 17.1 qdrouterd, description=Red Hat OpenStack Platform 17.1 qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '29d07da103bbd44a9ed3e29999314b03'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, managed_by=tripleo_ansible, konflux.additional-tags=17.1.12 17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, container_name=metrics_qdr, distribution-scope=public, io.buildah.version=1.41.4, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20251118.1, build-date=2025-11-18T22:49:46Z, release=1761123044, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, com.redhat.component=openstack-qdrouterd-container, tcib_managed=true, url=https://www.redhat.com, version=17.1.12, vcs-type=git, name=rhosp17/openstack-qdrouterd, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, maintainer=OpenStack TripleO Team, config_id=tripleo_step1) Dec 15 03:39:22 localhost systemd[1]: 6278d648aec9c9f12e0efaafa0d6ae5653eeb34459516699e562eeb9c8d23b3c.service: Deactivated successfully. Dec 15 03:39:28 localhost sshd[89806]: main: sshd: ssh-rsa algorithm is disabled Dec 15 03:39:38 localhost systemd[1]: Started /usr/bin/podman healthcheck run 165a6fd1f2b629d16da0798ec1c9bac41d302d530a0ffa668b2315a52d6aa04e. Dec 15 03:39:38 localhost systemd[1]: Started /usr/bin/podman healthcheck run 2649c3aec9af2f04509e1d248b1050b202fccc3ed2b86421c89a2a65376db057. Dec 15 03:39:38 localhost systemd[1]: Started /usr/bin/podman healthcheck run 36a88083ed613f6871d2631ae462bca308a45ba583333f7cfe4d0b0c40a18ba5. 
Dec 15 03:39:38 localhost systemd[1]: Started /usr/bin/podman healthcheck run 97115ff7c5a84ae7e17f79e696e94da3c63f9fe0051287fe9193bec282a03f99. Dec 15 03:39:38 localhost systemd[1]: Started /usr/bin/podman healthcheck run ac45612fab357b615b4684dbfa5bd000dd29eb1adfbaedb6db0802fc49e89549. Dec 15 03:39:38 localhost systemd[1]: Started /usr/bin/podman healthcheck run d6041e5471624a495d5be42684f6049ccf71802a80d5760239330151d465d146. Dec 15 03:39:38 localhost systemd[1]: tmp-crun.QeobKj.mount: Deactivated successfully. Dec 15 03:39:38 localhost podman[89904]: 2025-12-15 08:39:38.786236548 +0000 UTC m=+0.090296256 container health_status d6041e5471624a495d5be42684f6049ccf71802a80d5760239330151d465d146 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-ceilometer-compute-container, name=rhosp17/openstack-ceilometer-compute, batch=17.1_20251118.1, config_id=tripleo_step4, build-date=2025-11-19T00:11:48Z, url=https://www.redhat.com, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'fcee5a4a91f85471fca7b61211375646'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, version=17.1.12, tcib_managed=true, distribution-scope=public, architecture=x86_64, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.expose-services=, release=1761123044, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vendor=Red Hat, Inc., io.buildah.version=1.41.4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, container_name=ceilometer_agent_compute, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute) Dec 15 03:39:38 localhost podman[89884]: 2025-12-15 08:39:38.798196449 +0000 UTC m=+0.117353880 container health_status 165a6fd1f2b629d16da0798ec1c9bac41d302d530a0ffa668b2315a52d6aa04e (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, release=1761123044, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': 
['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, config_id=tripleo_step3, version=17.1.12, url=https://www.redhat.com, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 collectd, vcs-type=git, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 collectd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-11-18T22:51:28Z, vendor=Red Hat, Inc., org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.component=openstack-collectd-container, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, architecture=x86_64, batch=17.1_20251118.1, maintainer=OpenStack 
TripleO Team, name=rhosp17/openstack-collectd, container_name=collectd) Dec 15 03:39:38 localhost podman[89886]: 2025-12-15 08:39:38.82520014 +0000 UTC m=+0.139474141 container health_status 36a88083ed613f6871d2631ae462bca308a45ba583333f7cfe4d0b0c40a18ba5 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, io.openshift.expose-services=, url=https://www.redhat.com, managed_by=tripleo_ansible, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.component=openstack-nova-compute-container, architecture=x86_64, distribution-scope=public, io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, release=1761123044, container_name=nova_compute, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp17/openstack-nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, batch=17.1_20251118.1, build-date=2025-11-19T00:36:58Z, description=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '182e509007ab5e6e5b2500a552cbd5ba-879500e96bf8dfb93687004bd86f2317'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': 
['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, tcib_managed=true, konflux.additional-tags=17.1.12 17.1_20251118.1, config_id=tripleo_step5, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, version=17.1.12) Dec 15 03:39:38 localhost podman[89904]: 2025-12-15 08:39:38.834852309 +0000 UTC m=+0.138911997 container exec_died d6041e5471624a495d5be42684f6049ccf71802a80d5760239330151d465d146 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, managed_by=tripleo_ansible, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'fcee5a4a91f85471fca7b61211375646'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 
'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, distribution-scope=public, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, version=17.1.12, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, com.redhat.component=openstack-ceilometer-compute-container, build-date=2025-11-19T00:11:48Z, url=https://www.redhat.com, container_name=ceilometer_agent_compute, release=1761123044, io.buildah.version=1.41.4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_id=tripleo_step4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, architecture=x86_64, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, batch=17.1_20251118.1, io.openshift.expose-services=, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp17/openstack-ceilometer-compute, tcib_managed=true) Dec 15 03:39:38 localhost systemd[1]: d6041e5471624a495d5be42684f6049ccf71802a80d5760239330151d465d146.service: Deactivated successfully. Dec 15 03:39:38 localhost podman[89886]: 2025-12-15 08:39:38.852344657 +0000 UTC m=+0.166618698 container exec_died 36a88083ed613f6871d2631ae462bca308a45ba583333f7cfe4d0b0c40a18ba5 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.buildah.version=1.41.4, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '182e509007ab5e6e5b2500a552cbd5ba-879500e96bf8dfb93687004bd86f2317'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, batch=17.1_20251118.1, url=https://www.redhat.com, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, container_name=nova_compute, version=17.1.12, name=rhosp17/openstack-nova-compute, build-date=2025-11-19T00:36:58Z, tcib_managed=true, config_id=tripleo_step5, com.redhat.component=openstack-nova-compute-container, managed_by=tripleo_ansible, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1761123044, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, distribution-scope=public, konflux.additional-tags=17.1.12 17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, architecture=x86_64) Dec 15 03:39:38 localhost systemd[1]: 36a88083ed613f6871d2631ae462bca308a45ba583333f7cfe4d0b0c40a18ba5.service: Deactivated successfully. 
Dec 15 03:39:38 localhost podman[89884]: 2025-12-15 08:39:38.887516847 +0000 UTC m=+0.206674368 container exec_died 165a6fd1f2b629d16da0798ec1c9bac41d302d530a0ffa668b2315a52d6aa04e (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, managed_by=tripleo_ansible, config_id=tripleo_step3, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, io.buildah.version=1.41.4, version=17.1.12, container_name=collectd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, 
konflux.additional-tags=17.1.12 17.1_20251118.1, url=https://www.redhat.com, com.redhat.component=openstack-collectd-container, architecture=x86_64, maintainer=OpenStack TripleO Team, build-date=2025-11-18T22:51:28Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, release=1761123044, tcib_managed=true, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 collectd, batch=17.1_20251118.1, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 collectd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, name=rhosp17/openstack-collectd) Dec 15 03:39:38 localhost systemd[1]: 165a6fd1f2b629d16da0798ec1c9bac41d302d530a0ffa668b2315a52d6aa04e.service: Deactivated successfully. Dec 15 03:39:38 localhost podman[89898]: 2025-12-15 08:39:38.930874587 +0000 UTC m=+0.235736946 container health_status ac45612fab357b615b4684dbfa5bd000dd29eb1adfbaedb6db0802fc49e89549 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, io.buildah.version=1.41.4, vcs-type=git, description=Red Hat OpenStack Platform 17.1 cron, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_id=tripleo_step4, summary=Red Hat OpenStack Platform 17.1 cron, distribution-scope=public, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, version=17.1.12, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 
'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, container_name=logrotate_crond, com.redhat.component=openstack-cron-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-18T22:49:32Z, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-cron, release=1761123044, konflux.additional-tags=17.1.12 17.1_20251118.1, vendor=Red Hat, Inc., batch=17.1_20251118.1, tcib_managed=true, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 cron) Dec 15 03:39:38 localhost podman[89898]: 2025-12-15 08:39:38.943147315 +0000 UTC m=+0.248009684 container exec_died ac45612fab357b615b4684dbfa5bd000dd29eb1adfbaedb6db0802fc49e89549 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 cron, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, summary=Red Hat OpenStack Platform 17.1 cron, architecture=x86_64, container_name=logrotate_crond, 
vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, version=17.1.12, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, maintainer=OpenStack TripleO Team, release=1761123044, io.buildah.version=1.41.4, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, url=https://www.redhat.com, build-date=2025-11-18T22:49:32Z, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.expose-services=, vcs-type=git, name=rhosp17/openstack-cron, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-cron-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, tcib_managed=true, config_id=tripleo_step4, distribution-scope=public, batch=17.1_20251118.1, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, konflux.additional-tags=17.1.12 
17.1_20251118.1) Dec 15 03:39:38 localhost systemd[1]: ac45612fab357b615b4684dbfa5bd000dd29eb1adfbaedb6db0802fc49e89549.service: Deactivated successfully. Dec 15 03:39:38 localhost podman[89885]: 2025-12-15 08:39:38.755787414 +0000 UTC m=+0.076509717 container health_status 2649c3aec9af2f04509e1d248b1050b202fccc3ed2b86421c89a2a65376db057 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, tcib_managed=true, com.redhat.component=openstack-iscsid-container, distribution-scope=public, config_id=tripleo_step3, architecture=x86_64, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '182e509007ab5e6e5b2500a552cbd5ba'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, io.openshift.expose-services=, batch=17.1_20251118.1, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, name=rhosp17/openstack-iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, container_name=iscsid, version=17.1.12, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, url=https://www.redhat.com, build-date=2025-11-18T23:44:13Z, io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 iscsid, vendor=Red Hat, Inc., release=1761123044) Dec 15 03:39:38 localhost podman[89885]: 2025-12-15 08:39:38.991370256 +0000 UTC m=+0.312092619 container exec_died 2649c3aec9af2f04509e1d248b1050b202fccc3ed2b86421c89a2a65376db057 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.expose-services=, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, konflux.additional-tags=17.1.12 17.1_20251118.1, release=1761123044, url=https://www.redhat.com, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.buildah.version=1.41.4, name=rhosp17/openstack-iscsid, architecture=x86_64, container_name=iscsid, version=17.1.12, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, distribution-scope=public, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '182e509007ab5e6e5b2500a552cbd5ba'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 
'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, vendor=Red Hat, Inc., com.redhat.component=openstack-iscsid-container, config_id=tripleo_step3, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 iscsid, maintainer=OpenStack TripleO Team, build-date=2025-11-18T23:44:13Z, managed_by=tripleo_ansible, batch=17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid) Dec 15 03:39:39 localhost systemd[1]: 2649c3aec9af2f04509e1d248b1050b202fccc3ed2b86421c89a2a65376db057.service: Deactivated successfully. 
Dec 15 03:39:39 localhost podman[89887]: 2025-12-15 08:39:39.037739925 +0000 UTC m=+0.344409472 container health_status 97115ff7c5a84ae7e17f79e696e94da3c63f9fe0051287fe9193bec282a03f99 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, config_id=tripleo_step4, url=https://www.redhat.com, io.buildah.version=1.41.4, name=rhosp17/openstack-ceilometer-ipmi, distribution-scope=public, version=17.1.12, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, build-date=2025-11-19T00:12:45Z, release=1761123044, managed_by=tripleo_ansible, io.openshift.expose-services=, com.redhat.component=openstack-ceilometer-ipmi-container, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, maintainer=OpenStack TripleO Team, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'fcee5a4a91f85471fca7b61211375646'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, batch=17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, tcib_managed=true, vendor=Red Hat, Inc., vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, vcs-type=git, container_name=ceilometer_agent_ipmi) Dec 15 03:39:39 localhost podman[89887]: 2025-12-15 08:39:39.08839061 +0000 UTC m=+0.395060117 container exec_died 97115ff7c5a84ae7e17f79e696e94da3c63f9fe0051287fe9193bec282a03f99 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, name=rhosp17/openstack-ceilometer-ipmi, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, architecture=x86_64, io.buildah.version=1.41.4, url=https://www.redhat.com, vcs-type=git, container_name=ceilometer_agent_ipmi, vendor=Red Hat, Inc., io.openshift.expose-services=, version=17.1.12, com.redhat.component=openstack-ceilometer-ipmi-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, config_data={'environment': 
{'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'fcee5a4a91f85471fca7b61211375646'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, release=1761123044, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, build-date=2025-11-19T00:12:45Z, konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible, batch=17.1_20251118.1, config_id=tripleo_step4) Dec 15 03:39:39 localhost systemd[1]: 97115ff7c5a84ae7e17f79e696e94da3c63f9fe0051287fe9193bec282a03f99.service: Deactivated successfully. Dec 15 03:39:41 localhost systemd[1]: Started /usr/bin/podman healthcheck run 4c3f871f4bdf287b1d3ce717956c1cf130617489217410eadf6fffcc440eb3b0. Dec 15 03:39:41 localhost systemd[1]: tmp-crun.NGn5d2.mount: Deactivated successfully. 
Dec 15 03:39:41 localhost podman[90021]: 2025-12-15 08:39:41.734227876 +0000 UTC m=+0.067342442 container health_status 4c3f871f4bdf287b1d3ce717956c1cf130617489217410eadf6fffcc440eb3b0 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, version=17.1.12, com.redhat.component=openstack-nova-compute-container, batch=17.1_20251118.1, vcs-type=git, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.openshift.expose-services=, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, maintainer=OpenStack TripleO Team, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '879500e96bf8dfb93687004bd86f2317'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, konflux.additional-tags=17.1.12 17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 nova-compute, 
io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, release=1761123044, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, name=rhosp17/openstack-nova-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, container_name=nova_migration_target, vendor=Red Hat, Inc., managed_by=tripleo_ansible, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2025-11-19T00:36:58Z, config_id=tripleo_step4, io.buildah.version=1.41.4) Dec 15 03:39:42 localhost podman[90021]: 2025-12-15 08:39:42.140460251 +0000 UTC m=+0.473574807 container exec_died 4c3f871f4bdf287b1d3ce717956c1cf130617489217410eadf6fffcc440eb3b0 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, batch=17.1_20251118.1, build-date=2025-11-19T00:36:58Z, io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, com.redhat.component=openstack-nova-compute-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-type=git, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '879500e96bf8dfb93687004bd86f2317'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, version=17.1.12, distribution-scope=public, io.openshift.expose-services=, name=rhosp17/openstack-nova-compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, konflux.additional-tags=17.1.12 17.1_20251118.1, url=https://www.redhat.com, container_name=nova_migration_target, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, maintainer=OpenStack TripleO Team, config_id=tripleo_step4, release=1761123044, summary=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true) Dec 15 03:39:42 localhost systemd[1]: 4c3f871f4bdf287b1d3ce717956c1cf130617489217410eadf6fffcc440eb3b0.service: Deactivated successfully. Dec 15 03:39:44 localhost systemd[1]: Started /usr/bin/podman healthcheck run 2df8d20c6ba32f38bd0e6519c9c77a4146b632f21c81197d8c319b223aad81f1. Dec 15 03:39:44 localhost systemd[1]: Started /usr/bin/podman healthcheck run 4d35b7dc3f3a4e3c60661e85d3511f50804e87244d74cab67af5c6e8c447d379. Dec 15 03:39:44 localhost systemd[1]: tmp-crun.o5gOcX.mount: Deactivated successfully. 
Dec 15 03:39:44 localhost podman[90045]: 2025-12-15 08:39:44.762478261 +0000 UTC m=+0.086588437 container health_status 4d35b7dc3f3a4e3c60661e85d3511f50804e87244d74cab67af5c6e8c447d379 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, tcib_managed=true, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, name=rhosp17/openstack-neutron-metadata-agent-ovn, container_name=ovn_metadata_agent, vcs-type=git, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'a56a6f14b467cd9064e40c03defa5ed7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, vendor=Red Hat, Inc., version=17.1.12, batch=17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, url=https://www.redhat.com, release=1761123044, io.buildah.version=1.41.4, managed_by=tripleo_ansible, build-date=2025-11-19T00:14:25Z, io.openshift.expose-services=, config_id=tripleo_step4) Dec 15 03:39:44 localhost systemd[1]: tmp-crun.HtnDe0.mount: Deactivated successfully. 
Dec 15 03:39:44 localhost podman[90044]: 2025-12-15 08:39:44.822092145 +0000 UTC m=+0.149174161 container health_status 2df8d20c6ba32f38bd0e6519c9c77a4146b632f21c81197d8c319b223aad81f1 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, vendor=Red Hat, Inc., url=https://www.redhat.com, container_name=ovn_controller, tcib_managed=true, config_id=tripleo_step4, konflux.additional-tags=17.1.12 17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, release=1761123044, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, version=17.1.12, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, batch=17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 ovn-controller, name=rhosp17/openstack-ovn-controller, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.expose-services=, build-date=2025-11-18T23:34:05Z, com.redhat.component=openstack-ovn-controller-container, description=Red Hat OpenStack Platform 17.1 ovn-controller, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, io.buildah.version=1.41.4, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, managed_by=tripleo_ansible, vcs-type=git, 
org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05) Dec 15 03:39:44 localhost podman[90045]: 2025-12-15 08:39:44.876552901 +0000 UTC m=+0.200663097 container exec_died 4d35b7dc3f3a4e3c60661e85d3511f50804e87244d74cab67af5c6e8c447d379 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, config_id=tripleo_step4, name=rhosp17/openstack-neutron-metadata-agent-ovn, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible, io.buildah.version=1.41.4, maintainer=OpenStack TripleO Team, tcib_managed=true, version=17.1.12, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, batch=17.1_20251118.1, release=1761123044, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, container_name=ovn_metadata_agent, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, architecture=x86_64, build-date=2025-11-19T00:14:25Z, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'a56a6f14b467cd9064e40c03defa5ed7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, io.openshift.expose-services=, distribution-scope=public, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c) Dec 15 03:39:44 localhost systemd[1]: 4d35b7dc3f3a4e3c60661e85d3511f50804e87244d74cab67af5c6e8c447d379.service: Deactivated successfully. 
Dec 15 03:39:44 localhost podman[90044]: 2025-12-15 08:39:44.927630558 +0000 UTC m=+0.254712584 container exec_died 2df8d20c6ba32f38bd0e6519c9c77a4146b632f21c81197d8c319b223aad81f1 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, name=rhosp17/openstack-ovn-controller, architecture=x86_64, io.openshift.expose-services=, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, release=1761123044, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, summary=Red Hat OpenStack Platform 17.1 ovn-controller, build-date=2025-11-18T23:34:05Z, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, description=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-type=git, batch=17.1_20251118.1, com.redhat.component=openstack-ovn-controller-container, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, container_name=ovn_controller, version=17.1.12, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vendor=Red Hat, Inc., config_id=tripleo_step4, konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible, io.buildah.version=1.41.4) Dec 15 03:39:44 localhost systemd[1]: 2df8d20c6ba32f38bd0e6519c9c77a4146b632f21c81197d8c319b223aad81f1.service: Deactivated successfully. Dec 15 03:39:53 localhost systemd[1]: Started /usr/bin/podman healthcheck run 6278d648aec9c9f12e0efaafa0d6ae5653eeb34459516699e562eeb9c8d23b3c. Dec 15 03:39:53 localhost podman[90092]: 2025-12-15 08:39:53.759674981 +0000 UTC m=+0.088978560 container health_status 6278d648aec9c9f12e0efaafa0d6ae5653eeb34459516699e562eeb9c8d23b3c (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, build-date=2025-11-18T22:49:46Z, konflux.additional-tags=17.1.12 17.1_20251118.1, url=https://www.redhat.com, container_name=metrics_qdr, name=rhosp17/openstack-qdrouterd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.component=openstack-qdrouterd-container, io.buildah.version=1.41.4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.12, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 qdrouterd, release=1761123044, distribution-scope=public, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, config_id=tripleo_step1, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 qdrouterd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '29d07da103bbd44a9ed3e29999314b03'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 
'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, architecture=x86_64, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, managed_by=tripleo_ansible) Dec 15 03:39:53 localhost podman[90092]: 2025-12-15 08:39:53.996279569 +0000 UTC m=+0.325583198 container exec_died 6278d648aec9c9f12e0efaafa0d6ae5653eeb34459516699e562eeb9c8d23b3c (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-type=git, com.redhat.component=openstack-qdrouterd-container, managed_by=tripleo_ansible, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.buildah.version=1.41.4, container_name=metrics_qdr, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, url=https://www.redhat.com, name=rhosp17/openstack-qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 
'TRIPLEO_CONFIG_HASH': '29d07da103bbd44a9ed3e29999314b03'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, release=1761123044, summary=Red Hat OpenStack Platform 17.1 qdrouterd, vendor=Red Hat, Inc., distribution-scope=public, maintainer=OpenStack TripleO Team, batch=17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 qdrouterd, config_id=tripleo_step1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, architecture=x86_64, build-date=2025-11-18T22:49:46Z, version=17.1.12) Dec 15 03:39:54 localhost systemd[1]: 6278d648aec9c9f12e0efaafa0d6ae5653eeb34459516699e562eeb9c8d23b3c.service: Deactivated successfully. 
Dec 15 03:40:09 localhost systemd[1]: Started /usr/bin/podman healthcheck run 165a6fd1f2b629d16da0798ec1c9bac41d302d530a0ffa668b2315a52d6aa04e. Dec 15 03:40:09 localhost systemd[1]: Started /usr/bin/podman healthcheck run 2649c3aec9af2f04509e1d248b1050b202fccc3ed2b86421c89a2a65376db057. Dec 15 03:40:09 localhost systemd[1]: Started /usr/bin/podman healthcheck run 36a88083ed613f6871d2631ae462bca308a45ba583333f7cfe4d0b0c40a18ba5. Dec 15 03:40:09 localhost systemd[1]: Started /usr/bin/podman healthcheck run 97115ff7c5a84ae7e17f79e696e94da3c63f9fe0051287fe9193bec282a03f99. Dec 15 03:40:09 localhost systemd[1]: Started /usr/bin/podman healthcheck run ac45612fab357b615b4684dbfa5bd000dd29eb1adfbaedb6db0802fc49e89549. Dec 15 03:40:09 localhost systemd[1]: Started /usr/bin/podman healthcheck run d6041e5471624a495d5be42684f6049ccf71802a80d5760239330151d465d146. Dec 15 03:40:09 localhost podman[90122]: 2025-12-15 08:40:09.791504982 +0000 UTC m=+0.113111246 container health_status 2649c3aec9af2f04509e1d248b1050b202fccc3ed2b86421c89a2a65376db057 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, batch=17.1_20251118.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '182e509007ab5e6e5b2500a552cbd5ba'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, release=1761123044, tcib_managed=true, distribution-scope=public, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-type=git, vendor=Red Hat, Inc., name=rhosp17/openstack-iscsid, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.expose-services=, build-date=2025-11-18T23:44:13Z, container_name=iscsid, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, managed_by=tripleo_ansible, konflux.additional-tags=17.1.12 17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 iscsid, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-iscsid-container, version=17.1.12, config_id=tripleo_step3) Dec 15 03:40:09 localhost podman[90122]: 2025-12-15 08:40:09.827487884 +0000 UTC m=+0.149094158 container exec_died 2649c3aec9af2f04509e1d248b1050b202fccc3ed2b86421c89a2a65376db057 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, vendor=Red Hat, Inc., io.buildah.version=1.41.4, release=1761123044, architecture=x86_64, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_id=tripleo_step3, 
config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '182e509007ab5e6e5b2500a552cbd5ba'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, summary=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.component=openstack-iscsid-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, url=https://www.redhat.com, name=rhosp17/openstack-iscsid, distribution-scope=public, konflux.additional-tags=17.1.12 17.1_20251118.1, version=17.1.12, io.openshift.expose-services=, tcib_managed=true, build-date=2025-11-18T23:44:13Z, container_name=iscsid, description=Red Hat OpenStack Platform 17.1 iscsid, 
vcs-type=git, batch=17.1_20251118.1, maintainer=OpenStack TripleO Team) Dec 15 03:40:09 localhost systemd[1]: 2649c3aec9af2f04509e1d248b1050b202fccc3ed2b86421c89a2a65376db057.service: Deactivated successfully. Dec 15 03:40:09 localhost podman[90130]: 2025-12-15 08:40:09.853796847 +0000 UTC m=+0.163335589 container health_status ac45612fab357b615b4684dbfa5bd000dd29eb1adfbaedb6db0802fc49e89549 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, architecture=x86_64, name=rhosp17/openstack-cron, build-date=2025-11-18T22:49:32Z, vendor=Red Hat, Inc., config_id=tripleo_step4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.12 17.1_20251118.1, container_name=logrotate_crond, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public, batch=17.1_20251118.1, release=1761123044, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, summary=Red Hat OpenStack Platform 17.1 cron, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-type=git, com.redhat.component=openstack-cron-container, version=17.1.12, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, tcib_managed=true, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.expose-services=, url=https://www.redhat.com, io.buildah.version=1.41.4, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, maintainer=OpenStack TripleO Team, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, description=Red Hat OpenStack Platform 17.1 cron) Dec 15 03:40:09 localhost podman[90130]: 2025-12-15 08:40:09.892471982 +0000 UTC m=+0.202010694 container exec_died ac45612fab357b615b4684dbfa5bd000dd29eb1adfbaedb6db0802fc49e89549 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, description=Red Hat OpenStack Platform 17.1 cron, url=https://www.redhat.com, com.redhat.component=openstack-cron-container, io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, batch=17.1_20251118.1, release=1761123044, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, architecture=x86_64, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_id=tripleo_step4, konflux.additional-tags=17.1.12 17.1_20251118.1, name=rhosp17/openstack-cron, summary=Red Hat OpenStack Platform 17.1 cron, version=17.1.12, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 
'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, managed_by=tripleo_ansible, vcs-type=git, distribution-scope=public, io.openshift.expose-services=, container_name=logrotate_crond, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, build-date=2025-11-18T22:49:32Z) Dec 15 03:40:09 localhost systemd[1]: ac45612fab357b615b4684dbfa5bd000dd29eb1adfbaedb6db0802fc49e89549.service: Deactivated successfully. 
Dec 15 03:40:09 localhost podman[90123]: 2025-12-15 08:40:09.947394681 +0000 UTC m=+0.267881326 container health_status 36a88083ed613f6871d2631ae462bca308a45ba583333f7cfe4d0b0c40a18ba5 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, batch=17.1_20251118.1, com.redhat.component=openstack-nova-compute-container, vendor=Red Hat, Inc., config_id=tripleo_step5, tcib_managed=true, url=https://www.redhat.com, distribution-scope=public, io.openshift.expose-services=, name=rhosp17/openstack-nova-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '182e509007ab5e6e5b2500a552cbd5ba-879500e96bf8dfb93687004bd86f2317'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', 
'/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, build-date=2025-11-19T00:36:58Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, release=1761123044, version=17.1.12, architecture=x86_64, container_name=nova_compute, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d) Dec 15 03:40:09 localhost podman[90135]: 2025-12-15 08:40:09.897298731 +0000 UTC m=+0.207368937 container health_status d6041e5471624a495d5be42684f6049ccf71802a80d5760239330151d465d146 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, config_id=tripleo_step4, architecture=x86_64, build-date=2025-11-19T00:11:48Z, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, vendor=Red Hat, Inc., batch=17.1_20251118.1, container_name=ceilometer_agent_compute, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, url=https://www.redhat.com, version=17.1.12, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'fcee5a4a91f85471fca7b61211375646'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-ceilometer-compute-container, tcib_managed=true, io.openshift.expose-services=, vcs-type=git, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, distribution-scope=public, io.buildah.version=1.41.4, release=1761123044, name=rhosp17/openstack-ceilometer-compute, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05) Dec 15 03:40:09 localhost podman[90135]: 2025-12-15 08:40:09.981454412 +0000 UTC m=+0.291524628 container exec_died d6041e5471624a495d5be42684f6049ccf71802a80d5760239330151d465d146 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, build-date=2025-11-19T00:11:48Z, tcib_managed=true, batch=17.1_20251118.1, konflux.additional-tags=17.1.12 17.1_20251118.1, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, container_name=ceilometer_agent_compute, vendor=Red Hat, Inc., version=17.1.12, maintainer=OpenStack TripleO Team, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, com.redhat.component=openstack-ceilometer-compute-container, config_id=tripleo_step4, url=https://www.redhat.com, name=rhosp17/openstack-ceilometer-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'fcee5a4a91f85471fca7b61211375646'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', 
'/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, managed_by=tripleo_ansible, architecture=x86_64, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, release=1761123044, vcs-type=git) Dec 15 03:40:09 localhost systemd[1]: d6041e5471624a495d5be42684f6049ccf71802a80d5760239330151d465d146.service: Deactivated successfully. Dec 15 03:40:10 localhost podman[90123]: 2025-12-15 08:40:10.03444841 +0000 UTC m=+0.354934995 container exec_died 36a88083ed613f6871d2631ae462bca308a45ba583333f7cfe4d0b0c40a18ba5 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 nova-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, container_name=nova_compute, batch=17.1_20251118.1, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.41.4, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, url=https://www.redhat.com, io.openshift.expose-services=, vcs-type=git, build-date=2025-11-19T00:36:58Z, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, architecture=x86_64, name=rhosp17/openstack-nova-compute, distribution-scope=public, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 
'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '182e509007ab5e6e5b2500a552cbd5ba-879500e96bf8dfb93687004bd86f2317'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, version=17.1.12, konflux.additional-tags=17.1.12 17.1_20251118.1, config_id=tripleo_step5, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, release=1761123044, com.redhat.component=openstack-nova-compute-container, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, 
io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d) Dec 15 03:40:10 localhost podman[90124]: 2025-12-15 08:40:10.037616954 +0000 UTC m=+0.351570164 container health_status 97115ff7c5a84ae7e17f79e696e94da3c63f9fe0051287fe9193bec282a03f99 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, container_name=ceilometer_agent_ipmi, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, release=1761123044, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, distribution-scope=public, com.redhat.component=openstack-ceilometer-ipmi-container, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'fcee5a4a91f85471fca7b61211375646'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, batch=17.1_20251118.1, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 
ceilometer-ipmi, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, build-date=2025-11-19T00:12:45Z, maintainer=OpenStack TripleO Team, tcib_managed=true, name=rhosp17/openstack-ceilometer-ipmi, config_id=tripleo_step4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.buildah.version=1.41.4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.12, konflux.additional-tags=17.1.12 17.1_20251118.1, url=https://www.redhat.com, managed_by=tripleo_ansible) Dec 15 03:40:10 localhost podman[90121]: 2025-12-15 08:40:09.986809015 +0000 UTC m=+0.311614025 container health_status 165a6fd1f2b629d16da0798ec1c9bac41d302d530a0ffa668b2315a52d6aa04e (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, build-date=2025-11-18T22:51:28Z, url=https://www.redhat.com, architecture=x86_64, io.openshift.expose-services=, batch=17.1_20251118.1, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 collectd, vendor=Red Hat, Inc., konflux.additional-tags=17.1.12 17.1_20251118.1, version=17.1.12, name=rhosp17/openstack-collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step3, container_name=collectd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, tcib_managed=true, com.redhat.component=openstack-collectd-container, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.4, vcs-type=git, release=1761123044, description=Red Hat OpenStack Platform 17.1 collectd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream) Dec 15 03:40:10 localhost podman[90121]: 2025-12-15 08:40:10.072461486 +0000 UTC m=+0.397266476 container exec_died 165a6fd1f2b629d16da0798ec1c9bac41d302d530a0ffa668b2315a52d6aa04e (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, build-date=2025-11-18T22:51:28Z, vendor=Red Hat, Inc., 
distribution-scope=public, batch=17.1_20251118.1, konflux.additional-tags=17.1.12 17.1_20251118.1, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, config_id=tripleo_step3, release=1761123044, com.redhat.component=openstack-collectd-container, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, version=17.1.12, name=rhosp17/openstack-collectd, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, summary=Red Hat OpenStack Platform 17.1 collectd, vcs-type=git, description=Red Hat OpenStack Platform 17.1 collectd, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', 
'/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.expose-services=, tcib_managed=true, container_name=collectd, url=https://www.redhat.com, managed_by=tripleo_ansible) Dec 15 03:40:10 localhost systemd[1]: 165a6fd1f2b629d16da0798ec1c9bac41d302d530a0ffa668b2315a52d6aa04e.service: Deactivated successfully. Dec 15 03:40:10 localhost systemd[1]: 36a88083ed613f6871d2631ae462bca308a45ba583333f7cfe4d0b0c40a18ba5.service: Deactivated successfully. Dec 15 03:40:10 localhost podman[90124]: 2025-12-15 08:40:10.16760153 +0000 UTC m=+0.481554680 container exec_died 97115ff7c5a84ae7e17f79e696e94da3c63f9fe0051287fe9193bec282a03f99 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'fcee5a4a91f85471fca7b61211375646'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.openshift.expose-services=, config_id=tripleo_step4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, release=1761123044, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, version=17.1.12, name=rhosp17/openstack-ceilometer-ipmi, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, distribution-scope=public, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vendor=Red Hat, Inc., managed_by=tripleo_ansible, vcs-type=git, build-date=2025-11-19T00:12:45Z, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, batch=17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, com.redhat.component=openstack-ceilometer-ipmi-container, architecture=x86_64, konflux.additional-tags=17.1.12 17.1_20251118.1, tcib_managed=true, container_name=ceilometer_agent_ipmi) Dec 15 03:40:10 localhost systemd[1]: 97115ff7c5a84ae7e17f79e696e94da3c63f9fe0051287fe9193bec282a03f99.service: Deactivated successfully. Dec 15 03:40:10 localhost systemd[1]: tmp-crun.XqgQSP.mount: Deactivated successfully. Dec 15 03:40:12 localhost systemd[1]: Started /usr/bin/podman healthcheck run 4c3f871f4bdf287b1d3ce717956c1cf130617489217410eadf6fffcc440eb3b0. Dec 15 03:40:12 localhost systemd[1]: tmp-crun.MqEYlF.mount: Deactivated successfully. 
Dec 15 03:40:12 localhost podman[90259]: 2025-12-15 08:40:12.746896626 +0000 UTC m=+0.082455126 container health_status 4c3f871f4bdf287b1d3ce717956c1cf130617489217410eadf6fffcc440eb3b0 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '879500e96bf8dfb93687004bd86f2317'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, vcs-type=git, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, batch=17.1_20251118.1, config_id=tripleo_step4, release=1761123044, build-date=2025-11-19T00:36:58Z, vendor=Red Hat, Inc., io.openshift.expose-services=, managed_by=tripleo_ansible, com.redhat.component=openstack-nova-compute-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat 
OpenStack Platform 17.1 nova-compute, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-nova-compute, url=https://www.redhat.com, version=17.1.12, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, konflux.additional-tags=17.1.12 17.1_20251118.1, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, tcib_managed=true, io.buildah.version=1.41.4, container_name=nova_migration_target, description=Red Hat OpenStack Platform 17.1 nova-compute, distribution-scope=public) Dec 15 03:40:13 localhost podman[90259]: 2025-12-15 08:40:13.159596524 +0000 UTC m=+0.495155004 container exec_died 4c3f871f4bdf287b1d3ce717956c1cf130617489217410eadf6fffcc440eb3b0 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, com.redhat.component=openstack-nova-compute-container, container_name=nova_migration_target, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, tcib_managed=true, config_id=tripleo_step4, batch=17.1_20251118.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '879500e96bf8dfb93687004bd86f2317'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', 
'/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, build-date=2025-11-19T00:36:58Z, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.buildah.version=1.41.4, name=rhosp17/openstack-nova-compute, version=17.1.12, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, io.openshift.expose-services=, release=1761123044, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, distribution-scope=public) Dec 15 03:40:13 localhost systemd[1]: 4c3f871f4bdf287b1d3ce717956c1cf130617489217410eadf6fffcc440eb3b0.service: Deactivated successfully. Dec 15 03:40:15 localhost systemd[1]: Started /usr/bin/podman healthcheck run 2df8d20c6ba32f38bd0e6519c9c77a4146b632f21c81197d8c319b223aad81f1. Dec 15 03:40:15 localhost systemd[1]: Started /usr/bin/podman healthcheck run 4d35b7dc3f3a4e3c60661e85d3511f50804e87244d74cab67af5c6e8c447d379. 
Dec 15 03:40:15 localhost podman[90282]: 2025-12-15 08:40:15.737424492 +0000 UTC m=+0.068056102 container health_status 2df8d20c6ba32f38bd0e6519c9c77a4146b632f21c81197d8c319b223aad81f1 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1761123044, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, io.buildah.version=1.41.4, container_name=ovn_controller, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-ovn-controller-container, summary=Red Hat OpenStack Platform 17.1 ovn-controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, config_id=tripleo_step4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public, version=17.1.12, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.expose-services=, tcib_managed=true, vcs-type=git, batch=17.1_20251118.1, build-date=2025-11-18T23:34:05Z, managed_by=tripleo_ansible, io.k8s.description=Red 
Hat OpenStack Platform 17.1 ovn-controller, name=rhosp17/openstack-ovn-controller, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, vendor=Red Hat, Inc., architecture=x86_64) Dec 15 03:40:15 localhost podman[90282]: 2025-12-15 08:40:15.761759492 +0000 UTC m=+0.092391162 container exec_died 2df8d20c6ba32f38bd0e6519c9c77a4146b632f21c81197d8c319b223aad81f1 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, batch=17.1_20251118.1, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 ovn-controller, description=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-type=git, url=https://www.redhat.com, com.redhat.component=openstack-ovn-controller-container, konflux.additional-tags=17.1.12 17.1_20251118.1, name=rhosp17/openstack-ovn-controller, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, container_name=ovn_controller, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, config_id=tripleo_step4, 
io.openshift.expose-services=, distribution-scope=public, managed_by=tripleo_ansible, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, version=17.1.12, build-date=2025-11-18T23:34:05Z, maintainer=OpenStack TripleO Team, release=1761123044) Dec 15 03:40:15 localhost systemd[1]: 2df8d20c6ba32f38bd0e6519c9c77a4146b632f21c81197d8c319b223aad81f1.service: Deactivated successfully. Dec 15 03:40:15 localhost systemd[1]: tmp-crun.2vNoAb.mount: Deactivated successfully. Dec 15 03:40:15 localhost podman[90283]: 2025-12-15 08:40:15.861217013 +0000 UTC m=+0.192142311 container health_status 4d35b7dc3f3a4e3c60661e85d3511f50804e87244d74cab67af5c6e8c447d379 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, release=1761123044, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'a56a6f14b467cd9064e40c03defa5ed7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, vcs-type=git, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.12 17.1_20251118.1, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, tcib_managed=true, batch=17.1_20251118.1, vendor=Red Hat, Inc., container_name=ovn_metadata_agent, architecture=x86_64, version=17.1.12, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, maintainer=OpenStack TripleO Team, config_id=tripleo_step4, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, distribution-scope=public, io.buildah.version=1.41.4, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, build-date=2025-11-19T00:14:25Z, name=rhosp17/openstack-neutron-metadata-agent-ovn, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn) Dec 15 03:40:15 localhost podman[90283]: 2025-12-15 08:40:15.902462316 +0000 UTC m=+0.233387654 container exec_died 4d35b7dc3f3a4e3c60661e85d3511f50804e87244d74cab67af5c6e8c447d379 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, 
name=ovn_metadata_agent, version=17.1.12, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1, vendor=Red Hat, Inc., url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, io.buildah.version=1.41.4, build-date=2025-11-19T00:14:25Z, name=rhosp17/openstack-neutron-metadata-agent-ovn, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'a56a6f14b467cd9064e40c03defa5ed7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', 
'/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, tcib_managed=true, config_id=tripleo_step4, managed_by=tripleo_ansible, distribution-scope=public, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, container_name=ovn_metadata_agent, batch=17.1_20251118.1, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, release=1761123044, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-type=git) Dec 15 03:40:15 localhost systemd[1]: 4d35b7dc3f3a4e3c60661e85d3511f50804e87244d74cab67af5c6e8c447d379.service: Deactivated successfully. Dec 15 03:40:24 localhost systemd[1]: Started /usr/bin/podman healthcheck run 6278d648aec9c9f12e0efaafa0d6ae5653eeb34459516699e562eeb9c8d23b3c. Dec 15 03:40:24 localhost systemd[1]: Starting Check and recover tripleo_nova_virtqemud... Dec 15 03:40:24 localhost recover_tripleo_nova_virtqemud[90360]: 61849 Dec 15 03:40:24 localhost systemd[1]: tripleo_nova_virtqemud_recover.service: Deactivated successfully. Dec 15 03:40:24 localhost systemd[1]: Finished Check and recover tripleo_nova_virtqemud. 
Dec 15 03:40:24 localhost podman[90353]: 2025-12-15 08:40:24.760067451 +0000 UTC m=+0.085432456 container health_status 6278d648aec9c9f12e0efaafa0d6ae5653eeb34459516699e562eeb9c8d23b3c (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, com.redhat.component=openstack-qdrouterd-container, io.buildah.version=1.41.4, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '29d07da103bbd44a9ed3e29999314b03'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, vendor=Red Hat, Inc., name=rhosp17/openstack-qdrouterd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, architecture=x86_64, distribution-scope=public, container_name=metrics_qdr, config_id=tripleo_step1, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, release=1761123044, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-type=git, managed_by=tripleo_ansible, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.tags=rhosp osp openstack 
osp-17.1 openstack-qdrouterd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, summary=Red Hat OpenStack Platform 17.1 qdrouterd, batch=17.1_20251118.1, tcib_managed=true, konflux.additional-tags=17.1.12 17.1_20251118.1, build-date=2025-11-18T22:49:46Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, version=17.1.12, maintainer=OpenStack TripleO Team) Dec 15 03:40:24 localhost podman[90353]: 2025-12-15 08:40:24.974153757 +0000 UTC m=+0.299518702 container exec_died 6278d648aec9c9f12e0efaafa0d6ae5653eeb34459516699e562eeb9c8d23b3c (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, build-date=2025-11-18T22:49:46Z, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, architecture=x86_64, version=17.1.12, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, konflux.additional-tags=17.1.12 17.1_20251118.1, batch=17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, name=rhosp17/openstack-qdrouterd, url=https://www.redhat.com, container_name=metrics_qdr, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '29d07da103bbd44a9ed3e29999314b03'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, managed_by=tripleo_ansible, com.redhat.component=openstack-qdrouterd-container, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, vendor=Red Hat, Inc., distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1761123044, summary=Red Hat OpenStack Platform 17.1 qdrouterd, config_id=tripleo_step1, tcib_managed=true, io.buildah.version=1.41.4, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 qdrouterd) Dec 15 03:40:24 localhost systemd[1]: 6278d648aec9c9f12e0efaafa0d6ae5653eeb34459516699e562eeb9c8d23b3c.service: Deactivated successfully. Dec 15 03:40:40 localhost systemd[1]: Started /usr/bin/podman healthcheck run 165a6fd1f2b629d16da0798ec1c9bac41d302d530a0ffa668b2315a52d6aa04e. Dec 15 03:40:40 localhost systemd[1]: Started /usr/bin/podman healthcheck run 2649c3aec9af2f04509e1d248b1050b202fccc3ed2b86421c89a2a65376db057. Dec 15 03:40:40 localhost systemd[1]: Started /usr/bin/podman healthcheck run 36a88083ed613f6871d2631ae462bca308a45ba583333f7cfe4d0b0c40a18ba5. Dec 15 03:40:40 localhost systemd[1]: Started /usr/bin/podman healthcheck run 97115ff7c5a84ae7e17f79e696e94da3c63f9fe0051287fe9193bec282a03f99. 
Dec 15 03:40:40 localhost systemd[1]: Started /usr/bin/podman healthcheck run ac45612fab357b615b4684dbfa5bd000dd29eb1adfbaedb6db0802fc49e89549. Dec 15 03:40:40 localhost systemd[1]: Started /usr/bin/podman healthcheck run d6041e5471624a495d5be42684f6049ccf71802a80d5760239330151d465d146. Dec 15 03:40:40 localhost systemd[1]: tmp-crun.0oblt2.mount: Deactivated successfully. Dec 15 03:40:40 localhost podman[90461]: 2025-12-15 08:40:40.804726674 +0000 UTC m=+0.120658358 container health_status 2649c3aec9af2f04509e1d248b1050b202fccc3ed2b86421c89a2a65376db057 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, architecture=x86_64, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, container_name=iscsid, vendor=Red Hat, Inc., version=17.1.12, com.redhat.component=openstack-iscsid-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, batch=17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, config_id=tripleo_step3, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, managed_by=tripleo_ansible, name=rhosp17/openstack-iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, io.buildah.version=1.41.4, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, vcs-type=git, maintainer=OpenStack TripleO Team, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '182e509007ab5e6e5b2500a552cbd5ba'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, release=1761123044, build-date=2025-11-18T23:44:13Z, tcib_managed=true, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, description=Red Hat OpenStack Platform 17.1 iscsid, distribution-scope=public) Dec 15 03:40:40 localhost podman[90460]: 2025-12-15 08:40:40.824314568 +0000 UTC m=+0.147634580 container health_status 165a6fd1f2b629d16da0798ec1c9bac41d302d530a0ffa668b2315a52d6aa04e (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.buildah.version=1.41.4, summary=Red Hat OpenStack Platform 17.1 collectd, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, vcs-type=git, maintainer=OpenStack TripleO Team, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, config_id=tripleo_step3, distribution-scope=public, com.redhat.component=openstack-collectd-container, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 collectd, version=17.1.12, name=rhosp17/openstack-collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vendor=Red Hat, Inc., container_name=collectd, build-date=2025-11-18T22:51:28Z, managed_by=tripleo_ansible, release=1761123044, konflux.additional-tags=17.1.12 17.1_20251118.1, url=https://www.redhat.com, io.openshift.expose-services=, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, batch=17.1_20251118.1) Dec 15 03:40:40 localhost podman[90462]: 2025-12-15 08:40:40.779434078 +0000 UTC m=+0.101362982 container health_status 
36a88083ed613f6871d2631ae462bca308a45ba583333f7cfe4d0b0c40a18ba5 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, url=https://www.redhat.com, io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, managed_by=tripleo_ansible, vendor=Red Hat, Inc., container_name=nova_compute, architecture=x86_64, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 nova-compute, release=1761123044, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, version=17.1.12, distribution-scope=public, build-date=2025-11-19T00:36:58Z, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '182e509007ab5e6e5b2500a552cbd5ba-879500e96bf8dfb93687004bd86f2317'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', 
'/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, batch=17.1_20251118.1, config_id=tripleo_step5, maintainer=OpenStack TripleO Team, tcib_managed=true, name=rhosp17/openstack-nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.component=openstack-nova-compute-container, konflux.additional-tags=17.1.12 17.1_20251118.1) Dec 15 03:40:40 localhost podman[90470]: 2025-12-15 08:40:40.837184362 +0000 UTC m=+0.149575421 container health_status ac45612fab357b615b4684dbfa5bd000dd29eb1adfbaedb6db0802fc49e89549 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, konflux.additional-tags=17.1.12 17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public, name=rhosp17/openstack-cron, build-date=2025-11-18T22:49:32Z, batch=17.1_20251118.1, container_name=logrotate_crond, summary=Red Hat OpenStack Platform 17.1 cron, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, com.redhat.component=openstack-cron-container, maintainer=OpenStack TripleO Team, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 
'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, architecture=x86_64, io.buildah.version=1.41.4, description=Red Hat OpenStack Platform 17.1 cron, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, release=1761123044, vendor=Red Hat, Inc., io.openshift.expose-services=, vcs-type=git, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, version=17.1.12, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, config_id=tripleo_step4) Dec 15 03:40:40 localhost podman[90470]: 2025-12-15 08:40:40.848515715 +0000 UTC m=+0.160906774 container exec_died ac45612fab357b615b4684dbfa5bd000dd29eb1adfbaedb6db0802fc49e89549 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, maintainer=OpenStack TripleO Team, 
container_name=logrotate_crond, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_id=tripleo_step4, vcs-type=git, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, summary=Red Hat OpenStack Platform 17.1 cron, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, build-date=2025-11-18T22:49:32Z, vendor=Red Hat, Inc., io.openshift.expose-services=, name=rhosp17/openstack-cron, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, managed_by=tripleo_ansible, com.redhat.component=openstack-cron-container, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, release=1761123044, version=17.1.12, batch=17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, url=https://www.redhat.com, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public, 
io.buildah.version=1.41.4, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 cron, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05) Dec 15 03:40:40 localhost systemd[1]: ac45612fab357b615b4684dbfa5bd000dd29eb1adfbaedb6db0802fc49e89549.service: Deactivated successfully. Dec 15 03:40:40 localhost podman[90461]: 2025-12-15 08:40:40.888330871 +0000 UTC m=+0.204262585 container exec_died 2649c3aec9af2f04509e1d248b1050b202fccc3ed2b86421c89a2a65376db057 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, vendor=Red Hat, Inc., com.redhat.component=openstack-iscsid-container, distribution-scope=public, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, build-date=2025-11-18T23:44:13Z, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step3, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, version=17.1.12, architecture=x86_64, container_name=iscsid, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, io.openshift.expose-services=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '182e509007ab5e6e5b2500a552cbd5ba'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, release=1761123044, summary=Red Hat OpenStack Platform 17.1 iscsid, io.buildah.version=1.41.4, konflux.additional-tags=17.1.12 17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 iscsid, vcs-type=git, name=rhosp17/openstack-iscsid, batch=17.1_20251118.1) Dec 15 03:40:40 localhost systemd[1]: 2649c3aec9af2f04509e1d248b1050b202fccc3ed2b86421c89a2a65376db057.service: Deactivated successfully. 
Dec 15 03:40:40 localhost podman[90462]: 2025-12-15 08:40:40.928472934 +0000 UTC m=+0.250401778 container exec_died 36a88083ed613f6871d2631ae462bca308a45ba583333f7cfe4d0b0c40a18ba5 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, maintainer=OpenStack TripleO Team, container_name=nova_compute, io.buildah.version=1.41.4, release=1761123044, build-date=2025-11-19T00:36:58Z, tcib_managed=true, konflux.additional-tags=17.1.12 17.1_20251118.1, distribution-scope=public, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, architecture=x86_64, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '182e509007ab5e6e5b2500a552cbd5ba-879500e96bf8dfb93687004bd86f2317'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', 
'/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, config_id=tripleo_step5, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, com.redhat.component=openstack-nova-compute-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, version=17.1.12, vcs-type=git, batch=17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp17/openstack-nova-compute, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI) Dec 15 03:40:40 localhost systemd[1]: 36a88083ed613f6871d2631ae462bca308a45ba583333f7cfe4d0b0c40a18ba5.service: Deactivated successfully. 
Dec 15 03:40:40 localhost podman[90463]: 2025-12-15 08:40:40.945776737 +0000 UTC m=+0.257976101 container health_status 97115ff7c5a84ae7e17f79e696e94da3c63f9fe0051287fe9193bec282a03f99 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, vendor=Red Hat, Inc., org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, vcs-type=git, build-date=2025-11-19T00:12:45Z, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.12, distribution-scope=public, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-ceilometer-ipmi, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, release=1761123044, container_name=ceilometer_agent_ipmi, config_id=tripleo_step4, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.buildah.version=1.41.4, tcib_managed=true, com.redhat.component=openstack-ceilometer-ipmi-container, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'fcee5a4a91f85471fca7b61211375646'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, batch=17.1_20251118.1, url=https://www.redhat.com, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676) Dec 15 03:40:40 localhost podman[90460]: 2025-12-15 08:40:40.960282014 +0000 UTC m=+0.283602026 container exec_died 165a6fd1f2b629d16da0798ec1c9bac41d302d530a0ffa668b2315a52d6aa04e (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, managed_by=tripleo_ansible, vendor=Red Hat, Inc., config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, architecture=x86_64, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, maintainer=OpenStack TripleO Team, build-date=2025-11-18T22:51:28Z, release=1761123044, container_name=collectd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, version=17.1.12, vcs-type=git, com.redhat.component=openstack-collectd-container, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, url=https://www.redhat.com, config_id=tripleo_step3, description=Red Hat OpenStack Platform 17.1 collectd, summary=Red Hat OpenStack Platform 17.1 collectd, io.openshift.expose-services=, io.buildah.version=1.41.4, name=rhosp17/openstack-collectd, batch=17.1_20251118.1, konflux.additional-tags=17.1.12 17.1_20251118.1) Dec 15 03:40:40 localhost systemd[1]: 165a6fd1f2b629d16da0798ec1c9bac41d302d530a0ffa668b2315a52d6aa04e.service: Deactivated successfully. 
Dec 15 03:40:40 localhost podman[90463]: 2025-12-15 08:40:40.979305523 +0000 UTC m=+0.291504827 container exec_died 97115ff7c5a84ae7e17f79e696e94da3c63f9fe0051287fe9193bec282a03f99 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'fcee5a4a91f85471fca7b61211375646'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, batch=17.1_20251118.1, managed_by=tripleo_ansible, com.redhat.component=openstack-ceilometer-ipmi-container, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, tcib_managed=true, vendor=Red Hat, Inc., build-date=2025-11-19T00:12:45Z, config_id=tripleo_step4, release=1761123044, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, version=17.1.12, 
io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, architecture=x86_64, vcs-type=git, name=rhosp17/openstack-ceilometer-ipmi, maintainer=OpenStack TripleO Team, container_name=ceilometer_agent_ipmi, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.buildah.version=1.41.4, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, url=https://www.redhat.com, konflux.additional-tags=17.1.12 17.1_20251118.1, distribution-scope=public) Dec 15 03:40:40 localhost systemd[1]: 97115ff7c5a84ae7e17f79e696e94da3c63f9fe0051287fe9193bec282a03f99.service: Deactivated successfully. Dec 15 03:40:41 localhost podman[90481]: 2025-12-15 08:40:40.931275129 +0000 UTC m=+0.236345712 container health_status d6041e5471624a495d5be42684f6049ccf71802a80d5760239330151d465d146 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, distribution-scope=public, batch=17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, com.redhat.component=openstack-ceilometer-compute-container, tcib_managed=true, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-type=git, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'fcee5a4a91f85471fca7b61211375646'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, konflux.additional-tags=17.1.12 17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=ceilometer_agent_compute, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, version=17.1.12, release=1761123044, config_id=tripleo_step4, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, name=rhosp17/openstack-ceilometer-compute, build-date=2025-11-19T00:11:48Z, vendor=Red Hat, Inc., io.openshift.expose-services=, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, url=https://www.redhat.com) Dec 15 03:40:41 localhost podman[90481]: 2025-12-15 08:40:41.064600265 +0000 UTC m=+0.369670838 container exec_died d6041e5471624a495d5be42684f6049ccf71802a80d5760239330151d465d146 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, version=17.1.12, release=1761123044, build-date=2025-11-19T00:11:48Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.buildah.version=1.41.4, 
io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, config_id=tripleo_step4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, io.openshift.expose-services=, container_name=ceilometer_agent_compute, architecture=x86_64, vcs-type=git, batch=17.1_20251118.1, com.redhat.component=openstack-ceilometer-compute-container, url=https://www.redhat.com, name=rhosp17/openstack-ceilometer-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, tcib_managed=true, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'fcee5a4a91f85471fca7b61211375646'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', 
'/var/log/containers/ceilometer:/var/log/ceilometer:z']}, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute) Dec 15 03:40:41 localhost systemd[1]: d6041e5471624a495d5be42684f6049ccf71802a80d5760239330151d465d146.service: Deactivated successfully. Dec 15 03:40:43 localhost systemd[1]: Started /usr/bin/podman healthcheck run 4c3f871f4bdf287b1d3ce717956c1cf130617489217410eadf6fffcc440eb3b0. Dec 15 03:40:43 localhost systemd[1]: tmp-crun.rFIQJp.mount: Deactivated successfully. Dec 15 03:40:43 localhost podman[90593]: 2025-12-15 08:40:43.761223819 +0000 UTC m=+0.095364802 container health_status 4c3f871f4bdf287b1d3ce717956c1cf130617489217410eadf6fffcc440eb3b0 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, version=17.1.12, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, batch=17.1_20251118.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '879500e96bf8dfb93687004bd86f2317'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.component=openstack-nova-compute-container, release=1761123044, container_name=nova_migration_target, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, architecture=x86_64, build-date=2025-11-19T00:36:58Z, io.buildah.version=1.41.4, vcs-type=git, name=rhosp17/openstack-nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, config_id=tripleo_step4, distribution-scope=public, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible) Dec 15 03:40:44 localhost podman[90593]: 2025-12-15 08:40:44.159475971 +0000 UTC m=+0.493616924 container exec_died 4c3f871f4bdf287b1d3ce717956c1cf130617489217410eadf6fffcc440eb3b0 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1761123044, architecture=x86_64, io.openshift.expose-services=, com.redhat.component=openstack-nova-compute-container, managed_by=tripleo_ansible, config_id=tripleo_step4, io.openshift.tags=rhosp 
osp openstack osp-17.1 openstack-nova-compute, name=rhosp17/openstack-nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.12, url=https://www.redhat.com, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, tcib_managed=true, vcs-type=git, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '879500e96bf8dfb93687004bd86f2317'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, summary=Red Hat OpenStack Platform 17.1 nova-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, batch=17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, konflux.additional-tags=17.1.12 17.1_20251118.1, distribution-scope=public, maintainer=OpenStack TripleO Team, build-date=2025-11-19T00:36:58Z, vendor=Red Hat, Inc., container_name=nova_migration_target) Dec 15 03:40:44 localhost systemd[1]: 4c3f871f4bdf287b1d3ce717956c1cf130617489217410eadf6fffcc440eb3b0.service: Deactivated 
successfully. Dec 15 03:40:46 localhost systemd[1]: Started /usr/bin/podman healthcheck run 2df8d20c6ba32f38bd0e6519c9c77a4146b632f21c81197d8c319b223aad81f1. Dec 15 03:40:46 localhost systemd[1]: Started /usr/bin/podman healthcheck run 4d35b7dc3f3a4e3c60661e85d3511f50804e87244d74cab67af5c6e8c447d379. Dec 15 03:40:46 localhost systemd[1]: tmp-crun.HCTRjr.mount: Deactivated successfully. Dec 15 03:40:46 localhost podman[90618]: 2025-12-15 08:40:46.762165353 +0000 UTC m=+0.088404755 container health_status 2df8d20c6ba32f38bd0e6519c9c77a4146b632f21c81197d8c319b223aad81f1 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, com.redhat.component=openstack-ovn-controller-container, build-date=2025-11-18T23:34:05Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team, tcib_managed=true, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, distribution-scope=public, release=1761123044, name=rhosp17/openstack-ovn-controller, config_id=tripleo_step4, vendor=Red Hat, Inc., batch=17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, managed_by=tripleo_ansible, architecture=x86_64, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.buildah.version=1.41.4, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', 
'/var/log/containers/openvswitch:/var/log/ovn:z']}, io.openshift.expose-services=, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-type=git, description=Red Hat OpenStack Platform 17.1 ovn-controller, summary=Red Hat OpenStack Platform 17.1 ovn-controller, version=17.1.12, container_name=ovn_controller, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, url=https://www.redhat.com) Dec 15 03:40:46 localhost podman[90619]: 2025-12-15 08:40:46.819132897 +0000 UTC m=+0.141412354 container health_status 4d35b7dc3f3a4e3c60661e85d3511f50804e87244d74cab67af5c6e8c447d379 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step4, vcs-type=git, io.openshift.expose-services=, name=rhosp17/openstack-neutron-metadata-agent-ovn, batch=17.1_20251118.1, version=17.1.12, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'a56a6f14b467cd9064e40c03defa5ed7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': 
['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, build-date=2025-11-19T00:14:25Z, io.buildah.version=1.41.4, release=1761123044, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, container_name=ovn_metadata_agent, distribution-scope=public, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, architecture=x86_64, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, tcib_managed=true, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c) Dec 15 03:40:46 localhost podman[90618]: 2025-12-15 08:40:46.840710373 +0000 UTC m=+0.166949775 container exec_died 2df8d20c6ba32f38bd0e6519c9c77a4146b632f21c81197d8c319b223aad81f1 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, 
org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, name=rhosp17/openstack-ovn-controller, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.expose-services=, build-date=2025-11-18T23:34:05Z, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, maintainer=OpenStack TripleO Team, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, managed_by=tripleo_ansible, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, architecture=x86_64, io.buildah.version=1.41.4, batch=17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_id=tripleo_step4, version=17.1.12, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-ovn-controller-container, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, konflux.additional-tags=17.1.12 17.1_20251118.1, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 ovn-controller, distribution-scope=public, container_name=ovn_controller, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 ovn-controller, release=1761123044) Dec 15 03:40:46 localhost systemd[1]: 
2df8d20c6ba32f38bd0e6519c9c77a4146b632f21c81197d8c319b223aad81f1.service: Deactivated successfully. Dec 15 03:40:46 localhost podman[90619]: 2025-12-15 08:40:46.895565971 +0000 UTC m=+0.217845448 container exec_died 4d35b7dc3f3a4e3c60661e85d3511f50804e87244d74cab67af5c6e8c447d379 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, io.buildah.version=1.41.4, managed_by=tripleo_ansible, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-type=git, maintainer=OpenStack TripleO Team, distribution-scope=public, batch=17.1_20251118.1, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'a56a6f14b467cd9064e40c03defa5ed7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, io.k8s.display-name=Red Hat 
OpenStack Platform 17.1 neutron-metadata-agent-ovn, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, version=17.1.12, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, release=1761123044, io.openshift.expose-services=, name=rhosp17/openstack-neutron-metadata-agent-ovn, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, build-date=2025-11-19T00:14:25Z, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, config_id=tripleo_step4, tcib_managed=true, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, url=https://www.redhat.com, vendor=Red Hat, Inc., com.redhat.component=openstack-neutron-metadata-agent-ovn-container, container_name=ovn_metadata_agent, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, architecture=x86_64) Dec 15 03:40:46 localhost systemd[1]: 4d35b7dc3f3a4e3c60661e85d3511f50804e87244d74cab67af5c6e8c447d379.service: Deactivated successfully. Dec 15 03:40:47 localhost systemd[1]: tmp-crun.mTtkZF.mount: Deactivated successfully. Dec 15 03:40:55 localhost systemd[1]: Started /usr/bin/podman healthcheck run 6278d648aec9c9f12e0efaafa0d6ae5653eeb34459516699e562eeb9c8d23b3c. 
Dec 15 03:40:55 localhost podman[90666]: 2025-12-15 08:40:55.777454578 +0000 UTC m=+0.107728893 container health_status 6278d648aec9c9f12e0efaafa0d6ae5653eeb34459516699e562eeb9c8d23b3c (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, name=rhosp17/openstack-qdrouterd, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=metrics_qdr, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, batch=17.1_20251118.1, io.openshift.expose-services=, release=1761123044, vcs-type=git, managed_by=tripleo_ansible, config_id=tripleo_step1, vendor=Red Hat, Inc., tcib_managed=true, build-date=2025-11-18T22:49:46Z, version=17.1.12, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, url=https://www.redhat.com, io.buildah.version=1.41.4, com.redhat.component=openstack-qdrouterd-container, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '29d07da103bbd44a9ed3e29999314b03'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, description=Red Hat OpenStack Platform 17.1 qdrouterd, summary=Red Hat OpenStack Platform 17.1 qdrouterd, distribution-scope=public) Dec 15 03:40:56 localhost podman[90666]: 2025-12-15 08:40:56.000075782 +0000 UTC m=+0.330350107 container exec_died 6278d648aec9c9f12e0efaafa0d6ae5653eeb34459516699e562eeb9c8d23b3c (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, com.redhat.component=openstack-qdrouterd-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, maintainer=OpenStack TripleO Team, release=1761123044, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vendor=Red Hat, Inc., tcib_managed=true, url=https://www.redhat.com, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 qdrouterd, container_name=metrics_qdr, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, config_id=tripleo_step1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, io.buildah.version=1.41.4, konflux.additional-tags=17.1.12 17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, 
batch=17.1_20251118.1, vcs-type=git, io.openshift.expose-services=, build-date=2025-11-18T22:49:46Z, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '29d07da103bbd44a9ed3e29999314b03'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, name=rhosp17/openstack-qdrouterd, distribution-scope=public, version=17.1.12) Dec 15 03:40:56 localhost systemd[1]: 6278d648aec9c9f12e0efaafa0d6ae5653eeb34459516699e562eeb9c8d23b3c.service: Deactivated successfully. Dec 15 03:41:04 localhost sshd[90694]: main: sshd: ssh-rsa algorithm is disabled Dec 15 03:41:11 localhost systemd[1]: Started /usr/bin/podman healthcheck run 165a6fd1f2b629d16da0798ec1c9bac41d302d530a0ffa668b2315a52d6aa04e. Dec 15 03:41:11 localhost systemd[1]: Started /usr/bin/podman healthcheck run 2649c3aec9af2f04509e1d248b1050b202fccc3ed2b86421c89a2a65376db057. Dec 15 03:41:11 localhost systemd[1]: Started /usr/bin/podman healthcheck run 36a88083ed613f6871d2631ae462bca308a45ba583333f7cfe4d0b0c40a18ba5. 
Dec 15 03:41:11 localhost systemd[1]: Started /usr/bin/podman healthcheck run 97115ff7c5a84ae7e17f79e696e94da3c63f9fe0051287fe9193bec282a03f99. Dec 15 03:41:11 localhost systemd[1]: Started /usr/bin/podman healthcheck run ac45612fab357b615b4684dbfa5bd000dd29eb1adfbaedb6db0802fc49e89549. Dec 15 03:41:11 localhost systemd[1]: Started /usr/bin/podman healthcheck run d6041e5471624a495d5be42684f6049ccf71802a80d5760239330151d465d146. Dec 15 03:41:11 localhost podman[90696]: 2025-12-15 08:41:11.745423077 +0000 UTC m=+0.071799892 container health_status 165a6fd1f2b629d16da0798ec1c9bac41d302d530a0ffa668b2315a52d6aa04e (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, summary=Red Hat OpenStack Platform 17.1 collectd, managed_by=tripleo_ansible, batch=17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, container_name=collectd, description=Red Hat OpenStack Platform 17.1 collectd, release=1761123044, io.buildah.version=1.41.4, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step3, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, build-date=2025-11-18T22:51:28Z, vendor=Red Hat, Inc., tcib_managed=true, url=https://www.redhat.com, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vcs-type=git, io.openshift.expose-services=, distribution-scope=public, konflux.additional-tags=17.1.12 17.1_20251118.1, architecture=x86_64, com.redhat.component=openstack-collectd-container, version=17.1.12, name=rhosp17/openstack-collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd) Dec 15 03:41:11 localhost podman[90696]: 2025-12-15 08:41:11.761238599 +0000 UTC m=+0.087615444 container exec_died 165a6fd1f2b629d16da0798ec1c9bac41d302d530a0ffa668b2315a52d6aa04e (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_id=tripleo_step3, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, managed_by=tripleo_ansible, version=17.1.12, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, 
io.openshift.expose-services=, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vendor=Red Hat, Inc., konflux.additional-tags=17.1.12 17.1_20251118.1, batch=17.1_20251118.1, container_name=collectd, architecture=x86_64, distribution-scope=public, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 collectd, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, release=1761123044, url=https://www.redhat.com, com.redhat.component=openstack-collectd-container, name=rhosp17/openstack-collectd, vcs-type=git, build-date=2025-11-18T22:51:28Z, io.openshift.tags=rhosp osp openstack osp-17.1 
openstack-collectd, io.buildah.version=1.41.4, tcib_managed=true) Dec 15 03:41:11 localhost systemd[1]: 165a6fd1f2b629d16da0798ec1c9bac41d302d530a0ffa668b2315a52d6aa04e.service: Deactivated successfully. Dec 15 03:41:11 localhost podman[90718]: 2025-12-15 08:41:11.833604955 +0000 UTC m=+0.137114859 container health_status d6041e5471624a495d5be42684f6049ccf71802a80d5760239330151d465d146 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, distribution-scope=public, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_id=tripleo_step4, io.buildah.version=1.41.4, konflux.additional-tags=17.1.12 17.1_20251118.1, build-date=2025-11-19T00:11:48Z, managed_by=tripleo_ansible, version=17.1.12, url=https://www.redhat.com, batch=17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, release=1761123044, container_name=ceilometer_agent_compute, io.openshift.expose-services=, architecture=x86_64, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-ceilometer-compute-container, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp17/openstack-ceilometer-compute, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, vendor=Red Hat, Inc., vcs-type=git, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'fcee5a4a91f85471fca7b61211375646'}, 'healthcheck': {'test': 
'/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, tcib_managed=true) Dec 15 03:41:11 localhost podman[90718]: 2025-12-15 08:41:11.867404759 +0000 UTC m=+0.170914703 container exec_died d6041e5471624a495d5be42684f6049ccf71802a80d5760239330151d465d146 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, managed_by=tripleo_ansible, container_name=ceilometer_agent_compute, io.buildah.version=1.41.4, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, tcib_managed=true, url=https://www.redhat.com, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.component=openstack-ceilometer-compute-container, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vendor=Red Hat, Inc., batch=17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack 
Platform 17.1 ceilometer-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.expose-services=, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, vcs-type=git, release=1761123044, build-date=2025-11-19T00:11:48Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, version=17.1.12, config_id=tripleo_step4, distribution-scope=public, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'fcee5a4a91f85471fca7b61211375646'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, name=rhosp17/openstack-ceilometer-compute) Dec 15 03:41:11 localhost systemd[1]: d6041e5471624a495d5be42684f6049ccf71802a80d5760239330151d465d146.service: Deactivated successfully. 
Dec 15 03:41:11 localhost podman[90698]: 2025-12-15 08:41:11.886719095 +0000 UTC m=+0.204949392 container health_status 36a88083ed613f6871d2631ae462bca308a45ba583333f7cfe4d0b0c40a18ba5 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, com.redhat.component=openstack-nova-compute-container, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vendor=Red Hat, Inc., container_name=nova_compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, build-date=2025-11-19T00:36:58Z, maintainer=OpenStack TripleO Team, batch=17.1_20251118.1, architecture=x86_64, release=1761123044, konflux.additional-tags=17.1.12 17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 nova-compute, url=https://www.redhat.com, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, tcib_managed=true, config_id=tripleo_step5, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '182e509007ab5e6e5b2500a552cbd5ba-879500e96bf8dfb93687004bd86f2317'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, name=rhosp17/openstack-nova-compute, vcs-type=git, version=17.1.12, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, distribution-scope=public, io.buildah.version=1.41.4) Dec 15 03:41:11 localhost podman[90697]: 2025-12-15 08:41:11.839092782 +0000 UTC m=+0.159730533 container health_status 2649c3aec9af2f04509e1d248b1050b202fccc3ed2b86421c89a2a65376db057 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, summary=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.component=openstack-iscsid-container, version=17.1.12, distribution-scope=public, managed_by=tripleo_ansible, container_name=iscsid, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, 
cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 iscsid, config_id=tripleo_step3, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '182e509007ab5e6e5b2500a552cbd5ba'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, io.buildah.version=1.41.4, name=rhosp17/openstack-iscsid, io.openshift.expose-services=, vendor=Red Hat, Inc., batch=17.1_20251118.1, build-date=2025-11-18T23:44:13Z, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, release=1761123044, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, url=https://www.redhat.com, vcs-type=git, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.12 17.1_20251118.1, architecture=x86_64) Dec 15 03:41:11 localhost podman[90699]: 2025-12-15 08:41:11.95231467 +0000 UTC m=+0.266437858 container 
health_status 97115ff7c5a84ae7e17f79e696e94da3c63f9fe0051287fe9193bec282a03f99 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, io.buildah.version=1.41.4, name=rhosp17/openstack-ceilometer-ipmi, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-11-19T00:12:45Z, config_id=tripleo_step4, container_name=ceilometer_agent_ipmi, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, com.redhat.component=openstack-ceilometer-ipmi-container, distribution-scope=public, konflux.additional-tags=17.1.12 17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, url=https://www.redhat.com, batch=17.1_20251118.1, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.expose-services=, managed_by=tripleo_ansible, vcs-type=git, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'fcee5a4a91f85471fca7b61211375646'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, architecture=x86_64, version=17.1.12, tcib_managed=true, maintainer=OpenStack TripleO Team, release=1761123044, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi) Dec 15 03:41:11 localhost podman[90706]: 2025-12-15 08:41:11.993557503 +0000 UTC m=+0.300763195 container health_status ac45612fab357b615b4684dbfa5bd000dd29eb1adfbaedb6db0802fc49e89549 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, description=Red Hat OpenStack Platform 17.1 cron, version=17.1.12, release=1761123044, architecture=x86_64, managed_by=tripleo_ansible, vendor=Red Hat, Inc., build-date=2025-11-18T22:49:32Z, konflux.additional-tags=17.1.12 17.1_20251118.1, url=https://www.redhat.com, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, name=rhosp17/openstack-cron, container_name=logrotate_crond, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, batch=17.1_20251118.1, config_id=tripleo_step4, summary=Red Hat OpenStack Platform 17.1 cron, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, vcs-type=git, distribution-scope=public, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 
'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-cron-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05) Dec 15 03:41:12 localhost podman[90706]: 2025-12-15 08:41:12.004321601 +0000 UTC m=+0.311527293 container exec_died ac45612fab357b615b4684dbfa5bd000dd29eb1adfbaedb6db0802fc49e89549 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, url=https://www.redhat.com, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 cron, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, version=17.1.12, io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step4, release=1761123044, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': 
'/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, io.openshift.expose-services=, tcib_managed=true, build-date=2025-11-18T22:49:32Z, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, container_name=logrotate_crond, architecture=x86_64, vendor=Red Hat, Inc., org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, summary=Red Hat OpenStack Platform 17.1 cron, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, com.redhat.component=openstack-cron-container, batch=17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, name=rhosp17/openstack-cron, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible) Dec 15 03:41:12 localhost systemd[1]: ac45612fab357b615b4684dbfa5bd000dd29eb1adfbaedb6db0802fc49e89549.service: Deactivated successfully. 
Dec 15 03:41:12 localhost podman[90698]: 2025-12-15 08:41:12.018650124 +0000 UTC m=+0.336880471 container exec_died 36a88083ed613f6871d2631ae462bca308a45ba583333f7cfe4d0b0c40a18ba5 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp17/openstack-nova-compute, com.redhat.component=openstack-nova-compute-container, release=1761123044, architecture=x86_64, config_id=tripleo_step5, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.12, tcib_managed=true, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 nova-compute, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, description=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, batch=17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, maintainer=OpenStack TripleO Team, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '182e509007ab5e6e5b2500a552cbd5ba-879500e96bf8dfb93687004bd86f2317'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, vendor=Red Hat, Inc., io.openshift.expose-services=, managed_by=tripleo_ansible, konflux.additional-tags=17.1.12 17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, url=https://www.redhat.com, build-date=2025-11-19T00:36:58Z, container_name=nova_compute) Dec 15 03:41:12 localhost systemd[1]: 36a88083ed613f6871d2631ae462bca308a45ba583333f7cfe4d0b0c40a18ba5.service: Deactivated successfully. 
Dec 15 03:41:12 localhost podman[90699]: 2025-12-15 08:41:12.060526354 +0000 UTC m=+0.374649502 container exec_died 97115ff7c5a84ae7e17f79e696e94da3c63f9fe0051287fe9193bec282a03f99 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, vcs-type=git, batch=17.1_20251118.1, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, tcib_managed=true, com.redhat.component=openstack-ceilometer-ipmi-container, io.buildah.version=1.41.4, io.openshift.expose-services=, name=rhosp17/openstack-ceilometer-ipmi, maintainer=OpenStack TripleO Team, version=17.1.12, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, container_name=ceilometer_agent_ipmi, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'fcee5a4a91f85471fca7b61211375646'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, distribution-scope=public, konflux.additional-tags=17.1.12 17.1_20251118.1, config_id=tripleo_step4, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, 
org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, release=1761123044, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-19T00:12:45Z, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, url=https://www.redhat.com, managed_by=tripleo_ansible) Dec 15 03:41:12 localhost systemd[1]: 97115ff7c5a84ae7e17f79e696e94da3c63f9fe0051287fe9193bec282a03f99.service: Deactivated successfully. Dec 15 03:41:12 localhost podman[90697]: 2025-12-15 08:41:12.073240545 +0000 UTC m=+0.393878316 container exec_died 2649c3aec9af2f04509e1d248b1050b202fccc3ed2b86421c89a2a65376db057 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, com.redhat.component=openstack-iscsid-container, batch=17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, container_name=iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, io.buildah.version=1.41.4, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, description=Red Hat OpenStack Platform 17.1 iscsid, config_id=tripleo_step3, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, release=1761123044, vcs-type=git, distribution-scope=public, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '182e509007ab5e6e5b2500a552cbd5ba'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 
'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, managed_by=tripleo_ansible, io.openshift.expose-services=, name=rhosp17/openstack-iscsid, version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.12 17.1_20251118.1, vendor=Red Hat, Inc., architecture=x86_64, build-date=2025-11-18T23:44:13Z) Dec 15 03:41:12 localhost systemd[1]: 2649c3aec9af2f04509e1d248b1050b202fccc3ed2b86421c89a2a65376db057.service: Deactivated successfully. Dec 15 03:41:14 localhost systemd[1]: Started /usr/bin/podman healthcheck run 4c3f871f4bdf287b1d3ce717956c1cf130617489217410eadf6fffcc440eb3b0. 
Dec 15 03:41:14 localhost podman[90835]: 2025-12-15 08:41:14.759186122 +0000 UTC m=+0.091564499 container health_status 4c3f871f4bdf287b1d3ce717956c1cf130617489217410eadf6fffcc440eb3b0 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, version=17.1.12, vcs-type=git, com.redhat.component=openstack-nova-compute-container, vendor=Red Hat, Inc., config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '879500e96bf8dfb93687004bd86f2317'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step4, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, tcib_managed=true, name=rhosp17/openstack-nova-compute, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.buildah.version=1.41.4, architecture=x86_64, build-date=2025-11-19T00:36:58Z, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, batch=17.1_20251118.1, managed_by=tripleo_ansible, release=1761123044, konflux.additional-tags=17.1.12 17.1_20251118.1, container_name=nova_migration_target, description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute) Dec 15 03:41:15 localhost podman[90835]: 2025-12-15 08:41:15.202307645 +0000 UTC m=+0.534685992 container exec_died 4c3f871f4bdf287b1d3ce717956c1cf130617489217410eadf6fffcc440eb3b0 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, config_id=tripleo_step4, build-date=2025-11-19T00:36:58Z, container_name=nova_migration_target, vcs-type=git, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, architecture=x86_64, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.12, distribution-scope=public, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '879500e96bf8dfb93687004bd86f2317'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, batch=17.1_20251118.1, release=1761123044, description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, tcib_managed=true, name=rhosp17/openstack-nova-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.component=openstack-nova-compute-container, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.expose-services=, io.buildah.version=1.41.4) Dec 15 03:41:15 localhost systemd[1]: 4c3f871f4bdf287b1d3ce717956c1cf130617489217410eadf6fffcc440eb3b0.service: Deactivated successfully. Dec 15 03:41:17 localhost systemd[1]: Started /usr/bin/podman healthcheck run 2df8d20c6ba32f38bd0e6519c9c77a4146b632f21c81197d8c319b223aad81f1. Dec 15 03:41:17 localhost systemd[1]: Started /usr/bin/podman healthcheck run 4d35b7dc3f3a4e3c60661e85d3511f50804e87244d74cab67af5c6e8c447d379. 
Dec 15 03:41:17 localhost podman[90859]: 2025-12-15 08:41:17.745492274 +0000 UTC m=+0.072578201 container health_status 4d35b7dc3f3a4e3c60661e85d3511f50804e87244d74cab67af5c6e8c447d379 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, io.openshift.expose-services=, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, url=https://www.redhat.com, release=1761123044, batch=17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., container_name=ovn_metadata_agent, version=17.1.12, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'a56a6f14b467cd9064e40c03defa5ed7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', 
'/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, name=rhosp17/openstack-neutron-metadata-agent-ovn, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, distribution-scope=public, maintainer=OpenStack TripleO Team, build-date=2025-11-19T00:14:25Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, vcs-type=git, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, architecture=x86_64, io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, managed_by=tripleo_ansible, config_id=tripleo_step4) Dec 15 03:41:17 localhost podman[90859]: 2025-12-15 08:41:17.794748362 +0000 UTC m=+0.121834279 container exec_died 4d35b7dc3f3a4e3c60661e85d3511f50804e87244d74cab67af5c6e8c447d379 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.12 17.1_20251118.1, build-date=2025-11-19T00:14:25Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, batch=17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_id=tripleo_step4, name=rhosp17/openstack-neutron-metadata-agent-ovn, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, vcs-type=git, 
org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'a56a6f14b467cd9064e40c03defa5ed7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, io.buildah.version=1.41.4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, maintainer=OpenStack TripleO Team, container_name=ovn_metadata_agent, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, managed_by=tripleo_ansible, url=https://www.redhat.com, version=17.1.12, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, distribution-scope=public, release=1761123044, 
io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vendor=Red Hat, Inc., vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, architecture=x86_64) Dec 15 03:41:17 localhost systemd[1]: tmp-crun.7Qsjq9.mount: Deactivated successfully. Dec 15 03:41:17 localhost systemd[1]: 4d35b7dc3f3a4e3c60661e85d3511f50804e87244d74cab67af5c6e8c447d379.service: Deactivated successfully. Dec 15 03:41:17 localhost podman[90858]: 2025-12-15 08:41:17.820746867 +0000 UTC m=+0.149842728 container health_status 2df8d20c6ba32f38bd0e6519c9c77a4146b632f21c81197d8c319b223aad81f1 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, batch=17.1_20251118.1, container_name=ovn_controller, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-ovn-controller-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-18T23:34:05Z, distribution-scope=public, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, io.openshift.expose-services=, architecture=x86_64, name=rhosp17/openstack-ovn-controller, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, config_id=tripleo_step4, release=1761123044, version=17.1.12, konflux.additional-tags=17.1.12 17.1_20251118.1, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', 
'/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, description=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 ovn-controller, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, io.buildah.version=1.41.4, tcib_managed=true, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible) Dec 15 03:41:17 localhost podman[90858]: 2025-12-15 08:41:17.874416432 +0000 UTC m=+0.203512313 container exec_died 2df8d20c6ba32f38bd0e6519c9c77a4146b632f21c81197d8c319b223aad81f1 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, maintainer=OpenStack TripleO Team, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, vendor=Red Hat, Inc., io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, version=17.1.12, name=rhosp17/openstack-ovn-controller, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, 
com.redhat.component=openstack-ovn-controller-container, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, tcib_managed=true, config_id=tripleo_step4, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, url=https://www.redhat.com, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, io.buildah.version=1.41.4, description=Red Hat OpenStack Platform 17.1 ovn-controller, konflux.additional-tags=17.1.12 17.1_20251118.1, batch=17.1_20251118.1, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 ovn-controller, container_name=ovn_controller, build-date=2025-11-18T23:34:05Z, managed_by=tripleo_ansible, release=1761123044, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller) Dec 15 03:41:17 localhost systemd[1]: 2df8d20c6ba32f38bd0e6519c9c77a4146b632f21c81197d8c319b223aad81f1.service: Deactivated successfully. Dec 15 03:41:26 localhost systemd[1]: Started /usr/bin/podman healthcheck run 6278d648aec9c9f12e0efaafa0d6ae5653eeb34459516699e562eeb9c8d23b3c. 
Dec 15 03:41:26 localhost podman[90925]: 2025-12-15 08:41:26.746230119 +0000 UTC m=+0.081188473 container health_status 6278d648aec9c9f12e0efaafa0d6ae5653eeb34459516699e562eeb9c8d23b3c (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, com.redhat.component=openstack-qdrouterd-container, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, description=Red Hat OpenStack Platform 17.1 qdrouterd, release=1761123044, batch=17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, version=17.1.12, maintainer=OpenStack TripleO Team, vcs-type=git, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, distribution-scope=public, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_id=tripleo_step1, container_name=metrics_qdr, io.buildah.version=1.41.4, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, vendor=Red Hat, Inc., build-date=2025-11-18T22:49:46Z, url=https://www.redhat.com, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '29d07da103bbd44a9ed3e29999314b03'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, io.openshift.expose-services=, name=rhosp17/openstack-qdrouterd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, summary=Red Hat OpenStack Platform 17.1 qdrouterd, konflux.additional-tags=17.1.12 17.1_20251118.1) Dec 15 03:41:26 localhost podman[90925]: 2025-12-15 08:41:26.975653555 +0000 UTC m=+0.310611919 container exec_died 6278d648aec9c9f12e0efaafa0d6ae5653eeb34459516699e562eeb9c8d23b3c (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, architecture=x86_64, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.expose-services=, release=1761123044, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_id=tripleo_step1, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, build-date=2025-11-18T22:49:46Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, name=rhosp17/openstack-qdrouterd, konflux.additional-tags=17.1.12 17.1_20251118.1, io.buildah.version=1.41.4, batch=17.1_20251118.1, vcs-type=git, version=17.1.12, managed_by=tripleo_ansible, tcib_managed=true, distribution-scope=public, container_name=metrics_qdr, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '29d07da103bbd44a9ed3e29999314b03'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 
'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, url=https://www.redhat.com, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-qdrouterd-container, summary=Red Hat OpenStack Platform 17.1 qdrouterd) Dec 15 03:41:26 localhost systemd[1]: 6278d648aec9c9f12e0efaafa0d6ae5653eeb34459516699e562eeb9c8d23b3c.service: Deactivated successfully. Dec 15 03:41:42 localhost systemd[1]: Started /usr/bin/podman healthcheck run 165a6fd1f2b629d16da0798ec1c9bac41d302d530a0ffa668b2315a52d6aa04e. Dec 15 03:41:42 localhost systemd[1]: Started /usr/bin/podman healthcheck run 2649c3aec9af2f04509e1d248b1050b202fccc3ed2b86421c89a2a65376db057. Dec 15 03:41:42 localhost systemd[1]: Started /usr/bin/podman healthcheck run 36a88083ed613f6871d2631ae462bca308a45ba583333f7cfe4d0b0c40a18ba5. Dec 15 03:41:42 localhost systemd[1]: Started /usr/bin/podman healthcheck run 97115ff7c5a84ae7e17f79e696e94da3c63f9fe0051287fe9193bec282a03f99. 
Dec 15 03:41:42 localhost systemd[1]: Started /usr/bin/podman healthcheck run ac45612fab357b615b4684dbfa5bd000dd29eb1adfbaedb6db0802fc49e89549. Dec 15 03:41:42 localhost systemd[1]: Started /usr/bin/podman healthcheck run d6041e5471624a495d5be42684f6049ccf71802a80d5760239330151d465d146. Dec 15 03:41:42 localhost systemd[1]: tmp-crun.ovAEhB.mount: Deactivated successfully. Dec 15 03:41:42 localhost podman[91038]: 2025-12-15 08:41:42.80197684 +0000 UTC m=+0.114296358 container health_status 97115ff7c5a84ae7e17f79e696e94da3c63f9fe0051287fe9193bec282a03f99 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, build-date=2025-11-19T00:12:45Z, url=https://www.redhat.com, batch=17.1_20251118.1, architecture=x86_64, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, config_id=tripleo_step4, io.buildah.version=1.41.4, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, vcs-type=git, distribution-scope=public, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-ceilometer-ipmi, release=1761123044, version=17.1.12, managed_by=tripleo_ansible, konflux.additional-tags=17.1.12 17.1_20251118.1, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, container_name=ceilometer_agent_ipmi, com.redhat.component=openstack-ceilometer-ipmi-container, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'fcee5a4a91f85471fca7b61211375646'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': 
['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, tcib_managed=true) Dec 15 03:41:42 localhost podman[91050]: 2025-12-15 08:41:42.821177724 +0000 UTC m=+0.090623335 container health_status d6041e5471624a495d5be42684f6049ccf71802a80d5760239330151d465d146 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, tcib_managed=true, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, architecture=x86_64, vcs-type=git, build-date=2025-11-19T00:11:48Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, version=17.1.12, io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, io.openshift.expose-services=, managed_by=tripleo_ansible, 
distribution-scope=public, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, release=1761123044, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=ceilometer_agent_compute, batch=17.1_20251118.1, config_id=tripleo_step4, vendor=Red Hat, Inc., config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'fcee5a4a91f85471fca7b61211375646'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.component=openstack-ceilometer-compute-container, url=https://www.redhat.com, name=rhosp17/openstack-ceilometer-compute) Dec 15 03:41:42 localhost podman[91044]: 2025-12-15 08:41:42.900101375 +0000 UTC m=+0.204692356 container health_status ac45612fab357b615b4684dbfa5bd000dd29eb1adfbaedb6db0802fc49e89549 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, 
name=logrotate_crond, health_status=healthy, url=https://www.redhat.com, name=rhosp17/openstack-cron, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 cron, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, description=Red Hat OpenStack Platform 17.1 cron, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, release=1761123044, container_name=logrotate_crond, architecture=x86_64, version=17.1.12, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, build-date=2025-11-18T22:49:32Z, config_id=tripleo_step4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, maintainer=OpenStack TripleO Team, batch=17.1_20251118.1, com.redhat.component=openstack-cron-container, io.buildah.version=1.41.4, konflux.additional-tags=17.1.12 17.1_20251118.1, distribution-scope=public, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, 
vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.expose-services=, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, vendor=Red Hat, Inc., org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a) Dec 15 03:41:42 localhost podman[91032]: 2025-12-15 08:41:42.868119889 +0000 UTC m=+0.181373321 container health_status 36a88083ed613f6871d2631ae462bca308a45ba583333f7cfe4d0b0c40a18ba5 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, config_id=tripleo_step5, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, distribution-scope=public, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, tcib_managed=true, batch=17.1_20251118.1, release=1761123044, io.buildah.version=1.41.4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, version=17.1.12, konflux.additional-tags=17.1.12 17.1_20251118.1, name=rhosp17/openstack-nova-compute, url=https://www.redhat.com, container_name=nova_compute, com.redhat.component=openstack-nova-compute-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '182e509007ab5e6e5b2500a552cbd5ba-879500e96bf8dfb93687004bd86f2317'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 
'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, build-date=2025-11-19T00:36:58Z, description=Red Hat OpenStack Platform 17.1 nova-compute) Dec 15 03:41:42 localhost podman[91044]: 2025-12-15 08:41:42.935343697 +0000 UTC m=+0.239934668 container exec_died ac45612fab357b615b4684dbfa5bd000dd29eb1adfbaedb6db0802fc49e89549 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 cron, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, 
architecture=x86_64, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, build-date=2025-11-18T22:49:32Z, container_name=logrotate_crond, batch=17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.component=openstack-cron-container, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, distribution-scope=public, version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp17/openstack-cron, vcs-type=git, vendor=Red Hat, Inc., release=1761123044, io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, summary=Red Hat OpenStack Platform 17.1 cron, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, 
konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.expose-services=, url=https://www.redhat.com, tcib_managed=true) Dec 15 03:41:42 localhost podman[91030]: 2025-12-15 08:41:42.94742114 +0000 UTC m=+0.270878956 container health_status 165a6fd1f2b629d16da0798ec1c9bac41d302d530a0ffa668b2315a52d6aa04e (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, distribution-scope=public, build-date=2025-11-18T22:51:28Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, url=https://www.redhat.com, name=rhosp17/openstack-collectd, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, io.openshift.expose-services=, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, release=1761123044, version=17.1.12, com.redhat.component=openstack-collectd-container, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, architecture=x86_64, batch=17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, summary=Red Hat OpenStack Platform 17.1 collectd, vcs-type=git, konflux.additional-tags=17.1.12 17.1_20251118.1, container_name=collectd, config_id=tripleo_step3, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 collectd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, io.buildah.version=1.41.4, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', 
'/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a) Dec 15 03:41:42 localhost podman[91032]: 2025-12-15 08:41:42.952932938 +0000 UTC m=+0.266186380 container exec_died 36a88083ed613f6871d2631ae462bca308a45ba583333f7cfe4d0b0c40a18ba5 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, tcib_managed=true, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, distribution-scope=public, vcs-type=git, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '182e509007ab5e6e5b2500a552cbd5ba-879500e96bf8dfb93687004bd86f2317'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, container_name=nova_compute, managed_by=tripleo_ansible, com.redhat.component=openstack-nova-compute-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, release=1761123044, url=https://www.redhat.com, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, description=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp17/openstack-nova-compute, config_id=tripleo_step5, summary=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.12, batch=17.1_20251118.1, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., architecture=x86_64, konflux.additional-tags=17.1.12 17.1_20251118.1, build-date=2025-11-19T00:36:58Z) Dec 15 03:41:42 localhost systemd[1]: ac45612fab357b615b4684dbfa5bd000dd29eb1adfbaedb6db0802fc49e89549.service: Deactivated successfully. Dec 15 03:41:42 localhost podman[91030]: 2025-12-15 08:41:42.964306872 +0000 UTC m=+0.287764728 container exec_died 165a6fd1f2b629d16da0798ec1c9bac41d302d530a0ffa668b2315a52d6aa04e (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, konflux.additional-tags=17.1.12 17.1_20251118.1, version=17.1.12, com.redhat.component=openstack-collectd-container, release=1761123044, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, tcib_managed=true, vendor=Red Hat, Inc., distribution-scope=public, architecture=x86_64, config_id=tripleo_step3, managed_by=tripleo_ansible, name=rhosp17/openstack-collectd, url=https://www.redhat.com, container_name=collectd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, summary=Red Hat OpenStack Platform 17.1 collectd, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, batch=17.1_20251118.1, io.buildah.version=1.41.4, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, vcs-type=git, build-date=2025-11-18T22:51:28Z, description=Red Hat OpenStack Platform 17.1 collectd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd) Dec 15 03:41:42 localhost podman[91050]: 2025-12-15 08:41:42.975710817 +0000 UTC m=+0.245156478 container exec_died d6041e5471624a495d5be42684f6049ccf71802a80d5760239330151d465d146 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, managed_by=tripleo_ansible, batch=17.1_20251118.1, vendor=Red Hat, Inc., config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'fcee5a4a91f85471fca7b61211375646'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': 
False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.buildah.version=1.41.4, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, tcib_managed=true, release=1761123044, config_id=tripleo_step4, architecture=x86_64, com.redhat.component=openstack-ceilometer-compute-container, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, version=17.1.12, name=rhosp17/openstack-ceilometer-compute, vcs-type=git, build-date=2025-11-19T00:11:48Z, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, container_name=ceilometer_agent_compute) Dec 15 03:41:42 localhost systemd[1]: 36a88083ed613f6871d2631ae462bca308a45ba583333f7cfe4d0b0c40a18ba5.service: Deactivated successfully. 
Dec 15 03:41:42 localhost systemd[1]: 165a6fd1f2b629d16da0798ec1c9bac41d302d530a0ffa668b2315a52d6aa04e.service: Deactivated successfully. Dec 15 03:41:42 localhost systemd[1]: d6041e5471624a495d5be42684f6049ccf71802a80d5760239330151d465d146.service: Deactivated successfully. Dec 15 03:41:43 localhost podman[91038]: 2025-12-15 08:41:43.019627651 +0000 UTC m=+0.331947209 container exec_died 97115ff7c5a84ae7e17f79e696e94da3c63f9fe0051287fe9193bec282a03f99 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, config_id=tripleo_step4, version=17.1.12, io.openshift.expose-services=, release=1761123044, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, url=https://www.redhat.com, batch=17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, build-date=2025-11-19T00:12:45Z, com.redhat.component=openstack-ceilometer-ipmi-container, name=rhosp17/openstack-ceilometer-ipmi, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, container_name=ceilometer_agent_ipmi, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, io.buildah.version=1.41.4, vcs-type=git, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'fcee5a4a91f85471fca7b61211375646'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, distribution-scope=public, konflux.additional-tags=17.1.12 17.1_20251118.1, tcib_managed=true, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05) Dec 15 03:41:43 localhost systemd[1]: 97115ff7c5a84ae7e17f79e696e94da3c63f9fe0051287fe9193bec282a03f99.service: Deactivated successfully. Dec 15 03:41:43 localhost podman[91031]: 2025-12-15 08:41:43.111408736 +0000 UTC m=+0.429800726 container health_status 2649c3aec9af2f04509e1d248b1050b202fccc3ed2b86421c89a2a65376db057 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, name=rhosp17/openstack-iscsid, io.buildah.version=1.41.4, config_id=tripleo_step3, build-date=2025-11-18T23:44:13Z, description=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '182e509007ab5e6e5b2500a552cbd5ba'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 
'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, container_name=iscsid, managed_by=tripleo_ansible, release=1761123044, vcs-type=git, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 iscsid, batch=17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, tcib_managed=true, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, com.redhat.component=openstack-iscsid-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, version=17.1.12, architecture=x86_64, konflux.additional-tags=17.1.12 17.1_20251118.1) Dec 15 03:41:43 localhost podman[91031]: 2025-12-15 08:41:43.127225109 +0000 UTC m=+0.445617099 container exec_died 2649c3aec9af2f04509e1d248b1050b202fccc3ed2b86421c89a2a65376db057 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.display-name=Red Hat 
OpenStack Platform 17.1 iscsid, architecture=x86_64, batch=17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, container_name=iscsid, description=Red Hat OpenStack Platform 17.1 iscsid, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, build-date=2025-11-18T23:44:13Z, distribution-scope=public, url=https://www.redhat.com, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '182e509007ab5e6e5b2500a552cbd5ba'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, config_id=tripleo_step3, managed_by=tripleo_ansible, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 iscsid, io.buildah.version=1.41.4, com.redhat.component=openstack-iscsid-container, io.openshift.expose-services=, name=rhosp17/openstack-iscsid, version=17.1.12, tcib_managed=true, release=1761123044, konflux.additional-tags=17.1.12 17.1_20251118.1, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, 
cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI) Dec 15 03:41:43 localhost systemd[1]: 2649c3aec9af2f04509e1d248b1050b202fccc3ed2b86421c89a2a65376db057.service: Deactivated successfully. Dec 15 03:41:45 localhost systemd[1]: Started /usr/bin/podman healthcheck run 4c3f871f4bdf287b1d3ce717956c1cf130617489217410eadf6fffcc440eb3b0. Dec 15 03:41:45 localhost podman[91166]: 2025-12-15 08:41:45.748218309 +0000 UTC m=+0.082583109 container health_status 4c3f871f4bdf287b1d3ce717956c1cf130617489217410eadf6fffcc440eb3b0 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, tcib_managed=true, batch=17.1_20251118.1, name=rhosp17/openstack-nova-compute, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, version=17.1.12, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.component=openstack-nova-compute-container, build-date=2025-11-19T00:36:58Z, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '879500e96bf8dfb93687004bd86f2317'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, vcs-type=git, managed_by=tripleo_ansible, container_name=nova_migration_target, config_id=tripleo_step4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, release=1761123044, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, architecture=x86_64, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.4) Dec 15 03:41:46 localhost podman[91166]: 2025-12-15 08:41:46.134279765 +0000 UTC m=+0.468644505 container exec_died 4c3f871f4bdf287b1d3ce717956c1cf130617489217410eadf6fffcc440eb3b0 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., io.openshift.expose-services=, distribution-scope=public, url=https://www.redhat.com, version=17.1.12, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.12 17.1_20251118.1, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, vcs-type=git, release=1761123044, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, build-date=2025-11-19T00:36:58Z, com.redhat.component=openstack-nova-compute-container, io.buildah.version=1.41.4, managed_by=tripleo_ansible, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, config_id=tripleo_step4, summary=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp17/openstack-nova-compute, container_name=nova_migration_target, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20251118.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '879500e96bf8dfb93687004bd86f2317'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, description=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true) Dec 15 03:41:46 localhost systemd[1]: 4c3f871f4bdf287b1d3ce717956c1cf130617489217410eadf6fffcc440eb3b0.service: Deactivated successfully. 
Dec 15 03:41:48 localhost systemd[1]: Started /usr/bin/podman healthcheck run 2df8d20c6ba32f38bd0e6519c9c77a4146b632f21c81197d8c319b223aad81f1. Dec 15 03:41:48 localhost systemd[1]: Started /usr/bin/podman healthcheck run 4d35b7dc3f3a4e3c60661e85d3511f50804e87244d74cab67af5c6e8c447d379. Dec 15 03:41:48 localhost systemd[1]: tmp-crun.N7dAWv.mount: Deactivated successfully. Dec 15 03:41:48 localhost podman[91189]: 2025-12-15 08:41:48.751942138 +0000 UTC m=+0.081216763 container health_status 2df8d20c6ba32f38bd0e6519c9c77a4146b632f21c81197d8c319b223aad81f1 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, container_name=ovn_controller, io.openshift.expose-services=, name=rhosp17/openstack-ovn-controller, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1761123044, config_id=tripleo_step4, version=17.1.12, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, konflux.additional-tags=17.1.12 17.1_20251118.1, vendor=Red Hat, Inc., distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, url=https://www.redhat.com, com.redhat.component=openstack-ovn-controller-container, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, tcib_managed=true, managed_by=tripleo_ansible, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', 
'/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-18T23:34:05Z, description=Red Hat OpenStack Platform 17.1 ovn-controller, io.buildah.version=1.41.4, architecture=x86_64, batch=17.1_20251118.1, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 ovn-controller) Dec 15 03:41:48 localhost systemd[1]: tmp-crun.k1WvwB.mount: Deactivated successfully. Dec 15 03:41:48 localhost podman[91190]: 2025-12-15 08:41:48.808396528 +0000 UTC m=+0.134243911 container health_status 4d35b7dc3f3a4e3c60661e85d3511f50804e87244d74cab67af5c6e8c447d379 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, batch=17.1_20251118.1, name=rhosp17/openstack-neutron-metadata-agent-ovn, version=17.1.12, managed_by=tripleo_ansible, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, distribution-scope=public, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_id=tripleo_step4, release=1761123044, maintainer=OpenStack TripleO Team, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'a56a6f14b467cd9064e40c03defa5ed7'}, 'healthcheck': {'test': 
'/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, architecture=x86_64, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, build-date=2025-11-19T00:14:25Z, vendor=Red Hat, Inc., vcs-type=git, container_name=ovn_metadata_agent, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.component=openstack-neutron-metadata-agent-ovn-container) Dec 15 03:41:48 localhost podman[91189]: 2025-12-15 08:41:48.826479171 +0000 UTC m=+0.155753866 container exec_died 2df8d20c6ba32f38bd0e6519c9c77a4146b632f21c81197d8c319b223aad81f1 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, 
cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vendor=Red Hat, Inc., io.buildah.version=1.41.4, config_id=tripleo_step4, io.openshift.expose-services=, com.redhat.component=openstack-ovn-controller-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, container_name=ovn_controller, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-type=git, konflux.additional-tags=17.1.12 17.1_20251118.1, release=1761123044, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, name=rhosp17/openstack-ovn-controller, summary=Red Hat OpenStack Platform 17.1 ovn-controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, build-date=2025-11-18T23:34:05Z, distribution-scope=public, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, version=17.1.12, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, architecture=x86_64, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 ovn-controller, batch=17.1_20251118.1) Dec 15 03:41:48 localhost systemd[1]: 
2df8d20c6ba32f38bd0e6519c9c77a4146b632f21c81197d8c319b223aad81f1.service: Deactivated successfully. Dec 15 03:41:48 localhost podman[91190]: 2025-12-15 08:41:48.852812046 +0000 UTC m=+0.178659429 container exec_died 4d35b7dc3f3a4e3c60661e85d3511f50804e87244d74cab67af5c6e8c447d379 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, io.buildah.version=1.41.4, vcs-type=git, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'a56a6f14b467cd9064e40c03defa5ed7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, batch=17.1_20251118.1, config_id=tripleo_step4, konflux.additional-tags=17.1.12 17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.tags=rhosp osp 
openstack osp-17.1 openstack-neutron-metadata-agent-ovn, managed_by=tripleo_ansible, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, container_name=ovn_metadata_agent, distribution-scope=public, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, url=https://www.redhat.com, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, release=1761123044, build-date=2025-11-19T00:14:25Z, version=17.1.12, vendor=Red Hat, Inc., name=rhosp17/openstack-neutron-metadata-agent-ovn, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team) Dec 15 03:41:48 localhost systemd[1]: 4d35b7dc3f3a4e3c60661e85d3511f50804e87244d74cab67af5c6e8c447d379.service: Deactivated successfully. Dec 15 03:41:57 localhost systemd[1]: Started /usr/bin/podman healthcheck run 6278d648aec9c9f12e0efaafa0d6ae5653eeb34459516699e562eeb9c8d23b3c. 
Dec 15 03:41:57 localhost podman[91238]: 2025-12-15 08:41:57.744036306 +0000 UTC m=+0.079521879 container health_status 6278d648aec9c9f12e0efaafa0d6ae5653eeb34459516699e562eeb9c8d23b3c (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp17/openstack-qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '29d07da103bbd44a9ed3e29999314b03'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, com.redhat.component=openstack-qdrouterd-container, url=https://www.redhat.com, release=1761123044, build-date=2025-11-18T22:49:46Z, config_id=tripleo_step1, batch=17.1_20251118.1, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.buildah.version=1.41.4, vcs-type=git, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, vendor=Red Hat, Inc., version=17.1.12, io.openshift.expose-services=, tcib_managed=true, maintainer=OpenStack TripleO 
Team, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, description=Red Hat OpenStack Platform 17.1 qdrouterd, container_name=metrics_qdr, managed_by=tripleo_ansible, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, summary=Red Hat OpenStack Platform 17.1 qdrouterd, konflux.additional-tags=17.1.12 17.1_20251118.1, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd) Dec 15 03:41:57 localhost podman[91238]: 2025-12-15 08:41:57.935420644 +0000 UTC m=+0.270906257 container exec_died 6278d648aec9c9f12e0efaafa0d6ae5653eeb34459516699e562eeb9c8d23b3c (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, io.buildah.version=1.41.4, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 qdrouterd, konflux.additional-tags=17.1.12 17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, container_name=metrics_qdr, vcs-type=git, batch=17.1_20251118.1, architecture=x86_64, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, url=https://www.redhat.com, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, tcib_managed=true, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, build-date=2025-11-18T22:49:46Z, config_id=tripleo_step1, release=1761123044, version=17.1.12, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, name=rhosp17/openstack-qdrouterd, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 
qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '29d07da103bbd44a9ed3e29999314b03'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, distribution-scope=public, com.redhat.component=openstack-qdrouterd-container) Dec 15 03:41:57 localhost systemd[1]: 6278d648aec9c9f12e0efaafa0d6ae5653eeb34459516699e562eeb9c8d23b3c.service: Deactivated successfully. Dec 15 03:42:08 localhost systemd[1]: Starting Check and recover tripleo_nova_virtqemud... Dec 15 03:42:08 localhost recover_tripleo_nova_virtqemud[91269]: 61849 Dec 15 03:42:08 localhost systemd[1]: tripleo_nova_virtqemud_recover.service: Deactivated successfully. Dec 15 03:42:08 localhost systemd[1]: Finished Check and recover tripleo_nova_virtqemud. Dec 15 03:42:13 localhost systemd[1]: Started /usr/bin/podman healthcheck run 165a6fd1f2b629d16da0798ec1c9bac41d302d530a0ffa668b2315a52d6aa04e. Dec 15 03:42:13 localhost systemd[1]: Started /usr/bin/podman healthcheck run 2649c3aec9af2f04509e1d248b1050b202fccc3ed2b86421c89a2a65376db057. 
Dec 15 03:42:13 localhost systemd[1]: Started /usr/bin/podman healthcheck run 36a88083ed613f6871d2631ae462bca308a45ba583333f7cfe4d0b0c40a18ba5. Dec 15 03:42:13 localhost systemd[1]: Started /usr/bin/podman healthcheck run 97115ff7c5a84ae7e17f79e696e94da3c63f9fe0051287fe9193bec282a03f99. Dec 15 03:42:13 localhost systemd[1]: Started /usr/bin/podman healthcheck run ac45612fab357b615b4684dbfa5bd000dd29eb1adfbaedb6db0802fc49e89549. Dec 15 03:42:13 localhost systemd[1]: Started /usr/bin/podman healthcheck run d6041e5471624a495d5be42684f6049ccf71802a80d5760239330151d465d146. Dec 15 03:42:13 localhost systemd[1]: tmp-crun.gHeTJq.mount: Deactivated successfully. Dec 15 03:42:13 localhost podman[91270]: 2025-12-15 08:42:13.771231852 +0000 UTC m=+0.092968567 container health_status 165a6fd1f2b629d16da0798ec1c9bac41d302d530a0ffa668b2315a52d6aa04e (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, name=rhosp17/openstack-collectd, io.openshift.expose-services=, config_id=tripleo_step3, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, batch=17.1_20251118.1, vcs-type=git, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, managed_by=tripleo_ansible, distribution-scope=public, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, version=17.1.12, com.redhat.component=openstack-collectd-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': 
['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, vendor=Red Hat, Inc., org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, build-date=2025-11-18T22:51:28Z, container_name=collectd, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, summary=Red Hat OpenStack Platform 17.1 collectd, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 collectd, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, konflux.additional-tags=17.1.12 17.1_20251118.1, release=1761123044) Dec 15 03:42:13 localhost podman[91271]: 2025-12-15 08:42:13.788900715 +0000 UTC m=+0.107959809 container health_status 2649c3aec9af2f04509e1d248b1050b202fccc3ed2b86421c89a2a65376db057 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, build-date=2025-11-18T23:44:13Z, name=rhosp17/openstack-iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '182e509007ab5e6e5b2500a552cbd5ba'}, 'healthcheck': {'test': 
'/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, io.buildah.version=1.41.4, managed_by=tripleo_ansible, release=1761123044, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, version=17.1.12, tcib_managed=true, distribution-scope=public, vcs-type=git, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20251118.1, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, container_name=iscsid, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, url=https://www.redhat.com, konflux.additional-tags=17.1.12 17.1_20251118.1, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.component=openstack-iscsid-container, config_id=tripleo_step3, summary=Red Hat OpenStack Platform 17.1 
iscsid, description=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.expose-services=) Dec 15 03:42:13 localhost podman[91271]: 2025-12-15 08:42:13.797202777 +0000 UTC m=+0.116261871 container exec_died 2649c3aec9af2f04509e1d248b1050b202fccc3ed2b86421c89a2a65376db057 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, version=17.1.12, description=Red Hat OpenStack Platform 17.1 iscsid, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 iscsid, distribution-scope=public, build-date=2025-11-18T23:44:13Z, vendor=Red Hat, Inc., com.redhat.component=openstack-iscsid-container, config_id=tripleo_step3, konflux.additional-tags=17.1.12 17.1_20251118.1, batch=17.1_20251118.1, release=1761123044, io.openshift.expose-services=, name=rhosp17/openstack-iscsid, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, managed_by=tripleo_ansible, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, url=https://www.redhat.com, io.buildah.version=1.41.4, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, architecture=x86_64, maintainer=OpenStack TripleO Team, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, container_name=iscsid, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '182e509007ab5e6e5b2500a552cbd5ba'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}) Dec 15 03:42:13 localhost podman[91270]: 2025-12-15 08:42:13.802413006 +0000 UTC m=+0.124149731 container exec_died 165a6fd1f2b629d16da0798ec1c9bac41d302d530a0ffa668b2315a52d6aa04e (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, distribution-scope=public, vcs-type=git, com.redhat.component=openstack-collectd-container, description=Red Hat OpenStack Platform 17.1 collectd, release=1761123044, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1, batch=17.1_20251118.1, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, version=17.1.12, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, container_name=collectd, maintainer=OpenStack TripleO Team, config_id=tripleo_step3, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, build-date=2025-11-18T22:51:28Z, vendor=Red Hat, Inc., url=https://www.redhat.com, tcib_managed=true, io.buildah.version=1.41.4, config_data={'cap_add': ['IPC_LOCK'], 
'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 collectd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, name=rhosp17/openstack-collectd) Dec 15 03:42:13 localhost systemd[1]: 2649c3aec9af2f04509e1d248b1050b202fccc3ed2b86421c89a2a65376db057.service: Deactivated successfully. Dec 15 03:42:13 localhost systemd[1]: 165a6fd1f2b629d16da0798ec1c9bac41d302d530a0ffa668b2315a52d6aa04e.service: Deactivated successfully. 
Dec 15 03:42:13 localhost podman[91284]: 2025-12-15 08:42:13.840563646 +0000 UTC m=+0.145070360 container health_status ac45612fab357b615b4684dbfa5bd000dd29eb1adfbaedb6db0802fc49e89549 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, config_id=tripleo_step4, com.redhat.component=openstack-cron-container, version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-18T22:49:32Z, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 cron, url=https://www.redhat.com, batch=17.1_20251118.1, konflux.additional-tags=17.1.12 17.1_20251118.1, release=1761123044, container_name=logrotate_crond, io.buildah.version=1.41.4, architecture=x86_64, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, name=rhosp17/openstack-cron, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, tcib_managed=true, vcs-type=git, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', 
'/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, distribution-scope=public, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, description=Red Hat OpenStack Platform 17.1 cron, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron) Dec 15 03:42:13 localhost podman[91278]: 2025-12-15 08:42:13.858144617 +0000 UTC m=+0.171045076 container health_status 97115ff7c5a84ae7e17f79e696e94da3c63f9fe0051287fe9193bec282a03f99 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, konflux.additional-tags=17.1.12 17.1_20251118.1, version=17.1.12, batch=17.1_20251118.1, config_id=tripleo_step4, name=rhosp17/openstack-ceilometer-ipmi, distribution-scope=public, build-date=2025-11-19T00:12:45Z, container_name=ceilometer_agent_ipmi, com.redhat.component=openstack-ceilometer-ipmi-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, io.buildah.version=1.41.4, release=1761123044, url=https://www.redhat.com, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'fcee5a4a91f85471fca7b61211375646'}, 'healthcheck': 
{'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible) Dec 15 03:42:13 localhost podman[91278]: 2025-12-15 08:42:13.888304714 +0000 UTC m=+0.201205153 container exec_died 97115ff7c5a84ae7e17f79e696e94da3c63f9fe0051287fe9193bec282a03f99 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, io.openshift.expose-services=, vendor=Red Hat, Inc., name=rhosp17/openstack-ceilometer-ipmi, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, maintainer=OpenStack TripleO Team, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, io.buildah.version=1.41.4, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 
'TRIPLEO_CONFIG_HASH': 'fcee5a4a91f85471fca7b61211375646'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, config_id=tripleo_step4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, url=https://www.redhat.com, build-date=2025-11-19T00:12:45Z, konflux.additional-tags=17.1.12 17.1_20251118.1, version=17.1.12, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, architecture=x86_64, release=1761123044, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, batch=17.1_20251118.1, com.redhat.component=openstack-ceilometer-ipmi-container, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, tcib_managed=true, container_name=ceilometer_agent_ipmi) Dec 15 03:42:13 localhost systemd[1]: 97115ff7c5a84ae7e17f79e696e94da3c63f9fe0051287fe9193bec282a03f99.service: Deactivated successfully. 
Dec 15 03:42:13 localhost podman[91272]: 2025-12-15 08:42:13.905133764 +0000 UTC m=+0.217218921 container health_status 36a88083ed613f6871d2631ae462bca308a45ba583333f7cfe4d0b0c40a18ba5 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, config_id=tripleo_step5, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, vcs-type=git, url=https://www.redhat.com, managed_by=tripleo_ansible, distribution-scope=public, name=rhosp17/openstack-nova-compute, container_name=nova_compute, tcib_managed=true, batch=17.1_20251118.1, com.redhat.component=openstack-nova-compute-container, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, maintainer=OpenStack TripleO Team, release=1761123044, vendor=Red Hat, Inc., konflux.additional-tags=17.1.12 17.1_20251118.1, version=17.1.12, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, architecture=x86_64, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.buildah.version=1.41.4, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '182e509007ab5e6e5b2500a552cbd5ba-879500e96bf8dfb93687004bd86f2317'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2025-11-19T00:36:58Z) Dec 15 03:42:13 localhost podman[91296]: 2025-12-15 08:42:13.950259211 +0000 UTC m=+0.247073310 container health_status d6041e5471624a495d5be42684f6049ccf71802a80d5760239330151d465d146 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, config_id=tripleo_step4, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, release=1761123044, tcib_managed=true, batch=17.1_20251118.1, io.openshift.expose-services=, vcs-type=git, build-date=2025-11-19T00:11:48Z, container_name=ceilometer_agent_compute, 
org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, name=rhosp17/openstack-ceilometer-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, version=17.1.12, io.buildah.version=1.41.4, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, architecture=x86_64, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'fcee5a4a91f85471fca7b61211375646'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, managed_by=tripleo_ansible, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-ceilometer-compute-container, 
description=Red Hat OpenStack Platform 17.1 ceilometer-compute) Dec 15 03:42:13 localhost podman[91272]: 2025-12-15 08:42:13.956937759 +0000 UTC m=+0.269022936 container exec_died 36a88083ed613f6871d2631ae462bca308a45ba583333f7cfe4d0b0c40a18ba5 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, tcib_managed=true, com.redhat.component=openstack-nova-compute-container, version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_id=tripleo_step5, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, batch=17.1_20251118.1, name=rhosp17/openstack-nova-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, architecture=x86_64, io.buildah.version=1.41.4, container_name=nova_compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, build-date=2025-11-19T00:36:58Z, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '182e509007ab5e6e5b2500a552cbd5ba-879500e96bf8dfb93687004bd86f2317'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, description=Red Hat OpenStack Platform 17.1 nova-compute, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, release=1761123044, distribution-scope=public, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=) Dec 15 03:42:13 localhost systemd[1]: 36a88083ed613f6871d2631ae462bca308a45ba583333f7cfe4d0b0c40a18ba5.service: Deactivated successfully. 
Dec 15 03:42:13 localhost podman[91284]: 2025-12-15 08:42:13.970019119 +0000 UTC m=+0.274525853 container exec_died ac45612fab357b615b4684dbfa5bd000dd29eb1adfbaedb6db0802fc49e89549 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, architecture=x86_64, url=https://www.redhat.com, vendor=Red Hat, Inc., org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 cron, distribution-scope=public, io.openshift.expose-services=, vcs-type=git, name=rhosp17/openstack-cron, build-date=2025-11-18T22:49:32Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, com.redhat.component=openstack-cron-container, container_name=logrotate_crond, description=Red Hat OpenStack Platform 17.1 cron, batch=17.1_20251118.1, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_id=tripleo_step4, tcib_managed=true, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1761123044, io.buildah.version=1.41.4, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, version=17.1.12) Dec 15 03:42:13 localhost podman[91296]: 2025-12-15 08:42:13.977310634 +0000 UTC m=+0.274124753 container exec_died d6041e5471624a495d5be42684f6049ccf71802a80d5760239330151d465d146 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, build-date=2025-11-19T00:11:48Z, config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, url=https://www.redhat.com, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, container_name=ceilometer_agent_compute, release=1761123044, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, vcs-type=git, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, vendor=Red Hat, Inc., tcib_managed=true, managed_by=tripleo_ansible, name=rhosp17/openstack-ceilometer-compute, com.redhat.component=openstack-ceilometer-compute-container, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, 
batch=17.1_20251118.1, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'fcee5a4a91f85471fca7b61211375646'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, architecture=x86_64, distribution-scope=public, version=17.1.12, io.buildah.version=1.41.4) Dec 15 03:42:13 localhost systemd[1]: ac45612fab357b615b4684dbfa5bd000dd29eb1adfbaedb6db0802fc49e89549.service: Deactivated successfully. Dec 15 03:42:13 localhost systemd[1]: d6041e5471624a495d5be42684f6049ccf71802a80d5760239330151d465d146.service: Deactivated successfully. Dec 15 03:42:16 localhost systemd[1]: Started /usr/bin/podman healthcheck run 4c3f871f4bdf287b1d3ce717956c1cf130617489217410eadf6fffcc440eb3b0. Dec 15 03:42:16 localhost systemd[1]: tmp-crun.vdN89w.mount: Deactivated successfully. 
Dec 15 03:42:16 localhost podman[91404]: 2025-12-15 08:42:16.749576461 +0000 UTC m=+0.082627192 container health_status 4c3f871f4bdf287b1d3ce717956c1cf130617489217410eadf6fffcc440eb3b0 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '879500e96bf8dfb93687004bd86f2317'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.component=openstack-nova-compute-container, managed_by=tripleo_ansible, container_name=nova_migration_target, tcib_managed=true, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 
nova-compute, vcs-type=git, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, distribution-scope=public, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., version=17.1.12, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, batch=17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, release=1761123044, build-date=2025-11-19T00:36:58Z, name=rhosp17/openstack-nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_id=tripleo_step4) Dec 15 03:42:17 localhost podman[91404]: 2025-12-15 08:42:17.143920058 +0000 UTC m=+0.476970759 container exec_died 4c3f871f4bdf287b1d3ce717956c1cf130617489217410eadf6fffcc440eb3b0 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, version=17.1.12, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, container_name=nova_migration_target, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, release=1761123044, com.redhat.component=openstack-nova-compute-container, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '879500e96bf8dfb93687004bd86f2317'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, managed_by=tripleo_ansible, vcs-type=git, vendor=Red Hat, Inc., architecture=x86_64, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, konflux.additional-tags=17.1.12 17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp17/openstack-nova-compute, distribution-scope=public, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.openshift.expose-services=, build-date=2025-11-19T00:36:58Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1, config_id=tripleo_step4, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute) Dec 15 03:42:17 localhost systemd[1]: 4c3f871f4bdf287b1d3ce717956c1cf130617489217410eadf6fffcc440eb3b0.service: Deactivated successfully. Dec 15 03:42:19 localhost systemd[1]: Started /usr/bin/podman healthcheck run 2df8d20c6ba32f38bd0e6519c9c77a4146b632f21c81197d8c319b223aad81f1. Dec 15 03:42:19 localhost systemd[1]: Started /usr/bin/podman healthcheck run 4d35b7dc3f3a4e3c60661e85d3511f50804e87244d74cab67af5c6e8c447d379. 
Dec 15 03:42:19 localhost podman[91428]: 2025-12-15 08:42:19.748925583 +0000 UTC m=+0.078805820 container health_status 2df8d20c6ba32f38bd0e6519c9c77a4146b632f21c81197d8c319b223aad81f1 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, description=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-type=git, com.redhat.component=openstack-ovn-controller-container, vendor=Red Hat, Inc., architecture=x86_64, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, tcib_managed=true, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_id=tripleo_step4, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, url=https://www.redhat.com, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, konflux.additional-tags=17.1.12 17.1_20251118.1, container_name=ovn_controller, version=17.1.12, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, summary=Red Hat OpenStack Platform 17.1 ovn-controller, batch=17.1_20251118.1, managed_by=tripleo_ansible, release=1761123044, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', 
'/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, name=rhosp17/openstack-ovn-controller, build-date=2025-11-18T23:34:05Z, distribution-scope=public) Dec 15 03:42:19 localhost podman[91428]: 2025-12-15 08:42:19.799336351 +0000 UTC m=+0.129216548 container exec_died 2df8d20c6ba32f38bd0e6519c9c77a4146b632f21c81197d8c319b223aad81f1 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, url=https://www.redhat.com, build-date=2025-11-18T23:34:05Z, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, version=17.1.12, summary=Red Hat OpenStack Platform 17.1 ovn-controller, vendor=Red Hat, Inc., batch=17.1_20251118.1, config_id=tripleo_step4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp17/openstack-ovn-controller, architecture=x86_64, konflux.additional-tags=17.1.12 17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, distribution-scope=public, io.buildah.version=1.41.4, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, tcib_managed=true, maintainer=OpenStack TripleO Team, release=1761123044, managed_by=tripleo_ansible, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.expose-services=, com.redhat.component=openstack-ovn-controller-container, container_name=ovn_controller, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, description=Red Hat OpenStack Platform 17.1 ovn-controller, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller) Dec 15 03:42:19 localhost systemd[1]: tmp-crun.ncZl66.mount: Deactivated successfully. Dec 15 03:42:19 localhost podman[91429]: 2025-12-15 08:42:19.812296657 +0000 UTC m=+0.139029209 container health_status 4d35b7dc3f3a4e3c60661e85d3511f50804e87244d74cab67af5c6e8c447d379 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, container_name=ovn_metadata_agent, name=rhosp17/openstack-neutron-metadata-agent-ovn, config_id=tripleo_step4, io.openshift.expose-services=, build-date=2025-11-19T00:14:25Z, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.12, vendor=Red Hat, Inc., io.buildah.version=1.41.4, konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible, batch=17.1_20251118.1, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'a56a6f14b467cd9064e40c03defa5ed7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 
'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, maintainer=OpenStack TripleO Team, release=1761123044, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, url=https://www.redhat.com, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn) Dec 15 03:42:19 localhost systemd[1]: 2df8d20c6ba32f38bd0e6519c9c77a4146b632f21c81197d8c319b223aad81f1.service: Deactivated successfully. 
Dec 15 03:42:19 localhost podman[91429]: 2025-12-15 08:42:19.882950157 +0000 UTC m=+0.209682739 container exec_died 4d35b7dc3f3a4e3c60661e85d3511f50804e87244d74cab67af5c6e8c447d379 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, maintainer=OpenStack TripleO Team, release=1761123044, version=17.1.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.buildah.version=1.41.4, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'a56a6f14b467cd9064e40c03defa5ed7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', 
'/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, batch=17.1_20251118.1, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, managed_by=tripleo_ansible, build-date=2025-11-19T00:14:25Z, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, url=https://www.redhat.com, vcs-type=git, architecture=x86_64, name=rhosp17/openstack-neutron-metadata-agent-ovn, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, container_name=ovn_metadata_agent, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, konflux.additional-tags=17.1.12 17.1_20251118.1, config_id=tripleo_step4, distribution-scope=public) Dec 15 03:42:19 localhost systemd[1]: 4d35b7dc3f3a4e3c60661e85d3511f50804e87244d74cab67af5c6e8c447d379.service: Deactivated successfully. Dec 15 03:42:28 localhost systemd[1]: Started /usr/bin/podman healthcheck run 6278d648aec9c9f12e0efaafa0d6ae5653eeb34459516699e562eeb9c8d23b3c. 
Dec 15 03:42:28 localhost podman[91476]: 2025-12-15 08:42:28.756901003 +0000 UTC m=+0.088274711 container health_status 6278d648aec9c9f12e0efaafa0d6ae5653eeb34459516699e562eeb9c8d23b3c (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, container_name=metrics_qdr, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 qdrouterd, name=rhosp17/openstack-qdrouterd, konflux.additional-tags=17.1.12 17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-type=git, distribution-scope=public, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-qdrouterd-container, managed_by=tripleo_ansible, url=https://www.redhat.com, io.buildah.version=1.41.4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-18T22:49:46Z, config_id=tripleo_step1, batch=17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1761123044, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '29d07da103bbd44a9ed3e29999314b03'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, vendor=Red Hat, Inc., version=17.1.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, tcib_managed=true) Dec 15 03:42:28 localhost podman[91476]: 2025-12-15 08:42:28.963315704 +0000 UTC m=+0.294689382 container exec_died 6278d648aec9c9f12e0efaafa0d6ae5653eeb34459516699e562eeb9c8d23b3c (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, architecture=x86_64, config_id=tripleo_step1, konflux.additional-tags=17.1.12 17.1_20251118.1, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, url=https://www.redhat.com, name=rhosp17/openstack-qdrouterd, version=17.1.12, release=1761123044, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, summary=Red Hat OpenStack Platform 17.1 qdrouterd, description=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, tcib_managed=true, io.openshift.expose-services=, com.redhat.component=openstack-qdrouterd-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, vendor=Red Hat, Inc., container_name=metrics_qdr, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 
'TRIPLEO_CONFIG_HASH': '29d07da103bbd44a9ed3e29999314b03'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, build-date=2025-11-18T22:49:46Z, batch=17.1_20251118.1, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-type=git, io.buildah.version=1.41.4) Dec 15 03:42:28 localhost systemd[1]: 6278d648aec9c9f12e0efaafa0d6ae5653eeb34459516699e562eeb9c8d23b3c.service: Deactivated successfully. Dec 15 03:42:40 localhost sshd[91583]: main: sshd: ssh-rsa algorithm is disabled Dec 15 03:42:44 localhost systemd[1]: Started /usr/bin/podman healthcheck run 165a6fd1f2b629d16da0798ec1c9bac41d302d530a0ffa668b2315a52d6aa04e. Dec 15 03:42:44 localhost systemd[1]: Started /usr/bin/podman healthcheck run 2649c3aec9af2f04509e1d248b1050b202fccc3ed2b86421c89a2a65376db057. Dec 15 03:42:44 localhost systemd[1]: Started /usr/bin/podman healthcheck run 36a88083ed613f6871d2631ae462bca308a45ba583333f7cfe4d0b0c40a18ba5. 
Dec 15 03:42:44 localhost systemd[1]: Started /usr/bin/podman healthcheck run 97115ff7c5a84ae7e17f79e696e94da3c63f9fe0051287fe9193bec282a03f99. Dec 15 03:42:44 localhost systemd[1]: Started /usr/bin/podman healthcheck run ac45612fab357b615b4684dbfa5bd000dd29eb1adfbaedb6db0802fc49e89549. Dec 15 03:42:44 localhost systemd[1]: Started /usr/bin/podman healthcheck run d6041e5471624a495d5be42684f6049ccf71802a80d5760239330151d465d146. Dec 15 03:42:44 localhost systemd[1]: tmp-crun.r7ZlHZ.mount: Deactivated successfully. Dec 15 03:42:44 localhost podman[91587]: 2025-12-15 08:42:44.785084152 +0000 UTC m=+0.098735703 container health_status 36a88083ed613f6871d2631ae462bca308a45ba583333f7cfe4d0b0c40a18ba5 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '182e509007ab5e6e5b2500a552cbd5ba-879500e96bf8dfb93687004bd86f2317'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-type=git, maintainer=OpenStack TripleO Team, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2025-11-19T00:36:58Z, io.openshift.expose-services=, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, summary=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.component=openstack-nova-compute-container, config_id=tripleo_step5, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, description=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, container_name=nova_compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, url=https://www.redhat.com, vendor=Red Hat, Inc., distribution-scope=public, release=1761123044, io.buildah.version=1.41.4, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, batch=17.1_20251118.1, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.12 17.1_20251118.1, name=rhosp17/openstack-nova-compute, version=17.1.12) Dec 15 03:42:44 localhost podman[91593]: 2025-12-15 08:42:44.793047124 +0000 UTC m=+0.108496483 container health_status 97115ff7c5a84ae7e17f79e696e94da3c63f9fe0051287fe9193bec282a03f99 
(image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, container_name=ceilometer_agent_ipmi, url=https://www.redhat.com, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, io.buildah.version=1.41.4, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-19T00:12:45Z, com.redhat.component=openstack-ceilometer-ipmi-container, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.12, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, name=rhosp17/openstack-ceilometer-ipmi, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, batch=17.1_20251118.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'fcee5a4a91f85471fca7b61211375646'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, managed_by=tripleo_ansible, release=1761123044, distribution-scope=public, tcib_managed=true, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, konflux.additional-tags=17.1.12 17.1_20251118.1, config_id=tripleo_step4, architecture=x86_64) Dec 15 03:42:44 localhost podman[91586]: 2025-12-15 08:42:44.744884926 +0000 UTC m=+0.066385006 container health_status 2649c3aec9af2f04509e1d248b1050b202fccc3ed2b86421c89a2a65376db057 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, batch=17.1_20251118.1, managed_by=tripleo_ansible, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, io.openshift.expose-services=, tcib_managed=true, release=1761123044, vcs-type=git, description=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, summary=Red Hat OpenStack Platform 17.1 iscsid, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, com.redhat.component=openstack-iscsid-container, config_id=tripleo_step3, name=rhosp17/openstack-iscsid, konflux.additional-tags=17.1.12 17.1_20251118.1, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-18T23:44:13Z, architecture=x86_64, container_name=iscsid, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '182e509007ab5e6e5b2500a552cbd5ba'}, 
'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, version=17.1.12, io.buildah.version=1.41.4, distribution-scope=public) Dec 15 03:42:44 localhost podman[91587]: 2025-12-15 08:42:44.81232965 +0000 UTC m=+0.125981231 container exec_died 36a88083ed613f6871d2631ae462bca308a45ba583333f7cfe4d0b0c40a18ba5 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, build-date=2025-11-19T00:36:58Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, managed_by=tripleo_ansible, architecture=x86_64, io.openshift.expose-services=, config_id=tripleo_step5, com.redhat.component=openstack-nova-compute-container, name=rhosp17/openstack-nova-compute, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 nova-compute, release=1761123044, version=17.1.12, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20251118.1, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, distribution-scope=public, io.k8s.display-name=Red Hat 
OpenStack Platform 17.1 nova-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 17.1_20251118.1, url=https://www.redhat.com, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '182e509007ab5e6e5b2500a552cbd5ba-879500e96bf8dfb93687004bd86f2317'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', 
'/var/lib/nova:/var/lib/nova:shared']}, container_name=nova_compute, vendor=Red Hat, Inc., io.buildah.version=1.41.4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true) Dec 15 03:42:44 localhost systemd[1]: 36a88083ed613f6871d2631ae462bca308a45ba583333f7cfe4d0b0c40a18ba5.service: Deactivated successfully. Dec 15 03:42:44 localhost podman[91586]: 2025-12-15 08:42:44.830362783 +0000 UTC m=+0.151862843 container exec_died 2649c3aec9af2f04509e1d248b1050b202fccc3ed2b86421c89a2a65376db057 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, tcib_managed=true, release=1761123044, vcs-type=git, description=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, name=rhosp17/openstack-iscsid, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, managed_by=tripleo_ansible, container_name=iscsid, version=17.1.12, url=https://www.redhat.com, konflux.additional-tags=17.1.12 17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, distribution-scope=public, architecture=x86_64, batch=17.1_20251118.1, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, config_id=tripleo_step3, summary=Red Hat OpenStack Platform 17.1 iscsid, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '182e509007ab5e6e5b2500a552cbd5ba'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, vendor=Red Hat, Inc., build-date=2025-11-18T23:44:13Z, com.redhat.component=openstack-iscsid-container, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.expose-services=, io.buildah.version=1.41.4) Dec 15 03:42:44 localhost systemd[1]: 2649c3aec9af2f04509e1d248b1050b202fccc3ed2b86421c89a2a65376db057.service: Deactivated successfully. 
Dec 15 03:42:44 localhost podman[91593]: 2025-12-15 08:42:44.867399614 +0000 UTC m=+0.182849043 container exec_died 97115ff7c5a84ae7e17f79e696e94da3c63f9fe0051287fe9193bec282a03f99 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, build-date=2025-11-19T00:12:45Z, url=https://www.redhat.com, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, io.openshift.expose-services=, release=1761123044, batch=17.1_20251118.1, vcs-type=git, version=17.1.12, managed_by=tripleo_ansible, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, com.redhat.component=openstack-ceilometer-ipmi-container, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.buildah.version=1.41.4, vendor=Red Hat, Inc., container_name=ceilometer_agent_ipmi, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, tcib_managed=true, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, konflux.additional-tags=17.1.12 17.1_20251118.1, name=rhosp17/openstack-ceilometer-ipmi, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, config_id=tripleo_step4, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'fcee5a4a91f85471fca7b61211375646'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05) Dec 15 03:42:44 localhost podman[91599]: 2025-12-15 08:42:44.875383307 +0000 UTC m=+0.184072885 container health_status ac45612fab357b615b4684dbfa5bd000dd29eb1adfbaedb6db0802fc49e89549 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, config_id=tripleo_step4, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, managed_by=tripleo_ansible, container_name=logrotate_crond, description=Red Hat OpenStack Platform 17.1 cron, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, batch=17.1_20251118.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, version=17.1.12, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, distribution-scope=public, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, url=https://www.redhat.com, vcs-type=git, com.redhat.component=openstack-cron-container, tcib_managed=true, konflux.additional-tags=17.1.12 17.1_20251118.1, release=1761123044, name=rhosp17/openstack-cron, build-date=2025-11-18T22:49:32Z, summary=Red Hat OpenStack Platform 17.1 cron, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.buildah.version=1.41.4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron) Dec 15 03:42:44 localhost systemd[1]: 97115ff7c5a84ae7e17f79e696e94da3c63f9fe0051287fe9193bec282a03f99.service: Deactivated successfully. 
Dec 15 03:42:44 localhost podman[91585]: 2025-12-15 08:42:44.906475819 +0000 UTC m=+0.229357617 container health_status 165a6fd1f2b629d16da0798ec1c9bac41d302d530a0ffa668b2315a52d6aa04e (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, release=1761123044, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, container_name=collectd, build-date=2025-11-18T22:51:28Z, batch=17.1_20251118.1, version=17.1.12, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, io.openshift.tags=rhosp osp 
openstack osp-17.1 openstack-collectd, konflux.additional-tags=17.1.12 17.1_20251118.1, name=rhosp17/openstack-collectd, distribution-scope=public, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_id=tripleo_step3, vendor=Red Hat, Inc., url=https://www.redhat.com, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-collectd-container, tcib_managed=true, io.buildah.version=1.41.4, io.openshift.expose-services=, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 collectd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 collectd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a) Dec 15 03:42:44 localhost podman[91599]: 2025-12-15 08:42:44.913452035 +0000 UTC m=+0.222141623 container exec_died ac45612fab357b615b4684dbfa5bd000dd29eb1adfbaedb6db0802fc49e89549 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, url=https://www.redhat.com, container_name=logrotate_crond, managed_by=tripleo_ansible, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, build-date=2025-11-18T22:49:32Z, summary=Red Hat OpenStack Platform 17.1 cron, config_id=tripleo_step4, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.component=openstack-cron-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, release=1761123044, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': 
'/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, name=rhosp17/openstack-cron, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, distribution-scope=public, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.4, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 cron, tcib_managed=true, batch=17.1_20251118.1, version=17.1.12, architecture=x86_64) Dec 15 03:42:44 localhost systemd[1]: ac45612fab357b615b4684dbfa5bd000dd29eb1adfbaedb6db0802fc49e89549.service: Deactivated successfully. 
Dec 15 03:42:44 localhost podman[91600]: 2025-12-15 08:42:44.884423839 +0000 UTC m=+0.189736197 container health_status d6041e5471624a495d5be42684f6049ccf71802a80d5760239330151d465d146 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, url=https://www.redhat.com, version=17.1.12, vcs-type=git, com.redhat.component=openstack-ceilometer-compute-container, io.buildah.version=1.41.4, konflux.additional-tags=17.1.12 17.1_20251118.1, config_id=tripleo_step4, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, maintainer=OpenStack TripleO Team, architecture=x86_64, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, tcib_managed=true, container_name=ceilometer_agent_compute, batch=17.1_20251118.1, managed_by=tripleo_ansible, name=rhosp17/openstack-ceilometer-compute, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, vendor=Red Hat, Inc., build-date=2025-11-19T00:11:48Z, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, release=1761123044, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'fcee5a4a91f85471fca7b61211375646'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', 
'/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream) Dec 15 03:42:44 localhost podman[91585]: 2025-12-15 08:42:44.940294594 +0000 UTC m=+0.263176392 container exec_died 165a6fd1f2b629d16da0798ec1c9bac41d302d530a0ffa668b2315a52d6aa04e (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, build-date=2025-11-18T22:51:28Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, io.openshift.expose-services=, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 collectd, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, release=1761123044, architecture=x86_64, name=rhosp17/openstack-collectd, batch=17.1_20251118.1, io.buildah.version=1.41.4, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 
'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.12 17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-type=git, tcib_managed=true, config_id=tripleo_step3, container_name=collectd, version=17.1.12, url=https://www.redhat.com, managed_by=tripleo_ansible, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.component=openstack-collectd-container, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a) Dec 15 03:42:44 localhost systemd[1]: 165a6fd1f2b629d16da0798ec1c9bac41d302d530a0ffa668b2315a52d6aa04e.service: Deactivated successfully. 
Dec 15 03:42:44 localhost podman[91600]: 2025-12-15 08:42:44.966397002 +0000 UTC m=+0.271709360 container exec_died d6041e5471624a495d5be42684f6049ccf71802a80d5760239330151d465d146 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, name=rhosp17/openstack-ceilometer-compute, com.redhat.component=openstack-ceilometer-compute-container, release=1761123044, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, build-date=2025-11-19T00:11:48Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, url=https://www.redhat.com, batch=17.1_20251118.1, config_id=tripleo_step4, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, distribution-scope=public, io.openshift.expose-services=, io.buildah.version=1.41.4, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'fcee5a4a91f85471fca7b61211375646'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, version=17.1.12, tcib_managed=true, container_name=ceilometer_agent_compute, konflux.additional-tags=17.1.12 17.1_20251118.1) Dec 15 03:42:44 localhost systemd[1]: d6041e5471624a495d5be42684f6049ccf71802a80d5760239330151d465d146.service: Deactivated successfully. Dec 15 03:42:47 localhost systemd[1]: Started /usr/bin/podman healthcheck run 4c3f871f4bdf287b1d3ce717956c1cf130617489217410eadf6fffcc440eb3b0. 
Dec 15 03:42:47 localhost podman[91718]: 2025-12-15 08:42:47.731723409 +0000 UTC m=+0.063347985 container health_status 4c3f871f4bdf287b1d3ce717956c1cf130617489217410eadf6fffcc440eb3b0 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, url=https://www.redhat.com, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, release=1761123044, maintainer=OpenStack TripleO Team, batch=17.1_20251118.1, vcs-type=git, name=rhosp17/openstack-nova-compute, version=17.1.12, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, io.openshift.expose-services=, distribution-scope=public, tcib_managed=true, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step4, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '879500e96bf8dfb93687004bd86f2317'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, build-date=2025-11-19T00:36:58Z, io.buildah.version=1.41.4, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, description=Red Hat OpenStack Platform 17.1 nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vendor=Red Hat, Inc., konflux.additional-tags=17.1.12 17.1_20251118.1, container_name=nova_migration_target, com.redhat.component=openstack-nova-compute-container) Dec 15 03:42:48 localhost podman[91718]: 2025-12-15 08:42:48.09954925 +0000 UTC m=+0.431173836 container exec_died 4c3f871f4bdf287b1d3ce717956c1cf130617489217410eadf6fffcc440eb3b0 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, vendor=Red Hat, Inc., tcib_managed=true, config_id=tripleo_step4, name=rhosp17/openstack-nova-compute, architecture=x86_64, managed_by=tripleo_ansible, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, com.redhat.component=openstack-nova-compute-container, version=17.1.12, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '879500e96bf8dfb93687004bd86f2317'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, release=1761123044, container_name=nova_migration_target, description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, distribution-scope=public, vcs-type=git, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, batch=17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, build-date=2025-11-19T00:36:58Z, io.buildah.version=1.41.4, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, maintainer=OpenStack TripleO Team) Dec 15 03:42:48 localhost systemd[1]: 4c3f871f4bdf287b1d3ce717956c1cf130617489217410eadf6fffcc440eb3b0.service: Deactivated successfully. Dec 15 03:42:50 localhost systemd[1]: Started /usr/bin/podman healthcheck run 2df8d20c6ba32f38bd0e6519c9c77a4146b632f21c81197d8c319b223aad81f1. Dec 15 03:42:50 localhost systemd[1]: Started /usr/bin/podman healthcheck run 4d35b7dc3f3a4e3c60661e85d3511f50804e87244d74cab67af5c6e8c447d379. Dec 15 03:42:50 localhost systemd[1]: tmp-crun.dKBkNR.mount: Deactivated successfully. 
Dec 15 03:42:50 localhost podman[91742]: 2025-12-15 08:42:50.770564674 +0000 UTC m=+0.098265730 container health_status 2df8d20c6ba32f38bd0e6519c9c77a4146b632f21c81197d8c319b223aad81f1 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, com.redhat.component=openstack-ovn-controller-container, description=Red Hat OpenStack Platform 17.1 ovn-controller, managed_by=tripleo_ansible, konflux.additional-tags=17.1.12 17.1_20251118.1, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, io.openshift.expose-services=, batch=17.1_20251118.1, container_name=ovn_controller, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, url=https://www.redhat.com, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, release=1761123044, architecture=x86_64, io.buildah.version=1.41.4, build-date=2025-11-18T23:34:05Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-type=git, config_id=tripleo_step4, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, summary=Red Hat OpenStack Platform 17.1 ovn-controller, name=rhosp17/openstack-ovn-controller, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, 
io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, tcib_managed=true, distribution-scope=public, maintainer=OpenStack TripleO Team, version=17.1.12, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream) Dec 15 03:42:50 localhost systemd[1]: tmp-crun.JPKUbE.mount: Deactivated successfully. Dec 15 03:42:50 localhost podman[91743]: 2025-12-15 08:42:50.822168785 +0000 UTC m=+0.147457296 container health_status 4d35b7dc3f3a4e3c60661e85d3511f50804e87244d74cab67af5c6e8c447d379 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'a56a6f14b467cd9064e40c03defa5ed7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, version=17.1.12, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, build-date=2025-11-19T00:14:25Z, tcib_managed=true, batch=17.1_20251118.1, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vendor=Red Hat, Inc., distribution-scope=public, io.buildah.version=1.41.4, config_id=tripleo_step4, container_name=ovn_metadata_agent, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-type=git, release=1761123044, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, name=rhosp17/openstack-neutron-metadata-agent-ovn, architecture=x86_64, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1) Dec 15 03:42:50 localhost podman[91742]: 2025-12-15 08:42:50.874738841 +0000 UTC m=+0.202439897 container exec_died 2df8d20c6ba32f38bd0e6519c9c77a4146b632f21c81197d8c319b223aad81f1 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, distribution-scope=public, config_id=tripleo_step4, name=rhosp17/openstack-ovn-controller, io.openshift.expose-services=, vcs-type=git, batch=17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, 
vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.12, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, tcib_managed=true, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, summary=Red Hat OpenStack Platform 17.1 ovn-controller, description=Red Hat OpenStack Platform 17.1 ovn-controller, container_name=ovn_controller, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, managed_by=tripleo_ansible, release=1761123044, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.component=openstack-ovn-controller-container, io.buildah.version=1.41.4, build-date=2025-11-18T23:34:05Z, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 17.1_20251118.1, url=https://www.redhat.com) Dec 15 03:42:50 localhost podman[91743]: 2025-12-15 08:42:50.889198338 +0000 UTC m=+0.214486829 container exec_died 4d35b7dc3f3a4e3c60661e85d3511f50804e87244d74cab67af5c6e8c447d379 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, 
org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, container_name=ovn_metadata_agent, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, maintainer=OpenStack TripleO Team, version=17.1.12, config_id=tripleo_step4, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, io.buildah.version=1.41.4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, tcib_managed=true, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'a56a6f14b467cd9064e40c03defa5ed7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', 
'/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, build-date=2025-11-19T00:14:25Z, io.openshift.expose-services=, name=rhosp17/openstack-neutron-metadata-agent-ovn, distribution-scope=public, batch=17.1_20251118.1, vcs-type=git, release=1761123044, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn) Dec 15 03:42:50 localhost systemd[1]: 2df8d20c6ba32f38bd0e6519c9c77a4146b632f21c81197d8c319b223aad81f1.service: Deactivated successfully. Dec 15 03:42:50 localhost systemd[1]: 4d35b7dc3f3a4e3c60661e85d3511f50804e87244d74cab67af5c6e8c447d379.service: Deactivated successfully. Dec 15 03:42:59 localhost systemd[1]: Started /usr/bin/podman healthcheck run 6278d648aec9c9f12e0efaafa0d6ae5653eeb34459516699e562eeb9c8d23b3c. 
Dec 15 03:42:59 localhost podman[91791]: 2025-12-15 08:42:59.750582905 +0000 UTC m=+0.085628542 container health_status 6278d648aec9c9f12e0efaafa0d6ae5653eeb34459516699e562eeb9c8d23b3c (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_id=tripleo_step1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, build-date=2025-11-18T22:49:46Z, vendor=Red Hat, Inc., io.openshift.expose-services=, architecture=x86_64, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.component=openstack-qdrouterd-container, name=rhosp17/openstack-qdrouterd, container_name=metrics_qdr, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '29d07da103bbd44a9ed3e29999314b03'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', 
'/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, summary=Red Hat OpenStack Platform 17.1 qdrouterd, konflux.additional-tags=17.1.12 17.1_20251118.1, version=17.1.12, maintainer=OpenStack TripleO Team, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1761123044, batch=17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.buildah.version=1.41.4, distribution-scope=public, managed_by=tripleo_ansible) Dec 15 03:42:59 localhost podman[91791]: 2025-12-15 08:42:59.97658624 +0000 UTC m=+0.311631807 container exec_died 6278d648aec9c9f12e0efaafa0d6ae5653eeb34459516699e562eeb9c8d23b3c (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 qdrouterd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, description=Red Hat OpenStack Platform 17.1 qdrouterd, url=https://www.redhat.com, version=17.1.12, tcib_managed=true, name=rhosp17/openstack-qdrouterd, build-date=2025-11-18T22:49:46Z, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '29d07da103bbd44a9ed3e29999314b03'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, maintainer=OpenStack TripleO Team, release=1761123044, com.redhat.component=openstack-qdrouterd-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, konflux.additional-tags=17.1.12 17.1_20251118.1, batch=17.1_20251118.1, architecture=x86_64, io.openshift.expose-services=, container_name=metrics_qdr, managed_by=tripleo_ansible, config_id=tripleo_step1, vendor=Red Hat, Inc., distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.buildah.version=1.41.4) Dec 15 03:43:00 localhost systemd[1]: 6278d648aec9c9f12e0efaafa0d6ae5653eeb34459516699e562eeb9c8d23b3c.service: Deactivated successfully. Dec 15 03:43:15 localhost systemd[1]: Started /usr/bin/podman healthcheck run 165a6fd1f2b629d16da0798ec1c9bac41d302d530a0ffa668b2315a52d6aa04e. Dec 15 03:43:15 localhost systemd[1]: Started /usr/bin/podman healthcheck run 2649c3aec9af2f04509e1d248b1050b202fccc3ed2b86421c89a2a65376db057. Dec 15 03:43:15 localhost systemd[1]: Started /usr/bin/podman healthcheck run 36a88083ed613f6871d2631ae462bca308a45ba583333f7cfe4d0b0c40a18ba5. Dec 15 03:43:15 localhost systemd[1]: Started /usr/bin/podman healthcheck run 97115ff7c5a84ae7e17f79e696e94da3c63f9fe0051287fe9193bec282a03f99. 
Dec 15 03:43:15 localhost systemd[1]: Started /usr/bin/podman healthcheck run ac45612fab357b615b4684dbfa5bd000dd29eb1adfbaedb6db0802fc49e89549. Dec 15 03:43:15 localhost systemd[1]: Started /usr/bin/podman healthcheck run d6041e5471624a495d5be42684f6049ccf71802a80d5760239330151d465d146. Dec 15 03:43:15 localhost systemd[1]: tmp-crun.0Cc1PK.mount: Deactivated successfully. Dec 15 03:43:15 localhost podman[91820]: 2025-12-15 08:43:15.820056741 +0000 UTC m=+0.108031691 container health_status 2649c3aec9af2f04509e1d248b1050b202fccc3ed2b86421c89a2a65376db057 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '182e509007ab5e6e5b2500a552cbd5ba'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, vcs-type=git, io.buildah.version=1.41.4, io.openshift.expose-services=, 
distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, architecture=x86_64, konflux.additional-tags=17.1.12 17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 iscsid, description=Red Hat OpenStack Platform 17.1 iscsid, container_name=iscsid, tcib_managed=true, config_id=tripleo_step3, release=1761123044, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vendor=Red Hat, Inc., managed_by=tripleo_ansible, version=17.1.12, com.redhat.component=openstack-iscsid-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, batch=17.1_20251118.1, build-date=2025-11-18T23:44:13Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, name=rhosp17/openstack-iscsid) Dec 15 03:43:15 localhost podman[91833]: 2025-12-15 08:43:15.870624334 +0000 UTC m=+0.145839683 container health_status ac45612fab357b615b4684dbfa5bd000dd29eb1adfbaedb6db0802fc49e89549 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible, name=rhosp17/openstack-cron, url=https://www.redhat.com, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-18T22:49:32Z, release=1761123044, com.redhat.component=openstack-cron-container, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 cron, version=17.1.12, 
io.buildah.version=1.41.4, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 cron, maintainer=OpenStack TripleO Team, container_name=logrotate_crond, config_id=tripleo_step4, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, distribution-scope=public, batch=17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, vcs-type=git, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, architecture=x86_64) Dec 15 03:43:15 localhost podman[91819]: 2025-12-15 08:43:15.850849235 +0000 UTC m=+0.142120254 container health_status 165a6fd1f2b629d16da0798ec1c9bac41d302d530a0ffa668b2315a52d6aa04e (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.12 17.1_20251118.1, tcib_managed=true, 
url=https://www.redhat.com, container_name=collectd, version=17.1.12, io.buildah.version=1.41.4, description=Red Hat OpenStack Platform 17.1 collectd, name=rhosp17/openstack-collectd, io.openshift.expose-services=, distribution-scope=public, vcs-type=git, config_id=tripleo_step3, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, managed_by=tripleo_ansible, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, maintainer=OpenStack TripleO Team, batch=17.1_20251118.1, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, 
release=1761123044, build-date=2025-11-18T22:51:28Z, summary=Red Hat OpenStack Platform 17.1 collectd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, com.redhat.component=openstack-collectd-container, vendor=Red Hat, Inc.) Dec 15 03:43:15 localhost podman[91820]: 2025-12-15 08:43:15.904443338 +0000 UTC m=+0.192418328 container exec_died 2649c3aec9af2f04509e1d248b1050b202fccc3ed2b86421c89a2a65376db057 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, release=1761123044, architecture=x86_64, com.redhat.component=openstack-iscsid-container, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.4, version=17.1.12, config_id=tripleo_step3, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, description=Red Hat OpenStack Platform 17.1 iscsid, distribution-scope=public, batch=17.1_20251118.1, container_name=iscsid, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, name=rhosp17/openstack-iscsid, vendor=Red Hat, Inc., vcs-type=git, tcib_managed=true, konflux.additional-tags=17.1.12 17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, build-date=2025-11-18T23:44:13Z, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '182e509007ab5e6e5b2500a552cbd5ba'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, summary=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible) Dec 15 03:43:15 localhost podman[91840]: 2025-12-15 08:43:15.912884964 +0000 UTC m=+0.187355253 container health_status d6041e5471624a495d5be42684f6049ccf71802a80d5760239330151d465d146 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, tcib_managed=true, architecture=x86_64, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'fcee5a4a91f85471fca7b61211375646'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.openshift.expose-services=, name=rhosp17/openstack-ceilometer-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, build-date=2025-11-19T00:11:48Z, distribution-scope=public, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, release=1761123044, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, version=17.1.12, com.redhat.component=openstack-ceilometer-compute-container, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, config_id=tripleo_step4, vendor=Red Hat, Inc., managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, io.buildah.version=1.41.4, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, batch=17.1_20251118.1, container_name=ceilometer_agent_compute) Dec 15 03:43:15 localhost systemd[1]: 2649c3aec9af2f04509e1d248b1050b202fccc3ed2b86421c89a2a65376db057.service: Deactivated successfully. 
Dec 15 03:43:15 localhost podman[91833]: 2025-12-15 08:43:15.956509612 +0000 UTC m=+0.231724991 container exec_died ac45612fab357b615b4684dbfa5bd000dd29eb1adfbaedb6db0802fc49e89549 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-cron-container, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, container_name=logrotate_crond, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 cron, vcs-type=git, url=https://www.redhat.com, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.expose-services=, batch=17.1_20251118.1, tcib_managed=true, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1761123044, version=17.1.12, io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible, name=rhosp17/openstack-cron, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 cron, architecture=x86_64, build-date=2025-11-18T22:49:32Z, config_id=tripleo_step4, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a) Dec 15 03:43:15 localhost podman[91821]: 2025-12-15 08:43:15.964384612 +0000 UTC m=+0.249188777 container health_status 36a88083ed613f6871d2631ae462bca308a45ba583333f7cfe4d0b0c40a18ba5 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, distribution-scope=public, managed_by=tripleo_ansible, tcib_managed=true, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, url=https://www.redhat.com, io.openshift.expose-services=, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, summary=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, build-date=2025-11-19T00:36:58Z, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, release=1761123044, maintainer=OpenStack TripleO Team, config_id=tripleo_step5, vendor=Red Hat, Inc., architecture=x86_64, name=rhosp17/openstack-nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20251118.1, com.redhat.component=openstack-nova-compute-container, container_name=nova_compute, description=Red Hat 
OpenStack Platform 17.1 nova-compute, io.buildah.version=1.41.4, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '182e509007ab5e6e5b2500a552cbd5ba-879500e96bf8dfb93687004bd86f2317'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}) Dec 15 03:43:15 localhost systemd[1]: ac45612fab357b615b4684dbfa5bd000dd29eb1adfbaedb6db0802fc49e89549.service: Deactivated successfully. 
Dec 15 03:43:15 localhost podman[91819]: 2025-12-15 08:43:15.988311522 +0000 UTC m=+0.279582611 container exec_died 165a6fd1f2b629d16da0798ec1c9bac41d302d530a0ffa668b2315a52d6aa04e (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 collectd, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, config_id=tripleo_step3, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, batch=17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.component=openstack-collectd-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, io.buildah.version=1.41.4, architecture=x86_64, distribution-scope=public, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, konflux.additional-tags=17.1.12 17.1_20251118.1, tcib_managed=true, build-date=2025-11-18T22:51:28Z, version=17.1.12, name=rhosp17/openstack-collectd, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, description=Red Hat OpenStack Platform 17.1 collectd, release=1761123044, url=https://www.redhat.com, container_name=collectd, maintainer=OpenStack TripleO Team, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vendor=Red Hat, Inc.) Dec 15 03:43:16 localhost systemd[1]: 165a6fd1f2b629d16da0798ec1c9bac41d302d530a0ffa668b2315a52d6aa04e.service: Deactivated successfully. Dec 15 03:43:16 localhost podman[91840]: 2025-12-15 08:43:16.040968531 +0000 UTC m=+0.315438840 container exec_died d6041e5471624a495d5be42684f6049ccf71802a80d5760239330151d465d146 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, io.openshift.expose-services=, name=rhosp17/openstack-ceilometer-compute, container_name=ceilometer_agent_compute, com.redhat.component=openstack-ceilometer-compute-container, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, build-date=2025-11-19T00:11:48Z, vendor=Red Hat, Inc., release=1761123044, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 
'TRIPLEO_CONFIG_HASH': 'fcee5a4a91f85471fca7b61211375646'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, maintainer=OpenStack TripleO Team, architecture=x86_64, konflux.additional-tags=17.1.12 17.1_20251118.1, version=17.1.12, managed_by=tripleo_ansible, batch=17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.buildah.version=1.41.4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, vcs-type=git, config_id=tripleo_step4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute) Dec 15 03:43:16 localhost systemd[1]: d6041e5471624a495d5be42684f6049ccf71802a80d5760239330151d465d146.service: Deactivated successfully. 
Dec 15 03:43:16 localhost podman[91826]: 2025-12-15 08:43:15.99009344 +0000 UTC m=+0.268875944 container health_status 97115ff7c5a84ae7e17f79e696e94da3c63f9fe0051287fe9193bec282a03f99 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.buildah.version=1.41.4, url=https://www.redhat.com, release=1761123044, managed_by=tripleo_ansible, distribution-scope=public, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, container_name=ceilometer_agent_ipmi, com.redhat.component=openstack-ceilometer-ipmi-container, vcs-type=git, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-ceilometer-ipmi, version=17.1.12, build-date=2025-11-19T00:12:45Z, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'fcee5a4a91f85471fca7b61211375646'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, batch=17.1_20251118.1, konflux.additional-tags=17.1.12 17.1_20251118.1, config_id=tripleo_step4, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi) Dec 15 03:43:16 localhost podman[91821]: 2025-12-15 08:43:16.095477089 +0000 UTC m=+0.380281294 container exec_died 36a88083ed613f6871d2631ae462bca308a45ba583333f7cfe4d0b0c40a18ba5 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp17/openstack-nova-compute, vcs-type=git, description=Red Hat OpenStack Platform 17.1 nova-compute, release=1761123044, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '182e509007ab5e6e5b2500a552cbd5ba-879500e96bf8dfb93687004bd86f2317'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, url=https://www.redhat.com, container_name=nova_compute, io.openshift.expose-services=, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2025-11-19T00:36:58Z, maintainer=OpenStack TripleO Team, distribution-scope=public, architecture=x86_64, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.buildah.version=1.41.4, tcib_managed=true, vendor=Red Hat, Inc., konflux.additional-tags=17.1.12 17.1_20251118.1, batch=17.1_20251118.1, version=17.1.12, com.redhat.component=openstack-nova-compute-container, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, config_id=tripleo_step5, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI) Dec 15 03:43:16 localhost systemd[1]: 
36a88083ed613f6871d2631ae462bca308a45ba583333f7cfe4d0b0c40a18ba5.service: Deactivated successfully. Dec 15 03:43:16 localhost podman[91826]: 2025-12-15 08:43:16.119457001 +0000 UTC m=+0.398239505 container exec_died 97115ff7c5a84ae7e17f79e696e94da3c63f9fe0051287fe9193bec282a03f99 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'fcee5a4a91f85471fca7b61211375646'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, url=https://www.redhat.com, container_name=ceilometer_agent_ipmi, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.expose-services=, tcib_managed=true, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible, version=17.1.12, vendor=Red Hat, Inc., 
name=rhosp17/openstack-ceilometer-ipmi, config_id=tripleo_step4, release=1761123044, com.redhat.component=openstack-ceilometer-ipmi-container, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, batch=17.1_20251118.1, maintainer=OpenStack TripleO Team, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, build-date=2025-11-19T00:12:45Z, architecture=x86_64, distribution-scope=public) Dec 15 03:43:16 localhost systemd[1]: 97115ff7c5a84ae7e17f79e696e94da3c63f9fe0051287fe9193bec282a03f99.service: Deactivated successfully. Dec 15 03:43:18 localhost systemd[1]: Started /usr/bin/podman healthcheck run 4c3f871f4bdf287b1d3ce717956c1cf130617489217410eadf6fffcc440eb3b0. Dec 15 03:43:18 localhost systemd[1]: tmp-crun.2QNkxv.mount: Deactivated successfully. 
Dec 15 03:43:18 localhost podman[91951]: 2025-12-15 08:43:18.750785123 +0000 UTC m=+0.085068866 container health_status 4c3f871f4bdf287b1d3ce717956c1cf130617489217410eadf6fffcc440eb3b0 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, build-date=2025-11-19T00:36:58Z, container_name=nova_migration_target, version=17.1.12, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, tcib_managed=true, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, vcs-type=git, description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.buildah.version=1.41.4, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '879500e96bf8dfb93687004bd86f2317'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', 
'/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, architecture=x86_64, config_id=tripleo_step4, managed_by=tripleo_ansible, com.redhat.component=openstack-nova-compute-container, io.openshift.expose-services=, name=rhosp17/openstack-nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, url=https://www.redhat.com, release=1761123044, distribution-scope=public) Dec 15 03:43:19 localhost podman[91951]: 2025-12-15 08:43:19.117966996 +0000 UTC m=+0.452250779 container exec_died 4c3f871f4bdf287b1d3ce717956c1cf130617489217410eadf6fffcc440eb3b0 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, tcib_managed=true, url=https://www.redhat.com, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, version=17.1.12, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.41.4, container_name=nova_migration_target, managed_by=tripleo_ansible, io.openshift.expose-services=, release=1761123044, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, vcs-type=git, config_id=tripleo_step4, batch=17.1_20251118.1, konflux.additional-tags=17.1.12 17.1_20251118.1, distribution-scope=public, name=rhosp17/openstack-nova-compute, com.redhat.component=openstack-nova-compute-container, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '879500e96bf8dfb93687004bd86f2317'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 
'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-11-19T00:36:58Z, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05) Dec 15 03:43:19 localhost systemd[1]: 4c3f871f4bdf287b1d3ce717956c1cf130617489217410eadf6fffcc440eb3b0.service: Deactivated successfully. Dec 15 03:43:21 localhost systemd[1]: Started /usr/bin/podman healthcheck run 2df8d20c6ba32f38bd0e6519c9c77a4146b632f21c81197d8c319b223aad81f1. Dec 15 03:43:21 localhost systemd[1]: Started /usr/bin/podman healthcheck run 4d35b7dc3f3a4e3c60661e85d3511f50804e87244d74cab67af5c6e8c447d379. 
Dec 15 03:43:21 localhost podman[91976]: 2025-12-15 08:43:21.749939286 +0000 UTC m=+0.079868518 container health_status 2df8d20c6ba32f38bd0e6519c9c77a4146b632f21c81197d8c319b223aad81f1 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, maintainer=OpenStack TripleO Team, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, vendor=Red Hat, Inc., io.buildah.version=1.41.4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, container_name=ovn_controller, config_id=tripleo_step4, architecture=x86_64, release=1761123044, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 ovn-controller, konflux.additional-tags=17.1.12 17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 ovn-controller, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, name=rhosp17/openstack-ovn-controller, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, managed_by=tripleo_ansible, build-date=2025-11-18T23:34:05Z, com.redhat.component=openstack-ovn-controller-container, version=17.1.12, 
batch=17.1_20251118.1, io.openshift.expose-services=, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git) Dec 15 03:43:21 localhost podman[91976]: 2025-12-15 08:43:21.82820842 +0000 UTC m=+0.158137672 container exec_died 2df8d20c6ba32f38bd0e6519c9c77a4146b632f21c81197d8c319b223aad81f1 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20251118.1, build-date=2025-11-18T23:34:05Z, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, description=Red Hat OpenStack Platform 17.1 ovn-controller, release=1761123044, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, distribution-scope=public, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.component=openstack-ovn-controller-container, vcs-type=git, config_id=tripleo_step4, architecture=x86_64, version=17.1.12, io.buildah.version=1.41.4, 
io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, summary=Red Hat OpenStack Platform 17.1 ovn-controller, vendor=Red Hat, Inc., io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, tcib_managed=true, maintainer=OpenStack TripleO Team, container_name=ovn_controller, name=rhosp17/openstack-ovn-controller) Dec 15 03:43:21 localhost systemd[1]: 2df8d20c6ba32f38bd0e6519c9c77a4146b632f21c81197d8c319b223aad81f1.service: Deactivated successfully. Dec 15 03:43:21 localhost podman[91977]: 2025-12-15 08:43:21.846048667 +0000 UTC m=+0.173404010 container health_status 4d35b7dc3f3a4e3c60661e85d3511f50804e87244d74cab67af5c6e8c447d379 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, maintainer=OpenStack TripleO Team, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, architecture=x86_64, version=17.1.12, io.openshift.expose-services=, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, name=rhosp17/openstack-neutron-metadata-agent-ovn, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'a56a6f14b467cd9064e40c03defa5ed7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, batch=17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, release=1761123044, distribution-scope=public, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, url=https://www.redhat.com, build-date=2025-11-19T00:14:25Z, container_name=ovn_metadata_agent, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, vendor=Red Hat, Inc., config_id=tripleo_step4, konflux.additional-tags=17.1.12 17.1_20251118.1, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn) Dec 15 03:43:21 localhost podman[91977]: 2025-12-15 08:43:21.886783797 +0000 UTC m=+0.214139090 container exec_died 4d35b7dc3f3a4e3c60661e85d3511f50804e87244d74cab67af5c6e8c447d379 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, 
vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'a56a6f14b467cd9064e40c03defa5ed7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, tcib_managed=true, version=17.1.12, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, release=1761123044, config_id=tripleo_step4, vcs-type=git, io.buildah.version=1.41.4, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, architecture=x86_64, batch=17.1_20251118.1, container_name=ovn_metadata_agent, description=Red Hat 
OpenStack Platform 17.1 neutron-metadata-agent-ovn, name=rhosp17/openstack-neutron-metadata-agent-ovn, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, build-date=2025-11-19T00:14:25Z, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, url=https://www.redhat.com) Dec 15 03:43:21 localhost systemd[1]: 4d35b7dc3f3a4e3c60661e85d3511f50804e87244d74cab67af5c6e8c447d379.service: Deactivated successfully. Dec 15 03:43:30 localhost systemd[1]: Started /usr/bin/podman healthcheck run 6278d648aec9c9f12e0efaafa0d6ae5653eeb34459516699e562eeb9c8d23b3c. Dec 15 03:43:30 localhost systemd[1]: tmp-crun.zWLkYm.mount: Deactivated successfully. Dec 15 03:43:30 localhost podman[92025]: 2025-12-15 08:43:30.784114186 +0000 UTC m=+0.114936855 container health_status 6278d648aec9c9f12e0efaafa0d6ae5653eeb34459516699e562eeb9c8d23b3c (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, konflux.additional-tags=17.1.12 17.1_20251118.1, io.buildah.version=1.41.4, maintainer=OpenStack TripleO Team, release=1761123044, io.openshift.expose-services=, name=rhosp17/openstack-qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '29d07da103bbd44a9ed3e29999314b03'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vendor=Red Hat, Inc., managed_by=tripleo_ansible, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 qdrouterd, description=Red Hat OpenStack Platform 17.1 qdrouterd, distribution-scope=public, url=https://www.redhat.com, vcs-type=git, container_name=metrics_qdr, config_id=tripleo_step1, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.component=openstack-qdrouterd-container, batch=17.1_20251118.1, version=17.1.12, build-date=2025-11-18T22:49:46Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd) Dec 15 03:43:31 localhost podman[92025]: 2025-12-15 08:43:31.021371563 +0000 UTC m=+0.352194242 container exec_died 6278d648aec9c9f12e0efaafa0d6ae5653eeb34459516699e562eeb9c8d23b3c (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, name=rhosp17/openstack-qdrouterd, managed_by=tripleo_ansible, 
com.redhat.component=openstack-qdrouterd-container, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.buildah.version=1.41.4, version=17.1.12, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.expose-services=, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, container_name=metrics_qdr, vendor=Red Hat, Inc., distribution-scope=public, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 qdrouterd, konflux.additional-tags=17.1.12 17.1_20251118.1, build-date=2025-11-18T22:49:46Z, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_id=tripleo_step1, maintainer=OpenStack TripleO Team, batch=17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-type=git, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, release=1761123044, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '29d07da103bbd44a9ed3e29999314b03'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}) Dec 15 03:43:31 localhost systemd[1]: 6278d648aec9c9f12e0efaafa0d6ae5653eeb34459516699e562eeb9c8d23b3c.service: Deactivated successfully. Dec 15 03:43:46 localhost systemd[1]: Started /usr/bin/podman healthcheck run 165a6fd1f2b629d16da0798ec1c9bac41d302d530a0ffa668b2315a52d6aa04e. Dec 15 03:43:46 localhost systemd[1]: Started /usr/bin/podman healthcheck run 2649c3aec9af2f04509e1d248b1050b202fccc3ed2b86421c89a2a65376db057. Dec 15 03:43:46 localhost systemd[1]: Started /usr/bin/podman healthcheck run 36a88083ed613f6871d2631ae462bca308a45ba583333f7cfe4d0b0c40a18ba5. Dec 15 03:43:46 localhost systemd[1]: Started /usr/bin/podman healthcheck run 97115ff7c5a84ae7e17f79e696e94da3c63f9fe0051287fe9193bec282a03f99. Dec 15 03:43:46 localhost systemd[1]: Started /usr/bin/podman healthcheck run ac45612fab357b615b4684dbfa5bd000dd29eb1adfbaedb6db0802fc49e89549. Dec 15 03:43:46 localhost systemd[1]: Started /usr/bin/podman healthcheck run d6041e5471624a495d5be42684f6049ccf71802a80d5760239330151d465d146. 
Dec 15 03:43:46 localhost podman[92132]: 2025-12-15 08:43:46.767965401 +0000 UTC m=+0.087081581 container health_status 36a88083ed613f6871d2631ae462bca308a45ba583333f7cfe4d0b0c40a18ba5 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, container_name=nova_compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, summary=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2025-11-19T00:36:58Z, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, description=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20251118.1, release=1761123044, konflux.additional-tags=17.1.12 17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.expose-services=, io.buildah.version=1.41.4, url=https://www.redhat.com, name=rhosp17/openstack-nova-compute, architecture=x86_64, managed_by=tripleo_ansible, com.redhat.component=openstack-nova-compute-container, tcib_managed=true, version=17.1.12, config_id=tripleo_step5, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '182e509007ab5e6e5b2500a552cbd5ba-879500e96bf8dfb93687004bd86f2317'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 
'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}) Dec 15 03:43:46 localhost systemd[1]: tmp-crun.jt26Ki.mount: Deactivated successfully. 
Dec 15 03:43:46 localhost podman[92130]: 2025-12-15 08:43:46.81950715 +0000 UTC m=+0.142361939 container health_status 165a6fd1f2b629d16da0798ec1c9bac41d302d530a0ffa668b2315a52d6aa04e (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, maintainer=OpenStack TripleO Team, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, url=https://www.redhat.com, release=1761123044, distribution-scope=public, build-date=2025-11-18T22:51:28Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, com.redhat.component=openstack-collectd-container, io.k8s.display-name=Red Hat OpenStack Platform 
17.1 collectd, io.openshift.expose-services=, version=17.1.12, name=rhosp17/openstack-collectd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, description=Red Hat OpenStack Platform 17.1 collectd, vcs-type=git, tcib_managed=true, config_id=tripleo_step3, managed_by=tripleo_ansible, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 collectd, batch=17.1_20251118.1, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, container_name=collectd, io.buildah.version=1.41.4, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05) Dec 15 03:43:46 localhost podman[92132]: 2025-12-15 08:43:46.820306371 +0000 UTC m=+0.139422471 container exec_died 36a88083ed613f6871d2631ae462bca308a45ba583333f7cfe4d0b0c40a18ba5 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '182e509007ab5e6e5b2500a552cbd5ba-879500e96bf8dfb93687004bd86f2317'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, io.buildah.version=1.41.4, container_name=nova_compute, release=1761123044, version=17.1.12, build-date=2025-11-19T00:36:58Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp17/openstack-nova-compute, com.redhat.component=openstack-nova-compute-container, url=https://www.redhat.com, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.openshift.expose-services=, managed_by=tripleo_ansible, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, description=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO 
Team, tcib_managed=true, batch=17.1_20251118.1, config_id=tripleo_step5, vcs-type=git) Dec 15 03:43:46 localhost systemd[1]: tmp-crun.lxAO6H.mount: Deactivated successfully. Dec 15 03:43:46 localhost systemd[1]: 36a88083ed613f6871d2631ae462bca308a45ba583333f7cfe4d0b0c40a18ba5.service: Deactivated successfully. Dec 15 03:43:46 localhost podman[92131]: 2025-12-15 08:43:46.866035474 +0000 UTC m=+0.188912644 container health_status 2649c3aec9af2f04509e1d248b1050b202fccc3ed2b86421c89a2a65376db057 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, url=https://www.redhat.com, container_name=iscsid, batch=17.1_20251118.1, build-date=2025-11-18T23:44:13Z, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, description=Red Hat OpenStack Platform 17.1 iscsid, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '182e509007ab5e6e5b2500a552cbd5ba'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', 
'/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, com.redhat.component=openstack-iscsid-container, io.openshift.expose-services=, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, name=rhosp17/openstack-iscsid, io.buildah.version=1.41.4, config_id=tripleo_step3, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-type=git, maintainer=OpenStack TripleO Team, version=17.1.12, summary=Red Hat OpenStack Platform 17.1 iscsid, vendor=Red Hat, Inc., release=1761123044, architecture=x86_64, distribution-scope=public) Dec 15 03:43:46 localhost podman[92131]: 2025-12-15 08:43:46.878330154 +0000 UTC m=+0.201207324 container exec_died 2649c3aec9af2f04509e1d248b1050b202fccc3ed2b86421c89a2a65376db057 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.12, container_name=iscsid, config_id=tripleo_step3, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.component=openstack-iscsid-container, vcs-type=git, architecture=x86_64, managed_by=tripleo_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '182e509007ab5e6e5b2500a552cbd5ba'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 
'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, distribution-scope=public, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, name=rhosp17/openstack-iscsid, url=https://www.redhat.com, konflux.additional-tags=17.1.12 17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 iscsid, batch=17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, build-date=2025-11-18T23:44:13Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, vendor=Red Hat, Inc., io.buildah.version=1.41.4, tcib_managed=true, release=1761123044) Dec 15 03:43:46 localhost systemd[1]: 2649c3aec9af2f04509e1d248b1050b202fccc3ed2b86421c89a2a65376db057.service: Deactivated successfully. 
Dec 15 03:43:46 localhost podman[92134]: 2025-12-15 08:43:46.918286403 +0000 UTC m=+0.229715637 container health_status ac45612fab357b615b4684dbfa5bd000dd29eb1adfbaedb6db0802fc49e89549 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, com.redhat.component=openstack-cron-container, config_id=tripleo_step4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.expose-services=, build-date=2025-11-18T22:49:32Z, summary=Red Hat OpenStack Platform 17.1 cron, vcs-type=git, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, release=1761123044, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 cron, container_name=logrotate_crond, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, version=17.1.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, konflux.additional-tags=17.1.12 17.1_20251118.1, url=https://www.redhat.com, vendor=Red Hat, Inc., batch=17.1_20251118.1, name=rhosp17/openstack-cron, io.buildah.version=1.41.4, managed_by=tripleo_ansible, distribution-scope=public, architecture=x86_64) Dec 15 03:43:46 localhost podman[92134]: 2025-12-15 08:43:46.925363762 +0000 UTC m=+0.236792986 container exec_died ac45612fab357b615b4684dbfa5bd000dd29eb1adfbaedb6db0802fc49e89549 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.component=openstack-cron-container, distribution-scope=public, vendor=Red Hat, Inc., version=17.1.12, container_name=logrotate_crond, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, konflux.additional-tags=17.1.12 17.1_20251118.1, name=rhosp17/openstack-cron, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, batch=17.1_20251118.1, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-18T22:49:32Z, config_id=tripleo_step4, vcs-type=git, description=Red Hat OpenStack Platform 17.1 cron, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, io.buildah.version=1.41.4, architecture=x86_64, url=https://www.redhat.com, tcib_managed=true, release=1761123044, summary=Red Hat OpenStack Platform 17.1 cron) Dec 15 03:43:46 localhost systemd[1]: ac45612fab357b615b4684dbfa5bd000dd29eb1adfbaedb6db0802fc49e89549.service: Deactivated successfully. 
Dec 15 03:43:46 localhost podman[92130]: 2025-12-15 08:43:46.953628548 +0000 UTC m=+0.276483387 container exec_died 165a6fd1f2b629d16da0798ec1c9bac41d302d530a0ffa668b2315a52d6aa04e (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, com.redhat.component=openstack-collectd-container, architecture=x86_64, batch=17.1_20251118.1, managed_by=tripleo_ansible, url=https://www.redhat.com, vcs-type=git, version=17.1.12, container_name=collectd, vendor=Red Hat, Inc., build-date=2025-11-18T22:51:28Z, distribution-scope=public, name=rhosp17/openstack-collectd, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.buildah.version=1.41.4, summary=Red Hat OpenStack Platform 17.1 collectd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, description=Red Hat OpenStack Platform 17.1 collectd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, tcib_managed=true, io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1, release=1761123044, config_id=tripleo_step3) Dec 15 03:43:46 localhost systemd[1]: 165a6fd1f2b629d16da0798ec1c9bac41d302d530a0ffa668b2315a52d6aa04e.service: Deactivated successfully. Dec 15 03:43:46 localhost podman[92145]: 2025-12-15 08:43:46.976204672 +0000 UTC m=+0.285002695 container health_status d6041e5471624a495d5be42684f6049ccf71802a80d5760239330151d465d146 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, container_name=ceilometer_agent_compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.buildah.version=1.41.4, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 ceilometer-compute, url=https://www.redhat.com, version=17.1.12, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, name=rhosp17/openstack-ceilometer-compute, config_id=tripleo_step4, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-11-19T00:11:48Z, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, batch=17.1_20251118.1, io.openshift.expose-services=, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'fcee5a4a91f85471fca7b61211375646'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, com.redhat.component=openstack-ceilometer-compute-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, architecture=x86_64, konflux.additional-tags=17.1.12 17.1_20251118.1, release=1761123044, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, tcib_managed=true, distribution-scope=public, managed_by=tripleo_ansible, vcs-type=git) Dec 15 03:43:47 localhost podman[92145]: 2025-12-15 08:43:47.010574462 +0000 UTC m=+0.319372495 container exec_died d6041e5471624a495d5be42684f6049ccf71802a80d5760239330151d465d146 
(image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, config_id=tripleo_step4, release=1761123044, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, tcib_managed=true, vcs-type=git, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, version=17.1.12, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'fcee5a4a91f85471fca7b61211375646'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, com.redhat.component=openstack-ceilometer-compute-container, container_name=ceilometer_agent_compute, io.openshift.tags=rhosp osp openstack 
osp-17.1 openstack-ceilometer-compute, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, name=rhosp17/openstack-ceilometer-compute, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, batch=17.1_20251118.1, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, build-date=2025-11-19T00:11:48Z, io.buildah.version=1.41.4, vendor=Red Hat, Inc.) Dec 15 03:43:47 localhost systemd[1]: d6041e5471624a495d5be42684f6049ccf71802a80d5760239330151d465d146.service: Deactivated successfully. Dec 15 03:43:47 localhost podman[92133]: 2025-12-15 08:43:47.079781303 +0000 UTC m=+0.397068833 container health_status 97115ff7c5a84ae7e17f79e696e94da3c63f9fe0051287fe9193bec282a03f99 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, config_id=tripleo_step4, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-19T00:12:45Z, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'fcee5a4a91f85471fca7b61211375646'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', 
'/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, name=rhosp17/openstack-ceilometer-ipmi, architecture=x86_64, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team, container_name=ceilometer_agent_ipmi, version=17.1.12, batch=17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, managed_by=tripleo_ansible, release=1761123044, com.redhat.component=openstack-ceilometer-ipmi-container, vcs-type=git, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream) Dec 15 03:43:47 localhost podman[92133]: 2025-12-15 08:43:47.106327083 +0000 UTC m=+0.423614583 container exec_died 97115ff7c5a84ae7e17f79e696e94da3c63f9fe0051287fe9193bec282a03f99 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, maintainer=OpenStack TripleO Team, container_name=ceilometer_agent_ipmi, io.buildah.version=1.41.4, io.openshift.expose-services=, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.component=openstack-ceilometer-ipmi-container, distribution-scope=public, 
vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'fcee5a4a91f85471fca7b61211375646'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, config_id=tripleo_step4, url=https://www.redhat.com, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, build-date=2025-11-19T00:12:45Z, batch=17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, name=rhosp17/openstack-ceilometer-ipmi, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vendor=Red Hat, Inc., release=1761123044, konflux.additional-tags=17.1.12 17.1_20251118.1, version=17.1.12, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, managed_by=tripleo_ansible, tcib_managed=true) Dec 15 03:43:47 localhost systemd[1]: 
97115ff7c5a84ae7e17f79e696e94da3c63f9fe0051287fe9193bec282a03f99.service: Deactivated successfully. Dec 15 03:43:49 localhost systemd[1]: Started /usr/bin/podman healthcheck run 4c3f871f4bdf287b1d3ce717956c1cf130617489217410eadf6fffcc440eb3b0. Dec 15 03:43:49 localhost podman[92270]: 2025-12-15 08:43:49.762328225 +0000 UTC m=+0.093257616 container health_status 4c3f871f4bdf287b1d3ce717956c1cf130617489217410eadf6fffcc440eb3b0 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, vendor=Red Hat, Inc., tcib_managed=true, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.component=openstack-nova-compute-container, config_id=tripleo_step4, build-date=2025-11-19T00:36:58Z, container_name=nova_migration_target, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, release=1761123044, architecture=x86_64, name=rhosp17/openstack-nova-compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.41.4, distribution-scope=public, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '879500e96bf8dfb93687004bd86f2317'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, managed_by=tripleo_ansible, version=17.1.12, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI) Dec 15 03:43:50 localhost podman[92270]: 2025-12-15 08:43:50.129156028 +0000 UTC m=+0.460085399 container exec_died 4c3f871f4bdf287b1d3ce717956c1cf130617489217410eadf6fffcc440eb3b0 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, tcib_managed=true, io.openshift.expose-services=, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, name=rhosp17/openstack-nova-compute, architecture=x86_64, url=https://www.redhat.com, version=17.1.12, com.redhat.component=openstack-nova-compute-container, description=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, container_name=nova_migration_target, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, vcs-type=git, vendor=Red Hat, Inc., konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, summary=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2025-11-19T00:36:58Z, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1761123044, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, distribution-scope=public, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '879500e96bf8dfb93687004bd86f2317'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, config_id=tripleo_step4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.buildah.version=1.41.4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream) Dec 15 03:43:50 localhost systemd[1]: 4c3f871f4bdf287b1d3ce717956c1cf130617489217410eadf6fffcc440eb3b0.service: Deactivated successfully. Dec 15 03:43:52 localhost systemd[1]: Started /usr/bin/podman healthcheck run 2df8d20c6ba32f38bd0e6519c9c77a4146b632f21c81197d8c319b223aad81f1. 
Dec 15 03:43:52 localhost systemd[1]: Started /usr/bin/podman healthcheck run 4d35b7dc3f3a4e3c60661e85d3511f50804e87244d74cab67af5c6e8c447d379. Dec 15 03:43:52 localhost podman[92294]: 2025-12-15 08:43:52.750346111 +0000 UTC m=+0.077854734 container health_status 4d35b7dc3f3a4e3c60661e85d3511f50804e87244d74cab67af5c6e8c447d379 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, batch=17.1_20251118.1, vendor=Red Hat, Inc., distribution-scope=public, version=17.1.12, io.buildah.version=1.41.4, konflux.additional-tags=17.1.12 17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, architecture=x86_64, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'a56a6f14b467cd9064e40c03defa5ed7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', 
'/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, tcib_managed=true, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, url=https://www.redhat.com, name=rhosp17/openstack-neutron-metadata-agent-ovn, io.openshift.expose-services=, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-11-19T00:14:25Z, container_name=ovn_metadata_agent, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, managed_by=tripleo_ansible, config_id=tripleo_step4, release=1761123044) Dec 15 03:43:52 localhost systemd[1]: tmp-crun.3RWDdw.mount: Deactivated successfully. 
Dec 15 03:43:52 localhost podman[92293]: 2025-12-15 08:43:52.810901232 +0000 UTC m=+0.142085953 container health_status 2df8d20c6ba32f38bd0e6519c9c77a4146b632f21c81197d8c319b223aad81f1 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, summary=Red Hat OpenStack Platform 17.1 ovn-controller, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1, io.openshift.expose-services=, vcs-type=git, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, config_id=tripleo_step4, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.4, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, release=1761123044, version=17.1.12, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, name=rhosp17/openstack-ovn-controller, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 ovn-controller, build-date=2025-11-18T23:34:05Z, 
io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.component=openstack-ovn-controller-container, container_name=ovn_controller) Dec 15 03:43:52 localhost podman[92293]: 2025-12-15 08:43:52.857745324 +0000 UTC m=+0.188930055 container exec_died 2df8d20c6ba32f38bd0e6519c9c77a4146b632f21c81197d8c319b223aad81f1 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, distribution-scope=public, com.redhat.component=openstack-ovn-controller-container, release=1761123044, version=17.1.12, konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, description=Red Hat OpenStack Platform 17.1 ovn-controller, build-date=2025-11-18T23:34:05Z, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, container_name=ovn_controller, vendor=Red Hat, Inc., architecture=x86_64, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, batch=17.1_20251118.1, config_id=tripleo_step4, name=rhosp17/openstack-ovn-controller, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, url=https://www.redhat.com, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', 
'/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.buildah.version=1.41.4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream) Dec 15 03:43:52 localhost systemd[1]: 2df8d20c6ba32f38bd0e6519c9c77a4146b632f21c81197d8c319b223aad81f1.service: Deactivated successfully. Dec 15 03:43:52 localhost podman[92294]: 2025-12-15 08:43:52.912945421 +0000 UTC m=+0.240454044 container exec_died 4d35b7dc3f3a4e3c60661e85d3511f50804e87244d74cab67af5c6e8c447d379 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, container_name=ovn_metadata_agent, batch=17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, vcs-type=git, maintainer=OpenStack TripleO Team, config_id=tripleo_step4, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.expose-services=, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, managed_by=tripleo_ansible, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'a56a6f14b467cd9064e40c03defa5ed7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, vendor=Red Hat, Inc., name=rhosp17/openstack-neutron-metadata-agent-ovn, url=https://www.redhat.com, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, tcib_managed=true, version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-19T00:14:25Z, io.buildah.version=1.41.4, architecture=x86_64, konflux.additional-tags=17.1.12 17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, release=1761123044) Dec 15 03:43:52 localhost systemd[1]: 4d35b7dc3f3a4e3c60661e85d3511f50804e87244d74cab67af5c6e8c447d379.service: Deactivated successfully. Dec 15 03:44:01 localhost systemd[1]: Started /usr/bin/podman healthcheck run 6278d648aec9c9f12e0efaafa0d6ae5653eeb34459516699e562eeb9c8d23b3c. Dec 15 03:44:01 localhost systemd[1]: Starting Check and recover tripleo_nova_virtqemud... 
Dec 15 03:44:01 localhost recover_tripleo_nova_virtqemud[92344]: 61849 Dec 15 03:44:01 localhost systemd[1]: tripleo_nova_virtqemud_recover.service: Deactivated successfully. Dec 15 03:44:01 localhost systemd[1]: Finished Check and recover tripleo_nova_virtqemud. Dec 15 03:44:01 localhost systemd[1]: tmp-crun.XNPYSD.mount: Deactivated successfully. Dec 15 03:44:01 localhost podman[92339]: 2025-12-15 08:44:01.771178034 +0000 UTC m=+0.097656674 container health_status 6278d648aec9c9f12e0efaafa0d6ae5653eeb34459516699e562eeb9c8d23b3c (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, tcib_managed=true, config_id=tripleo_step1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, io.buildah.version=1.41.4, com.redhat.component=openstack-qdrouterd-container, batch=17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.expose-services=, version=17.1.12, description=Red Hat OpenStack Platform 17.1 qdrouterd, name=rhosp17/openstack-qdrouterd, distribution-scope=public, vcs-type=git, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, container_name=metrics_qdr, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '29d07da103bbd44a9ed3e29999314b03'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, architecture=x86_64, vendor=Red Hat, Inc., org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, summary=Red Hat OpenStack Platform 17.1 qdrouterd, build-date=2025-11-18T22:49:46Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, release=1761123044, konflux.additional-tags=17.1.12 17.1_20251118.1, url=https://www.redhat.com, maintainer=OpenStack TripleO Team) Dec 15 03:44:01 localhost podman[92339]: 2025-12-15 08:44:01.994339494 +0000 UTC m=+0.320818084 container exec_died 6278d648aec9c9f12e0efaafa0d6ae5653eeb34459516699e562eeb9c8d23b3c (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, summary=Red Hat OpenStack Platform 17.1 qdrouterd, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '29d07da103bbd44a9ed3e29999314b03'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, konflux.additional-tags=17.1.12 17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, batch=17.1_20251118.1, config_id=tripleo_step1, managed_by=tripleo_ansible, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-qdrouterd, io.openshift.expose-services=, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, url=https://www.redhat.com, release=1761123044, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=metrics_qdr, version=17.1.12, distribution-scope=public, vcs-type=git, com.redhat.component=openstack-qdrouterd-container, description=Red Hat OpenStack Platform 17.1 qdrouterd, build-date=2025-11-18T22:49:46Z, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a) Dec 15 03:44:02 localhost systemd[1]: 6278d648aec9c9f12e0efaafa0d6ae5653eeb34459516699e562eeb9c8d23b3c.service: Deactivated successfully. Dec 15 03:44:16 localhost sshd[92372]: main: sshd: ssh-rsa algorithm is disabled Dec 15 03:44:17 localhost systemd[1]: Started /usr/bin/podman healthcheck run 165a6fd1f2b629d16da0798ec1c9bac41d302d530a0ffa668b2315a52d6aa04e. 
Dec 15 03:44:17 localhost systemd[1]: Started /usr/bin/podman healthcheck run 2649c3aec9af2f04509e1d248b1050b202fccc3ed2b86421c89a2a65376db057. Dec 15 03:44:17 localhost systemd[1]: Started /usr/bin/podman healthcheck run 36a88083ed613f6871d2631ae462bca308a45ba583333f7cfe4d0b0c40a18ba5. Dec 15 03:44:17 localhost systemd[1]: Started /usr/bin/podman healthcheck run 97115ff7c5a84ae7e17f79e696e94da3c63f9fe0051287fe9193bec282a03f99. Dec 15 03:44:17 localhost systemd[1]: Started /usr/bin/podman healthcheck run ac45612fab357b615b4684dbfa5bd000dd29eb1adfbaedb6db0802fc49e89549. Dec 15 03:44:17 localhost systemd[1]: Started /usr/bin/podman healthcheck run d6041e5471624a495d5be42684f6049ccf71802a80d5760239330151d465d146. Dec 15 03:44:17 localhost systemd[1]: tmp-crun.zNGMX9.mount: Deactivated successfully. Dec 15 03:44:17 localhost podman[92375]: 2025-12-15 08:44:17.771416768 +0000 UTC m=+0.094489279 container health_status 2649c3aec9af2f04509e1d248b1050b202fccc3ed2b86421c89a2a65376db057 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '182e509007ab5e6e5b2500a552cbd5ba'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', 
'/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vendor=Red Hat, Inc., managed_by=tripleo_ansible, build-date=2025-11-18T23:44:13Z, config_id=tripleo_step3, version=17.1.12, distribution-scope=public, url=https://www.redhat.com, container_name=iscsid, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.expose-services=, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, tcib_managed=true, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, io.buildah.version=1.41.4, com.redhat.component=openstack-iscsid-container, konflux.additional-tags=17.1.12 17.1_20251118.1, architecture=x86_64, batch=17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, name=rhosp17/openstack-iscsid, vcs-type=git, release=1761123044) Dec 15 03:44:17 localhost systemd[1]: tmp-crun.i1LaBP.mount: Deactivated successfully. 
Dec 15 03:44:17 localhost podman[92376]: 2025-12-15 08:44:17.821166348 +0000 UTC m=+0.139884963 container health_status 36a88083ed613f6871d2631ae462bca308a45ba583333f7cfe4d0b0c40a18ba5 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, release=1761123044, vcs-type=git, managed_by=tripleo_ansible, tcib_managed=true, com.redhat.component=openstack-nova-compute-container, container_name=nova_compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.41.4, io.openshift.expose-services=, architecture=x86_64, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, config_id=tripleo_step5, build-date=2025-11-19T00:36:58Z, konflux.additional-tags=17.1.12 17.1_20251118.1, name=rhosp17/openstack-nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, url=https://www.redhat.com, distribution-scope=public, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '182e509007ab5e6e5b2500a552cbd5ba-879500e96bf8dfb93687004bd86f2317'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': 
['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, version=17.1.12) Dec 15 03:44:17 localhost podman[92377]: 2025-12-15 08:44:17.821039925 +0000 UTC m=+0.136648416 container health_status 97115ff7c5a84ae7e17f79e696e94da3c63f9fe0051287fe9193bec282a03f99 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, distribution-scope=public, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, konflux.additional-tags=17.1.12 17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-ceilometer-ipmi, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'fcee5a4a91f85471fca7b61211375646'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, config_id=tripleo_step4, build-date=2025-11-19T00:12:45Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-ceilometer-ipmi-container, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.buildah.version=1.41.4, release=1761123044, version=17.1.12, vcs-type=git, tcib_managed=true, url=https://www.redhat.com, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, container_name=ceilometer_agent_ipmi, managed_by=tripleo_ansible, io.openshift.expose-services=, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, batch=17.1_20251118.1) Dec 15 03:44:17 localhost podman[92376]: 
2025-12-15 08:44:17.847334159 +0000 UTC m=+0.166052814 container exec_died 36a88083ed613f6871d2631ae462bca308a45ba583333f7cfe4d0b0c40a18ba5 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, com.redhat.component=openstack-nova-compute-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp17/openstack-nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, config_id=tripleo_step5, summary=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.12, batch=17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, release=1761123044, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, url=https://www.redhat.com, build-date=2025-11-19T00:36:58Z, vendor=Red Hat, Inc., tcib_managed=true, io.buildah.version=1.41.4, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '182e509007ab5e6e5b2500a552cbd5ba-879500e96bf8dfb93687004bd86f2317'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', 
'/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, architecture=x86_64, container_name=nova_compute) Dec 15 03:44:17 localhost podman[92375]: 2025-12-15 08:44:17.855953689 +0000 UTC m=+0.179026250 container exec_died 2649c3aec9af2f04509e1d248b1050b202fccc3ed2b86421c89a2a65376db057 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '182e509007ab5e6e5b2500a552cbd5ba'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, architecture=x86_64, vendor=Red Hat, Inc., io.buildah.version=1.41.4, config_id=tripleo_step3, io.openshift.expose-services=, container_name=iscsid, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.12 17.1_20251118.1, batch=17.1_20251118.1, name=rhosp17/openstack-iscsid, url=https://www.redhat.com, com.redhat.component=openstack-iscsid-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, release=1761123044, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, version=17.1.12, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, build-date=2025-11-18T23:44:13Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, tcib_managed=true, vcs-type=git, managed_by=tripleo_ansible, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, summary=Red Hat OpenStack Platform 17.1 iscsid, description=Red Hat OpenStack Platform 17.1 iscsid) Dec 15 03:44:17 localhost systemd[1]: 36a88083ed613f6871d2631ae462bca308a45ba583333f7cfe4d0b0c40a18ba5.service: Deactivated successfully. 
Dec 15 03:44:17 localhost systemd[1]: 2649c3aec9af2f04509e1d248b1050b202fccc3ed2b86421c89a2a65376db057.service: Deactivated successfully. Dec 15 03:44:17 localhost podman[92377]: 2025-12-15 08:44:17.902382151 +0000 UTC m=+0.217990642 container exec_died 97115ff7c5a84ae7e17f79e696e94da3c63f9fe0051287fe9193bec282a03f99 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, maintainer=OpenStack TripleO Team, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, tcib_managed=true, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, build-date=2025-11-19T00:12:45Z, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'fcee5a4a91f85471fca7b61211375646'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', 
'/var/log/containers/ceilometer:/var/log/ceilometer:z']}, url=https://www.redhat.com, version=17.1.12, batch=17.1_20251118.1, vendor=Red Hat, Inc., io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, architecture=x86_64, release=1761123044, container_name=ceilometer_agent_ipmi, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.buildah.version=1.41.4, distribution-scope=public, name=rhosp17/openstack-ceilometer-ipmi, com.redhat.component=openstack-ceilometer-ipmi-container, vcs-type=git, config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi) Dec 15 03:44:17 localhost systemd[1]: 97115ff7c5a84ae7e17f79e696e94da3c63f9fe0051287fe9193bec282a03f99.service: Deactivated successfully. Dec 15 03:44:17 localhost podman[92378]: 2025-12-15 08:44:17.977211163 +0000 UTC m=+0.283532186 container health_status ac45612fab357b615b4684dbfa5bd000dd29eb1adfbaedb6db0802fc49e89549 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', 
'/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, build-date=2025-11-18T22:49:32Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, container_name=logrotate_crond, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, architecture=x86_64, url=https://www.redhat.com, distribution-scope=public, konflux.additional-tags=17.1.12 17.1_20251118.1, vendor=Red Hat, Inc., io.buildah.version=1.41.4, version=17.1.12, name=rhosp17/openstack-cron, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 cron, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, summary=Red Hat OpenStack Platform 17.1 cron, com.redhat.component=openstack-cron-container, release=1761123044, batch=17.1_20251118.1, vcs-type=git, maintainer=OpenStack TripleO Team) Dec 15 03:44:17 localhost podman[92374]: 2025-12-15 08:44:17.9711228 +0000 UTC m=+0.293994106 container health_status 165a6fd1f2b629d16da0798ec1c9bac41d302d530a0ffa668b2315a52d6aa04e (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, vcs-type=git, tcib_managed=true, name=rhosp17/openstack-collectd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, release=1761123044, com.redhat.component=openstack-collectd-container, managed_by=tripleo_ansible, 
url=https://www.redhat.com, distribution-scope=public, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, version=17.1.12, io.buildah.version=1.41.4, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, container_name=collectd, config_id=tripleo_step3, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 collectd, build-date=2025-11-18T22:51:28Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 collectd, io.openshift.expose-services=, batch=17.1_20251118.1, maintainer=OpenStack TripleO Team, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, architecture=x86_64, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd) Dec 15 03:44:18 localhost podman[92390]: 2025-12-15 08:44:18.033750815 +0000 UTC m=+0.338135756 container health_status d6041e5471624a495d5be42684f6049ccf71802a80d5760239330151d465d146 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, maintainer=OpenStack TripleO Team, batch=17.1_20251118.1, distribution-scope=public, konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible, version=17.1.12, name=rhosp17/openstack-ceilometer-compute, vendor=Red Hat, Inc., io.buildah.version=1.41.4, io.openshift.expose-services=, com.redhat.component=openstack-ceilometer-compute-container, build-date=2025-11-19T00:11:48Z, config_id=tripleo_step4, container_name=ceilometer_agent_compute, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, release=1761123044, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'fcee5a4a91f85471fca7b61211375646'}, 'healthcheck': {'test': 
'/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute) Dec 15 03:44:18 localhost podman[92374]: 2025-12-15 08:44:18.063424809 +0000 UTC m=+0.386296085 container exec_died 165a6fd1f2b629d16da0798ec1c9bac41d302d530a0ffa668b2315a52d6aa04e (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, konflux.additional-tags=17.1.12 17.1_20251118.1, tcib_managed=true, io.openshift.expose-services=, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, config_id=tripleo_step3, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, name=rhosp17/openstack-collectd, architecture=x86_64, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., com.redhat.component=openstack-collectd-container, release=1761123044, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, summary=Red Hat 
OpenStack Platform 17.1 collectd, version=17.1.12, io.buildah.version=1.41.4, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, description=Red Hat OpenStack Platform 17.1 collectd, batch=17.1_20251118.1, url=https://www.redhat.com, vcs-type=git, distribution-scope=public, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, container_name=collectd, build-date=2025-11-18T22:51:28Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05) Dec 15 03:44:18 localhost systemd[1]: 165a6fd1f2b629d16da0798ec1c9bac41d302d530a0ffa668b2315a52d6aa04e.service: Deactivated successfully. 
Dec 15 03:44:18 localhost podman[92390]: 2025-12-15 08:44:18.089409184 +0000 UTC m=+0.393794095 container exec_died d6041e5471624a495d5be42684f6049ccf71802a80d5760239330151d465d146 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_id=tripleo_step4, vendor=Red Hat, Inc., batch=17.1_20251118.1, tcib_managed=true, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-ceilometer-compute-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, name=rhosp17/openstack-ceilometer-compute, container_name=ceilometer_agent_compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, architecture=x86_64, build-date=2025-11-19T00:11:48Z, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-type=git, io.buildah.version=1.41.4, version=17.1.12, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, release=1761123044, distribution-scope=public, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'fcee5a4a91f85471fca7b61211375646'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', 
'/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}) Dec 15 03:44:18 localhost systemd[1]: d6041e5471624a495d5be42684f6049ccf71802a80d5760239330151d465d146.service: Deactivated successfully. Dec 15 03:44:18 localhost podman[92378]: 2025-12-15 08:44:18.115212294 +0000 UTC m=+0.421533317 container exec_died ac45612fab357b615b4684dbfa5bd000dd29eb1adfbaedb6db0802fc49e89549 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, vcs-type=git, config_id=tripleo_step4, summary=Red Hat OpenStack Platform 17.1 cron, architecture=x86_64, name=rhosp17/openstack-cron, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-11-18T22:49:32Z, io.openshift.expose-services=, com.redhat.component=openstack-cron-container, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 cron, distribution-scope=public, io.buildah.version=1.41.4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vendor=Red Hat, Inc., batch=17.1_20251118.1, url=https://www.redhat.com, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, container_name=logrotate_crond, maintainer=OpenStack TripleO Team, release=1761123044, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, konflux.additional-tags=17.1.12 17.1_20251118.1) Dec 15 03:44:18 localhost systemd[1]: ac45612fab357b615b4684dbfa5bd000dd29eb1adfbaedb6db0802fc49e89549.service: Deactivated successfully. Dec 15 03:44:20 localhost systemd[1]: Started /usr/bin/podman healthcheck run 4c3f871f4bdf287b1d3ce717956c1cf130617489217410eadf6fffcc440eb3b0. 
Dec 15 03:44:20 localhost podman[92513]: 2025-12-15 08:44:20.751917511 +0000 UTC m=+0.083366461 container health_status 4c3f871f4bdf287b1d3ce717956c1cf130617489217410eadf6fffcc440eb3b0 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '879500e96bf8dfb93687004bd86f2317'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, name=rhosp17/openstack-nova-compute, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, version=17.1.12, io.buildah.version=1.41.4, managed_by=tripleo_ansible, io.openshift.expose-services=, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, release=1761123044, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, 
konflux.additional-tags=17.1.12 17.1_20251118.1, container_name=nova_migration_target, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.component=openstack-nova-compute-container, maintainer=OpenStack TripleO Team, tcib_managed=true, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, build-date=2025-11-19T00:36:58Z, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20251118.1, config_id=tripleo_step4, distribution-scope=public) Dec 15 03:44:21 localhost podman[92513]: 2025-12-15 08:44:21.130329804 +0000 UTC m=+0.461778714 container exec_died 4c3f871f4bdf287b1d3ce717956c1cf130617489217410eadf6fffcc440eb3b0 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '879500e96bf8dfb93687004bd86f2317'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, distribution-scope=public, konflux.additional-tags=17.1.12 17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, com.redhat.component=openstack-nova-compute-container, managed_by=tripleo_ansible, config_id=tripleo_step4, architecture=x86_64, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, build-date=2025-11-19T00:36:58Z, version=17.1.12, release=1761123044, tcib_managed=true, url=https://www.redhat.com, vcs-type=git, vendor=Red Hat, Inc., io.buildah.version=1.41.4, container_name=nova_migration_target, batch=17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, name=rhosp17/openstack-nova-compute) Dec 15 03:44:21 localhost systemd[1]: 4c3f871f4bdf287b1d3ce717956c1cf130617489217410eadf6fffcc440eb3b0.service: Deactivated successfully. Dec 15 03:44:23 localhost systemd[1]: Started /usr/bin/podman healthcheck run 2df8d20c6ba32f38bd0e6519c9c77a4146b632f21c81197d8c319b223aad81f1. Dec 15 03:44:23 localhost systemd[1]: Started /usr/bin/podman healthcheck run 4d35b7dc3f3a4e3c60661e85d3511f50804e87244d74cab67af5c6e8c447d379. Dec 15 03:44:23 localhost systemd[1]: tmp-crun.yWOGyj.mount: Deactivated successfully. Dec 15 03:44:23 localhost systemd[1]: tmp-crun.2tVvHB.mount: Deactivated successfully. 
Dec 15 03:44:23 localhost podman[92536]: 2025-12-15 08:44:23.805054445 +0000 UTC m=+0.136438950 container health_status 2df8d20c6ba32f38bd0e6519c9c77a4146b632f21c81197d8c319b223aad81f1 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, container_name=ovn_controller, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, konflux.additional-tags=17.1.12 17.1_20251118.1, config_id=tripleo_step4, build-date=2025-11-18T23:34:05Z, description=Red Hat OpenStack Platform 17.1 ovn-controller, name=rhosp17/openstack-ovn-controller, com.redhat.component=openstack-ovn-controller-container, summary=Red Hat OpenStack Platform 17.1 ovn-controller, architecture=x86_64, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.buildah.version=1.41.4, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.expose-services=, release=1761123044, distribution-scope=public, maintainer=OpenStack TripleO Team, version=17.1.12, managed_by=tripleo_ansible, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20251118.1, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', 
'/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream) Dec 15 03:44:23 localhost podman[92537]: 2025-12-15 08:44:23.771384085 +0000 UTC m=+0.095045124 container health_status 4d35b7dc3f3a4e3c60661e85d3511f50804e87244d74cab67af5c6e8c447d379 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, vcs-type=git, release=1761123044, build-date=2025-11-19T00:14:25Z, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, architecture=x86_64, konflux.additional-tags=17.1.12 17.1_20251118.1, tcib_managed=true, maintainer=OpenStack TripleO Team, version=17.1.12, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vendor=Red Hat, Inc., com.redhat.component=openstack-neutron-metadata-agent-ovn-container, batch=17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.expose-services=, container_name=ovn_metadata_agent, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, url=https://www.redhat.com, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'a56a6f14b467cd9064e40c03defa5ed7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, io.buildah.version=1.41.4, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step4, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, name=rhosp17/openstack-neutron-metadata-agent-ovn) Dec 15 03:44:23 localhost podman[92536]: 2025-12-15 08:44:23.828405311 +0000 UTC m=+0.159789816 container exec_died 2df8d20c6ba32f38bd0e6519c9c77a4146b632f21c81197d8c319b223aad81f1 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, io.openshift.expose-services=, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, config_id=tripleo_step4, build-date=2025-11-18T23:34:05Z, distribution-scope=public, name=rhosp17/openstack-ovn-controller, io.buildah.version=1.41.4, 
io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, url=https://www.redhat.com, container_name=ovn_controller, description=Red Hat OpenStack Platform 17.1 ovn-controller, managed_by=tripleo_ansible, release=1761123044, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, tcib_managed=true, vcs-type=git, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, summary=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, version=17.1.12, com.redhat.component=openstack-ovn-controller-container, konflux.additional-tags=17.1.12 17.1_20251118.1, batch=17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05) Dec 15 03:44:23 localhost systemd[1]: 2df8d20c6ba32f38bd0e6519c9c77a4146b632f21c81197d8c319b223aad81f1.service: Deactivated successfully. 
Dec 15 03:44:23 localhost podman[92537]: 2025-12-15 08:44:23.855674491 +0000 UTC m=+0.179335550 container exec_died 4d35b7dc3f3a4e3c60661e85d3511f50804e87244d74cab67af5c6e8c447d379 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, config_id=tripleo_step4, vendor=Red Hat, Inc., org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, tcib_managed=true, io.openshift.expose-services=, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, name=rhosp17/openstack-neutron-metadata-agent-ovn, url=https://www.redhat.com, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'a56a6f14b467cd9064e40c03defa5ed7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', 
'/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, io.buildah.version=1.41.4, vcs-type=git, container_name=ovn_metadata_agent, konflux.additional-tags=17.1.12 17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, version=17.1.12, distribution-scope=public, release=1761123044, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, batch=17.1_20251118.1, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, build-date=2025-11-19T00:14:25Z) Dec 15 03:44:23 localhost systemd[1]: 4d35b7dc3f3a4e3c60661e85d3511f50804e87244d74cab67af5c6e8c447d379.service: Deactivated successfully. Dec 15 03:44:32 localhost systemd[1]: Started /usr/bin/podman healthcheck run 6278d648aec9c9f12e0efaafa0d6ae5653eeb34459516699e562eeb9c8d23b3c. Dec 15 03:44:32 localhost systemd[1]: tmp-crun.EbjiX6.mount: Deactivated successfully. 
Dec 15 03:44:32 localhost podman[92581]: 2025-12-15 08:44:32.748727992 +0000 UTC m=+0.084263755 container health_status 6278d648aec9c9f12e0efaafa0d6ae5653eeb34459516699e562eeb9c8d23b3c (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, release=1761123044, batch=17.1_20251118.1, build-date=2025-11-18T22:49:46Z, summary=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.expose-services=, config_id=tripleo_step1, com.redhat.component=openstack-qdrouterd-container, version=17.1.12, url=https://www.redhat.com, name=rhosp17/openstack-qdrouterd, distribution-scope=public, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, container_name=metrics_qdr, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '29d07da103bbd44a9ed3e29999314b03'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, architecture=x86_64, io.k8s.description=Red Hat 
OpenStack Platform 17.1 qdrouterd, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.12 17.1_20251118.1, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vcs-type=git, io.buildah.version=1.41.4, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, description=Red Hat OpenStack Platform 17.1 qdrouterd) Dec 15 03:44:32 localhost podman[92581]: 2025-12-15 08:44:32.96919259 +0000 UTC m=+0.304728403 container exec_died 6278d648aec9c9f12e0efaafa0d6ae5653eeb34459516699e562eeb9c8d23b3c (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, url=https://www.redhat.com, config_id=tripleo_step1, managed_by=tripleo_ansible, com.redhat.component=openstack-qdrouterd-container, tcib_managed=true, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, description=Red Hat OpenStack Platform 17.1 qdrouterd, architecture=x86_64, container_name=metrics_qdr, batch=17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, version=17.1.12, build-date=2025-11-18T22:49:46Z, io.buildah.version=1.41.4, release=1761123044, name=rhosp17/openstack-qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '29d07da103bbd44a9ed3e29999314b03'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 
'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-type=git, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, konflux.additional-tags=17.1.12 17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd) Dec 15 03:44:32 localhost systemd[1]: 6278d648aec9c9f12e0efaafa0d6ae5653eeb34459516699e562eeb9c8d23b3c.service: Deactivated successfully. Dec 15 03:44:48 localhost systemd[1]: Started /usr/bin/podman healthcheck run 165a6fd1f2b629d16da0798ec1c9bac41d302d530a0ffa668b2315a52d6aa04e. Dec 15 03:44:48 localhost systemd[1]: Started /usr/bin/podman healthcheck run 2649c3aec9af2f04509e1d248b1050b202fccc3ed2b86421c89a2a65376db057. Dec 15 03:44:48 localhost systemd[1]: Started /usr/bin/podman healthcheck run 36a88083ed613f6871d2631ae462bca308a45ba583333f7cfe4d0b0c40a18ba5. Dec 15 03:44:48 localhost systemd[1]: Started /usr/bin/podman healthcheck run 97115ff7c5a84ae7e17f79e696e94da3c63f9fe0051287fe9193bec282a03f99. 
Dec 15 03:44:48 localhost systemd[1]: Started /usr/bin/podman healthcheck run ac45612fab357b615b4684dbfa5bd000dd29eb1adfbaedb6db0802fc49e89549. Dec 15 03:44:48 localhost systemd[1]: Started /usr/bin/podman healthcheck run d6041e5471624a495d5be42684f6049ccf71802a80d5760239330151d465d146. Dec 15 03:44:48 localhost systemd[1]: tmp-crun.tIkvVR.mount: Deactivated successfully. Dec 15 03:44:48 localhost podman[92718]: 2025-12-15 08:44:48.787195789 +0000 UTC m=+0.110023124 container health_status 165a6fd1f2b629d16da0798ec1c9bac41d302d530a0ffa668b2315a52d6aa04e (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, io.openshift.expose-services=, architecture=x86_64, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vendor=Red Hat, Inc., vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, summary=Red Hat OpenStack Platform 17.1 collectd, tcib_managed=true, vcs-type=git, com.redhat.component=openstack-collectd-container, managed_by=tripleo_ansible, container_name=collectd, name=rhosp17/openstack-collectd, io.buildah.version=1.41.4, build-date=2025-11-18T22:51:28Z, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, batch=17.1_20251118.1, release=1761123044, version=17.1.12, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 collectd, config_id=tripleo_step3, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 
'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}) Dec 15 03:44:48 localhost podman[92720]: 2025-12-15 08:44:48.827023555 +0000 UTC m=+0.147277631 container health_status 36a88083ed613f6871d2631ae462bca308a45ba583333f7cfe4d0b0c40a18ba5 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, konflux.additional-tags=17.1.12 17.1_20251118.1, name=rhosp17/openstack-nova-compute, tcib_managed=true, distribution-scope=public, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_id=tripleo_step5, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.component=openstack-nova-compute-container, release=1761123044, vendor=Red Hat, Inc., 
cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, build-date=2025-11-19T00:36:58Z, container_name=nova_compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, architecture=x86_64, url=https://www.redhat.com, vcs-type=git, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '182e509007ab5e6e5b2500a552cbd5ba-879500e96bf8dfb93687004bd86f2317'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', 
'/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.12) Dec 15 03:44:48 localhost podman[92718]: 2025-12-15 08:44:48.828358421 +0000 UTC m=+0.151185706 container exec_died 165a6fd1f2b629d16da0798ec1c9bac41d302d530a0ffa668b2315a52d6aa04e (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, batch=17.1_20251118.1, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, config_id=tripleo_step3, tcib_managed=true, distribution-scope=public, container_name=collectd, name=rhosp17/openstack-collectd, release=1761123044, build-date=2025-11-18T22:51:28Z, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, version=17.1.12, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, description=Red Hat OpenStack Platform 17.1 collectd, vcs-type=git, managed_by=tripleo_ansible, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vendor=Red Hat, Inc., io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-collectd-container, config_data={'cap_add': 
['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}) Dec 15 03:44:48 localhost systemd[1]: 165a6fd1f2b629d16da0798ec1c9bac41d302d530a0ffa668b2315a52d6aa04e.service: Deactivated successfully. 
Dec 15 03:44:48 localhost podman[92728]: 2025-12-15 08:44:48.935909408 +0000 UTC m=+0.249853995 container health_status ac45612fab357b615b4684dbfa5bd000dd29eb1adfbaedb6db0802fc49e89549 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, version=17.1.12, config_id=tripleo_step4, io.buildah.version=1.41.4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, tcib_managed=true, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 cron, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, build-date=2025-11-18T22:49:32Z, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, io.openshift.expose-services=, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vcs-type=git, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, managed_by=tripleo_ansible, release=1761123044, name=rhosp17/openstack-cron, batch=17.1_20251118.1, container_name=logrotate_crond, com.redhat.component=openstack-cron-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.12 17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 cron, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc.) Dec 15 03:44:48 localhost podman[92728]: 2025-12-15 08:44:48.949247655 +0000 UTC m=+0.263192262 container exec_died ac45612fab357b615b4684dbfa5bd000dd29eb1adfbaedb6db0802fc49e89549 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 cron, vcs-type=git, io.openshift.expose-services=, io.buildah.version=1.41.4, managed_by=tripleo_ansible, batch=17.1_20251118.1, architecture=x86_64, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-cron-container, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, release=1761123044, version=17.1.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, config_id=tripleo_step4, build-date=2025-11-18T22:49:32Z, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 17.1_20251118.1, vendor=Red Hat, Inc., name=rhosp17/openstack-cron, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, distribution-scope=public, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, container_name=logrotate_crond, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 cron, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, tcib_managed=true) Dec 15 03:44:48 localhost podman[92719]: 2025-12-15 08:44:48.952964104 +0000 UTC m=+0.275260725 container health_status 2649c3aec9af2f04509e1d248b1050b202fccc3ed2b86421c89a2a65376db057 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, release=1761123044, vcs-type=git, config_id=tripleo_step3, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, container_name=iscsid, architecture=x86_64, build-date=2025-11-18T23:44:13Z, version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.component=openstack-iscsid-container, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 
'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '182e509007ab5e6e5b2500a552cbd5ba'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, io.buildah.version=1.41.4, url=https://www.redhat.com, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 iscsid, managed_by=tripleo_ansible, vendor=Red Hat, Inc., distribution-scope=public, konflux.additional-tags=17.1.12 17.1_20251118.1, batch=17.1_20251118.1, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid) Dec 15 03:44:48 localhost podman[92720]: 2025-12-15 08:44:48.963445694 +0000 UTC m=+0.283699740 container exec_died 36a88083ed613f6871d2631ae462bca308a45ba583333f7cfe4d0b0c40a18ba5 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, io.openshift.expose-services=, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, vcs-type=git, com.redhat.component=openstack-nova-compute-container, container_name=nova_compute, managed_by=tripleo_ansible, batch=17.1_20251118.1, vendor=Red Hat, Inc., version=17.1.12, name=rhosp17/openstack-nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.41.4, url=https://www.redhat.com, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '182e509007ab5e6e5b2500a552cbd5ba-879500e96bf8dfb93687004bd86f2317'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', 
'/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, description=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, config_id=tripleo_step5, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, build-date=2025-11-19T00:36:58Z, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, release=1761123044, tcib_managed=true, konflux.additional-tags=17.1.12 17.1_20251118.1, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 nova-compute) Dec 15 03:44:48 localhost systemd[1]: 36a88083ed613f6871d2631ae462bca308a45ba583333f7cfe4d0b0c40a18ba5.service: Deactivated successfully. Dec 15 03:44:49 localhost systemd[1]: ac45612fab357b615b4684dbfa5bd000dd29eb1adfbaedb6db0802fc49e89549.service: Deactivated successfully. 
Dec 15 03:44:49 localhost podman[92721]: 2025-12-15 08:44:49.03764683 +0000 UTC m=+0.350772756 container health_status 97115ff7c5a84ae7e17f79e696e94da3c63f9fe0051287fe9193bec282a03f99 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, batch=17.1_20251118.1, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, architecture=x86_64, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'fcee5a4a91f85471fca7b61211375646'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, maintainer=OpenStack TripleO Team, build-date=2025-11-19T00:12:45Z, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.expose-services=, vendor=Red Hat, Inc., org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.component=openstack-ceilometer-ipmi-container, io.buildah.version=1.41.4, release=1761123044, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, version=17.1.12, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, name=rhosp17/openstack-ceilometer-ipmi, distribution-scope=public, container_name=ceilometer_agent_ipmi, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, tcib_managed=true, vcs-type=git, config_id=tripleo_step4) Dec 15 03:44:49 localhost podman[92719]: 2025-12-15 08:44:49.043663211 +0000 UTC m=+0.365959822 container exec_died 2649c3aec9af2f04509e1d248b1050b202fccc3ed2b86421c89a2a65376db057 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, architecture=x86_64, tcib_managed=true, release=1761123044, com.redhat.component=openstack-iscsid-container, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, batch=17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, container_name=iscsid, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '182e509007ab5e6e5b2500a552cbd5ba'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 iscsid, konflux.additional-tags=17.1.12 17.1_20251118.1, config_id=tripleo_step3, build-date=2025-11-18T23:44:13Z, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.buildah.version=1.41.4, url=https://www.redhat.com, version=17.1.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, maintainer=OpenStack TripleO Team, vcs-type=git, distribution-scope=public, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.expose-services=, name=rhosp17/openstack-iscsid, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d) Dec 15 03:44:49 localhost podman[92737]: 2025-12-15 08:44:49.053413751 +0000 UTC m=+0.356878738 container health_status d6041e5471624a495d5be42684f6049ccf71802a80d5760239330151d465d146 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, container_name=ceilometer_agent_compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 
'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'fcee5a4a91f85471fca7b61211375646'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, url=https://www.redhat.com, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1, managed_by=tripleo_ansible, name=rhosp17/openstack-ceilometer-compute, distribution-scope=public, version=17.1.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-type=git, tcib_managed=true, io.buildah.version=1.41.4, build-date=2025-11-19T00:11:48Z, com.redhat.component=openstack-ceilometer-compute-container, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, vendor=Red Hat, Inc., vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, release=1761123044, konflux.additional-tags=17.1.12 17.1_20251118.1, config_id=tripleo_step4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64) Dec 15 03:44:49 localhost 
systemd[1]: 2649c3aec9af2f04509e1d248b1050b202fccc3ed2b86421c89a2a65376db057.service: Deactivated successfully. Dec 15 03:44:49 localhost podman[92721]: 2025-12-15 08:44:49.088468969 +0000 UTC m=+0.401594915 container exec_died 97115ff7c5a84ae7e17f79e696e94da3c63f9fe0051287fe9193bec282a03f99 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, config_id=tripleo_step4, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, name=rhosp17/openstack-ceilometer-ipmi, batch=17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, vendor=Red Hat, Inc., architecture=x86_64, version=17.1.12, managed_by=tripleo_ansible, vcs-type=git, konflux.additional-tags=17.1.12 17.1_20251118.1, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'fcee5a4a91f85471fca7b61211375646'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.buildah.version=1.41.4, url=https://www.redhat.com, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, 
org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.component=openstack-ceilometer-ipmi-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=ceilometer_agent_ipmi, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, build-date=2025-11-19T00:12:45Z, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, distribution-scope=public, release=1761123044, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676) Dec 15 03:44:49 localhost systemd[1]: 97115ff7c5a84ae7e17f79e696e94da3c63f9fe0051287fe9193bec282a03f99.service: Deactivated successfully. Dec 15 03:44:49 localhost podman[92737]: 2025-12-15 08:44:49.108414363 +0000 UTC m=+0.411879340 container exec_died d6041e5471624a495d5be42684f6049ccf71802a80d5760239330151d465d146 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, vendor=Red Hat, Inc., io.buildah.version=1.41.4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, version=17.1.12, architecture=x86_64, batch=17.1_20251118.1, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, container_name=ceilometer_agent_compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, build-date=2025-11-19T00:11:48Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'fcee5a4a91f85471fca7b61211375646'}, 'healthcheck': {'test': 
'/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, name=rhosp17/openstack-ceilometer-compute, com.redhat.component=openstack-ceilometer-compute-container, release=1761123044, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-type=git, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, tcib_managed=true, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, konflux.additional-tags=17.1.12 17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_id=tripleo_step4, io.openshift.expose-services=) Dec 15 03:44:49 localhost systemd[1]: d6041e5471624a495d5be42684f6049ccf71802a80d5760239330151d465d146.service: Deactivated successfully. Dec 15 03:44:51 localhost systemd[1]: Started /usr/bin/podman healthcheck run 4c3f871f4bdf287b1d3ce717956c1cf130617489217410eadf6fffcc440eb3b0. Dec 15 03:44:51 localhost systemd[1]: tmp-crun.O8ZySA.mount: Deactivated successfully. 
Dec 15 03:44:51 localhost podman[92866]: 2025-12-15 08:44:51.776210542 +0000 UTC m=+0.106043048 container health_status 4c3f871f4bdf287b1d3ce717956c1cf130617489217410eadf6fffcc440eb3b0 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, konflux.additional-tags=17.1.12 17.1_20251118.1, url=https://www.redhat.com, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, config_id=tripleo_step4, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '879500e96bf8dfb93687004bd86f2317'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, build-date=2025-11-19T00:36:58Z, 
release=1761123044, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., com.redhat.component=openstack-nova-compute-container, maintainer=OpenStack TripleO Team, batch=17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.41.4, name=rhosp17/openstack-nova-compute, tcib_managed=true, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, container_name=nova_migration_target, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, version=17.1.12, distribution-scope=public) Dec 15 03:44:52 localhost podman[92866]: 2025-12-15 08:44:52.149394085 +0000 UTC m=+0.479226601 container exec_died 4c3f871f4bdf287b1d3ce717956c1cf130617489217410eadf6fffcc440eb3b0 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, architecture=x86_64, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, url=https://www.redhat.com, tcib_managed=true, io.buildah.version=1.41.4, konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, container_name=nova_migration_target, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-nova-compute-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_id=tripleo_step4, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, build-date=2025-11-19T00:36:58Z, distribution-scope=public, name=rhosp17/openstack-nova-compute, release=1761123044, batch=17.1_20251118.1, 
vcs-type=git, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.12, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '879500e96bf8dfb93687004bd86f2317'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute) Dec 15 03:44:52 localhost systemd[1]: 4c3f871f4bdf287b1d3ce717956c1cf130617489217410eadf6fffcc440eb3b0.service: Deactivated successfully. Dec 15 03:44:54 localhost systemd[1]: Started /usr/bin/podman healthcheck run 2df8d20c6ba32f38bd0e6519c9c77a4146b632f21c81197d8c319b223aad81f1. Dec 15 03:44:54 localhost systemd[1]: Started /usr/bin/podman healthcheck run 4d35b7dc3f3a4e3c60661e85d3511f50804e87244d74cab67af5c6e8c447d379. Dec 15 03:44:54 localhost systemd[1]: tmp-crun.kUEU7d.mount: Deactivated successfully. 
Dec 15 03:44:54 localhost podman[92889]: 2025-12-15 08:44:54.770190146 +0000 UTC m=+0.095379153 container health_status 4d35b7dc3f3a4e3c60661e85d3511f50804e87244d74cab67af5c6e8c447d379 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, managed_by=tripleo_ansible, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'a56a6f14b467cd9064e40c03defa5ed7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-11-19T00:14:25Z, io.buildah.version=1.41.4, config_id=tripleo_step4, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, 
name=rhosp17/openstack-neutron-metadata-agent-ovn, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.expose-services=, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, konflux.additional-tags=17.1.12 17.1_20251118.1, container_name=ovn_metadata_agent, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, url=https://www.redhat.com, version=17.1.12, vcs-type=git, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, release=1761123044, batch=17.1_20251118.1, maintainer=OpenStack TripleO Team, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, architecture=x86_64) Dec 15 03:44:54 localhost systemd[1]: tmp-crun.95haMY.mount: Deactivated successfully. 
Dec 15 03:44:54 localhost podman[92888]: 2025-12-15 08:44:54.820120232 +0000 UTC m=+0.148080203 container health_status 2df8d20c6ba32f38bd0e6519c9c77a4146b632f21c81197d8c319b223aad81f1 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, distribution-scope=public, batch=17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 ovn-controller, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.expose-services=, io.buildah.version=1.41.4, release=1761123044, container_name=ovn_controller, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, config_id=tripleo_step4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.component=openstack-ovn-controller-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, url=https://www.redhat.com, architecture=x86_64, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, maintainer=OpenStack TripleO Team, vcs-type=git, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, description=Red Hat OpenStack Platform 17.1 ovn-controller, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-11-18T23:34:05Z, 
name=rhosp17/openstack-ovn-controller, version=17.1.12, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, konflux.additional-tags=17.1.12 17.1_20251118.1, vendor=Red Hat, Inc.) Dec 15 03:44:54 localhost podman[92889]: 2025-12-15 08:44:54.846404514 +0000 UTC m=+0.171593561 container exec_died 4d35b7dc3f3a4e3c60661e85d3511f50804e87244d74cab67af5c6e8c447d379 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, io.openshift.expose-services=, managed_by=tripleo_ansible, name=rhosp17/openstack-neutron-metadata-agent-ovn, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'a56a6f14b467cd9064e40c03defa5ed7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, vcs-type=git, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, architecture=x86_64, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.buildah.version=1.41.4, distribution-scope=public, config_id=tripleo_step4, release=1761123044, build-date=2025-11-19T00:14:25Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, version=17.1.12, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, container_name=ovn_metadata_agent, konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, vendor=Red Hat, Inc., org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, tcib_managed=true, batch=17.1_20251118.1) Dec 15 03:44:54 localhost systemd[1]: 4d35b7dc3f3a4e3c60661e85d3511f50804e87244d74cab67af5c6e8c447d379.service: Deactivated successfully. 
Dec 15 03:44:54 localhost podman[92888]: 2025-12-15 08:44:54.8724022 +0000 UTC m=+0.200362091 container exec_died 2df8d20c6ba32f38bd0e6519c9c77a4146b632f21c81197d8c319b223aad81f1 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, version=17.1.12, summary=Red Hat OpenStack Platform 17.1 ovn-controller, config_id=tripleo_step4, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, name=rhosp17/openstack-ovn-controller, container_name=ovn_controller, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., vcs-type=git, tcib_managed=true, distribution-scope=public, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 ovn-controller, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, build-date=2025-11-18T23:34:05Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.component=openstack-ovn-controller-container, batch=17.1_20251118.1, release=1761123044, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, architecture=x86_64, io.buildah.version=1.41.4) Dec 15 03:44:54 localhost systemd[1]: 2df8d20c6ba32f38bd0e6519c9c77a4146b632f21c81197d8c319b223aad81f1.service: Deactivated successfully. Dec 15 03:45:03 localhost systemd[1]: Started /usr/bin/podman healthcheck run 6278d648aec9c9f12e0efaafa0d6ae5653eeb34459516699e562eeb9c8d23b3c. Dec 15 03:45:03 localhost systemd[1]: tmp-crun.sNdG4P.mount: Deactivated successfully. Dec 15 03:45:03 localhost podman[92936]: 2025-12-15 08:45:03.757199834 +0000 UTC m=+0.091252543 container health_status 6278d648aec9c9f12e0efaafa0d6ae5653eeb34459516699e562eeb9c8d23b3c (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, vcs-type=git, tcib_managed=true, name=rhosp17/openstack-qdrouterd, konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, io.buildah.version=1.41.4, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, io.openshift.expose-services=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '29d07da103bbd44a9ed3e29999314b03'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, container_name=metrics_qdr, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, com.redhat.component=openstack-qdrouterd-container, url=https://www.redhat.com, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 qdrouterd, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, version=17.1.12, build-date=2025-11-18T22:49:46Z, config_id=tripleo_step1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, description=Red Hat OpenStack Platform 17.1 qdrouterd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, release=1761123044, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, batch=17.1_20251118.1) Dec 15 03:45:03 localhost podman[92936]: 2025-12-15 08:45:03.952132829 +0000 UTC m=+0.286185498 container exec_died 6278d648aec9c9f12e0efaafa0d6ae5653eeb34459516699e562eeb9c8d23b3c (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.component=openstack-qdrouterd-container, managed_by=tripleo_ansible, name=rhosp17/openstack-qdrouterd, version=17.1.12, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '29d07da103bbd44a9ed3e29999314b03'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 
'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, config_id=tripleo_step1, vcs-type=git, description=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, tcib_managed=true, url=https://www.redhat.com, distribution-scope=public, vendor=Red Hat, Inc., container_name=metrics_qdr, batch=17.1_20251118.1, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, release=1761123044, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 qdrouterd, io.buildah.version=1.41.4, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, build-date=2025-11-18T22:49:46Z) Dec 15 03:45:03 localhost systemd[1]: 6278d648aec9c9f12e0efaafa0d6ae5653eeb34459516699e562eeb9c8d23b3c.service: Deactivated successfully. 
Dec 15 03:45:05 localhost ceph-osd[31375]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS ------- Dec 15 03:45:05 localhost ceph-osd[31375]: rocksdb: [db/db_impl/db_impl.cc:1111] #012** DB Stats **#012Uptime(secs): 3600.1 total, 600.0 interval#012Cumulative writes: 4815 writes, 21K keys, 4815 commit groups, 1.0 writes per commit group, ingest: 0.02 GB, 0.01 MB/s#012Cumulative WAL: 4815 writes, 628 syncs, 7.67 writes per sync, written: 0.02 GB, 0.01 MB/s#012Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent#012Interval writes: 0 writes, 0 keys, 0 commit groups, 0.0 writes per commit group, ingest: 0.00 MB, 0.00 MB/s#012Interval WAL: 0 writes, 0 syncs, 0.00 writes per sync, written: 0.00 GB, 0.00 MB/s#012Interval stall: 00:00:0.000 H:M:S, 0.0 percent Dec 15 03:45:10 localhost ceph-osd[32311]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS ------- Dec 15 03:45:10 localhost ceph-osd[32311]: rocksdb: [db/db_impl/db_impl.cc:1111] #012** DB Stats **#012Uptime(secs): 3600.2 total, 600.0 interval#012Cumulative writes: 5745 writes, 25K keys, 5745 commit groups, 1.0 writes per commit group, ingest: 0.02 GB, 0.01 MB/s#012Cumulative WAL: 5745 writes, 763 syncs, 7.53 writes per sync, written: 0.02 GB, 0.01 MB/s#012Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent#012Interval writes: 0 writes, 0 keys, 0 commit groups, 0.0 writes per commit group, ingest: 0.00 MB, 0.00 MB/s#012Interval WAL: 0 writes, 0 syncs, 0.00 writes per sync, written: 0.00 GB, 0.00 MB/s#012Interval stall: 00:00:0.000 H:M:S, 0.0 percent Dec 15 03:45:19 localhost systemd[1]: Started /usr/bin/podman healthcheck run 165a6fd1f2b629d16da0798ec1c9bac41d302d530a0ffa668b2315a52d6aa04e. Dec 15 03:45:19 localhost systemd[1]: Started /usr/bin/podman healthcheck run 2649c3aec9af2f04509e1d248b1050b202fccc3ed2b86421c89a2a65376db057. Dec 15 03:45:19 localhost systemd[1]: Started /usr/bin/podman healthcheck run 36a88083ed613f6871d2631ae462bca308a45ba583333f7cfe4d0b0c40a18ba5. 
Dec 15 03:45:19 localhost systemd[1]: Started /usr/bin/podman healthcheck run 97115ff7c5a84ae7e17f79e696e94da3c63f9fe0051287fe9193bec282a03f99. Dec 15 03:45:19 localhost systemd[1]: Started /usr/bin/podman healthcheck run ac45612fab357b615b4684dbfa5bd000dd29eb1adfbaedb6db0802fc49e89549. Dec 15 03:45:19 localhost systemd[1]: Started /usr/bin/podman healthcheck run d6041e5471624a495d5be42684f6049ccf71802a80d5760239330151d465d146. Dec 15 03:45:19 localhost podman[92966]: 2025-12-15 08:45:19.763356245 +0000 UTC m=+0.083414873 container health_status 36a88083ed613f6871d2631ae462bca308a45ba583333f7cfe4d0b0c40a18ba5 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, version=17.1.12, description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, maintainer=OpenStack TripleO Team, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, url=https://www.redhat.com, distribution-scope=public, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, batch=17.1_20251118.1, build-date=2025-11-19T00:36:58Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.buildah.version=1.41.4, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, name=rhosp17/openstack-nova-compute, com.redhat.component=openstack-nova-compute-container, vcs-type=git, release=1761123044, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': 
'182e509007ab5e6e5b2500a552cbd5ba-879500e96bf8dfb93687004bd86f2317'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, container_name=nova_compute, config_id=tripleo_step5, vendor=Red Hat, Inc., org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d) Dec 15 03:45:19 localhost systemd[1]: tmp-crun.OQg4gG.mount: Deactivated successfully. 
Dec 15 03:45:19 localhost podman[92964]: 2025-12-15 08:45:19.811002309 +0000 UTC m=+0.140974433 container health_status 165a6fd1f2b629d16da0798ec1c9bac41d302d530a0ffa668b2315a52d6aa04e (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, com.redhat.component=openstack-collectd-container, summary=Red Hat OpenStack Platform 17.1 collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, name=rhosp17/openstack-collectd, konflux.additional-tags=17.1.12 17.1_20251118.1, batch=17.1_20251118.1, distribution-scope=public, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, container_name=collectd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, description=Red Hat OpenStack Platform 17.1 collectd, tcib_managed=true, config_id=tripleo_step3, url=https://www.redhat.com, vcs-type=git, io.buildah.version=1.41.4, build-date=2025-11-18T22:51:28Z, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, version=17.1.12, io.openshift.expose-services=, release=1761123044, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}) Dec 15 03:45:19 localhost podman[92966]: 2025-12-15 08:45:19.815865178 +0000 UTC m=+0.135923796 container exec_died 36a88083ed613f6871d2631ae462bca308a45ba583333f7cfe4d0b0c40a18ba5 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, config_id=tripleo_step5, com.redhat.component=openstack-nova-compute-container, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, version=17.1.12, maintainer=OpenStack TripleO Team, build-date=2025-11-19T00:36:58Z, konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, release=1761123044, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, vendor=Red Hat, Inc., vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, distribution-scope=public, batch=17.1_20251118.1, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, architecture=x86_64, io.openshift.expose-services=, io.buildah.version=1.41.4, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp17/openstack-nova-compute, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 nova-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '182e509007ab5e6e5b2500a552cbd5ba-879500e96bf8dfb93687004bd86f2317'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, 
container_name=nova_compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute) Dec 15 03:45:19 localhost systemd[1]: 36a88083ed613f6871d2631ae462bca308a45ba583333f7cfe4d0b0c40a18ba5.service: Deactivated successfully. Dec 15 03:45:19 localhost podman[92978]: 2025-12-15 08:45:19.853142186 +0000 UTC m=+0.168552499 container health_status ac45612fab357b615b4684dbfa5bd000dd29eb1adfbaedb6db0802fc49e89549 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, io.buildah.version=1.41.4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, version=17.1.12, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, tcib_managed=true, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 cron, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, batch=17.1_20251118.1, description=Red Hat OpenStack 
Platform 17.1 cron, container_name=logrotate_crond, architecture=x86_64, build-date=2025-11-18T22:49:32Z, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, name=rhosp17/openstack-cron, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, maintainer=OpenStack TripleO Team, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, summary=Red Hat OpenStack Platform 17.1 cron, url=https://www.redhat.com, com.redhat.component=openstack-cron-container, release=1761123044, distribution-scope=public, config_id=tripleo_step4, managed_by=tripleo_ansible) Dec 15 03:45:19 localhost podman[92964]: 2025-12-15 08:45:19.867333476 +0000 UTC m=+0.197305540 container exec_died 165a6fd1f2b629d16da0798ec1c9bac41d302d530a0ffa668b2315a52d6aa04e (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, konflux.additional-tags=17.1.12 17.1_20251118.1, build-date=2025-11-18T22:51:28Z, config_id=tripleo_step3, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, container_name=collectd, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, name=rhosp17/openstack-collectd, url=https://www.redhat.com, batch=17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 
'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, maintainer=OpenStack TripleO Team, distribution-scope=public, vendor=Red Hat, Inc., release=1761123044, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vcs-type=git, io.buildah.version=1.41.4, architecture=x86_64, tcib_managed=true, version=17.1.12, com.redhat.component=openstack-collectd-container, io.openshift.expose-services=, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 collectd) Dec 15 03:45:19 localhost systemd[1]: 165a6fd1f2b629d16da0798ec1c9bac41d302d530a0ffa668b2315a52d6aa04e.service: Deactivated successfully. 
Dec 15 03:45:19 localhost podman[92972]: 2025-12-15 08:45:19.780364979 +0000 UTC m=+0.098622349 container health_status 97115ff7c5a84ae7e17f79e696e94da3c63f9fe0051287fe9193bec282a03f99 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, io.buildah.version=1.41.4, konflux.additional-tags=17.1.12 17.1_20251118.1, vendor=Red Hat, Inc., vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, tcib_managed=true, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-ceilometer-ipmi, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, release=1761123044, io.openshift.expose-services=, managed_by=tripleo_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'fcee5a4a91f85471fca7b61211375646'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, version=17.1.12, config_id=tripleo_step4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, distribution-scope=public, com.redhat.component=openstack-ceilometer-ipmi-container, vcs-type=git, build-date=2025-11-19T00:12:45Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, container_name=ceilometer_agent_ipmi, batch=17.1_20251118.1) Dec 15 03:45:19 localhost podman[92979]: 2025-12-15 08:45:19.883290742 +0000 UTC m=+0.195818848 container health_status d6041e5471624a495d5be42684f6049ccf71802a80d5760239330151d465d146 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, version=17.1.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, name=rhosp17/openstack-ceilometer-compute, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, distribution-scope=public, config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, release=1761123044, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, batch=17.1_20251118.1, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, io.buildah.version=1.41.4, architecture=x86_64, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, vcs-type=git, 
cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, build-date=2025-11-19T00:11:48Z, vendor=Red Hat, Inc., config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'fcee5a4a91f85471fca7b61211375646'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, konflux.additional-tags=17.1.12 17.1_20251118.1, container_name=ceilometer_agent_compute, tcib_managed=true, com.redhat.component=openstack-ceilometer-compute-container) Dec 15 03:45:19 localhost podman[92965]: 2025-12-15 08:45:19.926403606 +0000 UTC m=+0.253992986 container health_status 2649c3aec9af2f04509e1d248b1050b202fccc3ed2b86421c89a2a65376db057 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, summary=Red Hat OpenStack Platform 17.1 iscsid, container_name=iscsid, maintainer=OpenStack TripleO Team, batch=17.1_20251118.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '182e509007ab5e6e5b2500a552cbd5ba'}, 
'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, name=rhosp17/openstack-iscsid, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-18T23:44:13Z, description=Red Hat OpenStack Platform 17.1 iscsid, url=https://www.redhat.com, vendor=Red Hat, Inc., tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-iscsid-container, version=17.1.12, architecture=x86_64, managed_by=tripleo_ansible, release=1761123044, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-type=git, io.openshift.expose-services=, config_id=tripleo_step3, io.buildah.version=1.41.4) Dec 15 03:45:19 localhost 
podman[92979]: 2025-12-15 08:45:19.935510229 +0000 UTC m=+0.248038295 container exec_died d6041e5471624a495d5be42684f6049ccf71802a80d5760239330151d465d146 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1761123044, konflux.additional-tags=17.1.12 17.1_20251118.1, name=rhosp17/openstack-ceilometer-compute, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, com.redhat.component=openstack-ceilometer-compute-container, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'fcee5a4a91f85471fca7b61211375646'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, tcib_managed=true, config_id=tripleo_step4, description=Red Hat OpenStack Platform 
17.1 ceilometer-compute, io.openshift.expose-services=, url=https://www.redhat.com, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, distribution-scope=public, io.buildah.version=1.41.4, architecture=x86_64, container_name=ceilometer_agent_compute, version=17.1.12, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, build-date=2025-11-19T00:11:48Z, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, vcs-type=git, batch=17.1_20251118.1) Dec 15 03:45:19 localhost podman[92978]: 2025-12-15 08:45:19.941295224 +0000 UTC m=+0.256705557 container exec_died ac45612fab357b615b4684dbfa5bd000dd29eb1adfbaedb6db0802fc49e89549 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1761123044, build-date=2025-11-18T22:49:32Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.buildah.version=1.41.4, container_name=logrotate_crond, vendor=Red Hat, Inc., distribution-scope=public, name=rhosp17/openstack-cron, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, maintainer=OpenStack TripleO Team, architecture=x86_64, version=17.1.12, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-type=git, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 
'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, batch=17.1_20251118.1, config_id=tripleo_step4, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 cron, io.openshift.expose-services=, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, tcib_managed=true, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, summary=Red Hat OpenStack Platform 17.1 cron, com.redhat.component=openstack-cron-container) Dec 15 03:45:19 localhost systemd[1]: d6041e5471624a495d5be42684f6049ccf71802a80d5760239330151d465d146.service: Deactivated successfully. Dec 15 03:45:19 localhost systemd[1]: ac45612fab357b615b4684dbfa5bd000dd29eb1adfbaedb6db0802fc49e89549.service: Deactivated successfully. 
Dec 15 03:45:19 localhost podman[92965]: 2025-12-15 08:45:19.990356427 +0000 UTC m=+0.317945847 container exec_died 2649c3aec9af2f04509e1d248b1050b202fccc3ed2b86421c89a2a65376db057 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, distribution-scope=public, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 iscsid, vcs-type=git, build-date=2025-11-18T23:44:13Z, version=17.1.12, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, release=1761123044, managed_by=tripleo_ansible, name=rhosp17/openstack-iscsid, config_id=tripleo_step3, konflux.additional-tags=17.1.12 17.1_20251118.1, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, tcib_managed=true, container_name=iscsid, com.redhat.component=openstack-iscsid-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '182e509007ab5e6e5b2500a552cbd5ba'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, io.buildah.version=1.41.4, batch=17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 iscsid, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d) Dec 15 03:45:20 localhost systemd[1]: 2649c3aec9af2f04509e1d248b1050b202fccc3ed2b86421c89a2a65376db057.service: Deactivated successfully. Dec 15 03:45:20 localhost podman[92972]: 2025-12-15 08:45:20.013186218 +0000 UTC m=+0.331443608 container exec_died 97115ff7c5a84ae7e17f79e696e94da3c63f9fe0051287fe9193bec282a03f99 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, tcib_managed=true, distribution-scope=public, url=https://www.redhat.com, vcs-type=git, batch=17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.component=openstack-ceilometer-ipmi-container, name=rhosp17/openstack-ceilometer-ipmi, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, managed_by=tripleo_ansible, container_name=ceilometer_agent_ipmi, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, version=17.1.12, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vendor=Red Hat, Inc., architecture=x86_64, release=1761123044, io.buildah.version=1.41.4, 
io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, build-date=2025-11-19T00:12:45Z, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'fcee5a4a91f85471fca7b61211375646'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, konflux.additional-tags=17.1.12 17.1_20251118.1) Dec 15 03:45:20 localhost systemd[1]: 97115ff7c5a84ae7e17f79e696e94da3c63f9fe0051287fe9193bec282a03f99.service: Deactivated successfully. Dec 15 03:45:22 localhost systemd[1]: Started /usr/bin/podman healthcheck run 4c3f871f4bdf287b1d3ce717956c1cf130617489217410eadf6fffcc440eb3b0. 
Dec 15 03:45:22 localhost podman[93098]: 2025-12-15 08:45:22.749970012 +0000 UTC m=+0.082004795 container health_status 4c3f871f4bdf287b1d3ce717956c1cf130617489217410eadf6fffcc440eb3b0 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, managed_by=tripleo_ansible, konflux.additional-tags=17.1.12 17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, version=17.1.12, description=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2025-11-19T00:36:58Z, config_id=tripleo_step4, tcib_managed=true, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, container_name=nova_migration_target, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, url=https://www.redhat.com, name=rhosp17/openstack-nova-compute, maintainer=OpenStack TripleO Team, release=1761123044, batch=17.1_20251118.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '879500e96bf8dfb93687004bd86f2317'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, distribution-scope=public, com.redhat.component=openstack-nova-compute-container, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.buildah.version=1.41.4, architecture=x86_64, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 nova-compute) Dec 15 03:45:23 localhost podman[93098]: 2025-12-15 08:45:23.188261986 +0000 UTC m=+0.520296749 container exec_died 4c3f871f4bdf287b1d3ce717956c1cf130617489217410eadf6fffcc440eb3b0 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2025-11-19T00:36:58Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, url=https://www.redhat.com, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vendor=Red Hat, Inc., config_id=tripleo_step4, batch=17.1_20251118.1, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, container_name=nova_migration_target, com.redhat.component=openstack-nova-compute-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, distribution-scope=public, name=rhosp17/openstack-nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, version=17.1.12, release=1761123044, vcs-type=git, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, managed_by=tripleo_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '879500e96bf8dfb93687004bd86f2317'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, io.buildah.version=1.41.4) Dec 15 03:45:23 localhost systemd[1]: 4c3f871f4bdf287b1d3ce717956c1cf130617489217410eadf6fffcc440eb3b0.service: Deactivated successfully. Dec 15 03:45:25 localhost systemd[1]: Started /usr/bin/podman healthcheck run 2df8d20c6ba32f38bd0e6519c9c77a4146b632f21c81197d8c319b223aad81f1. Dec 15 03:45:25 localhost systemd[1]: Started /usr/bin/podman healthcheck run 4d35b7dc3f3a4e3c60661e85d3511f50804e87244d74cab67af5c6e8c447d379. Dec 15 03:45:25 localhost systemd[1]: tmp-crun.IkbTXw.mount: Deactivated successfully. 
Dec 15 03:45:25 localhost podman[93121]: 2025-12-15 08:45:25.753968953 +0000 UTC m=+0.084735487 container health_status 2df8d20c6ba32f38bd0e6519c9c77a4146b632f21c81197d8c319b223aad81f1 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, konflux.additional-tags=17.1.12 17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, tcib_managed=true, maintainer=OpenStack TripleO Team, vcs-type=git, com.redhat.component=openstack-ovn-controller-container, release=1761123044, description=Red Hat OpenStack Platform 17.1 ovn-controller, io.buildah.version=1.41.4, summary=Red Hat OpenStack Platform 17.1 ovn-controller, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, version=17.1.12, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, batch=17.1_20251118.1, container_name=ovn_controller, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vendor=Red Hat, Inc., org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, managed_by=tripleo_ansible, build-date=2025-11-18T23:34:05Z, architecture=x86_64, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', 
'/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, name=rhosp17/openstack-ovn-controller, config_id=tripleo_step4) Dec 15 03:45:25 localhost systemd[1]: tmp-crun.1VWmUB.mount: Deactivated successfully. Dec 15 03:45:25 localhost podman[93121]: 2025-12-15 08:45:25.809318354 +0000 UTC m=+0.140084818 container exec_died 2df8d20c6ba32f38bd0e6519c9c77a4146b632f21c81197d8c319b223aad81f1 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, com.redhat.component=openstack-ovn-controller-container, release=1761123044, tcib_managed=true, build-date=2025-11-18T23:34:05Z, name=rhosp17/openstack-ovn-controller, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, config_id=tripleo_step4, io.buildah.version=1.41.4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, architecture=x86_64, vendor=Red Hat, Inc., version=17.1.12, url=https://www.redhat.com, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', 
'/var/log/containers/openvswitch:/var/log/ovn:z']}, batch=17.1_20251118.1, vcs-type=git, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, container_name=ovn_controller, description=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller) Dec 15 03:45:25 localhost systemd[1]: 2df8d20c6ba32f38bd0e6519c9c77a4146b632f21c81197d8c319b223aad81f1.service: Deactivated successfully. Dec 15 03:45:25 localhost podman[93122]: 2025-12-15 08:45:25.80840067 +0000 UTC m=+0.133898753 container health_status 4d35b7dc3f3a4e3c60661e85d3511f50804e87244d74cab67af5c6e8c447d379 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, name=rhosp17/openstack-neutron-metadata-agent-ovn, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, batch=17.1_20251118.1, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, config_id=tripleo_step4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, tcib_managed=true, vendor=Red Hat, Inc., konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, vcs-type=git, container_name=ovn_metadata_agent, io.k8s.description=Red Hat OpenStack Platform 
17.1 neutron-metadata-agent-ovn, version=17.1.12, build-date=2025-11-19T00:14:25Z, managed_by=tripleo_ansible, distribution-scope=public, url=https://www.redhat.com, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'a56a6f14b467cd9064e40c03defa5ed7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, release=1761123044, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.buildah.version=1.41.4) Dec 15 03:45:25 localhost podman[93122]: 2025-12-15 08:45:25.891505303 +0000 UTC m=+0.217003386 container exec_died 4d35b7dc3f3a4e3c60661e85d3511f50804e87244d74cab67af5c6e8c447d379 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, 
vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, vendor=Red Hat, Inc., batch=17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, tcib_managed=true, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.buildah.version=1.41.4, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, managed_by=tripleo_ansible, url=https://www.redhat.com, architecture=x86_64, konflux.additional-tags=17.1.12 17.1_20251118.1, config_id=tripleo_step4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, container_name=ovn_metadata_agent, version=17.1.12, release=1761123044, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'a56a6f14b467cd9064e40c03defa5ed7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', 
'/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, build-date=2025-11-19T00:14:25Z, name=rhosp17/openstack-neutron-metadata-agent-ovn, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, distribution-scope=public) Dec 15 03:45:25 localhost systemd[1]: 4d35b7dc3f3a4e3c60661e85d3511f50804e87244d74cab67af5c6e8c447d379.service: Deactivated successfully. Dec 15 03:45:34 localhost systemd[1]: Started /usr/bin/podman healthcheck run 6278d648aec9c9f12e0efaafa0d6ae5653eeb34459516699e562eeb9c8d23b3c. Dec 15 03:45:34 localhost systemd[1]: Starting Check and recover tripleo_nova_virtqemud... Dec 15 03:45:34 localhost recover_tripleo_nova_virtqemud[93169]: 61849 Dec 15 03:45:34 localhost systemd[1]: tripleo_nova_virtqemud_recover.service: Deactivated successfully. Dec 15 03:45:34 localhost systemd[1]: Finished Check and recover tripleo_nova_virtqemud. 
Dec 15 03:45:34 localhost podman[93167]: 2025-12-15 08:45:34.766176297 +0000 UTC m=+0.090419109 container health_status 6278d648aec9c9f12e0efaafa0d6ae5653eeb34459516699e562eeb9c8d23b3c (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, version=17.1.12, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, architecture=x86_64, io.buildah.version=1.41.4, container_name=metrics_qdr, batch=17.1_20251118.1, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.component=openstack-qdrouterd-container, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_id=tripleo_step1, name=rhosp17/openstack-qdrouterd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, summary=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.expose-services=, distribution-scope=public, vcs-type=git, build-date=2025-11-18T22:49:46Z, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '29d07da103bbd44a9ed3e29999314b03'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', 
'/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, konflux.additional-tags=17.1.12 17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, release=1761123044, tcib_managed=true, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI) Dec 15 03:45:34 localhost podman[93167]: 2025-12-15 08:45:34.979049962 +0000 UTC m=+0.303292704 container exec_died 6278d648aec9c9f12e0efaafa0d6ae5653eeb34459516699e562eeb9c8d23b3c (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, release=1761123044, build-date=2025-11-18T22:49:46Z, description=Red Hat OpenStack Platform 17.1 qdrouterd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, url=https://www.redhat.com, distribution-scope=public, com.redhat.component=openstack-qdrouterd-container, summary=Red Hat OpenStack Platform 17.1 qdrouterd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, name=rhosp17/openstack-qdrouterd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.buildah.version=1.41.4, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '29d07da103bbd44a9ed3e29999314b03'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, container_name=metrics_qdr, batch=17.1_20251118.1, io.openshift.expose-services=, vendor=Red Hat, Inc., architecture=x86_64, konflux.additional-tags=17.1.12 17.1_20251118.1, tcib_managed=true, version=17.1.12, config_id=tripleo_step1) Dec 15 03:45:34 localhost systemd[1]: 6278d648aec9c9f12e0efaafa0d6ae5653eeb34459516699e562eeb9c8d23b3c.service: Deactivated successfully. Dec 15 03:45:50 localhost systemd[1]: Started /usr/bin/podman healthcheck run 165a6fd1f2b629d16da0798ec1c9bac41d302d530a0ffa668b2315a52d6aa04e. Dec 15 03:45:50 localhost systemd[1]: Started /usr/bin/podman healthcheck run 2649c3aec9af2f04509e1d248b1050b202fccc3ed2b86421c89a2a65376db057. Dec 15 03:45:50 localhost systemd[1]: Started /usr/bin/podman healthcheck run 36a88083ed613f6871d2631ae462bca308a45ba583333f7cfe4d0b0c40a18ba5. Dec 15 03:45:50 localhost systemd[1]: Started /usr/bin/podman healthcheck run 97115ff7c5a84ae7e17f79e696e94da3c63f9fe0051287fe9193bec282a03f99. 
Dec 15 03:45:50 localhost systemd[1]: Started /usr/bin/podman healthcheck run ac45612fab357b615b4684dbfa5bd000dd29eb1adfbaedb6db0802fc49e89549. Dec 15 03:45:50 localhost systemd[1]: Started /usr/bin/podman healthcheck run d6041e5471624a495d5be42684f6049ccf71802a80d5760239330151d465d146. Dec 15 03:45:50 localhost systemd[1]: tmp-crun.bn6eoz.mount: Deactivated successfully. Dec 15 03:45:50 localhost podman[93282]: 2025-12-15 08:45:50.441598408 +0000 UTC m=+0.106400027 container health_status 97115ff7c5a84ae7e17f79e696e94da3c63f9fe0051287fe9193bec282a03f99 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, io.buildah.version=1.41.4, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, build-date=2025-11-19T00:12:45Z, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, name=rhosp17/openstack-ceilometer-ipmi, url=https://www.redhat.com, batch=17.1_20251118.1, container_name=ceilometer_agent_ipmi, com.redhat.component=openstack-ceilometer-ipmi-container, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, release=1761123044, vcs-type=git, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, architecture=x86_64, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, version=17.1.12, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'fcee5a4a91f85471fca7b61211375646'}, 'healthcheck': {'test': 
'/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vendor=Red Hat, Inc., config_id=tripleo_step4) Dec 15 03:45:50 localhost podman[93282]: 2025-12-15 08:45:50.478329691 +0000 UTC m=+0.143131340 container exec_died 97115ff7c5a84ae7e17f79e696e94da3c63f9fe0051287fe9193bec282a03f99 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.buildah.version=1.41.4, com.redhat.component=openstack-ceilometer-ipmi-container, build-date=2025-11-19T00:12:45Z, version=17.1.12, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, architecture=x86_64, container_name=ceilometer_agent_ipmi, url=https://www.redhat.com, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.expose-services=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'fcee5a4a91f85471fca7b61211375646'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 
'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, release=1761123044, vendor=Red Hat, Inc., name=rhosp17/openstack-ceilometer-ipmi, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, batch=17.1_20251118.1, config_id=tripleo_step4, distribution-scope=public, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-type=git, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI) Dec 15 03:45:50 localhost systemd[1]: 97115ff7c5a84ae7e17f79e696e94da3c63f9fe0051287fe9193bec282a03f99.service: Deactivated successfully. 
Dec 15 03:45:50 localhost podman[93284]: 2025-12-15 08:45:50.480051316 +0000 UTC m=+0.141413563 container health_status ac45612fab357b615b4684dbfa5bd000dd29eb1adfbaedb6db0802fc49e89549 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vendor=Red Hat, Inc., distribution-scope=public, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_id=tripleo_step4, name=rhosp17/openstack-cron, com.redhat.component=openstack-cron-container, url=https://www.redhat.com, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 cron, konflux.additional-tags=17.1.12 17.1_20251118.1, architecture=x86_64, build-date=2025-11-18T22:49:32Z, container_name=logrotate_crond, batch=17.1_20251118.1, version=17.1.12, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, release=1761123044, io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 cron) Dec 15 03:45:50 localhost podman[93295]: 2025-12-15 08:45:50.548232451 +0000 UTC m=+0.208458128 container health_status d6041e5471624a495d5be42684f6049ccf71802a80d5760239330151d465d146 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, com.redhat.component=openstack-ceilometer-compute-container, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, batch=17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, url=https://www.redhat.com, version=17.1.12, managed_by=tripleo_ansible, name=rhosp17/openstack-ceilometer-compute, config_id=tripleo_step4, release=1761123044, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, konflux.additional-tags=17.1.12 17.1_20251118.1, build-date=2025-11-19T00:11:48Z, architecture=x86_64, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-type=git, vendor=Red Hat, Inc., distribution-scope=public, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, tcib_managed=true, 
cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'fcee5a4a91f85471fca7b61211375646'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, container_name=ceilometer_agent_compute) Dec 15 03:45:50 localhost podman[93284]: 2025-12-15 08:45:50.562370919 +0000 UTC m=+0.223733206 container exec_died ac45612fab357b615b4684dbfa5bd000dd29eb1adfbaedb6db0802fc49e89549 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=logrotate_crond, vcs-type=git, com.redhat.component=openstack-cron-container, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, release=1761123044, 
tcib_managed=true, batch=17.1_20251118.1, maintainer=OpenStack TripleO Team, distribution-scope=public, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 cron, summary=Red Hat OpenStack Platform 17.1 cron, managed_by=tripleo_ansible, io.buildah.version=1.41.4, vendor=Red Hat, Inc., name=rhosp17/openstack-cron, version=17.1.12, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, architecture=x86_64, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.expose-services=, build-date=2025-11-18T22:49:32Z) Dec 15 03:45:50 localhost systemd[1]: ac45612fab357b615b4684dbfa5bd000dd29eb1adfbaedb6db0802fc49e89549.service: Deactivated successfully. 
Dec 15 03:45:50 localhost podman[93295]: 2025-12-15 08:45:50.608482072 +0000 UTC m=+0.268707699 container exec_died d6041e5471624a495d5be42684f6049ccf71802a80d5760239330151d465d146 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, managed_by=tripleo_ansible, tcib_managed=true, vcs-type=git, vendor=Red Hat, Inc., io.buildah.version=1.41.4, release=1761123044, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, architecture=x86_64, batch=17.1_20251118.1, com.redhat.component=openstack-ceilometer-compute-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, io.openshift.expose-services=, build-date=2025-11-19T00:11:48Z, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.12 17.1_20251118.1, version=17.1.12, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, container_name=ceilometer_agent_compute, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, name=rhosp17/openstack-ceilometer-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'fcee5a4a91f85471fca7b61211375646'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, config_id=tripleo_step4, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676) Dec 15 03:45:50 localhost systemd[1]: d6041e5471624a495d5be42684f6049ccf71802a80d5760239330151d465d146.service: Deactivated successfully. Dec 15 03:45:50 localhost podman[93275]: 2025-12-15 08:45:50.70408266 +0000 UTC m=+0.375552088 container health_status 165a6fd1f2b629d16da0798ec1c9bac41d302d530a0ffa668b2315a52d6aa04e (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, io.openshift.expose-services=, vendor=Red Hat, Inc., org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vcs-type=git, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, version=17.1.12, batch=17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 collectd, io.buildah.version=1.41.4, config_id=tripleo_step3, summary=Red Hat OpenStack Platform 17.1 collectd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.component=openstack-collectd-container, distribution-scope=public, build-date=2025-11-18T22:51:28Z, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=collectd, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, architecture=x86_64, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, name=rhosp17/openstack-collectd, release=1761123044, konflux.additional-tags=17.1.12 17.1_20251118.1) Dec 15 03:45:50 localhost podman[93280]: 2025-12-15 08:45:50.749188237 +0000 UTC m=+0.418687332 container health_status 36a88083ed613f6871d2631ae462bca308a45ba583333f7cfe4d0b0c40a18ba5 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, batch=17.1_20251118.1, 
io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.buildah.version=1.41.4, build-date=2025-11-19T00:36:58Z, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.12 17.1_20251118.1, name=rhosp17/openstack-nova-compute, config_id=tripleo_step5, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, managed_by=tripleo_ansible, com.redhat.component=openstack-nova-compute-container, io.openshift.expose-services=, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '182e509007ab5e6e5b2500a552cbd5ba-879500e96bf8dfb93687004bd86f2317'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', 
'/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, release=1761123044, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, summary=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.12, architecture=x86_64, vcs-type=git, vendor=Red Hat, Inc., tcib_managed=true, container_name=nova_compute) Dec 15 03:45:50 localhost podman[93275]: 2025-12-15 08:45:50.768109813 +0000 UTC m=+0.439579251 container exec_died 165a6fd1f2b629d16da0798ec1c9bac41d302d530a0ffa668b2315a52d6aa04e (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, batch=17.1_20251118.1, com.redhat.component=openstack-collectd-container, io.openshift.expose-services=, vendor=Red Hat, Inc., architecture=x86_64, container_name=collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, url=https://www.redhat.com, build-date=2025-11-18T22:51:28Z, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, io.buildah.version=1.41.4, name=rhosp17/openstack-collectd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_data={'cap_add': 
['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 collectd, description=Red Hat OpenStack Platform 17.1 collectd, tcib_managed=true, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, version=17.1.12, config_id=tripleo_step3, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, distribution-scope=public, release=1761123044, konflux.additional-tags=17.1.12 17.1_20251118.1) Dec 15 03:45:50 localhost systemd[1]: 165a6fd1f2b629d16da0798ec1c9bac41d302d530a0ffa668b2315a52d6aa04e.service: Deactivated successfully. 
Dec 15 03:45:50 localhost podman[93280]: 2025-12-15 08:45:50.7844531 +0000 UTC m=+0.453952215 container exec_died 36a88083ed613f6871d2631ae462bca308a45ba583333f7cfe4d0b0c40a18ba5 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, url=https://www.redhat.com, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.expose-services=, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, vcs-type=git, tcib_managed=true, com.redhat.component=openstack-nova-compute-container, build-date=2025-11-19T00:36:58Z, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '182e509007ab5e6e5b2500a552cbd5ba-879500e96bf8dfb93687004bd86f2317'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', 
'/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, batch=17.1_20251118.1, managed_by=tripleo_ansible, name=rhosp17/openstack-nova-compute, vendor=Red Hat, Inc., version=17.1.12, container_name=nova_compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, config_id=tripleo_step5, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, architecture=x86_64, release=1761123044, description=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, distribution-scope=public) Dec 15 03:45:50 localhost systemd[1]: 36a88083ed613f6871d2631ae462bca308a45ba583333f7cfe4d0b0c40a18ba5.service: Deactivated successfully. 
Dec 15 03:45:50 localhost podman[93278]: 2025-12-15 08:45:50.853925019 +0000 UTC m=+0.524036310 container health_status 2649c3aec9af2f04509e1d248b1050b202fccc3ed2b86421c89a2a65376db057 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, architecture=x86_64, container_name=iscsid, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.component=openstack-iscsid-container, description=Red Hat OpenStack Platform 17.1 iscsid, version=17.1.12, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '182e509007ab5e6e5b2500a552cbd5ba'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, name=rhosp17/openstack-iscsid, vendor=Red Hat, Inc., distribution-scope=public, vcs-type=git, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, release=1761123044, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, batch=17.1_20251118.1, io.buildah.version=1.41.4, io.openshift.expose-services=, build-date=2025-11-18T23:44:13Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, tcib_managed=true, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, config_id=tripleo_step3) Dec 15 03:45:50 localhost podman[93278]: 2025-12-15 08:45:50.942698904 +0000 UTC m=+0.612810185 container exec_died 2649c3aec9af2f04509e1d248b1050b202fccc3ed2b86421c89a2a65376db057 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, vcs-type=git, version=17.1.12, com.redhat.component=openstack-iscsid-container, config_id=tripleo_step3, summary=Red Hat OpenStack Platform 17.1 iscsid, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, maintainer=OpenStack TripleO Team, tcib_managed=true, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, release=1761123044, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, url=https://www.redhat.com, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '182e509007ab5e6e5b2500a552cbd5ba'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, build-date=2025-11-18T23:44:13Z, managed_by=tripleo_ansible, name=rhosp17/openstack-iscsid, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, konflux.additional-tags=17.1.12 17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 iscsid, container_name=iscsid, batch=17.1_20251118.1, io.buildah.version=1.41.4, io.openshift.expose-services=, vendor=Red Hat, Inc.) Dec 15 03:45:50 localhost systemd[1]: 2649c3aec9af2f04509e1d248b1050b202fccc3ed2b86421c89a2a65376db057.service: Deactivated successfully. 
Dec 15 03:45:51 localhost podman[93468]: Dec 15 03:45:51 localhost podman[93468]: 2025-12-15 08:45:51.198374333 +0000 UTC m=+0.088672613 container create 984dd4e42057683704fe060dc3b6f35041046795e335db708058bcb04832f552 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=priceless_raman, GIT_REPO=https://github.com/ceph/ceph-container.git, maintainer=Guillaume Abrioux , architecture=x86_64, build-date=2025-11-26T19:44:28Z, io.k8s.description=Red Hat Ceph Storage 7, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, version=7, ceph=True, io.openshift.tags=rhceph ceph, GIT_BRANCH=main, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, vcs-type=git, com.redhat.component=rhceph-container, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, GIT_CLEAN=True, io.openshift.expose-services=, RELEASE=main, name=rhceph, release=1763362218, distribution-scope=public, description=Red Hat Ceph Storage 7, io.buildah.version=1.41.4, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, CEPH_POINT_RELEASE=, vendor=Red Hat, Inc.) Dec 15 03:45:51 localhost systemd[1]: Started libpod-conmon-984dd4e42057683704fe060dc3b6f35041046795e335db708058bcb04832f552.scope. Dec 15 03:45:51 localhost podman[93468]: 2025-12-15 08:45:51.160163091 +0000 UTC m=+0.050461431 image pull registry.redhat.io/rhceph/rhceph-7-rhel9:latest Dec 15 03:45:51 localhost systemd[1]: Started libcrun container. 
Dec 15 03:45:51 localhost podman[93468]: 2025-12-15 08:45:51.277065738 +0000 UTC m=+0.167364018 container init 984dd4e42057683704fe060dc3b6f35041046795e335db708058bcb04832f552 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=priceless_raman, maintainer=Guillaume Abrioux , architecture=x86_64, io.openshift.expose-services=, distribution-scope=public, io.openshift.tags=rhceph ceph, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, CEPH_POINT_RELEASE=, release=1763362218, build-date=2025-11-26T19:44:28Z, io.k8s.description=Red Hat Ceph Storage 7, vendor=Red Hat, Inc., summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.buildah.version=1.41.4, GIT_REPO=https://github.com/ceph/ceph-container.git, vcs-type=git, com.redhat.component=rhceph-container, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, url=https://catalog.redhat.com/en/search?searchType=containers, GIT_BRANCH=main, RELEASE=main, version=7, description=Red Hat Ceph Storage 7, name=rhceph, GIT_CLEAN=True, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, ceph=True) Dec 15 03:45:51 localhost podman[93468]: 2025-12-15 08:45:51.28684309 +0000 UTC m=+0.177141370 container start 984dd4e42057683704fe060dc3b6f35041046795e335db708058bcb04832f552 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=priceless_raman, vendor=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, distribution-scope=public, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.openshift.tags=rhceph ceph, release=1763362218, CEPH_POINT_RELEASE=, 
org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, GIT_BRANCH=main, name=rhceph, architecture=x86_64, io.k8s.description=Red Hat Ceph Storage 7, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, version=7, vcs-type=git, maintainer=Guillaume Abrioux , description=Red Hat Ceph Storage 7, GIT_CLEAN=True, io.openshift.expose-services=, GIT_REPO=https://github.com/ceph/ceph-container.git, io.buildah.version=1.41.4, RELEASE=main, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-11-26T19:44:28Z, ceph=True, com.redhat.component=rhceph-container, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0) Dec 15 03:45:51 localhost podman[93468]: 2025-12-15 08:45:51.287090847 +0000 UTC m=+0.177389127 container attach 984dd4e42057683704fe060dc3b6f35041046795e335db708058bcb04832f552 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=priceless_raman, build-date=2025-11-26T19:44:28Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, GIT_REPO=https://github.com/ceph/ceph-container.git, url=https://catalog.redhat.com/en/search?searchType=containers, ceph=True, distribution-scope=public, architecture=x86_64, CEPH_POINT_RELEASE=, io.buildah.version=1.41.4, RELEASE=main, io.k8s.description=Red Hat Ceph Storage 7, version=7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_BRANCH=main, maintainer=Guillaume Abrioux , name=rhceph, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, vcs-type=git, com.redhat.component=rhceph-container, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:enterprise_linux:9::appstream, description=Red Hat Ceph Storage 7, release=1763362218, io.openshift.expose-services=, GIT_CLEAN=True, io.openshift.tags=rhceph ceph, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, io.k8s.display-name=Red Hat 
Ceph Storage 7 on RHEL 9) Dec 15 03:45:51 localhost priceless_raman[93484]: 167 167 Dec 15 03:45:51 localhost systemd[1]: libpod-984dd4e42057683704fe060dc3b6f35041046795e335db708058bcb04832f552.scope: Deactivated successfully. Dec 15 03:45:51 localhost podman[93468]: 2025-12-15 08:45:51.292010748 +0000 UTC m=+0.182309088 container died 984dd4e42057683704fe060dc3b6f35041046795e335db708058bcb04832f552 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=priceless_raman, vendor=Red Hat, Inc., ceph=True, url=https://catalog.redhat.com/en/search?searchType=containers, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_CLEAN=True, version=7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.k8s.description=Red Hat Ceph Storage 7, com.redhat.component=rhceph-container, RELEASE=main, vcs-type=git, CEPH_POINT_RELEASE=, io.buildah.version=1.41.4, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, distribution-scope=public, name=rhceph, maintainer=Guillaume Abrioux , GIT_BRANCH=main, architecture=x86_64, release=1763362218, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.openshift.expose-services=, description=Red Hat Ceph Storage 7, GIT_REPO=https://github.com/ceph/ceph-container.git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-11-26T19:44:28Z, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.openshift.tags=rhceph ceph) Dec 15 03:45:51 localhost podman[93489]: 2025-12-15 08:45:51.39373149 +0000 UTC m=+0.089520666 container remove 984dd4e42057683704fe060dc3b6f35041046795e335db708058bcb04832f552 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=priceless_raman, name=rhceph, GIT_CLEAN=True, RELEASE=main, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, url=https://catalog.redhat.com/en/search?searchType=containers, GIT_BRANCH=main, 
io.k8s.description=Red Hat Ceph Storage 7, architecture=x86_64, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, description=Red Hat Ceph Storage 7, version=7, build-date=2025-11-26T19:44:28Z, com.redhat.component=rhceph-container, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, GIT_REPO=https://github.com/ceph/ceph-container.git, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, CEPH_POINT_RELEASE=, vcs-type=git, release=1763362218, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., distribution-scope=public, io.openshift.tags=rhceph ceph, io.buildah.version=1.41.4, ceph=True, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:enterprise_linux:9::appstream, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, maintainer=Guillaume Abrioux ) Dec 15 03:45:51 localhost systemd[1]: libpod-conmon-984dd4e42057683704fe060dc3b6f35041046795e335db708058bcb04832f552.scope: Deactivated successfully. 
Dec 15 03:45:51 localhost podman[93513]: Dec 15 03:45:51 localhost podman[93513]: 2025-12-15 08:45:51.612254856 +0000 UTC m=+0.079916900 container create 8e4ded5b9249449917d3dab9344f024aa3af7b071c197ea949f4a9ddaaabeb4d (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=angry_jones, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.openshift.expose-services=, name=rhceph, description=Red Hat Ceph Storage 7, GIT_CLEAN=True, GIT_REPO=https://github.com/ceph/ceph-container.git, GIT_BRANCH=main, com.redhat.component=rhceph-container, vendor=Red Hat, Inc., CEPH_POINT_RELEASE=, vcs-type=git, architecture=x86_64, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, RELEASE=main, io.openshift.tags=rhceph ceph, build-date=2025-11-26T19:44:28Z, url=https://catalog.redhat.com/en/search?searchType=containers, maintainer=Guillaume Abrioux , io.k8s.description=Red Hat Ceph Storage 7, release=1763362218, ceph=True, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.buildah.version=1.41.4, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, distribution-scope=public, version=7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0) Dec 15 03:45:51 localhost systemd[1]: Started libpod-conmon-8e4ded5b9249449917d3dab9344f024aa3af7b071c197ea949f4a9ddaaabeb4d.scope. Dec 15 03:45:51 localhost podman[93513]: 2025-12-15 08:45:51.578700848 +0000 UTC m=+0.046362942 image pull registry.redhat.io/rhceph/rhceph-7-rhel9:latest Dec 15 03:45:51 localhost systemd[1]: Started libcrun container. 
Dec 15 03:45:51 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/0b39fed9b6ab09f521d1447c4957e6277b6cac1002e6011f2fbea40ddad8f3e1/merged/rootfs supports timestamps until 2038 (0x7fffffff) Dec 15 03:45:51 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/0b39fed9b6ab09f521d1447c4957e6277b6cac1002e6011f2fbea40ddad8f3e1/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff) Dec 15 03:45:51 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/0b39fed9b6ab09f521d1447c4957e6277b6cac1002e6011f2fbea40ddad8f3e1/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff) Dec 15 03:45:51 localhost podman[93513]: 2025-12-15 08:45:51.693307464 +0000 UTC m=+0.160969508 container init 8e4ded5b9249449917d3dab9344f024aa3af7b071c197ea949f4a9ddaaabeb4d (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=angry_jones, build-date=2025-11-26T19:44:28Z, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, maintainer=Guillaume Abrioux , io.k8s.description=Red Hat Ceph Storage 7, release=1763362218, version=7, description=Red Hat Ceph Storage 7, url=https://catalog.redhat.com/en/search?searchType=containers, GIT_REPO=https://github.com/ceph/ceph-container.git, CEPH_POINT_RELEASE=, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_BRANCH=main, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, name=rhceph, vcs-type=git, GIT_CLEAN=True, RELEASE=main, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., ceph=True, io.openshift.expose-services=, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, architecture=x86_64, io.buildah.version=1.41.4, io.openshift.tags=rhceph ceph, 
com.redhat.component=rhceph-container) Dec 15 03:45:51 localhost podman[93513]: 2025-12-15 08:45:51.711190492 +0000 UTC m=+0.178852546 container start 8e4ded5b9249449917d3dab9344f024aa3af7b071c197ea949f4a9ddaaabeb4d (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=angry_jones, io.buildah.version=1.41.4, architecture=x86_64, GIT_CLEAN=True, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., ceph=True, io.openshift.tags=rhceph ceph, io.k8s.description=Red Hat Ceph Storage 7, vendor=Red Hat, Inc., build-date=2025-11-26T19:44:28Z, url=https://catalog.redhat.com/en/search?searchType=containers, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, vcs-type=git, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_REPO=https://github.com/ceph/ceph-container.git, description=Red Hat Ceph Storage 7, GIT_BRANCH=main, maintainer=Guillaume Abrioux , com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, RELEASE=main, com.redhat.component=rhceph-container, io.openshift.expose-services=, distribution-scope=public, CEPH_POINT_RELEASE=, release=1763362218, version=7, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, name=rhceph, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9) Dec 15 03:45:51 localhost podman[93513]: 2025-12-15 08:45:51.711525021 +0000 UTC m=+0.179187135 container attach 8e4ded5b9249449917d3dab9344f024aa3af7b071c197ea949f4a9ddaaabeb4d (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=angry_jones, release=1763362218, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, build-date=2025-11-26T19:44:28Z, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_REPO=https://github.com/ceph/ceph-container.git, io.buildah.version=1.41.4, maintainer=Guillaume Abrioux , io.k8s.description=Red Hat Ceph Storage 7, io.openshift.expose-services=, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, vendor=Red Hat, Inc., distribution-scope=public, version=7, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, GIT_CLEAN=True, GIT_BRANCH=main, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., RELEASE=main, ceph=True, vcs-type=git, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, com.redhat.component=rhceph-container, io.openshift.tags=rhceph ceph, url=https://catalog.redhat.com/en/search?searchType=containers, name=rhceph, description=Red Hat Ceph Storage 7, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, CEPH_POINT_RELEASE=) Dec 15 03:45:52 localhost systemd[1]: tmp-crun.ChEpJy.mount: Deactivated successfully. Dec 15 03:45:52 localhost angry_jones[93528]: [ Dec 15 03:45:52 localhost angry_jones[93528]: { Dec 15 03:45:52 localhost angry_jones[93528]: "available": false, Dec 15 03:45:52 localhost angry_jones[93528]: "ceph_device": false, Dec 15 03:45:52 localhost angry_jones[93528]: "device_id": "QEMU_DVD-ROM_QM00001", Dec 15 03:45:52 localhost angry_jones[93528]: "lsm_data": {}, Dec 15 03:45:52 localhost angry_jones[93528]: "lvs": [], Dec 15 03:45:52 localhost angry_jones[93528]: "path": "/dev/sr0", Dec 15 03:45:52 localhost angry_jones[93528]: "rejected_reasons": [ Dec 15 03:45:52 localhost angry_jones[93528]: "Insufficient space (<5GB)", Dec 15 03:45:52 localhost angry_jones[93528]: "Has a FileSystem" Dec 15 03:45:52 localhost angry_jones[93528]: ], Dec 15 03:45:52 localhost angry_jones[93528]: "sys_api": { Dec 15 03:45:52 localhost angry_jones[93528]: "actuators": null, Dec 15 03:45:52 localhost angry_jones[93528]: "device_nodes": "sr0", Dec 15 03:45:52 localhost angry_jones[93528]: "human_readable_size": "482.00 KB", Dec 15 03:45:52 localhost angry_jones[93528]: "id_bus": "ata", Dec 15 03:45:52 localhost angry_jones[93528]: "model": "QEMU DVD-ROM", Dec 15 
03:45:52 localhost angry_jones[93528]: "nr_requests": "2", Dec 15 03:45:52 localhost angry_jones[93528]: "partitions": {}, Dec 15 03:45:52 localhost angry_jones[93528]: "path": "/dev/sr0", Dec 15 03:45:52 localhost angry_jones[93528]: "removable": "1", Dec 15 03:45:52 localhost angry_jones[93528]: "rev": "2.5+", Dec 15 03:45:52 localhost angry_jones[93528]: "ro": "0", Dec 15 03:45:52 localhost angry_jones[93528]: "rotational": "1", Dec 15 03:45:52 localhost angry_jones[93528]: "sas_address": "", Dec 15 03:45:52 localhost angry_jones[93528]: "sas_device_handle": "", Dec 15 03:45:52 localhost angry_jones[93528]: "scheduler_mode": "mq-deadline", Dec 15 03:45:52 localhost angry_jones[93528]: "sectors": 0, Dec 15 03:45:52 localhost angry_jones[93528]: "sectorsize": "2048", Dec 15 03:45:52 localhost angry_jones[93528]: "size": 493568.0, Dec 15 03:45:52 localhost angry_jones[93528]: "support_discard": "0", Dec 15 03:45:52 localhost angry_jones[93528]: "type": "disk", Dec 15 03:45:52 localhost angry_jones[93528]: "vendor": "QEMU" Dec 15 03:45:52 localhost angry_jones[93528]: } Dec 15 03:45:52 localhost angry_jones[93528]: } Dec 15 03:45:52 localhost angry_jones[93528]: ] Dec 15 03:45:52 localhost systemd[1]: libpod-8e4ded5b9249449917d3dab9344f024aa3af7b071c197ea949f4a9ddaaabeb4d.scope: Deactivated successfully. Dec 15 03:45:52 localhost systemd[1]: libpod-8e4ded5b9249449917d3dab9344f024aa3af7b071c197ea949f4a9ddaaabeb4d.scope: Consumed 1.032s CPU time. 
Dec 15 03:45:52 localhost podman[93513]: 2025-12-15 08:45:52.70542566 +0000 UTC m=+1.173087724 container died 8e4ded5b9249449917d3dab9344f024aa3af7b071c197ea949f4a9ddaaabeb4d (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=angry_jones, GIT_CLEAN=True, RELEASE=main, CEPH_POINT_RELEASE=, url=https://catalog.redhat.com/en/search?searchType=containers, distribution-scope=public, io.openshift.expose-services=, ceph=True, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, com.redhat.component=rhceph-container, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., build-date=2025-11-26T19:44:28Z, io.openshift.tags=rhceph ceph, description=Red Hat Ceph Storage 7, architecture=x86_64, version=7, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, maintainer=Guillaume Abrioux , io.buildah.version=1.41.4, name=rhceph, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.k8s.description=Red Hat Ceph Storage 7, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, release=1763362218, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, vendor=Red Hat, Inc., GIT_REPO=https://github.com/ceph/ceph-container.git, GIT_BRANCH=main) Dec 15 03:45:52 localhost systemd[1]: tmp-crun.gwn2Wz.mount: Deactivated successfully. Dec 15 03:45:52 localhost systemd[1]: var-lib-containers-storage-overlay-0b39fed9b6ab09f521d1447c4957e6277b6cac1002e6011f2fbea40ddad8f3e1-merged.mount: Deactivated successfully. 
Dec 15 03:45:52 localhost podman[95386]: 2025-12-15 08:45:52.824608568 +0000 UTC m=+0.103353885 container remove 8e4ded5b9249449917d3dab9344f024aa3af7b071c197ea949f4a9ddaaabeb4d (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=angry_jones, distribution-scope=public, name=rhceph, GIT_REPO=https://github.com/ceph/ceph-container.git, io.buildah.version=1.41.4, CEPH_POINT_RELEASE=, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_BRANCH=main, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vendor=Red Hat, Inc., version=7, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, com.redhat.component=rhceph-container, io.k8s.description=Red Hat Ceph Storage 7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.tags=rhceph ceph, description=Red Hat Ceph Storage 7, release=1763362218, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, architecture=x86_64, GIT_CLEAN=True, ceph=True, vcs-type=git, io.openshift.expose-services=, build-date=2025-11-26T19:44:28Z, maintainer=Guillaume Abrioux , cpe=cpe:/a:redhat:enterprise_linux:9::appstream, RELEASE=main, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9) Dec 15 03:45:52 localhost systemd[1]: libpod-conmon-8e4ded5b9249449917d3dab9344f024aa3af7b071c197ea949f4a9ddaaabeb4d.scope: Deactivated successfully. Dec 15 03:45:53 localhost systemd[1]: Started /usr/bin/podman healthcheck run 4c3f871f4bdf287b1d3ce717956c1cf130617489217410eadf6fffcc440eb3b0. Dec 15 03:45:53 localhost systemd[1]: tmp-crun.k8geoe.mount: Deactivated successfully. 
Dec 15 03:45:53 localhost podman[95414]: 2025-12-15 08:45:53.534648583 +0000 UTC m=+0.101387363 container health_status 4c3f871f4bdf287b1d3ce717956c1cf130617489217410eadf6fffcc440eb3b0 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, batch=17.1_20251118.1, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 nova-compute, distribution-scope=public, com.redhat.component=openstack-nova-compute-container, container_name=nova_migration_target, vcs-type=git, release=1761123044, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, vendor=Red Hat, Inc., config_id=tripleo_step4, version=17.1.12, io.buildah.version=1.41.4, io.openshift.expose-services=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '879500e96bf8dfb93687004bd86f2317'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, build-date=2025-11-19T00:36:58Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp17/openstack-nova-compute, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d) Dec 15 03:45:53 localhost podman[95414]: 2025-12-15 08:45:53.943405588 +0000 UTC m=+0.510144308 container exec_died 4c3f871f4bdf287b1d3ce717956c1cf130617489217410eadf6fffcc440eb3b0 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2025-11-19T00:36:58Z, io.openshift.expose-services=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '879500e96bf8dfb93687004bd86f2317'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, tcib_managed=true, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, managed_by=tripleo_ansible, name=rhosp17/openstack-nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step4, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.41.4, batch=17.1_20251118.1, version=17.1.12, konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., architecture=x86_64, release=1761123044, url=https://www.redhat.com, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, container_name=nova_migration_target, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.component=openstack-nova-compute-container, vcs-type=git) Dec 15 03:45:53 localhost systemd[1]: 4c3f871f4bdf287b1d3ce717956c1cf130617489217410eadf6fffcc440eb3b0.service: Deactivated successfully. Dec 15 03:45:56 localhost systemd[1]: Started /usr/bin/podman healthcheck run 2df8d20c6ba32f38bd0e6519c9c77a4146b632f21c81197d8c319b223aad81f1. Dec 15 03:45:56 localhost systemd[1]: Started /usr/bin/podman healthcheck run 4d35b7dc3f3a4e3c60661e85d3511f50804e87244d74cab67af5c6e8c447d379. 
Dec 15 03:45:56 localhost podman[95437]: 2025-12-15 08:45:56.759337179 +0000 UTC m=+0.083931876 container health_status 2df8d20c6ba32f38bd0e6519c9c77a4146b632f21c81197d8c319b223aad81f1 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, distribution-scope=public, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, description=Red Hat OpenStack Platform 17.1 ovn-controller, summary=Red Hat OpenStack Platform 17.1 ovn-controller, version=17.1.12, build-date=2025-11-18T23:34:05Z, vendor=Red Hat, Inc., config_id=tripleo_step4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, maintainer=OpenStack TripleO Team, tcib_managed=true, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, architecture=x86_64, url=https://www.redhat.com, release=1761123044, konflux.additional-tags=17.1.12 17.1_20251118.1, name=rhosp17/openstack-ovn-controller, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.component=openstack-ovn-controller-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, 
vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, vcs-type=git, container_name=ovn_controller, batch=17.1_20251118.1) Dec 15 03:45:56 localhost systemd[1]: tmp-crun.I0RRjL.mount: Deactivated successfully. Dec 15 03:45:56 localhost podman[95437]: 2025-12-15 08:45:56.834786897 +0000 UTC m=+0.159381534 container exec_died 2df8d20c6ba32f38bd0e6519c9c77a4146b632f21c81197d8c319b223aad81f1 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, io.openshift.expose-services=, managed_by=tripleo_ansible, architecture=x86_64, tcib_managed=true, vcs-type=git, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, summary=Red Hat OpenStack Platform 17.1 ovn-controller, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, batch=17.1_20251118.1, release=1761123044, com.redhat.component=openstack-ovn-controller-container, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, config_id=tripleo_step4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, url=https://www.redhat.com, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, vendor=Red Hat, 
Inc., description=Red Hat OpenStack Platform 17.1 ovn-controller, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, container_name=ovn_controller, build-date=2025-11-18T23:34:05Z, name=rhosp17/openstack-ovn-controller, io.buildah.version=1.41.4, version=17.1.12, konflux.additional-tags=17.1.12 17.1_20251118.1, distribution-scope=public) Dec 15 03:45:56 localhost podman[95437]: unhealthy Dec 15 03:45:56 localhost podman[95438]: 2025-12-15 08:45:56.84163355 +0000 UTC m=+0.159114687 container health_status 4d35b7dc3f3a4e3c60661e85d3511f50804e87244d74cab67af5c6e8c447d379 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, maintainer=OpenStack TripleO Team, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, architecture=x86_64, container_name=ovn_metadata_agent, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'a56a6f14b467cd9064e40c03defa5ed7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', 
'/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, name=rhosp17/openstack-neutron-metadata-agent-ovn, config_id=tripleo_step4, managed_by=tripleo_ansible, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vendor=Red Hat, Inc., com.redhat.component=openstack-neutron-metadata-agent-ovn-container, url=https://www.redhat.com, version=17.1.12, io.openshift.expose-services=, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, release=1761123044, io.buildah.version=1.41.4, build-date=2025-11-19T00:14:25Z, batch=17.1_20251118.1, konflux.additional-tags=17.1.12 17.1_20251118.1) Dec 15 03:45:56 localhost systemd[1]: 2df8d20c6ba32f38bd0e6519c9c77a4146b632f21c81197d8c319b223aad81f1.service: Main process exited, code=exited, status=1/FAILURE Dec 15 03:45:56 localhost systemd[1]: 2df8d20c6ba32f38bd0e6519c9c77a4146b632f21c81197d8c319b223aad81f1.service: Failed with result 'exit-code'. 
Dec 15 03:45:56 localhost podman[95438]: 2025-12-15 08:45:56.89730895 +0000 UTC m=+0.214790037 container exec_died 4d35b7dc3f3a4e3c60661e85d3511f50804e87244d74cab67af5c6e8c447d379 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, vendor=Red Hat, Inc., vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'a56a6f14b467cd9064e40c03defa5ed7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, tcib_managed=true, io.k8s.display-name=Red 
Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.expose-services=, release=1761123044, managed_by=tripleo_ansible, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, io.buildah.version=1.41.4, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, architecture=x86_64, url=https://www.redhat.com, version=17.1.12, build-date=2025-11-19T00:14:25Z, name=rhosp17/openstack-neutron-metadata-agent-ovn, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, container_name=ovn_metadata_agent, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, batch=17.1_20251118.1, config_id=tripleo_step4, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c) Dec 15 03:45:56 localhost systemd[1]: 4d35b7dc3f3a4e3c60661e85d3511f50804e87244d74cab67af5c6e8c447d379.service: Deactivated successfully. Dec 15 03:46:00 localhost sshd[95489]: main: sshd: ssh-rsa algorithm is disabled Dec 15 03:46:05 localhost systemd[1]: Started /usr/bin/podman healthcheck run 6278d648aec9c9f12e0efaafa0d6ae5653eeb34459516699e562eeb9c8d23b3c. Dec 15 03:46:05 localhost systemd[1]: tmp-crun.w0fYW5.mount: Deactivated successfully. 
Dec 15 03:46:05 localhost podman[95491]: 2025-12-15 08:46:05.768742076 +0000 UTC m=+0.100669614 container health_status 6278d648aec9c9f12e0efaafa0d6ae5653eeb34459516699e562eeb9c8d23b3c (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, release=1761123044, io.openshift.expose-services=, url=https://www.redhat.com, com.redhat.component=openstack-qdrouterd-container, name=rhosp17/openstack-qdrouterd, vendor=Red Hat, Inc., architecture=x86_64, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, container_name=metrics_qdr, io.buildah.version=1.41.4, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '29d07da103bbd44a9ed3e29999314b03'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, config_id=tripleo_step1, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, 
vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 qdrouterd, tcib_managed=true, batch=17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-type=git, build-date=2025-11-18T22:49:46Z, summary=Red Hat OpenStack Platform 17.1 qdrouterd, version=17.1.12) Dec 15 03:46:05 localhost podman[95491]: 2025-12-15 08:46:05.971357027 +0000 UTC m=+0.303284535 container exec_died 6278d648aec9c9f12e0efaafa0d6ae5653eeb34459516699e562eeb9c8d23b3c (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, version=17.1.12, url=https://www.redhat.com, distribution-scope=public, batch=17.1_20251118.1, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, build-date=2025-11-18T22:49:46Z, release=1761123044, description=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, konflux.additional-tags=17.1.12 17.1_20251118.1, container_name=metrics_qdr, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '29d07da103bbd44a9ed3e29999314b03'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, io.openshift.expose-services=, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.buildah.version=1.41.4, name=rhosp17/openstack-qdrouterd, com.redhat.component=openstack-qdrouterd-container, maintainer=OpenStack TripleO Team, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step1, managed_by=tripleo_ansible, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, summary=Red Hat OpenStack Platform 17.1 qdrouterd, architecture=x86_64, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05) Dec 15 03:46:05 localhost systemd[1]: 6278d648aec9c9f12e0efaafa0d6ae5653eeb34459516699e562eeb9c8d23b3c.service: Deactivated successfully. Dec 15 03:46:20 localhost systemd[1]: Started /usr/bin/podman healthcheck run 97115ff7c5a84ae7e17f79e696e94da3c63f9fe0051287fe9193bec282a03f99. Dec 15 03:46:20 localhost systemd[1]: Started /usr/bin/podman healthcheck run ac45612fab357b615b4684dbfa5bd000dd29eb1adfbaedb6db0802fc49e89549. Dec 15 03:46:20 localhost systemd[1]: Started /usr/bin/podman healthcheck run d6041e5471624a495d5be42684f6049ccf71802a80d5760239330151d465d146. 
Dec 15 03:46:20 localhost podman[95520]: 2025-12-15 08:46:20.759044145 +0000 UTC m=+0.081578683 container health_status 97115ff7c5a84ae7e17f79e696e94da3c63f9fe0051287fe9193bec282a03f99 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, konflux.additional-tags=17.1.12 17.1_20251118.1, build-date=2025-11-19T00:12:45Z, config_id=tripleo_step4, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, container_name=ceilometer_agent_ipmi, url=https://www.redhat.com, name=rhosp17/openstack-ceilometer-ipmi, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, distribution-scope=public, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, com.redhat.component=openstack-ceilometer-ipmi-container, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, vcs-type=git, architecture=x86_64, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, io.buildah.version=1.41.4, managed_by=tripleo_ansible, tcib_managed=true, release=1761123044, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, version=17.1.12, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'fcee5a4a91f85471fca7b61211375646'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, batch=17.1_20251118.1) Dec 15 03:46:20 localhost podman[95520]: 2025-12-15 08:46:20.795231193 +0000 UTC m=+0.117765711 container exec_died 97115ff7c5a84ae7e17f79e696e94da3c63f9fe0051287fe9193bec282a03f99 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, konflux.additional-tags=17.1.12 17.1_20251118.1, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, build-date=2025-11-19T00:12:45Z, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.expose-services=, managed_by=tripleo_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'fcee5a4a91f85471fca7b61211375646'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, name=rhosp17/openstack-ceilometer-ipmi, com.redhat.component=openstack-ceilometer-ipmi-container, batch=17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, tcib_managed=true, version=17.1.12, release=1761123044, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, config_id=tripleo_step4, container_name=ceilometer_agent_ipmi, vcs-type=git, io.buildah.version=1.41.4) Dec 15 03:46:20 localhost systemd[1]: Started /usr/bin/podman healthcheck run 165a6fd1f2b629d16da0798ec1c9bac41d302d530a0ffa668b2315a52d6aa04e. Dec 15 03:46:20 localhost systemd[1]: Started /usr/bin/podman healthcheck run 36a88083ed613f6871d2631ae462bca308a45ba583333f7cfe4d0b0c40a18ba5. 
Dec 15 03:46:20 localhost podman[95521]: 2025-12-15 08:46:20.816133892 +0000 UTC m=+0.140257963 container health_status ac45612fab357b615b4684dbfa5bd000dd29eb1adfbaedb6db0802fc49e89549 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, distribution-scope=public, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, config_id=tripleo_step4, name=rhosp17/openstack-cron, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team, tcib_managed=true, url=https://www.redhat.com, release=1761123044, vcs-type=git, build-date=2025-11-18T22:49:32Z, batch=17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, version=17.1.12, summary=Red Hat OpenStack Platform 17.1 cron, io.k8s.display-name=Red Hat OpenStack Platform 17.1 
cron, io.buildah.version=1.41.4, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.component=openstack-cron-container, vendor=Red Hat, Inc., io.openshift.expose-services=, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, description=Red Hat OpenStack Platform 17.1 cron, architecture=x86_64, container_name=logrotate_crond, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI) Dec 15 03:46:20 localhost systemd[1]: 97115ff7c5a84ae7e17f79e696e94da3c63f9fe0051287fe9193bec282a03f99.service: Deactivated successfully. Dec 15 03:46:20 localhost podman[95521]: 2025-12-15 08:46:20.850309567 +0000 UTC m=+0.174433638 container exec_died ac45612fab357b615b4684dbfa5bd000dd29eb1adfbaedb6db0802fc49e89549 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, io.openshift.expose-services=, release=1761123044, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 cron, summary=Red Hat OpenStack Platform 17.1 cron, io.buildah.version=1.41.4, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.component=openstack-cron-container, batch=17.1_20251118.1, build-date=2025-11-18T22:49:32Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, version=17.1.12, config_id=tripleo_step4, name=rhosp17/openstack-cron, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, managed_by=tripleo_ansible, konflux.additional-tags=17.1.12 17.1_20251118.1, vendor=Red Hat, 
Inc., config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, architecture=x86_64, vcs-type=git, container_name=logrotate_crond, tcib_managed=true, maintainer=OpenStack TripleO Team) Dec 15 03:46:20 localhost systemd[1]: tmp-crun.KYB8ro.mount: Deactivated successfully. Dec 15 03:46:20 localhost systemd[1]: ac45612fab357b615b4684dbfa5bd000dd29eb1adfbaedb6db0802fc49e89549.service: Deactivated successfully. 
Dec 15 03:46:20 localhost podman[95522]: 2025-12-15 08:46:20.869729937 +0000 UTC m=+0.191710211 container health_status d6041e5471624a495d5be42684f6049ccf71802a80d5760239330151d465d146 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, distribution-scope=public, tcib_managed=true, architecture=x86_64, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'fcee5a4a91f85471fca7b61211375646'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, batch=17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, release=1761123044, version=17.1.12, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, name=rhosp17/openstack-ceilometer-compute, 
com.redhat.component=openstack-ceilometer-compute-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, io.buildah.version=1.41.4, container_name=ceilometer_agent_compute, config_id=tripleo_step4, maintainer=OpenStack TripleO Team, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, managed_by=tripleo_ansible, build-date=2025-11-19T00:11:48Z) Dec 15 03:46:20 localhost podman[95522]: 2025-12-15 08:46:20.895585488 +0000 UTC m=+0.217565772 container exec_died d6041e5471624a495d5be42684f6049ccf71802a80d5760239330151d465d146 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, io.openshift.expose-services=, name=rhosp17/openstack-ceilometer-compute, url=https://www.redhat.com, release=1761123044, maintainer=OpenStack TripleO Team, version=17.1.12, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, vendor=Red Hat, Inc., batch=17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, container_name=ceilometer_agent_compute, vcs-type=git, build-date=2025-11-19T00:11:48Z, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'fcee5a4a91f85471fca7b61211375646'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': 
['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, tcib_managed=true, com.redhat.component=openstack-ceilometer-compute-container, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, architecture=x86_64, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_id=tripleo_step4, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, io.buildah.version=1.41.4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream) Dec 15 03:46:20 localhost systemd[1]: d6041e5471624a495d5be42684f6049ccf71802a80d5760239330151d465d146.service: Deactivated successfully. Dec 15 03:46:20 localhost systemd[1]: Started /usr/bin/podman healthcheck run 2649c3aec9af2f04509e1d248b1050b202fccc3ed2b86421c89a2a65376db057. 
Dec 15 03:46:20 localhost podman[95578]: 2025-12-15 08:46:20.988019381 +0000 UTC m=+0.160855505 container health_status 36a88083ed613f6871d2631ae462bca308a45ba583333f7cfe4d0b0c40a18ba5 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '182e509007ab5e6e5b2500a552cbd5ba-879500e96bf8dfb93687004bd86f2317'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', 
'/var/lib/nova:/var/lib/nova:shared']}, architecture=x86_64, io.openshift.expose-services=, batch=17.1_20251118.1, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.component=openstack-nova-compute-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, container_name=nova_compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp17/openstack-nova-compute, url=https://www.redhat.com, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, version=17.1.12, io.buildah.version=1.41.4, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, build-date=2025-11-19T00:36:58Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 nova-compute, distribution-scope=public, tcib_managed=true, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, config_id=tripleo_step5, vcs-type=git, release=1761123044, konflux.additional-tags=17.1.12 17.1_20251118.1) Dec 15 03:46:21 localhost podman[95578]: 2025-12-15 08:46:21.018392103 +0000 UTC m=+0.191228267 container exec_died 36a88083ed613f6871d2631ae462bca308a45ba583333f7cfe4d0b0c40a18ba5 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, batch=17.1_20251118.1, distribution-scope=public, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, container_name=nova_compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, 
vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, name=rhosp17/openstack-nova-compute, release=1761123044, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2025-11-19T00:36:58Z, managed_by=tripleo_ansible, io.buildah.version=1.41.4, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, architecture=x86_64, maintainer=OpenStack TripleO Team, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '182e509007ab5e6e5b2500a552cbd5ba-879500e96bf8dfb93687004bd86f2317'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', 
'/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, tcib_managed=true, version=17.1.12, config_id=tripleo_step5, summary=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.component=openstack-nova-compute-container, description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, io.openshift.expose-services=) Dec 15 03:46:21 localhost systemd[1]: 36a88083ed613f6871d2631ae462bca308a45ba583333f7cfe4d0b0c40a18ba5.service: Deactivated successfully. Dec 15 03:46:21 localhost podman[95628]: 2025-12-15 08:46:21.079273721 +0000 UTC m=+0.084182213 container health_status 2649c3aec9af2f04509e1d248b1050b202fccc3ed2b86421c89a2a65376db057 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, build-date=2025-11-18T23:44:13Z, vcs-type=git, version=17.1.12, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, description=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.component=openstack-iscsid-container, config_id=tripleo_step3, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, release=1761123044, container_name=iscsid, maintainer=OpenStack TripleO Team, architecture=x86_64, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '182e509007ab5e6e5b2500a552cbd5ba'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, managed_by=tripleo_ansible, konflux.additional-tags=17.1.12 17.1_20251118.1, vendor=Red Hat, Inc., tcib_managed=true, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, name=rhosp17/openstack-iscsid, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, batch=17.1_20251118.1, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d) Dec 15 03:46:21 localhost podman[95628]: 2025-12-15 08:46:21.094435818 +0000 UTC m=+0.099344250 container exec_died 2649c3aec9af2f04509e1d248b1050b202fccc3ed2b86421c89a2a65376db057 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-iscsid-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, managed_by=tripleo_ansible, release=1761123044, config_id=tripleo_step3, maintainer=OpenStack TripleO Team, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, 
distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, name=rhosp17/openstack-iscsid, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, tcib_managed=true, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, batch=17.1_20251118.1, build-date=2025-11-18T23:44:13Z, vcs-type=git, vendor=Red Hat, Inc., architecture=x86_64, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '182e509007ab5e6e5b2500a552cbd5ba'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, io.openshift.expose-services=, url=https://www.redhat.com, container_name=iscsid, description=Red Hat OpenStack Platform 17.1 iscsid, konflux.additional-tags=17.1.12 17.1_20251118.1, version=17.1.12, summary=Red Hat OpenStack Platform 17.1 iscsid) Dec 15 03:46:21 localhost podman[95576]: 2025-12-15 08:46:20.993853547 +0000 UTC m=+0.172206509 container health_status 
165a6fd1f2b629d16da0798ec1c9bac41d302d530a0ffa668b2315a52d6aa04e (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, distribution-scope=public, com.redhat.component=openstack-collectd-container, tcib_managed=true, batch=17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 collectd, release=1761123044, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, name=rhosp17/openstack-collectd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, konflux.additional-tags=17.1.12 17.1_20251118.1, 
vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vendor=Red Hat, Inc., config_id=tripleo_step3, io.buildah.version=1.41.4, version=17.1.12, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, summary=Red Hat OpenStack Platform 17.1 collectd, container_name=collectd, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, build-date=2025-11-18T22:51:28Z, architecture=x86_64, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, maintainer=OpenStack TripleO Team) Dec 15 03:46:21 localhost podman[95576]: 2025-12-15 08:46:21.128313174 +0000 UTC m=+0.306666126 container exec_died 165a6fd1f2b629d16da0798ec1c9bac41d302d530a0ffa668b2315a52d6aa04e (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, version=17.1.12, architecture=x86_64, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 17.1_20251118.1, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, summary=Red Hat OpenStack Platform 17.1 collectd, io.buildah.version=1.41.4, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, batch=17.1_20251118.1, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, build-date=2025-11-18T22:51:28Z, name=rhosp17/openstack-collectd, config_id=tripleo_step3, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 
'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, com.redhat.component=openstack-collectd-container, container_name=collectd, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, vendor=Red Hat, Inc., io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 collectd, tcib_managed=true, release=1761123044, vcs-type=git, url=https://www.redhat.com) Dec 15 03:46:21 localhost systemd[1]: 165a6fd1f2b629d16da0798ec1c9bac41d302d530a0ffa668b2315a52d6aa04e.service: Deactivated successfully. Dec 15 03:46:21 localhost systemd[1]: 2649c3aec9af2f04509e1d248b1050b202fccc3ed2b86421c89a2a65376db057.service: Deactivated successfully. Dec 15 03:46:24 localhost systemd[1]: Started /usr/bin/podman healthcheck run 4c3f871f4bdf287b1d3ce717956c1cf130617489217410eadf6fffcc440eb3b0. 
Dec 15 03:46:24 localhost podman[95658]: 2025-12-15 08:46:24.7519139 +0000 UTC m=+0.086980988 container health_status 4c3f871f4bdf287b1d3ce717956c1cf130617489217410eadf6fffcc440eb3b0 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, com.redhat.component=openstack-nova-compute-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.12 17.1_20251118.1, build-date=2025-11-19T00:36:58Z, description=Red Hat OpenStack Platform 17.1 nova-compute, url=https://www.redhat.com, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '879500e96bf8dfb93687004bd86f2317'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, 
distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, vcs-type=git, version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, tcib_managed=true, io.buildah.version=1.41.4, release=1761123044, config_id=tripleo_step4, name=rhosp17/openstack-nova-compute, managed_by=tripleo_ansible, batch=17.1_20251118.1, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, architecture=x86_64, vendor=Red Hat, Inc., io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 nova-compute, container_name=nova_migration_target) Dec 15 03:46:25 localhost podman[95658]: 2025-12-15 08:46:25.12684637 +0000 UTC m=+0.461913518 container exec_died 4c3f871f4bdf287b1d3ce717956c1cf130617489217410eadf6fffcc440eb3b0 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vendor=Red Hat, Inc., name=rhosp17/openstack-nova-compute, maintainer=OpenStack TripleO Team, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, container_name=nova_migration_target, vcs-type=git, io.buildah.version=1.41.4, version=17.1.12, com.redhat.component=openstack-nova-compute-container, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, release=1761123044, io.openshift.expose-services=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '879500e96bf8dfb93687004bd86f2317'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, summary=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2025-11-19T00:36:58Z, config_id=tripleo_step4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, batch=17.1_20251118.1, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 nova-compute) Dec 15 03:46:25 localhost systemd[1]: 4c3f871f4bdf287b1d3ce717956c1cf130617489217410eadf6fffcc440eb3b0.service: Deactivated successfully. Dec 15 03:46:27 localhost systemd[1]: Started /usr/bin/podman healthcheck run 2df8d20c6ba32f38bd0e6519c9c77a4146b632f21c81197d8c319b223aad81f1. Dec 15 03:46:27 localhost systemd[1]: Started /usr/bin/podman healthcheck run 4d35b7dc3f3a4e3c60661e85d3511f50804e87244d74cab67af5c6e8c447d379. 
Dec 15 03:46:27 localhost podman[95682]: 2025-12-15 08:46:27.751912995 +0000 UTC m=+0.082493717 container health_status 2df8d20c6ba32f38bd0e6519c9c77a4146b632f21c81197d8c319b223aad81f1 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, release=1761123044, config_id=tripleo_step4, architecture=x86_64, container_name=ovn_controller, name=rhosp17/openstack-ovn-controller, build-date=2025-11-18T23:34:05Z, io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, com.redhat.component=openstack-ovn-controller-container, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, summary=Red Hat OpenStack Platform 17.1 ovn-controller, tcib_managed=true, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, batch=17.1_20251118.1, distribution-scope=public, 
io.buildah.version=1.41.4, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.12, vcs-type=git, managed_by=tripleo_ansible) Dec 15 03:46:27 localhost podman[95682]: 2025-12-15 08:46:27.812930778 +0000 UTC m=+0.143511500 container exec_died 2df8d20c6ba32f38bd0e6519c9c77a4146b632f21c81197d8c319b223aad81f1 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, architecture=x86_64, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, container_name=ovn_controller, com.redhat.component=openstack-ovn-controller-container, konflux.additional-tags=17.1.12 17.1_20251118.1, config_id=tripleo_step4, batch=17.1_20251118.1, vcs-type=git, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, name=rhosp17/openstack-ovn-controller, version=17.1.12, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, url=https://www.redhat.com, release=1761123044, tcib_managed=true, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, summary=Red Hat OpenStack Platform 17.1 ovn-controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': 
['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, managed_by=tripleo_ansible, io.buildah.version=1.41.4, build-date=2025-11-18T23:34:05Z) Dec 15 03:46:27 localhost podman[95683]: 2025-12-15 08:46:27.816245676 +0000 UTC m=+0.140067138 container health_status 4d35b7dc3f3a4e3c60661e85d3511f50804e87244d74cab67af5c6e8c447d379 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, batch=17.1_20251118.1, io.openshift.expose-services=, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, io.buildah.version=1.41.4, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, release=1761123044, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, tcib_managed=true, architecture=x86_64, konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, distribution-scope=public, config_id=tripleo_step4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vendor=Red Hat, Inc., name=rhosp17/openstack-neutron-metadata-agent-ovn, build-date=2025-11-19T00:14:25Z, container_name=ovn_metadata_agent, io.openshift.tags=rhosp osp openstack osp-17.1 
openstack-neutron-metadata-agent-ovn, version=17.1.12, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'a56a6f14b467cd9064e40c03defa5ed7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}) Dec 15 03:46:27 localhost podman[95682]: unhealthy Dec 15 03:46:27 localhost systemd[1]: 2df8d20c6ba32f38bd0e6519c9c77a4146b632f21c81197d8c319b223aad81f1.service: Main process exited, code=exited, status=1/FAILURE Dec 15 03:46:27 localhost systemd[1]: 2df8d20c6ba32f38bd0e6519c9c77a4146b632f21c81197d8c319b223aad81f1.service: Failed with result 'exit-code'. 
Dec 15 03:46:27 localhost podman[95683]: 2025-12-15 08:46:27.900958703 +0000 UTC m=+0.224780215 container exec_died 4d35b7dc3f3a4e3c60661e85d3511f50804e87244d74cab67af5c6e8c447d379 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, io.openshift.expose-services=, container_name=ovn_metadata_agent, vcs-type=git, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'a56a6f14b467cd9064e40c03defa5ed7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', 
'/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, url=https://www.redhat.com, name=rhosp17/openstack-neutron-metadata-agent-ovn, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, io.buildah.version=1.41.4, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, distribution-scope=public, config_id=tripleo_step4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, batch=17.1_20251118.1, konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team, build-date=2025-11-19T00:14:25Z, release=1761123044) Dec 15 03:46:27 localhost systemd[1]: 4d35b7dc3f3a4e3c60661e85d3511f50804e87244d74cab67af5c6e8c447d379.service: Deactivated successfully. Dec 15 03:46:36 localhost systemd[1]: Started /usr/bin/podman healthcheck run 6278d648aec9c9f12e0efaafa0d6ae5653eeb34459516699e562eeb9c8d23b3c. 
Dec 15 03:46:36 localhost podman[95733]: 2025-12-15 08:46:36.759434195 +0000 UTC m=+0.088154290 container health_status 6278d648aec9c9f12e0efaafa0d6ae5653eeb34459516699e562eeb9c8d23b3c (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, architecture=x86_64, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, build-date=2025-11-18T22:49:46Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, distribution-scope=public, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, managed_by=tripleo_ansible, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.component=openstack-qdrouterd-container, description=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.expose-services=, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, tcib_managed=true, container_name=metrics_qdr, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, batch=17.1_20251118.1, maintainer=OpenStack TripleO Team, release=1761123044, url=https://www.redhat.com, vendor=Red Hat, Inc., name=rhosp17/openstack-qdrouterd, vcs-type=git, config_id=tripleo_step1, summary=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '29d07da103bbd44a9ed3e29999314b03'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, version=17.1.12) Dec 15 03:46:36 localhost podman[95733]: 2025-12-15 08:46:36.958736796 +0000 UTC m=+0.287456811 container exec_died 6278d648aec9c9f12e0efaafa0d6ae5653eeb34459516699e562eeb9c8d23b3c (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, vendor=Red Hat, Inc., managed_by=tripleo_ansible, konflux.additional-tags=17.1.12 17.1_20251118.1, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 qdrouterd, container_name=metrics_qdr, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '29d07da103bbd44a9ed3e29999314b03'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', 
'/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, version=17.1.12, io.buildah.version=1.41.4, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, release=1761123044, batch=17.1_20251118.1, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 qdrouterd, architecture=x86_64, com.redhat.component=openstack-qdrouterd-container, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-11-18T22:49:46Z, config_id=tripleo_step1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, name=rhosp17/openstack-qdrouterd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05) Dec 15 03:46:36 localhost systemd[1]: 6278d648aec9c9f12e0efaafa0d6ae5653eeb34459516699e562eeb9c8d23b3c.service: Deactivated successfully. Dec 15 03:46:51 localhost systemd[1]: Started /usr/bin/podman healthcheck run 165a6fd1f2b629d16da0798ec1c9bac41d302d530a0ffa668b2315a52d6aa04e. Dec 15 03:46:51 localhost systemd[1]: Started /usr/bin/podman healthcheck run 2649c3aec9af2f04509e1d248b1050b202fccc3ed2b86421c89a2a65376db057. Dec 15 03:46:51 localhost systemd[1]: Started /usr/bin/podman healthcheck run 36a88083ed613f6871d2631ae462bca308a45ba583333f7cfe4d0b0c40a18ba5. Dec 15 03:46:51 localhost systemd[1]: Started /usr/bin/podman healthcheck run 97115ff7c5a84ae7e17f79e696e94da3c63f9fe0051287fe9193bec282a03f99. 
Dec 15 03:46:51 localhost systemd[1]: Started /usr/bin/podman healthcheck run ac45612fab357b615b4684dbfa5bd000dd29eb1adfbaedb6db0802fc49e89549. Dec 15 03:46:51 localhost systemd[1]: Started /usr/bin/podman healthcheck run d6041e5471624a495d5be42684f6049ccf71802a80d5760239330151d465d146. Dec 15 03:46:51 localhost podman[95761]: 2025-12-15 08:46:51.775479181 +0000 UTC m=+0.104147857 container health_status 165a6fd1f2b629d16da0798ec1c9bac41d302d530a0ffa668b2315a52d6aa04e (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', 
'/sys/fs/cgroup:/sys/fs/cgroup:ro']}, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, name=rhosp17/openstack-collectd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, build-date=2025-11-18T22:51:28Z, batch=17.1_20251118.1, com.redhat.component=openstack-collectd-container, io.buildah.version=1.41.4, summary=Red Hat OpenStack Platform 17.1 collectd, tcib_managed=true, vcs-type=git, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vendor=Red Hat, Inc., container_name=collectd, distribution-scope=public, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, release=1761123044, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, url=https://www.redhat.com, config_id=tripleo_step3, description=Red Hat OpenStack Platform 17.1 collectd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, version=17.1.12, architecture=x86_64) Dec 15 03:46:51 localhost podman[95761]: 2025-12-15 08:46:51.812494791 +0000 UTC m=+0.141163427 container exec_died 165a6fd1f2b629d16da0798ec1c9bac41d302d530a0ffa668b2315a52d6aa04e (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, io.openshift.expose-services=, tcib_managed=true, com.redhat.component=openstack-collectd-container, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 
'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, architecture=x86_64, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 collectd, build-date=2025-11-18T22:51:28Z, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, batch=17.1_20251118.1, config_id=tripleo_step3, container_name=collectd, distribution-scope=public, name=rhosp17/openstack-collectd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, konflux.additional-tags=17.1.12 17.1_20251118.1, url=https://www.redhat.com, io.buildah.version=1.41.4, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, release=1761123044, summary=Red Hat OpenStack Platform 17.1 collectd, version=17.1.12, vendor=Red Hat, Inc.) Dec 15 03:46:51 localhost systemd[1]: tmp-crun.ZgCbwO.mount: Deactivated successfully. 
Dec 15 03:46:51 localhost systemd[1]: 165a6fd1f2b629d16da0798ec1c9bac41d302d530a0ffa668b2315a52d6aa04e.service: Deactivated successfully. Dec 15 03:46:51 localhost podman[95762]: 2025-12-15 08:46:51.836024871 +0000 UTC m=+0.159399555 container health_status 2649c3aec9af2f04509e1d248b1050b202fccc3ed2b86421c89a2a65376db057 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, com.redhat.component=openstack-iscsid-container, build-date=2025-11-18T23:44:13Z, container_name=iscsid, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-type=git, tcib_managed=true, version=17.1.12, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.expose-services=, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, batch=17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_id=tripleo_step3, name=rhosp17/openstack-iscsid, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, io.buildah.version=1.41.4, vendor=Red Hat, Inc., release=1761123044, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, url=https://www.redhat.com, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '182e509007ab5e6e5b2500a552cbd5ba'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, architecture=x86_64) Dec 15 03:46:51 localhost podman[95771]: 2025-12-15 08:46:51.897957578 +0000 UTC m=+0.213617826 container health_status ac45612fab357b615b4684dbfa5bd000dd29eb1adfbaedb6db0802fc49e89549 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, description=Red Hat OpenStack Platform 17.1 cron, config_id=tripleo_step4, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, summary=Red Hat OpenStack Platform 17.1 cron, com.redhat.component=openstack-cron-container, name=rhosp17/openstack-cron, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, url=https://www.redhat.com, tcib_managed=true, distribution-scope=public, batch=17.1_20251118.1, managed_by=tripleo_ansible, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, version=17.1.12, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, architecture=x86_64, build-date=2025-11-18T22:49:32Z, vendor=Red Hat, Inc., container_name=logrotate_crond, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, io.buildah.version=1.41.4, release=1761123044, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.expose-services=, vcs-type=git) Dec 15 03:46:51 localhost podman[95763]: 2025-12-15 08:46:51.865255503 +0000 UTC m=+0.188281338 container health_status 36a88083ed613f6871d2631ae462bca308a45ba583333f7cfe4d0b0c40a18ba5 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, batch=17.1_20251118.1, com.redhat.component=openstack-nova-compute-container, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': 
'182e509007ab5e6e5b2500a552cbd5ba-879500e96bf8dfb93687004bd86f2317'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, io.buildah.version=1.41.4, name=rhosp17/openstack-nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, tcib_managed=true, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, url=https://www.redhat.com, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, version=17.1.12, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, vendor=Red Hat, Inc., vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, managed_by=tripleo_ansible, distribution-scope=public, container_name=nova_compute, release=1761123044, build-date=2025-11-19T00:36:58Z, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, config_id=tripleo_step5, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI) Dec 15 03:46:51 localhost podman[95762]: 2025-12-15 08:46:51.920298066 +0000 UTC m=+0.243672770 container exec_died 2649c3aec9af2f04509e1d248b1050b202fccc3ed2b86421c89a2a65376db057 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, maintainer=OpenStack TripleO Team, release=1761123044, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, io.buildah.version=1.41.4, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=iscsid, konflux.additional-tags=17.1.12 17.1_20251118.1, vendor=Red Hat, Inc., managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 iscsid, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.expose-services=, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, url=https://www.redhat.com, config_id=tripleo_step3, description=Red Hat OpenStack Platform 17.1 iscsid, batch=17.1_20251118.1, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, architecture=x86_64, 
com.redhat.component=openstack-iscsid-container, name=rhosp17/openstack-iscsid, build-date=2025-11-18T23:44:13Z, version=17.1.12, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '182e509007ab5e6e5b2500a552cbd5ba'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid) Dec 15 03:46:51 localhost systemd[1]: 2649c3aec9af2f04509e1d248b1050b202fccc3ed2b86421c89a2a65376db057.service: Deactivated successfully. 
Dec 15 03:46:51 localhost podman[95764]: 2025-12-15 08:46:51.929162823 +0000 UTC m=+0.247161573 container health_status 97115ff7c5a84ae7e17f79e696e94da3c63f9fe0051287fe9193bec282a03f99 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, release=1761123044, config_id=tripleo_step4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, name=rhosp17/openstack-ceilometer-ipmi, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'fcee5a4a91f85471fca7b61211375646'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, tcib_managed=true, container_name=ceilometer_agent_ipmi, io.k8s.display-name=Red Hat OpenStack 
Platform 17.1 ceilometer-ipmi, version=17.1.12, url=https://www.redhat.com, managed_by=tripleo_ansible, com.redhat.component=openstack-ceilometer-ipmi-container, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-type=git, build-date=2025-11-19T00:12:45Z, io.openshift.expose-services=, io.buildah.version=1.41.4, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, batch=17.1_20251118.1, distribution-scope=public, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi) Dec 15 03:46:51 localhost podman[95763]: 2025-12-15 08:46:51.943853295 +0000 UTC m=+0.266879080 container exec_died 36a88083ed613f6871d2631ae462bca308a45ba583333f7cfe4d0b0c40a18ba5 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, distribution-scope=public, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, summary=Red Hat OpenStack Platform 17.1 nova-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, name=rhosp17/openstack-nova-compute, architecture=x86_64, config_id=tripleo_step5, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-19T00:36:58Z, io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '182e509007ab5e6e5b2500a552cbd5ba-879500e96bf8dfb93687004bd86f2317'}, 
'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, release=1761123044, batch=17.1_20251118.1, tcib_managed=true, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, vcs-type=git, version=17.1.12, container_name=nova_compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, com.redhat.component=openstack-nova-compute-container) Dec 15 03:46:51 localhost systemd[1]: 36a88083ed613f6871d2631ae462bca308a45ba583333f7cfe4d0b0c40a18ba5.service: Deactivated successfully. 
Dec 15 03:46:51 localhost podman[95764]: 2025-12-15 08:46:51.983394753 +0000 UTC m=+0.301393453 container exec_died 97115ff7c5a84ae7e17f79e696e94da3c63f9fe0051287fe9193bec282a03f99 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, config_id=tripleo_step4, vcs-type=git, architecture=x86_64, version=17.1.12, release=1761123044, name=rhosp17/openstack-ceilometer-ipmi, io.openshift.expose-services=, url=https://www.redhat.com, io.buildah.version=1.41.4, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'fcee5a4a91f85471fca7b61211375646'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, build-date=2025-11-19T00:12:45Z, managed_by=tripleo_ansible, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, konflux.additional-tags=17.1.12 17.1_20251118.1, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, container_name=ceilometer_agent_ipmi, distribution-scope=public, maintainer=OpenStack TripleO Team, batch=17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-ceilometer-ipmi-container) Dec 15 03:46:51 localhost systemd[1]: 97115ff7c5a84ae7e17f79e696e94da3c63f9fe0051287fe9193bec282a03f99.service: Deactivated successfully. Dec 15 03:46:52 localhost podman[95780]: 2025-12-15 08:46:52.056745245 +0000 UTC m=+0.370826981 container health_status d6041e5471624a495d5be42684f6049ccf71802a80d5760239330151d465d146 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, version=17.1.12, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, managed_by=tripleo_ansible, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., batch=17.1_20251118.1, url=https://www.redhat.com, config_id=tripleo_step4, distribution-scope=public, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-ceilometer-compute, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, container_name=ceilometer_agent_compute, vcs-type=git, release=1761123044, 
build-date=2025-11-19T00:11:48Z, io.buildah.version=1.41.4, com.redhat.component=openstack-ceilometer-compute-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'fcee5a4a91f85471fca7b61211375646'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}) Dec 15 03:46:52 localhost podman[95771]: 2025-12-15 08:46:52.081373974 +0000 UTC m=+0.397034242 container exec_died ac45612fab357b615b4684dbfa5bd000dd29eb1adfbaedb6db0802fc49e89549 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 cron, io.k8s.display-name=Red Hat OpenStack Platform 
17.1 cron, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public, name=rhosp17/openstack-cron, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, io.buildah.version=1.41.4, tcib_managed=true, container_name=logrotate_crond, url=https://www.redhat.com, build-date=2025-11-18T22:49:32Z, batch=17.1_20251118.1, io.openshift.expose-services=, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, version=17.1.12, com.redhat.component=openstack-cron-container, architecture=x86_64, release=1761123044, summary=Red Hat OpenStack Platform 17.1 cron, maintainer=OpenStack TripleO Team) 
Dec 15 03:46:52 localhost systemd[1]: ac45612fab357b615b4684dbfa5bd000dd29eb1adfbaedb6db0802fc49e89549.service: Deactivated successfully. Dec 15 03:46:52 localhost podman[95780]: 2025-12-15 08:46:52.112357923 +0000 UTC m=+0.426439619 container exec_died d6041e5471624a495d5be42684f6049ccf71802a80d5760239330151d465d146 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, architecture=x86_64, container_name=ceilometer_agent_compute, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, vcs-type=git, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, batch=17.1_20251118.1, distribution-scope=public, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.buildah.version=1.41.4, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, tcib_managed=true, name=rhosp17/openstack-ceilometer-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, release=1761123044, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, build-date=2025-11-19T00:11:48Z, io.openshift.expose-services=, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'fcee5a4a91f85471fca7b61211375646'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, version=17.1.12, konflux.additional-tags=17.1.12 17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.component=openstack-ceilometer-compute-container, url=https://www.redhat.com) Dec 15 03:46:52 localhost systemd[1]: d6041e5471624a495d5be42684f6049ccf71802a80d5760239330151d465d146.service: Deactivated successfully. Dec 15 03:46:55 localhost systemd[1]: Started /usr/bin/podman healthcheck run 4c3f871f4bdf287b1d3ce717956c1cf130617489217410eadf6fffcc440eb3b0. 
Dec 15 03:46:55 localhost podman[96012]: 2025-12-15 08:46:55.682171811 +0000 UTC m=+0.100502749 container health_status 4c3f871f4bdf287b1d3ce717956c1cf130617489217410eadf6fffcc440eb3b0 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, vendor=Red Hat, Inc., konflux.additional-tags=17.1.12 17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 nova-compute, config_id=tripleo_step4, build-date=2025-11-19T00:36:58Z, com.redhat.component=openstack-nova-compute-container, managed_by=tripleo_ansible, container_name=nova_migration_target, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.12, architecture=x86_64, io.buildah.version=1.41.4, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 nova-compute, release=1761123044, vcs-type=git, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '879500e96bf8dfb93687004bd86f2317'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, name=rhosp17/openstack-nova-compute, batch=17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, tcib_managed=true, maintainer=OpenStack TripleO Team, io.openshift.expose-services=) Dec 15 03:46:56 localhost podman[96012]: 2025-12-15 08:46:56.046498308 +0000 UTC m=+0.464829276 container exec_died 4c3f871f4bdf287b1d3ce717956c1cf130617489217410eadf6fffcc440eb3b0 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, version=17.1.12, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, tcib_managed=true, url=https://www.redhat.com, vendor=Red Hat, Inc., config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '879500e96bf8dfb93687004bd86f2317'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', 
'/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, config_id=tripleo_step4, release=1761123044, architecture=x86_64, io.buildah.version=1.41.4, com.redhat.component=openstack-nova-compute-container, build-date=2025-11-19T00:36:58Z, batch=17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 nova-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, container_name=nova_migration_target, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp17/openstack-nova-compute) Dec 15 03:46:56 localhost systemd[1]: 4c3f871f4bdf287b1d3ce717956c1cf130617489217410eadf6fffcc440eb3b0.service: Deactivated successfully. Dec 15 03:46:58 localhost systemd[1]: Started /usr/bin/podman healthcheck run 2df8d20c6ba32f38bd0e6519c9c77a4146b632f21c81197d8c319b223aad81f1. Dec 15 03:46:58 localhost systemd[1]: Started /usr/bin/podman healthcheck run 4d35b7dc3f3a4e3c60661e85d3511f50804e87244d74cab67af5c6e8c447d379. Dec 15 03:46:58 localhost systemd[1]: tmp-crun.SVZBOo.mount: Deactivated successfully. 
Dec 15 03:46:58 localhost podman[96038]: 2025-12-15 08:46:58.737358973 +0000 UTC m=+0.072656795 container health_status 2df8d20c6ba32f38bd0e6519c9c77a4146b632f21c81197d8c319b223aad81f1 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, distribution-scope=public, config_id=tripleo_step4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, konflux.additional-tags=17.1.12 17.1_20251118.1, build-date=2025-11-18T23:34:05Z, io.openshift.expose-services=, version=17.1.12, vendor=Red Hat, Inc., config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, managed_by=tripleo_ansible, name=rhosp17/openstack-ovn-controller, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, release=1761123044, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, maintainer=OpenStack TripleO Team, architecture=x86_64, com.redhat.component=openstack-ovn-controller-container, container_name=ovn_controller, batch=17.1_20251118.1, io.buildah.version=1.41.4, 
summary=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 ovn-controller) Dec 15 03:46:58 localhost systemd[1]: tmp-crun.Vyw8El.mount: Deactivated successfully. Dec 15 03:46:58 localhost podman[96039]: 2025-12-15 08:46:58.801638053 +0000 UTC m=+0.131034657 container health_status 4d35b7dc3f3a4e3c60661e85d3511f50804e87244d74cab67af5c6e8c447d379 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, managed_by=tripleo_ansible, release=1761123044, container_name=ovn_metadata_agent, name=rhosp17/openstack-neutron-metadata-agent-ovn, io.buildah.version=1.41.4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, version=17.1.12, konflux.additional-tags=17.1.12 17.1_20251118.1, tcib_managed=true, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, distribution-scope=public, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-type=git, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, build-date=2025-11-19T00:14:25Z, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'a56a6f14b467cd9064e40c03defa5ed7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 
'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, config_id=tripleo_step4, batch=17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, architecture=x86_64, maintainer=OpenStack TripleO Team) Dec 15 03:46:58 localhost podman[96038]: 2025-12-15 08:46:58.826012445 +0000 UTC m=+0.161310287 container exec_died 2df8d20c6ba32f38bd0e6519c9c77a4146b632f21c81197d8c319b223aad81f1 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, tcib_managed=true, config_id=tripleo_step4, maintainer=OpenStack TripleO Team, build-date=2025-11-18T23:34:05Z, version=17.1.12, batch=17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 ovn-controller, name=rhosp17/openstack-ovn-controller, vcs-type=git, 
vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, architecture=x86_64, konflux.additional-tags=17.1.12 17.1_20251118.1, container_name=ovn_controller, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, com.redhat.component=openstack-ovn-controller-container, io.openshift.expose-services=, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, io.buildah.version=1.41.4, description=Red Hat OpenStack Platform 17.1 ovn-controller, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, distribution-scope=public, vendor=Red Hat, Inc., config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, release=1761123044) Dec 15 03:46:58 localhost podman[96038]: unhealthy Dec 15 03:46:58 localhost systemd[1]: 2df8d20c6ba32f38bd0e6519c9c77a4146b632f21c81197d8c319b223aad81f1.service: Main process exited, code=exited, status=1/FAILURE Dec 15 03:46:58 localhost systemd[1]: 2df8d20c6ba32f38bd0e6519c9c77a4146b632f21c81197d8c319b223aad81f1.service: Failed with result 'exit-code'. 
Dec 15 03:46:58 localhost podman[96039]: 2025-12-15 08:46:58.841585481 +0000 UTC m=+0.170982115 container exec_died 4d35b7dc3f3a4e3c60661e85d3511f50804e87244d74cab67af5c6e8c447d379 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, name=rhosp17/openstack-neutron-metadata-agent-ovn, release=1761123044, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, config_id=tripleo_step4, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'a56a6f14b467cd9064e40c03defa5ed7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.buildah.version=1.41.4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.12 17.1_20251118.1, batch=17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, architecture=x86_64, distribution-scope=public, url=https://www.redhat.com, build-date=2025-11-19T00:14:25Z, version=17.1.12, io.openshift.expose-services=, vcs-type=git, container_name=ovn_metadata_agent, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, managed_by=tripleo_ansible) Dec 15 03:46:58 localhost podman[96039]: unhealthy Dec 15 03:46:58 localhost systemd[1]: 4d35b7dc3f3a4e3c60661e85d3511f50804e87244d74cab67af5c6e8c447d379.service: Main process exited, code=exited, status=1/FAILURE Dec 15 03:46:58 localhost systemd[1]: 4d35b7dc3f3a4e3c60661e85d3511f50804e87244d74cab67af5c6e8c447d379.service: Failed with result 'exit-code'. Dec 15 03:47:07 localhost systemd[1]: Started /usr/bin/podman healthcheck run 6278d648aec9c9f12e0efaafa0d6ae5653eeb34459516699e562eeb9c8d23b3c. 
Dec 15 03:47:07 localhost podman[96077]: 2025-12-15 08:47:07.756752167 +0000 UTC m=+0.084657556 container health_status 6278d648aec9c9f12e0efaafa0d6ae5653eeb34459516699e562eeb9c8d23b3c (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, container_name=metrics_qdr, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, architecture=x86_64, vendor=Red Hat, Inc., release=1761123044, vcs-type=git, maintainer=OpenStack TripleO Team, version=17.1.12, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '29d07da103bbd44a9ed3e29999314b03'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, build-date=2025-11-18T22:49:46Z, name=rhosp17/openstack-qdrouterd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, url=https://www.redhat.com, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.buildah.version=1.41.4, config_id=tripleo_step1, tcib_managed=true, com.redhat.component=openstack-qdrouterd-container, summary=Red Hat OpenStack Platform 17.1 qdrouterd, batch=17.1_20251118.1, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, description=Red Hat OpenStack Platform 17.1 qdrouterd) Dec 15 03:47:07 localhost podman[96077]: 2025-12-15 08:47:07.942485086 +0000 UTC m=+0.270390425 container exec_died 6278d648aec9c9f12e0efaafa0d6ae5653eeb34459516699e562eeb9c8d23b3c (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, com.redhat.component=openstack-qdrouterd-container, config_id=tripleo_step1, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-qdrouterd, vendor=Red Hat, Inc., version=17.1.12, build-date=2025-11-18T22:49:46Z, description=Red Hat OpenStack Platform 17.1 qdrouterd, release=1761123044, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '29d07da103bbd44a9ed3e29999314b03'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, io.buildah.version=1.41.4, architecture=x86_64, container_name=metrics_qdr, io.openshift.expose-services=, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, konflux.additional-tags=17.1.12 17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 qdrouterd, managed_by=tripleo_ansible, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-type=git, batch=17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream) Dec 15 03:47:07 localhost systemd[1]: 6278d648aec9c9f12e0efaafa0d6ae5653eeb34459516699e562eeb9c8d23b3c.service: Deactivated successfully. Dec 15 03:47:22 localhost systemd[1]: Started /usr/bin/podman healthcheck run 165a6fd1f2b629d16da0798ec1c9bac41d302d530a0ffa668b2315a52d6aa04e. Dec 15 03:47:22 localhost systemd[1]: Started /usr/bin/podman healthcheck run 2649c3aec9af2f04509e1d248b1050b202fccc3ed2b86421c89a2a65376db057. Dec 15 03:47:22 localhost systemd[1]: Started /usr/bin/podman healthcheck run 36a88083ed613f6871d2631ae462bca308a45ba583333f7cfe4d0b0c40a18ba5. Dec 15 03:47:22 localhost systemd[1]: Started /usr/bin/podman healthcheck run 97115ff7c5a84ae7e17f79e696e94da3c63f9fe0051287fe9193bec282a03f99. 
Dec 15 03:47:22 localhost systemd[1]: Started /usr/bin/podman healthcheck run ac45612fab357b615b4684dbfa5bd000dd29eb1adfbaedb6db0802fc49e89549. Dec 15 03:47:22 localhost systemd[1]: Started /usr/bin/podman healthcheck run d6041e5471624a495d5be42684f6049ccf71802a80d5760239330151d465d146. Dec 15 03:47:22 localhost systemd[1]: tmp-crun.rZjpFG.mount: Deactivated successfully. Dec 15 03:47:22 localhost podman[96107]: 2025-12-15 08:47:22.781876083 +0000 UTC m=+0.104555578 container health_status 165a6fd1f2b629d16da0798ec1c9bac41d302d530a0ffa668b2315a52d6aa04e (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, architecture=x86_64, config_id=tripleo_step3, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 collectd, build-date=2025-11-18T22:51:28Z, name=rhosp17/openstack-collectd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, tcib_managed=true, com.redhat.component=openstack-collectd-container, release=1761123044, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-type=git, batch=17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, io.buildah.version=1.41.4, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 collectd, version=17.1.12, vendor=Red Hat, Inc., container_name=collectd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.expose-services=, url=https://www.redhat.com, konflux.additional-tags=17.1.12 17.1_20251118.1) Dec 15 03:47:22 localhost podman[96107]: 2025-12-15 08:47:22.814186977 +0000 UTC m=+0.136866482 container exec_died 165a6fd1f2b629d16da0798ec1c9bac41d302d530a0ffa668b2315a52d6aa04e (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, description=Red Hat OpenStack Platform 17.1 collectd, io.buildah.version=1.41.4, architecture=x86_64, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, build-date=2025-11-18T22:51:28Z, batch=17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, summary=Red Hat OpenStack Platform 17.1 collectd, name=rhosp17/openstack-collectd, container_name=collectd, vcs-type=git, url=https://www.redhat.com, konflux.additional-tags=17.1.12 17.1_20251118.1, release=1761123044, 
vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, distribution-scope=public, io.openshift.expose-services=, managed_by=tripleo_ansible, com.redhat.component=openstack-collectd-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, config_id=tripleo_step3, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, version=17.1.12, tcib_managed=true, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vendor=Red Hat, Inc.) 
Dec 15 03:47:22 localhost podman[96109]: 2025-12-15 08:47:22.827602226 +0000 UTC m=+0.145109003 container health_status 36a88083ed613f6871d2631ae462bca308a45ba583333f7cfe4d0b0c40a18ba5 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, io.buildah.version=1.41.4, vendor=Red Hat, Inc., batch=17.1_20251118.1, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '182e509007ab5e6e5b2500a552cbd5ba-879500e96bf8dfb93687004bd86f2317'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', 
'/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, version=17.1.12, config_id=tripleo_step5, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.component=openstack-nova-compute-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, distribution-scope=public, architecture=x86_64, name=rhosp17/openstack-nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, container_name=nova_compute, managed_by=tripleo_ansible, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, summary=Red Hat OpenStack Platform 17.1 nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, url=https://www.redhat.com, release=1761123044, build-date=2025-11-19T00:36:58Z, io.openshift.expose-services=) Dec 15 03:47:22 localhost systemd[1]: 165a6fd1f2b629d16da0798ec1c9bac41d302d530a0ffa668b2315a52d6aa04e.service: Deactivated successfully. 
Dec 15 03:47:22 localhost podman[96109]: 2025-12-15 08:47:22.887383175 +0000 UTC m=+0.204889892 container exec_died 36a88083ed613f6871d2631ae462bca308a45ba583333f7cfe4d0b0c40a18ba5 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, batch=17.1_20251118.1, release=1761123044, summary=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step5, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '182e509007ab5e6e5b2500a552cbd5ba-879500e96bf8dfb93687004bd86f2317'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', 
'/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, url=https://www.redhat.com, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public, io.openshift.expose-services=, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, version=17.1.12, com.redhat.component=openstack-nova-compute-container, io.buildah.version=1.41.4, container_name=nova_compute, description=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, tcib_managed=true, name=rhosp17/openstack-nova-compute, vcs-type=git, build-date=2025-11-19T00:36:58Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, konflux.additional-tags=17.1.12 17.1_20251118.1, architecture=x86_64) Dec 15 03:47:22 localhost podman[96124]: 2025-12-15 08:47:22.896076898 +0000 UTC m=+0.202014485 container health_status d6041e5471624a495d5be42684f6049ccf71802a80d5760239330151d465d146 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, version=17.1.12, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, config_id=tripleo_step4, 
io.buildah.version=1.41.4, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.component=openstack-ceilometer-compute-container, io.openshift.expose-services=, vcs-type=git, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, managed_by=tripleo_ansible, release=1761123044, vendor=Red Hat, Inc., batch=17.1_20251118.1, container_name=ceilometer_agent_compute, tcib_managed=true, build-date=2025-11-19T00:11:48Z, name=rhosp17/openstack-ceilometer-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'fcee5a4a91f85471fca7b61211375646'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, url=https://www.redhat.com, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 
openstack-ceilometer-compute) Dec 15 03:47:22 localhost systemd[1]: 36a88083ed613f6871d2631ae462bca308a45ba583333f7cfe4d0b0c40a18ba5.service: Deactivated successfully. Dec 15 03:47:22 localhost podman[96116]: 2025-12-15 08:47:22.93203873 +0000 UTC m=+0.243410213 container health_status ac45612fab357b615b4684dbfa5bd000dd29eb1adfbaedb6db0802fc49e89549 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, summary=Red Hat OpenStack Platform 17.1 cron, batch=17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step4, version=17.1.12, description=Red Hat OpenStack Platform 17.1 cron, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, release=1761123044, io.openshift.expose-services=, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, managed_by=tripleo_ansible, tcib_managed=true, vendor=Red Hat, Inc., config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', 
'/var/log/containers:/var/log/containers:z']}, com.redhat.component=openstack-cron-container, maintainer=OpenStack TripleO Team, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, build-date=2025-11-18T22:49:32Z, io.buildah.version=1.41.4, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-type=git, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, container_name=logrotate_crond, name=rhosp17/openstack-cron, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05) Dec 15 03:47:22 localhost podman[96116]: 2025-12-15 08:47:22.940296791 +0000 UTC m=+0.251668284 container exec_died ac45612fab357b615b4684dbfa5bd000dd29eb1adfbaedb6db0802fc49e89549 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, maintainer=OpenStack TripleO Team, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 cron, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.buildah.version=1.41.4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1761123044, version=17.1.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, distribution-scope=public, config_id=tripleo_step4, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, batch=17.1_20251118.1, managed_by=tripleo_ansible, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, konflux.additional-tags=17.1.12 17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 cron, container_name=logrotate_crond, name=rhosp17/openstack-cron, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-type=git, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, com.redhat.component=openstack-cron-container, build-date=2025-11-18T22:49:32Z, url=https://www.redhat.com, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, tcib_managed=true) Dec 15 03:47:22 localhost systemd[1]: ac45612fab357b615b4684dbfa5bd000dd29eb1adfbaedb6db0802fc49e89549.service: Deactivated successfully. 
Dec 15 03:47:22 localhost podman[96124]: 2025-12-15 08:47:22.952718823 +0000 UTC m=+0.258656430 container exec_died d6041e5471624a495d5be42684f6049ccf71802a80d5760239330151d465d146 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, distribution-scope=public, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'fcee5a4a91f85471fca7b61211375646'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, name=rhosp17/openstack-ceilometer-compute, version=17.1.12, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=ceilometer_agent_compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, url=https://www.redhat.com, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, config_id=tripleo_step4, io.buildah.version=1.41.4, konflux.additional-tags=17.1.12 17.1_20251118.1, 
maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, com.redhat.component=openstack-ceilometer-compute-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, batch=17.1_20251118.1, tcib_managed=true, vcs-type=git, build-date=2025-11-19T00:11:48Z, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.expose-services=, vendor=Red Hat, Inc., release=1761123044, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, managed_by=tripleo_ansible) Dec 15 03:47:22 localhost systemd[1]: d6041e5471624a495d5be42684f6049ccf71802a80d5760239330151d465d146.service: Deactivated successfully. Dec 15 03:47:23 localhost podman[96110]: 2025-12-15 08:47:23.000218554 +0000 UTC m=+0.314367151 container health_status 97115ff7c5a84ae7e17f79e696e94da3c63f9fe0051287fe9193bec282a03f99 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.expose-services=, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team, vcs-type=git, name=rhosp17/openstack-ceilometer-ipmi, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, com.redhat.component=openstack-ceilometer-ipmi-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, release=1761123044, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'fcee5a4a91f85471fca7b61211375646'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 
'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, distribution-scope=public, config_id=tripleo_step4, io.buildah.version=1.41.4, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, managed_by=tripleo_ansible, build-date=2025-11-19T00:12:45Z, container_name=ceilometer_agent_ipmi, version=17.1.12, vendor=Red Hat, Inc., batch=17.1_20251118.1) Dec 15 03:47:23 localhost podman[96108]: 2025-12-15 08:47:23.036937016 +0000 UTC m=+0.354094013 container health_status 2649c3aec9af2f04509e1d248b1050b202fccc3ed2b86421c89a2a65376db057 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '182e509007ab5e6e5b2500a552cbd5ba'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 
'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, konflux.additional-tags=17.1.12 17.1_20251118.1, url=https://www.redhat.com, architecture=x86_64, io.openshift.expose-services=, build-date=2025-11-18T23:44:13Z, name=rhosp17/openstack-iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, tcib_managed=true, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, vcs-type=git, com.redhat.component=openstack-iscsid-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.buildah.version=1.41.4, container_name=iscsid, batch=17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_id=tripleo_step3, description=Red Hat OpenStack Platform 17.1 iscsid, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, vendor=Red Hat, Inc., version=17.1.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, release=1761123044, distribution-scope=public, io.k8s.description=Red Hat 
OpenStack Platform 17.1 iscsid, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid) Dec 15 03:47:23 localhost podman[96110]: 2025-12-15 08:47:23.057373952 +0000 UTC m=+0.371522499 container exec_died 97115ff7c5a84ae7e17f79e696e94da3c63f9fe0051287fe9193bec282a03f99 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, version=17.1.12, architecture=x86_64, managed_by=tripleo_ansible, url=https://www.redhat.com, io.buildah.version=1.41.4, container_name=ceilometer_agent_ipmi, vendor=Red Hat, Inc., org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, config_id=tripleo_step4, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, release=1761123044, com.redhat.component=openstack-ceilometer-ipmi-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.expose-services=, distribution-scope=public, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-ceilometer-ipmi, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'fcee5a4a91f85471fca7b61211375646'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, konflux.additional-tags=17.1.12 17.1_20251118.1, build-date=2025-11-19T00:12:45Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, batch=17.1_20251118.1, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05) Dec 15 03:47:23 localhost systemd[1]: 97115ff7c5a84ae7e17f79e696e94da3c63f9fe0051287fe9193bec282a03f99.service: Deactivated successfully. Dec 15 03:47:23 localhost podman[96108]: 2025-12-15 08:47:23.075520818 +0000 UTC m=+0.392677815 container exec_died 2649c3aec9af2f04509e1d248b1050b202fccc3ed2b86421c89a2a65376db057 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, config_id=tripleo_step3, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, summary=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, com.redhat.component=openstack-iscsid-container, container_name=iscsid, version=17.1.12, name=rhosp17/openstack-iscsid, description=Red Hat OpenStack Platform 17.1 iscsid, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, vcs-type=git, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, 
io.openshift.expose-services=, vendor=Red Hat, Inc., config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '182e509007ab5e6e5b2500a552cbd5ba'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, io.buildah.version=1.41.4, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, release=1761123044, build-date=2025-11-18T23:44:13Z, distribution-scope=public, konflux.additional-tags=17.1.12 17.1_20251118.1, architecture=x86_64, batch=17.1_20251118.1) Dec 15 03:47:23 localhost systemd[1]: 2649c3aec9af2f04509e1d248b1050b202fccc3ed2b86421c89a2a65376db057.service: Deactivated successfully. Dec 15 03:47:23 localhost systemd[1]: Starting Check and recover tripleo_nova_virtqemud... Dec 15 03:47:23 localhost recover_tripleo_nova_virtqemud[96247]: 61849 Dec 15 03:47:23 localhost systemd[1]: tripleo_nova_virtqemud_recover.service: Deactivated successfully. Dec 15 03:47:23 localhost systemd[1]: Finished Check and recover tripleo_nova_virtqemud. 
Dec 15 03:47:26 localhost systemd[1]: Started /usr/bin/podman healthcheck run 4c3f871f4bdf287b1d3ce717956c1cf130617489217410eadf6fffcc440eb3b0. Dec 15 03:47:26 localhost systemd[1]: tmp-crun.NB15mX.mount: Deactivated successfully. Dec 15 03:47:26 localhost podman[96248]: 2025-12-15 08:47:26.769884617 +0000 UTC m=+0.101471835 container health_status 4c3f871f4bdf287b1d3ce717956c1cf130617489217410eadf6fffcc440eb3b0 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, managed_by=tripleo_ansible, container_name=nova_migration_target, version=17.1.12, io.openshift.expose-services=, vendor=Red Hat, Inc., config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '879500e96bf8dfb93687004bd86f2317'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, summary=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, architecture=x86_64, build-date=2025-11-19T00:36:58Z, vcs-type=git, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, url=https://www.redhat.com, io.buildah.version=1.41.4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp17/openstack-nova-compute, release=1761123044, config_id=tripleo_step4, com.redhat.component=openstack-nova-compute-container, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, batch=17.1_20251118.1, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d) Dec 15 03:47:27 localhost podman[96248]: 2025-12-15 08:47:27.179954907 +0000 UTC m=+0.511542125 container exec_died 4c3f871f4bdf287b1d3ce717956c1cf130617489217410eadf6fffcc440eb3b0 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, build-date=2025-11-19T00:36:58Z, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, release=1761123044, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, batch=17.1_20251118.1, vendor=Red Hat, Inc., name=rhosp17/openstack-nova-compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, tcib_managed=true, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, config_id=tripleo_step4, vcs-type=git, url=https://www.redhat.com, version=17.1.12, com.redhat.component=openstack-nova-compute-container, architecture=x86_64, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '879500e96bf8dfb93687004bd86f2317'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, container_name=nova_migration_target) Dec 15 03:47:27 localhost systemd[1]: 4c3f871f4bdf287b1d3ce717956c1cf130617489217410eadf6fffcc440eb3b0.service: Deactivated successfully. Dec 15 03:47:29 localhost systemd[1]: Started /usr/bin/podman healthcheck run 2df8d20c6ba32f38bd0e6519c9c77a4146b632f21c81197d8c319b223aad81f1. 
Dec 15 03:47:29 localhost systemd[1]: Started /usr/bin/podman healthcheck run 4d35b7dc3f3a4e3c60661e85d3511f50804e87244d74cab67af5c6e8c447d379. Dec 15 03:47:29 localhost podman[96271]: 2025-12-15 08:47:29.730355175 +0000 UTC m=+0.063004457 container health_status 2df8d20c6ba32f38bd0e6519c9c77a4146b632f21c81197d8c319b223aad81f1 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=unhealthy, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.component=openstack-ovn-controller-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, container_name=ovn_controller, distribution-scope=public, tcib_managed=true, batch=17.1_20251118.1, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 ovn-controller, version=17.1.12, release=1761123044, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, konflux.additional-tags=17.1.12 17.1_20251118.1, config_id=tripleo_step4, build-date=2025-11-18T23:34:05Z, architecture=x86_64, 
vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp17/openstack-ovn-controller, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.buildah.version=1.41.4) Dec 15 03:47:29 localhost podman[96271]: 2025-12-15 08:47:29.740688521 +0000 UTC m=+0.073337883 container exec_died 2df8d20c6ba32f38bd0e6519c9c77a4146b632f21c81197d8c319b223aad81f1 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 ovn-controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, vcs-type=git, release=1761123044, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, maintainer=OpenStack TripleO Team, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, managed_by=tripleo_ansible, version=17.1.12, container_name=ovn_controller, distribution-scope=public, batch=17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, build-date=2025-11-18T23:34:05Z, io.openshift.expose-services=, com.redhat.component=openstack-ovn-controller-container, 
org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, io.buildah.version=1.41.4, config_id=tripleo_step4, name=rhosp17/openstack-ovn-controller, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vendor=Red Hat, Inc., architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, konflux.additional-tags=17.1.12 17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 ovn-controller) Dec 15 03:47:29 localhost podman[96271]: unhealthy Dec 15 03:47:29 localhost systemd[1]: 2df8d20c6ba32f38bd0e6519c9c77a4146b632f21c81197d8c319b223aad81f1.service: Main process exited, code=exited, status=1/FAILURE Dec 15 03:47:29 localhost systemd[1]: 2df8d20c6ba32f38bd0e6519c9c77a4146b632f21c81197d8c319b223aad81f1.service: Failed with result 'exit-code'. Dec 15 03:47:29 localhost podman[96272]: 2025-12-15 08:47:29.784214896 +0000 UTC m=+0.111668319 container health_status 4d35b7dc3f3a4e3c60661e85d3511f50804e87244d74cab67af5c6e8c447d379 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.buildah.version=1.41.4, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, name=rhosp17/openstack-neutron-metadata-agent-ovn, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'a56a6f14b467cd9064e40c03defa5ed7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 
'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, distribution-scope=public, container_name=ovn_metadata_agent, managed_by=tripleo_ansible, version=17.1.12, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, konflux.additional-tags=17.1.12 17.1_20251118.1, config_id=tripleo_step4, maintainer=OpenStack TripleO Team, build-date=2025-11-19T00:14:25Z, url=https://www.redhat.com, architecture=x86_64, vendor=Red Hat, Inc., vcs-type=git, release=1761123044, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, 
batch=17.1_20251118.1, tcib_managed=true) Dec 15 03:47:29 localhost podman[96272]: 2025-12-15 08:47:29.797196083 +0000 UTC m=+0.124649536 container exec_died 4d35b7dc3f3a4e3c60661e85d3511f50804e87244d74cab67af5c6e8c447d379 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, container_name=ovn_metadata_agent, version=17.1.12, konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, build-date=2025-11-19T00:14:25Z, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-type=git, architecture=x86_64, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'a56a6f14b467cd9064e40c03defa5ed7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', 
'/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, config_id=tripleo_step4, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, managed_by=tripleo_ansible, release=1761123044, distribution-scope=public, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, name=rhosp17/openstack-neutron-metadata-agent-ovn, batch=17.1_20251118.1) Dec 15 03:47:29 localhost podman[96272]: unhealthy Dec 15 03:47:29 localhost systemd[1]: 4d35b7dc3f3a4e3c60661e85d3511f50804e87244d74cab67af5c6e8c447d379.service: Main process exited, code=exited, status=1/FAILURE Dec 15 03:47:29 localhost systemd[1]: 4d35b7dc3f3a4e3c60661e85d3511f50804e87244d74cab67af5c6e8c447d379.service: Failed with result 'exit-code'. Dec 15 03:47:38 localhost systemd[1]: Started /usr/bin/podman healthcheck run 6278d648aec9c9f12e0efaafa0d6ae5653eeb34459516699e562eeb9c8d23b3c. Dec 15 03:47:38 localhost systemd[1]: tmp-crun.MEnOOV.mount: Deactivated successfully. 
Dec 15 03:47:38 localhost podman[96309]: 2025-12-15 08:47:38.760359533 +0000 UTC m=+0.089109726 container health_status 6278d648aec9c9f12e0efaafa0d6ae5653eeb34459516699e562eeb9c8d23b3c (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, io.buildah.version=1.41.4, summary=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, config_id=tripleo_step1, batch=17.1_20251118.1, vendor=Red Hat, Inc., org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, release=1761123044, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, name=rhosp17/openstack-qdrouterd, container_name=metrics_qdr, build-date=2025-11-18T22:49:46Z, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 qdrouterd, url=https://www.redhat.com, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, version=17.1.12, com.redhat.component=openstack-qdrouterd-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '29d07da103bbd44a9ed3e29999314b03'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, tcib_managed=true, vcs-type=git, io.openshift.expose-services=) Dec 15 03:47:38 localhost podman[96309]: 2025-12-15 08:47:38.998318889 +0000 UTC m=+0.327069042 container exec_died 6278d648aec9c9f12e0efaafa0d6ae5653eeb34459516699e562eeb9c8d23b3c (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.expose-services=, io.buildah.version=1.41.4, distribution-scope=public, managed_by=tripleo_ansible, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.component=openstack-qdrouterd-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, name=rhosp17/openstack-qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '29d07da103bbd44a9ed3e29999314b03'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, container_name=metrics_qdr, release=1761123044, version=17.1.12, vendor=Red Hat, Inc., tcib_managed=true, architecture=x86_64, vcs-type=git, konflux.additional-tags=17.1.12 17.1_20251118.1, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, build-date=2025-11-18T22:49:46Z, batch=17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, summary=Red Hat OpenStack Platform 17.1 qdrouterd, config_id=tripleo_step1, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 qdrouterd, url=https://www.redhat.com) Dec 15 03:47:39 localhost systemd[1]: 6278d648aec9c9f12e0efaafa0d6ae5653eeb34459516699e562eeb9c8d23b3c.service: Deactivated successfully. Dec 15 03:47:49 localhost sshd[96338]: main: sshd: ssh-rsa algorithm is disabled Dec 15 03:47:53 localhost systemd[1]: Started /usr/bin/podman healthcheck run 165a6fd1f2b629d16da0798ec1c9bac41d302d530a0ffa668b2315a52d6aa04e. Dec 15 03:47:53 localhost systemd[1]: Started /usr/bin/podman healthcheck run 2649c3aec9af2f04509e1d248b1050b202fccc3ed2b86421c89a2a65376db057. Dec 15 03:47:53 localhost systemd[1]: Started /usr/bin/podman healthcheck run 36a88083ed613f6871d2631ae462bca308a45ba583333f7cfe4d0b0c40a18ba5. 
Dec 15 03:47:53 localhost systemd[1]: Started /usr/bin/podman healthcheck run 97115ff7c5a84ae7e17f79e696e94da3c63f9fe0051287fe9193bec282a03f99. Dec 15 03:47:53 localhost systemd[1]: Started /usr/bin/podman healthcheck run ac45612fab357b615b4684dbfa5bd000dd29eb1adfbaedb6db0802fc49e89549. Dec 15 03:47:53 localhost systemd[1]: Started /usr/bin/podman healthcheck run d6041e5471624a495d5be42684f6049ccf71802a80d5760239330151d465d146. Dec 15 03:47:53 localhost podman[96341]: 2025-12-15 08:47:53.773179925 +0000 UTC m=+0.094782607 container health_status 2649c3aec9af2f04509e1d248b1050b202fccc3ed2b86421c89a2a65376db057 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, build-date=2025-11-18T23:44:13Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, version=17.1.12, architecture=x86_64, name=rhosp17/openstack-iscsid, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, container_name=iscsid, com.redhat.component=openstack-iscsid-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, batch=17.1_20251118.1, io.openshift.expose-services=, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '182e509007ab5e6e5b2500a552cbd5ba'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, url=https://www.redhat.com, release=1761123044, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, config_id=tripleo_step3, distribution-scope=public, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 iscsid, konflux.additional-tags=17.1.12 17.1_20251118.1) Dec 15 03:47:53 localhost podman[96342]: 2025-12-15 08:47:53.827971261 +0000 UTC m=+0.147355283 container health_status 36a88083ed613f6871d2631ae462bca308a45ba583333f7cfe4d0b0c40a18ba5 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, tcib_managed=true, build-date=2025-11-19T00:36:58Z, com.redhat.component=openstack-nova-compute-container, architecture=x86_64, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, vendor=Red Hat, Inc., url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, io.buildah.version=1.41.4, maintainer=OpenStack TripleO Team, 
distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, config_id=tripleo_step5, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 nova-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '182e509007ab5e6e5b2500a552cbd5ba-879500e96bf8dfb93687004bd86f2317'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, container_name=nova_compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, 
io.openshift.expose-services=, managed_by=tripleo_ansible, name=rhosp17/openstack-nova-compute, version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, release=1761123044, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d) Dec 15 03:47:53 localhost podman[96348]: 2025-12-15 08:47:53.802214442 +0000 UTC m=+0.107805335 container health_status ac45612fab357b615b4684dbfa5bd000dd29eb1adfbaedb6db0802fc49e89549 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, url=https://www.redhat.com, version=17.1.12, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, container_name=logrotate_crond, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.12 17.1_20251118.1, name=rhosp17/openstack-cron, com.redhat.component=openstack-cron-container, release=1761123044, description=Red Hat OpenStack Platform 17.1 cron, summary=Red Hat OpenStack Platform 17.1 cron, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, build-date=2025-11-18T22:49:32Z, vendor=Red Hat, Inc., config_id=tripleo_step4, distribution-scope=public, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 
'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, vcs-type=git, io.openshift.expose-services=, architecture=x86_64, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a) Dec 15 03:47:53 localhost podman[96355]: 2025-12-15 08:47:53.891017408 +0000 UTC m=+0.201327387 container health_status d6041e5471624a495d5be42684f6049ccf71802a80d5760239330151d465d146 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, vcs-type=git, vendor=Red Hat, Inc., io.buildah.version=1.41.4, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, maintainer=OpenStack TripleO Team, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'fcee5a4a91f85471fca7b61211375646'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, com.redhat.component=openstack-ceilometer-compute-container, config_id=tripleo_step4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, release=1761123044, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=ceilometer_agent_compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, name=rhosp17/openstack-ceilometer-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, version=17.1.12, io.openshift.expose-services=, batch=17.1_20251118.1, build-date=2025-11-19T00:11:48Z, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, distribution-scope=public, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, architecture=x86_64) Dec 15 03:47:53 localhost podman[96341]: 2025-12-15 08:47:53.904776546 +0000 UTC m=+0.226379288 container exec_died 2649c3aec9af2f04509e1d248b1050b202fccc3ed2b86421c89a2a65376db057 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, vendor=Red Hat, Inc., io.openshift.expose-services=, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack 
Platform 17.1 iscsid, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, vcs-type=git, config_id=tripleo_step3, managed_by=tripleo_ansible, io.buildah.version=1.41.4, url=https://www.redhat.com, batch=17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, summary=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.component=openstack-iscsid-container, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, release=1761123044, konflux.additional-tags=17.1.12 17.1_20251118.1, version=17.1.12, name=rhosp17/openstack-iscsid, architecture=x86_64, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '182e509007ab5e6e5b2500a552cbd5ba'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, description=Red Hat OpenStack Platform 17.1 iscsid, maintainer=OpenStack 
TripleO Team, build-date=2025-11-18T23:44:13Z, tcib_managed=true, container_name=iscsid, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI) Dec 15 03:47:53 localhost podman[96342]: 2025-12-15 08:47:53.918472392 +0000 UTC m=+0.237856454 container exec_died 36a88083ed613f6871d2631ae462bca308a45ba583333f7cfe4d0b0c40a18ba5 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, summary=Red Hat OpenStack Platform 17.1 nova-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, io.buildah.version=1.41.4, url=https://www.redhat.com, batch=17.1_20251118.1, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, version=17.1.12, container_name=nova_compute, release=1761123044, distribution-scope=public, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, description=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.component=openstack-nova-compute-container, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., name=rhosp17/openstack-nova-compute, config_id=tripleo_step5, build-date=2025-11-19T00:36:58Z, konflux.additional-tags=17.1.12 17.1_20251118.1, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '182e509007ab5e6e5b2500a552cbd5ba-879500e96bf8dfb93687004bd86f2317'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 
'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}) Dec 15 03:47:53 localhost systemd[1]: 36a88083ed613f6871d2631ae462bca308a45ba583333f7cfe4d0b0c40a18ba5.service: Deactivated successfully. 
Dec 15 03:47:53 localhost podman[96343]: 2025-12-15 08:47:53.937533352 +0000 UTC m=+0.250558444 container health_status 97115ff7c5a84ae7e17f79e696e94da3c63f9fe0051287fe9193bec282a03f99 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, io.buildah.version=1.41.4, url=https://www.redhat.com, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, container_name=ceilometer_agent_ipmi, vendor=Red Hat, Inc., tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, build-date=2025-11-19T00:12:45Z, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, batch=17.1_20251118.1, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, version=17.1.12, architecture=x86_64, io.openshift.expose-services=, managed_by=tripleo_ansible, name=rhosp17/openstack-ceilometer-ipmi, konflux.additional-tags=17.1.12 17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, maintainer=OpenStack TripleO Team, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'fcee5a4a91f85471fca7b61211375646'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, config_id=tripleo_step4, com.redhat.component=openstack-ceilometer-ipmi-container, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1761123044) Dec 15 03:47:53 localhost podman[96355]: 2025-12-15 08:47:53.949334108 +0000 UTC m=+0.259644077 container exec_died d6041e5471624a495d5be42684f6049ccf71802a80d5760239330151d465d146 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'fcee5a4a91f85471fca7b61211375646'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-type=git, tcib_managed=true, url=https://www.redhat.com, name=rhosp17/openstack-ceilometer-compute, managed_by=tripleo_ansible, batch=17.1_20251118.1, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, container_name=ceilometer_agent_compute, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, version=17.1.12, vendor=Red Hat, Inc., architecture=x86_64, konflux.additional-tags=17.1.12 17.1_20251118.1, config_id=tripleo_step4, io.buildah.version=1.41.4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, maintainer=OpenStack TripleO Team, release=1761123044, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.expose-services=, build-date=2025-11-19T00:11:48Z, com.redhat.component=openstack-ceilometer-compute-container) Dec 15 03:47:53 localhost systemd[1]: d6041e5471624a495d5be42684f6049ccf71802a80d5760239330151d465d146.service: Deactivated successfully. Dec 15 03:47:53 localhost systemd[1]: 2649c3aec9af2f04509e1d248b1050b202fccc3ed2b86421c89a2a65376db057.service: Deactivated successfully. 
Dec 15 03:47:53 localhost podman[96343]: 2025-12-15 08:47:53.993373386 +0000 UTC m=+0.306398498 container exec_died 97115ff7c5a84ae7e17f79e696e94da3c63f9fe0051287fe9193bec282a03f99 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, vendor=Red Hat, Inc., batch=17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, version=17.1.12, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'fcee5a4a91f85471fca7b61211375646'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, com.redhat.component=openstack-ceilometer-ipmi-container, tcib_managed=true, release=1761123044, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, url=https://www.redhat.com, container_name=ceilometer_agent_ipmi, vcs-type=git, config_id=tripleo_step4, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, architecture=x86_64, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, distribution-scope=public, konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible, name=rhosp17/openstack-ceilometer-ipmi, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, build-date=2025-11-19T00:12:45Z) Dec 15 03:47:54 localhost systemd[1]: 97115ff7c5a84ae7e17f79e696e94da3c63f9fe0051287fe9193bec282a03f99.service: Deactivated successfully. Dec 15 03:47:54 localhost podman[96340]: 2025-12-15 08:47:54.046050835 +0000 UTC m=+0.367637286 container health_status 165a6fd1f2b629d16da0798ec1c9bac41d302d530a0ffa668b2315a52d6aa04e (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, batch=17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, version=17.1.12, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.buildah.version=1.41.4, maintainer=OpenStack TripleO Team, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 collectd, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, managed_by=tripleo_ansible, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.component=openstack-collectd-container, description=Red Hat OpenStack Platform 17.1 collectd, vendor=Red Hat, Inc., build-date=2025-11-18T22:51:28Z, distribution-scope=public, container_name=collectd, release=1761123044, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, name=rhosp17/openstack-collectd, config_id=tripleo_step3) Dec 15 03:47:54 localhost podman[96348]: 2025-12-15 08:47:54.093507494 +0000 UTC m=+0.399098387 container exec_died ac45612fab357b615b4684dbfa5bd000dd29eb1adfbaedb6db0802fc49e89549 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, tcib_managed=true, 
com.redhat.component=openstack-cron-container, release=1761123044, description=Red Hat OpenStack Platform 17.1 cron, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, architecture=x86_64, vendor=Red Hat, Inc., vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, distribution-scope=public, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, managed_by=tripleo_ansible, version=17.1.12, vcs-type=git, build-date=2025-11-18T22:49:32Z, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, name=rhosp17/openstack-cron, summary=Red Hat OpenStack Platform 17.1 cron, batch=17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 17.1_20251118.1, io.buildah.version=1.41.4, 
container_name=logrotate_crond, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, config_id=tripleo_step4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron) Dec 15 03:47:54 localhost systemd[1]: ac45612fab357b615b4684dbfa5bd000dd29eb1adfbaedb6db0802fc49e89549.service: Deactivated successfully. Dec 15 03:47:54 localhost podman[96340]: 2025-12-15 08:47:54.111750273 +0000 UTC m=+0.433336754 container exec_died 165a6fd1f2b629d16da0798ec1c9bac41d302d530a0ffa668b2315a52d6aa04e (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, version=17.1.12, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, container_name=collectd, vcs-type=git, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', 
'/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, io.buildah.version=1.41.4, batch=17.1_20251118.1, build-date=2025-11-18T22:51:28Z, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, tcib_managed=true, release=1761123044, com.redhat.component=openstack-collectd-container, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 collectd, description=Red Hat OpenStack Platform 17.1 collectd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, distribution-scope=public, konflux.additional-tags=17.1.12 17.1_20251118.1, config_id=tripleo_step3, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.expose-services=, name=rhosp17/openstack-collectd) Dec 15 03:47:54 localhost systemd[1]: 165a6fd1f2b629d16da0798ec1c9bac41d302d530a0ffa668b2315a52d6aa04e.service: Deactivated successfully. Dec 15 03:47:57 localhost systemd[1]: Started /usr/bin/podman healthcheck run 4c3f871f4bdf287b1d3ce717956c1cf130617489217410eadf6fffcc440eb3b0. Dec 15 03:47:57 localhost systemd[1]: tmp-crun.qwG1rB.mount: Deactivated successfully. 
Dec 15 03:47:57 localhost podman[96555]: 2025-12-15 08:47:57.751262415 +0000 UTC m=+0.087095751 container health_status 4c3f871f4bdf287b1d3ce717956c1cf130617489217410eadf6fffcc440eb3b0 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, io.openshift.expose-services=, release=1761123044, url=https://www.redhat.com, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, container_name=nova_migration_target, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, build-date=2025-11-19T00:36:58Z, name=rhosp17/openstack-nova-compute, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.12, com.redhat.component=openstack-nova-compute-container, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '879500e96bf8dfb93687004bd86f2317'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', 
'/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20251118.1, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, config_id=tripleo_step4, vcs-type=git, tcib_managed=true, distribution-scope=public, architecture=x86_64, io.buildah.version=1.41.4, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute) Dec 15 03:47:58 localhost podman[96555]: 2025-12-15 08:47:58.126306618 +0000 UTC m=+0.462139924 container exec_died 4c3f871f4bdf287b1d3ce717956c1cf130617489217410eadf6fffcc440eb3b0 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, config_id=tripleo_step4, com.redhat.component=openstack-nova-compute-container, architecture=x86_64, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, release=1761123044, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '879500e96bf8dfb93687004bd86f2317'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp17/openstack-nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2025-11-19T00:36:58Z, version=17.1.12, io.openshift.expose-services=, tcib_managed=true, url=https://www.redhat.com, distribution-scope=public, batch=17.1_20251118.1, container_name=nova_migration_target, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI) Dec 15 03:47:58 localhost systemd[1]: 4c3f871f4bdf287b1d3ce717956c1cf130617489217410eadf6fffcc440eb3b0.service: Deactivated successfully. Dec 15 03:48:00 localhost systemd[1]: Started /usr/bin/podman healthcheck run 2df8d20c6ba32f38bd0e6519c9c77a4146b632f21c81197d8c319b223aad81f1. Dec 15 03:48:00 localhost systemd[1]: Started /usr/bin/podman healthcheck run 4d35b7dc3f3a4e3c60661e85d3511f50804e87244d74cab67af5c6e8c447d379. 
Dec 15 03:48:00 localhost podman[96579]: 2025-12-15 08:48:00.749575705 +0000 UTC m=+0.071024680 container health_status 4d35b7dc3f3a4e3c60661e85d3511f50804e87244d74cab67af5c6e8c447d379 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, managed_by=tripleo_ansible, vcs-type=git, version=17.1.12, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'a56a6f14b467cd9064e40c03defa5ed7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, url=https://www.redhat.com, container_name=ovn_metadata_agent, summary=Red Hat OpenStack Platform 17.1 
neutron-metadata-agent-ovn, io.buildah.version=1.41.4, batch=17.1_20251118.1, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, maintainer=OpenStack TripleO Team, release=1761123044, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_id=tripleo_step4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, konflux.additional-tags=17.1.12 17.1_20251118.1, name=rhosp17/openstack-neutron-metadata-agent-ovn, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, tcib_managed=true, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, build-date=2025-11-19T00:14:25Z, distribution-scope=public, io.openshift.expose-services=) Dec 15 03:48:00 localhost podman[96579]: 2025-12-15 08:48:00.765518062 +0000 UTC m=+0.086967027 container exec_died 4d35b7dc3f3a4e3c60661e85d3511f50804e87244d74cab67af5c6e8c447d379 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, vcs-type=git, config_id=tripleo_step4, managed_by=tripleo_ansible, architecture=x86_64, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, konflux.additional-tags=17.1.12 17.1_20251118.1, version=17.1.12, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'a56a6f14b467cd9064e40c03defa5ed7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 
'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, vendor=Red Hat, Inc., org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1761123044, container_name=ovn_metadata_agent, distribution-scope=public, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, url=https://www.redhat.com, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-19T00:14:25Z, batch=17.1_20251118.1, 
name=rhosp17/openstack-neutron-metadata-agent-ovn, io.buildah.version=1.41.4) Dec 15 03:48:00 localhost podman[96579]: unhealthy Dec 15 03:48:00 localhost systemd[1]: 4d35b7dc3f3a4e3c60661e85d3511f50804e87244d74cab67af5c6e8c447d379.service: Main process exited, code=exited, status=1/FAILURE Dec 15 03:48:00 localhost systemd[1]: 4d35b7dc3f3a4e3c60661e85d3511f50804e87244d74cab67af5c6e8c447d379.service: Failed with result 'exit-code'. Dec 15 03:48:00 localhost podman[96578]: 2025-12-15 08:48:00.819626779 +0000 UTC m=+0.143500089 container health_status 2df8d20c6ba32f38bd0e6519c9c77a4146b632f21c81197d8c319b223aad81f1 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=unhealthy, com.redhat.component=openstack-ovn-controller-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1, name=rhosp17/openstack-ovn-controller, summary=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, description=Red Hat OpenStack Platform 17.1 ovn-controller, build-date=2025-11-18T23:34:05Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, version=17.1.12, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 
'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, container_name=ovn_controller, io.buildah.version=1.41.4, release=1761123044, managed_by=tripleo_ansible, url=https://www.redhat.com, config_id=tripleo_step4, vcs-type=git) Dec 15 03:48:00 localhost podman[96578]: 2025-12-15 08:48:00.864488119 +0000 UTC m=+0.188361429 container exec_died 2df8d20c6ba32f38bd0e6519c9c77a4146b632f21c81197d8c319b223aad81f1 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, distribution-scope=public, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, description=Red Hat OpenStack Platform 17.1 ovn-controller, container_name=ovn_controller, release=1761123044, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, architecture=x86_64, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, io.openshift.expose-services=, managed_by=tripleo_ansible, build-date=2025-11-18T23:34:05Z, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.12 17.1_20251118.1, name=rhosp17/openstack-ovn-controller, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 ovn-controller, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, vcs-type=git, version=17.1.12, io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, batch=17.1_20251118.1, config_id=tripleo_step4, tcib_managed=true, com.redhat.component=openstack-ovn-controller-container) Dec 15 03:48:00 localhost podman[96578]: unhealthy Dec 15 03:48:00 localhost systemd[1]: 2df8d20c6ba32f38bd0e6519c9c77a4146b632f21c81197d8c319b223aad81f1.service: Main process exited, code=exited, status=1/FAILURE Dec 15 03:48:00 localhost systemd[1]: 2df8d20c6ba32f38bd0e6519c9c77a4146b632f21c81197d8c319b223aad81f1.service: Failed with result 'exit-code'. Dec 15 03:48:09 localhost systemd[1]: Started /usr/bin/podman healthcheck run 6278d648aec9c9f12e0efaafa0d6ae5653eeb34459516699e562eeb9c8d23b3c. Dec 15 03:48:09 localhost systemd[1]: tmp-crun.0M9VhM.mount: Deactivated successfully. 
Dec 15 03:48:09 localhost podman[96617]: 2025-12-15 08:48:09.755435887 +0000 UTC m=+0.092852105 container health_status 6278d648aec9c9f12e0efaafa0d6ae5653eeb34459516699e562eeb9c8d23b3c (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, architecture=x86_64, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, konflux.additional-tags=17.1.12 17.1_20251118.1, config_id=tripleo_step1, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp17/openstack-qdrouterd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, url=https://www.redhat.com, version=17.1.12, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.4, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, vendor=Red Hat, Inc., managed_by=tripleo_ansible, release=1761123044, build-date=2025-11-18T22:49:46Z, container_name=metrics_qdr, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.component=openstack-qdrouterd-container, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '29d07da103bbd44a9ed3e29999314b03'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, summary=Red Hat OpenStack Platform 17.1 qdrouterd, batch=17.1_20251118.1) Dec 15 03:48:09 localhost podman[96617]: 2025-12-15 08:48:09.940480517 +0000 UTC m=+0.277896685 container exec_died 6278d648aec9c9f12e0efaafa0d6ae5653eeb34459516699e562eeb9c8d23b3c (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, managed_by=tripleo_ansible, io.openshift.expose-services=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '29d07da103bbd44a9ed3e29999314b03'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, release=1761123044, name=rhosp17/openstack-qdrouterd, tcib_managed=true, build-date=2025-11-18T22:49:46Z, batch=17.1_20251118.1, container_name=metrics_qdr, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, vendor=Red Hat, Inc., version=17.1.12, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 qdrouterd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, url=https://www.redhat.com, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_id=tripleo_step1, description=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.component=openstack-qdrouterd-container, io.buildah.version=1.41.4) Dec 15 03:48:09 localhost systemd[1]: 6278d648aec9c9f12e0efaafa0d6ae5653eeb34459516699e562eeb9c8d23b3c.service: Deactivated successfully. Dec 15 03:48:24 localhost systemd[1]: Started /usr/bin/podman healthcheck run 165a6fd1f2b629d16da0798ec1c9bac41d302d530a0ffa668b2315a52d6aa04e. Dec 15 03:48:24 localhost systemd[1]: Started /usr/bin/podman healthcheck run 2649c3aec9af2f04509e1d248b1050b202fccc3ed2b86421c89a2a65376db057. Dec 15 03:48:24 localhost systemd[1]: Started /usr/bin/podman healthcheck run 36a88083ed613f6871d2631ae462bca308a45ba583333f7cfe4d0b0c40a18ba5. Dec 15 03:48:24 localhost systemd[1]: Started /usr/bin/podman healthcheck run 97115ff7c5a84ae7e17f79e696e94da3c63f9fe0051287fe9193bec282a03f99. 
Dec 15 03:48:24 localhost systemd[1]: Started /usr/bin/podman healthcheck run ac45612fab357b615b4684dbfa5bd000dd29eb1adfbaedb6db0802fc49e89549. Dec 15 03:48:24 localhost systemd[1]: Started /usr/bin/podman healthcheck run d6041e5471624a495d5be42684f6049ccf71802a80d5760239330151d465d146. Dec 15 03:48:24 localhost podman[96646]: 2025-12-15 08:48:24.776083988 +0000 UTC m=+0.101966608 container health_status 165a6fd1f2b629d16da0798ec1c9bac41d302d530a0ffa668b2315a52d6aa04e (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, io.openshift.expose-services=, vcs-type=git, release=1761123044, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, name=rhosp17/openstack-collectd, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_id=tripleo_step3, io.buildah.version=1.41.4, description=Red Hat OpenStack Platform 17.1 collectd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, architecture=x86_64, konflux.additional-tags=17.1.12 17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, version=17.1.12, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, com.redhat.component=openstack-collectd-container, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, url=https://www.redhat.com, tcib_managed=true, vendor=Red Hat, Inc., batch=17.1_20251118.1, build-date=2025-11-18T22:51:28Z, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 collectd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, container_name=collectd, distribution-scope=public) Dec 15 03:48:24 localhost podman[96646]: 2025-12-15 08:48:24.786275611 +0000 UTC m=+0.112158211 container exec_died 165a6fd1f2b629d16da0798ec1c9bac41d302d530a0ffa668b2315a52d6aa04e (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, release=1761123044, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, vendor=Red Hat, Inc., vcs-type=git, config_id=tripleo_step3, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, summary=Red Hat OpenStack Platform 17.1 collectd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.buildah.version=1.41.4, managed_by=tripleo_ansible, batch=17.1_20251118.1, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 
'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.12, konflux.additional-tags=17.1.12 17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 collectd, com.redhat.component=openstack-collectd-container, url=https://www.redhat.com, build-date=2025-11-18T22:51:28Z, name=rhosp17/openstack-collectd, container_name=collectd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, tcib_managed=true, architecture=x86_64) Dec 15 03:48:24 localhost systemd[1]: 
165a6fd1f2b629d16da0798ec1c9bac41d302d530a0ffa668b2315a52d6aa04e.service: Deactivated successfully. Dec 15 03:48:24 localhost systemd[1]: tmp-crun.R4Ug8w.mount: Deactivated successfully. Dec 15 03:48:24 localhost podman[96660]: 2025-12-15 08:48:24.880755458 +0000 UTC m=+0.188846023 container health_status d6041e5471624a495d5be42684f6049ccf71802a80d5760239330151d465d146 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, container_name=ceilometer_agent_compute, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, build-date=2025-11-19T00:11:48Z, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, batch=17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, managed_by=tripleo_ansible, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, vcs-type=git, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, com.redhat.component=openstack-ceilometer-compute-container, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_id=tripleo_step4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, release=1761123044, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'fcee5a4a91f85471fca7b61211375646'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, distribution-scope=public, version=17.1.12, io.buildah.version=1.41.4, name=rhosp17/openstack-ceilometer-compute, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 ceilometer-compute) Dec 15 03:48:24 localhost podman[96660]: 2025-12-15 08:48:24.936392296 +0000 UTC m=+0.244482831 container exec_died d6041e5471624a495d5be42684f6049ccf71802a80d5760239330151d465d146 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, managed_by=tripleo_ansible, config_id=tripleo_step4, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, com.redhat.component=openstack-ceilometer-compute-container, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-type=git, io.openshift.expose-services=, release=1761123044, maintainer=OpenStack TripleO Team, distribution-scope=public, build-date=2025-11-19T00:11:48Z, version=17.1.12, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, konflux.additional-tags=17.1.12 17.1_20251118.1, 
org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, url=https://www.redhat.com, batch=17.1_20251118.1, container_name=ceilometer_agent_compute, architecture=x86_64, io.buildah.version=1.41.4, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'fcee5a4a91f85471fca7b61211375646'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, name=rhosp17/openstack-ceilometer-compute) Dec 15 03:48:24 localhost systemd[1]: d6041e5471624a495d5be42684f6049ccf71802a80d5760239330151d465d146.service: Deactivated successfully. 
Dec 15 03:48:24 localhost podman[96647]: 2025-12-15 08:48:24.84721162 +0000 UTC m=+0.165856148 container health_status 2649c3aec9af2f04509e1d248b1050b202fccc3ed2b86421c89a2a65376db057 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, vcs-type=git, tcib_managed=true, version=17.1.12, io.buildah.version=1.41.4, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 iscsid, architecture=x86_64, name=rhosp17/openstack-iscsid, build-date=2025-11-18T23:44:13Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, com.redhat.component=openstack-iscsid-container, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, config_id=tripleo_step3, container_name=iscsid, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, release=1761123044, batch=17.1_20251118.1, konflux.additional-tags=17.1.12 17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '182e509007ab5e6e5b2500a552cbd5ba'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}) Dec 15 03:48:24 localhost podman[96647]: 2025-12-15 08:48:24.981259897 +0000 UTC m=+0.299904344 container exec_died 2649c3aec9af2f04509e1d248b1050b202fccc3ed2b86421c89a2a65376db057 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, tcib_managed=true, version=17.1.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, distribution-scope=public, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, vendor=Red Hat, Inc., org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, build-date=2025-11-18T23:44:13Z, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, vcs-type=git, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, release=1761123044, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, container_name=iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '182e509007ab5e6e5b2500a552cbd5ba'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, io.buildah.version=1.41.4, batch=17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, managed_by=tripleo_ansible, name=rhosp17/openstack-iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.component=openstack-iscsid-container, config_id=tripleo_step3, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com) Dec 15 03:48:24 localhost podman[96655]: 2025-12-15 08:48:24.937972999 +0000 UTC m=+0.248650393 container health_status ac45612fab357b615b4684dbfa5bd000dd29eb1adfbaedb6db0802fc49e89549 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, description=Red Hat OpenStack Platform 17.1 cron, release=1761123044, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, url=https://www.redhat.com, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 
'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, architecture=x86_64, batch=17.1_20251118.1, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, io.buildah.version=1.41.4, com.redhat.component=openstack-cron-container, container_name=logrotate_crond, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, version=17.1.12, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 cron, config_id=tripleo_step4, build-date=2025-11-18T22:49:32Z, name=rhosp17/openstack-cron) Dec 15 03:48:24 localhost systemd[1]: 2649c3aec9af2f04509e1d248b1050b202fccc3ed2b86421c89a2a65376db057.service: Deactivated successfully. 
Dec 15 03:48:25 localhost podman[96653]: 2025-12-15 08:48:24.984600386 +0000 UTC m=+0.298945048 container health_status 97115ff7c5a84ae7e17f79e696e94da3c63f9fe0051287fe9193bec282a03f99 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, batch=17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.component=openstack-ceilometer-ipmi-container, io.openshift.expose-services=, managed_by=tripleo_ansible, vcs-type=git, architecture=x86_64, config_id=tripleo_step4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, container_name=ceilometer_agent_ipmi, url=https://www.redhat.com, release=1761123044, name=rhosp17/openstack-ceilometer-ipmi, build-date=2025-11-19T00:12:45Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'fcee5a4a91f85471fca7b61211375646'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, version=17.1.12) Dec 15 03:48:25 localhost podman[96648]: 2025-12-15 08:48:25.043317277 +0000 UTC m=+0.360675509 container health_status 36a88083ed613f6871d2631ae462bca308a45ba583333f7cfe4d0b0c40a18ba5 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, maintainer=OpenStack TripleO Team, release=1761123044, tcib_managed=true, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, vcs-type=git, com.redhat.component=openstack-nova-compute-container, description=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, url=https://www.redhat.com, io.openshift.expose-services=, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, config_id=tripleo_step5, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '182e509007ab5e6e5b2500a552cbd5ba-879500e96bf8dfb93687004bd86f2317'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 
'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, container_name=nova_compute, build-date=2025-11-19T00:36:58Z, batch=17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, version=17.1.12, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.buildah.version=1.41.4, name=rhosp17/openstack-nova-compute, vendor=Red Hat, Inc., konflux.additional-tags=17.1.12 17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 nova-compute) Dec 15 03:48:25 localhost podman[96653]: 2025-12-15 
08:48:25.067430102 +0000 UTC m=+0.381774764 container exec_died 97115ff7c5a84ae7e17f79e696e94da3c63f9fe0051287fe9193bec282a03f99 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.component=openstack-ceilometer-ipmi-container, batch=17.1_20251118.1, vcs-type=git, build-date=2025-11-19T00:12:45Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, managed_by=tripleo_ansible, container_name=ceilometer_agent_ipmi, io.buildah.version=1.41.4, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step4, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, release=1761123044, distribution-scope=public, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'fcee5a4a91f85471fca7b61211375646'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, 
summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.expose-services=, name=rhosp17/openstack-ceilometer-ipmi, version=17.1.12, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, tcib_managed=true, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, vendor=Red Hat, Inc.) Dec 15 03:48:25 localhost systemd[1]: 97115ff7c5a84ae7e17f79e696e94da3c63f9fe0051287fe9193bec282a03f99.service: Deactivated successfully. Dec 15 03:48:25 localhost podman[96648]: 2025-12-15 08:48:25.117887192 +0000 UTC m=+0.435245454 container exec_died 36a88083ed613f6871d2631ae462bca308a45ba583333f7cfe4d0b0c40a18ba5 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, build-date=2025-11-19T00:36:58Z, version=17.1.12, container_name=nova_compute, name=rhosp17/openstack-nova-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-type=git, vendor=Red Hat, Inc., distribution-scope=public, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, url=https://www.redhat.com, tcib_managed=true, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '182e509007ab5e6e5b2500a552cbd5ba-879500e96bf8dfb93687004bd86f2317'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 
'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, architecture=x86_64, io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, io.openshift.expose-services=, com.redhat.component=openstack-nova-compute-container, config_id=tripleo_step5, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, release=1761123044, description=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20251118.1, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 nova-compute) Dec 15 03:48:25 localhost podman[96655]: 2025-12-15 08:48:25.124239312 +0000 UTC m=+0.434916746 
container exec_died ac45612fab357b615b4684dbfa5bd000dd29eb1adfbaedb6db0802fc49e89549 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, build-date=2025-11-18T22:49:32Z, summary=Red Hat OpenStack Platform 17.1 cron, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, container_name=logrotate_crond, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, description=Red Hat OpenStack Platform 17.1 cron, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, io.openshift.expose-services=, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, tcib_managed=true, release=1761123044, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, batch=17.1_20251118.1, name=rhosp17/openstack-cron, 
distribution-scope=public, io.buildah.version=1.41.4, com.redhat.component=openstack-cron-container, managed_by=tripleo_ansible, konflux.additional-tags=17.1.12 17.1_20251118.1, version=17.1.12, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., config_id=tripleo_step4) Dec 15 03:48:25 localhost systemd[1]: 36a88083ed613f6871d2631ae462bca308a45ba583333f7cfe4d0b0c40a18ba5.service: Deactivated successfully. Dec 15 03:48:25 localhost systemd[1]: ac45612fab357b615b4684dbfa5bd000dd29eb1adfbaedb6db0802fc49e89549.service: Deactivated successfully. Dec 15 03:48:28 localhost systemd[1]: Started /usr/bin/podman healthcheck run 4c3f871f4bdf287b1d3ce717956c1cf130617489217410eadf6fffcc440eb3b0. Dec 15 03:48:28 localhost podman[96779]: 2025-12-15 08:48:28.713224713 +0000 UTC m=+0.051809786 container health_status 4c3f871f4bdf287b1d3ce717956c1cf130617489217410eadf6fffcc440eb3b0 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, config_id=tripleo_step4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, distribution-scope=public, architecture=x86_64, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.41.4, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '879500e96bf8dfb93687004bd86f2317'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', 
'/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, batch=17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, release=1761123044, url=https://www.redhat.com, container_name=nova_migration_target, name=rhosp17/openstack-nova-compute, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, version=17.1.12, tcib_managed=true, build-date=2025-11-19T00:36:58Z, com.redhat.component=openstack-nova-compute-container) Dec 15 03:48:29 localhost podman[96779]: 2025-12-15 08:48:29.076317177 +0000 UTC m=+0.414902320 container exec_died 4c3f871f4bdf287b1d3ce717956c1cf130617489217410eadf6fffcc440eb3b0 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '879500e96bf8dfb93687004bd86f2317'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 
'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.component=openstack-nova-compute-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.12, container_name=nova_migration_target, batch=17.1_20251118.1, build-date=2025-11-19T00:36:58Z, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, config_id=tripleo_step4, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., io.buildah.version=1.41.4, architecture=x86_64, url=https://www.redhat.com, io.openshift.expose-services=, name=rhosp17/openstack-nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, konflux.additional-tags=17.1.12 17.1_20251118.1, distribution-scope=public, tcib_managed=true, 
io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, release=1761123044) Dec 15 03:48:29 localhost systemd[1]: 4c3f871f4bdf287b1d3ce717956c1cf130617489217410eadf6fffcc440eb3b0.service: Deactivated successfully. Dec 15 03:48:31 localhost systemd[1]: Started /usr/bin/podman healthcheck run 2df8d20c6ba32f38bd0e6519c9c77a4146b632f21c81197d8c319b223aad81f1. Dec 15 03:48:31 localhost systemd[1]: Started /usr/bin/podman healthcheck run 4d35b7dc3f3a4e3c60661e85d3511f50804e87244d74cab67af5c6e8c447d379. Dec 15 03:48:31 localhost systemd[1]: tmp-crun.CSFb0t.mount: Deactivated successfully. Dec 15 03:48:31 localhost podman[96801]: 2025-12-15 08:48:31.763061973 +0000 UTC m=+0.096397000 container health_status 2df8d20c6ba32f38bd0e6519c9c77a4146b632f21c81197d8c319b223aad81f1 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=unhealthy, build-date=2025-11-18T23:34:05Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-type=git, io.openshift.expose-services=, config_id=tripleo_step4, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, url=https://www.redhat.com, container_name=ovn_controller, vendor=Red Hat, Inc., release=1761123044, com.redhat.component=openstack-ovn-controller-container, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', 
'/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, architecture=x86_64, batch=17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, managed_by=tripleo_ansible, name=rhosp17/openstack-ovn-controller, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.4, tcib_managed=true, version=17.1.12, description=Red Hat OpenStack Platform 17.1 ovn-controller, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, distribution-scope=public, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, summary=Red Hat OpenStack Platform 17.1 ovn-controller) Dec 15 03:48:31 localhost podman[96801]: 2025-12-15 08:48:31.777422927 +0000 UTC m=+0.110757914 container exec_died 2df8d20c6ba32f38bd0e6519c9c77a4146b632f21c81197d8c319b223aad81f1 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, build-date=2025-11-18T23:34:05Z, version=17.1.12, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 ovn-controller, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, maintainer=OpenStack TripleO Team, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', 
'/var/log/containers/openvswitch:/var/log/ovn:z']}, name=rhosp17/openstack-ovn-controller, container_name=ovn_controller, tcib_managed=true, architecture=x86_64, vendor=Red Hat, Inc., release=1761123044, konflux.additional-tags=17.1.12 17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.component=openstack-ovn-controller-container, vcs-type=git, url=https://www.redhat.com, managed_by=tripleo_ansible, batch=17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, config_id=tripleo_step4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272) Dec 15 03:48:31 localhost podman[96801]: unhealthy Dec 15 03:48:31 localhost systemd[1]: 2df8d20c6ba32f38bd0e6519c9c77a4146b632f21c81197d8c319b223aad81f1.service: Main process exited, code=exited, status=1/FAILURE Dec 15 03:48:31 localhost systemd[1]: 2df8d20c6ba32f38bd0e6519c9c77a4146b632f21c81197d8c319b223aad81f1.service: Failed with result 'exit-code'. 
Dec 15 03:48:31 localhost podman[96802]: 2025-12-15 08:48:31.857391026 +0000 UTC m=+0.185523703 container health_status 4d35b7dc3f3a4e3c60661e85d3511f50804e87244d74cab67af5c6e8c447d379 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=unhealthy, build-date=2025-11-19T00:14:25Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, name=rhosp17/openstack-neutron-metadata-agent-ovn, config_id=tripleo_step4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1761123044, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, batch=17.1_20251118.1, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'a56a6f14b467cd9064e40c03defa5ed7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', 
'/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-type=git, container_name=ovn_metadata_agent, tcib_managed=true, architecture=x86_64, version=17.1.12, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, url=https://www.redhat.com, io.openshift.expose-services=, io.buildah.version=1.41.4) Dec 15 03:48:31 localhost podman[96802]: 2025-12-15 08:48:31.8982873 +0000 UTC m=+0.226420037 container exec_died 4d35b7dc3f3a4e3c60661e85d3511f50804e87244d74cab67af5c6e8c447d379 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, vcs-type=git, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'a56a6f14b467cd9064e40c03defa5ed7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': 
['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, konflux.additional-tags=17.1.12 17.1_20251118.1, release=1761123044, name=rhosp17/openstack-neutron-metadata-agent-ovn, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, version=17.1.12, maintainer=OpenStack TripleO Team, batch=17.1_20251118.1, architecture=x86_64, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vendor=Red Hat, Inc., io.buildah.version=1.41.4, build-date=2025-11-19T00:14:25Z, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, io.openshift.expose-services=, container_name=ovn_metadata_agent, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_id=tripleo_step4, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, io.openshift.tags=rhosp osp openstack osp-17.1 
openstack-neutron-metadata-agent-ovn, managed_by=tripleo_ansible) Dec 15 03:48:31 localhost podman[96802]: unhealthy Dec 15 03:48:31 localhost systemd[1]: 4d35b7dc3f3a4e3c60661e85d3511f50804e87244d74cab67af5c6e8c447d379.service: Main process exited, code=exited, status=1/FAILURE Dec 15 03:48:31 localhost systemd[1]: 4d35b7dc3f3a4e3c60661e85d3511f50804e87244d74cab67af5c6e8c447d379.service: Failed with result 'exit-code'. Dec 15 03:48:40 localhost systemd[1]: Started /usr/bin/podman healthcheck run 6278d648aec9c9f12e0efaafa0d6ae5653eeb34459516699e562eeb9c8d23b3c. Dec 15 03:48:40 localhost podman[96841]: 2025-12-15 08:48:40.741988834 +0000 UTC m=+0.073388504 container health_status 6278d648aec9c9f12e0efaafa0d6ae5653eeb34459516699e562eeb9c8d23b3c (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, name=rhosp17/openstack-qdrouterd, build-date=2025-11-18T22:49:46Z, version=17.1.12, com.redhat.component=openstack-qdrouterd-container, io.buildah.version=1.41.4, batch=17.1_20251118.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '29d07da103bbd44a9ed3e29999314b03'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, container_name=metrics_qdr, konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 qdrouterd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, release=1761123044, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 qdrouterd, tcib_managed=true, io.openshift.expose-services=, vcs-type=git, config_id=tripleo_step1) Dec 15 03:48:40 localhost podman[96841]: 2025-12-15 08:48:40.913332217 +0000 UTC m=+0.244731927 container exec_died 6278d648aec9c9f12e0efaafa0d6ae5653eeb34459516699e562eeb9c8d23b3c (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, maintainer=OpenStack TripleO Team, distribution-scope=public, url=https://www.redhat.com, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 qdrouterd, build-date=2025-11-18T22:49:46Z, container_name=metrics_qdr, 
managed_by=tripleo_ansible, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.component=openstack-qdrouterd-container, architecture=x86_64, batch=17.1_20251118.1, config_id=tripleo_step1, summary=Red Hat OpenStack Platform 17.1 qdrouterd, version=17.1.12, name=rhosp17/openstack-qdrouterd, vcs-type=git, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, tcib_managed=true, io.buildah.version=1.41.4, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '29d07da103bbd44a9ed3e29999314b03'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, release=1761123044, konflux.additional-tags=17.1.12 17.1_20251118.1) Dec 15 03:48:40 localhost systemd[1]: 6278d648aec9c9f12e0efaafa0d6ae5653eeb34459516699e562eeb9c8d23b3c.service: Deactivated successfully. 
Dec 15 03:48:55 localhost systemd[1]: Started /usr/bin/podman healthcheck run 165a6fd1f2b629d16da0798ec1c9bac41d302d530a0ffa668b2315a52d6aa04e. Dec 15 03:48:55 localhost systemd[1]: Started /usr/bin/podman healthcheck run 2649c3aec9af2f04509e1d248b1050b202fccc3ed2b86421c89a2a65376db057. Dec 15 03:48:55 localhost systemd[1]: Started /usr/bin/podman healthcheck run 36a88083ed613f6871d2631ae462bca308a45ba583333f7cfe4d0b0c40a18ba5. Dec 15 03:48:55 localhost systemd[1]: Started /usr/bin/podman healthcheck run 97115ff7c5a84ae7e17f79e696e94da3c63f9fe0051287fe9193bec282a03f99. Dec 15 03:48:55 localhost systemd[1]: Started /usr/bin/podman healthcheck run ac45612fab357b615b4684dbfa5bd000dd29eb1adfbaedb6db0802fc49e89549. Dec 15 03:48:55 localhost systemd[1]: Started /usr/bin/podman healthcheck run d6041e5471624a495d5be42684f6049ccf71802a80d5760239330151d465d146. Dec 15 03:48:55 localhost systemd[1]: tmp-crun.dpKW8l.mount: Deactivated successfully. Dec 15 03:48:55 localhost podman[96885]: 2025-12-15 08:48:55.767939835 +0000 UTC m=+0.083551866 container health_status ac45612fab357b615b4684dbfa5bd000dd29eb1adfbaedb6db0802fc49e89549 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, vcs-type=git, tcib_managed=true, version=17.1.12, distribution-scope=public, url=https://www.redhat.com, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 cron, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, batch=17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, release=1761123044, build-date=2025-11-18T22:49:32Z, config_id=tripleo_step4, com.redhat.component=openstack-cron-container, io.openshift.expose-services=, name=rhosp17/openstack-cron, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, managed_by=tripleo_ansible, 
vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vendor=Red Hat, Inc., config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, io.buildah.version=1.41.4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=logrotate_crond, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, summary=Red Hat OpenStack Platform 17.1 cron) Dec 15 03:48:55 localhost systemd[1]: tmp-crun.PnpYTK.mount: Deactivated successfully. 
Dec 15 03:48:55 localhost podman[96888]: 2025-12-15 08:48:55.783255145 +0000 UTC m=+0.090187844 container health_status d6041e5471624a495d5be42684f6049ccf71802a80d5760239330151d465d146 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, version=17.1.12, com.redhat.component=openstack-ceilometer-compute-container, name=rhosp17/openstack-ceilometer-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, distribution-scope=public, managed_by=tripleo_ansible, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'fcee5a4a91f85471fca7b61211375646'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, 
io.openshift.expose-services=, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, batch=17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.buildah.version=1.41.4, vendor=Red Hat, Inc., container_name=ceilometer_agent_compute, config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-type=git, url=https://www.redhat.com, tcib_managed=true, konflux.additional-tags=17.1.12 17.1_20251118.1, release=1761123044, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, build-date=2025-11-19T00:11:48Z, maintainer=OpenStack TripleO Team) Dec 15 03:48:55 localhost podman[96871]: 2025-12-15 08:48:55.821912149 +0000 UTC m=+0.142181124 container health_status 36a88083ed613f6871d2631ae462bca308a45ba583333f7cfe4d0b0c40a18ba5 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, vcs-type=git, name=rhosp17/openstack-nova-compute, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, build-date=2025-11-19T00:36:58Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_id=tripleo_step5, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '182e509007ab5e6e5b2500a552cbd5ba-879500e96bf8dfb93687004bd86f2317'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 
'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.component=openstack-nova-compute-container, io.openshift.expose-services=, managed_by=tripleo_ansible, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.4, summary=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.12, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, container_name=nova_compute, batch=17.1_20251118.1, release=1761123044, 
url=https://www.redhat.com) Dec 15 03:48:55 localhost podman[96888]: 2025-12-15 08:48:55.827858888 +0000 UTC m=+0.134791557 container exec_died d6041e5471624a495d5be42684f6049ccf71802a80d5760239330151d465d146 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, io.buildah.version=1.41.4, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, build-date=2025-11-19T00:11:48Z, name=rhosp17/openstack-ceilometer-compute, tcib_managed=true, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-type=git, url=https://www.redhat.com, release=1761123044, io.openshift.expose-services=, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, vendor=Red Hat, Inc., config_id=tripleo_step4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'fcee5a4a91f85471fca7b61211375646'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', 
'/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, version=17.1.12, com.redhat.component=openstack-ceilometer-compute-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, container_name=ceilometer_agent_compute, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, batch=17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, distribution-scope=public) Dec 15 03:48:55 localhost systemd[1]: d6041e5471624a495d5be42684f6049ccf71802a80d5760239330151d465d146.service: Deactivated successfully. Dec 15 03:48:55 localhost podman[96870]: 2025-12-15 08:48:55.864050076 +0000 UTC m=+0.188915104 container health_status 2649c3aec9af2f04509e1d248b1050b202fccc3ed2b86421c89a2a65376db057 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, batch=17.1_20251118.1, version=17.1.12, config_id=tripleo_step3, url=https://www.redhat.com, io.buildah.version=1.41.4, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '182e509007ab5e6e5b2500a552cbd5ba'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, tcib_managed=true, release=1761123044, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.component=openstack-iscsid-container, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.12 17.1_20251118.1, name=rhosp17/openstack-iscsid, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.expose-services=, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, vendor=Red Hat, Inc., build-date=2025-11-18T23:44:13Z, summary=Red Hat OpenStack Platform 17.1 iscsid, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, container_name=iscsid, managed_by=tripleo_ansible) Dec 15 03:48:55 localhost podman[96870]: 2025-12-15 08:48:55.870239462 +0000 UTC m=+0.195104490 container exec_died 2649c3aec9af2f04509e1d248b1050b202fccc3ed2b86421c89a2a65376db057 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, io.openshift.expose-services=, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, konflux.additional-tags=17.1.12 17.1_20251118.1, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, release=1761123044, batch=17.1_20251118.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '182e509007ab5e6e5b2500a552cbd5ba'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, description=Red Hat OpenStack Platform 17.1 iscsid, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, name=rhosp17/openstack-iscsid, tcib_managed=true, build-date=2025-11-18T23:44:13Z, io.buildah.version=1.41.4, managed_by=tripleo_ansible, container_name=iscsid, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, architecture=x86_64, vendor=Red Hat, Inc., 
com.redhat.component=openstack-iscsid-container, version=17.1.12, config_id=tripleo_step3, url=https://www.redhat.com, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 iscsid) Dec 15 03:48:55 localhost systemd[1]: 2649c3aec9af2f04509e1d248b1050b202fccc3ed2b86421c89a2a65376db057.service: Deactivated successfully. Dec 15 03:48:55 localhost podman[96877]: 2025-12-15 08:48:55.882284314 +0000 UTC m=+0.197407562 container health_status 97115ff7c5a84ae7e17f79e696e94da3c63f9fe0051287fe9193bec282a03f99 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=ceilometer_agent_ipmi, tcib_managed=true, config_id=tripleo_step4, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.expose-services=, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, architecture=x86_64, batch=17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, version=17.1.12, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'fcee5a4a91f85471fca7b61211375646'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, release=1761123044, build-date=2025-11-19T00:12:45Z, name=rhosp17/openstack-ceilometer-ipmi, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, url=https://www.redhat.com, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-type=git, com.redhat.component=openstack-ceilometer-ipmi-container, io.buildah.version=1.41.4, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi) Dec 15 03:48:55 localhost podman[96871]: 2025-12-15 08:48:55.894284925 +0000 UTC m=+0.214553900 container exec_died 36a88083ed613f6871d2631ae462bca308a45ba583333f7cfe4d0b0c40a18ba5 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, description=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.12, batch=17.1_20251118.1, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '182e509007ab5e6e5b2500a552cbd5ba-879500e96bf8dfb93687004bd86f2317'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, name=rhosp17/openstack-nova-compute, config_id=tripleo_step5, container_name=nova_compute, url=https://www.redhat.com, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, release=1761123044, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vendor=Red Hat, Inc., io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 nova-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-nova-compute-container, vcs-type=git, io.openshift.expose-services=, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, build-date=2025-11-19T00:36:58Z, 
konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64) Dec 15 03:48:55 localhost podman[96885]: 2025-12-15 08:48:55.906308517 +0000 UTC m=+0.221920578 container exec_died ac45612fab357b615b4684dbfa5bd000dd29eb1adfbaedb6db0802fc49e89549 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, release=1761123044, build-date=2025-11-18T22:49:32Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vendor=Red Hat, Inc., distribution-scope=public, url=https://www.redhat.com, vcs-type=git, maintainer=OpenStack 
TripleO Team, version=17.1.12, io.buildah.version=1.41.4, architecture=x86_64, tcib_managed=true, container_name=logrotate_crond, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, io.openshift.expose-services=, batch=17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 cron, config_id=tripleo_step4, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, description=Red Hat OpenStack Platform 17.1 cron, name=rhosp17/openstack-cron, com.redhat.component=openstack-cron-container) Dec 15 03:48:55 localhost systemd[1]: 36a88083ed613f6871d2631ae462bca308a45ba583333f7cfe4d0b0c40a18ba5.service: Deactivated successfully. Dec 15 03:48:55 localhost systemd[1]: ac45612fab357b615b4684dbfa5bd000dd29eb1adfbaedb6db0802fc49e89549.service: Deactivated successfully. 
Dec 15 03:48:55 localhost podman[96869]: 2025-12-15 08:48:55.981126598 +0000 UTC m=+0.309942652 container health_status 165a6fd1f2b629d16da0798ec1c9bac41d302d530a0ffa668b2315a52d6aa04e (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, release=1761123044, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, architecture=x86_64, build-date=2025-11-18T22:51:28Z, container_name=collectd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, com.redhat.component=openstack-collectd-container, summary=Red Hat OpenStack Platform 17.1 collectd, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, description=Red Hat OpenStack Platform 17.1 collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, tcib_managed=true, name=rhosp17/openstack-collectd, konflux.additional-tags=17.1.12 17.1_20251118.1, vendor=Red Hat, Inc., distribution-scope=public, batch=17.1_20251118.1, url=https://www.redhat.com, vcs-type=git, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, version=17.1.12, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_id=tripleo_step3, io.buildah.version=1.41.4, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, io.openshift.expose-services=, maintainer=OpenStack TripleO Team) Dec 15 03:48:55 localhost podman[96869]: 2025-12-15 08:48:55.993310124 +0000 UTC m=+0.322126228 container exec_died 165a6fd1f2b629d16da0798ec1c9bac41d302d530a0ffa668b2315a52d6aa04e (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, io.openshift.expose-services=, release=1761123044, config_id=tripleo_step3, name=rhosp17/openstack-collectd, version=17.1.12, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, url=https://www.redhat.com, architecture=x86_64, container_name=collectd, batch=17.1_20251118.1, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 collectd, com.redhat.component=openstack-collectd-container, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': 
'/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, description=Red Hat OpenStack Platform 17.1 collectd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, vcs-type=git, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, distribution-scope=public, build-date=2025-11-18T22:51:28Z) Dec 15 03:48:56 localhost systemd[1]: 165a6fd1f2b629d16da0798ec1c9bac41d302d530a0ffa668b2315a52d6aa04e.service: Deactivated successfully. 
Dec 15 03:48:56 localhost podman[96877]: 2025-12-15 08:48:56.010230887 +0000 UTC m=+0.325354055 container exec_died 97115ff7c5a84ae7e17f79e696e94da3c63f9fe0051287fe9193bec282a03f99 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, version=17.1.12, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, batch=17.1_20251118.1, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, release=1761123044, build-date=2025-11-19T00:12:45Z, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, name=rhosp17/openstack-ceilometer-ipmi, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'fcee5a4a91f85471fca7b61211375646'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, com.redhat.component=openstack-ceilometer-ipmi-container, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, config_id=tripleo_step4, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, architecture=x86_64, container_name=ceilometer_agent_ipmi, io.buildah.version=1.41.4, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-type=git, tcib_managed=true, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, url=https://www.redhat.com, io.openshift.expose-services=, vendor=Red Hat, Inc.) Dec 15 03:48:56 localhost systemd[1]: 97115ff7c5a84ae7e17f79e696e94da3c63f9fe0051287fe9193bec282a03f99.service: Deactivated successfully. Dec 15 03:48:58 localhost systemd[1]: tmp-crun.2fyNjk.mount: Deactivated successfully. Dec 15 03:48:58 localhost podman[97101]: 2025-12-15 08:48:58.270095863 +0000 UTC m=+0.094442047 container exec 8dcda56b365b42dc8758aab77a9ec80db304780e449052738f7e4e648ae1ecaf (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-bce17446-41b5-5408-a23e-0b011906b44a-crash-np0005559462, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, GIT_REPO=https://github.com/ceph/ceph-container.git, GIT_BRANCH=main, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, RELEASE=main, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, com.redhat.component=rhceph-container, build-date=2025-11-26T19:44:28Z, io.k8s.description=Red Hat Ceph Storage 7, io.openshift.expose-services=, ceph=True, name=rhceph, url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, CEPH_POINT_RELEASE=, io.openshift.tags=rhceph ceph, release=1763362218, version=7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vendor=Red Hat, Inc., GIT_CLEAN=True, distribution-scope=public, io.buildah.version=1.41.4, vcs-type=git, 
architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat Ceph Storage 7, maintainer=Guillaume Abrioux , GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0) Dec 15 03:48:58 localhost podman[97101]: 2025-12-15 08:48:58.400405859 +0000 UTC m=+0.224751993 container exec_died 8dcda56b365b42dc8758aab77a9ec80db304780e449052738f7e4e648ae1ecaf (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-bce17446-41b5-5408-a23e-0b011906b44a-crash-np0005559462, maintainer=Guillaume Abrioux , release=1763362218, version=7, architecture=x86_64, build-date=2025-11-26T19:44:28Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, description=Red Hat Ceph Storage 7, io.openshift.tags=rhceph ceph, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, io.k8s.description=Red Hat Ceph Storage 7, io.buildah.version=1.41.4, ceph=True, GIT_REPO=https://github.com/ceph/ceph-container.git, io.openshift.expose-services=, name=rhceph, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, GIT_BRANCH=main, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, CEPH_POINT_RELEASE=, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., com.redhat.component=rhceph-container, RELEASE=main, GIT_CLEAN=True, distribution-scope=public, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, vendor=Red Hat, Inc.) Dec 15 03:48:59 localhost systemd[1]: Started /usr/bin/podman healthcheck run 4c3f871f4bdf287b1d3ce717956c1cf130617489217410eadf6fffcc440eb3b0. 
Dec 15 03:48:59 localhost podman[97230]: 2025-12-15 08:48:59.775170696 +0000 UTC m=+0.095977988 container health_status 4c3f871f4bdf287b1d3ce717956c1cf130617489217410eadf6fffcc440eb3b0 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, name=rhosp17/openstack-nova-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, architecture=x86_64, config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 nova-compute, container_name=nova_migration_target, vcs-type=git, tcib_managed=true, io.buildah.version=1.41.4, release=1761123044, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '879500e96bf8dfb93687004bd86f2317'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, batch=17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.k8s.description=Red Hat 
OpenStack Platform 17.1 nova-compute, com.redhat.component=openstack-nova-compute-container, version=17.1.12, distribution-scope=public, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2025-11-19T00:36:58Z, vendor=Red Hat, Inc., io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05) Dec 15 03:49:00 localhost podman[97230]: 2025-12-15 08:49:00.178553337 +0000 UTC m=+0.499360609 container exec_died 4c3f871f4bdf287b1d3ce717956c1cf130617489217410eadf6fffcc440eb3b0 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, com.redhat.component=openstack-nova-compute-container, io.buildah.version=1.41.4, summary=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, version=17.1.12, build-date=2025-11-19T00:36:58Z, container_name=nova_migration_target, architecture=x86_64, tcib_managed=true, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, vendor=Red Hat, Inc., io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, distribution-scope=public, config_id=tripleo_step4, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '879500e96bf8dfb93687004bd86f2317'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', 
'/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, url=https://www.redhat.com, batch=17.1_20251118.1, konflux.additional-tags=17.1.12 17.1_20251118.1, release=1761123044, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp17/openstack-nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute) Dec 15 03:49:00 localhost systemd[1]: 4c3f871f4bdf287b1d3ce717956c1cf130617489217410eadf6fffcc440eb3b0.service: Deactivated successfully. Dec 15 03:49:02 localhost systemd[1]: Started /usr/bin/podman healthcheck run 2df8d20c6ba32f38bd0e6519c9c77a4146b632f21c81197d8c319b223aad81f1. Dec 15 03:49:02 localhost systemd[1]: Started /usr/bin/podman healthcheck run 4d35b7dc3f3a4e3c60661e85d3511f50804e87244d74cab67af5c6e8c447d379. 
Dec 15 03:49:02 localhost podman[97269]: 2025-12-15 08:49:02.744371307 +0000 UTC m=+0.075170471 container health_status 2df8d20c6ba32f38bd0e6519c9c77a4146b632f21c81197d8c319b223aad81f1 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=unhealthy, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, version=17.1.12, name=rhosp17/openstack-ovn-controller, config_id=tripleo_step4, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, container_name=ovn_controller, tcib_managed=true, build-date=2025-11-18T23:34:05Z, konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible, com.redhat.component=openstack-ovn-controller-container, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, release=1761123044, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, batch=17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-type=git, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 ovn-controller, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, architecture=x86_64, io.buildah.version=1.41.4, 
io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream) Dec 15 03:49:02 localhost podman[97269]: 2025-12-15 08:49:02.757848438 +0000 UTC m=+0.088647632 container exec_died 2df8d20c6ba32f38bd0e6519c9c77a4146b632f21c81197d8c319b223aad81f1 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, description=Red Hat OpenStack Platform 17.1 ovn-controller, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, name=rhosp17/openstack-ovn-controller, summary=Red Hat OpenStack Platform 17.1 ovn-controller, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, config_id=tripleo_step4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, vcs-type=git, vendor=Red Hat, Inc., container_name=ovn_controller, konflux.additional-tags=17.1.12 17.1_20251118.1, architecture=x86_64, batch=17.1_20251118.1, com.redhat.component=openstack-ovn-controller-container, io.buildah.version=1.41.4, release=1761123044, maintainer=OpenStack TripleO Team, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': 
['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, managed_by=tripleo_ansible, url=https://www.redhat.com, version=17.1.12, build-date=2025-11-18T23:34:05Z, tcib_managed=true) Dec 15 03:49:02 localhost podman[97269]: unhealthy Dec 15 03:49:02 localhost systemd[1]: 2df8d20c6ba32f38bd0e6519c9c77a4146b632f21c81197d8c319b223aad81f1.service: Main process exited, code=exited, status=1/FAILURE Dec 15 03:49:02 localhost systemd[1]: 2df8d20c6ba32f38bd0e6519c9c77a4146b632f21c81197d8c319b223aad81f1.service: Failed with result 'exit-code'. Dec 15 03:49:02 localhost podman[97270]: 2025-12-15 08:49:02.804860205 +0000 UTC m=+0.133948124 container health_status 4d35b7dc3f3a4e3c60661e85d3511f50804e87244d74cab67af5c6e8c447d379 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=unhealthy, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, io.buildah.version=1.41.4, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, name=rhosp17/openstack-neutron-metadata-agent-ovn, url=https://www.redhat.com, release=1761123044, konflux.additional-tags=17.1.12 17.1_20251118.1, vendor=Red Hat, Inc., io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, container_name=ovn_metadata_agent, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_id=tripleo_step4, maintainer=OpenStack TripleO Team, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, batch=17.1_20251118.1, 
vcs-type=git, version=17.1.12, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, distribution-scope=public, build-date=2025-11-19T00:14:25Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'a56a6f14b467cd9064e40c03defa5ed7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}) Dec 15 03:49:02 localhost podman[97270]: 2025-12-15 08:49:02.824337146 +0000 UTC m=+0.153425115 container exec_died 
4d35b7dc3f3a4e3c60661e85d3511f50804e87244d74cab67af5c6e8c447d379 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, version=17.1.12, config_id=tripleo_step4, batch=17.1_20251118.1, architecture=x86_64, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, build-date=2025-11-19T00:14:25Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, vendor=Red Hat, Inc., konflux.additional-tags=17.1.12 17.1_20251118.1, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'a56a6f14b467cd9064e40c03defa5ed7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, tcib_managed=true, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=ovn_metadata_agent, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, url=https://www.redhat.com, name=rhosp17/openstack-neutron-metadata-agent-ovn, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, managed_by=tripleo_ansible, vcs-type=git, io.buildah.version=1.41.4, release=1761123044) Dec 15 03:49:02 localhost podman[97270]: unhealthy Dec 15 03:49:02 localhost systemd[1]: 4d35b7dc3f3a4e3c60661e85d3511f50804e87244d74cab67af5c6e8c447d379.service: Main process exited, code=exited, status=1/FAILURE Dec 15 03:49:02 localhost systemd[1]: 4d35b7dc3f3a4e3c60661e85d3511f50804e87244d74cab67af5c6e8c447d379.service: Failed with result 'exit-code'. Dec 15 03:49:11 localhost systemd[1]: Started /usr/bin/podman healthcheck run 6278d648aec9c9f12e0efaafa0d6ae5653eeb34459516699e562eeb9c8d23b3c. 
Dec 15 03:49:11 localhost podman[97307]: 2025-12-15 08:49:11.765145558 +0000 UTC m=+0.095201418 container health_status 6278d648aec9c9f12e0efaafa0d6ae5653eeb34459516699e562eeb9c8d23b3c (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, version=17.1.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, name=rhosp17/openstack-qdrouterd, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 qdrouterd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, tcib_managed=true, architecture=x86_64, container_name=metrics_qdr, konflux.additional-tags=17.1.12 17.1_20251118.1, url=https://www.redhat.com, managed_by=tripleo_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '29d07da103bbd44a9ed3e29999314b03'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', 
'/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-qdrouterd-container, io.buildah.version=1.41.4, batch=17.1_20251118.1, config_id=tripleo_step1, distribution-scope=public, vcs-type=git, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, build-date=2025-11-18T22:49:46Z, release=1761123044, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, description=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.expose-services=) Dec 15 03:49:11 localhost podman[97307]: 2025-12-15 08:49:11.953976959 +0000 UTC m=+0.284032809 container exec_died 6278d648aec9c9f12e0efaafa0d6ae5653eeb34459516699e562eeb9c8d23b3c (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, com.redhat.component=openstack-qdrouterd-container, distribution-scope=public, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, container_name=metrics_qdr, url=https://www.redhat.com, name=rhosp17/openstack-qdrouterd, release=1761123044, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, description=Red Hat OpenStack Platform 17.1 qdrouterd, io.buildah.version=1.41.4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 qdrouterd, managed_by=tripleo_ansible, config_id=tripleo_step1, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, architecture=x86_64, version=17.1.12, batch=17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 
'29d07da103bbd44a9ed3e29999314b03'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, build-date=2025-11-18T22:49:46Z, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vcs-type=git) Dec 15 03:49:11 localhost systemd[1]: 6278d648aec9c9f12e0efaafa0d6ae5653eeb34459516699e562eeb9c8d23b3c.service: Deactivated successfully. Dec 15 03:49:18 localhost systemd[1]: Starting Check and recover tripleo_nova_virtqemud... Dec 15 03:49:18 localhost recover_tripleo_nova_virtqemud[97337]: 61849 Dec 15 03:49:18 localhost systemd[1]: tripleo_nova_virtqemud_recover.service: Deactivated successfully. Dec 15 03:49:18 localhost systemd[1]: Finished Check and recover tripleo_nova_virtqemud. Dec 15 03:49:26 localhost systemd[1]: Started /usr/bin/podman healthcheck run 165a6fd1f2b629d16da0798ec1c9bac41d302d530a0ffa668b2315a52d6aa04e. 
Dec 15 03:49:26 localhost systemd[1]: Started /usr/bin/podman healthcheck run 2649c3aec9af2f04509e1d248b1050b202fccc3ed2b86421c89a2a65376db057. Dec 15 03:49:26 localhost systemd[1]: Started /usr/bin/podman healthcheck run 36a88083ed613f6871d2631ae462bca308a45ba583333f7cfe4d0b0c40a18ba5. Dec 15 03:49:26 localhost systemd[1]: Started /usr/bin/podman healthcheck run 97115ff7c5a84ae7e17f79e696e94da3c63f9fe0051287fe9193bec282a03f99. Dec 15 03:49:26 localhost systemd[1]: Started /usr/bin/podman healthcheck run ac45612fab357b615b4684dbfa5bd000dd29eb1adfbaedb6db0802fc49e89549. Dec 15 03:49:26 localhost systemd[1]: Started /usr/bin/podman healthcheck run d6041e5471624a495d5be42684f6049ccf71802a80d5760239330151d465d146. Dec 15 03:49:26 localhost podman[97339]: 2025-12-15 08:49:26.778182252 +0000 UTC m=+0.100031558 container health_status 2649c3aec9af2f04509e1d248b1050b202fccc3ed2b86421c89a2a65376db057 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, vendor=Red Hat, Inc., build-date=2025-11-18T23:44:13Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, distribution-scope=public, name=rhosp17/openstack-iscsid, tcib_managed=true, com.redhat.component=openstack-iscsid-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.buildah.version=1.41.4, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, config_id=tripleo_step3, vcs-type=git, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '182e509007ab5e6e5b2500a552cbd5ba'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', 
'/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, container_name=iscsid, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, managed_by=tripleo_ansible, konflux.additional-tags=17.1.12 17.1_20251118.1, version=17.1.12, maintainer=OpenStack TripleO Team, batch=17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, architecture=x86_64, release=1761123044, description=Red Hat OpenStack Platform 17.1 iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid) Dec 15 03:49:26 localhost podman[97339]: 2025-12-15 08:49:26.791045235 +0000 UTC m=+0.112894581 container exec_died 2649c3aec9af2f04509e1d248b1050b202fccc3ed2b86421c89a2a65376db057 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, io.openshift.expose-services=, managed_by=tripleo_ansible, distribution-scope=public, version=17.1.12, batch=17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, 
architecture=x86_64, name=rhosp17/openstack-iscsid, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 17.1_20251118.1, io.buildah.version=1.41.4, description=Red Hat OpenStack Platform 17.1 iscsid, build-date=2025-11-18T23:44:13Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, container_name=iscsid, release=1761123044, vcs-type=git, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, summary=Red Hat OpenStack Platform 17.1 iscsid, config_id=tripleo_step3, com.redhat.component=openstack-iscsid-container, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '182e509007ab5e6e5b2500a552cbd5ba'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}) Dec 15 03:49:26 localhost systemd[1]: 2649c3aec9af2f04509e1d248b1050b202fccc3ed2b86421c89a2a65376db057.service: 
Deactivated successfully. Dec 15 03:49:26 localhost systemd[1]: tmp-crun.sQrYIL.mount: Deactivated successfully. Dec 15 03:49:26 localhost podman[97340]: 2025-12-15 08:49:26.84318491 +0000 UTC m=+0.159351034 container health_status 36a88083ed613f6871d2631ae462bca308a45ba583333f7cfe4d0b0c40a18ba5 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, architecture=x86_64, io.openshift.expose-services=, name=rhosp17/openstack-nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, version=17.1.12, container_name=nova_compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, vendor=Red Hat, Inc., config_id=tripleo_step5, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, release=1761123044, url=https://www.redhat.com, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.component=openstack-nova-compute-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '182e509007ab5e6e5b2500a552cbd5ba-879500e96bf8dfb93687004bd86f2317'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, batch=17.1_20251118.1, io.buildah.version=1.41.4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, managed_by=tripleo_ansible, build-date=2025-11-19T00:36:58Z, distribution-scope=public, tcib_managed=true) Dec 15 03:49:26 localhost podman[97340]: 2025-12-15 08:49:26.873012719 +0000 UTC m=+0.189178833 container exec_died 36a88083ed613f6871d2631ae462bca308a45ba583333f7cfe4d0b0c40a18ba5 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-type=git, release=1761123044, managed_by=tripleo_ansible, config_id=tripleo_step5, description=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20251118.1, maintainer=OpenStack TripleO Team, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, tcib_managed=true, 
io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, container_name=nova_compute, io.buildah.version=1.41.4, vendor=Red Hat, Inc., config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '182e509007ab5e6e5b2500a552cbd5ba-879500e96bf8dfb93687004bd86f2317'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', 
'/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp17/openstack-nova-compute, com.redhat.component=openstack-nova-compute-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, summary=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.12, build-date=2025-11-19T00:36:58Z, url=https://www.redhat.com) Dec 15 03:49:26 localhost podman[97338]: 2025-12-15 08:49:26.888161954 +0000 UTC m=+0.213425281 container health_status 165a6fd1f2b629d16da0798ec1c9bac41d302d530a0ffa668b2315a52d6aa04e (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, tcib_managed=true, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, io.buildah.version=1.41.4, summary=Red Hat OpenStack Platform 17.1 collectd, name=rhosp17/openstack-collectd, url=https://www.redhat.com, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 collectd, build-date=2025-11-18T22:51:28Z, com.redhat.component=openstack-collectd-container, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, vendor=Red Hat, Inc., vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, version=17.1.12, config_id=tripleo_step3, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, batch=17.1_20251118.1, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, container_name=collectd, maintainer=OpenStack TripleO Team, release=1761123044, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05) Dec 15 03:49:26 localhost systemd[1]: 36a88083ed613f6871d2631ae462bca308a45ba583333f7cfe4d0b0c40a18ba5.service: Deactivated successfully. 
Dec 15 03:49:26 localhost podman[97338]: 2025-12-15 08:49:26.897681418 +0000 UTC m=+0.222944775 container exec_died 165a6fd1f2b629d16da0798ec1c9bac41d302d530a0ffa668b2315a52d6aa04e (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, build-date=2025-11-18T22:51:28Z, container_name=collectd, distribution-scope=public, tcib_managed=true, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 collectd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, maintainer=OpenStack TripleO Team, config_id=tripleo_step3, io.openshift.expose-services=, batch=17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, release=1761123044, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, name=rhosp17/openstack-collectd, io.buildah.version=1.41.4, url=https://www.redhat.com, vendor=Red Hat, Inc., vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, version=17.1.12, description=Red Hat OpenStack Platform 17.1 collectd, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, com.redhat.component=openstack-collectd-container, vcs-type=git, konflux.additional-tags=17.1.12 17.1_20251118.1) Dec 15 03:49:26 localhost systemd[1]: 165a6fd1f2b629d16da0798ec1c9bac41d302d530a0ffa668b2315a52d6aa04e.service: Deactivated successfully. Dec 15 03:49:26 localhost podman[97347]: 2025-12-15 08:49:26.937710989 +0000 UTC m=+0.249851646 container health_status ac45612fab357b615b4684dbfa5bd000dd29eb1adfbaedb6db0802fc49e89549 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, container_name=logrotate_crond, tcib_managed=true, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 cron, managed_by=tripleo_ansible, com.redhat.component=openstack-cron-container, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, summary=Red Hat OpenStack Platform 17.1 cron, batch=17.1_20251118.1, release=1761123044, version=17.1.12, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, vcs-type=git, io.openshift.expose-services=, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.buildah.version=1.41.4, config_id=tripleo_step4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.12 17.1_20251118.1, name=rhosp17/openstack-cron, vendor=Red Hat, Inc., distribution-scope=public, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, maintainer=OpenStack TripleO Team, build-date=2025-11-18T22:49:32Z) Dec 15 03:49:26 localhost podman[97353]: 2025-12-15 08:49:26.988112908 +0000 UTC m=+0.297145070 container health_status d6041e5471624a495d5be42684f6049ccf71802a80d5760239330151d465d146 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, description=Red Hat OpenStack Platform 
17.1 ceilometer-compute, name=rhosp17/openstack-ceilometer-compute, vcs-type=git, tcib_managed=true, konflux.additional-tags=17.1.12 17.1_20251118.1, config_id=tripleo_step4, build-date=2025-11-19T00:11:48Z, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, distribution-scope=public, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'fcee5a4a91f85471fca7b61211375646'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, container_name=ceilometer_agent_compute, vendor=Red Hat, Inc., io.openshift.expose-services=, url=https://www.redhat.com, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, batch=17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, com.redhat.component=openstack-ceilometer-compute-container, release=1761123044, managed_by=tripleo_ansible, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, version=17.1.12, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.buildah.version=1.41.4) Dec 15 03:49:27 localhost podman[97347]: 2025-12-15 08:49:27.001668069 +0000 UTC m=+0.313808696 container exec_died ac45612fab357b615b4684dbfa5bd000dd29eb1adfbaedb6db0802fc49e89549 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, config_id=tripleo_step4, name=rhosp17/openstack-cron, managed_by=tripleo_ansible, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1761123044, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 cron, io.buildah.version=1.41.4, 
container_name=logrotate_crond, url=https://www.redhat.com, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-18T22:49:32Z, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, tcib_managed=true, com.redhat.component=openstack-cron-container, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, batch=17.1_20251118.1, version=17.1.12, description=Red Hat OpenStack Platform 17.1 cron, vendor=Red Hat, Inc.) Dec 15 03:49:27 localhost systemd[1]: ac45612fab357b615b4684dbfa5bd000dd29eb1adfbaedb6db0802fc49e89549.service: Deactivated successfully. Dec 15 03:49:27 localhost podman[97353]: 2025-12-15 08:49:27.019330753 +0000 UTC m=+0.328362925 container exec_died d6041e5471624a495d5be42684f6049ccf71802a80d5760239330151d465d146 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, name=rhosp17/openstack-ceilometer-compute, build-date=2025-11-19T00:11:48Z, version=17.1.12, com.redhat.component=openstack-ceilometer-compute-container, io.buildah.version=1.41.4, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, distribution-scope=public, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'fcee5a4a91f85471fca7b61211375646'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 
'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, release=1761123044, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1, architecture=x86_64, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, vcs-type=git, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, container_name=ceilometer_agent_compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, config_id=tripleo_step4) Dec 15 03:49:27 localhost systemd[1]: d6041e5471624a495d5be42684f6049ccf71802a80d5760239330151d465d146.service: Deactivated successfully. 
Dec 15 03:49:27 localhost podman[97341]: 2025-12-15 08:49:27.088950035 +0000 UTC m=+0.404716268 container health_status 97115ff7c5a84ae7e17f79e696e94da3c63f9fe0051287fe9193bec282a03f99 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.buildah.version=1.41.4, name=rhosp17/openstack-ceilometer-ipmi, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, url=https://www.redhat.com, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, release=1761123044, vendor=Red Hat, Inc., config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'fcee5a4a91f85471fca7b61211375646'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, maintainer=OpenStack TripleO Team, tcib_managed=true, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, 
distribution-scope=public, version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.expose-services=, com.redhat.component=openstack-ceilometer-ipmi-container, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20251118.1, container_name=ceilometer_agent_ipmi, managed_by=tripleo_ansible, build-date=2025-11-19T00:12:45Z, config_id=tripleo_step4, architecture=x86_64) Dec 15 03:49:27 localhost podman[97341]: 2025-12-15 08:49:27.122423481 +0000 UTC m=+0.438189704 container exec_died 97115ff7c5a84ae7e17f79e696e94da3c63f9fe0051287fe9193bec282a03f99 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, io.buildah.version=1.41.4, version=17.1.12, batch=17.1_20251118.1, io.openshift.expose-services=, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vendor=Red Hat, Inc., com.redhat.component=openstack-ceilometer-ipmi-container, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'fcee5a4a91f85471fca7b61211375646'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, release=1761123044, name=rhosp17/openstack-ceilometer-ipmi, vcs-type=git, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, config_id=tripleo_step4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, container_name=ceilometer_agent_ipmi, build-date=2025-11-19T00:12:45Z, konflux.additional-tags=17.1.12 17.1_20251118.1, architecture=x86_64, tcib_managed=true, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, url=https://www.redhat.com, maintainer=OpenStack TripleO Team) Dec 15 03:49:27 localhost systemd[1]: 97115ff7c5a84ae7e17f79e696e94da3c63f9fe0051287fe9193bec282a03f99.service: Deactivated successfully. Dec 15 03:49:30 localhost systemd[1]: Started /usr/bin/podman healthcheck run 4c3f871f4bdf287b1d3ce717956c1cf130617489217410eadf6fffcc440eb3b0. 
Dec 15 03:49:30 localhost podman[97473]: 2025-12-15 08:49:30.721331097 +0000 UTC m=+0.059609395 container health_status 4c3f871f4bdf287b1d3ce717956c1cf130617489217410eadf6fffcc440eb3b0 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, batch=17.1_20251118.1, maintainer=OpenStack TripleO Team, distribution-scope=public, tcib_managed=true, io.openshift.expose-services=, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.component=openstack-nova-compute-container, version=17.1.12, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, architecture=x86_64, container_name=nova_migration_target, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '879500e96bf8dfb93687004bd86f2317'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, name=rhosp17/openstack-nova-compute, build-date=2025-11-19T00:36:58Z, release=1761123044, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, config_id=tripleo_step4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, vcs-type=git, url=https://www.redhat.com, io.buildah.version=1.41.4) Dec 15 03:49:31 localhost podman[97473]: 2025-12-15 08:49:31.082355035 +0000 UTC m=+0.420633333 container exec_died 4c3f871f4bdf287b1d3ce717956c1cf130617489217410eadf6fffcc440eb3b0 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, konflux.additional-tags=17.1.12 17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, url=https://www.redhat.com, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '879500e96bf8dfb93687004bd86f2317'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, name=rhosp17/openstack-nova-compute, managed_by=tripleo_ansible, release=1761123044, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, batch=17.1_20251118.1, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 nova-compute, distribution-scope=public, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.component=openstack-nova-compute-container, version=17.1.12, io.openshift.expose-services=, config_id=tripleo_step4, io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, build-date=2025-11-19T00:36:58Z, architecture=x86_64, container_name=nova_migration_target) Dec 15 03:49:31 localhost systemd[1]: 4c3f871f4bdf287b1d3ce717956c1cf130617489217410eadf6fffcc440eb3b0.service: Deactivated successfully. Dec 15 03:49:33 localhost systemd[1]: Started /usr/bin/podman healthcheck run 2df8d20c6ba32f38bd0e6519c9c77a4146b632f21c81197d8c319b223aad81f1. Dec 15 03:49:33 localhost systemd[1]: Started /usr/bin/podman healthcheck run 4d35b7dc3f3a4e3c60661e85d3511f50804e87244d74cab67af5c6e8c447d379. 
Dec 15 03:49:33 localhost podman[97498]: 2025-12-15 08:49:33.75596492 +0000 UTC m=+0.084778839 container health_status 2df8d20c6ba32f38bd0e6519c9c77a4146b632f21c81197d8c319b223aad81f1 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=unhealthy, architecture=x86_64, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_id=tripleo_step4, io.openshift.expose-services=, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, managed_by=tripleo_ansible, com.redhat.component=openstack-ovn-controller-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.buildah.version=1.41.4, build-date=2025-11-18T23:34:05Z, version=17.1.12, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, konflux.additional-tags=17.1.12 17.1_20251118.1, batch=17.1_20251118.1, vendor=Red Hat, Inc., vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, summary=Red Hat OpenStack Platform 17.1 ovn-controller, name=rhosp17/openstack-ovn-controller, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, container_name=ovn_controller, 
distribution-scope=public, vcs-type=git, release=1761123044, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 ovn-controller, tcib_managed=true, maintainer=OpenStack TripleO Team) Dec 15 03:49:33 localhost podman[97498]: 2025-12-15 08:49:33.771782093 +0000 UTC m=+0.100596012 container exec_died 2df8d20c6ba32f38bd0e6519c9c77a4146b632f21c81197d8c319b223aad81f1 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, konflux.additional-tags=17.1.12 17.1_20251118.1, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 ovn-controller, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, architecture=x86_64, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, managed_by=tripleo_ansible, release=1761123044, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.component=openstack-ovn-controller-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=ovn_controller, distribution-scope=public, version=17.1.12, 
tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1, url=https://www.redhat.com, config_id=tripleo_step4, io.buildah.version=1.41.4, summary=Red Hat OpenStack Platform 17.1 ovn-controller, build-date=2025-11-18T23:34:05Z, name=rhosp17/openstack-ovn-controller) Dec 15 03:49:33 localhost podman[97498]: unhealthy Dec 15 03:49:33 localhost systemd[1]: 2df8d20c6ba32f38bd0e6519c9c77a4146b632f21c81197d8c319b223aad81f1.service: Main process exited, code=exited, status=1/FAILURE Dec 15 03:49:33 localhost systemd[1]: 2df8d20c6ba32f38bd0e6519c9c77a4146b632f21c81197d8c319b223aad81f1.service: Failed with result 'exit-code'. Dec 15 03:49:33 localhost systemd[1]: tmp-crun.rpHgkC.mount: Deactivated successfully. Dec 15 03:49:33 localhost podman[97499]: 2025-12-15 08:49:33.872715984 +0000 UTC m=+0.195094350 container health_status 4d35b7dc3f3a4e3c60661e85d3511f50804e87244d74cab67af5c6e8c447d379 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=unhealthy, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_id=tripleo_step4, io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, build-date=2025-11-19T00:14:25Z, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, version=17.1.12, vendor=Red Hat, Inc., config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'a56a6f14b467cd9064e40c03defa5ed7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, architecture=x86_64, distribution-scope=public, tcib_managed=true, release=1761123044, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20251118.1, io.openshift.expose-services=, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, name=rhosp17/openstack-neutron-metadata-agent-ovn, container_name=ovn_metadata_agent, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn) Dec 15 03:49:33 localhost podman[97499]: 2025-12-15 08:49:33.884382216 +0000 UTC m=+0.206760612 container exec_died 
4d35b7dc3f3a4e3c60661e85d3511f50804e87244d74cab67af5c6e8c447d379 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1, tcib_managed=true, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'a56a6f14b467cd9064e40c03defa5ed7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, konflux.additional-tags=17.1.12 17.1_20251118.1, version=17.1.12, summary=Red Hat OpenStack Platform 17.1 
neutron-metadata-agent-ovn, io.openshift.expose-services=, container_name=ovn_metadata_agent, io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, vcs-type=git, config_id=tripleo_step4, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, architecture=x86_64, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, url=https://www.redhat.com, build-date=2025-11-19T00:14:25Z, release=1761123044, distribution-scope=public, managed_by=tripleo_ansible, name=rhosp17/openstack-neutron-metadata-agent-ovn, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c) Dec 15 03:49:33 localhost podman[97499]: unhealthy Dec 15 03:49:33 localhost systemd[1]: 4d35b7dc3f3a4e3c60661e85d3511f50804e87244d74cab67af5c6e8c447d379.service: Main process exited, code=exited, status=1/FAILURE Dec 15 03:49:33 localhost systemd[1]: 4d35b7dc3f3a4e3c60661e85d3511f50804e87244d74cab67af5c6e8c447d379.service: Failed with result 'exit-code'. Dec 15 03:49:35 localhost sshd[97538]: main: sshd: ssh-rsa algorithm is disabled Dec 15 03:49:42 localhost systemd[1]: Started /usr/bin/podman healthcheck run 6278d648aec9c9f12e0efaafa0d6ae5653eeb34459516699e562eeb9c8d23b3c. 
Dec 15 03:49:42 localhost podman[97540]: 2025-12-15 08:49:42.735473788 +0000 UTC m=+0.068997717 container health_status 6278d648aec9c9f12e0efaafa0d6ae5653eeb34459516699e562eeb9c8d23b3c (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, name=rhosp17/openstack-qdrouterd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, managed_by=tripleo_ansible, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.component=openstack-qdrouterd-container, config_id=tripleo_step1, container_name=metrics_qdr, vendor=Red Hat, Inc., batch=17.1_20251118.1, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, tcib_managed=true, distribution-scope=public, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '29d07da103bbd44a9ed3e29999314b03'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, konflux.additional-tags=17.1.12 17.1_20251118.1, 
io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, version=17.1.12, architecture=x86_64, release=1761123044, summary=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-type=git, url=https://www.redhat.com, io.buildah.version=1.41.4, build-date=2025-11-18T22:49:46Z, description=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd) Dec 15 03:49:43 localhost podman[97540]: 2025-12-15 08:49:43.009741166 +0000 UTC m=+0.343265165 container exec_died 6278d648aec9c9f12e0efaafa0d6ae5653eeb34459516699e562eeb9c8d23b3c (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, summary=Red Hat OpenStack Platform 17.1 qdrouterd, tcib_managed=true, config_id=tripleo_step1, container_name=metrics_qdr, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20251118.1, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, version=17.1.12, build-date=2025-11-18T22:49:46Z, managed_by=tripleo_ansible, architecture=x86_64, release=1761123044, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '29d07da103bbd44a9ed3e29999314b03'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, description=Red Hat OpenStack Platform 17.1 qdrouterd, name=rhosp17/openstack-qdrouterd, io.buildah.version=1.41.4, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-type=git, distribution-scope=public, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, com.redhat.component=openstack-qdrouterd-container, konflux.additional-tags=17.1.12 17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd) Dec 15 03:49:43 localhost systemd[1]: 6278d648aec9c9f12e0efaafa0d6ae5653eeb34459516699e562eeb9c8d23b3c.service: Deactivated successfully. Dec 15 03:49:57 localhost systemd[1]: Started /usr/bin/podman healthcheck run 165a6fd1f2b629d16da0798ec1c9bac41d302d530a0ffa668b2315a52d6aa04e. Dec 15 03:49:57 localhost systemd[1]: Started /usr/bin/podman healthcheck run 2649c3aec9af2f04509e1d248b1050b202fccc3ed2b86421c89a2a65376db057. Dec 15 03:49:57 localhost systemd[1]: Started /usr/bin/podman healthcheck run 36a88083ed613f6871d2631ae462bca308a45ba583333f7cfe4d0b0c40a18ba5. Dec 15 03:49:57 localhost systemd[1]: Started /usr/bin/podman healthcheck run 97115ff7c5a84ae7e17f79e696e94da3c63f9fe0051287fe9193bec282a03f99. 
Dec 15 03:49:57 localhost systemd[1]: Started /usr/bin/podman healthcheck run ac45612fab357b615b4684dbfa5bd000dd29eb1adfbaedb6db0802fc49e89549. Dec 15 03:49:57 localhost systemd[1]: Started /usr/bin/podman healthcheck run d6041e5471624a495d5be42684f6049ccf71802a80d5760239330151d465d146. Dec 15 03:49:57 localhost systemd[1]: tmp-crun.BF6JVb.mount: Deactivated successfully. Dec 15 03:49:57 localhost podman[97583]: 2025-12-15 08:49:57.781917149 +0000 UTC m=+0.099315868 container health_status ac45612fab357b615b4684dbfa5bd000dd29eb1adfbaedb6db0802fc49e89549 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, url=https://www.redhat.com, vcs-type=git, name=rhosp17/openstack-cron, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-11-18T22:49:32Z, com.redhat.component=openstack-cron-container, io.buildah.version=1.41.4, batch=17.1_20251118.1, version=17.1.12, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, container_name=logrotate_crond, distribution-scope=public, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, summary=Red Hat OpenStack Platform 17.1 cron, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, release=1761123044, description=Red Hat OpenStack Platform 17.1 cron, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 
'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_id=tripleo_step4) Dec 15 03:49:57 localhost podman[97583]: 2025-12-15 08:49:57.821440467 +0000 UTC m=+0.138839146 container exec_died ac45612fab357b615b4684dbfa5bd000dd29eb1adfbaedb6db0802fc49e89549 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 cron, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, build-date=2025-11-18T22:49:32Z, config_id=tripleo_step4, container_name=logrotate_crond, distribution-scope=public, name=rhosp17/openstack-cron, vendor=Red Hat, Inc., managed_by=tripleo_ansible, tcib_managed=true, konflux.additional-tags=17.1.12 17.1_20251118.1, io.buildah.version=1.41.4, description=Red Hat OpenStack Platform 17.1 cron, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, architecture=x86_64, release=1761123044, url=https://www.redhat.com, 
com.redhat.component=openstack-cron-container, io.openshift.expose-services=, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, batch=17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, version=17.1.12, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a) Dec 15 03:49:57 localhost systemd[1]: ac45612fab357b615b4684dbfa5bd000dd29eb1adfbaedb6db0802fc49e89549.service: Deactivated successfully. 
Dec 15 03:49:57 localhost podman[97571]: 2025-12-15 08:49:57.799022787 +0000 UTC m=+0.117100514 container health_status 36a88083ed613f6871d2631ae462bca308a45ba583333f7cfe4d0b0c40a18ba5 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.expose-services=, version=17.1.12, batch=17.1_20251118.1, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, vcs-type=git, release=1761123044, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.component=openstack-nova-compute-container, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, distribution-scope=public, config_id=tripleo_step5, io.buildah.version=1.41.4, architecture=x86_64, tcib_managed=true, build-date=2025-11-19T00:36:58Z, konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, vendor=Red Hat, Inc., container_name=nova_compute, description=Red Hat OpenStack Platform 17.1 nova-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '182e509007ab5e6e5b2500a552cbd5ba-879500e96bf8dfb93687004bd86f2317'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, name=rhosp17/openstack-nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute) Dec 15 03:49:57 localhost podman[97569]: 2025-12-15 08:49:57.822838424 +0000 UTC m=+0.154181895 container health_status 165a6fd1f2b629d16da0798ec1c9bac41d302d530a0ffa668b2315a52d6aa04e (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.component=openstack-collectd-container, url=https://www.redhat.com, container_name=collectd, io.openshift.expose-services=, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, 
maintainer=OpenStack TripleO Team, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 collectd, vendor=Red Hat, Inc., name=rhosp17/openstack-collectd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.buildah.version=1.41.4, config_id=tripleo_step3, batch=17.1_20251118.1, managed_by=tripleo_ansible, version=17.1.12, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, description=Red Hat OpenStack Platform 17.1 collectd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, distribution-scope=public, 
build-date=2025-11-18T22:51:28Z, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, tcib_managed=true, release=1761123044) Dec 15 03:49:57 localhost podman[97571]: 2025-12-15 08:49:57.879774547 +0000 UTC m=+0.197852254 container exec_died 36a88083ed613f6871d2631ae462bca308a45ba583333f7cfe4d0b0c40a18ba5 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, container_name=nova_compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '182e509007ab5e6e5b2500a552cbd5ba-879500e96bf8dfb93687004bd86f2317'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', 
'/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, tcib_managed=true, vendor=Red Hat, Inc., vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, batch=17.1_20251118.1, config_id=tripleo_step5, com.redhat.component=openstack-nova-compute-container, io.buildah.version=1.41.4, url=https://www.redhat.com, name=rhosp17/openstack-nova-compute, architecture=x86_64, version=17.1.12, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 nova-compute, release=1761123044, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-type=git, managed_by=tripleo_ansible, distribution-scope=public, build-date=2025-11-19T00:36:58Z, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05) Dec 15 03:49:57 localhost systemd[1]: 36a88083ed613f6871d2631ae462bca308a45ba583333f7cfe4d0b0c40a18ba5.service: Deactivated successfully. 
Dec 15 03:49:57 localhost podman[97584]: 2025-12-15 08:49:57.930816002 +0000 UTC m=+0.244086620 container health_status d6041e5471624a495d5be42684f6049ccf71802a80d5760239330151d465d146 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, release=1761123044, build-date=2025-11-19T00:11:48Z, container_name=ceilometer_agent_compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, managed_by=tripleo_ansible, tcib_managed=true, distribution-scope=public, io.openshift.expose-services=, name=rhosp17/openstack-ceilometer-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, version=17.1.12, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'fcee5a4a91f85471fca7b61211375646'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, config_id=tripleo_step4, vendor=Red Hat, Inc., 
vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, com.redhat.component=openstack-ceilometer-compute-container, batch=17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, architecture=x86_64, io.buildah.version=1.41.4, konflux.additional-tags=17.1.12 17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute) Dec 15 03:49:57 localhost podman[97569]: 2025-12-15 08:49:57.975755426 +0000 UTC m=+0.307098997 container exec_died 165a6fd1f2b629d16da0798ec1c9bac41d302d530a0ffa668b2315a52d6aa04e (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, release=1761123044, description=Red Hat OpenStack Platform 17.1 collectd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, url=https://www.redhat.com, distribution-scope=public, managed_by=tripleo_ansible, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.component=openstack-collectd-container, container_name=collectd, name=rhosp17/openstack-collectd, summary=Red Hat OpenStack Platform 17.1 collectd, tcib_managed=true, build-date=2025-11-18T22:51:28Z, architecture=x86_64, io.openshift.expose-services=, vcs-type=git, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, maintainer=OpenStack TripleO Team, 
config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, config_id=tripleo_step3, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20251118.1, version=17.1.12) Dec 15 03:49:57 localhost podman[97584]: 2025-12-15 08:49:57.990235003 +0000 UTC m=+0.303505631 container exec_died d6041e5471624a495d5be42684f6049ccf71802a80d5760239330151d465d146 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, summary=Red Hat 
OpenStack Platform 17.1 ceilometer-compute, io.openshift.expose-services=, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'fcee5a4a91f85471fca7b61211375646'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, vcs-type=git, build-date=2025-11-19T00:11:48Z, konflux.additional-tags=17.1.12 17.1_20251118.1, architecture=x86_64, io.buildah.version=1.41.4, vendor=Red Hat, Inc., com.redhat.component=openstack-ceilometer-compute-container, batch=17.1_20251118.1, container_name=ceilometer_agent_compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.12, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, release=1761123044, config_id=tripleo_step4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, 
name=rhosp17/openstack-ceilometer-compute, managed_by=tripleo_ansible, distribution-scope=public, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, url=https://www.redhat.com) Dec 15 03:49:57 localhost systemd[1]: 165a6fd1f2b629d16da0798ec1c9bac41d302d530a0ffa668b2315a52d6aa04e.service: Deactivated successfully. Dec 15 03:49:58 localhost systemd[1]: d6041e5471624a495d5be42684f6049ccf71802a80d5760239330151d465d146.service: Deactivated successfully. Dec 15 03:49:58 localhost podman[97577]: 2025-12-15 08:49:58.047187606 +0000 UTC m=+0.367452532 container health_status 97115ff7c5a84ae7e17f79e696e94da3c63f9fe0051287fe9193bec282a03f99 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, release=1761123044, container_name=ceilometer_agent_ipmi, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, io.openshift.expose-services=, tcib_managed=true, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, vcs-type=git, architecture=x86_64, config_id=tripleo_step4, io.buildah.version=1.41.4, name=rhosp17/openstack-ceilometer-ipmi, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1, konflux.additional-tags=17.1.12 17.1_20251118.1, build-date=2025-11-19T00:12:45Z, managed_by=tripleo_ansible, distribution-scope=public, com.redhat.component=openstack-ceilometer-ipmi-container, maintainer=OpenStack TripleO Team, version=17.1.12, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vendor=Red Hat, Inc., org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, description=Red Hat OpenStack Platform 
17.1 ceilometer-ipmi, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, url=https://www.redhat.com, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'fcee5a4a91f85471fca7b61211375646'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}) Dec 15 03:49:58 localhost podman[97577]: 2025-12-15 08:49:58.07201973 +0000 UTC m=+0.392284656 container exec_died 97115ff7c5a84ae7e17f79e696e94da3c63f9fe0051287fe9193bec282a03f99 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, config_id=tripleo_step4, architecture=x86_64, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, release=1761123044, konflux.additional-tags=17.1.12 17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, 
url=https://www.redhat.com, name=rhosp17/openstack-ceilometer-ipmi, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-type=git, version=17.1.12, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, vendor=Red Hat, Inc., build-date=2025-11-19T00:12:45Z, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, com.redhat.component=openstack-ceilometer-ipmi-container, container_name=ceilometer_agent_ipmi, distribution-scope=public, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'fcee5a4a91f85471fca7b61211375646'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, batch=17.1_20251118.1, tcib_managed=true) Dec 15 03:49:58 localhost systemd[1]: 97115ff7c5a84ae7e17f79e696e94da3c63f9fe0051287fe9193bec282a03f99.service: Deactivated successfully. 
Dec 15 03:49:58 localhost podman[97570]: 2025-12-15 08:49:58.15316615 +0000 UTC m=+0.481203693 container health_status 2649c3aec9af2f04509e1d248b1050b202fccc3ed2b86421c89a2a65376db057 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, build-date=2025-11-18T23:44:13Z, description=Red Hat OpenStack Platform 17.1 iscsid, version=17.1.12, vendor=Red Hat, Inc., com.redhat.component=openstack-iscsid-container, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, name=rhosp17/openstack-iscsid, io.openshift.expose-services=, container_name=iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, config_id=tripleo_step3, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '182e509007ab5e6e5b2500a552cbd5ba'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', 
'/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, architecture=x86_64, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, batch=17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, release=1761123044, managed_by=tripleo_ansible, distribution-scope=public) Dec 15 03:49:58 localhost podman[97570]: 2025-12-15 08:49:58.192482083 +0000 UTC m=+0.520519586 container exec_died 2649c3aec9af2f04509e1d248b1050b202fccc3ed2b86421c89a2a65376db057 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, description=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, managed_by=tripleo_ansible, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, release=1761123044, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.component=openstack-iscsid-container, vcs-type=git, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '182e509007ab5e6e5b2500a552cbd5ba'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 iscsid, build-date=2025-11-18T23:44:13Z, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, container_name=iscsid, config_id=tripleo_step3, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, tcib_managed=true, io.buildah.version=1.41.4, version=17.1.12, batch=17.1_20251118.1, architecture=x86_64, name=rhosp17/openstack-iscsid, distribution-scope=public) Dec 15 03:49:58 localhost systemd[1]: 2649c3aec9af2f04509e1d248b1050b202fccc3ed2b86421c89a2a65376db057.service: Deactivated successfully. Dec 15 03:50:01 localhost systemd[1]: Started /usr/bin/podman healthcheck run 4c3f871f4bdf287b1d3ce717956c1cf130617489217410eadf6fffcc440eb3b0. Dec 15 03:50:01 localhost systemd[1]: tmp-crun.NfdddQ.mount: Deactivated successfully. 
Dec 15 03:50:01 localhost podman[97772]: 2025-12-15 08:50:01.755294704 +0000 UTC m=+0.079579369 container health_status 4c3f871f4bdf287b1d3ce717956c1cf130617489217410eadf6fffcc440eb3b0 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, distribution-scope=public, container_name=nova_migration_target, version=17.1.12, io.openshift.expose-services=, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, build-date=2025-11-19T00:36:58Z, vendor=Red Hat, Inc., config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '879500e96bf8dfb93687004bd86f2317'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, name=rhosp17/openstack-nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, 
org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, batch=17.1_20251118.1, io.buildah.version=1.41.4, release=1761123044, summary=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.component=openstack-nova-compute-container, vcs-type=git, architecture=x86_64, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, config_id=tripleo_step4) Dec 15 03:50:02 localhost podman[97772]: 2025-12-15 08:50:02.115263665 +0000 UTC m=+0.439548410 container exec_died 4c3f871f4bdf287b1d3ce717956c1cf130617489217410eadf6fffcc440eb3b0 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, name=rhosp17/openstack-nova-compute, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, konflux.additional-tags=17.1.12 17.1_20251118.1, tcib_managed=true, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, batch=17.1_20251118.1, config_id=tripleo_step4, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, version=17.1.12, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.4, com.redhat.component=openstack-nova-compute-container, vcs-type=git, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, release=1761123044, description=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, container_name=nova_migration_target, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '879500e96bf8dfb93687004bd86f2317'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 
'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, io.openshift.expose-services=, build-date=2025-11-19T00:36:58Z, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05) Dec 15 03:50:02 localhost systemd[1]: 4c3f871f4bdf287b1d3ce717956c1cf130617489217410eadf6fffcc440eb3b0.service: Deactivated successfully. Dec 15 03:50:04 localhost systemd[1]: Started /usr/bin/podman healthcheck run 2df8d20c6ba32f38bd0e6519c9c77a4146b632f21c81197d8c319b223aad81f1. Dec 15 03:50:04 localhost systemd[1]: Started /usr/bin/podman healthcheck run 4d35b7dc3f3a4e3c60661e85d3511f50804e87244d74cab67af5c6e8c447d379. Dec 15 03:50:04 localhost systemd[1]: tmp-crun.Jhj6Nt.mount: Deactivated successfully. 
Dec 15 03:50:04 localhost podman[97810]: 2025-12-15 08:50:04.763030787 +0000 UTC m=+0.090579004 container health_status 2df8d20c6ba32f38bd0e6519c9c77a4146b632f21c81197d8c319b223aad81f1 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=unhealthy, version=17.1.12, container_name=ovn_controller, io.openshift.expose-services=, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, io.buildah.version=1.41.4, description=Red Hat OpenStack Platform 17.1 ovn-controller, batch=17.1_20251118.1, release=1761123044, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.component=openstack-ovn-controller-container, build-date=2025-11-18T23:34:05Z, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, name=rhosp17/openstack-ovn-controller, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, managed_by=tripleo_ansible, config_id=tripleo_step4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, summary=Red Hat OpenStack Platform 17.1 ovn-controller, vendor=Red Hat, Inc., distribution-scope=public, konflux.additional-tags=17.1.12 17.1_20251118.1, 
vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, architecture=x86_64) Dec 15 03:50:04 localhost systemd[1]: tmp-crun.0F9eFp.mount: Deactivated successfully. Dec 15 03:50:04 localhost podman[97811]: 2025-12-15 08:50:04.799484702 +0000 UTC m=+0.122945800 container health_status 4d35b7dc3f3a4e3c60661e85d3511f50804e87244d74cab67af5c6e8c447d379 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=unhealthy, managed_by=tripleo_ansible, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'a56a6f14b467cd9064e40c03defa5ed7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', 
'/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, io.buildah.version=1.41.4, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, vcs-type=git, konflux.additional-tags=17.1.12 17.1_20251118.1, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, config_id=tripleo_step4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, container_name=ovn_metadata_agent, batch=17.1_20251118.1, architecture=x86_64, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, name=rhosp17/openstack-neutron-metadata-agent-ovn, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, tcib_managed=true, build-date=2025-11-19T00:14:25Z, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.expose-services=, version=17.1.12, release=1761123044) Dec 15 03:50:04 localhost podman[97810]: 2025-12-15 08:50:04.810445325 +0000 UTC m=+0.137993502 container exec_died 2df8d20c6ba32f38bd0e6519c9c77a4146b632f21c81197d8c319b223aad81f1 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.buildah.version=1.41.4, version=17.1.12, batch=17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 
ovn-controller, release=1761123044, tcib_managed=true, managed_by=tripleo_ansible, container_name=ovn_controller, io.openshift.expose-services=, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, config_id=tripleo_step4, summary=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.component=openstack-ovn-controller-container, description=Red Hat OpenStack Platform 17.1 ovn-controller, vendor=Red Hat, Inc., config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, architecture=x86_64, name=rhosp17/openstack-ovn-controller, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, vcs-type=git, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, build-date=2025-11-18T23:34:05Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public) Dec 15 03:50:04 localhost podman[97810]: unhealthy Dec 15 03:50:04 localhost systemd[1]: 2df8d20c6ba32f38bd0e6519c9c77a4146b632f21c81197d8c319b223aad81f1.service: Main process exited, code=exited, status=1/FAILURE Dec 15 03:50:04 localhost systemd[1]: 2df8d20c6ba32f38bd0e6519c9c77a4146b632f21c81197d8c319b223aad81f1.service: Failed with result 'exit-code'. 
Dec 15 03:50:04 localhost podman[97811]: 2025-12-15 08:50:04.839234265 +0000 UTC m=+0.162695433 container exec_died 4d35b7dc3f3a4e3c60661e85d3511f50804e87244d74cab67af5c6e8c447d379 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, maintainer=OpenStack TripleO Team, release=1761123044, container_name=ovn_metadata_agent, konflux.additional-tags=17.1.12 17.1_20251118.1, url=https://www.redhat.com, vcs-type=git, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, batch=17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_id=tripleo_step4, managed_by=tripleo_ansible, vendor=Red Hat, Inc., vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, tcib_managed=true, name=rhosp17/openstack-neutron-metadata-agent-ovn, distribution-scope=public, build-date=2025-11-19T00:14:25Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'a56a6f14b467cd9064e40c03defa5ed7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, io.openshift.expose-services=, version=17.1.12, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn) Dec 15 03:50:04 localhost podman[97811]: unhealthy Dec 15 03:50:04 localhost systemd[1]: 4d35b7dc3f3a4e3c60661e85d3511f50804e87244d74cab67af5c6e8c447d379.service: Main process exited, code=exited, status=1/FAILURE Dec 15 03:50:04 localhost systemd[1]: 4d35b7dc3f3a4e3c60661e85d3511f50804e87244d74cab67af5c6e8c447d379.service: Failed with result 'exit-code'. Dec 15 03:50:13 localhost systemd[1]: Started /usr/bin/podman healthcheck run 6278d648aec9c9f12e0efaafa0d6ae5653eeb34459516699e562eeb9c8d23b3c. Dec 15 03:50:13 localhost systemd[1]: tmp-crun.NnTqbD.mount: Deactivated successfully. 
Dec 15 03:50:13 localhost podman[97850]: 2025-12-15 08:50:13.763018322 +0000 UTC m=+0.097552341 container health_status 6278d648aec9c9f12e0efaafa0d6ae5653eeb34459516699e562eeb9c8d23b3c (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, release=1761123044, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-qdrouterd-container, name=rhosp17/openstack-qdrouterd, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, config_id=tripleo_step1, distribution-scope=public, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '29d07da103bbd44a9ed3e29999314b03'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, container_name=metrics_qdr, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, url=https://www.redhat.com, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, architecture=x86_64, 
konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, version=17.1.12, description=Red Hat OpenStack Platform 17.1 qdrouterd, managed_by=tripleo_ansible, batch=17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vendor=Red Hat, Inc., io.openshift.expose-services=, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-type=git, build-date=2025-11-18T22:49:46Z, summary=Red Hat OpenStack Platform 17.1 qdrouterd) Dec 15 03:50:13 localhost podman[97850]: 2025-12-15 08:50:13.975732672 +0000 UTC m=+0.310266721 container exec_died 6278d648aec9c9f12e0efaafa0d6ae5653eeb34459516699e562eeb9c8d23b3c (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vcs-type=git, com.redhat.component=openstack-qdrouterd-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, managed_by=tripleo_ansible, container_name=metrics_qdr, architecture=x86_64, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 qdrouterd, batch=17.1_20251118.1, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.expose-services=, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '29d07da103bbd44a9ed3e29999314b03'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, name=rhosp17/openstack-qdrouterd, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, release=1761123044, config_id=tripleo_step1, io.buildah.version=1.41.4, maintainer=OpenStack TripleO Team, build-date=2025-11-18T22:49:46Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, version=17.1.12, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 qdrouterd, tcib_managed=true, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a) Dec 15 03:50:13 localhost systemd[1]: 6278d648aec9c9f12e0efaafa0d6ae5653eeb34459516699e562eeb9c8d23b3c.service: Deactivated successfully. Dec 15 03:50:28 localhost systemd[1]: Started /usr/bin/podman healthcheck run 165a6fd1f2b629d16da0798ec1c9bac41d302d530a0ffa668b2315a52d6aa04e. Dec 15 03:50:28 localhost systemd[1]: Started /usr/bin/podman healthcheck run 2649c3aec9af2f04509e1d248b1050b202fccc3ed2b86421c89a2a65376db057. Dec 15 03:50:28 localhost systemd[1]: Started /usr/bin/podman healthcheck run 36a88083ed613f6871d2631ae462bca308a45ba583333f7cfe4d0b0c40a18ba5. Dec 15 03:50:28 localhost systemd[1]: Started /usr/bin/podman healthcheck run 97115ff7c5a84ae7e17f79e696e94da3c63f9fe0051287fe9193bec282a03f99. 
Dec 15 03:50:28 localhost systemd[1]: Started /usr/bin/podman healthcheck run ac45612fab357b615b4684dbfa5bd000dd29eb1adfbaedb6db0802fc49e89549. Dec 15 03:50:28 localhost systemd[1]: Started /usr/bin/podman healthcheck run d6041e5471624a495d5be42684f6049ccf71802a80d5760239330151d465d146. Dec 15 03:50:28 localhost systemd[1]: tmp-crun.Z45eqi.mount: Deactivated successfully. Dec 15 03:50:28 localhost podman[97880]: 2025-12-15 08:50:28.771240213 +0000 UTC m=+0.096193694 container health_status 165a6fd1f2b629d16da0798ec1c9bac41d302d530a0ffa668b2315a52d6aa04e (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, url=https://www.redhat.com, com.redhat.component=openstack-collectd-container, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, tcib_managed=true, container_name=collectd, io.buildah.version=1.41.4, summary=Red Hat OpenStack Platform 17.1 collectd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, name=rhosp17/openstack-collectd, release=1761123044, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step3, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, distribution-scope=public, vendor=Red Hat, Inc., org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, version=17.1.12, managed_by=tripleo_ansible, vcs-type=git, architecture=x86_64, batch=17.1_20251118.1, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, konflux.additional-tags=17.1.12 17.1_20251118.1, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 
'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, build-date=2025-11-18T22:51:28Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, description=Red Hat OpenStack Platform 17.1 collectd) Dec 15 03:50:28 localhost podman[97880]: 2025-12-15 08:50:28.778401235 +0000 UTC m=+0.103354706 container exec_died 165a6fd1f2b629d16da0798ec1c9bac41d302d530a0ffa668b2315a52d6aa04e (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.expose-services=, release=1761123044, description=Red Hat OpenStack Platform 17.1 collectd, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, batch=17.1_20251118.1, architecture=x86_64, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, tcib_managed=true, com.redhat.component=openstack-collectd-container, container_name=collectd, vcs-type=git, distribution-scope=public, 
config_id=tripleo_step3, summary=Red Hat OpenStack Platform 17.1 collectd, version=17.1.12, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, name=rhosp17/openstack-collectd, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.buildah.version=1.41.4, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-18T22:51:28Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, managed_by=tripleo_ansible) Dec 15 03:50:28 localhost 
systemd[1]: 165a6fd1f2b629d16da0798ec1c9bac41d302d530a0ffa668b2315a52d6aa04e.service: Deactivated successfully. Dec 15 03:50:28 localhost systemd[1]: tmp-crun.BLJcWa.mount: Deactivated successfully. Dec 15 03:50:28 localhost podman[97885]: 2025-12-15 08:50:28.837120706 +0000 UTC m=+0.151557315 container health_status 36a88083ed613f6871d2631ae462bca308a45ba583333f7cfe4d0b0c40a18ba5 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, vcs-type=git, description=Red Hat OpenStack Platform 17.1 nova-compute, url=https://www.redhat.com, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.12, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.41.4, konflux.additional-tags=17.1.12 17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '182e509007ab5e6e5b2500a552cbd5ba-879500e96bf8dfb93687004bd86f2317'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', 
'/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, architecture=x86_64, build-date=2025-11-19T00:36:58Z, config_id=tripleo_step5, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, container_name=nova_compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-nova-compute-container, vendor=Red Hat, Inc., batch=17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, managed_by=tripleo_ansible, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, release=1761123044, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, tcib_managed=true) Dec 15 03:50:28 localhost podman[97885]: 2025-12-15 08:50:28.869582094 +0000 UTC m=+0.184018683 container exec_died 36a88083ed613f6871d2631ae462bca308a45ba583333f7cfe4d0b0c40a18ba5 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '182e509007ab5e6e5b2500a552cbd5ba-879500e96bf8dfb93687004bd86f2317'}, 'healthcheck': 
{'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, managed_by=tripleo_ansible, container_name=nova_compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, batch=17.1_20251118.1, build-date=2025-11-19T00:36:58Z, version=17.1.12, tcib_managed=true, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.41.4, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.tags=rhosp osp 
openstack osp-17.1 openstack-nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-nova-compute, distribution-scope=public, vcs-type=git, config_id=tripleo_step5, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.component=openstack-nova-compute-container, release=1761123044, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=) Dec 15 03:50:28 localhost systemd[1]: 36a88083ed613f6871d2631ae462bca308a45ba583333f7cfe4d0b0c40a18ba5.service: Deactivated successfully. Dec 15 03:50:28 localhost podman[97893]: 2025-12-15 08:50:28.884255676 +0000 UTC m=+0.189994833 container health_status ac45612fab357b615b4684dbfa5bd000dd29eb1adfbaedb6db0802fc49e89549 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, version=17.1.12, build-date=2025-11-18T22:49:32Z, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, maintainer=OpenStack TripleO Team, architecture=x86_64, tcib_managed=true, name=rhosp17/openstack-cron, url=https://www.redhat.com, distribution-scope=public, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, release=1761123044, summary=Red Hat OpenStack Platform 17.1 cron, io.openshift.expose-services=, config_id=tripleo_step4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': 
'/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, description=Red Hat OpenStack Platform 17.1 cron, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, konflux.additional-tags=17.1.12 17.1_20251118.1, io.buildah.version=1.41.4, vendor=Red Hat, Inc., container_name=logrotate_crond, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, batch=17.1_20251118.1, com.redhat.component=openstack-cron-container, vcs-type=git) Dec 15 03:50:28 localhost podman[97893]: 2025-12-15 08:50:28.918348359 +0000 UTC m=+0.224087576 container exec_died ac45612fab357b615b4684dbfa5bd000dd29eb1adfbaedb6db0802fc49e89549 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 cron, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, name=rhosp17/openstack-cron, 
distribution-scope=public, io.buildah.version=1.41.4, summary=Red Hat OpenStack Platform 17.1 cron, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, konflux.additional-tags=17.1.12 17.1_20251118.1, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, architecture=x86_64, io.openshift.expose-services=, config_id=tripleo_step4, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-18T22:49:32Z, tcib_managed=true, url=https://www.redhat.com, vcs-type=git, container_name=logrotate_crond, batch=17.1_20251118.1, release=1761123044, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.component=openstack-cron-container, version=17.1.12) Dec 15 03:50:28 localhost podman[97881]: 2025-12-15 
08:50:28.930622457 +0000 UTC m=+0.253123432 container health_status 2649c3aec9af2f04509e1d248b1050b202fccc3ed2b86421c89a2a65376db057 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, managed_by=tripleo_ansible, config_id=tripleo_step3, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, url=https://www.redhat.com, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '182e509007ab5e6e5b2500a552cbd5ba'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, batch=17.1_20251118.1, container_name=iscsid, vendor=Red Hat, Inc., vcs-type=git, build-date=2025-11-18T23:44:13Z, description=Red Hat OpenStack Platform 17.1 iscsid, konflux.additional-tags=17.1.12 17.1_20251118.1, tcib_managed=true, 
architecture=x86_64, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, version=17.1.12, summary=Red Hat OpenStack Platform 17.1 iscsid, release=1761123044, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, com.redhat.component=openstack-iscsid-container, io.buildah.version=1.41.4, distribution-scope=public, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-iscsid, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid) Dec 15 03:50:28 localhost systemd[1]: ac45612fab357b615b4684dbfa5bd000dd29eb1adfbaedb6db0802fc49e89549.service: Deactivated successfully. Dec 15 03:50:28 localhost podman[97899]: 2025-12-15 08:50:28.85705985 +0000 UTC m=+0.156600511 container health_status d6041e5471624a495d5be42684f6049ccf71802a80d5760239330151d465d146 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, architecture=x86_64, batch=17.1_20251118.1, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, config_id=tripleo_step4, maintainer=OpenStack TripleO Team, vcs-type=git, managed_by=tripleo_ansible, release=1761123044, name=rhosp17/openstack-ceilometer-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'fcee5a4a91f85471fca7b61211375646'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, io.buildah.version=1.41.4, io.openshift.expose-services=, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, com.redhat.component=openstack-ceilometer-compute-container, build-date=2025-11-19T00:11:48Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, container_name=ceilometer_agent_compute, tcib_managed=true, vendor=Red Hat, Inc., version=17.1.12) Dec 15 03:50:28 localhost podman[97881]: 2025-12-15 08:50:28.968389267 +0000 UTC m=+0.290890252 container exec_died 2649c3aec9af2f04509e1d248b1050b202fccc3ed2b86421c89a2a65376db057 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.expose-services=, build-date=2025-11-18T23:44:13Z, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, vendor=Red Hat, Inc., batch=17.1_20251118.1, version=17.1.12, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, 
name=rhosp17/openstack-iscsid, config_id=tripleo_step3, container_name=iscsid, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '182e509007ab5e6e5b2500a552cbd5ba'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, maintainer=OpenStack TripleO Team, vcs-type=git, release=1761123044, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 iscsid, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 iscsid, managed_by=tripleo_ansible, tcib_managed=true, url=https://www.redhat.com, architecture=x86_64, com.redhat.component=openstack-iscsid-container, io.buildah.version=1.41.4, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d) Dec 
15 03:50:28 localhost systemd[1]: 2649c3aec9af2f04509e1d248b1050b202fccc3ed2b86421c89a2a65376db057.service: Deactivated successfully. Dec 15 03:50:28 localhost podman[97899]: 2025-12-15 08:50:28.990396506 +0000 UTC m=+0.289937127 container exec_died d6041e5471624a495d5be42684f6049ccf71802a80d5760239330151d465d146 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, batch=17.1_20251118.1, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, vendor=Red Hat, Inc., release=1761123044, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, url=https://www.redhat.com, com.redhat.component=openstack-ceilometer-compute-container, container_name=ceilometer_agent_compute, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_id=tripleo_step4, tcib_managed=true, build-date=2025-11-19T00:11:48Z, konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible, architecture=x86_64, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, version=17.1.12, io.openshift.expose-services=, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'fcee5a4a91f85471fca7b61211375646'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, name=rhosp17/openstack-ceilometer-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI) Dec 15 03:50:29 localhost systemd[1]: d6041e5471624a495d5be42684f6049ccf71802a80d5760239330151d465d146.service: Deactivated successfully. 
Dec 15 03:50:29 localhost podman[97892]: 2025-12-15 08:50:29.04585272 +0000 UTC m=+0.352191603 container health_status 97115ff7c5a84ae7e17f79e696e94da3c63f9fe0051287fe9193bec282a03f99 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-type=git, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1761123044, konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team, container_name=ceilometer_agent_ipmi, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'fcee5a4a91f85471fca7b61211375646'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.k8s.description=Red Hat 
OpenStack Platform 17.1 ceilometer-ipmi, build-date=2025-11-19T00:12:45Z, distribution-scope=public, io.buildah.version=1.41.4, managed_by=tripleo_ansible, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, name=rhosp17/openstack-ceilometer-ipmi, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, batch=17.1_20251118.1, config_id=tripleo_step4, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.component=openstack-ceilometer-ipmi-container, version=17.1.12, architecture=x86_64) Dec 15 03:50:29 localhost podman[97892]: 2025-12-15 08:50:29.074305901 +0000 UTC m=+0.380644824 container exec_died 97115ff7c5a84ae7e17f79e696e94da3c63f9fe0051287fe9193bec282a03f99 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, architecture=x86_64, com.redhat.component=openstack-ceilometer-ipmi-container, container_name=ceilometer_agent_ipmi, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, name=rhosp17/openstack-ceilometer-ipmi, maintainer=OpenStack TripleO Team, version=17.1.12, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, release=1761123044, tcib_managed=true, konflux.additional-tags=17.1.12 17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, vendor=Red Hat, Inc., config_id=tripleo_step4, io.buildah.version=1.41.4, batch=17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, build-date=2025-11-19T00:12:45Z, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'fcee5a4a91f85471fca7b61211375646'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 
'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, url=https://www.redhat.com, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05) Dec 15 03:50:29 localhost systemd[1]: 97115ff7c5a84ae7e17f79e696e94da3c63f9fe0051287fe9193bec282a03f99.service: Deactivated successfully. Dec 15 03:50:32 localhost systemd[1]: Started /usr/bin/podman healthcheck run 4c3f871f4bdf287b1d3ce717956c1cf130617489217410eadf6fffcc440eb3b0. 
Dec 15 03:50:32 localhost podman[98015]: 2025-12-15 08:50:32.739513394 +0000 UTC m=+0.070762545 container health_status 4c3f871f4bdf287b1d3ce717956c1cf130617489217410eadf6fffcc440eb3b0 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, container_name=nova_migration_target, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, url=https://www.redhat.com, vcs-type=git, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.12, batch=17.1_20251118.1, com.redhat.component=openstack-nova-compute-container, maintainer=OpenStack TripleO Team, tcib_managed=true, io.openshift.expose-services=, io.buildah.version=1.41.4, summary=Red Hat OpenStack Platform 17.1 nova-compute, release=1761123044, architecture=x86_64, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, name=rhosp17/openstack-nova-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, build-date=2025-11-19T00:36:58Z, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '879500e96bf8dfb93687004bd86f2317'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, config_id=tripleo_step4) Dec 15 03:50:33 localhost podman[98015]: 2025-12-15 08:50:33.104646491 +0000 UTC m=+0.435895632 container exec_died 4c3f871f4bdf287b1d3ce717956c1cf130617489217410eadf6fffcc440eb3b0 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, container_name=nova_migration_target, config_id=tripleo_step4, io.buildah.version=1.41.4, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, batch=17.1_20251118.1, com.redhat.component=openstack-nova-compute-container, maintainer=OpenStack TripleO Team, build-date=2025-11-19T00:36:58Z, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-type=git, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp17/openstack-nova-compute, managed_by=tripleo_ansible, 
url=https://www.redhat.com, architecture=x86_64, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, version=17.1.12, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, release=1761123044, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '879500e96bf8dfb93687004bd86f2317'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}) Dec 15 03:50:33 localhost systemd[1]: 4c3f871f4bdf287b1d3ce717956c1cf130617489217410eadf6fffcc440eb3b0.service: Deactivated successfully. Dec 15 03:50:35 localhost systemd[1]: Started /usr/bin/podman healthcheck run 2df8d20c6ba32f38bd0e6519c9c77a4146b632f21c81197d8c319b223aad81f1. Dec 15 03:50:35 localhost systemd[1]: Started /usr/bin/podman healthcheck run 4d35b7dc3f3a4e3c60661e85d3511f50804e87244d74cab67af5c6e8c447d379. Dec 15 03:50:35 localhost systemd[1]: tmp-crun.3I3f8V.mount: Deactivated successfully. 
Dec 15 03:50:35 localhost podman[98038]: 2025-12-15 08:50:35.760381987 +0000 UTC m=+0.095989279 container health_status 2df8d20c6ba32f38bd0e6519c9c77a4146b632f21c81197d8c319b223aad81f1 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=unhealthy, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, release=1761123044, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, name=rhosp17/openstack-ovn-controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, build-date=2025-11-18T23:34:05Z, konflux.additional-tags=17.1.12 17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 ovn-controller, url=https://www.redhat.com, io.openshift.expose-services=, io.buildah.version=1.41.4, vcs-type=git, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, distribution-scope=public, batch=17.1_20251118.1, com.redhat.component=openstack-ovn-controller-container, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, 
container_name=ovn_controller, tcib_managed=true, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, version=17.1.12, config_id=tripleo_step4, summary=Red Hat OpenStack Platform 17.1 ovn-controller) Dec 15 03:50:35 localhost podman[98038]: 2025-12-15 08:50:35.797652314 +0000 UTC m=+0.133259556 container exec_died 2df8d20c6ba32f38bd0e6519c9c77a4146b632f21c81197d8c319b223aad81f1 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, tcib_managed=true, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, vcs-type=git, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, description=Red Hat OpenStack Platform 17.1 ovn-controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, distribution-scope=public, batch=17.1_20251118.1, io.buildah.version=1.41.4, config_id=tripleo_step4, container_name=ovn_controller, version=17.1.12, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, name=rhosp17/openstack-ovn-controller, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.component=openstack-ovn-controller-container, summary=Red Hat OpenStack Platform 17.1 ovn-controller, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, vendor=Red Hat, Inc., url=https://www.redhat.com, build-date=2025-11-18T23:34:05Z, release=1761123044, managed_by=tripleo_ansible) Dec 15 03:50:35 localhost podman[98038]: unhealthy Dec 15 03:50:35 localhost podman[98039]: 2025-12-15 08:50:35.810529829 +0000 UTC m=+0.144913098 container health_status 4d35b7dc3f3a4e3c60661e85d3511f50804e87244d74cab67af5c6e8c447d379 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=unhealthy, name=rhosp17/openstack-neutron-metadata-agent-ovn, config_id=tripleo_step4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20251118.1, distribution-scope=public, vcs-type=git, io.buildah.version=1.41.4, konflux.additional-tags=17.1.12 17.1_20251118.1, url=https://www.redhat.com, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'a56a6f14b467cd9064e40c03defa5ed7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', 
'/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, version=17.1.12, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, build-date=2025-11-19T00:14:25Z, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, container_name=ovn_metadata_agent, release=1761123044, architecture=x86_64, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn) Dec 15 03:50:35 localhost systemd[1]: 2df8d20c6ba32f38bd0e6519c9c77a4146b632f21c81197d8c319b223aad81f1.service: Main process exited, code=exited, status=1/FAILURE Dec 15 03:50:35 localhost systemd[1]: 2df8d20c6ba32f38bd0e6519c9c77a4146b632f21c81197d8c319b223aad81f1.service: Failed with result 'exit-code'. 
Dec 15 03:50:35 localhost podman[98039]: 2025-12-15 08:50:35.850158509 +0000 UTC m=+0.184541738 container exec_died 4d35b7dc3f3a4e3c60661e85d3511f50804e87244d74cab67af5c6e8c447d379 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20251118.1, vcs-type=git, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, config_id=tripleo_step4, release=1761123044, maintainer=OpenStack TripleO Team, version=17.1.12, architecture=x86_64, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'a56a6f14b467cd9064e40c03defa5ed7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', 
'/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, build-date=2025-11-19T00:14:25Z, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, distribution-scope=public, name=rhosp17/openstack-neutron-metadata-agent-ovn, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.buildah.version=1.41.4, managed_by=tripleo_ansible, container_name=ovn_metadata_agent, io.openshift.expose-services=, url=https://www.redhat.com) Dec 15 03:50:35 localhost podman[98039]: unhealthy Dec 15 03:50:35 localhost systemd[1]: 4d35b7dc3f3a4e3c60661e85d3511f50804e87244d74cab67af5c6e8c447d379.service: Main process exited, code=exited, status=1/FAILURE Dec 15 03:50:35 localhost systemd[1]: 4d35b7dc3f3a4e3c60661e85d3511f50804e87244d74cab67af5c6e8c447d379.service: Failed with result 'exit-code'. Dec 15 03:50:44 localhost systemd[1]: Started /usr/bin/podman healthcheck run 6278d648aec9c9f12e0efaafa0d6ae5653eeb34459516699e562eeb9c8d23b3c. Dec 15 03:50:44 localhost systemd[1]: Starting Check and recover tripleo_nova_virtqemud... Dec 15 03:50:44 localhost recover_tripleo_nova_virtqemud[98085]: 61849 Dec 15 03:50:44 localhost systemd[1]: tripleo_nova_virtqemud_recover.service: Deactivated successfully. Dec 15 03:50:44 localhost systemd[1]: Finished Check and recover tripleo_nova_virtqemud. 
Dec 15 03:50:44 localhost podman[98078]: 2025-12-15 08:50:44.745958129 +0000 UTC m=+0.079838227 container health_status 6278d648aec9c9f12e0efaafa0d6ae5653eeb34459516699e562eeb9c8d23b3c (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, distribution-scope=public, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 qdrouterd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, release=1761123044, io.openshift.expose-services=, url=https://www.redhat.com, config_id=tripleo_step1, build-date=2025-11-18T22:49:46Z, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '29d07da103bbd44a9ed3e29999314b03'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, batch=17.1_20251118.1, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, vcs-type=git, name=rhosp17/openstack-qdrouterd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.component=openstack-qdrouterd-container, io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, konflux.additional-tags=17.1.12 17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, version=17.1.12, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, architecture=x86_64, container_name=metrics_qdr, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 qdrouterd) Dec 15 03:50:44 localhost podman[98078]: 2025-12-15 08:50:44.962100461 +0000 UTC m=+0.295980539 container exec_died 6278d648aec9c9f12e0efaafa0d6ae5653eeb34459516699e562eeb9c8d23b3c (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 qdrouterd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, version=17.1.12, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, batch=17.1_20251118.1, build-date=2025-11-18T22:49:46Z, vcs-type=git, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '29d07da103bbd44a9ed3e29999314b03'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, config_id=tripleo_step1, release=1761123044, container_name=metrics_qdr, com.redhat.component=openstack-qdrouterd-container, io.buildah.version=1.41.4, name=rhosp17/openstack-qdrouterd, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, konflux.additional-tags=17.1.12 17.1_20251118.1, distribution-scope=public, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd) Dec 15 03:50:44 localhost systemd[1]: 6278d648aec9c9f12e0efaafa0d6ae5653eeb34459516699e562eeb9c8d23b3c.service: Deactivated successfully. Dec 15 03:50:59 localhost systemd[1]: Started /usr/bin/podman healthcheck run 165a6fd1f2b629d16da0798ec1c9bac41d302d530a0ffa668b2315a52d6aa04e. Dec 15 03:50:59 localhost systemd[1]: Started /usr/bin/podman healthcheck run 2649c3aec9af2f04509e1d248b1050b202fccc3ed2b86421c89a2a65376db057. Dec 15 03:50:59 localhost systemd[1]: Started /usr/bin/podman healthcheck run 36a88083ed613f6871d2631ae462bca308a45ba583333f7cfe4d0b0c40a18ba5. Dec 15 03:50:59 localhost systemd[1]: Started /usr/bin/podman healthcheck run 97115ff7c5a84ae7e17f79e696e94da3c63f9fe0051287fe9193bec282a03f99. 
Dec 15 03:50:59 localhost systemd[1]: Started /usr/bin/podman healthcheck run ac45612fab357b615b4684dbfa5bd000dd29eb1adfbaedb6db0802fc49e89549. Dec 15 03:50:59 localhost systemd[1]: Started /usr/bin/podman healthcheck run d6041e5471624a495d5be42684f6049ccf71802a80d5760239330151d465d146. Dec 15 03:50:59 localhost podman[98109]: 2025-12-15 08:50:59.818474334 +0000 UTC m=+0.132673320 container health_status 2649c3aec9af2f04509e1d248b1050b202fccc3ed2b86421c89a2a65376db057 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, vcs-type=git, release=1761123044, name=rhosp17/openstack-iscsid, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, tcib_managed=true, com.redhat.component=openstack-iscsid-container, io.openshift.expose-services=, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, distribution-scope=public, container_name=iscsid, version=17.1.12, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, build-date=2025-11-18T23:44:13Z, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '182e509007ab5e6e5b2500a552cbd5ba'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', 
'/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.4, config_id=tripleo_step3, description=Red Hat OpenStack Platform 17.1 iscsid, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, batch=17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 iscsid, managed_by=tripleo_ansible) Dec 15 03:50:59 localhost podman[98109]: 2025-12-15 08:50:59.832269113 +0000 UTC m=+0.146468089 container exec_died 2649c3aec9af2f04509e1d248b1050b202fccc3ed2b86421c89a2a65376db057 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, io.buildah.version=1.41.4, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, release=1761123044, version=17.1.12, konflux.additional-tags=17.1.12 17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '182e509007ab5e6e5b2500a552cbd5ba'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, summary=Red Hat OpenStack Platform 17.1 iscsid, config_id=tripleo_step3, distribution-scope=public, container_name=iscsid, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-iscsid, description=Red Hat OpenStack Platform 17.1 iscsid, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, managed_by=tripleo_ansible, url=https://www.redhat.com, tcib_managed=true, architecture=x86_64, batch=17.1_20251118.1, build-date=2025-11-18T23:44:13Z, com.redhat.component=openstack-iscsid-container, vendor=Red Hat, Inc., vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d) Dec 15 03:50:59 localhost systemd[1]: 2649c3aec9af2f04509e1d248b1050b202fccc3ed2b86421c89a2a65376db057.service: Deactivated successfully. 
Dec 15 03:50:59 localhost podman[98117]: 2025-12-15 08:50:59.890631565 +0000 UTC m=+0.194611968 container health_status ac45612fab357b615b4684dbfa5bd000dd29eb1adfbaedb6db0802fc49e89549 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, tcib_managed=true, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, version=17.1.12, architecture=x86_64, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, description=Red Hat OpenStack Platform 17.1 cron, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, url=https://www.redhat.com, 
summary=Red Hat OpenStack Platform 17.1 cron, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.expose-services=, build-date=2025-11-18T22:49:32Z, com.redhat.component=openstack-cron-container, name=rhosp17/openstack-cron, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_id=tripleo_step4, batch=17.1_20251118.1, container_name=logrotate_crond, distribution-scope=public, release=1761123044, io.buildah.version=1.41.4, vcs-type=git, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible) Dec 15 03:50:59 localhost podman[98111]: 2025-12-15 08:50:59.791572384 +0000 UTC m=+0.099117852 container health_status 97115ff7c5a84ae7e17f79e696e94da3c63f9fe0051287fe9193bec282a03f99 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, managed_by=tripleo_ansible, version=17.1.12, batch=17.1_20251118.1, container_name=ceilometer_agent_ipmi, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-ceilometer-ipmi, architecture=x86_64, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'fcee5a4a91f85471fca7b61211375646'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', 
'/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.component=openstack-ceilometer-ipmi-container, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, io.openshift.expose-services=, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, build-date=2025-11-19T00:12:45Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.buildah.version=1.41.4, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-type=git, config_id=tripleo_step4, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, release=1761123044) Dec 15 03:50:59 localhost podman[98117]: 2025-12-15 08:50:59.921926752 +0000 UTC m=+0.225907135 container exec_died ac45612fab357b615b4684dbfa5bd000dd29eb1adfbaedb6db0802fc49e89549 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, architecture=x86_64, build-date=2025-11-18T22:49:32Z, description=Red Hat OpenStack Platform 17.1 cron, config_id=tripleo_step4, io.buildah.version=1.41.4, container_name=logrotate_crond, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 cron, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 
'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, batch=17.1_20251118.1, release=1761123044, tcib_managed=true, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, konflux.additional-tags=17.1.12 17.1_20251118.1, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, version=17.1.12, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, com.redhat.component=openstack-cron-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, distribution-scope=public, vcs-type=git, name=rhosp17/openstack-cron) Dec 15 03:50:59 localhost systemd[1]: ac45612fab357b615b4684dbfa5bd000dd29eb1adfbaedb6db0802fc49e89549.service: Deactivated successfully. 
Dec 15 03:50:59 localhost podman[98127]: 2025-12-15 08:50:59.938511305 +0000 UTC m=+0.240946736 container health_status d6041e5471624a495d5be42684f6049ccf71802a80d5760239330151d465d146 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_id=tripleo_step4, distribution-scope=public, io.openshift.expose-services=, vcs-type=git, name=rhosp17/openstack-ceilometer-compute, url=https://www.redhat.com, version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'fcee5a4a91f85471fca7b61211375646'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, build-date=2025-11-19T00:11:48Z, release=1761123044, tcib_managed=true, container_name=ceilometer_agent_compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.buildah.version=1.41.4, com.redhat.component=openstack-ceilometer-compute-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, batch=17.1_20251118.1, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, architecture=x86_64, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, konflux.additional-tags=17.1.12 17.1_20251118.1) Dec 15 03:50:59 localhost podman[98110]: 2025-12-15 08:50:59.967899631 +0000 UTC m=+0.278762978 container health_status 36a88083ed613f6871d2631ae462bca308a45ba583333f7cfe4d0b0c40a18ba5 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, version=17.1.12, batch=17.1_20251118.1, io.buildah.version=1.41.4, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp17/openstack-nova-compute, com.redhat.component=openstack-nova-compute-container, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step5, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, release=1761123044, distribution-scope=public, vendor=Red Hat, Inc., build-date=2025-11-19T00:36:58Z, 
tcib_managed=true, managed_by=tripleo_ansible, architecture=x86_64, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '182e509007ab5e6e5b2500a552cbd5ba-879500e96bf8dfb93687004bd86f2317'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, vcs-type=git, url=https://www.redhat.com, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 
openstack-nova-compute, container_name=nova_compute) Dec 15 03:50:59 localhost podman[98111]: 2025-12-15 08:50:59.973300846 +0000 UTC m=+0.280846354 container exec_died 97115ff7c5a84ae7e17f79e696e94da3c63f9fe0051287fe9193bec282a03f99 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, build-date=2025-11-19T00:12:45Z, url=https://www.redhat.com, com.redhat.component=openstack-ceilometer-ipmi-container, maintainer=OpenStack TripleO Team, container_name=ceilometer_agent_ipmi, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'fcee5a4a91f85471fca7b61211375646'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, managed_by=tripleo_ansible, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-type=git, io.buildah.version=1.41.4, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, version=17.1.12, 
release=1761123044, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, config_id=tripleo_step4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, distribution-scope=public, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, tcib_managed=true, name=rhosp17/openstack-ceilometer-ipmi, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, batch=17.1_20251118.1, architecture=x86_64) Dec 15 03:50:59 localhost podman[98108]: 2025-12-15 08:50:59.872180211 +0000 UTC m=+0.189227713 container health_status 165a6fd1f2b629d16da0798ec1c9bac41d302d530a0ffa668b2315a52d6aa04e (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, summary=Red Hat OpenStack Platform 17.1 collectd, tcib_managed=true, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 collectd, com.redhat.component=openstack-collectd-container, release=1761123044, maintainer=OpenStack TripleO Team, build-date=2025-11-18T22:51:28Z, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, konflux.additional-tags=17.1.12 17.1_20251118.1, architecture=x86_64, version=17.1.12, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, url=https://www.redhat.com, managed_by=tripleo_ansible, vcs-type=git, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': 
'/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, config_id=tripleo_step3, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, name=rhosp17/openstack-collectd, batch=17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=collectd) Dec 15 03:50:59 localhost podman[98110]: 2025-12-15 08:50:59.994404891 +0000 UTC m=+0.305268318 container exec_died 36a88083ed613f6871d2631ae462bca308a45ba583333f7cfe4d0b0c40a18ba5 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, name=rhosp17/openstack-nova-compute, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-nova-compute-container, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, description=Red Hat OpenStack Platform 17.1 nova-compute, 
url=https://www.redhat.com, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.12, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, container_name=nova_compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '182e509007ab5e6e5b2500a552cbd5ba-879500e96bf8dfb93687004bd86f2317'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, architecture=x86_64, release=1761123044, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_id=tripleo_step5, io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1, distribution-scope=public, build-date=2025-11-19T00:36:58Z, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, managed_by=tripleo_ansible, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, batch=17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, io.buildah.version=1.41.4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc.) Dec 15 03:51:00 localhost systemd[1]: 36a88083ed613f6871d2631ae462bca308a45ba583333f7cfe4d0b0c40a18ba5.service: Deactivated successfully. Dec 15 03:51:00 localhost podman[98108]: 2025-12-15 08:51:00.009435592 +0000 UTC m=+0.326483124 container exec_died 165a6fd1f2b629d16da0798ec1c9bac41d302d530a0ffa668b2315a52d6aa04e (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-collectd-container, vendor=Red Hat, Inc., managed_by=tripleo_ansible, batch=17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, vcs-type=git, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, io.buildah.version=1.41.4, url=https://www.redhat.com, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, konflux.additional-tags=17.1.12 17.1_20251118.1, release=1761123044, version=17.1.12, config_id=tripleo_step3, name=rhosp17/openstack-collectd, architecture=x86_64, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.expose-services=, tcib_managed=true, container_name=collectd, description=Red Hat OpenStack Platform 17.1 collectd, build-date=2025-11-18T22:51:28Z) Dec 15 03:51:00 localhost podman[98127]: 2025-12-15 08:51:00.019661796 +0000 UTC m=+0.322097187 container exec_died d6041e5471624a495d5be42684f6049ccf71802a80d5760239330151d465d146 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, com.redhat.component=openstack-ceilometer-compute-container, 
distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.buildah.version=1.41.4, architecture=x86_64, container_name=ceilometer_agent_compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, version=17.1.12, release=1761123044, url=https://www.redhat.com, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'fcee5a4a91f85471fca7b61211375646'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, vendor=Red Hat, Inc., build-date=2025-11-19T00:11:48Z, konflux.additional-tags=17.1.12 17.1_20251118.1, name=rhosp17/openstack-ceilometer-compute, batch=17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, 
tcib_managed=true, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, managed_by=tripleo_ansible, config_id=tripleo_step4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute) Dec 15 03:51:00 localhost systemd[1]: 165a6fd1f2b629d16da0798ec1c9bac41d302d530a0ffa668b2315a52d6aa04e.service: Deactivated successfully. Dec 15 03:51:00 localhost systemd[1]: d6041e5471624a495d5be42684f6049ccf71802a80d5760239330151d465d146.service: Deactivated successfully. Dec 15 03:51:00 localhost systemd[1]: 97115ff7c5a84ae7e17f79e696e94da3c63f9fe0051287fe9193bec282a03f99.service: Deactivated successfully. Dec 15 03:51:03 localhost systemd[1]: Started /usr/bin/podman healthcheck run 4c3f871f4bdf287b1d3ce717956c1cf130617489217410eadf6fffcc440eb3b0. Dec 15 03:51:03 localhost systemd[1]: tmp-crun.neZ4vZ.mount: Deactivated successfully. 
Dec 15 03:51:03 localhost podman[98319]: 2025-12-15 08:51:03.566783408 +0000 UTC m=+0.094846059 container health_status 4c3f871f4bdf287b1d3ce717956c1cf130617489217410eadf6fffcc440eb3b0 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-nova-compute-container, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, build-date=2025-11-19T00:36:58Z, version=17.1.12, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, tcib_managed=true, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, url=https://www.redhat.com, distribution-scope=public, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp17/openstack-nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1, config_id=tripleo_step4, io.openshift.expose-services=, container_name=nova_migration_target, konflux.additional-tags=17.1.12 17.1_20251118.1, release=1761123044, description=Red Hat OpenStack Platform 17.1 nova-compute, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '879500e96bf8dfb93687004bd86f2317'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute) Dec 15 03:51:03 localhost podman[98319]: 2025-12-15 08:51:03.966714627 +0000 UTC m=+0.494777258 container exec_died 4c3f871f4bdf287b1d3ce717956c1cf130617489217410eadf6fffcc440eb3b0 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, summary=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.12, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, name=rhosp17/openstack-nova-compute, tcib_managed=true, architecture=x86_64, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, vendor=Red Hat, Inc., container_name=nova_migration_target, konflux.additional-tags=17.1.12 17.1_20251118.1, release=1761123044, batch=17.1_20251118.1, url=https://www.redhat.com, com.redhat.component=openstack-nova-compute-container, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '879500e96bf8dfb93687004bd86f2317'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, vcs-type=git, io.buildah.version=1.41.4, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, build-date=2025-11-19T00:36:58Z, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 nova-compute) Dec 15 03:51:03 localhost systemd[1]: 4c3f871f4bdf287b1d3ce717956c1cf130617489217410eadf6fffcc440eb3b0.service: Deactivated successfully. Dec 15 03:51:06 localhost systemd[1]: Started /usr/bin/podman healthcheck run 2df8d20c6ba32f38bd0e6519c9c77a4146b632f21c81197d8c319b223aad81f1. Dec 15 03:51:06 localhost systemd[1]: Started /usr/bin/podman healthcheck run 4d35b7dc3f3a4e3c60661e85d3511f50804e87244d74cab67af5c6e8c447d379. Dec 15 03:51:06 localhost systemd[1]: tmp-crun.I6rB8K.mount: Deactivated successfully. 
Dec 15 03:51:06 localhost podman[98346]: 2025-12-15 08:51:06.741094177 +0000 UTC m=+0.070853276 container health_status 4d35b7dc3f3a4e3c60661e85d3511f50804e87244d74cab67af5c6e8c447d379 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=unhealthy, konflux.additional-tags=17.1.12 17.1_20251118.1, url=https://www.redhat.com, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'a56a6f14b467cd9064e40c03defa5ed7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, distribution-scope=public, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack 
Platform 17.1 neutron-metadata-agent-ovn, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.buildah.version=1.41.4, io.openshift.expose-services=, tcib_managed=true, container_name=ovn_metadata_agent, build-date=2025-11-19T00:14:25Z, config_id=tripleo_step4, batch=17.1_20251118.1, name=rhosp17/openstack-neutron-metadata-agent-ovn, version=17.1.12, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, architecture=x86_64, release=1761123044, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, managed_by=tripleo_ansible) Dec 15 03:51:06 localhost podman[98346]: 2025-12-15 08:51:06.757249209 +0000 UTC m=+0.087007958 container exec_died 4d35b7dc3f3a4e3c60661e85d3511f50804e87244d74cab67af5c6e8c447d379 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, container_name=ovn_metadata_agent, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, build-date=2025-11-19T00:14:25Z, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, architecture=x86_64, config_id=tripleo_step4, distribution-scope=public, version=17.1.12, io.k8s.display-name=Red Hat OpenStack 
Platform 17.1 neutron-metadata-agent-ovn, release=1761123044, managed_by=tripleo_ansible, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'a56a6f14b467cd9064e40c03defa5ed7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, konflux.additional-tags=17.1.12 17.1_20251118.1, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20251118.1, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, io.buildah.version=1.41.4, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, name=rhosp17/openstack-neutron-metadata-agent-ovn, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, vendor=Red Hat, Inc., 
com.redhat.component=openstack-neutron-metadata-agent-ovn-container, vcs-type=git, tcib_managed=true) Dec 15 03:51:06 localhost podman[98346]: unhealthy Dec 15 03:51:06 localhost systemd[1]: 4d35b7dc3f3a4e3c60661e85d3511f50804e87244d74cab67af5c6e8c447d379.service: Main process exited, code=exited, status=1/FAILURE Dec 15 03:51:06 localhost systemd[1]: 4d35b7dc3f3a4e3c60661e85d3511f50804e87244d74cab67af5c6e8c447d379.service: Failed with result 'exit-code'. Dec 15 03:51:06 localhost podman[98345]: 2025-12-15 08:51:06.765779567 +0000 UTC m=+0.095605708 container health_status 2df8d20c6ba32f38bd0e6519c9c77a4146b632f21c81197d8c319b223aad81f1 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=unhealthy, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, summary=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.component=openstack-ovn-controller-container, vendor=Red Hat, Inc., org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, distribution-scope=public, managed_by=tripleo_ansible, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-11-18T23:34:05Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1, container_name=ovn_controller, maintainer=OpenStack TripleO Team, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 ovn-controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 
'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, io.buildah.version=1.41.4, release=1761123044, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, name=rhosp17/openstack-ovn-controller, version=17.1.12, tcib_managed=true, vcs-type=git, konflux.additional-tags=17.1.12 17.1_20251118.1, url=https://www.redhat.com) Dec 15 03:51:06 localhost podman[98345]: 2025-12-15 08:51:06.845220902 +0000 UTC m=+0.175047113 container exec_died 2df8d20c6ba32f38bd0e6519c9c77a4146b632f21c81197d8c319b223aad81f1 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, build-date=2025-11-18T23:34:05Z, vcs-type=git, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step4, name=rhosp17/openstack-ovn-controller, summary=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, description=Red Hat OpenStack Platform 17.1 ovn-controller, distribution-scope=public, io.buildah.version=1.41.4, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', 
'/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, com.redhat.component=openstack-ovn-controller-container, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, batch=17.1_20251118.1, container_name=ovn_controller, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, architecture=x86_64, release=1761123044, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, version=17.1.12, konflux.additional-tags=17.1.12 17.1_20251118.1, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, vendor=Red Hat, Inc.) Dec 15 03:51:06 localhost podman[98345]: unhealthy Dec 15 03:51:06 localhost systemd[1]: 2df8d20c6ba32f38bd0e6519c9c77a4146b632f21c81197d8c319b223aad81f1.service: Main process exited, code=exited, status=1/FAILURE Dec 15 03:51:06 localhost systemd[1]: 2df8d20c6ba32f38bd0e6519c9c77a4146b632f21c81197d8c319b223aad81f1.service: Failed with result 'exit-code'. Dec 15 03:51:15 localhost systemd[1]: Started /usr/bin/podman healthcheck run 6278d648aec9c9f12e0efaafa0d6ae5653eeb34459516699e562eeb9c8d23b3c. 
Dec 15 03:51:15 localhost podman[98384]: 2025-12-15 08:51:15.748372507 +0000 UTC m=+0.078610544 container health_status 6278d648aec9c9f12e0efaafa0d6ae5653eeb34459516699e562eeb9c8d23b3c (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, distribution-scope=public, name=rhosp17/openstack-qdrouterd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, container_name=metrics_qdr, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, config_id=tripleo_step1, com.redhat.component=openstack-qdrouterd-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, io.openshift.expose-services=, batch=17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, architecture=x86_64, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '29d07da103bbd44a9ed3e29999314b03'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, release=1761123044, description=Red Hat OpenStack Platform 17.1 qdrouterd, url=https://www.redhat.com, tcib_managed=true, build-date=2025-11-18T22:49:46Z, io.buildah.version=1.41.4, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, summary=Red Hat OpenStack Platform 17.1 qdrouterd, version=17.1.12) Dec 15 03:51:15 localhost podman[98384]: 2025-12-15 08:51:15.926007989 +0000 UTC m=+0.256246026 container exec_died 6278d648aec9c9f12e0efaafa0d6ae5653eeb34459516699e562eeb9c8d23b3c (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, summary=Red Hat OpenStack Platform 17.1 qdrouterd, version=17.1.12, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, name=rhosp17/openstack-qdrouterd, architecture=x86_64, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.component=openstack-qdrouterd-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, build-date=2025-11-18T22:49:46Z, io.buildah.version=1.41.4, config_id=tripleo_step1, maintainer=OpenStack TripleO Team, container_name=metrics_qdr, vendor=Red Hat, Inc., io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vcs-type=git, batch=17.1_20251118.1, managed_by=tripleo_ansible, distribution-scope=public, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '29d07da103bbd44a9ed3e29999314b03'}, 'healthcheck': {'test': 
'/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, tcib_managed=true, release=1761123044, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.12 17.1_20251118.1, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd) Dec 15 03:51:15 localhost systemd[1]: 6278d648aec9c9f12e0efaafa0d6ae5653eeb34459516699e562eeb9c8d23b3c.service: Deactivated successfully. Dec 15 03:51:19 localhost sshd[98413]: main: sshd: ssh-rsa algorithm is disabled Dec 15 03:51:30 localhost systemd[1]: Started /usr/bin/podman healthcheck run 165a6fd1f2b629d16da0798ec1c9bac41d302d530a0ffa668b2315a52d6aa04e. Dec 15 03:51:30 localhost systemd[1]: Started /usr/bin/podman healthcheck run 2649c3aec9af2f04509e1d248b1050b202fccc3ed2b86421c89a2a65376db057. Dec 15 03:51:30 localhost systemd[1]: Started /usr/bin/podman healthcheck run 36a88083ed613f6871d2631ae462bca308a45ba583333f7cfe4d0b0c40a18ba5. 
Dec 15 03:51:30 localhost systemd[1]: Started /usr/bin/podman healthcheck run 97115ff7c5a84ae7e17f79e696e94da3c63f9fe0051287fe9193bec282a03f99. Dec 15 03:51:30 localhost systemd[1]: Started /usr/bin/podman healthcheck run ac45612fab357b615b4684dbfa5bd000dd29eb1adfbaedb6db0802fc49e89549. Dec 15 03:51:30 localhost systemd[1]: Started /usr/bin/podman healthcheck run d6041e5471624a495d5be42684f6049ccf71802a80d5760239330151d465d146. Dec 15 03:51:30 localhost systemd[1]: tmp-crun.NGa0Jo.mount: Deactivated successfully. Dec 15 03:51:30 localhost podman[98418]: 2025-12-15 08:51:30.827530913 +0000 UTC m=+0.140877295 container health_status 36a88083ed613f6871d2631ae462bca308a45ba583333f7cfe4d0b0c40a18ba5 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.41.4, konflux.additional-tags=17.1.12 17.1_20251118.1, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 nova-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '182e509007ab5e6e5b2500a552cbd5ba-879500e96bf8dfb93687004bd86f2317'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, com.redhat.component=openstack-nova-compute-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, maintainer=OpenStack TripleO Team, container_name=nova_compute, release=1761123044, build-date=2025-11-19T00:36:58Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, url=https://www.redhat.com, batch=17.1_20251118.1, name=rhosp17/openstack-nova-compute, version=17.1.12, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, vendor=Red Hat, Inc., config_id=tripleo_step5, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d) Dec 15 03:51:30 localhost podman[98418]: 2025-12-15 08:51:30.859464706 +0000 UTC m=+0.172811068 container exec_died 
36a88083ed613f6871d2631ae462bca308a45ba583333f7cfe4d0b0c40a18ba5 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, build-date=2025-11-19T00:36:58Z, description=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.41.4, name=rhosp17/openstack-nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, batch=17.1_20251118.1, vcs-type=git, konflux.additional-tags=17.1.12 17.1_20251118.1, release=1761123044, tcib_managed=true, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '182e509007ab5e6e5b2500a552cbd5ba-879500e96bf8dfb93687004bd86f2317'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', 
'/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, version=17.1.12, container_name=nova_compute, url=https://www.redhat.com, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, com.redhat.component=openstack-nova-compute-container, summary=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, distribution-scope=public, config_id=tripleo_step5, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team) Dec 15 03:51:30 localhost systemd[1]: 36a88083ed613f6871d2631ae462bca308a45ba583333f7cfe4d0b0c40a18ba5.service: Deactivated successfully. 
Dec 15 03:51:30 localhost podman[98430]: 2025-12-15 08:51:30.935148029 +0000 UTC m=+0.242641055 container health_status ac45612fab357b615b4684dbfa5bd000dd29eb1adfbaedb6db0802fc49e89549 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, distribution-scope=public, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 cron, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, architecture=x86_64, com.redhat.component=openstack-cron-container, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-cron, config_id=tripleo_step4, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, build-date=2025-11-18T22:49:32Z, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, release=1761123044, version=17.1.12, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, description=Red Hat OpenStack Platform 17.1 cron, vendor=Red Hat, Inc., io.buildah.version=1.41.4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, tcib_managed=true, batch=17.1_20251118.1, container_name=logrotate_crond, managed_by=tripleo_ansible, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, konflux.additional-tags=17.1.12 17.1_20251118.1) Dec 15 03:51:30 localhost podman[98416]: 2025-12-15 08:51:30.800414908 +0000 UTC m=+0.119281448 container health_status 165a6fd1f2b629d16da0798ec1c9bac41d302d530a0ffa668b2315a52d6aa04e (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, managed_by=tripleo_ansible, com.redhat.component=openstack-collectd-container, architecture=x86_64, build-date=2025-11-18T22:51:28Z, vcs-type=git, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, summary=Red Hat OpenStack Platform 17.1 collectd, distribution-scope=public, io.buildah.version=1.41.4, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.expose-services=, config_id=tripleo_step3, batch=17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, konflux.additional-tags=17.1.12 17.1_20251118.1, vendor=Red Hat, Inc., tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1761123044, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, version=17.1.12, name=rhosp17/openstack-collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, container_name=collectd) Dec 15 03:51:30 localhost podman[98430]: 2025-12-15 08:51:30.966083825 +0000 UTC m=+0.273576861 container exec_died ac45612fab357b615b4684dbfa5bd000dd29eb1adfbaedb6db0802fc49e89549 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, managed_by=tripleo_ansible, build-date=2025-11-18T22:49:32Z, distribution-scope=public, config_id=tripleo_step4, io.buildah.version=1.41.4, release=1761123044, architecture=x86_64, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, 
vcs-type=git, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, tcib_managed=true, name=rhosp17/openstack-cron, batch=17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 cron, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-cron-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, version=17.1.12, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 cron, container_name=logrotate_crond, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, io.openshift.expose-services=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron) Dec 15 03:51:30 localhost systemd[1]: 
ac45612fab357b615b4684dbfa5bd000dd29eb1adfbaedb6db0802fc49e89549.service: Deactivated successfully. Dec 15 03:51:30 localhost podman[98417]: 2025-12-15 08:51:30.983150871 +0000 UTC m=+0.300665405 container health_status 2649c3aec9af2f04509e1d248b1050b202fccc3ed2b86421c89a2a65376db057 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, summary=Red Hat OpenStack Platform 17.1 iscsid, vendor=Red Hat, Inc., release=1761123044, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '182e509007ab5e6e5b2500a552cbd5ba'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, io.openshift.expose-services=, tcib_managed=true, container_name=iscsid, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, 
maintainer=OpenStack TripleO Team, config_id=tripleo_step3, url=https://www.redhat.com, build-date=2025-11-18T23:44:13Z, com.redhat.component=openstack-iscsid-container, vcs-type=git, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, konflux.additional-tags=17.1.12 17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, description=Red Hat OpenStack Platform 17.1 iscsid, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, managed_by=tripleo_ansible, name=rhosp17/openstack-iscsid, architecture=x86_64, version=17.1.12, batch=17.1_20251118.1, io.buildah.version=1.41.4) Dec 15 03:51:30 localhost podman[98420]: 2025-12-15 08:51:30.986252404 +0000 UTC m=+0.294141460 container health_status 97115ff7c5a84ae7e17f79e696e94da3c63f9fe0051287fe9193bec282a03f99 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, distribution-scope=public, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, config_id=tripleo_step4, container_name=ceilometer_agent_ipmi, batch=17.1_20251118.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'fcee5a4a91f85471fca7b61211375646'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, maintainer=OpenStack TripleO Team, build-date=2025-11-19T00:12:45Z, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.buildah.version=1.41.4, url=https://www.redhat.com, vendor=Red Hat, Inc., org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, konflux.additional-tags=17.1.12 17.1_20251118.1, name=rhosp17/openstack-ceilometer-ipmi, release=1761123044, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, tcib_managed=true, version=17.1.12, com.redhat.component=openstack-ceilometer-ipmi-container, vcs-type=git) Dec 15 03:51:30 localhost podman[98417]: 2025-12-15 08:51:30.998276355 +0000 UTC m=+0.315790859 container exec_died 2649c3aec9af2f04509e1d248b1050b202fccc3ed2b86421c89a2a65376db057 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '182e509007ab5e6e5b2500a552cbd5ba'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 
'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-iscsid-container, batch=17.1_20251118.1, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, konflux.additional-tags=17.1.12 17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 iscsid, managed_by=tripleo_ansible, name=rhosp17/openstack-iscsid, config_id=tripleo_step3, distribution-scope=public, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, description=Red Hat OpenStack Platform 17.1 iscsid, vendor=Red Hat, Inc., container_name=iscsid, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, version=17.1.12, io.buildah.version=1.41.4, build-date=2025-11-18T23:44:13Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, release=1761123044, architecture=x86_64, vcs-type=git) Dec 15 03:51:31 localhost systemd[1]: 
2649c3aec9af2f04509e1d248b1050b202fccc3ed2b86421c89a2a65376db057.service: Deactivated successfully. Dec 15 03:51:31 localhost podman[98420]: 2025-12-15 08:51:31.017420186 +0000 UTC m=+0.325309292 container exec_died 97115ff7c5a84ae7e17f79e696e94da3c63f9fe0051287fe9193bec282a03f99 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, com.redhat.component=openstack-ceilometer-ipmi-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, release=1761123044, io.openshift.expose-services=, vendor=Red Hat, Inc., architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, konflux.additional-tags=17.1.12 17.1_20251118.1, batch=17.1_20251118.1, url=https://www.redhat.com, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, config_id=tripleo_step4, vcs-type=git, name=rhosp17/openstack-ceilometer-ipmi, version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, container_name=ceilometer_agent_ipmi, distribution-scope=public, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'fcee5a4a91f85471fca7b61211375646'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, build-date=2025-11-19T00:12:45Z, maintainer=OpenStack TripleO Team, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.buildah.version=1.41.4) Dec 15 03:51:31 localhost systemd[1]: 97115ff7c5a84ae7e17f79e696e94da3c63f9fe0051287fe9193bec282a03f99.service: Deactivated successfully. Dec 15 03:51:31 localhost podman[98416]: 2025-12-15 08:51:31.03592339 +0000 UTC m=+0.354789930 container exec_died 165a6fd1f2b629d16da0798ec1c9bac41d302d530a0ffa668b2315a52d6aa04e (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, batch=17.1_20251118.1, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, container_name=collectd, version=17.1.12, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 collectd, vcs-type=git, architecture=x86_64, build-date=2025-11-18T22:51:28Z, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 collectd, release=1761123044, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, 
com.redhat.component=openstack-collectd-container, managed_by=tripleo_ansible, config_id=tripleo_step3, io.buildah.version=1.41.4, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, name=rhosp17/openstack-collectd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, vendor=Red Hat, Inc., io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1) Dec 15 03:51:31 localhost systemd[1]: 165a6fd1f2b629d16da0798ec1c9bac41d302d530a0ffa668b2315a52d6aa04e.service: Deactivated successfully. 
Dec 15 03:51:31 localhost podman[98432]: 2025-12-15 08:51:31.093399026 +0000 UTC m=+0.397309356 container health_status d6041e5471624a495d5be42684f6049ccf71802a80d5760239330151d465d146 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, name=rhosp17/openstack-ceilometer-compute, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-11-19T00:11:48Z, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, container_name=ceilometer_agent_compute, batch=17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, release=1761123044, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, io.buildah.version=1.41.4, url=https://www.redhat.com, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, version=17.1.12, architecture=x86_64, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'fcee5a4a91f85471fca7b61211375646'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', 
'/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-type=git, com.redhat.component=openstack-ceilometer-compute-container, io.openshift.expose-services=, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, config_id=tripleo_step4) Dec 15 03:51:31 localhost podman[98432]: 2025-12-15 08:51:31.150546393 +0000 UTC m=+0.454456703 container exec_died d6041e5471624a495d5be42684f6049ccf71802a80d5760239330151d465d146 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, url=https://www.redhat.com, build-date=2025-11-19T00:11:48Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, batch=17.1_20251118.1, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, io.buildah.version=1.41.4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, container_name=ceilometer_agent_compute, version=17.1.12, managed_by=tripleo_ansible, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'fcee5a4a91f85471fca7b61211375646'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, name=rhosp17/openstack-ceilometer-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, release=1761123044, tcib_managed=true, io.openshift.expose-services=, distribution-scope=public, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vendor=Red Hat, Inc., konflux.additional-tags=17.1.12 17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-type=git, architecture=x86_64, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, com.redhat.component=openstack-ceilometer-compute-container, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute) Dec 15 03:51:31 localhost systemd[1]: d6041e5471624a495d5be42684f6049ccf71802a80d5760239330151d465d146.service: Deactivated successfully. Dec 15 03:51:31 localhost systemd[1]: tmp-crun.yYohQG.mount: Deactivated successfully. Dec 15 03:51:33 localhost sshd[98555]: main: sshd: ssh-rsa algorithm is disabled Dec 15 03:51:34 localhost systemd[1]: Started /usr/bin/podman healthcheck run 4c3f871f4bdf287b1d3ce717956c1cf130617489217410eadf6fffcc440eb3b0. Dec 15 03:51:34 localhost systemd[1]: tmp-crun.cYWYuE.mount: Deactivated successfully. 
Dec 15 03:51:34 localhost podman[98557]: 2025-12-15 08:51:34.752794045 +0000 UTC m=+0.086126202 container health_status 4c3f871f4bdf287b1d3ce717956c1cf130617489217410eadf6fffcc440eb3b0 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, url=https://www.redhat.com, com.redhat.component=openstack-nova-compute-container, summary=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '879500e96bf8dfb93687004bd86f2317'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, build-date=2025-11-19T00:36:58Z, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.buildah.version=1.41.4, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, distribution-scope=public, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, vendor=Red Hat, Inc., container_name=nova_migration_target, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, config_id=tripleo_step4, batch=17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 nova-compute, release=1761123044, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, name=rhosp17/openstack-nova-compute, vcs-type=git, architecture=x86_64, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, version=17.1.12) Dec 15 03:51:35 localhost podman[98557]: 2025-12-15 08:51:35.107255855 +0000 UTC m=+0.440587982 container exec_died 4c3f871f4bdf287b1d3ce717956c1cf130617489217410eadf6fffcc440eb3b0 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, tcib_managed=true, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_id=tripleo_step4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, batch=17.1_20251118.1, container_name=nova_migration_target, release=1761123044, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., vcs-type=git, description=Red Hat OpenStack Platform 17.1 nova-compute, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, version=17.1.12, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '879500e96bf8dfb93687004bd86f2317'}, 'healthcheck': {'test': 
'/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, build-date=2025-11-19T00:36:58Z, io.buildah.version=1.41.4, architecture=x86_64, name=rhosp17/openstack-nova-compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, com.redhat.component=openstack-nova-compute-container, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 nova-compute) Dec 15 03:51:35 localhost systemd[1]: 4c3f871f4bdf287b1d3ce717956c1cf130617489217410eadf6fffcc440eb3b0.service: Deactivated successfully. Dec 15 03:51:37 localhost systemd[1]: Started /usr/bin/podman healthcheck run 2df8d20c6ba32f38bd0e6519c9c77a4146b632f21c81197d8c319b223aad81f1. Dec 15 03:51:37 localhost systemd[1]: Started /usr/bin/podman healthcheck run 4d35b7dc3f3a4e3c60661e85d3511f50804e87244d74cab67af5c6e8c447d379. Dec 15 03:51:37 localhost systemd[1]: tmp-crun.aXDnxJ.mount: Deactivated successfully. 
Dec 15 03:51:37 localhost podman[98583]: 2025-12-15 08:51:37.749374246 +0000 UTC m=+0.081282723 container health_status 4d35b7dc3f3a4e3c60661e85d3511f50804e87244d74cab67af5c6e8c447d379 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=unhealthy, config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, url=https://www.redhat.com, container_name=ovn_metadata_agent, maintainer=OpenStack TripleO Team, batch=17.1_20251118.1, vendor=Red Hat, Inc., config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'a56a6f14b467cd9064e40c03defa5ed7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, io.k8s.description=Red Hat OpenStack Platform 17.1 
neutron-metadata-agent-ovn, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-11-19T00:14:25Z, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, managed_by=tripleo_ansible, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, io.buildah.version=1.41.4, version=17.1.12, tcib_managed=true, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, name=rhosp17/openstack-neutron-metadata-agent-ovn, release=1761123044, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, konflux.additional-tags=17.1.12 17.1_20251118.1, distribution-scope=public) Dec 15 03:51:37 localhost systemd[1]: tmp-crun.fyjNMB.mount: Deactivated successfully. 
Dec 15 03:51:37 localhost podman[98582]: 2025-12-15 08:51:37.77719707 +0000 UTC m=+0.108012478 container health_status 2df8d20c6ba32f38bd0e6519c9c77a4146b632f21c81197d8c319b223aad81f1 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=unhealthy, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, maintainer=OpenStack TripleO Team, release=1761123044, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, distribution-scope=public, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, batch=17.1_20251118.1, architecture=x86_64, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 ovn-controller, vendor=Red Hat, Inc., com.redhat.component=openstack-ovn-controller-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, build-date=2025-11-18T23:34:05Z, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, io.openshift.expose-services=, config_id=tripleo_step4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, 
container_name=ovn_controller, name=rhosp17/openstack-ovn-controller, summary=Red Hat OpenStack Platform 17.1 ovn-controller, io.buildah.version=1.41.4, konflux.additional-tags=17.1.12 17.1_20251118.1, version=17.1.12) Dec 15 03:51:37 localhost podman[98583]: 2025-12-15 08:51:37.803401179 +0000 UTC m=+0.135309696 container exec_died 4d35b7dc3f3a4e3c60661e85d3511f50804e87244d74cab67af5c6e8c447d379 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, release=1761123044, konflux.additional-tags=17.1.12 17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, managed_by=tripleo_ansible, io.openshift.expose-services=, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, name=rhosp17/openstack-neutron-metadata-agent-ovn, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'a56a6f14b467cd9064e40c03defa5ed7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', 
'/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, build-date=2025-11-19T00:14:25Z, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, maintainer=OpenStack TripleO Team, vcs-type=git, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, io.buildah.version=1.41.4, version=17.1.12, tcib_managed=true, container_name=ovn_metadata_agent) Dec 15 03:51:37 localhost podman[98583]: unhealthy Dec 15 03:51:37 localhost systemd[1]: 4d35b7dc3f3a4e3c60661e85d3511f50804e87244d74cab67af5c6e8c447d379.service: Main process exited, code=exited, status=1/FAILURE Dec 15 03:51:37 localhost systemd[1]: 4d35b7dc3f3a4e3c60661e85d3511f50804e87244d74cab67af5c6e8c447d379.service: Failed with result 'exit-code'. 
Dec 15 03:51:37 localhost podman[98582]: 2025-12-15 08:51:37.818838862 +0000 UTC m=+0.149654270 container exec_died 2df8d20c6ba32f38bd0e6519c9c77a4146b632f21c81197d8c319b223aad81f1 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-ovn-controller-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, name=rhosp17/openstack-ovn-controller, batch=17.1_20251118.1, managed_by=tripleo_ansible, io.openshift.expose-services=, release=1761123044, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, vendor=Red Hat, Inc., version=17.1.12, summary=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, url=https://www.redhat.com, vcs-type=git, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, build-date=2025-11-18T23:34:05Z, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 ovn-controller, io.buildah.version=1.41.4, 
konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team, container_name=ovn_controller, distribution-scope=public, config_id=tripleo_step4) Dec 15 03:51:37 localhost podman[98582]: unhealthy Dec 15 03:51:37 localhost systemd[1]: 2df8d20c6ba32f38bd0e6519c9c77a4146b632f21c81197d8c319b223aad81f1.service: Main process exited, code=exited, status=1/FAILURE Dec 15 03:51:37 localhost systemd[1]: 2df8d20c6ba32f38bd0e6519c9c77a4146b632f21c81197d8c319b223aad81f1.service: Failed with result 'exit-code'. Dec 15 03:51:46 localhost systemd[1]: Started /usr/bin/podman healthcheck run 6278d648aec9c9f12e0efaafa0d6ae5653eeb34459516699e562eeb9c8d23b3c. Dec 15 03:51:46 localhost podman[98621]: 2025-12-15 08:51:46.760275184 +0000 UTC m=+0.086589214 container health_status 6278d648aec9c9f12e0efaafa0d6ae5653eeb34459516699e562eeb9c8d23b3c (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.component=openstack-qdrouterd-container, url=https://www.redhat.com, name=rhosp17/openstack-qdrouterd, config_id=tripleo_step1, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, konflux.additional-tags=17.1.12 17.1_20251118.1, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, version=17.1.12, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '29d07da103bbd44a9ed3e29999314b03'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 
'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, maintainer=OpenStack TripleO Team, tcib_managed=true, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, distribution-scope=public, release=1761123044, summary=Red Hat OpenStack Platform 17.1 qdrouterd, build-date=2025-11-18T22:49:46Z, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, batch=17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 qdrouterd, container_name=metrics_qdr, architecture=x86_64, vendor=Red Hat, Inc., vcs-type=git, managed_by=tripleo_ansible) Dec 15 03:51:47 localhost podman[98621]: 2025-12-15 08:51:47.02737463 +0000 UTC m=+0.353688660 container exec_died 6278d648aec9c9f12e0efaafa0d6ae5653eeb34459516699e562eeb9c8d23b3c (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1, config_id=tripleo_step1, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.expose-services=, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 qdrouterd, 
maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vcs-type=git, release=1761123044, container_name=metrics_qdr, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, build-date=2025-11-18T22:49:46Z, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, version=17.1.12, io.buildah.version=1.41.4, vendor=Red Hat, Inc., com.redhat.component=openstack-qdrouterd-container, url=https://www.redhat.com, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '29d07da103bbd44a9ed3e29999314b03'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, name=rhosp17/openstack-qdrouterd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd) Dec 15 03:51:47 localhost systemd[1]: 6278d648aec9c9f12e0efaafa0d6ae5653eeb34459516699e562eeb9c8d23b3c.service: Deactivated successfully. 
Dec 15 03:51:58 localhost systemd[1]: Starting Check and recover tripleo_nova_virtqemud... Dec 15 03:51:58 localhost recover_tripleo_nova_virtqemud[98652]: 61849 Dec 15 03:51:58 localhost systemd[1]: tripleo_nova_virtqemud_recover.service: Deactivated successfully. Dec 15 03:51:58 localhost systemd[1]: Finished Check and recover tripleo_nova_virtqemud. Dec 15 03:52:01 localhost systemd[1]: Started /usr/bin/podman healthcheck run 165a6fd1f2b629d16da0798ec1c9bac41d302d530a0ffa668b2315a52d6aa04e. Dec 15 03:52:01 localhost systemd[1]: Started /usr/bin/podman healthcheck run 2649c3aec9af2f04509e1d248b1050b202fccc3ed2b86421c89a2a65376db057. Dec 15 03:52:01 localhost systemd[1]: Started /usr/bin/podman healthcheck run 36a88083ed613f6871d2631ae462bca308a45ba583333f7cfe4d0b0c40a18ba5. Dec 15 03:52:01 localhost systemd[1]: Started /usr/bin/podman healthcheck run 97115ff7c5a84ae7e17f79e696e94da3c63f9fe0051287fe9193bec282a03f99. Dec 15 03:52:01 localhost systemd[1]: Started /usr/bin/podman healthcheck run ac45612fab357b615b4684dbfa5bd000dd29eb1adfbaedb6db0802fc49e89549. Dec 15 03:52:01 localhost systemd[1]: Started /usr/bin/podman healthcheck run d6041e5471624a495d5be42684f6049ccf71802a80d5760239330151d465d146. 
Dec 15 03:52:01 localhost podman[98661]: 2025-12-15 08:52:01.768057854 +0000 UTC m=+0.080858962 container health_status 97115ff7c5a84ae7e17f79e696e94da3c63f9fe0051287fe9193bec282a03f99 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, com.redhat.component=openstack-ceilometer-ipmi-container, io.openshift.expose-services=, release=1761123044, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, batch=17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.buildah.version=1.41.4, url=https://www.redhat.com, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-type=git, vendor=Red Hat, Inc., version=17.1.12, build-date=2025-11-19T00:12:45Z, name=rhosp17/openstack-ceilometer-ipmi, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, config_id=tripleo_step4, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, distribution-scope=public, managed_by=tripleo_ansible, tcib_managed=true, maintainer=OpenStack TripleO Team, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'fcee5a4a91f85471fca7b61211375646'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, container_name=ceilometer_agent_ipmi, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, architecture=x86_64) Dec 15 03:52:01 localhost podman[98679]: 2025-12-15 08:52:01.794861449 +0000 UTC m=+0.092464291 container health_status d6041e5471624a495d5be42684f6049ccf71802a80d5760239330151d465d146 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, container_name=ceilometer_agent_compute, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, name=rhosp17/openstack-ceilometer-compute, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, vendor=Red Hat, Inc., tcib_managed=true, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, batch=17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, release=1761123044, io.openshift.expose-services=, architecture=x86_64, io.buildah.version=1.41.4, version=17.1.12, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 
'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'fcee5a4a91f85471fca7b61211375646'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, config_id=tripleo_step4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, build-date=2025-11-19T00:11:48Z, distribution-scope=public, vcs-type=git, com.redhat.component=openstack-ceilometer-compute-container) Dec 15 03:52:01 localhost podman[98661]: 2025-12-15 08:52:01.854329219 +0000 UTC m=+0.167130347 container exec_died 97115ff7c5a84ae7e17f79e696e94da3c63f9fe0051287fe9193bec282a03f99 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, batch=17.1_20251118.1, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, com.redhat.component=openstack-ceilometer-ipmi-container, url=https://www.redhat.com, release=1761123044, container_name=ceilometer_agent_ipmi, 
io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, name=rhosp17/openstack-ceilometer-ipmi, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, konflux.additional-tags=17.1.12 17.1_20251118.1, config_id=tripleo_step4, io.buildah.version=1.41.4, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, version=17.1.12, vcs-type=git, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'fcee5a4a91f85471fca7b61211375646'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, build-date=2025-11-19T00:12:45Z, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi) Dec 15 03:52:01 localhost systemd[1]: 
97115ff7c5a84ae7e17f79e696e94da3c63f9fe0051287fe9193bec282a03f99.service: Deactivated successfully. Dec 15 03:52:01 localhost podman[98679]: 2025-12-15 08:52:01.879496511 +0000 UTC m=+0.177099373 container exec_died d6041e5471624a495d5be42684f6049ccf71802a80d5760239330151d465d146 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, batch=17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.buildah.version=1.41.4, com.redhat.component=openstack-ceilometer-compute-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step4, release=1761123044, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.expose-services=, architecture=x86_64, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-type=git, tcib_managed=true, version=17.1.12, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'fcee5a4a91f85471fca7b61211375646'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, distribution-scope=public, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-ceilometer-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, vendor=Red Hat, Inc., build-date=2025-11-19T00:11:48Z, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, container_name=ceilometer_agent_compute) Dec 15 03:52:01 localhost systemd[1]: d6041e5471624a495d5be42684f6049ccf71802a80d5760239330151d465d146.service: Deactivated successfully. Dec 15 03:52:01 localhost podman[98667]: 2025-12-15 08:52:01.890960377 +0000 UTC m=+0.194880598 container health_status ac45612fab357b615b4684dbfa5bd000dd29eb1adfbaedb6db0802fc49e89549 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, io.openshift.expose-services=, name=rhosp17/openstack-cron, tcib_managed=true, vcs-type=git, config_id=tripleo_step4, container_name=logrotate_crond, batch=17.1_20251118.1, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, url=https://www.redhat.com, konflux.additional-tags=17.1.12 17.1_20251118.1, architecture=x86_64, build-date=2025-11-18T22:49:32Z, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, version=17.1.12, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, com.redhat.component=openstack-cron-container, 
cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, release=1761123044, summary=Red Hat OpenStack Platform 17.1 cron, io.buildah.version=1.41.4, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, description=Red Hat OpenStack Platform 17.1 cron, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI) Dec 15 03:52:01 localhost podman[98654]: 2025-12-15 08:52:01.8210507 +0000 UTC m=+0.139836728 container health_status 2649c3aec9af2f04509e1d248b1050b202fccc3ed2b86421c89a2a65376db057 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, container_name=iscsid, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, konflux.additional-tags=17.1.12 17.1_20251118.1, 
release=1761123044, batch=17.1_20251118.1, tcib_managed=true, build-date=2025-11-18T23:44:13Z, vendor=Red Hat, Inc., org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '182e509007ab5e6e5b2500a552cbd5ba'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, io.openshift.expose-services=, vcs-type=git, config_id=tripleo_step3, io.buildah.version=1.41.4, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, com.redhat.component=openstack-iscsid-container, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, version=17.1.12, name=rhosp17/openstack-iscsid, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, managed_by=tripleo_ansible, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 iscsid, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, io.openshift.tags=rhosp 
osp openstack osp-17.1 openstack-iscsid, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 iscsid) Dec 15 03:52:01 localhost podman[98667]: 2025-12-15 08:52:01.92735403 +0000 UTC m=+0.231274201 container exec_died ac45612fab357b615b4684dbfa5bd000dd29eb1adfbaedb6db0802fc49e89549 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, summary=Red Hat OpenStack Platform 17.1 cron, url=https://www.redhat.com, build-date=2025-11-18T22:49:32Z, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.component=openstack-cron-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, release=1761123044, name=rhosp17/openstack-cron, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, config_id=tripleo_step4, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, description=Red Hat OpenStack Platform 17.1 cron, batch=17.1_20251118.1, 
io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.expose-services=, container_name=logrotate_crond, maintainer=OpenStack TripleO Team, version=17.1.12, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, managed_by=tripleo_ansible, distribution-scope=public, architecture=x86_64, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, vendor=Red Hat, Inc., tcib_managed=true) Dec 15 03:52:01 localhost podman[98655]: 2025-12-15 08:52:01.933565726 +0000 UTC m=+0.248629714 container health_status 36a88083ed613f6871d2631ae462bca308a45ba583333f7cfe4d0b0c40a18ba5 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, managed_by=tripleo_ansible, com.redhat.component=openstack-nova-compute-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vendor=Red Hat, Inc., konflux.additional-tags=17.1.12 17.1_20251118.1, url=https://www.redhat.com, name=rhosp17/openstack-nova-compute, vcs-type=git, io.openshift.expose-services=, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, container_name=nova_compute, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-11-19T00:36:58Z, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.12, release=1761123044, 
config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '182e509007ab5e6e5b2500a552cbd5ba-879500e96bf8dfb93687004bd86f2317'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, batch=17.1_20251118.1, tcib_managed=true, config_id=tripleo_step5, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, description=Red Hat OpenStack Platform 17.1 nova-compute) Dec 15 03:52:01 localhost systemd[1]: 
ac45612fab357b615b4684dbfa5bd000dd29eb1adfbaedb6db0802fc49e89549.service: Deactivated successfully. Dec 15 03:52:01 localhost podman[98653]: 2025-12-15 08:52:01.975070365 +0000 UTC m=+0.299211796 container health_status 165a6fd1f2b629d16da0798ec1c9bac41d302d530a0ffa668b2315a52d6aa04e (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, config_id=tripleo_step3, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, release=1761123044, batch=17.1_20251118.1, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, build-date=2025-11-18T22:51:28Z, name=rhosp17/openstack-collectd, io.buildah.version=1.41.4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 collectd, tcib_managed=true, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, distribution-scope=public, version=17.1.12, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.component=openstack-collectd-container, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, container_name=collectd, managed_by=tripleo_ansible, vcs-type=git, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 collectd, io.openshift.expose-services=, url=https://www.redhat.com) Dec 15 03:52:01 localhost podman[98655]: 2025-12-15 08:52:01.991293318 +0000 UTC m=+0.306357276 container exec_died 36a88083ed613f6871d2631ae462bca308a45ba583333f7cfe4d0b0c40a18ba5 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp17/openstack-nova-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.12, config_id=tripleo_step5, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '182e509007ab5e6e5b2500a552cbd5ba-879500e96bf8dfb93687004bd86f2317'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 
'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, tcib_managed=true, vendor=Red Hat, Inc., build-date=2025-11-19T00:36:58Z, release=1761123044, io.buildah.version=1.41.4, url=https://www.redhat.com, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, container_name=nova_compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, com.redhat.component=openstack-nova-compute-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, 
managed_by=tripleo_ansible, vcs-type=git, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d) Dec 15 03:52:02 localhost systemd[1]: 36a88083ed613f6871d2631ae462bca308a45ba583333f7cfe4d0b0c40a18ba5.service: Deactivated successfully. Dec 15 03:52:02 localhost podman[98654]: 2025-12-15 08:52:02.0059648 +0000 UTC m=+0.324750868 container exec_died 2649c3aec9af2f04509e1d248b1050b202fccc3ed2b86421c89a2a65376db057 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, com.redhat.component=openstack-iscsid-container, description=Red Hat OpenStack Platform 17.1 iscsid, io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, name=rhosp17/openstack-iscsid, config_id=tripleo_step3, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '182e509007ab5e6e5b2500a552cbd5ba'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, distribution-scope=public, release=1761123044, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, url=https://www.redhat.com, vendor=Red Hat, Inc., vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 iscsid, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-type=git, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, container_name=iscsid, managed_by=tripleo_ansible, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, batch=17.1_20251118.1, version=17.1.12, tcib_managed=true, konflux.additional-tags=17.1.12 17.1_20251118.1, build-date=2025-11-18T23:44:13Z, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream) Dec 15 03:52:02 localhost systemd[1]: 2649c3aec9af2f04509e1d248b1050b202fccc3ed2b86421c89a2a65376db057.service: Deactivated successfully. 
Dec 15 03:52:02 localhost podman[98653]: 2025-12-15 08:52:02.060966679 +0000 UTC m=+0.385108150 container exec_died 165a6fd1f2b629d16da0798ec1c9bac41d302d530a0ffa668b2315a52d6aa04e (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, distribution-scope=public, name=rhosp17/openstack-collectd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, tcib_managed=true, url=https://www.redhat.com, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_id=tripleo_step3, io.openshift.expose-services=, release=1761123044, managed_by=tripleo_ansible, summary=Red Hat OpenStack 
Platform 17.1 collectd, vendor=Red Hat, Inc., architecture=x86_64, com.redhat.component=openstack-collectd-container, description=Red Hat OpenStack Platform 17.1 collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.buildah.version=1.41.4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, vcs-type=git, batch=17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, version=17.1.12, container_name=collectd, maintainer=OpenStack TripleO Team, build-date=2025-11-18T22:51:28Z) Dec 15 03:52:02 localhost systemd[1]: 165a6fd1f2b629d16da0798ec1c9bac41d302d530a0ffa668b2315a52d6aa04e.service: Deactivated successfully. Dec 15 03:52:05 localhost systemd[1]: Started /usr/bin/podman healthcheck run 4c3f871f4bdf287b1d3ce717956c1cf130617489217410eadf6fffcc440eb3b0. 
Dec 15 03:52:05 localhost podman[98865]: 2025-12-15 08:52:05.748299217 +0000 UTC m=+0.074755519 container health_status 4c3f871f4bdf287b1d3ce717956c1cf130617489217410eadf6fffcc440eb3b0 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.component=openstack-nova-compute-container, version=17.1.12, url=https://www.redhat.com, vcs-type=git, config_id=tripleo_step4, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '879500e96bf8dfb93687004bd86f2317'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20251118.1, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, build-date=2025-11-19T00:36:58Z, summary=Red 
Hat OpenStack Platform 17.1 nova-compute, distribution-scope=public, name=rhosp17/openstack-nova-compute, io.buildah.version=1.41.4, konflux.additional-tags=17.1.12 17.1_20251118.1, architecture=x86_64, container_name=nova_migration_target, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, release=1761123044, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute) Dec 15 03:52:06 localhost podman[98865]: 2025-12-15 08:52:06.124373544 +0000 UTC m=+0.450829766 container exec_died 4c3f871f4bdf287b1d3ce717956c1cf130617489217410eadf6fffcc440eb3b0 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, maintainer=OpenStack TripleO Team, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '879500e96bf8dfb93687004bd86f2317'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, konflux.additional-tags=17.1.12 17.1_20251118.1, vendor=Red Hat, Inc., config_id=tripleo_step4, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.buildah.version=1.41.4, build-date=2025-11-19T00:36:58Z, url=https://www.redhat.com, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.component=openstack-nova-compute-container, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, version=17.1.12, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, release=1761123044, batch=17.1_20251118.1, io.openshift.expose-services=, distribution-scope=public, name=rhosp17/openstack-nova-compute, container_name=nova_migration_target) Dec 15 03:52:06 localhost systemd[1]: 4c3f871f4bdf287b1d3ce717956c1cf130617489217410eadf6fffcc440eb3b0.service: Deactivated successfully. Dec 15 03:52:08 localhost systemd[1]: Started /usr/bin/podman healthcheck run 2df8d20c6ba32f38bd0e6519c9c77a4146b632f21c81197d8c319b223aad81f1. Dec 15 03:52:08 localhost systemd[1]: Started /usr/bin/podman healthcheck run 4d35b7dc3f3a4e3c60661e85d3511f50804e87244d74cab67af5c6e8c447d379. 
Dec 15 03:52:08 localhost podman[98888]: 2025-12-15 08:52:08.747375104 +0000 UTC m=+0.082102234 container health_status 2df8d20c6ba32f38bd0e6519c9c77a4146b632f21c81197d8c319b223aad81f1 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=unhealthy, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-type=git, managed_by=tripleo_ansible, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, container_name=ovn_controller, io.buildah.version=1.41.4, config_id=tripleo_step4, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, summary=Red Hat OpenStack Platform 17.1 ovn-controller, tcib_managed=true, distribution-scope=public, build-date=2025-11-18T23:34:05Z, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.component=openstack-ovn-controller-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, release=1761123044, maintainer=OpenStack 
TripleO Team, description=Red Hat OpenStack Platform 17.1 ovn-controller, version=17.1.12, batch=17.1_20251118.1, name=rhosp17/openstack-ovn-controller, vendor=Red Hat, Inc., architecture=x86_64) Dec 15 03:52:08 localhost systemd[1]: tmp-crun.SZOhFz.mount: Deactivated successfully. Dec 15 03:52:08 localhost podman[98888]: 2025-12-15 08:52:08.799420835 +0000 UTC m=+0.134147925 container exec_died 2df8d20c6ba32f38bd0e6519c9c77a4146b632f21c81197d8c319b223aad81f1 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, tcib_managed=true, io.buildah.version=1.41.4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_id=tripleo_step4, vcs-type=git, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 ovn-controller, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, release=1761123044, build-date=2025-11-18T23:34:05Z, url=https://www.redhat.com, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.expose-services=, distribution-scope=public, batch=17.1_20251118.1, container_name=ovn_controller, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, name=rhosp17/openstack-ovn-controller, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, com.redhat.component=openstack-ovn-controller-container, summary=Red Hat OpenStack Platform 17.1 ovn-controller, managed_by=tripleo_ansible, version=17.1.12, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 
'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, konflux.additional-tags=17.1.12 17.1_20251118.1) Dec 15 03:52:08 localhost podman[98888]: unhealthy Dec 15 03:52:08 localhost systemd[1]: 2df8d20c6ba32f38bd0e6519c9c77a4146b632f21c81197d8c319b223aad81f1.service: Main process exited, code=exited, status=1/FAILURE Dec 15 03:52:08 localhost systemd[1]: 2df8d20c6ba32f38bd0e6519c9c77a4146b632f21c81197d8c319b223aad81f1.service: Failed with result 'exit-code'. Dec 15 03:52:08 localhost podman[98889]: 2025-12-15 08:52:08.802746144 +0000 UTC m=+0.136113388 container health_status 4d35b7dc3f3a4e3c60661e85d3511f50804e87244d74cab67af5c6e8c447d379 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=unhealthy, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, name=rhosp17/openstack-neutron-metadata-agent-ovn, batch=17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, build-date=2025-11-19T00:14:25Z, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, container_name=ovn_metadata_agent, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, konflux.additional-tags=17.1.12 17.1_20251118.1, vendor=Red Hat, Inc., config_data={'cgroupns': 
'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'a56a6f14b467cd9064e40c03defa5ed7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, managed_by=tripleo_ansible, io.openshift.expose-services=, io.buildah.version=1.41.4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, version=17.1.12, maintainer=OpenStack TripleO Team, config_id=tripleo_step4, release=1761123044, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, url=https://www.redhat.com, vcs-type=git, tcib_managed=true) Dec 15 03:52:08 localhost podman[98889]: 2025-12-15 08:52:08.88267223 +0000 UTC 
m=+0.216039444 container exec_died 4d35b7dc3f3a4e3c60661e85d3511f50804e87244d74cab67af5c6e8c447d379 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, tcib_managed=true, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, io.buildah.version=1.41.4, architecture=x86_64, release=1761123044, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, version=17.1.12, vendor=Red Hat, Inc., build-date=2025-11-19T00:14:25Z, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'a56a6f14b467cd9064e40c03defa5ed7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', 
'/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, container_name=ovn_metadata_agent, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp17/openstack-neutron-metadata-agent-ovn, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, batch=17.1_20251118.1, managed_by=tripleo_ansible, distribution-scope=public, config_id=tripleo_step4, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, url=https://www.redhat.com, konflux.additional-tags=17.1.12 17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-type=git) Dec 15 03:52:08 localhost podman[98889]: unhealthy Dec 15 03:52:08 localhost systemd[1]: 4d35b7dc3f3a4e3c60661e85d3511f50804e87244d74cab67af5c6e8c447d379.service: Main process exited, code=exited, status=1/FAILURE Dec 15 03:52:08 localhost systemd[1]: 4d35b7dc3f3a4e3c60661e85d3511f50804e87244d74cab67af5c6e8c447d379.service: Failed with result 'exit-code'. Dec 15 03:52:17 localhost systemd[1]: Started /usr/bin/podman healthcheck run 6278d648aec9c9f12e0efaafa0d6ae5653eeb34459516699e562eeb9c8d23b3c. 
Dec 15 03:52:17 localhost podman[98924]: 2025-12-15 08:52:17.751901663 +0000 UTC m=+0.087022357 container health_status 6278d648aec9c9f12e0efaafa0d6ae5653eeb34459516699e562eeb9c8d23b3c (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, architecture=x86_64, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public, name=rhosp17/openstack-qdrouterd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, version=17.1.12, io.buildah.version=1.41.4, summary=Red Hat OpenStack Platform 17.1 qdrouterd, description=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.component=openstack-qdrouterd-container, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '29d07da103bbd44a9ed3e29999314b03'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, vcs-type=git, release=1761123044, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, url=https://www.redhat.com, vendor=Red Hat, Inc., 
container_name=metrics_qdr, konflux.additional-tags=17.1.12 17.1_20251118.1, build-date=2025-11-18T22:49:46Z, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, tcib_managed=true, managed_by=tripleo_ansible, config_id=tripleo_step1, io.openshift.expose-services=, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, batch=17.1_20251118.1) Dec 15 03:52:17 localhost podman[98924]: 2025-12-15 08:52:17.969926128 +0000 UTC m=+0.305046782 container exec_died 6278d648aec9c9f12e0efaafa0d6ae5653eeb34459516699e562eeb9c8d23b3c (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, version=17.1.12, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, release=1761123044, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '29d07da103bbd44a9ed3e29999314b03'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', 
'/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, name=rhosp17/openstack-qdrouterd, vendor=Red Hat, Inc., architecture=x86_64, container_name=metrics_qdr, description=Red Hat OpenStack Platform 17.1 qdrouterd, batch=17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_id=tripleo_step1, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.buildah.version=1.41.4, vcs-type=git, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.component=openstack-qdrouterd-container, io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1, build-date=2025-11-18T22:49:46Z, url=https://www.redhat.com, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, tcib_managed=true, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 qdrouterd) Dec 15 03:52:17 localhost systemd[1]: 6278d648aec9c9f12e0efaafa0d6ae5653eeb34459516699e562eeb9c8d23b3c.service: Deactivated successfully. Dec 15 03:52:32 localhost systemd[1]: Started /usr/bin/podman healthcheck run 165a6fd1f2b629d16da0798ec1c9bac41d302d530a0ffa668b2315a52d6aa04e. Dec 15 03:52:32 localhost systemd[1]: Started /usr/bin/podman healthcheck run 2649c3aec9af2f04509e1d248b1050b202fccc3ed2b86421c89a2a65376db057. Dec 15 03:52:32 localhost systemd[1]: Started /usr/bin/podman healthcheck run 36a88083ed613f6871d2631ae462bca308a45ba583333f7cfe4d0b0c40a18ba5. Dec 15 03:52:32 localhost systemd[1]: Started /usr/bin/podman healthcheck run 97115ff7c5a84ae7e17f79e696e94da3c63f9fe0051287fe9193bec282a03f99. 
Dec 15 03:52:32 localhost systemd[1]: Started /usr/bin/podman healthcheck run ac45612fab357b615b4684dbfa5bd000dd29eb1adfbaedb6db0802fc49e89549. Dec 15 03:52:32 localhost systemd[1]: Started /usr/bin/podman healthcheck run d6041e5471624a495d5be42684f6049ccf71802a80d5760239330151d465d146. Dec 15 03:52:32 localhost podman[98956]: 2025-12-15 08:52:32.769262669 +0000 UTC m=+0.087624972 container health_status 97115ff7c5a84ae7e17f79e696e94da3c63f9fe0051287fe9193bec282a03f99 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, name=rhosp17/openstack-ceilometer-ipmi, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vendor=Red Hat, Inc., container_name=ceilometer_agent_ipmi, maintainer=OpenStack TripleO Team, config_id=tripleo_step4, release=1761123044, vcs-type=git, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, architecture=x86_64, com.redhat.component=openstack-ceilometer-ipmi-container, io.openshift.expose-services=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'fcee5a4a91f85471fca7b61211375646'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, distribution-scope=public, io.buildah.version=1.41.4, url=https://www.redhat.com, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.12 17.1_20251118.1, build-date=2025-11-19T00:12:45Z, tcib_managed=true, managed_by=tripleo_ansible, version=17.1.12) Dec 15 03:52:32 localhost podman[98954]: 2025-12-15 08:52:32.822771618 +0000 UTC m=+0.145886688 container health_status 2649c3aec9af2f04509e1d248b1050b202fccc3ed2b86421c89a2a65376db057 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, io.buildah.version=1.41.4, com.redhat.component=openstack-iscsid-container, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-18T23:44:13Z, config_id=tripleo_step3, konflux.additional-tags=17.1.12 17.1_20251118.1, vendor=Red Hat, Inc., batch=17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, description=Red Hat OpenStack Platform 17.1 iscsid, vcs-type=git, architecture=x86_64, release=1761123044, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, maintainer=OpenStack TripleO Team, 
io.openshift.expose-services=, tcib_managed=true, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, version=17.1.12, summary=Red Hat OpenStack Platform 17.1 iscsid, url=https://www.redhat.com, name=rhosp17/openstack-iscsid, container_name=iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '182e509007ab5e6e5b2500a552cbd5ba'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream) Dec 15 03:52:32 localhost podman[98954]: 2025-12-15 08:52:32.830933567 +0000 UTC m=+0.154048657 container exec_died 2649c3aec9af2f04509e1d248b1050b202fccc3ed2b86421c89a2a65376db057 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, description=Red Hat OpenStack Platform 17.1 iscsid, config_id=tripleo_step3, container_name=iscsid, vendor=Red Hat, Inc., managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, url=https://www.redhat.com, 
io.openshift.expose-services=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '182e509007ab5e6e5b2500a552cbd5ba'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, com.redhat.component=openstack-iscsid-container, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, batch=17.1_20251118.1, release=1761123044, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 iscsid, konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, name=rhosp17/openstack-iscsid, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, vcs-type=git, build-date=2025-11-18T23:44:13Z, io.k8s.description=Red Hat OpenStack Platform 17.1 
iscsid, io.buildah.version=1.41.4, tcib_managed=true, version=17.1.12) Dec 15 03:52:32 localhost systemd[1]: 2649c3aec9af2f04509e1d248b1050b202fccc3ed2b86421c89a2a65376db057.service: Deactivated successfully. Dec 15 03:52:32 localhost podman[98962]: 2025-12-15 08:52:32.890448526 +0000 UTC m=+0.204362540 container health_status ac45612fab357b615b4684dbfa5bd000dd29eb1adfbaedb6db0802fc49e89549 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, batch=17.1_20251118.1, vcs-type=git, url=https://www.redhat.com, release=1761123044, description=Red Hat OpenStack Platform 17.1 cron, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, version=17.1.12, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 cron, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, config_id=tripleo_step4, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.buildah.version=1.41.4, konflux.additional-tags=17.1.12 17.1_20251118.1, build-date=2025-11-18T22:49:32Z, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, container_name=logrotate_crond, tcib_managed=true, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, name=rhosp17/openstack-cron, vendor=Red Hat, Inc., com.redhat.component=openstack-cron-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, architecture=x86_64) Dec 15 03:52:32 localhost podman[98956]: 2025-12-15 08:52:32.896642963 +0000 UTC m=+0.215005326 container exec_died 97115ff7c5a84ae7e17f79e696e94da3c63f9fe0051287fe9193bec282a03f99 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, io.buildah.version=1.41.4, com.redhat.component=openstack-ceilometer-ipmi-container, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public, tcib_managed=true, batch=17.1_20251118.1, konflux.additional-tags=17.1.12 17.1_20251118.1, container_name=ceilometer_agent_ipmi, vendor=Red Hat, Inc., io.openshift.expose-services=, config_id=tripleo_step4, build-date=2025-11-19T00:12:45Z, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, maintainer=OpenStack TripleO Team, release=1761123044, version=17.1.12, architecture=x86_64, name=rhosp17/openstack-ceilometer-ipmi, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'fcee5a4a91f85471fca7b61211375646'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}) Dec 15 03:52:32 localhost systemd[1]: 97115ff7c5a84ae7e17f79e696e94da3c63f9fe0051287fe9193bec282a03f99.service: Deactivated successfully. 
Dec 15 03:52:32 localhost podman[98953]: 2025-12-15 08:52:32.866040905 +0000 UTC m=+0.191979411 container health_status 165a6fd1f2b629d16da0798ec1c9bac41d302d530a0ffa668b2315a52d6aa04e (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, version=17.1.12, io.buildah.version=1.41.4, build-date=2025-11-18T22:51:28Z, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_id=tripleo_step3, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, io.openshift.expose-services=, architecture=x86_64, container_name=collectd, tcib_managed=true, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', 
'/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, release=1761123044, managed_by=tripleo_ansible, name=rhosp17/openstack-collectd, konflux.additional-tags=17.1.12 17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 collectd, vendor=Red Hat, Inc., com.redhat.component=openstack-collectd-container, batch=17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 collectd, vcs-type=git) Dec 15 03:52:32 localhost podman[98962]: 2025-12-15 08:52:32.924299882 +0000 UTC m=+0.238213866 container exec_died ac45612fab357b615b4684dbfa5bd000dd29eb1adfbaedb6db0802fc49e89549 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, vcs-type=git, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 cron, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.component=openstack-cron-container, config_id=tripleo_step4, summary=Red Hat OpenStack Platform 17.1 cron, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, tcib_managed=true, version=17.1.12, release=1761123044, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', 
'/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, io.buildah.version=1.41.4, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.12 17.1_20251118.1, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, distribution-scope=public, container_name=logrotate_crond, build-date=2025-11-18T22:49:32Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, name=rhosp17/openstack-cron, url=https://www.redhat.com, batch=17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 cron) Dec 15 03:52:32 localhost systemd[1]: ac45612fab357b615b4684dbfa5bd000dd29eb1adfbaedb6db0802fc49e89549.service: Deactivated successfully. 
Dec 15 03:52:32 localhost podman[98953]: 2025-12-15 08:52:32.944977814 +0000 UTC m=+0.270942321 container exec_died 165a6fd1f2b629d16da0798ec1c9bac41d302d530a0ffa668b2315a52d6aa04e (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, summary=Red Hat OpenStack Platform 17.1 collectd, io.buildah.version=1.41.4, io.openshift.expose-services=, managed_by=tripleo_ansible, batch=17.1_20251118.1, konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team, release=1761123044, description=Red Hat OpenStack Platform 17.1 collectd, name=rhosp17/openstack-collectd, container_name=collectd, distribution-scope=public, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, build-date=2025-11-18T22:51:28Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.12, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', 
'/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, vcs-type=git, tcib_managed=true, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.component=openstack-collectd-container, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, architecture=x86_64, config_id=tripleo_step3, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a) Dec 15 03:52:32 localhost systemd[1]: 165a6fd1f2b629d16da0798ec1c9bac41d302d530a0ffa668b2315a52d6aa04e.service: Deactivated successfully. Dec 15 03:52:33 localhost podman[98969]: 2025-12-15 08:52:33.032201974 +0000 UTC m=+0.343566910 container health_status d6041e5471624a495d5be42684f6049ccf71802a80d5760239330151d465d146 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, io.openshift.expose-services=, com.redhat.component=openstack-ceilometer-compute-container, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, config_id=tripleo_step4, architecture=x86_64, container_name=ceilometer_agent_compute, release=1761123044, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, version=17.1.12, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp17/openstack-ceilometer-compute, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.k8s.description=Red 
Hat OpenStack Platform 17.1 ceilometer-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'fcee5a4a91f85471fca7b61211375646'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, distribution-scope=public, url=https://www.redhat.com, batch=17.1_20251118.1, tcib_managed=true, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, io.buildah.version=1.41.4, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, build-date=2025-11-19T00:11:48Z) Dec 15 03:52:33 localhost podman[98955]: 2025-12-15 08:52:33.080465083 +0000 UTC m=+0.398380634 container health_status 36a88083ed613f6871d2631ae462bca308a45ba583333f7cfe4d0b0c40a18ba5 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, managed_by=tripleo_ansible, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.41.4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., version=17.1.12, config_id=tripleo_step5, maintainer=OpenStack TripleO Team, batch=17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 nova-compute, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, io.openshift.expose-services=, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, release=1761123044, name=rhosp17/openstack-nova-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.component=openstack-nova-compute-container, container_name=nova_compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '182e509007ab5e6e5b2500a552cbd5ba-879500e96bf8dfb93687004bd86f2317'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', 
'/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, build-date=2025-11-19T00:36:58Z, vcs-type=git, description=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, url=https://www.redhat.com, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, tcib_managed=true) Dec 15 03:52:33 localhost podman[98955]: 2025-12-15 08:52:33.103637143 +0000 UTC m=+0.421552724 container exec_died 36a88083ed613f6871d2631ae462bca308a45ba583333f7cfe4d0b0c40a18ba5 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, version=17.1.12, config_id=tripleo_step5, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, name=rhosp17/openstack-nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 nova-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '182e509007ab5e6e5b2500a552cbd5ba-879500e96bf8dfb93687004bd86f2317'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 
'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, build-date=2025-11-19T00:36:58Z, io.buildah.version=1.41.4, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, vcs-type=git, com.redhat.component=openstack-nova-compute-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, batch=17.1_20251118.1, url=https://www.redhat.com, architecture=x86_64, tcib_managed=true, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1761123044, container_name=nova_compute, vendor=Red Hat, Inc., 
org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, maintainer=OpenStack TripleO Team) Dec 15 03:52:33 localhost systemd[1]: 36a88083ed613f6871d2631ae462bca308a45ba583333f7cfe4d0b0c40a18ba5.service: Deactivated successfully. Dec 15 03:52:33 localhost podman[98969]: 2025-12-15 08:52:33.15931121 +0000 UTC m=+0.470676176 container exec_died d6041e5471624a495d5be42684f6049ccf71802a80d5760239330151d465d146 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, name=rhosp17/openstack-ceilometer-compute, io.openshift.expose-services=, batch=17.1_20251118.1, io.buildah.version=1.41.4, version=17.1.12, build-date=2025-11-19T00:11:48Z, url=https://www.redhat.com, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'fcee5a4a91f85471fca7b61211375646'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, distribution-scope=public, 
container_name=ceilometer_agent_compute, vendor=Red Hat, Inc., vcs-type=git, com.redhat.component=openstack-ceilometer-compute-container, config_id=tripleo_step4, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, architecture=x86_64, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, release=1761123044, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, tcib_managed=true, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, description=Red Hat OpenStack Platform 17.1 ceilometer-compute) Dec 15 03:52:33 localhost systemd[1]: d6041e5471624a495d5be42684f6049ccf71802a80d5760239330151d465d146.service: Deactivated successfully. Dec 15 03:52:36 localhost systemd[1]: Started /usr/bin/podman healthcheck run 4c3f871f4bdf287b1d3ce717956c1cf130617489217410eadf6fffcc440eb3b0. Dec 15 03:52:36 localhost systemd[1]: tmp-crun.uSulT6.mount: Deactivated successfully. 
Dec 15 03:52:36 localhost podman[99090]: 2025-12-15 08:52:36.748656439 +0000 UTC m=+0.079979338 container health_status 4c3f871f4bdf287b1d3ce717956c1cf130617489217410eadf6fffcc440eb3b0 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '879500e96bf8dfb93687004bd86f2317'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, build-date=2025-11-19T00:36:58Z, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, release=1761123044, managed_by=tripleo_ansible, batch=17.1_20251118.1, distribution-scope=public, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, config_id=tripleo_step4, container_name=nova_migration_target, tcib_managed=true, 
io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, name=rhosp17/openstack-nova-compute, vendor=Red Hat, Inc., architecture=x86_64, com.redhat.component=openstack-nova-compute-container, version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, summary=Red Hat OpenStack Platform 17.1 nova-compute) Dec 15 03:52:37 localhost podman[99090]: 2025-12-15 08:52:37.109318815 +0000 UTC m=+0.440641694 container exec_died 4c3f871f4bdf287b1d3ce717956c1cf130617489217410eadf6fffcc440eb3b0 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, summary=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, container_name=nova_migration_target, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '879500e96bf8dfb93687004bd86f2317'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, io.openshift.expose-services=, config_id=tripleo_step4, name=rhosp17/openstack-nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, url=https://www.redhat.com, vendor=Red Hat, Inc., build-date=2025-11-19T00:36:58Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, architecture=x86_64, batch=17.1_20251118.1, version=17.1.12, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, tcib_managed=true, io.buildah.version=1.41.4, com.redhat.component=openstack-nova-compute-container, vcs-type=git, release=1761123044) Dec 15 03:52:37 localhost systemd[1]: 4c3f871f4bdf287b1d3ce717956c1cf130617489217410eadf6fffcc440eb3b0.service: Deactivated successfully. Dec 15 03:52:39 localhost systemd[1]: Started /usr/bin/podman healthcheck run 2df8d20c6ba32f38bd0e6519c9c77a4146b632f21c81197d8c319b223aad81f1. Dec 15 03:52:39 localhost systemd[1]: Started /usr/bin/podman healthcheck run 4d35b7dc3f3a4e3c60661e85d3511f50804e87244d74cab67af5c6e8c447d379. 
Dec 15 03:52:39 localhost podman[99114]: 2025-12-15 08:52:39.750248734 +0000 UTC m=+0.082550746 container health_status 2df8d20c6ba32f38bd0e6519c9c77a4146b632f21c81197d8c319b223aad81f1 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=unhealthy, batch=17.1_20251118.1, build-date=2025-11-18T23:34:05Z, vcs-type=git, distribution-scope=public, name=rhosp17/openstack-ovn-controller, config_id=tripleo_step4, com.redhat.component=openstack-ovn-controller-container, summary=Red Hat OpenStack Platform 17.1 ovn-controller, description=Red Hat OpenStack Platform 17.1 ovn-controller, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, release=1761123044, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=ovn_controller, io.buildah.version=1.41.4, managed_by=tripleo_ansible, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, version=17.1.12, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, konflux.additional-tags=17.1.12 17.1_20251118.1, tcib_managed=true, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.expose-services=, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272) Dec 15 03:52:39 localhost podman[99114]: 2025-12-15 08:52:39.763796466 +0000 UTC m=+0.096098448 container exec_died 2df8d20c6ba32f38bd0e6519c9c77a4146b632f21c81197d8c319b223aad81f1 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, io.buildah.version=1.41.4, batch=17.1_20251118.1, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, distribution-scope=public, url=https://www.redhat.com, vcs-type=git, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, konflux.additional-tags=17.1.12 17.1_20251118.1, tcib_managed=true, name=rhosp17/openstack-ovn-controller, summary=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, config_id=tripleo_step4, com.redhat.component=openstack-ovn-controller-container, architecture=x86_64, container_name=ovn_controller, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, description=Red Hat OpenStack Platform 17.1 ovn-controller, release=1761123044, vendor=Red Hat, Inc., config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 
'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, build-date=2025-11-18T23:34:05Z, maintainer=OpenStack TripleO Team, version=17.1.12) Dec 15 03:52:39 localhost podman[99114]: unhealthy Dec 15 03:52:39 localhost systemd[1]: 2df8d20c6ba32f38bd0e6519c9c77a4146b632f21c81197d8c319b223aad81f1.service: Main process exited, code=exited, status=1/FAILURE Dec 15 03:52:39 localhost systemd[1]: 2df8d20c6ba32f38bd0e6519c9c77a4146b632f21c81197d8c319b223aad81f1.service: Failed with result 'exit-code'. Dec 15 03:52:39 localhost systemd[1]: tmp-crun.OYH4mV.mount: Deactivated successfully. Dec 15 03:52:39 localhost podman[99115]: 2025-12-15 08:52:39.866167542 +0000 UTC m=+0.195541686 container health_status 4d35b7dc3f3a4e3c60661e85d3511f50804e87244d74cab67af5c6e8c447d379 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=unhealthy, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, tcib_managed=true, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1, build-date=2025-11-19T00:14:25Z, vendor=Red Hat, Inc., version=17.1.12, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'a56a6f14b467cd9064e40c03defa5ed7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 
'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, io.openshift.expose-services=, distribution-scope=public, container_name=ovn_metadata_agent, config_id=tripleo_step4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, name=rhosp17/openstack-neutron-metadata-agent-ovn, url=https://www.redhat.com, vcs-type=git, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, release=1761123044, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, konflux.additional-tags=17.1.12 17.1_20251118.1, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn) Dec 15 03:52:39 localhost podman[99115]: 2025-12-15 08:52:39.908375949 +0000 UTC m=+0.237750133 container 
exec_died 4d35b7dc3f3a4e3c60661e85d3511f50804e87244d74cab67af5c6e8c447d379 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, version=17.1.12, batch=17.1_20251118.1, io.openshift.expose-services=, managed_by=tripleo_ansible, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'a56a6f14b467cd9064e40c03defa5ed7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, config_id=tripleo_step4, url=https://www.redhat.com, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, container_name=ovn_metadata_agent, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, release=1761123044, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.12 17.1_20251118.1, name=rhosp17/openstack-neutron-metadata-agent-ovn, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vendor=Red Hat, Inc., build-date=2025-11-19T00:14:25Z, architecture=x86_64) Dec 15 03:52:39 localhost podman[99115]: unhealthy Dec 15 03:52:39 localhost systemd[1]: 4d35b7dc3f3a4e3c60661e85d3511f50804e87244d74cab67af5c6e8c447d379.service: Main process exited, code=exited, status=1/FAILURE Dec 15 03:52:39 localhost systemd[1]: 4d35b7dc3f3a4e3c60661e85d3511f50804e87244d74cab67af5c6e8c447d379.service: Failed with result 'exit-code'. Dec 15 03:52:48 localhost systemd[1]: Started /usr/bin/podman healthcheck run 6278d648aec9c9f12e0efaafa0d6ae5653eeb34459516699e562eeb9c8d23b3c. 
Dec 15 03:52:48 localhost podman[99155]: 2025-12-15 08:52:48.757745424 +0000 UTC m=+0.088664081 container health_status 6278d648aec9c9f12e0efaafa0d6ae5653eeb34459516699e562eeb9c8d23b3c (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, url=https://www.redhat.com, com.redhat.component=openstack-qdrouterd-container, description=Red Hat OpenStack Platform 17.1 qdrouterd, summary=Red Hat OpenStack Platform 17.1 qdrouterd, container_name=metrics_qdr, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, batch=17.1_20251118.1, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, build-date=2025-11-18T22:49:46Z, distribution-scope=public, io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-type=git, config_id=tripleo_step1, io.openshift.expose-services=, managed_by=tripleo_ansible, architecture=x86_64, name=rhosp17/openstack-qdrouterd, release=1761123044, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, version=17.1.12, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '29d07da103bbd44a9ed3e29999314b03'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 17.1_20251118.1) Dec 15 03:52:49 localhost podman[99155]: 2025-12-15 08:52:48.987446811 +0000 UTC m=+0.318365498 container exec_died 6278d648aec9c9f12e0efaafa0d6ae5653eeb34459516699e562eeb9c8d23b3c (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, build-date=2025-11-18T22:49:46Z, com.redhat.component=openstack-qdrouterd-container, config_id=tripleo_step1, managed_by=tripleo_ansible, batch=17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp17/openstack-qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '29d07da103bbd44a9ed3e29999314b03'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.buildah.version=1.41.4, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, description=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, container_name=metrics_qdr, maintainer=OpenStack TripleO Team, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, summary=Red Hat OpenStack Platform 17.1 qdrouterd, url=https://www.redhat.com, vendor=Red Hat, Inc., architecture=x86_64, distribution-scope=public, release=1761123044, version=17.1.12, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd) Dec 15 03:52:49 localhost systemd[1]: 6278d648aec9c9f12e0efaafa0d6ae5653eeb34459516699e562eeb9c8d23b3c.service: Deactivated successfully. Dec 15 03:53:03 localhost systemd[1]: Started /usr/bin/podman healthcheck run 165a6fd1f2b629d16da0798ec1c9bac41d302d530a0ffa668b2315a52d6aa04e. Dec 15 03:53:03 localhost systemd[1]: Started /usr/bin/podman healthcheck run 2649c3aec9af2f04509e1d248b1050b202fccc3ed2b86421c89a2a65376db057. Dec 15 03:53:03 localhost systemd[1]: Started /usr/bin/podman healthcheck run 36a88083ed613f6871d2631ae462bca308a45ba583333f7cfe4d0b0c40a18ba5. Dec 15 03:53:03 localhost systemd[1]: Started /usr/bin/podman healthcheck run 97115ff7c5a84ae7e17f79e696e94da3c63f9fe0051287fe9193bec282a03f99. 
Dec 15 03:53:03 localhost systemd[1]: Started /usr/bin/podman healthcheck run ac45612fab357b615b4684dbfa5bd000dd29eb1adfbaedb6db0802fc49e89549. Dec 15 03:53:03 localhost systemd[1]: Started /usr/bin/podman healthcheck run d6041e5471624a495d5be42684f6049ccf71802a80d5760239330151d465d146. Dec 15 03:53:03 localhost systemd[1]: tmp-crun.UrhhDS.mount: Deactivated successfully. Dec 15 03:53:03 localhost podman[99184]: 2025-12-15 08:53:03.782708263 +0000 UTC m=+0.110039231 container health_status 165a6fd1f2b629d16da0798ec1c9bac41d302d530a0ffa668b2315a52d6aa04e (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 collectd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', 
'/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, config_id=tripleo_step3, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, tcib_managed=true, name=rhosp17/openstack-collectd, com.redhat.component=openstack-collectd-container, version=17.1.12, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 17.1_20251118.1, vendor=Red Hat, Inc., io.buildah.version=1.41.4, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, summary=Red Hat OpenStack Platform 17.1 collectd, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, batch=17.1_20251118.1, vcs-type=git, io.openshift.expose-services=, container_name=collectd, release=1761123044, build-date=2025-11-18T22:51:28Z, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, architecture=x86_64, distribution-scope=public) Dec 15 03:53:03 localhost podman[99184]: 2025-12-15 08:53:03.794230451 +0000 UTC m=+0.121561439 container exec_died 165a6fd1f2b629d16da0798ec1c9bac41d302d530a0ffa668b2315a52d6aa04e (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, url=https://www.redhat.com, batch=17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, build-date=2025-11-18T22:51:28Z, maintainer=OpenStack TripleO Team, 
cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, architecture=x86_64, io.buildah.version=1.41.4, release=1761123044, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.expose-services=, vendor=Red Hat, Inc., vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, managed_by=tripleo_ansible, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, config_id=tripleo_step3, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 collectd, name=rhosp17/openstack-collectd, vcs-type=git, version=17.1.12, container_name=collectd, tcib_managed=true, com.redhat.component=openstack-collectd-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI) Dec 15 03:53:03 localhost systemd[1]: 165a6fd1f2b629d16da0798ec1c9bac41d302d530a0ffa668b2315a52d6aa04e.service: Deactivated successfully. Dec 15 03:53:03 localhost systemd[1]: tmp-crun.1aUdZN.mount: Deactivated successfully. Dec 15 03:53:03 localhost podman[99186]: 2025-12-15 08:53:03.848424488 +0000 UTC m=+0.167448684 container health_status 36a88083ed613f6871d2631ae462bca308a45ba583333f7cfe4d0b0c40a18ba5 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, batch=17.1_20251118.1, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, com.redhat.component=openstack-nova-compute-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, description=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp17/openstack-nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., url=https://www.redhat.com, managed_by=tripleo_ansible, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '182e509007ab5e6e5b2500a552cbd5ba-879500e96bf8dfb93687004bd86f2317'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, konflux.additional-tags=17.1.12 17.1_20251118.1, release=1761123044, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, version=17.1.12, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.buildah.version=1.41.4, io.openshift.expose-services=, config_id=tripleo_step5, build-date=2025-11-19T00:36:58Z, tcib_managed=true, vcs-type=git, distribution-scope=public, container_name=nova_compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute) Dec 15 03:53:03 localhost podman[99187]: 2025-12-15 08:53:03.880969018 +0000 UTC m=+0.196126841 container health_status 97115ff7c5a84ae7e17f79e696e94da3c63f9fe0051287fe9193bec282a03f99 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, vcs-type=git, 
io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, tcib_managed=true, name=rhosp17/openstack-ceilometer-ipmi, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.buildah.version=1.41.4, com.redhat.component=openstack-ceilometer-ipmi-container, url=https://www.redhat.com, version=17.1.12, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, distribution-scope=public, container_name=ceilometer_agent_ipmi, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 17.1_20251118.1, release=1761123044, architecture=x86_64, vendor=Red Hat, Inc., batch=17.1_20251118.1, config_id=tripleo_step4, build-date=2025-11-19T00:12:45Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'fcee5a4a91f85471fca7b61211375646'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, managed_by=tripleo_ansible) Dec 15 03:53:03 localhost podman[99187]: 2025-12-15 08:53:03.931286552 +0000 UTC m=+0.246444385 container exec_died 97115ff7c5a84ae7e17f79e696e94da3c63f9fe0051287fe9193bec282a03f99 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, name=rhosp17/openstack-ceilometer-ipmi, io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., version=17.1.12, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, url=https://www.redhat.com, com.redhat.component=openstack-ceilometer-ipmi-container, batch=17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, release=1761123044, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, maintainer=OpenStack TripleO Team, container_name=ceilometer_agent_ipmi, build-date=2025-11-19T00:12:45Z, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, distribution-scope=public, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'fcee5a4a91f85471fca7b61211375646'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 17.1_20251118.1, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, config_id=tripleo_step4, vcs-type=git, tcib_managed=true) Dec 15 03:53:03 localhost systemd[1]: 97115ff7c5a84ae7e17f79e696e94da3c63f9fe0051287fe9193bec282a03f99.service: Deactivated successfully. Dec 15 03:53:03 localhost podman[99194]: 2025-12-15 08:53:03.980665362 +0000 UTC m=+0.292543547 container health_status ac45612fab357b615b4684dbfa5bd000dd29eb1adfbaedb6db0802fc49e89549 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, tcib_managed=true, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 cron, com.redhat.component=openstack-cron-container, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 cron, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 
'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, io.openshift.expose-services=, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, name=rhosp17/openstack-cron, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, version=17.1.12, managed_by=tripleo_ansible, io.buildah.version=1.41.4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, release=1761123044, vcs-type=git, maintainer=OpenStack TripleO Team, architecture=x86_64, build-date=2025-11-18T22:49:32Z, config_id=tripleo_step4, konflux.additional-tags=17.1.12 17.1_20251118.1, batch=17.1_20251118.1, container_name=logrotate_crond) Dec 15 03:53:03 localhost podman[99200]: 2025-12-15 08:53:03.932299959 +0000 UTC m=+0.242571231 container health_status d6041e5471624a495d5be42684f6049ccf71802a80d5760239330151d465d146 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, managed_by=tripleo_ansible, batch=17.1_20251118.1, name=rhosp17/openstack-ceilometer-compute, distribution-scope=public, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 
openstack-ceilometer-compute, version=17.1.12, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'fcee5a4a91f85471fca7b61211375646'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, maintainer=OpenStack TripleO Team, vcs-type=git, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, tcib_managed=true, release=1761123044, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, vendor=Red Hat, Inc., build-date=2025-11-19T00:11:48Z, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 
ceilometer-compute, config_id=tripleo_step4, io.buildah.version=1.41.4, container_name=ceilometer_agent_compute, url=https://www.redhat.com, com.redhat.component=openstack-ceilometer-compute-container) Dec 15 03:53:03 localhost podman[99194]: 2025-12-15 08:53:03.991245144 +0000 UTC m=+0.303123379 container exec_died ac45612fab357b615b4684dbfa5bd000dd29eb1adfbaedb6db0802fc49e89549 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, batch=17.1_20251118.1, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, description=Red Hat OpenStack Platform 17.1 cron, vcs-type=git, distribution-scope=public, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, managed_by=tripleo_ansible, name=rhosp17/openstack-cron, io.buildah.version=1.41.4, konflux.additional-tags=17.1.12 17.1_20251118.1, architecture=x86_64, build-date=2025-11-18T22:49:32Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, io.openshift.expose-services=, container_name=logrotate_crond, maintainer=OpenStack TripleO Team, release=1761123044, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, summary=Red Hat OpenStack Platform 17.1 cron, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, config_id=tripleo_step4, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.component=openstack-cron-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, version=17.1.12) Dec 15 03:53:04 localhost systemd[1]: ac45612fab357b615b4684dbfa5bd000dd29eb1adfbaedb6db0802fc49e89549.service: Deactivated successfully. Dec 15 03:53:04 localhost podman[99186]: 2025-12-15 08:53:04.006326098 +0000 UTC m=+0.325350384 container exec_died 36a88083ed613f6871d2631ae462bca308a45ba583333f7cfe4d0b0c40a18ba5 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, io.buildah.version=1.41.4, url=https://www.redhat.com, batch=17.1_20251118.1, tcib_managed=true, version=17.1.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, maintainer=OpenStack TripleO Team, release=1761123044, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, container_name=nova_compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, build-date=2025-11-19T00:36:58Z, vcs-type=git, config_id=tripleo_step5, vendor=Red Hat, Inc., vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, 
description=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, distribution-scope=public, com.redhat.component=openstack-nova-compute-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, name=rhosp17/openstack-nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '182e509007ab5e6e5b2500a552cbd5ba-879500e96bf8dfb93687004bd86f2317'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', 
'/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}) Dec 15 03:53:04 localhost podman[99200]: 2025-12-15 08:53:04.019259082 +0000 UTC m=+0.329530344 container exec_died d6041e5471624a495d5be42684f6049ccf71802a80d5760239330151d465d146 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, managed_by=tripleo_ansible, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, maintainer=OpenStack TripleO Team, container_name=ceilometer_agent_compute, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, vcs-type=git, build-date=2025-11-19T00:11:48Z, url=https://www.redhat.com, release=1761123044, vendor=Red Hat, Inc., batch=17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, tcib_managed=true, distribution-scope=public, name=rhosp17/openstack-ceilometer-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, version=17.1.12, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.buildah.version=1.41.4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.component=openstack-ceilometer-compute-container, config_id=tripleo_step4, konflux.additional-tags=17.1.12 17.1_20251118.1, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'fcee5a4a91f85471fca7b61211375646'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 
'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}) Dec 15 03:53:04 localhost systemd[1]: 36a88083ed613f6871d2631ae462bca308a45ba583333f7cfe4d0b0c40a18ba5.service: Deactivated successfully. Dec 15 03:53:04 localhost systemd[1]: d6041e5471624a495d5be42684f6049ccf71802a80d5760239330151d465d146.service: Deactivated successfully. 
Dec 15 03:53:04 localhost podman[99185]: 2025-12-15 08:53:04.089441037 +0000 UTC m=+0.411649328 container health_status 2649c3aec9af2f04509e1d248b1050b202fccc3ed2b86421c89a2a65376db057 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, managed_by=tripleo_ansible, version=17.1.12, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, batch=17.1_20251118.1, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 iscsid, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, vcs-type=git, com.redhat.component=openstack-iscsid-container, architecture=x86_64, build-date=2025-11-18T23:44:13Z, io.openshift.expose-services=, name=rhosp17/openstack-iscsid, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1761123044, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '182e509007ab5e6e5b2500a552cbd5ba'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', 
'/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, tcib_managed=true, io.buildah.version=1.41.4, container_name=iscsid, config_id=tripleo_step3, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, description=Red Hat OpenStack Platform 17.1 iscsid, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, distribution-scope=public, konflux.additional-tags=17.1.12 17.1_20251118.1) Dec 15 03:53:04 localhost podman[99185]: 2025-12-15 08:53:04.126370615 +0000 UTC m=+0.448578886 container exec_died 2649c3aec9af2f04509e1d248b1050b202fccc3ed2b86421c89a2a65376db057 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, config_id=tripleo_step3, architecture=x86_64, name=rhosp17/openstack-iscsid, distribution-scope=public, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, konflux.additional-tags=17.1.12 17.1_20251118.1, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, build-date=2025-11-18T23:44:13Z, release=1761123044, container_name=iscsid, version=17.1.12, description=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.component=openstack-iscsid-container, batch=17.1_20251118.1, vcs-type=git, managed_by=tripleo_ansible, io.buildah.version=1.41.4, summary=Red Hat OpenStack Platform 17.1 iscsid, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '182e509007ab5e6e5b2500a552cbd5ba'}, 'healthcheck': {'test': 
'/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05) Dec 15 03:53:04 localhost systemd[1]: 2649c3aec9af2f04509e1d248b1050b202fccc3ed2b86421c89a2a65376db057.service: Deactivated successfully. Dec 15 03:53:04 localhost sshd[99320]: main: sshd: ssh-rsa algorithm is disabled Dec 15 03:53:07 localhost systemd[1]: Started /usr/bin/podman healthcheck run 4c3f871f4bdf287b1d3ce717956c1cf130617489217410eadf6fffcc440eb3b0. 
Dec 15 03:53:07 localhost podman[99398]: 2025-12-15 08:53:07.23649058 +0000 UTC m=+0.082762412 container health_status 4c3f871f4bdf287b1d3ce717956c1cf130617489217410eadf6fffcc440eb3b0 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, vendor=Red Hat, Inc., tcib_managed=true, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, url=https://www.redhat.com, com.redhat.component=openstack-nova-compute-container, version=17.1.12, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '879500e96bf8dfb93687004bd86f2317'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, config_id=tripleo_step4, build-date=2025-11-19T00:36:58Z, 
io.buildah.version=1.41.4, release=1761123044, name=rhosp17/openstack-nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, vcs-type=git, container_name=nova_migration_target, summary=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, batch=17.1_20251118.1, distribution-scope=public) Dec 15 03:53:07 localhost podman[99398]: 2025-12-15 08:53:07.608410317 +0000 UTC m=+0.454682149 container exec_died 4c3f871f4bdf287b1d3ce717956c1cf130617489217410eadf6fffcc440eb3b0 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, build-date=2025-11-19T00:36:58Z, tcib_managed=true, url=https://www.redhat.com, managed_by=tripleo_ansible, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-nova-compute-container, config_id=tripleo_step4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.12, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '879500e96bf8dfb93687004bd86f2317'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, name=rhosp17/openstack-nova-compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.buildah.version=1.41.4, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, vcs-type=git, batch=17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 nova-compute, distribution-scope=public, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, container_name=nova_migration_target, release=1761123044) Dec 15 03:53:07 localhost systemd[1]: 4c3f871f4bdf287b1d3ce717956c1cf130617489217410eadf6fffcc440eb3b0.service: Deactivated successfully. Dec 15 03:53:10 localhost systemd[1]: Started /usr/bin/podman healthcheck run 2df8d20c6ba32f38bd0e6519c9c77a4146b632f21c81197d8c319b223aad81f1. Dec 15 03:53:10 localhost systemd[1]: Started /usr/bin/podman healthcheck run 4d35b7dc3f3a4e3c60661e85d3511f50804e87244d74cab67af5c6e8c447d379. Dec 15 03:53:10 localhost systemd[1]: tmp-crun.NWBNfm.mount: Deactivated successfully. 
Dec 15 03:53:10 localhost podman[99421]: 2025-12-15 08:53:10.753208968 +0000 UTC m=+0.087590581 container health_status 2df8d20c6ba32f38bd0e6519c9c77a4146b632f21c81197d8c319b223aad81f1 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=unhealthy, version=17.1.12, architecture=x86_64, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vendor=Red Hat, Inc., vcs-type=git, tcib_managed=true, url=https://www.redhat.com, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.expose-services=, config_id=tripleo_step4, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, description=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, com.redhat.component=openstack-ovn-controller-container, build-date=2025-11-18T23:34:05Z, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, io.buildah.version=1.41.4, container_name=ovn_controller, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, release=1761123044, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, 
distribution-scope=public, batch=17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, name=rhosp17/openstack-ovn-controller, summary=Red Hat OpenStack Platform 17.1 ovn-controller) Dec 15 03:53:10 localhost podman[99421]: 2025-12-15 08:53:10.769441862 +0000 UTC m=+0.103823445 container exec_died 2df8d20c6ba32f38bd0e6519c9c77a4146b632f21c81197d8c319b223aad81f1 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, architecture=x86_64, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, konflux.additional-tags=17.1.12 17.1_20251118.1, batch=17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, summary=Red Hat OpenStack Platform 17.1 ovn-controller, build-date=2025-11-18T23:34:05Z, distribution-scope=public, managed_by=tripleo_ansible, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, name=rhosp17/openstack-ovn-controller, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 ovn-controller, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, 
io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, release=1761123044, url=https://www.redhat.com, com.redhat.component=openstack-ovn-controller-container, version=17.1.12, container_name=ovn_controller, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, config_id=tripleo_step4, io.buildah.version=1.41.4) Dec 15 03:53:10 localhost podman[99422]: 2025-12-15 08:53:10.809634166 +0000 UTC m=+0.142082017 container health_status 4d35b7dc3f3a4e3c60661e85d3511f50804e87244d74cab67af5c6e8c447d379 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=unhealthy, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'a56a6f14b467cd9064e40c03defa5ed7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', 
'/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, managed_by=tripleo_ansible, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, batch=17.1_20251118.1, name=rhosp17/openstack-neutron-metadata-agent-ovn, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_id=tripleo_step4, version=17.1.12, io.openshift.expose-services=, distribution-scope=public, release=1761123044, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, tcib_managed=true, build-date=2025-11-19T00:14:25Z, container_name=ovn_metadata_agent, io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, konflux.additional-tags=17.1.12 17.1_20251118.1, architecture=x86_64, vcs-type=git) Dec 15 03:53:10 localhost podman[99421]: unhealthy Dec 15 03:53:10 localhost systemd[1]: 2df8d20c6ba32f38bd0e6519c9c77a4146b632f21c81197d8c319b223aad81f1.service: Main process exited, code=exited, status=1/FAILURE Dec 15 03:53:10 localhost systemd[1]: 2df8d20c6ba32f38bd0e6519c9c77a4146b632f21c81197d8c319b223aad81f1.service: Failed with result 'exit-code'. 
Dec 15 03:53:10 localhost podman[99422]: 2025-12-15 08:53:10.851411662 +0000 UTC m=+0.183859473 container exec_died 4d35b7dc3f3a4e3c60661e85d3511f50804e87244d74cab67af5c6e8c447d379 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'a56a6f14b467cd9064e40c03defa5ed7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, container_name=ovn_metadata_agent, name=rhosp17/openstack-neutron-metadata-agent-ovn, 
io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, vendor=Red Hat, Inc., version=17.1.12, release=1761123044, io.buildah.version=1.41.4, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible, build-date=2025-11-19T00:14:25Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, architecture=x86_64, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.expose-services=, config_id=tripleo_step4, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, tcib_managed=true, batch=17.1_20251118.1, vcs-type=git) Dec 15 03:53:10 localhost podman[99422]: unhealthy Dec 15 03:53:10 localhost systemd[1]: 4d35b7dc3f3a4e3c60661e85d3511f50804e87244d74cab67af5c6e8c447d379.service: Main process exited, code=exited, status=1/FAILURE Dec 15 03:53:10 localhost systemd[1]: 4d35b7dc3f3a4e3c60661e85d3511f50804e87244d74cab67af5c6e8c447d379.service: Failed with result 'exit-code'. Dec 15 03:53:19 localhost systemd[1]: Started /usr/bin/podman healthcheck run 6278d648aec9c9f12e0efaafa0d6ae5653eeb34459516699e562eeb9c8d23b3c. Dec 15 03:53:19 localhost systemd[1]: tmp-crun.bHTfXl.mount: Deactivated successfully. 
Dec 15 03:53:19 localhost podman[99459]: 2025-12-15 08:53:19.755576571 +0000 UTC m=+0.090076258 container health_status 6278d648aec9c9f12e0efaafa0d6ae5653eeb34459516699e562eeb9c8d23b3c (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 qdrouterd, release=1761123044, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '29d07da103bbd44a9ed3e29999314b03'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.component=openstack-qdrouterd-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, tcib_managed=true, name=rhosp17/openstack-qdrouterd, vendor=Red Hat, Inc., org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.expose-services=, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, konflux.additional-tags=17.1.12 
17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.buildah.version=1.41.4, build-date=2025-11-18T22:49:46Z, version=17.1.12, config_id=tripleo_step1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, managed_by=tripleo_ansible, batch=17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 qdrouterd, distribution-scope=public, container_name=metrics_qdr, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, architecture=x86_64) Dec 15 03:53:19 localhost podman[99459]: 2025-12-15 08:53:19.968549671 +0000 UTC m=+0.303049358 container exec_died 6278d648aec9c9f12e0efaafa0d6ae5653eeb34459516699e562eeb9c8d23b3c (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '29d07da103bbd44a9ed3e29999314b03'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, konflux.additional-tags=17.1.12 17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_id=tripleo_step1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.expose-services=, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.buildah.version=1.41.4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1761123044, container_name=metrics_qdr, version=17.1.12, distribution-scope=public, batch=17.1_20251118.1, build-date=2025-11-18T22:49:46Z, vcs-type=git, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 qdrouterd, name=rhosp17/openstack-qdrouterd, url=https://www.redhat.com, com.redhat.component=openstack-qdrouterd-container, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, description=Red Hat OpenStack Platform 17.1 qdrouterd, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, tcib_managed=true) Dec 15 03:53:19 localhost systemd[1]: 6278d648aec9c9f12e0efaafa0d6ae5653eeb34459516699e562eeb9c8d23b3c.service: Deactivated successfully. Dec 15 03:53:25 localhost systemd[1]: session-28.scope: Deactivated successfully. Dec 15 03:53:25 localhost systemd[1]: session-28.scope: Consumed 7min 7.077s CPU time. Dec 15 03:53:25 localhost systemd-logind[763]: Session 28 logged out. Waiting for processes to exit. Dec 15 03:53:25 localhost systemd[1]: Starting Check and recover tripleo_nova_virtqemud... Dec 15 03:53:25 localhost systemd-logind[763]: Removed session 28. Dec 15 03:53:25 localhost recover_tripleo_nova_virtqemud[99490]: 61849 Dec 15 03:53:25 localhost systemd[1]: tripleo_nova_virtqemud_recover.service: Deactivated successfully. 
Dec 15 03:53:25 localhost systemd[1]: Finished Check and recover tripleo_nova_virtqemud. Dec 15 03:53:33 localhost sshd[99491]: main: sshd: ssh-rsa algorithm is disabled Dec 15 03:53:34 localhost systemd[1]: Started /usr/bin/podman healthcheck run 165a6fd1f2b629d16da0798ec1c9bac41d302d530a0ffa668b2315a52d6aa04e. Dec 15 03:53:34 localhost systemd[1]: Started /usr/bin/podman healthcheck run 2649c3aec9af2f04509e1d248b1050b202fccc3ed2b86421c89a2a65376db057. Dec 15 03:53:34 localhost systemd[1]: Started /usr/bin/podman healthcheck run 36a88083ed613f6871d2631ae462bca308a45ba583333f7cfe4d0b0c40a18ba5. Dec 15 03:53:34 localhost systemd[1]: Started /usr/bin/podman healthcheck run 97115ff7c5a84ae7e17f79e696e94da3c63f9fe0051287fe9193bec282a03f99. Dec 15 03:53:34 localhost systemd[1]: Started /usr/bin/podman healthcheck run ac45612fab357b615b4684dbfa5bd000dd29eb1adfbaedb6db0802fc49e89549. Dec 15 03:53:34 localhost systemd[1]: Started /usr/bin/podman healthcheck run d6041e5471624a495d5be42684f6049ccf71802a80d5760239330151d465d146. Dec 15 03:53:34 localhost systemd[1]: tmp-crun.DbPj7R.mount: Deactivated successfully. 
Dec 15 03:53:34 localhost podman[99493]: 2025-12-15 08:53:34.826774486 +0000 UTC m=+0.144724888 container health_status 165a6fd1f2b629d16da0798ec1c9bac41d302d530a0ffa668b2315a52d6aa04e (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1, name=rhosp17/openstack-collectd, summary=Red Hat OpenStack Platform 17.1 collectd, container_name=collectd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., 
build-date=2025-11-18T22:51:28Z, io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, tcib_managed=true, url=https://www.redhat.com, architecture=x86_64, managed_by=tripleo_ansible, version=17.1.12, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_id=tripleo_step3, batch=17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-collectd-container, distribution-scope=public, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 collectd, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, release=1761123044) Dec 15 03:53:34 localhost podman[99495]: 2025-12-15 08:53:34.777644963 +0000 UTC m=+0.090520399 container health_status 36a88083ed613f6871d2631ae462bca308a45ba583333f7cfe4d0b0c40a18ba5 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, release=1761123044, version=17.1.12, vcs-type=git, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, url=https://www.redhat.com, config_id=tripleo_step5, konflux.additional-tags=17.1.12 17.1_20251118.1, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, container_name=nova_compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, tcib_managed=true, batch=17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2025-11-19T00:36:58Z, name=rhosp17/openstack-nova-compute, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, com.redhat.component=openstack-nova-compute-container, io.openshift.expose-services=, vendor=Red Hat, Inc., io.buildah.version=1.41.4, architecture=x86_64, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '182e509007ab5e6e5b2500a552cbd5ba-879500e96bf8dfb93687004bd86f2317'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', 
'/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}) Dec 15 03:53:34 localhost podman[99495]: 2025-12-15 08:53:34.915453225 +0000 UTC m=+0.228328731 container exec_died 36a88083ed613f6871d2631ae462bca308a45ba583333f7cfe4d0b0c40a18ba5 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, vcs-type=git, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, version=17.1.12, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '182e509007ab5e6e5b2500a552cbd5ba-879500e96bf8dfb93687004bd86f2317'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', 
'/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, build-date=2025-11-19T00:36:58Z, container_name=nova_compute, io.buildah.version=1.41.4, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, url=https://www.redhat.com, batch=17.1_20251118.1, com.redhat.component=openstack-nova-compute-container, konflux.additional-tags=17.1.12 17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, release=1761123044, description=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, config_id=tripleo_step5, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, name=rhosp17/openstack-nova-compute, tcib_managed=true, managed_by=tripleo_ansible, architecture=x86_64) Dec 15 03:53:34 localhost systemd[1]: 36a88083ed613f6871d2631ae462bca308a45ba583333f7cfe4d0b0c40a18ba5.service: Deactivated successfully. 
Dec 15 03:53:34 localhost podman[99496]: 2025-12-15 08:53:34.930237431 +0000 UTC m=+0.239938982 container health_status 97115ff7c5a84ae7e17f79e696e94da3c63f9fe0051287fe9193bec282a03f99 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, config_id=tripleo_step4, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, batch=17.1_20251118.1, version=17.1.12, container_name=ceilometer_agent_ipmi, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'fcee5a4a91f85471fca7b61211375646'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, release=1761123044, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.component=openstack-ceilometer-ipmi-container, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp17/openstack-ceilometer-ipmi, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public, io.openshift.expose-services=, url=https://www.redhat.com, architecture=x86_64, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, vcs-type=git, build-date=2025-11-19T00:12:45Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, tcib_managed=true, vendor=Red Hat, Inc., io.buildah.version=1.41.4, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi) Dec 15 03:53:34 localhost podman[99494]: 2025-12-15 08:53:34.884056896 +0000 UTC m=+0.199872870 container health_status 2649c3aec9af2f04509e1d248b1050b202fccc3ed2b86421c89a2a65376db057 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.12 17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, version=17.1.12, architecture=x86_64, com.redhat.component=openstack-iscsid-container, build-date=2025-11-18T23:44:13Z, config_id=tripleo_step3, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, managed_by=tripleo_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '182e509007ab5e6e5b2500a552cbd5ba'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 iscsid, container_name=iscsid, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, description=Red Hat OpenStack Platform 17.1 iscsid, batch=17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, name=rhosp17/openstack-iscsid, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, tcib_managed=true, io.buildah.version=1.41.4, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, url=https://www.redhat.com, vcs-type=git, release=1761123044) Dec 15 03:53:34 localhost podman[99496]: 2025-12-15 08:53:34.956363798 +0000 UTC m=+0.266065349 container exec_died 97115ff7c5a84ae7e17f79e696e94da3c63f9fe0051287fe9193bec282a03f99 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, build-date=2025-11-19T00:12:45Z, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, managed_by=tripleo_ansible, config_id=tripleo_step4, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, distribution-scope=public, 
architecture=x86_64, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, konflux.additional-tags=17.1.12 17.1_20251118.1, batch=17.1_20251118.1, com.redhat.component=openstack-ceilometer-ipmi-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp17/openstack-ceilometer-ipmi, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, version=17.1.12, vcs-type=git, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'fcee5a4a91f85471fca7b61211375646'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.buildah.version=1.41.4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, maintainer=OpenStack TripleO Team, container_name=ceilometer_agent_ipmi, release=1761123044, tcib_managed=true, vendor=Red Hat, Inc., io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, url=https://www.redhat.com, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676) Dec 15 03:53:34 localhost podman[99502]: 2025-12-15 08:53:34.990518351 +0000 UTC m=+0.297739426 container health_status 
ac45612fab357b615b4684dbfa5bd000dd29eb1adfbaedb6db0802fc49e89549 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, version=17.1.12, com.redhat.component=openstack-cron-container, io.buildah.version=1.41.4, name=rhosp17/openstack-cron, release=1761123044, url=https://www.redhat.com, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, tcib_managed=true, build-date=2025-11-18T22:49:32Z, batch=17.1_20251118.1, container_name=logrotate_crond, distribution-scope=public, vendor=Red Hat, Inc., vcs-type=git, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 cron, config_id=tripleo_step4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, description=Red Hat OpenStack Platform 17.1 cron, konflux.additional-tags=17.1.12 17.1_20251118.1, architecture=x86_64, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, managed_by=tripleo_ansible) Dec 15 03:53:35 localhost podman[99493]: 2025-12-15 08:53:35.009757385 +0000 UTC m=+0.327707817 container exec_died 165a6fd1f2b629d16da0798ec1c9bac41d302d530a0ffa668b2315a52d6aa04e (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, name=rhosp17/openstack-collectd, com.redhat.component=openstack-collectd-container, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.12, container_name=collectd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, summary=Red Hat OpenStack Platform 17.1 collectd, io.buildah.version=1.41.4, distribution-scope=public, tcib_managed=true, vcs-type=git, description=Red Hat OpenStack Platform 17.1 collectd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vendor=Red Hat, Inc., release=1761123044, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_id=tripleo_step3, io.openshift.expose-services=, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, architecture=x86_64, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, maintainer=OpenStack TripleO Team, batch=17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, url=https://www.redhat.com, build-date=2025-11-18T22:51:28Z) Dec 15 03:53:35 localhost podman[99494]: 2025-12-15 08:53:35.019693551 +0000 UTC m=+0.335509525 container exec_died 2649c3aec9af2f04509e1d248b1050b202fccc3ed2b86421c89a2a65376db057 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, name=rhosp17/openstack-iscsid, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, konflux.additional-tags=17.1.12 17.1_20251118.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '182e509007ab5e6e5b2500a552cbd5ba'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': 
['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, build-date=2025-11-18T23:44:13Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, version=17.1.12, vendor=Red Hat, Inc., managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, maintainer=OpenStack TripleO Team, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 iscsid, container_name=iscsid, architecture=x86_64, com.redhat.component=openstack-iscsid-container, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 iscsid, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, config_id=tripleo_step3, batch=17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, release=1761123044, tcib_managed=true, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public, url=https://www.redhat.com, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, io.buildah.version=1.41.4) Dec 15 03:53:35 localhost systemd[1]: 165a6fd1f2b629d16da0798ec1c9bac41d302d530a0ffa668b2315a52d6aa04e.service: Deactivated successfully. 
Dec 15 03:53:35 localhost podman[99502]: 2025-12-15 08:53:35.022380842 +0000 UTC m=+0.329601877 container exec_died ac45612fab357b615b4684dbfa5bd000dd29eb1adfbaedb6db0802fc49e89549 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, architecture=x86_64, release=1761123044, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, description=Red Hat OpenStack Platform 17.1 cron, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, com.redhat.component=openstack-cron-container, config_id=tripleo_step4, summary=Red Hat OpenStack Platform 17.1 cron, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-cron, distribution-scope=public, konflux.additional-tags=17.1.12 17.1_20251118.1, container_name=logrotate_crond, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-type=git, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, version=17.1.12, vendor=Red Hat, Inc., managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, io.buildah.version=1.41.4, build-date=2025-11-18T22:49:32Z, url=https://www.redhat.com, batch=17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05) Dec 15 03:53:35 localhost systemd[1]: 2649c3aec9af2f04509e1d248b1050b202fccc3ed2b86421c89a2a65376db057.service: Deactivated successfully. Dec 15 03:53:35 localhost podman[99509]: 2025-12-15 08:53:34.807203023 +0000 UTC m=+0.110821812 container health_status d6041e5471624a495d5be42684f6049ccf71802a80d5760239330151d465d146 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, managed_by=tripleo_ansible, vendor=Red Hat, Inc., config_id=tripleo_step4, version=17.1.12, konflux.additional-tags=17.1.12 17.1_20251118.1, container_name=ceilometer_agent_compute, com.redhat.component=openstack-ceilometer-compute-container, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'fcee5a4a91f85471fca7b61211375646'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.buildah.version=1.41.4, release=1761123044, build-date=2025-11-19T00:11:48Z, name=rhosp17/openstack-ceilometer-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, vcs-type=git, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, batch=17.1_20251118.1, architecture=x86_64, io.openshift.expose-services=) Dec 15 03:53:35 localhost systemd[1]: ac45612fab357b615b4684dbfa5bd000dd29eb1adfbaedb6db0802fc49e89549.service: Deactivated successfully. 
Dec 15 03:53:35 localhost podman[99509]: 2025-12-15 08:53:35.092405362 +0000 UTC m=+0.396024131 container exec_died d6041e5471624a495d5be42684f6049ccf71802a80d5760239330151d465d146 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, container_name=ceilometer_agent_compute, io.buildah.version=1.41.4, vendor=Red Hat, Inc., konflux.additional-tags=17.1.12 17.1_20251118.1, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'fcee5a4a91f85471fca7b61211375646'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, build-date=2025-11-19T00:11:48Z, tcib_managed=true, batch=17.1_20251118.1, distribution-scope=public, url=https://www.redhat.com, name=rhosp17/openstack-ceilometer-compute, config_id=tripleo_step4, architecture=x86_64, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, com.redhat.component=openstack-ceilometer-compute-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, 
io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, release=1761123044, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, vcs-type=git, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, version=17.1.12, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 ceilometer-compute) Dec 15 03:53:35 localhost systemd[1]: d6041e5471624a495d5be42684f6049ccf71802a80d5760239330151d465d146.service: Deactivated successfully. Dec 15 03:53:35 localhost systemd[1]: 97115ff7c5a84ae7e17f79e696e94da3c63f9fe0051287fe9193bec282a03f99.service: Deactivated successfully. Dec 15 03:53:35 localhost systemd[1]: Stopping User Manager for UID 1003... Dec 15 03:53:35 localhost systemd[35760]: Activating special unit Exit the Session... Dec 15 03:53:35 localhost systemd[35760]: Removed slice User Background Tasks Slice. Dec 15 03:53:35 localhost systemd[35760]: Stopped target Main User Target. Dec 15 03:53:35 localhost systemd[35760]: Stopped target Basic System. Dec 15 03:53:35 localhost systemd[35760]: Stopped target Paths. Dec 15 03:53:35 localhost systemd[35760]: Stopped target Sockets. Dec 15 03:53:35 localhost systemd[35760]: Stopped target Timers. Dec 15 03:53:35 localhost systemd[35760]: Stopped Mark boot as successful after the user session has run 2 minutes. Dec 15 03:53:35 localhost systemd[35760]: Stopped Daily Cleanup of User's Temporary Directories. Dec 15 03:53:35 localhost systemd[35760]: Closed D-Bus User Message Bus Socket. Dec 15 03:53:35 localhost systemd[35760]: Stopped Create User's Volatile Files and Directories. 
Dec 15 03:53:35 localhost systemd[35760]: Removed slice User Application Slice. Dec 15 03:53:35 localhost systemd[35760]: Reached target Shutdown. Dec 15 03:53:35 localhost systemd[35760]: Finished Exit the Session. Dec 15 03:53:35 localhost systemd[35760]: Reached target Exit the Session. Dec 15 03:53:35 localhost systemd[1]: user@1003.service: Deactivated successfully. Dec 15 03:53:35 localhost systemd[1]: Stopped User Manager for UID 1003. Dec 15 03:53:35 localhost systemd[1]: user@1003.service: Consumed 4.580s CPU time, read 0B from disk, written 7.0K to disk. Dec 15 03:53:35 localhost systemd[1]: Stopping User Runtime Directory /run/user/1003... Dec 15 03:53:35 localhost systemd[1]: user-runtime-dir@1003.service: Deactivated successfully. Dec 15 03:53:35 localhost systemd[1]: Stopped User Runtime Directory /run/user/1003. Dec 15 03:53:35 localhost systemd[1]: Removed slice User Slice of UID 1003. Dec 15 03:53:35 localhost systemd[1]: user-1003.slice: Consumed 7min 11.681s CPU time. Dec 15 03:53:35 localhost systemd[1]: run-user-1003.mount: Deactivated successfully. Dec 15 03:53:37 localhost systemd[1]: Started /usr/bin/podman healthcheck run 4c3f871f4bdf287b1d3ce717956c1cf130617489217410eadf6fffcc440eb3b0. 
Dec 15 03:53:37 localhost podman[99628]: 2025-12-15 08:53:37.745370284 +0000 UTC m=+0.077994085 container health_status 4c3f871f4bdf287b1d3ce717956c1cf130617489217410eadf6fffcc440eb3b0 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, vendor=Red Hat, Inc., config_id=tripleo_step4, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, com.redhat.component=openstack-nova-compute-container, batch=17.1_20251118.1, build-date=2025-11-19T00:36:58Z, architecture=x86_64, managed_by=tripleo_ansible, name=rhosp17/openstack-nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, url=https://www.redhat.com, konflux.additional-tags=17.1.12 17.1_20251118.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '879500e96bf8dfb93687004bd86f2317'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.41.4, distribution-scope=public, 
tcib_managed=true, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, container_name=nova_migration_target, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, release=1761123044, maintainer=OpenStack TripleO Team, vcs-type=git, description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, version=17.1.12, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI) Dec 15 03:53:38 localhost podman[99628]: 2025-12-15 08:53:38.147417255 +0000 UTC m=+0.480041066 container exec_died 4c3f871f4bdf287b1d3ce717956c1cf130617489217410eadf6fffcc440eb3b0 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '879500e96bf8dfb93687004bd86f2317'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', 
'/var/lib/nova:/var/lib/nova:shared']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, build-date=2025-11-19T00:36:58Z, io.openshift.expose-services=, vendor=Red Hat, Inc., distribution-scope=public, konflux.additional-tags=17.1.12 17.1_20251118.1, config_id=tripleo_step4, name=rhosp17/openstack-nova-compute, tcib_managed=true, version=17.1.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20251118.1, io.buildah.version=1.41.4, managed_by=tripleo_ansible, release=1761123044, url=https://www.redhat.com, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, vcs-type=git, com.redhat.component=openstack-nova-compute-container, container_name=nova_migration_target, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute) Dec 15 03:53:38 localhost systemd[1]: 4c3f871f4bdf287b1d3ce717956c1cf130617489217410eadf6fffcc440eb3b0.service: Deactivated successfully. Dec 15 03:53:41 localhost systemd[1]: Started /usr/bin/podman healthcheck run 2df8d20c6ba32f38bd0e6519c9c77a4146b632f21c81197d8c319b223aad81f1. Dec 15 03:53:41 localhost systemd[1]: Started /usr/bin/podman healthcheck run 4d35b7dc3f3a4e3c60661e85d3511f50804e87244d74cab67af5c6e8c447d379. Dec 15 03:53:41 localhost systemd[1]: tmp-crun.IdYf7K.mount: Deactivated successfully. 
Dec 15 03:53:41 localhost podman[99652]: 2025-12-15 08:53:41.796415948 +0000 UTC m=+0.121786715 container health_status 4d35b7dc3f3a4e3c60661e85d3511f50804e87244d74cab67af5c6e8c447d379 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=unhealthy, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, distribution-scope=public, release=1761123044, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, konflux.additional-tags=17.1.12 17.1_20251118.1, name=rhosp17/openstack-neutron-metadata-agent-ovn, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, container_name=ovn_metadata_agent, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, version=17.1.12, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_id=tripleo_step4, architecture=x86_64, build-date=2025-11-19T00:14:25Z, managed_by=tripleo_ansible, io.openshift.expose-services=, batch=17.1_20251118.1, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'a56a6f14b467cd9064e40c03defa5ed7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 
'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}) Dec 15 03:53:41 localhost podman[99651]: 2025-12-15 08:53:41.841428151 +0000 UTC m=+0.170149337 container health_status 2df8d20c6ba32f38bd0e6519c9c77a4146b632f21c81197d8c319b223aad81f1 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=unhealthy, container_name=ovn_controller, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, description=Red Hat OpenStack Platform 17.1 ovn-controller, managed_by=tripleo_ansible, batch=17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, config_id=tripleo_step4, name=rhosp17/openstack-ovn-controller, summary=Red Hat OpenStack Platform 17.1 ovn-controller, tcib_managed=true, version=17.1.12, konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., 
vcs-type=git, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, release=1761123044, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-18T23:34:05Z, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, url=https://www.redhat.com, com.redhat.component=openstack-ovn-controller-container, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, io.buildah.version=1.41.4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream) Dec 15 03:53:41 localhost podman[99651]: 2025-12-15 08:53:41.855029324 +0000 UTC m=+0.183750470 container exec_died 2df8d20c6ba32f38bd0e6519c9c77a4146b632f21c81197d8c319b223aad81f1 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, batch=17.1_20251118.1, name=rhosp17/openstack-ovn-controller, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.component=openstack-ovn-controller-container, distribution-scope=public, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, architecture=x86_64, version=17.1.12, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, container_name=ovn_controller, config_id=tripleo_step4, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, description=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., build-date=2025-11-18T23:34:05Z, summary=Red Hat OpenStack Platform 17.1 ovn-controller, release=1761123044, url=https://www.redhat.com, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.4, managed_by=tripleo_ansible, vcs-type=git, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream) Dec 15 03:53:41 localhost podman[99651]: unhealthy Dec 15 03:53:41 localhost podman[99652]: 2025-12-15 08:53:41.863713166 +0000 UTC m=+0.189083893 container exec_died 4d35b7dc3f3a4e3c60661e85d3511f50804e87244d74cab67af5c6e8c447d379 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, vcs-type=git, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible, 
container_name=ovn_metadata_agent, batch=17.1_20251118.1, build-date=2025-11-19T00:14:25Z, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.buildah.version=1.41.4, name=rhosp17/openstack-neutron-metadata-agent-ovn, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, tcib_managed=true, config_id=tripleo_step4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, version=17.1.12, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, release=1761123044, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'a56a6f14b467cd9064e40c03defa5ed7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', 
'/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, vendor=Red Hat, Inc., url=https://www.redhat.com) Dec 15 03:53:41 localhost systemd[1]: 2df8d20c6ba32f38bd0e6519c9c77a4146b632f21c81197d8c319b223aad81f1.service: Main process exited, code=exited, status=1/FAILURE Dec 15 03:53:41 localhost systemd[1]: 2df8d20c6ba32f38bd0e6519c9c77a4146b632f21c81197d8c319b223aad81f1.service: Failed with result 'exit-code'. Dec 15 03:53:41 localhost podman[99652]: unhealthy Dec 15 03:53:41 localhost systemd[1]: 4d35b7dc3f3a4e3c60661e85d3511f50804e87244d74cab67af5c6e8c447d379.service: Main process exited, code=exited, status=1/FAILURE Dec 15 03:53:41 localhost systemd[1]: 4d35b7dc3f3a4e3c60661e85d3511f50804e87244d74cab67af5c6e8c447d379.service: Failed with result 'exit-code'. Dec 15 03:53:50 localhost systemd[1]: Started /usr/bin/podman healthcheck run 6278d648aec9c9f12e0efaafa0d6ae5653eeb34459516699e562eeb9c8d23b3c. Dec 15 03:53:50 localhost systemd[1]: tmp-crun.OHpEqi.mount: Deactivated successfully. 
Dec 15 03:53:50 localhost podman[99693]: 2025-12-15 08:53:50.754616939 +0000 UTC m=+0.089892853 container health_status 6278d648aec9c9f12e0efaafa0d6ae5653eeb34459516699e562eeb9c8d23b3c (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, config_id=tripleo_step1, distribution-scope=public, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, container_name=metrics_qdr, summary=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, url=https://www.redhat.com, managed_by=tripleo_ansible, com.redhat.component=openstack-qdrouterd-container, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.buildah.version=1.41.4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, version=17.1.12, name=rhosp17/openstack-qdrouterd, vendor=Red Hat, Inc., build-date=2025-11-18T22:49:46Z, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.expose-services=, tcib_managed=true, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, batch=17.1_20251118.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '29d07da103bbd44a9ed3e29999314b03'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, release=1761123044, konflux.additional-tags=17.1.12 17.1_20251118.1) Dec 15 03:53:50 localhost podman[99693]: 2025-12-15 08:53:50.966443589 +0000 UTC m=+0.301719543 container exec_died 6278d648aec9c9f12e0efaafa0d6ae5653eeb34459516699e562eeb9c8d23b3c (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-type=git, io.buildah.version=1.41.4, url=https://www.redhat.com, build-date=2025-11-18T22:49:46Z, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 qdrouterd, release=1761123044, container_name=metrics_qdr, config_id=tripleo_step1, io.openshift.expose-services=, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, name=rhosp17/openstack-qdrouterd, version=17.1.12, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '29d07da103bbd44a9ed3e29999314b03'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, managed_by=tripleo_ansible, distribution-scope=public, vendor=Red Hat, Inc., batch=17.1_20251118.1, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.component=openstack-qdrouterd-container, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 qdrouterd) Dec 15 03:53:50 localhost systemd[1]: 6278d648aec9c9f12e0efaafa0d6ae5653eeb34459516699e562eeb9c8d23b3c.service: Deactivated successfully. Dec 15 03:54:05 localhost systemd[1]: Started /usr/bin/podman healthcheck run 165a6fd1f2b629d16da0798ec1c9bac41d302d530a0ffa668b2315a52d6aa04e. Dec 15 03:54:05 localhost systemd[1]: Started /usr/bin/podman healthcheck run 2649c3aec9af2f04509e1d248b1050b202fccc3ed2b86421c89a2a65376db057. Dec 15 03:54:05 localhost systemd[1]: Started /usr/bin/podman healthcheck run 36a88083ed613f6871d2631ae462bca308a45ba583333f7cfe4d0b0c40a18ba5. Dec 15 03:54:05 localhost systemd[1]: Started /usr/bin/podman healthcheck run 97115ff7c5a84ae7e17f79e696e94da3c63f9fe0051287fe9193bec282a03f99. 
Dec 15 03:54:05 localhost systemd[1]: Started /usr/bin/podman healthcheck run ac45612fab357b615b4684dbfa5bd000dd29eb1adfbaedb6db0802fc49e89549. Dec 15 03:54:05 localhost systemd[1]: Started /usr/bin/podman healthcheck run d6041e5471624a495d5be42684f6049ccf71802a80d5760239330151d465d146. Dec 15 03:54:05 localhost systemd[1]: tmp-crun.qdWl6V.mount: Deactivated successfully. Dec 15 03:54:05 localhost podman[99723]: 2025-12-15 08:54:05.769064408 +0000 UTC m=+0.089410290 container health_status 2649c3aec9af2f04509e1d248b1050b202fccc3ed2b86421c89a2a65376db057 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, version=17.1.12, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 iscsid, managed_by=tripleo_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '182e509007ab5e6e5b2500a552cbd5ba'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, io.openshift.expose-services=, tcib_managed=true, 
konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team, container_name=iscsid, release=1761123044, config_id=tripleo_step3, build-date=2025-11-18T23:44:13Z, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1, com.redhat.component=openstack-iscsid-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, distribution-scope=public, url=https://www.redhat.com, io.buildah.version=1.41.4, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, name=rhosp17/openstack-iscsid) Dec 15 03:54:05 localhost podman[99724]: 2025-12-15 08:54:05.786867413 +0000 UTC m=+0.102955521 container health_status 36a88083ed613f6871d2631ae462bca308a45ba583333f7cfe4d0b0c40a18ba5 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, url=https://www.redhat.com, konflux.additional-tags=17.1.12 17.1_20251118.1, version=17.1.12, managed_by=tripleo_ansible, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, vendor=Red Hat, Inc., vcs-type=git, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': 
{'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '182e509007ab5e6e5b2500a552cbd5ba-879500e96bf8dfb93687004bd86f2317'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, name=rhosp17/openstack-nova-compute, maintainer=OpenStack TripleO Team, tcib_managed=true, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, batch=17.1_20251118.1, container_name=nova_compute, io.buildah.version=1.41.4, io.openshift.expose-services=, 
build-date=2025-11-19T00:36:58Z, config_id=tripleo_step5, release=1761123044, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.component=openstack-nova-compute-container) Dec 15 03:54:05 localhost podman[99722]: 2025-12-15 08:54:05.824847119 +0000 UTC m=+0.149764693 container health_status 165a6fd1f2b629d16da0798ec1c9bac41d302d530a0ffa668b2315a52d6aa04e (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, managed_by=tripleo_ansible, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 collectd, config_id=tripleo_step3, batch=17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, konflux.additional-tags=17.1.12 17.1_20251118.1, name=rhosp17/openstack-collectd, url=https://www.redhat.com, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, build-date=2025-11-18T22:51:28Z, com.redhat.component=openstack-collectd-container, architecture=x86_64, maintainer=OpenStack TripleO Team, container_name=collectd, vendor=Red Hat, Inc., org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, tcib_managed=true, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 collectd, version=17.1.12, io.buildah.version=1.41.4, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, release=1761123044, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a) Dec 15 03:54:05 localhost podman[99724]: 2025-12-15 08:54:05.838394511 +0000 UTC m=+0.154482629 container exec_died 36a88083ed613f6871d2631ae462bca308a45ba583333f7cfe4d0b0c40a18ba5 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, com.redhat.component=openstack-nova-compute-container, name=rhosp17/openstack-nova-compute, vendor=Red Hat, Inc., batch=17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 nova-compute, container_name=nova_compute, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, version=17.1.12, release=1761123044, vcs-type=git, architecture=x86_64, maintainer=OpenStack TripleO Team, tcib_managed=true, 
vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, distribution-scope=public, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, build-date=2025-11-19T00:36:58Z, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, config_id=tripleo_step5, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, url=https://www.redhat.com, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '182e509007ab5e6e5b2500a552cbd5ba-879500e96bf8dfb93687004bd86f2317'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', 
'/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, description=Red Hat OpenStack Platform 17.1 nova-compute) Dec 15 03:54:05 localhost systemd[1]: 36a88083ed613f6871d2631ae462bca308a45ba583333f7cfe4d0b0c40a18ba5.service: Deactivated successfully. Dec 15 03:54:05 localhost podman[99736]: 2025-12-15 08:54:05.878078481 +0000 UTC m=+0.178327646 container health_status ac45612fab357b615b4684dbfa5bd000dd29eb1adfbaedb6db0802fc49e89549 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, summary=Red Hat OpenStack Platform 17.1 cron, vcs-type=git, version=17.1.12, release=1761123044, batch=17.1_20251118.1, container_name=logrotate_crond, io.openshift.expose-services=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.buildah.version=1.41.4, distribution-scope=public, build-date=2025-11-18T22:49:32Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, managed_by=tripleo_ansible, name=rhosp17/openstack-cron, architecture=x86_64, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.component=openstack-cron-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, description=Red Hat OpenStack Platform 17.1 cron, config_id=tripleo_step4, tcib_managed=true) Dec 15 03:54:05 localhost podman[99736]: 2025-12-15 08:54:05.884236375 +0000 UTC m=+0.184485560 container exec_died ac45612fab357b615b4684dbfa5bd000dd29eb1adfbaedb6db0802fc49e89549 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, tcib_managed=true, name=rhosp17/openstack-cron, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, build-date=2025-11-18T22:49:32Z, vcs-type=git, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, container_name=logrotate_crond, com.redhat.component=openstack-cron-container, version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, description=Red Hat OpenStack Platform 17.1 cron, 
io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 cron, batch=17.1_20251118.1, io.buildah.version=1.41.4, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, release=1761123044, managed_by=tripleo_ansible, vendor=Red Hat, Inc., distribution-scope=public, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, config_id=tripleo_step4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, io.k8s.description=Red Hat OpenStack Platform 17.1 cron) Dec 15 03:54:05 localhost systemd[1]: ac45612fab357b615b4684dbfa5bd000dd29eb1adfbaedb6db0802fc49e89549.service: Deactivated successfully. 
Dec 15 03:54:05 localhost podman[99723]: 2025-12-15 08:54:05.90536068 +0000 UTC m=+0.225706522 container exec_died 2649c3aec9af2f04509e1d248b1050b202fccc3ed2b86421c89a2a65376db057 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, vcs-type=git, vendor=Red Hat, Inc., url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, description=Red Hat OpenStack Platform 17.1 iscsid, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.component=openstack-iscsid-container, release=1761123044, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.expose-services=, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, batch=17.1_20251118.1, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 iscsid, container_name=iscsid, config_id=tripleo_step3, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '182e509007ab5e6e5b2500a552cbd5ba'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', 
'/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, architecture=x86_64, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, distribution-scope=public, konflux.additional-tags=17.1.12 17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, name=rhosp17/openstack-iscsid, build-date=2025-11-18T23:44:13Z, managed_by=tripleo_ansible, io.buildah.version=1.41.4, version=17.1.12) Dec 15 03:54:05 localhost podman[99722]: 2025-12-15 08:54:05.913009924 +0000 UTC m=+0.237927538 container exec_died 165a6fd1f2b629d16da0798ec1c9bac41d302d530a0ffa668b2315a52d6aa04e (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, summary=Red Hat OpenStack Platform 17.1 collectd, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, maintainer=OpenStack TripleO Team, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, url=https://www.redhat.com, container_name=collectd, tcib_managed=true, batch=17.1_20251118.1, version=17.1.12, managed_by=tripleo_ansible, release=1761123044, config_id=tripleo_step3, architecture=x86_64, build-date=2025-11-18T22:51:28Z, io.openshift.expose-services=, name=rhosp17/openstack-collectd, com.redhat.component=openstack-collectd-container, vcs-type=git, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 collectd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vendor=Red Hat, Inc., io.buildah.version=1.41.4, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}) Dec 15 03:54:05 localhost systemd[1]: 2649c3aec9af2f04509e1d248b1050b202fccc3ed2b86421c89a2a65376db057.service: Deactivated successfully. Dec 15 03:54:05 localhost systemd[1]: 165a6fd1f2b629d16da0798ec1c9bac41d302d530a0ffa668b2315a52d6aa04e.service: Deactivated successfully. 
Dec 15 03:54:05 localhost podman[99742]: 2025-12-15 08:54:05.924646695 +0000 UTC m=+0.227688385 container health_status d6041e5471624a495d5be42684f6049ccf71802a80d5760239330151d465d146 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, tcib_managed=true, name=rhosp17/openstack-ceilometer-compute, vendor=Red Hat, Inc., batch=17.1_20251118.1, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, release=1761123044, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, config_id=tripleo_step4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.expose-services=, build-date=2025-11-19T00:11:48Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, managed_by=tripleo_ansible, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, container_name=ceilometer_agent_compute, io.buildah.version=1.41.4, com.redhat.component=openstack-ceilometer-compute-container, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'fcee5a4a91f85471fca7b61211375646'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, url=https://www.redhat.com, vcs-type=git, version=17.1.12, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute) Dec 15 03:54:05 localhost podman[99742]: 2025-12-15 08:54:05.946221671 +0000 UTC m=+0.249263361 container exec_died d6041e5471624a495d5be42684f6049ccf71802a80d5760239330151d465d146 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, version=17.1.12, io.openshift.expose-services=, vcs-type=git, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, container_name=ceilometer_agent_compute, managed_by=tripleo_ansible, release=1761123044, batch=17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.buildah.version=1.41.4, config_id=tripleo_step4, name=rhosp17/openstack-ceilometer-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, maintainer=OpenStack TripleO Team, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, com.redhat.component=openstack-ceilometer-compute-container, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, 
build-date=2025-11-19T00:11:48Z, tcib_managed=true, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'fcee5a4a91f85471fca7b61211375646'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, architecture=x86_64, distribution-scope=public, konflux.additional-tags=17.1.12 17.1_20251118.1) Dec 15 03:54:05 localhost systemd[1]: d6041e5471624a495d5be42684f6049ccf71802a80d5760239330151d465d146.service: Deactivated successfully. 
Dec 15 03:54:05 localhost podman[99730]: 2025-12-15 08:54:05.993683469 +0000 UTC m=+0.299411380 container health_status 97115ff7c5a84ae7e17f79e696e94da3c63f9fe0051287fe9193bec282a03f99 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, io.buildah.version=1.41.4, build-date=2025-11-19T00:12:45Z, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, config_id=tripleo_step4, tcib_managed=true, container_name=ceilometer_agent_ipmi, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, com.redhat.component=openstack-ceilometer-ipmi-container, vcs-type=git, vendor=Red Hat, Inc., architecture=x86_64, distribution-scope=public, version=17.1.12, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible, release=1761123044, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1, name=rhosp17/openstack-ceilometer-ipmi, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'fcee5a4a91f85471fca7b61211375646'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.openshift.expose-services=) Dec 15 03:54:06 localhost podman[99730]: 2025-12-15 08:54:06.019400436 +0000 UTC m=+0.325128357 container exec_died 97115ff7c5a84ae7e17f79e696e94da3c63f9fe0051287fe9193bec282a03f99 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'fcee5a4a91f85471fca7b61211375646'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', 
'/var/log/containers/ceilometer:/var/log/ceilometer:z']}, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, architecture=x86_64, io.buildah.version=1.41.4, name=rhosp17/openstack-ceilometer-ipmi, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, managed_by=tripleo_ansible, release=1761123044, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, batch=17.1_20251118.1, url=https://www.redhat.com, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, version=17.1.12, container_name=ceilometer_agent_ipmi, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.component=openstack-ceilometer-ipmi-container, io.openshift.expose-services=, tcib_managed=true, build-date=2025-11-19T00:12:45Z, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, config_id=tripleo_step4, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05) Dec 15 03:54:06 localhost systemd[1]: 97115ff7c5a84ae7e17f79e696e94da3c63f9fe0051287fe9193bec282a03f99.service: Deactivated successfully. Dec 15 03:54:08 localhost systemd[1]: Started /usr/bin/podman healthcheck run 4c3f871f4bdf287b1d3ce717956c1cf130617489217410eadf6fffcc440eb3b0. 
Dec 15 03:54:08 localhost podman[99932]: 2025-12-15 08:54:08.328943662 +0000 UTC m=+0.083609745 container health_status 4c3f871f4bdf287b1d3ce717956c1cf130617489217410eadf6fffcc440eb3b0 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, maintainer=OpenStack TripleO Team, architecture=x86_64, batch=17.1_20251118.1, konflux.additional-tags=17.1.12 17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.expose-services=, release=1761123044, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step4, vendor=Red Hat, Inc., vcs-type=git, description=Red Hat OpenStack Platform 17.1 nova-compute, container_name=nova_migration_target, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, managed_by=tripleo_ansible, url=https://www.redhat.com, build-date=2025-11-19T00:36:58Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, name=rhosp17/openstack-nova-compute, version=17.1.12, distribution-scope=public, io.buildah.version=1.41.4, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '879500e96bf8dfb93687004bd86f2317'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, summary=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.component=openstack-nova-compute-container) Dec 15 03:54:08 localhost podman[99932]: 2025-12-15 08:54:08.653863813 +0000 UTC m=+0.408529946 container exec_died 4c3f871f4bdf287b1d3ce717956c1cf130617489217410eadf6fffcc440eb3b0 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, release=1761123044, build-date=2025-11-19T00:36:58Z, io.buildah.version=1.41.4, vcs-type=git, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '879500e96bf8dfb93687004bd86f2317'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', 
'/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, vendor=Red Hat, Inc., managed_by=tripleo_ansible, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, name=rhosp17/openstack-nova-compute, container_name=nova_migration_target, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, architecture=x86_64, batch=17.1_20251118.1, config_id=tripleo_step4, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.12, distribution-scope=public, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.component=openstack-nova-compute-container) Dec 15 03:54:08 localhost systemd[1]: 4c3f871f4bdf287b1d3ce717956c1cf130617489217410eadf6fffcc440eb3b0.service: Deactivated successfully. Dec 15 03:54:12 localhost systemd[1]: Started /usr/bin/podman healthcheck run 2df8d20c6ba32f38bd0e6519c9c77a4146b632f21c81197d8c319b223aad81f1. Dec 15 03:54:12 localhost systemd[1]: Started /usr/bin/podman healthcheck run 4d35b7dc3f3a4e3c60661e85d3511f50804e87244d74cab67af5c6e8c447d379. 
Dec 15 03:54:12 localhost podman[99957]: 2025-12-15 08:54:12.76120974 +0000 UTC m=+0.086697387 container health_status 4d35b7dc3f3a4e3c60661e85d3511f50804e87244d74cab67af5c6e8c447d379 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=unhealthy, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, batch=17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, version=17.1.12, managed_by=tripleo_ansible, vendor=Red Hat, Inc., name=rhosp17/openstack-neutron-metadata-agent-ovn, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_id=tripleo_step4, url=https://www.redhat.com, io.buildah.version=1.41.4, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'a56a6f14b467cd9064e40c03defa5ed7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-type=git, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, distribution-scope=public, container_name=ovn_metadata_agent, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, release=1761123044, build-date=2025-11-19T00:14:25Z, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-neutron-metadata-agent-ovn-container) Dec 15 03:54:12 localhost podman[99957]: 2025-12-15 08:54:12.779343104 +0000 UTC m=+0.104830731 container exec_died 4d35b7dc3f3a4e3c60661e85d3511f50804e87244d74cab67af5c6e8c447d379 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, io.buildah.version=1.41.4, release=1761123044, url=https://www.redhat.com, name=rhosp17/openstack-neutron-metadata-agent-ovn, managed_by=tripleo_ansible, version=17.1.12, vcs-type=git, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, tcib_managed=true, build-date=2025-11-19T00:14:25Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.k8s.description=Red Hat OpenStack Platform 
17.1 neutron-metadata-agent-ovn, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'a56a6f14b467cd9064e40c03defa5ed7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=ovn_metadata_agent, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, distribution-scope=public, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, config_id=tripleo_step4, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, 
batch=17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn) Dec 15 03:54:12 localhost podman[99957]: unhealthy Dec 15 03:54:12 localhost systemd[1]: 4d35b7dc3f3a4e3c60661e85d3511f50804e87244d74cab67af5c6e8c447d379.service: Main process exited, code=exited, status=1/FAILURE Dec 15 03:54:12 localhost systemd[1]: 4d35b7dc3f3a4e3c60661e85d3511f50804e87244d74cab67af5c6e8c447d379.service: Failed with result 'exit-code'. Dec 15 03:54:12 localhost podman[99956]: 2025-12-15 08:54:12.865832655 +0000 UTC m=+0.192525024 container health_status 2df8d20c6ba32f38bd0e6519c9c77a4146b632f21c81197d8c319b223aad81f1 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=unhealthy, tcib_managed=true, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, build-date=2025-11-18T23:34:05Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, vendor=Red Hat, Inc., com.redhat.component=openstack-ovn-controller-container, version=17.1.12, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-ovn-controller, release=1761123044, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-type=git, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 
'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, description=Red Hat OpenStack Platform 17.1 ovn-controller, managed_by=tripleo_ansible, url=https://www.redhat.com, container_name=ovn_controller, config_id=tripleo_step4, konflux.additional-tags=17.1.12 17.1_20251118.1, io.buildah.version=1.41.4, batch=17.1_20251118.1, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272) Dec 15 03:54:12 localhost podman[99956]: 2025-12-15 08:54:12.905395372 +0000 UTC m=+0.232087731 container exec_died 2df8d20c6ba32f38bd0e6519c9c77a4146b632f21c81197d8c319b223aad81f1 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, summary=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.component=openstack-ovn-controller-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, architecture=x86_64, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, io.k8s.description=Red Hat OpenStack 
Platform 17.1 ovn-controller, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, vendor=Red Hat, Inc., version=17.1.12, config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 ovn-controller, name=rhosp17/openstack-ovn-controller, io.buildah.version=1.41.4, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, release=1761123044, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, batch=17.1_20251118.1, url=https://www.redhat.com, distribution-scope=public, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.12 17.1_20251118.1, build-date=2025-11-18T23:34:05Z, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, managed_by=tripleo_ansible, container_name=ovn_controller) Dec 15 03:54:12 localhost podman[99956]: unhealthy Dec 15 03:54:12 localhost systemd[1]: 2df8d20c6ba32f38bd0e6519c9c77a4146b632f21c81197d8c319b223aad81f1.service: Main process exited, code=exited, status=1/FAILURE Dec 15 03:54:12 localhost systemd[1]: 2df8d20c6ba32f38bd0e6519c9c77a4146b632f21c81197d8c319b223aad81f1.service: Failed with result 'exit-code'. Dec 15 03:54:21 localhost systemd[1]: Started /usr/bin/podman healthcheck run 6278d648aec9c9f12e0efaafa0d6ae5653eeb34459516699e562eeb9c8d23b3c. 
Dec 15 03:54:21 localhost podman[99994]: 2025-12-15 08:54:21.749449244 +0000 UTC m=+0.083093100 container health_status 6278d648aec9c9f12e0efaafa0d6ae5653eeb34459516699e562eeb9c8d23b3c (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-18T22:49:46Z, com.redhat.component=openstack-qdrouterd-container, distribution-scope=public, release=1761123044, container_name=metrics_qdr, io.openshift.expose-services=, config_id=tripleo_step1, architecture=x86_64, vcs-type=git, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '29d07da103bbd44a9ed3e29999314b03'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, batch=17.1_20251118.1, name=rhosp17/openstack-qdrouterd, url=https://www.redhat.com, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 17.1_20251118.1, 
managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, version=17.1.12, summary=Red Hat OpenStack Platform 17.1 qdrouterd, tcib_managed=true, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, io.buildah.version=1.41.4, maintainer=OpenStack TripleO Team) Dec 15 03:54:21 localhost podman[99994]: 2025-12-15 08:54:21.997587554 +0000 UTC m=+0.331231360 container exec_died 6278d648aec9c9f12e0efaafa0d6ae5653eeb34459516699e562eeb9c8d23b3c (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, name=rhosp17/openstack-qdrouterd, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.component=openstack-qdrouterd-container, distribution-scope=public, container_name=metrics_qdr, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, summary=Red Hat OpenStack Platform 17.1 qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '29d07da103bbd44a9ed3e29999314b03'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, release=1761123044, build-date=2025-11-18T22:49:46Z, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, url=https://www.redhat.com, config_id=tripleo_step1, managed_by=tripleo_ansible, io.buildah.version=1.41.4, vcs-type=git, architecture=x86_64, vendor=Red Hat, Inc., batch=17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 qdrouterd) Dec 15 03:54:22 localhost systemd[1]: 6278d648aec9c9f12e0efaafa0d6ae5653eeb34459516699e562eeb9c8d23b3c.service: Deactivated successfully. Dec 15 03:54:36 localhost systemd[1]: Started /usr/bin/podman healthcheck run 165a6fd1f2b629d16da0798ec1c9bac41d302d530a0ffa668b2315a52d6aa04e. Dec 15 03:54:36 localhost systemd[1]: Started /usr/bin/podman healthcheck run 2649c3aec9af2f04509e1d248b1050b202fccc3ed2b86421c89a2a65376db057. Dec 15 03:54:36 localhost systemd[1]: Started /usr/bin/podman healthcheck run 36a88083ed613f6871d2631ae462bca308a45ba583333f7cfe4d0b0c40a18ba5. Dec 15 03:54:36 localhost systemd[1]: Started /usr/bin/podman healthcheck run 97115ff7c5a84ae7e17f79e696e94da3c63f9fe0051287fe9193bec282a03f99. 
Dec 15 03:54:36 localhost systemd[1]: Started /usr/bin/podman healthcheck run ac45612fab357b615b4684dbfa5bd000dd29eb1adfbaedb6db0802fc49e89549. Dec 15 03:54:36 localhost systemd[1]: Started /usr/bin/podman healthcheck run d6041e5471624a495d5be42684f6049ccf71802a80d5760239330151d465d146. Dec 15 03:54:36 localhost podman[100024]: 2025-12-15 08:54:36.756915057 +0000 UTC m=+0.078846867 container health_status 2649c3aec9af2f04509e1d248b1050b202fccc3ed2b86421c89a2a65376db057 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, description=Red Hat OpenStack Platform 17.1 iscsid, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, konflux.additional-tags=17.1.12 17.1_20251118.1, architecture=x86_64, version=17.1.12, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step3, name=rhosp17/openstack-iscsid, vcs-type=git, io.openshift.expose-services=, build-date=2025-11-18T23:44:13Z, distribution-scope=public, url=https://www.redhat.com, container_name=iscsid, managed_by=tripleo_ansible, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, maintainer=OpenStack TripleO Team, release=1761123044, batch=17.1_20251118.1, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vendor=Red Hat, Inc., config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '182e509007ab5e6e5b2500a552cbd5ba'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 
'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, com.redhat.component=openstack-iscsid-container, tcib_managed=true) Dec 15 03:54:36 localhost podman[100024]: 2025-12-15 08:54:36.76938893 +0000 UTC m=+0.091320710 container exec_died 2649c3aec9af2f04509e1d248b1050b202fccc3ed2b86421c89a2a65376db057 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step3, name=rhosp17/openstack-iscsid, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, vendor=Red Hat, Inc., build-date=2025-11-18T23:44:13Z, com.redhat.component=openstack-iscsid-container, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-type=git, container_name=iscsid, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, maintainer=OpenStack TripleO Team, batch=17.1_20251118.1, distribution-scope=public, io.buildah.version=1.41.4, 
release=1761123044, version=17.1.12, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, architecture=x86_64, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '182e509007ab5e6e5b2500a552cbd5ba'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, description=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.expose-services=, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05) Dec 15 03:54:36 localhost systemd[1]: 2649c3aec9af2f04509e1d248b1050b202fccc3ed2b86421c89a2a65376db057.service: Deactivated successfully. Dec 15 03:54:36 localhost systemd[1]: tmp-crun.2kzSmc.mount: Deactivated successfully. 
Dec 15 03:54:36 localhost podman[100023]: 2025-12-15 08:54:36.825200161 +0000 UTC m=+0.152073883 container health_status 165a6fd1f2b629d16da0798ec1c9bac41d302d530a0ffa668b2315a52d6aa04e (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, build-date=2025-11-18T22:51:28Z, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 collectd, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, tcib_managed=true, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, version=17.1.12, summary=Red Hat 
OpenStack Platform 17.1 collectd, com.redhat.component=openstack-collectd-container, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, config_id=tripleo_step3, io.buildah.version=1.41.4, vendor=Red Hat, Inc., konflux.additional-tags=17.1.12 17.1_20251118.1, container_name=collectd, vcs-type=git, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-collectd, distribution-scope=public, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, architecture=x86_64, release=1761123044, io.openshift.expose-services=, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, batch=17.1_20251118.1, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI) Dec 15 03:54:36 localhost podman[100023]: 2025-12-15 08:54:36.838400174 +0000 UTC m=+0.165273936 container exec_died 165a6fd1f2b629d16da0798ec1c9bac41d302d530a0ffa668b2315a52d6aa04e (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-type=git, vendor=Red Hat, Inc., container_name=collectd, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 collectd, tcib_managed=true, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, managed_by=tripleo_ansible, architecture=x86_64, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, release=1761123044, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 
'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, config_id=tripleo_step3, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, batch=17.1_20251118.1, name=rhosp17/openstack-collectd, version=17.1.12, io.buildah.version=1.41.4, build-date=2025-11-18T22:51:28Z, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, summary=Red Hat OpenStack Platform 17.1 collectd, distribution-scope=public, com.redhat.component=openstack-collectd-container) Dec 15 03:54:36 localhost podman[100025]: 2025-12-15 08:54:36.883021247 +0000 UTC m=+0.205629406 container health_status 36a88083ed613f6871d2631ae462bca308a45ba583333f7cfe4d0b0c40a18ba5 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, 
config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '182e509007ab5e6e5b2500a552cbd5ba-879500e96bf8dfb93687004bd86f2317'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, name=rhosp17/openstack-nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 
openstack-nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, batch=17.1_20251118.1, distribution-scope=public, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.12 17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, container_name=nova_compute, io.buildah.version=1.41.4, release=1761123044, managed_by=tripleo_ansible, vcs-type=git, version=17.1.12, config_id=tripleo_step5, architecture=x86_64, build-date=2025-11-19T00:36:58Z, vendor=Red Hat, Inc., io.openshift.expose-services=, url=https://www.redhat.com, com.redhat.component=openstack-nova-compute-container, summary=Red Hat OpenStack Platform 17.1 nova-compute) Dec 15 03:54:36 localhost podman[100034]: 2025-12-15 08:54:36.926518098 +0000 UTC m=+0.232803371 container health_status ac45612fab357b615b4684dbfa5bd000dd29eb1adfbaedb6db0802fc49e89549 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, build-date=2025-11-18T22:49:32Z, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, architecture=x86_64, url=https://www.redhat.com, vcs-type=git, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, release=1761123044, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, summary=Red Hat OpenStack Platform 17.1 cron, container_name=logrotate_crond, managed_by=tripleo_ansible, 
version=17.1.12, batch=17.1_20251118.1, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, name=rhosp17/openstack-cron, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, description=Red Hat OpenStack Platform 17.1 cron, com.redhat.component=openstack-cron-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, io.buildah.version=1.41.4, maintainer=OpenStack TripleO Team, config_id=tripleo_step4, distribution-scope=public) Dec 15 03:54:36 localhost podman[100025]: 2025-12-15 08:54:36.937297766 +0000 UTC m=+0.259905885 container exec_died 36a88083ed613f6871d2631ae462bca308a45ba583333f7cfe4d0b0c40a18ba5 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, vcs-type=git, architecture=x86_64, build-date=2025-11-19T00:36:58Z, version=17.1.12, batch=17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, 
io.openshift.expose-services=, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, container_name=nova_compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.12 17.1_20251118.1, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, distribution-scope=public, vendor=Red Hat, Inc., managed_by=tripleo_ansible, release=1761123044, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_id=tripleo_step5, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '182e509007ab5e6e5b2500a552cbd5ba-879500e96bf8dfb93687004bd86f2317'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', 
'/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, url=https://www.redhat.com, com.redhat.component=openstack-nova-compute-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, name=rhosp17/openstack-nova-compute, io.buildah.version=1.41.4) Dec 15 03:54:36 localhost systemd[1]: 36a88083ed613f6871d2631ae462bca308a45ba583333f7cfe4d0b0c40a18ba5.service: Deactivated successfully. Dec 15 03:54:36 localhost systemd[1]: 165a6fd1f2b629d16da0798ec1c9bac41d302d530a0ffa668b2315a52d6aa04e.service: Deactivated successfully. 
Dec 15 03:54:36 localhost podman[100030]: 2025-12-15 08:54:36.97559542 +0000 UTC m=+0.281928444 container health_status 97115ff7c5a84ae7e17f79e696e94da3c63f9fe0051287fe9193bec282a03f99 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, batch=17.1_20251118.1, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.expose-services=, vendor=Red Hat, Inc., com.redhat.component=openstack-ceilometer-ipmi-container, name=rhosp17/openstack-ceilometer-ipmi, build-date=2025-11-19T00:12:45Z, tcib_managed=true, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'fcee5a4a91f85471fca7b61211375646'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, release=1761123044, container_name=ceilometer_agent_ipmi, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, distribution-scope=public, vcs-type=git, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, version=17.1.12, config_id=tripleo_step4, io.buildah.version=1.41.4) Dec 15 03:54:36 localhost podman[100034]: 2025-12-15 08:54:36.988119804 +0000 UTC m=+0.294405067 container exec_died ac45612fab357b615b4684dbfa5bd000dd29eb1adfbaedb6db0802fc49e89549 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, com.redhat.component=openstack-cron-container, name=rhosp17/openstack-cron, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 cron, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 cron, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, distribution-scope=public, build-date=2025-11-18T22:49:32Z, batch=17.1_20251118.1, version=17.1.12, container_name=logrotate_crond, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step4, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vcs-type=git, io.buildah.version=1.41.4, managed_by=tripleo_ansible, vendor=Red Hat, Inc., config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 
'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, release=1761123044, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, io.openshift.expose-services=) Dec 15 03:54:36 localhost systemd[1]: ac45612fab357b615b4684dbfa5bd000dd29eb1adfbaedb6db0802fc49e89549.service: Deactivated successfully. 
Dec 15 03:54:37 localhost podman[100030]: 2025-12-15 08:54:37.060588831 +0000 UTC m=+0.366921835 container exec_died 97115ff7c5a84ae7e17f79e696e94da3c63f9fe0051287fe9193bec282a03f99 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-type=git, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'fcee5a4a91f85471fca7b61211375646'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, url=https://www.redhat.com, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, version=17.1.12, container_name=ceilometer_agent_ipmi, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, name=rhosp17/openstack-ceilometer-ipmi, io.buildah.version=1.41.4, tcib_managed=true, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, release=1761123044, summary=Red Hat OpenStack 
Platform 17.1 ceilometer-ipmi, com.redhat.component=openstack-ceilometer-ipmi-container, config_id=tripleo_step4, batch=17.1_20251118.1, io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-11-19T00:12:45Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, managed_by=tripleo_ansible) Dec 15 03:54:37 localhost systemd[1]: 97115ff7c5a84ae7e17f79e696e94da3c63f9fe0051287fe9193bec282a03f99.service: Deactivated successfully. Dec 15 03:54:37 localhost podman[100043]: 2025-12-15 08:54:37.078498029 +0000 UTC m=+0.384088733 container health_status d6041e5471624a495d5be42684f6049ccf71802a80d5760239330151d465d146 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-type=git, config_id=tripleo_step4, version=17.1.12, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'fcee5a4a91f85471fca7b61211375646'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, vendor=Red Hat, Inc., name=rhosp17/openstack-ceilometer-compute, container_name=ceilometer_agent_compute, managed_by=tripleo_ansible, io.openshift.expose-services=, architecture=x86_64, release=1761123044, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, build-date=2025-11-19T00:11:48Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-ceilometer-compute-container, distribution-scope=public, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.buildah.version=1.41.4, tcib_managed=true, batch=17.1_20251118.1, konflux.additional-tags=17.1.12 17.1_20251118.1, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676) Dec 15 03:54:37 localhost podman[100043]: 2025-12-15 08:54:37.136476118 +0000 UTC m=+0.442066842 container exec_died d6041e5471624a495d5be42684f6049ccf71802a80d5760239330151d465d146 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, batch=17.1_20251118.1, release=1761123044, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, konflux.additional-tags=17.1.12 
17.1_20251118.1, name=rhosp17/openstack-ceilometer-compute, tcib_managed=true, url=https://www.redhat.com, managed_by=tripleo_ansible, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, architecture=x86_64, com.redhat.component=openstack-ceilometer-compute-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, version=17.1.12, vcs-type=git, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, config_id=tripleo_step4, io.buildah.version=1.41.4, build-date=2025-11-19T00:11:48Z, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=ceilometer_agent_compute, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, distribution-scope=public, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'fcee5a4a91f85471fca7b61211375646'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}) Dec 15 03:54:37 localhost systemd[1]: d6041e5471624a495d5be42684f6049ccf71802a80d5760239330151d465d146.service: Deactivated successfully. Dec 15 03:54:37 localhost systemd[1]: tmp-crun.aUHcQN.mount: Deactivated successfully. Dec 15 03:54:39 localhost systemd[1]: Started /usr/bin/podman healthcheck run 4c3f871f4bdf287b1d3ce717956c1cf130617489217410eadf6fffcc440eb3b0. Dec 15 03:54:39 localhost podman[100160]: 2025-12-15 08:54:39.746189874 +0000 UTC m=+0.080827071 container health_status 4c3f871f4bdf287b1d3ce717956c1cf130617489217410eadf6fffcc440eb3b0 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, url=https://www.redhat.com, com.redhat.component=openstack-nova-compute-container, maintainer=OpenStack TripleO Team, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, konflux.additional-tags=17.1.12 17.1_20251118.1, build-date=2025-11-19T00:36:58Z, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 nova-compute, release=1761123044, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '879500e96bf8dfb93687004bd86f2317'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, io.openshift.expose-services=, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20251118.1, container_name=nova_migration_target, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.12, distribution-scope=public, tcib_managed=true, name=rhosp17/openstack-nova-compute, config_id=tripleo_step4, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 nova-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, architecture=x86_64, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d) Dec 15 03:54:40 localhost podman[100160]: 2025-12-15 08:54:40.138396312 +0000 UTC m=+0.473033539 container exec_died 4c3f871f4bdf287b1d3ce717956c1cf130617489217410eadf6fffcc440eb3b0 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, container_name=nova_migration_target, tcib_managed=true, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team, 
vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, com.redhat.component=openstack-nova-compute-container, release=1761123044, vcs-type=git, managed_by=tripleo_ansible, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, version=17.1.12, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, distribution-scope=public, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '879500e96bf8dfb93687004bd86f2317'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, io.openshift.expose-services=, build-date=2025-11-19T00:36:58Z, config_id=tripleo_step4, name=rhosp17/openstack-nova-compute, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64) Dec 15 03:54:40 localhost systemd[1]: 4c3f871f4bdf287b1d3ce717956c1cf130617489217410eadf6fffcc440eb3b0.service: Deactivated successfully. 
Dec 15 03:54:43 localhost systemd[1]: Started /usr/bin/podman healthcheck run 2df8d20c6ba32f38bd0e6519c9c77a4146b632f21c81197d8c319b223aad81f1. Dec 15 03:54:43 localhost systemd[1]: Started /usr/bin/podman healthcheck run 4d35b7dc3f3a4e3c60661e85d3511f50804e87244d74cab67af5c6e8c447d379. Dec 15 03:54:43 localhost podman[100185]: 2025-12-15 08:54:43.756026337 +0000 UTC m=+0.080524273 container health_status 2df8d20c6ba32f38bd0e6519c9c77a4146b632f21c81197d8c319b223aad81f1 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=unhealthy, konflux.additional-tags=17.1.12 17.1_20251118.1, build-date=2025-11-18T23:34:05Z, batch=17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, description=Red Hat OpenStack Platform 17.1 ovn-controller, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.expose-services=, distribution-scope=public, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, vendor=Red Hat, Inc., vcs-type=git, io.buildah.version=1.41.4, container_name=ovn_controller, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, summary=Red Hat OpenStack Platform 17.1 ovn-controller, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, release=1761123044, tcib_managed=true, url=https://www.redhat.com, version=17.1.12, name=rhosp17/openstack-ovn-controller, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_id=tripleo_step4, architecture=x86_64, com.redhat.component=openstack-ovn-controller-container, maintainer=OpenStack TripleO Team) Dec 15 03:54:43 localhost podman[100186]: 2025-12-15 08:54:43.806592208 +0000 UTC m=+0.126639505 container health_status 4d35b7dc3f3a4e3c60661e85d3511f50804e87244d74cab67af5c6e8c447d379 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=unhealthy, build-date=2025-11-19T00:14:25Z, io.buildah.version=1.41.4, version=17.1.12, name=rhosp17/openstack-neutron-metadata-agent-ovn, vcs-type=git, release=1761123044, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, tcib_managed=true, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'a56a6f14b467cd9064e40c03defa5ed7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, distribution-scope=public, config_id=tripleo_step4, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, architecture=x86_64, batch=17.1_20251118.1, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, container_name=ovn_metadata_agent, url=https://www.redhat.com, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c) Dec 15 03:54:43 localhost podman[100185]: 2025-12-15 08:54:43.826627273 +0000 UTC m=+0.151125209 container exec_died 2df8d20c6ba32f38bd0e6519c9c77a4146b632f21c81197d8c319b223aad81f1 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, config_id=tripleo_step4, architecture=x86_64, io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 
openstack-ovn-controller, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., vcs-type=git, build-date=2025-11-18T23:34:05Z, distribution-scope=public, konflux.additional-tags=17.1.12 17.1_20251118.1, version=17.1.12, release=1761123044, url=https://www.redhat.com, com.redhat.component=openstack-ovn-controller-container, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, name=rhosp17/openstack-ovn-controller, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 ovn-controller, container_name=ovn_controller, description=Red Hat OpenStack Platform 17.1 ovn-controller, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, tcib_managed=true, io.openshift.expose-services=) Dec 15 03:54:43 localhost podman[100185]: unhealthy Dec 15 03:54:43 localhost systemd[1]: 2df8d20c6ba32f38bd0e6519c9c77a4146b632f21c81197d8c319b223aad81f1.service: Main process exited, code=exited, status=1/FAILURE Dec 15 
03:54:43 localhost systemd[1]: 2df8d20c6ba32f38bd0e6519c9c77a4146b632f21c81197d8c319b223aad81f1.service: Failed with result 'exit-code'. Dec 15 03:54:43 localhost podman[100186]: 2025-12-15 08:54:43.85083857 +0000 UTC m=+0.170885837 container exec_died 4d35b7dc3f3a4e3c60661e85d3511f50804e87244d74cab67af5c6e8c447d379 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, io.openshift.expose-services=, container_name=ovn_metadata_agent, io.buildah.version=1.41.4, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-type=git, batch=17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'a56a6f14b467cd9064e40c03defa5ed7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', 
'/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, release=1761123044, vendor=Red Hat, Inc., vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, tcib_managed=true, architecture=x86_64, konflux.additional-tags=17.1.12 17.1_20251118.1, distribution-scope=public, config_id=tripleo_step4, managed_by=tripleo_ansible, version=17.1.12, name=rhosp17/openstack-neutron-metadata-agent-ovn, build-date=2025-11-19T00:14:25Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, maintainer=OpenStack TripleO Team) Dec 15 03:54:43 localhost podman[100186]: unhealthy Dec 15 03:54:43 localhost systemd[1]: 4d35b7dc3f3a4e3c60661e85d3511f50804e87244d74cab67af5c6e8c447d379.service: Main process exited, code=exited, status=1/FAILURE Dec 15 03:54:43 localhost systemd[1]: 4d35b7dc3f3a4e3c60661e85d3511f50804e87244d74cab67af5c6e8c447d379.service: Failed with result 'exit-code'. Dec 15 03:54:52 localhost systemd[1]: Started /usr/bin/podman healthcheck run 6278d648aec9c9f12e0efaafa0d6ae5653eeb34459516699e562eeb9c8d23b3c. Dec 15 03:54:52 localhost systemd[1]: Starting Check and recover tripleo_nova_virtqemud... Dec 15 03:54:52 localhost recover_tripleo_nova_virtqemud[100230]: 61849 Dec 15 03:54:52 localhost systemd[1]: tripleo_nova_virtqemud_recover.service: Deactivated successfully. 
Dec 15 03:54:52 localhost systemd[1]: Finished Check and recover tripleo_nova_virtqemud. Dec 15 03:54:52 localhost systemd[1]: tmp-crun.qeiD0W.mount: Deactivated successfully. Dec 15 03:54:52 localhost podman[100225]: 2025-12-15 08:54:52.762111918 +0000 UTC m=+0.093895091 container health_status 6278d648aec9c9f12e0efaafa0d6ae5653eeb34459516699e562eeb9c8d23b3c (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, build-date=2025-11-18T22:49:46Z, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, description=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.component=openstack-qdrouterd-container, tcib_managed=true, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, managed_by=tripleo_ansible, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, url=https://www.redhat.com, io.openshift.expose-services=, release=1761123044, version=17.1.12, konflux.additional-tags=17.1.12 17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 qdrouterd, vendor=Red Hat, Inc., name=rhosp17/openstack-qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '29d07da103bbd44a9ed3e29999314b03'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, container_name=metrics_qdr, maintainer=OpenStack TripleO Team, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, distribution-scope=public, config_id=tripleo_step1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, io.buildah.version=1.41.4, batch=17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream) Dec 15 03:54:52 localhost podman[100225]: 2025-12-15 08:54:52.981410426 +0000 UTC m=+0.313193549 container exec_died 6278d648aec9c9f12e0efaafa0d6ae5653eeb34459516699e562eeb9c8d23b3c (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=metrics_qdr, vcs-type=git, io.buildah.version=1.41.4, description=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, url=https://www.redhat.com, batch=17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.component=openstack-qdrouterd-container, konflux.additional-tags=17.1.12 17.1_20251118.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '29d07da103bbd44a9ed3e29999314b03'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 
'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_id=tripleo_step1, name=rhosp17/openstack-qdrouterd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.expose-services=, release=1761123044, distribution-scope=public, managed_by=tripleo_ansible, version=17.1.12, maintainer=OpenStack TripleO Team, tcib_managed=true, architecture=x86_64, vendor=Red Hat, Inc., build-date=2025-11-18T22:49:46Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd) Dec 15 03:54:52 localhost systemd[1]: 6278d648aec9c9f12e0efaafa0d6ae5653eeb34459516699e562eeb9c8d23b3c.service: Deactivated successfully. 
Dec 15 03:54:56 localhost sshd[100256]: main: sshd: ssh-rsa algorithm is disabled Dec 15 03:55:05 localhost ceph-osd[31375]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS ------- Dec 15 03:55:05 localhost ceph-osd[31375]: rocksdb: [db/db_impl/db_impl.cc:1111] #012** DB Stats **#012Uptime(secs): 4200.1 total, 600.0 interval#012Cumulative writes: 4815 writes, 21K keys, 4815 commit groups, 1.0 writes per commit group, ingest: 0.02 GB, 0.00 MB/s#012Cumulative WAL: 4815 writes, 628 syncs, 7.67 writes per sync, written: 0.02 GB, 0.00 MB/s#012Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent#012Interval writes: 0 writes, 0 keys, 0 commit groups, 0.0 writes per commit group, ingest: 0.00 MB, 0.00 MB/s#012Interval WAL: 0 writes, 0 syncs, 0.00 writes per sync, written: 0.00 GB, 0.00 MB/s#012Interval stall: 00:00:0.000 H:M:S, 0.0 percent Dec 15 03:55:07 localhost systemd[1]: Started /usr/bin/podman healthcheck run 165a6fd1f2b629d16da0798ec1c9bac41d302d530a0ffa668b2315a52d6aa04e. Dec 15 03:55:07 localhost systemd[1]: Started /usr/bin/podman healthcheck run 2649c3aec9af2f04509e1d248b1050b202fccc3ed2b86421c89a2a65376db057. Dec 15 03:55:07 localhost systemd[1]: Started /usr/bin/podman healthcheck run 36a88083ed613f6871d2631ae462bca308a45ba583333f7cfe4d0b0c40a18ba5. Dec 15 03:55:07 localhost systemd[1]: Started /usr/bin/podman healthcheck run 97115ff7c5a84ae7e17f79e696e94da3c63f9fe0051287fe9193bec282a03f99. Dec 15 03:55:07 localhost systemd[1]: Started /usr/bin/podman healthcheck run ac45612fab357b615b4684dbfa5bd000dd29eb1adfbaedb6db0802fc49e89549. Dec 15 03:55:07 localhost systemd[1]: Started /usr/bin/podman healthcheck run d6041e5471624a495d5be42684f6049ccf71802a80d5760239330151d465d146. 
Dec 15 03:55:07 localhost podman[100259]: 2025-12-15 08:55:07.783732856 +0000 UTC m=+0.104472032 container health_status 2649c3aec9af2f04509e1d248b1050b202fccc3ed2b86421c89a2a65376db057 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, summary=Red Hat OpenStack Platform 17.1 iscsid, version=17.1.12, konflux.additional-tags=17.1.12 17.1_20251118.1, url=https://www.redhat.com, io.openshift.expose-services=, com.redhat.component=openstack-iscsid-container, config_id=tripleo_step3, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, container_name=iscsid, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, maintainer=OpenStack TripleO Team, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 iscsid, io.buildah.version=1.41.4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1761123044, build-date=2025-11-18T23:44:13Z, distribution-scope=public, managed_by=tripleo_ansible, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, batch=17.1_20251118.1, vcs-type=git, name=rhosp17/openstack-iscsid, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '182e509007ab5e6e5b2500a552cbd5ba'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, vendor=Red Hat, Inc.) Dec 15 03:55:07 localhost podman[100259]: 2025-12-15 08:55:07.790674142 +0000 UTC m=+0.111413338 container exec_died 2649c3aec9af2f04509e1d248b1050b202fccc3ed2b86421c89a2a65376db057 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, version=17.1.12, managed_by=tripleo_ansible, vcs-type=git, config_id=tripleo_step3, maintainer=OpenStack TripleO Team, release=1761123044, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 iscsid, batch=17.1_20251118.1, vendor=Red Hat, Inc., io.openshift.expose-services=, io.buildah.version=1.41.4, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '182e509007ab5e6e5b2500a552cbd5ba'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', 
'/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=iscsid, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, description=Red Hat OpenStack Platform 17.1 iscsid, architecture=x86_64, url=https://www.redhat.com, name=rhosp17/openstack-iscsid, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.component=openstack-iscsid-container, build-date=2025-11-18T23:44:13Z, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream) Dec 15 03:55:07 localhost systemd[1]: 2649c3aec9af2f04509e1d248b1050b202fccc3ed2b86421c89a2a65376db057.service: Deactivated successfully. Dec 15 03:55:07 localhost systemd[1]: tmp-crun.1jXE8m.mount: Deactivated successfully. 
Dec 15 03:55:07 localhost podman[100258]: 2025-12-15 08:55:07.832204191 +0000 UTC m=+0.152686660 container health_status 165a6fd1f2b629d16da0798ec1c9bac41d302d530a0ffa668b2315a52d6aa04e (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 collectd, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, config_id=tripleo_step3, release=1761123044, io.buildah.version=1.41.4, name=rhosp17/openstack-collectd, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 collectd, version=17.1.12, com.redhat.component=openstack-collectd-container, distribution-scope=public, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', 
'/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, url=https://www.redhat.com, build-date=2025-11-18T22:51:28Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, container_name=collectd, architecture=x86_64, managed_by=tripleo_ansible, batch=17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd) Dec 15 03:55:07 localhost podman[100258]: 2025-12-15 08:55:07.84638392 +0000 UTC m=+0.166866359 container exec_died 165a6fd1f2b629d16da0798ec1c9bac41d302d530a0ffa668b2315a52d6aa04e (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, version=17.1.12, com.redhat.component=openstack-collectd-container, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, maintainer=OpenStack TripleO Team, config_id=tripleo_step3, release=1761123044, summary=Red Hat OpenStack Platform 17.1 collectd, architecture=x86_64, distribution-scope=public, io.buildah.version=1.41.4, name=rhosp17/openstack-collectd, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=collectd, description=Red Hat OpenStack Platform 17.1 collectd, managed_by=tripleo_ansible, tcib_managed=true, build-date=2025-11-18T22:51:28Z, batch=17.1_20251118.1, vendor=Red Hat, Inc., config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}) Dec 15 03:55:07 localhost systemd[1]: 165a6fd1f2b629d16da0798ec1c9bac41d302d530a0ffa668b2315a52d6aa04e.service: Deactivated successfully. 
Dec 15 03:55:07 localhost podman[100261]: 2025-12-15 08:55:07.929204103 +0000 UTC m=+0.245542222 container health_status 97115ff7c5a84ae7e17f79e696e94da3c63f9fe0051287fe9193bec282a03f99 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, version=17.1.12, io.buildah.version=1.41.4, url=https://www.redhat.com, distribution-scope=public, release=1761123044, tcib_managed=true, config_id=tripleo_step4, name=rhosp17/openstack-ceilometer-ipmi, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, batch=17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.component=openstack-ceilometer-ipmi-container, container_name=ceilometer_agent_ipmi, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, konflux.additional-tags=17.1.12 17.1_20251118.1, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.expose-services=, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, architecture=x86_64, vcs-type=git, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'fcee5a4a91f85471fca7b61211375646'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, build-date=2025-11-19T00:12:45Z) Dec 15 03:55:07 localhost podman[100261]: 2025-12-15 08:55:07.985287771 +0000 UTC m=+0.301625890 container exec_died 97115ff7c5a84ae7e17f79e696e94da3c63f9fe0051287fe9193bec282a03f99 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, maintainer=OpenStack TripleO Team, distribution-scope=public, build-date=2025-11-19T00:12:45Z, tcib_managed=true, konflux.additional-tags=17.1.12 17.1_20251118.1, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, com.redhat.component=openstack-ceilometer-ipmi-container, release=1761123044, name=rhosp17/openstack-ceilometer-ipmi, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, url=https://www.redhat.com, container_name=ceilometer_agent_ipmi, config_id=tripleo_step4, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.buildah.version=1.41.4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'fcee5a4a91f85471fca7b61211375646'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 
'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, version=17.1.12, io.openshift.expose-services=, batch=17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI) Dec 15 03:55:07 localhost systemd[1]: 97115ff7c5a84ae7e17f79e696e94da3c63f9fe0051287fe9193bec282a03f99.service: Deactivated successfully. 
Dec 15 03:55:08 localhost podman[100278]: 2025-12-15 08:55:08.0331648 +0000 UTC m=+0.343021405 container health_status d6041e5471624a495d5be42684f6049ccf71802a80d5760239330151d465d146 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, version=17.1.12, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, container_name=ceilometer_agent_compute, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, build-date=2025-11-19T00:11:48Z, vcs-type=git, batch=17.1_20251118.1, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, name=rhosp17/openstack-ceilometer-compute, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.buildah.version=1.41.4, architecture=x86_64, release=1761123044, tcib_managed=true, vendor=Red Hat, Inc., io.openshift.expose-services=, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'fcee5a4a91f85471fca7b61211375646'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.component=openstack-ceilometer-compute-container) Dec 15 03:55:08 localhost podman[100260]: 2025-12-15 08:55:07.990024538 +0000 UTC m=+0.310646711 container health_status 36a88083ed613f6871d2631ae462bca308a45ba583333f7cfe4d0b0c40a18ba5 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, url=https://www.redhat.com, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, name=rhosp17/openstack-nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, summary=Red Hat OpenStack Platform 17.1 nova-compute, config_id=tripleo_step5, vendor=Red Hat, Inc., architecture=x86_64, version=17.1.12, distribution-scope=public, release=1761123044, batch=17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, com.redhat.component=openstack-nova-compute-container, 
io.openshift.expose-services=, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '182e509007ab5e6e5b2500a552cbd5ba-879500e96bf8dfb93687004bd86f2317'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, build-date=2025-11-19T00:36:58Z, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, maintainer=OpenStack TripleO Team, container_name=nova_compute, description=Red Hat OpenStack 
Platform 17.1 nova-compute, vcs-type=git) Dec 15 03:55:08 localhost podman[100260]: 2025-12-15 08:55:08.073734384 +0000 UTC m=+0.394356547 container exec_died 36a88083ed613f6871d2631ae462bca308a45ba583333f7cfe4d0b0c40a18ba5 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, name=rhosp17/openstack-nova-compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, release=1761123044, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, build-date=2025-11-19T00:36:58Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.41.4, tcib_managed=true, vendor=Red Hat, Inc., vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.component=openstack-nova-compute-container, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '182e509007ab5e6e5b2500a552cbd5ba-879500e96bf8dfb93687004bd86f2317'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', 
'/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, architecture=x86_64, maintainer=OpenStack TripleO Team, distribution-scope=public, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, batch=17.1_20251118.1, konflux.additional-tags=17.1.12 17.1_20251118.1, version=17.1.12, container_name=nova_compute, description=Red Hat OpenStack Platform 17.1 nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, config_id=tripleo_step5, url=https://www.redhat.com) Dec 15 03:55:08 localhost systemd[1]: 36a88083ed613f6871d2631ae462bca308a45ba583333f7cfe4d0b0c40a18ba5.service: Deactivated successfully. 
Dec 15 03:55:08 localhost podman[100278]: 2025-12-15 08:55:08.08739923 +0000 UTC m=+0.397255815 container exec_died d6041e5471624a495d5be42684f6049ccf71802a80d5760239330151d465d146 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, com.redhat.component=openstack-ceilometer-compute-container, container_name=ceilometer_agent_compute, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.12, io.buildah.version=1.41.4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, url=https://www.redhat.com, batch=17.1_20251118.1, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, name=rhosp17/openstack-ceilometer-compute, io.openshift.expose-services=, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, vendor=Red Hat, Inc., vcs-type=git, release=1761123044, distribution-scope=public, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'fcee5a4a91f85471fca7b61211375646'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, config_id=tripleo_step4, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, maintainer=OpenStack TripleO Team, build-date=2025-11-19T00:11:48Z) Dec 15 03:55:08 localhost systemd[1]: d6041e5471624a495d5be42684f6049ccf71802a80d5760239330151d465d146.service: Deactivated successfully. Dec 15 03:55:08 localhost podman[100262]: 2025-12-15 08:55:08.089223788 +0000 UTC m=+0.403740538 container health_status ac45612fab357b615b4684dbfa5bd000dd29eb1adfbaedb6db0802fc49e89549 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, release=1761123044, distribution-scope=public, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, name=rhosp17/openstack-cron, build-date=2025-11-18T22:49:32Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, container_name=logrotate_crond, url=https://www.redhat.com, vendor=Red Hat, Inc., batch=17.1_20251118.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, managed_by=tripleo_ansible, config_id=tripleo_step4, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.component=openstack-cron-container, summary=Red Hat OpenStack Platform 17.1 cron, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, description=Red Hat OpenStack Platform 17.1 cron, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, konflux.additional-tags=17.1.12 17.1_20251118.1, tcib_managed=true, io.openshift.expose-services=, io.buildah.version=1.41.4, architecture=x86_64, version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI) Dec 15 03:55:08 localhost podman[100262]: 2025-12-15 08:55:08.173386346 +0000 UTC m=+0.487903076 container exec_died ac45612fab357b615b4684dbfa5bd000dd29eb1adfbaedb6db0802fc49e89549 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, release=1761123044, version=17.1.12, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.expose-services=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': 
'/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, com.redhat.component=openstack-cron-container, description=Red Hat OpenStack Platform 17.1 cron, io.buildah.version=1.41.4, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, container_name=logrotate_crond, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, konflux.additional-tags=17.1.12 17.1_20251118.1, batch=17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 cron, vendor=Red Hat, Inc., managed_by=tripleo_ansible, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, maintainer=OpenStack TripleO Team, config_id=tripleo_step4, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, vcs-type=git, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, build-date=2025-11-18T22:49:32Z, name=rhosp17/openstack-cron) Dec 15 03:55:08 
localhost systemd[1]: ac45612fab357b615b4684dbfa5bd000dd29eb1adfbaedb6db0802fc49e89549.service: Deactivated successfully. Dec 15 03:55:08 localhost systemd[1]: tmp-crun.lBdZm2.mount: Deactivated successfully. Dec 15 03:55:10 localhost ceph-osd[32311]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS ------- Dec 15 03:55:10 localhost ceph-osd[32311]: rocksdb: [db/db_impl/db_impl.cc:1111] #012** DB Stats **#012Uptime(secs): 4200.2 total, 600.0 interval#012Cumulative writes: 5745 writes, 25K keys, 5745 commit groups, 1.0 writes per commit group, ingest: 0.02 GB, 0.00 MB/s#012Cumulative WAL: 5745 writes, 763 syncs, 7.53 writes per sync, written: 0.02 GB, 0.00 MB/s#012Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent#012Interval writes: 0 writes, 0 keys, 0 commit groups, 0.0 writes per commit group, ingest: 0.00 MB, 0.00 MB/s#012Interval WAL: 0 writes, 0 syncs, 0.00 writes per sync, written: 0.00 GB, 0.00 MB/s#012Interval stall: 00:00:0.000 H:M:S, 0.0 percent Dec 15 03:55:10 localhost systemd[1]: Started /usr/bin/podman healthcheck run 4c3f871f4bdf287b1d3ce717956c1cf130617489217410eadf6fffcc440eb3b0. Dec 15 03:55:10 localhost systemd[1]: tmp-crun.Y5Lsbr.mount: Deactivated successfully. 
Dec 15 03:55:10 localhost podman[100470]: 2025-12-15 08:55:10.747402657 +0000 UTC m=+0.082526665 container health_status 4c3f871f4bdf287b1d3ce717956c1cf130617489217410eadf6fffcc440eb3b0 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, architecture=x86_64, vcs-type=git, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.component=openstack-nova-compute-container, container_name=nova_migration_target, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp17/openstack-nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20251118.1, build-date=2025-11-19T00:36:58Z, vendor=Red Hat, Inc., config_id=tripleo_step4, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '879500e96bf8dfb93687004bd86f2317'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, io.buildah.version=1.41.4, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, description=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, version=17.1.12, release=1761123044, konflux.additional-tags=17.1.12 17.1_20251118.1) Dec 15 03:55:11 localhost podman[100470]: 2025-12-15 08:55:11.132491216 +0000 UTC m=+0.467615264 container exec_died 4c3f871f4bdf287b1d3ce717956c1cf130617489217410eadf6fffcc440eb3b0 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, konflux.additional-tags=17.1.12 17.1_20251118.1, batch=17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, name=rhosp17/openstack-nova-compute, version=17.1.12, url=https://www.redhat.com, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.buildah.version=1.41.4, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, config_id=tripleo_step4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '879500e96bf8dfb93687004bd86f2317'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', 
'/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, release=1761123044, vendor=Red Hat, Inc., org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, architecture=x86_64, tcib_managed=true, build-date=2025-11-19T00:36:58Z, com.redhat.component=openstack-nova-compute-container, container_name=nova_migration_target, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05) Dec 15 03:55:11 localhost systemd[1]: 4c3f871f4bdf287b1d3ce717956c1cf130617489217410eadf6fffcc440eb3b0.service: Deactivated successfully. Dec 15 03:55:14 localhost systemd[1]: Started /usr/bin/podman healthcheck run 2df8d20c6ba32f38bd0e6519c9c77a4146b632f21c81197d8c319b223aad81f1. Dec 15 03:55:14 localhost systemd[1]: Started /usr/bin/podman healthcheck run 4d35b7dc3f3a4e3c60661e85d3511f50804e87244d74cab67af5c6e8c447d379. Dec 15 03:55:14 localhost systemd[1]: tmp-crun.MAwsg3.mount: Deactivated successfully. 
Dec 15 03:55:14 localhost podman[100493]: 2025-12-15 08:55:14.763398164 +0000 UTC m=+0.092313697 container health_status 2df8d20c6ba32f38bd0e6519c9c77a4146b632f21c81197d8c319b223aad81f1 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=unhealthy, io.openshift.expose-services=, version=17.1.12, distribution-scope=public, url=https://www.redhat.com, build-date=2025-11-18T23:34:05Z, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, batch=17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, name=rhosp17/openstack-ovn-controller, managed_by=tripleo_ansible, vcs-type=git, config_id=tripleo_step4, release=1761123044, summary=Red Hat OpenStack Platform 17.1 ovn-controller, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, description=Red Hat OpenStack Platform 17.1 ovn-controller, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.12 17.1_20251118.1, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, vendor=Red Hat, Inc., 
io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, architecture=x86_64, com.redhat.component=openstack-ovn-controller-container, container_name=ovn_controller, io.buildah.version=1.41.4) Dec 15 03:55:14 localhost podman[100493]: 2025-12-15 08:55:14.832347196 +0000 UTC m=+0.161262759 container exec_died 2df8d20c6ba32f38bd0e6519c9c77a4146b632f21c81197d8c319b223aad81f1 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, distribution-scope=public, version=17.1.12, release=1761123044, com.redhat.component=openstack-ovn-controller-container, vcs-type=git, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, batch=17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, config_id=tripleo_step4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, summary=Red Hat OpenStack Platform 17.1 ovn-controller, managed_by=tripleo_ansible, io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1, build-date=2025-11-18T23:34:05Z, tcib_managed=true, 
architecture=x86_64, name=rhosp17/openstack-ovn-controller, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 ovn-controller, container_name=ovn_controller) Dec 15 03:55:14 localhost podman[100493]: unhealthy Dec 15 03:55:14 localhost podman[100494]: 2025-12-15 08:55:14.8455993 +0000 UTC m=+0.170006003 container health_status 4d35b7dc3f3a4e3c60661e85d3511f50804e87244d74cab67af5c6e8c447d379 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=unhealthy, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.12 17.1_20251118.1, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'a56a6f14b467cd9064e40c03defa5ed7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', 
'/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, container_name=ovn_metadata_agent, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, architecture=x86_64, build-date=2025-11-19T00:14:25Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, distribution-scope=public, version=17.1.12, io.openshift.expose-services=, managed_by=tripleo_ansible, release=1761123044, io.buildah.version=1.41.4, config_id=tripleo_step4, name=rhosp17/openstack-neutron-metadata-agent-ovn, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, batch=17.1_20251118.1, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c) Dec 15 03:55:14 localhost systemd[1]: 2df8d20c6ba32f38bd0e6519c9c77a4146b632f21c81197d8c319b223aad81f1.service: Main process exited, code=exited, status=1/FAILURE Dec 15 03:55:14 localhost systemd[1]: 2df8d20c6ba32f38bd0e6519c9c77a4146b632f21c81197d8c319b223aad81f1.service: Failed with result 'exit-code'. 
Dec 15 03:55:14 localhost podman[100494]: 2025-12-15 08:55:14.862335228 +0000 UTC m=+0.186741931 container exec_died 4d35b7dc3f3a4e3c60661e85d3511f50804e87244d74cab67af5c6e8c447d379 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, container_name=ovn_metadata_agent, io.buildah.version=1.41.4, config_id=tripleo_step4, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, vcs-type=git, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, release=1761123044, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'a56a6f14b467cd9064e40c03defa5ed7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', 
'/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, managed_by=tripleo_ansible, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, architecture=x86_64, name=rhosp17/openstack-neutron-metadata-agent-ovn, maintainer=OpenStack TripleO Team, batch=17.1_20251118.1, build-date=2025-11-19T00:14:25Z, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, version=17.1.12, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, distribution-scope=public) Dec 15 03:55:14 localhost podman[100494]: unhealthy Dec 15 03:55:14 localhost systemd[1]: 4d35b7dc3f3a4e3c60661e85d3511f50804e87244d74cab67af5c6e8c447d379.service: Main process exited, code=exited, status=1/FAILURE Dec 15 03:55:14 localhost systemd[1]: 4d35b7dc3f3a4e3c60661e85d3511f50804e87244d74cab67af5c6e8c447d379.service: Failed with result 'exit-code'. Dec 15 03:55:23 localhost systemd[1]: Started /usr/bin/podman healthcheck run 6278d648aec9c9f12e0efaafa0d6ae5653eeb34459516699e562eeb9c8d23b3c. Dec 15 03:55:23 localhost systemd[1]: Starting Check and recover tripleo_nova_virtqemud... Dec 15 03:55:23 localhost recover_tripleo_nova_virtqemud[100536]: 61849 Dec 15 03:55:23 localhost systemd[1]: tripleo_nova_virtqemud_recover.service: Deactivated successfully. Dec 15 03:55:23 localhost systemd[1]: Finished Check and recover tripleo_nova_virtqemud. Dec 15 03:55:23 localhost systemd[1]: tmp-crun.85f25K.mount: Deactivated successfully. 
Dec 15 03:55:23 localhost podman[100533]: 2025-12-15 08:55:23.741835095 +0000 UTC m=+0.073309140 container health_status 6278d648aec9c9f12e0efaafa0d6ae5653eeb34459516699e562eeb9c8d23b3c (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, batch=17.1_20251118.1, container_name=metrics_qdr, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.buildah.version=1.41.4, name=rhosp17/openstack-qdrouterd, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, config_id=tripleo_step1, summary=Red Hat OpenStack Platform 17.1 qdrouterd, version=17.1.12, build-date=2025-11-18T22:49:46Z, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '29d07da103bbd44a9ed3e29999314b03'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, io.openshift.expose-services=, architecture=x86_64, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, vcs-type=git, description=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, release=1761123044, tcib_managed=true, url=https://www.redhat.com, com.redhat.component=openstack-qdrouterd-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, konflux.additional-tags=17.1.12 17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05) Dec 15 03:55:23 localhost podman[100533]: 2025-12-15 08:55:23.954527417 +0000 UTC m=+0.286001452 container exec_died 6278d648aec9c9f12e0efaafa0d6ae5653eeb34459516699e562eeb9c8d23b3c (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, io.openshift.expose-services=, name=rhosp17/openstack-qdrouterd, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, maintainer=OpenStack TripleO Team, build-date=2025-11-18T22:49:46Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.component=openstack-qdrouterd-container, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=metrics_qdr, config_id=tripleo_step1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, tcib_managed=true, batch=17.1_20251118.1, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.buildah.version=1.41.4, url=https://www.redhat.com, architecture=x86_64, konflux.additional-tags=17.1.12 17.1_20251118.1, version=17.1.12, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, summary=Red Hat OpenStack Platform 17.1 
qdrouterd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '29d07da103bbd44a9ed3e29999314b03'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, vcs-type=git, distribution-scope=public, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, release=1761123044) Dec 15 03:55:23 localhost systemd[1]: 6278d648aec9c9f12e0efaafa0d6ae5653eeb34459516699e562eeb9c8d23b3c.service: Deactivated successfully. Dec 15 03:55:33 localhost sshd[100565]: main: sshd: ssh-rsa algorithm is disabled Dec 15 03:55:38 localhost systemd[1]: Started /usr/bin/podman healthcheck run 165a6fd1f2b629d16da0798ec1c9bac41d302d530a0ffa668b2315a52d6aa04e. Dec 15 03:55:38 localhost systemd[1]: Started /usr/bin/podman healthcheck run 2649c3aec9af2f04509e1d248b1050b202fccc3ed2b86421c89a2a65376db057. Dec 15 03:55:38 localhost systemd[1]: Started /usr/bin/podman healthcheck run 36a88083ed613f6871d2631ae462bca308a45ba583333f7cfe4d0b0c40a18ba5. 
Dec 15 03:55:38 localhost systemd[1]: Started /usr/bin/podman healthcheck run 97115ff7c5a84ae7e17f79e696e94da3c63f9fe0051287fe9193bec282a03f99. Dec 15 03:55:38 localhost systemd[1]: Started /usr/bin/podman healthcheck run ac45612fab357b615b4684dbfa5bd000dd29eb1adfbaedb6db0802fc49e89549. Dec 15 03:55:38 localhost systemd[1]: Started /usr/bin/podman healthcheck run d6041e5471624a495d5be42684f6049ccf71802a80d5760239330151d465d146. Dec 15 03:55:38 localhost podman[100569]: 2025-12-15 08:55:38.765736985 +0000 UTC m=+0.088410803 container health_status 36a88083ed613f6871d2631ae462bca308a45ba583333f7cfe4d0b0c40a18ba5 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '182e509007ab5e6e5b2500a552cbd5ba-879500e96bf8dfb93687004bd86f2317'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', 
'/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, description=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.4, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, architecture=x86_64, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, konflux.additional-tags=17.1.12 17.1_20251118.1, batch=17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_id=tripleo_step5, vendor=Red Hat, Inc., com.redhat.component=openstack-nova-compute-container, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, url=https://www.redhat.com, build-date=2025-11-19T00:36:58Z, release=1761123044, version=17.1.12, summary=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, name=rhosp17/openstack-nova-compute, tcib_managed=true, container_name=nova_compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute) Dec 15 03:55:38 localhost systemd[1]: tmp-crun.JvS2kB.mount: Deactivated successfully. 
Dec 15 03:55:38 localhost podman[100587]: 2025-12-15 08:55:38.782208596 +0000 UTC m=+0.084948352 container health_status d6041e5471624a495d5be42684f6049ccf71802a80d5760239330151d465d146 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, distribution-scope=public, version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_id=tripleo_step4, maintainer=OpenStack TripleO Team, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'fcee5a4a91f85471fca7b61211375646'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, container_name=ceilometer_agent_compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, build-date=2025-11-19T00:11:48Z, tcib_managed=true, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, 
io.openshift.expose-services=, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-type=git, release=1761123044, managed_by=tripleo_ansible, url=https://www.redhat.com, architecture=x86_64, konflux.additional-tags=17.1.12 17.1_20251118.1, name=rhosp17/openstack-ceilometer-compute, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, com.redhat.component=openstack-ceilometer-compute-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, batch=17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 ceilometer-compute) Dec 15 03:55:38 localhost podman[100581]: 2025-12-15 08:55:38.856264394 +0000 UTC m=+0.164364533 container health_status ac45612fab357b615b4684dbfa5bd000dd29eb1adfbaedb6db0802fc49e89549 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, io.buildah.version=1.41.4, vendor=Red Hat, Inc., url=https://www.redhat.com, release=1761123044, build-date=2025-11-18T22:49:32Z, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, config_id=tripleo_step4, version=17.1.12, batch=17.1_20251118.1, name=rhosp17/openstack-cron, tcib_managed=true, com.redhat.component=openstack-cron-container, summary=Red Hat OpenStack Platform 17.1 cron, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 cron, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-type=git, architecture=x86_64, container_name=logrotate_crond, distribution-scope=public, konflux.additional-tags=17.1.12 17.1_20251118.1) Dec 15 03:55:38 localhost podman[100568]: 2025-12-15 08:55:38.820911459 +0000 UTC m=+0.142158209 container health_status 2649c3aec9af2f04509e1d248b1050b202fccc3ed2b86421c89a2a65376db057 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, managed_by=tripleo_ansible, build-date=2025-11-18T23:44:13Z, vendor=Red Hat, Inc., com.redhat.component=openstack-iscsid-container, release=1761123044, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.12 17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 iscsid, 
org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, vcs-type=git, container_name=iscsid, config_id=tripleo_step3, io.openshift.expose-services=, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 iscsid, batch=17.1_20251118.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '182e509007ab5e6e5b2500a552cbd5ba'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, version=17.1.12, distribution-scope=public, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.4, name=rhosp17/openstack-iscsid, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d) Dec 15 03:55:38 localhost podman[100587]: 
2025-12-15 08:55:38.882244368 +0000 UTC m=+0.184984204 container exec_died d6041e5471624a495d5be42684f6049ccf71802a80d5760239330151d465d146 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, com.redhat.component=openstack-ceilometer-compute-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, architecture=x86_64, release=1761123044, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, batch=17.1_20251118.1, config_id=tripleo_step4, name=rhosp17/openstack-ceilometer-compute, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-type=git, io.buildah.version=1.41.4, version=17.1.12, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, tcib_managed=true, build-date=2025-11-19T00:11:48Z, distribution-scope=public, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'fcee5a4a91f85471fca7b61211375646'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, container_name=ceilometer_agent_compute) Dec 15 03:55:38 localhost systemd[1]: d6041e5471624a495d5be42684f6049ccf71802a80d5760239330151d465d146.service: Deactivated successfully. Dec 15 03:55:38 localhost podman[100568]: 2025-12-15 08:55:38.90442617 +0000 UTC m=+0.225672930 container exec_died 2649c3aec9af2f04509e1d248b1050b202fccc3ed2b86421c89a2a65376db057 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, release=1761123044, managed_by=tripleo_ansible, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, container_name=iscsid, version=17.1.12, com.redhat.component=openstack-iscsid-container, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, maintainer=OpenStack TripleO Team, batch=17.1_20251118.1, build-date=2025-11-18T23:44:13Z, config_id=tripleo_step3, architecture=x86_64, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, url=https://www.redhat.com, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, 
name=rhosp17/openstack-iscsid, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-type=git, konflux.additional-tags=17.1.12 17.1_20251118.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '182e509007ab5e6e5b2500a552cbd5ba'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid) Dec 15 03:55:38 localhost systemd[1]: 2649c3aec9af2f04509e1d248b1050b202fccc3ed2b86421c89a2a65376db057.service: Deactivated successfully. 
Dec 15 03:55:38 localhost podman[100569]: 2025-12-15 08:55:38.924610949 +0000 UTC m=+0.247284817 container exec_died 36a88083ed613f6871d2631ae462bca308a45ba583333f7cfe4d0b0c40a18ba5 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.buildah.version=1.41.4, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.expose-services=, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '182e509007ab5e6e5b2500a552cbd5ba-879500e96bf8dfb93687004bd86f2317'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', 
'/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, release=1761123044, managed_by=tripleo_ansible, name=rhosp17/openstack-nova-compute, tcib_managed=true, build-date=2025-11-19T00:36:58Z, distribution-scope=public, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-type=git, container_name=nova_compute, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, architecture=x86_64, version=17.1.12, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, config_id=tripleo_step5, com.redhat.component=openstack-nova-compute-container, batch=17.1_20251118.1, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 nova-compute) Dec 15 03:55:38 localhost podman[100581]: 2025-12-15 08:55:38.938605823 +0000 UTC m=+0.246705962 container exec_died ac45612fab357b615b4684dbfa5bd000dd29eb1adfbaedb6db0802fc49e89549 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, batch=17.1_20251118.1, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, summary=Red Hat OpenStack Platform 17.1 cron, konflux.additional-tags=17.1.12 17.1_20251118.1, build-date=2025-11-18T22:49:32Z, name=rhosp17/openstack-cron, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, release=1761123044, distribution-scope=public, description=Red Hat OpenStack Platform 
17.1 cron, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, container_name=logrotate_crond, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, version=17.1.12, architecture=x86_64, com.redhat.component=openstack-cron-container, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, tcib_managed=true, config_id=tripleo_step4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-type=git, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron) Dec 15 03:55:38 localhost systemd[1]: 36a88083ed613f6871d2631ae462bca308a45ba583333f7cfe4d0b0c40a18ba5.service: Deactivated successfully. 
Dec 15 03:55:38 localhost systemd[1]: ac45612fab357b615b4684dbfa5bd000dd29eb1adfbaedb6db0802fc49e89549.service: Deactivated successfully. Dec 15 03:55:38 localhost podman[100567]: 2025-12-15 08:55:38.874900862 +0000 UTC m=+0.199825631 container health_status 165a6fd1f2b629d16da0798ec1c9bac41d302d530a0ffa668b2315a52d6aa04e (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, version=17.1.12, build-date=2025-11-18T22:51:28Z, batch=17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 collectd, architecture=x86_64, maintainer=OpenStack TripleO Team, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.buildah.version=1.41.4, com.redhat.component=openstack-collectd-container, io.openshift.expose-services=, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 collectd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, name=rhosp17/openstack-collectd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, konflux.additional-tags=17.1.12 17.1_20251118.1, container_name=collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, url=https://www.redhat.com, config_id=tripleo_step3, vcs-type=git, distribution-scope=public, release=1761123044, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI) Dec 15 03:55:38 localhost podman[100573]: 2025-12-15 08:55:38.9938508 +0000 UTC m=+0.306488680 container health_status 97115ff7c5a84ae7e17f79e696e94da3c63f9fe0051287fe9193bec282a03f99 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, name=rhosp17/openstack-ceilometer-ipmi, io.buildah.version=1.41.4, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, version=17.1.12, vcs-type=git, container_name=ceilometer_agent_ipmi, com.redhat.component=openstack-ceilometer-ipmi-container, release=1761123044, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.expose-services=, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 
ceilometer-ipmi, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step4, maintainer=OpenStack TripleO Team, build-date=2025-11-19T00:12:45Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, konflux.additional-tags=17.1.12 17.1_20251118.1, url=https://www.redhat.com, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, tcib_managed=true, batch=17.1_20251118.1, managed_by=tripleo_ansible, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'fcee5a4a91f85471fca7b61211375646'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}) Dec 15 03:55:39 localhost podman[100567]: 2025-12-15 08:55:39.00848768 +0000 UTC m=+0.333412429 container exec_died 165a6fd1f2b629d16da0798ec1c9bac41d302d530a0ffa668b2315a52d6aa04e (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, url=https://www.redhat.com, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, batch=17.1_20251118.1, container_name=collectd, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 collectd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, build-date=2025-11-18T22:51:28Z, release=1761123044, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.12, managed_by=tripleo_ansible, config_id=tripleo_step3, io.buildah.version=1.41.4, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, description=Red Hat OpenStack Platform 17.1 collectd, architecture=x86_64, io.openshift.expose-services=, vendor=Red Hat, Inc., com.redhat.component=openstack-collectd-container, vcs-type=git, distribution-scope=public, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', 
'/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, konflux.additional-tags=17.1.12 17.1_20251118.1, name=rhosp17/openstack-collectd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd) Dec 15 03:55:39 localhost systemd[1]: 165a6fd1f2b629d16da0798ec1c9bac41d302d530a0ffa668b2315a52d6aa04e.service: Deactivated successfully. Dec 15 03:55:39 localhost podman[100573]: 2025-12-15 08:55:39.025700181 +0000 UTC m=+0.338338051 container exec_died 97115ff7c5a84ae7e17f79e696e94da3c63f9fe0051287fe9193bec282a03f99 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, vendor=Red Hat, Inc., com.redhat.component=openstack-ceilometer-ipmi-container, url=https://www.redhat.com, config_id=tripleo_step4, release=1761123044, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, managed_by=tripleo_ansible, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, name=rhosp17/openstack-ceilometer-ipmi, distribution-scope=public, version=17.1.12, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, maintainer=OpenStack TripleO Team, container_name=ceilometer_agent_ipmi, architecture=x86_64, build-date=2025-11-19T00:12:45Z, io.buildah.version=1.41.4, config_data={'environment': 
{'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'fcee5a4a91f85471fca7b61211375646'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, batch=17.1_20251118.1, konflux.additional-tags=17.1.12 17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676) Dec 15 03:55:39 localhost systemd[1]: 97115ff7c5a84ae7e17f79e696e94da3c63f9fe0051287fe9193bec282a03f99.service: Deactivated successfully. Dec 15 03:55:41 localhost systemd[1]: Started /usr/bin/podman healthcheck run 4c3f871f4bdf287b1d3ce717956c1cf130617489217410eadf6fffcc440eb3b0. Dec 15 03:55:41 localhost systemd[1]: tmp-crun.NtSkP4.mount: Deactivated successfully. 
Dec 15 03:55:41 localhost podman[100705]: 2025-12-15 08:55:41.75421879 +0000 UTC m=+0.088727162 container health_status 4c3f871f4bdf287b1d3ce717956c1cf130617489217410eadf6fffcc440eb3b0 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, vendor=Red Hat, Inc., build-date=2025-11-19T00:36:58Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step4, konflux.additional-tags=17.1.12 17.1_20251118.1, version=17.1.12, distribution-scope=public, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.openshift.expose-services=, container_name=nova_migration_target, description=Red Hat OpenStack Platform 17.1 nova-compute, release=1761123044, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, maintainer=OpenStack TripleO Team, vcs-type=git, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.component=openstack-nova-compute-container, io.buildah.version=1.41.4, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, managed_by=tripleo_ansible, batch=17.1_20251118.1, name=rhosp17/openstack-nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '879500e96bf8dfb93687004bd86f2317'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}) Dec 15 03:55:42 localhost podman[100705]: 2025-12-15 08:55:42.091240724 +0000 UTC m=+0.425749046 container exec_died 4c3f871f4bdf287b1d3ce717956c1cf130617489217410eadf6fffcc440eb3b0 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, config_id=tripleo_step4, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, vcs-type=git, com.redhat.component=openstack-nova-compute-container, container_name=nova_migration_target, release=1761123044, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.expose-services=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '879500e96bf8dfb93687004bd86f2317'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, vendor=Red Hat, Inc., distribution-scope=public, maintainer=OpenStack TripleO Team, version=17.1.12, batch=17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp17/openstack-nova-compute, tcib_managed=true, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.41.4, summary=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, build-date=2025-11-19T00:36:58Z) Dec 15 03:55:42 localhost systemd[1]: 4c3f871f4bdf287b1d3ce717956c1cf130617489217410eadf6fffcc440eb3b0.service: Deactivated successfully. Dec 15 03:55:45 localhost systemd[1]: Started /usr/bin/podman healthcheck run 2df8d20c6ba32f38bd0e6519c9c77a4146b632f21c81197d8c319b223aad81f1. Dec 15 03:55:45 localhost systemd[1]: Started /usr/bin/podman healthcheck run 4d35b7dc3f3a4e3c60661e85d3511f50804e87244d74cab67af5c6e8c447d379. 
Dec 15 03:55:45 localhost podman[100728]: 2025-12-15 08:55:45.742602668 +0000 UTC m=+0.074799228 container health_status 2df8d20c6ba32f38bd0e6519c9c77a4146b632f21c81197d8c319b223aad81f1 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=unhealthy, vendor=Red Hat, Inc., com.redhat.component=openstack-ovn-controller-container, name=rhosp17/openstack-ovn-controller, build-date=2025-11-18T23:34:05Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, summary=Red Hat OpenStack Platform 17.1 ovn-controller, io.buildah.version=1.41.4, tcib_managed=true, config_id=tripleo_step4, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, release=1761123044, io.openshift.expose-services=, container_name=ovn_controller, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, 
org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, version=17.1.12, batch=17.1_20251118.1, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, url=https://www.redhat.com, managed_by=tripleo_ansible) Dec 15 03:55:45 localhost podman[100728]: 2025-12-15 08:55:45.785383591 +0000 UTC m=+0.117580181 container exec_died 2df8d20c6ba32f38bd0e6519c9c77a4146b632f21c81197d8c319b223aad81f1 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, batch=17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, url=https://www.redhat.com, container_name=ovn_controller, vendor=Red Hat, Inc., com.redhat.component=openstack-ovn-controller-container, io.buildah.version=1.41.4, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, name=rhosp17/openstack-ovn-controller, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, maintainer=OpenStack TripleO Team, distribution-scope=public, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 ovn-controller, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, release=1761123044, 
vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 ovn-controller, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, build-date=2025-11-18T23:34:05Z, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, version=17.1.12, konflux.additional-tags=17.1.12 17.1_20251118.1) Dec 15 03:55:45 localhost podman[100728]: unhealthy Dec 15 03:55:45 localhost systemd[1]: 2df8d20c6ba32f38bd0e6519c9c77a4146b632f21c81197d8c319b223aad81f1.service: Main process exited, code=exited, status=1/FAILURE Dec 15 03:55:45 localhost systemd[1]: 2df8d20c6ba32f38bd0e6519c9c77a4146b632f21c81197d8c319b223aad81f1.service: Failed with result 'exit-code'. Dec 15 03:55:45 localhost podman[100729]: 2025-12-15 08:55:45.860343034 +0000 UTC m=+0.188203949 container health_status 4d35b7dc3f3a4e3c60661e85d3511f50804e87244d74cab67af5c6e8c447d379 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=unhealthy, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, release=1761123044, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.4, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, vendor=Red Hat, Inc., version=17.1.12, container_name=ovn_metadata_agent, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_id=tripleo_step4, 
org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, tcib_managed=true, batch=17.1_20251118.1, url=https://www.redhat.com, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, name=rhosp17/openstack-neutron-metadata-agent-ovn, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.expose-services=, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'a56a6f14b467cd9064e40c03defa5ed7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, distribution-scope=public, build-date=2025-11-19T00:14:25Z, konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible) Dec 15 03:55:45 localhost podman[100729]: 2025-12-15 08:55:45.878311005 +0000 UTC 
m=+0.206171950 container exec_died 4d35b7dc3f3a4e3c60661e85d3511f50804e87244d74cab67af5c6e8c447d379 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, build-date=2025-11-19T00:14:25Z, vendor=Red Hat, Inc., vcs-type=git, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'a56a6f14b467cd9064e40c03defa5ed7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', 
'/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, config_id=tripleo_step4, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, distribution-scope=public, io.buildah.version=1.41.4, name=rhosp17/openstack-neutron-metadata-agent-ovn, batch=17.1_20251118.1, container_name=ovn_metadata_agent, release=1761123044, managed_by=tripleo_ansible, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, version=17.1.12, tcib_managed=true, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI) Dec 15 03:55:45 localhost podman[100729]: unhealthy Dec 15 03:55:45 localhost systemd[1]: 4d35b7dc3f3a4e3c60661e85d3511f50804e87244d74cab67af5c6e8c447d379.service: Main process exited, code=exited, status=1/FAILURE Dec 15 03:55:45 localhost systemd[1]: 4d35b7dc3f3a4e3c60661e85d3511f50804e87244d74cab67af5c6e8c447d379.service: Failed with result 'exit-code'. Dec 15 03:55:54 localhost systemd[1]: Started /usr/bin/podman healthcheck run 6278d648aec9c9f12e0efaafa0d6ae5653eeb34459516699e562eeb9c8d23b3c. 
Dec 15 03:55:54 localhost podman[100768]: 2025-12-15 08:55:54.765057605 +0000 UTC m=+0.096044587 container health_status 6278d648aec9c9f12e0efaafa0d6ae5653eeb34459516699e562eeb9c8d23b3c (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, description=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.expose-services=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '29d07da103bbd44a9ed3e29999314b03'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, summary=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.buildah.version=1.41.4, release=1761123044, config_id=tripleo_step1, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, batch=17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, managed_by=tripleo_ansible, version=17.1.12, maintainer=OpenStack TripleO Team, architecture=x86_64, build-date=2025-11-18T22:49:46Z, distribution-scope=public, name=rhosp17/openstack-qdrouterd, konflux.additional-tags=17.1.12 17.1_20251118.1, vendor=Red Hat, Inc., com.redhat.component=openstack-qdrouterd-container, tcib_managed=true, container_name=metrics_qdr) Dec 15 03:55:54 localhost podman[100768]: 2025-12-15 08:55:54.985429593 +0000 UTC m=+0.316416555 container exec_died 6278d648aec9c9f12e0efaafa0d6ae5653eeb34459516699e562eeb9c8d23b3c (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, container_name=metrics_qdr, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, summary=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.component=openstack-qdrouterd-container, description=Red Hat OpenStack Platform 17.1 qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '29d07da103bbd44a9ed3e29999314b03'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', 
'/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, maintainer=OpenStack TripleO Team, release=1761123044, distribution-scope=public, io.openshift.expose-services=, batch=17.1_20251118.1, konflux.additional-tags=17.1.12 17.1_20251118.1, build-date=2025-11-18T22:49:46Z, io.buildah.version=1.41.4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp17/openstack-qdrouterd, config_id=tripleo_step1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, architecture=x86_64, url=https://www.redhat.com, vendor=Red Hat, Inc., tcib_managed=true, managed_by=tripleo_ansible, version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a) Dec 15 03:55:55 localhost systemd[1]: 6278d648aec9c9f12e0efaafa0d6ae5653eeb34459516699e562eeb9c8d23b3c.service: Deactivated successfully. Dec 15 03:56:05 localhost sshd[100797]: main: sshd: ssh-rsa algorithm is disabled Dec 15 03:56:09 localhost systemd[1]: Started /usr/bin/podman healthcheck run 165a6fd1f2b629d16da0798ec1c9bac41d302d530a0ffa668b2315a52d6aa04e. Dec 15 03:56:09 localhost systemd[1]: Started /usr/bin/podman healthcheck run 2649c3aec9af2f04509e1d248b1050b202fccc3ed2b86421c89a2a65376db057. Dec 15 03:56:09 localhost systemd[1]: Started /usr/bin/podman healthcheck run 36a88083ed613f6871d2631ae462bca308a45ba583333f7cfe4d0b0c40a18ba5. 
Dec 15 03:56:09 localhost systemd[1]: Started /usr/bin/podman healthcheck run 97115ff7c5a84ae7e17f79e696e94da3c63f9fe0051287fe9193bec282a03f99. Dec 15 03:56:09 localhost systemd[1]: Started /usr/bin/podman healthcheck run ac45612fab357b615b4684dbfa5bd000dd29eb1adfbaedb6db0802fc49e89549. Dec 15 03:56:09 localhost systemd[1]: Started /usr/bin/podman healthcheck run d6041e5471624a495d5be42684f6049ccf71802a80d5760239330151d465d146. Dec 15 03:56:09 localhost podman[100799]: 2025-12-15 08:56:09.765390877 +0000 UTC m=+0.094354242 container health_status 165a6fd1f2b629d16da0798ec1c9bac41d302d530a0ffa668b2315a52d6aa04e (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, com.redhat.component=openstack-collectd-container, name=rhosp17/openstack-collectd, url=https://www.redhat.com, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', 
'/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, maintainer=OpenStack TripleO Team, batch=17.1_20251118.1, io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, distribution-scope=public, build-date=2025-11-18T22:51:28Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, description=Red Hat OpenStack Platform 17.1 collectd, vendor=Red Hat, Inc., release=1761123044, tcib_managed=true, config_id=tripleo_step3, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, konflux.additional-tags=17.1.12 17.1_20251118.1, version=17.1.12, summary=Red Hat OpenStack Platform 17.1 collectd, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, managed_by=tripleo_ansible, architecture=x86_64, container_name=collectd) Dec 15 03:56:09 localhost podman[100812]: 2025-12-15 08:56:09.813866302 +0000 UTC m=+0.128212127 container health_status d6041e5471624a495d5be42684f6049ccf71802a80d5760239330151d465d146 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, io.openshift.expose-services=, tcib_managed=true, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, distribution-scope=public, io.buildah.version=1.41.4, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, architecture=x86_64, container_name=ceilometer_agent_compute, vcs-type=git, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'fcee5a4a91f85471fca7b61211375646'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, release=1761123044, version=17.1.12, name=rhosp17/openstack-ceilometer-compute, vendor=Red Hat, Inc., build-date=2025-11-19T00:11:48Z, com.redhat.component=openstack-ceilometer-compute-container, config_id=tripleo_step4, url=https://www.redhat.com, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, batch=17.1_20251118.1, konflux.additional-tags=17.1.12 17.1_20251118.1) 
Dec 15 03:56:09 localhost podman[100801]: 2025-12-15 08:56:09.867385642 +0000 UTC m=+0.190816709 container health_status 36a88083ed613f6871d2631ae462bca308a45ba583333f7cfe4d0b0c40a18ba5 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, version=17.1.12, config_id=tripleo_step5, url=https://www.redhat.com, vcs-type=git, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-11-19T00:36:58Z, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, managed_by=tripleo_ansible, com.redhat.component=openstack-nova-compute-container, tcib_managed=true, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '182e509007ab5e6e5b2500a552cbd5ba-879500e96bf8dfb93687004bd86f2317'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, container_name=nova_compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.buildah.version=1.41.4, distribution-scope=public, batch=17.1_20251118.1, release=1761123044, architecture=x86_64, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute) Dec 15 03:56:09 localhost podman[100801]: 2025-12-15 08:56:09.91187767 +0000 UTC m=+0.235308797 container exec_died 36a88083ed613f6871d2631ae462bca308a45ba583333f7cfe4d0b0c40a18ba5 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-nova-compute-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-19T00:36:58Z, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, batch=17.1_20251118.1, io.openshift.expose-services=, 
io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp17/openstack-nova-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '182e509007ab5e6e5b2500a552cbd5ba-879500e96bf8dfb93687004bd86f2317'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, io.buildah.version=1.41.4, konflux.additional-tags=17.1.12 17.1_20251118.1, version=17.1.12, 
config_id=tripleo_step5, managed_by=tripleo_ansible, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 nova-compute, distribution-scope=public, release=1761123044, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, tcib_managed=true, vendor=Red Hat, Inc., container_name=nova_compute, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI) Dec 15 03:56:09 localhost podman[100800]: 2025-12-15 08:56:09.92307531 +0000 UTC m=+0.242705086 container health_status 2649c3aec9af2f04509e1d248b1050b202fccc3ed2b86421c89a2a65376db057 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, release=1761123044, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, config_id=tripleo_step3, version=17.1.12, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 iscsid, tcib_managed=true, build-date=2025-11-18T23:44:13Z, url=https://www.redhat.com, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, vcs-type=git, io.buildah.version=1.41.4, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 iscsid, container_name=iscsid, distribution-scope=public, managed_by=tripleo_ansible, batch=17.1_20251118.1, konflux.additional-tags=17.1.12 17.1_20251118.1, name=rhosp17/openstack-iscsid, architecture=x86_64, com.redhat.component=openstack-iscsid-container, io.openshift.tags=rhosp 
osp openstack osp-17.1 openstack-iscsid, maintainer=OpenStack TripleO Team, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '182e509007ab5e6e5b2500a552cbd5ba'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}) Dec 15 03:56:09 localhost systemd[1]: 36a88083ed613f6871d2631ae462bca308a45ba583333f7cfe4d0b0c40a18ba5.service: Deactivated successfully. 
Dec 15 03:56:09 localhost podman[100799]: 2025-12-15 08:56:09.93993327 +0000 UTC m=+0.268896635 container exec_died 165a6fd1f2b629d16da0798ec1c9bac41d302d530a0ffa668b2315a52d6aa04e (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, description=Red Hat OpenStack Platform 17.1 collectd, vcs-type=git, io.openshift.expose-services=, tcib_managed=true, architecture=x86_64, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, version=17.1.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 collectd, managed_by=tripleo_ansible, io.buildah.version=1.41.4, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, name=rhosp17/openstack-collectd, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, batch=17.1_20251118.1, container_name=collectd, release=1761123044, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, url=https://www.redhat.com, build-date=2025-11-18T22:51:28Z, maintainer=OpenStack TripleO Team, config_id=tripleo_step3, com.redhat.component=openstack-collectd-container) Dec 15 03:56:09 localhost podman[100800]: 2025-12-15 08:56:09.961374333 +0000 UTC m=+0.281004089 container exec_died 2649c3aec9af2f04509e1d248b1050b202fccc3ed2b86421c89a2a65376db057 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, vendor=Red Hat, Inc., build-date=2025-11-18T23:44:13Z, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, config_id=tripleo_step3, io.openshift.expose-services=, version=17.1.12, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, url=https://www.redhat.com, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-type=git, description=Red Hat OpenStack Platform 17.1 iscsid, release=1761123044, batch=17.1_20251118.1, io.buildah.version=1.41.4, container_name=iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '182e509007ab5e6e5b2500a552cbd5ba'}, 'healthcheck': {'test': 
'/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, tcib_managed=true, managed_by=tripleo_ansible, com.redhat.component=openstack-iscsid-container, name=rhosp17/openstack-iscsid, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid) Dec 15 03:56:09 localhost systemd[1]: 165a6fd1f2b629d16da0798ec1c9bac41d302d530a0ffa668b2315a52d6aa04e.service: Deactivated successfully. Dec 15 03:56:09 localhost systemd[1]: 2649c3aec9af2f04509e1d248b1050b202fccc3ed2b86421c89a2a65376db057.service: Deactivated successfully. 
Dec 15 03:56:09 localhost podman[100802]: 2025-12-15 08:56:09.982295772 +0000 UTC m=+0.299701039 container health_status 97115ff7c5a84ae7e17f79e696e94da3c63f9fe0051287fe9193bec282a03f99 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_id=tripleo_step4, name=rhosp17/openstack-ceilometer-ipmi, architecture=x86_64, release=1761123044, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'fcee5a4a91f85471fca7b61211375646'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.openshift.expose-services=, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, com.redhat.component=openstack-ceilometer-ipmi-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, tcib_managed=true, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, vcs-type=git, batch=17.1_20251118.1, container_name=ceilometer_agent_ipmi, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, distribution-scope=public, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, build-date=2025-11-19T00:12:45Z, url=https://www.redhat.com, io.buildah.version=1.41.4, version=17.1.12, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, konflux.additional-tags=17.1.12 17.1_20251118.1) Dec 15 03:56:09 localhost podman[100808]: 2025-12-15 08:56:09.787146948 +0000 UTC m=+0.103885607 container health_status ac45612fab357b615b4684dbfa5bd000dd29eb1adfbaedb6db0802fc49e89549 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, container_name=logrotate_crond, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_id=tripleo_step4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, io.openshift.expose-services=, batch=17.1_20251118.1, tcib_managed=true, build-date=2025-11-18T22:49:32Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, distribution-scope=public, architecture=x86_64, managed_by=tripleo_ansible, vendor=Red Hat, Inc., version=17.1.12, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 17.1_20251118.1, io.buildah.version=1.41.4, url=https://www.redhat.com, name=rhosp17/openstack-cron, release=1761123044, com.redhat.component=openstack-cron-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, summary=Red Hat OpenStack Platform 17.1 cron, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, vcs-type=git, description=Red Hat OpenStack Platform 17.1 cron) Dec 15 03:56:10 localhost podman[100812]: 2025-12-15 08:56:10.001839214 +0000 UTC m=+0.316185079 container exec_died d6041e5471624a495d5be42684f6049ccf71802a80d5760239330151d465d146 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'fcee5a4a91f85471fca7b61211375646'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.openshift.expose-services=, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, version=17.1.12, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, url=https://www.redhat.com, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, build-date=2025-11-19T00:11:48Z, com.redhat.component=openstack-ceilometer-compute-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, vendor=Red Hat, Inc., release=1761123044, tcib_managed=true, io.buildah.version=1.41.4, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, config_id=tripleo_step4, distribution-scope=public, batch=17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, name=rhosp17/openstack-ceilometer-compute, container_name=ceilometer_agent_compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, managed_by=tripleo_ansible, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI) Dec 15 03:56:10 
localhost systemd[1]: d6041e5471624a495d5be42684f6049ccf71802a80d5760239330151d465d146.service: Deactivated successfully. Dec 15 03:56:10 localhost podman[100802]: 2025-12-15 08:56:10.020409 +0000 UTC m=+0.337814227 container exec_died 97115ff7c5a84ae7e17f79e696e94da3c63f9fe0051287fe9193bec282a03f99 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'fcee5a4a91f85471fca7b61211375646'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, vendor=Red Hat, Inc., vcs-type=git, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, architecture=x86_64, io.buildah.version=1.41.4, tcib_managed=true, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=ceilometer_agent_ipmi, distribution-scope=public, 
org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, build-date=2025-11-19T00:12:45Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, name=rhosp17/openstack-ceilometer-ipmi, release=1761123044, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_id=tripleo_step4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.component=openstack-ceilometer-ipmi-container, managed_by=tripleo_ansible, version=17.1.12, batch=17.1_20251118.1, io.openshift.expose-services=, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676) Dec 15 03:56:10 localhost podman[100808]: 2025-12-15 08:56:10.026421351 +0000 UTC m=+0.343160050 container exec_died ac45612fab357b615b4684dbfa5bd000dd29eb1adfbaedb6db0802fc49e89549 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, name=rhosp17/openstack-cron, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, container_name=logrotate_crond, tcib_managed=true, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1, konflux.additional-tags=17.1.12 17.1_20251118.1, version=17.1.12, architecture=x86_64, com.redhat.component=openstack-cron-container, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, config_id=tripleo_step4, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, release=1761123044, build-date=2025-11-18T22:49:32Z, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 cron, io.openshift.expose-services=, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 cron, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, vendor=Red Hat, Inc., distribution-scope=public) Dec 15 03:56:10 localhost systemd[1]: 97115ff7c5a84ae7e17f79e696e94da3c63f9fe0051287fe9193bec282a03f99.service: Deactivated successfully. Dec 15 03:56:10 localhost systemd[1]: ac45612fab357b615b4684dbfa5bd000dd29eb1adfbaedb6db0802fc49e89549.service: Deactivated successfully. Dec 15 03:56:12 localhost systemd[1]: Started /usr/bin/podman healthcheck run 4c3f871f4bdf287b1d3ce717956c1cf130617489217410eadf6fffcc440eb3b0. 
Dec 15 03:56:12 localhost podman[100995]: 2025-12-15 08:56:12.754097657 +0000 UTC m=+0.088834614 container health_status 4c3f871f4bdf287b1d3ce717956c1cf130617489217410eadf6fffcc440eb3b0 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, io.openshift.expose-services=, tcib_managed=true, url=https://www.redhat.com, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '879500e96bf8dfb93687004bd86f2317'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, com.redhat.component=openstack-nova-compute-container, description=Red Hat OpenStack Platform 17.1 nova-compute, release=1761123044, io.buildah.version=1.41.4, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp17/openstack-nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, 
io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, build-date=2025-11-19T00:36:58Z, container_name=nova_migration_target, architecture=x86_64, version=17.1.12, batch=17.1_20251118.1, config_id=tripleo_step4, konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d) Dec 15 03:56:13 localhost podman[100995]: 2025-12-15 08:56:13.122593083 +0000 UTC m=+0.457330000 container exec_died 4c3f871f4bdf287b1d3ce717956c1cf130617489217410eadf6fffcc440eb3b0 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, version=17.1.12, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, vcs-type=git, konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team, container_name=nova_migration_target, name=rhosp17/openstack-nova-compute, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1761123044, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2025-11-19T00:36:58Z, io.openshift.expose-services=, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20251118.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 
'879500e96bf8dfb93687004bd86f2317'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, tcib_managed=true, url=https://www.redhat.com, io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, config_id=tripleo_step4, summary=Red Hat OpenStack Platform 17.1 nova-compute, distribution-scope=public, com.redhat.component=openstack-nova-compute-container) Dec 15 03:56:13 localhost systemd[1]: 4c3f871f4bdf287b1d3ce717956c1cf130617489217410eadf6fffcc440eb3b0.service: Deactivated successfully. Dec 15 03:56:16 localhost systemd[1]: Started /usr/bin/podman healthcheck run 2df8d20c6ba32f38bd0e6519c9c77a4146b632f21c81197d8c319b223aad81f1. Dec 15 03:56:16 localhost systemd[1]: Started /usr/bin/podman healthcheck run 4d35b7dc3f3a4e3c60661e85d3511f50804e87244d74cab67af5c6e8c447d379. Dec 15 03:56:16 localhost systemd[1]: tmp-crun.zlsn5u.mount: Deactivated successfully. 
Dec 15 03:56:16 localhost podman[101033]: 2025-12-15 08:56:16.773803714 +0000 UTC m=+0.102138749 container health_status 2df8d20c6ba32f38bd0e6519c9c77a4146b632f21c81197d8c319b223aad81f1 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=unhealthy, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, architecture=x86_64, distribution-scope=public, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, io.buildah.version=1.41.4, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, version=17.1.12, io.openshift.expose-services=, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 ovn-controller, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, name=rhosp17/openstack-ovn-controller, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-18T23:34:05Z, com.redhat.component=openstack-ovn-controller-container, release=1761123044, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, 
batch=17.1_20251118.1, container_name=ovn_controller, summary=Red Hat OpenStack Platform 17.1 ovn-controller, konflux.additional-tags=17.1.12 17.1_20251118.1, url=https://www.redhat.com, config_id=tripleo_step4) Dec 15 03:56:16 localhost podman[101033]: 2025-12-15 08:56:16.813846935 +0000 UTC m=+0.142181970 container exec_died 2df8d20c6ba32f38bd0e6519c9c77a4146b632f21c81197d8c319b223aad81f1 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, version=17.1.12, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, build-date=2025-11-18T23:34:05Z, url=https://www.redhat.com, architecture=x86_64, vendor=Red Hat, Inc., config_id=tripleo_step4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-type=git, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, description=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, 
name=rhosp17/openstack-ovn-controller, release=1761123044, summary=Red Hat OpenStack Platform 17.1 ovn-controller, container_name=ovn_controller, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, com.redhat.component=openstack-ovn-controller-container, io.buildah.version=1.41.4, managed_by=tripleo_ansible, batch=17.1_20251118.1, konflux.additional-tags=17.1.12 17.1_20251118.1) Dec 15 03:56:16 localhost podman[101033]: unhealthy Dec 15 03:56:16 localhost systemd[1]: 2df8d20c6ba32f38bd0e6519c9c77a4146b632f21c81197d8c319b223aad81f1.service: Main process exited, code=exited, status=1/FAILURE Dec 15 03:56:16 localhost systemd[1]: 2df8d20c6ba32f38bd0e6519c9c77a4146b632f21c81197d8c319b223aad81f1.service: Failed with result 'exit-code'. Dec 15 03:56:16 localhost podman[101034]: 2025-12-15 08:56:16.86086947 +0000 UTC m=+0.187329275 container health_status 4d35b7dc3f3a4e3c60661e85d3511f50804e87244d74cab67af5c6e8c447d379 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=unhealthy, tcib_managed=true, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, batch=17.1_20251118.1, url=https://www.redhat.com, build-date=2025-11-19T00:14:25Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, container_name=ovn_metadata_agent, distribution-scope=public, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'a56a6f14b467cd9064e40c03defa5ed7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 
'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, config_id=tripleo_step4, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, io.buildah.version=1.41.4, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, version=17.1.12, release=1761123044, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-type=git, vendor=Red Hat, Inc., com.redhat.component=openstack-neutron-metadata-agent-ovn-container, name=rhosp17/openstack-neutron-metadata-agent-ovn, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 17.1_20251118.1) Dec 15 03:56:16 localhost podman[101034]: 2025-12-15 08:56:16.904505626 +0000 UTC m=+0.230965461 container exec_died 4d35b7dc3f3a4e3c60661e85d3511f50804e87244d74cab67af5c6e8c447d379 
(image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, release=1761123044, architecture=x86_64, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'a56a6f14b467cd9064e40c03defa5ed7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, tcib_managed=true, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, distribution-scope=public, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 
neutron-metadata-agent-ovn, vendor=Red Hat, Inc., version=17.1.12, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, config_id=tripleo_step4, vcs-type=git, name=rhosp17/openstack-neutron-metadata-agent-ovn, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, build-date=2025-11-19T00:14:25Z, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, managed_by=tripleo_ansible, konflux.additional-tags=17.1.12 17.1_20251118.1, container_name=ovn_metadata_agent, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1, io.buildah.version=1.41.4) Dec 15 03:56:16 localhost podman[101034]: unhealthy Dec 15 03:56:16 localhost systemd[1]: 4d35b7dc3f3a4e3c60661e85d3511f50804e87244d74cab67af5c6e8c447d379.service: Main process exited, code=exited, status=1/FAILURE Dec 15 03:56:16 localhost systemd[1]: 4d35b7dc3f3a4e3c60661e85d3511f50804e87244d74cab67af5c6e8c447d379.service: Failed with result 'exit-code'. Dec 15 03:56:19 localhost sshd[101075]: main: sshd: ssh-rsa algorithm is disabled Dec 15 03:56:25 localhost systemd[1]: Started /usr/bin/podman healthcheck run 6278d648aec9c9f12e0efaafa0d6ae5653eeb34459516699e562eeb9c8d23b3c. 
Dec 15 03:56:25 localhost podman[101077]: 2025-12-15 08:56:25.754201618 +0000 UTC m=+0.085445483 container health_status 6278d648aec9c9f12e0efaafa0d6ae5653eeb34459516699e562eeb9c8d23b3c (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, vcs-type=git, com.redhat.component=openstack-qdrouterd-container, name=rhosp17/openstack-qdrouterd, version=17.1.12, container_name=metrics_qdr, konflux.additional-tags=17.1.12 17.1_20251118.1, tcib_managed=true, io.buildah.version=1.41.4, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, summary=Red Hat OpenStack Platform 17.1 qdrouterd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, batch=17.1_20251118.1, release=1761123044, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_id=tripleo_step1, vendor=Red Hat, Inc., config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '29d07da103bbd44a9ed3e29999314b03'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-18T22:49:46Z) Dec 15 03:56:25 localhost podman[101077]: 2025-12-15 08:56:25.954924741 +0000 UTC m=+0.286168606 container exec_died 6278d648aec9c9f12e0efaafa0d6ae5653eeb34459516699e562eeb9c8d23b3c (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, description=Red Hat OpenStack Platform 17.1 qdrouterd, summary=Red Hat OpenStack Platform 17.1 qdrouterd, architecture=x86_64, io.openshift.expose-services=, release=1761123044, name=rhosp17/openstack-qdrouterd, version=17.1.12, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '29d07da103bbd44a9ed3e29999314b03'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.buildah.version=1.41.4, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, com.redhat.component=openstack-qdrouterd-container, distribution-scope=public, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, url=https://www.redhat.com, config_id=tripleo_step1, build-date=2025-11-18T22:49:46Z, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, konflux.additional-tags=17.1.12 17.1_20251118.1, container_name=metrics_qdr, batch=17.1_20251118.1, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team) Dec 15 03:56:25 localhost systemd[1]: 6278d648aec9c9f12e0efaafa0d6ae5653eeb34459516699e562eeb9c8d23b3c.service: Deactivated successfully. Dec 15 03:56:40 localhost systemd[1]: Started /usr/bin/podman healthcheck run 165a6fd1f2b629d16da0798ec1c9bac41d302d530a0ffa668b2315a52d6aa04e. Dec 15 03:56:40 localhost systemd[1]: Started /usr/bin/podman healthcheck run 2649c3aec9af2f04509e1d248b1050b202fccc3ed2b86421c89a2a65376db057. Dec 15 03:56:40 localhost systemd[1]: Started /usr/bin/podman healthcheck run 36a88083ed613f6871d2631ae462bca308a45ba583333f7cfe4d0b0c40a18ba5. 
Dec 15 03:56:40 localhost systemd[1]: Started /usr/bin/podman healthcheck run 97115ff7c5a84ae7e17f79e696e94da3c63f9fe0051287fe9193bec282a03f99. Dec 15 03:56:40 localhost systemd[1]: Started /usr/bin/podman healthcheck run ac45612fab357b615b4684dbfa5bd000dd29eb1adfbaedb6db0802fc49e89549. Dec 15 03:56:40 localhost systemd[1]: Started /usr/bin/podman healthcheck run d6041e5471624a495d5be42684f6049ccf71802a80d5760239330151d465d146. Dec 15 03:56:40 localhost podman[101108]: 2025-12-15 08:56:40.782561659 +0000 UTC m=+0.100836135 container health_status 36a88083ed613f6871d2631ae462bca308a45ba583333f7cfe4d0b0c40a18ba5 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, release=1761123044, summary=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, com.redhat.component=openstack-nova-compute-container, batch=17.1_20251118.1, maintainer=OpenStack TripleO Team, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '182e509007ab5e6e5b2500a552cbd5ba-879500e96bf8dfb93687004bd86f2317'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 
'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, build-date=2025-11-19T00:36:58Z, architecture=x86_64, managed_by=tripleo_ansible, io.buildah.version=1.41.4, config_id=tripleo_step5, tcib_managed=true, container_name=nova_compute, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, version=17.1.12, vendor=Red Hat, Inc., vcs-type=git, name=rhosp17/openstack-nova-compute) Dec 15 03:56:40 localhost podman[101108]: 2025-12-15 08:56:40.81743643 +0000 UTC m=+0.135710906 container exec_died 36a88083ed613f6871d2631ae462bca308a45ba583333f7cfe4d0b0c40a18ba5 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, io.k8s.display-name=Red Hat OpenStack 
Platform 17.1 nova-compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, com.redhat.component=openstack-nova-compute-container, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, config_id=tripleo_step5, container_name=nova_compute, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-nova-compute, url=https://www.redhat.com, managed_by=tripleo_ansible, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '182e509007ab5e6e5b2500a552cbd5ba-879500e96bf8dfb93687004bd86f2317'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', 
'/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2025-11-19T00:36:58Z, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, version=17.1.12, architecture=x86_64, batch=17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, vendor=Red Hat, Inc., io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, distribution-scope=public, release=1761123044, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute) Dec 15 03:56:40 localhost systemd[1]: 36a88083ed613f6871d2631ae462bca308a45ba583333f7cfe4d0b0c40a18ba5.service: Deactivated successfully. 
Dec 15 03:56:40 localhost podman[101115]: 2025-12-15 08:56:40.826167043 +0000 UTC m=+0.139015984 container health_status ac45612fab357b615b4684dbfa5bd000dd29eb1adfbaedb6db0802fc49e89549 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, batch=17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 cron, release=1761123044, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, tcib_managed=true, url=https://www.redhat.com, version=17.1.12, container_name=logrotate_crond, io.openshift.expose-services=, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.buildah.version=1.41.4, build-date=2025-11-18T22:49:32Z, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, 
com.redhat.component=openstack-cron-container, distribution-scope=public, managed_by=tripleo_ansible, name=rhosp17/openstack-cron, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, vcs-type=git, architecture=x86_64, config_id=tripleo_step4, summary=Red Hat OpenStack Platform 17.1 cron) Dec 15 03:56:40 localhost systemd[1]: tmp-crun.XTH0CY.mount: Deactivated successfully. Dec 15 03:56:40 localhost podman[101109]: 2025-12-15 08:56:40.885015356 +0000 UTC m=+0.200251461 container health_status 97115ff7c5a84ae7e17f79e696e94da3c63f9fe0051287fe9193bec282a03f99 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, tcib_managed=true, vendor=Red Hat, Inc., io.buildah.version=1.41.4, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, config_id=tripleo_step4, url=https://www.redhat.com, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 17.1_20251118.1, build-date=2025-11-19T00:12:45Z, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, container_name=ceilometer_agent_ipmi, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, release=1761123044, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20251118.1, version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, io.k8s.description=Red Hat OpenStack Platform 17.1 
ceilometer-ipmi, architecture=x86_64, com.redhat.component=openstack-ceilometer-ipmi-container, vcs-type=git, name=rhosp17/openstack-ceilometer-ipmi, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'fcee5a4a91f85471fca7b61211375646'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}) Dec 15 03:56:40 localhost podman[101115]: 2025-12-15 08:56:40.913402104 +0000 UTC m=+0.226251005 container exec_died ac45612fab357b615b4684dbfa5bd000dd29eb1adfbaedb6db0802fc49e89549 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, url=https://www.redhat.com, build-date=2025-11-18T22:49:32Z, description=Red Hat OpenStack Platform 17.1 cron, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, distribution-scope=public, konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible, architecture=x86_64, name=rhosp17/openstack-cron, maintainer=OpenStack TripleO Team, vcs-type=git, container_name=logrotate_crond, io.buildah.version=1.41.4, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, release=1761123044, io.openshift.expose-services=, version=17.1.12, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, summary=Red Hat OpenStack Platform 17.1 cron, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.component=openstack-cron-container, config_id=tripleo_step4, batch=17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron) Dec 15 03:56:40 localhost systemd[1]: ac45612fab357b615b4684dbfa5bd000dd29eb1adfbaedb6db0802fc49e89549.service: Deactivated successfully. 
Dec 15 03:56:40 localhost podman[101107]: 2025-12-15 08:56:40.935837593 +0000 UTC m=+0.255934208 container health_status 2649c3aec9af2f04509e1d248b1050b202fccc3ed2b86421c89a2a65376db057 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, io.buildah.version=1.41.4, name=rhosp17/openstack-iscsid, release=1761123044, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, batch=17.1_20251118.1, build-date=2025-11-18T23:44:13Z, version=17.1.12, config_id=tripleo_step3, distribution-scope=public, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '182e509007ab5e6e5b2500a552cbd5ba'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, konflux.additional-tags=17.1.12 17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 iscsid, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 iscsid, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vendor=Red Hat, Inc., vcs-type=git, io.openshift.expose-services=, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, managed_by=tripleo_ansible, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, container_name=iscsid, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, com.redhat.component=openstack-iscsid-container) Dec 15 03:56:40 localhost podman[101107]: 2025-12-15 08:56:40.974491206 +0000 UTC m=+0.294587791 container exec_died 2649c3aec9af2f04509e1d248b1050b202fccc3ed2b86421c89a2a65376db057 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, architecture=x86_64, container_name=iscsid, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '182e509007ab5e6e5b2500a552cbd5ba'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', 
'/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, tcib_managed=true, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public, io.buildah.version=1.41.4, vcs-type=git, build-date=2025-11-18T23:44:13Z, url=https://www.redhat.com, batch=17.1_20251118.1, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, name=rhosp17/openstack-iscsid, version=17.1.12, summary=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.expose-services=, config_id=tripleo_step3, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, maintainer=OpenStack TripleO Team, release=1761123044, com.redhat.component=openstack-iscsid-container, konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid) Dec 15 03:56:40 localhost systemd[1]: 2649c3aec9af2f04509e1d248b1050b202fccc3ed2b86421c89a2a65376db057.service: Deactivated successfully. 
Dec 15 03:56:40 localhost podman[101106]: 2025-12-15 08:56:40.989263381 +0000 UTC m=+0.313117736 container health_status 165a6fd1f2b629d16da0798ec1c9bac41d302d530a0ffa668b2315a52d6aa04e (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-18T22:51:28Z, version=17.1.12, vcs-type=git, config_id=tripleo_step3, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 collectd, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-collectd-container, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 collectd, tcib_managed=true, architecture=x86_64, batch=17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, name=rhosp17/openstack-collectd, container_name=collectd, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', 
'/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, url=https://www.redhat.com, io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, release=1761123044, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd) Dec 15 03:56:40 localhost podman[101106]: 2025-12-15 08:56:40.996239938 +0000 UTC m=+0.320094293 container exec_died 165a6fd1f2b629d16da0798ec1c9bac41d302d530a0ffa668b2315a52d6aa04e (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 collectd, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, container_name=collectd, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, name=rhosp17/openstack-collectd, vendor=Red Hat, Inc., version=17.1.12, batch=17.1_20251118.1, com.redhat.component=openstack-collectd-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, config_id=tripleo_step3, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, build-date=2025-11-18T22:51:28Z, konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible, url=https://www.redhat.com, distribution-scope=public, release=1761123044, 
cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, maintainer=OpenStack TripleO Team, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, summary=Red Hat OpenStack Platform 17.1 collectd) Dec 15 03:56:41 localhost systemd[1]: 165a6fd1f2b629d16da0798ec1c9bac41d302d530a0ffa668b2315a52d6aa04e.service: Deactivated successfully. 
Dec 15 03:56:41 localhost podman[101117]: 2025-12-15 08:56:41.050349653 +0000 UTC m=+0.356947417 container health_status d6041e5471624a495d5be42684f6049ccf71802a80d5760239330151d465d146 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, tcib_managed=true, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, release=1761123044, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, name=rhosp17/openstack-ceilometer-compute, architecture=x86_64, url=https://www.redhat.com, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, vendor=Red Hat, Inc., managed_by=tripleo_ansible, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.expose-services=, version=17.1.12, config_id=tripleo_step4, maintainer=OpenStack TripleO Team, build-date=2025-11-19T00:11:48Z, batch=17.1_20251118.1, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'fcee5a4a91f85471fca7b61211375646'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, io.buildah.version=1.41.4, konflux.additional-tags=17.1.12 17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.component=openstack-ceilometer-compute-container, container_name=ceilometer_agent_compute) Dec 15 03:56:41 localhost podman[101109]: 2025-12-15 08:56:41.064577243 +0000 UTC m=+0.379813338 container exec_died 97115ff7c5a84ae7e17f79e696e94da3c63f9fe0051287fe9193bec282a03f99 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, release=1761123044, managed_by=tripleo_ansible, version=17.1.12, io.buildah.version=1.41.4, tcib_managed=true, com.redhat.component=openstack-ceilometer-ipmi-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, batch=17.1_20251118.1, name=rhosp17/openstack-ceilometer-ipmi, url=https://www.redhat.com, konflux.additional-tags=17.1.12 17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-19T00:12:45Z, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, 
org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'fcee5a4a91f85471fca7b61211375646'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, container_name=ceilometer_agent_ipmi, config_id=tripleo_step4, io.openshift.expose-services=, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-type=git, maintainer=OpenStack TripleO Team) Dec 15 03:56:41 localhost systemd[1]: 97115ff7c5a84ae7e17f79e696e94da3c63f9fe0051287fe9193bec282a03f99.service: Deactivated successfully. 
Dec 15 03:56:41 localhost podman[101117]: 2025-12-15 08:56:41.085499723 +0000 UTC m=+0.392097497 container exec_died d6041e5471624a495d5be42684f6049ccf71802a80d5760239330151d465d146 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, com.redhat.component=openstack-ceilometer-compute-container, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-11-19T00:11:48Z, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, container_name=ceilometer_agent_compute, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, url=https://www.redhat.com, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, config_id=tripleo_step4, architecture=x86_64, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, release=1761123044, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, managed_by=tripleo_ansible, batch=17.1_20251118.1, konflux.additional-tags=17.1.12 17.1_20251118.1, name=rhosp17/openstack-ceilometer-compute, version=17.1.12, tcib_managed=true, io.buildah.version=1.41.4, vendor=Red Hat, Inc., config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'fcee5a4a91f85471fca7b61211375646'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute) Dec 15 03:56:41 localhost systemd[1]: d6041e5471624a495d5be42684f6049ccf71802a80d5760239330151d465d146.service: Deactivated successfully. Dec 15 03:56:41 localhost systemd[1]: tmp-crun.c2zMYF.mount: Deactivated successfully. Dec 15 03:56:43 localhost systemd[1]: Started /usr/bin/podman healthcheck run 4c3f871f4bdf287b1d3ce717956c1cf130617489217410eadf6fffcc440eb3b0. Dec 15 03:56:43 localhost sshd[101246]: main: sshd: ssh-rsa algorithm is disabled Dec 15 03:56:43 localhost systemd[1]: tmp-crun.URCUOI.mount: Deactivated successfully. 
Dec 15 03:56:43 localhost podman[101245]: 2025-12-15 08:56:43.753269639 +0000 UTC m=+0.087347895 container health_status 4c3f871f4bdf287b1d3ce717956c1cf130617489217410eadf6fffcc440eb3b0 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, build-date=2025-11-19T00:36:58Z, vendor=Red Hat, Inc., distribution-scope=public, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, url=https://www.redhat.com, release=1761123044, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '879500e96bf8dfb93687004bd86f2317'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, version=17.1.12, com.redhat.component=openstack-nova-compute-container, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.12 
17.1_20251118.1, io.openshift.expose-services=, config_id=tripleo_step4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, tcib_managed=true, io.buildah.version=1.41.4, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, container_name=nova_migration_target, batch=17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp17/openstack-nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team) Dec 15 03:56:44 localhost podman[101245]: 2025-12-15 08:56:44.153503562 +0000 UTC m=+0.487581868 container exec_died 4c3f871f4bdf287b1d3ce717956c1cf130617489217410eadf6fffcc440eb3b0 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, io.openshift.expose-services=, batch=17.1_20251118.1, release=1761123044, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, url=https://www.redhat.com, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '879500e96bf8dfb93687004bd86f2317'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', 
'/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, vcs-type=git, config_id=tripleo_step4, managed_by=tripleo_ansible, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.12, container_name=nova_migration_target, summary=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp17/openstack-nova-compute, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2025-11-19T00:36:58Z, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.component=openstack-nova-compute-container, vendor=Red Hat, Inc., io.buildah.version=1.41.4) Dec 15 03:56:44 localhost systemd[1]: 4c3f871f4bdf287b1d3ce717956c1cf130617489217410eadf6fffcc440eb3b0.service: Deactivated successfully. Dec 15 03:56:47 localhost systemd[1]: Started /usr/bin/podman healthcheck run 2df8d20c6ba32f38bd0e6519c9c77a4146b632f21c81197d8c319b223aad81f1. Dec 15 03:56:47 localhost systemd[1]: Started /usr/bin/podman healthcheck run 4d35b7dc3f3a4e3c60661e85d3511f50804e87244d74cab67af5c6e8c447d379. Dec 15 03:56:47 localhost systemd[1]: tmp-crun.e9QxLK.mount: Deactivated successfully. 
Dec 15 03:56:47 localhost podman[101272]: 2025-12-15 08:56:47.757387759 +0000 UTC m=+0.083682476 container health_status 4d35b7dc3f3a4e3c60661e85d3511f50804e87244d74cab67af5c6e8c447d379 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=unhealthy, build-date=2025-11-19T00:14:25Z, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, release=1761123044, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, container_name=ovn_metadata_agent, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.12, architecture=x86_64, name=rhosp17/openstack-neutron-metadata-agent-ovn, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'a56a6f14b467cd9064e40c03defa5ed7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, url=https://www.redhat.com, batch=17.1_20251118.1, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, tcib_managed=true, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-type=git, config_id=tripleo_step4) Dec 15 03:56:47 localhost podman[101271]: 2025-12-15 08:56:47.804753204 +0000 UTC m=+0.133641340 container health_status 2df8d20c6ba32f38bd0e6519c9c77a4146b632f21c81197d8c319b223aad81f1 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=unhealthy, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.component=openstack-ovn-controller-container, name=rhosp17/openstack-ovn-controller, io.openshift.expose-services=, batch=17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, url=https://www.redhat.com, vendor=Red Hat, Inc., io.buildah.version=1.41.4, config_id=tripleo_step4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.tags=rhosp osp openstack 
osp-17.1 openstack-ovn-controller, managed_by=tripleo_ansible, architecture=x86_64, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, summary=Red Hat OpenStack Platform 17.1 ovn-controller, description=Red Hat OpenStack Platform 17.1 ovn-controller, tcib_managed=true, vcs-type=git, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, distribution-scope=public, konflux.additional-tags=17.1.12 17.1_20251118.1, container_name=ovn_controller, build-date=2025-11-18T23:34:05Z, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, version=17.1.12, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1761123044) Dec 15 03:56:47 localhost podman[101272]: 2025-12-15 08:56:47.827811561 +0000 UTC m=+0.154106298 container exec_died 4d35b7dc3f3a4e3c60661e85d3511f50804e87244d74cab67af5c6e8c447d379 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, architecture=x86_64, batch=17.1_20251118.1, container_name=ovn_metadata_agent, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_id=tripleo_step4, version=17.1.12, io.openshift.expose-services=, 
tcib_managed=true, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, release=1761123044, konflux.additional-tags=17.1.12 17.1_20251118.1, name=rhosp17/openstack-neutron-metadata-agent-ovn, build-date=2025-11-19T00:14:25Z, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, vendor=Red Hat, Inc., io.buildah.version=1.41.4, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'a56a6f14b467cd9064e40c03defa5ed7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', 
'/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}) Dec 15 03:56:47 localhost podman[101272]: unhealthy Dec 15 03:56:47 localhost systemd[1]: 4d35b7dc3f3a4e3c60661e85d3511f50804e87244d74cab67af5c6e8c447d379.service: Main process exited, code=exited, status=1/FAILURE Dec 15 03:56:47 localhost systemd[1]: 4d35b7dc3f3a4e3c60661e85d3511f50804e87244d74cab67af5c6e8c447d379.service: Failed with result 'exit-code'. Dec 15 03:56:47 localhost podman[101271]: 2025-12-15 08:56:47.849468919 +0000 UTC m=+0.178357095 container exec_died 2df8d20c6ba32f38bd0e6519c9c77a4146b632f21c81197d8c319b223aad81f1 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, architecture=x86_64, version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, summary=Red Hat OpenStack Platform 17.1 ovn-controller, url=https://www.redhat.com, vcs-type=git, managed_by=tripleo_ansible, com.redhat.component=openstack-ovn-controller-container, name=rhosp17/openstack-ovn-controller, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, batch=17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1761123044, konflux.additional-tags=17.1.12 17.1_20251118.1, build-date=2025-11-18T23:34:05Z, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, description=Red Hat OpenStack Platform 17.1 ovn-controller, config_id=tripleo_step4, maintainer=OpenStack TripleO Team, container_name=ovn_controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 
6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, distribution-scope=public, io.openshift.expose-services=, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, tcib_managed=true, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller) Dec 15 03:56:47 localhost podman[101271]: unhealthy Dec 15 03:56:47 localhost systemd[1]: 2df8d20c6ba32f38bd0e6519c9c77a4146b632f21c81197d8c319b223aad81f1.service: Main process exited, code=exited, status=1/FAILURE Dec 15 03:56:47 localhost systemd[1]: 2df8d20c6ba32f38bd0e6519c9c77a4146b632f21c81197d8c319b223aad81f1.service: Failed with result 'exit-code'. Dec 15 03:56:56 localhost systemd[1]: Started /usr/bin/podman healthcheck run 6278d648aec9c9f12e0efaafa0d6ae5653eeb34459516699e562eeb9c8d23b3c. Dec 15 03:56:56 localhost systemd[1]: tmp-crun.BMxlLC.mount: Deactivated successfully. 
Dec 15 03:56:56 localhost podman[101308]: 2025-12-15 08:56:56.76348898 +0000 UTC m=+0.097825145 container health_status 6278d648aec9c9f12e0efaafa0d6ae5653eeb34459516699e562eeb9c8d23b3c (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, build-date=2025-11-18T22:49:46Z, io.buildah.version=1.41.4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 qdrouterd, config_id=tripleo_step1, name=rhosp17/openstack-qdrouterd, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.expose-services=, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, architecture=x86_64, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '29d07da103bbd44a9ed3e29999314b03'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', 
'/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, vcs-type=git, release=1761123044, summary=Red Hat OpenStack Platform 17.1 qdrouterd, distribution-scope=public, version=17.1.12, managed_by=tripleo_ansible, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, container_name=metrics_qdr, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, konflux.additional-tags=17.1.12 17.1_20251118.1, tcib_managed=true, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, batch=17.1_20251118.1, com.redhat.component=openstack-qdrouterd-container) Dec 15 03:56:56 localhost podman[101308]: 2025-12-15 08:56:56.957467353 +0000 UTC m=+0.291803518 container exec_died 6278d648aec9c9f12e0efaafa0d6ae5653eeb34459516699e562eeb9c8d23b3c (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 qdrouterd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, build-date=2025-11-18T22:49:46Z, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, url=https://www.redhat.com, distribution-scope=public, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-qdrouterd-container, summary=Red Hat OpenStack Platform 17.1 qdrouterd, konflux.additional-tags=17.1.12 17.1_20251118.1, container_name=metrics_qdr, io.openshift.expose-services=, release=1761123044, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.12, tcib_managed=true, io.buildah.version=1.41.4, batch=17.1_20251118.1, vendor=Red Hat, Inc., 
name=rhosp17/openstack-qdrouterd, config_id=tripleo_step1, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '29d07da103bbd44a9ed3e29999314b03'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}) Dec 15 03:56:56 localhost systemd[1]: 6278d648aec9c9f12e0efaafa0d6ae5653eeb34459516699e562eeb9c8d23b3c.service: Deactivated successfully. Dec 15 03:57:11 localhost systemd[1]: Started /usr/bin/podman healthcheck run 165a6fd1f2b629d16da0798ec1c9bac41d302d530a0ffa668b2315a52d6aa04e. Dec 15 03:57:11 localhost systemd[1]: Started /usr/bin/podman healthcheck run 2649c3aec9af2f04509e1d248b1050b202fccc3ed2b86421c89a2a65376db057. Dec 15 03:57:11 localhost systemd[1]: Started /usr/bin/podman healthcheck run 36a88083ed613f6871d2631ae462bca308a45ba583333f7cfe4d0b0c40a18ba5. Dec 15 03:57:11 localhost systemd[1]: Started /usr/bin/podman healthcheck run 97115ff7c5a84ae7e17f79e696e94da3c63f9fe0051287fe9193bec282a03f99. 
Dec 15 03:57:11 localhost systemd[1]: Started /usr/bin/podman healthcheck run ac45612fab357b615b4684dbfa5bd000dd29eb1adfbaedb6db0802fc49e89549. Dec 15 03:57:11 localhost systemd[1]: Started /usr/bin/podman healthcheck run d6041e5471624a495d5be42684f6049ccf71802a80d5760239330151d465d146. Dec 15 03:57:11 localhost podman[101337]: 2025-12-15 08:57:11.828468409 +0000 UTC m=+0.155322380 container health_status 165a6fd1f2b629d16da0798ec1c9bac41d302d530a0ffa668b2315a52d6aa04e (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, version=17.1.12, release=1761123044, 
vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, name=rhosp17/openstack-collectd, vendor=Red Hat, Inc., batch=17.1_20251118.1, config_id=tripleo_step3, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.buildah.version=1.41.4, architecture=x86_64, build-date=2025-11-18T22:51:28Z, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.expose-services=, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 collectd, description=Red Hat OpenStack Platform 17.1 collectd, com.redhat.component=openstack-collectd-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, url=https://www.redhat.com, distribution-scope=public, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.12 17.1_20251118.1, container_name=collectd) Dec 15 03:57:11 localhost systemd[1]: tmp-crun.GXU6Xx.mount: Deactivated successfully. 
Dec 15 03:57:11 localhost podman[101340]: 2025-12-15 08:57:11.881572498 +0000 UTC m=+0.199552162 container health_status 97115ff7c5a84ae7e17f79e696e94da3c63f9fe0051287fe9193bec282a03f99 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, vendor=Red Hat, Inc., konflux.additional-tags=17.1.12 17.1_20251118.1, tcib_managed=true, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.buildah.version=1.41.4, vcs-type=git, container_name=ceilometer_agent_ipmi, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.component=openstack-ceilometer-ipmi-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'fcee5a4a91f85471fca7b61211375646'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, 
io.openshift.expose-services=, release=1761123044, distribution-scope=public, build-date=2025-11-19T00:12:45Z, version=17.1.12, batch=17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_id=tripleo_step4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, architecture=x86_64, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, name=rhosp17/openstack-ceilometer-ipmi, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676) Dec 15 03:57:11 localhost podman[101338]: 2025-12-15 08:57:11.780616251 +0000 UTC m=+0.105314175 container health_status 2649c3aec9af2f04509e1d248b1050b202fccc3ed2b86421c89a2a65376db057 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, summary=Red Hat OpenStack Platform 17.1 iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '182e509007ab5e6e5b2500a552cbd5ba'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', 
'/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, batch=17.1_20251118.1, vcs-type=git, config_id=tripleo_step3, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.component=openstack-iscsid-container, release=1761123044, architecture=x86_64, container_name=iscsid, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., build-date=2025-11-18T23:44:13Z, io.buildah.version=1.41.4, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, tcib_managed=true, name=rhosp17/openstack-iscsid, version=17.1.12, io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, description=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com) Dec 15 03:57:11 localhost podman[101337]: 2025-12-15 08:57:11.890435954 +0000 UTC m=+0.217289985 container exec_died 165a6fd1f2b629d16da0798ec1c9bac41d302d530a0ffa668b2315a52d6aa04e (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, release=1761123044, build-date=2025-11-18T22:51:28Z, managed_by=tripleo_ansible, batch=17.1_20251118.1, config_id=tripleo_step3, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', 
'/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 collectd, name=rhosp17/openstack-collectd, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, vendor=Red Hat, Inc., url=https://www.redhat.com, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, summary=Red Hat OpenStack Platform 17.1 collectd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vcs-type=git, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.component=openstack-collectd-container, container_name=collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, io.buildah.version=1.41.4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, io.openshift.expose-services=, version=17.1.12, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, architecture=x86_64) Dec 
15 03:57:11 localhost systemd[1]: 165a6fd1f2b629d16da0798ec1c9bac41d302d530a0ffa668b2315a52d6aa04e.service: Deactivated successfully. Dec 15 03:57:11 localhost podman[101340]: 2025-12-15 08:57:11.909388311 +0000 UTC m=+0.227368015 container exec_died 97115ff7c5a84ae7e17f79e696e94da3c63f9fe0051287fe9193bec282a03f99 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, com.redhat.component=openstack-ceilometer-ipmi-container, io.openshift.expose-services=, tcib_managed=true, version=17.1.12, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, url=https://www.redhat.com, distribution-scope=public, config_id=tripleo_step4, release=1761123044, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-type=git, batch=17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, architecture=x86_64, build-date=2025-11-19T00:12:45Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=ceilometer_agent_ipmi, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, name=rhosp17/openstack-ceilometer-ipmi, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'fcee5a4a91f85471fca7b61211375646'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.buildah.version=1.41.4, konflux.additional-tags=17.1.12 17.1_20251118.1) Dec 15 03:57:11 localhost systemd[1]: 97115ff7c5a84ae7e17f79e696e94da3c63f9fe0051287fe9193bec282a03f99.service: Deactivated successfully. Dec 15 03:57:11 localhost podman[101346]: 2025-12-15 08:57:11.898449669 +0000 UTC m=+0.208065860 container health_status ac45612fab357b615b4684dbfa5bd000dd29eb1adfbaedb6db0802fc49e89549 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, name=rhosp17/openstack-cron, version=17.1.12, build-date=2025-11-18T22:49:32Z, com.redhat.component=openstack-cron-container, config_id=tripleo_step4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, tcib_managed=true, konflux.additional-tags=17.1.12 17.1_20251118.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, description=Red Hat OpenStack Platform 17.1 cron, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, managed_by=tripleo_ansible, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, release=1761123044, vcs-type=git, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 cron, io.buildah.version=1.41.4, architecture=x86_64, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 cron, container_name=logrotate_crond, batch=17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public) Dec 15 03:57:11 localhost podman[101338]: 2025-12-15 08:57:11.966576109 +0000 UTC m=+0.291274003 container exec_died 2649c3aec9af2f04509e1d248b1050b202fccc3ed2b86421c89a2a65376db057 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, batch=17.1_20251118.1, container_name=iscsid, release=1761123044, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 iscsid, vcs-type=git, maintainer=OpenStack TripleO Team, 
org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_id=tripleo_step3, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, tcib_managed=true, version=17.1.12, build-date=2025-11-18T23:44:13Z, io.buildah.version=1.41.4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, summary=Red Hat OpenStack Platform 17.1 iscsid, name=rhosp17/openstack-iscsid, architecture=x86_64, konflux.additional-tags=17.1.12 17.1_20251118.1, url=https://www.redhat.com, vendor=Red Hat, Inc., config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '182e509007ab5e6e5b2500a552cbd5ba'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, com.redhat.component=openstack-iscsid-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, 
managed_by=tripleo_ansible) Dec 15 03:57:11 localhost systemd[1]: 2649c3aec9af2f04509e1d248b1050b202fccc3ed2b86421c89a2a65376db057.service: Deactivated successfully. Dec 15 03:57:11 localhost podman[101339]: 2025-12-15 08:57:11.977684335 +0000 UTC m=+0.299188094 container health_status 36a88083ed613f6871d2631ae462bca308a45ba583333f7cfe4d0b0c40a18ba5 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, name=rhosp17/openstack-nova-compute, vendor=Red Hat, Inc., url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_id=tripleo_step5, description=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, release=1761123044, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2025-11-19T00:36:58Z, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, batch=17.1_20251118.1, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '182e509007ab5e6e5b2500a552cbd5ba-879500e96bf8dfb93687004bd86f2317'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, container_name=nova_compute, managed_by=tripleo_ansible, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team, vcs-type=git, io.buildah.version=1.41.4, distribution-scope=public, version=17.1.12, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-nova-compute-container) Dec 15 03:57:11 localhost podman[101346]: 2025-12-15 08:57:11.983372697 +0000 UTC m=+0.292988848 container exec_died ac45612fab357b615b4684dbfa5bd000dd29eb1adfbaedb6db0802fc49e89549 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, batch=17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, konflux.additional-tags=17.1.12 17.1_20251118.1, name=rhosp17/openstack-cron, managed_by=tripleo_ansible, 
cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, version=17.1.12, config_id=tripleo_step4, com.redhat.component=openstack-cron-container, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, summary=Red Hat OpenStack Platform 17.1 cron, io.openshift.expose-services=, release=1761123044, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 cron, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vcs-type=git, distribution-scope=public, vendor=Red Hat, Inc., io.buildah.version=1.41.4, maintainer=OpenStack TripleO Team, architecture=x86_64, build-date=2025-11-18T22:49:32Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, container_name=logrotate_crond) Dec 15 03:57:12 localhost systemd[1]: 
ac45612fab357b615b4684dbfa5bd000dd29eb1adfbaedb6db0802fc49e89549.service: Deactivated successfully. Dec 15 03:57:12 localhost podman[101339]: 2025-12-15 08:57:12.027524147 +0000 UTC m=+0.349027916 container exec_died 36a88083ed613f6871d2631ae462bca308a45ba583333f7cfe4d0b0c40a18ba5 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, release=1761123044, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, tcib_managed=true, distribution-scope=public, vcs-type=git, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, managed_by=tripleo_ansible, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, konflux.additional-tags=17.1.12 17.1_20251118.1, io.buildah.version=1.41.4, io.openshift.expose-services=, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '182e509007ab5e6e5b2500a552cbd5ba-879500e96bf8dfb93687004bd86f2317'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', 
'/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step5, container_name=nova_compute, description=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20251118.1, vendor=Red Hat, Inc., version=17.1.12, build-date=2025-11-19T00:36:58Z, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.component=openstack-nova-compute-container, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, name=rhosp17/openstack-nova-compute) Dec 15 03:57:12 localhost podman[101351]: 2025-12-15 08:57:12.035781768 +0000 UTC m=+0.346033127 container health_status d6041e5471624a495d5be42684f6049ccf71802a80d5760239330151d465d146 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, distribution-scope=public, tcib_managed=true, container_name=ceilometer_agent_compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.expose-services=, 
com.redhat.component=openstack-ceilometer-compute-container, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, architecture=x86_64, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'fcee5a4a91f85471fca7b61211375646'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, url=https://www.redhat.com, version=17.1.12, managed_by=tripleo_ansible, vendor=Red Hat, Inc., batch=17.1_20251118.1, name=rhosp17/openstack-ceilometer-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, build-date=2025-11-19T00:11:48Z, config_id=tripleo_step4, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1761123044, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, io.buildah.version=1.41.4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, maintainer=OpenStack TripleO Team, summary=Red 
Hat OpenStack Platform 17.1 ceilometer-compute, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676) Dec 15 03:57:12 localhost systemd[1]: 36a88083ed613f6871d2631ae462bca308a45ba583333f7cfe4d0b0c40a18ba5.service: Deactivated successfully. Dec 15 03:57:12 localhost podman[101351]: 2025-12-15 08:57:12.091453335 +0000 UTC m=+0.401704664 container exec_died d6041e5471624a495d5be42684f6049ccf71802a80d5760239330151d465d146 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, build-date=2025-11-19T00:11:48Z, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, container_name=ceilometer_agent_compute, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, name=rhosp17/openstack-ceilometer-compute, release=1761123044, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, tcib_managed=true, batch=17.1_20251118.1, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, com.redhat.component=openstack-ceilometer-compute-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_id=tripleo_step4, io.openshift.expose-services=, managed_by=tripleo_ansible, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'fcee5a4a91f85471fca7b61211375646'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, vendor=Red Hat, Inc., version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.buildah.version=1.41.4, maintainer=OpenStack TripleO Team, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public) Dec 15 03:57:12 localhost systemd[1]: d6041e5471624a495d5be42684f6049ccf71802a80d5760239330151d465d146.service: Deactivated successfully. Dec 15 03:57:14 localhost systemd[1]: Started /usr/bin/podman healthcheck run 4c3f871f4bdf287b1d3ce717956c1cf130617489217410eadf6fffcc440eb3b0. 
Dec 15 03:57:14 localhost podman[101502]: 2025-12-15 08:57:14.311347085 +0000 UTC m=+0.083948984 container health_status 4c3f871f4bdf287b1d3ce717956c1cf130617489217410eadf6fffcc440eb3b0 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.component=openstack-nova-compute-container, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, konflux.additional-tags=17.1.12 17.1_20251118.1, distribution-scope=public, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, maintainer=OpenStack TripleO Team, build-date=2025-11-19T00:36:58Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step4, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, release=1761123044, name=rhosp17/openstack-nova-compute, io.openshift.expose-services=, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, vcs-type=git, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, architecture=x86_64, batch=17.1_20251118.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '879500e96bf8dfb93687004bd86f2317'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, io.buildah.version=1.41.4, container_name=nova_migration_target, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, url=https://www.redhat.com, version=17.1.12) Dec 15 03:57:14 localhost podman[101502]: 2025-12-15 08:57:14.696522467 +0000 UTC m=+0.469124366 container exec_died 4c3f871f4bdf287b1d3ce717956c1cf130617489217410eadf6fffcc440eb3b0 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '879500e96bf8dfb93687004bd86f2317'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', 
'/var/lib/nova:/var/lib/nova:shared']}, build-date=2025-11-19T00:36:58Z, container_name=nova_migration_target, name=rhosp17/openstack-nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, version=17.1.12, konflux.additional-tags=17.1.12 17.1_20251118.1, distribution-scope=public, io.buildah.version=1.41.4, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, release=1761123044, architecture=x86_64, vendor=Red Hat, Inc., vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20251118.1, config_id=tripleo_step4, io.openshift.expose-services=, managed_by=tripleo_ansible, com.redhat.component=openstack-nova-compute-container, description=Red Hat OpenStack Platform 17.1 nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d) Dec 15 03:57:14 localhost systemd[1]: 4c3f871f4bdf287b1d3ce717956c1cf130617489217410eadf6fffcc440eb3b0.service: Deactivated successfully. Dec 15 03:57:18 localhost systemd[1]: Started /usr/bin/podman healthcheck run 2df8d20c6ba32f38bd0e6519c9c77a4146b632f21c81197d8c319b223aad81f1. Dec 15 03:57:18 localhost systemd[1]: Started /usr/bin/podman healthcheck run 4d35b7dc3f3a4e3c60661e85d3511f50804e87244d74cab67af5c6e8c447d379. Dec 15 03:57:18 localhost systemd[1]: Starting Check and recover tripleo_nova_virtqemud... Dec 15 03:57:18 localhost recover_tripleo_nova_virtqemud[101636]: 61849 Dec 15 03:57:18 localhost systemd[1]: tripleo_nova_virtqemud_recover.service: Deactivated successfully. 
Dec 15 03:57:18 localhost systemd[1]: Finished Check and recover tripleo_nova_virtqemud. Dec 15 03:57:18 localhost podman[101623]: 2025-12-15 08:57:18.755653736 +0000 UTC m=+0.081066147 container health_status 2df8d20c6ba32f38bd0e6519c9c77a4146b632f21c81197d8c319b223aad81f1 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=unhealthy, build-date=2025-11-18T23:34:05Z, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, description=Red Hat OpenStack Platform 17.1 ovn-controller, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, release=1761123044, io.openshift.expose-services=, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, tcib_managed=true, url=https://www.redhat.com, batch=17.1_20251118.1, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, vendor=Red Hat, Inc., com.redhat.component=openstack-ovn-controller-container, konflux.additional-tags=17.1.12 17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 ovn-controller, container_name=ovn_controller, io.buildah.version=1.41.4, maintainer=OpenStack TripleO Team, distribution-scope=public, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, config_id=tripleo_step4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, version=17.1.12, 
cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, name=rhosp17/openstack-ovn-controller, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI) Dec 15 03:57:18 localhost podman[101623]: 2025-12-15 08:57:18.802281991 +0000 UTC m=+0.127694362 container exec_died 2df8d20c6ba32f38bd0e6519c9c77a4146b632f21c81197d8c319b223aad81f1 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, vendor=Red Hat, Inc., container_name=ovn_controller, version=17.1.12, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, konflux.additional-tags=17.1.12 17.1_20251118.1, release=1761123044, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, url=https://www.redhat.com, config_id=tripleo_step4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.expose-services=, vcs-type=git, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 ovn-controller, maintainer=OpenStack TripleO Team, architecture=x86_64, com.redhat.component=openstack-ovn-controller-container, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, tcib_managed=true, build-date=2025-11-18T23:34:05Z, summary=Red Hat OpenStack Platform 17.1 ovn-controller, batch=17.1_20251118.1, io.buildah.version=1.41.4, managed_by=tripleo_ansible, name=rhosp17/openstack-ovn-controller) Dec 15 03:57:18 localhost podman[101623]: unhealthy Dec 15 03:57:18 localhost systemd[1]: 2df8d20c6ba32f38bd0e6519c9c77a4146b632f21c81197d8c319b223aad81f1.service: Main process exited, code=exited, status=1/FAILURE Dec 15 03:57:18 localhost podman[101624]: 2025-12-15 08:57:18.811651712 +0000 UTC m=+0.135822450 container health_status 4d35b7dc3f3a4e3c60661e85d3511f50804e87244d74cab67af5c6e8c447d379 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=unhealthy, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'a56a6f14b467cd9064e40c03defa5ed7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, config_id=tripleo_step4, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, managed_by=tripleo_ansible, batch=17.1_20251118.1, architecture=x86_64, maintainer=OpenStack TripleO Team, version=17.1.12, distribution-scope=public, konflux.additional-tags=17.1.12 17.1_20251118.1, name=rhosp17/openstack-neutron-metadata-agent-ovn, io.openshift.expose-services=, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, container_name=ovn_metadata_agent, io.buildah.version=1.41.4, vcs-type=git, build-date=2025-11-19T00:14:25Z, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, url=https://www.redhat.com, tcib_managed=true, release=1761123044, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI) Dec 15 03:57:18 localhost systemd[1]: 2df8d20c6ba32f38bd0e6519c9c77a4146b632f21c81197d8c319b223aad81f1.service: Failed with result 'exit-code'. 
Dec 15 03:57:18 localhost podman[101624]: 2025-12-15 08:57:18.847650184 +0000 UTC m=+0.171820942 container exec_died 4d35b7dc3f3a4e3c60661e85d3511f50804e87244d74cab67af5c6e8c447d379 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'a56a6f14b467cd9064e40c03defa5ed7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, version=17.1.12, managed_by=tripleo_ansible, url=https://www.redhat.com, build-date=2025-11-19T00:14:25Z, konflux.additional-tags=17.1.12 17.1_20251118.1, io.buildah.version=1.41.4, release=1761123044, distribution-scope=public, 
container_name=ovn_metadata_agent, config_id=tripleo_step4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, maintainer=OpenStack TripleO Team, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-type=git, batch=17.1_20251118.1, name=rhosp17/openstack-neutron-metadata-agent-ovn, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, architecture=x86_64, io.openshift.expose-services=, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c) Dec 15 03:57:18 localhost podman[101624]: unhealthy Dec 15 03:57:18 localhost systemd[1]: 4d35b7dc3f3a4e3c60661e85d3511f50804e87244d74cab67af5c6e8c447d379.service: Main process exited, code=exited, status=1/FAILURE Dec 15 03:57:18 localhost systemd[1]: 4d35b7dc3f3a4e3c60661e85d3511f50804e87244d74cab67af5c6e8c447d379.service: Failed with result 'exit-code'. Dec 15 03:57:27 localhost systemd[1]: Started /usr/bin/podman healthcheck run 6278d648aec9c9f12e0efaafa0d6ae5653eeb34459516699e562eeb9c8d23b3c. Dec 15 03:57:27 localhost systemd[1]: tmp-crun.3DfSib.mount: Deactivated successfully. 
Dec 15 03:57:27 localhost podman[101663]: 2025-12-15 08:57:27.777540729 +0000 UTC m=+0.110666537 container health_status 6278d648aec9c9f12e0efaafa0d6ae5653eeb34459516699e562eeb9c8d23b3c (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, distribution-scope=public, maintainer=OpenStack TripleO Team, vcs-type=git, tcib_managed=true, io.openshift.expose-services=, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vendor=Red Hat, Inc., version=17.1.12, release=1761123044, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 qdrouterd, managed_by=tripleo_ansible, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, url=https://www.redhat.com, batch=17.1_20251118.1, container_name=metrics_qdr, io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '29d07da103bbd44a9ed3e29999314b03'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, description=Red Hat OpenStack Platform 17.1 qdrouterd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_id=tripleo_step1, build-date=2025-11-18T22:49:46Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.component=openstack-qdrouterd-container, name=rhosp17/openstack-qdrouterd) Dec 15 03:57:27 localhost podman[101663]: 2025-12-15 08:57:27.990452967 +0000 UTC m=+0.323578755 container exec_died 6278d648aec9c9f12e0efaafa0d6ae5653eeb34459516699e562eeb9c8d23b3c (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, name=rhosp17/openstack-qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '29d07da103bbd44a9ed3e29999314b03'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', 
'/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, vcs-type=git, release=1761123044, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, container_name=metrics_qdr, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-11-18T22:49:46Z, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, url=https://www.redhat.com, config_id=tripleo_step1, com.redhat.component=openstack-qdrouterd-container, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.12 17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, batch=17.1_20251118.1, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 qdrouterd, vendor=Red Hat, Inc., version=17.1.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, description=Red Hat OpenStack Platform 17.1 qdrouterd, architecture=x86_64, managed_by=tripleo_ansible, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd) Dec 15 03:57:28 localhost systemd[1]: 6278d648aec9c9f12e0efaafa0d6ae5653eeb34459516699e562eeb9c8d23b3c.service: Deactivated successfully. Dec 15 03:57:42 localhost systemd[1]: Started /usr/bin/podman healthcheck run 165a6fd1f2b629d16da0798ec1c9bac41d302d530a0ffa668b2315a52d6aa04e. Dec 15 03:57:42 localhost systemd[1]: Started /usr/bin/podman healthcheck run 2649c3aec9af2f04509e1d248b1050b202fccc3ed2b86421c89a2a65376db057. Dec 15 03:57:42 localhost systemd[1]: Started /usr/bin/podman healthcheck run 36a88083ed613f6871d2631ae462bca308a45ba583333f7cfe4d0b0c40a18ba5. Dec 15 03:57:42 localhost systemd[1]: Started /usr/bin/podman healthcheck run 97115ff7c5a84ae7e17f79e696e94da3c63f9fe0051287fe9193bec282a03f99. 
Dec 15 03:57:42 localhost systemd[1]: Started /usr/bin/podman healthcheck run ac45612fab357b615b4684dbfa5bd000dd29eb1adfbaedb6db0802fc49e89549. Dec 15 03:57:42 localhost systemd[1]: Started /usr/bin/podman healthcheck run d6041e5471624a495d5be42684f6049ccf71802a80d5760239330151d465d146. Dec 15 03:57:42 localhost podman[101693]: 2025-12-15 08:57:42.781140707 +0000 UTC m=+0.106467946 container health_status 165a6fd1f2b629d16da0798ec1c9bac41d302d530a0ffa668b2315a52d6aa04e (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, config_id=tripleo_step3, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, release=1761123044, build-date=2025-11-18T22:51:28Z, io.openshift.expose-services=, url=https://www.redhat.com, container_name=collectd, name=rhosp17/openstack-collectd, batch=17.1_20251118.1, version=17.1.12, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.buildah.version=1.41.4, description=Red Hat OpenStack Platform 17.1 collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 collectd, distribution-scope=public, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 
'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, com.redhat.component=openstack-collectd-container, architecture=x86_64, managed_by=tripleo_ansible, konflux.additional-tags=17.1.12 17.1_20251118.1) Dec 15 03:57:42 localhost podman[101694]: 2025-12-15 08:57:42.750752164 +0000 UTC m=+0.070402022 container health_status 2649c3aec9af2f04509e1d248b1050b202fccc3ed2b86421c89a2a65376db057 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, description=Red Hat OpenStack Platform 17.1 iscsid, batch=17.1_20251118.1, release=1761123044, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp17/openstack-iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '182e509007ab5e6e5b2500a552cbd5ba'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 
'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, managed_by=tripleo_ansible, vcs-type=git, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 17.1_20251118.1, config_id=tripleo_step3, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, url=https://www.redhat.com, version=17.1.12, build-date=2025-11-18T23:44:13Z, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, container_name=iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vendor=Red Hat, Inc., io.openshift.expose-services=, io.buildah.version=1.41.4, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, architecture=x86_64, com.redhat.component=openstack-iscsid-container, tcib_managed=true, maintainer=OpenStack TripleO Team) Dec 15 03:57:42 localhost podman[101694]: 2025-12-15 08:57:42.83104235 +0000 UTC m=+0.150692247 container exec_died 2649c3aec9af2f04509e1d248b1050b202fccc3ed2b86421c89a2a65376db057 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, io.buildah.version=1.41.4, 
cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, container_name=iscsid, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.expose-services=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '182e509007ab5e6e5b2500a552cbd5ba'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, managed_by=tripleo_ansible, url=https://www.redhat.com, batch=17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, name=rhosp17/openstack-iscsid, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, distribution-scope=public, architecture=x86_64, build-date=2025-11-18T23:44:13Z, com.redhat.component=openstack-iscsid-container, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, config_id=tripleo_step3, description=Red Hat OpenStack Platform 17.1 iscsid, tcib_managed=true, vcs-type=git, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, version=17.1.12, release=1761123044, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc.) Dec 15 03:57:42 localhost systemd[1]: 2649c3aec9af2f04509e1d248b1050b202fccc3ed2b86421c89a2a65376db057.service: Deactivated successfully. Dec 15 03:57:42 localhost podman[101704]: 2025-12-15 08:57:42.837138013 +0000 UTC m=+0.142866088 container health_status ac45612fab357b615b4684dbfa5bd000dd29eb1adfbaedb6db0802fc49e89549 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, version=17.1.12, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 cron, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, io.openshift.tags=rhosp osp openstack osp-17.1 
openstack-cron, name=rhosp17/openstack-cron, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, release=1761123044, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, batch=17.1_20251118.1, konflux.additional-tags=17.1.12 17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.expose-services=, com.redhat.component=openstack-cron-container, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_id=tripleo_step4, io.buildah.version=1.41.4, distribution-scope=public, build-date=2025-11-18T22:49:32Z, architecture=x86_64, container_name=logrotate_crond, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 cron, tcib_managed=true, vcs-type=git) Dec 15 03:57:42 localhost podman[101695]: 2025-12-15 08:57:42.908635433 +0000 UTC m=+0.211799770 container health_status 36a88083ed613f6871d2631ae462bca308a45ba583333f7cfe4d0b0c40a18ba5 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, com.redhat.component=openstack-nova-compute-container, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, architecture=x86_64, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, distribution-scope=public, container_name=nova_compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, version=17.1.12, batch=17.1_20251118.1, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '182e509007ab5e6e5b2500a552cbd5ba-879500e96bf8dfb93687004bd86f2317'}, 'healthcheck': 
{'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, io.buildah.version=1.41.4, url=https://www.redhat.com, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.openshift.expose-services=, config_id=tripleo_step5, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, release=1761123044, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, konflux.additional-tags=17.1.12 
17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, build-date=2025-11-19T00:36:58Z, tcib_managed=true, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-nova-compute, vcs-type=git, description=Red Hat OpenStack Platform 17.1 nova-compute) Dec 15 03:57:42 localhost podman[101708]: 2025-12-15 08:57:42.811300413 +0000 UTC m=+0.109513257 container health_status d6041e5471624a495d5be42684f6049ccf71802a80d5760239330151d465d146 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, managed_by=tripleo_ansible, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'fcee5a4a91f85471fca7b61211375646'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, maintainer=OpenStack TripleO Team, 
io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, batch=17.1_20251118.1, release=1761123044, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, container_name=ceilometer_agent_compute, build-date=2025-11-19T00:11:48Z, config_id=tripleo_step4, io.buildah.version=1.41.4, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-ceilometer-compute-container, version=17.1.12, architecture=x86_64, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, url=https://www.redhat.com, tcib_managed=true, vendor=Red Hat, Inc., name=rhosp17/openstack-ceilometer-compute, distribution-scope=public, vcs-type=git) Dec 15 03:57:42 localhost podman[101695]: 2025-12-15 08:57:42.939860618 +0000 UTC m=+0.243024895 container exec_died 36a88083ed613f6871d2631ae462bca308a45ba583333f7cfe4d0b0c40a18ba5 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, io.openshift.expose-services=, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, distribution-scope=public, maintainer=OpenStack TripleO Team, tcib_managed=true, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '182e509007ab5e6e5b2500a552cbd5ba-879500e96bf8dfb93687004bd86f2317'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', 
'/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, url=https://www.redhat.com, com.redhat.component=openstack-nova-compute-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, container_name=nova_compute, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-19T00:36:58Z, config_id=tripleo_step5, summary=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20251118.1, io.buildah.version=1.41.4, 
managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, name=rhosp17/openstack-nova-compute, version=17.1.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, release=1761123044, vendor=Red Hat, Inc.) Dec 15 03:57:42 localhost systemd[1]: 36a88083ed613f6871d2631ae462bca308a45ba583333f7cfe4d0b0c40a18ba5.service: Deactivated successfully. Dec 15 03:57:42 localhost podman[101701]: 2025-12-15 08:57:42.960797146 +0000 UTC m=+0.273366134 container health_status 97115ff7c5a84ae7e17f79e696e94da3c63f9fe0051287fe9193bec282a03f99 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'fcee5a4a91f85471fca7b61211375646'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, container_name=ceilometer_agent_ipmi, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, version=17.1.12, managed_by=tripleo_ansible, com.redhat.component=openstack-ceilometer-ipmi-container, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vendor=Red Hat, Inc., url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, architecture=x86_64, vcs-type=git, name=rhosp17/openstack-ceilometer-ipmi, build-date=2025-11-19T00:12:45Z, batch=17.1_20251118.1, io.buildah.version=1.41.4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_id=tripleo_step4, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, release=1761123044, konflux.additional-tags=17.1.12 17.1_20251118.1) Dec 15 03:57:42 localhost podman[101693]: 2025-12-15 08:57:42.961460484 +0000 UTC m=+0.286787723 container exec_died 165a6fd1f2b629d16da0798ec1c9bac41d302d530a0ffa668b2315a52d6aa04e (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, architecture=x86_64, release=1761123044, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, tcib_managed=true, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.buildah.version=1.41.4, build-date=2025-11-18T22:51:28Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, config_id=tripleo_step3, description=Red Hat OpenStack Platform 17.1 collectd, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, name=rhosp17/openstack-collectd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, container_name=collectd, managed_by=tripleo_ansible, vendor=Red Hat, Inc., version=17.1.12, konflux.additional-tags=17.1.12 17.1_20251118.1, batch=17.1_20251118.1, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, vcs-type=git, distribution-scope=public, com.redhat.component=openstack-collectd-container, summary=Red Hat OpenStack Platform 17.1 collectd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI) Dec 15 03:57:42 localhost podman[101704]: 2025-12-15 08:57:42.97101422 +0000 UTC m=+0.276742295 container exec_died ac45612fab357b615b4684dbfa5bd000dd29eb1adfbaedb6db0802fc49e89549 
(image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 cron, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, release=1761123044, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-cron-container, config_id=tripleo_step4, konflux.additional-tags=17.1.12 17.1_20251118.1, name=rhosp17/openstack-cron, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-type=git, io.openshift.expose-services=, architecture=x86_64, vendor=Red Hat, Inc., batch=17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 cron, managed_by=tripleo_ansible, version=17.1.12, url=https://www.redhat.com, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, container_name=logrotate_crond, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, build-date=2025-11-18T22:49:32Z, io.k8s.description=Red Hat OpenStack Platform 17.1 cron) Dec 15 03:57:42 localhost systemd[1]: 165a6fd1f2b629d16da0798ec1c9bac41d302d530a0ffa668b2315a52d6aa04e.service: Deactivated successfully. Dec 15 03:57:43 localhost systemd[1]: ac45612fab357b615b4684dbfa5bd000dd29eb1adfbaedb6db0802fc49e89549.service: Deactivated successfully. Dec 15 03:57:43 localhost podman[101701]: 2025-12-15 08:57:43.044470622 +0000 UTC m=+0.357039590 container exec_died 97115ff7c5a84ae7e17f79e696e94da3c63f9fe0051287fe9193bec282a03f99 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, batch=17.1_20251118.1, io.buildah.version=1.41.4, url=https://www.redhat.com, com.redhat.component=openstack-ceilometer-ipmi-container, version=17.1.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, managed_by=tripleo_ansible, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, release=1761123044, config_id=tripleo_step4, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, build-date=2025-11-19T00:12:45Z, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'fcee5a4a91f85471fca7b61211375646'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.12 17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, tcib_managed=true, architecture=x86_64, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, maintainer=OpenStack TripleO Team, vcs-type=git, container_name=ceilometer_agent_ipmi, io.openshift.expose-services=, name=rhosp17/openstack-ceilometer-ipmi) Dec 15 03:57:43 localhost systemd[1]: 97115ff7c5a84ae7e17f79e696e94da3c63f9fe0051287fe9193bec282a03f99.service: Deactivated successfully. 
Dec 15 03:57:43 localhost podman[101708]: 2025-12-15 08:57:43.092543436 +0000 UTC m=+0.390756290 container exec_died d6041e5471624a495d5be42684f6049ccf71802a80d5760239330151d465d146 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, com.redhat.component=openstack-ceilometer-compute-container, release=1761123044, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.12, build-date=2025-11-19T00:11:48Z, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, config_id=tripleo_step4, io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.buildah.version=1.41.4, container_name=ceilometer_agent_compute, tcib_managed=true, batch=17.1_20251118.1, vcs-type=git, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-ceilometer-compute, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'fcee5a4a91f85471fca7b61211375646'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', 
'/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}) Dec 15 03:57:43 localhost systemd[1]: d6041e5471624a495d5be42684f6049ccf71802a80d5760239330151d465d146.service: Deactivated successfully. Dec 15 03:57:45 localhost systemd[1]: Started /usr/bin/podman healthcheck run 4c3f871f4bdf287b1d3ce717956c1cf130617489217410eadf6fffcc440eb3b0. Dec 15 03:57:45 localhost systemd[1]: tmp-crun.de295P.mount: Deactivated successfully. 
Dec 15 03:57:45 localhost podman[101827]: 2025-12-15 08:57:45.755119964 +0000 UTC m=+0.087485319 container health_status 4c3f871f4bdf287b1d3ce717956c1cf130617489217410eadf6fffcc440eb3b0 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, build-date=2025-11-19T00:36:58Z, maintainer=OpenStack TripleO Team, batch=17.1_20251118.1, config_id=tripleo_step4, tcib_managed=true, konflux.additional-tags=17.1.12 17.1_20251118.1, container_name=nova_migration_target, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, summary=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 nova-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '879500e96bf8dfb93687004bd86f2317'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, vendor=Red Hat, Inc., architecture=x86_64, name=rhosp17/openstack-nova-compute, url=https://www.redhat.com, com.redhat.component=openstack-nova-compute-container, release=1761123044, io.openshift.expose-services=, io.buildah.version=1.41.4, managed_by=tripleo_ansible) Dec 15 03:57:46 localhost podman[101827]: 2025-12-15 08:57:46.134709946 +0000 UTC m=+0.467075251 container exec_died 4c3f871f4bdf287b1d3ce717956c1cf130617489217410eadf6fffcc440eb3b0 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.component=openstack-nova-compute-container, version=17.1.12, description=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2025-11-19T00:36:58Z, release=1761123044, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, name=rhosp17/openstack-nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, distribution-scope=public, tcib_managed=true, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '879500e96bf8dfb93687004bd86f2317'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, vcs-type=git, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, managed_by=tripleo_ansible, batch=17.1_20251118.1, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.buildah.version=1.41.4, config_id=tripleo_step4, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=nova_migration_target, url=https://www.redhat.com) Dec 15 03:57:46 localhost systemd[1]: 4c3f871f4bdf287b1d3ce717956c1cf130617489217410eadf6fffcc440eb3b0.service: Deactivated successfully. Dec 15 03:57:49 localhost systemd[1]: Started /usr/bin/podman healthcheck run 2df8d20c6ba32f38bd0e6519c9c77a4146b632f21c81197d8c319b223aad81f1. Dec 15 03:57:49 localhost systemd[1]: Started /usr/bin/podman healthcheck run 4d35b7dc3f3a4e3c60661e85d3511f50804e87244d74cab67af5c6e8c447d379. 
Dec 15 03:57:49 localhost podman[101850]: 2025-12-15 08:57:49.759714786 +0000 UTC m=+0.082543426 container health_status 2df8d20c6ba32f38bd0e6519c9c77a4146b632f21c81197d8c319b223aad81f1 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=unhealthy, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, release=1761123044, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, url=https://www.redhat.com, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, tcib_managed=true, container_name=ovn_controller, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, konflux.additional-tags=17.1.12 17.1_20251118.1, batch=17.1_20251118.1, io.buildah.version=1.41.4, maintainer=OpenStack TripleO Team, config_id=tripleo_step4, name=rhosp17/openstack-ovn-controller, build-date=2025-11-18T23:34:05Z, description=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.component=openstack-ovn-controller-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, version=17.1.12, vendor=Red Hat, Inc., managed_by=tripleo_ansible, architecture=x86_64, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', 
'/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, summary=Red Hat OpenStack Platform 17.1 ovn-controller) Dec 15 03:57:49 localhost podman[101850]: 2025-12-15 08:57:49.800489166 +0000 UTC m=+0.123317766 container exec_died 2df8d20c6ba32f38bd0e6519c9c77a4146b632f21c81197d8c319b223aad81f1 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, name=rhosp17/openstack-ovn-controller, vcs-type=git, release=1761123044, maintainer=OpenStack TripleO Team, tcib_managed=true, io.buildah.version=1.41.4, com.redhat.component=openstack-ovn-controller-container, batch=17.1_20251118.1, version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, io.openshift.expose-services=, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, config_id=tripleo_step4, summary=Red Hat OpenStack Platform 17.1 ovn-controller, container_name=ovn_controller, description=Red Hat OpenStack Platform 17.1 ovn-controller, url=https://www.redhat.com, vendor=Red Hat, Inc., architecture=x86_64, 
managed_by=tripleo_ansible, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, build-date=2025-11-18T23:34:05Z) Dec 15 03:57:49 localhost systemd[1]: tmp-crun.LxBITg.mount: Deactivated successfully. Dec 15 03:57:49 localhost podman[101851]: 2025-12-15 08:57:49.827537718 +0000 UTC m=+0.148415236 container health_status 4d35b7dc3f3a4e3c60661e85d3511f50804e87244d74cab67af5c6e8c447d379 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=unhealthy, batch=17.1_20251118.1, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, build-date=2025-11-19T00:14:25Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, tcib_managed=true, konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, version=17.1.12, managed_by=tripleo_ansible, release=1761123044, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, name=rhosp17/openstack-neutron-metadata-agent-ovn, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.buildah.version=1.41.4, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.expose-services=, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, 
container_name=ovn_metadata_agent, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'a56a6f14b467cd9064e40c03defa5ed7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05) Dec 15 03:57:49 localhost podman[101850]: unhealthy Dec 15 03:57:49 localhost systemd[1]: 2df8d20c6ba32f38bd0e6519c9c77a4146b632f21c81197d8c319b223aad81f1.service: Main process exited, code=exited, status=1/FAILURE Dec 15 03:57:49 localhost systemd[1]: 2df8d20c6ba32f38bd0e6519c9c77a4146b632f21c81197d8c319b223aad81f1.service: Failed with result 'exit-code'. 
Dec 15 03:57:49 localhost podman[101851]: 2025-12-15 08:57:49.869422787 +0000 UTC m=+0.190300315 container exec_died 4d35b7dc3f3a4e3c60661e85d3511f50804e87244d74cab67af5c6e8c447d379 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-type=git, version=17.1.12, konflux.additional-tags=17.1.12 17.1_20251118.1, io.buildah.version=1.41.4, io.openshift.expose-services=, batch=17.1_20251118.1, url=https://www.redhat.com, config_id=tripleo_step4, container_name=ovn_metadata_agent, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, architecture=x86_64, name=rhosp17/openstack-neutron-metadata-agent-ovn, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, distribution-scope=public, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, vendor=Red Hat, Inc., managed_by=tripleo_ansible, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'a56a6f14b467cd9064e40c03defa5ed7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, build-date=2025-11-19T00:14:25Z, release=1761123044, maintainer=OpenStack TripleO Team) Dec 15 03:57:49 localhost podman[101851]: unhealthy Dec 15 03:57:49 localhost systemd[1]: 4d35b7dc3f3a4e3c60661e85d3511f50804e87244d74cab67af5c6e8c447d379.service: Main process exited, code=exited, status=1/FAILURE Dec 15 03:57:49 localhost systemd[1]: 4d35b7dc3f3a4e3c60661e85d3511f50804e87244d74cab67af5c6e8c447d379.service: Failed with result 'exit-code'. Dec 15 03:57:59 localhost systemd[1]: Started /usr/bin/podman healthcheck run 6278d648aec9c9f12e0efaafa0d6ae5653eeb34459516699e562eeb9c8d23b3c. 
Dec 15 03:57:59 localhost podman[101891]: 2025-12-15 08:57:59.100289173 +0000 UTC m=+0.083728617 container health_status 6278d648aec9c9f12e0efaafa0d6ae5653eeb34459516699e562eeb9c8d23b3c (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '29d07da103bbd44a9ed3e29999314b03'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, version=17.1.12, io.openshift.expose-services=, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, batch=17.1_20251118.1, release=1761123044, architecture=x86_64, config_id=tripleo_step1, container_name=metrics_qdr, summary=Red Hat OpenStack Platform 17.1 qdrouterd, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.description=Red Hat 
OpenStack Platform 17.1 qdrouterd, build-date=2025-11-18T22:49:46Z, vendor=Red Hat, Inc., managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.4, name=rhosp17/openstack-qdrouterd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.component=openstack-qdrouterd-container, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, distribution-scope=public) Dec 15 03:57:59 localhost podman[101891]: 2025-12-15 08:57:59.332777965 +0000 UTC m=+0.316217369 container exec_died 6278d648aec9c9f12e0efaafa0d6ae5653eeb34459516699e562eeb9c8d23b3c (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, konflux.additional-tags=17.1.12 17.1_20251118.1, config_id=tripleo_step1, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-type=git, vendor=Red Hat, Inc., version=17.1.12, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, build-date=2025-11-18T22:49:46Z, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '29d07da103bbd44a9ed3e29999314b03'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', 
'/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, io.openshift.expose-services=, name=rhosp17/openstack-qdrouterd, container_name=metrics_qdr, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 qdrouterd, io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, release=1761123044, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, batch=17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, architecture=x86_64, com.redhat.component=openstack-qdrouterd-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI) Dec 15 03:57:59 localhost systemd[1]: 6278d648aec9c9f12e0efaafa0d6ae5653eeb34459516699e562eeb9c8d23b3c.service: Deactivated successfully. Dec 15 03:58:13 localhost systemd[1]: Started /usr/bin/podman healthcheck run 165a6fd1f2b629d16da0798ec1c9bac41d302d530a0ffa668b2315a52d6aa04e. Dec 15 03:58:13 localhost systemd[1]: Started /usr/bin/podman healthcheck run 2649c3aec9af2f04509e1d248b1050b202fccc3ed2b86421c89a2a65376db057. Dec 15 03:58:13 localhost systemd[1]: Started /usr/bin/podman healthcheck run 36a88083ed613f6871d2631ae462bca308a45ba583333f7cfe4d0b0c40a18ba5. Dec 15 03:58:13 localhost systemd[1]: Started /usr/bin/podman healthcheck run 97115ff7c5a84ae7e17f79e696e94da3c63f9fe0051287fe9193bec282a03f99. 
Dec 15 03:58:13 localhost systemd[1]: Started /usr/bin/podman healthcheck run ac45612fab357b615b4684dbfa5bd000dd29eb1adfbaedb6db0802fc49e89549. Dec 15 03:58:13 localhost systemd[1]: Started /usr/bin/podman healthcheck run d6041e5471624a495d5be42684f6049ccf71802a80d5760239330151d465d146. Dec 15 03:58:13 localhost podman[101922]: 2025-12-15 08:58:13.764918734 +0000 UTC m=+0.090497669 container health_status 2649c3aec9af2f04509e1d248b1050b202fccc3ed2b86421c89a2a65376db057 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, release=1761123044, summary=Red Hat OpenStack Platform 17.1 iscsid, batch=17.1_20251118.1, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '182e509007ab5e6e5b2500a552cbd5ba'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, 
managed_by=tripleo_ansible, name=rhosp17/openstack-iscsid, config_id=tripleo_step3, vcs-type=git, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, tcib_managed=true, io.openshift.expose-services=, container_name=iscsid, vendor=Red Hat, Inc., distribution-scope=public, konflux.additional-tags=17.1.12 17.1_20251118.1, version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.component=openstack-iscsid-container, build-date=2025-11-18T23:44:13Z, io.buildah.version=1.41.4) Dec 15 03:58:13 localhost podman[101922]: 2025-12-15 08:58:13.771834019 +0000 UTC m=+0.097412964 container exec_died 2649c3aec9af2f04509e1d248b1050b202fccc3ed2b86421c89a2a65376db057 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, batch=17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 iscsid, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, name=rhosp17/openstack-iscsid, url=https://www.redhat.com, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.component=openstack-iscsid-container, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '182e509007ab5e6e5b2500a552cbd5ba'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, summary=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.expose-services=, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, config_id=tripleo_step3, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.12, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, container_name=iscsid, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, release=1761123044, vcs-type=git, vendor=Red Hat, Inc., build-date=2025-11-18T23:44:13Z, tcib_managed=true) Dec 15 03:58:13 localhost systemd[1]: 2649c3aec9af2f04509e1d248b1050b202fccc3ed2b86421c89a2a65376db057.service: Deactivated successfully. 
Dec 15 03:58:13 localhost podman[101921]: 2025-12-15 08:58:13.822085162 +0000 UTC m=+0.152271560 container health_status 165a6fd1f2b629d16da0798ec1c9bac41d302d530a0ffa668b2315a52d6aa04e (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, container_name=collectd, name=rhosp17/openstack-collectd, build-date=2025-11-18T22:51:28Z, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, summary=Red Hat OpenStack Platform 17.1 collectd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, distribution-scope=public, version=17.1.12, batch=17.1_20251118.1, vendor=Red Hat, Inc., config_id=tripleo_step3, url=https://www.redhat.com, io.openshift.expose-services=, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', 
'/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-collectd-container, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, description=Red Hat OpenStack Platform 17.1 collectd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, release=1761123044, architecture=x86_64, vcs-type=git) Dec 15 03:58:13 localhost podman[101936]: 2025-12-15 08:58:13.862730387 +0000 UTC m=+0.175021706 container health_status d6041e5471624a495d5be42684f6049ccf71802a80d5760239330151d465d146 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, architecture=x86_64, batch=17.1_20251118.1, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, name=rhosp17/openstack-ceilometer-compute, container_name=ceilometer_agent_compute, version=17.1.12, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.buildah.version=1.41.4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, 
tcib_managed=true, url=https://www.redhat.com, com.redhat.component=openstack-ceilometer-compute-container, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, managed_by=tripleo_ansible, build-date=2025-11-19T00:11:48Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'fcee5a4a91f85471fca7b61211375646'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, vendor=Red Hat, Inc., config_id=tripleo_step4, release=1761123044, io.openshift.expose-services=, maintainer=OpenStack TripleO Team) Dec 15 03:58:13 localhost podman[101921]: 2025-12-15 08:58:13.884084438 +0000 UTC m=+0.214270886 container exec_died 165a6fd1f2b629d16da0798ec1c9bac41d302d530a0ffa668b2315a52d6aa04e (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, container_name=collectd, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 
'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, config_id=tripleo_step3, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, tcib_managed=true, vendor=Red Hat, Inc., managed_by=tripleo_ansible, build-date=2025-11-18T22:51:28Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, distribution-scope=public, version=17.1.12, url=https://www.redhat.com, vcs-type=git, batch=17.1_20251118.1, com.redhat.component=openstack-collectd-container, description=Red Hat OpenStack Platform 17.1 collectd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, name=rhosp17/openstack-collectd, konflux.additional-tags=17.1.12 17.1_20251118.1, release=1761123044, io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, maintainer=OpenStack TripleO 
Team, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 collectd, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a) Dec 15 03:58:13 localhost systemd[1]: 165a6fd1f2b629d16da0798ec1c9bac41d302d530a0ffa668b2315a52d6aa04e.service: Deactivated successfully. Dec 15 03:58:13 localhost podman[101936]: 2025-12-15 08:58:13.915140868 +0000 UTC m=+0.227432277 container exec_died d6041e5471624a495d5be42684f6049ccf71802a80d5760239330151d465d146 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, com.redhat.component=openstack-ceilometer-compute-container, managed_by=tripleo_ansible, distribution-scope=public, name=rhosp17/openstack-ceilometer-compute, container_name=ceilometer_agent_compute, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.expose-services=, vendor=Red Hat, Inc., vcs-type=git, config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'fcee5a4a91f85471fca7b61211375646'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', 
'/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, version=17.1.12, build-date=2025-11-19T00:11:48Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, release=1761123044, architecture=x86_64, konflux.additional-tags=17.1.12 17.1_20251118.1, batch=17.1_20251118.1, maintainer=OpenStack TripleO Team, tcib_managed=true) Dec 15 03:58:13 localhost podman[101935]: 2025-12-15 08:58:13.925603328 +0000 UTC m=+0.236246774 container health_status ac45612fab357b615b4684dbfa5bd000dd29eb1adfbaedb6db0802fc49e89549 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, batch=17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, url=https://www.redhat.com, release=1761123044, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, tcib_managed=true, konflux.additional-tags=17.1.12 17.1_20251118.1, version=17.1.12, io.buildah.version=1.41.4, summary=Red Hat OpenStack Platform 17.1 cron, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vcs-type=git, description=Red Hat OpenStack 
Platform 17.1 cron, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, architecture=x86_64, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=logrotate_crond, vendor=Red Hat, Inc., managed_by=tripleo_ansible, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, build-date=2025-11-18T22:49:32Z, distribution-scope=public, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-cron, config_id=tripleo_step4, com.redhat.component=openstack-cron-container) Dec 15 03:58:13 localhost systemd[1]: d6041e5471624a495d5be42684f6049ccf71802a80d5760239330151d465d146.service: Deactivated successfully. 
Dec 15 03:58:13 localhost podman[101929]: 2025-12-15 08:58:13.885473835 +0000 UTC m=+0.200576810 container health_status 97115ff7c5a84ae7e17f79e696e94da3c63f9fe0051287fe9193bec282a03f99 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, version=17.1.12, com.redhat.component=openstack-ceilometer-ipmi-container, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'fcee5a4a91f85471fca7b61211375646'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.buildah.version=1.41.4, vendor=Red Hat, Inc., vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, konflux.additional-tags=17.1.12 17.1_20251118.1, config_id=tripleo_step4, url=https://www.redhat.com, io.openshift.expose-services=, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-type=git, container_name=ceilometer_agent_ipmi, tcib_managed=true, name=rhosp17/openstack-ceilometer-ipmi, release=1761123044, build-date=2025-11-19T00:12:45Z, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, distribution-scope=public, batch=17.1_20251118.1, maintainer=OpenStack TripleO Team) Dec 15 03:58:13 localhost podman[101929]: 2025-12-15 08:58:13.96838844 +0000 UTC m=+0.283491375 container exec_died 97115ff7c5a84ae7e17f79e696e94da3c63f9fe0051287fe9193bec282a03f99 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, com.redhat.component=openstack-ceilometer-ipmi-container, container_name=ceilometer_agent_ipmi, version=17.1.12, vcs-type=git, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'fcee5a4a91f85471fca7b61211375646'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, release=1761123044, io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, vendor=Red Hat, Inc., batch=17.1_20251118.1, config_id=tripleo_step4, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.expose-services=, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, tcib_managed=true, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, konflux.additional-tags=17.1.12 17.1_20251118.1, architecture=x86_64, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, name=rhosp17/openstack-ceilometer-ipmi, distribution-scope=public, build-date=2025-11-19T00:12:45Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI) Dec 15 03:58:13 localhost podman[101923]: 2025-12-15 08:58:13.991146728 +0000 UTC m=+0.309023407 container health_status 36a88083ed613f6871d2631ae462bca308a45ba583333f7cfe4d0b0c40a18ba5 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, build-date=2025-11-19T00:36:58Z, version=17.1.12, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 nova-compute, 
batch=17.1_20251118.1, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.expose-services=, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 nova-compute, distribution-scope=public, vcs-type=git, config_id=tripleo_step5, maintainer=OpenStack TripleO Team, release=1761123044, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vendor=Red Hat, Inc., name=rhosp17/openstack-nova-compute, tcib_managed=true, com.redhat.component=openstack-nova-compute-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '182e509007ab5e6e5b2500a552cbd5ba-879500e96bf8dfb93687004bd86f2317'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', 
'/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, container_name=nova_compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, url=https://www.redhat.com, io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d) Dec 15 03:58:14 localhost podman[101935]: 2025-12-15 08:58:14.009148819 +0000 UTC m=+0.319792315 container exec_died ac45612fab357b615b4684dbfa5bd000dd29eb1adfbaedb6db0802fc49e89549 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, release=1761123044, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, tcib_managed=true, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.component=openstack-cron-container, architecture=x86_64, batch=17.1_20251118.1, io.buildah.version=1.41.4, managed_by=tripleo_ansible, konflux.additional-tags=17.1.12 17.1_20251118.1, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, version=17.1.12, url=https://www.redhat.com, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': 
True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, container_name=logrotate_crond, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, summary=Red Hat OpenStack Platform 17.1 cron, config_id=tripleo_step4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, name=rhosp17/openstack-cron, vcs-type=git, build-date=2025-11-18T22:49:32Z, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 cron) Dec 15 03:58:14 localhost systemd[1]: ac45612fab357b615b4684dbfa5bd000dd29eb1adfbaedb6db0802fc49e89549.service: Deactivated successfully. Dec 15 03:58:14 localhost systemd[1]: 97115ff7c5a84ae7e17f79e696e94da3c63f9fe0051287fe9193bec282a03f99.service: Deactivated successfully. 
Dec 15 03:58:14 localhost podman[101923]: 2025-12-15 08:58:14.047603837 +0000 UTC m=+0.365480516 container exec_died 36a88083ed613f6871d2631ae462bca308a45ba583333f7cfe4d0b0c40a18ba5 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, release=1761123044, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, build-date=2025-11-19T00:36:58Z, io.buildah.version=1.41.4, distribution-scope=public, batch=17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, managed_by=tripleo_ansible, name=rhosp17/openstack-nova-compute, vcs-type=git, com.redhat.component=openstack-nova-compute-container, description=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 nova-compute, container_name=nova_compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, architecture=x86_64, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.12, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '182e509007ab5e6e5b2500a552cbd5ba-879500e96bf8dfb93687004bd86f2317'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, config_id=tripleo_step5, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI) Dec 15 03:58:14 localhost systemd[1]: 36a88083ed613f6871d2631ae462bca308a45ba583333f7cfe4d0b0c40a18ba5.service: Deactivated successfully. Dec 15 03:58:16 localhost systemd[1]: Started /usr/bin/podman healthcheck run 4c3f871f4bdf287b1d3ce717956c1cf130617489217410eadf6fffcc440eb3b0. 
Dec 15 03:58:16 localhost podman[102068]: 2025-12-15 08:58:16.307826894 +0000 UTC m=+0.078771755 container health_status 4c3f871f4bdf287b1d3ce717956c1cf130617489217410eadf6fffcc440eb3b0 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, build-date=2025-11-19T00:36:58Z, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, name=rhosp17/openstack-nova-compute, distribution-scope=public, container_name=nova_migration_target, batch=17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, managed_by=tripleo_ansible, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.buildah.version=1.41.4, description=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, release=1761123044, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '879500e96bf8dfb93687004bd86f2317'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, url=https://www.redhat.com, architecture=x86_64, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step4, version=17.1.12, tcib_managed=true, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, com.redhat.component=openstack-nova-compute-container) Dec 15 03:58:16 localhost podman[102068]: 2025-12-15 08:58:16.673608147 +0000 UTC m=+0.444552968 container exec_died 4c3f871f4bdf287b1d3ce717956c1cf130617489217410eadf6fffcc440eb3b0 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, build-date=2025-11-19T00:36:58Z, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.buildah.version=1.41.4, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, maintainer=OpenStack TripleO Team, version=17.1.12, distribution-scope=public, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '879500e96bf8dfb93687004bd86f2317'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, container_name=nova_migration_target, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, config_id=tripleo_step4, release=1761123044, managed_by=tripleo_ansible, architecture=x86_64, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.component=openstack-nova-compute-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20251118.1, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, name=rhosp17/openstack-nova-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream) Dec 15 03:58:16 localhost systemd[1]: 4c3f871f4bdf287b1d3ce717956c1cf130617489217410eadf6fffcc440eb3b0.service: Deactivated successfully. Dec 15 03:58:20 localhost systemd[1]: Started /usr/bin/podman healthcheck run 2df8d20c6ba32f38bd0e6519c9c77a4146b632f21c81197d8c319b223aad81f1. Dec 15 03:58:20 localhost systemd[1]: Started /usr/bin/podman healthcheck run 4d35b7dc3f3a4e3c60661e85d3511f50804e87244d74cab67af5c6e8c447d379. 
Dec 15 03:58:20 localhost podman[102152]: 2025-12-15 08:58:20.757368453 +0000 UTC m=+0.085969497 container health_status 2df8d20c6ba32f38bd0e6519c9c77a4146b632f21c81197d8c319b223aad81f1 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=unhealthy, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 ovn-controller, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_id=tripleo_step4, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 ovn-controller, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, architecture=x86_64, build-date=2025-11-18T23:34:05Z, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-ovn-controller, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.expose-services=, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.12, 
com.redhat.component=openstack-ovn-controller-container, io.buildah.version=1.41.4, batch=17.1_20251118.1, tcib_managed=true, container_name=ovn_controller, release=1761123044, vendor=Red Hat, Inc.) Dec 15 03:58:20 localhost podman[102152]: 2025-12-15 08:58:20.799279713 +0000 UTC m=+0.127880767 container exec_died 2df8d20c6ba32f38bd0e6519c9c77a4146b632f21c81197d8c319b223aad81f1 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, tcib_managed=true, version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, release=1761123044, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, architecture=x86_64, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public, container_name=ovn_controller, name=rhosp17/openstack-ovn-controller, config_id=tripleo_step4, com.redhat.component=openstack-ovn-controller-container, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', 
'/var/log/containers/openvswitch:/var/log/ovn:z']}, io.openshift.expose-services=, batch=17.1_20251118.1, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 ovn-controller, build-date=2025-11-18T23:34:05Z, io.buildah.version=1.41.4, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, summary=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-type=git) Dec 15 03:58:20 localhost podman[102152]: unhealthy Dec 15 03:58:20 localhost systemd[1]: 2df8d20c6ba32f38bd0e6519c9c77a4146b632f21c81197d8c319b223aad81f1.service: Main process exited, code=exited, status=1/FAILURE Dec 15 03:58:20 localhost systemd[1]: 2df8d20c6ba32f38bd0e6519c9c77a4146b632f21c81197d8c319b223aad81f1.service: Failed with result 'exit-code'. Dec 15 03:58:20 localhost podman[102153]: 2025-12-15 08:58:20.811413248 +0000 UTC m=+0.135893772 container health_status 4d35b7dc3f3a4e3c60661e85d3511f50804e87244d74cab67af5c6e8c447d379 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=unhealthy, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, name=rhosp17/openstack-neutron-metadata-agent-ovn, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'a56a6f14b467cd9064e40c03defa5ed7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, managed_by=tripleo_ansible, release=1761123044, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-type=git, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 17.1_20251118.1, vendor=Red Hat, Inc., architecture=x86_64, batch=17.1_20251118.1, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, container_name=ovn_metadata_agent, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step4, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, version=17.1.12, build-date=2025-11-19T00:14:25Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, maintainer=OpenStack TripleO Team, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, url=https://www.redhat.com) Dec 15 03:58:20 localhost podman[102153]: 2025-12-15 08:58:20.827541868 +0000 UTC m=+0.152022412 container exec_died 
4d35b7dc3f3a4e3c60661e85d3511f50804e87244d74cab67af5c6e8c447d379 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, build-date=2025-11-19T00:14:25Z, batch=17.1_20251118.1, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'a56a6f14b467cd9064e40c03defa5ed7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, architecture=x86_64, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, version=17.1.12, release=1761123044, config_id=tripleo_step4, name=rhosp17/openstack-neutron-metadata-agent-ovn, konflux.additional-tags=17.1.12 17.1_20251118.1, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, tcib_managed=true, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, container_name=ovn_metadata_agent, distribution-scope=public, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn) Dec 15 03:58:20 localhost podman[102153]: unhealthy Dec 15 03:58:20 localhost systemd[1]: 4d35b7dc3f3a4e3c60661e85d3511f50804e87244d74cab67af5c6e8c447d379.service: Main process exited, code=exited, status=1/FAILURE Dec 15 03:58:20 localhost systemd[1]: 4d35b7dc3f3a4e3c60661e85d3511f50804e87244d74cab67af5c6e8c447d379.service: Failed with result 'exit-code'. Dec 15 03:58:29 localhost sshd[102191]: main: sshd: ssh-rsa algorithm is disabled Dec 15 03:58:29 localhost systemd[1]: Started /usr/bin/podman healthcheck run 6278d648aec9c9f12e0efaafa0d6ae5653eeb34459516699e562eeb9c8d23b3c. 
Dec 15 03:58:29 localhost podman[102192]: 2025-12-15 08:58:29.76144459 +0000 UTC m=+0.096312654 container health_status 6278d648aec9c9f12e0efaafa0d6ae5653eeb34459516699e562eeb9c8d23b3c (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 qdrouterd, release=1761123044, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.buildah.version=1.41.4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, maintainer=OpenStack TripleO Team, vcs-type=git, architecture=x86_64, com.redhat.component=openstack-qdrouterd-container, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.expose-services=, managed_by=tripleo_ansible, vendor=Red Hat, Inc., config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '29d07da103bbd44a9ed3e29999314b03'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, build-date=2025-11-18T22:49:46Z, container_name=metrics_qdr, name=rhosp17/openstack-qdrouterd, version=17.1.12, config_id=tripleo_step1, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, batch=17.1_20251118.1, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 qdrouterd, tcib_managed=true) Dec 15 03:58:29 localhost podman[102192]: 2025-12-15 08:58:29.961608548 +0000 UTC m=+0.296476622 container exec_died 6278d648aec9c9f12e0efaafa0d6ae5653eeb34459516699e562eeb9c8d23b3c (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, tcib_managed=true, com.redhat.component=openstack-qdrouterd-container, batch=17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, release=1761123044, konflux.additional-tags=17.1.12 17.1_20251118.1, architecture=x86_64, config_id=tripleo_step1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, name=rhosp17/openstack-qdrouterd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 qdrouterd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 qdrouterd, container_name=metrics_qdr, vcs-type=git, io.buildah.version=1.41.4, 
distribution-scope=public, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '29d07da103bbd44a9ed3e29999314b03'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, version=17.1.12, build-date=2025-11-18T22:49:46Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd) Dec 15 03:58:29 localhost systemd[1]: 6278d648aec9c9f12e0efaafa0d6ae5653eeb34459516699e562eeb9c8d23b3c.service: Deactivated successfully. Dec 15 03:58:44 localhost systemd[1]: Started /usr/bin/podman healthcheck run 165a6fd1f2b629d16da0798ec1c9bac41d302d530a0ffa668b2315a52d6aa04e. Dec 15 03:58:44 localhost systemd[1]: Started /usr/bin/podman healthcheck run 2649c3aec9af2f04509e1d248b1050b202fccc3ed2b86421c89a2a65376db057. Dec 15 03:58:44 localhost systemd[1]: Started /usr/bin/podman healthcheck run 36a88083ed613f6871d2631ae462bca308a45ba583333f7cfe4d0b0c40a18ba5. Dec 15 03:58:44 localhost systemd[1]: Started /usr/bin/podman healthcheck run 97115ff7c5a84ae7e17f79e696e94da3c63f9fe0051287fe9193bec282a03f99. 
Dec 15 03:58:44 localhost systemd[1]: Started /usr/bin/podman healthcheck run ac45612fab357b615b4684dbfa5bd000dd29eb1adfbaedb6db0802fc49e89549. Dec 15 03:58:44 localhost systemd[1]: Started /usr/bin/podman healthcheck run d6041e5471624a495d5be42684f6049ccf71802a80d5760239330151d465d146. Dec 15 03:58:44 localhost podman[102233]: 2025-12-15 08:58:44.780245914 +0000 UTC m=+0.094410404 container health_status ac45612fab357b615b4684dbfa5bd000dd29eb1adfbaedb6db0802fc49e89549 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.12 17.1_20251118.1, container_name=logrotate_crond, com.redhat.component=openstack-cron-container, release=1761123044, name=rhosp17/openstack-cron, distribution-scope=public, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, description=Red Hat OpenStack Platform 17.1 cron, summary=Red Hat OpenStack Platform 17.1 cron, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, vendor=Red Hat, Inc., architecture=x86_64, io.openshift.expose-services=, config_id=tripleo_step4, build-date=2025-11-18T22:49:32Z, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, batch=17.1_20251118.1, vcs-type=git, io.buildah.version=1.41.4, version=17.1.12, url=https://www.redhat.com, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 cron) Dec 15 03:58:44 localhost podman[102225]: 2025-12-15 08:58:44.882331411 +0000 UTC m=+0.196734507 container health_status 36a88083ed613f6871d2631ae462bca308a45ba583333f7cfe4d0b0c40a18ba5 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp17/openstack-nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.component=openstack-nova-compute-container, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '182e509007ab5e6e5b2500a552cbd5ba-879500e96bf8dfb93687004bd86f2317'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 
'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, container_name=nova_compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.12, io.openshift.expose-services=, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, vcs-type=git, managed_by=tripleo_ansible, batch=17.1_20251118.1, distribution-scope=public, konflux.additional-tags=17.1.12 17.1_20251118.1, tcib_managed=true, config_id=tripleo_step5, build-date=2025-11-19T00:36:58Z, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, architecture=x86_64, io.buildah.version=1.41.4, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, release=1761123044) Dec 15 03:58:44 localhost podman[102225]: 2025-12-15 08:58:44.913648438 +0000 UTC m=+0.228051514 container exec_died 36a88083ed613f6871d2631ae462bca308a45ba583333f7cfe4d0b0c40a18ba5 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, release=1761123044, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, vcs-type=git, name=rhosp17/openstack-nova-compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '182e509007ab5e6e5b2500a552cbd5ba-879500e96bf8dfb93687004bd86f2317'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', 
'/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, maintainer=OpenStack TripleO Team, build-date=2025-11-19T00:36:58Z, vendor=Red Hat, Inc., com.redhat.component=openstack-nova-compute-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, managed_by=tripleo_ansible, tcib_managed=true, url=https://www.redhat.com, container_name=nova_compute, batch=17.1_20251118.1, version=17.1.12, config_id=tripleo_step5, summary=Red Hat OpenStack Platform 17.1 nova-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.12 17.1_20251118.1) Dec 15 03:58:44 localhost systemd[1]: 36a88083ed613f6871d2631ae462bca308a45ba583333f7cfe4d0b0c40a18ba5.service: Deactivated successfully. 
Dec 15 03:58:44 localhost podman[102224]: 2025-12-15 08:58:44.832683554 +0000 UTC m=+0.155272189 container health_status 2649c3aec9af2f04509e1d248b1050b202fccc3ed2b86421c89a2a65376db057 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.expose-services=, batch=17.1_20251118.1, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, architecture=x86_64, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, vcs-type=git, version=17.1.12, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, name=rhosp17/openstack-iscsid, container_name=iscsid, vendor=Red Hat, Inc., distribution-scope=public, konflux.additional-tags=17.1.12 17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.component=openstack-iscsid-container, description=Red Hat OpenStack Platform 17.1 iscsid, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, build-date=2025-11-18T23:44:13Z, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '182e509007ab5e6e5b2500a552cbd5ba'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', 
'/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, release=1761123044, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, config_id=tripleo_step3, tcib_managed=true) Dec 15 03:58:44 localhost podman[102223]: 2025-12-15 08:58:44.939378055 +0000 UTC m=+0.264623741 container health_status 165a6fd1f2b629d16da0798ec1c9bac41d302d530a0ffa668b2315a52d6aa04e (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, maintainer=OpenStack TripleO Team, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', 
'/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, config_id=tripleo_step3, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., version=17.1.12, distribution-scope=public, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, description=Red Hat OpenStack Platform 17.1 collectd, name=rhosp17/openstack-collectd, summary=Red Hat OpenStack Platform 17.1 collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, release=1761123044, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.component=openstack-collectd-container, io.openshift.expose-services=, managed_by=tripleo_ansible, architecture=x86_64, build-date=2025-11-18T22:51:28Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, container_name=collectd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.buildah.version=1.41.4, vcs-type=git, batch=17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, konflux.additional-tags=17.1.12 17.1_20251118.1, tcib_managed=true, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a) Dec 15 03:58:44 localhost podman[102233]: 2025-12-15 08:58:44.965860893 +0000 UTC m=+0.280025333 container exec_died ac45612fab357b615b4684dbfa5bd000dd29eb1adfbaedb6db0802fc49e89549 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, name=rhosp17/openstack-cron, version=17.1.12, batch=17.1_20251118.1, konflux.additional-tags=17.1.12 17.1_20251118.1, architecture=x86_64, config_id=tripleo_step4, description=Red Hat OpenStack Platform 
17.1 cron, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, vcs-type=git, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, com.redhat.component=openstack-cron-container, url=https://www.redhat.com, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, maintainer=OpenStack TripleO Team, distribution-scope=public, container_name=logrotate_crond, summary=Red Hat OpenStack Platform 17.1 cron, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.buildah.version=1.41.4, build-date=2025-11-18T22:49:32Z, tcib_managed=true, vendor=Red Hat, Inc., io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1761123044) Dec 15 03:58:44 
localhost systemd[1]: ac45612fab357b615b4684dbfa5bd000dd29eb1adfbaedb6db0802fc49e89549.service: Deactivated successfully. Dec 15 03:58:44 localhost podman[102223]: 2025-12-15 08:58:44.98035592 +0000 UTC m=+0.305601526 container exec_died 165a6fd1f2b629d16da0798ec1c9bac41d302d530a0ffa668b2315a52d6aa04e (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, tcib_managed=true, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, konflux.additional-tags=17.1.12 17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', 
'/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, container_name=collectd, name=rhosp17/openstack-collectd, summary=Red Hat OpenStack Platform 17.1 collectd, vcs-type=git, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.component=openstack-collectd-container, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, architecture=x86_64, release=1761123044, build-date=2025-11-18T22:51:28Z, vendor=Red Hat, Inc., version=17.1.12, config_id=tripleo_step3, description=Red Hat OpenStack Platform 17.1 collectd, batch=17.1_20251118.1, io.buildah.version=1.41.4, distribution-scope=public, managed_by=tripleo_ansible) Dec 15 03:58:44 localhost podman[102226]: 2025-12-15 08:58:44.992076933 +0000 UTC m=+0.306669314 container health_status 97115ff7c5a84ae7e17f79e696e94da3c63f9fe0051287fe9193bec282a03f99 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, com.redhat.component=openstack-ceilometer-ipmi-container, vcs-type=git, version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'fcee5a4a91f85471fca7b61211375646'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., release=1761123044, build-date=2025-11-19T00:12:45Z, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, konflux.additional-tags=17.1.12 17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.expose-services=, io.buildah.version=1.41.4, name=rhosp17/openstack-ceilometer-ipmi, container_name=ceilometer_agent_ipmi, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, batch=17.1_20251118.1, tcib_managed=true, distribution-scope=public, url=https://www.redhat.com, config_id=tripleo_step4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi) Dec 15 03:58:45 localhost systemd[1]: 165a6fd1f2b629d16da0798ec1c9bac41d302d530a0ffa668b2315a52d6aa04e.service: Deactivated successfully. 
Dec 15 03:58:45 localhost podman[102239]: 2025-12-15 08:58:44.805465938 +0000 UTC m=+0.114266054 container health_status d6041e5471624a495d5be42684f6049ccf71802a80d5760239330151d465d146 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.buildah.version=1.41.4, config_id=tripleo_step4, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'fcee5a4a91f85471fca7b61211375646'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, version=17.1.12, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, release=1761123044, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, 
name=rhosp17/openstack-ceilometer-compute, batch=17.1_20251118.1, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, tcib_managed=true, architecture=x86_64, build-date=2025-11-19T00:11:48Z, container_name=ceilometer_agent_compute, io.openshift.expose-services=, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, com.redhat.component=openstack-ceilometer-compute-container) Dec 15 03:58:45 localhost podman[102224]: 2025-12-15 08:58:45.01780325 +0000 UTC m=+0.340391825 container exec_died 2649c3aec9af2f04509e1d248b1050b202fccc3ed2b86421c89a2a65376db057 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, version=17.1.12, com.redhat.component=openstack-iscsid-container, build-date=2025-11-18T23:44:13Z, distribution-scope=public, vcs-type=git, io.openshift.expose-services=, io.buildah.version=1.41.4, konflux.additional-tags=17.1.12 17.1_20251118.1, architecture=x86_64, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, container_name=iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, tcib_managed=true, vendor=Red Hat, Inc., org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, description=Red Hat OpenStack Platform 17.1 iscsid, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, release=1761123044, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '182e509007ab5e6e5b2500a552cbd5ba'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 
'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, name=rhosp17/openstack-iscsid, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step3, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, batch=17.1_20251118.1, url=https://www.redhat.com) Dec 15 03:58:45 localhost systemd[1]: 2649c3aec9af2f04509e1d248b1050b202fccc3ed2b86421c89a2a65376db057.service: Deactivated successfully. 
Dec 15 03:58:45 localhost podman[102239]: 2025-12-15 08:58:45.039420648 +0000 UTC m=+0.348220794 container exec_died d6041e5471624a495d5be42684f6049ccf71802a80d5760239330151d465d146 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, tcib_managed=true, vendor=Red Hat, Inc., name=rhosp17/openstack-ceilometer-compute, version=17.1.12, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, release=1761123044, io.buildah.version=1.41.4, io.openshift.expose-services=, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, maintainer=OpenStack TripleO Team, config_id=tripleo_step4, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, url=https://www.redhat.com, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-19T00:11:48Z, distribution-scope=public, com.redhat.component=openstack-ceilometer-compute-container, managed_by=tripleo_ansible, container_name=ceilometer_agent_compute, batch=17.1_20251118.1, konflux.additional-tags=17.1.12 17.1_20251118.1, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'fcee5a4a91f85471fca7b61211375646'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', 
'/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}) Dec 15 03:58:45 localhost systemd[1]: d6041e5471624a495d5be42684f6049ccf71802a80d5760239330151d465d146.service: Deactivated successfully. Dec 15 03:58:45 localhost podman[102226]: 2025-12-15 08:58:45.074423774 +0000 UTC m=+0.389016155 container exec_died 97115ff7c5a84ae7e17f79e696e94da3c63f9fe0051287fe9193bec282a03f99 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, distribution-scope=public, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-ceilometer-ipmi, batch=17.1_20251118.1, release=1761123044, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, url=https://www.redhat.com, build-date=2025-11-19T00:12:45Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'fcee5a4a91f85471fca7b61211375646'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 
'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, konflux.additional-tags=17.1.12 17.1_20251118.1, architecture=x86_64, com.redhat.component=openstack-ceilometer-ipmi-container, managed_by=tripleo_ansible, io.openshift.expose-services=, container_name=ceilometer_agent_ipmi, version=17.1.12, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, tcib_managed=true, vcs-type=git, io.buildah.version=1.41.4, config_id=tripleo_step4, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi) Dec 15 03:58:45 localhost systemd[1]: 97115ff7c5a84ae7e17f79e696e94da3c63f9fe0051287fe9193bec282a03f99.service: Deactivated successfully. Dec 15 03:58:47 localhost systemd[1]: Started /usr/bin/podman healthcheck run 4c3f871f4bdf287b1d3ce717956c1cf130617489217410eadf6fffcc440eb3b0. 
Dec 15 03:58:47 localhost podman[102359]: 2025-12-15 08:58:47.750139902 +0000 UTC m=+0.081292533 container health_status 4c3f871f4bdf287b1d3ce717956c1cf130617489217410eadf6fffcc440eb3b0 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, build-date=2025-11-19T00:36:58Z, config_id=tripleo_step4, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.component=openstack-nova-compute-container, io.buildah.version=1.41.4, version=17.1.12, batch=17.1_20251118.1, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '879500e96bf8dfb93687004bd86f2317'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, container_name=nova_migration_target, description=Red Hat OpenStack Platform 17.1 nova-compute, 
maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, name=rhosp17/openstack-nova-compute, release=1761123044, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, distribution-scope=public, vcs-type=git, vendor=Red Hat, Inc., org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=) Dec 15 03:58:48 localhost podman[102359]: 2025-12-15 08:58:48.107335315 +0000 UTC m=+0.438487886 container exec_died 4c3f871f4bdf287b1d3ce717956c1cf130617489217410eadf6fffcc440eb3b0 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, batch=17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, distribution-scope=public, tcib_managed=true, name=rhosp17/openstack-nova-compute, architecture=x86_64, maintainer=OpenStack TripleO Team, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '879500e96bf8dfb93687004bd86f2317'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, version=17.1.12, config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, container_name=nova_migration_target, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, managed_by=tripleo_ansible, io.buildah.version=1.41.4, io.openshift.expose-services=, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2025-11-19T00:36:58Z, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.component=openstack-nova-compute-container, release=1761123044) Dec 15 03:58:48 localhost systemd[1]: 4c3f871f4bdf287b1d3ce717956c1cf130617489217410eadf6fffcc440eb3b0.service: Deactivated successfully. Dec 15 03:58:51 localhost systemd[1]: Started /usr/bin/podman healthcheck run 2df8d20c6ba32f38bd0e6519c9c77a4146b632f21c81197d8c319b223aad81f1. Dec 15 03:58:51 localhost systemd[1]: Started /usr/bin/podman healthcheck run 4d35b7dc3f3a4e3c60661e85d3511f50804e87244d74cab67af5c6e8c447d379. Dec 15 03:58:51 localhost systemd[1]: tmp-crun.1x5o11.mount: Deactivated successfully. 
Dec 15 03:58:51 localhost podman[102383]: 2025-12-15 08:58:51.747459892 +0000 UTC m=+0.077860942 container health_status 4d35b7dc3f3a4e3c60661e85d3511f50804e87244d74cab67af5c6e8c447d379 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=unhealthy, batch=17.1_20251118.1, release=1761123044, config_id=tripleo_step4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, vendor=Red Hat, Inc., vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, io.buildah.version=1.41.4, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.expose-services=, build-date=2025-11-19T00:14:25Z, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, container_name=ovn_metadata_agent, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'a56a6f14b467cd9064e40c03defa5ed7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', 
'/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, url=https://www.redhat.com, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, managed_by=tripleo_ansible, distribution-scope=public, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, version=17.1.12, name=rhosp17/openstack-neutron-metadata-agent-ovn, tcib_managed=true, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI) Dec 15 03:58:51 localhost systemd[1]: tmp-crun.zS60i6.mount: Deactivated successfully. 
Dec 15 03:58:51 localhost podman[102382]: 2025-12-15 08:58:51.801161516 +0000 UTC m=+0.132130971 container health_status 2df8d20c6ba32f38bd0e6519c9c77a4146b632f21c81197d8c319b223aad81f1 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=unhealthy, container_name=ovn_controller, managed_by=tripleo_ansible, version=17.1.12, vcs-type=git, build-date=2025-11-18T23:34:05Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, summary=Red Hat OpenStack Platform 17.1 ovn-controller, tcib_managed=true, architecture=x86_64, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vendor=Red Hat, Inc., konflux.additional-tags=17.1.12 17.1_20251118.1, batch=17.1_20251118.1, io.openshift.expose-services=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, description=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, 
name=rhosp17/openstack-ovn-controller, config_id=tripleo_step4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.component=openstack-ovn-controller-container, release=1761123044, maintainer=OpenStack TripleO Team) Dec 15 03:58:51 localhost podman[102382]: 2025-12-15 08:58:51.818276683 +0000 UTC m=+0.149246208 container exec_died 2df8d20c6ba32f38bd0e6519c9c77a4146b632f21c81197d8c319b223aad81f1 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, batch=17.1_20251118.1, vcs-type=git, io.buildah.version=1.41.4, url=https://www.redhat.com, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 ovn-controller, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, version=17.1.12, build-date=2025-11-18T23:34:05Z, tcib_managed=true, name=rhosp17/openstack-ovn-controller, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, architecture=x86_64, maintainer=OpenStack TripleO Team, distribution-scope=public, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, com.redhat.component=openstack-ovn-controller-container, release=1761123044, konflux.additional-tags=17.1.12 17.1_20251118.1, container_name=ovn_controller, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 
'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, config_id=tripleo_step4) Dec 15 03:58:51 localhost podman[102383]: 2025-12-15 08:58:51.818798447 +0000 UTC m=+0.149199487 container exec_died 4d35b7dc3f3a4e3c60661e85d3511f50804e87244d74cab67af5c6e8c447d379 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, url=https://www.redhat.com, batch=17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, build-date=2025-11-19T00:14:25Z, tcib_managed=true, distribution-scope=public, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, name=rhosp17/openstack-neutron-metadata-agent-ovn, config_id=tripleo_step4, container_name=ovn_metadata_agent, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'a56a6f14b467cd9064e40c03defa5ed7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, release=1761123044, vcs-type=git, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, vendor=Red Hat, Inc., managed_by=tripleo_ansible, version=17.1.12, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-neutron-metadata-agent-ovn-container) Dec 15 03:58:51 localhost podman[102383]: unhealthy Dec 15 03:58:51 localhost systemd[1]: 4d35b7dc3f3a4e3c60661e85d3511f50804e87244d74cab67af5c6e8c447d379.service: Main process exited, code=exited, status=1/FAILURE Dec 15 03:58:51 localhost systemd[1]: 4d35b7dc3f3a4e3c60661e85d3511f50804e87244d74cab67af5c6e8c447d379.service: Failed with result 'exit-code'. 
Dec 15 03:58:51 localhost podman[102382]: unhealthy Dec 15 03:58:51 localhost systemd[1]: 2df8d20c6ba32f38bd0e6519c9c77a4146b632f21c81197d8c319b223aad81f1.service: Main process exited, code=exited, status=1/FAILURE Dec 15 03:58:51 localhost systemd[1]: 2df8d20c6ba32f38bd0e6519c9c77a4146b632f21c81197d8c319b223aad81f1.service: Failed with result 'exit-code'. Dec 15 03:58:58 localhost systemd[1]: Starting Check and recover tripleo_nova_virtqemud... Dec 15 03:58:58 localhost recover_tripleo_nova_virtqemud[102423]: 61849 Dec 15 03:58:58 localhost systemd[1]: tripleo_nova_virtqemud_recover.service: Deactivated successfully. Dec 15 03:58:58 localhost systemd[1]: Finished Check and recover tripleo_nova_virtqemud. Dec 15 03:59:00 localhost systemd[1]: Started /usr/bin/podman healthcheck run 6278d648aec9c9f12e0efaafa0d6ae5653eeb34459516699e562eeb9c8d23b3c. Dec 15 03:59:00 localhost podman[102425]: 2025-12-15 08:59:00.766443316 +0000 UTC m=+0.089927363 container health_status 6278d648aec9c9f12e0efaafa0d6ae5653eeb34459516699e562eeb9c8d23b3c (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, build-date=2025-11-18T22:49:46Z, container_name=metrics_qdr, summary=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, maintainer=OpenStack TripleO Team, config_id=tripleo_step1, com.redhat.component=openstack-qdrouterd-container, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, distribution-scope=public, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '29d07da103bbd44a9ed3e29999314b03'}, 
'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, version=17.1.12, name=rhosp17/openstack-qdrouterd, description=Red Hat OpenStack Platform 17.1 qdrouterd, managed_by=tripleo_ansible, vendor=Red Hat, Inc., release=1761123044, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-type=git, architecture=x86_64, io.buildah.version=1.41.4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, batch=17.1_20251118.1, tcib_managed=true, konflux.additional-tags=17.1.12 17.1_20251118.1, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a) Dec 15 03:59:01 localhost podman[102425]: 2025-12-15 08:59:01.016335192 +0000 UTC m=+0.339819199 container exec_died 6278d648aec9c9f12e0efaafa0d6ae5653eeb34459516699e562eeb9c8d23b3c (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, build-date=2025-11-18T22:49:46Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, 
vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, version=17.1.12, vendor=Red Hat, Inc., com.redhat.component=openstack-qdrouterd-container, name=rhosp17/openstack-qdrouterd, container_name=metrics_qdr, summary=Red Hat OpenStack Platform 17.1 qdrouterd, io.buildah.version=1.41.4, konflux.additional-tags=17.1.12 17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, config_id=tripleo_step1, architecture=x86_64, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '29d07da103bbd44a9ed3e29999314b03'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, release=1761123044, batch=17.1_20251118.1, vcs-type=git, distribution-scope=public, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=) Dec 15 03:59:01 localhost systemd[1]: 6278d648aec9c9f12e0efaafa0d6ae5653eeb34459516699e562eeb9c8d23b3c.service: Deactivated successfully. Dec 15 03:59:15 localhost systemd[1]: Started /usr/bin/podman healthcheck run 165a6fd1f2b629d16da0798ec1c9bac41d302d530a0ffa668b2315a52d6aa04e. Dec 15 03:59:15 localhost systemd[1]: Started /usr/bin/podman healthcheck run 2649c3aec9af2f04509e1d248b1050b202fccc3ed2b86421c89a2a65376db057. Dec 15 03:59:15 localhost systemd[1]: Started /usr/bin/podman healthcheck run 36a88083ed613f6871d2631ae462bca308a45ba583333f7cfe4d0b0c40a18ba5. Dec 15 03:59:15 localhost systemd[1]: Started /usr/bin/podman healthcheck run 97115ff7c5a84ae7e17f79e696e94da3c63f9fe0051287fe9193bec282a03f99. Dec 15 03:59:15 localhost systemd[1]: Started /usr/bin/podman healthcheck run ac45612fab357b615b4684dbfa5bd000dd29eb1adfbaedb6db0802fc49e89549. Dec 15 03:59:15 localhost systemd[1]: Started /usr/bin/podman healthcheck run d6041e5471624a495d5be42684f6049ccf71802a80d5760239330151d465d146. Dec 15 03:59:15 localhost systemd[1]: tmp-crun.Ubihaj.mount: Deactivated successfully. 
Dec 15 03:59:15 localhost podman[102456]: 2025-12-15 08:59:15.769860451 +0000 UTC m=+0.089535843 container health_status 2649c3aec9af2f04509e1d248b1050b202fccc3ed2b86421c89a2a65376db057 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, vcs-type=git, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '182e509007ab5e6e5b2500a552cbd5ba'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, release=1761123044, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, container_name=iscsid, version=17.1.12, batch=17.1_20251118.1, tcib_managed=true, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.component=openstack-iscsid-container, description=Red Hat OpenStack Platform 17.1 iscsid, distribution-scope=public, 
name=rhosp17/openstack-iscsid, config_id=tripleo_step3, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, architecture=x86_64, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, build-date=2025-11-18T23:44:13Z, io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI) Dec 15 03:59:15 localhost podman[102464]: 2025-12-15 08:59:15.78587956 +0000 UTC m=+0.096683775 container health_status ac45612fab357b615b4684dbfa5bd000dd29eb1adfbaedb6db0802fc49e89549 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, tcib_managed=true, version=17.1.12, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, description=Red Hat OpenStack Platform 17.1 cron, vendor=Red Hat, Inc., config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, managed_by=tripleo_ansible, io.buildah.version=1.41.4, url=https://www.redhat.com, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 cron, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, build-date=2025-11-18T22:49:32Z, container_name=logrotate_crond, konflux.additional-tags=17.1.12 17.1_20251118.1, batch=17.1_20251118.1, com.redhat.component=openstack-cron-container, config_id=tripleo_step4, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, maintainer=OpenStack TripleO Team, release=1761123044, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, name=rhosp17/openstack-cron, architecture=x86_64) Dec 15 03:59:15 localhost podman[102464]: 2025-12-15 08:59:15.795203878 +0000 UTC m=+0.106008103 container exec_died ac45612fab357b615b4684dbfa5bd000dd29eb1adfbaedb6db0802fc49e89549 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, summary=Red Hat OpenStack Platform 17.1 cron, batch=17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, release=1761123044, tcib_managed=true, vcs-type=git, version=17.1.12, com.redhat.component=openstack-cron-container, build-date=2025-11-18T22:49:32Z, name=rhosp17/openstack-cron, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, distribution-scope=public, 
container_name=logrotate_crond, description=Red Hat OpenStack Platform 17.1 cron, architecture=x86_64, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, io.buildah.version=1.41.4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, vendor=Red Hat, Inc., config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, config_id=tripleo_step4, managed_by=tripleo_ansible) Dec 15 03:59:15 localhost podman[102456]: 2025-12-15 08:59:15.804364793 +0000 UTC m=+0.124040235 container exec_died 2649c3aec9af2f04509e1d248b1050b202fccc3ed2b86421c89a2a65376db057 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, 
distribution-scope=public, version=17.1.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, description=Red Hat OpenStack Platform 17.1 iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, architecture=x86_64, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, config_id=tripleo_step3, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, tcib_managed=true, com.redhat.component=openstack-iscsid-container, name=rhosp17/openstack-iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '182e509007ab5e6e5b2500a552cbd5ba'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, io.buildah.version=1.41.4, release=1761123044, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, 
build-date=2025-11-18T23:44:13Z, container_name=iscsid, batch=17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, url=https://www.redhat.com, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, vcs-type=git, io.openshift.expose-services=) Dec 15 03:59:15 localhost systemd[1]: 2649c3aec9af2f04509e1d248b1050b202fccc3ed2b86421c89a2a65376db057.service: Deactivated successfully. Dec 15 03:59:15 localhost podman[102458]: 2025-12-15 08:59:15.883395814 +0000 UTC m=+0.197251280 container health_status 97115ff7c5a84ae7e17f79e696e94da3c63f9fe0051287fe9193bec282a03f99 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, version=17.1.12, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, architecture=x86_64, container_name=ceilometer_agent_ipmi, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, batch=17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step4, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, tcib_managed=true, io.buildah.version=1.41.4, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'fcee5a4a91f85471fca7b61211375646'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, managed_by=tripleo_ansible, io.openshift.expose-services=, com.redhat.component=openstack-ceilometer-ipmi-container, url=https://www.redhat.com, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-type=git, release=1761123044, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, build-date=2025-11-19T00:12:45Z, name=rhosp17/openstack-ceilometer-ipmi, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676) Dec 15 03:59:15 localhost podman[102455]: 2025-12-15 08:59:15.942036861 +0000 UTC m=+0.265486093 container health_status 165a6fd1f2b629d16da0798ec1c9bac41d302d530a0ffa668b2315a52d6aa04e (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, io.openshift.expose-services=, build-date=2025-11-18T22:51:28Z, maintainer=OpenStack TripleO Team, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_id=tripleo_step3, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1761123044, container_name=collectd, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, 
org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.buildah.version=1.41.4, name=rhosp17/openstack-collectd, konflux.additional-tags=17.1.12 17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 collectd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, summary=Red Hat OpenStack Platform 17.1 collectd, vendor=Red Hat, Inc., batch=17.1_20251118.1, url=https://www.redhat.com, version=17.1.12, distribution-scope=public, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, architecture=x86_64, com.redhat.component=openstack-collectd-container, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, tcib_managed=true, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}) Dec 15 03:59:15 localhost 
podman[102458]: 2025-12-15 08:59:15.946366517 +0000 UTC m=+0.260221953 container exec_died 97115ff7c5a84ae7e17f79e696e94da3c63f9fe0051287fe9193bec282a03f99 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, tcib_managed=true, container_name=ceilometer_agent_ipmi, com.redhat.component=openstack-ceilometer-ipmi-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, distribution-scope=public, build-date=2025-11-19T00:12:45Z, name=rhosp17/openstack-ceilometer-ipmi, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'fcee5a4a91f85471fca7b61211375646'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, konflux.additional-tags=17.1.12 17.1_20251118.1, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, version=17.1.12, architecture=x86_64, release=1761123044, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, managed_by=tripleo_ansible, vcs-type=git, config_id=tripleo_step4, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, io.buildah.version=1.41.4) Dec 15 03:59:15 localhost systemd[1]: 97115ff7c5a84ae7e17f79e696e94da3c63f9fe0051287fe9193bec282a03f99.service: Deactivated successfully. Dec 15 03:59:15 localhost podman[102455]: 2025-12-15 08:59:15.979541104 +0000 UTC m=+0.302990356 container exec_died 165a6fd1f2b629d16da0798ec1c9bac41d302d530a0ffa668b2315a52d6aa04e (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, name=rhosp17/openstack-collectd, description=Red Hat OpenStack Platform 17.1 collectd, distribution-scope=public, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_id=tripleo_step3, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, container_name=collectd, io.buildah.version=1.41.4, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, build-date=2025-11-18T22:51:28Z, batch=17.1_20251118.1, vcs-type=git, managed_by=tripleo_ansible, com.redhat.component=openstack-collectd-container, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': 
{'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, release=1761123044, konflux.additional-tags=17.1.12 17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 collectd, url=https://www.redhat.com, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, version=17.1.12) Dec 15 03:59:15 localhost podman[102469]: 2025-12-15 08:59:15.989232662 +0000 UTC m=+0.297317394 container health_status d6041e5471624a495d5be42684f6049ccf71802a80d5760239330151d465d146 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, 
managed_by=tripleo_ansible, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, version=17.1.12, name=rhosp17/openstack-ceilometer-compute, build-date=2025-11-19T00:11:48Z, config_id=tripleo_step4, distribution-scope=public, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'fcee5a4a91f85471fca7b61211375646'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, container_name=ceilometer_agent_compute, tcib_managed=true, architecture=x86_64, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, vendor=Red Hat, Inc., io.buildah.version=1.41.4, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-ceilometer-compute-container, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, release=1761123044, 
batch=17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.12 17.1_20251118.1) Dec 15 03:59:15 localhost systemd[1]: 165a6fd1f2b629d16da0798ec1c9bac41d302d530a0ffa668b2315a52d6aa04e.service: Deactivated successfully. Dec 15 03:59:16 localhost systemd[1]: ac45612fab357b615b4684dbfa5bd000dd29eb1adfbaedb6db0802fc49e89549.service: Deactivated successfully. Dec 15 03:59:16 localhost podman[102469]: 2025-12-15 08:59:16.029355605 +0000 UTC m=+0.337440347 container exec_died d6041e5471624a495d5be42684f6049ccf71802a80d5760239330151d465d146 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, com.redhat.component=openstack-ceilometer-compute-container, io.buildah.version=1.41.4, build-date=2025-11-19T00:11:48Z, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, vcs-type=git, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, managed_by=tripleo_ansible, release=1761123044, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, batch=17.1_20251118.1, distribution-scope=public, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, container_name=ceilometer_agent_compute, tcib_managed=true, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_id=tripleo_step4, version=17.1.12, name=rhosp17/openstack-ceilometer-compute, architecture=x86_64, vendor=Red Hat, Inc., 
io.openshift.expose-services=, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'fcee5a4a91f85471fca7b61211375646'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676) Dec 15 03:59:16 localhost podman[102457]: 2025-12-15 08:59:16.039812163 +0000 UTC m=+0.355400276 container health_status 36a88083ed613f6871d2631ae462bca308a45ba583333f7cfe4d0b0c40a18ba5 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, com.redhat.component=openstack-nova-compute-container, name=rhosp17/openstack-nova-compute, vcs-type=git, distribution-scope=public, architecture=x86_64, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 17.1_20251118.1, vendor=Red Hat, Inc., io.buildah.version=1.41.4, release=1761123044, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, 
io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, config_id=tripleo_step5, description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '182e509007ab5e6e5b2500a552cbd5ba-879500e96bf8dfb93687004bd86f2317'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.openshift.tags=rhosp osp openstack osp-17.1 
openstack-nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, summary=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-19T00:36:58Z, managed_by=tripleo_ansible, tcib_managed=true, batch=17.1_20251118.1, url=https://www.redhat.com, version=17.1.12, container_name=nova_compute) Dec 15 03:59:16 localhost systemd[1]: d6041e5471624a495d5be42684f6049ccf71802a80d5760239330151d465d146.service: Deactivated successfully. Dec 15 03:59:16 localhost podman[102457]: 2025-12-15 08:59:16.069407444 +0000 UTC m=+0.384995577 container exec_died 36a88083ed613f6871d2631ae462bca308a45ba583333f7cfe4d0b0c40a18ba5 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, version=17.1.12, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, url=https://www.redhat.com, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, managed_by=tripleo_ansible, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '182e509007ab5e6e5b2500a552cbd5ba-879500e96bf8dfb93687004bd86f2317'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, io.buildah.version=1.41.4, batch=17.1_20251118.1, tcib_managed=true, architecture=x86_64, config_id=tripleo_step5, name=rhosp17/openstack-nova-compute, vendor=Red Hat, Inc., vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-nova-compute-container, build-date=2025-11-19T00:36:58Z, container_name=nova_compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 17.1_20251118.1, release=1761123044, vcs-type=git, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, description=Red Hat OpenStack Platform 17.1 nova-compute) Dec 15 03:59:16 localhost systemd[1]: 
36a88083ed613f6871d2631ae462bca308a45ba583333f7cfe4d0b0c40a18ba5.service: Deactivated successfully. Dec 15 03:59:16 localhost systemd[1]: tmp-crun.ZygUbb.mount: Deactivated successfully. Dec 15 03:59:18 localhost systemd[1]: Started /usr/bin/podman healthcheck run 4c3f871f4bdf287b1d3ce717956c1cf130617489217410eadf6fffcc440eb3b0. Dec 15 03:59:18 localhost podman[102667]: 2025-12-15 08:59:18.571479304 +0000 UTC m=+0.102926341 container health_status 4c3f871f4bdf287b1d3ce717956c1cf130617489217410eadf6fffcc440eb3b0 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, io.buildah.version=1.41.4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.12, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 nova-compute, release=1761123044, maintainer=OpenStack TripleO Team, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, vendor=Red Hat, Inc., container_name=nova_migration_target, config_id=tripleo_step4, summary=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.component=openstack-nova-compute-container, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '879500e96bf8dfb93687004bd86f2317'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', 
'/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, io.openshift.expose-services=, build-date=2025-11-19T00:36:58Z, name=rhosp17/openstack-nova-compute, batch=17.1_20251118.1, managed_by=tripleo_ansible, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, distribution-scope=public, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI) Dec 15 03:59:18 localhost podman[102713]: 2025-12-15 08:59:18.747108756 +0000 UTC m=+0.088655650 container exec 8dcda56b365b42dc8758aab77a9ec80db304780e449052738f7e4e648ae1ecaf (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-bce17446-41b5-5408-a23e-0b011906b44a-crash-np0005559462, vcs-type=git, RELEASE=main, vendor=Red Hat, Inc., io.openshift.expose-services=, ceph=True, io.k8s.description=Red Hat Ceph Storage 7, release=1763362218, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., com.redhat.component=rhceph-container, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.openshift.tags=rhceph ceph, CEPH_POINT_RELEASE=, GIT_CLEAN=True, GIT_REPO=https://github.com/ceph/ceph-container.git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, GIT_BRANCH=main, name=rhceph, build-date=2025-11-26T19:44:28Z, version=7, 
vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, architecture=x86_64, distribution-scope=public, description=Red Hat Ceph Storage 7, url=https://catalog.redhat.com/en/search?searchType=containers, io.buildah.version=1.41.4, maintainer=Guillaume Abrioux , cpe=cpe:/a:redhat:enterprise_linux:9::appstream, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0) Dec 15 03:59:18 localhost podman[102713]: 2025-12-15 08:59:18.846929373 +0000 UTC m=+0.188476267 container exec_died 8dcda56b365b42dc8758aab77a9ec80db304780e449052738f7e4e648ae1ecaf (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-bce17446-41b5-5408-a23e-0b011906b44a-crash-np0005559462, name=rhceph, CEPH_POINT_RELEASE=, distribution-scope=public, build-date=2025-11-26T19:44:28Z, vendor=Red Hat, Inc., maintainer=Guillaume Abrioux , vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, ceph=True, io.openshift.tags=rhceph ceph, version=7, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, RELEASE=main, com.redhat.component=rhceph-container, GIT_BRANCH=main, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, description=Red Hat Ceph Storage 7, GIT_CLEAN=True, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, GIT_REPO=https://github.com/ceph/ceph-container.git, architecture=x86_64, io.buildah.version=1.41.4, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.openshift.expose-services=, io.k8s.description=Red Hat Ceph Storage 7, release=1763362218, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, url=https://catalog.redhat.com/en/search?searchType=containers) Dec 15 03:59:18 localhost podman[102667]: 2025-12-15 08:59:18.946595416 +0000 UTC m=+0.478042473 container exec_died 4c3f871f4bdf287b1d3ce717956c1cf130617489217410eadf6fffcc440eb3b0 
(image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.buildah.version=1.41.4, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '879500e96bf8dfb93687004bd86f2317'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, distribution-scope=public, com.redhat.component=openstack-nova-compute-container, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, name=rhosp17/openstack-nova-compute, batch=17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, release=1761123044, version=17.1.12, config_id=tripleo_step4, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, url=https://www.redhat.com, build-date=2025-11-19T00:36:58Z, container_name=nova_migration_target, architecture=x86_64, tcib_managed=true) Dec 15 03:59:18 localhost systemd[1]: 4c3f871f4bdf287b1d3ce717956c1cf130617489217410eadf6fffcc440eb3b0.service: Deactivated successfully. Dec 15 03:59:22 localhost systemd[1]: Started /usr/bin/podman healthcheck run 2df8d20c6ba32f38bd0e6519c9c77a4146b632f21c81197d8c319b223aad81f1. Dec 15 03:59:22 localhost systemd[1]: Started /usr/bin/podman healthcheck run 4d35b7dc3f3a4e3c60661e85d3511f50804e87244d74cab67af5c6e8c447d379. Dec 15 03:59:22 localhost systemd[1]: tmp-crun.dJ8eeW.mount: Deactivated successfully. 
Dec 15 03:59:22 localhost podman[102857]: 2025-12-15 08:59:22.766699501 +0000 UTC m=+0.086480122 container health_status 4d35b7dc3f3a4e3c60661e85d3511f50804e87244d74cab67af5c6e8c447d379 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=unhealthy, architecture=x86_64, container_name=ovn_metadata_agent, io.buildah.version=1.41.4, version=17.1.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, url=https://www.redhat.com, config_id=tripleo_step4, name=rhosp17/openstack-neutron-metadata-agent-ovn, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, vcs-type=git, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, konflux.additional-tags=17.1.12 17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, build-date=2025-11-19T00:14:25Z, vendor=Red Hat, Inc., com.redhat.component=openstack-neutron-metadata-agent-ovn-container, tcib_managed=true, release=1761123044, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'a56a6f14b467cd9064e40c03defa5ed7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, batch=17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c) Dec 15 03:59:22 localhost podman[102856]: 2025-12-15 08:59:22.807149611 +0000 UTC m=+0.128451313 container health_status 2df8d20c6ba32f38bd0e6519c9c77a4146b632f21c81197d8c319b223aad81f1 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=unhealthy, config_id=tripleo_step4, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 ovn-controller, version=17.1.12, architecture=x86_64, batch=17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, io.buildah.version=1.41.4, konflux.additional-tags=17.1.12 17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.expose-services=, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1761123044, summary=Red Hat OpenStack Platform 17.1 
ovn-controller, url=https://www.redhat.com, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, managed_by=tripleo_ansible, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, build-date=2025-11-18T23:34:05Z, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, com.redhat.component=openstack-ovn-controller-container, vcs-type=git, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, container_name=ovn_controller, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, name=rhosp17/openstack-ovn-controller, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272) Dec 15 03:59:22 localhost podman[102856]: 2025-12-15 08:59:22.818265758 +0000 UTC m=+0.139567440 container exec_died 2df8d20c6ba32f38bd0e6519c9c77a4146b632f21c81197d8c319b223aad81f1 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.buildah.version=1.41.4, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 ovn-controller, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, 
vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, build-date=2025-11-18T23:34:05Z, container_name=ovn_controller, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-ovn-controller, vendor=Red Hat, Inc., config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, release=1761123044, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, version=17.1.12, config_id=tripleo_step4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, vcs-type=git, tcib_managed=true, io.openshift.expose-services=, managed_by=tripleo_ansible, distribution-scope=public, batch=17.1_20251118.1, com.redhat.component=openstack-ovn-controller-container, summary=Red Hat OpenStack Platform 17.1 ovn-controller) Dec 15 03:59:22 localhost podman[102856]: unhealthy Dec 15 03:59:22 localhost systemd[1]: 2df8d20c6ba32f38bd0e6519c9c77a4146b632f21c81197d8c319b223aad81f1.service: Main process exited, code=exited, status=1/FAILURE Dec 15 03:59:22 localhost systemd[1]: 2df8d20c6ba32f38bd0e6519c9c77a4146b632f21c81197d8c319b223aad81f1.service: Failed with result 'exit-code'. 
Dec 15 03:59:22 localhost podman[102857]: 2025-12-15 08:59:22.833239498 +0000 UTC m=+0.153020119 container exec_died 4d35b7dc3f3a4e3c60661e85d3511f50804e87244d74cab67af5c6e8c447d379 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, build-date=2025-11-19T00:14:25Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-type=git, container_name=ovn_metadata_agent, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, release=1761123044, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, io.buildah.version=1.41.4, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, version=17.1.12, konflux.additional-tags=17.1.12 17.1_20251118.1, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'a56a6f14b467cd9064e40c03defa5ed7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', 
'/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, io.openshift.expose-services=, vendor=Red Hat, Inc., config_id=tripleo_step4, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, name=rhosp17/openstack-neutron-metadata-agent-ovn, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, batch=17.1_20251118.1, tcib_managed=true, url=https://www.redhat.com) Dec 15 03:59:22 localhost podman[102857]: unhealthy Dec 15 03:59:22 localhost systemd[1]: 4d35b7dc3f3a4e3c60661e85d3511f50804e87244d74cab67af5c6e8c447d379.service: Main process exited, code=exited, status=1/FAILURE Dec 15 03:59:22 localhost systemd[1]: 4d35b7dc3f3a4e3c60661e85d3511f50804e87244d74cab67af5c6e8c447d379.service: Failed with result 'exit-code'. Dec 15 03:59:31 localhost systemd[1]: Started /usr/bin/podman healthcheck run 6278d648aec9c9f12e0efaafa0d6ae5653eeb34459516699e562eeb9c8d23b3c. 
Dec 15 03:59:31 localhost podman[102898]: 2025-12-15 08:59:31.749956321 +0000 UTC m=+0.084431537 container health_status 6278d648aec9c9f12e0efaafa0d6ae5653eeb34459516699e562eeb9c8d23b3c (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, batch=17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.component=openstack-qdrouterd-container, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '29d07da103bbd44a9ed3e29999314b03'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, managed_by=tripleo_ansible, url=https://www.redhat.com, release=1761123044, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-type=git, container_name=metrics_qdr, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., 
org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_id=tripleo_step1, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, description=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 qdrouterd, version=17.1.12, tcib_managed=true, name=rhosp17/openstack-qdrouterd, build-date=2025-11-18T22:49:46Z, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, konflux.additional-tags=17.1.12 17.1_20251118.1) Dec 15 03:59:31 localhost podman[102898]: 2025-12-15 08:59:31.969428766 +0000 UTC m=+0.303903952 container exec_died 6278d648aec9c9f12e0efaafa0d6ae5653eeb34459516699e562eeb9c8d23b3c (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, container_name=metrics_qdr, vcs-type=git, config_id=tripleo_step1, version=17.1.12, build-date=2025-11-18T22:49:46Z, description=Red Hat OpenStack Platform 17.1 qdrouterd, batch=17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vendor=Red Hat, Inc., io.buildah.version=1.41.4, io.openshift.expose-services=, managed_by=tripleo_ansible, com.redhat.component=openstack-qdrouterd-container, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '29d07da103bbd44a9ed3e29999314b03'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, release=1761123044, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, summary=Red Hat OpenStack Platform 17.1 qdrouterd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, konflux.additional-tags=17.1.12 17.1_20251118.1, architecture=x86_64, name=rhosp17/openstack-qdrouterd) Dec 15 03:59:31 localhost systemd[1]: 6278d648aec9c9f12e0efaafa0d6ae5653eeb34459516699e562eeb9c8d23b3c.service: Deactivated successfully. Dec 15 03:59:46 localhost systemd[1]: Started /usr/bin/podman healthcheck run 165a6fd1f2b629d16da0798ec1c9bac41d302d530a0ffa668b2315a52d6aa04e. Dec 15 03:59:46 localhost systemd[1]: Started /usr/bin/podman healthcheck run 2649c3aec9af2f04509e1d248b1050b202fccc3ed2b86421c89a2a65376db057. Dec 15 03:59:46 localhost systemd[1]: Started /usr/bin/podman healthcheck run 36a88083ed613f6871d2631ae462bca308a45ba583333f7cfe4d0b0c40a18ba5. 
Dec 15 03:59:46 localhost systemd[1]: Started /usr/bin/podman healthcheck run 97115ff7c5a84ae7e17f79e696e94da3c63f9fe0051287fe9193bec282a03f99. Dec 15 03:59:46 localhost systemd[1]: Started /usr/bin/podman healthcheck run ac45612fab357b615b4684dbfa5bd000dd29eb1adfbaedb6db0802fc49e89549. Dec 15 03:59:46 localhost systemd[1]: Started /usr/bin/podman healthcheck run d6041e5471624a495d5be42684f6049ccf71802a80d5760239330151d465d146. Dec 15 03:59:46 localhost systemd[1]: tmp-crun.SlzWKC.mount: Deactivated successfully. Dec 15 03:59:46 localhost podman[102927]: 2025-12-15 08:59:46.782233239 +0000 UTC m=+0.106801504 container health_status 165a6fd1f2b629d16da0798ec1c9bac41d302d530a0ffa668b2315a52d6aa04e (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.buildah.version=1.41.4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.expose-services=, container_name=collectd, summary=Red Hat OpenStack Platform 17.1 collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, vcs-type=git, managed_by=tripleo_ansible, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, tcib_managed=true, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, architecture=x86_64, distribution-scope=public, konflux.additional-tags=17.1.12 17.1_20251118.1, build-date=2025-11-18T22:51:28Z, version=17.1.12, name=rhosp17/openstack-collectd, description=Red Hat OpenStack Platform 17.1 collectd, com.redhat.component=openstack-collectd-container, release=1761123044, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, vendor=Red Hat, Inc., config_id=tripleo_step3, batch=17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream) Dec 15 03:59:46 localhost podman[102930]: 2025-12-15 08:59:46.788192048 +0000 UTC m=+0.101296347 container health_status 97115ff7c5a84ae7e17f79e696e94da3c63f9fe0051287fe9193bec282a03f99 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.expose-services=, name=rhosp17/openstack-ceilometer-ipmi, maintainer=OpenStack 
TripleO Team, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-type=git, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'fcee5a4a91f85471fca7b61211375646'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, container_name=ceilometer_agent_ipmi, io.buildah.version=1.41.4, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, com.redhat.component=openstack-ceilometer-ipmi-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public, release=1761123044, architecture=x86_64, config_id=tripleo_step4, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, build-date=2025-11-19T00:12:45Z, tcib_managed=true, vendor=Red Hat, Inc., version=17.1.12) Dec 15 03:59:46 localhost 
podman[102927]: 2025-12-15 08:59:46.795704249 +0000 UTC m=+0.120272534 container exec_died 165a6fd1f2b629d16da0798ec1c9bac41d302d530a0ffa668b2315a52d6aa04e (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, version=17.1.12, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, summary=Red Hat OpenStack Platform 17.1 collectd, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.component=openstack-collectd-container, description=Red Hat OpenStack Platform 17.1 collectd, architecture=x86_64, container_name=collectd, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, vcs-type=git, vendor=Red Hat, Inc., url=https://www.redhat.com, managed_by=tripleo_ansible, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, io.buildah.version=1.41.4, tcib_managed=true, release=1761123044, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, name=rhosp17/openstack-collectd, config_id=tripleo_step3, batch=17.1_20251118.1, build-date=2025-11-18T22:51:28Z, io.openshift.expose-services=, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a) Dec 15 03:59:46 localhost systemd[1]: 165a6fd1f2b629d16da0798ec1c9bac41d302d530a0ffa668b2315a52d6aa04e.service: Deactivated successfully. Dec 15 03:59:46 localhost podman[102928]: 2025-12-15 08:59:46.866721326 +0000 UTC m=+0.191024554 container health_status 2649c3aec9af2f04509e1d248b1050b202fccc3ed2b86421c89a2a65376db057 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, managed_by=tripleo_ansible, vcs-type=git, version=17.1.12, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, konflux.additional-tags=17.1.12 17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 iscsid, io.buildah.version=1.41.4, batch=17.1_20251118.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '182e509007ab5e6e5b2500a552cbd5ba'}, 'healthcheck': {'test': 
'/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vendor=Red Hat, Inc., release=1761123044, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, build-date=2025-11-18T23:44:13Z, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.component=openstack-iscsid-container, distribution-scope=public, container_name=iscsid, architecture=x86_64, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-iscsid, io.openshift.expose-services=, config_id=tripleo_step3) Dec 15 03:59:46 localhost podman[102941]: 2025-12-15 08:59:46.842055458 +0000 UTC m=+0.154621693 container health_status ac45612fab357b615b4684dbfa5bd000dd29eb1adfbaedb6db0802fc49e89549 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, release=1761123044, description=Red Hat OpenStack Platform 17.1 cron, summary=Red Hat 
OpenStack Platform 17.1 cron, vcs-type=git, io.buildah.version=1.41.4, maintainer=OpenStack TripleO Team, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, container_name=logrotate_crond, io.openshift.expose-services=, url=https://www.redhat.com, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vendor=Red Hat, Inc., konflux.additional-tags=17.1.12 17.1_20251118.1, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, architecture=x86_64, batch=17.1_20251118.1, config_id=tripleo_step4, name=rhosp17/openstack-cron, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.12, build-date=2025-11-18T22:49:32Z, com.redhat.component=openstack-cron-container, managed_by=tripleo_ansible, io.openshift.tags=rhosp 
osp openstack osp-17.1 openstack-cron, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, tcib_managed=true) Dec 15 03:59:46 localhost podman[102928]: 2025-12-15 08:59:46.897753306 +0000 UTC m=+0.222056584 container exec_died 2649c3aec9af2f04509e1d248b1050b202fccc3ed2b86421c89a2a65376db057 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, maintainer=OpenStack TripleO Team, architecture=x86_64, com.redhat.component=openstack-iscsid-container, io.buildah.version=1.41.4, vendor=Red Hat, Inc., managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 iscsid, vcs-type=git, release=1761123044, summary=Red Hat OpenStack Platform 17.1 iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '182e509007ab5e6e5b2500a552cbd5ba'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', 
'/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, distribution-scope=public, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, build-date=2025-11-18T23:44:13Z, batch=17.1_20251118.1, tcib_managed=true, version=17.1.12, konflux.additional-tags=17.1.12 17.1_20251118.1, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, io.openshift.expose-services=, config_id=tripleo_step3, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, name=rhosp17/openstack-iscsid, container_name=iscsid) Dec 15 03:59:46 localhost systemd[1]: 2649c3aec9af2f04509e1d248b1050b202fccc3ed2b86421c89a2a65376db057.service: Deactivated successfully. Dec 15 03:59:46 localhost podman[102930]: 2025-12-15 08:59:46.916737883 +0000 UTC m=+0.229842192 container exec_died 97115ff7c5a84ae7e17f79e696e94da3c63f9fe0051287fe9193bec282a03f99 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, config_id=tripleo_step4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, name=rhosp17/openstack-ceilometer-ipmi, batch=17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, release=1761123044, com.redhat.component=openstack-ceilometer-ipmi-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, container_name=ceilometer_agent_ipmi, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, io.openshift.expose-services=, url=https://www.redhat.com, 
io.buildah.version=1.41.4, tcib_managed=true, version=17.1.12, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, build-date=2025-11-19T00:12:45Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'fcee5a4a91f85471fca7b61211375646'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, konflux.additional-tags=17.1.12 17.1_20251118.1, architecture=x86_64, managed_by=tripleo_ansible, vcs-type=git, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team) Dec 15 03:59:46 localhost systemd[1]: 97115ff7c5a84ae7e17f79e696e94da3c63f9fe0051287fe9193bec282a03f99.service: Deactivated successfully. 
Dec 15 03:59:46 localhost podman[102929]: 2025-12-15 08:59:46.968189518 +0000 UTC m=+0.288156490 container health_status 36a88083ed613f6871d2631ae462bca308a45ba583333f7cfe4d0b0c40a18ba5 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, summary=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, version=17.1.12, io.openshift.expose-services=, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_id=tripleo_step5, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, distribution-scope=public, batch=17.1_20251118.1, build-date=2025-11-19T00:36:58Z, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, io.buildah.version=1.41.4, vendor=Red Hat, Inc., com.redhat.component=openstack-nova-compute-container, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, tcib_managed=true, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, container_name=nova_compute, description=Red Hat OpenStack Platform 17.1 nova-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '182e509007ab5e6e5b2500a552cbd5ba-879500e96bf8dfb93687004bd86f2317'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, konflux.additional-tags=17.1.12 17.1_20251118.1, release=1761123044, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, managed_by=tripleo_ansible, name=rhosp17/openstack-nova-compute) Dec 15 03:59:46 localhost podman[102941]: 2025-12-15 08:59:46.970811938 +0000 UTC m=+0.283378173 container exec_died ac45612fab357b615b4684dbfa5bd000dd29eb1adfbaedb6db0802fc49e89549 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, url=https://www.redhat.com, com.redhat.component=openstack-cron-container, io.buildah.version=1.41.4, name=rhosp17/openstack-cron, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 cron, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, 
maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., batch=17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, container_name=logrotate_crond, managed_by=tripleo_ansible, io.openshift.expose-services=, vcs-type=git, architecture=x86_64, release=1761123044, summary=Red Hat OpenStack Platform 17.1 cron, version=17.1.12, build-date=2025-11-18T22:49:32Z, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.12 17.1_20251118.1, tcib_managed=true) Dec 15 03:59:46 localhost podman[102943]: 2025-12-15 08:59:46.993217437 +0000 UTC m=+0.304770214 container health_status 
d6041e5471624a495d5be42684f6049ccf71802a80d5760239330151d465d146 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, config_id=tripleo_step4, release=1761123044, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, distribution-scope=public, io.buildah.version=1.41.4, vendor=Red Hat, Inc., container_name=ceilometer_agent_compute, tcib_managed=true, vcs-type=git, konflux.additional-tags=17.1.12 17.1_20251118.1, batch=17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, maintainer=OpenStack TripleO Team, version=17.1.12, name=rhosp17/openstack-ceilometer-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-ceilometer-compute-container, managed_by=tripleo_ansible, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'fcee5a4a91f85471fca7b61211375646'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, build-date=2025-11-19T00:11:48Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, url=https://www.redhat.com, architecture=x86_64) Dec 15 03:59:47 localhost podman[102929]: 2025-12-15 08:59:47.018534233 +0000 UTC m=+0.338501215 container exec_died 36a88083ed613f6871d2631ae462bca308a45ba583333f7cfe4d0b0c40a18ba5 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, batch=17.1_20251118.1, architecture=x86_64, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.component=openstack-nova-compute-container, io.buildah.version=1.41.4, summary=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, url=https://www.redhat.com, konflux.additional-tags=17.1.12 17.1_20251118.1, release=1761123044, container_name=nova_compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, version=17.1.12, build-date=2025-11-19T00:36:58Z, description=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, io.openshift.expose-services=, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 
'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '182e509007ab5e6e5b2500a552cbd5ba-879500e96bf8dfb93687004bd86f2317'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, config_id=tripleo_step5, vendor=Red Hat, Inc., name=rhosp17/openstack-nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git) Dec 15 03:59:47 localhost systemd[1]: 
36a88083ed613f6871d2631ae462bca308a45ba583333f7cfe4d0b0c40a18ba5.service: Deactivated successfully. Dec 15 03:59:47 localhost podman[102943]: 2025-12-15 08:59:47.043784468 +0000 UTC m=+0.355337215 container exec_died d6041e5471624a495d5be42684f6049ccf71802a80d5760239330151d465d146 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, name=rhosp17/openstack-ceilometer-compute, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.12 17.1_20251118.1, architecture=x86_64, release=1761123044, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, io.buildah.version=1.41.4, batch=17.1_20251118.1, container_name=ceilometer_agent_compute, tcib_managed=true, com.redhat.component=openstack-ceilometer-compute-container, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'fcee5a4a91f85471fca7b61211375646'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', 
'/var/log/containers/ceilometer:/var/log/ceilometer:z']}, config_id=tripleo_step4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.12, url=https://www.redhat.com, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, managed_by=tripleo_ansible, build-date=2025-11-19T00:11:48Z, io.openshift.expose-services=) Dec 15 03:59:47 localhost systemd[1]: d6041e5471624a495d5be42684f6049ccf71802a80d5760239330151d465d146.service: Deactivated successfully. Dec 15 03:59:47 localhost systemd[1]: ac45612fab357b615b4684dbfa5bd000dd29eb1adfbaedb6db0802fc49e89549.service: Deactivated successfully. Dec 15 03:59:49 localhost systemd[1]: Started /usr/bin/podman healthcheck run 4c3f871f4bdf287b1d3ce717956c1cf130617489217410eadf6fffcc440eb3b0. Dec 15 03:59:49 localhost systemd[1]: tmp-crun.Vb1HEF.mount: Deactivated successfully. 
Dec 15 03:59:49 localhost podman[103063]: 2025-12-15 08:59:49.750879505 +0000 UTC m=+0.069653202 container health_status 4c3f871f4bdf287b1d3ce717956c1cf130617489217410eadf6fffcc440eb3b0 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, vendor=Red Hat, Inc., com.redhat.component=openstack-nova-compute-container, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-11-19T00:36:58Z, tcib_managed=true, release=1761123044, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, architecture=x86_64, container_name=nova_migration_target, description=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20251118.1, io.buildah.version=1.41.4, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, summary=Red Hat OpenStack Platform 17.1 nova-compute, distribution-scope=public, url=https://www.redhat.com, config_id=tripleo_step4, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, name=rhosp17/openstack-nova-compute, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '879500e96bf8dfb93687004bd86f2317'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05) Dec 15 03:59:50 localhost podman[103063]: 2025-12-15 08:59:50.092343548 +0000 UTC m=+0.411117295 container exec_died 4c3f871f4bdf287b1d3ce717956c1cf130617489217410eadf6fffcc440eb3b0 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, tcib_managed=true, architecture=x86_64, batch=17.1_20251118.1, config_id=tripleo_step4, maintainer=OpenStack TripleO Team, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, version=17.1.12, io.openshift.expose-services=, release=1761123044, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.41.4, vcs-type=git, name=rhosp17/openstack-nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-11-19T00:36:58Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.component=openstack-nova-compute-container, managed_by=tripleo_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 
'879500e96bf8dfb93687004bd86f2317'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 nova-compute, url=https://www.redhat.com, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vendor=Red Hat, Inc., org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, container_name=nova_migration_target, description=Red Hat OpenStack Platform 17.1 nova-compute) Dec 15 03:59:50 localhost systemd[1]: 4c3f871f4bdf287b1d3ce717956c1cf130617489217410eadf6fffcc440eb3b0.service: Deactivated successfully. Dec 15 03:59:53 localhost systemd[1]: Started /usr/bin/podman healthcheck run 2df8d20c6ba32f38bd0e6519c9c77a4146b632f21c81197d8c319b223aad81f1. Dec 15 03:59:53 localhost systemd[1]: Started /usr/bin/podman healthcheck run 4d35b7dc3f3a4e3c60661e85d3511f50804e87244d74cab67af5c6e8c447d379. 
Dec 15 03:59:53 localhost podman[103086]: 2025-12-15 08:59:53.754526293 +0000 UTC m=+0.086167344 container health_status 2df8d20c6ba32f38bd0e6519c9c77a4146b632f21c81197d8c319b223aad81f1 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=unhealthy, name=rhosp17/openstack-ovn-controller, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_id=tripleo_step4, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.component=openstack-ovn-controller-container, distribution-scope=public, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, io.buildah.version=1.41.4, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-type=git, io.openshift.expose-services=, architecture=x86_64, release=1761123044, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, description=Red Hat OpenStack Platform 17.1 ovn-controller, summary=Red Hat OpenStack Platform 17.1 ovn-controller, tcib_managed=true, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, 
version=17.1.12, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, vendor=Red Hat, Inc., container_name=ovn_controller, build-date=2025-11-18T23:34:05Z, batch=17.1_20251118.1) Dec 15 03:59:53 localhost podman[103086]: 2025-12-15 08:59:53.771516476 +0000 UTC m=+0.103157548 container exec_died 2df8d20c6ba32f38bd0e6519c9c77a4146b632f21c81197d8c319b223aad81f1 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, managed_by=tripleo_ansible, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, name=rhosp17/openstack-ovn-controller, io.buildah.version=1.41.4, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, version=17.1.12, summary=Red Hat OpenStack Platform 17.1 ovn-controller, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, maintainer=OpenStack TripleO Team, config_id=tripleo_step4, tcib_managed=true, distribution-scope=public, release=1761123044, io.openshift.expose-services=, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, architecture=x86_64, description=Red Hat OpenStack 
Platform 17.1 ovn-controller, com.redhat.component=openstack-ovn-controller-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, container_name=ovn_controller, build-date=2025-11-18T23:34:05Z, url=https://www.redhat.com, konflux.additional-tags=17.1.12 17.1_20251118.1, batch=17.1_20251118.1) Dec 15 03:59:53 localhost podman[103086]: unhealthy Dec 15 03:59:53 localhost systemd[1]: 2df8d20c6ba32f38bd0e6519c9c77a4146b632f21c81197d8c319b223aad81f1.service: Main process exited, code=exited, status=1/FAILURE Dec 15 03:59:53 localhost systemd[1]: 2df8d20c6ba32f38bd0e6519c9c77a4146b632f21c81197d8c319b223aad81f1.service: Failed with result 'exit-code'. Dec 15 03:59:53 localhost podman[103087]: 2025-12-15 08:59:53.85137206 +0000 UTC m=+0.178021417 container health_status 4d35b7dc3f3a4e3c60661e85d3511f50804e87244d74cab67af5c6e8c447d379 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=unhealthy, name=rhosp17/openstack-neutron-metadata-agent-ovn, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'a56a6f14b467cd9064e40c03defa5ed7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', 
'/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, tcib_managed=true, batch=17.1_20251118.1, container_name=ovn_metadata_agent, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, url=https://www.redhat.com, build-date=2025-11-19T00:14:25Z, distribution-scope=public, konflux.additional-tags=17.1.12 17.1_20251118.1, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.12, vendor=Red Hat, Inc., config_id=tripleo_step4, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, release=1761123044, io.buildah.version=1.41.4, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, vcs-type=git, managed_by=tripleo_ansible) Dec 15 03:59:53 localhost podman[103087]: 2025-12-15 08:59:53.891360729 +0000 UTC m=+0.218010046 container exec_died 4d35b7dc3f3a4e3c60661e85d3511f50804e87244d74cab67af5c6e8c447d379 
(image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, vendor=Red Hat, Inc., com.redhat.component=openstack-neutron-metadata-agent-ovn-container, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, distribution-scope=public, name=rhosp17/openstack-neutron-metadata-agent-ovn, maintainer=OpenStack TripleO Team, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'a56a6f14b467cd9064e40c03defa5ed7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, build-date=2025-11-19T00:14:25Z, config_id=tripleo_step4, architecture=x86_64, vcs-type=git, release=1761123044, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, 
container_name=ovn_metadata_agent, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.expose-services=, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, url=https://www.redhat.com, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, version=17.1.12, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.buildah.version=1.41.4, batch=17.1_20251118.1) Dec 15 03:59:53 localhost podman[103087]: unhealthy Dec 15 03:59:53 localhost systemd[1]: 4d35b7dc3f3a4e3c60661e85d3511f50804e87244d74cab67af5c6e8c447d379.service: Main process exited, code=exited, status=1/FAILURE Dec 15 03:59:53 localhost systemd[1]: 4d35b7dc3f3a4e3c60661e85d3511f50804e87244d74cab67af5c6e8c447d379.service: Failed with result 'exit-code'. Dec 15 04:00:02 localhost systemd[1]: Started /usr/bin/podman healthcheck run 6278d648aec9c9f12e0efaafa0d6ae5653eeb34459516699e562eeb9c8d23b3c. Dec 15 04:00:02 localhost systemd[1]: tmp-crun.Y0vxBS.mount: Deactivated successfully. 
Dec 15 04:00:02 localhost podman[103128]: 2025-12-15 09:00:02.761487535 +0000 UTC m=+0.093393536 container health_status 6278d648aec9c9f12e0efaafa0d6ae5653eeb34459516699e562eeb9c8d23b3c (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, summary=Red Hat OpenStack Platform 17.1 qdrouterd, version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, release=1761123044, name=rhosp17/openstack-qdrouterd, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-11-18T22:49:46Z, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '29d07da103bbd44a9ed3e29999314b03'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, batch=17.1_20251118.1, io.openshift.expose-services=, io.buildah.version=1.41.4, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 
qdrouterd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, konflux.additional-tags=17.1.12 17.1_20251118.1, distribution-scope=public, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, architecture=x86_64, url=https://www.redhat.com, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, container_name=metrics_qdr, config_id=tripleo_step1, com.redhat.component=openstack-qdrouterd-container, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, tcib_managed=true, maintainer=OpenStack TripleO Team) Dec 15 04:00:02 localhost podman[103128]: 2025-12-15 09:00:02.959930407 +0000 UTC m=+0.291836388 container exec_died 6278d648aec9c9f12e0efaafa0d6ae5653eeb34459516699e562eeb9c8d23b3c (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, architecture=x86_64, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, build-date=2025-11-18T22:49:46Z, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, name=rhosp17/openstack-qdrouterd, container_name=metrics_qdr, managed_by=tripleo_ansible, io.buildah.version=1.41.4, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-type=git, release=1761123044, com.redhat.component=openstack-qdrouterd-container, version=17.1.12, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, url=https://www.redhat.com, config_data={'environment': 
{'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '29d07da103bbd44a9ed3e29999314b03'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 qdrouterd, config_id=tripleo_step1) Dec 15 04:00:02 localhost systemd[1]: 6278d648aec9c9f12e0efaafa0d6ae5653eeb34459516699e562eeb9c8d23b3c.service: Deactivated successfully. Dec 15 04:00:12 localhost sshd[103158]: main: sshd: ssh-rsa algorithm is disabled Dec 15 04:00:17 localhost systemd[1]: Started /usr/bin/podman healthcheck run 165a6fd1f2b629d16da0798ec1c9bac41d302d530a0ffa668b2315a52d6aa04e. Dec 15 04:00:17 localhost systemd[1]: Started /usr/bin/podman healthcheck run 2649c3aec9af2f04509e1d248b1050b202fccc3ed2b86421c89a2a65376db057. Dec 15 04:00:17 localhost systemd[1]: Started /usr/bin/podman healthcheck run 36a88083ed613f6871d2631ae462bca308a45ba583333f7cfe4d0b0c40a18ba5. Dec 15 04:00:17 localhost systemd[1]: Started /usr/bin/podman healthcheck run 97115ff7c5a84ae7e17f79e696e94da3c63f9fe0051287fe9193bec282a03f99. 
Dec 15 04:00:17 localhost systemd[1]: Started /usr/bin/podman healthcheck run ac45612fab357b615b4684dbfa5bd000dd29eb1adfbaedb6db0802fc49e89549. Dec 15 04:00:17 localhost systemd[1]: Started /usr/bin/podman healthcheck run d6041e5471624a495d5be42684f6049ccf71802a80d5760239330151d465d146. Dec 15 04:00:17 localhost podman[103160]: 2025-12-15 09:00:17.7817962 +0000 UTC m=+0.105524820 container health_status 165a6fd1f2b629d16da0798ec1c9bac41d302d530a0ffa668b2315a52d6aa04e (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, vendor=Red Hat, Inc., batch=17.1_20251118.1, build-date=2025-11-18T22:51:28Z, com.redhat.component=openstack-collectd-container, maintainer=OpenStack TripleO Team, container_name=collectd, architecture=x86_64, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, summary=Red Hat OpenStack Platform 17.1 collectd, vcs-type=git, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 collectd, distribution-scope=public, io.buildah.version=1.41.4, konflux.additional-tags=17.1.12 17.1_20251118.1, release=1761123044, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, name=rhosp17/openstack-collectd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_id=tripleo_step3, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, managed_by=tripleo_ansible, url=https://www.redhat.com, version=17.1.12, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI) Dec 15 04:00:17 localhost systemd[1]: tmp-crun.XZj6iG.mount: Deactivated successfully. 
Dec 15 04:00:17 localhost podman[103173]: 2025-12-15 09:00:17.896113365 +0000 UTC m=+0.204528676 container health_status ac45612fab357b615b4684dbfa5bd000dd29eb1adfbaedb6db0802fc49e89549 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, distribution-scope=public, container_name=logrotate_crond, build-date=2025-11-18T22:49:32Z, batch=17.1_20251118.1, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, vendor=Red Hat, Inc., url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 cron, version=17.1.12, summary=Red Hat OpenStack Platform 17.1 cron, name=rhosp17/openstack-cron, vcs-type=git, release=1761123044, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, konflux.additional-tags=17.1.12 17.1_20251118.1, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, architecture=x86_64, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.component=openstack-cron-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.buildah.version=1.41.4, managed_by=tripleo_ansible, config_id=tripleo_step4) Dec 15 04:00:17 localhost podman[103160]: 2025-12-15 09:00:17.915847302 +0000 UTC m=+0.239575922 container exec_died 165a6fd1f2b629d16da0798ec1c9bac41d302d530a0ffa668b2315a52d6aa04e (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, release=1761123044, name=rhosp17/openstack-collectd, konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-collectd-container, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 collectd, vendor=Red Hat, Inc., vcs-type=git, build-date=2025-11-18T22:51:28Z, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', 
'/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, version=17.1.12, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 collectd, batch=17.1_20251118.1, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, url=https://www.redhat.com, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, container_name=collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, io.buildah.version=1.41.4, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_id=tripleo_step3, architecture=x86_64) Dec 15 04:00:17 localhost systemd[1]: 165a6fd1f2b629d16da0798ec1c9bac41d302d530a0ffa668b2315a52d6aa04e.service: Deactivated successfully. 
Dec 15 04:00:17 localhost podman[103161]: 2025-12-15 09:00:17.87385687 +0000 UTC m=+0.188385854 container health_status 2649c3aec9af2f04509e1d248b1050b202fccc3ed2b86421c89a2a65376db057 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, vendor=Red Hat, Inc., url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-11-18T23:44:13Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, architecture=x86_64, name=rhosp17/openstack-iscsid, release=1761123044, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, maintainer=OpenStack TripleO Team, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, version=17.1.12, container_name=iscsid, com.redhat.component=openstack-iscsid-container, tcib_managed=true, konflux.additional-tags=17.1.12 17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 iscsid, managed_by=tripleo_ansible, batch=17.1_20251118.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '182e509007ab5e6e5b2500a552cbd5ba'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, distribution-scope=public, config_id=tripleo_step3, io.buildah.version=1.41.4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05) Dec 15 04:00:17 localhost podman[103162]: 2025-12-15 09:00:17.816060326 +0000 UTC m=+0.132618984 container health_status 36a88083ed613f6871d2631ae462bca308a45ba583333f7cfe4d0b0c40a18ba5 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, build-date=2025-11-19T00:36:58Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, container_name=nova_compute, architecture=x86_64, batch=17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, release=1761123044, name=rhosp17/openstack-nova-compute, io.openshift.expose-services=, version=17.1.12, url=https://www.redhat.com, vendor=Red Hat, Inc., com.redhat.component=openstack-nova-compute-container, config_id=tripleo_step5, managed_by=tripleo_ansible, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 
'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '182e509007ab5e6e5b2500a552cbd5ba-879500e96bf8dfb93687004bd86f2317'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, description=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.41.4, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, summary=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, distribution-scope=public) Dec 15 04:00:17 localhost podman[103179]: 2025-12-15 09:00:17.983699445 +0000 UTC 
m=+0.285334755 container health_status d6041e5471624a495d5be42684f6049ccf71802a80d5760239330151d465d146 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, url=https://www.redhat.com, batch=17.1_20251118.1, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, container_name=ceilometer_agent_compute, version=17.1.12, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'fcee5a4a91f85471fca7b61211375646'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, vendor=Red Hat, Inc., release=1761123044, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, architecture=x86_64, config_id=tripleo_step4, io.buildah.version=1.41.4, vcs-type=git, com.redhat.component=openstack-ceilometer-compute-container, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, distribution-scope=public, tcib_managed=true, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, build-date=2025-11-19T00:11:48Z, name=rhosp17/openstack-ceilometer-compute) Dec 15 04:00:18 localhost podman[103162]: 2025-12-15 09:00:17.99959181 +0000 UTC m=+0.316150488 container exec_died 36a88083ed613f6871d2631ae462bca308a45ba583333f7cfe4d0b0c40a18ba5 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '182e509007ab5e6e5b2500a552cbd5ba-879500e96bf8dfb93687004bd86f2317'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.component=openstack-nova-compute-container, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-nova-compute, io.buildah.version=1.41.4, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, tcib_managed=true, architecture=x86_64, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, konflux.additional-tags=17.1.12 17.1_20251118.1, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, build-date=2025-11-19T00:36:58Z, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, managed_by=tripleo_ansible, release=1761123044, description=Red Hat OpenStack Platform 17.1 nova-compute, config_id=tripleo_step5, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.12, container_name=nova_compute, io.openshift.expose-services=) Dec 15 04:00:18 
localhost podman[103173]: 2025-12-15 09:00:18.008040155 +0000 UTC m=+0.316455466 container exec_died ac45612fab357b615b4684dbfa5bd000dd29eb1adfbaedb6db0802fc49e89549 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, name=rhosp17/openstack-cron, konflux.additional-tags=17.1.12 17.1_20251118.1, batch=17.1_20251118.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, tcib_managed=true, managed_by=tripleo_ansible, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 cron, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, architecture=x86_64, build-date=2025-11-18T22:49:32Z, release=1761123044, version=17.1.12, io.openshift.expose-services=, url=https://www.redhat.com, description=Red Hat OpenStack 
Platform 17.1 cron, config_id=tripleo_step4, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.4, vcs-type=git, container_name=logrotate_crond, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.component=openstack-cron-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI) Dec 15 04:00:18 localhost systemd[1]: 36a88083ed613f6871d2631ae462bca308a45ba583333f7cfe4d0b0c40a18ba5.service: Deactivated successfully. Dec 15 04:00:18 localhost systemd[1]: ac45612fab357b615b4684dbfa5bd000dd29eb1adfbaedb6db0802fc49e89549.service: Deactivated successfully. Dec 15 04:00:18 localhost podman[103179]: 2025-12-15 09:00:18.03553544 +0000 UTC m=+0.337170770 container exec_died d6041e5471624a495d5be42684f6049ccf71802a80d5760239330151d465d146 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, build-date=2025-11-19T00:11:48Z, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.buildah.version=1.41.4, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 ceilometer-compute, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, release=1761123044, container_name=ceilometer_agent_compute, batch=17.1_20251118.1, vcs-type=git, com.redhat.component=openstack-ceilometer-compute-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, managed_by=tripleo_ansible, config_id=tripleo_step4, version=17.1.12, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 
'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'fcee5a4a91f85471fca7b61211375646'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, url=https://www.redhat.com, name=rhosp17/openstack-ceilometer-compute, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, architecture=x86_64, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI) Dec 15 04:00:18 localhost systemd[1]: d6041e5471624a495d5be42684f6049ccf71802a80d5760239330151d465d146.service: Deactivated successfully. 
Dec 15 04:00:18 localhost podman[103161]: 2025-12-15 09:00:18.058911744 +0000 UTC m=+0.373440708 container exec_died 2649c3aec9af2f04509e1d248b1050b202fccc3ed2b86421c89a2a65376db057 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, konflux.additional-tags=17.1.12 17.1_20251118.1, container_name=iscsid, batch=17.1_20251118.1, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '182e509007ab5e6e5b2500a552cbd5ba'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, com.redhat.component=openstack-iscsid-container, name=rhosp17/openstack-iscsid, managed_by=tripleo_ansible, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.12, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, 
io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.buildah.version=1.41.4, description=Red Hat OpenStack Platform 17.1 iscsid, tcib_managed=true, url=https://www.redhat.com, build-date=2025-11-18T23:44:13Z, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, release=1761123044, config_id=tripleo_step3, io.openshift.expose-services=) Dec 15 04:00:18 localhost systemd[1]: 2649c3aec9af2f04509e1d248b1050b202fccc3ed2b86421c89a2a65376db057.service: Deactivated successfully. Dec 15 04:00:18 localhost podman[103167]: 2025-12-15 09:00:18.041663543 +0000 UTC m=+0.354044910 container health_status 97115ff7c5a84ae7e17f79e696e94da3c63f9fe0051287fe9193bec282a03f99 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, vendor=Red Hat, Inc., org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, url=https://www.redhat.com, batch=17.1_20251118.1, maintainer=OpenStack TripleO Team, version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, container_name=ceilometer_agent_ipmi, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'fcee5a4a91f85471fca7b61211375646'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.expose-services=, distribution-scope=public, tcib_managed=true, io.buildah.version=1.41.4, release=1761123044, konflux.additional-tags=17.1.12 17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-ceilometer-ipmi-container, vcs-type=git, config_id=tripleo_step4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, architecture=x86_64, build-date=2025-11-19T00:12:45Z, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, name=rhosp17/openstack-ceilometer-ipmi) Dec 15 04:00:18 localhost podman[103167]: 2025-12-15 09:00:18.124558588 +0000 UTC m=+0.436939915 container exec_died 97115ff7c5a84ae7e17f79e696e94da3c63f9fe0051287fe9193bec282a03f99 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, url=https://www.redhat.com, batch=17.1_20251118.1, managed_by=tripleo_ansible, version=17.1.12, io.buildah.version=1.41.4, tcib_managed=true, release=1761123044, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, 
config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'fcee5a4a91f85471fca7b61211375646'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, com.redhat.component=openstack-ceilometer-ipmi-container, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, build-date=2025-11-19T00:12:45Z, container_name=ceilometer_agent_ipmi, name=rhosp17/openstack-ceilometer-ipmi, config_id=tripleo_step4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vendor=Red Hat, Inc., architecture=x86_64, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI) Dec 15 04:00:18 localhost systemd[1]: 
97115ff7c5a84ae7e17f79e696e94da3c63f9fe0051287fe9193bec282a03f99.service: Deactivated successfully. Dec 15 04:00:20 localhost systemd[1]: Started /usr/bin/podman healthcheck run 4c3f871f4bdf287b1d3ce717956c1cf130617489217410eadf6fffcc440eb3b0. Dec 15 04:00:20 localhost podman[103293]: 2025-12-15 09:00:20.764862491 +0000 UTC m=+0.096374816 container health_status 4c3f871f4bdf287b1d3ce717956c1cf130617489217410eadf6fffcc440eb3b0 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, vcs-type=git, konflux.additional-tags=17.1.12 17.1_20251118.1, tcib_managed=true, maintainer=OpenStack TripleO Team, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, summary=Red Hat OpenStack Platform 17.1 nova-compute, release=1761123044, vendor=Red Hat, Inc., io.buildah.version=1.41.4, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, version=17.1.12, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-11-19T00:36:58Z, com.redhat.component=openstack-nova-compute-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20251118.1, io.openshift.expose-services=, url=https://www.redhat.com, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '879500e96bf8dfb93687004bd86f2317'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, distribution-scope=public, architecture=x86_64, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_id=tripleo_step4, container_name=nova_migration_target, description=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp17/openstack-nova-compute) Dec 15 04:00:21 localhost podman[103293]: 2025-12-15 09:00:21.133292684 +0000 UTC m=+0.464805019 container exec_died 4c3f871f4bdf287b1d3ce717956c1cf130617489217410eadf6fffcc440eb3b0 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, url=https://www.redhat.com, com.redhat.component=openstack-nova-compute-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '879500e96bf8dfb93687004bd86f2317'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, config_id=tripleo_step4, summary=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp17/openstack-nova-compute, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, vcs-type=git, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, architecture=x86_64, batch=17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2025-11-19T00:36:58Z, version=17.1.12, io.openshift.expose-services=, distribution-scope=public, release=1761123044, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vendor=Red Hat, Inc., konflux.additional-tags=17.1.12 17.1_20251118.1, container_name=nova_migration_target, managed_by=tripleo_ansible) Dec 15 04:00:21 localhost systemd[1]: 4c3f871f4bdf287b1d3ce717956c1cf130617489217410eadf6fffcc440eb3b0.service: Deactivated successfully. Dec 15 04:00:24 localhost systemd[1]: Started /usr/bin/podman healthcheck run 2df8d20c6ba32f38bd0e6519c9c77a4146b632f21c81197d8c319b223aad81f1. 
Dec 15 04:00:24 localhost systemd[1]: Started /usr/bin/podman healthcheck run 4d35b7dc3f3a4e3c60661e85d3511f50804e87244d74cab67af5c6e8c447d379. Dec 15 04:00:24 localhost systemd[1]: tmp-crun.Rh0v7q.mount: Deactivated successfully. Dec 15 04:00:24 localhost podman[103392]: 2025-12-15 09:00:24.734157591 +0000 UTC m=+0.064571306 container health_status 4d35b7dc3f3a4e3c60661e85d3511f50804e87244d74cab67af5c6e8c447d379 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=unhealthy, io.buildah.version=1.41.4, config_id=tripleo_step4, release=1761123044, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, batch=17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp17/openstack-neutron-metadata-agent-ovn, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, architecture=x86_64, vcs-type=git, build-date=2025-11-19T00:14:25Z, container_name=ovn_metadata_agent, version=17.1.12, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'a56a6f14b467cd9064e40c03defa5ed7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', 
'/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, vendor=Red Hat, Inc., tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, konflux.additional-tags=17.1.12 17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.expose-services=) Dec 15 04:00:24 localhost podman[103391]: 2025-12-15 09:00:24.79925192 +0000 UTC m=+0.128706580 container health_status 2df8d20c6ba32f38bd0e6519c9c77a4146b632f21c81197d8c319b223aad81f1 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=unhealthy, build-date=2025-11-18T23:34:05Z, batch=17.1_20251118.1, version=17.1.12, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 
1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vendor=Red Hat, Inc., release=1761123044, distribution-scope=public, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.component=openstack-ovn-controller-container, description=Red Hat OpenStack Platform 17.1 ovn-controller, summary=Red Hat OpenStack Platform 17.1 ovn-controller, url=https://www.redhat.com, io.buildah.version=1.41.4, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, container_name=ovn_controller, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.expose-services=, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, config_id=tripleo_step4, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-ovn-controller) Dec 15 04:00:24 localhost podman[103392]: 2025-12-15 09:00:24.821244198 +0000 UTC m=+0.151657923 container exec_died 4d35b7dc3f3a4e3c60661e85d3511f50804e87244d74cab67af5c6e8c447d379 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, architecture=x86_64, io.openshift.expose-services=, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, managed_by=tripleo_ansible, url=https://www.redhat.com, vcs-type=git, io.openshift.tags=rhosp osp 
openstack osp-17.1 openstack-neutron-metadata-agent-ovn, config_id=tripleo_step4, vendor=Red Hat, Inc., container_name=ovn_metadata_agent, maintainer=OpenStack TripleO Team, release=1761123044, tcib_managed=true, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'a56a6f14b467cd9064e40c03defa5ed7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, distribution-scope=public, batch=17.1_20251118.1, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, io.buildah.version=1.41.4, summary=Red Hat OpenStack 
Platform 17.1 neutron-metadata-agent-ovn, version=17.1.12, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp17/openstack-neutron-metadata-agent-ovn, build-date=2025-11-19T00:14:25Z, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05) Dec 15 04:00:24 localhost podman[103392]: unhealthy Dec 15 04:00:24 localhost systemd[1]: 4d35b7dc3f3a4e3c60661e85d3511f50804e87244d74cab67af5c6e8c447d379.service: Main process exited, code=exited, status=1/FAILURE Dec 15 04:00:24 localhost systemd[1]: 4d35b7dc3f3a4e3c60661e85d3511f50804e87244d74cab67af5c6e8c447d379.service: Failed with result 'exit-code'. Dec 15 04:00:24 localhost podman[103391]: 2025-12-15 09:00:24.839412233 +0000 UTC m=+0.168866843 container exec_died 2df8d20c6ba32f38bd0e6519c9c77a4146b632f21c81197d8c319b223aad81f1 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, release=1761123044, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.4, distribution-scope=public, 
org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, architecture=x86_64, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, batch=17.1_20251118.1, build-date=2025-11-18T23:34:05Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, vendor=Red Hat, Inc., version=17.1.12, name=rhosp17/openstack-ovn-controller, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, tcib_managed=true, com.redhat.component=openstack-ovn-controller-container, config_id=tripleo_step4, konflux.additional-tags=17.1.12 17.1_20251118.1, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 ovn-controller, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 ovn-controller, container_name=ovn_controller) Dec 15 04:00:24 localhost podman[103391]: unhealthy Dec 15 04:00:24 localhost systemd[1]: 2df8d20c6ba32f38bd0e6519c9c77a4146b632f21c81197d8c319b223aad81f1.service: Main process exited, code=exited, status=1/FAILURE Dec 15 04:00:24 localhost systemd[1]: 2df8d20c6ba32f38bd0e6519c9c77a4146b632f21c81197d8c319b223aad81f1.service: Failed with result 'exit-code'. Dec 15 04:00:33 localhost systemd[1]: Started /usr/bin/podman healthcheck run 6278d648aec9c9f12e0efaafa0d6ae5653eeb34459516699e562eeb9c8d23b3c. Dec 15 04:00:33 localhost systemd[1]: tmp-crun.sy4HC8.mount: Deactivated successfully. 
Dec 15 04:00:33 localhost podman[103433]: 2025-12-15 09:00:33.765187008 +0000 UTC m=+0.095048020 container health_status 6278d648aec9c9f12e0efaafa0d6ae5653eeb34459516699e562eeb9c8d23b3c (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 qdrouterd, architecture=x86_64, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '29d07da103bbd44a9ed3e29999314b03'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, vcs-type=git, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, batch=17.1_20251118.1, version=17.1.12, build-date=2025-11-18T22:49:46Z, managed_by=tripleo_ansible, 
cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.buildah.version=1.41.4, name=rhosp17/openstack-qdrouterd, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, release=1761123044, io.openshift.expose-services=, config_id=tripleo_step1, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, distribution-scope=public, container_name=metrics_qdr, com.redhat.component=openstack-qdrouterd-container, description=Red Hat OpenStack Platform 17.1 qdrouterd) Dec 15 04:00:33 localhost podman[103433]: 2025-12-15 09:00:33.959407007 +0000 UTC m=+0.289267999 container exec_died 6278d648aec9c9f12e0efaafa0d6ae5653eeb34459516699e562eeb9c8d23b3c (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.component=openstack-qdrouterd-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, description=Red Hat OpenStack Platform 17.1 qdrouterd, architecture=x86_64, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '29d07da103bbd44a9ed3e29999314b03'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, tcib_managed=true, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, batch=17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1761123044, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.buildah.version=1.41.4, io.openshift.expose-services=, container_name=metrics_qdr, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, maintainer=OpenStack TripleO Team, config_id=tripleo_step1, summary=Red Hat OpenStack Platform 17.1 qdrouterd, name=rhosp17/openstack-qdrouterd, version=17.1.12, build-date=2025-11-18T22:49:46Z, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, konflux.additional-tags=17.1.12 17.1_20251118.1, url=https://www.redhat.com, distribution-scope=public, managed_by=tripleo_ansible) Dec 15 04:00:33 localhost systemd[1]: 6278d648aec9c9f12e0efaafa0d6ae5653eeb34459516699e562eeb9c8d23b3c.service: Deactivated successfully. Dec 15 04:00:48 localhost systemd[1]: Started /usr/bin/podman healthcheck run 165a6fd1f2b629d16da0798ec1c9bac41d302d530a0ffa668b2315a52d6aa04e. Dec 15 04:00:48 localhost systemd[1]: Started /usr/bin/podman healthcheck run 2649c3aec9af2f04509e1d248b1050b202fccc3ed2b86421c89a2a65376db057. Dec 15 04:00:48 localhost systemd[1]: Started /usr/bin/podman healthcheck run 36a88083ed613f6871d2631ae462bca308a45ba583333f7cfe4d0b0c40a18ba5. Dec 15 04:00:48 localhost systemd[1]: Started /usr/bin/podman healthcheck run 97115ff7c5a84ae7e17f79e696e94da3c63f9fe0051287fe9193bec282a03f99. 
Dec 15 04:00:48 localhost systemd[1]: Started /usr/bin/podman healthcheck run ac45612fab357b615b4684dbfa5bd000dd29eb1adfbaedb6db0802fc49e89549. Dec 15 04:00:48 localhost systemd[1]: Started /usr/bin/podman healthcheck run d6041e5471624a495d5be42684f6049ccf71802a80d5760239330151d465d146. Dec 15 04:00:48 localhost systemd[1]: Starting Check and recover tripleo_nova_virtqemud... Dec 15 04:00:48 localhost recover_tripleo_nova_virtqemud[103503]: 61849 Dec 15 04:00:48 localhost systemd[1]: tripleo_nova_virtqemud_recover.service: Deactivated successfully. Dec 15 04:00:48 localhost systemd[1]: Finished Check and recover tripleo_nova_virtqemud. Dec 15 04:00:48 localhost podman[103466]: 2025-12-15 09:00:48.787212598 +0000 UTC m=+0.101369049 container health_status 97115ff7c5a84ae7e17f79e696e94da3c63f9fe0051287fe9193bec282a03f99 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'fcee5a4a91f85471fca7b61211375646'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', 
'/var/log/containers/ceilometer:/var/log/ceilometer:z']}, build-date=2025-11-19T00:12:45Z, konflux.additional-tags=17.1.12 17.1_20251118.1, container_name=ceilometer_agent_ipmi, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, architecture=x86_64, version=17.1.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, url=https://www.redhat.com, name=rhosp17/openstack-ceilometer-ipmi, io.openshift.expose-services=, com.redhat.component=openstack-ceilometer-ipmi-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, managed_by=tripleo_ansible, batch=17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, release=1761123044, vcs-type=git, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.buildah.version=1.41.4, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, config_id=tripleo_step4, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, distribution-scope=public, maintainer=OpenStack TripleO Team) Dec 15 04:00:48 localhost podman[103465]: 2025-12-15 09:00:48.835058637 +0000 UTC m=+0.147714918 container health_status 36a88083ed613f6871d2631ae462bca308a45ba583333f7cfe4d0b0c40a18ba5 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, io.buildah.version=1.41.4, managed_by=tripleo_ansible, config_id=tripleo_step5, summary=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, release=1761123044, url=https://www.redhat.com, version=17.1.12, distribution-scope=public, 
org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.component=openstack-nova-compute-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, name=rhosp17/openstack-nova-compute, tcib_managed=true, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, vendor=Red Hat, Inc., container_name=nova_compute, batch=17.1_20251118.1, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '182e509007ab5e6e5b2500a552cbd5ba-879500e96bf8dfb93687004bd86f2317'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', 
'/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2025-11-19T00:36:58Z, description=Red Hat OpenStack Platform 17.1 nova-compute) Dec 15 04:00:48 localhost podman[103466]: 2025-12-15 09:00:48.846438691 +0000 UTC m=+0.160595142 container exec_died 97115ff7c5a84ae7e17f79e696e94da3c63f9fe0051287fe9193bec282a03f99 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'fcee5a4a91f85471fca7b61211375646'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, maintainer=OpenStack TripleO Team, config_id=tripleo_step4, com.redhat.component=openstack-ceilometer-ipmi-container, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public, name=rhosp17/openstack-ceilometer-ipmi, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, managed_by=tripleo_ansible, build-date=2025-11-19T00:12:45Z, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, io.openshift.expose-services=, release=1761123044, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, tcib_managed=true, version=17.1.12, konflux.additional-tags=17.1.12 17.1_20251118.1, container_name=ceilometer_agent_ipmi, architecture=x86_64, io.buildah.version=1.41.4, url=https://www.redhat.com, vendor=Red Hat, Inc., batch=17.1_20251118.1) Dec 15 04:00:48 localhost systemd[1]: 97115ff7c5a84ae7e17f79e696e94da3c63f9fe0051287fe9193bec282a03f99.service: Deactivated successfully. Dec 15 04:00:48 localhost systemd[1]: tmp-crun.DAqJ35.mount: Deactivated successfully. 
Dec 15 04:00:48 localhost podman[103472]: 2025-12-15 09:00:48.944637964 +0000 UTC m=+0.256985307 container health_status ac45612fab357b615b4684dbfa5bd000dd29eb1adfbaedb6db0802fc49e89549 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 cron, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, com.redhat.component=openstack-cron-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_id=tripleo_step4, tcib_managed=true, io.openshift.expose-services=, distribution-scope=public, vcs-type=git, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 cron, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', 
'/var/log/containers:/var/log/containers:z']}, name=rhosp17/openstack-cron, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, version=17.1.12, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, batch=17.1_20251118.1, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, build-date=2025-11-18T22:49:32Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team, release=1761123044, container_name=logrotate_crond, io.buildah.version=1.41.4) Dec 15 04:00:48 localhost podman[103465]: 2025-12-15 09:00:48.967130215 +0000 UTC m=+0.279786456 container exec_died 36a88083ed613f6871d2631ae462bca308a45ba583333f7cfe4d0b0c40a18ba5 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, release=1761123044, url=https://www.redhat.com, vcs-type=git, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible, version=17.1.12, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, summary=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp17/openstack-nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.component=openstack-nova-compute-container, distribution-scope=public, vendor=Red Hat, Inc., container_name=nova_compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, batch=17.1_20251118.1, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 
'TRIPLEO_CONFIG_HASH': '182e509007ab5e6e5b2500a552cbd5ba-879500e96bf8dfb93687004bd86f2317'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, io.buildah.version=1.41.4, description=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, config_id=tripleo_step5, build-date=2025-11-19T00:36:58Z, architecture=x86_64, tcib_managed=true) Dec 15 04:00:48 localhost podman[103464]: 2025-12-15 09:00:48.917219231 +0000 UTC m=+0.239805827 container health_status 2649c3aec9af2f04509e1d248b1050b202fccc3ed2b86421c89a2a65376db057 
(image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 iscsid, tcib_managed=true, distribution-scope=public, konflux.additional-tags=17.1.12 17.1_20251118.1, release=1761123044, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '182e509007ab5e6e5b2500a552cbd5ba'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, summary=Red Hat OpenStack Platform 17.1 iscsid, maintainer=OpenStack TripleO Team, build-date=2025-11-18T23:44:13Z, url=https://www.redhat.com, batch=17.1_20251118.1, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, architecture=x86_64, io.openshift.expose-services=, name=rhosp17/openstack-iscsid, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, config_id=tripleo_step3, container_name=iscsid, vcs-type=git, io.openshift.tags=rhosp osp 
openstack osp-17.1 openstack-iscsid, com.redhat.component=openstack-iscsid-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.12) Dec 15 04:00:48 localhost podman[103472]: 2025-12-15 09:00:48.976704922 +0000 UTC m=+0.289052265 container exec_died ac45612fab357b615b4684dbfa5bd000dd29eb1adfbaedb6db0802fc49e89549 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, com.redhat.component=openstack-cron-container, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, managed_by=tripleo_ansible, vendor=Red Hat, Inc., vcs-type=git, batch=17.1_20251118.1, build-date=2025-11-18T22:49:32Z, version=17.1.12, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, name=rhosp17/openstack-cron, 
io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, description=Red Hat OpenStack Platform 17.1 cron, architecture=x86_64, konflux.additional-tags=17.1.12 17.1_20251118.1, url=https://www.redhat.com, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, summary=Red Hat OpenStack Platform 17.1 cron, tcib_managed=true, container_name=logrotate_crond, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_id=tripleo_step4, io.buildah.version=1.41.4, maintainer=OpenStack TripleO Team, release=1761123044) Dec 15 04:00:48 localhost systemd[1]: ac45612fab357b615b4684dbfa5bd000dd29eb1adfbaedb6db0802fc49e89549.service: Deactivated successfully. 
Dec 15 04:00:49 localhost podman[103464]: 2025-12-15 09:00:49.001316458 +0000 UTC m=+0.323903034 container exec_died 2649c3aec9af2f04509e1d248b1050b202fccc3ed2b86421c89a2a65376db057 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 iscsid, config_id=tripleo_step3, vcs-type=git, maintainer=OpenStack TripleO Team, build-date=2025-11-18T23:44:13Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 iscsid, architecture=x86_64, com.redhat.component=openstack-iscsid-container, konflux.additional-tags=17.1.12 17.1_20251118.1, version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, managed_by=tripleo_ansible, release=1761123044, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '182e509007ab5e6e5b2500a552cbd5ba'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, vendor=Red Hat, Inc., container_name=iscsid, name=rhosp17/openstack-iscsid, io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, distribution-scope=public, io.openshift.expose-services=, batch=17.1_20251118.1) Dec 15 04:00:49 localhost systemd[1]: 2649c3aec9af2f04509e1d248b1050b202fccc3ed2b86421c89a2a65376db057.service: Deactivated successfully. Dec 15 04:00:49 localhost podman[103478]: 2025-12-15 09:00:49.055093566 +0000 UTC m=+0.367500231 container health_status d6041e5471624a495d5be42684f6049ccf71802a80d5760239330151d465d146 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, version=17.1.12, release=1761123044, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.4, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 ceilometer-compute, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'fcee5a4a91f85471fca7b61211375646'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, com.redhat.component=openstack-ceilometer-compute-container, konflux.additional-tags=17.1.12 17.1_20251118.1, build-date=2025-11-19T00:11:48Z, batch=17.1_20251118.1, managed_by=tripleo_ansible, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, tcib_managed=true, distribution-scope=public, io.openshift.expose-services=, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, name=rhosp17/openstack-ceilometer-compute, config_id=tripleo_step4, container_name=ceilometer_agent_compute, vcs-type=git) Dec 15 04:00:49 localhost podman[103478]: 2025-12-15 09:00:49.088401015 +0000 UTC m=+0.400807690 container exec_died d6041e5471624a495d5be42684f6049ccf71802a80d5760239330151d465d146 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, build-date=2025-11-19T00:11:48Z, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.expose-services=, 
com.redhat.component=openstack-ceilometer-compute-container, vendor=Red Hat, Inc., vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, name=rhosp17/openstack-ceilometer-compute, container_name=ceilometer_agent_compute, io.buildah.version=1.41.4, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, tcib_managed=true, vcs-type=git, config_id=tripleo_step4, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, release=1761123044, batch=17.1_20251118.1, version=17.1.12, architecture=x86_64, distribution-scope=public, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'fcee5a4a91f85471fca7b61211375646'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute) Dec 15 04:00:49 localhost podman[103463]: 2025-12-15 09:00:49.096526902 +0000 UTC m=+0.423410383 container health_status 165a6fd1f2b629d16da0798ec1c9bac41d302d530a0ffa668b2315a52d6aa04e (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 collectd, com.redhat.component=openstack-collectd-container, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, config_id=tripleo_step3, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, vcs-type=git, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 collectd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, konflux.additional-tags=17.1.12 17.1_20251118.1, batch=17.1_20251118.1, io.buildah.version=1.41.4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, name=rhosp17/openstack-collectd, build-date=2025-11-18T22:51:28Z, release=1761123044, url=https://www.redhat.com, container_name=collectd, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, architecture=x86_64, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, version=17.1.12, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc.) Dec 15 04:00:49 localhost systemd[1]: d6041e5471624a495d5be42684f6049ccf71802a80d5760239330151d465d146.service: Deactivated successfully. 
Dec 15 04:00:49 localhost podman[103463]: 2025-12-15 09:00:49.109633393 +0000 UTC m=+0.436516804 container exec_died 165a6fd1f2b629d16da0798ec1c9bac41d302d530a0ffa668b2315a52d6aa04e (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-type=git, com.redhat.component=openstack-collectd-container, build-date=2025-11-18T22:51:28Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, description=Red Hat OpenStack Platform 17.1 collectd, distribution-scope=public, name=rhosp17/openstack-collectd, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, url=https://www.redhat.com, container_name=collectd, architecture=x86_64, version=17.1.12, managed_by=tripleo_ansible, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.12 17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 collectd, vendor=Red Hat, Inc., io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, config_id=tripleo_step3, io.openshift.expose-services=, batch=17.1_20251118.1, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, maintainer=OpenStack TripleO Team, release=1761123044, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a) Dec 15 04:00:49 localhost systemd[1]: 165a6fd1f2b629d16da0798ec1c9bac41d302d530a0ffa668b2315a52d6aa04e.service: Deactivated successfully. Dec 15 04:00:49 localhost systemd[1]: 36a88083ed613f6871d2631ae462bca308a45ba583333f7cfe4d0b0c40a18ba5.service: Deactivated successfully. Dec 15 04:00:51 localhost systemd[1]: Started /usr/bin/podman healthcheck run 4c3f871f4bdf287b1d3ce717956c1cf130617489217410eadf6fffcc440eb3b0. 
Dec 15 04:00:51 localhost podman[103600]: 2025-12-15 09:00:51.76584723 +0000 UTC m=+0.093307004 container health_status 4c3f871f4bdf287b1d3ce717956c1cf130617489217410eadf6fffcc440eb3b0 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, maintainer=OpenStack TripleO Team, architecture=x86_64, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, vendor=Red Hat, Inc., build-date=2025-11-19T00:36:58Z, konflux.additional-tags=17.1.12 17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.12, vcs-type=git, managed_by=tripleo_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '879500e96bf8dfb93687004bd86f2317'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, tcib_managed=true, container_name=nova_migration_target, io.openshift.expose-services=, 
vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, name=rhosp17/openstack-nova-compute, config_id=tripleo_step4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1, com.redhat.component=openstack-nova-compute-container, distribution-scope=public, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, release=1761123044, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute) Dec 15 04:00:52 localhost podman[103600]: 2025-12-15 09:00:52.171689953 +0000 UTC m=+0.499149747 container exec_died 4c3f871f4bdf287b1d3ce717956c1cf130617489217410eadf6fffcc440eb3b0 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.component=openstack-nova-compute-container, description=Red Hat OpenStack Platform 17.1 nova-compute, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '879500e96bf8dfb93687004bd86f2317'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, tcib_managed=true, container_name=nova_migration_target, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, config_id=tripleo_step4, url=https://www.redhat.com, version=17.1.12, build-date=2025-11-19T00:36:58Z, io.buildah.version=1.41.4, vcs-type=git, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, name=rhosp17/openstack-nova-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, batch=17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.expose-services=, vendor=Red Hat, Inc., release=1761123044, architecture=x86_64) Dec 15 04:00:52 localhost systemd[1]: 4c3f871f4bdf287b1d3ce717956c1cf130617489217410eadf6fffcc440eb3b0.service: Deactivated successfully. Dec 15 04:00:55 localhost systemd[1]: Started /usr/bin/podman healthcheck run 2df8d20c6ba32f38bd0e6519c9c77a4146b632f21c81197d8c319b223aad81f1. Dec 15 04:00:55 localhost systemd[1]: Started /usr/bin/podman healthcheck run 4d35b7dc3f3a4e3c60661e85d3511f50804e87244d74cab67af5c6e8c447d379. Dec 15 04:00:55 localhost systemd[1]: tmp-crun.bOod98.mount: Deactivated successfully. 
Dec 15 04:00:55 localhost podman[103623]: 2025-12-15 09:00:55.773057544 +0000 UTC m=+0.100281980 container health_status 2df8d20c6ba32f38bd0e6519c9c77a4146b632f21c81197d8c319b223aad81f1 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=unhealthy, managed_by=tripleo_ansible, vcs-type=git, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, io.buildah.version=1.41.4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, name=rhosp17/openstack-ovn-controller, release=1761123044, konflux.additional-tags=17.1.12 17.1_20251118.1, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-18T23:34:05Z, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, batch=17.1_20251118.1, distribution-scope=public, maintainer=OpenStack TripleO Team, version=17.1.12, container_name=ovn_controller, summary=Red Hat OpenStack Platform 17.1 ovn-controller, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, io.openshift.expose-services=, url=https://www.redhat.com, tcib_managed=true, com.redhat.component=openstack-ovn-controller-container, description=Red Hat OpenStack Platform 17.1 
ovn-controller, config_id=tripleo_step4, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI) Dec 15 04:00:55 localhost podman[103623]: 2025-12-15 09:00:55.788495866 +0000 UTC m=+0.115720282 container exec_died 2df8d20c6ba32f38bd0e6519c9c77a4146b632f21c81197d8c319b223aad81f1 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step4, io.openshift.expose-services=, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, container_name=ovn_controller, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, com.redhat.component=openstack-ovn-controller-container, summary=Red Hat OpenStack Platform 17.1 ovn-controller, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, version=17.1.12, release=1761123044, description=Red Hat OpenStack Platform 17.1 ovn-controller, 
konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, name=rhosp17/openstack-ovn-controller, tcib_managed=true, build-date=2025-11-18T23:34:05Z, io.buildah.version=1.41.4, batch=17.1_20251118.1, managed_by=tripleo_ansible, vcs-type=git, url=https://www.redhat.com) Dec 15 04:00:55 localhost podman[103623]: unhealthy Dec 15 04:00:55 localhost systemd[1]: 2df8d20c6ba32f38bd0e6519c9c77a4146b632f21c81197d8c319b223aad81f1.service: Main process exited, code=exited, status=1/FAILURE Dec 15 04:00:55 localhost systemd[1]: 2df8d20c6ba32f38bd0e6519c9c77a4146b632f21c81197d8c319b223aad81f1.service: Failed with result 'exit-code'. Dec 15 04:00:55 localhost systemd[1]: tmp-crun.Nr6IQa.mount: Deactivated successfully. Dec 15 04:00:55 localhost podman[103624]: 2025-12-15 09:00:55.86986193 +0000 UTC m=+0.195557736 container health_status 4d35b7dc3f3a4e3c60661e85d3511f50804e87244d74cab67af5c6e8c447d379 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=unhealthy, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, version=17.1.12, container_name=ovn_metadata_agent, name=rhosp17/openstack-neutron-metadata-agent-ovn, tcib_managed=true, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, 
vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, vcs-type=git, build-date=2025-11-19T00:14:25Z, release=1761123044, managed_by=tripleo_ansible, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, batch=17.1_20251118.1, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'a56a6f14b467cd9064e40c03defa5ed7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.buildah.version=1.41.4, io.openshift.expose-services=, config_id=tripleo_step4) Dec 15 04:00:55 localhost podman[103624]: 2025-12-15 09:00:55.913789314 
+0000 UTC m=+0.239485090 container exec_died 4d35b7dc3f3a4e3c60661e85d3511f50804e87244d74cab67af5c6e8c447d379 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, architecture=x86_64, io.buildah.version=1.41.4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, container_name=ovn_metadata_agent, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, vendor=Red Hat, Inc., config_id=tripleo_step4, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, version=17.1.12, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'a56a6f14b467cd9064e40c03defa5ed7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', 
'/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-type=git, release=1761123044, name=rhosp17/openstack-neutron-metadata-agent-ovn, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-11-19T00:14:25Z, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, batch=17.1_20251118.1, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c) Dec 15 04:00:55 localhost podman[103624]: unhealthy Dec 15 04:00:55 localhost systemd[1]: 4d35b7dc3f3a4e3c60661e85d3511f50804e87244d74cab67af5c6e8c447d379.service: Main process exited, code=exited, status=1/FAILURE Dec 15 04:00:55 localhost systemd[1]: 4d35b7dc3f3a4e3c60661e85d3511f50804e87244d74cab67af5c6e8c447d379.service: Failed with result 'exit-code'. Dec 15 04:01:04 localhost systemd[1]: Started /usr/bin/podman healthcheck run 6278d648aec9c9f12e0efaafa0d6ae5653eeb34459516699e562eeb9c8d23b3c. 
Dec 15 04:01:04 localhost podman[103688]: 2025-12-15 09:01:04.754754293 +0000 UTC m=+0.090506019 container health_status 6278d648aec9c9f12e0efaafa0d6ae5653eeb34459516699e562eeb9c8d23b3c (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.12 17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.component=openstack-qdrouterd-container, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, name=rhosp17/openstack-qdrouterd, version=17.1.12, batch=17.1_20251118.1, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, build-date=2025-11-18T22:49:46Z, io.buildah.version=1.41.4, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '29d07da103bbd44a9ed3e29999314b03'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, vcs-type=git, vendor=Red Hat, Inc., container_name=metrics_qdr, release=1761123044, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 qdrouterd, config_id=tripleo_step1, summary=Red Hat OpenStack Platform 17.1 qdrouterd) Dec 15 04:01:04 localhost podman[103688]: 2025-12-15 09:01:04.992915546 +0000 UTC m=+0.328667262 container exec_died 6278d648aec9c9f12e0efaafa0d6ae5653eeb34459516699e562eeb9c8d23b3c (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, batch=17.1_20251118.1, io.buildah.version=1.41.4, io.openshift.expose-services=, config_id=tripleo_step1, version=17.1.12, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, summary=Red Hat OpenStack Platform 17.1 qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '29d07da103bbd44a9ed3e29999314b03'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, managed_by=tripleo_ansible, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, release=1761123044, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, maintainer=OpenStack TripleO Team, tcib_managed=true, distribution-scope=public, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, name=rhosp17/openstack-qdrouterd, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, architecture=x86_64, build-date=2025-11-18T22:49:46Z, description=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.component=openstack-qdrouterd-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vendor=Red Hat, Inc., container_name=metrics_qdr) Dec 15 04:01:05 localhost systemd[1]: 6278d648aec9c9f12e0efaafa0d6ae5653eeb34459516699e562eeb9c8d23b3c.service: Deactivated successfully. Dec 15 04:01:19 localhost systemd[1]: Started /usr/bin/podman healthcheck run 165a6fd1f2b629d16da0798ec1c9bac41d302d530a0ffa668b2315a52d6aa04e. Dec 15 04:01:19 localhost systemd[1]: Started /usr/bin/podman healthcheck run 2649c3aec9af2f04509e1d248b1050b202fccc3ed2b86421c89a2a65376db057. Dec 15 04:01:19 localhost systemd[1]: Started /usr/bin/podman healthcheck run 36a88083ed613f6871d2631ae462bca308a45ba583333f7cfe4d0b0c40a18ba5. Dec 15 04:01:19 localhost systemd[1]: Started /usr/bin/podman healthcheck run 97115ff7c5a84ae7e17f79e696e94da3c63f9fe0051287fe9193bec282a03f99. 
Dec 15 04:01:19 localhost systemd[1]: Started /usr/bin/podman healthcheck run ac45612fab357b615b4684dbfa5bd000dd29eb1adfbaedb6db0802fc49e89549. Dec 15 04:01:19 localhost systemd[1]: Started /usr/bin/podman healthcheck run d6041e5471624a495d5be42684f6049ccf71802a80d5760239330151d465d146. Dec 15 04:01:19 localhost podman[103719]: 2025-12-15 09:01:19.774284218 +0000 UTC m=+0.096780376 container health_status 165a6fd1f2b629d16da0798ec1c9bac41d302d530a0ffa668b2315a52d6aa04e (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, version=17.1.12, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, release=1761123044, batch=17.1_20251118.1, name=rhosp17/openstack-collectd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-collectd-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, tcib_managed=true, container_name=collectd, architecture=x86_64, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, distribution-scope=public, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., build-date=2025-11-18T22:51:28Z, konflux.additional-tags=17.1.12 17.1_20251118.1, url=https://www.redhat.com, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 collectd, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, io.buildah.version=1.41.4, config_id=tripleo_step3, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 collectd, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 
'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, vcs-type=git) Dec 15 04:01:19 localhost podman[103744]: 2025-12-15 09:01:19.787609604 +0000 UTC m=+0.089750529 container health_status d6041e5471624a495d5be42684f6049ccf71802a80d5760239330151d465d146 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, build-date=2025-11-19T00:11:48Z, konflux.additional-tags=17.1.12 17.1_20251118.1, release=1761123044, io.openshift.expose-services=, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, managed_by=tripleo_ansible, batch=17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 
ceilometer-compute, vcs-type=git, config_id=tripleo_step4, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, tcib_managed=true, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, container_name=ceilometer_agent_compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'fcee5a4a91f85471fca7b61211375646'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.buildah.version=1.41.4, version=17.1.12, com.redhat.component=openstack-ceilometer-compute-container, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, name=rhosp17/openstack-ceilometer-compute, distribution-scope=public, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI) Dec 15 04:01:19 localhost podman[103744]: 2025-12-15 09:01:19.814327119 +0000 UTC m=+0.116468064 container exec_died 
d6041e5471624a495d5be42684f6049ccf71802a80d5760239330151d465d146 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, url=https://www.redhat.com, vendor=Red Hat, Inc., release=1761123044, build-date=2025-11-19T00:11:48Z, batch=17.1_20251118.1, container_name=ceilometer_agent_compute, config_id=tripleo_step4, io.buildah.version=1.41.4, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'fcee5a4a91f85471fca7b61211375646'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, managed_by=tripleo_ansible, distribution-scope=public, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, io.k8s.display-name=Red 
Hat OpenStack Platform 17.1 ceilometer-compute, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-ceilometer-compute-container, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, name=rhosp17/openstack-ceilometer-compute, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, architecture=x86_64, vcs-type=git, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.expose-services=) Dec 15 04:01:19 localhost systemd[1]: d6041e5471624a495d5be42684f6049ccf71802a80d5760239330151d465d146.service: Deactivated successfully. Dec 15 04:01:19 localhost podman[103720]: 2025-12-15 09:01:19.830260244 +0000 UTC m=+0.148299544 container health_status 2649c3aec9af2f04509e1d248b1050b202fccc3ed2b86421c89a2a65376db057 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, vcs-type=git, konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '182e509007ab5e6e5b2500a552cbd5ba'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', 
'/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, vendor=Red Hat, Inc., release=1761123044, batch=17.1_20251118.1, tcib_managed=true, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, build-date=2025-11-18T23:44:13Z, name=rhosp17/openstack-iscsid, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, architecture=x86_64, com.redhat.component=openstack-iscsid-container, description=Red Hat OpenStack Platform 17.1 iscsid, container_name=iscsid, distribution-scope=public, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, summary=Red Hat OpenStack Platform 17.1 iscsid, config_id=tripleo_step3, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, version=17.1.12, url=https://www.redhat.com) Dec 15 04:01:19 localhost podman[103719]: 2025-12-15 09:01:19.865103285 +0000 UTC m=+0.187599463 container exec_died 165a6fd1f2b629d16da0798ec1c9bac41d302d530a0ffa668b2315a52d6aa04e (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 
'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, io.openshift.expose-services=, container_name=collectd, config_id=tripleo_step3, vcs-type=git, version=17.1.12, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, name=rhosp17/openstack-collectd, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, batch=17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 collectd, io.buildah.version=1.41.4, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, com.redhat.component=openstack-collectd-container, tcib_managed=true, konflux.additional-tags=17.1.12 17.1_20251118.1, release=1761123044, architecture=x86_64, build-date=2025-11-18T22:51:28Z, summary=Red Hat OpenStack Platform 17.1 collectd, 
managed_by=tripleo_ansible, vendor=Red Hat, Inc., org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, url=https://www.redhat.com) Dec 15 04:01:19 localhost podman[103724]: 2025-12-15 09:01:19.910329333 +0000 UTC m=+0.225858125 container health_status 97115ff7c5a84ae7e17f79e696e94da3c63f9fe0051287fe9193bec282a03f99 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, io.openshift.expose-services=, managed_by=tripleo_ansible, architecture=x86_64, build-date=2025-11-19T00:12:45Z, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vendor=Red Hat, Inc., version=17.1.12, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, container_name=ceilometer_agent_ipmi, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20251118.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'fcee5a4a91f85471fca7b61211375646'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, config_id=tripleo_step4, io.buildah.version=1.41.4, name=rhosp17/openstack-ceilometer-ipmi, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, com.redhat.component=openstack-ceilometer-ipmi-container, maintainer=OpenStack TripleO Team, release=1761123044, url=https://www.redhat.com, distribution-scope=public, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, konflux.additional-tags=17.1.12 17.1_20251118.1) Dec 15 04:01:19 localhost systemd[1]: 165a6fd1f2b629d16da0798ec1c9bac41d302d530a0ffa668b2315a52d6aa04e.service: Deactivated successfully. Dec 15 04:01:19 localhost podman[103728]: 2025-12-15 09:01:19.888763167 +0000 UTC m=+0.196344447 container health_status ac45612fab357b615b4684dbfa5bd000dd29eb1adfbaedb6db0802fc49e89549 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, version=17.1.12, build-date=2025-11-18T22:49:32Z, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, vcs-type=git, io.openshift.expose-services=, batch=17.1_20251118.1, com.redhat.component=openstack-cron-container, summary=Red Hat OpenStack Platform 17.1 cron, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, container_name=logrotate_crond, managed_by=tripleo_ansible, 
vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, name=rhosp17/openstack-cron, config_id=tripleo_step4, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 17.1_20251118.1, release=1761123044, maintainer=OpenStack TripleO Team, architecture=x86_64, vendor=Red Hat, Inc., url=https://www.redhat.com, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, description=Red Hat OpenStack Platform 17.1 cron, distribution-scope=public) Dec 15 04:01:19 localhost podman[103721]: 2025-12-15 09:01:19.860186263 +0000 UTC m=+0.177144803 container health_status 36a88083ed613f6871d2631ae462bca308a45ba583333f7cfe4d0b0c40a18ba5 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, version=17.1.12, io.openshift.expose-services=, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, 
batch=17.1_20251118.1, vendor=Red Hat, Inc., build-date=2025-11-19T00:36:58Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '182e509007ab5e6e5b2500a552cbd5ba-879500e96bf8dfb93687004bd86f2317'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, description=Red Hat OpenStack Platform 17.1 nova-compute, 
config_id=tripleo_step5, com.redhat.component=openstack-nova-compute-container, konflux.additional-tags=17.1.12 17.1_20251118.1, io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, release=1761123044, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp17/openstack-nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, url=https://www.redhat.com, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-type=git, container_name=nova_compute, architecture=x86_64, maintainer=OpenStack TripleO Team, tcib_managed=true, distribution-scope=public) Dec 15 04:01:19 localhost podman[103720]: 2025-12-15 09:01:19.968126478 +0000 UTC m=+0.286165838 container exec_died 2649c3aec9af2f04509e1d248b1050b202fccc3ed2b86421c89a2a65376db057 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, version=17.1.12, config_id=tripleo_step3, summary=Red Hat OpenStack Platform 17.1 iscsid, build-date=2025-11-18T23:44:13Z, vendor=Red Hat, Inc., config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '182e509007ab5e6e5b2500a552cbd5ba'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, io.openshift.expose-services=, vcs-type=git, konflux.additional-tags=17.1.12 17.1_20251118.1, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 iscsid, name=rhosp17/openstack-iscsid, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, container_name=iscsid, io.buildah.version=1.41.4, batch=17.1_20251118.1, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, com.redhat.component=openstack-iscsid-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1761123044, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream) Dec 15 04:01:19 localhost systemd[1]: 2649c3aec9af2f04509e1d248b1050b202fccc3ed2b86421c89a2a65376db057.service: Deactivated successfully. 
Dec 15 04:01:19 localhost podman[103721]: 2025-12-15 09:01:19.98993657 +0000 UTC m=+0.306895150 container exec_died 36a88083ed613f6871d2631ae462bca308a45ba583333f7cfe4d0b0c40a18ba5 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, batch=17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, vendor=Red Hat, Inc., config_id=tripleo_step5, io.openshift.expose-services=, distribution-scope=public, tcib_managed=true, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '182e509007ab5e6e5b2500a552cbd5ba-879500e96bf8dfb93687004bd86f2317'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', 
'/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, com.redhat.component=openstack-nova-compute-container, managed_by=tripleo_ansible, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, name=rhosp17/openstack-nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.12, summary=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, description=Red Hat OpenStack Platform 17.1 nova-compute, release=1761123044, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-type=git, build-date=2025-11-19T00:36:58Z, container_name=nova_compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64) Dec 15 04:01:19 localhost podman[103724]: 2025-12-15 09:01:19.990404552 +0000 UTC m=+0.305933384 container exec_died 97115ff7c5a84ae7e17f79e696e94da3c63f9fe0051287fe9193bec282a03f99 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, managed_by=tripleo_ansible, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, build-date=2025-11-19T00:12:45Z, container_name=ceilometer_agent_ipmi, io.openshift.expose-services=, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-type=git, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, batch=17.1_20251118.1, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, version=17.1.12, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp17/openstack-ceilometer-ipmi, config_id=tripleo_step4, io.buildah.version=1.41.4, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'fcee5a4a91f85471fca7b61211375646'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, release=1761123044, url=https://www.redhat.com, distribution-scope=public, tcib_managed=true, com.redhat.component=openstack-ceilometer-ipmi-container, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, architecture=x86_64, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream) Dec 15 04:01:20 localhost systemd[1]: 
36a88083ed613f6871d2631ae462bca308a45ba583333f7cfe4d0b0c40a18ba5.service: Deactivated successfully. Dec 15 04:01:20 localhost podman[103728]: 2025-12-15 09:01:20.023323552 +0000 UTC m=+0.330904842 container exec_died ac45612fab357b615b4684dbfa5bd000dd29eb1adfbaedb6db0802fc49e89549 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, vendor=Red Hat, Inc., vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, version=17.1.12, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 cron, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=logrotate_crond, io.openshift.expose-services=, distribution-scope=public, release=1761123044, vcs-type=git, io.buildah.version=1.41.4, konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 cron, name=rhosp17/openstack-cron, com.redhat.component=openstack-cron-container, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, build-date=2025-11-18T22:49:32Z, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, url=https://www.redhat.com, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, managed_by=tripleo_ansible, config_id=tripleo_step4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a) Dec 15 04:01:20 localhost systemd[1]: ac45612fab357b615b4684dbfa5bd000dd29eb1adfbaedb6db0802fc49e89549.service: Deactivated successfully. Dec 15 04:01:20 localhost systemd[1]: 97115ff7c5a84ae7e17f79e696e94da3c63f9fe0051287fe9193bec282a03f99.service: Deactivated successfully. Dec 15 04:01:20 localhost systemd[1]: tmp-crun.zHBxHh.mount: Deactivated successfully. Dec 15 04:01:22 localhost systemd[1]: Started /usr/bin/podman healthcheck run 4c3f871f4bdf287b1d3ce717956c1cf130617489217410eadf6fffcc440eb3b0. 
Dec 15 04:01:22 localhost podman[103868]: 2025-12-15 09:01:22.366081544 +0000 UTC m=+0.101935434 container health_status 4c3f871f4bdf287b1d3ce717956c1cf130617489217410eadf6fffcc440eb3b0 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '879500e96bf8dfb93687004bd86f2317'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, summary=Red Hat OpenStack Platform 17.1 nova-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp17/openstack-nova-compute, batch=17.1_20251118.1, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-nova-compute-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-19T00:36:58Z, 
architecture=x86_64, config_id=tripleo_step4, tcib_managed=true, url=https://www.redhat.com, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, distribution-scope=public, version=17.1.12, container_name=nova_migration_target, release=1761123044, vcs-type=git, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, managed_by=tripleo_ansible, konflux.additional-tags=17.1.12 17.1_20251118.1, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute) Dec 15 04:01:22 localhost podman[103868]: 2025-12-15 09:01:22.707304351 +0000 UTC m=+0.443158191 container exec_died 4c3f871f4bdf287b1d3ce717956c1cf130617489217410eadf6fffcc440eb3b0 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, io.buildah.version=1.41.4, konflux.additional-tags=17.1.12 17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, container_name=nova_migration_target, description=Red Hat OpenStack Platform 17.1 nova-compute, release=1761123044, maintainer=OpenStack TripleO Team, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_id=tripleo_step4, name=rhosp17/openstack-nova-compute, distribution-scope=public, vcs-type=git, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '879500e96bf8dfb93687004bd86f2317'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 
'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, io.openshift.expose-services=, version=17.1.12, url=https://www.redhat.com, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., build-date=2025-11-19T00:36:58Z, summary=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, batch=17.1_20251118.1, architecture=x86_64, com.redhat.component=openstack-nova-compute-container) Dec 15 04:01:22 localhost systemd[1]: 4c3f871f4bdf287b1d3ce717956c1cf130617489217410eadf6fffcc440eb3b0.service: Deactivated successfully. Dec 15 04:01:26 localhost systemd[1]: Started /usr/bin/podman healthcheck run 2df8d20c6ba32f38bd0e6519c9c77a4146b632f21c81197d8c319b223aad81f1. Dec 15 04:01:26 localhost systemd[1]: Started /usr/bin/podman healthcheck run 4d35b7dc3f3a4e3c60661e85d3511f50804e87244d74cab67af5c6e8c447d379. 
Dec 15 04:01:26 localhost podman[103955]: 2025-12-15 09:01:26.750829595 +0000 UTC m=+0.081455467 container health_status 4d35b7dc3f3a4e3c60661e85d3511f50804e87244d74cab67af5c6e8c447d379 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=unhealthy, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.expose-services=, architecture=x86_64, distribution-scope=public, batch=17.1_20251118.1, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'a56a6f14b467cd9064e40c03defa5ed7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', 
'/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, build-date=2025-11-19T00:14:25Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, version=17.1.12, name=rhosp17/openstack-neutron-metadata-agent-ovn, url=https://www.redhat.com, managed_by=tripleo_ansible, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, release=1761123044, vcs-type=git, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, container_name=ovn_metadata_agent, konflux.additional-tags=17.1.12 17.1_20251118.1, config_id=tripleo_step4, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, io.buildah.version=1.41.4, tcib_managed=true, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn) Dec 15 04:01:26 localhost podman[103955]: 2025-12-15 09:01:26.795762685 +0000 UTC m=+0.126388517 container exec_died 4d35b7dc3f3a4e3c60661e85d3511f50804e87244d74cab67af5c6e8c447d379 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'a56a6f14b467cd9064e40c03defa5ed7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, version=17.1.12, container_name=ovn_metadata_agent, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-type=git, config_id=tripleo_step4, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, konflux.additional-tags=17.1.12 17.1_20251118.1, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, tcib_managed=true, io.openshift.expose-services=, release=1761123044, name=rhosp17/openstack-neutron-metadata-agent-ovn, managed_by=tripleo_ansible, architecture=x86_64, build-date=2025-11-19T00:14:25Z, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 
neutron-metadata-agent-ovn, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.component=openstack-neutron-metadata-agent-ovn-container) Dec 15 04:01:26 localhost podman[103955]: unhealthy Dec 15 04:01:26 localhost systemd[1]: 4d35b7dc3f3a4e3c60661e85d3511f50804e87244d74cab67af5c6e8c447d379.service: Main process exited, code=exited, status=1/FAILURE Dec 15 04:01:26 localhost systemd[1]: 4d35b7dc3f3a4e3c60661e85d3511f50804e87244d74cab67af5c6e8c447d379.service: Failed with result 'exit-code'. Dec 15 04:01:26 localhost podman[103954]: 2025-12-15 09:01:26.80268172 +0000 UTC m=+0.133742595 container health_status 2df8d20c6ba32f38bd0e6519c9c77a4146b632f21c81197d8c319b223aad81f1 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=unhealthy, batch=17.1_20251118.1, build-date=2025-11-18T23:34:05Z, config_id=tripleo_step4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, io.buildah.version=1.41.4, summary=Red Hat OpenStack Platform 17.1 ovn-controller, description=Red Hat OpenStack Platform 17.1 ovn-controller, managed_by=tripleo_ansible, vcs-type=git, architecture=x86_64, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, release=1761123044, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.component=openstack-ovn-controller-container, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 
'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, version=17.1.12, vendor=Red Hat, Inc., container_name=ovn_controller, tcib_managed=true, distribution-scope=public, konflux.additional-tags=17.1.12 17.1_20251118.1, name=rhosp17/openstack-ovn-controller, io.openshift.expose-services=, url=https://www.redhat.com) Dec 15 04:01:26 localhost podman[103954]: 2025-12-15 09:01:26.886585461 +0000 UTC m=+0.217646336 container exec_died 2df8d20c6ba32f38bd0e6519c9c77a4146b632f21c81197d8c319b223aad81f1 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, build-date=2025-11-18T23:34:05Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, url=https://www.redhat.com, batch=17.1_20251118.1, container_name=ovn_controller, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp17/openstack-ovn-controller, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.buildah.version=1.41.4, summary=Red Hat OpenStack Platform 17.1 ovn-controller, konflux.additional-tags=17.1.12 17.1_20251118.1, version=17.1.12, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 ovn-controller, release=1761123044, vcs-type=git, 
com.redhat.component=openstack-ovn-controller-container, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, io.openshift.expose-services=, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, config_id=tripleo_step4, maintainer=OpenStack TripleO Team, distribution-scope=public) Dec 15 04:01:26 localhost podman[103954]: unhealthy Dec 15 04:01:26 localhost systemd[1]: 2df8d20c6ba32f38bd0e6519c9c77a4146b632f21c81197d8c319b223aad81f1.service: Main process exited, code=exited, status=1/FAILURE Dec 15 04:01:26 localhost systemd[1]: 2df8d20c6ba32f38bd0e6519c9c77a4146b632f21c81197d8c319b223aad81f1.service: Failed with result 'exit-code'. Dec 15 04:01:35 localhost systemd[1]: Started /usr/bin/podman healthcheck run 6278d648aec9c9f12e0efaafa0d6ae5653eeb34459516699e562eeb9c8d23b3c. 
Dec 15 04:01:35 localhost podman[103997]: 2025-12-15 09:01:35.79717467 +0000 UTC m=+0.091681730 container health_status 6278d648aec9c9f12e0efaafa0d6ae5653eeb34459516699e562eeb9c8d23b3c (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, konflux.additional-tags=17.1.12 17.1_20251118.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '29d07da103bbd44a9ed3e29999314b03'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.component=openstack-qdrouterd-container, container_name=metrics_qdr, release=1761123044, batch=17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, description=Red Hat OpenStack 
Platform 17.1 qdrouterd, build-date=2025-11-18T22:49:46Z, distribution-scope=public, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., name=rhosp17/openstack-qdrouterd, summary=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.expose-services=, config_id=tripleo_step1, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, version=17.1.12, io.buildah.version=1.41.4, tcib_managed=true, url=https://www.redhat.com, managed_by=tripleo_ansible, vcs-type=git) Dec 15 04:01:35 localhost podman[103997]: 2025-12-15 09:01:35.995388346 +0000 UTC m=+0.289895396 container exec_died 6278d648aec9c9f12e0efaafa0d6ae5653eeb34459516699e562eeb9c8d23b3c (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 qdrouterd, batch=17.1_20251118.1, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, name=rhosp17/openstack-qdrouterd, container_name=metrics_qdr, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, managed_by=tripleo_ansible, version=17.1.12, config_id=tripleo_step1, io.buildah.version=1.41.4, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 qdrouterd, build-date=2025-11-18T22:49:46Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.component=openstack-qdrouterd-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1761123044, konflux.additional-tags=17.1.12 17.1_20251118.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 
'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '29d07da103bbd44a9ed3e29999314b03'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, distribution-scope=public, url=https://www.redhat.com, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vcs-type=git, tcib_managed=true) Dec 15 04:01:36 localhost systemd[1]: 6278d648aec9c9f12e0efaafa0d6ae5653eeb34459516699e562eeb9c8d23b3c.service: Deactivated successfully. Dec 15 04:01:50 localhost systemd[1]: Started /usr/bin/podman healthcheck run 165a6fd1f2b629d16da0798ec1c9bac41d302d530a0ffa668b2315a52d6aa04e. Dec 15 04:01:50 localhost systemd[1]: Started /usr/bin/podman healthcheck run 2649c3aec9af2f04509e1d248b1050b202fccc3ed2b86421c89a2a65376db057. Dec 15 04:01:50 localhost systemd[1]: Started /usr/bin/podman healthcheck run 36a88083ed613f6871d2631ae462bca308a45ba583333f7cfe4d0b0c40a18ba5. Dec 15 04:01:50 localhost systemd[1]: Started /usr/bin/podman healthcheck run 97115ff7c5a84ae7e17f79e696e94da3c63f9fe0051287fe9193bec282a03f99. 
Dec 15 04:01:50 localhost systemd[1]: Started /usr/bin/podman healthcheck run ac45612fab357b615b4684dbfa5bd000dd29eb1adfbaedb6db0802fc49e89549. Dec 15 04:01:50 localhost systemd[1]: Started /usr/bin/podman healthcheck run d6041e5471624a495d5be42684f6049ccf71802a80d5760239330151d465d146. Dec 15 04:01:50 localhost systemd[1]: tmp-crun.4RZnkX.mount: Deactivated successfully. Dec 15 04:01:50 localhost podman[104028]: 2025-12-15 09:01:50.832799205 +0000 UTC m=+0.151154330 container health_status 36a88083ed613f6871d2631ae462bca308a45ba583333f7cfe4d0b0c40a18ba5 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, version=17.1.12, vendor=Red Hat, Inc., url=https://www.redhat.com, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, container_name=nova_compute, name=rhosp17/openstack-nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '182e509007ab5e6e5b2500a552cbd5ba-879500e96bf8dfb93687004bd86f2317'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, description=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, com.redhat.component=openstack-nova-compute-container, build-date=2025-11-19T00:36:58Z, config_id=tripleo_step5, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, maintainer=OpenStack TripleO Team, batch=17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, release=1761123044, vcs-type=git, managed_by=tripleo_ansible, distribution-scope=public, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 nova-compute) Dec 15 04:01:50 localhost podman[104028]: 2025-12-15 09:01:50.884524477 +0000 UTC m=+0.202879612 container exec_died 36a88083ed613f6871d2631ae462bca308a45ba583333f7cfe4d0b0c40a18ba5 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, config_id=tripleo_step5, 
com.redhat.component=openstack-nova-compute-container, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, tcib_managed=true, architecture=x86_64, batch=17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, build-date=2025-11-19T00:36:58Z, io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '182e509007ab5e6e5b2500a552cbd5ba-879500e96bf8dfb93687004bd86f2317'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', 
'/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, url=https://www.redhat.com, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.expose-services=, vcs-type=git, name=rhosp17/openstack-nova-compute, container_name=nova_compute, version=17.1.12, summary=Red Hat OpenStack Platform 17.1 nova-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public, vendor=Red Hat, Inc., release=1761123044, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 nova-compute) Dec 15 04:01:50 localhost podman[104029]: 2025-12-15 09:01:50.887359963 +0000 UTC m=+0.202330277 container health_status 97115ff7c5a84ae7e17f79e696e94da3c63f9fe0051287fe9193bec282a03f99 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, name=rhosp17/openstack-ceilometer-ipmi, distribution-scope=public, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vendor=Red Hat, Inc., architecture=x86_64, container_name=ceilometer_agent_ipmi, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1761123044, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, url=https://www.redhat.com, konflux.additional-tags=17.1.12 17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, 
org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, build-date=2025-11-19T00:12:45Z, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, managed_by=tripleo_ansible, vcs-type=git, version=17.1.12, io.buildah.version=1.41.4, com.redhat.component=openstack-ceilometer-ipmi-container, config_id=tripleo_step4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, tcib_managed=true, io.openshift.expose-services=, batch=17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'fcee5a4a91f85471fca7b61211375646'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}) Dec 15 04:01:50 localhost podman[104028]: unhealthy Dec 15 04:01:50 localhost systemd[1]: 36a88083ed613f6871d2631ae462bca308a45ba583333f7cfe4d0b0c40a18ba5.service: Main process exited, code=exited, status=1/FAILURE Dec 15 04:01:50 localhost systemd[1]: 36a88083ed613f6871d2631ae462bca308a45ba583333f7cfe4d0b0c40a18ba5.service: Failed with result 'exit-code'. 
Dec 15 04:01:50 localhost podman[104027]: 2025-12-15 09:01:50.930518736 +0000 UTC m=+0.253342720 container health_status 2649c3aec9af2f04509e1d248b1050b202fccc3ed2b86421c89a2a65376db057 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, config_id=tripleo_step3, description=Red Hat OpenStack Platform 17.1 iscsid, architecture=x86_64, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '182e509007ab5e6e5b2500a552cbd5ba'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, batch=17.1_20251118.1, io.buildah.version=1.41.4, url=https://www.redhat.com, vcs-type=git, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, name=rhosp17/openstack-iscsid, release=1761123044, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, io.openshift.expose-services=, build-date=2025-11-18T23:44:13Z, 
io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, version=17.1.12, distribution-scope=public, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, com.redhat.component=openstack-iscsid-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=iscsid, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.12 17.1_20251118.1) Dec 15 04:01:50 localhost podman[104042]: 2025-12-15 09:01:50.801053807 +0000 UTC m=+0.108854660 container health_status d6041e5471624a495d5be42684f6049ccf71802a80d5760239330151d465d146 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, vendor=Red Hat, Inc., url=https://www.redhat.com, batch=17.1_20251118.1, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'fcee5a4a91f85471fca7b61211375646'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, name=rhosp17/openstack-ceilometer-compute, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, container_name=ceilometer_agent_compute, com.redhat.component=openstack-ceilometer-compute-container, release=1761123044, tcib_managed=true, config_id=tripleo_step4, konflux.additional-tags=17.1.12 17.1_20251118.1, architecture=x86_64, build-date=2025-11-19T00:11:48Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-type=git, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, version=17.1.12, io.buildah.version=1.41.4, io.openshift.expose-services=) Dec 15 04:01:50 localhost podman[104042]: 2025-12-15 09:01:50.983393749 +0000 UTC m=+0.291194642 container exec_died d6041e5471624a495d5be42684f6049ccf71802a80d5760239330151d465d146 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, build-date=2025-11-19T00:11:48Z, vcs-type=git, container_name=ceilometer_agent_compute, vendor=Red Hat, Inc., io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, batch=17.1_20251118.1, com.redhat.component=openstack-ceilometer-compute-container, io.k8s.description=Red Hat 
OpenStack Platform 17.1 ceilometer-compute, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, architecture=x86_64, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'fcee5a4a91f85471fca7b61211375646'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, tcib_managed=true, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, version=17.1.12, config_id=tripleo_step4, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, maintainer=OpenStack TripleO Team, distribution-scope=public, release=1761123044, io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp17/openstack-ceilometer-compute) Dec 15 04:01:50 localhost podman[104042]: unhealthy Dec 15 04:01:50 localhost podman[104035]: 2025-12-15 09:01:50.99579993 +0000 UTC m=+0.308579486 container health_status ac45612fab357b615b4684dbfa5bd000dd29eb1adfbaedb6db0802fc49e89549 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, io.buildah.version=1.41.4, distribution-scope=public, url=https://www.redhat.com, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, architecture=x86_64, build-date=2025-11-18T22:49:32Z, description=Red Hat OpenStack Platform 17.1 cron, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vcs-type=git, container_name=logrotate_crond, maintainer=OpenStack TripleO Team, release=1761123044, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, name=rhosp17/openstack-cron, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, version=17.1.12, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, summary=Red Hat OpenStack Platform 17.1 cron, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, io.openshift.expose-services=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, tcib_managed=true, batch=17.1_20251118.1, config_id=tripleo_step4, com.redhat.component=openstack-cron-container, vendor=Red Hat, Inc.) Dec 15 04:01:50 localhost systemd[1]: d6041e5471624a495d5be42684f6049ccf71802a80d5760239330151d465d146.service: Main process exited, code=exited, status=1/FAILURE Dec 15 04:01:50 localhost systemd[1]: d6041e5471624a495d5be42684f6049ccf71802a80d5760239330151d465d146.service: Failed with result 'exit-code'. 
Dec 15 04:01:51 localhost podman[104027]: 2025-12-15 09:01:51.017271834 +0000 UTC m=+0.340095798 container exec_died 2649c3aec9af2f04509e1d248b1050b202fccc3ed2b86421c89a2a65376db057 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, io.buildah.version=1.41.4, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, distribution-scope=public, container_name=iscsid, description=Red Hat OpenStack Platform 17.1 iscsid, config_id=tripleo_step3, batch=17.1_20251118.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '182e509007ab5e6e5b2500a552cbd5ba'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, version=17.1.12, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.expose-services=, com.redhat.component=openstack-iscsid-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, vcs-type=git, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, name=rhosp17/openstack-iscsid, managed_by=tripleo_ansible, url=https://www.redhat.com, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, build-date=2025-11-18T23:44:13Z, tcib_managed=true, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, release=1761123044) Dec 15 04:01:51 localhost systemd[1]: 2649c3aec9af2f04509e1d248b1050b202fccc3ed2b86421c89a2a65376db057.service: Deactivated successfully. Dec 15 04:01:51 localhost podman[104035]: 2025-12-15 09:01:51.033792905 +0000 UTC m=+0.346572441 container exec_died ac45612fab357b615b4684dbfa5bd000dd29eb1adfbaedb6db0802fc49e89549 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.12 17.1_20251118.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, version=17.1.12, release=1761123044, distribution-scope=public, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, io.openshift.expose-services=, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, description=Red Hat OpenStack Platform 17.1 cron, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, build-date=2025-11-18T22:49:32Z, vendor=Red Hat, Inc., managed_by=tripleo_ansible, name=rhosp17/openstack-cron, vcs-type=git, config_id=tripleo_step4, tcib_managed=true, architecture=x86_64, url=https://www.redhat.com, batch=17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 cron, container_name=logrotate_crond, io.buildah.version=1.41.4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.component=openstack-cron-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI) Dec 15 04:01:51 localhost podman[104026]: 2025-12-15 09:01:51.033920078 +0000 UTC m=+0.360347618 container health_status 165a6fd1f2b629d16da0798ec1c9bac41d302d530a0ffa668b2315a52d6aa04e (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 collectd, architecture=x86_64, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, tcib_managed=true, version=17.1.12, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 
'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_id=tripleo_step3, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, build-date=2025-11-18T22:51:28Z, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, url=https://www.redhat.com, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 collectd, vcs-type=git, container_name=collectd, release=1761123044, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, batch=17.1_20251118.1, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., managed_by=tripleo_ansible, io.openshift.expose-services=, name=rhosp17/openstack-collectd, io.buildah.version=1.41.4, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.component=openstack-collectd-container) Dec 15 04:01:51 localhost systemd[1]: ac45612fab357b615b4684dbfa5bd000dd29eb1adfbaedb6db0802fc49e89549.service: Deactivated successfully. Dec 15 04:01:51 localhost podman[104029]: 2025-12-15 09:01:51.07140399 +0000 UTC m=+0.386374274 container exec_died 97115ff7c5a84ae7e17f79e696e94da3c63f9fe0051287fe9193bec282a03f99 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, distribution-scope=public, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, version=17.1.12, name=rhosp17/openstack-ceilometer-ipmi, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, release=1761123044, container_name=ceilometer_agent_ipmi, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, batch=17.1_20251118.1, com.redhat.component=openstack-ceilometer-ipmi-container, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.buildah.version=1.41.4, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-19T00:12:45Z, config_id=tripleo_step4, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'fcee5a4a91f85471fca7b61211375646'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': 
['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.openshift.expose-services=, managed_by=tripleo_ansible, url=https://www.redhat.com, konflux.additional-tags=17.1.12 17.1_20251118.1, tcib_managed=true, maintainer=OpenStack TripleO Team) Dec 15 04:01:51 localhost systemd[1]: 97115ff7c5a84ae7e17f79e696e94da3c63f9fe0051287fe9193bec282a03f99.service: Deactivated successfully. 
Dec 15 04:01:51 localhost podman[104026]: 2025-12-15 09:01:51.117369578 +0000 UTC m=+0.443797168 container exec_died 165a6fd1f2b629d16da0798ec1c9bac41d302d530a0ffa668b2315a52d6aa04e (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, vcs-type=git, io.openshift.expose-services=, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, config_id=tripleo_step3, vendor=Red Hat, Inc., io.buildah.version=1.41.4, version=17.1.12, build-date=2025-11-18T22:51:28Z, tcib_managed=true, distribution-scope=public, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', 
'/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, container_name=collectd, description=Red Hat OpenStack Platform 17.1 collectd, name=rhosp17/openstack-collectd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, release=1761123044, konflux.additional-tags=17.1.12 17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 collectd, batch=17.1_20251118.1, com.redhat.component=openstack-collectd-container) Dec 15 04:01:51 localhost systemd[1]: 165a6fd1f2b629d16da0798ec1c9bac41d302d530a0ffa668b2315a52d6aa04e.service: Deactivated successfully. Dec 15 04:01:52 localhost sshd[104153]: main: sshd: ssh-rsa algorithm is disabled Dec 15 04:01:53 localhost systemd[1]: Started /usr/bin/podman healthcheck run 4c3f871f4bdf287b1d3ce717956c1cf130617489217410eadf6fffcc440eb3b0. 
Dec 15 04:01:53 localhost podman[104155]: 2025-12-15 09:01:53.727741751 +0000 UTC m=+0.065831011 container health_status 4c3f871f4bdf287b1d3ce717956c1cf130617489217410eadf6fffcc440eb3b0 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp17/openstack-nova-compute, vendor=Red Hat, Inc., io.buildah.version=1.41.4, architecture=x86_64, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-nova-compute-container, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.expose-services=, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, summary=Red Hat OpenStack Platform 17.1 nova-compute, url=https://www.redhat.com, batch=17.1_20251118.1, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, distribution-scope=public, config_id=tripleo_step4, version=17.1.12, managed_by=tripleo_ansible, build-date=2025-11-19T00:36:58Z, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, release=1761123044, container_name=nova_migration_target, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '879500e96bf8dfb93687004bd86f2317'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, tcib_managed=true) Dec 15 04:01:54 localhost podman[104155]: 2025-12-15 09:01:54.123464353 +0000 UTC m=+0.461553673 container exec_died 4c3f871f4bdf287b1d3ce717956c1cf130617489217410eadf6fffcc440eb3b0 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, summary=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, release=1761123044, vcs-type=git, io.buildah.version=1.41.4, version=17.1.12, url=https://www.redhat.com, distribution-scope=public, io.openshift.expose-services=, config_id=tripleo_step4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, managed_by=tripleo_ansible, tcib_managed=true, architecture=x86_64, name=rhosp17/openstack-nova-compute, container_name=nova_migration_target, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-19T00:36:58Z, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '879500e96bf8dfb93687004bd86f2317'}, 'healthcheck': 
{'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, description=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., batch=17.1_20251118.1, com.redhat.component=openstack-nova-compute-container) Dec 15 04:01:54 localhost systemd[1]: 4c3f871f4bdf287b1d3ce717956c1cf130617489217410eadf6fffcc440eb3b0.service: Deactivated successfully. Dec 15 04:01:57 localhost systemd[1]: Started /usr/bin/podman healthcheck run 2df8d20c6ba32f38bd0e6519c9c77a4146b632f21c81197d8c319b223aad81f1. Dec 15 04:01:57 localhost systemd[1]: Started /usr/bin/podman healthcheck run 4d35b7dc3f3a4e3c60661e85d3511f50804e87244d74cab67af5c6e8c447d379. 
Dec 15 04:01:57 localhost podman[104179]: 2025-12-15 09:01:57.751091355 +0000 UTC m=+0.081904119 container health_status 2df8d20c6ba32f38bd0e6519c9c77a4146b632f21c81197d8c319b223aad81f1 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=unhealthy, config_id=tripleo_step4, url=https://www.redhat.com, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, name=rhosp17/openstack-ovn-controller, architecture=x86_64, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, vendor=Red Hat, Inc., com.redhat.component=openstack-ovn-controller-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-type=git, distribution-scope=public, release=1761123044, tcib_managed=true, konflux.additional-tags=17.1.12 17.1_20251118.1, container_name=ovn_controller, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, batch=17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, version=17.1.12, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, description=Red Hat OpenStack Platform 17.1 ovn-controller, build-date=2025-11-18T23:34:05Z, 
io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, summary=Red Hat OpenStack Platform 17.1 ovn-controller) Dec 15 04:01:57 localhost podman[104179]: 2025-12-15 09:01:57.792288295 +0000 UTC m=+0.123101009 container exec_died 2df8d20c6ba32f38bd0e6519c9c77a4146b632f21c81197d8c319b223aad81f1 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, name=rhosp17/openstack-ovn-controller, tcib_managed=true, release=1761123044, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 ovn-controller, version=17.1.12, container_name=ovn_controller, com.redhat.component=openstack-ovn-controller-container, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, managed_by=tripleo_ansible, architecture=x86_64, io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, summary=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, maintainer=OpenStack 
TripleO Team, io.openshift.expose-services=, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, vcs-type=git, vendor=Red Hat, Inc., batch=17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-11-18T23:34:05Z, config_id=tripleo_step4, url=https://www.redhat.com) Dec 15 04:01:57 localhost podman[104179]: unhealthy Dec 15 04:01:57 localhost systemd[1]: tmp-crun.agiHwp.mount: Deactivated successfully. Dec 15 04:01:57 localhost systemd[1]: 2df8d20c6ba32f38bd0e6519c9c77a4146b632f21c81197d8c319b223aad81f1.service: Main process exited, code=exited, status=1/FAILURE Dec 15 04:01:57 localhost systemd[1]: 2df8d20c6ba32f38bd0e6519c9c77a4146b632f21c81197d8c319b223aad81f1.service: Failed with result 'exit-code'. Dec 15 04:01:57 localhost podman[104180]: 2025-12-15 09:01:57.823210932 +0000 UTC m=+0.147582254 container health_status 4d35b7dc3f3a4e3c60661e85d3511f50804e87244d74cab67af5c6e8c447d379 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=unhealthy, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, release=1761123044, batch=17.1_20251118.1, container_name=ovn_metadata_agent, architecture=x86_64, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, managed_by=tripleo_ansible, name=rhosp17/openstack-neutron-metadata-agent-ovn, version=17.1.12, 
build-date=2025-11-19T00:14:25Z, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'a56a6f14b467cd9064e40c03defa5ed7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, distribution-scope=public, config_id=tripleo_step4, io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, tcib_managed=true, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, maintainer=OpenStack TripleO Team) Dec 15 04:01:57 localhost podman[104180]: 2025-12-15 09:01:57.86245581 +0000 UTC m=+0.186827082 container exec_died 
4d35b7dc3f3a4e3c60661e85d3511f50804e87244d74cab67af5c6e8c447d379 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, vendor=Red Hat, Inc., batch=17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, architecture=x86_64, build-date=2025-11-19T00:14:25Z, tcib_managed=true, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'a56a6f14b467cd9064e40c03defa5ed7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, name=rhosp17/openstack-neutron-metadata-agent-ovn, distribution-scope=public, 
konflux.additional-tags=17.1.12 17.1_20251118.1, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.buildah.version=1.41.4, container_name=ovn_metadata_agent, managed_by=tripleo_ansible, version=17.1.12, release=1761123044, config_id=tripleo_step4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn) Dec 15 04:01:57 localhost podman[104180]: unhealthy Dec 15 04:01:57 localhost systemd[1]: 4d35b7dc3f3a4e3c60661e85d3511f50804e87244d74cab67af5c6e8c447d379.service: Main process exited, code=exited, status=1/FAILURE Dec 15 04:01:57 localhost systemd[1]: 4d35b7dc3f3a4e3c60661e85d3511f50804e87244d74cab67af5c6e8c447d379.service: Failed with result 'exit-code'. Dec 15 04:02:06 localhost systemd[1]: Started /usr/bin/podman healthcheck run 6278d648aec9c9f12e0efaafa0d6ae5653eeb34459516699e562eeb9c8d23b3c. 
Dec 15 04:02:06 localhost podman[104219]: 2025-12-15 09:02:06.760822823 +0000 UTC m=+0.083329168 container health_status 6278d648aec9c9f12e0efaafa0d6ae5653eeb34459516699e562eeb9c8d23b3c (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, summary=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, release=1761123044, com.redhat.component=openstack-qdrouterd-container, vcs-type=git, vendor=Red Hat, Inc., architecture=x86_64, container_name=metrics_qdr, name=rhosp17/openstack-qdrouterd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-11-18T22:49:46Z, konflux.additional-tags=17.1.12 17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, batch=17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public, tcib_managed=true, config_id=tripleo_step1, url=https://www.redhat.com, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '29d07da103bbd44a9ed3e29999314b03'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, version=17.1.12, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.buildah.version=1.41.4) Dec 15 04:02:06 localhost podman[104219]: 2025-12-15 09:02:06.98155575 +0000 UTC m=+0.304062135 container exec_died 6278d648aec9c9f12e0efaafa0d6ae5653eeb34459516699e562eeb9c8d23b3c (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-type=git, managed_by=tripleo_ansible, version=17.1.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, url=https://www.redhat.com, tcib_managed=true, config_id=tripleo_step1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, io.buildah.version=1.41.4, build-date=2025-11-18T22:49:46Z, com.redhat.component=openstack-qdrouterd-container, vendor=Red Hat, Inc., distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1761123044, description=Red Hat OpenStack Platform 17.1 qdrouterd, architecture=x86_64, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, summary=Red Hat OpenStack Platform 17.1 qdrouterd, konflux.additional-tags=17.1.12 17.1_20251118.1, container_name=metrics_qdr, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '29d07da103bbd44a9ed3e29999314b03'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.expose-services=, name=rhosp17/openstack-qdrouterd) Dec 15 04:02:06 localhost systemd[1]: 6278d648aec9c9f12e0efaafa0d6ae5653eeb34459516699e562eeb9c8d23b3c.service: Deactivated successfully. Dec 15 04:02:21 localhost systemd[1]: Started /usr/bin/podman healthcheck run 165a6fd1f2b629d16da0798ec1c9bac41d302d530a0ffa668b2315a52d6aa04e. Dec 15 04:02:21 localhost systemd[1]: Started /usr/bin/podman healthcheck run 2649c3aec9af2f04509e1d248b1050b202fccc3ed2b86421c89a2a65376db057. Dec 15 04:02:21 localhost systemd[1]: Started /usr/bin/podman healthcheck run 36a88083ed613f6871d2631ae462bca308a45ba583333f7cfe4d0b0c40a18ba5. 
Dec 15 04:02:21 localhost systemd[1]: Started /usr/bin/podman healthcheck run 97115ff7c5a84ae7e17f79e696e94da3c63f9fe0051287fe9193bec282a03f99. Dec 15 04:02:21 localhost systemd[1]: Started /usr/bin/podman healthcheck run ac45612fab357b615b4684dbfa5bd000dd29eb1adfbaedb6db0802fc49e89549. Dec 15 04:02:21 localhost systemd[1]: Started /usr/bin/podman healthcheck run d6041e5471624a495d5be42684f6049ccf71802a80d5760239330151d465d146. Dec 15 04:02:21 localhost podman[104263]: 2025-12-15 09:02:21.787133078 +0000 UTC m=+0.102323495 container health_status ac45612fab357b615b4684dbfa5bd000dd29eb1adfbaedb6db0802fc49e89549 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, architecture=x86_64, com.redhat.component=openstack-cron-container, io.openshift.expose-services=, managed_by=tripleo_ansible, release=1761123044, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-cron, container_name=logrotate_crond, version=17.1.12, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, description=Red Hat OpenStack Platform 17.1 cron, summary=Red Hat OpenStack Platform 17.1 cron, batch=17.1_20251118.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, config_id=tripleo_step4, konflux.additional-tags=17.1.12 17.1_20251118.1, io.buildah.version=1.41.4, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, build-date=2025-11-18T22:49:32Z, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, url=https://www.redhat.com, vcs-type=git, vendor=Red Hat, Inc., distribution-scope=public, tcib_managed=true) Dec 15 04:02:21 localhost podman[104252]: 2025-12-15 09:02:21.770055092 +0000 UTC m=+0.087026406 container health_status 97115ff7c5a84ae7e17f79e696e94da3c63f9fe0051287fe9193bec282a03f99 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, architecture=x86_64, release=1761123044, com.redhat.component=openstack-ceilometer-ipmi-container, name=rhosp17/openstack-ceilometer-ipmi, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, tcib_managed=true, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, vcs-type=git, io.buildah.version=1.41.4, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vendor=Red Hat, Inc., io.openshift.expose-services=, config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, batch=17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, 
config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'fcee5a4a91f85471fca7b61211375646'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, container_name=ceilometer_agent_ipmi, version=17.1.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, build-date=2025-11-19T00:12:45Z, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, distribution-scope=public, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.12 17.1_20251118.1) Dec 15 04:02:21 localhost podman[104263]: 2025-12-15 09:02:21.825079742 +0000 UTC m=+0.140270209 container exec_died ac45612fab357b615b4684dbfa5bd000dd29eb1adfbaedb6db0802fc49e89549 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, build-date=2025-11-18T22:49:32Z, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, com.redhat.component=openstack-cron-container, io.openshift.expose-services=, distribution-scope=public, release=1761123044, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 cron, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, batch=17.1_20251118.1, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.12 17.1_20251118.1, config_id=tripleo_step4, io.buildah.version=1.41.4, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 cron, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, container_name=logrotate_crond, vcs-type=git, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, name=rhosp17/openstack-cron, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, version=17.1.12) Dec 15 04:02:21 localhost podman[104250]: 2025-12-15 09:02:21.836365443 +0000 UTC m=+0.163754155 container health_status 2649c3aec9af2f04509e1d248b1050b202fccc3ed2b86421c89a2a65376db057 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, konflux.additional-tags=17.1.12 17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.component=openstack-iscsid-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp17/openstack-iscsid, container_name=iscsid, config_id=tripleo_step3, tcib_managed=true, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, url=https://www.redhat.com, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, distribution-scope=public, vcs-type=git, io.openshift.expose-services=, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, managed_by=tripleo_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '182e509007ab5e6e5b2500a552cbd5ba'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 iscsid, release=1761123044, batch=17.1_20251118.1, version=17.1.12, build-date=2025-11-18T23:44:13Z, description=Red Hat OpenStack Platform 17.1 iscsid, vendor=Red Hat, Inc.) Dec 15 04:02:21 localhost systemd[1]: ac45612fab357b615b4684dbfa5bd000dd29eb1adfbaedb6db0802fc49e89549.service: Deactivated successfully. 
Dec 15 04:02:21 localhost podman[104251]: 2025-12-15 09:02:21.86991504 +0000 UTC m=+0.194594000 container health_status 36a88083ed613f6871d2631ae462bca308a45ba583333f7cfe4d0b0c40a18ba5 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, build-date=2025-11-19T00:36:58Z, name=rhosp17/openstack-nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, release=1761123044, io.buildah.version=1.41.4, maintainer=OpenStack TripleO Team, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, config_id=tripleo_step5, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, managed_by=tripleo_ansible, version=17.1.12, com.redhat.component=openstack-nova-compute-container, vcs-type=git, distribution-scope=public, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, url=https://www.redhat.com, batch=17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '182e509007ab5e6e5b2500a552cbd5ba-879500e96bf8dfb93687004bd86f2317'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 
'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, vendor=Red Hat, Inc., container_name=nova_compute) Dec 15 04:02:21 localhost podman[104251]: 2025-12-15 09:02:21.886289657 +0000 UTC m=+0.210968627 container exec_died 36a88083ed613f6871d2631ae462bca308a45ba583333f7cfe4d0b0c40a18ba5 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, io.openshift.expose-services=, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, release=1761123044, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '182e509007ab5e6e5b2500a552cbd5ba-879500e96bf8dfb93687004bd86f2317'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 
'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20251118.1, distribution-scope=public, build-date=2025-11-19T00:36:58Z, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, description=Red Hat OpenStack Platform 17.1 nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-type=git, version=17.1.12, io.buildah.version=1.41.4, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 nova-compute, config_id=tripleo_step5, io.k8s.description=Red Hat OpenStack Platform 
17.1 nova-compute, url=https://www.redhat.com, container_name=nova_compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, com.redhat.component=openstack-nova-compute-container, architecture=x86_64, managed_by=tripleo_ansible, name=rhosp17/openstack-nova-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, tcib_managed=true, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, maintainer=OpenStack TripleO Team) Dec 15 04:02:21 localhost podman[104251]: unhealthy Dec 15 04:02:21 localhost systemd[1]: 36a88083ed613f6871d2631ae462bca308a45ba583333f7cfe4d0b0c40a18ba5.service: Main process exited, code=exited, status=1/FAILURE Dec 15 04:02:21 localhost systemd[1]: 36a88083ed613f6871d2631ae462bca308a45ba583333f7cfe4d0b0c40a18ba5.service: Failed with result 'exit-code'. Dec 15 04:02:21 localhost podman[104249]: 2025-12-15 09:02:21.922736271 +0000 UTC m=+0.252347303 container health_status 165a6fd1f2b629d16da0798ec1c9bac41d302d530a0ffa668b2315a52d6aa04e (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, managed_by=tripleo_ansible, url=https://www.redhat.com, config_id=tripleo_step3, container_name=collectd, distribution-scope=public, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 17.1_20251118.1, name=rhosp17/openstack-collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, version=17.1.12, vendor=Red Hat, Inc., architecture=x86_64, vcs-type=git, maintainer=OpenStack TripleO Team, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 collectd, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, build-date=2025-11-18T22:51:28Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, batch=17.1_20251118.1, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, summary=Red Hat OpenStack Platform 17.1 collectd, com.redhat.component=openstack-collectd-container, release=1761123044, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, io.openshift.expose-services=) Dec 15 04:02:21 localhost podman[104252]: 2025-12-15 09:02:21.950639126 +0000 UTC m=+0.267610400 container exec_died 97115ff7c5a84ae7e17f79e696e94da3c63f9fe0051287fe9193bec282a03f99 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, 
name=ceilometer_agent_ipmi, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20251118.1, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, url=https://www.redhat.com, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, tcib_managed=true, release=1761123044, managed_by=tripleo_ansible, name=rhosp17/openstack-ceilometer-ipmi, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.component=openstack-ceilometer-ipmi-container, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, build-date=2025-11-19T00:12:45Z, version=17.1.12, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, architecture=x86_64, distribution-scope=public, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'fcee5a4a91f85471fca7b61211375646'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', 
'/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, config_id=tripleo_step4, container_name=ceilometer_agent_ipmi) Dec 15 04:02:21 localhost systemd[1]: 97115ff7c5a84ae7e17f79e696e94da3c63f9fe0051287fe9193bec282a03f99.service: Deactivated successfully. Dec 15 04:02:22 localhost podman[104250]: 2025-12-15 09:02:22.001940227 +0000 UTC m=+0.329328939 container exec_died 2649c3aec9af2f04509e1d248b1050b202fccc3ed2b86421c89a2a65376db057 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, container_name=iscsid, batch=17.1_20251118.1, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, build-date=2025-11-18T23:44:13Z, name=rhosp17/openstack-iscsid, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 iscsid, description=Red Hat OpenStack Platform 17.1 iscsid, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, distribution-scope=public, vendor=Red Hat, Inc., vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, vcs-type=git, release=1761123044, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '182e509007ab5e6e5b2500a552cbd5ba'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 
'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, io.buildah.version=1.41.4, architecture=x86_64, com.redhat.component=openstack-iscsid-container, config_id=tripleo_step3) Dec 15 04:02:22 localhost systemd[1]: 2649c3aec9af2f04509e1d248b1050b202fccc3ed2b86421c89a2a65376db057.service: Deactivated successfully. 
Dec 15 04:02:22 localhost podman[104269]: 2025-12-15 09:02:22.085261363 +0000 UTC m=+0.396993337 container health_status d6041e5471624a495d5be42684f6049ccf71802a80d5760239330151d465d146 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, release=1761123044, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.component=openstack-ceilometer-compute-container, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'fcee5a4a91f85471fca7b61211375646'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, managed_by=tripleo_ansible, vcs-type=git, tcib_managed=true, io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1, build-date=2025-11-19T00:11:48Z, batch=17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public, 
summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, url=https://www.redhat.com, name=rhosp17/openstack-ceilometer-compute, container_name=ceilometer_agent_compute, config_id=tripleo_step4, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, io.buildah.version=1.41.4, version=17.1.12) Dec 15 04:02:22 localhost podman[104249]: 2025-12-15 09:02:22.108492735 +0000 UTC m=+0.438103807 container exec_died 165a6fd1f2b629d16da0798ec1c9bac41d302d530a0ffa668b2315a52d6aa04e (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, url=https://www.redhat.com, container_name=collectd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 17.1_20251118.1, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, com.redhat.component=openstack-collectd-container, description=Red Hat OpenStack Platform 17.1 collectd, vendor=Red Hat, Inc., name=rhosp17/openstack-collectd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, distribution-scope=public, io.openshift.expose-services=, batch=17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 
'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 collectd, tcib_managed=true, config_id=tripleo_step3, build-date=2025-11-18T22:51:28Z, release=1761123044, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.12, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a) Dec 15 04:02:22 localhost systemd[1]: 165a6fd1f2b629d16da0798ec1c9bac41d302d530a0ffa668b2315a52d6aa04e.service: Deactivated successfully. 
Dec 15 04:02:22 localhost podman[104269]: 2025-12-15 09:02:22.136327367 +0000 UTC m=+0.448059281 container exec_died d6041e5471624a495d5be42684f6049ccf71802a80d5760239330151d465d146 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'fcee5a4a91f85471fca7b61211375646'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 17.1_20251118.1, version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_id=tripleo_step4, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, release=1761123044, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, com.redhat.component=openstack-ceilometer-compute-container, container_name=ceilometer_agent_compute, io.openshift.expose-services=, build-date=2025-11-19T00:11:48Z, batch=17.1_20251118.1, vcs-type=git, vendor=Red Hat, Inc., architecture=x86_64, name=rhosp17/openstack-ceilometer-compute, io.buildah.version=1.41.4, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, distribution-scope=public, managed_by=tripleo_ansible) Dec 15 04:02:22 localhost podman[104269]: unhealthy Dec 15 04:02:22 localhost systemd[1]: d6041e5471624a495d5be42684f6049ccf71802a80d5760239330151d465d146.service: Main process exited, code=exited, status=1/FAILURE Dec 15 04:02:22 localhost systemd[1]: d6041e5471624a495d5be42684f6049ccf71802a80d5760239330151d465d146.service: Failed with result 'exit-code'. Dec 15 04:02:24 localhost systemd[1]: Started /usr/bin/podman healthcheck run 4c3f871f4bdf287b1d3ce717956c1cf130617489217410eadf6fffcc440eb3b0. 
Dec 15 04:02:24 localhost podman[104446]: 2025-12-15 09:02:24.756689768 +0000 UTC m=+0.084327444 container health_status 4c3f871f4bdf287b1d3ce717956c1cf130617489217410eadf6fffcc440eb3b0 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, io.openshift.expose-services=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '879500e96bf8dfb93687004bd86f2317'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, io.buildah.version=1.41.4, build-date=2025-11-19T00:36:58Z, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, konflux.additional-tags=17.1.12 17.1_20251118.1, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 nova-compute, config_id=tripleo_step4, 
cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, container_name=nova_migration_target, com.redhat.component=openstack-nova-compute-container, vcs-type=git, batch=17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp17/openstack-nova-compute, vendor=Red Hat, Inc., architecture=x86_64, tcib_managed=true, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, summary=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.12, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, release=1761123044, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute) Dec 15 04:02:25 localhost podman[104446]: 2025-12-15 09:02:25.106412332 +0000 UTC m=+0.434050018 container exec_died 4c3f871f4bdf287b1d3ce717956c1cf130617489217410eadf6fffcc440eb3b0 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, description=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, vendor=Red Hat, Inc., url=https://www.redhat.com, com.redhat.component=openstack-nova-compute-container, build-date=2025-11-19T00:36:58Z, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, config_id=tripleo_step4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, version=17.1.12, io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, distribution-scope=public, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 nova-compute, 
managed_by=tripleo_ansible, name=rhosp17/openstack-nova-compute, container_name=nova_migration_target, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, release=1761123044, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '879500e96bf8dfb93687004bd86f2317'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, architecture=x86_64) Dec 15 04:02:25 localhost systemd[1]: 4c3f871f4bdf287b1d3ce717956c1cf130617489217410eadf6fffcc440eb3b0.service: Deactivated successfully. Dec 15 04:02:28 localhost systemd[1]: Started /usr/bin/podman healthcheck run 2df8d20c6ba32f38bd0e6519c9c77a4146b632f21c81197d8c319b223aad81f1. Dec 15 04:02:28 localhost systemd[1]: Started /usr/bin/podman healthcheck run 4d35b7dc3f3a4e3c60661e85d3511f50804e87244d74cab67af5c6e8c447d379. Dec 15 04:02:28 localhost systemd[1]: tmp-crun.bB8oCI.mount: Deactivated successfully. 
Dec 15 04:02:28 localhost podman[104484]: 2025-12-15 09:02:28.769402325 +0000 UTC m=+0.097731252 container health_status 2df8d20c6ba32f38bd0e6519c9c77a4146b632f21c81197d8c319b223aad81f1 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=unhealthy, managed_by=tripleo_ansible, distribution-scope=public, release=1761123044, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, url=https://www.redhat.com, batch=17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, maintainer=OpenStack TripleO Team, version=17.1.12, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_id=tripleo_step4, container_name=ovn_controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, name=rhosp17/openstack-ovn-controller, summary=Red Hat OpenStack Platform 17.1 ovn-controller, build-date=2025-11-18T23:34:05Z, vcs-type=git, io.openshift.expose-services=, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.component=openstack-ovn-controller-container, io.buildah.version=1.41.4, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, 
konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272) Dec 15 04:02:28 localhost podman[104485]: 2025-12-15 09:02:28.804094453 +0000 UTC m=+0.128341710 container health_status 4d35b7dc3f3a4e3c60661e85d3511f50804e87244d74cab67af5c6e8c447d379 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=unhealthy, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-type=git, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, build-date=2025-11-19T00:14:25Z, io.buildah.version=1.41.4, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, maintainer=OpenStack TripleO Team, version=17.1.12, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, url=https://www.redhat.com, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'a56a6f14b467cd9064e40c03defa5ed7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, release=1761123044, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp17/openstack-neutron-metadata-agent-ovn, vendor=Red Hat, Inc., konflux.additional-tags=17.1.12 17.1_20251118.1, architecture=x86_64, io.openshift.expose-services=, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, tcib_managed=true, config_id=tripleo_step4, batch=17.1_20251118.1, container_name=ovn_metadata_agent) Dec 15 04:02:28 localhost podman[104484]: 2025-12-15 09:02:28.80888721 +0000 UTC m=+0.137216077 container exec_died 2df8d20c6ba32f38bd0e6519c9c77a4146b632f21c81197d8c319b223aad81f1 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, managed_by=tripleo_ansible, url=https://www.redhat.com, batch=17.1_20251118.1, version=17.1.12, release=1761123044, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, vcs-type=git, io.openshift.expose-services=, io.buildah.version=1.41.4, summary=Red Hat OpenStack Platform 17.1 ovn-controller, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, tcib_managed=true, 
vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, build-date=2025-11-18T23:34:05Z, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, description=Red Hat OpenStack Platform 17.1 ovn-controller, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, name=rhosp17/openstack-ovn-controller, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, com.redhat.component=openstack-ovn-controller-container, config_id=tripleo_step4, container_name=ovn_controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}) Dec 15 04:02:28 localhost podman[104484]: unhealthy Dec 15 04:02:28 localhost systemd[1]: 2df8d20c6ba32f38bd0e6519c9c77a4146b632f21c81197d8c319b223aad81f1.service: Main process exited, code=exited, status=1/FAILURE Dec 15 04:02:28 localhost systemd[1]: 2df8d20c6ba32f38bd0e6519c9c77a4146b632f21c81197d8c319b223aad81f1.service: Failed with result 'exit-code'. 
Dec 15 04:02:28 localhost podman[104485]: 2025-12-15 09:02:28.840133416 +0000 UTC m=+0.164380673 container exec_died 4d35b7dc3f3a4e3c60661e85d3511f50804e87244d74cab67af5c6e8c447d379 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.buildah.version=1.41.4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.12 17.1_20251118.1, release=1761123044, build-date=2025-11-19T00:14:25Z, vcs-type=git, distribution-scope=public, managed_by=tripleo_ansible, config_id=tripleo_step4, architecture=x86_64, tcib_managed=true, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, batch=17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, name=rhosp17/openstack-neutron-metadata-agent-ovn, url=https://www.redhat.com, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, container_name=ovn_metadata_agent, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'a56a6f14b467cd9064e40c03defa5ed7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 
'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, version=17.1.12) Dec 15 04:02:28 localhost podman[104485]: unhealthy Dec 15 04:02:28 localhost systemd[1]: 4d35b7dc3f3a4e3c60661e85d3511f50804e87244d74cab67af5c6e8c447d379.service: Main process exited, code=exited, status=1/FAILURE Dec 15 04:02:28 localhost systemd[1]: 4d35b7dc3f3a4e3c60661e85d3511f50804e87244d74cab67af5c6e8c447d379.service: Failed with result 'exit-code'. 
Dec 15 04:02:31 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:fc:1f:13 MACDST=fa:16:3e:b3:bb:e3 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=4441 DF PROTO=TCP SPT=50744 DPT=9882 SEQ=3553639707 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A838184F90000000001030307) Dec 15 04:02:32 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:fc:1f:13 MACDST=fa:16:3e:b3:bb:e3 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=4442 DF PROTO=TCP SPT=50744 DPT=9882 SEQ=3553639707 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A838188E50000000001030307) Dec 15 04:02:34 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:fc:1f:13 MACDST=fa:16:3e:b3:bb:e3 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=56630 DF PROTO=TCP SPT=35230 DPT=9105 SEQ=3686881532 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A838190920000000001030307) Dec 15 04:02:34 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:fc:1f:13 MACDST=fa:16:3e:b3:bb:e3 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=4443 DF PROTO=TCP SPT=50744 DPT=9882 SEQ=3553639707 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A838190E50000000001030307) Dec 15 04:02:35 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:fc:1f:13 MACDST=fa:16:3e:b3:bb:e3 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=56631 DF PROTO=TCP SPT=35230 DPT=9105 SEQ=3686881532 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A838194A60000000001030307) Dec 15 04:02:37 localhost systemd[1]: Started /usr/bin/podman healthcheck run 6278d648aec9c9f12e0efaafa0d6ae5653eeb34459516699e562eeb9c8d23b3c. Dec 15 04:02:37 localhost systemd[1]: Starting Check and recover tripleo_nova_virtqemud... 
Dec 15 04:02:37 localhost recover_tripleo_nova_virtqemud[104530]: 61849 Dec 15 04:02:37 localhost systemd[1]: tripleo_nova_virtqemud_recover.service: Deactivated successfully. Dec 15 04:02:37 localhost systemd[1]: Finished Check and recover tripleo_nova_virtqemud. Dec 15 04:02:37 localhost podman[104525]: 2025-12-15 09:02:37.760836884 +0000 UTC m=+0.088368962 container health_status 6278d648aec9c9f12e0efaafa0d6ae5653eeb34459516699e562eeb9c8d23b3c (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-type=git, url=https://www.redhat.com, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-18T22:49:46Z, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '29d07da103bbd44a9ed3e29999314b03'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', 
'/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, name=rhosp17/openstack-qdrouterd, io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 qdrouterd, config_id=tripleo_step1, release=1761123044, io.buildah.version=1.41.4, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 qdrouterd, tcib_managed=true, com.redhat.component=openstack-qdrouterd-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, container_name=metrics_qdr, version=17.1.12, managed_by=tripleo_ansible, vendor=Red Hat, Inc., batch=17.1_20251118.1, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a) Dec 15 04:02:37 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:fc:1f:13 MACDST=fa:16:3e:b3:bb:e3 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=56632 DF PROTO=TCP SPT=35230 DPT=9105 SEQ=3686881532 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A83819CA60000000001030307) Dec 15 04:02:37 localhost podman[104525]: 2025-12-15 09:02:37.955306119 +0000 UTC m=+0.282838137 container exec_died 6278d648aec9c9f12e0efaafa0d6ae5653eeb34459516699e562eeb9c8d23b3c (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, container_name=metrics_qdr, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, build-date=2025-11-18T22:49:46Z, io.buildah.version=1.41.4, 
description=Red Hat OpenStack Platform 17.1 qdrouterd, name=rhosp17/openstack-qdrouterd, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, version=17.1.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, config_id=tripleo_step1, batch=17.1_20251118.1, com.redhat.component=openstack-qdrouterd-container, release=1761123044, summary=Red Hat OpenStack Platform 17.1 qdrouterd, vendor=Red Hat, Inc., architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '29d07da103bbd44a9ed3e29999314b03'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vcs-type=git, url=https://www.redhat.com, distribution-scope=public, konflux.additional-tags=17.1.12 17.1_20251118.1) Dec 15 04:02:37 localhost systemd[1]: 6278d648aec9c9f12e0efaafa0d6ae5653eeb34459516699e562eeb9c8d23b3c.service: Deactivated successfully. 
Dec 15 04:02:38 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:fc:1f:13 MACDST=fa:16:3e:b3:bb:e3 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=4444 DF PROTO=TCP SPT=50744 DPT=9882 SEQ=3553639707 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A8381A0A50000000001030307) Dec 15 04:02:41 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:fc:1f:13 MACDST=fa:16:3e:b3:bb:e3 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=56633 DF PROTO=TCP SPT=35230 DPT=9105 SEQ=3686881532 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A8381AC660000000001030307) Dec 15 04:02:47 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:fc:1f:13 MACDST=fa:16:3e:b3:bb:e3 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=4445 DF PROTO=TCP SPT=50744 DPT=9882 SEQ=3553639707 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A8381C1250000000001030307) Dec 15 04:02:50 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:fc:1f:13 MACDST=fa:16:3e:b3:bb:e3 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=56634 DF PROTO=TCP SPT=35230 DPT=9105 SEQ=3686881532 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A8381CD260000000001030307) Dec 15 04:02:52 localhost systemd[1]: Started /usr/bin/podman healthcheck run 165a6fd1f2b629d16da0798ec1c9bac41d302d530a0ffa668b2315a52d6aa04e. Dec 15 04:02:52 localhost systemd[1]: Started /usr/bin/podman healthcheck run 2649c3aec9af2f04509e1d248b1050b202fccc3ed2b86421c89a2a65376db057. Dec 15 04:02:52 localhost systemd[1]: Started /usr/bin/podman healthcheck run 36a88083ed613f6871d2631ae462bca308a45ba583333f7cfe4d0b0c40a18ba5. Dec 15 04:02:52 localhost systemd[1]: Started /usr/bin/podman healthcheck run 97115ff7c5a84ae7e17f79e696e94da3c63f9fe0051287fe9193bec282a03f99. 
Dec 15 04:02:52 localhost systemd[1]: Started /usr/bin/podman healthcheck run ac45612fab357b615b4684dbfa5bd000dd29eb1adfbaedb6db0802fc49e89549. Dec 15 04:02:52 localhost systemd[1]: Started /usr/bin/podman healthcheck run d6041e5471624a495d5be42684f6049ccf71802a80d5760239330151d465d146. Dec 15 04:02:52 localhost podman[104559]: 2025-12-15 09:02:52.774317846 +0000 UTC m=+0.095633786 container health_status 2649c3aec9af2f04509e1d248b1050b202fccc3ed2b86421c89a2a65376db057 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, vcs-type=git, description=Red Hat OpenStack Platform 17.1 iscsid, release=1761123044, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '182e509007ab5e6e5b2500a552cbd5ba'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 iscsid, distribution-scope=public, 
io.buildah.version=1.41.4, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.component=openstack-iscsid-container, config_id=tripleo_step3, architecture=x86_64, name=rhosp17/openstack-iscsid, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=iscsid, batch=17.1_20251118.1, io.openshift.expose-services=, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, vendor=Red Hat, Inc., version=17.1.12, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, managed_by=tripleo_ansible, build-date=2025-11-18T23:44:13Z, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream) Dec 15 04:02:52 localhost podman[104559]: 2025-12-15 09:02:52.78639966 +0000 UTC m=+0.107715590 container exec_died 2649c3aec9af2f04509e1d248b1050b202fccc3ed2b86421c89a2a65376db057 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, batch=17.1_20251118.1, distribution-scope=public, config_id=tripleo_step3, summary=Red Hat OpenStack Platform 17.1 iscsid, name=rhosp17/openstack-iscsid, konflux.additional-tags=17.1.12 17.1_20251118.1, tcib_managed=true, vendor=Red Hat, Inc., url=https://www.redhat.com, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, version=17.1.12, vcs-type=git, managed_by=tripleo_ansible, build-date=2025-11-18T23:44:13Z, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.component=openstack-iscsid-container, release=1761123044, container_name=iscsid, io.openshift.expose-services=, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, io.k8s.description=Red Hat OpenStack Platform 17.1 
iscsid, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '182e509007ab5e6e5b2500a552cbd5ba'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 iscsid, io.buildah.version=1.41.4, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid) Dec 15 04:02:52 localhost systemd[1]: 2649c3aec9af2f04509e1d248b1050b202fccc3ed2b86421c89a2a65376db057.service: Deactivated successfully. Dec 15 04:02:52 localhost systemd[1]: tmp-crun.XzTJeX.mount: Deactivated successfully. 
Dec 15 04:02:52 localhost podman[104561]: 2025-12-15 09:02:52.836780585 +0000 UTC m=+0.149242348 container health_status 97115ff7c5a84ae7e17f79e696e94da3c63f9fe0051287fe9193bec282a03f99 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vendor=Red Hat, Inc., config_id=tripleo_step4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-type=git, konflux.additional-tags=17.1.12 17.1_20251118.1, tcib_managed=true, release=1761123044, io.buildah.version=1.41.4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public, name=rhosp17/openstack-ceilometer-ipmi, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'fcee5a4a91f85471fca7b61211375646'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20251118.1, com.redhat.component=openstack-ceilometer-ipmi-container, url=https://www.redhat.com, version=17.1.12, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, build-date=2025-11-19T00:12:45Z, container_name=ceilometer_agent_ipmi, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi) Dec 15 04:02:52 localhost podman[104560]: 2025-12-15 09:02:52.866208091 +0000 UTC m=+0.184886920 container health_status 36a88083ed613f6871d2631ae462bca308a45ba583333f7cfe4d0b0c40a18ba5 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, konflux.additional-tags=17.1.12 17.1_20251118.1, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '182e509007ab5e6e5b2500a552cbd5ba-879500e96bf8dfb93687004bd86f2317'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', 
'/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, summary=Red Hat OpenStack Platform 17.1 nova-compute, distribution-scope=public, vcs-type=git, release=1761123044, managed_by=tripleo_ansible, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, config_id=tripleo_step5, architecture=x86_64, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, name=rhosp17/openstack-nova-compute, build-date=2025-11-19T00:36:58Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=nova_compute, version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.buildah.version=1.41.4, com.redhat.component=openstack-nova-compute-container, batch=17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.expose-services=) Dec 15 04:02:52 
localhost podman[104561]: 2025-12-15 09:02:52.888361683 +0000 UTC m=+0.200823426 container exec_died 97115ff7c5a84ae7e17f79e696e94da3c63f9fe0051287fe9193bec282a03f99 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'fcee5a4a91f85471fca7b61211375646'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, name=rhosp17/openstack-ceilometer-ipmi, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, distribution-scope=public, io.openshift.expose-services=, build-date=2025-11-19T00:12:45Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, tcib_managed=true, vcs-type=git, com.redhat.component=openstack-ceilometer-ipmi-container, batch=17.1_20251118.1, vendor=Red Hat, Inc., managed_by=tripleo_ansible, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_id=tripleo_step4, konflux.additional-tags=17.1.12 17.1_20251118.1, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, release=1761123044, architecture=x86_64, version=17.1.12, container_name=ceilometer_agent_ipmi, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi) Dec 15 04:02:52 localhost systemd[1]: 97115ff7c5a84ae7e17f79e696e94da3c63f9fe0051287fe9193bec282a03f99.service: Deactivated successfully. Dec 15 04:02:52 localhost podman[104558]: 2025-12-15 09:02:52.934742392 +0000 UTC m=+0.255705053 container health_status 165a6fd1f2b629d16da0798ec1c9bac41d302d530a0ffa668b2315a52d6aa04e (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, tcib_managed=true, name=rhosp17/openstack-collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, container_name=collectd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-18T22:51:28Z, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public, config_id=tripleo_step3, io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, release=1761123044, batch=17.1_20251118.1, version=17.1.12, url=https://www.redhat.com, 
config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, architecture=x86_64, com.redhat.component=openstack-collectd-container, description=Red Hat OpenStack Platform 17.1 collectd, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 collectd) Dec 15 04:02:52 localhost podman[104579]: 2025-12-15 09:02:52.940851816 +0000 UTC m=+0.246956040 container health_status d6041e5471624a495d5be42684f6049ccf71802a80d5760239330151d465d146 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, tcib_managed=true, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 
'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'fcee5a4a91f85471fca7b61211375646'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.component=openstack-ceilometer-compute-container, url=https://www.redhat.com, version=17.1.12, batch=17.1_20251118.1, vendor=Red Hat, Inc., name=rhosp17/openstack-ceilometer-compute, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, architecture=x86_64, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, container_name=ceilometer_agent_compute, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.buildah.version=1.41.4, 
release=1761123044, vcs-type=git, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, build-date=2025-11-19T00:11:48Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_id=tripleo_step4, managed_by=tripleo_ansible) Dec 15 04:02:52 localhost podman[104558]: 2025-12-15 09:02:52.947441632 +0000 UTC m=+0.268404293 container exec_died 165a6fd1f2b629d16da0798ec1c9bac41d302d530a0ffa668b2315a52d6aa04e (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, config_id=tripleo_step3, container_name=collectd, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, description=Red Hat OpenStack Platform 17.1 collectd, build-date=2025-11-18T22:51:28Z, name=rhosp17/openstack-collectd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, tcib_managed=true, distribution-scope=public, konflux.additional-tags=17.1.12 17.1_20251118.1, version=17.1.12, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', 
'/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, release=1761123044, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 collectd, vendor=Red Hat, Inc., com.redhat.component=openstack-collectd-container, batch=17.1_20251118.1, io.buildah.version=1.41.4, vcs-type=git, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a) Dec 15 04:02:52 localhost podman[104560]: 2025-12-15 09:02:52.95676033 +0000 UTC m=+0.275439209 container exec_died 36a88083ed613f6871d2631ae462bca308a45ba583333f7cfe4d0b0c40a18ba5 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, vcs-type=git, description=Red Hat OpenStack Platform 17.1 nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=nova_compute, distribution-scope=public, version=17.1.12, build-date=2025-11-19T00:36:58Z, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '182e509007ab5e6e5b2500a552cbd5ba-879500e96bf8dfb93687004bd86f2317'}, 
'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, release=1761123044, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, konflux.additional-tags=17.1.12 17.1_20251118.1, batch=17.1_20251118.1, config_id=tripleo_step5, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, name=rhosp17/openstack-nova-compute, io.openshift.expose-services=, com.redhat.component=openstack-nova-compute-container, maintainer=OpenStack TripleO Team, 
url=https://www.redhat.com, managed_by=tripleo_ansible, io.buildah.version=1.41.4, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute) Dec 15 04:02:52 localhost systemd[1]: 165a6fd1f2b629d16da0798ec1c9bac41d302d530a0ffa668b2315a52d6aa04e.service: Deactivated successfully. Dec 15 04:02:52 localhost podman[104560]: unhealthy Dec 15 04:02:52 localhost systemd[1]: 36a88083ed613f6871d2631ae462bca308a45ba583333f7cfe4d0b0c40a18ba5.service: Main process exited, code=exited, status=1/FAILURE Dec 15 04:02:52 localhost systemd[1]: 36a88083ed613f6871d2631ae462bca308a45ba583333f7cfe4d0b0c40a18ba5.service: Failed with result 'exit-code'. Dec 15 04:02:52 localhost podman[104579]: 2025-12-15 09:02:52.991399706 +0000 UTC m=+0.297503930 container exec_died d6041e5471624a495d5be42684f6049ccf71802a80d5760239330151d465d146 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, version=17.1.12, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, managed_by=tripleo_ansible, com.redhat.component=openstack-ceilometer-compute-container, io.buildah.version=1.41.4, io.openshift.expose-services=, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'fcee5a4a91f85471fca7b61211375646'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-19T00:11:48Z, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, config_id=tripleo_step4, container_name=ceilometer_agent_compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, release=1761123044, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., konflux.additional-tags=17.1.12 17.1_20251118.1, batch=17.1_20251118.1, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-ceilometer-compute, url=https://www.redhat.com, architecture=x86_64, tcib_managed=true, vcs-type=git, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute) Dec 15 04:02:52 localhost podman[104579]: unhealthy Dec 15 04:02:53 localhost systemd[1]: d6041e5471624a495d5be42684f6049ccf71802a80d5760239330151d465d146.service: Main process exited, code=exited, status=1/FAILURE Dec 15 04:02:53 localhost systemd[1]: d6041e5471624a495d5be42684f6049ccf71802a80d5760239330151d465d146.service: Failed with result 'exit-code'. 
Dec 15 04:02:53 localhost podman[104563]: 2025-12-15 09:02:53.036663426 +0000 UTC m=+0.343972432 container health_status ac45612fab357b615b4684dbfa5bd000dd29eb1adfbaedb6db0802fc49e89549 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, distribution-scope=public, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 cron, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, com.redhat.component=openstack-cron-container, container_name=logrotate_crond, tcib_managed=true, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, name=rhosp17/openstack-cron, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.expose-services=, version=17.1.12, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, vendor=Red Hat, Inc., vcs-type=git, io.buildah.version=1.41.4, release=1761123044, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_id=tripleo_step4, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 cron, batch=17.1_20251118.1, build-date=2025-11-18T22:49:32Z, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a) Dec 15 04:02:53 localhost podman[104563]: 2025-12-15 09:02:53.049494448 +0000 UTC m=+0.356803464 container exec_died ac45612fab357b615b4684dbfa5bd000dd29eb1adfbaedb6db0802fc49e89549 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, version=17.1.12, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-type=git, container_name=logrotate_crond, com.redhat.component=openstack-cron-container, description=Red Hat OpenStack Platform 17.1 cron, release=1761123044, tcib_managed=true, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, managed_by=tripleo_ansible, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, build-date=2025-11-18T22:49:32Z, architecture=x86_64, name=rhosp17/openstack-cron, batch=17.1_20251118.1, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_id=tripleo_step4, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 
'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, vendor=Red Hat, Inc., distribution-scope=public, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, summary=Red Hat OpenStack Platform 17.1 cron) Dec 15 04:02:53 localhost systemd[1]: ac45612fab357b615b4684dbfa5bd000dd29eb1adfbaedb6db0802fc49e89549.service: Deactivated successfully. Dec 15 04:02:53 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:fc:1f:13 MACDST=fa:16:3e:b3:bb:e3 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=56607 DF PROTO=TCP SPT=49012 DPT=9100 SEQ=1708103804 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A8381D8820000000001030307) Dec 15 04:02:53 localhost sshd[104689]: main: sshd: ssh-rsa algorithm is disabled Dec 15 04:02:53 localhost systemd-logind[763]: New session 36 of user zuul. Dec 15 04:02:53 localhost systemd[1]: Started Session 36 of User zuul. 
Dec 15 04:02:54 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:fc:1f:13 MACDST=fa:16:3e:b3:bb:e3 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=56608 DF PROTO=TCP SPT=49012 DPT=9100 SEQ=1708103804 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A8381DCA50000000001030307) Dec 15 04:02:54 localhost python3.9[104784]: ansible-ansible.builtin.stat Invoked with path=/var/lib/config-data/puppet-generated/nova_libvirt/etc/nova/nova.conf follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1 Dec 15 04:02:54 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:fc:1f:13 MACDST=fa:16:3e:b3:bb:e3 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=41326 DF PROTO=TCP SPT=46142 DPT=9102 SEQ=9765919 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A8381DEB10000000001030307) Dec 15 04:02:55 localhost systemd[1]: Started /usr/bin/podman healthcheck run 4c3f871f4bdf287b1d3ce717956c1cf130617489217410eadf6fffcc440eb3b0. 
Dec 15 04:02:55 localhost podman[104879]: 2025-12-15 09:02:55.285088277 +0000 UTC m=+0.079922006 container health_status 4c3f871f4bdf287b1d3ce717956c1cf130617489217410eadf6fffcc440eb3b0 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, batch=17.1_20251118.1, io.buildah.version=1.41.4, container_name=nova_migration_target, distribution-scope=public, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.openshift.expose-services=, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, url=https://www.redhat.com, version=17.1.12, name=rhosp17/openstack-nova-compute, vcs-type=git, description=Red Hat OpenStack Platform 17.1 nova-compute, config_id=tripleo_step4, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, tcib_managed=true, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, build-date=2025-11-19T00:36:58Z, release=1761123044, vendor=Red Hat, Inc., com.redhat.component=openstack-nova-compute-container, summary=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '879500e96bf8dfb93687004bd86f2317'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}) Dec 15 04:02:55 localhost python3.9[104878]: ansible-ansible.legacy.command Invoked with cmd=python3 -c "import configparser as c; p = c.ConfigParser(strict=False); p.read('/var/lib/config-data/puppet-generated/nova_libvirt/etc/nova/nova.conf'); print(p['DEFAULT']['host'])"#012 _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None Dec 15 04:02:55 localhost podman[104879]: 2025-12-15 09:02:55.605390245 +0000 UTC m=+0.400223975 container exec_died 4c3f871f4bdf287b1d3ce717956c1cf130617489217410eadf6fffcc440eb3b0 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, summary=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., name=rhosp17/openstack-nova-compute, io.buildah.version=1.41.4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, url=https://www.redhat.com, batch=17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, 
com.redhat.component=openstack-nova-compute-container, distribution-scope=public, version=17.1.12, container_name=nova_migration_target, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, release=1761123044, managed_by=tripleo_ansible, config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '879500e96bf8dfb93687004bd86f2317'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, build-date=2025-11-19T00:36:58Z, maintainer=OpenStack TripleO Team) Dec 15 04:02:55 localhost systemd[1]: 4c3f871f4bdf287b1d3ce717956c1cf130617489217410eadf6fffcc440eb3b0.service: Deactivated successfully. 
Dec 15 04:02:55 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:fc:1f:13 MACDST=fa:16:3e:b3:bb:e3 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=62924 DF PROTO=TCP SPT=38456 DPT=9101 SEQ=774739587 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A8381E2880000000001030307) Dec 15 04:02:55 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:fc:1f:13 MACDST=fa:16:3e:b3:bb:e3 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=41327 DF PROTO=TCP SPT=46142 DPT=9102 SEQ=9765919 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A8381E2A50000000001030307) Dec 15 04:02:56 localhost python3.9[104993]: ansible-ansible.builtin.stat Invoked with path=/var/lib/config-data/puppet-generated/neutron/etc/neutron/neutron.conf follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1 Dec 15 04:02:56 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:fc:1f:13 MACDST=fa:16:3e:b3:bb:e3 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=56609 DF PROTO=TCP SPT=49012 DPT=9100 SEQ=1708103804 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A8381E4A50000000001030307) Dec 15 04:02:56 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:fc:1f:13 MACDST=fa:16:3e:b3:bb:e3 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=62925 DF PROTO=TCP SPT=38456 DPT=9101 SEQ=774739587 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A8381E6A50000000001030307) Dec 15 04:02:57 localhost python3.9[105087]: ansible-ansible.legacy.command Invoked with cmd=python3 -c "import configparser as c; p = c.ConfigParser(strict=False); p.read('/var/lib/config-data/puppet-generated/neutron/etc/neutron/neutron.conf'); print(p['DEFAULT']['host'])"#012 _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True 
_raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None Dec 15 04:02:57 localhost python3.9[105180]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/hostname -f _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None Dec 15 04:02:57 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:fc:1f:13 MACDST=fa:16:3e:b3:bb:e3 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=41328 DF PROTO=TCP SPT=46142 DPT=9102 SEQ=9765919 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A8381EAA60000000001030307) Dec 15 04:02:58 localhost python3.9[105271]: ansible-ansible.builtin.slurp Invoked with src=/proc/cmdline Dec 15 04:02:58 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:fc:1f:13 MACDST=fa:16:3e:b3:bb:e3 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=62926 DF PROTO=TCP SPT=38456 DPT=9101 SEQ=774739587 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A8381EEA60000000001030307) Dec 15 04:02:59 localhost systemd[1]: Started /usr/bin/podman healthcheck run 2df8d20c6ba32f38bd0e6519c9c77a4146b632f21c81197d8c319b223aad81f1. Dec 15 04:02:59 localhost systemd[1]: Started /usr/bin/podman healthcheck run 4d35b7dc3f3a4e3c60661e85d3511f50804e87244d74cab67af5c6e8c447d379. 
Dec 15 04:02:59 localhost podman[105287]: 2025-12-15 09:02:59.766595042 +0000 UTC m=+0.086253406 container health_status 4d35b7dc3f3a4e3c60661e85d3511f50804e87244d74cab67af5c6e8c447d379 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=unhealthy, managed_by=tripleo_ansible, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, io.openshift.expose-services=, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, container_name=ovn_metadata_agent, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, name=rhosp17/openstack-neutron-metadata-agent-ovn, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-type=git, build-date=2025-11-19T00:14:25Z, url=https://www.redhat.com, tcib_managed=true, batch=17.1_20251118.1, version=17.1.12, release=1761123044, config_id=tripleo_step4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'a56a6f14b467cd9064e40c03defa5ed7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, io.buildah.version=1.41.4, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, architecture=x86_64) Dec 15 04:02:59 localhost podman[105287]: 2025-12-15 09:02:59.81144652 +0000 UTC m=+0.131104854 container exec_died 4d35b7dc3f3a4e3c60661e85d3511f50804e87244d74cab67af5c6e8c447d379 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, tcib_managed=true, container_name=ovn_metadata_agent, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'a56a6f14b467cd9064e40c03defa5ed7'}, 'healthcheck': {'test': 
'/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, name=rhosp17/openstack-neutron-metadata-agent-ovn, managed_by=tripleo_ansible, batch=17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, build-date=2025-11-19T00:14:25Z, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, vendor=Red Hat, Inc., io.openshift.expose-services=, vcs-type=git, release=1761123044, version=17.1.12, config_id=tripleo_step4, konflux.additional-tags=17.1.12 
17.1_20251118.1, architecture=x86_64, url=https://www.redhat.com, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c) Dec 15 04:02:59 localhost podman[105287]: unhealthy Dec 15 04:02:59 localhost systemd[1]: 4d35b7dc3f3a4e3c60661e85d3511f50804e87244d74cab67af5c6e8c447d379.service: Main process exited, code=exited, status=1/FAILURE Dec 15 04:02:59 localhost systemd[1]: 4d35b7dc3f3a4e3c60661e85d3511f50804e87244d74cab67af5c6e8c447d379.service: Failed with result 'exit-code'. Dec 15 04:02:59 localhost podman[105286]: 2025-12-15 09:02:59.816182387 +0000 UTC m=+0.135864441 container health_status 2df8d20c6ba32f38bd0e6519c9c77a4146b632f21c81197d8c319b223aad81f1 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=unhealthy, summary=Red Hat OpenStack Platform 17.1 ovn-controller, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, config_id=tripleo_step4, vendor=Red Hat, Inc., tcib_managed=true, url=https://www.redhat.com, build-date=2025-11-18T23:34:05Z, container_name=ovn_controller, description=Red Hat OpenStack Platform 17.1 ovn-controller, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.expose-services=, distribution-scope=public, com.redhat.component=openstack-ovn-controller-container, io.buildah.version=1.41.4, release=1761123044, architecture=x86_64, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', 
'/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, konflux.additional-tags=17.1.12 17.1_20251118.1, batch=17.1_20251118.1, name=rhosp17/openstack-ovn-controller, vcs-type=git, maintainer=OpenStack TripleO Team, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, version=17.1.12, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272) Dec 15 04:02:59 localhost podman[105286]: 2025-12-15 09:02:59.902427991 +0000 UTC m=+0.222109995 container exec_died 2df8d20c6ba32f38bd0e6519c9c77a4146b632f21c81197d8c319b223aad81f1 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, io.buildah.version=1.41.4, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 ovn-controller, description=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, batch=17.1_20251118.1, build-date=2025-11-18T23:34:05Z, name=rhosp17/openstack-ovn-controller, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible, version=17.1.12, architecture=x86_64, maintainer=OpenStack TripleO Team, config_id=tripleo_step4, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 
'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, tcib_managed=true, url=https://www.redhat.com, release=1761123044, com.redhat.component=openstack-ovn-controller-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, container_name=ovn_controller, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, vendor=Red Hat, Inc.) Dec 15 04:02:59 localhost podman[105286]: unhealthy Dec 15 04:02:59 localhost systemd[1]: 2df8d20c6ba32f38bd0e6519c9c77a4146b632f21c81197d8c319b223aad81f1.service: Main process exited, code=exited, status=1/FAILURE Dec 15 04:02:59 localhost systemd[1]: 2df8d20c6ba32f38bd0e6519c9c77a4146b632f21c81197d8c319b223aad81f1.service: Failed with result 'exit-code'. 
Dec 15 04:03:00 localhost python3.9[105401]: ansible-ansible.builtin.stat Invoked with path=/etc/tuned/active_profile follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1 Dec 15 04:03:00 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:fc:1f:13 MACDST=fa:16:3e:b3:bb:e3 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=56610 DF PROTO=TCP SPT=49012 DPT=9100 SEQ=1708103804 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A8381F4650000000001030307) Dec 15 04:03:01 localhost python3.9[105493]: ansible-ansible.builtin.slurp Invoked with src=/etc/tuned/active_profile Dec 15 04:03:01 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:fc:1f:13 MACDST=fa:16:3e:b3:bb:e3 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=22518 DF PROTO=TCP SPT=35016 DPT=9882 SEQ=3900424283 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A8381FA2A0000000001030307) Dec 15 04:03:01 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:fc:1f:13 MACDST=fa:16:3e:b3:bb:e3 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=41329 DF PROTO=TCP SPT=46142 DPT=9102 SEQ=9765919 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A8381FA660000000001030307) Dec 15 04:03:02 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:fc:1f:13 MACDST=fa:16:3e:b3:bb:e3 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=22519 DF PROTO=TCP SPT=35016 DPT=9882 SEQ=3900424283 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A8381FE250000000001030307) Dec 15 04:03:02 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:fc:1f:13 MACDST=fa:16:3e:b3:bb:e3 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=62927 DF PROTO=TCP SPT=38456 DPT=9101 SEQ=774739587 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT 
(020405500402080A8381FE660000000001030307) Dec 15 04:03:03 localhost python3.9[105583]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d Dec 15 04:03:03 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:fc:1f:13 MACDST=fa:16:3e:b3:bb:e3 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=4446 DF PROTO=TCP SPT=50744 DPT=9882 SEQ=3553639707 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A838201250000000001030307) Dec 15 04:03:03 localhost python3.9[105631]: ansible-ansible.legacy.dnf Invoked with name=['systemd-container'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None Dec 15 04:03:04 localhost systemd[1]: session-36.scope: Deactivated successfully. Dec 15 04:03:04 localhost systemd[1]: session-36.scope: Consumed 4.655s CPU time. Dec 15 04:03:04 localhost systemd-logind[763]: Session 36 logged out. Waiting for processes to exit. Dec 15 04:03:04 localhost systemd-logind[763]: Removed session 36. 
Dec 15 04:03:04 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:fc:1f:13 MACDST=fa:16:3e:b3:bb:e3 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=22520 DF PROTO=TCP SPT=35016 DPT=9882 SEQ=3900424283 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A838206250000000001030307) Dec 15 04:03:07 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:fc:1f:13 MACDST=fa:16:3e:b3:bb:e3 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=38942 DF PROTO=TCP SPT=38360 DPT=9105 SEQ=2447316896 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A838211E50000000001030307) Dec 15 04:03:08 localhost systemd[1]: Started /usr/bin/podman healthcheck run 6278d648aec9c9f12e0efaafa0d6ae5653eeb34459516699e562eeb9c8d23b3c. Dec 15 04:03:08 localhost podman[105647]: 2025-12-15 09:03:08.774550843 +0000 UTC m=+0.086495132 container health_status 6278d648aec9c9f12e0efaafa0d6ae5653eeb34459516699e562eeb9c8d23b3c (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, vcs-type=git, batch=17.1_20251118.1, managed_by=tripleo_ansible, version=17.1.12, distribution-scope=public, release=1761123044, konflux.additional-tags=17.1.12 17.1_20251118.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '29d07da103bbd44a9ed3e29999314b03'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, build-date=2025-11-18T22:49:46Z, config_id=tripleo_step1, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.buildah.version=1.41.4, description=Red Hat OpenStack Platform 17.1 qdrouterd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, summary=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.expose-services=, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, name=rhosp17/openstack-qdrouterd, com.redhat.component=openstack-qdrouterd-container, container_name=metrics_qdr) Dec 15 04:03:08 localhost podman[105647]: 2025-12-15 09:03:08.992114485 +0000 UTC m=+0.304058874 container exec_died 6278d648aec9c9f12e0efaafa0d6ae5653eeb34459516699e562eeb9c8d23b3c (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, config_id=tripleo_step1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, build-date=2025-11-18T22:49:46Z, summary=Red Hat 
OpenStack Platform 17.1 qdrouterd, architecture=x86_64, batch=17.1_20251118.1, url=https://www.redhat.com, io.buildah.version=1.41.4, managed_by=tripleo_ansible, version=17.1.12, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, distribution-scope=public, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.component=openstack-qdrouterd-container, release=1761123044, tcib_managed=true, container_name=metrics_qdr, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-qdrouterd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, description=Red Hat OpenStack Platform 17.1 qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '29d07da103bbd44a9ed3e29999314b03'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', 
'/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}) Dec 15 04:03:09 localhost systemd[1]: 6278d648aec9c9f12e0efaafa0d6ae5653eeb34459516699e562eeb9c8d23b3c.service: Deactivated successfully. Dec 15 04:03:11 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:fc:1f:13 MACDST=fa:16:3e:b3:bb:e3 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=62928 DF PROTO=TCP SPT=38456 DPT=9101 SEQ=774739587 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A83821F250000000001030307) Dec 15 04:03:12 localhost sshd[105676]: main: sshd: ssh-rsa algorithm is disabled Dec 15 04:03:12 localhost systemd-logind[763]: New session 37 of user zuul. Dec 15 04:03:12 localhost systemd[1]: Started Session 37 of User zuul. Dec 15 04:03:13 localhost python3.9[105771]: ansible-ansible.builtin.systemd_service Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None Dec 15 04:03:13 localhost systemd[1]: Reloading. Dec 15 04:03:13 localhost systemd-rc-local-generator[105794]: /etc/rc.d/rc.local is not marked executable, skipping. Dec 15 04:03:13 localhost systemd-sysv-generator[105797]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. Dec 15 04:03:13 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Dec 15 04:03:14 localhost python3.9[105897]: ansible-ansible.builtin.service_facts Invoked Dec 15 04:03:14 localhost network[105914]: You are using 'network' service provided by 'network-scripts', which are now deprecated. Dec 15 04:03:14 localhost network[105915]: 'network-scripts' will be removed from distribution in near future. 
Dec 15 04:03:14 localhost network[105916]: It is advised to switch to 'NetworkManager' instead for network management. Dec 15 04:03:17 localhost systemd[1]: /usr/lib/systemd/system/insights-client.service:23: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Dec 15 04:03:17 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:fc:1f:13 MACDST=fa:16:3e:b3:bb:e3 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=22522 DF PROTO=TCP SPT=35016 DPT=9882 SEQ=3900424283 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A838237250000000001030307) Dec 15 04:03:19 localhost python3.9[106113]: ansible-ansible.builtin.service_facts Invoked Dec 15 04:03:19 localhost network[106130]: You are using 'network' service provided by 'network-scripts', which are now deprecated. Dec 15 04:03:19 localhost network[106131]: 'network-scripts' will be removed from distribution in near future. Dec 15 04:03:19 localhost network[106132]: It is advised to switch to 'NetworkManager' instead for network management. Dec 15 04:03:20 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:fc:1f:13 MACDST=fa:16:3e:b3:bb:e3 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=38944 DF PROTO=TCP SPT=38360 DPT=9105 SEQ=2447316896 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A838241250000000001030307) Dec 15 04:03:21 localhost systemd[1]: /usr/lib/systemd/system/insights-client.service:23: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. 
Dec 15 04:03:23 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:fc:1f:13 MACDST=fa:16:3e:b3:bb:e3 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=45803 DF PROTO=TCP SPT=36606 DPT=9100 SEQ=291674296 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A83824DB30000000001030307) Dec 15 04:03:23 localhost systemd[1]: Started /usr/bin/podman healthcheck run 165a6fd1f2b629d16da0798ec1c9bac41d302d530a0ffa668b2315a52d6aa04e. Dec 15 04:03:23 localhost systemd[1]: Started /usr/bin/podman healthcheck run 2649c3aec9af2f04509e1d248b1050b202fccc3ed2b86421c89a2a65376db057. Dec 15 04:03:23 localhost systemd[1]: Started /usr/bin/podman healthcheck run 36a88083ed613f6871d2631ae462bca308a45ba583333f7cfe4d0b0c40a18ba5. Dec 15 04:03:23 localhost systemd[1]: Started /usr/bin/podman healthcheck run 97115ff7c5a84ae7e17f79e696e94da3c63f9fe0051287fe9193bec282a03f99. Dec 15 04:03:23 localhost systemd[1]: Started /usr/bin/podman healthcheck run ac45612fab357b615b4684dbfa5bd000dd29eb1adfbaedb6db0802fc49e89549. Dec 15 04:03:23 localhost systemd[1]: Started /usr/bin/podman healthcheck run d6041e5471624a495d5be42684f6049ccf71802a80d5760239330151d465d146. Dec 15 04:03:23 localhost systemd[1]: tmp-crun.NFr4aW.mount: Deactivated successfully. 
Dec 15 04:03:23 localhost podman[106332]: 2025-12-15 09:03:23.4340945 +0000 UTC m=+0.079431714 container health_status 165a6fd1f2b629d16da0798ec1c9bac41d302d530a0ffa668b2315a52d6aa04e (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, release=1761123044, build-date=2025-11-18T22:51:28Z, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, io.openshift.expose-services=, url=https://www.redhat.com, vcs-type=git, description=Red Hat OpenStack Platform 17.1 collectd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, summary=Red Hat OpenStack Platform 17.1 collectd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, name=rhosp17/openstack-collectd, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, container_name=collectd, batch=17.1_20251118.1, vendor=Red Hat, Inc., managed_by=tripleo_ansible, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, io.buildah.version=1.41.4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, konflux.additional-tags=17.1.12 17.1_20251118.1, version=17.1.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, com.redhat.component=openstack-collectd-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, config_id=tripleo_step3) Dec 15 04:03:23 localhost podman[106333]: 2025-12-15 09:03:23.489023687 +0000 UTC m=+0.135799679 container health_status 2649c3aec9af2f04509e1d248b1050b202fccc3ed2b86421c89a2a65376db057 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, version=17.1.12, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '182e509007ab5e6e5b2500a552cbd5ba'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 iscsid, config_id=tripleo_step3, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, konflux.additional-tags=17.1.12 17.1_20251118.1, build-date=2025-11-18T23:44:13Z, vcs-type=git, name=rhosp17/openstack-iscsid, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, release=1761123044, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, tcib_managed=true, batch=17.1_20251118.1, container_name=iscsid, managed_by=tripleo_ansible, io.buildah.version=1.41.4, distribution-scope=public, url=https://www.redhat.com, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, com.redhat.component=openstack-iscsid-container, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid) Dec 15 04:03:23 localhost podman[106333]: 2025-12-15 09:03:23.499291042 +0000 UTC m=+0.146067034 container exec_died 2649c3aec9af2f04509e1d248b1050b202fccc3ed2b86421c89a2a65376db057 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible, url=https://www.redhat.com, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=iscsid, release=1761123044, batch=17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, version=17.1.12, vcs-type=git, maintainer=OpenStack TripleO Team, tcib_managed=true, com.redhat.component=openstack-iscsid-container, name=rhosp17/openstack-iscsid, vendor=Red Hat, Inc., architecture=x86_64, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, distribution-scope=public, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '182e509007ab5e6e5b2500a552cbd5ba'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, build-date=2025-11-18T23:44:13Z, description=Red Hat OpenStack Platform 17.1 iscsid, config_id=tripleo_step3, summary=Red Hat OpenStack Platform 17.1 iscsid, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.buildah.version=1.41.4) Dec 15 04:03:23 localhost systemd[1]: 2649c3aec9af2f04509e1d248b1050b202fccc3ed2b86421c89a2a65376db057.service: Deactivated successfully. Dec 15 04:03:23 localhost podman[106334]: 2025-12-15 09:03:23.534619256 +0000 UTC m=+0.180608637 container health_status 36a88083ed613f6871d2631ae462bca308a45ba583333f7cfe4d0b0c40a18ba5 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=unhealthy, config_id=tripleo_step5, build-date=2025-11-19T00:36:58Z, release=1761123044, managed_by=tripleo_ansible, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.component=openstack-nova-compute-container, io.buildah.version=1.41.4, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, vcs-type=git, vendor=Red Hat, Inc., version=17.1.12, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp17/openstack-nova-compute, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 nova-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, url=https://www.redhat.com, architecture=x86_64, konflux.additional-tags=17.1.12 17.1_20251118.1, container_name=nova_compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '182e509007ab5e6e5b2500a552cbd5ba-879500e96bf8dfb93687004bd86f2317'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 
'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, distribution-scope=public, batch=17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05) Dec 15 04:03:23 localhost podman[106335]: 2025-12-15 09:03:23.451116515 +0000 UTC m=+0.093673085 container health_status 97115ff7c5a84ae7e17f79e696e94da3c63f9fe0051287fe9193bec282a03f99 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'fcee5a4a91f85471fca7b61211375646'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, name=rhosp17/openstack-ceilometer-ipmi, config_id=tripleo_step4, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-type=git, version=17.1.12, com.redhat.component=openstack-ceilometer-ipmi-container, io.openshift.expose-services=, tcib_managed=true, build-date=2025-11-19T00:12:45Z, container_name=ceilometer_agent_ipmi, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.buildah.version=1.41.4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, konflux.additional-tags=17.1.12 17.1_20251118.1, distribution-scope=public, 
org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, vendor=Red Hat, Inc., managed_by=tripleo_ansible, url=https://www.redhat.com, release=1761123044, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, maintainer=OpenStack TripleO Team) Dec 15 04:03:23 localhost podman[106332]: 2025-12-15 09:03:23.567632127 +0000 UTC m=+0.212969411 container exec_died 165a6fd1f2b629d16da0798ec1c9bac41d302d530a0ffa668b2315a52d6aa04e (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, tcib_managed=true, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, com.redhat.component=openstack-collectd-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, konflux.additional-tags=17.1.12 17.1_20251118.1, release=1761123044, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, container_name=collectd, version=17.1.12, architecture=x86_64, io.buildah.version=1.41.4, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, distribution-scope=public, name=rhosp17/openstack-collectd, batch=17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 collectd, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, config_id=tripleo_step3, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 collectd, managed_by=tripleo_ansible, url=https://www.redhat.com, build-date=2025-11-18T22:51:28Z, io.openshift.expose-services=, maintainer=OpenStack TripleO Team) Dec 15 04:03:23 localhost systemd[1]: 165a6fd1f2b629d16da0798ec1c9bac41d302d530a0ffa668b2315a52d6aa04e.service: Deactivated successfully. 
Dec 15 04:03:23 localhost podman[106337]: 2025-12-15 09:03:23.472255219 +0000 UTC m=+0.104920914 container health_status d6041e5471624a495d5be42684f6049ccf71802a80d5760239330151d465d146 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=unhealthy, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.buildah.version=1.41.4, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'fcee5a4a91f85471fca7b61211375646'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, com.redhat.component=openstack-ceilometer-compute-container, konflux.additional-tags=17.1.12 17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, managed_by=tripleo_ansible, release=1761123044, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, 
version=17.1.12, distribution-scope=public, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, name=rhosp17/openstack-ceilometer-compute, config_id=tripleo_step4, build-date=2025-11-19T00:11:48Z, container_name=ceilometer_agent_compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, tcib_managed=true, vendor=Red Hat, Inc., architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20251118.1, io.openshift.expose-services=, url=https://www.redhat.com) Dec 15 04:03:23 localhost podman[106334]: 2025-12-15 09:03:23.579560137 +0000 UTC m=+0.225549538 container exec_died 36a88083ed613f6871d2631ae462bca308a45ba583333f7cfe4d0b0c40a18ba5 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, version=17.1.12, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, vcs-type=git, description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, config_id=tripleo_step5, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.buildah.version=1.41.4, io.openshift.expose-services=, release=1761123044, url=https://www.redhat.com, build-date=2025-11-19T00:36:58Z, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, batch=17.1_20251118.1, 
architecture=x86_64, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '182e509007ab5e6e5b2500a552cbd5ba-879500e96bf8dfb93687004bd86f2317'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, com.redhat.component=openstack-nova-compute-container, konflux.additional-tags=17.1.12 17.1_20251118.1, name=rhosp17/openstack-nova-compute, container_name=nova_compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute) Dec 15 04:03:23 
localhost podman[106335]: 2025-12-15 09:03:23.584318783 +0000 UTC m=+0.226875363 container exec_died 97115ff7c5a84ae7e17f79e696e94da3c63f9fe0051287fe9193bec282a03f99 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, config_id=tripleo_step4, io.buildah.version=1.41.4, io.openshift.expose-services=, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.component=openstack-ceilometer-ipmi-container, build-date=2025-11-19T00:12:45Z, container_name=ceilometer_agent_ipmi, url=https://www.redhat.com, name=rhosp17/openstack-ceilometer-ipmi, version=17.1.12, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'fcee5a4a91f85471fca7b61211375646'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, batch=17.1_20251118.1, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, tcib_managed=true, release=1761123044, distribution-scope=public, maintainer=OpenStack TripleO Team, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, konflux.additional-tags=17.1.12 17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vendor=Red Hat, Inc., managed_by=tripleo_ansible, architecture=x86_64) Dec 15 04:03:23 localhost systemd[1]: 97115ff7c5a84ae7e17f79e696e94da3c63f9fe0051287fe9193bec282a03f99.service: Deactivated successfully. Dec 15 04:03:23 localhost podman[106337]: 2025-12-15 09:03:23.608316205 +0000 UTC m=+0.240981900 container exec_died d6041e5471624a495d5be42684f6049ccf71802a80d5760239330151d465d146 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, name=rhosp17/openstack-ceilometer-compute, container_name=ceilometer_agent_compute, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, com.redhat.component=openstack-ceilometer-compute-container, version=17.1.12, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, release=1761123044, build-date=2025-11-19T00:11:48Z, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'fcee5a4a91f85471fca7b61211375646'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.buildah.version=1.41.4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, config_id=tripleo_step4, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible, vendor=Red Hat, Inc., io.openshift.expose-services=, vcs-type=git, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, batch=17.1_20251118.1, tcib_managed=true, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676) Dec 15 04:03:23 localhost podman[106337]: unhealthy Dec 15 04:03:23 localhost python3.9[106331]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_ceilometer_agent_compute.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None Dec 15 04:03:23 localhost systemd[1]: d6041e5471624a495d5be42684f6049ccf71802a80d5760239330151d465d146.service: Main process exited, code=exited, status=1/FAILURE Dec 15 04:03:23 localhost systemd[1]: 
d6041e5471624a495d5be42684f6049ccf71802a80d5760239330151d465d146.service: Failed with result 'exit-code'. Dec 15 04:03:23 localhost podman[106334]: unhealthy Dec 15 04:03:23 localhost systemd[1]: 36a88083ed613f6871d2631ae462bca308a45ba583333f7cfe4d0b0c40a18ba5.service: Main process exited, code=exited, status=1/FAILURE Dec 15 04:03:23 localhost systemd[1]: 36a88083ed613f6871d2631ae462bca308a45ba583333f7cfe4d0b0c40a18ba5.service: Failed with result 'exit-code'. Dec 15 04:03:23 localhost podman[106336]: 2025-12-15 09:03:23.643102004 +0000 UTC m=+0.281425110 container health_status ac45612fab357b615b4684dbfa5bd000dd29eb1adfbaedb6db0802fc49e89549 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, com.redhat.component=openstack-cron-container, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, container_name=logrotate_crond, architecture=x86_64, vcs-type=git, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public, version=17.1.12, build-date=2025-11-18T22:49:32Z, description=Red Hat OpenStack Platform 17.1 cron, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step4, vendor=Red Hat, Inc., konflux.additional-tags=17.1.12 17.1_20251118.1, tcib_managed=true, name=rhosp17/openstack-cron, io.openshift.expose-services=, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, batch=17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 
'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, summary=Red Hat OpenStack Platform 17.1 cron, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, release=1761123044, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 cron) Dec 15 04:03:23 localhost systemd[1]: Reloading. 
Dec 15 04:03:23 localhost podman[106336]: 2025-12-15 09:03:23.677293228 +0000 UTC m=+0.315616354 container exec_died ac45612fab357b615b4684dbfa5bd000dd29eb1adfbaedb6db0802fc49e89549 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, summary=Red Hat OpenStack Platform 17.1 cron, version=17.1.12, container_name=logrotate_crond, com.redhat.component=openstack-cron-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, name=rhosp17/openstack-cron, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, distribution-scope=public, url=https://www.redhat.com, io.buildah.version=1.41.4, build-date=2025-11-18T22:49:32Z, description=Red Hat OpenStack Platform 17.1 cron, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, release=1761123044, batch=17.1_20251118.1, konflux.additional-tags=17.1.12 17.1_20251118.1, config_id=tripleo_step4, managed_by=tripleo_ansible, vcs-type=git, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, vendor=Red Hat, Inc.) Dec 15 04:03:23 localhost systemd-sysv-generator[106487]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. Dec 15 04:03:23 localhost systemd-rc-local-generator[106482]: /etc/rc.d/rc.local is not marked executable, skipping. Dec 15 04:03:23 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Dec 15 04:03:23 localhost systemd[1]: tmp-crun.XphtOF.mount: Deactivated successfully. Dec 15 04:03:23 localhost systemd[1]: ac45612fab357b615b4684dbfa5bd000dd29eb1adfbaedb6db0802fc49e89549.service: Deactivated successfully. Dec 15 04:03:24 localhost systemd[1]: Stopping ceilometer_agent_compute container... Dec 15 04:03:24 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:fc:1f:13 MACDST=fa:16:3e:b3:bb:e3 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=45804 DF PROTO=TCP SPT=36606 DPT=9100 SEQ=291674296 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A838251A60000000001030307) Dec 15 04:03:25 localhost systemd[1]: Started /usr/bin/podman healthcheck run 4c3f871f4bdf287b1d3ce717956c1cf130617489217410eadf6fffcc440eb3b0. 
Dec 15 04:03:25 localhost podman[106543]: 2025-12-15 09:03:25.768201921 +0000 UTC m=+0.093546860 container health_status 4c3f871f4bdf287b1d3ce717956c1cf130617489217410eadf6fffcc440eb3b0 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, name=rhosp17/openstack-nova-compute, distribution-scope=public, vcs-type=git, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '879500e96bf8dfb93687004bd86f2317'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, batch=17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 nova-compute, release=1761123044, tcib_managed=true, container_name=nova_migration_target, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, 
config_id=tripleo_step4, architecture=x86_64, url=https://www.redhat.com, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, com.redhat.component=openstack-nova-compute-container, version=17.1.12, io.openshift.expose-services=, build-date=2025-11-19T00:36:58Z, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d) Dec 15 04:03:26 localhost podman[106543]: 2025-12-15 09:03:26.11627463 +0000 UTC m=+0.441619509 container exec_died 4c3f871f4bdf287b1d3ce717956c1cf130617489217410eadf6fffcc440eb3b0 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, build-date=2025-11-19T00:36:58Z, summary=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20251118.1, release=1761123044, io.openshift.expose-services=, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 17.1_20251118.1, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, distribution-scope=public, version=17.1.12, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 nova-compute, config_id=tripleo_step4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, managed_by=tripleo_ansible, architecture=x86_64, com.redhat.component=openstack-nova-compute-container, io.buildah.version=1.41.4, io.openshift.tags=rhosp 
osp openstack osp-17.1 openstack-nova-compute, maintainer=OpenStack TripleO Team, container_name=nova_migration_target, name=rhosp17/openstack-nova-compute, vcs-type=git, vendor=Red Hat, Inc., config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '879500e96bf8dfb93687004bd86f2317'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}) Dec 15 04:03:26 localhost systemd[1]: 4c3f871f4bdf287b1d3ce717956c1cf130617489217410eadf6fffcc440eb3b0.service: Deactivated successfully. 
Dec 15 04:03:26 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:fc:1f:13 MACDST=fa:16:3e:b3:bb:e3 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=45805 DF PROTO=TCP SPT=36606 DPT=9100 SEQ=291674296 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A838259A60000000001030307) Dec 15 04:03:28 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:fc:1f:13 MACDST=fa:16:3e:b3:bb:e3 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=63705 DF PROTO=TCP SPT=36106 DPT=9101 SEQ=557446636 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A838263A50000000001030307) Dec 15 04:03:29 localhost systemd[1]: Started /usr/bin/podman healthcheck run 4d35b7dc3f3a4e3c60661e85d3511f50804e87244d74cab67af5c6e8c447d379. Dec 15 04:03:29 localhost systemd[1]: Started /usr/bin/podman healthcheck run 2df8d20c6ba32f38bd0e6519c9c77a4146b632f21c81197d8c319b223aad81f1. Dec 15 04:03:30 localhost systemd[1]: tmp-crun.gylTaj.mount: Deactivated successfully. 
Dec 15 04:03:30 localhost podman[106613]: 2025-12-15 09:03:30.016089683 +0000 UTC m=+0.086699917 container health_status 2df8d20c6ba32f38bd0e6519c9c77a4146b632f21c81197d8c319b223aad81f1 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=unhealthy, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, container_name=ovn_controller, version=17.1.12, build-date=2025-11-18T23:34:05Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 ovn-controller, summary=Red Hat OpenStack Platform 17.1 ovn-controller, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, io.buildah.version=1.41.4, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, konflux.additional-tags=17.1.12 17.1_20251118.1, name=rhosp17/openstack-ovn-controller, release=1761123044, config_id=tripleo_step4, batch=17.1_20251118.1, url=https://www.redhat.com, tcib_managed=true, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-type=git, 
maintainer=OpenStack TripleO Team, com.redhat.component=openstack-ovn-controller-container, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, architecture=x86_64, vendor=Red Hat, Inc.) Dec 15 04:03:30 localhost podman[106613]: 2025-12-15 09:03:30.057471599 +0000 UTC m=+0.128081813 container exec_died 2df8d20c6ba32f38bd0e6519c9c77a4146b632f21c81197d8c319b223aad81f1 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, config_id=tripleo_step4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, summary=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.component=openstack-ovn-controller-container, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 ovn-controller, konflux.additional-tags=17.1.12 17.1_20251118.1, architecture=x86_64, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, batch=17.1_20251118.1, tcib_managed=true, io.buildah.version=1.41.4, url=https://www.redhat.com, release=1761123044, build-date=2025-11-18T23:34:05Z, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, managed_by=tripleo_ansible, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', 
'/var/log/containers/openvswitch:/var/log/ovn:z']}, vendor=Red Hat, Inc., name=rhosp17/openstack-ovn-controller, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, container_name=ovn_controller, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public, version=17.1.12, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git) Dec 15 04:03:30 localhost podman[106613]: unhealthy Dec 15 04:03:30 localhost podman[106612]: 2025-12-15 09:03:30.073359644 +0000 UTC m=+0.144447281 container health_status 4d35b7dc3f3a4e3c60661e85d3511f50804e87244d74cab67af5c6e8c447d379 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=unhealthy, build-date=2025-11-19T00:14:25Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.expose-services=, container_name=ovn_metadata_agent, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-neutron-metadata-agent-ovn, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, batch=17.1_20251118.1, version=17.1.12, vcs-type=git, distribution-scope=public, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'a56a6f14b467cd9064e40c03defa5ed7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, vendor=Red Hat, Inc., release=1761123044, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, io.buildah.version=1.41.4, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, managed_by=tripleo_ansible, config_id=tripleo_step4, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, tcib_managed=true) Dec 15 04:03:30 localhost systemd[1]: 2df8d20c6ba32f38bd0e6519c9c77a4146b632f21c81197d8c319b223aad81f1.service: Main process exited, code=exited, status=1/FAILURE Dec 15 04:03:30 localhost systemd[1]: 2df8d20c6ba32f38bd0e6519c9c77a4146b632f21c81197d8c319b223aad81f1.service: Failed with result 'exit-code'. 
Dec 15 04:03:30 localhost podman[106612]: 2025-12-15 09:03:30.110336462 +0000 UTC m=+0.181424109 container exec_died 4d35b7dc3f3a4e3c60661e85d3511f50804e87244d74cab67af5c6e8c447d379 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, build-date=2025-11-19T00:14:25Z, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, konflux.additional-tags=17.1.12 17.1_20251118.1, batch=17.1_20251118.1, release=1761123044, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.expose-services=, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'a56a6f14b467cd9064e40c03defa5ed7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', 
'/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, managed_by=tripleo_ansible, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_id=tripleo_step4, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, vcs-type=git, container_name=ovn_metadata_agent, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, name=rhosp17/openstack-neutron-metadata-agent-ovn, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, version=17.1.12, tcib_managed=true) Dec 15 04:03:30 localhost podman[106612]: unhealthy Dec 15 04:03:30 localhost systemd[1]: 4d35b7dc3f3a4e3c60661e85d3511f50804e87244d74cab67af5c6e8c447d379.service: Main process exited, code=exited, status=1/FAILURE Dec 15 04:03:30 localhost systemd[1]: 4d35b7dc3f3a4e3c60661e85d3511f50804e87244d74cab67af5c6e8c447d379.service: Failed with result 'exit-code'. 
Dec 15 04:03:31 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:fc:1f:13 MACDST=fa:16:3e:b3:bb:e3 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=32236 DF PROTO=TCP SPT=57244 DPT=9102 SEQ=2415902273 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A83826FA50000000001030307) Dec 15 04:03:34 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:fc:1f:13 MACDST=fa:16:3e:b3:bb:e3 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=64785 DF PROTO=TCP SPT=47226 DPT=9882 SEQ=2459246200 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A83827B650000000001030307) Dec 15 04:03:35 localhost sshd[106655]: main: sshd: ssh-rsa algorithm is disabled Dec 15 04:03:37 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:fc:1f:13 MACDST=fa:16:3e:b3:bb:e3 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=34135 DF PROTO=TCP SPT=56016 DPT=9105 SEQ=2294934178 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A838286E60000000001030307) Dec 15 04:03:39 localhost systemd[1]: Started /usr/bin/podman healthcheck run 6278d648aec9c9f12e0efaafa0d6ae5653eeb34459516699e562eeb9c8d23b3c. 
Dec 15 04:03:39 localhost podman[106657]: 2025-12-15 09:03:39.479119664 +0000 UTC m=+0.064389981 container health_status 6278d648aec9c9f12e0efaafa0d6ae5653eeb34459516699e562eeb9c8d23b3c (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, com.redhat.component=openstack-qdrouterd-container, url=https://www.redhat.com, version=17.1.12, build-date=2025-11-18T22:49:46Z, container_name=metrics_qdr, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, konflux.additional-tags=17.1.12 17.1_20251118.1, name=rhosp17/openstack-qdrouterd, release=1761123044, io.buildah.version=1.41.4, batch=17.1_20251118.1, architecture=x86_64, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '29d07da103bbd44a9ed3e29999314b03'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, 
distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, tcib_managed=true, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 qdrouterd, managed_by=tripleo_ansible, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, maintainer=OpenStack TripleO Team, config_id=tripleo_step1) Dec 15 04:03:39 localhost podman[106657]: 2025-12-15 09:03:39.666572872 +0000 UTC m=+0.251843159 container exec_died 6278d648aec9c9f12e0efaafa0d6ae5653eeb34459516699e562eeb9c8d23b3c (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, release=1761123044, io.buildah.version=1.41.4, konflux.additional-tags=17.1.12 17.1_20251118.1, build-date=2025-11-18T22:49:46Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, batch=17.1_20251118.1, tcib_managed=true, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, name=rhosp17/openstack-qdrouterd, summary=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '29d07da103bbd44a9ed3e29999314b03'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, description=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, container_name=metrics_qdr, url=https://www.redhat.com, config_id=tripleo_step1, version=17.1.12, io.openshift.expose-services=, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, architecture=x86_64, com.redhat.component=openstack-qdrouterd-container, vcs-type=git, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05) Dec 15 04:03:39 localhost systemd[1]: 6278d648aec9c9f12e0efaafa0d6ae5653eeb34459516699e562eeb9c8d23b3c.service: Deactivated successfully. 
Dec 15 04:03:41 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:fc:1f:13 MACDST=fa:16:3e:b3:bb:e3 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=63707 DF PROTO=TCP SPT=36106 DPT=9101 SEQ=557446636 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A838293250000000001030307) Dec 15 04:03:47 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:fc:1f:13 MACDST=fa:16:3e:b3:bb:e3 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=64787 DF PROTO=TCP SPT=47226 DPT=9882 SEQ=2459246200 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A8382AB250000000001030307) Dec 15 04:03:50 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:fc:1f:13 MACDST=fa:16:3e:b3:bb:e3 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=34137 DF PROTO=TCP SPT=56016 DPT=9105 SEQ=2294934178 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A8382B7250000000001030307) Dec 15 04:03:53 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:fc:1f:13 MACDST=fa:16:3e:b3:bb:e3 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=11345 DF PROTO=TCP SPT=36436 DPT=9100 SEQ=1983816446 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A8382C2E30000000001030307) Dec 15 04:03:53 localhost systemd[1]: Started /usr/bin/podman healthcheck run 165a6fd1f2b629d16da0798ec1c9bac41d302d530a0ffa668b2315a52d6aa04e. Dec 15 04:03:53 localhost systemd[1]: Started /usr/bin/podman healthcheck run 2649c3aec9af2f04509e1d248b1050b202fccc3ed2b86421c89a2a65376db057. Dec 15 04:03:53 localhost systemd[1]: Started /usr/bin/podman healthcheck run 36a88083ed613f6871d2631ae462bca308a45ba583333f7cfe4d0b0c40a18ba5. Dec 15 04:03:53 localhost systemd[1]: Started /usr/bin/podman healthcheck run 97115ff7c5a84ae7e17f79e696e94da3c63f9fe0051287fe9193bec282a03f99. 
Dec 15 04:03:53 localhost systemd[1]: Started /usr/bin/podman healthcheck run d6041e5471624a495d5be42684f6049ccf71802a80d5760239330151d465d146. Dec 15 04:03:53 localhost systemd[1]: tmp-crun.s9L5s4.mount: Deactivated successfully. Dec 15 04:03:53 localhost podman[106705]: Error: container d6041e5471624a495d5be42684f6049ccf71802a80d5760239330151d465d146 is not running Dec 15 04:03:53 localhost systemd[1]: d6041e5471624a495d5be42684f6049ccf71802a80d5760239330151d465d146.service: Main process exited, code=exited, status=125/n/a Dec 15 04:03:53 localhost systemd[1]: d6041e5471624a495d5be42684f6049ccf71802a80d5760239330151d465d146.service: Failed with result 'exit-code'. Dec 15 04:03:53 localhost podman[106686]: 2025-12-15 09:03:53.761468625 +0000 UTC m=+0.089183223 container health_status 165a6fd1f2b629d16da0798ec1c9bac41d302d530a0ffa668b2315a52d6aa04e (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, build-date=2025-11-18T22:51:28Z, vcs-type=git, managed_by=tripleo_ansible, tcib_managed=true, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, release=1761123044, config_id=tripleo_step3, name=rhosp17/openstack-collectd, io.openshift.expose-services=, io.buildah.version=1.41.4, summary=Red Hat OpenStack Platform 17.1 collectd, maintainer=OpenStack TripleO Team, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, distribution-scope=public, konflux.additional-tags=17.1.12 17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1, container_name=collectd, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 collectd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, com.redhat.component=openstack-collectd-container, version=17.1.12) Dec 15 04:03:53 localhost podman[106687]: 2025-12-15 09:03:53.762065871 +0000 UTC m=+0.085369872 container health_status 2649c3aec9af2f04509e1d248b1050b202fccc3ed2b86421c89a2a65376db057 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, url=https://www.redhat.com, distribution-scope=public, vendor=Red Hat, Inc., konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, build-date=2025-11-18T23:44:13Z, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, architecture=x86_64, vcs-type=git, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '182e509007ab5e6e5b2500a552cbd5ba'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, name=rhosp17/openstack-iscsid, com.redhat.component=openstack-iscsid-container, version=17.1.12, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, description=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.expose-services=, tcib_managed=true, container_name=iscsid, release=1761123044, batch=17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, config_id=tripleo_step3, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, 
maintainer=OpenStack TripleO Team, io.buildah.version=1.41.4, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, summary=Red Hat OpenStack Platform 17.1 iscsid) Dec 15 04:03:53 localhost podman[106694]: 2025-12-15 09:03:53.825013242 +0000 UTC m=+0.137133854 container health_status 97115ff7c5a84ae7e17f79e696e94da3c63f9fe0051287fe9193bec282a03f99 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public, konflux.additional-tags=17.1.12 17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, vendor=Red Hat, Inc., container_name=ceilometer_agent_ipmi, batch=17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, release=1761123044, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'fcee5a4a91f85471fca7b61211375646'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.buildah.version=1.41.4, 
io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, config_id=tripleo_step4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, name=rhosp17/openstack-ceilometer-ipmi, architecture=x86_64, vcs-type=git, tcib_managed=true, com.redhat.component=openstack-ceilometer-ipmi-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, managed_by=tripleo_ansible, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, build-date=2025-11-19T00:12:45Z, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, version=17.1.12, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi) Dec 15 04:03:53 localhost podman[106687]: 2025-12-15 09:03:53.843426135 +0000 UTC m=+0.166730146 container exec_died 2649c3aec9af2f04509e1d248b1050b202fccc3ed2b86421c89a2a65376db057 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, container_name=iscsid, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, description=Red Hat OpenStack Platform 17.1 iscsid, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, maintainer=OpenStack TripleO Team, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '182e509007ab5e6e5b2500a552cbd5ba'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, io.openshift.expose-services=, tcib_managed=true, url=https://www.redhat.com, distribution-scope=public, config_id=tripleo_step3, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 iscsid, io.buildah.version=1.41.4, version=17.1.12, com.redhat.component=openstack-iscsid-container, build-date=2025-11-18T23:44:13Z, konflux.additional-tags=17.1.12 17.1_20251118.1, batch=17.1_20251118.1, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, name=rhosp17/openstack-iscsid, release=1761123044, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI) Dec 15 04:03:53 localhost podman[106694]: 2025-12-15 09:03:53.854368147 +0000 UTC m=+0.166488769 container exec_died 97115ff7c5a84ae7e17f79e696e94da3c63f9fe0051287fe9193bec282a03f99 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, vendor=Red Hat, Inc., vcs-type=git, tcib_managed=true, managed_by=tripleo_ansible, distribution-scope=public, release=1761123044, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, 
vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, build-date=2025-11-19T00:12:45Z, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.expose-services=, version=17.1.12, architecture=x86_64, config_id=tripleo_step4, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, konflux.additional-tags=17.1.12 17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, batch=17.1_20251118.1, com.redhat.component=openstack-ceilometer-ipmi-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, container_name=ceilometer_agent_ipmi, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, maintainer=OpenStack TripleO Team, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'fcee5a4a91f85471fca7b61211375646'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, 
name=rhosp17/openstack-ceilometer-ipmi) Dec 15 04:03:53 localhost systemd[1]: 2649c3aec9af2f04509e1d248b1050b202fccc3ed2b86421c89a2a65376db057.service: Deactivated successfully. Dec 15 04:03:53 localhost systemd[1]: 97115ff7c5a84ae7e17f79e696e94da3c63f9fe0051287fe9193bec282a03f99.service: Deactivated successfully. Dec 15 04:03:53 localhost podman[106686]: 2025-12-15 09:03:53.89117236 +0000 UTC m=+0.218886888 container exec_died 165a6fd1f2b629d16da0798ec1c9bac41d302d530a0ffa668b2315a52d6aa04e (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., config_id=tripleo_step3, name=rhosp17/openstack-collectd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, distribution-scope=public, io.buildah.version=1.41.4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, version=17.1.12, vcs-type=git, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 collectd, com.redhat.component=openstack-collectd-container, release=1761123044, tcib_managed=true, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-18T22:51:28Z, batch=17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, url=https://www.redhat.com, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, container_name=collectd, description=Red Hat OpenStack Platform 17.1 collectd) Dec 15 04:03:53 localhost systemd[1]: 165a6fd1f2b629d16da0798ec1c9bac41d302d530a0ffa668b2315a52d6aa04e.service: Deactivated successfully. 
Dec 15 04:03:53 localhost podman[106688]: 2025-12-15 09:03:53.905099233 +0000 UTC m=+0.229384390 container health_status 36a88083ed613f6871d2631ae462bca308a45ba583333f7cfe4d0b0c40a18ba5 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=unhealthy, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, tcib_managed=true, version=17.1.12, name=rhosp17/openstack-nova-compute, io.buildah.version=1.41.4, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '182e509007ab5e6e5b2500a552cbd5ba-879500e96bf8dfb93687004bd86f2317'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', 
'/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.component=openstack-nova-compute-container, config_id=tripleo_step5, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, konflux.additional-tags=17.1.12 17.1_20251118.1, release=1761123044, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 nova-compute, container_name=nova_compute, url=https://www.redhat.com, batch=17.1_20251118.1, build-date=2025-11-19T00:36:58Z, vcs-type=git, distribution-scope=public, io.openshift.expose-services=, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d) Dec 15 04:03:53 localhost podman[106688]: 2025-12-15 09:03:53.921380558 +0000 UTC m=+0.245665715 container exec_died 36a88083ed613f6871d2631ae462bca308a45ba583333f7cfe4d0b0c40a18ba5 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, com.redhat.component=openstack-nova-compute-container, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, build-date=2025-11-19T00:36:58Z, config_id=tripleo_step5, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, name=rhosp17/openstack-nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.buildah.version=1.41.4, url=https://www.redhat.com, konflux.additional-tags=17.1.12 17.1_20251118.1, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, summary=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20251118.1, release=1761123044, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, distribution-scope=public, io.openshift.expose-services=, version=17.1.12, container_name=nova_compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '182e509007ab5e6e5b2500a552cbd5ba-879500e96bf8dfb93687004bd86f2317'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', 
'/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, vendor=Red Hat, Inc., architecture=x86_64) Dec 15 04:03:53 localhost podman[106688]: unhealthy Dec 15 04:03:53 localhost systemd[1]: 36a88083ed613f6871d2631ae462bca308a45ba583333f7cfe4d0b0c40a18ba5.service: Main process exited, code=exited, status=1/FAILURE Dec 15 04:03:53 localhost systemd[1]: 36a88083ed613f6871d2631ae462bca308a45ba583333f7cfe4d0b0c40a18ba5.service: Failed with result 'exit-code'. Dec 15 04:03:54 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:fc:1f:13 MACDST=fa:16:3e:b3:bb:e3 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=11346 DF PROTO=TCP SPT=36436 DPT=9100 SEQ=1983816446 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A8382C6E50000000001030307) Dec 15 04:03:54 localhost systemd[1]: Started /usr/bin/podman healthcheck run ac45612fab357b615b4684dbfa5bd000dd29eb1adfbaedb6db0802fc49e89549. Dec 15 04:03:54 localhost systemd[1]: tmp-crun.A17u9r.mount: Deactivated successfully. 
Dec 15 04:03:54 localhost podman[106780]: 2025-12-15 09:03:54.759630223 +0000 UTC m=+0.095193414 container health_status ac45612fab357b615b4684dbfa5bd000dd29eb1adfbaedb6db0802fc49e89549 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, build-date=2025-11-18T22:49:32Z, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.buildah.version=1.41.4, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, version=17.1.12, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.component=openstack-cron-container, container_name=logrotate_crond, managed_by=tripleo_ansible, distribution-scope=public, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 cron, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-type=git, batch=17.1_20251118.1, vendor=Red Hat, Inc., release=1761123044, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, name=rhosp17/openstack-cron, description=Red Hat OpenStack Platform 
17.1 cron, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, config_id=tripleo_step4, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, url=https://www.redhat.com, tcib_managed=true, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI) Dec 15 04:03:54 localhost podman[106780]: 2025-12-15 09:03:54.796462417 +0000 UTC m=+0.132025618 container exec_died ac45612fab357b615b4684dbfa5bd000dd29eb1adfbaedb6db0802fc49e89549 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, release=1761123044, url=https://www.redhat.com, architecture=x86_64, io.openshift.expose-services=, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.component=openstack-cron-container, config_id=tripleo_step4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vendor=Red Hat, Inc., konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 cron, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, build-date=2025-11-18T22:49:32Z, container_name=logrotate_crond, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, io.buildah.version=1.41.4, maintainer=OpenStack TripleO Team, version=17.1.12, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, managed_by=tripleo_ansible, name=rhosp17/openstack-cron, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 
'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, batch=17.1_20251118.1, distribution-scope=public, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 cron) Dec 15 04:03:54 localhost systemd[1]: ac45612fab357b615b4684dbfa5bd000dd29eb1adfbaedb6db0802fc49e89549.service: Deactivated successfully. Dec 15 04:03:55 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:fc:1f:13 MACDST=fa:16:3e:b3:bb:e3 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=6165 DF PROTO=TCP SPT=34512 DPT=9102 SEQ=4136051358 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A8382CD260000000001030307) Dec 15 04:03:56 localhost systemd[1]: Started /usr/bin/podman healthcheck run 4c3f871f4bdf287b1d3ce717956c1cf130617489217410eadf6fffcc440eb3b0. Dec 15 04:03:56 localhost systemd[1]: tmp-crun.lqTfnF.mount: Deactivated successfully. 
Dec 15 04:03:56 localhost podman[106799]: 2025-12-15 09:03:56.754185143 +0000 UTC m=+0.081576600 container health_status 4c3f871f4bdf287b1d3ce717956c1cf130617489217410eadf6fffcc440eb3b0 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, distribution-scope=public, konflux.additional-tags=17.1.12 17.1_20251118.1, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=nova_migration_target, com.redhat.component=openstack-nova-compute-container, version=17.1.12, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '879500e96bf8dfb93687004bd86f2317'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, 
managed_by=tripleo_ansible, batch=17.1_20251118.1, build-date=2025-11-19T00:36:58Z, vcs-type=git, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, tcib_managed=true, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.41.4, maintainer=OpenStack TripleO Team, config_id=tripleo_step4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, release=1761123044, name=rhosp17/openstack-nova-compute, architecture=x86_64) Dec 15 04:03:57 localhost podman[106799]: 2025-12-15 09:03:57.123673135 +0000 UTC m=+0.451064612 container exec_died 4c3f871f4bdf287b1d3ce717956c1cf130617489217410eadf6fffcc440eb3b0 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, com.redhat.component=openstack-nova-compute-container, vcs-type=git, description=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, managed_by=tripleo_ansible, build-date=2025-11-19T00:36:58Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, tcib_managed=true, io.openshift.expose-services=, io.buildah.version=1.41.4, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, batch=17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, name=rhosp17/openstack-nova-compute, config_id=tripleo_step4, vendor=Red Hat, Inc., container_name=nova_migration_target, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, distribution-scope=public, release=1761123044, version=17.1.12, 
config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '879500e96bf8dfb93687004bd86f2317'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, maintainer=OpenStack TripleO Team, url=https://www.redhat.com) Dec 15 04:03:57 localhost systemd[1]: 4c3f871f4bdf287b1d3ce717956c1cf130617489217410eadf6fffcc440eb3b0.service: Deactivated successfully. Dec 15 04:03:58 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:fc:1f:13 MACDST=fa:16:3e:b3:bb:e3 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=36241 DF PROTO=TCP SPT=36758 DPT=9101 SEQ=2230949885 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A8382D8E50000000001030307) Dec 15 04:04:00 localhost systemd[1]: Started /usr/bin/podman healthcheck run 2df8d20c6ba32f38bd0e6519c9c77a4146b632f21c81197d8c319b223aad81f1. 
Dec 15 04:04:00 localhost systemd[1]: Started /usr/bin/podman healthcheck run 4d35b7dc3f3a4e3c60661e85d3511f50804e87244d74cab67af5c6e8c447d379. Dec 15 04:04:00 localhost podman[106823]: 2025-12-15 09:04:00.49689203 +0000 UTC m=+0.073720130 container health_status 4d35b7dc3f3a4e3c60661e85d3511f50804e87244d74cab67af5c6e8c447d379 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=unhealthy, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, container_name=ovn_metadata_agent, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'a56a6f14b467cd9064e40c03defa5ed7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, release=1761123044, distribution-scope=public, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, architecture=x86_64, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, config_id=tripleo_step4, name=rhosp17/openstack-neutron-metadata-agent-ovn, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, build-date=2025-11-19T00:14:25Z, vcs-type=git, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, tcib_managed=true, konflux.additional-tags=17.1.12 17.1_20251118.1, batch=17.1_20251118.1, version=17.1.12, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, vendor=Red Hat, Inc., org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, url=https://www.redhat.com, io.buildah.version=1.41.4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=) Dec 15 04:04:00 localhost podman[106823]: 2025-12-15 09:04:00.513508234 +0000 UTC m=+0.090336334 container exec_died 4d35b7dc3f3a4e3c60661e85d3511f50804e87244d74cab67af5c6e8c447d379 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, version=17.1.12, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'a56a6f14b467cd9064e40c03defa5ed7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', 
'/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vendor=Red Hat, Inc., architecture=x86_64, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, name=rhosp17/openstack-neutron-metadata-agent-ovn, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.buildah.version=1.41.4, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=ovn_metadata_agent, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, tcib_managed=true, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, url=https://www.redhat.com, batch=17.1_20251118.1, release=1761123044, config_id=tripleo_step4, managed_by=tripleo_ansible, 
distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.expose-services=, build-date=2025-11-19T00:14:25Z) Dec 15 04:04:00 localhost podman[106822]: 2025-12-15 09:04:00.546965228 +0000 UTC m=+0.125895315 container health_status 2df8d20c6ba32f38bd0e6519c9c77a4146b632f21c81197d8c319b223aad81f1 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=unhealthy, batch=17.1_20251118.1, config_id=tripleo_step4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, io.openshift.expose-services=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, managed_by=tripleo_ansible, name=rhosp17/openstack-ovn-controller, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 ovn-controller, url=https://www.redhat.com, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, version=17.1.12, vendor=Red Hat, Inc., konflux.additional-tags=17.1.12 17.1_20251118.1, io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, 
build-date=2025-11-18T23:34:05Z, com.redhat.component=openstack-ovn-controller-container, vcs-type=git, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, summary=Red Hat OpenStack Platform 17.1 ovn-controller, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, container_name=ovn_controller, distribution-scope=public, release=1761123044) Dec 15 04:04:00 localhost podman[106823]: unhealthy Dec 15 04:04:00 localhost systemd[1]: 4d35b7dc3f3a4e3c60661e85d3511f50804e87244d74cab67af5c6e8c447d379.service: Main process exited, code=exited, status=1/FAILURE Dec 15 04:04:00 localhost systemd[1]: 4d35b7dc3f3a4e3c60661e85d3511f50804e87244d74cab67af5c6e8c447d379.service: Failed with result 'exit-code'. Dec 15 04:04:00 localhost podman[106822]: 2025-12-15 09:04:00.58824998 +0000 UTC m=+0.167179997 container exec_died 2df8d20c6ba32f38bd0e6519c9c77a4146b632f21c81197d8c319b223aad81f1 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, container_name=ovn_controller, name=rhosp17/openstack-ovn-controller, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public, vcs-type=git, konflux.additional-tags=17.1.12 17.1_20251118.1, config_id=tripleo_step4, release=1761123044, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, io.openshift.tags=rhosp osp openstack osp-17.1 
openstack-ovn-controller, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, batch=17.1_20251118.1, io.buildah.version=1.41.4, tcib_managed=true, build-date=2025-11-18T23:34:05Z, description=Red Hat OpenStack Platform 17.1 ovn-controller, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, managed_by=tripleo_ansible, com.redhat.component=openstack-ovn-controller-container, summary=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.expose-services=, version=17.1.12, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, architecture=x86_64, vendor=Red Hat, Inc., org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com) Dec 15 04:04:00 localhost podman[106822]: unhealthy Dec 15 04:04:00 localhost systemd[1]: 2df8d20c6ba32f38bd0e6519c9c77a4146b632f21c81197d8c319b223aad81f1.service: Main process exited, code=exited, status=1/FAILURE Dec 15 04:04:00 localhost systemd[1]: 2df8d20c6ba32f38bd0e6519c9c77a4146b632f21c81197d8c319b223aad81f1.service: Failed with result 'exit-code'. 
Dec 15 04:04:01 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:fc:1f:13 MACDST=fa:16:3e:b3:bb:e3 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=6167 DF PROTO=TCP SPT=34512 DPT=9102 SEQ=4136051358 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A8382E4E60000000001030307) Dec 15 04:04:04 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:fc:1f:13 MACDST=fa:16:3e:b3:bb:e3 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=16379 DF PROTO=TCP SPT=35154 DPT=9882 SEQ=3266744635 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A8382F0A50000000001030307) Dec 15 04:04:06 localhost podman[106499]: time="2025-12-15T09:04:06Z" level=warning msg="StopSignal SIGTERM failed to stop container ceilometer_agent_compute in 42 seconds, resorting to SIGKILL" Dec 15 04:04:06 localhost systemd[1]: libpod-d6041e5471624a495d5be42684f6049ccf71802a80d5760239330151d465d146.scope: Deactivated successfully. Dec 15 04:04:06 localhost systemd[1]: libpod-d6041e5471624a495d5be42684f6049ccf71802a80d5760239330151d465d146.scope: Consumed 6.419s CPU time. 
Dec 15 04:04:06 localhost podman[106499]: 2025-12-15 09:04:06.1408086 +0000 UTC m=+42.102974881 container died d6041e5471624a495d5be42684f6049ccf71802a80d5760239330151d465d146 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'fcee5a4a91f85471fca7b61211375646'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public, vcs-type=git, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, version=17.1.12, io.openshift.expose-services=, tcib_managed=true, 
build-date=2025-11-19T00:11:48Z, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, name=rhosp17/openstack-ceilometer-compute, config_id=tripleo_step4, com.redhat.component=openstack-ceilometer-compute-container, io.buildah.version=1.41.4, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, batch=17.1_20251118.1, container_name=ceilometer_agent_compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, release=1761123044) Dec 15 04:04:06 localhost systemd[1]: d6041e5471624a495d5be42684f6049ccf71802a80d5760239330151d465d146.timer: Deactivated successfully. Dec 15 04:04:06 localhost systemd[1]: Stopped /usr/bin/podman healthcheck run d6041e5471624a495d5be42684f6049ccf71802a80d5760239330151d465d146. Dec 15 04:04:06 localhost systemd[1]: d6041e5471624a495d5be42684f6049ccf71802a80d5760239330151d465d146.service: Failed to open /run/systemd/transient/d6041e5471624a495d5be42684f6049ccf71802a80d5760239330151d465d146.service: No such file or directory Dec 15 04:04:06 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-d6041e5471624a495d5be42684f6049ccf71802a80d5760239330151d465d146-userdata-shm.mount: Deactivated successfully. Dec 15 04:04:06 localhost systemd[1]: var-lib-containers-storage-overlay-22a343801c56d43a44097fa6d3d71cdf7d4806c36e876e53679f0b5dcaee3588-merged.mount: Deactivated successfully. 
Dec 15 04:04:06 localhost podman[106499]: 2025-12-15 09:04:06.199492888 +0000 UTC m=+42.161659139 container cleanup d6041e5471624a495d5be42684f6049ccf71802a80d5760239330151d465d146 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, name=rhosp17/openstack-ceilometer-compute, architecture=x86_64, config_id=tripleo_step4, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, com.redhat.component=openstack-ceilometer-compute-container, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, batch=17.1_20251118.1, container_name=ceilometer_agent_compute, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, release=1761123044, build-date=2025-11-19T00:11:48Z, konflux.additional-tags=17.1.12 17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.buildah.version=1.41.4, version=17.1.12, vcs-type=git, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'fcee5a4a91f85471fca7b61211375646'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, vendor=Red Hat, Inc., io.openshift.expose-services=) Dec 15 04:04:06 localhost podman[106499]: ceilometer_agent_compute Dec 15 04:04:06 localhost systemd[1]: d6041e5471624a495d5be42684f6049ccf71802a80d5760239330151d465d146.timer: Failed to open /run/systemd/transient/d6041e5471624a495d5be42684f6049ccf71802a80d5760239330151d465d146.timer: No such file or directory Dec 15 04:04:06 localhost systemd[1]: d6041e5471624a495d5be42684f6049ccf71802a80d5760239330151d465d146.service: Failed to open /run/systemd/transient/d6041e5471624a495d5be42684f6049ccf71802a80d5760239330151d465d146.service: No such file or directory Dec 15 04:04:06 localhost podman[106863]: 2025-12-15 09:04:06.230715242 +0000 UTC m=+0.079631789 container cleanup d6041e5471624a495d5be42684f6049ccf71802a80d5760239330151d465d146 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, com.redhat.component=openstack-ceilometer-compute-container, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, distribution-scope=public, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, version=17.1.12, vcs-type=git, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 
'fcee5a4a91f85471fca7b61211375646'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, tcib_managed=true, architecture=x86_64, build-date=2025-11-19T00:11:48Z, container_name=ceilometer_agent_compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, name=rhosp17/openstack-ceilometer-compute, managed_by=tripleo_ansible, release=1761123044, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.buildah.version=1.41.4, vendor=Red Hat, Inc., batch=17.1_20251118.1, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, 
config_id=tripleo_step4) Dec 15 04:04:06 localhost systemd[1]: libpod-conmon-d6041e5471624a495d5be42684f6049ccf71802a80d5760239330151d465d146.scope: Deactivated successfully. Dec 15 04:04:06 localhost systemd[1]: d6041e5471624a495d5be42684f6049ccf71802a80d5760239330151d465d146.timer: Failed to open /run/systemd/transient/d6041e5471624a495d5be42684f6049ccf71802a80d5760239330151d465d146.timer: No such file or directory Dec 15 04:04:06 localhost systemd[1]: d6041e5471624a495d5be42684f6049ccf71802a80d5760239330151d465d146.service: Failed to open /run/systemd/transient/d6041e5471624a495d5be42684f6049ccf71802a80d5760239330151d465d146.service: No such file or directory Dec 15 04:04:06 localhost podman[106878]: 2025-12-15 09:04:06.339640793 +0000 UTC m=+0.072315164 container cleanup d6041e5471624a495d5be42684f6049ccf71802a80d5760239330151d465d146 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, architecture=x86_64, batch=17.1_20251118.1, com.redhat.component=openstack-ceilometer-compute-container, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, version=17.1.12, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'fcee5a4a91f85471fca7b61211375646'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, release=1761123044, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, vcs-type=git, tcib_managed=true, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.buildah.version=1.41.4, build-date=2025-11-19T00:11:48Z, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, url=https://www.redhat.com, distribution-scope=public, konflux.additional-tags=17.1.12 17.1_20251118.1, config_id=tripleo_step4, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, container_name=ceilometer_agent_compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, name=rhosp17/openstack-ceilometer-compute) Dec 15 04:04:06 localhost podman[106878]: ceilometer_agent_compute Dec 15 04:04:06 localhost systemd[1]: tripleo_ceilometer_agent_compute.service: Deactivated successfully. Dec 15 04:04:06 localhost systemd[1]: Stopped ceilometer_agent_compute container. Dec 15 04:04:06 localhost systemd[1]: tripleo_ceilometer_agent_compute.service: Consumed 1.097s CPU time, no IO. 
Dec 15 04:04:07 localhost python3.9[106980]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_ceilometer_agent_ipmi.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None Dec 15 04:04:07 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:fc:1f:13 MACDST=fa:16:3e:b3:bb:e3 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=57995 DF PROTO=TCP SPT=58934 DPT=9105 SEQ=950403520 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A8382FC250000000001030307) Dec 15 04:04:08 localhost systemd[1]: Reloading. Dec 15 04:04:08 localhost systemd-rc-local-generator[107005]: /etc/rc.d/rc.local is not marked executable, skipping. Dec 15 04:04:08 localhost systemd-sysv-generator[107009]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. Dec 15 04:04:08 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Dec 15 04:04:08 localhost systemd[1]: Starting Check and recover tripleo_nova_virtqemud... Dec 15 04:04:08 localhost systemd[1]: Stopping ceilometer_agent_ipmi container... Dec 15 04:04:08 localhost recover_tripleo_nova_virtqemud[107020]: 61849 Dec 15 04:04:08 localhost systemd[1]: tripleo_nova_virtqemud_recover.service: Deactivated successfully. Dec 15 04:04:08 localhost systemd[1]: Finished Check and recover tripleo_nova_virtqemud. Dec 15 04:04:10 localhost systemd[1]: Started /usr/bin/podman healthcheck run 6278d648aec9c9f12e0efaafa0d6ae5653eeb34459516699e562eeb9c8d23b3c. 
Dec 15 04:04:10 localhost podman[107035]: 2025-12-15 09:04:10.752713709 +0000 UTC m=+0.083216084 container health_status 6278d648aec9c9f12e0efaafa0d6ae5653eeb34459516699e562eeb9c8d23b3c (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public, vendor=Red Hat, Inc., container_name=metrics_qdr, tcib_managed=true, config_id=tripleo_step1, version=17.1.12, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, vcs-type=git, com.redhat.component=openstack-qdrouterd-container, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, build-date=2025-11-18T22:49:46Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, description=Red Hat OpenStack Platform 17.1 qdrouterd, batch=17.1_20251118.1, io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '29d07da103bbd44a9ed3e29999314b03'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, managed_by=tripleo_ansible, konflux.additional-tags=17.1.12 17.1_20251118.1, release=1761123044, summary=Red Hat OpenStack Platform 17.1 qdrouterd, name=rhosp17/openstack-qdrouterd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a) Dec 15 04:04:10 localhost podman[107035]: 2025-12-15 09:04:10.947435181 +0000 UTC m=+0.277937616 container exec_died 6278d648aec9c9f12e0efaafa0d6ae5653eeb34459516699e562eeb9c8d23b3c (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, build-date=2025-11-18T22:49:46Z, url=https://www.redhat.com, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, tcib_managed=true, io.openshift.expose-services=, release=1761123044, summary=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-type=git, io.buildah.version=1.41.4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 qdrouterd, config_id=tripleo_step1, architecture=x86_64, konflux.additional-tags=17.1.12 17.1_20251118.1, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, name=rhosp17/openstack-qdrouterd, batch=17.1_20251118.1, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, container_name=metrics_qdr, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.component=openstack-qdrouterd-container, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '29d07da103bbd44a9ed3e29999314b03'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, version=17.1.12) Dec 15 04:04:10 localhost systemd[1]: 6278d648aec9c9f12e0efaafa0d6ae5653eeb34459516699e562eeb9c8d23b3c.service: Deactivated successfully. 
Dec 15 04:04:11 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:fc:1f:13 MACDST=fa:16:3e:b3:bb:e3 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=36243 DF PROTO=TCP SPT=36758 DPT=9101 SEQ=2230949885 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A838309260000000001030307) Dec 15 04:04:17 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:fc:1f:13 MACDST=fa:16:3e:b3:bb:e3 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=16381 DF PROTO=TCP SPT=35154 DPT=9882 SEQ=3266744635 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A838321260000000001030307) Dec 15 04:04:20 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:fc:1f:13 MACDST=fa:16:3e:b3:bb:e3 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=57997 DF PROTO=TCP SPT=58934 DPT=9105 SEQ=950403520 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A83832D250000000001030307) Dec 15 04:04:23 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:fc:1f:13 MACDST=fa:16:3e:b3:bb:e3 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=29874 DF PROTO=TCP SPT=40516 DPT=9100 SEQ=1779956081 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A838338130000000001030307) Dec 15 04:04:23 localhost systemd[1]: Started /usr/bin/podman healthcheck run 2649c3aec9af2f04509e1d248b1050b202fccc3ed2b86421c89a2a65376db057. Dec 15 04:04:23 localhost systemd[1]: Started /usr/bin/podman healthcheck run 97115ff7c5a84ae7e17f79e696e94da3c63f9fe0051287fe9193bec282a03f99. Dec 15 04:04:23 localhost systemd[1]: Started /usr/bin/podman healthcheck run 165a6fd1f2b629d16da0798ec1c9bac41d302d530a0ffa668b2315a52d6aa04e. 
Dec 15 04:04:23 localhost podman[107065]: Error: container 97115ff7c5a84ae7e17f79e696e94da3c63f9fe0051287fe9193bec282a03f99 is not running Dec 15 04:04:23 localhost systemd[1]: 97115ff7c5a84ae7e17f79e696e94da3c63f9fe0051287fe9193bec282a03f99.service: Main process exited, code=exited, status=125/n/a Dec 15 04:04:23 localhost systemd[1]: 97115ff7c5a84ae7e17f79e696e94da3c63f9fe0051287fe9193bec282a03f99.service: Failed with result 'exit-code'. Dec 15 04:04:24 localhost systemd[1]: Started /usr/bin/podman healthcheck run 36a88083ed613f6871d2631ae462bca308a45ba583333f7cfe4d0b0c40a18ba5. Dec 15 04:04:24 localhost systemd[1]: tmp-crun.hS20L8.mount: Deactivated successfully. Dec 15 04:04:24 localhost systemd[1]: tmp-crun.4ZUrmc.mount: Deactivated successfully. Dec 15 04:04:24 localhost podman[107064]: 2025-12-15 09:04:24.077809114 +0000 UTC m=+0.163712574 container health_status 2649c3aec9af2f04509e1d248b1050b202fccc3ed2b86421c89a2a65376db057 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, summary=Red Hat OpenStack Platform 17.1 iscsid, batch=17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, konflux.additional-tags=17.1.12 17.1_20251118.1, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, com.redhat.component=openstack-iscsid-container, io.openshift.expose-services=, io.buildah.version=1.41.4, release=1761123044, tcib_managed=true, container_name=iscsid, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-type=git, config_id=tripleo_step3, name=rhosp17/openstack-iscsid, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, maintainer=OpenStack TripleO Team, version=17.1.12, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, build-date=2025-11-18T23:44:13Z, managed_by=tripleo_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 
'TRIPLEO_CONFIG_HASH': '182e509007ab5e6e5b2500a552cbd5ba'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, url=https://www.redhat.com, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 iscsid) Dec 15 04:04:24 localhost podman[107074]: 2025-12-15 09:04:24.033589654 +0000 UTC m=+0.110482214 container health_status 165a6fd1f2b629d16da0798ec1c9bac41d302d530a0ffa668b2315a52d6aa04e (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, version=17.1.12, tcib_managed=true, io.buildah.version=1.41.4, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.expose-services=, batch=17.1_20251118.1, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, managed_by=tripleo_ansible, distribution-scope=public, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, summary=Red 
Hat OpenStack Platform 17.1 collectd, maintainer=OpenStack TripleO Team, vcs-type=git, architecture=x86_64, release=1761123044, config_id=tripleo_step3, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, vendor=Red Hat, Inc., konflux.additional-tags=17.1.12 17.1_20251118.1, name=rhosp17/openstack-collectd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-18T22:51:28Z, com.redhat.component=openstack-collectd-container, description=Red Hat OpenStack Platform 17.1 collectd, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', 
'/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, container_name=collectd) Dec 15 04:04:24 localhost podman[107064]: 2025-12-15 09:04:24.091426588 +0000 UTC m=+0.177330108 container exec_died 2649c3aec9af2f04509e1d248b1050b202fccc3ed2b86421c89a2a65376db057 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, vendor=Red Hat, Inc., architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_id=tripleo_step3, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, release=1761123044, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, konflux.additional-tags=17.1.12 17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, description=Red Hat OpenStack Platform 17.1 iscsid, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, build-date=2025-11-18T23:44:13Z, name=rhosp17/openstack-iscsid, container_name=iscsid, tcib_managed=true, batch=17.1_20251118.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '182e509007ab5e6e5b2500a552cbd5ba'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, version=17.1.12, io.openshift.expose-services=, url=https://www.redhat.com, com.redhat.component=openstack-iscsid-container) Dec 15 04:04:24 localhost systemd[1]: 2649c3aec9af2f04509e1d248b1050b202fccc3ed2b86421c89a2a65376db057.service: Deactivated successfully. Dec 15 04:04:24 localhost podman[107074]: 2025-12-15 09:04:24.117376732 +0000 UTC m=+0.194269302 container exec_died 165a6fd1f2b629d16da0798ec1c9bac41d302d530a0ffa668b2315a52d6aa04e (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, com.redhat.component=openstack-collectd-container, release=1761123044, summary=Red Hat OpenStack Platform 17.1 collectd, tcib_managed=true, url=https://www.redhat.com, container_name=collectd, vendor=Red Hat, Inc., konflux.additional-tags=17.1.12 17.1_20251118.1, name=rhosp17/openstack-collectd, build-date=2025-11-18T22:51:28Z, managed_by=tripleo_ansible, io.openshift.expose-services=, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vcs-type=git, distribution-scope=public, version=17.1.12, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, architecture=x86_64, config_id=tripleo_step3, maintainer=OpenStack TripleO Team, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, description=Red Hat OpenStack Platform 17.1 collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, io.buildah.version=1.41.4, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream) Dec 15 04:04:24 localhost systemd[1]: 165a6fd1f2b629d16da0798ec1c9bac41d302d530a0ffa668b2315a52d6aa04e.service: Deactivated successfully. 
Dec 15 04:04:24 localhost podman[107098]: 2025-12-15 09:04:24.095106667 +0000 UTC m=+0.085288690 container health_status 36a88083ed613f6871d2631ae462bca308a45ba583333f7cfe4d0b0c40a18ba5 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=unhealthy, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '182e509007ab5e6e5b2500a552cbd5ba-879500e96bf8dfb93687004bd86f2317'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', 
'/var/lib/nova:/var/lib/nova:shared']}, summary=Red Hat OpenStack Platform 17.1 nova-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.buildah.version=1.41.4, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, konflux.additional-tags=17.1.12 17.1_20251118.1, version=17.1.12, name=rhosp17/openstack-nova-compute, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, com.redhat.component=openstack-nova-compute-container, vendor=Red Hat, Inc., config_id=tripleo_step5, distribution-scope=public, batch=17.1_20251118.1, maintainer=OpenStack TripleO Team, build-date=2025-11-19T00:36:58Z, release=1761123044, managed_by=tripleo_ansible, container_name=nova_compute, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute) Dec 15 04:04:24 localhost podman[107098]: 2025-12-15 09:04:24.179530243 +0000 UTC m=+0.169712246 container exec_died 36a88083ed613f6871d2631ae462bca308a45ba583333f7cfe4d0b0c40a18ba5 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, container_name=nova_compute, name=rhosp17/openstack-nova-compute, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, release=1761123044, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, distribution-scope=public, io.buildah.version=1.41.4, 
io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, batch=17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, config_id=tripleo_step5, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '182e509007ab5e6e5b2500a552cbd5ba-879500e96bf8dfb93687004bd86f2317'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', 
'/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, com.redhat.component=openstack-nova-compute-container, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-type=git, url=https://www.redhat.com, build-date=2025-11-19T00:36:58Z, version=17.1.12, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, architecture=x86_64) Dec 15 04:04:24 localhost podman[107098]: unhealthy Dec 15 04:04:24 localhost systemd[1]: 36a88083ed613f6871d2631ae462bca308a45ba583333f7cfe4d0b0c40a18ba5.service: Main process exited, code=exited, status=1/FAILURE Dec 15 04:04:24 localhost systemd[1]: 36a88083ed613f6871d2631ae462bca308a45ba583333f7cfe4d0b0c40a18ba5.service: Failed with result 'exit-code'. Dec 15 04:04:24 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:fc:1f:13 MACDST=fa:16:3e:b3:bb:e3 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=29875 DF PROTO=TCP SPT=40516 DPT=9100 SEQ=1779956081 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A83833C260000000001030307) Dec 15 04:04:24 localhost systemd[1]: Started /usr/bin/podman healthcheck run ac45612fab357b615b4684dbfa5bd000dd29eb1adfbaedb6db0802fc49e89549. 
Dec 15 04:04:24 localhost podman[107137]: 2025-12-15 09:04:24.984492419 +0000 UTC m=+0.068013518 container health_status ac45612fab357b615b4684dbfa5bd000dd29eb1adfbaedb6db0802fc49e89549 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, summary=Red Hat OpenStack Platform 17.1 cron, io.openshift.expose-services=, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, name=rhosp17/openstack-cron, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1, release=1761123044, build-date=2025-11-18T22:49:32Z, version=17.1.12, url=https://www.redhat.com, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, container_name=logrotate_crond, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, architecture=x86_64, io.buildah.version=1.41.4, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', 
'/var/log/containers:/var/log/containers:z']}, description=Red Hat OpenStack Platform 17.1 cron, vcs-type=git, konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_id=tripleo_step4, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, vendor=Red Hat, Inc., com.redhat.component=openstack-cron-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI) Dec 15 04:04:24 localhost podman[107137]: 2025-12-15 09:04:24.99538953 +0000 UTC m=+0.078910599 container exec_died ac45612fab357b615b4684dbfa5bd000dd29eb1adfbaedb6db0802fc49e89549 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, version=17.1.12, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, release=1761123044, build-date=2025-11-18T22:49:32Z, distribution-scope=public, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, name=rhosp17/openstack-cron, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', 
'/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, description=Red Hat OpenStack Platform 17.1 cron, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, batch=17.1_20251118.1, container_name=logrotate_crond, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.component=openstack-cron-container, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 cron, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, io.buildah.version=1.41.4, url=https://www.redhat.com, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vendor=Red Hat, Inc., config_id=tripleo_step4, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, managed_by=tripleo_ansible, architecture=x86_64) Dec 15 04:04:25 localhost systemd[1]: ac45612fab357b615b4684dbfa5bd000dd29eb1adfbaedb6db0802fc49e89549.service: Deactivated successfully. Dec 15 04:04:25 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:fc:1f:13 MACDST=fa:16:3e:b3:bb:e3 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=64173 DF PROTO=TCP SPT=52722 DPT=9102 SEQ=401436897 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A838342650000000001030307) Dec 15 04:04:27 localhost systemd[1]: Started /usr/bin/podman healthcheck run 4c3f871f4bdf287b1d3ce717956c1cf130617489217410eadf6fffcc440eb3b0. 
Dec 15 04:04:27 localhost podman[107204]: 2025-12-15 09:04:27.517018952 +0000 UTC m=+0.087517279 container health_status 4c3f871f4bdf287b1d3ce717956c1cf130617489217410eadf6fffcc440eb3b0 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, container_name=nova_migration_target, build-date=2025-11-19T00:36:58Z, tcib_managed=true, config_id=tripleo_step4, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, name=rhosp17/openstack-nova-compute, io.buildah.version=1.41.4, version=17.1.12, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-type=git, konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team, release=1761123044, batch=17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, managed_by=tripleo_ansible, url=https://www.redhat.com, com.redhat.component=openstack-nova-compute-container, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '879500e96bf8dfb93687004bd86f2317'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute) Dec 15 04:04:27 localhost podman[107204]: 2025-12-15 09:04:27.89156345 +0000 UTC m=+0.462061807 container exec_died 4c3f871f4bdf287b1d3ce717956c1cf130617489217410eadf6fffcc440eb3b0 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, summary=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, build-date=2025-11-19T00:36:58Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, url=https://www.redhat.com, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '879500e96bf8dfb93687004bd86f2317'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, io.openshift.expose-services=, vcs-type=git, vendor=Red Hat, Inc., config_id=tripleo_step4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-nova-compute-container, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.41.4, batch=17.1_20251118.1, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, version=17.1.12, tcib_managed=true, architecture=x86_64, release=1761123044, konflux.additional-tags=17.1.12 17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp17/openstack-nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, container_name=nova_migration_target, distribution-scope=public) Dec 15 04:04:27 localhost systemd[1]: 4c3f871f4bdf287b1d3ce717956c1cf130617489217410eadf6fffcc440eb3b0.service: Deactivated successfully. Dec 15 04:04:28 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:fc:1f:13 MACDST=fa:16:3e:b3:bb:e3 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=57789 DF PROTO=TCP SPT=33464 DPT=9101 SEQ=3208271871 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A83834E260000000001030307) Dec 15 04:04:30 localhost systemd[1]: Started /usr/bin/podman healthcheck run 2df8d20c6ba32f38bd0e6519c9c77a4146b632f21c81197d8c319b223aad81f1. 
Dec 15 04:04:30 localhost systemd[1]: Started /usr/bin/podman healthcheck run 4d35b7dc3f3a4e3c60661e85d3511f50804e87244d74cab67af5c6e8c447d379. Dec 15 04:04:30 localhost podman[107256]: 2025-12-15 09:04:30.728122705 +0000 UTC m=+0.060553618 container health_status 2df8d20c6ba32f38bd0e6519c9c77a4146b632f21c81197d8c319b223aad81f1 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=unhealthy, version=17.1.12, name=rhosp17/openstack-ovn-controller, container_name=ovn_controller, batch=17.1_20251118.1, url=https://www.redhat.com, config_id=tripleo_step4, summary=Red Hat OpenStack Platform 17.1 ovn-controller, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 ovn-controller, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, build-date=2025-11-18T23:34:05Z, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, managed_by=tripleo_ansible, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, com.redhat.component=openstack-ovn-controller-container, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, architecture=x86_64, io.buildah.version=1.41.4, io.openshift.expose-services=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, release=1761123044, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.12 17.1_20251118.1) Dec 15 04:04:30 localhost podman[107256]: 2025-12-15 09:04:30.742229752 +0000 UTC m=+0.074660675 container exec_died 2df8d20c6ba32f38bd0e6519c9c77a4146b632f21c81197d8c319b223aad81f1 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, summary=Red Hat OpenStack Platform 17.1 ovn-controller, io.buildah.version=1.41.4, version=17.1.12, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, vendor=Red Hat, Inc., architecture=x86_64, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, release=1761123044, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, managed_by=tripleo_ansible, url=https://www.redhat.com, batch=17.1_20251118.1, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 ovn-controller, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, com.redhat.component=openstack-ovn-controller-container, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': 
['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, vcs-type=git, config_id=tripleo_step4, name=rhosp17/openstack-ovn-controller, distribution-scope=public, build-date=2025-11-18T23:34:05Z, tcib_managed=true, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, container_name=ovn_controller) Dec 15 04:04:30 localhost podman[107256]: unhealthy Dec 15 04:04:30 localhost systemd[1]: 2df8d20c6ba32f38bd0e6519c9c77a4146b632f21c81197d8c319b223aad81f1.service: Main process exited, code=exited, status=1/FAILURE Dec 15 04:04:30 localhost systemd[1]: 2df8d20c6ba32f38bd0e6519c9c77a4146b632f21c81197d8c319b223aad81f1.service: Failed with result 'exit-code'. Dec 15 04:04:30 localhost podman[107257]: 2025-12-15 09:04:30.798336381 +0000 UTC m=+0.123967333 container health_status 4d35b7dc3f3a4e3c60661e85d3511f50804e87244d74cab67af5c6e8c447d379 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=unhealthy, batch=17.1_20251118.1, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.12, tcib_managed=true, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, container_name=ovn_metadata_agent, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.buildah.version=1.41.4, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-19T00:14:25Z, distribution-scope=public, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'a56a6f14b467cd9064e40c03defa5ed7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, io.openshift.expose-services=, managed_by=tripleo_ansible, release=1761123044, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 17.1_20251118.1, config_id=tripleo_step4, 
name=rhosp17/openstack-neutron-metadata-agent-ovn, architecture=x86_64) Dec 15 04:04:30 localhost podman[107257]: 2025-12-15 09:04:30.813340602 +0000 UTC m=+0.138971584 container exec_died 4d35b7dc3f3a4e3c60661e85d3511f50804e87244d74cab67af5c6e8c447d379 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, architecture=x86_64, config_id=tripleo_step4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, konflux.additional-tags=17.1.12 17.1_20251118.1, io.buildah.version=1.41.4, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, batch=17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, build-date=2025-11-19T00:14:25Z, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, name=rhosp17/openstack-neutron-metadata-agent-ovn, io.openshift.expose-services=, vendor=Red Hat, Inc., config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'a56a6f14b467cd9064e40c03defa5ed7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, version=17.1.12, vcs-type=git, release=1761123044, container_name=ovn_metadata_agent, distribution-scope=public, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c) Dec 15 04:04:30 localhost podman[107257]: unhealthy Dec 15 04:04:30 localhost systemd[1]: 4d35b7dc3f3a4e3c60661e85d3511f50804e87244d74cab67af5c6e8c447d379.service: Main process exited, code=exited, status=1/FAILURE Dec 15 04:04:30 localhost systemd[1]: 4d35b7dc3f3a4e3c60661e85d3511f50804e87244d74cab67af5c6e8c447d379.service: Failed with result 'exit-code'. 
Dec 15 04:04:31 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:fc:1f:13 MACDST=fa:16:3e:b3:bb:e3 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=64175 DF PROTO=TCP SPT=52722 DPT=9102 SEQ=401436897 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A83835A250000000001030307) Dec 15 04:04:34 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:fc:1f:13 MACDST=fa:16:3e:b3:bb:e3 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=61693 DF PROTO=TCP SPT=46762 DPT=9882 SEQ=3870898048 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A838365A50000000001030307) Dec 15 04:04:37 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:fc:1f:13 MACDST=fa:16:3e:b3:bb:e3 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=8240 DF PROTO=TCP SPT=52858 DPT=9105 SEQ=1018179834 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A838371650000000001030307) Dec 15 04:04:41 localhost systemd[1]: Started /usr/bin/podman healthcheck run 6278d648aec9c9f12e0efaafa0d6ae5653eeb34459516699e562eeb9c8d23b3c. 
Dec 15 04:04:41 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:fc:1f:13 MACDST=fa:16:3e:b3:bb:e3 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=57791 DF PROTO=TCP SPT=33464 DPT=9101 SEQ=3208271871 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A83837F250000000001030307) Dec 15 04:04:41 localhost podman[107293]: 2025-12-15 09:04:41.498571524 +0000 UTC m=+0.079989467 container health_status 6278d648aec9c9f12e0efaafa0d6ae5653eeb34459516699e562eeb9c8d23b3c (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, io.buildah.version=1.41.4, summary=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.12 17.1_20251118.1, tcib_managed=true, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vcs-type=git, release=1761123044, batch=17.1_20251118.1, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 qdrouterd, name=rhosp17/openstack-qdrouterd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-18T22:49:46Z, config_id=tripleo_step1, version=17.1.12, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., managed_by=tripleo_ansible, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, architecture=x86_64, container_name=metrics_qdr, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '29d07da103bbd44a9ed3e29999314b03'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 
'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, com.redhat.component=openstack-qdrouterd-container, url=https://www.redhat.com) Dec 15 04:04:41 localhost podman[107293]: 2025-12-15 09:04:41.684226774 +0000 UTC m=+0.265644767 container exec_died 6278d648aec9c9f12e0efaafa0d6ae5653eeb34459516699e562eeb9c8d23b3c (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, managed_by=tripleo_ansible, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.expose-services=, config_id=tripleo_step1, release=1761123044, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-11-18T22:49:46Z, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 qdrouterd, version=17.1.12, summary=Red Hat OpenStack Platform 17.1 qdrouterd, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vcs-type=git, batch=17.1_20251118.1, tcib_managed=true, architecture=x86_64, io.buildah.version=1.41.4, container_name=metrics_qdr, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '29d07da103bbd44a9ed3e29999314b03'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, com.redhat.component=openstack-qdrouterd-container, url=https://www.redhat.com, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, distribution-scope=public, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-qdrouterd) Dec 15 04:04:41 localhost systemd[1]: 6278d648aec9c9f12e0efaafa0d6ae5653eeb34459516699e562eeb9c8d23b3c.service: Deactivated successfully. 
Dec 15 04:04:47 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:fc:1f:13 MACDST=fa:16:3e:b3:bb:e3 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=61695 DF PROTO=TCP SPT=46762 DPT=9882 SEQ=3870898048 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A838395250000000001030307) Dec 15 04:04:50 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:fc:1f:13 MACDST=fa:16:3e:b3:bb:e3 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=8242 DF PROTO=TCP SPT=52858 DPT=9105 SEQ=1018179834 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A8383A1250000000001030307) Dec 15 04:04:50 localhost podman[107022]: time="2025-12-15T09:04:50Z" level=warning msg="StopSignal SIGTERM failed to stop container ceilometer_agent_ipmi in 42 seconds, resorting to SIGKILL" Dec 15 04:04:50 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:f3:4e:cb MACDST=fa:16:3e:b3:bb:e3 MACPROTO=0800 SRC=192.168.122.110 DST=192.168.122.106 LEN=40 TOS=0x00 PREC=0x00 TTL=64 ID=0 DF PROTO=TCP SPT=6379 DPT=52776 SEQ=1766092657 ACK=0 WINDOW=0 RES=0x00 RST URGP=0 Dec 15 04:04:50 localhost systemd[1]: libpod-97115ff7c5a84ae7e17f79e696e94da3c63f9fe0051287fe9193bec282a03f99.scope: Deactivated successfully. Dec 15 04:04:50 localhost systemd[1]: libpod-97115ff7c5a84ae7e17f79e696e94da3c63f9fe0051287fe9193bec282a03f99.scope: Consumed 6.515s CPU time. 
Dec 15 04:04:50 localhost podman[107022]: 2025-12-15 09:04:50.62159669 +0000 UTC m=+42.093677613 container stop 97115ff7c5a84ae7e17f79e696e94da3c63f9fe0051287fe9193bec282a03f99 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, com.redhat.component=openstack-ceilometer-ipmi-container, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, batch=17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, version=17.1.12, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, build-date=2025-11-19T00:12:45Z, url=https://www.redhat.com, release=1761123044, name=rhosp17/openstack-ceilometer-ipmi, config_id=tripleo_step4, vendor=Red Hat, Inc., vcs-type=git, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, managed_by=tripleo_ansible, konflux.additional-tags=17.1.12 17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, architecture=x86_64, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'fcee5a4a91f85471fca7b61211375646'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.expose-services=, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, container_name=ceilometer_agent_ipmi, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, maintainer=OpenStack TripleO Team) Dec 15 04:04:50 localhost podman[107022]: 2025-12-15 09:04:50.654225561 +0000 UTC m=+42.126306504 container died 97115ff7c5a84ae7e17f79e696e94da3c63f9fe0051287fe9193bec282a03f99 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, distribution-scope=public, container_name=ceilometer_agent_ipmi, architecture=x86_64, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'fcee5a4a91f85471fca7b61211375646'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', 
'/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.openshift.expose-services=, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, maintainer=OpenStack TripleO Team, version=17.1.12, batch=17.1_20251118.1, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, konflux.additional-tags=17.1.12 17.1_20251118.1, tcib_managed=true, vendor=Red Hat, Inc., managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp17/openstack-ceilometer-ipmi, com.redhat.component=openstack-ceilometer-ipmi-container, release=1761123044, io.buildah.version=1.41.4, config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, build-date=2025-11-19T00:12:45Z) Dec 15 04:04:50 localhost systemd[1]: 97115ff7c5a84ae7e17f79e696e94da3c63f9fe0051287fe9193bec282a03f99.timer: Deactivated successfully. Dec 15 04:04:50 localhost systemd[1]: Stopped /usr/bin/podman healthcheck run 97115ff7c5a84ae7e17f79e696e94da3c63f9fe0051287fe9193bec282a03f99. Dec 15 04:04:50 localhost systemd[1]: 97115ff7c5a84ae7e17f79e696e94da3c63f9fe0051287fe9193bec282a03f99.service: Failed to open /run/systemd/transient/97115ff7c5a84ae7e17f79e696e94da3c63f9fe0051287fe9193bec282a03f99.service: No such file or directory Dec 15 04:04:50 localhost systemd[1]: tmp-crun.wv5mn7.mount: Deactivated successfully. Dec 15 04:04:50 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-97115ff7c5a84ae7e17f79e696e94da3c63f9fe0051287fe9193bec282a03f99-userdata-shm.mount: Deactivated successfully. 
Dec 15 04:04:50 localhost podman[107022]: 2025-12-15 09:04:50.714494312 +0000 UTC m=+42.186575265 container cleanup 97115ff7c5a84ae7e17f79e696e94da3c63f9fe0051287fe9193bec282a03f99 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, build-date=2025-11-19T00:12:45Z, com.redhat.component=openstack-ceilometer-ipmi-container, io.buildah.version=1.41.4, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vendor=Red Hat, Inc., batch=17.1_20251118.1, name=rhosp17/openstack-ceilometer-ipmi, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, konflux.additional-tags=17.1.12 17.1_20251118.1, release=1761123044, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, config_id=tripleo_step4, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, container_name=ceilometer_agent_ipmi, io.openshift.expose-services=, tcib_managed=true, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, version=17.1.12, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'fcee5a4a91f85471fca7b61211375646'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, vcs-type=git) Dec 15 04:04:50 localhost podman[107022]: ceilometer_agent_ipmi Dec 15 04:04:50 localhost systemd[1]: 97115ff7c5a84ae7e17f79e696e94da3c63f9fe0051287fe9193bec282a03f99.timer: Failed to open /run/systemd/transient/97115ff7c5a84ae7e17f79e696e94da3c63f9fe0051287fe9193bec282a03f99.timer: No such file or directory Dec 15 04:04:50 localhost systemd[1]: 97115ff7c5a84ae7e17f79e696e94da3c63f9fe0051287fe9193bec282a03f99.service: Failed to open /run/systemd/transient/97115ff7c5a84ae7e17f79e696e94da3c63f9fe0051287fe9193bec282a03f99.service: No such file or directory Dec 15 04:04:50 localhost podman[107324]: 2025-12-15 09:04:50.728118646 +0000 UTC m=+0.086146123 container cleanup 97115ff7c5a84ae7e17f79e696e94da3c63f9fe0051287fe9193bec282a03f99 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, config_id=tripleo_step4, com.redhat.component=openstack-ceilometer-ipmi-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, container_name=ceilometer_agent_ipmi, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, io.openshift.expose-services=, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, managed_by=tripleo_ansible, tcib_managed=true, distribution-scope=public, vcs-type=git, build-date=2025-11-19T00:12:45Z, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.12 17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.buildah.version=1.41.4, batch=17.1_20251118.1, version=17.1.12, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, release=1761123044, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'fcee5a4a91f85471fca7b61211375646'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, name=rhosp17/openstack-ceilometer-ipmi, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, architecture=x86_64, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, url=https://www.redhat.com) Dec 15 04:04:50 localhost systemd[1]: libpod-conmon-97115ff7c5a84ae7e17f79e696e94da3c63f9fe0051287fe9193bec282a03f99.scope: Deactivated successfully. 
Dec 15 04:04:50 localhost systemd[1]: 97115ff7c5a84ae7e17f79e696e94da3c63f9fe0051287fe9193bec282a03f99.timer: Failed to open /run/systemd/transient/97115ff7c5a84ae7e17f79e696e94da3c63f9fe0051287fe9193bec282a03f99.timer: No such file or directory Dec 15 04:04:50 localhost systemd[1]: 97115ff7c5a84ae7e17f79e696e94da3c63f9fe0051287fe9193bec282a03f99.service: Failed to open /run/systemd/transient/97115ff7c5a84ae7e17f79e696e94da3c63f9fe0051287fe9193bec282a03f99.service: No such file or directory Dec 15 04:04:50 localhost podman[107340]: 2025-12-15 09:04:50.811889795 +0000 UTC m=+0.046730050 container cleanup 97115ff7c5a84ae7e17f79e696e94da3c63f9fe0051287fe9193bec282a03f99 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, config_id=tripleo_step4, tcib_managed=true, build-date=2025-11-19T00:12:45Z, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-type=git, vendor=Red Hat, Inc., version=17.1.12, managed_by=tripleo_ansible, konflux.additional-tags=17.1.12 17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, release=1761123044, org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'fcee5a4a91f85471fca7b61211375646'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', 
'/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, container_name=ceilometer_agent_ipmi, io.buildah.version=1.41.4, distribution-scope=public, batch=17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, architecture=x86_64, url=https://www.redhat.com, com.redhat.component=openstack-ceilometer-ipmi-container, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, name=rhosp17/openstack-ceilometer-ipmi, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676) Dec 15 04:04:50 localhost podman[107340]: ceilometer_agent_ipmi Dec 15 04:04:50 localhost systemd[1]: tripleo_ceilometer_agent_ipmi.service: Deactivated successfully. Dec 15 04:04:50 localhost systemd[1]: Stopped ceilometer_agent_ipmi container. Dec 15 04:04:51 localhost python3.9[107444]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_collectd.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None Dec 15 04:04:51 localhost systemd[1]: Reloading. Dec 15 04:04:51 localhost systemd-sysv-generator[107474]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. 
Dec 15 04:04:51 localhost systemd-rc-local-generator[107471]: /etc/rc.d/rc.local is not marked executable, skipping. Dec 15 04:04:51 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Dec 15 04:04:51 localhost systemd[1]: var-lib-containers-storage-overlay-dcb3adb8ad1cc413fefbcfb3f7ed9e47d52e4bf4c16860b1049f35510ac414e2-merged.mount: Deactivated successfully. Dec 15 04:04:51 localhost systemd[1]: Stopping collectd container... Dec 15 04:04:52 localhost systemd[1]: tmp-crun.JS5TxZ.mount: Deactivated successfully. Dec 15 04:04:53 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:fc:1f:13 MACDST=fa:16:3e:b3:bb:e3 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=19534 DF PROTO=TCP SPT=44202 DPT=9100 SEQ=847711568 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A8383AD430000000001030307) Dec 15 04:04:54 localhost systemd[1]: Started /usr/bin/podman healthcheck run 165a6fd1f2b629d16da0798ec1c9bac41d302d530a0ffa668b2315a52d6aa04e. Dec 15 04:04:54 localhost systemd[1]: Started /usr/bin/podman healthcheck run 2649c3aec9af2f04509e1d248b1050b202fccc3ed2b86421c89a2a65376db057. Dec 15 04:04:54 localhost podman[107498]: Error: container 165a6fd1f2b629d16da0798ec1c9bac41d302d530a0ffa668b2315a52d6aa04e is not running Dec 15 04:04:54 localhost systemd[1]: 165a6fd1f2b629d16da0798ec1c9bac41d302d530a0ffa668b2315a52d6aa04e.service: Main process exited, code=exited, status=125/n/a Dec 15 04:04:54 localhost systemd[1]: 165a6fd1f2b629d16da0798ec1c9bac41d302d530a0ffa668b2315a52d6aa04e.service: Failed with result 'exit-code'. Dec 15 04:04:54 localhost systemd[1]: Started /usr/bin/podman healthcheck run 36a88083ed613f6871d2631ae462bca308a45ba583333f7cfe4d0b0c40a18ba5. Dec 15 04:04:54 localhost systemd[1]: tmp-crun.U5vsQH.mount: Deactivated successfully. 
Dec 15 04:04:54 localhost podman[107522]: 2025-12-15 09:04:54.330755969 +0000 UTC m=+0.075669822 container health_status 36a88083ed613f6871d2631ae462bca308a45ba583333f7cfe4d0b0c40a18ba5 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=unhealthy, architecture=x86_64, container_name=nova_compute, io.openshift.expose-services=, config_id=tripleo_step5, managed_by=tripleo_ansible, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, vcs-type=git, release=1761123044, description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, version=17.1.12, io.buildah.version=1.41.4, konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-nova-compute, com.redhat.component=openstack-nova-compute-container, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., batch=17.1_20251118.1, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '182e509007ab5e6e5b2500a552cbd5ba-879500e96bf8dfb93687004bd86f2317'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, url=https://www.redhat.com, build-date=2025-11-19T00:36:58Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true) Dec 15 04:04:54 localhost podman[107499]: 2025-12-15 09:04:54.309225924 +0000 UTC m=+0.136399166 container health_status 2649c3aec9af2f04509e1d248b1050b202fccc3ed2b86421c89a2a65376db057 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, managed_by=tripleo_ansible, io.buildah.version=1.41.4, batch=17.1_20251118.1, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, description=Red Hat OpenStack Platform 17.1 iscsid, config_id=tripleo_step3, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.display-name=Red Hat OpenStack Platform 
17.1 iscsid, vcs-type=git, vendor=Red Hat, Inc., org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, url=https://www.redhat.com, tcib_managed=true, version=17.1.12, com.redhat.component=openstack-iscsid-container, name=rhosp17/openstack-iscsid, build-date=2025-11-18T23:44:13Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, distribution-scope=public, io.openshift.expose-services=, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1761123044, container_name=iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '182e509007ab5e6e5b2500a552cbd5ba'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid) Dec 15 04:04:54 localhost podman[107522]: 2025-12-15 09:04:54.378535896 +0000 UTC m=+0.123449719 container exec_died 
36a88083ed613f6871d2631ae462bca308a45ba583333f7cfe4d0b0c40a18ba5 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, vcs-type=git, config_id=tripleo_step5, tcib_managed=true, maintainer=OpenStack TripleO Team, version=17.1.12, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, url=https://www.redhat.com, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-19T00:36:58Z, name=rhosp17/openstack-nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.41.4, konflux.additional-tags=17.1.12 17.1_20251118.1, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.component=openstack-nova-compute-container, container_name=nova_compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, release=1761123044, vendor=Red Hat, Inc., batch=17.1_20251118.1, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '182e509007ab5e6e5b2500a552cbd5ba-879500e96bf8dfb93687004bd86f2317'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}) Dec 15 04:04:54 localhost podman[107522]: unhealthy Dec 15 04:04:54 localhost systemd[1]: 36a88083ed613f6871d2631ae462bca308a45ba583333f7cfe4d0b0c40a18ba5.service: Main process exited, code=exited, status=1/FAILURE Dec 15 04:04:54 localhost systemd[1]: 36a88083ed613f6871d2631ae462bca308a45ba583333f7cfe4d0b0c40a18ba5.service: Failed with result 'exit-code'. 
Dec 15 04:04:54 localhost podman[107499]: 2025-12-15 09:04:54.394244235 +0000 UTC m=+0.221417457 container exec_died 2649c3aec9af2f04509e1d248b1050b202fccc3ed2b86421c89a2a65376db057 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, build-date=2025-11-18T23:44:13Z, vendor=Red Hat, Inc., managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, batch=17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 iscsid, url=https://www.redhat.com, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, config_id=tripleo_step3, release=1761123044, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 iscsid, version=17.1.12, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, container_name=iscsid, name=rhosp17/openstack-iscsid, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.expose-services=, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.component=openstack-iscsid-container, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, io.buildah.version=1.41.4, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '182e509007ab5e6e5b2500a552cbd5ba'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, distribution-scope=public) Dec 15 04:04:54 localhost systemd[1]: 2649c3aec9af2f04509e1d248b1050b202fccc3ed2b86421c89a2a65376db057.service: Deactivated successfully. Dec 15 04:04:55 localhost systemd[1]: Started /usr/bin/podman healthcheck run ac45612fab357b615b4684dbfa5bd000dd29eb1adfbaedb6db0802fc49e89549. Dec 15 04:04:55 localhost podman[107556]: 2025-12-15 09:04:55.717897321 +0000 UTC m=+0.055014001 container health_status ac45612fab357b615b4684dbfa5bd000dd29eb1adfbaedb6db0802fc49e89549 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, com.redhat.component=openstack-cron-container, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, build-date=2025-11-18T22:49:32Z, konflux.additional-tags=17.1.12 17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, distribution-scope=public, container_name=logrotate_crond, config_id=tripleo_step4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, release=1761123044, batch=17.1_20251118.1, 
tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 cron, vcs-type=git, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, managed_by=tripleo_ansible, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, version=17.1.12, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 cron, vendor=Red Hat, Inc., name=rhosp17/openstack-cron) Dec 15 04:04:55 localhost podman[107556]: 2025-12-15 09:04:55.729164992 +0000 UTC m=+0.066281672 container exec_died ac45612fab357b615b4684dbfa5bd000dd29eb1adfbaedb6db0802fc49e89549 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, version=17.1.12, vcs-type=git, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 
'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, vendor=Red Hat, Inc., tcib_managed=true, build-date=2025-11-18T22:49:32Z, com.redhat.component=openstack-cron-container, io.openshift.expose-services=, managed_by=tripleo_ansible, name=rhosp17/openstack-cron, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, container_name=logrotate_crond, release=1761123044, io.buildah.version=1.41.4, summary=Red Hat OpenStack Platform 17.1 cron, distribution-scope=public, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 cron, konflux.additional-tags=17.1.12 17.1_20251118.1, url=https://www.redhat.com, config_id=tripleo_step4, batch=17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05) Dec 15 04:04:55 localhost systemd[1]: 
ac45612fab357b615b4684dbfa5bd000dd29eb1adfbaedb6db0802fc49e89549.service: Deactivated successfully. Dec 15 04:04:56 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:fc:1f:13 MACDST=fa:16:3e:b3:bb:e3 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=19536 DF PROTO=TCP SPT=44202 DPT=9100 SEQ=847711568 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A8383B9650000000001030307) Dec 15 04:04:58 localhost systemd[1]: Started /usr/bin/podman healthcheck run 4c3f871f4bdf287b1d3ce717956c1cf130617489217410eadf6fffcc440eb3b0. Dec 15 04:04:58 localhost podman[107576]: 2025-12-15 09:04:58.507075252 +0000 UTC m=+0.088626749 container health_status 4c3f871f4bdf287b1d3ce717956c1cf130617489217410eadf6fffcc440eb3b0 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, vcs-type=git, tcib_managed=true, managed_by=tripleo_ansible, batch=17.1_20251118.1, config_id=tripleo_step4, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp17/openstack-nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.12, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1761123044, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, konflux.additional-tags=17.1.12 17.1_20251118.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '879500e96bf8dfb93687004bd86f2317'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 
'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, url=https://www.redhat.com, architecture=x86_64, io.buildah.version=1.41.4, build-date=2025-11-19T00:36:58Z, com.redhat.component=openstack-nova-compute-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., container_name=nova_migration_target) Dec 15 04:04:58 localhost podman[107576]: 2025-12-15 09:04:58.885535363 +0000 UTC m=+0.467086910 container exec_died 4c3f871f4bdf287b1d3ce717956c1cf130617489217410eadf6fffcc440eb3b0 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, vcs-type=git, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, build-date=2025-11-19T00:36:58Z, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, tcib_managed=true, 
com.redhat.component=openstack-nova-compute-container, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, version=17.1.12, batch=17.1_20251118.1, config_id=tripleo_step4, summary=Red Hat OpenStack Platform 17.1 nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, konflux.additional-tags=17.1.12 17.1_20251118.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '879500e96bf8dfb93687004bd86f2317'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, url=https://www.redhat.com, vendor=Red Hat, Inc., container_name=nova_migration_target, io.buildah.version=1.41.4, name=rhosp17/openstack-nova-compute, io.openshift.expose-services=, architecture=x86_64, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, managed_by=tripleo_ansible, release=1761123044) Dec 15 04:04:58 localhost systemd[1]: 
4c3f871f4bdf287b1d3ce717956c1cf130617489217410eadf6fffcc440eb3b0.service: Deactivated successfully. Dec 15 04:04:58 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:fc:1f:13 MACDST=fa:16:3e:b3:bb:e3 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=30052 DF PROTO=TCP SPT=40562 DPT=9101 SEQ=3858370266 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A8383C3660000000001030307) Dec 15 04:05:01 localhost systemd[1]: Started /usr/bin/podman healthcheck run 2df8d20c6ba32f38bd0e6519c9c77a4146b632f21c81197d8c319b223aad81f1. Dec 15 04:05:01 localhost systemd[1]: Started /usr/bin/podman healthcheck run 4d35b7dc3f3a4e3c60661e85d3511f50804e87244d74cab67af5c6e8c447d379. Dec 15 04:05:01 localhost systemd[1]: tmp-crun.wS41Bq.mount: Deactivated successfully. Dec 15 04:05:01 localhost podman[107600]: 2025-12-15 09:05:01.257753752 +0000 UTC m=+0.091066793 container health_status 2df8d20c6ba32f38bd0e6519c9c77a4146b632f21c81197d8c319b223aad81f1 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=unhealthy, summary=Red Hat OpenStack Platform 17.1 ovn-controller, build-date=2025-11-18T23:34:05Z, release=1761123044, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, tcib_managed=true, architecture=x86_64, vcs-type=git, io.openshift.expose-services=, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, io.buildah.version=1.41.4, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, container_name=ovn_controller, version=17.1.12, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-ovn-controller-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, maintainer=OpenStack TripleO Team, config_id=tripleo_step4, distribution-scope=public, 
vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, name=rhosp17/openstack-ovn-controller, konflux.additional-tags=17.1.12 17.1_20251118.1, batch=17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 ovn-controller, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream) Dec 15 04:05:01 localhost podman[107601]: 2025-12-15 09:05:01.297550136 +0000 UTC m=+0.129292265 container health_status 4d35b7dc3f3a4e3c60661e85d3511f50804e87244d74cab67af5c6e8c447d379 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=unhealthy, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, distribution-scope=public, tcib_managed=true, managed_by=tripleo_ansible, build-date=2025-11-19T00:14:25Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'a56a6f14b467cd9064e40c03defa5ed7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 
'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, maintainer=OpenStack TripleO Team, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, version=17.1.12, batch=17.1_20251118.1, release=1761123044, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, name=rhosp17/openstack-neutron-metadata-agent-ovn, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step4, io.buildah.version=1.41.4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, architecture=x86_64, container_name=ovn_metadata_agent, konflux.additional-tags=17.1.12 17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, io.openshift.expose-services=, 
org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c) Dec 15 04:05:01 localhost podman[107601]: 2025-12-15 09:05:01.340346969 +0000 UTC m=+0.172089108 container exec_died 4d35b7dc3f3a4e3c60661e85d3511f50804e87244d74cab67af5c6e8c447d379 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, build-date=2025-11-19T00:14:25Z, distribution-scope=public, release=1761123044, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-type=git, konflux.additional-tags=17.1.12 17.1_20251118.1, vendor=Red Hat, Inc., container_name=ovn_metadata_agent, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, tcib_managed=true, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_id=tripleo_step4, version=17.1.12, managed_by=tripleo_ansible, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, batch=17.1_20251118.1, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'a56a6f14b467cd9064e40c03defa5ed7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': 
['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, io.openshift.expose-services=, name=rhosp17/openstack-neutron-metadata-agent-ovn, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream) Dec 15 04:05:01 localhost podman[107601]: unhealthy Dec 15 04:05:01 localhost podman[107600]: 2025-12-15 09:05:01.350860121 +0000 UTC m=+0.184173222 container exec_died 2df8d20c6ba32f38bd0e6519c9c77a4146b632f21c81197d8c319b223aad81f1 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, vendor=Red Hat, Inc., config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', 
'/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, konflux.additional-tags=17.1.12 17.1_20251118.1, release=1761123044, version=17.1.12, io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, batch=17.1_20251118.1, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, container_name=ovn_controller, vcs-type=git, name=rhosp17/openstack-ovn-controller, com.redhat.component=openstack-ovn-controller-container, managed_by=tripleo_ansible, config_id=tripleo_step4, architecture=x86_64, io.openshift.expose-services=, build-date=2025-11-18T23:34:05Z, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 ovn-controller, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, summary=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller) Dec 15 04:05:01 localhost systemd[1]: 4d35b7dc3f3a4e3c60661e85d3511f50804e87244d74cab67af5c6e8c447d379.service: Main process exited, code=exited, status=1/FAILURE Dec 15 04:05:01 localhost systemd[1]: 4d35b7dc3f3a4e3c60661e85d3511f50804e87244d74cab67af5c6e8c447d379.service: Failed with result 'exit-code'. 
Dec 15 04:05:01 localhost podman[107600]: unhealthy Dec 15 04:05:01 localhost systemd[1]: 2df8d20c6ba32f38bd0e6519c9c77a4146b632f21c81197d8c319b223aad81f1.service: Main process exited, code=exited, status=1/FAILURE Dec 15 04:05:01 localhost systemd[1]: 2df8d20c6ba32f38bd0e6519c9c77a4146b632f21c81197d8c319b223aad81f1.service: Failed with result 'exit-code'. Dec 15 04:05:01 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:fc:1f:13 MACDST=fa:16:3e:b3:bb:e3 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=46379 DF PROTO=TCP SPT=54382 DPT=9102 SEQ=3835054903 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A8383CF250000000001030307) Dec 15 04:05:04 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:fc:1f:13 MACDST=fa:16:3e:b3:bb:e3 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=20726 DF PROTO=TCP SPT=55574 DPT=9882 SEQ=1869235371 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A8383DAE60000000001030307) Dec 15 04:05:05 localhost ceph-osd[31375]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS ------- Dec 15 04:05:05 localhost ceph-osd[31375]: rocksdb: [db/db_impl/db_impl.cc:1111] #012** DB Stats **#012Uptime(secs): 4800.1 total, 600.0 interval#012Cumulative writes: 4815 writes, 21K keys, 4815 commit groups, 1.0 writes per commit group, ingest: 0.02 GB, 0.00 MB/s#012Cumulative WAL: 4815 writes, 628 syncs, 7.67 writes per sync, written: 0.02 GB, 0.00 MB/s#012Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent#012Interval writes: 0 writes, 0 keys, 0 commit groups, 0.0 writes per commit group, ingest: 0.00 MB, 0.00 MB/s#012Interval WAL: 0 writes, 0 syncs, 0.00 writes per sync, written: 0.00 GB, 0.00 MB/s#012Interval stall: 00:00:0.000 H:M:S, 0.0 percent Dec 15 04:05:07 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:fc:1f:13 MACDST=fa:16:3e:b3:bb:e3 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 
ID=22604 DF PROTO=TCP SPT=47856 DPT=9105 SEQ=357411638 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A8383E6A50000000001030307) Dec 15 04:05:10 localhost ceph-osd[32311]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS ------- Dec 15 04:05:10 localhost ceph-osd[32311]: rocksdb: [db/db_impl/db_impl.cc:1111] #012** DB Stats **#012Uptime(secs): 4800.2 total, 600.0 interval#012Cumulative writes: 5745 writes, 25K keys, 5745 commit groups, 1.0 writes per commit group, ingest: 0.02 GB, 0.00 MB/s#012Cumulative WAL: 5745 writes, 763 syncs, 7.53 writes per sync, written: 0.02 GB, 0.00 MB/s#012Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent#012Interval writes: 0 writes, 0 keys, 0 commit groups, 0.0 writes per commit group, ingest: 0.00 MB, 0.00 MB/s#012Interval WAL: 0 writes, 0 syncs, 0.00 writes per sync, written: 0.00 GB, 0.00 MB/s#012Interval stall: 00:00:0.000 H:M:S, 0.0 percent Dec 15 04:05:11 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:fc:1f:13 MACDST=fa:16:3e:b3:bb:e3 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=30054 DF PROTO=TCP SPT=40562 DPT=9101 SEQ=3858370266 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A8383F3250000000001030307) Dec 15 04:05:11 localhost systemd[1]: Started /usr/bin/podman healthcheck run 6278d648aec9c9f12e0efaafa0d6ae5653eeb34459516699e562eeb9c8d23b3c. Dec 15 04:05:12 localhost systemd[1]: tmp-crun.BQVNyN.mount: Deactivated successfully. 
Dec 15 04:05:12 localhost podman[107642]: 2025-12-15 09:05:12.015436202 +0000 UTC m=+0.096755957 container health_status 6278d648aec9c9f12e0efaafa0d6ae5653eeb34459516699e562eeb9c8d23b3c (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.component=openstack-qdrouterd-container, description=Red Hat OpenStack Platform 17.1 qdrouterd, name=rhosp17/openstack-qdrouterd, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, build-date=2025-11-18T22:49:46Z, container_name=metrics_qdr, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, architecture=x86_64, version=17.1.12, tcib_managed=true, batch=17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 qdrouterd, konflux.additional-tags=17.1.12 17.1_20251118.1, config_id=tripleo_step1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, release=1761123044, distribution-scope=public, managed_by=tripleo_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '29d07da103bbd44a9ed3e29999314b03'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}) Dec 15 04:05:12 localhost podman[107642]: 2025-12-15 09:05:12.208701125 +0000 UTC m=+0.290020930 container exec_died 6278d648aec9c9f12e0efaafa0d6ae5653eeb34459516699e562eeb9c8d23b3c (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, vendor=Red Hat, Inc., com.redhat.component=openstack-qdrouterd-container, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, release=1761123044, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '29d07da103bbd44a9ed3e29999314b03'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, name=rhosp17/openstack-qdrouterd, url=https://www.redhat.com, architecture=x86_64, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step1, managed_by=tripleo_ansible, build-date=2025-11-18T22:49:46Z, description=Red Hat OpenStack Platform 17.1 qdrouterd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public, vcs-type=git, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, container_name=metrics_qdr, version=17.1.12, tcib_managed=true, batch=17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, summary=Red Hat OpenStack Platform 17.1 qdrouterd) Dec 15 04:05:12 localhost systemd[1]: 6278d648aec9c9f12e0efaafa0d6ae5653eeb34459516699e562eeb9c8d23b3c.service: Deactivated successfully. Dec 15 04:05:16 localhost systemd[1]: Starting Check and recover tripleo_nova_virtqemud... Dec 15 04:05:16 localhost recover_tripleo_nova_virtqemud[107673]: 61849 Dec 15 04:05:16 localhost systemd[1]: tripleo_nova_virtqemud_recover.service: Deactivated successfully. Dec 15 04:05:16 localhost systemd[1]: Finished Check and recover tripleo_nova_virtqemud. 
Dec 15 04:05:17 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:f3:4e:cb MACDST=fa:16:3e:b3:bb:e3 MACPROTO=0800 SRC=192.168.122.110 DST=192.168.122.106 LEN=40 TOS=0x00 PREC=0x00 TTL=64 ID=0 DF PROTO=TCP SPT=6379 DPT=52776 SEQ=1766092657 ACK=0 WINDOW=0 RES=0x00 RST URGP=0 Dec 15 04:05:17 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:fc:1f:13 MACDST=fa:16:3e:b3:bb:e3 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=20728 DF PROTO=TCP SPT=55574 DPT=9882 SEQ=1869235371 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A83840B250000000001030307) Dec 15 04:05:20 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:fc:1f:13 MACDST=fa:16:3e:b3:bb:e3 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=22606 DF PROTO=TCP SPT=47856 DPT=9105 SEQ=357411638 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A838417260000000001030307) Dec 15 04:05:22 localhost sshd[107674]: main: sshd: ssh-rsa algorithm is disabled Dec 15 04:05:23 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:fc:1f:13 MACDST=fa:16:3e:b3:bb:e3 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=50868 DF PROTO=TCP SPT=51338 DPT=9100 SEQ=1232339034 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A838422730000000001030307) Dec 15 04:05:24 localhost systemd[1]: Started /usr/bin/podman healthcheck run 165a6fd1f2b629d16da0798ec1c9bac41d302d530a0ffa668b2315a52d6aa04e. Dec 15 04:05:24 localhost podman[107676]: Error: container 165a6fd1f2b629d16da0798ec1c9bac41d302d530a0ffa668b2315a52d6aa04e is not running Dec 15 04:05:24 localhost systemd[1]: 165a6fd1f2b629d16da0798ec1c9bac41d302d530a0ffa668b2315a52d6aa04e.service: Main process exited, code=exited, status=125/n/a Dec 15 04:05:24 localhost systemd[1]: 165a6fd1f2b629d16da0798ec1c9bac41d302d530a0ffa668b2315a52d6aa04e.service: Failed with result 'exit-code'. 
Dec 15 04:05:24 localhost systemd[1]: Started /usr/bin/podman healthcheck run 2649c3aec9af2f04509e1d248b1050b202fccc3ed2b86421c89a2a65376db057. Dec 15 04:05:24 localhost systemd[1]: Started /usr/bin/podman healthcheck run 36a88083ed613f6871d2631ae462bca308a45ba583333f7cfe4d0b0c40a18ba5. Dec 15 04:05:24 localhost podman[107688]: 2025-12-15 09:05:24.587189589 +0000 UTC m=+0.082632659 container health_status 2649c3aec9af2f04509e1d248b1050b202fccc3ed2b86421c89a2a65376db057 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, summary=Red Hat OpenStack Platform 17.1 iscsid, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.component=openstack-iscsid-container, vendor=Red Hat, Inc., container_name=iscsid, version=17.1.12, architecture=x86_64, name=rhosp17/openstack-iscsid, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, managed_by=tripleo_ansible, io.buildah.version=1.41.4, batch=17.1_20251118.1, tcib_managed=true, build-date=2025-11-18T23:44:13Z, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, description=Red Hat OpenStack Platform 17.1 iscsid, distribution-scope=public, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '182e509007ab5e6e5b2500a552cbd5ba'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1761123044, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, maintainer=OpenStack TripleO Team, config_id=tripleo_step3, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d) Dec 15 04:05:24 localhost podman[107688]: 2025-12-15 09:05:24.628600265 +0000 UTC m=+0.124043305 container exec_died 2649c3aec9af2f04509e1d248b1050b202fccc3ed2b86421c89a2a65376db057 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, vcs-type=git, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '182e509007ab5e6e5b2500a552cbd5ba'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, url=https://www.redhat.com, com.redhat.component=openstack-iscsid-container, description=Red Hat OpenStack Platform 17.1 iscsid, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, config_id=tripleo_step3, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, batch=17.1_20251118.1, io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, release=1761123044, container_name=iscsid, vendor=Red Hat, Inc., build-date=2025-11-18T23:44:13Z, version=17.1.12, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 iscsid, name=rhosp17/openstack-iscsid) Dec 15 04:05:24 localhost systemd[1]: tmp-crun.IFSZNk.mount: Deactivated successfully. Dec 15 04:05:24 localhost systemd[1]: 2649c3aec9af2f04509e1d248b1050b202fccc3ed2b86421c89a2a65376db057.service: Deactivated successfully. 
Dec 15 04:05:24 localhost podman[107689]: 2025-12-15 09:05:24.660097607 +0000 UTC m=+0.150776819 container health_status 36a88083ed613f6871d2631ae462bca308a45ba583333f7cfe4d0b0c40a18ba5 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=unhealthy, tcib_managed=true, url=https://www.redhat.com, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, version=17.1.12, config_id=tripleo_step5, description=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2025-11-19T00:36:58Z, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '182e509007ab5e6e5b2500a552cbd5ba-879500e96bf8dfb93687004bd86f2317'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', 
'/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, container_name=nova_compute, io.buildah.version=1.41.4, vendor=Red Hat, Inc., vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp17/openstack-nova-compute, com.redhat.component=openstack-nova-compute-container, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.12 17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, architecture=x86_64, managed_by=tripleo_ansible, release=1761123044, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20251118.1, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 nova-compute) Dec 15 04:05:24 localhost podman[107689]: 2025-12-15 09:05:24.708428949 +0000 UTC m=+0.199108131 container exec_died 36a88083ed613f6871d2631ae462bca308a45ba583333f7cfe4d0b0c40a18ba5 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, release=1761123044, distribution-scope=public, konflux.additional-tags=17.1.12 17.1_20251118.1, architecture=x86_64, tcib_managed=true, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-type=git, io.buildah.version=1.41.4, container_name=nova_compute, config_id=tripleo_step5, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, 
url=https://www.redhat.com, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, name=rhosp17/openstack-nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2025-11-19T00:36:58Z, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '182e509007ab5e6e5b2500a552cbd5ba-879500e96bf8dfb93687004bd86f2317'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, version=17.1.12, io.openshift.tags=rhosp osp openstack osp-17.1 
openstack-nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.component=openstack-nova-compute-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 nova-compute) Dec 15 04:05:24 localhost podman[107689]: unhealthy Dec 15 04:05:24 localhost systemd[1]: 36a88083ed613f6871d2631ae462bca308a45ba583333f7cfe4d0b0c40a18ba5.service: Main process exited, code=exited, status=1/FAILURE Dec 15 04:05:24 localhost systemd[1]: 36a88083ed613f6871d2631ae462bca308a45ba583333f7cfe4d0b0c40a18ba5.service: Failed with result 'exit-code'. Dec 15 04:05:26 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:fc:1f:13 MACDST=fa:16:3e:b3:bb:e3 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=50870 DF PROTO=TCP SPT=51338 DPT=9100 SEQ=1232339034 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A83842E660000000001030307) Dec 15 04:05:26 localhost systemd[1]: Started /usr/bin/podman healthcheck run ac45612fab357b615b4684dbfa5bd000dd29eb1adfbaedb6db0802fc49e89549. Dec 15 04:05:26 localhost systemd[1]: tmp-crun.3zhWvN.mount: Deactivated successfully. 
Dec 15 04:05:26 localhost podman[107729]: 2025-12-15 09:05:26.763006781 +0000 UTC m=+0.092616475 container health_status ac45612fab357b615b4684dbfa5bd000dd29eb1adfbaedb6db0802fc49e89549 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, config_id=tripleo_step4, managed_by=tripleo_ansible, konflux.additional-tags=17.1.12 17.1_20251118.1, batch=17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, tcib_managed=true, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.12, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, architecture=x86_64, io.openshift.expose-services=, 
io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, container_name=logrotate_crond, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, release=1761123044, summary=Red Hat OpenStack Platform 17.1 cron, build-date=2025-11-18T22:49:32Z, distribution-scope=public, name=rhosp17/openstack-cron, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, description=Red Hat OpenStack Platform 17.1 cron, com.redhat.component=openstack-cron-container, maintainer=OpenStack TripleO Team) Dec 15 04:05:26 localhost podman[107729]: 2025-12-15 09:05:26.774247322 +0000 UTC m=+0.103857016 container exec_died ac45612fab357b615b4684dbfa5bd000dd29eb1adfbaedb6db0802fc49e89549 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, distribution-scope=public, io.buildah.version=1.41.4, name=rhosp17/openstack-cron, managed_by=tripleo_ansible, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, maintainer=OpenStack TripleO Team, batch=17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 cron, release=1761123044, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, summary=Red Hat OpenStack Platform 17.1 cron, url=https://www.redhat.com, container_name=logrotate_crond, com.redhat.component=openstack-cron-container, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, architecture=x86_64, io.openshift.expose-services=, build-date=2025-11-18T22:49:32Z, config_id=tripleo_step4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, version=17.1.12, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vcs-type=git, konflux.additional-tags=17.1.12 17.1_20251118.1) Dec 15 04:05:26 localhost systemd[1]: ac45612fab357b615b4684dbfa5bd000dd29eb1adfbaedb6db0802fc49e89549.service: Deactivated successfully. Dec 15 04:05:28 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:fc:1f:13 MACDST=fa:16:3e:b3:bb:e3 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=46470 DF PROTO=TCP SPT=42894 DPT=9101 SEQ=3883597858 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A838438A50000000001030307) Dec 15 04:05:29 localhost systemd[1]: Started /usr/bin/podman healthcheck run 4c3f871f4bdf287b1d3ce717956c1cf130617489217410eadf6fffcc440eb3b0. Dec 15 04:05:29 localhost systemd[1]: tmp-crun.RlCGMs.mount: Deactivated successfully. 
Dec 15 04:05:29 localhost podman[107810]: 2025-12-15 09:05:29.51543424 +0000 UTC m=+0.094285270 container health_status 4c3f871f4bdf287b1d3ce717956c1cf130617489217410eadf6fffcc440eb3b0 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, build-date=2025-11-19T00:36:58Z, maintainer=OpenStack TripleO Team, tcib_managed=true, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, url=https://www.redhat.com, distribution-scope=public, config_id=tripleo_step4, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, container_name=nova_migration_target, name=rhosp17/openstack-nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.component=openstack-nova-compute-container, vcs-type=git, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vendor=Red Hat, Inc., config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '879500e96bf8dfb93687004bd86f2317'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible, release=1761123044, version=17.1.12, io.buildah.version=1.41.4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1) Dec 15 04:05:29 localhost podman[107810]: 2025-12-15 09:05:29.8773909 +0000 UTC m=+0.456241900 container exec_died 4c3f871f4bdf287b1d3ce717956c1cf130617489217410eadf6fffcc440eb3b0 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, release=1761123044, konflux.additional-tags=17.1.12 17.1_20251118.1, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, container_name=nova_migration_target, description=Red Hat OpenStack Platform 17.1 nova-compute, distribution-scope=public, com.redhat.component=openstack-nova-compute-container, name=rhosp17/openstack-nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, build-date=2025-11-19T00:36:58Z, io.buildah.version=1.41.4, config_id=tripleo_step4, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.12, batch=17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vendor=Red Hat, Inc., config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 
'TRIPLEO_CONFIG_HASH': '879500e96bf8dfb93687004bd86f2317'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, tcib_managed=true, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute) Dec 15 04:05:29 localhost systemd[1]: 4c3f871f4bdf287b1d3ce717956c1cf130617489217410eadf6fffcc440eb3b0.service: Deactivated successfully. Dec 15 04:05:31 localhost systemd[1]: Started /usr/bin/podman healthcheck run 2df8d20c6ba32f38bd0e6519c9c77a4146b632f21c81197d8c319b223aad81f1. Dec 15 04:05:31 localhost systemd[1]: Started /usr/bin/podman healthcheck run 4d35b7dc3f3a4e3c60661e85d3511f50804e87244d74cab67af5c6e8c447d379. Dec 15 04:05:31 localhost systemd[1]: tmp-crun.yKwFlt.mount: Deactivated successfully. 
Dec 15 04:05:31 localhost podman[107848]: 2025-12-15 09:05:31.748376759 +0000 UTC m=+0.081430697 container health_status 2df8d20c6ba32f38bd0e6519c9c77a4146b632f21c81197d8c319b223aad81f1 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=unhealthy, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 ovn-controller, release=1761123044, summary=Red Hat OpenStack Platform 17.1 ovn-controller, distribution-scope=public, version=17.1.12, architecture=x86_64, tcib_managed=true, konflux.additional-tags=17.1.12 17.1_20251118.1, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, managed_by=tripleo_ansible, io.buildah.version=1.41.4, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.expose-services=, build-date=2025-11-18T23:34:05Z, url=https://www.redhat.com, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, batch=17.1_20251118.1, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, com.redhat.component=openstack-ovn-controller-container, name=rhosp17/openstack-ovn-controller, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=ovn_controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', 
'/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, config_id=tripleo_step4, vcs-type=git) Dec 15 04:05:31 localhost podman[107848]: 2025-12-15 09:05:31.788453579 +0000 UTC m=+0.121507497 container exec_died 2df8d20c6ba32f38bd0e6519c9c77a4146b632f21c81197d8c319b223aad81f1 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, summary=Red Hat OpenStack Platform 17.1 ovn-controller, version=17.1.12, description=Red Hat OpenStack Platform 17.1 ovn-controller, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20251118.1, container_name=ovn_controller, tcib_managed=true, vcs-type=git, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, managed_by=tripleo_ansible, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, com.redhat.component=openstack-ovn-controller-container, architecture=x86_64, release=1761123044, distribution-scope=public, build-date=2025-11-18T23:34:05Z, io.openshift.expose-services=, url=https://www.redhat.com, konflux.additional-tags=17.1.12 17.1_20251118.1, 
name=rhosp17/openstack-ovn-controller, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, config_id=tripleo_step4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, io.buildah.version=1.41.4) Dec 15 04:05:31 localhost podman[107848]: unhealthy Dec 15 04:05:31 localhost systemd[1]: 2df8d20c6ba32f38bd0e6519c9c77a4146b632f21c81197d8c319b223aad81f1.service: Main process exited, code=exited, status=1/FAILURE Dec 15 04:05:31 localhost systemd[1]: 2df8d20c6ba32f38bd0e6519c9c77a4146b632f21c81197d8c319b223aad81f1.service: Failed with result 'exit-code'. Dec 15 04:05:31 localhost podman[107849]: 2025-12-15 09:05:31.790124044 +0000 UTC m=+0.118321043 container health_status 4d35b7dc3f3a4e3c60661e85d3511f50804e87244d74cab67af5c6e8c447d379 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=unhealthy, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, batch=17.1_20251118.1, url=https://www.redhat.com, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'a56a6f14b467cd9064e40c03defa5ed7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, tcib_managed=true, name=rhosp17/openstack-neutron-metadata-agent-ovn, architecture=x86_64, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, release=1761123044, container_name=ovn_metadata_agent, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, version=17.1.12, vendor=Red Hat, Inc., konflux.additional-tags=17.1.12 17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, build-date=2025-11-19T00:14:25Z, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.buildah.version=1.41.4, config_id=tripleo_step4, maintainer=OpenStack TripleO Team, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, io.openshift.expose-services=) Dec 15 04:05:31 localhost podman[107849]: 2025-12-15 09:05:31.871320323 +0000 UTC m=+0.199517362 container exec_died 
4d35b7dc3f3a4e3c60661e85d3511f50804e87244d74cab67af5c6e8c447d379 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, vendor=Red Hat, Inc., io.buildah.version=1.41.4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, container_name=ovn_metadata_agent, io.openshift.expose-services=, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'a56a6f14b467cd9064e40c03defa5ed7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', 
'/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, batch=17.1_20251118.1, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, version=17.1.12, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.12 17.1_20251118.1, distribution-scope=public, managed_by=tripleo_ansible, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_id=tripleo_step4, maintainer=OpenStack TripleO Team, release=1761123044, name=rhosp17/openstack-neutron-metadata-agent-ovn, build-date=2025-11-19T00:14:25Z) Dec 15 04:05:31 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:fc:1f:13 MACDST=fa:16:3e:b3:bb:e3 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=56997 DF PROTO=TCP SPT=47232 DPT=9882 SEQ=4144837055 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A8384441A0000000001030307) Dec 15 04:05:31 localhost podman[107849]: unhealthy Dec 15 04:05:31 localhost systemd[1]: 4d35b7dc3f3a4e3c60661e85d3511f50804e87244d74cab67af5c6e8c447d379.service: Main process exited, code=exited, status=1/FAILURE Dec 15 04:05:31 localhost systemd[1]: 4d35b7dc3f3a4e3c60661e85d3511f50804e87244d74cab67af5c6e8c447d379.service: Failed with result 'exit-code'. Dec 15 04:05:34 localhost podman[107485]: time="2025-12-15T09:05:34Z" level=warning msg="StopSignal SIGTERM failed to stop container collectd in 42 seconds, resorting to SIGKILL" Dec 15 04:05:34 localhost systemd[1]: tmp-crun.wlJQWY.mount: Deactivated successfully. 
Dec 15 04:05:34 localhost podman[107485]: 2025-12-15 09:05:34.023473964 +0000 UTC m=+42.087847817 container stop 165a6fd1f2b629d16da0798ec1c9bac41d302d530a0ffa668b2315a52d6aa04e (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, io.openshift.expose-services=, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, release=1761123044, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, name=rhosp17/openstack-collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, container_name=collectd, io.buildah.version=1.41.4, tcib_managed=true, vendor=Red Hat, Inc., architecture=x86_64, batch=17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 collectd, vcs-type=git, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', 
'/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, com.redhat.component=openstack-collectd-container, summary=Red Hat OpenStack Platform 17.1 collectd, config_id=tripleo_step3, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, version=17.1.12, build-date=2025-11-18T22:51:28Z, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, distribution-scope=public) Dec 15 04:05:34 localhost systemd[1]: libpod-165a6fd1f2b629d16da0798ec1c9bac41d302d530a0ffa668b2315a52d6aa04e.scope: Deactivated successfully. Dec 15 04:05:34 localhost systemd[1]: libpod-165a6fd1f2b629d16da0798ec1c9bac41d302d530a0ffa668b2315a52d6aa04e.scope: Consumed 2.204s CPU time. 
Dec 15 04:05:34 localhost podman[107485]: 2025-12-15 09:05:34.056204658 +0000 UTC m=+42.120578541 container died 165a6fd1f2b629d16da0798ec1c9bac41d302d530a0ffa668b2315a52d6aa04e (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, io.buildah.version=1.41.4, container_name=collectd, com.redhat.component=openstack-collectd-container, architecture=x86_64, release=1761123044, name=rhosp17/openstack-collectd, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 collectd, io.openshift.expose-services=, distribution-scope=public, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., batch=17.1_20251118.1, managed_by=tripleo_ansible, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, summary=Red Hat OpenStack Platform 17.1 collectd, url=https://www.redhat.com, build-date=2025-11-18T22:51:28Z, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, version=17.1.12, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, config_id=tripleo_step3) Dec 15 04:05:34 localhost systemd[1]: 165a6fd1f2b629d16da0798ec1c9bac41d302d530a0ffa668b2315a52d6aa04e.timer: Deactivated successfully. Dec 15 04:05:34 localhost systemd[1]: Stopped /usr/bin/podman healthcheck run 165a6fd1f2b629d16da0798ec1c9bac41d302d530a0ffa668b2315a52d6aa04e. 
Dec 15 04:05:34 localhost systemd[1]: 165a6fd1f2b629d16da0798ec1c9bac41d302d530a0ffa668b2315a52d6aa04e.service: Failed to open /run/systemd/transient/165a6fd1f2b629d16da0798ec1c9bac41d302d530a0ffa668b2315a52d6aa04e.service: No such file or directory Dec 15 04:05:34 localhost podman[107485]: 2025-12-15 09:05:34.157561867 +0000 UTC m=+42.221935710 container cleanup 165a6fd1f2b629d16da0798ec1c9bac41d302d530a0ffa668b2315a52d6aa04e (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, com.redhat.component=openstack-collectd-container, distribution-scope=public, container_name=collectd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, name=rhosp17/openstack-collectd, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', 
'/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, io.buildah.version=1.41.4, config_id=tripleo_step3, version=17.1.12, architecture=x86_64, release=1761123044, tcib_managed=true, batch=17.1_20251118.1, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 collectd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, url=https://www.redhat.com, io.openshift.expose-services=, vcs-type=git, build-date=2025-11-18T22:51:28Z, konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, summary=Red Hat OpenStack Platform 17.1 collectd) Dec 15 04:05:34 localhost podman[107485]: collectd Dec 15 04:05:34 localhost systemd[1]: 165a6fd1f2b629d16da0798ec1c9bac41d302d530a0ffa668b2315a52d6aa04e.timer: Failed to open /run/systemd/transient/165a6fd1f2b629d16da0798ec1c9bac41d302d530a0ffa668b2315a52d6aa04e.timer: No such file or directory Dec 15 04:05:34 localhost systemd[1]: 165a6fd1f2b629d16da0798ec1c9bac41d302d530a0ffa668b2315a52d6aa04e.service: Failed to open /run/systemd/transient/165a6fd1f2b629d16da0798ec1c9bac41d302d530a0ffa668b2315a52d6aa04e.service: No such file or directory Dec 15 04:05:34 localhost podman[107889]: 2025-12-15 09:05:34.18242994 +0000 UTC m=+0.142053255 container cleanup 165a6fd1f2b629d16da0798ec1c9bac41d302d530a0ffa668b2315a52d6aa04e (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-type=git, vendor=Red Hat, Inc., batch=17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 
collectd, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, summary=Red Hat OpenStack Platform 17.1 collectd, release=1761123044, maintainer=OpenStack TripleO Team, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, distribution-scope=public, tcib_managed=true, build-date=2025-11-18T22:51:28Z, container_name=collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.buildah.version=1.41.4, version=17.1.12, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp17/openstack-collectd, url=https://www.redhat.com, konflux.additional-tags=17.1.12 17.1_20251118.1, 
com.redhat.component=openstack-collectd-container, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, architecture=x86_64, config_id=tripleo_step3, managed_by=tripleo_ansible, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a) Dec 15 04:05:34 localhost systemd[1]: libpod-conmon-165a6fd1f2b629d16da0798ec1c9bac41d302d530a0ffa668b2315a52d6aa04e.scope: Deactivated successfully. Dec 15 04:05:34 localhost podman[107918]: error opening file `/run/crun/165a6fd1f2b629d16da0798ec1c9bac41d302d530a0ffa668b2315a52d6aa04e/status`: No such file or directory Dec 15 04:05:34 localhost systemd[1]: 165a6fd1f2b629d16da0798ec1c9bac41d302d530a0ffa668b2315a52d6aa04e.timer: Failed to open /run/systemd/transient/165a6fd1f2b629d16da0798ec1c9bac41d302d530a0ffa668b2315a52d6aa04e.timer: No such file or directory Dec 15 04:05:34 localhost systemd[1]: 165a6fd1f2b629d16da0798ec1c9bac41d302d530a0ffa668b2315a52d6aa04e.service: Failed to open /run/systemd/transient/165a6fd1f2b629d16da0798ec1c9bac41d302d530a0ffa668b2315a52d6aa04e.service: No such file or directory Dec 15 04:05:34 localhost podman[107906]: 2025-12-15 09:05:34.293780505 +0000 UTC m=+0.081384195 container cleanup 165a6fd1f2b629d16da0798ec1c9bac41d302d530a0ffa668b2315a52d6aa04e (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, io.buildah.version=1.41.4, description=Red Hat OpenStack Platform 17.1 collectd, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'da9a0dc7b40588672419e3ce10063e21'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 
'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, konflux.additional-tags=17.1.12 17.1_20251118.1, release=1761123044, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, summary=Red Hat OpenStack Platform 17.1 collectd, vendor=Red Hat, Inc., version=17.1.12, build-date=2025-11-18T22:51:28Z, container_name=collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, name=rhosp17/openstack-collectd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, tcib_managed=true, com.redhat.component=openstack-collectd-container, batch=17.1_20251118.1, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, config_id=tripleo_step3, architecture=x86_64, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, io.openshift.expose-services=) 
Dec 15 04:05:34 localhost podman[107906]: collectd Dec 15 04:05:34 localhost systemd[1]: tripleo_collectd.service: Deactivated successfully. Dec 15 04:05:34 localhost systemd[1]: Stopped collectd container. Dec 15 04:05:34 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:fc:1f:13 MACDST=fa:16:3e:b3:bb:e3 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=56999 DF PROTO=TCP SPT=47232 DPT=9882 SEQ=4144837055 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A838450250000000001030307) Dec 15 04:05:35 localhost python3.9[108011]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_iscsid.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None Dec 15 04:05:35 localhost systemd[1]: var-lib-containers-storage-overlay-3cbc8d0b1fd940058dd16189a8a0f2adea168e1862db457c5612a248dfd3b9d7-merged.mount: Deactivated successfully. Dec 15 04:05:35 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-165a6fd1f2b629d16da0798ec1c9bac41d302d530a0ffa668b2315a52d6aa04e-userdata-shm.mount: Deactivated successfully. Dec 15 04:05:35 localhost systemd[1]: Reloading. Dec 15 04:05:35 localhost systemd-rc-local-generator[108031]: /etc/rc.d/rc.local is not marked executable, skipping. Dec 15 04:05:35 localhost systemd-sysv-generator[108038]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. Dec 15 04:05:35 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Dec 15 04:05:35 localhost systemd[1]: Stopping iscsid container... 
Dec 15 04:05:35 localhost systemd[1]: libpod-2649c3aec9af2f04509e1d248b1050b202fccc3ed2b86421c89a2a65376db057.scope: Deactivated successfully. Dec 15 04:05:35 localhost systemd[1]: libpod-2649c3aec9af2f04509e1d248b1050b202fccc3ed2b86421c89a2a65376db057.scope: Consumed 1.076s CPU time. Dec 15 04:05:35 localhost podman[108052]: 2025-12-15 09:05:35.483569804 +0000 UTC m=+0.056833429 container died 2649c3aec9af2f04509e1d248b1050b202fccc3ed2b86421c89a2a65376db057 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, container_name=iscsid, maintainer=OpenStack TripleO Team, tcib_managed=true, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, version=17.1.12, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, release=1761123044, name=rhosp17/openstack-iscsid, io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, io.buildah.version=1.41.4, vendor=Red Hat, Inc., distribution-scope=public, com.redhat.component=openstack-iscsid-container, summary=Red Hat OpenStack Platform 17.1 iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '182e509007ab5e6e5b2500a552cbd5ba'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, build-date=2025-11-18T23:44:13Z, description=Red Hat OpenStack Platform 17.1 iscsid, batch=17.1_20251118.1, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, managed_by=tripleo_ansible, vcs-type=git, config_id=tripleo_step3, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid) Dec 15 04:05:35 localhost systemd[1]: 2649c3aec9af2f04509e1d248b1050b202fccc3ed2b86421c89a2a65376db057.timer: Deactivated successfully. Dec 15 04:05:35 localhost systemd[1]: Stopped /usr/bin/podman healthcheck run 2649c3aec9af2f04509e1d248b1050b202fccc3ed2b86421c89a2a65376db057. Dec 15 04:05:35 localhost systemd[1]: 2649c3aec9af2f04509e1d248b1050b202fccc3ed2b86421c89a2a65376db057.service: Failed to open /run/systemd/transient/2649c3aec9af2f04509e1d248b1050b202fccc3ed2b86421c89a2a65376db057.service: No such file or directory Dec 15 04:05:35 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-2649c3aec9af2f04509e1d248b1050b202fccc3ed2b86421c89a2a65376db057-userdata-shm.mount: Deactivated successfully. 
Dec 15 04:05:35 localhost podman[108052]: 2025-12-15 09:05:35.519756841 +0000 UTC m=+0.093020466 container cleanup 2649c3aec9af2f04509e1d248b1050b202fccc3ed2b86421c89a2a65376db057 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, architecture=x86_64, distribution-scope=public, version=17.1.12, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, summary=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.component=openstack-iscsid-container, release=1761123044, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp17/openstack-iscsid, container_name=iscsid, tcib_managed=true, vendor=Red Hat, Inc., vcs-type=git, build-date=2025-11-18T23:44:13Z, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_id=tripleo_step3, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, io.buildah.version=1.41.4, description=Red Hat OpenStack Platform 17.1 iscsid, batch=17.1_20251118.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '182e509007ab5e6e5b2500a552cbd5ba'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible) Dec 15 04:05:35 localhost podman[108052]: iscsid Dec 15 04:05:35 localhost systemd[1]: 2649c3aec9af2f04509e1d248b1050b202fccc3ed2b86421c89a2a65376db057.timer: Failed to open /run/systemd/transient/2649c3aec9af2f04509e1d248b1050b202fccc3ed2b86421c89a2a65376db057.timer: No such file or directory Dec 15 04:05:35 localhost systemd[1]: 2649c3aec9af2f04509e1d248b1050b202fccc3ed2b86421c89a2a65376db057.service: Failed to open /run/systemd/transient/2649c3aec9af2f04509e1d248b1050b202fccc3ed2b86421c89a2a65376db057.service: No such file or directory Dec 15 04:05:35 localhost podman[108066]: 2025-12-15 09:05:35.567031924 +0000 UTC m=+0.073225147 container cleanup 2649c3aec9af2f04509e1d248b1050b202fccc3ed2b86421c89a2a65376db057 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, release=1761123044, name=rhosp17/openstack-iscsid, url=https://www.redhat.com, build-date=2025-11-18T23:44:13Z, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, distribution-scope=public, 
batch=17.1_20251118.1, config_id=tripleo_step3, version=17.1.12, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, tcib_managed=true, vendor=Red Hat, Inc., konflux.additional-tags=17.1.12 17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-iscsid-container, architecture=x86_64, container_name=iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '182e509007ab5e6e5b2500a552cbd5ba'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, io.buildah.version=1.41.4, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d) Dec 15 04:05:35 localhost systemd[1]: libpod-conmon-2649c3aec9af2f04509e1d248b1050b202fccc3ed2b86421c89a2a65376db057.scope: Deactivated successfully. 
Dec 15 04:05:35 localhost systemd[1]: 2649c3aec9af2f04509e1d248b1050b202fccc3ed2b86421c89a2a65376db057.timer: Failed to open /run/systemd/transient/2649c3aec9af2f04509e1d248b1050b202fccc3ed2b86421c89a2a65376db057.timer: No such file or directory Dec 15 04:05:35 localhost systemd[1]: 2649c3aec9af2f04509e1d248b1050b202fccc3ed2b86421c89a2a65376db057.service: Failed to open /run/systemd/transient/2649c3aec9af2f04509e1d248b1050b202fccc3ed2b86421c89a2a65376db057.service: No such file or directory Dec 15 04:05:35 localhost podman[108080]: 2025-12-15 09:05:35.662443383 +0000 UTC m=+0.068038878 container cleanup 2649c3aec9af2f04509e1d248b1050b202fccc3ed2b86421c89a2a65376db057 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, vcs-ref=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.component=openstack-iscsid-container, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, tcib_managed=true, distribution-scope=public, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 17.1_20251118.1, vendor=Red Hat, Inc., url=https://www.redhat.com, build-date=2025-11-18T23:44:13Z, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '182e509007ab5e6e5b2500a552cbd5ba'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, name=rhosp17/openstack-iscsid, org.opencontainers.image.revision=5714445d3136fb8f8cd5e0726e4e3e709c68ad0d, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 iscsid, config_id=tripleo_step3, release=1761123044, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, architecture=x86_64, container_name=iscsid, io.buildah.version=1.41.4, version=17.1.12, managed_by=tripleo_ansible, vcs-type=git, batch=17.1_20251118.1) Dec 15 04:05:35 localhost podman[108080]: iscsid Dec 15 04:05:35 localhost systemd[1]: tripleo_iscsid.service: Deactivated successfully. Dec 15 04:05:35 localhost systemd[1]: Stopped iscsid container. Dec 15 04:05:36 localhost systemd[1]: var-lib-containers-storage-overlay-d63ab03ca441fedc5f5fcdf51699b396e9401963b7839d4b0e700c4e4e1e58a9-merged.mount: Deactivated successfully. Dec 15 04:05:36 localhost python3.9[108184]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_logrotate_crond.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None Dec 15 04:05:36 localhost systemd[1]: Reloading. Dec 15 04:05:36 localhost systemd-rc-local-generator[108212]: /etc/rc.d/rc.local is not marked executable, skipping. 
Dec 15 04:05:36 localhost systemd-sysv-generator[108217]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. Dec 15 04:05:36 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Dec 15 04:05:36 localhost systemd[1]: Stopping logrotate_crond container... Dec 15 04:05:36 localhost systemd[1]: libpod-ac45612fab357b615b4684dbfa5bd000dd29eb1adfbaedb6db0802fc49e89549.scope: Deactivated successfully. Dec 15 04:05:36 localhost systemd[1]: libpod-ac45612fab357b615b4684dbfa5bd000dd29eb1adfbaedb6db0802fc49e89549.scope: Consumed 1.047s CPU time. Dec 15 04:05:36 localhost podman[108226]: 2025-12-15 09:05:36.84956334 +0000 UTC m=+0.073391852 container died ac45612fab357b615b4684dbfa5bd000dd29eb1adfbaedb6db0802fc49e89549 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, description=Red Hat OpenStack Platform 17.1 cron, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, summary=Red Hat OpenStack Platform 17.1 cron, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=logrotate_crond, vendor=Red Hat, Inc., config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-cron-container, io.openshift.expose-services=, managed_by=tripleo_ansible, name=rhosp17/openstack-cron, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, batch=17.1_20251118.1, distribution-scope=public, io.buildah.version=1.41.4, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-type=git, version=17.1.12, release=1761123044, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, build-date=2025-11-18T22:49:32Z, architecture=x86_64, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_id=tripleo_step4) Dec 15 04:05:36 localhost systemd[1]: tmp-crun.CrziZc.mount: Deactivated successfully. Dec 15 04:05:36 localhost systemd[1]: ac45612fab357b615b4684dbfa5bd000dd29eb1adfbaedb6db0802fc49e89549.timer: Deactivated successfully. Dec 15 04:05:36 localhost systemd[1]: Stopped /usr/bin/podman healthcheck run ac45612fab357b615b4684dbfa5bd000dd29eb1adfbaedb6db0802fc49e89549. 
Dec 15 04:05:36 localhost systemd[1]: ac45612fab357b615b4684dbfa5bd000dd29eb1adfbaedb6db0802fc49e89549.service: Failed to open /run/systemd/transient/ac45612fab357b615b4684dbfa5bd000dd29eb1adfbaedb6db0802fc49e89549.service: No such file or directory Dec 15 04:05:36 localhost podman[108226]: 2025-12-15 09:05:36.8997546 +0000 UTC m=+0.123583112 container cleanup ac45612fab357b615b4684dbfa5bd000dd29eb1adfbaedb6db0802fc49e89549 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, tcib_managed=true, release=1761123044, konflux.additional-tags=17.1.12 17.1_20251118.1, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.buildah.version=1.41.4, url=https://www.redhat.com, name=rhosp17/openstack-cron, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, container_name=logrotate_crond, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-type=git, managed_by=tripleo_ansible, com.redhat.component=openstack-cron-container, batch=17.1_20251118.1, architecture=x86_64, config_id=tripleo_step4, version=17.1.12, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, summary=Red Hat OpenStack Platform 17.1 cron, build-date=2025-11-18T22:49:32Z, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 cron, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, vendor=Red Hat, Inc.) Dec 15 04:05:36 localhost podman[108226]: logrotate_crond Dec 15 04:05:36 localhost systemd[1]: ac45612fab357b615b4684dbfa5bd000dd29eb1adfbaedb6db0802fc49e89549.timer: Failed to open /run/systemd/transient/ac45612fab357b615b4684dbfa5bd000dd29eb1adfbaedb6db0802fc49e89549.timer: No such file or directory Dec 15 04:05:36 localhost systemd[1]: ac45612fab357b615b4684dbfa5bd000dd29eb1adfbaedb6db0802fc49e89549.service: Failed to open /run/systemd/transient/ac45612fab357b615b4684dbfa5bd000dd29eb1adfbaedb6db0802fc49e89549.service: No such file or directory Dec 15 04:05:36 localhost podman[108239]: 2025-12-15 09:05:36.925736915 +0000 UTC m=+0.068847191 container cleanup ac45612fab357b615b4684dbfa5bd000dd29eb1adfbaedb6db0802fc49e89549 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, architecture=x86_64, managed_by=tripleo_ansible, tcib_managed=true, io.openshift.expose-services=, name=rhosp17/openstack-cron, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, build-date=2025-11-18T22:49:32Z, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 
'53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, description=Red Hat OpenStack Platform 17.1 cron, container_name=logrotate_crond, release=1761123044, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., com.redhat.component=openstack-cron-container, io.buildah.version=1.41.4, distribution-scope=public, config_id=tripleo_step4, summary=Red Hat OpenStack Platform 17.1 cron, konflux.additional-tags=17.1.12 17.1_20251118.1, batch=17.1_20251118.1, version=17.1.12) Dec 15 04:05:36 localhost systemd[1]: libpod-conmon-ac45612fab357b615b4684dbfa5bd000dd29eb1adfbaedb6db0802fc49e89549.scope: 
Deactivated successfully. Dec 15 04:05:37 localhost systemd[1]: var-lib-containers-storage-overlay-7b239ddaa407a2a969ac6b39fe52abf4714e7608ccfdf4f5f9e0e55a4123ba70-merged.mount: Deactivated successfully. Dec 15 04:05:37 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-ac45612fab357b615b4684dbfa5bd000dd29eb1adfbaedb6db0802fc49e89549-userdata-shm.mount: Deactivated successfully. Dec 15 04:05:37 localhost podman[108268]: error opening file `/run/crun/ac45612fab357b615b4684dbfa5bd000dd29eb1adfbaedb6db0802fc49e89549/status`: No such file or directory Dec 15 04:05:37 localhost systemd[1]: ac45612fab357b615b4684dbfa5bd000dd29eb1adfbaedb6db0802fc49e89549.timer: Failed to open /run/systemd/transient/ac45612fab357b615b4684dbfa5bd000dd29eb1adfbaedb6db0802fc49e89549.timer: No such file or directory Dec 15 04:05:37 localhost systemd[1]: ac45612fab357b615b4684dbfa5bd000dd29eb1adfbaedb6db0802fc49e89549.service: Failed to open /run/systemd/transient/ac45612fab357b615b4684dbfa5bd000dd29eb1adfbaedb6db0802fc49e89549.service: No such file or directory Dec 15 04:05:37 localhost podman[108256]: 2025-12-15 09:05:37.037669496 +0000 UTC m=+0.079264550 container cleanup ac45612fab357b615b4684dbfa5bd000dd29eb1adfbaedb6db0802fc49e89549 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, vcs-type=git, build-date=2025-11-18T22:49:32Z, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-cron-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, name=rhosp17/openstack-cron, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, tcib_managed=true, managed_by=tripleo_ansible, batch=17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 cron, url=https://www.redhat.com, container_name=logrotate_crond, distribution-scope=public, release=1761123044, io.openshift.expose-services=, io.buildah.version=1.41.4, summary=Red Hat OpenStack Platform 17.1 cron, config_id=tripleo_step4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, architecture=x86_64, version=17.1.12, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a) Dec 15 04:05:37 localhost podman[108256]: logrotate_crond Dec 15 04:05:37 localhost systemd[1]: tripleo_logrotate_crond.service: Deactivated successfully. Dec 15 04:05:37 localhost systemd[1]: Stopped logrotate_crond container. 
Dec 15 04:05:37 localhost python3.9[108361]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_metrics_qdr.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None Dec 15 04:05:37 localhost systemd[1]: Reloading. Dec 15 04:05:37 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:fc:1f:13 MACDST=fa:16:3e:b3:bb:e3 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=7277 DF PROTO=TCP SPT=56526 DPT=9105 SEQ=2992661105 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A83845BA50000000001030307) Dec 15 04:05:37 localhost systemd-sysv-generator[108387]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. Dec 15 04:05:37 localhost systemd-rc-local-generator[108384]: /etc/rc.d/rc.local is not marked executable, skipping. Dec 15 04:05:38 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Dec 15 04:05:38 localhost systemd[1]: Stopping metrics_qdr container... Dec 15 04:05:38 localhost kernel: qdrouterd[54465]: segfault at 0 ip 00007fef4bdc37cb sp 00007ffeadbbde80 error 4 in libc.so.6[7fef4bd60000+175000] Dec 15 04:05:38 localhost kernel: Code: 0b 00 64 44 89 23 85 c0 75 d4 e9 2b ff ff ff e8 db a5 00 00 e9 fd fe ff ff e8 41 1d 0d 00 90 f3 0f 1e fa 41 54 55 48 89 fd 53 <8b> 07 f6 c4 20 0f 85 aa 00 00 00 89 c2 81 e2 00 80 00 00 0f 84 a9 Dec 15 04:05:38 localhost systemd[1]: Created slice Slice /system/systemd-coredump. Dec 15 04:05:38 localhost systemd[1]: Started Process Core Dump (PID 108414/UID 0). Dec 15 04:05:38 localhost systemd-coredump[108415]: Resource limits disable core dumping for process 54465 (qdrouterd). 
Dec 15 04:05:38 localhost systemd-coredump[108415]: Process 54465 (qdrouterd) of user 42465 dumped core. Dec 15 04:05:38 localhost systemd[1]: systemd-coredump@0-108414-0.service: Deactivated successfully. Dec 15 04:05:38 localhost podman[108402]: 2025-12-15 09:05:38.453923664 +0000 UTC m=+0.236134860 container died 6278d648aec9c9f12e0efaafa0d6ae5653eeb34459516699e562eeb9c8d23b3c (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, com.redhat.component=openstack-qdrouterd-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, distribution-scope=public, konflux.additional-tags=17.1.12 17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, name=rhosp17/openstack-qdrouterd, release=1761123044, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 qdrouterd, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, vcs-type=git, version=17.1.12, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, summary=Red Hat OpenStack Platform 17.1 qdrouterd, vendor=Red Hat, Inc., config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '29d07da103bbd44a9ed3e29999314b03'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, managed_by=tripleo_ansible, config_id=tripleo_step1, batch=17.1_20251118.1, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=metrics_qdr, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, tcib_managed=true, build-date=2025-11-18T22:49:46Z, io.buildah.version=1.41.4) Dec 15 04:05:38 localhost systemd[1]: libpod-6278d648aec9c9f12e0efaafa0d6ae5653eeb34459516699e562eeb9c8d23b3c.scope: Deactivated successfully. Dec 15 04:05:38 localhost systemd[1]: libpod-6278d648aec9c9f12e0efaafa0d6ae5653eeb34459516699e562eeb9c8d23b3c.scope: Consumed 29.620s CPU time. Dec 15 04:05:38 localhost systemd[1]: 6278d648aec9c9f12e0efaafa0d6ae5653eeb34459516699e562eeb9c8d23b3c.timer: Deactivated successfully. Dec 15 04:05:38 localhost systemd[1]: Stopped /usr/bin/podman healthcheck run 6278d648aec9c9f12e0efaafa0d6ae5653eeb34459516699e562eeb9c8d23b3c. Dec 15 04:05:38 localhost systemd[1]: 6278d648aec9c9f12e0efaafa0d6ae5653eeb34459516699e562eeb9c8d23b3c.service: Failed to open /run/systemd/transient/6278d648aec9c9f12e0efaafa0d6ae5653eeb34459516699e562eeb9c8d23b3c.service: No such file or directory Dec 15 04:05:38 localhost systemd[1]: tmp-crun.Rwofqb.mount: Deactivated successfully. Dec 15 04:05:38 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-6278d648aec9c9f12e0efaafa0d6ae5653eeb34459516699e562eeb9c8d23b3c-userdata-shm.mount: Deactivated successfully. 
Dec 15 04:05:38 localhost podman[108402]: 2025-12-15 09:05:38.50093699 +0000 UTC m=+0.283148156 container cleanup 6278d648aec9c9f12e0efaafa0d6ae5653eeb34459516699e562eeb9c8d23b3c (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, config_id=tripleo_step1, com.redhat.component=openstack-qdrouterd-container, konflux.additional-tags=17.1.12 17.1_20251118.1, architecture=x86_64, build-date=2025-11-18T22:49:46Z, release=1761123044, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, batch=17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, tcib_managed=true, name=rhosp17/openstack-qdrouterd, version=17.1.12, description=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-type=git, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 qdrouterd, url=https://www.redhat.com, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '29d07da103bbd44a9ed3e29999314b03'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.4, vendor=Red Hat, Inc., container_name=metrics_qdr, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05) Dec 15 04:05:38 localhost podman[108402]: metrics_qdr Dec 15 04:05:38 localhost systemd[1]: 6278d648aec9c9f12e0efaafa0d6ae5653eeb34459516699e562eeb9c8d23b3c.timer: Failed to open /run/systemd/transient/6278d648aec9c9f12e0efaafa0d6ae5653eeb34459516699e562eeb9c8d23b3c.timer: No such file or directory Dec 15 04:05:38 localhost systemd[1]: 6278d648aec9c9f12e0efaafa0d6ae5653eeb34459516699e562eeb9c8d23b3c.service: Failed to open /run/systemd/transient/6278d648aec9c9f12e0efaafa0d6ae5653eeb34459516699e562eeb9c8d23b3c.service: No such file or directory Dec 15 04:05:38 localhost podman[108419]: 2025-12-15 09:05:38.519959639 +0000 UTC m=+0.059744417 container cleanup 6278d648aec9c9f12e0efaafa0d6ae5653eeb34459516699e562eeb9c8d23b3c (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, name=rhosp17/openstack-qdrouterd, version=17.1.12, batch=17.1_20251118.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '29d07da103bbd44a9ed3e29999314b03'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', 
'/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, build-date=2025-11-18T22:49:46Z, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, container_name=metrics_qdr, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 qdrouterd, konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public, release=1761123044, com.redhat.component=openstack-qdrouterd-container, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, tcib_managed=true, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 qdrouterd, architecture=x86_64, managed_by=tripleo_ansible, config_id=tripleo_step1) Dec 15 04:05:38 localhost systemd[1]: tripleo_metrics_qdr.service: Main process exited, code=exited, status=139/n/a Dec 15 04:05:38 localhost systemd[1]: libpod-conmon-6278d648aec9c9f12e0efaafa0d6ae5653eeb34459516699e562eeb9c8d23b3c.scope: Deactivated successfully. 
Dec 15 04:05:38 localhost systemd[1]: 6278d648aec9c9f12e0efaafa0d6ae5653eeb34459516699e562eeb9c8d23b3c.timer: Failed to open /run/systemd/transient/6278d648aec9c9f12e0efaafa0d6ae5653eeb34459516699e562eeb9c8d23b3c.timer: No such file or directory Dec 15 04:05:38 localhost systemd[1]: 6278d648aec9c9f12e0efaafa0d6ae5653eeb34459516699e562eeb9c8d23b3c.service: Failed to open /run/systemd/transient/6278d648aec9c9f12e0efaafa0d6ae5653eeb34459516699e562eeb9c8d23b3c.service: No such file or directory Dec 15 04:05:38 localhost podman[108433]: 2025-12-15 09:05:38.61431448 +0000 UTC m=+0.066715004 container cleanup 6278d648aec9c9f12e0efaafa0d6ae5653eeb34459516699e562eeb9c8d23b3c (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, io.openshift.expose-services=, config_id=tripleo_step1, name=rhosp17/openstack-qdrouterd, managed_by=tripleo_ansible, container_name=metrics_qdr, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vendor=Red Hat, Inc., io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, batch=17.1_20251118.1, release=1761123044, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, summary=Red Hat OpenStack Platform 17.1 qdrouterd, build-date=2025-11-18T22:49:46Z, version=17.1.12, architecture=x86_64, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 qdrouterd, konflux.additional-tags=17.1.12 17.1_20251118.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '29d07da103bbd44a9ed3e29999314b03'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': 
['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, vcs-ref=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, com.redhat.component=openstack-qdrouterd-container, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=7ecaafae6fa9301c7dd5c0fca835eecf10dd147a, vcs-type=git, distribution-scope=public, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd) Dec 15 04:05:38 localhost podman[108433]: metrics_qdr Dec 15 04:05:38 localhost systemd[1]: tripleo_metrics_qdr.service: Failed with result 'exit-code'. Dec 15 04:05:38 localhost systemd[1]: Stopped metrics_qdr container. Dec 15 04:05:39 localhost systemd[1]: var-lib-containers-storage-overlay-e75ee441979af5ea9ed0c5146d1f659562a9f9f9039bdfa8b70f6f9c6eebd6bb-merged.mount: Deactivated successfully. 
Dec 15 04:05:39 localhost python3.9[108538]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_neutron_dhcp.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None Dec 15 04:05:40 localhost python3.9[108631]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_neutron_l3_agent.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None Dec 15 04:05:41 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:fc:1f:13 MACDST=fa:16:3e:b3:bb:e3 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=46472 DF PROTO=TCP SPT=42894 DPT=9101 SEQ=3883597858 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A838469250000000001030307) Dec 15 04:05:41 localhost python3.9[108724]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_neutron_ovs_agent.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None Dec 15 04:05:42 localhost python3.9[108817]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_compute.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None Dec 15 04:05:43 localhost systemd[1]: Reloading. Dec 15 04:05:43 localhost systemd-rc-local-generator[108841]: /etc/rc.d/rc.local is not marked executable, skipping. Dec 15 04:05:43 localhost systemd-sysv-generator[108847]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. Dec 15 04:05:43 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. 
Support for MemoryLimit= will be removed soon. Dec 15 04:05:43 localhost systemd[1]: Stopping nova_compute container... Dec 15 04:05:47 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:fc:1f:13 MACDST=fa:16:3e:b3:bb:e3 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=57001 DF PROTO=TCP SPT=47232 DPT=9882 SEQ=4144837055 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A838481260000000001030307) Dec 15 04:05:50 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:fc:1f:13 MACDST=fa:16:3e:b3:bb:e3 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=7279 DF PROTO=TCP SPT=56526 DPT=9105 SEQ=2992661105 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A83848B260000000001030307) Dec 15 04:05:53 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:fc:1f:13 MACDST=fa:16:3e:b3:bb:e3 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=1500 DF PROTO=TCP SPT=42982 DPT=9100 SEQ=2153066818 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A838497A20000000001030307) Dec 15 04:05:54 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:fc:1f:13 MACDST=fa:16:3e:b3:bb:e3 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=1501 DF PROTO=TCP SPT=42982 DPT=9100 SEQ=2153066818 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A83849BA60000000001030307) Dec 15 04:05:55 localhost systemd[1]: Started /usr/bin/podman healthcheck run 36a88083ed613f6871d2631ae462bca308a45ba583333f7cfe4d0b0c40a18ba5. 
Dec 15 04:05:55 localhost podman[108871]: Error: container 36a88083ed613f6871d2631ae462bca308a45ba583333f7cfe4d0b0c40a18ba5 is not running Dec 15 04:05:55 localhost systemd[1]: 36a88083ed613f6871d2631ae462bca308a45ba583333f7cfe4d0b0c40a18ba5.service: Main process exited, code=exited, status=125/n/a Dec 15 04:05:55 localhost systemd[1]: 36a88083ed613f6871d2631ae462bca308a45ba583333f7cfe4d0b0c40a18ba5.service: Failed with result 'exit-code'. Dec 15 04:05:55 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:fc:1f:13 MACDST=fa:16:3e:b3:bb:e3 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=32441 DF PROTO=TCP SPT=38370 DPT=9102 SEQ=622795607 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A8384A1E50000000001030307) Dec 15 04:05:58 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:fc:1f:13 MACDST=fa:16:3e:b3:bb:e3 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=2073 DF PROTO=TCP SPT=44808 DPT=9101 SEQ=1568708005 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A8384ADA50000000001030307) Dec 15 04:05:59 localhost systemd[1]: Started /usr/bin/podman healthcheck run 4c3f871f4bdf287b1d3ce717956c1cf130617489217410eadf6fffcc440eb3b0. Dec 15 04:05:59 localhost systemd[1]: tmp-crun.r8Vo26.mount: Deactivated successfully. 
Dec 15 04:06:00 localhost podman[108881]: 2025-12-15 09:05:59.999690016 +0000 UTC m=+0.079278968 container health_status 4c3f871f4bdf287b1d3ce717956c1cf130617489217410eadf6fffcc440eb3b0 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, distribution-scope=public, io.buildah.version=1.41.4, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, release=1761123044, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.12, com.redhat.component=openstack-nova-compute-container, managed_by=tripleo_ansible, konflux.additional-tags=17.1.12 17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, container_name=nova_migration_target, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20251118.1, vcs-type=git, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, config_id=tripleo_step4, tcib_managed=true, url=https://www.redhat.com, vendor=Red Hat, Inc., config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '879500e96bf8dfb93687004bd86f2317'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, build-date=2025-11-19T00:36:58Z, name=rhosp17/openstack-nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 nova-compute) Dec 15 04:06:00 localhost podman[108881]: 2025-12-15 09:06:00.381677263 +0000 UTC m=+0.461266205 container exec_died 4c3f871f4bdf287b1d3ce717956c1cf130617489217410eadf6fffcc440eb3b0 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, name=rhosp17/openstack-nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.component=openstack-nova-compute-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, release=1761123044, build-date=2025-11-19T00:36:58Z, distribution-scope=public, managed_by=tripleo_ansible, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, vendor=Red Hat, Inc., batch=17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.buildah.version=1.41.4, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '879500e96bf8dfb93687004bd86f2317'}, 'healthcheck': 
{'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, container_name=nova_migration_target, config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, konflux.additional-tags=17.1.12 17.1_20251118.1, version=17.1.12, url=https://www.redhat.com) Dec 15 04:06:00 localhost systemd[1]: 4c3f871f4bdf287b1d3ce717956c1cf130617489217410eadf6fffcc440eb3b0.service: Deactivated successfully. Dec 15 04:06:01 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:fc:1f:13 MACDST=fa:16:3e:b3:bb:e3 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=32443 DF PROTO=TCP SPT=38370 DPT=9102 SEQ=622795607 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A8384B9A50000000001030307) Dec 15 04:06:02 localhost systemd[1]: Started /usr/bin/podman healthcheck run 2df8d20c6ba32f38bd0e6519c9c77a4146b632f21c81197d8c319b223aad81f1. 
Dec 15 04:06:02 localhost systemd[1]: Started /usr/bin/podman healthcheck run 4d35b7dc3f3a4e3c60661e85d3511f50804e87244d74cab67af5c6e8c447d379. Dec 15 04:06:02 localhost systemd[1]: tmp-crun.iIKjEc.mount: Deactivated successfully. Dec 15 04:06:02 localhost podman[108904]: 2025-12-15 09:06:02.26352239 +0000 UTC m=+0.095786870 container health_status 2df8d20c6ba32f38bd0e6519c9c77a4146b632f21c81197d8c319b223aad81f1 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=unhealthy, description=Red Hat OpenStack Platform 17.1 ovn-controller, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, version=17.1.12, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp17/openstack-ovn-controller, config_id=tripleo_step4, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, build-date=2025-11-18T23:34:05Z, managed_by=tripleo_ansible, com.redhat.component=openstack-ovn-controller-container, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 ovn-controller, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 
ovn-controller, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, konflux.additional-tags=17.1.12 17.1_20251118.1, io.buildah.version=1.41.4, architecture=x86_64, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, release=1761123044, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, batch=17.1_20251118.1, container_name=ovn_controller) Dec 15 04:06:02 localhost systemd[1]: tmp-crun.isvDVY.mount: Deactivated successfully. Dec 15 04:06:02 localhost podman[108905]: 2025-12-15 09:06:02.307964278 +0000 UTC m=+0.138817540 container health_status 4d35b7dc3f3a4e3c60661e85d3511f50804e87244d74cab67af5c6e8c447d379 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=unhealthy, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, distribution-scope=public, konflux.additional-tags=17.1.12 17.1_20251118.1, config_id=tripleo_step4, batch=17.1_20251118.1, io.buildah.version=1.41.4, architecture=x86_64, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, io.openshift.expose-services=, release=1761123044, build-date=2025-11-19T00:14:25Z, container_name=ovn_metadata_agent, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, name=rhosp17/openstack-neutron-metadata-agent-ovn, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, vcs-type=git, config_data={'cgroupns': 'host', 
'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'a56a6f14b467cd9064e40c03defa5ed7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, version=17.1.12, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn) Dec 15 04:06:02 localhost podman[108905]: 2025-12-15 09:06:02.32936747 +0000 UTC m=+0.160220742 container exec_died 4d35b7dc3f3a4e3c60661e85d3511f50804e87244d74cab67af5c6e8c447d379 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, vendor=Red 
Hat, Inc., io.openshift.expose-services=, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, release=1761123044, managed_by=tripleo_ansible, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-type=git, version=17.1.12, name=rhosp17/openstack-neutron-metadata-agent-ovn, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'a56a6f14b467cd9064e40c03defa5ed7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, build-date=2025-11-19T00:14:25Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, 
tcib_managed=true, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, architecture=x86_64, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.buildah.version=1.41.4, container_name=ovn_metadata_agent, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, config_id=tripleo_step4, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, batch=17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn) Dec 15 04:06:02 localhost podman[108905]: unhealthy Dec 15 04:06:02 localhost systemd[1]: 4d35b7dc3f3a4e3c60661e85d3511f50804e87244d74cab67af5c6e8c447d379.service: Main process exited, code=exited, status=1/FAILURE Dec 15 04:06:02 localhost systemd[1]: 4d35b7dc3f3a4e3c60661e85d3511f50804e87244d74cab67af5c6e8c447d379.service: Failed with result 'exit-code'. Dec 15 04:06:02 localhost podman[108904]: 2025-12-15 09:06:02.381099201 +0000 UTC m=+0.213363681 container exec_died 2df8d20c6ba32f38bd0e6519c9c77a4146b632f21c81197d8c319b223aad81f1 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, architecture=x86_64, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, container_name=ovn_controller, batch=17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, description=Red Hat OpenStack Platform 17.1 ovn-controller, name=rhosp17/openstack-ovn-controller, managed_by=tripleo_ansible, release=1761123044, tcib_managed=true, com.redhat.component=openstack-ovn-controller-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, version=17.1.12, 
config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, url=https://www.redhat.com, distribution-scope=public, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 ovn-controller, build-date=2025-11-18T23:34:05Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_id=tripleo_step4, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.expose-services=, io.buildah.version=1.41.4, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.12 17.1_20251118.1) Dec 15 04:06:02 localhost podman[108904]: unhealthy Dec 15 04:06:02 localhost systemd[1]: 2df8d20c6ba32f38bd0e6519c9c77a4146b632f21c81197d8c319b223aad81f1.service: Main process exited, code=exited, status=1/FAILURE Dec 15 04:06:02 localhost systemd[1]: 2df8d20c6ba32f38bd0e6519c9c77a4146b632f21c81197d8c319b223aad81f1.service: Failed with result 'exit-code'. 
Dec 15 04:06:04 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:fc:1f:13 MACDST=fa:16:3e:b3:bb:e3 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=29966 DF PROTO=TCP SPT=39844 DPT=9882 SEQ=2446189317 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A8384C5650000000001030307) Dec 15 04:06:07 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:fc:1f:13 MACDST=fa:16:3e:b3:bb:e3 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=44371 DF PROTO=TCP SPT=59622 DPT=9105 SEQ=2976832406 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A8384D0E50000000001030307) Dec 15 04:06:11 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:fc:1f:13 MACDST=fa:16:3e:b3:bb:e3 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=2075 DF PROTO=TCP SPT=44808 DPT=9101 SEQ=1568708005 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A8384DD250000000001030307) Dec 15 04:06:17 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:fc:1f:13 MACDST=fa:16:3e:b3:bb:e3 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=29968 DF PROTO=TCP SPT=39844 DPT=9882 SEQ=2446189317 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A8384F5260000000001030307) Dec 15 04:06:20 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:fc:1f:13 MACDST=fa:16:3e:b3:bb:e3 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=44373 DF PROTO=TCP SPT=59622 DPT=9105 SEQ=2976832406 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A838501260000000001030307) Dec 15 04:06:23 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:fc:1f:13 MACDST=fa:16:3e:b3:bb:e3 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=38266 DF PROTO=TCP SPT=39930 DPT=9100 SEQ=1221090715 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT 
(020405500402080A83850CD20000000001030307) Dec 15 04:06:24 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:fc:1f:13 MACDST=fa:16:3e:b3:bb:e3 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=38267 DF PROTO=TCP SPT=39930 DPT=9100 SEQ=1221090715 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A838510E50000000001030307) Dec 15 04:06:25 localhost systemd[1]: Started /usr/bin/podman healthcheck run 36a88083ed613f6871d2631ae462bca308a45ba583333f7cfe4d0b0c40a18ba5. Dec 15 04:06:25 localhost podman[108941]: Error: container 36a88083ed613f6871d2631ae462bca308a45ba583333f7cfe4d0b0c40a18ba5 is not running Dec 15 04:06:25 localhost systemd[1]: 36a88083ed613f6871d2631ae462bca308a45ba583333f7cfe4d0b0c40a18ba5.service: Main process exited, code=exited, status=125/n/a Dec 15 04:06:25 localhost systemd[1]: 36a88083ed613f6871d2631ae462bca308a45ba583333f7cfe4d0b0c40a18ba5.service: Failed with result 'exit-code'. Dec 15 04:06:25 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:fc:1f:13 MACDST=fa:16:3e:b3:bb:e3 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=10438 DF PROTO=TCP SPT=37628 DPT=9102 SEQ=3433511798 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A838517250000000001030307) Dec 15 04:06:26 localhost podman[108858]: time="2025-12-15T09:06:26Z" level=warning msg="StopSignal SIGTERM failed to stop container nova_compute in 42 seconds, resorting to SIGKILL" Dec 15 04:06:26 localhost systemd[1]: session-c11.scope: Deactivated successfully. Dec 15 04:06:26 localhost systemd[1]: libpod-36a88083ed613f6871d2631ae462bca308a45ba583333f7cfe4d0b0c40a18ba5.scope: Deactivated successfully. Dec 15 04:06:26 localhost systemd[1]: libpod-36a88083ed613f6871d2631ae462bca308a45ba583333f7cfe4d0b0c40a18ba5.scope: Consumed 37.233s CPU time. 
Dec 15 04:06:26 localhost podman[108858]: 2025-12-15 09:06:26.119304639 +0000 UTC m=+42.109649651 container died 36a88083ed613f6871d2631ae462bca308a45ba583333f7cfe4d0b0c40a18ba5 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, config_id=tripleo_step5, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., io.openshift.expose-services=, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '182e509007ab5e6e5b2500a552cbd5ba-879500e96bf8dfb93687004bd86f2317'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', 
'/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, release=1761123044, distribution-scope=public, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, io.buildah.version=1.41.4, batch=17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, container_name=nova_compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, build-date=2025-11-19T00:36:58Z, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp17/openstack-nova-compute, version=17.1.12, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.component=openstack-nova-compute-container) Dec 15 04:06:26 localhost systemd[1]: 36a88083ed613f6871d2631ae462bca308a45ba583333f7cfe4d0b0c40a18ba5.timer: Deactivated successfully. Dec 15 04:06:26 localhost systemd[1]: Stopped /usr/bin/podman healthcheck run 36a88083ed613f6871d2631ae462bca308a45ba583333f7cfe4d0b0c40a18ba5. Dec 15 04:06:26 localhost systemd[1]: 36a88083ed613f6871d2631ae462bca308a45ba583333f7cfe4d0b0c40a18ba5.service: Failed to open /run/systemd/transient/36a88083ed613f6871d2631ae462bca308a45ba583333f7cfe4d0b0c40a18ba5.service: No such file or directory Dec 15 04:06:26 localhost systemd[1]: tmp-crun.20emk4.mount: Deactivated successfully. 
Dec 15 04:06:26 localhost systemd[1]: var-lib-containers-storage-overlay-52d2cac538be3ecf871864e9c47de12a2631999deebfb497d729204226e684f5-merged.mount: Deactivated successfully. Dec 15 04:06:26 localhost podman[108858]: 2025-12-15 09:06:26.198231947 +0000 UTC m=+42.188576959 container cleanup 36a88083ed613f6871d2631ae462bca308a45ba583333f7cfe4d0b0c40a18ba5 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, name=rhosp17/openstack-nova-compute, vcs-type=git, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, version=17.1.12, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '182e509007ab5e6e5b2500a552cbd5ba-879500e96bf8dfb93687004bd86f2317'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', 
'/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, build-date=2025-11-19T00:36:58Z, com.redhat.component=openstack-nova-compute-container, url=https://www.redhat.com, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.buildah.version=1.41.4, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.12 17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, release=1761123044, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step5, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, distribution-scope=public, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, batch=17.1_20251118.1, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, container_name=nova_compute, summary=Red Hat OpenStack Platform 17.1 nova-compute) Dec 15 04:06:26 localhost podman[108858]: nova_compute Dec 15 04:06:26 localhost systemd[1]: 36a88083ed613f6871d2631ae462bca308a45ba583333f7cfe4d0b0c40a18ba5.timer: Failed to open /run/systemd/transient/36a88083ed613f6871d2631ae462bca308a45ba583333f7cfe4d0b0c40a18ba5.timer: No such file or directory Dec 15 04:06:26 localhost systemd[1]: 36a88083ed613f6871d2631ae462bca308a45ba583333f7cfe4d0b0c40a18ba5.service: Failed to open /run/systemd/transient/36a88083ed613f6871d2631ae462bca308a45ba583333f7cfe4d0b0c40a18ba5.service: No such file or directory Dec 15 04:06:26 localhost 
podman[108953]: 2025-12-15 09:06:26.214240805 +0000 UTC m=+0.084969501 container cleanup 36a88083ed613f6871d2631ae462bca308a45ba583333f7cfe4d0b0c40a18ba5 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, konflux.additional-tags=17.1.12 17.1_20251118.1, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, vendor=Red Hat, Inc., name=rhosp17/openstack-nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, release=1761123044, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '182e509007ab5e6e5b2500a552cbd5ba-879500e96bf8dfb93687004bd86f2317'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, container_name=nova_compute, architecture=x86_64, version=17.1.12, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.component=openstack-nova-compute-container, build-date=2025-11-19T00:36:58Z, maintainer=OpenStack TripleO Team, config_id=tripleo_step5, distribution-scope=public) Dec 15 04:06:26 localhost systemd[1]: libpod-conmon-36a88083ed613f6871d2631ae462bca308a45ba583333f7cfe4d0b0c40a18ba5.scope: Deactivated successfully. 
Dec 15 04:06:26 localhost systemd[1]: 36a88083ed613f6871d2631ae462bca308a45ba583333f7cfe4d0b0c40a18ba5.timer: Failed to open /run/systemd/transient/36a88083ed613f6871d2631ae462bca308a45ba583333f7cfe4d0b0c40a18ba5.timer: No such file or directory Dec 15 04:06:26 localhost systemd[1]: 36a88083ed613f6871d2631ae462bca308a45ba583333f7cfe4d0b0c40a18ba5.service: Failed to open /run/systemd/transient/36a88083ed613f6871d2631ae462bca308a45ba583333f7cfe4d0b0c40a18ba5.service: No such file or directory Dec 15 04:06:26 localhost podman[108966]: 2025-12-15 09:06:26.283866156 +0000 UTC m=+0.045371713 container cleanup 36a88083ed613f6871d2631ae462bca308a45ba583333f7cfe4d0b0c40a18ba5 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, architecture=x86_64, config_id=tripleo_step5, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, vcs-type=git, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20251118.1, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.12, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, release=1761123044, io.buildah.version=1.41.4, tcib_managed=true, konflux.additional-tags=17.1.12 17.1_20251118.1, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '182e509007ab5e6e5b2500a552cbd5ba-879500e96bf8dfb93687004bd86f2317'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 
'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, build-date=2025-11-19T00:36:58Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, container_name=nova_compute, description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, com.redhat.component=openstack-nova-compute-container, name=rhosp17/openstack-nova-compute) Dec 15 04:06:26 localhost podman[108966]: nova_compute Dec 15 04:06:26 localhost systemd[1]: tripleo_nova_compute.service: Deactivated successfully. Dec 15 04:06:26 localhost systemd[1]: Stopped nova_compute container. 
Dec 15 04:06:26 localhost systemd[1]: tripleo_nova_compute.service: Consumed 1.078s CPU time, no IO. Dec 15 04:06:27 localhost python3.9[109071]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_migration_target.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None Dec 15 04:06:28 localhost systemd[1]: Reloading. Dec 15 04:06:28 localhost systemd-sysv-generator[109099]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. Dec 15 04:06:28 localhost systemd-rc-local-generator[109094]: /etc/rc.d/rc.local is not marked executable, skipping. Dec 15 04:06:28 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Dec 15 04:06:28 localhost systemd[1]: Stopping nova_migration_target container... Dec 15 04:06:28 localhost systemd[1]: libpod-4c3f871f4bdf287b1d3ce717956c1cf130617489217410eadf6fffcc440eb3b0.scope: Deactivated successfully. Dec 15 04:06:28 localhost systemd[1]: libpod-4c3f871f4bdf287b1d3ce717956c1cf130617489217410eadf6fffcc440eb3b0.scope: Consumed 34.962s CPU time. 
Dec 15 04:06:28 localhost podman[109111]: 2025-12-15 09:06:28.681134765 +0000 UTC m=+0.075450477 container died 4c3f871f4bdf287b1d3ce717956c1cf130617489217410eadf6fffcc440eb3b0 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, io.buildah.version=1.41.4, release=1761123044, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, build-date=2025-11-19T00:36:58Z, vendor=Red Hat, Inc., managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, io.openshift.expose-services=, batch=17.1_20251118.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '879500e96bf8dfb93687004bd86f2317'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp17/openstack-nova-compute, tcib_managed=true, 
distribution-scope=public, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 nova-compute, url=https://www.redhat.com, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, version=17.1.12, com.redhat.component=openstack-nova-compute-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, container_name=nova_migration_target, vcs-type=git, architecture=x86_64, config_id=tripleo_step4, summary=Red Hat OpenStack Platform 17.1 nova-compute) Dec 15 04:06:28 localhost systemd[1]: 4c3f871f4bdf287b1d3ce717956c1cf130617489217410eadf6fffcc440eb3b0.timer: Deactivated successfully. Dec 15 04:06:28 localhost systemd[1]: Stopped /usr/bin/podman healthcheck run 4c3f871f4bdf287b1d3ce717956c1cf130617489217410eadf6fffcc440eb3b0. Dec 15 04:06:28 localhost systemd[1]: 4c3f871f4bdf287b1d3ce717956c1cf130617489217410eadf6fffcc440eb3b0.service: Failed to open /run/systemd/transient/4c3f871f4bdf287b1d3ce717956c1cf130617489217410eadf6fffcc440eb3b0.service: No such file or directory Dec 15 04:06:28 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-4c3f871f4bdf287b1d3ce717956c1cf130617489217410eadf6fffcc440eb3b0-userdata-shm.mount: Deactivated successfully. Dec 15 04:06:28 localhost systemd[1]: var-lib-containers-storage-overlay-1925df910a0bd163709115d5c6434edae9eb72581a26c20b4795234cbdad634b-merged.mount: Deactivated successfully. 
Dec 15 04:06:28 localhost podman[109111]: 2025-12-15 09:06:28.732917258 +0000 UTC m=+0.127232970 container cleanup 4c3f871f4bdf287b1d3ce717956c1cf130617489217410eadf6fffcc440eb3b0 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, version=17.1.12, release=1761123044, url=https://www.redhat.com, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.buildah.version=1.41.4, vcs-type=git, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, build-date=2025-11-19T00:36:58Z, summary=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, config_id=tripleo_step4, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '879500e96bf8dfb93687004bd86f2317'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, container_name=nova_migration_target, 
maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp17/openstack-nova-compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, vendor=Red Hat, Inc., com.redhat.component=openstack-nova-compute-container, managed_by=tripleo_ansible) Dec 15 04:06:28 localhost podman[109111]: nova_migration_target Dec 15 04:06:28 localhost systemd[1]: 4c3f871f4bdf287b1d3ce717956c1cf130617489217410eadf6fffcc440eb3b0.timer: Failed to open /run/systemd/transient/4c3f871f4bdf287b1d3ce717956c1cf130617489217410eadf6fffcc440eb3b0.timer: No such file or directory Dec 15 04:06:28 localhost systemd[1]: 4c3f871f4bdf287b1d3ce717956c1cf130617489217410eadf6fffcc440eb3b0.service: Failed to open /run/systemd/transient/4c3f871f4bdf287b1d3ce717956c1cf130617489217410eadf6fffcc440eb3b0.service: No such file or directory Dec 15 04:06:28 localhost podman[109125]: 2025-12-15 09:06:28.74609157 +0000 UTC m=+0.059525081 container cleanup 4c3f871f4bdf287b1d3ce717956c1cf130617489217410eadf6fffcc440eb3b0 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, com.redhat.component=openstack-nova-compute-container, name=rhosp17/openstack-nova-compute, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '879500e96bf8dfb93687004bd86f2317'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, io.buildah.version=1.41.4, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, release=1761123044, config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, tcib_managed=true, url=https://www.redhat.com, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2025-11-19T00:36:58Z, version=17.1.12, distribution-scope=public, io.openshift.expose-services=, container_name=nova_migration_target, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, batch=17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., architecture=x86_64, managed_by=tripleo_ansible, konflux.additional-tags=17.1.12 17.1_20251118.1) Dec 15 04:06:28 localhost systemd[1]: libpod-conmon-4c3f871f4bdf287b1d3ce717956c1cf130617489217410eadf6fffcc440eb3b0.scope: Deactivated successfully. 
Dec 15 04:06:28 localhost systemd[1]: 4c3f871f4bdf287b1d3ce717956c1cf130617489217410eadf6fffcc440eb3b0.timer: Failed to open /run/systemd/transient/4c3f871f4bdf287b1d3ce717956c1cf130617489217410eadf6fffcc440eb3b0.timer: No such file or directory Dec 15 04:06:28 localhost systemd[1]: 4c3f871f4bdf287b1d3ce717956c1cf130617489217410eadf6fffcc440eb3b0.service: Failed to open /run/systemd/transient/4c3f871f4bdf287b1d3ce717956c1cf130617489217410eadf6fffcc440eb3b0.service: No such file or directory Dec 15 04:06:28 localhost podman[109137]: 2025-12-15 09:06:28.830487765 +0000 UTC m=+0.056973853 container cleanup 4c3f871f4bdf287b1d3ce717956c1cf130617489217410eadf6fffcc440eb3b0 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, summary=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-nova-compute-container, batch=17.1_20251118.1, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, io.buildah.version=1.41.4, build-date=2025-11-19T00:36:58Z, distribution-scope=public, release=1761123044, container_name=nova_migration_target, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, config_id=tripleo_step4, url=https://www.redhat.com, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, name=rhosp17/openstack-nova-compute, io.openshift.expose-services=, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, konflux.additional-tags=17.1.12 17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 
nova-compute, version=17.1.12, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '879500e96bf8dfb93687004bd86f2317'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}) Dec 15 04:06:28 localhost podman[109137]: nova_migration_target Dec 15 04:06:28 localhost systemd[1]: tripleo_nova_migration_target.service: Deactivated successfully. Dec 15 04:06:28 localhost systemd[1]: Stopped nova_migration_target container. 
Dec 15 04:06:28 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:fc:1f:13 MACDST=fa:16:3e:b3:bb:e3 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=47722 DF PROTO=TCP SPT=55416 DPT=9101 SEQ=574522976 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A838522E60000000001030307) Dec 15 04:06:29 localhost python3.9[109240]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_virtlogd_wrapper.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None Dec 15 04:06:31 localhost systemd[1]: Reloading. Dec 15 04:06:31 localhost systemd-rc-local-generator[109326]: /etc/rc.d/rc.local is not marked executable, skipping. Dec 15 04:06:31 localhost systemd-sysv-generator[109329]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. Dec 15 04:06:31 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Dec 15 04:06:31 localhost systemd[1]: Stopping nova_virtlogd_wrapper container... Dec 15 04:06:32 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:fc:1f:13 MACDST=fa:16:3e:b3:bb:e3 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=10440 DF PROTO=TCP SPT=37628 DPT=9102 SEQ=3433511798 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A83852EE60000000001030307) Dec 15 04:06:32 localhost systemd[1]: libpod-52123f040652e4c049640fa84ee13fbb7e46e8e9c52436fdd815aa5c35fd7af7.scope: Deactivated successfully. 
Dec 15 04:06:32 localhost podman[109343]: 2025-12-15 09:06:32.066880763 +0000 UTC m=+0.102556321 container died 52123f040652e4c049640fa84ee13fbb7e46e8e9c52436fdd815aa5c35fd7af7 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtlogd_wrapper, architecture=x86_64, url=https://www.redhat.com, konflux.additional-tags=17.1.12 17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 nova-libvirt, vendor=Red Hat, Inc., config_id=tripleo_step3, distribution-scope=public, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, tcib_managed=true, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '879500e96bf8dfb93687004bd86f2317'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 0, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', '/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', 
'/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/var/lib/kolla/config_files/nova_virtlogd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/container-config-scripts/virtlogd_wrapper:/usr/local/bin/virtlogd_wrapper:ro']}, io.openshift.expose-services=, batch=17.1_20251118.1, version=17.1.12, io.buildah.version=1.41.4, release=1761123044, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-nova-libvirt, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, build-date=2025-11-19T00:35:22Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.component=openstack-nova-libvirt-container, container_name=nova_virtlogd_wrapper) Dec 15 04:06:32 localhost systemd[1]: tmp-crun.iF5ydU.mount: Deactivated successfully. 
Dec 15 04:06:32 localhost podman[109343]: 2025-12-15 09:06:32.106410339 +0000 UTC m=+0.142085877 container cleanup 52123f040652e4c049640fa84ee13fbb7e46e8e9c52436fdd815aa5c35fd7af7 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtlogd_wrapper, konflux.additional-tags=17.1.12 17.1_20251118.1, container_name=nova_virtlogd_wrapper, com.redhat.component=openstack-nova-libvirt-container, vendor=Red Hat, Inc., config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '879500e96bf8dfb93687004bd86f2317'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 0, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', '/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', 
'/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/var/lib/kolla/config_files/nova_virtlogd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/container-config-scripts/virtlogd_wrapper:/usr/local/bin/virtlogd_wrapper:ro']}, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, managed_by=tripleo_ansible, config_id=tripleo_step3, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, batch=17.1_20251118.1, io.buildah.version=1.41.4, tcib_managed=true, distribution-scope=public, name=rhosp17/openstack-nova-libvirt, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.expose-services=, vcs-type=git, version=17.1.12, release=1761123044, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, description=Red Hat OpenStack Platform 17.1 nova-libvirt, build-date=2025-11-19T00:35:22Z, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, architecture=x86_64, maintainer=OpenStack TripleO Team) Dec 15 04:06:32 localhost podman[109343]: nova_virtlogd_wrapper Dec 15 04:06:32 localhost podman[109356]: 2025-12-15 09:06:32.146954982 +0000 UTC m=+0.076447563 container cleanup 52123f040652e4c049640fa84ee13fbb7e46e8e9c52436fdd815aa5c35fd7af7 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtlogd_wrapper, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '879500e96bf8dfb93687004bd86f2317'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'security_opt': 
['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 0, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', '/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/var/lib/kolla/config_files/nova_virtlogd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/container-config-scripts/virtlogd_wrapper:/usr/local/bin/virtlogd_wrapper:ro']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, version=17.1.12, distribution-scope=public, build-date=2025-11-19T00:35:22Z, vcs-type=git, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, com.redhat.component=openstack-nova-libvirt-container, description=Red Hat OpenStack Platform 17.1 nova-libvirt, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=nova_virtlogd_wrapper, io.openshift.expose-services=, name=rhosp17/openstack-nova-libvirt, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, managed_by=tripleo_ansible, io.buildah.version=1.41.4, batch=17.1_20251118.1, release=1761123044, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, architecture=x86_64, config_id=tripleo_step3, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, konflux.additional-tags=17.1.12 17.1_20251118.1, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, tcib_managed=true, url=https://www.redhat.com) Dec 15 04:06:32 localhost systemd[1]: Started /usr/bin/podman healthcheck run 2df8d20c6ba32f38bd0e6519c9c77a4146b632f21c81197d8c319b223aad81f1. Dec 15 04:06:32 localhost systemd[1]: Started /usr/bin/podman healthcheck run 4d35b7dc3f3a4e3c60661e85d3511f50804e87244d74cab67af5c6e8c447d379. 
Dec 15 04:06:32 localhost podman[109371]: 2025-12-15 09:06:32.751109504 +0000 UTC m=+0.082332861 container health_status 2df8d20c6ba32f38bd0e6519c9c77a4146b632f21c81197d8c319b223aad81f1 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=unhealthy, build-date=2025-11-18T23:34:05Z, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, name=rhosp17/openstack-ovn-controller, url=https://www.redhat.com, batch=17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, distribution-scope=public, konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 ovn-controller, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-type=git, vendor=Red Hat, Inc., version=17.1.12, config_id=tripleo_step4, com.redhat.component=openstack-ovn-controller-container, tcib_managed=true, architecture=x86_64, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', 
'/var/log/containers/openvswitch:/var/log/ovn:z']}, io.buildah.version=1.41.4, container_name=ovn_controller, release=1761123044, summary=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272) Dec 15 04:06:32 localhost podman[109371]: 2025-12-15 09:06:32.767348637 +0000 UTC m=+0.098571964 container exec_died 2df8d20c6ba32f38bd0e6519c9c77a4146b632f21c81197d8c319b223aad81f1 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, version=17.1.12, name=rhosp17/openstack-ovn-controller, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.12 17.1_20251118.1, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, summary=Red Hat OpenStack Platform 17.1 ovn-controller, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, managed_by=tripleo_ansible, tcib_managed=true, build-date=2025-11-18T23:34:05Z, io.buildah.version=1.41.4, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, 
io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 ovn-controller, architecture=x86_64, vcs-type=git, batch=17.1_20251118.1, vendor=Red Hat, Inc., container_name=ovn_controller, release=1761123044, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, com.redhat.component=openstack-ovn-controller-container, config_id=tripleo_step4) Dec 15 04:06:32 localhost podman[109371]: unhealthy Dec 15 04:06:32 localhost systemd[1]: 2df8d20c6ba32f38bd0e6519c9c77a4146b632f21c81197d8c319b223aad81f1.service: Main process exited, code=exited, status=1/FAILURE Dec 15 04:06:32 localhost systemd[1]: 2df8d20c6ba32f38bd0e6519c9c77a4146b632f21c81197d8c319b223aad81f1.service: Failed with result 'exit-code'. Dec 15 04:06:32 localhost podman[109372]: 2025-12-15 09:06:32.851954579 +0000 UTC m=+0.178105610 container health_status 4d35b7dc3f3a4e3c60661e85d3511f50804e87244d74cab67af5c6e8c447d379 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=unhealthy, build-date=2025-11-19T00:14:25Z, tcib_managed=true, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, io.buildah.version=1.41.4, konflux.additional-tags=17.1.12 17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vendor=Red Hat, Inc., config_id=tripleo_step4, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, name=rhosp17/openstack-neutron-metadata-agent-ovn, version=17.1.12, architecture=x86_64, distribution-scope=public, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, batch=17.1_20251118.1, release=1761123044, 
vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, io.openshift.expose-services=, managed_by=tripleo_ansible, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'a56a6f14b467cd9064e40c03defa5ed7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, container_name=ovn_metadata_agent) Dec 15 04:06:32 localhost podman[109372]: 2025-12-15 09:06:32.869333723 +0000 UTC m=+0.195484714 container exec_died 4d35b7dc3f3a4e3c60661e85d3511f50804e87244d74cab67af5c6e8c447d379 
(image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, maintainer=OpenStack TripleO Team, version=17.1.12, architecture=x86_64, io.buildah.version=1.41.4, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, container_name=ovn_metadata_agent, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, config_id=tripleo_step4, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vendor=Red Hat, Inc., com.redhat.component=openstack-neutron-metadata-agent-ovn-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, name=rhosp17/openstack-neutron-metadata-agent-ovn, vcs-type=git, release=1761123044, build-date=2025-11-19T00:14:25Z, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'a56a6f14b467cd9064e40c03defa5ed7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', 
'/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, batch=17.1_20251118.1, konflux.additional-tags=17.1.12 17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, managed_by=tripleo_ansible, tcib_managed=true) Dec 15 04:06:32 localhost podman[109372]: unhealthy Dec 15 04:06:32 localhost systemd[1]: 4d35b7dc3f3a4e3c60661e85d3511f50804e87244d74cab67af5c6e8c447d379.service: Main process exited, code=exited, status=1/FAILURE Dec 15 04:06:32 localhost systemd[1]: 4d35b7dc3f3a4e3c60661e85d3511f50804e87244d74cab67af5c6e8c447d379.service: Failed with result 'exit-code'. Dec 15 04:06:33 localhost systemd[1]: var-lib-containers-storage-overlay-7838c919c03fee0f47a1131073f9538ccbf08471ed4a61eebb0a717da3a55c5c-merged.mount: Deactivated successfully. Dec 15 04:06:33 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-52123f040652e4c049640fa84ee13fbb7e46e8e9c52436fdd815aa5c35fd7af7-userdata-shm.mount: Deactivated successfully. 
Dec 15 04:06:34 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:fc:1f:13 MACDST=fa:16:3e:b3:bb:e3 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=12173 DF PROTO=TCP SPT=41952 DPT=9882 SEQ=3554493703 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A83853A650000000001030307) Dec 15 04:06:36 localhost systemd[1]: Stopping User Manager for UID 0... Dec 15 04:06:36 localhost systemd[83969]: Activating special unit Exit the Session... Dec 15 04:06:36 localhost systemd[83969]: Removed slice User Background Tasks Slice. Dec 15 04:06:36 localhost systemd[83969]: Stopped target Main User Target. Dec 15 04:06:36 localhost systemd[83969]: Stopped target Basic System. Dec 15 04:06:36 localhost systemd[83969]: Stopped target Paths. Dec 15 04:06:36 localhost systemd[83969]: Stopped target Sockets. Dec 15 04:06:36 localhost systemd[83969]: Stopped target Timers. Dec 15 04:06:36 localhost systemd[83969]: Stopped Daily Cleanup of User's Temporary Directories. Dec 15 04:06:36 localhost systemd[83969]: Closed D-Bus User Message Bus Socket. Dec 15 04:06:36 localhost systemd[83969]: Stopped Create User's Volatile Files and Directories. Dec 15 04:06:36 localhost systemd[83969]: Removed slice User Application Slice. Dec 15 04:06:36 localhost systemd[83969]: Reached target Shutdown. Dec 15 04:06:36 localhost systemd[83969]: Finished Exit the Session. Dec 15 04:06:36 localhost systemd[83969]: Reached target Exit the Session. Dec 15 04:06:36 localhost systemd[1]: user@0.service: Deactivated successfully. Dec 15 04:06:36 localhost systemd[1]: Stopped User Manager for UID 0. Dec 15 04:06:36 localhost systemd[1]: user@0.service: Consumed 3.847s CPU time, no IO. Dec 15 04:06:36 localhost systemd[1]: Stopping User Runtime Directory /run/user/0... Dec 15 04:06:36 localhost systemd[1]: run-user-0.mount: Deactivated successfully. Dec 15 04:06:36 localhost systemd[1]: user-runtime-dir@0.service: Deactivated successfully. 
Dec 15 04:06:36 localhost systemd[1]: Stopped User Runtime Directory /run/user/0. Dec 15 04:06:36 localhost systemd[1]: Removed slice User Slice of UID 0. Dec 15 04:06:36 localhost systemd[1]: user-0.slice: Consumed 4.792s CPU time. Dec 15 04:06:37 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:fc:1f:13 MACDST=fa:16:3e:b3:bb:e3 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=30283 DF PROTO=TCP SPT=45936 DPT=9105 SEQ=947797976 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A838546250000000001030307) Dec 15 04:06:41 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:fc:1f:13 MACDST=fa:16:3e:b3:bb:e3 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=47724 DF PROTO=TCP SPT=55416 DPT=9101 SEQ=574522976 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A838553250000000001030307) Dec 15 04:06:44 localhost systemd[1]: Starting Check and recover tripleo_nova_virtqemud... Dec 15 04:06:44 localhost recover_tripleo_nova_virtqemud[109428]: 61849 Dec 15 04:06:44 localhost systemd[1]: tripleo_nova_virtqemud_recover.service: Deactivated successfully. Dec 15 04:06:44 localhost systemd[1]: Finished Check and recover tripleo_nova_virtqemud. 
Dec 15 04:06:47 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:fc:1f:13 MACDST=fa:16:3e:b3:bb:e3 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=12175 DF PROTO=TCP SPT=41952 DPT=9882 SEQ=3554493703 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A83856B260000000001030307) Dec 15 04:06:50 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:fc:1f:13 MACDST=fa:16:3e:b3:bb:e3 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=30285 DF PROTO=TCP SPT=45936 DPT=9105 SEQ=947797976 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A838577260000000001030307) Dec 15 04:06:53 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:fc:1f:13 MACDST=fa:16:3e:b3:bb:e3 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=5866 DF PROTO=TCP SPT=48988 DPT=9100 SEQ=3690099130 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A838582020000000001030307) Dec 15 04:06:54 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:fc:1f:13 MACDST=fa:16:3e:b3:bb:e3 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=5867 DF PROTO=TCP SPT=48988 DPT=9100 SEQ=3690099130 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A838586250000000001030307) Dec 15 04:06:56 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:fc:1f:13 MACDST=fa:16:3e:b3:bb:e3 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=5868 DF PROTO=TCP SPT=48988 DPT=9100 SEQ=3690099130 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A83858E250000000001030307) Dec 15 04:06:58 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:fc:1f:13 MACDST=fa:16:3e:b3:bb:e3 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=43099 DF PROTO=TCP SPT=41852 DPT=9101 SEQ=1633818135 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT 
(020405500402080A838598250000000001030307) Dec 15 04:07:01 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:fc:1f:13 MACDST=fa:16:3e:b3:bb:e3 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=38634 DF PROTO=TCP SPT=41106 DPT=9102 SEQ=2325756852 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A8385A3E60000000001030307) Dec 15 04:07:03 localhost systemd[1]: Started /usr/bin/podman healthcheck run 2df8d20c6ba32f38bd0e6519c9c77a4146b632f21c81197d8c319b223aad81f1. Dec 15 04:07:03 localhost systemd[1]: Started /usr/bin/podman healthcheck run 4d35b7dc3f3a4e3c60661e85d3511f50804e87244d74cab67af5c6e8c447d379. Dec 15 04:07:03 localhost podman[109429]: 2025-12-15 09:07:03.248317082 +0000 UTC m=+0.074863121 container health_status 2df8d20c6ba32f38bd0e6519c9c77a4146b632f21c81197d8c319b223aad81f1 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=unhealthy, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.12, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, name=rhosp17/openstack-ovn-controller, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', 
'/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, build-date=2025-11-18T23:34:05Z, maintainer=OpenStack TripleO Team, distribution-scope=public, io.openshift.expose-services=, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, vcs-type=git, batch=17.1_20251118.1, container_name=ovn_controller, com.redhat.component=openstack-ovn-controller-container, release=1761123044, config_id=tripleo_step4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, konflux.additional-tags=17.1.12 17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 ovn-controller, summary=Red Hat OpenStack Platform 17.1 ovn-controller, vendor=Red Hat, Inc., io.buildah.version=1.41.4, managed_by=tripleo_ansible, architecture=x86_64, url=https://www.redhat.com, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream) Dec 15 04:07:03 localhost podman[109430]: 2025-12-15 09:07:03.292032071 +0000 UTC m=+0.114446809 container health_status 4d35b7dc3f3a4e3c60661e85d3511f50804e87244d74cab67af5c6e8c447d379 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=unhealthy, konflux.additional-tags=17.1.12 17.1_20251118.1, release=1761123044, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'a56a6f14b467cd9064e40c03defa5ed7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, version=17.1.12, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, distribution-scope=public, config_id=tripleo_step4, tcib_managed=true, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-neutron-metadata-agent-ovn, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, container_name=ovn_metadata_agent, io.buildah.version=1.41.4, batch=17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, build-date=2025-11-19T00:14:25Z, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, managed_by=tripleo_ansible) 
Dec 15 04:07:03 localhost podman[109430]: 2025-12-15 09:07:03.301256887 +0000 UTC m=+0.123671615 container exec_died 4d35b7dc3f3a4e3c60661e85d3511f50804e87244d74cab67af5c6e8c447d379 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, managed_by=tripleo_ansible, version=17.1.12, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, architecture=x86_64, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, build-date=2025-11-19T00:14:25Z, maintainer=OpenStack TripleO Team, container_name=ovn_metadata_agent, url=https://www.redhat.com, name=rhosp17/openstack-neutron-metadata-agent-ovn, konflux.additional-tags=17.1.12 17.1_20251118.1, config_id=tripleo_step4, tcib_managed=true, io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, batch=17.1_20251118.1, release=1761123044, io.openshift.expose-services=, vcs-type=git, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'a56a6f14b467cd9064e40c03defa5ed7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 
'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}) Dec 15 04:07:03 localhost podman[109430]: unhealthy Dec 15 04:07:03 localhost systemd[1]: 4d35b7dc3f3a4e3c60661e85d3511f50804e87244d74cab67af5c6e8c447d379.service: Main process exited, code=exited, status=1/FAILURE Dec 15 04:07:03 localhost systemd[1]: 4d35b7dc3f3a4e3c60661e85d3511f50804e87244d74cab67af5c6e8c447d379.service: Failed with result 'exit-code'. 
Dec 15 04:07:03 localhost podman[109429]: 2025-12-15 09:07:03.315797785 +0000 UTC m=+0.142343804 container exec_died 2df8d20c6ba32f38bd0e6519c9c77a4146b632f21c81197d8c319b223aad81f1 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.buildah.version=1.41.4, description=Red Hat OpenStack Platform 17.1 ovn-controller, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-type=git, tcib_managed=true, com.redhat.component=openstack-ovn-controller-container, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20251118.1, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, architecture=x86_64, config_id=tripleo_step4, build-date=2025-11-18T23:34:05Z, name=rhosp17/openstack-ovn-controller, release=1761123044, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, vendor=Red Hat, Inc., 
org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, version=17.1.12, summary=Red Hat OpenStack Platform 17.1 ovn-controller, container_name=ovn_controller, url=https://www.redhat.com) Dec 15 04:07:03 localhost podman[109429]: unhealthy Dec 15 04:07:03 localhost systemd[1]: 2df8d20c6ba32f38bd0e6519c9c77a4146b632f21c81197d8c319b223aad81f1.service: Main process exited, code=exited, status=1/FAILURE Dec 15 04:07:03 localhost systemd[1]: 2df8d20c6ba32f38bd0e6519c9c77a4146b632f21c81197d8c319b223aad81f1.service: Failed with result 'exit-code'. Dec 15 04:07:03 localhost sshd[109466]: main: sshd: ssh-rsa algorithm is disabled Dec 15 04:07:04 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:fc:1f:13 MACDST=fa:16:3e:b3:bb:e3 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=39983 DF PROTO=TCP SPT=60398 DPT=9882 SEQ=3264327988 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A8385AFA50000000001030307) Dec 15 04:07:07 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:fc:1f:13 MACDST=fa:16:3e:b3:bb:e3 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=35208 DF PROTO=TCP SPT=60436 DPT=9105 SEQ=887094608 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A8385BB650000000001030307) Dec 15 04:07:11 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:fc:1f:13 MACDST=fa:16:3e:b3:bb:e3 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=43101 DF PROTO=TCP SPT=41852 DPT=9101 SEQ=1633818135 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A8385C9250000000001030307) Dec 15 04:07:17 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:fc:1f:13 MACDST=fa:16:3e:b3:bb:e3 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=39985 DF PROTO=TCP SPT=60398 DPT=9882 SEQ=3264327988 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A8385DF250000000001030307) Dec 15 
04:07:20 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:fc:1f:13 MACDST=fa:16:3e:b3:bb:e3 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=35210 DF PROTO=TCP SPT=60436 DPT=9105 SEQ=887094608 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A8385EB250000000001030307) Dec 15 04:07:23 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:fc:1f:13 MACDST=fa:16:3e:b3:bb:e3 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=51224 DF PROTO=TCP SPT=59168 DPT=9100 SEQ=3189480170 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A8385F7330000000001030307) Dec 15 04:07:24 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:fc:1f:13 MACDST=fa:16:3e:b3:bb:e3 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=51225 DF PROTO=TCP SPT=59168 DPT=9100 SEQ=3189480170 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A8385FB250000000001030307) Dec 15 04:07:26 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:fc:1f:13 MACDST=fa:16:3e:b3:bb:e3 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=51226 DF PROTO=TCP SPT=59168 DPT=9100 SEQ=3189480170 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A838603250000000001030307) Dec 15 04:07:28 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:fc:1f:13 MACDST=fa:16:3e:b3:bb:e3 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=10443 DF PROTO=TCP SPT=37628 DPT=9102 SEQ=3433511798 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A83860D250000000001030307) Dec 15 04:07:31 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:fc:1f:13 MACDST=fa:16:3e:b3:bb:e3 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=40291 DF PROTO=TCP SPT=48726 DPT=9102 SEQ=4015559549 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT 
(020405500402080A838619250000000001030307) Dec 15 04:07:33 localhost systemd[1]: Started /usr/bin/podman healthcheck run 2df8d20c6ba32f38bd0e6519c9c77a4146b632f21c81197d8c319b223aad81f1. Dec 15 04:07:33 localhost systemd[1]: Started /usr/bin/podman healthcheck run 4d35b7dc3f3a4e3c60661e85d3511f50804e87244d74cab67af5c6e8c447d379. Dec 15 04:07:33 localhost podman[109468]: 2025-12-15 09:07:33.490780957 +0000 UTC m=+0.072124199 container health_status 2df8d20c6ba32f38bd0e6519c9c77a4146b632f21c81197d8c319b223aad81f1 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=unhealthy, container_name=ovn_controller, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 ovn-controller, version=17.1.12, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, release=1761123044, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, url=https://www.redhat.com, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, batch=17.1_20251118.1, io.openshift.expose-services=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 ovn-controller, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, 
konflux.additional-tags=17.1.12 17.1_20251118.1, io.buildah.version=1.41.4, architecture=x86_64, build-date=2025-11-18T23:34:05Z, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, name=rhosp17/openstack-ovn-controller, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-ovn-controller-container, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, config_id=tripleo_step4) Dec 15 04:07:33 localhost podman[109468]: 2025-12-15 09:07:33.508305615 +0000 UTC m=+0.089648827 container exec_died 2df8d20c6ba32f38bd0e6519c9c77a4146b632f21c81197d8c319b223aad81f1 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, architecture=x86_64, release=1761123044, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, container_name=ovn_controller, tcib_managed=true, vcs-type=git, batch=17.1_20251118.1, 
cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.expose-services=, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 ovn-controller, io.buildah.version=1.41.4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.12, name=rhosp17/openstack-ovn-controller, build-date=2025-11-18T23:34:05Z, vendor=Red Hat, Inc., com.redhat.component=openstack-ovn-controller-container, description=Red Hat OpenStack Platform 17.1 ovn-controller, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, config_id=tripleo_step4, distribution-scope=public) Dec 15 04:07:33 localhost podman[109468]: unhealthy Dec 15 04:07:33 localhost systemd[1]: 2df8d20c6ba32f38bd0e6519c9c77a4146b632f21c81197d8c319b223aad81f1.service: Main process exited, code=exited, status=1/FAILURE Dec 15 04:07:33 localhost systemd[1]: 2df8d20c6ba32f38bd0e6519c9c77a4146b632f21c81197d8c319b223aad81f1.service: Failed with result 'exit-code'. Dec 15 04:07:33 localhost systemd[1]: tmp-crun.3AASWc.mount: Deactivated successfully. 
Dec 15 04:07:33 localhost podman[109469]: 2025-12-15 09:07:33.600846137 +0000 UTC m=+0.177100632 container health_status 4d35b7dc3f3a4e3c60661e85d3511f50804e87244d74cab67af5c6e8c447d379 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=unhealthy, config_id=tripleo_step4, url=https://www.redhat.com, batch=17.1_20251118.1, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1761123044, version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, container_name=ovn_metadata_agent, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-type=git, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, managed_by=tripleo_ansible, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, distribution-scope=public, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.buildah.version=1.41.4, name=rhosp17/openstack-neutron-metadata-agent-ovn, vendor=Red Hat, Inc., build-date=2025-11-19T00:14:25Z, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'a56a6f14b467cd9064e40c03defa5ed7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 
'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, konflux.additional-tags=17.1.12 17.1_20251118.1) Dec 15 04:07:33 localhost podman[109469]: 2025-12-15 09:07:33.640343832 +0000 UTC m=+0.216598307 container exec_died 4d35b7dc3f3a4e3c60661e85d3511f50804e87244d74cab67af5c6e8c447d379 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, konflux.additional-tags=17.1.12 17.1_20251118.1, io.buildah.version=1.41.4, version=17.1.12, config_id=tripleo_step4, tcib_managed=true, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, build-date=2025-11-19T00:14:25Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, batch=17.1_20251118.1, vendor=Red Hat, Inc., architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, distribution-scope=public, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, name=rhosp17/openstack-neutron-metadata-agent-ovn, io.openshift.tags=rhosp osp 
openstack osp-17.1 openstack-neutron-metadata-agent-ovn, maintainer=OpenStack TripleO Team, container_name=ovn_metadata_agent, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, release=1761123044, managed_by=tripleo_ansible, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, vcs-type=git, url=https://www.redhat.com, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'a56a6f14b467cd9064e40c03defa5ed7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05)
Dec 15 04:07:33 localhost podman[109469]: unhealthy
Dec 15 04:07:33 localhost systemd[1]: 4d35b7dc3f3a4e3c60661e85d3511f50804e87244d74cab67af5c6e8c447d379.service: Main process exited, code=exited, status=1/FAILURE
Dec 15 04:07:33 localhost systemd[1]: 4d35b7dc3f3a4e3c60661e85d3511f50804e87244d74cab67af5c6e8c447d379.service: Failed with result 'exit-code'.
Dec 15 04:07:34 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:fc:1f:13 MACDST=fa:16:3e:b3:bb:e3 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=23706 DF PROTO=TCP SPT=58676 DPT=9882 SEQ=2062184330 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A838624E50000000001030307)
Dec 15 04:07:37 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:fc:1f:13 MACDST=fa:16:3e:b3:bb:e3 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=34612 DF PROTO=TCP SPT=59376 DPT=9105 SEQ=4020331901 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A838630650000000001030307)
Dec 15 04:07:41 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:fc:1f:13 MACDST=fa:16:3e:b3:bb:e3 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=47757 DF PROTO=TCP SPT=37792 DPT=9101 SEQ=35712005 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A83863D260000000001030307)
Dec 15 04:07:47 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:fc:1f:13 MACDST=fa:16:3e:b3:bb:e3 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=23708 DF PROTO=TCP SPT=58676 DPT=9882 SEQ=2062184330 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A838655250000000001030307)
Dec 15 04:07:50 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:fc:1f:13 MACDST=fa:16:3e:b3:bb:e3 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=34614 DF PROTO=TCP SPT=59376 DPT=9105 SEQ=4020331901 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A838661260000000001030307)
Dec 15 04:07:53 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:fc:1f:13 MACDST=fa:16:3e:b3:bb:e3 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=37237 DF PROTO=TCP SPT=47220 DPT=9100 SEQ=2090339387 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A83866C630000000001030307)
Dec 15 04:07:54 localhost systemd[1]: Starting Check and recover tripleo_nova_virtqemud...
Dec 15 04:07:54 localhost recover_tripleo_nova_virtqemud[109639]: 61849
Dec 15 04:07:54 localhost systemd[1]: tripleo_nova_virtqemud_recover.service: Deactivated successfully.
Dec 15 04:07:54 localhost systemd[1]: Finished Check and recover tripleo_nova_virtqemud.
Dec 15 04:07:54 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:fc:1f:13 MACDST=fa:16:3e:b3:bb:e3 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=37238 DF PROTO=TCP SPT=47220 DPT=9100 SEQ=2090339387 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A838670650000000001030307)
Dec 15 04:07:55 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:fc:1f:13 MACDST=fa:16:3e:b3:bb:e3 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=13118 DF PROTO=TCP SPT=35874 DPT=9102 SEQ=186911337 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A838676A50000000001030307)
Dec 15 04:07:56 localhost systemd[1]: tripleo_nova_virtlogd_wrapper.service: State 'stop-sigterm' timed out. Killing.
Dec 15 04:07:56 localhost systemd[1]: tripleo_nova_virtlogd_wrapper.service: Killing process 61077 (conmon) with signal SIGKILL.
Dec 15 04:07:56 localhost systemd[1]: tripleo_nova_virtlogd_wrapper.service: Main process exited, code=killed, status=9/KILL
Dec 15 04:07:56 localhost systemd[1]: libpod-conmon-52123f040652e4c049640fa84ee13fbb7e46e8e9c52436fdd815aa5c35fd7af7.scope: Deactivated successfully.
Dec 15 04:07:56 localhost podman[109653]: error opening file `/run/crun/52123f040652e4c049640fa84ee13fbb7e46e8e9c52436fdd815aa5c35fd7af7/status`: No such file or directory
Dec 15 04:07:56 localhost podman[109640]: 2025-12-15 09:07:56.25879464 +0000 UTC m=+0.078321193 container cleanup 52123f040652e4c049640fa84ee13fbb7e46e8e9c52436fdd815aa5c35fd7af7 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtlogd_wrapper, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.buildah.version=1.41.4, vcs-type=git, url=https://www.redhat.com, konflux.additional-tags=17.1.12 17.1_20251118.1, release=1761123044, tcib_managed=true, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, batch=17.1_20251118.1, config_id=tripleo_step3, managed_by=tripleo_ansible, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '879500e96bf8dfb93687004bd86f2317'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 0, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', '/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/var/lib/kolla/config_files/nova_virtlogd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/container-config-scripts/virtlogd_wrapper:/usr/local/bin/virtlogd_wrapper:ro']}, architecture=x86_64, version=17.1.12, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, build-date=2025-11-19T00:35:22Z, description=Red Hat OpenStack Platform 17.1 nova-libvirt, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, distribution-scope=public, container_name=nova_virtlogd_wrapper, com.redhat.component=openstack-nova-libvirt-container, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, name=rhosp17/openstack-nova-libvirt, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05)
Dec 15 04:07:56 localhost podman[109640]: nova_virtlogd_wrapper
Dec 15 04:07:56 localhost systemd[1]: tripleo_nova_virtlogd_wrapper.service: Failed with result 'timeout'.
Dec 15 04:07:56 localhost systemd[1]: Stopped nova_virtlogd_wrapper container.
Dec 15 04:07:56 localhost python3.9[109746]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_virtnodedevd.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 15 04:07:57 localhost systemd[1]: Reloading.
Dec 15 04:07:57 localhost systemd-sysv-generator[109773]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 15 04:07:57 localhost systemd-rc-local-generator[109770]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 15 04:07:57 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Dec 15 04:07:57 localhost systemd[1]: Stopping nova_virtnodedevd container...
Dec 15 04:07:57 localhost systemd[1]: tmp-crun.ke5bOF.mount: Deactivated successfully.
Dec 15 04:07:57 localhost systemd[1]: libpod-defb2439fc676fe463bf8d7eb18e0214dacf68a5280036dda3b72a4334bffc50.scope: Deactivated successfully.
Dec 15 04:07:57 localhost systemd[1]: libpod-defb2439fc676fe463bf8d7eb18e0214dacf68a5280036dda3b72a4334bffc50.scope: Consumed 1.503s CPU time.
Dec 15 04:07:57 localhost podman[109786]: 2025-12-15 09:07:57.423811266 +0000 UTC m=+0.081390575 container died defb2439fc676fe463bf8d7eb18e0214dacf68a5280036dda3b72a4334bffc50 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtnodedevd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, version=17.1.12, config_id=tripleo_step3, batch=17.1_20251118.1, io.buildah.version=1.41.4, konflux.additional-tags=17.1.12 17.1_20251118.1, distribution-scope=public, config_data={'cgroupns': 'host', 'depends_on': ['tripleo_nova_virtlogd_wrapper.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '879500e96bf8dfb93687004bd86f2317'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'pids_limit': 65536, 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 2, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', '/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', 
'/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/var/lib/kolla/config_files/nova_virtnodedevd.json:/var/lib/kolla/config_files/config.json:ro']}, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, release=1761123044, com.redhat.component=openstack-nova-libvirt-container, tcib_managed=true, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-19T00:35:22Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, description=Red Hat OpenStack Platform 17.1 nova-libvirt, url=https://www.redhat.com, container_name=nova_virtnodedevd, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, vendor=Red Hat, Inc., name=rhosp17/openstack-nova-libvirt, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, architecture=x86_64)
Dec 15 04:07:57 localhost podman[109786]: 2025-12-15 09:07:57.457895547 +0000 UTC m=+0.115474836 container cleanup defb2439fc676fe463bf8d7eb18e0214dacf68a5280036dda3b72a4334bffc50 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtnodedevd, container_name=nova_virtnodedevd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, batch=17.1_20251118.1, release=1761123044, distribution-scope=public, name=rhosp17/openstack-nova-libvirt, version=17.1.12, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, config_data={'cgroupns': 'host', 'depends_on': ['tripleo_nova_virtlogd_wrapper.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 
'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '879500e96bf8dfb93687004bd86f2317'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'pids_limit': 65536, 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 2, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', '/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/var/lib/kolla/config_files/nova_virtnodedevd.json:/var/lib/kolla/config_files/config.json:ro']}, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.expose-services=, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, url=https://www.redhat.com, architecture=x86_64, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, maintainer=OpenStack TripleO Team, config_id=tripleo_step3, build-date=2025-11-19T00:35:22Z, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 nova-libvirt, vendor=Red Hat, Inc., io.buildah.version=1.41.4, tcib_managed=true, com.redhat.component=openstack-nova-libvirt-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt)
Dec 15 04:07:57 localhost podman[109786]: nova_virtnodedevd
Dec 15 04:07:57 localhost podman[109801]: 2025-12-15 09:07:57.513544584 +0000 UTC m=+0.070811073 container cleanup defb2439fc676fe463bf8d7eb18e0214dacf68a5280036dda3b72a4334bffc50 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtnodedevd, konflux.additional-tags=17.1.12 17.1_20251118.1, architecture=x86_64, version=17.1.12, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, batch=17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, config_data={'cgroupns': 'host', 'depends_on': ['tripleo_nova_virtlogd_wrapper.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '879500e96bf8dfb93687004bd86f2317'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'pids_limit': 65536, 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 2, 'ulimit': ['nofile=131072', 
'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', '/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/var/lib/kolla/config_files/nova_virtnodedevd.json:/var/lib/kolla/config_files/config.json:ro']}, managed_by=tripleo_ansible, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 nova-libvirt, name=rhosp17/openstack-nova-libvirt, io.openshift.expose-services=, config_id=tripleo_step3, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, url=https://www.redhat.com, build-date=2025-11-19T00:35:22Z, release=1761123044, tcib_managed=true, distribution-scope=public, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-nova-libvirt-container, container_name=nova_virtnodedevd, 
maintainer=OpenStack TripleO Team)
Dec 15 04:07:57 localhost systemd[1]: libpod-conmon-defb2439fc676fe463bf8d7eb18e0214dacf68a5280036dda3b72a4334bffc50.scope: Deactivated successfully.
Dec 15 04:07:57 localhost podman[109830]: error opening file `/run/crun/defb2439fc676fe463bf8d7eb18e0214dacf68a5280036dda3b72a4334bffc50/status`: No such file or directory
Dec 15 04:07:57 localhost podman[109817]: 2025-12-15 09:07:57.610936586 +0000 UTC m=+0.064450273 container cleanup defb2439fc676fe463bf8d7eb18e0214dacf68a5280036dda3b72a4334bffc50 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtnodedevd, config_id=tripleo_step3, build-date=2025-11-19T00:35:22Z, architecture=x86_64, distribution-scope=public, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-nova-libvirt-container, tcib_managed=true, url=https://www.redhat.com, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, name=rhosp17/openstack-nova-libvirt, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, release=1761123044, managed_by=tripleo_ansible, version=17.1.12, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., container_name=nova_virtnodedevd, batch=17.1_20251118.1, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, description=Red Hat OpenStack Platform 17.1 nova-libvirt, io.buildah.version=1.41.4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_data={'cgroupns': 'host', 'depends_on': ['tripleo_nova_virtlogd_wrapper.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '879500e96bf8dfb93687004bd86f2317'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'pids_limit': 65536, 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 2, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', '/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/var/lib/kolla/config_files/nova_virtnodedevd.json:/var/lib/kolla/config_files/config.json:ro']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt)
Dec 15 04:07:57 localhost podman[109817]: nova_virtnodedevd
Dec 15 04:07:57 localhost systemd[1]: tripleo_nova_virtnodedevd.service: Deactivated successfully.
Dec 15 04:07:57 localhost systemd[1]: Stopped nova_virtnodedevd container.
Dec 15 04:07:58 localhost python3.9[109923]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_virtproxyd.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 15 04:07:58 localhost systemd[1]: var-lib-containers-storage-overlay-114f79195e61e2cc174bba0adc80dff416857f21c27a337c4dfb908cace7e308-merged.mount: Deactivated successfully.
Dec 15 04:07:58 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-defb2439fc676fe463bf8d7eb18e0214dacf68a5280036dda3b72a4334bffc50-userdata-shm.mount: Deactivated successfully.
Dec 15 04:07:58 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:fc:1f:13 MACDST=fa:16:3e:b3:bb:e3 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=64874 DF PROTO=TCP SPT=40082 DPT=9101 SEQ=1367310476 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A838682660000000001030307)
Dec 15 04:07:59 localhost systemd[1]: Reloading.
Dec 15 04:07:59 localhost systemd-rc-local-generator[109946]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 15 04:07:59 localhost systemd-sysv-generator[109951]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 15 04:07:59 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Dec 15 04:07:59 localhost systemd[1]: Stopping nova_virtproxyd container...
Dec 15 04:07:59 localhost systemd[1]: tmp-crun.2ssSCi.mount: Deactivated successfully.
Dec 15 04:07:59 localhost systemd[1]: libpod-ddca038c8593b6a3bb2a52c20e9a406ff5bb4e8cd16a7b42ec23df21f0897120.scope: Deactivated successfully.
Dec 15 04:07:59 localhost podman[109964]: 2025-12-15 09:07:59.807539144 +0000 UTC m=+0.074995325 container died ddca038c8593b6a3bb2a52c20e9a406ff5bb4e8cd16a7b42ec23df21f0897120 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtproxyd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, container_name=nova_virtproxyd, architecture=x86_64, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, konflux.additional-tags=17.1.12 17.1_20251118.1, release=1761123044, config_id=tripleo_step3, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, config_data={'cgroupns': 'host', 'depends_on': ['tripleo_nova_virtlogd_wrapper.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '879500e96bf8dfb93687004bd86f2317'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'pids_limit': 65536, 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 5, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/libvirt:/var/log/libvirt:shared,z', 
'/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', '/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/var/lib/kolla/config_files/nova_virtproxyd.json:/var/lib/kolla/config_files/config.json:ro']}, batch=17.1_20251118.1, distribution-scope=public, vcs-type=git, io.openshift.expose-services=, vendor=Red Hat, Inc., name=rhosp17/openstack-nova-libvirt, tcib_managed=true, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 nova-libvirt, com.redhat.component=openstack-nova-libvirt-container, io.buildah.version=1.41.4, build-date=2025-11-19T00:35:22Z, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05)
Dec 15 04:07:59 localhost podman[109964]: 2025-12-15 09:07:59.85267111 +0000 UTC m=+0.120127291 container cleanup ddca038c8593b6a3bb2a52c20e9a406ff5bb4e8cd16a7b42ec23df21f0897120 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtproxyd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, version=17.1.12, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, distribution-scope=public, maintainer=OpenStack TripleO Team, batch=17.1_20251118.1, managed_by=tripleo_ansible, build-date=2025-11-19T00:35:22Z, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, 
architecture=x86_64, vendor=Red Hat, Inc., name=rhosp17/openstack-nova-libvirt, config_data={'cgroupns': 'host', 'depends_on': ['tripleo_nova_virtlogd_wrapper.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '879500e96bf8dfb93687004bd86f2317'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'pids_limit': 65536, 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 5, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', '/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/var/lib/kolla/config_files/nova_virtproxyd.json:/var/lib/kolla/config_files/config.json:ro']}, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, 
vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, konflux.additional-tags=17.1.12 17.1_20251118.1, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 nova-libvirt, release=1761123044, container_name=nova_virtproxyd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, config_id=tripleo_step3, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, vcs-type=git, com.redhat.component=openstack-nova-libvirt-container)
Dec 15 04:07:59 localhost podman[109964]: nova_virtproxyd
Dec 15 04:07:59 localhost podman[109979]: 2025-12-15 09:07:59.883868983 +0000 UTC m=+0.064422682 container cleanup ddca038c8593b6a3bb2a52c20e9a406ff5bb4e8cd16a7b42ec23df21f0897120 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtproxyd, container_name=nova_virtproxyd, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, com.redhat.component=openstack-nova-libvirt-container, managed_by=tripleo_ansible, url=https://www.redhat.com, distribution-scope=public, vendor=Red Hat, Inc., org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, version=17.1.12, config_id=tripleo_step3, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, build-date=2025-11-19T00:35:22Z, release=1761123044, name=rhosp17/openstack-nova-libvirt, config_data={'cgroupns': 'host', 'depends_on': ['tripleo_nova_virtlogd_wrapper.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '879500e96bf8dfb93687004bd86f2317'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'pids_limit': 65536, 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 5, 
'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', '/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/var/lib/kolla/config_files/nova_virtproxyd.json:/var/lib/kolla/config_files/config.json:ro']}, io.buildah.version=1.41.4, konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team, batch=17.1_20251118.1, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, description=Red Hat OpenStack Platform 17.1 nova-libvirt, tcib_managed=true, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, 
cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream) Dec 15 04:07:59 localhost systemd[1]: libpod-conmon-ddca038c8593b6a3bb2a52c20e9a406ff5bb4e8cd16a7b42ec23df21f0897120.scope: Deactivated successfully. Dec 15 04:07:59 localhost podman[110008]: error opening file `/run/crun/ddca038c8593b6a3bb2a52c20e9a406ff5bb4e8cd16a7b42ec23df21f0897120/status`: No such file or directory Dec 15 04:07:59 localhost podman[109994]: 2025-12-15 09:07:59.989787743 +0000 UTC m=+0.075197540 container cleanup ddca038c8593b6a3bb2a52c20e9a406ff5bb4e8cd16a7b42ec23df21f0897120 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtproxyd, konflux.additional-tags=17.1.12 17.1_20251118.1, architecture=x86_64, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, version=17.1.12, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, description=Red Hat OpenStack Platform 17.1 nova-libvirt, name=rhosp17/openstack-nova-libvirt, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, release=1761123044, build-date=2025-11-19T00:35:22Z, io.buildah.version=1.41.4, com.redhat.component=openstack-nova-libvirt-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, container_name=nova_virtproxyd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, config_id=tripleo_step3, distribution-scope=public, url=https://www.redhat.com, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, batch=17.1_20251118.1, config_data={'cgroupns': 'host', 'depends_on': ['tripleo_nova_virtlogd_wrapper.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '879500e96bf8dfb93687004bd86f2317'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 
'host', 'pid': 'host', 'pids_limit': 65536, 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 5, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', '/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/var/lib/kolla/config_files/nova_virtproxyd.json:/var/lib/kolla/config_files/config.json:ro']}, tcib_managed=true, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible) Dec 15 04:07:59 localhost podman[109994]: nova_virtproxyd Dec 15 04:07:59 localhost systemd[1]: tripleo_nova_virtproxyd.service: Deactivated successfully. Dec 15 04:07:59 localhost systemd[1]: Stopped nova_virtproxyd container. 
Dec 15 04:08:00 localhost python3.9[110101]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_virtqemud.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None Dec 15 04:08:00 localhost systemd[1]: Reloading. Dec 15 04:08:00 localhost systemd-rc-local-generator[110123]: /etc/rc.d/rc.local is not marked executable, skipping. Dec 15 04:08:00 localhost systemd-sysv-generator[110128]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. Dec 15 04:08:00 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Dec 15 04:08:01 localhost systemd[1]: tmp-crun.QgrWwG.mount: Deactivated successfully. Dec 15 04:08:01 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-ddca038c8593b6a3bb2a52c20e9a406ff5bb4e8cd16a7b42ec23df21f0897120-userdata-shm.mount: Deactivated successfully. Dec 15 04:08:01 localhost systemd[1]: var-lib-containers-storage-overlay-815b56c139a3f32508088ca9c999727b748310f25f30262c114b54984dd7dcbf-merged.mount: Deactivated successfully. Dec 15 04:08:01 localhost systemd[1]: tripleo_nova_virtqemud_recover.timer: Deactivated successfully. Dec 15 04:08:01 localhost systemd[1]: Stopped Check and recover tripleo_nova_virtqemud every 10m. Dec 15 04:08:01 localhost systemd[1]: Stopping nova_virtqemud container... Dec 15 04:08:01 localhost systemd[1]: libpod-17f8fb766dee5ec6e0cf3c348d53f962ee388bdd6255520267152d5d12168616.scope: Deactivated successfully. Dec 15 04:08:01 localhost systemd[1]: libpod-17f8fb766dee5ec6e0cf3c348d53f962ee388bdd6255520267152d5d12168616.scope: Consumed 2.884s CPU time. 
Dec 15 04:08:01 localhost podman[110141]: 2025-12-15 09:08:01.158156849 +0000 UTC m=+0.081461427 container died 17f8fb766dee5ec6e0cf3c348d53f962ee388bdd6255520267152d5d12168616 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtqemud, container_name=nova_virtqemud, config_id=tripleo_step3, com.redhat.component=openstack-nova-libvirt-container, vendor=Red Hat, Inc., version=17.1.12, io.buildah.version=1.41.4, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, konflux.additional-tags=17.1.12 17.1_20251118.1, batch=17.1_20251118.1, managed_by=tripleo_ansible, url=https://www.redhat.com, architecture=x86_64, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, config_data={'cgroupns': 'host', 'depends_on': ['tripleo_nova_virtlogd_wrapper.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '879500e96bf8dfb93687004bd86f2317'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'pids_limit': 65536, 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 4, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', '/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/var/lib/kolla/config_files/nova_virtqemud.json:/var/lib/kolla/config_files/config.json:ro', '/var/log/containers/libvirt/swtpm:/var/log/swtpm:z']}, io.openshift.expose-services=, name=rhosp17/openstack-nova-libvirt, release=1761123044, tcib_managed=true, distribution-scope=public, build-date=2025-11-19T00:35:22Z, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 nova-libvirt, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream) Dec 15 04:08:01 localhost podman[110141]: 2025-12-15 09:08:01.188412468 +0000 UTC m=+0.111717046 container cleanup 17f8fb766dee5ec6e0cf3c348d53f962ee388bdd6255520267152d5d12168616 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtqemud, name=rhosp17/openstack-nova-libvirt, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, vcs-type=git, build-date=2025-11-19T00:35:22Z, vendor=Red Hat, Inc., distribution-scope=public, description=Red Hat OpenStack Platform 17.1 nova-libvirt, 
com.redhat.component=openstack-nova-libvirt-container, release=1761123044, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, io.openshift.expose-services=, container_name=nova_virtqemud, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, managed_by=tripleo_ansible, konflux.additional-tags=17.1.12 17.1_20251118.1, config_id=tripleo_step3, tcib_managed=true, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, version=17.1.12, config_data={'cgroupns': 'host', 'depends_on': ['tripleo_nova_virtlogd_wrapper.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '879500e96bf8dfb93687004bd86f2317'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'pids_limit': 65536, 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 4, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', 
'/sys/fs/selinux:/sys/fs/selinux', '/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/var/lib/kolla/config_files/nova_virtqemud.json:/var/lib/kolla/config_files/config.json:ro', '/var/log/containers/libvirt/swtpm:/var/log/swtpm:z']}, io.buildah.version=1.41.4, url=https://www.redhat.com, maintainer=OpenStack TripleO Team) Dec 15 04:08:01 localhost podman[110141]: nova_virtqemud Dec 15 04:08:01 localhost podman[110155]: 2025-12-15 09:08:01.249789418 +0000 UTC m=+0.078409376 container cleanup 17f8fb766dee5ec6e0cf3c348d53f962ee388bdd6255520267152d5d12168616 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtqemud, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 nova-libvirt, name=rhosp17/openstack-nova-libvirt, version=17.1.12, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, tcib_managed=true, build-date=2025-11-19T00:35:22Z, konflux.additional-tags=17.1.12 17.1_20251118.1, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, vendor=Red Hat, Inc., config_data={'cgroupns': 'host', 'depends_on': ['tripleo_nova_virtlogd_wrapper.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '879500e96bf8dfb93687004bd86f2317'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'pids_limit': 
65536, 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 4, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', '/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/var/lib/kolla/config_files/nova_virtqemud.json:/var/lib/kolla/config_files/config.json:ro', '/var/log/containers/libvirt/swtpm:/var/log/swtpm:z']}, url=https://www.redhat.com, io.buildah.version=1.41.4, container_name=nova_virtqemud, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, batch=17.1_20251118.1, architecture=x86_64, maintainer=OpenStack TripleO Team, release=1761123044, com.redhat.component=openstack-nova-libvirt-container, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, 
config_id=tripleo_step3, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git) Dec 15 04:08:01 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:fc:1f:13 MACDST=fa:16:3e:b3:bb:e3 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=13120 DF PROTO=TCP SPT=35874 DPT=9102 SEQ=186911337 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A83868E650000000001030307) Dec 15 04:08:02 localhost systemd[1]: var-lib-containers-storage-overlay-ca79fd7369f4c32cc46dcf947016bb386f5de0d0f04a624169cf5c85994521b4-merged.mount: Deactivated successfully. Dec 15 04:08:02 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-17f8fb766dee5ec6e0cf3c348d53f962ee388bdd6255520267152d5d12168616-userdata-shm.mount: Deactivated successfully. Dec 15 04:08:03 localhost systemd[1]: Started /usr/bin/podman healthcheck run 2df8d20c6ba32f38bd0e6519c9c77a4146b632f21c81197d8c319b223aad81f1. Dec 15 04:08:03 localhost systemd[1]: Started /usr/bin/podman healthcheck run 4d35b7dc3f3a4e3c60661e85d3511f50804e87244d74cab67af5c6e8c447d379. 
Dec 15 04:08:03 localhost podman[110171]: 2025-12-15 09:08:03.752036741 +0000 UTC m=+0.085673290 container health_status 2df8d20c6ba32f38bd0e6519c9c77a4146b632f21c81197d8c319b223aad81f1 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=unhealthy, container_name=ovn_controller, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vendor=Red Hat, Inc., version=17.1.12, io.buildah.version=1.41.4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, build-date=2025-11-18T23:34:05Z, com.redhat.component=openstack-ovn-controller-container, tcib_managed=true, architecture=x86_64, vcs-type=git, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, managed_by=tripleo_ansible, konflux.additional-tags=17.1.12 17.1_20251118.1, config_id=tripleo_step4, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 ovn-controller, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, name=rhosp17/openstack-ovn-controller, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 
openstack-ovn-controller, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, batch=17.1_20251118.1, release=1761123044, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 ovn-controller) Dec 15 04:08:03 localhost systemd[1]: tmp-crun.LPEUE9.mount: Deactivated successfully. Dec 15 04:08:03 localhost podman[110172]: 2025-12-15 09:08:03.800208318 +0000 UTC m=+0.131243928 container health_status 4d35b7dc3f3a4e3c60661e85d3511f50804e87244d74cab67af5c6e8c447d379 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=unhealthy, architecture=x86_64, distribution-scope=public, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, url=https://www.redhat.com, tcib_managed=true, name=rhosp17/openstack-neutron-metadata-agent-ovn, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, release=1761123044, build-date=2025-11-19T00:14:25Z, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, konflux.additional-tags=17.1.12 17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, batch=17.1_20251118.1, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, container_name=ovn_metadata_agent, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'a56a6f14b467cd9064e40c03defa5ed7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': 
['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, version=17.1.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, maintainer=OpenStack TripleO Team, config_id=tripleo_step4, io.buildah.version=1.41.4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-type=git, managed_by=tripleo_ansible, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn) Dec 15 04:08:03 localhost podman[110172]: 2025-12-15 09:08:03.817347586 +0000 UTC m=+0.148383226 container exec_died 4d35b7dc3f3a4e3c60661e85d3511f50804e87244d74cab67af5c6e8c447d379 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step4, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, 
org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, tcib_managed=true, architecture=x86_64, name=rhosp17/openstack-neutron-metadata-agent-ovn, io.buildah.version=1.41.4, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'a56a6f14b467cd9064e40c03defa5ed7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-19T00:14:25Z, io.openshift.expose-services=, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, version=17.1.12, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, 
batch=17.1_20251118.1, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-type=git, konflux.additional-tags=17.1.12 17.1_20251118.1, release=1761123044, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, container_name=ovn_metadata_agent) Dec 15 04:08:03 localhost podman[110172]: unhealthy Dec 15 04:08:03 localhost systemd[1]: 4d35b7dc3f3a4e3c60661e85d3511f50804e87244d74cab67af5c6e8c447d379.service: Main process exited, code=exited, status=1/FAILURE Dec 15 04:08:03 localhost systemd[1]: 4d35b7dc3f3a4e3c60661e85d3511f50804e87244d74cab67af5c6e8c447d379.service: Failed with result 'exit-code'. Dec 15 04:08:03 localhost podman[110171]: 2025-12-15 09:08:03.874189194 +0000 UTC m=+0.207825763 container exec_died 2df8d20c6ba32f38bd0e6519c9c77a4146b632f21c81197d8c319b223aad81f1 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, batch=17.1_20251118.1, url=https://www.redhat.com, io.openshift.expose-services=, config_id=tripleo_step4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public, konflux.additional-tags=17.1.12 17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 ovn-controller, build-date=2025-11-18T23:34:05Z, version=17.1.12, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1761123044, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, tcib_managed=true, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, name=rhosp17/openstack-ovn-controller, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, com.redhat.component=openstack-ovn-controller-container, summary=Red Hat OpenStack 
Platform 17.1 ovn-controller, vcs-type=git, container_name=ovn_controller, vendor=Red Hat, Inc., io.buildah.version=1.41.4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, architecture=x86_64, managed_by=tripleo_ansible, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272) Dec 15 04:08:03 localhost podman[110171]: unhealthy Dec 15 04:08:03 localhost systemd[1]: 2df8d20c6ba32f38bd0e6519c9c77a4146b632f21c81197d8c319b223aad81f1.service: Main process exited, code=exited, status=1/FAILURE Dec 15 04:08:03 localhost systemd[1]: 2df8d20c6ba32f38bd0e6519c9c77a4146b632f21c81197d8c319b223aad81f1.service: Failed with result 'exit-code'. 
Dec 15 04:08:04 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:fc:1f:13 MACDST=fa:16:3e:b3:bb:e3 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=49015 DF PROTO=TCP SPT=50288 DPT=9882 SEQ=1317791284 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A83869A250000000001030307) Dec 15 04:08:07 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:fc:1f:13 MACDST=fa:16:3e:b3:bb:e3 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=11906 DF PROTO=TCP SPT=55756 DPT=9105 SEQ=1454720423 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A8386A5A50000000001030307) Dec 15 04:08:11 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:fc:1f:13 MACDST=fa:16:3e:b3:bb:e3 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=64876 DF PROTO=TCP SPT=40082 DPT=9101 SEQ=1367310476 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A8386B3250000000001030307) Dec 15 04:08:17 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:fc:1f:13 MACDST=fa:16:3e:b3:bb:e3 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=49017 DF PROTO=TCP SPT=50288 DPT=9882 SEQ=1317791284 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A8386CB260000000001030307) Dec 15 04:08:20 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:fc:1f:13 MACDST=fa:16:3e:b3:bb:e3 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=11908 DF PROTO=TCP SPT=55756 DPT=9105 SEQ=1454720423 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A8386D5250000000001030307) Dec 15 04:08:23 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:fc:1f:13 MACDST=fa:16:3e:b3:bb:e3 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=31159 DF PROTO=TCP SPT=60896 DPT=9100 SEQ=1837319215 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT 
(020405500402080A8386E1930000000001030307) Dec 15 04:08:24 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:fc:1f:13 MACDST=fa:16:3e:b3:bb:e3 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=31160 DF PROTO=TCP SPT=60896 DPT=9100 SEQ=1837319215 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A8386E5A50000000001030307) Dec 15 04:08:25 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:fc:1f:13 MACDST=fa:16:3e:b3:bb:e3 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=4492 DF PROTO=TCP SPT=58768 DPT=9102 SEQ=453604489 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A8386EBE50000000001030307) Dec 15 04:08:28 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:fc:1f:13 MACDST=fa:16:3e:b3:bb:e3 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=28426 DF PROTO=TCP SPT=34112 DPT=9101 SEQ=1383681590 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A8386F7A50000000001030307) Dec 15 04:08:31 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:fc:1f:13 MACDST=fa:16:3e:b3:bb:e3 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=29102 DF PROTO=TCP SPT=52162 DPT=9882 SEQ=1079473667 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A8387033A0000000001030307) Dec 15 04:08:33 localhost systemd[1]: Started /usr/bin/podman healthcheck run 2df8d20c6ba32f38bd0e6519c9c77a4146b632f21c81197d8c319b223aad81f1. Dec 15 04:08:33 localhost systemd[1]: Started /usr/bin/podman healthcheck run 4d35b7dc3f3a4e3c60661e85d3511f50804e87244d74cab67af5c6e8c447d379. Dec 15 04:08:34 localhost systemd[1]: tmp-crun.0eRteZ.mount: Deactivated successfully. 
Dec 15 04:08:34 localhost podman[110212]: 2025-12-15 09:08:34.004816708 +0000 UTC m=+0.080871862 container health_status 2df8d20c6ba32f38bd0e6519c9c77a4146b632f21c81197d8c319b223aad81f1 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=unhealthy, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20251118.1, vendor=Red Hat, Inc., release=1761123044, version=17.1.12, architecture=x86_64, io.openshift.expose-services=, build-date=2025-11-18T23:34:05Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 ovn-controller, container_name=ovn_controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 ovn-controller, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, config_id=tripleo_step4, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, name=rhosp17/openstack-ovn-controller, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, io.buildah.version=1.41.4, vcs-type=git, com.redhat.component=openstack-ovn-controller-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, maintainer=OpenStack TripleO Team, distribution-scope=public, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05)
Dec 15 04:08:34 localhost podman[110212]: 2025-12-15 09:08:34.02436044 +0000 UTC m=+0.100415604 container exec_died 2df8d20c6ba32f38bd0e6519c9c77a4146b632f21c81197d8c319b223aad81f1 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, architecture=x86_64, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, url=https://www.redhat.com, container_name=ovn_controller, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, release=1761123044, io.buildah.version=1.41.4, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, com.redhat.component=openstack-ovn-controller-container, description=Red Hat OpenStack Platform 17.1 ovn-controller, name=rhosp17/openstack-ovn-controller, summary=Red Hat OpenStack Platform 17.1 ovn-controller, vendor=Red Hat, Inc., version=17.1.12, konflux.additional-tags=17.1.12 17.1_20251118.1, build-date=2025-11-18T23:34:05Z, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, batch=17.1_20251118.1, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, vcs-type=git, io.openshift.expose-services=)
Dec 15 04:08:34 localhost podman[110213]: 2025-12-15 09:08:34.060251809 +0000 UTC m=+0.133456987 container health_status 4d35b7dc3f3a4e3c60661e85d3511f50804e87244d74cab67af5c6e8c447d379 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=unhealthy, build-date=2025-11-19T00:14:25Z, vcs-type=git, managed_by=tripleo_ansible, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'a56a6f14b467cd9064e40c03defa5ed7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vendor=Red Hat, Inc., io.buildah.version=1.41.4, architecture=x86_64, config_id=tripleo_step4, release=1761123044, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, version=17.1.12, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, distribution-scope=public, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, name=rhosp17/openstack-neutron-metadata-agent-ovn, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, batch=17.1_20251118.1, container_name=ovn_metadata_agent, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn)
Dec 15 04:08:34 localhost podman[110212]: unhealthy
Dec 15 04:08:34 localhost systemd[1]: 2df8d20c6ba32f38bd0e6519c9c77a4146b632f21c81197d8c319b223aad81f1.service: Main process exited, code=exited, status=1/FAILURE
Dec 15 04:08:34 localhost systemd[1]: 2df8d20c6ba32f38bd0e6519c9c77a4146b632f21c81197d8c319b223aad81f1.service: Failed with result 'exit-code'.
Dec 15 04:08:34 localhost podman[110213]: 2025-12-15 09:08:34.102459997 +0000 UTC m=+0.175665205 container exec_died 4d35b7dc3f3a4e3c60661e85d3511f50804e87244d74cab67af5c6e8c447d379 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, build-date=2025-11-19T00:14:25Z, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, konflux.additional-tags=17.1.12 17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.expose-services=, container_name=ovn_metadata_agent, release=1761123044, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, batch=17.1_20251118.1, io.buildah.version=1.41.4, vendor=Red Hat, Inc., vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, name=rhosp17/openstack-neutron-metadata-agent-ovn, vcs-type=git, url=https://www.redhat.com, config_id=tripleo_step4, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'a56a6f14b467cd9064e40c03defa5ed7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, tcib_managed=true, managed_by=tripleo_ansible, version=17.1.12, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c)
Dec 15 04:08:34 localhost podman[110213]: unhealthy
Dec 15 04:08:34 localhost systemd[1]: 4d35b7dc3f3a4e3c60661e85d3511f50804e87244d74cab67af5c6e8c447d379.service: Main process exited, code=exited, status=1/FAILURE
Dec 15 04:08:34 localhost systemd[1]: 4d35b7dc3f3a4e3c60661e85d3511f50804e87244d74cab67af5c6e8c447d379.service: Failed with result 'exit-code'.
Dec 15 04:08:34 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:fc:1f:13 MACDST=fa:16:3e:b3:bb:e3 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=29104 DF PROTO=TCP SPT=52162 DPT=9882 SEQ=1079473667 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A83870F250000000001030307)
Dec 15 04:08:37 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:fc:1f:13 MACDST=fa:16:3e:b3:bb:e3 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=54252 DF PROTO=TCP SPT=58732 DPT=9105 SEQ=997889004 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A83871AE50000000001030307)
Dec 15 04:08:41 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:fc:1f:13 MACDST=fa:16:3e:b3:bb:e3 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=28428 DF PROTO=TCP SPT=34112 DPT=9101 SEQ=1383681590 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A838727250000000001030307)
Dec 15 04:08:47 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:fc:1f:13 MACDST=fa:16:3e:b3:bb:e3 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=29106 DF PROTO=TCP SPT=52162 DPT=9882 SEQ=1079473667 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A83873F250000000001030307)
Dec 15 04:08:50 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:fc:1f:13 MACDST=fa:16:3e:b3:bb:e3 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=54254 DF PROTO=TCP SPT=58732 DPT=9105 SEQ=997889004 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A83874B250000000001030307)
Dec 15 04:08:51 localhost sshd[110328]: main: sshd: ssh-rsa algorithm is disabled
Dec 15 04:08:53 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:fc:1f:13 MACDST=fa:16:3e:b3:bb:e3 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=64353 DF PROTO=TCP SPT=59736 DPT=9100 SEQ=3522399149 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A838756C30000000001030307)
Dec 15 04:08:54 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:fc:1f:13 MACDST=fa:16:3e:b3:bb:e3 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=64354 DF PROTO=TCP SPT=59736 DPT=9100 SEQ=3522399149 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A83875AE50000000001030307)
Dec 15 04:08:56 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:fc:1f:13 MACDST=fa:16:3e:b3:bb:e3 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=64355 DF PROTO=TCP SPT=59736 DPT=9100 SEQ=3522399149 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A838762E50000000001030307)
Dec 15 04:08:58 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:fc:1f:13 MACDST=fa:16:3e:b3:bb:e3 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=30346 DF PROTO=TCP SPT=56300 DPT=9101 SEQ=1431061397 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A83876CE60000000001030307)
Dec 15 04:09:01 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:fc:1f:13 MACDST=fa:16:3e:b3:bb:e3 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=33550 DF PROTO=TCP SPT=57908 DPT=9102 SEQ=3423404764 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A838778A60000000001030307)
Dec 15 04:09:04 localhost systemd[1]: Started /usr/bin/podman healthcheck run 2df8d20c6ba32f38bd0e6519c9c77a4146b632f21c81197d8c319b223aad81f1.
Dec 15 04:09:04 localhost systemd[1]: Started /usr/bin/podman healthcheck run 4d35b7dc3f3a4e3c60661e85d3511f50804e87244d74cab67af5c6e8c447d379.
Dec 15 04:09:04 localhost podman[110331]: 2025-12-15 09:09:04.510201693 +0000 UTC m=+0.082499636 container health_status 4d35b7dc3f3a4e3c60661e85d3511f50804e87244d74cab67af5c6e8c447d379 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=unhealthy, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, batch=17.1_20251118.1, io.buildah.version=1.41.4, distribution-scope=public, container_name=ovn_metadata_agent, vendor=Red Hat, Inc., architecture=x86_64, url=https://www.redhat.com, name=rhosp17/openstack-neutron-metadata-agent-ovn, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1761123044, vcs-type=git, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'a56a6f14b467cd9064e40c03defa5ed7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_id=tripleo_step4, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, build-date=2025-11-19T00:14:25Z)
Dec 15 04:09:04 localhost podman[110330]: 2025-12-15 09:09:04.555245513 +0000 UTC m=+0.130381131 container health_status 2df8d20c6ba32f38bd0e6519c9c77a4146b632f21c81197d8c319b223aad81f1 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=unhealthy, name=rhosp17/openstack-ovn-controller, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, distribution-scope=public, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, version=17.1.12, description=Red Hat OpenStack Platform 17.1 ovn-controller, io.buildah.version=1.41.4, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, konflux.additional-tags=17.1.12 17.1_20251118.1, batch=17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, summary=Red Hat OpenStack Platform 17.1 ovn-controller, architecture=x86_64, config_id=tripleo_step4, io.openshift.expose-services=, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, vendor=Red Hat, Inc., container_name=ovn_controller, maintainer=OpenStack TripleO Team, build-date=2025-11-18T23:34:05Z, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, managed_by=tripleo_ansible, release=1761123044, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.component=openstack-ovn-controller-container, vcs-type=git)
Dec 15 04:09:04 localhost podman[110331]: 2025-12-15 09:09:04.577324034 +0000 UTC m=+0.149622027 container exec_died 4d35b7dc3f3a4e3c60661e85d3511f50804e87244d74cab67af5c6e8c447d379 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, tcib_managed=true, container_name=ovn_metadata_agent, name=rhosp17/openstack-neutron-metadata-agent-ovn, build-date=2025-11-19T00:14:25Z, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'a56a6f14b467cd9064e40c03defa5ed7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vendor=Red Hat, Inc., konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, vcs-type=git, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, release=1761123044, config_id=tripleo_step4, distribution-scope=public, io.openshift.expose-services=, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, version=17.1.12, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, batch=17.1_20251118.1, io.buildah.version=1.41.4, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, architecture=x86_64)
Dec 15 04:09:04 localhost podman[110331]: unhealthy
Dec 15 04:09:04 localhost systemd[1]: 4d35b7dc3f3a4e3c60661e85d3511f50804e87244d74cab67af5c6e8c447d379.service: Main process exited, code=exited, status=1/FAILURE
Dec 15 04:09:04 localhost systemd[1]: 4d35b7dc3f3a4e3c60661e85d3511f50804e87244d74cab67af5c6e8c447d379.service: Failed with result 'exit-code'.
Dec 15 04:09:04 localhost podman[110330]: 2025-12-15 09:09:04.594906929 +0000 UTC m=+0.170042527 container exec_died 2df8d20c6ba32f38bd0e6519c9c77a4146b632f21c81197d8c319b223aad81f1 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, io.buildah.version=1.41.4, summary=Red Hat OpenStack Platform 17.1 ovn-controller, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, description=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.component=openstack-ovn-controller-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp17/openstack-ovn-controller, version=17.1.12, build-date=2025-11-18T23:34:05Z, release=1761123044, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, batch=17.1_20251118.1, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, container_name=ovn_controller, distribution-scope=public, managed_by=tripleo_ansible, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.expose-services=, config_id=tripleo_step4, architecture=x86_64, konflux.additional-tags=17.1.12 17.1_20251118.1)
Dec 15 04:09:04 localhost podman[110330]: unhealthy
Dec 15 04:09:04 localhost systemd[1]: 2df8d20c6ba32f38bd0e6519c9c77a4146b632f21c81197d8c319b223aad81f1.service: Main process exited, code=exited, status=1/FAILURE
Dec 15 04:09:04 localhost systemd[1]: 2df8d20c6ba32f38bd0e6519c9c77a4146b632f21c81197d8c319b223aad81f1.service: Failed with result 'exit-code'.
Dec 15 04:09:04 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:fc:1f:13 MACDST=fa:16:3e:b3:bb:e3 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=26602 DF PROTO=TCP SPT=37364 DPT=9882 SEQ=2974762397 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A838784650000000001030307)
Dec 15 04:09:07 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:fc:1f:13 MACDST=fa:16:3e:b3:bb:e3 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=34092 DF PROTO=TCP SPT=40782 DPT=9105 SEQ=2885882937 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A838790250000000001030307)
Dec 15 04:09:11 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:fc:1f:13 MACDST=fa:16:3e:b3:bb:e3 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=30348 DF PROTO=TCP SPT=56300 DPT=9101 SEQ=1431061397 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A83879D250000000001030307)
Dec 15 04:09:17 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:fc:1f:13 MACDST=fa:16:3e:b3:bb:e3 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=26604 DF PROTO=TCP SPT=37364 DPT=9882 SEQ=2974762397 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A8387B5250000000001030307)
Dec 15 04:09:20 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:fc:1f:13 MACDST=fa:16:3e:b3:bb:e3 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=34094 DF PROTO=TCP SPT=40782 DPT=9105 SEQ=2885882937 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A8387C1250000000001030307)
Dec 15 04:09:23 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:fc:1f:13 MACDST=fa:16:3e:b3:bb:e3 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=752 DF PROTO=TCP SPT=60718 DPT=9100 SEQ=3673610027 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A8387CBF30000000001030307)
Dec 15 04:09:24 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:fc:1f:13 MACDST=fa:16:3e:b3:bb:e3 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=753 DF PROTO=TCP SPT=60718 DPT=9100 SEQ=3673610027 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A8387CFE50000000001030307)
Dec 15 04:09:25 localhost systemd[1]: tripleo_nova_virtqemud.service: State 'stop-sigterm' timed out. Killing.
Dec 15 04:09:25 localhost systemd[1]: tripleo_nova_virtqemud.service: Killing process 61845 (conmon) with signal SIGKILL.
Dec 15 04:09:25 localhost systemd[1]: tripleo_nova_virtqemud.service: Main process exited, code=killed, status=9/KILL
Dec 15 04:09:25 localhost systemd[1]: libpod-conmon-17f8fb766dee5ec6e0cf3c348d53f962ee388bdd6255520267152d5d12168616.scope: Deactivated successfully.
Dec 15 04:09:25 localhost systemd[1]: tmp-crun.1gXzAG.mount: Deactivated successfully.
Dec 15 04:09:25 localhost podman[110380]: error opening file `/run/crun/17f8fb766dee5ec6e0cf3c348d53f962ee388bdd6255520267152d5d12168616/status`: No such file or directory
Dec 15 04:09:25 localhost podman[110368]: 2025-12-15 09:09:25.503339422 +0000 UTC m=+0.084164221 container cleanup 17f8fb766dee5ec6e0cf3c348d53f962ee388bdd6255520267152d5d12168616 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtqemud, io.buildah.version=1.41.4, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, description=Red Hat OpenStack Platform 17.1 nova-libvirt, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_id=tripleo_step3, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, com.redhat.component=openstack-nova-libvirt-container, release=1761123044, version=17.1.12, konflux.additional-tags=17.1.12 17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vendor=Red Hat, Inc., vcs-type=git, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'cgroupns': 'host', 'depends_on': ['tripleo_nova_virtlogd_wrapper.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '879500e96bf8dfb93687004bd86f2317'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'pids_limit': 65536, 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 4, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', '/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/var/lib/kolla/config_files/nova_virtqemud.json:/var/lib/kolla/config_files/config.json:ro', '/var/log/containers/libvirt/swtpm:/var/log/swtpm:z']}, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, batch=17.1_20251118.1, build-date=2025-11-19T00:35:22Z, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, name=rhosp17/openstack-nova-libvirt, managed_by=tripleo_ansible, tcib_managed=true, url=https://www.redhat.com, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, container_name=nova_virtqemud)
Dec 15 04:09:25 localhost podman[110368]: nova_virtqemud
Dec 15 04:09:25 localhost systemd[1]: tripleo_nova_virtqemud.service: Failed with result 'timeout'.
Dec 15 04:09:25 localhost systemd[1]: Stopped nova_virtqemud container.
Dec 15 04:09:26 localhost python3.9[110473]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_virtqemud_recover.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 15 04:09:26 localhost systemd[1]: Reloading.
Dec 15 04:09:26 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:fc:1f:13 MACDST=fa:16:3e:b3:bb:e3 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=754 DF PROTO=TCP SPT=60718 DPT=9100 SEQ=3673610027 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A8387D7E50000000001030307)
Dec 15 04:09:26 localhost systemd-sysv-generator[110506]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 15 04:09:26 localhost systemd-rc-local-generator[110503]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 15 04:09:26 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Dec 15 04:09:27 localhost python3.9[110603]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_virtsecretd.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 15 04:09:27 localhost systemd[1]: Reloading.
Dec 15 04:09:27 localhost systemd-sysv-generator[110631]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 15 04:09:27 localhost systemd-rc-local-generator[110627]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 15 04:09:27 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Dec 15 04:09:27 localhost systemd[1]: Stopping nova_virtsecretd container...
Dec 15 04:09:27 localhost systemd[1]: libpod-79c86d80f06802ead502598f2b91dbc5585606176da16f7812de83462116f050.scope: Deactivated successfully.
Dec 15 04:09:27 localhost podman[110643]: 2025-12-15 09:09:27.670685724 +0000 UTC m=+0.077891766 container died 79c86d80f06802ead502598f2b91dbc5585606176da16f7812de83462116f050 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtsecretd, io.buildah.version=1.41.4, build-date=2025-11-19T00:35:22Z, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 nova-libvirt, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, name=rhosp17/openstack-nova-libvirt, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, url=https://www.redhat.com, config_data={'cgroupns': 'host', 'depends_on': ['tripleo_nova_virtlogd_wrapper.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '879500e96bf8dfb93687004bd86f2317'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'pids_limit': 65536, 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 1, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', '/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', 
'/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/var/lib/kolla/config_files/nova_virtsecretd.json:/var/lib/kolla/config_files/config.json:ro']}, distribution-scope=public, container_name=nova_virtsecretd, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, vendor=Red Hat, Inc., com.redhat.component=openstack-nova-libvirt-container, io.openshift.expose-services=, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.12 17.1_20251118.1, architecture=x86_64, config_id=tripleo_step3, batch=17.1_20251118.1, release=1761123044, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, version=17.1.12, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, vcs-type=git) Dec 15 04:09:27 localhost podman[110643]: 2025-12-15 09:09:27.705372098 +0000 UTC m=+0.112578130 container cleanup 79c86d80f06802ead502598f2b91dbc5585606176da16f7812de83462116f050 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtsecretd, io.openshift.expose-services=, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 nova-libvirt, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, maintainer=OpenStack TripleO Team, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, architecture=x86_64, 
managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.component=openstack-nova-libvirt-container, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, batch=17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, url=https://www.redhat.com, config_data={'cgroupns': 'host', 'depends_on': ['tripleo_nova_virtlogd_wrapper.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '879500e96bf8dfb93687004bd86f2317'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'pids_limit': 65536, 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 1, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', '/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', 
'/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/var/lib/kolla/config_files/nova_virtsecretd.json:/var/lib/kolla/config_files/config.json:ro']}, name=rhosp17/openstack-nova-libvirt, vendor=Red Hat, Inc., container_name=nova_virtsecretd, version=17.1.12, build-date=2025-11-19T00:35:22Z, release=1761123044, konflux.additional-tags=17.1.12 17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, config_id=tripleo_step3) Dec 15 04:09:27 localhost podman[110643]: nova_virtsecretd Dec 15 04:09:27 localhost podman[110656]: 2025-12-15 09:09:27.756169078 +0000 UTC m=+0.072648796 container cleanup 79c86d80f06802ead502598f2b91dbc5585606176da16f7812de83462116f050 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtsecretd, managed_by=tripleo_ansible, architecture=x86_64, name=rhosp17/openstack-nova-libvirt, vendor=Red Hat, Inc., version=17.1.12, io.buildah.version=1.41.4, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, release=1761123044, io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, batch=17.1_20251118.1, build-date=2025-11-19T00:35:22Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, vcs-type=git, com.redhat.component=openstack-nova-libvirt-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_id=tripleo_step3, config_data={'cgroupns': 'host', 'depends_on': ['tripleo_nova_virtlogd_wrapper.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 
'879500e96bf8dfb93687004bd86f2317'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'pids_limit': 65536, 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 1, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', '/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/var/lib/kolla/config_files/nova_virtsecretd.json:/var/lib/kolla/config_files/config.json:ro']}, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, container_name=nova_virtsecretd, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, description=Red Hat OpenStack Platform 17.1 nova-libvirt, summary=Red Hat OpenStack Platform 17.1 nova-libvirt) Dec 15 
04:09:27 localhost systemd[1]: libpod-conmon-79c86d80f06802ead502598f2b91dbc5585606176da16f7812de83462116f050.scope: Deactivated successfully. Dec 15 04:09:27 localhost podman[110686]: error opening file `/run/crun/79c86d80f06802ead502598f2b91dbc5585606176da16f7812de83462116f050/status`: No such file or directory Dec 15 04:09:27 localhost podman[110673]: 2025-12-15 09:09:27.854721799 +0000 UTC m=+0.067841752 container cleanup 79c86d80f06802ead502598f2b91dbc5585606176da16f7812de83462116f050 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtsecretd, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, config_id=tripleo_step3, config_data={'cgroupns': 'host', 'depends_on': ['tripleo_nova_virtlogd_wrapper.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '879500e96bf8dfb93687004bd86f2317'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'pids_limit': 65536, 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 1, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', '/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', 
'/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/var/lib/kolla/config_files/nova_virtsecretd.json:/var/lib/kolla/config_files/config.json:ro']}, batch=17.1_20251118.1, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 nova-libvirt, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, konflux.additional-tags=17.1.12 17.1_20251118.1, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, release=1761123044, name=rhosp17/openstack-nova-libvirt, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, build-date=2025-11-19T00:35:22Z, container_name=nova_virtsecretd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.openshift.expose-services=, distribution-scope=public, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, vendor=Red Hat, Inc., version=17.1.12, com.redhat.component=openstack-nova-libvirt-container, tcib_managed=true, vcs-type=git) Dec 15 04:09:27 localhost podman[110673]: nova_virtsecretd Dec 15 04:09:27 localhost systemd[1]: tripleo_nova_virtsecretd.service: Deactivated successfully. Dec 15 04:09:27 localhost systemd[1]: Stopped nova_virtsecretd container. 
Dec 15 04:09:28 localhost python3.9[110779]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_virtstoraged.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None Dec 15 04:09:28 localhost systemd[1]: Reloading. Dec 15 04:09:28 localhost systemd-sysv-generator[110806]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. Dec 15 04:09:28 localhost systemd-rc-local-generator[110802]: /etc/rc.d/rc.local is not marked executable, skipping. Dec 15 04:09:28 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Dec 15 04:09:28 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-79c86d80f06802ead502598f2b91dbc5585606176da16f7812de83462116f050-userdata-shm.mount: Deactivated successfully. Dec 15 04:09:28 localhost systemd[1]: var-lib-containers-storage-overlay-8d291eddbcf4928f1b462d99ea0cfc000e1fa49a9a1356648b6b5b68c50a1cf0-merged.mount: Deactivated successfully. Dec 15 04:09:28 localhost systemd[1]: Stopping nova_virtstoraged container... Dec 15 04:09:28 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:fc:1f:13 MACDST=fa:16:3e:b3:bb:e3 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=5050 DF PROTO=TCP SPT=49464 DPT=9101 SEQ=528366992 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A8387E1E50000000001030307) Dec 15 04:09:28 localhost systemd[1]: libpod-68297132a91cfc5a6c8079435c9fe1de3a122b558a334b9752ca0b9c9ef20016.scope: Deactivated successfully. 
Dec 15 04:09:28 localhost podman[110820]: 2025-12-15 09:09:28.945797319 +0000 UTC m=+0.075068471 container died 68297132a91cfc5a6c8079435c9fe1de3a122b558a334b9752ca0b9c9ef20016 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtstoraged, container_name=nova_virtstoraged, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, description=Red Hat OpenStack Platform 17.1 nova-libvirt, vcs-type=git, vendor=Red Hat, Inc., tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, managed_by=tripleo_ansible, io.buildah.version=1.41.4, com.redhat.component=openstack-nova-libvirt-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.12, config_id=tripleo_step3, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, build-date=2025-11-19T00:35:22Z, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, config_data={'cgroupns': 'host', 'depends_on': ['tripleo_nova_virtlogd_wrapper.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '879500e96bf8dfb93687004bd86f2317'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'pids_limit': 65536, 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 3, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', '/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/var/lib/kolla/config_files/nova_virtstoraged.json:/var/lib/kolla/config_files/config.json:ro']}, release=1761123044, name=rhosp17/openstack-nova-libvirt, batch=17.1_20251118.1, architecture=x86_64, distribution-scope=public, url=https://www.redhat.com) Dec 15 04:09:28 localhost podman[110820]: 2025-12-15 09:09:28.979285873 +0000 UTC m=+0.108557005 container cleanup 68297132a91cfc5a6c8079435c9fe1de3a122b558a334b9752ca0b9c9ef20016 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtstoraged, vcs-type=git, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, tcib_managed=true, com.redhat.component=openstack-nova-libvirt-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, managed_by=tripleo_ansible, io.openshift.expose-services=, build-date=2025-11-19T00:35:22Z, 
container_name=nova_virtstoraged, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp17/openstack-nova-libvirt, architecture=x86_64, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 nova-libvirt, vendor=Red Hat, Inc., version=17.1.12, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, maintainer=OpenStack TripleO Team, config_data={'cgroupns': 'host', 'depends_on': ['tripleo_nova_virtlogd_wrapper.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '879500e96bf8dfb93687004bd86f2317'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'pids_limit': 65536, 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 3, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', '/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', 
'/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/var/lib/kolla/config_files/nova_virtstoraged.json:/var/lib/kolla/config_files/config.json:ro']}, config_id=tripleo_step3, batch=17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, url=https://www.redhat.com, io.buildah.version=1.41.4, release=1761123044, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt) Dec 15 04:09:28 localhost podman[110820]: nova_virtstoraged Dec 15 04:09:29 localhost podman[110833]: 2025-12-15 09:09:29.028975184 +0000 UTC m=+0.077411394 container cleanup 68297132a91cfc5a6c8079435c9fe1de3a122b558a334b9752ca0b9c9ef20016 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtstoraged, vcs-type=git, com.redhat.component=openstack-nova-libvirt-container, release=1761123044, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, version=17.1.12, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, container_name=nova_virtstoraged, tcib_managed=true, vendor=Red Hat, Inc., batch=17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, url=https://www.redhat.com, config_id=tripleo_step3, distribution-scope=public, managed_by=tripleo_ansible, name=rhosp17/openstack-nova-libvirt, 
architecture=x86_64, io.buildah.version=1.41.4, konflux.additional-tags=17.1.12 17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, build-date=2025-11-19T00:35:22Z, description=Red Hat OpenStack Platform 17.1 nova-libvirt, config_data={'cgroupns': 'host', 'depends_on': ['tripleo_nova_virtlogd_wrapper.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '879500e96bf8dfb93687004bd86f2317'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'pids_limit': 65536, 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 3, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', '/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', 
'/var/lib/kolla/config_files/nova_virtstoraged.json:/var/lib/kolla/config_files/config.json:ro']}, summary=Red Hat OpenStack Platform 17.1 nova-libvirt) Dec 15 04:09:29 localhost systemd[1]: libpod-conmon-68297132a91cfc5a6c8079435c9fe1de3a122b558a334b9752ca0b9c9ef20016.scope: Deactivated successfully. Dec 15 04:09:29 localhost podman[110864]: error opening file `/run/crun/68297132a91cfc5a6c8079435c9fe1de3a122b558a334b9752ca0b9c9ef20016/status`: No such file or directory Dec 15 04:09:29 localhost podman[110852]: 2025-12-15 09:09:29.122902181 +0000 UTC m=+0.066241228 container cleanup 68297132a91cfc5a6c8079435c9fe1de3a122b558a334b9752ca0b9c9ef20016 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtstoraged, io.buildah.version=1.41.4, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, version=17.1.12, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, container_name=nova_virtstoraged, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, config_id=tripleo_step3, maintainer=OpenStack TripleO Team, batch=17.1_20251118.1, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, name=rhosp17/openstack-nova-libvirt, vendor=Red Hat, Inc., com.redhat.component=openstack-nova-libvirt-container, config_data={'cgroupns': 'host', 'depends_on': ['tripleo_nova_virtlogd_wrapper.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '879500e96bf8dfb93687004bd86f2317'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'pids_limit': 65536, 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 3, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', 
'/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', '/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/var/lib/kolla/config_files/nova_virtstoraged.json:/var/lib/kolla/config_files/config.json:ro']}, distribution-scope=public, io.openshift.expose-services=, konflux.additional-tags=17.1.12 17.1_20251118.1, release=1761123044, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 nova-libvirt, managed_by=tripleo_ansible, build-date=2025-11-19T00:35:22Z) Dec 15 04:09:29 localhost podman[110852]: nova_virtstoraged Dec 15 04:09:29 localhost systemd[1]: tripleo_nova_virtstoraged.service: Deactivated successfully. 
Dec 15 04:09:29 localhost systemd[1]: Stopped nova_virtstoraged container. Dec 15 04:09:29 localhost systemd[1]: var-lib-containers-storage-overlay-287e8a5e9612465394db15161b4361526f72e331708cc6adfea20036aeff367f-merged.mount: Deactivated successfully. Dec 15 04:09:29 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-68297132a91cfc5a6c8079435c9fe1de3a122b558a334b9752ca0b9c9ef20016-userdata-shm.mount: Deactivated successfully. Dec 15 04:09:29 localhost python3.9[110957]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_ovn_controller.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None Dec 15 04:09:30 localhost systemd[1]: Reloading. Dec 15 04:09:30 localhost systemd-sysv-generator[110984]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. Dec 15 04:09:30 localhost systemd-rc-local-generator[110980]: /etc/rc.d/rc.local is not marked executable, skipping. Dec 15 04:09:30 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Dec 15 04:09:30 localhost systemd[1]: Stopping ovn_controller container... Dec 15 04:09:30 localhost systemd[1]: libpod-2df8d20c6ba32f38bd0e6519c9c77a4146b632f21c81197d8c319b223aad81f1.scope: Deactivated successfully. Dec 15 04:09:30 localhost systemd[1]: libpod-2df8d20c6ba32f38bd0e6519c9c77a4146b632f21c81197d8c319b223aad81f1.scope: Consumed 2.808s CPU time. 
Dec 15 04:09:30 localhost podman[110998]: 2025-12-15 09:09:30.467034637 +0000 UTC m=+0.086495013 container died 2df8d20c6ba32f38bd0e6519c9c77a4146b632f21c81197d8c319b223aad81f1 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, config_id=tripleo_step4, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, name=rhosp17/openstack-ovn-controller, konflux.additional-tags=17.1.12 17.1_20251118.1, build-date=2025-11-18T23:34:05Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20251118.1, io.buildah.version=1.41.4, tcib_managed=true, vendor=Red Hat, Inc., config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, container_name=ovn_controller, distribution-scope=public, io.openshift.expose-services=, version=17.1.12, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, release=1761123044, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, vcs-type=git, description=Red Hat OpenStack Platform 17.1 ovn-controller, managed_by=tripleo_ansible, architecture=x86_64, com.redhat.component=openstack-ovn-controller-container, summary=Red Hat OpenStack Platform 17.1 ovn-controller, maintainer=OpenStack TripleO 
Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272) Dec 15 04:09:30 localhost systemd[1]: 2df8d20c6ba32f38bd0e6519c9c77a4146b632f21c81197d8c319b223aad81f1.timer: Deactivated successfully. Dec 15 04:09:30 localhost systemd[1]: Stopped /usr/bin/podman healthcheck run 2df8d20c6ba32f38bd0e6519c9c77a4146b632f21c81197d8c319b223aad81f1. Dec 15 04:09:30 localhost systemd[1]: 2df8d20c6ba32f38bd0e6519c9c77a4146b632f21c81197d8c319b223aad81f1.service: Failed to open /run/systemd/transient/2df8d20c6ba32f38bd0e6519c9c77a4146b632f21c81197d8c319b223aad81f1.service: No such file or directory Dec 15 04:09:30 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-2df8d20c6ba32f38bd0e6519c9c77a4146b632f21c81197d8c319b223aad81f1-userdata-shm.mount: Deactivated successfully. Dec 15 04:09:30 localhost podman[110998]: 2025-12-15 09:09:30.512381823 +0000 UTC m=+0.131842189 container cleanup 2df8d20c6ba32f38bd0e6519c9c77a4146b632f21c81197d8c319b223aad81f1 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, managed_by=tripleo_ansible, architecture=x86_64, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, container_name=ovn_controller, release=1761123044, io.buildah.version=1.41.4, description=Red Hat OpenStack Platform 17.1 ovn-controller, build-date=2025-11-18T23:34:05Z, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, konflux.additional-tags=17.1.12 17.1_20251118.1, name=rhosp17/openstack-ovn-controller, batch=17.1_20251118.1, io.openshift.expose-services=, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp 
openstack osp-17.1 openstack-ovn-controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, config_id=tripleo_step4, distribution-scope=public, tcib_managed=true, vendor=Red Hat, Inc., org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, maintainer=OpenStack TripleO Team, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, summary=Red Hat OpenStack Platform 17.1 ovn-controller, version=17.1.12, com.redhat.component=openstack-ovn-controller-container) Dec 15 04:09:30 localhost podman[110998]: ovn_controller Dec 15 04:09:30 localhost systemd[1]: 2df8d20c6ba32f38bd0e6519c9c77a4146b632f21c81197d8c319b223aad81f1.timer: Failed to open /run/systemd/transient/2df8d20c6ba32f38bd0e6519c9c77a4146b632f21c81197d8c319b223aad81f1.timer: No such file or directory Dec 15 04:09:30 localhost systemd[1]: 2df8d20c6ba32f38bd0e6519c9c77a4146b632f21c81197d8c319b223aad81f1.service: Failed to open /run/systemd/transient/2df8d20c6ba32f38bd0e6519c9c77a4146b632f21c81197d8c319b223aad81f1.service: No such file or directory Dec 15 04:09:30 localhost podman[111011]: 2025-12-15 09:09:30.554877715 +0000 UTC m=+0.073793998 container cleanup 2df8d20c6ba32f38bd0e6519c9c77a4146b632f21c81197d8c319b223aad81f1 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, description=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, 
cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, batch=17.1_20251118.1, url=https://www.redhat.com, io.buildah.version=1.41.4, version=17.1.12, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, summary=Red Hat OpenStack Platform 17.1 ovn-controller, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1761123044, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.expose-services=, container_name=ovn_controller, config_id=tripleo_step4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, managed_by=tripleo_ansible, build-date=2025-11-18T23:34:05Z, name=rhosp17/openstack-ovn-controller, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, distribution-scope=public, maintainer=OpenStack TripleO Team, architecture=x86_64, vcs-type=git, vendor=Red Hat, Inc., com.redhat.component=openstack-ovn-controller-container, konflux.additional-tags=17.1.12 17.1_20251118.1) Dec 15 04:09:30 localhost systemd[1]: libpod-conmon-2df8d20c6ba32f38bd0e6519c9c77a4146b632f21c81197d8c319b223aad81f1.scope: Deactivated successfully. 
Dec 15 04:09:30 localhost systemd[1]: 2df8d20c6ba32f38bd0e6519c9c77a4146b632f21c81197d8c319b223aad81f1.timer: Failed to open /run/systemd/transient/2df8d20c6ba32f38bd0e6519c9c77a4146b632f21c81197d8c319b223aad81f1.timer: No such file or directory Dec 15 04:09:30 localhost systemd[1]: 2df8d20c6ba32f38bd0e6519c9c77a4146b632f21c81197d8c319b223aad81f1.service: Failed to open /run/systemd/transient/2df8d20c6ba32f38bd0e6519c9c77a4146b632f21c81197d8c319b223aad81f1.service: No such file or directory Dec 15 04:09:30 localhost podman[111026]: 2025-12-15 09:09:30.644571581 +0000 UTC m=+0.053205475 container cleanup 2df8d20c6ba32f38bd0e6519c9c77a4146b632f21c81197d8c319b223aad81f1 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.expose-services=, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-ovn-controller-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, batch=17.1_20251118.1, summary=Red Hat OpenStack Platform 17.1 ovn-controller, vendor=Red Hat, Inc., url=https://www.redhat.com, tcib_managed=true, vcs-type=git, release=1761123044, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, konflux.additional-tags=17.1.12 17.1_20251118.1, build-date=2025-11-18T23:34:05Z, distribution-scope=public, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': 
['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, io.buildah.version=1.41.4, maintainer=OpenStack TripleO Team, config_id=tripleo_step4, version=17.1.12, description=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, container_name=ovn_controller, architecture=x86_64, name=rhosp17/openstack-ovn-controller, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream) Dec 15 04:09:30 localhost podman[111026]: ovn_controller Dec 15 04:09:30 localhost systemd[1]: tripleo_ovn_controller.service: Deactivated successfully. Dec 15 04:09:30 localhost systemd[1]: Stopped ovn_controller container. Dec 15 04:09:30 localhost systemd[1]: var-lib-containers-storage-overlay-ad536c756d37421165059407ca1ade816c0cce3c0bd15797d12fce327284d9de-merged.mount: Deactivated successfully. Dec 15 04:09:31 localhost python3.9[111130]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_ovn_metadata_agent.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None Dec 15 04:09:31 localhost systemd[1]: Reloading. Dec 15 04:09:31 localhost systemd-sysv-generator[111158]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. Dec 15 04:09:31 localhost systemd-rc-local-generator[111154]: /etc/rc.d/rc.local is not marked executable, skipping. Dec 15 04:09:31 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. 
Dec 15 04:09:31 localhost systemd[1]: Stopping ovn_metadata_agent container... Dec 15 04:09:31 localhost podman[111170]: 2025-12-15 09:09:31.768786176 +0000 UTC m=+0.075247616 container died 4d35b7dc3f3a4e3c60661e85d3511f50804e87244d74cab67af5c6e8c447d379 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'a56a6f14b467cd9064e40c03defa5ed7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', 
'/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, managed_by=tripleo_ansible, io.buildah.version=1.41.4, batch=17.1_20251118.1, build-date=2025-11-19T00:14:25Z, release=1761123044, url=https://www.redhat.com, name=rhosp17/openstack-neutron-metadata-agent-ovn, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, tcib_managed=true, container_name=ovn_metadata_agent, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, config_id=tripleo_step4, io.openshift.expose-services=, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.12 17.1_20251118.1, distribution-scope=public, version=17.1.12, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-type=git) Dec 15 04:09:31 localhost systemd[1]: libpod-4d35b7dc3f3a4e3c60661e85d3511f50804e87244d74cab67af5c6e8c447d379.scope: Deactivated successfully. Dec 15 04:09:31 localhost systemd[1]: libpod-4d35b7dc3f3a4e3c60661e85d3511f50804e87244d74cab67af5c6e8c447d379.scope: Consumed 11.565s CPU time. Dec 15 04:09:31 localhost systemd[1]: tmp-crun.Zfe4UT.mount: Deactivated successfully. Dec 15 04:09:31 localhost systemd[1]: 4d35b7dc3f3a4e3c60661e85d3511f50804e87244d74cab67af5c6e8c447d379.timer: Deactivated successfully. Dec 15 04:09:31 localhost systemd[1]: Stopped /usr/bin/podman healthcheck run 4d35b7dc3f3a4e3c60661e85d3511f50804e87244d74cab67af5c6e8c447d379. 
Dec 15 04:09:31 localhost systemd[1]: 4d35b7dc3f3a4e3c60661e85d3511f50804e87244d74cab67af5c6e8c447d379.service: Failed to open /run/systemd/transient/4d35b7dc3f3a4e3c60661e85d3511f50804e87244d74cab67af5c6e8c447d379.service: No such file or directory Dec 15 04:09:31 localhost systemd[1]: var-lib-containers-storage-overlay-efe3dd72d150707edc7bd6bf365298295ccd8023b849757f6c8f1cb42ddbcc93-merged.mount: Deactivated successfully. Dec 15 04:09:31 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-4d35b7dc3f3a4e3c60661e85d3511f50804e87244d74cab67af5c6e8c447d379-userdata-shm.mount: Deactivated successfully. Dec 15 04:09:31 localhost podman[111170]: 2025-12-15 09:09:31.845736095 +0000 UTC m=+0.152197475 container cleanup 4d35b7dc3f3a4e3c60661e85d3511f50804e87244d74cab67af5c6e8c447d379 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_id=tripleo_step4, konflux.additional-tags=17.1.12 17.1_20251118.1, tcib_managed=true, batch=17.1_20251118.1, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'a56a6f14b467cd9064e40c03defa5ed7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', 
'/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, release=1761123044, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, vcs-type=git, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, url=https://www.redhat.com, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.expose-services=, name=rhosp17/openstack-neutron-metadata-agent-ovn, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.buildah.version=1.41.4, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, build-date=2025-11-19T00:14:25Z, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, version=17.1.12, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, container_name=ovn_metadata_agent, architecture=x86_64) Dec 15 04:09:31 localhost podman[111170]: ovn_metadata_agent Dec 15 04:09:31 localhost systemd[1]: 4d35b7dc3f3a4e3c60661e85d3511f50804e87244d74cab67af5c6e8c447d379.timer: Failed to open /run/systemd/transient/4d35b7dc3f3a4e3c60661e85d3511f50804e87244d74cab67af5c6e8c447d379.timer: No such file or directory Dec 15 04:09:31 localhost systemd[1]: 
4d35b7dc3f3a4e3c60661e85d3511f50804e87244d74cab67af5c6e8c447d379.service: Failed to open /run/systemd/transient/4d35b7dc3f3a4e3c60661e85d3511f50804e87244d74cab67af5c6e8c447d379.service: No such file or directory Dec 15 04:09:31 localhost podman[111183]: 2025-12-15 09:09:31.866665158 +0000 UTC m=+0.085109387 container cleanup 4d35b7dc3f3a4e3c60661e85d3511f50804e87244d74cab67af5c6e8c447d379 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, konflux.additional-tags=17.1.12 17.1_20251118.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'a56a6f14b467cd9064e40c03defa5ed7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', 
'/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, name=rhosp17/openstack-neutron-metadata-agent-ovn, url=https://www.redhat.com, architecture=x86_64, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_id=tripleo_step4, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, batch=17.1_20251118.1, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, tcib_managed=true, distribution-scope=public, release=1761123044, container_name=ovn_metadata_agent, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, build-date=2025-11-19T00:14:25Z, io.buildah.version=1.41.4, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, io.openshift.expose-services=, vcs-type=git, managed_by=tripleo_ansible, version=17.1.12) Dec 15 04:09:31 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:fc:1f:13 MACDST=fa:16:3e:b3:bb:e3 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=62530 DF PROTO=TCP SPT=54934 DPT=9102 SEQ=4090041637 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A8387EDE60000000001030307) Dec 15 04:09:34 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:fc:1f:13 MACDST=fa:16:3e:b3:bb:e3 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=58649 DF PROTO=TCP SPT=32834 DPT=9882 SEQ=1240639400 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A8387F9A50000000001030307) Dec 15 04:09:37 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:fc:1f:13 
MACDST=fa:16:3e:b3:bb:e3 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=5156 DF PROTO=TCP SPT=35242 DPT=9105 SEQ=3136361097 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A838805270000000001030307) Dec 15 04:09:38 localhost podman[111305]: 2025-12-15 09:09:38.383013948 +0000 UTC m=+0.092442579 container exec 8dcda56b365b42dc8758aab77a9ec80db304780e449052738f7e4e648ae1ecaf (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-bce17446-41b5-5408-a23e-0b011906b44a-crash-np0005559462, url=https://catalog.redhat.com/en/search?searchType=containers, ceph=True, io.openshift.tags=rhceph ceph, build-date=2025-11-26T19:44:28Z, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, release=1763362218, RELEASE=main, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, vendor=Red Hat, Inc., vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, GIT_BRANCH=main, description=Red Hat Ceph Storage 7, vcs-type=git, GIT_REPO=https://github.com/ceph/ceph-container.git, distribution-scope=public, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.openshift.expose-services=, CEPH_POINT_RELEASE=, com.redhat.component=rhceph-container, GIT_CLEAN=True, maintainer=Guillaume Abrioux , io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.buildah.version=1.41.4, io.k8s.description=Red Hat Ceph Storage 7, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, name=rhceph, version=7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64) Dec 15 04:09:38 localhost podman[111305]: 2025-12-15 09:09:38.512565555 +0000 UTC m=+0.221994246 container exec_died 8dcda56b365b42dc8758aab77a9ec80db304780e449052738f7e4e648ae1ecaf (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-bce17446-41b5-5408-a23e-0b011906b44a-crash-np0005559462, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, 
vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, version=7, GIT_CLEAN=True, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhceph ceph, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.openshift.expose-services=, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., com.redhat.component=rhceph-container, distribution-scope=public, io.k8s.description=Red Hat Ceph Storage 7, RELEASE=main, GIT_REPO=https://github.com/ceph/ceph-container.git, CEPH_POINT_RELEASE=, io.buildah.version=1.41.4, vcs-type=git, maintainer=Guillaume Abrioux , release=1763362218, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, url=https://catalog.redhat.com/en/search?searchType=containers, build-date=2025-11-26T19:44:28Z, name=rhceph, ceph=True, vendor=Red Hat, Inc., GIT_BRANCH=main, description=Red Hat Ceph Storage 7, architecture=x86_64, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0) Dec 15 04:09:40 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:fc:1f:13 MACDST=fa:16:3e:b3:bb:e3 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=5052 DF PROTO=TCP SPT=49464 DPT=9101 SEQ=528366992 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A838811260000000001030307) Dec 15 04:09:47 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:fc:1f:13 MACDST=fa:16:3e:b3:bb:e3 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=58651 DF PROTO=TCP SPT=32834 DPT=9882 SEQ=1240639400 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A838829260000000001030307) Dec 15 04:09:50 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:fc:1f:13 MACDST=fa:16:3e:b3:bb:e3 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=5158 DF PROTO=TCP SPT=35242 DPT=9105 SEQ=3136361097 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT 
(020405500402080A838835250000000001030307) Dec 15 04:09:53 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:fc:1f:13 MACDST=fa:16:3e:b3:bb:e3 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=9856 DF PROTO=TCP SPT=36802 DPT=9100 SEQ=181697788 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A838841230000000001030307) Dec 15 04:09:54 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:fc:1f:13 MACDST=fa:16:3e:b3:bb:e3 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=9857 DF PROTO=TCP SPT=36802 DPT=9100 SEQ=181697788 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A838845250000000001030307) Dec 15 04:09:55 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:fc:1f:13 MACDST=fa:16:3e:b3:bb:e3 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=42817 DF PROTO=TCP SPT=38926 DPT=9102 SEQ=3133044919 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A83884B660000000001030307) Dec 15 04:09:58 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:fc:1f:13 MACDST=fa:16:3e:b3:bb:e3 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=24688 DF PROTO=TCP SPT=54064 DPT=9101 SEQ=2581614081 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A838857250000000001030307) Dec 15 04:10:01 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:fc:1f:13 MACDST=fa:16:3e:b3:bb:e3 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=29860 DF PROTO=TCP SPT=58610 DPT=9882 SEQ=3251854166 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A838862CA0000000001030307) Dec 15 04:10:04 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:fc:1f:13 MACDST=fa:16:3e:b3:bb:e3 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=29862 DF PROTO=TCP SPT=58610 DPT=9882 SEQ=3251854166 ACK=0 WINDOW=32640 RES=0x00 
SYN URGP=0 OPT (020405500402080A83886EE50000000001030307) Dec 15 04:10:07 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:fc:1f:13 MACDST=fa:16:3e:b3:bb:e3 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=39732 DF PROTO=TCP SPT=55452 DPT=9105 SEQ=1207015381 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A83887A650000000001030307) Dec 15 04:10:11 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:fc:1f:13 MACDST=fa:16:3e:b3:bb:e3 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=24690 DF PROTO=TCP SPT=54064 DPT=9101 SEQ=2581614081 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A838887250000000001030307) Dec 15 04:10:17 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:fc:1f:13 MACDST=fa:16:3e:b3:bb:e3 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=29864 DF PROTO=TCP SPT=58610 DPT=9882 SEQ=3251854166 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A83889F250000000001030307) Dec 15 04:10:20 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:fc:1f:13 MACDST=fa:16:3e:b3:bb:e3 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=39734 DF PROTO=TCP SPT=55452 DPT=9105 SEQ=1207015381 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A8388AB250000000001030307) Dec 15 04:10:23 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:fc:1f:13 MACDST=fa:16:3e:b3:bb:e3 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=593 DF PROTO=TCP SPT=34306 DPT=9100 SEQ=1434469308 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A8388B6530000000001030307) Dec 15 04:10:24 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:fc:1f:13 MACDST=fa:16:3e:b3:bb:e3 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=594 DF PROTO=TCP SPT=34306 DPT=9100 SEQ=1434469308 ACK=0 
WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A8388BA650000000001030307) Dec 15 04:10:25 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:fc:1f:13 MACDST=fa:16:3e:b3:bb:e3 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=51710 DF PROTO=TCP SPT=56470 DPT=9102 SEQ=2734882965 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A8388C0A50000000001030307) Dec 15 04:10:28 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:fc:1f:13 MACDST=fa:16:3e:b3:bb:e3 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=21185 DF PROTO=TCP SPT=56508 DPT=9101 SEQ=971499579 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A8388CC650000000001030307) Dec 15 04:10:31 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:fc:1f:13 MACDST=fa:16:3e:b3:bb:e3 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=51712 DF PROTO=TCP SPT=56470 DPT=9102 SEQ=2734882965 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A8388D8650000000001030307) Dec 15 04:10:34 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:fc:1f:13 MACDST=fa:16:3e:b3:bb:e3 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=53005 DF PROTO=TCP SPT=46296 DPT=9882 SEQ=3006527654 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A8388E3E50000000001030307) Dec 15 04:10:37 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:fc:1f:13 MACDST=fa:16:3e:b3:bb:e3 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=38311 DF PROTO=TCP SPT=33280 DPT=9105 SEQ=1053647317 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A8388EFA50000000001030307) Dec 15 04:10:41 localhost sshd[111509]: main: sshd: ssh-rsa algorithm is disabled Dec 15 04:10:41 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:fc:1f:13 MACDST=fa:16:3e:b3:bb:e3 MACPROTO=0800 SRC=192.168.122.10 
DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=21187 DF PROTO=TCP SPT=56508 DPT=9101 SEQ=971499579 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A8388FD250000000001030307) Dec 15 04:10:47 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:fc:1f:13 MACDST=fa:16:3e:b3:bb:e3 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=53007 DF PROTO=TCP SPT=46296 DPT=9882 SEQ=3006527654 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A838913250000000001030307) Dec 15 04:10:50 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:fc:1f:13 MACDST=fa:16:3e:b3:bb:e3 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=38313 DF PROTO=TCP SPT=33280 DPT=9105 SEQ=1053647317 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A83891F260000000001030307) Dec 15 04:10:53 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:fc:1f:13 MACDST=fa:16:3e:b3:bb:e3 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=54416 DF PROTO=TCP SPT=34858 DPT=9100 SEQ=1765130726 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A83892B830000000001030307) Dec 15 04:10:54 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:fc:1f:13 MACDST=fa:16:3e:b3:bb:e3 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=54417 DF PROTO=TCP SPT=34858 DPT=9100 SEQ=1765130726 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A83892FA50000000001030307) Dec 15 04:10:55 localhost systemd[1]: tripleo_ovn_metadata_agent.service: State 'stop-sigterm' timed out. Killing. Dec 15 04:10:55 localhost systemd[1]: tripleo_ovn_metadata_agent.service: Killing process 71374 (conmon) with signal SIGKILL. 
Dec 15 04:10:55 localhost systemd[1]: tripleo_ovn_metadata_agent.service: Main process exited, code=killed, status=9/KILL Dec 15 04:10:55 localhost systemd[1]: libpod-conmon-4d35b7dc3f3a4e3c60661e85d3511f50804e87244d74cab67af5c6e8c447d379.scope: Deactivated successfully. Dec 15 04:10:55 localhost podman[111537]: error opening file `/run/crun/4d35b7dc3f3a4e3c60661e85d3511f50804e87244d74cab67af5c6e8c447d379/status`: No such file or directory Dec 15 04:10:56 localhost systemd[1]: 4d35b7dc3f3a4e3c60661e85d3511f50804e87244d74cab67af5c6e8c447d379.timer: Failed to open /run/systemd/transient/4d35b7dc3f3a4e3c60661e85d3511f50804e87244d74cab67af5c6e8c447d379.timer: No such file or directory Dec 15 04:10:56 localhost systemd[1]: 4d35b7dc3f3a4e3c60661e85d3511f50804e87244d74cab67af5c6e8c447d379.service: Failed to open /run/systemd/transient/4d35b7dc3f3a4e3c60661e85d3511f50804e87244d74cab67af5c6e8c447d379.service: No such file or directory Dec 15 04:10:56 localhost podman[111526]: 2025-12-15 09:10:56.012801188 +0000 UTC m=+0.090936389 container cleanup 4d35b7dc3f3a4e3c60661e85d3511f50804e87244d74cab67af5c6e8c447d379 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, tcib_managed=true, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, konflux.additional-tags=17.1.12 17.1_20251118.1, build-date=2025-11-19T00:14:25Z, config_id=tripleo_step4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'a56a6f14b467cd9064e40c03defa5ed7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 
'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, architecture=x86_64, io.buildah.version=1.41.4, distribution-scope=public, batch=17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, url=https://www.redhat.com, container_name=ovn_metadata_agent, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, name=rhosp17/openstack-neutron-metadata-agent-ovn, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, version=17.1.12, release=1761123044, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vendor=Red Hat, Inc.) 
Dec 15 04:10:56 localhost podman[111526]: ovn_metadata_agent Dec 15 04:10:56 localhost systemd[1]: tripleo_ovn_metadata_agent.service: Failed with result 'timeout'. Dec 15 04:10:56 localhost systemd[1]: Stopped ovn_metadata_agent container. Dec 15 04:10:56 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:fc:1f:13 MACDST=fa:16:3e:b3:bb:e3 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=54418 DF PROTO=TCP SPT=34858 DPT=9100 SEQ=1765130726 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A838937A50000000001030307) Dec 15 04:10:56 localhost python3.9[111630]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_rsyslog.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None Dec 15 04:10:56 localhost systemd[1]: Reloading. Dec 15 04:10:56 localhost systemd-rc-local-generator[111654]: /etc/rc.d/rc.local is not marked executable, skipping. Dec 15 04:10:56 localhost systemd-sysv-generator[111657]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. Dec 15 04:10:56 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. 
Dec 15 04:10:58 localhost python3.9[111759]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_ceilometer_agent_compute.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Dec 15 04:10:58 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:fc:1f:13 MACDST=fa:16:3e:b3:bb:e3 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=9618 DF PROTO=TCP SPT=54404 DPT=9101 SEQ=1668794116 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A838941A50000000001030307) Dec 15 04:10:59 localhost python3.9[111851]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_ceilometer_agent_ipmi.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Dec 15 04:11:00 localhost python3.9[111943]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_collectd.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Dec 15 04:11:00 localhost python3.9[112035]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_iscsid.service state=absent recurse=False force=False follow=True 
modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Dec 15 04:11:01 localhost python3.9[112127]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_logrotate_crond.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Dec 15 04:11:01 localhost python3.9[112219]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_metrics_qdr.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Dec 15 04:11:01 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:fc:1f:13 MACDST=fa:16:3e:b3:bb:e3 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=56229 DF PROTO=TCP SPT=43122 DPT=9102 SEQ=3368816555 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A83894D660000000001030307) Dec 15 04:11:02 localhost python3.9[112311]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_neutron_dhcp.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None 
selevel=None setype=None attributes=None Dec 15 04:11:02 localhost python3.9[112403]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_neutron_l3_agent.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Dec 15 04:11:03 localhost python3.9[112495]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_neutron_ovs_agent.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Dec 15 04:11:03 localhost python3.9[112587]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_compute.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Dec 15 04:11:04 localhost python3.9[112679]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_migration_target.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Dec 15 04:11:04 localhost kernel: DROPPING: 
IN=br-ex OUT= MACSRC=fa:16:3e:fc:1f:13 MACDST=fa:16:3e:b3:bb:e3 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=28570 DF PROTO=TCP SPT=34570 DPT=9882 SEQ=2607428259 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A838959250000000001030307) Dec 15 04:11:05 localhost python3.9[112771]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_virtlogd_wrapper.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Dec 15 04:11:05 localhost python3.9[112863]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_virtnodedevd.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Dec 15 04:11:06 localhost python3.9[112955]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_virtproxyd.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Dec 15 04:11:06 localhost python3.9[113047]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_virtqemud.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S 
access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Dec 15 04:11:07 localhost python3.9[113139]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_virtqemud_recover.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Dec 15 04:11:07 localhost python3.9[113231]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_virtsecretd.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Dec 15 04:11:07 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:fc:1f:13 MACDST=fa:16:3e:b3:bb:e3 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=36371 DF PROTO=TCP SPT=47418 DPT=9105 SEQ=2108844707 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A838964E50000000001030307) Dec 15 04:11:08 localhost python3.9[113323]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_virtstoraged.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None 
attributes=None Dec 15 04:11:08 localhost python3.9[113415]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_ovn_controller.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Dec 15 04:11:09 localhost python3.9[113507]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_ovn_metadata_agent.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Dec 15 04:11:10 localhost python3.9[113599]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_rsyslog.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Dec 15 04:11:11 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:fc:1f:13 MACDST=fa:16:3e:b3:bb:e3 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=9620 DF PROTO=TCP SPT=54404 DPT=9101 SEQ=1668794116 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A838971260000000001030307) Dec 15 04:11:11 localhost python3.9[113691]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_ceilometer_agent_compute.service state=absent recurse=False force=False follow=True 
modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Dec 15 04:11:12 localhost python3.9[113783]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_ceilometer_agent_ipmi.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Dec 15 04:11:12 localhost python3.9[113875]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_collectd.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Dec 15 04:11:13 localhost python3.9[113967]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_iscsid.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Dec 15 04:11:14 localhost python3.9[114059]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_logrotate_crond.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None 
_diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Dec 15 04:11:14 localhost python3.9[114151]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_metrics_qdr.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Dec 15 04:11:15 localhost python3.9[114243]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_neutron_dhcp.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Dec 15 04:11:15 localhost python3.9[114335]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_neutron_l3_agent.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Dec 15 04:11:16 localhost python3.9[114427]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_neutron_ovs_agent.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None 
serole=None selevel=None setype=None attributes=None Dec 15 04:11:16 localhost python3.9[114519]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_compute.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Dec 15 04:11:17 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:fc:1f:13 MACDST=fa:16:3e:b3:bb:e3 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=28572 DF PROTO=TCP SPT=34570 DPT=9882 SEQ=2607428259 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A838989250000000001030307) Dec 15 04:11:17 localhost python3.9[114611]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_migration_target.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Dec 15 04:11:18 localhost python3.9[114703]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_virtlogd_wrapper.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Dec 15 04:11:18 localhost python3.9[114795]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_virtnodedevd.service state=absent recurse=False 
force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Dec 15 04:11:18 localhost sshd[114852]: main: sshd: ssh-rsa algorithm is disabled Dec 15 04:11:19 localhost python3.9[114888]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_virtproxyd.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Dec 15 04:11:19 localhost python3.9[114980]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_virtqemud.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Dec 15 04:11:20 localhost python3.9[115073]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_virtqemud_recover.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Dec 15 04:11:20 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:fc:1f:13 MACDST=fa:16:3e:b3:bb:e3 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 
TTL=62 ID=36373 DF PROTO=TCP SPT=47418 DPT=9105 SEQ=2108844707 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A838995260000000001030307) Dec 15 04:11:20 localhost python3.9[115165]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_virtsecretd.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Dec 15 04:11:21 localhost python3.9[115257]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_virtstoraged.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Dec 15 04:11:22 localhost python3.9[115349]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_ovn_controller.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Dec 15 04:11:22 localhost python3.9[115441]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_ovn_metadata_agent.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None 
serole=None selevel=None setype=None attributes=None Dec 15 04:11:23 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:fc:1f:13 MACDST=fa:16:3e:b3:bb:e3 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=27958 DF PROTO=TCP SPT=34606 DPT=9100 SEQ=2258942877 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A8389A0B20000000001030307) Dec 15 04:11:23 localhost python3.9[115533]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_rsyslog.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Dec 15 04:11:24 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:fc:1f:13 MACDST=fa:16:3e:b3:bb:e3 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=27959 DF PROTO=TCP SPT=34606 DPT=9100 SEQ=2258942877 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A8389A4A50000000001030307) Dec 15 04:11:25 localhost python3.9[115625]: ansible-ansible.legacy.command Invoked with _raw_params=if systemctl is-active certmonger.service; then#012 systemctl disable --now certmonger.service#012 test -f /etc/systemd/system/certmonger.service || systemctl mask certmonger.service#012fi#012 _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None Dec 15 04:11:25 localhost python3.9[115717]: ansible-ansible.builtin.find Invoked with file_type=any hidden=True paths=['/var/lib/certmonger/requests'] patterns=[] read_whole_file=False age_stamp=mtime recurse=False follow=False get_checksum=False checksum_algorithm=sha1 use_regex=False exact_mode=True excludes=None contains=None age=None 
size=None depth=None mode=None encoding=None limit=None Dec 15 04:11:26 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:fc:1f:13 MACDST=fa:16:3e:b3:bb:e3 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=27960 DF PROTO=TCP SPT=34606 DPT=9100 SEQ=2258942877 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A8389ACA50000000001030307) Dec 15 04:11:26 localhost python3.9[115809]: ansible-ansible.builtin.systemd_service Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None Dec 15 04:11:26 localhost systemd[1]: Reloading. Dec 15 04:11:26 localhost systemd-rc-local-generator[115834]: /etc/rc.d/rc.local is not marked executable, skipping. Dec 15 04:11:26 localhost systemd-sysv-generator[115837]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. Dec 15 04:11:26 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. 
Dec 15 04:11:27 localhost python3.9[115937]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_ceilometer_agent_compute.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None Dec 15 04:11:28 localhost python3.9[116030]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_ceilometer_agent_ipmi.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None Dec 15 04:11:28 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:fc:1f:13 MACDST=fa:16:3e:b3:bb:e3 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=27977 DF PROTO=TCP SPT=38102 DPT=9101 SEQ=947026609 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A8389B6A60000000001030307) Dec 15 04:11:28 localhost python3.9[116123]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_collectd.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None Dec 15 04:11:29 localhost python3.9[116216]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_iscsid.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None Dec 15 04:11:30 localhost python3.9[116309]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_logrotate_crond.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None 
removes=None stdin=None Dec 15 04:11:30 localhost python3.9[116402]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_metrics_qdr.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None Dec 15 04:11:31 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:fc:1f:13 MACDST=fa:16:3e:b3:bb:e3 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=35560 DF PROTO=TCP SPT=57568 DPT=9882 SEQ=1487665153 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A8389C25C0000000001030307) Dec 15 04:11:32 localhost python3.9[116495]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_neutron_dhcp.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None Dec 15 04:11:32 localhost python3.9[116588]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_neutron_l3_agent.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None Dec 15 04:11:34 localhost python3.9[116681]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_neutron_ovs_agent.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None Dec 15 04:11:34 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:fc:1f:13 MACDST=fa:16:3e:b3:bb:e3 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=35562 DF PROTO=TCP SPT=57568 DPT=9882 SEQ=1487665153 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT 
(020405500402080A8389CE650000000001030307) Dec 15 04:11:35 localhost python3.9[116774]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_compute.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None Dec 15 04:11:35 localhost python3.9[116867]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_migration_target.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None Dec 15 04:11:36 localhost python3.9[116960]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_virtlogd_wrapper.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None Dec 15 04:11:36 localhost python3.9[117053]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_virtnodedevd.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None Dec 15 04:11:37 localhost python3.9[117146]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_virtproxyd.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None Dec 15 04:11:37 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:fc:1f:13 MACDST=fa:16:3e:b3:bb:e3 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=29229 DF PROTO=TCP SPT=48464 DPT=9105 SEQ=567368289 ACK=0 
WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A8389D9E50000000001030307) Dec 15 04:11:38 localhost python3.9[117239]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_virtqemud.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None Dec 15 04:11:38 localhost python3.9[117332]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_virtqemud_recover.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None Dec 15 04:11:39 localhost python3.9[117425]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_virtsecretd.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None Dec 15 04:11:39 localhost python3.9[117518]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_virtstoraged.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None Dec 15 04:11:40 localhost python3.9[117611]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_ovn_controller.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None Dec 15 04:11:41 localhost python3.9[117704]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_ovn_metadata_agent.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True 
strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None Dec 15 04:11:41 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:fc:1f:13 MACDST=fa:16:3e:b3:bb:e3 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=27979 DF PROTO=TCP SPT=38102 DPT=9101 SEQ=947026609 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A8389E7250000000001030307) Dec 15 04:11:41 localhost python3.9[117797]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_rsyslog.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None Dec 15 04:11:42 localhost systemd[1]: session-37.scope: Deactivated successfully. Dec 15 04:11:42 localhost systemd[1]: session-37.scope: Consumed 48.128s CPU time. Dec 15 04:11:42 localhost systemd-logind[763]: Session 37 logged out. Waiting for processes to exit. Dec 15 04:11:42 localhost systemd-logind[763]: Removed session 37. 
Dec 15 04:11:47 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:fc:1f:13 MACDST=fa:16:3e:b3:bb:e3 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=35564 DF PROTO=TCP SPT=57568 DPT=9882 SEQ=1487665153 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A8389FF250000000001030307) Dec 15 04:11:49 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:fc:1f:13 MACDST=fa:16:3e:b3:bb:e3 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=29231 DF PROTO=TCP SPT=48464 DPT=9105 SEQ=567368289 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A838A09250000000001030307) Dec 15 04:11:53 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:fc:1f:13 MACDST=fa:16:3e:b3:bb:e3 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=62493 DF PROTO=TCP SPT=51632 DPT=9100 SEQ=1367674514 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A838A15E20000000001030307) Dec 15 04:11:54 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:fc:1f:13 MACDST=fa:16:3e:b3:bb:e3 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=62494 DF PROTO=TCP SPT=51632 DPT=9100 SEQ=1367674514 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A838A19E50000000001030307) Dec 15 04:11:55 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:fc:1f:13 MACDST=fa:16:3e:b3:bb:e3 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=55418 DF PROTO=TCP SPT=46996 DPT=9102 SEQ=1627750752 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A838A20260000000001030307) Dec 15 04:11:58 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:fc:1f:13 MACDST=fa:16:3e:b3:bb:e3 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=26905 DF PROTO=TCP SPT=59768 DPT=9101 SEQ=3513662849 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT 
(020405500402080A838A2BE50000000001030307) Dec 15 04:12:00 localhost sshd[117891]: main: sshd: ssh-rsa algorithm is disabled Dec 15 04:12:00 localhost systemd-logind[763]: New session 38 of user zuul. Dec 15 04:12:00 localhost systemd[1]: Started Session 38 of User zuul. Dec 15 04:12:01 localhost python3.9[117984]: ansible-ansible.legacy.ping Invoked with data=pong Dec 15 04:12:01 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:fc:1f:13 MACDST=fa:16:3e:b3:bb:e3 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=55420 DF PROTO=TCP SPT=46996 DPT=9102 SEQ=1627750752 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A838A37E50000000001030307) Dec 15 04:12:02 localhost python3.9[118088]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d Dec 15 04:12:03 localhost python3.9[118180]: ansible-ansible.legacy.command Invoked with _raw_params=PATH=/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin which growvols#012 _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None Dec 15 04:12:04 localhost python3.9[118273]: ansible-ansible.builtin.stat Invoked with path=/etc/ansible/facts.d/bootc.fact follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1 Dec 15 04:12:04 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:fc:1f:13 MACDST=fa:16:3e:b3:bb:e3 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=64375 DF PROTO=TCP SPT=41292 DPT=9882 SEQ=3255986243 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A838A43A50000000001030307) Dec 15 04:12:05 localhost python3.9[118365]: ansible-ansible.builtin.file Invoked with mode=755 path=/etc/ansible/facts.d state=directory recurse=False force=False follow=True 
modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Dec 15 04:12:05 localhost python3.9[118457]: ansible-ansible.legacy.stat Invoked with path=/etc/ansible/facts.d/bootc.fact follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Dec 15 04:12:06 localhost python3.9[118530]: ansible-ansible.legacy.copy Invoked with dest=/etc/ansible/facts.d/bootc.fact mode=755 src=/home/zuul/.ansible/tmp/ansible-tmp-1765789925.3693705-177-67790567126847/.source.fact _original_basename=bootc.fact follow=False checksum=eb4122ce7fc50a38407beb511c4ff8c178005b12 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Dec 15 04:12:07 localhost python3.9[118622]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d Dec 15 04:12:07 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:fc:1f:13 MACDST=fa:16:3e:b3:bb:e3 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=17103 DF PROTO=TCP SPT=48888 DPT=9105 SEQ=1525277633 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A838A4F250000000001030307) Dec 15 04:12:08 localhost python3.9[118718]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/var/log/journal setype=var_log_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None 
serole=None selevel=None attributes=None Dec 15 04:12:09 localhost python3.9[118810]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/config-data/ansible-generated recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None Dec 15 04:12:09 localhost python3.9[118900]: ansible-ansible.builtin.service_facts Invoked Dec 15 04:12:09 localhost network[118917]: You are using 'network' service provided by 'network-scripts', which are now deprecated. Dec 15 04:12:09 localhost network[118918]: 'network-scripts' will be removed from distribution in near future. Dec 15 04:12:09 localhost network[118919]: It is advised to switch to 'NetworkManager' instead for network management. Dec 15 04:12:10 localhost systemd[1]: /usr/lib/systemd/system/insights-client.service:23: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. 
Dec 15 04:12:10 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:fc:1f:13 MACDST=fa:16:3e:b3:bb:e3 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=26907 DF PROTO=TCP SPT=59768 DPT=9101 SEQ=3513662849 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A838A5B250000000001030307) Dec 15 04:12:14 localhost python3.9[119116]: ansible-ansible.builtin.lineinfile Invoked with line=cloud-init=disabled path=/proc/cmdline state=present encoding=utf-8 backrefs=False create=False backup=False firstmatch=False unsafe_writes=False regexp=None search_string=None insertafter=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Dec 15 04:12:15 localhost python3.9[119206]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d Dec 15 04:12:16 localhost python3.9[119302]: ansible-ansible.legacy.command Invoked with _raw_params=# This is a hack to deploy RDO Delorean repos to RHEL as if it were Centos 9 Stream#012set -euxo pipefail#012curl -sL https://github.com/openstack-k8s-operators/repo-setup/archive/refs/heads/main.tar.gz | tar -xz#012python3 -m venv ./venv#012PBR_VERSION=0.0.0 ./venv/bin/pip install ./repo-setup-main#012# This is required for FIPS enabled until trunk.rdoproject.org#012# is not being served from a centos7 host, tracked by#012# https://issues.redhat.com/browse/RHOSZUUL-1517#012dnf -y install crypto-policies#012update-crypto-policies --set FIPS:NO-ENFORCE-EMS#012./venv/bin/repo-setup current-podified -b antelope -d centos9 --stream#012#012# Exclude ceph-common-18.2.7 as it's pulling newer openssl not compatible#012# with rhel 9.2 openssh#012dnf config-manager --setopt centos9-storage.exclude="ceph-common-18.2.7" --save#012# FIXME: perform dnf upgrade for other packages in EDPM ansible#012# here we only ensuring that decontainerized libvirt can 
start#012dnf -y upgrade openstack-selinux#012rm -f /run/virtlogd.pid#012#012rm -rf repo-setup-main#012 _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None Dec 15 04:12:17 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:fc:1f:13 MACDST=fa:16:3e:b3:bb:e3 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=64377 DF PROTO=TCP SPT=41292 DPT=9882 SEQ=3255986243 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A838A73260000000001030307) Dec 15 04:12:20 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:fc:1f:13 MACDST=fa:16:3e:b3:bb:e3 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=17105 DF PROTO=TCP SPT=48888 DPT=9105 SEQ=1525277633 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A838A7F250000000001030307) Dec 15 04:12:23 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:fc:1f:13 MACDST=fa:16:3e:b3:bb:e3 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=37512 DF PROTO=TCP SPT=49318 DPT=9100 SEQ=1008234451 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A838A8B130000000001030307) Dec 15 04:12:24 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:fc:1f:13 MACDST=fa:16:3e:b3:bb:e3 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=37513 DF PROTO=TCP SPT=49318 DPT=9100 SEQ=1008234451 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A838A8F250000000001030307) Dec 15 04:12:25 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:fc:1f:13 MACDST=fa:16:3e:b3:bb:e3 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=38211 DF PROTO=TCP SPT=47344 DPT=9102 SEQ=3665328880 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A838A95650000000001030307) Dec 15 04:12:26 localhost systemd[1]: 
Stopping OpenSSH server daemon... Dec 15 04:12:26 localhost systemd[1]: sshd.service: Deactivated successfully. Dec 15 04:12:26 localhost systemd[1]: sshd.service: Unit process 114852 (sshd) remains running after unit stopped. Dec 15 04:12:26 localhost systemd[1]: sshd.service: Unit process 115040 (sshd) remains running after unit stopped. Dec 15 04:12:26 localhost systemd[1]: Stopped OpenSSH server daemon. Dec 15 04:12:26 localhost systemd[1]: sshd.service: Consumed 3.888s CPU time, read 0B from disk, written 136.0K to disk. Dec 15 04:12:26 localhost systemd[1]: Stopped target sshd-keygen.target. Dec 15 04:12:26 localhost systemd[1]: Stopping sshd-keygen.target... Dec 15 04:12:26 localhost systemd[1]: OpenSSH ecdsa Server Key Generation was skipped because of an unmet condition check (ConditionPathExists=!/run/systemd/generator.early/multi-user.target.wants/cloud-init.target). Dec 15 04:12:26 localhost systemd[1]: OpenSSH ed25519 Server Key Generation was skipped because of an unmet condition check (ConditionPathExists=!/run/systemd/generator.early/multi-user.target.wants/cloud-init.target). Dec 15 04:12:26 localhost systemd[1]: OpenSSH rsa Server Key Generation was skipped because of an unmet condition check (ConditionPathExists=!/run/systemd/generator.early/multi-user.target.wants/cloud-init.target). Dec 15 04:12:26 localhost systemd[1]: Reached target sshd-keygen.target. Dec 15 04:12:26 localhost systemd[1]: Starting OpenSSH server daemon... Dec 15 04:12:26 localhost sshd[119345]: main: sshd: ssh-rsa algorithm is disabled Dec 15 04:12:26 localhost systemd[1]: Started OpenSSH server daemon. Dec 15 04:12:26 localhost systemd[1]: Started /usr/bin/systemctl start man-db-cache-update. Dec 15 04:12:26 localhost systemd[1]: Starting man-db-cache-update.service... Dec 15 04:12:26 localhost systemd[1]: Started /usr/bin/systemctl start man-db-cache-update. Dec 15 04:12:27 localhost systemd[1]: man-db-cache-update.service: Deactivated successfully. 
Dec 15 04:12:27 localhost systemd[1]: Finished man-db-cache-update.service. Dec 15 04:12:27 localhost systemd[1]: run-r40cc204e951244139c52766a74d73c0c.service: Deactivated successfully. Dec 15 04:12:27 localhost systemd[1]: run-r7d49c6381dc749b9b10046ad50f2ef2e.service: Deactivated successfully. Dec 15 04:12:27 localhost systemd[1]: Stopping OpenSSH server daemon... Dec 15 04:12:27 localhost systemd[1]: sshd.service: Deactivated successfully. Dec 15 04:12:27 localhost systemd[1]: sshd.service: Unit process 114852 (sshd) remains running after unit stopped. Dec 15 04:12:27 localhost systemd[1]: sshd.service: Unit process 115040 (sshd) remains running after unit stopped. Dec 15 04:12:27 localhost systemd[1]: Stopped OpenSSH server daemon. Dec 15 04:12:27 localhost systemd[1]: Stopped target sshd-keygen.target. Dec 15 04:12:27 localhost systemd[1]: Stopping sshd-keygen.target... Dec 15 04:12:27 localhost systemd[1]: OpenSSH ecdsa Server Key Generation was skipped because of an unmet condition check (ConditionPathExists=!/run/systemd/generator.early/multi-user.target.wants/cloud-init.target). Dec 15 04:12:28 localhost systemd[1]: OpenSSH ed25519 Server Key Generation was skipped because of an unmet condition check (ConditionPathExists=!/run/systemd/generator.early/multi-user.target.wants/cloud-init.target). Dec 15 04:12:28 localhost systemd[1]: OpenSSH rsa Server Key Generation was skipped because of an unmet condition check (ConditionPathExists=!/run/systemd/generator.early/multi-user.target.wants/cloud-init.target). Dec 15 04:12:28 localhost systemd[1]: Reached target sshd-keygen.target. Dec 15 04:12:28 localhost systemd[1]: Starting OpenSSH server daemon... Dec 15 04:12:28 localhost sshd[119516]: main: sshd: ssh-rsa algorithm is disabled Dec 15 04:12:28 localhost systemd[1]: Started OpenSSH server daemon. 
Dec 15 04:12:28 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:fc:1f:13 MACDST=fa:16:3e:b3:bb:e3 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=4199 DF PROTO=TCP SPT=42660 DPT=9101 SEQ=2256110713 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A838AA1250000000001030307) Dec 15 04:12:31 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:fc:1f:13 MACDST=fa:16:3e:b3:bb:e3 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=38213 DF PROTO=TCP SPT=47344 DPT=9102 SEQ=3665328880 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A838AAD250000000001030307) Dec 15 04:12:32 localhost sshd[119521]: main: sshd: ssh-rsa algorithm is disabled Dec 15 04:12:34 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:fc:1f:13 MACDST=fa:16:3e:b3:bb:e3 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=43025 DF PROTO=TCP SPT=54380 DPT=9882 SEQ=496675720 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A838AB8A50000000001030307) Dec 15 04:12:37 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:fc:1f:13 MACDST=fa:16:3e:b3:bb:e3 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=44484 DF PROTO=TCP SPT=45040 DPT=9105 SEQ=1974965444 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A838AC4650000000001030307) Dec 15 04:12:41 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:fc:1f:13 MACDST=fa:16:3e:b3:bb:e3 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=4201 DF PROTO=TCP SPT=42660 DPT=9101 SEQ=2256110713 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A838AD1250000000001030307) Dec 15 04:12:47 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:fc:1f:13 MACDST=fa:16:3e:b3:bb:e3 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=43027 DF PROTO=TCP SPT=54380 DPT=9882 
SEQ=496675720 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A838AE9250000000001030307) Dec 15 04:12:50 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:fc:1f:13 MACDST=fa:16:3e:b3:bb:e3 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=44486 DF PROTO=TCP SPT=45040 DPT=9105 SEQ=1974965444 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A838AF5250000000001030307) Dec 15 04:12:53 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:fc:1f:13 MACDST=fa:16:3e:b3:bb:e3 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=3910 DF PROTO=TCP SPT=37934 DPT=9100 SEQ=627303222 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A838B00430000000001030307) Dec 15 04:12:54 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:fc:1f:13 MACDST=fa:16:3e:b3:bb:e3 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=3911 DF PROTO=TCP SPT=37934 DPT=9100 SEQ=627303222 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A838B04650000000001030307) Dec 15 04:12:56 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:fc:1f:13 MACDST=fa:16:3e:b3:bb:e3 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=3912 DF PROTO=TCP SPT=37934 DPT=9100 SEQ=627303222 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A838B0C650000000001030307) Dec 15 04:12:58 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:fc:1f:13 MACDST=fa:16:3e:b3:bb:e3 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=41746 DF PROTO=TCP SPT=52792 DPT=9101 SEQ=3155902971 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A838B16650000000001030307) Dec 15 04:13:01 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:fc:1f:13 MACDST=fa:16:3e:b3:bb:e3 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=32545 DF PROTO=TCP SPT=56008 
DPT=9102 SEQ=1237664367 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A838B22250000000001030307) Dec 15 04:13:04 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:fc:1f:13 MACDST=fa:16:3e:b3:bb:e3 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=16720 DF PROTO=TCP SPT=38432 DPT=9882 SEQ=3061714932 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A838B2DE50000000001030307) Dec 15 04:13:07 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:fc:1f:13 MACDST=fa:16:3e:b3:bb:e3 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=35255 DF PROTO=TCP SPT=45608 DPT=9105 SEQ=4242111540 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A838B39A50000000001030307) Dec 15 04:13:11 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:fc:1f:13 MACDST=fa:16:3e:b3:bb:e3 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=41748 DF PROTO=TCP SPT=52792 DPT=9101 SEQ=3155902971 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A838B47260000000001030307) Dec 15 04:13:17 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:fc:1f:13 MACDST=fa:16:3e:b3:bb:e3 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=16722 DF PROTO=TCP SPT=38432 DPT=9882 SEQ=3061714932 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A838B5D250000000001030307) Dec 15 04:13:20 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:fc:1f:13 MACDST=fa:16:3e:b3:bb:e3 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=35257 DF PROTO=TCP SPT=45608 DPT=9105 SEQ=4242111540 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A838B69260000000001030307) Dec 15 04:13:23 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:fc:1f:13 MACDST=fa:16:3e:b3:bb:e3 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=34635 DF 
PROTO=TCP SPT=43046 DPT=9100 SEQ=2058869408 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A838B75720000000001030307) Dec 15 04:13:24 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:fc:1f:13 MACDST=fa:16:3e:b3:bb:e3 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=34636 DF PROTO=TCP SPT=43046 DPT=9100 SEQ=2058869408 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A838B79650000000001030307) Dec 15 04:13:26 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:fc:1f:13 MACDST=fa:16:3e:b3:bb:e3 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=34637 DF PROTO=TCP SPT=43046 DPT=9100 SEQ=2058869408 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A838B81650000000001030307) Dec 15 04:13:28 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:fc:1f:13 MACDST=fa:16:3e:b3:bb:e3 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=43225 DF PROTO=TCP SPT=36642 DPT=9101 SEQ=2699918173 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A838B8B650000000001030307) Dec 15 04:13:31 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:fc:1f:13 MACDST=fa:16:3e:b3:bb:e3 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=23389 DF PROTO=TCP SPT=48856 DPT=9102 SEQ=3351044524 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A838B97660000000001030307) Dec 15 04:13:34 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:fc:1f:13 MACDST=fa:16:3e:b3:bb:e3 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=34471 DF PROTO=TCP SPT=36122 DPT=9882 SEQ=1249417925 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A838BA3250000000001030307) Dec 15 04:13:37 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:fc:1f:13 MACDST=fa:16:3e:b3:bb:e3 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 
TTL=62 ID=41166 DF PROTO=TCP SPT=47694 DPT=9105 SEQ=4289502642 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A838BAEA60000000001030307) Dec 15 04:13:41 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:fc:1f:13 MACDST=fa:16:3e:b3:bb:e3 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=43227 DF PROTO=TCP SPT=36642 DPT=9101 SEQ=2699918173 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A838BBB250000000001030307) Dec 15 04:13:43 localhost kernel: SELinux: Converting 2754 SID table entries... Dec 15 04:13:43 localhost kernel: SELinux: policy capability network_peer_controls=1 Dec 15 04:13:43 localhost kernel: SELinux: policy capability open_perms=1 Dec 15 04:13:43 localhost kernel: SELinux: policy capability extended_socket_class=1 Dec 15 04:13:43 localhost kernel: SELinux: policy capability always_check_network=0 Dec 15 04:13:43 localhost kernel: SELinux: policy capability cgroup_seclabel=1 Dec 15 04:13:43 localhost kernel: SELinux: policy capability nnp_nosuid_transition=1 Dec 15 04:13:43 localhost kernel: SELinux: policy capability genfs_seclabel_symlinks=1 Dec 15 04:13:45 localhost dbus-broker-launch[755]: avc: op=load_policy lsm=selinux seqno=17 res=1 Dec 15 04:13:47 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:fc:1f:13 MACDST=fa:16:3e:b3:bb:e3 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=34473 DF PROTO=TCP SPT=36122 DPT=9882 SEQ=1249417925 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A838BD3250000000001030307) Dec 15 04:13:47 localhost python3.9[120218]: ansible-ansible.builtin.file Invoked with mode=0755 path=/etc/ansible/facts.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None 
setype=None attributes=None Dec 15 04:13:48 localhost python3.9[120310]: ansible-ansible.legacy.stat Invoked with path=/etc/ansible/facts.d/edpm.fact follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Dec 15 04:13:49 localhost python3.9[120383]: ansible-ansible.legacy.copy Invoked with dest=/etc/ansible/facts.d/edpm.fact mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1765790028.2220118-426-3083360728388/.source.fact _original_basename=.1rpdv0sb follow=False checksum=03aee63dcf9b49b0ac4473b2f1a1b5d3783aa639 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Dec 15 04:13:50 localhost python3.9[120488]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local', 'distribution'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d Dec 15 04:13:50 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:fc:1f:13 MACDST=fa:16:3e:b3:bb:e3 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=41168 DF PROTO=TCP SPT=47694 DPT=9105 SEQ=4289502642 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A838BDF250000000001030307) Dec 15 04:13:51 localhost python3.9[120586]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d Dec 15 04:13:52 localhost python3.9[120640]: ansible-ansible.legacy.dnf Invoked with name=['driverctl', 'lvm2', 'crudini', 'jq', 'nftables', 'NetworkManager', 'openstack-selinux', 'python3-libselinux', 'python3-pyyaml', 'rsync', 'tmpwatch', 'sysstat', 'iproute-tc', 'ksmtuned', 'systemd-container', 'crypto-policies-scripts', 'grubby', 'sos'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False 
disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None Dec 15 04:13:53 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:fc:1f:13 MACDST=fa:16:3e:b3:bb:e3 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=52567 DF PROTO=TCP SPT=38234 DPT=9100 SEQ=3602740930 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A838BEAA30000000001030307) Dec 15 04:13:54 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:fc:1f:13 MACDST=fa:16:3e:b3:bb:e3 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=52568 DF PROTO=TCP SPT=38234 DPT=9100 SEQ=3602740930 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A838BEEA50000000001030307) Dec 15 04:13:55 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:fc:1f:13 MACDST=fa:16:3e:b3:bb:e3 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=13817 DF PROTO=TCP SPT=60502 DPT=9102 SEQ=553396774 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A838BF4E50000000001030307) Dec 15 04:13:56 localhost systemd[1]: Reloading. Dec 15 04:13:56 localhost systemd-sysv-generator[120677]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. Dec 15 04:13:56 localhost systemd-rc-local-generator[120672]: /etc/rc.d/rc.local is not marked executable, skipping. 
Dec 15 04:13:56 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Dec 15 04:13:56 localhost systemd[1]: Queuing reload/restart jobs for marked units… Dec 15 04:13:57 localhost python3.9[120781]: ansible-ansible.legacy.command Invoked with _raw_params=rpm -V driverctl lvm2 crudini jq nftables NetworkManager openstack-selinux python3-libselinux python3-pyyaml rsync tmpwatch sysstat iproute-tc ksmtuned systemd-container crypto-policies-scripts grubby sos _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None Dec 15 04:13:58 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:fc:1f:13 MACDST=fa:16:3e:b3:bb:e3 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=8854 DF PROTO=TCP SPT=59002 DPT=9101 SEQ=4247117014 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A838C00A50000000001030307) Dec 15 04:14:00 localhost python3.9[121020]: ansible-ansible.posix.selinux Invoked with policy=targeted state=enforcing configfile=/etc/selinux/config update_kernel_param=False Dec 15 04:14:01 localhost python3.9[121112]: ansible-ansible.legacy.command Invoked with cmd=dd if=/dev/zero of=/swap count=1024 bs=1M creates=/swap _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None removes=None stdin=None Dec 15 04:14:01 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:fc:1f:13 MACDST=fa:16:3e:b3:bb:e3 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=34428 DF PROTO=TCP SPT=47124 DPT=9882 SEQ=1587880615 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A838C0C4A0000000001030307) Dec 15 04:14:02 localhost python3.9[121205]: ansible-ansible.builtin.file 
Invoked with group=root mode=0600 owner=root path=/swap recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False state=None _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None Dec 15 04:14:03 localhost python3.9[121297]: ansible-ansible.posix.mount Invoked with dump=0 fstype=swap name=none opts=sw passno=0 src=/swap state=present path=none boot=True opts_no_log=False backup=False fstab=None Dec 15 04:14:04 localhost python3.9[121389]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/pki/ca-trust/source/anchors setype=cert_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None Dec 15 04:14:04 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:fc:1f:13 MACDST=fa:16:3e:b3:bb:e3 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=34430 DF PROTO=TCP SPT=47124 DPT=9882 SEQ=1587880615 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A838C18660000000001030307) Dec 15 04:14:05 localhost python3.9[121481]: ansible-ansible.legacy.stat Invoked with path=/etc/pki/ca-trust/source/anchors/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Dec 15 04:14:05 localhost python3.9[121554]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/ca-trust/source/anchors/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1765790044.7459822-750-108833022937473/.source.pem _original_basename=tls-ca-bundle.pem follow=False 
checksum=642b0566ec70b00e9a30dc94608ed63df1702edb backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None Dec 15 04:14:07 localhost python3.9[121646]: ansible-ansible.builtin.stat Invoked with path=/etc/lvm/devices/system.devices follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1 Dec 15 04:14:07 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:fc:1f:13 MACDST=fa:16:3e:b3:bb:e3 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=14856 DF PROTO=TCP SPT=38658 DPT=9105 SEQ=2162429978 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A838C23E50000000001030307) Dec 15 04:14:08 localhost python3.9[121740]: ansible-ansible.builtin.getent Invoked with database=passwd key=qemu fail_key=True service=None split=None Dec 15 04:14:09 localhost python3.9[121833]: ansible-ansible.builtin.getent Invoked with database=passwd key=hugetlbfs fail_key=True service=None split=None Dec 15 04:14:10 localhost python3.9[121926]: ansible-ansible.builtin.group Invoked with gid=42477 name=hugetlbfs state=present force=False system=False local=False non_unique=False gid_min=None gid_max=None Dec 15 04:14:11 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:fc:1f:13 MACDST=fa:16:3e:b3:bb:e3 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=8856 DF PROTO=TCP SPT=59002 DPT=9101 SEQ=4247117014 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A838C31260000000001030307) Dec 15 04:14:11 localhost python3.9[122024]: ansible-ansible.builtin.file Invoked with group=qemu mode=0755 owner=qemu path=/var/lib/vhost_sockets setype=virt_cache_t seuser=system_u state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S 
unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None serole=None selevel=None attributes=None Dec 15 04:14:12 localhost python3.9[122116]: ansible-ansible.legacy.dnf Invoked with name=['dracut-config-generic'] state=absent allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None Dec 15 04:14:16 localhost python3.9[122210]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/modules-load.d setype=etc_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None Dec 15 04:14:17 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:fc:1f:13 MACDST=fa:16:3e:b3:bb:e3 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=34432 DF PROTO=TCP SPT=47124 DPT=9882 SEQ=1587880615 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A838C49250000000001030307) Dec 15 04:14:20 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:fc:1f:13 MACDST=fa:16:3e:b3:bb:e3 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=14858 DF PROTO=TCP SPT=38658 DPT=9105 SEQ=2162429978 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A838C53260000000001030307) Dec 15 04:14:21 localhost python3.9[122303]: ansible-ansible.legacy.stat Invoked with 
path=/etc/modules-load.d/99-edpm.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Dec 15 04:14:22 localhost python3.9[122376]: ansible-ansible.legacy.copy Invoked with dest=/etc/modules-load.d/99-edpm.conf group=root mode=0644 owner=root setype=etc_t src=/home/zuul/.ansible/tmp/ansible-tmp-1765790060.1726115-1023-18890913942155/.source.conf follow=False _original_basename=edpm-modprobe.conf.j2 checksum=8021efe01721d8fa8cab46b95c00ec1be6dbb9d0 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None Dec 15 04:14:22 localhost sshd[122391]: main: sshd: ssh-rsa algorithm is disabled Dec 15 04:14:23 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:fc:1f:13 MACDST=fa:16:3e:b3:bb:e3 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=60939 DF PROTO=TCP SPT=41652 DPT=9100 SEQ=1750291100 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A838C5FD30000000001030307) Dec 15 04:14:24 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:fc:1f:13 MACDST=fa:16:3e:b3:bb:e3 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=60940 DF PROTO=TCP SPT=41652 DPT=9100 SEQ=1750291100 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A838C63E50000000001030307) Dec 15 04:14:24 localhost python3.9[122470]: ansible-ansible.builtin.systemd Invoked with name=systemd-modules-load.service state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None Dec 15 04:14:24 localhost systemd[1]: systemd-modules-load.service: Deactivated successfully. Dec 15 04:14:24 localhost systemd[1]: Stopped Load Kernel Modules. Dec 15 04:14:24 localhost systemd[1]: Stopping Load Kernel Modules... 
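Most of the non-kernel entries in this log are Ansible modules recording their full argument set in the journal (lines of the form `ansible-ansible.builtin.file Invoked with …`). When auditing a run like this one, it can help to pull the module name out of each entry; the pattern below is inferred from the lines above, not from any documented Ansible log format, so treat it as a sketch:

```python
import re

# Match "ansible-<module> Invoked with" in a syslog line emitted by an
# Ansible run (pattern inferred from the journal entries above).
MODULE_RE = re.compile(r"ansible-([\w.]+) Invoked with")

def module_name(line):
    """Return the Ansible module name logged in `line`, or None."""
    m = MODULE_RE.search(line)
    return m.group(1) if m else None

line = ("Dec 15 04:14:02 localhost python3.9[121205]: "
        "ansible-ansible.builtin.file Invoked with group=root mode=0600")
print(module_name(line))  # ansible.builtin.file
```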
Dec 15 04:14:24 localhost systemd[1]: Starting Load Kernel Modules... Dec 15 04:14:24 localhost systemd-modules-load[122474]: Module 'msr' is built in Dec 15 04:14:24 localhost systemd[1]: Finished Load Kernel Modules. Dec 15 04:14:25 localhost python3.9[122567]: ansible-ansible.legacy.stat Invoked with path=/etc/sysctl.d/99-edpm.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Dec 15 04:14:25 localhost python3.9[122640]: ansible-ansible.legacy.copy Invoked with dest=/etc/sysctl.d/99-edpm.conf group=root mode=0644 owner=root setype=etc_t src=/home/zuul/.ansible/tmp/ansible-tmp-1765790064.6742964-1092-197552104907217/.source.conf follow=False _original_basename=edpm-sysctl.conf.j2 checksum=2a366439721b855adcfe4d7f152babb68596a007 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None Dec 15 04:14:25 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:fc:1f:13 MACDST=fa:16:3e:b3:bb:e3 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=27963 DF PROTO=TCP SPT=33088 DPT=9102 SEQ=192538743 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A838C6A260000000001030307) Dec 15 04:14:26 localhost python3.9[122732]: ansible-ansible.legacy.dnf Invoked with name=['tuned', 'tuned-profiles-cpu-partitioning'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None Dec 15 
04:14:28 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:fc:1f:13 MACDST=fa:16:3e:b3:bb:e3 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=24708 DF PROTO=TCP SPT=35036 DPT=9101 SEQ=188735471 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A838C75E60000000001030307) Dec 15 04:14:31 localhost python3.9[122824]: ansible-ansible.builtin.stat Invoked with path=/etc/tuned/active_profile follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1 Dec 15 04:14:31 localhost python3.9[122916]: ansible-ansible.builtin.slurp Invoked with src=/etc/tuned/active_profile Dec 15 04:14:31 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:fc:1f:13 MACDST=fa:16:3e:b3:bb:e3 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=27965 DF PROTO=TCP SPT=33088 DPT=9102 SEQ=192538743 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A838C81E50000000001030307) Dec 15 04:14:32 localhost python3.9[123006]: ansible-ansible.builtin.stat Invoked with path=/etc/tuned/throughput-performance-variables.conf follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1 Dec 15 04:14:33 localhost python3.9[123098]: ansible-ansible.builtin.systemd Invoked with enabled=True name=tuned state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None Dec 15 04:14:33 localhost systemd[1]: Stopping Dynamic System Tuning Daemon... Dec 15 04:14:33 localhost systemd[1]: tuned.service: Deactivated successfully. Dec 15 04:14:33 localhost systemd[1]: Stopped Dynamic System Tuning Daemon. Dec 15 04:14:33 localhost systemd[1]: tuned.service: Consumed 1.898s CPU time, no IO. Dec 15 04:14:33 localhost systemd[1]: Starting Dynamic System Tuning Daemon... Dec 15 04:14:34 localhost systemd[1]: Started Dynamic System Tuning Daemon. 
Dec 15 04:14:34 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:fc:1f:13 MACDST=fa:16:3e:b3:bb:e3 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=6087 DF PROTO=TCP SPT=57882 DPT=9882 SEQ=3278243081 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A838C8D660000000001030307) Dec 15 04:14:35 localhost python3.9[123200]: ansible-ansible.builtin.slurp Invoked with src=/proc/cmdline Dec 15 04:14:37 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:fc:1f:13 MACDST=fa:16:3e:b3:bb:e3 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=21895 DF PROTO=TCP SPT=55180 DPT=9105 SEQ=2959272412 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A838C99250000000001030307) Dec 15 04:14:39 localhost python3.9[123292]: ansible-ansible.builtin.systemd Invoked with enabled=False name=ksm.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None Dec 15 04:14:39 localhost systemd[1]: Reloading. Dec 15 04:14:39 localhost systemd-rc-local-generator[123314]: /etc/rc.d/rc.local is not marked executable, skipping. Dec 15 04:14:39 localhost systemd-sysv-generator[123319]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. Dec 15 04:14:39 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Dec 15 04:14:40 localhost python3.9[123421]: ansible-ansible.builtin.systemd Invoked with enabled=False name=ksmtuned.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None Dec 15 04:14:40 localhost systemd[1]: Reloading. 
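The repeated systemd warning above (`Unit uses MemoryLimit=; please use MemoryMax= instead`) flags a deprecated cgroup v1 directive in `insights-client-boot.service`. The supported fix is to switch the directive to `MemoryMax=`, normally via a drop-in under `/etc/systemd/system/<unit>.d/` rather than by editing the packaged unit file. A hypothetical sketch of the textual change only (the helper name and sample unit text are illustrative, not from the source):

```python
# Rewrite the deprecated cgroup v1 directive to its v2 successor.
# In practice this edit belongs in a systemd drop-in file, not in the
# packaged unit; this function only illustrates the text substitution.
def modernize_memory_directive(unit_text):
    return "\n".join(
        "MemoryMax=" + line.split("=", 1)[1]
        if line.startswith("MemoryLimit=") else line
        for line in unit_text.splitlines()
    )

before = "[Service]\nMemoryLimit=1G\n"
after = modernize_memory_directive(before)
# after contains "MemoryMax=1G" and no MemoryLimit= line
```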
Dec 15 04:14:40 localhost systemd-sysv-generator[123451]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. Dec 15 04:14:40 localhost systemd-rc-local-generator[123446]: /etc/rc.d/rc.local is not marked executable, skipping. Dec 15 04:14:40 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Dec 15 04:14:41 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:fc:1f:13 MACDST=fa:16:3e:b3:bb:e3 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=24710 DF PROTO=TCP SPT=35036 DPT=9101 SEQ=188735471 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A838CA5250000000001030307) Dec 15 04:14:41 localhost sshd[123474]: main: sshd: ssh-rsa algorithm is disabled Dec 15 04:14:43 localhost sshd[123475]: main: sshd: ssh-rsa algorithm is disabled Dec 15 04:14:43 localhost python3.9[123553]: ansible-ansible.legacy.command Invoked with _raw_params=mkswap "/swap" _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None Dec 15 04:14:44 localhost python3.9[123646]: ansible-ansible.legacy.command Invoked with _raw_params=swapon "/swap" _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None Dec 15 04:14:44 localhost kernel: Adding 1048572k swap on /swap. 
Priority:-2 extents:1 across:1048572k FS Dec 15 04:14:45 localhost python3.9[123739]: ansible-ansible.legacy.command Invoked with _raw_params=/usr/bin/update-ca-trust _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None Dec 15 04:14:46 localhost python3.9[123838]: ansible-ansible.legacy.command Invoked with _raw_params=echo 2 >/sys/kernel/mm/ksm/run _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None Dec 15 04:14:47 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:fc:1f:13 MACDST=fa:16:3e:b3:bb:e3 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=6089 DF PROTO=TCP SPT=57882 DPT=9882 SEQ=3278243081 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A838CBD250000000001030307) Dec 15 04:14:47 localhost sshd[123932]: main: sshd: ssh-rsa algorithm is disabled Dec 15 04:14:47 localhost python3.9[123931]: ansible-ansible.builtin.systemd Invoked with name=systemd-sysctl.service state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None Dec 15 04:14:47 localhost systemd[1]: systemd-sysctl.service: Deactivated successfully. Dec 15 04:14:47 localhost systemd[1]: Stopped Apply Kernel Variables. Dec 15 04:14:47 localhost systemd[1]: Stopping Apply Kernel Variables... Dec 15 04:14:47 localhost systemd[1]: Starting Apply Kernel Variables... Dec 15 04:14:47 localhost systemd[1]: run-credentials-systemd\x2dsysctl.service.mount: Deactivated successfully. Dec 15 04:14:47 localhost systemd[1]: Finished Apply Kernel Variables. Dec 15 04:14:49 localhost systemd[1]: session-38.scope: Deactivated successfully. Dec 15 04:14:49 localhost systemd[1]: session-38.scope: Consumed 1min 55.362s CPU time. 
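The swap sequence above (`dd if=/dev/zero of=/swap count=1024 bs=1M`, then `mkswap` and `swapon`) creates a 1024 MiB file, yet the kernel reports `Adding 1048572k swap`. The 4 KiB difference is the swap signature page that `mkswap` reserves at the start of the file; a quick check of that arithmetic, assuming the 4 KiB pages standard on this x86_64 host:

```python
# Usable swap = file size minus one page reserved for the mkswap
# header (assumes 4 KiB pages, as on this x86_64 host).
file_kib = 1024 * 1024   # dd count=1024 bs=1M -> 1 GiB, in KiB
page_kib = 4
usable_kib = file_kib - page_kib
print(usable_kib)  # 1048572, matching "Adding 1048572k swap on /swap"
```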
Dec 15 04:14:49 localhost systemd-logind[763]: Session 38 logged out. Waiting for processes to exit. Dec 15 04:14:49 localhost systemd-logind[763]: Removed session 38. Dec 15 04:14:50 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:fc:1f:13 MACDST=fa:16:3e:b3:bb:e3 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=21897 DF PROTO=TCP SPT=55180 DPT=9105 SEQ=2959272412 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A838CC9250000000001030307) Dec 15 04:14:53 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:fc:1f:13 MACDST=fa:16:3e:b3:bb:e3 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=29846 DF PROTO=TCP SPT=56488 DPT=9100 SEQ=2935816369 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A838CD5030000000001030307) Dec 15 04:14:54 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:fc:1f:13 MACDST=fa:16:3e:b3:bb:e3 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=29847 DF PROTO=TCP SPT=56488 DPT=9100 SEQ=2935816369 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A838CD9260000000001030307) Dec 15 04:14:56 localhost sshd[124079]: main: sshd: ssh-rsa algorithm is disabled Dec 15 04:14:56 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:fc:1f:13 MACDST=fa:16:3e:b3:bb:e3 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=29848 DF PROTO=TCP SPT=56488 DPT=9100 SEQ=2935816369 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A838CE1250000000001030307) Dec 15 04:14:56 localhost systemd-logind[763]: New session 39 of user zuul. Dec 15 04:14:56 localhost systemd[1]: Started Session 39 of User zuul. 
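The recurring kernel `DROPPING:` entries are netfilter LOG-target output; the prefix string is whatever the local firewall rule sets (so "DROPPING" is site-specific), and the payload is a flat run of `KEY=value` tokens plus bare flags such as `DF` and `SYN`. A minimal sketch of turning one such line into a dict for filtering or counting, assuming only that token format:

```python
# Parse the key=value portion of a netfilter LOG line, e.g. the
# kernel "DROPPING:" entries above. Tokens without "=" (DF, SYN)
# are recorded as boolean flags; empty values (OUT=) are kept as "".
def parse_nflog(line):
    fields = {}
    for token in line.split():
        if "=" in token:
            key, _, value = token.partition("=")
            fields[key] = value
        else:
            fields[token] = True  # bare flags like DF, SYN
    return fields

sample = ("IN=br-ex OUT= MACSRC=fa:16:3e:fc:1f:13 SRC=192.168.122.10 "
          "DST=192.168.122.106 LEN=60 TTL=62 DF PROTO=TCP "
          "SPT=43046 DPT=9100 SYN")
parsed = parse_nflog(sample)
# parsed["DPT"] == "9100"; parsed["SYN"] is True; parsed["OUT"] == ""
```

Grouping parsed lines by `DPT` would show these drops are repeated SYNs to ports 9100-9105 and 9882, i.e. something on 192.168.122.10 probing exporter-style ports that the firewall on this host rejects.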
Dec 15 04:14:57 localhost python3.9[124172]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d Dec 15 04:14:58 localhost python3.9[124266]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d Dec 15 04:14:58 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:fc:1f:13 MACDST=fa:16:3e:b3:bb:e3 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=13822 DF PROTO=TCP SPT=60502 DPT=9102 SEQ=553396774 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A838CEB250000000001030307) Dec 15 04:15:00 localhost python3.9[124362]: ansible-ansible.legacy.command Invoked with _raw_params=PATH=/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin which growvols#012 _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None Dec 15 04:15:01 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:fc:1f:13 MACDST=fa:16:3e:b3:bb:e3 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=5192 DF PROTO=TCP SPT=51580 DPT=9102 SEQ=4038344721 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A838CF6E50000000001030307) Dec 15 04:15:02 localhost python3.9[124453]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d Dec 15 04:15:04 localhost python3.9[124549]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d Dec 15 04:15:04 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:fc:1f:13 MACDST=fa:16:3e:b3:bb:e3 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=19968 DF PROTO=TCP SPT=39086 DPT=9882 
SEQ=4181383022 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A838D02A50000000001030307) Dec 15 04:15:05 localhost python3.9[124603]: ansible-ansible.legacy.dnf Invoked with name=['podman'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None Dec 15 04:15:05 localhost ceph-osd[31375]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS ------- Dec 15 04:15:05 localhost ceph-osd[31375]: rocksdb: [db/db_impl/db_impl.cc:1111] #012** DB Stats **#012Uptime(secs): 5400.1 total, 600.0 interval#012Cumulative writes: 4815 writes, 21K keys, 4815 commit groups, 1.0 writes per commit group, ingest: 0.02 GB, 0.00 MB/s#012Cumulative WAL: 4815 writes, 628 syncs, 7.67 writes per sync, written: 0.02 GB, 0.00 MB/s#012Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent#012Interval writes: 0 writes, 0 keys, 0 commit groups, 0.0 writes per commit group, ingest: 0.00 MB, 0.00 MB/s#012Interval WAL: 0 writes, 0 syncs, 0.00 writes per sync, written: 0.00 GB, 0.00 MB/s#012Interval stall: 00:00:0.000 H:M:S, 0.0 percent Dec 15 04:15:07 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:fc:1f:13 MACDST=fa:16:3e:b3:bb:e3 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=44208 DF PROTO=TCP SPT=48830 DPT=9105 SEQ=1797781556 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A838D0E650000000001030307) Dec 15 04:15:09 localhost python3.9[124697]: ansible-ansible.builtin.setup Invoked with filter=['ansible_interfaces'] gather_subset=['!all', '!min', 'network'] gather_timeout=10 
fact_path=/etc/ansible/facts.d Dec 15 04:15:10 localhost ceph-osd[32311]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS ------- Dec 15 04:15:10 localhost ceph-osd[32311]: rocksdb: [db/db_impl/db_impl.cc:1111] #012** DB Stats **#012Uptime(secs): 5400.2 total, 600.0 interval#012Cumulative writes: 5745 writes, 25K keys, 5745 commit groups, 1.0 writes per commit group, ingest: 0.02 GB, 0.00 MB/s#012Cumulative WAL: 5745 writes, 763 syncs, 7.53 writes per sync, written: 0.02 GB, 0.00 MB/s#012Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent#012Interval writes: 0 writes, 0 keys, 0 commit groups, 0.0 writes per commit group, ingest: 0.00 MB, 0.00 MB/s#012Interval WAL: 0 writes, 0 syncs, 0.00 writes per sync, written: 0.00 GB, 0.00 MB/s#012Interval stall: 00:00:0.000 H:M:S, 0.0 percent Dec 15 04:15:10 localhost python3.9[124852]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/containers/networks recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None Dec 15 04:15:11 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:fc:1f:13 MACDST=fa:16:3e:b3:bb:e3 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=46400 DF PROTO=TCP SPT=45954 DPT=9101 SEQ=3248273579 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A838D1B260000000001030307) Dec 15 04:15:11 localhost python3.9[124944]: ansible-ansible.legacy.command Invoked with _raw_params=podman network inspect podman#012 _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None Dec 15 04:15:12 localhost python3.9[125047]: ansible-ansible.legacy.stat Invoked with 
path=/etc/containers/networks/podman.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Dec 15 04:15:13 localhost python3.9[125095]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/containers/networks/podman.json _original_basename=podman_network_config.j2 recurse=False state=file path=/etc/containers/networks/podman.json force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None Dec 15 04:15:14 localhost python3.9[125187]: ansible-ansible.legacy.stat Invoked with path=/etc/containers/registries.conf.d/20-edpm-podman-registries.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Dec 15 04:15:14 localhost python3.9[125260]: ansible-ansible.legacy.copy Invoked with dest=/etc/containers/registries.conf.d/20-edpm-podman-registries.conf group=root mode=0644 owner=root setype=etc_t src=/home/zuul/.ansible/tmp/ansible-tmp-1765790113.6676233-323-19987688218224/.source.conf follow=False _original_basename=registries.conf.j2 checksum=804a0d01b832e60d20f779a331306df708c87b02 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None Dec 15 04:15:15 localhost python3.9[125352]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=pids_limit owner=root path=/etc/containers/containers.conf section=containers setype=etc_t value=4096 backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None 
values=None seuser=None serole=None selevel=None attributes=None Dec 15 04:15:15 localhost systemd-journald[47230]: Field hash table of /run/log/journal/738a39f68bc78fb81032e509449fb759/system.journal has a fill level at 75.4 (251 of 333 items), suggesting rotation. Dec 15 04:15:15 localhost systemd-journald[47230]: /run/log/journal/738a39f68bc78fb81032e509449fb759/system.journal: Journal header limits reached or header out-of-date, rotating. Dec 15 04:15:15 localhost rsyslogd[759]: imjournal: journal files changed, reloading... [v8.2102.0-111.el9 try https://www.rsyslog.com/e/0 ] Dec 15 04:15:15 localhost rsyslogd[759]: imjournal: journal files changed, reloading... [v8.2102.0-111.el9 try https://www.rsyslog.com/e/0 ] Dec 15 04:15:15 localhost rsyslogd[759]: imjournal: journal files changed, reloading... [v8.2102.0-111.el9 try https://www.rsyslog.com/e/0 ] Dec 15 04:15:16 localhost python3.9[125445]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=events_logger owner=root path=/etc/containers/containers.conf section=engine setype=etc_t value="journald" backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None Dec 15 04:15:17 localhost python3.9[125537]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=runtime owner=root path=/etc/containers/containers.conf section=engine setype=etc_t value="crun" backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None Dec 15 04:15:17 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:fc:1f:13 MACDST=fa:16:3e:b3:bb:e3 MACPROTO=0800 SRC=192.168.122.10 
DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=19970 DF PROTO=TCP SPT=39086 DPT=9882 SEQ=4181383022 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A838D33260000000001030307) Dec 15 04:15:17 localhost python3.9[125629]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=network_backend owner=root path=/etc/containers/containers.conf section=network setype=etc_t value="netavark" backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None Dec 15 04:15:18 localhost python3.9[125719]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'distribution'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d Dec 15 04:15:19 localhost python3.9[125813]: ansible-ansible.legacy.dnf Invoked with download_only=True name=['driverctl', 'lvm2', 'crudini', 'jq', 'nftables', 'NetworkManager', 'openstack-selinux', 'python3-libselinux', 'python3-pyyaml', 'rsync', 'tmpwatch', 'sysstat', 'iproute-tc', 'ksmtuned', 'systemd-container', 'crypto-policies-scripts', 'grubby', 'sos'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None Dec 15 04:15:20 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:fc:1f:13 MACDST=fa:16:3e:b3:bb:e3 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=44210 DF PROTO=TCP SPT=48830 DPT=9105 SEQ=1797781556 ACK=0 WINDOW=32640 
RES=0x00 SYN URGP=0 OPT (020405500402080A838D3F250000000001030307) Dec 15 04:15:23 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:fc:1f:13 MACDST=fa:16:3e:b3:bb:e3 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=58051 DF PROTO=TCP SPT=54658 DPT=9100 SEQ=4024094626 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A838D4A320000000001030307) Dec 15 04:15:23 localhost python3.9[125907]: ansible-ansible.legacy.dnf Invoked with download_only=True name=['openstack-network-scripts'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None Dec 15 04:15:24 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:fc:1f:13 MACDST=fa:16:3e:b3:bb:e3 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=58052 DF PROTO=TCP SPT=54658 DPT=9100 SEQ=4024094626 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A838D4E260000000001030307) Dec 15 04:15:26 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:fc:1f:13 MACDST=fa:16:3e:b3:bb:e3 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=58053 DF PROTO=TCP SPT=54658 DPT=9100 SEQ=4024094626 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A838D56250000000001030307) Dec 15 04:15:28 localhost python3.9[126001]: ansible-ansible.legacy.dnf Invoked with download_only=True name=['podman', 'buildah'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] enable_plugin=[] enablerepo=[] 
exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None Dec 15 04:15:28 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:fc:1f:13 MACDST=fa:16:3e:b3:bb:e3 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=42862 DF PROTO=TCP SPT=46068 DPT=9101 SEQ=1114197423 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A838D60250000000001030307) Dec 15 04:15:31 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:fc:1f:13 MACDST=fa:16:3e:b3:bb:e3 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=23766 DF PROTO=TCP SPT=48650 DPT=9102 SEQ=266576372 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A838D6C250000000001030307) Dec 15 04:15:32 localhost python3.9[126101]: ansible-ansible.legacy.dnf Invoked with download_only=True name=['tuned', 'tuned-profiles-cpu-partitioning'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None Dec 15 04:15:34 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:fc:1f:13 MACDST=fa:16:3e:b3:bb:e3 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=50102 DF PROTO=TCP SPT=37200 DPT=9882 SEQ=3531621283 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A838D77E60000000001030307) Dec 15 04:15:36 localhost python3.9[126195]: 
ansible-ansible.legacy.dnf Invoked with download_only=True name=['os-net-config'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None Dec 15 04:15:37 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:fc:1f:13 MACDST=fa:16:3e:b3:bb:e3 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=59375 DF PROTO=TCP SPT=35512 DPT=9105 SEQ=3806285862 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A838D83660000000001030307) Dec 15 04:15:41 localhost python3.9[126289]: ansible-ansible.legacy.dnf Invoked with download_only=True name=['openssh-server'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None Dec 15 04:15:41 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:fc:1f:13 MACDST=fa:16:3e:b3:bb:e3 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=42864 DF PROTO=TCP SPT=46068 DPT=9101 SEQ=1114197423 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A838D91250000000001030307) Dec 15 04:15:45 localhost python3.9[126383]: ansible-ansible.legacy.dnf Invoked with download_only=True name=['libvirt ', 'libvirt-admin ', 'libvirt-client ', 
'libvirt-daemon ', 'qemu-kvm', 'qemu-img', 'libguestfs', 'libseccomp', 'swtpm', 'swtpm-tools', 'edk2-ovmf', 'ceph-common', 'cyrus-sasl-scram'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None Dec 15 04:15:47 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:fc:1f:13 MACDST=fa:16:3e:b3:bb:e3 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=50104 DF PROTO=TCP SPT=37200 DPT=9882 SEQ=3531621283 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A838DA7260000000001030307) Dec 15 04:15:50 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:fc:1f:13 MACDST=fa:16:3e:b3:bb:e3 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=59377 DF PROTO=TCP SPT=35512 DPT=9105 SEQ=3806285862 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A838DB3260000000001030307) Dec 15 04:15:53 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:fc:1f:13 MACDST=fa:16:3e:b3:bb:e3 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=23703 DF PROTO=TCP SPT=54778 DPT=9100 SEQ=681976086 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A838DBF620000000001030307) Dec 15 04:15:54 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:fc:1f:13 MACDST=fa:16:3e:b3:bb:e3 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=23704 DF PROTO=TCP SPT=54778 DPT=9100 SEQ=681976086 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A838DC3650000000001030307) Dec 15 04:15:55 localhost kernel: 
DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:fc:1f:13 MACDST=fa:16:3e:b3:bb:e3 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=29078 DF PROTO=TCP SPT=40998 DPT=9102 SEQ=527342677 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A838DC9A50000000001030307) Dec 15 04:15:56 localhost podman[126561]: Dec 15 04:15:56 localhost podman[126561]: 2025-12-15 09:15:56.245940189 +0000 UTC m=+0.078591344 container create dbf2be69b7a2dc5c4f81f178ee7cf98a0066dd487ffd80f9b9f9f33e984ae208 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=happy_hellman, maintainer=Guillaume Abrioux , CEPH_POINT_RELEASE=, version=7, description=Red Hat Ceph Storage 7, ceph=True, build-date=2025-11-26T19:44:28Z, io.openshift.expose-services=, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.k8s.description=Red Hat Ceph Storage 7, vendor=Red Hat, Inc., vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, io.buildah.version=1.41.4, architecture=x86_64, release=1763362218, url=https://catalog.redhat.com/en/search?searchType=containers, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., RELEASE=main, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, com.redhat.component=rhceph-container, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, distribution-scope=public, GIT_BRANCH=main, name=rhceph, vcs-type=git, io.openshift.tags=rhceph ceph, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_CLEAN=True, GIT_REPO=https://github.com/ceph/ceph-container.git) Dec 15 04:15:56 localhost systemd[1]: Started libpod-conmon-dbf2be69b7a2dc5c4f81f178ee7cf98a0066dd487ffd80f9b9f9f33e984ae208.scope. Dec 15 04:15:56 localhost systemd[1]: Started libcrun container. 
Dec 15 04:15:56 localhost podman[126561]: 2025-12-15 09:15:56.212083336 +0000 UTC m=+0.044734521 image pull registry.redhat.io/rhceph/rhceph-7-rhel9:latest Dec 15 04:15:56 localhost podman[126561]: 2025-12-15 09:15:56.323331421 +0000 UTC m=+0.155982576 container init dbf2be69b7a2dc5c4f81f178ee7cf98a0066dd487ffd80f9b9f9f33e984ae208 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=happy_hellman, maintainer=Guillaume Abrioux , RELEASE=main, vendor=Red Hat, Inc., io.k8s.description=Red Hat Ceph Storage 7, GIT_REPO=https://github.com/ceph/ceph-container.git, description=Red Hat Ceph Storage 7, ceph=True, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, architecture=x86_64, GIT_BRANCH=main, io.openshift.expose-services=, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, GIT_CLEAN=True, io.buildah.version=1.41.4, build-date=2025-11-26T19:44:28Z, name=rhceph, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, release=1763362218, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., CEPH_POINT_RELEASE=, io.openshift.tags=rhceph ceph, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, com.redhat.component=rhceph-container, distribution-scope=public, version=7) Dec 15 04:15:56 localhost systemd[1]: tmp-crun.2lLqFS.mount: Deactivated successfully. 
Dec 15 04:15:56 localhost podman[126561]: 2025-12-15 09:15:56.338020858 +0000 UTC m=+0.170672003 container start dbf2be69b7a2dc5c4f81f178ee7cf98a0066dd487ffd80f9b9f9f33e984ae208 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=happy_hellman, io.buildah.version=1.41.4, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, ceph=True, architecture=x86_64, com.redhat.component=rhceph-container, io.openshift.tags=rhceph ceph, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_REPO=https://github.com/ceph/ceph-container.git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat Ceph Storage 7, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.k8s.description=Red Hat Ceph Storage 7, GIT_BRANCH=main, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, vendor=Red Hat, Inc., name=rhceph, distribution-scope=public, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, GIT_CLEAN=True, RELEASE=main, CEPH_POINT_RELEASE=, build-date=2025-11-26T19:44:28Z, version=7, release=1763362218, maintainer=Guillaume Abrioux , io.openshift.expose-services=) Dec 15 04:15:56 localhost podman[126561]: 2025-12-15 09:15:56.338297155 +0000 UTC m=+0.170948350 container attach dbf2be69b7a2dc5c4f81f178ee7cf98a0066dd487ffd80f9b9f9f33e984ae208 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=happy_hellman, vcs-type=git, GIT_CLEAN=True, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, ceph=True, CEPH_POINT_RELEASE=, io.buildah.version=1.41.4, build-date=2025-11-26T19:44:28Z, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_BRANCH=main, release=1763362218, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.k8s.description=Red Hat Ceph Storage 7, name=rhceph, 
com.redhat.component=rhceph-container, io.openshift.expose-services=, url=https://catalog.redhat.com/en/search?searchType=containers, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.openshift.tags=rhceph ceph, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, RELEASE=main, distribution-scope=public, GIT_REPO=https://github.com/ceph/ceph-container.git, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, vendor=Red Hat, Inc., version=7, description=Red Hat Ceph Storage 7, architecture=x86_64, maintainer=Guillaume Abrioux ) Dec 15 04:15:56 localhost happy_hellman[126583]: 167 167 Dec 15 04:15:56 localhost systemd[1]: libpod-dbf2be69b7a2dc5c4f81f178ee7cf98a0066dd487ffd80f9b9f9f33e984ae208.scope: Deactivated successfully. Dec 15 04:15:56 localhost podman[126561]: 2025-12-15 09:15:56.34337194 +0000 UTC m=+0.176023115 container died dbf2be69b7a2dc5c4f81f178ee7cf98a0066dd487ffd80f9b9f9f33e984ae208 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=happy_hellman, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, io.k8s.description=Red Hat Ceph Storage 7, com.redhat.component=rhceph-container, RELEASE=main, name=rhceph, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., release=1763362218, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, CEPH_POINT_RELEASE=, build-date=2025-11-26T19:44:28Z, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, vcs-type=git, ceph=True, io.openshift.tags=rhceph ceph, url=https://catalog.redhat.com/en/search?searchType=containers, io.buildah.version=1.41.4, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, 
GIT_REPO=https://github.com/ceph/ceph-container.git, maintainer=Guillaume Abrioux , io.openshift.expose-services=, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, architecture=x86_64, version=7, GIT_BRANCH=main, GIT_CLEAN=True, distribution-scope=public, description=Red Hat Ceph Storage 7) Dec 15 04:15:56 localhost podman[126590]: 2025-12-15 09:15:56.412854813 +0000 UTC m=+0.061751070 container remove dbf2be69b7a2dc5c4f81f178ee7cf98a0066dd487ffd80f9b9f9f33e984ae208 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=happy_hellman, io.openshift.expose-services=, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, RELEASE=main, io.k8s.description=Red Hat Ceph Storage 7, description=Red Hat Ceph Storage 7, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1763362218, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.openshift.tags=rhceph ceph, GIT_BRANCH=main, distribution-scope=public, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., architecture=x86_64, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_REPO=https://github.com/ceph/ceph-container.git, build-date=2025-11-26T19:44:28Z, CEPH_POINT_RELEASE=, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, maintainer=Guillaume Abrioux , GIT_CLEAN=True, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.component=rhceph-container, vcs-type=git, ceph=True, io.buildah.version=1.41.4, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, name=rhceph, version=7) Dec 15 04:15:56 localhost systemd[1]: libpod-conmon-dbf2be69b7a2dc5c4f81f178ee7cf98a0066dd487ffd80f9b9f9f33e984ae208.scope: Deactivated successfully. 
Dec 15 04:15:56 localhost podman[126619]: Dec 15 04:15:56 localhost podman[126619]: 2025-12-15 09:15:56.608329719 +0000 UTC m=+0.068721834 container create 2281d169848bb43bdc9002b24de138e41921843ab74f094e8c40bbb4f829b7c3 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=confident_diffie, distribution-scope=public, CEPH_POINT_RELEASE=, vendor=Red Hat, Inc., io.k8s.description=Red Hat Ceph Storage 7, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, build-date=2025-11-26T19:44:28Z, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.buildah.version=1.41.4, io.openshift.expose-services=, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, release=1763362218, GIT_REPO=https://github.com/ceph/ceph-container.git, architecture=x86_64, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, url=https://catalog.redhat.com/en/search?searchType=containers, description=Red Hat Ceph Storage 7, maintainer=Guillaume Abrioux , io.openshift.tags=rhceph ceph, RELEASE=main, version=7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., ceph=True, GIT_CLEAN=True, vcs-type=git, GIT_BRANCH=main, com.redhat.component=rhceph-container, name=rhceph, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0) Dec 15 04:15:56 localhost systemd[1]: Started libpod-conmon-2281d169848bb43bdc9002b24de138e41921843ab74f094e8c40bbb4f829b7c3.scope. Dec 15 04:15:56 localhost systemd[1]: Started libcrun container. 
Dec 15 04:15:56 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/e8f425d50de6f083e1b26d1987438ec30fd0f7702ee9a722ef39f7e641ba61b4/merged/rootfs supports timestamps until 2038 (0x7fffffff) Dec 15 04:15:56 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/e8f425d50de6f083e1b26d1987438ec30fd0f7702ee9a722ef39f7e641ba61b4/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff) Dec 15 04:15:56 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/e8f425d50de6f083e1b26d1987438ec30fd0f7702ee9a722ef39f7e641ba61b4/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff) Dec 15 04:15:56 localhost podman[126619]: 2025-12-15 09:15:56.670816777 +0000 UTC m=+0.131208912 container init 2281d169848bb43bdc9002b24de138e41921843ab74f094e8c40bbb4f829b7c3 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=confident_diffie, io.buildah.version=1.41.4, io.openshift.expose-services=, release=1763362218, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, CEPH_POINT_RELEASE=, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., maintainer=Guillaume Abrioux , distribution-scope=public, name=rhceph, build-date=2025-11-26T19:44:28Z, GIT_REPO=https://github.com/ceph/ceph-container.git, ceph=True, description=Red Hat Ceph Storage 7, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, io.k8s.description=Red Hat Ceph Storage 7, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, com.redhat.component=rhceph-container, RELEASE=main, io.openshift.tags=rhceph ceph, vcs-type=git, GIT_CLEAN=True, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, architecture=x86_64, GIT_BRANCH=main, version=7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, 
url=https://catalog.redhat.com/en/search?searchType=containers) Dec 15 04:15:56 localhost podman[126619]: 2025-12-15 09:15:56.680705147 +0000 UTC m=+0.141097272 container start 2281d169848bb43bdc9002b24de138e41921843ab74f094e8c40bbb4f829b7c3 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=confident_diffie, io.openshift.tags=rhceph ceph, name=rhceph, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:enterprise_linux:9::appstream, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, url=https://catalog.redhat.com/en/search?searchType=containers, CEPH_POINT_RELEASE=, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., architecture=x86_64, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, RELEASE=main, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, version=7, vcs-type=git, release=1763362218, com.redhat.component=rhceph-container, io.buildah.version=1.41.4, io.k8s.description=Red Hat Ceph Storage 7, GIT_REPO=https://github.com/ceph/ceph-container.git, maintainer=Guillaume Abrioux , ceph=True, GIT_BRANCH=main, GIT_CLEAN=True, description=Red Hat Ceph Storage 7, build-date=2025-11-26T19:44:28Z, io.openshift.expose-services=) Dec 15 04:15:56 localhost podman[126619]: 2025-12-15 09:15:56.681335775 +0000 UTC m=+0.141727940 container attach 2281d169848bb43bdc9002b24de138e41921843ab74f094e8c40bbb4f829b7c3 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=confident_diffie, GIT_REPO=https://github.com/ceph/ceph-container.git, GIT_BRANCH=main, io.openshift.tags=rhceph ceph, CEPH_POINT_RELEASE=, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., name=rhceph, vcs-type=git, io.buildah.version=1.41.4, ceph=True, 
cpe=cpe:/a:redhat:enterprise_linux:9::appstream, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, description=Red Hat Ceph Storage 7, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, vendor=Red Hat, Inc., version=7, build-date=2025-11-26T19:44:28Z, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, release=1763362218, architecture=x86_64, url=https://catalog.redhat.com/en/search?searchType=containers, RELEASE=main, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, GIT_CLEAN=True, io.openshift.expose-services=, maintainer=Guillaume Abrioux , com.redhat.component=rhceph-container, distribution-scope=public, io.k8s.description=Red Hat Ceph Storage 7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9) Dec 15 04:15:56 localhost podman[126619]: 2025-12-15 09:15:56.585543338 +0000 UTC m=+0.045935483 image pull registry.redhat.io/rhceph/rhceph-7-rhel9:latest Dec 15 04:15:57 localhost systemd[1]: var-lib-containers-storage-overlay-3484a025d69f7784aa23fb2a2864b942d5c7e09154ec2fefdc5ba44719793aa6-merged.mount: Deactivated successfully. 
Dec 15 04:15:57 localhost confident_diffie[126640]: [ Dec 15 04:15:57 localhost confident_diffie[126640]: { Dec 15 04:15:57 localhost confident_diffie[126640]: "available": false, Dec 15 04:15:57 localhost confident_diffie[126640]: "ceph_device": false, Dec 15 04:15:57 localhost confident_diffie[126640]: "device_id": "QEMU_DVD-ROM_QM00001", Dec 15 04:15:57 localhost confident_diffie[126640]: "lsm_data": {}, Dec 15 04:15:57 localhost confident_diffie[126640]: "lvs": [], Dec 15 04:15:57 localhost confident_diffie[126640]: "path": "/dev/sr0", Dec 15 04:15:57 localhost confident_diffie[126640]: "rejected_reasons": [ Dec 15 04:15:57 localhost confident_diffie[126640]: "Has a FileSystem", Dec 15 04:15:57 localhost confident_diffie[126640]: "Insufficient space (<5GB)" Dec 15 04:15:57 localhost confident_diffie[126640]: ], Dec 15 04:15:57 localhost confident_diffie[126640]: "sys_api": { Dec 15 04:15:57 localhost confident_diffie[126640]: "actuators": null, Dec 15 04:15:57 localhost confident_diffie[126640]: "device_nodes": "sr0", Dec 15 04:15:57 localhost confident_diffie[126640]: "human_readable_size": "482.00 KB", Dec 15 04:15:57 localhost confident_diffie[126640]: "id_bus": "ata", Dec 15 04:15:57 localhost confident_diffie[126640]: "model": "QEMU DVD-ROM", Dec 15 04:15:57 localhost confident_diffie[126640]: "nr_requests": "2", Dec 15 04:15:57 localhost confident_diffie[126640]: "partitions": {}, Dec 15 04:15:57 localhost confident_diffie[126640]: "path": "/dev/sr0", Dec 15 04:15:57 localhost confident_diffie[126640]: "removable": "1", Dec 15 04:15:57 localhost confident_diffie[126640]: "rev": "2.5+", Dec 15 04:15:57 localhost confident_diffie[126640]: "ro": "0", Dec 15 04:15:57 localhost confident_diffie[126640]: "rotational": "1", Dec 15 04:15:57 localhost confident_diffie[126640]: "sas_address": "", Dec 15 04:15:57 localhost confident_diffie[126640]: "sas_device_handle": "", Dec 15 04:15:57 localhost confident_diffie[126640]: "scheduler_mode": "mq-deadline", Dec 15 
04:15:57 localhost confident_diffie[126640]: "sectors": 0, Dec 15 04:15:57 localhost confident_diffie[126640]: "sectorsize": "2048", Dec 15 04:15:57 localhost confident_diffie[126640]: "size": 493568.0, Dec 15 04:15:57 localhost confident_diffie[126640]: "support_discard": "0", Dec 15 04:15:57 localhost confident_diffie[126640]: "type": "disk", Dec 15 04:15:57 localhost confident_diffie[126640]: "vendor": "QEMU" Dec 15 04:15:57 localhost confident_diffie[126640]: } Dec 15 04:15:57 localhost confident_diffie[126640]: } Dec 15 04:15:57 localhost confident_diffie[126640]: ] Dec 15 04:15:57 localhost systemd[1]: libpod-2281d169848bb43bdc9002b24de138e41921843ab74f094e8c40bbb4f829b7c3.scope: Deactivated successfully. Dec 15 04:15:57 localhost podman[126619]: 2025-12-15 09:15:57.511136934 +0000 UTC m=+0.971529089 container died 2281d169848bb43bdc9002b24de138e41921843ab74f094e8c40bbb4f829b7c3 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=confident_diffie, description=Red Hat Ceph Storage 7, ceph=True, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, distribution-scope=public, url=https://catalog.redhat.com/en/search?searchType=containers, GIT_REPO=https://github.com/ceph/ceph-container.git, io.openshift.expose-services=, com.redhat.component=rhceph-container, RELEASE=main, vcs-type=git, architecture=x86_64, version=7, build-date=2025-11-26T19:44:28Z, vendor=Red Hat, Inc., summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_CLEAN=True, GIT_BRANCH=main, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, io.openshift.tags=rhceph ceph, name=rhceph, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.buildah.version=1.41.4, maintainer=Guillaume Abrioux , cpe=cpe:/a:redhat:enterprise_linux:9::appstream, CEPH_POINT_RELEASE=, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, 
io.k8s.description=Red Hat Ceph Storage 7, release=1763362218) Dec 15 04:15:57 localhost systemd[1]: var-lib-containers-storage-overlay-e8f425d50de6f083e1b26d1987438ec30fd0f7702ee9a722ef39f7e641ba61b4-merged.mount: Deactivated successfully. Dec 15 04:15:57 localhost podman[128258]: 2025-12-15 09:15:57.627253266 +0000 UTC m=+0.102489554 container remove 2281d169848bb43bdc9002b24de138e41921843ab74f094e8c40bbb4f829b7c3 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=confident_diffie, io.openshift.tags=rhceph ceph, io.openshift.expose-services=, maintainer=Guillaume Abrioux , com.redhat.component=rhceph-container, description=Red Hat Ceph Storage 7, GIT_BRANCH=main, vendor=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.k8s.description=Red Hat Ceph Storage 7, build-date=2025-11-26T19:44:28Z, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, architecture=x86_64, RELEASE=main, name=rhceph, ceph=True, CEPH_POINT_RELEASE=, GIT_REPO=https://github.com/ceph/ceph-container.git, distribution-scope=public, release=1763362218, version=7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vcs-type=git, io.buildah.version=1.41.4, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, GIT_CLEAN=True, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0) Dec 15 04:15:57 localhost systemd[1]: libpod-conmon-2281d169848bb43bdc9002b24de138e41921843ab74f094e8c40bbb4f829b7c3.scope: Deactivated successfully. 
Dec 15 04:15:57 localhost python3.9[128342]: ansible-ansible.builtin.file Invoked with group=zuul mode=0770 owner=zuul path=/root/.config/containers recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 15 04:15:58 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:fc:1f:13 MACDST=fa:16:3e:b3:bb:e3 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=6675 DF PROTO=TCP SPT=54984 DPT=9101 SEQ=3865151778 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A838DD5650000000001030307)
Dec 15 04:15:59 localhost python3.9[128462]: ansible-ansible.legacy.stat Invoked with path=/root/.config/containers/auth.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 15 04:16:00 localhost python3.9[128535]: ansible-ansible.legacy.copy Invoked with dest=/root/.config/containers/auth.json group=zuul mode=0660 owner=zuul src=/home/zuul/.ansible/tmp/ansible-tmp-1765790158.307672-722-47386799716140/.source.json _original_basename=.813j1obt follow=False checksum=bf21a9e8fbc5a3846fb05b4fa0859e0917b2202f backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 15 04:16:01 localhost python3.9[128627]: ansible-containers.podman.podman_image Invoked with auth_file=/root/.config/containers/auth.json name=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified tag=latest pull=True push=False force=False state=present executable=podman build={'force_rm': False, 'format': 'oci', 'cache': True, 'rm': True, 'annotation': None, 'file': None, 'container_file': None, 'volume': None, 'extra_args': None, 'target': None} push_args={'ssh': None, 'compress': None, 'format': None, 'remove_signatures': None, 'sign_by': None, 'dest': None, 'extra_args': None, 'transport': None} arch=None pull_extra_args=None path=None validate_certs=None username=None password=NOT_LOGGING_PARAMETER ca_cert_dir=None quadlet_dir=None quadlet_filename=None quadlet_file_mode=None quadlet_options=None
Dec 15 04:16:01 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:fc:1f:13 MACDST=fa:16:3e:b3:bb:e3 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=57469 DF PROTO=TCP SPT=47208 DPT=9882 SEQ=3485301865 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A838DE10C0000000001030307)
Dec 15 04:16:04 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:fc:1f:13 MACDST=fa:16:3e:b3:bb:e3 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=57471 DF PROTO=TCP SPT=47208 DPT=9882 SEQ=3485301865 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A838DED250000000001030307)
Dec 15 04:16:07 localhost podman[128639]: 2025-12-15 09:16:01.682701983 +0000 UTC m=+0.047414551 image pull quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified
Dec 15 04:16:07 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:fc:1f:13 MACDST=fa:16:3e:b3:bb:e3 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=57679 DF PROTO=TCP SPT=46614 DPT=9105 SEQ=3177898146 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A838DF8A50000000001030307)
Dec 15 04:16:09 localhost python3.9[128839]: ansible-containers.podman.podman_image Invoked with auth_file=/root/.config/containers/auth.json name=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified tag=latest pull=True push=False force=False state=present executable=podman build={'force_rm': False, 'format': 'oci', 'cache': True, 'rm': True, 'annotation': None, 'file': None, 'container_file': None, 'volume': None, 'extra_args': None, 'target': None} push_args={'ssh': None, 'compress': None, 'format': None, 'remove_signatures': None, 'sign_by': None, 'dest': None, 'extra_args': None, 'transport': None} arch=None pull_extra_args=None path=None validate_certs=None username=None password=NOT_LOGGING_PARAMETER ca_cert_dir=None quadlet_dir=None quadlet_filename=None quadlet_file_mode=None quadlet_options=None
Dec 15 04:16:11 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:fc:1f:13 MACDST=fa:16:3e:b3:bb:e3 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=6677 DF PROTO=TCP SPT=54984 DPT=9101 SEQ=3865151778 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A838E05250000000001030307)
Dec 15 04:16:17 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:fc:1f:13 MACDST=fa:16:3e:b3:bb:e3 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=57473 DF PROTO=TCP SPT=47208 DPT=9882 SEQ=3485301865 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A838E1D250000000001030307)
Dec 15 04:16:18 localhost podman[128852]: 2025-12-15 09:16:09.627539906 +0000 UTC m=+0.043592921 image pull quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Dec 15 04:16:20 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:fc:1f:13 MACDST=fa:16:3e:b3:bb:e3 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=57681 DF PROTO=TCP SPT=46614 DPT=9105 SEQ=3177898146 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A838E29250000000001030307)
Dec 15 04:16:20 localhost python3.9[129052]: ansible-containers.podman.podman_image Invoked with auth_file=/root/.config/containers/auth.json name=quay.io/podified-antelope-centos9/openstack-neutron-sriov-agent:current-podified tag=latest pull=True push=False force=False state=present executable=podman build={'force_rm': False, 'format': 'oci', 'cache': True, 'rm': True, 'annotation': None, 'file': None, 'container_file': None, 'volume': None, 'extra_args': None, 'target': None} push_args={'ssh': None, 'compress': None, 'format': None, 'remove_signatures': None, 'sign_by': None, 'dest': None, 'extra_args': None, 'transport': None} arch=None pull_extra_args=None path=None validate_certs=None username=None password=NOT_LOGGING_PARAMETER ca_cert_dir=None quadlet_dir=None quadlet_filename=None quadlet_file_mode=None quadlet_options=None
Dec 15 04:16:22 localhost podman[129066]: 2025-12-15 09:16:20.989324954 +0000 UTC m=+0.042808790 image pull quay.io/podified-antelope-centos9/openstack-neutron-sriov-agent:current-podified
Dec 15 04:16:23 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:fc:1f:13 MACDST=fa:16:3e:b3:bb:e3 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=59996 DF PROTO=TCP SPT=33630 DPT=9100 SEQ=2670952482 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A838E34930000000001030307)
Dec 15 04:16:23 localhost python3.9[129231]: ansible-containers.podman.podman_image Invoked with auth_file=/root/.config/containers/auth.json name=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified tag=latest pull=True push=False force=False state=present executable=podman build={'force_rm': False, 'format': 'oci', 'cache': True, 'rm': True, 'annotation': None, 'file': None, 'container_file': None, 'volume': None, 'extra_args': None, 'target': None} push_args={'ssh': None, 'compress': None, 'format': None, 'remove_signatures': None, 'sign_by': None, 'dest': None, 'extra_args': None, 'transport': None} arch=None pull_extra_args=None path=None validate_certs=None username=None password=NOT_LOGGING_PARAMETER ca_cert_dir=None quadlet_dir=None quadlet_filename=None quadlet_file_mode=None quadlet_options=None
Dec 15 04:16:24 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:fc:1f:13 MACDST=fa:16:3e:b3:bb:e3 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=59997 DF PROTO=TCP SPT=33630 DPT=9100 SEQ=2670952482 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A838E38A50000000001030307)
Dec 15 04:16:25 localhost podman[129243]: 2025-12-15 09:16:23.939598618 +0000 UTC m=+0.044623188 image pull quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified
Dec 15 04:16:25 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:fc:1f:13 MACDST=fa:16:3e:b3:bb:e3 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=29819 DF PROTO=TCP SPT=43002 DPT=9102 SEQ=4155888403 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A838E3EE50000000001030307)
Dec 15 04:16:26 localhost python3.9[129406]: ansible-containers.podman.podman_image Invoked with auth_file=/root/.config/containers/auth.json name=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified tag=latest pull=True push=False force=False state=present executable=podman build={'force_rm': False, 'format': 'oci', 'cache': True, 'rm': True, 'annotation': None, 'file': None, 'container_file': None, 'volume': None, 'extra_args': None, 'target': None} push_args={'ssh': None, 'compress': None, 'format': None, 'remove_signatures': None, 'sign_by': None, 'dest': None, 'extra_args': None, 'transport': None} arch=None pull_extra_args=None path=None validate_certs=None username=None password=NOT_LOGGING_PARAMETER ca_cert_dir=None quadlet_dir=None quadlet_filename=None quadlet_file_mode=None quadlet_options=None
Dec 15 04:16:27 localhost sshd[129431]: main: sshd: ssh-rsa algorithm is disabled
Dec 15 04:16:28 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:fc:1f:13 MACDST=fa:16:3e:b3:bb:e3 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=46024 DF PROTO=TCP SPT=33778 DPT=9101 SEQ=4273693366 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A838E4AA40000000001030307)
Dec 15 04:16:29 localhost podman[129418]: 2025-12-15 09:16:26.471510075 +0000 UTC m=+0.028620506 image pull quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified
Dec 15 04:16:31 localhost python3.9[129596]: ansible-containers.podman.podman_image Invoked with auth_file=/root/.config/containers/auth.json name=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c tag=latest pull=True push=False force=False state=present executable=podman build={'force_rm': False, 'format': 'oci', 'cache': True, 'rm': True, 'annotation': None, 'file': None, 'container_file': None, 'volume': None, 'extra_args': None, 'target': None} push_args={'ssh': None, 'compress': None, 'format': None, 'remove_signatures': None, 'sign_by': None, 'dest': None, 'extra_args': None, 'transport': None} arch=None pull_extra_args=None path=None validate_certs=None username=None password=NOT_LOGGING_PARAMETER ca_cert_dir=None quadlet_dir=None quadlet_filename=None quadlet_file_mode=None quadlet_options=None
Dec 15 04:16:31 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:fc:1f:13 MACDST=fa:16:3e:b3:bb:e3 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=29821 DF PROTO=TCP SPT=43002 DPT=9102 SEQ=4155888403 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A838E56A50000000001030307)
Dec 15 04:16:32 localhost podman[129610]: 2025-12-15 09:16:31.378631008 +0000 UTC m=+0.046113977 image pull quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c
Dec 15 04:16:33 localhost systemd-logind[763]: Session 39 logged out. Waiting for processes to exit.
Dec 15 04:16:33 localhost systemd[1]: session-39.scope: Deactivated successfully.
Dec 15 04:16:33 localhost systemd[1]: session-39.scope: Consumed 1min 29.161s CPU time.
Dec 15 04:16:33 localhost systemd-logind[763]: Removed session 39.
Dec 15 04:16:34 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:fc:1f:13 MACDST=fa:16:3e:b3:bb:e3 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=13393 DF PROTO=TCP SPT=57032 DPT=9882 SEQ=1979757074 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A838E62250000000001030307)
Dec 15 04:16:37 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:fc:1f:13 MACDST=fa:16:3e:b3:bb:e3 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=50475 DF PROTO=TCP SPT=37170 DPT=9105 SEQ=1114114959 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A838E6DE50000000001030307)
Dec 15 04:16:39 localhost sshd[129976]: main: sshd: ssh-rsa algorithm is disabled
Dec 15 04:16:39 localhost systemd-logind[763]: New session 40 of user zuul.
Dec 15 04:16:39 localhost systemd[1]: Started Session 40 of User zuul.
Dec 15 04:16:41 localhost python3.9[130069]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec 15 04:16:41 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:fc:1f:13 MACDST=fa:16:3e:b3:bb:e3 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=46026 DF PROTO=TCP SPT=33778 DPT=9101 SEQ=4273693366 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A838E7B260000000001030307)
Dec 15 04:16:42 localhost python3.9[130165]: ansible-ansible.builtin.getent Invoked with database=passwd key=openvswitch fail_key=True service=None split=None
Dec 15 04:16:43 localhost python3.9[130258]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Dec 15 04:16:44 localhost python3.9[130312]: ansible-ansible.legacy.dnf Invoked with download_only=True name=['openvswitch3.3'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Dec 15 04:16:47 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:fc:1f:13 MACDST=fa:16:3e:b3:bb:e3 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=13395 DF PROTO=TCP SPT=57032 DPT=9882 SEQ=1979757074 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A838E93250000000001030307)
Dec 15 04:16:50 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:fc:1f:13 MACDST=fa:16:3e:b3:bb:e3 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=50477 DF PROTO=TCP SPT=37170 DPT=9105 SEQ=1114114959 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A838E9D250000000001030307)
Dec 15 04:16:51 localhost python3.9[130660]: ansible-ansible.legacy.dnf Invoked with name=['openvswitch3.3'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Dec 15 04:16:53 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:fc:1f:13 MACDST=fa:16:3e:b3:bb:e3 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=48504 DF PROTO=TCP SPT=43250 DPT=9100 SEQ=2226060958 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A838EA9C30000000001030307)
Dec 15 04:16:54 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:fc:1f:13 MACDST=fa:16:3e:b3:bb:e3 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=48505 DF PROTO=TCP SPT=43250 DPT=9100 SEQ=2226060958 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A838EADE50000000001030307)
Dec 15 04:16:56 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:fc:1f:13 MACDST=fa:16:3e:b3:bb:e3 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=48506 DF PROTO=TCP SPT=43250 DPT=9100 SEQ=2226060958 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A838EB5E50000000001030307)
Dec 15 04:16:56 localhost python3.9[130754]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=openvswitch.service state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Dec 15 04:16:57 localhost python3.9[130847]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'selinux'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec 15 04:16:58 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:fc:1f:13 MACDST=fa:16:3e:b3:bb:e3 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=34163 DF PROTO=TCP SPT=54080 DPT=9101 SEQ=3775267932 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A838EBFE60000000001030307)
Dec 15 04:16:58 localhost python3.9[130969]: ansible-community.general.sefcontext Invoked with selevel=s0 setype=container_file_t state=present target=/var/lib/edpm-config(/.*)? ignore_selinux_state=False ftype=a reload=True substitute=None seuser=None
Dec 15 04:17:01 localhost kernel: SELinux: Converting 2756 SID table entries...
Dec 15 04:17:01 localhost kernel: SELinux: policy capability network_peer_controls=1
Dec 15 04:17:01 localhost kernel: SELinux: policy capability open_perms=1
Dec 15 04:17:01 localhost kernel: SELinux: policy capability extended_socket_class=1
Dec 15 04:17:01 localhost kernel: SELinux: policy capability always_check_network=0
Dec 15 04:17:01 localhost kernel: SELinux: policy capability cgroup_seclabel=1
Dec 15 04:17:01 localhost kernel: SELinux: policy capability nnp_nosuid_transition=1
Dec 15 04:17:01 localhost kernel: SELinux: policy capability genfs_seclabel_symlinks=1
Dec 15 04:17:01 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:fc:1f:13 MACDST=fa:16:3e:b3:bb:e3 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=58595 DF PROTO=TCP SPT=49030 DPT=9102 SEQ=2864381312 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A838ECBA50000000001030307)
Dec 15 04:17:02 localhost dbus-broker-launch[755]: avc: op=load_policy lsm=selinux seqno=18 res=1
Dec 15 04:17:02 localhost python3.9[131195]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local', 'distribution'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec 15 04:17:03 localhost python3.9[131293]: ansible-ansible.legacy.dnf Invoked with name=['driverctl', 'lvm2', 'crudini', 'jq', 'nftables', 'NetworkManager', 'openstack-selinux', 'python3-libselinux', 'python3-pyyaml', 'rsync', 'tmpwatch', 'sysstat', 'iproute-tc', 'ksmtuned', 'systemd-container', 'crypto-policies-scripts', 'grubby', 'sos'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Dec 15 04:17:04 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:fc:1f:13 MACDST=fa:16:3e:b3:bb:e3 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=28808 DF PROTO=TCP SPT=42920 DPT=9882 SEQ=926188335 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A838ED7650000000001030307)
Dec 15 04:17:07 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:fc:1f:13 MACDST=fa:16:3e:b3:bb:e3 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=37009 DF PROTO=TCP SPT=34150 DPT=9105 SEQ=2540994890 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A838EE3250000000001030307)
Dec 15 04:17:08 localhost python3.9[131387]: ansible-ansible.legacy.command Invoked with _raw_params=rpm -V driverctl lvm2 crudini jq nftables NetworkManager openstack-selinux python3-libselinux python3-pyyaml rsync tmpwatch sysstat iproute-tc ksmtuned systemd-container crypto-policies-scripts grubby sos _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 15 04:17:10 localhost python3.9[131632]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/edpm-config selevel=s0 setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None attributes=None
Dec 15 04:17:10 localhost python3.9[131722]: ansible-ansible.builtin.stat Invoked with path=/etc/cloud/cloud.cfg.d follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec 15 04:17:11 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:fc:1f:13 MACDST=fa:16:3e:b3:bb:e3 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=34165 DF PROTO=TCP SPT=54080 DPT=9101 SEQ=3775267932 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A838EEF250000000001030307)
Dec 15 04:17:11 localhost python3.9[131816]: ansible-ansible.legacy.dnf Invoked with name=['os-net-config'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Dec 15 04:17:15 localhost python3.9[131910]: ansible-ansible.legacy.dnf Invoked with name=['openstack-network-scripts'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Dec 15 04:17:17 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:fc:1f:13 MACDST=fa:16:3e:b3:bb:e3 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=28810 DF PROTO=TCP SPT=42920 DPT=9882 SEQ=926188335 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A838F07250000000001030307)
Dec 15 04:17:20 localhost python3.9[132004]: ansible-ansible.builtin.systemd Invoked with enabled=True name=network daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None masked=None
Dec 15 04:17:20 localhost systemd[1]: Reloading.
Dec 15 04:17:20 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:fc:1f:13 MACDST=fa:16:3e:b3:bb:e3 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=37011 DF PROTO=TCP SPT=34150 DPT=9105 SEQ=2540994890 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A838F13250000000001030307)
Dec 15 04:17:20 localhost systemd-sysv-generator[132037]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 15 04:17:20 localhost systemd-rc-local-generator[132032]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 15 04:17:20 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Dec 15 04:17:21 localhost python3.9[132136]: ansible-ansible.builtin.stat Invoked with path=/var/lib/edpm-config/os-net-config.returncode follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec 15 04:17:22 localhost python3.9[132228]: ansible-community.general.ini_file Invoked with backup=True mode=0644 no_extra_spaces=True option=no-auto-default path=/etc/NetworkManager/NetworkManager.conf section=main state=present value=* exclusive=True ignore_spaces=False allow_no_value=False modify_inactive_option=True create=True follow=False unsafe_writes=False section_has_values=None values=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 15 04:17:23 localhost python3.9[132322]: ansible-community.general.ini_file Invoked with backup=True mode=0644 no_extra_spaces=True option=dns path=/etc/NetworkManager/NetworkManager.conf section=main state=present value=none exclusive=True ignore_spaces=False allow_no_value=False modify_inactive_option=True create=True follow=False unsafe_writes=False section_has_values=None values=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 15 04:17:23 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:fc:1f:13 MACDST=fa:16:3e:b3:bb:e3 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=18727 DF PROTO=TCP SPT=53838 DPT=9100 SEQ=3420402271 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A838F1EF20000000001030307)
Dec 15 04:17:23 localhost python3.9[132414]: ansible-community.general.ini_file Invoked with backup=True mode=0644 no_extra_spaces=True option=rc-manager path=/etc/NetworkManager/NetworkManager.conf section=main state=present value=unmanaged exclusive=True ignore_spaces=False allow_no_value=False modify_inactive_option=True create=True follow=False unsafe_writes=False section_has_values=None values=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 15 04:17:24 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:fc:1f:13 MACDST=fa:16:3e:b3:bb:e3 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=18728 DF PROTO=TCP SPT=53838 DPT=9100 SEQ=3420402271 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A838F22E60000000001030307)
Dec 15 04:17:24 localhost python3.9[132506]: ansible-ansible.legacy.stat Invoked with path=/etc/dhcp/dhclient-enter-hooks follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 15 04:17:25 localhost python3.9[132579]: ansible-ansible.legacy.copy Invoked with dest=/etc/dhcp/dhclient-enter-hooks mode=0755 src=/home/zuul/.ansible/tmp/ansible-tmp-1765790244.0961187-563-132071369590545/.source _original_basename=.qss7hu2c follow=False checksum=f6278a40de79a9841f6ed1fc584538225566990c backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 15 04:17:25 localhost python3.9[132671]: ansible-ansible.builtin.file Invoked with mode=0755 path=/etc/os-net-config state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 15 04:17:26 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:fc:1f:13 MACDST=fa:16:3e:b3:bb:e3 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=18729 DF PROTO=TCP SPT=53838 DPT=9100 SEQ=3420402271 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A838F2AE50000000001030307)
Dec 15 04:17:26 localhost python3.9[132763]: ansible-edpm_os_net_config_mappings Invoked with net_config_data_lookup={}
Dec 15 04:17:28 localhost python3.9[132855]: ansible-ansible.builtin.file Invoked with path=/var/lib/edpm-config/scripts state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 15 04:17:28 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:fc:1f:13 MACDST=fa:16:3e:b3:bb:e3 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=48896 DF PROTO=TCP SPT=37534 DPT=9101 SEQ=3417086040 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A838F34E50000000001030307)
Dec 15 04:17:29 localhost python3.9[132947]: ansible-ansible.legacy.stat Invoked with path=/etc/os-net-config/config.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 15 04:17:29 localhost python3.9[133020]: ansible-ansible.legacy.copy Invoked with backup=True dest=/etc/os-net-config/config.yaml mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1765790248.8028598-689-62041238224506/.source.yaml _original_basename=.n7orau4w follow=False checksum=0cadac3cfc033a4e07cfac59b43f6459e787700a force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 15 04:17:30 localhost python3.9[133112]: ansible-ansible.builtin.slurp Invoked with path=/etc/os-net-config/config.yaml src=/etc/os-net-config/config.yaml
Dec 15 04:17:31 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:fc:1f:13 MACDST=fa:16:3e:b3:bb:e3 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=19184 DF PROTO=TCP SPT=39410 DPT=9102 SEQ=3064315406 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A838F40E60000000001030307)
Dec 15 04:17:32 localhost ansible-async_wrapper.py[133217]: Invoked with j204740626550 300 /home/zuul/.ansible/tmp/ansible-tmp-1765790251.2268753-761-63361080706090/AnsiballZ_edpm_os_net_config.py _
Dec 15 04:17:32 localhost ansible-async_wrapper.py[133220]: Starting module and watcher
Dec 15 04:17:32 localhost ansible-async_wrapper.py[133220]: Start watching 133221 (300)
Dec 15 04:17:32 localhost ansible-async_wrapper.py[133221]: Start module (133221)
Dec 15 04:17:32 localhost ansible-async_wrapper.py[133217]: Return async_wrapper task started.
Dec 15 04:17:32 localhost python3.9[133222]: ansible-edpm_os_net_config Invoked with cleanup=False config_file=/etc/os-net-config/config.yaml debug=True detailed_exit_codes=True safe_defaults=False use_nmstate=False
Dec 15 04:17:32 localhost ansible-async_wrapper.py[133221]: Module complete (133221)
Dec 15 04:17:34 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:fc:1f:13 MACDST=fa:16:3e:b3:bb:e3 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=31194 DF PROTO=TCP SPT=50168 DPT=9882 SEQ=2341607757 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A838F4CA50000000001030307)
Dec 15 04:17:35 localhost python3.9[133314]: ansible-ansible.legacy.async_status Invoked with jid=j204740626550.133217 mode=status _async_dir=/root/.ansible_async
Dec 15 04:17:36 localhost python3.9[133373]: ansible-ansible.legacy.async_status Invoked with jid=j204740626550.133217 mode=cleanup _async_dir=/root/.ansible_async
Dec 15 04:17:36 localhost python3.9[133465]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/os-net-config.returncode follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 15 04:17:37 localhost ansible-async_wrapper.py[133220]: Done in kid B.
Dec 15 04:17:37 localhost python3.9[133538]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/os-net-config.returncode mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1765790256.5317996-827-278408114649813/.source.returncode _original_basename=.03mmu1pi follow=False checksum=b6589fc6ab0dc82cf12099d1c2d40ab994e8410c backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 15 04:17:37 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:fc:1f:13 MACDST=fa:16:3e:b3:bb:e3 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=19523 DF PROTO=TCP SPT=45130 DPT=9105 SEQ=3180753668 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A838F58250000000001030307)
Dec 15 04:17:38 localhost python3.9[133630]: ansible-ansible.legacy.stat Invoked with path=/etc/cloud/cloud.cfg.d/99-edpm-disable-network-config.cfg follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 15 04:17:39 localhost python3.9[133703]: ansible-ansible.legacy.copy Invoked with dest=/etc/cloud/cloud.cfg.d/99-edpm-disable-network-config.cfg mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1765790258.2685106-875-248783081485551/.source.cfg _original_basename=.35r6fwm8 follow=False checksum=f3c5952a9cd4c6c31b314b25eb897168971cc86e backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 15 04:17:40 localhost python3.9[133795]: ansible-ansible.builtin.systemd Invoked with name=NetworkManager state=reloaded daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Dec 15 04:17:40 localhost systemd[1]: Reloading Network Manager...
Dec 15 04:17:40 localhost NetworkManager[5963]: [1765790260.9262] audit: op="reload" arg="0" pid=133799 uid=0 result="success"
Dec 15 04:17:40 localhost NetworkManager[5963]: [1765790260.9271] config: signal: SIGHUP (no changes from disk)
Dec 15 04:17:40 localhost systemd[1]: Reloaded Network Manager.
Dec 15 04:17:41 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:fc:1f:13 MACDST=fa:16:3e:b3:bb:e3 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=48898 DF PROTO=TCP SPT=37534 DPT=9101 SEQ=3417086040 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A838F65250000000001030307)
Dec 15 04:17:42 localhost systemd[1]: session-40.scope: Deactivated successfully.
Dec 15 04:17:42 localhost systemd[1]: session-40.scope: Consumed 34.597s CPU time.
Dec 15 04:17:42 localhost systemd-logind[763]: Session 40 logged out. Waiting for processes to exit.
Dec 15 04:17:42 localhost systemd-logind[763]: Removed session 40.
Dec 15 04:17:47 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:fc:1f:13 MACDST=fa:16:3e:b3:bb:e3 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=31196 DF PROTO=TCP SPT=50168 DPT=9882 SEQ=2341607757 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A838F7D260000000001030307)
Dec 15 04:17:47 localhost sshd[133814]: main: sshd: ssh-rsa algorithm is disabled
Dec 15 04:17:47 localhost systemd-logind[763]: New session 41 of user zuul.
Dec 15 04:17:47 localhost systemd[1]: Started Session 41 of User zuul.
Dec 15 04:17:48 localhost python3.9[133907]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d Dec 15 04:17:50 localhost python3.9[134001]: ansible-ansible.builtin.setup Invoked with filter=['ansible_default_ipv4'] gather_subset=['!all', '!min', 'network'] gather_timeout=10 fact_path=/etc/ansible/facts.d Dec 15 04:17:50 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:fc:1f:13 MACDST=fa:16:3e:b3:bb:e3 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=19525 DF PROTO=TCP SPT=45130 DPT=9105 SEQ=3180753668 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A838F89260000000001030307) Dec 15 04:17:52 localhost python3.9[134154]: ansible-ansible.legacy.command Invoked with _raw_params=hostname -f _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None Dec 15 04:17:52 localhost systemd[1]: session-41.scope: Deactivated successfully. Dec 15 04:17:52 localhost systemd[1]: session-41.scope: Consumed 2.050s CPU time. Dec 15 04:17:52 localhost systemd-logind[763]: Session 41 logged out. Waiting for processes to exit. Dec 15 04:17:52 localhost systemd-logind[763]: Removed session 41. 
Dec 15 04:17:53 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:fc:1f:13 MACDST=fa:16:3e:b3:bb:e3 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=21316 DF PROTO=TCP SPT=43418 DPT=9100 SEQ=2984669499 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A838F94230000000001030307) Dec 15 04:17:54 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:fc:1f:13 MACDST=fa:16:3e:b3:bb:e3 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=21317 DF PROTO=TCP SPT=43418 DPT=9100 SEQ=2984669499 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A838F98260000000001030307) Dec 15 04:17:55 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:fc:1f:13 MACDST=fa:16:3e:b3:bb:e3 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=9631 DF PROTO=TCP SPT=48000 DPT=9102 SEQ=1814701240 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A838F9E660000000001030307) Dec 15 04:17:58 localhost sshd[134170]: main: sshd: ssh-rsa algorithm is disabled Dec 15 04:17:58 localhost systemd-logind[763]: New session 42 of user zuul. Dec 15 04:17:58 localhost systemd[1]: Started Session 42 of User zuul. 
Dec 15 04:17:58 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:fc:1f:13 MACDST=fa:16:3e:b3:bb:e3 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=2118 DF PROTO=TCP SPT=49926 DPT=9101 SEQ=561210028 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A838FAA250000000001030307) Dec 15 04:18:00 localhost python3.9[134263]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d Dec 15 04:18:01 localhost python3.9[134357]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d Dec 15 04:18:01 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:fc:1f:13 MACDST=fa:16:3e:b3:bb:e3 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=9633 DF PROTO=TCP SPT=48000 DPT=9102 SEQ=1814701240 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A838FB6260000000001030307) Dec 15 04:18:02 localhost python3.9[134453]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d Dec 15 04:18:03 localhost python3.9[134557]: ansible-ansible.legacy.dnf Invoked with name=['podman'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None Dec 15 04:18:04 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:fc:1f:13 MACDST=fa:16:3e:b3:bb:e3 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 
TOS=0x00 PREC=0x00 TTL=62 ID=14612 DF PROTO=TCP SPT=49644 DPT=9882 SEQ=2985123419 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A838FC1E50000000001030307) Dec 15 04:18:07 localhost python3.9[134727]: ansible-ansible.builtin.setup Invoked with filter=['ansible_interfaces'] gather_subset=['!all', '!min', 'network'] gather_timeout=10 fact_path=/etc/ansible/facts.d Dec 15 04:18:07 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:fc:1f:13 MACDST=fa:16:3e:b3:bb:e3 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=5006 DF PROTO=TCP SPT=35674 DPT=9105 SEQ=2536839382 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A838FCD650000000001030307) Dec 15 04:18:08 localhost python3.9[134882]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/containers/networks recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None Dec 15 04:18:09 localhost python3.9[134974]: ansible-ansible.legacy.command Invoked with _raw_params=podman network inspect podman#012 _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None Dec 15 04:18:10 localhost python3.9[135077]: ansible-ansible.legacy.stat Invoked with path=/etc/containers/networks/podman.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Dec 15 04:18:10 localhost python3.9[135125]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/containers/networks/podman.json _original_basename=podman_network_config.j2 recurse=False state=file path=/etc/containers/networks/podman.json 
force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None Dec 15 04:18:11 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:fc:1f:13 MACDST=fa:16:3e:b3:bb:e3 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=2120 DF PROTO=TCP SPT=49926 DPT=9101 SEQ=561210028 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A838FDB260000000001030307) Dec 15 04:18:11 localhost python3.9[135217]: ansible-ansible.legacy.stat Invoked with path=/etc/containers/registries.conf.d/20-edpm-podman-registries.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Dec 15 04:18:12 localhost python3.9[135265]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root setype=etc_t dest=/etc/containers/registries.conf.d/20-edpm-podman-registries.conf _original_basename=registries.conf.j2 recurse=False state=file path=/etc/containers/registries.conf.d/20-edpm-podman-registries.conf force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None Dec 15 04:18:13 localhost python3.9[135357]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=pids_limit owner=root path=/etc/containers/containers.conf section=containers setype=etc_t value=4096 backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None Dec 15 04:18:13 localhost python3.9[135449]: 
ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=events_logger owner=root path=/etc/containers/containers.conf section=engine setype=etc_t value="journald" backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None Dec 15 04:18:14 localhost python3.9[135541]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=runtime owner=root path=/etc/containers/containers.conf section=engine setype=etc_t value="crun" backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None Dec 15 04:18:15 localhost python3.9[135633]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=network_backend owner=root path=/etc/containers/containers.conf section=network setype=etc_t value="netavark" backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None Dec 15 04:18:16 localhost python3.9[135725]: ansible-ansible.legacy.dnf Invoked with name=['openssh-server'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None 
download_dir=None list=None nobest=None releasever=None Dec 15 04:18:17 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:fc:1f:13 MACDST=fa:16:3e:b3:bb:e3 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=14614 DF PROTO=TCP SPT=49644 DPT=9882 SEQ=2985123419 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A838FF1250000000001030307) Dec 15 04:18:20 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:fc:1f:13 MACDST=fa:16:3e:b3:bb:e3 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=5008 DF PROTO=TCP SPT=35674 DPT=9105 SEQ=2536839382 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A838FFD250000000001030307) Dec 15 04:18:20 localhost python3.9[135819]: ansible-setup Invoked with gather_subset=['!all', '!min', 'distribution', 'distribution_major_version', 'distribution_version', 'os_family'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d Dec 15 04:18:21 localhost python3.9[135913]: ansible-stat Invoked with path=/run/ostree-booted follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1 Dec 15 04:18:22 localhost python3.9[136005]: ansible-stat Invoked with path=/sbin/transactional-update follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1 Dec 15 04:18:22 localhost auditd[726]: Audit daemon rotating log files Dec 15 04:18:23 localhost python3.9[136097]: ansible-ansible.legacy.command Invoked with _raw_params=systemctl is-system-running _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None Dec 15 04:18:23 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:fc:1f:13 MACDST=fa:16:3e:b3:bb:e3 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=61527 DF PROTO=TCP SPT=52404 
DPT=9100 SEQ=257834736 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A839009530000000001030307) Dec 15 04:18:24 localhost python3.9[136190]: ansible-service_facts Invoked Dec 15 04:18:24 localhost network[136207]: You are using 'network' service provided by 'network-scripts', which are now deprecated. Dec 15 04:18:24 localhost network[136208]: 'network-scripts' will be removed from distribution in near future. Dec 15 04:18:24 localhost network[136209]: It is advised to switch to 'NetworkManager' instead for network management. Dec 15 04:18:24 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:fc:1f:13 MACDST=fa:16:3e:b3:bb:e3 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=61528 DF PROTO=TCP SPT=52404 DPT=9100 SEQ=257834736 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A83900D650000000001030307) Dec 15 04:18:25 localhost systemd[1]: /usr/lib/systemd/system/insights-client.service:23: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. 
Dec 15 04:18:25 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:fc:1f:13 MACDST=fa:16:3e:b3:bb:e3 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=63396 DF PROTO=TCP SPT=45838 DPT=9102 SEQ=1503593644 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A839013A50000000001030307) Dec 15 04:18:28 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:fc:1f:13 MACDST=fa:16:3e:b3:bb:e3 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=59639 DF PROTO=TCP SPT=59758 DPT=9101 SEQ=934864302 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A83901F650000000001030307) Dec 15 04:18:29 localhost sshd[136439]: main: sshd: ssh-rsa algorithm is disabled Dec 15 04:18:30 localhost python3.9[136532]: ansible-ansible.legacy.dnf Invoked with name=['chrony'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None Dec 15 04:18:31 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:fc:1f:13 MACDST=fa:16:3e:b3:bb:e3 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=63398 DF PROTO=TCP SPT=45838 DPT=9102 SEQ=1503593644 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A83902B660000000001030307) Dec 15 04:18:34 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:fc:1f:13 MACDST=fa:16:3e:b3:bb:e3 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=18200 DF PROTO=TCP SPT=39980 DPT=9882 SEQ=175005657 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT 
(020405500402080A839036E50000000001030307) Dec 15 04:18:35 localhost python3.9[136626]: ansible-package_facts Invoked with manager=['auto'] strategy=first Dec 15 04:18:36 localhost python3.9[136718]: ansible-ansible.legacy.stat Invoked with path=/etc/chrony.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Dec 15 04:18:37 localhost python3.9[136793]: ansible-ansible.legacy.copy Invoked with backup=True dest=/etc/chrony.conf mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1765790316.288423-656-174299575851121/.source.conf follow=False _original_basename=chrony.conf.j2 checksum=cfb003e56d02d0d2c65555452eb1a05073fecdad force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Dec 15 04:18:37 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:fc:1f:13 MACDST=fa:16:3e:b3:bb:e3 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=57839 DF PROTO=TCP SPT=45040 DPT=9105 SEQ=3305006129 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A839042A50000000001030307) Dec 15 04:18:38 localhost python3.9[136887]: ansible-ansible.legacy.stat Invoked with path=/etc/sysconfig/chronyd follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Dec 15 04:18:38 localhost python3.9[136962]: ansible-ansible.legacy.copy Invoked with backup=True dest=/etc/sysconfig/chronyd mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1765790317.7410061-701-95719627419242/.source follow=False _original_basename=chronyd.sysconfig.j2 checksum=dd196b1ff1f915b23eebc37ec77405b5dd3df76c force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None 
serole=None selevel=None setype=None attributes=None Dec 15 04:18:40 localhost python3.9[137056]: ansible-lineinfile Invoked with backup=True create=True dest=/etc/sysconfig/network line=PEERNTP=no mode=0644 regexp=^PEERNTP= state=present path=/etc/sysconfig/network encoding=utf-8 backrefs=False firstmatch=False unsafe_writes=False search_string=None insertafter=None insertbefore=None validate=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Dec 15 04:18:41 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:fc:1f:13 MACDST=fa:16:3e:b3:bb:e3 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=59641 DF PROTO=TCP SPT=59758 DPT=9101 SEQ=934864302 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A83904F260000000001030307) Dec 15 04:18:42 localhost python3.9[137150]: ansible-ansible.legacy.setup Invoked with gather_subset=['!all'] filter=['ansible_service_mgr'] gather_timeout=10 fact_path=/etc/ansible/facts.d Dec 15 04:18:43 localhost python3.9[137204]: ansible-ansible.legacy.systemd Invoked with enabled=True name=chronyd state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None Dec 15 04:18:44 localhost python3.9[137298]: ansible-ansible.legacy.setup Invoked with gather_subset=['!all'] filter=['ansible_service_mgr'] gather_timeout=10 fact_path=/etc/ansible/facts.d Dec 15 04:18:45 localhost python3.9[137352]: ansible-ansible.legacy.systemd Invoked with name=chronyd state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None Dec 15 04:18:45 localhost chronyd[25720]: chronyd exiting Dec 15 04:18:45 localhost systemd[1]: Stopping NTP client/server... Dec 15 04:18:45 localhost systemd[1]: chronyd.service: Deactivated successfully. Dec 15 04:18:45 localhost systemd[1]: Stopped NTP client/server. Dec 15 04:18:45 localhost systemd[1]: Starting NTP client/server... 
Dec 15 04:18:45 localhost chronyd[137361]: chronyd version 4.3 starting (+CMDMON +NTP +REFCLOCK +RTC +PRIVDROP +SCFILTER +SIGND +ASYNCDNS +NTS +SECHASH +IPV6 +DEBUG) Dec 15 04:18:45 localhost chronyd[137361]: Frequency -26.333 +/- 0.170 ppm read from /var/lib/chrony/drift Dec 15 04:18:45 localhost chronyd[137361]: Loaded seccomp filter (level 2) Dec 15 04:18:45 localhost systemd[1]: Started NTP client/server. Dec 15 04:18:47 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:fc:1f:13 MACDST=fa:16:3e:b3:bb:e3 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=18202 DF PROTO=TCP SPT=39980 DPT=9882 SEQ=175005657 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A839067250000000001030307) Dec 15 04:18:47 localhost systemd[1]: session-42.scope: Deactivated successfully. Dec 15 04:18:47 localhost systemd[1]: session-42.scope: Consumed 28.069s CPU time. Dec 15 04:18:47 localhost systemd-logind[763]: Session 42 logged out. Waiting for processes to exit. Dec 15 04:18:47 localhost systemd-logind[763]: Removed session 42. Dec 15 04:18:50 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:fc:1f:13 MACDST=fa:16:3e:b3:bb:e3 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=57841 DF PROTO=TCP SPT=45040 DPT=9105 SEQ=3305006129 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A839073250000000001030307) Dec 15 04:18:53 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:fc:1f:13 MACDST=fa:16:3e:b3:bb:e3 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=57211 DF PROTO=TCP SPT=39884 DPT=9100 SEQ=3391344646 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A83907E830000000001030307) Dec 15 04:18:53 localhost sshd[137377]: main: sshd: ssh-rsa algorithm is disabled Dec 15 04:18:53 localhost systemd-logind[763]: New session 43 of user zuul. Dec 15 04:18:53 localhost systemd[1]: Started Session 43 of User zuul. 
Dec 15 04:18:54 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:fc:1f:13 MACDST=fa:16:3e:b3:bb:e3 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=57212 DF PROTO=TCP SPT=39884 DPT=9100 SEQ=3391344646 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A839082A50000000001030307) Dec 15 04:18:54 localhost python3.9[137470]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d Dec 15 04:18:56 localhost python3.9[137566]: ansible-ansible.builtin.file Invoked with group=zuul mode=0770 owner=zuul path=/root/.config/containers recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None Dec 15 04:18:56 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:fc:1f:13 MACDST=fa:16:3e:b3:bb:e3 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=57213 DF PROTO=TCP SPT=39884 DPT=9100 SEQ=3391344646 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A83908AA40000000001030307) Dec 15 04:18:57 localhost python3.9[137671]: ansible-ansible.legacy.stat Invoked with path=/root/.config/containers/auth.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Dec 15 04:18:57 localhost python3.9[137719]: ansible-ansible.legacy.file Invoked with group=zuul mode=0660 owner=zuul dest=/root/.config/containers/auth.json _original_basename=._ruc7lu4 recurse=False state=file path=/root/.config/containers/auth.json force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None 
seuser=None serole=None selevel=None setype=None attributes=None Dec 15 04:18:58 localhost python3.9[137811]: ansible-ansible.legacy.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Dec 15 04:18:58 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:fc:1f:13 MACDST=fa:16:3e:b3:bb:e3 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=11725 DF PROTO=TCP SPT=60012 DPT=9101 SEQ=3579079016 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A839094A60000000001030307) Dec 15 04:18:59 localhost python3.9[137886]: ansible-ansible.legacy.copy Invoked with dest=/etc/sysconfig/podman_drop_in mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1765790338.126711-143-280065205627369/.source _original_basename=.cgkarnfo follow=False checksum=125299ce8dea7711a76292961206447f0043248b backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Dec 15 04:19:00 localhost python3.9[137978]: ansible-ansible.builtin.file Invoked with path=/var/local/libexec recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None Dec 15 04:19:01 localhost python3.9[138070]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-container-shutdown follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Dec 15 04:19:01 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:fc:1f:13 
MACDST=fa:16:3e:b3:bb:e3 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=29707 DF PROTO=TCP SPT=45044 DPT=9882 SEQ=1877658322 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A8390A02B0000000001030307) Dec 15 04:19:01 localhost python3.9[138143]: ansible-ansible.legacy.copy Invoked with dest=/var/local/libexec/edpm-container-shutdown group=root mode=0700 owner=root setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1765790340.974756-215-30639362115907/.source _original_basename=edpm-container-shutdown follow=False checksum=632c3792eb3dce4288b33ae7b265b71950d69f13 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None Dec 15 04:19:02 localhost python3.9[138235]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-start-podman-container follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Dec 15 04:19:03 localhost python3.9[138308]: ansible-ansible.legacy.copy Invoked with dest=/var/local/libexec/edpm-start-podman-container group=root mode=0700 owner=root setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1765790342.076444-215-172233439391287/.source _original_basename=edpm-start-podman-container follow=False checksum=b963c569d75a655c0ccae95d9bb4a2a9a4df27d1 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None Dec 15 04:19:04 localhost python3.9[138400]: ansible-ansible.builtin.file Invoked with mode=420 path=/etc/systemd/system-preset state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None 
src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Dec 15 04:19:04 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:fc:1f:13 MACDST=fa:16:3e:b3:bb:e3 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=29709 DF PROTO=TCP SPT=45044 DPT=9882 SEQ=1877658322 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A8390AC250000000001030307) Dec 15 04:19:05 localhost python3.9[138492]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/edpm-container-shutdown.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Dec 15 04:19:05 localhost python3.9[138607]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/edpm-container-shutdown.service group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1765790344.7395802-326-8882709882412/.source.service _original_basename=edpm-container-shutdown-service follow=False checksum=6336835cb0f888670cc99de31e19c8c071444d33 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None Dec 15 04:19:06 localhost python3.9[138720]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Dec 15 04:19:06 localhost python3.9[138808]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system-preset/91-edpm-container-shutdown.preset group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1765790345.9558802-371-189270890990308/.source.preset _original_basename=91-edpm-container-shutdown-preset follow=False 
checksum=b275e4375287528cb63464dd32f622c4f142a915 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None Dec 15 04:19:07 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:fc:1f:13 MACDST=fa:16:3e:b3:bb:e3 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=38342 DF PROTO=TCP SPT=34684 DPT=9105 SEQ=226517591 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A8390B7E50000000001030307) Dec 15 04:19:08 localhost python3.9[138900]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=edpm-container-shutdown state=started daemon_reexec=False scope=system no_block=False force=None masked=None Dec 15 04:19:08 localhost systemd[1]: Reloading. Dec 15 04:19:08 localhost systemd-rc-local-generator[138930]: /etc/rc.d/rc.local is not marked executable, skipping. Dec 15 04:19:08 localhost systemd-sysv-generator[138933]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. Dec 15 04:19:08 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Dec 15 04:19:08 localhost systemd[1]: Reloading. Dec 15 04:19:08 localhost systemd-sysv-generator[138970]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. Dec 15 04:19:08 localhost systemd-rc-local-generator[138965]: /etc/rc.d/rc.local is not marked executable, skipping. 
Dec 15 04:19:08 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Dec 15 04:19:08 localhost systemd[1]: Starting EDPM Container Shutdown... Dec 15 04:19:08 localhost systemd[1]: Finished EDPM Container Shutdown. Dec 15 04:19:09 localhost python3.9[139070]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/netns-placeholder.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Dec 15 04:19:09 localhost python3.9[139143]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/netns-placeholder.service group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1765790348.9887786-440-246319564019013/.source.service _original_basename=netns-placeholder-service follow=False checksum=b61b1b5918c20c877b8b226fbf34ff89a082d972 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None Dec 15 04:19:10 localhost python3.9[139235]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-netns-placeholder.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Dec 15 04:19:11 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:fc:1f:13 MACDST=fa:16:3e:b3:bb:e3 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=11727 DF PROTO=TCP SPT=60012 DPT=9101 SEQ=3579079016 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A8390C53D0000000001030307) Dec 15 04:19:11 localhost python3.9[139308]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system-preset/91-netns-placeholder.preset group=root mode=0644 owner=root 
src=/home/zuul/.ansible/tmp/ansible-tmp-1765790350.1984222-485-195658365196790/.source.preset _original_basename=91-netns-placeholder-preset follow=False checksum=28b7b9aa893525d134a1eeda8a0a48fb25b736b9 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None Dec 15 04:19:12 localhost python3.9[139400]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=netns-placeholder state=started daemon_reexec=False scope=system no_block=False force=None masked=None Dec 15 04:19:12 localhost systemd[1]: Reloading. Dec 15 04:19:12 localhost systemd-rc-local-generator[139424]: /etc/rc.d/rc.local is not marked executable, skipping. Dec 15 04:19:12 localhost systemd-sysv-generator[139431]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. Dec 15 04:19:12 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Dec 15 04:19:12 localhost systemd[1]: Starting Create netns directory... Dec 15 04:19:12 localhost systemd[1]: run-netns-placeholder.mount: Deactivated successfully. Dec 15 04:19:12 localhost systemd[1]: netns-placeholder.service: Deactivated successfully. Dec 15 04:19:12 localhost systemd[1]: Finished Create netns directory. Dec 15 04:19:13 localhost python3.9[139533]: ansible-ansible.builtin.service_facts Invoked Dec 15 04:19:13 localhost network[139550]: You are using 'network' service provided by 'network-scripts', which are now deprecated. Dec 15 04:19:13 localhost network[139551]: 'network-scripts' will be removed from distribution in near future. 
Dec 15 04:19:13 localhost network[139552]: It is advised to switch to 'NetworkManager' instead for network management. Dec 15 04:19:14 localhost systemd[1]: /usr/lib/systemd/system/insights-client.service:23: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Dec 15 04:19:17 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:fc:1f:13 MACDST=fa:16:3e:b3:bb:e3 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=29711 DF PROTO=TCP SPT=45044 DPT=9882 SEQ=1877658322 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A8390DD260000000001030307) Dec 15 04:19:20 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:fc:1f:13 MACDST=fa:16:3e:b3:bb:e3 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=38344 DF PROTO=TCP SPT=34684 DPT=9105 SEQ=226517591 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A8390E7250000000001030307) Dec 15 04:19:20 localhost python3.9[139753]: ansible-ansible.legacy.stat Invoked with path=/etc/ssh/sshd_config follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Dec 15 04:19:21 localhost python3.9[139828]: ansible-ansible.legacy.copy Invoked with dest=/etc/ssh/sshd_config mode=0600 src=/home/zuul/.ansible/tmp/ansible-tmp-1765790360.3073964-608-227087150941275/.source validate=/usr/sbin/sshd -T -f %s follow=False _original_basename=sshd_config_block.j2 checksum=6c79f4cb960ad444688fde322eeacb8402e22d79 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Dec 15 04:19:22 localhost python3.9[139921]: ansible-ansible.builtin.systemd Invoked with name=sshd state=reloaded daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None 
masked=None Dec 15 04:19:22 localhost systemd[1]: Reloading OpenSSH server daemon... Dec 15 04:19:22 localhost systemd[1]: Reloaded OpenSSH server daemon. Dec 15 04:19:22 localhost sshd[119516]: main: sshd: ssh-rsa algorithm is disabled Dec 15 04:19:23 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:fc:1f:13 MACDST=fa:16:3e:b3:bb:e3 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=57009 DF PROTO=TCP SPT=34736 DPT=9100 SEQ=4071031516 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A8390F3B20000000001030307) Dec 15 04:19:24 localhost python3.9[140017]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/var/lib/edpm-config/firewall state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None Dec 15 04:19:24 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:fc:1f:13 MACDST=fa:16:3e:b3:bb:e3 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=57010 DF PROTO=TCP SPT=34736 DPT=9100 SEQ=4071031516 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A8390F7A50000000001030307) Dec 15 04:19:24 localhost python3.9[140109]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/sshd-networks.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Dec 15 04:19:25 localhost python3.9[140182]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/firewall/sshd-networks.yaml group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1765790364.3709521-701-212596405838469/.source.yaml follow=False _original_basename=firewall.yaml.j2 checksum=0bfc8440fd8f39002ab90252479fb794f51b5ae8 
backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None Dec 15 04:19:26 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:fc:1f:13 MACDST=fa:16:3e:b3:bb:e3 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=57011 DF PROTO=TCP SPT=34736 DPT=9100 SEQ=4071031516 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A8390FFA60000000001030307) Dec 15 04:19:26 localhost python3.9[140274]: ansible-community.general.timezone Invoked with name=UTC hwclock=None Dec 15 04:19:26 localhost systemd[1]: Starting Time & Date Service... Dec 15 04:19:26 localhost systemd[1]: Started Time & Date Service. Dec 15 04:19:27 localhost python3.9[140370]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/var/lib/edpm-config/firewall state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None Dec 15 04:19:28 localhost python3.9[140462]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Dec 15 04:19:28 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:fc:1f:13 MACDST=fa:16:3e:b3:bb:e3 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=9128 DF PROTO=TCP SPT=44254 DPT=9101 SEQ=2308302618 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A839109A50000000001030307) Dec 15 04:19:28 localhost python3.9[140535]: ansible-ansible.legacy.copy Invoked with 
dest=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1765790367.8866284-806-31858728059761/.source.yaml follow=False _original_basename=base-rules.yaml.j2 checksum=450456afcafded6d4bdecceec7a02e806eebd8b3 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Dec 15 04:19:29 localhost python3.9[140627]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Dec 15 04:19:30 localhost python3.9[140700]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1765790369.184269-851-43078383840721/.source.yaml _original_basename=.tbxl83no follow=False checksum=97d170e1550eee4afc0af065b78cda302a97674c backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Dec 15 04:19:30 localhost python3.9[140792]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/iptables.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Dec 15 04:19:31 localhost python3.9[140867]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/iptables.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1765790370.3804655-896-19196209333547/.source.nft _original_basename=iptables.nft follow=False checksum=3e02df08f1f3ab4a513e94056dbd390e3d38fe30 backup=False force=True remote_src=False unsafe_writes=False 
content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None Dec 15 04:19:31 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:fc:1f:13 MACDST=fa:16:3e:b3:bb:e3 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=3766 DF PROTO=TCP SPT=34832 DPT=9102 SEQ=994198532 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A839115A60000000001030307) Dec 15 04:19:32 localhost python3.9[140959]: ansible-ansible.legacy.command Invoked with _raw_params=nft -f /etc/nftables/iptables.nft _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None Dec 15 04:19:33 localhost python3.9[141052]: ansible-ansible.legacy.command Invoked with _raw_params=nft -j list ruleset _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None Dec 15 04:19:34 localhost python3[141145]: ansible-edpm_nftables_from_files Invoked with src=/var/lib/edpm-config/firewall Dec 15 04:19:35 localhost python3.9[141237]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Dec 15 04:19:35 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:fc:1f:13 MACDST=fa:16:3e:b3:bb:e3 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=18204 DF PROTO=TCP SPT=39980 DPT=9882 SEQ=175005657 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A839125260000000001030307) Dec 15 04:19:36 localhost python3.9[141310]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-jumps.nft group=root mode=0600 owner=root 
src=/home/zuul/.ansible/tmp/ansible-tmp-1765790375.2932816-1013-230852419459652/.source.nft follow=False _original_basename=jump-chain.j2 checksum=4c6f036d2d5808f109acc0880c19aa74ca48c961 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None Dec 15 04:19:37 localhost python3.9[141402]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-update-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Dec 15 04:19:37 localhost python3.9[141475]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-update-jumps.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1765790376.496236-1058-241015869662604/.source.nft follow=False _original_basename=jump-chain.j2 checksum=4c6f036d2d5808f109acc0880c19aa74ca48c961 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None Dec 15 04:19:38 localhost python3.9[141567]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-flushes.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Dec 15 04:19:38 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:fc:1f:13 MACDST=fa:16:3e:b3:bb:e3 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=57013 DF PROTO=TCP SPT=34736 DPT=9100 SEQ=4071031516 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A83912F260000000001030307) Dec 15 04:19:38 localhost python3.9[141640]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-flushes.nft group=root mode=0600 owner=root 
src=/home/zuul/.ansible/tmp/ansible-tmp-1765790377.7768974-1103-94229187233075/.source.nft follow=False _original_basename=flush-chain.j2 checksum=d16337256a56373421842284fe09e4e6c7df417e backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None Dec 15 04:19:39 localhost python3.9[141732]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-chains.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Dec 15 04:19:39 localhost python3.9[141805]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-chains.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1765790379.0150895-1148-191158325063040/.source.nft follow=False _original_basename=chains.j2 checksum=2079f3b60590a165d1d502e763170876fc8e2984 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None Dec 15 04:19:40 localhost python3.9[141897]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-rules.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Dec 15 04:19:41 localhost python3.9[141970]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-rules.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1765790380.2492425-1193-33833871359013/.source.nft follow=False _original_basename=ruleset.j2 checksum=15a82a0dc61abfd6aa593407582b5b950437eb80 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None Dec 15 
04:19:42 localhost python3.9[142062]: ansible-ansible.builtin.file Invoked with group=root mode=0600 owner=root path=/etc/nftables/edpm-rules.nft.changed state=touch recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None Dec 15 04:19:42 localhost python3.9[142154]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail; cat /etc/nftables/edpm-chains.nft /etc/nftables/edpm-flushes.nft /etc/nftables/edpm-rules.nft /etc/nftables/edpm-update-jumps.nft /etc/nftables/edpm-jumps.nft | nft -c -f - _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None Dec 15 04:19:44 localhost python3.9[142249]: ansible-ansible.builtin.blockinfile Invoked with backup=False block=include "/etc/nftables/iptables.nft"#012include "/etc/nftables/edpm-chains.nft"#012include "/etc/nftables/edpm-rules.nft"#012include "/etc/nftables/edpm-jumps.nft"#012 path=/etc/sysconfig/nftables.conf validate=nft -c -f %s state=present marker=# {mark} ANSIBLE MANAGED BLOCK create=False marker_begin=BEGIN marker_end=END append_newline=False prepend_newline=False encoding=utf-8 unsafe_writes=False insertafter=None insertbefore=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Dec 15 04:19:44 localhost python3.9[142342]: ansible-ansible.builtin.file Invoked with group=hugetlbfs mode=0775 owner=zuul path=/dev/hugepages1G state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None 
attributes=None Dec 15 04:19:46 localhost python3.9[142434]: ansible-ansible.builtin.file Invoked with group=hugetlbfs mode=0775 owner=zuul path=/dev/hugepages2M state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None Dec 15 04:19:47 localhost python3.9[142526]: ansible-ansible.posix.mount Invoked with fstype=hugetlbfs opts=pagesize=1G path=/dev/hugepages1G src=none state=mounted boot=True dump=0 opts_no_log=False passno=0 backup=False fstab=None Dec 15 04:19:47 localhost python3.9[142619]: ansible-ansible.posix.mount Invoked with fstype=hugetlbfs opts=pagesize=2M path=/dev/hugepages2M src=none state=mounted boot=True dump=0 opts_no_log=False passno=0 backup=False fstab=None Dec 15 04:19:48 localhost systemd[1]: session-43.scope: Deactivated successfully. Dec 15 04:19:48 localhost systemd[1]: session-43.scope: Consumed 27.996s CPU time. Dec 15 04:19:48 localhost systemd-logind[763]: Session 43 logged out. Waiting for processes to exit. Dec 15 04:19:48 localhost systemd-logind[763]: Removed session 43. Dec 15 04:19:53 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:fc:1f:13 MACDST=fa:16:3e:b3:bb:e3 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=48666 DF PROTO=TCP SPT=48876 DPT=9100 SEQ=1142735909 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A839168E30000000001030307) Dec 15 04:19:53 localhost sshd[142635]: main: sshd: ssh-rsa algorithm is disabled Dec 15 04:19:54 localhost systemd-logind[763]: New session 44 of user zuul. Dec 15 04:19:54 localhost systemd[1]: Started Session 44 of User zuul. 
Dec 15 04:19:54 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:fc:1f:13 MACDST=fa:16:3e:b3:bb:e3 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=33344 DF PROTO=TCP SPT=43002 DPT=9102 SEQ=3142165093 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A83916F120000000001030307) Dec 15 04:19:55 localhost python3.9[142730]: ansible-ansible.builtin.tempfile Invoked with state=file prefix=ansible. suffix= path=None Dec 15 04:19:55 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:fc:1f:13 MACDST=fa:16:3e:b3:bb:e3 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=41580 DF PROTO=TCP SPT=35520 DPT=9101 SEQ=1965080997 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A839172E90000000001030307) Dec 15 04:19:56 localhost systemd[1]: systemd-timedated.service: Deactivated successfully. Dec 15 04:19:57 localhost python3.9[142825]: ansible-ansible.builtin.stat Invoked with path=/etc/ssh/ssh_known_hosts follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1 Dec 15 04:19:57 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:fc:1f:13 MACDST=fa:16:3e:b3:bb:e3 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=57217 DF PROTO=TCP SPT=39884 DPT=9100 SEQ=3391344646 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A839179260000000001030307) Dec 15 04:19:58 localhost python3.9[142919]: ansible-ansible.builtin.slurp Invoked with src=/etc/ssh/ssh_known_hosts Dec 15 04:19:59 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:fc:1f:13 MACDST=fa:16:3e:b3:bb:e3 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=59641 DF PROTO=TCP SPT=45746 DPT=9102 SEQ=3062593613 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A83917F250000000001030307) Dec 15 04:19:59 localhost python3.9[143011]: ansible-ansible.legacy.stat 
Invoked with path=/tmp/ansible.xpt7naa9 follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Dec 15 04:19:59 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:fc:1f:13 MACDST=fa:16:3e:b3:bb:e3 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=11729 DF PROTO=TCP SPT=60012 DPT=9101 SEQ=3579079016 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A839183260000000001030307) Dec 15 04:20:00 localhost python3.9[143086]: ansible-ansible.legacy.copy Invoked with dest=/tmp/ansible.xpt7naa9 mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1765790399.5168242-189-252060311306248/.source.xpt7naa9 _original_basename=._leexm_4 follow=False checksum=c3777b7a6734e9c158f0e0068482fbac001a256d backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Dec 15 04:20:01 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:fc:1f:13 MACDST=fa:16:3e:b3:bb:e3 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=15417 DF PROTO=TCP SPT=48764 DPT=9882 SEQ=984616782 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A83918A890000000001030307) Dec 15 04:20:02 localhost python3.9[143178]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'ssh_host_key_rsa_public', 'ssh_host_key_ed25519_public', 'ssh_host_key_ecdsa_public'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d Dec 15 04:20:04 localhost python3.9[143270]: ansible-ansible.builtin.blockinfile Invoked with block=np0005559461.localdomain,192.168.122.105,np0005559461* ssh-rsa 
AAAAB3NzaC1yc2EAAAADAQABAAABgQC+pZDj8/JotpJwzMLuGO3sGR9qelkKNigZ2dHBVONU8Te2pVOOlBjGecT3+MT3PMotPbB8TwWgRvbJE0Z12178pRNQX61gnd2TtITG7EvEsL9j+LZHo+AJC2eSsdTlWMhCOlRy/TEUYfAJFXRawnsSsEU377lC5qTLesYFzCdgb3aC3pme1bP38Fpx2QDE8XZjl9wq7C1isruKuTifALk4kS2NnVU6XKllWAemqz4vf0UJUCG1qI2HmxPP/miVK//pk1ZdZzZk1kvbQYbaxXcsVJ7DHR+tTWPp/56OlKngz91Qt0xidMlJHxn8bf5rZChk4a0HLBbae2/ksxutNZb7i/LZ9B3Q41/Lq8bcvQPLkvYcW7tkMxHbR2MfKCFFfjxsJV03L3HrgbdsctrXW+58VS4sFRWRdKkOSRkesSF1+KDxG5GqFKFAhRp76OESCiv81XJPXGO5ElNpxnkajHwO/ts/neF3vlUr5z5BOPZ+hLohivjXlIFEQrFF9EqUau8=#012np0005559461.localdomain,192.168.122.105,np0005559461* ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIOGTsA6gvfDvD8B1lC7PEoiNVHrAFmtHqcIFQHzTZBM6#012np0005559461.localdomain,192.168.122.105,np0005559461* ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBGZ7AqNmEukrk24W0OcSd6vyPsJ6XOzrcNcEu4z20Uic6VYtJXPYCrJKFlHYQLGjGBc2JIrVa6hp62g1couq9y4=#012np0005559462.localdomain,192.168.122.106,np0005559462* ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQDrQzXLeVaQJEU2ztjn0fNzYVokp78uisG99XwWlgQg7ZpJT+WTsYqcKv35fw1GN/zQy59gWCk8kYF10TYn8QaRCgF0ZXGY8H0LjB7U4x/HUzMWF03yBBRXZjcF39ubxRGSmMaVWBpOYp9M08b2RBwNDJCTjdrYyyaRMnS+jyA8nFsXD/p8n8I625s6JjDL5pU06+6Urj5IJp/9WFWZUfPJZz31ZXhK/se/5c44GzjneCEn99dfU+Brux9+D6WQpI1HZxEuPPoUTRG/+dTwx0e2VrbxdqAxoMfXS+fB2l5XEQMrQD1Q6aG7K7ndtd+6BQYvLmFakcX/UevQngJOuz08tcgdea0gRXmOIr3JbqmFn3bcOP4ozZ9R23Hs4fMenHDW8Ivw57xe1oyPHF82POHh2HreGMWqVlcsWcHhZLEzEOHlLBfZEBrBLzP6Zck0gXqh3zgzip+dF70qUxiGtenul3aCJIJGIf/tUoMkGYM51NwDylsw0We2cO7tkD36uvc=#012np0005559462.localdomain,192.168.122.106,np0005559462* ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIAdoLDNckECkSX2dkQaaBZYuRStyMt5/lQmvAyMH/+K6#012np0005559462.localdomain,192.168.122.106,np0005559462* ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBI7+bsMHywaGoLkZftdAzsmilmynnqxyajGjHaNPb6EaAXucwZtaqfUk7GRJdmDFQqnnlYElmsC1tSxqfIid47k=#012np0005559463.localdomain,192.168.122.107,np0005559463* ssh-rsa 
AAAAB3NzaC1yc2EAAAADAQABAAABgQC0zv1YwbYydDOC13IxABIAlv2+N4/64PF3ADLPVluzLQONerjQ0gjtgR7OfrcoFTZTiqc2tJKpMFSw9qOZAUqHG8sT8FCcrTiwVoKv8yyUzcZh7iheOk4d4FMyUCuxj+VD7lbHDRSPnCafc91T/BKNKvMV2R/h+zdHoyg8u8zng++18JeHuYdtavYz5Uz7sgK6EZ4JPLao3nLM8Vbzl2lk0yfZQvVRn3dXvPf4jwvZJl+kNum2ZqrtrHACqFvq+zbh0hFCZYHmoc7dvmrl8pBj8e3Qs4iXW8vkYf0TKHtCCcPz3o9WDH946bmi2pAWCYgejR5Qg13HBcKQb0sKuXYSk6F7s6pzrOSAHajY3SLA0xVgpmbac0NyWJgnAdXxqspsXnW7z2bwgTDhxzRDh4QHTpWSlhr1PFnY/HvzvCJdo9RZ5D3xFEen1YQPXPJzWmsAQzvASxYUmweNC1xO3Vb5cnd51AiQOVpBJRgF64lKKCOnjqxOOKOvLD9L82F+Gsc=#012np0005559463.localdomain,192.168.122.107,np0005559463* ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIORE0gyxitDAmICC0tS+NbrsyPbC8ZLeCmyb3eSED8jr#012np0005559463.localdomain,192.168.122.107,np0005559463* ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBGFU3AfG88DMKgZq9rp+UG0p4r2QBkw7dg9AlySz5E+hf/vIR0jhiNOqhmG8YkfIZuVCgWQZIvIGFwM/LWLcjM8=#012np0005559459.localdomain,192.168.122.103,np0005559459* ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQCqlnRqscQHmiXC9F0uBpTeOoqxw0/HUjdhUjI49Q/PK0NsH6XeoHwFn3RUyYvUTYQrRCPNFeNH+THx5uA5O7mjJBaHg0RbtpQ0qSufn0fFyDFIjuMW8u2Bs7DA7daecXfzweFHqWsZzRksCCZdvGUK61zPvmhxuPkzYaME/JuZ0RxpAMyb0YhyvzL++niWIi+OBpDYpAnbynsPE428f5U5GJ87eUDZ4g0Iy3+4HC98k/DBchi4w22zg1nU4O3vtPhLAgEWX7z3/nz/9St6ifZhXW2xurdbBr9nPb8xSGeAN7a0aERqAI4tYkZo9rXzKOTGB85OeKP1XoYiWdhexZjV+j4RwAwNWVAIVACfkw8ZuW5WmFuCSVjT8A+EmQo9PLeg3RGBjRbTO8orQ59hNelHoqnK60HO4/JO6x3VAcn/EpCSS5a8xsMQFQDCMbaPrmNpTA0Na1qpq+yu92glOEGwff5HKWN2vDfVqGQZhWwabrKtbVf5cKMi2jhmecCNj9E=#012np0005559459.localdomain,192.168.122.103,np0005559459* ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIPWfdiDOzrDkEvTOMH334lCBvp2i5gydUz9TURExaH84#012np0005559459.localdomain,192.168.122.103,np0005559459* ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBMyLh66+Q1gjTXcWUegOMnEcqjYGsSewlGdiMz2vuKglUP6/8dg/Nr2YMGcAQYR0WsMvNJpNp/AyTDyyIUlrkoY=#012np0005559464.localdomain,192.168.122.108,np0005559464* ssh-rsa 
AAAAB3NzaC1yc2EAAAADAQABAAABgQCxqkgoIIgxeHEDa634HpTkDNt9TKXrhBXtKApvA2HDOYJC7R2lrHf0hxkAuU/MCkIZse6a6pP4n4JPp29gByMqzGBNcoo1iSRmKkSHDpaUeu0f+9fH8BDL9pdOwo8IiwHn9kk8ffoq3gqVhWKEdD4Td38/we+YLKMNXqM7yyrIiXSbLSPLJxR66ZRY+JXYFKRs8IJSMRyuXnNporymS5NtzgTxuYebROnCEG+mNONzDWnqPSBB0oSEi76oKKTqJRq4kv+8V3ZTMIMTs42VntiTD5hBgzPa2ZhmY0wKdz7vI1xGZ+2SAuZkzRwv7YXF6J0pqlKgHxE7TrvWUj+EXb/kgQ/tgtB+wXbu9pLw91/8L3hgAWNrMzcWheIEHXcH3btf9HTSxLtgs1xB8EgPw/nmSUDppsScqfPEUPtHoZOZ0O3wevinvnBqo9dmxOcZWSPmrujK3TvByP2omhYn8MWSctYg2sER10rd8Wg3JArPwiEdcp3UA/hkrYGAJWYNeBM=#012np0005559464.localdomain,192.168.122.108,np0005559464* ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIMbUxFSDI3CpXVngVLwGd9R0ZOEpMpWDykgAWSynBJL4#012np0005559464.localdomain,192.168.122.108,np0005559464* ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBBNEKEF2/+SabtpDFTW8ZW+iwtf+z8OUQEdFZ9Cki/Amc97pr2d6rNHjSjVB+7+yPskGahX5idDyokM4V84ix8g=#012np0005559460.localdomain,192.168.122.104,np0005559460* ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQDEKLYeWo8M0KKNWeR21MbsfAxuubj3NQ9RoSnLNJB8iaQd0VMr5Mr6IxEco83PyPUfuz+0BCKzHjMUmEFroQLP0uJAbkmRhN+pbImyPvqY0LdluyDf40PwNKyXGbWGbW76YLzHwp+CUSV3BAQ6cLJzrMQn3GFmepexVzDNvziyEloO16lObV6r/1mKXEMM4qkqPbDNNznwKBL8jS8nXglEUPTE0ATtyp/4/tvv8tUTQ5uQyxF6QwBmzzMgrhyA/L0CQC1kYCjUtxAau88hJ6XT06rJ+bGWUojpdI+hYlKRtcd/5x8+9LD8kQ29s4AnLPg72qZglIFa46JfZAPFBabQoMDtnA2uZlJ+AWYEcvCyIlLPRRiXaSDVMkBhzYT3FVwLqmdmpcurZzs1WS8HEVbLY4ZJchb2gL1PVZMazpBH8tEH/n4fmy7p2t1G5z2xT45grhWyr5xE9fQtfes2N8l1gMa02U9vCGr8lBhSGic6KL09+XWZtdqGCz1IfoRt0tM=#012np0005559460.localdomain,192.168.122.104,np0005559460* ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIExikVHMPRKrn1JeIk4tBBTYjlMwcEE66k6nv6ziNq62#012np0005559460.localdomain,192.168.122.104,np0005559460* ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBDO0dXS9Xk0i0/mFvS+KIGzkvDY+rzTdeKozJEFDcSUbQGDiRdl+xW+/s1u8wu9Pbr5bRq+QmMLYXYkb96XzDZo=#012 create=True mode=0644 path=/tmp/ansible.xpt7naa9 state=present marker=# {mark} ANSIBLE MANAGED BLOCK backup=False 
marker_begin=BEGIN marker_end=END append_newline=False prepend_newline=False encoding=utf-8 unsafe_writes=False insertafter=None insertbefore=None validate=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Dec 15 04:20:04 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:fc:1f:13 MACDST=fa:16:3e:b3:bb:e3 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=41170 DF PROTO=TCP SPT=38142 DPT=9105 SEQ=1189527890 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A839196230000000001030307) Dec 15 04:20:06 localhost python3.9[143362]: ansible-ansible.legacy.command Invoked with _raw_params=cat '/tmp/ansible.xpt7naa9' > /etc/ssh/ssh_known_hosts _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None Dec 15 04:20:07 localhost podman[143515]: 2025-12-15 09:20:07.645710562 +0000 UTC m=+0.099451516 container exec 8dcda56b365b42dc8758aab77a9ec80db304780e449052738f7e4e648ae1ecaf (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-bce17446-41b5-5408-a23e-0b011906b44a-crash-np0005559462, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, build-date=2025-11-26T19:44:28Z, vcs-type=git, RELEASE=main, description=Red Hat Ceph Storage 7, io.k8s.description=Red Hat Ceph Storage 7, architecture=x86_64, GIT_CLEAN=True, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, maintainer=Guillaume Abrioux , io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, CEPH_POINT_RELEASE=, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, GIT_BRANCH=main, io.openshift.tags=rhceph ceph, version=7, ceph=True, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, com.redhat.component=rhceph-container, GIT_REPO=https://github.com/ceph/ceph-container.git, 
io.buildah.version=1.41.4, name=rhceph, url=https://catalog.redhat.com/en/search?searchType=containers, release=1763362218, distribution-scope=public, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0) Dec 15 04:20:07 localhost podman[143515]: 2025-12-15 09:20:07.7584319 +0000 UTC m=+0.212172894 container exec_died 8dcda56b365b42dc8758aab77a9ec80db304780e449052738f7e4e648ae1ecaf (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-bce17446-41b5-5408-a23e-0b011906b44a-crash-np0005559462, io.buildah.version=1.41.4, GIT_REPO=https://github.com/ceph/ceph-container.git, distribution-scope=public, build-date=2025-11-26T19:44:28Z, io.openshift.tags=rhceph ceph, RELEASE=main, url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.expose-services=, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, vcs-type=git, name=rhceph, architecture=x86_64, ceph=True, maintainer=Guillaume Abrioux , description=Red Hat Ceph Storage 7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, release=1763362218, GIT_BRANCH=main, io.k8s.description=Red Hat Ceph Storage 7, GIT_CLEAN=True, version=7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., com.redhat.component=rhceph-container, vendor=Red Hat, Inc., CEPH_POINT_RELEASE=, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0) Dec 15 04:20:08 localhost python3.9[143613]: ansible-ansible.builtin.file Invoked with path=/tmp/ansible.xpt7naa9 state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None 
src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Dec 15 04:20:08 localhost systemd[1]: session-44.scope: Deactivated successfully. Dec 15 04:20:08 localhost systemd[1]: session-44.scope: Consumed 4.319s CPU time. Dec 15 04:20:08 localhost systemd-logind[763]: Session 44 logged out. Waiting for processes to exit. Dec 15 04:20:08 localhost systemd-logind[763]: Removed session 44. Dec 15 04:20:15 localhost sshd[143721]: main: sshd: ssh-rsa algorithm is disabled Dec 15 04:20:15 localhost systemd-logind[763]: New session 45 of user zuul. Dec 15 04:20:15 localhost systemd[1]: Started Session 45 of User zuul. Dec 15 04:20:16 localhost python3.9[143814]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d Dec 15 04:20:18 localhost python3.9[143910]: ansible-ansible.builtin.systemd Invoked with enabled=True name=sshd daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None masked=None Dec 15 04:20:18 localhost sshd[143930]: main: sshd: ssh-rsa algorithm is disabled Dec 15 04:20:19 localhost python3.9[144006]: ansible-ansible.builtin.systemd Invoked with name=sshd state=started daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None Dec 15 04:20:20 localhost python3.9[144099]: ansible-ansible.legacy.command Invoked with _raw_params=nft -f /etc/nftables/edpm-chains.nft _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None Dec 15 04:20:20 localhost sshd[144101]: main: sshd: ssh-rsa algorithm is disabled Dec 15 04:20:21 localhost python3.9[144194]: ansible-ansible.builtin.stat Invoked with path=/etc/nftables/edpm-rules.nft.changed follow=False get_checksum=True get_mime=True get_attributes=True 
get_selinux_context=False checksum_algorithm=sha1 Dec 15 04:20:21 localhost python3.9[144288]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail; cat /etc/nftables/edpm-flushes.nft /etc/nftables/edpm-rules.nft /etc/nftables/edpm-update-jumps.nft | nft -f - _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None Dec 15 04:20:22 localhost python3.9[144383]: ansible-ansible.builtin.file Invoked with path=/etc/nftables/edpm-rules.nft.changed state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Dec 15 04:20:23 localhost systemd-logind[763]: Session 45 logged out. Waiting for processes to exit. Dec 15 04:20:23 localhost systemd[1]: session-45.scope: Deactivated successfully. Dec 15 04:20:23 localhost systemd[1]: session-45.scope: Consumed 3.750s CPU time. Dec 15 04:20:23 localhost systemd-logind[763]: Removed session 45. 
Dec 15 04:20:23 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:fc:1f:13 MACDST=fa:16:3e:b3:bb:e3 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=853 DF PROTO=TCP SPT=55868 DPT=9100 SEQ=128043122 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A8391DE120000000001030307) Dec 15 04:20:24 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:fc:1f:13 MACDST=fa:16:3e:b3:bb:e3 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=854 DF PROTO=TCP SPT=55868 DPT=9100 SEQ=128043122 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A8391E2250000000001030307) Dec 15 04:20:24 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:fc:1f:13 MACDST=fa:16:3e:b3:bb:e3 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=14638 DF PROTO=TCP SPT=56300 DPT=9102 SEQ=2060782979 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A8391E4420000000001030307) Dec 15 04:20:25 localhost sshd[144398]: main: sshd: ssh-rsa algorithm is disabled Dec 15 04:20:25 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:fc:1f:13 MACDST=fa:16:3e:b3:bb:e3 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=25381 DF PROTO=TCP SPT=59676 DPT=9101 SEQ=2578327966 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A8391E8180000000001030307) Dec 15 04:20:25 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:fc:1f:13 MACDST=fa:16:3e:b3:bb:e3 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=14639 DF PROTO=TCP SPT=56300 DPT=9102 SEQ=2060782979 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A8391E8660000000001030307) Dec 15 04:20:26 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:fc:1f:13 MACDST=fa:16:3e:b3:bb:e3 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=855 DF PROTO=TCP SPT=55868 DPT=9100 
SEQ=128043122 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A8391EA250000000001030307) Dec 15 04:20:26 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:fc:1f:13 MACDST=fa:16:3e:b3:bb:e3 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=25382 DF PROTO=TCP SPT=59676 DPT=9101 SEQ=2578327966 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A8391EC250000000001030307) Dec 15 04:20:27 localhost sshd[144400]: main: sshd: ssh-rsa algorithm is disabled Dec 15 04:20:27 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:fc:1f:13 MACDST=fa:16:3e:b3:bb:e3 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=14640 DF PROTO=TCP SPT=56300 DPT=9102 SEQ=2060782979 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A8391F0660000000001030307) Dec 15 04:20:28 localhost sshd[144402]: main: sshd: ssh-rsa algorithm is disabled Dec 15 04:20:28 localhost systemd-logind[763]: New session 46 of user zuul. Dec 15 04:20:28 localhost systemd[1]: Started Session 46 of User zuul. 
Dec 15 04:20:28 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:fc:1f:13 MACDST=fa:16:3e:b3:bb:e3 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=25383 DF PROTO=TCP SPT=59676 DPT=9101 SEQ=2578327966 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A8391F4260000000001030307) Dec 15 04:20:29 localhost sshd[144423]: main: sshd: ssh-rsa algorithm is disabled Dec 15 04:20:29 localhost sshd[144440]: main: sshd: ssh-rsa algorithm is disabled Dec 15 04:20:30 localhost python3.9[144499]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d Dec 15 04:20:31 localhost python3.9[144595]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d Dec 15 04:20:31 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:fc:1f:13 MACDST=fa:16:3e:b3:bb:e3 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=14641 DF PROTO=TCP SPT=56300 DPT=9102 SEQ=2060782979 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A839200250000000001030307) Dec 15 04:20:32 localhost python3.9[144649]: ansible-ansible.legacy.dnf Invoked with name=['yum-utils'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None Dec 15 04:20:32 localhost sshd[144652]: main: sshd: ssh-rsa algorithm is disabled Dec 15 04:20:34 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:fc:1f:13 MACDST=fa:16:3e:b3:bb:e3 
MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=52483 DF PROTO=TCP SPT=53712 DPT=9882 SEQ=2623161305 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A83920BA50000000001030307) Dec 15 04:20:36 localhost python3.9[144743]: ansible-ansible.legacy.command Invoked with _raw_params=needs-restarting -r _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None Dec 15 04:20:37 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:fc:1f:13 MACDST=fa:16:3e:b3:bb:e3 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=21853 DF PROTO=TCP SPT=56902 DPT=9105 SEQ=3782238744 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A839217650000000001030307) Dec 15 04:20:38 localhost sshd[144837]: main: sshd: ssh-rsa algorithm is disabled Dec 15 04:20:38 localhost python3.9[144836]: ansible-ansible.builtin.file Invoked with mode=0755 path=/var/lib/openstack/reboot_required/ state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Dec 15 04:20:38 localhost python3.9[144930]: ansible-ansible.builtin.file Invoked with mode=0600 path=/var/lib/openstack/reboot_required/needs_restarting state=touch recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Dec 15 04:20:39 localhost python3.9[145022]: ansible-ansible.builtin.lineinfile Invoked with 
dest=/var/lib/openstack/reboot_required/needs_restarting line=Not root, Subscription Management repositories not updated#012Core libraries or services have been updated since boot-up:#012 * systemd#012#012Reboot is required to fully utilize these updates.#012More information: https://access.redhat.com/solutions/27943 path=/var/lib/openstack/reboot_required/needs_restarting state=present encoding=utf-8 backrefs=False create=False backup=False firstmatch=False unsafe_writes=False regexp=None search_string=None insertafter=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Dec 15 04:20:40 localhost python3.9[145112]: ansible-ansible.builtin.find Invoked with paths=['/var/lib/openstack/reboot_required/'] patterns=[] read_whole_file=False file_type=file age_stamp=mtime recurse=False hidden=False follow=False get_checksum=False checksum_algorithm=sha1 use_regex=False exact_mode=True excludes=None contains=None age=None size=None depth=None mode=None encoding=None limit=None Dec 15 04:20:41 localhost python3.9[145202]: ansible-ansible.builtin.stat Invoked with path=/var/lib/config-data/puppet-generated follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1 Dec 15 04:20:41 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:fc:1f:13 MACDST=fa:16:3e:b3:bb:e3 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=25385 DF PROTO=TCP SPT=59676 DPT=9101 SEQ=2578327966 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A839225250000000001030307) Dec 15 04:20:42 localhost python3.9[145294]: ansible-ansible.builtin.stat Invoked with path=/var/lib/openstack follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1 Dec 15 04:20:42 localhost systemd[1]: session-46.scope: Deactivated successfully. 
Dec 15 04:20:42 localhost systemd[1]: session-46.scope: Consumed 8.802s CPU time. Dec 15 04:20:42 localhost systemd-logind[763]: Session 46 logged out. Waiting for processes to exit. Dec 15 04:20:42 localhost systemd-logind[763]: Removed session 46. Dec 15 04:20:47 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:fc:1f:13 MACDST=fa:16:3e:b3:bb:e3 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=52485 DF PROTO=TCP SPT=53712 DPT=9882 SEQ=2623161305 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A83923B260000000001030307) Dec 15 04:20:49 localhost sshd[145311]: main: sshd: ssh-rsa algorithm is disabled Dec 15 04:20:49 localhost systemd-logind[763]: New session 47 of user zuul. Dec 15 04:20:49 localhost systemd[1]: Started Session 47 of User zuul. Dec 15 04:20:50 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:fc:1f:13 MACDST=fa:16:3e:b3:bb:e3 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=21855 DF PROTO=TCP SPT=56902 DPT=9105 SEQ=3782238744 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A839247250000000001030307) Dec 15 04:20:50 localhost python3.9[145404]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d Dec 15 04:20:52 localhost python3.9[145500]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/cacerts/telemetry setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None Dec 15 04:20:53 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:fc:1f:13 MACDST=fa:16:3e:b3:bb:e3 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 
TTL=62 ID=9983 DF PROTO=TCP SPT=46474 DPT=9100 SEQ=3403527213 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A839253430000000001030307) Dec 15 04:20:53 localhost python3.9[145592]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Dec 15 04:20:54 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:fc:1f:13 MACDST=fa:16:3e:b3:bb:e3 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=9984 DF PROTO=TCP SPT=46474 DPT=9100 SEQ=3403527213 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A839257650000000001030307) Dec 15 04:20:54 localhost python3.9[145665]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1765790453.124181-182-217751270490568/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=642b0566ec70b00e9a30dc94608ed63df1702edb backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None Dec 15 04:20:55 localhost python3.9[145757]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/cacerts/neutron-sriov setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None Dec 15 04:20:55 localhost python3.9[145849]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/cacerts/neutron-sriov/tls-ca-bundle.pem follow=False get_checksum=True 
get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Dec 15 04:20:56 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:fc:1f:13 MACDST=fa:16:3e:b3:bb:e3 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=9985 DF PROTO=TCP SPT=46474 DPT=9100 SEQ=3403527213 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A83925F650000000001030307) Dec 15 04:20:56 localhost chronyd[137361]: Selected source 23.133.168.244 (pool.ntp.org) Dec 15 04:20:57 localhost python3.9[145922]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/cacerts/neutron-sriov/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1765790455.3329473-257-10684297317441/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=642b0566ec70b00e9a30dc94608ed63df1702edb backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None Dec 15 04:20:57 localhost python3.9[146014]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/cacerts/neutron-dhcp setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None Dec 15 04:20:58 localhost python3.9[146106]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/cacerts/neutron-dhcp/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Dec 15 04:20:58 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:fc:1f:13 MACDST=fa:16:3e:b3:bb:e3 MACPROTO=0800 SRC=192.168.122.10 
DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=60796 DF PROTO=TCP SPT=50420 DPT=9101 SEQ=3261741074 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A839269650000000001030307) Dec 15 04:20:59 localhost python3.9[146179]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/cacerts/neutron-dhcp/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1765790457.9004588-333-269553155507638/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=642b0566ec70b00e9a30dc94608ed63df1702edb backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None Dec 15 04:21:00 localhost python3.9[146271]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/cacerts/nova setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None Dec 15 04:21:00 localhost python3.9[146363]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Dec 15 04:21:01 localhost python3.9[146436]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1765790460.4697173-414-99296132633490/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=642b0566ec70b00e9a30dc94608ed63df1702edb backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None 
directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None Dec 15 04:21:01 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:fc:1f:13 MACDST=fa:16:3e:b3:bb:e3 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=46252 DF PROTO=TCP SPT=58026 DPT=9882 SEQ=2706393241 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A839274EB0000000001030307) Dec 15 04:21:02 localhost python3.9[146528]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/cacerts/libvirt setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None Dec 15 04:21:02 localhost python3.9[146620]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/cacerts/libvirt/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Dec 15 04:21:03 localhost python3.9[146693]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/cacerts/libvirt/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1765790462.4965913-486-199576987585341/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=642b0566ec70b00e9a30dc94608ed63df1702edb backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None Dec 15 04:21:04 localhost python3.9[146785]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/cacerts/ovn setype=container_file_t state=directory recurse=False force=False follow=True 
modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None Dec 15 04:21:04 localhost python3.9[146877]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Dec 15 04:21:04 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:fc:1f:13 MACDST=fa:16:3e:b3:bb:e3 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=46254 DF PROTO=TCP SPT=58026 DPT=9882 SEQ=2706393241 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A839280E50000000001030307) Dec 15 04:21:05 localhost python3.9[146950]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1765790464.3188806-558-135041951301708/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=642b0566ec70b00e9a30dc94608ed63df1702edb backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None Dec 15 04:21:06 localhost python3.9[147042]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/cacerts/bootstrap setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None Dec 15 04:21:06 localhost python3.9[147134]: ansible-ansible.legacy.stat Invoked with 
path=/var/lib/openstack/cacerts/bootstrap/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Dec 15 04:21:07 localhost python3.9[147207]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/cacerts/bootstrap/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1765790466.195704-628-234760959374882/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=642b0566ec70b00e9a30dc94608ed63df1702edb backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None Dec 15 04:21:07 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:fc:1f:13 MACDST=fa:16:3e:b3:bb:e3 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=9864 DF PROTO=TCP SPT=41748 DPT=9105 SEQ=3590738117 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A83928CA60000000001030307) Dec 15 04:21:08 localhost python3.9[147299]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/cacerts/neutron-metadata setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None Dec 15 04:21:08 localhost python3.9[147391]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Dec 15 04:21:09 localhost python3.9[147494]: ansible-ansible.legacy.copy Invoked with 
dest=/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1765790468.4035053-698-221276620149102/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=642b0566ec70b00e9a30dc94608ed63df1702edb backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None Dec 15 04:21:10 localhost systemd[1]: session-47.scope: Deactivated successfully. Dec 15 04:21:10 localhost systemd[1]: session-47.scope: Consumed 11.993s CPU time. Dec 15 04:21:10 localhost systemd-logind[763]: Session 47 logged out. Waiting for processes to exit. Dec 15 04:21:10 localhost systemd-logind[763]: Removed session 47. Dec 15 04:21:11 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:fc:1f:13 MACDST=fa:16:3e:b3:bb:e3 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=60798 DF PROTO=TCP SPT=50420 DPT=9101 SEQ=3261741074 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A839299250000000001030307) Dec 15 04:21:15 localhost sshd[147555]: main: sshd: ssh-rsa algorithm is disabled Dec 15 04:21:16 localhost systemd-logind[763]: New session 48 of user zuul. Dec 15 04:21:16 localhost systemd[1]: Started Session 48 of User zuul. 
Dec 15 04:21:16 localhost python3.9[147650]: ansible-ansible.builtin.file Invoked with mode=0755 path=/var/lib/openstack/config/ceph state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Dec 15 04:21:17 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:fc:1f:13 MACDST=fa:16:3e:b3:bb:e3 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=46256 DF PROTO=TCP SPT=58026 DPT=9882 SEQ=2706393241 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A8392B1250000000001030307) Dec 15 04:21:18 localhost python3.9[147742]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/ceph/ceph.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Dec 15 04:21:18 localhost python3.9[147815]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/ceph/ceph.conf mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1765790477.044005-62-32894004187852/.source.conf _original_basename=ceph.conf follow=False checksum=3af07c3985ec8ce2ff4820d307d8876535cac8d8 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Dec 15 04:21:20 localhost python3.9[147907]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/ceph/ceph.client.openstack.keyring follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Dec 15 04:21:20 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:fc:1f:13 MACDST=fa:16:3e:b3:bb:e3 
MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=9866 DF PROTO=TCP SPT=41748 DPT=9105 SEQ=3590738117 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A8392BD250000000001030307) Dec 15 04:21:20 localhost python3.9[147980]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/ceph/ceph.client.openstack.keyring mode=0600 src=/home/zuul/.ansible/tmp/ansible-tmp-1765790479.5893703-62-80965300601593/.source.keyring _original_basename=ceph.client.openstack.keyring follow=False checksum=12c95e7154f1a9ee6f3f3cf63bf30d2d8ea78471 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Dec 15 04:21:21 localhost systemd[1]: session-48.scope: Deactivated successfully. Dec 15 04:21:21 localhost systemd[1]: session-48.scope: Consumed 2.263s CPU time. Dec 15 04:21:21 localhost systemd-logind[763]: Session 48 logged out. Waiting for processes to exit. Dec 15 04:21:21 localhost systemd-logind[763]: Removed session 48. 
Dec 15 04:21:23 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:fc:1f:13 MACDST=fa:16:3e:b3:bb:e3 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=25649 DF PROTO=TCP SPT=45732 DPT=9100 SEQ=3703823226 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A8392C8720000000001030307) Dec 15 04:21:24 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:fc:1f:13 MACDST=fa:16:3e:b3:bb:e3 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=25650 DF PROTO=TCP SPT=45732 DPT=9100 SEQ=3703823226 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A8392CC650000000001030307) Dec 15 04:21:26 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:fc:1f:13 MACDST=fa:16:3e:b3:bb:e3 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=25651 DF PROTO=TCP SPT=45732 DPT=9100 SEQ=3703823226 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A8392D4660000000001030307) Dec 15 04:21:26 localhost sshd[147995]: main: sshd: ssh-rsa algorithm is disabled Dec 15 04:21:26 localhost systemd-logind[763]: New session 49 of user zuul. Dec 15 04:21:26 localhost systemd[1]: Started Session 49 of User zuul. 
Dec 15 04:21:28 localhost python3.9[148088]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d Dec 15 04:21:28 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:fc:1f:13 MACDST=fa:16:3e:b3:bb:e3 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=32661 DF PROTO=TCP SPT=45492 DPT=9101 SEQ=2257224488 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A8392DE650000000001030307) Dec 15 04:21:30 localhost python3.9[148184]: ansible-ansible.builtin.file Invoked with group=zuul mode=0750 owner=zuul path=/var/lib/edpm-config/firewall setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None Dec 15 04:21:30 localhost python3.9[148276]: ansible-ansible.builtin.file Invoked with group=openvswitch owner=openvswitch path=/var/lib/openvswitch/ovn setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None Dec 15 04:21:31 localhost python3.9[148366]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'selinux'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d Dec 15 04:21:31 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:fc:1f:13 MACDST=fa:16:3e:b3:bb:e3 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=58079 DF PROTO=TCP SPT=41152 DPT=9882 SEQ=3336732040 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A8392EA1A0000000001030307) Dec 15 
04:21:32 localhost python3.9[148458]: ansible-ansible.posix.seboolean Invoked with name=virt_sandbox_use_netlink persistent=True state=True ignore_selinux_state=False Dec 15 04:21:33 localhost python3.9[148550]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d Dec 15 04:21:34 localhost python3.9[148604]: ansible-ansible.legacy.dnf Invoked with name=['openvswitch3.3'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None Dec 15 04:21:34 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:fc:1f:13 MACDST=fa:16:3e:b3:bb:e3 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=58081 DF PROTO=TCP SPT=41152 DPT=9882 SEQ=3336732040 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A8392F6250000000001030307) Dec 15 04:21:37 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:fc:1f:13 MACDST=fa:16:3e:b3:bb:e3 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=6525 DF PROTO=TCP SPT=55350 DPT=9105 SEQ=639137250 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A839301A50000000001030307) Dec 15 04:21:39 localhost python3.9[148698]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=openvswitch.service state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None Dec 15 04:21:41 localhost python3[148793]: ansible-osp.edpm.edpm_nftables_snippet Invoked with content=- rule_name: 118 neutron vxlan 
networks#012 rule:#012 proto: udp#012 dport: 4789#012- rule_name: 119 neutron geneve networks#012 rule:#012 proto: udp#012 dport: 6081#012 state: ["UNTRACKED"]#012- rule_name: 120 neutron geneve networks no conntrack#012 rule:#012 proto: udp#012 dport: 6081#012 table: raw#012 chain: OUTPUT#012 jump: NOTRACK#012 action: append#012 state: []#012- rule_name: 121 neutron geneve networks no conntrack#012 rule:#012 proto: udp#012 dport: 6081#012 table: raw#012 chain: PREROUTING#012 jump: NOTRACK#012 action: append#012 state: []#012 dest=/var/lib/edpm-config/firewall/ovn.yaml state=present Dec 15 04:21:41 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:fc:1f:13 MACDST=fa:16:3e:b3:bb:e3 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=32663 DF PROTO=TCP SPT=45492 DPT=9101 SEQ=2257224488 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A83930F250000000001030307) Dec 15 04:21:42 localhost python3.9[148885]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/var/lib/edpm-config/firewall state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None Dec 15 04:21:42 localhost sshd[148978]: main: sshd: ssh-rsa algorithm is disabled Dec 15 04:21:43 localhost python3.9[148977]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Dec 15 04:21:43 localhost python3.9[149027]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml _original_basename=base-rules.yaml.j2 recurse=False state=file 
path=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Dec 15 04:21:44 localhost python3.9[149119]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Dec 15 04:21:44 localhost python3.9[149168]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml _original_basename=.u62apmn_ recurse=False state=file path=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Dec 15 04:21:45 localhost python3.9[149260]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/iptables.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Dec 15 04:21:45 localhost python3.9[149308]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/iptables.nft _original_basename=iptables.nft recurse=False state=file path=/etc/nftables/iptables.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None Dec 15 04:21:46 localhost python3.9[149400]: ansible-ansible.legacy.command Invoked with _raw_params=nft -j 
list ruleset _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None Dec 15 04:21:47 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:fc:1f:13 MACDST=fa:16:3e:b3:bb:e3 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=58083 DF PROTO=TCP SPT=41152 DPT=9882 SEQ=3336732040 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A839327260000000001030307) Dec 15 04:21:47 localhost python3[149493]: ansible-edpm_nftables_from_files Invoked with src=/var/lib/edpm-config/firewall Dec 15 04:21:47 localhost sshd[149508]: main: sshd: ssh-rsa algorithm is disabled Dec 15 04:21:48 localhost python3.9[149586]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Dec 15 04:21:49 localhost python3.9[149662]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-jumps.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1765790507.9260874-431-248077093599349/.source.nft follow=False _original_basename=jump-chain.j2 checksum=81c2fc96c23335ffe374f9b064e885d5d971ddf9 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None Dec 15 04:21:49 localhost python3.9[149754]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-update-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Dec 15 04:21:50 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:fc:1f:13 MACDST=fa:16:3e:b3:bb:e3 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=6527 DF PROTO=TCP 
SPT=55350 DPT=9105 SEQ=639137250 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A839331260000000001030307) Dec 15 04:21:50 localhost python3.9[149829]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-update-jumps.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1765790509.3454115-476-232527477072402/.source.nft follow=False _original_basename=jump-chain.j2 checksum=81c2fc96c23335ffe374f9b064e885d5d971ddf9 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None Dec 15 04:21:51 localhost python3.9[149921]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-flushes.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Dec 15 04:21:51 localhost sshd[149925]: main: sshd: ssh-rsa algorithm is disabled Dec 15 04:21:51 localhost python3.9[149997]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-flushes.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1765790510.6057842-521-77758101963088/.source.nft follow=False _original_basename=flush-chain.j2 checksum=4d3ffec49c8eb1a9b80d2f1e8cd64070063a87b4 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None Dec 15 04:21:52 localhost python3.9[150090]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-chains.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Dec 15 04:21:52 localhost python3.9[150165]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-chains.nft group=root mode=0600 owner=root 
src=/home/zuul/.ansible/tmp/ansible-tmp-1765790511.9977577-566-151971435072586/.source.nft follow=False _original_basename=chains.j2 checksum=298ada419730ec15df17ded0cc50c97a4014a591 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None Dec 15 04:21:53 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:fc:1f:13 MACDST=fa:16:3e:b3:bb:e3 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=48181 DF PROTO=TCP SPT=34104 DPT=9100 SEQ=3642357827 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A83933DA20000000001030307) Dec 15 04:21:53 localhost python3.9[150257]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-rules.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Dec 15 04:21:54 localhost python3.9[150332]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-rules.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1765790513.1995773-611-114221765666904/.source.nft follow=False _original_basename=ruleset.j2 checksum=eb691bdb7d792c5f8ff0d719e807fe1c95b09438 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None Dec 15 04:21:54 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:fc:1f:13 MACDST=fa:16:3e:b3:bb:e3 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=48182 DF PROTO=TCP SPT=34104 DPT=9100 SEQ=3642357827 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A839341A50000000001030307) Dec 15 04:21:54 localhost python3.9[150424]: ansible-ansible.builtin.file Invoked with group=root mode=0600 owner=root 
path=/etc/nftables/edpm-rules.nft.changed state=touch recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None Dec 15 04:21:55 localhost python3.9[150516]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail; cat /etc/nftables/edpm-chains.nft /etc/nftables/edpm-flushes.nft /etc/nftables/edpm-rules.nft /etc/nftables/edpm-update-jumps.nft /etc/nftables/edpm-jumps.nft | nft -c -f - _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None Dec 15 04:21:55 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:fc:1f:13 MACDST=fa:16:3e:b3:bb:e3 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=43496 DF PROTO=TCP SPT=43480 DPT=9102 SEQ=3040878623 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A839347E50000000001030307) Dec 15 04:21:56 localhost sshd[150612]: main: sshd: ssh-rsa algorithm is disabled Dec 15 04:21:56 localhost python3.9[150611]: ansible-ansible.builtin.blockinfile Invoked with backup=False block=include "/etc/nftables/iptables.nft"#012include "/etc/nftables/edpm-chains.nft"#012include "/etc/nftables/edpm-rules.nft"#012include "/etc/nftables/edpm-jumps.nft"#012 path=/etc/sysconfig/nftables.conf validate=nft -c -f %s state=present marker=# {mark} ANSIBLE MANAGED BLOCK create=False marker_begin=BEGIN marker_end=END append_newline=False prepend_newline=False encoding=utf-8 unsafe_writes=False insertafter=None insertbefore=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Dec 15 04:21:57 localhost python3.9[150705]: ansible-ansible.legacy.command Invoked with _raw_params=nft -f 
/etc/nftables/edpm-chains.nft _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None Dec 15 04:21:57 localhost python3.9[150798]: ansible-ansible.builtin.stat Invoked with path=/etc/nftables/edpm-rules.nft.changed follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1 Dec 15 04:21:58 localhost python3.9[150892]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail; cat /etc/nftables/edpm-flushes.nft /etc/nftables/edpm-rules.nft /etc/nftables/edpm-update-jumps.nft | nft -f - _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None Dec 15 04:21:58 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:fc:1f:13 MACDST=fa:16:3e:b3:bb:e3 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=37106 DF PROTO=TCP SPT=41542 DPT=9101 SEQ=1323721784 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A839353A50000000001030307) Dec 15 04:21:59 localhost python3.9[150987]: ansible-ansible.builtin.file Invoked with path=/etc/nftables/edpm-rules.nft.changed state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Dec 15 04:22:00 localhost sshd[151002]: main: sshd: ssh-rsa algorithm is disabled Dec 15 04:22:01 localhost python3.9[151079]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'machine'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d Dec 15 04:22:01 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:fc:1f:13 
MACDST=fa:16:3e:b3:bb:e3 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=18255 DF PROTO=TCP SPT=35078 DPT=9882 SEQ=2320765380 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A83935F490000000001030307) Dec 15 04:22:02 localhost python3.9[151172]: ansible-ansible.legacy.command Invoked with _raw_params=ovs-vsctl set open . external_ids:hostname=np0005559462.localdomain external_ids:ovn-bridge=br-int external_ids:ovn-bridge-mappings=datacentre:br-ex external_ids:ovn-chassis-mac-mappings="datacentre:3e:0a:81:2c:d1:5d" external_ids:ovn-encap-ip=172.19.0.106 external_ids:ovn-encap-type=geneve external_ids:ovn-encap-tos=0 external_ids:ovn-match-northd-version=False external_ids:ovn-monitor-all=True external_ids:ovn-remote=tcp:ovsdbserver-sb.openstack.svc:6642 external_ids:ovn-remote-probe-interval=60000 external_ids:ovn-ofctrl-wait-before-clear=8000 external_ids:rundir=/var/run/openvswitch #012 _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None Dec 15 04:22:02 localhost ovs-vsctl[151173]: ovs|00001|vsctl|INFO|Called as ovs-vsctl set open . 
external_ids:hostname=np0005559462.localdomain external_ids:ovn-bridge=br-int external_ids:ovn-bridge-mappings=datacentre:br-ex external_ids:ovn-chassis-mac-mappings=datacentre:3e:0a:81:2c:d1:5d external_ids:ovn-encap-ip=172.19.0.106 external_ids:ovn-encap-type=geneve external_ids:ovn-encap-tos=0 external_ids:ovn-match-northd-version=False external_ids:ovn-monitor-all=True external_ids:ovn-remote=tcp:ovsdbserver-sb.openstack.svc:6642 external_ids:ovn-remote-probe-interval=60000 external_ids:ovn-ofctrl-wait-before-clear=8000 external_ids:rundir=/var/run/openvswitch Dec 15 04:22:03 localhost python3.9[151265]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail#012ovs-vsctl show | grep -q "Manager"#012 _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None Dec 15 04:22:03 localhost sshd[151266]: main: sshd: ssh-rsa algorithm is disabled Dec 15 04:22:04 localhost python3.9[151360]: ansible-ansible.builtin.stat Invoked with path=/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1 Dec 15 04:22:04 localhost python3.9[151454]: ansible-ansible.builtin.file Invoked with path=/var/local/libexec recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None Dec 15 04:22:04 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:fc:1f:13 MACDST=fa:16:3e:b3:bb:e3 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=18257 DF PROTO=TCP SPT=35078 DPT=9882 SEQ=2320765380 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT 
(020405500402080A83936B650000000001030307) Dec 15 04:22:05 localhost python3.9[151546]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-container-shutdown follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Dec 15 04:22:06 localhost python3.9[151594]: ansible-ansible.legacy.file Invoked with group=root mode=0700 owner=root setype=container_file_t dest=/var/local/libexec/edpm-container-shutdown _original_basename=edpm-container-shutdown recurse=False state=file path=/var/local/libexec/edpm-container-shutdown force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None Dec 15 04:22:06 localhost python3.9[151686]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-start-podman-container follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Dec 15 04:22:07 localhost python3.9[151734]: ansible-ansible.legacy.file Invoked with group=root mode=0700 owner=root setype=container_file_t dest=/var/local/libexec/edpm-start-podman-container _original_basename=edpm-start-podman-container recurse=False state=file path=/var/local/libexec/edpm-start-podman-container force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None Dec 15 04:22:07 localhost sshd[151826]: main: sshd: ssh-rsa algorithm is disabled Dec 15 04:22:07 localhost python3.9[151827]: ansible-ansible.builtin.file Invoked with mode=420 path=/etc/systemd/system-preset state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S 
access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Dec 15 04:22:07 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:fc:1f:13 MACDST=fa:16:3e:b3:bb:e3 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=57946 DF PROTO=TCP SPT=38752 DPT=9105 SEQ=3143139013 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A839376E50000000001030307) Dec 15 04:22:08 localhost python3.9[151920]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/edpm-container-shutdown.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Dec 15 04:22:08 localhost python3.9[151968]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system/edpm-container-shutdown.service _original_basename=edpm-container-shutdown-service recurse=False state=file path=/etc/systemd/system/edpm-container-shutdown.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None Dec 15 04:22:09 localhost python3.9[152060]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Dec 15 04:22:10 localhost python3.9[152108]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system-preset/91-edpm-container-shutdown.preset _original_basename=91-edpm-container-shutdown-preset recurse=False state=file 
path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None Dec 15 04:22:11 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:fc:1f:13 MACDST=fa:16:3e:b3:bb:e3 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=37108 DF PROTO=TCP SPT=41542 DPT=9101 SEQ=1323721784 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A839383260000000001030307) Dec 15 04:22:11 localhost python3.9[152229]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=edpm-container-shutdown state=started daemon_reexec=False scope=system no_block=False force=None masked=None Dec 15 04:22:11 localhost systemd[1]: Reloading. Dec 15 04:22:11 localhost systemd-rc-local-generator[152270]: /etc/rc.d/rc.local is not marked executable, skipping. Dec 15 04:22:11 localhost systemd-sysv-generator[152273]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. Dec 15 04:22:11 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. 
Dec 15 04:22:12 localhost python3.9[152406]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/netns-placeholder.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Dec 15 04:22:13 localhost python3.9[152454]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system/netns-placeholder.service _original_basename=netns-placeholder-service recurse=False state=file path=/etc/systemd/system/netns-placeholder.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None Dec 15 04:22:13 localhost python3.9[152546]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-netns-placeholder.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Dec 15 04:22:14 localhost python3.9[152594]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system-preset/91-netns-placeholder.preset _original_basename=91-netns-placeholder-preset recurse=False state=file path=/etc/systemd/system-preset/91-netns-placeholder.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None Dec 15 04:22:15 localhost python3.9[152686]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=netns-placeholder state=started daemon_reexec=False scope=system no_block=False force=None masked=None Dec 15 04:22:15 localhost systemd[1]: Reloading. 
Dec 15 04:22:15 localhost systemd-sysv-generator[152716]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. Dec 15 04:22:15 localhost systemd-rc-local-generator[152710]: /etc/rc.d/rc.local is not marked executable, skipping. Dec 15 04:22:15 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Dec 15 04:22:15 localhost systemd[1]: Starting Create netns directory... Dec 15 04:22:15 localhost systemd[1]: run-netns-placeholder.mount: Deactivated successfully. Dec 15 04:22:15 localhost systemd[1]: netns-placeholder.service: Deactivated successfully. Dec 15 04:22:15 localhost systemd[1]: Finished Create netns directory. Dec 15 04:22:16 localhost sshd[152823]: main: sshd: ssh-rsa algorithm is disabled Dec 15 04:22:16 localhost python3.9[152822]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/openstack/healthchecks setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None Dec 15 04:22:17 localhost python3.9[152915]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/healthchecks/ovn_controller/healthcheck follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Dec 15 04:22:17 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:fc:1f:13 MACDST=fa:16:3e:b3:bb:e3 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=18259 DF PROTO=TCP SPT=35078 DPT=9882 
SEQ=2320765380 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A83939B250000000001030307) Dec 15 04:22:17 localhost python3.9[152988]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/healthchecks/ovn_controller/ group=zuul mode=0700 owner=zuul setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1765790536.5962365-1343-173048990254703/.source _original_basename=healthcheck follow=False checksum=4098dd010265fabdf5c26b97d169fc4e575ff457 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None Dec 15 04:22:18 localhost python3.9[153081]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/edpm-config recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None Dec 15 04:22:19 localhost python3.9[153173]: ansible-ansible.builtin.file Invoked with path=/var/lib/kolla/config_files recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None Dec 15 04:22:19 localhost python3.9[153265]: ansible-ansible.legacy.stat Invoked with path=/var/lib/kolla/config_files/ovn_controller.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Dec 15 04:22:20 localhost python3.9[153340]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/kolla/config_files/ovn_controller.json mode=0600 
src=/home/zuul/.ansible/tmp/ansible-tmp-1765790539.3286397-1442-155062372391573/.source.json _original_basename=.cyvli7at follow=False checksum=38f75f59f5c2ef6b5da12297bfd31cd1e97012ac backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Dec 15 04:22:20 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:fc:1f:13 MACDST=fa:16:3e:b3:bb:e3 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=57948 DF PROTO=TCP SPT=38752 DPT=9105 SEQ=3143139013 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A8393A7250000000001030307) Dec 15 04:22:21 localhost sshd[153414]: main: sshd: ssh-rsa algorithm is disabled Dec 15 04:22:21 localhost python3.9[153432]: ansible-ansible.builtin.file Invoked with mode=0755 path=/var/lib/edpm-config/container-startup-config/ovn_controller state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Dec 15 04:22:23 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:fc:1f:13 MACDST=fa:16:3e:b3:bb:e3 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=18694 DF PROTO=TCP SPT=55704 DPT=9100 SEQ=4290392255 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A8393B2D20000000001030307) Dec 15 04:22:24 localhost python3.9[153685]: ansible-container_config_data Invoked with config_overrides={} config_path=/var/lib/edpm-config/container-startup-config/ovn_controller config_pattern=*.json debug=False Dec 15 04:22:24 localhost sshd[153686]: main: sshd: ssh-rsa algorithm is disabled Dec 15 04:22:24 localhost kernel: DROPPING: 
IN=br-ex OUT= MACSRC=fa:16:3e:fc:1f:13 MACDST=fa:16:3e:b3:bb:e3 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=18695 DF PROTO=TCP SPT=55704 DPT=9100 SEQ=4290392255 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A8393B6E50000000001030307) Dec 15 04:22:25 localhost python3.9[153779]: ansible-container_config_hash Invoked with check_mode=False config_vol_prefix=/var/lib/openstack Dec 15 04:22:25 localhost sshd[153810]: main: sshd: ssh-rsa algorithm is disabled Dec 15 04:22:25 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:fc:1f:13 MACDST=fa:16:3e:b3:bb:e3 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=50226 DF PROTO=TCP SPT=40578 DPT=9102 SEQ=4260179053 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A8393BD250000000001030307) Dec 15 04:22:26 localhost python3.9[153873]: ansible-containers.podman.podman_container_info Invoked with executable=podman name=None Dec 15 04:22:27 localhost sshd[153914]: main: sshd: ssh-rsa algorithm is disabled Dec 15 04:22:28 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:fc:1f:13 MACDST=fa:16:3e:b3:bb:e3 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=1187 DF PROTO=TCP SPT=52164 DPT=9101 SEQ=1266571064 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A8393C8E50000000001030307) Dec 15 04:22:29 localhost sshd[153916]: main: sshd: ssh-rsa algorithm is disabled Dec 15 04:22:31 localhost python3[153995]: ansible-edpm_container_manage Invoked with concurrency=1 config_dir=/var/lib/edpm-config/container-startup-config/ovn_controller config_id=ovn_controller config_overrides={} config_patterns=*.json containers=['ovn_controller'] log_base_path=/var/log/containers/stdouts debug=False Dec 15 04:22:31 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:fc:1f:13 MACDST=fa:16:3e:b3:bb:e3 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 
TTL=62 ID=37883 DF PROTO=TCP SPT=32990 DPT=9882 SEQ=1974127088 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A8393D4790000000001030307) Dec 15 04:22:31 localhost python3[153995]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: [#012 {#012 "Id": "a17927617ef5a603f0594ee0d6df65aabdc9e0303ccc5a52c36f193de33ee0fe",#012 "Digest": "sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de",#012 "RepoTags": [#012 "quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified"#012 ],#012 "RepoDigests": [#012 "quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de"#012 ],#012 "Parent": "",#012 "Comment": "",#012 "Created": "2025-12-08T06:37:53.576314044Z",#012 "Config": {#012 "User": "root",#012 "Env": [#012 "PATH=/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin",#012 "LANG=en_US.UTF-8",#012 "TZ=UTC",#012 "container=oci"#012 ],#012 "Entrypoint": [#012 "dumb-init",#012 "--single-child",#012 "--"#012 ],#012 "Cmd": [#012 "kolla_start"#012 ],#012 "Labels": {#012 "io.buildah.version": "1.41.3",#012 "maintainer": "OpenStack Kubernetes Operator team",#012 "org.label-schema.build-date": "20251202",#012 "org.label-schema.license": "GPLv2",#012 "org.label-schema.name": "CentOS Stream 9 Base Image",#012 "org.label-schema.schema-version": "1.0",#012 "org.label-schema.vendor": "CentOS",#012 "tcib_build_tag": "c3923531bcda0b0811b2d5053f189beb",#012 "tcib_managed": "true"#012 },#012 "StopSignal": "SIGTERM"#012 },#012 "Version": "",#012 "Author": "",#012 "Architecture": "amd64",#012 "Os": "linux",#012 "Size": 346249157,#012 "VirtualSize": 346249157,#012 "GraphDriver": {#012 "Name": "overlay",#012 "Data": {#012 "LowerDir": 
"/var/lib/containers/storage/overlay/78ae08eb8eacad53e931fdafc7d34696c8b31051d781d506af7ddd4666a71629/diff:/var/lib/containers/storage/overlay/102653142e2259aa6223045dee7736729104ac8aed3ce9b3c87a6d0787e59de8/diff:/var/lib/containers/storage/overlay/a170762be59c15b133bd19c602942600caa3082ffe7158ccee8771dfc16bb660/diff",#012 "UpperDir": "/var/lib/containers/storage/overlay/574e236e5849acbe82f271d59361aa668ee9c4e937e46fcc5c1444d277d89253/diff",#012 "WorkDir": "/var/lib/containers/storage/overlay/574e236e5849acbe82f271d59361aa668ee9c4e937e46fcc5c1444d277d89253/work"#012 }#012 },#012 "RootFS": {#012 "Type": "layers",#012 "Layers": [#012 "sha256:a170762be59c15b133bd19c602942600caa3082ffe7158ccee8771dfc16bb660",#012 "sha256:47bbb708952ccfdaf6b1a15cd5347cc2e9ee37e63ec65603401dcebf66de9242",#012 "sha256:348efc1e36476f80246bdade698c91ef818679232c35b7fa18a305ae7991737d",#012 "sha256:b89a39b9ba4fd7ae72a60dcce0c8ab3aed0d2b7bc44f3f76beaf602e4bb954b5"#012 ]#012 },#012 "Labels": {#012 "io.buildah.version": "1.41.3",#012 "maintainer": "OpenStack Kubernetes Operator team",#012 "org.label-schema.build-date": "20251202",#012 "org.label-schema.license": "GPLv2",#012 "org.label-schema.name": "CentOS Stream 9 Base Image",#012 "org.label-schema.schema-version": "1.0",#012 "org.label-schema.vendor": "CentOS",#012 "tcib_build_tag": "c3923531bcda0b0811b2d5053f189beb",#012 "tcib_managed": "true"#012 },#012 "Annotations": {},#012 "ManifestType": "application/vnd.docker.distribution.manifest.v2+json",#012 "User": "root",#012 "History": [#012 {#012 "created": "2025-12-02T04:26:51.317229596Z",#012 "created_by": "/bin/sh -c #(nop) ADD file:9f05a0f58e10b77188c7243d914ce56c5ce3e0f2ee7e13a7b0d4990588c97b99 in / ",#012 "empty_layer": true#012 },#012 {#012 "created": "2025-12-02T04:26:51.317315213Z",#012 "created_by": "/bin/sh -c #(nop) LABEL org.label-schema.schema-version=\"1.0\" org.label-schema.name=\"CentOS Stream 9 Base Image\" org.label-schema.vendor=\"CentOS\" org.label-schema.license=\"GPLv2\" 
org.label-schema.build-date=\"20251202\"",#012 "empty_layer": true#012 },#012 {#012 "created": "2025-12-02T04:26:54.063957926Z",#012 "created_by": "/bin/sh -c #(nop) CMD [\"/bin/bash\"]"#012 },#012 {#012 "created": "2025-12-08T06:08:28.750777742Z",#012 "created_by": "/bin/sh -c #(nop) LABEL maintainer=\"OpenStack Kubernetes Operator team\"",#012 "comment": "FROM quay.io/centos/centos:stream9",#012 "empty_layer": true#012 },#012 {#012 "created": "2025-12-08T06:08:28.750791962Z",#012 "created_by": "/bin/sh -c #(nop) LABEL tcib_managed=true",#012 "empty_layer": true#012 },#012 {#012 "created": "2025-12-08T06:08:28.750804372Z",#012 "created_by": "/bin/sh -c #(nop) ENV LANG=\"en_US.UTF-8\"",#012 "empty_layer": true#012 },#012 {#012 "created": "2025-12-08T06:08:28.750813613Z",#012 "created_by": "/bin/sh -c #(nop) ENV TZ=\"UTC\"",#012 "empty_layer": true#012 },#012 {#012 "created": "2025-12-08T06:08:28.750824813Z",#012 "created_by": "/bin/sh -c #(nop) ENV container=\"oci\"",#012 "empty_layer": true#012 },#012 {#012 "created": "2025-12-08T06:08:28.750833663Z",#012 "created_by": "/bin/sh -c #(nop) USER root",#012 "empty_layer": true#012 },#012 {#012 "created": "2025-12-08T06:08:29.160435164Z",#012 "created_by": "/bin/sh -c if [ -f \"/etc/yum.repos.d/ubi.repo\" ]; then rm -f /etc/yum.repos.d/ubi.repo && dnf clean all && rm -rf /var/cache/dnf; fi",#012 "empty_layer": true#012 },#012 {#012 "created": "2025-12-08T06:09:05.859236491Z",#012 "created_by": "/bin/sh -c dnf install -y crudini && crudini --del /etc/dnf/dnf.conf main override_install_langs && crudini --set /etc/dnf/dnf.conf main clean_requirements_on_remove True && crudini --set /etc/dnf/dnf.conf main exactarch 1 && crudini --set /etc/dnf/dnf.conf main gpgcheck 1 && crudini --set /etc/dnf/dnf.conf main install_weak_deps False && if [ 'centos' == 'centos' ];then crudini --set /etc/dnf/dnf.conf main best False; fi && crudini --set /etc/dnf/dnf.conf main installonly_limit 0 && crudini --set /etc/dnf/dnf.conf main 
keepcache 0 && crudini --set /etc/dnf/dnf.conf main obsoletes 1 && crudini --set /etc/dnf/dnf.conf main plugins 1 && crudini --set /etc/dnf/dnf.conf main skip_missing_names_on_install False && crudini --set /etc/dnf/dnf.conf main tsflags nodocs",#012 "empty_layer": true#012 },#012 {#012 "created": "2025-12-08T06:09:09.443839088Z",#012 "created_by": "/bin/sh -c dnf install -y ca-certificates dumb-init glibc-langpack-en procps-ng python3 sudo util- Dec 15 04:22:32 localhost podman[154044]: 2025-12-15 09:22:32.063314312 +0000 UTC m=+0.095343259 container remove 2df8d20c6ba32f38bd0e6519c9c77a4146b632f21c81197d8c319b223aad81f1 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, io.openshift.expose-services=, io.buildah.version=1.41.4, vcs-type=git, container_name=ovn_controller, konflux.additional-tags=17.1.12 17.1_20251118.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vendor=Red Hat, Inc., build-date=2025-11-18T23:34:05Z, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, summary=Red Hat OpenStack Platform 17.1 ovn-controller, config_id=tripleo_step4, release=1761123044, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', 
'/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, tcib_managed=true, version=17.1.12, vcs-ref=ae875c168a6ec3400acf0a639b71f4bcc4adf272, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, name=rhosp17/openstack-ovn-controller, org.opencontainers.image.revision=ae875c168a6ec3400acf0a639b71f4bcc4adf272, batch=17.1_20251118.1, description=Red Hat OpenStack Platform 17.1 ovn-controller, distribution-scope=public, managed_by=tripleo_ansible, com.redhat.component=openstack-ovn-controller-container) Dec 15 04:22:32 localhost python3[153995]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman rm --force ovn_controller Dec 15 04:22:32 localhost sshd[154057]: main: sshd: ssh-rsa algorithm is disabled Dec 15 04:22:32 localhost podman[154056]: Dec 15 04:22:32 localhost podman[154056]: 2025-12-15 09:22:32.163426952 +0000 UTC m=+0.081397746 container create ac2eae30a2a7f971398af42ea67b3ed799d4b3341f7688ebcd73f7ca7f66c2a8 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, org.label-schema.vendor=CentOS, container_name=ovn_controller, org.label-schema.build-date=20251202, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '07f3af15d79276418f54a8dde2fc251064527c3571430e14af77aaa2c01f1193'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb) Dec 15 04:22:32 localhost podman[154056]: 2025-12-15 09:22:32.125624181 +0000 UTC m=+0.043595005 image pull quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified Dec 15 04:22:32 localhost python3[153995]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman create --name ovn_controller --conmon-pidfile /run/ovn_controller.pid --env KOLLA_CONFIG_STRATEGY=COPY_ALWAYS --env EDPM_CONFIG_HASH=07f3af15d79276418f54a8dde2fc251064527c3571430e14af77aaa2c01f1193 --healthcheck-command /openstack/healthcheck --label config_id=ovn_controller --label container_name=ovn_controller --label managed_by=edpm_ansible --label config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '07f3af15d79276418f54a8dde2fc251064527c3571430e14af77aaa2c01f1193'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']} --log-driver journald 
--log-level info --network host --privileged=True --user root --volume /lib/modules:/lib/modules:ro --volume /run:/run --volume /var/lib/openvswitch/ovn:/run/ovn:shared,z --volume /var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro --volume /var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z --volume /var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified Dec 15 04:22:33 localhost python3.9[154187]: ansible-ansible.builtin.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1 Dec 15 04:22:34 localhost python3.9[154281]: ansible-file Invoked with path=/etc/systemd/system/edpm_ovn_controller.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Dec 15 04:22:34 localhost python3.9[154327]: ansible-stat Invoked with path=/etc/systemd/system/edpm_ovn_controller_healthcheck.timer follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1 Dec 15 04:22:34 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:fc:1f:13 MACDST=fa:16:3e:b3:bb:e3 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=37885 DF PROTO=TCP SPT=32990 DPT=9882 SEQ=1974127088 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A8393E0650000000001030307) Dec 15 04:22:35 localhost python3.9[154418]: ansible-copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1765790554.5592234-1712-273577419382535/source 
dest=/etc/systemd/system/edpm_ovn_controller.service mode=0644 owner=root group=root backup=False force=True remote_src=False follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None Dec 15 04:22:35 localhost python3.9[154464]: ansible-systemd Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None Dec 15 04:22:35 localhost sshd[154465]: main: sshd: ssh-rsa algorithm is disabled Dec 15 04:22:35 localhost systemd[1]: Reloading. Dec 15 04:22:35 localhost systemd-rc-local-generator[154492]: /etc/rc.d/rc.local is not marked executable, skipping. Dec 15 04:22:35 localhost systemd-sysv-generator[154496]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. Dec 15 04:22:35 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Dec 15 04:22:36 localhost python3.9[154547]: ansible-systemd Invoked with state=restarted name=edpm_ovn_controller.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None Dec 15 04:22:36 localhost systemd[1]: Reloading. Dec 15 04:22:36 localhost systemd-rc-local-generator[154574]: /etc/rc.d/rc.local is not marked executable, skipping. Dec 15 04:22:36 localhost systemd-sysv-generator[154579]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. 
Dec 15 04:22:36 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Dec 15 04:22:36 localhost systemd[1]: Starting ovn_controller container... Dec 15 04:22:37 localhost systemd[1]: tmp-crun.shT6Lt.mount: Deactivated successfully. Dec 15 04:22:37 localhost systemd[1]: Started libcrun container. Dec 15 04:22:37 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/147df5e209bf24838ec2e7c3af8316ae5f3de19b3e91c588c28707e91fed47f0/merged/run/ovn supports timestamps until 2038 (0x7fffffff) Dec 15 04:22:37 localhost systemd[1]: Started /usr/bin/podman healthcheck run ac2eae30a2a7f971398af42ea67b3ed799d4b3341f7688ebcd73f7ca7f66c2a8. Dec 15 04:22:37 localhost podman[154589]: 2025-12-15 09:22:37.109079696 +0000 UTC m=+0.132358318 container init ac2eae30a2a7f971398af42ea67b3ed799d4b3341f7688ebcd73f7ca7f66c2a8 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, config_id=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=ovn_controller, io.buildah.version=1.41.3, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '07f3af15d79276418f54a8dde2fc251064527c3571430e14af77aaa2c01f1193'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', 
'/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20251202) Dec 15 04:22:37 localhost ovn_controller[154603]: + sudo -E kolla_set_configs Dec 15 04:22:37 localhost systemd[1]: Started /usr/bin/podman healthcheck run ac2eae30a2a7f971398af42ea67b3ed799d4b3341f7688ebcd73f7ca7f66c2a8. Dec 15 04:22:37 localhost podman[154589]: 2025-12-15 09:22:37.142500534 +0000 UTC m=+0.165779126 container start ac2eae30a2a7f971398af42ea67b3ed799d4b3341f7688ebcd73f7ca7f66c2a8 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, org.label-schema.license=GPLv2, config_id=ovn_controller, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '07f3af15d79276418f54a8dde2fc251064527c3571430e14af77aaa2c01f1193'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=ovn_controller, org.label-schema.schema-version=1.0, 
tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202) Dec 15 04:22:37 localhost edpm-start-podman-container[154589]: ovn_controller Dec 15 04:22:37 localhost systemd[1]: Created slice User Slice of UID 0. Dec 15 04:22:37 localhost systemd[1]: Starting User Runtime Directory /run/user/0... Dec 15 04:22:37 localhost systemd[1]: Finished User Runtime Directory /run/user/0. Dec 15 04:22:37 localhost systemd[1]: Starting User Manager for UID 0... Dec 15 04:22:37 localhost podman[154611]: 2025-12-15 09:22:37.234288212 +0000 UTC m=+0.088218199 container health_status ac2eae30a2a7f971398af42ea67b3ed799d4b3341f7688ebcd73f7ca7f66c2a8 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=starting, tcib_managed=true, config_id=ovn_controller, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '07f3af15d79276418f54a8dde2fc251064527c3571430e14af77aaa2c01f1193'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes 
Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=ovn_controller) Dec 15 04:22:37 localhost edpm-start-podman-container[154588]: Creating additional drop-in dependency for "ovn_controller" (ac2eae30a2a7f971398af42ea67b3ed799d4b3341f7688ebcd73f7ca7f66c2a8) Dec 15 04:22:37 localhost systemd[1]: Reloading. Dec 15 04:22:37 localhost podman[154611]: 2025-12-15 09:22:37.319149155 +0000 UTC m=+0.173079172 container exec_died ac2eae30a2a7f971398af42ea67b3ed799d4b3341f7688ebcd73f7ca7f66c2a8 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '07f3af15d79276418f54a8dde2fc251064527c3571430e14af77aaa2c01f1193'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, org.label-schema.build-date=20251202, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true) Dec 15 04:22:37 localhost podman[154611]: unhealthy Dec 15 04:22:37 localhost systemd[154636]: Queued start job for default 
target Main User Target. Dec 15 04:22:37 localhost systemd[154636]: Created slice User Application Slice. Dec 15 04:22:37 localhost systemd[154636]: Mark boot as successful after the user session has run 2 minutes was skipped because of an unmet condition check (ConditionUser=!@system). Dec 15 04:22:37 localhost systemd[154636]: Started Daily Cleanup of User's Temporary Directories. Dec 15 04:22:37 localhost systemd[154636]: Reached target Paths. Dec 15 04:22:37 localhost systemd[154636]: Reached target Timers. Dec 15 04:22:37 localhost systemd[154636]: Starting D-Bus User Message Bus Socket... Dec 15 04:22:37 localhost systemd[154636]: Starting Create User's Volatile Files and Directories... Dec 15 04:22:37 localhost systemd[154636]: Finished Create User's Volatile Files and Directories. Dec 15 04:22:37 localhost systemd[154636]: Listening on D-Bus User Message Bus Socket. Dec 15 04:22:37 localhost systemd[154636]: Reached target Sockets. Dec 15 04:22:37 localhost systemd[154636]: Reached target Basic System. Dec 15 04:22:37 localhost systemd[154636]: Reached target Main User Target. Dec 15 04:22:37 localhost systemd[154636]: Startup finished in 127ms. Dec 15 04:22:37 localhost systemd-sysv-generator[154694]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. Dec 15 04:22:37 localhost systemd-rc-local-generator[154689]: /etc/rc.d/rc.local is not marked executable, skipping. Dec 15 04:22:37 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Dec 15 04:22:37 localhost systemd[1]: Started User Manager for UID 0. Dec 15 04:22:37 localhost systemd[1]: Started ovn_controller container. 
Dec 15 04:22:37 localhost systemd[1]: ac2eae30a2a7f971398af42ea67b3ed799d4b3341f7688ebcd73f7ca7f66c2a8.service: Main process exited, code=exited, status=1/FAILURE Dec 15 04:22:37 localhost systemd[1]: ac2eae30a2a7f971398af42ea67b3ed799d4b3341f7688ebcd73f7ca7f66c2a8.service: Failed with result 'exit-code'. Dec 15 04:22:37 localhost systemd[1]: Started Session c12 of User root. Dec 15 04:22:37 localhost systemd-journald[47230]: Field hash table of /run/log/journal/738a39f68bc78fb81032e509449fb759/system.journal has a fill level at 75.1 (250 of 333 items), suggesting rotation. Dec 15 04:22:37 localhost systemd-journald[47230]: /run/log/journal/738a39f68bc78fb81032e509449fb759/system.journal: Journal header limits reached or header out-of-date, rotating. Dec 15 04:22:37 localhost systemd[1]: Starting dnf makecache... Dec 15 04:22:37 localhost rsyslogd[759]: imjournal: journal files changed, reloading... [v8.2102.0-111.el9 try https://www.rsyslog.com/e/0 ] Dec 15 04:22:37 localhost rsyslogd[759]: imjournal: journal files changed, reloading... [v8.2102.0-111.el9 try https://www.rsyslog.com/e/0 ] Dec 15 04:22:37 localhost ovn_controller[154603]: INFO:__main__:Loading config file at /var/lib/kolla/config_files/config.json Dec 15 04:22:37 localhost ovn_controller[154603]: INFO:__main__:Validating config file Dec 15 04:22:37 localhost ovn_controller[154603]: INFO:__main__:Kolla config strategy set to: COPY_ALWAYS Dec 15 04:22:37 localhost ovn_controller[154603]: INFO:__main__:Writing out command to execute Dec 15 04:22:37 localhost systemd[1]: session-c12.scope: Deactivated successfully. 
Dec 15 04:22:37 localhost ovn_controller[154603]: ++ cat /run_command
Dec 15 04:22:37 localhost ovn_controller[154603]: + CMD='/usr/bin/ovn-controller --pidfile unix:/run/openvswitch/db.sock '
Dec 15 04:22:37 localhost ovn_controller[154603]: + ARGS=
Dec 15 04:22:37 localhost ovn_controller[154603]: + sudo kolla_copy_cacerts
Dec 15 04:22:37 localhost systemd[1]: Started Session c13 of User root.
Dec 15 04:22:37 localhost ovn_controller[154603]: + [[ ! -n '' ]]
Dec 15 04:22:37 localhost ovn_controller[154603]: + . kolla_extend_start
Dec 15 04:22:37 localhost ovn_controller[154603]: + echo 'Running command: '\''/usr/bin/ovn-controller --pidfile unix:/run/openvswitch/db.sock '\'''
Dec 15 04:22:37 localhost ovn_controller[154603]: Running command: '/usr/bin/ovn-controller --pidfile unix:/run/openvswitch/db.sock '
Dec 15 04:22:37 localhost ovn_controller[154603]: + umask 0022
Dec 15 04:22:37 localhost ovn_controller[154603]: + exec /usr/bin/ovn-controller --pidfile unix:/run/openvswitch/db.sock
Dec 15 04:22:37 localhost systemd[1]: session-c13.scope: Deactivated successfully.
Dec 15 04:22:37 localhost ovn_controller[154603]: 2025-12-15T09:22:37Z|00001|reconnect|INFO|unix:/run/openvswitch/db.sock: connecting...
Dec 15 04:22:37 localhost ovn_controller[154603]: 2025-12-15T09:22:37Z|00002|reconnect|INFO|unix:/run/openvswitch/db.sock: connected
Dec 15 04:22:37 localhost ovn_controller[154603]: 2025-12-15T09:22:37Z|00003|main|INFO|OVN internal version is : [24.03.8-20.33.0-76.8]
Dec 15 04:22:37 localhost ovn_controller[154603]: 2025-12-15T09:22:37Z|00004|main|INFO|OVS IDL reconnected, force recompute.
Dec 15 04:22:37 localhost ovn_controller[154603]: 2025-12-15T09:22:37Z|00005|reconnect|INFO|tcp:ovsdbserver-sb.openstack.svc:6642: connecting...
Dec 15 04:22:37 localhost ovn_controller[154603]: 2025-12-15T09:22:37Z|00006|main|INFO|OVNSB IDL reconnected, force recompute.
Dec 15 04:22:37 localhost ovn_controller[154603]: 2025-12-15T09:22:37Z|00007|reconnect|INFO|tcp:ovsdbserver-sb.openstack.svc:6642: connected
Dec 15 04:22:37 localhost ovn_controller[154603]: 2025-12-15T09:22:37Z|00008|features|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting to switch
Dec 15 04:22:37 localhost ovn_controller[154603]: 2025-12-15T09:22:37Z|00009|rconn|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting...
Dec 15 04:22:37 localhost ovn_controller[154603]: 2025-12-15T09:22:37Z|00010|features|INFO|OVS Feature: ct_zero_snat, state: supported
Dec 15 04:22:37 localhost ovn_controller[154603]: 2025-12-15T09:22:37Z|00011|features|INFO|OVS Feature: ct_flush, state: supported
Dec 15 04:22:37 localhost ovn_controller[154603]: 2025-12-15T09:22:37Z|00012|reconnect|INFO|unix:/run/openvswitch/db.sock: connecting...
Dec 15 04:22:37 localhost ovn_controller[154603]: 2025-12-15T09:22:37Z|00013|main|INFO|OVS feature set changed, force recompute.
Dec 15 04:22:37 localhost ovn_controller[154603]: 2025-12-15T09:22:37Z|00014|ofctrl|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting to switch
Dec 15 04:22:37 localhost ovn_controller[154603]: 2025-12-15T09:22:37Z|00015|rconn|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting...
Dec 15 04:22:37 localhost ovn_controller[154603]: 2025-12-15T09:22:37Z|00016|reconnect|INFO|unix:/run/openvswitch/db.sock: connected
Dec 15 04:22:37 localhost ovn_controller[154603]: 2025-12-15T09:22:37Z|00017|rconn|INFO|unix:/var/run/openvswitch/br-int.mgmt: connected
Dec 15 04:22:37 localhost ovn_controller[154603]: 2025-12-15T09:22:37Z|00018|main|INFO|OVS feature set changed, force recompute.
Dec 15 04:22:37 localhost ovn_controller[154603]: 2025-12-15T09:22:37Z|00019|rconn|INFO|unix:/var/run/openvswitch/br-int.mgmt: connected
Dec 15 04:22:37 localhost ovn_controller[154603]: 2025-12-15T09:22:37Z|00020|ofctrl|INFO|ofctrl-wait-before-clear is now 8000 ms (was 0 ms)
Dec 15 04:22:37 localhost ovn_controller[154603]: 2025-12-15T09:22:37Z|00021|main|INFO|OVS OpenFlow connection reconnected,force recompute.
Dec 15 04:22:37 localhost ovn_controller[154603]: 2025-12-15T09:22:37Z|00022|ovn_bfd|INFO|Disabled BFD on interface ovn-9f826b-0
Dec 15 04:22:37 localhost ovn_controller[154603]: 2025-12-15T09:22:37Z|00023|ovn_bfd|INFO|Disabled BFD on interface ovn-843308-0
Dec 15 04:22:37 localhost ovn_controller[154603]: 2025-12-15T09:22:37Z|00024|ovn_bfd|INFO|Disabled BFD on interface ovn-c1fd65-0
Dec 15 04:22:37 localhost ovn_controller[154603]: 2025-12-15T09:22:37Z|00025|features|INFO|OVS DB schema supports 4 flow table prefixes, our IDL supports: 4
Dec 15 04:22:37 localhost ovn_controller[154603]: 2025-12-15T09:22:37Z|00026|binding|INFO|Claiming lport 03ef8889-3216-43fb-8a52-4be17a956ce1 for this chassis.
Dec 15 04:22:37 localhost ovn_controller[154603]: 2025-12-15T09:22:37Z|00027|binding|INFO|03ef8889-3216-43fb-8a52-4be17a956ce1: Claiming fa:16:3e:74:df:7c 192.168.0.201
Dec 15 04:22:37 localhost ovn_controller[154603]: 2025-12-15T09:22:37Z|00028|binding|INFO|Releasing lport b35254ad-12eb-47bb-92be-44fefe0694f0 from this chassis (sb_readonly=0)
Dec 15 04:22:37 localhost ovn_controller[154603]: 2025-12-15T09:22:37Z|00029|binding|INFO|Removing lport 03ef8889-3216-43fb-8a52-4be17a956ce1 ovn-installed in OVS
Dec 15 04:22:37 localhost ovn_controller[154603]: 2025-12-15T09:22:37Z|00001|pinctrl(ovn_pinctrl0)|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting to switch
Dec 15 04:22:37 localhost ovn_controller[154603]: 2025-12-15T09:22:37Z|00001|statctrl(ovn_statctrl3)|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting to switch
Dec 15 04:22:37 localhost ovn_controller[154603]: 2025-12-15T09:22:37Z|00002|rconn(ovn_pinctrl0)|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting...
Dec 15 04:22:37 localhost ovn_controller[154603]: 2025-12-15T09:22:37Z|00002|rconn(ovn_statctrl3)|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting...
Dec 15 04:22:37 localhost ovn_controller[154603]: 2025-12-15T09:22:37Z|00003|rconn(ovn_statctrl3)|INFO|unix:/var/run/openvswitch/br-int.mgmt: connected
Dec 15 04:22:37 localhost ovn_controller[154603]: 2025-12-15T09:22:37Z|00003|rconn(ovn_pinctrl0)|INFO|unix:/var/run/openvswitch/br-int.mgmt: connected
Dec 15 04:22:37 localhost ovn_controller[154603]: 2025-12-15T09:22:37Z|00030|ovn_bfd|INFO|Enabled BFD on interface ovn-9f826b-0
Dec 15 04:22:37 localhost ovn_controller[154603]: 2025-12-15T09:22:37Z|00031|ovn_bfd|INFO|Enabled BFD on interface ovn-843308-0
Dec 15 04:22:37 localhost ovn_controller[154603]: 2025-12-15T09:22:37Z|00032|ovn_bfd|INFO|Enabled BFD on interface ovn-c1fd65-0
Dec 15 04:22:37 localhost ovn_controller[154603]: 2025-12-15T09:22:37Z|00033|binding|INFO|Releasing lport b35254ad-12eb-47bb-92be-44fefe0694f0 from this chassis (sb_readonly=0)
Dec 15 04:22:37 localhost ovn_controller[154603]: 2025-12-15T09:22:37Z|00034|binding|INFO|Releasing lport b35254ad-12eb-47bb-92be-44fefe0694f0 from this chassis (sb_readonly=0)
Dec 15 04:22:37 localhost ovn_controller[154603]: 2025-12-15T09:22:37Z|00035|binding|INFO|Releasing lport b35254ad-12eb-47bb-92be-44fefe0694f0 from this chassis (sb_readonly=0)
Dec 15 04:22:37 localhost ovn_controller[154603]: 2025-12-15T09:22:37Z|00036|binding|INFO|Releasing lport b35254ad-12eb-47bb-92be-44fefe0694f0 from this chassis (sb_readonly=0)
Dec 15 04:22:37 localhost ovn_controller[154603]: 2025-12-15T09:22:37Z|00037|binding|INFO|Releasing lport b35254ad-12eb-47bb-92be-44fefe0694f0 from this chassis (sb_readonly=0)
Dec 15 04:22:37 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:fc:1f:13 MACDST=fa:16:3e:b3:bb:e3 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=26949 DF PROTO=TCP SPT=44348 DPT=9105 SEQ=3282536519 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A8393EC240000000001030307)
Dec 15 04:22:37 localhost dnf[154701]: Updating Subscription Management repositories.
Dec 15 04:22:38 localhost ovn_controller[154603]: 2025-12-15T09:22:38Z|00038|binding|INFO|Releasing lport b35254ad-12eb-47bb-92be-44fefe0694f0 from this chassis (sb_readonly=0)
Dec 15 04:22:38 localhost ovn_controller[154603]: 2025-12-15T09:22:38Z|00039|binding|INFO|Releasing lport b35254ad-12eb-47bb-92be-44fefe0694f0 from this chassis (sb_readonly=0)
Dec 15 04:22:38 localhost ovn_controller[154603]: 2025-12-15T09:22:38Z|00040|binding|INFO|Releasing lport b35254ad-12eb-47bb-92be-44fefe0694f0 from this chassis (sb_readonly=0)
Dec 15 04:22:38 localhost ovn_controller[154603]: 2025-12-15T09:22:38Z|00041|binding|INFO|Releasing lport b35254ad-12eb-47bb-92be-44fefe0694f0 from this chassis (sb_readonly=0)
Dec 15 04:22:39 localhost ovn_controller[154603]: 2025-12-15T09:22:39Z|00042|binding|INFO|Releasing lport b35254ad-12eb-47bb-92be-44fefe0694f0 from this chassis (sb_readonly=0)
Dec 15 04:22:39 localhost dnf[154701]: Metadata cache refreshed recently.
Dec 15 04:22:39 localhost systemd[1]: dnf-makecache.service: Deactivated successfully.
Dec 15 04:22:39 localhost systemd[1]: Finished dnf makecache.
Dec 15 04:22:39 localhost systemd[1]: dnf-makecache.service: Consumed 1.993s CPU time.
Dec 15 04:22:39 localhost python3.9[154803]: ansible-ansible.builtin.slurp Invoked with src=/var/lib/edpm-config/deployed_services.yaml
Dec 15 04:22:40 localhost python3.9[154895]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/deployed_services.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 15 04:22:41 localhost python3.9[154968]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/deployed_services.yaml mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1765790560.1921692-1835-232786624030343/.source.yaml _original_basename=.k9g978oy follow=False checksum=8dd71ad93d7aa7c0e3bd356a24a062b1b02b7eb9 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 15 04:22:41 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:fc:1f:13 MACDST=fa:16:3e:b3:bb:e3 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=1189 DF PROTO=TCP SPT=52164 DPT=9101 SEQ=1266571064 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A8393F9250000000001030307)
Dec 15 04:22:42 localhost python3.9[155060]: ansible-ansible.legacy.command Invoked with _raw_params=ovs-vsctl remove open . other_config hw-offload#012 _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 15 04:22:42 localhost ovs-vsctl[155061]: ovs|00001|vsctl|INFO|Called as ovs-vsctl remove open . other_config hw-offload
Dec 15 04:22:43 localhost python3.9[155153]: ansible-ansible.legacy.command Invoked with _raw_params=ovs-vsctl get Open_vSwitch . external_ids:ovn-cms-options | sed 's/\"//g'#012 _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 15 04:22:43 localhost ovs-vsctl[155155]: ovs|00001|db_ctl_base|ERR|no key "ovn-cms-options" in Open_vSwitch record "." column external_ids
Dec 15 04:22:44 localhost python3.9[155248]: ansible-ansible.legacy.command Invoked with _raw_params=ovs-vsctl remove Open_vSwitch . external_ids ovn-cms-options#012 _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 15 04:22:44 localhost ovs-vsctl[155249]: ovs|00001|vsctl|INFO|Called as ovs-vsctl remove Open_vSwitch . external_ids ovn-cms-options
Dec 15 04:22:45 localhost systemd[1]: session-49.scope: Deactivated successfully.
Dec 15 04:22:45 localhost systemd[1]: session-49.scope: Consumed 41.687s CPU time.
Dec 15 04:22:45 localhost systemd-logind[763]: Session 49 logged out. Waiting for processes to exit.
Dec 15 04:22:45 localhost systemd-logind[763]: Removed session 49.
Dec 15 04:22:45 localhost ovn_controller[154603]: 2025-12-15T09:22:45Z|00043|binding|INFO|Setting lport 03ef8889-3216-43fb-8a52-4be17a956ce1 ovn-installed in OVS
Dec 15 04:22:45 localhost ovn_controller[154603]: 2025-12-15T09:22:45Z|00044|binding|INFO|Setting lport 03ef8889-3216-43fb-8a52-4be17a956ce1 up in Southbound
Dec 15 04:22:47 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:fc:1f:13 MACDST=fa:16:3e:b3:bb:e3 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=37887 DF PROTO=TCP SPT=32990 DPT=9882 SEQ=1974127088 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A839411250000000001030307)
Dec 15 04:22:47 localhost systemd[1]: Stopping User Manager for UID 0...
Dec 15 04:22:47 localhost systemd[154636]: Activating special unit Exit the Session...
Dec 15 04:22:47 localhost systemd[154636]: Stopped target Main User Target.
Dec 15 04:22:47 localhost systemd[154636]: Stopped target Basic System.
Dec 15 04:22:47 localhost systemd[154636]: Stopped target Paths.
Dec 15 04:22:47 localhost systemd[154636]: Stopped target Sockets.
Dec 15 04:22:47 localhost systemd[154636]: Stopped target Timers.
Dec 15 04:22:47 localhost systemd[154636]: Stopped Daily Cleanup of User's Temporary Directories.
Dec 15 04:22:47 localhost systemd[154636]: Closed D-Bus User Message Bus Socket.
Dec 15 04:22:47 localhost systemd[154636]: Stopped Create User's Volatile Files and Directories.
Dec 15 04:22:47 localhost systemd[154636]: Removed slice User Application Slice.
Dec 15 04:22:47 localhost systemd[154636]: Reached target Shutdown.
Dec 15 04:22:47 localhost systemd[154636]: Finished Exit the Session.
Dec 15 04:22:47 localhost systemd[154636]: Reached target Exit the Session.
Dec 15 04:22:47 localhost systemd[1]: user@0.service: Deactivated successfully.
Dec 15 04:22:47 localhost systemd[1]: Stopped User Manager for UID 0.
Dec 15 04:22:47 localhost systemd[1]: Stopping User Runtime Directory /run/user/0...
Dec 15 04:22:47 localhost systemd[1]: run-user-0.mount: Deactivated successfully.
Dec 15 04:22:47 localhost systemd[1]: user-runtime-dir@0.service: Deactivated successfully.
Dec 15 04:22:47 localhost systemd[1]: Stopped User Runtime Directory /run/user/0.
Dec 15 04:22:47 localhost systemd[1]: Removed slice User Slice of UID 0.
Dec 15 04:22:50 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:fc:1f:13 MACDST=fa:16:3e:b3:bb:e3 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=26951 DF PROTO=TCP SPT=44348 DPT=9105 SEQ=3282536519 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A83941D250000000001030307)
Dec 15 04:22:52 localhost sshd[155268]: main: sshd: ssh-rsa algorithm is disabled
Dec 15 04:22:52 localhost systemd-logind[763]: New session 51 of user zuul.
Dec 15 04:22:52 localhost systemd[1]: Started Session 51 of User zuul.
Dec 15 04:22:53 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:fc:1f:13 MACDST=fa:16:3e:b3:bb:e3 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=1962 DF PROTO=TCP SPT=48006 DPT=9100 SEQ=2397388030 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A839428030000000001030307)
Dec 15 04:22:53 localhost python3.9[155361]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec 15 04:22:54 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:fc:1f:13 MACDST=fa:16:3e:b3:bb:e3 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=1963 DF PROTO=TCP SPT=48006 DPT=9100 SEQ=2397388030 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A83942C250000000001030307)
Dec 15 04:22:55 localhost python3.9[155457]: ansible-ansible.builtin.file Invoked with group=zuul owner=zuul path=/var/lib/openstack/neutron-ovn-metadata-agent setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Dec 15 04:22:55 localhost python3.9[155549]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/neutron setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec 15 04:22:56 localhost python3.9[155642]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/neutron/kill_scripts setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec 15 04:22:56 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:fc:1f:13 MACDST=fa:16:3e:b3:bb:e3 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=1964 DF PROTO=TCP SPT=48006 DPT=9100 SEQ=2397388030 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A839434250000000001030307)
Dec 15 04:22:56 localhost python3.9[155734]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/neutron/ovn-metadata-proxy setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec 15 04:22:57 localhost python3.9[155826]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/neutron/external/pids setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec 15 04:22:58 localhost python3.9[155916]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'selinux'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec 15 04:22:58 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:fc:1f:13 MACDST=fa:16:3e:b3:bb:e3 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=11512 DF PROTO=TCP SPT=39906 DPT=9101 SEQ=1951186649 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A83943E240000000001030307)
Dec 15 04:22:59 localhost python3.9[156008]: ansible-ansible.posix.seboolean Invoked with name=virt_sandbox_use_netlink persistent=True state=True ignore_selinux_state=False
Dec 15 04:23:00 localhost python3.9[156098]: ansible-ansible.legacy.stat Invoked with path=/var/lib/neutron/ovn_metadata_haproxy_wrapper follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 15 04:23:00 localhost python3.9[156171]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/neutron/ovn_metadata_haproxy_wrapper mode=0755 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1765790579.457706-218-49606575935582/.source follow=False _original_basename=haproxy.j2 checksum=a5072e7b19ca96a1f495d94f97f31903737cfd27 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Dec 15 04:23:01 localhost python3.9[156261]: ansible-ansible.legacy.stat Invoked with path=/var/lib/neutron/kill_scripts/haproxy-kill follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 15 04:23:01 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:fc:1f:13 MACDST=fa:16:3e:b3:bb:e3 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=61452 DF PROTO=TCP SPT=36318 DPT=9882 SEQ=1525524826 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A839449A90000000001030307)
Dec 15 04:23:02 localhost ovn_controller[154603]: 2025-12-15T09:23:02Z|00045|memory|INFO|17104 kB peak resident set size after 24.3 seconds
Dec 15 04:23:02 localhost ovn_controller[154603]: 2025-12-15T09:23:02Z|00046|memory|INFO|idl-cells-OVN_Southbound:4033 idl-cells-Open_vSwitch:1045 if_status_mgr_ifaces_state_usage-KB:1 if_status_mgr_ifaces_usage-KB:1 lflow-cache-entries-cache-expr:76 lflow-cache-entries-cache-matches:195 lflow-cache-size-KB:289 local_datapath_usage-KB:1 ofctrl_desired_flow_usage-KB:154 ofctrl_installed_flow_usage-KB:112 ofctrl_sb_flow_ref_usage-KB:67
Dec 15 04:23:02 localhost python3.9[156334]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/neutron/kill_scripts/haproxy-kill mode=0755 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1765790581.2528734-263-259229401285243/.source follow=False _original_basename=kill-script.j2 checksum=2dfb5489f491f61b95691c3bf95fa1fe48ff3700 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Dec 15 04:23:03 localhost python3.9[156426]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Dec 15 04:23:04 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:fc:1f:13 MACDST=fa:16:3e:b3:bb:e3 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=61454 DF PROTO=TCP SPT=36318 DPT=9882 SEQ=1525524826 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A839455A50000000001030307)
Dec 15 04:23:05 localhost python3.9[156480]: ansible-ansible.legacy.dnf Invoked with name=['openvswitch3.3'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Dec 15 04:23:07 localhost systemd[1]: Started /usr/bin/podman healthcheck run ac2eae30a2a7f971398af42ea67b3ed799d4b3341f7688ebcd73f7ca7f66c2a8.
Dec 15 04:23:07 localhost systemd[1]: tmp-crun.vy90uM.mount: Deactivated successfully.
Dec 15 04:23:07 localhost podman[156483]: 2025-12-15 09:23:07.745436832 +0000 UTC m=+0.076847748 container health_status ac2eae30a2a7f971398af42ea67b3ed799d4b3341f7688ebcd73f7ca7f66c2a8 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=starting, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '07f3af15d79276418f54a8dde2fc251064527c3571430e14af77aaa2c01f1193'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, config_id=ovn_controller, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.vendor=CentOS)
Dec 15 04:23:07 localhost podman[156483]: 2025-12-15 09:23:07.834404035 +0000 UTC m=+0.165814951 container exec_died ac2eae30a2a7f971398af42ea67b3ed799d4b3341f7688ebcd73f7ca7f66c2a8 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_id=ovn_controller, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '07f3af15d79276418f54a8dde2fc251064527c3571430e14af77aaa2c01f1193'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team)
Dec 15 04:23:07 localhost systemd[1]: ac2eae30a2a7f971398af42ea67b3ed799d4b3341f7688ebcd73f7ca7f66c2a8.service: Deactivated successfully.
Dec 15 04:23:07 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:fc:1f:13 MACDST=fa:16:3e:b3:bb:e3 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=1372 DF PROTO=TCP SPT=47770 DPT=9105 SEQ=283524394 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A839461650000000001030307)
Dec 15 04:23:09 localhost python3.9[156599]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=openvswitch.service state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Dec 15 04:23:11 localhost python3.9[156692]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/neutron-ovn-metadata-agent/01-rootwrap.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 15 04:23:11 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:fc:1f:13 MACDST=fa:16:3e:b3:bb:e3 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=11514 DF PROTO=TCP SPT=39906 DPT=9101 SEQ=1951186649 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A83946F250000000001030307)
Dec 15 04:23:11 localhost python3.9[156763]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/neutron-ovn-metadata-agent/01-rootwrap.conf mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1765790591.0024154-374-160876770775422/.source.conf follow=False _original_basename=rootwrap.conf.j2 checksum=11f2cfb4b7d97b2cef3c2c2d88089e6999cffe22 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Dec 15 04:23:12 localhost python3.9[156853]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/neutron-ovn-metadata-agent/01-neutron-ovn-metadata-agent.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 15 04:23:13 localhost python3.9[156970]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/neutron-ovn-metadata-agent/01-neutron-ovn-metadata-agent.conf mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1765790591.985905-374-167045336328234/.source.conf follow=False _original_basename=neutron-ovn-metadata-agent.conf.j2 checksum=8bc979abbe81c2cf3993a225517a7e2483e20443 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Dec 15 04:23:15 localhost python3.9[157092]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/neutron-ovn-metadata-agent/10-neutron-metadata.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 15 04:23:15 localhost python3.9[157163]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/neutron-ovn-metadata-agent/10-neutron-metadata.conf mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1765790594.5950909-506-14803171564136/.source.conf _original_basename=10-neutron-metadata.conf follow=False checksum=aa9e89725fbcebf7a5c773d7b97083445b7b7759 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Dec 15 04:23:15 localhost ovn_controller[154603]: 2025-12-15T09:23:15Z|00047|memory_trim|INFO|Detected inactivity (last active 30009 ms ago): trimming memory
Dec 15 04:23:16 localhost python3.9[157253]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/neutron-ovn-metadata-agent/05-nova-metadata.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 15 04:23:16 localhost python3.9[157324]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/neutron-ovn-metadata-agent/05-nova-metadata.conf mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1765790595.63367-506-228405251123879/.source.conf _original_basename=05-nova-metadata.conf follow=False checksum=979187b925479d81d0609f4188e5b95fe1f92c18 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Dec 15 04:23:17 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:fc:1f:13 MACDST=fa:16:3e:b3:bb:e3 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=61456 DF PROTO=TCP SPT=36318 DPT=9882 SEQ=1525524826 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A839485260000000001030307)
Dec 15 04:23:17 localhost python3.9[157414]: ansible-ansible.builtin.stat Invoked with path=/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec 15 04:23:18 localhost python3.9[157508]: ansible-ansible.builtin.file Invoked with path=/var/local/libexec recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Dec 15 04:23:18 localhost python3.9[157600]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-container-shutdown follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 15 04:23:19 localhost python3.9[157648]: ansible-ansible.legacy.file Invoked with group=root mode=0700 owner=root setype=container_file_t dest=/var/local/libexec/edpm-container-shutdown _original_basename=edpm-container-shutdown recurse=False state=file path=/var/local/libexec/edpm-container-shutdown force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec 15 04:23:19 localhost python3.9[157740]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-start-podman-container follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 15 04:23:20 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:fc:1f:13 MACDST=fa:16:3e:b3:bb:e3 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=1374 DF PROTO=TCP SPT=47770 DPT=9105 SEQ=283524394 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A839491250000000001030307)
Dec 15 04:23:20 localhost python3.9[157788]: ansible-ansible.legacy.file Invoked with group=root mode=0700 owner=root setype=container_file_t dest=/var/local/libexec/edpm-start-podman-container _original_basename=edpm-start-podman-container recurse=False state=file path=/var/local/libexec/edpm-start-podman-container force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec 15 04:23:20 localhost python3.9[157880]: ansible-ansible.builtin.file Invoked with mode=420 path=/etc/systemd/system-preset state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 15 04:23:21 localhost python3.9[157972]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/edpm-container-shutdown.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 15 04:23:22 localhost python3.9[158020]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system/edpm-container-shutdown.service _original_basename=edpm-container-shutdown-service recurse=False state=file path=/etc/systemd/system/edpm-container-shutdown.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 15 04:23:23 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:fc:1f:13 MACDST=fa:16:3e:b3:bb:e3 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=8657 DF PROTO=TCP SPT=47694 DPT=9100 SEQ=265801259 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A83949D330000000001030307)
Dec 15 04:23:23 localhost python3.9[158112]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 15 04:23:23 localhost python3.9[158160]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system-preset/91-edpm-container-shutdown.preset _original_basename=91-edpm-container-shutdown-preset recurse=False state=file path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset force=False follow=True
modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None Dec 15 04:23:24 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:fc:1f:13 MACDST=fa:16:3e:b3:bb:e3 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=8658 DF PROTO=TCP SPT=47694 DPT=9100 SEQ=265801259 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A8394A1250000000001030307) Dec 15 04:23:25 localhost python3.9[158252]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=edpm-container-shutdown state=started daemon_reexec=False scope=system no_block=False force=None masked=None Dec 15 04:23:25 localhost systemd[1]: Reloading. Dec 15 04:23:25 localhost systemd-sysv-generator[158278]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. Dec 15 04:23:25 localhost systemd-rc-local-generator[158274]: /etc/rc.d/rc.local is not marked executable, skipping. Dec 15 04:23:25 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. 
Dec 15 04:23:26 localhost python3.9[158382]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/netns-placeholder.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Dec 15 04:23:26 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:fc:1f:13 MACDST=fa:16:3e:b3:bb:e3 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=8659 DF PROTO=TCP SPT=47694 DPT=9100 SEQ=265801259 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A8394A9250000000001030307) Dec 15 04:23:26 localhost python3.9[158430]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system/netns-placeholder.service _original_basename=netns-placeholder-service recurse=False state=file path=/etc/systemd/system/netns-placeholder.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None Dec 15 04:23:27 localhost python3.9[158522]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-netns-placeholder.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Dec 15 04:23:28 localhost python3.9[158570]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system-preset/91-netns-placeholder.preset _original_basename=91-netns-placeholder-preset recurse=False state=file path=/etc/systemd/system-preset/91-netns-placeholder.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None Dec 15 04:23:28 localhost 
python3.9[158662]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=netns-placeholder state=started daemon_reexec=False scope=system no_block=False force=None masked=None Dec 15 04:23:28 localhost systemd[1]: Reloading. Dec 15 04:23:28 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:fc:1f:13 MACDST=fa:16:3e:b3:bb:e3 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=15294 DF PROTO=TCP SPT=33904 DPT=9101 SEQ=1789549141 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A8394B3250000000001030307) Dec 15 04:23:28 localhost systemd-rc-local-generator[158688]: /etc/rc.d/rc.local is not marked executable, skipping. Dec 15 04:23:28 localhost systemd-sysv-generator[158692]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. Dec 15 04:23:28 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Dec 15 04:23:29 localhost systemd[1]: Starting Create netns directory... Dec 15 04:23:29 localhost systemd[1]: run-netns-placeholder.mount: Deactivated successfully. Dec 15 04:23:29 localhost systemd[1]: netns-placeholder.service: Deactivated successfully. Dec 15 04:23:29 localhost systemd[1]: Finished Create netns directory. 
Dec 15 04:23:30 localhost python3.9[158796]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/openstack/healthchecks setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None Dec 15 04:23:30 localhost python3.9[158888]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/healthchecks/ovn_metadata_agent/healthcheck follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Dec 15 04:23:31 localhost python3.9[158961]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/healthchecks/ovn_metadata_agent/ group=zuul mode=0700 owner=zuul setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1765790610.3574886-959-169330752727590/.source _original_basename=healthcheck follow=False checksum=898a5a1fcd473cf731177fc866e3bd7ebf20a131 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None Dec 15 04:23:31 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:fc:1f:13 MACDST=fa:16:3e:b3:bb:e3 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=41876 DF PROTO=TCP SPT=57782 DPT=9882 SEQ=630765296 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A8394BED90000000001030307) Dec 15 04:23:32 localhost python3.9[159053]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/edpm-config recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None 
src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None Dec 15 04:23:33 localhost python3.9[159145]: ansible-ansible.builtin.file Invoked with path=/var/lib/kolla/config_files recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None Dec 15 04:23:34 localhost python3.9[159237]: ansible-ansible.legacy.stat Invoked with path=/var/lib/kolla/config_files/ovn_metadata_agent.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Dec 15 04:23:34 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:fc:1f:13 MACDST=fa:16:3e:b3:bb:e3 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=41878 DF PROTO=TCP SPT=57782 DPT=9882 SEQ=630765296 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A8394CAE50000000001030307) Dec 15 04:23:35 localhost python3.9[159312]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/kolla/config_files/ovn_metadata_agent.json mode=0600 src=/home/zuul/.ansible/tmp/ansible-tmp-1765790613.9048393-1058-29326142601470/.source.json _original_basename=.bza0x101 follow=False checksum=a908ef151ded3a33ae6c9ac8be72a35e5e33b9dc backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Dec 15 04:23:36 localhost python3.9[159402]: ansible-ansible.builtin.file Invoked with mode=0755 path=/var/lib/edpm-config/container-startup-config/ovn_metadata_agent state=directory recurse=False force=False follow=True 
modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Dec 15 04:23:37 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:fc:1f:13 MACDST=fa:16:3e:b3:bb:e3 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=60865 DF PROTO=TCP SPT=36376 DPT=9105 SEQ=1399683046 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A8394D6650000000001030307) Dec 15 04:23:38 localhost systemd[1]: Started /usr/bin/podman healthcheck run ac2eae30a2a7f971398af42ea67b3ed799d4b3341f7688ebcd73f7ca7f66c2a8. Dec 15 04:23:38 localhost systemd[1]: tmp-crun.YQqpdd.mount: Deactivated successfully. Dec 15 04:23:38 localhost podman[159655]: 2025-12-15 09:23:38.391030885 +0000 UTC m=+0.100514682 container health_status ac2eae30a2a7f971398af42ea67b3ed799d4b3341f7688ebcd73f7ca7f66c2a8 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '07f3af15d79276418f54a8dde2fc251064527c3571430e14af77aaa2c01f1193'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, container_name=ovn_controller) Dec 15 04:23:38 localhost podman[159655]: 2025-12-15 09:23:38.434422798 +0000 UTC m=+0.143906555 container exec_died ac2eae30a2a7f971398af42ea67b3ed799d4b3341f7688ebcd73f7ca7f66c2a8 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=ovn_controller, org.label-schema.schema-version=1.0, config_id=ovn_controller, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '07f3af15d79276418f54a8dde2fc251064527c3571430e14af77aaa2c01f1193'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3) Dec 15 04:23:38 localhost systemd[1]: 
ac2eae30a2a7f971398af42ea67b3ed799d4b3341f7688ebcd73f7ca7f66c2a8.service: Deactivated successfully. Dec 15 04:23:38 localhost python3.9[159656]: ansible-container_config_data Invoked with config_overrides={} config_path=/var/lib/edpm-config/container-startup-config/ovn_metadata_agent config_pattern=*.json debug=False Dec 15 04:23:39 localhost python3.9[159770]: ansible-container_config_hash Invoked with check_mode=False config_vol_prefix=/var/lib/openstack Dec 15 04:23:40 localhost python3.9[159862]: ansible-containers.podman.podman_container_info Invoked with executable=podman name=None Dec 15 04:23:41 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:fc:1f:13 MACDST=fa:16:3e:b3:bb:e3 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=15296 DF PROTO=TCP SPT=33904 DPT=9101 SEQ=1789549141 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A8394E3250000000001030307) Dec 15 04:23:44 localhost python3[159980]: ansible-edpm_container_manage Invoked with concurrency=1 config_dir=/var/lib/edpm-config/container-startup-config/ovn_metadata_agent config_id=ovn_metadata_agent config_overrides={} config_patterns=*.json containers=['ovn_metadata_agent'] log_base_path=/var/log/containers/stdouts debug=False Dec 15 04:23:44 localhost python3[159980]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: [#012 {#012 "Id": "3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3",#012 "Digest": "sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2",#012 "RepoTags": [#012 "quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified"#012 ],#012 "RepoDigests": [#012 "quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2"#012 ],#012 "Parent": "",#012 "Comment": "",#012 "Created": "2025-12-08T06:28:07.122893603Z",#012 "Config": {#012 "User": "neutron",#012 "Env": [#012 
"PATH=/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin",#012 "LANG=en_US.UTF-8",#012 "TZ=UTC",#012 "container=oci"#012 ],#012 "Entrypoint": [#012 "dumb-init",#012 "--single-child",#012 "--"#012 ],#012 "Cmd": [#012 "kolla_start"#012 ],#012 "Labels": {#012 "io.buildah.version": "1.41.3",#012 "maintainer": "OpenStack Kubernetes Operator team",#012 "org.label-schema.build-date": "20251202",#012 "org.label-schema.license": "GPLv2",#012 "org.label-schema.name": "CentOS Stream 9 Base Image",#012 "org.label-schema.schema-version": "1.0",#012 "org.label-schema.vendor": "CentOS",#012 "tcib_build_tag": "c3923531bcda0b0811b2d5053f189beb",#012 "tcib_managed": "true"#012 },#012 "StopSignal": "SIGTERM"#012 },#012 "Version": "",#012 "Author": "",#012 "Architecture": "amd64",#012 "Os": "linux",#012 "Size": 784723198,#012 "VirtualSize": 784723198,#012 "GraphDriver": {#012 "Name": "overlay",#012 "Data": {#012 "LowerDir": "/var/lib/containers/storage/overlay/e286d84738a37bd2b207737e0500901c46e6f74c0034deffa3c9c2ea6c42af5a/diff:/var/lib/containers/storage/overlay/3f01a5f11d308182c9ef96830a09f87e28c35e55cefdcd5aaa0bea98e3111a1e/diff:/var/lib/containers/storage/overlay/4c2a493cc38fe0c2d274b137f7d549c92d76e83cf216e797584fb8469937682d/diff:/var/lib/containers/storage/overlay/102653142e2259aa6223045dee7736729104ac8aed3ce9b3c87a6d0787e59de8/diff:/var/lib/containers/storage/overlay/a170762be59c15b133bd19c602942600caa3082ffe7158ccee8771dfc16bb660/diff",#012 "UpperDir": "/var/lib/containers/storage/overlay/1c99620ce928d1d7a7fa7a4a270012879db892360c109f88ecf7a139ea7db3ab/diff",#012 "WorkDir": "/var/lib/containers/storage/overlay/1c99620ce928d1d7a7fa7a4a270012879db892360c109f88ecf7a139ea7db3ab/work"#012 }#012 },#012 "RootFS": {#012 "Type": "layers",#012 "Layers": [#012 "sha256:a170762be59c15b133bd19c602942600caa3082ffe7158ccee8771dfc16bb660",#012 "sha256:47bbb708952ccfdaf6b1a15cd5347cc2e9ee37e63ec65603401dcebf66de9242",#012 
"sha256:dde195c4be3ea0882f3029365e3a9510c9e08a199c8a2c93ddc2b8aa725a10f1",#012 "sha256:108989fda0cdd6bbb662590b73002e9463660d785f81310df7ee368fa28b901c",#012 "sha256:73aff16c23a5f196357ae405dcc16a167ae55e8d21d8a8f56756af638e773ef6",#012 "sha256:8b8803b3462d229dcf5c7649cfa1382ca44ec246714157a327945ea0bf316dd9"#012 ]#012 },#012 "Labels": {#012 "io.buildah.version": "1.41.3",#012 "maintainer": "OpenStack Kubernetes Operator team",#012 "org.label-schema.build-date": "20251202",#012 "org.label-schema.license": "GPLv2",#012 "org.label-schema.name": "CentOS Stream 9 Base Image",#012 "org.label-schema.schema-version": "1.0",#012 "org.label-schema.vendor": "CentOS",#012 "tcib_build_tag": "c3923531bcda0b0811b2d5053f189beb",#012 "tcib_managed": "true"#012 },#012 "Annotations": {},#012 "ManifestType": "application/vnd.docker.distribution.manifest.v2+json",#012 "User": "neutron",#012 "History": [#012 {#012 "created": "2025-12-02T04:26:51.317229596Z",#012 "created_by": "/bin/sh -c #(nop) ADD file:9f05a0f58e10b77188c7243d914ce56c5ce3e0f2ee7e13a7b0d4990588c97b99 in / ",#012 "empty_layer": true#012 },#012 {#012 "created": "2025-12-02T04:26:51.317315213Z",#012 "created_by": "/bin/sh -c #(nop) LABEL org.label-schema.schema-version=\"1.0\" org.label-schema.name=\"CentOS Stream 9 Base Image\" org.label-schema.vendor=\"CentOS\" org.label-schema.license=\"GPLv2\" org.label-schema.build-date=\"20251202\"",#012 "empty_layer": true#012 },#012 {#012 "created": "2025-12-02T04:26:54.063957926Z",#012 "created_by": "/bin/sh -c #(nop) CMD [\"/bin/bash\"]"#012 },#012 {#012 "created": "2025-12-08T06:08:28.750777742Z",#012 "created_by": "/bin/sh -c #(nop) LABEL maintainer=\"OpenStack Kubernetes Operator team\"",#012 "comment": "FROM quay.io/centos/centos:stream9",#012 "empty_layer": true#012 },#012 {#012 "created": "2025-12-08T06:08:28.750791962Z",#012 "created_by": "/bin/sh -c #(nop) LABEL tcib_managed=true",#012 "empty_layer": true#012 },#012 {#012 "created": "2025-12-08T06:08:28.750804372Z",#012 
"created_by": "/bin/sh -c #(nop) ENV LANG=\"en_US.UTF-8\"",#012 "empty_layer": true#012 },#012 {#012 "created": "2025-12-08T06:08:28.750813613Z",#012 "created_by": "/bin/sh -c #(nop) ENV TZ=\"UTC\"",#012 "empty_layer": true#012 },#012 {#012 "created": "2025-12-08T06:08:28.750824813Z",#012 "created_by": "/bin/sh -c #(nop) ENV container=\"oci\"",#012 "empty_layer": true#012 },#012 {#012 "created": "2025-12-08T06:08:28.750833663Z",#012 "created_by": "/bin/sh -c #(nop) USER root",#012 "empty_layer": true#012 },#012 {#012 "created": "2025-12-08T06:08:29.160435164Z",#012 "created_by": "/bin/sh -c if [ -f \"/etc/yum.repos.d/ubi.repo\" ]; then rm -f /etc/yum.repos.d/ubi.repo && dnf clean all && rm -rf /var/cache/dnf; fi",#012 "empty_layer": true#012 },#012 {#012 "created": "2025-12-08T06:09:05.859236491Z",#012 "created_by": "/bin/sh -c dnf install -y crudini && crudini --del /etc/dnf/dnf.conf main override_install_langs && crudini --set /etc/dnf/dnf.conf main clean_requirements_on_remove True && crudini --set /etc/dnf/dnf.conf main exactarch 1 && crudini --set /etc/dnf/dnf.conf main gpgcheck 1 && crudini --set /etc/dnf/dnf.conf main install_weak_deps False && if [ 'centos' == 'centos' ];then crudini --set /etc/dnf/dnf.conf main best False; fi && crudini --set /etc/dnf/dnf.conf main installonly_limit 0 && crudini --set /etc/dnf/dnf.conf main keepcache 0 && crudini --set /etc/dnf/dnf.conf main obsoletes 1 && crudini --set /etc/dnf/dnf.con Dec 15 04:23:44 localhost podman[160029]: 2025-12-15 09:23:44.859967343 +0000 UTC m=+0.091150003 container remove 4d35b7dc3f3a4e3c60661e85d3511f50804e87244d74cab67af5c6e8c447d379 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, version=17.1.12, distribution-scope=public, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, name=rhosp17/openstack-neutron-metadata-agent-ovn, vcs-type=git, release=1761123044, build-date=2025-11-19T00:14:25Z, config_id=tripleo_step4, 
vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, url=https://www.redhat.com, konflux.additional-tags=17.1.12 17.1_20251118.1, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'a56a6f14b467cd9064e40c03defa5ed7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, io.openshift.expose-services=, batch=17.1_20251118.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, container_name=ovn_metadata_agent, managed_by=tripleo_ansible, vendor=Red Hat, Inc., architecture=x86_64, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.4, 
com.redhat.component=openstack-neutron-metadata-agent-ovn-container, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn) Dec 15 04:23:44 localhost python3[159980]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman rm --force ovn_metadata_agent Dec 15 04:23:44 localhost podman[160043]: Dec 15 04:23:44 localhost podman[160043]: 2025-12-15 09:23:44.965749085 +0000 UTC m=+0.087006971 container create 4d46c406a3855fd1c322e5d86c048188ea5865daa07ba23aaebacc7879668923 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '07f3af15d79276418f54a8dde2fc251064527c3571430e14af77aaa2c01f1193-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', 
'/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.license=GPLv2, container_name=ovn_metadata_agent, config_id=ovn_metadata_agent, managed_by=edpm_ansible, io.buildah.version=1.41.3) Dec 15 04:23:44 localhost podman[160043]: 2025-12-15 09:23:44.923913115 +0000 UTC m=+0.045171031 image pull quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified Dec 15 04:23:44 localhost python3[159980]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman create --name ovn_metadata_agent --cgroupns=host --conmon-pidfile /run/ovn_metadata_agent.pid --env KOLLA_CONFIG_STRATEGY=COPY_ALWAYS --env EDPM_CONFIG_HASH=07f3af15d79276418f54a8dde2fc251064527c3571430e14af77aaa2c01f1193-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311 --healthcheck-command /openstack/healthcheck --label config_id=ovn_metadata_agent --label container_name=ovn_metadata_agent --label managed_by=edpm_ansible --label config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '07f3af15d79276418f54a8dde2fc251064527c3571430e14af77aaa2c01f1193-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': 
['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']} --log-driver journald --log-level info --network host --pid host --privileged=True --user root --volume /run/openvswitch:/run/openvswitch:z --volume /var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z --volume /run/netns:/run/netns:shared --volume /var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro --volume /var/lib/neutron:/var/lib/neutron:shared,z --volume /var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro --volume /var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro --volume /var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z --volume /var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified Dec 15 04:23:45 localhost python3.9[160171]: ansible-ansible.builtin.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1 Dec 15 04:23:46 localhost python3.9[160265]: ansible-file Invoked with path=/etc/systemd/system/edpm_ovn_metadata_agent.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None 
_diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Dec 15 04:23:47 localhost python3.9[160311]: ansible-stat Invoked with path=/etc/systemd/system/edpm_ovn_metadata_agent_healthcheck.timer follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1 Dec 15 04:23:47 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:fc:1f:13 MACDST=fa:16:3e:b3:bb:e3 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=41880 DF PROTO=TCP SPT=57782 DPT=9882 SEQ=630765296 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A8394FB250000000001030307) Dec 15 04:23:47 localhost python3.9[160402]: ansible-copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1765790627.101321-1328-93479662471157/source dest=/etc/systemd/system/edpm_ovn_metadata_agent.service mode=0644 owner=root group=root backup=False force=True remote_src=False follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None Dec 15 04:23:48 localhost python3.9[160448]: ansible-systemd Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None Dec 15 04:23:48 localhost systemd[1]: Reloading. Dec 15 04:23:48 localhost systemd-rc-local-generator[160470]: /etc/rc.d/rc.local is not marked executable, skipping. Dec 15 04:23:48 localhost systemd-sysv-generator[160475]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. 
Dec 15 04:23:48 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Dec 15 04:23:49 localhost python3.9[160529]: ansible-systemd Invoked with state=restarted name=edpm_ovn_metadata_agent.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None Dec 15 04:23:49 localhost systemd[1]: Reloading. Dec 15 04:23:49 localhost systemd-sysv-generator[160562]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. Dec 15 04:23:49 localhost systemd-rc-local-generator[160559]: /etc/rc.d/rc.local is not marked executable, skipping. Dec 15 04:23:49 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Dec 15 04:23:49 localhost systemd[1]: Starting ovn_metadata_agent container... Dec 15 04:23:49 localhost systemd[1]: tmp-crun.DeUZsn.mount: Deactivated successfully. Dec 15 04:23:49 localhost systemd[1]: Started libcrun container. Dec 15 04:23:49 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/14a6ad23b11d062c9bf1b5a12cea6f91f689e1e5405e8b2142943bce5dfa8b31/merged/etc/neutron.conf.d supports timestamps until 2038 (0x7fffffff) Dec 15 04:23:49 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/14a6ad23b11d062c9bf1b5a12cea6f91f689e1e5405e8b2142943bce5dfa8b31/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff) Dec 15 04:23:49 localhost systemd[1]: Started /usr/bin/podman healthcheck run 4d46c406a3855fd1c322e5d86c048188ea5865daa07ba23aaebacc7879668923. 
Dec 15 04:23:49 localhost podman[160571]: 2025-12-15 09:23:49.811358765 +0000 UTC m=+0.172862471 container init 4d46c406a3855fd1c322e5d86c048188ea5865daa07ba23aaebacc7879668923 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '07f3af15d79276418f54a8dde2fc251064527c3571430e14af77aaa2c01f1193-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=ovn_metadata_agent) Dec 15 04:23:49 localhost ovn_metadata_agent[160585]: + sudo -E 
kolla_set_configs Dec 15 04:23:49 localhost systemd[1]: Started /usr/bin/podman healthcheck run 4d46c406a3855fd1c322e5d86c048188ea5865daa07ba23aaebacc7879668923. Dec 15 04:23:49 localhost podman[160571]: 2025-12-15 09:23:49.854545362 +0000 UTC m=+0.216049028 container start 4d46c406a3855fd1c322e5d86c048188ea5865daa07ba23aaebacc7879668923 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '07f3af15d79276418f54a8dde2fc251064527c3571430e14af77aaa2c01f1193-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, 
tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS) Dec 15 04:23:49 localhost edpm-start-podman-container[160571]: ovn_metadata_agent Dec 15 04:23:49 localhost ovn_metadata_agent[160585]: INFO:__main__:Loading config file at /var/lib/kolla/config_files/config.json Dec 15 04:23:49 localhost ovn_metadata_agent[160585]: INFO:__main__:Validating config file Dec 15 04:23:49 localhost ovn_metadata_agent[160585]: INFO:__main__:Kolla config strategy set to: COPY_ALWAYS Dec 15 04:23:49 localhost ovn_metadata_agent[160585]: INFO:__main__:Copying service configuration files Dec 15 04:23:49 localhost ovn_metadata_agent[160585]: INFO:__main__:Deleting /etc/neutron/rootwrap.conf Dec 15 04:23:49 localhost ovn_metadata_agent[160585]: INFO:__main__:Copying /etc/neutron.conf.d/01-rootwrap.conf to /etc/neutron/rootwrap.conf Dec 15 04:23:49 localhost ovn_metadata_agent[160585]: INFO:__main__:Setting permission for /etc/neutron/rootwrap.conf Dec 15 04:23:49 localhost ovn_metadata_agent[160585]: INFO:__main__:Writing out command to execute Dec 15 04:23:49 localhost ovn_metadata_agent[160585]: INFO:__main__:Setting permission for /var/lib/neutron Dec 15 04:23:49 localhost ovn_metadata_agent[160585]: INFO:__main__:Setting permission for /var/lib/neutron/kill_scripts Dec 15 04:23:49 localhost ovn_metadata_agent[160585]: INFO:__main__:Setting permission for /var/lib/neutron/.cache Dec 15 04:23:49 localhost ovn_metadata_agent[160585]: INFO:__main__:Setting permission for /var/lib/neutron/external Dec 15 04:23:49 localhost ovn_metadata_agent[160585]: INFO:__main__:Setting permission for /var/lib/neutron/ovn-metadata-proxy Dec 15 04:23:49 localhost ovn_metadata_agent[160585]: INFO:__main__:Setting permission for /var/lib/neutron/ovn_metadata_haproxy_wrapper Dec 15 04:23:49 localhost ovn_metadata_agent[160585]: INFO:__main__:Setting permission for /var/lib/neutron/metadata_proxy Dec 15 04:23:49 localhost ovn_metadata_agent[160585]: 
INFO:__main__:Setting permission for /var/lib/neutron/kill_scripts/haproxy-kill Dec 15 04:23:49 localhost ovn_metadata_agent[160585]: INFO:__main__:Setting permission for /var/lib/neutron/.cache/python-entrypoints Dec 15 04:23:49 localhost ovn_metadata_agent[160585]: INFO:__main__:Setting permission for /var/lib/neutron/.cache/python-entrypoints/adac9f827fd7fb11fb07020ef60ee06a1fede4feab743856dc8fb3266181d934 Dec 15 04:23:49 localhost ovn_metadata_agent[160585]: INFO:__main__:Setting permission for /var/lib/neutron/external/pids Dec 15 04:23:49 localhost ovn_metadata_agent[160585]: INFO:__main__:Setting permission for /var/lib/neutron/external/pids/befb7a72-17a9-4bcb-b561-84b8f626685a.pid.haproxy Dec 15 04:23:49 localhost ovn_metadata_agent[160585]: INFO:__main__:Setting permission for /var/lib/neutron/ovn-metadata-proxy/befb7a72-17a9-4bcb-b561-84b8f626685a.conf Dec 15 04:23:49 localhost ovn_metadata_agent[160585]: ++ cat /run_command Dec 15 04:23:49 localhost ovn_metadata_agent[160585]: + CMD=neutron-ovn-metadata-agent Dec 15 04:23:49 localhost ovn_metadata_agent[160585]: + ARGS= Dec 15 04:23:49 localhost ovn_metadata_agent[160585]: + sudo kolla_copy_cacerts Dec 15 04:23:49 localhost ovn_metadata_agent[160585]: Running command: 'neutron-ovn-metadata-agent' Dec 15 04:23:49 localhost ovn_metadata_agent[160585]: + [[ ! -n '' ]] Dec 15 04:23:49 localhost ovn_metadata_agent[160585]: + . 
kolla_extend_start Dec 15 04:23:49 localhost podman[160594]: 2025-12-15 09:23:49.94561803 +0000 UTC m=+0.083749924 container health_status 4d46c406a3855fd1c322e5d86c048188ea5865daa07ba23aaebacc7879668923 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=starting, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '07f3af15d79276418f54a8dde2fc251064527c3571430e14af77aaa2c01f1193-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible) Dec 15 04:23:49 
localhost ovn_metadata_agent[160585]: + echo 'Running command: '\''neutron-ovn-metadata-agent'\''' Dec 15 04:23:49 localhost ovn_metadata_agent[160585]: + umask 0022 Dec 15 04:23:49 localhost ovn_metadata_agent[160585]: + exec neutron-ovn-metadata-agent Dec 15 04:23:50 localhost podman[160594]: 2025-12-15 09:23:50.03261132 +0000 UTC m=+0.170743214 container exec_died 4d46c406a3855fd1c322e5d86c048188ea5865daa07ba23aaebacc7879668923 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '07f3af15d79276418f54a8dde2fc251064527c3571430e14af77aaa2c01f1193-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, 
config_id=ovn_metadata_agent, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_metadata_agent, tcib_managed=true) Dec 15 04:23:50 localhost systemd[1]: 4d46c406a3855fd1c322e5d86c048188ea5865daa07ba23aaebacc7879668923.service: Deactivated successfully. Dec 15 04:23:50 localhost edpm-start-podman-container[160570]: Creating additional drop-in dependency for "ovn_metadata_agent" (4d46c406a3855fd1c322e5d86c048188ea5865daa07ba23aaebacc7879668923) Dec 15 04:23:50 localhost systemd[1]: Reloading. Dec 15 04:23:50 localhost systemd-sysv-generator[160662]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. Dec 15 04:23:50 localhost systemd-rc-local-generator[160659]: /etc/rc.d/rc.local is not marked executable, skipping. Dec 15 04:23:50 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Dec 15 04:23:50 localhost systemd[1]: Started ovn_metadata_agent container. 
Dec 15 04:23:50 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:fc:1f:13 MACDST=fa:16:3e:b3:bb:e3 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=60867 DF PROTO=TCP SPT=36376 DPT=9105 SEQ=1399683046 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A839507260000000001030307) Dec 15 04:23:51 localhost python3.9[160760]: ansible-ansible.builtin.slurp Invoked with src=/var/lib/edpm-config/deployed_services.yaml Dec 15 04:23:51 localhost ovn_metadata_agent[160585]: 2025-12-15 09:23:51.395 160590 INFO neutron.common.config [-] Logging enabled!#033[00m Dec 15 04:23:51 localhost ovn_metadata_agent[160585]: 2025-12-15 09:23:51.395 160590 INFO neutron.common.config [-] /usr/bin/neutron-ovn-metadata-agent version 22.2.2.dev43#033[00m Dec 15 04:23:51 localhost ovn_metadata_agent[160585]: 2025-12-15 09:23:51.396 160590 DEBUG neutron.common.config [-] command line: /usr/bin/neutron-ovn-metadata-agent setup_logging /usr/lib/python3.9/site-packages/neutron/common/config.py:123#033[00m Dec 15 04:23:51 localhost ovn_metadata_agent[160585]: 2025-12-15 09:23:51.396 160590 DEBUG neutron.agent.ovn.metadata_agent [-] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2589#033[00m Dec 15 04:23:51 localhost ovn_metadata_agent[160585]: 2025-12-15 09:23:51.396 160590 DEBUG neutron.agent.ovn.metadata_agent [-] Configuration options gathered from: log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2590#033[00m Dec 15 04:23:51 localhost ovn_metadata_agent[160585]: 2025-12-15 09:23:51.396 160590 DEBUG neutron.agent.ovn.metadata_agent [-] command line args: [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2591#033[00m Dec 15 04:23:51 localhost ovn_metadata_agent[160585]: 2025-12-15 09:23:51.396 160590 DEBUG neutron.agent.ovn.metadata_agent [-] config files: ['/etc/neutron/neutron.conf'] log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2592#033[00m Dec 15 04:23:51 localhost ovn_metadata_agent[160585]: 2025-12-15 09:23:51.396 160590 DEBUG neutron.agent.ovn.metadata_agent [-] ================================================================================ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2594#033[00m Dec 15 04:23:51 localhost ovn_metadata_agent[160585]: 2025-12-15 09:23:51.396 160590 DEBUG neutron.agent.ovn.metadata_agent [-] agent_down_time = 75 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 15 04:23:51 localhost ovn_metadata_agent[160585]: 2025-12-15 09:23:51.397 160590 DEBUG neutron.agent.ovn.metadata_agent [-] allow_bulk = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 15 04:23:51 localhost ovn_metadata_agent[160585]: 2025-12-15 09:23:51.397 160590 DEBUG neutron.agent.ovn.metadata_agent [-] api_extensions_path = log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 15 04:23:51 localhost ovn_metadata_agent[160585]: 2025-12-15 09:23:51.397 160590 DEBUG neutron.agent.ovn.metadata_agent [-] api_paste_config = api-paste.ini log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 15 04:23:51 localhost ovn_metadata_agent[160585]: 2025-12-15 09:23:51.397 160590 DEBUG neutron.agent.ovn.metadata_agent [-] api_workers = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 15 04:23:51 localhost ovn_metadata_agent[160585]: 2025-12-15 09:23:51.397 160590 DEBUG neutron.agent.ovn.metadata_agent [-] auth_ca_cert = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 15 04:23:51 localhost ovn_metadata_agent[160585]: 2025-12-15 09:23:51.397 160590 DEBUG neutron.agent.ovn.metadata_agent [-] auth_strategy = keystone log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 15 04:23:51 localhost 
ovn_metadata_agent[160585]: 2025-12-15 09:23:51.397 160590 DEBUG neutron.agent.ovn.metadata_agent [-] backlog = 4096 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 15 04:23:51 localhost ovn_metadata_agent[160585]: 2025-12-15 09:23:51.397 160590 DEBUG neutron.agent.ovn.metadata_agent [-] base_mac = fa:16:3e:00:00:00 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 15 04:23:51 localhost ovn_metadata_agent[160585]: 2025-12-15 09:23:51.397 160590 DEBUG neutron.agent.ovn.metadata_agent [-] bind_host = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 15 04:23:51 localhost ovn_metadata_agent[160585]: 2025-12-15 09:23:51.397 160590 DEBUG neutron.agent.ovn.metadata_agent [-] bind_port = 9696 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 15 04:23:51 localhost ovn_metadata_agent[160585]: 2025-12-15 09:23:51.397 160590 DEBUG neutron.agent.ovn.metadata_agent [-] client_socket_timeout = 900 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 15 04:23:51 localhost ovn_metadata_agent[160585]: 2025-12-15 09:23:51.398 160590 DEBUG neutron.agent.ovn.metadata_agent [-] config_dir = ['/etc/neutron.conf.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 15 04:23:51 localhost ovn_metadata_agent[160585]: 2025-12-15 09:23:51.398 160590 DEBUG neutron.agent.ovn.metadata_agent [-] config_file = ['/etc/neutron/neutron.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 15 04:23:51 localhost ovn_metadata_agent[160585]: 2025-12-15 09:23:51.398 160590 DEBUG neutron.agent.ovn.metadata_agent [-] config_source = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 15 04:23:51 localhost ovn_metadata_agent[160585]: 2025-12-15 09:23:51.398 160590 DEBUG neutron.agent.ovn.metadata_agent [-] control_exchange = 
neutron log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 15 04:23:51 localhost ovn_metadata_agent[160585]: 2025-12-15 09:23:51.398 160590 DEBUG neutron.agent.ovn.metadata_agent [-] core_plugin = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 15 04:23:51 localhost ovn_metadata_agent[160585]: 2025-12-15 09:23:51.398 160590 DEBUG neutron.agent.ovn.metadata_agent [-] debug = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 15 04:23:51 localhost ovn_metadata_agent[160585]: 2025-12-15 09:23:51.398 160590 DEBUG neutron.agent.ovn.metadata_agent [-] default_availability_zones = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 15 04:23:51 localhost ovn_metadata_agent[160585]: 2025-12-15 09:23:51.398 160590 DEBUG neutron.agent.ovn.metadata_agent [-] default_log_levels = ['amqp=WARN', 'amqplib=WARN', 'boto=WARN', 'qpid=WARN', 'sqlalchemy=WARN', 'suds=INFO', 'oslo.messaging=INFO', 'oslo_messaging=INFO', 'iso8601=WARN', 'requests.packages.urllib3.connectionpool=WARN', 'urllib3.connectionpool=WARN', 'websocket=WARN', 'requests.packages.urllib3.util.retry=WARN', 'urllib3.util.retry=WARN', 'keystonemiddleware=WARN', 'routes.middleware=WARN', 'stevedore=WARN', 'taskflow=WARN', 'keystoneauth=WARN', 'oslo.cache=INFO', 'oslo_policy=INFO', 'dogpile.core.dogpile=INFO', 'OFPHandler=INFO', 'OfctlService=INFO', 'os_ken.base.app_manager=INFO', 'os_ken.controller.controller=INFO'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 15 04:23:51 localhost ovn_metadata_agent[160585]: 2025-12-15 09:23:51.398 160590 DEBUG neutron.agent.ovn.metadata_agent [-] dhcp_agent_notification = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 15 04:23:51 localhost ovn_metadata_agent[160585]: 2025-12-15 09:23:51.399 160590 DEBUG neutron.agent.ovn.metadata_agent [-] dhcp_lease_duration = 
86400 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 15 04:23:51 localhost ovn_metadata_agent[160585]: 2025-12-15 09:23:51.399 160590 DEBUG neutron.agent.ovn.metadata_agent [-] dhcp_load_type = networks log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 15 04:23:51 localhost ovn_metadata_agent[160585]: 2025-12-15 09:23:51.399 160590 DEBUG neutron.agent.ovn.metadata_agent [-] dns_domain = openstacklocal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 15 04:23:51 localhost ovn_metadata_agent[160585]: 2025-12-15 09:23:51.399 160590 DEBUG neutron.agent.ovn.metadata_agent [-] enable_new_agents = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 15 04:23:51 localhost ovn_metadata_agent[160585]: 2025-12-15 09:23:51.399 160590 DEBUG neutron.agent.ovn.metadata_agent [-] enable_traditional_dhcp = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 15 04:23:51 localhost ovn_metadata_agent[160585]: 2025-12-15 09:23:51.399 160590 DEBUG neutron.agent.ovn.metadata_agent [-] external_dns_driver = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 15 04:23:51 localhost ovn_metadata_agent[160585]: 2025-12-15 09:23:51.399 160590 DEBUG neutron.agent.ovn.metadata_agent [-] external_pids = /var/lib/neutron/external/pids log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 15 04:23:51 localhost ovn_metadata_agent[160585]: 2025-12-15 09:23:51.399 160590 DEBUG neutron.agent.ovn.metadata_agent [-] filter_validation = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 15 04:23:51 localhost ovn_metadata_agent[160585]: 2025-12-15 09:23:51.399 160590 DEBUG neutron.agent.ovn.metadata_agent [-] global_physnet_mtu = 1500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 15 
04:23:51 localhost ovn_metadata_agent[160585]: 2025-12-15 09:23:51.399 160590 DEBUG neutron.agent.ovn.metadata_agent [-] host = np0005559462.localdomain log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 15 04:23:51 localhost ovn_metadata_agent[160585]: 2025-12-15 09:23:51.400 160590 DEBUG neutron.agent.ovn.metadata_agent [-] http_retries = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 15 04:23:51 localhost ovn_metadata_agent[160585]: 2025-12-15 09:23:51.400 160590 DEBUG neutron.agent.ovn.metadata_agent [-] instance_format = [instance: %(uuid)s] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 15 04:23:51 localhost ovn_metadata_agent[160585]: 2025-12-15 09:23:51.400 160590 DEBUG neutron.agent.ovn.metadata_agent [-] instance_uuid_format = [instance: %(uuid)s] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 15 04:23:51 localhost ovn_metadata_agent[160585]: 2025-12-15 09:23:51.400 160590 DEBUG neutron.agent.ovn.metadata_agent [-] ipam_driver = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 15 04:23:51 localhost ovn_metadata_agent[160585]: 2025-12-15 09:23:51.400 160590 DEBUG neutron.agent.ovn.metadata_agent [-] ipv6_pd_enabled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 15 04:23:51 localhost ovn_metadata_agent[160585]: 2025-12-15 09:23:51.400 160590 DEBUG neutron.agent.ovn.metadata_agent [-] log_config_append = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 15 04:23:51 localhost ovn_metadata_agent[160585]: 2025-12-15 09:23:51.400 160590 DEBUG neutron.agent.ovn.metadata_agent [-] log_date_format = %Y-%m-%d %H:%M:%S log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 15 04:23:51 localhost ovn_metadata_agent[160585]: 2025-12-15 09:23:51.400 160590 DEBUG 
neutron.agent.ovn.metadata_agent [-] log_dir = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 15 04:23:51 localhost ovn_metadata_agent[160585]: 2025-12-15 09:23:51.400 160590 DEBUG neutron.agent.ovn.metadata_agent [-] log_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 15 04:23:51 localhost ovn_metadata_agent[160585]: 2025-12-15 09:23:51.400 160590 DEBUG neutron.agent.ovn.metadata_agent [-] log_rotate_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 15 04:23:51 localhost ovn_metadata_agent[160585]: 2025-12-15 09:23:51.400 160590 DEBUG neutron.agent.ovn.metadata_agent [-] log_rotate_interval_type = days log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 15 04:23:51 localhost ovn_metadata_agent[160585]: 2025-12-15 09:23:51.400 160590 DEBUG neutron.agent.ovn.metadata_agent [-] log_rotation_type = none log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 15 04:23:51 localhost ovn_metadata_agent[160585]: 2025-12-15 09:23:51.401 160590 DEBUG neutron.agent.ovn.metadata_agent [-] logging_context_format_string = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [%(global_request_id)s %(request_id)s %(user_identity)s] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 15 04:23:51 localhost ovn_metadata_agent[160585]: 2025-12-15 09:23:51.401 160590 DEBUG neutron.agent.ovn.metadata_agent [-] logging_debug_format_suffix = %(funcName)s %(pathname)s:%(lineno)d log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 15 04:23:51 localhost ovn_metadata_agent[160585]: 2025-12-15 09:23:51.401 160590 DEBUG neutron.agent.ovn.metadata_agent [-] logging_default_format_string = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [-] %(instance)s%(message)s log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 15 04:23:51 localhost ovn_metadata_agent[160585]: 2025-12-15 09:23:51.401 160590 DEBUG neutron.agent.ovn.metadata_agent [-] logging_exception_prefix = %(asctime)s.%(msecs)03d %(process)d ERROR %(name)s %(instance)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 15 04:23:51 localhost ovn_metadata_agent[160585]: 2025-12-15 09:23:51.401 160590 DEBUG neutron.agent.ovn.metadata_agent [-] logging_user_identity_format = %(user)s %(project)s %(domain)s %(system_scope)s %(user_domain)s %(project_domain)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 15 04:23:51 localhost ovn_metadata_agent[160585]: 2025-12-15 09:23:51.401 160590 DEBUG neutron.agent.ovn.metadata_agent [-] max_dns_nameservers = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 15 04:23:51 localhost ovn_metadata_agent[160585]: 2025-12-15 09:23:51.401 160590 DEBUG neutron.agent.ovn.metadata_agent [-] max_header_line = 16384 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 15 04:23:51 localhost ovn_metadata_agent[160585]: 2025-12-15 09:23:51.401 160590 DEBUG neutron.agent.ovn.metadata_agent [-] max_logfile_count = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 15 04:23:51 localhost ovn_metadata_agent[160585]: 2025-12-15 09:23:51.401 160590 DEBUG neutron.agent.ovn.metadata_agent [-] max_logfile_size_mb = 200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 15 04:23:51 localhost ovn_metadata_agent[160585]: 2025-12-15 09:23:51.401 160590 DEBUG neutron.agent.ovn.metadata_agent [-] max_subnet_host_routes = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 15 04:23:51 localhost ovn_metadata_agent[160585]: 2025-12-15 09:23:51.401 160590 DEBUG neutron.agent.ovn.metadata_agent [-] metadata_backlog = 
4096 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 15 04:23:51 localhost ovn_metadata_agent[160585]: 2025-12-15 09:23:51.402 160590 DEBUG neutron.agent.ovn.metadata_agent [-] metadata_proxy_group = log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 15 04:23:51 localhost ovn_metadata_agent[160585]: 2025-12-15 09:23:51.402 160590 DEBUG neutron.agent.ovn.metadata_agent [-] metadata_proxy_shared_secret = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 15 04:23:51 localhost ovn_metadata_agent[160585]: 2025-12-15 09:23:51.402 160590 DEBUG neutron.agent.ovn.metadata_agent [-] metadata_proxy_socket = /var/lib/neutron/metadata_proxy log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 15 04:23:51 localhost ovn_metadata_agent[160585]: 2025-12-15 09:23:51.402 160590 DEBUG neutron.agent.ovn.metadata_agent [-] metadata_proxy_socket_mode = deduce log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 15 04:23:51 localhost ovn_metadata_agent[160585]: 2025-12-15 09:23:51.402 160590 DEBUG neutron.agent.ovn.metadata_agent [-] metadata_proxy_user = log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 15 04:23:51 localhost ovn_metadata_agent[160585]: 2025-12-15 09:23:51.402 160590 DEBUG neutron.agent.ovn.metadata_agent [-] metadata_workers = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 15 04:23:51 localhost ovn_metadata_agent[160585]: 2025-12-15 09:23:51.402 160590 DEBUG neutron.agent.ovn.metadata_agent [-] network_link_prefix = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 15 04:23:51 localhost ovn_metadata_agent[160585]: 2025-12-15 09:23:51.402 160590 DEBUG neutron.agent.ovn.metadata_agent [-] notify_nova_on_port_data_changes = True log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 15 04:23:51 localhost ovn_metadata_agent[160585]: 2025-12-15 09:23:51.402 160590 DEBUG neutron.agent.ovn.metadata_agent [-] notify_nova_on_port_status_changes = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 15 04:23:51 localhost ovn_metadata_agent[160585]: 2025-12-15 09:23:51.402 160590 DEBUG neutron.agent.ovn.metadata_agent [-] nova_client_cert = log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 15 04:23:51 localhost ovn_metadata_agent[160585]: 2025-12-15 09:23:51.403 160590 DEBUG neutron.agent.ovn.metadata_agent [-] nova_client_priv_key = log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 15 04:23:51 localhost ovn_metadata_agent[160585]: 2025-12-15 09:23:51.403 160590 DEBUG neutron.agent.ovn.metadata_agent [-] nova_metadata_host = nova-metadata-internal.openstack.svc log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 15 04:23:51 localhost ovn_metadata_agent[160585]: 2025-12-15 09:23:51.403 160590 DEBUG neutron.agent.ovn.metadata_agent [-] nova_metadata_insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 15 04:23:51 localhost ovn_metadata_agent[160585]: 2025-12-15 09:23:51.403 160590 DEBUG neutron.agent.ovn.metadata_agent [-] nova_metadata_port = 8775 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 15 04:23:51 localhost ovn_metadata_agent[160585]: 2025-12-15 09:23:51.403 160590 DEBUG neutron.agent.ovn.metadata_agent [-] nova_metadata_protocol = http log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 15 04:23:51 localhost ovn_metadata_agent[160585]: 2025-12-15 09:23:51.403 160590 DEBUG neutron.agent.ovn.metadata_agent [-] pagination_max_limit = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 15 04:23:51 
localhost ovn_metadata_agent[160585]: 2025-12-15 09:23:51.403 160590 DEBUG neutron.agent.ovn.metadata_agent [-] periodic_fuzzy_delay = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 15 04:23:51 localhost ovn_metadata_agent[160585]: 2025-12-15 09:23:51.403 160590 DEBUG neutron.agent.ovn.metadata_agent [-] periodic_interval = 40 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 15 04:23:51 localhost ovn_metadata_agent[160585]: 2025-12-15 09:23:51.403 160590 DEBUG neutron.agent.ovn.metadata_agent [-] publish_errors = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 15 04:23:51 localhost ovn_metadata_agent[160585]: 2025-12-15 09:23:51.403 160590 DEBUG neutron.agent.ovn.metadata_agent [-] rate_limit_burst = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 15 04:23:51 localhost ovn_metadata_agent[160585]: 2025-12-15 09:23:51.404 160590 DEBUG neutron.agent.ovn.metadata_agent [-] rate_limit_except_level = CRITICAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 15 04:23:51 localhost ovn_metadata_agent[160585]: 2025-12-15 09:23:51.404 160590 DEBUG neutron.agent.ovn.metadata_agent [-] rate_limit_interval = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 15 04:23:51 localhost ovn_metadata_agent[160585]: 2025-12-15 09:23:51.404 160590 DEBUG neutron.agent.ovn.metadata_agent [-] retry_until_window = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 15 04:23:51 localhost ovn_metadata_agent[160585]: 2025-12-15 09:23:51.404 160590 DEBUG neutron.agent.ovn.metadata_agent [-] rpc_resources_processing_step = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 15 04:23:51 localhost ovn_metadata_agent[160585]: 2025-12-15 09:23:51.404 160590 DEBUG neutron.agent.ovn.metadata_agent [-] 
rpc_response_max_timeout = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 15 04:23:51 localhost ovn_metadata_agent[160585]: 2025-12-15 09:23:51.404 160590 DEBUG neutron.agent.ovn.metadata_agent [-] rpc_state_report_workers = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 15 04:23:51 localhost ovn_metadata_agent[160585]: 2025-12-15 09:23:51.404 160590 DEBUG neutron.agent.ovn.metadata_agent [-] rpc_workers = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 15 04:23:51 localhost ovn_metadata_agent[160585]: 2025-12-15 09:23:51.404 160590 DEBUG neutron.agent.ovn.metadata_agent [-] send_events_interval = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 15 04:23:51 localhost ovn_metadata_agent[160585]: 2025-12-15 09:23:51.404 160590 DEBUG neutron.agent.ovn.metadata_agent [-] service_plugins = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 15 04:23:51 localhost ovn_metadata_agent[160585]: 2025-12-15 09:23:51.404 160590 DEBUG neutron.agent.ovn.metadata_agent [-] setproctitle = on log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 15 04:23:51 localhost ovn_metadata_agent[160585]: 2025-12-15 09:23:51.404 160590 DEBUG neutron.agent.ovn.metadata_agent [-] state_path = /var/lib/neutron log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 15 04:23:51 localhost ovn_metadata_agent[160585]: 2025-12-15 09:23:51.405 160590 DEBUG neutron.agent.ovn.metadata_agent [-] syslog_log_facility = syslog log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 15 04:23:51 localhost ovn_metadata_agent[160585]: 2025-12-15 09:23:51.405 160590 DEBUG neutron.agent.ovn.metadata_agent [-] tcp_keepidle = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 15 04:23:51 localhost 
ovn_metadata_agent[160585]: 2025-12-15 09:23:51.405 160590 DEBUG neutron.agent.ovn.metadata_agent [-] transport_url = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 15 04:23:51 localhost ovn_metadata_agent[160585]: 2025-12-15 09:23:51.405 160590 DEBUG neutron.agent.ovn.metadata_agent [-] use_eventlog = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 15 04:23:51 localhost ovn_metadata_agent[160585]: 2025-12-15 09:23:51.405 160590 DEBUG neutron.agent.ovn.metadata_agent [-] use_journal = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 15 04:23:51 localhost ovn_metadata_agent[160585]: 2025-12-15 09:23:51.405 160590 DEBUG neutron.agent.ovn.metadata_agent [-] use_json = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 15 04:23:51 localhost ovn_metadata_agent[160585]: 2025-12-15 09:23:51.405 160590 DEBUG neutron.agent.ovn.metadata_agent [-] use_ssl = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 15 04:23:51 localhost ovn_metadata_agent[160585]: 2025-12-15 09:23:51.405 160590 DEBUG neutron.agent.ovn.metadata_agent [-] use_stderr = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 15 04:23:51 localhost ovn_metadata_agent[160585]: 2025-12-15 09:23:51.405 160590 DEBUG neutron.agent.ovn.metadata_agent [-] use_syslog = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 15 04:23:51 localhost ovn_metadata_agent[160585]: 2025-12-15 09:23:51.405 160590 DEBUG neutron.agent.ovn.metadata_agent [-] vlan_transparent = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 15 04:23:51 localhost ovn_metadata_agent[160585]: 2025-12-15 09:23:51.405 160590 DEBUG neutron.agent.ovn.metadata_agent [-] watch_log_file = False log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 15 04:23:51 localhost ovn_metadata_agent[160585]: 2025-12-15 09:23:51.405 160590 DEBUG neutron.agent.ovn.metadata_agent [-] wsgi_default_pool_size = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 15 04:23:51 localhost ovn_metadata_agent[160585]: 2025-12-15 09:23:51.406 160590 DEBUG neutron.agent.ovn.metadata_agent [-] wsgi_keep_alive = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 15 04:23:51 localhost ovn_metadata_agent[160585]: 2025-12-15 09:23:51.406 160590 DEBUG neutron.agent.ovn.metadata_agent [-] wsgi_log_format = %(client_ip)s "%(request_line)s" status: %(status_code)s len: %(body_length)s time: %(wall_seconds).7f log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 15 04:23:51 localhost ovn_metadata_agent[160585]: 2025-12-15 09:23:51.406 160590 DEBUG neutron.agent.ovn.metadata_agent [-] wsgi_server_debug = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 15 04:23:51 localhost ovn_metadata_agent[160585]: 2025-12-15 09:23:51.406 160590 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_concurrency.disable_process_locking = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:23:51 localhost ovn_metadata_agent[160585]: 2025-12-15 09:23:51.406 160590 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_concurrency.lock_path = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:23:51 localhost ovn_metadata_agent[160585]: 2025-12-15 09:23:51.406 160590 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.connection_string = messaging:// log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:23:51 localhost ovn_metadata_agent[160585]: 2025-12-15 09:23:51.406 160590 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.enabled = False 
log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:23:51 localhost ovn_metadata_agent[160585]: 2025-12-15 09:23:51.406 160590 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.es_doc_type = notification log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:23:51 localhost ovn_metadata_agent[160585]: 2025-12-15 09:23:51.406 160590 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.es_scroll_size = 10000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:23:51 localhost ovn_metadata_agent[160585]: 2025-12-15 09:23:51.406 160590 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.es_scroll_time = 2m log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:23:51 localhost ovn_metadata_agent[160585]: 2025-12-15 09:23:51.407 160590 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.filter_error_trace = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:23:51 localhost ovn_metadata_agent[160585]: 2025-12-15 09:23:51.407 160590 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.hmac_keys = SECRET_KEY log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:23:51 localhost ovn_metadata_agent[160585]: 2025-12-15 09:23:51.407 160590 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.sentinel_service_name = mymaster log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:23:51 localhost ovn_metadata_agent[160585]: 2025-12-15 09:23:51.407 160590 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.socket_timeout = 0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:23:51 localhost ovn_metadata_agent[160585]: 2025-12-15 09:23:51.407 160590 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.trace_sqlalchemy = False log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:23:51 localhost ovn_metadata_agent[160585]: 2025-12-15 09:23:51.407 160590 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.enforce_new_defaults = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:23:51 localhost ovn_metadata_agent[160585]: 2025-12-15 09:23:51.407 160590 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.enforce_scope = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:23:51 localhost ovn_metadata_agent[160585]: 2025-12-15 09:23:51.407 160590 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.policy_default_rule = default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:23:51 localhost ovn_metadata_agent[160585]: 2025-12-15 09:23:51.407 160590 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.policy_dirs = ['policy.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:23:51 localhost ovn_metadata_agent[160585]: 2025-12-15 09:23:51.407 160590 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.policy_file = policy.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:23:51 localhost ovn_metadata_agent[160585]: 2025-12-15 09:23:51.408 160590 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.remote_content_type = application/x-www-form-urlencoded log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:23:51 localhost ovn_metadata_agent[160585]: 2025-12-15 09:23:51.408 160590 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.remote_ssl_ca_crt_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:23:51 localhost ovn_metadata_agent[160585]: 2025-12-15 09:23:51.408 160590 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.remote_ssl_client_crt_file = None 
log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:23:51 localhost ovn_metadata_agent[160585]: 2025-12-15 09:23:51.408 160590 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.remote_ssl_client_key_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:23:51 localhost ovn_metadata_agent[160585]: 2025-12-15 09:23:51.408 160590 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.remote_ssl_verify_server_crt = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:23:51 localhost ovn_metadata_agent[160585]: 2025-12-15 09:23:51.408 160590 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_metrics.metrics_buffer_size = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:23:51 localhost ovn_metadata_agent[160585]: 2025-12-15 09:23:51.408 160590 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_metrics.metrics_enabled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:23:51 localhost ovn_metadata_agent[160585]: 2025-12-15 09:23:51.408 160590 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_metrics.metrics_process_name = log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:23:51 localhost ovn_metadata_agent[160585]: 2025-12-15 09:23:51.408 160590 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_metrics.metrics_socket_file = /var/tmp/metrics_collector.sock log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:23:51 localhost ovn_metadata_agent[160585]: 2025-12-15 09:23:51.408 160590 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_metrics.metrics_thread_stop_timeout = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:23:51 localhost ovn_metadata_agent[160585]: 2025-12-15 09:23:51.409 160590 DEBUG 
neutron.agent.ovn.metadata_agent [-] oslo_middleware.http_basic_auth_user_file = /etc/htpasswd log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:23:51 localhost ovn_metadata_agent[160585]: 2025-12-15 09:23:51.409 160590 DEBUG neutron.agent.ovn.metadata_agent [-] service_providers.service_provider = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:23:51 localhost ovn_metadata_agent[160585]: 2025-12-15 09:23:51.409 160590 DEBUG neutron.agent.ovn.metadata_agent [-] privsep.capabilities = [21, 12, 1, 2, 19] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:23:51 localhost ovn_metadata_agent[160585]: 2025-12-15 09:23:51.409 160590 DEBUG neutron.agent.ovn.metadata_agent [-] privsep.group = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:23:51 localhost ovn_metadata_agent[160585]: 2025-12-15 09:23:51.409 160590 DEBUG neutron.agent.ovn.metadata_agent [-] privsep.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:23:51 localhost ovn_metadata_agent[160585]: 2025-12-15 09:23:51.409 160590 DEBUG neutron.agent.ovn.metadata_agent [-] privsep.logger_name = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:23:51 localhost ovn_metadata_agent[160585]: 2025-12-15 09:23:51.409 160590 DEBUG neutron.agent.ovn.metadata_agent [-] privsep.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:23:51 localhost ovn_metadata_agent[160585]: 2025-12-15 09:23:51.409 160590 DEBUG neutron.agent.ovn.metadata_agent [-] privsep.user = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:23:51 localhost ovn_metadata_agent[160585]: 2025-12-15 09:23:51.409 160590 DEBUG neutron.agent.ovn.metadata_agent [-] 
privsep_dhcp_release.capabilities = [21, 12] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:23:51 localhost ovn_metadata_agent[160585]: 2025-12-15 09:23:51.409 160590 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_dhcp_release.group = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:23:51 localhost ovn_metadata_agent[160585]: 2025-12-15 09:23:51.410 160590 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_dhcp_release.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:23:51 localhost ovn_metadata_agent[160585]: 2025-12-15 09:23:51.410 160590 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_dhcp_release.logger_name = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:23:51 localhost ovn_metadata_agent[160585]: 2025-12-15 09:23:51.410 160590 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_dhcp_release.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:23:51 localhost ovn_metadata_agent[160585]: 2025-12-15 09:23:51.410 160590 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_dhcp_release.user = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:23:51 localhost ovn_metadata_agent[160585]: 2025-12-15 09:23:51.410 160590 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_ovs_vsctl.capabilities = [21, 12] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:23:51 localhost ovn_metadata_agent[160585]: 2025-12-15 09:23:51.410 160590 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_ovs_vsctl.group = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:23:51 localhost ovn_metadata_agent[160585]: 2025-12-15 09:23:51.410 160590 DEBUG neutron.agent.ovn.metadata_agent [-] 
privsep_ovs_vsctl.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:23:51 localhost ovn_metadata_agent[160585]: 2025-12-15 09:23:51.410 160590 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_ovs_vsctl.logger_name = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:23:51 localhost ovn_metadata_agent[160585]: 2025-12-15 09:23:51.410 160590 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_ovs_vsctl.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:23:51 localhost ovn_metadata_agent[160585]: 2025-12-15 09:23:51.410 160590 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_ovs_vsctl.user = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:23:51 localhost ovn_metadata_agent[160585]: 2025-12-15 09:23:51.411 160590 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_namespace.capabilities = [21] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:23:51 localhost ovn_metadata_agent[160585]: 2025-12-15 09:23:51.411 160590 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_namespace.group = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:23:51 localhost ovn_metadata_agent[160585]: 2025-12-15 09:23:51.411 160590 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_namespace.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:23:51 localhost ovn_metadata_agent[160585]: 2025-12-15 09:23:51.411 160590 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_namespace.logger_name = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:23:51 localhost ovn_metadata_agent[160585]: 2025-12-15 09:23:51.411 160590 DEBUG neutron.agent.ovn.metadata_agent [-] 
privsep_namespace.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:23:51 localhost ovn_metadata_agent[160585]: 2025-12-15 09:23:51.411 160590 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_namespace.user = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:23:51 localhost ovn_metadata_agent[160585]: 2025-12-15 09:23:51.411 160590 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_conntrack.capabilities = [12] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:23:51 localhost ovn_metadata_agent[160585]: 2025-12-15 09:23:51.411 160590 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_conntrack.group = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:23:51 localhost ovn_metadata_agent[160585]: 2025-12-15 09:23:51.411 160590 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_conntrack.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:23:51 localhost ovn_metadata_agent[160585]: 2025-12-15 09:23:51.411 160590 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_conntrack.logger_name = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:23:51 localhost ovn_metadata_agent[160585]: 2025-12-15 09:23:51.411 160590 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_conntrack.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:23:51 localhost ovn_metadata_agent[160585]: 2025-12-15 09:23:51.412 160590 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_conntrack.user = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:23:51 localhost ovn_metadata_agent[160585]: 2025-12-15 09:23:51.412 160590 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_link.capabilities = [12, 21] 
log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:23:51 localhost ovn_metadata_agent[160585]: 2025-12-15 09:23:51.412 160590 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_link.group = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:23:51 localhost ovn_metadata_agent[160585]: 2025-12-15 09:23:51.412 160590 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_link.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:23:51 localhost ovn_metadata_agent[160585]: 2025-12-15 09:23:51.412 160590 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_link.logger_name = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:23:51 localhost ovn_metadata_agent[160585]: 2025-12-15 09:23:51.412 160590 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_link.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:23:51 localhost ovn_metadata_agent[160585]: 2025-12-15 09:23:51.412 160590 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_link.user = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:23:51 localhost ovn_metadata_agent[160585]: 2025-12-15 09:23:51.412 160590 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.check_child_processes_action = respawn log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:23:51 localhost ovn_metadata_agent[160585]: 2025-12-15 09:23:51.412 160590 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.check_child_processes_interval = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:23:51 localhost ovn_metadata_agent[160585]: 2025-12-15 09:23:51.412 160590 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.comment_iptables_rules = True log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:23:51 localhost ovn_metadata_agent[160585]: 2025-12-15 09:23:51.413 160590 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.debug_iptables_rules = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:23:51 localhost ovn_metadata_agent[160585]: 2025-12-15 09:23:51.413 160590 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.kill_scripts_path = /etc/neutron/kill_scripts/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:23:51 localhost ovn_metadata_agent[160585]: 2025-12-15 09:23:51.413 160590 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.root_helper = sudo neutron-rootwrap /etc/neutron/rootwrap.conf log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:23:51 localhost ovn_metadata_agent[160585]: 2025-12-15 09:23:51.413 160590 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.root_helper_daemon = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:23:51 localhost ovn_metadata_agent[160585]: 2025-12-15 09:23:51.413 160590 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.use_helper_for_ns_read = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:23:51 localhost ovn_metadata_agent[160585]: 2025-12-15 09:23:51.413 160590 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.use_random_fully = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:23:51 localhost ovn_metadata_agent[160585]: 2025-12-15 09:23:51.413 160590 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_versionedobjects.fatal_exception_format_errors = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:23:51 localhost ovn_metadata_agent[160585]: 2025-12-15 09:23:51.413 160590 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.default_quota = -1 
log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:23:51 localhost ovn_metadata_agent[160585]: 2025-12-15 09:23:51.413 160590 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.quota_driver = neutron.db.quota.driver_nolock.DbQuotaNoLockDriver log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:23:51 localhost ovn_metadata_agent[160585]: 2025-12-15 09:23:51.414 160590 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.quota_network = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:23:51 localhost ovn_metadata_agent[160585]: 2025-12-15 09:23:51.414 160590 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.quota_port = 500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:23:51 localhost ovn_metadata_agent[160585]: 2025-12-15 09:23:51.414 160590 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.quota_security_group = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:23:51 localhost ovn_metadata_agent[160585]: 2025-12-15 09:23:51.414 160590 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.quota_security_group_rule = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:23:51 localhost ovn_metadata_agent[160585]: 2025-12-15 09:23:51.414 160590 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.quota_subnet = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:23:51 localhost ovn_metadata_agent[160585]: 2025-12-15 09:23:51.414 160590 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.track_quota_usage = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:23:51 localhost ovn_metadata_agent[160585]: 2025-12-15 09:23:51.414 160590 DEBUG neutron.agent.ovn.metadata_agent [-] nova.auth_section = None log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:23:51 localhost ovn_metadata_agent[160585]: 2025-12-15 09:23:51.414 160590 DEBUG neutron.agent.ovn.metadata_agent [-] nova.auth_type = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:23:51 localhost ovn_metadata_agent[160585]: 2025-12-15 09:23:51.414 160590 DEBUG neutron.agent.ovn.metadata_agent [-] nova.cafile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:23:51 localhost ovn_metadata_agent[160585]: 2025-12-15 09:23:51.414 160590 DEBUG neutron.agent.ovn.metadata_agent [-] nova.certfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:23:51 localhost ovn_metadata_agent[160585]: 2025-12-15 09:23:51.414 160590 DEBUG neutron.agent.ovn.metadata_agent [-] nova.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:23:51 localhost ovn_metadata_agent[160585]: 2025-12-15 09:23:51.415 160590 DEBUG neutron.agent.ovn.metadata_agent [-] nova.endpoint_type = public log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:23:51 localhost ovn_metadata_agent[160585]: 2025-12-15 09:23:51.415 160590 DEBUG neutron.agent.ovn.metadata_agent [-] nova.insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:23:51 localhost ovn_metadata_agent[160585]: 2025-12-15 09:23:51.415 160590 DEBUG neutron.agent.ovn.metadata_agent [-] nova.keyfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:23:51 localhost ovn_metadata_agent[160585]: 2025-12-15 09:23:51.415 160590 DEBUG neutron.agent.ovn.metadata_agent [-] nova.region_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:23:51 localhost ovn_metadata_agent[160585]: 2025-12-15 09:23:51.415 160590 
DEBUG neutron.agent.ovn.metadata_agent [-] nova.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:23:51 localhost ovn_metadata_agent[160585]: 2025-12-15 09:23:51.415 160590 DEBUG neutron.agent.ovn.metadata_agent [-] nova.timeout = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:23:51 localhost ovn_metadata_agent[160585]: 2025-12-15 09:23:51.415 160590 DEBUG neutron.agent.ovn.metadata_agent [-] placement.auth_section = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:23:51 localhost ovn_metadata_agent[160585]: 2025-12-15 09:23:51.415 160590 DEBUG neutron.agent.ovn.metadata_agent [-] placement.auth_type = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:23:51 localhost ovn_metadata_agent[160585]: 2025-12-15 09:23:51.415 160590 DEBUG neutron.agent.ovn.metadata_agent [-] placement.cafile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:23:51 localhost ovn_metadata_agent[160585]: 2025-12-15 09:23:51.415 160590 DEBUG neutron.agent.ovn.metadata_agent [-] placement.certfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:23:51 localhost ovn_metadata_agent[160585]: 2025-12-15 09:23:51.416 160590 DEBUG neutron.agent.ovn.metadata_agent [-] placement.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:23:51 localhost ovn_metadata_agent[160585]: 2025-12-15 09:23:51.416 160590 DEBUG neutron.agent.ovn.metadata_agent [-] placement.endpoint_type = public log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:23:51 localhost ovn_metadata_agent[160585]: 2025-12-15 09:23:51.416 160590 DEBUG neutron.agent.ovn.metadata_agent [-] placement.insecure = False log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:23:51 localhost ovn_metadata_agent[160585]: 2025-12-15 09:23:51.416 160590 DEBUG neutron.agent.ovn.metadata_agent [-] placement.keyfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:23:51 localhost ovn_metadata_agent[160585]: 2025-12-15 09:23:51.416 160590 DEBUG neutron.agent.ovn.metadata_agent [-] placement.region_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:23:51 localhost ovn_metadata_agent[160585]: 2025-12-15 09:23:51.416 160590 DEBUG neutron.agent.ovn.metadata_agent [-] placement.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:23:51 localhost ovn_metadata_agent[160585]: 2025-12-15 09:23:51.416 160590 DEBUG neutron.agent.ovn.metadata_agent [-] placement.timeout = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:23:51 localhost ovn_metadata_agent[160585]: 2025-12-15 09:23:51.416 160590 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.auth_section = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:23:51 localhost ovn_metadata_agent[160585]: 2025-12-15 09:23:51.416 160590 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.auth_type = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:23:51 localhost ovn_metadata_agent[160585]: 2025-12-15 09:23:51.416 160590 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.cafile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:23:51 localhost ovn_metadata_agent[160585]: 2025-12-15 09:23:51.417 160590 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.certfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:23:51 localhost ovn_metadata_agent[160585]: 2025-12-15 
09:23:51.417 160590 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:23:51 localhost ovn_metadata_agent[160585]: 2025-12-15 09:23:51.417 160590 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.connect_retries = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:23:51 localhost ovn_metadata_agent[160585]: 2025-12-15 09:23:51.417 160590 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.connect_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:23:51 localhost ovn_metadata_agent[160585]: 2025-12-15 09:23:51.417 160590 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.enable_notifications = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:23:51 localhost ovn_metadata_agent[160585]: 2025-12-15 09:23:51.417 160590 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.endpoint_override = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:23:51 localhost ovn_metadata_agent[160585]: 2025-12-15 09:23:51.417 160590 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:23:51 localhost ovn_metadata_agent[160585]: 2025-12-15 09:23:51.417 160590 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.interface = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:23:51 localhost ovn_metadata_agent[160585]: 2025-12-15 09:23:51.417 160590 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.keyfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:23:51 localhost ovn_metadata_agent[160585]: 2025-12-15 09:23:51.417 160590 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.max_version = None log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:23:51 localhost ovn_metadata_agent[160585]: 2025-12-15 09:23:51.417 160590 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.min_version = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:23:51 localhost ovn_metadata_agent[160585]: 2025-12-15 09:23:51.418 160590 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.region_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:23:51 localhost ovn_metadata_agent[160585]: 2025-12-15 09:23:51.418 160590 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.service_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:23:51 localhost ovn_metadata_agent[160585]: 2025-12-15 09:23:51.418 160590 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.service_type = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:23:51 localhost ovn_metadata_agent[160585]: 2025-12-15 09:23:51.418 160590 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:23:51 localhost ovn_metadata_agent[160585]: 2025-12-15 09:23:51.418 160590 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.status_code_retries = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:23:51 localhost ovn_metadata_agent[160585]: 2025-12-15 09:23:51.418 160590 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:23:51 localhost ovn_metadata_agent[160585]: 2025-12-15 09:23:51.418 160590 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.timeout = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:23:51 localhost 
ovn_metadata_agent[160585]: 2025-12-15 09:23:51.418 160590 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.valid_interfaces = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:23:51 localhost ovn_metadata_agent[160585]: 2025-12-15 09:23:51.418 160590 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.version = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:23:51 localhost ovn_metadata_agent[160585]: 2025-12-15 09:23:51.418 160590 DEBUG neutron.agent.ovn.metadata_agent [-] cli_script.dry_run = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:23:51 localhost ovn_metadata_agent[160585]: 2025-12-15 09:23:51.419 160590 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.allow_stateless_action_supported = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:23:51 localhost ovn_metadata_agent[160585]: 2025-12-15 09:23:51.419 160590 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.dhcp_default_lease_time = 43200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:23:51 localhost ovn_metadata_agent[160585]: 2025-12-15 09:23:51.419 160590 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.disable_ovn_dhcp_for_baremetal_ports = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:23:51 localhost ovn_metadata_agent[160585]: 2025-12-15 09:23:51.419 160590 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.dns_servers = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:23:51 localhost ovn_metadata_agent[160585]: 2025-12-15 09:23:51.419 160590 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.enable_distributed_floating_ip = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:23:51 localhost ovn_metadata_agent[160585]: 2025-12-15 09:23:51.419 160590 
DEBUG neutron.agent.ovn.metadata_agent [-] ovn.neutron_sync_mode = log log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:23:51 localhost ovn_metadata_agent[160585]: 2025-12-15 09:23:51.419 160590 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_dhcp4_global_options = {} log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:23:51 localhost ovn_metadata_agent[160585]: 2025-12-15 09:23:51.419 160590 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_dhcp6_global_options = {} log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:23:51 localhost ovn_metadata_agent[160585]: 2025-12-15 09:23:51.419 160590 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_emit_need_to_frag = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:23:51 localhost ovn_metadata_agent[160585]: 2025-12-15 09:23:51.419 160590 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_l3_mode = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:23:51 localhost ovn_metadata_agent[160585]: 2025-12-15 09:23:51.420 160590 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_l3_scheduler = leastloaded log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:23:51 localhost ovn_metadata_agent[160585]: 2025-12-15 09:23:51.420 160590 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_metadata_enabled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:23:51 localhost ovn_metadata_agent[160585]: 2025-12-15 09:23:51.420 160590 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_nb_ca_cert = log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:23:51 localhost ovn_metadata_agent[160585]: 2025-12-15 09:23:51.420 160590 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_nb_certificate = log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:23:51 localhost ovn_metadata_agent[160585]: 2025-12-15 09:23:51.420 160590 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_nb_connection = tcp:127.0.0.1:6641 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:23:51 localhost ovn_metadata_agent[160585]: 2025-12-15 09:23:51.420 160590 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_nb_private_key = log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:23:51 localhost ovn_metadata_agent[160585]: 2025-12-15 09:23:51.420 160590 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_sb_ca_cert = log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:23:51 localhost ovn_metadata_agent[160585]: 2025-12-15 09:23:51.420 160590 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_sb_certificate = log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:23:51 localhost ovn_metadata_agent[160585]: 2025-12-15 09:23:51.420 160590 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_sb_connection = tcp:ovsdbserver-sb.openstack.svc:6642 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:23:51 localhost ovn_metadata_agent[160585]: 2025-12-15 09:23:51.420 160590 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_sb_private_key = log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:23:51 localhost ovn_metadata_agent[160585]: 2025-12-15 09:23:51.421 160590 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovsdb_connection_timeout = 180 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:23:51 localhost ovn_metadata_agent[160585]: 2025-12-15 09:23:51.421 160590 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovsdb_log_level = INFO log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 
04:23:51 localhost ovn_metadata_agent[160585]: 2025-12-15 09:23:51.421 160590 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovsdb_probe_interval = 60000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:23:51 localhost ovn_metadata_agent[160585]: 2025-12-15 09:23:51.421 160590 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovsdb_retry_max_interval = 180 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:23:51 localhost ovn_metadata_agent[160585]: 2025-12-15 09:23:51.421 160590 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.vhost_sock_dir = /var/run/openvswitch log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:23:51 localhost ovn_metadata_agent[160585]: 2025-12-15 09:23:51.421 160590 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.vif_type = ovs log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:23:51 localhost ovn_metadata_agent[160585]: 2025-12-15 09:23:51.421 160590 DEBUG neutron.agent.ovn.metadata_agent [-] OVS.bridge_mac_table_size = 50000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:23:51 localhost ovn_metadata_agent[160585]: 2025-12-15 09:23:51.421 160590 DEBUG neutron.agent.ovn.metadata_agent [-] OVS.igmp_snooping_enable = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:23:51 localhost ovn_metadata_agent[160585]: 2025-12-15 09:23:51.421 160590 DEBUG neutron.agent.ovn.metadata_agent [-] OVS.ovsdb_timeout = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:23:51 localhost ovn_metadata_agent[160585]: 2025-12-15 09:23:51.421 160590 DEBUG neutron.agent.ovn.metadata_agent [-] ovs.ovsdb_connection = tcp:127.0.0.1:6640 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:23:51 localhost ovn_metadata_agent[160585]: 2025-12-15 09:23:51.422 
160590 DEBUG neutron.agent.ovn.metadata_agent [-] ovs.ovsdb_connection_timeout = 180 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:23:51 localhost ovn_metadata_agent[160585]: 2025-12-15 09:23:51.422 160590 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.amqp_auto_delete = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:23:51 localhost ovn_metadata_agent[160585]: 2025-12-15 09:23:51.422 160590 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.amqp_durable_queues = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:23:51 localhost ovn_metadata_agent[160585]: 2025-12-15 09:23:51.422 160590 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.conn_pool_min_size = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:23:51 localhost ovn_metadata_agent[160585]: 2025-12-15 09:23:51.422 160590 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.conn_pool_ttl = 1200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:23:51 localhost ovn_metadata_agent[160585]: 2025-12-15 09:23:51.422 160590 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.direct_mandatory_flag = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:23:51 localhost ovn_metadata_agent[160585]: 2025-12-15 09:23:51.422 160590 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.enable_cancel_on_failover = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:23:51 localhost ovn_metadata_agent[160585]: 2025-12-15 09:23:51.422 160590 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.heartbeat_in_pthread = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:23:51 localhost 
ovn_metadata_agent[160585]: 2025-12-15 09:23:51.422 160590 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.heartbeat_rate = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:23:51 localhost ovn_metadata_agent[160585]: 2025-12-15 09:23:51.422 160590 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.heartbeat_timeout_threshold = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:23:51 localhost ovn_metadata_agent[160585]: 2025-12-15 09:23:51.423 160590 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.kombu_compression = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:23:51 localhost ovn_metadata_agent[160585]: 2025-12-15 09:23:51.423 160590 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.kombu_failover_strategy = round-robin log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:23:51 localhost ovn_metadata_agent[160585]: 2025-12-15 09:23:51.423 160590 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.kombu_missing_consumer_retry_timeout = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:23:51 localhost ovn_metadata_agent[160585]: 2025-12-15 09:23:51.423 160590 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.kombu_reconnect_delay = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:23:51 localhost ovn_metadata_agent[160585]: 2025-12-15 09:23:51.423 160590 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_ha_queues = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:23:51 localhost ovn_metadata_agent[160585]: 2025-12-15 09:23:51.423 160590 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_interval_max = 30 log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:23:51 localhost ovn_metadata_agent[160585]: 2025-12-15 09:23:51.423 160590 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_login_method = AMQPLAIN log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:23:51 localhost ovn_metadata_agent[160585]: 2025-12-15 09:23:51.423 160590 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_qos_prefetch_count = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:23:51 localhost ovn_metadata_agent[160585]: 2025-12-15 09:23:51.423 160590 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_quorum_delivery_limit = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:23:51 localhost ovn_metadata_agent[160585]: 2025-12-15 09:23:51.423 160590 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_quorum_max_memory_bytes = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:23:51 localhost ovn_metadata_agent[160585]: 2025-12-15 09:23:51.423 160590 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_quorum_max_memory_length = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:23:51 localhost ovn_metadata_agent[160585]: 2025-12-15 09:23:51.424 160590 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_quorum_queue = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:23:51 localhost ovn_metadata_agent[160585]: 2025-12-15 09:23:51.424 160590 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_retry_backoff = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:23:51 localhost ovn_metadata_agent[160585]: 2025-12-15 09:23:51.424 160590 DEBUG 
neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_retry_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:23:51 localhost ovn_metadata_agent[160585]: 2025-12-15 09:23:51.424 160590 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_transient_queues_ttl = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:23:51 localhost ovn_metadata_agent[160585]: 2025-12-15 09:23:51.424 160590 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rpc_conn_pool_size = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:23:51 localhost ovn_metadata_agent[160585]: 2025-12-15 09:23:51.424 160590 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.ssl = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:23:51 localhost ovn_metadata_agent[160585]: 2025-12-15 09:23:51.424 160590 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.ssl_ca_file = log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:23:51 localhost ovn_metadata_agent[160585]: 2025-12-15 09:23:51.424 160590 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.ssl_cert_file = log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:23:51 localhost ovn_metadata_agent[160585]: 2025-12-15 09:23:51.424 160590 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.ssl_enforce_fips_mode = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:23:51 localhost ovn_metadata_agent[160585]: 2025-12-15 09:23:51.424 160590 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.ssl_key_file = log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:23:51 localhost ovn_metadata_agent[160585]: 2025-12-15 09:23:51.425 
160590 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.ssl_version = log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:23:51 localhost ovn_metadata_agent[160585]: 2025-12-15 09:23:51.425 160590 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_notifications.driver = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:23:51 localhost ovn_metadata_agent[160585]: 2025-12-15 09:23:51.425 160590 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_notifications.retry = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:23:51 localhost ovn_metadata_agent[160585]: 2025-12-15 09:23:51.425 160590 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_notifications.topics = ['notifications'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:23:51 localhost ovn_metadata_agent[160585]: 2025-12-15 09:23:51.425 160590 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_notifications.transport_url = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:23:51 localhost ovn_metadata_agent[160585]: 2025-12-15 09:23:51.425 160590 DEBUG neutron.agent.ovn.metadata_agent [-] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613#033[00m Dec 15 04:23:51 localhost ovn_metadata_agent[160585]: 2025-12-15 09:23:51.473 160590 DEBUG ovsdbapp.backend.ovs_idl [-] Created schema index Bridge.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106#033[00m Dec 15 04:23:51 localhost ovn_metadata_agent[160585]: 2025-12-15 09:23:51.474 160590 DEBUG ovsdbapp.backend.ovs_idl [-] Created schema index Port.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106#033[00m Dec 15 04:23:51 
localhost ovn_metadata_agent[160585]: 2025-12-15 09:23:51.474 160590 DEBUG ovsdbapp.backend.ovs_idl [-] Created schema index Interface.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106#033[00m Dec 15 04:23:51 localhost ovn_metadata_agent[160585]: 2025-12-15 09:23:51.474 160590 INFO ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: connecting...#033[00m Dec 15 04:23:51 localhost ovn_metadata_agent[160585]: 2025-12-15 09:23:51.474 160590 INFO ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: connected#033[00m Dec 15 04:23:51 localhost ovn_metadata_agent[160585]: 2025-12-15 09:23:51.490 160590 DEBUG neutron.agent.ovn.metadata.agent [-] Loaded chassis name 12d96d64-e862-4f68-81e5-8d9ec5d3a5e2 (UUID: 12d96d64-e862-4f68-81e5-8d9ec5d3a5e2) and ovn bridge br-int. _load_config /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:309#033[00m Dec 15 04:23:51 localhost ovn_metadata_agent[160585]: 2025-12-15 09:23:51.513 160590 INFO neutron.agent.ovn.metadata.ovsdb [-] Getting OvsdbSbOvnIdl for MetadataAgent with retry#033[00m Dec 15 04:23:51 localhost ovn_metadata_agent[160585]: 2025-12-15 09:23:51.513 160590 DEBUG ovsdbapp.backend.ovs_idl [-] Created lookup_table index Chassis.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:87#033[00m Dec 15 04:23:51 localhost ovn_metadata_agent[160585]: 2025-12-15 09:23:51.513 160590 DEBUG ovsdbapp.backend.ovs_idl [-] Created schema index Datapath_Binding.tunnel_key autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106#033[00m Dec 15 04:23:51 localhost ovn_metadata_agent[160585]: 2025-12-15 09:23:51.514 160590 DEBUG ovsdbapp.backend.ovs_idl [-] Created schema index Chassis_Private.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106#033[00m Dec 15 04:23:51 localhost ovn_metadata_agent[160585]: 2025-12-15 09:23:51.516 160590 INFO 
ovsdbapp.backend.ovs_idl.vlog [-] tcp:ovsdbserver-sb.openstack.svc:6642: connecting...#033[00m Dec 15 04:23:51 localhost ovn_metadata_agent[160585]: 2025-12-15 09:23:51.517 160590 INFO ovsdbapp.backend.ovs_idl.vlog [-] tcp:ovsdbserver-sb.openstack.svc:6642: connected#033[00m Dec 15 04:23:51 localhost ovn_metadata_agent[160585]: 2025-12-15 09:23:51.527 160590 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched CREATE: PortBindingCreateWithChassis(events=('create',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:74:df:7c 192.168.0.201'], port_security=['fa:16:3e:74:df:7c 192.168.0.201'], type=, nat_addresses=[], virtual_parent=[], up=[True], options={'requested-chassis': 'np0005559462.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '192.168.0.201/24', 'neutron:device_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359', 'neutron:device_owner': 'compute:nova', 'neutron:host_id': 'np0005559462.localdomain', 'neutron:mtu': '', 'neutron:network_name': 'neutron-befb7a72-17a9-4bcb-b561-84b8f626685a', 'neutron:port_capabilities': '', 'neutron:port_fip': '192.168.122.20', 'neutron:port_name': '', 'neutron:project_id': 'c785bf23f53946bc99867d8832a50266', 'neutron:revision_number': '7', 'neutron:security_group_ids': 'adeef2d9-3b61-4849-9b44-ac3bff90d0cd fa685b85-67a9-4a56-ba21-4767a05c4811', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=56a5044a-5384-46d9-b45d-bcd5602105ab, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[], logical_port=03ef8889-3216-43fb-8a52-4be17a956ce1) old= matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Dec 15 04:23:51 localhost ovn_metadata_agent[160585]: 2025-12-15 09:23:51.528 160590 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched 
CREATE: ChassisPrivateCreateEvent(events=('create',), table='Chassis_Private', conditions=(('name', '=', '12d96d64-e862-4f68-81e5-8d9ec5d3a5e2'),), old_conditions=None), priority=20 to row=Chassis_Private(chassis=[], external_ids={'neutron:ovn-metadata-id': '20a29b40-d485-5855-8ada-f8c435ab8cd8', 'neutron:ovn-metadata-sb-cfg': '1'}, name=12d96d64-e862-4f68-81e5-8d9ec5d3a5e2, nb_cfg_timestamp=1765790566442, nb_cfg=4) old= matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Dec 15 04:23:51 localhost ovn_metadata_agent[160585]: 2025-12-15 09:23:51.528 160590 INFO neutron.agent.ovn.metadata.agent [-] Port 03ef8889-3216-43fb-8a52-4be17a956ce1 in datapath befb7a72-17a9-4bcb-b561-84b8f626685a bound to our chassis on insert#033[00m Dec 15 04:23:51 localhost ovn_metadata_agent[160585]: 2025-12-15 09:23:51.529 160590 DEBUG neutron_lib.callbacks.manager [-] Subscribe: > process after_init 55550000, False subscribe /usr/lib/python3.9/site-packages/neutron_lib/callbacks/manager.py:52#033[00m Dec 15 04:23:51 localhost ovn_metadata_agent[160585]: 2025-12-15 09:23:51.529 160590 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m Dec 15 04:23:51 localhost ovn_metadata_agent[160585]: 2025-12-15 09:23:51.529 160590 DEBUG oslo_concurrency.lockutils [-] Acquired lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m Dec 15 04:23:51 localhost ovn_metadata_agent[160585]: 2025-12-15 09:23:51.529 160590 DEBUG oslo_concurrency.lockutils [-] Releasing lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m Dec 15 04:23:51 localhost ovn_metadata_agent[160585]: 2025-12-15 09:23:51.530 160590 INFO oslo_service.service [-] Starting 1 workers#033[00m Dec 15 04:23:51 localhost ovn_metadata_agent[160585]: 2025-12-15 09:23:51.532 160590 DEBUG oslo_service.service [-] Started 
child 160779 _start_child /usr/lib/python3.9/site-packages/oslo_service/service.py:575#033[00m Dec 15 04:23:51 localhost ovn_metadata_agent[160585]: 2025-12-15 09:23:51.535 160590 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network befb7a72-17a9-4bcb-b561-84b8f626685a#033[00m Dec 15 04:23:51 localhost ovn_metadata_agent[160585]: 2025-12-15 09:23:51.536 160590 INFO oslo.privsep.daemon [-] Running privsep helper: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'privsep-helper', '--config-file', '/etc/neutron/neutron.conf', '--config-dir', '/etc/neutron.conf.d', '--privsep_context', 'neutron.privileged.default', '--privsep_sock_path', '/tmp/tmpcobbenxu/privsep.sock']#033[00m Dec 15 04:23:51 localhost ovn_metadata_agent[160585]: 2025-12-15 09:23:51.537 160779 DEBUG neutron_lib.callbacks.manager [-] Publish callbacks ['neutron.agent.ovn.metadata.server.MetadataProxyHandler.post_fork_initialize-453349'] for process (None), after_init _notify_loop /usr/lib/python3.9/site-packages/neutron_lib/callbacks/manager.py:184#033[00m Dec 15 04:23:51 localhost ovn_metadata_agent[160585]: 2025-12-15 09:23:51.566 160779 INFO neutron.agent.ovn.metadata.ovsdb [-] Getting OvsdbSbOvnIdl for MetadataAgent with retry#033[00m Dec 15 04:23:51 localhost ovn_metadata_agent[160585]: 2025-12-15 09:23:51.566 160779 DEBUG ovsdbapp.backend.ovs_idl [-] Created lookup_table index Chassis.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:87#033[00m Dec 15 04:23:51 localhost ovn_metadata_agent[160585]: 2025-12-15 09:23:51.567 160779 DEBUG ovsdbapp.backend.ovs_idl [-] Created schema index Datapath_Binding.tunnel_key autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106#033[00m Dec 15 04:23:51 localhost ovn_metadata_agent[160585]: 2025-12-15 09:23:51.569 160779 INFO ovsdbapp.backend.ovs_idl.vlog [-] tcp:ovsdbserver-sb.openstack.svc:6642: connecting...#033[00m Dec 15 04:23:51 localhost 
ovn_metadata_agent[160585]: 2025-12-15 09:23:51.571 160779 INFO ovsdbapp.backend.ovs_idl.vlog [-] tcp:ovsdbserver-sb.openstack.svc:6642: connected#033[00m Dec 15 04:23:51 localhost ovn_metadata_agent[160585]: 2025-12-15 09:23:51.587 160779 INFO eventlet.wsgi.server [-] (160779) wsgi starting up on http:/var/lib/neutron/metadata_proxy#033[00m Dec 15 04:23:52 localhost python3.9[160857]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/deployed_services.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Dec 15 04:23:52 localhost ovn_metadata_agent[160585]: 2025-12-15 09:23:52.147 160590 INFO oslo.privsep.daemon [-] Spawned new privsep daemon via rootwrap#033[00m Dec 15 04:23:52 localhost ovn_metadata_agent[160585]: 2025-12-15 09:23:52.148 160590 DEBUG oslo.privsep.daemon [-] Accepted privsep connection to /tmp/tmpcobbenxu/privsep.sock __init__ /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:362#033[00m Dec 15 04:23:52 localhost ovn_metadata_agent[160585]: 2025-12-15 09:23:52.039 160858 INFO oslo.privsep.daemon [-] privsep daemon starting#033[00m Dec 15 04:23:52 localhost ovn_metadata_agent[160585]: 2025-12-15 09:23:52.044 160858 INFO oslo.privsep.daemon [-] privsep process running with uid/gid: 0/0#033[00m Dec 15 04:23:52 localhost ovn_metadata_agent[160585]: 2025-12-15 09:23:52.048 160858 INFO oslo.privsep.daemon [-] privsep process running with capabilities (eff/prm/inh): CAP_DAC_OVERRIDE|CAP_DAC_READ_SEARCH|CAP_NET_ADMIN|CAP_SYS_ADMIN|CAP_SYS_PTRACE/CAP_DAC_OVERRIDE|CAP_DAC_READ_SEARCH|CAP_NET_ADMIN|CAP_SYS_ADMIN|CAP_SYS_PTRACE/none#033[00m Dec 15 04:23:52 localhost ovn_metadata_agent[160585]: 2025-12-15 09:23:52.048 160858 INFO oslo.privsep.daemon [-] privsep daemon running as pid 160858#033[00m Dec 15 04:23:52 localhost ovn_metadata_agent[160585]: 2025-12-15 09:23:52.150 160858 DEBUG oslo.privsep.daemon [-] privsep: 
reply[4d336b39-80e3-4ac0-acfc-c5337f8214ea]: (2,) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Dec 15 04:23:52 localhost python3.9[160937]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/deployed_services.yaml mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1765790631.5600846-1451-260639597454787/.source.yaml _original_basename=.2fsa5tch follow=False checksum=128dbeb1db3b9f598d275818ea6dfc50e6affd8e backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Dec 15 04:23:52 localhost ovn_metadata_agent[160585]: 2025-12-15 09:23:52.559 160858 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "context-manager" by "neutron_lib.db.api._create_context_manager" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Dec 15 04:23:52 localhost ovn_metadata_agent[160585]: 2025-12-15 09:23:52.559 160858 DEBUG oslo_concurrency.lockutils [-] Lock "context-manager" acquired by "neutron_lib.db.api._create_context_manager" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Dec 15 04:23:52 localhost ovn_metadata_agent[160585]: 2025-12-15 09:23:52.559 160858 DEBUG oslo_concurrency.lockutils [-] Lock "context-manager" "released" by "neutron_lib.db.api._create_context_manager" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Dec 15 04:23:53 localhost ovn_metadata_agent[160585]: 2025-12-15 09:23:52.999 160858 DEBUG oslo.privsep.daemon [-] privsep: reply[2ee77de9-c8ac-4e5e-81d8-035594088e8f]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Dec 15 04:23:53 localhost ovn_metadata_agent[160585]: 2025-12-15 09:23:53.001 160590 INFO oslo.privsep.daemon [-] Running privsep helper: ['sudo', 'neutron-rootwrap', 
'/etc/neutron/rootwrap.conf', 'privsep-helper', '--config-file', '/etc/neutron/neutron.conf', '--config-dir', '/etc/neutron.conf.d', '--privsep_context', 'neutron.privileged.link_cmd', '--privsep_sock_path', '/tmp/tmpppdn02hr/privsep.sock']#033[00m Dec 15 04:23:53 localhost systemd[1]: session-51.scope: Deactivated successfully. Dec 15 04:23:53 localhost systemd[1]: session-51.scope: Consumed 33.527s CPU time. Dec 15 04:23:53 localhost systemd-logind[763]: Session 51 logged out. Waiting for processes to exit. Dec 15 04:23:53 localhost systemd-logind[763]: Removed session 51. Dec 15 04:23:53 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:fc:1f:13 MACDST=fa:16:3e:b3:bb:e3 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=47716 DF PROTO=TCP SPT=45800 DPT=9100 SEQ=2695342916 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A839512630000000001030307) Dec 15 04:23:53 localhost ovn_metadata_agent[160585]: 2025-12-15 09:23:53.572 160590 INFO oslo.privsep.daemon [-] Spawned new privsep daemon via rootwrap#033[00m Dec 15 04:23:53 localhost ovn_metadata_agent[160585]: 2025-12-15 09:23:53.573 160590 DEBUG oslo.privsep.daemon [-] Accepted privsep connection to /tmp/tmpppdn02hr/privsep.sock __init__ /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:362#033[00m Dec 15 04:23:53 localhost ovn_metadata_agent[160585]: 2025-12-15 09:23:53.463 160959 INFO oslo.privsep.daemon [-] privsep daemon starting#033[00m Dec 15 04:23:53 localhost ovn_metadata_agent[160585]: 2025-12-15 09:23:53.469 160959 INFO oslo.privsep.daemon [-] privsep process running with uid/gid: 0/0#033[00m Dec 15 04:23:53 localhost ovn_metadata_agent[160585]: 2025-12-15 09:23:53.472 160959 INFO oslo.privsep.daemon [-] privsep process running with capabilities (eff/prm/inh): CAP_NET_ADMIN|CAP_SYS_ADMIN/CAP_NET_ADMIN|CAP_SYS_ADMIN/none#033[00m Dec 15 04:23:53 localhost ovn_metadata_agent[160585]: 2025-12-15 09:23:53.472 160959 INFO oslo.privsep.daemon [-] 
privsep daemon running as pid 160959#033[00m Dec 15 04:23:53 localhost ovn_metadata_agent[160585]: 2025-12-15 09:23:53.576 160959 DEBUG oslo.privsep.daemon [-] privsep: reply[6b846a05-a374-497f-ad49-c6191ef32df3]: (2,) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Dec 15 04:23:54 localhost ovn_metadata_agent[160585]: 2025-12-15 09:23:54.000 160959 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "context-manager" by "neutron_lib.db.api._create_context_manager" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Dec 15 04:23:54 localhost ovn_metadata_agent[160585]: 2025-12-15 09:23:54.000 160959 DEBUG oslo_concurrency.lockutils [-] Lock "context-manager" acquired by "neutron_lib.db.api._create_context_manager" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Dec 15 04:23:54 localhost ovn_metadata_agent[160585]: 2025-12-15 09:23:54.000 160959 DEBUG oslo_concurrency.lockutils [-] Lock "context-manager" "released" by "neutron_lib.db.api._create_context_manager" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Dec 15 04:23:54 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:fc:1f:13 MACDST=fa:16:3e:b3:bb:e3 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=47717 DF PROTO=TCP SPT=45800 DPT=9100 SEQ=2695342916 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A839516650000000001030307) Dec 15 04:23:54 localhost ovn_metadata_agent[160585]: 2025-12-15 09:23:54.454 160959 DEBUG oslo.privsep.daemon [-] privsep: reply[749b0d47-bc9a-4205-943d-b4ab5d595ce6]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Dec 15 04:23:54 localhost ovn_metadata_agent[160585]: 2025-12-15 09:23:54.457 160959 DEBUG oslo.privsep.daemon [-] privsep: reply[13fbd899-2b4e-4704-a403-eb49d1a73235]: (4, None) _call_back 
/usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Dec 15 04:23:54 localhost ovn_metadata_agent[160585]: 2025-12-15 09:23:54.476 160959 DEBUG oslo.privsep.daemon [-] privsep: reply[b97775a2-843e-4a4d-93cc-27d6db0635ff]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Dec 15 04:23:54 localhost ovn_metadata_agent[160585]: 2025-12-15 09:23:54.494 160858 DEBUG oslo.privsep.daemon [-] privsep: reply[2c27ac10-c36a-4010-bba7-f079ff415dec]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapbefb7a72-11'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_QDISC', 'noqueue'], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['IFLA_ADDRESS', 'fa:16:3e:d6:2e:fb'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 104, 'tx_packets': 68, 'rx_bytes': 8926, 'tx_bytes': 7143, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 104, 'tx_packets': 68, 'rx_bytes': 8926, 'tx_bytes': 7143, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 
'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 14], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 1, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483664], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 646669, 'reachable_time': 33590, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 
0}], ['IFLA_INET6_STATS', {'num': 37, 'inpkts': 6, 'inoctets': 448, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 17, 'outoctets': 1164, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 6, 'outmcastpkts': 17, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 448, 'outmcastoctets': 1164, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 6, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 17, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}]], 'header': {'length': 1400, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 160969, 'error': None, 'target': 'ovnmeta-befb7a72-17a9-4bcb-b561-84b8f626685a', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Dec 15 04:23:54 localhost ovn_metadata_agent[160585]: 2025-12-15 09:23:54.511 160858 DEBUG oslo.privsep.daemon [-] privsep: reply[85ebdb02-ebdc-412d-8309-0d9530710a33]: (4, ({'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tapbefb7a72-11'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 646676, 'tstamp': 646676}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 160970, 'error': None, 'target': 'ovnmeta-befb7a72-17a9-4bcb-b561-84b8f626685a', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 24, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', 
'192.168.0.2'], ['IFA_LOCAL', '192.168.0.2'], ['IFA_BROADCAST', '192.168.0.255'], ['IFA_LABEL', 'tapbefb7a72-11'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 646678, 'tstamp': 646678}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 160970, 'error': None, 'target': 'ovnmeta-befb7a72-17a9-4bcb-b561-84b8f626685a', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 10, 'prefixlen': 64, 'flags': 128, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::a9fe:a9fe'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 646681, 'tstamp': 646681}], ['IFA_FLAGS', 128]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 160970, 'error': None, 'target': 'ovnmeta-befb7a72-17a9-4bcb-b561-84b8f626685a', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 10, 'prefixlen': 64, 'flags': 128, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fed6:2efb'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 646669, 'tstamp': 646669}], ['IFA_FLAGS', 128]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 160970, 'error': None, 'target': 'ovnmeta-befb7a72-17a9-4bcb-b561-84b8f626685a', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Dec 15 04:23:54 localhost ovn_metadata_agent[160585]: 2025-12-15 09:23:54.565 160858 DEBUG oslo.privsep.daemon [-] privsep: reply[ebf5a03c-92df-4249-96e3-6167c13c635d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Dec 15 04:23:54 localhost ovn_metadata_agent[160585]: 2025-12-15 09:23:54.566 160590 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapbefb7a72-10, bridge=br-ex, if_exists=True) do_commit 
/usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m Dec 15 04:23:54 localhost ovn_metadata_agent[160585]: 2025-12-15 09:23:54.611 160590 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapbefb7a72-10, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m Dec 15 04:23:54 localhost ovn_metadata_agent[160585]: 2025-12-15 09:23:54.611 160590 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m Dec 15 04:23:54 localhost ovn_metadata_agent[160585]: 2025-12-15 09:23:54.612 160590 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapbefb7a72-10, col_values=(('external_ids', {'iface-id': 'b35254ad-12eb-47bb-92be-44fefe0694f0'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m Dec 15 04:23:54 localhost ovn_metadata_agent[160585]: 2025-12-15 09:23:54.613 160590 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m Dec 15 04:23:54 localhost ovn_metadata_agent[160585]: 2025-12-15 09:23:54.616 160590 INFO oslo.privsep.daemon [-] Running privsep helper: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'privsep-helper', '--config-file', '/etc/neutron/neutron.conf', '--config-dir', '/etc/neutron.conf.d', '--privsep_context', 'neutron.privileged.namespace_cmd', '--privsep_sock_path', '/tmp/tmp3uetsk_i/privsep.sock']#033[00m Dec 15 04:23:55 localhost ovn_metadata_agent[160585]: 2025-12-15 09:23:55.229 160590 INFO oslo.privsep.daemon [-] Spawned new privsep daemon via rootwrap#033[00m Dec 15 04:23:55 localhost ovn_metadata_agent[160585]: 2025-12-15 
09:23:55.230 160590 DEBUG oslo.privsep.daemon [-] Accepted privsep connection to /tmp/tmp3uetsk_i/privsep.sock __init__ /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:362#033[00m Dec 15 04:23:55 localhost ovn_metadata_agent[160585]: 2025-12-15 09:23:55.124 160979 INFO oslo.privsep.daemon [-] privsep daemon starting#033[00m Dec 15 04:23:55 localhost ovn_metadata_agent[160585]: 2025-12-15 09:23:55.128 160979 INFO oslo.privsep.daemon [-] privsep process running with uid/gid: 0/0#033[00m Dec 15 04:23:55 localhost ovn_metadata_agent[160585]: 2025-12-15 09:23:55.129 160979 INFO oslo.privsep.daemon [-] privsep process running with capabilities (eff/prm/inh): CAP_SYS_ADMIN/CAP_SYS_ADMIN/none#033[00m Dec 15 04:23:55 localhost ovn_metadata_agent[160585]: 2025-12-15 09:23:55.129 160979 INFO oslo.privsep.daemon [-] privsep daemon running as pid 160979#033[00m Dec 15 04:23:55 localhost ovn_metadata_agent[160585]: 2025-12-15 09:23:55.234 160979 DEBUG oslo.privsep.daemon [-] privsep: reply[4c4f4335-9253-421d-a3ac-44cf56ee59f9]: (2,) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Dec 15 04:23:55 localhost ovn_metadata_agent[160585]: 2025-12-15 09:23:55.711 160979 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "context-manager" by "neutron_lib.db.api._create_context_manager" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Dec 15 04:23:55 localhost ovn_metadata_agent[160585]: 2025-12-15 09:23:55.711 160979 DEBUG oslo_concurrency.lockutils [-] Lock "context-manager" acquired by "neutron_lib.db.api._create_context_manager" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Dec 15 04:23:55 localhost ovn_metadata_agent[160585]: 2025-12-15 09:23:55.711 160979 DEBUG oslo_concurrency.lockutils [-] Lock "context-manager" "released" by "neutron_lib.db.api._create_context_manager" :: held 0.000s inner 
/usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Dec 15 04:23:55 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:fc:1f:13 MACDST=fa:16:3e:b3:bb:e3 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=10726 DF PROTO=TCP SPT=56532 DPT=9102 SEQ=2670373206 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A83951CA50000000001030307) Dec 15 04:23:56 localhost ovn_metadata_agent[160585]: 2025-12-15 09:23:56.178 160979 DEBUG oslo.privsep.daemon [-] privsep: reply[bfb1cde8-8d67-4527-9f3b-08e5998f2257]: (4, ['ovnmeta-befb7a72-17a9-4bcb-b561-84b8f626685a']) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Dec 15 04:23:56 localhost ovn_metadata_agent[160585]: 2025-12-15 09:23:56.182 160590 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbAddCommand(_result=None, table=Chassis_Private, record=12d96d64-e862-4f68-81e5-8d9ec5d3a5e2, column=external_ids, values=({'neutron:ovn-metadata-id': '20a29b40-d485-5855-8ada-f8c435ab8cd8'},)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m Dec 15 04:23:56 localhost ovn_metadata_agent[160585]: 2025-12-15 09:23:56.183 160590 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m Dec 15 04:23:56 localhost ovn_metadata_agent[160585]: 2025-12-15 09:23:56.184 160590 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=12d96d64-e862-4f68-81e5-8d9ec5d3a5e2, col_values=(('external_ids', {'neutron:ovn-bridge': 'br-int'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m Dec 15 04:23:56 localhost ovn_metadata_agent[160585]: 2025-12-15 09:23:56.197 160590 DEBUG oslo_service.service [-] Full set of CONF: wait 
/usr/lib/python3.9/site-packages/oslo_service/service.py:649#033[00m Dec 15 04:23:56 localhost ovn_metadata_agent[160585]: 2025-12-15 09:23:56.197 160590 DEBUG oslo_service.service [-] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2589#033[00m Dec 15 04:23:56 localhost ovn_metadata_agent[160585]: 2025-12-15 09:23:56.198 160590 DEBUG oslo_service.service [-] Configuration options gathered from: log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2590#033[00m Dec 15 04:23:56 localhost ovn_metadata_agent[160585]: 2025-12-15 09:23:56.198 160590 DEBUG oslo_service.service [-] command line args: [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2591#033[00m Dec 15 04:23:56 localhost ovn_metadata_agent[160585]: 2025-12-15 09:23:56.199 160590 DEBUG oslo_service.service [-] config files: ['/etc/neutron/neutron.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2592#033[00m Dec 15 04:23:56 localhost ovn_metadata_agent[160585]: 2025-12-15 09:23:56.199 160590 DEBUG oslo_service.service [-] ================================================================================ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2594#033[00m Dec 15 04:23:56 localhost ovn_metadata_agent[160585]: 2025-12-15 09:23:56.200 160590 DEBUG oslo_service.service [-] agent_down_time = 75 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 15 04:23:56 localhost ovn_metadata_agent[160585]: 2025-12-15 09:23:56.200 160590 DEBUG oslo_service.service [-] allow_bulk = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 15 04:23:56 localhost ovn_metadata_agent[160585]: 2025-12-15 09:23:56.201 160590 DEBUG oslo_service.service [-] api_extensions_path = log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 15 04:23:56 localhost 
ovn_metadata_agent[160585]: 2025-12-15 09:23:56.201 160590 DEBUG oslo_service.service [-] api_paste_config = api-paste.ini log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 15 04:23:56 localhost ovn_metadata_agent[160585]: 2025-12-15 09:23:56.202 160590 DEBUG oslo_service.service [-] api_workers = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 15 04:23:56 localhost ovn_metadata_agent[160585]: 2025-12-15 09:23:56.202 160590 DEBUG oslo_service.service [-] auth_ca_cert = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 15 04:23:56 localhost ovn_metadata_agent[160585]: 2025-12-15 09:23:56.203 160590 DEBUG oslo_service.service [-] auth_strategy = keystone log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 15 04:23:56 localhost ovn_metadata_agent[160585]: 2025-12-15 09:23:56.203 160590 DEBUG oslo_service.service [-] backlog = 4096 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 15 04:23:56 localhost ovn_metadata_agent[160585]: 2025-12-15 09:23:56.204 160590 DEBUG oslo_service.service [-] base_mac = fa:16:3e:00:00:00 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 15 04:23:56 localhost ovn_metadata_agent[160585]: 2025-12-15 09:23:56.204 160590 DEBUG oslo_service.service [-] bind_host = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 15 04:23:56 localhost ovn_metadata_agent[160585]: 2025-12-15 09:23:56.205 160590 DEBUG oslo_service.service [-] bind_port = 9696 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 15 04:23:56 localhost ovn_metadata_agent[160585]: 2025-12-15 09:23:56.205 160590 DEBUG oslo_service.service [-] client_socket_timeout = 900 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 15 04:23:56 localhost ovn_metadata_agent[160585]: 
2025-12-15 09:23:56.205 160590 DEBUG oslo_service.service [-] config_dir = ['/etc/neutron.conf.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 15 04:23:56 localhost ovn_metadata_agent[160585]: 2025-12-15 09:23:56.206 160590 DEBUG oslo_service.service [-] config_file = ['/etc/neutron/neutron.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 15 04:23:56 localhost ovn_metadata_agent[160585]: 2025-12-15 09:23:56.206 160590 DEBUG oslo_service.service [-] config_source = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 15 04:23:56 localhost ovn_metadata_agent[160585]: 2025-12-15 09:23:56.207 160590 DEBUG oslo_service.service [-] control_exchange = neutron log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 15 04:23:56 localhost ovn_metadata_agent[160585]: 2025-12-15 09:23:56.207 160590 DEBUG oslo_service.service [-] core_plugin = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 15 04:23:56 localhost ovn_metadata_agent[160585]: 2025-12-15 09:23:56.208 160590 DEBUG oslo_service.service [-] debug = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 15 04:23:56 localhost ovn_metadata_agent[160585]: 2025-12-15 09:23:56.208 160590 DEBUG oslo_service.service [-] default_availability_zones = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 15 04:23:56 localhost ovn_metadata_agent[160585]: 2025-12-15 09:23:56.209 160590 DEBUG oslo_service.service [-] default_log_levels = ['amqp=WARN', 'amqplib=WARN', 'boto=WARN', 'qpid=WARN', 'sqlalchemy=WARN', 'suds=INFO', 'oslo.messaging=INFO', 'oslo_messaging=INFO', 'iso8601=WARN', 'requests.packages.urllib3.connectionpool=WARN', 'urllib3.connectionpool=WARN', 'websocket=WARN', 'requests.packages.urllib3.util.retry=WARN', 'urllib3.util.retry=WARN', 'keystonemiddleware=WARN', 
'routes.middleware=WARN', 'stevedore=WARN', 'taskflow=WARN', 'keystoneauth=WARN', 'oslo.cache=INFO', 'oslo_policy=INFO', 'dogpile.core.dogpile=INFO', 'OFPHandler=INFO', 'OfctlService=INFO', 'os_ken.base.app_manager=INFO', 'os_ken.controller.controller=INFO'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 15 04:23:56 localhost ovn_metadata_agent[160585]: 2025-12-15 09:23:56.210 160590 DEBUG oslo_service.service [-] dhcp_agent_notification = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 15 04:23:56 localhost ovn_metadata_agent[160585]: 2025-12-15 09:23:56.210 160590 DEBUG oslo_service.service [-] dhcp_lease_duration = 86400 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 15 04:23:56 localhost ovn_metadata_agent[160585]: 2025-12-15 09:23:56.211 160590 DEBUG oslo_service.service [-] dhcp_load_type = networks log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 15 04:23:56 localhost ovn_metadata_agent[160585]: 2025-12-15 09:23:56.211 160590 DEBUG oslo_service.service [-] dns_domain = openstacklocal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 15 04:23:56 localhost ovn_metadata_agent[160585]: 2025-12-15 09:23:56.212 160590 DEBUG oslo_service.service [-] enable_new_agents = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 15 04:23:56 localhost ovn_metadata_agent[160585]: 2025-12-15 09:23:56.212 160590 DEBUG oslo_service.service [-] enable_traditional_dhcp = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 15 04:23:56 localhost ovn_metadata_agent[160585]: 2025-12-15 09:23:56.212 160590 DEBUG oslo_service.service [-] external_dns_driver = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 15 04:23:56 localhost ovn_metadata_agent[160585]: 2025-12-15 09:23:56.213 160590 DEBUG 
oslo_service.service [-] external_pids = /var/lib/neutron/external/pids log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 15 04:23:56 localhost ovn_metadata_agent[160585]: 2025-12-15 09:23:56.213 160590 DEBUG oslo_service.service [-] filter_validation = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 15 04:23:56 localhost ovn_metadata_agent[160585]: 2025-12-15 09:23:56.214 160590 DEBUG oslo_service.service [-] global_physnet_mtu = 1500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 15 04:23:56 localhost ovn_metadata_agent[160585]: 2025-12-15 09:23:56.214 160590 DEBUG oslo_service.service [-] graceful_shutdown_timeout = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 15 04:23:56 localhost ovn_metadata_agent[160585]: 2025-12-15 09:23:56.215 160590 DEBUG oslo_service.service [-] host = np0005559462.localdomain log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 15 04:23:56 localhost ovn_metadata_agent[160585]: 2025-12-15 09:23:56.215 160590 DEBUG oslo_service.service [-] http_retries = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 15 04:23:56 localhost ovn_metadata_agent[160585]: 2025-12-15 09:23:56.216 160590 DEBUG oslo_service.service [-] instance_format = [instance: %(uuid)s] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 15 04:23:56 localhost ovn_metadata_agent[160585]: 2025-12-15 09:23:56.216 160590 DEBUG oslo_service.service [-] instance_uuid_format = [instance: %(uuid)s] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 15 04:23:56 localhost ovn_metadata_agent[160585]: 2025-12-15 09:23:56.216 160590 DEBUG oslo_service.service [-] ipam_driver = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 15 04:23:56 localhost 
ovn_metadata_agent[160585]: 2025-12-15 09:23:56.217 160590 DEBUG oslo_service.service [-] ipv6_pd_enabled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 15 04:23:56 localhost ovn_metadata_agent[160585]: 2025-12-15 09:23:56.217 160590 DEBUG oslo_service.service [-] log_config_append = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 15 04:23:56 localhost ovn_metadata_agent[160585]: 2025-12-15 09:23:56.218 160590 DEBUG oslo_service.service [-] log_date_format = %Y-%m-%d %H:%M:%S log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 15 04:23:56 localhost ovn_metadata_agent[160585]: 2025-12-15 09:23:56.218 160590 DEBUG oslo_service.service [-] log_dir = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 15 04:23:56 localhost ovn_metadata_agent[160585]: 2025-12-15 09:23:56.219 160590 DEBUG oslo_service.service [-] log_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 15 04:23:56 localhost ovn_metadata_agent[160585]: 2025-12-15 09:23:56.219 160590 DEBUG oslo_service.service [-] log_options = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 15 04:23:56 localhost ovn_metadata_agent[160585]: 2025-12-15 09:23:56.220 160590 DEBUG oslo_service.service [-] log_rotate_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 15 04:23:56 localhost ovn_metadata_agent[160585]: 2025-12-15 09:23:56.220 160590 DEBUG oslo_service.service [-] log_rotate_interval_type = days log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 15 04:23:56 localhost ovn_metadata_agent[160585]: 2025-12-15 09:23:56.220 160590 DEBUG oslo_service.service [-] log_rotation_type = none log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 15 04:23:56 localhost 
ovn_metadata_agent[160585]: 2025-12-15 09:23:56.221 160590 DEBUG oslo_service.service [-] logging_context_format_string = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [%(global_request_id)s %(request_id)s %(user_identity)s] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 15 04:23:56 localhost ovn_metadata_agent[160585]: 2025-12-15 09:23:56.221 160590 DEBUG oslo_service.service [-] logging_debug_format_suffix = %(funcName)s %(pathname)s:%(lineno)d log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 15 04:23:56 localhost ovn_metadata_agent[160585]: 2025-12-15 09:23:56.221 160590 DEBUG oslo_service.service [-] logging_default_format_string = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [-] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 15 04:23:56 localhost ovn_metadata_agent[160585]: 2025-12-15 09:23:56.221 160590 DEBUG oslo_service.service [-] logging_exception_prefix = %(asctime)s.%(msecs)03d %(process)d ERROR %(name)s %(instance)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 15 04:23:56 localhost ovn_metadata_agent[160585]: 2025-12-15 09:23:56.221 160590 DEBUG oslo_service.service [-] logging_user_identity_format = %(user)s %(project)s %(domain)s %(system_scope)s %(user_domain)s %(project_domain)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 15 04:23:56 localhost ovn_metadata_agent[160585]: 2025-12-15 09:23:56.222 160590 DEBUG oslo_service.service [-] max_dns_nameservers = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 15 04:23:56 localhost ovn_metadata_agent[160585]: 2025-12-15 09:23:56.222 160590 DEBUG oslo_service.service [-] max_header_line = 16384 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 15 04:23:56 localhost 
ovn_metadata_agent[160585]: 2025-12-15 09:23:56.222 160590 DEBUG oslo_service.service [-] max_logfile_count = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 15 04:23:56 localhost ovn_metadata_agent[160585]: 2025-12-15 09:23:56.222 160590 DEBUG oslo_service.service [-] max_logfile_size_mb = 200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 15 04:23:56 localhost ovn_metadata_agent[160585]: 2025-12-15 09:23:56.223 160590 DEBUG oslo_service.service [-] max_subnet_host_routes = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 15 04:23:56 localhost ovn_metadata_agent[160585]: 2025-12-15 09:23:56.223 160590 DEBUG oslo_service.service [-] metadata_backlog = 4096 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 15 04:23:56 localhost ovn_metadata_agent[160585]: 2025-12-15 09:23:56.223 160590 DEBUG oslo_service.service [-] metadata_proxy_group = log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 15 04:23:56 localhost ovn_metadata_agent[160585]: 2025-12-15 09:23:56.223 160590 DEBUG oslo_service.service [-] metadata_proxy_shared_secret = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 15 04:23:56 localhost ovn_metadata_agent[160585]: 2025-12-15 09:23:56.224 160590 DEBUG oslo_service.service [-] metadata_proxy_socket = /var/lib/neutron/metadata_proxy log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 15 04:23:56 localhost ovn_metadata_agent[160585]: 2025-12-15 09:23:56.224 160590 DEBUG oslo_service.service [-] metadata_proxy_socket_mode = deduce log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 15 04:23:56 localhost ovn_metadata_agent[160585]: 2025-12-15 09:23:56.224 160590 DEBUG oslo_service.service [-] metadata_proxy_user = log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 15 04:23:56 localhost ovn_metadata_agent[160585]: 2025-12-15 09:23:56.224 160590 DEBUG oslo_service.service [-] metadata_workers = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 15 04:23:56 localhost ovn_metadata_agent[160585]: 2025-12-15 09:23:56.225 160590 DEBUG oslo_service.service [-] network_link_prefix = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 15 04:23:56 localhost ovn_metadata_agent[160585]: 2025-12-15 09:23:56.225 160590 DEBUG oslo_service.service [-] notify_nova_on_port_data_changes = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 15 04:23:56 localhost ovn_metadata_agent[160585]: 2025-12-15 09:23:56.225 160590 DEBUG oslo_service.service [-] notify_nova_on_port_status_changes = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 15 04:23:56 localhost ovn_metadata_agent[160585]: 2025-12-15 09:23:56.225 160590 DEBUG oslo_service.service [-] nova_client_cert = log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 15 04:23:56 localhost ovn_metadata_agent[160585]: 2025-12-15 09:23:56.225 160590 DEBUG oslo_service.service [-] nova_client_priv_key = log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 15 04:23:56 localhost ovn_metadata_agent[160585]: 2025-12-15 09:23:56.226 160590 DEBUG oslo_service.service [-] nova_metadata_host = nova-metadata-internal.openstack.svc log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 15 04:23:56 localhost ovn_metadata_agent[160585]: 2025-12-15 09:23:56.226 160590 DEBUG oslo_service.service [-] nova_metadata_insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 15 04:23:56 localhost ovn_metadata_agent[160585]: 2025-12-15 09:23:56.226 160590 DEBUG 
oslo_service.service [-] nova_metadata_port = 8775 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 15 04:23:56 localhost ovn_metadata_agent[160585]: 2025-12-15 09:23:56.226 160590 DEBUG oslo_service.service [-] nova_metadata_protocol = http log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 15 04:23:56 localhost ovn_metadata_agent[160585]: 2025-12-15 09:23:56.227 160590 DEBUG oslo_service.service [-] pagination_max_limit = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 15 04:23:56 localhost ovn_metadata_agent[160585]: 2025-12-15 09:23:56.227 160590 DEBUG oslo_service.service [-] periodic_fuzzy_delay = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 15 04:23:56 localhost ovn_metadata_agent[160585]: 2025-12-15 09:23:56.227 160590 DEBUG oslo_service.service [-] periodic_interval = 40 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 15 04:23:56 localhost ovn_metadata_agent[160585]: 2025-12-15 09:23:56.227 160590 DEBUG oslo_service.service [-] publish_errors = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 15 04:23:56 localhost ovn_metadata_agent[160585]: 2025-12-15 09:23:56.228 160590 DEBUG oslo_service.service [-] rate_limit_burst = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 15 04:23:56 localhost ovn_metadata_agent[160585]: 2025-12-15 09:23:56.228 160590 DEBUG oslo_service.service [-] rate_limit_except_level = CRITICAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 15 04:23:56 localhost ovn_metadata_agent[160585]: 2025-12-15 09:23:56.228 160590 DEBUG oslo_service.service [-] rate_limit_interval = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 15 04:23:56 localhost ovn_metadata_agent[160585]: 2025-12-15 09:23:56.228 160590 DEBUG 
oslo_service.service [-] retry_until_window = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 15 04:23:56 localhost ovn_metadata_agent[160585]: 2025-12-15 09:23:56.228 160590 DEBUG oslo_service.service [-] rpc_resources_processing_step = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 15 04:23:56 localhost ovn_metadata_agent[160585]: 2025-12-15 09:23:56.229 160590 DEBUG oslo_service.service [-] rpc_response_max_timeout = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 15 04:23:56 localhost ovn_metadata_agent[160585]: 2025-12-15 09:23:56.229 160590 DEBUG oslo_service.service [-] rpc_state_report_workers = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 15 04:23:56 localhost ovn_metadata_agent[160585]: 2025-12-15 09:23:56.229 160590 DEBUG oslo_service.service [-] rpc_workers = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 15 04:23:56 localhost ovn_metadata_agent[160585]: 2025-12-15 09:23:56.229 160590 DEBUG oslo_service.service [-] send_events_interval = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 15 04:23:56 localhost ovn_metadata_agent[160585]: 2025-12-15 09:23:56.230 160590 DEBUG oslo_service.service [-] service_plugins = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 15 04:23:56 localhost ovn_metadata_agent[160585]: 2025-12-15 09:23:56.230 160590 DEBUG oslo_service.service [-] setproctitle = on log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 15 04:23:56 localhost ovn_metadata_agent[160585]: 2025-12-15 09:23:56.230 160590 DEBUG oslo_service.service [-] state_path = /var/lib/neutron log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 15 04:23:56 localhost ovn_metadata_agent[160585]: 2025-12-15 09:23:56.230 160590 DEBUG 
oslo_service.service [-] syslog_log_facility = syslog log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 15 04:23:56 localhost ovn_metadata_agent[160585]: 2025-12-15 09:23:56.231 160590 DEBUG oslo_service.service [-] tcp_keepidle = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 15 04:23:56 localhost ovn_metadata_agent[160585]: 2025-12-15 09:23:56.231 160590 DEBUG oslo_service.service [-] transport_url = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 15 04:23:56 localhost ovn_metadata_agent[160585]: 2025-12-15 09:23:56.231 160590 DEBUG oslo_service.service [-] use_eventlog = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 15 04:23:56 localhost ovn_metadata_agent[160585]: 2025-12-15 09:23:56.231 160590 DEBUG oslo_service.service [-] use_journal = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 15 04:23:56 localhost ovn_metadata_agent[160585]: 2025-12-15 09:23:56.232 160590 DEBUG oslo_service.service [-] use_json = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 15 04:23:56 localhost ovn_metadata_agent[160585]: 2025-12-15 09:23:56.232 160590 DEBUG oslo_service.service [-] use_ssl = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 15 04:23:56 localhost ovn_metadata_agent[160585]: 2025-12-15 09:23:56.232 160590 DEBUG oslo_service.service [-] use_stderr = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 15 04:23:56 localhost ovn_metadata_agent[160585]: 2025-12-15 09:23:56.232 160590 DEBUG oslo_service.service [-] use_syslog = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 15 04:23:56 localhost ovn_metadata_agent[160585]: 2025-12-15 09:23:56.232 160590 DEBUG oslo_service.service [-] vlan_transparent = False 
log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 15 04:23:56 localhost ovn_metadata_agent[160585]: 2025-12-15 09:23:56.233 160590 DEBUG oslo_service.service [-] watch_log_file = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 15 04:23:56 localhost ovn_metadata_agent[160585]: 2025-12-15 09:23:56.233 160590 DEBUG oslo_service.service [-] wsgi_default_pool_size = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 15 04:23:56 localhost ovn_metadata_agent[160585]: 2025-12-15 09:23:56.233 160590 DEBUG oslo_service.service [-] wsgi_keep_alive = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 15 04:23:56 localhost ovn_metadata_agent[160585]: 2025-12-15 09:23:56.233 160590 DEBUG oslo_service.service [-] wsgi_log_format = %(client_ip)s "%(request_line)s" status: %(status_code)s len: %(body_length)s time: %(wall_seconds).7f log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 15 04:23:56 localhost ovn_metadata_agent[160585]: 2025-12-15 09:23:56.233 160590 DEBUG oslo_service.service [-] wsgi_server_debug = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 15 04:23:56 localhost ovn_metadata_agent[160585]: 2025-12-15 09:23:56.234 160590 DEBUG oslo_service.service [-] oslo_concurrency.disable_process_locking = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:23:56 localhost ovn_metadata_agent[160585]: 2025-12-15 09:23:56.234 160590 DEBUG oslo_service.service [-] oslo_concurrency.lock_path = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:23:56 localhost ovn_metadata_agent[160585]: 2025-12-15 09:23:56.235 160590 DEBUG oslo_service.service [-] profiler.connection_string = messaging:// log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 
15 04:23:56 localhost ovn_metadata_agent[160585]: 2025-12-15 09:23:56.235 160590 DEBUG oslo_service.service [-] profiler.enabled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:23:56 localhost ovn_metadata_agent[160585]: 2025-12-15 09:23:56.235 160590 DEBUG oslo_service.service [-] profiler.es_doc_type = notification log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:23:56 localhost ovn_metadata_agent[160585]: 2025-12-15 09:23:56.235 160590 DEBUG oslo_service.service [-] profiler.es_scroll_size = 10000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:23:56 localhost ovn_metadata_agent[160585]: 2025-12-15 09:23:56.236 160590 DEBUG oslo_service.service [-] profiler.es_scroll_time = 2m log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:23:56 localhost ovn_metadata_agent[160585]: 2025-12-15 09:23:56.236 160590 DEBUG oslo_service.service [-] profiler.filter_error_trace = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:23:56 localhost ovn_metadata_agent[160585]: 2025-12-15 09:23:56.236 160590 DEBUG oslo_service.service [-] profiler.hmac_keys = SECRET_KEY log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:23:56 localhost ovn_metadata_agent[160585]: 2025-12-15 09:23:56.236 160590 DEBUG oslo_service.service [-] profiler.sentinel_service_name = mymaster log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:23:56 localhost ovn_metadata_agent[160585]: 2025-12-15 09:23:56.236 160590 DEBUG oslo_service.service [-] profiler.socket_timeout = 0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:23:56 localhost ovn_metadata_agent[160585]: 2025-12-15 09:23:56.237 160590 DEBUG oslo_service.service [-] profiler.trace_sqlalchemy = False log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:23:56 localhost ovn_metadata_agent[160585]: 2025-12-15 09:23:56.237 160590 DEBUG oslo_service.service [-] oslo_policy.enforce_new_defaults = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:23:56 localhost ovn_metadata_agent[160585]: 2025-12-15 09:23:56.238 160590 DEBUG oslo_service.service [-] oslo_policy.enforce_scope = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:23:56 localhost ovn_metadata_agent[160585]: 2025-12-15 09:23:56.238 160590 DEBUG oslo_service.service [-] oslo_policy.policy_default_rule = default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:23:56 localhost ovn_metadata_agent[160585]: 2025-12-15 09:23:56.238 160590 DEBUG oslo_service.service [-] oslo_policy.policy_dirs = ['policy.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:23:56 localhost ovn_metadata_agent[160585]: 2025-12-15 09:23:56.239 160590 DEBUG oslo_service.service [-] oslo_policy.policy_file = policy.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:23:56 localhost ovn_metadata_agent[160585]: 2025-12-15 09:23:56.239 160590 DEBUG oslo_service.service [-] oslo_policy.remote_content_type = application/x-www-form-urlencoded log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:23:56 localhost ovn_metadata_agent[160585]: 2025-12-15 09:23:56.239 160590 DEBUG oslo_service.service [-] oslo_policy.remote_ssl_ca_crt_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:23:56 localhost ovn_metadata_agent[160585]: 2025-12-15 09:23:56.240 160590 DEBUG oslo_service.service [-] oslo_policy.remote_ssl_client_crt_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:23:56 
localhost ovn_metadata_agent[160585]: 2025-12-15 09:23:56.240 160590 DEBUG oslo_service.service [-] oslo_policy.remote_ssl_client_key_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:23:56 localhost ovn_metadata_agent[160585]: 2025-12-15 09:23:56.240 160590 DEBUG oslo_service.service [-] oslo_policy.remote_ssl_verify_server_crt = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:23:56 localhost ovn_metadata_agent[160585]: 2025-12-15 09:23:56.241 160590 DEBUG oslo_service.service [-] oslo_messaging_metrics.metrics_buffer_size = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:23:56 localhost ovn_metadata_agent[160585]: 2025-12-15 09:23:56.241 160590 DEBUG oslo_service.service [-] oslo_messaging_metrics.metrics_enabled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:23:56 localhost ovn_metadata_agent[160585]: 2025-12-15 09:23:56.241 160590 DEBUG oslo_service.service [-] oslo_messaging_metrics.metrics_process_name = log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:23:56 localhost ovn_metadata_agent[160585]: 2025-12-15 09:23:56.242 160590 DEBUG oslo_service.service [-] oslo_messaging_metrics.metrics_socket_file = /var/tmp/metrics_collector.sock log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:23:56 localhost ovn_metadata_agent[160585]: 2025-12-15 09:23:56.242 160590 DEBUG oslo_service.service [-] oslo_messaging_metrics.metrics_thread_stop_timeout = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:23:56 localhost ovn_metadata_agent[160585]: 2025-12-15 09:23:56.242 160590 DEBUG oslo_service.service [-] oslo_middleware.http_basic_auth_user_file = /etc/htpasswd log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:23:56 
localhost ovn_metadata_agent[160585]: 2025-12-15 09:23:56.242 160590 DEBUG oslo_service.service [-] service_providers.service_provider = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:23:56 localhost ovn_metadata_agent[160585]: 2025-12-15 09:23:56.243 160590 DEBUG oslo_service.service [-] privsep.capabilities = [21, 12, 1, 2, 19] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:23:56 localhost ovn_metadata_agent[160585]: 2025-12-15 09:23:56.243 160590 DEBUG oslo_service.service [-] privsep.group = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:23:56 localhost ovn_metadata_agent[160585]: 2025-12-15 09:23:56.243 160590 DEBUG oslo_service.service [-] privsep.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:23:56 localhost ovn_metadata_agent[160585]: 2025-12-15 09:23:56.243 160590 DEBUG oslo_service.service [-] privsep.logger_name = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:23:56 localhost ovn_metadata_agent[160585]: 2025-12-15 09:23:56.244 160590 DEBUG oslo_service.service [-] privsep.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:23:56 localhost ovn_metadata_agent[160585]: 2025-12-15 09:23:56.244 160590 DEBUG oslo_service.service [-] privsep.user = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:23:56 localhost ovn_metadata_agent[160585]: 2025-12-15 09:23:56.244 160590 DEBUG oslo_service.service [-] privsep_dhcp_release.capabilities = [21, 12] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:23:56 localhost ovn_metadata_agent[160585]: 2025-12-15 09:23:56.244 160590 DEBUG oslo_service.service [-] privsep_dhcp_release.group = None log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:23:56 localhost ovn_metadata_agent[160585]: 2025-12-15 09:23:56.244 160590 DEBUG oslo_service.service [-] privsep_dhcp_release.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:23:56 localhost ovn_metadata_agent[160585]: 2025-12-15 09:23:56.245 160590 DEBUG oslo_service.service [-] privsep_dhcp_release.logger_name = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:23:56 localhost ovn_metadata_agent[160585]: 2025-12-15 09:23:56.245 160590 DEBUG oslo_service.service [-] privsep_dhcp_release.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:23:56 localhost ovn_metadata_agent[160585]: 2025-12-15 09:23:56.245 160590 DEBUG oslo_service.service [-] privsep_dhcp_release.user = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:23:56 localhost ovn_metadata_agent[160585]: 2025-12-15 09:23:56.245 160590 DEBUG oslo_service.service [-] privsep_ovs_vsctl.capabilities = [21, 12] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:23:56 localhost ovn_metadata_agent[160585]: 2025-12-15 09:23:56.245 160590 DEBUG oslo_service.service [-] privsep_ovs_vsctl.group = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:23:56 localhost ovn_metadata_agent[160585]: 2025-12-15 09:23:56.246 160590 DEBUG oslo_service.service [-] privsep_ovs_vsctl.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:23:56 localhost ovn_metadata_agent[160585]: 2025-12-15 09:23:56.246 160590 DEBUG oslo_service.service [-] privsep_ovs_vsctl.logger_name = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:23:56 localhost 
ovn_metadata_agent[160585]: 2025-12-15 09:23:56.246 160590 DEBUG oslo_service.service [-] privsep_ovs_vsctl.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:23:56 localhost ovn_metadata_agent[160585]: 2025-12-15 09:23:56.246 160590 DEBUG oslo_service.service [-] privsep_ovs_vsctl.user = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:23:56 localhost ovn_metadata_agent[160585]: 2025-12-15 09:23:56.247 160590 DEBUG oslo_service.service [-] privsep_namespace.capabilities = [21] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:23:56 localhost ovn_metadata_agent[160585]: 2025-12-15 09:23:56.247 160590 DEBUG oslo_service.service [-] privsep_namespace.group = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:23:56 localhost ovn_metadata_agent[160585]: 2025-12-15 09:23:56.247 160590 DEBUG oslo_service.service [-] privsep_namespace.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:23:56 localhost ovn_metadata_agent[160585]: 2025-12-15 09:23:56.247 160590 DEBUG oslo_service.service [-] privsep_namespace.logger_name = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:23:56 localhost ovn_metadata_agent[160585]: 2025-12-15 09:23:56.248 160590 DEBUG oslo_service.service [-] privsep_namespace.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:23:56 localhost ovn_metadata_agent[160585]: 2025-12-15 09:23:56.248 160590 DEBUG oslo_service.service [-] privsep_namespace.user = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:23:56 localhost ovn_metadata_agent[160585]: 2025-12-15 09:23:56.248 160590 DEBUG oslo_service.service [-] privsep_conntrack.capabilities = [12] 
log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:23:56 localhost ovn_metadata_agent[160585]: 2025-12-15 09:23:56.249 160590 DEBUG oslo_service.service [-] privsep_conntrack.group = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:23:56 localhost ovn_metadata_agent[160585]: 2025-12-15 09:23:56.249 160590 DEBUG oslo_service.service [-] privsep_conntrack.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:23:56 localhost ovn_metadata_agent[160585]: 2025-12-15 09:23:56.249 160590 DEBUG oslo_service.service [-] privsep_conntrack.logger_name = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:23:56 localhost ovn_metadata_agent[160585]: 2025-12-15 09:23:56.250 160590 DEBUG oslo_service.service [-] privsep_conntrack.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:23:56 localhost ovn_metadata_agent[160585]: 2025-12-15 09:23:56.250 160590 DEBUG oslo_service.service [-] privsep_conntrack.user = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:23:56 localhost ovn_metadata_agent[160585]: 2025-12-15 09:23:56.250 160590 DEBUG oslo_service.service [-] privsep_link.capabilities = [12, 21] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:23:56 localhost ovn_metadata_agent[160585]: 2025-12-15 09:23:56.251 160590 DEBUG oslo_service.service [-] privsep_link.group = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:23:56 localhost ovn_metadata_agent[160585]: 2025-12-15 09:23:56.251 160590 DEBUG oslo_service.service [-] privsep_link.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:23:56 localhost ovn_metadata_agent[160585]: 
2025-12-15 09:23:56.252 160590 DEBUG oslo_service.service [-] privsep_link.logger_name = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:23:56 localhost ovn_metadata_agent[160585]: 2025-12-15 09:23:56.252 160590 DEBUG oslo_service.service [-] privsep_link.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:23:56 localhost ovn_metadata_agent[160585]: 2025-12-15 09:23:56.252 160590 DEBUG oslo_service.service [-] privsep_link.user = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:23:56 localhost ovn_metadata_agent[160585]: 2025-12-15 09:23:56.252 160590 DEBUG oslo_service.service [-] AGENT.check_child_processes_action = respawn log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:23:56 localhost ovn_metadata_agent[160585]: 2025-12-15 09:23:56.253 160590 DEBUG oslo_service.service [-] AGENT.check_child_processes_interval = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:23:56 localhost ovn_metadata_agent[160585]: 2025-12-15 09:23:56.253 160590 DEBUG oslo_service.service [-] AGENT.comment_iptables_rules = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:23:56 localhost ovn_metadata_agent[160585]: 2025-12-15 09:23:56.253 160590 DEBUG oslo_service.service [-] AGENT.debug_iptables_rules = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:23:56 localhost ovn_metadata_agent[160585]: 2025-12-15 09:23:56.253 160590 DEBUG oslo_service.service [-] AGENT.kill_scripts_path = /etc/neutron/kill_scripts/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:23:56 localhost ovn_metadata_agent[160585]: 2025-12-15 09:23:56.253 160590 DEBUG oslo_service.service [-] AGENT.root_helper = sudo neutron-rootwrap 
/etc/neutron/rootwrap.conf log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:23:56 localhost ovn_metadata_agent[160585]: 2025-12-15 09:23:56.254 160590 DEBUG oslo_service.service [-] AGENT.root_helper_daemon = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:23:56 localhost ovn_metadata_agent[160585]: 2025-12-15 09:23:56.254 160590 DEBUG oslo_service.service [-] AGENT.use_helper_for_ns_read = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:23:56 localhost ovn_metadata_agent[160585]: 2025-12-15 09:23:56.254 160590 DEBUG oslo_service.service [-] AGENT.use_random_fully = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:23:56 localhost ovn_metadata_agent[160585]: 2025-12-15 09:23:56.254 160590 DEBUG oslo_service.service [-] oslo_versionedobjects.fatal_exception_format_errors = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:23:56 localhost ovn_metadata_agent[160585]: 2025-12-15 09:23:56.255 160590 DEBUG oslo_service.service [-] QUOTAS.default_quota = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:23:56 localhost ovn_metadata_agent[160585]: 2025-12-15 09:23:56.255 160590 DEBUG oslo_service.service [-] QUOTAS.quota_driver = neutron.db.quota.driver_nolock.DbQuotaNoLockDriver log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:23:56 localhost ovn_metadata_agent[160585]: 2025-12-15 09:23:56.255 160590 DEBUG oslo_service.service [-] QUOTAS.quota_network = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:23:56 localhost ovn_metadata_agent[160585]: 2025-12-15 09:23:56.255 160590 DEBUG oslo_service.service [-] QUOTAS.quota_port = 500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:23:56 
localhost ovn_metadata_agent[160585]: 2025-12-15 09:23:56.256 160590 DEBUG oslo_service.service [-] QUOTAS.quota_security_group = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:23:56 localhost ovn_metadata_agent[160585]: 2025-12-15 09:23:56.256 160590 DEBUG oslo_service.service [-] QUOTAS.quota_security_group_rule = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:23:56 localhost ovn_metadata_agent[160585]: 2025-12-15 09:23:56.256 160590 DEBUG oslo_service.service [-] QUOTAS.quota_subnet = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:23:56 localhost ovn_metadata_agent[160585]: 2025-12-15 09:23:56.256 160590 DEBUG oslo_service.service [-] QUOTAS.track_quota_usage = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:23:56 localhost ovn_metadata_agent[160585]: 2025-12-15 09:23:56.257 160590 DEBUG oslo_service.service [-] nova.auth_section = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:23:56 localhost ovn_metadata_agent[160585]: 2025-12-15 09:23:56.257 160590 DEBUG oslo_service.service [-] nova.auth_type = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:23:56 localhost ovn_metadata_agent[160585]: 2025-12-15 09:23:56.257 160590 DEBUG oslo_service.service [-] nova.cafile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:23:56 localhost ovn_metadata_agent[160585]: 2025-12-15 09:23:56.257 160590 DEBUG oslo_service.service [-] nova.certfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:23:56 localhost ovn_metadata_agent[160585]: 2025-12-15 09:23:56.258 160590 DEBUG oslo_service.service [-] nova.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 
04:23:56 localhost ovn_metadata_agent[160585]: 2025-12-15 09:23:56.258 160590 DEBUG oslo_service.service [-] nova.endpoint_type = public log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:23:56 localhost ovn_metadata_agent[160585]: 2025-12-15 09:23:56.258 160590 DEBUG oslo_service.service [-] nova.insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:23:56 localhost ovn_metadata_agent[160585]: 2025-12-15 09:23:56.258 160590 DEBUG oslo_service.service [-] nova.keyfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:23:56 localhost ovn_metadata_agent[160585]: 2025-12-15 09:23:56.258 160590 DEBUG oslo_service.service [-] nova.region_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:23:56 localhost ovn_metadata_agent[160585]: 2025-12-15 09:23:56.259 160590 DEBUG oslo_service.service [-] nova.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:23:56 localhost ovn_metadata_agent[160585]: 2025-12-15 09:23:56.259 160590 DEBUG oslo_service.service [-] nova.timeout = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:23:56 localhost ovn_metadata_agent[160585]: 2025-12-15 09:23:56.259 160590 DEBUG oslo_service.service [-] placement.auth_section = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:23:56 localhost ovn_metadata_agent[160585]: 2025-12-15 09:23:56.259 160590 DEBUG oslo_service.service [-] placement.auth_type = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:23:56 localhost ovn_metadata_agent[160585]: 2025-12-15 09:23:56.260 160590 DEBUG oslo_service.service [-] placement.cafile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:23:56 
localhost ovn_metadata_agent[160585]: 2025-12-15 09:23:56.260 160590 DEBUG oslo_service.service [-] placement.certfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:23:56 localhost ovn_metadata_agent[160585]: 2025-12-15 09:23:56.260 160590 DEBUG oslo_service.service [-] placement.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:23:56 localhost ovn_metadata_agent[160585]: 2025-12-15 09:23:56.260 160590 DEBUG oslo_service.service [-] placement.endpoint_type = public log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:23:56 localhost ovn_metadata_agent[160585]: 2025-12-15 09:23:56.260 160590 DEBUG oslo_service.service [-] placement.insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:23:56 localhost ovn_metadata_agent[160585]: 2025-12-15 09:23:56.261 160590 DEBUG oslo_service.service [-] placement.keyfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:23:56 localhost ovn_metadata_agent[160585]: 2025-12-15 09:23:56.261 160590 DEBUG oslo_service.service [-] placement.region_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:23:56 localhost ovn_metadata_agent[160585]: 2025-12-15 09:23:56.261 160590 DEBUG oslo_service.service [-] placement.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:23:56 localhost ovn_metadata_agent[160585]: 2025-12-15 09:23:56.261 160590 DEBUG oslo_service.service [-] placement.timeout = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:23:56 localhost ovn_metadata_agent[160585]: 2025-12-15 09:23:56.261 160590 DEBUG oslo_service.service [-] ironic.auth_section = None log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:23:56 localhost ovn_metadata_agent[160585]: 2025-12-15 09:23:56.262 160590 DEBUG oslo_service.service [-] ironic.auth_type = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:23:56 localhost ovn_metadata_agent[160585]: 2025-12-15 09:23:56.262 160590 DEBUG oslo_service.service [-] ironic.cafile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:23:56 localhost ovn_metadata_agent[160585]: 2025-12-15 09:23:56.262 160590 DEBUG oslo_service.service [-] ironic.certfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:23:56 localhost ovn_metadata_agent[160585]: 2025-12-15 09:23:56.263 160590 DEBUG oslo_service.service [-] ironic.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:23:56 localhost ovn_metadata_agent[160585]: 2025-12-15 09:23:56.263 160590 DEBUG oslo_service.service [-] ironic.connect_retries = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:23:56 localhost ovn_metadata_agent[160585]: 2025-12-15 09:23:56.263 160590 DEBUG oslo_service.service [-] ironic.connect_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:23:56 localhost ovn_metadata_agent[160585]: 2025-12-15 09:23:56.264 160590 DEBUG oslo_service.service [-] ironic.enable_notifications = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:23:56 localhost ovn_metadata_agent[160585]: 2025-12-15 09:23:56.264 160590 DEBUG oslo_service.service [-] ironic.endpoint_override = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:23:56 localhost ovn_metadata_agent[160585]: 2025-12-15 09:23:56.264 160590 DEBUG oslo_service.service [-] ironic.insecure = 
False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:23:56 localhost ovn_metadata_agent[160585]: 2025-12-15 09:23:56.265 160590 DEBUG oslo_service.service [-] ironic.interface = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:23:56 localhost ovn_metadata_agent[160585]: 2025-12-15 09:23:56.265 160590 DEBUG oslo_service.service [-] ironic.keyfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:23:56 localhost ovn_metadata_agent[160585]: 2025-12-15 09:23:56.265 160590 DEBUG oslo_service.service [-] ironic.max_version = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:23:56 localhost ovn_metadata_agent[160585]: 2025-12-15 09:23:56.266 160590 DEBUG oslo_service.service [-] ironic.min_version = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:23:56 localhost ovn_metadata_agent[160585]: 2025-12-15 09:23:56.266 160590 DEBUG oslo_service.service [-] ironic.region_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:23:56 localhost ovn_metadata_agent[160585]: 2025-12-15 09:23:56.266 160590 DEBUG oslo_service.service [-] ironic.service_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:23:56 localhost ovn_metadata_agent[160585]: 2025-12-15 09:23:56.266 160590 DEBUG oslo_service.service [-] ironic.service_type = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:23:56 localhost ovn_metadata_agent[160585]: 2025-12-15 09:23:56.267 160590 DEBUG oslo_service.service [-] ironic.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:23:56 localhost ovn_metadata_agent[160585]: 2025-12-15 09:23:56.267 160590 DEBUG oslo_service.service [-] 
ironic.status_code_retries = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:23:56 localhost ovn_metadata_agent[160585]: 2025-12-15 09:23:56.267 160590 DEBUG oslo_service.service [-] ironic.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:23:56 localhost ovn_metadata_agent[160585]: 2025-12-15 09:23:56.267 160590 DEBUG oslo_service.service [-] ironic.timeout = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:23:56 localhost ovn_metadata_agent[160585]: 2025-12-15 09:23:56.268 160590 DEBUG oslo_service.service [-] ironic.valid_interfaces = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:23:56 localhost ovn_metadata_agent[160585]: 2025-12-15 09:23:56.268 160590 DEBUG oslo_service.service [-] ironic.version = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:23:56 localhost ovn_metadata_agent[160585]: 2025-12-15 09:23:56.268 160590 DEBUG oslo_service.service [-] cli_script.dry_run = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:23:56 localhost ovn_metadata_agent[160585]: 2025-12-15 09:23:56.268 160590 DEBUG oslo_service.service [-] ovn.allow_stateless_action_supported = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:23:56 localhost ovn_metadata_agent[160585]: 2025-12-15 09:23:56.268 160590 DEBUG oslo_service.service [-] ovn.dhcp_default_lease_time = 43200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:23:56 localhost ovn_metadata_agent[160585]: 2025-12-15 09:23:56.269 160590 DEBUG oslo_service.service [-] ovn.disable_ovn_dhcp_for_baremetal_ports = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:23:56 localhost 
ovn_metadata_agent[160585]: 2025-12-15 09:23:56.269 160590 DEBUG oslo_service.service [-] ovn.dns_servers = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:23:56 localhost ovn_metadata_agent[160585]: 2025-12-15 09:23:56.269 160590 DEBUG oslo_service.service [-] ovn.enable_distributed_floating_ip = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:23:56 localhost ovn_metadata_agent[160585]: 2025-12-15 09:23:56.269 160590 DEBUG oslo_service.service [-] ovn.neutron_sync_mode = log log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:23:56 localhost ovn_metadata_agent[160585]: 2025-12-15 09:23:56.270 160590 DEBUG oslo_service.service [-] ovn.ovn_dhcp4_global_options = {} log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:23:56 localhost ovn_metadata_agent[160585]: 2025-12-15 09:23:56.270 160590 DEBUG oslo_service.service [-] ovn.ovn_dhcp6_global_options = {} log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:23:56 localhost ovn_metadata_agent[160585]: 2025-12-15 09:23:56.270 160590 DEBUG oslo_service.service [-] ovn.ovn_emit_need_to_frag = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:23:56 localhost ovn_metadata_agent[160585]: 2025-12-15 09:23:56.270 160590 DEBUG oslo_service.service [-] ovn.ovn_l3_mode = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:23:56 localhost ovn_metadata_agent[160585]: 2025-12-15 09:23:56.271 160590 DEBUG oslo_service.service [-] ovn.ovn_l3_scheduler = leastloaded log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:23:56 localhost ovn_metadata_agent[160585]: 2025-12-15 09:23:56.271 160590 DEBUG oslo_service.service [-] ovn.ovn_metadata_enabled = False log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:23:56 localhost ovn_metadata_agent[160585]: 2025-12-15 09:23:56.271 160590 DEBUG oslo_service.service [-] ovn.ovn_nb_ca_cert = log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:23:56 localhost ovn_metadata_agent[160585]: 2025-12-15 09:23:56.271 160590 DEBUG oslo_service.service [-] ovn.ovn_nb_certificate = log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:23:56 localhost ovn_metadata_agent[160585]: 2025-12-15 09:23:56.271 160590 DEBUG oslo_service.service [-] ovn.ovn_nb_connection = tcp:127.0.0.1:6641 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:23:56 localhost ovn_metadata_agent[160585]: 2025-12-15 09:23:56.271 160590 DEBUG oslo_service.service [-] ovn.ovn_nb_private_key = log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:23:56 localhost ovn_metadata_agent[160585]: 2025-12-15 09:23:56.271 160590 DEBUG oslo_service.service [-] ovn.ovn_sb_ca_cert = log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:23:56 localhost ovn_metadata_agent[160585]: 2025-12-15 09:23:56.272 160590 DEBUG oslo_service.service [-] ovn.ovn_sb_certificate = log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:23:56 localhost ovn_metadata_agent[160585]: 2025-12-15 09:23:56.272 160590 DEBUG oslo_service.service [-] ovn.ovn_sb_connection = tcp:ovsdbserver-sb.openstack.svc:6642 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:23:56 localhost ovn_metadata_agent[160585]: 2025-12-15 09:23:56.272 160590 DEBUG oslo_service.service [-] ovn.ovn_sb_private_key = log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:23:56 localhost ovn_metadata_agent[160585]: 2025-12-15 09:23:56.272 160590 DEBUG oslo_service.service [-] 
ovn.ovsdb_connection_timeout = 180 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:23:56 localhost ovn_metadata_agent[160585]: 2025-12-15 09:23:56.272 160590 DEBUG oslo_service.service [-] ovn.ovsdb_log_level = INFO log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:23:56 localhost ovn_metadata_agent[160585]: 2025-12-15 09:23:56.272 160590 DEBUG oslo_service.service [-] ovn.ovsdb_probe_interval = 60000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:23:56 localhost ovn_metadata_agent[160585]: 2025-12-15 09:23:56.272 160590 DEBUG oslo_service.service [-] ovn.ovsdb_retry_max_interval = 180 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:23:56 localhost ovn_metadata_agent[160585]: 2025-12-15 09:23:56.273 160590 DEBUG oslo_service.service [-] ovn.vhost_sock_dir = /var/run/openvswitch log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:23:56 localhost ovn_metadata_agent[160585]: 2025-12-15 09:23:56.273 160590 DEBUG oslo_service.service [-] ovn.vif_type = ovs log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:23:56 localhost ovn_metadata_agent[160585]: 2025-12-15 09:23:56.273 160590 DEBUG oslo_service.service [-] OVS.bridge_mac_table_size = 50000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:23:56 localhost ovn_metadata_agent[160585]: 2025-12-15 09:23:56.273 160590 DEBUG oslo_service.service [-] OVS.igmp_snooping_enable = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:23:56 localhost ovn_metadata_agent[160585]: 2025-12-15 09:23:56.273 160590 DEBUG oslo_service.service [-] OVS.ovsdb_timeout = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:23:56 localhost ovn_metadata_agent[160585]: 2025-12-15 
09:23:56.273 160590 DEBUG oslo_service.service [-] ovs.ovsdb_connection = tcp:127.0.0.1:6640 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:23:56 localhost ovn_metadata_agent[160585]: 2025-12-15 09:23:56.274 160590 DEBUG oslo_service.service [-] ovs.ovsdb_connection_timeout = 180 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:23:56 localhost ovn_metadata_agent[160585]: 2025-12-15 09:23:56.274 160590 DEBUG oslo_service.service [-] oslo_messaging_rabbit.amqp_auto_delete = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:23:56 localhost ovn_metadata_agent[160585]: 2025-12-15 09:23:56.274 160590 DEBUG oslo_service.service [-] oslo_messaging_rabbit.amqp_durable_queues = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:23:56 localhost ovn_metadata_agent[160585]: 2025-12-15 09:23:56.274 160590 DEBUG oslo_service.service [-] oslo_messaging_rabbit.conn_pool_min_size = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:23:56 localhost ovn_metadata_agent[160585]: 2025-12-15 09:23:56.274 160590 DEBUG oslo_service.service [-] oslo_messaging_rabbit.conn_pool_ttl = 1200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:23:56 localhost ovn_metadata_agent[160585]: 2025-12-15 09:23:56.274 160590 DEBUG oslo_service.service [-] oslo_messaging_rabbit.direct_mandatory_flag = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:23:56 localhost ovn_metadata_agent[160585]: 2025-12-15 09:23:56.275 160590 DEBUG oslo_service.service [-] oslo_messaging_rabbit.enable_cancel_on_failover = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:23:56 localhost ovn_metadata_agent[160585]: 2025-12-15 09:23:56.275 160590 DEBUG oslo_service.service [-] 
oslo_messaging_rabbit.heartbeat_in_pthread = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:23:56 localhost ovn_metadata_agent[160585]: 2025-12-15 09:23:56.275 160590 DEBUG oslo_service.service [-] oslo_messaging_rabbit.heartbeat_rate = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:23:56 localhost ovn_metadata_agent[160585]: 2025-12-15 09:23:56.275 160590 DEBUG oslo_service.service [-] oslo_messaging_rabbit.heartbeat_timeout_threshold = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:23:56 localhost ovn_metadata_agent[160585]: 2025-12-15 09:23:56.275 160590 DEBUG oslo_service.service [-] oslo_messaging_rabbit.kombu_compression = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:23:56 localhost ovn_metadata_agent[160585]: 2025-12-15 09:23:56.275 160590 DEBUG oslo_service.service [-] oslo_messaging_rabbit.kombu_failover_strategy = round-robin log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:23:56 localhost ovn_metadata_agent[160585]: 2025-12-15 09:23:56.276 160590 DEBUG oslo_service.service [-] oslo_messaging_rabbit.kombu_missing_consumer_retry_timeout = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:23:56 localhost ovn_metadata_agent[160585]: 2025-12-15 09:23:56.276 160590 DEBUG oslo_service.service [-] oslo_messaging_rabbit.kombu_reconnect_delay = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:23:56 localhost ovn_metadata_agent[160585]: 2025-12-15 09:23:56.276 160590 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_ha_queues = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:23:56 localhost ovn_metadata_agent[160585]: 2025-12-15 09:23:56.276 160590 DEBUG oslo_service.service [-] 
oslo_messaging_rabbit.rabbit_interval_max = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:23:56 localhost ovn_metadata_agent[160585]: 2025-12-15 09:23:56.276 160590 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_login_method = AMQPLAIN log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:23:56 localhost ovn_metadata_agent[160585]: 2025-12-15 09:23:56.276 160590 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_qos_prefetch_count = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:23:56 localhost ovn_metadata_agent[160585]: 2025-12-15 09:23:56.276 160590 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_quorum_delivery_limit = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:23:56 localhost ovn_metadata_agent[160585]: 2025-12-15 09:23:56.277 160590 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_quorum_max_memory_bytes = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:23:56 localhost ovn_metadata_agent[160585]: 2025-12-15 09:23:56.277 160590 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_quorum_max_memory_length = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:23:56 localhost ovn_metadata_agent[160585]: 2025-12-15 09:23:56.277 160590 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_quorum_queue = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:23:56 localhost ovn_metadata_agent[160585]: 2025-12-15 09:23:56.277 160590 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_retry_backoff = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:23:56 localhost ovn_metadata_agent[160585]: 2025-12-15 09:23:56.277 160590 DEBUG oslo_service.service [-] 
oslo_messaging_rabbit.rabbit_retry_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:23:56 localhost ovn_metadata_agent[160585]: 2025-12-15 09:23:56.277 160590 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_transient_queues_ttl = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:23:56 localhost ovn_metadata_agent[160585]: 2025-12-15 09:23:56.278 160590 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rpc_conn_pool_size = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:23:56 localhost ovn_metadata_agent[160585]: 2025-12-15 09:23:56.278 160590 DEBUG oslo_service.service [-] oslo_messaging_rabbit.ssl = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:23:56 localhost ovn_metadata_agent[160585]: 2025-12-15 09:23:56.278 160590 DEBUG oslo_service.service [-] oslo_messaging_rabbit.ssl_ca_file = log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:23:56 localhost ovn_metadata_agent[160585]: 2025-12-15 09:23:56.278 160590 DEBUG oslo_service.service [-] oslo_messaging_rabbit.ssl_cert_file = log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:23:56 localhost ovn_metadata_agent[160585]: 2025-12-15 09:23:56.278 160590 DEBUG oslo_service.service [-] oslo_messaging_rabbit.ssl_enforce_fips_mode = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:23:56 localhost ovn_metadata_agent[160585]: 2025-12-15 09:23:56.278 160590 DEBUG oslo_service.service [-] oslo_messaging_rabbit.ssl_key_file = log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:23:56 localhost ovn_metadata_agent[160585]: 2025-12-15 09:23:56.278 160590 DEBUG oslo_service.service [-] oslo_messaging_rabbit.ssl_version = log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:23:56 localhost ovn_metadata_agent[160585]: 2025-12-15 09:23:56.279 160590 DEBUG oslo_service.service [-] oslo_messaging_notifications.driver = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:23:56 localhost ovn_metadata_agent[160585]: 2025-12-15 09:23:56.279 160590 DEBUG oslo_service.service [-] oslo_messaging_notifications.retry = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:23:56 localhost ovn_metadata_agent[160585]: 2025-12-15 09:23:56.279 160590 DEBUG oslo_service.service [-] oslo_messaging_notifications.topics = ['notifications'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:23:56 localhost ovn_metadata_agent[160585]: 2025-12-15 09:23:56.279 160590 DEBUG oslo_service.service [-] oslo_messaging_notifications.transport_url = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:23:56 localhost ovn_metadata_agent[160585]: 2025-12-15 09:23:56.279 160590 DEBUG oslo_service.service [-] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613#033[00m Dec 15 04:23:58 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:fc:1f:13 MACDST=fa:16:3e:b3:bb:e3 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=27985 DF PROTO=TCP SPT=48424 DPT=9101 SEQ=177924105 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A839528660000000001030307) Dec 15 04:23:59 localhost sshd[160984]: main: sshd: ssh-rsa algorithm is disabled Dec 15 04:23:59 localhost systemd-logind[763]: New session 52 of user zuul. Dec 15 04:23:59 localhost systemd[1]: Started Session 52 of User zuul. 
Dec 15 04:24:00 localhost python3.9[161077]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d Dec 15 04:24:01 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:fc:1f:13 MACDST=fa:16:3e:b3:bb:e3 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=18252 DF PROTO=TCP SPT=56306 DPT=9882 SEQ=4283742383 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A839534090000000001030307) Dec 15 04:24:01 localhost python3.9[161173]: ansible-ansible.legacy.command Invoked with _raw_params=podman ps -a --filter name=^nova_virtlogd$ --format \{\{.Names\}\} _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None Dec 15 04:24:02 localhost python3.9[161278]: ansible-ansible.legacy.command Invoked with _raw_params=podman stop nova_virtlogd _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None Dec 15 04:24:02 localhost systemd[1]: libpod-92b280dfcbf579f50e259592e35912e53a86821f523ea224be339f770852945b.scope: Deactivated successfully. 
Dec 15 04:24:02 localhost podman[161279]: 2025-12-15 09:24:02.843155435 +0000 UTC m=+0.079631973 container died 92b280dfcbf579f50e259592e35912e53a86821f523ea224be339f770852945b (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtlogd, batch=17.1_20251118.1, com.redhat.component=openstack-nova-libvirt-container, io.buildah.version=1.41.4, name=rhosp17/openstack-nova-libvirt, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, build-date=2025-11-19T00:35:22Z, konflux.additional-tags=17.1.12 17.1_20251118.1, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, version=17.1.12, io.openshift.expose-services=, release=1761123044, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, vcs-type=git, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 nova-libvirt, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, distribution-scope=public) Dec 15 04:24:02 localhost systemd[1]: tmp-crun.Gb7ECQ.mount: Deactivated successfully. 
Dec 15 04:24:02 localhost podman[161279]: 2025-12-15 09:24:02.874400702 +0000 UTC m=+0.110877210 container cleanup 92b280dfcbf579f50e259592e35912e53a86821f523ea224be339f770852945b (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtlogd, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, build-date=2025-11-19T00:35:22Z, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, com.redhat.component=openstack-nova-libvirt-container, io.buildah.version=1.41.4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, vcs-type=git, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 nova-libvirt, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, name=rhosp17/openstack-nova-libvirt, release=1761123044, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, tcib_managed=true, url=https://www.redhat.com, batch=17.1_20251118.1, distribution-scope=public, konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, version=17.1.12, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05) Dec 15 04:24:02 localhost podman[161294]: 2025-12-15 09:24:02.925729896 +0000 UTC m=+0.075478431 container remove 92b280dfcbf579f50e259592e35912e53a86821f523ea224be339f770852945b (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtlogd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, com.redhat.component=openstack-nova-libvirt-container, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, 
name=rhosp17/openstack-nova-libvirt, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, io.openshift.expose-services=, distribution-scope=public, io.buildah.version=1.41.4, vendor=Red Hat, Inc., batch=17.1_20251118.1, build-date=2025-11-19T00:35:22Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d, url=https://www.redhat.com, release=1761123044, tcib_managed=true, konflux.additional-tags=17.1.12 17.1_20251118.1, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, version=17.1.12, maintainer=OpenStack TripleO Team, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 nova-libvirt, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt) Dec 15 04:24:02 localhost systemd[1]: libpod-conmon-92b280dfcbf579f50e259592e35912e53a86821f523ea224be339f770852945b.scope: Deactivated successfully. Dec 15 04:24:03 localhost systemd[1]: var-lib-containers-storage-overlay-87a1920e0c677ba0eff7b258724290a21d0d758cce1c2e79cf986e2254064ecd-merged.mount: Deactivated successfully. Dec 15 04:24:03 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-92b280dfcbf579f50e259592e35912e53a86821f523ea224be339f770852945b-userdata-shm.mount: Deactivated successfully. Dec 15 04:24:04 localhost python3.9[161402]: ansible-ansible.builtin.systemd_service Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None Dec 15 04:24:04 localhost systemd[1]: Reloading. Dec 15 04:24:04 localhost systemd-rc-local-generator[161424]: /etc/rc.d/rc.local is not marked executable, skipping. Dec 15 04:24:04 localhost systemd-sysv-generator[161430]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. 
Please update package to include a native systemd unit file, in order to make it more safe and robust. Dec 15 04:24:04 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Dec 15 04:24:04 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:fc:1f:13 MACDST=fa:16:3e:b3:bb:e3 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=18254 DF PROTO=TCP SPT=56306 DPT=9882 SEQ=4283742383 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A839540250000000001030307) Dec 15 04:24:05 localhost python3.9[161527]: ansible-ansible.builtin.service_facts Invoked Dec 15 04:24:05 localhost network[161544]: You are using 'network' service provided by 'network-scripts', which are now deprecated. Dec 15 04:24:05 localhost network[161545]: 'network-scripts' will be removed from distribution in near future. Dec 15 04:24:05 localhost network[161546]: It is advised to switch to 'NetworkManager' instead for network management. Dec 15 04:24:07 localhost systemd[1]: /usr/lib/systemd/system/insights-client.service:23: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Dec 15 04:24:07 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:fc:1f:13 MACDST=fa:16:3e:b3:bb:e3 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=10259 DF PROTO=TCP SPT=45524 DPT=9105 SEQ=702366019 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A83954BA60000000001030307) Dec 15 04:24:08 localhost systemd[1]: Started /usr/bin/podman healthcheck run ac2eae30a2a7f971398af42ea67b3ed799d4b3341f7688ebcd73f7ca7f66c2a8. 
Dec 15 04:24:08 localhost podman[161670]: 2025-12-15 09:24:08.76923049 +0000 UTC m=+0.099107285 container health_status ac2eae30a2a7f971398af42ea67b3ed799d4b3341f7688ebcd73f7ca7f66c2a8 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, container_name=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '07f3af15d79276418f54a8dde2fc251064527c3571430e14af77aaa2c01f1193'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller) Dec 15 04:24:08 localhost podman[161670]: 2025-12-15 09:24:08.840448088 +0000 UTC m=+0.170324843 container exec_died ac2eae30a2a7f971398af42ea67b3ed799d4b3341f7688ebcd73f7ca7f66c2a8 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, 
io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '07f3af15d79276418f54a8dde2fc251064527c3571430e14af77aaa2c01f1193'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, org.label-schema.license=GPLv2, container_name=ovn_controller, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team) Dec 15 04:24:08 localhost systemd[1]: ac2eae30a2a7f971398af42ea67b3ed799d4b3341f7688ebcd73f7ca7f66c2a8.service: Deactivated successfully. Dec 15 04:24:09 localhost python3.9[161771]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_libvirt.target state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None Dec 15 04:24:09 localhost systemd[1]: Reloading. Dec 15 04:24:09 localhost systemd-sysv-generator[161798]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. Dec 15 04:24:09 localhost systemd-rc-local-generator[161794]: /etc/rc.d/rc.local is not marked executable, skipping. 
Dec 15 04:24:09 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Dec 15 04:24:09 localhost systemd[1]: Stopped target tripleo_nova_libvirt.target. Dec 15 04:24:10 localhost python3.9[161902]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_virtlogd_wrapper.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None Dec 15 04:24:11 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:fc:1f:13 MACDST=fa:16:3e:b3:bb:e3 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=27987 DF PROTO=TCP SPT=48424 DPT=9101 SEQ=177924105 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A839559250000000001030307) Dec 15 04:24:11 localhost python3.9[161995]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_virtnodedevd.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None Dec 15 04:24:12 localhost python3.9[162088]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_virtproxyd.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None Dec 15 04:24:12 localhost python3.9[162181]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_virtqemud.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None Dec 15 04:24:13 localhost python3.9[162274]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_virtsecretd.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None Dec 15 04:24:14 localhost python3.9[162367]: ansible-ansible.builtin.systemd_service Invoked with 
enabled=False name=tripleo_nova_virtstoraged.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None Dec 15 04:24:17 localhost python3.9[162537]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_libvirt.target state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Dec 15 04:24:17 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:fc:1f:13 MACDST=fa:16:3e:b3:bb:e3 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=18256 DF PROTO=TCP SPT=56306 DPT=9882 SEQ=4283742383 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A839571250000000001030307) Dec 15 04:24:18 localhost python3.9[162629]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_virtlogd_wrapper.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Dec 15 04:24:19 localhost python3.9[162721]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_virtnodedevd.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Dec 15 04:24:20 localhost python3.9[162813]: 
ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_virtproxyd.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Dec 15 04:24:20 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:fc:1f:13 MACDST=fa:16:3e:b3:bb:e3 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=10261 DF PROTO=TCP SPT=45524 DPT=9105 SEQ=702366019 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A83957B250000000001030307) Dec 15 04:24:20 localhost systemd[1]: Started /usr/bin/podman healthcheck run 4d46c406a3855fd1c322e5d86c048188ea5865daa07ba23aaebacc7879668923. Dec 15 04:24:20 localhost podman[162906]: 2025-12-15 09:24:20.505438066 +0000 UTC m=+0.090042134 container health_status 4d46c406a3855fd1c322e5d86c048188ea5865daa07ba23aaebacc7879668923 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '07f3af15d79276418f54a8dde2fc251064527c3571430e14af77aaa2c01f1193-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': 
'/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.license=GPLv2) Dec 15 04:24:20 localhost podman[162906]: 2025-12-15 09:24:20.550327574 +0000 UTC m=+0.134931612 container exec_died 4d46c406a3855fd1c322e5d86c048188ea5865daa07ba23aaebacc7879668923 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '07f3af15d79276418f54a8dde2fc251064527c3571430e14af77aaa2c01f1193-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 
'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ovn_metadata_agent, org.label-schema.schema-version=1.0, tcib_managed=true) Dec 15 04:24:20 localhost systemd[1]: 4d46c406a3855fd1c322e5d86c048188ea5865daa07ba23aaebacc7879668923.service: Deactivated successfully. Dec 15 04:24:20 localhost python3.9[162905]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_virtqemud.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Dec 15 04:24:21 localhost python3.9[163017]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_virtsecretd.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Dec 15 04:24:21 localhost python3.9[163109]: ansible-ansible.builtin.file Invoked with 
path=/usr/lib/systemd/system/tripleo_nova_virtstoraged.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Dec 15 04:24:22 localhost python3.9[163201]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_libvirt.target state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Dec 15 04:24:23 localhost python3.9[163293]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_virtlogd_wrapper.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Dec 15 04:24:23 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:fc:1f:13 MACDST=fa:16:3e:b3:bb:e3 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=3832 DF PROTO=TCP SPT=39246 DPT=9100 SEQ=3099863631 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A839587920000000001030307) Dec 15 04:24:23 localhost python3.9[163385]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_virtnodedevd.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None 
_diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Dec 15 04:24:24 localhost python3.9[163477]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_virtproxyd.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Dec 15 04:24:24 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:fc:1f:13 MACDST=fa:16:3e:b3:bb:e3 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=3833 DF PROTO=TCP SPT=39246 DPT=9100 SEQ=3099863631 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A83958BA50000000001030307) Dec 15 04:24:24 localhost python3.9[163569]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_virtqemud.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Dec 15 04:24:25 localhost python3.9[163661]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_virtsecretd.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Dec 15 04:24:25 localhost sshd[163721]: main: sshd: ssh-rsa algorithm is disabled 
Dec 15 04:24:25 localhost python3.9[163754]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_virtstoraged.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Dec 15 04:24:25 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:fc:1f:13 MACDST=fa:16:3e:b3:bb:e3 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=63780 DF PROTO=TCP SPT=53054 DPT=9102 SEQ=1889845080 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A839591E50000000001030307) Dec 15 04:24:27 localhost python3.9[163847]: ansible-ansible.legacy.command Invoked with _raw_params=if systemctl is-active certmonger.service; then#012 systemctl disable --now certmonger.service#012 test -f /etc/systemd/system/certmonger.service || systemctl mask certmonger.service#012fi#012 _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None Dec 15 04:24:28 localhost python3.9[163939]: ansible-ansible.builtin.find Invoked with file_type=any hidden=True paths=['/var/lib/certmonger/requests'] patterns=[] read_whole_file=False age_stamp=mtime recurse=False follow=False get_checksum=False checksum_algorithm=sha1 use_regex=False exact_mode=True excludes=None contains=None age=None size=None depth=None mode=None encoding=None limit=None Dec 15 04:24:28 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:fc:1f:13 MACDST=fa:16:3e:b3:bb:e3 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=44980 DF PROTO=TCP SPT=55472 DPT=9101 SEQ=1239237575 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT 
(020405500402080A83959DA50000000001030307) Dec 15 04:24:29 localhost python3.9[164031]: ansible-ansible.builtin.systemd_service Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None Dec 15 04:24:29 localhost systemd[1]: Reloading. Dec 15 04:24:29 localhost systemd-rc-local-generator[164054]: /etc/rc.d/rc.local is not marked executable, skipping. Dec 15 04:24:29 localhost systemd-sysv-generator[164062]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. Dec 15 04:24:29 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Dec 15 04:24:30 localhost python3.9[164159]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_libvirt.target _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None Dec 15 04:24:31 localhost python3.9[164252]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_virtlogd_wrapper.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None Dec 15 04:24:31 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:fc:1f:13 MACDST=fa:16:3e:b3:bb:e3 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=57663 DF PROTO=TCP SPT=49254 DPT=9882 SEQ=3866463549 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A8395A9390000000001030307) Dec 15 04:24:31 localhost python3.9[164345]: 
ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_virtnodedevd.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None Dec 15 04:24:32 localhost python3.9[164438]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_virtproxyd.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None Dec 15 04:24:32 localhost python3.9[164531]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_virtqemud.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None Dec 15 04:24:33 localhost python3.9[164624]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_virtsecretd.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None Dec 15 04:24:34 localhost python3.9[164717]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_virtstoraged.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None Dec 15 04:24:34 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:fc:1f:13 MACDST=fa:16:3e:b3:bb:e3 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=57665 DF PROTO=TCP SPT=49254 DPT=9882 SEQ=3866463549 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A8395B5250000000001030307) Dec 15 04:24:35 
localhost python3.9[164810]: ansible-ansible.builtin.getent Invoked with database=passwd key=libvirt fail_key=True service=None split=None Dec 15 04:24:37 localhost python3.9[164903]: ansible-ansible.builtin.group Invoked with gid=42473 name=libvirt state=present force=False system=False local=False non_unique=False gid_min=None gid_max=None Dec 15 04:24:37 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:fc:1f:13 MACDST=fa:16:3e:b3:bb:e3 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=19004 DF PROTO=TCP SPT=48684 DPT=9105 SEQ=4021504194 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A8395C0E60000000001030307) Dec 15 04:24:38 localhost python3.9[165001]: ansible-ansible.builtin.user Invoked with comment=libvirt user group=libvirt groups=[''] name=libvirt shell=/sbin/nologin state=present uid=42473 non_unique=False force=False remove=False create_home=True system=False move_home=False append=False ssh_key_bits=0 ssh_key_type=rsa ssh_key_comment=ansible-generated on np0005559462.localdomain update_password=always home=None password=NOT_LOGGING_PARAMETER login_class=None password_expire_max=None password_expire_min=None password_expire_warn=None hidden=None seuser=None skeleton=None generate_ssh_key=None ssh_key_file=None ssh_key_passphrase=NOT_LOGGING_PARAMETER expires=None password_lock=None local=None profile=None authorization=None role=None umask=None password_expire_account_disable=None uid_min=None uid_max=None Dec 15 04:24:39 localhost systemd[1]: Started /usr/bin/podman healthcheck run ac2eae30a2a7f971398af42ea67b3ed799d4b3341f7688ebcd73f7ca7f66c2a8. 
Dec 15 04:24:39 localhost podman[165102]: 2025-12-15 09:24:39.356384698 +0000 UTC m=+0.087208908 container health_status ac2eae30a2a7f971398af42ea67b3ed799d4b3341f7688ebcd73f7ca7f66c2a8 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '07f3af15d79276418f54a8dde2fc251064527c3571430e14af77aaa2c01f1193'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, config_id=ovn_controller, container_name=ovn_controller, tcib_managed=true, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, managed_by=edpm_ansible) Dec 15 04:24:39 localhost podman[165102]: 2025-12-15 09:24:39.396267072 +0000 UTC m=+0.127091302 container exec_died ac2eae30a2a7f971398af42ea67b3ed799d4b3341f7688ebcd73f7ca7f66c2a8 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, tcib_managed=true, 
maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '07f3af15d79276418f54a8dde2fc251064527c3571430e14af77aaa2c01f1193'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image) Dec 15 04:24:39 localhost systemd[1]: ac2eae30a2a7f971398af42ea67b3ed799d4b3341f7688ebcd73f7ca7f66c2a8.service: Deactivated successfully. 
Dec 15 04:24:39 localhost python3.9[165101]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d Dec 15 04:24:40 localhost python3.9[165180]: ansible-ansible.legacy.dnf Invoked with name=['libvirt ', 'libvirt-admin ', 'libvirt-client ', 'libvirt-daemon ', 'qemu-kvm', 'qemu-img', 'libguestfs', 'libseccomp', 'swtpm', 'swtpm-tools', 'edk2-ovmf', 'ceph-common', 'cyrus-sasl-scram'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None Dec 15 04:24:41 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:fc:1f:13 MACDST=fa:16:3e:b3:bb:e3 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=44982 DF PROTO=TCP SPT=55472 DPT=9101 SEQ=1239237575 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A8395CD250000000001030307) Dec 15 04:24:47 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:fc:1f:13 MACDST=fa:16:3e:b3:bb:e3 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=57667 DF PROTO=TCP SPT=49254 DPT=9882 SEQ=3866463549 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A8395E5250000000001030307) Dec 15 04:24:50 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:fc:1f:13 MACDST=fa:16:3e:b3:bb:e3 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=19006 DF PROTO=TCP SPT=48684 DPT=9105 SEQ=4021504194 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A8395F1250000000001030307) Dec 15 04:24:50 
localhost systemd[1]: Started /usr/bin/podman healthcheck run 4d46c406a3855fd1c322e5d86c048188ea5865daa07ba23aaebacc7879668923. Dec 15 04:24:50 localhost podman[165252]: 2025-12-15 09:24:50.761711721 +0000 UTC m=+0.081247049 container health_status 4d46c406a3855fd1c322e5d86c048188ea5865daa07ba23aaebacc7879668923 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '07f3af15d79276418f54a8dde2fc251064527c3571430e14af77aaa2c01f1193-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, 
org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb) Dec 15 04:24:50 localhost podman[165252]: 2025-12-15 09:24:50.77028724 +0000 UTC m=+0.089822568 container exec_died 4d46c406a3855fd1c322e5d86c048188ea5865daa07ba23aaebacc7879668923 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, managed_by=edpm_ansible, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '07f3af15d79276418f54a8dde2fc251064527c3571430e14af77aaa2c01f1193-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, 
container_name=ovn_metadata_agent, io.buildah.version=1.41.3) Dec 15 04:24:50 localhost systemd[1]: 4d46c406a3855fd1c322e5d86c048188ea5865daa07ba23aaebacc7879668923.service: Deactivated successfully. Dec 15 04:24:51 localhost ovn_metadata_agent[160585]: 2025-12-15 09:24:51.427 160590 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Dec 15 04:24:51 localhost ovn_metadata_agent[160585]: 2025-12-15 09:24:51.428 160590 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Dec 15 04:24:51 localhost ovn_metadata_agent[160585]: 2025-12-15 09:24:51.429 160590 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Dec 15 04:24:53 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:fc:1f:13 MACDST=fa:16:3e:b3:bb:e3 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=17430 DF PROTO=TCP SPT=57656 DPT=9100 SEQ=1457146967 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A8395FCC30000000001030307) Dec 15 04:24:54 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:fc:1f:13 MACDST=fa:16:3e:b3:bb:e3 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=17431 DF PROTO=TCP SPT=57656 DPT=9100 SEQ=1457146967 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A839600E50000000001030307) Dec 15 04:24:56 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:fc:1f:13 MACDST=fa:16:3e:b3:bb:e3 MACPROTO=0800 
SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=17432 DF PROTO=TCP SPT=57656 DPT=9100 SEQ=1457146967 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A839608E50000000001030307) Dec 15 04:24:58 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:fc:1f:13 MACDST=fa:16:3e:b3:bb:e3 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=15014 DF PROTO=TCP SPT=46552 DPT=9101 SEQ=2948986194 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A839612E50000000001030307) Dec 15 04:25:01 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:fc:1f:13 MACDST=fa:16:3e:b3:bb:e3 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=24833 DF PROTO=TCP SPT=50724 DPT=9882 SEQ=2881148762 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A83961E6A0000000001030307) Dec 15 04:25:04 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:fc:1f:13 MACDST=fa:16:3e:b3:bb:e3 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=24835 DF PROTO=TCP SPT=50724 DPT=9882 SEQ=2881148762 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A83962A660000000001030307) Dec 15 04:25:05 localhost kernel: SELinux: Converting 2759 SID table entries... Dec 15 04:25:05 localhost kernel: SELinux: Context system_u:object_r:insights_client_cache_t:s0 became invalid (unmapped). 
Dec 15 04:25:05 localhost kernel: SELinux: policy capability network_peer_controls=1 Dec 15 04:25:05 localhost kernel: SELinux: policy capability open_perms=1 Dec 15 04:25:05 localhost kernel: SELinux: policy capability extended_socket_class=1 Dec 15 04:25:05 localhost kernel: SELinux: policy capability always_check_network=0 Dec 15 04:25:05 localhost kernel: SELinux: policy capability cgroup_seclabel=1 Dec 15 04:25:05 localhost kernel: SELinux: policy capability nnp_nosuid_transition=1 Dec 15 04:25:05 localhost kernel: SELinux: policy capability genfs_seclabel_symlinks=1 Dec 15 04:25:05 localhost ceph-osd[31375]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS ------- Dec 15 04:25:05 localhost ceph-osd[31375]: rocksdb: [db/db_impl/db_impl.cc:1111] #012** DB Stats **#012Uptime(secs): 6000.1 total, 600.0 interval#012Cumulative writes: 4815 writes, 21K keys, 4815 commit groups, 1.0 writes per commit group, ingest: 0.02 GB, 0.00 MB/s#012Cumulative WAL: 4815 writes, 628 syncs, 7.67 writes per sync, written: 0.02 GB, 0.00 MB/s#012Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent#012Interval writes: 0 writes, 0 keys, 0 commit groups, 0.0 writes per commit group, ingest: 0.00 MB, 0.00 MB/s#012Interval WAL: 0 writes, 0 syncs, 0.00 writes per sync, written: 0.00 GB, 0.00 MB/s#012Interval stall: 00:00:0.000 H:M:S, 0.0 percent#012#012** Compaction Stats [default] **#012Level Files Size Score Read(GB) Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012 L0 2/0 2.61 KB 0.2 0.0 0.0 0.0 0.0 0.0 0.0 1.0 0.0 0.2 0.01 0.00 1 0.007 0 0 0.0 0.0#012 Sum 2/0 2.61 KB 0.0 0.0 0.0 0.0 0.0 0.0 0.0 1.0 0.0 0.2 0.01 0.00 1 0.007 0 0 0.0 0.0#012 Int 0/0 0.00 KB 0.0 0.0 0.0 0.0 0.0 0.0 0.0 
0.0 0.0 0.0 0.00 0.00 0 0.000 0 0 0.0 0.0#012#012** Compaction Stats [default] **#012Priority Files Size Score Read(GB) Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012User 0/0 0.00 KB 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.2 0.01 0.00 1 0.007 0 0 0.0 0.0#012#012Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0#012#012Uptime(secs): 6000.1 total, 4800.0 interval#012Flush(GB): cumulative 0.000, interval 0.000#012AddFile(GB): cumulative 0.000, interval 0.000#012AddFile(Total Files): cumulative 0, interval 0#012AddFile(L0 Files): cumulative 0, interval 0#012AddFile(Keys): cumulative 0, interval 0#012Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds#012Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds#012Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count#012Block cache BinnedLRUCache@0x5568990c62d0#2 capacity: 1.62 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 11 last_copies: 8 last_secs: 4e-05 secs_since: 0#012Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,8.34465e-05%) FilterBlock(3,0.33 KB,1.92569e-05%) IndexBlock(3,0.34 KB,2.01739e-05%) Misc(1,0.00 KB,0%)#012#012** File Read Latency Histogram By Level [default] **#012#012** Compaction Stats [m-0] **#012Level Files Size Score Read(GB) Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) 
Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012 Sum 0/0 0.00 KB 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.00 0.00 0 0.000 0 0 0.0 0.0#012 Int 0/0 0.00 KB 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.00 0.00 0 0.000 0 0 0.0 0.0#012#012** Compaction Stats [m-0] **#012Priority Files Size Score Read(GB) Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012#012Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0#012#012Uptime(secs): 6000.1 total, 4800.0 interval#012Flush(GB): cumulative 0.000, interval 0.000#012AddFile(GB): cumulative 0.000, interval 0.000#012AddFile(Total Files): cumulative 0, interval 0#012AddFile(L0 Files): cumulative 0, interval 0#012AddFile(Keys): cumulative 0, interval 0#012Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds#012Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds#012Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count#012Block cache BinnedLRUCache@0x5568990c62d0#2 capacity: 1.62 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 11 last_copies: 8 last_secs: 4e-05 secs_since: 0#012Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,8.34465e-05%) FilterBlock(3,0.33 KB,1.92569e-05%) 
IndexBlock(3,0.34 KB,2.01739e-05%) Misc(1,0.00 KB,0%)#012#012** File Read Latency Histogram By Level [m-0] **#012#012** Compaction Stats [m-1] **#012Level Files Size Score Read(GB) Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012 Sum 0/0 0.00 KB 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.00 0.00 0 0.000 0 0 0.0 0.0#012 Int 0/0 0.00 KB 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.00 0.00 0 0.000 0 0 0.0 0.0#012#012** Compaction Stats [m-1] **#012Priority Files Size Score Read(GB) Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012#012Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0#012#012Uptime(secs): 6000.1 total, 4800.0 interval#012Flush(GB): cumulative 0.000, interval 0.000#012AddFile(GB): cumulative 0.000, interval 0.000#012AddFile(Total Files): cumulative 0, interval 0#012AddFile(L0 Files): cumulative 0, interval 0#012AddFile(Keys): cumulative 0, interval 0#012Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds#012Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds#012Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdo Dec 15 04:25:07 localhost kernel: DROPPING: IN=br-ex 
OUT= MACSRC=fa:16:3e:fc:1f:13 MACDST=fa:16:3e:b3:bb:e3 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=23299 DF PROTO=TCP SPT=35064 DPT=9105 SEQ=2075611141 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A839636260000000001030307) Dec 15 04:25:09 localhost dbus-broker-launch[755]: avc: op=load_policy lsm=selinux seqno=19 res=1 Dec 15 04:25:09 localhost systemd[1]: Started /usr/bin/podman healthcheck run ac2eae30a2a7f971398af42ea67b3ed799d4b3341f7688ebcd73f7ca7f66c2a8. Dec 15 04:25:09 localhost podman[166319]: 2025-12-15 09:25:09.763676593 +0000 UTC m=+0.081969828 container health_status ac2eae30a2a7f971398af42ea67b3ed799d4b3341f7688ebcd73f7ca7f66c2a8 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '07f3af15d79276418f54a8dde2fc251064527c3571430e14af77aaa2c01f1193'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, 
org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202) Dec 15 04:25:09 localhost podman[166319]: 2025-12-15 09:25:09.806428735 +0000 UTC m=+0.124721990 container exec_died ac2eae30a2a7f971398af42ea67b3ed799d4b3341f7688ebcd73f7ca7f66c2a8 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '07f3af15d79276418f54a8dde2fc251064527c3571430e14af77aaa2c01f1193'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.build-date=20251202, io.buildah.version=1.41.3, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true) Dec 15 04:25:09 localhost systemd[1]: ac2eae30a2a7f971398af42ea67b3ed799d4b3341f7688ebcd73f7ca7f66c2a8.service: Deactivated successfully. 
Dec 15 04:25:10 localhost ceph-osd[32311]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS ------- Dec 15 04:25:10 localhost ceph-osd[32311]: rocksdb: [db/db_impl/db_impl.cc:1111] #012** DB Stats **#012Uptime(secs): 6000.2 total, 600.0 interval#012Cumulative writes: 5745 writes, 25K keys, 5745 commit groups, 1.0 writes per commit group, ingest: 0.02 GB, 0.00 MB/s#012Cumulative WAL: 5745 writes, 763 syncs, 7.53 writes per sync, written: 0.02 GB, 0.00 MB/s#012Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent#012Interval writes: 0 writes, 0 keys, 0 commit groups, 0.0 writes per commit group, ingest: 0.00 MB, 0.00 MB/s#012Interval WAL: 0 writes, 0 syncs, 0.00 writes per sync, written: 0.00 GB, 0.00 MB/s#012Interval stall: 00:00:0.000 H:M:S, 0.0 percent#012#012** Compaction Stats [default] **#012Level Files Size Score Read(GB) Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012 L0 2/0 2.61 KB 0.2 0.0 0.0 0.0 0.0 0.0 0.0 1.0 0.0 0.3 0.00 0.00 1 0.005 0 0 0.0 0.0#012 Sum 2/0 2.61 KB 0.0 0.0 0.0 0.0 0.0 0.0 0.0 1.0 0.0 0.3 0.00 0.00 1 0.005 0 0 0.0 0.0#012 Int 0/0 0.00 KB 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.00 0.00 0 0.000 0 0 0.0 0.0#012#012** Compaction Stats [default] **#012Priority Files Size Score Read(GB) Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012User 0/0 0.00 KB 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.3 0.00 0.00 1 0.005 0 0 0.0 0.0#012#012Blob file 
count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0#012#012Uptime(secs): 6000.2 total, 4800.0 interval#012Flush(GB): cumulative 0.000, interval 0.000#012AddFile(GB): cumulative 0.000, interval 0.000#012AddFile(Total Files): cumulative 0, interval 0#012AddFile(L0 Files): cumulative 0, interval 0#012AddFile(Keys): cumulative 0, interval 0#012Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds#012Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds#012Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count#012Block cache BinnedLRUCache@0x564edfcfa2d0#2 capacity: 1.62 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 11 last_copies: 8 last_secs: 9e-05 secs_since: 0#012Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,8.34465e-05%) FilterBlock(3,0.33 KB,1.92569e-05%) IndexBlock(3,0.34 KB,2.01739e-05%) Misc(1,0.00 KB,0%)#012#012** File Read Latency Histogram By Level [default] **#012#012** Compaction Stats [m-0] **#012Level Files Size Score Read(GB) Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012 Sum 0/0 0.00 KB 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.00 0.00 0 0.000 0 0 0.0 0.0#012 Int 0/0 0.00 KB 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.00 0.00 0 0.000 0 0 0.0 0.0#012#012** Compaction Stats [m-0] **#012Priority Files Size Score Read(GB) Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) 
CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012#012Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0#012#012Uptime(secs): 6000.2 total, 4800.0 interval#012Flush(GB): cumulative 0.000, interval 0.000#012AddFile(GB): cumulative 0.000, interval 0.000#012AddFile(Total Files): cumulative 0, interval 0#012AddFile(L0 Files): cumulative 0, interval 0#012AddFile(Keys): cumulative 0, interval 0#012Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds#012Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds#012Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count#012Block cache BinnedLRUCache@0x564edfcfa2d0#2 capacity: 1.62 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 11 last_copies: 8 last_secs: 9e-05 secs_since: 0#012Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,8.34465e-05%) FilterBlock(3,0.33 KB,1.92569e-05%) IndexBlock(3,0.34 KB,2.01739e-05%) Misc(1,0.00 KB,0%)#012#012** File Read Latency Histogram By Level [m-0] **#012#012** Compaction Stats [m-1] **#012Level Files Size Score Read(GB) Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012 Sum 0/0 0.00 KB 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 
0.0 0.00 0.00 0 0.000 0 0 0.0 0.0#012 Int 0/0 0.00 KB 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.00 0.00 0 0.000 0 0 0.0 0.0#012#012** Compaction Stats [m-1] **#012Priority Files Size Score Read(GB) Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012#012Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0#012#012Uptime(secs): 6000.2 total, 4800.0 interval#012Flush(GB): cumulative 0.000, interval 0.000#012AddFile(GB): cumulative 0.000, interval 0.000#012AddFile(Total Files): cumulative 0, interval 0#012AddFile(L0 Files): cumulative 0, interval 0#012AddFile(Keys): cumulative 0, interval 0#012Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds#012Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds#012Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdo Dec 15 04:25:11 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:fc:1f:13 MACDST=fa:16:3e:b3:bb:e3 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=15016 DF PROTO=TCP SPT=46552 DPT=9101 SEQ=2948986194 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A839643250000000001030307) Dec 15 04:25:15 localhost kernel: SELinux: Converting 2762 SID table entries... 
Dec 15 04:25:15 localhost kernel: SELinux: policy capability network_peer_controls=1 Dec 15 04:25:15 localhost kernel: SELinux: policy capability open_perms=1 Dec 15 04:25:15 localhost kernel: SELinux: policy capability extended_socket_class=1 Dec 15 04:25:15 localhost kernel: SELinux: policy capability always_check_network=0 Dec 15 04:25:15 localhost kernel: SELinux: policy capability cgroup_seclabel=1 Dec 15 04:25:15 localhost kernel: SELinux: policy capability nnp_nosuid_transition=1 Dec 15 04:25:15 localhost kernel: SELinux: policy capability genfs_seclabel_symlinks=1 Dec 15 04:25:17 localhost dbus-broker-launch[755]: avc: op=load_policy lsm=selinux seqno=20 res=1 Dec 15 04:25:17 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:fc:1f:13 MACDST=fa:16:3e:b3:bb:e3 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=24837 DF PROTO=TCP SPT=50724 DPT=9882 SEQ=2881148762 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A83965B250000000001030307) Dec 15 04:25:20 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:fc:1f:13 MACDST=fa:16:3e:b3:bb:e3 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=23301 DF PROTO=TCP SPT=35064 DPT=9105 SEQ=2075611141 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A839667260000000001030307) Dec 15 04:25:21 localhost systemd[1]: Started /usr/bin/podman healthcheck run 4d46c406a3855fd1c322e5d86c048188ea5865daa07ba23aaebacc7879668923. 
Dec 15 04:25:21 localhost podman[166437]: 2025-12-15 09:25:21.778263718 +0000 UTC m=+0.095306518 container health_status 4d46c406a3855fd1c322e5d86c048188ea5865daa07ba23aaebacc7879668923 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '07f3af15d79276418f54a8dde2fc251064527c3571430e14af77aaa2c01f1193-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS)
Dec 15 04:25:21 localhost podman[166437]: 2025-12-15 09:25:21.817430703 +0000 UTC m=+0.134473463 container exec_died 4d46c406a3855fd1c322e5d86c048188ea5865daa07ba23aaebacc7879668923 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_managed=true, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '07f3af15d79276418f54a8dde2fc251064527c3571430e14af77aaa2c01f1193-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent)
Dec 15 04:25:21 localhost systemd[1]: 4d46c406a3855fd1c322e5d86c048188ea5865daa07ba23aaebacc7879668923.service: Deactivated successfully.
Dec 15 04:25:23 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:fc:1f:13 MACDST=fa:16:3e:b3:bb:e3 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=39614 DF PROTO=TCP SPT=59840 DPT=9100 SEQ=3972410855 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A839671F30000000001030307)
Dec 15 04:25:24 localhost kernel: SELinux: Converting 2762 SID table entries...
Dec 15 04:25:24 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:fc:1f:13 MACDST=fa:16:3e:b3:bb:e3 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=39615 DF PROTO=TCP SPT=59840 DPT=9100 SEQ=3972410855 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A839675E50000000001030307)
Dec 15 04:25:24 localhost kernel: SELinux: policy capability network_peer_controls=1
Dec 15 04:25:24 localhost kernel: SELinux: policy capability open_perms=1
Dec 15 04:25:24 localhost kernel: SELinux: policy capability extended_socket_class=1
Dec 15 04:25:24 localhost kernel: SELinux: policy capability always_check_network=0
Dec 15 04:25:24 localhost kernel: SELinux: policy capability cgroup_seclabel=1
Dec 15 04:25:24 localhost kernel: SELinux: policy capability nnp_nosuid_transition=1
Dec 15 04:25:24 localhost kernel: SELinux: policy capability genfs_seclabel_symlinks=1
Dec 15 04:25:26 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:fc:1f:13 MACDST=fa:16:3e:b3:bb:e3 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=39616 DF PROTO=TCP SPT=59840 DPT=9100 SEQ=3972410855 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A83967DE50000000001030307)
Dec 15 04:25:28 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:fc:1f:13 MACDST=fa:16:3e:b3:bb:e3 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=52103 DF PROTO=TCP SPT=45660 DPT=9101 SEQ=3369650621 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A839687E50000000001030307)
Dec 15 04:25:31 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:fc:1f:13 MACDST=fa:16:3e:b3:bb:e3 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=55756 DF PROTO=TCP SPT=50970 DPT=9882 SEQ=2901051512 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A839693990000000001030307)
Dec 15 04:25:32 localhost kernel: SELinux: Converting 2762 SID table entries...
Dec 15 04:25:32 localhost kernel: SELinux: policy capability network_peer_controls=1
Dec 15 04:25:32 localhost kernel: SELinux: policy capability open_perms=1
Dec 15 04:25:32 localhost kernel: SELinux: policy capability extended_socket_class=1
Dec 15 04:25:32 localhost kernel: SELinux: policy capability always_check_network=0
Dec 15 04:25:32 localhost kernel: SELinux: policy capability cgroup_seclabel=1
Dec 15 04:25:32 localhost kernel: SELinux: policy capability nnp_nosuid_transition=1
Dec 15 04:25:32 localhost kernel: SELinux: policy capability genfs_seclabel_symlinks=1
Dec 15 04:25:34 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:fc:1f:13 MACDST=fa:16:3e:b3:bb:e3 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=55758 DF PROTO=TCP SPT=50970 DPT=9882 SEQ=2901051512 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A83969FA50000000001030307)
Dec 15 04:25:37 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:fc:1f:13 MACDST=fa:16:3e:b3:bb:e3 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=49327 DF PROTO=TCP SPT=37094 DPT=9105 SEQ=2273985313 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A8396AB250000000001030307)
Dec 15 04:25:40 localhost dbus-broker-launch[755]: avc: op=load_policy lsm=selinux seqno=22 res=1
Dec 15 04:25:40 localhost systemd[1]: Started /usr/bin/podman healthcheck run ac2eae30a2a7f971398af42ea67b3ed799d4b3341f7688ebcd73f7ca7f66c2a8.
Dec 15 04:25:40 localhost podman[166476]: 2025-12-15 09:25:40.771443708 +0000 UTC m=+0.094186085 container health_status ac2eae30a2a7f971398af42ea67b3ed799d4b3341f7688ebcd73f7ca7f66c2a8 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, tcib_managed=true, managed_by=edpm_ansible, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '07f3af15d79276418f54a8dde2fc251064527c3571430e14af77aaa2c01f1193'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=ovn_controller, org.label-schema.license=GPLv2)
Dec 15 04:25:40 localhost podman[166476]: 2025-12-15 09:25:40.809132201 +0000 UTC m=+0.131874638 container exec_died ac2eae30a2a7f971398af42ea67b3ed799d4b3341f7688ebcd73f7ca7f66c2a8 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=ovn_controller, config_id=ovn_controller, io.buildah.version=1.41.3, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '07f3af15d79276418f54a8dde2fc251064527c3571430e14af77aaa2c01f1193'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 15 04:25:40 localhost systemd[1]: ac2eae30a2a7f971398af42ea67b3ed799d4b3341f7688ebcd73f7ca7f66c2a8.service: Deactivated successfully.
Dec 15 04:25:40 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:fc:1f:13 MACDST=fa:16:3e:b3:bb:e3 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=52105 DF PROTO=TCP SPT=45660 DPT=9101 SEQ=3369650621 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A8396B7250000000001030307)
Dec 15 04:25:42 localhost kernel: SELinux: Converting 2762 SID table entries...
Dec 15 04:25:42 localhost kernel: SELinux: policy capability network_peer_controls=1
Dec 15 04:25:42 localhost kernel: SELinux: policy capability open_perms=1
Dec 15 04:25:42 localhost kernel: SELinux: policy capability extended_socket_class=1
Dec 15 04:25:42 localhost kernel: SELinux: policy capability always_check_network=0
Dec 15 04:25:42 localhost kernel: SELinux: policy capability cgroup_seclabel=1
Dec 15 04:25:42 localhost kernel: SELinux: policy capability nnp_nosuid_transition=1
Dec 15 04:25:42 localhost kernel: SELinux: policy capability genfs_seclabel_symlinks=1
Dec 15 04:25:47 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:fc:1f:13 MACDST=fa:16:3e:b3:bb:e3 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=55760 DF PROTO=TCP SPT=50970 DPT=9882 SEQ=2901051512 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A8396CF250000000001030307)
Dec 15 04:25:50 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:fc:1f:13 MACDST=fa:16:3e:b3:bb:e3 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=49329 DF PROTO=TCP SPT=37094 DPT=9105 SEQ=2273985313 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A8396DB260000000001030307)
Dec 15 04:25:50 localhost kernel: SELinux: Converting 2762 SID table entries...
Dec 15 04:25:50 localhost kernel: SELinux: policy capability network_peer_controls=1
Dec 15 04:25:50 localhost kernel: SELinux: policy capability open_perms=1
Dec 15 04:25:50 localhost kernel: SELinux: policy capability extended_socket_class=1
Dec 15 04:25:50 localhost kernel: SELinux: policy capability always_check_network=0
Dec 15 04:25:50 localhost kernel: SELinux: policy capability cgroup_seclabel=1
Dec 15 04:25:50 localhost kernel: SELinux: policy capability nnp_nosuid_transition=1
Dec 15 04:25:50 localhost kernel: SELinux: policy capability genfs_seclabel_symlinks=1
Dec 15 04:25:51 localhost systemd[1]: Reloading.
Dec 15 04:25:51 localhost dbus-broker-launch[755]: avc: op=load_policy lsm=selinux seqno=24 res=1
Dec 15 04:25:51 localhost systemd-rc-local-generator[166540]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 15 04:25:51 localhost systemd-sysv-generator[166545]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 15 04:25:51 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Dec 15 04:25:51 localhost ovn_metadata_agent[160585]: 2025-12-15 09:25:51.428 160590 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec 15 04:25:51 localhost ovn_metadata_agent[160585]: 2025-12-15 09:25:51.429 160590 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec 15 04:25:51 localhost ovn_metadata_agent[160585]: 2025-12-15 09:25:51.430 160590 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec 15 04:25:51 localhost systemd[1]: Reloading.
Dec 15 04:25:51 localhost systemd-sysv-generator[166584]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 15 04:25:51 localhost systemd-rc-local-generator[166580]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 15 04:25:51 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Dec 15 04:25:52 localhost systemd[1]: Started /usr/bin/podman healthcheck run 4d46c406a3855fd1c322e5d86c048188ea5865daa07ba23aaebacc7879668923.
Dec 15 04:25:52 localhost podman[166600]: 2025-12-15 09:25:52.758536722 +0000 UTC m=+0.088416476 container health_status 4d46c406a3855fd1c322e5d86c048188ea5865daa07ba23aaebacc7879668923 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '07f3af15d79276418f54a8dde2fc251064527c3571430e14af77aaa2c01f1193-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=ovn_metadata_agent, org.label-schema.vendor=CentOS)
Dec 15 04:25:52 localhost podman[166600]: 2025-12-15 09:25:52.790715254 +0000 UTC m=+0.120594998 container exec_died 4d46c406a3855fd1c322e5d86c048188ea5865daa07ba23aaebacc7879668923 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '07f3af15d79276418f54a8dde2fc251064527c3571430e14af77aaa2c01f1193-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent)
Dec 15 04:25:52 localhost systemd[1]: 4d46c406a3855fd1c322e5d86c048188ea5865daa07ba23aaebacc7879668923.service: Deactivated successfully.
Dec 15 04:25:53 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:fc:1f:13 MACDST=fa:16:3e:b3:bb:e3 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=8867 DF PROTO=TCP SPT=42982 DPT=9100 SEQ=3723217908 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A8396E7230000000001030307)
Dec 15 04:25:54 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:fc:1f:13 MACDST=fa:16:3e:b3:bb:e3 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=8868 DF PROTO=TCP SPT=42982 DPT=9100 SEQ=3723217908 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A8396EB260000000001030307)
Dec 15 04:25:55 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:fc:1f:13 MACDST=fa:16:3e:b3:bb:e3 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=60492 DF PROTO=TCP SPT=40550 DPT=9102 SEQ=236342633 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A8396F1650000000001030307)
Dec 15 04:25:58 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:fc:1f:13 MACDST=fa:16:3e:b3:bb:e3 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=46450 DF PROTO=TCP SPT=57746 DPT=9102 SEQ=1669330527 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A8396FD250000000001030307)
Dec 15 04:26:01 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:fc:1f:13 MACDST=fa:16:3e:b3:bb:e3 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=55245 DF PROTO=TCP SPT=43878 DPT=9882 SEQ=1699305852 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A839708C90000000001030307)
Dec 15 04:26:01 localhost kernel: SELinux: Converting 2763 SID table entries...
Dec 15 04:26:01 localhost kernel: SELinux: policy capability network_peer_controls=1
Dec 15 04:26:01 localhost kernel: SELinux: policy capability open_perms=1
Dec 15 04:26:01 localhost kernel: SELinux: policy capability extended_socket_class=1
Dec 15 04:26:01 localhost kernel: SELinux: policy capability always_check_network=0
Dec 15 04:26:01 localhost kernel: SELinux: policy capability cgroup_seclabel=1
Dec 15 04:26:01 localhost kernel: SELinux: policy capability nnp_nosuid_transition=1
Dec 15 04:26:01 localhost kernel: SELinux: policy capability genfs_seclabel_symlinks=1
Dec 15 04:26:04 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:fc:1f:13 MACDST=fa:16:3e:b3:bb:e3 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=55247 DF PROTO=TCP SPT=43878 DPT=9882 SEQ=1699305852 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A839714E50000000001030307)
Dec 15 04:26:06 localhost dbus-broker-launch[751]: Noticed file-system modification, trigger reload.
Dec 15 04:26:06 localhost dbus-broker-launch[755]: avc: op=load_policy lsm=selinux seqno=25 res=1
Dec 15 04:26:07 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:fc:1f:13 MACDST=fa:16:3e:b3:bb:e3 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=6606 DF PROTO=TCP SPT=48446 DPT=9105 SEQ=2582120598 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A839720650000000001030307)
Dec 15 04:26:11 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:fc:1f:13 MACDST=fa:16:3e:b3:bb:e3 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=42906 DF PROTO=TCP SPT=53442 DPT=9101 SEQ=1715509182 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A83972D250000000001030307)
Dec 15 04:26:11 localhost systemd[1]: Started /usr/bin/podman healthcheck run ac2eae30a2a7f971398af42ea67b3ed799d4b3341f7688ebcd73f7ca7f66c2a8.
Dec 15 04:26:11 localhost systemd[1]: tmp-crun.NUqySS.mount: Deactivated successfully.
Dec 15 04:26:11 localhost podman[166688]: 2025-12-15 09:26:11.774346345 +0000 UTC m=+0.096645247 container health_status ac2eae30a2a7f971398af42ea67b3ed799d4b3341f7688ebcd73f7ca7f66c2a8 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '07f3af15d79276418f54a8dde2fc251064527c3571430e14af77aaa2c01f1193'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, io.buildah.version=1.41.3, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, org.label-schema.license=GPLv2, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.build-date=20251202, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Dec 15 04:26:11 localhost podman[166688]: 2025-12-15 09:26:11.866376736 +0000 UTC m=+0.188675668 container exec_died ac2eae30a2a7f971398af42ea67b3ed799d4b3341f7688ebcd73f7ca7f66c2a8 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, tcib_managed=true, config_id=ovn_controller, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '07f3af15d79276418f54a8dde2fc251064527c3571430e14af77aaa2c01f1193'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 15 04:26:11 localhost systemd[1]: ac2eae30a2a7f971398af42ea67b3ed799d4b3341f7688ebcd73f7ca7f66c2a8.service: Deactivated successfully.
Dec 15 04:26:17 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:fc:1f:13 MACDST=fa:16:3e:b3:bb:e3 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=55249 DF PROTO=TCP SPT=43878 DPT=9882 SEQ=1699305852 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A839745250000000001030307)
Dec 15 04:26:20 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:fc:1f:13 MACDST=fa:16:3e:b3:bb:e3 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=6608 DF PROTO=TCP SPT=48446 DPT=9105 SEQ=2582120598 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A839751250000000001030307)
Dec 15 04:26:22 localhost sshd[170043]: main: sshd: ssh-rsa algorithm is disabled
Dec 15 04:26:23 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:fc:1f:13 MACDST=fa:16:3e:b3:bb:e3 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=9697 DF PROTO=TCP SPT=50196 DPT=9100 SEQ=4115907455 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A83975C530000000001030307)
Dec 15 04:26:23 localhost systemd[1]: Started /usr/bin/podman healthcheck run 4d46c406a3855fd1c322e5d86c048188ea5865daa07ba23aaebacc7879668923.
Dec 15 04:26:23 localhost podman[170530]: 2025-12-15 09:26:23.75622396 +0000 UTC m=+0.087444374 container health_status 4d46c406a3855fd1c322e5d86c048188ea5865daa07ba23aaebacc7879668923 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '07f3af15d79276418f54a8dde2fc251064527c3571430e14af77aaa2c01f1193-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, container_name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3) Dec 15 04:26:23 localhost 
podman[170530]: 2025-12-15 09:26:23.790367844 +0000 UTC m=+0.121588238 container exec_died 4d46c406a3855fd1c322e5d86c048188ea5865daa07ba23aaebacc7879668923 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '07f3af15d79276418f54a8dde2fc251064527c3571430e14af77aaa2c01f1193-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.build-date=20251202, tcib_managed=true, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.license=GPLv2) Dec 15 04:26:23 localhost systemd[1]: 
4d46c406a3855fd1c322e5d86c048188ea5865daa07ba23aaebacc7879668923.service: Deactivated successfully.
Dec 15 04:26:24 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:fc:1f:13 MACDST=fa:16:3e:b3:bb:e3 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=9698 DF PROTO=TCP SPT=50196 DPT=9100 SEQ=4115907455 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A839760650000000001030307)
Dec 15 04:26:25 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:fc:1f:13 MACDST=fa:16:3e:b3:bb:e3 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=27963 DF PROTO=TCP SPT=49036 DPT=9102 SEQ=2828530432 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A839766A50000000001030307)
Dec 15 04:26:28 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:fc:1f:13 MACDST=fa:16:3e:b3:bb:e3 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=53468 DF PROTO=TCP SPT=41432 DPT=9101 SEQ=3683613288 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A839772650000000001030307)
Dec 15 04:26:31 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:fc:1f:13 MACDST=fa:16:3e:b3:bb:e3 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=34602 DF PROTO=TCP SPT=35200 DPT=9882 SEQ=3666718599 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A83977DFA0000000001030307)
Dec 15 04:26:34 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:fc:1f:13 MACDST=fa:16:3e:b3:bb:e3 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=34604 DF PROTO=TCP SPT=35200 DPT=9882 SEQ=3666718599 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A839789E50000000001030307)
Dec 15 04:26:37 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:fc:1f:13 MACDST=fa:16:3e:b3:bb:e3 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=59815 DF PROTO=TCP SPT=46356 DPT=9105 SEQ=2182790618 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A839795A50000000001030307)
Dec 15 04:26:41 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:fc:1f:13 MACDST=fa:16:3e:b3:bb:e3 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=53470 DF PROTO=TCP SPT=41432 DPT=9101 SEQ=3683613288 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A8397A3260000000001030307)
Dec 15 04:26:42 localhost systemd[1]: Started /usr/bin/podman healthcheck run ac2eae30a2a7f971398af42ea67b3ed799d4b3341f7688ebcd73f7ca7f66c2a8.
Dec 15 04:26:42 localhost systemd[1]: tmp-crun.nrx6a6.mount: Deactivated successfully.
Dec 15 04:26:42 localhost podman[183686]: 2025-12-15 09:26:42.845022493 +0000 UTC m=+0.174543419 container health_status ac2eae30a2a7f971398af42ea67b3ed799d4b3341f7688ebcd73f7ca7f66c2a8 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, managed_by=edpm_ansible, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_controller, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=ovn_controller, io.buildah.version=1.41.3, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '07f3af15d79276418f54a8dde2fc251064527c3571430e14af77aaa2c01f1193'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Dec 15 04:26:42 localhost podman[183686]: 2025-12-15 09:26:42.885350788 +0000 UTC m=+0.214871653 container exec_died ac2eae30a2a7f971398af42ea67b3ed799d4b3341f7688ebcd73f7ca7f66c2a8 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '07f3af15d79276418f54a8dde2fc251064527c3571430e14af77aaa2c01f1193'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=ovn_controller, managed_by=edpm_ansible, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_id=ovn_controller, org.label-schema.build-date=20251202)
Dec 15 04:26:42 localhost systemd[1]: ac2eae30a2a7f971398af42ea67b3ed799d4b3341f7688ebcd73f7ca7f66c2a8.service: Deactivated successfully.
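The podman `container health_status` records above pack the container name, health result, and the whole `config_data` blob into one line, which makes them tedious to scan by eye. A minimal sketch of pulling out just the event type, container name, and health status (the regex assumes the field layout seen in this journal excerpt, not any documented podman format):

```python
import re

# Sketch: extract event, container id, name, and health status from a
# podman container-event journal line (field layout assumed from the
# log excerpt above).
EVENT_RE = re.compile(
    r"container (?P<event>\w+) (?P<cid>[0-9a-f]{64}) \(.*?"
    r"name=(?P<name>[^,)]+)"
)
HEALTH_RE = re.compile(r"health_status=(?P<status>[^,)]+)")

def parse_podman_event(line: str):
    m = EVENT_RE.search(line)
    if not m:
        return None
    info = {"event": m.group("event"), "cid": m.group("cid"),
            "name": m.group("name")}
    h = HEALTH_RE.search(line)
    if h:
        info["health"] = h.group("status")
    return info

sample = ("Dec 15 04:26:42 localhost podman[183686]: container health_status "
          "ac2eae30a2a7f971398af42ea67b3ed799d4b3341f7688ebcd73f7ca7f66c2a8 "
          "(image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, "
          "name=ovn_controller, health_status=healthy)")
print(parse_podman_event(sample))
```

Feeding each podman line through this and printing only transitions away from `healthy` condenses hours of healthcheck noise into the few events that matter.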
Dec 15 04:26:47 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:fc:1f:13 MACDST=fa:16:3e:b3:bb:e3 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=34606 DF PROTO=TCP SPT=35200 DPT=9882 SEQ=3666718599 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A8397B9250000000001030307)
Dec 15 04:26:49 localhost systemd[1]: Stopping OpenSSH server daemon...
Dec 15 04:26:49 localhost systemd[1]: sshd.service: Deactivated successfully.
Dec 15 04:26:49 localhost systemd[1]: Stopped OpenSSH server daemon.
Dec 15 04:26:49 localhost systemd[1]: sshd.service: Consumed 2.091s CPU time, read 32.0K from disk, written 8.0K to disk.
Dec 15 04:26:49 localhost systemd[1]: Stopped target sshd-keygen.target.
Dec 15 04:26:49 localhost systemd[1]: Stopping sshd-keygen.target...
Dec 15 04:26:49 localhost systemd[1]: OpenSSH ecdsa Server Key Generation was skipped because of an unmet condition check (ConditionPathExists=!/run/systemd/generator.early/multi-user.target.wants/cloud-init.target).
Dec 15 04:26:49 localhost systemd[1]: OpenSSH ed25519 Server Key Generation was skipped because of an unmet condition check (ConditionPathExists=!/run/systemd/generator.early/multi-user.target.wants/cloud-init.target).
Dec 15 04:26:49 localhost systemd[1]: OpenSSH rsa Server Key Generation was skipped because of an unmet condition check (ConditionPathExists=!/run/systemd/generator.early/multi-user.target.wants/cloud-init.target).
Dec 15 04:26:49 localhost systemd[1]: Reached target sshd-keygen.target.
Dec 15 04:26:49 localhost systemd[1]: Starting OpenSSH server daemon...
Dec 15 04:26:49 localhost sshd[184724]: main: sshd: ssh-rsa algorithm is disabled
Dec 15 04:26:49 localhost systemd[1]: Started OpenSSH server daemon.
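The recurring kernel `DROPPING:` records use the key=value layout of a netfilter LOG-style rule (the `DROPPING` prefix itself is whatever string the rule was configured with; that attribution is an assumption here, not stated in the log). The fields split cleanly on whitespace, with flags such as `DF` and `SYN` appearing bare. A small parser for triaging these entries:

```python
def parse_drop(line: str) -> dict:
    """Parse the key=value fields of a kernel firewall log entry.

    Bare uppercase tokens (DF, SYN, ...) carry no '=' and are stored
    as booleans. Sketch only; assumes the field layout seen above.
    """
    fields = {}
    for tok in line.split():
        if "=" in tok:
            key, _, val = tok.partition("=")
            fields[key] = val
        elif tok.isupper():
            fields[tok] = True
    return fields

entry = ("DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:fc:1f:13 "
         "SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TTL=62 "
         "ID=34606 DF PROTO=TCP SPT=35200 DPT=9882 SYN URGP=0")
d = parse_drop(entry)
print(d["SRC"], d["DPT"], d.get("SYN"))
```

Grouping the parsed records by `DPT` makes the pattern in this excerpt obvious: repeated SYNs from 192.168.122.10 to ports 9100-9105 and 9882 (typical metrics-exporter ports) are being dropped, i.e. a scraper is being blocked rather than random noise.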
Dec 15 04:26:49 localhost systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload
Dec 15 04:26:49 localhost systemd[1]: /usr/lib/systemd/system/libvirtd.service:29: Failed to parse service type, ignoring: notify-reload
Dec 15 04:26:49 localhost systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 15 04:26:49 localhost systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 15 04:26:49 localhost systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 15 04:26:49 localhost systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 15 04:26:49 localhost systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 15 04:26:49 localhost systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload
Dec 15 04:26:50 localhost systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload
Dec 15 04:26:50 localhost systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload
Dec 15 04:26:50 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:fc:1f:13 MACDST=fa:16:3e:b3:bb:e3 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=59817 DF PROTO=TCP SPT=46356 DPT=9105 SEQ=2182790618 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A8397C5250000000001030307)
Dec 15 04:26:50 localhost systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 15 04:26:50 localhost systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 15 04:26:50 localhost systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 15 04:26:50 localhost systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 15 04:26:51 localhost systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Dec 15 04:26:51 localhost systemd[1]: Starting man-db-cache-update.service...
Dec 15 04:26:51 localhost systemd[1]: Reloading.
Dec 15 04:26:51 localhost systemd-rc-local-generator[184954]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 15 04:26:51 localhost systemd-sysv-generator[184958]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 15 04:26:51 localhost systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload
Dec 15 04:26:51 localhost systemd[1]: /usr/lib/systemd/system/libvirtd.service:29: Failed to parse service type, ignoring: notify-reload
Dec 15 04:26:51 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Dec 15 04:26:51 localhost systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload
Dec 15 04:26:51 localhost systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 15 04:26:51 localhost systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 15 04:26:51 localhost systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 15 04:26:51 localhost systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 15 04:26:51 localhost systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 15 04:26:51 localhost systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload
Dec 15 04:26:51 localhost systemd[1]: Queuing reload/restart jobs for marked units…
Dec 15 04:26:51 localhost ovn_metadata_agent[160585]: 2025-12-15 09:26:51.429 160590 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec 15 04:26:51 localhost ovn_metadata_agent[160585]: 2025-12-15 09:26:51.430 160590 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec 15 04:26:51 localhost ovn_metadata_agent[160585]: 2025-12-15 09:26:51.431 160590 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec 15 04:26:51 localhost systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Dec 15 04:26:53 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:fc:1f:13 MACDST=fa:16:3e:b3:bb:e3 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=14109 DF PROTO=TCP SPT=54174 DPT=9100 SEQ=2480578630 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A8397D1830000000001030307)
Dec 15 04:26:54 localhost systemd[1]: Started /usr/bin/podman healthcheck run 4d46c406a3855fd1c322e5d86c048188ea5865daa07ba23aaebacc7879668923.
Dec 15 04:26:54 localhost podman[189142]: 2025-12-15 09:26:54.238265518 +0000 UTC m=+0.069862333 container health_status 4d46c406a3855fd1c322e5d86c048188ea5865daa07ba23aaebacc7879668923 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '07f3af15d79276418f54a8dde2fc251064527c3571430e14af77aaa2c01f1193-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.build-date=20251202, tcib_managed=true)
Dec 15 04:26:54 localhost podman[189142]: 2025-12-15 09:26:54.269796522 +0000 UTC m=+0.101393327 container exec_died 4d46c406a3855fd1c322e5d86c048188ea5865daa07ba23aaebacc7879668923 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.build-date=20251202, tcib_managed=true, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '07f3af15d79276418f54a8dde2fc251064527c3571430e14af77aaa2c01f1193-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2)
Dec 15 04:26:54 localhost systemd[1]: 4d46c406a3855fd1c322e5d86c048188ea5865daa07ba23aaebacc7879668923.service: Deactivated successfully.
Dec 15 04:26:54 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:fc:1f:13 MACDST=fa:16:3e:b3:bb:e3 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=14110 DF PROTO=TCP SPT=54174 DPT=9100 SEQ=2480578630 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A8397D5A50000000001030307)
Dec 15 04:26:56 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:fc:1f:13 MACDST=fa:16:3e:b3:bb:e3 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=14111 DF PROTO=TCP SPT=54174 DPT=9100 SEQ=2480578630 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A8397DDA50000000001030307)
Dec 15 04:26:58 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:fc:1f:13 MACDST=fa:16:3e:b3:bb:e3 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=56413 DF PROTO=TCP SPT=37912 DPT=9101 SEQ=1224953696 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A8397E7A50000000001030307)
Dec 15 04:27:01 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:fc:1f:13 MACDST=fa:16:3e:b3:bb:e3 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=44907 DF PROTO=TCP SPT=47346 DPT=9882 SEQ=2552606454 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A8397F32A0000000001030307)
Dec 15 04:27:03 localhost systemd[1]: man-db-cache-update.service: Deactivated successfully.
Dec 15 04:27:03 localhost systemd[1]: Finished man-db-cache-update.service.
Dec 15 04:27:03 localhost systemd[1]: man-db-cache-update.service: Consumed 15.694s CPU time.
Dec 15 04:27:03 localhost systemd[1]: run-rbed4197a721c469981c84d1117d4bd85.service: Deactivated successfully.
Dec 15 04:27:03 localhost systemd[1]: run-re22b15f6d06f4b4d9ec44a61a3a2519b.service: Deactivated successfully.
Dec 15 04:27:04 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:fc:1f:13 MACDST=fa:16:3e:b3:bb:e3 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=44909 DF PROTO=TCP SPT=47346 DPT=9882 SEQ=2552606454 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A8397FF250000000001030307)
Dec 15 04:27:07 localhost python3.9[193557]: ansible-ansible.builtin.systemd Invoked with enabled=False masked=True name=libvirtd state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Dec 15 04:27:07 localhost systemd[1]: Reloading.
Dec 15 04:27:07 localhost systemd-rc-local-generator[193586]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 15 04:27:07 localhost systemd-sysv-generator[193589]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 15 04:27:07 localhost systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload
Dec 15 04:27:07 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Dec 15 04:27:07 localhost systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload
Dec 15 04:27:07 localhost systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 15 04:27:07 localhost systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 15 04:27:07 localhost systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 15 04:27:07 localhost systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 15 04:27:07 localhost systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 15 04:27:07 localhost systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload
Dec 15 04:27:07 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:fc:1f:13 MACDST=fa:16:3e:b3:bb:e3 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=35588 DF PROTO=TCP SPT=45684 DPT=9105 SEQ=3592407047 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A83980AE50000000001030307)
Dec 15 04:27:08 localhost python3.9[193706]: ansible-ansible.builtin.systemd Invoked with enabled=False masked=True name=libvirtd-tcp.socket state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Dec 15 04:27:08 localhost systemd[1]: Reloading.
Dec 15 04:27:08 localhost systemd-rc-local-generator[193733]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 15 04:27:08 localhost systemd-sysv-generator[193738]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 15 04:27:08 localhost systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload
Dec 15 04:27:08 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Dec 15 04:27:08 localhost systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload
Dec 15 04:27:08 localhost systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 15 04:27:08 localhost systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 15 04:27:08 localhost systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 15 04:27:08 localhost systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 15 04:27:08 localhost systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 15 04:27:08 localhost systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload
Dec 15 04:27:09 localhost python3.9[193854]: ansible-ansible.builtin.systemd Invoked with enabled=False masked=True name=libvirtd-tls.socket state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Dec 15 04:27:09 localhost systemd[1]: Reloading.
Dec 15 04:27:09 localhost systemd-rc-local-generator[193881]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 15 04:27:09 localhost systemd-sysv-generator[193884]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 15 04:27:09 localhost systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload
Dec 15 04:27:09 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Dec 15 04:27:09 localhost systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload
Dec 15 04:27:09 localhost systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 15 04:27:09 localhost systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 15 04:27:09 localhost systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 15 04:27:09 localhost systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 15 04:27:09 localhost systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 15 04:27:09 localhost systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload
Dec 15 04:27:10 localhost python3.9[194003]: ansible-ansible.builtin.systemd Invoked with enabled=False masked=True name=virtproxyd-tcp.socket state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Dec 15 04:27:10 localhost systemd[1]: Reloading.
Dec 15 04:27:10 localhost systemd-sysv-generator[194035]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 15 04:27:10 localhost systemd-rc-local-generator[194032]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 15 04:27:11 localhost systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload
Dec 15 04:27:11 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Dec 15 04:27:11 localhost systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload
Dec 15 04:27:11 localhost systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 15 04:27:11 localhost systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 15 04:27:11 localhost systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 15 04:27:11 localhost systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 15 04:27:11 localhost systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 15 04:27:11 localhost systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload
Dec 15 04:27:11 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:fc:1f:13 MACDST=fa:16:3e:b3:bb:e3 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=56415 DF PROTO=TCP SPT=37912 DPT=9101 SEQ=1224953696 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A839817250000000001030307)
Dec 15 04:27:12 localhost systemd[1]: Started /usr/bin/podman healthcheck run ac2eae30a2a7f971398af42ea67b3ed799d4b3341f7688ebcd73f7ca7f66c2a8.
Dec 15 04:27:13 localhost podman[194152]: 2025-12-15 09:27:13.096499237 +0000 UTC m=+0.098402555 container health_status ac2eae30a2a7f971398af42ea67b3ed799d4b3341f7688ebcd73f7ca7f66c2a8 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, managed_by=edpm_ansible, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_id=ovn_controller, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '07f3af15d79276418f54a8dde2fc251064527c3571430e14af77aaa2c01f1193'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller)
Dec 15 04:27:13 localhost podman[194152]: 2025-12-15 09:27:13.163752778 +0000 UTC m=+0.165656116 container exec_died ac2eae30a2a7f971398af42ea67b3ed799d4b3341f7688ebcd73f7ca7f66c2a8 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '07f3af15d79276418f54a8dde2fc251064527c3571430e14af77aaa2c01f1193'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_controller, org.label-schema.build-date=20251202, container_name=ovn_controller, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true)
Dec 15 04:27:13 localhost systemd[1]: ac2eae30a2a7f971398af42ea67b3ed799d4b3341f7688ebcd73f7ca7f66c2a8.service: Deactivated successfully.
Dec 15 04:27:13 localhost python3.9[194153]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtlogd.service daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Dec 15 04:27:13 localhost systemd[1]: Reloading.
Dec 15 04:27:13 localhost systemd-sysv-generator[194208]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 15 04:27:13 localhost systemd-rc-local-generator[194204]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 15 04:27:13 localhost systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload
Dec 15 04:27:13 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Dec 15 04:27:13 localhost systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload
Dec 15 04:27:13 localhost systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 15 04:27:13 localhost systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 15 04:27:13 localhost systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 15 04:27:13 localhost systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 15 04:27:13 localhost systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 15 04:27:13 localhost systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload
Dec 15 04:27:14 localhost python3.9[194323]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtnodedevd.service daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Dec 15 04:27:14 localhost systemd[1]: Reloading.
Dec 15 04:27:14 localhost systemd-rc-local-generator[194349]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 15 04:27:14 localhost systemd-sysv-generator[194354]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 15 04:27:14 localhost systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload
Dec 15 04:27:14 localhost systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 15 04:27:14 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Dec 15 04:27:14 localhost systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload
Dec 15 04:27:14 localhost systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 15 04:27:14 localhost systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 15 04:27:14 localhost systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 15 04:27:14 localhost systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 15 04:27:14 localhost systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload
Dec 15 04:27:15 localhost python3.9[194472]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtproxyd.service daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Dec 15 04:27:15 localhost systemd[1]: Reloading.
Dec 15 04:27:15 localhost systemd-rc-local-generator[194498]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 15 04:27:15 localhost systemd-sysv-generator[194502]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. Dec 15 04:27:15 localhost systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload Dec 15 04:27:15 localhost systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload Dec 15 04:27:15 localhost systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload Dec 15 04:27:15 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Dec 15 04:27:15 localhost systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload Dec 15 04:27:15 localhost systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload Dec 15 04:27:15 localhost systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload Dec 15 04:27:15 localhost systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload Dec 15 04:27:15 localhost systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload Dec 15 04:27:16 localhost python3.9[194620]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtqemud.service daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None Dec 15 04:27:17 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:fc:1f:13 MACDST=fa:16:3e:b3:bb:e3 MACPROTO=0800 SRC=192.168.122.10 
DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=44911 DF PROTO=TCP SPT=47346 DPT=9882 SEQ=2552606454 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A83982F260000000001030307) Dec 15 04:27:17 localhost python3.9[194733]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtsecretd.service daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None Dec 15 04:27:17 localhost systemd[1]: Reloading. Dec 15 04:27:17 localhost systemd-rc-local-generator[194759]: /etc/rc.d/rc.local is not marked executable, skipping. Dec 15 04:27:17 localhost systemd-sysv-generator[194765]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. Dec 15 04:27:17 localhost systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload Dec 15 04:27:17 localhost systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload Dec 15 04:27:17 localhost systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload Dec 15 04:27:17 localhost systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload Dec 15 04:27:17 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. 
Dec 15 04:27:17 localhost systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload Dec 15 04:27:17 localhost systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload Dec 15 04:27:17 localhost systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload Dec 15 04:27:17 localhost systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload Dec 15 04:27:18 localhost python3.9[194882]: ansible-ansible.builtin.systemd Invoked with enabled=False masked=True name=virtproxyd-tls.socket state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None Dec 15 04:27:18 localhost systemd[1]: Reloading. Dec 15 04:27:18 localhost systemd-rc-local-generator[194921]: /etc/rc.d/rc.local is not marked executable, skipping. Dec 15 04:27:18 localhost systemd-sysv-generator[194924]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. Dec 15 04:27:18 localhost systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload Dec 15 04:27:18 localhost systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload Dec 15 04:27:18 localhost systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload Dec 15 04:27:18 localhost systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload Dec 15 04:27:18 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. 
Support for MemoryLimit= will be removed soon. Dec 15 04:27:18 localhost systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload Dec 15 04:27:18 localhost systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload Dec 15 04:27:18 localhost systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload Dec 15 04:27:18 localhost systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload Dec 15 04:27:20 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:fc:1f:13 MACDST=fa:16:3e:b3:bb:e3 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=35590 DF PROTO=TCP SPT=45684 DPT=9105 SEQ=3592407047 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A83983B250000000001030307) Dec 15 04:27:20 localhost python3.9[195098]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtlogd.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None Dec 15 04:27:22 localhost python3.9[195229]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtlogd-admin.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None Dec 15 04:27:23 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:fc:1f:13 MACDST=fa:16:3e:b3:bb:e3 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=7556 DF PROTO=TCP SPT=34516 DPT=9100 SEQ=864517943 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A839846B30000000001030307) Dec 15 04:27:23 localhost python3.9[195342]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtnodedevd.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None Dec 15 04:27:24 
localhost python3.9[195455]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtnodedevd-ro.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None Dec 15 04:27:24 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:fc:1f:13 MACDST=fa:16:3e:b3:bb:e3 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=7557 DF PROTO=TCP SPT=34516 DPT=9100 SEQ=864517943 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A83984AA50000000001030307) Dec 15 04:27:24 localhost systemd[1]: Started /usr/bin/podman healthcheck run 4d46c406a3855fd1c322e5d86c048188ea5865daa07ba23aaebacc7879668923. Dec 15 04:27:24 localhost podman[195459]: 2025-12-15 09:27:24.77358505 +0000 UTC m=+0.095963997 container health_status 4d46c406a3855fd1c322e5d86c048188ea5865daa07ba23aaebacc7879668923 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, org.label-schema.schema-version=1.0, tcib_managed=true, container_name=ovn_metadata_agent, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '07f3af15d79276418f54a8dde2fc251064527c3571430e14af77aaa2c01f1193-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', 
'/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team) Dec 15 04:27:24 localhost podman[195459]: 2025-12-15 09:27:24.806219674 +0000 UTC m=+0.128598571 container exec_died 4d46c406a3855fd1c322e5d86c048188ea5865daa07ba23aaebacc7879668923 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, managed_by=edpm_ansible, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '07f3af15d79276418f54a8dde2fc251064527c3571430e14af77aaa2c01f1193-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', 
'/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.schema-version=1.0) Dec 15 04:27:24 localhost systemd[1]: 4d46c406a3855fd1c322e5d86c048188ea5865daa07ba23aaebacc7879668923.service: Deactivated successfully. Dec 15 04:27:25 localhost python3.9[195588]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtnodedevd-admin.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None Dec 15 04:27:26 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:fc:1f:13 MACDST=fa:16:3e:b3:bb:e3 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=7558 DF PROTO=TCP SPT=34516 DPT=9100 SEQ=864517943 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A839852A50000000001030307) Dec 15 04:27:26 localhost python3.9[195701]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtproxyd.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None Dec 15 04:27:27 localhost python3.9[195814]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtproxyd-ro.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None Dec 15 04:27:27 localhost python3.9[195927]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtproxyd-admin.socket daemon_reload=False daemon_reexec=False scope=system no_block=False 
state=None force=None Dec 15 04:27:28 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:fc:1f:13 MACDST=fa:16:3e:b3:bb:e3 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=48381 DF PROTO=TCP SPT=37072 DPT=9101 SEQ=1457353543 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A83985CA60000000001030307) Dec 15 04:27:29 localhost python3.9[196040]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtqemud.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None Dec 15 04:27:31 localhost python3.9[196153]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtqemud-ro.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None Dec 15 04:27:31 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:fc:1f:13 MACDST=fa:16:3e:b3:bb:e3 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=64696 DF PROTO=TCP SPT=33222 DPT=9882 SEQ=2866578846 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A8398685A0000000001030307) Dec 15 04:27:32 localhost python3.9[196266]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtqemud-admin.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None Dec 15 04:27:33 localhost python3.9[196379]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtsecretd.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None Dec 15 04:27:34 localhost python3.9[196492]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtsecretd-ro.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None Dec 15 04:27:34 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:fc:1f:13 MACDST=fa:16:3e:b3:bb:e3 MACPROTO=0800 SRC=192.168.122.10 
DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=64698 DF PROTO=TCP SPT=33222 DPT=9882 SEQ=2866578846 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A839874650000000001030307) Dec 15 04:27:35 localhost python3.9[196605]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtsecretd-admin.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None Dec 15 04:27:36 localhost python3.9[196718]: ansible-ansible.builtin.file Invoked with group=root owner=root path=/etc/tmpfiles.d/ setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None Dec 15 04:27:37 localhost python3.9[196828]: ansible-ansible.builtin.file Invoked with group=root owner=root path=/var/lib/edpm-config/firewall setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None Dec 15 04:27:37 localhost python3.9[196938]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/pki/libvirt setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None Dec 15 04:27:37 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:fc:1f:13 MACDST=fa:16:3e:b3:bb:e3 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 
ID=1767 DF PROTO=TCP SPT=44070 DPT=9105 SEQ=2478276687 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A83987FE50000000001030307) Dec 15 04:27:38 localhost python3.9[197048]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/pki/libvirt/private setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None Dec 15 04:27:38 localhost python3.9[197158]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/pki/CA setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None Dec 15 04:27:39 localhost python3.9[197268]: ansible-ansible.builtin.file Invoked with group=qemu owner=root path=/etc/pki/qemu setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None Dec 15 04:27:40 localhost python3.9[197378]: ansible-ansible.legacy.stat Invoked with path=/etc/libvirt/virtlogd.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Dec 15 04:27:41 localhost python3.9[197468]: ansible-ansible.legacy.copy Invoked with dest=/etc/libvirt/virtlogd.conf group=libvirt mode=0640 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1765790859.7809308-1643-52558863361145/.source.conf 
follow=False _original_basename=virtlogd.conf checksum=d7a72ae92c2c205983b029473e05a6aa4c58ec24 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None Dec 15 04:27:41 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:fc:1f:13 MACDST=fa:16:3e:b3:bb:e3 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=48383 DF PROTO=TCP SPT=37072 DPT=9101 SEQ=1457353543 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A83988D250000000001030307) Dec 15 04:27:41 localhost python3.9[197578]: ansible-ansible.legacy.stat Invoked with path=/etc/libvirt/virtnodedevd.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Dec 15 04:27:42 localhost python3.9[197668]: ansible-ansible.legacy.copy Invoked with dest=/etc/libvirt/virtnodedevd.conf group=libvirt mode=0640 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1765790861.305605-1643-274041194160857/.source.conf follow=False _original_basename=virtnodedevd.conf checksum=7a604468adb2868f1ab6ebd0fd4622286e6373e2 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None Dec 15 04:27:42 localhost python3.9[197778]: ansible-ansible.legacy.stat Invoked with path=/etc/libvirt/virtproxyd.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Dec 15 04:27:43 localhost systemd[1]: Started /usr/bin/podman healthcheck run ac2eae30a2a7f971398af42ea67b3ed799d4b3341f7688ebcd73f7ca7f66c2a8. 
Dec 15 04:27:43 localhost podman[197869]: 2025-12-15 09:27:43.326255733 +0000 UTC m=+0.077768627 container health_status ac2eae30a2a7f971398af42ea67b3ed799d4b3341f7688ebcd73f7ca7f66c2a8 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, managed_by=edpm_ansible, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '07f3af15d79276418f54a8dde2fc251064527c3571430e14af77aaa2c01f1193'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_id=ovn_controller, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3) Dec 15 04:27:43 localhost podman[197869]: 2025-12-15 09:27:43.364653997 +0000 UTC m=+0.116166911 container exec_died ac2eae30a2a7f971398af42ea67b3ed799d4b3341f7688ebcd73f7ca7f66c2a8 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 
'07f3af15d79276418f54a8dde2fc251064527c3571430e14af77aaa2c01f1193'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, container_name=ovn_controller, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ovn_controller) Dec 15 04:27:43 localhost systemd[1]: ac2eae30a2a7f971398af42ea67b3ed799d4b3341f7688ebcd73f7ca7f66c2a8.service: Deactivated successfully. 
Dec 15 04:27:43 localhost python3.9[197868]: ansible-ansible.legacy.copy Invoked with dest=/etc/libvirt/virtproxyd.conf group=libvirt mode=0640 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1765790862.4089096-1643-199120021558906/.source.conf follow=False _original_basename=virtproxyd.conf checksum=28bc484b7c9988e03de49d4fcc0a088ea975f716 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None Dec 15 04:27:44 localhost python3.9[198003]: ansible-ansible.legacy.stat Invoked with path=/etc/libvirt/virtqemud.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Dec 15 04:27:44 localhost python3.9[198093]: ansible-ansible.legacy.copy Invoked with dest=/etc/libvirt/virtqemud.conf group=libvirt mode=0640 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1765790863.5756037-1643-103536404088886/.source.conf follow=False _original_basename=virtqemud.conf checksum=7a604468adb2868f1ab6ebd0fd4622286e6373e2 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None Dec 15 04:27:45 localhost python3.9[198203]: ansible-ansible.legacy.stat Invoked with path=/etc/libvirt/qemu.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Dec 15 04:27:45 localhost python3.9[198293]: ansible-ansible.legacy.copy Invoked with dest=/etc/libvirt/qemu.conf group=libvirt mode=0640 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1765790864.8821776-1643-83828803271989/.source.conf follow=False _original_basename=qemu.conf.j2 checksum=8d9b2057482987a531d808ceb2ac4bc7d43bf17c backup=False force=True remote_src=False 
unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None Dec 15 04:27:46 localhost python3.9[198403]: ansible-ansible.legacy.stat Invoked with path=/etc/libvirt/virtsecretd.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Dec 15 04:27:47 localhost python3.9[198493]: ansible-ansible.legacy.copy Invoked with dest=/etc/libvirt/virtsecretd.conf group=libvirt mode=0640 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1765790866.0855484-1643-33136056624630/.source.conf follow=False _original_basename=virtsecretd.conf checksum=7a604468adb2868f1ab6ebd0fd4622286e6373e2 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None Dec 15 04:27:47 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:fc:1f:13 MACDST=fa:16:3e:b3:bb:e3 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=64700 DF PROTO=TCP SPT=33222 DPT=9882 SEQ=2866578846 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A8398A5260000000001030307) Dec 15 04:27:47 localhost python3.9[198603]: ansible-ansible.legacy.stat Invoked with path=/etc/libvirt/auth.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Dec 15 04:27:48 localhost python3.9[198691]: ansible-ansible.legacy.copy Invoked with dest=/etc/libvirt/auth.conf group=libvirt mode=0600 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1765790867.2563953-1643-232919946615720/.source.conf follow=False _original_basename=auth.conf checksum=da39a3ee5e6b4b0d3255bfef95601890afd80709 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None 
directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None Dec 15 04:27:48 localhost python3.9[198801]: ansible-ansible.legacy.stat Invoked with path=/etc/sasl2/libvirt.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Dec 15 04:27:49 localhost python3.9[198891]: ansible-ansible.legacy.copy Invoked with dest=/etc/sasl2/libvirt.conf group=libvirt mode=0640 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1765790868.3399634-1643-223521250626967/.source.conf follow=False _original_basename=sasl_libvirt.conf checksum=652e4d404bf79253d06956b8e9847c9364979d4a backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None Dec 15 04:27:49 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:fc:1f:13 MACDST=fa:16:3e:b3:bb:e3 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=1769 DF PROTO=TCP SPT=44070 DPT=9105 SEQ=2478276687 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A8398AF250000000001030307) Dec 15 04:27:51 localhost python3.9[199001]: ansible-ansible.builtin.file Invoked with path=/etc/libvirt/passwd.db state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Dec 15 04:27:51 localhost ovn_metadata_agent[160585]: 2025-12-15 09:27:51.430 160590 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner 
/usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Dec 15 04:27:51 localhost ovn_metadata_agent[160585]: 2025-12-15 09:27:51.431 160590 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Dec 15 04:27:51 localhost ovn_metadata_agent[160585]: 2025-12-15 09:27:51.432 160590 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Dec 15 04:27:51 localhost python3.9[199111]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtlogd.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None Dec 15 04:27:52 localhost python3.9[199221]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtlogd-admin.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None Dec 15 04:27:53 localhost python3.9[199331]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtnodedevd.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False 
_original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None Dec 15 04:27:53 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:fc:1f:13 MACDST=fa:16:3e:b3:bb:e3 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=25049 DF PROTO=TCP SPT=59632 DPT=9100 SEQ=3601622544 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A8398BBE20000000001030307) Dec 15 04:27:53 localhost python3.9[199441]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtnodedevd-ro.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None Dec 15 04:27:54 localhost python3.9[199551]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtnodedevd-admin.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None Dec 15 04:27:54 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:fc:1f:13 MACDST=fa:16:3e:b3:bb:e3 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=25050 DF PROTO=TCP SPT=59632 DPT=9100 SEQ=3601622544 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A8398BFE60000000001030307) Dec 15 04:27:54 localhost python3.9[199661]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtproxyd.socket.d state=directory recurse=False force=False follow=True 
modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None Dec 15 04:27:55 localhost systemd[1]: Started /usr/bin/podman healthcheck run 4d46c406a3855fd1c322e5d86c048188ea5865daa07ba23aaebacc7879668923. Dec 15 04:27:55 localhost podman[199772]: 2025-12-15 09:27:55.279154715 +0000 UTC m=+0.082559260 container health_status 4d46c406a3855fd1c322e5d86c048188ea5865daa07ba23aaebacc7879668923 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, tcib_managed=true, container_name=ovn_metadata_agent, org.label-schema.build-date=20251202, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '07f3af15d79276418f54a8dde2fc251064527c3571430e14af77aaa2c01f1193-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0) Dec 15 04:27:55 localhost podman[199772]: 2025-12-15 09:27:55.308719061 +0000 UTC m=+0.112123596 container exec_died 4d46c406a3855fd1c322e5d86c048188ea5865daa07ba23aaebacc7879668923 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '07f3af15d79276418f54a8dde2fc251064527c3571430e14af77aaa2c01f1193-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', 
'/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.license=GPLv2) Dec 15 04:27:55 localhost systemd[1]: 4d46c406a3855fd1c322e5d86c048188ea5865daa07ba23aaebacc7879668923.service: Deactivated successfully. Dec 15 04:27:55 localhost python3.9[199771]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtproxyd-ro.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None Dec 15 04:27:55 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:fc:1f:13 MACDST=fa:16:3e:b3:bb:e3 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=61845 DF PROTO=TCP SPT=48128 DPT=9102 SEQ=724714870 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A8398C6250000000001030307) Dec 15 04:27:55 localhost python3.9[199900]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtproxyd-admin.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None Dec 15 04:27:56 localhost python3.9[200010]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtqemud.socket.d state=directory recurse=False force=False follow=True 
modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None Dec 15 04:27:57 localhost python3.9[200120]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtqemud-ro.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None Dec 15 04:27:57 localhost python3.9[200230]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtqemud-admin.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None Dec 15 04:27:58 localhost python3.9[200340]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtsecretd.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None Dec 15 04:27:58 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:fc:1f:13 MACDST=fa:16:3e:b3:bb:e3 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=60684 DF PROTO=TCP SPT=56656 DPT=9101 SEQ=869779006 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A8398D1E50000000001030307) Dec 15 04:27:59 
localhost python3.9[200450]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtsecretd-ro.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None Dec 15 04:27:59 localhost python3.9[200560]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtsecretd-admin.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None Dec 15 04:28:01 localhost python3.9[200670]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtlogd.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Dec 15 04:28:01 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:fc:1f:13 MACDST=fa:16:3e:b3:bb:e3 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=16831 DF PROTO=TCP SPT=39236 DPT=9882 SEQ=4257720101 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A8398DD8A0000000001030307) Dec 15 04:28:01 localhost python3.9[200758]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtlogd.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1765790880.9556394-2306-233697681899685/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER 
validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None Dec 15 04:28:03 localhost python3.9[200868]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtlogd-admin.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Dec 15 04:28:03 localhost python3.9[200956]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtlogd-admin.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1765790882.759456-2306-271435458953783/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None Dec 15 04:28:04 localhost python3.9[201066]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtnodedevd.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Dec 15 04:28:04 localhost python3.9[201154]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtnodedevd.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1765790883.8651252-2306-63197896192956/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None Dec 15 04:28:04 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:fc:1f:13 MACDST=fa:16:3e:b3:bb:e3 MACPROTO=0800 SRC=192.168.122.10 
DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=16833 DF PROTO=TCP SPT=39236 DPT=9882 SEQ=4257720101 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A8398E9A50000000001030307) Dec 15 04:28:05 localhost python3.9[201264]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtnodedevd-ro.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Dec 15 04:28:05 localhost python3.9[201352]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtnodedevd-ro.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1765790884.9663627-2306-222419115334154/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None Dec 15 04:28:06 localhost python3.9[201462]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtnodedevd-admin.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Dec 15 04:28:07 localhost python3.9[201550]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtnodedevd-admin.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1765790886.1077752-2306-24521770733520/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None Dec 15 04:28:07 localhost python3.9[201660]: 
ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtproxyd.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Dec 15 04:28:07 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:fc:1f:13 MACDST=fa:16:3e:b3:bb:e3 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=25125 DF PROTO=TCP SPT=59884 DPT=9105 SEQ=3685571738 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A8398F5260000000001030307) Dec 15 04:28:08 localhost python3.9[201748]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtproxyd.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1765790887.2355266-2306-149664302167197/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None Dec 15 04:28:08 localhost python3.9[201858]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtproxyd-ro.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Dec 15 04:28:09 localhost python3.9[201946]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtproxyd-ro.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1765790888.3845193-2306-106781893540939/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None 
attributes=None Dec 15 04:28:09 localhost python3.9[202056]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtproxyd-admin.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Dec 15 04:28:10 localhost python3.9[202144]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtproxyd-admin.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1765790889.451423-2306-237837297010233/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None Dec 15 04:28:10 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:fc:1f:13 MACDST=fa:16:3e:b3:bb:e3 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=60686 DF PROTO=TCP SPT=56656 DPT=9101 SEQ=869779006 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A839901250000000001030307) Dec 15 04:28:11 localhost python3.9[202254]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtqemud.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Dec 15 04:28:11 localhost python3.9[202342]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtqemud.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1765790890.55651-2306-229220429956195/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None 
local_follow=None seuser=None serole=None selevel=None setype=None attributes=None Dec 15 04:28:12 localhost python3.9[202452]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtqemud-ro.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Dec 15 04:28:12 localhost python3.9[202540]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtqemud-ro.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1765790891.7552245-2306-36095696275736/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None Dec 15 04:28:13 localhost python3.9[202650]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtqemud-admin.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Dec 15 04:28:13 localhost systemd[1]: Started /usr/bin/podman healthcheck run ac2eae30a2a7f971398af42ea67b3ed799d4b3341f7688ebcd73f7ca7f66c2a8. 
Dec 15 04:28:13 localhost podman[202738]: 2025-12-15 09:28:13.739431473 +0000 UTC m=+0.071851219 container health_status ac2eae30a2a7f971398af42ea67b3ed799d4b3341f7688ebcd73f7ca7f66c2a8 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, container_name=ovn_controller, managed_by=edpm_ansible, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '07f3af15d79276418f54a8dde2fc251064527c3571430e14af77aaa2c01f1193'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_controller, org.label-schema.build-date=20251202) Dec 15 04:28:13 localhost podman[202738]: 2025-12-15 09:28:13.776714837 +0000 UTC m=+0.109134633 container exec_died ac2eae30a2a7f971398af42ea67b3ed799d4b3341f7688ebcd73f7ca7f66c2a8 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, config_id=ovn_controller, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 
'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '07f3af15d79276418f54a8dde2fc251064527c3571430e14af77aaa2c01f1193'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, org.label-schema.build-date=20251202, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, container_name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image) Dec 15 04:28:13 localhost systemd[1]: ac2eae30a2a7f971398af42ea67b3ed799d4b3341f7688ebcd73f7ca7f66c2a8.service: Deactivated successfully. 
Dec 15 04:28:13 localhost python3.9[202739]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtqemud-admin.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1765790892.86023-2306-49101156286685/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None Dec 15 04:28:14 localhost python3.9[202872]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtsecretd.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Dec 15 04:28:14 localhost python3.9[202960]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtsecretd.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1765790893.9900925-2306-85444725591926/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None Dec 15 04:28:15 localhost python3.9[203070]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtsecretd-ro.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Dec 15 04:28:16 localhost python3.9[203158]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtsecretd-ro.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1765790895.0983057-2306-196485328090824/.source.conf 
follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None Dec 15 04:28:16 localhost python3.9[203268]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtsecretd-admin.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Dec 15 04:28:17 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:fc:1f:13 MACDST=fa:16:3e:b3:bb:e3 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=16835 DF PROTO=TCP SPT=39236 DPT=9882 SEQ=4257720101 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A839919250000000001030307) Dec 15 04:28:17 localhost python3.9[203356]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtsecretd-admin.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1765790896.1897461-2306-41033508065207/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None Dec 15 04:28:17 localhost python3.9[203464]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail#012ls -lRZ /run/libvirt | grep -E ':container_\S+_t'#012 _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None Dec 15 04:28:18 localhost sshd[203539]: main: sshd: ssh-rsa algorithm is disabled Dec 15 04:28:18 localhost python3.9[203578]: 
ansible-ansible.posix.seboolean Invoked with name=os_enable_vtpm persistent=True state=True ignore_selinux_state=False
Dec 15 04:28:20 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:fc:1f:13 MACDST=fa:16:3e:b3:bb:e3 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=25127 DF PROTO=TCP SPT=59884 DPT=9105 SEQ=3685571738 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A839925250000000001030307)
Dec 15 04:28:20 localhost python3.9[203689]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True name=virtlogd.service state=restarted daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Dec 15 04:28:20 localhost systemd[1]: Reloading.
Dec 15 04:28:20 localhost systemd-rc-local-generator[203711]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 15 04:28:20 localhost systemd-sysv-generator[203717]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 15 04:28:20 localhost systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 15 04:28:20 localhost systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload
Dec 15 04:28:20 localhost systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 15 04:28:20 localhost systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 15 04:28:20 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Dec 15 04:28:20 localhost systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload
Dec 15 04:28:20 localhost systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 15 04:28:20 localhost systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 15 04:28:20 localhost systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload
Dec 15 04:28:20 localhost systemd[1]: Starting libvirt logging daemon socket...
Dec 15 04:28:20 localhost systemd[1]: Listening on libvirt logging daemon socket.
Dec 15 04:28:20 localhost systemd[1]: Starting libvirt logging daemon admin socket...
Dec 15 04:28:20 localhost systemd[1]: Listening on libvirt logging daemon admin socket.
Dec 15 04:28:20 localhost systemd[1]: Starting libvirt logging daemon...
Dec 15 04:28:20 localhost systemd[1]: Started libvirt logging daemon.
Dec 15 04:28:22 localhost python3.9[203841]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True name=virtnodedevd.service state=restarted daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Dec 15 04:28:22 localhost systemd[1]: Reloading.
Dec 15 04:28:22 localhost systemd-rc-local-generator[203864]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 15 04:28:22 localhost systemd-sysv-generator[203867]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 15 04:28:22 localhost systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 15 04:28:22 localhost systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload
Dec 15 04:28:22 localhost systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 15 04:28:22 localhost systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 15 04:28:22 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Dec 15 04:28:22 localhost systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload
Dec 15 04:28:22 localhost systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 15 04:28:22 localhost systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 15 04:28:22 localhost systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload
Dec 15 04:28:22 localhost systemd[1]: Starting SETroubleshoot daemon for processing new SELinux denial logs...
Dec 15 04:28:22 localhost systemd[1]: Starting libvirt nodedev daemon socket...
Dec 15 04:28:22 localhost systemd[1]: Listening on libvirt nodedev daemon socket.
Dec 15 04:28:22 localhost systemd[1]: Starting libvirt nodedev daemon admin socket...
Dec 15 04:28:22 localhost systemd[1]: Starting libvirt nodedev daemon read-only socket...
Dec 15 04:28:22 localhost systemd[1]: Listening on libvirt nodedev daemon admin socket.
Dec 15 04:28:22 localhost systemd[1]: Listening on libvirt nodedev daemon read-only socket.
Dec 15 04:28:22 localhost systemd[1]: Started libvirt nodedev daemon.
Dec 15 04:28:22 localhost systemd[1]: Started SETroubleshoot daemon for processing new SELinux denial logs.
Dec 15 04:28:22 localhost setroubleshoot[203895]: Deleting alert efd43a6a-fefb-4317-a644-4d61e31ab16b, it is allowed in current policy
Dec 15 04:28:22 localhost systemd[1]: Started dbus-:1.1-org.fedoraproject.SetroubleshootPrivileged@1.service.
Dec 15 04:28:23 localhost python3.9[204072]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True name=virtproxyd.service state=restarted daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Dec 15 04:28:23 localhost systemd[1]: Reloading.
Dec 15 04:28:23 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:fc:1f:13 MACDST=fa:16:3e:b3:bb:e3 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=24661 DF PROTO=TCP SPT=32954 DPT=9100 SEQ=1375483913 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A839931120000000001030307)
Dec 15 04:28:23 localhost systemd-rc-local-generator[204140]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 15 04:28:23 localhost systemd-sysv-generator[204144]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 15 04:28:23 localhost systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 15 04:28:23 localhost systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload
Dec 15 04:28:23 localhost systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 15 04:28:23 localhost systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 15 04:28:23 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Dec 15 04:28:23 localhost systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload
Dec 15 04:28:23 localhost systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 15 04:28:23 localhost systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 15 04:28:23 localhost systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload
Dec 15 04:28:23 localhost systemd[1]: Starting libvirt proxy daemon socket...
Dec 15 04:28:23 localhost systemd[1]: Listening on libvirt proxy daemon socket.
Dec 15 04:28:23 localhost systemd[1]: Starting libvirt proxy daemon admin socket...
Dec 15 04:28:23 localhost systemd[1]: Starting libvirt proxy daemon read-only socket...
Dec 15 04:28:23 localhost systemd[1]: Listening on libvirt proxy daemon admin socket.
Dec 15 04:28:23 localhost systemd[1]: Listening on libvirt proxy daemon read-only socket.
Dec 15 04:28:23 localhost systemd[1]: Started libvirt proxy daemon.
Dec 15 04:28:23 localhost setroubleshoot[203895]: SELinux is preventing /usr/sbin/virtlogd from using the dac_read_search capability. For complete SELinux messages run: sealert -l d4e42d77-78ef-42b9-89f3-c4d585eaaafe
Dec 15 04:28:23 localhost setroubleshoot[203895]: SELinux is preventing /usr/sbin/virtlogd from using the dac_read_search capability.#012#012***** Plugin dac_override (91.4 confidence) suggests **********************#012#012If you want to help identify if domain needs this access or you have a file with the wrong permissions on your system#012Then turn on full auditing to get path information about the offending file and generate the error again.#012Do#012#012Turn on full auditing#012# auditctl -w /etc/shadow -p w#012Try to recreate AVC. Then execute#012# ausearch -m avc -ts recent#012If you see PATH record check ownership/permissions on file, and fix it,#012otherwise report as a bugzilla.#012#012***** Plugin catchall (9.59 confidence) suggests **************************#012#012If you believe that virtlogd should have the dac_read_search capability by default.#012Then you should report this as a bug.#012You can generate a local policy module to allow this access.#012Do#012allow this access for now by executing:#012# ausearch -c 'virtlogd' --raw | audit2allow -M my-virtlogd#012# semodule -X 300 -i my-virtlogd.pp#012
Dec 15 04:28:23 localhost setroubleshoot[203895]: SELinux is preventing /usr/sbin/virtlogd from using the dac_read_search capability. For complete SELinux messages run: sealert -l d4e42d77-78ef-42b9-89f3-c4d585eaaafe
Dec 15 04:28:23 localhost setroubleshoot[203895]: SELinux is preventing /usr/sbin/virtlogd from using the dac_read_search capability.#012#012***** Plugin dac_override (91.4 confidence) suggests **********************#012#012If you want to help identify if domain needs this access or you have a file with the wrong permissions on your system#012Then turn on full auditing to get path information about the offending file and generate the error again.#012Do#012#012Turn on full auditing#012# auditctl -w /etc/shadow -p w#012Try to recreate AVC. Then execute#012# ausearch -m avc -ts recent#012If you see PATH record check ownership/permissions on file, and fix it,#012otherwise report as a bugzilla.#012#012***** Plugin catchall (9.59 confidence) suggests **************************#012#012If you believe that virtlogd should have the dac_read_search capability by default.#012Then you should report this as a bug.#012You can generate a local policy module to allow this access.#012Do#012allow this access for now by executing:#012# ausearch -c 'virtlogd' --raw | audit2allow -M my-virtlogd#012# semodule -X 300 -i my-virtlogd.pp#012
Dec 15 04:28:24 localhost python3.9[204321]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True name=virtqemud.service state=restarted daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Dec 15 04:28:24 localhost systemd[1]: Reloading.
Dec 15 04:28:24 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:fc:1f:13 MACDST=fa:16:3e:b3:bb:e3 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=24662 DF PROTO=TCP SPT=32954 DPT=9100 SEQ=1375483913 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A839935250000000001030307)
Dec 15 04:28:24 localhost systemd-sysv-generator[204347]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 15 04:28:24 localhost systemd-rc-local-generator[204343]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 15 04:28:24 localhost systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 15 04:28:24 localhost systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload
Dec 15 04:28:24 localhost systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 15 04:28:24 localhost systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 15 04:28:24 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Dec 15 04:28:24 localhost systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload
Dec 15 04:28:24 localhost systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 15 04:28:24 localhost systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 15 04:28:24 localhost systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload
Dec 15 04:28:24 localhost systemd[1]: Listening on libvirt locking daemon socket.
Dec 15 04:28:24 localhost systemd[1]: Starting libvirt QEMU daemon socket...
Dec 15 04:28:24 localhost systemd[1]: Listening on libvirt QEMU daemon socket.
Dec 15 04:28:24 localhost systemd[1]: Starting libvirt QEMU daemon admin socket...
Dec 15 04:28:24 localhost systemd[1]: Starting libvirt QEMU daemon read-only socket...
Dec 15 04:28:24 localhost systemd[1]: Listening on libvirt QEMU daemon admin socket.
Dec 15 04:28:24 localhost systemd[1]: Listening on libvirt QEMU daemon read-only socket.
Dec 15 04:28:24 localhost systemd[1]: Started libvirt QEMU daemon.
Dec 15 04:28:25 localhost python3.9[204520]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True name=virtsecretd.service state=restarted daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Dec 15 04:28:25 localhost systemd[1]: Reloading.
Dec 15 04:28:25 localhost systemd-rc-local-generator[204553]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 15 04:28:25 localhost systemd-sysv-generator[204556]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 15 04:28:25 localhost systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 15 04:28:25 localhost systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload
Dec 15 04:28:25 localhost systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 15 04:28:25 localhost systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 15 04:28:25 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Dec 15 04:28:25 localhost systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload
Dec 15 04:28:25 localhost systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 15 04:28:25 localhost systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 15 04:28:25 localhost systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload
Dec 15 04:28:25 localhost systemd[1]: Started /usr/bin/podman healthcheck run 4d46c406a3855fd1c322e5d86c048188ea5865daa07ba23aaebacc7879668923.
Dec 15 04:28:25 localhost systemd[1]: Starting libvirt secret daemon socket...
Dec 15 04:28:25 localhost systemd[1]: Listening on libvirt secret daemon socket.
Dec 15 04:28:25 localhost systemd[1]: Starting libvirt secret daemon admin socket...
Dec 15 04:28:25 localhost systemd[1]: Starting libvirt secret daemon read-only socket...
Dec 15 04:28:25 localhost systemd[1]: Listening on libvirt secret daemon admin socket.
Dec 15 04:28:25 localhost systemd[1]: Listening on libvirt secret daemon read-only socket.
Dec 15 04:28:25 localhost systemd[1]: Started libvirt secret daemon.
Dec 15 04:28:25 localhost podman[204569]: 2025-12-15 09:28:25.715867815 +0000 UTC m=+0.091768698 container health_status 4d46c406a3855fd1c322e5d86c048188ea5865daa07ba23aaebacc7879668923 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '07f3af15d79276418f54a8dde2fc251064527c3571430e14af77aaa2c01f1193-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ovn_metadata_agent, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 15 04:28:25 localhost podman[204569]: 2025-12-15 09:28:25.750342638 +0000 UTC m=+0.126243481 container exec_died 4d46c406a3855fd1c322e5d86c048188ea5865daa07ba23aaebacc7879668923 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '07f3af15d79276418f54a8dde2fc251064527c3571430e14af77aaa2c01f1193-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.build-date=20251202)
Dec 15 04:28:25 localhost systemd[1]: 4d46c406a3855fd1c322e5d86c048188ea5865daa07ba23aaebacc7879668923.service: Deactivated successfully.
Dec 15 04:28:25 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:fc:1f:13 MACDST=fa:16:3e:b3:bb:e3 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=46130 DF PROTO=TCP SPT=51948 DPT=9102 SEQ=1433948725 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A83993B650000000001030307)
Dec 15 04:28:27 localhost python3.9[204720]: ansible-ansible.builtin.file Invoked with mode=0755 path=/var/lib/openstack/config/ceph state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 15 04:28:27 localhost python3.9[204830]: ansible-ansible.builtin.find Invoked with paths=['/var/lib/openstack/config/ceph'] patterns=['*.conf'] read_whole_file=False file_type=file age_stamp=mtime recurse=False hidden=False follow=False get_checksum=False checksum_algorithm=sha1 use_regex=False exact_mode=True excludes=None contains=None age=None size=None depth=None mode=None encoding=None limit=None
Dec 15 04:28:28 localhost python3.9[204940]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail;#012echo ceph#012awk -F '=' '/fsid/ {print $2}' /var/lib/openstack/config/ceph/ceph.conf | xargs#012 _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 15 04:28:28 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:fc:1f:13 MACDST=fa:16:3e:b3:bb:e3 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=34351 DF PROTO=TCP SPT=33936 DPT=9101 SEQ=536724456 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A839947250000000001030307)
Dec 15 04:28:29 localhost python3.9[205052]: ansible-ansible.builtin.find Invoked with paths=['/var/lib/openstack/config/ceph'] patterns=['*.keyring'] read_whole_file=False file_type=file age_stamp=mtime recurse=False hidden=False follow=False get_checksum=False checksum_algorithm=sha1 use_regex=False exact_mode=True excludes=None contains=None age=None size=None depth=None mode=None encoding=None limit=None
Dec 15 04:28:30 localhost python3.9[205160]: ansible-ansible.legacy.stat Invoked with path=/tmp/secret.xml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 15 04:28:30 localhost python3.9[205246]: ansible-ansible.legacy.copy Invoked with dest=/tmp/secret.xml mode=0600 src=/home/zuul/.ansible/tmp/ansible-tmp-1765790909.495951-3170-121006465518635/.source.xml follow=False _original_basename=secret.xml.j2 checksum=54d879b08295e5ec0dce633a7f674332cd786afd backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 15 04:28:31 localhost python3.9[205356]: ansible-ansible.legacy.command Invoked with _raw_params=virsh secret-undefine bce17446-41b5-5408-a23e-0b011906b44a#012virsh secret-define --file /tmp/secret.xml#012 _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 15 04:28:31 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:fc:1f:13 MACDST=fa:16:3e:b3:bb:e3 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=2806 DF PROTO=TCP SPT=41270 DPT=9882 SEQ=1458337883 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A839952BA0000000001030307)
Dec 15 04:28:32 localhost python3.9[205476]: ansible-ansible.builtin.file Invoked with path=/tmp/secret.xml state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 15 04:28:33 localhost systemd[1]: dbus-:1.1-org.fedoraproject.SetroubleshootPrivileged@1.service: Deactivated successfully.
Dec 15 04:28:34 localhost systemd[1]: setroubleshootd.service: Deactivated successfully.
Dec 15 04:28:34 localhost python3.9[205813]: ansible-ansible.legacy.copy Invoked with dest=/etc/ceph/ceph.conf group=root mode=0644 owner=root remote_src=True src=/var/lib/openstack/config/ceph/ceph.conf backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 15 04:28:34 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:fc:1f:13 MACDST=fa:16:3e:b3:bb:e3 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=2808 DF PROTO=TCP SPT=41270 DPT=9882 SEQ=1458337883 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A83995EA50000000001030307)
Dec 15 04:28:35 localhost python3.9[205923]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/libvirt.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 15 04:28:35 localhost python3.9[206011]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/firewall/libvirt.yaml mode=0640 src=/home/zuul/.ansible/tmp/ansible-tmp-1765790915.042686-3335-89872986090997/.source.yaml follow=False _original_basename=firewall.yaml.j2 checksum=dc5ee7162311c27a6084cbee4052b901d56cb1ba backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 15 04:28:36 localhost python3.9[206121]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/var/lib/edpm-config/firewall state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 15 04:28:37 localhost python3.9[206231]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 15 04:28:37 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:fc:1f:13 MACDST=fa:16:3e:b3:bb:e3 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=48187 DF PROTO=TCP SPT=49126 DPT=9105 SEQ=857230444 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A83996A650000000001030307)
Dec 15 04:28:38 localhost python3.9[206288]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml _original_basename=base-rules.yaml.j2 recurse=False state=file path=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 15 04:28:38 localhost python3.9[206398]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 15 04:28:39 localhost python3.9[206455]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml _original_basename=.mdpbxe1i recurse=False state=file path=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 15 04:28:39 localhost python3.9[206565]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/iptables.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 15 04:28:40 localhost python3.9[206622]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/iptables.nft _original_basename=iptables.nft recurse=False state=file path=/etc/nftables/iptables.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 15 04:28:40 localhost python3.9[206732]: ansible-ansible.legacy.command Invoked with _raw_params=nft -j list ruleset _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 15 04:28:41 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:fc:1f:13 MACDST=fa:16:3e:b3:bb:e3 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=34353 DF PROTO=TCP SPT=33936 DPT=9101 SEQ=536724456 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A839977250000000001030307)
Dec 15 04:28:41 localhost python3[206843]: ansible-edpm_nftables_from_files Invoked with src=/var/lib/edpm-config/firewall
Dec 15 04:28:42 localhost python3.9[206953]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 15 04:28:42 localhost python3.9[207010]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/edpm-jumps.nft _original_basename=jump-chain.j2 recurse=False state=file path=/etc/nftables/edpm-jumps.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 15 04:28:43 localhost python3.9[207120]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-update-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 15 04:28:44 localhost systemd[1]: Started /usr/bin/podman healthcheck run ac2eae30a2a7f971398af42ea67b3ed799d4b3341f7688ebcd73f7ca7f66c2a8.
Dec 15 04:28:44 localhost podman[207178]: 2025-12-15 09:28:44.169372617 +0000 UTC m=+0.083116522 container health_status ac2eae30a2a7f971398af42ea67b3ed799d4b3341f7688ebcd73f7ca7f66c2a8 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '07f3af15d79276418f54a8dde2fc251064527c3571430e14af77aaa2c01f1193'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.license=GPLv2)
Dec 15 04:28:44 localhost podman[207178]: 2025-12-15 09:28:44.23693541 +0000 UTC m=+0.150679335 container exec_died ac2eae30a2a7f971398af42ea67b3ed799d4b3341f7688ebcd73f7ca7f66c2a8 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_controller, managed_by=edpm_ansible, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '07f3af15d79276418f54a8dde2fc251064527c3571430e14af77aaa2c01f1193'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202, config_id=ovn_controller)
Dec 15 04:28:44 localhost systemd[1]: ac2eae30a2a7f971398af42ea67b3ed799d4b3341f7688ebcd73f7ca7f66c2a8.service: Deactivated successfully.
Dec 15 04:28:44 localhost python3.9[207177]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/edpm-update-jumps.nft _original_basename=jump-chain.j2 recurse=False state=file path=/etc/nftables/edpm-update-jumps.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None Dec 15 04:28:44 localhost python3.9[207313]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-flushes.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Dec 15 04:28:45 localhost python3.9[207370]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/edpm-flushes.nft _original_basename=flush-chain.j2 recurse=False state=file path=/etc/nftables/edpm-flushes.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None Dec 15 04:28:45 localhost kernel: DROPPING: IN=eth0 OUT= MACSRC=c6:e7:bc:23:0b:06 MACDST=fa:16:3e:ba:ca:0f MACPROTO=0800 SRC=18.223.126.122 DST=38.102.83.173 LEN=52 TOS=0x00 PREC=0x00 TTL=50 ID=391 PROTO=TCP SPT=56688 DPT=9090 SEQ=1023685474 ACK=0 WINDOW=65535 RES=0x00 SYN URGP=0 OPT (020405B40103030801010402) Dec 15 04:28:46 localhost python3.9[207480]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-chains.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Dec 15 04:28:46 localhost python3.9[207537]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/edpm-chains.nft _original_basename=chains.j2 
recurse=False state=file path=/etc/nftables/edpm-chains.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None Dec 15 04:28:47 localhost python3.9[207647]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-rules.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Dec 15 04:28:47 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:fc:1f:13 MACDST=fa:16:3e:b3:bb:e3 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=2810 DF PROTO=TCP SPT=41270 DPT=9882 SEQ=1458337883 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A83998F250000000001030307) Dec 15 04:28:48 localhost python3.9[207737]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-rules.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1765790926.712525-3710-43229070763116/.source.nft follow=False _original_basename=ruleset.j2 checksum=e2e2635f27347d386f310e86d2b40c40289835bb backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None Dec 15 04:28:48 localhost python3.9[207847]: ansible-ansible.builtin.file Invoked with group=root mode=0600 owner=root path=/etc/nftables/edpm-rules.nft.changed state=touch recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None Dec 15 04:28:49 localhost python3.9[207957]: ansible-ansible.legacy.command Invoked with _raw_params=set -o 
pipefail; cat /etc/nftables/edpm-chains.nft /etc/nftables/edpm-flushes.nft /etc/nftables/edpm-rules.nft /etc/nftables/edpm-update-jumps.nft /etc/nftables/edpm-jumps.nft | nft -c -f - _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None Dec 15 04:28:50 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:fc:1f:13 MACDST=fa:16:3e:b3:bb:e3 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=48189 DF PROTO=TCP SPT=49126 DPT=9105 SEQ=857230444 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A83999B260000000001030307) Dec 15 04:28:50 localhost python3.9[208070]: ansible-ansible.builtin.blockinfile Invoked with backup=False block=include "/etc/nftables/iptables.nft"#012include "/etc/nftables/edpm-chains.nft"#012include "/etc/nftables/edpm-rules.nft"#012include "/etc/nftables/edpm-jumps.nft"#012 path=/etc/sysconfig/nftables.conf validate=nft -c -f %s state=present marker=# {mark} ANSIBLE MANAGED BLOCK create=False marker_begin=BEGIN marker_end=END append_newline=False prepend_newline=False encoding=utf-8 unsafe_writes=False insertafter=None insertbefore=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Dec 15 04:28:51 localhost python3.9[208180]: ansible-ansible.legacy.command Invoked with _raw_params=nft -f /etc/nftables/edpm-chains.nft _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None Dec 15 04:28:51 localhost ovn_metadata_agent[160585]: 2025-12-15 09:28:51.431 160590 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Dec 15 04:28:51 localhost 
ovn_metadata_agent[160585]: 2025-12-15 09:28:51.431 160590 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Dec 15 04:28:51 localhost ovn_metadata_agent[160585]: 2025-12-15 09:28:51.433 160590 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Dec 15 04:28:51 localhost python3.9[208291]: ansible-ansible.builtin.stat Invoked with path=/etc/nftables/edpm-rules.nft.changed follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1 Dec 15 04:28:52 localhost python3.9[208403]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail; cat /etc/nftables/edpm-flushes.nft /etc/nftables/edpm-rules.nft /etc/nftables/edpm-update-jumps.nft | nft -f - _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None Dec 15 04:28:53 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:fc:1f:13 MACDST=fa:16:3e:b3:bb:e3 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=59903 DF PROTO=TCP SPT=59040 DPT=9100 SEQ=3275774266 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A8399A6430000000001030307) Dec 15 04:28:53 localhost python3.9[208516]: ansible-ansible.builtin.file Invoked with path=/etc/nftables/edpm-rules.nft.changed state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None 
mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Dec 15 04:28:54 localhost python3.9[208626]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/edpm_libvirt.target follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Dec 15 04:28:54 localhost python3.9[208714]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/edpm_libvirt.target mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1765790933.560986-3926-226716975534595/.source.target follow=False _original_basename=edpm_libvirt.target checksum=13035a1aa0f414c677b14be9a5a363b6623d393c backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Dec 15 04:28:55 localhost python3.9[208824]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/edpm_libvirt_guests.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Dec 15 04:28:55 localhost python3.9[208912]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/edpm_libvirt_guests.service mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1765790934.7487326-3971-125342480082329/.source.service follow=False _original_basename=edpm_libvirt_guests.service checksum=db83430a42fc2ccfd6ed8b56ebf04f3dff9cd0cf backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Dec 15 04:28:56 localhost systemd[1]: Started /usr/bin/podman healthcheck run 4d46c406a3855fd1c322e5d86c048188ea5865daa07ba23aaebacc7879668923. 
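The nftables entries above show edpm_ansible assembling the firewall from split files (edpm-chains.nft, edpm-flushes.nft, edpm-rules.nft, edpm-update-jumps.nft, edpm-jumps.nft), dry-run validating them with `nft -c -f -`, and registering them for boot via blockinfile in /etc/sysconfig/nftables.conf. Reconstructed from the logged blockinfile parameters (`#012` is journald's escaped newline; marker_begin/end are BEGIN/END), the managed block would read:

```
# BEGIN ANSIBLE MANAGED BLOCK
include "/etc/nftables/iptables.nft"
include "/etc/nftables/edpm-chains.nft"
include "/etc/nftables/edpm-rules.nft"
include "/etc/nftables/edpm-jumps.nft"
# END ANSIBLE MANAGED BLOCK
```

The `validate=nft -c -f %s` option logged with the task means the merged file is syntax-checked before being moved into place, matching the separate `nft -c` pipeline run two entries earlier.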
Dec 15 04:28:56 localhost systemd[1]: tmp-crun.bZU2BI.mount: Deactivated successfully. Dec 15 04:28:56 localhost podman[209023]: 2025-12-15 09:28:56.317397939 +0000 UTC m=+0.095191176 container health_status 4d46c406a3855fd1c322e5d86c048188ea5865daa07ba23aaebacc7879668923 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, config_id=ovn_metadata_agent, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '07f3af15d79276418f54a8dde2fc251064527c3571430e14af77aaa2c01f1193-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, tcib_managed=true, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, 
org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0) Dec 15 04:28:56 localhost podman[209023]: 2025-12-15 09:28:56.349558483 +0000 UTC m=+0.127351690 container exec_died 4d46c406a3855fd1c322e5d86c048188ea5865daa07ba23aaebacc7879668923 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '07f3af15d79276418f54a8dde2fc251064527c3571430e14af77aaa2c01f1193-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ovn_metadata_agent, org.label-schema.build-date=20251202, 
org.label-schema.vendor=CentOS) Dec 15 04:28:56 localhost systemd[1]: 4d46c406a3855fd1c322e5d86c048188ea5865daa07ba23aaebacc7879668923.service: Deactivated successfully. Dec 15 04:28:56 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:fc:1f:13 MACDST=fa:16:3e:b3:bb:e3 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=59905 DF PROTO=TCP SPT=59040 DPT=9100 SEQ=3275774266 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A8399B2650000000001030307) Dec 15 04:28:56 localhost python3.9[209022]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virt-guest-shutdown.target follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Dec 15 04:28:56 localhost python3.9[209129]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virt-guest-shutdown.target mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1765790935.9378808-4016-14146097435082/.source.target follow=False _original_basename=virt-guest-shutdown.target checksum=49ca149619c596cbba877418629d2cf8f7b0f5cf backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Dec 15 04:28:58 localhost python3.9[209239]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=edpm_libvirt.target state=restarted daemon_reexec=False scope=system no_block=False force=None masked=None Dec 15 04:28:58 localhost systemd[1]: Reloading. Dec 15 04:28:58 localhost systemd-rc-local-generator[209261]: /etc/rc.d/rc.local is not marked executable, skipping. Dec 15 04:28:58 localhost systemd-sysv-generator[209267]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. 
Please update package to include a native systemd unit file, in order to make it more safe and robust. Dec 15 04:28:58 localhost systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload Dec 15 04:28:58 localhost systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload Dec 15 04:28:58 localhost systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload Dec 15 04:28:58 localhost systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload Dec 15 04:28:58 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Dec 15 04:28:58 localhost systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload Dec 15 04:28:58 localhost systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload Dec 15 04:28:58 localhost systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload Dec 15 04:28:58 localhost systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload Dec 15 04:28:58 localhost systemd[1]: Reached target edpm_libvirt.target. 
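The repeated "Failed to parse service type, ignoring: notify-reload" messages during each daemon-reload are benign: the libvirt modular daemon units ship a service type that this systemd (RHEL 9 / systemd 252-era) does not yet know — `Type=notify-reload` was introduced in systemd 253. The offending directive in those unit files is, presumably, of this shape:

```
[Service]
Type=notify-reload
```

An older systemd falls back to treating the unit as the default type and logs the parse warning, so the libvirt daemons still start; the messages can be ignored until the host's systemd understands notify-reload.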
Dec 15 04:28:58 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:fc:1f:13 MACDST=fa:16:3e:b3:bb:e3 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=28357 DF PROTO=TCP SPT=51028 DPT=9101 SEQ=1372211531 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A8399BC650000000001030307) Dec 15 04:28:59 localhost python3.9[209388]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=edpm_libvirt_guests daemon_reexec=False scope=system no_block=False state=None force=None masked=None Dec 15 04:28:59 localhost systemd[1]: Reloading. Dec 15 04:29:00 localhost systemd-sysv-generator[209415]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. Dec 15 04:29:00 localhost systemd-rc-local-generator[209411]: /etc/rc.d/rc.local is not marked executable, skipping. Dec 15 04:29:00 localhost systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload Dec 15 04:29:00 localhost systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload Dec 15 04:29:00 localhost systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload Dec 15 04:29:00 localhost systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload Dec 15 04:29:00 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. 
Dec 15 04:29:00 localhost systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload Dec 15 04:29:00 localhost systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload Dec 15 04:29:00 localhost systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload Dec 15 04:29:00 localhost systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload Dec 15 04:29:00 localhost systemd[1]: Reloading. Dec 15 04:29:00 localhost systemd-rc-local-generator[209450]: /etc/rc.d/rc.local is not marked executable, skipping. Dec 15 04:29:00 localhost systemd-sysv-generator[209454]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. Dec 15 04:29:00 localhost systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload Dec 15 04:29:00 localhost systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload Dec 15 04:29:00 localhost systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload Dec 15 04:29:00 localhost systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload Dec 15 04:29:00 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. 
Dec 15 04:29:00 localhost systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload Dec 15 04:29:00 localhost systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload Dec 15 04:29:00 localhost systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload Dec 15 04:29:00 localhost systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload Dec 15 04:29:01 localhost systemd[1]: session-52.scope: Deactivated successfully. Dec 15 04:29:01 localhost systemd[1]: session-52.scope: Consumed 3min 38.755s CPU time. Dec 15 04:29:01 localhost systemd-logind[763]: Session 52 logged out. Waiting for processes to exit. Dec 15 04:29:01 localhost systemd-logind[763]: Removed session 52. Dec 15 04:29:01 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:fc:1f:13 MACDST=fa:16:3e:b3:bb:e3 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=50596 DF PROTO=TCP SPT=46338 DPT=9882 SEQ=3708499562 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A8399C7E90000000001030307) Dec 15 04:29:04 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:fc:1f:13 MACDST=fa:16:3e:b3:bb:e3 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=50598 DF PROTO=TCP SPT=46338 DPT=9882 SEQ=3708499562 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A8399D3E50000000001030307) Dec 15 04:29:06 localhost sshd[209481]: main: sshd: ssh-rsa algorithm is disabled Dec 15 04:29:06 localhost systemd-logind[763]: New session 53 of user zuul. Dec 15 04:29:07 localhost systemd[1]: Started Session 53 of User zuul. 
Dec 15 04:29:07 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:fc:1f:13 MACDST=fa:16:3e:b3:bb:e3 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=5489 DF PROTO=TCP SPT=35192 DPT=9105 SEQ=2537868783 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A8399DFA50000000001030307) Dec 15 04:29:08 localhost python3.9[209592]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d Dec 15 04:29:10 localhost python3.9[209704]: ansible-ansible.builtin.service_facts Invoked Dec 15 04:29:10 localhost network[209721]: You are using 'network' service provided by 'network-scripts', which are now deprecated. Dec 15 04:29:10 localhost network[209722]: 'network-scripts' will be removed from distribution in near future. Dec 15 04:29:10 localhost network[209723]: It is advised to switch to 'NetworkManager' instead for network management. Dec 15 04:29:11 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:fc:1f:13 MACDST=fa:16:3e:b3:bb:e3 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=28359 DF PROTO=TCP SPT=51028 DPT=9101 SEQ=1372211531 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A8399ED250000000001030307) Dec 15 04:29:11 localhost systemd[1]: /usr/lib/systemd/system/insights-client.service:23: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Dec 15 04:29:14 localhost systemd[1]: Started /usr/bin/podman healthcheck run ac2eae30a2a7f971398af42ea67b3ed799d4b3341f7688ebcd73f7ca7f66c2a8. Dec 15 04:29:14 localhost systemd[1]: tmp-crun.IB2G3A.mount: Deactivated successfully. 
Dec 15 04:29:14 localhost podman[209863]: 2025-12-15 09:29:14.764896042 +0000 UTC m=+0.092416441 container health_status ac2eae30a2a7f971398af42ea67b3ed799d4b3341f7688ebcd73f7ca7f66c2a8 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '07f3af15d79276418f54a8dde2fc251064527c3571430e14af77aaa2c01f1193'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, config_id=ovn_controller, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, managed_by=edpm_ansible, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, container_name=ovn_controller) Dec 15 04:29:14 localhost podman[209863]: 2025-12-15 09:29:14.829897548 +0000 UTC m=+0.157417947 container exec_died ac2eae30a2a7f971398af42ea67b3ed799d4b3341f7688ebcd73f7ca7f66c2a8 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 
'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '07f3af15d79276418f54a8dde2fc251064527c3571430e14af77aaa2c01f1193'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, container_name=ovn_controller, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.schema-version=1.0, tcib_managed=true) Dec 15 04:29:14 localhost systemd[1]: ac2eae30a2a7f971398af42ea67b3ed799d4b3341f7688ebcd73f7ca7f66c2a8.service: Deactivated successfully. 
Dec 15 04:29:15 localhost python3.9[209981]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d Dec 15 04:29:16 localhost python3.9[210044]: ansible-ansible.legacy.dnf Invoked with name=['iscsi-initiator-utils'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None Dec 15 04:29:17 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:fc:1f:13 MACDST=fa:16:3e:b3:bb:e3 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=50600 DF PROTO=TCP SPT=46338 DPT=9882 SEQ=3708499562 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A839A03250000000001030307) Dec 15 04:29:20 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:fc:1f:13 MACDST=fa:16:3e:b3:bb:e3 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=5491 DF PROTO=TCP SPT=35192 DPT=9105 SEQ=2537868783 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A839A0F250000000001030307) Dec 15 04:29:23 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:fc:1f:13 MACDST=fa:16:3e:b3:bb:e3 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=61901 DF PROTO=TCP SPT=34160 DPT=9100 SEQ=401897763 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A839A1B720000000001030307) Dec 15 04:29:24 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:fc:1f:13 MACDST=fa:16:3e:b3:bb:e3 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 
ID=61902 DF PROTO=TCP SPT=34160 DPT=9100 SEQ=401897763 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A839A1F650000000001030307) Dec 15 04:29:24 localhost python3.9[210156]: ansible-ansible.builtin.stat Invoked with path=/var/lib/config-data/puppet-generated/iscsid/etc/iscsi follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1 Dec 15 04:29:25 localhost python3.9[210319]: ansible-ansible.legacy.copy Invoked with dest=/etc/iscsi mode=preserve remote_src=True src=/var/lib/config-data/puppet-generated/iscsid/etc/iscsi/ backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Dec 15 04:29:26 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:fc:1f:13 MACDST=fa:16:3e:b3:bb:e3 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=61903 DF PROTO=TCP SPT=34160 DPT=9100 SEQ=401897763 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A839A27650000000001030307) Dec 15 04:29:26 localhost python3.9[210450]: ansible-ansible.legacy.command Invoked with _raw_params=mv "/var/lib/config-data/puppet-generated/iscsid/etc/iscsi" "/var/lib/config-data/puppet-generated/iscsid/etc/iscsi.adopted"#012 _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None Dec 15 04:29:26 localhost systemd[1]: Started /usr/bin/podman healthcheck run 4d46c406a3855fd1c322e5d86c048188ea5865daa07ba23aaebacc7879668923. 
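The recurring kernel `DROPPING:` entries use the standard netfilter packet-log key=value format (IN=, SRC=, DST=, SPT=, DPT=, plus bare flag words such as DF and SYN). As an illustration only (not part of the deployment run logged here), a minimal Python sketch that splits such a line into its fields:

```python
import re

# Netfilter packet-log entries are space-separated KEY=VALUE tokens;
# bare flag words (DF, SYN, ACK, ...) carry no '=' and are skipped here.
DROP_RE = re.compile(r'(\w+)=(\S*)')

def parse_drop(line: str) -> dict:
    """Return the KEY=VALUE fields of a kernel packet-log line as a dict."""
    return dict(DROP_RE.findall(line))

# Sample taken from one of the DROPPING entries above (abridged).
sample = ("DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:fc:1f:13 "
          "MACDST=fa:16:3e:b3:bb:e3 MACPROTO=0800 SRC=192.168.122.10 "
          "DST=192.168.122.106 LEN=60 TTL=62 ID=2810 DF PROTO=TCP "
          "SPT=41270 DPT=9882 SYN URGP=0")

f = parse_drop(sample)
print(f["SRC"], f["DST"], f["PROTO"], f["DPT"])
# → 192.168.122.10 192.168.122.106 TCP 9882
```

Grouping these parsed entries by DPT would show the pattern visible in the raw log: the drops from 192.168.122.10 are SYNs to monitoring/exporter ports (9100, 9101, 9105, 9882), retransmitted and dropped by the edpm nftables ruleset.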
Dec 15 04:29:26 localhost podman[210537]: 2025-12-15 09:29:26.769947869 +0000 UTC m=+0.086562845 container health_status 4d46c406a3855fd1c322e5d86c048188ea5865daa07ba23aaebacc7879668923 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '07f3af15d79276418f54a8dde2fc251064527c3571430e14af77aaa2c01f1193-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, managed_by=edpm_ansible, tcib_managed=true, config_id=ovn_metadata_agent) Dec 15 04:29:26 localhost 
podman[210537]: 2025-12-15 09:29:26.779321751 +0000 UTC m=+0.095936756 container exec_died 4d46c406a3855fd1c322e5d86c048188ea5865daa07ba23aaebacc7879668923 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '07f3af15d79276418f54a8dde2fc251064527c3571430e14af77aaa2c01f1193-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.build-date=20251202, config_id=ovn_metadata_agent, tcib_managed=true) Dec 15 04:29:26 localhost systemd[1]: 
4d46c406a3855fd1c322e5d86c048188ea5865daa07ba23aaebacc7879668923.service: Deactivated successfully. Dec 15 04:29:26 localhost python3.9[210593]: ansible-ansible.legacy.command Invoked with _raw_params=/usr/sbin/restorecon -nvr /etc/iscsi /var/lib/iscsi _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None Dec 15 04:29:27 localhost python3.9[210704]: ansible-ansible.legacy.command Invoked with _raw_params=/usr/sbin/restorecon -rF /etc/iscsi /var/lib/iscsi _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None Dec 15 04:29:28 localhost python3.9[210815]: ansible-ansible.builtin.stat Invoked with path=/etc/iscsi/.initiator_reset follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1 Dec 15 04:29:28 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:fc:1f:13 MACDST=fa:16:3e:b3:bb:e3 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=61216 DF PROTO=TCP SPT=33950 DPT=9101 SEQ=1255619058 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A839A31650000000001030307) Dec 15 04:29:29 localhost python3.9[210927]: ansible-ansible.builtin.lineinfile Invoked with insertafter=^#node.session.auth.chap.algs line=node.session.auth.chap_algs = SHA3-256,SHA256,SHA1,MD5 path=/etc/iscsi/iscsid.conf regexp=^node.session.auth.chap_algs state=present encoding=utf-8 backrefs=False create=False backup=False firstmatch=False unsafe_writes=False search_string=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Dec 15 04:29:30 localhost python3.9[211037]: ansible-ansible.builtin.systemd_service Invoked with enabled=True name=iscsid.socket state=started daemon_reload=False 
daemon_reexec=False scope=system no_block=False force=None masked=None Dec 15 04:29:30 localhost systemd[1]: Listening on Open-iSCSI iscsid Socket. Dec 15 04:29:31 localhost python3.9[211151]: ansible-ansible.builtin.systemd_service Invoked with enabled=True name=iscsid state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None Dec 15 04:29:31 localhost systemd[1]: Reloading. Dec 15 04:29:31 localhost systemd-sysv-generator[211183]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. Dec 15 04:29:31 localhost systemd-rc-local-generator[211178]: /etc/rc.d/rc.local is not marked executable, skipping. Dec 15 04:29:31 localhost systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload Dec 15 04:29:31 localhost systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload Dec 15 04:29:31 localhost systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload Dec 15 04:29:31 localhost systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload Dec 15 04:29:31 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. 
Dec 15 04:29:31 localhost systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload
Dec 15 04:29:31 localhost systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 15 04:29:31 localhost systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 15 04:29:31 localhost systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload
Dec 15 04:29:31 localhost systemd[1]: One time configuration for iscsi.service was skipped because of an unmet condition check (ConditionPathExists=!/etc/iscsi/initiatorname.iscsi).
Dec 15 04:29:31 localhost systemd[1]: Starting Open-iSCSI...
Dec 15 04:29:31 localhost iscsid[211192]: iscsid: can't open InitiatorName configuration file /etc/iscsi/initiatorname.iscsi
Dec 15 04:29:31 localhost iscsid[211192]: iscsid: Warning: InitiatorName file /etc/iscsi/initiatorname.iscsi does not exist or does not contain a properly formatted InitiatorName. If using software iscsi (iscsi_tcp or ib_iser) or partial offload (bnx2i or cxgbi iscsi), you may not be able to log into or discover targets. Please create a file /etc/iscsi/initiatorname.iscsi that contains a string with the format: InitiatorName=iqn.yyyy-mm.<reversed domain name>[:identifier].
Dec 15 04:29:31 localhost iscsid[211192]: Example: InitiatorName=iqn.2001-04.com.redhat:fc6.
Dec 15 04:29:31 localhost iscsid[211192]: If using hardware iscsi like qla4xxx this message can be ignored.
Dec 15 04:29:31 localhost iscsid[211192]: iscsid: can't open InitiatorAlias configuration file /etc/iscsi/initiatorname.iscsi Dec 15 04:29:31 localhost iscsid[211192]: iscsid: can't open iscsid.safe_logout configuration file /etc/iscsi/iscsid.conf Dec 15 04:29:31 localhost iscsid[211192]: iscsid: can't open iscsid.ipc_auth_uid configuration file /etc/iscsi/iscsid.conf Dec 15 04:29:31 localhost systemd[1]: Started Open-iSCSI. Dec 15 04:29:31 localhost systemd[1]: Starting Logout off all iSCSI sessions on shutdown... Dec 15 04:29:31 localhost systemd[1]: Finished Logout off all iSCSI sessions on shutdown. Dec 15 04:29:31 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:fc:1f:13 MACDST=fa:16:3e:b3:bb:e3 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=62923 DF PROTO=TCP SPT=60348 DPT=9882 SEQ=1364452147 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A839A3D190000000001030307) Dec 15 04:29:33 localhost systemd[1]: Starting SETroubleshoot daemon for processing new SELinux denial logs... Dec 15 04:29:33 localhost systemd[1]: Started SETroubleshoot daemon for processing new SELinux denial logs. Dec 15 04:29:33 localhost systemd[1]: Started dbus-:1.1-org.fedoraproject.SetroubleshootPrivileged@2.service. Dec 15 04:29:33 localhost python3.9[211304]: ansible-ansible.builtin.service_facts Invoked Dec 15 04:29:33 localhost network[211334]: You are using 'network' service provided by 'network-scripts', which are now deprecated. Dec 15 04:29:33 localhost network[211335]: 'network-scripts' will be removed from distribution in near future. Dec 15 04:29:33 localhost network[211336]: It is advised to switch to 'NetworkManager' instead for network management. 
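The iscsid warnings above spell out what the missing /etc/iscsi/initiatorname.iscsi should contain: a single `InitiatorName=` line holding an IQN. Using the example IQN from the log itself (on a real node the IQN would be host-specific, typically written by `iscsi-iname`):

```ini
# /etc/iscsi/initiatorname.iscsi
InitiatorName=iqn.2001-04.com.redhat:fc6
```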
Dec 15 04:29:34 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:fc:1f:13 MACDST=fa:16:3e:b3:bb:e3 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=62925 DF PROTO=TCP SPT=60348 DPT=9882 SEQ=1364452147 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A839A49250000000001030307)
Dec 15 04:29:34 localhost setroubleshoot[211227]: SELinux is preventing /usr/sbin/iscsid from search access on the directory iscsi. For complete SELinux messages run: sealert -l cb494263-a368-425f-b531-8241fee021b0
Dec 15 04:29:34 localhost setroubleshoot[211227]: SELinux is preventing /usr/sbin/iscsid from search access on the directory iscsi.
    ***** Plugin catchall (100. confidence) suggests **************************
    If you believe that iscsid should be allowed search access on the iscsi directory by default.
    Then you should report this as a bug.
    You can generate a local policy module to allow this access.
    Do allow this access for now by executing:
    # ausearch -c 'iscsid' --raw | audit2allow -M my-iscsid
    # semodule -X 300 -i my-iscsid.pp
Dec 15 04:29:35 localhost systemd[1]: /usr/lib/systemd/system/insights-client.service:23: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Dec 15 04:29:37 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:fc:1f:13 MACDST=fa:16:3e:b3:bb:e3 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=53632 DF PROTO=TCP SPT=52730 DPT=9105 SEQ=390235958 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A839A54A50000000001030307)
Dec 15 04:29:39 localhost python3.9[211570]: ansible-ansible.builtin.file Invoked with mode=0755 path=/etc/modules-load.d selevel=s0 setype=etc_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None attributes=None
Dec 15 04:29:39 localhost python3.9[211680]: ansible-community.general.modprobe Invoked with name=dm-multipath state=present params= persistent=disabled
Dec 15 04:29:40 localhost python3.9[211794]: ansible-ansible.legacy.stat Invoked with path=/etc/modules-load.d/dm-multipath.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 15 04:29:41 localhost python3.9[211882]: ansible-ansible.legacy.copy Invoked with dest=/etc/modules-load.d/dm-multipath.conf mode=0644
src=/home/zuul/.ansible/tmp/ansible-tmp-1765790980.0519664-455-177405770296393/.source.conf follow=False _original_basename=module-load.conf.j2 checksum=065061c60917e4f67cecc70d12ce55e42f9d0b3f backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Dec 15 04:29:41 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:fc:1f:13 MACDST=fa:16:3e:b3:bb:e3 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=61218 DF PROTO=TCP SPT=33950 DPT=9101 SEQ=1255619058 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A839A61250000000001030307) Dec 15 04:29:41 localhost python3.9[211992]: ansible-ansible.builtin.lineinfile Invoked with create=True dest=/etc/modules line=dm-multipath mode=0644 state=present path=/etc/modules encoding=utf-8 backrefs=False backup=False firstmatch=False unsafe_writes=False regexp=None search_string=None insertafter=None insertbefore=None validate=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Dec 15 04:29:42 localhost python3.9[212102]: ansible-ansible.builtin.systemd Invoked with name=systemd-modules-load.service state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None Dec 15 04:29:42 localhost systemd[1]: systemd-modules-load.service: Deactivated successfully. Dec 15 04:29:42 localhost systemd[1]: Stopped Load Kernel Modules. Dec 15 04:29:42 localhost systemd[1]: Stopping Load Kernel Modules... Dec 15 04:29:42 localhost systemd[1]: Starting Load Kernel Modules... Dec 15 04:29:42 localhost systemd-modules-load[212106]: Module 'msr' is built in Dec 15 04:29:42 localhost systemd[1]: Finished Load Kernel Modules. 
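The tasks above render module-load.conf.j2 to /etc/modules-load.d/dm-multipath.conf and restart systemd-modules-load.service so dm-multipath is loaded at boot. The log records only the template name and checksum, not the file body; assuming the standard modules-load.d(5) format of one module name per line, the result would be:

```ini
# /etc/modules-load.d/dm-multipath.conf (assumed contents)
dm-multipath
```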
Dec 15 04:29:44 localhost python3.9[212217]: ansible-ansible.builtin.file Invoked with mode=0755 path=/etc/multipath setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None attributes=None Dec 15 04:29:44 localhost python3.9[212327]: ansible-ansible.builtin.stat Invoked with path=/etc/multipath.conf follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1 Dec 15 04:29:45 localhost systemd[1]: dbus-:1.1-org.fedoraproject.SetroubleshootPrivileged@2.service: Deactivated successfully. Dec 15 04:29:45 localhost systemd[1]: dbus-:1.1-org.fedoraproject.SetroubleshootPrivileged@2.service: Consumed 1.071s CPU time. Dec 15 04:29:45 localhost systemd[1]: Started /usr/bin/podman healthcheck run ac2eae30a2a7f971398af42ea67b3ed799d4b3341f7688ebcd73f7ca7f66c2a8. Dec 15 04:29:45 localhost systemd[1]: setroubleshootd.service: Deactivated successfully. Dec 15 04:29:45 localhost systemd[1]: tmp-crun.t2LXB6.mount: Deactivated successfully. 
Dec 15 04:29:45 localhost podman[212345]: 2025-12-15 09:29:45.152026379 +0000 UTC m=+0.085437463 container health_status ac2eae30a2a7f971398af42ea67b3ed799d4b3341f7688ebcd73f7ca7f66c2a8 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, tcib_managed=true, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '07f3af15d79276418f54a8dde2fc251064527c3571430e14af77aaa2c01f1193'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.vendor=CentOS) Dec 15 04:29:45 localhost podman[212345]: 2025-12-15 09:29:45.217944498 +0000 UTC m=+0.151355542 container exec_died ac2eae30a2a7f971398af42ea67b3ed799d4b3341f7688ebcd73f7ca7f66c2a8 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, tcib_managed=true, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, 
org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '07f3af15d79276418f54a8dde2fc251064527c3571430e14af77aaa2c01f1193'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, container_name=ovn_controller) Dec 15 04:29:45 localhost systemd[1]: ac2eae30a2a7f971398af42ea67b3ed799d4b3341f7688ebcd73f7ca7f66c2a8.service: Deactivated successfully. 
Dec 15 04:29:46 localhost python3.9[212463]: ansible-ansible.builtin.stat Invoked with path=/etc/multipath.conf follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1 Dec 15 04:29:46 localhost python3.9[212573]: ansible-ansible.legacy.stat Invoked with path=/etc/multipath.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Dec 15 04:29:47 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:fc:1f:13 MACDST=fa:16:3e:b3:bb:e3 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=62927 DF PROTO=TCP SPT=60348 DPT=9882 SEQ=1364452147 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A839A79260000000001030307) Dec 15 04:29:47 localhost python3.9[212661]: ansible-ansible.legacy.copy Invoked with dest=/etc/multipath.conf mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1765790986.29823-629-79053748461960/.source.conf _original_basename=multipath.conf follow=False checksum=bf02ab264d3d648048a81f3bacec8bc58db93162 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Dec 15 04:29:47 localhost python3.9[212771]: ansible-ansible.legacy.command Invoked with _raw_params=grep -q '^blacklist\s*{' /etc/multipath.conf _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None Dec 15 04:29:48 localhost python3.9[212882]: ansible-ansible.builtin.lineinfile Invoked with line=blacklist { path=/etc/multipath.conf state=present encoding=utf-8 backrefs=False create=False backup=False firstmatch=False unsafe_writes=False regexp=None search_string=None insertafter=None insertbefore=None validate=None mode=None 
owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Dec 15 04:29:49 localhost python3.9[212992]: ansible-ansible.builtin.replace Invoked with path=/etc/multipath.conf regexp=^(blacklist {) replace=\1\n} backup=False encoding=utf-8 unsafe_writes=False after=None before=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Dec 15 04:29:50 localhost python3.9[213102]: ansible-ansible.builtin.replace Invoked with path=/etc/multipath.conf regexp=^blacklist\s*{\n[\s]+devnode \"\.\*\" replace=blacklist { backup=False encoding=utf-8 unsafe_writes=False after=None before=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Dec 15 04:29:50 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:fc:1f:13 MACDST=fa:16:3e:b3:bb:e3 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=53634 DF PROTO=TCP SPT=52730 DPT=9105 SEQ=390235958 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A839A85250000000001030307) Dec 15 04:29:50 localhost python3.9[213212]: ansible-ansible.builtin.lineinfile Invoked with firstmatch=True insertafter=^defaults line= find_multipaths yes path=/etc/multipath.conf regexp=^\s+find_multipaths state=present encoding=utf-8 backrefs=False create=False backup=False unsafe_writes=False search_string=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Dec 15 04:29:51 localhost ovn_metadata_agent[160585]: 2025-12-15 09:29:51.432 160590 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Dec 15 04:29:51 localhost ovn_metadata_agent[160585]: 2025-12-15 09:29:51.433 160590 DEBUG 
oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Dec 15 04:29:51 localhost ovn_metadata_agent[160585]: 2025-12-15 09:29:51.434 160590 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Dec 15 04:29:51 localhost python3.9[213322]: ansible-ansible.builtin.lineinfile Invoked with firstmatch=True insertafter=^defaults line= recheck_wwid yes path=/etc/multipath.conf regexp=^\s+recheck_wwid state=present encoding=utf-8 backrefs=False create=False backup=False unsafe_writes=False search_string=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Dec 15 04:29:52 localhost python3.9[213432]: ansible-ansible.builtin.lineinfile Invoked with firstmatch=True insertafter=^defaults line= skip_kpartx yes path=/etc/multipath.conf regexp=^\s+skip_kpartx state=present encoding=utf-8 backrefs=False create=False backup=False unsafe_writes=False search_string=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Dec 15 04:29:52 localhost python3.9[213542]: ansible-ansible.builtin.lineinfile Invoked with firstmatch=True insertafter=^defaults line= user_friendly_names no path=/etc/multipath.conf regexp=^\s+user_friendly_names state=present encoding=utf-8 backrefs=False create=False backup=False unsafe_writes=False search_string=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Dec 15 04:29:53 localhost kernel: DROPPING: IN=br-ex OUT= 
MACSRC=fa:16:3e:fc:1f:13 MACDST=fa:16:3e:b3:bb:e3 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=2280 DF PROTO=TCP SPT=39644 DPT=9100 SEQ=3084080408 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A839A90A30000000001030307) Dec 15 04:29:53 localhost python3.9[213652]: ansible-ansible.builtin.stat Invoked with path=/etc/multipath.conf follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1 Dec 15 04:29:54 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:fc:1f:13 MACDST=fa:16:3e:b3:bb:e3 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=2281 DF PROTO=TCP SPT=39644 DPT=9100 SEQ=3084080408 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A839A94A50000000001030307) Dec 15 04:29:54 localhost python3.9[213764]: ansible-ansible.builtin.file Invoked with mode=0644 path=/etc/multipath/.multipath_restart_required state=touch recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Dec 15 04:29:55 localhost python3.9[213874]: ansible-ansible.builtin.file Invoked with path=/var/local/libexec recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None Dec 15 04:29:55 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:fc:1f:13 MACDST=fa:16:3e:b3:bb:e3 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=324 DF PROTO=TCP SPT=47944 DPT=9102 
SEQ=897522475 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A839A9AE50000000001030307) Dec 15 04:29:56 localhost python3.9[213984]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-container-shutdown follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Dec 15 04:29:56 localhost systemd[1]: Started /usr/bin/podman healthcheck run 4d46c406a3855fd1c322e5d86c048188ea5865daa07ba23aaebacc7879668923. Dec 15 04:29:56 localhost systemd[1]: tmp-crun.DnGHSJ.mount: Deactivated successfully. Dec 15 04:29:56 localhost podman[214042]: 2025-12-15 09:29:56.907277302 +0000 UTC m=+0.079909814 container health_status 4d46c406a3855fd1c322e5d86c048188ea5865daa07ba23aaebacc7879668923 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, container_name=ovn_metadata_agent, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '07f3af15d79276418f54a8dde2fc251064527c3571430e14af77aaa2c01f1193-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', 
'/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent) Dec 15 04:29:56 localhost podman[214042]: 2025-12-15 09:29:56.911795232 +0000 UTC m=+0.084427754 container exec_died 4d46c406a3855fd1c322e5d86c048188ea5865daa07ba23aaebacc7879668923 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '07f3af15d79276418f54a8dde2fc251064527c3571430e14af77aaa2c01f1193-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', 
'/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.license=GPLv2, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, io.buildah.version=1.41.3) Dec 15 04:29:56 localhost systemd[1]: 4d46c406a3855fd1c322e5d86c048188ea5865daa07ba23aaebacc7879668923.service: Deactivated successfully. Dec 15 04:29:57 localhost python3.9[214041]: ansible-ansible.legacy.file Invoked with group=root mode=0700 owner=root setype=container_file_t dest=/var/local/libexec/edpm-container-shutdown _original_basename=edpm-container-shutdown recurse=False state=file path=/var/local/libexec/edpm-container-shutdown force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None Dec 15 04:29:57 localhost python3.9[214169]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-start-podman-container follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Dec 15 04:29:58 localhost python3.9[214226]: ansible-ansible.legacy.file Invoked with group=root mode=0700 owner=root setype=container_file_t dest=/var/local/libexec/edpm-start-podman-container _original_basename=edpm-start-podman-container recurse=False state=file path=/var/local/libexec/edpm-start-podman-container force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None 
src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None Dec 15 04:29:58 localhost python3.9[214336]: ansible-ansible.builtin.file Invoked with mode=420 path=/etc/systemd/system-preset state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Dec 15 04:29:58 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:fc:1f:13 MACDST=fa:16:3e:b3:bb:e3 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=16842 DF PROTO=TCP SPT=53168 DPT=9101 SEQ=2860691213 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A839AA6A50000000001030307) Dec 15 04:29:59 localhost python3.9[214446]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/edpm-container-shutdown.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Dec 15 04:30:00 localhost python3.9[214503]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system/edpm-container-shutdown.service _original_basename=edpm-container-shutdown-service recurse=False state=file path=/etc/systemd/system/edpm-container-shutdown.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None Dec 15 04:30:00 localhost python3.9[214613]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False 
Dec 15 04:30:01 localhost python3.9[214670]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system-preset/91-edpm-container-shutdown.preset _original_basename=91-edpm-container-shutdown-preset recurse=False state=file path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None Dec 15 04:30:01 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:fc:1f:13 MACDST=fa:16:3e:b3:bb:e3 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=62308 DF PROTO=TCP SPT=57336 DPT=9882 SEQ=3352156978 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A839AB24A0000000001030307) Dec 15 04:30:02 localhost python3.9[214780]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=edpm-container-shutdown state=started daemon_reexec=False scope=system no_block=False force=None masked=None Dec 15 04:30:02 localhost systemd[1]: Reloading. Dec 15 04:30:02 localhost systemd-rc-local-generator[214804]: /etc/rc.d/rc.local is not marked executable, skipping. Dec 15 04:30:02 localhost systemd-sysv-generator[214809]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. 
Dec 15 04:30:02 localhost systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload Dec 15 04:30:02 localhost systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload Dec 15 04:30:02 localhost systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload Dec 15 04:30:02 localhost systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload Dec 15 04:30:02 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Dec 15 04:30:02 localhost systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload Dec 15 04:30:02 localhost systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload Dec 15 04:30:02 localhost systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload Dec 15 04:30:02 localhost systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload Dec 15 04:30:03 localhost python3.9[214928]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/netns-placeholder.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Dec 15 04:30:04 localhost python3.9[214985]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system/netns-placeholder.service _original_basename=netns-placeholder-service recurse=False state=file path=/etc/systemd/system/netns-placeholder.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False 
_diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None Dec 15 04:30:04 localhost python3.9[215095]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-netns-placeholder.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Dec 15 04:30:04 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:fc:1f:13 MACDST=fa:16:3e:b3:bb:e3 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=62310 DF PROTO=TCP SPT=57336 DPT=9882 SEQ=3352156978 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A839ABE640000000001030307) Dec 15 04:30:05 localhost python3.9[215152]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system-preset/91-netns-placeholder.preset _original_basename=91-netns-placeholder-preset recurse=False state=file path=/etc/systemd/system-preset/91-netns-placeholder.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None Dec 15 04:30:05 localhost python3.9[215262]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=netns-placeholder state=started daemon_reexec=False scope=system no_block=False force=None masked=None Dec 15 04:30:06 localhost systemd[1]: Reloading. Dec 15 04:30:06 localhost systemd-rc-local-generator[215288]: /etc/rc.d/rc.local is not marked executable, skipping. Dec 15 04:30:06 localhost systemd-sysv-generator[215294]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. 
Dec 15 04:30:06 localhost systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload Dec 15 04:30:06 localhost systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload Dec 15 04:30:06 localhost systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload Dec 15 04:30:06 localhost systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload Dec 15 04:30:06 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Dec 15 04:30:06 localhost systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload Dec 15 04:30:06 localhost systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload Dec 15 04:30:06 localhost systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload Dec 15 04:30:06 localhost systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload Dec 15 04:30:06 localhost systemd[1]: Starting Create netns directory... Dec 15 04:30:06 localhost systemd[1]: run-netns-placeholder.mount: Deactivated successfully. Dec 15 04:30:06 localhost systemd[1]: netns-placeholder.service: Deactivated successfully. Dec 15 04:30:06 localhost systemd[1]: Finished Create netns directory. 
Dec 15 04:30:07 localhost python3.9[215415]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/openstack/healthchecks setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None Dec 15 04:30:07 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:fc:1f:13 MACDST=fa:16:3e:b3:bb:e3 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=11531 DF PROTO=TCP SPT=52400 DPT=9105 SEQ=327096363 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A839AC9E60000000001030307) Dec 15 04:30:07 localhost python3.9[215525]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/healthchecks/multipathd/healthcheck follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Dec 15 04:30:08 localhost python3.9[215613]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/healthchecks/multipathd/ group=zuul mode=0700 owner=zuul setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1765791007.5157325-1250-143763488692657/.source _original_basename=healthcheck follow=False checksum=af9d0c1c8f3cb0e30ce9609be9d5b01924d0d23f backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None Dec 15 04:30:10 localhost python3.9[215723]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/edpm-config recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None 
modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None Dec 15 04:30:11 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:fc:1f:13 MACDST=fa:16:3e:b3:bb:e3 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=16844 DF PROTO=TCP SPT=53168 DPT=9101 SEQ=2860691213 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A839AD7250000000001030307) Dec 15 04:30:11 localhost python3.9[215833]: ansible-ansible.builtin.file Invoked with path=/var/lib/kolla/config_files recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None Dec 15 04:30:12 localhost python3.9[215943]: ansible-ansible.legacy.stat Invoked with path=/var/lib/kolla/config_files/multipathd.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Dec 15 04:30:12 localhost python3.9[216031]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/kolla/config_files/multipathd.json mode=0600 src=/home/zuul/.ansible/tmp/ansible-tmp-1765791011.8530974-1349-117001155298722/.source.json _original_basename=.uevhbbpe follow=False checksum=3f7959ee8ac9757398adcc451c3b416c957d7c14 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Dec 15 04:30:13 localhost python3.9[216139]: ansible-ansible.builtin.file Invoked with mode=0755 path=/var/lib/edpm-config/container-startup-config/multipathd state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S 
access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Dec 15 04:30:15 localhost sshd[216355]: main: sshd: ssh-rsa algorithm is disabled Dec 15 04:30:15 localhost systemd[1]: Started /usr/bin/podman healthcheck run ac2eae30a2a7f971398af42ea67b3ed799d4b3341f7688ebcd73f7ca7f66c2a8. Dec 15 04:30:15 localhost podman[216446]: 2025-12-15 09:30:15.586072629 +0000 UTC m=+0.089520841 container health_status ac2eae30a2a7f971398af42ea67b3ed799d4b3341f7688ebcd73f7ca7f66c2a8 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '07f3af15d79276418f54a8dde2fc251064527c3571430e14af77aaa2c01f1193'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251202, io.buildah.version=1.41.3, 
tcib_build_tag=c3923531bcda0b0811b2d5053f189beb) Dec 15 04:30:15 localhost python3.9[216445]: ansible-container_config_data Invoked with config_overrides={} config_path=/var/lib/edpm-config/container-startup-config/multipathd config_pattern=*.json debug=False Dec 15 04:30:15 localhost podman[216446]: 2025-12-15 09:30:15.680011765 +0000 UTC m=+0.183459967 container exec_died ac2eae30a2a7f971398af42ea67b3ed799d4b3341f7688ebcd73f7ca7f66c2a8 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, org.label-schema.build-date=20251202, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '07f3af15d79276418f54a8dde2fc251064527c3571430e14af77aaa2c01f1193'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_controller, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_managed=true) Dec 15 04:30:15 localhost systemd[1]: ac2eae30a2a7f971398af42ea67b3ed799d4b3341f7688ebcd73f7ca7f66c2a8.service: Deactivated successfully. 
Dec 15 04:30:16 localhost python3.9[216581]: ansible-container_config_hash Invoked with check_mode=False config_vol_prefix=/var/lib/openstack Dec 15 04:30:17 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:fc:1f:13 MACDST=fa:16:3e:b3:bb:e3 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=62312 DF PROTO=TCP SPT=57336 DPT=9882 SEQ=3352156978 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A839AEF250000000001030307) Dec 15 04:30:17 localhost python3.9[216691]: ansible-containers.podman.podman_container_info Invoked with executable=podman name=None Dec 15 04:30:20 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:fc:1f:13 MACDST=fa:16:3e:b3:bb:e3 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=11533 DF PROTO=TCP SPT=52400 DPT=9105 SEQ=327096363 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A839AF9250000000001030307) Dec 15 04:30:22 localhost python3[216828]: ansible-edpm_container_manage Invoked with concurrency=1 config_dir=/var/lib/edpm-config/container-startup-config/multipathd config_id=multipathd config_overrides={} config_patterns=*.json containers=['multipathd'] log_base_path=/var/log/containers/stdouts debug=False Dec 15 04:30:22 localhost systemd[1]: virtnodedevd.service: Deactivated successfully. Dec 15 04:30:23 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:fc:1f:13 MACDST=fa:16:3e:b3:bb:e3 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=58547 DF PROTO=TCP SPT=54636 DPT=9100 SEQ=3622553073 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A839B05D20000000001030307) Dec 15 04:30:23 localhost systemd[1]: virtproxyd.service: Deactivated successfully. 
Dec 15 04:30:24 localhost podman[216842]: 2025-12-15 09:30:22.14621357 +0000 UTC m=+0.027921486 image pull quay.io/podified-antelope-centos9/openstack-multipathd:current-podified Dec 15 04:30:24 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:fc:1f:13 MACDST=fa:16:3e:b3:bb:e3 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=58548 DF PROTO=TCP SPT=54636 DPT=9100 SEQ=3622553073 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A839B09E50000000001030307) Dec 15 04:30:24 localhost podman[216891]: Dec 15 04:30:24 localhost podman[216891]: 2025-12-15 09:30:24.319301105 +0000 UTC m=+0.085930227 container create 9f0f481c937606298203797a71924b7614a5d9e3f4a8afd837cfa5b9602aedeb (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, container_name=multipathd, io.buildah.version=1.41.3, config_id=multipathd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', 
'/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.vendor=CentOS) Dec 15 04:30:24 localhost podman[216891]: 2025-12-15 09:30:24.279215311 +0000 UTC m=+0.045844453 image pull quay.io/podified-antelope-centos9/openstack-multipathd:current-podified Dec 15 04:30:24 localhost python3[216828]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman create --name multipathd --conmon-pidfile /run/multipathd.pid --env KOLLA_CONFIG_STRATEGY=COPY_ALWAYS --healthcheck-command /openstack/healthcheck --label config_id=multipathd --label container_name=multipathd --label managed_by=edpm_ansible --label config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', 
'/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']} --log-driver journald --log-level info --network host --privileged=True --volume /etc/hosts:/etc/hosts:ro --volume /etc/localtime:/etc/localtime:ro --volume /etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro --volume /etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro --volume /etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro --volume /etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro --volume /etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro --volume /dev/log:/dev/log --volume /var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro --volume /dev:/dev --volume /run/udev:/run/udev --volume /sys:/sys --volume /lib/modules:/lib/modules:ro --volume /etc/iscsi:/etc/iscsi:ro --volume /var/lib/iscsi:/var/lib/iscsi --volume /etc/multipath:/etc/multipath:z --volume /etc/multipath.conf:/etc/multipath.conf:ro --volume /var/lib/openstack/healthchecks/multipathd:/openstack:ro,z quay.io/podified-antelope-centos9/openstack-multipathd:current-podified Dec 15 04:30:25 localhost python3.9[217038]: ansible-ansible.builtin.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1 Dec 15 04:30:25 localhost python3.9[217150]: ansible-file Invoked with path=/etc/systemd/system/edpm_multipathd.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Dec 15 04:30:25 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:fc:1f:13 MACDST=fa:16:3e:b3:bb:e3 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 
ID=20102 DF PROTO=TCP SPT=55176 DPT=9102 SEQ=3351575424 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A839B10250000000001030307) Dec 15 04:30:26 localhost python3.9[217205]: ansible-stat Invoked with path=/etc/systemd/system/edpm_multipathd_healthcheck.timer follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1 Dec 15 04:30:26 localhost python3.9[217350]: ansible-copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1765791026.3417583-1619-263250343677209/source dest=/etc/systemd/system/edpm_multipathd.service mode=0644 owner=root group=root backup=False force=True remote_src=False follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None Dec 15 04:30:27 localhost systemd[1]: Started /usr/bin/podman healthcheck run 4d46c406a3855fd1c322e5d86c048188ea5865daa07ba23aaebacc7879668923. 
Dec 15 04:30:27 localhost podman[217414]: 2025-12-15 09:30:27.162772517 +0000 UTC m=+0.089545552 container health_status 4d46c406a3855fd1c322e5d86c048188ea5865daa07ba23aaebacc7879668923 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, container_name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '07f3af15d79276418f54a8dde2fc251064527c3571430e14af77aaa2c01f1193-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb) Dec 15 04:30:27 localhost 
podman[217414]: 2025-12-15 09:30:27.19655652 +0000 UTC m=+0.123329555 container exec_died 4d46c406a3855fd1c322e5d86c048188ea5865daa07ba23aaebacc7879668923 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '07f3af15d79276418f54a8dde2fc251064527c3571430e14af77aaa2c01f1193-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, managed_by=edpm_ansible, tcib_managed=true) Dec 15 04:30:27 localhost systemd[1]: 
4d46c406a3855fd1c322e5d86c048188ea5865daa07ba23aaebacc7879668923.service: Deactivated successfully. Dec 15 04:30:27 localhost systemd[1]: tmp-crun.MWTUBl.mount: Deactivated successfully. Dec 15 04:30:27 localhost podman[217489]: 2025-12-15 09:30:27.366440546 +0000 UTC m=+0.101769695 container exec 8dcda56b365b42dc8758aab77a9ec80db304780e449052738f7e4e648ae1ecaf (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-bce17446-41b5-5408-a23e-0b011906b44a-crash-np0005559462, io.buildah.version=1.41.4, io.k8s.description=Red Hat Ceph Storage 7, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, ceph=True, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:enterprise_linux:9::appstream, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, GIT_BRANCH=main, release=1763362218, com.redhat.component=rhceph-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=Guillaume Abrioux , GIT_CLEAN=True, name=rhceph, io.openshift.expose-services=, url=https://catalog.redhat.com/en/search?searchType=containers, CEPH_POINT_RELEASE=, build-date=2025-11-26T19:44:28Z, vcs-type=git, io.openshift.tags=rhceph ceph, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., description=Red Hat Ceph Storage 7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, version=7, GIT_REPO=https://github.com/ceph/ceph-container.git, architecture=x86_64, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, distribution-scope=public, RELEASE=main) Dec 15 04:30:27 localhost podman[217489]: 2025-12-15 09:30:27.467516058 +0000 UTC m=+0.202845227 container exec_died 8dcda56b365b42dc8758aab77a9ec80db304780e449052738f7e4e648ae1ecaf (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-bce17446-41b5-5408-a23e-0b011906b44a-crash-np0005559462, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://catalog.redhat.com/en/search?searchType=containers, GIT_REPO=https://github.com/ceph/ceph-container.git, description=Red Hat Ceph Storage 7, ceph=True, architecture=x86_64, io.openshift.expose-services=, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, com.redhat.component=rhceph-container, io.openshift.tags=rhceph ceph, RELEASE=main, maintainer=Guillaume Abrioux , vendor=Red Hat, Inc., io.buildah.version=1.41.4, GIT_CLEAN=True, version=7, distribution-scope=public, release=1763362218, io.k8s.description=Red Hat Ceph Storage 7, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, GIT_BRANCH=main, build-date=2025-11-26T19:44:28Z, CEPH_POINT_RELEASE=, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vcs-type=git, name=rhceph, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0) Dec 15 04:30:27 localhost python3.9[217484]: ansible-systemd Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None Dec 15 04:30:27 localhost systemd[1]: Reloading. Dec 15 04:30:27 localhost systemd-rc-local-generator[217543]: /etc/rc.d/rc.local is not marked executable, skipping. Dec 15 04:30:27 localhost systemd-sysv-generator[217550]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. 
Dec 15 04:30:27 localhost systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload Dec 15 04:30:27 localhost systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload Dec 15 04:30:27 localhost systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload Dec 15 04:30:27 localhost systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload Dec 15 04:30:27 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Dec 15 04:30:27 localhost systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload Dec 15 04:30:27 localhost systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload Dec 15 04:30:27 localhost systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload Dec 15 04:30:27 localhost systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload Dec 15 04:30:28 localhost python3.9[217662]: ansible-systemd Invoked with state=restarted name=edpm_multipathd.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None Dec 15 04:30:28 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:fc:1f:13 MACDST=fa:16:3e:b3:bb:e3 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=49518 DF PROTO=TCP SPT=58268 DPT=9101 SEQ=946277257 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A839B1BE60000000001030307) Dec 15 04:30:29 localhost systemd[1]: Reloading. 
Dec 15 04:30:29 localhost systemd-sysv-generator[217762]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. Dec 15 04:30:29 localhost systemd-rc-local-generator[217756]: /etc/rc.d/rc.local is not marked executable, skipping. Dec 15 04:30:29 localhost systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload Dec 15 04:30:29 localhost systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload Dec 15 04:30:29 localhost systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload Dec 15 04:30:29 localhost systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload Dec 15 04:30:29 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Dec 15 04:30:29 localhost systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload Dec 15 04:30:29 localhost systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload Dec 15 04:30:29 localhost systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload Dec 15 04:30:29 localhost systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload Dec 15 04:30:29 localhost systemd[1]: Starting multipathd container... Dec 15 04:30:29 localhost systemd[1]: Started libcrun container. 
Dec 15 04:30:29 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/f800ab6daf506e7efc74915056177a1bc94cf171015ad5acbd83dffb9906a007/merged/etc/multipath supports timestamps until 2038 (0x7fffffff) Dec 15 04:30:29 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/f800ab6daf506e7efc74915056177a1bc94cf171015ad5acbd83dffb9906a007/merged/var/lib/iscsi supports timestamps until 2038 (0x7fffffff) Dec 15 04:30:30 localhost systemd[1]: Started /usr/bin/podman healthcheck run 9f0f481c937606298203797a71924b7614a5d9e3f4a8afd837cfa5b9602aedeb. Dec 15 04:30:30 localhost podman[217771]: 2025-12-15 09:30:30.005052104 +0000 UTC m=+0.153584286 container init 9f0f481c937606298203797a71924b7614a5d9e3f4a8afd837cfa5b9602aedeb (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', 
'/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=multipathd) Dec 15 04:30:30 localhost multipathd[217785]: + sudo -E kolla_set_configs Dec 15 04:30:30 localhost systemd[1]: Started /usr/bin/podman healthcheck run 9f0f481c937606298203797a71924b7614a5d9e3f4a8afd837cfa5b9602aedeb. Dec 15 04:30:30 localhost podman[217771]: 2025-12-15 09:30:30.048111084 +0000 UTC m=+0.196643226 container start 9f0f481c937606298203797a71924b7614a5d9e3f4a8afd837cfa5b9602aedeb (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=multipathd, container_name=multipathd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', 
'/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2) Dec 15 04:30:30 localhost podman[217771]: multipathd Dec 15 04:30:30 localhost systemd[1]: Started multipathd container. Dec 15 04:30:30 localhost multipathd[217785]: INFO:__main__:Loading config file at /var/lib/kolla/config_files/config.json Dec 15 04:30:30 localhost multipathd[217785]: INFO:__main__:Validating config file Dec 15 04:30:30 localhost multipathd[217785]: INFO:__main__:Kolla config strategy set to: COPY_ALWAYS Dec 15 04:30:30 localhost multipathd[217785]: INFO:__main__:Writing out command to execute Dec 15 04:30:30 localhost multipathd[217785]: ++ cat /run_command Dec 15 04:30:30 localhost multipathd[217785]: + CMD='/usr/sbin/multipathd -d' Dec 15 04:30:30 localhost multipathd[217785]: + ARGS= Dec 15 04:30:30 localhost multipathd[217785]: + sudo kolla_copy_cacerts Dec 15 04:30:30 localhost multipathd[217785]: + [[ ! -n '' ]] Dec 15 04:30:30 localhost multipathd[217785]: + . 
kolla_extend_start Dec 15 04:30:30 localhost multipathd[217785]: + echo 'Running command: '\''/usr/sbin/multipathd -d'\''' Dec 15 04:30:30 localhost multipathd[217785]: Running command: '/usr/sbin/multipathd -d' Dec 15 04:30:30 localhost multipathd[217785]: + umask 0022 Dec 15 04:30:30 localhost multipathd[217785]: + exec /usr/sbin/multipathd -d Dec 15 04:30:30 localhost multipathd[217785]: 10096.325426 | --------start up-------- Dec 15 04:30:30 localhost multipathd[217785]: 10096.325441 | read /etc/multipath.conf Dec 15 04:30:30 localhost multipathd[217785]: 10096.330947 | path checkers start up Dec 15 04:30:30 localhost podman[217794]: 2025-12-15 09:30:30.14859147 +0000 UTC m=+0.094499464 container health_status 9f0f481c937606298203797a71924b7614a5d9e3f4a8afd837cfa5b9602aedeb (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=starting, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', 
'/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, config_id=multipathd, container_name=multipathd, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0) Dec 15 04:30:30 localhost podman[217794]: 2025-12-15 09:30:30.180048956 +0000 UTC m=+0.125956970 container exec_died 9f0f481c937606298203797a71924b7614a5d9e3f4a8afd837cfa5b9602aedeb (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', 
'/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, config_id=multipathd, container_name=multipathd, managed_by=edpm_ansible, tcib_managed=true) Dec 15 04:30:30 localhost systemd[1]: 9f0f481c937606298203797a71924b7614a5d9e3f4a8afd837cfa5b9602aedeb.service: Deactivated successfully. Dec 15 04:30:30 localhost python3.9[217931]: ansible-ansible.builtin.slurp Invoked with src=/var/lib/edpm-config/deployed_services.yaml Dec 15 04:30:31 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:fc:1f:13 MACDST=fa:16:3e:b3:bb:e3 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=55919 DF PROTO=TCP SPT=50638 DPT=9882 SEQ=2697804111 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A839B277A0000000001030307) Dec 15 04:30:32 localhost python3.9[218041]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/deployed_services.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Dec 15 04:30:33 localhost python3.9[218131]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/deployed_services.yaml mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1765791032.0704207-1742-273173210842878/.source.yaml _original_basename=.6xva8eud follow=False checksum=b16f0ae2e37aba7239aaa003b59d1b153c5739d2 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Dec 15 04:30:33 localhost python3.9[218239]: ansible-ansible.builtin.stat Invoked with path=/etc/multipath/.multipath_restart_required follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1 Dec 15 04:30:34 
localhost systemd[1]: virtsecretd.service: Deactivated successfully. Dec 15 04:30:34 localhost python3.9[218352]: ansible-ansible.legacy.command Invoked with _raw_params=podman ps --filter volume=/etc/multipath.conf --format {{.Names}} _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None Dec 15 04:30:34 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:fc:1f:13 MACDST=fa:16:3e:b3:bb:e3 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=55921 DF PROTO=TCP SPT=50638 DPT=9882 SEQ=2697804111 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A839B33660000000001030307) Dec 15 04:30:35 localhost python3.9[218475]: ansible-ansible.builtin.systemd Invoked with name=edpm_multipathd state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None Dec 15 04:30:35 localhost systemd[1]: Stopping multipathd container... Dec 15 04:30:35 localhost multipathd[217785]: 10101.650081 | exit (signal) Dec 15 04:30:35 localhost multipathd[217785]: 10101.650155 | --------shut down------- Dec 15 04:30:35 localhost systemd[1]: libpod-9f0f481c937606298203797a71924b7614a5d9e3f4a8afd837cfa5b9602aedeb.scope: Deactivated successfully. 
Dec 15 04:30:35 localhost podman[218479]: 2025-12-15 09:30:35.488466151 +0000 UTC m=+0.080531951 container died 9f0f481c937606298203797a71924b7614a5d9e3f4a8afd837cfa5b9602aedeb (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=multipathd, managed_by=edpm_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_id=multipathd, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team) Dec 15 04:30:35 localhost systemd[1]: 9f0f481c937606298203797a71924b7614a5d9e3f4a8afd837cfa5b9602aedeb.timer: Deactivated successfully. 
Dec 15 04:30:35 localhost systemd[1]: Stopped /usr/bin/podman healthcheck run 9f0f481c937606298203797a71924b7614a5d9e3f4a8afd837cfa5b9602aedeb. Dec 15 04:30:35 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-9f0f481c937606298203797a71924b7614a5d9e3f4a8afd837cfa5b9602aedeb-userdata-shm.mount: Deactivated successfully. Dec 15 04:30:35 localhost systemd[1]: var-lib-containers-storage-overlay-f800ab6daf506e7efc74915056177a1bc94cf171015ad5acbd83dffb9906a007-merged.mount: Deactivated successfully. Dec 15 04:30:35 localhost podman[218479]: 2025-12-15 09:30:35.684712126 +0000 UTC m=+0.276777916 container cleanup 9f0f481c937606298203797a71924b7614a5d9e3f4a8afd837cfa5b9602aedeb (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, container_name=multipathd, config_id=multipathd, managed_by=edpm_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', 
'/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb) Dec 15 04:30:35 localhost podman[218479]: multipathd Dec 15 04:30:35 localhost podman[218507]: 2025-12-15 09:30:35.781728181 +0000 UTC m=+0.068001361 container cleanup 9f0f481c937606298203797a71924b7614a5d9e3f4a8afd837cfa5b9602aedeb (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_id=multipathd, managed_by=edpm_ansible, maintainer=OpenStack Kubernetes Operator team, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', 
'/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd) Dec 15 04:30:35 localhost podman[218507]: multipathd Dec 15 04:30:35 localhost systemd[1]: edpm_multipathd.service: Deactivated successfully. Dec 15 04:30:35 localhost systemd[1]: Stopped multipathd container. Dec 15 04:30:35 localhost systemd[1]: Starting multipathd container... Dec 15 04:30:35 localhost systemd[1]: Started libcrun container. Dec 15 04:30:35 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/f800ab6daf506e7efc74915056177a1bc94cf171015ad5acbd83dffb9906a007/merged/etc/multipath supports timestamps until 2038 (0x7fffffff) Dec 15 04:30:35 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/f800ab6daf506e7efc74915056177a1bc94cf171015ad5acbd83dffb9906a007/merged/var/lib/iscsi supports timestamps until 2038 (0x7fffffff) Dec 15 04:30:35 localhost systemd[1]: Started /usr/bin/podman healthcheck run 9f0f481c937606298203797a71924b7614a5d9e3f4a8afd837cfa5b9602aedeb. 
Dec 15 04:30:35 localhost podman[218521]: 2025-12-15 09:30:35.94443588 +0000 UTC m=+0.131122920 container init 9f0f481c937606298203797a71924b7614a5d9e3f4a8afd837cfa5b9602aedeb (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, managed_by=edpm_ansible, org.label-schema.build-date=20251202, container_name=multipathd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.vendor=CentOS) Dec 15 04:30:35 localhost multipathd[218535]: + sudo -E kolla_set_configs Dec 15 04:30:35 localhost systemd[1]: Started /usr/bin/podman healthcheck run 
9f0f481c937606298203797a71924b7614a5d9e3f4a8afd837cfa5b9602aedeb. Dec 15 04:30:35 localhost podman[218521]: 2025-12-15 09:30:35.984175374 +0000 UTC m=+0.170862434 container start 9f0f481c937606298203797a71924b7614a5d9e3f4a8afd837cfa5b9602aedeb (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, managed_by=edpm_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb) Dec 15 04:30:35 localhost podman[218521]: multipathd Dec 15 04:30:35 localhost systemd[1]: Started multipathd container. 
Dec 15 04:30:36 localhost multipathd[218535]: INFO:__main__:Loading config file at /var/lib/kolla/config_files/config.json
Dec 15 04:30:36 localhost multipathd[218535]: INFO:__main__:Validating config file
Dec 15 04:30:36 localhost multipathd[218535]: INFO:__main__:Kolla config strategy set to: COPY_ALWAYS
Dec 15 04:30:36 localhost multipathd[218535]: INFO:__main__:Writing out command to execute
Dec 15 04:30:36 localhost multipathd[218535]: ++ cat /run_command
Dec 15 04:30:36 localhost multipathd[218535]: + CMD='/usr/sbin/multipathd -d'
Dec 15 04:30:36 localhost multipathd[218535]: + ARGS=
Dec 15 04:30:36 localhost multipathd[218535]: + sudo kolla_copy_cacerts
Dec 15 04:30:36 localhost multipathd[218535]: + [[ ! -n '' ]]
Dec 15 04:30:36 localhost multipathd[218535]: + . kolla_extend_start
Dec 15 04:30:36 localhost multipathd[218535]: + echo 'Running command: '\''/usr/sbin/multipathd -d'\'''
Dec 15 04:30:36 localhost multipathd[218535]: Running command: '/usr/sbin/multipathd -d'
Dec 15 04:30:36 localhost multipathd[218535]: + umask 0022
Dec 15 04:30:36 localhost multipathd[218535]: + exec /usr/sbin/multipathd -d
Dec 15 04:30:36 localhost podman[218544]: 2025-12-15 09:30:36.075222117 +0000 UTC m=+0.086826381 container health_status 9f0f481c937606298203797a71924b7614a5d9e3f4a8afd837cfa5b9602aedeb (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=starting, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, container_name=multipathd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=multipathd, org.label-schema.build-date=20251202, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.license=GPLv2)
Dec 15 04:30:36 localhost multipathd[218535]: 10102.270644 | --------start up--------
Dec 15 04:30:36 localhost multipathd[218535]: 10102.270663 | read /etc/multipath.conf
Dec 15 04:30:36 localhost multipathd[218535]: 10102.274546 | path checkers start up
Dec 15 04:30:36 localhost podman[218544]: 2025-12-15 09:30:36.09438991 +0000 UTC m=+0.105994244 container exec_died 9f0f481c937606298203797a71924b7614a5d9e3f4a8afd837cfa5b9602aedeb (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, container_name=multipathd, managed_by=edpm_ansible, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, config_id=multipathd, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_managed=true)
Dec 15 04:30:36 localhost systemd[1]: 9f0f481c937606298203797a71924b7614a5d9e3f4a8afd837cfa5b9602aedeb.service: Deactivated successfully.
Dec 15 04:30:36 localhost python3.9[218683]: ansible-ansible.builtin.file Invoked with path=/etc/multipath/.multipath_restart_required state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 15 04:30:37 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:fc:1f:13 MACDST=fa:16:3e:b3:bb:e3 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=37365 DF PROTO=TCP SPT=49936 DPT=9105 SEQ=821094624 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A839B3F250000000001030307)
Dec 15 04:30:37 localhost python3.9[218793]: ansible-ansible.builtin.file Invoked with mode=0755 path=/etc/modules-load.d selevel=s0 setype=etc_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None attributes=None
Dec 15 04:30:38 localhost python3.9[218903]: ansible-community.general.modprobe Invoked with name=nvme-fabrics state=present params= persistent=disabled
Dec 15 04:30:39 localhost python3.9[219022]: ansible-ansible.legacy.stat Invoked with path=/etc/modules-load.d/nvme-fabrics.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 15 04:30:40 localhost python3.9[219110]: ansible-ansible.legacy.copy Invoked with dest=/etc/modules-load.d/nvme-fabrics.conf mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1765791038.965853-1943-122323937197966/.source.conf follow=False _original_basename=module-load.conf.j2 checksum=783c778f0c68cc414f35486f234cbb1cf3f9bbff backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 15 04:30:41 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:fc:1f:13 MACDST=fa:16:3e:b3:bb:e3 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=49520 DF PROTO=TCP SPT=58268 DPT=9101 SEQ=946277257 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A839B4B250000000001030307)
Dec 15 04:30:41 localhost python3.9[219220]: ansible-ansible.builtin.lineinfile Invoked with create=True dest=/etc/modules line=nvme-fabrics mode=0644 state=present path=/etc/modules encoding=utf-8 backrefs=False backup=False firstmatch=False unsafe_writes=False regexp=None search_string=None insertafter=None insertbefore=None validate=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 15 04:30:42 localhost python3.9[219330]: ansible-ansible.builtin.systemd Invoked with name=systemd-modules-load.service state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Dec 15 04:30:42 localhost systemd[1]: systemd-modules-load.service: Deactivated successfully.
Dec 15 04:30:42 localhost systemd[1]: Stopped Load Kernel Modules.
Dec 15 04:30:42 localhost systemd[1]: Stopping Load Kernel Modules...
Dec 15 04:30:42 localhost systemd[1]: Starting Load Kernel Modules...
Dec 15 04:30:42 localhost systemd-modules-load[219334]: Module 'msr' is built in
Dec 15 04:30:42 localhost systemd[1]: Finished Load Kernel Modules.
Dec 15 04:30:44 localhost python3.9[219444]: ansible-ansible.legacy.dnf Invoked with name=['nvme-cli'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Dec 15 04:30:46 localhost systemd[1]: Started /usr/bin/podman healthcheck run ac2eae30a2a7f971398af42ea67b3ed799d4b3341f7688ebcd73f7ca7f66c2a8.
Dec 15 04:30:46 localhost podman[219447]: 2025-12-15 09:30:46.756304209 +0000 UTC m=+0.077915248 container health_status ac2eae30a2a7f971398af42ea67b3ed799d4b3341f7688ebcd73f7ca7f66c2a8 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '07f3af15d79276418f54a8dde2fc251064527c3571430e14af77aaa2c01f1193'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_id=ovn_controller, container_name=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Dec 15 04:30:46 localhost podman[219447]: 2025-12-15 09:30:46.822111392 +0000 UTC m=+0.143722471 container exec_died ac2eae30a2a7f971398af42ea67b3ed799d4b3341f7688ebcd73f7ca7f66c2a8 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '07f3af15d79276418f54a8dde2fc251064527c3571430e14af77aaa2c01f1193'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, managed_by=edpm_ansible, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_id=ovn_controller)
Dec 15 04:30:46 localhost systemd[1]: ac2eae30a2a7f971398af42ea67b3ed799d4b3341f7688ebcd73f7ca7f66c2a8.service: Deactivated successfully.
Dec 15 04:30:47 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:fc:1f:13 MACDST=fa:16:3e:b3:bb:e3 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=55923 DF PROTO=TCP SPT=50638 DPT=9882 SEQ=2697804111 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A839B63260000000001030307)
Dec 15 04:30:47 localhost systemd[1]: Reloading.
Dec 15 04:30:47 localhost systemd-rc-local-generator[219500]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 15 04:30:47 localhost systemd-sysv-generator[219503]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 15 04:30:47 localhost systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 15 04:30:47 localhost systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload
Dec 15 04:30:47 localhost systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 15 04:30:47 localhost systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 15 04:30:47 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Dec 15 04:30:47 localhost systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload
Dec 15 04:30:47 localhost systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 15 04:30:47 localhost systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 15 04:30:47 localhost systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload
Dec 15 04:30:48 localhost systemd[1]: Reloading.
Dec 15 04:30:48 localhost systemd-sysv-generator[219542]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 15 04:30:48 localhost systemd-rc-local-generator[219537]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 15 04:30:48 localhost systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 15 04:30:48 localhost systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload
Dec 15 04:30:48 localhost systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 15 04:30:48 localhost systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 15 04:30:48 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Dec 15 04:30:48 localhost systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload
Dec 15 04:30:48 localhost systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 15 04:30:48 localhost systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 15 04:30:48 localhost systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload
Dec 15 04:30:48 localhost systemd-logind[763]: Watching system buttons on /dev/input/event0 (Power Button)
Dec 15 04:30:48 localhost systemd-logind[763]: Watching system buttons on /dev/input/event1 (AT Translated Set 2 keyboard)
Dec 15 04:30:48 localhost lvm[219588]: PV /dev/loop3 online, VG ceph_vg0 is complete.
Dec 15 04:30:48 localhost lvm[219588]: VG ceph_vg0 finished
Dec 15 04:30:48 localhost lvm[219589]: PV /dev/loop4 online, VG ceph_vg1 is complete.
Dec 15 04:30:48 localhost lvm[219589]: VG ceph_vg1 finished
Dec 15 04:30:48 localhost systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Dec 15 04:30:48 localhost systemd[1]: Starting man-db-cache-update.service...
Dec 15 04:30:48 localhost systemd[1]: Reloading.
Dec 15 04:30:48 localhost systemd-rc-local-generator[219645]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 15 04:30:48 localhost systemd-sysv-generator[219648]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 15 04:30:48 localhost systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 15 04:30:48 localhost systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload
Dec 15 04:30:48 localhost systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 15 04:30:48 localhost systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 15 04:30:48 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Dec 15 04:30:48 localhost systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload
Dec 15 04:30:48 localhost systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 15 04:30:48 localhost systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 15 04:30:48 localhost systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload
Dec 15 04:30:48 localhost systemd[1]: Queuing reload/restart jobs for marked units…
Dec 15 04:30:49 localhost systemd[1]: man-db-cache-update.service: Deactivated successfully.
Dec 15 04:30:49 localhost systemd[1]: Finished man-db-cache-update.service.
Dec 15 04:30:49 localhost systemd[1]: man-db-cache-update.service: Consumed 1.325s CPU time.
Dec 15 04:30:49 localhost systemd[1]: run-rfc4f25e537114521a93d9377b7fc70e8.service: Deactivated successfully.
Dec 15 04:30:50 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:fc:1f:13 MACDST=fa:16:3e:b3:bb:e3 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=37367 DF PROTO=TCP SPT=49936 DPT=9105 SEQ=821094624 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A839B6F260000000001030307)
Dec 15 04:30:51 localhost python3.9[220898]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec 15 04:30:51 localhost ovn_metadata_agent[160585]: 2025-12-15 09:30:51.433 160590 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec 15 04:30:51 localhost ovn_metadata_agent[160585]: 2025-12-15 09:30:51.434 160590 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec 15 04:30:51 localhost ovn_metadata_agent[160585]: 2025-12-15 09:30:51.436 160590 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec 15 04:30:52 localhost python3.9[221012]: ansible-ansible.builtin.file Invoked with mode=0644 path=/etc/ssh/ssh_known_hosts state=touch recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 15 04:30:53 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:fc:1f:13 MACDST=fa:16:3e:b3:bb:e3 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=33712 DF PROTO=TCP SPT=33156 DPT=9100 SEQ=1083872988 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A839B7B030000000001030307)
Dec 15 04:30:53 localhost python3.9[221122]: ansible-ansible.builtin.systemd_service Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Dec 15 04:30:53 localhost systemd[1]: Reloading.
Dec 15 04:30:53 localhost systemd-sysv-generator[221153]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 15 04:30:53 localhost systemd-rc-local-generator[221149]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 15 04:30:53 localhost systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 15 04:30:53 localhost systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload
Dec 15 04:30:53 localhost systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 15 04:30:53 localhost systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 15 04:30:53 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Dec 15 04:30:53 localhost systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload
Dec 15 04:30:53 localhost systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 15 04:30:53 localhost systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 15 04:30:53 localhost systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload
Dec 15 04:30:54 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:fc:1f:13 MACDST=fa:16:3e:b3:bb:e3 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=33713 DF PROTO=TCP SPT=33156 DPT=9100 SEQ=1083872988 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A839B7F250000000001030307)
Dec 15 04:30:54 localhost python3.9[221266]: ansible-ansible.builtin.service_facts Invoked
Dec 15 04:30:54 localhost network[221283]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Dec 15 04:30:54 localhost network[221284]: 'network-scripts' will be removed from distribution in near future.
Dec 15 04:30:54 localhost network[221285]: It is advised to switch to 'NetworkManager' instead for network management.
Dec 15 04:30:56 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:fc:1f:13 MACDST=fa:16:3e:b3:bb:e3 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=33714 DF PROTO=TCP SPT=33156 DPT=9100 SEQ=1083872988 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A839B87260000000001030307)
Dec 15 04:30:57 localhost systemd[1]: /usr/lib/systemd/system/insights-client.service:23: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Dec 15 04:30:57 localhost systemd[1]: Started /usr/bin/podman healthcheck run 4d46c406a3855fd1c322e5d86c048188ea5865daa07ba23aaebacc7879668923.
Dec 15 04:30:57 localhost podman[221339]: 2025-12-15 09:30:57.349404313 +0000 UTC m=+0.091769619 container health_status 4d46c406a3855fd1c322e5d86c048188ea5865daa07ba23aaebacc7879668923 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, managed_by=edpm_ansible, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '07f3af15d79276418f54a8dde2fc251064527c3571430e14af77aaa2c01f1193-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0)
Dec 15 04:30:57 localhost podman[221339]: 2025-12-15 09:30:57.355021968 +0000 UTC m=+0.097387324 container exec_died 4d46c406a3855fd1c322e5d86c048188ea5865daa07ba23aaebacc7879668923 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '07f3af15d79276418f54a8dde2fc251064527c3571430e14af77aaa2c01f1193-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Dec 15 04:30:57 localhost systemd[1]: 4d46c406a3855fd1c322e5d86c048188ea5865daa07ba23aaebacc7879668923.service: Deactivated successfully.
Dec 15 04:30:58 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:fc:1f:13 MACDST=fa:16:3e:b3:bb:e3 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=329 DF PROTO=TCP SPT=47944 DPT=9102 SEQ=897522475 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A839B91250000000001030307)
Dec 15 04:31:01 localhost python3.9[221536]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_compute.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 15 04:31:01 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:fc:1f:13 MACDST=fa:16:3e:b3:bb:e3 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=58829 DF PROTO=TCP SPT=45564 DPT=9882 SEQ=346648784 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A839B9CAA0000000001030307)
Dec 15 04:31:03 localhost python3.9[221647]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_migration_target.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 15 04:31:04 localhost python3.9[221758]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_api_cron.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 15 04:31:04 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:fc:1f:13 MACDST=fa:16:3e:b3:bb:e3 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=58831 DF PROTO=TCP SPT=45564 DPT=9882 SEQ=346648784 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A839BA8A60000000001030307)
Dec 15 04:31:05 localhost python3.9[221869]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_api.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 15 04:31:05 localhost python3.9[221980]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_conductor.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 15 04:31:06 localhost systemd[1]: Started /usr/bin/podman healthcheck run 9f0f481c937606298203797a71924b7614a5d9e3f4a8afd837cfa5b9602aedeb.
Dec 15 04:31:06 localhost podman[222092]: 2025-12-15 09:31:06.706011157 +0000 UTC m=+0.075617824 container health_status 9f0f481c937606298203797a71924b7614a5d9e3f4a8afd837cfa5b9602aedeb (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_id=multipathd, container_name=multipathd, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true)
Dec 15 04:31:06 localhost podman[222092]: 2025-12-15 09:31:06.721286797 +0000 UTC m=+0.090893514 container exec_died 9f0f481c937606298203797a71924b7614a5d9e3f4a8afd837cfa5b9602aedeb (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, maintainer=OpenStack Kubernetes Operator team, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_id=multipathd, container_name=multipathd)
Dec 15 04:31:06 localhost systemd[1]: 9f0f481c937606298203797a71924b7614a5d9e3f4a8afd837cfa5b9602aedeb.service: Deactivated successfully.
Dec 15 04:31:06 localhost python3.9[222091]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_metadata.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 15 04:31:07 localhost python3.9[222222]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_scheduler.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 15 04:31:07 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:fc:1f:13 MACDST=fa:16:3e:b3:bb:e3 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=58569 DF PROTO=TCP SPT=39572 DPT=9105 SEQ=3221570896 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A839BB4650000000001030307)
Dec 15 04:31:08 localhost python3.9[222333]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_vnc_proxy.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 15 04:31:09 localhost python3.9[222444]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_compute.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 15 04:31:10 localhost python3.9[222554]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_migration_target.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 15 04:31:10 localhost python3.9[222664]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_api_cron.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 15 04:31:11 localhost python3.9[222774]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_api.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 15 04:31:11 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:fc:1f:13 MACDST=fa:16:3e:b3:bb:e3 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=30546 DF PROTO=TCP SPT=45884 DPT=9101 SEQ=2683969703 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A839BC1250000000001030307)
Dec 15 04:31:12 localhost python3.9[222884]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_conductor.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 15 04:31:13 localhost python3.9[222994]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_metadata.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 15 04:31:14 localhost python3.9[223104]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_scheduler.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 15 04:31:14 localhost python3.9[223214]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_vnc_proxy.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 15 04:31:15 localhost python3.9[223324]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_compute.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S
access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Dec 15 04:31:15 localhost python3.9[223434]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_migration_target.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Dec 15 04:31:16 localhost python3.9[223544]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_api_cron.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Dec 15 04:31:16 localhost python3.9[223654]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_api.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Dec 15 04:31:17 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:fc:1f:13 MACDST=fa:16:3e:b3:bb:e3 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=58833 DF PROTO=TCP SPT=45564 DPT=9882 SEQ=346648784 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A839BD9250000000001030307) Dec 15 
04:31:17 localhost systemd[1]: Started /usr/bin/podman healthcheck run ac2eae30a2a7f971398af42ea67b3ed799d4b3341f7688ebcd73f7ca7f66c2a8. Dec 15 04:31:17 localhost podman[223765]: 2025-12-15 09:31:17.680096864 +0000 UTC m=+0.086364391 container health_status ac2eae30a2a7f971398af42ea67b3ed799d4b3341f7688ebcd73f7ca7f66c2a8 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '07f3af15d79276418f54a8dde2fc251064527c3571430e14af77aaa2c01f1193'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=ovn_controller, container_name=ovn_controller, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image) Dec 15 04:31:17 localhost podman[223765]: 2025-12-15 09:31:17.723321894 +0000 UTC m=+0.129589411 container exec_died ac2eae30a2a7f971398af42ea67b3ed799d4b3341f7688ebcd73f7ca7f66c2a8 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, 
config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '07f3af15d79276418f54a8dde2fc251064527c3571430e14af77aaa2c01f1193'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, org.label-schema.schema-version=1.0, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.build-date=20251202) Dec 15 04:31:17 localhost systemd[1]: ac2eae30a2a7f971398af42ea67b3ed799d4b3341f7688ebcd73f7ca7f66c2a8.service: Deactivated successfully. 
Dec 15 04:31:17 localhost python3.9[223764]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_conductor.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Dec 15 04:31:18 localhost python3.9[223899]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_metadata.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Dec 15 04:31:18 localhost python3.9[224009]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_scheduler.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Dec 15 04:31:19 localhost python3.9[224119]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_vnc_proxy.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Dec 15 04:31:20 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:fc:1f:13 MACDST=fa:16:3e:b3:bb:e3 
MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=58571 DF PROTO=TCP SPT=39572 DPT=9105 SEQ=3221570896 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A839BE5250000000001030307) Dec 15 04:31:20 localhost python3.9[224229]: ansible-ansible.legacy.command Invoked with _raw_params=if systemctl is-active certmonger.service; then#012 systemctl disable --now certmonger.service#012 test -f /etc/systemd/system/certmonger.service || systemctl mask certmonger.service#012fi#012 _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None Dec 15 04:31:21 localhost python3.9[224339]: ansible-ansible.builtin.find Invoked with file_type=any hidden=True paths=['/var/lib/certmonger/requests'] patterns=[] read_whole_file=False age_stamp=mtime recurse=False follow=False get_checksum=False checksum_algorithm=sha1 use_regex=False exact_mode=True excludes=None contains=None age=None size=None depth=None mode=None encoding=None limit=None Dec 15 04:31:22 localhost python3.9[224449]: ansible-ansible.builtin.systemd_service Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None Dec 15 04:31:22 localhost systemd[1]: Reloading. Dec 15 04:31:22 localhost systemd-rc-local-generator[224469]: /etc/rc.d/rc.local is not marked executable, skipping. Dec 15 04:31:22 localhost systemd-sysv-generator[224474]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. 
Dec 15 04:31:22 localhost systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload Dec 15 04:31:22 localhost systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload Dec 15 04:31:22 localhost systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload Dec 15 04:31:22 localhost systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload Dec 15 04:31:22 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Dec 15 04:31:22 localhost systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload Dec 15 04:31:22 localhost systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload Dec 15 04:31:22 localhost systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload Dec 15 04:31:22 localhost systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload Dec 15 04:31:23 localhost python3.9[224594]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_compute.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None Dec 15 04:31:23 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:fc:1f:13 MACDST=fa:16:3e:b3:bb:e3 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=7829 DF PROTO=TCP SPT=44722 DPT=9100 SEQ=2552764496 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A839BF0370000000001030307) Dec 15 
04:31:23 localhost python3.9[224705]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_migration_target.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None Dec 15 04:31:24 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:fc:1f:13 MACDST=fa:16:3e:b3:bb:e3 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=7830 DF PROTO=TCP SPT=44722 DPT=9100 SEQ=2552764496 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A839BF4250000000001030307) Dec 15 04:31:24 localhost python3.9[224816]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_api_cron.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None Dec 15 04:31:24 localhost python3.9[224927]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_api.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None Dec 15 04:31:25 localhost python3.9[225038]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_conductor.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None Dec 15 04:31:26 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:fc:1f:13 MACDST=fa:16:3e:b3:bb:e3 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=7831 DF PROTO=TCP SPT=44722 DPT=9100 SEQ=2552764496 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT 
(020405500402080A839BFC250000000001030307) Dec 15 04:31:27 localhost python3.9[225149]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_metadata.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None Dec 15 04:31:27 localhost systemd[1]: Started /usr/bin/podman healthcheck run 4d46c406a3855fd1c322e5d86c048188ea5865daa07ba23aaebacc7879668923. Dec 15 04:31:27 localhost podman[225261]: 2025-12-15 09:31:27.750581402 +0000 UTC m=+0.091787919 container health_status 4d46c406a3855fd1c322e5d86c048188ea5865daa07ba23aaebacc7879668923 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=ovn_metadata_agent, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '07f3af15d79276418f54a8dde2fc251064527c3571430e14af77aaa2c01f1193-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', 
'/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, config_id=ovn_metadata_agent) Dec 15 04:31:27 localhost podman[225261]: 2025-12-15 09:31:27.782351887 +0000 UTC m=+0.123558354 container exec_died 4d46c406a3855fd1c322e5d86c048188ea5865daa07ba23aaebacc7879668923 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, tcib_managed=true, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '07f3af15d79276418f54a8dde2fc251064527c3571430e14af77aaa2c01f1193-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', 
'/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202) Dec 15 04:31:27 localhost systemd[1]: 4d46c406a3855fd1c322e5d86c048188ea5865daa07ba23aaebacc7879668923.service: Deactivated successfully. Dec 15 04:31:27 localhost python3.9[225260]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_scheduler.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None Dec 15 04:31:28 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:fc:1f:13 MACDST=fa:16:3e:b3:bb:e3 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=23284 DF PROTO=TCP SPT=36894 DPT=9101 SEQ=2994808243 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A839C06250000000001030307) Dec 15 04:31:29 localhost python3.9[225389]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_vnc_proxy.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None Dec 15 04:31:31 localhost python3.9[225585]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/openstack/config/nova setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S 
access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None Dec 15 04:31:31 localhost python3.9[225695]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/openstack/config/containers setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None Dec 15 04:31:31 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:fc:1f:13 MACDST=fa:16:3e:b3:bb:e3 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=25058 DF PROTO=TCP SPT=44404 DPT=9882 SEQ=3010571428 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A839C11D90000000001030307) Dec 15 04:31:32 localhost python3.9[225805]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/openstack/config/nova_nvme_cleaner setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None Dec 15 04:31:33 localhost python3.9[225915]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/nova setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None Dec 15 04:31:33 localhost python3.9[226025]: 
ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/_nova_secontext setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None Dec 15 04:31:34 localhost python3.9[226135]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/nova/instances setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None Dec 15 04:31:34 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:fc:1f:13 MACDST=fa:16:3e:b3:bb:e3 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=25060 DF PROTO=TCP SPT=44404 DPT=9882 SEQ=3010571428 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A839C1DE50000000001030307) Dec 15 04:31:35 localhost python3.9[226245]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/etc/ceph setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None Dec 15 04:31:35 localhost python3.9[226355]: ansible-ansible.builtin.file Invoked with group=zuul owner=zuul path=/etc/multipath setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None 
src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None Dec 15 04:31:36 localhost python3.9[226465]: ansible-ansible.builtin.file Invoked with group=zuul owner=zuul path=/etc/nvme setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None Dec 15 04:31:36 localhost systemd[1]: Started /usr/bin/podman healthcheck run 9f0f481c937606298203797a71924b7614a5d9e3f4a8afd837cfa5b9602aedeb. Dec 15 04:31:37 localhost systemd[1]: tmp-crun.UaumbG.mount: Deactivated successfully. Dec 15 04:31:37 localhost podman[226576]: 2025-12-15 09:31:37.069107056 +0000 UTC m=+0.097787565 container health_status 9f0f481c937606298203797a71924b7614a5d9e3f4a8afd837cfa5b9602aedeb (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, config_id=multipathd, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=multipathd, org.label-schema.license=GPLv2, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.build-date=20251202) Dec 15 04:31:37 localhost podman[226576]: 2025-12-15 09:31:37.111332999 +0000 UTC m=+0.140013508 container exec_died 9f0f481c937606298203797a71924b7614a5d9e3f4a8afd837cfa5b9602aedeb (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', 
'/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.vendor=CentOS, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image) Dec 15 04:31:37 localhost systemd[1]: 9f0f481c937606298203797a71924b7614a5d9e3f4a8afd837cfa5b9602aedeb.service: Deactivated successfully. Dec 15 04:31:37 localhost python3.9[226575]: ansible-ansible.builtin.file Invoked with group=zuul owner=zuul path=/run/openvswitch setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None Dec 15 04:31:37 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:fc:1f:13 MACDST=fa:16:3e:b3:bb:e3 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=55587 DF PROTO=TCP SPT=36708 DPT=9105 SEQ=1819289840 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A839C29650000000001030307) Dec 15 04:31:41 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:fc:1f:13 MACDST=fa:16:3e:b3:bb:e3 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=23286 DF PROTO=TCP SPT=36894 DPT=9101 SEQ=2994808243 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A839C37260000000001030307) Dec 15 04:31:43 localhost python3.9[226704]: ansible-ansible.builtin.getent Invoked with database=passwd key=nova fail_key=True service=None split=None Dec 15 04:31:44 localhost python3.9[226815]: 
ansible-ansible.builtin.group Invoked with gid=42436 name=nova state=present force=False system=False local=False non_unique=False gid_min=None gid_max=None Dec 15 04:31:45 localhost python3.9[226931]: ansible-ansible.builtin.user Invoked with comment=nova user group=nova groups=['libvirt'] name=nova shell=/bin/sh state=present uid=42436 non_unique=False force=False remove=False create_home=True system=False move_home=False append=False ssh_key_bits=0 ssh_key_type=rsa ssh_key_comment=ansible-generated on np0005559462.localdomain update_password=always home=None password=NOT_LOGGING_PARAMETER login_class=None password_expire_max=None password_expire_min=None password_expire_warn=None hidden=None seuser=None skeleton=None generate_ssh_key=None ssh_key_file=None ssh_key_passphrase=NOT_LOGGING_PARAMETER expires=None password_lock=None local=None profile=None authorization=None role=None umask=None password_expire_account_disable=None uid_min=None uid_max=None Dec 15 04:31:47 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:fc:1f:13 MACDST=fa:16:3e:b3:bb:e3 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=25062 DF PROTO=TCP SPT=44404 DPT=9882 SEQ=3010571428 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A839C4D260000000001030307) Dec 15 04:31:47 localhost sshd[226957]: main: sshd: ssh-rsa algorithm is disabled Dec 15 04:31:47 localhost systemd-logind[763]: New session 54 of user zuul. Dec 15 04:31:47 localhost systemd[1]: Started Session 54 of User zuul. Dec 15 04:31:47 localhost systemd[1]: session-54.scope: Deactivated successfully. Dec 15 04:31:47 localhost systemd-logind[763]: Session 54 logged out. Waiting for processes to exit. Dec 15 04:31:47 localhost systemd-logind[763]: Removed session 54. 
Dec 15 04:31:48 localhost python3.9[227068]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/nova/config.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Dec 15 04:31:48 localhost systemd[1]: Started /usr/bin/podman healthcheck run ac2eae30a2a7f971398af42ea67b3ed799d4b3341f7688ebcd73f7ca7f66c2a8. Dec 15 04:31:48 localhost podman[227102]: 2025-12-15 09:31:48.745106329 +0000 UTC m=+0.077857144 container health_status ac2eae30a2a7f971398af42ea67b3ed799d4b3341f7688ebcd73f7ca7f66c2a8 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '07f3af15d79276418f54a8dde2fc251064527c3571430e14af77aaa2c01f1193'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_id=ovn_controller, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true) Dec 15 04:31:48 localhost 
podman[227102]: 2025-12-15 09:31:48.784233445 +0000 UTC m=+0.116984290 container exec_died ac2eae30a2a7f971398af42ea67b3ed799d4b3341f7688ebcd73f7ca7f66c2a8 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, container_name=ovn_controller, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '07f3af15d79276418f54a8dde2fc251064527c3571430e14af77aaa2c01f1193'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, config_id=ovn_controller) Dec 15 04:31:48 localhost systemd[1]: ac2eae30a2a7f971398af42ea67b3ed799d4b3341f7688ebcd73f7ca7f66c2a8.service: Deactivated successfully. 
Dec 15 04:31:49 localhost python3.9[227179]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/nova/config.json mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1765791107.8102415-3502-214973380949997/.source.json follow=False _original_basename=config.json.j2 checksum=b51012bfb0ca26296dcf3793a2f284446fb1395e backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None Dec 15 04:31:49 localhost python3.9[227287]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/nova/nova-blank.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Dec 15 04:31:50 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:fc:1f:13 MACDST=fa:16:3e:b3:bb:e3 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=55589 DF PROTO=TCP SPT=36708 DPT=9105 SEQ=1819289840 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A839C59260000000001030307) Dec 15 04:31:50 localhost python3.9[227342]: ansible-ansible.legacy.file Invoked with mode=0644 setype=container_file_t dest=/var/lib/openstack/config/nova/nova-blank.conf _original_basename=nova-blank.conf recurse=False state=file path=/var/lib/openstack/config/nova/nova-blank.conf force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None attributes=None Dec 15 04:31:51 localhost python3.9[227450]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/nova/ssh-config follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Dec 15 04:31:51 
localhost ovn_metadata_agent[160585]: 2025-12-15 09:31:51.434 160590 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404 Dec 15 04:31:51 localhost ovn_metadata_agent[160585]: 2025-12-15 09:31:51.434 160590 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409 Dec 15 04:31:51 localhost ovn_metadata_agent[160585]: 2025-12-15 09:31:51.436 160590 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423 Dec 15 04:31:51 localhost python3.9[227536]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/nova/ssh-config mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1765791110.6888776-3502-233698019550500/.source follow=False _original_basename=ssh-config checksum=4297f735c41bdc1ff52d72e6f623a02242f37958 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None Dec 15 04:31:52 localhost python3.9[227644]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/nova/02-nova-host-specific.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Dec 15 04:31:52 localhost python3.9[227730]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/nova/02-nova-host-specific.conf mode=0644
setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1765791111.7117703-3502-114378139680340/.source.conf follow=False _original_basename=02-nova-host-specific.conf.j2 checksum=2665bfc4419dff19b3a41ac57ea64cb1932d7c0f backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None Dec 15 04:31:53 localhost python3.9[227838]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/nova/nova_statedir_ownership.py follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Dec 15 04:31:53 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:fc:1f:13 MACDST=fa:16:3e:b3:bb:e3 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=62051 DF PROTO=TCP SPT=52058 DPT=9100 SEQ=3869232803 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A839C65620000000001030307) Dec 15 04:31:53 localhost python3.9[227924]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/nova/nova_statedir_ownership.py mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1765791112.7495897-3502-51797583597202/.source.py follow=False _original_basename=nova_statedir_ownership.py checksum=c6c8a3cfefa5efd60ceb1408c4e977becedb71e2 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None Dec 15 04:31:54 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:fc:1f:13 MACDST=fa:16:3e:b3:bb:e3 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=62052 DF PROTO=TCP SPT=52058 DPT=9100 SEQ=3869232803 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A839C69660000000001030307) 
Dec 15 04:31:54 localhost python3.9[228032]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/nova/run-on-host follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Dec 15 04:31:54 localhost python3.9[228118]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/nova/run-on-host mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1765791113.8561966-3502-239794594401302/.source follow=False _original_basename=run-on-host checksum=93aba8edc83d5878604a66d37fea2f12b60bdea2 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None Dec 15 04:31:55 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:fc:1f:13 MACDST=fa:16:3e:b3:bb:e3 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=39337 DF PROTO=TCP SPT=58226 DPT=9102 SEQ=151846719 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A839C6FA50000000001030307) Dec 15 04:31:55 localhost python3.9[228228]: ansible-ansible.builtin.file Invoked with group=nova mode=0700 owner=nova path=/home/nova/.ssh state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None Dec 15 04:31:57 localhost python3.9[228338]: ansible-ansible.legacy.copy Invoked with dest=/home/nova/.ssh/authorized_keys group=nova mode=0600 owner=nova remote_src=True src=/var/lib/openstack/config/nova/ssh-publickey backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None 
checksum=None seuser=None serole=None selevel=None setype=None attributes=None Dec 15 04:31:57 localhost python3.9[228448]: ansible-ansible.builtin.stat Invoked with path=/var/lib/nova/compute_id follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1 Dec 15 04:31:58 localhost systemd[1]: Started /usr/bin/podman healthcheck run 4d46c406a3855fd1c322e5d86c048188ea5865daa07ba23aaebacc7879668923. Dec 15 04:31:58 localhost podman[228506]: 2025-12-15 09:31:58.813291757 +0000 UTC m=+0.143340405 container health_status 4d46c406a3855fd1c322e5d86c048188ea5865daa07ba23aaebacc7879668923 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, maintainer=OpenStack Kubernetes Operator team, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '07f3af15d79276418f54a8dde2fc251064527c3571430e14af77aaa2c01f1193-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, tcib_managed=true) Dec 15 04:31:58 localhost podman[228506]: 2025-12-15 09:31:58.84851535 +0000 UTC m=+0.178563988 container exec_died 4d46c406a3855fd1c322e5d86c048188ea5865daa07ba23aaebacc7879668923 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '07f3af15d79276418f54a8dde2fc251064527c3571430e14af77aaa2c01f1193-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_metadata_agent) Dec 15 04:31:58 localhost systemd[1]: 4d46c406a3855fd1c322e5d86c048188ea5865daa07ba23aaebacc7879668923.service: Deactivated successfully. Dec 15 04:31:58 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:fc:1f:13 MACDST=fa:16:3e:b3:bb:e3 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=10276 DF PROTO=TCP SPT=32836 DPT=9101 SEQ=1320123975 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A839C7B660000000001030307) Dec 15 04:31:59 localhost python3.9[228580]: ansible-ansible.builtin.file Invoked with group=nova mode=0400 owner=nova path=/var/lib/nova/compute_id state=file recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None Dec 15 04:31:59 localhost python3.9[228689]: ansible-ansible.builtin.stat Invoked with path=/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1 Dec 15 04:32:00 localhost python3.9[228799]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/containers/nova_compute.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Dec 15 04:32:01 localhost python3.9[228885]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/containers/nova_compute.json 
mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1765791120.2270198-3877-52282352943806/.source.json follow=False _original_basename=nova_compute.json.j2 checksum=211ffd0bca4b407eb4de45a749ef70116a7806fd backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None Dec 15 04:32:01 localhost python3.9[228993]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/containers/nova_compute_init.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Dec 15 04:32:01 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:fc:1f:13 MACDST=fa:16:3e:b3:bb:e3 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=37346 DF PROTO=TCP SPT=37302 DPT=9882 SEQ=3074295749 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A839C870A0000000001030307) Dec 15 04:32:02 localhost python3.9[229079]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/containers/nova_compute_init.json mode=0700 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1765791121.4575667-3922-201175109249043/.source.json follow=False _original_basename=nova_compute_init.json.j2 checksum=60b024e6db49dc6e700fc0d50263944d98d4c034 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None Dec 15 04:32:03 localhost python3.9[229189]: ansible-container_config_data Invoked with config_overrides={} config_path=/var/lib/openstack/config/containers config_pattern=nova_compute_init.json debug=False Dec 15 04:32:04 localhost python3.9[229299]: ansible-container_config_hash Invoked with check_mode=False 
config_vol_prefix=/var/lib/openstack Dec 15 04:32:04 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:fc:1f:13 MACDST=fa:16:3e:b3:bb:e3 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=37348 DF PROTO=TCP SPT=37302 DPT=9882 SEQ=3074295749 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A839C93250000000001030307) Dec 15 04:32:05 localhost python3[229409]: ansible-edpm_container_manage Invoked with concurrency=1 config_dir=/var/lib/openstack/config/containers config_id=edpm config_overrides={} config_patterns=nova_compute_init.json containers=[] log_base_path=/var/log/containers/stdouts debug=False Dec 15 04:32:07 localhost systemd[1]: Started /usr/bin/podman healthcheck run 9f0f481c937606298203797a71924b7614a5d9e3f4a8afd837cfa5b9602aedeb. Dec 15 04:32:07 localhost podman[229437]: 2025-12-15 09:32:07.735321047 +0000 UTC m=+0.064776464 container health_status 9f0f481c937606298203797a71924b7614a5d9e3f4a8afd837cfa5b9602aedeb (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', 
'/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3, config_id=multipathd, org.label-schema.schema-version=1.0, tcib_managed=true, container_name=multipathd, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image) Dec 15 04:32:07 localhost podman[229437]: 2025-12-15 09:32:07.772239484 +0000 UTC m=+0.101694881 container exec_died 9f0f481c937606298203797a71924b7614a5d9e3f4a8afd837cfa5b9602aedeb (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_id=multipathd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', 
'/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.build-date=20251202, tcib_managed=true, container_name=multipathd, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, managed_by=edpm_ansible) Dec 15 04:32:07 localhost systemd[1]: 9f0f481c937606298203797a71924b7614a5d9e3f4a8afd837cfa5b9602aedeb.service: Deactivated successfully. Dec 15 04:32:07 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:fc:1f:13 MACDST=fa:16:3e:b3:bb:e3 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=41037 DF PROTO=TCP SPT=40726 DPT=9105 SEQ=1887043291 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A839C9EA50000000001030307) Dec 15 04:32:11 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:fc:1f:13 MACDST=fa:16:3e:b3:bb:e3 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=10278 DF PROTO=TCP SPT=32836 DPT=9101 SEQ=1320123975 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A839CAB260000000001030307) Dec 15 04:32:14 localhost sshd[229481]: main: sshd: ssh-rsa algorithm is disabled Dec 15 04:32:15 localhost podman[229424]: 2025-12-15 09:32:05.363973205 +0000 UTC m=+0.039159838 image pull quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified Dec 15 04:32:15 localhost podman[229506]: Dec 15 04:32:15 localhost podman[229506]: 2025-12-15 09:32:15.925165773 +0000 UTC m=+0.082448338 container create c86c0d58ed395af407f97b4ea9c613ee83c5b4d277ca4b706ab48762a26987aa (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute_init, io.buildah.version=1.41.3, 
tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': False, 'user': 'root', 'restart': 'never', 'command': 'bash -c $* -- eval python3 /sbin/nova_statedir_ownership.py | logger -t nova_compute_init', 'net': 'none', 'security_opt': ['label=disable'], 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': '/var/lib/nova/compute_id', '__OS_DEBUG': False}, 'volumes': ['/dev/log:/dev/log', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', '/var/lib/openstack/config/nova/nova_statedir_ownership.py:/sbin/nova_statedir_ownership.py:z']}, container_name=nova_compute_init, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, config_id=edpm, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS) Dec 15 04:32:15 localhost podman[229506]: 2025-12-15 09:32:15.88172953 +0000 UTC m=+0.039012125 image pull quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified Dec 15 04:32:15 localhost python3[229409]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman create --name nova_compute_init --conmon-pidfile /run/nova_compute_init.pid --env NOVA_STATEDIR_OWNERSHIP_SKIP=/var/lib/nova/compute_id --env __OS_DEBUG=False --label config_id=edpm --label container_name=nova_compute_init --label managed_by=edpm_ansible --label config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': False, 'user': 'root', 'restart': 'never', 'command': 'bash -c $* -- eval python3 /sbin/nova_statedir_ownership.py | logger -t nova_compute_init', 'net': 'none', 'security_opt': ['label=disable'], 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': '/var/lib/nova/compute_id', '__OS_DEBUG': 
False}, 'volumes': ['/dev/log:/dev/log', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', '/var/lib/openstack/config/nova/nova_statedir_ownership.py:/sbin/nova_statedir_ownership.py:z']} --log-driver journald --log-level info --network none --privileged=False --security-opt label=disable --user root --volume /dev/log:/dev/log --volume /var/lib/nova:/var/lib/nova:shared --volume /var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z --volume /var/lib/openstack/config/nova/nova_statedir_ownership.py:/sbin/nova_statedir_ownership.py:z quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified bash -c $* -- eval python3 /sbin/nova_statedir_ownership.py | logger -t nova_compute_init Dec 15 04:32:17 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:fc:1f:13 MACDST=fa:16:3e:b3:bb:e3 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=37350 DF PROTO=TCP SPT=37302 DPT=9882 SEQ=3074295749 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A839CC3260000000001030307) Dec 15 04:32:18 localhost python3.9[229652]: ansible-ansible.builtin.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1 Dec 15 04:32:19 localhost systemd[1]: Started /usr/bin/podman healthcheck run ac2eae30a2a7f971398af42ea67b3ed799d4b3341f7688ebcd73f7ca7f66c2a8. 
Dec 15 04:32:19 localhost podman[229672]: 2025-12-15 09:32:19.75546588 +0000 UTC m=+0.082586851 container health_status ac2eae30a2a7f971398af42ea67b3ed799d4b3341f7688ebcd73f7ca7f66c2a8 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '07f3af15d79276418f54a8dde2fc251064527c3571430e14af77aaa2c01f1193'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, config_id=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.vendor=CentOS) Dec 15 04:32:19 localhost podman[229672]: 2025-12-15 09:32:19.822356299 +0000 UTC m=+0.149477220 container exec_died ac2eae30a2a7f971398af42ea67b3ed799d4b3341f7688ebcd73f7ca7f66c2a8 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, 
tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '07f3af15d79276418f54a8dde2fc251064527c3571430e14af77aaa2c01f1193'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_managed=true, container_name=ovn_controller, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS) Dec 15 04:32:19 localhost systemd[1]: ac2eae30a2a7f971398af42ea67b3ed799d4b3341f7688ebcd73f7ca7f66c2a8.service: Deactivated successfully. 
Dec 15 04:32:20 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:fc:1f:13 MACDST=fa:16:3e:b3:bb:e3 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=41039 DF PROTO=TCP SPT=40726 DPT=9105 SEQ=1887043291 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A839CCF260000000001030307) Dec 15 04:32:20 localhost python3.9[229789]: ansible-container_config_data Invoked with config_overrides={} config_path=/var/lib/openstack/config/containers config_pattern=nova_compute.json debug=False Dec 15 04:32:21 localhost python3.9[229899]: ansible-container_config_hash Invoked with check_mode=False config_vol_prefix=/var/lib/openstack Dec 15 04:32:22 localhost python3[230009]: ansible-edpm_container_manage Invoked with concurrency=1 config_dir=/var/lib/openstack/config/containers config_id=edpm config_overrides={} config_patterns=nova_compute.json containers=[] log_base_path=/var/log/containers/stdouts debug=False Dec 15 04:32:22 localhost python3[230009]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: [#012 {#012 "Id": "e3166cc074f328e3b121ff82d56ed43a2542af699baffe6874520fe3837c2b18",#012 "Digest": "sha256:526afed30c44ef41d54d63a4f4db122bc603f775243ae350a59d2e0b5050076b",#012 "RepoTags": [#012 "quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified"#012 ],#012 "RepoDigests": [#012 "quay.io/podified-antelope-centos9/openstack-nova-compute@sha256:526afed30c44ef41d54d63a4f4db122bc603f775243ae350a59d2e0b5050076b"#012 ],#012 "Parent": "",#012 "Comment": "",#012 "Created": "2025-12-08T06:29:58.491919425Z",#012 "Config": {#012 "User": "nova",#012 "Env": [#012 "PATH=/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin",#012 "LANG=en_US.UTF-8",#012 "TZ=UTC",#012 "container=oci"#012 ],#012 "Entrypoint": [#012 "dumb-init",#012 "--single-child",#012 "--"#012 ],#012 "Cmd": [#012 "kolla_start"#012 ],#012 "Labels": {#012 "io.buildah.version": "1.41.3",#012 "maintainer": "OpenStack Kubernetes Operator 
team",#012 "org.label-schema.build-date": "20251202",#012 "org.label-schema.license": "GPLv2",#012 "org.label-schema.name": "CentOS Stream 9 Base Image",#012 "org.label-schema.schema-version": "1.0",#012 "org.label-schema.vendor": "CentOS",#012 "tcib_build_tag": "c3923531bcda0b0811b2d5053f189beb",#012 "tcib_managed": "true"#012 },#012 "StopSignal": "SIGTERM"#012 },#012 "Version": "",#012 "Author": "",#012 "Architecture": "amd64",#012 "Os": "linux",#012 "Size": 1212370809,#012 "VirtualSize": 1212370809,#012 "GraphDriver": {#012 "Name": "overlay",#012 "Data": {#012 "LowerDir": "/var/lib/containers/storage/overlay/0825dff58a4425bd2cec24761b25b1273896b2e1fd9e1bbd68a0daa8371ae8a9/diff:/var/lib/containers/storage/overlay/4c2a493cc38fe0c2d274b137f7d549c92d76e83cf216e797584fb8469937682d/diff:/var/lib/containers/storage/overlay/102653142e2259aa6223045dee7736729104ac8aed3ce9b3c87a6d0787e59de8/diff:/var/lib/containers/storage/overlay/a170762be59c15b133bd19c602942600caa3082ffe7158ccee8771dfc16bb660/diff",#012 "UpperDir": "/var/lib/containers/storage/overlay/fa799eba62894ffeea98e2f07b9b6a110cbee9080a6c3af1755332d33ad27617/diff",#012 "WorkDir": "/var/lib/containers/storage/overlay/fa799eba62894ffeea98e2f07b9b6a110cbee9080a6c3af1755332d33ad27617/work"#012 }#012 },#012 "RootFS": {#012 "Type": "layers",#012 "Layers": [#012 "sha256:a170762be59c15b133bd19c602942600caa3082ffe7158ccee8771dfc16bb660",#012 "sha256:47bbb708952ccfdaf6b1a15cd5347cc2e9ee37e63ec65603401dcebf66de9242",#012 "sha256:dde195c4be3ea0882f3029365e3a9510c9e08a199c8a2c93ddc2b8aa725a10f1",#012 "sha256:191522e021d026966b0789970c823d3aa8f268180d3ac4a9714f61201ef3b79e",#012 "sha256:226bf9c0939dc7236a630e18a3cd37bc8e773e86f3cf4ef2cedf4d22a6a7d337"#012 ]#012 },#012 "Labels": {#012 "io.buildah.version": "1.41.3",#012 "maintainer": "OpenStack Kubernetes Operator team",#012 "org.label-schema.build-date": "20251202",#012 "org.label-schema.license": "GPLv2",#012 "org.label-schema.name": "CentOS Stream 9 Base Image",#012 
"org.label-schema.schema-version": "1.0",#012 "org.label-schema.vendor": "CentOS",#012 "tcib_build_tag": "c3923531bcda0b0811b2d5053f189beb",#012 "tcib_managed": "true"#012 },#012 "Annotations": {},#012 "ManifestType": "application/vnd.docker.distribution.manifest.v2+json",#012 "User": "nova",#012 "History": [#012 {#012 "created": "2025-12-02T04:26:51.317229596Z",#012 "created_by": "/bin/sh -c #(nop) ADD file:9f05a0f58e10b77188c7243d914ce56c5ce3e0f2ee7e13a7b0d4990588c97b99 in / ",#012 "empty_layer": true#012 },#012 {#012 "created": "2025-12-02T04:26:51.317315213Z",#012 "created_by": "/bin/sh -c #(nop) LABEL org.label-schema.schema-version=\"1.0\" org.label-schema.name=\"CentOS Stream 9 Base Image\" org.label-schema.vendor=\"CentOS\" org.label-schema.license=\"GPLv2\" org.label-schema.build-date=\"20251202\"",#012 "empty_layer": true#012 },#012 {#012 "created": "2025-12-02T04:26:54.063957926Z",#012 "created_by": "/bin/sh -c #(nop) CMD [\"/bin/bash\"]"#012 },#012 {#012 "created": "2025-12-08T06:08:28.750777742Z",#012 "created_by": "/bin/sh -c #(nop) LABEL maintainer=\"OpenStack Kubernetes Operator team\"",#012 "comment": "FROM quay.io/centos/centos:stream9",#012 "empty_layer": true#012 },#012 {#012 "created": "2025-12-08T06:08:28.750791962Z",#012 "created_by": "/bin/sh -c #(nop) LABEL tcib_managed=true",#012 "empty_layer": true#012 },#012 {#012 "created": "2025-12-08T06:08:28.750804372Z",#012 "created_by": "/bin/sh -c #(nop) ENV LANG=\"en_US.UTF-8\"",#012 "empty_layer": true#012 },#012 {#012 "created": "2025-12-08T06:08:28.750813613Z",#012 "created_by": "/bin/sh -c #(nop) ENV TZ=\"UTC\"",#012 "empty_layer": true#012 },#012 {#012 "created": "2025-12-08T06:08:28.750824813Z",#012 "created_by": "/bin/sh -c #(nop) ENV container=\"oci\"",#012 "empty_layer": true#012 },#012 {#012 "created": "2025-12-08T06:08:28.750833663Z",#012 "created_by": "/bin/sh -c #(nop) USER root",#012 "empty_layer": true#012 },#012 {#012 "created": "2025-12-08T06:08:29.160435164Z",#012 "created_by": 
"/bin/sh -c if [ -f \"/etc/yum.repos.d/ubi.repo\" ]; then rm -f /etc/yum.repos.d/ubi.repo && dnf clean all && rm -rf /var/cache/dnf; fi",#012 "empty_layer": true#012 },#012 {#012 "created": "2025-12-08T06:09:05.859236491Z",#012 "created_by": "/bin/sh -c dnf install -y crudini && crudini --del /etc/dnf/dnf.conf main override_install_langs && crudini --set /etc/dnf/dnf.conf main clean_requirements_on_remove True && crudini --set /etc/dnf/dnf.conf main exactarch 1 && crudini --set /etc/dnf/dnf.conf main gpgcheck 1 && crudini --set /etc/dnf/dnf.conf main install_weak_deps False && if [ 'centos' == 'centos' ];then crudini --set /etc/dnf/dnf.conf main best False; fi && crudini --set /etc/dnf/dnf.conf main installonly_limit 0 && crudini --set /etc/dnf/dnf.conf main keepcache 0 && crudini --set /etc/dnf/dnf.conf main obsoletes 1 && crudini --set /etc/dnf/dnf.conf main plugins 1 && crudini --set /etc/dnf/dnf.conf main skip_missing_names_on_install False && crudini --set /etc/dnf/dnf.conf main tsflags nodocs",#012 "empty_layer": true#012 },#012 {#012 Dec 15 04:32:22 localhost podman[230059]: 2025-12-15 09:32:22.976353288 +0000 UTC m=+0.092467775 container remove 36a88083ed613f6871d2631ae462bca308a45ba583333f7cfe4d0b0c40a18ba5 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, container_name=nova_compute, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.41.4, org.opencontainers.image.revision=d13aeaae6d02e9d9273775f1920879be7af2cf2d, batch=17.1_20251118.1, io.openshift.expose-services=, release=1761123044, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.12 17.1_20251118.1, architecture=x86_64, tcib_managed=true, vcs-type=git, version=17.1.12, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., config_id=tripleo_step5, managed_by=tripleo_ansible, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-19T00:36:58Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, maintainer=OpenStack TripleO Team, name=rhosp17/openstack-nova-compute, com.redhat.component=openstack-nova-compute-container, description=Red Hat OpenStack Platform 17.1 nova-compute, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '182e509007ab5e6e5b2500a552cbd5ba-879500e96bf8dfb93687004bd86f2317'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', 
'/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, vcs-ref=d13aeaae6d02e9d9273775f1920879be7af2cf2d) Dec 15 04:32:22 localhost python3[230009]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman rm --force nova_compute Dec 15 04:32:23 localhost podman[230074]: Dec 15 04:32:23 localhost podman[230074]: 2025-12-15 09:32:23.084064859 +0000 UTC m=+0.088457018 container create b5d335c872cd511e4f9ef497ade55685828922bcddd4d43ed3b0589e6eb54c6a (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'pid': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.build-date=20251202, config_id=edpm, org.label-schema.schema-version=1.0, tcib_managed=true, 
tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, container_name=nova_compute, managed_by=edpm_ansible) Dec 15 04:32:23 localhost podman[230074]: 2025-12-15 09:32:23.041921581 +0000 UTC m=+0.046313790 image pull quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified Dec 15 04:32:23 localhost python3[230009]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman create --name nova_compute --conmon-pidfile /run/nova_compute.pid --env KOLLA_CONFIG_STRATEGY=COPY_ALWAYS --label config_id=edpm --label container_name=nova_compute --label managed_by=edpm_ansible --label config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'pid': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']} --log-driver journald --log-level info --network host --pid host --privileged=True --user nova --volume /var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro --volume /var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z --volume /etc/localtime:/etc/localtime:ro --volume /lib/modules:/lib/modules:ro --volume /dev:/dev --volume /var/lib/libvirt:/var/lib/libvirt --volume 
/run/libvirt:/run/libvirt:shared --volume /var/lib/nova:/var/lib/nova:shared --volume /var/lib/iscsi:/var/lib/iscsi --volume /etc/multipath:/etc/multipath:z --volume /etc/multipath.conf:/etc/multipath.conf:ro --volume /etc/iscsi:/etc/iscsi:ro --volume /etc/nvme:/etc/nvme --volume /var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro --volume /etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified kolla_start Dec 15 04:32:23 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:fc:1f:13 MACDST=fa:16:3e:b3:bb:e3 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=31816 DF PROTO=TCP SPT=41280 DPT=9100 SEQ=863237432 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A839CDB3B0000000001030307) Dec 15 04:32:24 localhost python3.9[230221]: ansible-ansible.builtin.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1 Dec 15 04:32:24 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:fc:1f:13 MACDST=fa:16:3e:b3:bb:e3 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=31817 DF PROTO=TCP SPT=41280 DPT=9100 SEQ=863237432 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A839CDF250000000001030307) Dec 15 04:32:24 localhost python3.9[230333]: ansible-file Invoked with path=/etc/systemd/system/edpm_nova_compute.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Dec 15 04:32:25 localhost python3.9[230442]: ansible-copy Invoked with 
src=/home/zuul/.ansible/tmp/ansible-tmp-1765791144.953517-4210-65662248070943/source dest=/etc/systemd/system/edpm_nova_compute.service mode=0644 owner=root group=root backup=False force=True remote_src=False follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None Dec 15 04:32:25 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:fc:1f:13 MACDST=fa:16:3e:b3:bb:e3 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=26508 DF PROTO=TCP SPT=41008 DPT=9102 SEQ=4203990259 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A839CE4E50000000001030307) Dec 15 04:32:25 localhost python3.9[230497]: ansible-systemd Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None Dec 15 04:32:26 localhost systemd[1]: Reloading. Dec 15 04:32:26 localhost systemd-sysv-generator[230523]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. Dec 15 04:32:26 localhost systemd-rc-local-generator[230520]: /etc/rc.d/rc.local is not marked executable, skipping. 
Dec 15 04:32:26 localhost systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload Dec 15 04:32:26 localhost systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload Dec 15 04:32:26 localhost systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload Dec 15 04:32:26 localhost systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload Dec 15 04:32:26 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Dec 15 04:32:26 localhost systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload Dec 15 04:32:26 localhost systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload Dec 15 04:32:26 localhost systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload Dec 15 04:32:26 localhost systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload Dec 15 04:32:26 localhost python3.9[230588]: ansible-systemd Invoked with state=restarted name=edpm_nova_compute.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None Dec 15 04:32:26 localhost systemd[1]: Reloading. Dec 15 04:32:27 localhost systemd-sysv-generator[230620]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. 
Dec 15 04:32:27 localhost systemd-rc-local-generator[230613]: /etc/rc.d/rc.local is not marked executable, skipping. Dec 15 04:32:27 localhost systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload Dec 15 04:32:27 localhost systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload Dec 15 04:32:27 localhost systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload Dec 15 04:32:27 localhost systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload Dec 15 04:32:27 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Dec 15 04:32:27 localhost systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload Dec 15 04:32:27 localhost systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload Dec 15 04:32:27 localhost systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload Dec 15 04:32:27 localhost systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload Dec 15 04:32:27 localhost systemd[1]: Starting nova_compute container... Dec 15 04:32:27 localhost systemd[1]: Started libcrun container. 
Dec 15 04:32:27 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/90c3d76968c2380582c34ed47ce3bd3c1017e3f0027bb2b60c7c3fb4e76ff2b1/merged/etc/multipath supports timestamps until 2038 (0x7fffffff) Dec 15 04:32:27 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/90c3d76968c2380582c34ed47ce3bd3c1017e3f0027bb2b60c7c3fb4e76ff2b1/merged/etc/nvme supports timestamps until 2038 (0x7fffffff) Dec 15 04:32:27 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/90c3d76968c2380582c34ed47ce3bd3c1017e3f0027bb2b60c7c3fb4e76ff2b1/merged/var/lib/iscsi supports timestamps until 2038 (0x7fffffff) Dec 15 04:32:27 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/90c3d76968c2380582c34ed47ce3bd3c1017e3f0027bb2b60c7c3fb4e76ff2b1/merged/var/lib/nova supports timestamps until 2038 (0x7fffffff) Dec 15 04:32:27 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/90c3d76968c2380582c34ed47ce3bd3c1017e3f0027bb2b60c7c3fb4e76ff2b1/merged/var/lib/libvirt supports timestamps until 2038 (0x7fffffff) Dec 15 04:32:27 localhost podman[230628]: 2025-12-15 09:32:27.469866007 +0000 UTC m=+0.131018536 container init b5d335c872cd511e4f9ef497ade55685828922bcddd4d43ed3b0589e6eb54c6a (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute, config_id=edpm, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, container_name=nova_compute, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'pid': 'host', 'environment': 
{'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']}, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image) Dec 15 04:32:27 localhost podman[230628]: 2025-12-15 09:32:27.480260825 +0000 UTC m=+0.141413354 container start b5d335c872cd511e4f9ef497ade55685828922bcddd4d43ed3b0589e6eb54c6a (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=edpm, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'pid': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', 
'/etc/multipath.conf:/etc/multipath.conf:ro', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']}, maintainer=OpenStack Kubernetes Operator team, container_name=nova_compute, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3) Dec 15 04:32:27 localhost podman[230628]: nova_compute Dec 15 04:32:27 localhost systemd[1]: Started nova_compute container. Dec 15 04:32:27 localhost nova_compute[230642]: + sudo -E kolla_set_configs Dec 15 04:32:27 localhost nova_compute[230642]: INFO:__main__:Loading config file at /var/lib/kolla/config_files/config.json Dec 15 04:32:27 localhost nova_compute[230642]: INFO:__main__:Validating config file Dec 15 04:32:27 localhost nova_compute[230642]: INFO:__main__:Kolla config strategy set to: COPY_ALWAYS Dec 15 04:32:27 localhost nova_compute[230642]: INFO:__main__:Copying service configuration files Dec 15 04:32:27 localhost nova_compute[230642]: INFO:__main__:Deleting /etc/nova/nova.conf Dec 15 04:32:27 localhost nova_compute[230642]: INFO:__main__:Copying /var/lib/kolla/config_files/nova-blank.conf to /etc/nova/nova.conf Dec 15 04:32:27 localhost nova_compute[230642]: INFO:__main__:Setting permission for /etc/nova/nova.conf Dec 15 04:32:27 localhost nova_compute[230642]: INFO:__main__:Copying /var/lib/kolla/config_files/01-nova.conf to /etc/nova/nova.conf.d/01-nova.conf Dec 15 04:32:27 localhost nova_compute[230642]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/01-nova.conf Dec 15 04:32:27 localhost nova_compute[230642]: INFO:__main__:Copying /var/lib/kolla/config_files/03-ceph-nova.conf to /etc/nova/nova.conf.d/03-ceph-nova.conf Dec 15 04:32:27 localhost nova_compute[230642]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/03-ceph-nova.conf Dec 15 04:32:27 localhost nova_compute[230642]: 
INFO:__main__:Copying /var/lib/kolla/config_files/99-nova-compute-cells-workarounds.conf to /etc/nova/nova.conf.d/99-nova-compute-cells-workarounds.conf Dec 15 04:32:27 localhost nova_compute[230642]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/99-nova-compute-cells-workarounds.conf Dec 15 04:32:27 localhost nova_compute[230642]: INFO:__main__:Copying /var/lib/kolla/config_files/nova-blank.conf to /etc/nova/nova.conf.d/nova-blank.conf Dec 15 04:32:27 localhost nova_compute[230642]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/nova-blank.conf Dec 15 04:32:27 localhost nova_compute[230642]: INFO:__main__:Copying /var/lib/kolla/config_files/02-nova-host-specific.conf to /etc/nova/nova.conf.d/02-nova-host-specific.conf Dec 15 04:32:27 localhost nova_compute[230642]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/02-nova-host-specific.conf Dec 15 04:32:27 localhost nova_compute[230642]: INFO:__main__:Deleting /etc/ceph Dec 15 04:32:27 localhost nova_compute[230642]: INFO:__main__:Creating directory /etc/ceph Dec 15 04:32:27 localhost nova_compute[230642]: INFO:__main__:Setting permission for /etc/ceph Dec 15 04:32:27 localhost nova_compute[230642]: INFO:__main__:Copying /var/lib/kolla/config_files/ceph/ceph.conf to /etc/ceph/ceph.conf Dec 15 04:32:27 localhost nova_compute[230642]: INFO:__main__:Setting permission for /etc/ceph/ceph.conf Dec 15 04:32:27 localhost nova_compute[230642]: INFO:__main__:Copying /var/lib/kolla/config_files/ceph/ceph.client.openstack.keyring to /etc/ceph/ceph.client.openstack.keyring Dec 15 04:32:27 localhost nova_compute[230642]: INFO:__main__:Setting permission for /etc/ceph/ceph.client.openstack.keyring Dec 15 04:32:27 localhost nova_compute[230642]: INFO:__main__:Copying /var/lib/kolla/config_files/ssh-privatekey to /var/lib/nova/.ssh/ssh-privatekey Dec 15 04:32:27 localhost nova_compute[230642]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/ssh-privatekey Dec 15 04:32:27 localhost 
nova_compute[230642]: INFO:__main__:Deleting /var/lib/nova/.ssh/config Dec 15 04:32:27 localhost nova_compute[230642]: INFO:__main__:Copying /var/lib/kolla/config_files/ssh-config to /var/lib/nova/.ssh/config Dec 15 04:32:27 localhost nova_compute[230642]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/config Dec 15 04:32:27 localhost nova_compute[230642]: INFO:__main__:Deleting /usr/sbin/iscsiadm Dec 15 04:32:27 localhost nova_compute[230642]: INFO:__main__:Copying /var/lib/kolla/config_files/run-on-host to /usr/sbin/iscsiadm Dec 15 04:32:27 localhost nova_compute[230642]: INFO:__main__:Setting permission for /usr/sbin/iscsiadm Dec 15 04:32:27 localhost nova_compute[230642]: INFO:__main__:Writing out command to execute Dec 15 04:32:27 localhost nova_compute[230642]: INFO:__main__:Setting permission for /etc/ceph/ceph.conf Dec 15 04:32:27 localhost nova_compute[230642]: INFO:__main__:Setting permission for /etc/ceph/ceph.client.openstack.keyring Dec 15 04:32:27 localhost nova_compute[230642]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/ Dec 15 04:32:27 localhost nova_compute[230642]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/ssh-privatekey Dec 15 04:32:27 localhost nova_compute[230642]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/config Dec 15 04:32:27 localhost nova_compute[230642]: ++ cat /run_command Dec 15 04:32:27 localhost nova_compute[230642]: + CMD=nova-compute Dec 15 04:32:27 localhost nova_compute[230642]: + ARGS= Dec 15 04:32:27 localhost nova_compute[230642]: + sudo kolla_copy_cacerts Dec 15 04:32:27 localhost nova_compute[230642]: + [[ ! -n '' ]] Dec 15 04:32:27 localhost nova_compute[230642]: + . 
kolla_extend_start Dec 15 04:32:27 localhost nova_compute[230642]: + echo 'Running command: '\''nova-compute'\''' Dec 15 04:32:27 localhost nova_compute[230642]: Running command: 'nova-compute' Dec 15 04:32:27 localhost nova_compute[230642]: + umask 0022 Dec 15 04:32:27 localhost nova_compute[230642]: + exec nova-compute Dec 15 04:32:28 localhost python3.9[230762]: ansible-ansible.builtin.stat Invoked with path=/etc/systemd/system/edpm_nova_nvme_cleaner_healthcheck.service follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1 Dec 15 04:32:28 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:fc:1f:13 MACDST=fa:16:3e:b3:bb:e3 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=43901 DF PROTO=TCP SPT=51200 DPT=9101 SEQ=4199071406 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A839CF0A50000000001030307) Dec 15 04:32:29 localhost nova_compute[230642]: 2025-12-15 09:32:29.266 230646 DEBUG os_vif [-] Loaded VIF plugin class '' with name 'linux_bridge' initialize /usr/lib/python3.9/site-packages/os_vif/__init__.py:44#033[00m Dec 15 04:32:29 localhost nova_compute[230642]: 2025-12-15 09:32:29.267 230646 DEBUG os_vif [-] Loaded VIF plugin class '' with name 'noop' initialize /usr/lib/python3.9/site-packages/os_vif/__init__.py:44#033[00m Dec 15 04:32:29 localhost nova_compute[230642]: 2025-12-15 09:32:29.267 230646 DEBUG os_vif [-] Loaded VIF plugin class '' with name 'ovs' initialize /usr/lib/python3.9/site-packages/os_vif/__init__.py:44#033[00m Dec 15 04:32:29 localhost nova_compute[230642]: 2025-12-15 09:32:29.267 230646 INFO os_vif [-] Loaded VIF plugins: linux_bridge, noop, ovs#033[00m Dec 15 04:32:29 localhost nova_compute[230642]: 2025-12-15 09:32:29.385 230646 DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): grep -F node.session.scan /sbin/iscsiadm execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m 
Dec 15 04:32:29 localhost nova_compute[230642]: 2025-12-15 09:32:29.399 230646 DEBUG oslo_concurrency.processutils [-] CMD "grep -F node.session.scan /sbin/iscsiadm" returned: 1 in 0.014s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m Dec 15 04:32:29 localhost nova_compute[230642]: 2025-12-15 09:32:29.400 230646 DEBUG oslo_concurrency.processutils [-] 'grep -F node.session.scan /sbin/iscsiadm' failed. Not Retrying. execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:473#033[00m Dec 15 04:32:29 localhost systemd[1]: Started /usr/bin/podman healthcheck run 4d46c406a3855fd1c322e5d86c048188ea5865daa07ba23aaebacc7879668923. Dec 15 04:32:29 localhost python3.9[230874]: ansible-ansible.builtin.stat Invoked with path=/etc/systemd/system/edpm_nova_nvme_cleaner.service follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1 Dec 15 04:32:29 localhost podman[230875]: 2025-12-15 09:32:29.764662622 +0000 UTC m=+0.088631682 container health_status 4d46c406a3855fd1c322e5d86c048188ea5865daa07ba23aaebacc7879668923 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, tcib_managed=true, container_name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '07f3af15d79276418f54a8dde2fc251064527c3571430e14af77aaa2c01f1193-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 
'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, config_id=ovn_metadata_agent) Dec 15 04:32:29 localhost podman[230875]: 2025-12-15 09:32:29.774366321 +0000 UTC m=+0.098335421 container exec_died 4d46c406a3855fd1c322e5d86c048188ea5865daa07ba23aaebacc7879668923 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '07f3af15d79276418f54a8dde2fc251064527c3571430e14af77aaa2c01f1193-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': 
'/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent) Dec 15 04:32:29 localhost systemd[1]: 4d46c406a3855fd1c322e5d86c048188ea5865daa07ba23aaebacc7879668923.service: Deactivated successfully. Dec 15 04:32:29 localhost nova_compute[230642]: 2025-12-15 09:32:29.815 230646 INFO nova.virt.driver [None req-49080c14-bd80-4b1c-bc6f-a59411562e0b - - - - - -] Loading compute driver 'libvirt.LibvirtDriver'#033[00m Dec 15 04:32:29 localhost nova_compute[230642]: 2025-12-15 09:32:29.926 230646 INFO nova.compute.provider_config [None req-49080c14-bd80-4b1c-bc6f-a59411562e0b - - - - - -] No provider configs found in /etc/nova/provider_config/. 
If files are present, ensure the Nova process has access.#033[00m Dec 15 04:32:29 localhost nova_compute[230642]: 2025-12-15 09:32:29.959 230646 WARNING nova.service [None req-49080c14-bd80-4b1c-bc6f-a59411562e0b - - - - - -] Current Nova version does not support computes older than Yoga but the minimum compute service level in your cell is 57 and the oldest supported service level is 61.: nova.exception.TooOldComputeService: Current Nova version does not support computes older than Yoga but the minimum compute service level in your cell is 57 and the oldest supported service level is 61.#033[00m Dec 15 04:32:29 localhost nova_compute[230642]: 2025-12-15 09:32:29.960 230646 DEBUG oslo_concurrency.lockutils [None req-49080c14-bd80-4b1c-bc6f-a59411562e0b - - - - - -] Acquiring lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m Dec 15 04:32:29 localhost nova_compute[230642]: 2025-12-15 09:32:29.960 230646 DEBUG oslo_concurrency.lockutils [None req-49080c14-bd80-4b1c-bc6f-a59411562e0b - - - - - -] Acquired lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m Dec 15 04:32:29 localhost nova_compute[230642]: 2025-12-15 09:32:29.961 230646 DEBUG oslo_concurrency.lockutils [None req-49080c14-bd80-4b1c-bc6f-a59411562e0b - - - - - -] Releasing lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m Dec 15 04:32:29 localhost nova_compute[230642]: 2025-12-15 09:32:29.961 230646 DEBUG oslo_service.service [None req-49080c14-bd80-4b1c-bc6f-a59411562e0b - - - - - -] Full set of CONF: _wait_for_exit_or_signal /usr/lib/python3.9/site-packages/oslo_service/service.py:362#033[00m Dec 15 04:32:29 localhost nova_compute[230642]: 2025-12-15 09:32:29.961 230646 DEBUG oslo_service.service [None req-49080c14-bd80-4b1c-bc6f-a59411562e0b - - - - - -] ******************************************************************************** log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2589#033[00m Dec 15 04:32:29 localhost nova_compute[230642]: 2025-12-15 09:32:29.961 230646 DEBUG oslo_service.service [None req-49080c14-bd80-4b1c-bc6f-a59411562e0b - - - - - -] Configuration options gathered from: log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2590#033[00m Dec 15 04:32:29 localhost nova_compute[230642]: 2025-12-15 09:32:29.962 230646 DEBUG oslo_service.service [None req-49080c14-bd80-4b1c-bc6f-a59411562e0b - - - - - -] command line args: [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2591#033[00m Dec 15 04:32:29 localhost nova_compute[230642]: 2025-12-15 09:32:29.962 230646 DEBUG oslo_service.service [None req-49080c14-bd80-4b1c-bc6f-a59411562e0b - - - - - -] config files: ['/etc/nova/nova.conf', '/etc/nova/nova-compute.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2592#033[00m Dec 15 04:32:29 localhost nova_compute[230642]: 2025-12-15 09:32:29.962 230646 DEBUG oslo_service.service [None req-49080c14-bd80-4b1c-bc6f-a59411562e0b - - - - - -] ================================================================================ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2594#033[00m Dec 15 04:32:29 localhost nova_compute[230642]: 2025-12-15 09:32:29.962 230646 DEBUG oslo_service.service [None req-49080c14-bd80-4b1c-bc6f-a59411562e0b - - - - - -] allow_resize_to_same_host = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 15 04:32:29 localhost nova_compute[230642]: 2025-12-15 09:32:29.962 230646 DEBUG oslo_service.service [None req-49080c14-bd80-4b1c-bc6f-a59411562e0b - - - - - -] arq_binding_timeout = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 15 04:32:29 localhost nova_compute[230642]: 2025-12-15 09:32:29.962 230646 DEBUG oslo_service.service [None req-49080c14-bd80-4b1c-bc6f-a59411562e0b - - - - - -] backdoor_port = None 
log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 15 04:32:29 localhost nova_compute[230642]: 2025-12-15 09:32:29.963 230646 DEBUG oslo_service.service [None req-49080c14-bd80-4b1c-bc6f-a59411562e0b - - - - - -] backdoor_socket = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 15 04:32:29 localhost nova_compute[230642]: 2025-12-15 09:32:29.963 230646 DEBUG oslo_service.service [None req-49080c14-bd80-4b1c-bc6f-a59411562e0b - - - - - -] block_device_allocate_retries = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 15 04:32:29 localhost nova_compute[230642]: 2025-12-15 09:32:29.963 230646 DEBUG oslo_service.service [None req-49080c14-bd80-4b1c-bc6f-a59411562e0b - - - - - -] block_device_allocate_retries_interval = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 15 04:32:29 localhost nova_compute[230642]: 2025-12-15 09:32:29.963 230646 DEBUG oslo_service.service [None req-49080c14-bd80-4b1c-bc6f-a59411562e0b - - - - - -] cert = self.pem log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 15 04:32:29 localhost nova_compute[230642]: 2025-12-15 09:32:29.963 230646 DEBUG oslo_service.service [None req-49080c14-bd80-4b1c-bc6f-a59411562e0b - - - - - -] compute_driver = libvirt.LibvirtDriver log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 15 04:32:29 localhost nova_compute[230642]: 2025-12-15 09:32:29.964 230646 DEBUG oslo_service.service [None req-49080c14-bd80-4b1c-bc6f-a59411562e0b - - - - - -] compute_monitors = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 15 04:32:29 localhost nova_compute[230642]: 2025-12-15 09:32:29.964 230646 DEBUG oslo_service.service [None req-49080c14-bd80-4b1c-bc6f-a59411562e0b - - - - - -] config_dir = ['/etc/nova/nova.conf.d'] log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 15 04:32:29 localhost nova_compute[230642]: 2025-12-15 09:32:29.964 230646 DEBUG oslo_service.service [None req-49080c14-bd80-4b1c-bc6f-a59411562e0b - - - - - -] config_drive_format = iso9660 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 15 04:32:29 localhost nova_compute[230642]: 2025-12-15 09:32:29.964 230646 DEBUG oslo_service.service [None req-49080c14-bd80-4b1c-bc6f-a59411562e0b - - - - - -] config_file = ['/etc/nova/nova.conf', '/etc/nova/nova-compute.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 15 04:32:29 localhost nova_compute[230642]: 2025-12-15 09:32:29.964 230646 DEBUG oslo_service.service [None req-49080c14-bd80-4b1c-bc6f-a59411562e0b - - - - - -] config_source = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 15 04:32:29 localhost nova_compute[230642]: 2025-12-15 09:32:29.965 230646 DEBUG oslo_service.service [None req-49080c14-bd80-4b1c-bc6f-a59411562e0b - - - - - -] console_host = np0005559462.localdomain log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 15 04:32:29 localhost nova_compute[230642]: 2025-12-15 09:32:29.965 230646 DEBUG oslo_service.service [None req-49080c14-bd80-4b1c-bc6f-a59411562e0b - - - - - -] control_exchange = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 15 04:32:29 localhost nova_compute[230642]: 2025-12-15 09:32:29.965 230646 DEBUG oslo_service.service [None req-49080c14-bd80-4b1c-bc6f-a59411562e0b - - - - - -] cpu_allocation_ratio = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 15 04:32:29 localhost nova_compute[230642]: 2025-12-15 09:32:29.965 230646 DEBUG oslo_service.service [None req-49080c14-bd80-4b1c-bc6f-a59411562e0b - - - - - -] daemon = False log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 15 04:32:29 localhost nova_compute[230642]: 2025-12-15 09:32:29.965 230646 DEBUG oslo_service.service [None req-49080c14-bd80-4b1c-bc6f-a59411562e0b - - - - - -] debug = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 15 04:32:29 localhost nova_compute[230642]: 2025-12-15 09:32:29.965 230646 DEBUG oslo_service.service [None req-49080c14-bd80-4b1c-bc6f-a59411562e0b - - - - - -] default_access_ip_network_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 15 04:32:29 localhost nova_compute[230642]: 2025-12-15 09:32:29.966 230646 DEBUG oslo_service.service [None req-49080c14-bd80-4b1c-bc6f-a59411562e0b - - - - - -] default_availability_zone = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 15 04:32:29 localhost nova_compute[230642]: 2025-12-15 09:32:29.966 230646 DEBUG oslo_service.service [None req-49080c14-bd80-4b1c-bc6f-a59411562e0b - - - - - -] default_ephemeral_format = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 15 04:32:29 localhost nova_compute[230642]: 2025-12-15 09:32:29.966 230646 DEBUG oslo_service.service [None req-49080c14-bd80-4b1c-bc6f-a59411562e0b - - - - - -] default_log_levels = ['amqp=WARN', 'amqplib=WARN', 'boto=WARN', 'qpid=WARN', 'sqlalchemy=WARN', 'suds=INFO', 'oslo.messaging=INFO', 'oslo_messaging=INFO', 'iso8601=WARN', 'requests.packages.urllib3.connectionpool=WARN', 'urllib3.connectionpool=WARN', 'websocket=WARN', 'requests.packages.urllib3.util.retry=WARN', 'urllib3.util.retry=WARN', 'keystonemiddleware=WARN', 'routes.middleware=WARN', 'stevedore=WARN', 'taskflow=WARN', 'keystoneauth=WARN', 'oslo.cache=INFO', 'oslo_policy=INFO', 'dogpile.core.dogpile=INFO', 'glanceclient=WARN', 'oslo.privsep.daemon=INFO'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 15 04:32:29 
localhost nova_compute[230642]: 2025-12-15 09:32:29.966 230646 DEBUG oslo_service.service [None req-49080c14-bd80-4b1c-bc6f-a59411562e0b - - - - - -] default_schedule_zone = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 15 04:32:29 localhost nova_compute[230642]: 2025-12-15 09:32:29.966 230646 DEBUG oslo_service.service [None req-49080c14-bd80-4b1c-bc6f-a59411562e0b - - - - - -] disk_allocation_ratio = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 15 04:32:29 localhost nova_compute[230642]: 2025-12-15 09:32:29.967 230646 DEBUG oslo_service.service [None req-49080c14-bd80-4b1c-bc6f-a59411562e0b - - - - - -] enable_new_services = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 15 04:32:29 localhost nova_compute[230642]: 2025-12-15 09:32:29.967 230646 DEBUG oslo_service.service [None req-49080c14-bd80-4b1c-bc6f-a59411562e0b - - - - - -] enabled_apis = ['osapi_compute', 'metadata'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 15 04:32:29 localhost nova_compute[230642]: 2025-12-15 09:32:29.967 230646 DEBUG oslo_service.service [None req-49080c14-bd80-4b1c-bc6f-a59411562e0b - - - - - -] enabled_ssl_apis = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 15 04:32:29 localhost nova_compute[230642]: 2025-12-15 09:32:29.967 230646 DEBUG oslo_service.service [None req-49080c14-bd80-4b1c-bc6f-a59411562e0b - - - - - -] flat_injected = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 15 04:32:29 localhost nova_compute[230642]: 2025-12-15 09:32:29.967 230646 DEBUG oslo_service.service [None req-49080c14-bd80-4b1c-bc6f-a59411562e0b - - - - - -] force_config_drive = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 15 04:32:29 localhost nova_compute[230642]: 2025-12-15 09:32:29.968 230646 DEBUG 
oslo_service.service [None req-49080c14-bd80-4b1c-bc6f-a59411562e0b - - - - - -] force_raw_images = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 15 04:32:29 localhost nova_compute[230642]: 2025-12-15 09:32:29.968 230646 DEBUG oslo_service.service [None req-49080c14-bd80-4b1c-bc6f-a59411562e0b - - - - - -] graceful_shutdown_timeout = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 15 04:32:29 localhost nova_compute[230642]: 2025-12-15 09:32:29.968 230646 DEBUG oslo_service.service [None req-49080c14-bd80-4b1c-bc6f-a59411562e0b - - - - - -] heal_instance_info_cache_interval = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 15 04:32:29 localhost nova_compute[230642]: 2025-12-15 09:32:29.968 230646 DEBUG oslo_service.service [None req-49080c14-bd80-4b1c-bc6f-a59411562e0b - - - - - -] host = np0005559462.localdomain log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 15 04:32:29 localhost nova_compute[230642]: 2025-12-15 09:32:29.969 230646 DEBUG oslo_service.service [None req-49080c14-bd80-4b1c-bc6f-a59411562e0b - - - - - -] initial_cpu_allocation_ratio = 4.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 15 04:32:29 localhost nova_compute[230642]: 2025-12-15 09:32:29.969 230646 DEBUG oslo_service.service [None req-49080c14-bd80-4b1c-bc6f-a59411562e0b - - - - - -] initial_disk_allocation_ratio = 0.9 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 15 04:32:29 localhost nova_compute[230642]: 2025-12-15 09:32:29.969 230646 DEBUG oslo_service.service [None req-49080c14-bd80-4b1c-bc6f-a59411562e0b - - - - - -] initial_ram_allocation_ratio = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 15 04:32:29 localhost nova_compute[230642]: 2025-12-15 09:32:29.969 230646 DEBUG oslo_service.service [None 
req-49080c14-bd80-4b1c-bc6f-a59411562e0b - - - - - -] injected_network_template = /usr/lib/python3.9/site-packages/nova/virt/interfaces.template log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 15 04:32:29 localhost nova_compute[230642]: 2025-12-15 09:32:29.969 230646 DEBUG oslo_service.service [None req-49080c14-bd80-4b1c-bc6f-a59411562e0b - - - - - -] instance_build_timeout = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 15 04:32:29 localhost nova_compute[230642]: 2025-12-15 09:32:29.970 230646 DEBUG oslo_service.service [None req-49080c14-bd80-4b1c-bc6f-a59411562e0b - - - - - -] instance_delete_interval = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 15 04:32:29 localhost nova_compute[230642]: 2025-12-15 09:32:29.970 230646 DEBUG oslo_service.service [None req-49080c14-bd80-4b1c-bc6f-a59411562e0b - - - - - -] instance_format = [instance: %(uuid)s] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 15 04:32:29 localhost nova_compute[230642]: 2025-12-15 09:32:29.970 230646 DEBUG oslo_service.service [None req-49080c14-bd80-4b1c-bc6f-a59411562e0b - - - - - -] instance_name_template = instance-%08x log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 15 04:32:29 localhost nova_compute[230642]: 2025-12-15 09:32:29.970 230646 DEBUG oslo_service.service [None req-49080c14-bd80-4b1c-bc6f-a59411562e0b - - - - - -] instance_usage_audit = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 15 04:32:29 localhost nova_compute[230642]: 2025-12-15 09:32:29.970 230646 DEBUG oslo_service.service [None req-49080c14-bd80-4b1c-bc6f-a59411562e0b - - - - - -] instance_usage_audit_period = month log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 15 04:32:29 localhost nova_compute[230642]: 2025-12-15 09:32:29.970 230646 DEBUG 
oslo_service.service [None req-49080c14-bd80-4b1c-bc6f-a59411562e0b - - - - - -] instance_uuid_format = [instance: %(uuid)s] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 15 04:32:29 localhost nova_compute[230642]: 2025-12-15 09:32:29.971 230646 DEBUG oslo_service.service [None req-49080c14-bd80-4b1c-bc6f-a59411562e0b - - - - - -] instances_path = /var/lib/nova/instances log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 15 04:32:29 localhost nova_compute[230642]: 2025-12-15 09:32:29.971 230646 DEBUG oslo_service.service [None req-49080c14-bd80-4b1c-bc6f-a59411562e0b - - - - - -] internal_service_availability_zone = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 15 04:32:29 localhost nova_compute[230642]: 2025-12-15 09:32:29.971 230646 DEBUG oslo_service.service [None req-49080c14-bd80-4b1c-bc6f-a59411562e0b - - - - - -] key = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 15 04:32:29 localhost nova_compute[230642]: 2025-12-15 09:32:29.971 230646 DEBUG oslo_service.service [None req-49080c14-bd80-4b1c-bc6f-a59411562e0b - - - - - -] live_migration_retry_count = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 15 04:32:29 localhost nova_compute[230642]: 2025-12-15 09:32:29.971 230646 DEBUG oslo_service.service [None req-49080c14-bd80-4b1c-bc6f-a59411562e0b - - - - - -] log_config_append = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 15 04:32:29 localhost nova_compute[230642]: 2025-12-15 09:32:29.972 230646 DEBUG oslo_service.service [None req-49080c14-bd80-4b1c-bc6f-a59411562e0b - - - - - -] log_date_format = %Y-%m-%d %H:%M:%S log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 15 04:32:29 localhost nova_compute[230642]: 2025-12-15 09:32:29.972 230646 DEBUG oslo_service.service [None 
req-49080c14-bd80-4b1c-bc6f-a59411562e0b - - - - - -] log_dir = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 15 04:32:29 localhost nova_compute[230642]: 2025-12-15 09:32:29.972 230646 DEBUG oslo_service.service [None req-49080c14-bd80-4b1c-bc6f-a59411562e0b - - - - - -] log_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 15 04:32:29 localhost nova_compute[230642]: 2025-12-15 09:32:29.972 230646 DEBUG oslo_service.service [None req-49080c14-bd80-4b1c-bc6f-a59411562e0b - - - - - -] log_options = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 15 04:32:29 localhost nova_compute[230642]: 2025-12-15 09:32:29.972 230646 DEBUG oslo_service.service [None req-49080c14-bd80-4b1c-bc6f-a59411562e0b - - - - - -] log_rotate_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 15 04:32:29 localhost nova_compute[230642]: 2025-12-15 09:32:29.973 230646 DEBUG oslo_service.service [None req-49080c14-bd80-4b1c-bc6f-a59411562e0b - - - - - -] log_rotate_interval_type = days log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 15 04:32:29 localhost nova_compute[230642]: 2025-12-15 09:32:29.973 230646 DEBUG oslo_service.service [None req-49080c14-bd80-4b1c-bc6f-a59411562e0b - - - - - -] log_rotation_type = size log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 15 04:32:29 localhost nova_compute[230642]: 2025-12-15 09:32:29.973 230646 DEBUG oslo_service.service [None req-49080c14-bd80-4b1c-bc6f-a59411562e0b - - - - - -] logging_context_format_string = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [%(global_request_id)s %(request_id)s %(user_identity)s] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 15 04:32:29 localhost nova_compute[230642]: 2025-12-15 09:32:29.973 230646 DEBUG 
oslo_service.service [None req-49080c14-bd80-4b1c-bc6f-a59411562e0b - - - - - -] logging_debug_format_suffix = %(funcName)s %(pathname)s:%(lineno)d log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 15 04:32:29 localhost nova_compute[230642]: 2025-12-15 09:32:29.973 230646 DEBUG oslo_service.service [None req-49080c14-bd80-4b1c-bc6f-a59411562e0b - - - - - -] logging_default_format_string = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [-] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 15 04:32:29 localhost nova_compute[230642]: 2025-12-15 09:32:29.974 230646 DEBUG oslo_service.service [None req-49080c14-bd80-4b1c-bc6f-a59411562e0b - - - - - -] logging_exception_prefix = %(asctime)s.%(msecs)03d %(process)d ERROR %(name)s %(instance)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 15 04:32:29 localhost nova_compute[230642]: 2025-12-15 09:32:29.974 230646 DEBUG oslo_service.service [None req-49080c14-bd80-4b1c-bc6f-a59411562e0b - - - - - -] logging_user_identity_format = %(user)s %(project)s %(domain)s %(system_scope)s %(user_domain)s %(project_domain)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 15 04:32:29 localhost nova_compute[230642]: 2025-12-15 09:32:29.974 230646 DEBUG oslo_service.service [None req-49080c14-bd80-4b1c-bc6f-a59411562e0b - - - - - -] long_rpc_timeout = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 15 04:32:29 localhost nova_compute[230642]: 2025-12-15 09:32:29.974 230646 DEBUG oslo_service.service [None req-49080c14-bd80-4b1c-bc6f-a59411562e0b - - - - - -] max_concurrent_builds = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 15 04:32:29 localhost nova_compute[230642]: 2025-12-15 09:32:29.974 230646 DEBUG oslo_service.service [None req-49080c14-bd80-4b1c-bc6f-a59411562e0b - - - - - -] 
max_concurrent_live_migrations = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 15 04:32:29 localhost nova_compute[230642]: 2025-12-15 09:32:29.974 230646 DEBUG oslo_service.service [None req-49080c14-bd80-4b1c-bc6f-a59411562e0b - - - - - -] max_concurrent_snapshots = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 15 04:32:29 localhost nova_compute[230642]: 2025-12-15 09:32:29.975 230646 DEBUG oslo_service.service [None req-49080c14-bd80-4b1c-bc6f-a59411562e0b - - - - - -] max_local_block_devices = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 15 04:32:29 localhost nova_compute[230642]: 2025-12-15 09:32:29.975 230646 DEBUG oslo_service.service [None req-49080c14-bd80-4b1c-bc6f-a59411562e0b - - - - - -] max_logfile_count = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 15 04:32:29 localhost nova_compute[230642]: 2025-12-15 09:32:29.975 230646 DEBUG oslo_service.service [None req-49080c14-bd80-4b1c-bc6f-a59411562e0b - - - - - -] max_logfile_size_mb = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 15 04:32:29 localhost nova_compute[230642]: 2025-12-15 09:32:29.975 230646 DEBUG oslo_service.service [None req-49080c14-bd80-4b1c-bc6f-a59411562e0b - - - - - -] maximum_instance_delete_attempts = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 15 04:32:29 localhost nova_compute[230642]: 2025-12-15 09:32:29.975 230646 DEBUG oslo_service.service [None req-49080c14-bd80-4b1c-bc6f-a59411562e0b - - - - - -] metadata_listen = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 15 04:32:29 localhost nova_compute[230642]: 2025-12-15 09:32:29.976 230646 DEBUG oslo_service.service [None req-49080c14-bd80-4b1c-bc6f-a59411562e0b - - - - - -] metadata_listen_port = 8775 log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 15 04:32:29 localhost nova_compute[230642]: 2025-12-15 09:32:29.976 230646 DEBUG oslo_service.service [None req-49080c14-bd80-4b1c-bc6f-a59411562e0b - - - - - -] metadata_workers = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 15 04:32:29 localhost nova_compute[230642]: 2025-12-15 09:32:29.976 230646 DEBUG oslo_service.service [None req-49080c14-bd80-4b1c-bc6f-a59411562e0b - - - - - -] migrate_max_retries = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 15 04:32:29 localhost nova_compute[230642]: 2025-12-15 09:32:29.976 230646 DEBUG oslo_service.service [None req-49080c14-bd80-4b1c-bc6f-a59411562e0b - - - - - -] mkisofs_cmd = /usr/bin/mkisofs log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 15 04:32:29 localhost nova_compute[230642]: 2025-12-15 09:32:29.976 230646 DEBUG oslo_service.service [None req-49080c14-bd80-4b1c-bc6f-a59411562e0b - - - - - -] my_block_storage_ip = 192.168.122.106 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 15 04:32:29 localhost nova_compute[230642]: 2025-12-15 09:32:29.977 230646 DEBUG oslo_service.service [None req-49080c14-bd80-4b1c-bc6f-a59411562e0b - - - - - -] my_ip = 192.168.122.106 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 15 04:32:29 localhost nova_compute[230642]: 2025-12-15 09:32:29.977 230646 DEBUG oslo_service.service [None req-49080c14-bd80-4b1c-bc6f-a59411562e0b - - - - - -] network_allocate_retries = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 15 04:32:29 localhost nova_compute[230642]: 2025-12-15 09:32:29.977 230646 DEBUG oslo_service.service [None req-49080c14-bd80-4b1c-bc6f-a59411562e0b - - - - - -] non_inheritable_image_properties = ['cache_in_nova', 'bittorrent'] log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 15 04:32:29 localhost nova_compute[230642]: 2025-12-15 09:32:29.977 230646 DEBUG oslo_service.service [None req-49080c14-bd80-4b1c-bc6f-a59411562e0b - - - - - -] osapi_compute_listen = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 15 04:32:29 localhost nova_compute[230642]: 2025-12-15 09:32:29.977 230646 DEBUG oslo_service.service [None req-49080c14-bd80-4b1c-bc6f-a59411562e0b - - - - - -] osapi_compute_listen_port = 8774 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 15 04:32:29 localhost nova_compute[230642]: 2025-12-15 09:32:29.977 230646 DEBUG oslo_service.service [None req-49080c14-bd80-4b1c-bc6f-a59411562e0b - - - - - -] osapi_compute_unique_server_name_scope = log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 15 04:32:29 localhost nova_compute[230642]: 2025-12-15 09:32:29.978 230646 DEBUG oslo_service.service [None req-49080c14-bd80-4b1c-bc6f-a59411562e0b - - - - - -] osapi_compute_workers = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 15 04:32:29 localhost nova_compute[230642]: 2025-12-15 09:32:29.978 230646 DEBUG oslo_service.service [None req-49080c14-bd80-4b1c-bc6f-a59411562e0b - - - - - -] password_length = 12 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 15 04:32:29 localhost nova_compute[230642]: 2025-12-15 09:32:29.978 230646 DEBUG oslo_service.service [None req-49080c14-bd80-4b1c-bc6f-a59411562e0b - - - - - -] periodic_enable = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 15 04:32:29 localhost nova_compute[230642]: 2025-12-15 09:32:29.978 230646 DEBUG oslo_service.service [None req-49080c14-bd80-4b1c-bc6f-a59411562e0b - - - - - -] periodic_fuzzy_delay = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 15 
04:32:29 localhost nova_compute[230642]: 2025-12-15 09:32:29.978 230646 DEBUG oslo_service.service [None req-49080c14-bd80-4b1c-bc6f-a59411562e0b - - - - - -] pointer_model = usbtablet log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 15 04:32:29 localhost nova_compute[230642]: 2025-12-15 09:32:29.979 230646 DEBUG oslo_service.service [None req-49080c14-bd80-4b1c-bc6f-a59411562e0b - - - - - -] preallocate_images = none log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 15 04:32:29 localhost nova_compute[230642]: 2025-12-15 09:32:29.979 230646 DEBUG oslo_service.service [None req-49080c14-bd80-4b1c-bc6f-a59411562e0b - - - - - -] publish_errors = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 15 04:32:29 localhost nova_compute[230642]: 2025-12-15 09:32:29.979 230646 DEBUG oslo_service.service [None req-49080c14-bd80-4b1c-bc6f-a59411562e0b - - - - - -] pybasedir = /usr/lib/python3.9/site-packages log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 15 04:32:29 localhost nova_compute[230642]: 2025-12-15 09:32:29.979 230646 DEBUG oslo_service.service [None req-49080c14-bd80-4b1c-bc6f-a59411562e0b - - - - - -] ram_allocation_ratio = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 15 04:32:29 localhost nova_compute[230642]: 2025-12-15 09:32:29.979 230646 DEBUG oslo_service.service [None req-49080c14-bd80-4b1c-bc6f-a59411562e0b - - - - - -] rate_limit_burst = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 15 04:32:29 localhost nova_compute[230642]: 2025-12-15 09:32:29.979 230646 DEBUG oslo_service.service [None req-49080c14-bd80-4b1c-bc6f-a59411562e0b - - - - - -] rate_limit_except_level = CRITICAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 15 04:32:29 localhost nova_compute[230642]: 2025-12-15 09:32:29.980 230646 
DEBUG oslo_service.service [None req-49080c14-bd80-4b1c-bc6f-a59411562e0b - - - - - -] rate_limit_interval = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 15 04:32:29 localhost nova_compute[230642]: 2025-12-15 09:32:29.980 230646 DEBUG oslo_service.service [None req-49080c14-bd80-4b1c-bc6f-a59411562e0b - - - - - -] reboot_timeout = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 15 04:32:29 localhost nova_compute[230642]: 2025-12-15 09:32:29.980 230646 DEBUG oslo_service.service [None req-49080c14-bd80-4b1c-bc6f-a59411562e0b - - - - - -] reclaim_instance_interval = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 15 04:32:29 localhost nova_compute[230642]: 2025-12-15 09:32:29.980 230646 DEBUG oslo_service.service [None req-49080c14-bd80-4b1c-bc6f-a59411562e0b - - - - - -] record = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 15 04:32:29 localhost nova_compute[230642]: 2025-12-15 09:32:29.980 230646 DEBUG oslo_service.service [None req-49080c14-bd80-4b1c-bc6f-a59411562e0b - - - - - -] reimage_timeout_per_gb = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 15 04:32:29 localhost nova_compute[230642]: 2025-12-15 09:32:29.981 230646 DEBUG oslo_service.service [None req-49080c14-bd80-4b1c-bc6f-a59411562e0b - - - - - -] report_interval = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 15 04:32:29 localhost nova_compute[230642]: 2025-12-15 09:32:29.981 230646 DEBUG oslo_service.service [None req-49080c14-bd80-4b1c-bc6f-a59411562e0b - - - - - -] rescue_timeout = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 15 04:32:29 localhost nova_compute[230642]: 2025-12-15 09:32:29.981 230646 DEBUG oslo_service.service [None req-49080c14-bd80-4b1c-bc6f-a59411562e0b - - - - - -] reserved_host_cpus = 0 
log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 15 04:32:29 localhost nova_compute[230642]: 2025-12-15 09:32:29.981 230646 DEBUG oslo_service.service [None req-49080c14-bd80-4b1c-bc6f-a59411562e0b - - - - - -] reserved_host_disk_mb = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 15 04:32:29 localhost nova_compute[230642]: 2025-12-15 09:32:29.981 230646 DEBUG oslo_service.service [None req-49080c14-bd80-4b1c-bc6f-a59411562e0b - - - - - -] reserved_host_memory_mb = 512 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 15 04:32:29 localhost nova_compute[230642]: 2025-12-15 09:32:29.981 230646 DEBUG oslo_service.service [None req-49080c14-bd80-4b1c-bc6f-a59411562e0b - - - - - -] reserved_huge_pages = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 15 04:32:29 localhost nova_compute[230642]: 2025-12-15 09:32:29.982 230646 DEBUG oslo_service.service [None req-49080c14-bd80-4b1c-bc6f-a59411562e0b - - - - - -] resize_confirm_window = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 15 04:32:29 localhost nova_compute[230642]: 2025-12-15 09:32:29.982 230646 DEBUG oslo_service.service [None req-49080c14-bd80-4b1c-bc6f-a59411562e0b - - - - - -] resize_fs_using_block_device = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 15 04:32:29 localhost nova_compute[230642]: 2025-12-15 09:32:29.982 230646 DEBUG oslo_service.service [None req-49080c14-bd80-4b1c-bc6f-a59411562e0b - - - - - -] resume_guests_state_on_host_boot = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 15 04:32:29 localhost nova_compute[230642]: 2025-12-15 09:32:29.982 230646 DEBUG oslo_service.service [None req-49080c14-bd80-4b1c-bc6f-a59411562e0b - - - - - -] rootwrap_config = /etc/nova/rootwrap.conf log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 15 04:32:29 localhost nova_compute[230642]: 2025-12-15 09:32:29.982 230646 DEBUG oslo_service.service [None req-49080c14-bd80-4b1c-bc6f-a59411562e0b - - - - - -] rpc_response_timeout = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 15 04:32:29 localhost nova_compute[230642]: 2025-12-15 09:32:29.982 230646 DEBUG oslo_service.service [None req-49080c14-bd80-4b1c-bc6f-a59411562e0b - - - - - -] run_external_periodic_tasks = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 15 04:32:29 localhost nova_compute[230642]: 2025-12-15 09:32:29.983 230646 DEBUG oslo_service.service [None req-49080c14-bd80-4b1c-bc6f-a59411562e0b - - - - - -] running_deleted_instance_action = reap log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 15 04:32:29 localhost nova_compute[230642]: 2025-12-15 09:32:29.983 230646 DEBUG oslo_service.service [None req-49080c14-bd80-4b1c-bc6f-a59411562e0b - - - - - -] running_deleted_instance_poll_interval = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 15 04:32:29 localhost nova_compute[230642]: 2025-12-15 09:32:29.983 230646 DEBUG oslo_service.service [None req-49080c14-bd80-4b1c-bc6f-a59411562e0b - - - - - -] running_deleted_instance_timeout = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 15 04:32:29 localhost nova_compute[230642]: 2025-12-15 09:32:29.983 230646 DEBUG oslo_service.service [None req-49080c14-bd80-4b1c-bc6f-a59411562e0b - - - - - -] scheduler_instance_sync_interval = 120 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 15 04:32:29 localhost nova_compute[230642]: 2025-12-15 09:32:29.983 230646 DEBUG oslo_service.service [None req-49080c14-bd80-4b1c-bc6f-a59411562e0b - - - - - -] service_down_time = 60 log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 15 04:32:29 localhost nova_compute[230642]: 2025-12-15 09:32:29.984 230646 DEBUG oslo_service.service [None req-49080c14-bd80-4b1c-bc6f-a59411562e0b - - - - - -] servicegroup_driver = db log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 15 04:32:29 localhost nova_compute[230642]: 2025-12-15 09:32:29.984 230646 DEBUG oslo_service.service [None req-49080c14-bd80-4b1c-bc6f-a59411562e0b - - - - - -] shelved_offload_time = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 15 04:32:29 localhost nova_compute[230642]: 2025-12-15 09:32:29.984 230646 DEBUG oslo_service.service [None req-49080c14-bd80-4b1c-bc6f-a59411562e0b - - - - - -] shelved_poll_interval = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 15 04:32:29 localhost nova_compute[230642]: 2025-12-15 09:32:29.984 230646 DEBUG oslo_service.service [None req-49080c14-bd80-4b1c-bc6f-a59411562e0b - - - - - -] shutdown_timeout = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 15 04:32:29 localhost nova_compute[230642]: 2025-12-15 09:32:29.984 230646 DEBUG oslo_service.service [None req-49080c14-bd80-4b1c-bc6f-a59411562e0b - - - - - -] source_is_ipv6 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 15 04:32:29 localhost nova_compute[230642]: 2025-12-15 09:32:29.984 230646 DEBUG oslo_service.service [None req-49080c14-bd80-4b1c-bc6f-a59411562e0b - - - - - -] ssl_only = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 15 04:32:29 localhost nova_compute[230642]: 2025-12-15 09:32:29.985 230646 DEBUG oslo_service.service [None req-49080c14-bd80-4b1c-bc6f-a59411562e0b - - - - - -] state_path = /var/lib/nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 15 04:32:29 localhost 
nova_compute[230642]: 2025-12-15 09:32:29.985 230646 DEBUG oslo_service.service [None req-49080c14-bd80-4b1c-bc6f-a59411562e0b - - - - - -] sync_power_state_interval = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 15 04:32:29 localhost nova_compute[230642]: 2025-12-15 09:32:29.985 230646 DEBUG oslo_service.service [None req-49080c14-bd80-4b1c-bc6f-a59411562e0b - - - - - -] sync_power_state_pool_size = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 15 04:32:29 localhost nova_compute[230642]: 2025-12-15 09:32:29.985 230646 DEBUG oslo_service.service [None req-49080c14-bd80-4b1c-bc6f-a59411562e0b - - - - - -] syslog_log_facility = LOG_USER log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 15 04:32:29 localhost nova_compute[230642]: 2025-12-15 09:32:29.985 230646 DEBUG oslo_service.service [None req-49080c14-bd80-4b1c-bc6f-a59411562e0b - - - - - -] tempdir = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 15 04:32:29 localhost nova_compute[230642]: 2025-12-15 09:32:29.986 230646 DEBUG oslo_service.service [None req-49080c14-bd80-4b1c-bc6f-a59411562e0b - - - - - -] timeout_nbd = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 15 04:32:29 localhost nova_compute[230642]: 2025-12-15 09:32:29.986 230646 DEBUG oslo_service.service [None req-49080c14-bd80-4b1c-bc6f-a59411562e0b - - - - - -] transport_url = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 15 04:32:29 localhost nova_compute[230642]: 2025-12-15 09:32:29.986 230646 DEBUG oslo_service.service [None req-49080c14-bd80-4b1c-bc6f-a59411562e0b - - - - - -] update_resources_interval = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 15 04:32:29 localhost nova_compute[230642]: 2025-12-15 09:32:29.986 230646 DEBUG oslo_service.service [None 
req-49080c14-bd80-4b1c-bc6f-a59411562e0b - - - - - -] use_cow_images = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 15 04:32:29 localhost nova_compute[230642]: 2025-12-15 09:32:29.986 230646 DEBUG oslo_service.service [None req-49080c14-bd80-4b1c-bc6f-a59411562e0b - - - - - -] use_eventlog = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 15 04:32:29 localhost nova_compute[230642]: 2025-12-15 09:32:29.986 230646 DEBUG oslo_service.service [None req-49080c14-bd80-4b1c-bc6f-a59411562e0b - - - - - -] use_journal = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 15 04:32:29 localhost nova_compute[230642]: 2025-12-15 09:32:29.987 230646 DEBUG oslo_service.service [None req-49080c14-bd80-4b1c-bc6f-a59411562e0b - - - - - -] use_json = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 15 04:32:29 localhost nova_compute[230642]: 2025-12-15 09:32:29.987 230646 DEBUG oslo_service.service [None req-49080c14-bd80-4b1c-bc6f-a59411562e0b - - - - - -] use_rootwrap_daemon = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 15 04:32:29 localhost nova_compute[230642]: 2025-12-15 09:32:29.987 230646 DEBUG oslo_service.service [None req-49080c14-bd80-4b1c-bc6f-a59411562e0b - - - - - -] use_stderr = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 15 04:32:29 localhost nova_compute[230642]: 2025-12-15 09:32:29.987 230646 DEBUG oslo_service.service [None req-49080c14-bd80-4b1c-bc6f-a59411562e0b - - - - - -] use_syslog = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 15 04:32:29 localhost nova_compute[230642]: 2025-12-15 09:32:29.987 230646 DEBUG oslo_service.service [None req-49080c14-bd80-4b1c-bc6f-a59411562e0b - - - - - -] vcpu_pin_set = None log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 15 04:32:29 localhost nova_compute[230642]: 2025-12-15 09:32:29.988 230646 DEBUG oslo_service.service [None req-49080c14-bd80-4b1c-bc6f-a59411562e0b - - - - - -] vif_plugging_is_fatal = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 15 04:32:29 localhost nova_compute[230642]: 2025-12-15 09:32:29.988 230646 DEBUG oslo_service.service [None req-49080c14-bd80-4b1c-bc6f-a59411562e0b - - - - - -] vif_plugging_timeout = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 15 04:32:29 localhost nova_compute[230642]: 2025-12-15 09:32:29.988 230646 DEBUG oslo_service.service [None req-49080c14-bd80-4b1c-bc6f-a59411562e0b - - - - - -] virt_mkfs = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 15 04:32:29 localhost nova_compute[230642]: 2025-12-15 09:32:29.988 230646 DEBUG oslo_service.service [None req-49080c14-bd80-4b1c-bc6f-a59411562e0b - - - - - -] volume_usage_poll_interval = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 15 04:32:29 localhost nova_compute[230642]: 2025-12-15 09:32:29.988 230646 DEBUG oslo_service.service [None req-49080c14-bd80-4b1c-bc6f-a59411562e0b - - - - - -] watch_log_file = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 15 04:32:29 localhost nova_compute[230642]: 2025-12-15 09:32:29.988 230646 DEBUG oslo_service.service [None req-49080c14-bd80-4b1c-bc6f-a59411562e0b - - - - - -] web = /usr/share/spice-html5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 15 04:32:29 localhost nova_compute[230642]: 2025-12-15 09:32:29.989 230646 DEBUG oslo_service.service [None req-49080c14-bd80-4b1c-bc6f-a59411562e0b - - - - - -] oslo_concurrency.disable_process_locking = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 
04:32:29 localhost nova_compute[230642]: 2025-12-15 09:32:29.989 230646 DEBUG oslo_service.service [None req-49080c14-bd80-4b1c-bc6f-a59411562e0b - - - - - -] oslo_concurrency.lock_path = /var/lib/nova/tmp log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:29 localhost nova_compute[230642]: 2025-12-15 09:32:29.989 230646 DEBUG oslo_service.service [None req-49080c14-bd80-4b1c-bc6f-a59411562e0b - - - - - -] oslo_messaging_metrics.metrics_buffer_size = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:29 localhost nova_compute[230642]: 2025-12-15 09:32:29.989 230646 DEBUG oslo_service.service [None req-49080c14-bd80-4b1c-bc6f-a59411562e0b - - - - - -] oslo_messaging_metrics.metrics_enabled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:29 localhost nova_compute[230642]: 2025-12-15 09:32:29.989 230646 DEBUG oslo_service.service [None req-49080c14-bd80-4b1c-bc6f-a59411562e0b - - - - - -] oslo_messaging_metrics.metrics_process_name = log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:29 localhost nova_compute[230642]: 2025-12-15 09:32:29.990 230646 DEBUG oslo_service.service [None req-49080c14-bd80-4b1c-bc6f-a59411562e0b - - - - - -] oslo_messaging_metrics.metrics_socket_file = /var/tmp/metrics_collector.sock log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:29 localhost nova_compute[230642]: 2025-12-15 09:32:29.990 230646 DEBUG oslo_service.service [None req-49080c14-bd80-4b1c-bc6f-a59411562e0b - - - - - -] oslo_messaging_metrics.metrics_thread_stop_timeout = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:29 localhost nova_compute[230642]: 2025-12-15 09:32:29.990 230646 DEBUG oslo_service.service [None req-49080c14-bd80-4b1c-bc6f-a59411562e0b - - - - - -] api.auth_strategy = keystone 
log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:29 localhost nova_compute[230642]: 2025-12-15 09:32:29.990 230646 DEBUG oslo_service.service [None req-49080c14-bd80-4b1c-bc6f-a59411562e0b - - - - - -] api.compute_link_prefix = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:29 localhost nova_compute[230642]: 2025-12-15 09:32:29.990 230646 DEBUG oslo_service.service [None req-49080c14-bd80-4b1c-bc6f-a59411562e0b - - - - - -] api.config_drive_skip_versions = 1.0 2007-01-19 2007-03-01 2007-08-29 2007-10-10 2007-12-15 2008-02-01 2008-09-01 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:29 localhost nova_compute[230642]: 2025-12-15 09:32:29.991 230646 DEBUG oslo_service.service [None req-49080c14-bd80-4b1c-bc6f-a59411562e0b - - - - - -] api.dhcp_domain = log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:29 localhost nova_compute[230642]: 2025-12-15 09:32:29.991 230646 DEBUG oslo_service.service [None req-49080c14-bd80-4b1c-bc6f-a59411562e0b - - - - - -] api.enable_instance_password = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:29 localhost nova_compute[230642]: 2025-12-15 09:32:29.991 230646 DEBUG oslo_service.service [None req-49080c14-bd80-4b1c-bc6f-a59411562e0b - - - - - -] api.glance_link_prefix = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:29 localhost nova_compute[230642]: 2025-12-15 09:32:29.991 230646 DEBUG oslo_service.service [None req-49080c14-bd80-4b1c-bc6f-a59411562e0b - - - - - -] api.instance_list_cells_batch_fixed_size = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:29 localhost nova_compute[230642]: 2025-12-15 09:32:29.991 230646 DEBUG oslo_service.service [None req-49080c14-bd80-4b1c-bc6f-a59411562e0b - - - - - -] 
api.instance_list_cells_batch_strategy = distributed log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:29 localhost nova_compute[230642]: 2025-12-15 09:32:29.991 230646 DEBUG oslo_service.service [None req-49080c14-bd80-4b1c-bc6f-a59411562e0b - - - - - -] api.instance_list_per_project_cells = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:29 localhost nova_compute[230642]: 2025-12-15 09:32:29.992 230646 DEBUG oslo_service.service [None req-49080c14-bd80-4b1c-bc6f-a59411562e0b - - - - - -] api.list_records_by_skipping_down_cells = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:29 localhost nova_compute[230642]: 2025-12-15 09:32:29.992 230646 DEBUG oslo_service.service [None req-49080c14-bd80-4b1c-bc6f-a59411562e0b - - - - - -] api.local_metadata_per_cell = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:29 localhost nova_compute[230642]: 2025-12-15 09:32:29.992 230646 DEBUG oslo_service.service [None req-49080c14-bd80-4b1c-bc6f-a59411562e0b - - - - - -] api.max_limit = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:29 localhost nova_compute[230642]: 2025-12-15 09:32:29.992 230646 DEBUG oslo_service.service [None req-49080c14-bd80-4b1c-bc6f-a59411562e0b - - - - - -] api.metadata_cache_expiration = 15 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:29 localhost nova_compute[230642]: 2025-12-15 09:32:29.992 230646 DEBUG oslo_service.service [None req-49080c14-bd80-4b1c-bc6f-a59411562e0b - - - - - -] api.neutron_default_tenant_id = default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:29 localhost nova_compute[230642]: 2025-12-15 09:32:29.993 230646 DEBUG oslo_service.service [None req-49080c14-bd80-4b1c-bc6f-a59411562e0b - - - - - -] 
api.use_forwarded_for = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:29 localhost nova_compute[230642]: 2025-12-15 09:32:29.993 230646 DEBUG oslo_service.service [None req-49080c14-bd80-4b1c-bc6f-a59411562e0b - - - - - -] api.use_neutron_default_nets = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:29 localhost nova_compute[230642]: 2025-12-15 09:32:29.993 230646 DEBUG oslo_service.service [None req-49080c14-bd80-4b1c-bc6f-a59411562e0b - - - - - -] api.vendordata_dynamic_connect_timeout = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:29 localhost nova_compute[230642]: 2025-12-15 09:32:29.993 230646 DEBUG oslo_service.service [None req-49080c14-bd80-4b1c-bc6f-a59411562e0b - - - - - -] api.vendordata_dynamic_failure_fatal = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:29 localhost nova_compute[230642]: 2025-12-15 09:32:29.993 230646 DEBUG oslo_service.service [None req-49080c14-bd80-4b1c-bc6f-a59411562e0b - - - - - -] api.vendordata_dynamic_read_timeout = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:29 localhost nova_compute[230642]: 2025-12-15 09:32:29.994 230646 DEBUG oslo_service.service [None req-49080c14-bd80-4b1c-bc6f-a59411562e0b - - - - - -] api.vendordata_dynamic_ssl_certfile = log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:29 localhost nova_compute[230642]: 2025-12-15 09:32:29.994 230646 DEBUG oslo_service.service [None req-49080c14-bd80-4b1c-bc6f-a59411562e0b - - - - - -] api.vendordata_dynamic_targets = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:29 localhost nova_compute[230642]: 2025-12-15 09:32:29.994 230646 DEBUG oslo_service.service [None req-49080c14-bd80-4b1c-bc6f-a59411562e0b - - - - - -] 
api.vendordata_jsonfile_path = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:29 localhost nova_compute[230642]: 2025-12-15 09:32:29.994 230646 DEBUG oslo_service.service [None req-49080c14-bd80-4b1c-bc6f-a59411562e0b - - - - - -] api.vendordata_providers = ['StaticJSON'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:29 localhost nova_compute[230642]: 2025-12-15 09:32:29.994 230646 DEBUG oslo_service.service [None req-49080c14-bd80-4b1c-bc6f-a59411562e0b - - - - - -] cache.backend = oslo_cache.dict log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:29 localhost nova_compute[230642]: 2025-12-15 09:32:29.994 230646 DEBUG oslo_service.service [None req-49080c14-bd80-4b1c-bc6f-a59411562e0b - - - - - -] cache.backend_argument = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:29 localhost nova_compute[230642]: 2025-12-15 09:32:29.995 230646 DEBUG oslo_service.service [None req-49080c14-bd80-4b1c-bc6f-a59411562e0b - - - - - -] cache.config_prefix = cache.oslo log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:29 localhost nova_compute[230642]: 2025-12-15 09:32:29.995 230646 DEBUG oslo_service.service [None req-49080c14-bd80-4b1c-bc6f-a59411562e0b - - - - - -] cache.dead_timeout = 60.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:29 localhost nova_compute[230642]: 2025-12-15 09:32:29.995 230646 DEBUG oslo_service.service [None req-49080c14-bd80-4b1c-bc6f-a59411562e0b - - - - - -] cache.debug_cache_backend = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:29 localhost nova_compute[230642]: 2025-12-15 09:32:29.995 230646 DEBUG oslo_service.service [None req-49080c14-bd80-4b1c-bc6f-a59411562e0b - - - - - -] cache.enable_retry_client = False 
log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:29 localhost nova_compute[230642]: 2025-12-15 09:32:29.995 230646 DEBUG oslo_service.service [None req-49080c14-bd80-4b1c-bc6f-a59411562e0b - - - - - -] cache.enable_socket_keepalive = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:29 localhost nova_compute[230642]: 2025-12-15 09:32:29.996 230646 DEBUG oslo_service.service [None req-49080c14-bd80-4b1c-bc6f-a59411562e0b - - - - - -] cache.enabled = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:29 localhost nova_compute[230642]: 2025-12-15 09:32:29.996 230646 DEBUG oslo_service.service [None req-49080c14-bd80-4b1c-bc6f-a59411562e0b - - - - - -] cache.expiration_time = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:29 localhost nova_compute[230642]: 2025-12-15 09:32:29.996 230646 DEBUG oslo_service.service [None req-49080c14-bd80-4b1c-bc6f-a59411562e0b - - - - - -] cache.hashclient_retry_attempts = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:29 localhost nova_compute[230642]: 2025-12-15 09:32:29.996 230646 DEBUG oslo_service.service [None req-49080c14-bd80-4b1c-bc6f-a59411562e0b - - - - - -] cache.hashclient_retry_delay = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:29 localhost nova_compute[230642]: 2025-12-15 09:32:29.996 230646 DEBUG oslo_service.service [None req-49080c14-bd80-4b1c-bc6f-a59411562e0b - - - - - -] cache.memcache_dead_retry = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:29 localhost nova_compute[230642]: 2025-12-15 09:32:29.997 230646 DEBUG oslo_service.service [None req-49080c14-bd80-4b1c-bc6f-a59411562e0b - - - - - -] cache.memcache_password = log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:29 localhost nova_compute[230642]: 2025-12-15 09:32:29.997 230646 DEBUG oslo_service.service [None req-49080c14-bd80-4b1c-bc6f-a59411562e0b - - - - - -] cache.memcache_pool_connection_get_timeout = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:29 localhost nova_compute[230642]: 2025-12-15 09:32:29.997 230646 DEBUG oslo_service.service [None req-49080c14-bd80-4b1c-bc6f-a59411562e0b - - - - - -] cache.memcache_pool_flush_on_reconnect = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:29 localhost nova_compute[230642]: 2025-12-15 09:32:29.997 230646 DEBUG oslo_service.service [None req-49080c14-bd80-4b1c-bc6f-a59411562e0b - - - - - -] cache.memcache_pool_maxsize = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:29 localhost nova_compute[230642]: 2025-12-15 09:32:29.997 230646 DEBUG oslo_service.service [None req-49080c14-bd80-4b1c-bc6f-a59411562e0b - - - - - -] cache.memcache_pool_unused_timeout = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:29 localhost nova_compute[230642]: 2025-12-15 09:32:29.997 230646 DEBUG oslo_service.service [None req-49080c14-bd80-4b1c-bc6f-a59411562e0b - - - - - -] cache.memcache_sasl_enabled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:29 localhost nova_compute[230642]: 2025-12-15 09:32:29.998 230646 DEBUG oslo_service.service [None req-49080c14-bd80-4b1c-bc6f-a59411562e0b - - - - - -] cache.memcache_servers = ['localhost:11211'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:29 localhost nova_compute[230642]: 2025-12-15 09:32:29.998 230646 DEBUG oslo_service.service [None req-49080c14-bd80-4b1c-bc6f-a59411562e0b - - - - - -] cache.memcache_socket_timeout = 1.0 
log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:29 localhost nova_compute[230642]: 2025-12-15 09:32:29.998 230646 DEBUG oslo_service.service [None req-49080c14-bd80-4b1c-bc6f-a59411562e0b - - - - - -] cache.memcache_username = log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:29 localhost nova_compute[230642]: 2025-12-15 09:32:29.998 230646 DEBUG oslo_service.service [None req-49080c14-bd80-4b1c-bc6f-a59411562e0b - - - - - -] cache.proxies = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:29 localhost nova_compute[230642]: 2025-12-15 09:32:29.998 230646 DEBUG oslo_service.service [None req-49080c14-bd80-4b1c-bc6f-a59411562e0b - - - - - -] cache.retry_attempts = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:29 localhost nova_compute[230642]: 2025-12-15 09:32:29.999 230646 DEBUG oslo_service.service [None req-49080c14-bd80-4b1c-bc6f-a59411562e0b - - - - - -] cache.retry_delay = 0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:29 localhost nova_compute[230642]: 2025-12-15 09:32:29.999 230646 DEBUG oslo_service.service [None req-49080c14-bd80-4b1c-bc6f-a59411562e0b - - - - - -] cache.socket_keepalive_count = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:29 localhost nova_compute[230642]: 2025-12-15 09:32:29.999 230646 DEBUG oslo_service.service [None req-49080c14-bd80-4b1c-bc6f-a59411562e0b - - - - - -] cache.socket_keepalive_idle = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:29 localhost nova_compute[230642]: 2025-12-15 09:32:29.999 230646 DEBUG oslo_service.service [None req-49080c14-bd80-4b1c-bc6f-a59411562e0b - - - - - -] cache.socket_keepalive_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m 
Dec 15 04:32:29 localhost nova_compute[230642]: 2025-12-15 09:32:29.999 230646 DEBUG oslo_service.service [None req-49080c14-bd80-4b1c-bc6f-a59411562e0b - - - - - -] cache.tls_allowed_ciphers = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:30 localhost nova_compute[230642]: 2025-12-15 09:32:29.999 230646 DEBUG oslo_service.service [None req-49080c14-bd80-4b1c-bc6f-a59411562e0b - - - - - -] cache.tls_cafile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:30 localhost nova_compute[230642]: 2025-12-15 09:32:30.000 230646 DEBUG oslo_service.service [None req-49080c14-bd80-4b1c-bc6f-a59411562e0b - - - - - -] cache.tls_certfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:30 localhost nova_compute[230642]: 2025-12-15 09:32:30.000 230646 DEBUG oslo_service.service [None req-49080c14-bd80-4b1c-bc6f-a59411562e0b - - - - - -] cache.tls_enabled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:30 localhost nova_compute[230642]: 2025-12-15 09:32:30.000 230646 DEBUG oslo_service.service [None req-49080c14-bd80-4b1c-bc6f-a59411562e0b - - - - - -] cache.tls_keyfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:30 localhost nova_compute[230642]: 2025-12-15 09:32:30.000 230646 DEBUG oslo_service.service [None req-49080c14-bd80-4b1c-bc6f-a59411562e0b - - - - - -] cinder.auth_section = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:30 localhost nova_compute[230642]: 2025-12-15 09:32:30.000 230646 DEBUG oslo_service.service [None req-49080c14-bd80-4b1c-bc6f-a59411562e0b - - - - - -] cinder.auth_type = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:30 localhost nova_compute[230642]: 2025-12-15 09:32:30.001 230646 DEBUG 
oslo_service.service [None req-49080c14-bd80-4b1c-bc6f-a59411562e0b - - - - - -] cinder.cafile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:30 localhost nova_compute[230642]: 2025-12-15 09:32:30.001 230646 DEBUG oslo_service.service [None req-49080c14-bd80-4b1c-bc6f-a59411562e0b - - - - - -] cinder.catalog_info = volumev3:cinderv3:internalURL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:30 localhost nova_compute[230642]: 2025-12-15 09:32:30.001 230646 DEBUG oslo_service.service [None req-49080c14-bd80-4b1c-bc6f-a59411562e0b - - - - - -] cinder.certfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:30 localhost nova_compute[230642]: 2025-12-15 09:32:30.001 230646 DEBUG oslo_service.service [None req-49080c14-bd80-4b1c-bc6f-a59411562e0b - - - - - -] cinder.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:30 localhost nova_compute[230642]: 2025-12-15 09:32:30.001 230646 DEBUG oslo_service.service [None req-49080c14-bd80-4b1c-bc6f-a59411562e0b - - - - - -] cinder.cross_az_attach = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:30 localhost nova_compute[230642]: 2025-12-15 09:32:30.001 230646 DEBUG oslo_service.service [None req-49080c14-bd80-4b1c-bc6f-a59411562e0b - - - - - -] cinder.debug = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:30 localhost nova_compute[230642]: 2025-12-15 09:32:30.001 230646 DEBUG oslo_service.service [None req-49080c14-bd80-4b1c-bc6f-a59411562e0b - - - - - -] cinder.endpoint_template = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:30 localhost nova_compute[230642]: 2025-12-15 09:32:30.002 230646 DEBUG oslo_service.service [None req-49080c14-bd80-4b1c-bc6f-a59411562e0b 
- - - - - -] cinder.http_retries = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:30 localhost nova_compute[230642]: 2025-12-15 09:32:30.002 230646 DEBUG oslo_service.service [None req-49080c14-bd80-4b1c-bc6f-a59411562e0b - - - - - -] cinder.insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:30 localhost nova_compute[230642]: 2025-12-15 09:32:30.002 230646 DEBUG oslo_service.service [None req-49080c14-bd80-4b1c-bc6f-a59411562e0b - - - - - -] cinder.keyfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:30 localhost nova_compute[230642]: 2025-12-15 09:32:30.002 230646 DEBUG oslo_service.service [None req-49080c14-bd80-4b1c-bc6f-a59411562e0b - - - - - -] cinder.os_region_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:30 localhost nova_compute[230642]: 2025-12-15 09:32:30.002 230646 DEBUG oslo_service.service [None req-49080c14-bd80-4b1c-bc6f-a59411562e0b - - - - - -] cinder.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:30 localhost nova_compute[230642]: 2025-12-15 09:32:30.002 230646 DEBUG oslo_service.service [None req-49080c14-bd80-4b1c-bc6f-a59411562e0b - - - - - -] cinder.timeout = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:30 localhost nova_compute[230642]: 2025-12-15 09:32:30.002 230646 DEBUG oslo_service.service [None req-49080c14-bd80-4b1c-bc6f-a59411562e0b - - - - - -] compute.consecutive_build_service_disable_threshold = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:30 localhost nova_compute[230642]: 2025-12-15 09:32:30.003 230646 DEBUG oslo_service.service [None req-49080c14-bd80-4b1c-bc6f-a59411562e0b - - - - - -] compute.cpu_dedicated_set = None log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:30 localhost nova_compute[230642]: 2025-12-15 09:32:30.003 230646 DEBUG oslo_service.service [None req-49080c14-bd80-4b1c-bc6f-a59411562e0b - - - - - -] compute.cpu_shared_set = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:30 localhost nova_compute[230642]: 2025-12-15 09:32:30.003 230646 DEBUG oslo_service.service [None req-49080c14-bd80-4b1c-bc6f-a59411562e0b - - - - - -] compute.image_type_exclude_list = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:30 localhost nova_compute[230642]: 2025-12-15 09:32:30.003 230646 DEBUG oslo_service.service [None req-49080c14-bd80-4b1c-bc6f-a59411562e0b - - - - - -] compute.live_migration_wait_for_vif_plug = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:30 localhost nova_compute[230642]: 2025-12-15 09:32:30.003 230646 DEBUG oslo_service.service [None req-49080c14-bd80-4b1c-bc6f-a59411562e0b - - - - - -] compute.max_concurrent_disk_ops = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:30 localhost nova_compute[230642]: 2025-12-15 09:32:30.003 230646 DEBUG oslo_service.service [None req-49080c14-bd80-4b1c-bc6f-a59411562e0b - - - - - -] compute.max_disk_devices_to_attach = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:30 localhost nova_compute[230642]: 2025-12-15 09:32:30.003 230646 DEBUG oslo_service.service [None req-49080c14-bd80-4b1c-bc6f-a59411562e0b - - - - - -] compute.packing_host_numa_cells_allocation_strategy = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:30 localhost nova_compute[230642]: 2025-12-15 09:32:30.003 230646 DEBUG oslo_service.service [None req-49080c14-bd80-4b1c-bc6f-a59411562e0b - - - - - -] compute.provider_config_location = 
/etc/nova/provider_config/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:30 localhost nova_compute[230642]: 2025-12-15 09:32:30.004 230646 DEBUG oslo_service.service [None req-49080c14-bd80-4b1c-bc6f-a59411562e0b - - - - - -] compute.resource_provider_association_refresh = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:30 localhost nova_compute[230642]: 2025-12-15 09:32:30.004 230646 DEBUG oslo_service.service [None req-49080c14-bd80-4b1c-bc6f-a59411562e0b - - - - - -] compute.shutdown_retry_interval = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:30 localhost nova_compute[230642]: 2025-12-15 09:32:30.004 230646 DEBUG oslo_service.service [None req-49080c14-bd80-4b1c-bc6f-a59411562e0b - - - - - -] compute.vmdk_allowed_types = ['streamOptimized', 'monolithicSparse'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:30 localhost nova_compute[230642]: 2025-12-15 09:32:30.004 230646 DEBUG oslo_service.service [None req-49080c14-bd80-4b1c-bc6f-a59411562e0b - - - - - -] conductor.workers = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:30 localhost nova_compute[230642]: 2025-12-15 09:32:30.004 230646 DEBUG oslo_service.service [None req-49080c14-bd80-4b1c-bc6f-a59411562e0b - - - - - -] console.allowed_origins = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:30 localhost nova_compute[230642]: 2025-12-15 09:32:30.004 230646 DEBUG oslo_service.service [None req-49080c14-bd80-4b1c-bc6f-a59411562e0b - - - - - -] console.ssl_ciphers = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:30 localhost nova_compute[230642]: 2025-12-15 09:32:30.004 230646 DEBUG oslo_service.service [None req-49080c14-bd80-4b1c-bc6f-a59411562e0b - - - - - -] 
console.ssl_minimum_version = default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:30 localhost nova_compute[230642]: 2025-12-15 09:32:30.005 230646 DEBUG oslo_service.service [None req-49080c14-bd80-4b1c-bc6f-a59411562e0b - - - - - -] consoleauth.token_ttl = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:30 localhost nova_compute[230642]: 2025-12-15 09:32:30.005 230646 DEBUG oslo_service.service [None req-49080c14-bd80-4b1c-bc6f-a59411562e0b - - - - - -] cyborg.cafile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:30 localhost nova_compute[230642]: 2025-12-15 09:32:30.005 230646 DEBUG oslo_service.service [None req-49080c14-bd80-4b1c-bc6f-a59411562e0b - - - - - -] cyborg.certfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:30 localhost nova_compute[230642]: 2025-12-15 09:32:30.005 230646 DEBUG oslo_service.service [None req-49080c14-bd80-4b1c-bc6f-a59411562e0b - - - - - -] cyborg.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:30 localhost nova_compute[230642]: 2025-12-15 09:32:30.005 230646 DEBUG oslo_service.service [None req-49080c14-bd80-4b1c-bc6f-a59411562e0b - - - - - -] cyborg.connect_retries = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:30 localhost nova_compute[230642]: 2025-12-15 09:32:30.005 230646 DEBUG oslo_service.service [None req-49080c14-bd80-4b1c-bc6f-a59411562e0b - - - - - -] cyborg.connect_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:30 localhost nova_compute[230642]: 2025-12-15 09:32:30.005 230646 DEBUG oslo_service.service [None req-49080c14-bd80-4b1c-bc6f-a59411562e0b - - - - - -] cyborg.endpoint_override = None log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:30 localhost nova_compute[230642]: 2025-12-15 09:32:30.005 230646 DEBUG oslo_service.service [None req-49080c14-bd80-4b1c-bc6f-a59411562e0b - - - - - -] cyborg.insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:30 localhost nova_compute[230642]: 2025-12-15 09:32:30.006 230646 DEBUG oslo_service.service [None req-49080c14-bd80-4b1c-bc6f-a59411562e0b - - - - - -] cyborg.keyfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:30 localhost nova_compute[230642]: 2025-12-15 09:32:30.006 230646 DEBUG oslo_service.service [None req-49080c14-bd80-4b1c-bc6f-a59411562e0b - - - - - -] cyborg.max_version = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:30 localhost nova_compute[230642]: 2025-12-15 09:32:30.006 230646 DEBUG oslo_service.service [None req-49080c14-bd80-4b1c-bc6f-a59411562e0b - - - - - -] cyborg.min_version = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:30 localhost nova_compute[230642]: 2025-12-15 09:32:30.006 230646 DEBUG oslo_service.service [None req-49080c14-bd80-4b1c-bc6f-a59411562e0b - - - - - -] cyborg.region_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:30 localhost nova_compute[230642]: 2025-12-15 09:32:30.006 230646 DEBUG oslo_service.service [None req-49080c14-bd80-4b1c-bc6f-a59411562e0b - - - - - -] cyborg.service_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:30 localhost nova_compute[230642]: 2025-12-15 09:32:30.006 230646 DEBUG oslo_service.service [None req-49080c14-bd80-4b1c-bc6f-a59411562e0b - - - - - -] cyborg.service_type = accelerator log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:30 localhost 
nova_compute[230642]: 2025-12-15 09:32:30.006 230646 DEBUG oslo_service.service [None req-49080c14-bd80-4b1c-bc6f-a59411562e0b - - - - - -] cyborg.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:30 localhost nova_compute[230642]: 2025-12-15 09:32:30.006 230646 DEBUG oslo_service.service [None req-49080c14-bd80-4b1c-bc6f-a59411562e0b - - - - - -] cyborg.status_code_retries = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:30 localhost nova_compute[230642]: 2025-12-15 09:32:30.007 230646 DEBUG oslo_service.service [None req-49080c14-bd80-4b1c-bc6f-a59411562e0b - - - - - -] cyborg.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:30 localhost nova_compute[230642]: 2025-12-15 09:32:30.007 230646 DEBUG oslo_service.service [None req-49080c14-bd80-4b1c-bc6f-a59411562e0b - - - - - -] cyborg.timeout = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:30 localhost nova_compute[230642]: 2025-12-15 09:32:30.007 230646 DEBUG oslo_service.service [None req-49080c14-bd80-4b1c-bc6f-a59411562e0b - - - - - -] cyborg.valid_interfaces = ['internal', 'public'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:30 localhost nova_compute[230642]: 2025-12-15 09:32:30.007 230646 DEBUG oslo_service.service [None req-49080c14-bd80-4b1c-bc6f-a59411562e0b - - - - - -] cyborg.version = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:30 localhost nova_compute[230642]: 2025-12-15 09:32:30.007 230646 DEBUG oslo_service.service [None req-49080c14-bd80-4b1c-bc6f-a59411562e0b - - - - - -] database.backend = sqlalchemy log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:30 localhost nova_compute[230642]: 2025-12-15 09:32:30.007 230646 
DEBUG oslo_service.service [None req-49080c14-bd80-4b1c-bc6f-a59411562e0b - - - - - -] database.connection = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:30 localhost nova_compute[230642]: 2025-12-15 09:32:30.007 230646 DEBUG oslo_service.service [None req-49080c14-bd80-4b1c-bc6f-a59411562e0b - - - - - -] database.connection_debug = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:30 localhost nova_compute[230642]: 2025-12-15 09:32:30.008 230646 DEBUG oslo_service.service [None req-49080c14-bd80-4b1c-bc6f-a59411562e0b - - - - - -] database.connection_parameters = log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:30 localhost nova_compute[230642]: 2025-12-15 09:32:30.008 230646 DEBUG oslo_service.service [None req-49080c14-bd80-4b1c-bc6f-a59411562e0b - - - - - -] database.connection_recycle_time = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:30 localhost nova_compute[230642]: 2025-12-15 09:32:30.008 230646 DEBUG oslo_service.service [None req-49080c14-bd80-4b1c-bc6f-a59411562e0b - - - - - -] database.connection_trace = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:30 localhost nova_compute[230642]: 2025-12-15 09:32:30.008 230646 DEBUG oslo_service.service [None req-49080c14-bd80-4b1c-bc6f-a59411562e0b - - - - - -] database.db_inc_retry_interval = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:30 localhost nova_compute[230642]: 2025-12-15 09:32:30.008 230646 DEBUG oslo_service.service [None req-49080c14-bd80-4b1c-bc6f-a59411562e0b - - - - - -] database.db_max_retries = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:30 localhost nova_compute[230642]: 2025-12-15 09:32:30.008 230646 DEBUG oslo_service.service [None 
req-49080c14-bd80-4b1c-bc6f-a59411562e0b - - - - - -] database.db_max_retry_interval = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:30 localhost nova_compute[230642]: 2025-12-15 09:32:30.008 230646 DEBUG oslo_service.service [None req-49080c14-bd80-4b1c-bc6f-a59411562e0b - - - - - -] database.db_retry_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:30 localhost nova_compute[230642]: 2025-12-15 09:32:30.008 230646 DEBUG oslo_service.service [None req-49080c14-bd80-4b1c-bc6f-a59411562e0b - - - - - -] database.max_overflow = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:30 localhost nova_compute[230642]: 2025-12-15 09:32:30.009 230646 DEBUG oslo_service.service [None req-49080c14-bd80-4b1c-bc6f-a59411562e0b - - - - - -] database.max_pool_size = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:30 localhost nova_compute[230642]: 2025-12-15 09:32:30.009 230646 DEBUG oslo_service.service [None req-49080c14-bd80-4b1c-bc6f-a59411562e0b - - - - - -] database.max_retries = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:30 localhost nova_compute[230642]: 2025-12-15 09:32:30.009 230646 DEBUG oslo_service.service [None req-49080c14-bd80-4b1c-bc6f-a59411562e0b - - - - - -] database.mysql_enable_ndb = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:30 localhost nova_compute[230642]: 2025-12-15 09:32:30.009 230646 DEBUG oslo_service.service [None req-49080c14-bd80-4b1c-bc6f-a59411562e0b - - - - - -] database.mysql_sql_mode = TRADITIONAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:30 localhost nova_compute[230642]: 2025-12-15 09:32:30.009 230646 DEBUG oslo_service.service [None req-49080c14-bd80-4b1c-bc6f-a59411562e0b - - - - - -] 
database.mysql_wsrep_sync_wait = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:30 localhost nova_compute[230642]: 2025-12-15 09:32:30.009 230646 DEBUG oslo_service.service [None req-49080c14-bd80-4b1c-bc6f-a59411562e0b - - - - - -] database.pool_timeout = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:30 localhost nova_compute[230642]: 2025-12-15 09:32:30.009 230646 DEBUG oslo_service.service [None req-49080c14-bd80-4b1c-bc6f-a59411562e0b - - - - - -] database.retry_interval = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:30 localhost nova_compute[230642]: 2025-12-15 09:32:30.010 230646 DEBUG oslo_service.service [None req-49080c14-bd80-4b1c-bc6f-a59411562e0b - - - - - -] database.slave_connection = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:30 localhost nova_compute[230642]: 2025-12-15 09:32:30.010 230646 DEBUG oslo_service.service [None req-49080c14-bd80-4b1c-bc6f-a59411562e0b - - - - - -] database.sqlite_synchronous = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:30 localhost nova_compute[230642]: 2025-12-15 09:32:30.010 230646 DEBUG oslo_service.service [None req-49080c14-bd80-4b1c-bc6f-a59411562e0b - - - - - -] api_database.backend = sqlalchemy log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:30 localhost nova_compute[230642]: 2025-12-15 09:32:30.010 230646 DEBUG oslo_service.service [None req-49080c14-bd80-4b1c-bc6f-a59411562e0b - - - - - -] api_database.connection = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:30 localhost nova_compute[230642]: 2025-12-15 09:32:30.010 230646 DEBUG oslo_service.service [None req-49080c14-bd80-4b1c-bc6f-a59411562e0b - - - - - -] api_database.connection_debug = 0 log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:30 localhost nova_compute[230642]: 2025-12-15 09:32:30.010 230646 DEBUG oslo_service.service [None req-49080c14-bd80-4b1c-bc6f-a59411562e0b - - - - - -] api_database.connection_parameters = log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:30 localhost nova_compute[230642]: 2025-12-15 09:32:30.010 230646 DEBUG oslo_service.service [None req-49080c14-bd80-4b1c-bc6f-a59411562e0b - - - - - -] api_database.connection_recycle_time = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:30 localhost nova_compute[230642]: 2025-12-15 09:32:30.011 230646 DEBUG oslo_service.service [None req-49080c14-bd80-4b1c-bc6f-a59411562e0b - - - - - -] api_database.connection_trace = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:30 localhost nova_compute[230642]: 2025-12-15 09:32:30.011 230646 DEBUG oslo_service.service [None req-49080c14-bd80-4b1c-bc6f-a59411562e0b - - - - - -] api_database.db_inc_retry_interval = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:30 localhost nova_compute[230642]: 2025-12-15 09:32:30.011 230646 DEBUG oslo_service.service [None req-49080c14-bd80-4b1c-bc6f-a59411562e0b - - - - - -] api_database.db_max_retries = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:30 localhost nova_compute[230642]: 2025-12-15 09:32:30.011 230646 DEBUG oslo_service.service [None req-49080c14-bd80-4b1c-bc6f-a59411562e0b - - - - - -] api_database.db_max_retry_interval = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:30 localhost nova_compute[230642]: 2025-12-15 09:32:30.011 230646 DEBUG oslo_service.service [None req-49080c14-bd80-4b1c-bc6f-a59411562e0b - - - - - -] api_database.db_retry_interval = 1 log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:30 localhost nova_compute[230642]: 2025-12-15 09:32:30.011 230646 DEBUG oslo_service.service [None req-49080c14-bd80-4b1c-bc6f-a59411562e0b - - - - - -] api_database.max_overflow = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:30 localhost nova_compute[230642]: 2025-12-15 09:32:30.011 230646 DEBUG oslo_service.service [None req-49080c14-bd80-4b1c-bc6f-a59411562e0b - - - - - -] api_database.max_pool_size = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:30 localhost nova_compute[230642]: 2025-12-15 09:32:30.011 230646 DEBUG oslo_service.service [None req-49080c14-bd80-4b1c-bc6f-a59411562e0b - - - - - -] api_database.max_retries = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:30 localhost nova_compute[230642]: 2025-12-15 09:32:30.012 230646 DEBUG oslo_service.service [None req-49080c14-bd80-4b1c-bc6f-a59411562e0b - - - - - -] api_database.mysql_enable_ndb = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:30 localhost nova_compute[230642]: 2025-12-15 09:32:30.012 230646 DEBUG oslo_service.service [None req-49080c14-bd80-4b1c-bc6f-a59411562e0b - - - - - -] api_database.mysql_sql_mode = TRADITIONAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:30 localhost nova_compute[230642]: 2025-12-15 09:32:30.012 230646 DEBUG oslo_service.service [None req-49080c14-bd80-4b1c-bc6f-a59411562e0b - - - - - -] api_database.mysql_wsrep_sync_wait = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:30 localhost nova_compute[230642]: 2025-12-15 09:32:30.012 230646 DEBUG oslo_service.service [None req-49080c14-bd80-4b1c-bc6f-a59411562e0b - - - - - -] api_database.pool_timeout = None log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:30 localhost nova_compute[230642]: 2025-12-15 09:32:30.012 230646 DEBUG oslo_service.service [None req-49080c14-bd80-4b1c-bc6f-a59411562e0b - - - - - -] api_database.retry_interval = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:30 localhost nova_compute[230642]: 2025-12-15 09:32:30.012 230646 DEBUG oslo_service.service [None req-49080c14-bd80-4b1c-bc6f-a59411562e0b - - - - - -] api_database.slave_connection = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:30 localhost nova_compute[230642]: 2025-12-15 09:32:30.012 230646 DEBUG oslo_service.service [None req-49080c14-bd80-4b1c-bc6f-a59411562e0b - - - - - -] api_database.sqlite_synchronous = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:30 localhost nova_compute[230642]: 2025-12-15 09:32:30.012 230646 DEBUG oslo_service.service [None req-49080c14-bd80-4b1c-bc6f-a59411562e0b - - - - - -] devices.enabled_mdev_types = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:30 localhost nova_compute[230642]: 2025-12-15 09:32:30.013 230646 DEBUG oslo_service.service [None req-49080c14-bd80-4b1c-bc6f-a59411562e0b - - - - - -] ephemeral_storage_encryption.cipher = aes-xts-plain64 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:30 localhost nova_compute[230642]: 2025-12-15 09:32:30.013 230646 DEBUG oslo_service.service [None req-49080c14-bd80-4b1c-bc6f-a59411562e0b - - - - - -] ephemeral_storage_encryption.enabled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:30 localhost nova_compute[230642]: 2025-12-15 09:32:30.013 230646 DEBUG oslo_service.service [None req-49080c14-bd80-4b1c-bc6f-a59411562e0b - - - - - -] ephemeral_storage_encryption.key_size = 512 
log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:30 localhost nova_compute[230642]: 2025-12-15 09:32:30.013 230646 DEBUG oslo_service.service [None req-49080c14-bd80-4b1c-bc6f-a59411562e0b - - - - - -] glance.api_servers = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:30 localhost nova_compute[230642]: 2025-12-15 09:32:30.013 230646 DEBUG oslo_service.service [None req-49080c14-bd80-4b1c-bc6f-a59411562e0b - - - - - -] glance.cafile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:30 localhost nova_compute[230642]: 2025-12-15 09:32:30.013 230646 DEBUG oslo_service.service [None req-49080c14-bd80-4b1c-bc6f-a59411562e0b - - - - - -] glance.certfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:30 localhost nova_compute[230642]: 2025-12-15 09:32:30.013 230646 DEBUG oslo_service.service [None req-49080c14-bd80-4b1c-bc6f-a59411562e0b - - - - - -] glance.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:30 localhost nova_compute[230642]: 2025-12-15 09:32:30.014 230646 DEBUG oslo_service.service [None req-49080c14-bd80-4b1c-bc6f-a59411562e0b - - - - - -] glance.connect_retries = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:30 localhost nova_compute[230642]: 2025-12-15 09:32:30.014 230646 DEBUG oslo_service.service [None req-49080c14-bd80-4b1c-bc6f-a59411562e0b - - - - - -] glance.connect_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:30 localhost nova_compute[230642]: 2025-12-15 09:32:30.014 230646 DEBUG oslo_service.service [None req-49080c14-bd80-4b1c-bc6f-a59411562e0b - - - - - -] glance.debug = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 
04:32:30 localhost nova_compute[230642]: 2025-12-15 09:32:30.014 230646 DEBUG oslo_service.service [None req-49080c14-bd80-4b1c-bc6f-a59411562e0b - - - - - -] glance.default_trusted_certificate_ids = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:30 localhost nova_compute[230642]: 2025-12-15 09:32:30.014 230646 DEBUG oslo_service.service [None req-49080c14-bd80-4b1c-bc6f-a59411562e0b - - - - - -] glance.enable_certificate_validation = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:30 localhost nova_compute[230642]: 2025-12-15 09:32:30.014 230646 DEBUG oslo_service.service [None req-49080c14-bd80-4b1c-bc6f-a59411562e0b - - - - - -] glance.enable_rbd_download = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:30 localhost nova_compute[230642]: 2025-12-15 09:32:30.014 230646 DEBUG oslo_service.service [None req-49080c14-bd80-4b1c-bc6f-a59411562e0b - - - - - -] glance.endpoint_override = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:30 localhost nova_compute[230642]: 2025-12-15 09:32:30.014 230646 DEBUG oslo_service.service [None req-49080c14-bd80-4b1c-bc6f-a59411562e0b - - - - - -] glance.insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:30 localhost nova_compute[230642]: 2025-12-15 09:32:30.015 230646 DEBUG oslo_service.service [None req-49080c14-bd80-4b1c-bc6f-a59411562e0b - - - - - -] glance.keyfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:30 localhost nova_compute[230642]: 2025-12-15 09:32:30.015 230646 DEBUG oslo_service.service [None req-49080c14-bd80-4b1c-bc6f-a59411562e0b - - - - - -] glance.max_version = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:30 localhost nova_compute[230642]: 
2025-12-15 09:32:30.015 230646 DEBUG oslo_service.service [None req-49080c14-bd80-4b1c-bc6f-a59411562e0b - - - - - -] glance.min_version = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:30 localhost nova_compute[230642]: 2025-12-15 09:32:30.015 230646 DEBUG oslo_service.service [None req-49080c14-bd80-4b1c-bc6f-a59411562e0b - - - - - -] glance.num_retries = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:30 localhost nova_compute[230642]: 2025-12-15 09:32:30.015 230646 DEBUG oslo_service.service [None req-49080c14-bd80-4b1c-bc6f-a59411562e0b - - - - - -] glance.rbd_ceph_conf = log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:30 localhost nova_compute[230642]: 2025-12-15 09:32:30.015 230646 DEBUG oslo_service.service [None req-49080c14-bd80-4b1c-bc6f-a59411562e0b - - - - - -] glance.rbd_connect_timeout = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:30 localhost nova_compute[230642]: 2025-12-15 09:32:30.015 230646 DEBUG oslo_service.service [None req-49080c14-bd80-4b1c-bc6f-a59411562e0b - - - - - -] glance.rbd_pool = log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:30 localhost nova_compute[230642]: 2025-12-15 09:32:30.016 230646 DEBUG oslo_service.service [None req-49080c14-bd80-4b1c-bc6f-a59411562e0b - - - - - -] glance.rbd_user = log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:30 localhost nova_compute[230642]: 2025-12-15 09:32:30.016 230646 DEBUG oslo_service.service [None req-49080c14-bd80-4b1c-bc6f-a59411562e0b - - - - - -] glance.region_name = regionOne log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:30 localhost nova_compute[230642]: 2025-12-15 09:32:30.016 230646 DEBUG oslo_service.service [None req-49080c14-bd80-4b1c-bc6f-a59411562e0b - 
- - - - -] glance.service_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:30 localhost nova_compute[230642]: 2025-12-15 09:32:30.016 230646 DEBUG oslo_service.service [None req-49080c14-bd80-4b1c-bc6f-a59411562e0b - - - - - -] glance.service_type = image log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:30 localhost nova_compute[230642]: 2025-12-15 09:32:30.016 230646 DEBUG oslo_service.service [None req-49080c14-bd80-4b1c-bc6f-a59411562e0b - - - - - -] glance.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:30 localhost nova_compute[230642]: 2025-12-15 09:32:30.016 230646 DEBUG oslo_service.service [None req-49080c14-bd80-4b1c-bc6f-a59411562e0b - - - - - -] glance.status_code_retries = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:30 localhost nova_compute[230642]: 2025-12-15 09:32:30.016 230646 DEBUG oslo_service.service [None req-49080c14-bd80-4b1c-bc6f-a59411562e0b - - - - - -] glance.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:30 localhost nova_compute[230642]: 2025-12-15 09:32:30.016 230646 DEBUG oslo_service.service [None req-49080c14-bd80-4b1c-bc6f-a59411562e0b - - - - - -] glance.timeout = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:30 localhost nova_compute[230642]: 2025-12-15 09:32:30.017 230646 DEBUG oslo_service.service [None req-49080c14-bd80-4b1c-bc6f-a59411562e0b - - - - - -] glance.valid_interfaces = ['internal'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:30 localhost nova_compute[230642]: 2025-12-15 09:32:30.017 230646 DEBUG oslo_service.service [None req-49080c14-bd80-4b1c-bc6f-a59411562e0b - - - - - -] glance.verify_glance_signatures = False 
log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:30 localhost nova_compute[230642]: 2025-12-15 09:32:30.017 230646 DEBUG oslo_service.service [None req-49080c14-bd80-4b1c-bc6f-a59411562e0b - - - - - -] glance.version = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:30 localhost nova_compute[230642]: 2025-12-15 09:32:30.017 230646 DEBUG oslo_service.service [None req-49080c14-bd80-4b1c-bc6f-a59411562e0b - - - - - -] guestfs.debug = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:30 localhost nova_compute[230642]: 2025-12-15 09:32:30.017 230646 DEBUG oslo_service.service [None req-49080c14-bd80-4b1c-bc6f-a59411562e0b - - - - - -] hyperv.config_drive_cdrom = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:30 localhost nova_compute[230642]: 2025-12-15 09:32:30.017 230646 DEBUG oslo_service.service [None req-49080c14-bd80-4b1c-bc6f-a59411562e0b - - - - - -] hyperv.config_drive_inject_password = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:30 localhost nova_compute[230642]: 2025-12-15 09:32:30.017 230646 DEBUG oslo_service.service [None req-49080c14-bd80-4b1c-bc6f-a59411562e0b - - - - - -] hyperv.dynamic_memory_ratio = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:30 localhost nova_compute[230642]: 2025-12-15 09:32:30.018 230646 DEBUG oslo_service.service [None req-49080c14-bd80-4b1c-bc6f-a59411562e0b - - - - - -] hyperv.enable_instance_metrics_collection = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:30 localhost nova_compute[230642]: 2025-12-15 09:32:30.018 230646 DEBUG oslo_service.service [None req-49080c14-bd80-4b1c-bc6f-a59411562e0b - - - - - -] hyperv.enable_remotefx = False log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:30 localhost nova_compute[230642]: 2025-12-15 09:32:30.018 230646 DEBUG oslo_service.service [None req-49080c14-bd80-4b1c-bc6f-a59411562e0b - - - - - -] hyperv.instances_path_share = log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:30 localhost nova_compute[230642]: 2025-12-15 09:32:30.018 230646 DEBUG oslo_service.service [None req-49080c14-bd80-4b1c-bc6f-a59411562e0b - - - - - -] hyperv.iscsi_initiator_list = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:30 localhost nova_compute[230642]: 2025-12-15 09:32:30.018 230646 DEBUG oslo_service.service [None req-49080c14-bd80-4b1c-bc6f-a59411562e0b - - - - - -] hyperv.limit_cpu_features = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:30 localhost nova_compute[230642]: 2025-12-15 09:32:30.018 230646 DEBUG oslo_service.service [None req-49080c14-bd80-4b1c-bc6f-a59411562e0b - - - - - -] hyperv.mounted_disk_query_retry_count = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:30 localhost nova_compute[230642]: 2025-12-15 09:32:30.018 230646 DEBUG oslo_service.service [None req-49080c14-bd80-4b1c-bc6f-a59411562e0b - - - - - -] hyperv.mounted_disk_query_retry_interval = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:30 localhost nova_compute[230642]: 2025-12-15 09:32:30.018 230646 DEBUG oslo_service.service [None req-49080c14-bd80-4b1c-bc6f-a59411562e0b - - - - - -] hyperv.power_state_check_timeframe = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:30 localhost nova_compute[230642]: 2025-12-15 09:32:30.019 230646 DEBUG oslo_service.service [None req-49080c14-bd80-4b1c-bc6f-a59411562e0b - - - - - -] hyperv.power_state_event_polling_interval = 2 log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:30 localhost nova_compute[230642]: 2025-12-15 09:32:30.019 230646 DEBUG oslo_service.service [None req-49080c14-bd80-4b1c-bc6f-a59411562e0b - - - - - -] hyperv.qemu_img_cmd = qemu-img.exe log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:30 localhost nova_compute[230642]: 2025-12-15 09:32:30.019 230646 DEBUG oslo_service.service [None req-49080c14-bd80-4b1c-bc6f-a59411562e0b - - - - - -] hyperv.use_multipath_io = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:30 localhost nova_compute[230642]: 2025-12-15 09:32:30.019 230646 DEBUG oslo_service.service [None req-49080c14-bd80-4b1c-bc6f-a59411562e0b - - - - - -] hyperv.volume_attach_retry_count = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:30 localhost nova_compute[230642]: 2025-12-15 09:32:30.019 230646 DEBUG oslo_service.service [None req-49080c14-bd80-4b1c-bc6f-a59411562e0b - - - - - -] hyperv.volume_attach_retry_interval = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:30 localhost nova_compute[230642]: 2025-12-15 09:32:30.019 230646 DEBUG oslo_service.service [None req-49080c14-bd80-4b1c-bc6f-a59411562e0b - - - - - -] hyperv.vswitch_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:30 localhost nova_compute[230642]: 2025-12-15 09:32:30.019 230646 DEBUG oslo_service.service [None req-49080c14-bd80-4b1c-bc6f-a59411562e0b - - - - - -] hyperv.wait_soft_reboot_seconds = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:30 localhost nova_compute[230642]: 2025-12-15 09:32:30.019 230646 DEBUG oslo_service.service [None req-49080c14-bd80-4b1c-bc6f-a59411562e0b - - - - - -] mks.enabled = False log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:30 localhost nova_compute[230642]: 2025-12-15 09:32:30.020 230646 DEBUG oslo_service.service [None req-49080c14-bd80-4b1c-bc6f-a59411562e0b - - - - - -] mks.mksproxy_base_url = http://127.0.0.1:6090/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:30 localhost nova_compute[230642]: 2025-12-15 09:32:30.020 230646 DEBUG oslo_service.service [None req-49080c14-bd80-4b1c-bc6f-a59411562e0b - - - - - -] image_cache.manager_interval = 2400 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:30 localhost nova_compute[230642]: 2025-12-15 09:32:30.020 230646 DEBUG oslo_service.service [None req-49080c14-bd80-4b1c-bc6f-a59411562e0b - - - - - -] image_cache.precache_concurrency = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:30 localhost nova_compute[230642]: 2025-12-15 09:32:30.020 230646 DEBUG oslo_service.service [None req-49080c14-bd80-4b1c-bc6f-a59411562e0b - - - - - -] image_cache.remove_unused_base_images = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:30 localhost nova_compute[230642]: 2025-12-15 09:32:30.020 230646 DEBUG oslo_service.service [None req-49080c14-bd80-4b1c-bc6f-a59411562e0b - - - - - -] image_cache.remove_unused_original_minimum_age_seconds = 86400 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:30 localhost nova_compute[230642]: 2025-12-15 09:32:30.020 230646 DEBUG oslo_service.service [None req-49080c14-bd80-4b1c-bc6f-a59411562e0b - - - - - -] image_cache.remove_unused_resized_minimum_age_seconds = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:30 localhost nova_compute[230642]: 2025-12-15 09:32:30.021 230646 DEBUG oslo_service.service [None req-49080c14-bd80-4b1c-bc6f-a59411562e0b - - - - - -] 
image_cache.subdirectory_name = _base log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:30 localhost nova_compute[230642]: 2025-12-15 09:32:30.021 230646 DEBUG oslo_service.service [None req-49080c14-bd80-4b1c-bc6f-a59411562e0b - - - - - -] ironic.api_max_retries = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:30 localhost nova_compute[230642]: 2025-12-15 09:32:30.021 230646 DEBUG oslo_service.service [None req-49080c14-bd80-4b1c-bc6f-a59411562e0b - - - - - -] ironic.api_retry_interval = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:30 localhost nova_compute[230642]: 2025-12-15 09:32:30.021 230646 DEBUG oslo_service.service [None req-49080c14-bd80-4b1c-bc6f-a59411562e0b - - - - - -] ironic.auth_section = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:30 localhost nova_compute[230642]: 2025-12-15 09:32:30.021 230646 DEBUG oslo_service.service [None req-49080c14-bd80-4b1c-bc6f-a59411562e0b - - - - - -] ironic.auth_type = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:30 localhost nova_compute[230642]: 2025-12-15 09:32:30.021 230646 DEBUG oslo_service.service [None req-49080c14-bd80-4b1c-bc6f-a59411562e0b - - - - - -] ironic.cafile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:30 localhost nova_compute[230642]: 2025-12-15 09:32:30.021 230646 DEBUG oslo_service.service [None req-49080c14-bd80-4b1c-bc6f-a59411562e0b - - - - - -] ironic.certfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:30 localhost nova_compute[230642]: 2025-12-15 09:32:30.022 230646 DEBUG oslo_service.service [None req-49080c14-bd80-4b1c-bc6f-a59411562e0b - - - - - -] ironic.collect_timing = False log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:30 localhost nova_compute[230642]: 2025-12-15 09:32:30.022 230646 DEBUG oslo_service.service [None req-49080c14-bd80-4b1c-bc6f-a59411562e0b - - - - - -] ironic.connect_retries = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:30 localhost nova_compute[230642]: 2025-12-15 09:32:30.022 230646 DEBUG oslo_service.service [None req-49080c14-bd80-4b1c-bc6f-a59411562e0b - - - - - -] ironic.connect_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:30 localhost nova_compute[230642]: 2025-12-15 09:32:30.022 230646 DEBUG oslo_service.service [None req-49080c14-bd80-4b1c-bc6f-a59411562e0b - - - - - -] ironic.endpoint_override = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:30 localhost nova_compute[230642]: 2025-12-15 09:32:30.022 230646 DEBUG oslo_service.service [None req-49080c14-bd80-4b1c-bc6f-a59411562e0b - - - - - -] ironic.insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:30 localhost nova_compute[230642]: 2025-12-15 09:32:30.022 230646 DEBUG oslo_service.service [None req-49080c14-bd80-4b1c-bc6f-a59411562e0b - - - - - -] ironic.keyfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:30 localhost nova_compute[230642]: 2025-12-15 09:32:30.022 230646 DEBUG oslo_service.service [None req-49080c14-bd80-4b1c-bc6f-a59411562e0b - - - - - -] ironic.max_version = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:30 localhost nova_compute[230642]: 2025-12-15 09:32:30.022 230646 DEBUG oslo_service.service [None req-49080c14-bd80-4b1c-bc6f-a59411562e0b - - - - - -] ironic.min_version = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:30 
localhost nova_compute[230642]: 2025-12-15 09:32:30.023 230646 DEBUG oslo_service.service [None req-49080c14-bd80-4b1c-bc6f-a59411562e0b - - - - - -] ironic.partition_key = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:30 localhost nova_compute[230642]: 2025-12-15 09:32:30.023 230646 DEBUG oslo_service.service [None req-49080c14-bd80-4b1c-bc6f-a59411562e0b - - - - - -] ironic.peer_list = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:30 localhost nova_compute[230642]: 2025-12-15 09:32:30.023 230646 DEBUG oslo_service.service [None req-49080c14-bd80-4b1c-bc6f-a59411562e0b - - - - - -] ironic.region_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:30 localhost nova_compute[230642]: 2025-12-15 09:32:30.023 230646 DEBUG oslo_service.service [None req-49080c14-bd80-4b1c-bc6f-a59411562e0b - - - - - -] ironic.serial_console_state_timeout = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:30 localhost nova_compute[230642]: 2025-12-15 09:32:30.023 230646 DEBUG oslo_service.service [None req-49080c14-bd80-4b1c-bc6f-a59411562e0b - - - - - -] ironic.service_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:30 localhost nova_compute[230642]: 2025-12-15 09:32:30.023 230646 DEBUG oslo_service.service [None req-49080c14-bd80-4b1c-bc6f-a59411562e0b - - - - - -] ironic.service_type = baremetal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:30 localhost nova_compute[230642]: 2025-12-15 09:32:30.023 230646 DEBUG oslo_service.service [None req-49080c14-bd80-4b1c-bc6f-a59411562e0b - - - - - -] ironic.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:30 localhost nova_compute[230642]: 2025-12-15 09:32:30.024 230646 DEBUG 
oslo_service.service [None req-49080c14-bd80-4b1c-bc6f-a59411562e0b - - - - - -] ironic.status_code_retries = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:30 localhost nova_compute[230642]: 2025-12-15 09:32:30.024 230646 DEBUG oslo_service.service [None req-49080c14-bd80-4b1c-bc6f-a59411562e0b - - - - - -] ironic.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:30 localhost nova_compute[230642]: 2025-12-15 09:32:30.024 230646 DEBUG oslo_service.service [None req-49080c14-bd80-4b1c-bc6f-a59411562e0b - - - - - -] ironic.timeout = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:30 localhost nova_compute[230642]: 2025-12-15 09:32:30.024 230646 DEBUG oslo_service.service [None req-49080c14-bd80-4b1c-bc6f-a59411562e0b - - - - - -] ironic.valid_interfaces = ['internal', 'public'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:30 localhost nova_compute[230642]: 2025-12-15 09:32:30.024 230646 DEBUG oslo_service.service [None req-49080c14-bd80-4b1c-bc6f-a59411562e0b - - - - - -] ironic.version = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:30 localhost nova_compute[230642]: 2025-12-15 09:32:30.024 230646 DEBUG oslo_service.service [None req-49080c14-bd80-4b1c-bc6f-a59411562e0b - - - - - -] key_manager.backend = barbican log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:30 localhost nova_compute[230642]: 2025-12-15 09:32:30.024 230646 DEBUG oslo_service.service [None req-49080c14-bd80-4b1c-bc6f-a59411562e0b - - - - - -] key_manager.fixed_key = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:30 localhost nova_compute[230642]: 2025-12-15 09:32:30.024 230646 DEBUG oslo_service.service [None 
req-49080c14-bd80-4b1c-bc6f-a59411562e0b - - - - - -] barbican.auth_endpoint = http://localhost/identity/v3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:30 localhost nova_compute[230642]: 2025-12-15 09:32:30.025 230646 DEBUG oslo_service.service [None req-49080c14-bd80-4b1c-bc6f-a59411562e0b - - - - - -] barbican.barbican_api_version = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:30 localhost nova_compute[230642]: 2025-12-15 09:32:30.025 230646 DEBUG oslo_service.service [None req-49080c14-bd80-4b1c-bc6f-a59411562e0b - - - - - -] barbican.barbican_endpoint = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:30 localhost nova_compute[230642]: 2025-12-15 09:32:30.025 230646 DEBUG oslo_service.service [None req-49080c14-bd80-4b1c-bc6f-a59411562e0b - - - - - -] barbican.barbican_endpoint_type = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:30 localhost nova_compute[230642]: 2025-12-15 09:32:30.025 230646 DEBUG oslo_service.service [None req-49080c14-bd80-4b1c-bc6f-a59411562e0b - - - - - -] barbican.barbican_region_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:30 localhost nova_compute[230642]: 2025-12-15 09:32:30.025 230646 DEBUG oslo_service.service [None req-49080c14-bd80-4b1c-bc6f-a59411562e0b - - - - - -] barbican.cafile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:30 localhost nova_compute[230642]: 2025-12-15 09:32:30.025 230646 DEBUG oslo_service.service [None req-49080c14-bd80-4b1c-bc6f-a59411562e0b - - - - - -] barbican.certfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:30 localhost nova_compute[230642]: 2025-12-15 09:32:30.025 230646 DEBUG oslo_service.service [None 
req-49080c14-bd80-4b1c-bc6f-a59411562e0b - - - - - -] barbican.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:30 localhost nova_compute[230642]: 2025-12-15 09:32:30.026 230646 DEBUG oslo_service.service [None req-49080c14-bd80-4b1c-bc6f-a59411562e0b - - - - - -] barbican.insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:30 localhost nova_compute[230642]: 2025-12-15 09:32:30.026 230646 DEBUG oslo_service.service [None req-49080c14-bd80-4b1c-bc6f-a59411562e0b - - - - - -] barbican.keyfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:30 localhost nova_compute[230642]: 2025-12-15 09:32:30.026 230646 DEBUG oslo_service.service [None req-49080c14-bd80-4b1c-bc6f-a59411562e0b - - - - - -] barbican.number_of_retries = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:30 localhost nova_compute[230642]: 2025-12-15 09:32:30.026 230646 DEBUG oslo_service.service [None req-49080c14-bd80-4b1c-bc6f-a59411562e0b - - - - - -] barbican.retry_delay = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:30 localhost nova_compute[230642]: 2025-12-15 09:32:30.026 230646 DEBUG oslo_service.service [None req-49080c14-bd80-4b1c-bc6f-a59411562e0b - - - - - -] barbican.send_service_user_token = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:30 localhost nova_compute[230642]: 2025-12-15 09:32:30.026 230646 DEBUG oslo_service.service [None req-49080c14-bd80-4b1c-bc6f-a59411562e0b - - - - - -] barbican.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:30 localhost nova_compute[230642]: 2025-12-15 09:32:30.026 230646 DEBUG oslo_service.service [None req-49080c14-bd80-4b1c-bc6f-a59411562e0b - - - - - -] 
barbican.timeout = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:30 localhost nova_compute[230642]: 2025-12-15 09:32:30.026 230646 DEBUG oslo_service.service [None req-49080c14-bd80-4b1c-bc6f-a59411562e0b - - - - - -] barbican.verify_ssl = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:30 localhost nova_compute[230642]: 2025-12-15 09:32:30.027 230646 DEBUG oslo_service.service [None req-49080c14-bd80-4b1c-bc6f-a59411562e0b - - - - - -] barbican.verify_ssl_path = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:30 localhost nova_compute[230642]: 2025-12-15 09:32:30.027 230646 DEBUG oslo_service.service [None req-49080c14-bd80-4b1c-bc6f-a59411562e0b - - - - - -] barbican_service_user.auth_section = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:30 localhost nova_compute[230642]: 2025-12-15 09:32:30.027 230646 DEBUG oslo_service.service [None req-49080c14-bd80-4b1c-bc6f-a59411562e0b - - - - - -] barbican_service_user.auth_type = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:30 localhost nova_compute[230642]: 2025-12-15 09:32:30.027 230646 DEBUG oslo_service.service [None req-49080c14-bd80-4b1c-bc6f-a59411562e0b - - - - - -] barbican_service_user.cafile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:30 localhost nova_compute[230642]: 2025-12-15 09:32:30.027 230646 DEBUG oslo_service.service [None req-49080c14-bd80-4b1c-bc6f-a59411562e0b - - - - - -] barbican_service_user.certfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:30 localhost nova_compute[230642]: 2025-12-15 09:32:30.027 230646 DEBUG oslo_service.service [None req-49080c14-bd80-4b1c-bc6f-a59411562e0b - - - - - -] barbican_service_user.collect_timing = 
False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:30 localhost nova_compute[230642]: 2025-12-15 09:32:30.027 230646 DEBUG oslo_service.service [None req-49080c14-bd80-4b1c-bc6f-a59411562e0b - - - - - -] barbican_service_user.insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:30 localhost nova_compute[230642]: 2025-12-15 09:32:30.027 230646 DEBUG oslo_service.service [None req-49080c14-bd80-4b1c-bc6f-a59411562e0b - - - - - -] barbican_service_user.keyfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:30 localhost nova_compute[230642]: 2025-12-15 09:32:30.028 230646 DEBUG oslo_service.service [None req-49080c14-bd80-4b1c-bc6f-a59411562e0b - - - - - -] barbican_service_user.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:30 localhost nova_compute[230642]: 2025-12-15 09:32:30.028 230646 DEBUG oslo_service.service [None req-49080c14-bd80-4b1c-bc6f-a59411562e0b - - - - - -] barbican_service_user.timeout = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:30 localhost nova_compute[230642]: 2025-12-15 09:32:30.028 230646 DEBUG oslo_service.service [None req-49080c14-bd80-4b1c-bc6f-a59411562e0b - - - - - -] vault.approle_role_id = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:30 localhost nova_compute[230642]: 2025-12-15 09:32:30.028 230646 DEBUG oslo_service.service [None req-49080c14-bd80-4b1c-bc6f-a59411562e0b - - - - - -] vault.approle_secret_id = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:30 localhost nova_compute[230642]: 2025-12-15 09:32:30.028 230646 DEBUG oslo_service.service [None req-49080c14-bd80-4b1c-bc6f-a59411562e0b - - - - - -] vault.cafile = None log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:30 localhost nova_compute[230642]: 2025-12-15 09:32:30.028 230646 DEBUG oslo_service.service [None req-49080c14-bd80-4b1c-bc6f-a59411562e0b - - - - - -] vault.certfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:30 localhost nova_compute[230642]: 2025-12-15 09:32:30.028 230646 DEBUG oslo_service.service [None req-49080c14-bd80-4b1c-bc6f-a59411562e0b - - - - - -] vault.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:30 localhost nova_compute[230642]: 2025-12-15 09:32:30.029 230646 DEBUG oslo_service.service [None req-49080c14-bd80-4b1c-bc6f-a59411562e0b - - - - - -] vault.insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:30 localhost nova_compute[230642]: 2025-12-15 09:32:30.029 230646 DEBUG oslo_service.service [None req-49080c14-bd80-4b1c-bc6f-a59411562e0b - - - - - -] vault.keyfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:30 localhost nova_compute[230642]: 2025-12-15 09:32:30.029 230646 DEBUG oslo_service.service [None req-49080c14-bd80-4b1c-bc6f-a59411562e0b - - - - - -] vault.kv_mountpoint = secret log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:30 localhost nova_compute[230642]: 2025-12-15 09:32:30.029 230646 DEBUG oslo_service.service [None req-49080c14-bd80-4b1c-bc6f-a59411562e0b - - - - - -] vault.kv_version = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:30 localhost nova_compute[230642]: 2025-12-15 09:32:30.029 230646 DEBUG oslo_service.service [None req-49080c14-bd80-4b1c-bc6f-a59411562e0b - - - - - -] vault.namespace = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:30 localhost 
nova_compute[230642]: 2025-12-15 09:32:30.029 230646 DEBUG oslo_service.service [None req-49080c14-bd80-4b1c-bc6f-a59411562e0b - - - - - -] vault.root_token_id = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:30 localhost nova_compute[230642]: 2025-12-15 09:32:30.029 230646 DEBUG oslo_service.service [None req-49080c14-bd80-4b1c-bc6f-a59411562e0b - - - - - -] vault.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:30 localhost nova_compute[230642]: 2025-12-15 09:32:30.029 230646 DEBUG oslo_service.service [None req-49080c14-bd80-4b1c-bc6f-a59411562e0b - - - - - -] vault.ssl_ca_crt_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:30 localhost nova_compute[230642]: 2025-12-15 09:32:30.030 230646 DEBUG oslo_service.service [None req-49080c14-bd80-4b1c-bc6f-a59411562e0b - - - - - -] vault.timeout = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:30 localhost nova_compute[230642]: 2025-12-15 09:32:30.030 230646 DEBUG oslo_service.service [None req-49080c14-bd80-4b1c-bc6f-a59411562e0b - - - - - -] vault.use_ssl = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:30 localhost nova_compute[230642]: 2025-12-15 09:32:30.030 230646 DEBUG oslo_service.service [None req-49080c14-bd80-4b1c-bc6f-a59411562e0b - - - - - -] vault.vault_url = http://127.0.0.1:8200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:30 localhost nova_compute[230642]: 2025-12-15 09:32:30.030 230646 DEBUG oslo_service.service [None req-49080c14-bd80-4b1c-bc6f-a59411562e0b - - - - - -] keystone.cafile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:30 localhost nova_compute[230642]: 2025-12-15 09:32:30.030 230646 DEBUG oslo_service.service [None 
req-49080c14-bd80-4b1c-bc6f-a59411562e0b - - - - - -] keystone.certfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:30 localhost nova_compute[230642]: 2025-12-15 09:32:30.030 230646 DEBUG oslo_service.service [None req-49080c14-bd80-4b1c-bc6f-a59411562e0b - - - - - -] keystone.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:30 localhost nova_compute[230642]: 2025-12-15 09:32:30.030 230646 DEBUG oslo_service.service [None req-49080c14-bd80-4b1c-bc6f-a59411562e0b - - - - - -] keystone.connect_retries = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:30 localhost nova_compute[230642]: 2025-12-15 09:32:30.031 230646 DEBUG oslo_service.service [None req-49080c14-bd80-4b1c-bc6f-a59411562e0b - - - - - -] keystone.connect_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:30 localhost nova_compute[230642]: 2025-12-15 09:32:30.031 230646 DEBUG oslo_service.service [None req-49080c14-bd80-4b1c-bc6f-a59411562e0b - - - - - -] keystone.endpoint_override = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:30 localhost nova_compute[230642]: 2025-12-15 09:32:30.031 230646 DEBUG oslo_service.service [None req-49080c14-bd80-4b1c-bc6f-a59411562e0b - - - - - -] keystone.insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:30 localhost nova_compute[230642]: 2025-12-15 09:32:30.031 230646 DEBUG oslo_service.service [None req-49080c14-bd80-4b1c-bc6f-a59411562e0b - - - - - -] keystone.keyfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:30 localhost nova_compute[230642]: 2025-12-15 09:32:30.031 230646 DEBUG oslo_service.service [None req-49080c14-bd80-4b1c-bc6f-a59411562e0b - - - - - -] 
keystone.max_version = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:30 localhost nova_compute[230642]: 2025-12-15 09:32:30.031 230646 DEBUG oslo_service.service [None req-49080c14-bd80-4b1c-bc6f-a59411562e0b - - - - - -] keystone.min_version = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:30 localhost nova_compute[230642]: 2025-12-15 09:32:30.031 230646 DEBUG oslo_service.service [None req-49080c14-bd80-4b1c-bc6f-a59411562e0b - - - - - -] keystone.region_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:30 localhost nova_compute[230642]: 2025-12-15 09:32:30.031 230646 DEBUG oslo_service.service [None req-49080c14-bd80-4b1c-bc6f-a59411562e0b - - - - - -] keystone.service_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:30 localhost nova_compute[230642]: 2025-12-15 09:32:30.032 230646 DEBUG oslo_service.service [None req-49080c14-bd80-4b1c-bc6f-a59411562e0b - - - - - -] keystone.service_type = identity log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:30 localhost nova_compute[230642]: 2025-12-15 09:32:30.032 230646 DEBUG oslo_service.service [None req-49080c14-bd80-4b1c-bc6f-a59411562e0b - - - - - -] keystone.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:30 localhost nova_compute[230642]: 2025-12-15 09:32:30.032 230646 DEBUG oslo_service.service [None req-49080c14-bd80-4b1c-bc6f-a59411562e0b - - - - - -] keystone.status_code_retries = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:30 localhost nova_compute[230642]: 2025-12-15 09:32:30.032 230646 DEBUG oslo_service.service [None req-49080c14-bd80-4b1c-bc6f-a59411562e0b - - - - - -] keystone.status_code_retry_delay = None log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 15 04:32:30 localhost nova_compute[230642]: 2025-12-15 09:32:30.032 230646 DEBUG oslo_service.service [None req-49080c14-bd80-4b1c-bc6f-a59411562e0b - - - - - -] keystone.timeout = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 15 04:32:30 localhost nova_compute[230642]: 2025-12-15 09:32:30.032 230646 DEBUG oslo_service.service [None req-49080c14-bd80-4b1c-bc6f-a59411562e0b - - - - - -] keystone.valid_interfaces = ['internal', 'public'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 15 04:32:30 localhost nova_compute[230642]: 2025-12-15 09:32:30.032 230646 DEBUG oslo_service.service [None req-49080c14-bd80-4b1c-bc6f-a59411562e0b - - - - - -] keystone.version = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 15 04:32:30 localhost nova_compute[230642]: 2025-12-15 09:32:30.032 230646 DEBUG oslo_service.service [None req-49080c14-bd80-4b1c-bc6f-a59411562e0b - - - - - -] libvirt.connection_uri = log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 15 04:32:30 localhost nova_compute[230642]: 2025-12-15 09:32:30.033 230646 DEBUG oslo_service.service [None req-49080c14-bd80-4b1c-bc6f-a59411562e0b - - - - - -] libvirt.cpu_mode = host-model log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 15 04:32:30 localhost nova_compute[230642]: 2025-12-15 09:32:30.033 230646 DEBUG oslo_service.service [None req-49080c14-bd80-4b1c-bc6f-a59411562e0b - - - - - -] libvirt.cpu_model_extra_flags = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 15 04:32:30 localhost nova_compute[230642]: 2025-12-15 09:32:30.033 230646 DEBUG oslo_service.service [None req-49080c14-bd80-4b1c-bc6f-a59411562e0b - - - - - -] libvirt.cpu_models = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 15 04:32:30 localhost nova_compute[230642]: 2025-12-15 09:32:30.033 230646 DEBUG oslo_service.service [None req-49080c14-bd80-4b1c-bc6f-a59411562e0b - - - - - -] libvirt.cpu_power_governor_high = performance log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 15 04:32:30 localhost nova_compute[230642]: 2025-12-15 09:32:30.033 230646 DEBUG oslo_service.service [None req-49080c14-bd80-4b1c-bc6f-a59411562e0b - - - - - -] libvirt.cpu_power_governor_low = powersave log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 15 04:32:30 localhost nova_compute[230642]: 2025-12-15 09:32:30.033 230646 DEBUG oslo_service.service [None req-49080c14-bd80-4b1c-bc6f-a59411562e0b - - - - - -] libvirt.cpu_power_management = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 15 04:32:30 localhost nova_compute[230642]: 2025-12-15 09:32:30.033 230646 DEBUG oslo_service.service [None req-49080c14-bd80-4b1c-bc6f-a59411562e0b - - - - - -] libvirt.cpu_power_management_strategy = cpu_state log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 15 04:32:30 localhost nova_compute[230642]: 2025-12-15 09:32:30.034 230646 DEBUG oslo_service.service [None req-49080c14-bd80-4b1c-bc6f-a59411562e0b - - - - - -] libvirt.device_detach_attempts = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 15 04:32:30 localhost nova_compute[230642]: 2025-12-15 09:32:30.034 230646 DEBUG oslo_service.service [None req-49080c14-bd80-4b1c-bc6f-a59411562e0b - - - - - -] libvirt.device_detach_timeout = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 15 04:32:30 localhost nova_compute[230642]: 2025-12-15 09:32:30.034 230646 DEBUG oslo_service.service [None req-49080c14-bd80-4b1c-bc6f-a59411562e0b - - - - - -] libvirt.disk_cachemodes = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 15 04:32:30 localhost nova_compute[230642]: 2025-12-15 09:32:30.034 230646 DEBUG oslo_service.service [None req-49080c14-bd80-4b1c-bc6f-a59411562e0b - - - - - -] libvirt.disk_prefix = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 15 04:32:30 localhost nova_compute[230642]: 2025-12-15 09:32:30.034 230646 DEBUG oslo_service.service [None req-49080c14-bd80-4b1c-bc6f-a59411562e0b - - - - - -] libvirt.enabled_perf_events = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 15 04:32:30 localhost nova_compute[230642]: 2025-12-15 09:32:30.034 230646 DEBUG oslo_service.service [None req-49080c14-bd80-4b1c-bc6f-a59411562e0b - - - - - -] libvirt.file_backed_memory = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 15 04:32:30 localhost nova_compute[230642]: 2025-12-15 09:32:30.034 230646 DEBUG oslo_service.service [None req-49080c14-bd80-4b1c-bc6f-a59411562e0b - - - - - -] libvirt.gid_maps = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 15 04:32:30 localhost nova_compute[230642]: 2025-12-15 09:32:30.034 230646 DEBUG oslo_service.service [None req-49080c14-bd80-4b1c-bc6f-a59411562e0b - - - - - -] libvirt.hw_disk_discard = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 15 04:32:30 localhost nova_compute[230642]: 2025-12-15 09:32:30.035 230646 DEBUG oslo_service.service [None req-49080c14-bd80-4b1c-bc6f-a59411562e0b - - - - - -] libvirt.hw_machine_type = ['x86_64=q35'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 15 04:32:30 localhost nova_compute[230642]: 2025-12-15 09:32:30.035 230646 DEBUG oslo_service.service [None req-49080c14-bd80-4b1c-bc6f-a59411562e0b - - - - - -] libvirt.images_rbd_ceph_conf = /etc/ceph/ceph.conf log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 15 04:32:30 localhost nova_compute[230642]: 2025-12-15 09:32:30.035 230646 DEBUG oslo_service.service [None req-49080c14-bd80-4b1c-bc6f-a59411562e0b - - - - - -] libvirt.images_rbd_glance_copy_poll_interval = 15 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 15 04:32:30 localhost nova_compute[230642]: 2025-12-15 09:32:30.035 230646 DEBUG oslo_service.service [None req-49080c14-bd80-4b1c-bc6f-a59411562e0b - - - - - -] libvirt.images_rbd_glance_copy_timeout = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 15 04:32:30 localhost nova_compute[230642]: 2025-12-15 09:32:30.035 230646 DEBUG oslo_service.service [None req-49080c14-bd80-4b1c-bc6f-a59411562e0b - - - - - -] libvirt.images_rbd_glance_store_name = default_backend log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 15 04:32:30 localhost nova_compute[230642]: 2025-12-15 09:32:30.035 230646 DEBUG oslo_service.service [None req-49080c14-bd80-4b1c-bc6f-a59411562e0b - - - - - -] libvirt.images_rbd_pool = vms log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 15 04:32:30 localhost nova_compute[230642]: 2025-12-15 09:32:30.035 230646 DEBUG oslo_service.service [None req-49080c14-bd80-4b1c-bc6f-a59411562e0b - - - - - -] libvirt.images_type = rbd log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 15 04:32:30 localhost nova_compute[230642]: 2025-12-15 09:32:30.036 230646 DEBUG oslo_service.service [None req-49080c14-bd80-4b1c-bc6f-a59411562e0b - - - - - -] libvirt.images_volume_group = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 15 04:32:30 localhost nova_compute[230642]: 2025-12-15 09:32:30.036 230646 DEBUG oslo_service.service [None req-49080c14-bd80-4b1c-bc6f-a59411562e0b - - - - - -] libvirt.inject_key = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 15 04:32:30 localhost nova_compute[230642]: 2025-12-15 09:32:30.036 230646 DEBUG oslo_service.service [None req-49080c14-bd80-4b1c-bc6f-a59411562e0b - - - - - -] libvirt.inject_partition = -2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 15 04:32:30 localhost nova_compute[230642]: 2025-12-15 09:32:30.036 230646 DEBUG oslo_service.service [None req-49080c14-bd80-4b1c-bc6f-a59411562e0b - - - - - -] libvirt.inject_password = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 15 04:32:30 localhost nova_compute[230642]: 2025-12-15 09:32:30.036 230646 DEBUG oslo_service.service [None req-49080c14-bd80-4b1c-bc6f-a59411562e0b - - - - - -] libvirt.iscsi_iface = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 15 04:32:30 localhost nova_compute[230642]: 2025-12-15 09:32:30.036 230646 DEBUG oslo_service.service [None req-49080c14-bd80-4b1c-bc6f-a59411562e0b - - - - - -] libvirt.iser_use_multipath = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 15 04:32:30 localhost nova_compute[230642]: 2025-12-15 09:32:30.036 230646 DEBUG oslo_service.service [None req-49080c14-bd80-4b1c-bc6f-a59411562e0b - - - - - -] libvirt.live_migration_bandwidth = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 15 04:32:30 localhost nova_compute[230642]: 2025-12-15 09:32:30.037 230646 DEBUG oslo_service.service [None req-49080c14-bd80-4b1c-bc6f-a59411562e0b - - - - - -] libvirt.live_migration_completion_timeout = 800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 15 04:32:30 localhost nova_compute[230642]: 2025-12-15 09:32:30.037 230646 DEBUG oslo_service.service [None req-49080c14-bd80-4b1c-bc6f-a59411562e0b - - - - - -] libvirt.live_migration_downtime = 500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 15 04:32:30 localhost nova_compute[230642]: 2025-12-15 09:32:30.037 230646 DEBUG oslo_service.service [None req-49080c14-bd80-4b1c-bc6f-a59411562e0b - - - - - -] libvirt.live_migration_downtime_delay = 75 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 15 04:32:30 localhost nova_compute[230642]: 2025-12-15 09:32:30.037 230646 DEBUG oslo_service.service [None req-49080c14-bd80-4b1c-bc6f-a59411562e0b - - - - - -] libvirt.live_migration_downtime_steps = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 15 04:32:30 localhost nova_compute[230642]: 2025-12-15 09:32:30.037 230646 DEBUG oslo_service.service [None req-49080c14-bd80-4b1c-bc6f-a59411562e0b - - - - - -] libvirt.live_migration_inbound_addr = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 15 04:32:30 localhost nova_compute[230642]: 2025-12-15 09:32:30.037 230646 DEBUG oslo_service.service [None req-49080c14-bd80-4b1c-bc6f-a59411562e0b - - - - - -] libvirt.live_migration_permit_auto_converge = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 15 04:32:30 localhost nova_compute[230642]: 2025-12-15 09:32:30.037 230646 DEBUG oslo_service.service [None req-49080c14-bd80-4b1c-bc6f-a59411562e0b - - - - - -] libvirt.live_migration_permit_post_copy = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 15 04:32:30 localhost nova_compute[230642]: 2025-12-15 09:32:30.037 230646 DEBUG oslo_service.service [None req-49080c14-bd80-4b1c-bc6f-a59411562e0b - - - - - -] libvirt.live_migration_scheme = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 15 04:32:30 localhost nova_compute[230642]: 2025-12-15 09:32:30.038 230646 DEBUG oslo_service.service [None req-49080c14-bd80-4b1c-bc6f-a59411562e0b - - - - - -] libvirt.live_migration_timeout_action = force_complete log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 15 04:32:30 localhost nova_compute[230642]: 2025-12-15 09:32:30.038 230646 DEBUG oslo_service.service [None req-49080c14-bd80-4b1c-bc6f-a59411562e0b - - - - - -] libvirt.live_migration_tunnelled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 15 04:32:30 localhost nova_compute[230642]: 2025-12-15 09:32:30.038 230646 WARNING oslo_config.cfg [None req-49080c14-bd80-4b1c-bc6f-a59411562e0b - - - - - -] Deprecated: Option "live_migration_uri" from group "libvirt" is deprecated for removal (
Dec 15 04:32:30 localhost nova_compute[230642]: live_migration_uri is deprecated for removal in favor of two other options that
Dec 15 04:32:30 localhost nova_compute[230642]: allow to change live migration scheme and target URI: ``live_migration_scheme``
Dec 15 04:32:30 localhost nova_compute[230642]: and ``live_migration_inbound_addr`` respectively.
Dec 15 04:32:30 localhost nova_compute[230642]: ). Its value may be silently ignored in the future.
Dec 15 04:32:30 localhost nova_compute[230642]: 2025-12-15 09:32:30.038 230646 DEBUG oslo_service.service [None req-49080c14-bd80-4b1c-bc6f-a59411562e0b - - - - - -] libvirt.live_migration_uri = qemu+ssh://nova@%s/system?keyfile=/var/lib/nova/.ssh/ssh-privatekey log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 15 04:32:30 localhost nova_compute[230642]: 2025-12-15 09:32:30.038 230646 DEBUG oslo_service.service [None req-49080c14-bd80-4b1c-bc6f-a59411562e0b - - - - - -] libvirt.live_migration_with_native_tls = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 15 04:32:30 localhost nova_compute[230642]: 2025-12-15 09:32:30.038 230646 DEBUG oslo_service.service [None req-49080c14-bd80-4b1c-bc6f-a59411562e0b - - - - - -] libvirt.max_queues = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 15 04:32:30 localhost nova_compute[230642]: 2025-12-15 09:32:30.038 230646 DEBUG oslo_service.service [None req-49080c14-bd80-4b1c-bc6f-a59411562e0b - - - - - -] libvirt.mem_stats_period_seconds = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 15 04:32:30 localhost nova_compute[230642]: 2025-12-15 09:32:30.039 230646 DEBUG oslo_service.service [None req-49080c14-bd80-4b1c-bc6f-a59411562e0b - - - - - -] libvirt.nfs_mount_options = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 15 04:32:30 localhost nova_compute[230642]: 2025-12-15 09:32:30.039 230646 DEBUG oslo_service.service [None req-49080c14-bd80-4b1c-bc6f-a59411562e0b - - - - - -] libvirt.nfs_mount_point_base = /var/lib/nova/mnt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 15 04:32:30 localhost nova_compute[230642]: 2025-12-15 09:32:30.039 230646 DEBUG oslo_service.service [None req-49080c14-bd80-4b1c-bc6f-a59411562e0b - - - - - -] libvirt.num_aoe_discover_tries = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 15 04:32:30 localhost nova_compute[230642]: 2025-12-15 09:32:30.039 230646 DEBUG oslo_service.service [None req-49080c14-bd80-4b1c-bc6f-a59411562e0b - - - - - -] libvirt.num_iser_scan_tries = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 15 04:32:30 localhost nova_compute[230642]: 2025-12-15 09:32:30.039 230646 DEBUG oslo_service.service [None req-49080c14-bd80-4b1c-bc6f-a59411562e0b - - - - - -] libvirt.num_memory_encrypted_guests = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 15 04:32:30 localhost nova_compute[230642]: 2025-12-15 09:32:30.039 230646 DEBUG oslo_service.service [None req-49080c14-bd80-4b1c-bc6f-a59411562e0b - - - - - -] libvirt.num_nvme_discover_tries = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 15 04:32:30 localhost nova_compute[230642]: 2025-12-15 09:32:30.039 230646 DEBUG oslo_service.service [None req-49080c14-bd80-4b1c-bc6f-a59411562e0b - - - - - -] libvirt.num_pcie_ports = 24 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 15 04:32:30 localhost nova_compute[230642]: 2025-12-15 09:32:30.040 230646 DEBUG oslo_service.service [None req-49080c14-bd80-4b1c-bc6f-a59411562e0b - - - - - -] libvirt.num_volume_scan_tries = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 15 04:32:30 localhost nova_compute[230642]: 2025-12-15 09:32:30.040 230646 DEBUG oslo_service.service [None req-49080c14-bd80-4b1c-bc6f-a59411562e0b - - - - - -] libvirt.pmem_namespaces = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 15 04:32:30 localhost nova_compute[230642]: 2025-12-15 09:32:30.040 230646 DEBUG oslo_service.service [None req-49080c14-bd80-4b1c-bc6f-a59411562e0b - - - - - -] libvirt.quobyte_client_cfg = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 15 04:32:30 localhost nova_compute[230642]: 2025-12-15 09:32:30.040 230646 DEBUG oslo_service.service [None req-49080c14-bd80-4b1c-bc6f-a59411562e0b - - - - - -] libvirt.quobyte_mount_point_base = /var/lib/nova/mnt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 15 04:32:30 localhost nova_compute[230642]: 2025-12-15 09:32:30.040 230646 DEBUG oslo_service.service [None req-49080c14-bd80-4b1c-bc6f-a59411562e0b - - - - - -] libvirt.rbd_connect_timeout = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 15 04:32:30 localhost nova_compute[230642]: 2025-12-15 09:32:30.040 230646 DEBUG oslo_service.service [None req-49080c14-bd80-4b1c-bc6f-a59411562e0b - - - - - -] libvirt.rbd_destroy_volume_retries = 12 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 15 04:32:30 localhost nova_compute[230642]: 2025-12-15 09:32:30.040 230646 DEBUG oslo_service.service [None req-49080c14-bd80-4b1c-bc6f-a59411562e0b - - - - - -] libvirt.rbd_destroy_volume_retry_interval = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 15 04:32:30 localhost nova_compute[230642]: 2025-12-15 09:32:30.041 230646 DEBUG oslo_service.service [None req-49080c14-bd80-4b1c-bc6f-a59411562e0b - - - - - -] libvirt.rbd_secret_uuid = bce17446-41b5-5408-a23e-0b011906b44a log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 15 04:32:30 localhost nova_compute[230642]: 2025-12-15 09:32:30.041 230646 DEBUG oslo_service.service [None req-49080c14-bd80-4b1c-bc6f-a59411562e0b - - - - - -] libvirt.rbd_user = openstack log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 15 04:32:30 localhost nova_compute[230642]: 2025-12-15 09:32:30.041 230646 DEBUG oslo_service.service [None req-49080c14-bd80-4b1c-bc6f-a59411562e0b - - - - - -] libvirt.realtime_scheduler_priority = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 15 04:32:30 localhost nova_compute[230642]: 2025-12-15 09:32:30.041 230646 DEBUG oslo_service.service [None req-49080c14-bd80-4b1c-bc6f-a59411562e0b - - - - - -] libvirt.remote_filesystem_transport = ssh log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 15 04:32:30 localhost nova_compute[230642]: 2025-12-15 09:32:30.041 230646 DEBUG oslo_service.service [None req-49080c14-bd80-4b1c-bc6f-a59411562e0b - - - - - -] libvirt.rescue_image_id = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 15 04:32:30 localhost nova_compute[230642]: 2025-12-15 09:32:30.041 230646 DEBUG oslo_service.service [None req-49080c14-bd80-4b1c-bc6f-a59411562e0b - - - - - -] libvirt.rescue_kernel_id = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 15 04:32:30 localhost nova_compute[230642]: 2025-12-15 09:32:30.041 230646 DEBUG oslo_service.service [None req-49080c14-bd80-4b1c-bc6f-a59411562e0b - - - - - -] libvirt.rescue_ramdisk_id = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 15 04:32:30 localhost nova_compute[230642]: 2025-12-15 09:32:30.042 230646 DEBUG oslo_service.service [None req-49080c14-bd80-4b1c-bc6f-a59411562e0b - - - - - -] libvirt.rng_dev_path = /dev/urandom log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 15 04:32:30 localhost nova_compute[230642]: 2025-12-15 09:32:30.042 230646 DEBUG oslo_service.service [None req-49080c14-bd80-4b1c-bc6f-a59411562e0b - - - - - -] libvirt.rx_queue_size = 512 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 15 04:32:30 localhost nova_compute[230642]: 2025-12-15 09:32:30.042 230646 DEBUG oslo_service.service [None req-49080c14-bd80-4b1c-bc6f-a59411562e0b - - - - - -] libvirt.smbfs_mount_options = log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 15 04:32:30 localhost nova_compute[230642]: 2025-12-15 09:32:30.042 230646 DEBUG oslo_service.service [None req-49080c14-bd80-4b1c-bc6f-a59411562e0b - - - - - -] libvirt.smbfs_mount_point_base = /var/lib/nova/mnt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 15 04:32:30 localhost nova_compute[230642]: 2025-12-15 09:32:30.042 230646 DEBUG oslo_service.service [None req-49080c14-bd80-4b1c-bc6f-a59411562e0b - - - - - -] libvirt.snapshot_compression = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 15 04:32:30 localhost nova_compute[230642]: 2025-12-15 09:32:30.042 230646 DEBUG oslo_service.service [None req-49080c14-bd80-4b1c-bc6f-a59411562e0b - - - - - -] libvirt.snapshot_image_format = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 15 04:32:30 localhost nova_compute[230642]: 2025-12-15 09:32:30.042 230646 DEBUG oslo_service.service [None req-49080c14-bd80-4b1c-bc6f-a59411562e0b - - - - - -] libvirt.snapshots_directory = /var/lib/nova/instances/snapshots log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 15 04:32:30 localhost nova_compute[230642]: 2025-12-15 09:32:30.043 230646 DEBUG oslo_service.service [None req-49080c14-bd80-4b1c-bc6f-a59411562e0b - - - - - -] libvirt.sparse_logical_volumes = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 15 04:32:30 localhost nova_compute[230642]: 2025-12-15 09:32:30.043 230646 DEBUG oslo_service.service [None req-49080c14-bd80-4b1c-bc6f-a59411562e0b - - - - - -] libvirt.swtpm_enabled = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 15 04:32:30 localhost nova_compute[230642]: 2025-12-15 09:32:30.043 230646 DEBUG oslo_service.service [None req-49080c14-bd80-4b1c-bc6f-a59411562e0b - - - - - -] libvirt.swtpm_group = tss log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 15 04:32:30 localhost nova_compute[230642]: 2025-12-15 09:32:30.043 230646 DEBUG oslo_service.service [None req-49080c14-bd80-4b1c-bc6f-a59411562e0b - - - - - -] libvirt.swtpm_user = tss log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 15 04:32:30 localhost nova_compute[230642]: 2025-12-15 09:32:30.043 230646 DEBUG oslo_service.service [None req-49080c14-bd80-4b1c-bc6f-a59411562e0b - - - - - -] libvirt.sysinfo_serial = unique log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 15 04:32:30 localhost nova_compute[230642]: 2025-12-15 09:32:30.043 230646 DEBUG oslo_service.service [None req-49080c14-bd80-4b1c-bc6f-a59411562e0b - - - - - -] libvirt.tx_queue_size = 512 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 15 04:32:30 localhost nova_compute[230642]: 2025-12-15 09:32:30.043 230646 DEBUG oslo_service.service [None req-49080c14-bd80-4b1c-bc6f-a59411562e0b - - - - - -] libvirt.uid_maps = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 15 04:32:30 localhost nova_compute[230642]: 2025-12-15 09:32:30.043 230646 DEBUG oslo_service.service [None req-49080c14-bd80-4b1c-bc6f-a59411562e0b - - - - - -] libvirt.use_virtio_for_bridges = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 15 04:32:30 localhost nova_compute[230642]: 2025-12-15 09:32:30.044 230646 DEBUG oslo_service.service [None req-49080c14-bd80-4b1c-bc6f-a59411562e0b - - - - - -] libvirt.virt_type = kvm log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 15 04:32:30 localhost nova_compute[230642]: 2025-12-15 09:32:30.044 230646 DEBUG oslo_service.service [None req-49080c14-bd80-4b1c-bc6f-a59411562e0b - - - - - -] libvirt.volume_clear = zero log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 15 04:32:30 localhost nova_compute[230642]: 2025-12-15 09:32:30.044 230646 DEBUG oslo_service.service [None req-49080c14-bd80-4b1c-bc6f-a59411562e0b - - - - - -] libvirt.volume_clear_size = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 15 04:32:30 localhost nova_compute[230642]: 2025-12-15 09:32:30.044 230646 DEBUG oslo_service.service [None req-49080c14-bd80-4b1c-bc6f-a59411562e0b - - - - - -] libvirt.volume_use_multipath = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 15 04:32:30 localhost nova_compute[230642]: 2025-12-15 09:32:30.044 230646 DEBUG oslo_service.service [None req-49080c14-bd80-4b1c-bc6f-a59411562e0b - - - - - -] libvirt.vzstorage_cache_path = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 15 04:32:30 localhost nova_compute[230642]: 2025-12-15 09:32:30.044 230646 DEBUG oslo_service.service [None req-49080c14-bd80-4b1c-bc6f-a59411562e0b - - - - - -] libvirt.vzstorage_log_path = /var/log/vstorage/%(cluster_name)s/nova.log.gz log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 15 04:32:30 localhost nova_compute[230642]: 2025-12-15 09:32:30.044 230646 DEBUG oslo_service.service [None req-49080c14-bd80-4b1c-bc6f-a59411562e0b - - - - - -] libvirt.vzstorage_mount_group = qemu log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 15 04:32:30 localhost nova_compute[230642]: 2025-12-15 09:32:30.045 230646 DEBUG oslo_service.service [None req-49080c14-bd80-4b1c-bc6f-a59411562e0b - - - - - -] libvirt.vzstorage_mount_opts = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 15 04:32:30 localhost nova_compute[230642]: 2025-12-15 09:32:30.045 230646 DEBUG oslo_service.service [None req-49080c14-bd80-4b1c-bc6f-a59411562e0b - - - - - -] libvirt.vzstorage_mount_perms = 0770 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 15 04:32:30 localhost nova_compute[230642]: 2025-12-15 09:32:30.045 230646 DEBUG oslo_service.service [None req-49080c14-bd80-4b1c-bc6f-a59411562e0b - - - - - -] libvirt.vzstorage_mount_point_base = /var/lib/nova/mnt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 15 04:32:30 localhost nova_compute[230642]: 2025-12-15 09:32:30.045 230646 DEBUG oslo_service.service [None req-49080c14-bd80-4b1c-bc6f-a59411562e0b - - - - - -] libvirt.vzstorage_mount_user = stack log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 15 04:32:30 localhost nova_compute[230642]: 2025-12-15 09:32:30.045 230646 DEBUG oslo_service.service [None req-49080c14-bd80-4b1c-bc6f-a59411562e0b - - - - - -] libvirt.wait_soft_reboot_seconds = 120 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 15 04:32:30 localhost nova_compute[230642]: 2025-12-15 09:32:30.045 230646 DEBUG oslo_service.service [None req-49080c14-bd80-4b1c-bc6f-a59411562e0b - - - - - -] neutron.auth_section = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 15 04:32:30 localhost nova_compute[230642]: 2025-12-15 09:32:30.045 230646 DEBUG oslo_service.service [None req-49080c14-bd80-4b1c-bc6f-a59411562e0b - - - - - -] neutron.auth_type = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 15 04:32:30 localhost nova_compute[230642]: 2025-12-15 09:32:30.046 230646 DEBUG oslo_service.service [None req-49080c14-bd80-4b1c-bc6f-a59411562e0b - - - - - -] neutron.cafile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 15 04:32:30 localhost nova_compute[230642]: 2025-12-15 09:32:30.046 230646 DEBUG oslo_service.service [None req-49080c14-bd80-4b1c-bc6f-a59411562e0b - - - - - -] neutron.certfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 15 04:32:30 localhost nova_compute[230642]: 2025-12-15 09:32:30.046 230646 DEBUG oslo_service.service [None req-49080c14-bd80-4b1c-bc6f-a59411562e0b - - - - - -] neutron.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 15 04:32:30 localhost nova_compute[230642]: 2025-12-15 09:32:30.046 230646 DEBUG oslo_service.service [None req-49080c14-bd80-4b1c-bc6f-a59411562e0b - - - - - -] neutron.connect_retries = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 15 04:32:30 localhost nova_compute[230642]: 2025-12-15 09:32:30.046 230646 DEBUG oslo_service.service [None req-49080c14-bd80-4b1c-bc6f-a59411562e0b - - - - - -] neutron.connect_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 15 04:32:30 localhost nova_compute[230642]: 2025-12-15 09:32:30.046 230646 DEBUG oslo_service.service [None req-49080c14-bd80-4b1c-bc6f-a59411562e0b - - - - - -] neutron.default_floating_pool = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 15 04:32:30 localhost nova_compute[230642]: 2025-12-15 09:32:30.046 230646 DEBUG oslo_service.service [None req-49080c14-bd80-4b1c-bc6f-a59411562e0b - - - - - -] neutron.endpoint_override = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 15 04:32:30 localhost nova_compute[230642]: 2025-12-15 09:32:30.046 230646 DEBUG oslo_service.service [None req-49080c14-bd80-4b1c-bc6f-a59411562e0b - - - - - -] neutron.extension_sync_interval = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 15 04:32:30 localhost nova_compute[230642]: 2025-12-15 09:32:30.047 230646 DEBUG oslo_service.service [None req-49080c14-bd80-4b1c-bc6f-a59411562e0b - - - - - -] neutron.http_retries = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 15 04:32:30 localhost nova_compute[230642]: 2025-12-15 09:32:30.047 230646 DEBUG oslo_service.service [None req-49080c14-bd80-4b1c-bc6f-a59411562e0b - - - - - -] neutron.insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 15 04:32:30 localhost nova_compute[230642]: 2025-12-15 09:32:30.047 230646 DEBUG oslo_service.service [None req-49080c14-bd80-4b1c-bc6f-a59411562e0b - - - - - -] neutron.keyfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 15 04:32:30 localhost nova_compute[230642]: 2025-12-15 09:32:30.047 230646 DEBUG oslo_service.service [None req-49080c14-bd80-4b1c-bc6f-a59411562e0b - - - - - -] neutron.max_version = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 15 04:32:30 localhost nova_compute[230642]: 2025-12-15 09:32:30.047 230646 DEBUG oslo_service.service [None req-49080c14-bd80-4b1c-bc6f-a59411562e0b - - - - - -] neutron.metadata_proxy_shared_secret = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 15 04:32:30 localhost nova_compute[230642]: 2025-12-15 09:32:30.047 230646 DEBUG oslo_service.service [None req-49080c14-bd80-4b1c-bc6f-a59411562e0b - - - - - -] neutron.min_version = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 15 04:32:30 localhost nova_compute[230642]: 2025-12-15 09:32:30.047 230646 DEBUG oslo_service.service [None req-49080c14-bd80-4b1c-bc6f-a59411562e0b - - - - - -] neutron.ovs_bridge = br-int log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 15 04:32:30 localhost nova_compute[230642]: 2025-12-15 09:32:30.047 230646 DEBUG oslo_service.service [None req-49080c14-bd80-4b1c-bc6f-a59411562e0b - - - - - -] neutron.physnets = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 15 04:32:30 localhost nova_compute[230642]: 2025-12-15 09:32:30.048 230646 DEBUG oslo_service.service [None req-49080c14-bd80-4b1c-bc6f-a59411562e0b - - - - - -] neutron.region_name = regionOne log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 15 04:32:30 localhost nova_compute[230642]: 2025-12-15 09:32:30.048 230646 DEBUG oslo_service.service [None req-49080c14-bd80-4b1c-bc6f-a59411562e0b - - - - - -] neutron.service_metadata_proxy = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 15 04:32:30 localhost nova_compute[230642]: 2025-12-15 09:32:30.048 230646 DEBUG oslo_service.service [None req-49080c14-bd80-4b1c-bc6f-a59411562e0b - - - - - -] neutron.service_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 15 04:32:30 localhost nova_compute[230642]: 2025-12-15 09:32:30.048 230646 DEBUG oslo_service.service [None req-49080c14-bd80-4b1c-bc6f-a59411562e0b - - - - - -] neutron.service_type = network log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 15 04:32:30 localhost nova_compute[230642]: 2025-12-15 09:32:30.048 230646 DEBUG oslo_service.service [None req-49080c14-bd80-4b1c-bc6f-a59411562e0b - - - - - -] neutron.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 15 04:32:30 localhost nova_compute[230642]: 2025-12-15 09:32:30.048 230646 DEBUG oslo_service.service [None req-49080c14-bd80-4b1c-bc6f-a59411562e0b - - - - - -] neutron.status_code_retries = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 15 04:32:30 localhost nova_compute[230642]: 2025-12-15 09:32:30.048 230646 DEBUG oslo_service.service [None req-49080c14-bd80-4b1c-bc6f-a59411562e0b - - - - - -] neutron.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 15 04:32:30 localhost nova_compute[230642]: 2025-12-15 09:32:30.049 230646 DEBUG oslo_service.service [None req-49080c14-bd80-4b1c-bc6f-a59411562e0b - - - - - -] neutron.timeout = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 15 04:32:30 localhost nova_compute[230642]: 2025-12-15 09:32:30.049 230646 DEBUG oslo_service.service [None req-49080c14-bd80-4b1c-bc6f-a59411562e0b - - - - - -] neutron.valid_interfaces = ['internal'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 15 04:32:30 localhost nova_compute[230642]: 2025-12-15 09:32:30.049 230646 DEBUG oslo_service.service [None req-49080c14-bd80-4b1c-bc6f-a59411562e0b - - - - - -] neutron.version = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 15 04:32:30 localhost nova_compute[230642]: 2025-12-15 09:32:30.049 230646 DEBUG oslo_service.service [None req-49080c14-bd80-4b1c-bc6f-a59411562e0b - - - - - -] notifications.bdms_in_notifications = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 15 04:32:30 localhost nova_compute[230642]: 2025-12-15 09:32:30.049 230646 DEBUG oslo_service.service [None req-49080c14-bd80-4b1c-bc6f-a59411562e0b - - - - - -] notifications.default_level = INFO log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 15 04:32:30 localhost nova_compute[230642]: 2025-12-15 09:32:30.049 230646 DEBUG oslo_service.service [None req-49080c14-bd80-4b1c-bc6f-a59411562e0b - - - - - -] notifications.notification_format = unversioned log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 15 04:32:30 localhost nova_compute[230642]: 2025-12-15 09:32:30.049 230646 DEBUG oslo_service.service [None req-49080c14-bd80-4b1c-bc6f-a59411562e0b - - - - - -] notifications.notify_on_state_change = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 15 04:32:30 localhost nova_compute[230642]: 2025-12-15 09:32:30.049 230646 DEBUG oslo_service.service [None req-49080c14-bd80-4b1c-bc6f-a59411562e0b - - - - - -] notifications.versioned_notifications_topics = ['versioned_notifications'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 15 04:32:30 localhost nova_compute[230642]: 2025-12-15 09:32:30.050 230646 DEBUG oslo_service.service [None req-49080c14-bd80-4b1c-bc6f-a59411562e0b - - - - - -] pci.alias = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 15 04:32:30 localhost nova_compute[230642]: 2025-12-15 09:32:30.050 230646 DEBUG oslo_service.service [None req-49080c14-bd80-4b1c-bc6f-a59411562e0b - - - - - -] pci.device_spec = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 15 04:32:30 localhost nova_compute[230642]: 2025-12-15 09:32:30.050 230646 DEBUG oslo_service.service [None req-49080c14-bd80-4b1c-bc6f-a59411562e0b - - - - - -] pci.report_in_placement = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 15 04:32:30 localhost nova_compute[230642]: 2025-12-15 09:32:30.050 230646 DEBUG oslo_service.service [None req-49080c14-bd80-4b1c-bc6f-a59411562e0b - - - - - -] placement.auth_section = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 15 04:32:30 localhost nova_compute[230642]: 2025-12-15 09:32:30.050 230646 DEBUG oslo_service.service [None req-49080c14-bd80-4b1c-bc6f-a59411562e0b - - - - - -] placement.auth_type = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 15 04:32:30 localhost nova_compute[230642]: 2025-12-15 09:32:30.050 230646 DEBUG oslo_service.service [None req-49080c14-bd80-4b1c-bc6f-a59411562e0b - - - - - -] placement.auth_url = http://keystone-internal.openstack.svc:5000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 15 04:32:30 localhost nova_compute[230642]: 2025-12-15 09:32:30.050 230646 DEBUG oslo_service.service [None req-49080c14-bd80-4b1c-bc6f-a59411562e0b - - - - - -] placement.cafile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 15 04:32:30 localhost nova_compute[230642]: 2025-12-15 09:32:30.051 230646 DEBUG oslo_service.service [None req-49080c14-bd80-4b1c-bc6f-a59411562e0b - - - - - -] placement.certfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 15 04:32:30 localhost nova_compute[230642]: 2025-12-15 09:32:30.051 230646 DEBUG oslo_service.service [None req-49080c14-bd80-4b1c-bc6f-a59411562e0b - - - - - -] placement.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 15 04:32:30 localhost nova_compute[230642]: 2025-12-15 09:32:30.051 230646 DEBUG oslo_service.service [None req-49080c14-bd80-4b1c-bc6f-a59411562e0b - - - - - -] placement.connect_retries = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 15 04:32:30 localhost nova_compute[230642]: 2025-12-15 09:32:30.051 230646
DEBUG oslo_service.service [None req-49080c14-bd80-4b1c-bc6f-a59411562e0b - - - - - -] placement.connect_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:30 localhost nova_compute[230642]: 2025-12-15 09:32:30.051 230646 DEBUG oslo_service.service [None req-49080c14-bd80-4b1c-bc6f-a59411562e0b - - - - - -] placement.default_domain_id = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:30 localhost nova_compute[230642]: 2025-12-15 09:32:30.051 230646 DEBUG oslo_service.service [None req-49080c14-bd80-4b1c-bc6f-a59411562e0b - - - - - -] placement.default_domain_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:30 localhost nova_compute[230642]: 2025-12-15 09:32:30.051 230646 DEBUG oslo_service.service [None req-49080c14-bd80-4b1c-bc6f-a59411562e0b - - - - - -] placement.domain_id = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:30 localhost nova_compute[230642]: 2025-12-15 09:32:30.051 230646 DEBUG oslo_service.service [None req-49080c14-bd80-4b1c-bc6f-a59411562e0b - - - - - -] placement.domain_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:30 localhost nova_compute[230642]: 2025-12-15 09:32:30.052 230646 DEBUG oslo_service.service [None req-49080c14-bd80-4b1c-bc6f-a59411562e0b - - - - - -] placement.endpoint_override = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:30 localhost nova_compute[230642]: 2025-12-15 09:32:30.052 230646 DEBUG oslo_service.service [None req-49080c14-bd80-4b1c-bc6f-a59411562e0b - - - - - -] placement.insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:30 localhost nova_compute[230642]: 2025-12-15 09:32:30.052 230646 DEBUG oslo_service.service [None 
req-49080c14-bd80-4b1c-bc6f-a59411562e0b - - - - - -] placement.keyfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:30 localhost nova_compute[230642]: 2025-12-15 09:32:30.052 230646 DEBUG oslo_service.service [None req-49080c14-bd80-4b1c-bc6f-a59411562e0b - - - - - -] placement.max_version = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:30 localhost nova_compute[230642]: 2025-12-15 09:32:30.052 230646 DEBUG oslo_service.service [None req-49080c14-bd80-4b1c-bc6f-a59411562e0b - - - - - -] placement.min_version = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:30 localhost nova_compute[230642]: 2025-12-15 09:32:30.052 230646 DEBUG oslo_service.service [None req-49080c14-bd80-4b1c-bc6f-a59411562e0b - - - - - -] placement.password = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:30 localhost nova_compute[230642]: 2025-12-15 09:32:30.052 230646 DEBUG oslo_service.service [None req-49080c14-bd80-4b1c-bc6f-a59411562e0b - - - - - -] placement.project_domain_id = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:30 localhost nova_compute[230642]: 2025-12-15 09:32:30.053 230646 DEBUG oslo_service.service [None req-49080c14-bd80-4b1c-bc6f-a59411562e0b - - - - - -] placement.project_domain_name = Default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:30 localhost nova_compute[230642]: 2025-12-15 09:32:30.053 230646 DEBUG oslo_service.service [None req-49080c14-bd80-4b1c-bc6f-a59411562e0b - - - - - -] placement.project_id = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:30 localhost nova_compute[230642]: 2025-12-15 09:32:30.053 230646 DEBUG oslo_service.service [None req-49080c14-bd80-4b1c-bc6f-a59411562e0b - - - - - -] 
placement.project_name = service log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:30 localhost nova_compute[230642]: 2025-12-15 09:32:30.053 230646 DEBUG oslo_service.service [None req-49080c14-bd80-4b1c-bc6f-a59411562e0b - - - - - -] placement.region_name = regionOne log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:30 localhost nova_compute[230642]: 2025-12-15 09:32:30.053 230646 DEBUG oslo_service.service [None req-49080c14-bd80-4b1c-bc6f-a59411562e0b - - - - - -] placement.service_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:30 localhost nova_compute[230642]: 2025-12-15 09:32:30.053 230646 DEBUG oslo_service.service [None req-49080c14-bd80-4b1c-bc6f-a59411562e0b - - - - - -] placement.service_type = placement log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:30 localhost nova_compute[230642]: 2025-12-15 09:32:30.053 230646 DEBUG oslo_service.service [None req-49080c14-bd80-4b1c-bc6f-a59411562e0b - - - - - -] placement.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:30 localhost nova_compute[230642]: 2025-12-15 09:32:30.053 230646 DEBUG oslo_service.service [None req-49080c14-bd80-4b1c-bc6f-a59411562e0b - - - - - -] placement.status_code_retries = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:30 localhost nova_compute[230642]: 2025-12-15 09:32:30.054 230646 DEBUG oslo_service.service [None req-49080c14-bd80-4b1c-bc6f-a59411562e0b - - - - - -] placement.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:30 localhost nova_compute[230642]: 2025-12-15 09:32:30.054 230646 DEBUG oslo_service.service [None req-49080c14-bd80-4b1c-bc6f-a59411562e0b - - - - - -] placement.system_scope = None 
log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:30 localhost nova_compute[230642]: 2025-12-15 09:32:30.054 230646 DEBUG oslo_service.service [None req-49080c14-bd80-4b1c-bc6f-a59411562e0b - - - - - -] placement.timeout = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:30 localhost nova_compute[230642]: 2025-12-15 09:32:30.054 230646 DEBUG oslo_service.service [None req-49080c14-bd80-4b1c-bc6f-a59411562e0b - - - - - -] placement.trust_id = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:30 localhost nova_compute[230642]: 2025-12-15 09:32:30.054 230646 DEBUG oslo_service.service [None req-49080c14-bd80-4b1c-bc6f-a59411562e0b - - - - - -] placement.user_domain_id = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:30 localhost nova_compute[230642]: 2025-12-15 09:32:30.054 230646 DEBUG oslo_service.service [None req-49080c14-bd80-4b1c-bc6f-a59411562e0b - - - - - -] placement.user_domain_name = Default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:30 localhost nova_compute[230642]: 2025-12-15 09:32:30.054 230646 DEBUG oslo_service.service [None req-49080c14-bd80-4b1c-bc6f-a59411562e0b - - - - - -] placement.user_id = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:30 localhost nova_compute[230642]: 2025-12-15 09:32:30.054 230646 DEBUG oslo_service.service [None req-49080c14-bd80-4b1c-bc6f-a59411562e0b - - - - - -] placement.username = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:30 localhost nova_compute[230642]: 2025-12-15 09:32:30.055 230646 DEBUG oslo_service.service [None req-49080c14-bd80-4b1c-bc6f-a59411562e0b - - - - - -] placement.valid_interfaces = ['internal'] log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:30 localhost nova_compute[230642]: 2025-12-15 09:32:30.055 230646 DEBUG oslo_service.service [None req-49080c14-bd80-4b1c-bc6f-a59411562e0b - - - - - -] placement.version = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:30 localhost nova_compute[230642]: 2025-12-15 09:32:30.055 230646 DEBUG oslo_service.service [None req-49080c14-bd80-4b1c-bc6f-a59411562e0b - - - - - -] quota.cores = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:30 localhost nova_compute[230642]: 2025-12-15 09:32:30.055 230646 DEBUG oslo_service.service [None req-49080c14-bd80-4b1c-bc6f-a59411562e0b - - - - - -] quota.count_usage_from_placement = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:30 localhost nova_compute[230642]: 2025-12-15 09:32:30.055 230646 DEBUG oslo_service.service [None req-49080c14-bd80-4b1c-bc6f-a59411562e0b - - - - - -] quota.driver = nova.quota.DbQuotaDriver log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:30 localhost nova_compute[230642]: 2025-12-15 09:32:30.055 230646 DEBUG oslo_service.service [None req-49080c14-bd80-4b1c-bc6f-a59411562e0b - - - - - -] quota.injected_file_content_bytes = 10240 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:30 localhost nova_compute[230642]: 2025-12-15 09:32:30.055 230646 DEBUG oslo_service.service [None req-49080c14-bd80-4b1c-bc6f-a59411562e0b - - - - - -] quota.injected_file_path_length = 255 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:30 localhost nova_compute[230642]: 2025-12-15 09:32:30.056 230646 DEBUG oslo_service.service [None req-49080c14-bd80-4b1c-bc6f-a59411562e0b - - - - - -] quota.injected_files = 5 log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:30 localhost nova_compute[230642]: 2025-12-15 09:32:30.056 230646 DEBUG oslo_service.service [None req-49080c14-bd80-4b1c-bc6f-a59411562e0b - - - - - -] quota.instances = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:30 localhost nova_compute[230642]: 2025-12-15 09:32:30.056 230646 DEBUG oslo_service.service [None req-49080c14-bd80-4b1c-bc6f-a59411562e0b - - - - - -] quota.key_pairs = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:30 localhost nova_compute[230642]: 2025-12-15 09:32:30.056 230646 DEBUG oslo_service.service [None req-49080c14-bd80-4b1c-bc6f-a59411562e0b - - - - - -] quota.metadata_items = 128 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:30 localhost nova_compute[230642]: 2025-12-15 09:32:30.056 230646 DEBUG oslo_service.service [None req-49080c14-bd80-4b1c-bc6f-a59411562e0b - - - - - -] quota.ram = 51200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:30 localhost nova_compute[230642]: 2025-12-15 09:32:30.056 230646 DEBUG oslo_service.service [None req-49080c14-bd80-4b1c-bc6f-a59411562e0b - - - - - -] quota.recheck_quota = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:30 localhost nova_compute[230642]: 2025-12-15 09:32:30.056 230646 DEBUG oslo_service.service [None req-49080c14-bd80-4b1c-bc6f-a59411562e0b - - - - - -] quota.server_group_members = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:30 localhost nova_compute[230642]: 2025-12-15 09:32:30.057 230646 DEBUG oslo_service.service [None req-49080c14-bd80-4b1c-bc6f-a59411562e0b - - - - - -] quota.server_groups = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:30 localhost 
nova_compute[230642]: 2025-12-15 09:32:30.057 230646 DEBUG oslo_service.service [None req-49080c14-bd80-4b1c-bc6f-a59411562e0b - - - - - -] rdp.enabled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:30 localhost nova_compute[230642]: 2025-12-15 09:32:30.057 230646 DEBUG oslo_service.service [None req-49080c14-bd80-4b1c-bc6f-a59411562e0b - - - - - -] rdp.html5_proxy_base_url = http://127.0.0.1:6083/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:30 localhost nova_compute[230642]: 2025-12-15 09:32:30.057 230646 DEBUG oslo_service.service [None req-49080c14-bd80-4b1c-bc6f-a59411562e0b - - - - - -] scheduler.discover_hosts_in_cells_interval = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:30 localhost nova_compute[230642]: 2025-12-15 09:32:30.057 230646 DEBUG oslo_service.service [None req-49080c14-bd80-4b1c-bc6f-a59411562e0b - - - - - -] scheduler.enable_isolated_aggregate_filtering = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:30 localhost nova_compute[230642]: 2025-12-15 09:32:30.057 230646 DEBUG oslo_service.service [None req-49080c14-bd80-4b1c-bc6f-a59411562e0b - - - - - -] scheduler.image_metadata_prefilter = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:30 localhost nova_compute[230642]: 2025-12-15 09:32:30.057 230646 DEBUG oslo_service.service [None req-49080c14-bd80-4b1c-bc6f-a59411562e0b - - - - - -] scheduler.limit_tenants_to_placement_aggregate = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:30 localhost nova_compute[230642]: 2025-12-15 09:32:30.058 230646 DEBUG oslo_service.service [None req-49080c14-bd80-4b1c-bc6f-a59411562e0b - - - - - -] scheduler.max_attempts = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 
04:32:30 localhost nova_compute[230642]: 2025-12-15 09:32:30.058 230646 DEBUG oslo_service.service [None req-49080c14-bd80-4b1c-bc6f-a59411562e0b - - - - - -] scheduler.max_placement_results = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:30 localhost nova_compute[230642]: 2025-12-15 09:32:30.058 230646 DEBUG oslo_service.service [None req-49080c14-bd80-4b1c-bc6f-a59411562e0b - - - - - -] scheduler.placement_aggregate_required_for_tenants = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:30 localhost nova_compute[230642]: 2025-12-15 09:32:30.058 230646 DEBUG oslo_service.service [None req-49080c14-bd80-4b1c-bc6f-a59411562e0b - - - - - -] scheduler.query_placement_for_availability_zone = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:30 localhost nova_compute[230642]: 2025-12-15 09:32:30.058 230646 DEBUG oslo_service.service [None req-49080c14-bd80-4b1c-bc6f-a59411562e0b - - - - - -] scheduler.query_placement_for_image_type_support = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:30 localhost nova_compute[230642]: 2025-12-15 09:32:30.058 230646 DEBUG oslo_service.service [None req-49080c14-bd80-4b1c-bc6f-a59411562e0b - - - - - -] scheduler.query_placement_for_routed_network_aggregates = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:30 localhost nova_compute[230642]: 2025-12-15 09:32:30.058 230646 DEBUG oslo_service.service [None req-49080c14-bd80-4b1c-bc6f-a59411562e0b - - - - - -] scheduler.workers = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:30 localhost nova_compute[230642]: 2025-12-15 09:32:30.058 230646 DEBUG oslo_service.service [None req-49080c14-bd80-4b1c-bc6f-a59411562e0b - - - - - -] filter_scheduler.aggregate_image_properties_isolation_namespace = 
None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:30 localhost nova_compute[230642]: 2025-12-15 09:32:30.059 230646 DEBUG oslo_service.service [None req-49080c14-bd80-4b1c-bc6f-a59411562e0b - - - - - -] filter_scheduler.aggregate_image_properties_isolation_separator = . log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:30 localhost nova_compute[230642]: 2025-12-15 09:32:30.059 230646 DEBUG oslo_service.service [None req-49080c14-bd80-4b1c-bc6f-a59411562e0b - - - - - -] filter_scheduler.available_filters = ['nova.scheduler.filters.all_filters'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:30 localhost nova_compute[230642]: 2025-12-15 09:32:30.059 230646 DEBUG oslo_service.service [None req-49080c14-bd80-4b1c-bc6f-a59411562e0b - - - - - -] filter_scheduler.build_failure_weight_multiplier = 1000000.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:30 localhost nova_compute[230642]: 2025-12-15 09:32:30.059 230646 DEBUG oslo_service.service [None req-49080c14-bd80-4b1c-bc6f-a59411562e0b - - - - - -] filter_scheduler.cpu_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:30 localhost nova_compute[230642]: 2025-12-15 09:32:30.059 230646 DEBUG oslo_service.service [None req-49080c14-bd80-4b1c-bc6f-a59411562e0b - - - - - -] filter_scheduler.cross_cell_move_weight_multiplier = 1000000.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:30 localhost nova_compute[230642]: 2025-12-15 09:32:30.059 230646 DEBUG oslo_service.service [None req-49080c14-bd80-4b1c-bc6f-a59411562e0b - - - - - -] filter_scheduler.disk_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:30 localhost nova_compute[230642]: 2025-12-15 09:32:30.059 230646 
DEBUG oslo_service.service [None req-49080c14-bd80-4b1c-bc6f-a59411562e0b - - - - - -] filter_scheduler.enabled_filters = ['ComputeFilter', 'ComputeCapabilitiesFilter', 'ImagePropertiesFilter', 'ServerGroupAntiAffinityFilter', 'ServerGroupAffinityFilter'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:30 localhost nova_compute[230642]: 2025-12-15 09:32:30.060 230646 DEBUG oslo_service.service [None req-49080c14-bd80-4b1c-bc6f-a59411562e0b - - - - - -] filter_scheduler.host_subset_size = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:30 localhost nova_compute[230642]: 2025-12-15 09:32:30.060 230646 DEBUG oslo_service.service [None req-49080c14-bd80-4b1c-bc6f-a59411562e0b - - - - - -] filter_scheduler.image_properties_default_architecture = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:30 localhost nova_compute[230642]: 2025-12-15 09:32:30.060 230646 DEBUG oslo_service.service [None req-49080c14-bd80-4b1c-bc6f-a59411562e0b - - - - - -] filter_scheduler.io_ops_weight_multiplier = -1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:30 localhost nova_compute[230642]: 2025-12-15 09:32:30.060 230646 DEBUG oslo_service.service [None req-49080c14-bd80-4b1c-bc6f-a59411562e0b - - - - - -] filter_scheduler.isolated_hosts = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:30 localhost nova_compute[230642]: 2025-12-15 09:32:30.060 230646 DEBUG oslo_service.service [None req-49080c14-bd80-4b1c-bc6f-a59411562e0b - - - - - -] filter_scheduler.isolated_images = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:30 localhost nova_compute[230642]: 2025-12-15 09:32:30.060 230646 DEBUG oslo_service.service [None req-49080c14-bd80-4b1c-bc6f-a59411562e0b - - - - - -] filter_scheduler.max_instances_per_host = 
50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:30 localhost nova_compute[230642]: 2025-12-15 09:32:30.060 230646 DEBUG oslo_service.service [None req-49080c14-bd80-4b1c-bc6f-a59411562e0b - - - - - -] filter_scheduler.max_io_ops_per_host = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:30 localhost nova_compute[230642]: 2025-12-15 09:32:30.061 230646 DEBUG oslo_service.service [None req-49080c14-bd80-4b1c-bc6f-a59411562e0b - - - - - -] filter_scheduler.pci_in_placement = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:30 localhost nova_compute[230642]: 2025-12-15 09:32:30.061 230646 DEBUG oslo_service.service [None req-49080c14-bd80-4b1c-bc6f-a59411562e0b - - - - - -] filter_scheduler.pci_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:30 localhost nova_compute[230642]: 2025-12-15 09:32:30.061 230646 DEBUG oslo_service.service [None req-49080c14-bd80-4b1c-bc6f-a59411562e0b - - - - - -] filter_scheduler.ram_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:30 localhost nova_compute[230642]: 2025-12-15 09:32:30.061 230646 DEBUG oslo_service.service [None req-49080c14-bd80-4b1c-bc6f-a59411562e0b - - - - - -] filter_scheduler.restrict_isolated_hosts_to_isolated_images = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:30 localhost nova_compute[230642]: 2025-12-15 09:32:30.061 230646 DEBUG oslo_service.service [None req-49080c14-bd80-4b1c-bc6f-a59411562e0b - - - - - -] filter_scheduler.shuffle_best_same_weighed_hosts = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:30 localhost nova_compute[230642]: 2025-12-15 09:32:30.061 230646 DEBUG oslo_service.service [None 
req-49080c14-bd80-4b1c-bc6f-a59411562e0b - - - - - -] filter_scheduler.soft_affinity_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:30 localhost nova_compute[230642]: 2025-12-15 09:32:30.061 230646 DEBUG oslo_service.service [None req-49080c14-bd80-4b1c-bc6f-a59411562e0b - - - - - -] filter_scheduler.soft_anti_affinity_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:30 localhost nova_compute[230642]: 2025-12-15 09:32:30.061 230646 DEBUG oslo_service.service [None req-49080c14-bd80-4b1c-bc6f-a59411562e0b - - - - - -] filter_scheduler.track_instance_changes = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:30 localhost nova_compute[230642]: 2025-12-15 09:32:30.062 230646 DEBUG oslo_service.service [None req-49080c14-bd80-4b1c-bc6f-a59411562e0b - - - - - -] filter_scheduler.weight_classes = ['nova.scheduler.weights.all_weighers'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:30 localhost nova_compute[230642]: 2025-12-15 09:32:30.062 230646 DEBUG oslo_service.service [None req-49080c14-bd80-4b1c-bc6f-a59411562e0b - - - - - -] metrics.required = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:30 localhost nova_compute[230642]: 2025-12-15 09:32:30.062 230646 DEBUG oslo_service.service [None req-49080c14-bd80-4b1c-bc6f-a59411562e0b - - - - - -] metrics.weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:30 localhost nova_compute[230642]: 2025-12-15 09:32:30.062 230646 DEBUG oslo_service.service [None req-49080c14-bd80-4b1c-bc6f-a59411562e0b - - - - - -] metrics.weight_of_unavailable = -10000.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:30 localhost nova_compute[230642]: 
2025-12-15 09:32:30.062 230646 DEBUG oslo_service.service [None req-49080c14-bd80-4b1c-bc6f-a59411562e0b - - - - - -] metrics.weight_setting = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:30 localhost nova_compute[230642]: 2025-12-15 09:32:30.062 230646 DEBUG oslo_service.service [None req-49080c14-bd80-4b1c-bc6f-a59411562e0b - - - - - -] serial_console.base_url = ws://127.0.0.1:6083/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:30 localhost nova_compute[230642]: 2025-12-15 09:32:30.062 230646 DEBUG oslo_service.service [None req-49080c14-bd80-4b1c-bc6f-a59411562e0b - - - - - -] serial_console.enabled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:30 localhost nova_compute[230642]: 2025-12-15 09:32:30.063 230646 DEBUG oslo_service.service [None req-49080c14-bd80-4b1c-bc6f-a59411562e0b - - - - - -] serial_console.port_range = 10000:20000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:30 localhost nova_compute[230642]: 2025-12-15 09:32:30.063 230646 DEBUG oslo_service.service [None req-49080c14-bd80-4b1c-bc6f-a59411562e0b - - - - - -] serial_console.proxyclient_address = 127.0.0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:30 localhost nova_compute[230642]: 2025-12-15 09:32:30.063 230646 DEBUG oslo_service.service [None req-49080c14-bd80-4b1c-bc6f-a59411562e0b - - - - - -] serial_console.serialproxy_host = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:30 localhost nova_compute[230642]: 2025-12-15 09:32:30.063 230646 DEBUG oslo_service.service [None req-49080c14-bd80-4b1c-bc6f-a59411562e0b - - - - - -] serial_console.serialproxy_port = 6083 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:30 localhost nova_compute[230642]: 
2025-12-15 09:32:30.063 230646 DEBUG oslo_service.service [None req-49080c14-bd80-4b1c-bc6f-a59411562e0b - - - - - -] service_user.auth_section = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:30 localhost nova_compute[230642]: 2025-12-15 09:32:30.063 230646 DEBUG oslo_service.service [None req-49080c14-bd80-4b1c-bc6f-a59411562e0b - - - - - -] service_user.auth_type = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:30 localhost nova_compute[230642]: 2025-12-15 09:32:30.063 230646 DEBUG oslo_service.service [None req-49080c14-bd80-4b1c-bc6f-a59411562e0b - - - - - -] service_user.cafile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:30 localhost nova_compute[230642]: 2025-12-15 09:32:30.064 230646 DEBUG oslo_service.service [None req-49080c14-bd80-4b1c-bc6f-a59411562e0b - - - - - -] service_user.certfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:30 localhost nova_compute[230642]: 2025-12-15 09:32:30.064 230646 DEBUG oslo_service.service [None req-49080c14-bd80-4b1c-bc6f-a59411562e0b - - - - - -] service_user.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:30 localhost nova_compute[230642]: 2025-12-15 09:32:30.064 230646 DEBUG oslo_service.service [None req-49080c14-bd80-4b1c-bc6f-a59411562e0b - - - - - -] service_user.insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:30 localhost nova_compute[230642]: 2025-12-15 09:32:30.064 230646 DEBUG oslo_service.service [None req-49080c14-bd80-4b1c-bc6f-a59411562e0b - - - - - -] service_user.keyfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:30 localhost nova_compute[230642]: 2025-12-15 09:32:30.064 230646 DEBUG oslo_service.service 
[None req-49080c14-bd80-4b1c-bc6f-a59411562e0b - - - - - -] service_user.send_service_user_token = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:30 localhost nova_compute[230642]: 2025-12-15 09:32:30.064 230646 DEBUG oslo_service.service [None req-49080c14-bd80-4b1c-bc6f-a59411562e0b - - - - - -] service_user.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:30 localhost nova_compute[230642]: 2025-12-15 09:32:30.064 230646 DEBUG oslo_service.service [None req-49080c14-bd80-4b1c-bc6f-a59411562e0b - - - - - -] service_user.timeout = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:30 localhost nova_compute[230642]: 2025-12-15 09:32:30.065 230646 DEBUG oslo_service.service [None req-49080c14-bd80-4b1c-bc6f-a59411562e0b - - - - - -] spice.agent_enabled = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:30 localhost nova_compute[230642]: 2025-12-15 09:32:30.065 230646 DEBUG oslo_service.service [None req-49080c14-bd80-4b1c-bc6f-a59411562e0b - - - - - -] spice.enabled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:30 localhost nova_compute[230642]: 2025-12-15 09:32:30.065 230646 DEBUG oslo_service.service [None req-49080c14-bd80-4b1c-bc6f-a59411562e0b - - - - - -] spice.html5proxy_base_url = http://127.0.0.1:6082/spice_auto.html log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:30 localhost nova_compute[230642]: 2025-12-15 09:32:30.065 230646 DEBUG oslo_service.service [None req-49080c14-bd80-4b1c-bc6f-a59411562e0b - - - - - -] spice.html5proxy_host = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:30 localhost nova_compute[230642]: 2025-12-15 09:32:30.065 230646 DEBUG oslo_service.service [None 
req-49080c14-bd80-4b1c-bc6f-a59411562e0b - - - - - -] spice.html5proxy_port = 6082 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:30 localhost nova_compute[230642]: 2025-12-15 09:32:30.065 230646 DEBUG oslo_service.service [None req-49080c14-bd80-4b1c-bc6f-a59411562e0b - - - - - -] spice.image_compression = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:30 localhost nova_compute[230642]: 2025-12-15 09:32:30.065 230646 DEBUG oslo_service.service [None req-49080c14-bd80-4b1c-bc6f-a59411562e0b - - - - - -] spice.jpeg_compression = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:30 localhost nova_compute[230642]: 2025-12-15 09:32:30.066 230646 DEBUG oslo_service.service [None req-49080c14-bd80-4b1c-bc6f-a59411562e0b - - - - - -] spice.playback_compression = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:30 localhost nova_compute[230642]: 2025-12-15 09:32:30.066 230646 DEBUG oslo_service.service [None req-49080c14-bd80-4b1c-bc6f-a59411562e0b - - - - - -] spice.server_listen = 127.0.0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:30 localhost nova_compute[230642]: 2025-12-15 09:32:30.066 230646 DEBUG oslo_service.service [None req-49080c14-bd80-4b1c-bc6f-a59411562e0b - - - - - -] spice.server_proxyclient_address = 127.0.0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:30 localhost nova_compute[230642]: 2025-12-15 09:32:30.066 230646 DEBUG oslo_service.service [None req-49080c14-bd80-4b1c-bc6f-a59411562e0b - - - - - -] spice.streaming_mode = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:30 localhost nova_compute[230642]: 2025-12-15 09:32:30.066 230646 DEBUG oslo_service.service [None req-49080c14-bd80-4b1c-bc6f-a59411562e0b - - - - 
- -] spice.zlib_compression = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:30 localhost nova_compute[230642]: 2025-12-15 09:32:30.066 230646 DEBUG oslo_service.service [None req-49080c14-bd80-4b1c-bc6f-a59411562e0b - - - - - -] upgrade_levels.baseapi = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:30 localhost nova_compute[230642]: 2025-12-15 09:32:30.066 230646 DEBUG oslo_service.service [None req-49080c14-bd80-4b1c-bc6f-a59411562e0b - - - - - -] upgrade_levels.cert = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:30 localhost nova_compute[230642]: 2025-12-15 09:32:30.066 230646 DEBUG oslo_service.service [None req-49080c14-bd80-4b1c-bc6f-a59411562e0b - - - - - -] upgrade_levels.compute = auto log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:30 localhost nova_compute[230642]: 2025-12-15 09:32:30.067 230646 DEBUG oslo_service.service [None req-49080c14-bd80-4b1c-bc6f-a59411562e0b - - - - - -] upgrade_levels.conductor = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:30 localhost nova_compute[230642]: 2025-12-15 09:32:30.067 230646 DEBUG oslo_service.service [None req-49080c14-bd80-4b1c-bc6f-a59411562e0b - - - - - -] upgrade_levels.scheduler = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:30 localhost nova_compute[230642]: 2025-12-15 09:32:30.067 230646 DEBUG oslo_service.service [None req-49080c14-bd80-4b1c-bc6f-a59411562e0b - - - - - -] vendordata_dynamic_auth.auth_section = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:30 localhost nova_compute[230642]: 2025-12-15 09:32:30.067 230646 DEBUG oslo_service.service [None req-49080c14-bd80-4b1c-bc6f-a59411562e0b - - - - - -] vendordata_dynamic_auth.auth_type = None 
log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:30 localhost nova_compute[230642]: 2025-12-15 09:32:30.067 230646 DEBUG oslo_service.service [None req-49080c14-bd80-4b1c-bc6f-a59411562e0b - - - - - -] vendordata_dynamic_auth.cafile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:30 localhost nova_compute[230642]: 2025-12-15 09:32:30.067 230646 DEBUG oslo_service.service [None req-49080c14-bd80-4b1c-bc6f-a59411562e0b - - - - - -] vendordata_dynamic_auth.certfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:30 localhost nova_compute[230642]: 2025-12-15 09:32:30.067 230646 DEBUG oslo_service.service [None req-49080c14-bd80-4b1c-bc6f-a59411562e0b - - - - - -] vendordata_dynamic_auth.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:30 localhost nova_compute[230642]: 2025-12-15 09:32:30.068 230646 DEBUG oslo_service.service [None req-49080c14-bd80-4b1c-bc6f-a59411562e0b - - - - - -] vendordata_dynamic_auth.insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:30 localhost nova_compute[230642]: 2025-12-15 09:32:30.068 230646 DEBUG oslo_service.service [None req-49080c14-bd80-4b1c-bc6f-a59411562e0b - - - - - -] vendordata_dynamic_auth.keyfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:30 localhost nova_compute[230642]: 2025-12-15 09:32:30.068 230646 DEBUG oslo_service.service [None req-49080c14-bd80-4b1c-bc6f-a59411562e0b - - - - - -] vendordata_dynamic_auth.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:30 localhost nova_compute[230642]: 2025-12-15 09:32:30.068 230646 DEBUG oslo_service.service [None req-49080c14-bd80-4b1c-bc6f-a59411562e0b - - - - - -] 
vendordata_dynamic_auth.timeout = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:30 localhost nova_compute[230642]: 2025-12-15 09:32:30.068 230646 DEBUG oslo_service.service [None req-49080c14-bd80-4b1c-bc6f-a59411562e0b - - - - - -] vmware.api_retry_count = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:30 localhost nova_compute[230642]: 2025-12-15 09:32:30.068 230646 DEBUG oslo_service.service [None req-49080c14-bd80-4b1c-bc6f-a59411562e0b - - - - - -] vmware.ca_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:30 localhost nova_compute[230642]: 2025-12-15 09:32:30.068 230646 DEBUG oslo_service.service [None req-49080c14-bd80-4b1c-bc6f-a59411562e0b - - - - - -] vmware.cache_prefix = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:30 localhost nova_compute[230642]: 2025-12-15 09:32:30.068 230646 DEBUG oslo_service.service [None req-49080c14-bd80-4b1c-bc6f-a59411562e0b - - - - - -] vmware.cluster_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:30 localhost nova_compute[230642]: 2025-12-15 09:32:30.069 230646 DEBUG oslo_service.service [None req-49080c14-bd80-4b1c-bc6f-a59411562e0b - - - - - -] vmware.connection_pool_size = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:30 localhost nova_compute[230642]: 2025-12-15 09:32:30.069 230646 DEBUG oslo_service.service [None req-49080c14-bd80-4b1c-bc6f-a59411562e0b - - - - - -] vmware.console_delay_seconds = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:30 localhost nova_compute[230642]: 2025-12-15 09:32:30.069 230646 DEBUG oslo_service.service [None req-49080c14-bd80-4b1c-bc6f-a59411562e0b - - - - - -] vmware.datastore_regex = None log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:30 localhost nova_compute[230642]: 2025-12-15 09:32:30.069 230646 DEBUG oslo_service.service [None req-49080c14-bd80-4b1c-bc6f-a59411562e0b - - - - - -] vmware.host_ip = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:30 localhost nova_compute[230642]: 2025-12-15 09:32:30.069 230646 DEBUG oslo_service.service [None req-49080c14-bd80-4b1c-bc6f-a59411562e0b - - - - - -] vmware.host_password = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:30 localhost nova_compute[230642]: 2025-12-15 09:32:30.069 230646 DEBUG oslo_service.service [None req-49080c14-bd80-4b1c-bc6f-a59411562e0b - - - - - -] vmware.host_port = 443 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:30 localhost nova_compute[230642]: 2025-12-15 09:32:30.069 230646 DEBUG oslo_service.service [None req-49080c14-bd80-4b1c-bc6f-a59411562e0b - - - - - -] vmware.host_username = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:30 localhost nova_compute[230642]: 2025-12-15 09:32:30.069 230646 DEBUG oslo_service.service [None req-49080c14-bd80-4b1c-bc6f-a59411562e0b - - - - - -] vmware.insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:30 localhost nova_compute[230642]: 2025-12-15 09:32:30.070 230646 DEBUG oslo_service.service [None req-49080c14-bd80-4b1c-bc6f-a59411562e0b - - - - - -] vmware.integration_bridge = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:30 localhost nova_compute[230642]: 2025-12-15 09:32:30.070 230646 DEBUG oslo_service.service [None req-49080c14-bd80-4b1c-bc6f-a59411562e0b - - - - - -] vmware.maximum_objects = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:30 localhost 
nova_compute[230642]: 2025-12-15 09:32:30.070 230646 DEBUG oslo_service.service [None req-49080c14-bd80-4b1c-bc6f-a59411562e0b - - - - - -] vmware.pbm_default_policy = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:30 localhost nova_compute[230642]: 2025-12-15 09:32:30.070 230646 DEBUG oslo_service.service [None req-49080c14-bd80-4b1c-bc6f-a59411562e0b - - - - - -] vmware.pbm_enabled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:30 localhost nova_compute[230642]: 2025-12-15 09:32:30.070 230646 DEBUG oslo_service.service [None req-49080c14-bd80-4b1c-bc6f-a59411562e0b - - - - - -] vmware.pbm_wsdl_location = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:30 localhost nova_compute[230642]: 2025-12-15 09:32:30.070 230646 DEBUG oslo_service.service [None req-49080c14-bd80-4b1c-bc6f-a59411562e0b - - - - - -] vmware.serial_log_dir = /opt/vmware/vspc log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:30 localhost nova_compute[230642]: 2025-12-15 09:32:30.070 230646 DEBUG oslo_service.service [None req-49080c14-bd80-4b1c-bc6f-a59411562e0b - - - - - -] vmware.serial_port_proxy_uri = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:30 localhost nova_compute[230642]: 2025-12-15 09:32:30.071 230646 DEBUG oslo_service.service [None req-49080c14-bd80-4b1c-bc6f-a59411562e0b - - - - - -] vmware.serial_port_service_uri = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:30 localhost nova_compute[230642]: 2025-12-15 09:32:30.071 230646 DEBUG oslo_service.service [None req-49080c14-bd80-4b1c-bc6f-a59411562e0b - - - - - -] vmware.task_poll_interval = 0.5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:30 localhost nova_compute[230642]: 2025-12-15 
09:32:30.071 230646 DEBUG oslo_service.service [None req-49080c14-bd80-4b1c-bc6f-a59411562e0b - - - - - -] vmware.use_linked_clone = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:30 localhost nova_compute[230642]: 2025-12-15 09:32:30.071 230646 DEBUG oslo_service.service [None req-49080c14-bd80-4b1c-bc6f-a59411562e0b - - - - - -] vmware.vnc_keymap = en-us log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:30 localhost nova_compute[230642]: 2025-12-15 09:32:30.071 230646 DEBUG oslo_service.service [None req-49080c14-bd80-4b1c-bc6f-a59411562e0b - - - - - -] vmware.vnc_port = 5900 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:30 localhost nova_compute[230642]: 2025-12-15 09:32:30.071 230646 DEBUG oslo_service.service [None req-49080c14-bd80-4b1c-bc6f-a59411562e0b - - - - - -] vmware.vnc_port_total = 10000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:30 localhost nova_compute[230642]: 2025-12-15 09:32:30.071 230646 DEBUG oslo_service.service [None req-49080c14-bd80-4b1c-bc6f-a59411562e0b - - - - - -] vnc.auth_schemes = ['none'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:30 localhost nova_compute[230642]: 2025-12-15 09:32:30.071 230646 DEBUG oslo_service.service [None req-49080c14-bd80-4b1c-bc6f-a59411562e0b - - - - - -] vnc.enabled = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:30 localhost nova_compute[230642]: 2025-12-15 09:32:30.072 230646 DEBUG oslo_service.service [None req-49080c14-bd80-4b1c-bc6f-a59411562e0b - - - - - -] vnc.novncproxy_base_url = http://nova-novncproxy-cell1-public-openstack.apps-crc.testing/vnc_lite.html log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:30 localhost nova_compute[230642]: 2025-12-15 09:32:30.072 
230646 DEBUG oslo_service.service [None req-49080c14-bd80-4b1c-bc6f-a59411562e0b - - - - - -] vnc.novncproxy_host = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:30 localhost nova_compute[230642]: 2025-12-15 09:32:30.072 230646 DEBUG oslo_service.service [None req-49080c14-bd80-4b1c-bc6f-a59411562e0b - - - - - -] vnc.novncproxy_port = 6080 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:30 localhost nova_compute[230642]: 2025-12-15 09:32:30.072 230646 DEBUG oslo_service.service [None req-49080c14-bd80-4b1c-bc6f-a59411562e0b - - - - - -] vnc.server_listen = ::0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:30 localhost nova_compute[230642]: 2025-12-15 09:32:30.072 230646 DEBUG oslo_service.service [None req-49080c14-bd80-4b1c-bc6f-a59411562e0b - - - - - -] vnc.server_proxyclient_address = 192.168.122.106 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:30 localhost nova_compute[230642]: 2025-12-15 09:32:30.072 230646 DEBUG oslo_service.service [None req-49080c14-bd80-4b1c-bc6f-a59411562e0b - - - - - -] vnc.vencrypt_ca_certs = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:30 localhost nova_compute[230642]: 2025-12-15 09:32:30.073 230646 DEBUG oslo_service.service [None req-49080c14-bd80-4b1c-bc6f-a59411562e0b - - - - - -] vnc.vencrypt_client_cert = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:30 localhost nova_compute[230642]: 2025-12-15 09:32:30.073 230646 DEBUG oslo_service.service [None req-49080c14-bd80-4b1c-bc6f-a59411562e0b - - - - - -] vnc.vencrypt_client_key = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:30 localhost nova_compute[230642]: 2025-12-15 09:32:30.073 230646 DEBUG oslo_service.service [None 
req-49080c14-bd80-4b1c-bc6f-a59411562e0b - - - - - -] workarounds.disable_compute_service_check_for_ffu = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:30 localhost nova_compute[230642]: 2025-12-15 09:32:30.073 230646 DEBUG oslo_service.service [None req-49080c14-bd80-4b1c-bc6f-a59411562e0b - - - - - -] workarounds.disable_deep_image_inspection = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:30 localhost nova_compute[230642]: 2025-12-15 09:32:30.073 230646 DEBUG oslo_service.service [None req-49080c14-bd80-4b1c-bc6f-a59411562e0b - - - - - -] workarounds.disable_fallback_pcpu_query = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:30 localhost nova_compute[230642]: 2025-12-15 09:32:30.073 230646 DEBUG oslo_service.service [None req-49080c14-bd80-4b1c-bc6f-a59411562e0b - - - - - -] workarounds.disable_group_policy_check_upcall = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:30 localhost nova_compute[230642]: 2025-12-15 09:32:30.073 230646 DEBUG oslo_service.service [None req-49080c14-bd80-4b1c-bc6f-a59411562e0b - - - - - -] workarounds.disable_libvirt_livesnapshot = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:30 localhost nova_compute[230642]: 2025-12-15 09:32:30.074 230646 DEBUG oslo_service.service [None req-49080c14-bd80-4b1c-bc6f-a59411562e0b - - - - - -] workarounds.disable_rootwrap = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:30 localhost nova_compute[230642]: 2025-12-15 09:32:30.074 230646 DEBUG oslo_service.service [None req-49080c14-bd80-4b1c-bc6f-a59411562e0b - - - - - -] workarounds.enable_numa_live_migration = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:30 localhost nova_compute[230642]: 
2025-12-15 09:32:30.074 230646 DEBUG oslo_service.service [None req-49080c14-bd80-4b1c-bc6f-a59411562e0b - - - - - -] workarounds.enable_qemu_monitor_announce_self = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:30 localhost nova_compute[230642]: 2025-12-15 09:32:30.074 230646 DEBUG oslo_service.service [None req-49080c14-bd80-4b1c-bc6f-a59411562e0b - - - - - -] workarounds.ensure_libvirt_rbd_instance_dir_cleanup = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:30 localhost nova_compute[230642]: 2025-12-15 09:32:30.074 230646 DEBUG oslo_service.service [None req-49080c14-bd80-4b1c-bc6f-a59411562e0b - - - - - -] workarounds.handle_virt_lifecycle_events = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:30 localhost nova_compute[230642]: 2025-12-15 09:32:30.074 230646 DEBUG oslo_service.service [None req-49080c14-bd80-4b1c-bc6f-a59411562e0b - - - - - -] workarounds.libvirt_disable_apic = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:30 localhost nova_compute[230642]: 2025-12-15 09:32:30.074 230646 DEBUG oslo_service.service [None req-49080c14-bd80-4b1c-bc6f-a59411562e0b - - - - - -] workarounds.never_download_image_if_on_rbd = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:30 localhost nova_compute[230642]: 2025-12-15 09:32:30.075 230646 DEBUG oslo_service.service [None req-49080c14-bd80-4b1c-bc6f-a59411562e0b - - - - - -] workarounds.qemu_monitor_announce_self_count = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:30 localhost nova_compute[230642]: 2025-12-15 09:32:30.075 230646 DEBUG oslo_service.service [None req-49080c14-bd80-4b1c-bc6f-a59411562e0b - - - - - -] workarounds.qemu_monitor_announce_self_interval = 1 log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:30 localhost nova_compute[230642]: 2025-12-15 09:32:30.075 230646 DEBUG oslo_service.service [None req-49080c14-bd80-4b1c-bc6f-a59411562e0b - - - - - -] workarounds.reserve_disk_resource_for_image_cache = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:30 localhost nova_compute[230642]: 2025-12-15 09:32:30.075 230646 DEBUG oslo_service.service [None req-49080c14-bd80-4b1c-bc6f-a59411562e0b - - - - - -] workarounds.skip_cpu_compare_at_startup = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:30 localhost nova_compute[230642]: 2025-12-15 09:32:30.075 230646 DEBUG oslo_service.service [None req-49080c14-bd80-4b1c-bc6f-a59411562e0b - - - - - -] workarounds.skip_cpu_compare_on_dest = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:30 localhost nova_compute[230642]: 2025-12-15 09:32:30.075 230646 DEBUG oslo_service.service [None req-49080c14-bd80-4b1c-bc6f-a59411562e0b - - - - - -] workarounds.skip_hypervisor_version_check_on_lm = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:30 localhost nova_compute[230642]: 2025-12-15 09:32:30.075 230646 DEBUG oslo_service.service [None req-49080c14-bd80-4b1c-bc6f-a59411562e0b - - - - - -] workarounds.skip_reserve_in_use_ironic_nodes = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:30 localhost nova_compute[230642]: 2025-12-15 09:32:30.076 230646 DEBUG oslo_service.service [None req-49080c14-bd80-4b1c-bc6f-a59411562e0b - - - - - -] workarounds.unified_limits_count_pcpu_as_vcpu = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:30 localhost nova_compute[230642]: 2025-12-15 09:32:30.076 230646 DEBUG oslo_service.service [None 
req-49080c14-bd80-4b1c-bc6f-a59411562e0b - - - - - -] workarounds.wait_for_vif_plugged_event_during_hard_reboot = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:30 localhost nova_compute[230642]: 2025-12-15 09:32:30.076 230646 DEBUG oslo_service.service [None req-49080c14-bd80-4b1c-bc6f-a59411562e0b - - - - - -] wsgi.api_paste_config = api-paste.ini log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:30 localhost nova_compute[230642]: 2025-12-15 09:32:30.076 230646 DEBUG oslo_service.service [None req-49080c14-bd80-4b1c-bc6f-a59411562e0b - - - - - -] wsgi.client_socket_timeout = 900 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:30 localhost nova_compute[230642]: 2025-12-15 09:32:30.076 230646 DEBUG oslo_service.service [None req-49080c14-bd80-4b1c-bc6f-a59411562e0b - - - - - -] wsgi.default_pool_size = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:30 localhost nova_compute[230642]: 2025-12-15 09:32:30.076 230646 DEBUG oslo_service.service [None req-49080c14-bd80-4b1c-bc6f-a59411562e0b - - - - - -] wsgi.keep_alive = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:30 localhost nova_compute[230642]: 2025-12-15 09:32:30.076 230646 DEBUG oslo_service.service [None req-49080c14-bd80-4b1c-bc6f-a59411562e0b - - - - - -] wsgi.max_header_line = 16384 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:30 localhost nova_compute[230642]: 2025-12-15 09:32:30.077 230646 DEBUG oslo_service.service [None req-49080c14-bd80-4b1c-bc6f-a59411562e0b - - - - - -] wsgi.secure_proxy_ssl_header = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:30 localhost nova_compute[230642]: 2025-12-15 09:32:30.077 230646 DEBUG oslo_service.service [None 
req-49080c14-bd80-4b1c-bc6f-a59411562e0b - - - - - -] wsgi.ssl_ca_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:30 localhost nova_compute[230642]: 2025-12-15 09:32:30.077 230646 DEBUG oslo_service.service [None req-49080c14-bd80-4b1c-bc6f-a59411562e0b - - - - - -] wsgi.ssl_cert_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:30 localhost nova_compute[230642]: 2025-12-15 09:32:30.077 230646 DEBUG oslo_service.service [None req-49080c14-bd80-4b1c-bc6f-a59411562e0b - - - - - -] wsgi.ssl_key_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:30 localhost nova_compute[230642]: 2025-12-15 09:32:30.077 230646 DEBUG oslo_service.service [None req-49080c14-bd80-4b1c-bc6f-a59411562e0b - - - - - -] wsgi.tcp_keepidle = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:30 localhost nova_compute[230642]: 2025-12-15 09:32:30.077 230646 DEBUG oslo_service.service [None req-49080c14-bd80-4b1c-bc6f-a59411562e0b - - - - - -] wsgi.wsgi_log_format = %(client_ip)s "%(request_line)s" status: %(status_code)s len: %(body_length)s time: %(wall_seconds).7f log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:30 localhost nova_compute[230642]: 2025-12-15 09:32:30.077 230646 DEBUG oslo_service.service [None req-49080c14-bd80-4b1c-bc6f-a59411562e0b - - - - - -] zvm.ca_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:30 localhost nova_compute[230642]: 2025-12-15 09:32:30.077 230646 DEBUG oslo_service.service [None req-49080c14-bd80-4b1c-bc6f-a59411562e0b - - - - - -] zvm.cloud_connector_url = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:30 localhost nova_compute[230642]: 2025-12-15 09:32:30.078 230646 DEBUG oslo_service.service [None 
req-49080c14-bd80-4b1c-bc6f-a59411562e0b - - - - - -] zvm.image_tmp_path = /var/lib/nova/images log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:30 localhost nova_compute[230642]: 2025-12-15 09:32:30.078 230646 DEBUG oslo_service.service [None req-49080c14-bd80-4b1c-bc6f-a59411562e0b - - - - - -] zvm.reachable_timeout = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:30 localhost nova_compute[230642]: 2025-12-15 09:32:30.078 230646 DEBUG oslo_service.service [None req-49080c14-bd80-4b1c-bc6f-a59411562e0b - - - - - -] oslo_policy.enforce_new_defaults = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:30 localhost nova_compute[230642]: 2025-12-15 09:32:30.078 230646 DEBUG oslo_service.service [None req-49080c14-bd80-4b1c-bc6f-a59411562e0b - - - - - -] oslo_policy.enforce_scope = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:30 localhost nova_compute[230642]: 2025-12-15 09:32:30.078 230646 DEBUG oslo_service.service [None req-49080c14-bd80-4b1c-bc6f-a59411562e0b - - - - - -] oslo_policy.policy_default_rule = default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:30 localhost nova_compute[230642]: 2025-12-15 09:32:30.078 230646 DEBUG oslo_service.service [None req-49080c14-bd80-4b1c-bc6f-a59411562e0b - - - - - -] oslo_policy.policy_dirs = ['policy.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:30 localhost nova_compute[230642]: 2025-12-15 09:32:30.078 230646 DEBUG oslo_service.service [None req-49080c14-bd80-4b1c-bc6f-a59411562e0b - - - - - -] oslo_policy.policy_file = policy.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:30 localhost nova_compute[230642]: 2025-12-15 09:32:30.079 230646 DEBUG oslo_service.service [None 
req-49080c14-bd80-4b1c-bc6f-a59411562e0b - - - - - -] oslo_policy.remote_content_type = application/x-www-form-urlencoded log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:30 localhost nova_compute[230642]: 2025-12-15 09:32:30.079 230646 DEBUG oslo_service.service [None req-49080c14-bd80-4b1c-bc6f-a59411562e0b - - - - - -] oslo_policy.remote_ssl_ca_crt_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:30 localhost nova_compute[230642]: 2025-12-15 09:32:30.079 230646 DEBUG oslo_service.service [None req-49080c14-bd80-4b1c-bc6f-a59411562e0b - - - - - -] oslo_policy.remote_ssl_client_crt_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:30 localhost nova_compute[230642]: 2025-12-15 09:32:30.079 230646 DEBUG oslo_service.service [None req-49080c14-bd80-4b1c-bc6f-a59411562e0b - - - - - -] oslo_policy.remote_ssl_client_key_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:30 localhost nova_compute[230642]: 2025-12-15 09:32:30.079 230646 DEBUG oslo_service.service [None req-49080c14-bd80-4b1c-bc6f-a59411562e0b - - - - - -] oslo_policy.remote_ssl_verify_server_crt = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:30 localhost nova_compute[230642]: 2025-12-15 09:32:30.079 230646 DEBUG oslo_service.service [None req-49080c14-bd80-4b1c-bc6f-a59411562e0b - - - - - -] oslo_versionedobjects.fatal_exception_format_errors = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:30 localhost nova_compute[230642]: 2025-12-15 09:32:30.079 230646 DEBUG oslo_service.service [None req-49080c14-bd80-4b1c-bc6f-a59411562e0b - - - - - -] oslo_middleware.http_basic_auth_user_file = /etc/htpasswd log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:30 
localhost nova_compute[230642]: 2025-12-15 09:32:30.079 230646 DEBUG oslo_service.service [None req-49080c14-bd80-4b1c-bc6f-a59411562e0b - - - - - -] remote_debug.host = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:30 localhost nova_compute[230642]: 2025-12-15 09:32:30.080 230646 DEBUG oslo_service.service [None req-49080c14-bd80-4b1c-bc6f-a59411562e0b - - - - - -] remote_debug.port = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:30 localhost nova_compute[230642]: 2025-12-15 09:32:30.080 230646 DEBUG oslo_service.service [None req-49080c14-bd80-4b1c-bc6f-a59411562e0b - - - - - -] oslo_messaging_rabbit.amqp_auto_delete = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:30 localhost nova_compute[230642]: 2025-12-15 09:32:30.080 230646 DEBUG oslo_service.service [None req-49080c14-bd80-4b1c-bc6f-a59411562e0b - - - - - -] oslo_messaging_rabbit.amqp_durable_queues = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:30 localhost nova_compute[230642]: 2025-12-15 09:32:30.080 230646 DEBUG oslo_service.service [None req-49080c14-bd80-4b1c-bc6f-a59411562e0b - - - - - -] oslo_messaging_rabbit.conn_pool_min_size = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:30 localhost nova_compute[230642]: 2025-12-15 09:32:30.080 230646 DEBUG oslo_service.service [None req-49080c14-bd80-4b1c-bc6f-a59411562e0b - - - - - -] oslo_messaging_rabbit.conn_pool_ttl = 1200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:30 localhost nova_compute[230642]: 2025-12-15 09:32:30.080 230646 DEBUG oslo_service.service [None req-49080c14-bd80-4b1c-bc6f-a59411562e0b - - - - - -] oslo_messaging_rabbit.direct_mandatory_flag = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 
04:32:30 localhost nova_compute[230642]: 2025-12-15 09:32:30.080 230646 DEBUG oslo_service.service [None req-49080c14-bd80-4b1c-bc6f-a59411562e0b - - - - - -] oslo_messaging_rabbit.enable_cancel_on_failover = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:30 localhost nova_compute[230642]: 2025-12-15 09:32:30.081 230646 DEBUG oslo_service.service [None req-49080c14-bd80-4b1c-bc6f-a59411562e0b - - - - - -] oslo_messaging_rabbit.heartbeat_in_pthread = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:30 localhost nova_compute[230642]: 2025-12-15 09:32:30.081 230646 DEBUG oslo_service.service [None req-49080c14-bd80-4b1c-bc6f-a59411562e0b - - - - - -] oslo_messaging_rabbit.heartbeat_rate = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:30 localhost nova_compute[230642]: 2025-12-15 09:32:30.081 230646 DEBUG oslo_service.service [None req-49080c14-bd80-4b1c-bc6f-a59411562e0b - - - - - -] oslo_messaging_rabbit.heartbeat_timeout_threshold = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:30 localhost nova_compute[230642]: 2025-12-15 09:32:30.081 230646 DEBUG oslo_service.service [None req-49080c14-bd80-4b1c-bc6f-a59411562e0b - - - - - -] oslo_messaging_rabbit.kombu_compression = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:30 localhost nova_compute[230642]: 2025-12-15 09:32:30.081 230646 DEBUG oslo_service.service [None req-49080c14-bd80-4b1c-bc6f-a59411562e0b - - - - - -] oslo_messaging_rabbit.kombu_failover_strategy = round-robin log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:30 localhost nova_compute[230642]: 2025-12-15 09:32:30.081 230646 DEBUG oslo_service.service [None req-49080c14-bd80-4b1c-bc6f-a59411562e0b - - - - - -] 
oslo_messaging_rabbit.kombu_missing_consumer_retry_timeout = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:30 localhost nova_compute[230642]: 2025-12-15 09:32:30.081 230646 DEBUG oslo_service.service [None req-49080c14-bd80-4b1c-bc6f-a59411562e0b - - - - - -] oslo_messaging_rabbit.kombu_reconnect_delay = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:30 localhost nova_compute[230642]: 2025-12-15 09:32:30.082 230646 DEBUG oslo_service.service [None req-49080c14-bd80-4b1c-bc6f-a59411562e0b - - - - - -] oslo_messaging_rabbit.rabbit_ha_queues = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:30 localhost nova_compute[230642]: 2025-12-15 09:32:30.082 230646 DEBUG oslo_service.service [None req-49080c14-bd80-4b1c-bc6f-a59411562e0b - - - - - -] oslo_messaging_rabbit.rabbit_interval_max = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:30 localhost nova_compute[230642]: 2025-12-15 09:32:30.082 230646 DEBUG oslo_service.service [None req-49080c14-bd80-4b1c-bc6f-a59411562e0b - - - - - -] oslo_messaging_rabbit.rabbit_login_method = AMQPLAIN log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:30 localhost nova_compute[230642]: 2025-12-15 09:32:30.082 230646 DEBUG oslo_service.service [None req-49080c14-bd80-4b1c-bc6f-a59411562e0b - - - - - -] oslo_messaging_rabbit.rabbit_qos_prefetch_count = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:30 localhost nova_compute[230642]: 2025-12-15 09:32:30.082 230646 DEBUG oslo_service.service [None req-49080c14-bd80-4b1c-bc6f-a59411562e0b - - - - - -] oslo_messaging_rabbit.rabbit_quorum_delivery_limit = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:30 localhost nova_compute[230642]: 2025-12-15 09:32:30.082 
230646 DEBUG oslo_service.service [None req-49080c14-bd80-4b1c-bc6f-a59411562e0b - - - - - -] oslo_messaging_rabbit.rabbit_quorum_max_memory_bytes = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:30 localhost nova_compute[230642]: 2025-12-15 09:32:30.082 230646 DEBUG oslo_service.service [None req-49080c14-bd80-4b1c-bc6f-a59411562e0b - - - - - -] oslo_messaging_rabbit.rabbit_quorum_max_memory_length = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:30 localhost nova_compute[230642]: 2025-12-15 09:32:30.082 230646 DEBUG oslo_service.service [None req-49080c14-bd80-4b1c-bc6f-a59411562e0b - - - - - -] oslo_messaging_rabbit.rabbit_quorum_queue = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:30 localhost nova_compute[230642]: 2025-12-15 09:32:30.083 230646 DEBUG oslo_service.service [None req-49080c14-bd80-4b1c-bc6f-a59411562e0b - - - - - -] oslo_messaging_rabbit.rabbit_retry_backoff = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:30 localhost nova_compute[230642]: 2025-12-15 09:32:30.083 230646 DEBUG oslo_service.service [None req-49080c14-bd80-4b1c-bc6f-a59411562e0b - - - - - -] oslo_messaging_rabbit.rabbit_retry_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:30 localhost nova_compute[230642]: 2025-12-15 09:32:30.083 230646 DEBUG oslo_service.service [None req-49080c14-bd80-4b1c-bc6f-a59411562e0b - - - - - -] oslo_messaging_rabbit.rabbit_transient_queues_ttl = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:30 localhost nova_compute[230642]: 2025-12-15 09:32:30.083 230646 DEBUG oslo_service.service [None req-49080c14-bd80-4b1c-bc6f-a59411562e0b - - - - - -] oslo_messaging_rabbit.rpc_conn_pool_size = 30 log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:30 localhost nova_compute[230642]: 2025-12-15 09:32:30.083 230646 DEBUG oslo_service.service [None req-49080c14-bd80-4b1c-bc6f-a59411562e0b - - - - - -] oslo_messaging_rabbit.ssl = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:30 localhost nova_compute[230642]: 2025-12-15 09:32:30.083 230646 DEBUG oslo_service.service [None req-49080c14-bd80-4b1c-bc6f-a59411562e0b - - - - - -] oslo_messaging_rabbit.ssl_ca_file = log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:30 localhost nova_compute[230642]: 2025-12-15 09:32:30.083 230646 DEBUG oslo_service.service [None req-49080c14-bd80-4b1c-bc6f-a59411562e0b - - - - - -] oslo_messaging_rabbit.ssl_cert_file = log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:30 localhost nova_compute[230642]: 2025-12-15 09:32:30.084 230646 DEBUG oslo_service.service [None req-49080c14-bd80-4b1c-bc6f-a59411562e0b - - - - - -] oslo_messaging_rabbit.ssl_enforce_fips_mode = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:30 localhost nova_compute[230642]: 2025-12-15 09:32:30.084 230646 DEBUG oslo_service.service [None req-49080c14-bd80-4b1c-bc6f-a59411562e0b - - - - - -] oslo_messaging_rabbit.ssl_key_file = log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:30 localhost nova_compute[230642]: 2025-12-15 09:32:30.084 230646 DEBUG oslo_service.service [None req-49080c14-bd80-4b1c-bc6f-a59411562e0b - - - - - -] oslo_messaging_rabbit.ssl_version = log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:30 localhost nova_compute[230642]: 2025-12-15 09:32:30.084 230646 DEBUG oslo_service.service [None req-49080c14-bd80-4b1c-bc6f-a59411562e0b - - - - - -] oslo_messaging_notifications.driver = ['noop'] 
log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:30 localhost nova_compute[230642]: 2025-12-15 09:32:30.084 230646 DEBUG oslo_service.service [None req-49080c14-bd80-4b1c-bc6f-a59411562e0b - - - - - -] oslo_messaging_notifications.retry = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:30 localhost nova_compute[230642]: 2025-12-15 09:32:30.084 230646 DEBUG oslo_service.service [None req-49080c14-bd80-4b1c-bc6f-a59411562e0b - - - - - -] oslo_messaging_notifications.topics = ['notifications'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:30 localhost nova_compute[230642]: 2025-12-15 09:32:30.084 230646 DEBUG oslo_service.service [None req-49080c14-bd80-4b1c-bc6f-a59411562e0b - - - - - -] oslo_messaging_notifications.transport_url = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:30 localhost nova_compute[230642]: 2025-12-15 09:32:30.085 230646 DEBUG oslo_service.service [None req-49080c14-bd80-4b1c-bc6f-a59411562e0b - - - - - -] oslo_limit.auth_section = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:30 localhost nova_compute[230642]: 2025-12-15 09:32:30.085 230646 DEBUG oslo_service.service [None req-49080c14-bd80-4b1c-bc6f-a59411562e0b - - - - - -] oslo_limit.auth_type = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:30 localhost nova_compute[230642]: 2025-12-15 09:32:30.085 230646 DEBUG oslo_service.service [None req-49080c14-bd80-4b1c-bc6f-a59411562e0b - - - - - -] oslo_limit.auth_url = http://keystone-internal.openstack.svc:5000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:30 localhost nova_compute[230642]: 2025-12-15 09:32:30.085 230646 DEBUG oslo_service.service [None req-49080c14-bd80-4b1c-bc6f-a59411562e0b - - - - - -] 
oslo_limit.cafile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:30 localhost nova_compute[230642]: 2025-12-15 09:32:30.085 230646 DEBUG oslo_service.service [None req-49080c14-bd80-4b1c-bc6f-a59411562e0b - - - - - -] oslo_limit.certfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:30 localhost nova_compute[230642]: 2025-12-15 09:32:30.085 230646 DEBUG oslo_service.service [None req-49080c14-bd80-4b1c-bc6f-a59411562e0b - - - - - -] oslo_limit.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:30 localhost nova_compute[230642]: 2025-12-15 09:32:30.085 230646 DEBUG oslo_service.service [None req-49080c14-bd80-4b1c-bc6f-a59411562e0b - - - - - -] oslo_limit.connect_retries = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:30 localhost nova_compute[230642]: 2025-12-15 09:32:30.085 230646 DEBUG oslo_service.service [None req-49080c14-bd80-4b1c-bc6f-a59411562e0b - - - - - -] oslo_limit.connect_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:30 localhost nova_compute[230642]: 2025-12-15 09:32:30.086 230646 DEBUG oslo_service.service [None req-49080c14-bd80-4b1c-bc6f-a59411562e0b - - - - - -] oslo_limit.default_domain_id = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:30 localhost nova_compute[230642]: 2025-12-15 09:32:30.086 230646 DEBUG oslo_service.service [None req-49080c14-bd80-4b1c-bc6f-a59411562e0b - - - - - -] oslo_limit.default_domain_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:30 localhost nova_compute[230642]: 2025-12-15 09:32:30.086 230646 DEBUG oslo_service.service [None req-49080c14-bd80-4b1c-bc6f-a59411562e0b - - - - - -] oslo_limit.domain_id = None log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:30 localhost nova_compute[230642]: 2025-12-15 09:32:30.086 230646 DEBUG oslo_service.service [None req-49080c14-bd80-4b1c-bc6f-a59411562e0b - - - - - -] oslo_limit.domain_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:30 localhost nova_compute[230642]: 2025-12-15 09:32:30.086 230646 DEBUG oslo_service.service [None req-49080c14-bd80-4b1c-bc6f-a59411562e0b - - - - - -] oslo_limit.endpoint_id = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:30 localhost nova_compute[230642]: 2025-12-15 09:32:30.086 230646 DEBUG oslo_service.service [None req-49080c14-bd80-4b1c-bc6f-a59411562e0b - - - - - -] oslo_limit.endpoint_override = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:30 localhost nova_compute[230642]: 2025-12-15 09:32:30.086 230646 DEBUG oslo_service.service [None req-49080c14-bd80-4b1c-bc6f-a59411562e0b - - - - - -] oslo_limit.insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:30 localhost nova_compute[230642]: 2025-12-15 09:32:30.086 230646 DEBUG oslo_service.service [None req-49080c14-bd80-4b1c-bc6f-a59411562e0b - - - - - -] oslo_limit.keyfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:30 localhost nova_compute[230642]: 2025-12-15 09:32:30.087 230646 DEBUG oslo_service.service [None req-49080c14-bd80-4b1c-bc6f-a59411562e0b - - - - - -] oslo_limit.max_version = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:30 localhost nova_compute[230642]: 2025-12-15 09:32:30.087 230646 DEBUG oslo_service.service [None req-49080c14-bd80-4b1c-bc6f-a59411562e0b - - - - - -] oslo_limit.min_version = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m 
Dec 15 04:32:30 localhost nova_compute[230642]: 2025-12-15 09:32:30.087 230646 DEBUG oslo_service.service [None req-49080c14-bd80-4b1c-bc6f-a59411562e0b - - - - - -] oslo_limit.password = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:30 localhost nova_compute[230642]: 2025-12-15 09:32:30.087 230646 DEBUG oslo_service.service [None req-49080c14-bd80-4b1c-bc6f-a59411562e0b - - - - - -] oslo_limit.project_domain_id = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:30 localhost nova_compute[230642]: 2025-12-15 09:32:30.087 230646 DEBUG oslo_service.service [None req-49080c14-bd80-4b1c-bc6f-a59411562e0b - - - - - -] oslo_limit.project_domain_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:30 localhost nova_compute[230642]: 2025-12-15 09:32:30.087 230646 DEBUG oslo_service.service [None req-49080c14-bd80-4b1c-bc6f-a59411562e0b - - - - - -] oslo_limit.project_id = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:30 localhost nova_compute[230642]: 2025-12-15 09:32:30.087 230646 DEBUG oslo_service.service [None req-49080c14-bd80-4b1c-bc6f-a59411562e0b - - - - - -] oslo_limit.project_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:30 localhost nova_compute[230642]: 2025-12-15 09:32:30.087 230646 DEBUG oslo_service.service [None req-49080c14-bd80-4b1c-bc6f-a59411562e0b - - - - - -] oslo_limit.region_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:30 localhost nova_compute[230642]: 2025-12-15 09:32:30.088 230646 DEBUG oslo_service.service [None req-49080c14-bd80-4b1c-bc6f-a59411562e0b - - - - - -] oslo_limit.service_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:30 localhost nova_compute[230642]: 
2025-12-15 09:32:30.088 230646 DEBUG oslo_service.service [None req-49080c14-bd80-4b1c-bc6f-a59411562e0b - - - - - -] oslo_limit.service_type = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:30 localhost nova_compute[230642]: 2025-12-15 09:32:30.088 230646 DEBUG oslo_service.service [None req-49080c14-bd80-4b1c-bc6f-a59411562e0b - - - - - -] oslo_limit.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:30 localhost nova_compute[230642]: 2025-12-15 09:32:30.088 230646 DEBUG oslo_service.service [None req-49080c14-bd80-4b1c-bc6f-a59411562e0b - - - - - -] oslo_limit.status_code_retries = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:30 localhost nova_compute[230642]: 2025-12-15 09:32:30.088 230646 DEBUG oslo_service.service [None req-49080c14-bd80-4b1c-bc6f-a59411562e0b - - - - - -] oslo_limit.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:30 localhost nova_compute[230642]: 2025-12-15 09:32:30.088 230646 DEBUG oslo_service.service [None req-49080c14-bd80-4b1c-bc6f-a59411562e0b - - - - - -] oslo_limit.system_scope = all log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:30 localhost nova_compute[230642]: 2025-12-15 09:32:30.088 230646 DEBUG oslo_service.service [None req-49080c14-bd80-4b1c-bc6f-a59411562e0b - - - - - -] oslo_limit.timeout = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:30 localhost nova_compute[230642]: 2025-12-15 09:32:30.089 230646 DEBUG oslo_service.service [None req-49080c14-bd80-4b1c-bc6f-a59411562e0b - - - - - -] oslo_limit.trust_id = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:30 localhost nova_compute[230642]: 2025-12-15 09:32:30.089 230646 DEBUG 
oslo_service.service [None req-49080c14-bd80-4b1c-bc6f-a59411562e0b - - - - - -] oslo_limit.user_domain_id = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:30 localhost nova_compute[230642]: 2025-12-15 09:32:30.089 230646 DEBUG oslo_service.service [None req-49080c14-bd80-4b1c-bc6f-a59411562e0b - - - - - -] oslo_limit.user_domain_name = Default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:30 localhost nova_compute[230642]: 2025-12-15 09:32:30.089 230646 DEBUG oslo_service.service [None req-49080c14-bd80-4b1c-bc6f-a59411562e0b - - - - - -] oslo_limit.user_id = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:30 localhost nova_compute[230642]: 2025-12-15 09:32:30.089 230646 DEBUG oslo_service.service [None req-49080c14-bd80-4b1c-bc6f-a59411562e0b - - - - - -] oslo_limit.username = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:30 localhost nova_compute[230642]: 2025-12-15 09:32:30.089 230646 DEBUG oslo_service.service [None req-49080c14-bd80-4b1c-bc6f-a59411562e0b - - - - - -] oslo_limit.valid_interfaces = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:30 localhost nova_compute[230642]: 2025-12-15 09:32:30.089 230646 DEBUG oslo_service.service [None req-49080c14-bd80-4b1c-bc6f-a59411562e0b - - - - - -] oslo_limit.version = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:30 localhost nova_compute[230642]: 2025-12-15 09:32:30.089 230646 DEBUG oslo_service.service [None req-49080c14-bd80-4b1c-bc6f-a59411562e0b - - - - - -] oslo_reports.file_event_handler = /var/lib/nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:30 localhost nova_compute[230642]: 2025-12-15 09:32:30.090 230646 DEBUG oslo_service.service [None 
req-49080c14-bd80-4b1c-bc6f-a59411562e0b - - - - - -] oslo_reports.file_event_handler_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:30 localhost nova_compute[230642]: 2025-12-15 09:32:30.090 230646 DEBUG oslo_service.service [None req-49080c14-bd80-4b1c-bc6f-a59411562e0b - - - - - -] oslo_reports.log_dir = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:30 localhost nova_compute[230642]: 2025-12-15 09:32:30.090 230646 DEBUG oslo_service.service [None req-49080c14-bd80-4b1c-bc6f-a59411562e0b - - - - - -] vif_plug_linux_bridge_privileged.capabilities = [12] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:30 localhost nova_compute[230642]: 2025-12-15 09:32:30.090 230646 DEBUG oslo_service.service [None req-49080c14-bd80-4b1c-bc6f-a59411562e0b - - - - - -] vif_plug_linux_bridge_privileged.group = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:30 localhost nova_compute[230642]: 2025-12-15 09:32:30.090 230646 DEBUG oslo_service.service [None req-49080c14-bd80-4b1c-bc6f-a59411562e0b - - - - - -] vif_plug_linux_bridge_privileged.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:30 localhost nova_compute[230642]: 2025-12-15 09:32:30.090 230646 DEBUG oslo_service.service [None req-49080c14-bd80-4b1c-bc6f-a59411562e0b - - - - - -] vif_plug_linux_bridge_privileged.logger_name = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:30 localhost nova_compute[230642]: 2025-12-15 09:32:30.090 230646 DEBUG oslo_service.service [None req-49080c14-bd80-4b1c-bc6f-a59411562e0b - - - - - -] vif_plug_linux_bridge_privileged.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:30 localhost 
nova_compute[230642]: 2025-12-15 09:32:30.090 230646 DEBUG oslo_service.service [None req-49080c14-bd80-4b1c-bc6f-a59411562e0b - - - - - -] vif_plug_linux_bridge_privileged.user = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:30 localhost nova_compute[230642]: 2025-12-15 09:32:30.091 230646 DEBUG oslo_service.service [None req-49080c14-bd80-4b1c-bc6f-a59411562e0b - - - - - -] vif_plug_ovs_privileged.capabilities = [12, 1] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:30 localhost nova_compute[230642]: 2025-12-15 09:32:30.091 230646 DEBUG oslo_service.service [None req-49080c14-bd80-4b1c-bc6f-a59411562e0b - - - - - -] vif_plug_ovs_privileged.group = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:30 localhost nova_compute[230642]: 2025-12-15 09:32:30.091 230646 DEBUG oslo_service.service [None req-49080c14-bd80-4b1c-bc6f-a59411562e0b - - - - - -] vif_plug_ovs_privileged.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:30 localhost nova_compute[230642]: 2025-12-15 09:32:30.091 230646 DEBUG oslo_service.service [None req-49080c14-bd80-4b1c-bc6f-a59411562e0b - - - - - -] vif_plug_ovs_privileged.logger_name = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:30 localhost nova_compute[230642]: 2025-12-15 09:32:30.091 230646 DEBUG oslo_service.service [None req-49080c14-bd80-4b1c-bc6f-a59411562e0b - - - - - -] vif_plug_ovs_privileged.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:30 localhost nova_compute[230642]: 2025-12-15 09:32:30.091 230646 DEBUG oslo_service.service [None req-49080c14-bd80-4b1c-bc6f-a59411562e0b - - - - - -] vif_plug_ovs_privileged.user = None log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:30 localhost nova_compute[230642]: 2025-12-15 09:32:30.091 230646 DEBUG oslo_service.service [None req-49080c14-bd80-4b1c-bc6f-a59411562e0b - - - - - -] os_vif_linux_bridge.flat_interface = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:30 localhost nova_compute[230642]: 2025-12-15 09:32:30.092 230646 DEBUG oslo_service.service [None req-49080c14-bd80-4b1c-bc6f-a59411562e0b - - - - - -] os_vif_linux_bridge.forward_bridge_interface = ['all'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:30 localhost nova_compute[230642]: 2025-12-15 09:32:30.092 230646 DEBUG oslo_service.service [None req-49080c14-bd80-4b1c-bc6f-a59411562e0b - - - - - -] os_vif_linux_bridge.iptables_bottom_regex = log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:30 localhost nova_compute[230642]: 2025-12-15 09:32:30.092 230646 DEBUG oslo_service.service [None req-49080c14-bd80-4b1c-bc6f-a59411562e0b - - - - - -] os_vif_linux_bridge.iptables_drop_action = DROP log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:30 localhost nova_compute[230642]: 2025-12-15 09:32:30.092 230646 DEBUG oslo_service.service [None req-49080c14-bd80-4b1c-bc6f-a59411562e0b - - - - - -] os_vif_linux_bridge.iptables_top_regex = log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:30 localhost nova_compute[230642]: 2025-12-15 09:32:30.092 230646 DEBUG oslo_service.service [None req-49080c14-bd80-4b1c-bc6f-a59411562e0b - - - - - -] os_vif_linux_bridge.network_device_mtu = 1500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:30 localhost nova_compute[230642]: 2025-12-15 09:32:30.092 230646 DEBUG oslo_service.service [None req-49080c14-bd80-4b1c-bc6f-a59411562e0b - - - - - -] 
os_vif_linux_bridge.use_ipv6 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:30 localhost nova_compute[230642]: 2025-12-15 09:32:30.092 230646 DEBUG oslo_service.service [None req-49080c14-bd80-4b1c-bc6f-a59411562e0b - - - - - -] os_vif_linux_bridge.vlan_interface = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:30 localhost nova_compute[230642]: 2025-12-15 09:32:30.093 230646 DEBUG oslo_service.service [None req-49080c14-bd80-4b1c-bc6f-a59411562e0b - - - - - -] os_vif_ovs.isolate_vif = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:30 localhost nova_compute[230642]: 2025-12-15 09:32:30.093 230646 DEBUG oslo_service.service [None req-49080c14-bd80-4b1c-bc6f-a59411562e0b - - - - - -] os_vif_ovs.network_device_mtu = 1500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:30 localhost nova_compute[230642]: 2025-12-15 09:32:30.093 230646 DEBUG oslo_service.service [None req-49080c14-bd80-4b1c-bc6f-a59411562e0b - - - - - -] os_vif_ovs.ovs_vsctl_timeout = 120 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:30 localhost nova_compute[230642]: 2025-12-15 09:32:30.093 230646 DEBUG oslo_service.service [None req-49080c14-bd80-4b1c-bc6f-a59411562e0b - - - - - -] os_vif_ovs.ovsdb_connection = tcp:127.0.0.1:6640 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:30 localhost nova_compute[230642]: 2025-12-15 09:32:30.093 230646 DEBUG oslo_service.service [None req-49080c14-bd80-4b1c-bc6f-a59411562e0b - - - - - -] os_vif_ovs.ovsdb_interface = native log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:30 localhost nova_compute[230642]: 2025-12-15 09:32:30.093 230646 DEBUG oslo_service.service [None req-49080c14-bd80-4b1c-bc6f-a59411562e0b - - - - - -] 
os_vif_ovs.per_port_bridge = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:30 localhost nova_compute[230642]: 2025-12-15 09:32:30.093 230646 DEBUG oslo_service.service [None req-49080c14-bd80-4b1c-bc6f-a59411562e0b - - - - - -] os_brick.lock_path = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:30 localhost nova_compute[230642]: 2025-12-15 09:32:30.093 230646 DEBUG oslo_service.service [None req-49080c14-bd80-4b1c-bc6f-a59411562e0b - - - - - -] os_brick.wait_mpath_device_attempts = 4 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:30 localhost nova_compute[230642]: 2025-12-15 09:32:30.094 230646 DEBUG oslo_service.service [None req-49080c14-bd80-4b1c-bc6f-a59411562e0b - - - - - -] os_brick.wait_mpath_device_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:30 localhost nova_compute[230642]: 2025-12-15 09:32:30.094 230646 DEBUG oslo_service.service [None req-49080c14-bd80-4b1c-bc6f-a59411562e0b - - - - - -] privsep_osbrick.capabilities = [21] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:30 localhost nova_compute[230642]: 2025-12-15 09:32:30.094 230646 DEBUG oslo_service.service [None req-49080c14-bd80-4b1c-bc6f-a59411562e0b - - - - - -] privsep_osbrick.group = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:30 localhost nova_compute[230642]: 2025-12-15 09:32:30.094 230646 DEBUG oslo_service.service [None req-49080c14-bd80-4b1c-bc6f-a59411562e0b - - - - - -] privsep_osbrick.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:30 localhost nova_compute[230642]: 2025-12-15 09:32:30.094 230646 DEBUG oslo_service.service [None req-49080c14-bd80-4b1c-bc6f-a59411562e0b - - - - - -] privsep_osbrick.logger_name = 
os_brick.privileged log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:30 localhost nova_compute[230642]: 2025-12-15 09:32:30.094 230646 DEBUG oslo_service.service [None req-49080c14-bd80-4b1c-bc6f-a59411562e0b - - - - - -] privsep_osbrick.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:30 localhost nova_compute[230642]: 2025-12-15 09:32:30.094 230646 DEBUG oslo_service.service [None req-49080c14-bd80-4b1c-bc6f-a59411562e0b - - - - - -] privsep_osbrick.user = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:30 localhost nova_compute[230642]: 2025-12-15 09:32:30.094 230646 DEBUG oslo_service.service [None req-49080c14-bd80-4b1c-bc6f-a59411562e0b - - - - - -] nova_sys_admin.capabilities = [0, 1, 2, 3, 12, 21] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:30 localhost nova_compute[230642]: 2025-12-15 09:32:30.095 230646 DEBUG oslo_service.service [None req-49080c14-bd80-4b1c-bc6f-a59411562e0b - - - - - -] nova_sys_admin.group = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:30 localhost nova_compute[230642]: 2025-12-15 09:32:30.095 230646 DEBUG oslo_service.service [None req-49080c14-bd80-4b1c-bc6f-a59411562e0b - - - - - -] nova_sys_admin.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:30 localhost nova_compute[230642]: 2025-12-15 09:32:30.095 230646 DEBUG oslo_service.service [None req-49080c14-bd80-4b1c-bc6f-a59411562e0b - - - - - -] nova_sys_admin.logger_name = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:30 localhost nova_compute[230642]: 2025-12-15 09:32:30.095 230646 DEBUG oslo_service.service [None req-49080c14-bd80-4b1c-bc6f-a59411562e0b - - - - - -] 
nova_sys_admin.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:30 localhost nova_compute[230642]: 2025-12-15 09:32:30.095 230646 DEBUG oslo_service.service [None req-49080c14-bd80-4b1c-bc6f-a59411562e0b - - - - - -] nova_sys_admin.user = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:30 localhost nova_compute[230642]: 2025-12-15 09:32:30.095 230646 DEBUG oslo_service.service [None req-49080c14-bd80-4b1c-bc6f-a59411562e0b - - - - - -] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613#033[00m Dec 15 04:32:30 localhost nova_compute[230642]: 2025-12-15 09:32:30.096 230646 INFO nova.service [-] Starting compute node (version 27.5.2-0.20250829104910.6f8decf.el9)#033[00m Dec 15 04:32:30 localhost nova_compute[230642]: 2025-12-15 09:32:30.109 230646 INFO nova.virt.node [None req-6dbb3ea6-5667-4c72-84e8-baa2852417c3 - - - - - -] Determined node identity 26c8956b-6742-4951-b566-971b9bbe323b from /var/lib/nova/compute_id#033[00m Dec 15 04:32:30 localhost nova_compute[230642]: 2025-12-15 09:32:30.109 230646 DEBUG nova.virt.libvirt.host [None req-6dbb3ea6-5667-4c72-84e8-baa2852417c3 - - - - - -] Starting native event thread _init_events /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:492#033[00m Dec 15 04:32:30 localhost nova_compute[230642]: 2025-12-15 09:32:30.110 230646 DEBUG nova.virt.libvirt.host [None req-6dbb3ea6-5667-4c72-84e8-baa2852417c3 - - - - - -] Starting green dispatch thread _init_events /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:498#033[00m Dec 15 04:32:30 localhost nova_compute[230642]: 2025-12-15 09:32:30.110 230646 DEBUG nova.virt.libvirt.host [None req-6dbb3ea6-5667-4c72-84e8-baa2852417c3 - - - - - -] Starting connection event dispatch thread initialize 
/usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:620#033[00m Dec 15 04:32:30 localhost nova_compute[230642]: 2025-12-15 09:32:30.110 230646 DEBUG nova.virt.libvirt.host [None req-6dbb3ea6-5667-4c72-84e8-baa2852417c3 - - - - - -] Connecting to libvirt: qemu:///system _get_new_connection /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:503#033[00m Dec 15 04:32:30 localhost nova_compute[230642]: 2025-12-15 09:32:30.123 230646 DEBUG nova.virt.libvirt.host [None req-6dbb3ea6-5667-4c72-84e8-baa2852417c3 - - - - - -] Registering for lifecycle events _get_new_connection /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:509#033[00m Dec 15 04:32:30 localhost nova_compute[230642]: 2025-12-15 09:32:30.126 230646 DEBUG nova.virt.libvirt.host [None req-6dbb3ea6-5667-4c72-84e8-baa2852417c3 - - - - - -] Registering for connection events: _get_new_connection /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:530#033[00m Dec 15 04:32:30 localhost nova_compute[230642]: 2025-12-15 09:32:30.126 230646 INFO nova.virt.libvirt.driver [None req-6dbb3ea6-5667-4c72-84e8-baa2852417c3 - - - - - -] Connection event '1' reason 'None'#033[00m Dec 15 04:32:30 localhost nova_compute[230642]: 2025-12-15 09:32:30.137 230646 DEBUG nova.virt.libvirt.volume.mount [None req-6dbb3ea6-5667-4c72-84e8-baa2852417c3 - - - - - -] Initialising _HostMountState generation 0 host_up /usr/lib/python3.9/site-packages/nova/virt/libvirt/volume/mount.py:130#033[00m Dec 15 04:32:30 localhost nova_compute[230642]: 2025-12-15 09:32:30.146 230646 INFO nova.virt.libvirt.host [None req-6dbb3ea6-5667-4c72-84e8-baa2852417c3 - - - - - -] Libvirt host capabilities
[libvirt host capabilities XML elided: the element tags were stripped by the log pipeline, leaving only empty per-line syslog prefixes. Recoverable values from the surviving text nodes: host uuid 12c7b589-8d2b-44b6-80e1-1f4b0f34f69b, arch x86_64, CPU model EPYC-Rome-v4, vendor AMD, migration transports tcp and rdma, memory 16116604, pages 4029151, 0, 0. Capture ends mid-dump.]
04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: selinux Dec 15 04:32:30 localhost nova_compute[230642]: 0 Dec 15 04:32:30 localhost nova_compute[230642]: system_u:system_r:svirt_t:s0 Dec 15 04:32:30 localhost nova_compute[230642]: system_u:system_r:svirt_tcg_t:s0 Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: dac Dec 15 04:32:30 localhost nova_compute[230642]: 0 Dec 15 04:32:30 localhost nova_compute[230642]: +107:+107 Dec 15 04:32:30 localhost nova_compute[230642]: +107:+107 Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: hvm Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: 32 Dec 15 04:32:30 localhost nova_compute[230642]: /usr/libexec/qemu-kvm Dec 15 04:32:30 localhost nova_compute[230642]: pc-i440fx-rhel7.6.0 Dec 15 04:32:30 localhost nova_compute[230642]: pc Dec 15 04:32:30 localhost nova_compute[230642]: pc-q35-rhel9.8.0 Dec 15 04:32:30 localhost nova_compute[230642]: q35 Dec 15 04:32:30 localhost nova_compute[230642]: pc-q35-rhel9.6.0 Dec 15 04:32:30 localhost nova_compute[230642]: pc-q35-rhel8.6.0 Dec 15 04:32:30 localhost nova_compute[230642]: pc-q35-rhel9.4.0 Dec 15 04:32:30 localhost nova_compute[230642]: pc-q35-rhel8.5.0 Dec 15 04:32:30 localhost nova_compute[230642]: pc-q35-rhel8.3.0 Dec 15 04:32:30 localhost nova_compute[230642]: pc-q35-rhel7.6.0 Dec 15 04:32:30 localhost nova_compute[230642]: pc-q35-rhel8.4.0 Dec 15 04:32:30 localhost nova_compute[230642]: pc-q35-rhel9.2.0 Dec 15 04:32:30 localhost nova_compute[230642]: pc-q35-rhel8.2.0 
Dec 15 04:32:30 localhost nova_compute[230642]: pc-q35-rhel9.0.0 Dec 15 04:32:30 localhost nova_compute[230642]: pc-q35-rhel8.0.0 Dec 15 04:32:30 localhost nova_compute[230642]: pc-q35-rhel8.1.0 Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: hvm Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: 64 Dec 15 04:32:30 localhost nova_compute[230642]: /usr/libexec/qemu-kvm Dec 15 04:32:30 localhost nova_compute[230642]: pc-i440fx-rhel7.6.0 Dec 15 04:32:30 localhost nova_compute[230642]: pc Dec 15 04:32:30 localhost nova_compute[230642]: pc-q35-rhel9.8.0 Dec 15 04:32:30 localhost nova_compute[230642]: q35 Dec 15 04:32:30 localhost nova_compute[230642]: pc-q35-rhel9.6.0 Dec 15 04:32:30 localhost nova_compute[230642]: pc-q35-rhel8.6.0 Dec 15 04:32:30 localhost nova_compute[230642]: pc-q35-rhel9.4.0 Dec 15 04:32:30 localhost nova_compute[230642]: pc-q35-rhel8.5.0 Dec 15 04:32:30 localhost nova_compute[230642]: pc-q35-rhel8.3.0 Dec 15 04:32:30 localhost nova_compute[230642]: pc-q35-rhel7.6.0 Dec 15 04:32:30 localhost nova_compute[230642]: pc-q35-rhel8.4.0 Dec 15 04:32:30 localhost nova_compute[230642]: pc-q35-rhel9.2.0 Dec 15 04:32:30 localhost nova_compute[230642]: 
pc-q35-rhel8.2.0 Dec 15 04:32:30 localhost nova_compute[230642]: pc-q35-rhel9.0.0 Dec 15 04:32:30 localhost nova_compute[230642]: pc-q35-rhel8.0.0 Dec 15 04:32:30 localhost nova_compute[230642]: pc-q35-rhel8.1.0 Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: #033[00m Dec 15 04:32:30 localhost nova_compute[230642]: 2025-12-15 09:32:30.154 230646 DEBUG nova.virt.libvirt.host [None req-6dbb3ea6-5667-4c72-84e8-baa2852417c3 - - - - - -] Getting domain capabilities for i686 via machine types: {'pc', 'q35'} _get_machine_types /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:952#033[00m Dec 15 04:32:30 localhost nova_compute[230642]: 2025-12-15 09:32:30.173 230646 DEBUG nova.virt.libvirt.host [None req-6dbb3ea6-5667-4c72-84e8-baa2852417c3 - - - - - -] Libvirt host hypervisor capabilities for arch=i686 and machine_type=pc: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: /usr/libexec/qemu-kvm Dec 15 04:32:30 localhost nova_compute[230642]: kvm Dec 15 04:32:30 localhost nova_compute[230642]: pc-i440fx-rhel7.6.0 Dec 15 04:32:30 localhost nova_compute[230642]: i686 Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 
04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: /usr/share/OVMF/OVMF_CODE.secboot.fd Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: rom Dec 15 04:32:30 localhost nova_compute[230642]: pflash Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: yes Dec 15 04:32:30 localhost nova_compute[230642]: no Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: no Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: on Dec 15 04:32:30 localhost nova_compute[230642]: off Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: on Dec 15 04:32:30 localhost nova_compute[230642]: off Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: EPYC-Rome Dec 15 04:32:30 localhost nova_compute[230642]: AMD Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: 
Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: 486 Dec 15 04:32:30 localhost nova_compute[230642]: 486-v1 Dec 15 04:32:30 localhost nova_compute[230642]: Broadwell Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Broadwell-IBRS Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Broadwell-noTSX Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Broadwell-noTSX-IBRS Dec 15 
04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Broadwell-v1 Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Broadwell-v2 Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Broadwell-v3 Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Broadwell-v4 Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Cascadelake-Server Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 
localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Cascadelake-Server-noTSX Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Cascadelake-Server-v1 Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Cascadelake-Server-v2 Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost 
nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Cascadelake-Server-v3 Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Cascadelake-Server-v4 Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 
04:32:30 localhost nova_compute[230642]: Cascadelake-Server-v5 Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Conroe Dec 15 04:32:30 localhost nova_compute[230642]: Conroe-v1 Dec 15 04:32:30 localhost nova_compute[230642]: Cooperlake Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Cooperlake-v1 Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 
localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Cooperlake-v2 Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Denverton Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Denverton-v1 Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 
04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Denverton-v2 Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Denverton-v3 Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dhyana Dec 15 04:32:30 localhost nova_compute[230642]: Dhyana-v1 Dec 15 04:32:30 localhost nova_compute[230642]: Dhyana-v2 Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: EPYC Dec 15 04:32:30 localhost nova_compute[230642]: EPYC-Genoa Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 
localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: EPYC-Genoa-v1 Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: EPYC-IBPB Dec 15 04:32:30 localhost nova_compute[230642]: EPYC-Milan Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 
04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: EPYC-Milan-v1 Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: EPYC-Milan-v2 Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: EPYC-Rome Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: EPYC-Rome-v1 Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: EPYC-Rome-v2 Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 
localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: EPYC-Rome-v3 Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: EPYC-Rome-v4 Dec 15 04:32:30 localhost nova_compute[230642]: EPYC-v1 Dec 15 04:32:30 localhost nova_compute[230642]: EPYC-v2 Dec 15 04:32:30 localhost nova_compute[230642]: EPYC-v3 Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: EPYC-v4 Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: GraniteRapids Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 
localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: GraniteRapids-v1 Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost 
nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: GraniteRapids-v2 Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost 
nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 
04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Haswell Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Haswell-IBRS Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Haswell-noTSX Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Haswell-noTSX-IBRS Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Haswell-v1 Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 
localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Haswell-v2 Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Haswell-v3 Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Haswell-v4 Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Icelake-Server Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: 
Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Icelake-Server-noTSX Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Icelake-Server-v1 Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost 
nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Icelake-Server-v2 Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Icelake-Server-v3 Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 
localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Icelake-Server-v4 Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Icelake-Server-v5 Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 
04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Icelake-Server-v6 Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 
localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Icelake-Server-v7 Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: IvyBridge Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: IvyBridge-IBRS Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 
15 04:32:30 localhost nova_compute[230642]: IvyBridge-v1 Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: IvyBridge-v2 Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: KnightsMill Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: KnightsMill-v1 Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Nehalem Dec 15 04:32:30 localhost nova_compute[230642]: Nehalem-IBRS Dec 15 04:32:30 localhost nova_compute[230642]: Nehalem-v1 Dec 15 04:32:30 localhost nova_compute[230642]: Nehalem-v2 Dec 15 04:32:30 localhost nova_compute[230642]: Opteron_G1 Dec 15 04:32:30 localhost nova_compute[230642]: Opteron_G1-v1 Dec 15 04:32:30 localhost nova_compute[230642]: Opteron_G2 
Dec 15 04:32:30 localhost nova_compute[230642]: Opteron_G2-v1 Dec 15 04:32:30 localhost nova_compute[230642]: Opteron_G3 Dec 15 04:32:30 localhost nova_compute[230642]: Opteron_G3-v1 Dec 15 04:32:30 localhost nova_compute[230642]: Opteron_G4 Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Opteron_G4-v1 Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Opteron_G5 Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Opteron_G5-v1 Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Penryn Dec 15 04:32:30 localhost nova_compute[230642]: Penryn-v1 Dec 15 04:32:30 localhost nova_compute[230642]: SandyBridge Dec 15 04:32:30 localhost nova_compute[230642]: SandyBridge-IBRS Dec 15 04:32:30 localhost nova_compute[230642]: SandyBridge-v1 Dec 15 04:32:30 localhost nova_compute[230642]: SandyBridge-v2 Dec 15 04:32:30 localhost nova_compute[230642]: SapphireRapids Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost 
nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: SapphireRapids-v1 Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost 
nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: SapphireRapids-v2 Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost 
nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: SapphireRapids-v3 Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost 
nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 
04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: SierraForest Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: SierraForest-v1 Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 
Dec 15 04:32:30 localhost nova_compute[230642]: [libvirt domain-capabilities XML; element tags were stripped by the log pipeline, so only the text values survive. Recoverable values, grouped below; some group labels are inferred from the libvirt domainCapabilities schema:]
    CPU models (list continued from previous entry): Skylake-Client, Skylake-Client-IBRS, Skylake-Client-noTSX-IBRS, Skylake-Client-v1, Skylake-Client-v2, Skylake-Client-v3, Skylake-Client-v4, Skylake-Server, Skylake-Server-IBRS, Skylake-Server-noTSX-IBRS, Skylake-Server-v1, Skylake-Server-v2, Skylake-Server-v3, Skylake-Server-v4, Skylake-Server-v5, Snowridge, Snowridge-v1, Snowridge-v2, Snowridge-v3, Snowridge-v4, Westmere, Westmere-IBRS, Westmere-v1, Westmere-v2, athlon, athlon-v1, core2duo, core2duo-v1, coreduo, coreduo-v1, kvm32, kvm32-v1, kvm64, kvm64-v1, n270, n270-v1, pentium, pentium-v1, pentium2, pentium2-v1, pentium3, pentium3-v1, phenom, phenom-v1, qemu32, qemu32-v1, qemu64, qemu64-v1
    memory backing sources: file, anonymous, memfd
    disk devices: disk, cdrom, floppy, lun; buses: ide, fdc, scsi, virtio, usb, sata; models: virtio, virtio-transitional, virtio-non-transitional
    graphics types: vnc, egl-headless, dbus
    hostdev mode: subsystem; startup policies: default, mandatory, requisite, optional; subsystem types: usb, pci, scsi; models: virtio, virtio-transitional, virtio-non-transitional
    rng backends: random, egd, builtin
    filesystem drivers: path, handle, virtiofs
    tpm models: tpm-tis, tpm-crb; backends: emulator, external; backend version: 2.0
    redirdev bus: usb; channel types: pty, unix; crypto backends: qemu, builtin
    interface backends: default, passt
    panic models: isa, hyperv
    serial/console types: null, vc, pty, dev, file, pipe, stdio, udp, tcp, unix, qemu-vdagent, dbus
    Hyper-V enlightenments: relaxed, vapic, spinlocks, vpindex, runtime, synic, stimer, reset, vendor_id, frequencies, reenlightenment, tlbflush, ipi, avic, emsr_bitmap, xmm_input; other values: 4095, on, off, off, "Linux KVM Hv"
    launch security: tdx
    _get_domain_capabilities /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1037
Dec 15 04:32:30 localhost nova_compute[230642]: 2025-12-15 09:32:30.182 230646 DEBUG nova.virt.libvirt.host [None req-6dbb3ea6-5667-4c72-84e8-baa2852417c3 - - - - - -] Libvirt host hypervisor capabilities for arch=i686 and machine_type=q35:
    [same tag-stripped form as above; recoverable values:]
    path: /usr/libexec/qemu-kvm; domain: kvm; machine: pc-q35-rhel9.8.0; arch: i686
    loader value: /usr/share/OVMF/OVMF_CODE.secboot.fd; types: rom, pflash; other values: yes, no, no, on, off, on, off
    host CPU model: EPYC-Rome; vendor: AMD
    CPU models: 486, 486-v1, Broadwell, Broadwell-IBRS, Broadwell-noTSX, Broadwell-noTSX-IBRS, Broadwell-v1, Broadwell-v2, Broadwell-v3, Broadwell-v4, Cascadelake-Server, Cascadelake-Server-noTSX, Cascadelake-Server-v1, Cascadelake-Server-v2, Cascadelake-Server-v3, Cascadelake-Server-v4, Cascadelake-Server-v5, Conroe, Conroe-v1, Cooperlake, Cooperlake-v1, Cooperlake-v2, Denverton, Denverton-v1, Denverton-v2, Denverton-v3, Dhyana, Dhyana-v1, Dhyana-v2, EPYC, EPYC-Genoa, … [entry truncated in capture]
nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: EPYC-Genoa-v1 Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost 
nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: EPYC-IBPB Dec 15 04:32:30 localhost nova_compute[230642]: EPYC-Milan Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: EPYC-Milan-v1 Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: EPYC-Milan-v2 Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 
04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: EPYC-Rome Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: EPYC-Rome-v1 Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: EPYC-Rome-v2 Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: EPYC-Rome-v3 Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: EPYC-Rome-v4 Dec 15 04:32:30 localhost nova_compute[230642]: EPYC-v1 Dec 15 04:32:30 localhost nova_compute[230642]: EPYC-v2 Dec 15 04:32:30 localhost nova_compute[230642]: EPYC-v3 Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: EPYC-v4 Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: GraniteRapids Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost 
nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: GraniteRapids-v1 Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost 
nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 
04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: GraniteRapids-v2 Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 
localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Haswell Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Haswell-IBRS Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Haswell-noTSX Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Haswell-noTSX-IBRS Dec 15 04:32:30 localhost 
nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Haswell-v1 Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Haswell-v2 Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Haswell-v3 Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Haswell-v4 Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Icelake-Server Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost 
nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Icelake-Server-noTSX Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Icelake-Server-v1 Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 
localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Icelake-Server-v2 Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Icelake-Server-v3 Dec 15 
04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Icelake-Server-v4 Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 
localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Icelake-Server-v5 Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Icelake-Server-v6 Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 
04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Icelake-Server-v7 Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 
localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: IvyBridge Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: IvyBridge-IBRS Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: IvyBridge-v1 Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: IvyBridge-v2 Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: KnightsMill Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: KnightsMill-v1 Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 
Dec 15 04:32:30 localhost nova_compute[230642]: [libvirt domain capabilities XML dump; markup stripped during log extraction — recoverable text nodes below, grouped by apparent section]
Dec 15 04:32:30 localhost nova_compute[230642]: CPU models: Nehalem Nehalem-IBRS Nehalem-v1 Nehalem-v2 Opteron_G1 Opteron_G1-v1 Opteron_G2 Opteron_G2-v1 Opteron_G3 Opteron_G3-v1 Opteron_G4 Opteron_G4-v1 Opteron_G5 Opteron_G5-v1 Penryn Penryn-v1 SandyBridge SandyBridge-IBRS SandyBridge-v1 SandyBridge-v2 SapphireRapids SapphireRapids-v1 SapphireRapids-v2 SapphireRapids-v3 SierraForest SierraForest-v1 Skylake-Client Skylake-Client-IBRS Skylake-Client-noTSX-IBRS Skylake-Client-v1 Skylake-Client-v2 Skylake-Client-v3 Skylake-Client-v4 Skylake-Server Skylake-Server-IBRS Skylake-Server-noTSX-IBRS Skylake-Server-v1 Skylake-Server-v2 Skylake-Server-v3 Skylake-Server-v4 Skylake-Server-v5 Snowridge Snowridge-v1 Snowridge-v2 Snowridge-v3 Snowridge-v4 Westmere Westmere-IBRS Westmere-v1 Westmere-v2 athlon athlon-v1 core2duo core2duo-v1 coreduo coreduo-v1 kvm32 kvm32-v1 kvm64 kvm64-v1 n270 n270-v1 pentium pentium-v1 pentium2 pentium2-v1 pentium3 pentium3-v1 phenom phenom-v1 qemu32 qemu32-v1 qemu64 qemu64-v1
Dec 15 04:32:30 localhost nova_compute[230642]: memory backing: file anonymous memfd | disk devices: disk cdrom floppy lun | disk buses: fdc scsi virtio usb sata | disk models: virtio virtio-transitional virtio-non-transitional
Dec 15 04:32:30 localhost nova_compute[230642]: graphics: vnc egl-headless dbus | hostdev mode: subsystem | startup policies: default mandatory requisite optional | hostdev subsystem types: usb pci scsi | interface models: virtio virtio-transitional virtio-non-transitional
Dec 15 04:32:30 localhost nova_compute[230642]: rng models: random egd builtin | filesystem drivers: path handle virtiofs | tpm models: tpm-tis tpm-crb | tpm backends: emulator external | tpm version: 2.0 | redirdev bus: usb | char types: pty unix
Dec 15 04:32:30 localhost nova_compute[230642]: backends: qemu builtin | network backends: default passt | panic models: isa hyperv | console/channel types: null vc pty dev file pipe stdio udp tcp unix qemu-vdagent dbus
Dec 15 04:32:30 localhost nova_compute[230642]: hyperv features: relaxed vapic spinlocks vpindex runtime synic stimer reset vendor_id frequencies reenlightenment tlbflush ipi avic emsr_bitmap xmm_input | values: 4095 on off off "Linux KVM Hv" | launch security: tdx
Dec 15 04:32:30 localhost nova_compute[230642]: _get_domain_capabilities /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1037
Dec 15 04:32:30 localhost nova_compute[230642]: 2025-12-15 09:32:30.228 230646 DEBUG nova.virt.libvirt.host [None req-6dbb3ea6-5667-4c72-84e8-baa2852417c3 - - - - - -] Getting domain capabilities for x86_64 via machine types: {'pc', 'q35'} _get_machine_types /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:952
Dec 15 04:32:30 localhost nova_compute[230642]: 2025-12-15 09:32:30.235 230646 DEBUG nova.virt.libvirt.host [None req-6dbb3ea6-5667-4c72-84e8-baa2852417c3 - - - - - -] Libvirt host hypervisor capabilities for arch=x86_64 and machine_type=pc:
Dec 15 04:32:30 localhost nova_compute[230642]: [domain capabilities XML for machine_type=pc; markup stripped — recoverable values: emulator /usr/libexec/qemu-kvm | domain type: kvm | machine: pc-i440fx-rhel7.6.0 | arch: x86_64 | loader: /usr/share/OVMF/OVMF_CODE.secboot.fd | loader types: rom pflash | readonly: yes no | secure: no | toggles: on off, on off | host CPU model: EPYC-Rome | vendor: AMD]
nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: 486 Dec 15 04:32:30 localhost nova_compute[230642]: 486-v1 Dec 15 04:32:30 localhost nova_compute[230642]: Broadwell Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Broadwell-IBRS Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Broadwell-noTSX Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Broadwell-noTSX-IBRS Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Broadwell-v1 Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 
localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Broadwell-v2 Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Broadwell-v3 Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Broadwell-v4 Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Cascadelake-Server Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Cascadelake-Server-noTSX Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 
04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Cascadelake-Server-v1 Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Cascadelake-Server-v2 Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost 
nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Cascadelake-Server-v3 Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Cascadelake-Server-v4 Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Cascadelake-Server-v5 Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost 
nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Conroe Dec 15 04:32:30 localhost nova_compute[230642]: Conroe-v1 Dec 15 04:32:30 localhost nova_compute[230642]: Cooperlake Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Cooperlake-v1 Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 
localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Cooperlake-v2 Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Denverton Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Denverton-v1 Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Denverton-v2 Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Denverton-v3 Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dhyana Dec 15 04:32:30 localhost 
nova_compute[230642]: Dhyana-v1 Dec 15 04:32:30 localhost nova_compute[230642]: Dhyana-v2 Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: EPYC Dec 15 04:32:30 localhost nova_compute[230642]: EPYC-Genoa Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: EPYC-Genoa-v1 Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 
04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: EPYC-IBPB Dec 15 04:32:30 localhost nova_compute[230642]: EPYC-Milan Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: EPYC-Milan-v1 Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 
15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: EPYC-Milan-v2 Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: EPYC-Rome Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: EPYC-Rome-v1 Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: EPYC-Rome-v2 Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: EPYC-Rome-v3 Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: EPYC-Rome-v4 Dec 15 04:32:30 localhost nova_compute[230642]: EPYC-v1 Dec 15 04:32:30 localhost nova_compute[230642]: EPYC-v2 Dec 15 04:32:30 localhost nova_compute[230642]: EPYC-v3 Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost 
nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: EPYC-v4 Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: GraniteRapids Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost 
nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: GraniteRapids-v1 Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost 
nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: GraniteRapids-v2 Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost 
nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Haswell Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: 
Dec 15 04:32:30 localhost nova_compute[230642]: Haswell-IBRS
Dec 15 04:32:30 localhost nova_compute[230642]: Haswell-noTSX
Dec 15 04:32:30 localhost nova_compute[230642]: Haswell-noTSX-IBRS
Dec 15 04:32:30 localhost nova_compute[230642]: Haswell-v1
Dec 15 04:32:30 localhost nova_compute[230642]: Haswell-v2
Dec 15 04:32:30 localhost nova_compute[230642]: Haswell-v3
Dec 15 04:32:30 localhost nova_compute[230642]: Haswell-v4
Dec 15 04:32:30 localhost nova_compute[230642]: Icelake-Server
Dec 15 04:32:30 localhost nova_compute[230642]: Icelake-Server-noTSX
Dec 15 04:32:30 localhost nova_compute[230642]: Icelake-Server-v1
Dec 15 04:32:30 localhost nova_compute[230642]: Icelake-Server-v2
Dec 15 04:32:30 localhost nova_compute[230642]: Icelake-Server-v3
Dec 15 04:32:30 localhost nova_compute[230642]: Icelake-Server-v4
Dec 15 04:32:30 localhost nova_compute[230642]: Icelake-Server-v5
Dec 15 04:32:30 localhost nova_compute[230642]: Icelake-Server-v6
Dec 15 04:32:30 localhost nova_compute[230642]: Icelake-Server-v7
Dec 15 04:32:30 localhost nova_compute[230642]: IvyBridge
Dec 15 04:32:30 localhost nova_compute[230642]: IvyBridge-IBRS
Dec 15 04:32:30 localhost nova_compute[230642]: IvyBridge-v1
Dec 15 04:32:30 localhost nova_compute[230642]: IvyBridge-v2
Dec 15 04:32:30 localhost nova_compute[230642]: KnightsMill
Dec 15 04:32:30 localhost nova_compute[230642]: KnightsMill-v1
Dec 15 04:32:30 localhost nova_compute[230642]: Nehalem
Dec 15 04:32:30 localhost nova_compute[230642]: Nehalem-IBRS
Dec 15 04:32:30 localhost nova_compute[230642]: Nehalem-v1
Dec 15 04:32:30 localhost nova_compute[230642]: Nehalem-v2
Dec 15 04:32:30 localhost nova_compute[230642]: Opteron_G1
Dec 15 04:32:30 localhost nova_compute[230642]: Opteron_G1-v1
Dec 15 04:32:30 localhost nova_compute[230642]: Opteron_G2
Dec 15 04:32:30 localhost nova_compute[230642]: Opteron_G2-v1
Dec 15 04:32:30 localhost nova_compute[230642]: Opteron_G3
Dec 15 04:32:30 localhost nova_compute[230642]: Opteron_G3-v1
Dec 15 04:32:30 localhost nova_compute[230642]: Opteron_G4
Dec 15 04:32:30 localhost nova_compute[230642]: Opteron_G4-v1
Dec 15 04:32:30 localhost nova_compute[230642]: Opteron_G5
Dec 15 04:32:30 localhost nova_compute[230642]: Opteron_G5-v1
Dec 15 04:32:30 localhost nova_compute[230642]: Penryn
Dec 15 04:32:30 localhost nova_compute[230642]: Penryn-v1
Dec 15 04:32:30 localhost nova_compute[230642]: SandyBridge
Dec 15 04:32:30 localhost nova_compute[230642]: SandyBridge-IBRS
Dec 15 04:32:30 localhost nova_compute[230642]: SandyBridge-v1
Dec 15 04:32:30 localhost nova_compute[230642]: SandyBridge-v2
Dec 15 04:32:30 localhost nova_compute[230642]: SapphireRapids
Dec 15 04:32:30 localhost nova_compute[230642]: SapphireRapids-v1
Dec 15 04:32:30 localhost nova_compute[230642]: SapphireRapids-v2
Dec 15 04:32:30 localhost nova_compute[230642]: SapphireRapids-v3
Dec 15 04:32:30 localhost nova_compute[230642]: SierraForest
Dec 15 04:32:30 localhost nova_compute[230642]: SierraForest-v1
Dec 15 04:32:30 localhost nova_compute[230642]: Skylake-Client
Dec 15 04:32:30 localhost nova_compute[230642]: Skylake-Client-IBRS
Dec 15 04:32:30 localhost nova_compute[230642]: Skylake-Client-noTSX-IBRS
Dec 15 04:32:30 localhost nova_compute[230642]: Skylake-Client-v1
Dec 15 04:32:30 localhost nova_compute[230642]: Skylake-Client-v2
Dec 15 04:32:30 localhost nova_compute[230642]: Skylake-Client-v3
Dec 15 04:32:30 localhost nova_compute[230642]: Skylake-Client-v4
Dec 15 04:32:30 localhost nova_compute[230642]: Skylake-Server
Dec 15 04:32:30 localhost nova_compute[230642]: Skylake-Server-IBRS
Dec 15 04:32:30 localhost nova_compute[230642]: Skylake-Server-noTSX-IBRS
Dec 15 04:32:30 localhost nova_compute[230642]: Skylake-Server-v1
Dec 15 04:32:30 localhost nova_compute[230642]: Skylake-Server-v2
Dec 15 04:32:30 localhost nova_compute[230642]: Skylake-Server-v3
Dec 15 04:32:30 localhost nova_compute[230642]: Skylake-Server-v4
Dec 15 04:32:30 localhost nova_compute[230642]: Skylake-Server-v5
Dec 15 04:32:30 localhost nova_compute[230642]: Snowridge
Dec 15 04:32:30 localhost nova_compute[230642]: Snowridge-v1
Dec 15 04:32:30 localhost nova_compute[230642]: Snowridge-v2
Dec 15 04:32:30 localhost nova_compute[230642]: Snowridge-v3
Dec 15 04:32:30 localhost nova_compute[230642]: Snowridge-v4
Dec 15 04:32:30 localhost nova_compute[230642]: Westmere
Dec 15 04:32:30 localhost nova_compute[230642]: Westmere-IBRS
Dec 15 04:32:30 localhost nova_compute[230642]: Westmere-v1
Dec 15 04:32:30 localhost nova_compute[230642]: Westmere-v2
Dec 15 04:32:30 localhost nova_compute[230642]: athlon
Dec 15 04:32:30 localhost nova_compute[230642]: athlon-v1
Dec 15 04:32:30 localhost nova_compute[230642]: core2duo
Dec 15 04:32:30 localhost nova_compute[230642]: core2duo-v1
Dec 15 04:32:30 localhost nova_compute[230642]: coreduo
Dec 15 04:32:30 localhost nova_compute[230642]: coreduo-v1
nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: kvm32 Dec 15 04:32:30 localhost nova_compute[230642]: kvm32-v1 Dec 15 04:32:30 localhost nova_compute[230642]: kvm64 Dec 15 04:32:30 localhost nova_compute[230642]: kvm64-v1 Dec 15 04:32:30 localhost nova_compute[230642]: n270 Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: n270-v1 Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: pentium Dec 15 04:32:30 localhost nova_compute[230642]: pentium-v1 Dec 15 04:32:30 localhost nova_compute[230642]: pentium2 Dec 15 04:32:30 localhost nova_compute[230642]: pentium2-v1 Dec 15 04:32:30 localhost nova_compute[230642]: pentium3 Dec 15 04:32:30 localhost nova_compute[230642]: pentium3-v1 Dec 15 04:32:30 localhost nova_compute[230642]: phenom Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: phenom-v1 Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: qemu32 Dec 15 04:32:30 localhost nova_compute[230642]: qemu32-v1 Dec 15 04:32:30 localhost nova_compute[230642]: qemu64 Dec 15 04:32:30 localhost nova_compute[230642]: qemu64-v1 Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: file Dec 15 04:32:30 localhost 
nova_compute[230642]: anonymous Dec 15 04:32:30 localhost nova_compute[230642]: memfd Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: disk Dec 15 04:32:30 localhost nova_compute[230642]: cdrom Dec 15 04:32:30 localhost nova_compute[230642]: floppy Dec 15 04:32:30 localhost nova_compute[230642]: lun Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: ide Dec 15 04:32:30 localhost nova_compute[230642]: fdc Dec 15 04:32:30 localhost nova_compute[230642]: scsi Dec 15 04:32:30 localhost nova_compute[230642]: virtio Dec 15 04:32:30 localhost nova_compute[230642]: usb Dec 15 04:32:30 localhost nova_compute[230642]: sata Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: virtio Dec 15 04:32:30 localhost nova_compute[230642]: virtio-transitional Dec 15 04:32:30 localhost nova_compute[230642]: virtio-non-transitional Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: vnc Dec 15 04:32:30 localhost nova_compute[230642]: egl-headless Dec 15 04:32:30 localhost nova_compute[230642]: dbus Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: subsystem Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 
15 04:32:30 localhost nova_compute[230642]: default Dec 15 04:32:30 localhost nova_compute[230642]: mandatory Dec 15 04:32:30 localhost nova_compute[230642]: requisite Dec 15 04:32:30 localhost nova_compute[230642]: optional Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: usb Dec 15 04:32:30 localhost nova_compute[230642]: pci Dec 15 04:32:30 localhost nova_compute[230642]: scsi Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: virtio Dec 15 04:32:30 localhost nova_compute[230642]: virtio-transitional Dec 15 04:32:30 localhost nova_compute[230642]: virtio-non-transitional Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: random Dec 15 04:32:30 localhost nova_compute[230642]: egd Dec 15 04:32:30 localhost nova_compute[230642]: builtin Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: path Dec 15 04:32:30 localhost nova_compute[230642]: handle Dec 15 04:32:30 localhost nova_compute[230642]: virtiofs Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: tpm-tis Dec 15 04:32:30 localhost nova_compute[230642]: tpm-crb Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 
localhost nova_compute[230642]: emulator Dec 15 04:32:30 localhost nova_compute[230642]: external Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: 2.0 Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: usb Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: pty Dec 15 04:32:30 localhost nova_compute[230642]: unix Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: qemu Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: builtin Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: default Dec 15 04:32:30 localhost nova_compute[230642]: passt Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: isa Dec 15 04:32:30 localhost nova_compute[230642]: hyperv Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 
localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: null Dec 15 04:32:30 localhost nova_compute[230642]: vc Dec 15 04:32:30 localhost nova_compute[230642]: pty Dec 15 04:32:30 localhost nova_compute[230642]: dev Dec 15 04:32:30 localhost nova_compute[230642]: file Dec 15 04:32:30 localhost nova_compute[230642]: pipe Dec 15 04:32:30 localhost nova_compute[230642]: stdio Dec 15 04:32:30 localhost nova_compute[230642]: udp Dec 15 04:32:30 localhost nova_compute[230642]: tcp Dec 15 04:32:30 localhost nova_compute[230642]: unix Dec 15 04:32:30 localhost nova_compute[230642]: qemu-vdagent Dec 15 04:32:30 localhost nova_compute[230642]: dbus Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: relaxed Dec 15 04:32:30 localhost nova_compute[230642]: vapic Dec 15 04:32:30 localhost nova_compute[230642]: spinlocks Dec 15 04:32:30 localhost nova_compute[230642]: vpindex Dec 15 04:32:30 localhost nova_compute[230642]: runtime Dec 15 04:32:30 localhost nova_compute[230642]: synic Dec 15 04:32:30 localhost nova_compute[230642]: stimer Dec 15 04:32:30 localhost nova_compute[230642]: reset Dec 15 04:32:30 localhost nova_compute[230642]: vendor_id Dec 15 04:32:30 localhost nova_compute[230642]: frequencies Dec 15 04:32:30 localhost nova_compute[230642]: 
reenlightenment Dec 15 04:32:30 localhost nova_compute[230642]: tlbflush Dec 15 04:32:30 localhost nova_compute[230642]: ipi Dec 15 04:32:30 localhost nova_compute[230642]: avic Dec 15 04:32:30 localhost nova_compute[230642]: emsr_bitmap Dec 15 04:32:30 localhost nova_compute[230642]: xmm_input Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: 4095 Dec 15 04:32:30 localhost nova_compute[230642]: on Dec 15 04:32:30 localhost nova_compute[230642]: off Dec 15 04:32:30 localhost nova_compute[230642]: off Dec 15 04:32:30 localhost nova_compute[230642]: Linux KVM Hv Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: tdx Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: _get_domain_capabilities /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1037#033[00m Dec 15 04:32:30 localhost nova_compute[230642]: 2025-12-15 09:32:30.287 230646 DEBUG nova.virt.libvirt.host [None req-6dbb3ea6-5667-4c72-84e8-baa2852417c3 - - - - - -] Libvirt host hypervisor capabilities for arch=x86_64 and machine_type=q35: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: /usr/libexec/qemu-kvm Dec 15 04:32:30 localhost nova_compute[230642]: kvm Dec 15 04:32:30 localhost nova_compute[230642]: pc-q35-rhel9.8.0 Dec 15 04:32:30 localhost nova_compute[230642]: x86_64 Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost 
nova_compute[230642]: efi Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: /usr/share/edk2/ovmf/OVMF_CODE.secboot.fd Dec 15 04:32:30 localhost nova_compute[230642]: /usr/share/edk2/ovmf/OVMF_CODE.fd Dec 15 04:32:30 localhost nova_compute[230642]: /usr/share/edk2/ovmf/OVMF.amdsev.fd Dec 15 04:32:30 localhost nova_compute[230642]: /usr/share/edk2/ovmf/OVMF.inteltdx.secboot.fd Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: rom Dec 15 04:32:30 localhost nova_compute[230642]: pflash Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: yes Dec 15 04:32:30 localhost nova_compute[230642]: no Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: yes Dec 15 04:32:30 localhost nova_compute[230642]: no Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: on Dec 15 04:32:30 localhost nova_compute[230642]: off Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: on Dec 15 04:32:30 localhost nova_compute[230642]: off Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: EPYC-Rome Dec 15 04:32:30 localhost nova_compute[230642]: AMD Dec 15 04:32:30 localhost 
nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: 486 Dec 15 04:32:30 localhost nova_compute[230642]: 486-v1 Dec 15 04:32:30 localhost nova_compute[230642]: Broadwell Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Broadwell-IBRS Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 
localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Broadwell-noTSX Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Broadwell-noTSX-IBRS Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Broadwell-v1 Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Broadwell-v2 Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Broadwell-v3 Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Broadwell-v4 Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 
04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Cascadelake-Server Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Cascadelake-Server-noTSX Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Cascadelake-Server-v1 Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 
localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Cascadelake-Server-v2 Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Cascadelake-Server-v3 Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Cascadelake-Server-v4 Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost 
nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Cascadelake-Server-v5 Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Conroe Dec 15 04:32:30 localhost nova_compute[230642]: Conroe-v1 Dec 15 04:32:30 localhost nova_compute[230642]: Cooperlake Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 
04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Cooperlake-v1 Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Cooperlake-v2 Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Denverton 
Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Denverton-v1 Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Denverton-v2 Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Denverton-v3 Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dhyana Dec 15 04:32:30 localhost nova_compute[230642]: Dhyana-v1 Dec 15 04:32:30 localhost nova_compute[230642]: Dhyana-v2 Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: EPYC Dec 15 04:32:30 localhost nova_compute[230642]: EPYC-Genoa Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost 
Dec 15 04:32:30 localhost nova_compute[230642]: EPYC-Genoa-v1
Dec 15 04:32:30 localhost nova_compute[230642]: EPYC-IBPB
Dec 15 04:32:30 localhost nova_compute[230642]: EPYC-Milan
Dec 15 04:32:30 localhost nova_compute[230642]: EPYC-Milan-v1
Dec 15 04:32:30 localhost nova_compute[230642]: EPYC-Milan-v2
Dec 15 04:32:30 localhost nova_compute[230642]: EPYC-Rome
Dec 15 04:32:30 localhost nova_compute[230642]: EPYC-Rome-v1
Dec 15 04:32:30 localhost nova_compute[230642]: EPYC-Rome-v2
Dec 15 04:32:30 localhost nova_compute[230642]: EPYC-Rome-v3
Dec 15 04:32:30 localhost nova_compute[230642]: EPYC-Rome-v4
Dec 15 04:32:30 localhost nova_compute[230642]: EPYC-v1
Dec 15 04:32:30 localhost nova_compute[230642]: EPYC-v2
Dec 15 04:32:30 localhost nova_compute[230642]: EPYC-v3
Dec 15 04:32:30 localhost nova_compute[230642]: EPYC-v4
Dec 15 04:32:30 localhost nova_compute[230642]: GraniteRapids
Dec 15 04:32:30 localhost nova_compute[230642]: GraniteRapids-v1
Dec 15 04:32:30 localhost nova_compute[230642]: GraniteRapids-v2
Dec 15 04:32:30 localhost nova_compute[230642]: Haswell
Dec 15 04:32:30 localhost nova_compute[230642]: Haswell-IBRS
Dec 15 04:32:30 localhost nova_compute[230642]: Haswell-noTSX
Dec 15 04:32:30 localhost nova_compute[230642]: Haswell-noTSX-IBRS
Dec 15 04:32:30 localhost nova_compute[230642]: Haswell-v1
Dec 15 04:32:30 localhost nova_compute[230642]: Haswell-v2
Dec 15 04:32:30 localhost nova_compute[230642]: Haswell-v3
Dec 15 04:32:30 localhost nova_compute[230642]: Haswell-v4
Dec 15 04:32:30 localhost nova_compute[230642]: Icelake-Server
Dec 15 04:32:30 localhost nova_compute[230642]: Icelake-Server-noTSX
Dec 15 04:32:30 localhost nova_compute[230642]: Icelake-Server-v1
Dec 15 04:32:30 localhost nova_compute[230642]: Icelake-Server-v2
Dec 15 04:32:30 localhost nova_compute[230642]: Icelake-Server-v3
Dec 15 04:32:30 localhost nova_compute[230642]: Icelake-Server-v4
Dec 15 04:32:30 localhost nova_compute[230642]: Icelake-Server-v5
Dec 15 04:32:30 localhost nova_compute[230642]: Icelake-Server-v6
Dec 15 04:32:30 localhost nova_compute[230642]: Icelake-Server-v7
Dec 15 04:32:30 localhost nova_compute[230642]: IvyBridge
Dec 15 04:32:30 localhost nova_compute[230642]: IvyBridge-IBRS
Dec 15 04:32:30 localhost nova_compute[230642]: IvyBridge-v1
Dec 15 04:32:30 localhost nova_compute[230642]: IvyBridge-v2
Dec 15 04:32:30 localhost nova_compute[230642]: KnightsMill
Dec 15 04:32:30 localhost nova_compute[230642]: KnightsMill-v1
Dec 15 04:32:30 localhost nova_compute[230642]: Nehalem
Dec 15 04:32:30 localhost nova_compute[230642]: Nehalem-IBRS
Dec 15 04:32:30 localhost nova_compute[230642]: Nehalem-v1
Dec 15 04:32:30 localhost nova_compute[230642]: Nehalem-v2
Dec 15 04:32:30 localhost nova_compute[230642]: Opteron_G1
Dec 15 04:32:30 localhost nova_compute[230642]: Opteron_G1-v1
Dec 15 04:32:30 localhost nova_compute[230642]: Opteron_G2
Dec 15 04:32:30 localhost nova_compute[230642]: Opteron_G2-v1
Dec 15 04:32:30 localhost nova_compute[230642]: Opteron_G3
Dec 15 04:32:30 localhost nova_compute[230642]: Opteron_G3-v1
Dec 15 04:32:30 localhost nova_compute[230642]: Opteron_G4
Dec 15 04:32:30 localhost nova_compute[230642]: Opteron_G4-v1
Dec 15 04:32:30 localhost nova_compute[230642]: Opteron_G5
Dec 15 04:32:30 localhost nova_compute[230642]: Opteron_G5-v1
Dec 15 04:32:30 localhost nova_compute[230642]: Penryn
Dec 15 04:32:30 localhost nova_compute[230642]: Penryn-v1
Dec 15 04:32:30 localhost nova_compute[230642]: SandyBridge
Dec 15 04:32:30 localhost nova_compute[230642]: SandyBridge-IBRS
Dec 15 04:32:30 localhost nova_compute[230642]: SandyBridge-v1
Dec 15 04:32:30 localhost nova_compute[230642]: SandyBridge-v2
Dec 15 04:32:30 localhost nova_compute[230642]: SapphireRapids
Dec 15 04:32:30 localhost nova_compute[230642]: SapphireRapids-v1
Dec 15 04:32:30 localhost nova_compute[230642]: SapphireRapids-v2
Dec 15 04:32:30 localhost nova_compute[230642]: SapphireRapids-v3
Dec 15 04:32:30 localhost nova_compute[230642]: SierraForest
Dec 15 04:32:30 localhost nova_compute[230642]: SierraForest-v1
Dec 15 04:32:30 localhost nova_compute[230642]: Skylake-Client
Dec 15 04:32:30 localhost nova_compute[230642]: Skylake-Client-IBRS
Dec 15 04:32:30 localhost nova_compute[230642]: Skylake-Client-noTSX-IBRS
nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Skylake-Client-v1 Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Skylake-Client-v2 Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Skylake-Client-v3 Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Skylake-Client-v4 Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Skylake-Server Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 
localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Skylake-Server-IBRS Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Skylake-Server-noTSX-IBRS Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Skylake-Server-v1 Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost 
nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Skylake-Server-v2 Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Skylake-Server-v3 Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Skylake-Server-v4 Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 
15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Skylake-Server-v5 Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Snowridge Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Snowridge-v1 Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost 
nova_compute[230642]: Snowridge-v2 Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Snowridge-v3 Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Snowridge-v4 Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Westmere Dec 15 04:32:30 localhost nova_compute[230642]: Westmere-IBRS Dec 15 04:32:30 localhost nova_compute[230642]: Westmere-v1 Dec 15 04:32:30 localhost nova_compute[230642]: Westmere-v2 Dec 15 04:32:30 localhost nova_compute[230642]: athlon Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: athlon-v1 
Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: core2duo Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: core2duo-v1 Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: coreduo Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: coreduo-v1 Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: kvm32 Dec 15 04:32:30 localhost nova_compute[230642]: kvm32-v1 Dec 15 04:32:30 localhost nova_compute[230642]: kvm64 Dec 15 04:32:30 localhost nova_compute[230642]: kvm64-v1 Dec 15 04:32:30 localhost nova_compute[230642]: n270 Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: n270-v1 Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: pentium Dec 15 04:32:30 localhost nova_compute[230642]: pentium-v1 Dec 15 04:32:30 localhost nova_compute[230642]: pentium2 Dec 15 04:32:30 localhost nova_compute[230642]: pentium2-v1 Dec 15 04:32:30 localhost nova_compute[230642]: pentium3 Dec 15 04:32:30 localhost nova_compute[230642]: pentium3-v1 Dec 15 04:32:30 localhost 
nova_compute[230642]: phenom Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: phenom-v1 Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: qemu32 Dec 15 04:32:30 localhost nova_compute[230642]: qemu32-v1 Dec 15 04:32:30 localhost nova_compute[230642]: qemu64 Dec 15 04:32:30 localhost nova_compute[230642]: qemu64-v1 Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: file Dec 15 04:32:30 localhost nova_compute[230642]: anonymous Dec 15 04:32:30 localhost nova_compute[230642]: memfd Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: disk Dec 15 04:32:30 localhost nova_compute[230642]: cdrom Dec 15 04:32:30 localhost nova_compute[230642]: floppy Dec 15 04:32:30 localhost nova_compute[230642]: lun Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: fdc Dec 15 04:32:30 localhost nova_compute[230642]: scsi Dec 15 04:32:30 localhost nova_compute[230642]: virtio Dec 15 04:32:30 localhost nova_compute[230642]: usb Dec 15 04:32:30 localhost nova_compute[230642]: sata Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost 
nova_compute[230642]: virtio Dec 15 04:32:30 localhost nova_compute[230642]: virtio-transitional Dec 15 04:32:30 localhost nova_compute[230642]: virtio-non-transitional Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: vnc Dec 15 04:32:30 localhost nova_compute[230642]: egl-headless Dec 15 04:32:30 localhost nova_compute[230642]: dbus Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: subsystem Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: default Dec 15 04:32:30 localhost nova_compute[230642]: mandatory Dec 15 04:32:30 localhost nova_compute[230642]: requisite Dec 15 04:32:30 localhost nova_compute[230642]: optional Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: usb Dec 15 04:32:30 localhost nova_compute[230642]: pci Dec 15 04:32:30 localhost nova_compute[230642]: scsi Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: virtio Dec 15 04:32:30 localhost nova_compute[230642]: virtio-transitional Dec 15 04:32:30 localhost nova_compute[230642]: virtio-non-transitional Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 
localhost nova_compute[230642]: random Dec 15 04:32:30 localhost nova_compute[230642]: egd Dec 15 04:32:30 localhost nova_compute[230642]: builtin Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: path Dec 15 04:32:30 localhost nova_compute[230642]: handle Dec 15 04:32:30 localhost nova_compute[230642]: virtiofs Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: tpm-tis Dec 15 04:32:30 localhost nova_compute[230642]: tpm-crb Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: emulator Dec 15 04:32:30 localhost nova_compute[230642]: external Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: 2.0 Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: usb Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: pty Dec 15 04:32:30 localhost nova_compute[230642]: unix Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: qemu 
Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: builtin Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: default Dec 15 04:32:30 localhost nova_compute[230642]: passt Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: isa Dec 15 04:32:30 localhost nova_compute[230642]: hyperv Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: null Dec 15 04:32:30 localhost nova_compute[230642]: vc Dec 15 04:32:30 localhost nova_compute[230642]: pty Dec 15 04:32:30 localhost nova_compute[230642]: dev Dec 15 04:32:30 localhost nova_compute[230642]: file Dec 15 04:32:30 localhost nova_compute[230642]: pipe Dec 15 04:32:30 localhost nova_compute[230642]: stdio Dec 15 04:32:30 localhost nova_compute[230642]: udp Dec 15 04:32:30 localhost nova_compute[230642]: tcp Dec 15 04:32:30 localhost nova_compute[230642]: unix Dec 15 04:32:30 localhost nova_compute[230642]: qemu-vdagent Dec 15 04:32:30 localhost nova_compute[230642]: dbus Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost 
nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: relaxed Dec 15 04:32:30 localhost nova_compute[230642]: vapic Dec 15 04:32:30 localhost nova_compute[230642]: spinlocks Dec 15 04:32:30 localhost nova_compute[230642]: vpindex Dec 15 04:32:30 localhost nova_compute[230642]: runtime Dec 15 04:32:30 localhost nova_compute[230642]: synic Dec 15 04:32:30 localhost nova_compute[230642]: stimer Dec 15 04:32:30 localhost nova_compute[230642]: reset Dec 15 04:32:30 localhost nova_compute[230642]: vendor_id Dec 15 04:32:30 localhost nova_compute[230642]: frequencies Dec 15 04:32:30 localhost nova_compute[230642]: reenlightenment Dec 15 04:32:30 localhost nova_compute[230642]: tlbflush Dec 15 04:32:30 localhost nova_compute[230642]: ipi Dec 15 04:32:30 localhost nova_compute[230642]: avic Dec 15 04:32:30 localhost nova_compute[230642]: emsr_bitmap Dec 15 04:32:30 localhost nova_compute[230642]: xmm_input Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: 4095 Dec 15 04:32:30 localhost nova_compute[230642]: on Dec 15 04:32:30 localhost nova_compute[230642]: off Dec 15 04:32:30 localhost nova_compute[230642]: off Dec 15 04:32:30 localhost nova_compute[230642]: Linux KVM Hv Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: tdx Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost nova_compute[230642]: Dec 15 04:32:30 localhost 
Dec 15 04:32:30 localhost nova_compute[230642]: _get_domain_capabilities /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1037#033[00m
Dec 15 04:32:30 localhost nova_compute[230642]: 2025-12-15 09:32:30.334 230646 DEBUG nova.virt.libvirt.host [None req-6dbb3ea6-5667-4c72-84e8-baa2852417c3 - - - - - -] Checking secure boot support for host arch (x86_64) supports_secure_boot /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1782#033[00m
Dec 15 04:32:30 localhost nova_compute[230642]: 2025-12-15 09:32:30.334 230646 DEBUG nova.virt.libvirt.host [None req-6dbb3ea6-5667-4c72-84e8-baa2852417c3 - - - - - -] Checking secure boot support for host arch (x86_64) supports_secure_boot /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1782#033[00m
Dec 15 04:32:30 localhost nova_compute[230642]: 2025-12-15 09:32:30.334 230646 DEBUG nova.virt.libvirt.host [None req-6dbb3ea6-5667-4c72-84e8-baa2852417c3 - - - - - -] Checking secure boot support for host arch (x86_64) supports_secure_boot /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1782#033[00m
Dec 15 04:32:30 localhost nova_compute[230642]: 2025-12-15 09:32:30.334 230646 INFO nova.virt.libvirt.host [None req-6dbb3ea6-5667-4c72-84e8-baa2852417c3 - - - - - -] Secure Boot support detected#033[00m
Dec 15 04:32:30 localhost nova_compute[230642]: 2025-12-15 09:32:30.338 230646 INFO nova.virt.libvirt.driver [None req-6dbb3ea6-5667-4c72-84e8-baa2852417c3 - - - - - -] The live_migration_permit_post_copy is set to True and post copy live migration is available so auto-converge will not be in use.#033[00m
Dec 15 04:32:30 localhost nova_compute[230642]: 2025-12-15 09:32:30.338 230646 INFO nova.virt.libvirt.driver [None req-6dbb3ea6-5667-4c72-84e8-baa2852417c3 - - - - - -] The live_migration_permit_post_copy is set to True and post copy live migration is available so auto-converge will not be in use.#033[00m
Dec 15 04:32:30 localhost nova_compute[230642]: 2025-12-15 09:32:30.348 230646 DEBUG nova.virt.libvirt.driver [None req-6dbb3ea6-5667-4c72-84e8-baa2852417c3 - - - - - -] Enabling emulated TPM support _check_vtpm_support /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:1097#033[00m
Dec 15 04:32:30 localhost nova_compute[230642]: 2025-12-15 09:32:30.387 230646 INFO nova.virt.node [None req-6dbb3ea6-5667-4c72-84e8-baa2852417c3 - - - - - -] Determined node identity 26c8956b-6742-4951-b566-971b9bbe323b from /var/lib/nova/compute_id#033[00m
Dec 15 04:32:30 localhost nova_compute[230642]: 2025-12-15 09:32:30.409 230646 DEBUG nova.compute.manager [None req-6dbb3ea6-5667-4c72-84e8-baa2852417c3 - - - - - -] Verified node 26c8956b-6742-4951-b566-971b9bbe323b matches my host np0005559462.localdomain _check_for_host_rename /usr/lib/python3.9/site-packages/nova/compute/manager.py:1568#033[00m
Dec 15 04:32:30 localhost nova_compute[230642]: 2025-12-15 09:32:30.451 230646 DEBUG nova.compute.manager [None req-6dbb3ea6-5667-4c72-84e8-baa2852417c3 - - - - - -] [instance: 39ff1bd9-6f6b-44c8-bbec-a1fd9d196359] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Dec 15 04:32:30 localhost nova_compute[230642]: 2025-12-15 09:32:30.456 230646 DEBUG nova.virt.libvirt.vif [None req-6dbb3ea6-5667-4c72-84e8-baa2852417c3 - - - - - -] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-12-15T08:29:49Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=,disable_terminate=False,display_description='test',display_name='test',ec2_ids=,ephemeral_gb=1,ephemeral_key_uuid=None,fault=,flavor=,hidden=False,host='np0005559462.localdomain',hostname='test',id=2,image_ref='7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e',info_cache=InstanceInfoCache,instance_type_id=2,kernel_id='',key_data=None,key_name=None,keypairs=,launch_index=0,launched_at=2025-12-15T08:30:01Z,launched_on='np0005559462.localdomain',locked=False,locked_by=None,memory_mb=512,metadata={},migration_context=,new_flavor=,node='np0005559462.localdomain',numa_topology=None,old_flavor=,os_type=None,pci_devices=,pci_requests=,power_state=1,progress=0,project_id='c785bf23f53946bc99867d8832a50266',ramdisk_id='',reservation_id='r-e1tbwc05',resources=,root_device_name='/dev/vda',root_gb=1,security_groups=,services=,shutdown_terminate=False,system_metadata=,tags=,task_state=None,terminated_at=None,trusted_certs=,updated_at=2025-12-15T08:30:01Z,user_data=None,user_id='1ba5fce347b64bfebf995f187193f205',uuid=39ff1bd9-6f6b-44c8-bbec-a1fd9d196359,vcpu_model=,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "03ef8889-3216-43fb-8a52-4be17a956ce1", "address": "fa:16:3e:74:df:7c", "network": {"id": "befb7a72-17a9-4bcb-b561-84b8f626685a", "bridge": "br-int", "label": "private", "subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.201", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.20", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"dhcp_server": "192.168.0.1"}}], "meta": {"injected": false, "tenant_id": "c785bf23f53946bc99867d8832a50266", "mtu": 1292, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "datapath_type": "system"}, "devname": "tap03ef8889-32", "ovs_interfaceid": "03ef8889-3216-43fb-8a52-4be17a956ce1", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Dec 15 04:32:30 localhost nova_compute[230642]: 2025-12-15 09:32:30.456 230646 DEBUG nova.network.os_vif_util [None req-6dbb3ea6-5667-4c72-84e8-baa2852417c3 - - - - - -] Converting VIF {"id": "03ef8889-3216-43fb-8a52-4be17a956ce1", "address": "fa:16:3e:74:df:7c", "network": {"id": "befb7a72-17a9-4bcb-b561-84b8f626685a", "bridge": "br-int", "label": "private", "subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.201", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.20", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"dhcp_server": "192.168.0.1"}}], "meta": {"injected": false, "tenant_id": "c785bf23f53946bc99867d8832a50266", "mtu": 1292, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "datapath_type": "system"}, "devname": "tap03ef8889-32", "ovs_interfaceid": "03ef8889-3216-43fb-8a52-4be17a956ce1", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Dec 15 04:32:30 localhost nova_compute[230642]: 2025-12-15 09:32:30.457 230646 DEBUG nova.network.os_vif_util [None req-6dbb3ea6-5667-4c72-84e8-baa2852417c3 - - - - - -] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:74:df:7c,bridge_name='br-int',has_traffic_filtering=True,id=03ef8889-3216-43fb-8a52-4be17a956ce1,network=Network(befb7a72-17a9-4bcb-b561-84b8f626685a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap03ef8889-32') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Dec 15 04:32:30 localhost nova_compute[230642]: 2025-12-15 09:32:30.458 230646 DEBUG os_vif [None req-6dbb3ea6-5667-4c72-84e8-baa2852417c3 - - - - - -] Plugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:74:df:7c,bridge_name='br-int',has_traffic_filtering=True,id=03ef8889-3216-43fb-8a52-4be17a956ce1,network=Network(befb7a72-17a9-4bcb-b561-84b8f626685a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap03ef8889-32') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Dec 15 04:32:30 localhost nova_compute[230642]: 2025-12-15 09:32:30.544 230646 DEBUG ovsdbapp.backend.ovs_idl [None req-6dbb3ea6-5667-4c72-84e8-baa2852417c3 - - - - - -] Created schema index Interface.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106#033[00m
Dec 15 04:32:30 localhost nova_compute[230642]: 2025-12-15 09:32:30.544 230646 DEBUG ovsdbapp.backend.ovs_idl [None req-6dbb3ea6-5667-4c72-84e8-baa2852417c3 - - - - - -] Created schema index Port.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106#033[00m
Dec 15 04:32:30 localhost nova_compute[230642]: 2025-12-15 09:32:30.544 230646 DEBUG ovsdbapp.backend.ovs_idl [None req-6dbb3ea6-5667-4c72-84e8-baa2852417c3 - - - - - -] Created schema index Bridge.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106#033[00m
Dec 15 04:32:30 localhost nova_compute[230642]: 2025-12-15 09:32:30.545 230646 DEBUG ovsdbapp.backend.ovs_idl.vlog [None req-6dbb3ea6-5667-4c72-84e8-baa2852417c3 - - - - - -] tcp:127.0.0.1:6640: entering CONNECTING _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m
Dec 15 04:32:30 localhost nova_compute[230642]: 2025-12-15 09:32:30.545 230646 DEBUG ovsdbapp.backend.ovs_idl.vlog [None req-6dbb3ea6-5667-4c72-84e8-baa2852417c3 - - - - - -] [POLLOUT] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 15 04:32:30 localhost nova_compute[230642]: 2025-12-15 09:32:30.545 230646 DEBUG ovsdbapp.backend.ovs_idl.vlog [None req-6dbb3ea6-5667-4c72-84e8-baa2852417c3 - - - - - -] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m
Dec 15 04:32:30 localhost nova_compute[230642]: 2025-12-15 09:32:30.546 230646 DEBUG ovsdbapp.backend.ovs_idl.vlog [None req-6dbb3ea6-5667-4c72-84e8-baa2852417c3 - - - - - -] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 15 04:32:30 localhost nova_compute[230642]: 2025-12-15 09:32:30.546 230646 DEBUG ovsdbapp.backend.ovs_idl.vlog [None req-6dbb3ea6-5667-4c72-84e8-baa2852417c3 - - - - - -] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 15 04:32:30 localhost nova_compute[230642]: 2025-12-15 09:32:30.550 230646 DEBUG ovsdbapp.backend.ovs_idl.vlog [None req-6dbb3ea6-5667-4c72-84e8-baa2852417c3 - - - - - -] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 15 04:32:30 localhost nova_compute[230642]: 2025-12-15 09:32:30.561 230646 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 15 04:32:30 localhost nova_compute[230642]: 2025-12-15 09:32:30.561 230646 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit
/usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m Dec 15 04:32:30 localhost nova_compute[230642]: 2025-12-15 09:32:30.561 230646 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m Dec 15 04:32:30 localhost nova_compute[230642]: 2025-12-15 09:32:30.562 230646 INFO oslo.privsep.daemon [None req-6dbb3ea6-5667-4c72-84e8-baa2852417c3 - - - - - -] Running privsep helper: ['sudo', 'nova-rootwrap', '/etc/nova/rootwrap.conf', 'privsep-helper', '--config-file', '/etc/nova/nova.conf', '--config-file', '/etc/nova/nova-compute.conf', '--config-dir', '/etc/nova/nova.conf.d', '--privsep_context', 'vif_plug_ovs.privsep.vif_plug', '--privsep_sock_path', '/tmp/tmpsfnh77wb/privsep.sock']#033[00m Dec 15 04:32:30 localhost nova_compute[230642]: 2025-12-15 09:32:30.915 230646 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 04:32:31 localhost python3.9[231029]: ansible-ansible.builtin.stat Invoked with path=/etc/systemd/system/edpm_nova_nvme_cleaner.service.requires follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1 Dec 15 04:32:31 localhost nova_compute[230642]: 2025-12-15 09:32:31.181 230646 INFO oslo.privsep.daemon [None req-6dbb3ea6-5667-4c72-84e8-baa2852417c3 - - - - - -] Spawned new privsep daemon via rootwrap#033[00m Dec 15 04:32:31 localhost nova_compute[230642]: 2025-12-15 09:32:31.071 231041 INFO oslo.privsep.daemon [-] privsep daemon starting#033[00m Dec 15 04:32:31 localhost nova_compute[230642]: 2025-12-15 09:32:31.074 231041 INFO oslo.privsep.daemon [-] privsep process running with uid/gid: 0/0#033[00m Dec 15 04:32:31 localhost nova_compute[230642]: 2025-12-15 09:32:31.075 231041 INFO oslo.privsep.daemon [-] privsep process running with capabilities (eff/prm/inh): 
CAP_DAC_OVERRIDE|CAP_NET_ADMIN/CAP_DAC_OVERRIDE|CAP_NET_ADMIN/none#033[00m Dec 15 04:32:31 localhost nova_compute[230642]: 2025-12-15 09:32:31.075 231041 INFO oslo.privsep.daemon [-] privsep daemon running as pid 231041#033[00m Dec 15 04:32:31 localhost nova_compute[230642]: 2025-12-15 09:32:31.474 230646 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 04:32:31 localhost nova_compute[230642]: 2025-12-15 09:32:31.475 230646 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap03ef8889-32, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m Dec 15 04:32:31 localhost nova_compute[230642]: 2025-12-15 09:32:31.475 230646 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap03ef8889-32, col_values=(('external_ids', {'iface-id': '03ef8889-3216-43fb-8a52-4be17a956ce1', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:74:df:7c', 'vm-uuid': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m Dec 15 04:32:31 localhost nova_compute[230642]: 2025-12-15 09:32:31.476 230646 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m Dec 15 04:32:31 localhost nova_compute[230642]: 2025-12-15 09:32:31.476 230646 INFO os_vif [None req-6dbb3ea6-5667-4c72-84e8-baa2852417c3 - - - - - -] Successfully plugged vif 
VIFOpenVSwitch(active=True,address=fa:16:3e:74:df:7c,bridge_name='br-int',has_traffic_filtering=True,id=03ef8889-3216-43fb-8a52-4be17a956ce1,network=Network(befb7a72-17a9-4bcb-b561-84b8f626685a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap03ef8889-32')#033[00m Dec 15 04:32:31 localhost nova_compute[230642]: 2025-12-15 09:32:31.476 230646 DEBUG nova.compute.manager [None req-6dbb3ea6-5667-4c72-84e8-baa2852417c3 - - - - - -] [instance: 39ff1bd9-6f6b-44c8-bbec-a1fd9d196359] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m Dec 15 04:32:31 localhost nova_compute[230642]: 2025-12-15 09:32:31.480 230646 DEBUG nova.compute.manager [None req-6dbb3ea6-5667-4c72-84e8-baa2852417c3 - - - - - -] [instance: 39ff1bd9-6f6b-44c8-bbec-a1fd9d196359] Current state is 1, state in DB is 1. _init_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:1304#033[00m Dec 15 04:32:31 localhost nova_compute[230642]: 2025-12-15 09:32:31.480 230646 INFO nova.compute.manager [None req-6dbb3ea6-5667-4c72-84e8-baa2852417c3 - - - - - -] Looking for unclaimed instances stuck in BUILDING status for nodes managed by this host#033[00m Dec 15 04:32:31 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:fc:1f:13 MACDST=fa:16:3e:b3:bb:e3 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=62363 DF PROTO=TCP SPT=35152 DPT=9882 SEQ=386127919 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A839CFC390000000001030307) Dec 15 04:32:31 localhost nova_compute[230642]: 2025-12-15 09:32:31.893 230646 INFO nova.service [None req-6dbb3ea6-5667-4c72-84e8-baa2852417c3 - - - - - -] Updating service version for nova-compute on np0005559462.localdomain from 57 to 66#033[00m Dec 15 04:32:31 localhost nova_compute[230642]: 2025-12-15 09:32:31.924 230646 DEBUG oslo_concurrency.lockutils [None req-6dbb3ea6-5667-4c72-84e8-baa2852417c3 - - - - - -] Acquiring lock 
"compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Dec 15 04:32:31 localhost nova_compute[230642]: 2025-12-15 09:32:31.924 230646 DEBUG oslo_concurrency.lockutils [None req-6dbb3ea6-5667-4c72-84e8-baa2852417c3 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Dec 15 04:32:31 localhost nova_compute[230642]: 2025-12-15 09:32:31.925 230646 DEBUG oslo_concurrency.lockutils [None req-6dbb3ea6-5667-4c72-84e8-baa2852417c3 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Dec 15 04:32:31 localhost nova_compute[230642]: 2025-12-15 09:32:31.925 230646 DEBUG nova.compute.resource_tracker [None req-6dbb3ea6-5667-4c72-84e8-baa2852417c3 - - - - - -] Auditing locally available compute resources for np0005559462.localdomain (node: np0005559462.localdomain) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m Dec 15 04:32:31 localhost nova_compute[230642]: 2025-12-15 09:32:31.926 230646 DEBUG oslo_concurrency.processutils [None req-6dbb3ea6-5667-4c72-84e8-baa2852417c3 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m Dec 15 04:32:32 localhost python3.9[231211]: ansible-containers.podman.podman_container Invoked with name=nova_nvme_cleaner state=absent executable=podman detach=True debug=False force_restart=False force_delete=True generate_systemd={} image_strict=False recreate=False image=None annotation=None arch=None attach=None 
authfile=None blkio_weight=None blkio_weight_device=None cap_add=None cap_drop=None cgroup_conf=None cgroup_parent=None cgroupns=None cgroups=None chrootdirs=None cidfile=None cmd_args=None conmon_pidfile=None command=None cpu_period=None cpu_quota=None cpu_rt_period=None cpu_rt_runtime=None cpu_shares=None cpus=None cpuset_cpus=None cpuset_mems=None decryption_key=None delete_depend=None delete_time=None delete_volumes=None detach_keys=None device=None device_cgroup_rule=None device_read_bps=None device_read_iops=None device_write_bps=None device_write_iops=None dns=None dns_option=None dns_search=None entrypoint=None env=None env_file=None env_host=None env_merge=None etc_hosts=None expose=None gidmap=None gpus=None group_add=None group_entry=None healthcheck=None healthcheck_interval=None healthcheck_retries=None healthcheck_start_period=None health_startup_cmd=None health_startup_interval=None health_startup_retries=None health_startup_success=None health_startup_timeout=None healthcheck_timeout=None healthcheck_failure_action=None hooks_dir=None hostname=None hostuser=None http_proxy=None image_volume=None init=None init_ctr=None init_path=None interactive=None ip=None ip6=None ipc=None kernel_memory=None label=None label_file=None log_driver=None log_level=None log_opt=None mac_address=None memory=None memory_reservation=None memory_swap=None memory_swappiness=None mount=None network=None network_aliases=None no_healthcheck=None no_hosts=None oom_kill_disable=None oom_score_adj=None os=None passwd=None passwd_entry=None personality=None pid=None pid_file=None pids_limit=None platform=None pod=None pod_id_file=None preserve_fd=None preserve_fds=None privileged=None publish=None publish_all=None pull=None quadlet_dir=None quadlet_filename=None quadlet_file_mode=None quadlet_options=None rdt_class=None read_only=None read_only_tmpfs=None requires=None restart_policy=None restart_time=None retry=None retry_delay=None rm=None rmi=None rootfs=None 
seccomp_policy=None secrets=NOT_LOGGING_PARAMETER sdnotify=None security_opt=None shm_size=None shm_size_systemd=None sig_proxy=None stop_signal=None stop_timeout=None stop_time=None subgidname=None subuidname=None sysctl=None systemd=None timeout=None timezone=None tls_verify=None tmpfs=None tty=None uidmap=None ulimit=None umask=None unsetenv=None unsetenv_all=None user=None userns=None uts=None variant=None volume=None volumes_from=None workdir=None Dec 15 04:32:32 localhost systemd-journald[47230]: Field hash table of /run/log/journal/738a39f68bc78fb81032e509449fb759/system.journal has a fill level at 121.3 (404 of 333 items), suggesting rotation. Dec 15 04:32:32 localhost systemd-journald[47230]: /run/log/journal/738a39f68bc78fb81032e509449fb759/system.journal: Journal header limits reached or header out-of-date, rotating. Dec 15 04:32:32 localhost rsyslogd[759]: imjournal: journal files changed, reloading... [v8.2102.0-111.el9 try https://www.rsyslog.com/e/0 ] Dec 15 04:32:32 localhost rsyslogd[759]: imjournal: journal files changed, reloading... 
[v8.2102.0-111.el9 try https://www.rsyslog.com/e/0 ] Dec 15 04:32:32 localhost nova_compute[230642]: 2025-12-15 09:32:32.384 230646 DEBUG oslo_concurrency.processutils [None req-6dbb3ea6-5667-4c72-84e8-baa2852417c3 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.458s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m Dec 15 04:32:32 localhost nova_compute[230642]: 2025-12-15 09:32:32.462 230646 DEBUG nova.virt.libvirt.driver [None req-6dbb3ea6-5667-4c72-84e8-baa2852417c3 - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m Dec 15 04:32:32 localhost nova_compute[230642]: 2025-12-15 09:32:32.463 230646 DEBUG nova.virt.libvirt.driver [None req-6dbb3ea6-5667-4c72-84e8-baa2852417c3 - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m Dec 15 04:32:32 localhost systemd[1]: Started libvirt nodedev daemon. Dec 15 04:32:32 localhost nova_compute[230642]: 2025-12-15 09:32:32.841 230646 WARNING nova.virt.libvirt.driver [None req-6dbb3ea6-5667-4c72-84e8-baa2852417c3 - - - - - -] This host appears to have multiple sockets per NUMA node. 
The `socket` PCI NUMA affinity will not be supported.#033[00m Dec 15 04:32:32 localhost nova_compute[230642]: 2025-12-15 09:32:32.845 230646 DEBUG nova.compute.resource_tracker [None req-6dbb3ea6-5667-4c72-84e8-baa2852417c3 - - - - - -] Hypervisor/Node resource view: name=np0005559462.localdomain free_ram=12951MB free_disk=41.83720779418945GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": 
"1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m Dec 15 04:32:32 localhost nova_compute[230642]: 2025-12-15 09:32:32.846 230646 DEBUG oslo_concurrency.lockutils [None req-6dbb3ea6-5667-4c72-84e8-baa2852417c3 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Dec 15 04:32:32 localhost nova_compute[230642]: 2025-12-15 09:32:32.846 230646 DEBUG oslo_concurrency.lockutils [None req-6dbb3ea6-5667-4c72-84e8-baa2852417c3 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Dec 15 04:32:32 localhost nova_compute[230642]: 2025-12-15 09:32:32.979 230646 DEBUG nova.compute.resource_tracker [None req-6dbb3ea6-5667-4c72-84e8-baa2852417c3 - - - - - -] Instance 39ff1bd9-6f6b-44c8-bbec-a1fd9d196359 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 2, 'MEMORY_MB': 512, 'VCPU': 1}}. 
_remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m Dec 15 04:32:32 localhost nova_compute[230642]: 2025-12-15 09:32:32.980 230646 DEBUG nova.compute.resource_tracker [None req-6dbb3ea6-5667-4c72-84e8-baa2852417c3 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m Dec 15 04:32:32 localhost nova_compute[230642]: 2025-12-15 09:32:32.981 230646 DEBUG nova.compute.resource_tracker [None req-6dbb3ea6-5667-4c72-84e8-baa2852417c3 - - - - - -] Final resource view: name=np0005559462.localdomain phys_ram=15738MB used_ram=1024MB phys_disk=41GB used_disk=2GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m Dec 15 04:32:32 localhost nova_compute[230642]: 2025-12-15 09:32:32.997 230646 DEBUG nova.scheduler.client.report [None req-6dbb3ea6-5667-4c72-84e8-baa2852417c3 - - - - - -] Refreshing inventories for resource provider 26c8956b-6742-4951-b566-971b9bbe323b _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804#033[00m Dec 15 04:32:33 localhost nova_compute[230642]: 2025-12-15 09:32:33.019 230646 DEBUG nova.scheduler.client.report [None req-6dbb3ea6-5667-4c72-84e8-baa2852417c3 - - - - - -] Updating ProviderTree inventory for provider 26c8956b-6742-4951-b566-971b9bbe323b from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 0, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768#033[00m Dec 15 
04:32:33 localhost nova_compute[230642]: 2025-12-15 09:32:33.019 230646 DEBUG nova.compute.provider_tree [None req-6dbb3ea6-5667-4c72-84e8-baa2852417c3 - - - - - -] Updating inventory in ProviderTree for provider 26c8956b-6742-4951-b566-971b9bbe323b with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 0, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176#033[00m Dec 15 04:32:33 localhost python3.9[231410]: ansible-ansible.builtin.systemd Invoked with name=edpm_nova_compute.service state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None Dec 15 04:32:33 localhost nova_compute[230642]: 2025-12-15 09:32:33.072 230646 DEBUG nova.scheduler.client.report [None req-6dbb3ea6-5667-4c72-84e8-baa2852417c3 - - - - - -] Refreshing aggregate associations for resource provider 26c8956b-6742-4951-b566-971b9bbe323b, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813#033[00m Dec 15 04:32:33 localhost systemd[1]: Stopping nova_compute container... 
Dec 15 04:32:33 localhost nova_compute[230642]: 2025-12-15 09:32:33.095 230646 DEBUG nova.scheduler.client.report [None req-6dbb3ea6-5667-4c72-84e8-baa2852417c3 - - - - - -] Refreshing trait associations for resource provider 26c8956b-6742-4951-b566-971b9bbe323b, traits: COMPUTE_NET_VIF_MODEL_LAN9118,HW_CPU_X86_SHA,HW_CPU_X86_BMI,COMPUTE_NET_VIF_MODEL_RTL8139,COMPUTE_TRUSTED_CERTS,COMPUTE_VOLUME_EXTEND,COMPUTE_STORAGE_BUS_VIRTIO,COMPUTE_GRAPHICS_MODEL_CIRRUS,HW_CPU_X86_ABM,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_NET_VIF_MODEL_VIRTIO,HW_CPU_X86_F16C,HW_CPU_X86_SSE4A,COMPUTE_STORAGE_BUS_SCSI,HW_CPU_X86_SSE41,HW_CPU_X86_AESNI,COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_IMAGE_TYPE_ARI,COMPUTE_RESCUE_BFV,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_STORAGE_BUS_IDE,HW_CPU_X86_SSE42,COMPUTE_DEVICE_TAGGING,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_VOLUME_MULTI_ATTACH,HW_CPU_X86_FMA3,COMPUTE_STORAGE_BUS_SATA,COMPUTE_IMAGE_TYPE_AKI,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,HW_CPU_X86_AMD_SVM,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_NODE,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_IMAGE_TYPE_RAW,COMPUTE_NET_VIF_MODEL_PCNET,HW_CPU_X86_SVM,HW_CPU_X86_SSE2,HW_CPU_X86_AVX2,HW_CPU_X86_BMI2,HW_CPU_X86_AVX,HW_CPU_X86_MMX,COMPUTE_ACCELERATORS,HW_CPU_X86_SSSE3,COMPUTE_NET_VIF_MODEL_E1000E,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_STORAGE_BUS_USB,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,COMPUTE_STORAGE_BUS_FDC,COMPUTE_SECURITY_UEFI_SECURE_BOOT,HW_CPU_X86_SSE,HW_CPU_X86_CLMUL,COMPUTE_IMAGE_TYPE_AMI _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825#033[00m Dec 15 04:32:33 localhost nova_compute[230642]: 2025-12-15 09:32:33.129 230646 DEBUG oslo_concurrency.processutils [None req-6dbb3ea6-5667-4c72-84e8-baa2852417c3 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute 
/usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m Dec 15 04:32:33 localhost journal[204381]: libvirt version: 11.9.0, package: 1.el9 (builder@centos.org, 2025-11-04-09:54:50, ) Dec 15 04:32:33 localhost journal[204381]: hostname: np0005559462.localdomain Dec 15 04:32:33 localhost journal[204381]: End of file while reading data: Input/output error Dec 15 04:32:33 localhost systemd[1]: libpod-b5d335c872cd511e4f9ef497ade55685828922bcddd4d43ed3b0589e6eb54c6a.scope: Deactivated successfully. Dec 15 04:32:33 localhost systemd[1]: libpod-b5d335c872cd511e4f9ef497ade55685828922bcddd4d43ed3b0589e6eb54c6a.scope: Consumed 3.995s CPU time. Dec 15 04:32:33 localhost podman[231414]: 2025-12-15 09:32:33.197640523 +0000 UTC m=+0.109221873 container died b5d335c872cd511e4f9ef497ade55685828922bcddd4d43ed3b0589e6eb54c6a (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, container_name=nova_compute, managed_by=edpm_ansible, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=edpm, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'pid': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', 
'/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']}) Dec 15 04:32:34 localhost systemd[1]: var-lib-containers-storage-overlay-90c3d76968c2380582c34ed47ce3bd3c1017e3f0027bb2b60c7c3fb4e76ff2b1-merged.mount: Deactivated successfully. Dec 15 04:32:34 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-b5d335c872cd511e4f9ef497ade55685828922bcddd4d43ed3b0589e6eb54c6a-userdata-shm.mount: Deactivated successfully. Dec 15 04:32:34 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:fc:1f:13 MACDST=fa:16:3e:b3:bb:e3 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=62365 DF PROTO=TCP SPT=35152 DPT=9882 SEQ=386127919 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A839D08250000000001030307) Dec 15 04:32:37 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:fc:1f:13 MACDST=fa:16:3e:b3:bb:e3 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=50589 DF PROTO=TCP SPT=33002 DPT=9105 SEQ=1064622596 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A839D13E60000000001030307) Dec 15 04:32:38 localhost systemd[1]: Started /usr/bin/podman healthcheck run 9f0f481c937606298203797a71924b7614a5d9e3f4a8afd837cfa5b9602aedeb. 
Dec 15 04:32:38 localhost podman[231414]: 2025-12-15 09:32:38.869083431 +0000 UTC m=+5.780664731 container cleanup b5d335c872cd511e4f9ef497ade55685828922bcddd4d43ed3b0589e6eb54c6a (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, config_id=edpm, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, org.label-schema.vendor=CentOS, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'pid': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']}, container_name=nova_compute, managed_by=edpm_ansible) Dec 15 04:32:38 localhost podman[231414]: nova_compute Dec 15 04:32:38 localhost podman[231702]: 2025-12-15 09:32:38.90158808 +0000 UTC m=+0.730054209 container health_status 9f0f481c937606298203797a71924b7614a5d9e3f4a8afd837cfa5b9602aedeb (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, 
managed_by=edpm_ansible, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, config_id=multipathd, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, io.buildah.version=1.41.3, org.label-schema.license=GPLv2) Dec 15 04:32:38 localhost podman[231702]: 2025-12-15 09:32:38.912354558 +0000 UTC m=+0.740820667 container exec_died 9f0f481c937606298203797a71924b7614a5d9e3f4a8afd837cfa5b9602aedeb (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, org.label-schema.build-date=20251202, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 
'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, container_name=multipathd, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, managed_by=edpm_ansible, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0) Dec 15 04:32:38 localhost systemd[1]: 9f0f481c937606298203797a71924b7614a5d9e3f4a8afd837cfa5b9602aedeb.service: Deactivated successfully. 
Dec 15 04:32:38 localhost podman[231734]: error opening file `/run/crun/b5d335c872cd511e4f9ef497ade55685828922bcddd4d43ed3b0589e6eb54c6a/status`: No such file or directory Dec 15 04:32:38 localhost podman[231717]: 2025-12-15 09:32:38.962278934 +0000 UTC m=+0.053561494 container cleanup b5d335c872cd511e4f9ef497ade55685828922bcddd4d43ed3b0589e6eb54c6a (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute, org.label-schema.name=CentOS Stream 9 Base Image, config_id=edpm, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, container_name=nova_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'pid': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']}) Dec 15 04:32:38 localhost podman[231717]: nova_compute Dec 15 04:32:38 localhost systemd[1]: edpm_nova_compute.service: Deactivated successfully. 
Dec 15 04:32:38 localhost systemd[1]: Stopped nova_compute container. Dec 15 04:32:38 localhost systemd[1]: Starting nova_compute container... Dec 15 04:32:39 localhost systemd[1]: Started libcrun container. Dec 15 04:32:39 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/90c3d76968c2380582c34ed47ce3bd3c1017e3f0027bb2b60c7c3fb4e76ff2b1/merged/etc/multipath supports timestamps until 2038 (0x7fffffff) Dec 15 04:32:39 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/90c3d76968c2380582c34ed47ce3bd3c1017e3f0027bb2b60c7c3fb4e76ff2b1/merged/etc/nvme supports timestamps until 2038 (0x7fffffff) Dec 15 04:32:39 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/90c3d76968c2380582c34ed47ce3bd3c1017e3f0027bb2b60c7c3fb4e76ff2b1/merged/var/lib/iscsi supports timestamps until 2038 (0x7fffffff) Dec 15 04:32:39 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/90c3d76968c2380582c34ed47ce3bd3c1017e3f0027bb2b60c7c3fb4e76ff2b1/merged/var/lib/nova supports timestamps until 2038 (0x7fffffff) Dec 15 04:32:39 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/90c3d76968c2380582c34ed47ce3bd3c1017e3f0027bb2b60c7c3fb4e76ff2b1/merged/var/lib/libvirt supports timestamps until 2038 (0x7fffffff) Dec 15 04:32:39 localhost podman[231737]: 2025-12-15 09:32:39.085171781 +0000 UTC m=+0.098895256 container init b5d335c872cd511e4f9ef497ade55685828922bcddd4d43ed3b0589e6eb54c6a (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'pid': 
'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']}, config_id=edpm, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, container_name=nova_compute, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb) Dec 15 04:32:39 localhost podman[231737]: 2025-12-15 09:32:39.096375201 +0000 UTC m=+0.110098676 container start b5d335c872cd511e4f9ef497ade55685828922bcddd4d43ed3b0589e6eb54c6a (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'pid': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', 
'/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']}, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, container_name=nova_compute, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, config_id=edpm, org.label-schema.license=GPLv2, managed_by=edpm_ansible, org.label-schema.build-date=20251202) Dec 15 04:32:39 localhost podman[231737]: nova_compute Dec 15 04:32:39 localhost nova_compute[231752]: + sudo -E kolla_set_configs Dec 15 04:32:39 localhost systemd[1]: Started nova_compute container. Dec 15 04:32:39 localhost nova_compute[231752]: INFO:__main__:Loading config file at /var/lib/kolla/config_files/config.json Dec 15 04:32:39 localhost nova_compute[231752]: INFO:__main__:Validating config file Dec 15 04:32:39 localhost nova_compute[231752]: INFO:__main__:Kolla config strategy set to: COPY_ALWAYS Dec 15 04:32:39 localhost nova_compute[231752]: INFO:__main__:Copying service configuration files Dec 15 04:32:39 localhost nova_compute[231752]: INFO:__main__:Deleting /etc/nova/nova.conf Dec 15 04:32:39 localhost nova_compute[231752]: INFO:__main__:Copying /var/lib/kolla/config_files/nova-blank.conf to /etc/nova/nova.conf Dec 15 04:32:39 localhost nova_compute[231752]: INFO:__main__:Setting permission for /etc/nova/nova.conf Dec 15 04:32:39 localhost nova_compute[231752]: INFO:__main__:Deleting /etc/nova/nova.conf.d/01-nova.conf Dec 15 04:32:39 localhost nova_compute[231752]: INFO:__main__:Copying /var/lib/kolla/config_files/01-nova.conf to /etc/nova/nova.conf.d/01-nova.conf Dec 15 04:32:39 localhost nova_compute[231752]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/01-nova.conf Dec 15 04:32:39 localhost 
nova_compute[231752]: INFO:__main__:Deleting /etc/nova/nova.conf.d/03-ceph-nova.conf Dec 15 04:32:39 localhost nova_compute[231752]: INFO:__main__:Copying /var/lib/kolla/config_files/03-ceph-nova.conf to /etc/nova/nova.conf.d/03-ceph-nova.conf Dec 15 04:32:39 localhost nova_compute[231752]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/03-ceph-nova.conf Dec 15 04:32:39 localhost nova_compute[231752]: INFO:__main__:Deleting /etc/nova/nova.conf.d/99-nova-compute-cells-workarounds.conf Dec 15 04:32:39 localhost nova_compute[231752]: INFO:__main__:Copying /var/lib/kolla/config_files/99-nova-compute-cells-workarounds.conf to /etc/nova/nova.conf.d/99-nova-compute-cells-workarounds.conf Dec 15 04:32:39 localhost nova_compute[231752]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/99-nova-compute-cells-workarounds.conf Dec 15 04:32:39 localhost nova_compute[231752]: INFO:__main__:Deleting /etc/nova/nova.conf.d/nova-blank.conf Dec 15 04:32:39 localhost nova_compute[231752]: INFO:__main__:Copying /var/lib/kolla/config_files/nova-blank.conf to /etc/nova/nova.conf.d/nova-blank.conf Dec 15 04:32:39 localhost nova_compute[231752]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/nova-blank.conf Dec 15 04:32:39 localhost nova_compute[231752]: INFO:__main__:Deleting /etc/nova/nova.conf.d/02-nova-host-specific.conf Dec 15 04:32:39 localhost nova_compute[231752]: INFO:__main__:Copying /var/lib/kolla/config_files/02-nova-host-specific.conf to /etc/nova/nova.conf.d/02-nova-host-specific.conf Dec 15 04:32:39 localhost nova_compute[231752]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/02-nova-host-specific.conf Dec 15 04:32:39 localhost nova_compute[231752]: INFO:__main__:Deleting /etc/ceph Dec 15 04:32:39 localhost nova_compute[231752]: INFO:__main__:Creating directory /etc/ceph Dec 15 04:32:39 localhost nova_compute[231752]: INFO:__main__:Setting permission for /etc/ceph Dec 15 04:32:39 localhost nova_compute[231752]: 
INFO:__main__:Copying /var/lib/kolla/config_files/ceph/ceph.conf to /etc/ceph/ceph.conf Dec 15 04:32:39 localhost nova_compute[231752]: INFO:__main__:Setting permission for /etc/ceph/ceph.conf Dec 15 04:32:39 localhost nova_compute[231752]: INFO:__main__:Copying /var/lib/kolla/config_files/ceph/ceph.client.openstack.keyring to /etc/ceph/ceph.client.openstack.keyring Dec 15 04:32:39 localhost nova_compute[231752]: INFO:__main__:Setting permission for /etc/ceph/ceph.client.openstack.keyring Dec 15 04:32:39 localhost nova_compute[231752]: INFO:__main__:Deleting /var/lib/nova/.ssh/ssh-privatekey Dec 15 04:32:39 localhost nova_compute[231752]: INFO:__main__:Copying /var/lib/kolla/config_files/ssh-privatekey to /var/lib/nova/.ssh/ssh-privatekey Dec 15 04:32:39 localhost nova_compute[231752]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/ssh-privatekey Dec 15 04:32:39 localhost nova_compute[231752]: INFO:__main__:Deleting /var/lib/nova/.ssh/config Dec 15 04:32:39 localhost nova_compute[231752]: INFO:__main__:Copying /var/lib/kolla/config_files/ssh-config to /var/lib/nova/.ssh/config Dec 15 04:32:39 localhost nova_compute[231752]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/config Dec 15 04:32:39 localhost nova_compute[231752]: INFO:__main__:Deleting /usr/sbin/iscsiadm Dec 15 04:32:39 localhost nova_compute[231752]: INFO:__main__:Copying /var/lib/kolla/config_files/run-on-host to /usr/sbin/iscsiadm Dec 15 04:32:39 localhost nova_compute[231752]: INFO:__main__:Setting permission for /usr/sbin/iscsiadm Dec 15 04:32:39 localhost nova_compute[231752]: INFO:__main__:Writing out command to execute Dec 15 04:32:39 localhost nova_compute[231752]: INFO:__main__:Setting permission for /etc/ceph/ceph.conf Dec 15 04:32:39 localhost nova_compute[231752]: INFO:__main__:Setting permission for /etc/ceph/ceph.client.openstack.keyring Dec 15 04:32:39 localhost nova_compute[231752]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/ Dec 15 04:32:39 localhost 
nova_compute[231752]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/ssh-privatekey Dec 15 04:32:39 localhost nova_compute[231752]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/config Dec 15 04:32:39 localhost nova_compute[231752]: ++ cat /run_command Dec 15 04:32:39 localhost nova_compute[231752]: + CMD=nova-compute Dec 15 04:32:39 localhost nova_compute[231752]: + ARGS= Dec 15 04:32:39 localhost nova_compute[231752]: + sudo kolla_copy_cacerts Dec 15 04:32:39 localhost nova_compute[231752]: + [[ ! -n '' ]] Dec 15 04:32:39 localhost nova_compute[231752]: + . kolla_extend_start Dec 15 04:32:39 localhost nova_compute[231752]: Running command: 'nova-compute' Dec 15 04:32:39 localhost nova_compute[231752]: + echo 'Running command: '\''nova-compute'\''' Dec 15 04:32:39 localhost nova_compute[231752]: + umask 0022 Dec 15 04:32:39 localhost nova_compute[231752]: + exec nova-compute Dec 15 04:32:39 localhost python3.9[231873]: ansible-containers.podman.podman_container Invoked with name=nova_compute_init state=started executable=podman detach=True debug=False force_restart=False force_delete=True generate_systemd={} image_strict=False recreate=False image=None annotation=None arch=None attach=None authfile=None blkio_weight=None blkio_weight_device=None cap_add=None cap_drop=None cgroup_conf=None cgroup_parent=None cgroupns=None cgroups=None chrootdirs=None cidfile=None cmd_args=None conmon_pidfile=None command=None cpu_period=None cpu_quota=None cpu_rt_period=None cpu_rt_runtime=None cpu_shares=None cpus=None cpuset_cpus=None cpuset_mems=None decryption_key=None delete_depend=None delete_time=None delete_volumes=None detach_keys=None device=None device_cgroup_rule=None device_read_bps=None device_read_iops=None device_write_bps=None device_write_iops=None dns=None dns_option=None dns_search=None entrypoint=None env=None env_file=None env_host=None env_merge=None etc_hosts=None expose=None gidmap=None gpus=None group_add=None group_entry=None 
healthcheck=None healthcheck_interval=None healthcheck_retries=None healthcheck_start_period=None health_startup_cmd=None health_startup_interval=None health_startup_retries=None health_startup_success=None health_startup_timeout=None healthcheck_timeout=None healthcheck_failure_action=None hooks_dir=None hostname=None hostuser=None http_proxy=None image_volume=None init=None init_ctr=None init_path=None interactive=None ip=None ip6=None ipc=None kernel_memory=None label=None label_file=None log_driver=None log_level=None log_opt=None mac_address=None memory=None memory_reservation=None memory_swap=None memory_swappiness=None mount=None network=None network_aliases=None no_healthcheck=None no_hosts=None oom_kill_disable=None oom_score_adj=None os=None passwd=None passwd_entry=None personality=None pid=None pid_file=None pids_limit=None platform=None pod=None pod_id_file=None preserve_fd=None preserve_fds=None privileged=None publish=None publish_all=None pull=None quadlet_dir=None quadlet_filename=None quadlet_file_mode=None quadlet_options=None rdt_class=None read_only=None read_only_tmpfs=None requires=None restart_policy=None restart_time=None retry=None retry_delay=None rm=None rmi=None rootfs=None seccomp_policy=None secrets=NOT_LOGGING_PARAMETER sdnotify=None security_opt=None shm_size=None shm_size_systemd=None sig_proxy=None stop_signal=None stop_timeout=None stop_time=None subgidname=None subuidname=None sysctl=None systemd=None timeout=None timezone=None tls_verify=None tmpfs=None tty=None uidmap=None ulimit=None umask=None unsetenv=None unsetenv_all=None user=None userns=None uts=None variant=None volume=None volumes_from=None workdir=None Dec 15 04:32:40 localhost systemd[1]: Started libpod-conmon-c86c0d58ed395af407f97b4ea9c613ee83c5b4d277ca4b706ab48762a26987aa.scope. Dec 15 04:32:40 localhost systemd[1]: Started libcrun container. 
Dec 15 04:32:40 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/5c523cc249966f9abaa8c86aeb9bc1f9fc67bbfacedf0d474f96681ec85a864c/merged/usr/sbin/nova_statedir_ownership.py supports timestamps until 2038 (0x7fffffff) Dec 15 04:32:40 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/5c523cc249966f9abaa8c86aeb9bc1f9fc67bbfacedf0d474f96681ec85a864c/merged/var/lib/nova supports timestamps until 2038 (0x7fffffff) Dec 15 04:32:40 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/5c523cc249966f9abaa8c86aeb9bc1f9fc67bbfacedf0d474f96681ec85a864c/merged/var/lib/_nova_secontext supports timestamps until 2038 (0x7fffffff) Dec 15 04:32:40 localhost podman[231897]: 2025-12-15 09:32:40.200578448 +0000 UTC m=+0.157991417 container init c86c0d58ed395af407f97b4ea9c613ee83c5b4d277ca4b706ab48762a26987aa (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute_init, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': False, 'user': 'root', 'restart': 'never', 'command': 'bash -c $* -- eval python3 /sbin/nova_statedir_ownership.py | logger -t nova_compute_init', 'net': 'none', 'security_opt': ['label=disable'], 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': '/var/lib/nova/compute_id', '__OS_DEBUG': False}, 'volumes': ['/dev/log:/dev/log', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', '/var/lib/openstack/config/nova/nova_statedir_ownership.py:/sbin/nova_statedir_ownership.py:z']}, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, tcib_managed=true, config_id=edpm, managed_by=edpm_ansible, org.label-schema.license=GPLv2, container_name=nova_compute_init, 
maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202) Dec 15 04:32:40 localhost podman[231897]: 2025-12-15 09:32:40.211206712 +0000 UTC m=+0.168619681 container start c86c0d58ed395af407f97b4ea9c613ee83c5b4d277ca4b706ab48762a26987aa (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute_init, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=edpm, tcib_managed=true, container_name=nova_compute_init, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': False, 'user': 'root', 'restart': 'never', 'command': 'bash -c $* -- eval python3 /sbin/nova_statedir_ownership.py | logger -t nova_compute_init', 'net': 'none', 'security_opt': ['label=disable'], 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': '/var/lib/nova/compute_id', '__OS_DEBUG': False}, 'volumes': ['/dev/log:/dev/log', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', '/var/lib/openstack/config/nova/nova_statedir_ownership.py:/sbin/nova_statedir_ownership.py:z']}) Dec 15 04:32:40 localhost python3.9[231873]: ansible-containers.podman.podman_container PODMAN-CONTAINER-DEBUG: podman start nova_compute_init Dec 15 04:32:40 localhost nova_compute_init[231919]: INFO:nova_statedir:Applying nova statedir ownership Dec 15 04:32:40 localhost nova_compute_init[231919]: INFO:nova_statedir:Target ownership for /var/lib/nova: 42436:42436 Dec 15 04:32:40 localhost nova_compute_init[231919]: INFO:nova_statedir:Checking uid: 1000 gid: 1000 path: /var/lib/nova/ Dec 15 04:32:40 localhost nova_compute_init[231919]: 
INFO:nova_statedir:Changing ownership of /var/lib/nova from 1000:1000 to 42436:42436 Dec 15 04:32:40 localhost nova_compute_init[231919]: INFO:nova_statedir:Setting selinux context of /var/lib/nova to system_u:object_r:container_file_t:s0 Dec 15 04:32:40 localhost nova_compute_init[231919]: INFO:nova_statedir:Checking uid: 1000 gid: 1000 path: /var/lib/nova/instances/ Dec 15 04:32:40 localhost nova_compute_init[231919]: INFO:nova_statedir:Changing ownership of /var/lib/nova/instances from 1000:1000 to 42436:42436 Dec 15 04:32:40 localhost nova_compute_init[231919]: INFO:nova_statedir:Setting selinux context of /var/lib/nova/instances to system_u:object_r:container_file_t:s0 Dec 15 04:32:40 localhost nova_compute_init[231919]: INFO:nova_statedir:Checking uid: 42436 gid: 42436 path: /var/lib/nova/instances/39ff1bd9-6f6b-44c8-bbec-a1fd9d196359/ Dec 15 04:32:40 localhost nova_compute_init[231919]: INFO:nova_statedir:Ownership of /var/lib/nova/instances/39ff1bd9-6f6b-44c8-bbec-a1fd9d196359 already 42436:42436 Dec 15 04:32:40 localhost nova_compute_init[231919]: INFO:nova_statedir:Setting selinux context of /var/lib/nova/instances/39ff1bd9-6f6b-44c8-bbec-a1fd9d196359 to system_u:object_r:container_file_t:s0 Dec 15 04:32:40 localhost nova_compute_init[231919]: INFO:nova_statedir:Checking uid: 0 gid: 0 path: /var/lib/nova/instances/39ff1bd9-6f6b-44c8-bbec-a1fd9d196359/console.log Dec 15 04:32:40 localhost nova_compute_init[231919]: INFO:nova_statedir:Checking uid: 42436 gid: 42436 path: /var/lib/nova/instances/_base/ Dec 15 04:32:40 localhost nova_compute_init[231919]: INFO:nova_statedir:Ownership of /var/lib/nova/instances/_base already 42436:42436 Dec 15 04:32:40 localhost nova_compute_init[231919]: INFO:nova_statedir:Setting selinux context of /var/lib/nova/instances/_base to system_u:object_r:container_file_t:s0 Dec 15 04:32:40 localhost nova_compute_init[231919]: INFO:nova_statedir:Checking uid: 42436 gid: 42436 path: 
/var/lib/nova/instances/_base/f51109af1d3e72d8fb41e75a49bf4f04de3202b1 Dec 15 04:32:40 localhost nova_compute_init[231919]: INFO:nova_statedir:Checking uid: 42436 gid: 42436 path: /var/lib/nova/instances/_base/ephemeral_1_0706d66 Dec 15 04:32:40 localhost nova_compute_init[231919]: INFO:nova_statedir:Checking uid: 42436 gid: 42436 path: /var/lib/nova/instances/locks/ Dec 15 04:32:40 localhost nova_compute_init[231919]: INFO:nova_statedir:Ownership of /var/lib/nova/instances/locks already 42436:42436 Dec 15 04:32:40 localhost nova_compute_init[231919]: INFO:nova_statedir:Setting selinux context of /var/lib/nova/instances/locks to system_u:object_r:container_file_t:s0 Dec 15 04:32:40 localhost nova_compute_init[231919]: INFO:nova_statedir:Checking uid: 42436 gid: 42436 path: /var/lib/nova/instances/locks/nova-f51109af1d3e72d8fb41e75a49bf4f04de3202b1 Dec 15 04:32:40 localhost nova_compute_init[231919]: INFO:nova_statedir:Checking uid: 42436 gid: 42436 path: /var/lib/nova/instances/locks/nova-ephemeral_1_0706d66 Dec 15 04:32:40 localhost nova_compute_init[231919]: INFO:nova_statedir:Checking uid: 0 gid: 0 path: /var/lib/nova/delay-nova-compute Dec 15 04:32:40 localhost nova_compute_init[231919]: INFO:nova_statedir:Checking uid: 42436 gid: 42436 path: /var/lib/nova/.ssh/ Dec 15 04:32:40 localhost nova_compute_init[231919]: INFO:nova_statedir:Ownership of /var/lib/nova/.ssh already 42436:42436 Dec 15 04:32:40 localhost nova_compute_init[231919]: INFO:nova_statedir:Setting selinux context of /var/lib/nova/.ssh to system_u:object_r:container_file_t:s0 Dec 15 04:32:40 localhost nova_compute_init[231919]: INFO:nova_statedir:Checking uid: 42436 gid: 42436 path: /var/lib/nova/.ssh/ssh-privatekey Dec 15 04:32:40 localhost nova_compute_init[231919]: INFO:nova_statedir:Checking uid: 42436 gid: 42436 path: /var/lib/nova/.ssh/config Dec 15 04:32:40 localhost nova_compute_init[231919]: INFO:nova_statedir:Checking uid: 42436 gid: 42436 path: /var/lib/nova/.cache/ Dec 15 04:32:40 
localhost nova_compute_init[231919]: INFO:nova_statedir:Ownership of /var/lib/nova/.cache already 42436:42436 Dec 15 04:32:40 localhost nova_compute_init[231919]: INFO:nova_statedir:Setting selinux context of /var/lib/nova/.cache to system_u:object_r:container_file_t:s0 Dec 15 04:32:40 localhost nova_compute_init[231919]: INFO:nova_statedir:Checking uid: 42436 gid: 42436 path: /var/lib/nova/.cache/python-entrypoints/ Dec 15 04:32:40 localhost nova_compute_init[231919]: INFO:nova_statedir:Ownership of /var/lib/nova/.cache/python-entrypoints already 42436:42436 Dec 15 04:32:40 localhost nova_compute_init[231919]: INFO:nova_statedir:Setting selinux context of /var/lib/nova/.cache/python-entrypoints to system_u:object_r:container_file_t:s0 Dec 15 04:32:40 localhost nova_compute_init[231919]: INFO:nova_statedir:Checking uid: 42436 gid: 42436 path: /var/lib/nova/.cache/python-entrypoints/b234715fc878456b41e32c4fbc669b417044dbe6c6684bbc9059e5c93396ffea Dec 15 04:32:40 localhost nova_compute_init[231919]: INFO:nova_statedir:Checking uid: 42436 gid: 42436 path: /var/lib/nova/.cache/python-entrypoints/16958d231615fa2e15154aac2f4371388ef8f2a8455c69ba0e5e08f2c33545f5 Dec 15 04:32:40 localhost nova_compute_init[231919]: INFO:nova_statedir:Nova statedir ownership complete Dec 15 04:32:40 localhost systemd[1]: libpod-c86c0d58ed395af407f97b4ea9c613ee83c5b4d277ca4b706ab48762a26987aa.scope: Deactivated successfully. 
Dec 15 04:32:40 localhost podman[231918]: 2025-12-15 09:32:40.280968408 +0000 UTC m=+0.051459508 container died c86c0d58ed395af407f97b4ea9c613ee83c5b4d277ca4b706ab48762a26987aa (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute_init, tcib_managed=true, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': False, 'user': 'root', 'restart': 'never', 'command': 'bash -c $* -- eval python3 /sbin/nova_statedir_ownership.py | logger -t nova_compute_init', 'net': 'none', 'security_opt': ['label=disable'], 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': '/var/lib/nova/compute_id', '__OS_DEBUG': False}, 'volumes': ['/dev/log:/dev/log', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', '/var/lib/openstack/config/nova/nova_statedir_ownership.py:/sbin/nova_statedir_ownership.py:z']}, container_name=nova_compute_init, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, config_id=edpm, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2) Dec 15 04:32:40 localhost podman[231931]: 2025-12-15 09:32:40.356259803 +0000 UTC m=+0.069558692 container cleanup c86c0d58ed395af407f97b4ea9c613ee83c5b4d277ca4b706ab48762a26987aa (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute_init, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, container_name=nova_compute_init, tcib_managed=true, config_id=edpm, org.label-schema.schema-version=1.0, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': False, 'user': 'root', 'restart': 'never', 'command': 'bash -c $* -- eval 
python3 /sbin/nova_statedir_ownership.py | logger -t nova_compute_init', 'net': 'none', 'security_opt': ['label=disable'], 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': '/var/lib/nova/compute_id', '__OS_DEBUG': False}, 'volumes': ['/dev/log:/dev/log', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', '/var/lib/openstack/config/nova/nova_statedir_ownership.py:/sbin/nova_statedir_ownership.py:z']}, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image) Dec 15 04:32:40 localhost systemd[1]: libpod-conmon-c86c0d58ed395af407f97b4ea9c613ee83c5b4d277ca4b706ab48762a26987aa.scope: Deactivated successfully. Dec 15 04:32:40 localhost nova_compute[231752]: 2025-12-15 09:32:40.808 231756 DEBUG os_vif [-] Loaded VIF plugin class '' with name 'linux_bridge' initialize /usr/lib/python3.9/site-packages/os_vif/__init__.py:44#033[00m Dec 15 04:32:40 localhost nova_compute[231752]: 2025-12-15 09:32:40.809 231756 DEBUG os_vif [-] Loaded VIF plugin class '' with name 'noop' initialize /usr/lib/python3.9/site-packages/os_vif/__init__.py:44#033[00m Dec 15 04:32:40 localhost nova_compute[231752]: 2025-12-15 09:32:40.809 231756 DEBUG os_vif [-] Loaded VIF plugin class '' with name 'ovs' initialize /usr/lib/python3.9/site-packages/os_vif/__init__.py:44#033[00m Dec 15 04:32:40 localhost nova_compute[231752]: 2025-12-15 09:32:40.809 231756 INFO os_vif [-] Loaded VIF plugins: linux_bridge, noop, ovs#033[00m Dec 15 04:32:40 localhost systemd[1]: tmp-crun.IXkZQn.mount: Deactivated successfully. Dec 15 04:32:40 localhost systemd[1]: var-lib-containers-storage-overlay-5c523cc249966f9abaa8c86aeb9bc1f9fc67bbfacedf0d474f96681ec85a864c-merged.mount: Deactivated successfully. 
Dec 15 04:32:40 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-c86c0d58ed395af407f97b4ea9c613ee83c5b4d277ca4b706ab48762a26987aa-userdata-shm.mount: Deactivated successfully. Dec 15 04:32:40 localhost nova_compute[231752]: 2025-12-15 09:32:40.921 231756 DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): grep -F node.session.scan /sbin/iscsiadm execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m Dec 15 04:32:40 localhost nova_compute[231752]: 2025-12-15 09:32:40.943 231756 DEBUG oslo_concurrency.processutils [-] CMD "grep -F node.session.scan /sbin/iscsiadm" returned: 1 in 0.022s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m Dec 15 04:32:40 localhost nova_compute[231752]: 2025-12-15 09:32:40.943 231756 DEBUG oslo_concurrency.processutils [-] 'grep -F node.session.scan /sbin/iscsiadm' failed. Not Retrying. execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:473#033[00m Dec 15 04:32:41 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:fc:1f:13 MACDST=fa:16:3e:b3:bb:e3 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=43903 DF PROTO=TCP SPT=51200 DPT=9101 SEQ=4199071406 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A839D21250000000001030307) Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.336 231756 INFO nova.virt.driver [None req-c9759d09-8117-4682-aa2d-922d8f1f5fc3 - - - - - -] Loading compute driver 'libvirt.LibvirtDriver'#033[00m Dec 15 04:32:41 localhost systemd[1]: session-53.scope: Deactivated successfully. Dec 15 04:32:41 localhost systemd[1]: session-53.scope: Consumed 2min 14.052s CPU time. Dec 15 04:32:41 localhost systemd-logind[763]: Session 53 logged out. Waiting for processes to exit. Dec 15 04:32:41 localhost systemd-logind[763]: Removed session 53. 
Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.453 231756 INFO nova.compute.provider_config [None req-c9759d09-8117-4682-aa2d-922d8f1f5fc3 - - - - - -] No provider configs found in /etc/nova/provider_config/. If files are present, ensure the Nova process has access.#033[00m Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.461 231756 WARNING nova.service [None req-c9759d09-8117-4682-aa2d-922d8f1f5fc3 - - - - - -] Current Nova version does not support computes older than Yoga but the minimum compute service level in your cell is 57 and the oldest supported service level is 61.: nova.exception.TooOldComputeService: Current Nova version does not support computes older than Yoga but the minimum compute service level in your cell is 57 and the oldest supported service level is 61.#033[00m Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.461 231756 DEBUG oslo_concurrency.lockutils [None req-c9759d09-8117-4682-aa2d-922d8f1f5fc3 - - - - - -] Acquiring lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.461 231756 DEBUG oslo_concurrency.lockutils [None req-c9759d09-8117-4682-aa2d-922d8f1f5fc3 - - - - - -] Acquired lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.461 231756 DEBUG oslo_concurrency.lockutils [None req-c9759d09-8117-4682-aa2d-922d8f1f5fc3 - - - - - -] Releasing lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.462 231756 DEBUG oslo_service.service [None req-c9759d09-8117-4682-aa2d-922d8f1f5fc3 - - - - - -] Full set of CONF: _wait_for_exit_or_signal /usr/lib/python3.9/site-packages/oslo_service/service.py:362#033[00m Dec 15 04:32:41 localhost 
nova_compute[231752]: 2025-12-15 09:32:41.462 231756 DEBUG oslo_service.service [None req-c9759d09-8117-4682-aa2d-922d8f1f5fc3 - - - - - -] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2589#033[00m Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.462 231756 DEBUG oslo_service.service [None req-c9759d09-8117-4682-aa2d-922d8f1f5fc3 - - - - - -] Configuration options gathered from: log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2590#033[00m Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.462 231756 DEBUG oslo_service.service [None req-c9759d09-8117-4682-aa2d-922d8f1f5fc3 - - - - - -] command line args: [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2591#033[00m Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.462 231756 DEBUG oslo_service.service [None req-c9759d09-8117-4682-aa2d-922d8f1f5fc3 - - - - - -] config files: ['/etc/nova/nova.conf', '/etc/nova/nova-compute.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2592#033[00m Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.462 231756 DEBUG oslo_service.service [None req-c9759d09-8117-4682-aa2d-922d8f1f5fc3 - - - - - -] ================================================================================ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2594#033[00m Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.462 231756 DEBUG oslo_service.service [None req-c9759d09-8117-4682-aa2d-922d8f1f5fc3 - - - - - -] allow_resize_to_same_host = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.463 231756 DEBUG oslo_service.service [None req-c9759d09-8117-4682-aa2d-922d8f1f5fc3 - - - - - -] arq_binding_timeout = 300 log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.463 231756 DEBUG oslo_service.service [None req-c9759d09-8117-4682-aa2d-922d8f1f5fc3 - - - - - -] backdoor_port = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.463 231756 DEBUG oslo_service.service [None req-c9759d09-8117-4682-aa2d-922d8f1f5fc3 - - - - - -] backdoor_socket = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.463 231756 DEBUG oslo_service.service [None req-c9759d09-8117-4682-aa2d-922d8f1f5fc3 - - - - - -] block_device_allocate_retries = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.463 231756 DEBUG oslo_service.service [None req-c9759d09-8117-4682-aa2d-922d8f1f5fc3 - - - - - -] block_device_allocate_retries_interval = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.463 231756 DEBUG oslo_service.service [None req-c9759d09-8117-4682-aa2d-922d8f1f5fc3 - - - - - -] cert = self.pem log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.463 231756 DEBUG oslo_service.service [None req-c9759d09-8117-4682-aa2d-922d8f1f5fc3 - - - - - -] compute_driver = libvirt.LibvirtDriver log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.463 231756 DEBUG oslo_service.service [None req-c9759d09-8117-4682-aa2d-922d8f1f5fc3 - - - - - -] compute_monitors = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 15 
04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.464 231756 DEBUG oslo_service.service [None req-c9759d09-8117-4682-aa2d-922d8f1f5fc3 - - - - - -] config_dir = ['/etc/nova/nova.conf.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.464 231756 DEBUG oslo_service.service [None req-c9759d09-8117-4682-aa2d-922d8f1f5fc3 - - - - - -] config_drive_format = iso9660 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.464 231756 DEBUG oslo_service.service [None req-c9759d09-8117-4682-aa2d-922d8f1f5fc3 - - - - - -] config_file = ['/etc/nova/nova.conf', '/etc/nova/nova-compute.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.464 231756 DEBUG oslo_service.service [None req-c9759d09-8117-4682-aa2d-922d8f1f5fc3 - - - - - -] config_source = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.464 231756 DEBUG oslo_service.service [None req-c9759d09-8117-4682-aa2d-922d8f1f5fc3 - - - - - -] console_host = np0005559462.localdomain log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.464 231756 DEBUG oslo_service.service [None req-c9759d09-8117-4682-aa2d-922d8f1f5fc3 - - - - - -] control_exchange = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.464 231756 DEBUG oslo_service.service [None req-c9759d09-8117-4682-aa2d-922d8f1f5fc3 - - - - - -] cpu_allocation_ratio = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 15 04:32:41 localhost 
nova_compute[231752]: 2025-12-15 09:32:41.465 231756 DEBUG oslo_service.service [None req-c9759d09-8117-4682-aa2d-922d8f1f5fc3 - - - - - -] daemon = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.465 231756 DEBUG oslo_service.service [None req-c9759d09-8117-4682-aa2d-922d8f1f5fc3 - - - - - -] debug = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.465 231756 DEBUG oslo_service.service [None req-c9759d09-8117-4682-aa2d-922d8f1f5fc3 - - - - - -] default_access_ip_network_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.465 231756 DEBUG oslo_service.service [None req-c9759d09-8117-4682-aa2d-922d8f1f5fc3 - - - - - -] default_availability_zone = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.465 231756 DEBUG oslo_service.service [None req-c9759d09-8117-4682-aa2d-922d8f1f5fc3 - - - - - -] default_ephemeral_format = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.465 231756 DEBUG oslo_service.service [None req-c9759d09-8117-4682-aa2d-922d8f1f5fc3 - - - - - -] default_log_levels = ['amqp=WARN', 'amqplib=WARN', 'boto=WARN', 'qpid=WARN', 'sqlalchemy=WARN', 'suds=INFO', 'oslo.messaging=INFO', 'oslo_messaging=INFO', 'iso8601=WARN', 'requests.packages.urllib3.connectionpool=WARN', 'urllib3.connectionpool=WARN', 'websocket=WARN', 'requests.packages.urllib3.util.retry=WARN', 'urllib3.util.retry=WARN', 'keystonemiddleware=WARN', 'routes.middleware=WARN', 'stevedore=WARN', 'taskflow=WARN', 'keystoneauth=WARN', 'oslo.cache=INFO', 'oslo_policy=INFO', 
'dogpile.core.dogpile=INFO', 'glanceclient=WARN', 'oslo.privsep.daemon=INFO'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.465 231756 DEBUG oslo_service.service [None req-c9759d09-8117-4682-aa2d-922d8f1f5fc3 - - - - - -] default_schedule_zone = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.466 231756 DEBUG oslo_service.service [None req-c9759d09-8117-4682-aa2d-922d8f1f5fc3 - - - - - -] disk_allocation_ratio = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.466 231756 DEBUG oslo_service.service [None req-c9759d09-8117-4682-aa2d-922d8f1f5fc3 - - - - - -] enable_new_services = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.466 231756 DEBUG oslo_service.service [None req-c9759d09-8117-4682-aa2d-922d8f1f5fc3 - - - - - -] enabled_apis = ['osapi_compute', 'metadata'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.466 231756 DEBUG oslo_service.service [None req-c9759d09-8117-4682-aa2d-922d8f1f5fc3 - - - - - -] enabled_ssl_apis = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.466 231756 DEBUG oslo_service.service [None req-c9759d09-8117-4682-aa2d-922d8f1f5fc3 - - - - - -] flat_injected = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.466 231756 DEBUG oslo_service.service [None req-c9759d09-8117-4682-aa2d-922d8f1f5fc3 - - - - - -] force_config_drive = True 
log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.466 231756 DEBUG oslo_service.service [None req-c9759d09-8117-4682-aa2d-922d8f1f5fc3 - - - - - -] force_raw_images = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.467 231756 DEBUG oslo_service.service [None req-c9759d09-8117-4682-aa2d-922d8f1f5fc3 - - - - - -] graceful_shutdown_timeout = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.467 231756 DEBUG oslo_service.service [None req-c9759d09-8117-4682-aa2d-922d8f1f5fc3 - - - - - -] heal_instance_info_cache_interval = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.467 231756 DEBUG oslo_service.service [None req-c9759d09-8117-4682-aa2d-922d8f1f5fc3 - - - - - -] host = np0005559462.localdomain log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.467 231756 DEBUG oslo_service.service [None req-c9759d09-8117-4682-aa2d-922d8f1f5fc3 - - - - - -] initial_cpu_allocation_ratio = 4.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.467 231756 DEBUG oslo_service.service [None req-c9759d09-8117-4682-aa2d-922d8f1f5fc3 - - - - - -] initial_disk_allocation_ratio = 0.9 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.467 231756 DEBUG oslo_service.service [None req-c9759d09-8117-4682-aa2d-922d8f1f5fc3 - - - - - -] initial_ram_allocation_ratio = 1.0 log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.467 231756 DEBUG oslo_service.service [None req-c9759d09-8117-4682-aa2d-922d8f1f5fc3 - - - - - -] injected_network_template = /usr/lib/python3.9/site-packages/nova/virt/interfaces.template log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.468 231756 DEBUG oslo_service.service [None req-c9759d09-8117-4682-aa2d-922d8f1f5fc3 - - - - - -] instance_build_timeout = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.468 231756 DEBUG oslo_service.service [None req-c9759d09-8117-4682-aa2d-922d8f1f5fc3 - - - - - -] instance_delete_interval = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.468 231756 DEBUG oslo_service.service [None req-c9759d09-8117-4682-aa2d-922d8f1f5fc3 - - - - - -] instance_format = [instance: %(uuid)s] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.468 231756 DEBUG oslo_service.service [None req-c9759d09-8117-4682-aa2d-922d8f1f5fc3 - - - - - -] instance_name_template = instance-%08x log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.468 231756 DEBUG oslo_service.service [None req-c9759d09-8117-4682-aa2d-922d8f1f5fc3 - - - - - -] instance_usage_audit = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.468 231756 DEBUG oslo_service.service [None req-c9759d09-8117-4682-aa2d-922d8f1f5fc3 - - - - - -] instance_usage_audit_period = month 
log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.468 231756 DEBUG oslo_service.service [None req-c9759d09-8117-4682-aa2d-922d8f1f5fc3 - - - - - -] instance_uuid_format = [instance: %(uuid)s] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.468 231756 DEBUG oslo_service.service [None req-c9759d09-8117-4682-aa2d-922d8f1f5fc3 - - - - - -] instances_path = /var/lib/nova/instances log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.469 231756 DEBUG oslo_service.service [None req-c9759d09-8117-4682-aa2d-922d8f1f5fc3 - - - - - -] internal_service_availability_zone = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.469 231756 DEBUG oslo_service.service [None req-c9759d09-8117-4682-aa2d-922d8f1f5fc3 - - - - - -] key = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.469 231756 DEBUG oslo_service.service [None req-c9759d09-8117-4682-aa2d-922d8f1f5fc3 - - - - - -] live_migration_retry_count = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.469 231756 DEBUG oslo_service.service [None req-c9759d09-8117-4682-aa2d-922d8f1f5fc3 - - - - - -] log_config_append = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.469 231756 DEBUG oslo_service.service [None req-c9759d09-8117-4682-aa2d-922d8f1f5fc3 - - - - - -] log_date_format = %Y-%m-%d %H:%M:%S log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.469 231756 DEBUG oslo_service.service [None req-c9759d09-8117-4682-aa2d-922d8f1f5fc3 - - - - - -] log_dir = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.469 231756 DEBUG oslo_service.service [None req-c9759d09-8117-4682-aa2d-922d8f1f5fc3 - - - - - -] log_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.470 231756 DEBUG oslo_service.service [None req-c9759d09-8117-4682-aa2d-922d8f1f5fc3 - - - - - -] log_options = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.470 231756 DEBUG oslo_service.service [None req-c9759d09-8117-4682-aa2d-922d8f1f5fc3 - - - - - -] log_rotate_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.470 231756 DEBUG oslo_service.service [None req-c9759d09-8117-4682-aa2d-922d8f1f5fc3 - - - - - -] log_rotate_interval_type = days log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.470 231756 DEBUG oslo_service.service [None req-c9759d09-8117-4682-aa2d-922d8f1f5fc3 - - - - - -] log_rotation_type = size log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.470 231756 DEBUG oslo_service.service [None req-c9759d09-8117-4682-aa2d-922d8f1f5fc3 - - - - - -] logging_context_format_string = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [%(global_request_id)s %(request_id)s %(user_identity)s] 
%(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.470 231756 DEBUG oslo_service.service [None req-c9759d09-8117-4682-aa2d-922d8f1f5fc3 - - - - - -] logging_debug_format_suffix = %(funcName)s %(pathname)s:%(lineno)d log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.470 231756 DEBUG oslo_service.service [None req-c9759d09-8117-4682-aa2d-922d8f1f5fc3 - - - - - -] logging_default_format_string = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [-] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.470 231756 DEBUG oslo_service.service [None req-c9759d09-8117-4682-aa2d-922d8f1f5fc3 - - - - - -] logging_exception_prefix = %(asctime)s.%(msecs)03d %(process)d ERROR %(name)s %(instance)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.471 231756 DEBUG oslo_service.service [None req-c9759d09-8117-4682-aa2d-922d8f1f5fc3 - - - - - -] logging_user_identity_format = %(user)s %(project)s %(domain)s %(system_scope)s %(user_domain)s %(project_domain)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.471 231756 DEBUG oslo_service.service [None req-c9759d09-8117-4682-aa2d-922d8f1f5fc3 - - - - - -] long_rpc_timeout = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.471 231756 DEBUG oslo_service.service [None req-c9759d09-8117-4682-aa2d-922d8f1f5fc3 - - - - - -] max_concurrent_builds = 10 log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.471 231756 DEBUG oslo_service.service [None req-c9759d09-8117-4682-aa2d-922d8f1f5fc3 - - - - - -] max_concurrent_live_migrations = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.471 231756 DEBUG oslo_service.service [None req-c9759d09-8117-4682-aa2d-922d8f1f5fc3 - - - - - -] max_concurrent_snapshots = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.471 231756 DEBUG oslo_service.service [None req-c9759d09-8117-4682-aa2d-922d8f1f5fc3 - - - - - -] max_local_block_devices = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.471 231756 DEBUG oslo_service.service [None req-c9759d09-8117-4682-aa2d-922d8f1f5fc3 - - - - - -] max_logfile_count = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.472 231756 DEBUG oslo_service.service [None req-c9759d09-8117-4682-aa2d-922d8f1f5fc3 - - - - - -] max_logfile_size_mb = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.472 231756 DEBUG oslo_service.service [None req-c9759d09-8117-4682-aa2d-922d8f1f5fc3 - - - - - -] maximum_instance_delete_attempts = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.472 231756 DEBUG oslo_service.service [None req-c9759d09-8117-4682-aa2d-922d8f1f5fc3 - - - - - -] metadata_listen = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 15 
04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.472 231756 DEBUG oslo_service.service [None req-c9759d09-8117-4682-aa2d-922d8f1f5fc3 - - - - - -] metadata_listen_port = 8775 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.472 231756 DEBUG oslo_service.service [None req-c9759d09-8117-4682-aa2d-922d8f1f5fc3 - - - - - -] metadata_workers = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.472 231756 DEBUG oslo_service.service [None req-c9759d09-8117-4682-aa2d-922d8f1f5fc3 - - - - - -] migrate_max_retries = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.472 231756 DEBUG oslo_service.service [None req-c9759d09-8117-4682-aa2d-922d8f1f5fc3 - - - - - -] mkisofs_cmd = /usr/bin/mkisofs log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.472 231756 DEBUG oslo_service.service [None req-c9759d09-8117-4682-aa2d-922d8f1f5fc3 - - - - - -] my_block_storage_ip = 192.168.122.106 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.473 231756 DEBUG oslo_service.service [None req-c9759d09-8117-4682-aa2d-922d8f1f5fc3 - - - - - -] my_ip = 192.168.122.106 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.473 231756 DEBUG oslo_service.service [None req-c9759d09-8117-4682-aa2d-922d8f1f5fc3 - - - - - -] network_allocate_retries = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.473 231756 DEBUG 
oslo_service.service [None req-c9759d09-8117-4682-aa2d-922d8f1f5fc3 - - - - - -] non_inheritable_image_properties = ['cache_in_nova', 'bittorrent'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.473 231756 DEBUG oslo_service.service [None req-c9759d09-8117-4682-aa2d-922d8f1f5fc3 - - - - - -] osapi_compute_listen = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.473 231756 DEBUG oslo_service.service [None req-c9759d09-8117-4682-aa2d-922d8f1f5fc3 - - - - - -] osapi_compute_listen_port = 8774 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.473 231756 DEBUG oslo_service.service [None req-c9759d09-8117-4682-aa2d-922d8f1f5fc3 - - - - - -] osapi_compute_unique_server_name_scope = log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.473 231756 DEBUG oslo_service.service [None req-c9759d09-8117-4682-aa2d-922d8f1f5fc3 - - - - - -] osapi_compute_workers = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.474 231756 DEBUG oslo_service.service [None req-c9759d09-8117-4682-aa2d-922d8f1f5fc3 - - - - - -] password_length = 12 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.474 231756 DEBUG oslo_service.service [None req-c9759d09-8117-4682-aa2d-922d8f1f5fc3 - - - - - -] periodic_enable = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.474 231756 DEBUG oslo_service.service [None 
req-c9759d09-8117-4682-aa2d-922d8f1f5fc3 - - - - - -] periodic_fuzzy_delay = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.474 231756 DEBUG oslo_service.service [None req-c9759d09-8117-4682-aa2d-922d8f1f5fc3 - - - - - -] pointer_model = usbtablet log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.474 231756 DEBUG oslo_service.service [None req-c9759d09-8117-4682-aa2d-922d8f1f5fc3 - - - - - -] preallocate_images = none log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.474 231756 DEBUG oslo_service.service [None req-c9759d09-8117-4682-aa2d-922d8f1f5fc3 - - - - - -] publish_errors = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.474 231756 DEBUG oslo_service.service [None req-c9759d09-8117-4682-aa2d-922d8f1f5fc3 - - - - - -] pybasedir = /usr/lib/python3.9/site-packages log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.474 231756 DEBUG oslo_service.service [None req-c9759d09-8117-4682-aa2d-922d8f1f5fc3 - - - - - -] ram_allocation_ratio = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.475 231756 DEBUG oslo_service.service [None req-c9759d09-8117-4682-aa2d-922d8f1f5fc3 - - - - - -] rate_limit_burst = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.475 231756 DEBUG oslo_service.service [None req-c9759d09-8117-4682-aa2d-922d8f1f5fc3 - - - - - -] rate_limit_except_level = 
CRITICAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.475 231756 DEBUG oslo_service.service [None req-c9759d09-8117-4682-aa2d-922d8f1f5fc3 - - - - - -] rate_limit_interval = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.475 231756 DEBUG oslo_service.service [None req-c9759d09-8117-4682-aa2d-922d8f1f5fc3 - - - - - -] reboot_timeout = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.475 231756 DEBUG oslo_service.service [None req-c9759d09-8117-4682-aa2d-922d8f1f5fc3 - - - - - -] reclaim_instance_interval = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.475 231756 DEBUG oslo_service.service [None req-c9759d09-8117-4682-aa2d-922d8f1f5fc3 - - - - - -] record = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.475 231756 DEBUG oslo_service.service [None req-c9759d09-8117-4682-aa2d-922d8f1f5fc3 - - - - - -] reimage_timeout_per_gb = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.475 231756 DEBUG oslo_service.service [None req-c9759d09-8117-4682-aa2d-922d8f1f5fc3 - - - - - -] report_interval = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.476 231756 DEBUG oslo_service.service [None req-c9759d09-8117-4682-aa2d-922d8f1f5fc3 - - - - - -] rescue_timeout = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 15 04:32:41 localhost 
nova_compute[231752]: 2025-12-15 09:32:41.476 231756 DEBUG oslo_service.service [None req-c9759d09-8117-4682-aa2d-922d8f1f5fc3 - - - - - -] reserved_host_cpus = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.476 231756 DEBUG oslo_service.service [None req-c9759d09-8117-4682-aa2d-922d8f1f5fc3 - - - - - -] reserved_host_disk_mb = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.476 231756 DEBUG oslo_service.service [None req-c9759d09-8117-4682-aa2d-922d8f1f5fc3 - - - - - -] reserved_host_memory_mb = 512 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.476 231756 DEBUG oslo_service.service [None req-c9759d09-8117-4682-aa2d-922d8f1f5fc3 - - - - - -] reserved_huge_pages = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.476 231756 DEBUG oslo_service.service [None req-c9759d09-8117-4682-aa2d-922d8f1f5fc3 - - - - - -] resize_confirm_window = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.476 231756 DEBUG oslo_service.service [None req-c9759d09-8117-4682-aa2d-922d8f1f5fc3 - - - - - -] resize_fs_using_block_device = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.476 231756 DEBUG oslo_service.service [None req-c9759d09-8117-4682-aa2d-922d8f1f5fc3 - - - - - -] resume_guests_state_on_host_boot = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.477 231756 DEBUG 
oslo_service.service [None req-c9759d09-8117-4682-aa2d-922d8f1f5fc3 - - - - - -] rootwrap_config = /etc/nova/rootwrap.conf log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.477 231756 DEBUG oslo_service.service [None req-c9759d09-8117-4682-aa2d-922d8f1f5fc3 - - - - - -] rpc_response_timeout = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.477 231756 DEBUG oslo_service.service [None req-c9759d09-8117-4682-aa2d-922d8f1f5fc3 - - - - - -] run_external_periodic_tasks = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.477 231756 DEBUG oslo_service.service [None req-c9759d09-8117-4682-aa2d-922d8f1f5fc3 - - - - - -] running_deleted_instance_action = reap log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.477 231756 DEBUG oslo_service.service [None req-c9759d09-8117-4682-aa2d-922d8f1f5fc3 - - - - - -] running_deleted_instance_poll_interval = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.477 231756 DEBUG oslo_service.service [None req-c9759d09-8117-4682-aa2d-922d8f1f5fc3 - - - - - -] running_deleted_instance_timeout = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.477 231756 DEBUG oslo_service.service [None req-c9759d09-8117-4682-aa2d-922d8f1f5fc3 - - - - - -] scheduler_instance_sync_interval = 120 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.477 231756 DEBUG 
oslo_service.service [None req-c9759d09-8117-4682-aa2d-922d8f1f5fc3 - - - - - -] service_down_time = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.478 231756 DEBUG oslo_service.service [None req-c9759d09-8117-4682-aa2d-922d8f1f5fc3 - - - - - -] servicegroup_driver = db log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.478 231756 DEBUG oslo_service.service [None req-c9759d09-8117-4682-aa2d-922d8f1f5fc3 - - - - - -] shelved_offload_time = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.478 231756 DEBUG oslo_service.service [None req-c9759d09-8117-4682-aa2d-922d8f1f5fc3 - - - - - -] shelved_poll_interval = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.478 231756 DEBUG oslo_service.service [None req-c9759d09-8117-4682-aa2d-922d8f1f5fc3 - - - - - -] shutdown_timeout = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.478 231756 DEBUG oslo_service.service [None req-c9759d09-8117-4682-aa2d-922d8f1f5fc3 - - - - - -] source_is_ipv6 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.478 231756 DEBUG oslo_service.service [None req-c9759d09-8117-4682-aa2d-922d8f1f5fc3 - - - - - -] ssl_only = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.479 231756 DEBUG oslo_service.service [None req-c9759d09-8117-4682-aa2d-922d8f1f5fc3 - - - - - -] state_path = /var/lib/nova 
log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.479 231756 DEBUG oslo_service.service [None req-c9759d09-8117-4682-aa2d-922d8f1f5fc3 - - - - - -] sync_power_state_interval = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.479 231756 DEBUG oslo_service.service [None req-c9759d09-8117-4682-aa2d-922d8f1f5fc3 - - - - - -] sync_power_state_pool_size = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.479 231756 DEBUG oslo_service.service [None req-c9759d09-8117-4682-aa2d-922d8f1f5fc3 - - - - - -] syslog_log_facility = LOG_USER log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.479 231756 DEBUG oslo_service.service [None req-c9759d09-8117-4682-aa2d-922d8f1f5fc3 - - - - - -] tempdir = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.479 231756 DEBUG oslo_service.service [None req-c9759d09-8117-4682-aa2d-922d8f1f5fc3 - - - - - -] timeout_nbd = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.479 231756 DEBUG oslo_service.service [None req-c9759d09-8117-4682-aa2d-922d8f1f5fc3 - - - - - -] transport_url = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.479 231756 DEBUG oslo_service.service [None req-c9759d09-8117-4682-aa2d-922d8f1f5fc3 - - - - - -] update_resources_interval = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 15 04:32:41 
localhost nova_compute[231752]: 2025-12-15 09:32:41.480 231756 DEBUG oslo_service.service [None req-c9759d09-8117-4682-aa2d-922d8f1f5fc3 - - - - - -] use_cow_images = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.480 231756 DEBUG oslo_service.service [None req-c9759d09-8117-4682-aa2d-922d8f1f5fc3 - - - - - -] use_eventlog = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.480 231756 DEBUG oslo_service.service [None req-c9759d09-8117-4682-aa2d-922d8f1f5fc3 - - - - - -] use_journal = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.480 231756 DEBUG oslo_service.service [None req-c9759d09-8117-4682-aa2d-922d8f1f5fc3 - - - - - -] use_json = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.480 231756 DEBUG oslo_service.service [None req-c9759d09-8117-4682-aa2d-922d8f1f5fc3 - - - - - -] use_rootwrap_daemon = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.480 231756 DEBUG oslo_service.service [None req-c9759d09-8117-4682-aa2d-922d8f1f5fc3 - - - - - -] use_stderr = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.480 231756 DEBUG oslo_service.service [None req-c9759d09-8117-4682-aa2d-922d8f1f5fc3 - - - - - -] use_syslog = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.481 231756 DEBUG oslo_service.service [None 
req-c9759d09-8117-4682-aa2d-922d8f1f5fc3 - - - - - -] vcpu_pin_set = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.481 231756 DEBUG oslo_service.service [None req-c9759d09-8117-4682-aa2d-922d8f1f5fc3 - - - - - -] vif_plugging_is_fatal = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.481 231756 DEBUG oslo_service.service [None req-c9759d09-8117-4682-aa2d-922d8f1f5fc3 - - - - - -] vif_plugging_timeout = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.481 231756 DEBUG oslo_service.service [None req-c9759d09-8117-4682-aa2d-922d8f1f5fc3 - - - - - -] virt_mkfs = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.481 231756 DEBUG oslo_service.service [None req-c9759d09-8117-4682-aa2d-922d8f1f5fc3 - - - - - -] volume_usage_poll_interval = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.481 231756 DEBUG oslo_service.service [None req-c9759d09-8117-4682-aa2d-922d8f1f5fc3 - - - - - -] watch_log_file = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.481 231756 DEBUG oslo_service.service [None req-c9759d09-8117-4682-aa2d-922d8f1f5fc3 - - - - - -] web = /usr/share/spice-html5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.481 231756 DEBUG oslo_service.service [None req-c9759d09-8117-4682-aa2d-922d8f1f5fc3 - - - - - -] oslo_concurrency.disable_process_locking = False 
log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.482 231756 DEBUG oslo_service.service [None req-c9759d09-8117-4682-aa2d-922d8f1f5fc3 - - - - - -] oslo_concurrency.lock_path = /var/lib/nova/tmp log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.482 231756 DEBUG oslo_service.service [None req-c9759d09-8117-4682-aa2d-922d8f1f5fc3 - - - - - -] oslo_messaging_metrics.metrics_buffer_size = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.482 231756 DEBUG oslo_service.service [None req-c9759d09-8117-4682-aa2d-922d8f1f5fc3 - - - - - -] oslo_messaging_metrics.metrics_enabled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.482 231756 DEBUG oslo_service.service [None req-c9759d09-8117-4682-aa2d-922d8f1f5fc3 - - - - - -] oslo_messaging_metrics.metrics_process_name = log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.482 231756 DEBUG oslo_service.service [None req-c9759d09-8117-4682-aa2d-922d8f1f5fc3 - - - - - -] oslo_messaging_metrics.metrics_socket_file = /var/tmp/metrics_collector.sock log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.482 231756 DEBUG oslo_service.service [None req-c9759d09-8117-4682-aa2d-922d8f1f5fc3 - - - - - -] oslo_messaging_metrics.metrics_thread_stop_timeout = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.482 231756 DEBUG oslo_service.service [None 
req-c9759d09-8117-4682-aa2d-922d8f1f5fc3 - - - - - -] api.auth_strategy = keystone log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.483 231756 DEBUG oslo_service.service [None req-c9759d09-8117-4682-aa2d-922d8f1f5fc3 - - - - - -] api.compute_link_prefix = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.483 231756 DEBUG oslo_service.service [None req-c9759d09-8117-4682-aa2d-922d8f1f5fc3 - - - - - -] api.config_drive_skip_versions = 1.0 2007-01-19 2007-03-01 2007-08-29 2007-10-10 2007-12-15 2008-02-01 2008-09-01 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.483 231756 DEBUG oslo_service.service [None req-c9759d09-8117-4682-aa2d-922d8f1f5fc3 - - - - - -] api.dhcp_domain = log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.483 231756 DEBUG oslo_service.service [None req-c9759d09-8117-4682-aa2d-922d8f1f5fc3 - - - - - -] api.enable_instance_password = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.483 231756 DEBUG oslo_service.service [None req-c9759d09-8117-4682-aa2d-922d8f1f5fc3 - - - - - -] api.glance_link_prefix = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.483 231756 DEBUG oslo_service.service [None req-c9759d09-8117-4682-aa2d-922d8f1f5fc3 - - - - - -] api.instance_list_cells_batch_fixed_size = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.483 231756 DEBUG 
oslo_service.service [None req-c9759d09-8117-4682-aa2d-922d8f1f5fc3 - - - - - -] api.instance_list_cells_batch_strategy = distributed log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.483 231756 DEBUG oslo_service.service [None req-c9759d09-8117-4682-aa2d-922d8f1f5fc3 - - - - - -] api.instance_list_per_project_cells = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.484 231756 DEBUG oslo_service.service [None req-c9759d09-8117-4682-aa2d-922d8f1f5fc3 - - - - - -] api.list_records_by_skipping_down_cells = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.484 231756 DEBUG oslo_service.service [None req-c9759d09-8117-4682-aa2d-922d8f1f5fc3 - - - - - -] api.local_metadata_per_cell = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.484 231756 DEBUG oslo_service.service [None req-c9759d09-8117-4682-aa2d-922d8f1f5fc3 - - - - - -] api.max_limit = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.484 231756 DEBUG oslo_service.service [None req-c9759d09-8117-4682-aa2d-922d8f1f5fc3 - - - - - -] api.metadata_cache_expiration = 15 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.484 231756 DEBUG oslo_service.service [None req-c9759d09-8117-4682-aa2d-922d8f1f5fc3 - - - - - -] api.neutron_default_tenant_id = default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.484 231756 DEBUG 
oslo_service.service [None req-c9759d09-8117-4682-aa2d-922d8f1f5fc3 - - - - - -] api.use_forwarded_for = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.484 231756 DEBUG oslo_service.service [None req-c9759d09-8117-4682-aa2d-922d8f1f5fc3 - - - - - -] api.use_neutron_default_nets = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.485 231756 DEBUG oslo_service.service [None req-c9759d09-8117-4682-aa2d-922d8f1f5fc3 - - - - - -] api.vendordata_dynamic_connect_timeout = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.485 231756 DEBUG oslo_service.service [None req-c9759d09-8117-4682-aa2d-922d8f1f5fc3 - - - - - -] api.vendordata_dynamic_failure_fatal = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.485 231756 DEBUG oslo_service.service [None req-c9759d09-8117-4682-aa2d-922d8f1f5fc3 - - - - - -] api.vendordata_dynamic_read_timeout = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.485 231756 DEBUG oslo_service.service [None req-c9759d09-8117-4682-aa2d-922d8f1f5fc3 - - - - - -] api.vendordata_dynamic_ssl_certfile = log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.485 231756 DEBUG oslo_service.service [None req-c9759d09-8117-4682-aa2d-922d8f1f5fc3 - - - - - -] api.vendordata_dynamic_targets = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.485 231756 DEBUG 
oslo_service.service [None req-c9759d09-8117-4682-aa2d-922d8f1f5fc3 - - - - - -] api.vendordata_jsonfile_path = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.485 231756 DEBUG oslo_service.service [None req-c9759d09-8117-4682-aa2d-922d8f1f5fc3 - - - - - -] api.vendordata_providers = ['StaticJSON'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.486 231756 DEBUG oslo_service.service [None req-c9759d09-8117-4682-aa2d-922d8f1f5fc3 - - - - - -] cache.backend = oslo_cache.dict log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.486 231756 DEBUG oslo_service.service [None req-c9759d09-8117-4682-aa2d-922d8f1f5fc3 - - - - - -] cache.backend_argument = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.486 231756 DEBUG oslo_service.service [None req-c9759d09-8117-4682-aa2d-922d8f1f5fc3 - - - - - -] cache.config_prefix = cache.oslo log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.486 231756 DEBUG oslo_service.service [None req-c9759d09-8117-4682-aa2d-922d8f1f5fc3 - - - - - -] cache.dead_timeout = 60.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.486 231756 DEBUG oslo_service.service [None req-c9759d09-8117-4682-aa2d-922d8f1f5fc3 - - - - - -] cache.debug_cache_backend = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.486 231756 DEBUG oslo_service.service [None 
req-c9759d09-8117-4682-aa2d-922d8f1f5fc3 - - - - - -] cache.enable_retry_client = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.486 231756 DEBUG oslo_service.service [None req-c9759d09-8117-4682-aa2d-922d8f1f5fc3 - - - - - -] cache.enable_socket_keepalive = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.487 231756 DEBUG oslo_service.service [None req-c9759d09-8117-4682-aa2d-922d8f1f5fc3 - - - - - -] cache.enabled = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.487 231756 DEBUG oslo_service.service [None req-c9759d09-8117-4682-aa2d-922d8f1f5fc3 - - - - - -] cache.expiration_time = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.487 231756 DEBUG oslo_service.service [None req-c9759d09-8117-4682-aa2d-922d8f1f5fc3 - - - - - -] cache.hashclient_retry_attempts = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.487 231756 DEBUG oslo_service.service [None req-c9759d09-8117-4682-aa2d-922d8f1f5fc3 - - - - - -] cache.hashclient_retry_delay = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.487 231756 DEBUG oslo_service.service [None req-c9759d09-8117-4682-aa2d-922d8f1f5fc3 - - - - - -] cache.memcache_dead_retry = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.487 231756 DEBUG oslo_service.service [None req-c9759d09-8117-4682-aa2d-922d8f1f5fc3 - - - - - -] 
cache.memcache_password = log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.487 231756 DEBUG oslo_service.service [None req-c9759d09-8117-4682-aa2d-922d8f1f5fc3 - - - - - -] cache.memcache_pool_connection_get_timeout = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.487 231756 DEBUG oslo_service.service [None req-c9759d09-8117-4682-aa2d-922d8f1f5fc3 - - - - - -] cache.memcache_pool_flush_on_reconnect = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.488 231756 DEBUG oslo_service.service [None req-c9759d09-8117-4682-aa2d-922d8f1f5fc3 - - - - - -] cache.memcache_pool_maxsize = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.488 231756 DEBUG oslo_service.service [None req-c9759d09-8117-4682-aa2d-922d8f1f5fc3 - - - - - -] cache.memcache_pool_unused_timeout = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.488 231756 DEBUG oslo_service.service [None req-c9759d09-8117-4682-aa2d-922d8f1f5fc3 - - - - - -] cache.memcache_sasl_enabled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.488 231756 DEBUG oslo_service.service [None req-c9759d09-8117-4682-aa2d-922d8f1f5fc3 - - - - - -] cache.memcache_servers = ['localhost:11211'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.488 231756 DEBUG oslo_service.service [None req-c9759d09-8117-4682-aa2d-922d8f1f5fc3 - - - - - -] 
cache.memcache_socket_timeout = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.488 231756 DEBUG oslo_service.service [None req-c9759d09-8117-4682-aa2d-922d8f1f5fc3 - - - - - -] cache.memcache_username = log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.488 231756 DEBUG oslo_service.service [None req-c9759d09-8117-4682-aa2d-922d8f1f5fc3 - - - - - -] cache.proxies = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.489 231756 DEBUG oslo_service.service [None req-c9759d09-8117-4682-aa2d-922d8f1f5fc3 - - - - - -] cache.retry_attempts = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.489 231756 DEBUG oslo_service.service [None req-c9759d09-8117-4682-aa2d-922d8f1f5fc3 - - - - - -] cache.retry_delay = 0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.489 231756 DEBUG oslo_service.service [None req-c9759d09-8117-4682-aa2d-922d8f1f5fc3 - - - - - -] cache.socket_keepalive_count = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.489 231756 DEBUG oslo_service.service [None req-c9759d09-8117-4682-aa2d-922d8f1f5fc3 - - - - - -] cache.socket_keepalive_idle = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.489 231756 DEBUG oslo_service.service [None req-c9759d09-8117-4682-aa2d-922d8f1f5fc3 - - - - - -] cache.socket_keepalive_interval = 1 log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.489 231756 DEBUG oslo_service.service [None req-c9759d09-8117-4682-aa2d-922d8f1f5fc3 - - - - - -] cache.tls_allowed_ciphers = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.489 231756 DEBUG oslo_service.service [None req-c9759d09-8117-4682-aa2d-922d8f1f5fc3 - - - - - -] cache.tls_cafile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.489 231756 DEBUG oslo_service.service [None req-c9759d09-8117-4682-aa2d-922d8f1f5fc3 - - - - - -] cache.tls_certfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.490 231756 DEBUG oslo_service.service [None req-c9759d09-8117-4682-aa2d-922d8f1f5fc3 - - - - - -] cache.tls_enabled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.490 231756 DEBUG oslo_service.service [None req-c9759d09-8117-4682-aa2d-922d8f1f5fc3 - - - - - -] cache.tls_keyfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.490 231756 DEBUG oslo_service.service [None req-c9759d09-8117-4682-aa2d-922d8f1f5fc3 - - - - - -] cinder.auth_section = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.490 231756 DEBUG oslo_service.service [None req-c9759d09-8117-4682-aa2d-922d8f1f5fc3 - - - - - -] cinder.auth_type = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:41 
localhost nova_compute[231752]: 2025-12-15 09:32:41.490 231756 DEBUG oslo_service.service [None req-c9759d09-8117-4682-aa2d-922d8f1f5fc3 - - - - - -] cinder.cafile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.490 231756 DEBUG oslo_service.service [None req-c9759d09-8117-4682-aa2d-922d8f1f5fc3 - - - - - -] cinder.catalog_info = volumev3:cinderv3:internalURL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.490 231756 DEBUG oslo_service.service [None req-c9759d09-8117-4682-aa2d-922d8f1f5fc3 - - - - - -] cinder.certfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.491 231756 DEBUG oslo_service.service [None req-c9759d09-8117-4682-aa2d-922d8f1f5fc3 - - - - - -] cinder.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.491 231756 DEBUG oslo_service.service [None req-c9759d09-8117-4682-aa2d-922d8f1f5fc3 - - - - - -] cinder.cross_az_attach = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.491 231756 DEBUG oslo_service.service [None req-c9759d09-8117-4682-aa2d-922d8f1f5fc3 - - - - - -] cinder.debug = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.491 231756 DEBUG oslo_service.service [None req-c9759d09-8117-4682-aa2d-922d8f1f5fc3 - - - - - -] cinder.endpoint_template = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.491 231756 DEBUG 
oslo_service.service [None req-c9759d09-8117-4682-aa2d-922d8f1f5fc3 - - - - - -] cinder.http_retries = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.491 231756 DEBUG oslo_service.service [None req-c9759d09-8117-4682-aa2d-922d8f1f5fc3 - - - - - -] cinder.insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.491 231756 DEBUG oslo_service.service [None req-c9759d09-8117-4682-aa2d-922d8f1f5fc3 - - - - - -] cinder.keyfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.491 231756 DEBUG oslo_service.service [None req-c9759d09-8117-4682-aa2d-922d8f1f5fc3 - - - - - -] cinder.os_region_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.492 231756 DEBUG oslo_service.service [None req-c9759d09-8117-4682-aa2d-922d8f1f5fc3 - - - - - -] cinder.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.492 231756 DEBUG oslo_service.service [None req-c9759d09-8117-4682-aa2d-922d8f1f5fc3 - - - - - -] cinder.timeout = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.492 231756 DEBUG oslo_service.service [None req-c9759d09-8117-4682-aa2d-922d8f1f5fc3 - - - - - -] compute.consecutive_build_service_disable_threshold = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.492 231756 DEBUG oslo_service.service [None req-c9759d09-8117-4682-aa2d-922d8f1f5fc3 - 
- - - - -] compute.cpu_dedicated_set = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.492 231756 DEBUG oslo_service.service [None req-c9759d09-8117-4682-aa2d-922d8f1f5fc3 - - - - - -] compute.cpu_shared_set = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.492 231756 DEBUG oslo_service.service [None req-c9759d09-8117-4682-aa2d-922d8f1f5fc3 - - - - - -] compute.image_type_exclude_list = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.492 231756 DEBUG oslo_service.service [None req-c9759d09-8117-4682-aa2d-922d8f1f5fc3 - - - - - -] compute.live_migration_wait_for_vif_plug = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.493 231756 DEBUG oslo_service.service [None req-c9759d09-8117-4682-aa2d-922d8f1f5fc3 - - - - - -] compute.max_concurrent_disk_ops = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.493 231756 DEBUG oslo_service.service [None req-c9759d09-8117-4682-aa2d-922d8f1f5fc3 - - - - - -] compute.max_disk_devices_to_attach = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.493 231756 DEBUG oslo_service.service [None req-c9759d09-8117-4682-aa2d-922d8f1f5fc3 - - - - - -] compute.packing_host_numa_cells_allocation_strategy = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.493 231756 DEBUG oslo_service.service [None 
req-c9759d09-8117-4682-aa2d-922d8f1f5fc3 - - - - - -] compute.provider_config_location = /etc/nova/provider_config/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.493 231756 DEBUG oslo_service.service [None req-c9759d09-8117-4682-aa2d-922d8f1f5fc3 - - - - - -] compute.resource_provider_association_refresh = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.493 231756 DEBUG oslo_service.service [None req-c9759d09-8117-4682-aa2d-922d8f1f5fc3 - - - - - -] compute.shutdown_retry_interval = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.493 231756 DEBUG oslo_service.service [None req-c9759d09-8117-4682-aa2d-922d8f1f5fc3 - - - - - -] compute.vmdk_allowed_types = ['streamOptimized', 'monolithicSparse'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.493 231756 DEBUG oslo_service.service [None req-c9759d09-8117-4682-aa2d-922d8f1f5fc3 - - - - - -] conductor.workers = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.494 231756 DEBUG oslo_service.service [None req-c9759d09-8117-4682-aa2d-922d8f1f5fc3 - - - - - -] console.allowed_origins = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.494 231756 DEBUG oslo_service.service [None req-c9759d09-8117-4682-aa2d-922d8f1f5fc3 - - - - - -] console.ssl_ciphers = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.494 231756 DEBUG 
oslo_service.service [None req-c9759d09-8117-4682-aa2d-922d8f1f5fc3 - - - - - -] console.ssl_minimum_version = default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.494 231756 DEBUG oslo_service.service [None req-c9759d09-8117-4682-aa2d-922d8f1f5fc3 - - - - - -] consoleauth.token_ttl = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.494 231756 DEBUG oslo_service.service [None req-c9759d09-8117-4682-aa2d-922d8f1f5fc3 - - - - - -] cyborg.cafile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.494 231756 DEBUG oslo_service.service [None req-c9759d09-8117-4682-aa2d-922d8f1f5fc3 - - - - - -] cyborg.certfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.494 231756 DEBUG oslo_service.service [None req-c9759d09-8117-4682-aa2d-922d8f1f5fc3 - - - - - -] cyborg.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.495 231756 DEBUG oslo_service.service [None req-c9759d09-8117-4682-aa2d-922d8f1f5fc3 - - - - - -] cyborg.connect_retries = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.495 231756 DEBUG oslo_service.service [None req-c9759d09-8117-4682-aa2d-922d8f1f5fc3 - - - - - -] cyborg.connect_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.495 231756 DEBUG oslo_service.service [None req-c9759d09-8117-4682-aa2d-922d8f1f5fc3 - - - 
- - -] cyborg.endpoint_override = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.495 231756 DEBUG oslo_service.service [None req-c9759d09-8117-4682-aa2d-922d8f1f5fc3 - - - - - -] cyborg.insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.495 231756 DEBUG oslo_service.service [None req-c9759d09-8117-4682-aa2d-922d8f1f5fc3 - - - - - -] cyborg.keyfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.495 231756 DEBUG oslo_service.service [None req-c9759d09-8117-4682-aa2d-922d8f1f5fc3 - - - - - -] cyborg.max_version = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.495 231756 DEBUG oslo_service.service [None req-c9759d09-8117-4682-aa2d-922d8f1f5fc3 - - - - - -] cyborg.min_version = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.495 231756 DEBUG oslo_service.service [None req-c9759d09-8117-4682-aa2d-922d8f1f5fc3 - - - - - -] cyborg.region_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.496 231756 DEBUG oslo_service.service [None req-c9759d09-8117-4682-aa2d-922d8f1f5fc3 - - - - - -] cyborg.service_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.496 231756 DEBUG oslo_service.service [None req-c9759d09-8117-4682-aa2d-922d8f1f5fc3 - - - - - -] cyborg.service_type = accelerator log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.496 231756 DEBUG oslo_service.service [None req-c9759d09-8117-4682-aa2d-922d8f1f5fc3 - - - - - -] cyborg.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.496 231756 DEBUG oslo_service.service [None req-c9759d09-8117-4682-aa2d-922d8f1f5fc3 - - - - - -] cyborg.status_code_retries = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.496 231756 DEBUG oslo_service.service [None req-c9759d09-8117-4682-aa2d-922d8f1f5fc3 - - - - - -] cyborg.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.496 231756 DEBUG oslo_service.service [None req-c9759d09-8117-4682-aa2d-922d8f1f5fc3 - - - - - -] cyborg.timeout = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.496 231756 DEBUG oslo_service.service [None req-c9759d09-8117-4682-aa2d-922d8f1f5fc3 - - - - - -] cyborg.valid_interfaces = ['internal', 'public'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.496 231756 DEBUG oslo_service.service [None req-c9759d09-8117-4682-aa2d-922d8f1f5fc3 - - - - - -] cyborg.version = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.497 231756 DEBUG oslo_service.service [None req-c9759d09-8117-4682-aa2d-922d8f1f5fc3 - - - - - -] database.backend = sqlalchemy log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.497 231756 DEBUG oslo_service.service [None req-c9759d09-8117-4682-aa2d-922d8f1f5fc3 - - - - - -] database.connection = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.497 231756 DEBUG oslo_service.service [None req-c9759d09-8117-4682-aa2d-922d8f1f5fc3 - - - - - -] database.connection_debug = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.497 231756 DEBUG oslo_service.service [None req-c9759d09-8117-4682-aa2d-922d8f1f5fc3 - - - - - -] database.connection_parameters = log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.497 231756 DEBUG oslo_service.service [None req-c9759d09-8117-4682-aa2d-922d8f1f5fc3 - - - - - -] database.connection_recycle_time = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.497 231756 DEBUG oslo_service.service [None req-c9759d09-8117-4682-aa2d-922d8f1f5fc3 - - - - - -] database.connection_trace = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.497 231756 DEBUG oslo_service.service [None req-c9759d09-8117-4682-aa2d-922d8f1f5fc3 - - - - - -] database.db_inc_retry_interval = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.498 231756 DEBUG oslo_service.service [None req-c9759d09-8117-4682-aa2d-922d8f1f5fc3 - - - - - -] database.db_max_retries = 20 log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.498 231756 DEBUG oslo_service.service [None req-c9759d09-8117-4682-aa2d-922d8f1f5fc3 - - - - - -] database.db_max_retry_interval = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.498 231756 DEBUG oslo_service.service [None req-c9759d09-8117-4682-aa2d-922d8f1f5fc3 - - - - - -] database.db_retry_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.498 231756 DEBUG oslo_service.service [None req-c9759d09-8117-4682-aa2d-922d8f1f5fc3 - - - - - -] database.max_overflow = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.498 231756 DEBUG oslo_service.service [None req-c9759d09-8117-4682-aa2d-922d8f1f5fc3 - - - - - -] database.max_pool_size = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.498 231756 DEBUG oslo_service.service [None req-c9759d09-8117-4682-aa2d-922d8f1f5fc3 - - - - - -] database.max_retries = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.498 231756 DEBUG oslo_service.service [None req-c9759d09-8117-4682-aa2d-922d8f1f5fc3 - - - - - -] database.mysql_enable_ndb = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.498 231756 DEBUG oslo_service.service [None req-c9759d09-8117-4682-aa2d-922d8f1f5fc3 - - - - - -] database.mysql_sql_mode = TRADITIONAL log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.499 231756 DEBUG oslo_service.service [None req-c9759d09-8117-4682-aa2d-922d8f1f5fc3 - - - - - -] database.mysql_wsrep_sync_wait = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.499 231756 DEBUG oslo_service.service [None req-c9759d09-8117-4682-aa2d-922d8f1f5fc3 - - - - - -] database.pool_timeout = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.499 231756 DEBUG oslo_service.service [None req-c9759d09-8117-4682-aa2d-922d8f1f5fc3 - - - - - -] database.retry_interval = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.499 231756 DEBUG oslo_service.service [None req-c9759d09-8117-4682-aa2d-922d8f1f5fc3 - - - - - -] database.slave_connection = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.499 231756 DEBUG oslo_service.service [None req-c9759d09-8117-4682-aa2d-922d8f1f5fc3 - - - - - -] database.sqlite_synchronous = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.499 231756 DEBUG oslo_service.service [None req-c9759d09-8117-4682-aa2d-922d8f1f5fc3 - - - - - -] api_database.backend = sqlalchemy log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.499 231756 DEBUG oslo_service.service [None req-c9759d09-8117-4682-aa2d-922d8f1f5fc3 - - - - - -] api_database.connection = **** log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.500 231756 DEBUG oslo_service.service [None req-c9759d09-8117-4682-aa2d-922d8f1f5fc3 - - - - - -] api_database.connection_debug = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.500 231756 DEBUG oslo_service.service [None req-c9759d09-8117-4682-aa2d-922d8f1f5fc3 - - - - - -] api_database.connection_parameters = log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.500 231756 DEBUG oslo_service.service [None req-c9759d09-8117-4682-aa2d-922d8f1f5fc3 - - - - - -] api_database.connection_recycle_time = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.500 231756 DEBUG oslo_service.service [None req-c9759d09-8117-4682-aa2d-922d8f1f5fc3 - - - - - -] api_database.connection_trace = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.500 231756 DEBUG oslo_service.service [None req-c9759d09-8117-4682-aa2d-922d8f1f5fc3 - - - - - -] api_database.db_inc_retry_interval = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.500 231756 DEBUG oslo_service.service [None req-c9759d09-8117-4682-aa2d-922d8f1f5fc3 - - - - - -] api_database.db_max_retries = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.500 231756 DEBUG oslo_service.service [None req-c9759d09-8117-4682-aa2d-922d8f1f5fc3 - - - - - -] api_database.db_max_retry_interval = 10 log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.500 231756 DEBUG oslo_service.service [None req-c9759d09-8117-4682-aa2d-922d8f1f5fc3 - - - - - -] api_database.db_retry_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.501 231756 DEBUG oslo_service.service [None req-c9759d09-8117-4682-aa2d-922d8f1f5fc3 - - - - - -] api_database.max_overflow = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.501 231756 DEBUG oslo_service.service [None req-c9759d09-8117-4682-aa2d-922d8f1f5fc3 - - - - - -] api_database.max_pool_size = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.501 231756 DEBUG oslo_service.service [None req-c9759d09-8117-4682-aa2d-922d8f1f5fc3 - - - - - -] api_database.max_retries = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.501 231756 DEBUG oslo_service.service [None req-c9759d09-8117-4682-aa2d-922d8f1f5fc3 - - - - - -] api_database.mysql_enable_ndb = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.501 231756 DEBUG oslo_service.service [None req-c9759d09-8117-4682-aa2d-922d8f1f5fc3 - - - - - -] api_database.mysql_sql_mode = TRADITIONAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.501 231756 DEBUG oslo_service.service [None req-c9759d09-8117-4682-aa2d-922d8f1f5fc3 - - - - - -] api_database.mysql_wsrep_sync_wait = None log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.501 231756 DEBUG oslo_service.service [None req-c9759d09-8117-4682-aa2d-922d8f1f5fc3 - - - - - -] api_database.pool_timeout = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.502 231756 DEBUG oslo_service.service [None req-c9759d09-8117-4682-aa2d-922d8f1f5fc3 - - - - - -] api_database.retry_interval = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.502 231756 DEBUG oslo_service.service [None req-c9759d09-8117-4682-aa2d-922d8f1f5fc3 - - - - - -] api_database.slave_connection = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.502 231756 DEBUG oslo_service.service [None req-c9759d09-8117-4682-aa2d-922d8f1f5fc3 - - - - - -] api_database.sqlite_synchronous = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.502 231756 DEBUG oslo_service.service [None req-c9759d09-8117-4682-aa2d-922d8f1f5fc3 - - - - - -] devices.enabled_mdev_types = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.502 231756 DEBUG oslo_service.service [None req-c9759d09-8117-4682-aa2d-922d8f1f5fc3 - - - - - -] ephemeral_storage_encryption.cipher = aes-xts-plain64 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.502 231756 DEBUG oslo_service.service [None req-c9759d09-8117-4682-aa2d-922d8f1f5fc3 - - - - - -] ephemeral_storage_encryption.enabled = False log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.502 231756 DEBUG oslo_service.service [None req-c9759d09-8117-4682-aa2d-922d8f1f5fc3 - - - - - -] ephemeral_storage_encryption.key_size = 512 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.502 231756 DEBUG oslo_service.service [None req-c9759d09-8117-4682-aa2d-922d8f1f5fc3 - - - - - -] glance.api_servers = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.503 231756 DEBUG oslo_service.service [None req-c9759d09-8117-4682-aa2d-922d8f1f5fc3 - - - - - -] glance.cafile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.503 231756 DEBUG oslo_service.service [None req-c9759d09-8117-4682-aa2d-922d8f1f5fc3 - - - - - -] glance.certfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.503 231756 DEBUG oslo_service.service [None req-c9759d09-8117-4682-aa2d-922d8f1f5fc3 - - - - - -] glance.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.503 231756 DEBUG oslo_service.service [None req-c9759d09-8117-4682-aa2d-922d8f1f5fc3 - - - - - -] glance.connect_retries = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.503 231756 DEBUG oslo_service.service [None req-c9759d09-8117-4682-aa2d-922d8f1f5fc3 - - - - - -] glance.connect_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 
15 04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.503 231756 DEBUG oslo_service.service [None req-c9759d09-8117-4682-aa2d-922d8f1f5fc3 - - - - - -] glance.debug = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.503 231756 DEBUG oslo_service.service [None req-c9759d09-8117-4682-aa2d-922d8f1f5fc3 - - - - - -] glance.default_trusted_certificate_ids = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.504 231756 DEBUG oslo_service.service [None req-c9759d09-8117-4682-aa2d-922d8f1f5fc3 - - - - - -] glance.enable_certificate_validation = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.504 231756 DEBUG oslo_service.service [None req-c9759d09-8117-4682-aa2d-922d8f1f5fc3 - - - - - -] glance.enable_rbd_download = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.504 231756 DEBUG oslo_service.service [None req-c9759d09-8117-4682-aa2d-922d8f1f5fc3 - - - - - -] glance.endpoint_override = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.504 231756 DEBUG oslo_service.service [None req-c9759d09-8117-4682-aa2d-922d8f1f5fc3 - - - - - -] glance.insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.504 231756 DEBUG oslo_service.service [None req-c9759d09-8117-4682-aa2d-922d8f1f5fc3 - - - - - -] glance.keyfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 
09:32:41.504 231756 DEBUG oslo_service.service [None req-c9759d09-8117-4682-aa2d-922d8f1f5fc3 - - - - - -] glance.max_version = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.504 231756 DEBUG oslo_service.service [None req-c9759d09-8117-4682-aa2d-922d8f1f5fc3 - - - - - -] glance.min_version = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.504 231756 DEBUG oslo_service.service [None req-c9759d09-8117-4682-aa2d-922d8f1f5fc3 - - - - - -] glance.num_retries = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.505 231756 DEBUG oslo_service.service [None req-c9759d09-8117-4682-aa2d-922d8f1f5fc3 - - - - - -] glance.rbd_ceph_conf = log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.505 231756 DEBUG oslo_service.service [None req-c9759d09-8117-4682-aa2d-922d8f1f5fc3 - - - - - -] glance.rbd_connect_timeout = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.505 231756 DEBUG oslo_service.service [None req-c9759d09-8117-4682-aa2d-922d8f1f5fc3 - - - - - -] glance.rbd_pool = log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.505 231756 DEBUG oslo_service.service [None req-c9759d09-8117-4682-aa2d-922d8f1f5fc3 - - - - - -] glance.rbd_user = log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.505 231756 DEBUG oslo_service.service [None req-c9759d09-8117-4682-aa2d-922d8f1f5fc3 - - - - - -] 
glance.region_name = regionOne log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.505 231756 DEBUG oslo_service.service [None req-c9759d09-8117-4682-aa2d-922d8f1f5fc3 - - - - - -] glance.service_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.505 231756 DEBUG oslo_service.service [None req-c9759d09-8117-4682-aa2d-922d8f1f5fc3 - - - - - -] glance.service_type = image log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.506 231756 DEBUG oslo_service.service [None req-c9759d09-8117-4682-aa2d-922d8f1f5fc3 - - - - - -] glance.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.506 231756 DEBUG oslo_service.service [None req-c9759d09-8117-4682-aa2d-922d8f1f5fc3 - - - - - -] glance.status_code_retries = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.506 231756 DEBUG oslo_service.service [None req-c9759d09-8117-4682-aa2d-922d8f1f5fc3 - - - - - -] glance.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.506 231756 DEBUG oslo_service.service [None req-c9759d09-8117-4682-aa2d-922d8f1f5fc3 - - - - - -] glance.timeout = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.506 231756 DEBUG oslo_service.service [None req-c9759d09-8117-4682-aa2d-922d8f1f5fc3 - - - - - -] glance.valid_interfaces = ['internal'] log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.506 231756 DEBUG oslo_service.service [None req-c9759d09-8117-4682-aa2d-922d8f1f5fc3 - - - - - -] glance.verify_glance_signatures = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.506 231756 DEBUG oslo_service.service [None req-c9759d09-8117-4682-aa2d-922d8f1f5fc3 - - - - - -] glance.version = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.507 231756 DEBUG oslo_service.service [None req-c9759d09-8117-4682-aa2d-922d8f1f5fc3 - - - - - -] guestfs.debug = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.507 231756 DEBUG oslo_service.service [None req-c9759d09-8117-4682-aa2d-922d8f1f5fc3 - - - - - -] hyperv.config_drive_cdrom = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.507 231756 DEBUG oslo_service.service [None req-c9759d09-8117-4682-aa2d-922d8f1f5fc3 - - - - - -] hyperv.config_drive_inject_password = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.507 231756 DEBUG oslo_service.service [None req-c9759d09-8117-4682-aa2d-922d8f1f5fc3 - - - - - -] hyperv.dynamic_memory_ratio = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.507 231756 DEBUG oslo_service.service [None req-c9759d09-8117-4682-aa2d-922d8f1f5fc3 - - - - - -] hyperv.enable_instance_metrics_collection = False log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.507 231756 DEBUG oslo_service.service [None req-c9759d09-8117-4682-aa2d-922d8f1f5fc3 - - - - - -] hyperv.enable_remotefx = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.507 231756 DEBUG oslo_service.service [None req-c9759d09-8117-4682-aa2d-922d8f1f5fc3 - - - - - -] hyperv.instances_path_share = log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.507 231756 DEBUG oslo_service.service [None req-c9759d09-8117-4682-aa2d-922d8f1f5fc3 - - - - - -] hyperv.iscsi_initiator_list = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.508 231756 DEBUG oslo_service.service [None req-c9759d09-8117-4682-aa2d-922d8f1f5fc3 - - - - - -] hyperv.limit_cpu_features = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.508 231756 DEBUG oslo_service.service [None req-c9759d09-8117-4682-aa2d-922d8f1f5fc3 - - - - - -] hyperv.mounted_disk_query_retry_count = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.508 231756 DEBUG oslo_service.service [None req-c9759d09-8117-4682-aa2d-922d8f1f5fc3 - - - - - -] hyperv.mounted_disk_query_retry_interval = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.508 231756 DEBUG oslo_service.service [None req-c9759d09-8117-4682-aa2d-922d8f1f5fc3 - - - - - -] hyperv.power_state_check_timeframe = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.508 231756 DEBUG oslo_service.service [None req-c9759d09-8117-4682-aa2d-922d8f1f5fc3 - - - - - -] hyperv.power_state_event_polling_interval = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.508 231756 DEBUG oslo_service.service [None req-c9759d09-8117-4682-aa2d-922d8f1f5fc3 - - - - - -] hyperv.qemu_img_cmd = qemu-img.exe log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.508 231756 DEBUG oslo_service.service [None req-c9759d09-8117-4682-aa2d-922d8f1f5fc3 - - - - - -] hyperv.use_multipath_io = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.509 231756 DEBUG oslo_service.service [None req-c9759d09-8117-4682-aa2d-922d8f1f5fc3 - - - - - -] hyperv.volume_attach_retry_count = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.509 231756 DEBUG oslo_service.service [None req-c9759d09-8117-4682-aa2d-922d8f1f5fc3 - - - - - -] hyperv.volume_attach_retry_interval = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.509 231756 DEBUG oslo_service.service [None req-c9759d09-8117-4682-aa2d-922d8f1f5fc3 - - - - - -] hyperv.vswitch_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.509 231756 DEBUG oslo_service.service [None req-c9759d09-8117-4682-aa2d-922d8f1f5fc3 - - - - - -] hyperv.wait_soft_reboot_seconds = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.509 231756 DEBUG oslo_service.service [None req-c9759d09-8117-4682-aa2d-922d8f1f5fc3 - - - - - -] mks.enabled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.509 231756 DEBUG oslo_service.service [None req-c9759d09-8117-4682-aa2d-922d8f1f5fc3 - - - - - -] mks.mksproxy_base_url = http://127.0.0.1:6090/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.510 231756 DEBUG oslo_service.service [None req-c9759d09-8117-4682-aa2d-922d8f1f5fc3 - - - - - -] image_cache.manager_interval = 2400 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.510 231756 DEBUG oslo_service.service [None req-c9759d09-8117-4682-aa2d-922d8f1f5fc3 - - - - - -] image_cache.precache_concurrency = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.510 231756 DEBUG oslo_service.service [None req-c9759d09-8117-4682-aa2d-922d8f1f5fc3 - - - - - -] image_cache.remove_unused_base_images = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.510 231756 DEBUG oslo_service.service [None req-c9759d09-8117-4682-aa2d-922d8f1f5fc3 - - - - - -] image_cache.remove_unused_original_minimum_age_seconds = 86400 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.510 231756 DEBUG oslo_service.service [None req-c9759d09-8117-4682-aa2d-922d8f1f5fc3 - - - - - -] image_cache.remove_unused_resized_minimum_age_seconds = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.510 231756 DEBUG oslo_service.service [None req-c9759d09-8117-4682-aa2d-922d8f1f5fc3 - - - - - -] image_cache.subdirectory_name = _base log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.510 231756 DEBUG oslo_service.service [None req-c9759d09-8117-4682-aa2d-922d8f1f5fc3 - - - - - -] ironic.api_max_retries = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.511 231756 DEBUG oslo_service.service [None req-c9759d09-8117-4682-aa2d-922d8f1f5fc3 - - - - - -] ironic.api_retry_interval = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.511 231756 DEBUG oslo_service.service [None req-c9759d09-8117-4682-aa2d-922d8f1f5fc3 - - - - - -] ironic.auth_section = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.511 231756 DEBUG oslo_service.service [None req-c9759d09-8117-4682-aa2d-922d8f1f5fc3 - - - - - -] ironic.auth_type = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.511 231756 DEBUG oslo_service.service [None req-c9759d09-8117-4682-aa2d-922d8f1f5fc3 - - - - - -] ironic.cafile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.511 231756 DEBUG oslo_service.service [None req-c9759d09-8117-4682-aa2d-922d8f1f5fc3 - - - - - -] ironic.certfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.511 231756 DEBUG oslo_service.service [None req-c9759d09-8117-4682-aa2d-922d8f1f5fc3 - - - - - -] ironic.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.511 231756 DEBUG oslo_service.service [None req-c9759d09-8117-4682-aa2d-922d8f1f5fc3 - - - - - -] ironic.connect_retries = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.511 231756 DEBUG oslo_service.service [None req-c9759d09-8117-4682-aa2d-922d8f1f5fc3 - - - - - -] ironic.connect_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.512 231756 DEBUG oslo_service.service [None req-c9759d09-8117-4682-aa2d-922d8f1f5fc3 - - - - - -] ironic.endpoint_override = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.512 231756 DEBUG oslo_service.service [None req-c9759d09-8117-4682-aa2d-922d8f1f5fc3 - - - - - -] ironic.insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.512 231756 DEBUG oslo_service.service [None req-c9759d09-8117-4682-aa2d-922d8f1f5fc3 - - - - - -] ironic.keyfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.512 231756 DEBUG oslo_service.service [None req-c9759d09-8117-4682-aa2d-922d8f1f5fc3 - - - - - -] ironic.max_version = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.512 231756 DEBUG oslo_service.service [None req-c9759d09-8117-4682-aa2d-922d8f1f5fc3 - - - - - -] ironic.min_version = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.512 231756 DEBUG oslo_service.service [None req-c9759d09-8117-4682-aa2d-922d8f1f5fc3 - - - - - -] ironic.partition_key = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.512 231756 DEBUG oslo_service.service [None req-c9759d09-8117-4682-aa2d-922d8f1f5fc3 - - - - - -] ironic.peer_list = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.513 231756 DEBUG oslo_service.service [None req-c9759d09-8117-4682-aa2d-922d8f1f5fc3 - - - - - -] ironic.region_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.513 231756 DEBUG oslo_service.service [None req-c9759d09-8117-4682-aa2d-922d8f1f5fc3 - - - - - -] ironic.serial_console_state_timeout = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.513 231756 DEBUG oslo_service.service [None req-c9759d09-8117-4682-aa2d-922d8f1f5fc3 - - - - - -] ironic.service_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.513 231756 DEBUG oslo_service.service [None req-c9759d09-8117-4682-aa2d-922d8f1f5fc3 - - - - - -] ironic.service_type = baremetal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.513 231756 DEBUG oslo_service.service [None req-c9759d09-8117-4682-aa2d-922d8f1f5fc3 - - - - - -] ironic.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.513 231756 DEBUG oslo_service.service [None req-c9759d09-8117-4682-aa2d-922d8f1f5fc3 - - - - - -] ironic.status_code_retries = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.513 231756 DEBUG oslo_service.service [None req-c9759d09-8117-4682-aa2d-922d8f1f5fc3 - - - - - -] ironic.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.513 231756 DEBUG oslo_service.service [None req-c9759d09-8117-4682-aa2d-922d8f1f5fc3 - - - - - -] ironic.timeout = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.514 231756 DEBUG oslo_service.service [None req-c9759d09-8117-4682-aa2d-922d8f1f5fc3 - - - - - -] ironic.valid_interfaces = ['internal', 'public'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.514 231756 DEBUG oslo_service.service [None req-c9759d09-8117-4682-aa2d-922d8f1f5fc3 - - - - - -] ironic.version = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.514 231756 DEBUG oslo_service.service [None req-c9759d09-8117-4682-aa2d-922d8f1f5fc3 - - - - - -] key_manager.backend = barbican log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.514 231756 DEBUG oslo_service.service [None req-c9759d09-8117-4682-aa2d-922d8f1f5fc3 - - - - - -] key_manager.fixed_key = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.514 231756 DEBUG oslo_service.service [None req-c9759d09-8117-4682-aa2d-922d8f1f5fc3 - - - - - -] barbican.auth_endpoint = http://localhost/identity/v3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.514 231756 DEBUG oslo_service.service [None req-c9759d09-8117-4682-aa2d-922d8f1f5fc3 - - - - - -] barbican.barbican_api_version = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.514 231756 DEBUG oslo_service.service [None req-c9759d09-8117-4682-aa2d-922d8f1f5fc3 - - - - - -] barbican.barbican_endpoint = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.515 231756 DEBUG oslo_service.service [None req-c9759d09-8117-4682-aa2d-922d8f1f5fc3 - - - - - -] barbican.barbican_endpoint_type = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.515 231756 DEBUG oslo_service.service [None req-c9759d09-8117-4682-aa2d-922d8f1f5fc3 - - - - - -] barbican.barbican_region_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.515 231756 DEBUG oslo_service.service [None req-c9759d09-8117-4682-aa2d-922d8f1f5fc3 - - - - - -] barbican.cafile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.515 231756 DEBUG oslo_service.service [None req-c9759d09-8117-4682-aa2d-922d8f1f5fc3 - - - - - -] barbican.certfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.515 231756 DEBUG oslo_service.service [None req-c9759d09-8117-4682-aa2d-922d8f1f5fc3 - - - - - -] barbican.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.515 231756 DEBUG oslo_service.service [None req-c9759d09-8117-4682-aa2d-922d8f1f5fc3 - - - - - -] barbican.insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.515 231756 DEBUG oslo_service.service [None req-c9759d09-8117-4682-aa2d-922d8f1f5fc3 - - - - - -] barbican.keyfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.515 231756 DEBUG oslo_service.service [None req-c9759d09-8117-4682-aa2d-922d8f1f5fc3 - - - - - -] barbican.number_of_retries = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.516 231756 DEBUG oslo_service.service [None req-c9759d09-8117-4682-aa2d-922d8f1f5fc3 - - - - - -] barbican.retry_delay = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.516 231756 DEBUG oslo_service.service [None req-c9759d09-8117-4682-aa2d-922d8f1f5fc3 - - - - - -] barbican.send_service_user_token = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.516 231756 DEBUG oslo_service.service [None req-c9759d09-8117-4682-aa2d-922d8f1f5fc3 - - - - - -] barbican.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.516 231756 DEBUG oslo_service.service [None req-c9759d09-8117-4682-aa2d-922d8f1f5fc3 - - - - - -] barbican.timeout = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.516 231756 DEBUG oslo_service.service [None req-c9759d09-8117-4682-aa2d-922d8f1f5fc3 - - - - - -] barbican.verify_ssl = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.516 231756 DEBUG oslo_service.service [None req-c9759d09-8117-4682-aa2d-922d8f1f5fc3 - - - - - -] barbican.verify_ssl_path = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.516 231756 DEBUG oslo_service.service [None req-c9759d09-8117-4682-aa2d-922d8f1f5fc3 - - - - - -] barbican_service_user.auth_section = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.516 231756 DEBUG oslo_service.service [None req-c9759d09-8117-4682-aa2d-922d8f1f5fc3 - - - - - -] barbican_service_user.auth_type = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.517 231756 DEBUG oslo_service.service [None req-c9759d09-8117-4682-aa2d-922d8f1f5fc3 - - - - - -] barbican_service_user.cafile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.517 231756 DEBUG oslo_service.service [None req-c9759d09-8117-4682-aa2d-922d8f1f5fc3 - - - - - -] barbican_service_user.certfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.517 231756 DEBUG oslo_service.service [None req-c9759d09-8117-4682-aa2d-922d8f1f5fc3 - - - - - -] barbican_service_user.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.517 231756 DEBUG oslo_service.service [None req-c9759d09-8117-4682-aa2d-922d8f1f5fc3 - - - - - -] barbican_service_user.insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.517 231756 DEBUG oslo_service.service [None req-c9759d09-8117-4682-aa2d-922d8f1f5fc3 - - - - - -] barbican_service_user.keyfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.517 231756 DEBUG oslo_service.service [None req-c9759d09-8117-4682-aa2d-922d8f1f5fc3 - - - - - -] barbican_service_user.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.517 231756 DEBUG oslo_service.service [None req-c9759d09-8117-4682-aa2d-922d8f1f5fc3 - - - - - -] barbican_service_user.timeout = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.518 231756 DEBUG oslo_service.service [None req-c9759d09-8117-4682-aa2d-922d8f1f5fc3 - - - - - -] vault.approle_role_id = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.518 231756 DEBUG oslo_service.service [None req-c9759d09-8117-4682-aa2d-922d8f1f5fc3 - - - - - -] vault.approle_secret_id = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.518 231756 DEBUG oslo_service.service [None req-c9759d09-8117-4682-aa2d-922d8f1f5fc3 - - - - - -] vault.cafile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.518 231756 DEBUG oslo_service.service [None req-c9759d09-8117-4682-aa2d-922d8f1f5fc3 - - - - - -] vault.certfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.518 231756 DEBUG oslo_service.service [None req-c9759d09-8117-4682-aa2d-922d8f1f5fc3 - - - - - -] vault.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.518 231756 DEBUG oslo_service.service [None req-c9759d09-8117-4682-aa2d-922d8f1f5fc3 - - - - - -] vault.insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.518 231756 DEBUG oslo_service.service [None req-c9759d09-8117-4682-aa2d-922d8f1f5fc3 - - - - - -] vault.keyfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.518 231756 DEBUG oslo_service.service [None req-c9759d09-8117-4682-aa2d-922d8f1f5fc3 - - - - - -] vault.kv_mountpoint = secret log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.519 231756 DEBUG oslo_service.service [None req-c9759d09-8117-4682-aa2d-922d8f1f5fc3 - - - - - -] vault.kv_version = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.519 231756 DEBUG oslo_service.service [None req-c9759d09-8117-4682-aa2d-922d8f1f5fc3 - - - - - -] vault.namespace = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.519 231756 DEBUG oslo_service.service [None req-c9759d09-8117-4682-aa2d-922d8f1f5fc3 - - - - - -] vault.root_token_id = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.519 231756 DEBUG oslo_service.service [None req-c9759d09-8117-4682-aa2d-922d8f1f5fc3 - - - - - -] vault.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.519 231756 DEBUG oslo_service.service [None req-c9759d09-8117-4682-aa2d-922d8f1f5fc3 - - - - - -] vault.ssl_ca_crt_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.519 231756 DEBUG oslo_service.service [None req-c9759d09-8117-4682-aa2d-922d8f1f5fc3 - - - - - -] vault.timeout = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.519 231756 DEBUG oslo_service.service [None req-c9759d09-8117-4682-aa2d-922d8f1f5fc3 - - - - - -] vault.use_ssl = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.520 231756 DEBUG oslo_service.service [None req-c9759d09-8117-4682-aa2d-922d8f1f5fc3 - - - - - -] vault.vault_url = http://127.0.0.1:8200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.520 231756 DEBUG oslo_service.service [None req-c9759d09-8117-4682-aa2d-922d8f1f5fc3 - - - - - -] keystone.cafile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.520 231756 DEBUG oslo_service.service [None req-c9759d09-8117-4682-aa2d-922d8f1f5fc3 - - - - - -] keystone.certfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.520 231756 DEBUG oslo_service.service [None req-c9759d09-8117-4682-aa2d-922d8f1f5fc3 - - - - - -] keystone.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.520 231756 DEBUG oslo_service.service [None req-c9759d09-8117-4682-aa2d-922d8f1f5fc3 - - - - - -] keystone.connect_retries = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.520 231756 DEBUG oslo_service.service [None req-c9759d09-8117-4682-aa2d-922d8f1f5fc3 - - - - - -] keystone.connect_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.520 231756 DEBUG oslo_service.service [None req-c9759d09-8117-4682-aa2d-922d8f1f5fc3 - - - - - -] keystone.endpoint_override = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.521 231756 DEBUG oslo_service.service [None req-c9759d09-8117-4682-aa2d-922d8f1f5fc3 - - - - - -] keystone.insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.521 231756 DEBUG oslo_service.service [None req-c9759d09-8117-4682-aa2d-922d8f1f5fc3 - - - - - -] keystone.keyfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.521 231756 DEBUG oslo_service.service [None req-c9759d09-8117-4682-aa2d-922d8f1f5fc3 - - - - - -] keystone.max_version = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.521 231756 DEBUG oslo_service.service [None req-c9759d09-8117-4682-aa2d-922d8f1f5fc3 - - - - - -] keystone.min_version = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.521 231756 DEBUG oslo_service.service [None req-c9759d09-8117-4682-aa2d-922d8f1f5fc3 - - - - - -] keystone.region_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.521 231756 DEBUG oslo_service.service [None req-c9759d09-8117-4682-aa2d-922d8f1f5fc3 - - - - - -] keystone.service_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.521 231756 DEBUG oslo_service.service [None req-c9759d09-8117-4682-aa2d-922d8f1f5fc3 - - - - - -] keystone.service_type = identity log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.521 231756 DEBUG oslo_service.service [None req-c9759d09-8117-4682-aa2d-922d8f1f5fc3 - - - - - -] keystone.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.522 231756 DEBUG oslo_service.service [None req-c9759d09-8117-4682-aa2d-922d8f1f5fc3 - - - - - -] keystone.status_code_retries = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.522 231756 DEBUG oslo_service.service [None req-c9759d09-8117-4682-aa2d-922d8f1f5fc3 - - - - - -] keystone.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.522 231756 DEBUG oslo_service.service [None req-c9759d09-8117-4682-aa2d-922d8f1f5fc3 - - - - - -] keystone.timeout = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.522 231756 DEBUG oslo_service.service [None req-c9759d09-8117-4682-aa2d-922d8f1f5fc3 - - - - - -] keystone.valid_interfaces = ['internal', 'public'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.522 231756 DEBUG oslo_service.service [None req-c9759d09-8117-4682-aa2d-922d8f1f5fc3 - - - - - -] keystone.version = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.522 231756 DEBUG oslo_service.service [None req-c9759d09-8117-4682-aa2d-922d8f1f5fc3 - - - - - -] libvirt.connection_uri = log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.522 231756 DEBUG oslo_service.service [None req-c9759d09-8117-4682-aa2d-922d8f1f5fc3 - - - - - -] libvirt.cpu_mode = host-model log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.523 231756 DEBUG oslo_service.service [None req-c9759d09-8117-4682-aa2d-922d8f1f5fc3 - - - - - -] libvirt.cpu_model_extra_flags = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.523 231756 DEBUG oslo_service.service [None req-c9759d09-8117-4682-aa2d-922d8f1f5fc3 - - - - - -] libvirt.cpu_models = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.523 231756 DEBUG oslo_service.service [None req-c9759d09-8117-4682-aa2d-922d8f1f5fc3 - - - - - -] libvirt.cpu_power_governor_high = performance log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.523 231756 DEBUG oslo_service.service [None req-c9759d09-8117-4682-aa2d-922d8f1f5fc3 - - - - - -] libvirt.cpu_power_governor_low = powersave log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.523 231756 DEBUG oslo_service.service [None req-c9759d09-8117-4682-aa2d-922d8f1f5fc3 - - - - - -] libvirt.cpu_power_management = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.523 231756 DEBUG oslo_service.service [None req-c9759d09-8117-4682-aa2d-922d8f1f5fc3 - - - - - -] libvirt.cpu_power_management_strategy = cpu_state log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.523 231756 DEBUG oslo_service.service [None req-c9759d09-8117-4682-aa2d-922d8f1f5fc3 - - - - - -] libvirt.device_detach_attempts = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.523 231756 DEBUG oslo_service.service [None req-c9759d09-8117-4682-aa2d-922d8f1f5fc3 - - - - - -] libvirt.device_detach_timeout = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.524 231756 DEBUG oslo_service.service [None req-c9759d09-8117-4682-aa2d-922d8f1f5fc3 - - - - - -] libvirt.disk_cachemodes = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.524 231756 DEBUG oslo_service.service [None req-c9759d09-8117-4682-aa2d-922d8f1f5fc3 - - - - - -] libvirt.disk_prefix = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.524 231756 DEBUG oslo_service.service [None req-c9759d09-8117-4682-aa2d-922d8f1f5fc3 - - - - - -] libvirt.enabled_perf_events = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.524 231756 DEBUG oslo_service.service [None req-c9759d09-8117-4682-aa2d-922d8f1f5fc3 - - - - - -] libvirt.file_backed_memory = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.524 231756 DEBUG oslo_service.service [None req-c9759d09-8117-4682-aa2d-922d8f1f5fc3 - - - - - -] libvirt.gid_maps = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.524 231756 DEBUG oslo_service.service [None req-c9759d09-8117-4682-aa2d-922d8f1f5fc3 - - - - - -] libvirt.hw_disk_discard = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.524 231756 DEBUG oslo_service.service [None req-c9759d09-8117-4682-aa2d-922d8f1f5fc3 - - - - - -] libvirt.hw_machine_type = ['x86_64=q35'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.525 231756 DEBUG oslo_service.service [None req-c9759d09-8117-4682-aa2d-922d8f1f5fc3 - - - - - -] libvirt.images_rbd_ceph_conf = /etc/ceph/ceph.conf log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.525 231756 DEBUG oslo_service.service [None req-c9759d09-8117-4682-aa2d-922d8f1f5fc3 - - - - - -] libvirt.images_rbd_glance_copy_poll_interval = 15 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.525 231756 DEBUG oslo_service.service [None req-c9759d09-8117-4682-aa2d-922d8f1f5fc3 - - - - - -] libvirt.images_rbd_glance_copy_timeout = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.525 231756 DEBUG oslo_service.service [None req-c9759d09-8117-4682-aa2d-922d8f1f5fc3 - - - - - -] libvirt.images_rbd_glance_store_name = default_backend log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.525 231756 DEBUG oslo_service.service [None req-c9759d09-8117-4682-aa2d-922d8f1f5fc3 - - - - - -] libvirt.images_rbd_pool = vms log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.525 231756 DEBUG oslo_service.service [None req-c9759d09-8117-4682-aa2d-922d8f1f5fc3 - - - - - -] libvirt.images_type = rbd log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.525 231756 DEBUG oslo_service.service [None req-c9759d09-8117-4682-aa2d-922d8f1f5fc3 - - - - - -] libvirt.images_volume_group = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.526 231756 DEBUG oslo_service.service [None req-c9759d09-8117-4682-aa2d-922d8f1f5fc3 - - - - - -] libvirt.inject_key = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.526 231756 DEBUG oslo_service.service [None req-c9759d09-8117-4682-aa2d-922d8f1f5fc3 - - - - - -] libvirt.inject_partition = -2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.526 231756 DEBUG oslo_service.service [None req-c9759d09-8117-4682-aa2d-922d8f1f5fc3 - - - - - -] libvirt.inject_password = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.526 231756 DEBUG oslo_service.service [None req-c9759d09-8117-4682-aa2d-922d8f1f5fc3 - - - - - -] libvirt.iscsi_iface = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.526 231756 DEBUG oslo_service.service [None req-c9759d09-8117-4682-aa2d-922d8f1f5fc3 - - - - - -] libvirt.iser_use_multipath = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.526 231756 DEBUG oslo_service.service [None req-c9759d09-8117-4682-aa2d-922d8f1f5fc3 - - - - - -] libvirt.live_migration_bandwidth = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.526 231756 DEBUG oslo_service.service [None req-c9759d09-8117-4682-aa2d-922d8f1f5fc3 - - - - - -] libvirt.live_migration_completion_timeout = 800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.527 231756 DEBUG oslo_service.service [None req-c9759d09-8117-4682-aa2d-922d8f1f5fc3 - - - - - -] libvirt.live_migration_downtime = 500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.527 231756 DEBUG oslo_service.service [None req-c9759d09-8117-4682-aa2d-922d8f1f5fc3 - - - - - -] libvirt.live_migration_downtime_delay = 75 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.527 231756 DEBUG oslo_service.service [None req-c9759d09-8117-4682-aa2d-922d8f1f5fc3 - - - - - -] libvirt.live_migration_downtime_steps = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.527 231756 DEBUG oslo_service.service [None req-c9759d09-8117-4682-aa2d-922d8f1f5fc3 - - - - - -] libvirt.live_migration_inbound_addr = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.527 231756 DEBUG oslo_service.service [None req-c9759d09-8117-4682-aa2d-922d8f1f5fc3 - - - - - -] libvirt.live_migration_permit_auto_converge = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.527 231756 DEBUG oslo_service.service [None req-c9759d09-8117-4682-aa2d-922d8f1f5fc3 - - - - - -] libvirt.live_migration_permit_post_copy = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.527 231756 DEBUG oslo_service.service [None req-c9759d09-8117-4682-aa2d-922d8f1f5fc3 - - - - - -]
libvirt.live_migration_scheme = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.527 231756 DEBUG oslo_service.service [None req-c9759d09-8117-4682-aa2d-922d8f1f5fc3 - - - - - -] libvirt.live_migration_timeout_action = force_complete log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.528 231756 DEBUG oslo_service.service [None req-c9759d09-8117-4682-aa2d-922d8f1f5fc3 - - - - - -] libvirt.live_migration_tunnelled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.528 231756 WARNING oslo_config.cfg [None req-c9759d09-8117-4682-aa2d-922d8f1f5fc3 - - - - - -] Deprecated: Option "live_migration_uri" from group "libvirt" is deprecated for removal ( Dec 15 04:32:41 localhost nova_compute[231752]: live_migration_uri is deprecated for removal in favor of two other options that Dec 15 04:32:41 localhost nova_compute[231752]: allow to change live migration scheme and target URI: ``live_migration_scheme`` Dec 15 04:32:41 localhost nova_compute[231752]: and ``live_migration_inbound_addr`` respectively. Dec 15 04:32:41 localhost nova_compute[231752]: ). 
Its value may be silently ignored in the future.#033[00m Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.528 231756 DEBUG oslo_service.service [None req-c9759d09-8117-4682-aa2d-922d8f1f5fc3 - - - - - -] libvirt.live_migration_uri = qemu+ssh://nova@%s/system?keyfile=/var/lib/nova/.ssh/ssh-privatekey log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.528 231756 DEBUG oslo_service.service [None req-c9759d09-8117-4682-aa2d-922d8f1f5fc3 - - - - - -] libvirt.live_migration_with_native_tls = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.528 231756 DEBUG oslo_service.service [None req-c9759d09-8117-4682-aa2d-922d8f1f5fc3 - - - - - -] libvirt.max_queues = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.529 231756 DEBUG oslo_service.service [None req-c9759d09-8117-4682-aa2d-922d8f1f5fc3 - - - - - -] libvirt.mem_stats_period_seconds = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.529 231756 DEBUG oslo_service.service [None req-c9759d09-8117-4682-aa2d-922d8f1f5fc3 - - - - - -] libvirt.nfs_mount_options = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.529 231756 DEBUG oslo_service.service [None req-c9759d09-8117-4682-aa2d-922d8f1f5fc3 - - - - - -] libvirt.nfs_mount_point_base = /var/lib/nova/mnt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.529 231756 DEBUG oslo_service.service [None req-c9759d09-8117-4682-aa2d-922d8f1f5fc3 - - - - - -] 
libvirt.num_aoe_discover_tries = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.529 231756 DEBUG oslo_service.service [None req-c9759d09-8117-4682-aa2d-922d8f1f5fc3 - - - - - -] libvirt.num_iser_scan_tries = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.529 231756 DEBUG oslo_service.service [None req-c9759d09-8117-4682-aa2d-922d8f1f5fc3 - - - - - -] libvirt.num_memory_encrypted_guests = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.530 231756 DEBUG oslo_service.service [None req-c9759d09-8117-4682-aa2d-922d8f1f5fc3 - - - - - -] libvirt.num_nvme_discover_tries = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.530 231756 DEBUG oslo_service.service [None req-c9759d09-8117-4682-aa2d-922d8f1f5fc3 - - - - - -] libvirt.num_pcie_ports = 24 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.530 231756 DEBUG oslo_service.service [None req-c9759d09-8117-4682-aa2d-922d8f1f5fc3 - - - - - -] libvirt.num_volume_scan_tries = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.530 231756 DEBUG oslo_service.service [None req-c9759d09-8117-4682-aa2d-922d8f1f5fc3 - - - - - -] libvirt.pmem_namespaces = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.530 231756 DEBUG oslo_service.service [None req-c9759d09-8117-4682-aa2d-922d8f1f5fc3 - - - - - -] libvirt.quobyte_client_cfg = None 
log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.530 231756 DEBUG oslo_service.service [None req-c9759d09-8117-4682-aa2d-922d8f1f5fc3 - - - - - -] libvirt.quobyte_mount_point_base = /var/lib/nova/mnt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.530 231756 DEBUG oslo_service.service [None req-c9759d09-8117-4682-aa2d-922d8f1f5fc3 - - - - - -] libvirt.rbd_connect_timeout = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.531 231756 DEBUG oslo_service.service [None req-c9759d09-8117-4682-aa2d-922d8f1f5fc3 - - - - - -] libvirt.rbd_destroy_volume_retries = 12 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.531 231756 DEBUG oslo_service.service [None req-c9759d09-8117-4682-aa2d-922d8f1f5fc3 - - - - - -] libvirt.rbd_destroy_volume_retry_interval = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.531 231756 DEBUG oslo_service.service [None req-c9759d09-8117-4682-aa2d-922d8f1f5fc3 - - - - - -] libvirt.rbd_secret_uuid = bce17446-41b5-5408-a23e-0b011906b44a log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.531 231756 DEBUG oslo_service.service [None req-c9759d09-8117-4682-aa2d-922d8f1f5fc3 - - - - - -] libvirt.rbd_user = openstack log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.531 231756 DEBUG oslo_service.service [None req-c9759d09-8117-4682-aa2d-922d8f1f5fc3 - - - - - -] 
libvirt.realtime_scheduler_priority = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.531 231756 DEBUG oslo_service.service [None req-c9759d09-8117-4682-aa2d-922d8f1f5fc3 - - - - - -] libvirt.remote_filesystem_transport = ssh log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.531 231756 DEBUG oslo_service.service [None req-c9759d09-8117-4682-aa2d-922d8f1f5fc3 - - - - - -] libvirt.rescue_image_id = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.531 231756 DEBUG oslo_service.service [None req-c9759d09-8117-4682-aa2d-922d8f1f5fc3 - - - - - -] libvirt.rescue_kernel_id = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.532 231756 DEBUG oslo_service.service [None req-c9759d09-8117-4682-aa2d-922d8f1f5fc3 - - - - - -] libvirt.rescue_ramdisk_id = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.532 231756 DEBUG oslo_service.service [None req-c9759d09-8117-4682-aa2d-922d8f1f5fc3 - - - - - -] libvirt.rng_dev_path = /dev/urandom log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.532 231756 DEBUG oslo_service.service [None req-c9759d09-8117-4682-aa2d-922d8f1f5fc3 - - - - - -] libvirt.rx_queue_size = 512 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.532 231756 DEBUG oslo_service.service [None req-c9759d09-8117-4682-aa2d-922d8f1f5fc3 - - - - - -] libvirt.smbfs_mount_options = 
log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.532 231756 DEBUG oslo_service.service [None req-c9759d09-8117-4682-aa2d-922d8f1f5fc3 - - - - - -] libvirt.smbfs_mount_point_base = /var/lib/nova/mnt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.532 231756 DEBUG oslo_service.service [None req-c9759d09-8117-4682-aa2d-922d8f1f5fc3 - - - - - -] libvirt.snapshot_compression = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.532 231756 DEBUG oslo_service.service [None req-c9759d09-8117-4682-aa2d-922d8f1f5fc3 - - - - - -] libvirt.snapshot_image_format = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.533 231756 DEBUG oslo_service.service [None req-c9759d09-8117-4682-aa2d-922d8f1f5fc3 - - - - - -] libvirt.snapshots_directory = /var/lib/nova/instances/snapshots log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.533 231756 DEBUG oslo_service.service [None req-c9759d09-8117-4682-aa2d-922d8f1f5fc3 - - - - - -] libvirt.sparse_logical_volumes = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.533 231756 DEBUG oslo_service.service [None req-c9759d09-8117-4682-aa2d-922d8f1f5fc3 - - - - - -] libvirt.swtpm_enabled = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.533 231756 DEBUG oslo_service.service [None req-c9759d09-8117-4682-aa2d-922d8f1f5fc3 - - - - - -] libvirt.swtpm_group 
= tss log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.533 231756 DEBUG oslo_service.service [None req-c9759d09-8117-4682-aa2d-922d8f1f5fc3 - - - - - -] libvirt.swtpm_user = tss log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.533 231756 DEBUG oslo_service.service [None req-c9759d09-8117-4682-aa2d-922d8f1f5fc3 - - - - - -] libvirt.sysinfo_serial = unique log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.533 231756 DEBUG oslo_service.service [None req-c9759d09-8117-4682-aa2d-922d8f1f5fc3 - - - - - -] libvirt.tx_queue_size = 512 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.534 231756 DEBUG oslo_service.service [None req-c9759d09-8117-4682-aa2d-922d8f1f5fc3 - - - - - -] libvirt.uid_maps = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.534 231756 DEBUG oslo_service.service [None req-c9759d09-8117-4682-aa2d-922d8f1f5fc3 - - - - - -] libvirt.use_virtio_for_bridges = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.534 231756 DEBUG oslo_service.service [None req-c9759d09-8117-4682-aa2d-922d8f1f5fc3 - - - - - -] libvirt.virt_type = kvm log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.534 231756 DEBUG oslo_service.service [None req-c9759d09-8117-4682-aa2d-922d8f1f5fc3 - - - - - -] libvirt.volume_clear = zero log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.534 231756 DEBUG oslo_service.service [None req-c9759d09-8117-4682-aa2d-922d8f1f5fc3 - - - - - -] libvirt.volume_clear_size = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.534 231756 DEBUG oslo_service.service [None req-c9759d09-8117-4682-aa2d-922d8f1f5fc3 - - - - - -] libvirt.volume_use_multipath = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.534 231756 DEBUG oslo_service.service [None req-c9759d09-8117-4682-aa2d-922d8f1f5fc3 - - - - - -] libvirt.vzstorage_cache_path = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.535 231756 DEBUG oslo_service.service [None req-c9759d09-8117-4682-aa2d-922d8f1f5fc3 - - - - - -] libvirt.vzstorage_log_path = /var/log/vstorage/%(cluster_name)s/nova.log.gz log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.535 231756 DEBUG oslo_service.service [None req-c9759d09-8117-4682-aa2d-922d8f1f5fc3 - - - - - -] libvirt.vzstorage_mount_group = qemu log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.535 231756 DEBUG oslo_service.service [None req-c9759d09-8117-4682-aa2d-922d8f1f5fc3 - - - - - -] libvirt.vzstorage_mount_opts = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.535 231756 DEBUG oslo_service.service [None req-c9759d09-8117-4682-aa2d-922d8f1f5fc3 - - - - - -] libvirt.vzstorage_mount_perms = 0770 
log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.535 231756 DEBUG oslo_service.service [None req-c9759d09-8117-4682-aa2d-922d8f1f5fc3 - - - - - -] libvirt.vzstorage_mount_point_base = /var/lib/nova/mnt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.535 231756 DEBUG oslo_service.service [None req-c9759d09-8117-4682-aa2d-922d8f1f5fc3 - - - - - -] libvirt.vzstorage_mount_user = stack log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.535 231756 DEBUG oslo_service.service [None req-c9759d09-8117-4682-aa2d-922d8f1f5fc3 - - - - - -] libvirt.wait_soft_reboot_seconds = 120 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.536 231756 DEBUG oslo_service.service [None req-c9759d09-8117-4682-aa2d-922d8f1f5fc3 - - - - - -] neutron.auth_section = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.536 231756 DEBUG oslo_service.service [None req-c9759d09-8117-4682-aa2d-922d8f1f5fc3 - - - - - -] neutron.auth_type = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.536 231756 DEBUG oslo_service.service [None req-c9759d09-8117-4682-aa2d-922d8f1f5fc3 - - - - - -] neutron.cafile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.536 231756 DEBUG oslo_service.service [None req-c9759d09-8117-4682-aa2d-922d8f1f5fc3 - - - - - -] neutron.certfile = None log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.536 231756 DEBUG oslo_service.service [None req-c9759d09-8117-4682-aa2d-922d8f1f5fc3 - - - - - -] neutron.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.536 231756 DEBUG oslo_service.service [None req-c9759d09-8117-4682-aa2d-922d8f1f5fc3 - - - - - -] neutron.connect_retries = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.536 231756 DEBUG oslo_service.service [None req-c9759d09-8117-4682-aa2d-922d8f1f5fc3 - - - - - -] neutron.connect_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.537 231756 DEBUG oslo_service.service [None req-c9759d09-8117-4682-aa2d-922d8f1f5fc3 - - - - - -] neutron.default_floating_pool = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.537 231756 DEBUG oslo_service.service [None req-c9759d09-8117-4682-aa2d-922d8f1f5fc3 - - - - - -] neutron.endpoint_override = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.537 231756 DEBUG oslo_service.service [None req-c9759d09-8117-4682-aa2d-922d8f1f5fc3 - - - - - -] neutron.extension_sync_interval = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.537 231756 DEBUG oslo_service.service [None req-c9759d09-8117-4682-aa2d-922d8f1f5fc3 - - - - - -] neutron.http_retries = 3 log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.537 231756 DEBUG oslo_service.service [None req-c9759d09-8117-4682-aa2d-922d8f1f5fc3 - - - - - -] neutron.insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.537 231756 DEBUG oslo_service.service [None req-c9759d09-8117-4682-aa2d-922d8f1f5fc3 - - - - - -] neutron.keyfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.537 231756 DEBUG oslo_service.service [None req-c9759d09-8117-4682-aa2d-922d8f1f5fc3 - - - - - -] neutron.max_version = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.538 231756 DEBUG oslo_service.service [None req-c9759d09-8117-4682-aa2d-922d8f1f5fc3 - - - - - -] neutron.metadata_proxy_shared_secret = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.538 231756 DEBUG oslo_service.service [None req-c9759d09-8117-4682-aa2d-922d8f1f5fc3 - - - - - -] neutron.min_version = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.538 231756 DEBUG oslo_service.service [None req-c9759d09-8117-4682-aa2d-922d8f1f5fc3 - - - - - -] neutron.ovs_bridge = br-int log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.538 231756 DEBUG oslo_service.service [None req-c9759d09-8117-4682-aa2d-922d8f1f5fc3 - - - - - -] neutron.physnets = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:41 
localhost nova_compute[231752]: 2025-12-15 09:32:41.538 231756 DEBUG oslo_service.service [None req-c9759d09-8117-4682-aa2d-922d8f1f5fc3 - - - - - -] neutron.region_name = regionOne log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.538 231756 DEBUG oslo_service.service [None req-c9759d09-8117-4682-aa2d-922d8f1f5fc3 - - - - - -] neutron.service_metadata_proxy = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.538 231756 DEBUG oslo_service.service [None req-c9759d09-8117-4682-aa2d-922d8f1f5fc3 - - - - - -] neutron.service_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.539 231756 DEBUG oslo_service.service [None req-c9759d09-8117-4682-aa2d-922d8f1f5fc3 - - - - - -] neutron.service_type = network log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.539 231756 DEBUG oslo_service.service [None req-c9759d09-8117-4682-aa2d-922d8f1f5fc3 - - - - - -] neutron.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.539 231756 DEBUG oslo_service.service [None req-c9759d09-8117-4682-aa2d-922d8f1f5fc3 - - - - - -] neutron.status_code_retries = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.539 231756 DEBUG oslo_service.service [None req-c9759d09-8117-4682-aa2d-922d8f1f5fc3 - - - - - -] neutron.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 
09:32:41.539 231756 DEBUG oslo_service.service [None req-c9759d09-8117-4682-aa2d-922d8f1f5fc3 - - - - - -] neutron.timeout = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.539 231756 DEBUG oslo_service.service [None req-c9759d09-8117-4682-aa2d-922d8f1f5fc3 - - - - - -] neutron.valid_interfaces = ['internal'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.539 231756 DEBUG oslo_service.service [None req-c9759d09-8117-4682-aa2d-922d8f1f5fc3 - - - - - -] neutron.version = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.540 231756 DEBUG oslo_service.service [None req-c9759d09-8117-4682-aa2d-922d8f1f5fc3 - - - - - -] notifications.bdms_in_notifications = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.540 231756 DEBUG oslo_service.service [None req-c9759d09-8117-4682-aa2d-922d8f1f5fc3 - - - - - -] notifications.default_level = INFO log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.540 231756 DEBUG oslo_service.service [None req-c9759d09-8117-4682-aa2d-922d8f1f5fc3 - - - - - -] notifications.notification_format = unversioned log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.540 231756 DEBUG oslo_service.service [None req-c9759d09-8117-4682-aa2d-922d8f1f5fc3 - - - - - -] notifications.notify_on_state_change = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.540 231756 
DEBUG oslo_service.service [None req-c9759d09-8117-4682-aa2d-922d8f1f5fc3 - - - - - -] notifications.versioned_notifications_topics = ['versioned_notifications'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.540 231756 DEBUG oslo_service.service [None req-c9759d09-8117-4682-aa2d-922d8f1f5fc3 - - - - - -] pci.alias = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.540 231756 DEBUG oslo_service.service [None req-c9759d09-8117-4682-aa2d-922d8f1f5fc3 - - - - - -] pci.device_spec = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.541 231756 DEBUG oslo_service.service [None req-c9759d09-8117-4682-aa2d-922d8f1f5fc3 - - - - - -] pci.report_in_placement = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.541 231756 DEBUG oslo_service.service [None req-c9759d09-8117-4682-aa2d-922d8f1f5fc3 - - - - - -] placement.auth_section = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.541 231756 DEBUG oslo_service.service [None req-c9759d09-8117-4682-aa2d-922d8f1f5fc3 - - - - - -] placement.auth_type = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.541 231756 DEBUG oslo_service.service [None req-c9759d09-8117-4682-aa2d-922d8f1f5fc3 - - - - - -] placement.auth_url = http://keystone-internal.openstack.svc:5000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.541 231756 DEBUG 
oslo_service.service [None req-c9759d09-8117-4682-aa2d-922d8f1f5fc3 - - - - - -] placement.cafile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.541 231756 DEBUG oslo_service.service [None req-c9759d09-8117-4682-aa2d-922d8f1f5fc3 - - - - - -] placement.certfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.541 231756 DEBUG oslo_service.service [None req-c9759d09-8117-4682-aa2d-922d8f1f5fc3 - - - - - -] placement.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.542 231756 DEBUG oslo_service.service [None req-c9759d09-8117-4682-aa2d-922d8f1f5fc3 - - - - - -] placement.connect_retries = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.542 231756 DEBUG oslo_service.service [None req-c9759d09-8117-4682-aa2d-922d8f1f5fc3 - - - - - -] placement.connect_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.542 231756 DEBUG oslo_service.service [None req-c9759d09-8117-4682-aa2d-922d8f1f5fc3 - - - - - -] placement.default_domain_id = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.542 231756 DEBUG oslo_service.service [None req-c9759d09-8117-4682-aa2d-922d8f1f5fc3 - - - - - -] placement.default_domain_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.542 231756 DEBUG oslo_service.service [None 
req-c9759d09-8117-4682-aa2d-922d8f1f5fc3 - - - - - -] placement.domain_id = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.542 231756 DEBUG oslo_service.service [None req-c9759d09-8117-4682-aa2d-922d8f1f5fc3 - - - - - -] placement.domain_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.542 231756 DEBUG oslo_service.service [None req-c9759d09-8117-4682-aa2d-922d8f1f5fc3 - - - - - -] placement.endpoint_override = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.542 231756 DEBUG oslo_service.service [None req-c9759d09-8117-4682-aa2d-922d8f1f5fc3 - - - - - -] placement.insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.543 231756 DEBUG oslo_service.service [None req-c9759d09-8117-4682-aa2d-922d8f1f5fc3 - - - - - -] placement.keyfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.543 231756 DEBUG oslo_service.service [None req-c9759d09-8117-4682-aa2d-922d8f1f5fc3 - - - - - -] placement.max_version = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.543 231756 DEBUG oslo_service.service [None req-c9759d09-8117-4682-aa2d-922d8f1f5fc3 - - - - - -] placement.min_version = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.543 231756 DEBUG oslo_service.service [None req-c9759d09-8117-4682-aa2d-922d8f1f5fc3 - - - - - -] placement.password = 
**** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.543 231756 DEBUG oslo_service.service [None req-c9759d09-8117-4682-aa2d-922d8f1f5fc3 - - - - - -] placement.project_domain_id = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.543 231756 DEBUG oslo_service.service [None req-c9759d09-8117-4682-aa2d-922d8f1f5fc3 - - - - - -] placement.project_domain_name = Default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.543 231756 DEBUG oslo_service.service [None req-c9759d09-8117-4682-aa2d-922d8f1f5fc3 - - - - - -] placement.project_id = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.544 231756 DEBUG oslo_service.service [None req-c9759d09-8117-4682-aa2d-922d8f1f5fc3 - - - - - -] placement.project_name = service log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.544 231756 DEBUG oslo_service.service [None req-c9759d09-8117-4682-aa2d-922d8f1f5fc3 - - - - - -] placement.region_name = regionOne log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.544 231756 DEBUG oslo_service.service [None req-c9759d09-8117-4682-aa2d-922d8f1f5fc3 - - - - - -] placement.service_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.544 231756 DEBUG oslo_service.service [None req-c9759d09-8117-4682-aa2d-922d8f1f5fc3 - - - - - -] placement.service_type = placement log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.544 231756 DEBUG oslo_service.service [None req-c9759d09-8117-4682-aa2d-922d8f1f5fc3 - - - - - -] placement.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.544 231756 DEBUG oslo_service.service [None req-c9759d09-8117-4682-aa2d-922d8f1f5fc3 - - - - - -] placement.status_code_retries = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.544 231756 DEBUG oslo_service.service [None req-c9759d09-8117-4682-aa2d-922d8f1f5fc3 - - - - - -] placement.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.544 231756 DEBUG oslo_service.service [None req-c9759d09-8117-4682-aa2d-922d8f1f5fc3 - - - - - -] placement.system_scope = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.545 231756 DEBUG oslo_service.service [None req-c9759d09-8117-4682-aa2d-922d8f1f5fc3 - - - - - -] placement.timeout = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.545 231756 DEBUG oslo_service.service [None req-c9759d09-8117-4682-aa2d-922d8f1f5fc3 - - - - - -] placement.trust_id = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.545 231756 DEBUG oslo_service.service [None req-c9759d09-8117-4682-aa2d-922d8f1f5fc3 - - - - - -] placement.user_domain_id = None log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.545 231756 DEBUG oslo_service.service [None req-c9759d09-8117-4682-aa2d-922d8f1f5fc3 - - - - - -] placement.user_domain_name = Default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.545 231756 DEBUG oslo_service.service [None req-c9759d09-8117-4682-aa2d-922d8f1f5fc3 - - - - - -] placement.user_id = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.545 231756 DEBUG oslo_service.service [None req-c9759d09-8117-4682-aa2d-922d8f1f5fc3 - - - - - -] placement.username = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.545 231756 DEBUG oslo_service.service [None req-c9759d09-8117-4682-aa2d-922d8f1f5fc3 - - - - - -] placement.valid_interfaces = ['internal'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.546 231756 DEBUG oslo_service.service [None req-c9759d09-8117-4682-aa2d-922d8f1f5fc3 - - - - - -] placement.version = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.546 231756 DEBUG oslo_service.service [None req-c9759d09-8117-4682-aa2d-922d8f1f5fc3 - - - - - -] quota.cores = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.546 231756 DEBUG oslo_service.service [None req-c9759d09-8117-4682-aa2d-922d8f1f5fc3 - - - - - -] quota.count_usage_from_placement = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m 
Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.546 231756 DEBUG oslo_service.service [None req-c9759d09-8117-4682-aa2d-922d8f1f5fc3 - - - - - -] quota.driver = nova.quota.DbQuotaDriver log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.546 231756 DEBUG oslo_service.service [None req-c9759d09-8117-4682-aa2d-922d8f1f5fc3 - - - - - -] quota.injected_file_content_bytes = 10240 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.546 231756 DEBUG oslo_service.service [None req-c9759d09-8117-4682-aa2d-922d8f1f5fc3 - - - - - -] quota.injected_file_path_length = 255 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.546 231756 DEBUG oslo_service.service [None req-c9759d09-8117-4682-aa2d-922d8f1f5fc3 - - - - - -] quota.injected_files = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.546 231756 DEBUG oslo_service.service [None req-c9759d09-8117-4682-aa2d-922d8f1f5fc3 - - - - - -] quota.instances = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.547 231756 DEBUG oslo_service.service [None req-c9759d09-8117-4682-aa2d-922d8f1f5fc3 - - - - - -] quota.key_pairs = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.547 231756 DEBUG oslo_service.service [None req-c9759d09-8117-4682-aa2d-922d8f1f5fc3 - - - - - -] quota.metadata_items = 128 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 
09:32:41.547 231756 DEBUG oslo_service.service [None req-c9759d09-8117-4682-aa2d-922d8f1f5fc3 - - - - - -] quota.ram = 51200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.547 231756 DEBUG oslo_service.service [None req-c9759d09-8117-4682-aa2d-922d8f1f5fc3 - - - - - -] quota.recheck_quota = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.547 231756 DEBUG oslo_service.service [None req-c9759d09-8117-4682-aa2d-922d8f1f5fc3 - - - - - -] quota.server_group_members = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.547 231756 DEBUG oslo_service.service [None req-c9759d09-8117-4682-aa2d-922d8f1f5fc3 - - - - - -] quota.server_groups = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.547 231756 DEBUG oslo_service.service [None req-c9759d09-8117-4682-aa2d-922d8f1f5fc3 - - - - - -] rdp.enabled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.548 231756 DEBUG oslo_service.service [None req-c9759d09-8117-4682-aa2d-922d8f1f5fc3 - - - - - -] rdp.html5_proxy_base_url = http://127.0.0.1:6083/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.548 231756 DEBUG oslo_service.service [None req-c9759d09-8117-4682-aa2d-922d8f1f5fc3 - - - - - -] scheduler.discover_hosts_in_cells_interval = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.548 231756 DEBUG oslo_service.service [None 
req-c9759d09-8117-4682-aa2d-922d8f1f5fc3 - - - - - -] scheduler.enable_isolated_aggregate_filtering = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.548 231756 DEBUG oslo_service.service [None req-c9759d09-8117-4682-aa2d-922d8f1f5fc3 - - - - - -] scheduler.image_metadata_prefilter = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.548 231756 DEBUG oslo_service.service [None req-c9759d09-8117-4682-aa2d-922d8f1f5fc3 - - - - - -] scheduler.limit_tenants_to_placement_aggregate = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.548 231756 DEBUG oslo_service.service [None req-c9759d09-8117-4682-aa2d-922d8f1f5fc3 - - - - - -] scheduler.max_attempts = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.548 231756 DEBUG oslo_service.service [None req-c9759d09-8117-4682-aa2d-922d8f1f5fc3 - - - - - -] scheduler.max_placement_results = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.549 231756 DEBUG oslo_service.service [None req-c9759d09-8117-4682-aa2d-922d8f1f5fc3 - - - - - -] scheduler.placement_aggregate_required_for_tenants = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.549 231756 DEBUG oslo_service.service [None req-c9759d09-8117-4682-aa2d-922d8f1f5fc3 - - - - - -] scheduler.query_placement_for_availability_zone = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:41 localhost nova_compute[231752]: 
2025-12-15 09:32:41.549 231756 DEBUG oslo_service.service [None req-c9759d09-8117-4682-aa2d-922d8f1f5fc3 - - - - - -] scheduler.query_placement_for_image_type_support = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.549 231756 DEBUG oslo_service.service [None req-c9759d09-8117-4682-aa2d-922d8f1f5fc3 - - - - - -] scheduler.query_placement_for_routed_network_aggregates = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.549 231756 DEBUG oslo_service.service [None req-c9759d09-8117-4682-aa2d-922d8f1f5fc3 - - - - - -] scheduler.workers = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.549 231756 DEBUG oslo_service.service [None req-c9759d09-8117-4682-aa2d-922d8f1f5fc3 - - - - - -] filter_scheduler.aggregate_image_properties_isolation_namespace = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.550 231756 DEBUG oslo_service.service [None req-c9759d09-8117-4682-aa2d-922d8f1f5fc3 - - - - - -] filter_scheduler.aggregate_image_properties_isolation_separator = . 
log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.550 231756 DEBUG oslo_service.service [None req-c9759d09-8117-4682-aa2d-922d8f1f5fc3 - - - - - -] filter_scheduler.available_filters = ['nova.scheduler.filters.all_filters'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.550 231756 DEBUG oslo_service.service [None req-c9759d09-8117-4682-aa2d-922d8f1f5fc3 - - - - - -] filter_scheduler.build_failure_weight_multiplier = 1000000.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.550 231756 DEBUG oslo_service.service [None req-c9759d09-8117-4682-aa2d-922d8f1f5fc3 - - - - - -] filter_scheduler.cpu_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.550 231756 DEBUG oslo_service.service [None req-c9759d09-8117-4682-aa2d-922d8f1f5fc3 - - - - - -] filter_scheduler.cross_cell_move_weight_multiplier = 1000000.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.550 231756 DEBUG oslo_service.service [None req-c9759d09-8117-4682-aa2d-922d8f1f5fc3 - - - - - -] filter_scheduler.disk_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.550 231756 DEBUG oslo_service.service [None req-c9759d09-8117-4682-aa2d-922d8f1f5fc3 - - - - - -] filter_scheduler.enabled_filters = ['ComputeFilter', 'ComputeCapabilitiesFilter', 'ImagePropertiesFilter', 'ServerGroupAntiAffinityFilter', 'ServerGroupAffinityFilter'] log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.551 231756 DEBUG oslo_service.service [None req-c9759d09-8117-4682-aa2d-922d8f1f5fc3 - - - - - -] filter_scheduler.host_subset_size = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.551 231756 DEBUG oslo_service.service [None req-c9759d09-8117-4682-aa2d-922d8f1f5fc3 - - - - - -] filter_scheduler.image_properties_default_architecture = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.551 231756 DEBUG oslo_service.service [None req-c9759d09-8117-4682-aa2d-922d8f1f5fc3 - - - - - -] filter_scheduler.io_ops_weight_multiplier = -1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.551 231756 DEBUG oslo_service.service [None req-c9759d09-8117-4682-aa2d-922d8f1f5fc3 - - - - - -] filter_scheduler.isolated_hosts = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.551 231756 DEBUG oslo_service.service [None req-c9759d09-8117-4682-aa2d-922d8f1f5fc3 - - - - - -] filter_scheduler.isolated_images = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.551 231756 DEBUG oslo_service.service [None req-c9759d09-8117-4682-aa2d-922d8f1f5fc3 - - - - - -] filter_scheduler.max_instances_per_host = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.551 231756 DEBUG oslo_service.service [None req-c9759d09-8117-4682-aa2d-922d8f1f5fc3 - - - - - -] 
filter_scheduler.max_io_ops_per_host = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.552 231756 DEBUG oslo_service.service [None req-c9759d09-8117-4682-aa2d-922d8f1f5fc3 - - - - - -] filter_scheduler.pci_in_placement = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.552 231756 DEBUG oslo_service.service [None req-c9759d09-8117-4682-aa2d-922d8f1f5fc3 - - - - - -] filter_scheduler.pci_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.552 231756 DEBUG oslo_service.service [None req-c9759d09-8117-4682-aa2d-922d8f1f5fc3 - - - - - -] filter_scheduler.ram_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.552 231756 DEBUG oslo_service.service [None req-c9759d09-8117-4682-aa2d-922d8f1f5fc3 - - - - - -] filter_scheduler.restrict_isolated_hosts_to_isolated_images = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.552 231756 DEBUG oslo_service.service [None req-c9759d09-8117-4682-aa2d-922d8f1f5fc3 - - - - - -] filter_scheduler.shuffle_best_same_weighed_hosts = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.552 231756 DEBUG oslo_service.service [None req-c9759d09-8117-4682-aa2d-922d8f1f5fc3 - - - - - -] filter_scheduler.soft_affinity_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.552 231756 DEBUG 
oslo_service.service [None req-c9759d09-8117-4682-aa2d-922d8f1f5fc3 - - - - - -] filter_scheduler.soft_anti_affinity_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.553 231756 DEBUG oslo_service.service [None req-c9759d09-8117-4682-aa2d-922d8f1f5fc3 - - - - - -] filter_scheduler.track_instance_changes = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.553 231756 DEBUG oslo_service.service [None req-c9759d09-8117-4682-aa2d-922d8f1f5fc3 - - - - - -] filter_scheduler.weight_classes = ['nova.scheduler.weights.all_weighers'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.553 231756 DEBUG oslo_service.service [None req-c9759d09-8117-4682-aa2d-922d8f1f5fc3 - - - - - -] metrics.required = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.553 231756 DEBUG oslo_service.service [None req-c9759d09-8117-4682-aa2d-922d8f1f5fc3 - - - - - -] metrics.weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.553 231756 DEBUG oslo_service.service [None req-c9759d09-8117-4682-aa2d-922d8f1f5fc3 - - - - - -] metrics.weight_of_unavailable = -10000.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.553 231756 DEBUG oslo_service.service [None req-c9759d09-8117-4682-aa2d-922d8f1f5fc3 - - - - - -] metrics.weight_setting = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:41 localhost nova_compute[231752]: 
2025-12-15 09:32:41.553 231756 DEBUG oslo_service.service [None req-c9759d09-8117-4682-aa2d-922d8f1f5fc3 - - - - - -] serial_console.base_url = ws://127.0.0.1:6083/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.554 231756 DEBUG oslo_service.service [None req-c9759d09-8117-4682-aa2d-922d8f1f5fc3 - - - - - -] serial_console.enabled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.554 231756 DEBUG oslo_service.service [None req-c9759d09-8117-4682-aa2d-922d8f1f5fc3 - - - - - -] serial_console.port_range = 10000:20000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.554 231756 DEBUG oslo_service.service [None req-c9759d09-8117-4682-aa2d-922d8f1f5fc3 - - - - - -] serial_console.proxyclient_address = 127.0.0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.554 231756 DEBUG oslo_service.service [None req-c9759d09-8117-4682-aa2d-922d8f1f5fc3 - - - - - -] serial_console.serialproxy_host = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.554 231756 DEBUG oslo_service.service [None req-c9759d09-8117-4682-aa2d-922d8f1f5fc3 - - - - - -] serial_console.serialproxy_port = 6083 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.554 231756 DEBUG oslo_service.service [None req-c9759d09-8117-4682-aa2d-922d8f1f5fc3 - - - - - -] service_user.auth_section = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:41 localhost 
nova_compute[231752]: 2025-12-15 09:32:41.554 231756 DEBUG oslo_service.service [None req-c9759d09-8117-4682-aa2d-922d8f1f5fc3 - - - - - -] service_user.auth_type = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.555 231756 DEBUG oslo_service.service [None req-c9759d09-8117-4682-aa2d-922d8f1f5fc3 - - - - - -] service_user.cafile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.555 231756 DEBUG oslo_service.service [None req-c9759d09-8117-4682-aa2d-922d8f1f5fc3 - - - - - -] service_user.certfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.555 231756 DEBUG oslo_service.service [None req-c9759d09-8117-4682-aa2d-922d8f1f5fc3 - - - - - -] service_user.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.555 231756 DEBUG oslo_service.service [None req-c9759d09-8117-4682-aa2d-922d8f1f5fc3 - - - - - -] service_user.insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.555 231756 DEBUG oslo_service.service [None req-c9759d09-8117-4682-aa2d-922d8f1f5fc3 - - - - - -] service_user.keyfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.555 231756 DEBUG oslo_service.service [None req-c9759d09-8117-4682-aa2d-922d8f1f5fc3 - - - - - -] service_user.send_service_user_token = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.555 
231756 DEBUG oslo_service.service [None req-c9759d09-8117-4682-aa2d-922d8f1f5fc3 - - - - - -] service_user.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.556 231756 DEBUG oslo_service.service [None req-c9759d09-8117-4682-aa2d-922d8f1f5fc3 - - - - - -] service_user.timeout = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.556 231756 DEBUG oslo_service.service [None req-c9759d09-8117-4682-aa2d-922d8f1f5fc3 - - - - - -] spice.agent_enabled = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.556 231756 DEBUG oslo_service.service [None req-c9759d09-8117-4682-aa2d-922d8f1f5fc3 - - - - - -] spice.enabled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.556 231756 DEBUG oslo_service.service [None req-c9759d09-8117-4682-aa2d-922d8f1f5fc3 - - - - - -] spice.html5proxy_base_url = http://127.0.0.1:6082/spice_auto.html log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.556 231756 DEBUG oslo_service.service [None req-c9759d09-8117-4682-aa2d-922d8f1f5fc3 - - - - - -] spice.html5proxy_host = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.556 231756 DEBUG oslo_service.service [None req-c9759d09-8117-4682-aa2d-922d8f1f5fc3 - - - - - -] spice.html5proxy_port = 6082 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.557 231756 DEBUG oslo_service.service 
[None req-c9759d09-8117-4682-aa2d-922d8f1f5fc3 - - - - - -] spice.image_compression = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.557 231756 DEBUG oslo_service.service [None req-c9759d09-8117-4682-aa2d-922d8f1f5fc3 - - - - - -] spice.jpeg_compression = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.557 231756 DEBUG oslo_service.service [None req-c9759d09-8117-4682-aa2d-922d8f1f5fc3 - - - - - -] spice.playback_compression = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.557 231756 DEBUG oslo_service.service [None req-c9759d09-8117-4682-aa2d-922d8f1f5fc3 - - - - - -] spice.server_listen = 127.0.0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.557 231756 DEBUG oslo_service.service [None req-c9759d09-8117-4682-aa2d-922d8f1f5fc3 - - - - - -] spice.server_proxyclient_address = 127.0.0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.557 231756 DEBUG oslo_service.service [None req-c9759d09-8117-4682-aa2d-922d8f1f5fc3 - - - - - -] spice.streaming_mode = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.557 231756 DEBUG oslo_service.service [None req-c9759d09-8117-4682-aa2d-922d8f1f5fc3 - - - - - -] spice.zlib_compression = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.557 231756 DEBUG oslo_service.service [None req-c9759d09-8117-4682-aa2d-922d8f1f5fc3 
- - - - - -] upgrade_levels.baseapi = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.558 231756 DEBUG oslo_service.service [None req-c9759d09-8117-4682-aa2d-922d8f1f5fc3 - - - - - -] upgrade_levels.cert = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.558 231756 DEBUG oslo_service.service [None req-c9759d09-8117-4682-aa2d-922d8f1f5fc3 - - - - - -] upgrade_levels.compute = auto log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.558 231756 DEBUG oslo_service.service [None req-c9759d09-8117-4682-aa2d-922d8f1f5fc3 - - - - - -] upgrade_levels.conductor = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.558 231756 DEBUG oslo_service.service [None req-c9759d09-8117-4682-aa2d-922d8f1f5fc3 - - - - - -] upgrade_levels.scheduler = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.558 231756 DEBUG oslo_service.service [None req-c9759d09-8117-4682-aa2d-922d8f1f5fc3 - - - - - -] vendordata_dynamic_auth.auth_section = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.558 231756 DEBUG oslo_service.service [None req-c9759d09-8117-4682-aa2d-922d8f1f5fc3 - - - - - -] vendordata_dynamic_auth.auth_type = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.558 231756 DEBUG oslo_service.service [None req-c9759d09-8117-4682-aa2d-922d8f1f5fc3 - - - - - -] 
vendordata_dynamic_auth.cafile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.559 231756 DEBUG oslo_service.service [None req-c9759d09-8117-4682-aa2d-922d8f1f5fc3 - - - - - -] vendordata_dynamic_auth.certfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.559 231756 DEBUG oslo_service.service [None req-c9759d09-8117-4682-aa2d-922d8f1f5fc3 - - - - - -] vendordata_dynamic_auth.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.559 231756 DEBUG oslo_service.service [None req-c9759d09-8117-4682-aa2d-922d8f1f5fc3 - - - - - -] vendordata_dynamic_auth.insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.559 231756 DEBUG oslo_service.service [None req-c9759d09-8117-4682-aa2d-922d8f1f5fc3 - - - - - -] vendordata_dynamic_auth.keyfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.559 231756 DEBUG oslo_service.service [None req-c9759d09-8117-4682-aa2d-922d8f1f5fc3 - - - - - -] vendordata_dynamic_auth.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.559 231756 DEBUG oslo_service.service [None req-c9759d09-8117-4682-aa2d-922d8f1f5fc3 - - - - - -] vendordata_dynamic_auth.timeout = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.560 231756 DEBUG oslo_service.service [None req-c9759d09-8117-4682-aa2d-922d8f1f5fc3 
- - - - - -] vmware.api_retry_count = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.560 231756 DEBUG oslo_service.service [None req-c9759d09-8117-4682-aa2d-922d8f1f5fc3 - - - - - -] vmware.ca_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.560 231756 DEBUG oslo_service.service [None req-c9759d09-8117-4682-aa2d-922d8f1f5fc3 - - - - - -] vmware.cache_prefix = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.560 231756 DEBUG oslo_service.service [None req-c9759d09-8117-4682-aa2d-922d8f1f5fc3 - - - - - -] vmware.cluster_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.560 231756 DEBUG oslo_service.service [None req-c9759d09-8117-4682-aa2d-922d8f1f5fc3 - - - - - -] vmware.connection_pool_size = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.560 231756 DEBUG oslo_service.service [None req-c9759d09-8117-4682-aa2d-922d8f1f5fc3 - - - - - -] vmware.console_delay_seconds = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.560 231756 DEBUG oslo_service.service [None req-c9759d09-8117-4682-aa2d-922d8f1f5fc3 - - - - - -] vmware.datastore_regex = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.560 231756 DEBUG oslo_service.service [None req-c9759d09-8117-4682-aa2d-922d8f1f5fc3 - - - - - -] vmware.host_ip = None log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.561 231756 DEBUG oslo_service.service [None req-c9759d09-8117-4682-aa2d-922d8f1f5fc3 - - - - - -] vmware.host_password = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.561 231756 DEBUG oslo_service.service [None req-c9759d09-8117-4682-aa2d-922d8f1f5fc3 - - - - - -] vmware.host_port = 443 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.561 231756 DEBUG oslo_service.service [None req-c9759d09-8117-4682-aa2d-922d8f1f5fc3 - - - - - -] vmware.host_username = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.561 231756 DEBUG oslo_service.service [None req-c9759d09-8117-4682-aa2d-922d8f1f5fc3 - - - - - -] vmware.insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.561 231756 DEBUG oslo_service.service [None req-c9759d09-8117-4682-aa2d-922d8f1f5fc3 - - - - - -] vmware.integration_bridge = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.561 231756 DEBUG oslo_service.service [None req-c9759d09-8117-4682-aa2d-922d8f1f5fc3 - - - - - -] vmware.maximum_objects = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.561 231756 DEBUG oslo_service.service [None req-c9759d09-8117-4682-aa2d-922d8f1f5fc3 - - - - - -] vmware.pbm_default_policy = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 
04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.562 231756 DEBUG oslo_service.service [None req-c9759d09-8117-4682-aa2d-922d8f1f5fc3 - - - - - -] vmware.pbm_enabled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.562 231756 DEBUG oslo_service.service [None req-c9759d09-8117-4682-aa2d-922d8f1f5fc3 - - - - - -] vmware.pbm_wsdl_location = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.562 231756 DEBUG oslo_service.service [None req-c9759d09-8117-4682-aa2d-922d8f1f5fc3 - - - - - -] vmware.serial_log_dir = /opt/vmware/vspc log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.562 231756 DEBUG oslo_service.service [None req-c9759d09-8117-4682-aa2d-922d8f1f5fc3 - - - - - -] vmware.serial_port_proxy_uri = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.562 231756 DEBUG oslo_service.service [None req-c9759d09-8117-4682-aa2d-922d8f1f5fc3 - - - - - -] vmware.serial_port_service_uri = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.562 231756 DEBUG oslo_service.service [None req-c9759d09-8117-4682-aa2d-922d8f1f5fc3 - - - - - -] vmware.task_poll_interval = 0.5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.562 231756 DEBUG oslo_service.service [None req-c9759d09-8117-4682-aa2d-922d8f1f5fc3 - - - - - -] vmware.use_linked_clone = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:41 localhost nova_compute[231752]: 
2025-12-15 09:32:41.562 231756 DEBUG oslo_service.service [None req-c9759d09-8117-4682-aa2d-922d8f1f5fc3 - - - - - -] vmware.vnc_keymap = en-us log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.563 231756 DEBUG oslo_service.service [None req-c9759d09-8117-4682-aa2d-922d8f1f5fc3 - - - - - -] vmware.vnc_port = 5900 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.563 231756 DEBUG oslo_service.service [None req-c9759d09-8117-4682-aa2d-922d8f1f5fc3 - - - - - -] vmware.vnc_port_total = 10000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.563 231756 DEBUG oslo_service.service [None req-c9759d09-8117-4682-aa2d-922d8f1f5fc3 - - - - - -] vnc.auth_schemes = ['none'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.563 231756 DEBUG oslo_service.service [None req-c9759d09-8117-4682-aa2d-922d8f1f5fc3 - - - - - -] vnc.enabled = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.563 231756 DEBUG oslo_service.service [None req-c9759d09-8117-4682-aa2d-922d8f1f5fc3 - - - - - -] vnc.novncproxy_base_url = http://nova-novncproxy-cell1-public-openstack.apps-crc.testing/vnc_lite.html log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.563 231756 DEBUG oslo_service.service [None req-c9759d09-8117-4682-aa2d-922d8f1f5fc3 - - - - - -] vnc.novncproxy_host = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 
09:32:41.564 231756 DEBUG oslo_service.service [None req-c9759d09-8117-4682-aa2d-922d8f1f5fc3 - - - - - -] vnc.novncproxy_port = 6080 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.564 231756 DEBUG oslo_service.service [None req-c9759d09-8117-4682-aa2d-922d8f1f5fc3 - - - - - -] vnc.server_listen = ::0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.564 231756 DEBUG oslo_service.service [None req-c9759d09-8117-4682-aa2d-922d8f1f5fc3 - - - - - -] vnc.server_proxyclient_address = 192.168.122.106 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.564 231756 DEBUG oslo_service.service [None req-c9759d09-8117-4682-aa2d-922d8f1f5fc3 - - - - - -] vnc.vencrypt_ca_certs = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.564 231756 DEBUG oslo_service.service [None req-c9759d09-8117-4682-aa2d-922d8f1f5fc3 - - - - - -] vnc.vencrypt_client_cert = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.564 231756 DEBUG oslo_service.service [None req-c9759d09-8117-4682-aa2d-922d8f1f5fc3 - - - - - -] vnc.vencrypt_client_key = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.564 231756 DEBUG oslo_service.service [None req-c9759d09-8117-4682-aa2d-922d8f1f5fc3 - - - - - -] workarounds.disable_compute_service_check_for_ffu = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.565 231756 DEBUG 
oslo_service.service [None req-c9759d09-8117-4682-aa2d-922d8f1f5fc3 - - - - - -] workarounds.disable_deep_image_inspection = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.565 231756 DEBUG oslo_service.service [None req-c9759d09-8117-4682-aa2d-922d8f1f5fc3 - - - - - -] workarounds.disable_fallback_pcpu_query = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.565 231756 DEBUG oslo_service.service [None req-c9759d09-8117-4682-aa2d-922d8f1f5fc3 - - - - - -] workarounds.disable_group_policy_check_upcall = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.565 231756 DEBUG oslo_service.service [None req-c9759d09-8117-4682-aa2d-922d8f1f5fc3 - - - - - -] workarounds.disable_libvirt_livesnapshot = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.565 231756 DEBUG oslo_service.service [None req-c9759d09-8117-4682-aa2d-922d8f1f5fc3 - - - - - -] workarounds.disable_rootwrap = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.565 231756 DEBUG oslo_service.service [None req-c9759d09-8117-4682-aa2d-922d8f1f5fc3 - - - - - -] workarounds.enable_numa_live_migration = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.565 231756 DEBUG oslo_service.service [None req-c9759d09-8117-4682-aa2d-922d8f1f5fc3 - - - - - -] workarounds.enable_qemu_monitor_announce_self = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:41 
localhost nova_compute[231752]: 2025-12-15 09:32:41.565 231756 DEBUG oslo_service.service [None req-c9759d09-8117-4682-aa2d-922d8f1f5fc3 - - - - - -] workarounds.ensure_libvirt_rbd_instance_dir_cleanup = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.566 231756 DEBUG oslo_service.service [None req-c9759d09-8117-4682-aa2d-922d8f1f5fc3 - - - - - -] workarounds.handle_virt_lifecycle_events = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.566 231756 DEBUG oslo_service.service [None req-c9759d09-8117-4682-aa2d-922d8f1f5fc3 - - - - - -] workarounds.libvirt_disable_apic = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.566 231756 DEBUG oslo_service.service [None req-c9759d09-8117-4682-aa2d-922d8f1f5fc3 - - - - - -] workarounds.never_download_image_if_on_rbd = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.566 231756 DEBUG oslo_service.service [None req-c9759d09-8117-4682-aa2d-922d8f1f5fc3 - - - - - -] workarounds.qemu_monitor_announce_self_count = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.566 231756 DEBUG oslo_service.service [None req-c9759d09-8117-4682-aa2d-922d8f1f5fc3 - - - - - -] workarounds.qemu_monitor_announce_self_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.566 231756 DEBUG oslo_service.service [None req-c9759d09-8117-4682-aa2d-922d8f1f5fc3 - - - - - -] workarounds.reserve_disk_resource_for_image_cache = True log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.566 231756 DEBUG oslo_service.service [None req-c9759d09-8117-4682-aa2d-922d8f1f5fc3 - - - - - -] workarounds.skip_cpu_compare_at_startup = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.567 231756 DEBUG oslo_service.service [None req-c9759d09-8117-4682-aa2d-922d8f1f5fc3 - - - - - -] workarounds.skip_cpu_compare_on_dest = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.567 231756 DEBUG oslo_service.service [None req-c9759d09-8117-4682-aa2d-922d8f1f5fc3 - - - - - -] workarounds.skip_hypervisor_version_check_on_lm = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.567 231756 DEBUG oslo_service.service [None req-c9759d09-8117-4682-aa2d-922d8f1f5fc3 - - - - - -] workarounds.skip_reserve_in_use_ironic_nodes = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.567 231756 DEBUG oslo_service.service [None req-c9759d09-8117-4682-aa2d-922d8f1f5fc3 - - - - - -] workarounds.unified_limits_count_pcpu_as_vcpu = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.567 231756 DEBUG oslo_service.service [None req-c9759d09-8117-4682-aa2d-922d8f1f5fc3 - - - - - -] workarounds.wait_for_vif_plugged_event_during_hard_reboot = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.567 231756 DEBUG oslo_service.service [None 
req-c9759d09-8117-4682-aa2d-922d8f1f5fc3 - - - - - -] wsgi.api_paste_config = api-paste.ini log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.567 231756 DEBUG oslo_service.service [None req-c9759d09-8117-4682-aa2d-922d8f1f5fc3 - - - - - -] wsgi.client_socket_timeout = 900 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.568 231756 DEBUG oslo_service.service [None req-c9759d09-8117-4682-aa2d-922d8f1f5fc3 - - - - - -] wsgi.default_pool_size = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.568 231756 DEBUG oslo_service.service [None req-c9759d09-8117-4682-aa2d-922d8f1f5fc3 - - - - - -] wsgi.keep_alive = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.568 231756 DEBUG oslo_service.service [None req-c9759d09-8117-4682-aa2d-922d8f1f5fc3 - - - - - -] wsgi.max_header_line = 16384 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.568 231756 DEBUG oslo_service.service [None req-c9759d09-8117-4682-aa2d-922d8f1f5fc3 - - - - - -] wsgi.secure_proxy_ssl_header = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.568 231756 DEBUG oslo_service.service [None req-c9759d09-8117-4682-aa2d-922d8f1f5fc3 - - - - - -] wsgi.ssl_ca_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.568 231756 DEBUG oslo_service.service [None req-c9759d09-8117-4682-aa2d-922d8f1f5fc3 - - - - - -] 
wsgi.ssl_cert_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.568 231756 DEBUG oslo_service.service [None req-c9759d09-8117-4682-aa2d-922d8f1f5fc3 - - - - - -] wsgi.ssl_key_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.568 231756 DEBUG oslo_service.service [None req-c9759d09-8117-4682-aa2d-922d8f1f5fc3 - - - - - -] wsgi.tcp_keepidle = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.569 231756 DEBUG oslo_service.service [None req-c9759d09-8117-4682-aa2d-922d8f1f5fc3 - - - - - -] wsgi.wsgi_log_format = %(client_ip)s "%(request_line)s" status: %(status_code)s len: %(body_length)s time: %(wall_seconds).7f log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.569 231756 DEBUG oslo_service.service [None req-c9759d09-8117-4682-aa2d-922d8f1f5fc3 - - - - - -] zvm.ca_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.569 231756 DEBUG oslo_service.service [None req-c9759d09-8117-4682-aa2d-922d8f1f5fc3 - - - - - -] zvm.cloud_connector_url = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.569 231756 DEBUG oslo_service.service [None req-c9759d09-8117-4682-aa2d-922d8f1f5fc3 - - - - - -] zvm.image_tmp_path = /var/lib/nova/images log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.569 231756 DEBUG oslo_service.service [None 
req-c9759d09-8117-4682-aa2d-922d8f1f5fc3 - - - - - -] zvm.reachable_timeout = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.569 231756 DEBUG oslo_service.service [None req-c9759d09-8117-4682-aa2d-922d8f1f5fc3 - - - - - -] oslo_policy.enforce_new_defaults = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.570 231756 DEBUG oslo_service.service [None req-c9759d09-8117-4682-aa2d-922d8f1f5fc3 - - - - - -] oslo_policy.enforce_scope = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.570 231756 DEBUG oslo_service.service [None req-c9759d09-8117-4682-aa2d-922d8f1f5fc3 - - - - - -] oslo_policy.policy_default_rule = default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.570 231756 DEBUG oslo_service.service [None req-c9759d09-8117-4682-aa2d-922d8f1f5fc3 - - - - - -] oslo_policy.policy_dirs = ['policy.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.570 231756 DEBUG oslo_service.service [None req-c9759d09-8117-4682-aa2d-922d8f1f5fc3 - - - - - -] oslo_policy.policy_file = policy.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.570 231756 DEBUG oslo_service.service [None req-c9759d09-8117-4682-aa2d-922d8f1f5fc3 - - - - - -] oslo_policy.remote_content_type = application/x-www-form-urlencoded log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.570 231756 DEBUG 
oslo_service.service [None req-c9759d09-8117-4682-aa2d-922d8f1f5fc3 - - - - - -] oslo_policy.remote_ssl_ca_crt_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.570 231756 DEBUG oslo_service.service [None req-c9759d09-8117-4682-aa2d-922d8f1f5fc3 - - - - - -] oslo_policy.remote_ssl_client_crt_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.571 231756 DEBUG oslo_service.service [None req-c9759d09-8117-4682-aa2d-922d8f1f5fc3 - - - - - -] oslo_policy.remote_ssl_client_key_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.571 231756 DEBUG oslo_service.service [None req-c9759d09-8117-4682-aa2d-922d8f1f5fc3 - - - - - -] oslo_policy.remote_ssl_verify_server_crt = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.571 231756 DEBUG oslo_service.service [None req-c9759d09-8117-4682-aa2d-922d8f1f5fc3 - - - - - -] oslo_versionedobjects.fatal_exception_format_errors = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.571 231756 DEBUG oslo_service.service [None req-c9759d09-8117-4682-aa2d-922d8f1f5fc3 - - - - - -] oslo_middleware.http_basic_auth_user_file = /etc/htpasswd log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.571 231756 DEBUG oslo_service.service [None req-c9759d09-8117-4682-aa2d-922d8f1f5fc3 - - - - - -] remote_debug.host = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:41 localhost 
nova_compute[231752]: 2025-12-15 09:32:41.571 231756 DEBUG oslo_service.service [None req-c9759d09-8117-4682-aa2d-922d8f1f5fc3 - - - - - -] remote_debug.port = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.571 231756 DEBUG oslo_service.service [None req-c9759d09-8117-4682-aa2d-922d8f1f5fc3 - - - - - -] oslo_messaging_rabbit.amqp_auto_delete = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.572 231756 DEBUG oslo_service.service [None req-c9759d09-8117-4682-aa2d-922d8f1f5fc3 - - - - - -] oslo_messaging_rabbit.amqp_durable_queues = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.572 231756 DEBUG oslo_service.service [None req-c9759d09-8117-4682-aa2d-922d8f1f5fc3 - - - - - -] oslo_messaging_rabbit.conn_pool_min_size = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.572 231756 DEBUG oslo_service.service [None req-c9759d09-8117-4682-aa2d-922d8f1f5fc3 - - - - - -] oslo_messaging_rabbit.conn_pool_ttl = 1200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.572 231756 DEBUG oslo_service.service [None req-c9759d09-8117-4682-aa2d-922d8f1f5fc3 - - - - - -] oslo_messaging_rabbit.direct_mandatory_flag = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.572 231756 DEBUG oslo_service.service [None req-c9759d09-8117-4682-aa2d-922d8f1f5fc3 - - - - - -] oslo_messaging_rabbit.enable_cancel_on_failover = False log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.572 231756 DEBUG oslo_service.service [None req-c9759d09-8117-4682-aa2d-922d8f1f5fc3 - - - - - -] oslo_messaging_rabbit.heartbeat_in_pthread = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.572 231756 DEBUG oslo_service.service [None req-c9759d09-8117-4682-aa2d-922d8f1f5fc3 - - - - - -] oslo_messaging_rabbit.heartbeat_rate = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.572 231756 DEBUG oslo_service.service [None req-c9759d09-8117-4682-aa2d-922d8f1f5fc3 - - - - - -] oslo_messaging_rabbit.heartbeat_timeout_threshold = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.573 231756 DEBUG oslo_service.service [None req-c9759d09-8117-4682-aa2d-922d8f1f5fc3 - - - - - -] oslo_messaging_rabbit.kombu_compression = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.573 231756 DEBUG oslo_service.service [None req-c9759d09-8117-4682-aa2d-922d8f1f5fc3 - - - - - -] oslo_messaging_rabbit.kombu_failover_strategy = round-robin log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.573 231756 DEBUG oslo_service.service [None req-c9759d09-8117-4682-aa2d-922d8f1f5fc3 - - - - - -] oslo_messaging_rabbit.kombu_missing_consumer_retry_timeout = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.573 231756 DEBUG oslo_service.service [None 
req-c9759d09-8117-4682-aa2d-922d8f1f5fc3 - - - - - -] oslo_messaging_rabbit.kombu_reconnect_delay = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.573 231756 DEBUG oslo_service.service [None req-c9759d09-8117-4682-aa2d-922d8f1f5fc3 - - - - - -] oslo_messaging_rabbit.rabbit_ha_queues = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.573 231756 DEBUG oslo_service.service [None req-c9759d09-8117-4682-aa2d-922d8f1f5fc3 - - - - - -] oslo_messaging_rabbit.rabbit_interval_max = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.573 231756 DEBUG oslo_service.service [None req-c9759d09-8117-4682-aa2d-922d8f1f5fc3 - - - - - -] oslo_messaging_rabbit.rabbit_login_method = AMQPLAIN log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.574 231756 DEBUG oslo_service.service [None req-c9759d09-8117-4682-aa2d-922d8f1f5fc3 - - - - - -] oslo_messaging_rabbit.rabbit_qos_prefetch_count = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.574 231756 DEBUG oslo_service.service [None req-c9759d09-8117-4682-aa2d-922d8f1f5fc3 - - - - - -] oslo_messaging_rabbit.rabbit_quorum_delivery_limit = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.574 231756 DEBUG oslo_service.service [None req-c9759d09-8117-4682-aa2d-922d8f1f5fc3 - - - - - -] oslo_messaging_rabbit.rabbit_quorum_max_memory_bytes = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:41 localhost 
nova_compute[231752]: 2025-12-15 09:32:41.574 231756 DEBUG oslo_service.service [None req-c9759d09-8117-4682-aa2d-922d8f1f5fc3 - - - - - -] oslo_messaging_rabbit.rabbit_quorum_max_memory_length = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.574 231756 DEBUG oslo_service.service [None req-c9759d09-8117-4682-aa2d-922d8f1f5fc3 - - - - - -] oslo_messaging_rabbit.rabbit_quorum_queue = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.574 231756 DEBUG oslo_service.service [None req-c9759d09-8117-4682-aa2d-922d8f1f5fc3 - - - - - -] oslo_messaging_rabbit.rabbit_retry_backoff = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.574 231756 DEBUG oslo_service.service [None req-c9759d09-8117-4682-aa2d-922d8f1f5fc3 - - - - - -] oslo_messaging_rabbit.rabbit_retry_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.574 231756 DEBUG oslo_service.service [None req-c9759d09-8117-4682-aa2d-922d8f1f5fc3 - - - - - -] oslo_messaging_rabbit.rabbit_transient_queues_ttl = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.575 231756 DEBUG oslo_service.service [None req-c9759d09-8117-4682-aa2d-922d8f1f5fc3 - - - - - -] oslo_messaging_rabbit.rpc_conn_pool_size = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.575 231756 DEBUG oslo_service.service [None req-c9759d09-8117-4682-aa2d-922d8f1f5fc3 - - - - - -] oslo_messaging_rabbit.ssl = False log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.575 231756 DEBUG oslo_service.service [None req-c9759d09-8117-4682-aa2d-922d8f1f5fc3 - - - - - -] oslo_messaging_rabbit.ssl_ca_file = log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.575 231756 DEBUG oslo_service.service [None req-c9759d09-8117-4682-aa2d-922d8f1f5fc3 - - - - - -] oslo_messaging_rabbit.ssl_cert_file = log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.575 231756 DEBUG oslo_service.service [None req-c9759d09-8117-4682-aa2d-922d8f1f5fc3 - - - - - -] oslo_messaging_rabbit.ssl_enforce_fips_mode = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.575 231756 DEBUG oslo_service.service [None req-c9759d09-8117-4682-aa2d-922d8f1f5fc3 - - - - - -] oslo_messaging_rabbit.ssl_key_file = log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.575 231756 DEBUG oslo_service.service [None req-c9759d09-8117-4682-aa2d-922d8f1f5fc3 - - - - - -] oslo_messaging_rabbit.ssl_version = log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.576 231756 DEBUG oslo_service.service [None req-c9759d09-8117-4682-aa2d-922d8f1f5fc3 - - - - - -] oslo_messaging_notifications.driver = ['noop'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.576 231756 DEBUG oslo_service.service [None req-c9759d09-8117-4682-aa2d-922d8f1f5fc3 - - - - - -] oslo_messaging_notifications.retry = -1 
log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.576 231756 DEBUG oslo_service.service [None req-c9759d09-8117-4682-aa2d-922d8f1f5fc3 - - - - - -] oslo_messaging_notifications.topics = ['notifications'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.576 231756 DEBUG oslo_service.service [None req-c9759d09-8117-4682-aa2d-922d8f1f5fc3 - - - - - -] oslo_messaging_notifications.transport_url = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.576 231756 DEBUG oslo_service.service [None req-c9759d09-8117-4682-aa2d-922d8f1f5fc3 - - - - - -] oslo_limit.auth_section = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.576 231756 DEBUG oslo_service.service [None req-c9759d09-8117-4682-aa2d-922d8f1f5fc3 - - - - - -] oslo_limit.auth_type = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.576 231756 DEBUG oslo_service.service [None req-c9759d09-8117-4682-aa2d-922d8f1f5fc3 - - - - - -] oslo_limit.auth_url = http://keystone-internal.openstack.svc:5000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.577 231756 DEBUG oslo_service.service [None req-c9759d09-8117-4682-aa2d-922d8f1f5fc3 - - - - - -] oslo_limit.cafile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.577 231756 DEBUG oslo_service.service [None req-c9759d09-8117-4682-aa2d-922d8f1f5fc3 - - - - - -] 
oslo_limit.certfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.577 231756 DEBUG oslo_service.service [None req-c9759d09-8117-4682-aa2d-922d8f1f5fc3 - - - - - -] oslo_limit.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.577 231756 DEBUG oslo_service.service [None req-c9759d09-8117-4682-aa2d-922d8f1f5fc3 - - - - - -] oslo_limit.connect_retries = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.577 231756 DEBUG oslo_service.service [None req-c9759d09-8117-4682-aa2d-922d8f1f5fc3 - - - - - -] oslo_limit.connect_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.577 231756 DEBUG oslo_service.service [None req-c9759d09-8117-4682-aa2d-922d8f1f5fc3 - - - - - -] oslo_limit.default_domain_id = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.577 231756 DEBUG oslo_service.service [None req-c9759d09-8117-4682-aa2d-922d8f1f5fc3 - - - - - -] oslo_limit.default_domain_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.578 231756 DEBUG oslo_service.service [None req-c9759d09-8117-4682-aa2d-922d8f1f5fc3 - - - - - -] oslo_limit.domain_id = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.578 231756 DEBUG oslo_service.service [None req-c9759d09-8117-4682-aa2d-922d8f1f5fc3 - - - - - -] oslo_limit.domain_name = None 
log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.578 231756 DEBUG oslo_service.service [None req-c9759d09-8117-4682-aa2d-922d8f1f5fc3 - - - - - -] oslo_limit.endpoint_id = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.578 231756 DEBUG oslo_service.service [None req-c9759d09-8117-4682-aa2d-922d8f1f5fc3 - - - - - -] oslo_limit.endpoint_override = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.578 231756 DEBUG oslo_service.service [None req-c9759d09-8117-4682-aa2d-922d8f1f5fc3 - - - - - -] oslo_limit.insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.578 231756 DEBUG oslo_service.service [None req-c9759d09-8117-4682-aa2d-922d8f1f5fc3 - - - - - -] oslo_limit.keyfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.578 231756 DEBUG oslo_service.service [None req-c9759d09-8117-4682-aa2d-922d8f1f5fc3 - - - - - -] oslo_limit.max_version = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.578 231756 DEBUG oslo_service.service [None req-c9759d09-8117-4682-aa2d-922d8f1f5fc3 - - - - - -] oslo_limit.min_version = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.579 231756 DEBUG oslo_service.service [None req-c9759d09-8117-4682-aa2d-922d8f1f5fc3 - - - - - -] oslo_limit.password = **** log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.579 231756 DEBUG oslo_service.service [None req-c9759d09-8117-4682-aa2d-922d8f1f5fc3 - - - - - -] oslo_limit.project_domain_id = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.579 231756 DEBUG oslo_service.service [None req-c9759d09-8117-4682-aa2d-922d8f1f5fc3 - - - - - -] oslo_limit.project_domain_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.579 231756 DEBUG oslo_service.service [None req-c9759d09-8117-4682-aa2d-922d8f1f5fc3 - - - - - -] oslo_limit.project_id = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.579 231756 DEBUG oslo_service.service [None req-c9759d09-8117-4682-aa2d-922d8f1f5fc3 - - - - - -] oslo_limit.project_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.579 231756 DEBUG oslo_service.service [None req-c9759d09-8117-4682-aa2d-922d8f1f5fc3 - - - - - -] oslo_limit.region_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.580 231756 DEBUG oslo_service.service [None req-c9759d09-8117-4682-aa2d-922d8f1f5fc3 - - - - - -] oslo_limit.service_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.580 231756 DEBUG oslo_service.service [None req-c9759d09-8117-4682-aa2d-922d8f1f5fc3 - - - - - -] oslo_limit.service_type = None log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.580 231756 DEBUG oslo_service.service [None req-c9759d09-8117-4682-aa2d-922d8f1f5fc3 - - - - - -] oslo_limit.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.580 231756 DEBUG oslo_service.service [None req-c9759d09-8117-4682-aa2d-922d8f1f5fc3 - - - - - -] oslo_limit.status_code_retries = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.580 231756 DEBUG oslo_service.service [None req-c9759d09-8117-4682-aa2d-922d8f1f5fc3 - - - - - -] oslo_limit.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.580 231756 DEBUG oslo_service.service [None req-c9759d09-8117-4682-aa2d-922d8f1f5fc3 - - - - - -] oslo_limit.system_scope = all log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.580 231756 DEBUG oslo_service.service [None req-c9759d09-8117-4682-aa2d-922d8f1f5fc3 - - - - - -] oslo_limit.timeout = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.581 231756 DEBUG oslo_service.service [None req-c9759d09-8117-4682-aa2d-922d8f1f5fc3 - - - - - -] oslo_limit.trust_id = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.581 231756 DEBUG oslo_service.service [None req-c9759d09-8117-4682-aa2d-922d8f1f5fc3 - - - - - -] oslo_limit.user_domain_id = None log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.581 231756 DEBUG oslo_service.service [None req-c9759d09-8117-4682-aa2d-922d8f1f5fc3 - - - - - -] oslo_limit.user_domain_name = Default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.581 231756 DEBUG oslo_service.service [None req-c9759d09-8117-4682-aa2d-922d8f1f5fc3 - - - - - -] oslo_limit.user_id = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.581 231756 DEBUG oslo_service.service [None req-c9759d09-8117-4682-aa2d-922d8f1f5fc3 - - - - - -] oslo_limit.username = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.581 231756 DEBUG oslo_service.service [None req-c9759d09-8117-4682-aa2d-922d8f1f5fc3 - - - - - -] oslo_limit.valid_interfaces = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.581 231756 DEBUG oslo_service.service [None req-c9759d09-8117-4682-aa2d-922d8f1f5fc3 - - - - - -] oslo_limit.version = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.581 231756 DEBUG oslo_service.service [None req-c9759d09-8117-4682-aa2d-922d8f1f5fc3 - - - - - -] oslo_reports.file_event_handler = /var/lib/nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.582 231756 DEBUG oslo_service.service [None req-c9759d09-8117-4682-aa2d-922d8f1f5fc3 - - - - - -] oslo_reports.file_event_handler_interval = 1 log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.582 231756 DEBUG oslo_service.service [None req-c9759d09-8117-4682-aa2d-922d8f1f5fc3 - - - - - -] oslo_reports.log_dir = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.582 231756 DEBUG oslo_service.service [None req-c9759d09-8117-4682-aa2d-922d8f1f5fc3 - - - - - -] vif_plug_linux_bridge_privileged.capabilities = [12] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.582 231756 DEBUG oslo_service.service [None req-c9759d09-8117-4682-aa2d-922d8f1f5fc3 - - - - - -] vif_plug_linux_bridge_privileged.group = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.582 231756 DEBUG oslo_service.service [None req-c9759d09-8117-4682-aa2d-922d8f1f5fc3 - - - - - -] vif_plug_linux_bridge_privileged.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.582 231756 DEBUG oslo_service.service [None req-c9759d09-8117-4682-aa2d-922d8f1f5fc3 - - - - - -] vif_plug_linux_bridge_privileged.logger_name = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.582 231756 DEBUG oslo_service.service [None req-c9759d09-8117-4682-aa2d-922d8f1f5fc3 - - - - - -] vif_plug_linux_bridge_privileged.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.583 231756 DEBUG oslo_service.service [None req-c9759d09-8117-4682-aa2d-922d8f1f5fc3 - 
- - - - -] vif_plug_linux_bridge_privileged.user = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.583 231756 DEBUG oslo_service.service [None req-c9759d09-8117-4682-aa2d-922d8f1f5fc3 - - - - - -] vif_plug_ovs_privileged.capabilities = [12, 1] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.583 231756 DEBUG oslo_service.service [None req-c9759d09-8117-4682-aa2d-922d8f1f5fc3 - - - - - -] vif_plug_ovs_privileged.group = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.583 231756 DEBUG oslo_service.service [None req-c9759d09-8117-4682-aa2d-922d8f1f5fc3 - - - - - -] vif_plug_ovs_privileged.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.583 231756 DEBUG oslo_service.service [None req-c9759d09-8117-4682-aa2d-922d8f1f5fc3 - - - - - -] vif_plug_ovs_privileged.logger_name = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.583 231756 DEBUG oslo_service.service [None req-c9759d09-8117-4682-aa2d-922d8f1f5fc3 - - - - - -] vif_plug_ovs_privileged.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.583 231756 DEBUG oslo_service.service [None req-c9759d09-8117-4682-aa2d-922d8f1f5fc3 - - - - - -] vif_plug_ovs_privileged.user = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.584 231756 DEBUG oslo_service.service [None 
req-c9759d09-8117-4682-aa2d-922d8f1f5fc3 - - - - - -] os_vif_linux_bridge.flat_interface = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.584 231756 DEBUG oslo_service.service [None req-c9759d09-8117-4682-aa2d-922d8f1f5fc3 - - - - - -] os_vif_linux_bridge.forward_bridge_interface = ['all'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.584 231756 DEBUG oslo_service.service [None req-c9759d09-8117-4682-aa2d-922d8f1f5fc3 - - - - - -] os_vif_linux_bridge.iptables_bottom_regex = log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.584 231756 DEBUG oslo_service.service [None req-c9759d09-8117-4682-aa2d-922d8f1f5fc3 - - - - - -] os_vif_linux_bridge.iptables_drop_action = DROP log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.584 231756 DEBUG oslo_service.service [None req-c9759d09-8117-4682-aa2d-922d8f1f5fc3 - - - - - -] os_vif_linux_bridge.iptables_top_regex = log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.584 231756 DEBUG oslo_service.service [None req-c9759d09-8117-4682-aa2d-922d8f1f5fc3 - - - - - -] os_vif_linux_bridge.network_device_mtu = 1500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.584 231756 DEBUG oslo_service.service [None req-c9759d09-8117-4682-aa2d-922d8f1f5fc3 - - - - - -] os_vif_linux_bridge.use_ipv6 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.585 
231756 DEBUG oslo_service.service [None req-c9759d09-8117-4682-aa2d-922d8f1f5fc3 - - - - - -] os_vif_linux_bridge.vlan_interface = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.585 231756 DEBUG oslo_service.service [None req-c9759d09-8117-4682-aa2d-922d8f1f5fc3 - - - - - -] os_vif_ovs.isolate_vif = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.585 231756 DEBUG oslo_service.service [None req-c9759d09-8117-4682-aa2d-922d8f1f5fc3 - - - - - -] os_vif_ovs.network_device_mtu = 1500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.585 231756 DEBUG oslo_service.service [None req-c9759d09-8117-4682-aa2d-922d8f1f5fc3 - - - - - -] os_vif_ovs.ovs_vsctl_timeout = 120 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.585 231756 DEBUG oslo_service.service [None req-c9759d09-8117-4682-aa2d-922d8f1f5fc3 - - - - - -] os_vif_ovs.ovsdb_connection = tcp:127.0.0.1:6640 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.585 231756 DEBUG oslo_service.service [None req-c9759d09-8117-4682-aa2d-922d8f1f5fc3 - - - - - -] os_vif_ovs.ovsdb_interface = native log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.585 231756 DEBUG oslo_service.service [None req-c9759d09-8117-4682-aa2d-922d8f1f5fc3 - - - - - -] os_vif_ovs.per_port_bridge = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.586 231756 DEBUG 
oslo_service.service [None req-c9759d09-8117-4682-aa2d-922d8f1f5fc3 - - - - - -] os_brick.lock_path = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.586 231756 DEBUG oslo_service.service [None req-c9759d09-8117-4682-aa2d-922d8f1f5fc3 - - - - - -] os_brick.wait_mpath_device_attempts = 4 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.586 231756 DEBUG oslo_service.service [None req-c9759d09-8117-4682-aa2d-922d8f1f5fc3 - - - - - -] os_brick.wait_mpath_device_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.586 231756 DEBUG oslo_service.service [None req-c9759d09-8117-4682-aa2d-922d8f1f5fc3 - - - - - -] privsep_osbrick.capabilities = [21] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.586 231756 DEBUG oslo_service.service [None req-c9759d09-8117-4682-aa2d-922d8f1f5fc3 - - - - - -] privsep_osbrick.group = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.586 231756 DEBUG oslo_service.service [None req-c9759d09-8117-4682-aa2d-922d8f1f5fc3 - - - - - -] privsep_osbrick.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.586 231756 DEBUG oslo_service.service [None req-c9759d09-8117-4682-aa2d-922d8f1f5fc3 - - - - - -] privsep_osbrick.logger_name = os_brick.privileged log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.586 231756 DEBUG 
oslo_service.service [None req-c9759d09-8117-4682-aa2d-922d8f1f5fc3 - - - - - -] privsep_osbrick.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.587 231756 DEBUG oslo_service.service [None req-c9759d09-8117-4682-aa2d-922d8f1f5fc3 - - - - - -] privsep_osbrick.user = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.587 231756 DEBUG oslo_service.service [None req-c9759d09-8117-4682-aa2d-922d8f1f5fc3 - - - - - -] nova_sys_admin.capabilities = [0, 1, 2, 3, 12, 21] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.587 231756 DEBUG oslo_service.service [None req-c9759d09-8117-4682-aa2d-922d8f1f5fc3 - - - - - -] nova_sys_admin.group = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.587 231756 DEBUG oslo_service.service [None req-c9759d09-8117-4682-aa2d-922d8f1f5fc3 - - - - - -] nova_sys_admin.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.587 231756 DEBUG oslo_service.service [None req-c9759d09-8117-4682-aa2d-922d8f1f5fc3 - - - - - -] nova_sys_admin.logger_name = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.587 231756 DEBUG oslo_service.service [None req-c9759d09-8117-4682-aa2d-922d8f1f5fc3 - - - - - -] nova_sys_admin.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.587 231756 DEBUG 
oslo_service.service [None req-c9759d09-8117-4682-aa2d-922d8f1f5fc3 - - - - - -] nova_sys_admin.user = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.588 231756 DEBUG oslo_service.service [None req-c9759d09-8117-4682-aa2d-922d8f1f5fc3 - - - - - -] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613#033[00m Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.589 231756 INFO nova.service [-] Starting compute node (version 27.5.2-0.20250829104910.6f8decf.el9)#033[00m Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.601 231756 INFO nova.virt.node [None req-4273134e-490e-4169-bca2-17f951ca8cd9 - - - - - -] Determined node identity 26c8956b-6742-4951-b566-971b9bbe323b from /var/lib/nova/compute_id#033[00m Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.601 231756 DEBUG nova.virt.libvirt.host [None req-4273134e-490e-4169-bca2-17f951ca8cd9 - - - - - -] Starting native event thread _init_events /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:492#033[00m Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.602 231756 DEBUG nova.virt.libvirt.host [None req-4273134e-490e-4169-bca2-17f951ca8cd9 - - - - - -] Starting green dispatch thread _init_events /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:498#033[00m Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.602 231756 DEBUG nova.virt.libvirt.host [None req-4273134e-490e-4169-bca2-17f951ca8cd9 - - - - - -] Starting connection event dispatch thread initialize /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:620#033[00m Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.602 231756 DEBUG nova.virt.libvirt.host [None req-4273134e-490e-4169-bca2-17f951ca8cd9 - - - - - -] Connecting to libvirt: 
qemu:///system _get_new_connection /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:503#033[00m Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.616 231756 DEBUG nova.virt.libvirt.host [None req-4273134e-490e-4169-bca2-17f951ca8cd9 - - - - - -] Registering for lifecycle events _get_new_connection /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:509#033[00m Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.618 231756 DEBUG nova.virt.libvirt.host [None req-4273134e-490e-4169-bca2-17f951ca8cd9 - - - - - -] Registering for connection events: _get_new_connection /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:530#033[00m Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.619 231756 INFO nova.virt.libvirt.driver [None req-4273134e-490e-4169-bca2-17f951ca8cd9 - - - - - -] Connection event '1' reason 'None'#033[00m Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.626 231756 INFO nova.virt.libvirt.host [None req-4273134e-490e-4169-bca2-17f951ca8cd9 - - - - - -] Libvirt host capabilities Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: 12c7b589-8d2b-44b6-80e1-1f4b0f34f69b Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: x86_64 Dec 15 04:32:41 localhost nova_compute[231752]: EPYC-Rome-v4 Dec 15 04:32:41 localhost nova_compute[231752]: AMD Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost 
Dec 15 04:32:41 localhost nova_compute[231752]: [libvirt host capabilities XML dump; element tags lost in log capture. Recoverable values: migration transports tcp, rdma; memory 16116604 / 4029151 kB (0 / 0); secmodel selinux doi 0, baselabels system_u:system_r:svirt_t:s0 and system_u:system_r:svirt_tcg_t:s0; secmodel dac doi 0, baselabel +107:+107; guest os_type hvm, wordsize 32, emulator /usr/libexec/qemu-kvm, machines pc-i440fx-rhel7.6.0 (pc), pc-q35-rhel9.8.0 (q35), pc-q35-rhel9.6.0, pc-q35-rhel8.6.0, pc-q35-rhel9.4.0, pc-q35-rhel8.5.0, pc-q35-rhel8.3.0, pc-q35-rhel7.6.0, pc-q35-rhel8.4.0, pc-q35-rhel9.2.0, pc-q35-rhel8.2.0, pc-q35-rhel9.0.0, pc-q35-rhel8.0.0, pc-q35-rhel8.1.0; guest os_type hvm, wordsize 64, same emulator and machine list]
Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.630 231756 DEBUG nova.virt.libvirt.volume.mount [None req-4273134e-490e-4169-bca2-17f951ca8cd9 - - - - - -] Initialising _HostMountState generation 0 host_up /usr/lib/python3.9/site-packages/nova/virt/libvirt/volume/mount.py:130
Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.633 231756 DEBUG nova.virt.libvirt.host [None req-4273134e-490e-4169-bca2-17f951ca8cd9 - - - - - -] Getting domain capabilities for i686 via machine types: {'pc', 'q35'} _get_machine_types /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:952
Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.638 231756 DEBUG nova.virt.libvirt.host [None req-4273134e-490e-4169-bca2-17f951ca8cd9 - - - - - -] Libvirt host hypervisor capabilities for arch=i686 and machine_type=pc:
Dec 15 04:32:41 localhost nova_compute[231752]: [libvirt domain capabilities XML dump; element tags lost in log capture. Recoverable values: path /usr/libexec/qemu-kvm, domain kvm, machine pc-i440fx-rhel7.6.0, arch i686; os loader value /usr/share/OVMF/OVMF_CODE.secboot.fd, loader types rom and pflash, readonly yes/no, secure no; enum values on/off; host CPU model EPYC-Rome, vendor AMD; supported CPU models: 486, 486-v1, Broadwell, Broadwell-IBRS, Broadwell-noTSX, Broadwell-noTSX-IBRS, Broadwell-v1 .. Broadwell-v4, Cascadelake-Server, Cascadelake-Server-noTSX, Cascadelake-Server-v1 .. Cascadelake-Server-v5, Conroe, Conroe-v1, Cooperlake, Cooperlake-v1, Cooperlake-v2, Denverton, Denverton-v1 .. Denverton-v3, Dhyana, Dhyana-v1, Dhyana-v2, EPYC, EPYC-Genoa, EPYC-Genoa-v1, EPYC-IBPB, EPYC-Milan, EPYC-Milan-v1, EPYC-Milan-v2, EPYC-Rome, EPYC-Rome-v1 .. EPYC-Rome-v4, EPYC-v1 .. EPYC-v4, GraniteRapids, GraniteRapids-v1, GraniteRapids-v2, Haswell, Haswell-IBRS, Haswell-noTSX, Haswell-noTSX-IBRS, Haswell-v1 .. Haswell-v4, Icelake-Server, ... (dump continues)]
Icelake-Server-noTSX Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Icelake-Server-v1 Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost 
nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Icelake-Server-v2 Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Icelake-Server-v3 Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 
localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Icelake-Server-v4 Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Icelake-Server-v5 Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 
04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Icelake-Server-v6 Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 
localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Icelake-Server-v7 Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: IvyBridge Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: IvyBridge-IBRS Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: IvyBridge-v1 Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost 
nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: IvyBridge-v2 Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: KnightsMill Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: KnightsMill-v1 Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Nehalem Dec 15 04:32:41 localhost nova_compute[231752]: Nehalem-IBRS Dec 15 04:32:41 localhost nova_compute[231752]: Nehalem-v1 Dec 15 04:32:41 localhost nova_compute[231752]: Nehalem-v2 Dec 15 04:32:41 localhost nova_compute[231752]: Opteron_G1 Dec 15 04:32:41 localhost nova_compute[231752]: Opteron_G1-v1 Dec 15 04:32:41 localhost nova_compute[231752]: Opteron_G2 Dec 15 04:32:41 localhost nova_compute[231752]: Opteron_G2-v1 Dec 15 04:32:41 localhost nova_compute[231752]: Opteron_G3 Dec 15 04:32:41 localhost nova_compute[231752]: 
Opteron_G3-v1 Dec 15 04:32:41 localhost nova_compute[231752]: Opteron_G4 Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Opteron_G4-v1 Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Opteron_G5 Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Opteron_G5-v1 Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Penryn Dec 15 04:32:41 localhost nova_compute[231752]: Penryn-v1 Dec 15 04:32:41 localhost nova_compute[231752]: SandyBridge Dec 15 04:32:41 localhost nova_compute[231752]: SandyBridge-IBRS Dec 15 04:32:41 localhost nova_compute[231752]: SandyBridge-v1 Dec 15 04:32:41 localhost nova_compute[231752]: SandyBridge-v2 Dec 15 04:32:41 localhost nova_compute[231752]: SapphireRapids Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 
localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: SapphireRapids-v1 Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost 
nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: SapphireRapids-v2 Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost 
nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: SapphireRapids-v3 Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost 
nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: 
SierraForest Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: SierraForest-v1 Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: 
Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Skylake-Client Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Skylake-Client-IBRS Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Skylake-Client-noTSX-IBRS Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Skylake-Client-v1 Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: 
Dec 15 04:32:41 localhost nova_compute[231752]: [libvirt domain-capabilities XML garbled in capture; element markup lost. Recoverable values, in order:]
Dec 15 04:32:41 localhost nova_compute[231752]: CPU models (cont.): Skylake-Client-v2, Skylake-Client-v3, Skylake-Client-v4, Skylake-Server, Skylake-Server-IBRS, Skylake-Server-noTSX-IBRS, Skylake-Server-v1, Skylake-Server-v2, Skylake-Server-v3, Skylake-Server-v4, Skylake-Server-v5, Snowridge, Snowridge-v1, Snowridge-v2, Snowridge-v3, Snowridge-v4, Westmere, Westmere-IBRS, Westmere-v1, Westmere-v2, athlon, athlon-v1, core2duo, core2duo-v1, coreduo, coreduo-v1, kvm32, kvm32-v1, kvm64, kvm64-v1, n270, n270-v1, pentium, pentium-v1, pentium2, pentium2-v1, pentium3, pentium3-v1, phenom, phenom-v1, qemu32, qemu32-v1, qemu64, qemu64-v1
Dec 15 04:32:41 localhost nova_compute[231752]: memory backing source types: file, anonymous, memfd
Dec 15 04:32:41 localhost nova_compute[231752]: disk device types: disk, cdrom, floppy, lun; buses: ide, fdc, scsi, virtio, usb, sata; models: virtio, virtio-transitional, virtio-non-transitional
Dec 15 04:32:41 localhost nova_compute[231752]: graphics types: vnc, egl-headless, dbus
Dec 15 04:32:41 localhost nova_compute[231752]: hostdev mode: subsystem; startupPolicy: default, mandatory, requisite, optional; subsystem types: usb, pci, scsi
Dec 15 04:32:41 localhost nova_compute[231752]: rng models: virtio, virtio-transitional, virtio-non-transitional; backends: random, egd, builtin
Dec 15 04:32:41 localhost nova_compute[231752]: filesystem driver types: path, handle, virtiofs
Dec 15 04:32:41 localhost nova_compute[231752]: tpm models: tpm-tis, tpm-crb; backends: emulator, external; backend version: 2.0
Dec 15 04:32:41 localhost nova_compute[231752]: redirdev bus: usb; chardev types: pty, unix; crypto backends: qemu, builtin; interface backends: default, passt
Dec 15 04:32:41 localhost nova_compute[231752]: panic models: isa, hyperv
Dec 15 04:32:41 localhost nova_compute[231752]: console/channel types: null, vc, pty, dev, file, pipe, stdio, udp, tcp, unix, qemu-vdagent, dbus
Dec 15 04:32:41 localhost nova_compute[231752]: hyperv features: relaxed, vapic, spinlocks, vpindex, runtime, synic, stimer, reset, vendor_id, frequencies, reenlightenment, tlbflush, ipi, avic, emsr_bitmap, xmm_input; spinlocks retries: 4095; flags: on, off, off; vendor_id: Linux KVM Hv
Dec 15 04:32:41 localhost nova_compute[231752]: launch security: tdx
Dec 15 04:32:41 localhost nova_compute[231752]: _get_domain_capabilities /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1037
Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.644 231756 DEBUG nova.virt.libvirt.host [None req-4273134e-490e-4169-bca2-17f951ca8cd9 - - - - - -] Libvirt host hypervisor capabilities for arch=i686 and machine_type=q35:
Dec 15 04:32:41 localhost nova_compute[231752]: [second domain-capabilities XML, also garbled. Recoverable values, in order:]
Dec 15 04:32:41 localhost nova_compute[231752]: emulator: /usr/libexec/qemu-kvm; domain: kvm; machine: pc-q35-rhel9.8.0; arch: i686
Dec 15 04:32:41 localhost nova_compute[231752]: os loader: /usr/share/OVMF/OVMF_CODE.secboot.fd; loader types: rom, pflash; readonly: yes, no; secure: no
Dec 15 04:32:41 localhost nova_compute[231752]: cpu mode toggles: on, off; on, off; host-model: EPYC-Rome, vendor AMD
Dec 15 04:32:41 localhost nova_compute[231752]: CPU models: 486, 486-v1, Broadwell, Broadwell-IBRS, Broadwell-noTSX, Broadwell-noTSX-IBRS, Broadwell-v1, Broadwell-v2, Broadwell-v3, Broadwell-v4, Cascadelake-Server, Cascadelake-Server-noTSX, Cascadelake-Server-v1, Cascadelake-Server-v2, Cascadelake-Server-v3, Cascadelake-Server-v4, Cascadelake-Server-v5, Conroe, Conroe-v1, Cooperlake, Cooperlake-v1, Cooperlake-v2, Denverton, Denverton-v1, Denverton-v2, Denverton-v3, Dhyana, Dhyana-v1, Dhyana-v2, EPYC, EPYC-Genoa, EPYC-Genoa-v1, EPYC-IBPB, EPYC-Milan, EPYC-Milan-v1, EPYC-Milan-v2, EPYC-Rome
nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: EPYC-Rome-v1 Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: EPYC-Rome-v2 Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: EPYC-Rome-v3 Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: EPYC-Rome-v4 Dec 15 04:32:41 localhost nova_compute[231752]: EPYC-v1 Dec 15 04:32:41 localhost nova_compute[231752]: EPYC-v2 Dec 15 04:32:41 localhost nova_compute[231752]: EPYC-v3 Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: EPYC-v4 Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: GraniteRapids Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 
04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: GraniteRapids-v1 Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 
localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: GraniteRapids-v2 Dec 15 04:32:41 localhost 
nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 
04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Haswell Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Haswell-IBRS Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Haswell-noTSX Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Haswell-noTSX-IBRS Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost 
nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Haswell-v1 Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Haswell-v2 Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Haswell-v3 Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Haswell-v4 Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Icelake-Server Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost 
nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Icelake-Server-noTSX Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Icelake-Server-v1 Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 
localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Icelake-Server-v2 Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Icelake-Server-v3 Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 
04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Icelake-Server-v4 Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 
localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Icelake-Server-v5 Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Icelake-Server-v6 Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 
04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Icelake-Server-v7 Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 
localhost nova_compute[231752]: IvyBridge Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: IvyBridge-IBRS Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: IvyBridge-v1 Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: IvyBridge-v2 Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: KnightsMill Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: KnightsMill-v1 Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 
localhost nova_compute[231752]: Nehalem Dec 15 04:32:41 localhost nova_compute[231752]: Nehalem-IBRS Dec 15 04:32:41 localhost nova_compute[231752]: Nehalem-v1 Dec 15 04:32:41 localhost nova_compute[231752]: Nehalem-v2 Dec 15 04:32:41 localhost nova_compute[231752]: Opteron_G1 Dec 15 04:32:41 localhost nova_compute[231752]: Opteron_G1-v1 Dec 15 04:32:41 localhost nova_compute[231752]: Opteron_G2 Dec 15 04:32:41 localhost nova_compute[231752]: Opteron_G2-v1 Dec 15 04:32:41 localhost nova_compute[231752]: Opteron_G3 Dec 15 04:32:41 localhost nova_compute[231752]: Opteron_G3-v1 Dec 15 04:32:41 localhost nova_compute[231752]: Opteron_G4 Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Opteron_G4-v1 Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Opteron_G5 Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Opteron_G5-v1 Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Penryn Dec 15 04:32:41 localhost nova_compute[231752]: Penryn-v1 Dec 15 04:32:41 localhost nova_compute[231752]: SandyBridge Dec 15 04:32:41 localhost nova_compute[231752]: SandyBridge-IBRS Dec 15 04:32:41 localhost nova_compute[231752]: SandyBridge-v1 Dec 15 
Dec 15 04:32:41 localhost nova_compute[231752]: [libvirt domain capabilities XML logged with markup stripped; remaining values condensed]
Dec 15 04:32:41 localhost nova_compute[231752]: custom CPU models (continued): SandyBridge-v2 SapphireRapids SapphireRapids-v1 SapphireRapids-v2 SapphireRapids-v3 SierraForest SierraForest-v1 Skylake-Client Skylake-Client-IBRS Skylake-Client-noTSX-IBRS Skylake-Client-v1 Skylake-Client-v2 Skylake-Client-v3 Skylake-Client-v4 Skylake-Server Skylake-Server-IBRS Skylake-Server-noTSX-IBRS Skylake-Server-v1 Skylake-Server-v2 Skylake-Server-v3 Skylake-Server-v4 Skylake-Server-v5 Snowridge Snowridge-v1 Snowridge-v2 Snowridge-v3 Snowridge-v4 Westmere Westmere-IBRS Westmere-v1 Westmere-v2 athlon athlon-v1 core2duo core2duo-v1 coreduo coreduo-v1 kvm32 kvm32-v1 kvm64 kvm64-v1 n270 n270-v1 pentium pentium-v1 pentium2 pentium2-v1 pentium3 pentium3-v1 phenom phenom-v1 qemu32 qemu32-v1 qemu64 qemu64-v1
Dec 15 04:32:41 localhost nova_compute[231752]: memory backing source types: file anonymous memfd
Dec 15 04:32:41 localhost nova_compute[231752]: disk: devices disk cdrom floppy lun; buses fdc scsi virtio usb sata; models virtio virtio-transitional virtio-non-transitional
Dec 15 04:32:41 localhost nova_compute[231752]: graphics types: vnc egl-headless dbus
Dec 15 04:32:41 localhost nova_compute[231752]: hostdev: mode subsystem; startupPolicy default mandatory requisite optional; subsystem types usb pci scsi
Dec 15 04:32:41 localhost nova_compute[231752]: rng: models virtio virtio-transitional virtio-non-transitional; backends random egd builtin
Dec 15 04:32:41 localhost nova_compute[231752]: filesystem driver types: path handle virtiofs
Dec 15 04:32:41 localhost nova_compute[231752]: tpm: models tpm-tis tpm-crb; backends emulator external; backend version 2.0
Dec 15 04:32:41 localhost nova_compute[231752]: redirdev bus: usb; channel types: pty unix; crypto backends: qemu builtin; interface backends: default passt; panic models: isa hyperv
Dec 15 04:32:41 localhost nova_compute[231752]: console/serial types: null vc pty dev file pipe stdio udp tcp unix qemu-vdagent dbus
Dec 15 04:32:41 localhost nova_compute[231752]: hyperv features: relaxed vapic spinlocks vpindex runtime synic stimer reset vendor_id frequencies reenlightenment tlbflush ipi avic emsr_bitmap xmm_input (spinlock retries 4095; on/off toggles; vendor_id "Linux KVM Hv")
Dec 15 04:32:41 localhost nova_compute[231752]: launch security: tdx
Dec 15 04:32:41 localhost nova_compute[231752]: _get_domain_capabilities /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1037
Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.665 231756 DEBUG nova.virt.libvirt.host [None req-4273134e-490e-4169-bca2-17f951ca8cd9 - - - - - -] Getting domain capabilities for x86_64 via machine types: {'pc', 'q35'} _get_machine_types /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:952
Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.670 231756 DEBUG nova.virt.libvirt.host [None req-4273134e-490e-4169-bca2-17f951ca8cd9 - - - - - -] Libvirt host hypervisor capabilities for arch=x86_64 and machine_type=pc:
Dec 15 04:32:41 localhost nova_compute[231752]: [domain capabilities XML logged with markup stripped; remaining values condensed]
Dec 15 04:32:41 localhost nova_compute[231752]: emulator /usr/libexec/qemu-kvm; domain kvm; machine pc-i440fx-rhel7.6.0; arch x86_64
Dec 15 04:32:41 localhost nova_compute[231752]: loader /usr/share/OVMF/OVMF_CODE.secboot.fd (types rom pflash; readonly yes no; secure no)
Dec 15 04:32:41 localhost nova_compute[231752]: cpu: host-passthrough migratable on off; maximum migratable on off; host-model EPYC-Rome (vendor AMD)
Dec 15 04:32:41 localhost nova_compute[231752]: custom CPU models: 486 486-v1 Broadwell Broadwell-IBRS Broadwell-noTSX Broadwell-noTSX-IBRS Broadwell-v1 Broadwell-v2 Broadwell-v3 Broadwell-v4 Cascadelake-Server Cascadelake-Server-noTSX Dec 15 04:32:41
localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Cascadelake-Server-v1 Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Cascadelake-Server-v2 Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Cascadelake-Server-v3 Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost 
nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Cascadelake-Server-v4 Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Cascadelake-Server-v5 Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Conroe Dec 15 
04:32:41 localhost nova_compute[231752]: Conroe-v1 Dec 15 04:32:41 localhost nova_compute[231752]: Cooperlake Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Cooperlake-v1 Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Cooperlake-v2 Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost 
nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Denverton Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Denverton-v1 Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Denverton-v2 Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Denverton-v3 Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dhyana Dec 15 04:32:41 localhost nova_compute[231752]: Dhyana-v1 Dec 15 04:32:41 localhost nova_compute[231752]: Dhyana-v2 Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost 
nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: EPYC Dec 15 04:32:41 localhost nova_compute[231752]: EPYC-Genoa Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: EPYC-Genoa-v1 Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost 
nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: EPYC-IBPB Dec 15 04:32:41 localhost nova_compute[231752]: EPYC-Milan Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: EPYC-Milan-v1 Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: EPYC-Milan-v2 Dec 15 
04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: EPYC-Rome Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: EPYC-Rome-v1 Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: EPYC-Rome-v2 Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: EPYC-Rome-v3 Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: EPYC-Rome-v4 Dec 15 04:32:41 localhost nova_compute[231752]: EPYC-v1 Dec 15 04:32:41 localhost nova_compute[231752]: EPYC-v2 Dec 15 04:32:41 localhost nova_compute[231752]: EPYC-v3 Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: EPYC-v4 Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost 
nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: GraniteRapids Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost 
nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: GraniteRapids-v1 Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost 
nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: GraniteRapids-v2 Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost 
nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Haswell Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Haswell-IBRS Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost 
nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Haswell-noTSX Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Haswell-noTSX-IBRS Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Haswell-v1 Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Haswell-v2 Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Haswell-v3 Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost 
nova_compute[231752]: Haswell-v4 Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Icelake-Server Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Icelake-Server-noTSX Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 
Dec 15 04:32:41 localhost nova_compute[231752]: [libvirt domain-capabilities reply; the XML markup was lost in log capture — only the enum values below survive, grouped here in order of appearance]
Dec 15 04:32:41 localhost nova_compute[231752]: CPU models: Icelake-Server-v1 Icelake-Server-v2 Icelake-Server-v3 Icelake-Server-v4 Icelake-Server-v5 Icelake-Server-v6 Icelake-Server-v7 IvyBridge IvyBridge-IBRS IvyBridge-v1 IvyBridge-v2 KnightsMill KnightsMill-v1 Nehalem Nehalem-IBRS Nehalem-v1 Nehalem-v2 Opteron_G1 Opteron_G1-v1 Opteron_G2 Opteron_G2-v1 Opteron_G3 Opteron_G3-v1 Opteron_G4 Opteron_G4-v1 Opteron_G5 Opteron_G5-v1 Penryn Penryn-v1 SandyBridge SandyBridge-IBRS SandyBridge-v1 SandyBridge-v2 SapphireRapids SapphireRapids-v1 SapphireRapids-v2 SapphireRapids-v3 SierraForest SierraForest-v1 Skylake-Client Skylake-Client-IBRS Skylake-Client-noTSX-IBRS Skylake-Client-v1 Skylake-Client-v2 Skylake-Client-v3 Skylake-Client-v4 Skylake-Server Skylake-Server-IBRS Skylake-Server-noTSX-IBRS Skylake-Server-v1 Skylake-Server-v2 Skylake-Server-v3 Skylake-Server-v4 Skylake-Server-v5 Snowridge Snowridge-v1 Snowridge-v2 Snowridge-v3 Snowridge-v4 Westmere Westmere-IBRS Westmere-v1 Westmere-v2 athlon athlon-v1 core2duo core2duo-v1 coreduo coreduo-v1 kvm32 kvm32-v1 kvm64 kvm64-v1 n270 n270-v1 pentium pentium-v1 pentium2 pentium2-v1 pentium3 pentium3-v1 phenom phenom-v1 qemu32 qemu32-v1 qemu64 qemu64-v1
Dec 15 04:32:41 localhost nova_compute[231752]: memory backing source types: file anonymous memfd
Dec 15 04:32:41 localhost nova_compute[231752]: disk device types: disk cdrom floppy lun
Dec 15 04:32:41 localhost nova_compute[231752]: disk bus types: ide fdc scsi virtio usb sata
Dec 15 04:32:41 localhost nova_compute[231752]: disk models: virtio virtio-transitional virtio-non-transitional
Dec 15 04:32:41 localhost nova_compute[231752]: graphics types: vnc egl-headless dbus
Dec 15 04:32:41 localhost nova_compute[231752]: hostdev modes: subsystem; startup policies: default mandatory requisite optional; subsystem types: usb pci scsi
Dec 15 04:32:41 localhost nova_compute[231752]: rng models: virtio virtio-transitional virtio-non-transitional; rng backends: random egd builtin
Dec 15 04:32:41 localhost nova_compute[231752]: filesystem driver types: path handle virtiofs
Dec 15 04:32:41 localhost nova_compute[231752]: tpm models: tpm-tis tpm-crb; tpm backends: emulator external
nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: 2.0 Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: usb Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: pty Dec 15 04:32:41 localhost nova_compute[231752]: unix Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: qemu Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: builtin Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: default Dec 15 04:32:41 localhost nova_compute[231752]: passt Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: isa Dec 15 04:32:41 localhost nova_compute[231752]: hyperv Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: null Dec 15 04:32:41 localhost 
nova_compute[231752]: vc Dec 15 04:32:41 localhost nova_compute[231752]: pty Dec 15 04:32:41 localhost nova_compute[231752]: dev Dec 15 04:32:41 localhost nova_compute[231752]: file Dec 15 04:32:41 localhost nova_compute[231752]: pipe Dec 15 04:32:41 localhost nova_compute[231752]: stdio Dec 15 04:32:41 localhost nova_compute[231752]: udp Dec 15 04:32:41 localhost nova_compute[231752]: tcp Dec 15 04:32:41 localhost nova_compute[231752]: unix Dec 15 04:32:41 localhost nova_compute[231752]: qemu-vdagent Dec 15 04:32:41 localhost nova_compute[231752]: dbus Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: relaxed Dec 15 04:32:41 localhost nova_compute[231752]: vapic Dec 15 04:32:41 localhost nova_compute[231752]: spinlocks Dec 15 04:32:41 localhost nova_compute[231752]: vpindex Dec 15 04:32:41 localhost nova_compute[231752]: runtime Dec 15 04:32:41 localhost nova_compute[231752]: synic Dec 15 04:32:41 localhost nova_compute[231752]: stimer Dec 15 04:32:41 localhost nova_compute[231752]: reset Dec 15 04:32:41 localhost nova_compute[231752]: vendor_id Dec 15 04:32:41 localhost nova_compute[231752]: frequencies Dec 15 04:32:41 localhost nova_compute[231752]: reenlightenment Dec 15 04:32:41 localhost nova_compute[231752]: tlbflush Dec 15 04:32:41 localhost 
nova_compute[231752]: ipi Dec 15 04:32:41 localhost nova_compute[231752]: avic Dec 15 04:32:41 localhost nova_compute[231752]: emsr_bitmap Dec 15 04:32:41 localhost nova_compute[231752]: xmm_input Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: 4095 Dec 15 04:32:41 localhost nova_compute[231752]: on Dec 15 04:32:41 localhost nova_compute[231752]: off Dec 15 04:32:41 localhost nova_compute[231752]: off Dec 15 04:32:41 localhost nova_compute[231752]: Linux KVM Hv Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: tdx Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: _get_domain_capabilities /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1037#033[00m Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.720 231756 DEBUG nova.virt.libvirt.host [None req-4273134e-490e-4169-bca2-17f951ca8cd9 - - - - - -] Libvirt host hypervisor capabilities for arch=x86_64 and machine_type=q35: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: /usr/libexec/qemu-kvm Dec 15 04:32:41 localhost nova_compute[231752]: kvm Dec 15 04:32:41 localhost nova_compute[231752]: pc-q35-rhel9.8.0 Dec 15 04:32:41 localhost nova_compute[231752]: x86_64 Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: efi Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost 
nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: /usr/share/edk2/ovmf/OVMF_CODE.secboot.fd Dec 15 04:32:41 localhost nova_compute[231752]: /usr/share/edk2/ovmf/OVMF_CODE.fd Dec 15 04:32:41 localhost nova_compute[231752]: /usr/share/edk2/ovmf/OVMF.amdsev.fd Dec 15 04:32:41 localhost nova_compute[231752]: /usr/share/edk2/ovmf/OVMF.inteltdx.secboot.fd Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: rom Dec 15 04:32:41 localhost nova_compute[231752]: pflash Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: yes Dec 15 04:32:41 localhost nova_compute[231752]: no Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: yes Dec 15 04:32:41 localhost nova_compute[231752]: no Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: on Dec 15 04:32:41 localhost nova_compute[231752]: off Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: on Dec 15 04:32:41 localhost nova_compute[231752]: off Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: EPYC-Rome Dec 15 04:32:41 localhost nova_compute[231752]: AMD Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost 
nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: 486 Dec 15 04:32:41 localhost nova_compute[231752]: 486-v1 Dec 15 04:32:41 localhost nova_compute[231752]: Broadwell Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Broadwell-IBRS Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Broadwell-noTSX Dec 
15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Broadwell-noTSX-IBRS Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Broadwell-v1 Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Broadwell-v2 Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Broadwell-v3 Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Broadwell-v4 Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: 
Cascadelake-Server Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Cascadelake-Server-noTSX Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Cascadelake-Server-v1 Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 
04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Cascadelake-Server-v2 Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Cascadelake-Server-v3 Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Cascadelake-Server-v4 Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 
localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Cascadelake-Server-v5 Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Conroe Dec 15 04:32:41 localhost nova_compute[231752]: Conroe-v1 Dec 15 04:32:41 localhost nova_compute[231752]: Cooperlake Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost 
nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Cooperlake-v1 Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Cooperlake-v2 Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Denverton Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 
localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Denverton-v1 Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Denverton-v2 Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Denverton-v3 Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dhyana Dec 15 04:32:41 localhost nova_compute[231752]: Dhyana-v1 Dec 15 04:32:41 localhost nova_compute[231752]: Dhyana-v2 Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: EPYC Dec 15 04:32:41 localhost nova_compute[231752]: EPYC-Genoa Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 
04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: EPYC-Genoa-v1 Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 
localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: EPYC-IBPB Dec 15 04:32:41 localhost nova_compute[231752]: EPYC-Milan Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: EPYC-Milan-v1 Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: EPYC-Milan-v2 Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: EPYC-Rome Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost 
Dec 15 04:32:41 localhost nova_compute[231752]: [libvirt CPU model capability list; repeated empty log prefixes collapsed] EPYC-Rome-v1 EPYC-Rome-v2 EPYC-Rome-v3 EPYC-Rome-v4 EPYC-v1 EPYC-v2 EPYC-v3 EPYC-v4 GraniteRapids GraniteRapids-v1 GraniteRapids-v2 Haswell Haswell-IBRS Haswell-noTSX Haswell-noTSX-IBRS Haswell-v1 Haswell-v2 Haswell-v3 Haswell-v4 Icelake-Server Icelake-Server-noTSX Icelake-Server-v1 Icelake-Server-v2 Icelake-Server-v3 Icelake-Server-v4 Icelake-Server-v5 Icelake-Server-v6 Icelake-Server-v7 IvyBridge IvyBridge-IBRS IvyBridge-v1 IvyBridge-v2 KnightsMill KnightsMill-v1 Nehalem Nehalem-IBRS Nehalem-v1 Nehalem-v2 Opteron_G1 Opteron_G1-v1 Opteron_G2 Opteron_G2-v1 Opteron_G3 Opteron_G3-v1 Opteron_G4 Opteron_G4-v1 Opteron_G5 Opteron_G5-v1 Penryn Penryn-v1 SandyBridge SandyBridge-IBRS SandyBridge-v1 SandyBridge-v2 SapphireRapids SapphireRapids-v1 SapphireRapids-v2 SapphireRapids-v3 SierraForest SierraForest-v1 Skylake-Client Skylake-Client-IBRS Skylake-Client-noTSX-IBRS Skylake-Client-v1 Skylake-Client-v2 Skylake-Client-v3 Skylake-Client-v4 Skylake-Server Skylake-Server-IBRS Skylake-Server-noTSX-IBRS Skylake-Server-v1
nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Skylake-Server-v2 Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Skylake-Server-v3 Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Skylake-Server-v4 Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 
15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Skylake-Server-v5 Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Snowridge Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Snowridge-v1 Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Snowridge-v2 Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost 
nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Snowridge-v3 Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Snowridge-v4 Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Westmere Dec 15 04:32:41 localhost nova_compute[231752]: Westmere-IBRS Dec 15 04:32:41 localhost nova_compute[231752]: Westmere-v1 Dec 15 04:32:41 localhost nova_compute[231752]: Westmere-v2 Dec 15 04:32:41 localhost nova_compute[231752]: athlon Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: athlon-v1 Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 
localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: core2duo Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: core2duo-v1 Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: coreduo Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: coreduo-v1 Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: kvm32 Dec 15 04:32:41 localhost nova_compute[231752]: kvm32-v1 Dec 15 04:32:41 localhost nova_compute[231752]: kvm64 Dec 15 04:32:41 localhost nova_compute[231752]: kvm64-v1 Dec 15 04:32:41 localhost nova_compute[231752]: n270 Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: n270-v1 Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: pentium Dec 15 04:32:41 localhost nova_compute[231752]: pentium-v1 Dec 15 04:32:41 localhost nova_compute[231752]: pentium2 Dec 15 04:32:41 localhost nova_compute[231752]: pentium2-v1 Dec 15 04:32:41 localhost nova_compute[231752]: pentium3 Dec 15 04:32:41 localhost nova_compute[231752]: pentium3-v1 Dec 15 04:32:41 localhost nova_compute[231752]: phenom Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost 
nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: phenom-v1 Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: qemu32 Dec 15 04:32:41 localhost nova_compute[231752]: qemu32-v1 Dec 15 04:32:41 localhost nova_compute[231752]: qemu64 Dec 15 04:32:41 localhost nova_compute[231752]: qemu64-v1 Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: file Dec 15 04:32:41 localhost nova_compute[231752]: anonymous Dec 15 04:32:41 localhost nova_compute[231752]: memfd Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: disk Dec 15 04:32:41 localhost nova_compute[231752]: cdrom Dec 15 04:32:41 localhost nova_compute[231752]: floppy Dec 15 04:32:41 localhost nova_compute[231752]: lun Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: fdc Dec 15 04:32:41 localhost nova_compute[231752]: scsi Dec 15 04:32:41 localhost nova_compute[231752]: virtio Dec 15 04:32:41 localhost nova_compute[231752]: usb Dec 15 04:32:41 localhost nova_compute[231752]: sata Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: virtio Dec 15 04:32:41 localhost nova_compute[231752]: virtio-transitional Dec 15 
04:32:41 localhost nova_compute[231752]: virtio-non-transitional Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: vnc Dec 15 04:32:41 localhost nova_compute[231752]: egl-headless Dec 15 04:32:41 localhost nova_compute[231752]: dbus Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: subsystem Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: default Dec 15 04:32:41 localhost nova_compute[231752]: mandatory Dec 15 04:32:41 localhost nova_compute[231752]: requisite Dec 15 04:32:41 localhost nova_compute[231752]: optional Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: usb Dec 15 04:32:41 localhost nova_compute[231752]: pci Dec 15 04:32:41 localhost nova_compute[231752]: scsi Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: virtio Dec 15 04:32:41 localhost nova_compute[231752]: virtio-transitional Dec 15 04:32:41 localhost nova_compute[231752]: virtio-non-transitional Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: random Dec 15 04:32:41 localhost nova_compute[231752]: egd Dec 15 
04:32:41 localhost nova_compute[231752]: builtin Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: path Dec 15 04:32:41 localhost nova_compute[231752]: handle Dec 15 04:32:41 localhost nova_compute[231752]: virtiofs Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: tpm-tis Dec 15 04:32:41 localhost nova_compute[231752]: tpm-crb Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: emulator Dec 15 04:32:41 localhost nova_compute[231752]: external Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: 2.0 Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: usb Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: pty Dec 15 04:32:41 localhost nova_compute[231752]: unix Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: qemu Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: 
Dec 15 04:32:41 localhost nova_compute[231752]: builtin Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: default Dec 15 04:32:41 localhost nova_compute[231752]: passt Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: isa Dec 15 04:32:41 localhost nova_compute[231752]: hyperv Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: null Dec 15 04:32:41 localhost nova_compute[231752]: vc Dec 15 04:32:41 localhost nova_compute[231752]: pty Dec 15 04:32:41 localhost nova_compute[231752]: dev Dec 15 04:32:41 localhost nova_compute[231752]: file Dec 15 04:32:41 localhost nova_compute[231752]: pipe Dec 15 04:32:41 localhost nova_compute[231752]: stdio Dec 15 04:32:41 localhost nova_compute[231752]: udp Dec 15 04:32:41 localhost nova_compute[231752]: tcp Dec 15 04:32:41 localhost nova_compute[231752]: unix Dec 15 04:32:41 localhost nova_compute[231752]: qemu-vdagent Dec 15 04:32:41 localhost nova_compute[231752]: dbus Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost 
nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: relaxed Dec 15 04:32:41 localhost nova_compute[231752]: vapic Dec 15 04:32:41 localhost nova_compute[231752]: spinlocks Dec 15 04:32:41 localhost nova_compute[231752]: vpindex Dec 15 04:32:41 localhost nova_compute[231752]: runtime Dec 15 04:32:41 localhost nova_compute[231752]: synic Dec 15 04:32:41 localhost nova_compute[231752]: stimer Dec 15 04:32:41 localhost nova_compute[231752]: reset Dec 15 04:32:41 localhost nova_compute[231752]: vendor_id Dec 15 04:32:41 localhost nova_compute[231752]: frequencies Dec 15 04:32:41 localhost nova_compute[231752]: reenlightenment Dec 15 04:32:41 localhost nova_compute[231752]: tlbflush Dec 15 04:32:41 localhost nova_compute[231752]: ipi Dec 15 04:32:41 localhost nova_compute[231752]: avic Dec 15 04:32:41 localhost nova_compute[231752]: emsr_bitmap Dec 15 04:32:41 localhost nova_compute[231752]: xmm_input Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: 4095 Dec 15 04:32:41 localhost nova_compute[231752]: on Dec 15 04:32:41 localhost nova_compute[231752]: off Dec 15 04:32:41 localhost nova_compute[231752]: off Dec 15 04:32:41 localhost nova_compute[231752]: Linux KVM Hv Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: tdx Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: Dec 15 04:32:41 localhost nova_compute[231752]: _get_domain_capabilities 
/usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1037
Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.772 231756 DEBUG nova.virt.libvirt.host [None req-4273134e-490e-4169-bca2-17f951ca8cd9 - - - - - -] Checking secure boot support for host arch (x86_64) supports_secure_boot /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1782
Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.773 231756 DEBUG nova.virt.libvirt.host [None req-4273134e-490e-4169-bca2-17f951ca8cd9 - - - - - -] Checking secure boot support for host arch (x86_64) supports_secure_boot /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1782
Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.773 231756 DEBUG nova.virt.libvirt.host [None req-4273134e-490e-4169-bca2-17f951ca8cd9 - - - - - -] Checking secure boot support for host arch (x86_64) supports_secure_boot /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1782
Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.773 231756 INFO nova.virt.libvirt.host [None req-4273134e-490e-4169-bca2-17f951ca8cd9 - - - - - -] Secure Boot support detected
Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.775 231756 INFO nova.virt.libvirt.driver [None req-4273134e-490e-4169-bca2-17f951ca8cd9 - - - - - -] The live_migration_permit_post_copy is set to True and post copy live migration is available so auto-converge will not be in use.
Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.776 231756 INFO nova.virt.libvirt.driver [None req-4273134e-490e-4169-bca2-17f951ca8cd9 - - - - - -] The live_migration_permit_post_copy is set to True and post copy live migration is available so auto-converge will not be in use.
Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.789 231756 DEBUG nova.virt.libvirt.driver [None req-4273134e-490e-4169-bca2-17f951ca8cd9 - - - - - -]
Enabling emulated TPM support _check_vtpm_support /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:1097
Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.814 231756 INFO nova.virt.node [None req-4273134e-490e-4169-bca2-17f951ca8cd9 - - - - - -] Determined node identity 26c8956b-6742-4951-b566-971b9bbe323b from /var/lib/nova/compute_id
Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.830 231756 DEBUG nova.compute.manager [None req-4273134e-490e-4169-bca2-17f951ca8cd9 - - - - - -] Verified node 26c8956b-6742-4951-b566-971b9bbe323b matches my host np0005559462.localdomain _check_for_host_rename /usr/lib/python3.9/site-packages/nova/compute/manager.py:1568
Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.861 231756 DEBUG nova.compute.manager [None req-4273134e-490e-4169-bca2-17f951ca8cd9 - - - - - -] [instance: 39ff1bd9-6f6b-44c8-bbec-a1fd9d196359] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.865 231756 DEBUG nova.virt.libvirt.vif [None req-4273134e-490e-4169-bca2-17f951ca8cd9 - - - - - -] vif_type=ovs
instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-12-15T08:29:49Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=,disable_terminate=False,display_description='test',display_name='test',ec2_ids=,ephemeral_gb=1,ephemeral_key_uuid=None,fault=,flavor=,hidden=False,host='np0005559462.localdomain',hostname='test',id=2,image_ref='7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e',info_cache=InstanceInfoCache,instance_type_id=2,kernel_id='',key_data=None,key_name=None,keypairs=,launch_index=0,launched_at=2025-12-15T08:30:01Z,launched_on='np0005559462.localdomain',locked=False,locked_by=None,memory_mb=512,metadata={},migration_context=,new_flavor=,node='np0005559462.localdomain',numa_topology=None,old_flavor=,os_type=None,pci_devices=,pci_requests=,power_state=1,progress=0,project_id='c785bf23f53946bc99867d8832a50266',ramdisk_id='',reservation_id='r-e1tbwc05',resources=,root_device_name='/dev/vda',root_gb=1,security_groups=,services=,shutdown_terminate=False,system_metadata=,tags=,task_state=None,terminated_at=None,trusted_certs=,updated_at=2025-12-15T08:30:01Z,user_data=None,user_id='1ba5fce347b64bfebf995f187193f205',uuid=39ff1bd9-6f6b-44c8-bbec-a1fd9d196359,vcpu_model=,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "03ef8889-3216-43fb-8a52-4be17a956ce1", "address": "fa:16:3e:74:df:7c", "network": {"id": "befb7a72-17a9-4bcb-b561-84b8f626685a", "bridge": "br-int", "label": "private", "subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.201", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.20", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"dhcp_server": "192.168.0.1"}}], "meta": {"injected": false, "tenant_id": 
"c785bf23f53946bc99867d8832a50266", "mtu": 1292, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "datapath_type": "system"}, "devname": "tap03ef8889-32", "ovs_interfaceid": "03ef8889-3216-43fb-8a52-4be17a956ce1", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.866 231756 DEBUG nova.network.os_vif_util [None req-4273134e-490e-4169-bca2-17f951ca8cd9 - - - - - -] Converting VIF {"id": "03ef8889-3216-43fb-8a52-4be17a956ce1", "address": "fa:16:3e:74:df:7c", "network": {"id": "befb7a72-17a9-4bcb-b561-84b8f626685a", "bridge": "br-int", "label": "private", "subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.201", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.20", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"dhcp_server": "192.168.0.1"}}], "meta": {"injected": false, "tenant_id": "c785bf23f53946bc99867d8832a50266", "mtu": 1292, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "datapath_type": "system"}, "devname": "tap03ef8889-32", "ovs_interfaceid": "03ef8889-3216-43fb-8a52-4be17a956ce1", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.867 231756 DEBUG nova.network.os_vif_util [None req-4273134e-490e-4169-bca2-17f951ca8cd9 - - - - -
-] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:74:df:7c,bridge_name='br-int',has_traffic_filtering=True,id=03ef8889-3216-43fb-8a52-4be17a956ce1,network=Network(befb7a72-17a9-4bcb-b561-84b8f626685a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap03ef8889-32') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.868 231756 DEBUG os_vif [None req-4273134e-490e-4169-bca2-17f951ca8cd9 - - - - - -] Plugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:74:df:7c,bridge_name='br-int',has_traffic_filtering=True,id=03ef8889-3216-43fb-8a52-4be17a956ce1,network=Network(befb7a72-17a9-4bcb-b561-84b8f626685a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap03ef8889-32') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.933 231756 DEBUG ovsdbapp.backend.ovs_idl [None req-4273134e-490e-4169-bca2-17f951ca8cd9 - - - - - -] Created schema index Interface.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106
Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.934 231756 DEBUG ovsdbapp.backend.ovs_idl [None req-4273134e-490e-4169-bca2-17f951ca8cd9 - - - - - -] Created schema index Port.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106
Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.934 231756 DEBUG ovsdbapp.backend.ovs_idl [None req-4273134e-490e-4169-bca2-17f951ca8cd9 - - - - - -] Created schema index Bridge.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106
Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.934 231756 DEBUG ovsdbapp.backend.ovs_idl.vlog [None
req-4273134e-490e-4169-bca2-17f951ca8cd9 - - - - - -] tcp:127.0.0.1:6640: entering CONNECTING _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.935 231756 DEBUG ovsdbapp.backend.ovs_idl.vlog [None req-4273134e-490e-4169-bca2-17f951ca8cd9 - - - - - -] [POLLOUT] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.935 231756 DEBUG ovsdbapp.backend.ovs_idl.vlog [None req-4273134e-490e-4169-bca2-17f951ca8cd9 - - - - - -] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.935 231756 DEBUG ovsdbapp.backend.ovs_idl.vlog [None req-4273134e-490e-4169-bca2-17f951ca8cd9 - - - - - -] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.937 231756 DEBUG ovsdbapp.backend.ovs_idl.vlog [None req-4273134e-490e-4169-bca2-17f951ca8cd9 - - - - - -] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.940 231756 DEBUG ovsdbapp.backend.ovs_idl.vlog [None req-4273134e-490e-4169-bca2-17f951ca8cd9 - - - - - -] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.954 231756 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.955 231756 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit
/usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.955 231756 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m Dec 15 04:32:41 localhost nova_compute[231752]: 2025-12-15 09:32:41.956 231756 INFO oslo.privsep.daemon [None req-4273134e-490e-4169-bca2-17f951ca8cd9 - - - - - -] Running privsep helper: ['sudo', 'nova-rootwrap', '/etc/nova/rootwrap.conf', 'privsep-helper', '--config-file', '/etc/nova/nova.conf', '--config-file', '/etc/nova/nova-compute.conf', '--config-dir', '/etc/nova/nova.conf.d', '--privsep_context', 'vif_plug_ovs.privsep.vif_plug', '--privsep_sock_path', '/tmp/tmp0eoiqd_2/privsep.sock']#033[00m Dec 15 04:32:42 localhost nova_compute[231752]: 2025-12-15 09:32:42.546 231756 INFO oslo.privsep.daemon [None req-4273134e-490e-4169-bca2-17f951ca8cd9 - - - - - -] Spawned new privsep daemon via rootwrap#033[00m Dec 15 04:32:42 localhost nova_compute[231752]: 2025-12-15 09:32:42.435 232000 INFO oslo.privsep.daemon [-] privsep daemon starting#033[00m Dec 15 04:32:42 localhost nova_compute[231752]: 2025-12-15 09:32:42.440 232000 INFO oslo.privsep.daemon [-] privsep process running with uid/gid: 0/0#033[00m Dec 15 04:32:42 localhost nova_compute[231752]: 2025-12-15 09:32:42.443 232000 INFO oslo.privsep.daemon [-] privsep process running with capabilities (eff/prm/inh): CAP_DAC_OVERRIDE|CAP_NET_ADMIN/CAP_DAC_OVERRIDE|CAP_NET_ADMIN/none#033[00m Dec 15 04:32:42 localhost nova_compute[231752]: 2025-12-15 09:32:42.444 232000 INFO oslo.privsep.daemon [-] privsep daemon running as pid 232000#033[00m Dec 15 04:32:42 localhost nova_compute[231752]: 2025-12-15 09:32:42.831 231756 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 04:32:42 localhost 
nova_compute[231752]: 2025-12-15 09:32:42.832 231756 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap03ef8889-32, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m Dec 15 04:32:42 localhost nova_compute[231752]: 2025-12-15 09:32:42.832 231756 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap03ef8889-32, col_values=(('external_ids', {'iface-id': '03ef8889-3216-43fb-8a52-4be17a956ce1', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:74:df:7c', 'vm-uuid': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m Dec 15 04:32:42 localhost nova_compute[231752]: 2025-12-15 09:32:42.834 231756 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m Dec 15 04:32:42 localhost nova_compute[231752]: 2025-12-15 09:32:42.834 231756 INFO os_vif [None req-4273134e-490e-4169-bca2-17f951ca8cd9 - - - - - -] Successfully plugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:74:df:7c,bridge_name='br-int',has_traffic_filtering=True,id=03ef8889-3216-43fb-8a52-4be17a956ce1,network=Network(befb7a72-17a9-4bcb-b561-84b8f626685a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap03ef8889-32')#033[00m Dec 15 04:32:42 localhost nova_compute[231752]: 2025-12-15 09:32:42.835 231756 DEBUG nova.compute.manager [None req-4273134e-490e-4169-bca2-17f951ca8cd9 - - - - - -] [instance: 39ff1bd9-6f6b-44c8-bbec-a1fd9d196359] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m Dec 15 04:32:42 localhost nova_compute[231752]: 2025-12-15 09:32:42.839 231756 DEBUG nova.compute.manager [None 
req-4273134e-490e-4169-bca2-17f951ca8cd9 - - - - - -] [instance: 39ff1bd9-6f6b-44c8-bbec-a1fd9d196359] Current state is 1, state in DB is 1. _init_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:1304#033[00m Dec 15 04:32:42 localhost nova_compute[231752]: 2025-12-15 09:32:42.839 231756 INFO nova.compute.manager [None req-4273134e-490e-4169-bca2-17f951ca8cd9 - - - - - -] Looking for unclaimed instances stuck in BUILDING status for nodes managed by this host#033[00m Dec 15 04:32:42 localhost nova_compute[231752]: 2025-12-15 09:32:42.917 231756 DEBUG oslo_concurrency.lockutils [None req-4273134e-490e-4169-bca2-17f951ca8cd9 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Dec 15 04:32:42 localhost nova_compute[231752]: 2025-12-15 09:32:42.917 231756 DEBUG oslo_concurrency.lockutils [None req-4273134e-490e-4169-bca2-17f951ca8cd9 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Dec 15 04:32:42 localhost nova_compute[231752]: 2025-12-15 09:32:42.918 231756 DEBUG oslo_concurrency.lockutils [None req-4273134e-490e-4169-bca2-17f951ca8cd9 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Dec 15 04:32:42 localhost nova_compute[231752]: 2025-12-15 09:32:42.918 231756 DEBUG nova.compute.resource_tracker [None req-4273134e-490e-4169-bca2-17f951ca8cd9 - - - - - -] Auditing locally available compute resources for np0005559462.localdomain (node: np0005559462.localdomain) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m 
Dec 15 04:32:42 localhost nova_compute[231752]: 2025-12-15 09:32:42.919 231756 DEBUG oslo_concurrency.processutils [None req-4273134e-490e-4169-bca2-17f951ca8cd9 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m Dec 15 04:32:43 localhost nova_compute[231752]: 2025-12-15 09:32:43.381 231756 DEBUG oslo_concurrency.processutils [None req-4273134e-490e-4169-bca2-17f951ca8cd9 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.462s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m Dec 15 04:32:43 localhost nova_compute[231752]: 2025-12-15 09:32:43.443 231756 DEBUG nova.virt.libvirt.driver [None req-4273134e-490e-4169-bca2-17f951ca8cd9 - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m Dec 15 04:32:43 localhost nova_compute[231752]: 2025-12-15 09:32:43.443 231756 DEBUG nova.virt.libvirt.driver [None req-4273134e-490e-4169-bca2-17f951ca8cd9 - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m Dec 15 04:32:43 localhost nova_compute[231752]: 2025-12-15 09:32:43.657 231756 WARNING nova.virt.libvirt.driver [None req-4273134e-490e-4169-bca2-17f951ca8cd9 - - - - - -] This host appears to have multiple sockets per NUMA node. 
The `socket` PCI NUMA affinity will not be supported.#033[00m Dec 15 04:32:43 localhost nova_compute[231752]: 2025-12-15 09:32:43.659 231756 DEBUG nova.compute.resource_tracker [None req-4273134e-490e-4169-bca2-17f951ca8cd9 - - - - - -] Hypervisor/Node resource view: name=np0005559462.localdomain free_ram=12934MB free_disk=41.83720779418945GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": 
"1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m Dec 15 04:32:43 localhost nova_compute[231752]: 2025-12-15 09:32:43.659 231756 DEBUG oslo_concurrency.lockutils [None req-4273134e-490e-4169-bca2-17f951ca8cd9 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Dec 15 04:32:43 localhost nova_compute[231752]: 2025-12-15 09:32:43.660 231756 DEBUG oslo_concurrency.lockutils [None req-4273134e-490e-4169-bca2-17f951ca8cd9 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Dec 15 04:32:43 localhost nova_compute[231752]: 2025-12-15 09:32:43.763 231756 DEBUG nova.compute.resource_tracker [None req-4273134e-490e-4169-bca2-17f951ca8cd9 - - - - - -] Instance 39ff1bd9-6f6b-44c8-bbec-a1fd9d196359 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 2, 'MEMORY_MB': 512, 'VCPU': 1}}. 
_remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m Dec 15 04:32:43 localhost nova_compute[231752]: 2025-12-15 09:32:43.764 231756 DEBUG nova.compute.resource_tracker [None req-4273134e-490e-4169-bca2-17f951ca8cd9 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m Dec 15 04:32:43 localhost nova_compute[231752]: 2025-12-15 09:32:43.764 231756 DEBUG nova.compute.resource_tracker [None req-4273134e-490e-4169-bca2-17f951ca8cd9 - - - - - -] Final resource view: name=np0005559462.localdomain phys_ram=15738MB used_ram=1024MB phys_disk=41GB used_disk=2GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m Dec 15 04:32:43 localhost nova_compute[231752]: 2025-12-15 09:32:43.810 231756 DEBUG nova.scheduler.client.report [None req-4273134e-490e-4169-bca2-17f951ca8cd9 - - - - - -] Refreshing inventories for resource provider 26c8956b-6742-4951-b566-971b9bbe323b _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804#033[00m Dec 15 04:32:43 localhost nova_compute[231752]: 2025-12-15 09:32:43.855 231756 DEBUG nova.scheduler.client.report [None req-4273134e-490e-4169-bca2-17f951ca8cd9 - - - - - -] Updating ProviderTree inventory for provider 26c8956b-6742-4951-b566-971b9bbe323b from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 0, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768#033[00m Dec 15 
04:32:43 localhost nova_compute[231752]: 2025-12-15 09:32:43.856 231756 DEBUG nova.compute.provider_tree [None req-4273134e-490e-4169-bca2-17f951ca8cd9 - - - - - -] Updating inventory in ProviderTree for provider 26c8956b-6742-4951-b566-971b9bbe323b with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 0, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176#033[00m Dec 15 04:32:43 localhost nova_compute[231752]: 2025-12-15 09:32:43.871 231756 DEBUG nova.scheduler.client.report [None req-4273134e-490e-4169-bca2-17f951ca8cd9 - - - - - -] Refreshing aggregate associations for resource provider 26c8956b-6742-4951-b566-971b9bbe323b, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813#033[00m Dec 15 04:32:43 localhost nova_compute[231752]: 2025-12-15 09:32:43.891 231756 DEBUG nova.scheduler.client.report [None req-4273134e-490e-4169-bca2-17f951ca8cd9 - - - - - -] Refreshing trait associations for resource provider 26c8956b-6742-4951-b566-971b9bbe323b, traits: 
COMPUTE_RESCUE_BFV,COMPUTE_IMAGE_TYPE_ISO,HW_CPU_X86_F16C,HW_CPU_X86_SVM,COMPUTE_ACCELERATORS,HW_CPU_X86_ABM,COMPUTE_NET_VIF_MODEL_PCNET,COMPUTE_GRAPHICS_MODEL_VIRTIO,HW_CPU_X86_AVX2,HW_CPU_X86_BMI2,HW_CPU_X86_SSE42,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,HW_CPU_X86_FMA3,HW_CPU_X86_SHA,COMPUTE_STORAGE_BUS_IDE,HW_CPU_X86_SSE2,COMPUTE_IMAGE_TYPE_QCOW2,HW_CPU_X86_AMD_SVM,COMPUTE_NET_VIF_MODEL_LAN9118,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_STORAGE_BUS_USB,COMPUTE_STORAGE_BUS_FDC,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_SECURITY_UEFI_SECURE_BOOT,HW_CPU_X86_AVX,HW_CPU_X86_MMX,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_IMAGE_TYPE_AKI,HW_CPU_X86_SSSE3,HW_CPU_X86_CLMUL,COMPUTE_STORAGE_BUS_SCSI,HW_CPU_X86_SSE,COMPUTE_DEVICE_TAGGING,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_IMAGE_TYPE_RAW,HW_CPU_X86_AESNI,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_VOLUME_EXTEND,COMPUTE_NET_VIF_MODEL_VMXNET3,HW_CPU_X86_SSE41,COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_TRUSTED_CERTS,COMPUTE_NET_VIF_MODEL_RTL8139,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_STORAGE_BUS_SATA,COMPUTE_NET_VIF_MODEL_E1000E,COMPUTE_IMAGE_TYPE_ARI,COMPUTE_STORAGE_BUS_VIRTIO,COMPUTE_NODE,HW_CPU_X86_BMI,HW_CPU_X86_SSE4A _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825#033[00m Dec 15 04:32:43 localhost nova_compute[231752]: 2025-12-15 09:32:43.928 231756 DEBUG oslo_concurrency.processutils [None req-4273134e-490e-4169-bca2-17f951ca8cd9 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m Dec 15 04:32:44 localhost nova_compute[231752]: 2025-12-15 09:32:44.379 231756 DEBUG oslo_concurrency.processutils [None req-4273134e-490e-4169-bca2-17f951ca8cd9 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 
in 0.450s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m Dec 15 04:32:44 localhost nova_compute[231752]: 2025-12-15 09:32:44.383 231756 DEBUG nova.virt.libvirt.host [None req-4273134e-490e-4169-bca2-17f951ca8cd9 - - - - - -] /sys/module/kvm_amd/parameters/sev contains [N Dec 15 04:32:44 localhost nova_compute[231752]: ] _kernel_supports_amd_sev /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1803#033[00m Dec 15 04:32:44 localhost nova_compute[231752]: 2025-12-15 09:32:44.384 231756 INFO nova.virt.libvirt.host [None req-4273134e-490e-4169-bca2-17f951ca8cd9 - - - - - -] kernel doesn't support AMD SEV#033[00m Dec 15 04:32:44 localhost nova_compute[231752]: 2025-12-15 09:32:44.384 231756 DEBUG nova.compute.provider_tree [None req-4273134e-490e-4169-bca2-17f951ca8cd9 - - - - - -] Updating inventory in ProviderTree for provider 26c8956b-6742-4951-b566-971b9bbe323b with inventory: {'MEMORY_MB': {'total': 15738, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0, 'reserved': 512}, 'VCPU': {'total': 8, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0, 'reserved': 0}, 'DISK_GB': {'total': 41, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0, 'reserved': 1}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176#033[00m Dec 15 04:32:44 localhost nova_compute[231752]: 2025-12-15 09:32:44.385 231756 DEBUG nova.virt.libvirt.driver [None req-4273134e-490e-4169-bca2-17f951ca8cd9 - - - - - -] CPU mode 'host-model' models '' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m Dec 15 04:32:44 localhost nova_compute[231752]: 2025-12-15 09:32:44.436 231756 DEBUG nova.scheduler.client.report [None req-4273134e-490e-4169-bca2-17f951ca8cd9 - - - - - -] Updated inventory for provider 26c8956b-6742-4951-b566-971b9bbe323b with generation 3 in Placement from 
set_inventory_for_provider using data: {'MEMORY_MB': {'total': 15738, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0, 'reserved': 512}, 'VCPU': {'total': 8, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0, 'reserved': 0}, 'DISK_GB': {'total': 41, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0, 'reserved': 1}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:957#033[00m Dec 15 04:32:44 localhost nova_compute[231752]: 2025-12-15 09:32:44.436 231756 DEBUG nova.compute.provider_tree [None req-4273134e-490e-4169-bca2-17f951ca8cd9 - - - - - -] Updating resource provider 26c8956b-6742-4951-b566-971b9bbe323b generation from 3 to 4 during operation: update_inventory _update_generation /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:164#033[00m Dec 15 04:32:44 localhost nova_compute[231752]: 2025-12-15 09:32:44.436 231756 DEBUG nova.compute.provider_tree [None req-4273134e-490e-4169-bca2-17f951ca8cd9 - - - - - -] Updating inventory in ProviderTree for provider 26c8956b-6742-4951-b566-971b9bbe323b with inventory: {'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176#033[00m Dec 15 04:32:44 localhost nova_compute[231752]: 2025-12-15 09:32:44.523 231756 DEBUG nova.compute.provider_tree [None req-4273134e-490e-4169-bca2-17f951ca8cd9 - - - - - -] Updating resource provider 26c8956b-6742-4951-b566-971b9bbe323b generation from 4 to 5 during operation: update_traits _update_generation /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:164#033[00m Dec 15 04:32:44 localhost 
nova_compute[231752]: 2025-12-15 09:32:44.553 231756 DEBUG nova.compute.resource_tracker [None req-4273134e-490e-4169-bca2-17f951ca8cd9 - - - - - -] Compute_service record updated for np0005559462.localdomain:np0005559462.localdomain _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m Dec 15 04:32:44 localhost nova_compute[231752]: 2025-12-15 09:32:44.554 231756 DEBUG oslo_concurrency.lockutils [None req-4273134e-490e-4169-bca2-17f951ca8cd9 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.894s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Dec 15 04:32:44 localhost nova_compute[231752]: 2025-12-15 09:32:44.554 231756 DEBUG nova.service [None req-4273134e-490e-4169-bca2-17f951ca8cd9 - - - - - -] Creating RPC server for service compute start /usr/lib/python3.9/site-packages/nova/service.py:182#033[00m Dec 15 04:32:44 localhost nova_compute[231752]: 2025-12-15 09:32:44.607 231756 DEBUG nova.service [None req-4273134e-490e-4169-bca2-17f951ca8cd9 - - - - - -] Join ServiceGroup membership for this service compute start /usr/lib/python3.9/site-packages/nova/service.py:199#033[00m Dec 15 04:32:44 localhost nova_compute[231752]: 2025-12-15 09:32:44.607 231756 DEBUG nova.servicegroup.drivers.db [None req-4273134e-490e-4169-bca2-17f951ca8cd9 - - - - - -] DB_Driver: join new ServiceGroup member np0005559462.localdomain to the compute group, service = join /usr/lib/python3.9/site-packages/nova/servicegroup/drivers/db.py:44#033[00m Dec 15 04:32:46 localhost nova_compute[231752]: 2025-12-15 09:32:46.050 231756 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 04:32:46 localhost nova_compute[231752]: 2025-12-15 09:32:46.938 231756 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup 
/usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 04:32:47 localhost sshd[232048]: main: sshd: ssh-rsa algorithm is disabled Dec 15 04:32:47 localhost systemd-logind[763]: New session 55 of user zuul. Dec 15 04:32:47 localhost systemd[1]: Started Session 55 of User zuul. Dec 15 04:32:47 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:fc:1f:13 MACDST=fa:16:3e:b3:bb:e3 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=62367 DF PROTO=TCP SPT=35152 DPT=9882 SEQ=386127919 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A839D39250000000001030307) Dec 15 04:32:48 localhost python3.9[232159]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d Dec 15 04:32:49 localhost python3.9[232273]: ansible-ansible.builtin.systemd_service Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None Dec 15 04:32:49 localhost systemd[1]: Started /usr/bin/podman healthcheck run ac2eae30a2a7f971398af42ea67b3ed799d4b3341f7688ebcd73f7ca7f66c2a8. Dec 15 04:32:49 localhost systemd[1]: Reloading. 
Dec 15 04:32:50 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:fc:1f:13 MACDST=fa:16:3e:b3:bb:e3 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=50591 DF PROTO=TCP SPT=33002 DPT=9105 SEQ=1064622596 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A839D43250000000001030307) Dec 15 04:32:50 localhost podman[232275]: 2025-12-15 09:32:50.048028428 +0000 UTC m=+0.095827119 container health_status ac2eae30a2a7f971398af42ea67b3ed799d4b3341f7688ebcd73f7ca7f66c2a8 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '07f3af15d79276418f54a8dde2fc251064527c3571430e14af77aaa2c01f1193'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2) Dec 15 04:32:50 localhost systemd-sysv-generator[232315]: SysV service '/etc/rc.d/init.d/network' 
lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. Dec 15 04:32:50 localhost systemd-rc-local-generator[232312]: /etc/rc.d/rc.local is not marked executable, skipping. Dec 15 04:32:50 localhost systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload Dec 15 04:32:50 localhost systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload Dec 15 04:32:50 localhost systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload Dec 15 04:32:50 localhost systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload Dec 15 04:32:50 localhost podman[232275]: 2025-12-15 09:32:50.109396522 +0000 UTC m=+0.157195253 container exec_died ac2eae30a2a7f971398af42ea67b3ed799d4b3341f7688ebcd73f7ca7f66c2a8 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, tcib_managed=true, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '07f3af15d79276418f54a8dde2fc251064527c3571430e14af77aaa2c01f1193'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 
'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}) Dec 15 04:32:50 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Dec 15 04:32:50 localhost systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload Dec 15 04:32:50 localhost systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload Dec 15 04:32:50 localhost systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload Dec 15 04:32:50 localhost systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload Dec 15 04:32:50 localhost systemd[1]: ac2eae30a2a7f971398af42ea67b3ed799d4b3341f7688ebcd73f7ca7f66c2a8.service: Deactivated successfully. Dec 15 04:32:51 localhost nova_compute[231752]: 2025-12-15 09:32:51.053 231756 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 04:32:51 localhost python3.9[232441]: ansible-ansible.builtin.service_facts Invoked Dec 15 04:32:51 localhost network[232458]: You are using 'network' service provided by 'network-scripts', which are now deprecated. Dec 15 04:32:51 localhost network[232459]: 'network-scripts' will be removed from distribution in near future. Dec 15 04:32:51 localhost network[232460]: It is advised to switch to 'NetworkManager' instead for network management. 
Dec 15 04:32:51 localhost ovn_metadata_agent[160585]: 2025-12-15 09:32:51.435 160590 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Dec 15 04:32:51 localhost ovn_metadata_agent[160585]: 2025-12-15 09:32:51.435 160590 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Dec 15 04:32:51 localhost ovn_metadata_agent[160585]: 2025-12-15 09:32:51.437 160590 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Dec 15 04:32:51 localhost nova_compute[231752]: 2025-12-15 09:32:51.940 231756 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 04:32:52 localhost systemd[1]: /usr/lib/systemd/system/insights-client.service:23: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. 
Dec 15 04:32:53 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:fc:1f:13 MACDST=fa:16:3e:b3:bb:e3 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=60403 DF PROTO=TCP SPT=48968 DPT=9100 SEQ=2344630221 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A839D4FC20000000001030307)
Dec 15 04:32:54 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:fc:1f:13 MACDST=fa:16:3e:b3:bb:e3 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=60404 DF PROTO=TCP SPT=48968 DPT=9100 SEQ=2344630221 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A839D53E50000000001030307)
Dec 15 04:32:56 localhost nova_compute[231752]: 2025-12-15 09:32:56.056 231756 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 15 04:32:56 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:fc:1f:13 MACDST=fa:16:3e:b3:bb:e3 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=60405 DF PROTO=TCP SPT=48968 DPT=9100 SEQ=2344630221 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A839D5BE50000000001030307)
Dec 15 04:32:56 localhost nova_compute[231752]: 2025-12-15 09:32:56.986 231756 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 15 04:32:58 localhost python3.9[232695]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_ceilometer_agent_compute.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 15 04:32:58 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:fc:1f:13 MACDST=fa:16:3e:b3:bb:e3 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=61523 DF PROTO=TCP SPT=59906 DPT=9101 SEQ=3566699330 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A839D65E50000000001030307)
Dec 15 04:32:59 localhost python3.9[232806]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_ceilometer_agent_compute.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 15 04:32:59 localhost systemd-journald[47230]: Field hash table of /run/log/journal/738a39f68bc78fb81032e509449fb759/system.journal has a fill level at 76.6 (255 of 333 items), suggesting rotation.
Dec 15 04:32:59 localhost systemd-journald[47230]: /run/log/journal/738a39f68bc78fb81032e509449fb759/system.journal: Journal header limits reached or header out-of-date, rotating.
Dec 15 04:32:59 localhost rsyslogd[759]: imjournal: journal files changed, reloading... [v8.2102.0-111.el9 try https://www.rsyslog.com/e/0 ]
Dec 15 04:32:59 localhost rsyslogd[759]: imjournal: journal files changed, reloading... [v8.2102.0-111.el9 try https://www.rsyslog.com/e/0 ]
Dec 15 04:32:59 localhost python3.9[232917]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_ceilometer_agent_compute.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 15 04:33:00 localhost systemd[1]: Started /usr/bin/podman healthcheck run 4d46c406a3855fd1c322e5d86c048188ea5865daa07ba23aaebacc7879668923.
Dec 15 04:33:00 localhost podman[233023]: 2025-12-15 09:33:00.758047413 +0000 UTC m=+0.084224861 container health_status 4d46c406a3855fd1c322e5d86c048188ea5865daa07ba23aaebacc7879668923 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '07f3af15d79276418f54a8dde2fc251064527c3571430e14af77aaa2c01f1193-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Dec 15 04:33:00 localhost podman[233023]: 2025-12-15 09:33:00.791321696 +0000 UTC m=+0.117499134 container exec_died 4d46c406a3855fd1c322e5d86c048188ea5865daa07ba23aaebacc7879668923 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '07f3af15d79276418f54a8dde2fc251064527c3571430e14af77aaa2c01f1193-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.build-date=20251202)
Dec 15 04:33:00 localhost systemd[1]: 4d46c406a3855fd1c322e5d86c048188ea5865daa07ba23aaebacc7879668923.service: Deactivated successfully.
Dec 15 04:33:00 localhost python3.9[233038]: ansible-ansible.legacy.command Invoked with _raw_params=if systemctl is-active certmonger.service; then#012 systemctl disable --now certmonger.service#012 test -f /etc/systemd/system/certmonger.service || systemctl mask certmonger.service#012fi#012 _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 15 04:33:01 localhost nova_compute[231752]: 2025-12-15 09:33:01.059 231756 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 15 04:33:01 localhost python3.9[233154]: ansible-ansible.builtin.find Invoked with file_type=any hidden=True paths=['/var/lib/certmonger/requests'] patterns=[] read_whole_file=False age_stamp=mtime recurse=False follow=False get_checksum=False checksum_algorithm=sha1 use_regex=False exact_mode=True excludes=None contains=None age=None size=None depth=None mode=None encoding=None limit=None
Dec 15 04:33:01 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:fc:1f:13 MACDST=fa:16:3e:b3:bb:e3 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=9549 DF PROTO=TCP SPT=35102 DPT=9882 SEQ=1482552060 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A839D71690000000001030307)
Dec 15 04:33:02 localhost nova_compute[231752]: 2025-12-15 09:33:02.017 231756 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 15 04:33:02 localhost python3.9[233264]: ansible-ansible.builtin.systemd_service Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Dec 15 04:33:02 localhost systemd[1]: Reloading.
Dec 15 04:33:02 localhost systemd-rc-local-generator[233288]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 15 04:33:02 localhost systemd-sysv-generator[233291]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 15 04:33:02 localhost systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 15 04:33:02 localhost systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload
Dec 15 04:33:02 localhost systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 15 04:33:02 localhost systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 15 04:33:02 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Dec 15 04:33:02 localhost systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload
Dec 15 04:33:02 localhost systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 15 04:33:02 localhost systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 15 04:33:02 localhost systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload
Dec 15 04:33:03 localhost python3.9[233410]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_ceilometer_agent_compute.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 15 04:33:04 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:fc:1f:13 MACDST=fa:16:3e:b3:bb:e3 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=9551 DF PROTO=TCP SPT=35102 DPT=9882 SEQ=1482552060 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A839D7D650000000001030307)
Dec 15 04:33:05 localhost python3.9[233521]: ansible-ansible.builtin.file Invoked with group=zuul mode=0750 owner=zuul path=/var/lib/openstack/telemetry recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec 15 04:33:06 localhost nova_compute[231752]: 2025-12-15 09:33:06.061 231756 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 15 04:33:06 localhost python3.9[233629]: ansible-ansible.builtin.stat Invoked with path=/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec 15 04:33:06 localhost python3.9[233741]: ansible-ansible.builtin.group Invoked with name=libvirt state=present force=False system=False local=False non_unique=False gid=None gid_min=None gid_max=None
Dec 15 04:33:07 localhost nova_compute[231752]: 2025-12-15 09:33:07.038 231756 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 15 04:33:07 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:fc:1f:13 MACDST=fa:16:3e:b3:bb:e3 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=53079 DF PROTO=TCP SPT=58982 DPT=9105 SEQ=3368063327 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A839D89250000000001030307)
Dec 15 04:33:08 localhost python3.9[233851]: ansible-ansible.builtin.getent Invoked with database=passwd key=ceilometer fail_key=True service=None split=None
Dec 15 04:33:08 localhost python3.9[233962]: ansible-ansible.builtin.group Invoked with gid=42405 name=ceilometer state=present force=False system=False local=False non_unique=False gid_min=None gid_max=None
Dec 15 04:33:09 localhost systemd[1]: Started /usr/bin/podman healthcheck run 9f0f481c937606298203797a71924b7614a5d9e3f4a8afd837cfa5b9602aedeb.
Dec 15 04:33:09 localhost podman[234024]: 2025-12-15 09:33:09.743527944 +0000 UTC m=+0.077913799 container health_status 9f0f481c937606298203797a71924b7614a5d9e3f4a8afd837cfa5b9602aedeb (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, org.label-schema.vendor=CentOS, config_id=multipathd, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Dec 15 04:33:09 localhost podman[234024]: 2025-12-15 09:33:09.759308107 +0000 UTC m=+0.093693982 container exec_died 9f0f481c937606298203797a71924b7614a5d9e3f4a8afd837cfa5b9602aedeb (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, org.label-schema.schema-version=1.0, container_name=multipathd, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, tcib_managed=true, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, config_id=multipathd)
Dec 15 04:33:09 localhost systemd[1]: 9f0f481c937606298203797a71924b7614a5d9e3f4a8afd837cfa5b9602aedeb.service: Deactivated successfully.
Dec 15 04:33:10 localhost python3.9[234097]: ansible-ansible.builtin.user Invoked with comment=ceilometer user group=ceilometer groups=['libvirt'] name=ceilometer shell=/sbin/nologin state=present uid=42405 non_unique=False force=False remove=False create_home=True system=False move_home=False append=False ssh_key_bits=0 ssh_key_type=rsa ssh_key_comment=ansible-generated on np0005559462.localdomain update_password=always home=None password=NOT_LOGGING_PARAMETER login_class=None password_expire_max=None password_expire_min=None password_expire_warn=None hidden=None seuser=None skeleton=None generate_ssh_key=None ssh_key_file=None ssh_key_passphrase=NOT_LOGGING_PARAMETER expires=None password_lock=None local=None profile=None authorization=None role=None umask=None password_expire_account_disable=None uid_min=None uid_max=None
Dec 15 04:33:11 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:fc:1f:13 MACDST=fa:16:3e:b3:bb:e3 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=61525 DF PROTO=TCP SPT=59906 DPT=9101 SEQ=3566699330 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A839D95250000000001030307)
Dec 15 04:33:11 localhost nova_compute[231752]: 2025-12-15 09:33:11.063 231756 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 15 04:33:11 localhost python3.9[234213]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/telemetry/ceilometer.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 15 04:33:12 localhost nova_compute[231752]: 2025-12-15 09:33:12.041 231756 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 15 04:33:12 localhost python3.9[234299]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/telemetry/ceilometer.conf mode=0640 remote_src=False src=/home/zuul/.ansible/tmp/ansible-tmp-1765791191.209441-518-178648427435400/.source.conf _original_basename=ceilometer.conf follow=False checksum=9b76e570b72b8cf17b7ed56797f0f9f2b7e66cb5 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 15 04:33:12 localhost python3.9[234407]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/telemetry/polling.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 15 04:33:13 localhost python3.9[234493]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/telemetry/polling.yaml mode=0640 remote_src=False src=/home/zuul/.ansible/tmp/ansible-tmp-1765791192.3932574-518-156107656320501/.source.yaml _original_basename=polling.yaml follow=False checksum=6c8680a286285f2e0ef9fa528ca754765e5ed0e5 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 15 04:33:13 localhost python3.9[234601]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/telemetry/custom.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 15 04:33:14 localhost python3.9[234687]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/telemetry/custom.conf mode=0640 remote_src=False src=/home/zuul/.ansible/tmp/ansible-tmp-1765791193.5150776-518-279944995236582/.source.conf _original_basename=custom.conf follow=False checksum=838b8b0a7d7f72e55ab67d39f32e3cb3eca2139b backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 15 04:33:15 localhost python3.9[234795]: ansible-ansible.builtin.stat Invoked with path=/var/lib/openstack/certs/telemetry/default/tls.crt follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec 15 04:33:15 localhost python3.9[234903]: ansible-ansible.builtin.stat Invoked with path=/var/lib/openstack/certs/telemetry/default/tls.key follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec 15 04:33:16 localhost nova_compute[231752]: 2025-12-15 09:33:16.066 231756 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 15 04:33:16 localhost python3.9[235011]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/telemetry/ceilometer-host-specific.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 15 04:33:17 localhost nova_compute[231752]: 2025-12-15 09:33:17.042 231756 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 15 04:33:17 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:fc:1f:13 MACDST=fa:16:3e:b3:bb:e3 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=9553 DF PROTO=TCP SPT=35102 DPT=9882 SEQ=1482552060 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A839DAD250000000001030307)
Dec 15 04:33:17 localhost python3.9[235097]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/telemetry/ceilometer-host-specific.conf mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1765791195.8349516-695-52505003836837/.source.conf follow=False _original_basename=ceilometer-host-specific.conf.j2 checksum=667c035368a6c09951153764dbe63ca405b5a741 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Dec 15 04:33:18 localhost python3.9[235205]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/telemetry/openstack_network_exporter.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 15 04:33:19 localhost python3.9[235291]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/telemetry/openstack_network_exporter.yaml mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1765791197.7467415-695-45434447615697/.source.yaml follow=False _original_basename=openstack_network_exporter.yaml.j2 checksum=b056dcaaba7624b93826bb95ee9e82f81bde6c72 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Dec 15 04:33:19 localhost python3.9[235399]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/telemetry/firewall.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 15 04:33:20 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:fc:1f:13 MACDST=fa:16:3e:b3:bb:e3 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=53081 DF PROTO=TCP SPT=58982 DPT=9105 SEQ=3368063327 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A839DB9250000000001030307)
Dec 15 04:33:20 localhost python3.9[235485]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/telemetry/firewall.yaml mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1765791199.428152-782-192284812333962/.source.yaml _original_basename=firewall.yaml follow=False checksum=d942d984493b214bda2913f753ff68cdcedff00e backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Dec 15 04:33:20 localhost systemd[1]: Started /usr/bin/podman healthcheck run ac2eae30a2a7f971398af42ea67b3ed799d4b3341f7688ebcd73f7ca7f66c2a8.
Dec 15 04:33:20 localhost podman[235503]: 2025-12-15 09:33:20.752513898 +0000 UTC m=+0.083074690 container health_status ac2eae30a2a7f971398af42ea67b3ed799d4b3341f7688ebcd73f7ca7f66c2a8 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, org.label-schema.build-date=20251202, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '07f3af15d79276418f54a8dde2fc251064527c3571430e14af77aaa2c01f1193'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.vendor=CentOS, container_name=ovn_controller, managed_by=edpm_ansible, tcib_managed=true)
Dec 15 04:33:20 localhost podman[235503]: 2025-12-15 09:33:20.796368791 +0000 UTC m=+0.126929643 container exec_died ac2eae30a2a7f971398af42ea67b3ed799d4b3341f7688ebcd73f7ca7f66c2a8 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, managed_by=edpm_ansible, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '07f3af15d79276418f54a8dde2fc251064527c3571430e14af77aaa2c01f1193'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, config_id=ovn_controller)
Dec 15 04:33:20 localhost systemd[1]: ac2eae30a2a7f971398af42ea67b3ed799d4b3341f7688ebcd73f7ca7f66c2a8.service: Deactivated successfully.
Dec 15 04:33:21 localhost nova_compute[231752]: 2025-12-15 09:33:21.070 231756 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 15 04:33:21 localhost python3.9[235617]: ansible-ansible.builtin.stat Invoked with path=/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec 15 04:33:21 localhost python3.9[235727]: ansible-ansible.builtin.stat Invoked with path=/var/lib/openstack/certs/telemetry/default/tls.crt follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec 15 04:33:22 localhost nova_compute[231752]: 2025-12-15 09:33:22.045 231756 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 15 04:33:22 localhost python3.9[235835]: ansible-ansible.builtin.stat Invoked with path=/var/lib/openstack/certs/telemetry/default/tls.key follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec 15 04:33:23 localhost python3.9[235945]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/openstack/healthchecks setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec 15 04:33:23 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:fc:1f:13 MACDST=fa:16:3e:b3:bb:e3 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=62920 DF PROTO=TCP SPT=38406 DPT=9100 SEQ=88939030 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A839DC63A0000000001030307)
Dec 15 04:33:23 localhost python3.9[236055]: ansible-ansible.builtin.systemd_service Invoked with enabled=True name=podman.socket state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 15 04:33:23 localhost systemd[1]: Reloading.
Dec 15 04:33:24 localhost systemd-rc-local-generator[236083]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 15 04:33:24 localhost systemd-sysv-generator[236088]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 15 04:33:24 localhost systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 15 04:33:24 localhost systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload
Dec 15 04:33:24 localhost systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 15 04:33:24 localhost systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 15 04:33:24 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Dec 15 04:33:24 localhost systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload
Dec 15 04:33:24 localhost systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 15 04:33:24 localhost systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 15 04:33:24 localhost systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload
Dec 15 04:33:24 localhost systemd[1]: Listening on Podman API Socket.
Dec 15 04:33:24 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:fc:1f:13 MACDST=fa:16:3e:b3:bb:e3 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=62921 DF PROTO=TCP SPT=38406 DPT=9100 SEQ=88939030 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A839DCA250000000001030307)
Dec 15 04:33:26 localhost nova_compute[231752]: 2025-12-15 09:33:26.073 231756 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 15 04:33:26 localhost python3.9[236206]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/healthchecks/ceilometer_agent_compute/healthcheck follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 15 04:33:26 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:fc:1f:13 MACDST=fa:16:3e:b3:bb:e3 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=26355 DF PROTO=TCP SPT=35920 DPT=9102 SEQ=2603980847 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A839DD1260000000001030307)
Dec 15 04:33:26 localhost python3.9[236294]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/healthchecks/ceilometer_agent_compute/ group=zuul mode=0700 owner=zuul setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1765791205.6773415-959-32874184327767/.source _original_basename=healthcheck follow=False checksum=ebb343c21fce35a02591a9351660cb7035a47d42 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Dec 15 04:33:27 localhost nova_compute[231752]: 2025-12-15 09:33:27.046 231756 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 15 04:33:27 localhost python3.9[236349]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/healthchecks/ceilometer_agent_compute/healthcheck.future follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 15 04:33:27 localhost python3.9[236437]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/healthchecks/ceilometer_agent_compute/ group=zuul mode=0700 owner=zuul setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1765791205.6773415-959-32874184327767/.source.future _original_basename=healthcheck.future follow=False checksum=d500a98192f4ddd70b4dfdc059e2d81aed36a294 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Dec 15 04:33:28 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:fc:1f:13 MACDST=fa:16:3e:b3:bb:e3 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=15931 DF PROTO=TCP SPT=56488 DPT=9101 SEQ=3362799351 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A839DDAE50000000001030307)
Dec 15 04:33:29 localhost python3.9[236547]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/edpm-config recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 15 04:33:30 localhost python3.9[236657]: ansible-ansible.builtin.file Invoked with path=/var/lib/kolla/config_files recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Dec 15 04:33:31 localhost nova_compute[231752]: 2025-12-15 09:33:31.076 231756 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 15 04:33:31 localhost systemd[1]: Started /usr/bin/podman healthcheck run 4d46c406a3855fd1c322e5d86c048188ea5865daa07ba23aaebacc7879668923.
Dec 15 04:33:31 localhost podman[236768]: 2025-12-15 09:33:31.469961196 +0000 UTC m=+0.080853478 container health_status 4d46c406a3855fd1c322e5d86c048188ea5865daa07ba23aaebacc7879668923 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, container_name=ovn_metadata_agent, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, config_id=ovn_metadata_agent, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '07f3af15d79276418f54a8dde2fc251064527c3571430e14af77aaa2c01f1193-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}) Dec 15 04:33:31 localhost 
podman[236768]: 2025-12-15 09:33:31.500760761 +0000 UTC m=+0.111653013 container exec_died 4d46c406a3855fd1c322e5d86c048188ea5865daa07ba23aaebacc7879668923 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, managed_by=edpm_ansible, org.label-schema.build-date=20251202, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '07f3af15d79276418f54a8dde2fc251064527c3571430e14af77aaa2c01f1193-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3) Dec 15 04:33:31 localhost systemd[1]: 
4d46c406a3855fd1c322e5d86c048188ea5865daa07ba23aaebacc7879668923.service: Deactivated successfully. Dec 15 04:33:31 localhost python3.9[236767]: ansible-ansible.legacy.stat Invoked with path=/var/lib/kolla/config_files/ceilometer_agent_compute.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Dec 15 04:33:31 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:fc:1f:13 MACDST=fa:16:3e:b3:bb:e3 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=18186 DF PROTO=TCP SPT=46342 DPT=9882 SEQ=321014561 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A839DE6990000000001030307) Dec 15 04:33:32 localhost nova_compute[231752]: 2025-12-15 09:33:32.048 231756 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 04:33:32 localhost python3.9[236875]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/kolla/config_files/ceilometer_agent_compute.json mode=0600 src=/home/zuul/.ansible/tmp/ansible-tmp-1765791211.1102078-1103-5580996621245/.source.json _original_basename=.wym5lzpm follow=False checksum=ce2b0c83293a970bafffa087afa083dd7c93a79c backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Dec 15 04:33:32 localhost python3.9[236983]: ansible-ansible.builtin.file Invoked with mode=0755 path=/var/lib/edpm-config/container-startup-config/ceilometer_agent_compute state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Dec 15 
04:33:34 localhost python3.9[237372]: ansible-container_config_data Invoked with config_overrides={} config_path=/var/lib/edpm-config/container-startup-config/ceilometer_agent_compute config_pattern=*.json debug=False Dec 15 04:33:34 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:fc:1f:13 MACDST=fa:16:3e:b3:bb:e3 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=18188 DF PROTO=TCP SPT=46342 DPT=9882 SEQ=321014561 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A839DF2A50000000001030307) Dec 15 04:33:35 localhost python3.9[237482]: ansible-container_config_hash Invoked with check_mode=False config_vol_prefix=/var/lib/openstack Dec 15 04:33:36 localhost nova_compute[231752]: 2025-12-15 09:33:36.078 231756 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 04:33:36 localhost nova_compute[231752]: 2025-12-15 09:33:36.609 231756 DEBUG oslo_service.periodic_task [None req-c7fd4a3f-c95b-4936-a071-10d43fe8d19c - - - - - -] Running periodic task ComputeManager._sync_power_states run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Dec 15 04:33:36 localhost nova_compute[231752]: 2025-12-15 09:33:36.634 231756 DEBUG nova.compute.manager [None req-c7fd4a3f-c95b-4936-a071-10d43fe8d19c - - - - - -] Triggering sync for uuid 39ff1bd9-6f6b-44c8-bbec-a1fd9d196359 _sync_power_states /usr/lib/python3.9/site-packages/nova/compute/manager.py:10268#033[00m Dec 15 04:33:36 localhost nova_compute[231752]: 2025-12-15 09:33:36.635 231756 DEBUG oslo_concurrency.lockutils [None req-c7fd4a3f-c95b-4936-a071-10d43fe8d19c - - - - - -] Acquiring lock "39ff1bd9-6f6b-44c8-bbec-a1fd9d196359" by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Dec 15 04:33:36 localhost 
nova_compute[231752]: 2025-12-15 09:33:36.636 231756 DEBUG oslo_concurrency.lockutils [None req-c7fd4a3f-c95b-4936-a071-10d43fe8d19c - - - - - -] Lock "39ff1bd9-6f6b-44c8-bbec-a1fd9d196359" acquired by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Dec 15 04:33:36 localhost nova_compute[231752]: 2025-12-15 09:33:36.636 231756 DEBUG oslo_service.periodic_task [None req-c7fd4a3f-c95b-4936-a071-10d43fe8d19c - - - - - -] Running periodic task ComputeManager._cleanup_running_deleted_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Dec 15 04:33:36 localhost nova_compute[231752]: 2025-12-15 09:33:36.676 231756 DEBUG oslo_concurrency.lockutils [None req-c7fd4a3f-c95b-4936-a071-10d43fe8d19c - - - - - -] Lock "39ff1bd9-6f6b-44c8-bbec-a1fd9d196359" "released" by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" :: held 0.041s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Dec 15 04:33:36 localhost python3.9[237592]: ansible-containers.podman.podman_container_info Invoked with executable=podman name=None Dec 15 04:33:37 localhost nova_compute[231752]: 2025-12-15 09:33:37.050 231756 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 04:33:37 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:fc:1f:13 MACDST=fa:16:3e:b3:bb:e3 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=1389 DF PROTO=TCP SPT=35508 DPT=9105 SEQ=3873680184 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A839DFE250000000001030307) Dec 15 04:33:40 localhost systemd[1]: Started /usr/bin/podman healthcheck run 9f0f481c937606298203797a71924b7614a5d9e3f4a8afd837cfa5b9602aedeb. 
Dec 15 04:33:40 localhost podman[237675]: 2025-12-15 09:33:40.765005829 +0000 UTC m=+0.089502806 container health_status 9f0f481c937606298203797a71924b7614a5d9e3f4a8afd837cfa5b9602aedeb (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, org.label-schema.build-date=20251202, container_name=multipathd, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2) Dec 15 04:33:40 localhost podman[237675]: 2025-12-15 09:33:40.778381775 +0000 UTC m=+0.102878752 container exec_died 
9f0f481c937606298203797a71924b7614a5d9e3f4a8afd837cfa5b9602aedeb (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, managed_by=edpm_ansible) Dec 15 04:33:40 localhost systemd[1]: 9f0f481c937606298203797a71924b7614a5d9e3f4a8afd837cfa5b9602aedeb.service: Deactivated successfully. 
Dec 15 04:33:41 localhost nova_compute[231752]: 2025-12-15 09:33:41.005 231756 DEBUG oslo_service.periodic_task [None req-c7fd4a3f-c95b-4936-a071-10d43fe8d19c - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Dec 15 04:33:41 localhost nova_compute[231752]: 2025-12-15 09:33:41.007 231756 DEBUG oslo_service.periodic_task [None req-c7fd4a3f-c95b-4936-a071-10d43fe8d19c - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Dec 15 04:33:41 localhost nova_compute[231752]: 2025-12-15 09:33:41.008 231756 DEBUG nova.compute.manager [None req-c7fd4a3f-c95b-4936-a071-10d43fe8d19c - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m Dec 15 04:33:41 localhost nova_compute[231752]: 2025-12-15 09:33:41.009 231756 DEBUG nova.compute.manager [None req-c7fd4a3f-c95b-4936-a071-10d43fe8d19c - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m Dec 15 04:33:41 localhost nova_compute[231752]: 2025-12-15 09:33:41.082 231756 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 04:33:41 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:fc:1f:13 MACDST=fa:16:3e:b3:bb:e3 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=15933 DF PROTO=TCP SPT=56488 DPT=9101 SEQ=3362799351 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A839E0B240000000001030307) Dec 15 04:33:41 localhost python3[237748]: ansible-edpm_container_manage Invoked with concurrency=1 config_dir=/var/lib/edpm-config/container-startup-config/ceilometer_agent_compute 
config_id=ceilometer_agent_compute config_overrides={} config_patterns=*.json containers=['ceilometer_agent_compute'] log_base_path=/var/log/containers/stdouts debug=False Dec 15 04:33:41 localhost python3[237748]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: [#012 {#012 "Id": "806262ad9f61127734555408f71447afe6ceede79cc666e6f523dacd5edec739",#012 "Digest": "sha256:41dc9cf27a902d9c7b392d730bd761cf3c391a548a841e9e4d38e1571f3c53bf",#012 "RepoTags": [#012 "quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified"#012 ],#012 "RepoDigests": [#012 "quay.io/podified-antelope-centos9/openstack-ceilometer-compute@sha256:41dc9cf27a902d9c7b392d730bd761cf3c391a548a841e9e4d38e1571f3c53bf"#012 ],#012 "Parent": "",#012 "Comment": "",#012 "Created": "2025-12-08T06:20:34.946898289Z",#012 "Config": {#012 "User": "root",#012 "Env": [#012 "PATH=/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin",#012 "LANG=en_US.UTF-8",#012 "TZ=UTC",#012 "container=oci"#012 ],#012 "Entrypoint": [#012 "dumb-init",#012 "--single-child",#012 "--"#012 ],#012 "Cmd": [#012 "kolla_start"#012 ],#012 "Labels": {#012 "io.buildah.version": "1.41.3",#012 "maintainer": "OpenStack Kubernetes Operator team",#012 "org.label-schema.build-date": "20251202",#012 "org.label-schema.license": "GPLv2",#012 "org.label-schema.name": "CentOS Stream 9 Base Image",#012 "org.label-schema.schema-version": "1.0",#012 "org.label-schema.vendor": "CentOS",#012 "tcib_build_tag": "c3923531bcda0b0811b2d5053f189beb",#012 "tcib_managed": "true"#012 },#012 "StopSignal": "SIGTERM"#012 },#012 "Version": "",#012 "Author": "",#012 "Architecture": "amd64",#012 "Os": "linux",#012 "Size": 505715453,#012 "VirtualSize": 505715453,#012 "GraphDriver": {#012 "Name": "overlay",#012 "Data": {#012 "LowerDir": 
"/var/lib/containers/storage/overlay/7b69ee470d42d327d16a705620b04b4d782013570760b825dbb56beef9aeca73/diff:/var/lib/containers/storage/overlay/4c2a493cc38fe0c2d274b137f7d549c92d76e83cf216e797584fb8469937682d/diff:/var/lib/containers/storage/overlay/102653142e2259aa6223045dee7736729104ac8aed3ce9b3c87a6d0787e59de8/diff:/var/lib/containers/storage/overlay/a170762be59c15b133bd19c602942600caa3082ffe7158ccee8771dfc16bb660/diff",#012 "UpperDir": "/var/lib/containers/storage/overlay/b8c33572414808d2bd24b0744a09c207e3e50a88f16b63b44d75f7bd8a56eb6a/diff",#012 "WorkDir": "/var/lib/containers/storage/overlay/b8c33572414808d2bd24b0744a09c207e3e50a88f16b63b44d75f7bd8a56eb6a/work"#012 }#012 },#012 "RootFS": {#012 "Type": "layers",#012 "Layers": [#012 "sha256:a170762be59c15b133bd19c602942600caa3082ffe7158ccee8771dfc16bb660",#012 "sha256:47bbb708952ccfdaf6b1a15cd5347cc2e9ee37e63ec65603401dcebf66de9242",#012 "sha256:dde195c4be3ea0882f3029365e3a9510c9e08a199c8a2c93ddc2b8aa725a10f1",#012 "sha256:fea5deb7160e302cd22e3c992deba62cbf7d6165a1405448e9f5c2a028d21642",#012 "sha256:4eeffabd4cdb9b4e2c5d580693413a79687c46c48ae14555b12922136d641757"#012 ]#012 },#012 "Labels": {#012 "io.buildah.version": "1.41.3",#012 "maintainer": "OpenStack Kubernetes Operator team",#012 "org.label-schema.build-date": "20251202",#012 "org.label-schema.license": "GPLv2",#012 "org.label-schema.name": "CentOS Stream 9 Base Image",#012 "org.label-schema.schema-version": "1.0",#012 "org.label-schema.vendor": "CentOS",#012 "tcib_build_tag": "c3923531bcda0b0811b2d5053f189beb",#012 "tcib_managed": "true"#012 },#012 "Annotations": {},#012 "ManifestType": "application/vnd.docker.distribution.manifest.v2+json",#012 "User": "root",#012 "History": [#012 {#012 "created": "2025-12-02T04:26:51.317229596Z",#012 "created_by": "/bin/sh -c #(nop) ADD file:9f05a0f58e10b77188c7243d914ce56c5ce3e0f2ee7e13a7b0d4990588c97b99 in / ",#012 "empty_layer": true#012 },#012 {#012 "created": "2025-12-02T04:26:51.317315213Z",#012 "created_by": 
"/bin/sh -c #(nop) LABEL org.label-schema.schema-version=\"1.0\" org.label-schema.name=\"CentOS Stream 9 Base Image\" org.label-schema.vendor=\"CentOS\" org.label-schema.license=\"GPLv2\" org.label-schema.build-date=\"20251202\"",#012 "empty_layer": true#012 },#012 {#012 "created": "2025-12-02T04:26:54.063957926Z",#012 "created_by": "/bin/sh -c #(nop) CMD [\"/bin/bash\"]"#012 },#012 {#012 "created": "2025-12-08T06:08:28.750777742Z",#012 "created_by": "/bin/sh -c #(nop) LABEL maintainer=\"OpenStack Kubernetes Operator team\"",#012 "comment": "FROM quay.io/centos/centos:stream9",#012 "empty_layer": true#012 },#012 {#012 "created": "2025-12-08T06:08:28.750791962Z",#012 "created_by": "/bin/sh -c #(nop) LABEL tcib_managed=true",#012 "empty_layer": true#012 },#012 {#012 "created": "2025-12-08T06:08:28.750804372Z",#012 "created_by": "/bin/sh -c #(nop) ENV LANG=\"en_US.UTF-8\"",#012 "empty_layer": true#012 },#012 {#012 "created": "2025-12-08T06:08:28.750813613Z",#012 "created_by": "/bin/sh -c #(nop) ENV TZ=\"UTC\"",#012 "empty_layer": true#012 },#012 {#012 "created": "2025-12-08T06:08:28.750824813Z",#012 "created_by": "/bin/sh -c #(nop) ENV container=\"oci\"",#012 "empty_layer": true#012 },#012 {#012 "created": "2025-12-08T06:08:28.750833663Z",#012 "created_by": "/bin/sh -c #(nop) USER root",#012 "empty_layer": true#012 },#012 {#012 "created": "2025-12-08T06:08:29.160435164Z",#012 "created_by": "/bin/sh -c if [ -f \"/etc/yum.repos.d/ubi.repo\" ]; then rm -f /etc/yum.repos.d/ubi.repo && dnf clean all && rm -rf /var/cache/dnf; fi",#012 "empty_layer": true#012 },#012 {#012 "created": "2025-12-08T06:09:05.859236491Z",#012 "created_by": "/bin/sh -c dnf install -y crudini && crudini --del /etc/dnf/dnf.conf main override_install_langs && crudini --set /etc/dnf/dnf.conf main clean_requirements_on_remove True && crudini --set /etc/dnf/dnf.conf main exactarch 1 && crudini --set /etc/dnf/dnf.conf main gpgcheck 1 && crudini --set /etc/dnf/dnf.conf main install_weak_deps False && if [ 
'centos' == 'centos' ];then crudini --set /etc/dnf/dnf.conf main best False; fi && crudini --set /etc/dnf/dnf.conf main installonly_limit 0 && crudini --set /etc/dnf/dnf.conf main keepcache 0 && crudini --set /etc/dnf/dnf.conf main obsoletes 1 && crudini --set /etc/dnf/dnf.conf main plugins 1 && crudini --set /etc/dnf/dnf.conf main skip_missing_names_on_install False && crudini --set /etc/dnf/dnf.conf main tsflags nodocs",#012 "empty_layer": true#012 },#012 Dec 15 04:33:41 localhost podman[237797]: 2025-12-15 09:33:41.830527206 +0000 UTC m=+0.092672353 container remove d6041e5471624a495d5be42684f6049ccf71802a80d5760239330151d465d146 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, com.redhat.component=openstack-ceilometer-compute-container, io.buildah.version=1.41.4, release=1761123044, vcs-type=git, distribution-scope=public, container_name=ceilometer_agent_compute, managed_by=tripleo_ansible, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, konflux.additional-tags=17.1.12 17.1_20251118.1, name=rhosp17/openstack-ceilometer-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'fcee5a4a91f85471fca7b61211375646'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-ref=073ea4b06e5aa460399b0c251f416da40b228676, batch=17.1_20251118.1, tcib_managed=true, vendor=Red Hat, Inc., org.opencontainers.image.revision=073ea4b06e5aa460399b0c251f416da40b228676, io.openshift.expose-services=, version=17.1.12, build-date=2025-11-19T00:11:48Z, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, config_id=tripleo_step4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, maintainer=OpenStack TripleO Team) Dec 15 04:33:41 localhost python3[237748]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman rm --force ceilometer_agent_compute Dec 15 04:33:41 localhost podman[237810]: Dec 15 04:33:41 localhost podman[237810]: 2025-12-15 09:33:41.937846199 +0000 UTC m=+0.088415645 container create b3d8483ade539500e228c2af9c0c3e647febb5ec7903610a83aea32631007d4a (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, container_name=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, config_id=ceilometer_agent_compute, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, 
org.label-schema.license=GPLv2, tcib_managed=true, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '07f3af15d79276418f54a8dde2fc251064527c3571430e14af77aaa2c01f1193-cb1e9478634a00c8fb5ad9cd7fcd38c7b2a1a85187dfaa618bc715c24c2b15fb'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3) Dec 15 04:33:41 localhost podman[237810]: 2025-12-15 09:33:41.89485839 +0000 UTC m=+0.045427846 image pull quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified Dec 15 04:33:41 localhost python3[237748]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman create --name ceilometer_agent_compute --conmon-pidfile /run/ceilometer_agent_compute.pid --env KOLLA_CONFIG_STRATEGY=COPY_ALWAYS --env OS_ENDPOINT_TYPE=internal --env EDPM_CONFIG_HASH=07f3af15d79276418f54a8dde2fc251064527c3571430e14af77aaa2c01f1193-cb1e9478634a00c8fb5ad9cd7fcd38c7b2a1a85187dfaa618bc715c24c2b15fb --healthcheck-command /openstack/healthcheck compute 
--label config_id=ceilometer_agent_compute --label container_name=ceilometer_agent_compute --label managed_by=edpm_ansible --label config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '07f3af15d79276418f54a8dde2fc251064527c3571430e14af77aaa2c01f1193-cb1e9478634a00c8fb5ad9cd7fcd38c7b2a1a85187dfaa618bc715c24c2b15fb'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']} --log-driver journald --log-level info --network host --security-opt label:type:ceilometer_polling_t --user ceilometer --volume /var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z --volume /var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z --volume /run/libvirt:/run/libvirt:shared,ro --volume /etc/hosts:/etc/hosts:ro --volume /etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro --volume /etc/localtime:/etc/localtime:ro --volume /etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro --volume 
/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z --volume /dev/log:/dev/log --volume /var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified kolla_start Dec 15 04:33:42 localhost nova_compute[231752]: 2025-12-15 09:33:42.053 231756 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 04:33:42 localhost nova_compute[231752]: 2025-12-15 09:33:42.364 231756 DEBUG oslo_concurrency.lockutils [None req-c7fd4a3f-c95b-4936-a071-10d43fe8d19c - - - - - -] Acquiring lock "refresh_cache-39ff1bd9-6f6b-44c8-bbec-a1fd9d196359" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m Dec 15 04:33:42 localhost nova_compute[231752]: 2025-12-15 09:33:42.365 231756 DEBUG oslo_concurrency.lockutils [None req-c7fd4a3f-c95b-4936-a071-10d43fe8d19c - - - - - -] Acquired lock "refresh_cache-39ff1bd9-6f6b-44c8-bbec-a1fd9d196359" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m Dec 15 04:33:42 localhost nova_compute[231752]: 2025-12-15 09:33:42.365 231756 DEBUG nova.network.neutron [None req-c7fd4a3f-c95b-4936-a071-10d43fe8d19c - - - - - -] [instance: 39ff1bd9-6f6b-44c8-bbec-a1fd9d196359] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m Dec 15 04:33:42 localhost nova_compute[231752]: 2025-12-15 09:33:42.366 231756 DEBUG nova.objects.instance [None req-c7fd4a3f-c95b-4936-a071-10d43fe8d19c - - - - - -] Lazy-loading 'info_cache' on Instance uuid 39ff1bd9-6f6b-44c8-bbec-a1fd9d196359 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m Dec 15 04:33:42 localhost python3.9[237958]: ansible-ansible.builtin.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False 
get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1 Dec 15 04:33:43 localhost nova_compute[231752]: 2025-12-15 09:33:43.363 231756 DEBUG nova.network.neutron [None req-c7fd4a3f-c95b-4936-a071-10d43fe8d19c - - - - - -] [instance: 39ff1bd9-6f6b-44c8-bbec-a1fd9d196359] Updating instance_info_cache with network_info: [{"id": "03ef8889-3216-43fb-8a52-4be17a956ce1", "address": "fa:16:3e:74:df:7c", "network": {"id": "befb7a72-17a9-4bcb-b561-84b8f626685a", "bridge": "br-int", "label": "private", "subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.201", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.20", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c785bf23f53946bc99867d8832a50266", "mtu": 1292, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap03ef8889-32", "ovs_interfaceid": "03ef8889-3216-43fb-8a52-4be17a956ce1", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m Dec 15 04:33:43 localhost nova_compute[231752]: 2025-12-15 09:33:43.385 231756 DEBUG oslo_concurrency.lockutils [None req-c7fd4a3f-c95b-4936-a071-10d43fe8d19c - - - - - -] Releasing lock "refresh_cache-39ff1bd9-6f6b-44c8-bbec-a1fd9d196359" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m Dec 15 04:33:43 localhost nova_compute[231752]: 2025-12-15 09:33:43.386 231756 DEBUG nova.compute.manager [None 
req-c7fd4a3f-c95b-4936-a071-10d43fe8d19c - - - - - -] [instance: 39ff1bd9-6f6b-44c8-bbec-a1fd9d196359] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m Dec 15 04:33:43 localhost nova_compute[231752]: 2025-12-15 09:33:43.387 231756 DEBUG oslo_service.periodic_task [None req-c7fd4a3f-c95b-4936-a071-10d43fe8d19c - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Dec 15 04:33:43 localhost nova_compute[231752]: 2025-12-15 09:33:43.388 231756 DEBUG oslo_service.periodic_task [None req-c7fd4a3f-c95b-4936-a071-10d43fe8d19c - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Dec 15 04:33:43 localhost nova_compute[231752]: 2025-12-15 09:33:43.388 231756 DEBUG oslo_service.periodic_task [None req-c7fd4a3f-c95b-4936-a071-10d43fe8d19c - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Dec 15 04:33:43 localhost nova_compute[231752]: 2025-12-15 09:33:43.389 231756 DEBUG oslo_service.periodic_task [None req-c7fd4a3f-c95b-4936-a071-10d43fe8d19c - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Dec 15 04:33:43 localhost nova_compute[231752]: 2025-12-15 09:33:43.389 231756 DEBUG oslo_service.periodic_task [None req-c7fd4a3f-c95b-4936-a071-10d43fe8d19c - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Dec 15 04:33:43 localhost nova_compute[231752]: 2025-12-15 09:33:43.390 231756 DEBUG oslo_service.periodic_task [None 
req-c7fd4a3f-c95b-4936-a071-10d43fe8d19c - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Dec 15 04:33:43 localhost nova_compute[231752]: 2025-12-15 09:33:43.390 231756 DEBUG nova.compute.manager [None req-c7fd4a3f-c95b-4936-a071-10d43fe8d19c - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m Dec 15 04:33:43 localhost nova_compute[231752]: 2025-12-15 09:33:43.391 231756 DEBUG oslo_service.periodic_task [None req-c7fd4a3f-c95b-4936-a071-10d43fe8d19c - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Dec 15 04:33:43 localhost nova_compute[231752]: 2025-12-15 09:33:43.408 231756 DEBUG oslo_concurrency.lockutils [None req-c7fd4a3f-c95b-4936-a071-10d43fe8d19c - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Dec 15 04:33:43 localhost nova_compute[231752]: 2025-12-15 09:33:43.408 231756 DEBUG oslo_concurrency.lockutils [None req-c7fd4a3f-c95b-4936-a071-10d43fe8d19c - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Dec 15 04:33:43 localhost nova_compute[231752]: 2025-12-15 09:33:43.409 231756 DEBUG oslo_concurrency.lockutils [None req-c7fd4a3f-c95b-4936-a071-10d43fe8d19c - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Dec 15 04:33:43 localhost 
nova_compute[231752]: 2025-12-15 09:33:43.409 231756 DEBUG nova.compute.resource_tracker [None req-c7fd4a3f-c95b-4936-a071-10d43fe8d19c - - - - - -] Auditing locally available compute resources for np0005559462.localdomain (node: np0005559462.localdomain) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m Dec 15 04:33:43 localhost nova_compute[231752]: 2025-12-15 09:33:43.410 231756 DEBUG oslo_concurrency.processutils [None req-c7fd4a3f-c95b-4936-a071-10d43fe8d19c - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m Dec 15 04:33:43 localhost python3.9[238070]: ansible-file Invoked with path=/etc/systemd/system/edpm_ceilometer_agent_compute.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Dec 15 04:33:43 localhost nova_compute[231752]: 2025-12-15 09:33:43.875 231756 DEBUG oslo_concurrency.processutils [None req-c7fd4a3f-c95b-4936-a071-10d43fe8d19c - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.465s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m Dec 15 04:33:43 localhost nova_compute[231752]: 2025-12-15 09:33:43.929 231756 DEBUG nova.virt.libvirt.driver [None req-c7fd4a3f-c95b-4936-a071-10d43fe8d19c - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m Dec 15 04:33:43 localhost nova_compute[231752]: 2025-12-15 09:33:43.930 231756 DEBUG nova.virt.libvirt.driver [None 
req-c7fd4a3f-c95b-4936-a071-10d43fe8d19c - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m Dec 15 04:33:44 localhost python3.9[238145]: ansible-stat Invoked with path=/etc/systemd/system/edpm_ceilometer_agent_compute_healthcheck.timer follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1 Dec 15 04:33:44 localhost nova_compute[231752]: 2025-12-15 09:33:44.087 231756 WARNING nova.virt.libvirt.driver [None req-c7fd4a3f-c95b-4936-a071-10d43fe8d19c - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m Dec 15 04:33:44 localhost nova_compute[231752]: 2025-12-15 09:33:44.088 231756 DEBUG nova.compute.resource_tracker [None req-c7fd4a3f-c95b-4936-a071-10d43fe8d19c - - - - - -] Hypervisor/Node resource view: name=np0005559462.localdomain free_ram=12912MB free_disk=41.83720779418945GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": 
"0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m Dec 15 04:33:44 localhost nova_compute[231752]: 2025-12-15 09:33:44.089 231756 DEBUG oslo_concurrency.lockutils [None req-c7fd4a3f-c95b-4936-a071-10d43fe8d19c - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Dec 15 04:33:44 localhost nova_compute[231752]: 2025-12-15 09:33:44.089 231756 DEBUG oslo_concurrency.lockutils [None req-c7fd4a3f-c95b-4936-a071-10d43fe8d19c - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Dec 15 04:33:44 localhost nova_compute[231752]: 2025-12-15 09:33:44.147 231756 DEBUG nova.compute.resource_tracker [None 
req-c7fd4a3f-c95b-4936-a071-10d43fe8d19c - - - - - -] Instance 39ff1bd9-6f6b-44c8-bbec-a1fd9d196359 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 2, 'MEMORY_MB': 512, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m Dec 15 04:33:44 localhost nova_compute[231752]: 2025-12-15 09:33:44.148 231756 DEBUG nova.compute.resource_tracker [None req-c7fd4a3f-c95b-4936-a071-10d43fe8d19c - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m Dec 15 04:33:44 localhost nova_compute[231752]: 2025-12-15 09:33:44.148 231756 DEBUG nova.compute.resource_tracker [None req-c7fd4a3f-c95b-4936-a071-10d43fe8d19c - - - - - -] Final resource view: name=np0005559462.localdomain phys_ram=15738MB used_ram=1024MB phys_disk=41GB used_disk=2GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m Dec 15 04:33:44 localhost nova_compute[231752]: 2025-12-15 09:33:44.187 231756 DEBUG oslo_concurrency.processutils [None req-c7fd4a3f-c95b-4936-a071-10d43fe8d19c - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m Dec 15 04:33:44 localhost nova_compute[231752]: 2025-12-15 09:33:44.646 231756 DEBUG oslo_concurrency.processutils [None req-c7fd4a3f-c95b-4936-a071-10d43fe8d19c - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.458s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m Dec 15 04:33:44 localhost nova_compute[231752]: 2025-12-15 09:33:44.652 231756 DEBUG nova.compute.provider_tree [None req-c7fd4a3f-c95b-4936-a071-10d43fe8d19c - - - - - -] Inventory 
has not changed in ProviderTree for provider: 26c8956b-6742-4951-b566-971b9bbe323b update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m Dec 15 04:33:44 localhost nova_compute[231752]: 2025-12-15 09:33:44.668 231756 DEBUG nova.scheduler.client.report [None req-c7fd4a3f-c95b-4936-a071-10d43fe8d19c - - - - - -] Inventory has not changed for provider 26c8956b-6742-4951-b566-971b9bbe323b based on inventory data: {'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m Dec 15 04:33:44 localhost nova_compute[231752]: 2025-12-15 09:33:44.670 231756 DEBUG nova.compute.resource_tracker [None req-c7fd4a3f-c95b-4936-a071-10d43fe8d19c - - - - - -] Compute_service record updated for np0005559462.localdomain:np0005559462.localdomain _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m Dec 15 04:33:44 localhost nova_compute[231752]: 2025-12-15 09:33:44.671 231756 DEBUG oslo_concurrency.lockutils [None req-c7fd4a3f-c95b-4936-a071-10d43fe8d19c - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.582s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Dec 15 04:33:44 localhost python3.9[238276]: ansible-copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1765791224.0720658-1373-9944393223011/source dest=/etc/systemd/system/edpm_ceilometer_agent_compute.service mode=0644 owner=root group=root backup=False force=True remote_src=False follow=False unsafe_writes=False _original_basename=None 
content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None Dec 15 04:33:45 localhost python3.9[238333]: ansible-systemd Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None Dec 15 04:33:45 localhost systemd[1]: Reloading. Dec 15 04:33:45 localhost systemd-sysv-generator[238363]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. Dec 15 04:33:45 localhost systemd-rc-local-generator[238359]: /etc/rc.d/rc.local is not marked executable, skipping. Dec 15 04:33:45 localhost systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload Dec 15 04:33:45 localhost systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload Dec 15 04:33:45 localhost systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload Dec 15 04:33:45 localhost systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload Dec 15 04:33:45 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. 
Dec 15 04:33:45 localhost systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload Dec 15 04:33:45 localhost systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload Dec 15 04:33:45 localhost systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload Dec 15 04:33:45 localhost systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload Dec 15 04:33:46 localhost nova_compute[231752]: 2025-12-15 09:33:46.086 231756 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 04:33:46 localhost python3.9[238424]: ansible-systemd Invoked with state=restarted name=edpm_ceilometer_agent_compute.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None Dec 15 04:33:46 localhost systemd[1]: Reloading. Dec 15 04:33:46 localhost systemd-sysv-generator[238455]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. Dec 15 04:33:46 localhost systemd-rc-local-generator[238450]: /etc/rc.d/rc.local is not marked executable, skipping. 
Dec 15 04:33:46 localhost systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload Dec 15 04:33:46 localhost systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload Dec 15 04:33:46 localhost systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload Dec 15 04:33:46 localhost systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload Dec 15 04:33:46 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Dec 15 04:33:46 localhost systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload Dec 15 04:33:46 localhost systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload Dec 15 04:33:46 localhost systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload Dec 15 04:33:46 localhost systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload Dec 15 04:33:46 localhost systemd[1]: Starting ceilometer_agent_compute container... Dec 15 04:33:46 localhost systemd[1]: tmp-crun.PVN7Jh.mount: Deactivated successfully. Dec 15 04:33:46 localhost systemd[1]: Started libcrun container. 
Dec 15 04:33:47 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/c1e5ed9eaef5399fbe4374c3e6d9619203f154c5e688326c146c619b1f77dfea/merged/var/lib/kolla/config_files/src supports timestamps until 2038 (0x7fffffff) Dec 15 04:33:47 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/c1e5ed9eaef5399fbe4374c3e6d9619203f154c5e688326c146c619b1f77dfea/merged/var/lib/kolla/config_files/config.json supports timestamps until 2038 (0x7fffffff) Dec 15 04:33:47 localhost systemd[1]: Started /usr/bin/podman healthcheck run b3d8483ade539500e228c2af9c0c3e647febb5ec7903610a83aea32631007d4a. Dec 15 04:33:47 localhost podman[238465]: 2025-12-15 09:33:47.035636951 +0000 UTC m=+0.150363116 container init b3d8483ade539500e228c2af9c0c3e647febb5ec7903610a83aea32631007d4a (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_managed=true, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '07f3af15d79276418f54a8dde2fc251064527c3571430e14af77aaa2c01f1193-cb1e9478634a00c8fb5ad9cd7fcd38c7b2a1a85187dfaa618bc715c24c2b15fb'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', 
'/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=ceilometer_agent_compute, container_name=ceilometer_agent_compute) Dec 15 04:33:47 localhost ceilometer_agent_compute[238479]: + sudo -E kolla_set_configs Dec 15 04:33:47 localhost nova_compute[231752]: 2025-12-15 09:33:47.055 231756 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 04:33:47 localhost ceilometer_agent_compute[238479]: sudo: unable to send audit message: Operation not permitted Dec 15 04:33:47 localhost systemd[1]: Started /usr/bin/podman healthcheck run b3d8483ade539500e228c2af9c0c3e647febb5ec7903610a83aea32631007d4a. 
Dec 15 04:33:47 localhost podman[238465]: 2025-12-15 09:33:47.084384328 +0000 UTC m=+0.199110443 container start b3d8483ade539500e228c2af9c0c3e647febb5ec7903610a83aea32631007d4a (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, config_id=ceilometer_agent_compute, io.buildah.version=1.41.3, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '07f3af15d79276418f54a8dde2fc251064527c3571430e14af77aaa2c01f1193-cb1e9478634a00c8fb5ad9cd7fcd38c7b2a1a85187dfaa618bc715c24c2b15fb'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS) Dec 15 
04:33:47 localhost podman[238465]: ceilometer_agent_compute Dec 15 04:33:47 localhost systemd[1]: Started ceilometer_agent_compute container. Dec 15 04:33:47 localhost ceilometer_agent_compute[238479]: INFO:__main__:Loading config file at /var/lib/kolla/config_files/config.json Dec 15 04:33:47 localhost ceilometer_agent_compute[238479]: INFO:__main__:Validating config file Dec 15 04:33:47 localhost ceilometer_agent_compute[238479]: INFO:__main__:Kolla config strategy set to: COPY_ALWAYS Dec 15 04:33:47 localhost ceilometer_agent_compute[238479]: INFO:__main__:Copying service configuration files Dec 15 04:33:47 localhost ceilometer_agent_compute[238479]: INFO:__main__:Deleting /etc/ceilometer/ceilometer.conf Dec 15 04:33:47 localhost ceilometer_agent_compute[238479]: INFO:__main__:Copying /var/lib/kolla/config_files/src/ceilometer.conf to /etc/ceilometer/ceilometer.conf Dec 15 04:33:47 localhost ceilometer_agent_compute[238479]: INFO:__main__:Setting permission for /etc/ceilometer/ceilometer.conf Dec 15 04:33:47 localhost ceilometer_agent_compute[238479]: INFO:__main__:Deleting /etc/ceilometer/polling.yaml Dec 15 04:33:47 localhost ceilometer_agent_compute[238479]: INFO:__main__:Copying /var/lib/kolla/config_files/src/polling.yaml to /etc/ceilometer/polling.yaml Dec 15 04:33:47 localhost ceilometer_agent_compute[238479]: INFO:__main__:Setting permission for /etc/ceilometer/polling.yaml Dec 15 04:33:47 localhost ceilometer_agent_compute[238479]: INFO:__main__:Copying /var/lib/kolla/config_files/src/custom.conf to /etc/ceilometer/ceilometer.conf.d/01-ceilometer-custom.conf Dec 15 04:33:47 localhost ceilometer_agent_compute[238479]: INFO:__main__:Setting permission for /etc/ceilometer/ceilometer.conf.d/01-ceilometer-custom.conf Dec 15 04:33:47 localhost ceilometer_agent_compute[238479]: INFO:__main__:Copying /var/lib/kolla/config_files/src/ceilometer-host-specific.conf to /etc/ceilometer/ceilometer.conf.d/02-ceilometer-host-specific.conf Dec 15 04:33:47 localhost 
ceilometer_agent_compute[238479]: INFO:__main__:Setting permission for /etc/ceilometer/ceilometer.conf.d/02-ceilometer-host-specific.conf Dec 15 04:33:47 localhost ceilometer_agent_compute[238479]: INFO:__main__:Writing out command to execute Dec 15 04:33:47 localhost ceilometer_agent_compute[238479]: ++ cat /run_command Dec 15 04:33:47 localhost ceilometer_agent_compute[238479]: + CMD='/usr/bin/ceilometer-polling --polling-namespaces compute --logfile /dev/stdout' Dec 15 04:33:47 localhost ceilometer_agent_compute[238479]: + ARGS= Dec 15 04:33:47 localhost ceilometer_agent_compute[238479]: + sudo kolla_copy_cacerts Dec 15 04:33:47 localhost ceilometer_agent_compute[238479]: sudo: unable to send audit message: Operation not permitted Dec 15 04:33:47 localhost ceilometer_agent_compute[238479]: + [[ ! -n '' ]] Dec 15 04:33:47 localhost ceilometer_agent_compute[238479]: + . kolla_extend_start Dec 15 04:33:47 localhost ceilometer_agent_compute[238479]: + echo 'Running command: '\''/usr/bin/ceilometer-polling --polling-namespaces compute --logfile /dev/stdout'\''' Dec 15 04:33:47 localhost ceilometer_agent_compute[238479]: Running command: '/usr/bin/ceilometer-polling --polling-namespaces compute --logfile /dev/stdout' Dec 15 04:33:47 localhost ceilometer_agent_compute[238479]: + umask 0022 Dec 15 04:33:47 localhost ceilometer_agent_compute[238479]: + exec /usr/bin/ceilometer-polling --polling-namespaces compute --logfile /dev/stdout Dec 15 04:33:47 localhost podman[238488]: 2025-12-15 09:33:47.182044716 +0000 UTC m=+0.090289187 container health_status b3d8483ade539500e228c2af9c0c3e647febb5ec7903610a83aea32631007d4a (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=starting, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ceilometer_agent_compute, org.label-schema.build-date=20251202, 
org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '07f3af15d79276418f54a8dde2fc251064527c3571430e14af77aaa2c01f1193-cb1e9478634a00c8fb5ad9cd7fcd38c7b2a1a85187dfaa618bc715c24c2b15fb'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2) Dec 15 04:33:47 localhost podman[238488]: 2025-12-15 09:33:47.213499169 +0000 UTC m=+0.121743650 container exec_died b3d8483ade539500e228c2af9c0c3e647febb5ec7903610a83aea32631007d4a (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=ceilometer_agent_compute, 
io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '07f3af15d79276418f54a8dde2fc251064527c3571430e14af77aaa2c01f1193-cb1e9478634a00c8fb5ad9cd7fcd38c7b2a1a85187dfaa618bc715c24c2b15fb'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=ceilometer_agent_compute, org.label-schema.schema-version=1.0, tcib_managed=true) Dec 15 04:33:47 localhost podman[238488]: unhealthy Dec 15 04:33:47 localhost systemd[1]: b3d8483ade539500e228c2af9c0c3e647febb5ec7903610a83aea32631007d4a.service: Main process exited, code=exited, status=1/FAILURE Dec 15 04:33:47 localhost systemd[1]: b3d8483ade539500e228c2af9c0c3e647febb5ec7903610a83aea32631007d4a.service: Failed with result 'exit-code'. 
Dec 15 04:33:47 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:fc:1f:13 MACDST=fa:16:3e:b3:bb:e3 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=18190 DF PROTO=TCP SPT=46342 DPT=9882 SEQ=321014561 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A839E23260000000001030307) Dec 15 04:33:47 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:47.869 2 DEBUG cotyledon.oslo_config_glue [-] Full set of CONF: _load_service_manager_options /usr/lib/python3.9/site-packages/cotyledon/oslo_config_glue.py:40 Dec 15 04:33:47 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:47.869 2 DEBUG cotyledon.oslo_config_glue [-] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2589 Dec 15 04:33:47 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:47.869 2 DEBUG cotyledon.oslo_config_glue [-] Configuration options gathered from: log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2590 Dec 15 04:33:47 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:47.869 2 DEBUG cotyledon.oslo_config_glue [-] command line args: ['--polling-namespaces', 'compute', '--logfile', '/dev/stdout'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2591 Dec 15 04:33:47 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:47.869 2 DEBUG cotyledon.oslo_config_glue [-] config files: ['/etc/ceilometer/ceilometer.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2592 Dec 15 04:33:47 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:47.869 2 DEBUG cotyledon.oslo_config_glue [-] ================================================================================ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2594 Dec 15 04:33:47 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:47.869 2 DEBUG cotyledon.oslo_config_glue 
[-] batch_size = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602 Dec 15 04:33:47 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:47.869 2 DEBUG cotyledon.oslo_config_glue [-] cfg_file = polling.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602 Dec 15 04:33:47 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:47.869 2 DEBUG cotyledon.oslo_config_glue [-] config_dir = ['/etc/ceilometer/ceilometer.conf.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602 Dec 15 04:33:47 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:47.870 2 DEBUG cotyledon.oslo_config_glue [-] config_file = ['/etc/ceilometer/ceilometer.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602 Dec 15 04:33:47 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:47.870 2 DEBUG cotyledon.oslo_config_glue [-] config_source = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602 Dec 15 04:33:47 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:47.870 2 DEBUG cotyledon.oslo_config_glue [-] debug = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602 Dec 15 04:33:47 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:47.870 2 DEBUG cotyledon.oslo_config_glue [-] default_log_levels = ['amqp=WARN', 'amqplib=WARN', 'boto=WARN', 'qpid=WARN', 'sqlalchemy=WARN', 'suds=INFO', 'oslo.messaging=INFO', 'oslo_messaging=INFO', 'iso8601=WARN', 'requests.packages.urllib3.connectionpool=WARN', 'urllib3.connectionpool=WARN', 'websocket=WARN', 'requests.packages.urllib3.util.retry=WARN', 'urllib3.util.retry=WARN', 'keystonemiddleware=WARN', 'routes.middleware=WARN', 'stevedore=WARN', 'taskflow=WARN', 'keystoneauth=WARN', 'oslo.cache=INFO', 'oslo_policy=INFO', 'dogpile.core.dogpile=INFO', 'futurist=INFO', 'neutronclient=INFO', 'keystoneclient=INFO'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602 
Dec 15 04:33:47 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:47.870 2 DEBUG cotyledon.oslo_config_glue [-] event_pipeline_cfg_file = event_pipeline.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602 Dec 15 04:33:47 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:47.870 2 DEBUG cotyledon.oslo_config_glue [-] graceful_shutdown_timeout = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602 Dec 15 04:33:47 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:47.870 2 DEBUG cotyledon.oslo_config_glue [-] host = np0005559462.localdomain log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602 Dec 15 04:33:47 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:47.870 2 DEBUG cotyledon.oslo_config_glue [-] http_timeout = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602 Dec 15 04:33:47 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:47.870 2 DEBUG cotyledon.oslo_config_glue [-] hypervisor_inspector = libvirt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602 Dec 15 04:33:47 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:47.870 2 DEBUG cotyledon.oslo_config_glue [-] instance_format = [instance: %(uuid)s] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602 Dec 15 04:33:47 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:47.870 2 DEBUG cotyledon.oslo_config_glue [-] instance_uuid_format = [instance: %(uuid)s] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602 Dec 15 04:33:47 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:47.870 2 DEBUG cotyledon.oslo_config_glue [-] libvirt_type = kvm log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602 Dec 15 04:33:47 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:47.871 2 DEBUG cotyledon.oslo_config_glue [-] libvirt_uri = log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602 Dec 15 04:33:47 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:47.871 2 DEBUG cotyledon.oslo_config_glue [-] log_config_append = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602 Dec 15 04:33:47 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:47.871 2 DEBUG cotyledon.oslo_config_glue [-] log_date_format = %Y-%m-%d %H:%M:%S log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602 Dec 15 04:33:47 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:47.871 2 DEBUG cotyledon.oslo_config_glue [-] log_dir = /var/log/ceilometer log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602 Dec 15 04:33:47 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:47.871 2 DEBUG cotyledon.oslo_config_glue [-] log_file = /dev/stdout log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602 Dec 15 04:33:47 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:47.871 2 DEBUG cotyledon.oslo_config_glue [-] log_options = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602 Dec 15 04:33:47 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:47.871 2 DEBUG cotyledon.oslo_config_glue [-] log_rotate_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602 Dec 15 04:33:47 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:47.871 2 DEBUG cotyledon.oslo_config_glue [-] log_rotate_interval_type = days log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602 Dec 15 04:33:47 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:47.871 2 DEBUG cotyledon.oslo_config_glue [-] log_rotation_type = none log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602 Dec 15 04:33:47 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:47.871 2 DEBUG cotyledon.oslo_config_glue [-] logging_context_format_string = 
%(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [%(global_request_id)s %(request_id)s %(user_identity)s] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602 Dec 15 04:33:47 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:47.871 2 DEBUG cotyledon.oslo_config_glue [-] logging_debug_format_suffix = %(funcName)s %(pathname)s:%(lineno)d log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602 Dec 15 04:33:47 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:47.871 2 DEBUG cotyledon.oslo_config_glue [-] logging_default_format_string = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [-] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602 Dec 15 04:33:47 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:47.871 2 DEBUG cotyledon.oslo_config_glue [-] logging_exception_prefix = %(asctime)s.%(msecs)03d %(process)d ERROR %(name)s %(instance)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602 Dec 15 04:33:47 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:47.872 2 DEBUG cotyledon.oslo_config_glue [-] logging_user_identity_format = %(user)s %(project)s %(domain)s %(system_scope)s %(user_domain)s %(project_domain)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602 Dec 15 04:33:47 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:47.872 2 DEBUG cotyledon.oslo_config_glue [-] max_logfile_count = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602 Dec 15 04:33:47 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:47.872 2 DEBUG cotyledon.oslo_config_glue [-] max_logfile_size_mb = 200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602 Dec 15 04:33:47 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:47.872 2 DEBUG cotyledon.oslo_config_glue [-] max_parallel_requests = 64 log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602 Dec 15 04:33:47 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:47.872 2 DEBUG cotyledon.oslo_config_glue [-] partitioning_group_prefix = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602 Dec 15 04:33:47 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:47.872 2 DEBUG cotyledon.oslo_config_glue [-] pipeline_cfg_file = pipeline.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602 Dec 15 04:33:47 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:47.872 2 DEBUG cotyledon.oslo_config_glue [-] polling_namespaces = ['compute'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602 Dec 15 04:33:47 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:47.872 2 DEBUG cotyledon.oslo_config_glue [-] pollsters_definitions_dirs = ['/etc/ceilometer/pollsters.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602 Dec 15 04:33:47 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:47.872 2 DEBUG cotyledon.oslo_config_glue [-] publish_errors = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602 Dec 15 04:33:47 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:47.872 2 DEBUG cotyledon.oslo_config_glue [-] rate_limit_burst = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602 Dec 15 04:33:47 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:47.872 2 DEBUG cotyledon.oslo_config_glue [-] rate_limit_except_level = CRITICAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602 Dec 15 04:33:47 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:47.872 2 DEBUG cotyledon.oslo_config_glue [-] rate_limit_interval = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602 Dec 15 04:33:47 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:47.872 2 DEBUG cotyledon.oslo_config_glue 
[-] reseller_prefix = AUTH_ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602 Dec 15 04:33:47 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:47.873 2 DEBUG cotyledon.oslo_config_glue [-] reserved_metadata_keys = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602 Dec 15 04:33:47 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:47.873 2 DEBUG cotyledon.oslo_config_glue [-] reserved_metadata_length = 256 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602 Dec 15 04:33:47 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:47.873 2 DEBUG cotyledon.oslo_config_glue [-] reserved_metadata_namespace = ['metering.'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602 Dec 15 04:33:47 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:47.873 2 DEBUG cotyledon.oslo_config_glue [-] rootwrap_config = /etc/ceilometer/rootwrap.conf log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602 Dec 15 04:33:47 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:47.873 2 DEBUG cotyledon.oslo_config_glue [-] sample_source = openstack log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602 Dec 15 04:33:47 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:47.873 2 DEBUG cotyledon.oslo_config_glue [-] syslog_log_facility = LOG_USER log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602 Dec 15 04:33:47 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:47.873 2 DEBUG cotyledon.oslo_config_glue [-] tenant_name_discovery = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602 Dec 15 04:33:47 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:47.873 2 DEBUG cotyledon.oslo_config_glue [-] use_eventlog = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602 Dec 15 04:33:47 localhost ceilometer_agent_compute[238479]: 2025-12-15 
09:33:47.873 2 DEBUG cotyledon.oslo_config_glue [-] use_journal = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602 Dec 15 04:33:47 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:47.873 2 DEBUG cotyledon.oslo_config_glue [-] use_json = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602 Dec 15 04:33:47 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:47.873 2 DEBUG cotyledon.oslo_config_glue [-] use_stderr = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602 Dec 15 04:33:47 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:47.873 2 DEBUG cotyledon.oslo_config_glue [-] use_syslog = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602 Dec 15 04:33:47 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:47.873 2 DEBUG cotyledon.oslo_config_glue [-] watch_log_file = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602 Dec 15 04:33:47 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:47.874 2 DEBUG cotyledon.oslo_config_glue [-] compute.instance_discovery_method = libvirt_metadata log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Dec 15 04:33:47 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:47.874 2 DEBUG cotyledon.oslo_config_glue [-] compute.resource_cache_expiry = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Dec 15 04:33:47 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:47.874 2 DEBUG cotyledon.oslo_config_glue [-] compute.resource_update_interval = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Dec 15 04:33:47 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:47.874 2 DEBUG cotyledon.oslo_config_glue [-] coordination.backend_url = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Dec 15 04:33:47 localhost 
ceilometer_agent_compute[238479]: 2025-12-15 09:33:47.874 2 DEBUG cotyledon.oslo_config_glue [-] event.definitions_cfg_file = event_definitions.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Dec 15 04:33:47 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:47.874 2 DEBUG cotyledon.oslo_config_glue [-] event.drop_unmatched_notifications = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Dec 15 04:33:47 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:47.874 2 DEBUG cotyledon.oslo_config_glue [-] event.store_raw = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Dec 15 04:33:47 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:47.874 2 DEBUG cotyledon.oslo_config_glue [-] ipmi.node_manager_init_retry = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Dec 15 04:33:47 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:47.874 2 DEBUG cotyledon.oslo_config_glue [-] ipmi.polling_retry = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Dec 15 04:33:47 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:47.874 2 DEBUG cotyledon.oslo_config_glue [-] meter.meter_definitions_dirs = ['/etc/ceilometer/meters.d', '/usr/lib/python3.9/site-packages/ceilometer/data/meters.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Dec 15 04:33:47 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:47.874 2 DEBUG cotyledon.oslo_config_glue [-] monasca.archive_on_failure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Dec 15 04:33:47 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:47.874 2 DEBUG cotyledon.oslo_config_glue [-] monasca.archive_path = mon_pub_failures.txt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Dec 15 04:33:47 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:47.875 2 DEBUG 
cotyledon.oslo_config_glue [-] monasca.auth_section = service_credentials log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Dec 15 04:33:47 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:47.875 2 DEBUG cotyledon.oslo_config_glue [-] monasca.auth_type = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Dec 15 04:33:47 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:47.875 2 DEBUG cotyledon.oslo_config_glue [-] monasca.batch_count = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Dec 15 04:33:47 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:47.875 2 DEBUG cotyledon.oslo_config_glue [-] monasca.batch_max_retries = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Dec 15 04:33:47 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:47.875 2 DEBUG cotyledon.oslo_config_glue [-] monasca.batch_mode = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Dec 15 04:33:47 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:47.875 2 DEBUG cotyledon.oslo_config_glue [-] monasca.batch_polling_interval = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Dec 15 04:33:47 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:47.875 2 DEBUG cotyledon.oslo_config_glue [-] monasca.batch_timeout = 15 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Dec 15 04:33:47 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:47.875 2 DEBUG cotyledon.oslo_config_glue [-] monasca.cafile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Dec 15 04:33:47 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:47.875 2 DEBUG cotyledon.oslo_config_glue [-] monasca.certfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Dec 15 04:33:47 localhost ceilometer_agent_compute[238479]: 2025-12-15 
09:33:47.875 2 DEBUG cotyledon.oslo_config_glue [-] monasca.client_max_retries = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Dec 15 04:33:47 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:47.875 2 DEBUG cotyledon.oslo_config_glue [-] monasca.client_retry_interval = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Dec 15 04:33:47 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:47.875 2 DEBUG cotyledon.oslo_config_glue [-] monasca.clientapi_version = 2_0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Dec 15 04:33:47 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:47.875 2 DEBUG cotyledon.oslo_config_glue [-] monasca.cloud_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Dec 15 04:33:47 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:47.876 2 DEBUG cotyledon.oslo_config_glue [-] monasca.cluster = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Dec 15 04:33:47 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:47.876 2 DEBUG cotyledon.oslo_config_glue [-] monasca.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Dec 15 04:33:47 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:47.876 2 DEBUG cotyledon.oslo_config_glue [-] monasca.control_plane = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Dec 15 04:33:47 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:47.876 2 DEBUG cotyledon.oslo_config_glue [-] monasca.enable_api_pagination = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Dec 15 04:33:47 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:47.876 2 DEBUG cotyledon.oslo_config_glue [-] monasca.insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Dec 15 04:33:47 localhost 
ceilometer_agent_compute[238479]: 2025-12-15 09:33:47.876 2 DEBUG cotyledon.oslo_config_glue [-] monasca.interface = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Dec 15 04:33:47 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:47.876 2 DEBUG cotyledon.oslo_config_glue [-] monasca.keyfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Dec 15 04:33:47 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:47.876 2 DEBUG cotyledon.oslo_config_glue [-] monasca.monasca_mappings = /etc/ceilometer/monasca_field_definitions.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Dec 15 04:33:47 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:47.876 2 DEBUG cotyledon.oslo_config_glue [-] monasca.region_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Dec 15 04:33:47 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:47.876 2 DEBUG cotyledon.oslo_config_glue [-] monasca.retry_on_failure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Dec 15 04:33:47 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:47.876 2 DEBUG cotyledon.oslo_config_glue [-] monasca.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Dec 15 04:33:47 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:47.876 2 DEBUG cotyledon.oslo_config_glue [-] monasca.timeout = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Dec 15 04:33:47 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:47.876 2 DEBUG cotyledon.oslo_config_glue [-] notification.ack_on_event_error = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Dec 15 04:33:47 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:47.877 2 DEBUG cotyledon.oslo_config_glue [-] notification.batch_size = 1 log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Dec 15 04:33:47 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:47.877 2 DEBUG cotyledon.oslo_config_glue [-] notification.batch_timeout = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Dec 15 04:33:47 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:47.877 2 DEBUG cotyledon.oslo_config_glue [-] notification.messaging_urls = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Dec 15 04:33:47 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:47.877 2 DEBUG cotyledon.oslo_config_glue [-] notification.notification_control_exchanges = ['nova', 'glance', 'neutron', 'cinder', 'heat', 'keystone', 'sahara', 'trove', 'zaqar', 'swift', 'ceilometer', 'magnum', 'dns', 'ironic', 'aodh'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Dec 15 04:33:47 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:47.877 2 DEBUG cotyledon.oslo_config_glue [-] notification.pipelines = ['meter', 'event'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Dec 15 04:33:47 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:47.877 2 DEBUG cotyledon.oslo_config_glue [-] notification.workers = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Dec 15 04:33:47 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:47.877 2 DEBUG cotyledon.oslo_config_glue [-] polling.batch_size = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Dec 15 04:33:47 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:47.877 2 DEBUG cotyledon.oslo_config_glue [-] polling.cfg_file = polling.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Dec 15 04:33:47 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:47.877 2 DEBUG cotyledon.oslo_config_glue [-] polling.partitioning_group_prefix = None log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Dec 15 04:33:47 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:47.877 2 DEBUG cotyledon.oslo_config_glue [-] polling.pollsters_definitions_dirs = ['/etc/ceilometer/pollsters.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Dec 15 04:33:47 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:47.877 2 DEBUG cotyledon.oslo_config_glue [-] polling.tenant_name_discovery = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Dec 15 04:33:47 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:47.878 2 DEBUG cotyledon.oslo_config_glue [-] publisher.telemetry_secret = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Dec 15 04:33:47 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:47.878 2 DEBUG cotyledon.oslo_config_glue [-] publisher_notifier.event_topic = event log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Dec 15 04:33:47 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:47.878 2 DEBUG cotyledon.oslo_config_glue [-] publisher_notifier.metering_topic = metering log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Dec 15 04:33:47 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:47.878 2 DEBUG cotyledon.oslo_config_glue [-] publisher_notifier.telemetry_driver = messagingv2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Dec 15 04:33:47 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:47.878 2 DEBUG cotyledon.oslo_config_glue [-] rgw_admin_credentials.access_key = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Dec 15 04:33:47 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:47.878 2 DEBUG cotyledon.oslo_config_glue [-] rgw_admin_credentials.secret_key = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Dec 15 04:33:47 localhost 
ceilometer_agent_compute[238479]: 2025-12-15 09:33:47.878 2 DEBUG cotyledon.oslo_config_glue [-] rgw_client.implicit_tenants = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Dec 15 04:33:47 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:47.878 2 DEBUG cotyledon.oslo_config_glue [-] service_types.cinder = volumev3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Dec 15 04:33:47 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:47.878 2 DEBUG cotyledon.oslo_config_glue [-] service_types.glance = image log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Dec 15 04:33:47 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:47.878 2 DEBUG cotyledon.oslo_config_glue [-] service_types.neutron = network log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Dec 15 04:33:47 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:47.878 2 DEBUG cotyledon.oslo_config_glue [-] service_types.nova = compute log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Dec 15 04:33:47 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:47.878 2 DEBUG cotyledon.oslo_config_glue [-] service_types.radosgw = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Dec 15 04:33:47 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:47.879 2 DEBUG cotyledon.oslo_config_glue [-] service_types.swift = object-store log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Dec 15 04:33:47 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:47.879 2 DEBUG cotyledon.oslo_config_glue [-] vmware.api_retry_count = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Dec 15 04:33:47 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:47.879 2 DEBUG cotyledon.oslo_config_glue [-] vmware.ca_file = None log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Dec 15 04:33:47 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:47.879 2 DEBUG cotyledon.oslo_config_glue [-] vmware.host_ip = 127.0.0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Dec 15 04:33:47 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:47.879 2 DEBUG cotyledon.oslo_config_glue [-] vmware.host_password = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Dec 15 04:33:47 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:47.879 2 DEBUG cotyledon.oslo_config_glue [-] vmware.host_port = 443 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Dec 15 04:33:47 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:47.879 2 DEBUG cotyledon.oslo_config_glue [-] vmware.host_username = log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Dec 15 04:33:47 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:47.879 2 DEBUG cotyledon.oslo_config_glue [-] vmware.insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Dec 15 04:33:47 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:47.879 2 DEBUG cotyledon.oslo_config_glue [-] vmware.task_poll_interval = 0.5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Dec 15 04:33:47 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:47.879 2 DEBUG cotyledon.oslo_config_glue [-] vmware.wsdl_location = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Dec 15 04:33:47 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:47.879 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.auth_section = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Dec 15 04:33:47 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:47.879 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.auth_type = 
password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Dec 15 04:33:47 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:47.879 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.cafile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Dec 15 04:33:47 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:47.880 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.certfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Dec 15 04:33:47 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:47.880 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Dec 15 04:33:47 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:47.880 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Dec 15 04:33:47 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:47.880 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.interface = internalURL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Dec 15 04:33:47 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:47.880 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.keyfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Dec 15 04:33:47 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:47.880 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.region_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Dec 15 04:33:47 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:47.880 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Dec 15 04:33:47 localhost 
ceilometer_agent_compute[238479]: 2025-12-15 09:33:47.880 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.timeout = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Dec 15 04:33:47 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:47.880 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.auth_section = service_credentials log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Dec 15 04:33:47 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:47.880 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.auth_type = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Dec 15 04:33:47 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:47.880 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.cafile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Dec 15 04:33:47 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:47.880 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.certfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Dec 15 04:33:47 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:47.880 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Dec 15 04:33:47 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:47.881 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Dec 15 04:33:47 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:47.881 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.interface = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Dec 15 04:33:47 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:47.881 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.keyfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Dec 15 
04:33:47 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:47.881 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.region_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Dec 15 04:33:47 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:47.881 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Dec 15 04:33:47 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:47.881 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.timeout = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Dec 15 04:33:47 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:47.881 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.auth_section = service_credentials log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Dec 15 04:33:47 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:47.881 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.auth_type = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Dec 15 04:33:47 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:47.881 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.cafile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Dec 15 04:33:47 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:47.881 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.certfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Dec 15 04:33:47 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:47.881 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Dec 15 04:33:47 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:47.881 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Dec 15 
04:33:47 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:47.882 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.interface = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Dec 15 04:33:47 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:47.882 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.keyfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Dec 15 04:33:47 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:47.882 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.region_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Dec 15 04:33:47 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:47.882 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Dec 15 04:33:47 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:47.882 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.timeout = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Dec 15 04:33:47 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:47.882 2 DEBUG cotyledon.oslo_config_glue [-] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 Dec 15 04:33:47 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:47.900 12 INFO ceilometer.polling.manager [-] Looking for dynamic pollsters configurations at [['/etc/ceilometer/pollsters.d']]. Dec 15 04:33:47 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:47.901 12 INFO ceilometer.polling.manager [-] No dynamic pollsters found in folder [/etc/ceilometer/pollsters.d]. Dec 15 04:33:47 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:47.902 12 INFO ceilometer.polling.manager [-] No dynamic pollsters file found in dirs [['/etc/ceilometer/pollsters.d']]. 
Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.012 12 DEBUG ceilometer.compute.virt.libvirt.utils [-] Connecting to libvirt: qemu:///system new_libvirt_connection /usr/lib/python3.9/site-packages/ceilometer/compute/virt/libvirt/utils.py:93 Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.083 12 DEBUG cotyledon.oslo_config_glue [-] Full set of CONF: _load_service_options /usr/lib/python3.9/site-packages/cotyledon/oslo_config_glue.py:48 Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.083 12 DEBUG cotyledon.oslo_config_glue [-] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2589 Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.083 12 DEBUG cotyledon.oslo_config_glue [-] Configuration options gathered from: log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2590 Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.083 12 DEBUG cotyledon.oslo_config_glue [-] command line args: ['--polling-namespaces', 'compute', '--logfile', '/dev/stdout'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2591 Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.083 12 DEBUG cotyledon.oslo_config_glue [-] config files: ['/etc/ceilometer/ceilometer.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2592 Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.083 12 DEBUG cotyledon.oslo_config_glue [-] ================================================================================ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2594 Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.083 12 DEBUG cotyledon.oslo_config_glue [-] batch_size = 50 log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602 Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.083 12 DEBUG cotyledon.oslo_config_glue [-] cfg_file = polling.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602 Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.083 12 DEBUG cotyledon.oslo_config_glue [-] config_dir = ['/etc/ceilometer/ceilometer.conf.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602 Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.083 12 DEBUG cotyledon.oslo_config_glue [-] config_file = ['/etc/ceilometer/ceilometer.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602 Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.084 12 DEBUG cotyledon.oslo_config_glue [-] config_source = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602 Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.084 12 DEBUG cotyledon.oslo_config_glue [-] control_exchange = ceilometer log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602 Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.084 12 DEBUG cotyledon.oslo_config_glue [-] debug = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602 Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.084 12 DEBUG cotyledon.oslo_config_glue [-] default_log_levels = ['amqp=WARN', 'amqplib=WARN', 'boto=WARN', 'qpid=WARN', 'sqlalchemy=WARN', 'suds=INFO', 'oslo.messaging=INFO', 'oslo_messaging=INFO', 'iso8601=WARN', 'requests.packages.urllib3.connectionpool=WARN', 'urllib3.connectionpool=WARN', 'websocket=WARN', 'requests.packages.urllib3.util.retry=WARN', 'urllib3.util.retry=WARN', 'keystonemiddleware=WARN', 'routes.middleware=WARN', 'stevedore=WARN', 'taskflow=WARN', 'keystoneauth=WARN', 
'oslo.cache=INFO', 'oslo_policy=INFO', 'dogpile.core.dogpile=INFO', 'futurist=INFO', 'neutronclient=INFO', 'keystoneclient=INFO'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602 Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.084 12 DEBUG cotyledon.oslo_config_glue [-] event_pipeline_cfg_file = event_pipeline.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602 Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.084 12 DEBUG cotyledon.oslo_config_glue [-] graceful_shutdown_timeout = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602 Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.084 12 DEBUG cotyledon.oslo_config_glue [-] host = np0005559462.localdomain log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602 Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.084 12 DEBUG cotyledon.oslo_config_glue [-] http_timeout = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602 Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.084 12 DEBUG cotyledon.oslo_config_glue [-] hypervisor_inspector = libvirt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602 Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.084 12 DEBUG cotyledon.oslo_config_glue [-] instance_format = [instance: %(uuid)s] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602 Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.085 12 DEBUG cotyledon.oslo_config_glue [-] instance_uuid_format = [instance: %(uuid)s] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602 Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.085 12 DEBUG cotyledon.oslo_config_glue [-] libvirt_type = kvm log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602 Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.085 12 DEBUG cotyledon.oslo_config_glue [-] libvirt_uri = log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602 Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.085 12 DEBUG cotyledon.oslo_config_glue [-] log_config_append = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602 Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.085 12 DEBUG cotyledon.oslo_config_glue [-] log_date_format = %Y-%m-%d %H:%M:%S log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602 Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.085 12 DEBUG cotyledon.oslo_config_glue [-] log_dir = /var/log/ceilometer log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602 Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.085 12 DEBUG cotyledon.oslo_config_glue [-] log_file = /dev/stdout log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602 Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.085 12 DEBUG cotyledon.oslo_config_glue [-] log_options = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602 Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.085 12 DEBUG cotyledon.oslo_config_glue [-] log_rotate_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602 Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.085 12 DEBUG cotyledon.oslo_config_glue [-] log_rotate_interval_type = days log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602 Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.085 12 DEBUG cotyledon.oslo_config_glue [-] log_rotation_type = none log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602 Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.085 12 DEBUG cotyledon.oslo_config_glue [-] logging_context_format_string = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [%(global_request_id)s %(request_id)s %(user_identity)s] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602 Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.085 12 DEBUG cotyledon.oslo_config_glue [-] logging_debug_format_suffix = %(funcName)s %(pathname)s:%(lineno)d log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602 Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.086 12 DEBUG cotyledon.oslo_config_glue [-] logging_default_format_string = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [-] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602 Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.086 12 DEBUG cotyledon.oslo_config_glue [-] logging_exception_prefix = %(asctime)s.%(msecs)03d %(process)d ERROR %(name)s %(instance)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602 Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.086 12 DEBUG cotyledon.oslo_config_glue [-] logging_user_identity_format = %(user)s %(project)s %(domain)s %(system_scope)s %(user_domain)s %(project_domain)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602 Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.086 12 DEBUG cotyledon.oslo_config_glue [-] max_logfile_count = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602 Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.086 12 DEBUG cotyledon.oslo_config_glue [-] max_logfile_size_mb = 200 log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602 Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.086 12 DEBUG cotyledon.oslo_config_glue [-] max_parallel_requests = 64 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602 Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.086 12 DEBUG cotyledon.oslo_config_glue [-] partitioning_group_prefix = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602 Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.086 12 DEBUG cotyledon.oslo_config_glue [-] pipeline_cfg_file = pipeline.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602 Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.086 12 DEBUG cotyledon.oslo_config_glue [-] polling_namespaces = ['compute'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602 Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.086 12 DEBUG cotyledon.oslo_config_glue [-] pollsters_definitions_dirs = ['/etc/ceilometer/pollsters.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602 Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.086 12 DEBUG cotyledon.oslo_config_glue [-] publish_errors = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602 Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.086 12 DEBUG cotyledon.oslo_config_glue [-] rate_limit_burst = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602 Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.087 12 DEBUG cotyledon.oslo_config_glue [-] rate_limit_except_level = CRITICAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602 Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.087 12 DEBUG 
cotyledon.oslo_config_glue [-] rate_limit_interval = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602 Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.087 12 DEBUG cotyledon.oslo_config_glue [-] reseller_prefix = AUTH_ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602 Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.087 12 DEBUG cotyledon.oslo_config_glue [-] reserved_metadata_keys = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602 Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.087 12 DEBUG cotyledon.oslo_config_glue [-] reserved_metadata_length = 256 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602 Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.087 12 DEBUG cotyledon.oslo_config_glue [-] reserved_metadata_namespace = ['metering.'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602 Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.087 12 DEBUG cotyledon.oslo_config_glue [-] rootwrap_config = /etc/ceilometer/rootwrap.conf log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602 Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.087 12 DEBUG cotyledon.oslo_config_glue [-] sample_source = openstack log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602 Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.087 12 DEBUG cotyledon.oslo_config_glue [-] syslog_log_facility = LOG_USER log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602 Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.087 12 DEBUG cotyledon.oslo_config_glue [-] tenant_name_discovery = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602 Dec 15 04:33:48 localhost 
ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.087 12 DEBUG cotyledon.oslo_config_glue [-] transport_url = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602 Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.087 12 DEBUG cotyledon.oslo_config_glue [-] use_eventlog = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602 Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.087 12 DEBUG cotyledon.oslo_config_glue [-] use_journal = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602 Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.088 12 DEBUG cotyledon.oslo_config_glue [-] use_json = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602 Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.088 12 DEBUG cotyledon.oslo_config_glue [-] use_stderr = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602 Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.088 12 DEBUG cotyledon.oslo_config_glue [-] use_syslog = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602 Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.088 12 DEBUG cotyledon.oslo_config_glue [-] watch_log_file = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602 Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.088 12 DEBUG cotyledon.oslo_config_glue [-] compute.instance_discovery_method = libvirt_metadata log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.088 12 DEBUG cotyledon.oslo_config_glue [-] compute.resource_cache_expiry = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Dec 15 04:33:48 localhost 
ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.088 12 DEBUG cotyledon.oslo_config_glue [-] compute.resource_update_interval = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.088 12 DEBUG cotyledon.oslo_config_glue [-] coordination.backend_url = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.088 12 DEBUG cotyledon.oslo_config_glue [-] event.definitions_cfg_file = event_definitions.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.088 12 DEBUG cotyledon.oslo_config_glue [-] event.drop_unmatched_notifications = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.089 12 DEBUG cotyledon.oslo_config_glue [-] event.store_raw = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.089 12 DEBUG cotyledon.oslo_config_glue [-] ipmi.node_manager_init_retry = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.089 12 DEBUG cotyledon.oslo_config_glue [-] ipmi.polling_retry = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.089 12 DEBUG cotyledon.oslo_config_glue [-] meter.meter_definitions_dirs = ['/etc/ceilometer/meters.d', '/usr/lib/python3.9/site-packages/ceilometer/data/meters.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.089 12 DEBUG 
cotyledon.oslo_config_glue [-] monasca.archive_on_failure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.089 12 DEBUG cotyledon.oslo_config_glue [-] monasca.archive_path = mon_pub_failures.txt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.089 12 DEBUG cotyledon.oslo_config_glue [-] monasca.auth_section = service_credentials log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.089 12 DEBUG cotyledon.oslo_config_glue [-] monasca.auth_type = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.089 12 DEBUG cotyledon.oslo_config_glue [-] monasca.batch_count = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.089 12 DEBUG cotyledon.oslo_config_glue [-] monasca.batch_max_retries = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.089 12 DEBUG cotyledon.oslo_config_glue [-] monasca.batch_mode = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.089 12 DEBUG cotyledon.oslo_config_glue [-] monasca.batch_polling_interval = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.090 12 DEBUG cotyledon.oslo_config_glue [-] monasca.batch_timeout = 15 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Dec 15 04:33:48 localhost 
ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.090 12 DEBUG cotyledon.oslo_config_glue [-] monasca.cafile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.090 12 DEBUG cotyledon.oslo_config_glue [-] monasca.certfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.090 12 DEBUG cotyledon.oslo_config_glue [-] monasca.client_max_retries = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.090 12 DEBUG cotyledon.oslo_config_glue [-] monasca.client_retry_interval = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.090 12 DEBUG cotyledon.oslo_config_glue [-] monasca.clientapi_version = 2_0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.090 12 DEBUG cotyledon.oslo_config_glue [-] monasca.cloud_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.090 12 DEBUG cotyledon.oslo_config_glue [-] monasca.cluster = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.090 12 DEBUG cotyledon.oslo_config_glue [-] monasca.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.090 12 DEBUG cotyledon.oslo_config_glue [-] monasca.control_plane = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Dec 15 
04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.090 12 DEBUG cotyledon.oslo_config_glue [-] monasca.enable_api_pagination = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.090 12 DEBUG cotyledon.oslo_config_glue [-] monasca.insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.091 12 DEBUG cotyledon.oslo_config_glue [-] monasca.interface = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.091 12 DEBUG cotyledon.oslo_config_glue [-] monasca.keyfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.091 12 DEBUG cotyledon.oslo_config_glue [-] monasca.monasca_mappings = /etc/ceilometer/monasca_field_definitions.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.091 12 DEBUG cotyledon.oslo_config_glue [-] monasca.region_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.091 12 DEBUG cotyledon.oslo_config_glue [-] monasca.retry_on_failure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.091 12 DEBUG cotyledon.oslo_config_glue [-] monasca.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.091 12 DEBUG cotyledon.oslo_config_glue [-] monasca.timeout = None 
log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.091 12 DEBUG cotyledon.oslo_config_glue [-] notification.ack_on_event_error = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.091 12 DEBUG cotyledon.oslo_config_glue [-] notification.batch_size = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.091 12 DEBUG cotyledon.oslo_config_glue [-] notification.batch_timeout = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.091 12 DEBUG cotyledon.oslo_config_glue [-] notification.messaging_urls = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.091 12 DEBUG cotyledon.oslo_config_glue [-] notification.notification_control_exchanges = ['nova', 'glance', 'neutron', 'cinder', 'heat', 'keystone', 'sahara', 'trove', 'zaqar', 'swift', 'ceilometer', 'magnum', 'dns', 'ironic', 'aodh'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.092 12 DEBUG cotyledon.oslo_config_glue [-] notification.pipelines = ['meter', 'event'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.092 12 DEBUG cotyledon.oslo_config_glue [-] notification.workers = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.092 12 DEBUG cotyledon.oslo_config_glue [-] polling.batch_size = 50 
log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.092 12 DEBUG cotyledon.oslo_config_glue [-] polling.cfg_file = polling.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.092 12 DEBUG cotyledon.oslo_config_glue [-] polling.partitioning_group_prefix = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.092 12 DEBUG cotyledon.oslo_config_glue [-] polling.pollsters_definitions_dirs = ['/etc/ceilometer/pollsters.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.092 12 DEBUG cotyledon.oslo_config_glue [-] polling.tenant_name_discovery = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.092 12 DEBUG cotyledon.oslo_config_glue [-] publisher.telemetry_secret = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.092 12 DEBUG cotyledon.oslo_config_glue [-] publisher_notifier.event_topic = event log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.092 12 DEBUG cotyledon.oslo_config_glue [-] publisher_notifier.metering_topic = metering log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.092 12 DEBUG cotyledon.oslo_config_glue [-] publisher_notifier.telemetry_driver = messagingv2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Dec 15 
04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.092 12 DEBUG cotyledon.oslo_config_glue [-] rgw_admin_credentials.access_key = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.093 12 DEBUG cotyledon.oslo_config_glue [-] rgw_admin_credentials.secret_key = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.093 12 DEBUG cotyledon.oslo_config_glue [-] rgw_client.implicit_tenants = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.093 12 DEBUG cotyledon.oslo_config_glue [-] service_types.cinder = volumev3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.093 12 DEBUG cotyledon.oslo_config_glue [-] service_types.glance = image log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.093 12 DEBUG cotyledon.oslo_config_glue [-] service_types.neutron = network log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.093 12 DEBUG cotyledon.oslo_config_glue [-] service_types.nova = compute log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.093 12 DEBUG cotyledon.oslo_config_glue [-] service_types.radosgw = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.093 12 DEBUG cotyledon.oslo_config_glue [-] service_types.swift = object-store 
log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.093 12 DEBUG cotyledon.oslo_config_glue [-] vmware.api_retry_count = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.093 12 DEBUG cotyledon.oslo_config_glue [-] vmware.ca_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.093 12 DEBUG cotyledon.oslo_config_glue [-] vmware.host_ip = 127.0.0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.093 12 DEBUG cotyledon.oslo_config_glue [-] vmware.host_password = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.094 12 DEBUG cotyledon.oslo_config_glue [-] vmware.host_port = 443 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.094 12 DEBUG cotyledon.oslo_config_glue [-] vmware.host_username = log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.094 12 DEBUG cotyledon.oslo_config_glue [-] vmware.insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.094 12 DEBUG cotyledon.oslo_config_glue [-] vmware.task_poll_interval = 0.5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.094 12 DEBUG cotyledon.oslo_config_glue [-] vmware.wsdl_location = None 
log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.094 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.auth_section = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.094 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.auth_type = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.094 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.auth_url = http://keystone-internal.openstack.svc:5000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.094 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.cafile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.094 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.certfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.094 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.094 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.default_domain_id = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.095 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.default_domain_name = None log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.095 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.domain_id = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.095 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.domain_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.095 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.095 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.interface = internalURL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.095 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.keyfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.095 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.password = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.095 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.project_domain_id = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.095 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.project_domain_name = Default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Dec 15 04:33:48 localhost 
ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.095 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.project_id = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.095 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.project_name = service log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.095 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.region_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.095 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.096 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.system_scope = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.096 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.timeout = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.096 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.trust_id = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.096 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.user_domain_id = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.096 12 DEBUG cotyledon.oslo_config_glue [-] 
service_credentials.user_domain_name = Default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.096 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.user_id = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.096 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.username = ceilometer log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.096 12 DEBUG cotyledon.oslo_config_glue [-] gnocchi.auth_section = service_credentials log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.096 12 DEBUG cotyledon.oslo_config_glue [-] gnocchi.auth_type = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.096 12 DEBUG cotyledon.oslo_config_glue [-] gnocchi.cafile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.096 12 DEBUG cotyledon.oslo_config_glue [-] gnocchi.certfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.096 12 DEBUG cotyledon.oslo_config_glue [-] gnocchi.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.096 12 DEBUG cotyledon.oslo_config_glue [-] gnocchi.insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 
2025-12-15 09:33:48.097 12 DEBUG cotyledon.oslo_config_glue [-] gnocchi.interface = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.097 12 DEBUG cotyledon.oslo_config_glue [-] gnocchi.keyfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.097 12 DEBUG cotyledon.oslo_config_glue [-] gnocchi.region_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.097 12 DEBUG cotyledon.oslo_config_glue [-] gnocchi.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.097 12 DEBUG cotyledon.oslo_config_glue [-] gnocchi.timeout = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.097 12 DEBUG cotyledon.oslo_config_glue [-] zaqar.auth_section = service_credentials log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.097 12 DEBUG cotyledon.oslo_config_glue [-] zaqar.auth_type = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.097 12 DEBUG cotyledon.oslo_config_glue [-] zaqar.cafile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.097 12 DEBUG cotyledon.oslo_config_glue [-] zaqar.certfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Dec 15 04:33:48 localhost 
ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.097 12 DEBUG cotyledon.oslo_config_glue [-] zaqar.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.097 12 DEBUG cotyledon.oslo_config_glue [-] zaqar.insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.097 12 DEBUG cotyledon.oslo_config_glue [-] zaqar.interface = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.097 12 DEBUG cotyledon.oslo_config_glue [-] zaqar.keyfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.098 12 DEBUG cotyledon.oslo_config_glue [-] zaqar.region_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.098 12 DEBUG cotyledon.oslo_config_glue [-] zaqar.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.098 12 DEBUG cotyledon.oslo_config_glue [-] zaqar.timeout = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.098 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_notifications.driver = ['noop'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.098 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_notifications.retry = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 
Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.098 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_notifications.topics = ['notifications'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.098 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_notifications.transport_url = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.098 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.amqp_auto_delete = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.098 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.amqp_durable_queues = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.098 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.conn_pool_min_size = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.098 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.conn_pool_ttl = 1200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.098 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.direct_mandatory_flag = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.098 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.enable_cancel_on_failover = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Dec 15 04:33:48 localhost 
ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.099 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.heartbeat_in_pthread = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.099 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.heartbeat_rate = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.099 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.heartbeat_timeout_threshold = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.099 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.kombu_compression = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.099 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.kombu_failover_strategy = round-robin log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.099 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.kombu_missing_consumer_retry_timeout = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.099 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.kombu_reconnect_delay = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.099 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rabbit_ha_queues = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Dec 15 04:33:48 localhost 
ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.099 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rabbit_interval_max = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.099 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rabbit_login_method = AMQPLAIN log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.099 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rabbit_qos_prefetch_count = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.099 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rabbit_quorum_delivery_limit = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.100 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rabbit_quorum_max_memory_bytes = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.100 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rabbit_quorum_max_memory_length = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.100 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rabbit_quorum_queue = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.100 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rabbit_retry_backoff = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Dec 15 04:33:48 localhost 
ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.100 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rabbit_retry_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.100 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rabbit_transient_queues_ttl = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.100 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rpc_conn_pool_size = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.100 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.ssl = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.100 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.ssl_ca_file = log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.100 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.ssl_cert_file = log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.100 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.ssl_enforce_fips_mode = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.100 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.ssl_key_file = log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.100 12 DEBUG 
cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.ssl_version = log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609 Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.101 12 DEBUG cotyledon.oslo_config_glue [-] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613 Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.101 12 DEBUG cotyledon._service [-] Run service AgentManager(0) [12] wait_forever /usr/lib/python3.9/site-packages/cotyledon/_service.py:241 Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.104 12 DEBUG ceilometer.agent [-] Config file: {'sources': [{'name': 'pollsters', 'interval': 120, 'meters': ['power.state', 'cpu', 'memory.usage', 'disk.*', 'network.*']}]} load_config /usr/lib/python3.9/site-packages/ceilometer/agent.py:64 Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.112 12 DEBUG ceilometer.compute.virt.libvirt.utils [-] Connecting to libvirt: qemu:///system new_libvirt_connection /usr/lib/python3.9/site-packages/ceilometer/compute/virt/libvirt/utils.py:93 Dec 15 04:33:48 localhost python3.9[238622]: ansible-ansible.builtin.slurp Invoked with src=/var/lib/edpm-config/deployed_services.yaml Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.501 12 DEBUG novaclient.v2.client [-] REQ: curl -g -i -X GET http://nova-internal.openstack.svc:8774/v2.1/flavors?is_public=None -H "Accept: application/json" -H "User-Agent: python-novaclient" -H "X-Auth-Token: {SHA256}f9025d5df967f79e7acc7e55155d4b5ec75911ca267ffd714f746eb1b20c656a" -H "X-OpenStack-Nova-API-Version: 2.1" _http_log_request /usr/lib/python3.9/site-packages/keystoneauth1/session.py:519 Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.596 12 DEBUG novaclient.v2.client [-] RESP: [200] Connection: 
Keep-Alive Content-Length: 327 Content-Type: application/json Date: Mon, 15 Dec 2025 09:33:48 GMT Keep-Alive: timeout=5, max=100 OpenStack-API-Version: compute 2.1 Server: Apache Vary: OpenStack-API-Version,X-OpenStack-Nova-API-Version X-OpenStack-Nova-API-Version: 2.1 x-compute-request-id: req-a65d1c8d-eeed-4a50-861a-a42ac65b5f91 x-openstack-request-id: req-a65d1c8d-eeed-4a50-861a-a42ac65b5f91 _http_log_response /usr/lib/python3.9/site-packages/keystoneauth1/session.py:550 Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.597 12 DEBUG novaclient.v2.client [-] RESP BODY: {"flavors": [{"id": "2da0e147-aaa7-4bb9-a176-5fe1b15a32a0", "name": "m1.small", "links": [{"rel": "self", "href": "http://nova-internal.openstack.svc:8774/v2.1/flavors/2da0e147-aaa7-4bb9-a176-5fe1b15a32a0"}, {"rel": "bookmark", "href": "http://nova-internal.openstack.svc:8774/flavors/2da0e147-aaa7-4bb9-a176-5fe1b15a32a0"}]}]} _http_log_response /usr/lib/python3.9/site-packages/keystoneauth1/session.py:582 Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.598 12 DEBUG novaclient.v2.client [-] GET call to compute for http://nova-internal.openstack.svc:8774/v2.1/flavors?is_public=None used request id req-a65d1c8d-eeed-4a50-861a-a42ac65b5f91 request /usr/lib/python3.9/site-packages/keystoneauth1/session.py:954 Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.600 12 DEBUG novaclient.v2.client [-] REQ: curl -g -i -X GET http://nova-internal.openstack.svc:8774/v2.1/flavors/2da0e147-aaa7-4bb9-a176-5fe1b15a32a0 -H "Accept: application/json" -H "User-Agent: python-novaclient" -H "X-Auth-Token: {SHA256}f9025d5df967f79e7acc7e55155d4b5ec75911ca267ffd714f746eb1b20c656a" -H "X-OpenStack-Nova-API-Version: 2.1" _http_log_request /usr/lib/python3.9/site-packages/keystoneauth1/session.py:519 Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.677 12 DEBUG novaclient.v2.client [-] RESP: [200] 
Connection: Keep-Alive Content-Length: 494 Content-Type: application/json Date: Mon, 15 Dec 2025 09:33:48 GMT Keep-Alive: timeout=5, max=99 OpenStack-API-Version: compute 2.1 Server: Apache Vary: OpenStack-API-Version,X-OpenStack-Nova-API-Version X-OpenStack-Nova-API-Version: 2.1 x-compute-request-id: req-81e6fb46-34b8-4601-989a-5b1da04c5e5f x-openstack-request-id: req-81e6fb46-34b8-4601-989a-5b1da04c5e5f _http_log_response /usr/lib/python3.9/site-packages/keystoneauth1/session.py:550 Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.678 12 DEBUG novaclient.v2.client [-] RESP BODY: {"flavor": {"id": "2da0e147-aaa7-4bb9-a176-5fe1b15a32a0", "name": "m1.small", "ram": 512, "disk": 1, "swap": "", "OS-FLV-EXT-DATA:ephemeral": 1, "OS-FLV-DISABLED:disabled": false, "vcpus": 1, "os-flavor-access:is_public": true, "rxtx_factor": 1.0, "links": [{"rel": "self", "href": "http://nova-internal.openstack.svc:8774/v2.1/flavors/2da0e147-aaa7-4bb9-a176-5fe1b15a32a0"}, {"rel": "bookmark", "href": "http://nova-internal.openstack.svc:8774/flavors/2da0e147-aaa7-4bb9-a176-5fe1b15a32a0"}]}} _http_log_response /usr/lib/python3.9/site-packages/keystoneauth1/session.py:582 Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.678 12 DEBUG novaclient.v2.client [-] GET call to compute for http://nova-internal.openstack.svc:8774/v2.1/flavors/2da0e147-aaa7-4bb9-a176-5fe1b15a32a0 used request id req-81e6fb46-34b8-4601-989a-5b1da04c5e5f request /usr/lib/python3.9/site-packages/keystoneauth1/session.py:954 Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.680 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359', 'name': 'test', 'flavor': {'id': '2da0e147-aaa7-4bb9-a176-5fe1b15a32a0', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'image': {'id': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e'}, 'os_type': 'hvm', 'architecture': 'x86_64', 
'OS-EXT-SRV-ATTR:instance_name': 'instance-00000002', 'OS-EXT-SRV-ATTR:host': 'np0005559462.localdomain', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': 'c785bf23f53946bc99867d8832a50266', 'user_id': '1ba5fce347b64bfebf995f187193f205', 'hostId': 'bf4d507aa00b3fcf643a7e0b883420632ac7fc96e8c7a756982e121d', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228 Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.681 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.bytes in the context of pollsters Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.720 12 DEBUG ceilometer.compute.pollsters [-] 39ff1bd9-6f6b-44c8-bbec-a1fd9d196359/disk.device.read.bytes volume: 29130240 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.721 12 DEBUG ceilometer.compute.pollsters [-] 39ff1bd9-6f6b-44c8-bbec-a1fd9d196359/disk.device.read.bytes volume: 4300800 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.729 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': '8829b973-31e6-42ee-aed8-4f7c43afa6f2', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 29130240, 'user_id': '1ba5fce347b64bfebf995f187193f205', 'user_name': None, 'project_id': 'c785bf23f53946bc99867d8832a50266', 'project_name': None, 'resource_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359-vda', 'timestamp': '2025-12-15T09:33:48.681690', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359', 'instance_type': 'm1.small', 'host': 'bf4d507aa00b3fcf643a7e0b883420632ac7fc96e8c7a756982e121d', 'instance_host': 'np0005559462.localdomain', 'flavor': {'id': '2da0e147-aaa7-4bb9-a176-5fe1b15a32a0', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e'}, 'image_ref': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '28dbfd02-d999-11f0-817e-fa163ebaca0f', 'monotonic_time': 10294.874494309, 'message_signature': '8b8ea81ba3a92f9d115ca1f5f3be96af83db6b7aad769b04cb54389f23fb2c38'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 4300800, 'user_id': '1ba5fce347b64bfebf995f187193f205', 'user_name': None, 'project_id': 'c785bf23f53946bc99867d8832a50266', 'project_name': None, 'resource_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359-vdb', 'timestamp': '2025-12-15T09:33:48.681690', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 
'39ff1bd9-6f6b-44c8-bbec-a1fd9d196359', 'instance_type': 'm1.small', 'host': 'bf4d507aa00b3fcf643a7e0b883420632ac7fc96e8c7a756982e121d', 'instance_host': 'np0005559462.localdomain', 'flavor': {'id': '2da0e147-aaa7-4bb9-a176-5fe1b15a32a0', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e'}, 'image_ref': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '28dc1c10-d999-11f0-817e-fa163ebaca0f', 'monotonic_time': 10294.874494309, 'message_signature': 'fc763b474064c941b516bf0e6c36ecf9a1de515643589635aaf980b4ea7b8c21'}]}, 'timestamp': '2025-12-15 09:33:48.722556', '_unique_id': '4504438613e44232a070e2c7d6cb888e'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.729 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.729 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.729 12 ERROR oslo_messaging.notify.messaging yield Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.729 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.729 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.729 12 ERROR 
oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.729 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.729 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.729 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.729 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.729 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.729 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.729 12 ERROR oslo_messaging.notify.messaging conn.connect() Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.729 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.729 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.729 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Dec 15 
04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.729 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.729 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.729 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.729 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.729 12 ERROR oslo_messaging.notify.messaging Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.729 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.729 12 ERROR oslo_messaging.notify.messaging Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.729 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.729 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.729 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.729 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Dec 15 04:33:48 localhost 
ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.729 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.729 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.729 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.729 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.729 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.729 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.729 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.729 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.729 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.729 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, 
in get Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.729 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.729 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.729 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.729 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.729 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.729 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.729 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.729 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.729 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.729 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 15 04:33:48 localhost 
ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.729 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.729 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.729 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.729 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.729 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.729 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.729 12 ERROR oslo_messaging.notify.messaging Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.734 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.requests in the context of pollsters Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.734 12 DEBUG ceilometer.compute.pollsters [-] 39ff1bd9-6f6b-44c8-bbec-a1fd9d196359/disk.device.read.requests volume: 1064 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.735 12 DEBUG ceilometer.compute.pollsters [-] 39ff1bd9-6f6b-44c8-bbec-a1fd9d196359/disk.device.read.requests volume: 222 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 15 04:33:48 localhost 
ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.736 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'ae3d90fd-41cd-4b14-905c-454df41efbec', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1064, 'user_id': '1ba5fce347b64bfebf995f187193f205', 'user_name': None, 'project_id': 'c785bf23f53946bc99867d8832a50266', 'project_name': None, 'resource_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359-vda', 'timestamp': '2025-12-15T09:33:48.734653', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359', 'instance_type': 'm1.small', 'host': 'bf4d507aa00b3fcf643a7e0b883420632ac7fc96e8c7a756982e121d', 'instance_host': 'np0005559462.localdomain', 'flavor': {'id': '2da0e147-aaa7-4bb9-a176-5fe1b15a32a0', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e'}, 'image_ref': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '28de0b4c-d999-11f0-817e-fa163ebaca0f', 'monotonic_time': 10294.874494309, 'message_signature': '98433235a5d9e58c5d8cdbcf250e081faa89374e7e50b9eb53abba56f9a020bb'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 222, 'user_id': '1ba5fce347b64bfebf995f187193f205', 'user_name': None, 'project_id': 'c785bf23f53946bc99867d8832a50266', 'project_name': None, 'resource_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359-vdb', 
'timestamp': '2025-12-15T09:33:48.734653', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359', 'instance_type': 'm1.small', 'host': 'bf4d507aa00b3fcf643a7e0b883420632ac7fc96e8c7a756982e121d', 'instance_host': 'np0005559462.localdomain', 'flavor': {'id': '2da0e147-aaa7-4bb9-a176-5fe1b15a32a0', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e'}, 'image_ref': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '28de2050-d999-11f0-817e-fa163ebaca0f', 'monotonic_time': 10294.874494309, 'message_signature': 'cf9d76d862f6e79ae07ceea7e257be875c48300401ae806fba3f75fa0fcf297c'}]}, 'timestamp': '2025-12-15 09:33:48.735717', '_unique_id': 'f4f277f310de4f289b2987097dbdd943'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.736 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.736 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.736 12 ERROR oslo_messaging.notify.messaging yield Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.736 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.736 12 ERROR oslo_messaging.notify.messaging return 
retry_over_time( Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.736 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.736 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.736 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.736 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.736 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.736 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.736 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.736 12 ERROR oslo_messaging.notify.messaging conn.connect() Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.736 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.736 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.736 12 ERROR 
oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.736 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.736 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.736 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.736 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.736 12 ERROR oslo_messaging.notify.messaging Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.736 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.736 12 ERROR oslo_messaging.notify.messaging Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.736 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.736 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.736 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.736 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.736 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.736 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.736 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.736 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.736 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.736 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.736 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.736 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.736 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.736 
12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.736 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.736 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.736 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.736 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.736 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.736 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.736 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.736 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.736 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.736 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.736 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.736 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.736 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.736 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.736 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.736 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.736 12 ERROR oslo_messaging.notify.messaging Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.738 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes in the context of pollsters Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.742 12 DEBUG ceilometer.compute.virt.libvirt.inspector [-] No delta meter predecessor for 39ff1bd9-6f6b-44c8-bbec-a1fd9d196359 / tap03ef8889-32 inspect_vnics /usr/lib/python3.9/site-packages/ceilometer/compute/virt/libvirt/inspector.py:136 Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.742 12 DEBUG ceilometer.compute.pollsters [-] 39ff1bd9-6f6b-44c8-bbec-a1fd9d196359/network.outgoing.bytes 
volume: 11272 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.744 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '714de1aa-36c6-42b5-9e1e-097ad7d287a9', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 11272, 'user_id': '1ba5fce347b64bfebf995f187193f205', 'user_name': None, 'project_id': 'c785bf23f53946bc99867d8832a50266', 'project_name': None, 'resource_id': 'instance-00000002-39ff1bd9-6f6b-44c8-bbec-a1fd9d196359-tap03ef8889-32', 'timestamp': '2025-12-15T09:33:48.738618', 'resource_metadata': {'display_name': 'test', 'name': 'tap03ef8889-32', 'instance_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359', 'instance_type': 'm1.small', 'host': 'bf4d507aa00b3fcf643a7e0b883420632ac7fc96e8c7a756982e121d', 'instance_host': 'np0005559462.localdomain', 'flavor': {'id': '2da0e147-aaa7-4bb9-a176-5fe1b15a32a0', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e'}, 'image_ref': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:74:df:7c', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap03ef8889-32'}, 'message_id': '28df434a-d999-11f0-817e-fa163ebaca0f', 'monotonic_time': 10294.93143381, 'message_signature': '6251bbad4e6a10e045da3a0c5cc34ddb1c0ad0e38e2f232f08b93d6c492be3b0'}]}, 'timestamp': '2025-12-15 09:33:48.743226', '_unique_id': 'd23e70e97639443ca6ca5686a635425f'}: 
kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.744 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.744 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.744 12 ERROR oslo_messaging.notify.messaging yield Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.744 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.744 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.744 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.744 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.744 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.744 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.744 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 
2025-12-15 09:33:48.744 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.744 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.744 12 ERROR oslo_messaging.notify.messaging conn.connect() Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.744 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.744 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.744 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.744 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.744 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.744 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.744 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.744 12 ERROR oslo_messaging.notify.messaging Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.744 12 ERROR 
oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.744 12 ERROR oslo_messaging.notify.messaging Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.744 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.744 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.744 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.744 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.744 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.744 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.744 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.744 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.744 12 ERROR oslo_messaging.notify.messaging with 
self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.744 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.744 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.744 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.744 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.744 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.744 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.744 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.744 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.744 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.744 12 ERROR 
oslo_messaging.notify.messaging self.ensure_connection() Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.744 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.744 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.744 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.744 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.744 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.744 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.744 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.744 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.744 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.744 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Dec 15 04:33:48 localhost 
ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.744 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.744 12 ERROR oslo_messaging.notify.messaging Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.745 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets in the context of pollsters Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.746 12 DEBUG ceilometer.compute.pollsters [-] 39ff1bd9-6f6b-44c8-bbec-a1fd9d196359/network.outgoing.packets volume: 129 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.748 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'e1112f44-b2e1-4e63-a4ab-7f823e3301d6', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 129, 'user_id': '1ba5fce347b64bfebf995f187193f205', 'user_name': None, 'project_id': 'c785bf23f53946bc99867d8832a50266', 'project_name': None, 'resource_id': 'instance-00000002-39ff1bd9-6f6b-44c8-bbec-a1fd9d196359-tap03ef8889-32', 'timestamp': '2025-12-15T09:33:48.746039', 'resource_metadata': {'display_name': 'test', 'name': 'tap03ef8889-32', 'instance_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359', 'instance_type': 'm1.small', 'host': 'bf4d507aa00b3fcf643a7e0b883420632ac7fc96e8c7a756982e121d', 'instance_host': 'np0005559462.localdomain', 'flavor': {'id': '2da0e147-aaa7-4bb9-a176-5fe1b15a32a0', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 
'task_state': '', 'image': {'id': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e'}, 'image_ref': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:74:df:7c', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap03ef8889-32'}, 'message_id': '28dfc892-d999-11f0-817e-fa163ebaca0f', 'monotonic_time': 10294.93143381, 'message_signature': '4f0708753c70ec0357a137449c00997f84829a8291fb9293b69855fb8b2bf14c'}]}, 'timestamp': '2025-12-15 09:33:48.746599', '_unique_id': '574bbcd759e9445f9b4780734871e7da'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.748 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.748 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.748 12 ERROR oslo_messaging.notify.messaging yield Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.748 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.748 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.748 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.748 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) 
Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.748 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.748 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.748 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.748 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.748 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.748 12 ERROR oslo_messaging.notify.messaging conn.connect() Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.748 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.748 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.748 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.748 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.748 12 ERROR 
oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.748 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.748 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.748 12 ERROR oslo_messaging.notify.messaging Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.748 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.748 12 ERROR oslo_messaging.notify.messaging Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.748 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.748 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.748 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.748 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.748 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.748 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.748 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.748 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.748 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.748 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.748 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.748 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.748 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.748 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.748 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.748 12 ERROR oslo_messaging.notify.messaging 
File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.748 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.748 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.748 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.748 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.748 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.748 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.748 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.748 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.748 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.748 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in 
__exit__ Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.748 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.748 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.748 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.748 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.748 12 ERROR oslo_messaging.notify.messaging Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.751 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.drop in the context of pollsters Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.751 12 DEBUG ceilometer.compute.pollsters [-] 39ff1bd9-6f6b-44c8-bbec-a1fd9d196359/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.753 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': 'c82afdc6-24ad-45f6-a8aa-460e4852ab47', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '1ba5fce347b64bfebf995f187193f205', 'user_name': None, 'project_id': 'c785bf23f53946bc99867d8832a50266', 'project_name': None, 'resource_id': 'instance-00000002-39ff1bd9-6f6b-44c8-bbec-a1fd9d196359-tap03ef8889-32', 'timestamp': '2025-12-15T09:33:48.751244', 'resource_metadata': {'display_name': 'test', 'name': 'tap03ef8889-32', 'instance_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359', 'instance_type': 'm1.small', 'host': 'bf4d507aa00b3fcf643a7e0b883420632ac7fc96e8c7a756982e121d', 'instance_host': 'np0005559462.localdomain', 'flavor': {'id': '2da0e147-aaa7-4bb9-a176-5fe1b15a32a0', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e'}, 'image_ref': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:74:df:7c', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap03ef8889-32'}, 'message_id': '28e0957e-d999-11f0-817e-fa163ebaca0f', 'monotonic_time': 10294.93143381, 'message_signature': '47bc1c96f7d51536f110784ceb958c163dda92706db9bbffff9b6f932934e9d2'}]}, 'timestamp': '2025-12-15 09:33:48.751978', '_unique_id': '3534077b51604dd4b06e3519e46c6988'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.753 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 15 04:33:48 localhost 
ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.753 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.753 12 ERROR oslo_messaging.notify.messaging yield Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.753 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.753 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.753 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.753 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.753 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.753 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.753 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.753 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.753 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.753 12 ERROR oslo_messaging.notify.messaging conn.connect() Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.753 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.753 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.753 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.753 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.753 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.753 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.753 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.753 12 ERROR oslo_messaging.notify.messaging Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.753 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.753 12 ERROR oslo_messaging.notify.messaging Dec 15 04:33:48 localhost 
ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.753 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.753 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.753 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.753 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.753 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.753 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.753 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.753 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.753 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.753 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection 
Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.753 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.753 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.753 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.753 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.753 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.753 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.753 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.753 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.753 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.753 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Dec 15 04:33:48 
localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.753 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.753 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.753 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.753 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.753 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.753 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.753 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.753 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.753 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.753 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.753 12 ERROR oslo_messaging.notify.messaging
Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.755 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.capacity in the context of pollsters
Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.769 12 DEBUG ceilometer.compute.pollsters [-] 39ff1bd9-6f6b-44c8-bbec-a1fd9d196359/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.770 12 DEBUG ceilometer.compute.pollsters [-] 39ff1bd9-6f6b-44c8-bbec-a1fd9d196359/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.772 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'e7a253ec-4d93-4107-9fb9-545c93c247a1', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '1ba5fce347b64bfebf995f187193f205', 'user_name': None, 'project_id': 'c785bf23f53946bc99867d8832a50266', 'project_name': None, 'resource_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359-vda', 'timestamp': '2025-12-15T09:33:48.755184', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359', 'instance_type': 'm1.small', 'host': 'bf4d507aa00b3fcf643a7e0b883420632ac7fc96e8c7a756982e121d', 'instance_host': 'np0005559462.localdomain', 'flavor': {'id': '2da0e147-aaa7-4bb9-a176-5fe1b15a32a0', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e'}, 'image_ref': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '28e36236-d999-11f0-817e-fa163ebaca0f', 'monotonic_time': 10294.948106968, 'message_signature': '1c65e27196ea099dacd9c0bef7e34412da590097f9864bafa8004f5afd678b35'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '1ba5fce347b64bfebf995f187193f205', 'user_name': None, 'project_id': 'c785bf23f53946bc99867d8832a50266', 'project_name': None, 'resource_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359-vdb', 'timestamp': '2025-12-15T09:33:48.755184', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359', 'instance_type': 'm1.small', 'host': 'bf4d507aa00b3fcf643a7e0b883420632ac7fc96e8c7a756982e121d', 'instance_host': 'np0005559462.localdomain', 'flavor': {'id': '2da0e147-aaa7-4bb9-a176-5fe1b15a32a0', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e'}, 'image_ref': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '28e37906-d999-11f0-817e-fa163ebaca0f', 'monotonic_time': 10294.948106968, 'message_signature': '7f80f338796bc9d107ee60f29c9408e833bc2da207dfed35e9c1e17d68af1662'}]}, 'timestamp': '2025-12-15 09:33:48.770770', '_unique_id': '2038c96411634d24b5df7c1a4d40361f'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.772 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.772 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.772 12 ERROR oslo_messaging.notify.messaging yield
Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.772 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.772 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.772 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.772 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.772 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.772 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.772 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.772 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.772 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.772 12 ERROR oslo_messaging.notify.messaging conn.connect()
Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.772 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.772 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.772 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.772 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.772 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.772 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.772 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.772 12 ERROR oslo_messaging.notify.messaging
Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.772 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.772 12 ERROR oslo_messaging.notify.messaging
Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.772 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.772 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.772 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.772 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.772 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.772 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.772 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.772 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.772 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.772 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.772 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.772 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.772 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.772 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.772 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.772 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.772 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.772 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.772 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.772 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.772 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.772 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.772 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.772 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.772 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.772 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.772 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.772 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.772 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.772 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.772 12 ERROR oslo_messaging.notify.messaging
Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.773 12 INFO ceilometer.polling.manager [-] Polling pollster cpu in the context of pollsters
Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.797 12 DEBUG ceilometer.compute.pollsters [-] 39ff1bd9-6f6b-44c8-bbec-a1fd9d196359/cpu volume: 53230000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.799 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'd8d7ce5f-9191-4839-8edb-adce510d9831', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 53230000000, 'user_id': '1ba5fce347b64bfebf995f187193f205', 'user_name': None, 'project_id': 'c785bf23f53946bc99867d8832a50266', 'project_name': None, 'resource_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359', 'timestamp': '2025-12-15T09:33:48.773769', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359', 'instance_type': 'm1.small', 'host': 'bf4d507aa00b3fcf643a7e0b883420632ac7fc96e8c7a756982e121d', 'instance_host': 'np0005559462.localdomain', 'flavor': {'id': '2da0e147-aaa7-4bb9-a176-5fe1b15a32a0', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e'}, 'image_ref': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'cpu_number': 1}, 'message_id': '28e795cc-d999-11f0-817e-fa163ebaca0f', 'monotonic_time': 10294.989199705, 'message_signature': 'dfbe38ea2c56dfdaeb68ae3f62977d39cee3a1e5c41be4fd179fc48adc9ea51a'}]}, 'timestamp': '2025-12-15 09:33:48.797822', '_unique_id': '63b80d4775344aa5875e8c43beead736'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.799 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.799 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.799 12 ERROR oslo_messaging.notify.messaging yield
Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.799 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.799 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.799 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.799 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.799 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.799 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.799 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.799 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.799 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.799 12 ERROR oslo_messaging.notify.messaging conn.connect()
Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.799 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.799 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.799 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.799 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.799 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.799 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.799 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.799 12 ERROR oslo_messaging.notify.messaging
Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.799 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.799 12 ERROR oslo_messaging.notify.messaging
Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.799 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.799 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.799 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.799 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.799 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.799 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.799 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.799 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.799 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.799 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.799 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.799 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.799 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.799 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.799 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.799 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.799 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.799 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.799 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.799 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.799 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.799 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.799 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.799 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.799 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.799 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.799 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.799 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.799 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.799 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.799 12 ERROR oslo_messaging.notify.messaging
Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.800 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.requests in the context of pollsters
Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.801 12 DEBUG ceilometer.compute.pollsters [-] 39ff1bd9-6f6b-44c8-bbec-a1fd9d196359/disk.device.write.requests volume: 497 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.801 12 DEBUG ceilometer.compute.pollsters [-] 39ff1bd9-6f6b-44c8-bbec-a1fd9d196359/disk.device.write.requests volume: 1 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.802 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '59deb719-a4e7-452a-bccd-ab5995eb0cef', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 497, 'user_id': '1ba5fce347b64bfebf995f187193f205', 'user_name': None, 'project_id': 'c785bf23f53946bc99867d8832a50266', 'project_name': None, 'resource_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359-vda', 'timestamp': '2025-12-15T09:33:48.800960', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359', 'instance_type': 'm1.small', 'host': 'bf4d507aa00b3fcf643a7e0b883420632ac7fc96e8c7a756982e121d', 'instance_host': 'np0005559462.localdomain', 'flavor': {'id': '2da0e147-aaa7-4bb9-a176-5fe1b15a32a0', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e'}, 'image_ref': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '28e828ca-d999-11f0-817e-fa163ebaca0f', 'monotonic_time': 10294.874494309, 'message_signature': 'b9717cba2e77b2e6cbff5ed0541591b11c5b8b91e2b30b567e4adef57ff447d3'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1, 'user_id': '1ba5fce347b64bfebf995f187193f205', 'user_name': None, 'project_id': 'c785bf23f53946bc99867d8832a50266', 'project_name': None, 'resource_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359-vdb', 'timestamp': '2025-12-15T09:33:48.800960', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359', 'instance_type': 'm1.small', 'host': 'bf4d507aa00b3fcf643a7e0b883420632ac7fc96e8c7a756982e121d', 'instance_host': 'np0005559462.localdomain', 'flavor': {'id': '2da0e147-aaa7-4bb9-a176-5fe1b15a32a0', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e'}, 'image_ref': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '28e83a72-d999-11f0-817e-fa163ebaca0f', 'monotonic_time': 10294.874494309, 'message_signature': 'e4e30bfd79eef5a7348f0fe5d0727d728e08872e2a35a34efe7b41aa41a55ddc'}]}, 'timestamp': '2025-12-15 09:33:48.801911', '_unique_id': 'fc6216cdc7144bd39c5d7d276e60140f'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.802 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.802 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.802 12 ERROR oslo_messaging.notify.messaging yield
Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.802 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.802 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.802 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.802 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.802 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.802 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.802 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.802 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.802 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.802 12 ERROR oslo_messaging.notify.messaging conn.connect()
Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.802 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.802 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.802 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.802 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.802 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.802 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.802 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.802 12 ERROR oslo_messaging.notify.messaging
Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.802 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.802 12 ERROR oslo_messaging.notify.messaging
Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.802 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.802 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.802 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.802 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.802 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.802 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.802 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.802 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.802 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.802 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.802 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.802 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.802 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.802 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98,
in get Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.802 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.802 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.802 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.802 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.802 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.802 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.802 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.802 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.802 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.802 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 15 04:33:48 localhost 
ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.802 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.802 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.802 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.802 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.802 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.802 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.802 12 ERROR oslo_messaging.notify.messaging Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.804 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.bytes in the context of pollsters Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.804 12 DEBUG ceilometer.compute.pollsters [-] 39ff1bd9-6f6b-44c8-bbec-a1fd9d196359/disk.device.write.bytes volume: 73912320 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.805 12 DEBUG ceilometer.compute.pollsters [-] 39ff1bd9-6f6b-44c8-bbec-a1fd9d196359/disk.device.write.bytes volume: 512 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 15 04:33:48 localhost 
ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.806 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'ce6094b8-a821-4385-ba24-0d9b99919f7b', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 73912320, 'user_id': '1ba5fce347b64bfebf995f187193f205', 'user_name': None, 'project_id': 'c785bf23f53946bc99867d8832a50266', 'project_name': None, 'resource_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359-vda', 'timestamp': '2025-12-15T09:33:48.804643', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359', 'instance_type': 'm1.small', 'host': 'bf4d507aa00b3fcf643a7e0b883420632ac7fc96e8c7a756982e121d', 'instance_host': 'np0005559462.localdomain', 'flavor': {'id': '2da0e147-aaa7-4bb9-a176-5fe1b15a32a0', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e'}, 'image_ref': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '28e8b70e-d999-11f0-817e-fa163ebaca0f', 'monotonic_time': 10294.874494309, 'message_signature': '43f1fdb3aa72096130a44f5c44b012c0b1a2979344e4bc4016dd923fe3c1cbf1'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 512, 'user_id': '1ba5fce347b64bfebf995f187193f205', 'user_name': None, 'project_id': 'c785bf23f53946bc99867d8832a50266', 'project_name': None, 'resource_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359-vdb', 'timestamp': 
'2025-12-15T09:33:48.804643', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359', 'instance_type': 'm1.small', 'host': 'bf4d507aa00b3fcf643a7e0b883420632ac7fc96e8c7a756982e121d', 'instance_host': 'np0005559462.localdomain', 'flavor': {'id': '2da0e147-aaa7-4bb9-a176-5fe1b15a32a0', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e'}, 'image_ref': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '28e8c9a6-d999-11f0-817e-fa163ebaca0f', 'monotonic_time': 10294.874494309, 'message_signature': 'eadbe719a1bcafda9335a2b755c02a114645bc243f411f385358c9a50eb2d9bc'}]}, 'timestamp': '2025-12-15 09:33:48.805580', '_unique_id': '877a75fcd28d46daba5fd0b7a07822df'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.806 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.806 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.806 12 ERROR oslo_messaging.notify.messaging yield Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.806 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.806 12 ERROR oslo_messaging.notify.messaging return retry_over_time( 
Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.806 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.806 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.806 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.806 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.806 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.806 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.806 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.806 12 ERROR oslo_messaging.notify.messaging conn.connect()
Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.806 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.806 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.806 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.806 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.806 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.806 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.806 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.806 12 ERROR oslo_messaging.notify.messaging
Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.806 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.806 12 ERROR oslo_messaging.notify.messaging
Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.806 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.806 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.806 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.806 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.806 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.806 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.806 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.806 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.806 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.806 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.806 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.806 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.806 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.806 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.806 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.806 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.806 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.806 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.806 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.806 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.806 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.806 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.806 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.806 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.806 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.806 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.806 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.806 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.806 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.806 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.806 12 ERROR oslo_messaging.notify.messaging
Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.807 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.delta in the context of pollsters
Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.807 12 DEBUG ceilometer.compute.pollsters [-] 39ff1bd9-6f6b-44c8-bbec-a1fd9d196359/network.incoming.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.809 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '6229ee67-50cf-4739-986c-4a1d9c9e00ab', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '1ba5fce347b64bfebf995f187193f205', 'user_name': None, 'project_id': 'c785bf23f53946bc99867d8832a50266', 'project_name': None, 'resource_id': 'instance-00000002-39ff1bd9-6f6b-44c8-bbec-a1fd9d196359-tap03ef8889-32', 'timestamp': '2025-12-15T09:33:48.807850', 'resource_metadata': {'display_name': 'test', 'name': 'tap03ef8889-32', 'instance_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359', 'instance_type': 'm1.small', 'host': 'bf4d507aa00b3fcf643a7e0b883420632ac7fc96e8c7a756982e121d', 'instance_host': 'np0005559462.localdomain', 'flavor': {'id': '2da0e147-aaa7-4bb9-a176-5fe1b15a32a0', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e'}, 'image_ref': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:74:df:7c', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap03ef8889-32'}, 'message_id': '28e93602-d999-11f0-817e-fa163ebaca0f', 'monotonic_time': 10294.93143381, 'message_signature': 'c7d05c1e8937fa25604c4917978d00caa4f74f046272fc3270727b7489b0417d'}]}, 'timestamp': '2025-12-15 09:33:48.808383', '_unique_id': 'e144f3aa47de490597b1c986967696f5'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.809 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.809 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.809 12 ERROR oslo_messaging.notify.messaging yield
Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.809 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.809 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.809 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.809 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.809 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.809 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.809 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.809 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.809 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.809 12 ERROR oslo_messaging.notify.messaging conn.connect()
Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.809 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.809 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.809 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.809 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.809 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.809 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.809 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.809 12 ERROR oslo_messaging.notify.messaging
Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.809 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.809 12 ERROR oslo_messaging.notify.messaging
Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.809 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.809 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.809 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.809 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.809 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.809 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.809 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.809 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.809 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.809 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.809 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.809 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.809 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.809 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.809 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.809 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.809 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.809 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.809 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.809 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.809 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.809 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.809 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.809 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.809 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.809 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.809 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.809 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.809 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.809 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.809 12 ERROR oslo_messaging.notify.messaging
Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.810 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes in the context of pollsters
Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.810 12 DEBUG ceilometer.compute.pollsters [-] 39ff1bd9-6f6b-44c8-bbec-a1fd9d196359/network.incoming.bytes volume: 8783 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.812 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '5bb286e5-0a5b-4ecf-a599-381804b003f8', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 8783, 'user_id': '1ba5fce347b64bfebf995f187193f205', 'user_name': None, 'project_id': 'c785bf23f53946bc99867d8832a50266', 'project_name': None, 'resource_id': 'instance-00000002-39ff1bd9-6f6b-44c8-bbec-a1fd9d196359-tap03ef8889-32', 'timestamp': '2025-12-15T09:33:48.810579', 'resource_metadata': {'display_name': 'test', 'name': 'tap03ef8889-32', 'instance_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359', 'instance_type': 'm1.small', 'host': 'bf4d507aa00b3fcf643a7e0b883420632ac7fc96e8c7a756982e121d', 'instance_host': 'np0005559462.localdomain', 'flavor': {'id': '2da0e147-aaa7-4bb9-a176-5fe1b15a32a0', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e'}, 'image_ref': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:74:df:7c', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap03ef8889-32'}, 'message_id': '28e99e8a-d999-11f0-817e-fa163ebaca0f', 'monotonic_time': 10294.93143381, 'message_signature': '48f317fba20eb0395a50c9092526a79a96266d8f92d83fce1b525b354fe55860'}]}, 'timestamp': '2025-12-15 09:33:48.811083', '_unique_id': 'ac6031e6cde047f69adbbc70ef2e162d'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.812 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.812 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.812 12 ERROR oslo_messaging.notify.messaging yield
Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.812 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.812 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.812 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.812 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.812 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.812 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.812 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.812 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.812 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.812 12 ERROR oslo_messaging.notify.messaging conn.connect()
Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.812 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.812 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.812 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.812 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.812 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.812 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]:
2025-12-15 09:33:48.812 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.812 12 ERROR oslo_messaging.notify.messaging Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.812 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.812 12 ERROR oslo_messaging.notify.messaging Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.812 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.812 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.812 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.812 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.812 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.812 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.812 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Dec 15 04:33:48 localhost 
ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.812 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.812 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.812 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.812 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.812 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.812 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.812 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.812 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.812 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.812 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Dec 15 
04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.812 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.812 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.812 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.812 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.812 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.812 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.812 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.812 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.812 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.812 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.812 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.812 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.812 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.812 12 ERROR oslo_messaging.notify.messaging Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.813 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.usage in the context of pollsters Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.813 12 DEBUG ceilometer.compute.pollsters [-] 39ff1bd9-6f6b-44c8-bbec-a1fd9d196359/disk.device.usage volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.813 12 DEBUG ceilometer.compute.pollsters [-] 39ff1bd9-6f6b-44c8-bbec-a1fd9d196359/disk.device.usage volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.815 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': '9a076f01-59d8-42d7-bc3c-c148aecef969', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '1ba5fce347b64bfebf995f187193f205', 'user_name': None, 'project_id': 'c785bf23f53946bc99867d8832a50266', 'project_name': None, 'resource_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359-vda', 'timestamp': '2025-12-15T09:33:48.813265', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359', 'instance_type': 'm1.small', 'host': 'bf4d507aa00b3fcf643a7e0b883420632ac7fc96e8c7a756982e121d', 'instance_host': 'np0005559462.localdomain', 'flavor': {'id': '2da0e147-aaa7-4bb9-a176-5fe1b15a32a0', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e'}, 'image_ref': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '28ea07bc-d999-11f0-817e-fa163ebaca0f', 'monotonic_time': 10294.948106968, 'message_signature': '030132ba8e0d9c5f4e3443527512f04245af50e428eb8e7c87cb09a5ef0cfc6d'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '1ba5fce347b64bfebf995f187193f205', 'user_name': None, 'project_id': 'c785bf23f53946bc99867d8832a50266', 'project_name': None, 'resource_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359-vdb', 'timestamp': '2025-12-15T09:33:48.813265', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359', 
'instance_type': 'm1.small', 'host': 'bf4d507aa00b3fcf643a7e0b883420632ac7fc96e8c7a756982e121d', 'instance_host': 'np0005559462.localdomain', 'flavor': {'id': '2da0e147-aaa7-4bb9-a176-5fe1b15a32a0', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e'}, 'image_ref': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '28ea1856-d999-11f0-817e-fa163ebaca0f', 'monotonic_time': 10294.948106968, 'message_signature': '602750100281316d24e82335b4c85a2765dfa25347d5404e8a0f315e5d7212cb'}]}, 'timestamp': '2025-12-15 09:33:48.814185', '_unique_id': '749ec202a73c43bcae92c5a1db48df3f'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.815 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.815 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.815 12 ERROR oslo_messaging.notify.messaging yield Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.815 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.815 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.815 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.815 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.815 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.815 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.815 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.815 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.815 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.815 12 ERROR oslo_messaging.notify.messaging conn.connect() Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.815 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.815 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.815 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Dec 15 04:33:48 localhost 
ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.815 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.815 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.815 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.815 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.815 12 ERROR oslo_messaging.notify.messaging Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.815 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.815 12 ERROR oslo_messaging.notify.messaging Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.815 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.815 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.815 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.815 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 
09:33:48.815 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.815 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.815 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.815 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.815 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.815 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.815 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.815 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.815 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.815 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Dec 15 04:33:48 localhost 
ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.815 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.815 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.815 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.815 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.815 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.815 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.815 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.815 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.815 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.815 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 
09:33:48.815 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.815 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.815 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.815 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.815 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.815 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.815 12 ERROR oslo_messaging.notify.messaging Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.816 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.latency in the context of pollsters Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.816 12 DEBUG ceilometer.compute.pollsters [-] 39ff1bd9-6f6b-44c8-bbec-a1fd9d196359/disk.device.read.latency volume: 937264501 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.816 12 DEBUG ceilometer.compute.pollsters [-] 39ff1bd9-6f6b-44c8-bbec-a1fd9d196359/disk.device.read.latency volume: 204572919 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 
2025-12-15 09:33:48.818 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '8df464ad-549c-4492-a4a3-4f299433123e', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 937264501, 'user_id': '1ba5fce347b64bfebf995f187193f205', 'user_name': None, 'project_id': 'c785bf23f53946bc99867d8832a50266', 'project_name': None, 'resource_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359-vda', 'timestamp': '2025-12-15T09:33:48.816392', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359', 'instance_type': 'm1.small', 'host': 'bf4d507aa00b3fcf643a7e0b883420632ac7fc96e8c7a756982e121d', 'instance_host': 'np0005559462.localdomain', 'flavor': {'id': '2da0e147-aaa7-4bb9-a176-5fe1b15a32a0', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e'}, 'image_ref': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '28ea8228-d999-11f0-817e-fa163ebaca0f', 'monotonic_time': 10294.874494309, 'message_signature': '0438bf4d009e34c4e3321371407f088cb1632531907b954547fb84697044bcb9'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 204572919, 'user_id': '1ba5fce347b64bfebf995f187193f205', 'user_name': None, 'project_id': 'c785bf23f53946bc99867d8832a50266', 'project_name': None, 'resource_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359-vdb', 'timestamp': '2025-12-15T09:33:48.816392', 
'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359', 'instance_type': 'm1.small', 'host': 'bf4d507aa00b3fcf643a7e0b883420632ac7fc96e8c7a756982e121d', 'instance_host': 'np0005559462.localdomain', 'flavor': {'id': '2da0e147-aaa7-4bb9-a176-5fe1b15a32a0', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e'}, 'image_ref': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '28ea938a-d999-11f0-817e-fa163ebaca0f', 'monotonic_time': 10294.874494309, 'message_signature': '1e573f1bf27e66265519771ea3a3c2c8a833caee0265f213a64325518b2777cf'}]}, 'timestamp': '2025-12-15 09:33:48.817298', '_unique_id': 'a77e8b99867346198bd980fb1259b51e'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.818 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.818 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.818 12 ERROR oslo_messaging.notify.messaging yield Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.818 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.818 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 15 04:33:48 localhost 
ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.818 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.818 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.818 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.818 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.818 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.818 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.818 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.818 12 ERROR oslo_messaging.notify.messaging conn.connect() Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.818 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.818 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.818 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.818 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.818 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.818 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.818 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.818 12 ERROR oslo_messaging.notify.messaging Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.818 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.818 12 ERROR oslo_messaging.notify.messaging Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.818 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.818 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.818 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.818 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 
134, in _send_notification Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.818 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.818 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.818 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.818 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.818 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.818 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.818 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.818 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.818 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.818 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.818 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.818 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.818 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.818 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.818 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.818 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.818 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.818 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.818 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.818 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", 
line 433, in _ensure_connection Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.818 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.818 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.818 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.818 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.818 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.818 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.818 12 ERROR oslo_messaging.notify.messaging Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.819 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.drop in the context of pollsters Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.819 12 DEBUG ceilometer.compute.pollsters [-] 39ff1bd9-6f6b-44c8-bbec-a1fd9d196359/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.820 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': '956ab039-1ee8-41bb-8c2e-e6210dedcc45', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '1ba5fce347b64bfebf995f187193f205', 'user_name': None, 'project_id': 'c785bf23f53946bc99867d8832a50266', 'project_name': None, 'resource_id': 'instance-00000002-39ff1bd9-6f6b-44c8-bbec-a1fd9d196359-tap03ef8889-32', 'timestamp': '2025-12-15T09:33:48.819547', 'resource_metadata': {'display_name': 'test', 'name': 'tap03ef8889-32', 'instance_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359', 'instance_type': 'm1.small', 'host': 'bf4d507aa00b3fcf643a7e0b883420632ac7fc96e8c7a756982e121d', 'instance_host': 'np0005559462.localdomain', 'flavor': {'id': '2da0e147-aaa7-4bb9-a176-5fe1b15a32a0', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e'}, 'image_ref': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:74:df:7c', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap03ef8889-32'}, 'message_id': '28eafce4-d999-11f0-817e-fa163ebaca0f', 'monotonic_time': 10294.93143381, 'message_signature': '8ab73058877f108cd74139d97cbaf69cc6066d7419a2082fbd8893d169ab0009'}]}, 'timestamp': '2025-12-15 09:33:48.820053', '_unique_id': 'f92d50e211f5468f8b551dcfcdf38486'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.820 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 15 04:33:48 localhost 
ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.820 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.820 12 ERROR oslo_messaging.notify.messaging yield Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.820 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.820 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.820 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.820 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.820 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.820 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.820 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.820 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.820 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.820 12 ERROR oslo_messaging.notify.messaging conn.connect() Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.820 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.820 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.820 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.820 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.820 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.820 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.820 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.820 12 ERROR oslo_messaging.notify.messaging Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.820 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.820 12 ERROR oslo_messaging.notify.messaging Dec 15 04:33:48 localhost 
ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.820 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.820 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.820 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.820 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.820 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.820 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.820 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.820 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.820 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.820 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection 
Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.820 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.820 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.820 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.820 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.820 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.820 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.820 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.820 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.820 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.820 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Dec 15 04:33:48 
localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.820 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.820 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.820 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.820 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.820 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.820 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.820 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.820 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.820 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.820 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.820 12 ERROR oslo_messaging.notify.messaging Dec 15 04:33:48 localhost 
ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.822 12 INFO ceilometer.polling.manager [-] Polling pollster memory.usage in the context of pollsters Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.822 12 DEBUG ceilometer.compute.pollsters [-] 39ff1bd9-6f6b-44c8-bbec-a1fd9d196359/memory.usage volume: 52.3125 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.823 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'e666b2eb-6146-4277-bda4-ca9931b9911f', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'memory.usage', 'counter_type': 'gauge', 'counter_unit': 'MB', 'counter_volume': 52.3125, 'user_id': '1ba5fce347b64bfebf995f187193f205', 'user_name': None, 'project_id': 'c785bf23f53946bc99867d8832a50266', 'project_name': None, 'resource_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359', 'timestamp': '2025-12-15T09:33:48.822252', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359', 'instance_type': 'm1.small', 'host': 'bf4d507aa00b3fcf643a7e0b883420632ac7fc96e8c7a756982e121d', 'instance_host': 'np0005559462.localdomain', 'flavor': {'id': '2da0e147-aaa7-4bb9-a176-5fe1b15a32a0', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e'}, 'image_ref': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0}, 'message_id': '28eb66a2-d999-11f0-817e-fa163ebaca0f', 'monotonic_time': 10294.989199705, 'message_signature': 
'9cd1ac4b6ef0b7e7648016039fe99175e4a9404ff1c3182622d5cae5168ee57b'}]}, 'timestamp': '2025-12-15 09:33:48.822716', '_unique_id': '476011c8618f4f79a00aa9e07a3d24fd'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.823 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.823 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.823 12 ERROR oslo_messaging.notify.messaging yield Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.823 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.823 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.823 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.823 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.823 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.823 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.823 12 ERROR 
oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.823 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.823 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.823 12 ERROR oslo_messaging.notify.messaging conn.connect() Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.823 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.823 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.823 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.823 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.823 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.823 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.823 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Dec 15 04:33:48 localhost 
ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.823 12 ERROR oslo_messaging.notify.messaging Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.823 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.823 12 ERROR oslo_messaging.notify.messaging Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.823 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.823 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.823 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.823 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.823 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.823 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.823 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.823 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", 
line 653, in _send Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.823 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.823 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.823 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.823 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.823 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.823 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.823 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.823 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.823 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.823 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.823 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.823 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.823 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.823 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.823 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.823 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.823 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.823 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.823 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.823 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Dec 15 04:33:48 localhost 
ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.823 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.823 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.823 12 ERROR oslo_messaging.notify.messaging
Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.824 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.allocation in the context of pollsters
Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.824 12 DEBUG ceilometer.compute.pollsters [-] 39ff1bd9-6f6b-44c8-bbec-a1fd9d196359/disk.device.allocation volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.825 12 DEBUG ceilometer.compute.pollsters [-] 39ff1bd9-6f6b-44c8-bbec-a1fd9d196359/disk.device.allocation volume: 512 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.826 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'c0a07975-b5c0-4ac2-8ea8-c25c9dc9b585', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '1ba5fce347b64bfebf995f187193f205', 'user_name': None, 'project_id': 'c785bf23f53946bc99867d8832a50266', 'project_name': None, 'resource_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359-vda', 'timestamp': '2025-12-15T09:33:48.824888', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359', 'instance_type': 'm1.small', 'host': 'bf4d507aa00b3fcf643a7e0b883420632ac7fc96e8c7a756982e121d', 'instance_host': 'np0005559462.localdomain', 'flavor': {'id': '2da0e147-aaa7-4bb9-a176-5fe1b15a32a0', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e'}, 'image_ref': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '28ebce6c-d999-11f0-817e-fa163ebaca0f', 'monotonic_time': 10294.948106968, 'message_signature': 'c897c53decd12950802fd8877f30019b9e4d92536632d91eadf7f848966643d4'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 512, 'user_id': '1ba5fce347b64bfebf995f187193f205', 'user_name': None, 'project_id': 'c785bf23f53946bc99867d8832a50266', 'project_name': None, 'resource_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359-vdb', 'timestamp': '2025-12-15T09:33:48.824888', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359', 'instance_type': 'm1.small', 'host': 'bf4d507aa00b3fcf643a7e0b883420632ac7fc96e8c7a756982e121d', 'instance_host': 'np0005559462.localdomain', 'flavor': {'id': '2da0e147-aaa7-4bb9-a176-5fe1b15a32a0', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e'}, 'image_ref': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '28ebe262-d999-11f0-817e-fa163ebaca0f', 'monotonic_time': 10294.948106968, 'message_signature': '6850148b0cefd3ab1640167d47b7dfbc20192395dca000241929034bf25a26aa'}]}, 'timestamp': '2025-12-15 09:33:48.825927', '_unique_id': '4b3b67e7bba942db84fe439a7931ed7a'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.826 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.826 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.826 12 ERROR oslo_messaging.notify.messaging yield
Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.826 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.826 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.826 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.826 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.826 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.826 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.826 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.826 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.826 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.826 12 ERROR oslo_messaging.notify.messaging conn.connect()
Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.826 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.826 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.826 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.826 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.826 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.826 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.826 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.826 12 ERROR oslo_messaging.notify.messaging
Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.826 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.826 12 ERROR oslo_messaging.notify.messaging
Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.826 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.826 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.826 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.826 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.826 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.826 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.826 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.826 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.826 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.826 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.826 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.826 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.826 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.826 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.826 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.826 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.826 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.826 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.826 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.826 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.826 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.826 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.826 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.826 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.826 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.826 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.826 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.826 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.826 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.826 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.826 12 ERROR oslo_messaging.notify.messaging
Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.828 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.latency in the context of pollsters
Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.828 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for PerDeviceDiskLatencyPollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.828 12 ERROR ceilometer.polling.manager [-] Prevent pollster disk.device.latency from polling [] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: []
Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.829 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.iops in the context of pollsters
Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.829 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for PerDeviceDiskIOPSPollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.829 12 ERROR ceilometer.polling.manager [-] Prevent pollster disk.device.iops from polling [] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: []
Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.829 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.rate in the context of pollsters
Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.829 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for IncomingBytesRatePollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.830 12 ERROR ceilometer.polling.manager [-] Prevent pollster network.incoming.bytes.rate from polling [] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: []
Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.830 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.error in the context of pollsters
Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.830 12 DEBUG ceilometer.compute.pollsters [-] 39ff1bd9-6f6b-44c8-bbec-a1fd9d196359/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.831 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'fef4b988-5884-4964-95d4-4b84d12a4824', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '1ba5fce347b64bfebf995f187193f205', 'user_name': None, 'project_id': 'c785bf23f53946bc99867d8832a50266', 'project_name': None, 'resource_id': 'instance-00000002-39ff1bd9-6f6b-44c8-bbec-a1fd9d196359-tap03ef8889-32', 'timestamp': '2025-12-15T09:33:48.830415', 'resource_metadata': {'display_name': 'test', 'name': 'tap03ef8889-32', 'instance_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359', 'instance_type': 'm1.small', 'host': 'bf4d507aa00b3fcf643a7e0b883420632ac7fc96e8c7a756982e121d', 'instance_host': 'np0005559462.localdomain', 'flavor': {'id': '2da0e147-aaa7-4bb9-a176-5fe1b15a32a0', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e'}, 'image_ref': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:74:df:7c', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap03ef8889-32'}, 'message_id': '28eca526-d999-11f0-817e-fa163ebaca0f', 'monotonic_time': 10294.93143381, 'message_signature': '040656f3aaa5814f8584f4110cde84edc63d5af098cb7737b46b175f52f027ce'}]}, 'timestamp': '2025-12-15 09:33:48.830886', '_unique_id': '18c32f2839844be48ddfd0d94bda9136'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.831 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.831 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.831 12 ERROR oslo_messaging.notify.messaging yield
Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.831 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.831 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.831 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.831 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.831 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.831 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.831 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.831 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.831 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.831 12 ERROR oslo_messaging.notify.messaging conn.connect()
Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.831 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.831 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.831 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.831 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.831 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.831 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.831 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.831 12 ERROR oslo_messaging.notify.messaging
Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.831 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.831 12 ERROR oslo_messaging.notify.messaging
Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.831 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.831 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.831 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.831 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.831 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.831 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.831 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.831 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.831 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.831 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.831 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.831 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.831 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.831 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.831 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.831 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.831 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.831 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.831 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.831 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.831 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.831 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.831 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.831 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.831 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.831 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.831 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.831 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.831 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.831 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.831 12 ERROR oslo_messaging.notify.messaging
Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.832 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets in the context of pollsters
Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.833 12 DEBUG ceilometer.compute.pollsters [-] 39ff1bd9-6f6b-44c8-bbec-a1fd9d196359/network.incoming.packets volume: 82 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.834 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '1464bcc7-3971-4d2a-afdc-68eb7fba0923', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 82, 'user_id': '1ba5fce347b64bfebf995f187193f205', 'user_name': None, 'project_id': 'c785bf23f53946bc99867d8832a50266', 'project_name': None, 'resource_id': 'instance-00000002-39ff1bd9-6f6b-44c8-bbec-a1fd9d196359-tap03ef8889-32', 'timestamp': '2025-12-15T09:33:48.833115', 'resource_metadata': {'display_name': 'test', 'name': 'tap03ef8889-32', 'instance_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359', 'instance_type': 'm1.small', 'host': 'bf4d507aa00b3fcf643a7e0b883420632ac7fc96e8c7a756982e121d', 'instance_host': 'np0005559462.localdomain', 'flavor': {'id': '2da0e147-aaa7-4bb9-a176-5fe1b15a32a0', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e'}, 'image_ref': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:74:df:7c', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap03ef8889-32'}, 'message_id': '28ed0ec6-d999-11f0-817e-fa163ebaca0f', 'monotonic_time': 10294.93143381, 'message_signature': 'fa378ac455dec46748b2de89d04a949a97369397e5e9da48b2efc17b47fd0ee4'}]}, 'timestamp': '2025-12-15 09:33:48.833610', '_unique_id': 'b9f6f634011f4a25a4ae2e3acc2c746a'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.834 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.834 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.834 12 ERROR oslo_messaging.notify.messaging yield
Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.834 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.834 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.834 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.834 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.834 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.834 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.834 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.834 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.834 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.834 12 ERROR oslo_messaging.notify.messaging conn.connect()
Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.834 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.834 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.834 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.834 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.834 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.834 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.834 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.834 12 ERROR oslo_messaging.notify.messaging
Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.834 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.834 12 ERROR oslo_messaging.notify.messaging
Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.834 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.834 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.834 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.834 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.834 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.834 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.834 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.834 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.834 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.834 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.834 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.834 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.834 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.834 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.834 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.834 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.834 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.834 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.834 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.834 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.834 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.834 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.834 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.834 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.834 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.834 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.834 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.834 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.834 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.834 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.834 12 ERROR oslo_messaging.notify.messaging Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.835 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.error in the context of pollsters Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.835 12 DEBUG ceilometer.compute.pollsters [-] 39ff1bd9-6f6b-44c8-bbec-a1fd9d196359/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.837 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': 'b6ac24e0-5e75-4e3d-bf0e-7c9cad9cb915', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '1ba5fce347b64bfebf995f187193f205', 'user_name': None, 'project_id': 'c785bf23f53946bc99867d8832a50266', 'project_name': None, 'resource_id': 'instance-00000002-39ff1bd9-6f6b-44c8-bbec-a1fd9d196359-tap03ef8889-32', 'timestamp': '2025-12-15T09:33:48.835745', 'resource_metadata': {'display_name': 'test', 'name': 'tap03ef8889-32', 'instance_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359', 'instance_type': 'm1.small', 'host': 'bf4d507aa00b3fcf643a7e0b883420632ac7fc96e8c7a756982e121d', 'instance_host': 'np0005559462.localdomain', 'flavor': {'id': '2da0e147-aaa7-4bb9-a176-5fe1b15a32a0', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e'}, 'image_ref': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:74:df:7c', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap03ef8889-32'}, 'message_id': '28ed7546-d999-11f0-817e-fa163ebaca0f', 'monotonic_time': 10294.93143381, 'message_signature': '061e228f93b73d4ce1e91d0b3222ccccd2e41836e1a3c42e47402f631f810709'}]}, 'timestamp': '2025-12-15 09:33:48.836236', '_unique_id': '6a22b1f28a2342569ed31e59de065f6d'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.837 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 15 04:33:48 localhost 
ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.837 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.837 12 ERROR oslo_messaging.notify.messaging yield Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.837 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.837 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.837 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.837 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.837 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.837 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.837 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.837 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.837 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.837 12 ERROR oslo_messaging.notify.messaging conn.connect() Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.837 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.837 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.837 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.837 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.837 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.837 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.837 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.837 12 ERROR oslo_messaging.notify.messaging Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.837 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.837 12 ERROR oslo_messaging.notify.messaging Dec 15 04:33:48 localhost 
ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.837 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.837 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.837 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.837 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.837 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.837 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.837 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.837 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.837 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.837 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection 
Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.837 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.837 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.837 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.837 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.837 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.837 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.837 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.837 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.837 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.837 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Dec 15 04:33:48 
localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.837 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.837 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.837 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.837 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.837 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.837 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.837 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.837 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.837 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.837 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.837 12 ERROR oslo_messaging.notify.messaging Dec 15 04:33:48 localhost 
ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.838 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.latency in the context of pollsters Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.838 12 DEBUG ceilometer.compute.pollsters [-] 39ff1bd9-6f6b-44c8-bbec-a1fd9d196359/disk.device.write.latency volume: 213002426 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.838 12 DEBUG ceilometer.compute.pollsters [-] 39ff1bd9-6f6b-44c8-bbec-a1fd9d196359/disk.device.write.latency volume: 24733520 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.839 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '1022607c-791c-4a72-bb5a-2cdf69026bea', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 213002426, 'user_id': '1ba5fce347b64bfebf995f187193f205', 'user_name': None, 'project_id': 'c785bf23f53946bc99867d8832a50266', 'project_name': None, 'resource_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359-vda', 'timestamp': '2025-12-15T09:33:48.838411', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359', 'instance_type': 'm1.small', 'host': 'bf4d507aa00b3fcf643a7e0b883420632ac7fc96e8c7a756982e121d', 'instance_host': 'np0005559462.localdomain', 'flavor': {'id': '2da0e147-aaa7-4bb9-a176-5fe1b15a32a0', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': 
{'id': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e'}, 'image_ref': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '28eddc84-d999-11f0-817e-fa163ebaca0f', 'monotonic_time': 10294.874494309, 'message_signature': '72ccb36c13a6e030866e52d7f84e24c1598e79fae593f8a3716a3b546fd591dd'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 24733520, 'user_id': '1ba5fce347b64bfebf995f187193f205', 'user_name': None, 'project_id': 'c785bf23f53946bc99867d8832a50266', 'project_name': None, 'resource_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359-vdb', 'timestamp': '2025-12-15T09:33:48.838411', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359', 'instance_type': 'm1.small', 'host': 'bf4d507aa00b3fcf643a7e0b883420632ac7fc96e8c7a756982e121d', 'instance_host': 'np0005559462.localdomain', 'flavor': {'id': '2da0e147-aaa7-4bb9-a176-5fe1b15a32a0', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e'}, 'image_ref': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '28ede738-d999-11f0-817e-fa163ebaca0f', 'monotonic_time': 10294.874494309, 'message_signature': 'b6ecb99db9034ff2e2adad89c58d956f0be10560a80df1ac13810fc5007a072d'}]}, 'timestamp': '2025-12-15 09:33:48.839049', '_unique_id': 'ed4a68e95f6747a285b3bef1a6e9621f'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 
09:33:48.839 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.839 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.839 12 ERROR oslo_messaging.notify.messaging yield Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.839 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.839 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.839 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.839 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.839 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.839 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.839 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.839 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Dec 15 04:33:48 localhost 
ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.839 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.839 12 ERROR oslo_messaging.notify.messaging conn.connect() Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.839 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.839 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.839 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.839 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.839 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.839 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.839 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.839 12 ERROR oslo_messaging.notify.messaging Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.839 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 
2025-12-15 09:33:48.839 12 ERROR oslo_messaging.notify.messaging Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.839 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.839 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.839 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.839 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.839 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.839 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.839 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.839 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.839 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.839 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.839 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.839 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.839 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.839 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.839 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.839 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.839 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.839 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.839 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.839 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.839 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.839 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.839 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.839 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.839 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.839 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.839 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.839 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.839 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.839 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.839 12 ERROR oslo_messaging.notify.messaging
Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.840 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.rate in the context of pollsters
Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.840 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for OutgoingBytesRatePollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.840 12 ERROR ceilometer.polling.manager [-] Prevent pollster network.outgoing.bytes.rate from polling [] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: []
Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.840 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.delta in the context of pollsters
Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.840 12 DEBUG ceilometer.compute.pollsters [-] 39ff1bd9-6f6b-44c8-bbec-a1fd9d196359/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.841 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '6f3a438d-b186-4871-9ccc-953b1dfe9a17', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '1ba5fce347b64bfebf995f187193f205', 'user_name': None, 'project_id': 'c785bf23f53946bc99867d8832a50266', 'project_name': None, 'resource_id': 'instance-00000002-39ff1bd9-6f6b-44c8-bbec-a1fd9d196359-tap03ef8889-32', 'timestamp': '2025-12-15T09:33:48.840876', 'resource_metadata': {'display_name': 'test', 'name': 'tap03ef8889-32', 'instance_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359', 'instance_type': 'm1.small', 'host': 'bf4d507aa00b3fcf643a7e0b883420632ac7fc96e8c7a756982e121d', 'instance_host': 'np0005559462.localdomain', 'flavor': {'id': '2da0e147-aaa7-4bb9-a176-5fe1b15a32a0', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e'}, 'image_ref': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:74:df:7c', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap03ef8889-32'}, 'message_id': '28ee3c74-d999-11f0-817e-fa163ebaca0f', 'monotonic_time': 10294.93143381, 'message_signature': 'b5621b5baa36a2c1053cc877f301c7b6ffa78441afb06066300c964ec2810f2f'}]}, 'timestamp': '2025-12-15 09:33:48.841243', '_unique_id': '8659d40054474d7c8fad8c6fb070860e'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.841 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.841 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.841 12 ERROR oslo_messaging.notify.messaging yield
Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.841 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.841 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.841 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.841 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.841 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.841 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.841 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.841 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.841 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.841 12 ERROR oslo_messaging.notify.messaging conn.connect()
Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.841 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.841 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.841 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.841 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.841 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.841 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.841 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.841 12 ERROR oslo_messaging.notify.messaging
Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.841 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.841 12 ERROR oslo_messaging.notify.messaging
Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.841 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.841 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.841 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.841 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.841 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.841 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.841 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.841 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.841 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.841 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.841 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.841 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.841 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.841 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.841 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.841 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.841 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.841 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.841 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.841 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.841 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.841 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.841 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.841 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.841 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.841 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.841 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.841 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.841 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.841 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 15 04:33:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:33:48.841 12 ERROR oslo_messaging.notify.messaging
Dec 15 04:33:49 localhost python3.9[238734]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/deployed_services.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 15 04:33:49 localhost python3.9[238824]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/deployed_services.yaml mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1765791228.7847106-1496-143186988571258/.source.yaml _original_basename=.k_w17hyo follow=False checksum=4b1aa2476f3a970cb16bc1060fb4cfb24f803458 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 15 04:33:50 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:fc:1f:13 MACDST=fa:16:3e:b3:bb:e3 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=1391 DF PROTO=TCP SPT=35508 DPT=9105 SEQ=3873680184 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A839E2F260000000001030307)
Dec 15 04:33:50 localhost python3.9[238934]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/healthchecks/node_exporter/healthcheck follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 15 04:33:50 localhost systemd[1]: Started /usr/bin/podman healthcheck run ac2eae30a2a7f971398af42ea67b3ed799d4b3341f7688ebcd73f7ca7f66c2a8.
Dec 15 04:33:51 localhost podman[239023]: 2025-12-15 09:33:51.0823102 +0000 UTC m=+0.084940661 container health_status ac2eae30a2a7f971398af42ea67b3ed799d4b3341f7688ebcd73f7ca7f66c2a8 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '07f3af15d79276418f54a8dde2fc251064527c3571430e14af77aaa2c01f1193'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_managed=true, container_name=ovn_controller, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ovn_controller, org.label-schema.build-date=20251202)
Dec 15 04:33:51 localhost nova_compute[231752]: 2025-12-15 09:33:51.091 231756 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 15 04:33:51 localhost podman[239023]: 2025-12-15 09:33:51.122258656 +0000 UTC m=+0.124889047 container exec_died ac2eae30a2a7f971398af42ea67b3ed799d4b3341f7688ebcd73f7ca7f66c2a8 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, container_name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '07f3af15d79276418f54a8dde2fc251064527c3571430e14af77aaa2c01f1193'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_id=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Dec 15 04:33:51 localhost systemd[1]: ac2eae30a2a7f971398af42ea67b3ed799d4b3341f7688ebcd73f7ca7f66c2a8.service: Deactivated successfully.
Dec 15 04:33:51 localhost python3.9[239022]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/healthchecks/node_exporter/ group=zuul mode=0700 owner=zuul setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1765791230.1864455-1541-178942079218330/.source _original_basename=healthcheck follow=False checksum=e380c11c36804bfc65a818f2960cfa663daacfe5 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Dec 15 04:33:51 localhost ovn_metadata_agent[160585]: 2025-12-15 09:33:51.436 160590 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 15 04:33:51 localhost ovn_metadata_agent[160585]: 2025-12-15 09:33:51.437 160590 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 15 04:33:51 localhost ovn_metadata_agent[160585]: 2025-12-15 09:33:51.439 160590 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 15 04:33:52 localhost nova_compute[231752]: 2025-12-15 09:33:52.057 231756 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 15 04:33:52 localhost python3.9[239158]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/edpm-config recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 15 04:33:53 localhost python3.9[239268]: ansible-ansible.builtin.file Invoked with path=/var/lib/kolla/config_files recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Dec 15 04:33:53 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:fc:1f:13 MACDST=fa:16:3e:b3:bb:e3 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=39162 DF PROTO=TCP SPT=51796 DPT=9100 SEQ=1897002777 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A839E3A220000000001030307)
Dec 15 04:33:53 localhost python3.9[239378]: ansible-ansible.legacy.stat Invoked with path=/var/lib/kolla/config_files/ceilometer_agent_compute.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 15 04:33:54 localhost python3.9[239435]: ansible-ansible.legacy.file Invoked with mode=0600 dest=/var/lib/kolla/config_files/ceilometer_agent_compute.json _original_basename=.isoemh1c recurse=False state=file path=/var/lib/kolla/config_files/ceilometer_agent_compute.json force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 15 04:33:54 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:fc:1f:13 MACDST=fa:16:3e:b3:bb:e3 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=39163 DF PROTO=TCP SPT=51796 DPT=9100 SEQ=1897002777 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A839E3E250000000001030307)
Dec 15 04:33:55 localhost python3.9[239543]: ansible-ansible.builtin.file Invoked with mode=0755 path=/var/lib/edpm-config/container-startup-config/node_exporter state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 15 04:33:55 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:fc:1f:13 MACDST=fa:16:3e:b3:bb:e3 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=17901 DF PROTO=TCP SPT=39776 DPT=9102 SEQ=1196762911 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A839E44650000000001030307)
Dec 15 04:33:56 localhost nova_compute[231752]: 2025-12-15 09:33:56.090 231756 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 15 04:33:57 localhost nova_compute[231752]: 2025-12-15 09:33:57.059 231756 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 15 04:33:58 localhost python3.9[239847]: ansible-container_config_data Invoked with config_overrides={} config_path=/var/lib/edpm-config/container-startup-config/node_exporter config_pattern=*.json debug=False
Dec 15 04:33:58 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:fc:1f:13 MACDST=fa:16:3e:b3:bb:e3 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=33297 DF PROTO=TCP SPT=49340 DPT=9101 SEQ=4274873244 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A839E50250000000001030307)
Dec 15 04:33:59 localhost python3.9[239957]: ansible-container_config_hash Invoked with check_mode=False config_vol_prefix=/var/lib/openstack
Dec 15 04:34:00 localhost python3.9[240067]: ansible-containers.podman.podman_container_info Invoked with executable=podman name=None
Dec 15 04:34:01 localhost nova_compute[231752]: 2025-12-15 09:34:01.093 231756 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 15 04:34:01 localhost systemd[1]: Started /usr/bin/podman healthcheck run 4d46c406a3855fd1c322e5d86c048188ea5865daa07ba23aaebacc7879668923.
Dec 15 04:34:01 localhost podman[240111]: 2025-12-15 09:34:01.732536066 +0000 UTC m=+0.066573317 container health_status 4d46c406a3855fd1c322e5d86c048188ea5865daa07ba23aaebacc7879668923 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, tcib_managed=true, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '07f3af15d79276418f54a8dde2fc251064527c3571430e14af77aaa2c01f1193-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible)
Dec 15 04:34:01 localhost podman[240111]: 2025-12-15 09:34:01.738180841 +0000 UTC m=+0.072218082 container exec_died 4d46c406a3855fd1c322e5d86c048188ea5865daa07ba23aaebacc7879668923 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=ovn_metadata_agent, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '07f3af15d79276418f54a8dde2fc251064527c3571430e14af77aaa2c01f1193-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.build-date=20251202, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, org.label-schema.schema-version=1.0)
Dec 15 04:34:01 localhost systemd[1]: 4d46c406a3855fd1c322e5d86c048188ea5865daa07ba23aaebacc7879668923.service: Deactivated successfully.
Dec 15 04:34:01 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:fc:1f:13 MACDST=fa:16:3e:b3:bb:e3 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=28139 DF PROTO=TCP SPT=53968 DPT=9882 SEQ=910555791 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A839E5BCA0000000001030307)
Dec 15 04:34:02 localhost nova_compute[231752]: 2025-12-15 09:34:02.068 231756 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 15 04:34:04 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:fc:1f:13 MACDST=fa:16:3e:b3:bb:e3 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=28141 DF PROTO=TCP SPT=53968 DPT=9882 SEQ=910555791 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A839E67E50000000001030307)
Dec 15 04:34:05 localhost python3[240221]: ansible-edpm_container_manage Invoked with concurrency=1 config_dir=/var/lib/edpm-config/container-startup-config/node_exporter config_id=node_exporter config_overrides={} config_patterns=*.json containers=['node_exporter'] log_base_path=/var/log/containers/stdouts debug=False
Dec 15 04:34:05 localhost podman[240259]:
Dec 15 04:34:05 localhost podman[240259]: 2025-12-15 09:34:05.870491081 +0000 UTC m=+0.080400626 container create 67ee72fcd99c896f79e8807764b1d0f04e7d45927d06e306c0986394e8ed42a0 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, config_id=node_exporter, container_name=node_exporter, managed_by=edpm_ansible, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, maintainer=The Prometheus Authors )
Dec 15 04:34:05 localhost podman[240259]: 2025-12-15 09:34:05.827156723 +0000 UTC m=+0.037066308 image pull quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c
Dec 15 04:34:05 localhost python3[240221]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman create --name node_exporter --conmon-pidfile /run/node_exporter.pid --env OS_ENDPOINT_TYPE=internal --healthcheck-command /openstack/healthcheck node_exporter --label config_id=node_exporter --label container_name=node_exporter --label managed_by=edpm_ansible --label config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']} --log-driver journald --log-level info --network host --privileged=True --publish 9100:9100 --user root --volume /var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw --volume /var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c --web.disable-exporter-metrics --collector.systemd --collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service --no-collector.dmi --no-collector.entropy --no-collector.thermal_zone --no-collector.time --no-collector.timex --no-collector.uname --no-collector.stat --no-collector.hwmon --no-collector.os --no-collector.selinux --no-collector.textfile --no-collector.powersupplyclass --no-collector.pressure --no-collector.rapl
Dec 15 04:34:06 localhost nova_compute[231752]: 2025-12-15 09:34:06.095 231756 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 15 04:34:06 localhost python3.9[240407]: ansible-ansible.builtin.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec 15 04:34:07 localhost nova_compute[231752]: 2025-12-15 09:34:07.122 231756 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 15 04:34:07 localhost python3.9[240519]: ansible-file Invoked with path=/etc/systemd/system/edpm_node_exporter.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 15 04:34:07 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:fc:1f:13 MACDST=fa:16:3e:b3:bb:e3 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=13695 DF PROTO=TCP SPT=38830 DPT=9105 SEQ=3315691390 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A839E73660000000001030307)
Dec 15 04:34:08 localhost python3.9[240574]: ansible-stat Invoked with path=/etc/systemd/system/edpm_node_exporter_healthcheck.timer follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec 15 04:34:09 localhost python3.9[240683]: ansible-copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1765791248.3869808-1913-62674467925256/source dest=/etc/systemd/system/edpm_node_exporter.service mode=0644 owner=root group=root backup=False force=True remote_src=False follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 15 04:34:09 localhost python3.9[240738]: ansible-systemd Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Dec 15 04:34:09 localhost systemd[1]: Reloading.
Dec 15 04:34:09 localhost systemd-sysv-generator[240766]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 15 04:34:09 localhost systemd-rc-local-generator[240759]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 15 04:34:09 localhost systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 15 04:34:09 localhost systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload
Dec 15 04:34:09 localhost systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 15 04:34:09 localhost systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 15 04:34:09 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Dec 15 04:34:09 localhost systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload Dec 15 04:34:09 localhost systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload Dec 15 04:34:09 localhost systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload Dec 15 04:34:09 localhost systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload Dec 15 04:34:10 localhost python3.9[240828]: ansible-systemd Invoked with state=restarted name=edpm_node_exporter.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None Dec 15 04:34:10 localhost systemd[1]: Reloading. Dec 15 04:34:10 localhost systemd-rc-local-generator[240857]: /etc/rc.d/rc.local is not marked executable, skipping. Dec 15 04:34:10 localhost systemd-sysv-generator[240861]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. Dec 15 04:34:10 localhost systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload Dec 15 04:34:10 localhost systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload Dec 15 04:34:10 localhost systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload Dec 15 04:34:10 localhost systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload Dec 15 04:34:10 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. 
Support for MemoryLimit= will be removed soon. Dec 15 04:34:10 localhost systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload Dec 15 04:34:10 localhost systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload Dec 15 04:34:10 localhost systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload Dec 15 04:34:10 localhost systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload Dec 15 04:34:10 localhost systemd[1]: Started /usr/bin/podman healthcheck run 9f0f481c937606298203797a71924b7614a5d9e3f4a8afd837cfa5b9602aedeb. Dec 15 04:34:11 localhost systemd[1]: Starting node_exporter container... Dec 15 04:34:11 localhost nova_compute[231752]: 2025-12-15 09:34:11.113 231756 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 04:34:11 localhost systemd[1]: tmp-crun.DzeeN5.mount: Deactivated successfully. 
Dec 15 04:34:11 localhost podman[240868]: 2025-12-15 09:34:11.133817182 +0000 UTC m=+0.115749695 container health_status 9f0f481c937606298203797a71924b7614a5d9e3f4a8afd837cfa5b9602aedeb (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, config_id=multipathd, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, container_name=multipathd) Dec 15 04:34:11 localhost systemd[1]: Started libcrun container. 
Dec 15 04:34:11 localhost podman[240868]: 2025-12-15 09:34:11.217555469 +0000 UTC m=+0.199487972 container exec_died 9f0f481c937606298203797a71924b7614a5d9e3f4a8afd837cfa5b9602aedeb (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, container_name=multipathd, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, managed_by=edpm_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team) Dec 15 04:34:11 localhost systemd[1]: 9f0f481c937606298203797a71924b7614a5d9e3f4a8afd837cfa5b9602aedeb.service: Deactivated successfully. 
Dec 15 04:34:11 localhost systemd[1]: Started /usr/bin/podman healthcheck run 67ee72fcd99c896f79e8807764b1d0f04e7d45927d06e306c0986394e8ed42a0. Dec 15 04:34:11 localhost podman[240870]: 2025-12-15 09:34:11.299764474 +0000 UTC m=+0.280187276 container init 67ee72fcd99c896f79e8807764b1d0f04e7d45927d06e306c0986394e8ed42a0 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter) Dec 15 04:34:11 localhost node_exporter[240903]: ts=2025-12-15T09:34:11.317Z caller=node_exporter.go:180 level=info msg="Starting node_exporter" version="(version=1.5.0, branch=HEAD, revision=1b48970ffcf5630534fb00bb0687d73c66d1c959)" Dec 15 04:34:11 localhost node_exporter[240903]: ts=2025-12-15T09:34:11.317Z caller=node_exporter.go:181 
level=info msg="Build context" build_context="(go=go1.19.3, user=root@6e7732a7b81b, date=20221129-18:59:09)" Dec 15 04:34:11 localhost node_exporter[240903]: ts=2025-12-15T09:34:11.318Z caller=node_exporter.go:183 level=warn msg="Node Exporter is running as root user. This exporter is designed to run as unprivileged user, root is not required." Dec 15 04:34:11 localhost node_exporter[240903]: ts=2025-12-15T09:34:11.318Z caller=diskstats_common.go:111 level=info collector=diskstats msg="Parsed flag --collector.diskstats.device-exclude" flag=^(ram|loop|fd|(h|s|v|xv)d[a-z]|nvme\d+n\d+p)\d+$ Dec 15 04:34:11 localhost node_exporter[240903]: ts=2025-12-15T09:34:11.318Z caller=diskstats_linux.go:264 level=error collector=diskstats msg="Failed to open directory, disabling udev device properties" path=/run/udev/data Dec 15 04:34:11 localhost node_exporter[240903]: ts=2025-12-15T09:34:11.319Z caller=filesystem_common.go:111 level=info collector=filesystem msg="Parsed flag --collector.filesystem.mount-points-exclude" flag=^/(dev|proc|run/credentials/.+|sys|var/lib/docker/.+|var/lib/containers/storage/.+)($|/) Dec 15 04:34:11 localhost node_exporter[240903]: ts=2025-12-15T09:34:11.319Z caller=filesystem_common.go:113 level=info collector=filesystem msg="Parsed flag --collector.filesystem.fs-types-exclude" flag=^(autofs|binfmt_misc|bpf|cgroup2?|configfs|debugfs|devpts|devtmpfs|fusectl|hugetlbfs|iso9660|mqueue|nsfs|overlay|proc|procfs|pstore|rpc_pipefs|securityfs|selinuxfs|squashfs|sysfs|tracefs)$ Dec 15 04:34:11 localhost node_exporter[240903]: ts=2025-12-15T09:34:11.319Z caller=systemd_linux.go:152 level=info collector=systemd msg="Parsed flag --collector.systemd.unit-include" flag=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service Dec 15 04:34:11 localhost node_exporter[240903]: ts=2025-12-15T09:34:11.320Z caller=systemd_linux.go:154 level=info collector=systemd msg="Parsed flag --collector.systemd.unit-exclude" flag=.+\.(automount|device|mount|scope|slice) Dec 15 04:34:11 
localhost node_exporter[240903]: ts=2025-12-15T09:34:11.320Z caller=node_exporter.go:110 level=info msg="Enabled collectors" Dec 15 04:34:11 localhost node_exporter[240903]: ts=2025-12-15T09:34:11.320Z caller=node_exporter.go:117 level=info collector=arp Dec 15 04:34:11 localhost node_exporter[240903]: ts=2025-12-15T09:34:11.320Z caller=node_exporter.go:117 level=info collector=bcache Dec 15 04:34:11 localhost node_exporter[240903]: ts=2025-12-15T09:34:11.320Z caller=node_exporter.go:117 level=info collector=bonding Dec 15 04:34:11 localhost node_exporter[240903]: ts=2025-12-15T09:34:11.320Z caller=node_exporter.go:117 level=info collector=btrfs Dec 15 04:34:11 localhost node_exporter[240903]: ts=2025-12-15T09:34:11.320Z caller=node_exporter.go:117 level=info collector=conntrack Dec 15 04:34:11 localhost node_exporter[240903]: ts=2025-12-15T09:34:11.320Z caller=node_exporter.go:117 level=info collector=cpu Dec 15 04:34:11 localhost node_exporter[240903]: ts=2025-12-15T09:34:11.320Z caller=node_exporter.go:117 level=info collector=cpufreq Dec 15 04:34:11 localhost node_exporter[240903]: ts=2025-12-15T09:34:11.320Z caller=node_exporter.go:117 level=info collector=diskstats Dec 15 04:34:11 localhost node_exporter[240903]: ts=2025-12-15T09:34:11.320Z caller=node_exporter.go:117 level=info collector=edac Dec 15 04:34:11 localhost node_exporter[240903]: ts=2025-12-15T09:34:11.320Z caller=node_exporter.go:117 level=info collector=fibrechannel Dec 15 04:34:11 localhost node_exporter[240903]: ts=2025-12-15T09:34:11.320Z caller=node_exporter.go:117 level=info collector=filefd Dec 15 04:34:11 localhost node_exporter[240903]: ts=2025-12-15T09:34:11.320Z caller=node_exporter.go:117 level=info collector=filesystem Dec 15 04:34:11 localhost node_exporter[240903]: ts=2025-12-15T09:34:11.320Z caller=node_exporter.go:117 level=info collector=infiniband Dec 15 04:34:11 localhost node_exporter[240903]: ts=2025-12-15T09:34:11.320Z caller=node_exporter.go:117 level=info collector=ipvs 
Dec 15 04:34:11 localhost node_exporter[240903]: ts=2025-12-15T09:34:11.320Z caller=node_exporter.go:117 level=info collector=loadavg Dec 15 04:34:11 localhost node_exporter[240903]: ts=2025-12-15T09:34:11.320Z caller=node_exporter.go:117 level=info collector=mdadm Dec 15 04:34:11 localhost node_exporter[240903]: ts=2025-12-15T09:34:11.320Z caller=node_exporter.go:117 level=info collector=meminfo Dec 15 04:34:11 localhost node_exporter[240903]: ts=2025-12-15T09:34:11.320Z caller=node_exporter.go:117 level=info collector=netclass Dec 15 04:34:11 localhost node_exporter[240903]: ts=2025-12-15T09:34:11.320Z caller=node_exporter.go:117 level=info collector=netdev Dec 15 04:34:11 localhost node_exporter[240903]: ts=2025-12-15T09:34:11.320Z caller=node_exporter.go:117 level=info collector=netstat Dec 15 04:34:11 localhost node_exporter[240903]: ts=2025-12-15T09:34:11.320Z caller=node_exporter.go:117 level=info collector=nfs Dec 15 04:34:11 localhost node_exporter[240903]: ts=2025-12-15T09:34:11.320Z caller=node_exporter.go:117 level=info collector=nfsd Dec 15 04:34:11 localhost node_exporter[240903]: ts=2025-12-15T09:34:11.320Z caller=node_exporter.go:117 level=info collector=nvme Dec 15 04:34:11 localhost node_exporter[240903]: ts=2025-12-15T09:34:11.320Z caller=node_exporter.go:117 level=info collector=schedstat Dec 15 04:34:11 localhost node_exporter[240903]: ts=2025-12-15T09:34:11.320Z caller=node_exporter.go:117 level=info collector=sockstat Dec 15 04:34:11 localhost node_exporter[240903]: ts=2025-12-15T09:34:11.320Z caller=node_exporter.go:117 level=info collector=softnet Dec 15 04:34:11 localhost node_exporter[240903]: ts=2025-12-15T09:34:11.320Z caller=node_exporter.go:117 level=info collector=systemd Dec 15 04:34:11 localhost node_exporter[240903]: ts=2025-12-15T09:34:11.320Z caller=node_exporter.go:117 level=info collector=tapestats Dec 15 04:34:11 localhost node_exporter[240903]: ts=2025-12-15T09:34:11.320Z caller=node_exporter.go:117 level=info 
collector=udp_queues Dec 15 04:34:11 localhost node_exporter[240903]: ts=2025-12-15T09:34:11.320Z caller=node_exporter.go:117 level=info collector=vmstat Dec 15 04:34:11 localhost node_exporter[240903]: ts=2025-12-15T09:34:11.320Z caller=node_exporter.go:117 level=info collector=xfs Dec 15 04:34:11 localhost node_exporter[240903]: ts=2025-12-15T09:34:11.320Z caller=node_exporter.go:117 level=info collector=zfs Dec 15 04:34:11 localhost node_exporter[240903]: ts=2025-12-15T09:34:11.322Z caller=tls_config.go:232 level=info msg="Listening on" address=[::]:9100 Dec 15 04:34:11 localhost node_exporter[240903]: ts=2025-12-15T09:34:11.322Z caller=tls_config.go:235 level=info msg="TLS is disabled." http2=false address=[::]:9100 Dec 15 04:34:11 localhost systemd[1]: Started /usr/bin/podman healthcheck run 67ee72fcd99c896f79e8807764b1d0f04e7d45927d06e306c0986394e8ed42a0. Dec 15 04:34:11 localhost podman[240870]: 2025-12-15 09:34:11.340186863 +0000 UTC m=+0.320609665 container start 67ee72fcd99c896f79e8807764b1d0f04e7d45927d06e306c0986394e8ed42a0 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 
'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter) Dec 15 04:34:11 localhost podman[240870]: node_exporter Dec 15 04:34:11 localhost systemd[1]: Started node_exporter container. Dec 15 04:34:11 localhost podman[240912]: 2025-12-15 09:34:11.430681735 +0000 UTC m=+0.086983657 container health_status 67ee72fcd99c896f79e8807764b1d0f04e7d45927d06e306c0986394e8ed42a0 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=starting, maintainer=The Prometheus Authors , managed_by=edpm_ansible, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', 
'/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter) Dec 15 04:34:11 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:fc:1f:13 MACDST=fa:16:3e:b3:bb:e3 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=33299 DF PROTO=TCP SPT=49340 DPT=9101 SEQ=4274873244 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A839E81250000000001030307) Dec 15 04:34:11 localhost podman[240912]: 2025-12-15 09:34:11.470320672 +0000 UTC m=+0.126622614 container exec_died 67ee72fcd99c896f79e8807764b1d0f04e7d45927d06e306c0986394e8ed42a0 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter) Dec 15 04:34:11 localhost 
systemd[1]: 67ee72fcd99c896f79e8807764b1d0f04e7d45927d06e306c0986394e8ed42a0.service: Deactivated successfully. Dec 15 04:34:12 localhost nova_compute[231752]: 2025-12-15 09:34:12.124 231756 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 04:34:12 localhost python3.9[241042]: ansible-ansible.builtin.slurp Invoked with src=/var/lib/edpm-config/deployed_services.yaml Dec 15 04:34:13 localhost python3.9[241152]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/deployed_services.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Dec 15 04:34:14 localhost python3.9[241242]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/deployed_services.yaml mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1765791252.8978574-2036-18425827643566/.source.yaml _original_basename=.rkpp1gko follow=False checksum=26721cbaa1d1b43f3a6d6677685904ff1c90949d backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Dec 15 04:34:14 localhost python3.9[241352]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/healthchecks/podman_exporter/healthcheck follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Dec 15 04:34:15 localhost python3.9[241440]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/healthchecks/podman_exporter/ group=zuul mode=0700 owner=zuul setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1765791254.4719603-2081-14211480600131/.source _original_basename=healthcheck follow=False checksum=e380c11c36804bfc65a818f2960cfa663daacfe5 backup=False force=True remote_src=False 
unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None Dec 15 04:34:16 localhost nova_compute[231752]: 2025-12-15 09:34:16.151 231756 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 04:34:16 localhost python3.9[241550]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/edpm-config recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None Dec 15 04:34:17 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:fc:1f:13 MACDST=fa:16:3e:b3:bb:e3 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=28143 DF PROTO=TCP SPT=53968 DPT=9882 SEQ=910555791 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A839E97250000000001030307) Dec 15 04:34:17 localhost nova_compute[231752]: 2025-12-15 09:34:17.127 231756 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 04:34:17 localhost systemd[1]: Started /usr/bin/podman healthcheck run b3d8483ade539500e228c2af9c0c3e647febb5ec7903610a83aea32631007d4a. 
Dec 15 04:34:17 localhost podman[241661]: 2025-12-15 09:34:17.446105564 +0000 UTC m=+0.089171136 container health_status b3d8483ade539500e228c2af9c0c3e647febb5ec7903610a83aea32631007d4a (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=starting, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '07f3af15d79276418f54a8dde2fc251064527c3571430e14af77aaa2c01f1193-cb1e9478634a00c8fb5ad9cd7fcd38c7b2a1a85187dfaa618bc715c24c2b15fb'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ceilometer_agent_compute, container_name=ceilometer_agent_compute, 
org.label-schema.build-date=20251202) Dec 15 04:34:17 localhost podman[241661]: 2025-12-15 09:34:17.476660862 +0000 UTC m=+0.119726374 container exec_died b3d8483ade539500e228c2af9c0c3e647febb5ec7903610a83aea32631007d4a (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '07f3af15d79276418f54a8dde2fc251064527c3571430e14af77aaa2c01f1193-cb1e9478634a00c8fb5ad9cd7fcd38c7b2a1a85187dfaa618bc715c24c2b15fb'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_managed=true, 
org.label-schema.schema-version=1.0) Dec 15 04:34:17 localhost podman[241661]: unhealthy Dec 15 04:34:17 localhost systemd[1]: b3d8483ade539500e228c2af9c0c3e647febb5ec7903610a83aea32631007d4a.service: Main process exited, code=exited, status=1/FAILURE Dec 15 04:34:17 localhost systemd[1]: b3d8483ade539500e228c2af9c0c3e647febb5ec7903610a83aea32631007d4a.service: Failed with result 'exit-code'. Dec 15 04:34:17 localhost python3.9[241660]: ansible-ansible.builtin.file Invoked with path=/var/lib/kolla/config_files recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None Dec 15 04:34:18 localhost python3.9[241788]: ansible-ansible.legacy.stat Invoked with path=/var/lib/kolla/config_files/ceilometer_agent_compute.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Dec 15 04:34:19 localhost python3.9[241845]: ansible-ansible.legacy.file Invoked with mode=0600 dest=/var/lib/kolla/config_files/ceilometer_agent_compute.json _original_basename=.vyzzymq5 recurse=False state=file path=/var/lib/kolla/config_files/ceilometer_agent_compute.json force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Dec 15 04:34:19 localhost python3.9[241953]: ansible-ansible.builtin.file Invoked with mode=0755 path=/var/lib/edpm-config/container-startup-config/podman_exporter state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False 
_original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Dec 15 04:34:19 localhost sshd[241954]: main: sshd: ssh-rsa algorithm is disabled Dec 15 04:34:20 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:fc:1f:13 MACDST=fa:16:3e:b3:bb:e3 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=13697 DF PROTO=TCP SPT=38830 DPT=9105 SEQ=3315691390 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A839EA3260000000001030307) Dec 15 04:34:21 localhost nova_compute[231752]: 2025-12-15 09:34:21.154 231756 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 04:34:21 localhost systemd[1]: Started /usr/bin/podman healthcheck run ac2eae30a2a7f971398af42ea67b3ed799d4b3341f7688ebcd73f7ca7f66c2a8. Dec 15 04:34:21 localhost podman[242205]: 2025-12-15 09:34:21.766291688 +0000 UTC m=+0.089933029 container health_status ac2eae30a2a7f971398af42ea67b3ed799d4b3341f7688ebcd73f7ca7f66c2a8 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=ovn_controller, org.label-schema.build-date=20251202, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '07f3af15d79276418f54a8dde2fc251064527c3571430e14af77aaa2c01f1193'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', 
'/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, container_name=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image) Dec 15 04:34:21 localhost podman[242205]: 2025-12-15 09:34:21.838820386 +0000 UTC m=+0.162461687 container exec_died ac2eae30a2a7f971398af42ea67b3ed799d4b3341f7688ebcd73f7ca7f66c2a8 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '07f3af15d79276418f54a8dde2fc251064527c3571430e14af77aaa2c01f1193'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ovn_controller, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=ovn_controller, 
org.label-schema.build-date=20251202, org.label-schema.license=GPLv2) Dec 15 04:34:21 localhost systemd[1]: ac2eae30a2a7f971398af42ea67b3ed799d4b3341f7688ebcd73f7ca7f66c2a8.service: Deactivated successfully. Dec 15 04:34:22 localhost python3.9[242285]: ansible-container_config_data Invoked with config_overrides={} config_path=/var/lib/edpm-config/container-startup-config/podman_exporter config_pattern=*.json debug=False Dec 15 04:34:22 localhost nova_compute[231752]: 2025-12-15 09:34:22.159 231756 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 04:34:23 localhost python3.9[242395]: ansible-container_config_hash Invoked with check_mode=False config_vol_prefix=/var/lib/openstack Dec 15 04:34:23 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:fc:1f:13 MACDST=fa:16:3e:b3:bb:e3 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=14135 DF PROTO=TCP SPT=42520 DPT=9100 SEQ=1880190263 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A839EAF520000000001030307) Dec 15 04:34:23 localhost python3.9[242505]: ansible-containers.podman.podman_container_info Invoked with executable=podman name=None Dec 15 04:34:24 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:fc:1f:13 MACDST=fa:16:3e:b3:bb:e3 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=14136 DF PROTO=TCP SPT=42520 DPT=9100 SEQ=1880190263 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A839EB3650000000001030307) Dec 15 04:34:25 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:fc:1f:13 MACDST=fa:16:3e:b3:bb:e3 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=9388 DF PROTO=TCP SPT=56640 DPT=9102 SEQ=939830452 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A839EB9A50000000001030307) Dec 15 04:34:26 localhost nova_compute[231752]: 2025-12-15 09:34:26.158 231756 
DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 04:34:27 localhost nova_compute[231752]: 2025-12-15 09:34:27.192 231756 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 04:34:28 localhost python3[242642]: ansible-edpm_container_manage Invoked with concurrency=1 config_dir=/var/lib/edpm-config/container-startup-config/podman_exporter config_id=podman_exporter config_overrides={} config_patterns=*.json containers=['podman_exporter'] log_base_path=/var/log/containers/stdouts debug=False Dec 15 04:34:28 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:fc:1f:13 MACDST=fa:16:3e:b3:bb:e3 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=57986 DF PROTO=TCP SPT=56536 DPT=9101 SEQ=1609692718 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A839EC5650000000001030307) Dec 15 04:34:30 localhost podman[242656]: 2025-12-15 09:34:28.280796936 +0000 UTC m=+0.044833661 image pull quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd Dec 15 04:34:30 localhost podman[242729]: Dec 15 04:34:30 localhost podman[242729]: 2025-12-15 09:34:30.67849335 +0000 UTC m=+0.058736452 container create a4e94bd7aaee6c174c601060fb5f34559019ccea93fc397263ee3735731cfc1e (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, container_name=podman_exporter, managed_by=edpm_ansible, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 
'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, maintainer=Navid Yaghoobi , config_id=podman_exporter) Dec 15 04:34:30 localhost podman[242729]: 2025-12-15 09:34:30.6496542 +0000 UTC m=+0.029897302 image pull quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd Dec 15 04:34:30 localhost python3[242642]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman create --name podman_exporter --conmon-pidfile /run/podman_exporter.pid --env CONTAINER_HOST=unix:///run/podman/podman.sock --env OS_ENDPOINT_TYPE=internal --healthcheck-command /openstack/healthcheck podman_exporter --label config_id=podman_exporter --label container_name=podman_exporter --label managed_by=edpm_ansible --label config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']} --log-driver journald --log-level info --network host --privileged=True --publish 9882:9882 --user root --volume /run/podman/podman.sock:/run/podman/podman.sock:rw,z --volume /var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z 
quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd Dec 15 04:34:31 localhost nova_compute[231752]: 2025-12-15 09:34:31.196 231756 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 04:34:31 localhost python3.9[242875]: ansible-ansible.builtin.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1 Dec 15 04:34:31 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:fc:1f:13 MACDST=fa:16:3e:b3:bb:e3 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=65055 DF PROTO=TCP SPT=44724 DPT=9882 SEQ=2632830315 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A839ED0F90000000001030307) Dec 15 04:34:32 localhost systemd[1]: Started /usr/bin/podman healthcheck run 4d46c406a3855fd1c322e5d86c048188ea5865daa07ba23aaebacc7879668923. Dec 15 04:34:32 localhost systemd[1]: tmp-crun.BF4I96.mount: Deactivated successfully. 
Dec 15 04:34:32 localhost podman[242988]: 2025-12-15 09:34:32.181608687 +0000 UTC m=+0.104895668 container health_status 4d46c406a3855fd1c322e5d86c048188ea5865daa07ba23aaebacc7879668923 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=ovn_metadata_agent, managed_by=edpm_ansible, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '07f3af15d79276418f54a8dde2fc251064527c3571430e14af77aaa2c01f1193-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, config_id=ovn_metadata_agent, io.buildah.version=1.41.3) Dec 15 04:34:32 localhost 
podman[242988]: 2025-12-15 09:34:32.191398386 +0000 UTC m=+0.114685417 container exec_died 4d46c406a3855fd1c322e5d86c048188ea5865daa07ba23aaebacc7879668923 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.build-date=20251202, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '07f3af15d79276418f54a8dde2fc251064527c3571430e14af77aaa2c01f1193-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=ovn_metadata_agent) Dec 15 04:34:32 localhost nova_compute[231752]: 2025-12-15 09:34:32.194 231756 DEBUG 
ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 04:34:32 localhost systemd[1]: 4d46c406a3855fd1c322e5d86c048188ea5865daa07ba23aaebacc7879668923.service: Deactivated successfully. Dec 15 04:34:32 localhost python3.9[242987]: ansible-file Invoked with path=/etc/systemd/system/edpm_podman_exporter.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Dec 15 04:34:32 localhost python3.9[243061]: ansible-stat Invoked with path=/etc/systemd/system/edpm_podman_exporter_healthcheck.timer follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1 Dec 15 04:34:33 localhost python3.9[243170]: ansible-copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1765791272.708074-2453-156782091376284/source dest=/etc/systemd/system/edpm_podman_exporter.service mode=0644 owner=root group=root backup=False force=True remote_src=False follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None Dec 15 04:34:33 localhost python3.9[243225]: ansible-systemd Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None Dec 15 04:34:33 localhost systemd[1]: Reloading. Dec 15 04:34:33 localhost systemd-rc-local-generator[243248]: /etc/rc.d/rc.local is not marked executable, skipping. Dec 15 04:34:33 localhost systemd-sysv-generator[243253]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. 
Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. Dec 15 04:34:34 localhost systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload Dec 15 04:34:34 localhost systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload Dec 15 04:34:34 localhost systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload Dec 15 04:34:34 localhost systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload Dec 15 04:34:34 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Dec 15 04:34:34 localhost systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload Dec 15 04:34:34 localhost systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload Dec 15 04:34:34 localhost systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload Dec 15 04:34:34 localhost systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload Dec 15 04:34:34 localhost python3.9[243353]: ansible-systemd Invoked with state=restarted name=edpm_podman_exporter.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None Dec 15 04:34:34 localhost systemd[1]: Reloading. 
Dec 15 04:34:34 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:fc:1f:13 MACDST=fa:16:3e:b3:bb:e3 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=65057 DF PROTO=TCP SPT=44724 DPT=9882 SEQ=2632830315 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A839EDCE50000000001030307) Dec 15 04:34:34 localhost systemd-rc-local-generator[243409]: /etc/rc.d/rc.local is not marked executable, skipping. Dec 15 04:34:34 localhost systemd-sysv-generator[243414]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. Dec 15 04:34:34 localhost systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload Dec 15 04:34:34 localhost systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload Dec 15 04:34:34 localhost systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload Dec 15 04:34:34 localhost systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload Dec 15 04:34:35 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. 
Dec 15 04:34:35 localhost systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload Dec 15 04:34:35 localhost systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload Dec 15 04:34:35 localhost systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload Dec 15 04:34:35 localhost systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload Dec 15 04:34:35 localhost systemd[1]: Starting podman_exporter container... Dec 15 04:34:35 localhost systemd[1]: tmp-crun.JahMrh.mount: Deactivated successfully. Dec 15 04:34:35 localhost systemd[1]: Started libcrun container. Dec 15 04:34:35 localhost systemd[1]: Started /usr/bin/podman healthcheck run a4e94bd7aaee6c174c601060fb5f34559019ccea93fc397263ee3735731cfc1e. Dec 15 04:34:35 localhost podman[243424]: 2025-12-15 09:34:35.339115531 +0000 UTC m=+0.155775664 container init a4e94bd7aaee6c174c601060fb5f34559019ccea93fc397263ee3735731cfc1e (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible) Dec 15 
04:34:35 localhost podman_exporter[243438]: ts=2025-12-15T09:34:35.359Z caller=exporter.go:68 level=info msg="Starting podman-prometheus-exporter" version="(version=1.10.1, branch=HEAD, revision=1)" Dec 15 04:34:35 localhost podman_exporter[243438]: ts=2025-12-15T09:34:35.359Z caller=exporter.go:69 level=info msg=metrics enhanced=false Dec 15 04:34:35 localhost podman_exporter[243438]: ts=2025-12-15T09:34:35.360Z caller=handler.go:94 level=info msg="enabled collectors" Dec 15 04:34:35 localhost podman_exporter[243438]: ts=2025-12-15T09:34:35.360Z caller=handler.go:105 level=info collector=container Dec 15 04:34:35 localhost systemd[1]: Started /usr/bin/podman healthcheck run a4e94bd7aaee6c174c601060fb5f34559019ccea93fc397263ee3735731cfc1e. Dec 15 04:34:35 localhost podman[243424]: 2025-12-15 09:34:35.374669815 +0000 UTC m=+0.191329898 container start a4e94bd7aaee6c174c601060fb5f34559019ccea93fc397263ee3735731cfc1e (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}) Dec 15 04:34:35 localhost podman[243424]: podman_exporter Dec 15 04:34:35 localhost systemd[1]: Starting Podman API Service... Dec 15 04:34:35 localhost systemd[1]: Started Podman API Service. 
Dec 15 04:34:35 localhost systemd[1]: Started podman_exporter container. Dec 15 04:34:35 localhost podman[243449]: time="2025-12-15T09:34:35Z" level=info msg="/usr/bin/podman filtering at log level info" Dec 15 04:34:35 localhost podman[243448]: 2025-12-15 09:34:35.475855151 +0000 UTC m=+0.092574820 container health_status a4e94bd7aaee6c174c601060fb5f34559019ccea93fc397263ee3735731cfc1e (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=starting, maintainer=Navid Yaghoobi , managed_by=edpm_ansible, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter) Dec 15 04:34:35 localhost podman[243449]: time="2025-12-15T09:34:35Z" level=info msg="Not using native diff for overlay, this may cause degraded performance for building images: kernel has CONFIG_OVERLAY_FS_REDIRECT_DIR enabled" Dec 15 04:34:35 localhost podman[243449]: time="2025-12-15T09:34:35Z" level=info msg="Setting parallel job count to 25" Dec 15 04:34:35 localhost podman[243449]: time="2025-12-15T09:34:35Z" level=info msg="Using systemd socket activation to determine API endpoint" Dec 15 04:34:35 localhost podman[243449]: time="2025-12-15T09:34:35Z" level=info msg="API service listening on \"/run/podman/podman.sock\". 
URI: \"/run/podman/podman.sock\"" Dec 15 04:34:35 localhost podman[243448]: 2025-12-15 09:34:35.489388892 +0000 UTC m=+0.106108591 container exec_died a4e94bd7aaee6c174c601060fb5f34559019ccea93fc397263ee3735731cfc1e (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}) Dec 15 04:34:35 localhost podman[243448]: unhealthy Dec 15 04:34:35 localhost systemd[1]: a4e94bd7aaee6c174c601060fb5f34559019ccea93fc397263ee3735731cfc1e.service: Main process exited, code=exited, status=1/FAILURE Dec 15 04:34:35 localhost podman[243449]: @ - - [15/Dec/2025:09:34:35 +0000] "GET /v4.9.3/libpod/_ping HTTP/1.1" 200 2 "" "Go-http-client/1.1" Dec 15 04:34:35 localhost podman[243449]: time="2025-12-15T09:34:35Z" level=info msg="List containers: received `last` parameter - overwriting `limit`" Dec 15 04:34:35 localhost systemd[1]: a4e94bd7aaee6c174c601060fb5f34559019ccea93fc397263ee3735731cfc1e.service: Failed with result 'exit-code'. 
Dec 15 04:34:36 localhost nova_compute[231752]: 2025-12-15 09:34:36.199 231756 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 04:34:36 localhost systemd[1]: var-lib-containers-storage-overlay-d6636e8195e20b46e9ff0be91c525681b79b061d34e7042a3302554bc91c2a8c-merged.mount: Deactivated successfully. Dec 15 04:34:36 localhost systemd[1]: var-lib-containers-storage-overlay-cf8de856f68682579de884f5a9ccb4b00fffe375a72087325354c97a26c55ce7-merged.mount: Deactivated successfully. Dec 15 04:34:37 localhost systemd[1]: var-lib-containers-storage-overlay-cf8de856f68682579de884f5a9ccb4b00fffe375a72087325354c97a26c55ce7-merged.mount: Deactivated successfully. Dec 15 04:34:37 localhost python3.9[243611]: ansible-ansible.builtin.slurp Invoked with src=/var/lib/edpm-config/deployed_services.yaml Dec 15 04:34:37 localhost nova_compute[231752]: 2025-12-15 09:34:37.228 231756 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 04:34:37 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:fc:1f:13 MACDST=fa:16:3e:b3:bb:e3 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=58444 DF PROTO=TCP SPT=41242 DPT=9105 SEQ=2687390131 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A839EE8A60000000001030307) Dec 15 04:34:38 localhost python3.9[243721]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/deployed_services.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Dec 15 04:34:38 localhost systemd[1]: var-lib-containers-storage-overlay-3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae-merged.mount: Deactivated successfully. 
Dec 15 04:34:38 localhost systemd[1]: var-lib-containers-storage-overlay-d6636e8195e20b46e9ff0be91c525681b79b061d34e7042a3302554bc91c2a8c-merged.mount: Deactivated successfully. Dec 15 04:34:38 localhost systemd[1]: var-lib-containers-storage-overlay-d6636e8195e20b46e9ff0be91c525681b79b061d34e7042a3302554bc91c2a8c-merged.mount: Deactivated successfully. Dec 15 04:34:39 localhost python3.9[243811]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/deployed_services.yaml mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1765791277.6056185-2576-152439948806594/.source.yaml _original_basename=.i6f02whd follow=False checksum=791312cfe760a7c70e5d08303285c74267dd80a2 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Dec 15 04:34:39 localhost systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully. Dec 15 04:34:39 localhost systemd[1]: var-lib-containers-storage-overlay-3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae-merged.mount: Deactivated successfully. Dec 15 04:34:39 localhost systemd[1]: var-lib-containers-storage-overlay-3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae-merged.mount: Deactivated successfully. Dec 15 04:34:39 localhost python3.9[243921]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/healthchecks/openstack_network_exporter/healthcheck follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Dec 15 04:34:40 localhost systemd[1]: var-lib-containers-storage-overlay-c892fd6b7d17c3244e97732d72b83cd3d1a569af20da04450edaf25f54095ce6-merged.mount: Deactivated successfully. 
Dec 15 04:34:40 localhost systemd[1]: var-lib-containers-storage-overlay-c892fd6b7d17c3244e97732d72b83cd3d1a569af20da04450edaf25f54095ce6-merged.mount: Deactivated successfully. Dec 15 04:34:40 localhost python3.9[244009]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/healthchecks/openstack_network_exporter/ group=zuul mode=0700 owner=zuul setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1765791279.323755-2621-2416155649564/.source _original_basename=healthcheck follow=False checksum=e380c11c36804bfc65a818f2960cfa663daacfe5 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None Dec 15 04:34:41 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:fc:1f:13 MACDST=fa:16:3e:b3:bb:e3 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=57988 DF PROTO=TCP SPT=56536 DPT=9101 SEQ=1609692718 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A839EF5260000000001030307) Dec 15 04:34:41 localhost nova_compute[231752]: 2025-12-15 09:34:41.201 231756 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 04:34:41 localhost systemd[1]: Started /usr/bin/podman healthcheck run 9f0f481c937606298203797a71924b7614a5d9e3f4a8afd837cfa5b9602aedeb. Dec 15 04:34:41 localhost systemd[1]: Started /usr/bin/podman healthcheck run 67ee72fcd99c896f79e8807764b1d0f04e7d45927d06e306c0986394e8ed42a0. 
Dec 15 04:34:41 localhost podman[244027]: 2025-12-15 09:34:41.54814191 +0000 UTC m=+0.129770750 container health_status 9f0f481c937606298203797a71924b7614a5d9e3f4a8afd837cfa5b9602aedeb (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_id=multipathd, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true)
Dec 15 04:34:41 localhost podman[244027]: 2025-12-15 09:34:41.555532033 +0000 UTC m=+0.137161183 container exec_died 9f0f481c937606298203797a71924b7614a5d9e3f4a8afd837cfa5b9602aedeb (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, container_name=multipathd, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, tcib_managed=true, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team)
Dec 15 04:34:42 localhost systemd[1]: var-lib-containers-storage-overlay-cf8de856f68682579de884f5a9ccb4b00fffe375a72087325354c97a26c55ce7-merged.mount: Deactivated successfully.
Dec 15 04:34:42 localhost systemd[1]: var-lib-containers-storage-overlay-53b8a95516d46c09ab3d6aa5613b1755b13426c834b6bc7a5ba27a227397c635-merged.mount: Deactivated successfully.
Dec 15 04:34:42 localhost systemd[1]: 9f0f481c937606298203797a71924b7614a5d9e3f4a8afd837cfa5b9602aedeb.service: Deactivated successfully.
Dec 15 04:34:42 localhost podman[244045]: 2025-12-15 09:34:42.193883782 +0000 UTC m=+0.642832783 container health_status 67ee72fcd99c896f79e8807764b1d0f04e7d45927d06e306c0986394e8ed42a0 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, managed_by=edpm_ansible, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors )
Dec 15 04:34:42 localhost podman[244045]: 2025-12-15 09:34:42.229487928 +0000 UTC m=+0.678436889 container exec_died 67ee72fcd99c896f79e8807764b1d0f04e7d45927d06e306c0986394e8ed42a0 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter)
Dec 15 04:34:42 localhost nova_compute[231752]: 2025-12-15 09:34:42.271 231756 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 15 04:34:42 localhost python3.9[244150]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/edpm-config recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 15 04:34:42 localhost systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully.
Dec 15 04:34:42 localhost systemd[1]: var-lib-containers-storage-overlay-cae296f764831135e29cafc4ebb3dae4bbdc9f9a6aba7fb9c51fecf58f2b7f2e-merged.mount: Deactivated successfully.
Dec 15 04:34:42 localhost systemd[1]: var-lib-containers-storage-overlay-cae296f764831135e29cafc4ebb3dae4bbdc9f9a6aba7fb9c51fecf58f2b7f2e-merged.mount: Deactivated successfully.
Dec 15 04:34:42 localhost systemd[1]: 67ee72fcd99c896f79e8807764b1d0f04e7d45927d06e306c0986394e8ed42a0.service: Deactivated successfully.
Dec 15 04:34:42 localhost python3.9[244271]: ansible-ansible.builtin.file Invoked with path=/var/lib/kolla/config_files recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Dec 15 04:34:43 localhost systemd[1]: var-lib-containers-storage-overlay-c892fd6b7d17c3244e97732d72b83cd3d1a569af20da04450edaf25f54095ce6-merged.mount: Deactivated successfully.
Dec 15 04:34:43 localhost systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully.
Dec 15 04:34:43 localhost python3.9[244381]: ansible-ansible.legacy.stat Invoked with path=/var/lib/kolla/config_files/ceilometer_agent_compute.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 15 04:34:43 localhost systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully.
Dec 15 04:34:44 localhost python3.9[244438]: ansible-ansible.legacy.file Invoked with mode=0600 dest=/var/lib/kolla/config_files/ceilometer_agent_compute.json _original_basename=.ru7abdxd recurse=False state=file path=/var/lib/kolla/config_files/ceilometer_agent_compute.json force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 15 04:34:44 localhost nova_compute[231752]: 2025-12-15 09:34:44.611 231756 DEBUG oslo_service.periodic_task [None req-c7fd4a3f-c95b-4936-a071-10d43fe8d19c - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec 15 04:34:44 localhost nova_compute[231752]: 2025-12-15 09:34:44.611 231756 DEBUG oslo_service.periodic_task [None req-c7fd4a3f-c95b-4936-a071-10d43fe8d19c - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec 15 04:34:44 localhost nova_compute[231752]: 2025-12-15 09:34:44.630 231756 DEBUG oslo_service.periodic_task [None req-c7fd4a3f-c95b-4936-a071-10d43fe8d19c - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec 15 04:34:44 localhost nova_compute[231752]: 2025-12-15 09:34:44.631 231756 DEBUG nova.compute.manager [None req-c7fd4a3f-c95b-4936-a071-10d43fe8d19c - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Dec 15 04:34:44 localhost nova_compute[231752]: 2025-12-15 09:34:44.631 231756 DEBUG nova.compute.manager [None req-c7fd4a3f-c95b-4936-a071-10d43fe8d19c - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Dec 15 04:34:44 localhost systemd[1]: var-lib-containers-storage-overlay-cae296f764831135e29cafc4ebb3dae4bbdc9f9a6aba7fb9c51fecf58f2b7f2e-merged.mount: Deactivated successfully.
Dec 15 04:34:44 localhost python3.9[244546]: ansible-ansible.builtin.file Invoked with mode=0755 path=/var/lib/edpm-config/container-startup-config/openstack_network_exporter state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 15 04:34:44 localhost systemd[1]: var-lib-containers-storage-overlay-5f937d9d464184dc9258be95e010d108efe08788960fe016cf6a07726f0a4d55-merged.mount: Deactivated successfully.
Dec 15 04:34:44 localhost systemd[1]: var-lib-containers-storage-overlay-5f937d9d464184dc9258be95e010d108efe08788960fe016cf6a07726f0a4d55-merged.mount: Deactivated successfully.
Dec 15 04:34:45 localhost nova_compute[231752]: 2025-12-15 09:34:45.371 231756 DEBUG oslo_concurrency.lockutils [None req-c7fd4a3f-c95b-4936-a071-10d43fe8d19c - - - - - -] Acquiring lock "refresh_cache-39ff1bd9-6f6b-44c8-bbec-a1fd9d196359" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Dec 15 04:34:45 localhost nova_compute[231752]: 2025-12-15 09:34:45.372 231756 DEBUG oslo_concurrency.lockutils [None req-c7fd4a3f-c95b-4936-a071-10d43fe8d19c - - - - - -] Acquired lock "refresh_cache-39ff1bd9-6f6b-44c8-bbec-a1fd9d196359" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Dec 15 04:34:45 localhost nova_compute[231752]: 2025-12-15 09:34:45.372 231756 DEBUG nova.network.neutron [None req-c7fd4a3f-c95b-4936-a071-10d43fe8d19c - - - - - -] [instance: 39ff1bd9-6f6b-44c8-bbec-a1fd9d196359] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m
Dec 15 04:34:45 localhost nova_compute[231752]: 2025-12-15 09:34:45.372 231756 DEBUG nova.objects.instance [None req-c7fd4a3f-c95b-4936-a071-10d43fe8d19c - - - - - -] Lazy-loading 'info_cache' on Instance uuid 39ff1bd9-6f6b-44c8-bbec-a1fd9d196359 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Dec 15 04:34:46 localhost systemd[1]: var-lib-containers-storage-overlay-c892fd6b7d17c3244e97732d72b83cd3d1a569af20da04450edaf25f54095ce6-merged.mount: Deactivated successfully.
Dec 15 04:34:46 localhost systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully.
Dec 15 04:34:46 localhost systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully.
Dec 15 04:34:46 localhost nova_compute[231752]: 2025-12-15 09:34:46.209 231756 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 15 04:34:46 localhost nova_compute[231752]: 2025-12-15 09:34:46.403 231756 DEBUG nova.network.neutron [None req-c7fd4a3f-c95b-4936-a071-10d43fe8d19c - - - - - -] [instance: 39ff1bd9-6f6b-44c8-bbec-a1fd9d196359] Updating instance_info_cache with network_info: [{"id": "03ef8889-3216-43fb-8a52-4be17a956ce1", "address": "fa:16:3e:74:df:7c", "network": {"id": "befb7a72-17a9-4bcb-b561-84b8f626685a", "bridge": "br-int", "label": "private", "subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.201", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.20", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c785bf23f53946bc99867d8832a50266", "mtu": 1292, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap03ef8889-32", "ovs_interfaceid": "03ef8889-3216-43fb-8a52-4be17a956ce1", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Dec 15 04:34:46 localhost nova_compute[231752]: 2025-12-15 09:34:46.428 231756 DEBUG oslo_concurrency.lockutils [None req-c7fd4a3f-c95b-4936-a071-10d43fe8d19c - - - - - -] Releasing lock "refresh_cache-39ff1bd9-6f6b-44c8-bbec-a1fd9d196359" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Dec 15 04:34:46 localhost nova_compute[231752]: 2025-12-15 09:34:46.429 231756 DEBUG nova.compute.manager [None req-c7fd4a3f-c95b-4936-a071-10d43fe8d19c - - - - - -] [instance: 39ff1bd9-6f6b-44c8-bbec-a1fd9d196359] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m
Dec 15 04:34:46 localhost nova_compute[231752]: 2025-12-15 09:34:46.429 231756 DEBUG oslo_service.periodic_task [None req-c7fd4a3f-c95b-4936-a071-10d43fe8d19c - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec 15 04:34:46 localhost nova_compute[231752]: 2025-12-15 09:34:46.429 231756 DEBUG oslo_service.periodic_task [None req-c7fd4a3f-c95b-4936-a071-10d43fe8d19c - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec 15 04:34:46 localhost nova_compute[231752]: 2025-12-15 09:34:46.430 231756 DEBUG oslo_service.periodic_task [None req-c7fd4a3f-c95b-4936-a071-10d43fe8d19c - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec 15 04:34:46 localhost nova_compute[231752]: 2025-12-15 09:34:46.430 231756 DEBUG oslo_service.periodic_task [None req-c7fd4a3f-c95b-4936-a071-10d43fe8d19c - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec 15 04:34:46 localhost nova_compute[231752]: 2025-12-15 09:34:46.430 231756 DEBUG oslo_service.periodic_task [None req-c7fd4a3f-c95b-4936-a071-10d43fe8d19c - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec 15 04:34:46 localhost nova_compute[231752]: 2025-12-15 09:34:46.431 231756 DEBUG oslo_service.periodic_task [None req-c7fd4a3f-c95b-4936-a071-10d43fe8d19c - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec 15 04:34:46 localhost nova_compute[231752]: 2025-12-15 09:34:46.431 231756 DEBUG nova.compute.manager [None req-c7fd4a3f-c95b-4936-a071-10d43fe8d19c - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Dec 15 04:34:46 localhost nova_compute[231752]: 2025-12-15 09:34:46.431 231756 DEBUG oslo_service.periodic_task [None req-c7fd4a3f-c95b-4936-a071-10d43fe8d19c - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec 15 04:34:46 localhost nova_compute[231752]: 2025-12-15 09:34:46.448 231756 DEBUG oslo_concurrency.lockutils [None req-c7fd4a3f-c95b-4936-a071-10d43fe8d19c - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec 15 04:34:46 localhost nova_compute[231752]: 2025-12-15 09:34:46.449 231756 DEBUG oslo_concurrency.lockutils [None req-c7fd4a3f-c95b-4936-a071-10d43fe8d19c - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec 15 04:34:46 localhost nova_compute[231752]: 2025-12-15 09:34:46.449 231756 DEBUG oslo_concurrency.lockutils [None req-c7fd4a3f-c95b-4936-a071-10d43fe8d19c - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec 15 04:34:46 localhost nova_compute[231752]: 2025-12-15 09:34:46.450 231756 DEBUG nova.compute.resource_tracker [None req-c7fd4a3f-c95b-4936-a071-10d43fe8d19c - - - - - -] Auditing locally available compute resources for np0005559462.localdomain (node: np0005559462.localdomain) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Dec 15 04:34:46 localhost nova_compute[231752]: 2025-12-15 09:34:46.450 231756 DEBUG oslo_concurrency.processutils [None req-c7fd4a3f-c95b-4936-a071-10d43fe8d19c - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec 15 04:34:46 localhost python3.9[244870]: ansible-container_config_data Invoked with config_overrides={} config_path=/var/lib/edpm-config/container-startup-config/openstack_network_exporter config_pattern=*.json debug=False
Dec 15 04:34:47 localhost nova_compute[231752]: 2025-12-15 09:34:47.001 231756 DEBUG oslo_concurrency.processutils [None req-c7fd4a3f-c95b-4936-a071-10d43fe8d19c - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.551s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec 15 04:34:47 localhost nova_compute[231752]: 2025-12-15 09:34:47.053 231756 DEBUG nova.virt.libvirt.driver [None req-c7fd4a3f-c95b-4936-a071-10d43fe8d19c - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Dec 15 04:34:47 localhost nova_compute[231752]: 2025-12-15 09:34:47.054 231756 DEBUG nova.virt.libvirt.driver [None req-c7fd4a3f-c95b-4936-a071-10d43fe8d19c - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Dec 15 04:34:47 localhost systemd[1]: var-lib-containers-storage-overlay-a802e2c2182c5081dae453e00ae55ca652c01124f4ff691b910ec76e11c97f5a-merged.mount: Deactivated successfully.
Dec 15 04:34:47 localhost nova_compute[231752]: 2025-12-15 09:34:47.255 231756 WARNING nova.virt.libvirt.driver [None req-c7fd4a3f-c95b-4936-a071-10d43fe8d19c - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Dec 15 04:34:47 localhost nova_compute[231752]: 2025-12-15 09:34:47.258 231756 DEBUG nova.compute.resource_tracker [None req-c7fd4a3f-c95b-4936-a071-10d43fe8d19c - - - - - -] Hypervisor/Node resource view: name=np0005559462.localdomain free_ram=12711MB free_disk=41.83720779418945GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Dec 15 04:34:47 localhost nova_compute[231752]: 2025-12-15 09:34:47.258 231756 DEBUG oslo_concurrency.lockutils [None req-c7fd4a3f-c95b-4936-a071-10d43fe8d19c - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec 15 04:34:47 localhost nova_compute[231752]: 2025-12-15 09:34:47.259 231756 DEBUG oslo_concurrency.lockutils [None req-c7fd4a3f-c95b-4936-a071-10d43fe8d19c - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec 15 04:34:47 localhost systemd[1]: var-lib-containers-storage-overlay-3cbc8d0b1fd940058dd16189a8a0f2adea168e1862db457c5612a248dfd3b9d7-merged.mount: Deactivated successfully.
Dec 15 04:34:47 localhost nova_compute[231752]: 2025-12-15 09:34:47.274 231756 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 15 04:34:47 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:fc:1f:13 MACDST=fa:16:3e:b3:bb:e3 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=65059 DF PROTO=TCP SPT=44724 DPT=9882 SEQ=2632830315 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A839F0D250000000001030307)
Dec 15 04:34:47 localhost systemd[1]: var-lib-containers-storage-overlay-3cbc8d0b1fd940058dd16189a8a0f2adea168e1862db457c5612a248dfd3b9d7-merged.mount: Deactivated successfully.
Dec 15 04:34:47 localhost nova_compute[231752]: 2025-12-15 09:34:47.335 231756 DEBUG nova.compute.resource_tracker [None req-c7fd4a3f-c95b-4936-a071-10d43fe8d19c - - - - - -] Instance 39ff1bd9-6f6b-44c8-bbec-a1fd9d196359 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 2, 'MEMORY_MB': 512, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Dec 15 04:34:47 localhost nova_compute[231752]: 2025-12-15 09:34:47.335 231756 DEBUG nova.compute.resource_tracker [None req-c7fd4a3f-c95b-4936-a071-10d43fe8d19c - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Dec 15 04:34:47 localhost nova_compute[231752]: 2025-12-15 09:34:47.336 231756 DEBUG nova.compute.resource_tracker [None req-c7fd4a3f-c95b-4936-a071-10d43fe8d19c - - - - - -] Final resource view: name=np0005559462.localdomain phys_ram=15738MB used_ram=1024MB phys_disk=41GB used_disk=2GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Dec 15 04:34:47 localhost nova_compute[231752]: 2025-12-15 09:34:47.370 231756 DEBUG oslo_concurrency.processutils [None req-c7fd4a3f-c95b-4936-a071-10d43fe8d19c - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec 15 04:34:47 localhost systemd[1]: Started /usr/bin/podman healthcheck run b3d8483ade539500e228c2af9c0c3e647febb5ec7903610a83aea32631007d4a.
Dec 15 04:34:47 localhost systemd[1]: tmp-crun.DProXH.mount: Deactivated successfully.
Dec 15 04:34:47 localhost podman[244978]: 2025-12-15 09:34:47.667204102 +0000 UTC m=+0.114599734 container health_status b3d8483ade539500e228c2af9c0c3e647febb5ec7903610a83aea32631007d4a (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=starting, container_name=ceilometer_agent_compute, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ceilometer_agent_compute, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '07f3af15d79276418f54a8dde2fc251064527c3571430e14af77aaa2c01f1193-cb1e9478634a00c8fb5ad9cd7fcd38c7b2a1a85187dfaa618bc715c24c2b15fb'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Dec 15 04:34:47 localhost podman[244978]: 2025-12-15 09:34:47.696314941 +0000 UTC m=+0.143710553 container exec_died b3d8483ade539500e228c2af9c0c3e647febb5ec7903610a83aea32631007d4a (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, config_id=ceilometer_agent_compute, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '07f3af15d79276418f54a8dde2fc251064527c3571430e14af77aaa2c01f1193-cb1e9478634a00c8fb5ad9cd7fcd38c7b2a1a85187dfaa618bc715c24c2b15fb'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Dec 15 04:34:47 localhost podman[244978]: unhealthy
Dec 15 04:34:47 localhost nova_compute[231752]: 2025-12-15 09:34:47.804 231756 DEBUG oslo_concurrency.processutils [None req-c7fd4a3f-c95b-4936-a071-10d43fe8d19c - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.434s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec 15 04:34:47 localhost nova_compute[231752]: 2025-12-15 09:34:47.812 231756 DEBUG nova.compute.provider_tree [None req-c7fd4a3f-c95b-4936-a071-10d43fe8d19c - - - - - -] Inventory has not changed in ProviderTree for provider: 26c8956b-6742-4951-b566-971b9bbe323b update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Dec 15 04:34:47 localhost nova_compute[231752]: 2025-12-15 09:34:47.827 231756 DEBUG nova.scheduler.client.report [None req-c7fd4a3f-c95b-4936-a071-10d43fe8d19c - - - - - -] Inventory has not changed for provider 26c8956b-6742-4951-b566-971b9bbe323b based on inventory data: {'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Dec 15 04:34:47 localhost nova_compute[231752]: 2025-12-15 09:34:47.829 231756 DEBUG nova.compute.resource_tracker [None req-c7fd4a3f-c95b-4936-a071-10d43fe8d19c - - - - - -] Compute_service record updated for np0005559462.localdomain:np0005559462.localdomain _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Dec 15 04:34:47 localhost nova_compute[231752]: 2025-12-15 09:34:47.829 231756 DEBUG oslo_concurrency.lockutils [None req-c7fd4a3f-c95b-4936-a071-10d43fe8d19c - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.570s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec 15 04:34:47 localhost python3.9[245017]: ansible-container_config_hash Invoked with check_mode=False config_vol_prefix=/var/lib/openstack
Dec 15 04:34:48 localhost systemd[1]: var-lib-containers-storage-overlay-5e63dbc6f2c2fad3afb78d8adbb63d1357a03d400c05fbcd9ab42cd01e6497a2-merged.mount: Deactivated successfully.
Dec 15 04:34:48 localhost systemd[1]: b3d8483ade539500e228c2af9c0c3e647febb5ec7903610a83aea32631007d4a.service: Main process exited, code=exited, status=1/FAILURE
Dec 15 04:34:48 localhost systemd[1]: b3d8483ade539500e228c2af9c0c3e647febb5ec7903610a83aea32631007d4a.service: Failed with result 'exit-code'.
Dec 15 04:34:48 localhost systemd[1]: var-lib-containers-storage-overlay-e007bb9d0888be9cba9b97125428a4f6aecdcc0d729e1ce5c64249815340e7d9-merged.mount: Deactivated successfully.
Dec 15 04:34:49 localhost systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully.
Dec 15 04:34:49 localhost systemd[1]: var-lib-containers-storage-overlay-5e63dbc6f2c2fad3afb78d8adbb63d1357a03d400c05fbcd9ab42cd01e6497a2-merged.mount: Deactivated successfully.
Dec 15 04:34:49 localhost systemd[1]: var-lib-containers-storage-overlay-5e63dbc6f2c2fad3afb78d8adbb63d1357a03d400c05fbcd9ab42cd01e6497a2-merged.mount: Deactivated successfully.
Dec 15 04:34:49 localhost python3.9[245131]: ansible-containers.podman.podman_container_info Invoked with executable=podman name=None
Dec 15 04:34:49 localhost systemd[1]: var-lib-containers-storage-overlay-c892fd6b7d17c3244e97732d72b83cd3d1a569af20da04450edaf25f54095ce6-merged.mount: Deactivated successfully.
Dec 15 04:34:49 localhost systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully. Dec 15 04:34:49 localhost systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully. Dec 15 04:34:50 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:fc:1f:13 MACDST=fa:16:3e:b3:bb:e3 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=58446 DF PROTO=TCP SPT=41242 DPT=9105 SEQ=2687390131 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A839F19250000000001030307) Dec 15 04:34:50 localhost systemd[1]: var-lib-containers-storage-overlay-e007bb9d0888be9cba9b97125428a4f6aecdcc0d729e1ce5c64249815340e7d9-merged.mount: Deactivated successfully. Dec 15 04:34:50 localhost systemd[1]: var-lib-containers-storage-overlay-d37fb345fe4e85b30ebfa0aa6234403fca41e6174e3eeb88a2e8baf1f4bd0d92-merged.mount: Deactivated successfully. Dec 15 04:34:50 localhost systemd[1]: var-lib-containers-storage-overlay-d37fb345fe4e85b30ebfa0aa6234403fca41e6174e3eeb88a2e8baf1f4bd0d92-merged.mount: Deactivated successfully. 
Dec 15 04:34:51 localhost nova_compute[231752]: 2025-12-15 09:34:51.250 231756 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 04:34:51 localhost ovn_metadata_agent[160585]: 2025-12-15 09:34:51.437 160590 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Dec 15 04:34:51 localhost ovn_metadata_agent[160585]: 2025-12-15 09:34:51.438 160590 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Dec 15 04:34:51 localhost ovn_metadata_agent[160585]: 2025-12-15 09:34:51.439 160590 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Dec 15 04:34:52 localhost nova_compute[231752]: 2025-12-15 09:34:52.276 231756 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 04:34:52 localhost systemd[1]: Started /usr/bin/podman healthcheck run ac2eae30a2a7f971398af42ea67b3ed799d4b3341f7688ebcd73f7ca7f66c2a8. Dec 15 04:34:52 localhost systemd[1]: tmp-crun.3XvphO.mount: Deactivated successfully. 
Dec 15 04:34:52 localhost podman[245145]: 2025-12-15 09:34:52.741067128 +0000 UTC m=+0.073948670 container health_status ac2eae30a2a7f971398af42ea67b3ed799d4b3341f7688ebcd73f7ca7f66c2a8 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '07f3af15d79276418f54a8dde2fc251064527c3571430e14af77aaa2c01f1193'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_controller) Dec 15 04:34:52 localhost podman[245145]: 2025-12-15 09:34:52.768292314 +0000 UTC m=+0.101173786 container exec_died ac2eae30a2a7f971398af42ea67b3ed799d4b3341f7688ebcd73f7ca7f66c2a8 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, tcib_managed=true, container_name=ovn_controller, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, 
org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '07f3af15d79276418f54a8dde2fc251064527c3571430e14af77aaa2c01f1193'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image) Dec 15 04:34:53 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:fc:1f:13 MACDST=fa:16:3e:b3:bb:e3 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=50745 DF PROTO=TCP SPT=51828 DPT=9100 SEQ=4205695293 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A839F24820000000001030307) Dec 15 04:34:53 localhost systemd[1]: var-lib-containers-storage-overlay-f49a20fc1f5020138578527318ecbf7083cb8c7be7c4014409c81f2cedb36958-merged.mount: Deactivated successfully. Dec 15 04:34:53 localhost systemd[1]: var-lib-containers-storage-overlay-3edfdc699753a1c833a1247909047263cd4d267465db29104ef571eb019dbe34-merged.mount: Deactivated successfully. Dec 15 04:34:53 localhost systemd[1]: ac2eae30a2a7f971398af42ea67b3ed799d4b3341f7688ebcd73f7ca7f66c2a8.service: Deactivated successfully. 
Dec 15 04:34:53 localhost systemd[1]: var-lib-containers-storage-overlay-3edfdc699753a1c833a1247909047263cd4d267465db29104ef571eb019dbe34-merged.mount: Deactivated successfully. Dec 15 04:34:54 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:fc:1f:13 MACDST=fa:16:3e:b3:bb:e3 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=50746 DF PROTO=TCP SPT=51828 DPT=9100 SEQ=4205695293 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A839F28A50000000001030307) Dec 15 04:34:55 localhost systemd[1]: var-lib-containers-storage-overlay-3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae-merged.mount: Deactivated successfully. Dec 15 04:34:55 localhost systemd[1]: var-lib-containers-storage-overlay-f49a20fc1f5020138578527318ecbf7083cb8c7be7c4014409c81f2cedb36958-merged.mount: Deactivated successfully. Dec 15 04:34:55 localhost systemd[1]: var-lib-containers-storage-overlay-f49a20fc1f5020138578527318ecbf7083cb8c7be7c4014409c81f2cedb36958-merged.mount: Deactivated successfully. Dec 15 04:34:56 localhost nova_compute[231752]: 2025-12-15 09:34:56.273 231756 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 04:34:56 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:fc:1f:13 MACDST=fa:16:3e:b3:bb:e3 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=50747 DF PROTO=TCP SPT=51828 DPT=9100 SEQ=4205695293 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A839F30A50000000001030307) Dec 15 04:34:56 localhost systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully. Dec 15 04:34:56 localhost systemd[1]: var-lib-containers-storage-overlay-3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae-merged.mount: Deactivated successfully. 
Dec 15 04:34:56 localhost systemd[1]: var-lib-containers-storage-overlay-3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae-merged.mount: Deactivated successfully. Dec 15 04:34:57 localhost nova_compute[231752]: 2025-12-15 09:34:57.280 231756 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 04:34:57 localhost systemd[1]: var-lib-containers-storage-overlay-c892fd6b7d17c3244e97732d72b83cd3d1a569af20da04450edaf25f54095ce6-merged.mount: Deactivated successfully. Dec 15 04:34:57 localhost systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully. Dec 15 04:34:57 localhost systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully. Dec 15 04:34:57 localhost kernel: overlayfs: upperdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior. Dec 15 04:34:57 localhost kernel: overlayfs: workdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior. Dec 15 04:34:58 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:fc:1f:13 MACDST=fa:16:3e:b3:bb:e3 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=28695 DF PROTO=TCP SPT=56222 DPT=9101 SEQ=1467576029 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A839F3AA50000000001030307) Dec 15 04:35:00 localhost systemd[1]: var-lib-containers-storage-overlay-3edfdc699753a1c833a1247909047263cd4d267465db29104ef571eb019dbe34-merged.mount: Deactivated successfully. Dec 15 04:35:00 localhost systemd[1]: var-lib-containers-storage-overlay-ca79fd7369f4c32cc46dcf947016bb386f5de0d0f04a624169cf5c85994521b4-merged.mount: Deactivated successfully. 
Dec 15 04:35:00 localhost kernel: overlayfs: upperdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior. Dec 15 04:35:00 localhost kernel: overlayfs: workdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior. Dec 15 04:35:01 localhost nova_compute[231752]: 2025-12-15 09:35:01.304 231756 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 04:35:01 localhost systemd[1]: var-lib-containers-storage-overlay-3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae-merged.mount: Deactivated successfully. Dec 15 04:35:01 localhost systemd[1]: var-lib-containers-storage-overlay-88264f091bd3862d781bfa87f5675ae91e879ca34a7c2bbe081e8ea3bd8603d6-merged.mount: Deactivated successfully. Dec 15 04:35:01 localhost systemd[1]: var-lib-containers-storage-overlay-88264f091bd3862d781bfa87f5675ae91e879ca34a7c2bbe081e8ea3bd8603d6-merged.mount: Deactivated successfully. Dec 15 04:35:01 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:fc:1f:13 MACDST=fa:16:3e:b3:bb:e3 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=45650 DF PROTO=TCP SPT=52772 DPT=9882 SEQ=2040480627 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A839F462A0000000001030307) Dec 15 04:35:02 localhost nova_compute[231752]: 2025-12-15 09:35:02.283 231756 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 04:35:02 localhost systemd[1]: Started /usr/bin/podman healthcheck run 4d46c406a3855fd1c322e5d86c048188ea5865daa07ba23aaebacc7879668923. 
Dec 15 04:35:02 localhost podman[245171]: 2025-12-15 09:35:02.507517953 +0000 UTC m=+0.090377013 container health_status 4d46c406a3855fd1c322e5d86c048188ea5865daa07ba23aaebacc7879668923 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '07f3af15d79276418f54a8dde2fc251064527c3571430e14af77aaa2c01f1193-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS) Dec 15 04:35:02 localhost systemd[1]: 
var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully. Dec 15 04:35:02 localhost systemd[1]: var-lib-containers-storage-overlay-3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae-merged.mount: Deactivated successfully. Dec 15 04:35:02 localhost podman[245171]: 2025-12-15 09:35:02.54047837 +0000 UTC m=+0.123337490 container exec_died 4d46c406a3855fd1c322e5d86c048188ea5865daa07ba23aaebacc7879668923 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_managed=true, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '07f3af15d79276418f54a8dde2fc251064527c3571430e14af77aaa2c01f1193-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', 
'/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image) Dec 15 04:35:02 localhost systemd[1]: var-lib-containers-storage-overlay-3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae-merged.mount: Deactivated successfully. Dec 15 04:35:02 localhost systemd[1]: 4d46c406a3855fd1c322e5d86c048188ea5865daa07ba23aaebacc7879668923.service: Deactivated successfully. Dec 15 04:35:03 localhost systemd[1]: var-lib-containers-storage-overlay-c892fd6b7d17c3244e97732d72b83cd3d1a569af20da04450edaf25f54095ce6-merged.mount: Deactivated successfully. Dec 15 04:35:03 localhost systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully. Dec 15 04:35:03 localhost systemd[1]: var-lib-containers-storage-overlay-c892fd6b7d17c3244e97732d72b83cd3d1a569af20da04450edaf25f54095ce6-merged.mount: Deactivated successfully. Dec 15 04:35:04 localhost systemd[1]: var-lib-containers-storage-overlay-88264f091bd3862d781bfa87f5675ae91e879ca34a7c2bbe081e8ea3bd8603d6-merged.mount: Deactivated successfully. Dec 15 04:35:04 localhost systemd[1]: var-lib-containers-storage-overlay-d63ab03ca441fedc5f5fcdf51699b396e9401963b7839d4b0e700c4e4e1e58a9-merged.mount: Deactivated successfully. Dec 15 04:35:04 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:fc:1f:13 MACDST=fa:16:3e:b3:bb:e3 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=45652 DF PROTO=TCP SPT=52772 DPT=9882 SEQ=2040480627 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A839F52250000000001030307) Dec 15 04:35:05 localhost systemd[1]: Started /usr/bin/podman healthcheck run a4e94bd7aaee6c174c601060fb5f34559019ccea93fc397263ee3735731cfc1e. 
Dec 15 04:35:05 localhost podman[245189]: 2025-12-15 09:35:05.75947225 +0000 UTC m=+0.087449322 container health_status a4e94bd7aaee6c174c601060fb5f34559019ccea93fc397263ee3735731cfc1e (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=starting, container_name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter) Dec 15 04:35:05 localhost podman[245189]: 2025-12-15 09:35:05.800417691 +0000 UTC m=+0.128394793 container exec_died a4e94bd7aaee6c174c601060fb5f34559019ccea93fc397263ee3735731cfc1e (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', 
'/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible) Dec 15 04:35:05 localhost podman[245189]: unhealthy Dec 15 04:35:05 localhost ceph-osd[31375]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS ------- Dec 15 04:35:05 localhost ceph-osd[31375]: rocksdb: [db/db_impl/db_impl.cc:1111] #012** DB Stats **#012Uptime(secs): 6600.1 total, 600.0 interval#012Cumulative writes: 4815 writes, 21K keys, 4815 commit groups, 1.0 writes per commit group, ingest: 0.02 GB, 0.00 MB/s#012Cumulative WAL: 4815 writes, 628 syncs, 7.67 writes per sync, written: 0.02 GB, 0.00 MB/s#012Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent#012Interval writes: 0 writes, 0 keys, 0 commit groups, 0.0 writes per commit group, ingest: 0.00 MB, 0.00 MB/s#012Interval WAL: 0 writes, 0 syncs, 0.00 writes per sync, written: 0.00 GB, 0.00 MB/s#012Interval stall: 00:00:0.000 H:M:S, 0.0 percent Dec 15 04:35:06 localhost nova_compute[231752]: 2025-12-15 09:35:06.338 231756 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 04:35:06 localhost systemd[1]: var-lib-containers-storage-overlay-14ed6d3c1e7f0efbf3e5310f077b6fbf5a3cd333e0b5df7204752cd3df15a8b7-merged.mount: Deactivated successfully. Dec 15 04:35:06 localhost systemd[1]: var-lib-containers-storage-overlay-5072aa4283df2440f817438926274b2ecc1fbb999174180268a40a1b62865efd-merged.mount: Deactivated successfully. Dec 15 04:35:07 localhost systemd[1]: var-lib-containers-storage-overlay-5072aa4283df2440f817438926274b2ecc1fbb999174180268a40a1b62865efd-merged.mount: Deactivated successfully. 
Dec 15 04:35:07 localhost systemd[1]: a4e94bd7aaee6c174c601060fb5f34559019ccea93fc397263ee3735731cfc1e.service: Main process exited, code=exited, status=1/FAILURE Dec 15 04:35:07 localhost systemd[1]: a4e94bd7aaee6c174c601060fb5f34559019ccea93fc397263ee3735731cfc1e.service: Failed with result 'exit-code'. Dec 15 04:35:07 localhost nova_compute[231752]: 2025-12-15 09:35:07.286 231756 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 04:35:07 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:fc:1f:13 MACDST=fa:16:3e:b3:bb:e3 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=31559 DF PROTO=TCP SPT=37098 DPT=9105 SEQ=1121519201 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A839F5DE50000000001030307) Dec 15 04:35:08 localhost systemd[1]: var-lib-containers-storage-overlay-3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae-merged.mount: Deactivated successfully. Dec 15 04:35:08 localhost systemd[1]: var-lib-containers-storage-overlay-14ed6d3c1e7f0efbf3e5310f077b6fbf5a3cd333e0b5df7204752cd3df15a8b7-merged.mount: Deactivated successfully. Dec 15 04:35:08 localhost systemd[1]: var-lib-containers-storage-overlay-14ed6d3c1e7f0efbf3e5310f077b6fbf5a3cd333e0b5df7204752cd3df15a8b7-merged.mount: Deactivated successfully. Dec 15 04:35:09 localhost systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully. Dec 15 04:35:09 localhost systemd[1]: var-lib-containers-storage-overlay-3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae-merged.mount: Deactivated successfully. Dec 15 04:35:10 localhost systemd[1]: var-lib-containers-storage-overlay-3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae-merged.mount: Deactivated successfully. 
Dec 15 04:35:10 localhost ceph-osd[32311]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS ------- Dec 15 04:35:10 localhost ceph-osd[32311]: rocksdb: [db/db_impl/db_impl.cc:1111] #012** DB Stats **#012Uptime(secs): 6600.2 total, 600.0 interval#012Cumulative writes: 5745 writes, 25K keys, 5745 commit groups, 1.0 writes per commit group, ingest: 0.02 GB, 0.00 MB/s#012Cumulative WAL: 5745 writes, 763 syncs, 7.53 writes per sync, written: 0.02 GB, 0.00 MB/s#012Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent#012Interval writes: 0 writes, 0 keys, 0 commit groups, 0.0 writes per commit group, ingest: 0.00 MB, 0.00 MB/s#012Interval WAL: 0 writes, 0 syncs, 0.00 writes per sync, written: 0.00 GB, 0.00 MB/s#012Interval stall: 00:00:0.000 H:M:S, 0.0 percent Dec 15 04:35:10 localhost systemd[1]: var-lib-containers-storage-overlay-c892fd6b7d17c3244e97732d72b83cd3d1a569af20da04450edaf25f54095ce6-merged.mount: Deactivated successfully. Dec 15 04:35:10 localhost systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully. Dec 15 04:35:10 localhost systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully. 
Dec 15 04:35:11 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:fc:1f:13 MACDST=fa:16:3e:b3:bb:e3 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=28697 DF PROTO=TCP SPT=56222 DPT=9101 SEQ=1467576029 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A839F6B260000000001030307)
Dec 15 04:35:11 localhost nova_compute[231752]: 2025-12-15 09:35:11.373 231756 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 15 04:35:12 localhost nova_compute[231752]: 2025-12-15 09:35:12.288 231756 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 15 04:35:12 localhost systemd[1]: Started /usr/bin/podman healthcheck run 9f0f481c937606298203797a71924b7614a5d9e3f4a8afd837cfa5b9602aedeb.
Dec 15 04:35:12 localhost podman[245212]: 2025-12-15 09:35:12.750806762 +0000 UTC m=+0.083450247 container health_status 9f0f481c937606298203797a71924b7614a5d9e3f4a8afd837cfa5b9602aedeb (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, tcib_managed=true, org.label-schema.vendor=CentOS, container_name=multipathd, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Dec 15 04:35:12 localhost podman[245212]: 2025-12-15 09:35:12.762499591 +0000 UTC m=+0.095143106 container exec_died 9f0f481c937606298203797a71924b7614a5d9e3f4a8afd837cfa5b9602aedeb (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, container_name=multipathd, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, config_id=multipathd, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.license=GPLv2)
Dec 15 04:35:13 localhost systemd[1]: var-lib-containers-storage-overlay-5072aa4283df2440f817438926274b2ecc1fbb999174180268a40a1b62865efd-merged.mount: Deactivated successfully.
Dec 15 04:35:13 localhost systemd[1]: Started /usr/bin/podman healthcheck run 67ee72fcd99c896f79e8807764b1d0f04e7d45927d06e306c0986394e8ed42a0.
Dec 15 04:35:13 localhost systemd[1]: var-lib-containers-storage-overlay-a7a0fabd483d0287d5d447d66b01a81ddf0a08e391390a4f77973a583945daec-merged.mount: Deactivated successfully.
Dec 15 04:35:13 localhost systemd[1]: var-lib-containers-storage-overlay-a7a0fabd483d0287d5d447d66b01a81ddf0a08e391390a4f77973a583945daec-merged.mount: Deactivated successfully.
Dec 15 04:35:13 localhost systemd[1]: 9f0f481c937606298203797a71924b7614a5d9e3f4a8afd837cfa5b9602aedeb.service: Deactivated successfully.
Dec 15 04:35:13 localhost podman[245232]: 2025-12-15 09:35:13.331938328 +0000 UTC m=+0.180660193 container health_status 67ee72fcd99c896f79e8807764b1d0f04e7d45927d06e306c0986394e8ed42a0 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible)
Dec 15 04:35:13 localhost podman[245232]: 2025-12-15 09:35:13.340325834 +0000 UTC m=+0.189047709 container exec_died 67ee72fcd99c896f79e8807764b1d0f04e7d45927d06e306c0986394e8ed42a0 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, managed_by=edpm_ansible, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors )
Dec 15 04:35:15 localhost systemd[1]: var-lib-containers-storage-overlay-f49a20fc1f5020138578527318ecbf7083cb8c7be7c4014409c81f2cedb36958-merged.mount: Deactivated successfully.
Dec 15 04:35:15 localhost systemd[1]: var-lib-containers-storage-overlay-3edfdc699753a1c833a1247909047263cd4d267465db29104ef571eb019dbe34-merged.mount: Deactivated successfully.
Dec 15 04:35:15 localhost systemd[1]: var-lib-containers-storage-overlay-3edfdc699753a1c833a1247909047263cd4d267465db29104ef571eb019dbe34-merged.mount: Deactivated successfully.
Dec 15 04:35:15 localhost systemd[1]: 67ee72fcd99c896f79e8807764b1d0f04e7d45927d06e306c0986394e8ed42a0.service: Deactivated successfully.
Dec 15 04:35:16 localhost nova_compute[231752]: 2025-12-15 09:35:16.391 231756 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 15 04:35:17 localhost nova_compute[231752]: 2025-12-15 09:35:17.290 231756 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 15 04:35:17 localhost systemd[1]: var-lib-containers-storage-overlay-3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae-merged.mount: Deactivated successfully.
Dec 15 04:35:17 localhost systemd[1]: var-lib-containers-storage-overlay-f49a20fc1f5020138578527318ecbf7083cb8c7be7c4014409c81f2cedb36958-merged.mount: Deactivated successfully.
Dec 15 04:35:17 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:fc:1f:13 MACDST=fa:16:3e:b3:bb:e3 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=45654 DF PROTO=TCP SPT=52772 DPT=9882 SEQ=2040480627 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A839F83250000000001030307)
Dec 15 04:35:17 localhost systemd[1]: var-lib-containers-storage-overlay-f49a20fc1f5020138578527318ecbf7083cb8c7be7c4014409c81f2cedb36958-merged.mount: Deactivated successfully.
Dec 15 04:35:18 localhost systemd[1]: Started /usr/bin/podman healthcheck run b3d8483ade539500e228c2af9c0c3e647febb5ec7903610a83aea32631007d4a.
Dec 15 04:35:18 localhost systemd[1]: tmp-crun.ruczOJ.mount: Deactivated successfully.
Dec 15 04:35:18 localhost podman[245255]: 2025-12-15 09:35:18.4819213 +0000 UTC m=+0.076667807 container health_status b3d8483ade539500e228c2af9c0c3e647febb5ec7903610a83aea32631007d4a (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=unhealthy, org.label-schema.vendor=CentOS, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '07f3af15d79276418f54a8dde2fc251064527c3571430e14af77aaa2c01f1193-cb1e9478634a00c8fb5ad9cd7fcd38c7b2a1a85187dfaa618bc715c24c2b15fb'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, config_id=ceilometer_agent_compute, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, container_name=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.build-date=20251202)
Dec 15 04:35:18 localhost podman[245255]: 2025-12-15 09:35:18.490523062 +0000 UTC m=+0.085269569 container exec_died b3d8483ade539500e228c2af9c0c3e647febb5ec7903610a83aea32631007d4a (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251202, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '07f3af15d79276418f54a8dde2fc251064527c3571430e14af77aaa2c01f1193-cb1e9478634a00c8fb5ad9cd7fcd38c7b2a1a85187dfaa618bc715c24c2b15fb'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, config_id=ceilometer_agent_compute, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.license=GPLv2)
Dec 15 04:35:18 localhost podman[245255]: unhealthy
Dec 15 04:35:18 localhost systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully.
Dec 15 04:35:18 localhost systemd[1]: var-lib-containers-storage-overlay-3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae-merged.mount: Deactivated successfully.
Dec 15 04:35:19 localhost systemd[1]: b3d8483ade539500e228c2af9c0c3e647febb5ec7903610a83aea32631007d4a.service: Main process exited, code=exited, status=1/FAILURE
Dec 15 04:35:19 localhost systemd[1]: b3d8483ade539500e228c2af9c0c3e647febb5ec7903610a83aea32631007d4a.service: Failed with result 'exit-code'.
Dec 15 04:35:19 localhost systemd[1]: var-lib-containers-storage-overlay-c892fd6b7d17c3244e97732d72b83cd3d1a569af20da04450edaf25f54095ce6-merged.mount: Deactivated successfully.
Dec 15 04:35:19 localhost systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully.
Dec 15 04:35:19 localhost systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully.
Dec 15 04:35:20 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:fc:1f:13 MACDST=fa:16:3e:b3:bb:e3 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=31561 DF PROTO=TCP SPT=37098 DPT=9105 SEQ=1121519201 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A839F8D260000000001030307)
Dec 15 04:35:21 localhost nova_compute[231752]: 2025-12-15 09:35:21.425 231756 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 15 04:35:22 localhost nova_compute[231752]: 2025-12-15 09:35:22.293 231756 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 15 04:35:22 localhost systemd[1]: var-lib-containers-storage-overlay-3edfdc699753a1c833a1247909047263cd4d267465db29104ef571eb019dbe34-merged.mount: Deactivated successfully.
Dec 15 04:35:22 localhost systemd[1]: var-lib-containers-storage-overlay-ba4f6a51fa4a8ce51a3b8b83f42d30ba19f6e65ec47784832d65f45914b624c3-merged.mount: Deactivated successfully.
Dec 15 04:35:23 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:fc:1f:13 MACDST=fa:16:3e:b3:bb:e3 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=38700 DF PROTO=TCP SPT=48544 DPT=9100 SEQ=1526665518 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A839F99B30000000001030307)
Dec 15 04:35:23 localhost systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully.
Dec 15 04:35:23 localhost systemd[1]: var-lib-containers-storage-overlay-a10f3c610bfd3a5166c8bb201abb4a07184bf8ddf69826ea8939f1a48ecba966-merged.mount: Deactivated successfully.
Dec 15 04:35:23 localhost systemd[1]: var-lib-containers-storage-overlay-a10f3c610bfd3a5166c8bb201abb4a07184bf8ddf69826ea8939f1a48ecba966-merged.mount: Deactivated successfully.
Dec 15 04:35:23 localhost systemd[1]: Started /usr/bin/podman healthcheck run ac2eae30a2a7f971398af42ea67b3ed799d4b3341f7688ebcd73f7ca7f66c2a8.
Dec 15 04:35:24 localhost podman[245273]: 2025-12-15 09:35:24.029281348 +0000 UTC m=+0.084390834 container health_status ac2eae30a2a7f971398af42ea67b3ed799d4b3341f7688ebcd73f7ca7f66c2a8 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '07f3af15d79276418f54a8dde2fc251064527c3571430e14af77aaa2c01f1193'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ovn_controller, io.buildah.version=1.41.3)
Dec 15 04:35:24 localhost podman[245273]: 2025-12-15 09:35:24.066389383 +0000 UTC m=+0.121498839 container exec_died ac2eae30a2a7f971398af42ea67b3ed799d4b3341f7688ebcd73f7ca7f66c2a8 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, container_name=ovn_controller, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '07f3af15d79276418f54a8dde2fc251064527c3571430e14af77aaa2c01f1193'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, config_id=ovn_controller, org.label-schema.vendor=CentOS)
Dec 15 04:35:24 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:fc:1f:13 MACDST=fa:16:3e:b3:bb:e3 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=38701 DF PROTO=TCP SPT=48544 DPT=9100 SEQ=1526665518 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A839F9DA50000000001030307)
Dec 15 04:35:24 localhost systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully.
Dec 15 04:35:24 localhost systemd[1]: var-lib-containers-storage-overlay-c892fd6b7d17c3244e97732d72b83cd3d1a569af20da04450edaf25f54095ce6-merged.mount: Deactivated successfully.
Dec 15 04:35:24 localhost systemd[1]: ac2eae30a2a7f971398af42ea67b3ed799d4b3341f7688ebcd73f7ca7f66c2a8.service: Deactivated successfully.
Dec 15 04:35:24 localhost systemd[1]: var-lib-containers-storage-overlay-a10f3c610bfd3a5166c8bb201abb4a07184bf8ddf69826ea8939f1a48ecba966-merged.mount: Deactivated successfully.
Dec 15 04:35:25 localhost systemd[1]: var-lib-containers-storage-overlay-8386ac9ef0e341b40941113adbcd0de64d383dd53b6c975b3c29a443c4fff823-merged.mount: Deactivated successfully.
Dec 15 04:35:26 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:fc:1f:13 MACDST=fa:16:3e:b3:bb:e3 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=38702 DF PROTO=TCP SPT=48544 DPT=9100 SEQ=1526665518 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A839FA5A50000000001030307)
Dec 15 04:35:26 localhost nova_compute[231752]: 2025-12-15 09:35:26.427 231756 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 15 04:35:27 localhost systemd[1]: var-lib-containers-storage-overlay-f49a20fc1f5020138578527318ecbf7083cb8c7be7c4014409c81f2cedb36958-merged.mount: Deactivated successfully.
Dec 15 04:35:27 localhost nova_compute[231752]: 2025-12-15 09:35:27.297 231756 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 15 04:35:27 localhost systemd[1]: var-lib-containers-storage-overlay-3edfdc699753a1c833a1247909047263cd4d267465db29104ef571eb019dbe34-merged.mount: Deactivated successfully.
Dec 15 04:35:27 localhost systemd[1]: var-lib-containers-storage-overlay-3edfdc699753a1c833a1247909047263cd4d267465db29104ef571eb019dbe34-merged.mount: Deactivated successfully.
Dec 15 04:35:28 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:fc:1f:13 MACDST=fa:16:3e:b3:bb:e3 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=48282 DF PROTO=TCP SPT=45800 DPT=9101 SEQ=2089930627 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A839FAFA50000000001030307)
Dec 15 04:35:29 localhost systemd[1]: var-lib-containers-storage-overlay-3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae-merged.mount: Deactivated successfully.
Dec 15 04:35:29 localhost systemd[1]: var-lib-containers-storage-overlay-f49a20fc1f5020138578527318ecbf7083cb8c7be7c4014409c81f2cedb36958-merged.mount: Deactivated successfully.
Dec 15 04:35:29 localhost systemd[1]: var-lib-containers-storage-overlay-f49a20fc1f5020138578527318ecbf7083cb8c7be7c4014409c81f2cedb36958-merged.mount: Deactivated successfully.
Dec 15 04:35:30 localhost systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully.
Dec 15 04:35:30 localhost systemd[1]: var-lib-containers-storage-overlay-3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae-merged.mount: Deactivated successfully.
Dec 15 04:35:30 localhost systemd[1]: var-lib-containers-storage-overlay-3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae-merged.mount: Deactivated successfully.
Dec 15 04:35:31 localhost systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully.
Dec 15 04:35:31 localhost nova_compute[231752]: 2025-12-15 09:35:31.476 231756 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 15 04:35:31 localhost systemd[1]: var-lib-containers-storage-overlay-c892fd6b7d17c3244e97732d72b83cd3d1a569af20da04450edaf25f54095ce6-merged.mount: Deactivated successfully.
Dec 15 04:35:31 localhost systemd[1]: var-lib-containers-storage-overlay-c892fd6b7d17c3244e97732d72b83cd3d1a569af20da04450edaf25f54095ce6-merged.mount: Deactivated successfully.
Dec 15 04:35:31 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:fc:1f:13 MACDST=fa:16:3e:b3:bb:e3 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=45987 DF PROTO=TCP SPT=42368 DPT=9882 SEQ=1630756466 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A839FBB590000000001030307)
Dec 15 04:35:32 localhost nova_compute[231752]: 2025-12-15 09:35:32.299 231756 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 15 04:35:32 localhost systemd[1]: Started /usr/bin/podman healthcheck run 4d46c406a3855fd1c322e5d86c048188ea5865daa07ba23aaebacc7879668923.
Dec 15 04:35:32 localhost podman[245298]: 2025-12-15 09:35:32.77760727 +0000 UTC m=+0.070055031 container health_status 4d46c406a3855fd1c322e5d86c048188ea5865daa07ba23aaebacc7879668923 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '07f3af15d79276418f54a8dde2fc251064527c3571430e14af77aaa2c01f1193-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202)
Dec 15 04:35:32 localhost podman[245298]: 2025-12-15 09:35:32.811475513 +0000 UTC m=+0.103923304 container exec_died 4d46c406a3855fd1c322e5d86c048188ea5865daa07ba23aaebacc7879668923 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.license=GPLv2, tcib_managed=true, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '07f3af15d79276418f54a8dde2fc251064527c3571430e14af77aaa2c01f1193-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Dec 15 04:35:33 localhost systemd[1]: var-lib-containers-storage-overlay-3edfdc699753a1c833a1247909047263cd4d267465db29104ef571eb019dbe34-merged.mount: Deactivated successfully.
Dec 15 04:35:33 localhost systemd[1]: var-lib-containers-storage-overlay-c0eca1cbaf0df6dd2fb69cf60e02f6f367b7d2e448ccc3625823f47bdf01b658-merged.mount: Deactivated successfully.
Dec 15 04:35:34 localhost systemd[1]: 4d46c406a3855fd1c322e5d86c048188ea5865daa07ba23aaebacc7879668923.service: Deactivated successfully.
Dec 15 04:35:34 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:fc:1f:13 MACDST=fa:16:3e:b3:bb:e3 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=45989 DF PROTO=TCP SPT=42368 DPT=9882 SEQ=1630756466 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A839FC7650000000001030307)
Dec 15 04:35:36 localhost systemd[1]: var-lib-containers-storage-overlay-f49a20fc1f5020138578527318ecbf7083cb8c7be7c4014409c81f2cedb36958-merged.mount: Deactivated successfully.
Dec 15 04:35:36 localhost systemd[1]: var-lib-containers-storage-overlay-e7e7fc61a64bc57d1eb8c2a61f7791db4e4a30e6f64eed9bc93c76716d60ed28-merged.mount: Deactivated successfully.
Dec 15 04:35:36 localhost nova_compute[231752]: 2025-12-15 09:35:36.478 231756 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 15 04:35:36 localhost systemd[1]: var-lib-containers-storage-overlay-e7e7fc61a64bc57d1eb8c2a61f7791db4e4a30e6f64eed9bc93c76716d60ed28-merged.mount: Deactivated successfully.
Dec 15 04:35:37 localhost systemd[1]: Started /usr/bin/podman healthcheck run a4e94bd7aaee6c174c601060fb5f34559019ccea93fc397263ee3735731cfc1e.
Dec 15 04:35:37 localhost nova_compute[231752]: 2025-12-15 09:35:37.301 231756 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 15 04:35:37 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:fc:1f:13 MACDST=fa:16:3e:b3:bb:e3 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=23667 DF PROTO=TCP SPT=58182 DPT=9105 SEQ=3216642994 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A839FD2E40000000001030307)
Dec 15 04:35:38 localhost systemd[1]: var-lib-containers-storage-overlay-3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae-merged.mount: Deactivated successfully.
Dec 15 04:35:38 localhost systemd[1]: var-lib-containers-storage-overlay-f49a20fc1f5020138578527318ecbf7083cb8c7be7c4014409c81f2cedb36958-merged.mount: Deactivated successfully.
Dec 15 04:35:38 localhost systemd[1]: var-lib-containers-storage-overlay-f49a20fc1f5020138578527318ecbf7083cb8c7be7c4014409c81f2cedb36958-merged.mount: Deactivated successfully.
Dec 15 04:35:38 localhost podman[245374]: 2025-12-15 09:35:38.610620434 +0000 UTC m=+1.442858774 container health_status a4e94bd7aaee6c174c601060fb5f34559019ccea93fc397263ee3735731cfc1e (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=starting, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible)
Dec 15 04:35:38 localhost podman[245374]: 2025-12-15 09:35:38.62220702 +0000 UTC m=+1.454445340 container exec_died a4e94bd7aaee6c174c601060fb5f34559019ccea93fc397263ee3735731cfc1e (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter)
Dec 15 04:35:38 localhost podman[245374]: unhealthy
Dec 15 04:35:39 localhost systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully.
Dec 15 04:35:39 localhost systemd[1]: var-lib-containers-storage-overlay-3df44265ee334241877fc90da4598858e128dcd022ea76b8f6ef87bd0d8667ae-merged.mount: Deactivated successfully.
Dec 15 04:35:39 localhost systemd[1]: a4e94bd7aaee6c174c601060fb5f34559019ccea93fc397263ee3735731cfc1e.service: Main process exited, code=exited, status=1/FAILURE
Dec 15 04:35:39 localhost systemd[1]: a4e94bd7aaee6c174c601060fb5f34559019ccea93fc397263ee3735731cfc1e.service: Failed with result 'exit-code'.
Dec 15 04:35:40 localhost systemd[1]: var-lib-containers-storage-overlay-c892fd6b7d17c3244e97732d72b83cd3d1a569af20da04450edaf25f54095ce6-merged.mount: Deactivated successfully.
Dec 15 04:35:40 localhost systemd[1]: var-lib-containers-storage-overlay-efd486ab4cd4ff83f3804626a19ad34bc69aaee72db0852b1e52409f0ff23ebf-merged.mount: Deactivated successfully.
Dec 15 04:35:40 localhost systemd[1]: var-lib-containers-storage-overlay-c892fd6b7d17c3244e97732d72b83cd3d1a569af20da04450edaf25f54095ce6-merged.mount: Deactivated successfully.
Dec 15 04:35:40 localhost systemd[1]: var-lib-containers-storage-overlay-c892fd6b7d17c3244e97732d72b83cd3d1a569af20da04450edaf25f54095ce6-merged.mount: Deactivated successfully.
Dec 15 04:35:41 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:fc:1f:13 MACDST=fa:16:3e:b3:bb:e3 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=48284 DF PROTO=TCP SPT=45800 DPT=9101 SEQ=2089930627 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A839FDF250000000001030307) Dec 15 04:35:41 localhost nova_compute[231752]: 2025-12-15 09:35:41.529 231756 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 04:35:42 localhost nova_compute[231752]: 2025-12-15 09:35:42.303 231756 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 04:35:42 localhost systemd[1]: var-lib-containers-storage-overlay-e7e7fc61a64bc57d1eb8c2a61f7791db4e4a30e6f64eed9bc93c76716d60ed28-merged.mount: Deactivated successfully. Dec 15 04:35:42 localhost systemd[1]: var-lib-containers-storage-overlay-1925df910a0bd163709115d5c6434edae9eb72581a26c20b4795234cbdad634b-merged.mount: Deactivated successfully. Dec 15 04:35:43 localhost systemd[1]: var-lib-containers-storage-overlay-1925df910a0bd163709115d5c6434edae9eb72581a26c20b4795234cbdad634b-merged.mount: Deactivated successfully. Dec 15 04:35:43 localhost systemd[1]: Started /usr/bin/podman healthcheck run 9f0f481c937606298203797a71924b7614a5d9e3f4a8afd837cfa5b9602aedeb. 
Dec 15 04:35:43 localhost podman[245434]: 2025-12-15 09:35:43.751605063 +0000 UTC m=+0.082800120 container health_status 9f0f481c937606298203797a71924b7614a5d9e3f4a8afd837cfa5b9602aedeb (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, container_name=multipathd, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb) Dec 15 04:35:43 localhost podman[245434]: 2025-12-15 09:35:43.762778828 +0000 UTC m=+0.093973885 container exec_died 
9f0f481c937606298203797a71924b7614a5d9e3f4a8afd837cfa5b9602aedeb (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202, io.buildah.version=1.41.3, config_id=multipathd, container_name=multipathd, managed_by=edpm_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}) Dec 15 04:35:44 localhost systemd[1]: var-lib-containers-storage-overlay-e286d84738a37bd2b207737e0500901c46e6f74c0034deffa3c9c2ea6c42af5a-merged.mount: Deactivated successfully. 
Dec 15 04:35:44 localhost systemd[1]: var-lib-containers-storage-overlay-1c99620ce928d1d7a7fa7a4a270012879db892360c109f88ecf7a139ea7db3ab-merged.mount: Deactivated successfully. Dec 15 04:35:44 localhost systemd[1]: 9f0f481c937606298203797a71924b7614a5d9e3f4a8afd837cfa5b9602aedeb.service: Deactivated successfully. Dec 15 04:35:46 localhost nova_compute[231752]: 2025-12-15 09:35:46.531 231756 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 04:35:46 localhost systemd[1]: Started /usr/bin/podman healthcheck run 67ee72fcd99c896f79e8807764b1d0f04e7d45927d06e306c0986394e8ed42a0. Dec 15 04:35:46 localhost podman[245455]: 2025-12-15 09:35:46.735242613 +0000 UTC m=+0.070041831 container health_status 67ee72fcd99c896f79e8807764b1d0f04e7d45927d06e306c0986394e8ed42a0 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, managed_by=edpm_ansible, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': 
['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors ) Dec 15 04:35:46 localhost podman[245455]: 2025-12-15 09:35:46.747267121 +0000 UTC m=+0.082066309 container exec_died 67ee72fcd99c896f79e8807764b1d0f04e7d45927d06e306c0986394e8ed42a0 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, managed_by=edpm_ansible, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors ) Dec 15 04:35:46 localhost systemd[1]: var-lib-containers-storage-overlay-3f01a5f11d308182c9ef96830a09f87e28c35e55cefdcd5aaa0bea98e3111a1e-merged.mount: Deactivated successfully. 
Dec 15 04:35:46 localhost systemd[1]: var-lib-containers-storage-overlay-e286d84738a37bd2b207737e0500901c46e6f74c0034deffa3c9c2ea6c42af5a-merged.mount: Deactivated successfully. Dec 15 04:35:47 localhost systemd[1]: var-lib-containers-storage-overlay-e286d84738a37bd2b207737e0500901c46e6f74c0034deffa3c9c2ea6c42af5a-merged.mount: Deactivated successfully. Dec 15 04:35:47 localhost systemd[1]: 67ee72fcd99c896f79e8807764b1d0f04e7d45927d06e306c0986394e8ed42a0.service: Deactivated successfully. Dec 15 04:35:47 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:fc:1f:13 MACDST=fa:16:3e:b3:bb:e3 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=45991 DF PROTO=TCP SPT=42368 DPT=9882 SEQ=1630756466 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A839FF7250000000001030307) Dec 15 04:35:47 localhost nova_compute[231752]: 2025-12-15 09:35:47.304 231756 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 04:35:47 localhost nova_compute[231752]: 2025-12-15 09:35:47.832 231756 DEBUG oslo_service.periodic_task [None req-c7fd4a3f-c95b-4936-a071-10d43fe8d19c - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Dec 15 04:35:47 localhost nova_compute[231752]: 2025-12-15 09:35:47.833 231756 DEBUG oslo_service.periodic_task [None req-c7fd4a3f-c95b-4936-a071-10d43fe8d19c - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Dec 15 04:35:47 localhost nova_compute[231752]: 2025-12-15 09:35:47.833 231756 DEBUG nova.compute.manager [None req-c7fd4a3f-c95b-4936-a071-10d43fe8d19c - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m Dec 
15 04:35:47 localhost nova_compute[231752]: 2025-12-15 09:35:47.833 231756 DEBUG nova.compute.manager [None req-c7fd4a3f-c95b-4936-a071-10d43fe8d19c - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.116 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359', 'name': 'test', 'flavor': {'id': '2da0e147-aaa7-4bb9-a176-5fe1b15a32a0', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'image': {'id': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-00000002', 'OS-EXT-SRV-ATTR:host': 'np0005559462.localdomain', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': 'c785bf23f53946bc99867d8832a50266', 'user_id': '1ba5fce347b64bfebf995f187193f205', 'hostId': 'bf4d507aa00b3fcf643a7e0b883420632ac7fc96e8c7a756982e121d', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228 Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.117 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.drop in the context of pollsters Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.122 12 DEBUG ceilometer.compute.pollsters [-] 39ff1bd9-6f6b-44c8-bbec-a1fd9d196359/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.124 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': '1fba1bc5-483c-4c33-b793-ac3e5a0f430d', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '1ba5fce347b64bfebf995f187193f205', 'user_name': None, 'project_id': 'c785bf23f53946bc99867d8832a50266', 'project_name': None, 'resource_id': 'instance-00000002-39ff1bd9-6f6b-44c8-bbec-a1fd9d196359-tap03ef8889-32', 'timestamp': '2025-12-15T09:35:48.117632', 'resource_metadata': {'display_name': 'test', 'name': 'tap03ef8889-32', 'instance_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359', 'instance_type': 'm1.small', 'host': 'bf4d507aa00b3fcf643a7e0b883420632ac7fc96e8c7a756982e121d', 'instance_host': 'np0005559462.localdomain', 'flavor': {'id': '2da0e147-aaa7-4bb9-a176-5fe1b15a32a0', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e'}, 'image_ref': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:74:df:7c', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap03ef8889-32'}, 'message_id': '7007335e-d999-11f0-817e-fa163ebaca0f', 'monotonic_time': 10414.310279707, 'message_signature': '6ffbb84b97b71608fd8aa2187973ffca5c69de4056e9efbb204531f99c178e46'}]}, 'timestamp': '2025-12-15 09:35:48.123115', '_unique_id': '41e4bba8361242e693d47e93ad4021e5'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.124 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 15 04:35:48 localhost 
ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.124 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.124 12 ERROR oslo_messaging.notify.messaging yield Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.124 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.124 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.124 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.124 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.124 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.124 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.124 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.124 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.124 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.124 12 ERROR oslo_messaging.notify.messaging conn.connect() Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.124 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.124 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.124 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.124 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.124 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.124 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.124 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.124 12 ERROR oslo_messaging.notify.messaging Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.124 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.124 12 ERROR oslo_messaging.notify.messaging Dec 15 04:35:48 localhost 
ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.124 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.124 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.124 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.124 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.124 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.124 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.124 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.124 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.124 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.124 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection 
Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.124 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.124 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.124 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.124 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.124 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.124 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.124 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.124 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.124 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.124 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Dec 15 04:35:48 
localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.124 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.124 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.124 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.124 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.124 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.124 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.124 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.124 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.124 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.124 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.124 12 ERROR oslo_messaging.notify.messaging Dec 15 04:35:48 localhost 
ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.124 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no new resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193 Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.124 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes in the context of pollsters Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.124 12 DEBUG ceilometer.compute.pollsters [-] 39ff1bd9-6f6b-44c8-bbec-a1fd9d196359/network.outgoing.bytes volume: 11272 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.125 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '0e25133b-b292-4c98-a7ba-50300caf3a64', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 11272, 'user_id': '1ba5fce347b64bfebf995f187193f205', 'user_name': None, 'project_id': 'c785bf23f53946bc99867d8832a50266', 'project_name': None, 'resource_id': 'instance-00000002-39ff1bd9-6f6b-44c8-bbec-a1fd9d196359-tap03ef8889-32', 'timestamp': '2025-12-15T09:35:48.124783', 'resource_metadata': {'display_name': 'test', 'name': 'tap03ef8889-32', 'instance_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359', 'instance_type': 'm1.small', 'host': 'bf4d507aa00b3fcf643a7e0b883420632ac7fc96e8c7a756982e121d', 'instance_host': 'np0005559462.localdomain', 'flavor': {'id': '2da0e147-aaa7-4bb9-a176-5fe1b15a32a0', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 
'7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e'}, 'image_ref': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:74:df:7c', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap03ef8889-32'}, 'message_id': '7007817e-d999-11f0-817e-fa163ebaca0f', 'monotonic_time': 10414.310279707, 'message_signature': '29bb060743253dac2cff389c8ad6c7e15af2ca7ed81e0a1db762b46745e7e066'}]}, 'timestamp': '2025-12-15 09:35:48.125043', '_unique_id': '9195e2fbe7264f93af7bd0e51a0ad62d'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.125 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.125 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.125 12 ERROR oslo_messaging.notify.messaging yield Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.125 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.125 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.125 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.125 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Dec 15 04:35:48 localhost 
ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.125 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.125 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.125 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.125 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.125 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.125 12 ERROR oslo_messaging.notify.messaging conn.connect()
Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.125 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.125 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.125 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.125 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.125 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.125 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.125 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.125 12 ERROR oslo_messaging.notify.messaging
Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.125 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.125 12 ERROR oslo_messaging.notify.messaging
Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.125 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.125 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.125 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.125 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.125 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.125 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.125 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.125 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.125 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.125 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.125 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.125 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.125 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.125 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.125 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.125 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.125 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.125 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.125 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.125 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.125 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.125 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.125 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.125 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.125 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.125 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.125 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.125 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.125 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.125 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.125 12 ERROR oslo_messaging.notify.messaging
Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.125 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets in the context of pollsters
Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.126 12 DEBUG ceilometer.compute.pollsters [-] 39ff1bd9-6f6b-44c8-bbec-a1fd9d196359/network.outgoing.packets volume: 129 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.126 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'f2aba157-b5c2-4651-91f4-bca1f5f4f781', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 129, 'user_id': '1ba5fce347b64bfebf995f187193f205', 'user_name': None, 'project_id': 'c785bf23f53946bc99867d8832a50266', 'project_name': None, 'resource_id': 'instance-00000002-39ff1bd9-6f6b-44c8-bbec-a1fd9d196359-tap03ef8889-32', 'timestamp': '2025-12-15T09:35:48.126017', 'resource_metadata': {'display_name': 'test', 'name': 'tap03ef8889-32', 'instance_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359', 'instance_type': 'm1.small', 'host': 'bf4d507aa00b3fcf643a7e0b883420632ac7fc96e8c7a756982e121d', 'instance_host': 'np0005559462.localdomain', 'flavor': {'id': '2da0e147-aaa7-4bb9-a176-5fe1b15a32a0', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e'}, 'image_ref': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:74:df:7c', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap03ef8889-32'}, 'message_id': '7007b0ae-d999-11f0-817e-fa163ebaca0f', 'monotonic_time': 10414.310279707, 'message_signature': 'c7fc6d7e2d126e715e55db6c720c19a2d1cf1afe194496a8772c6ad71c94eb91'}]}, 'timestamp': '2025-12-15 09:35:48.126229', '_unique_id': '4f78117a41f8426f97c73ecd113baff8'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.126 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.126 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.126 12 ERROR oslo_messaging.notify.messaging yield
Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.126 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.126 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.126 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.126 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.126 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.126 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.126 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.126 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.126 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.126 12 ERROR oslo_messaging.notify.messaging conn.connect()
Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.126 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.126 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.126 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.126 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.126 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.126 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.126 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.126 12 ERROR oslo_messaging.notify.messaging
Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.126 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.126 12 ERROR oslo_messaging.notify.messaging
Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.126 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.126 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.126 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.126 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.126 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.126 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.126 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.126 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.126 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.126 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.126 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.126 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.126 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.126 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.126 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.126 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.126 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.126 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.126 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.126 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.126 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.126 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.126 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.126 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.126 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.126 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.126 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.126 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.126 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.126 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.126 12 ERROR oslo_messaging.notify.messaging
Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.127 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no new resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.127 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.bytes in the context of pollsters
Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.155 12 DEBUG ceilometer.compute.pollsters [-] 39ff1bd9-6f6b-44c8-bbec-a1fd9d196359/disk.device.write.bytes volume: 73912320 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.155 12 DEBUG ceilometer.compute.pollsters [-] 39ff1bd9-6f6b-44c8-bbec-a1fd9d196359/disk.device.write.bytes volume: 512 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.156 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '44df8f33-59d8-4234-80d5-1d2fd2680170', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 73912320, 'user_id': '1ba5fce347b64bfebf995f187193f205', 'user_name': None, 'project_id': 'c785bf23f53946bc99867d8832a50266', 'project_name': None, 'resource_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359-vda', 'timestamp': '2025-12-15T09:35:48.127245', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359', 'instance_type': 'm1.small', 'host': 'bf4d507aa00b3fcf643a7e0b883420632ac7fc96e8c7a756982e121d', 'instance_host': 'np0005559462.localdomain', 'flavor': {'id': '2da0e147-aaa7-4bb9-a176-5fe1b15a32a0', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e'}, 'image_ref': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '700c35c0-d999-11f0-817e-fa163ebaca0f', 'monotonic_time': 10414.319901508, 'message_signature': 'f699e3c7f51b85b0ee9fe9facb11d3b8295bb9925e1e9a8f2ad3f67b0a0a73a4'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 512, 'user_id': '1ba5fce347b64bfebf995f187193f205', 'user_name': None, 'project_id': 'c785bf23f53946bc99867d8832a50266', 'project_name': None, 'resource_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359-vdb', 'timestamp': '2025-12-15T09:35:48.127245', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359', 'instance_type': 'm1.small', 'host': 'bf4d507aa00b3fcf643a7e0b883420632ac7fc96e8c7a756982e121d', 'instance_host': 'np0005559462.localdomain', 'flavor': {'id': '2da0e147-aaa7-4bb9-a176-5fe1b15a32a0', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e'}, 'image_ref': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '700c4042-d999-11f0-817e-fa163ebaca0f', 'monotonic_time': 10414.319901508, 'message_signature': '95e2a9b0c210ad10a7278bb6b785fbd81e279ed9947e12e27ae9d7609346d531'}]}, 'timestamp': '2025-12-15 09:35:48.156122', '_unique_id': 'c223208b464b4ba7b261e3fe396fca99'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.156 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.156 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.156 12 ERROR oslo_messaging.notify.messaging yield
Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.156 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.156 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.156 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.156 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.156 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.156 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.156 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.156 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.156 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.156 12 ERROR oslo_messaging.notify.messaging conn.connect()
Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.156 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.156 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.156 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.156 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.156 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.156 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.156 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.156 12 ERROR oslo_messaging.notify.messaging
Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.156 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.156 12 ERROR oslo_messaging.notify.messaging
Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.156 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.156 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.156 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.156 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.156 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.156 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.156 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.156 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.156 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.156 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.156 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.156 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.156 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.156 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.156 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.156 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.156 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.156 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.156 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.156 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.156 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.156 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.156 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.156 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.156 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.156 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.156 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.156 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.156 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.156 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.156 12 ERROR oslo_messaging.notify.messaging
Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.157 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.requests in the context of pollsters
Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.157 12 DEBUG ceilometer.compute.pollsters [-] 39ff1bd9-6f6b-44c8-bbec-a1fd9d196359/disk.device.read.requests volume: 1064 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.157 12 DEBUG ceilometer.compute.pollsters [-] 39ff1bd9-6f6b-44c8-bbec-a1fd9d196359/disk.device.read.requests volume: 222 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.158 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '08a6130c-a4d1-4b41-937f-74143cef71df', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1064, 'user_id': '1ba5fce347b64bfebf995f187193f205', 'user_name': None, 'project_id': 'c785bf23f53946bc99867d8832a50266', 'project_name': None, 'resource_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359-vda', 'timestamp': '2025-12-15T09:35:48.157624', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359', 'instance_type': 'm1.small', 'host': 'bf4d507aa00b3fcf643a7e0b883420632ac7fc96e8c7a756982e121d', 'instance_host': 'np0005559462.localdomain', 'flavor': {'id': '2da0e147-aaa7-4bb9-a176-5fe1b15a32a0', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e'}, 'image_ref': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '700c834a-d999-11f0-817e-fa163ebaca0f', 'monotonic_time': 10414.319901508, 'message_signature': 'f658feb2379108723c80c7d7e5dd8b52176fce52320b346ead245255cb3e50cd'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 222, 'user_id': '1ba5fce347b64bfebf995f187193f205', 'user_name': None, 'project_id': 'c785bf23f53946bc99867d8832a50266', 'project_name': None, 'resource_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359-vdb', 'timestamp': '2025-12-15T09:35:48.157624', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359', 'instance_type': 'm1.small', 'host': 'bf4d507aa00b3fcf643a7e0b883420632ac7fc96e8c7a756982e121d', 'instance_host': 'np0005559462.localdomain', 'flavor': {'id': '2da0e147-aaa7-4bb9-a176-5fe1b15a32a0', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e'}, 'image_ref': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '700c8ab6-d999-11f0-817e-fa163ebaca0f', 'monotonic_time': 10414.319901508, 'message_signature': '1638947db882469a87dbd1f0a44ed8c7b230f71457199bd7e1cb96f5107b1515'}]}, 'timestamp': '2025-12-15 09:35:48.158029', '_unique_id': '39332df4b7e94ca083899492c97ebfac'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.158 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.158 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.158 12 ERROR oslo_messaging.notify.messaging yield
Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.158 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.158 12 ERROR oslo_messaging.notify.messaging return
retry_over_time( Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.158 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.158 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.158 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.158 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.158 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.158 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.158 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.158 12 ERROR oslo_messaging.notify.messaging conn.connect() Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.158 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.158 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.158 12 ERROR 
oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.158 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.158 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.158 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.158 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.158 12 ERROR oslo_messaging.notify.messaging Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.158 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.158 12 ERROR oslo_messaging.notify.messaging Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.158 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.158 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.158 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.158 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.158 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.158 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.158 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.158 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.158 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.158 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.158 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.158 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.158 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.158 
12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.158 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.158 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.158 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.158 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.158 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.158 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.158 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.158 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.158 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.158 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.158 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.158 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.158 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.158 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.158 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.158 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.158 12 ERROR oslo_messaging.notify.messaging Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.158 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.capacity in the context of pollsters Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.168 12 DEBUG ceilometer.compute.pollsters [-] 39ff1bd9-6f6b-44c8-bbec-a1fd9d196359/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.168 12 DEBUG ceilometer.compute.pollsters [-] 39ff1bd9-6f6b-44c8-bbec-a1fd9d196359/disk.device.capacity volume: 1073741824 
_stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.169 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '6dad4d45-6b82-4b71-93ff-22d35cbe0fa6', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '1ba5fce347b64bfebf995f187193f205', 'user_name': None, 'project_id': 'c785bf23f53946bc99867d8832a50266', 'project_name': None, 'resource_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359-vda', 'timestamp': '2025-12-15T09:35:48.159013', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359', 'instance_type': 'm1.small', 'host': 'bf4d507aa00b3fcf643a7e0b883420632ac7fc96e8c7a756982e121d', 'instance_host': 'np0005559462.localdomain', 'flavor': {'id': '2da0e147-aaa7-4bb9-a176-5fe1b15a32a0', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e'}, 'image_ref': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '700e2948-d999-11f0-817e-fa163ebaca0f', 'monotonic_time': 10414.351670792, 'message_signature': '69cd0c950c16db3eaf448088d3e76cfc9b575f1ffe0464b7a45e622eec077889'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '1ba5fce347b64bfebf995f187193f205', 'user_name': None, 'project_id': 
'c785bf23f53946bc99867d8832a50266', 'project_name': None, 'resource_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359-vdb', 'timestamp': '2025-12-15T09:35:48.159013', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359', 'instance_type': 'm1.small', 'host': 'bf4d507aa00b3fcf643a7e0b883420632ac7fc96e8c7a756982e121d', 'instance_host': 'np0005559462.localdomain', 'flavor': {'id': '2da0e147-aaa7-4bb9-a176-5fe1b15a32a0', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e'}, 'image_ref': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '700e3190-d999-11f0-817e-fa163ebaca0f', 'monotonic_time': 10414.351670792, 'message_signature': 'bb04397138d7967a580159099cfc0010ff3120d18b49aa9ad7465ed4adfc6065'}]}, 'timestamp': '2025-12-15 09:35:48.168836', '_unique_id': '4810ef7157424de4a49bf5afcb621928'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.169 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.169 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.169 12 ERROR oslo_messaging.notify.messaging yield Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.169 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 15 04:35:48 
localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.169 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.169 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.169 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.169 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.169 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.169 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.169 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.169 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.169 12 ERROR oslo_messaging.notify.messaging conn.connect() Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.169 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.169 12 ERROR oslo_messaging.notify.messaging 
self.transport.connect() Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.169 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.169 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.169 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.169 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.169 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.169 12 ERROR oslo_messaging.notify.messaging Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.169 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.169 12 ERROR oslo_messaging.notify.messaging Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.169 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.169 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.169 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Dec 15 04:35:48 localhost 
ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.169 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.169 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.169 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.169 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.169 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.169 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.169 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.169 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.169 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.169 12 ERROR oslo_messaging.notify.messaging self.connection = 
connection_pool.get(retry=retry) Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.169 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.169 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.169 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.169 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.169 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.169 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.169 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.169 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.169 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.169 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Dec 15 04:35:48 localhost 
ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.169 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.169 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.169 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.169 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.169 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.169 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.169 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.169 12 ERROR oslo_messaging.notify.messaging Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.169 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.delta in the context of pollsters Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.169 12 DEBUG ceilometer.compute.pollsters [-] 39ff1bd9-6f6b-44c8-bbec-a1fd9d196359/network.incoming.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.170 12 ERROR 
oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '542a2062-d2b5-4154-b399-8d52207b4786', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '1ba5fce347b64bfebf995f187193f205', 'user_name': None, 'project_id': 'c785bf23f53946bc99867d8832a50266', 'project_name': None, 'resource_id': 'instance-00000002-39ff1bd9-6f6b-44c8-bbec-a1fd9d196359-tap03ef8889-32', 'timestamp': '2025-12-15T09:35:48.169866', 'resource_metadata': {'display_name': 'test', 'name': 'tap03ef8889-32', 'instance_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359', 'instance_type': 'm1.small', 'host': 'bf4d507aa00b3fcf643a7e0b883420632ac7fc96e8c7a756982e121d', 'instance_host': 'np0005559462.localdomain', 'flavor': {'id': '2da0e147-aaa7-4bb9-a176-5fe1b15a32a0', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e'}, 'image_ref': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:74:df:7c', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap03ef8889-32'}, 'message_id': '700e6250-d999-11f0-817e-fa163ebaca0f', 'monotonic_time': 10414.310279707, 'message_signature': 'dcb7f3a9694f50ea8857dbbf24499715241b6a4ea79d7be1fd2db6775c0089f5'}]}, 'timestamp': '2025-12-15 09:35:48.170098', '_unique_id': '91981df930dd47eabeac60559f1fb4ca'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.170 12 ERROR oslo_messaging.notify.messaging Traceback 
(most recent call last): Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.170 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.170 12 ERROR oslo_messaging.notify.messaging yield Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.170 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.170 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.170 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.170 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.170 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.170 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.170 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.170 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.170 12 ERROR 
oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.170 12 ERROR oslo_messaging.notify.messaging conn.connect() Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.170 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.170 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.170 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.170 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.170 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.170 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.170 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.170 12 ERROR oslo_messaging.notify.messaging Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.170 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.170 12 ERROR oslo_messaging.notify.messaging Dec 
15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.170 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.170 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.170 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.170 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.170 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.170 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.170 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.170 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.170 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.170 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 
605, in _get_connection Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.170 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.170 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.170 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.170 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.170 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.170 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.170 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.170 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.170 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.170 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in 
ensure_connection Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.170 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.170 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.170 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.170 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.170 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.170 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.170 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.170 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.170 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.170 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.170 12 ERROR oslo_messaging.notify.messaging Dec 15 
04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.171 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets in the context of pollsters Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.171 12 DEBUG ceilometer.compute.pollsters [-] 39ff1bd9-6f6b-44c8-bbec-a1fd9d196359/network.incoming.packets volume: 82 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.171 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '47d014f2-bcdf-4ed2-bfa0-15dbf63cd8d2', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 82, 'user_id': '1ba5fce347b64bfebf995f187193f205', 'user_name': None, 'project_id': 'c785bf23f53946bc99867d8832a50266', 'project_name': None, 'resource_id': 'instance-00000002-39ff1bd9-6f6b-44c8-bbec-a1fd9d196359-tap03ef8889-32', 'timestamp': '2025-12-15T09:35:48.171183', 'resource_metadata': {'display_name': 'test', 'name': 'tap03ef8889-32', 'instance_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359', 'instance_type': 'm1.small', 'host': 'bf4d507aa00b3fcf643a7e0b883420632ac7fc96e8c7a756982e121d', 'instance_host': 'np0005559462.localdomain', 'flavor': {'id': '2da0e147-aaa7-4bb9-a176-5fe1b15a32a0', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e'}, 'image_ref': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 
'fa:16:3e:74:df:7c', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap03ef8889-32'}, 'message_id': '700e94d2-d999-11f0-817e-fa163ebaca0f', 'monotonic_time': 10414.310279707, 'message_signature': '54debf23f858db265c1ec2e448011f834e465a01109bc6eda04ee6015bd38eaf'}]}, 'timestamp': '2025-12-15 09:35:48.171389', '_unique_id': '16586b4570d44a4bba0627063832a338'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.171 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.171 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.171 12 ERROR oslo_messaging.notify.messaging yield Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.171 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.171 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.171 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.171 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.171 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 
2025-12-15 09:35:48.171 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.171 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.171 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.171 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.171 12 ERROR oslo_messaging.notify.messaging conn.connect() Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.171 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.171 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.171 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.171 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.171 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.171 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Dec 15 04:35:48 localhost 
ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.171 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.171 12 ERROR oslo_messaging.notify.messaging Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.171 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.171 12 ERROR oslo_messaging.notify.messaging Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.171 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.171 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.171 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.171 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.171 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.171 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.171 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Dec 15 04:35:48 
localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.171 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.171 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.171 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.171 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.171 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.171 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.171 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.171 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.171 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.171 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) 
Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.171 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.171 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.171 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.171 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.171 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.171 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.171 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.171 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.171 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.171 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.171 12 ERROR 
oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.171 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.171 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.171 12 ERROR oslo_messaging.notify.messaging Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.172 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.requests in the context of pollsters Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.172 12 DEBUG ceilometer.compute.pollsters [-] 39ff1bd9-6f6b-44c8-bbec-a1fd9d196359/disk.device.write.requests volume: 497 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.172 12 DEBUG ceilometer.compute.pollsters [-] 39ff1bd9-6f6b-44c8-bbec-a1fd9d196359/disk.device.write.requests volume: 1 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.173 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': '69c0ae64-9395-4c53-a358-0b1117e6c383', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 497, 'user_id': '1ba5fce347b64bfebf995f187193f205', 'user_name': None, 'project_id': 'c785bf23f53946bc99867d8832a50266', 'project_name': None, 'resource_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359-vda', 'timestamp': '2025-12-15T09:35:48.172328', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359', 'instance_type': 'm1.small', 'host': 'bf4d507aa00b3fcf643a7e0b883420632ac7fc96e8c7a756982e121d', 'instance_host': 'np0005559462.localdomain', 'flavor': {'id': '2da0e147-aaa7-4bb9-a176-5fe1b15a32a0', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e'}, 'image_ref': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '700ec16e-d999-11f0-817e-fa163ebaca0f', 'monotonic_time': 10414.319901508, 'message_signature': 'd903fab5b7173dabc44a0de9ec0859f1993500532072367dd59c804a9e488326'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1, 'user_id': '1ba5fce347b64bfebf995f187193f205', 'user_name': None, 'project_id': 'c785bf23f53946bc99867d8832a50266', 'project_name': None, 'resource_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359-vdb', 'timestamp': '2025-12-15T09:35:48.172328', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 
'39ff1bd9-6f6b-44c8-bbec-a1fd9d196359', 'instance_type': 'm1.small', 'host': 'bf4d507aa00b3fcf643a7e0b883420632ac7fc96e8c7a756982e121d', 'instance_host': 'np0005559462.localdomain', 'flavor': {'id': '2da0e147-aaa7-4bb9-a176-5fe1b15a32a0', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e'}, 'image_ref': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '700ec89e-d999-11f0-817e-fa163ebaca0f', 'monotonic_time': 10414.319901508, 'message_signature': '6e99def4c5e88a2be995b52d6ab70a6a8f3504fbbb8e0ca477c7a468bfcfb1df'}]}, 'timestamp': '2025-12-15 09:35:48.172722', '_unique_id': '5f9d471d46f74e2fa567987bcc6bdfad'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.173 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.173 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.173 12 ERROR oslo_messaging.notify.messaging yield Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.173 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.173 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.173 12 ERROR 
oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.173 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.173 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.173 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.173 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.173 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.173 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.173 12 ERROR oslo_messaging.notify.messaging conn.connect() Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.173 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.173 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.173 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Dec 15 
04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.173 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.173 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.173 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.173 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.173 12 ERROR oslo_messaging.notify.messaging Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.173 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.173 12 ERROR oslo_messaging.notify.messaging Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.173 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.173 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.173 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.173 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Dec 15 04:35:48 localhost 
ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.173 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.173 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.173 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.173 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.173 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.173 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.173 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.173 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.173 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.173 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, 
in get
Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.173 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.173 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.173 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.173 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.173 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.173 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.173 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.173 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.173 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.173 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.173 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.173 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.173 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.173 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.173 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.173 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.173 12 ERROR oslo_messaging.notify.messaging
Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.173 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.error in the context of pollsters
Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.173 12 DEBUG ceilometer.compute.pollsters [-] 39ff1bd9-6f6b-44c8-bbec-a1fd9d196359/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.174 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'd93e2291-802f-474f-8b2d-f9f4f0abf842', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '1ba5fce347b64bfebf995f187193f205', 'user_name': None, 'project_id': 'c785bf23f53946bc99867d8832a50266', 'project_name': None, 'resource_id': 'instance-00000002-39ff1bd9-6f6b-44c8-bbec-a1fd9d196359-tap03ef8889-32', 'timestamp': '2025-12-15T09:35:48.173677', 'resource_metadata': {'display_name': 'test', 'name': 'tap03ef8889-32', 'instance_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359', 'instance_type': 'm1.small', 'host': 'bf4d507aa00b3fcf643a7e0b883420632ac7fc96e8c7a756982e121d', 'instance_host': 'np0005559462.localdomain', 'flavor': {'id': '2da0e147-aaa7-4bb9-a176-5fe1b15a32a0', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e'}, 'image_ref': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:74:df:7c', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap03ef8889-32'}, 'message_id': '700ef63e-d999-11f0-817e-fa163ebaca0f', 'monotonic_time': 10414.310279707, 'message_signature': 'f0f1f47b3b956df2a8940359a016ac28174a379be54735675a31e7e660e26aa6'}]}, 'timestamp': '2025-12-15 09:35:48.173889', '_unique_id': '6f9ddaea5bb148539d42174b99f1e229'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.174 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.174 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.174 12 ERROR oslo_messaging.notify.messaging yield
Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.174 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.174 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.174 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.174 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.174 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.174 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.174 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.174 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.174 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.174 12 ERROR oslo_messaging.notify.messaging conn.connect()
Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.174 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.174 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.174 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.174 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.174 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.174 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.174 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.174 12 ERROR oslo_messaging.notify.messaging
Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.174 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.174 12 ERROR oslo_messaging.notify.messaging
Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.174 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.174 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.174 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.174 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.174 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.174 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.174 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.174 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.174 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.174 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.174 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.174 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.174 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.174 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.174 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.174 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.174 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.174 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.174 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.174 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.174 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.174 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.174 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.174 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.174 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.174 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.174 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.174 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.174 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.174 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.174 12 ERROR oslo_messaging.notify.messaging
Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.174 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.error in the context of pollsters
Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.174 12 DEBUG ceilometer.compute.pollsters [-] 39ff1bd9-6f6b-44c8-bbec-a1fd9d196359/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.175 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '91fd9f8a-7e14-4e67-b632-e34184da1bfe', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '1ba5fce347b64bfebf995f187193f205', 'user_name': None, 'project_id': 'c785bf23f53946bc99867d8832a50266', 'project_name': None, 'resource_id': 'instance-00000002-39ff1bd9-6f6b-44c8-bbec-a1fd9d196359-tap03ef8889-32', 'timestamp': '2025-12-15T09:35:48.174823', 'resource_metadata': {'display_name': 'test', 'name': 'tap03ef8889-32', 'instance_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359', 'instance_type': 'm1.small', 'host': 'bf4d507aa00b3fcf643a7e0b883420632ac7fc96e8c7a756982e121d', 'instance_host': 'np0005559462.localdomain', 'flavor': {'id': '2da0e147-aaa7-4bb9-a176-5fe1b15a32a0', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e'}, 'image_ref': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:74:df:7c', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap03ef8889-32'}, 'message_id': '700f22ee-d999-11f0-817e-fa163ebaca0f', 'monotonic_time': 10414.310279707, 'message_signature': '6e19a21eb42a7bfd076da98dfff9f9af0b6fa2bb6f0c2a74a8513a34cd8afe24'}]}, 'timestamp': '2025-12-15 09:35:48.175042', '_unique_id': '6245b76b15954c0e86a137b7d47f3d34'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.175 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.175 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.175 12 ERROR oslo_messaging.notify.messaging yield
Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.175 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.175 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.175 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.175 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.175 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.175 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.175 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.175 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.175 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.175 12 ERROR oslo_messaging.notify.messaging conn.connect()
Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.175 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.175 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.175 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.175 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.175 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.175 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.175 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.175 12 ERROR oslo_messaging.notify.messaging
Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.175 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.175 12 ERROR oslo_messaging.notify.messaging
Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.175 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.175 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.175 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.175 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.175 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.175 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.175 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.175 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.175 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.175 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.175 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.175 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.175 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.175 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.175 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.175 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.175 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.175 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.175 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.175 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.175 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.175 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.175 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.175 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.175 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.175 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.175 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.175 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.175 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.175 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.175 12 ERROR oslo_messaging.notify.messaging
Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.175 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.bytes in the context of pollsters
Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.175 12 DEBUG ceilometer.compute.pollsters [-] 39ff1bd9-6f6b-44c8-bbec-a1fd9d196359/disk.device.read.bytes volume: 29130240 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.176 12 DEBUG ceilometer.compute.pollsters [-] 39ff1bd9-6f6b-44c8-bbec-a1fd9d196359/disk.device.read.bytes volume: 4300800 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.176 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '2af60115-0f0e-48a5-aea6-eedf6d86fc1f', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 29130240, 'user_id': '1ba5fce347b64bfebf995f187193f205', 'user_name': None, 'project_id': 'c785bf23f53946bc99867d8832a50266', 'project_name': None, 'resource_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359-vda', 'timestamp': '2025-12-15T09:35:48.175953', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359', 'instance_type': 'm1.small', 'host': 'bf4d507aa00b3fcf643a7e0b883420632ac7fc96e8c7a756982e121d', 'instance_host': 'np0005559462.localdomain', 'flavor': {'id': '2da0e147-aaa7-4bb9-a176-5fe1b15a32a0', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e'}, 'image_ref': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '700f4ff8-d999-11f0-817e-fa163ebaca0f', 'monotonic_time': 10414.319901508, 'message_signature': '1aa4c6bd9ba9cdbf52f13b75e99b28cd7feb2c8e66cf71f78b6504fef80261db'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 4300800, 'user_id': '1ba5fce347b64bfebf995f187193f205', 'user_name': None, 'project_id': 'c785bf23f53946bc99867d8832a50266', 'project_name': None, 'resource_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359-vdb', 'timestamp': '2025-12-15T09:35:48.175953', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359', 'instance_type': 'm1.small', 'host': 'bf4d507aa00b3fcf643a7e0b883420632ac7fc96e8c7a756982e121d', 'instance_host': 'np0005559462.localdomain', 'flavor': {'id': '2da0e147-aaa7-4bb9-a176-5fe1b15a32a0', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e'}, 'image_ref': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '700f5728-d999-11f0-817e-fa163ebaca0f', 'monotonic_time': 10414.319901508, 'message_signature': '853737cf875d3366fbec8022f6f20f2cb0d1d0c574c1c16575542bb797b89255'}]}, 'timestamp': '2025-12-15 09:35:48.176366', '_unique_id': '5c0b43c28213403c9dadda29228db689'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.176 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.176 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.176 12 ERROR oslo_messaging.notify.messaging yield
Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.176 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.176 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.176 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.176 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.176 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.176 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.176 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.176 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.176 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.176 12 ERROR oslo_messaging.notify.messaging conn.connect()
Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.176 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.176 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.176 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.176 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.176 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.176 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.176 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.176 12 ERROR oslo_messaging.notify.messaging
Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.176 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.176 12 ERROR oslo_messaging.notify.messaging
Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.176 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.176 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.176 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.176 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.176 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.176 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.176 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.176 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.176 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.176 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.176 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.176 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.176 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.176 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98,
in get Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.176 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.176 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.176 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.176 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.176 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.176 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.176 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.176 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.176 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.176 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 15 04:35:48 localhost 
ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.176 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.176 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.176 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.176 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.176 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.176 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.176 12 ERROR oslo_messaging.notify.messaging Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.177 12 INFO ceilometer.polling.manager [-] Polling pollster cpu in the context of pollsters Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.190 12 DEBUG ceilometer.compute.pollsters [-] 39ff1bd9-6f6b-44c8-bbec-a1fd9d196359/cpu volume: 54230000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.191 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': 'e306a00a-16e7-49c5-a2c2-040fd9870174', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 54230000000, 'user_id': '1ba5fce347b64bfebf995f187193f205', 'user_name': None, 'project_id': 'c785bf23f53946bc99867d8832a50266', 'project_name': None, 'resource_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359', 'timestamp': '2025-12-15T09:35:48.177330', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359', 'instance_type': 'm1.small', 'host': 'bf4d507aa00b3fcf643a7e0b883420632ac7fc96e8c7a756982e121d', 'instance_host': 'np0005559462.localdomain', 'flavor': {'id': '2da0e147-aaa7-4bb9-a176-5fe1b15a32a0', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e'}, 'image_ref': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'cpu_number': 1}, 'message_id': '7011913c-d999-11f0-817e-fa163ebaca0f', 'monotonic_time': 10414.383304011, 'message_signature': '9614deff62c70365960ebfe96c2e00aa78208773f442a7b7782ddf935d180e3b'}]}, 'timestamp': '2025-12-15 09:35:48.190966', '_unique_id': '29eddc361bc5428589880e0ff414ae55'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.191 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.191 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in 
_reraise_as_library_errors Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.191 12 ERROR oslo_messaging.notify.messaging yield Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.191 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.191 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.191 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.191 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.191 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.191 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.191 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.191 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.191 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.191 12 
ERROR oslo_messaging.notify.messaging conn.connect() Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.191 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.191 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.191 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.191 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.191 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.191 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.191 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.191 12 ERROR oslo_messaging.notify.messaging Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.191 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.191 12 ERROR oslo_messaging.notify.messaging Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.191 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 
2025-12-15 09:35:48.191 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.191 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.191 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.191 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.191 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.191 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.191 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.191 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.191 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.191 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Dec 15 
04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.191 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.191 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.191 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.191 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.191 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.191 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.191 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.191 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.191 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.191 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Dec 15 04:35:48 localhost 
ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.191 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.191 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.191 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.191 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.191 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.191 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.191 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.191 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.191 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.191 12 ERROR oslo_messaging.notify.messaging Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.191 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no new resources found this cycle 
poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193 Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.192 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.latency in the context of pollsters Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.192 12 DEBUG ceilometer.compute.pollsters [-] 39ff1bd9-6f6b-44c8-bbec-a1fd9d196359/disk.device.write.latency volume: 213002426 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.192 12 DEBUG ceilometer.compute.pollsters [-] 39ff1bd9-6f6b-44c8-bbec-a1fd9d196359/disk.device.write.latency volume: 24733520 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.192 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': '5accd2fd-1dd5-4dbb-b35b-81f2fbd4964f', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 213002426, 'user_id': '1ba5fce347b64bfebf995f187193f205', 'user_name': None, 'project_id': 'c785bf23f53946bc99867d8832a50266', 'project_name': None, 'resource_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359-vda', 'timestamp': '2025-12-15T09:35:48.192070', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359', 'instance_type': 'm1.small', 'host': 'bf4d507aa00b3fcf643a7e0b883420632ac7fc96e8c7a756982e121d', 'instance_host': 'np0005559462.localdomain', 'flavor': {'id': '2da0e147-aaa7-4bb9-a176-5fe1b15a32a0', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e'}, 'image_ref': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '7011c4cc-d999-11f0-817e-fa163ebaca0f', 'monotonic_time': 10414.319901508, 'message_signature': 'b5c38c78700d2b04cc39cdf871f534bd9c33bd5d92722e5395ff4b742c1ae9b1'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 24733520, 'user_id': '1ba5fce347b64bfebf995f187193f205', 'user_name': None, 'project_id': 'c785bf23f53946bc99867d8832a50266', 'project_name': None, 'resource_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359-vdb', 'timestamp': '2025-12-15T09:35:48.192070', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 
'39ff1bd9-6f6b-44c8-bbec-a1fd9d196359', 'instance_type': 'm1.small', 'host': 'bf4d507aa00b3fcf643a7e0b883420632ac7fc96e8c7a756982e121d', 'instance_host': 'np0005559462.localdomain', 'flavor': {'id': '2da0e147-aaa7-4bb9-a176-5fe1b15a32a0', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e'}, 'image_ref': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '7011cc06-d999-11f0-817e-fa163ebaca0f', 'monotonic_time': 10414.319901508, 'message_signature': '0835b0f1f3d7fb18cce10d9e774ba61777e3bf50b0a9e221888986e4f8d9e78d'}]}, 'timestamp': '2025-12-15 09:35:48.192451', '_unique_id': '395e67b9d39e4653a4c404365a7658bd'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.192 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.192 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.192 12 ERROR oslo_messaging.notify.messaging yield Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.192 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.192 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.192 12 ERROR 
oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.192 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.192 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.192 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.192 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.192 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.192 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.192 12 ERROR oslo_messaging.notify.messaging conn.connect() Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.192 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.192 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.192 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Dec 15 
04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.192 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.192 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.192 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.192 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.192 12 ERROR oslo_messaging.notify.messaging Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.192 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.192 12 ERROR oslo_messaging.notify.messaging Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.192 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.192 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.192 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.192 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Dec 15 04:35:48 localhost 
ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.192 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.192 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.192 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.192 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.192 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.192 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.192 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.192 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.192 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.192 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, 
in get
Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.192 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.192 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.192 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.192 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.192 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.192 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.192 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.192 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.192 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.192 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.192 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.192 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.192 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.192 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.192 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.192 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.192 12 ERROR oslo_messaging.notify.messaging
Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.193 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no new resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.193 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.drop in the context of pollsters
Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.193 12 DEBUG ceilometer.compute.pollsters [-] 39ff1bd9-6f6b-44c8-bbec-a1fd9d196359/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.194 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '66055b8a-79c9-4673-96fb-5116088d1f65', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '1ba5fce347b64bfebf995f187193f205', 'user_name': None, 'project_id': 'c785bf23f53946bc99867d8832a50266', 'project_name': None, 'resource_id': 'instance-00000002-39ff1bd9-6f6b-44c8-bbec-a1fd9d196359-tap03ef8889-32', 'timestamp': '2025-12-15T09:35:48.193482', 'resource_metadata': {'display_name': 'test', 'name': 'tap03ef8889-32', 'instance_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359', 'instance_type': 'm1.small', 'host': 'bf4d507aa00b3fcf643a7e0b883420632ac7fc96e8c7a756982e121d', 'instance_host': 'np0005559462.localdomain', 'flavor': {'id': '2da0e147-aaa7-4bb9-a176-5fe1b15a32a0', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e'}, 'image_ref': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:74:df:7c', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap03ef8889-32'}, 'message_id': '7011fbe0-d999-11f0-817e-fa163ebaca0f', 'monotonic_time': 10414.310279707, 'message_signature': '85c6d7b80c73598e853594a7687e9fb24162abdc66e044bcf94e27c4c2359f80'}]}, 'timestamp': '2025-12-15 09:35:48.193689', '_unique_id': '48aac103f1514ffc9bcfc5f6bf1f2d5b'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.194 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.194 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.194 12 ERROR oslo_messaging.notify.messaging yield
Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.194 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.194 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.194 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.194 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.194 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.194 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.194 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.194 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.194 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.194 12 ERROR oslo_messaging.notify.messaging conn.connect()
Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.194 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.194 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.194 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.194 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.194 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.194 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.194 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.194 12 ERROR oslo_messaging.notify.messaging
Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.194 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.194 12 ERROR oslo_messaging.notify.messaging
Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.194 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.194 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.194 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.194 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.194 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.194 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.194 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.194 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.194 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.194 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.194 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.194 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.194 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.194 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.194 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.194 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.194 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.194 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.194 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.194 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.194 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.194 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.194 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.194 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.194 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.194 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.194 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.194 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.194 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.194 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.194 12 ERROR oslo_messaging.notify.messaging
Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.194 12 INFO ceilometer.polling.manager [-] Polling pollster memory.usage in the context of pollsters
Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.194 12 DEBUG ceilometer.compute.pollsters [-] 39ff1bd9-6f6b-44c8-bbec-a1fd9d196359/memory.usage volume: 52.3125 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.195 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '66ebcd56-199f-4e0e-8ac4-b25a70a0c45f', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'memory.usage', 'counter_type': 'gauge', 'counter_unit': 'MB', 'counter_volume': 52.3125, 'user_id': '1ba5fce347b64bfebf995f187193f205', 'user_name': None, 'project_id': 'c785bf23f53946bc99867d8832a50266', 'project_name': None, 'resource_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359', 'timestamp': '2025-12-15T09:35:48.194669', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359', 'instance_type': 'm1.small', 'host': 'bf4d507aa00b3fcf643a7e0b883420632ac7fc96e8c7a756982e121d', 'instance_host': 'np0005559462.localdomain', 'flavor': {'id': '2da0e147-aaa7-4bb9-a176-5fe1b15a32a0', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e'}, 'image_ref': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0}, 'message_id': '70122a2a-d999-11f0-817e-fa163ebaca0f', 'monotonic_time': 10414.383304011, 'message_signature': 'aa6929560a9ed380e810c3af652e8af406cee79412b4e85c3c8840402ab50011'}]}, 'timestamp': '2025-12-15 09:35:48.194866', '_unique_id': '2237c13ea7d44511bf57708e7bd84c5e'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.195 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.195 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.195 12 ERROR oslo_messaging.notify.messaging yield
Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.195 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.195 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.195 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.195 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.195 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.195 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.195 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.195 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.195 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.195 12 ERROR oslo_messaging.notify.messaging conn.connect()
Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.195 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.195 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.195 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.195 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.195 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.195 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.195 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.195 12 ERROR oslo_messaging.notify.messaging
Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.195 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.195 12 ERROR oslo_messaging.notify.messaging
Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.195 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.195 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.195 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.195 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.195 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.195 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.195 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.195 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.195 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.195 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.195 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.195 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.195 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.195 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.195 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.195 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.195 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.195 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.195 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.195 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.195 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.195 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.195 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.195 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.195 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.195 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.195 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.195 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.195 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.195 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.195 12 ERROR oslo_messaging.notify.messaging
Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.195 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes in the context of pollsters
Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.195 12 DEBUG ceilometer.compute.pollsters [-] 39ff1bd9-6f6b-44c8-bbec-a1fd9d196359/network.incoming.bytes volume: 8783 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.196 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'd82f046c-fcbe-41d6-a640-124b7e008249', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 8783, 'user_id': '1ba5fce347b64bfebf995f187193f205', 'user_name': None, 'project_id': 'c785bf23f53946bc99867d8832a50266', 'project_name': None, 'resource_id': 'instance-00000002-39ff1bd9-6f6b-44c8-bbec-a1fd9d196359-tap03ef8889-32', 'timestamp': '2025-12-15T09:35:48.195795', 'resource_metadata': {'display_name': 'test', 'name': 'tap03ef8889-32', 'instance_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359', 'instance_type': 'm1.small', 'host': 'bf4d507aa00b3fcf643a7e0b883420632ac7fc96e8c7a756982e121d', 'instance_host': 'np0005559462.localdomain', 'flavor': {'id': '2da0e147-aaa7-4bb9-a176-5fe1b15a32a0', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e'}, 'image_ref': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:74:df:7c', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap03ef8889-32'}, 'message_id': '70125630-d999-11f0-817e-fa163ebaca0f', 'monotonic_time': 10414.310279707, 'message_signature': '771fd627bd4a71bc11501d1817f19ae2ce2213193efb621679a5c9738730e1f2'}]}, 'timestamp': '2025-12-15 09:35:48.196017', '_unique_id': '2f1aa8b68d614429bc152b216a27b04f'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.196 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.196 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.196 12 ERROR oslo_messaging.notify.messaging yield
Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.196 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.196 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.196 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.196 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.196 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.196 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.196 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.196 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.196 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.196 12 ERROR oslo_messaging.notify.messaging conn.connect()
Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.196 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.196 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.196 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.196 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.196 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.196 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.196 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.196 12 ERROR oslo_messaging.notify.messaging
Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.196 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.196 12 ERROR oslo_messaging.notify.messaging
Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.196 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.196 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.196 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.196 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.196 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.196 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.196 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.196 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.196 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.196 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.196 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.196 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.196 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.196 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.196 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.196 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.196 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.196 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.196 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.196 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.196 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.196 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.196 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.196 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.196 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.196 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.196 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.196 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.196 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.196 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.196 12 ERROR oslo_messaging.notify.messaging
Dec 15 04:35:48 localhost
ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.196 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.usage in the context of pollsters Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.196 12 DEBUG ceilometer.compute.pollsters [-] 39ff1bd9-6f6b-44c8-bbec-a1fd9d196359/disk.device.usage volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.197 12 DEBUG ceilometer.compute.pollsters [-] 39ff1bd9-6f6b-44c8-bbec-a1fd9d196359/disk.device.usage volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.197 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '6c440cc1-ff16-49c6-a3dc-8ae00b1b1157', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '1ba5fce347b64bfebf995f187193f205', 'user_name': None, 'project_id': 'c785bf23f53946bc99867d8832a50266', 'project_name': None, 'resource_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359-vda', 'timestamp': '2025-12-15T09:35:48.196939', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359', 'instance_type': 'm1.small', 'host': 'bf4d507aa00b3fcf643a7e0b883420632ac7fc96e8c7a756982e121d', 'instance_host': 'np0005559462.localdomain', 'flavor': {'id': '2da0e147-aaa7-4bb9-a176-5fe1b15a32a0', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 
'7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e'}, 'image_ref': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '701283c6-d999-11f0-817e-fa163ebaca0f', 'monotonic_time': 10414.351670792, 'message_signature': 'a8640280e1d253b739c61dc6eec92219e23e97c93857b60877f33040260f744d'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '1ba5fce347b64bfebf995f187193f205', 'user_name': None, 'project_id': 'c785bf23f53946bc99867d8832a50266', 'project_name': None, 'resource_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359-vdb', 'timestamp': '2025-12-15T09:35:48.196939', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359', 'instance_type': 'm1.small', 'host': 'bf4d507aa00b3fcf643a7e0b883420632ac7fc96e8c7a756982e121d', 'instance_host': 'np0005559462.localdomain', 'flavor': {'id': '2da0e147-aaa7-4bb9-a176-5fe1b15a32a0', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e'}, 'image_ref': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '70128b00-d999-11f0-817e-fa163ebaca0f', 'monotonic_time': 10414.351670792, 'message_signature': '303ec1bdc48c88348abd23efd0d92bf417f58ace4ae5dc2606d68880fd4c38af'}]}, 'timestamp': '2025-12-15 09:35:48.197339', '_unique_id': 'f40cdd0828d541eea91c7b39461b70d1'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.197 12 ERROR 
oslo_messaging.notify.messaging Traceback (most recent call last): Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.197 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.197 12 ERROR oslo_messaging.notify.messaging yield Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.197 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.197 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.197 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.197 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.197 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.197 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.197 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.197 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Dec 15 04:35:48 localhost 
ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.197 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.197 12 ERROR oslo_messaging.notify.messaging conn.connect() Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.197 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.197 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.197 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.197 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.197 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.197 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.197 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.197 12 ERROR oslo_messaging.notify.messaging Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.197 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 
2025-12-15 09:35:48.197 12 ERROR oslo_messaging.notify.messaging Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.197 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.197 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.197 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.197 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.197 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.197 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.197 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.197 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.197 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.197 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.197 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.197 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.197 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.197 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.197 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.197 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.197 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.197 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.197 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.197 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.197 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.197 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.197 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.197 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.197 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.197 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.197 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.197 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.197 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.197 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 15 04:35:48 localhost 
ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.197 12 ERROR oslo_messaging.notify.messaging Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.198 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.allocation in the context of pollsters Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.198 12 DEBUG ceilometer.compute.pollsters [-] 39ff1bd9-6f6b-44c8-bbec-a1fd9d196359/disk.device.allocation volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.198 12 DEBUG ceilometer.compute.pollsters [-] 39ff1bd9-6f6b-44c8-bbec-a1fd9d196359/disk.device.allocation volume: 512 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.199 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': '910d9e81-f2ba-4fe1-a459-b08e0c025673', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '1ba5fce347b64bfebf995f187193f205', 'user_name': None, 'project_id': 'c785bf23f53946bc99867d8832a50266', 'project_name': None, 'resource_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359-vda', 'timestamp': '2025-12-15T09:35:48.198292', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359', 'instance_type': 'm1.small', 'host': 'bf4d507aa00b3fcf643a7e0b883420632ac7fc96e8c7a756982e121d', 'instance_host': 'np0005559462.localdomain', 'flavor': {'id': '2da0e147-aaa7-4bb9-a176-5fe1b15a32a0', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e'}, 'image_ref': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '7012b7a6-d999-11f0-817e-fa163ebaca0f', 'monotonic_time': 10414.351670792, 'message_signature': 'e8dea4f6991ba07f340dd6ca4e534e94e926e9cf17212fbdc70ad05cdeed7cbe'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 512, 'user_id': '1ba5fce347b64bfebf995f187193f205', 'user_name': None, 'project_id': 'c785bf23f53946bc99867d8832a50266', 'project_name': None, 'resource_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359-vdb', 'timestamp': '2025-12-15T09:35:48.198292', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359', 
'instance_type': 'm1.small', 'host': 'bf4d507aa00b3fcf643a7e0b883420632ac7fc96e8c7a756982e121d', 'instance_host': 'np0005559462.localdomain', 'flavor': {'id': '2da0e147-aaa7-4bb9-a176-5fe1b15a32a0', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e'}, 'image_ref': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '7012bed6-d999-11f0-817e-fa163ebaca0f', 'monotonic_time': 10414.351670792, 'message_signature': '8c7a6ccf22b6b0358ae08cf4da21e7e67b4dfa8444ebeae92b6c7b723b515a36'}]}, 'timestamp': '2025-12-15 09:35:48.198667', '_unique_id': '542697b404cb4dfa9c7afa1b9fca832a'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.199 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.199 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.199 12 ERROR oslo_messaging.notify.messaging yield Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.199 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.199 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.199 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.199 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.199 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.199 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.199 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.199 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.199 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.199 12 ERROR oslo_messaging.notify.messaging conn.connect() Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.199 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.199 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.199 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Dec 15 04:35:48 localhost 
ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.199 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.199 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.199 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.199 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.199 12 ERROR oslo_messaging.notify.messaging Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.199 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.199 12 ERROR oslo_messaging.notify.messaging Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.199 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.199 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.199 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.199 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 
09:35:48.199 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.199 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.199 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.199 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.199 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.199 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.199 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.199 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.199 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.199 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Dec 15 04:35:48 localhost 
ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.199 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.199 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.199 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.199 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.199 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.199 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.199 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.199 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.199 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.199 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 
09:35:48.199 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.199 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.199 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.199 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.199 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.199 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.199 12 ERROR oslo_messaging.notify.messaging Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.199 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.latency in the context of pollsters Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.199 12 DEBUG ceilometer.compute.pollsters [-] 39ff1bd9-6f6b-44c8-bbec-a1fd9d196359/disk.device.read.latency volume: 937264501 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.199 12 DEBUG ceilometer.compute.pollsters [-] 39ff1bd9-6f6b-44c8-bbec-a1fd9d196359/disk.device.read.latency volume: 204572919 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 
2025-12-15 09:35:48.200 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'a86a90d4-4d15-44a2-8ee5-5a527630225f', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 937264501, 'user_id': '1ba5fce347b64bfebf995f187193f205', 'user_name': None, 'project_id': 'c785bf23f53946bc99867d8832a50266', 'project_name': None, 'resource_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359-vda', 'timestamp': '2025-12-15T09:35:48.199608', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359', 'instance_type': 'm1.small', 'host': 'bf4d507aa00b3fcf643a7e0b883420632ac7fc96e8c7a756982e121d', 'instance_host': 'np0005559462.localdomain', 'flavor': {'id': '2da0e147-aaa7-4bb9-a176-5fe1b15a32a0', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e'}, 'image_ref': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '7012eb04-d999-11f0-817e-fa163ebaca0f', 'monotonic_time': 10414.319901508, 'message_signature': 'c2e1f9e62d27b09b7ae57dcb0ebd6add8e0711b812f227d851052b41fd99dc3d'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 204572919, 'user_id': '1ba5fce347b64bfebf995f187193f205', 'user_name': None, 'project_id': 'c785bf23f53946bc99867d8832a50266', 'project_name': None, 'resource_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359-vdb', 'timestamp': '2025-12-15T09:35:48.199608', 
'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359', 'instance_type': 'm1.small', 'host': 'bf4d507aa00b3fcf643a7e0b883420632ac7fc96e8c7a756982e121d', 'instance_host': 'np0005559462.localdomain', 'flavor': {'id': '2da0e147-aaa7-4bb9-a176-5fe1b15a32a0', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e'}, 'image_ref': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '7012f234-d999-11f0-817e-fa163ebaca0f', 'monotonic_time': 10414.319901508, 'message_signature': '458ce8979036b1898f0bb35483fad87f5cbfc8b448bb8ddee142f5685e05721e'}]}, 'timestamp': '2025-12-15 09:35:48.199999', '_unique_id': '7b4cb9dd34fd41d1b8e5330dcfe2d800'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.200 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.200 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.200 12 ERROR oslo_messaging.notify.messaging yield Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.200 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.200 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 15 04:35:48 localhost 
ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.200 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.200 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.200 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.200 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.200 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.200 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.200 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.200 12 ERROR oslo_messaging.notify.messaging conn.connect() Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.200 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.200 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.200 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.200 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.200 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.200 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.200 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.200 12 ERROR oslo_messaging.notify.messaging Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.200 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.200 12 ERROR oslo_messaging.notify.messaging Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.200 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.200 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.200 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.200 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 
134, in _send_notification Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.200 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.200 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.200 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.200 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.200 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.200 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.200 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.200 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.200 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.200 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.200 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.200 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.200 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.200 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.200 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.200 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.200 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.200 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.200 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.200 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", 
line 433, in _ensure_connection Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.200 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.200 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.200 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.200 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.200 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.200 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.200 12 ERROR oslo_messaging.notify.messaging Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.200 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.delta in the context of pollsters Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.200 12 DEBUG ceilometer.compute.pollsters [-] 39ff1bd9-6f6b-44c8-bbec-a1fd9d196359/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.201 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': '27fafa33-2ca2-4129-8f37-32b764b6b3fa', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '1ba5fce347b64bfebf995f187193f205', 'user_name': None, 'project_id': 'c785bf23f53946bc99867d8832a50266', 'project_name': None, 'resource_id': 'instance-00000002-39ff1bd9-6f6b-44c8-bbec-a1fd9d196359-tap03ef8889-32', 'timestamp': '2025-12-15T09:35:48.200944', 'resource_metadata': {'display_name': 'test', 'name': 'tap03ef8889-32', 'instance_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359', 'instance_type': 'm1.small', 'host': 'bf4d507aa00b3fcf643a7e0b883420632ac7fc96e8c7a756982e121d', 'instance_host': 'np0005559462.localdomain', 'flavor': {'id': '2da0e147-aaa7-4bb9-a176-5fe1b15a32a0', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e'}, 'image_ref': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:74:df:7c', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap03ef8889-32'}, 'message_id': '70132042-d999-11f0-817e-fa163ebaca0f', 'monotonic_time': 10414.310279707, 'message_signature': 'f0abaac1cbee85fdb6663405190ed88734285406b6622207430d2b066edb9260'}]}, 'timestamp': '2025-12-15 09:35:48.201173', '_unique_id': 'a885fdc900ef466fad705168ed9576a2'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.201 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 
2025-12-15 09:35:48.201 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.201 12 ERROR oslo_messaging.notify.messaging yield Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.201 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.201 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.201 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.201 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.201 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.201 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.201 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.201 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.201 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.201 12 ERROR oslo_messaging.notify.messaging conn.connect() Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.201 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.201 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.201 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.201 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.201 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.201 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.201 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.201 12 ERROR oslo_messaging.notify.messaging Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.201 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.201 12 ERROR oslo_messaging.notify.messaging Dec 15 04:35:48 localhost 
ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.201 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.201 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.201 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.201 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.201 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.201 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.201 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.201 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.201 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.201 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection 
Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.201 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.201 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.201 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.201 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.201 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.201 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.201 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.201 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.201 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.201 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Dec 15 04:35:48 
localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.201 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.201 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.201 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.201 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.201 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.201 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.201 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.201 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.201 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.201 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 15 04:35:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:35:48.201 12 ERROR oslo_messaging.notify.messaging Dec 15 04:35:48 localhost systemd[1]: 
var-lib-containers-storage-overlay-4c2a493cc38fe0c2d274b137f7d549c92d76e83cf216e797584fb8469937682d-merged.mount: Deactivated successfully. Dec 15 04:35:48 localhost systemd[1]: var-lib-containers-storage-overlay-3f01a5f11d308182c9ef96830a09f87e28c35e55cefdcd5aaa0bea98e3111a1e-merged.mount: Deactivated successfully. Dec 15 04:35:48 localhost nova_compute[231752]: 2025-12-15 09:35:48.422 231756 DEBUG oslo_concurrency.lockutils [None req-c7fd4a3f-c95b-4936-a071-10d43fe8d19c - - - - - -] Acquiring lock "refresh_cache-39ff1bd9-6f6b-44c8-bbec-a1fd9d196359" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m Dec 15 04:35:48 localhost nova_compute[231752]: 2025-12-15 09:35:48.423 231756 DEBUG oslo_concurrency.lockutils [None req-c7fd4a3f-c95b-4936-a071-10d43fe8d19c - - - - - -] Acquired lock "refresh_cache-39ff1bd9-6f6b-44c8-bbec-a1fd9d196359" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m Dec 15 04:35:48 localhost nova_compute[231752]: 2025-12-15 09:35:48.423 231756 DEBUG nova.network.neutron [None req-c7fd4a3f-c95b-4936-a071-10d43fe8d19c - - - - - -] [instance: 39ff1bd9-6f6b-44c8-bbec-a1fd9d196359] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m Dec 15 04:35:48 localhost nova_compute[231752]: 2025-12-15 09:35:48.423 231756 DEBUG nova.objects.instance [None req-c7fd4a3f-c95b-4936-a071-10d43fe8d19c - - - - - -] Lazy-loading 'info_cache' on Instance uuid 39ff1bd9-6f6b-44c8-bbec-a1fd9d196359 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m Dec 15 04:35:48 localhost systemd[1]: var-lib-containers-storage-overlay-3f01a5f11d308182c9ef96830a09f87e28c35e55cefdcd5aaa0bea98e3111a1e-merged.mount: Deactivated successfully. 
Dec 15 04:35:48 localhost nova_compute[231752]: 2025-12-15 09:35:48.876 231756 DEBUG nova.network.neutron [None req-c7fd4a3f-c95b-4936-a071-10d43fe8d19c - - - - - -] [instance: 39ff1bd9-6f6b-44c8-bbec-a1fd9d196359] Updating instance_info_cache with network_info: [{"id": "03ef8889-3216-43fb-8a52-4be17a956ce1", "address": "fa:16:3e:74:df:7c", "network": {"id": "befb7a72-17a9-4bcb-b561-84b8f626685a", "bridge": "br-int", "label": "private", "subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.201", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.20", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c785bf23f53946bc99867d8832a50266", "mtu": 1292, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap03ef8889-32", "ovs_interfaceid": "03ef8889-3216-43fb-8a52-4be17a956ce1", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m Dec 15 04:35:48 localhost nova_compute[231752]: 2025-12-15 09:35:48.898 231756 DEBUG oslo_concurrency.lockutils [None req-c7fd4a3f-c95b-4936-a071-10d43fe8d19c - - - - - -] Releasing lock "refresh_cache-39ff1bd9-6f6b-44c8-bbec-a1fd9d196359" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m Dec 15 04:35:48 localhost nova_compute[231752]: 2025-12-15 09:35:48.898 231756 DEBUG nova.compute.manager [None req-c7fd4a3f-c95b-4936-a071-10d43fe8d19c - - - - - -] [instance: 39ff1bd9-6f6b-44c8-bbec-a1fd9d196359] Updated the network info_cache 
for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m Dec 15 04:35:48 localhost nova_compute[231752]: 2025-12-15 09:35:48.899 231756 DEBUG oslo_service.periodic_task [None req-c7fd4a3f-c95b-4936-a071-10d43fe8d19c - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Dec 15 04:35:48 localhost nova_compute[231752]: 2025-12-15 09:35:48.900 231756 DEBUG oslo_service.periodic_task [None req-c7fd4a3f-c95b-4936-a071-10d43fe8d19c - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Dec 15 04:35:48 localhost nova_compute[231752]: 2025-12-15 09:35:48.900 231756 DEBUG oslo_service.periodic_task [None req-c7fd4a3f-c95b-4936-a071-10d43fe8d19c - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Dec 15 04:35:48 localhost nova_compute[231752]: 2025-12-15 09:35:48.900 231756 DEBUG oslo_service.periodic_task [None req-c7fd4a3f-c95b-4936-a071-10d43fe8d19c - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Dec 15 04:35:48 localhost nova_compute[231752]: 2025-12-15 09:35:48.900 231756 DEBUG oslo_service.periodic_task [None req-c7fd4a3f-c95b-4936-a071-10d43fe8d19c - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Dec 15 04:35:48 localhost nova_compute[231752]: 2025-12-15 09:35:48.901 231756 DEBUG oslo_service.periodic_task [None req-c7fd4a3f-c95b-4936-a071-10d43fe8d19c - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks 
/usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Dec 15 04:35:48 localhost nova_compute[231752]: 2025-12-15 09:35:48.901 231756 DEBUG nova.compute.manager [None req-c7fd4a3f-c95b-4936-a071-10d43fe8d19c - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m Dec 15 04:35:48 localhost nova_compute[231752]: 2025-12-15 09:35:48.901 231756 DEBUG oslo_service.periodic_task [None req-c7fd4a3f-c95b-4936-a071-10d43fe8d19c - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Dec 15 04:35:48 localhost nova_compute[231752]: 2025-12-15 09:35:48.921 231756 DEBUG oslo_concurrency.lockutils [None req-c7fd4a3f-c95b-4936-a071-10d43fe8d19c - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Dec 15 04:35:48 localhost nova_compute[231752]: 2025-12-15 09:35:48.921 231756 DEBUG oslo_concurrency.lockutils [None req-c7fd4a3f-c95b-4936-a071-10d43fe8d19c - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Dec 15 04:35:48 localhost nova_compute[231752]: 2025-12-15 09:35:48.922 231756 DEBUG oslo_concurrency.lockutils [None req-c7fd4a3f-c95b-4936-a071-10d43fe8d19c - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Dec 15 04:35:48 localhost nova_compute[231752]: 2025-12-15 09:35:48.922 231756 DEBUG nova.compute.resource_tracker [None req-c7fd4a3f-c95b-4936-a071-10d43fe8d19c 
- - - - - -] Auditing locally available compute resources for np0005559462.localdomain (node: np0005559462.localdomain) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m Dec 15 04:35:48 localhost nova_compute[231752]: 2025-12-15 09:35:48.923 231756 DEBUG oslo_concurrency.processutils [None req-c7fd4a3f-c95b-4936-a071-10d43fe8d19c - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m Dec 15 04:35:49 localhost systemd[1]: var-lib-containers-storage-overlay-102653142e2259aa6223045dee7736729104ac8aed3ce9b3c87a6d0787e59de8-merged.mount: Deactivated successfully. Dec 15 04:35:49 localhost systemd[1]: Started /usr/bin/podman healthcheck run b3d8483ade539500e228c2af9c0c3e647febb5ec7903610a83aea32631007d4a. Dec 15 04:35:49 localhost systemd[1]: var-lib-containers-storage-overlay-4c2a493cc38fe0c2d274b137f7d549c92d76e83cf216e797584fb8469937682d-merged.mount: Deactivated successfully. 
Dec 15 04:35:49 localhost nova_compute[231752]: 2025-12-15 09:35:49.453 231756 DEBUG oslo_concurrency.processutils [None req-c7fd4a3f-c95b-4936-a071-10d43fe8d19c - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.531s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m Dec 15 04:41:04 localhost openstack_network_exporter[246484]: ERROR 09:41:04 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server Dec 15 04:41:04 localhost openstack_network_exporter[246484]: ERROR 09:41:04 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd Dec 15 04:41:04 localhost openstack_network_exporter[246484]: ERROR 09:41:04 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd Dec 15 04:41:04 localhost openstack_network_exporter[246484]: ERROR 09:41:04 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath Dec 15 04:41:04 localhost openstack_network_exporter[246484]: Dec 15 04:41:04 localhost openstack_network_exporter[246484]: ERROR 09:41:04 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath Dec 15 04:41:04 localhost openstack_network_exporter[246484]: Dec 15 04:41:05 localhost rsyslogd[759]: imjournal: 4634 messages lost due to rate-limiting (20000 allowed within 600 seconds) Dec 15 04:41:07 localhost systemd[1]: Started /usr/bin/podman healthcheck run 730c50020a7d9e2da21d2462d40af20ac303e43f3491f692cbbd87f432ba3e09. 
Dec 15 04:41:07 localhost podman[266308]: 2025-12-15 09:41:07.385895581 +0000 UTC m=+0.086132626 container health_status 730c50020a7d9e2da21d2462d40af20ac303e43f3491f692cbbd87f432ba3e09 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, container_name=openstack_network_exporter, config_id=openstack_network_exporter, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.component=ubi9-minimal-container, io.openshift.expose-services=, io.openshift.tags=minimal rhel9, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., architecture=x86_64, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'cb1e9478634a00c8fb5ad9cd7fcd38c7b2a1a85187dfaa618bc715c24c2b15fb'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, managed_by=edpm_ansible, name=ubi9-minimal, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. 
This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vendor=Red Hat, Inc., distribution-scope=public, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., vcs-type=git, release=1755695350, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-08-20T13:12:41, io.buildah.version=1.33.7, version=9.6, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal) Dec 15 04:41:07 localhost podman[266308]: 2025-12-15 09:41:07.398057707 +0000 UTC m=+0.098294772 container exec_died 730c50020a7d9e2da21d2462d40af20ac303e43f3491f692cbbd87f432ba3e09 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, distribution-scope=public, io.buildah.version=1.33.7, maintainer=Red Hat, Inc., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., version=9.6, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, name=ubi9-minimal, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, url=https://catalog.redhat.com/en/search?searchType=containers, vendor=Red Hat, Inc., build-date=2025-08-20T13:12:41, io.openshift.expose-services=, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 
'cb1e9478634a00c8fb5ad9cd7fcd38c7b2a1a85187dfaa618bc715c24c2b15fb'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, container_name=openstack_network_exporter, config_id=openstack_network_exporter, io.openshift.tags=minimal rhel9, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, release=1755695350, com.redhat.component=ubi9-minimal-container, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., architecture=x86_64) Dec 15 04:41:07 localhost systemd[1]: 730c50020a7d9e2da21d2462d40af20ac303e43f3491f692cbbd87f432ba3e09.service: Deactivated successfully. 
Dec 15 04:41:07 localhost python3[266332]: ansible-edpm_container_manage Invoked with concurrency=1 config_dir=/var/lib/edpm-config/container-startup-config/neutron_dhcp config_id=neutron_dhcp config_overrides={} config_patterns=*.json containers=['neutron_dhcp_agent'] log_base_path=/var/log/containers/stdouts debug=False Dec 15 04:41:07 localhost nova_compute[231752]: 2025-12-15 09:41:07.816 231756 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 04:41:07 localhost podman[266378]: Dec 15 04:41:07 localhost podman[266378]: 2025-12-15 09:41:07.874772167 +0000 UTC m=+0.081047135 container create a06657c080932aedd3bea4666dab45538f60b9902aca65881eb6fdb625848af1 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron_dhcp_agent, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '07f3af15d79276418f54a8dde2fc251064527c3571430e14af77aaa2c01f1193-f5443fb6795d0c7ba01877e7655bd75302d51a135a5603013e710635161b4a46'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/netns:/run/netns:shared', '/var/lib/openstack/neutron-dhcp-agent:/etc/neutron.conf.d:z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/kolla/config_files/neutron_dhcp_agent.json:/var/lib/kolla/config_files/config.json:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron/dhcp_agent_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/dhcp_agent_dnsmasq_wrapper:/usr/local/bin/dnsmasq:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-dhcp/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z']}, 
org.label-schema.license=GPLv2, container_name=neutron_dhcp_agent, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251202, managed_by=edpm_ansible, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, config_id=neutron_dhcp, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.name=CentOS Stream 9 Base Image) Dec 15 04:41:07 localhost podman[266378]: 2025-12-15 09:41:07.843569473 +0000 UTC m=+0.049844451 image pull quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified Dec 15 04:41:07 localhost python3[266332]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman create --name neutron_dhcp_agent --cgroupns=host --conmon-pidfile /run/neutron_dhcp_agent.pid --env KOLLA_CONFIG_STRATEGY=COPY_ALWAYS --env EDPM_CONFIG_HASH=07f3af15d79276418f54a8dde2fc251064527c3571430e14af77aaa2c01f1193-f5443fb6795d0c7ba01877e7655bd75302d51a135a5603013e710635161b4a46 --label config_id=neutron_dhcp --label container_name=neutron_dhcp_agent --label managed_by=edpm_ansible --label config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '07f3af15d79276418f54a8dde2fc251064527c3571430e14af77aaa2c01f1193-f5443fb6795d0c7ba01877e7655bd75302d51a135a5603013e710635161b4a46'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/netns:/run/netns:shared', '/var/lib/openstack/neutron-dhcp-agent:/etc/neutron.conf.d:z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/kolla/config_files/neutron_dhcp_agent.json:/var/lib/kolla/config_files/config.json:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron/dhcp_agent_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/dhcp_agent_dnsmasq_wrapper:/usr/local/bin/dnsmasq:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', 
'/var/lib/openstack/cacerts/neutron-dhcp/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z']} --log-driver journald --log-level info --network host --pid host --privileged=True --user root --volume /run/netns:/run/netns:shared --volume /var/lib/openstack/neutron-dhcp-agent:/etc/neutron.conf.d:z --volume /var/lib/neutron:/var/lib/neutron:shared,z --volume /var/lib/kolla/config_files/neutron_dhcp_agent.json:/var/lib/kolla/config_files/config.json:ro --volume /run/openvswitch:/run/openvswitch:shared,z --volume /var/lib/neutron/dhcp_agent_haproxy_wrapper:/usr/local/bin/haproxy:ro --volume /var/lib/neutron/dhcp_agent_dnsmasq_wrapper:/usr/local/bin/dnsmasq:ro --volume /var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro --volume /var/lib/openstack/cacerts/neutron-dhcp/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified Dec 15 04:41:08 localhost systemd[1]: Started /usr/bin/podman healthcheck run ac2eae30a2a7f971398af42ea67b3ed799d4b3341f7688ebcd73f7ca7f66c2a8. Dec 15 04:41:08 localhost systemd[1]: tmp-crun.pLNqAm.mount: Deactivated successfully. 
Dec 15 04:41:08 localhost podman[266526]: 2025-12-15 09:41:08.655251418 +0000 UTC m=+0.091264898 container health_status ac2eae30a2a7f971398af42ea67b3ed799d4b3341f7688ebcd73f7ca7f66c2a8 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, config_id=ovn_controller, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '07f3af15d79276418f54a8dde2fc251064527c3571430e14af77aaa2c01f1193'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2) Dec 15 04:41:08 localhost podman[266526]: 2025-12-15 09:41:08.72862273 +0000 UTC m=+0.164636250 container exec_died ac2eae30a2a7f971398af42ea67b3ed799d4b3341f7688ebcd73f7ca7f66c2a8 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, managed_by=edpm_ansible, org.label-schema.build-date=20251202, container_name=ovn_controller, org.label-schema.license=GPLv2, 
org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ovn_controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '07f3af15d79276418f54a8dde2fc251064527c3571430e14af77aaa2c01f1193'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true) Dec 15 04:41:08 localhost systemd[1]: ac2eae30a2a7f971398af42ea67b3ed799d4b3341f7688ebcd73f7ca7f66c2a8.service: Deactivated successfully. 
Dec 15 04:41:08 localhost python3.9[266525]: ansible-ansible.builtin.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1 Dec 15 04:41:09 localhost python3.9[266662]: ansible-file Invoked with path=/etc/systemd/system/edpm_neutron_dhcp_agent.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Dec 15 04:41:09 localhost python3.9[266717]: ansible-stat Invoked with path=/etc/systemd/system/edpm_neutron_dhcp_agent_healthcheck.timer follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1 Dec 15 04:41:10 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:fc:1f:13 MACDST=fa:16:3e:b3:bb:e3 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=29111 DF PROTO=TCP SPT=53922 DPT=9102 SEQ=3658843121 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A83A4E5250000000001030307) Dec 15 04:41:10 localhost python3.9[266826]: ansible-copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1765791670.0271544-1367-256284337143156/source dest=/etc/systemd/system/edpm_neutron_dhcp_agent.service mode=0644 owner=root group=root backup=False force=True remote_src=False follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None Dec 15 04:41:11 localhost python3.9[266881]: ansible-systemd Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None Dec 15 
04:41:11 localhost systemd[1]: Reloading. Dec 15 04:41:11 localhost systemd-sysv-generator[266906]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. Dec 15 04:41:11 localhost systemd-rc-local-generator[266902]: /etc/rc.d/rc.local is not marked executable, skipping. Dec 15 04:41:11 localhost systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload Dec 15 04:41:11 localhost systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload Dec 15 04:41:11 localhost systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload Dec 15 04:41:11 localhost systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload Dec 15 04:41:11 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. 
Dec 15 04:41:11 localhost systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload Dec 15 04:41:11 localhost systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload Dec 15 04:41:11 localhost systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload Dec 15 04:41:11 localhost systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload Dec 15 04:41:12 localhost python3.9[266971]: ansible-systemd Invoked with state=restarted name=edpm_neutron_dhcp_agent.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None Dec 15 04:41:12 localhost systemd[1]: Reloading. Dec 15 04:41:12 localhost systemd-rc-local-generator[266995]: /etc/rc.d/rc.local is not marked executable, skipping. Dec 15 04:41:12 localhost systemd-sysv-generator[267001]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. Dec 15 04:41:12 localhost systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload Dec 15 04:41:12 localhost systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload Dec 15 04:41:12 localhost systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload Dec 15 04:41:12 localhost systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload Dec 15 04:41:12 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. 
Support for MemoryLimit= will be removed soon. Dec 15 04:41:12 localhost systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload Dec 15 04:41:12 localhost systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload Dec 15 04:41:12 localhost systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload Dec 15 04:41:12 localhost systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload Dec 15 04:41:12 localhost systemd[1]: Starting neutron_dhcp_agent container... Dec 15 04:41:12 localhost nova_compute[231752]: 2025-12-15 09:41:12.820 231756 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Dec 15 04:41:12 localhost nova_compute[231752]: 2025-12-15 09:41:12.822 231756 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 04:41:12 localhost nova_compute[231752]: 2025-12-15 09:41:12.822 231756 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m Dec 15 04:41:12 localhost nova_compute[231752]: 2025-12-15 09:41:12.822 231756 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Dec 15 04:41:12 localhost nova_compute[231752]: 2025-12-15 09:41:12.823 231756 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Dec 15 04:41:12 localhost nova_compute[231752]: 2025-12-15 09:41:12.826 231756 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup 
/usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 04:41:12 localhost systemd[1]: Started libcrun container. Dec 15 04:41:12 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/b45158859b78cce8d769337cec5a3eea2a53af84afd750a564fcaef52b9bcfed/merged/etc/neutron.conf.d supports timestamps until 2038 (0x7fffffff) Dec 15 04:41:12 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/b45158859b78cce8d769337cec5a3eea2a53af84afd750a564fcaef52b9bcfed/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff) Dec 15 04:41:12 localhost podman[267012]: 2025-12-15 09:41:12.871186336 +0000 UTC m=+0.144726369 container init a06657c080932aedd3bea4666dab45538f60b9902aca65881eb6fdb625848af1 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron_dhcp_agent, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '07f3af15d79276418f54a8dde2fc251064527c3571430e14af77aaa2c01f1193-f5443fb6795d0c7ba01877e7655bd75302d51a135a5603013e710635161b4a46'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/netns:/run/netns:shared', '/var/lib/openstack/neutron-dhcp-agent:/etc/neutron.conf.d:z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/kolla/config_files/neutron_dhcp_agent.json:/var/lib/kolla/config_files/config.json:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron/dhcp_agent_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/dhcp_agent_dnsmasq_wrapper:/usr/local/bin/dnsmasq:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', 
'/var/lib/openstack/cacerts/neutron-dhcp/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z']}, config_id=neutron_dhcp, container_name=neutron_dhcp_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, io.buildah.version=1.41.3) Dec 15 04:41:12 localhost podman[267012]: 2025-12-15 09:41:12.878641402 +0000 UTC m=+0.152181445 container start a06657c080932aedd3bea4666dab45538f60b9902aca65881eb6fdb625848af1 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron_dhcp_agent, container_name=neutron_dhcp_agent, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '07f3af15d79276418f54a8dde2fc251064527c3571430e14af77aaa2c01f1193-f5443fb6795d0c7ba01877e7655bd75302d51a135a5603013e710635161b4a46'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/netns:/run/netns:shared', '/var/lib/openstack/neutron-dhcp-agent:/etc/neutron.conf.d:z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/kolla/config_files/neutron_dhcp_agent.json:/var/lib/kolla/config_files/config.json:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron/dhcp_agent_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/dhcp_agent_dnsmasq_wrapper:/usr/local/bin/dnsmasq:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-dhcp/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z']}, config_id=neutron_dhcp, maintainer=OpenStack Kubernetes Operator team, 
io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb) Dec 15 04:41:12 localhost podman[267012]: neutron_dhcp_agent Dec 15 04:41:12 localhost neutron_dhcp_agent[267027]: + sudo -E kolla_set_configs Dec 15 04:41:12 localhost systemd[1]: Started neutron_dhcp_agent container. Dec 15 04:41:12 localhost neutron_dhcp_agent[267027]: INFO:__main__:Loading config file at /var/lib/kolla/config_files/config.json Dec 15 04:41:12 localhost neutron_dhcp_agent[267027]: INFO:__main__:Validating config file Dec 15 04:41:12 localhost neutron_dhcp_agent[267027]: INFO:__main__:Kolla config strategy set to: COPY_ALWAYS Dec 15 04:41:12 localhost neutron_dhcp_agent[267027]: INFO:__main__:Copying service configuration files Dec 15 04:41:12 localhost neutron_dhcp_agent[267027]: INFO:__main__:Deleting /etc/neutron/rootwrap.conf Dec 15 04:41:12 localhost neutron_dhcp_agent[267027]: INFO:__main__:Copying /etc/neutron.conf.d/01-rootwrap.conf to /etc/neutron/rootwrap.conf Dec 15 04:41:12 localhost neutron_dhcp_agent[267027]: INFO:__main__:Setting permission for /etc/neutron/rootwrap.conf Dec 15 04:41:12 localhost neutron_dhcp_agent[267027]: INFO:__main__:Writing out command to execute Dec 15 04:41:12 localhost neutron_dhcp_agent[267027]: INFO:__main__:Setting permission for /var/lib/neutron Dec 15 04:41:12 localhost neutron_dhcp_agent[267027]: INFO:__main__:Setting permission for /var/lib/neutron/kill_scripts Dec 15 04:41:12 localhost neutron_dhcp_agent[267027]: INFO:__main__:Setting permission for /var/lib/neutron/.cache Dec 15 04:41:12 localhost neutron_dhcp_agent[267027]: INFO:__main__:Setting permission for /var/lib/neutron/external Dec 15 04:41:12 localhost neutron_dhcp_agent[267027]: INFO:__main__:Setting permission for /var/lib/neutron/ovn-metadata-proxy Dec 15 04:41:12 localhost neutron_dhcp_agent[267027]: 
INFO:__main__:Setting permission for /var/lib/neutron/ns-metadata-proxy Dec 15 04:41:12 localhost neutron_dhcp_agent[267027]: INFO:__main__:Setting permission for /var/lib/neutron/ovn_metadata_haproxy_wrapper Dec 15 04:41:12 localhost neutron_dhcp_agent[267027]: INFO:__main__:Setting permission for /var/lib/neutron/metadata_proxy Dec 15 04:41:12 localhost neutron_dhcp_agent[267027]: INFO:__main__:Setting permission for /var/lib/neutron/dhcp_agent_haproxy_wrapper Dec 15 04:41:12 localhost neutron_dhcp_agent[267027]: INFO:__main__:Setting permission for /var/lib/neutron/dhcp_agent_dnsmasq_wrapper Dec 15 04:41:12 localhost neutron_dhcp_agent[267027]: INFO:__main__:Setting permission for /var/lib/neutron/kill_scripts/haproxy-kill Dec 15 04:41:12 localhost neutron_dhcp_agent[267027]: INFO:__main__:Setting permission for /var/lib/neutron/kill_scripts/dnsmasq-kill Dec 15 04:41:12 localhost neutron_dhcp_agent[267027]: INFO:__main__:Setting permission for /var/lib/neutron/.cache/python-entrypoints Dec 15 04:41:12 localhost neutron_dhcp_agent[267027]: INFO:__main__:Setting permission for /var/lib/neutron/.cache/python-entrypoints/adac9f827fd7fb11fb07020ef60ee06a1fede4feab743856dc8fb3266181d934 Dec 15 04:41:12 localhost neutron_dhcp_agent[267027]: INFO:__main__:Setting permission for /var/lib/neutron/.cache/python-entrypoints/2f8bf5151cd8ab687baee0dc31396d121a2bd554ac71c7da49f0dd9a3fcf882e Dec 15 04:41:12 localhost neutron_dhcp_agent[267027]: INFO:__main__:Setting permission for /var/lib/neutron/external/pids Dec 15 04:41:12 localhost neutron_dhcp_agent[267027]: INFO:__main__:Setting permission for /var/lib/neutron/external/pids/befb7a72-17a9-4bcb-b561-84b8f626685a.pid.haproxy Dec 15 04:41:12 localhost neutron_dhcp_agent[267027]: INFO:__main__:Setting permission for /var/lib/neutron/ovn-metadata-proxy/befb7a72-17a9-4bcb-b561-84b8f626685a.conf Dec 15 04:41:12 localhost neutron_dhcp_agent[267027]: ++ cat /run_command Dec 15 04:41:12 localhost neutron_dhcp_agent[267027]: + 
CMD=/usr/bin/neutron-dhcp-agent Dec 15 04:41:12 localhost neutron_dhcp_agent[267027]: + ARGS= Dec 15 04:41:12 localhost neutron_dhcp_agent[267027]: + sudo kolla_copy_cacerts Dec 15 04:41:13 localhost neutron_dhcp_agent[267027]: + [[ ! -n '' ]] Dec 15 04:41:13 localhost neutron_dhcp_agent[267027]: + . kolla_extend_start Dec 15 04:41:13 localhost neutron_dhcp_agent[267027]: + echo 'Running command: '\''/usr/bin/neutron-dhcp-agent'\''' Dec 15 04:41:13 localhost neutron_dhcp_agent[267027]: Running command: '/usr/bin/neutron-dhcp-agent' Dec 15 04:41:13 localhost neutron_dhcp_agent[267027]: + umask 0022 Dec 15 04:41:13 localhost neutron_dhcp_agent[267027]: + exec /usr/bin/neutron-dhcp-agent Dec 15 04:41:13 localhost python3.9[267149]: ansible-ansible.builtin.slurp Invoked with src=/var/lib/edpm-config/deployed_services.yaml Dec 15 04:41:14 localhost neutron_dhcp_agent[267027]: 2025-12-15 09:41:14.265 267031 INFO neutron.common.config [-] Logging enabled!#033[00m Dec 15 04:41:14 localhost neutron_dhcp_agent[267027]: 2025-12-15 09:41:14.266 267031 INFO neutron.common.config [-] /usr/bin/neutron-dhcp-agent version 22.2.2.dev43#033[00m Dec 15 04:41:14 localhost systemd[1]: Started /usr/bin/podman healthcheck run 4d46c406a3855fd1c322e5d86c048188ea5865daa07ba23aaebacc7879668923. 
Dec 15 04:41:14 localhost neutron_dhcp_agent[267027]: 2025-12-15 09:41:14.636 267031 INFO neutron.agent.dhcp.agent [-] Synchronizing state#033[00m Dec 15 04:41:14 localhost podman[267260]: 2025-12-15 09:41:14.665326084 +0000 UTC m=+0.083728939 container health_status 4d46c406a3855fd1c322e5d86c048188ea5865daa07ba23aaebacc7879668923 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '07f3af15d79276418f54a8dde2fc251064527c3571430e14af77aaa2c01f1193-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, 
container_name=ovn_metadata_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible, config_id=ovn_metadata_agent) Dec 15 04:41:14 localhost podman[267260]: 2025-12-15 09:41:14.703177042 +0000 UTC m=+0.121579867 container exec_died 4d46c406a3855fd1c322e5d86c048188ea5865daa07ba23aaebacc7879668923 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, io.buildah.version=1.41.3, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '07f3af15d79276418f54a8dde2fc251064527c3571430e14af77aaa2c01f1193-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, 
tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true) Dec 15 04:41:14 localhost systemd[1]: 4d46c406a3855fd1c322e5d86c048188ea5865daa07ba23aaebacc7879668923.service: Deactivated successfully. Dec 15 04:41:14 localhost python3.9[267259]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/deployed_services.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Dec 15 04:41:15 localhost python3.9[267368]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/deployed_services.yaml mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1765791674.2888627-1490-49579735472919/.source.yaml _original_basename=.97g906h4 follow=False checksum=b0dd74fadfd8c60d9ca49529cbd40d9c61c35aa0 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Dec 15 04:41:16 localhost python3.9[267478]: ansible-ansible.builtin.systemd Invoked with name=edpm_neutron_dhcp_agent.service state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None Dec 15 04:41:16 localhost systemd[1]: Stopping neutron_dhcp_agent container... Dec 15 04:41:16 localhost systemd[1]: tmp-crun.zcBbBo.mount: Deactivated successfully. Dec 15 04:41:16 localhost systemd[1]: libpod-a06657c080932aedd3bea4666dab45538f60b9902aca65881eb6fdb625848af1.scope: Deactivated successfully. Dec 15 04:41:16 localhost systemd[1]: libpod-a06657c080932aedd3bea4666dab45538f60b9902aca65881eb6fdb625848af1.scope: Consumed 2.080s CPU time. 
Dec 15 04:41:16 localhost podman[267482]: 2025-12-15 09:41:16.556300994 +0000 UTC m=+0.392034326 container died a06657c080932aedd3bea4666dab45538f60b9902aca65881eb6fdb625848af1 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron_dhcp_agent, config_id=neutron_dhcp, container_name=neutron_dhcp_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '07f3af15d79276418f54a8dde2fc251064527c3571430e14af77aaa2c01f1193-f5443fb6795d0c7ba01877e7655bd75302d51a135a5603013e710635161b4a46'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/netns:/run/netns:shared', '/var/lib/openstack/neutron-dhcp-agent:/etc/neutron.conf.d:z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/kolla/config_files/neutron_dhcp_agent.json:/var/lib/kolla/config_files/config.json:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron/dhcp_agent_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/dhcp_agent_dnsmasq_wrapper:/usr/local/bin/dnsmasq:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-dhcp/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z']}, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb) Dec 15 04:41:16 localhost systemd[1]: tmp-crun.EfgMo4.mount: Deactivated successfully. 
Dec 15 04:41:16 localhost podman[267482]: 2025-12-15 09:41:16.626882299 +0000 UTC m=+0.462615621 container cleanup a06657c080932aedd3bea4666dab45538f60b9902aca65881eb6fdb625848af1 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron_dhcp_agent, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, container_name=neutron_dhcp_agent, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '07f3af15d79276418f54a8dde2fc251064527c3571430e14af77aaa2c01f1193-f5443fb6795d0c7ba01877e7655bd75302d51a135a5603013e710635161b4a46'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/netns:/run/netns:shared', '/var/lib/openstack/neutron-dhcp-agent:/etc/neutron.conf.d:z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/kolla/config_files/neutron_dhcp_agent.json:/var/lib/kolla/config_files/config.json:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron/dhcp_agent_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/dhcp_agent_dnsmasq_wrapper:/usr/local/bin/dnsmasq:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-dhcp/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z']}, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, config_id=neutron_dhcp, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb) Dec 15 04:41:16 localhost podman[267482]: neutron_dhcp_agent Dec 15 04:41:16 localhost podman[267523]: error opening file `/run/crun/a06657c080932aedd3bea4666dab45538f60b9902aca65881eb6fdb625848af1/status`: No such file or directory Dec 15 
04:41:16 localhost podman[267512]: 2025-12-15 09:41:16.728752189 +0000 UTC m=+0.069855465 container cleanup a06657c080932aedd3bea4666dab45538f60b9902aca65881eb6fdb625848af1 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron_dhcp_agent, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '07f3af15d79276418f54a8dde2fc251064527c3571430e14af77aaa2c01f1193-f5443fb6795d0c7ba01877e7655bd75302d51a135a5603013e710635161b4a46'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/netns:/run/netns:shared', '/var/lib/openstack/neutron-dhcp-agent:/etc/neutron.conf.d:z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/kolla/config_files/neutron_dhcp_agent.json:/var/lib/kolla/config_files/config.json:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron/dhcp_agent_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/dhcp_agent_dnsmasq_wrapper:/usr/local/bin/dnsmasq:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-dhcp/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z']}, config_id=neutron_dhcp, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, container_name=neutron_dhcp_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0) Dec 15 04:41:16 localhost podman[267512]: neutron_dhcp_agent Dec 15 04:41:16 localhost systemd[1]: edpm_neutron_dhcp_agent.service: Deactivated successfully. Dec 15 04:41:16 localhost systemd[1]: Stopped neutron_dhcp_agent container. 
Dec 15 04:41:16 localhost systemd[1]: Starting neutron_dhcp_agent container... Dec 15 04:41:16 localhost systemd[1]: Started libcrun container. Dec 15 04:41:16 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/b45158859b78cce8d769337cec5a3eea2a53af84afd750a564fcaef52b9bcfed/merged/etc/neutron.conf.d supports timestamps until 2038 (0x7fffffff) Dec 15 04:41:16 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/b45158859b78cce8d769337cec5a3eea2a53af84afd750a564fcaef52b9bcfed/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff) Dec 15 04:41:16 localhost podman[267527]: 2025-12-15 09:41:16.884511673 +0000 UTC m=+0.121618879 container init a06657c080932aedd3bea4666dab45538f60b9902aca65881eb6fdb625848af1 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron_dhcp_agent, config_id=neutron_dhcp, org.label-schema.build-date=20251202, io.buildah.version=1.41.3, tcib_managed=true, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '07f3af15d79276418f54a8dde2fc251064527c3571430e14af77aaa2c01f1193-f5443fb6795d0c7ba01877e7655bd75302d51a135a5603013e710635161b4a46'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/netns:/run/netns:shared', '/var/lib/openstack/neutron-dhcp-agent:/etc/neutron.conf.d:z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/kolla/config_files/neutron_dhcp_agent.json:/var/lib/kolla/config_files/config.json:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron/dhcp_agent_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/dhcp_agent_dnsmasq_wrapper:/usr/local/bin/dnsmasq:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', 
'/var/lib/openstack/cacerts/neutron-dhcp/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z']}, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, container_name=neutron_dhcp_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS) Dec 15 04:41:16 localhost podman[267527]: 2025-12-15 09:41:16.89596605 +0000 UTC m=+0.133073256 container start a06657c080932aedd3bea4666dab45538f60b9902aca65881eb6fdb625848af1 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron_dhcp_agent, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, config_id=neutron_dhcp, org.label-schema.name=CentOS Stream 9 Base Image, container_name=neutron_dhcp_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '07f3af15d79276418f54a8dde2fc251064527c3571430e14af77aaa2c01f1193-f5443fb6795d0c7ba01877e7655bd75302d51a135a5603013e710635161b4a46'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/netns:/run/netns:shared', '/var/lib/openstack/neutron-dhcp-agent:/etc/neutron.conf.d:z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/kolla/config_files/neutron_dhcp_agent.json:/var/lib/kolla/config_files/config.json:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron/dhcp_agent_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/dhcp_agent_dnsmasq_wrapper:/usr/local/bin/dnsmasq:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', 
'/var/lib/openstack/cacerts/neutron-dhcp/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z']}, managed_by=edpm_ansible) Dec 15 04:41:16 localhost podman[267527]: neutron_dhcp_agent Dec 15 04:41:16 localhost neutron_dhcp_agent[267542]: + sudo -E kolla_set_configs Dec 15 04:41:16 localhost systemd[1]: Started neutron_dhcp_agent container. Dec 15 04:41:16 localhost neutron_dhcp_agent[267542]: INFO:__main__:Loading config file at /var/lib/kolla/config_files/config.json Dec 15 04:41:16 localhost neutron_dhcp_agent[267542]: INFO:__main__:Validating config file Dec 15 04:41:16 localhost neutron_dhcp_agent[267542]: INFO:__main__:Kolla config strategy set to: COPY_ALWAYS Dec 15 04:41:16 localhost neutron_dhcp_agent[267542]: INFO:__main__:Copying service configuration files Dec 15 04:41:16 localhost neutron_dhcp_agent[267542]: INFO:__main__:Deleting /etc/neutron/rootwrap.conf Dec 15 04:41:16 localhost neutron_dhcp_agent[267542]: INFO:__main__:Copying /etc/neutron.conf.d/01-rootwrap.conf to /etc/neutron/rootwrap.conf Dec 15 04:41:16 localhost neutron_dhcp_agent[267542]: INFO:__main__:Setting permission for /etc/neutron/rootwrap.conf Dec 15 04:41:16 localhost neutron_dhcp_agent[267542]: INFO:__main__:Writing out command to execute Dec 15 04:41:16 localhost neutron_dhcp_agent[267542]: INFO:__main__:Setting permission for /var/lib/neutron Dec 15 04:41:16 localhost neutron_dhcp_agent[267542]: INFO:__main__:Setting permission for /var/lib/neutron/kill_scripts Dec 15 04:41:16 localhost neutron_dhcp_agent[267542]: INFO:__main__:Setting permission for /var/lib/neutron/.cache Dec 15 04:41:16 localhost neutron_dhcp_agent[267542]: INFO:__main__:Setting permission for /var/lib/neutron/external Dec 15 04:41:16 localhost neutron_dhcp_agent[267542]: INFO:__main__:Setting permission for /var/lib/neutron/ovn-metadata-proxy Dec 15 04:41:16 localhost neutron_dhcp_agent[267542]: INFO:__main__:Setting permission for /var/lib/neutron/ns-metadata-proxy Dec 15 04:41:16 
localhost neutron_dhcp_agent[267542]: INFO:__main__:Setting permission for /var/lib/neutron/dhcp Dec 15 04:41:16 localhost neutron_dhcp_agent[267542]: INFO:__main__:Setting permission for /var/lib/neutron/ovn_metadata_haproxy_wrapper Dec 15 04:41:16 localhost neutron_dhcp_agent[267542]: INFO:__main__:Setting permission for /var/lib/neutron/metadata_proxy Dec 15 04:41:16 localhost neutron_dhcp_agent[267542]: INFO:__main__:Setting permission for /var/lib/neutron/dhcp_agent_haproxy_wrapper Dec 15 04:41:16 localhost neutron_dhcp_agent[267542]: INFO:__main__:Setting permission for /var/lib/neutron/dhcp_agent_dnsmasq_wrapper Dec 15 04:41:16 localhost neutron_dhcp_agent[267542]: INFO:__main__:Setting permission for /var/lib/neutron/kill_scripts/haproxy-kill Dec 15 04:41:16 localhost neutron_dhcp_agent[267542]: INFO:__main__:Setting permission for /var/lib/neutron/kill_scripts/dnsmasq-kill Dec 15 04:41:16 localhost neutron_dhcp_agent[267542]: INFO:__main__:Setting permission for /var/lib/neutron/.cache/python-entrypoints Dec 15 04:41:16 localhost neutron_dhcp_agent[267542]: INFO:__main__:Setting permission for /var/lib/neutron/.cache/python-entrypoints/adac9f827fd7fb11fb07020ef60ee06a1fede4feab743856dc8fb3266181d934 Dec 15 04:41:16 localhost neutron_dhcp_agent[267542]: INFO:__main__:Setting permission for /var/lib/neutron/.cache/python-entrypoints/2f8bf5151cd8ab687baee0dc31396d121a2bd554ac71c7da49f0dd9a3fcf882e Dec 15 04:41:16 localhost neutron_dhcp_agent[267542]: INFO:__main__:Setting permission for /var/lib/neutron/external/pids Dec 15 04:41:16 localhost neutron_dhcp_agent[267542]: INFO:__main__:Setting permission for /var/lib/neutron/external/pids/befb7a72-17a9-4bcb-b561-84b8f626685a.pid.haproxy Dec 15 04:41:16 localhost neutron_dhcp_agent[267542]: INFO:__main__:Setting permission for /var/lib/neutron/ovn-metadata-proxy/befb7a72-17a9-4bcb-b561-84b8f626685a.conf Dec 15 04:41:16 localhost neutron_dhcp_agent[267542]: ++ cat /run_command Dec 15 04:41:16 localhost 
neutron_dhcp_agent[267542]: + CMD=/usr/bin/neutron-dhcp-agent Dec 15 04:41:16 localhost neutron_dhcp_agent[267542]: + ARGS= Dec 15 04:41:16 localhost neutron_dhcp_agent[267542]: + sudo kolla_copy_cacerts Dec 15 04:41:17 localhost neutron_dhcp_agent[267542]: + [[ ! -n '' ]] Dec 15 04:41:17 localhost neutron_dhcp_agent[267542]: + . kolla_extend_start Dec 15 04:41:17 localhost neutron_dhcp_agent[267542]: + echo 'Running command: '\''/usr/bin/neutron-dhcp-agent'\''' Dec 15 04:41:17 localhost neutron_dhcp_agent[267542]: Running command: '/usr/bin/neutron-dhcp-agent' Dec 15 04:41:17 localhost neutron_dhcp_agent[267542]: + umask 0022 Dec 15 04:41:17 localhost neutron_dhcp_agent[267542]: + exec /usr/bin/neutron-dhcp-agent Dec 15 04:41:17 localhost systemd[1]: session-58.scope: Deactivated successfully. Dec 15 04:41:17 localhost systemd[1]: session-58.scope: Consumed 35.885s CPU time. Dec 15 04:41:17 localhost systemd-logind[763]: Session 58 logged out. Waiting for processes to exit. Dec 15 04:41:17 localhost systemd-logind[763]: Removed session 58. 
Dec 15 04:41:17 localhost nova_compute[231752]: 2025-12-15 09:41:17.821 231756 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 04:41:17 localhost nova_compute[231752]: 2025-12-15 09:41:17.826 231756 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 04:41:18 localhost neutron_dhcp_agent[267542]: 2025-12-15 09:41:18.131 267546 INFO neutron.common.config [-] Logging enabled!#033[00m Dec 15 04:41:18 localhost neutron_dhcp_agent[267542]: 2025-12-15 09:41:18.132 267546 INFO neutron.common.config [-] /usr/bin/neutron-dhcp-agent version 22.2.2.dev43#033[00m Dec 15 04:41:18 localhost neutron_dhcp_agent[267542]: 2025-12-15 09:41:18.499 267546 INFO neutron.agent.dhcp.agent [-] Synchronizing state#033[00m Dec 15 04:41:20 localhost neutron_dhcp_agent[267542]: 2025-12-15 09:41:20.145 267546 INFO neutron.agent.dhcp.agent [None req-fdab0b13-9cd4-4f41-958d-e973d0b42aaa - - - - - -] All active networks have been fetched through RPC.#033[00m Dec 15 04:41:20 localhost neutron_dhcp_agent[267542]: 2025-12-15 09:41:20.146 267546 INFO neutron.agent.dhcp.agent [None req-fdab0b13-9cd4-4f41-958d-e973d0b42aaa - - - - - -] Synchronizing state complete#033[00m Dec 15 04:41:20 localhost neutron_dhcp_agent[267542]: 2025-12-15 09:41:20.204 267546 INFO neutron.agent.dhcp.agent [None req-fdab0b13-9cd4-4f41-958d-e973d0b42aaa - - - - - -] DHCP agent started#033[00m Dec 15 04:41:20 localhost ovn_metadata_agent[160585]: 2025-12-15 09:41:20.801 160590 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=5, ssl=[], options={'arp_ns_explicit_output': 'true', 'fdb_removal_limit': '0', 'ignore_lsp_down': 'false', 'mac_binding_removal_limit': '0', 'mac_prefix': 'fe:17:e3', 
'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': 'fe:55:2b:86:15:b5'}, ipsec=False) old=SB_Global(nb_cfg=4) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Dec 15 04:41:20 localhost nova_compute[231752]: 2025-12-15 09:41:20.802 231756 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 04:41:20 localhost ovn_metadata_agent[160585]: 2025-12-15 09:41:20.803 160590 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 0 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m Dec 15 04:41:20 localhost ovn_metadata_agent[160585]: 2025-12-15 09:41:20.804 160590 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=12d96d64-e862-4f68-81e5-8d9ec5d3a5e2, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '5'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m Dec 15 04:41:21 localhost systemd[1]: Started /usr/bin/podman healthcheck run a4e94bd7aaee6c174c601060fb5f34559019ccea93fc397263ee3735731cfc1e. 
Dec 15 04:41:21 localhost podman[267575]: 2025-12-15 09:41:21.743852574 +0000 UTC m=+0.078960677 container health_status a4e94bd7aaee6c174c601060fb5f34559019ccea93fc397263ee3735731cfc1e (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible) Dec 15 04:41:21 localhost podman[267575]: 2025-12-15 09:41:21.751548437 +0000 UTC m=+0.086656580 container exec_died a4e94bd7aaee6c174c601060fb5f34559019ccea93fc397263ee3735731cfc1e (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, managed_by=edpm_ansible, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', 
'/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi ) Dec 15 04:41:21 localhost systemd[1]: a4e94bd7aaee6c174c601060fb5f34559019ccea93fc397263ee3735731cfc1e.service: Deactivated successfully. Dec 15 04:41:22 localhost nova_compute[231752]: 2025-12-15 09:41:22.824 231756 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 04:41:24 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:fc:1f:13 MACDST=fa:16:3e:b3:bb:e3 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=15390 DF PROTO=TCP SPT=56108 DPT=9102 SEQ=2608436167 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A83A51E220000000001030307) Dec 15 04:41:25 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:fc:1f:13 MACDST=fa:16:3e:b3:bb:e3 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=15391 DF PROTO=TCP SPT=56108 DPT=9102 SEQ=2608436167 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A83A522250000000001030307) Dec 15 04:41:26 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:fc:1f:13 MACDST=fa:16:3e:b3:bb:e3 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=29112 DF PROTO=TCP SPT=53922 DPT=9102 SEQ=3658843121 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A83A525250000000001030307) Dec 15 04:41:27 localhost nova_compute[231752]: 2025-12-15 09:41:27.827 231756 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 04:41:27 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:fc:1f:13 MACDST=fa:16:3e:b3:bb:e3 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=15392 DF PROTO=TCP SPT=56108 DPT=9102 SEQ=2608436167 ACK=0 WINDOW=32640 
RES=0x00 SYN URGP=0 OPT (020405500402080A83A52A250000000001030307) Dec 15 04:41:28 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:fc:1f:13 MACDST=fa:16:3e:b3:bb:e3 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=19202 DF PROTO=TCP SPT=45430 DPT=9102 SEQ=3169568234 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A83A52D260000000001030307) Dec 15 04:41:31 localhost systemd[1]: Started /usr/bin/podman healthcheck run 67ee72fcd99c896f79e8807764b1d0f04e7d45927d06e306c0986394e8ed42a0. Dec 15 04:41:31 localhost systemd[1]: Started /usr/bin/podman healthcheck run 9f0f481c937606298203797a71924b7614a5d9e3f4a8afd837cfa5b9602aedeb. Dec 15 04:41:31 localhost podman[267597]: 2025-12-15 09:41:31.749054135 +0000 UTC m=+0.074391049 container health_status 67ee72fcd99c896f79e8807764b1d0f04e7d45927d06e306c0986394e8ed42a0 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, container_name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': 
True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter) Dec 15 04:41:31 localhost podman[267597]: 2025-12-15 09:41:31.791396981 +0000 UTC m=+0.116733855 container exec_died 67ee72fcd99c896f79e8807764b1d0f04e7d45927d06e306c0986394e8ed42a0 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter) Dec 15 04:41:31 localhost podman[267598]: 2025-12-15 09:41:31.801871442 +0000 UTC m=+0.126261129 container health_status 9f0f481c937606298203797a71924b7614a5d9e3f4a8afd837cfa5b9602aedeb 
(image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.vendor=CentOS, container_name=multipathd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251202, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.schema-version=1.0, config_id=multipathd) Dec 15 04:41:31 localhost systemd[1]: 67ee72fcd99c896f79e8807764b1d0f04e7d45927d06e306c0986394e8ed42a0.service: Deactivated successfully. 
Dec 15 04:41:31 localhost podman[267598]: 2025-12-15 09:41:31.844473896 +0000 UTC m=+0.168863633 container exec_died 9f0f481c937606298203797a71924b7614a5d9e3f4a8afd837cfa5b9602aedeb (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, config_id=multipathd, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=multipathd, org.label-schema.build-date=20251202, tcib_managed=true, org.label-schema.license=GPLv2, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible) Dec 15 04:41:31 localhost systemd[1]: 9f0f481c937606298203797a71924b7614a5d9e3f4a8afd837cfa5b9602aedeb.service: Deactivated successfully. 
Dec 15 04:41:31 localhost podman[243449]: time="2025-12-15T09:41:31Z" level=info msg="List containers: received `last` parameter - overwriting `limit`" Dec 15 04:41:31 localhost podman[243449]: @ - - [15/Dec/2025:09:41:31 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 149352 "" "Go-http-client/1.1" Dec 15 04:41:31 localhost podman[243449]: @ - - [15/Dec/2025:09:41:31 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 17218 "" "Go-http-client/1.1" Dec 15 04:41:31 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:fc:1f:13 MACDST=fa:16:3e:b3:bb:e3 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=15393 DF PROTO=TCP SPT=56108 DPT=9102 SEQ=2608436167 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A83A539E50000000001030307) Dec 15 04:41:32 localhost nova_compute[231752]: 2025-12-15 09:41:32.830 231756 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 04:41:33 localhost systemd[1]: Started /usr/bin/podman healthcheck run b3d8483ade539500e228c2af9c0c3e647febb5ec7903610a83aea32631007d4a. Dec 15 04:41:33 localhost systemd[1]: tmp-crun.oASqkJ.mount: Deactivated successfully. 
Dec 15 04:41:33 localhost podman[267637]: 2025-12-15 09:41:33.757129224 +0000 UTC m=+0.089538694 container health_status b3d8483ade539500e228c2af9c0c3e647febb5ec7903610a83aea32631007d4a (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '07f3af15d79276418f54a8dde2fc251064527c3571430e14af77aaa2c01f1193-cb1e9478634a00c8fb5ad9cd7fcd38c7b2a1a85187dfaa618bc715c24c2b15fb'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3, config_id=ceilometer_agent_compute, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, org.label-schema.vendor=CentOS, container_name=ceilometer_agent_compute, maintainer=OpenStack 
Kubernetes Operator team) Dec 15 04:41:33 localhost podman[267637]: 2025-12-15 09:41:33.791970764 +0000 UTC m=+0.124380224 container exec_died b3d8483ade539500e228c2af9c0c3e647febb5ec7903610a83aea32631007d4a (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, tcib_managed=true, config_id=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '07f3af15d79276418f54a8dde2fc251064527c3571430e14af77aaa2c01f1193-cb1e9478634a00c8fb5ad9cd7fcd38c7b2a1a85187dfaa618bc715c24c2b15fb'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, 
org.label-schema.name=CentOS Stream 9 Base Image) Dec 15 04:41:33 localhost systemd[1]: b3d8483ade539500e228c2af9c0c3e647febb5ec7903610a83aea32631007d4a.service: Deactivated successfully. Dec 15 04:41:34 localhost openstack_network_exporter[246484]: ERROR 09:41:34 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server Dec 15 04:41:34 localhost openstack_network_exporter[246484]: ERROR 09:41:34 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd Dec 15 04:41:34 localhost openstack_network_exporter[246484]: ERROR 09:41:34 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd Dec 15 04:41:34 localhost openstack_network_exporter[246484]: ERROR 09:41:34 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath Dec 15 04:41:34 localhost openstack_network_exporter[246484]: Dec 15 04:41:34 localhost openstack_network_exporter[246484]: ERROR 09:41:34 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath Dec 15 04:41:34 localhost openstack_network_exporter[246484]: Dec 15 04:41:37 localhost systemd[1]: Started /usr/bin/podman healthcheck run 730c50020a7d9e2da21d2462d40af20ac303e43f3491f692cbbd87f432ba3e09. Dec 15 04:41:37 localhost systemd[1]: tmp-crun.JjwYRh.mount: Deactivated successfully. 
Dec 15 04:41:37 localhost podman[267656]: 2025-12-15 09:41:37.764106869 +0000 UTC m=+0.093170298 container health_status 730c50020a7d9e2da21d2462d40af20ac303e43f3491f692cbbd87f432ba3e09 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://catalog.redhat.com/en/search?searchType=containers, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'cb1e9478634a00c8fb5ad9cd7fcd38c7b2a1a85187dfaa618bc715c24c2b15fb'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.buildah.version=1.33.7, vendor=Red Hat, Inc., managed_by=edpm_ansible, vcs-type=git, build-date=2025-08-20T13:12:41, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., config_id=openstack_network_exporter, name=ubi9-minimal, container_name=openstack_network_exporter, release=1755695350, architecture=x86_64, com.redhat.component=ubi9-minimal-container, io.openshift.tags=minimal rhel9, version=9.6, description=The Universal Base Image 
Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, distribution-scope=public, maintainer=Red Hat, Inc.) Dec 15 04:41:37 localhost podman[267656]: 2025-12-15 09:41:37.775803214 +0000 UTC m=+0.104866623 container exec_died 730c50020a7d9e2da21d2462d40af20ac303e43f3491f692cbbd87f432ba3e09 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, architecture=x86_64, version=9.6, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, url=https://catalog.redhat.com/en/search?searchType=containers, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'cb1e9478634a00c8fb5ad9cd7fcd38c7b2a1a85187dfaa618bc715c24c2b15fb'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'net': 'host', 'ports': ['9105:9105'], 
'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vcs-type=git, vendor=Red Hat, Inc., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., name=ubi9-minimal, maintainer=Red Hat, Inc., container_name=openstack_network_exporter, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, com.redhat.component=ubi9-minimal-container, io.openshift.tags=minimal rhel9, build-date=2025-08-20T13:12:41, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, release=1755695350, config_id=openstack_network_exporter, io.buildah.version=1.33.7) Dec 15 04:41:37 localhost systemd[1]: 730c50020a7d9e2da21d2462d40af20ac303e43f3491f692cbbd87f432ba3e09.service: Deactivated successfully. 
Dec 15 04:41:37 localhost nova_compute[231752]: 2025-12-15 09:41:37.832 231756 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 04:41:39 localhost systemd[1]: Started /usr/bin/podman healthcheck run ac2eae30a2a7f971398af42ea67b3ed799d4b3341f7688ebcd73f7ca7f66c2a8. Dec 15 04:41:39 localhost systemd[1]: tmp-crun.D5IWIV.mount: Deactivated successfully. Dec 15 04:41:39 localhost podman[267676]: 2025-12-15 09:41:39.735467953 +0000 UTC m=+0.074418600 container health_status ac2eae30a2a7f971398af42ea67b3ed799d4b3341f7688ebcd73f7ca7f66c2a8 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, maintainer=OpenStack Kubernetes Operator team, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '07f3af15d79276418f54a8dde2fc251064527c3571430e14af77aaa2c01f1193'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, container_name=ovn_controller, org.label-schema.schema-version=1.0, config_id=ovn_controller, io.buildah.version=1.41.3, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, 
org.label-schema.license=GPLv2) Dec 15 04:41:39 localhost podman[267676]: 2025-12-15 09:41:39.804359151 +0000 UTC m=+0.143309788 container exec_died ac2eae30a2a7f971398af42ea67b3ed799d4b3341f7688ebcd73f7ca7f66c2a8 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '07f3af15d79276418f54a8dde2fc251064527c3571430e14af77aaa2c01f1193'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ovn_controller) Dec 15 04:41:39 localhost systemd[1]: ac2eae30a2a7f971398af42ea67b3ed799d4b3341f7688ebcd73f7ca7f66c2a8.service: Deactivated successfully. 
Dec 15 04:41:40 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:fc:1f:13 MACDST=fa:16:3e:b3:bb:e3 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=15394 DF PROTO=TCP SPT=56108 DPT=9102 SEQ=2608436167 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A83A55B250000000001030307) Dec 15 04:41:42 localhost nova_compute[231752]: 2025-12-15 09:41:42.837 231756 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Dec 15 04:41:42 localhost nova_compute[231752]: 2025-12-15 09:41:42.839 231756 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Dec 15 04:41:42 localhost nova_compute[231752]: 2025-12-15 09:41:42.839 231756 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m Dec 15 04:41:42 localhost nova_compute[231752]: 2025-12-15 09:41:42.839 231756 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Dec 15 04:41:42 localhost nova_compute[231752]: 2025-12-15 09:41:42.854 231756 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 04:41:42 localhost nova_compute[231752]: 2025-12-15 09:41:42.854 231756 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Dec 15 04:41:45 localhost systemd[1]: Started /usr/bin/podman healthcheck run 4d46c406a3855fd1c322e5d86c048188ea5865daa07ba23aaebacc7879668923. 
Dec 15 04:41:45 localhost podman[267701]: 2025-12-15 09:41:45.748284602 +0000 UTC m=+0.082654236 container health_status 4d46c406a3855fd1c322e5d86c048188ea5865daa07ba23aaebacc7879668923 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '07f3af15d79276418f54a8dde2fc251064527c3571430e14af77aaa2c01f1193-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=ovn_metadata_agent, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251202) Dec 15 04:41:45 localhost 
podman[267701]: 2025-12-15 09:41:45.752962477 +0000 UTC m=+0.087332121 container exec_died 4d46c406a3855fd1c322e5d86c048188ea5865daa07ba23aaebacc7879668923 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '07f3af15d79276418f54a8dde2fc251064527c3571430e14af77aaa2c01f1193-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true) Dec 15 04:41:45 localhost systemd[1]: 
4d46c406a3855fd1c322e5d86c048188ea5865daa07ba23aaebacc7879668923.service: Deactivated successfully. Dec 15 04:41:45 localhost nova_compute[231752]: 2025-12-15 09:41:45.947 231756 DEBUG oslo_service.periodic_task [None req-c7fd4a3f-c95b-4936-a071-10d43fe8d19c - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Dec 15 04:41:47 localhost nova_compute[231752]: 2025-12-15 09:41:47.855 231756 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Dec 15 04:41:47 localhost nova_compute[231752]: 2025-12-15 09:41:47.858 231756 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 04:41:47 localhost nova_compute[231752]: 2025-12-15 09:41:47.951 231756 DEBUG oslo_service.periodic_task [None req-c7fd4a3f-c95b-4936-a071-10d43fe8d19c - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Dec 15 04:41:47 localhost nova_compute[231752]: 2025-12-15 09:41:47.952 231756 DEBUG nova.compute.manager [None req-c7fd4a3f-c95b-4936-a071-10d43fe8d19c - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m Dec 15 04:41:47 localhost nova_compute[231752]: 2025-12-15 09:41:47.952 231756 DEBUG nova.compute.manager [None req-c7fd4a3f-c95b-4936-a071-10d43fe8d19c - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.118 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359', 'name': 'test', 'flavor': {'id': 
'2da0e147-aaa7-4bb9-a176-5fe1b15a32a0', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'image': {'id': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-00000002', 'OS-EXT-SRV-ATTR:host': 'np0005559462.localdomain', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': 'c785bf23f53946bc99867d8832a50266', 'user_id': '1ba5fce347b64bfebf995f187193f205', 'hostId': 'bf4d507aa00b3fcf643a7e0b883420632ac7fc96e8c7a756982e121d', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228 Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.119 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.usage in the context of pollsters Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.136 12 DEBUG ceilometer.compute.pollsters [-] 39ff1bd9-6f6b-44c8-bbec-a1fd9d196359/disk.device.usage volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.137 12 DEBUG ceilometer.compute.pollsters [-] 39ff1bd9-6f6b-44c8-bbec-a1fd9d196359/disk.device.usage volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.139 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': '6e203a55-b4c2-4d45-b073-dd12caef7526', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '1ba5fce347b64bfebf995f187193f205', 'user_name': None, 'project_id': 'c785bf23f53946bc99867d8832a50266', 'project_name': None, 'resource_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359-vda', 'timestamp': '2025-12-15T09:41:48.119898', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359', 'instance_type': 'm1.small', 'host': 'bf4d507aa00b3fcf643a7e0b883420632ac7fc96e8c7a756982e121d', 'instance_host': 'np0005559462.localdomain', 'flavor': {'id': '2da0e147-aaa7-4bb9-a176-5fe1b15a32a0', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e'}, 'image_ref': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '469d0254-d99a-11f0-817e-fa163ebaca0f', 'monotonic_time': 10774.312571191, 'message_signature': '0351daaedbb9b78da87ff0aa24ed44a626aa98e1a6faa302b1991f03f4a4ebb5'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '1ba5fce347b64bfebf995f187193f205', 'user_name': None, 'project_id': 'c785bf23f53946bc99867d8832a50266', 'project_name': None, 'resource_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359-vdb', 'timestamp': '2025-12-15T09:41:48.119898', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359', 
'instance_type': 'm1.small', 'host': 'bf4d507aa00b3fcf643a7e0b883420632ac7fc96e8c7a756982e121d', 'instance_host': 'np0005559462.localdomain', 'flavor': {'id': '2da0e147-aaa7-4bb9-a176-5fe1b15a32a0', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e'}, 'image_ref': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '469d16c2-d99a-11f0-817e-fa163ebaca0f', 'monotonic_time': 10774.312571191, 'message_signature': 'cb8db3cdaf1307750ce832bbbd1cc203b44017a30c0a26dc86f469110224bcfc'}]}, 'timestamp': '2025-12-15 09:41:48.137873', '_unique_id': '0e899f847c3b4f8c9a419ddbb472c6af'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.139 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.139 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.139 12 ERROR oslo_messaging.notify.messaging yield Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.139 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.139 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.139 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.139 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.139 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.139 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.139 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.139 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.139 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.139 12 ERROR oslo_messaging.notify.messaging conn.connect() Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.139 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.139 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.139 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Dec 15 04:41:48 localhost 
ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.139 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.139 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.139 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.139 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.139 12 ERROR oslo_messaging.notify.messaging Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.139 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.139 12 ERROR oslo_messaging.notify.messaging Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.139 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.139 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.139 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.139 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 
09:41:48.139 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.139 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.139 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.139 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.139 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.139 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.139 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.139 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.139 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.139 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Dec 15 04:41:48 localhost 
ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.139 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.139 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.139 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.139 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.139 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.139 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.139 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.139 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.139 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.139 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 
09:41:48.139 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.139 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.139 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.139 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.139 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.139 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.139 12 ERROR oslo_messaging.notify.messaging Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.140 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.error in the context of pollsters Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.147 12 DEBUG ceilometer.compute.pollsters [-] 39ff1bd9-6f6b-44c8-bbec-a1fd9d196359/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.148 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': 'bd427589-b0c8-4a97-86c1-34abc5ca8831', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '1ba5fce347b64bfebf995f187193f205', 'user_name': None, 'project_id': 'c785bf23f53946bc99867d8832a50266', 'project_name': None, 'resource_id': 'instance-00000002-39ff1bd9-6f6b-44c8-bbec-a1fd9d196359-tap03ef8889-32', 'timestamp': '2025-12-15T09:41:48.141102', 'resource_metadata': {'display_name': 'test', 'name': 'tap03ef8889-32', 'instance_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359', 'instance_type': 'm1.small', 'host': 'bf4d507aa00b3fcf643a7e0b883420632ac7fc96e8c7a756982e121d', 'instance_host': 'np0005559462.localdomain', 'flavor': {'id': '2da0e147-aaa7-4bb9-a176-5fe1b15a32a0', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e'}, 'image_ref': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:74:df:7c', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap03ef8889-32'}, 'message_id': '469e9308-d99a-11f0-817e-fa163ebaca0f', 'monotonic_time': 10774.33379426, 'message_signature': '7f5454acea03f61e44287980ac260c00bc498e49d0c9b1d9d3d2370b85f552fd'}]}, 'timestamp': '2025-12-15 09:41:48.147628', '_unique_id': '19d730a89ec7480e96d6010fff943167'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.148 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 15 04:41:48 localhost 
ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.148 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.148 12 ERROR oslo_messaging.notify.messaging yield Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.148 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.148 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.148 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.148 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.148 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.148 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.148 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.148 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.148 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.148 12 ERROR oslo_messaging.notify.messaging conn.connect() Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.148 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.148 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.148 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.148 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.148 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.148 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.148 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.148 12 ERROR oslo_messaging.notify.messaging Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.148 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.148 12 ERROR oslo_messaging.notify.messaging Dec 15 04:41:48 localhost 
ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.148 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.148 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.148 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.148 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.148 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.148 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.148 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.148 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.148 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.148 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection 
Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.148 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.148 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.148 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.148 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.148 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.148 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.148 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.148 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.148 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.148 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Dec 15 04:41:48 
localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.148 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.148 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.148 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.148 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.148 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.148 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.148 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.148 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.148 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.148 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.148 12 ERROR oslo_messaging.notify.messaging Dec 15 04:41:48 localhost 
ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.149 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets in the context of pollsters Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.150 12 DEBUG ceilometer.compute.pollsters [-] 39ff1bd9-6f6b-44c8-bbec-a1fd9d196359/network.outgoing.packets volume: 129 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.151 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'd0c7e0a6-2b93-4e41-9da6-758b12c8a736', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 129, 'user_id': '1ba5fce347b64bfebf995f187193f205', 'user_name': None, 'project_id': 'c785bf23f53946bc99867d8832a50266', 'project_name': None, 'resource_id': 'instance-00000002-39ff1bd9-6f6b-44c8-bbec-a1fd9d196359-tap03ef8889-32', 'timestamp': '2025-12-15T09:41:48.149948', 'resource_metadata': {'display_name': 'test', 'name': 'tap03ef8889-32', 'instance_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359', 'instance_type': 'm1.small', 'host': 'bf4d507aa00b3fcf643a7e0b883420632ac7fc96e8c7a756982e121d', 'instance_host': 'np0005559462.localdomain', 'flavor': {'id': '2da0e147-aaa7-4bb9-a176-5fe1b15a32a0', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e'}, 'image_ref': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:74:df:7c', 'fref': None, 
'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap03ef8889-32'}, 'message_id': '469f0270-d99a-11f0-817e-fa163ebaca0f', 'monotonic_time': 10774.33379426, 'message_signature': '0b5f28d2a39bd89e7616375127a368c789b64d3305ea7cfe382b9301d64903e6'}]}, 'timestamp': '2025-12-15 09:41:48.150473', '_unique_id': '62a691aa3202456f9fb7eea9db8bfc0f'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.151 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.151 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.151 12 ERROR oslo_messaging.notify.messaging yield Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.151 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.151 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.151 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.151 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.151 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.151 12 ERROR 
oslo_messaging.notify.messaging self._connection = self._establish_connection() Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.151 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.151 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.151 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.151 12 ERROR oslo_messaging.notify.messaging conn.connect() Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.151 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.151 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.151 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.151 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.151 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.151 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 
2025-12-15 09:41:48.151 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.151 12 ERROR oslo_messaging.notify.messaging Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.151 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.151 12 ERROR oslo_messaging.notify.messaging Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.151 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.151 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.151 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.151 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.151 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.151 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.151 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Dec 15 04:41:48 localhost 
ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.151 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.151 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.151 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.151 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.151 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.151 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.151 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.151 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.151 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.151 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Dec 15 
04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.151 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.151 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.151 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.151 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.151 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.151 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.151 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.151 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.151 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.151 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.151 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.151 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.151 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.151 12 ERROR oslo_messaging.notify.messaging Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.152 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes in the context of pollsters Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.153 12 DEBUG ceilometer.compute.pollsters [-] 39ff1bd9-6f6b-44c8-bbec-a1fd9d196359/network.incoming.bytes volume: 9229 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.154 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': '4eb0af13-fac5-440a-abe8-c1777172106e', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 9229, 'user_id': '1ba5fce347b64bfebf995f187193f205', 'user_name': None, 'project_id': 'c785bf23f53946bc99867d8832a50266', 'project_name': None, 'resource_id': 'instance-00000002-39ff1bd9-6f6b-44c8-bbec-a1fd9d196359-tap03ef8889-32', 'timestamp': '2025-12-15T09:41:48.153017', 'resource_metadata': {'display_name': 'test', 'name': 'tap03ef8889-32', 'instance_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359', 'instance_type': 'm1.small', 'host': 'bf4d507aa00b3fcf643a7e0b883420632ac7fc96e8c7a756982e121d', 'instance_host': 'np0005559462.localdomain', 'flavor': {'id': '2da0e147-aaa7-4bb9-a176-5fe1b15a32a0', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e'}, 'image_ref': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:74:df:7c', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap03ef8889-32'}, 'message_id': '469f7912-d99a-11f0-817e-fa163ebaca0f', 'monotonic_time': 10774.33379426, 'message_signature': '8f5b4f9bda99160914e45cd2a0497c48e953286c701c1c949cb82de8df4e239c'}]}, 'timestamp': '2025-12-15 09:41:48.153511', '_unique_id': 'b7aceb4324f7477ab3dd274c9dbe7a41'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.154 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 
2025-12-15 09:41:48.154 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.154 12 ERROR oslo_messaging.notify.messaging yield Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.154 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.154 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.154 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.154 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.154 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.154 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.154 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.154 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.154 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.154 12 ERROR oslo_messaging.notify.messaging conn.connect() Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.154 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.154 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.154 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.154 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.154 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.154 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.154 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.154 12 ERROR oslo_messaging.notify.messaging Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.154 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.154 12 ERROR oslo_messaging.notify.messaging Dec 15 04:41:48 localhost 
ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.154 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.154 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.154 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.154 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.154 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.154 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.154 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.154 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.154 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.154 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection 
Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.154 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.154 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.154 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.154 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.154 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.154 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.154 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.154 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.154 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.154 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Dec 15 04:41:48 
localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.154 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.154 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.154 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.154 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.154 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.154 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.154 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.154 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.154 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.154 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.154 12 ERROR oslo_messaging.notify.messaging Dec 15 04:41:48 localhost 
ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.155 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.allocation in the context of pollsters Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.155 12 DEBUG ceilometer.compute.pollsters [-] 39ff1bd9-6f6b-44c8-bbec-a1fd9d196359/disk.device.allocation volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.156 12 DEBUG ceilometer.compute.pollsters [-] 39ff1bd9-6f6b-44c8-bbec-a1fd9d196359/disk.device.allocation volume: 512 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.157 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'e3bb9236-c716-4669-9fae-18606be2b789', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '1ba5fce347b64bfebf995f187193f205', 'user_name': None, 'project_id': 'c785bf23f53946bc99867d8832a50266', 'project_name': None, 'resource_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359-vda', 'timestamp': '2025-12-15T09:41:48.155923', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359', 'instance_type': 'm1.small', 'host': 'bf4d507aa00b3fcf643a7e0b883420632ac7fc96e8c7a756982e121d', 'instance_host': 'np0005559462.localdomain', 'flavor': {'id': '2da0e147-aaa7-4bb9-a176-5fe1b15a32a0', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 
'7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e'}, 'image_ref': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '469feb86-d99a-11f0-817e-fa163ebaca0f', 'monotonic_time': 10774.312571191, 'message_signature': '8949c78ee5a06019a7342a67f8c8cbf0f5f9913ff9ac77381901f7ce9916c06c'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 512, 'user_id': '1ba5fce347b64bfebf995f187193f205', 'user_name': None, 'project_id': 'c785bf23f53946bc99867d8832a50266', 'project_name': None, 'resource_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359-vdb', 'timestamp': '2025-12-15T09:41:48.155923', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359', 'instance_type': 'm1.small', 'host': 'bf4d507aa00b3fcf643a7e0b883420632ac7fc96e8c7a756982e121d', 'instance_host': 'np0005559462.localdomain', 'flavor': {'id': '2da0e147-aaa7-4bb9-a176-5fe1b15a32a0', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e'}, 'image_ref': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '469ffc84-d99a-11f0-817e-fa163ebaca0f', 'monotonic_time': 10774.312571191, 'message_signature': '93a0cc4fdf8907b6c40c2ab4152250d47b0cdb9cf35c91a8eeb6998ffb925758'}]}, 'timestamp': '2025-12-15 09:41:48.156846', '_unique_id': 'c1f9fe36d16844cd913f684f03373ecf'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.157 12 ERROR 
oslo_messaging.notify.messaging Traceback (most recent call last): Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.157 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.157 12 ERROR oslo_messaging.notify.messaging yield Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.157 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.157 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.157 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.157 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.157 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.157 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.157 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.157 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Dec 15 04:41:48 localhost 
ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.157 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.157 12 ERROR oslo_messaging.notify.messaging conn.connect() Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.157 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.157 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.157 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.157 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.157 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.157 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.157 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.157 12 ERROR oslo_messaging.notify.messaging Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.157 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 
2025-12-15 09:41:48.157 12 ERROR oslo_messaging.notify.messaging Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.157 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.157 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.157 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.157 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.157 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.157 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.157 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.157 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.157 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.157 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.157 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.157 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.157 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.157 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.157 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.157 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.157 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.157 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.157 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.157 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.157 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.157 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.157 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.157 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.157 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.157 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.157 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.157 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.157 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.157 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 15 04:41:48 localhost 
ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.157 12 ERROR oslo_messaging.notify.messaging Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.159 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.capacity in the context of pollsters Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.159 12 DEBUG ceilometer.compute.pollsters [-] 39ff1bd9-6f6b-44c8-bbec-a1fd9d196359/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.159 12 DEBUG ceilometer.compute.pollsters [-] 39ff1bd9-6f6b-44c8-bbec-a1fd9d196359/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.161 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': 'b35d85aa-6d03-47ab-b936-6a704bc6872a', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '1ba5fce347b64bfebf995f187193f205', 'user_name': None, 'project_id': 'c785bf23f53946bc99867d8832a50266', 'project_name': None, 'resource_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359-vda', 'timestamp': '2025-12-15T09:41:48.159149', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359', 'instance_type': 'm1.small', 'host': 'bf4d507aa00b3fcf643a7e0b883420632ac7fc96e8c7a756982e121d', 'instance_host': 'np0005559462.localdomain', 'flavor': {'id': '2da0e147-aaa7-4bb9-a176-5fe1b15a32a0', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e'}, 'image_ref': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '46a067fa-d99a-11f0-817e-fa163ebaca0f', 'monotonic_time': 10774.312571191, 'message_signature': 'cc739586c80ad45eb64bd35958ef2910772b431ea86fa17fad2e684f2efa61dc'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '1ba5fce347b64bfebf995f187193f205', 'user_name': None, 'project_id': 'c785bf23f53946bc99867d8832a50266', 'project_name': None, 'resource_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359-vdb', 'timestamp': '2025-12-15T09:41:48.159149', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359', 
'instance_type': 'm1.small', 'host': 'bf4d507aa00b3fcf643a7e0b883420632ac7fc96e8c7a756982e121d', 'instance_host': 'np0005559462.localdomain', 'flavor': {'id': '2da0e147-aaa7-4bb9-a176-5fe1b15a32a0', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e'}, 'image_ref': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '46a078ee-d99a-11f0-817e-fa163ebaca0f', 'monotonic_time': 10774.312571191, 'message_signature': 'b6b69e24de14c7959113f53402a3ac12d0eaa5f1b6895f14e01b482b104399d5'}]}, 'timestamp': '2025-12-15 09:41:48.160088', '_unique_id': '8af1a5ce134a4ab482edeaf2a85f24c4'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.161 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.161 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.161 12 ERROR oslo_messaging.notify.messaging yield
Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.161 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.161 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.161 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.161 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.161 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.161 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.161 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.161 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.161 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.161 12 ERROR oslo_messaging.notify.messaging conn.connect()
Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.161 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.161 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.161 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.161 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.161 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.161 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.161 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.161 12 ERROR oslo_messaging.notify.messaging
Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.161 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.161 12 ERROR oslo_messaging.notify.messaging
Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.161 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.161 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.161 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.161 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.161 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.161 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.161 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.161 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.161 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.161 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.161 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.161 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.161 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.161 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.161 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.161 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.161 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.161 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.161 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.161 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.161 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.161 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.161 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.161 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.161 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.161 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.161 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.161 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.161 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.161 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.161 12 ERROR oslo_messaging.notify.messaging
Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.162 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.delta in the context of pollsters
Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.162 12 DEBUG ceilometer.compute.pollsters [-] 39ff1bd9-6f6b-44c8-bbec-a1fd9d196359/network.incoming.bytes.delta volume: 446 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.163 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'ade99e7c-c166-4627-84b0-fd525767e349', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 446, 'user_id': '1ba5fce347b64bfebf995f187193f205', 'user_name': None, 'project_id': 'c785bf23f53946bc99867d8832a50266', 'project_name': None, 'resource_id': 'instance-00000002-39ff1bd9-6f6b-44c8-bbec-a1fd9d196359-tap03ef8889-32', 'timestamp': '2025-12-15T09:41:48.162337', 'resource_metadata': {'display_name': 'test', 'name': 'tap03ef8889-32', 'instance_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359', 'instance_type': 'm1.small', 'host': 'bf4d507aa00b3fcf643a7e0b883420632ac7fc96e8c7a756982e121d', 'instance_host': 'np0005559462.localdomain', 'flavor': {'id': '2da0e147-aaa7-4bb9-a176-5fe1b15a32a0', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e'}, 'image_ref': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:74:df:7c', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap03ef8889-32'}, 'message_id': '46a0e482-d99a-11f0-817e-fa163ebaca0f', 'monotonic_time': 10774.33379426, 'message_signature': '4119d167d8f3c53a9f55d1bb6a035942503f1e4d7cf3fc98c55ddf0938c46715'}]}, 'timestamp': '2025-12-15 09:41:48.162809', '_unique_id': '8d2a7b97f46944789b685f344535b716'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.163 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.163 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.163 12 ERROR oslo_messaging.notify.messaging yield
Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.163 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.163 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.163 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.163 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.163 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.163 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.163 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.163 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.163 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.163 12 ERROR oslo_messaging.notify.messaging conn.connect()
Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.163 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.163 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.163 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.163 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.163 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.163 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.163 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.163 12 ERROR oslo_messaging.notify.messaging
Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.163 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.163 12 ERROR oslo_messaging.notify.messaging
Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.163 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.163 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.163 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.163 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.163 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.163 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.163 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.163 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.163 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.163 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.163 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.163 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.163 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.163 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.163 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.163 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.163 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.163 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.163 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.163 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.163 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.163 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.163 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.163 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.163 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.163 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.163 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.163 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.163 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.163 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.163 12 ERROR oslo_messaging.notify.messaging
Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.165 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.error in the context of pollsters
Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.165 12 DEBUG ceilometer.compute.pollsters [-] 39ff1bd9-6f6b-44c8-bbec-a1fd9d196359/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.166 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '7a9a9a75-b508-4dd8-929e-81c2e399876e', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '1ba5fce347b64bfebf995f187193f205', 'user_name': None, 'project_id': 'c785bf23f53946bc99867d8832a50266', 'project_name': None, 'resource_id': 'instance-00000002-39ff1bd9-6f6b-44c8-bbec-a1fd9d196359-tap03ef8889-32', 'timestamp': '2025-12-15T09:41:48.165352', 'resource_metadata': {'display_name': 'test', 'name': 'tap03ef8889-32', 'instance_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359', 'instance_type': 'm1.small', 'host': 'bf4d507aa00b3fcf643a7e0b883420632ac7fc96e8c7a756982e121d', 'instance_host': 'np0005559462.localdomain', 'flavor': {'id': '2da0e147-aaa7-4bb9-a176-5fe1b15a32a0', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e'}, 'image_ref': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:74:df:7c', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap03ef8889-32'}, 'message_id': '46a15a34-d99a-11f0-817e-fa163ebaca0f', 'monotonic_time': 10774.33379426, 'message_signature': 'ce5a63fdf2955a2fd08732478c393c73b18704196f0961d0707c46b01908319f'}]}, 'timestamp': '2025-12-15 09:41:48.165821', '_unique_id': '42b5457d11a84e619f3f55c599b6deee'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.166 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.166 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.166 12 ERROR oslo_messaging.notify.messaging yield
Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.166 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.166 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.166 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.166 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.166 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.166 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.166 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.166 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.166 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.166 12 ERROR oslo_messaging.notify.messaging conn.connect()
Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.166 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.166 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.166 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.166 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.166 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.166 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.166 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.166 12 ERROR oslo_messaging.notify.messaging
Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.166 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.166 12 ERROR oslo_messaging.notify.messaging
Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.166 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.166 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.166 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.166 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.166 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.166 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.166 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.166 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.166 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.166 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.166 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.166 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.166 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.166 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.166 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.166 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.166 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.166 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.166 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.166 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.166 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.166 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.166 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.166 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.166 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.166 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.166 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.166 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.166 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.166 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.166 12 ERROR oslo_messaging.notify.messaging
Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.167 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no new resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.168 12 INFO ceilometer.polling.manager [-] Polling pollster memory.usage in the context of pollsters
Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.192 12 DEBUG ceilometer.compute.pollsters [-] 39ff1bd9-6f6b-44c8-bbec-a1fd9d196359/memory.usage volume: 52.3125 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.193 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'f3ccd63a-dc79-4497-8f9d-5ff6d415c86c', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'memory.usage', 'counter_type': 'gauge', 'counter_unit': 'MB', 'counter_volume': 52.3125, 'user_id': '1ba5fce347b64bfebf995f187193f205', 'user_name': None, 'project_id': 'c785bf23f53946bc99867d8832a50266', 'project_name': None, 'resource_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359', 'timestamp': '2025-12-15T09:41:48.168197', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359', 'instance_type': 'm1.small', 'host': 'bf4d507aa00b3fcf643a7e0b883420632ac7fc96e8c7a756982e121d', 'instance_host': 'np0005559462.localdomain', 'flavor': {'id': '2da0e147-aaa7-4bb9-a176-5fe1b15a32a0', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e'}, 'image_ref': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0}, 'message_id': '46a56dcc-d99a-11f0-817e-fa163ebaca0f', 'monotonic_time': 10774.384558158, 'message_signature': '3594c4f561738bef937b9f24bb0a1768a8ee42bf9f458610e62788db781cbea1'}]}, 'timestamp': '2025-12-15 09:41:48.192531', '_unique_id': 'd188b8c8ac424dd181eb7ac65203a841'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.193 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.193 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.193 12 ERROR oslo_messaging.notify.messaging yield Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.193 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.193 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.193 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.193 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.193 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.193 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.193 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.193 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.193 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.193 12 ERROR 
oslo_messaging.notify.messaging conn.connect() Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.193 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.193 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.193 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.193 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.193 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.193 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.193 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.193 12 ERROR oslo_messaging.notify.messaging Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.193 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.193 12 ERROR oslo_messaging.notify.messaging Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.193 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 
09:41:48.193 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.193 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.193 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.193 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.193 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.193 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.193 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.193 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.193 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.193 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Dec 15 04:41:48 
localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.193 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.193 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.193 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.193 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.193 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.193 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.193 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.193 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.193 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.193 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 
2025-12-15 09:41:48.193 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.193 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.193 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.193 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.193 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.193 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.193 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.193 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.193 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.193 12 ERROR oslo_messaging.notify.messaging Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.194 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.bytes in the context of pollsters Dec 15 04:41:48 localhost 
ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.230 12 DEBUG ceilometer.compute.pollsters [-] 39ff1bd9-6f6b-44c8-bbec-a1fd9d196359/disk.device.read.bytes volume: 29130240 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.231 12 DEBUG ceilometer.compute.pollsters [-] 39ff1bd9-6f6b-44c8-bbec-a1fd9d196359/disk.device.read.bytes volume: 4300800 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.232 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '521efaf0-6a1d-4139-87af-2210a3766bf2', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 29130240, 'user_id': '1ba5fce347b64bfebf995f187193f205', 'user_name': None, 'project_id': 'c785bf23f53946bc99867d8832a50266', 'project_name': None, 'resource_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359-vda', 'timestamp': '2025-12-15T09:41:48.194918', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359', 'instance_type': 'm1.small', 'host': 'bf4d507aa00b3fcf643a7e0b883420632ac7fc96e8c7a756982e121d', 'instance_host': 'np0005559462.localdomain', 'flavor': {'id': '2da0e147-aaa7-4bb9-a176-5fe1b15a32a0', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e'}, 'image_ref': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 
1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '46ab4af8-d99a-11f0-817e-fa163ebaca0f', 'monotonic_time': 10774.387627856, 'message_signature': '0bd36ac999eafd946fa2dadafae24b1be76274db7c78c65d46a8e500ead38d21'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 4300800, 'user_id': '1ba5fce347b64bfebf995f187193f205', 'user_name': None, 'project_id': 'c785bf23f53946bc99867d8832a50266', 'project_name': None, 'resource_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359-vdb', 'timestamp': '2025-12-15T09:41:48.194918', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359', 'instance_type': 'm1.small', 'host': 'bf4d507aa00b3fcf643a7e0b883420632ac7fc96e8c7a756982e121d', 'instance_host': 'np0005559462.localdomain', 'flavor': {'id': '2da0e147-aaa7-4bb9-a176-5fe1b15a32a0', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e'}, 'image_ref': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '46ab6416-d99a-11f0-817e-fa163ebaca0f', 'monotonic_time': 10774.387627856, 'message_signature': 'd81414496676dc7ccd6141ee7ae2c10b56ed1e050ad16205eb2d77a525a6ea5c'}]}, 'timestamp': '2025-12-15 09:41:48.231602', '_unique_id': 'a4416f00d7054c4dad426ad4b8248155'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.232 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.232 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.232 12 ERROR oslo_messaging.notify.messaging yield Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.232 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.232 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.232 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.232 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.232 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.232 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.232 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.232 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.232 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Dec 15 04:41:48 
localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.232 12 ERROR oslo_messaging.notify.messaging conn.connect() Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.232 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.232 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.232 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.232 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.232 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.232 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.232 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.232 12 ERROR oslo_messaging.notify.messaging Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.232 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.232 12 ERROR oslo_messaging.notify.messaging Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.232 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call 
last): Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.232 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.232 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.232 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.232 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.232 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.232 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.232 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.232 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.232 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.232 12 ERROR oslo_messaging.notify.messaging 
return rpc_common.ConnectionContext(self._connection_pool, Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.232 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.232 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.232 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.232 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.232 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.232 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.232 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.232 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.232 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.232 12 ERROR oslo_messaging.notify.messaging 
self.connection.ensure_connection( Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.232 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.232 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.232 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.232 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.232 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.232 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.232 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.232 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.232 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.232 12 ERROR oslo_messaging.notify.messaging Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.233 12 INFO ceilometer.polling.manager [-] Polling pollster 
network.outgoing.packets.drop in the context of pollsters Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.234 12 DEBUG ceilometer.compute.pollsters [-] 39ff1bd9-6f6b-44c8-bbec-a1fd9d196359/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.235 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '23fc7e37-e149-415c-a1fd-53b6358ded4e', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '1ba5fce347b64bfebf995f187193f205', 'user_name': None, 'project_id': 'c785bf23f53946bc99867d8832a50266', 'project_name': None, 'resource_id': 'instance-00000002-39ff1bd9-6f6b-44c8-bbec-a1fd9d196359-tap03ef8889-32', 'timestamp': '2025-12-15T09:41:48.234106', 'resource_metadata': {'display_name': 'test', 'name': 'tap03ef8889-32', 'instance_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359', 'instance_type': 'm1.small', 'host': 'bf4d507aa00b3fcf643a7e0b883420632ac7fc96e8c7a756982e121d', 'instance_host': 'np0005559462.localdomain', 'flavor': {'id': '2da0e147-aaa7-4bb9-a176-5fe1b15a32a0', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e'}, 'image_ref': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:74:df:7c', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap03ef8889-32'}, 'message_id': 
'46abd8b0-d99a-11f0-817e-fa163ebaca0f', 'monotonic_time': 10774.33379426, 'message_signature': '3703ccb0674660c73a36ca0be243f57b2f529721c55877e1a40b64913f0024f3'}]}, 'timestamp': '2025-12-15 09:41:48.234602', '_unique_id': '204d1b8482f34bb6a557d57f608271bf'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.235 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.235 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.235 12 ERROR oslo_messaging.notify.messaging yield Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.235 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.235 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.235 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.235 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.235 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.235 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Dec 15 04:41:48 
localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.235 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.235 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.235 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.235 12 ERROR oslo_messaging.notify.messaging conn.connect() Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.235 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.235 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.235 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.235 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.235 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.235 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.235 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] 
Connection refused Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.235 12 ERROR oslo_messaging.notify.messaging Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.235 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.235 12 ERROR oslo_messaging.notify.messaging Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.235 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.235 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.235 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.235 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.235 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.235 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.235 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.235 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.235 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.235 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.235 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.235 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.235 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.235 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.235 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.235 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.235 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.235 12 ERROR 
oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.235 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.235 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.235 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.235 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.235 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.235 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.235 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.235 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.235 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.235 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in 
_reraise_as_library_errors Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.235 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.235 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.235 12 ERROR oslo_messaging.notify.messaging Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.236 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.requests in the context of pollsters Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.236 12 DEBUG ceilometer.compute.pollsters [-] 39ff1bd9-6f6b-44c8-bbec-a1fd9d196359/disk.device.write.requests volume: 497 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.237 12 DEBUG ceilometer.compute.pollsters [-] 39ff1bd9-6f6b-44c8-bbec-a1fd9d196359/disk.device.write.requests volume: 1 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.238 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': 'd735ffcd-d9ed-4b21-a3e5-12f2adf84466', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 497, 'user_id': '1ba5fce347b64bfebf995f187193f205', 'user_name': None, 'project_id': 'c785bf23f53946bc99867d8832a50266', 'project_name': None, 'resource_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359-vda', 'timestamp': '2025-12-15T09:41:48.236920', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359', 'instance_type': 'm1.small', 'host': 'bf4d507aa00b3fcf643a7e0b883420632ac7fc96e8c7a756982e121d', 'instance_host': 'np0005559462.localdomain', 'flavor': {'id': '2da0e147-aaa7-4bb9-a176-5fe1b15a32a0', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e'}, 'image_ref': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '46ac47a0-d99a-11f0-817e-fa163ebaca0f', 'monotonic_time': 10774.387627856, 'message_signature': '13870f45fa3cbe984758fcfb0fa411af6d0778328095ebac36dd63b6804f6cc2'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1, 'user_id': '1ba5fce347b64bfebf995f187193f205', 'user_name': None, 'project_id': 'c785bf23f53946bc99867d8832a50266', 'project_name': None, 'resource_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359-vdb', 'timestamp': '2025-12-15T09:41:48.236920', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 
'39ff1bd9-6f6b-44c8-bbec-a1fd9d196359', 'instance_type': 'm1.small', 'host': 'bf4d507aa00b3fcf643a7e0b883420632ac7fc96e8c7a756982e121d', 'instance_host': 'np0005559462.localdomain', 'flavor': {'id': '2da0e147-aaa7-4bb9-a176-5fe1b15a32a0', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e'}, 'image_ref': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '46ac5902-d99a-11f0-817e-fa163ebaca0f', 'monotonic_time': 10774.387627856, 'message_signature': 'd9308aa2a629e318154b574c0535426d1b83d39e47f20374067c9b25d81186d3'}]}, 'timestamp': '2025-12-15 09:41:48.237858', '_unique_id': 'beb149a4f09d4340be157ffd123842c4'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.238 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.238 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.238 12 ERROR oslo_messaging.notify.messaging yield Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.238 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.238 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.238 12 ERROR 
oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.238 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.238 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.238 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.238 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.238 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.238 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.238 12 ERROR oslo_messaging.notify.messaging conn.connect() Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.238 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.238 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.238 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Dec 15 
04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.238 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.238 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.238 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.238 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.238 12 ERROR oslo_messaging.notify.messaging Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.238 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.238 12 ERROR oslo_messaging.notify.messaging Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.238 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.238 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.238 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.238 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Dec 15 04:41:48 localhost 
ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.238 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.238 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.238 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.238 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.238 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.238 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.238 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.238 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.238 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.238 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, 
in get Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.238 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.238 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.238 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.238 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.238 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.238 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.238 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.238 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.238 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.238 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 15 04:41:48 localhost 
ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.238 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.238 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.238 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.238 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.238 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.238 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.238 12 ERROR oslo_messaging.notify.messaging Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.240 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.latency in the context of pollsters Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.240 12 DEBUG ceilometer.compute.pollsters [-] 39ff1bd9-6f6b-44c8-bbec-a1fd9d196359/disk.device.write.latency volume: 213002426 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.241 12 DEBUG ceilometer.compute.pollsters [-] 39ff1bd9-6f6b-44c8-bbec-a1fd9d196359/disk.device.write.latency volume: 24733520 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 15 04:41:48 
localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.242 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '23efe6d6-c576-42a2-b3e9-9c1efad2baf7', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 213002426, 'user_id': '1ba5fce347b64bfebf995f187193f205', 'user_name': None, 'project_id': 'c785bf23f53946bc99867d8832a50266', 'project_name': None, 'resource_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359-vda', 'timestamp': '2025-12-15T09:41:48.240284', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359', 'instance_type': 'm1.small', 'host': 'bf4d507aa00b3fcf643a7e0b883420632ac7fc96e8c7a756982e121d', 'instance_host': 'np0005559462.localdomain', 'flavor': {'id': '2da0e147-aaa7-4bb9-a176-5fe1b15a32a0', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e'}, 'image_ref': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '46accf18-d99a-11f0-817e-fa163ebaca0f', 'monotonic_time': 10774.387627856, 'message_signature': '38f2c36aeabfb44d30ec12c1540f6c7325c526c2d557609e92fffdc28b13d203'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 24733520, 'user_id': '1ba5fce347b64bfebf995f187193f205', 'user_name': None, 'project_id': 'c785bf23f53946bc99867d8832a50266', 'project_name': None, 'resource_id': 
'39ff1bd9-6f6b-44c8-bbec-a1fd9d196359-vdb', 'timestamp': '2025-12-15T09:41:48.240284', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359', 'instance_type': 'm1.small', 'host': 'bf4d507aa00b3fcf643a7e0b883420632ac7fc96e8c7a756982e121d', 'instance_host': 'np0005559462.localdomain', 'flavor': {'id': '2da0e147-aaa7-4bb9-a176-5fe1b15a32a0', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e'}, 'image_ref': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '46ace908-d99a-11f0-817e-fa163ebaca0f', 'monotonic_time': 10774.387627856, 'message_signature': '9426a044ac6bc05b8e7676eb7337fc10fccf3f8dff1fb0a9e8282773964e13c7'}]}, 'timestamp': '2025-12-15 09:41:48.241667', '_unique_id': '8367e41a9489492dab914b5838d6125f'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.242 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.242 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.242 12 ERROR oslo_messaging.notify.messaging yield Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.242 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.242 12 
ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.242 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.242 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.242 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.242 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.242 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.242 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.242 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.242 12 ERROR oslo_messaging.notify.messaging conn.connect() Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.242 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.242 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Dec 15 04:41:48 localhost 
ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.242 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.242 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.242 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.242 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.242 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.242 12 ERROR oslo_messaging.notify.messaging Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.242 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.242 12 ERROR oslo_messaging.notify.messaging Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.242 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.242 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.242 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.242 12 ERROR 
oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.242 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.242 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.242 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.242 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.242 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.242 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.242 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.242 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.242 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Dec 15 04:41:48 localhost 
ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.242 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.242 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.242 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.242 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.242 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.242 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.242 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.242 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.242 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.242 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.242 
12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.242 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.242 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.242 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.242 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.242 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.242 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.242 12 ERROR oslo_messaging.notify.messaging Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.243 12 INFO ceilometer.polling.manager [-] Polling pollster cpu in the context of pollsters Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.244 12 DEBUG ceilometer.compute.pollsters [-] 39ff1bd9-6f6b-44c8-bbec-a1fd9d196359/cpu volume: 57380000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.245 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': '7aa404aa-8d9b-494d-b8d4-05cb821097dd', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 57380000000, 'user_id': '1ba5fce347b64bfebf995f187193f205', 'user_name': None, 'project_id': 'c785bf23f53946bc99867d8832a50266', 'project_name': None, 'resource_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359', 'timestamp': '2025-12-15T09:41:48.244175', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359', 'instance_type': 'm1.small', 'host': 'bf4d507aa00b3fcf643a7e0b883420632ac7fc96e8c7a756982e121d', 'instance_host': 'np0005559462.localdomain', 'flavor': {'id': '2da0e147-aaa7-4bb9-a176-5fe1b15a32a0', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e'}, 'image_ref': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'cpu_number': 1}, 'message_id': '46ad625c-d99a-11f0-817e-fa163ebaca0f', 'monotonic_time': 10774.384558158, 'message_signature': 'f7f2f9a8596d2c772c767c5802002e3ee6d12118a749e8acbb6a93bd82c78e84'}]}, 'timestamp': '2025-12-15 09:41:48.244693', '_unique_id': '9cdaaed6456c4598839974ba6e5cce75'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.245 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.245 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in 
_reraise_as_library_errors Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.245 12 ERROR oslo_messaging.notify.messaging yield Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.245 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.245 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.245 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.245 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.245 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.245 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.245 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.245 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.245 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.245 12 
ERROR oslo_messaging.notify.messaging conn.connect() Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.245 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.245 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.245 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.245 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.245 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.245 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.245 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.245 12 ERROR oslo_messaging.notify.messaging Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.245 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.245 12 ERROR oslo_messaging.notify.messaging Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.245 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 
2025-12-15 09:41:48.245 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.245 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.245 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.245 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.245 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.245 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.245 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.245 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.245 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.245 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Dec 15 
04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.245 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.245 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.245 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.245 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.245 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.245 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.245 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.245 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.245 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.245 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Dec 15 04:41:48 localhost 
ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.245 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.245 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.245 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.245 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.245 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.245 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.245 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.245 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.245 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.245 12 ERROR oslo_messaging.notify.messaging Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.246 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.latency in the context of pollsters Dec 15 04:41:48 
localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.246 12 DEBUG ceilometer.compute.pollsters [-] 39ff1bd9-6f6b-44c8-bbec-a1fd9d196359/disk.device.read.latency volume: 937264501 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.247 12 DEBUG ceilometer.compute.pollsters [-] 39ff1bd9-6f6b-44c8-bbec-a1fd9d196359/disk.device.read.latency volume: 204572919 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.248 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'fd123b37-1559-4629-8c3b-08f4fbe45da0', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 937264501, 'user_id': '1ba5fce347b64bfebf995f187193f205', 'user_name': None, 'project_id': 'c785bf23f53946bc99867d8832a50266', 'project_name': None, 'resource_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359-vda', 'timestamp': '2025-12-15T09:41:48.246928', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359', 'instance_type': 'm1.small', 'host': 'bf4d507aa00b3fcf643a7e0b883420632ac7fc96e8c7a756982e121d', 'instance_host': 'np0005559462.localdomain', 'flavor': {'id': '2da0e147-aaa7-4bb9-a176-5fe1b15a32a0', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e'}, 'image_ref': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 
'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '46adce5e-d99a-11f0-817e-fa163ebaca0f', 'monotonic_time': 10774.387627856, 'message_signature': '2cf64d6d857861ee02d65f9db332d1b33c5e97bdcafb2ef4a007ce129f9d98fe'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 204572919, 'user_id': '1ba5fce347b64bfebf995f187193f205', 'user_name': None, 'project_id': 'c785bf23f53946bc99867d8832a50266', 'project_name': None, 'resource_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359-vdb', 'timestamp': '2025-12-15T09:41:48.246928', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359', 'instance_type': 'm1.small', 'host': 'bf4d507aa00b3fcf643a7e0b883420632ac7fc96e8c7a756982e121d', 'instance_host': 'np0005559462.localdomain', 'flavor': {'id': '2da0e147-aaa7-4bb9-a176-5fe1b15a32a0', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e'}, 'image_ref': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '46addf70-d99a-11f0-817e-fa163ebaca0f', 'monotonic_time': 10774.387627856, 'message_signature': 'fd9bc3bd355fb8f44e71a7da72b78eb6fa6dcb27c5fad87613ea84cda4849fc4'}]}, 'timestamp': '2025-12-15 09:41:48.247888', '_unique_id': 'bbb439a898914ea994ce9a9e2f2d11cf'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.248 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.248 12 ERROR 
oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.248 12 ERROR oslo_messaging.notify.messaging yield Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.248 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.248 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.248 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.248 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.248 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.248 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.248 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.248 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.248 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in 
establish_connection Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.248 12 ERROR oslo_messaging.notify.messaging conn.connect() Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.248 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.248 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.248 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.248 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.248 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.248 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.248 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.248 12 ERROR oslo_messaging.notify.messaging Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.248 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.248 12 ERROR oslo_messaging.notify.messaging Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.248 12 ERROR 
oslo_messaging.notify.messaging Traceback (most recent call last): Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.248 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.248 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.248 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.248 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.248 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.248 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.248 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.248 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.248 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 
2025-12-15 09:41:48.248 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.248 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.248 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.248 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.248 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.248 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.248 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.248 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.248 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.248 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 
09:41:48.248 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.248 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.248 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.248 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.248 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.248 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.248 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.248 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.248 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.248 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.248 12 ERROR oslo_messaging.notify.messaging Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.250 12 INFO 
ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes in the context of pollsters Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.250 12 DEBUG ceilometer.compute.pollsters [-] 39ff1bd9-6f6b-44c8-bbec-a1fd9d196359/network.outgoing.bytes volume: 11272 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.251 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'a17c1196-2267-4ce4-8199-3d06f6815a42', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 11272, 'user_id': '1ba5fce347b64bfebf995f187193f205', 'user_name': None, 'project_id': 'c785bf23f53946bc99867d8832a50266', 'project_name': None, 'resource_id': 'instance-00000002-39ff1bd9-6f6b-44c8-bbec-a1fd9d196359-tap03ef8889-32', 'timestamp': '2025-12-15T09:41:48.250216', 'resource_metadata': {'display_name': 'test', 'name': 'tap03ef8889-32', 'instance_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359', 'instance_type': 'm1.small', 'host': 'bf4d507aa00b3fcf643a7e0b883420632ac7fc96e8c7a756982e121d', 'instance_host': 'np0005559462.localdomain', 'flavor': {'id': '2da0e147-aaa7-4bb9-a176-5fe1b15a32a0', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e'}, 'image_ref': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:74:df:7c', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 
'tap03ef8889-32'}, 'message_id': '46ae4dca-d99a-11f0-817e-fa163ebaca0f', 'monotonic_time': 10774.33379426, 'message_signature': '7586923b5bf73e9cd96a49dd8b8d3496e3b0710352e6cb4036d177ef50e7fba5'}]}, 'timestamp': '2025-12-15 09:41:48.250709', '_unique_id': 'b3204d98cd7449d1be7e434f6d031df1'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.251 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.251 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.251 12 ERROR oslo_messaging.notify.messaging yield Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.251 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.251 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.251 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.251 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.251 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.251 12 ERROR oslo_messaging.notify.messaging self._connection = 
self._establish_connection() Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.251 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.251 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.251 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.251 12 ERROR oslo_messaging.notify.messaging conn.connect() Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.251 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.251 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.251 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.251 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.251 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.251 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.251 12 ERROR 
oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.251 12 ERROR oslo_messaging.notify.messaging Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.251 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.251 12 ERROR oslo_messaging.notify.messaging Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.251 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.251 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.251 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.251 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.251 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.251 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.251 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.251 12 
ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.251 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.251 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.251 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.251 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.251 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.251 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.251 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.251 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.251 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 
09:41:48.251 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.251 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.251 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.251 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.251 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.251 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.251 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.251 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.251 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.251 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.251 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 
450, in _reraise_as_library_errors Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.251 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.251 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.251 12 ERROR oslo_messaging.notify.messaging Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.253 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets in the context of pollsters Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.253 12 DEBUG ceilometer.compute.pollsters [-] 39ff1bd9-6f6b-44c8-bbec-a1fd9d196359/network.incoming.packets volume: 87 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.254 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': '4a3f5f7d-a854-4720-8071-5181f226001e', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 87, 'user_id': '1ba5fce347b64bfebf995f187193f205', 'user_name': None, 'project_id': 'c785bf23f53946bc99867d8832a50266', 'project_name': None, 'resource_id': 'instance-00000002-39ff1bd9-6f6b-44c8-bbec-a1fd9d196359-tap03ef8889-32', 'timestamp': '2025-12-15T09:41:48.253233', 'resource_metadata': {'display_name': 'test', 'name': 'tap03ef8889-32', 'instance_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359', 'instance_type': 'm1.small', 'host': 'bf4d507aa00b3fcf643a7e0b883420632ac7fc96e8c7a756982e121d', 'instance_host': 'np0005559462.localdomain', 'flavor': {'id': '2da0e147-aaa7-4bb9-a176-5fe1b15a32a0', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e'}, 'image_ref': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:74:df:7c', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap03ef8889-32'}, 'message_id': '46aec386-d99a-11f0-817e-fa163ebaca0f', 'monotonic_time': 10774.33379426, 'message_signature': '58cdc46d031a6d48fccbd4c78da26a3ff5776ccacd7c37e0abf65be8d30c17c3'}]}, 'timestamp': '2025-12-15 09:41:48.253723', '_unique_id': 'e2cbac3c80b04de28e73dc050784b972'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.254 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 15 04:41:48 localhost 
ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.254 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.254 12 ERROR oslo_messaging.notify.messaging yield Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.254 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.254 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.254 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.254 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.254 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.254 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.254 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.254 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.254 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.254 12 ERROR oslo_messaging.notify.messaging conn.connect() Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.254 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.254 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.254 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.254 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.254 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.254 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.254 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.254 12 ERROR oslo_messaging.notify.messaging Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.254 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.254 12 ERROR oslo_messaging.notify.messaging Dec 15 04:41:48 localhost 
ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.254 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.254 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.254 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.254 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.254 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.254 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.254 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.254 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.254 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.254 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection 
Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.254 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.254 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.254 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.254 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.254 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.254 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.254 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.254 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.254 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.254 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Dec 15 04:41:48 
localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.254 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.254 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.254 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.254 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.254 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.254 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.254 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.254 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.254 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.254 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.254 12 ERROR oslo_messaging.notify.messaging Dec 15 04:41:48 localhost 
ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.255 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no new resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193 Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.256 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.delta in the context of pollsters Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.256 12 DEBUG ceilometer.compute.pollsters [-] 39ff1bd9-6f6b-44c8-bbec-a1fd9d196359/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.257 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '84b00724-708e-4c88-b3c5-75ed75b0d72a', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '1ba5fce347b64bfebf995f187193f205', 'user_name': None, 'project_id': 'c785bf23f53946bc99867d8832a50266', 'project_name': None, 'resource_id': 'instance-00000002-39ff1bd9-6f6b-44c8-bbec-a1fd9d196359-tap03ef8889-32', 'timestamp': '2025-12-15T09:41:48.256181', 'resource_metadata': {'display_name': 'test', 'name': 'tap03ef8889-32', 'instance_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359', 'instance_type': 'm1.small', 'host': 'bf4d507aa00b3fcf643a7e0b883420632ac7fc96e8c7a756982e121d', 'instance_host': 'np0005559462.localdomain', 'flavor': {'id': '2da0e147-aaa7-4bb9-a176-5fe1b15a32a0', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 
'7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e'}, 'image_ref': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:74:df:7c', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap03ef8889-32'}, 'message_id': '46af37c6-d99a-11f0-817e-fa163ebaca0f', 'monotonic_time': 10774.33379426, 'message_signature': 'e43b004cfb5f6f29d0a54d4329edac7ac172a0e6a13ce96add9bdf22e5a82459'}]}, 'timestamp': '2025-12-15 09:41:48.256733', '_unique_id': '013d160582a843078a9c177bda02a554'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.257 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.257 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.257 12 ERROR oslo_messaging.notify.messaging yield Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.257 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.257 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.257 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.257 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Dec 15 04:41:48 localhost 
ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.257 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.257 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.257 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.257 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.257 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.257 12 ERROR oslo_messaging.notify.messaging conn.connect() Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.257 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.257 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.257 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.257 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.257 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.257 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.257 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.257 12 ERROR oslo_messaging.notify.messaging Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.257 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.257 12 ERROR oslo_messaging.notify.messaging Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.257 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.257 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.257 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.257 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.257 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.257 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.257 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.257 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.257 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.257 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.257 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.257 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.257 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.257 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.257 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.257 12 ERROR oslo_messaging.notify.messaging 
File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.257 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.257 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.257 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.257 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.257 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.257 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.257 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.257 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.257 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.257 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in 
__exit__ Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.257 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.257 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.257 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.257 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.257 12 ERROR oslo_messaging.notify.messaging Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.258 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.bytes in the context of pollsters Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.259 12 DEBUG ceilometer.compute.pollsters [-] 39ff1bd9-6f6b-44c8-bbec-a1fd9d196359/disk.device.write.bytes volume: 73912320 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.259 12 DEBUG ceilometer.compute.pollsters [-] 39ff1bd9-6f6b-44c8-bbec-a1fd9d196359/disk.device.write.bytes volume: 512 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.261 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': '398f789f-dc8e-47cb-861f-e9e832a50850', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 73912320, 'user_id': '1ba5fce347b64bfebf995f187193f205', 'user_name': None, 'project_id': 'c785bf23f53946bc99867d8832a50266', 'project_name': None, 'resource_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359-vda', 'timestamp': '2025-12-15T09:41:48.258951', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359', 'instance_type': 'm1.small', 'host': 'bf4d507aa00b3fcf643a7e0b883420632ac7fc96e8c7a756982e121d', 'instance_host': 'np0005559462.localdomain', 'flavor': {'id': '2da0e147-aaa7-4bb9-a176-5fe1b15a32a0', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e'}, 'image_ref': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '46afa404-d99a-11f0-817e-fa163ebaca0f', 'monotonic_time': 10774.387627856, 'message_signature': '896773db90c0dcf070a383d6d41a5c3685f3356c7566a5db9fb86b45abe426b8'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 512, 'user_id': '1ba5fce347b64bfebf995f187193f205', 'user_name': None, 'project_id': 'c785bf23f53946bc99867d8832a50266', 'project_name': None, 'resource_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359-vdb', 'timestamp': '2025-12-15T09:41:48.258951', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 
'39ff1bd9-6f6b-44c8-bbec-a1fd9d196359', 'instance_type': 'm1.small', 'host': 'bf4d507aa00b3fcf643a7e0b883420632ac7fc96e8c7a756982e121d', 'instance_host': 'np0005559462.localdomain', 'flavor': {'id': '2da0e147-aaa7-4bb9-a176-5fe1b15a32a0', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e'}, 'image_ref': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '46afb516-d99a-11f0-817e-fa163ebaca0f', 'monotonic_time': 10774.387627856, 'message_signature': 'b8c2211b65a46e69f5725c8977577143c20399567cb843a469aa4b15e4a59dfe'}]}, 'timestamp': '2025-12-15 09:41:48.259919', '_unique_id': '92ff26ffe69b44549f677017b309f433'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.261 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.261 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.261 12 ERROR oslo_messaging.notify.messaging yield Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.261 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.261 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.261 12 ERROR 
oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.261 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.261 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.261 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.261 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.261 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.261 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.261 12 ERROR oslo_messaging.notify.messaging conn.connect() Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.261 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.261 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.261 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Dec 15 
04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.261 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.261 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.261 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.261 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.261 12 ERROR oslo_messaging.notify.messaging Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.261 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.261 12 ERROR oslo_messaging.notify.messaging Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.261 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.261 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.261 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.261 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Dec 15 04:41:48 localhost 
ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.261 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.261 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.261 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.261 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.261 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.261 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.261 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.261 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.261 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.261 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, 
in get Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.261 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.261 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.261 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.261 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.261 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.261 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.261 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.261 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.261 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.261 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 15 04:41:48 localhost 
ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.261 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.261 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.261 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.261 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.261 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.261 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.261 12 ERROR oslo_messaging.notify.messaging Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.262 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.requests in the context of pollsters Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.262 12 DEBUG ceilometer.compute.pollsters [-] 39ff1bd9-6f6b-44c8-bbec-a1fd9d196359/disk.device.read.requests volume: 1064 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.263 12 DEBUG ceilometer.compute.pollsters [-] 39ff1bd9-6f6b-44c8-bbec-a1fd9d196359/disk.device.read.requests volume: 222 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 15 04:41:48 localhost 
ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.264 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '8447895b-d2df-4e6c-93a3-22ce45d69431', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1064, 'user_id': '1ba5fce347b64bfebf995f187193f205', 'user_name': None, 'project_id': 'c785bf23f53946bc99867d8832a50266', 'project_name': None, 'resource_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359-vda', 'timestamp': '2025-12-15T09:41:48.262622', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359', 'instance_type': 'm1.small', 'host': 'bf4d507aa00b3fcf643a7e0b883420632ac7fc96e8c7a756982e121d', 'instance_host': 'np0005559462.localdomain', 'flavor': {'id': '2da0e147-aaa7-4bb9-a176-5fe1b15a32a0', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e'}, 'image_ref': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '46b03360-d99a-11f0-817e-fa163ebaca0f', 'monotonic_time': 10774.387627856, 'message_signature': 'ef3b08d806b218c52a68b6d757183cd9062865789bce91019dcff2a23ab3bab2'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 222, 'user_id': '1ba5fce347b64bfebf995f187193f205', 'user_name': None, 'project_id': 'c785bf23f53946bc99867d8832a50266', 'project_name': None, 'resource_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359-vdb', 
'timestamp': '2025-12-15T09:41:48.262622', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359', 'instance_type': 'm1.small', 'host': 'bf4d507aa00b3fcf643a7e0b883420632ac7fc96e8c7a756982e121d', 'instance_host': 'np0005559462.localdomain', 'flavor': {'id': '2da0e147-aaa7-4bb9-a176-5fe1b15a32a0', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e'}, 'image_ref': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '46b04670-d99a-11f0-817e-fa163ebaca0f', 'monotonic_time': 10774.387627856, 'message_signature': 'f830f085195911da178bc7db51cd5ff2ab2c7f895e07c59cc2b43a98cfdf07de'}]}, 'timestamp': '2025-12-15 09:41:48.263595', '_unique_id': '97f474cbcafe422d84fb5f2fbce7a4e8'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.264 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.264 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.264 12 ERROR oslo_messaging.notify.messaging yield Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.264 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.264 12 ERROR oslo_messaging.notify.messaging return 
retry_over_time( Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.264 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.264 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.264 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.264 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.264 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.264 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.264 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.264 12 ERROR oslo_messaging.notify.messaging conn.connect() Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.264 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.264 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.264 12 ERROR 
oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.264 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.264 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.264 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.264 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.264 12 ERROR oslo_messaging.notify.messaging Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.264 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.264 12 ERROR oslo_messaging.notify.messaging Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.264 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.264 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.264 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.264 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.264 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.264 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.264 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.264 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.264 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.264 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.264 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.264 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.264 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.264 
12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.264 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.264 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.264 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.264 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.264 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.264 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.264 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.264 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.264 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.264 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.264 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.264 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.264 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.264 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.264 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.264 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.264 12 ERROR oslo_messaging.notify.messaging Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.265 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.drop in the context of pollsters Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.266 12 DEBUG ceilometer.compute.pollsters [-] 39ff1bd9-6f6b-44c8-bbec-a1fd9d196359/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.267 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': 'ad24c9bb-7d43-4644-a57d-3e4cc5eddc39', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '1ba5fce347b64bfebf995f187193f205', 'user_name': None, 'project_id': 'c785bf23f53946bc99867d8832a50266', 'project_name': None, 'resource_id': 'instance-00000002-39ff1bd9-6f6b-44c8-bbec-a1fd9d196359-tap03ef8889-32', 'timestamp': '2025-12-15T09:41:48.265935', 'resource_metadata': {'display_name': 'test', 'name': 'tap03ef8889-32', 'instance_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359', 'instance_type': 'm1.small', 'host': 'bf4d507aa00b3fcf643a7e0b883420632ac7fc96e8c7a756982e121d', 'instance_host': 'np0005559462.localdomain', 'flavor': {'id': '2da0e147-aaa7-4bb9-a176-5fe1b15a32a0', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e'}, 'image_ref': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:74:df:7c', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap03ef8889-32'}, 'message_id': '46b0b542-d99a-11f0-817e-fa163ebaca0f', 'monotonic_time': 10774.33379426, 'message_signature': '11291bff2853f31b6849925a60dffbe99c5fbb4c6c957ee48125b824d1673acf'}]}, 'timestamp': '2025-12-15 09:41:48.266465', '_unique_id': '1c8fb65e02824b4d8ca6bd5dcc131c60'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.267 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 15 04:41:48 localhost 
ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.267 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.267 12 ERROR oslo_messaging.notify.messaging yield Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.267 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.267 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.267 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.267 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.267 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.267 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.267 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.267 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.267 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.267 12 ERROR oslo_messaging.notify.messaging conn.connect() Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.267 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.267 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.267 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.267 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.267 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.267 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.267 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.267 12 ERROR oslo_messaging.notify.messaging Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.267 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.267 12 ERROR oslo_messaging.notify.messaging Dec 15 04:41:48 localhost 
ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.267 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.267 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.267 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.267 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.267 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.267 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.267 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.267 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.267 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.267 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection 
Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.267 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.267 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.267 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.267 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.267 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.267 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.267 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.267 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.267 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.267 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Dec 15 04:41:48 
localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.267 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.267 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.267 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.267 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.267 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.267 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.267 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.267 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.267 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.267 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.267 12 ERROR oslo_messaging.notify.messaging Dec 15 04:41:48 localhost 
ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.268 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no new resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193 Dec 15 04:41:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:41:48.268 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no new resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193 Dec 15 04:41:48 localhost nova_compute[231752]: 2025-12-15 09:41:48.565 231756 DEBUG oslo_concurrency.lockutils [None req-c7fd4a3f-c95b-4936-a071-10d43fe8d19c - - - - - -] Acquiring lock "refresh_cache-39ff1bd9-6f6b-44c8-bbec-a1fd9d196359" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m Dec 15 04:41:48 localhost nova_compute[231752]: 2025-12-15 09:41:48.565 231756 DEBUG oslo_concurrency.lockutils [None req-c7fd4a3f-c95b-4936-a071-10d43fe8d19c - - - - - -] Acquired lock "refresh_cache-39ff1bd9-6f6b-44c8-bbec-a1fd9d196359" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m Dec 15 04:41:48 localhost nova_compute[231752]: 2025-12-15 09:41:48.565 231756 DEBUG nova.network.neutron [None req-c7fd4a3f-c95b-4936-a071-10d43fe8d19c - - - - - -] [instance: 39ff1bd9-6f6b-44c8-bbec-a1fd9d196359] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m Dec 15 04:41:48 localhost nova_compute[231752]: 2025-12-15 09:41:48.566 231756 DEBUG nova.objects.instance [None req-c7fd4a3f-c95b-4936-a071-10d43fe8d19c - - - - - -] Lazy-loading 'info_cache' on Instance uuid 39ff1bd9-6f6b-44c8-bbec-a1fd9d196359 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m Dec 15 04:41:49 localhost nova_compute[231752]: 2025-12-15 09:41:49.650 231756 DEBUG nova.network.neutron [None 
req-c7fd4a3f-c95b-4936-a071-10d43fe8d19c - - - - - -] [instance: 39ff1bd9-6f6b-44c8-bbec-a1fd9d196359] Updating instance_info_cache with network_info: [{"id": "03ef8889-3216-43fb-8a52-4be17a956ce1", "address": "fa:16:3e:74:df:7c", "network": {"id": "befb7a72-17a9-4bcb-b561-84b8f626685a", "bridge": "br-int", "label": "private", "subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.201", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.20", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.0.3"}}], "meta": {"injected": false, "tenant_id": "c785bf23f53946bc99867d8832a50266", "mtu": 1292, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap03ef8889-32", "ovs_interfaceid": "03ef8889-3216-43fb-8a52-4be17a956ce1", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m Dec 15 04:41:49 localhost nova_compute[231752]: 2025-12-15 09:41:49.668 231756 DEBUG oslo_concurrency.lockutils [None req-c7fd4a3f-c95b-4936-a071-10d43fe8d19c - - - - - -] Releasing lock "refresh_cache-39ff1bd9-6f6b-44c8-bbec-a1fd9d196359" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m Dec 15 04:41:49 localhost nova_compute[231752]: 2025-12-15 09:41:49.669 231756 DEBUG nova.compute.manager [None req-c7fd4a3f-c95b-4936-a071-10d43fe8d19c - - - - - -] [instance: 39ff1bd9-6f6b-44c8-bbec-a1fd9d196359] Updated the network info_cache for instance _heal_instance_info_cache 
/usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m Dec 15 04:41:49 localhost nova_compute[231752]: 2025-12-15 09:41:49.670 231756 DEBUG oslo_service.periodic_task [None req-c7fd4a3f-c95b-4936-a071-10d43fe8d19c - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Dec 15 04:41:49 localhost nova_compute[231752]: 2025-12-15 09:41:49.670 231756 DEBUG oslo_service.periodic_task [None req-c7fd4a3f-c95b-4936-a071-10d43fe8d19c - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Dec 15 04:41:49 localhost nova_compute[231752]: 2025-12-15 09:41:49.670 231756 DEBUG nova.compute.manager [None req-c7fd4a3f-c95b-4936-a071-10d43fe8d19c - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m Dec 15 04:41:49 localhost nova_compute[231752]: 2025-12-15 09:41:49.671 231756 DEBUG oslo_service.periodic_task [None req-c7fd4a3f-c95b-4936-a071-10d43fe8d19c - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Dec 15 04:41:49 localhost nova_compute[231752]: 2025-12-15 09:41:49.689 231756 DEBUG oslo_concurrency.lockutils [None req-c7fd4a3f-c95b-4936-a071-10d43fe8d19c - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Dec 15 04:41:49 localhost nova_compute[231752]: 2025-12-15 09:41:49.690 231756 DEBUG oslo_concurrency.lockutils [None req-c7fd4a3f-c95b-4936-a071-10d43fe8d19c - - - - - -] Lock "compute_resources" acquired by 
"nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Dec 15 04:41:49 localhost nova_compute[231752]: 2025-12-15 09:41:49.690 231756 DEBUG oslo_concurrency.lockutils [None req-c7fd4a3f-c95b-4936-a071-10d43fe8d19c - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Dec 15 04:41:49 localhost nova_compute[231752]: 2025-12-15 09:41:49.691 231756 DEBUG nova.compute.resource_tracker [None req-c7fd4a3f-c95b-4936-a071-10d43fe8d19c - - - - - -] Auditing locally available compute resources for np0005559462.localdomain (node: np0005559462.localdomain) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m Dec 15 04:41:49 localhost nova_compute[231752]: 2025-12-15 09:41:49.691 231756 DEBUG oslo_concurrency.processutils [None req-c7fd4a3f-c95b-4936-a071-10d43fe8d19c - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m Dec 15 04:41:50 localhost nova_compute[231752]: 2025-12-15 09:41:50.127 231756 DEBUG oslo_concurrency.processutils [None req-c7fd4a3f-c95b-4936-a071-10d43fe8d19c - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.436s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m Dec 15 04:41:50 localhost nova_compute[231752]: 2025-12-15 09:41:50.202 231756 DEBUG nova.virt.libvirt.driver [None req-c7fd4a3f-c95b-4936-a071-10d43fe8d19c - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m Dec 15 
04:41:50 localhost nova_compute[231752]: 2025-12-15 09:41:50.202 231756 DEBUG nova.virt.libvirt.driver [None req-c7fd4a3f-c95b-4936-a071-10d43fe8d19c - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m Dec 15 04:41:50 localhost nova_compute[231752]: 2025-12-15 09:41:50.437 231756 WARNING nova.virt.libvirt.driver [None req-c7fd4a3f-c95b-4936-a071-10d43fe8d19c - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m Dec 15 04:41:50 localhost nova_compute[231752]: 2025-12-15 09:41:50.439 231756 DEBUG nova.compute.resource_tracker [None req-c7fd4a3f-c95b-4936-a071-10d43fe8d19c - - - - - -] Hypervisor/Node resource view: name=np0005559462.localdomain free_ram=12175MB free_disk=41.83720779418945GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": 
"pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m Dec 15 04:41:50 localhost nova_compute[231752]: 2025-12-15 09:41:50.439 231756 DEBUG oslo_concurrency.lockutils [None req-c7fd4a3f-c95b-4936-a071-10d43fe8d19c - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Dec 15 04:41:50 localhost nova_compute[231752]: 2025-12-15 09:41:50.440 231756 DEBUG oslo_concurrency.lockutils [None req-c7fd4a3f-c95b-4936-a071-10d43fe8d19c - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Dec 15 04:41:50 localhost nova_compute[231752]: 2025-12-15 09:41:50.536 231756 DEBUG nova.compute.resource_tracker [None req-c7fd4a3f-c95b-4936-a071-10d43fe8d19c - - - - - -] Instance 39ff1bd9-6f6b-44c8-bbec-a1fd9d196359 actively managed on this compute host and has allocations in 
placement: {'resources': {'DISK_GB': 2, 'MEMORY_MB': 512, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m Dec 15 04:41:50 localhost nova_compute[231752]: 2025-12-15 09:41:50.537 231756 DEBUG nova.compute.resource_tracker [None req-c7fd4a3f-c95b-4936-a071-10d43fe8d19c - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m Dec 15 04:41:50 localhost nova_compute[231752]: 2025-12-15 09:41:50.537 231756 DEBUG nova.compute.resource_tracker [None req-c7fd4a3f-c95b-4936-a071-10d43fe8d19c - - - - - -] Final resource view: name=np0005559462.localdomain phys_ram=15738MB used_ram=1024MB phys_disk=41GB used_disk=2GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m Dec 15 04:41:50 localhost nova_compute[231752]: 2025-12-15 09:41:50.596 231756 DEBUG oslo_concurrency.processutils [None req-c7fd4a3f-c95b-4936-a071-10d43fe8d19c - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m Dec 15 04:41:50 localhost ovn_controller[154603]: 2025-12-15T09:41:50Z|00048|memory_trim|INFO|Detected inactivity (last active 30012 ms ago): trimming memory Dec 15 04:41:51 localhost nova_compute[231752]: 2025-12-15 09:41:51.025 231756 DEBUG oslo_concurrency.processutils [None req-c7fd4a3f-c95b-4936-a071-10d43fe8d19c - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.429s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m Dec 15 04:41:51 localhost nova_compute[231752]: 2025-12-15 09:41:51.031 231756 DEBUG nova.compute.provider_tree [None req-c7fd4a3f-c95b-4936-a071-10d43fe8d19c - - - - - -] Inventory 
has not changed in ProviderTree for provider: 26c8956b-6742-4951-b566-971b9bbe323b update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m Dec 15 04:41:51 localhost nova_compute[231752]: 2025-12-15 09:41:51.166 231756 DEBUG nova.scheduler.client.report [None req-c7fd4a3f-c95b-4936-a071-10d43fe8d19c - - - - - -] Inventory has not changed for provider 26c8956b-6742-4951-b566-971b9bbe323b based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m Dec 15 04:41:51 localhost nova_compute[231752]: 2025-12-15 09:41:51.169 231756 DEBUG nova.compute.resource_tracker [None req-c7fd4a3f-c95b-4936-a071-10d43fe8d19c - - - - - -] Compute_service record updated for np0005559462.localdomain:np0005559462.localdomain _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m Dec 15 04:41:51 localhost nova_compute[231752]: 2025-12-15 09:41:51.169 231756 DEBUG oslo_concurrency.lockutils [None req-c7fd4a3f-c95b-4936-a071-10d43fe8d19c - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.729s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Dec 15 04:41:51 localhost ovn_metadata_agent[160585]: 2025-12-15 09:41:51.445 160590 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Dec 15 04:41:51 localhost 
ovn_metadata_agent[160585]: 2025-12-15 09:41:51.446 160590 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Dec 15 04:41:51 localhost ovn_metadata_agent[160585]: 2025-12-15 09:41:51.447 160590 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Dec 15 04:41:51 localhost nova_compute[231752]: 2025-12-15 09:41:51.451 231756 DEBUG oslo_service.periodic_task [None req-c7fd4a3f-c95b-4936-a071-10d43fe8d19c - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Dec 15 04:41:51 localhost nova_compute[231752]: 2025-12-15 09:41:51.452 231756 DEBUG oslo_service.periodic_task [None req-c7fd4a3f-c95b-4936-a071-10d43fe8d19c - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Dec 15 04:41:51 localhost nova_compute[231752]: 2025-12-15 09:41:51.452 231756 DEBUG oslo_service.periodic_task [None req-c7fd4a3f-c95b-4936-a071-10d43fe8d19c - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Dec 15 04:41:51 localhost nova_compute[231752]: 2025-12-15 09:41:51.453 231756 DEBUG oslo_service.periodic_task [None req-c7fd4a3f-c95b-4936-a071-10d43fe8d19c - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Dec 15 04:41:52 localhost systemd[1]: 
Started /usr/bin/podman healthcheck run a4e94bd7aaee6c174c601060fb5f34559019ccea93fc397263ee3735731cfc1e. Dec 15 04:41:52 localhost podman[267763]: 2025-12-15 09:41:52.760879329 +0000 UTC m=+0.086231679 container health_status a4e94bd7aaee6c174c601060fb5f34559019ccea93fc397263ee3735731cfc1e (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, maintainer=Navid Yaghoobi , managed_by=edpm_ansible, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter) Dec 15 04:41:52 localhost podman[267763]: 2025-12-15 09:41:52.774539201 +0000 UTC m=+0.099891541 container exec_died a4e94bd7aaee6c174c601060fb5f34559019ccea93fc397263ee3735731cfc1e (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 
'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}) Dec 15 04:41:52 localhost systemd[1]: a4e94bd7aaee6c174c601060fb5f34559019ccea93fc397263ee3735731cfc1e.service: Deactivated successfully. Dec 15 04:41:52 localhost nova_compute[231752]: 2025-12-15 09:41:52.860 231756 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Dec 15 04:41:52 localhost nova_compute[231752]: 2025-12-15 09:41:52.861 231756 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Dec 15 04:41:52 localhost nova_compute[231752]: 2025-12-15 09:41:52.861 231756 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5001 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m Dec 15 04:41:52 localhost nova_compute[231752]: 2025-12-15 09:41:52.861 231756 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Dec 15 04:41:52 localhost nova_compute[231752]: 2025-12-15 09:41:52.880 231756 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 04:41:52 localhost nova_compute[231752]: 2025-12-15 09:41:52.881 231756 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Dec 15 04:41:52 localhost nova_compute[231752]: 2025-12-15 09:41:52.886 231756 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 04:41:54 localhost kernel: DROPPING: IN=br-ex OUT= 
MACSRC=fa:16:3e:fc:1f:13 MACDST=fa:16:3e:b3:bb:e3 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=49852 DF PROTO=TCP SPT=38476 DPT=9102 SEQ=2681177536 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A83A593510000000001030307) Dec 15 04:41:55 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:fc:1f:13 MACDST=fa:16:3e:b3:bb:e3 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=49853 DF PROTO=TCP SPT=38476 DPT=9102 SEQ=2681177536 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A83A597650000000001030307) Dec 15 04:41:56 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:fc:1f:13 MACDST=fa:16:3e:b3:bb:e3 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=15395 DF PROTO=TCP SPT=56108 DPT=9102 SEQ=2608436167 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A83A59B250000000001030307) Dec 15 04:41:57 localhost nova_compute[231752]: 2025-12-15 09:41:57.883 231756 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 04:41:57 localhost nova_compute[231752]: 2025-12-15 09:41:57.888 231756 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 04:41:57 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:fc:1f:13 MACDST=fa:16:3e:b3:bb:e3 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=49854 DF PROTO=TCP SPT=38476 DPT=9102 SEQ=2681177536 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A83A59F650000000001030307) Dec 15 04:41:58 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:fc:1f:13 MACDST=fa:16:3e:b3:bb:e3 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=29113 DF PROTO=TCP SPT=53922 DPT=9102 SEQ=3658843121 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 
OPT (020405500402080A83A5A3250000000001030307) Dec 15 04:42:01 localhost podman[243449]: time="2025-12-15T09:42:01Z" level=info msg="List containers: received `last` parameter - overwriting `limit`" Dec 15 04:42:01 localhost podman[243449]: @ - - [15/Dec/2025:09:42:01 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 149352 "" "Go-http-client/1.1" Dec 15 04:42:01 localhost podman[243449]: @ - - [15/Dec/2025:09:42:01 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 17222 "" "Go-http-client/1.1" Dec 15 04:42:01 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:fc:1f:13 MACDST=fa:16:3e:b3:bb:e3 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=49855 DF PROTO=TCP SPT=38476 DPT=9102 SEQ=2681177536 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A83A5AF250000000001030307) Dec 15 04:42:02 localhost systemd[1]: Started /usr/bin/podman healthcheck run 67ee72fcd99c896f79e8807764b1d0f04e7d45927d06e306c0986394e8ed42a0. Dec 15 04:42:02 localhost systemd[1]: Started /usr/bin/podman healthcheck run 9f0f481c937606298203797a71924b7614a5d9e3f4a8afd837cfa5b9602aedeb. 
Dec 15 04:42:02 localhost podman[267873]: 2025-12-15 09:42:02.759572957 +0000 UTC m=+0.086346811 container health_status 9f0f481c937606298203797a71924b7614a5d9e3f4a8afd837cfa5b9602aedeb (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_id=multipathd, container_name=multipathd, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team) Dec 15 04:42:02 localhost podman[267873]: 2025-12-15 09:42:02.801294346 +0000 UTC m=+0.128068180 container exec_died 
9f0f481c937606298203797a71924b7614a5d9e3f4a8afd837cfa5b9602aedeb (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, container_name=multipathd, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, org.label-schema.vendor=CentOS) Dec 15 04:42:02 localhost systemd[1]: 9f0f481c937606298203797a71924b7614a5d9e3f4a8afd837cfa5b9602aedeb.service: Deactivated successfully. 
Dec 15 04:42:02 localhost podman[267872]: 2025-12-15 09:42:02.817726448 +0000 UTC m=+0.145993446 container health_status 67ee72fcd99c896f79e8807764b1d0f04e7d45927d06e306c0986394e8ed42a0 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, managed_by=edpm_ansible, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors ) Dec 15 04:42:02 localhost podman[267872]: 2025-12-15 09:42:02.831485833 +0000 UTC m=+0.159752831 container exec_died 67ee72fcd99c896f79e8807764b1d0f04e7d45927d06e306c0986394e8ed42a0 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, managed_by=edpm_ansible, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', 
'--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors ) Dec 15 04:42:02 localhost systemd[1]: 67ee72fcd99c896f79e8807764b1d0f04e7d45927d06e306c0986394e8ed42a0.service: Deactivated successfully. 
Dec 15 04:42:02 localhost nova_compute[231752]: 2025-12-15 09:42:02.921 231756 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Dec 15 04:42:02 localhost nova_compute[231752]: 2025-12-15 09:42:02.923 231756 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 04:42:02 localhost nova_compute[231752]: 2025-12-15 09:42:02.923 231756 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5034 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m Dec 15 04:42:02 localhost nova_compute[231752]: 2025-12-15 09:42:02.923 231756 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Dec 15 04:42:02 localhost nova_compute[231752]: 2025-12-15 09:42:02.924 231756 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Dec 15 04:42:02 localhost nova_compute[231752]: 2025-12-15 09:42:02.929 231756 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 04:42:04 localhost systemd[1]: Started /usr/bin/podman healthcheck run b3d8483ade539500e228c2af9c0c3e647febb5ec7903610a83aea32631007d4a. 
Dec 15 04:42:04 localhost podman[267917]: 2025-12-15 09:42:04.750587236 +0000 UTC m=+0.083483710 container health_status b3d8483ade539500e228c2af9c0c3e647febb5ec7903610a83aea32631007d4a (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, config_id=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '07f3af15d79276418f54a8dde2fc251064527c3571430e14af77aaa2c01f1193-cb1e9478634a00c8fb5ad9cd7fcd38c7b2a1a85187dfaa618bc715c24c2b15fb'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, tcib_managed=true, 
container_name=ceilometer_agent_compute) Dec 15 04:42:04 localhost podman[267917]: 2025-12-15 09:42:04.763436075 +0000 UTC m=+0.096332589 container exec_died b3d8483ade539500e228c2af9c0c3e647febb5ec7903610a83aea32631007d4a (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, config_id=ceilometer_agent_compute, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '07f3af15d79276418f54a8dde2fc251064527c3571430e14af77aaa2c01f1193-cb1e9478634a00c8fb5ad9cd7fcd38c7b2a1a85187dfaa618bc715c24c2b15fb'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, 
org.label-schema.schema-version=1.0) Dec 15 04:42:04 localhost systemd[1]: b3d8483ade539500e228c2af9c0c3e647febb5ec7903610a83aea32631007d4a.service: Deactivated successfully. Dec 15 04:42:04 localhost openstack_network_exporter[246484]: ERROR 09:42:04 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server Dec 15 04:42:04 localhost openstack_network_exporter[246484]: ERROR 09:42:04 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd Dec 15 04:42:04 localhost openstack_network_exporter[246484]: ERROR 09:42:04 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd Dec 15 04:42:04 localhost openstack_network_exporter[246484]: ERROR 09:42:04 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath Dec 15 04:42:04 localhost openstack_network_exporter[246484]: Dec 15 04:42:04 localhost openstack_network_exporter[246484]: ERROR 09:42:04 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath Dec 15 04:42:04 localhost openstack_network_exporter[246484]: Dec 15 04:42:07 localhost nova_compute[231752]: 2025-12-15 09:42:07.930 231756 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 04:42:08 localhost systemd[1]: Started /usr/bin/podman healthcheck run 730c50020a7d9e2da21d2462d40af20ac303e43f3491f692cbbd87f432ba3e09. Dec 15 04:42:08 localhost systemd[1]: tmp-crun.0TkyEg.mount: Deactivated successfully. 
Dec 15 04:42:08 localhost podman[267936]: 2025-12-15 09:42:08.75200385 +0000 UTC m=+0.086048412 container health_status 730c50020a7d9e2da21d2462d40af20ac303e43f3491f692cbbd87f432ba3e09 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://catalog.redhat.com/en/search?searchType=containers, vendor=Red Hat, Inc., distribution-scope=public, io.buildah.version=1.33.7, container_name=openstack_network_exporter, version=9.6, build-date=2025-08-20T13:12:41, architecture=x86_64, com.redhat.component=ubi9-minimal-container, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'cb1e9478634a00c8fb5ad9cd7fcd38c7b2a1a85187dfaa618bc715c24c2b15fb'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. 
This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-type=git, config_id=openstack_network_exporter, managed_by=edpm_ansible, release=1755695350, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, io.openshift.tags=minimal rhel9, maintainer=Red Hat, Inc., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, name=ubi9-minimal, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal) Dec 15 04:42:08 localhost podman[267936]: 2025-12-15 09:42:08.764472579 +0000 UTC m=+0.098517121 container exec_died 730c50020a7d9e2da21d2462d40af20ac303e43f3491f692cbbd87f432ba3e09 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, vcs-type=git, vendor=Red Hat, Inc., container_name=openstack_network_exporter, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=Red Hat, Inc., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, managed_by=edpm_ansible, release=1755695350, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'cb1e9478634a00c8fb5ad9cd7fcd38c7b2a1a85187dfaa618bc715c24c2b15fb'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 
'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, name=ubi9-minimal, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, com.redhat.component=ubi9-minimal-container, url=https://catalog.redhat.com/en/search?searchType=containers, config_id=openstack_network_exporter, distribution-scope=public, io.buildah.version=1.33.7, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., version=9.6, io.openshift.expose-services=, architecture=x86_64, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., build-date=2025-08-20T13:12:41) Dec 15 04:42:08 localhost systemd[1]: 730c50020a7d9e2da21d2462d40af20ac303e43f3491f692cbbd87f432ba3e09.service: Deactivated successfully. 
Dec 15 04:42:10 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:fc:1f:13 MACDST=fa:16:3e:b3:bb:e3 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=49856 DF PROTO=TCP SPT=38476 DPT=9102 SEQ=2681177536 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A83A5CF250000000001030307) Dec 15 04:42:10 localhost sshd[267957]: main: sshd: ssh-rsa algorithm is disabled Dec 15 04:42:10 localhost systemd[1]: Started /usr/bin/podman healthcheck run ac2eae30a2a7f971398af42ea67b3ed799d4b3341f7688ebcd73f7ca7f66c2a8. Dec 15 04:42:10 localhost systemd-logind[763]: New session 59 of user zuul. Dec 15 04:42:10 localhost systemd[1]: Started Session 59 of User zuul. Dec 15 04:42:10 localhost podman[267959]: 2025-12-15 09:42:10.405957705 +0000 UTC m=+0.083440239 container health_status ac2eae30a2a7f971398af42ea67b3ed799d4b3341f7688ebcd73f7ca7f66c2a8 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '07f3af15d79276418f54a8dde2fc251064527c3571430e14af77aaa2c01f1193'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, config_id=ovn_controller, org.label-schema.license=GPLv2, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true) Dec 15 04:42:10 localhost podman[267959]: 2025-12-15 09:42:10.481905367 +0000 UTC m=+0.159387851 container exec_died ac2eae30a2a7f971398af42ea67b3ed799d4b3341f7688ebcd73f7ca7f66c2a8 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, config_id=ovn_controller, container_name=ovn_controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '07f3af15d79276418f54a8dde2fc251064527c3571430e14af77aaa2c01f1193'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.build-date=20251202) Dec 15 04:42:10 localhost systemd[1]: ac2eae30a2a7f971398af42ea67b3ed799d4b3341f7688ebcd73f7ca7f66c2a8.service: Deactivated successfully. 
Dec 15 04:42:11 localhost python3.9[268091]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d Dec 15 04:42:12 localhost python3.9[268203]: ansible-ansible.builtin.service_facts Invoked Dec 15 04:42:12 localhost network[268220]: You are using 'network' service provided by 'network-scripts', which are now deprecated. Dec 15 04:42:12 localhost network[268221]: 'network-scripts' will be removed from distribution in near future. Dec 15 04:42:12 localhost network[268222]: It is advised to switch to 'NetworkManager' instead for network management. Dec 15 04:42:12 localhost nova_compute[231752]: 2025-12-15 09:42:12.932 231756 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Dec 15 04:42:12 localhost nova_compute[231752]: 2025-12-15 09:42:12.934 231756 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Dec 15 04:42:12 localhost nova_compute[231752]: 2025-12-15 09:42:12.934 231756 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m Dec 15 04:42:12 localhost nova_compute[231752]: 2025-12-15 09:42:12.934 231756 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Dec 15 04:42:12 localhost nova_compute[231752]: 2025-12-15 09:42:12.952 231756 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 04:42:12 localhost nova_compute[231752]: 2025-12-15 09:42:12.952 231756 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Dec 15 04:42:14 localhost 
systemd[1]: /usr/lib/systemd/system/insights-client.service:23: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Dec 15 04:42:15 localhost systemd[1]: Started /usr/bin/podman healthcheck run 4d46c406a3855fd1c322e5d86c048188ea5865daa07ba23aaebacc7879668923. Dec 15 04:42:15 localhost podman[268307]: 2025-12-15 09:42:15.896321082 +0000 UTC m=+0.089874613 container health_status 4d46c406a3855fd1c322e5d86c048188ea5865daa07ba23aaebacc7879668923 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '07f3af15d79276418f54a8dde2fc251064527c3571430e14af77aaa2c01f1193-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.build-date=20251202, 
tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, managed_by=edpm_ansible) Dec 15 04:42:15 localhost podman[268307]: 2025-12-15 09:42:15.906518015 +0000 UTC m=+0.100071526 container exec_died 4d46c406a3855fd1c322e5d86c048188ea5865daa07ba23aaebacc7879668923 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, tcib_managed=true, config_id=ovn_metadata_agent, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, container_name=ovn_metadata_agent, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '07f3af15d79276418f54a8dde2fc251064527c3571430e14af77aaa2c01f1193-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', 
'/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.license=GPLv2) Dec 15 04:42:15 localhost systemd[1]: 4d46c406a3855fd1c322e5d86c048188ea5865daa07ba23aaebacc7879668923.service: Deactivated successfully. Dec 15 04:42:17 localhost nova_compute[231752]: 2025-12-15 09:42:17.954 231756 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Dec 15 04:42:17 localhost nova_compute[231752]: 2025-12-15 09:42:17.993 231756 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Dec 15 04:42:17 localhost nova_compute[231752]: 2025-12-15 09:42:17.993 231756 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5041 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m Dec 15 04:42:17 localhost nova_compute[231752]: 2025-12-15 09:42:17.994 231756 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Dec 15 04:42:17 localhost nova_compute[231752]: 2025-12-15 09:42:17.995 231756 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Dec 15 04:42:17 localhost nova_compute[231752]: 2025-12-15 09:42:17.998 231756 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 04:42:18 localhost python3.9[268474]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d Dec 15 04:42:19 localhost python3.9[268537]: ansible-ansible.legacy.dnf Invoked with name=['iscsi-initiator-utils'] 
state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None Dec 15 04:42:22 localhost python3.9[268649]: ansible-ansible.builtin.stat Invoked with path=/var/lib/config-data/puppet-generated/iscsid/etc/iscsi follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1 Dec 15 04:42:22 localhost nova_compute[231752]: 2025-12-15 09:42:22.998 231756 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Dec 15 04:42:23 localhost nova_compute[231752]: 2025-12-15 09:42:23.000 231756 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Dec 15 04:42:23 localhost nova_compute[231752]: 2025-12-15 09:42:23.000 231756 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m Dec 15 04:42:23 localhost nova_compute[231752]: 2025-12-15 09:42:23.001 231756 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Dec 15 04:42:23 localhost nova_compute[231752]: 2025-12-15 09:42:23.042 231756 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 04:42:23 localhost nova_compute[231752]: 2025-12-15 09:42:23.043 231756 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: 
entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Dec 15 04:42:23 localhost systemd[1]: Started /usr/bin/podman healthcheck run a4e94bd7aaee6c174c601060fb5f34559019ccea93fc397263ee3735731cfc1e. Dec 15 04:42:23 localhost podman[268705]: 2025-12-15 09:42:23.411706116 +0000 UTC m=+0.083107239 container health_status a4e94bd7aaee6c174c601060fb5f34559019ccea93fc397263ee3735731cfc1e (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, maintainer=Navid Yaghoobi , managed_by=edpm_ansible, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter) Dec 15 04:42:23 localhost podman[268705]: 2025-12-15 09:42:23.424206675 +0000 UTC m=+0.095607778 container exec_died a4e94bd7aaee6c174c601060fb5f34559019ccea93fc397263ee3735731cfc1e (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 
'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible) Dec 15 04:42:23 localhost systemd[1]: a4e94bd7aaee6c174c601060fb5f34559019ccea93fc397263ee3735731cfc1e.service: Deactivated successfully. Dec 15 04:42:23 localhost python3.9[268782]: ansible-ansible.legacy.command Invoked with _raw_params=/usr/sbin/restorecon -nvr /etc/iscsi /var/lib/iscsi _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None Dec 15 04:42:24 localhost python3.9[268893]: ansible-ansible.builtin.stat Invoked with path=/etc/iscsi/.initiator_reset follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1 Dec 15 04:42:24 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:fc:1f:13 MACDST=fa:16:3e:b3:bb:e3 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=17479 DF PROTO=TCP SPT=36494 DPT=9102 SEQ=4291295771 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A83A608820000000001030307) Dec 15 04:42:25 localhost python3.9[269005]: ansible-ansible.builtin.lineinfile Invoked with insertafter=^#node.session.auth.chap.algs line=node.session.auth.chap_algs = SHA3-256,SHA256,SHA1,MD5 path=/etc/iscsi/iscsid.conf regexp=^node.session.auth.chap_algs state=present encoding=utf-8 backrefs=False create=False backup=False firstmatch=False unsafe_writes=False search_string=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None 
selevel=None setype=None attributes=None Dec 15 04:42:25 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:fc:1f:13 MACDST=fa:16:3e:b3:bb:e3 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=17480 DF PROTO=TCP SPT=36494 DPT=9102 SEQ=4291295771 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A83A60CA50000000001030307) Dec 15 04:42:26 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:fc:1f:13 MACDST=fa:16:3e:b3:bb:e3 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=49857 DF PROTO=TCP SPT=38476 DPT=9102 SEQ=2681177536 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A83A60F250000000001030307) Dec 15 04:42:26 localhost python3.9[269115]: ansible-ansible.builtin.systemd_service Invoked with enabled=True name=iscsid.socket state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None Dec 15 04:42:27 localhost python3.9[269227]: ansible-ansible.builtin.systemd_service Invoked with enabled=True name=iscsid state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None Dec 15 04:42:27 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:fc:1f:13 MACDST=fa:16:3e:b3:bb:e3 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=17481 DF PROTO=TCP SPT=36494 DPT=9102 SEQ=4291295771 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A83A614A50000000001030307) Dec 15 04:42:28 localhost nova_compute[231752]: 2025-12-15 09:42:28.044 231756 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Dec 15 04:42:28 localhost nova_compute[231752]: 2025-12-15 09:42:28.045 231756 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Dec 15 04:42:28 localhost nova_compute[231752]: 2025-12-15 09:42:28.046 
231756 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m Dec 15 04:42:28 localhost nova_compute[231752]: 2025-12-15 09:42:28.046 231756 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Dec 15 04:42:28 localhost nova_compute[231752]: 2025-12-15 09:42:28.081 231756 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 04:42:28 localhost nova_compute[231752]: 2025-12-15 09:42:28.082 231756 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Dec 15 04:42:29 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:fc:1f:13 MACDST=fa:16:3e:b3:bb:e3 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=15396 DF PROTO=TCP SPT=56108 DPT=9102 SEQ=2608436167 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A83A619250000000001030307) Dec 15 04:42:29 localhost python3.9[269339]: ansible-ansible.builtin.service_facts Invoked Dec 15 04:42:29 localhost network[269356]: You are using 'network' service provided by 'network-scripts', which are now deprecated. Dec 15 04:42:29 localhost network[269357]: 'network-scripts' will be removed from distribution in near future. Dec 15 04:42:29 localhost network[269358]: It is advised to switch to 'NetworkManager' instead for network management. Dec 15 04:42:30 localhost systemd[1]: /usr/lib/systemd/system/insights-client.service:23: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. 
Dec 15 04:42:31 localhost podman[243449]: time="2025-12-15T09:42:31Z" level=info msg="List containers: received `last` parameter - overwriting `limit`" Dec 15 04:42:31 localhost podman[243449]: @ - - [15/Dec/2025:09:42:31 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 149352 "" "Go-http-client/1.1" Dec 15 04:42:31 localhost podman[243449]: @ - - [15/Dec/2025:09:42:31 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 17226 "" "Go-http-client/1.1" Dec 15 04:42:31 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:fc:1f:13 MACDST=fa:16:3e:b3:bb:e3 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=17482 DF PROTO=TCP SPT=36494 DPT=9102 SEQ=4291295771 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A83A624660000000001030307) Dec 15 04:42:33 localhost nova_compute[231752]: 2025-12-15 09:42:33.083 231756 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Dec 15 04:42:33 localhost nova_compute[231752]: 2025-12-15 09:42:33.123 231756 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Dec 15 04:42:33 localhost nova_compute[231752]: 2025-12-15 09:42:33.123 231756 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5041 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m Dec 15 04:42:33 localhost nova_compute[231752]: 2025-12-15 09:42:33.123 231756 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Dec 15 04:42:33 localhost nova_compute[231752]: 2025-12-15 09:42:33.128 231756 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup 
/usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 04:42:33 localhost nova_compute[231752]: 2025-12-15 09:42:33.128 231756 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Dec 15 04:42:33 localhost systemd[1]: Started /usr/bin/podman healthcheck run 67ee72fcd99c896f79e8807764b1d0f04e7d45927d06e306c0986394e8ed42a0. Dec 15 04:42:33 localhost systemd[1]: Started /usr/bin/podman healthcheck run 9f0f481c937606298203797a71924b7614a5d9e3f4a8afd837cfa5b9602aedeb. Dec 15 04:42:33 localhost systemd[1]: tmp-crun.0jlefN.mount: Deactivated successfully. Dec 15 04:42:33 localhost podman[269440]: 2025-12-15 09:42:33.773606299 +0000 UTC m=+0.097300786 container health_status 9f0f481c937606298203797a71924b7614a5d9e3f4a8afd837cfa5b9602aedeb (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, container_name=multipathd, config_id=multipathd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}) Dec 15 04:42:33 localhost podman[269439]: 2025-12-15 09:42:33.819760966 +0000 UTC m=+0.144404321 container health_status 67ee72fcd99c896f79e8807764b1d0f04e7d45927d06e306c0986394e8ed42a0 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': 
['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}) Dec 15 04:42:33 localhost podman[269439]: 2025-12-15 09:42:33.827767766 +0000 UTC m=+0.152411131 container exec_died 67ee72fcd99c896f79e8807764b1d0f04e7d45927d06e306c0986394e8ed42a0 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}) Dec 15 04:42:33 localhost podman[269440]: 2025-12-15 09:42:33.838741822 +0000 UTC m=+0.162436359 container exec_died 9f0f481c937606298203797a71924b7614a5d9e3f4a8afd837cfa5b9602aedeb (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes 
Operator team, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, config_id=multipathd, tcib_managed=true, container_name=multipathd, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}) Dec 15 04:42:33 localhost systemd[1]: 67ee72fcd99c896f79e8807764b1d0f04e7d45927d06e306c0986394e8ed42a0.service: Deactivated successfully. Dec 15 04:42:33 localhost systemd[1]: 9f0f481c937606298203797a71924b7614a5d9e3f4a8afd837cfa5b9602aedeb.service: Deactivated successfully. 
Dec 15 04:42:34 localhost openstack_network_exporter[246484]: ERROR 09:42:34 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd Dec 15 04:42:34 localhost openstack_network_exporter[246484]: ERROR 09:42:34 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd Dec 15 04:42:34 localhost openstack_network_exporter[246484]: ERROR 09:42:34 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server Dec 15 04:42:34 localhost openstack_network_exporter[246484]: ERROR 09:42:34 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath Dec 15 04:42:34 localhost openstack_network_exporter[246484]: Dec 15 04:42:34 localhost openstack_network_exporter[246484]: ERROR 09:42:34 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath Dec 15 04:42:34 localhost openstack_network_exporter[246484]: Dec 15 04:42:35 localhost systemd[1]: Started /usr/bin/podman healthcheck run b3d8483ade539500e228c2af9c0c3e647febb5ec7903610a83aea32631007d4a. 
Dec 15 04:42:35 localhost podman[269632]: 2025-12-15 09:42:35.509829487 +0000 UTC m=+0.088862364 container health_status b3d8483ade539500e228c2af9c0c3e647febb5ec7903610a83aea32631007d4a (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, tcib_managed=true, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '07f3af15d79276418f54a8dde2fc251064527c3571430e14af77aaa2c01f1193-cb1e9478634a00c8fb5ad9cd7fcd38c7b2a1a85187dfaa618bc715c24c2b15fb'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_id=ceilometer_agent_compute, 
org.label-schema.vendor=CentOS) Dec 15 04:42:35 localhost podman[269632]: 2025-12-15 09:42:35.550325832 +0000 UTC m=+0.129358689 container exec_died b3d8483ade539500e228c2af9c0c3e647febb5ec7903610a83aea32631007d4a (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '07f3af15d79276418f54a8dde2fc251064527c3571430e14af77aaa2c01f1193-cb1e9478634a00c8fb5ad9cd7fcd38c7b2a1a85187dfaa618bc715c24c2b15fb'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, container_name=ceilometer_agent_compute, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, config_id=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3, 
org.label-schema.license=GPLv2) Dec 15 04:42:35 localhost systemd[1]: b3d8483ade539500e228c2af9c0c3e647febb5ec7903610a83aea32631007d4a.service: Deactivated successfully. Dec 15 04:42:35 localhost python3.9[269631]: ansible-ansible.builtin.file Invoked with mode=0755 path=/etc/modules-load.d selevel=s0 setype=etc_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None attributes=None Dec 15 04:42:36 localhost python3.9[269764]: ansible-community.general.modprobe Invoked with name=dm-multipath state=present params= persistent=disabled Dec 15 04:42:37 localhost python3.9[269874]: ansible-ansible.legacy.stat Invoked with path=/etc/modules-load.d/dm-multipath.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Dec 15 04:42:37 localhost python3.9[269931]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/etc/modules-load.d/dm-multipath.conf _original_basename=module-load.conf.j2 recurse=False state=file path=/etc/modules-load.d/dm-multipath.conf force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Dec 15 04:42:38 localhost nova_compute[231752]: 2025-12-15 09:42:38.129 231756 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Dec 15 04:42:38 localhost nova_compute[231752]: 2025-12-15 09:42:38.131 231756 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Dec 15 04:42:38 localhost 
nova_compute[231752]: 2025-12-15 09:42:38.131 231756 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m Dec 15 04:42:38 localhost nova_compute[231752]: 2025-12-15 09:42:38.132 231756 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Dec 15 04:42:38 localhost nova_compute[231752]: 2025-12-15 09:42:38.165 231756 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 04:42:38 localhost nova_compute[231752]: 2025-12-15 09:42:38.165 231756 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Dec 15 04:42:38 localhost python3.9[270041]: ansible-ansible.builtin.lineinfile Invoked with create=True dest=/etc/modules line=dm-multipath mode=0644 state=present path=/etc/modules encoding=utf-8 backrefs=False backup=False firstmatch=False unsafe_writes=False regexp=None search_string=None insertafter=None insertbefore=None validate=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Dec 15 04:42:38 localhost python3.9[270151]: ansible-ansible.builtin.file Invoked with mode=0755 path=/etc/multipath setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None attributes=None Dec 15 04:42:39 localhost systemd[1]: Started /usr/bin/podman healthcheck run 730c50020a7d9e2da21d2462d40af20ac303e43f3491f692cbbd87f432ba3e09. 
Dec 15 04:42:39 localhost systemd[1]: tmp-crun.VPVzZG.mount: Deactivated successfully. Dec 15 04:42:39 localhost podman[270261]: 2025-12-15 09:42:39.668044357 +0000 UTC m=+0.094014022 container health_status 730c50020a7d9e2da21d2462d40af20ac303e43f3491f692cbbd87f432ba3e09 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, vendor=Red Hat, Inc., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., release=1755695350, managed_by=edpm_ansible, name=ubi9-minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-type=git, io.buildah.version=1.33.7, build-date=2025-08-20T13:12:41, com.redhat.component=ubi9-minimal-container, io.openshift.tags=minimal rhel9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, io.openshift.expose-services=, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, container_name=openstack_network_exporter, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., architecture=x86_64, version=9.6, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'cb1e9478634a00c8fb5ad9cd7fcd38c7b2a1a85187dfaa618bc715c24c2b15fb'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, config_id=openstack_network_exporter, maintainer=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers) Dec 15 04:42:39 localhost podman[270261]: 2025-12-15 09:42:39.685583792 +0000 UTC m=+0.111553487 container exec_died 730c50020a7d9e2da21d2462d40af20ac303e43f3491f692cbbd87f432ba3e09 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, architecture=x86_64, distribution-scope=public, com.redhat.component=ubi9-minimal-container, container_name=openstack_network_exporter, version=9.6, managed_by=edpm_ansible, config_id=openstack_network_exporter, maintainer=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, build-date=2025-08-20T13:12:41, name=ubi9-minimal, io.openshift.tags=minimal 
rhel9, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, io.buildah.version=1.33.7, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., release=1755695350, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-type=git, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'cb1e9478634a00c8fb5ad9cd7fcd38c7b2a1a85187dfaa618bc715c24c2b15fb'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vendor=Red 
Hat, Inc.) Dec 15 04:42:39 localhost systemd[1]: 730c50020a7d9e2da21d2462d40af20ac303e43f3491f692cbbd87f432ba3e09.service: Deactivated successfully. Dec 15 04:42:39 localhost python3.9[270262]: ansible-ansible.builtin.stat Invoked with path=/etc/multipath.conf follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1 Dec 15 04:42:40 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:fc:1f:13 MACDST=fa:16:3e:b3:bb:e3 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=17483 DF PROTO=TCP SPT=36494 DPT=9102 SEQ=4291295771 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A83A645250000000001030307) Dec 15 04:42:40 localhost python3.9[270394]: ansible-ansible.builtin.stat Invoked with path=/etc/multipath.conf follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1 Dec 15 04:42:40 localhost systemd[1]: Started /usr/bin/podman healthcheck run ac2eae30a2a7f971398af42ea67b3ed799d4b3341f7688ebcd73f7ca7f66c2a8. 
Dec 15 04:42:40 localhost podman[270414]: 2025-12-15 09:42:40.754061953 +0000 UTC m=+0.085775785 container health_status ac2eae30a2a7f971398af42ea67b3ed799d4b3341f7688ebcd73f7ca7f66c2a8 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '07f3af15d79276418f54a8dde2fc251064527c3571430e14af77aaa2c01f1193'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image) Dec 15 04:42:40 localhost podman[270414]: 2025-12-15 09:42:40.792293992 +0000 UTC m=+0.124007884 container exec_died ac2eae30a2a7f971398af42ea67b3ed799d4b3341f7688ebcd73f7ca7f66c2a8 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, org.label-schema.license=GPLv2, config_id=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, 
tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '07f3af15d79276418f54a8dde2fc251064527c3571430e14af77aaa2c01f1193'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, container_name=ovn_controller, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image) Dec 15 04:42:40 localhost systemd[1]: ac2eae30a2a7f971398af42ea67b3ed799d4b3341f7688ebcd73f7ca7f66c2a8.service: Deactivated successfully. 
Dec 15 04:42:41 localhost python3.9[270533]: ansible-ansible.legacy.command Invoked with _raw_params=grep -q '^blacklist\s*{' /etc/multipath.conf _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None Dec 15 04:42:42 localhost python3.9[270644]: ansible-ansible.builtin.replace Invoked with path=/etc/multipath.conf regexp=^blacklist\s*{\n[\s]+devnode \"\.\*\" replace=blacklist { backup=False encoding=utf-8 unsafe_writes=False after=None before=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Dec 15 04:42:42 localhost python3.9[270754]: ansible-ansible.builtin.lineinfile Invoked with firstmatch=True insertafter=^defaults line= find_multipaths yes path=/etc/multipath.conf regexp=^\s+find_multipaths state=present encoding=utf-8 backrefs=False create=False backup=False unsafe_writes=False search_string=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Dec 15 04:42:43 localhost nova_compute[231752]: 2025-12-15 09:42:43.166 231756 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4996-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Dec 15 04:42:43 localhost nova_compute[231752]: 2025-12-15 09:42:43.168 231756 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Dec 15 04:42:43 localhost nova_compute[231752]: 2025-12-15 09:42:43.168 231756 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m Dec 15 04:42:43 localhost nova_compute[231752]: 2025-12-15 09:42:43.168 231756 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition 
/usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Dec 15 04:42:43 localhost nova_compute[231752]: 2025-12-15 09:42:43.208 231756 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 04:42:43 localhost nova_compute[231752]: 2025-12-15 09:42:43.208 231756 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Dec 15 04:42:43 localhost python3.9[270864]: ansible-ansible.builtin.lineinfile Invoked with firstmatch=True insertafter=^defaults line= recheck_wwid yes path=/etc/multipath.conf regexp=^\s+recheck_wwid state=present encoding=utf-8 backrefs=False create=False backup=False unsafe_writes=False search_string=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Dec 15 04:42:44 localhost python3.9[270974]: ansible-ansible.builtin.lineinfile Invoked with firstmatch=True insertafter=^defaults line= skip_kpartx yes path=/etc/multipath.conf regexp=^\s+skip_kpartx state=present encoding=utf-8 backrefs=False create=False backup=False unsafe_writes=False search_string=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Dec 15 04:42:44 localhost python3.9[271084]: ansible-ansible.builtin.lineinfile Invoked with firstmatch=True insertafter=^defaults line= user_friendly_names no path=/etc/multipath.conf regexp=^\s+user_friendly_names state=present encoding=utf-8 backrefs=False create=False backup=False unsafe_writes=False search_string=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Dec 15 04:42:45 localhost python3.9[271194]: ansible-ansible.builtin.stat Invoked with path=/etc/multipath.conf follow=False get_checksum=True get_mime=True 
get_attributes=True get_selinux_context=False checksum_algorithm=sha1 Dec 15 04:42:45 localhost nova_compute[231752]: 2025-12-15 09:42:45.949 231756 DEBUG oslo_service.periodic_task [None req-c7fd4a3f-c95b-4936-a071-10d43fe8d19c - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Dec 15 04:42:46 localhost systemd[1]: Started /usr/bin/podman healthcheck run 4d46c406a3855fd1c322e5d86c048188ea5865daa07ba23aaebacc7879668923. Dec 15 04:42:46 localhost podman[271307]: 2025-12-15 09:42:46.156767462 +0000 UTC m=+0.077686873 container health_status 4d46c406a3855fd1c322e5d86c048188ea5865daa07ba23aaebacc7879668923 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, org.label-schema.build-date=20251202, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '07f3af15d79276418f54a8dde2fc251064527c3571430e14af77aaa2c01f1193-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', 
'/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}) Dec 15 04:42:46 localhost podman[271307]: 2025-12-15 09:42:46.187059803 +0000 UTC m=+0.107979244 container exec_died 4d46c406a3855fd1c322e5d86c048188ea5865daa07ba23aaebacc7879668923 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '07f3af15d79276418f54a8dde2fc251064527c3571430e14af77aaa2c01f1193-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_managed=true, container_name=ovn_metadata_agent, org.label-schema.build-date=20251202, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb) Dec 15 04:42:46 localhost systemd[1]: 4d46c406a3855fd1c322e5d86c048188ea5865daa07ba23aaebacc7879668923.service: Deactivated successfully. Dec 15 04:42:46 localhost python3.9[271306]: ansible-ansible.builtin.file Invoked with path=/var/local/libexec recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None Dec 15 04:42:46 localhost python3.9[271435]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-container-shutdown follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Dec 15 04:42:47 localhost python3.9[271492]: ansible-ansible.legacy.file Invoked with group=root mode=0700 owner=root setype=container_file_t dest=/var/local/libexec/edpm-container-shutdown _original_basename=edpm-container-shutdown recurse=False state=file path=/var/local/libexec/edpm-container-shutdown force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None Dec 15 04:42:47 localhost python3.9[271602]: ansible-ansible.legacy.stat Invoked with 
path=/var/local/libexec/edpm-start-podman-container follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Dec 15 04:42:47 localhost nova_compute[231752]: 2025-12-15 09:42:47.951 231756 DEBUG oslo_service.periodic_task [None req-c7fd4a3f-c95b-4936-a071-10d43fe8d19c - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Dec 15 04:42:48 localhost nova_compute[231752]: 2025-12-15 09:42:48.209 231756 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Dec 15 04:42:48 localhost nova_compute[231752]: 2025-12-15 09:42:48.211 231756 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Dec 15 04:42:48 localhost nova_compute[231752]: 2025-12-15 09:42:48.211 231756 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m Dec 15 04:42:48 localhost nova_compute[231752]: 2025-12-15 09:42:48.211 231756 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Dec 15 04:42:48 localhost nova_compute[231752]: 2025-12-15 09:42:48.251 231756 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 04:42:48 localhost nova_compute[231752]: 2025-12-15 09:42:48.252 231756 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Dec 15 04:42:48 localhost sshd[271660]: main: sshd: ssh-rsa algorithm is disabled Dec 15 04:42:48 localhost python3.9[271659]: 
ansible-ansible.legacy.file Invoked with group=root mode=0700 owner=root setype=container_file_t dest=/var/local/libexec/edpm-start-podman-container _original_basename=edpm-start-podman-container recurse=False state=file path=/var/local/libexec/edpm-start-podman-container force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None Dec 15 04:42:48 localhost nova_compute[231752]: 2025-12-15 09:42:48.965 231756 DEBUG oslo_service.periodic_task [None req-c7fd4a3f-c95b-4936-a071-10d43fe8d19c - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Dec 15 04:42:48 localhost nova_compute[231752]: 2025-12-15 09:42:48.966 231756 DEBUG nova.compute.manager [None req-c7fd4a3f-c95b-4936-a071-10d43fe8d19c - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m Dec 15 04:42:48 localhost nova_compute[231752]: 2025-12-15 09:42:48.966 231756 DEBUG nova.compute.manager [None req-c7fd4a3f-c95b-4936-a071-10d43fe8d19c - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m Dec 15 04:42:49 localhost nova_compute[231752]: 2025-12-15 09:42:49.039 231756 DEBUG oslo_concurrency.lockutils [None req-c7fd4a3f-c95b-4936-a071-10d43fe8d19c - - - - - -] Acquiring lock "refresh_cache-39ff1bd9-6f6b-44c8-bbec-a1fd9d196359" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m Dec 15 04:42:49 localhost nova_compute[231752]: 2025-12-15 09:42:49.039 231756 DEBUG oslo_concurrency.lockutils [None req-c7fd4a3f-c95b-4936-a071-10d43fe8d19c - - - - - -] Acquired lock "refresh_cache-39ff1bd9-6f6b-44c8-bbec-a1fd9d196359" lock 
/usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m Dec 15 04:42:49 localhost nova_compute[231752]: 2025-12-15 09:42:49.040 231756 DEBUG nova.network.neutron [None req-c7fd4a3f-c95b-4936-a071-10d43fe8d19c - - - - - -] [instance: 39ff1bd9-6f6b-44c8-bbec-a1fd9d196359] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m Dec 15 04:42:49 localhost nova_compute[231752]: 2025-12-15 09:42:49.040 231756 DEBUG nova.objects.instance [None req-c7fd4a3f-c95b-4936-a071-10d43fe8d19c - - - - - -] Lazy-loading 'info_cache' on Instance uuid 39ff1bd9-6f6b-44c8-bbec-a1fd9d196359 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m Dec 15 04:42:49 localhost python3.9[271771]: ansible-ansible.builtin.file Invoked with mode=420 path=/etc/systemd/system-preset state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Dec 15 04:42:49 localhost nova_compute[231752]: 2025-12-15 09:42:49.381 231756 DEBUG nova.network.neutron [None req-c7fd4a3f-c95b-4936-a071-10d43fe8d19c - - - - - -] [instance: 39ff1bd9-6f6b-44c8-bbec-a1fd9d196359] Updating instance_info_cache with network_info: [{"id": "03ef8889-3216-43fb-8a52-4be17a956ce1", "address": "fa:16:3e:74:df:7c", "network": {"id": "befb7a72-17a9-4bcb-b561-84b8f626685a", "bridge": "br-int", "label": "private", "subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.201", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.20", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, 
"meta": {"enable_dhcp": true, "dhcp_server": "192.168.0.3"}}], "meta": {"injected": false, "tenant_id": "c785bf23f53946bc99867d8832a50266", "mtu": 1292, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap03ef8889-32", "ovs_interfaceid": "03ef8889-3216-43fb-8a52-4be17a956ce1", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m Dec 15 04:42:49 localhost nova_compute[231752]: 2025-12-15 09:42:49.405 231756 DEBUG oslo_concurrency.lockutils [None req-c7fd4a3f-c95b-4936-a071-10d43fe8d19c - - - - - -] Releasing lock "refresh_cache-39ff1bd9-6f6b-44c8-bbec-a1fd9d196359" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m Dec 15 04:42:49 localhost nova_compute[231752]: 2025-12-15 09:42:49.406 231756 DEBUG nova.compute.manager [None req-c7fd4a3f-c95b-4936-a071-10d43fe8d19c - - - - - -] [instance: 39ff1bd9-6f6b-44c8-bbec-a1fd9d196359] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m Dec 15 04:42:49 localhost nova_compute[231752]: 2025-12-15 09:42:49.406 231756 DEBUG oslo_service.periodic_task [None req-c7fd4a3f-c95b-4936-a071-10d43fe8d19c - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Dec 15 04:42:49 localhost python3.9[271881]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/edpm-container-shutdown.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Dec 15 04:42:49 localhost 
nova_compute[231752]: 2025-12-15 09:42:49.950 231756 DEBUG oslo_service.periodic_task [None req-c7fd4a3f-c95b-4936-a071-10d43fe8d19c - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Dec 15 04:42:49 localhost nova_compute[231752]: 2025-12-15 09:42:49.968 231756 DEBUG oslo_service.periodic_task [None req-c7fd4a3f-c95b-4936-a071-10d43fe8d19c - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Dec 15 04:42:49 localhost nova_compute[231752]: 2025-12-15 09:42:49.969 231756 DEBUG oslo_service.periodic_task [None req-c7fd4a3f-c95b-4936-a071-10d43fe8d19c - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Dec 15 04:42:49 localhost nova_compute[231752]: 2025-12-15 09:42:49.969 231756 DEBUG nova.compute.manager [None req-c7fd4a3f-c95b-4936-a071-10d43fe8d19c - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... 
_reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m Dec 15 04:42:49 localhost nova_compute[231752]: 2025-12-15 09:42:49.969 231756 DEBUG oslo_service.periodic_task [None req-c7fd4a3f-c95b-4936-a071-10d43fe8d19c - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Dec 15 04:42:49 localhost nova_compute[231752]: 2025-12-15 09:42:49.984 231756 DEBUG oslo_concurrency.lockutils [None req-c7fd4a3f-c95b-4936-a071-10d43fe8d19c - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Dec 15 04:42:49 localhost nova_compute[231752]: 2025-12-15 09:42:49.985 231756 DEBUG oslo_concurrency.lockutils [None req-c7fd4a3f-c95b-4936-a071-10d43fe8d19c - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Dec 15 04:42:49 localhost nova_compute[231752]: 2025-12-15 09:42:49.985 231756 DEBUG oslo_concurrency.lockutils [None req-c7fd4a3f-c95b-4936-a071-10d43fe8d19c - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Dec 15 04:42:49 localhost nova_compute[231752]: 2025-12-15 09:42:49.986 231756 DEBUG nova.compute.resource_tracker [None req-c7fd4a3f-c95b-4936-a071-10d43fe8d19c - - - - - -] Auditing locally available compute resources for np0005559462.localdomain (node: np0005559462.localdomain) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m Dec 15 04:42:49 localhost nova_compute[231752]: 2025-12-15 
09:42:49.986 231756 DEBUG oslo_concurrency.processutils [None req-c7fd4a3f-c95b-4936-a071-10d43fe8d19c - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m Dec 15 04:42:50 localhost python3.9[271941]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system/edpm-container-shutdown.service _original_basename=edpm-container-shutdown-service recurse=False state=file path=/etc/systemd/system/edpm-container-shutdown.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None Dec 15 04:42:50 localhost nova_compute[231752]: 2025-12-15 09:42:50.488 231756 DEBUG oslo_concurrency.processutils [None req-c7fd4a3f-c95b-4936-a071-10d43fe8d19c - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.501s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m Dec 15 04:42:50 localhost nova_compute[231752]: 2025-12-15 09:42:50.545 231756 DEBUG nova.virt.libvirt.driver [None req-c7fd4a3f-c95b-4936-a071-10d43fe8d19c - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m Dec 15 04:42:50 localhost nova_compute[231752]: 2025-12-15 09:42:50.546 231756 DEBUG nova.virt.libvirt.driver [None req-c7fd4a3f-c95b-4936-a071-10d43fe8d19c - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m Dec 15 04:42:50 localhost nova_compute[231752]: 2025-12-15 09:42:50.766 231756 WARNING nova.virt.libvirt.driver 
[None req-c7fd4a3f-c95b-4936-a071-10d43fe8d19c - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m Dec 15 04:42:50 localhost nova_compute[231752]: 2025-12-15 09:42:50.768 231756 DEBUG nova.compute.resource_tracker [None req-c7fd4a3f-c95b-4936-a071-10d43fe8d19c - - - - - -] Hypervisor/Node resource view: name=np0005559462.localdomain free_ram=12150MB free_disk=41.83720779418945GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, 
"label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m Dec 15 04:42:50 localhost nova_compute[231752]: 2025-12-15 09:42:50.769 231756 DEBUG oslo_concurrency.lockutils [None req-c7fd4a3f-c95b-4936-a071-10d43fe8d19c - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Dec 15 04:42:50 localhost nova_compute[231752]: 2025-12-15 09:42:50.769 231756 DEBUG oslo_concurrency.lockutils [None req-c7fd4a3f-c95b-4936-a071-10d43fe8d19c - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Dec 15 04:42:50 localhost nova_compute[231752]: 2025-12-15 09:42:50.859 231756 DEBUG nova.compute.resource_tracker [None req-c7fd4a3f-c95b-4936-a071-10d43fe8d19c - - - - - -] Instance 39ff1bd9-6f6b-44c8-bbec-a1fd9d196359 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 2, 'MEMORY_MB': 512, 'VCPU': 1}}. 
_remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m Dec 15 04:42:50 localhost nova_compute[231752]: 2025-12-15 09:42:50.859 231756 DEBUG nova.compute.resource_tracker [None req-c7fd4a3f-c95b-4936-a071-10d43fe8d19c - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m Dec 15 04:42:50 localhost nova_compute[231752]: 2025-12-15 09:42:50.860 231756 DEBUG nova.compute.resource_tracker [None req-c7fd4a3f-c95b-4936-a071-10d43fe8d19c - - - - - -] Final resource view: name=np0005559462.localdomain phys_ram=15738MB used_ram=1024MB phys_disk=41GB used_disk=2GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m Dec 15 04:42:50 localhost nova_compute[231752]: 2025-12-15 09:42:50.900 231756 DEBUG nova.scheduler.client.report [None req-c7fd4a3f-c95b-4936-a071-10d43fe8d19c - - - - - -] Refreshing inventories for resource provider 26c8956b-6742-4951-b566-971b9bbe323b _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804#033[00m Dec 15 04:42:50 localhost nova_compute[231752]: 2025-12-15 09:42:50.947 231756 DEBUG nova.scheduler.client.report [None req-c7fd4a3f-c95b-4936-a071-10d43fe8d19c - - - - - -] Updating ProviderTree inventory for provider 26c8956b-6742-4951-b566-971b9bbe323b from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768#033[00m Dec 15 
04:42:50 localhost nova_compute[231752]: 2025-12-15 09:42:50.947 231756 DEBUG nova.compute.provider_tree [None req-c7fd4a3f-c95b-4936-a071-10d43fe8d19c - - - - - -] Updating inventory in ProviderTree for provider 26c8956b-6742-4951-b566-971b9bbe323b with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176#033[00m Dec 15 04:42:50 localhost nova_compute[231752]: 2025-12-15 09:42:50.961 231756 DEBUG nova.scheduler.client.report [None req-c7fd4a3f-c95b-4936-a071-10d43fe8d19c - - - - - -] Refreshing aggregate associations for resource provider 26c8956b-6742-4951-b566-971b9bbe323b, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813#033[00m Dec 15 04:42:50 localhost nova_compute[231752]: 2025-12-15 09:42:50.978 231756 DEBUG nova.scheduler.client.report [None req-c7fd4a3f-c95b-4936-a071-10d43fe8d19c - - - - - -] Refreshing trait associations for resource provider 26c8956b-6742-4951-b566-971b9bbe323b, traits: 
COMPUTE_RESCUE_BFV,COMPUTE_IMAGE_TYPE_ISO,HW_CPU_X86_F16C,HW_CPU_X86_SVM,COMPUTE_ACCELERATORS,HW_CPU_X86_ABM,COMPUTE_NET_VIF_MODEL_PCNET,COMPUTE_GRAPHICS_MODEL_VIRTIO,HW_CPU_X86_AVX2,HW_CPU_X86_BMI2,HW_CPU_X86_SSE42,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,HW_CPU_X86_FMA3,HW_CPU_X86_SHA,COMPUTE_SECURITY_TPM_1_2,COMPUTE_STORAGE_BUS_IDE,HW_CPU_X86_SSE2,COMPUTE_VIOMMU_MODEL_VIRTIO,COMPUTE_IMAGE_TYPE_QCOW2,HW_CPU_X86_AMD_SVM,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_STORAGE_BUS_USB,COMPUTE_STORAGE_BUS_FDC,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_SECURITY_UEFI_SECURE_BOOT,HW_CPU_X86_AVX,HW_CPU_X86_MMX,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_IMAGE_TYPE_AKI,HW_CPU_X86_SSSE3,HW_CPU_X86_CLMUL,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_VIOMMU_MODEL_AUTO,HW_CPU_X86_SSE,COMPUTE_DEVICE_TAGGING,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_IMAGE_TYPE_RAW,COMPUTE_GRAPHICS_MODEL_BOCHS,HW_CPU_X86_AESNI,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_VOLUME_EXTEND,COMPUTE_VIOMMU_MODEL_INTEL,COMPUTE_NET_VIF_MODEL_VMXNET3,HW_CPU_X86_SSE41,COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_TRUSTED_CERTS,COMPUTE_NET_VIF_MODEL_RTL8139,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_STORAGE_BUS_SATA,COMPUTE_NET_VIF_MODEL_E1000E,COMPUTE_IMAGE_TYPE_ARI,COMPUTE_STORAGE_BUS_VIRTIO,COMPUTE_NODE,HW_CPU_X86_BMI,COMPUTE_SECURITY_TPM_2_0,HW_CPU_X86_SSE4A _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825#033[00m Dec 15 04:42:51 localhost nova_compute[231752]: 2025-12-15 09:42:51.009 231756 DEBUG oslo_concurrency.processutils [None req-c7fd4a3f-c95b-4936-a071-10d43fe8d19c - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m Dec 15 04:42:51 localhost ovn_metadata_agent[160585]: 2025-12-15 09:42:51.447 160590 DEBUG oslo_concurrency.lockutils [-] 
Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Dec 15 04:42:51 localhost ovn_metadata_agent[160585]: 2025-12-15 09:42:51.447 160590 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Dec 15 04:42:51 localhost ovn_metadata_agent[160585]: 2025-12-15 09:42:51.449 160590 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Dec 15 04:42:51 localhost nova_compute[231752]: 2025-12-15 09:42:51.488 231756 DEBUG oslo_concurrency.processutils [None req-c7fd4a3f-c95b-4936-a071-10d43fe8d19c - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.478s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m Dec 15 04:42:51 localhost nova_compute[231752]: 2025-12-15 09:42:51.493 231756 DEBUG nova.compute.provider_tree [None req-c7fd4a3f-c95b-4936-a071-10d43fe8d19c - - - - - -] Inventory has not changed in ProviderTree for provider: 26c8956b-6742-4951-b566-971b9bbe323b update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m Dec 15 04:42:51 localhost nova_compute[231752]: 2025-12-15 09:42:51.512 231756 DEBUG nova.scheduler.client.report [None req-c7fd4a3f-c95b-4936-a071-10d43fe8d19c - - - - - -] Inventory has not changed for provider 26c8956b-6742-4951-b566-971b9bbe323b based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 
15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m Dec 15 04:42:51 localhost nova_compute[231752]: 2025-12-15 09:42:51.514 231756 DEBUG nova.compute.resource_tracker [None req-c7fd4a3f-c95b-4936-a071-10d43fe8d19c - - - - - -] Compute_service record updated for np0005559462.localdomain:np0005559462.localdomain _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m Dec 15 04:42:51 localhost nova_compute[231752]: 2025-12-15 09:42:51.514 231756 DEBUG oslo_concurrency.lockutils [None req-c7fd4a3f-c95b-4936-a071-10d43fe8d19c - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.745s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Dec 15 04:42:51 localhost nova_compute[231752]: 2025-12-15 09:42:51.515 231756 DEBUG oslo_service.periodic_task [None req-c7fd4a3f-c95b-4936-a071-10d43fe8d19c - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Dec 15 04:42:51 localhost nova_compute[231752]: 2025-12-15 09:42:51.515 231756 DEBUG nova.compute.manager [None req-c7fd4a3f-c95b-4936-a071-10d43fe8d19c - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11145#033[00m Dec 15 04:42:51 localhost nova_compute[231752]: 2025-12-15 09:42:51.534 231756 DEBUG nova.compute.manager [None req-c7fd4a3f-c95b-4936-a071-10d43fe8d19c - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11154#033[00m Dec 15 04:42:52 localhost 
python3.9[272092]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Dec 15 04:42:52 localhost python3.9[272149]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system-preset/91-edpm-container-shutdown.preset _original_basename=91-edpm-container-shutdown-preset recurse=False state=file path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None Dec 15 04:42:52 localhost nova_compute[231752]: 2025-12-15 09:42:52.517 231756 DEBUG oslo_service.periodic_task [None req-c7fd4a3f-c95b-4936-a071-10d43fe8d19c - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Dec 15 04:42:52 localhost nova_compute[231752]: 2025-12-15 09:42:52.517 231756 DEBUG oslo_service.periodic_task [None req-c7fd4a3f-c95b-4936-a071-10d43fe8d19c - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Dec 15 04:42:52 localhost nova_compute[231752]: 2025-12-15 09:42:52.518 231756 DEBUG oslo_service.periodic_task [None req-c7fd4a3f-c95b-4936-a071-10d43fe8d19c - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Dec 15 04:42:52 localhost nova_compute[231752]: 2025-12-15 09:42:52.951 231756 DEBUG oslo_service.periodic_task [None req-c7fd4a3f-c95b-4936-a071-10d43fe8d19c - - - - - -] Running periodic task 
ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Dec 15 04:42:52 localhost nova_compute[231752]: 2025-12-15 09:42:52.951 231756 DEBUG nova.compute.manager [None req-c7fd4a3f-c95b-4936-a071-10d43fe8d19c - - - - - -] Cleaning up deleted instances with incomplete migration _cleanup_incomplete_migrations /usr/lib/python3.9/site-packages/nova/compute/manager.py:11183#033[00m Dec 15 04:42:53 localhost nova_compute[231752]: 2025-12-15 09:42:53.252 231756 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Dec 15 04:42:53 localhost nova_compute[231752]: 2025-12-15 09:42:53.254 231756 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Dec 15 04:42:53 localhost nova_compute[231752]: 2025-12-15 09:42:53.254 231756 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m Dec 15 04:42:53 localhost nova_compute[231752]: 2025-12-15 09:42:53.254 231756 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Dec 15 04:42:53 localhost nova_compute[231752]: 2025-12-15 09:42:53.291 231756 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 04:42:53 localhost nova_compute[231752]: 2025-12-15 09:42:53.292 231756 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Dec 15 04:42:53 localhost python3.9[272259]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=edpm-container-shutdown state=started daemon_reexec=False scope=system 
no_block=False force=None masked=None Dec 15 04:42:53 localhost systemd[1]: Started /usr/bin/podman healthcheck run a4e94bd7aaee6c174c601060fb5f34559019ccea93fc397263ee3735731cfc1e. Dec 15 04:42:53 localhost systemd[1]: Reloading. Dec 15 04:42:53 localhost podman[272261]: 2025-12-15 09:42:53.658841733 +0000 UTC m=+0.075955203 container health_status a4e94bd7aaee6c174c601060fb5f34559019ccea93fc397263ee3735731cfc1e (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}) Dec 15 04:42:53 localhost podman[272261]: 2025-12-15 09:42:53.670491358 +0000 UTC m=+0.087604668 container exec_died a4e94bd7aaee6c174c601060fb5f34559019ccea93fc397263ee3735731cfc1e (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 
'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}) Dec 15 04:42:53 localhost systemd-sysv-generator[272307]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. Dec 15 04:42:53 localhost systemd-rc-local-generator[272303]: /etc/rc.d/rc.local is not marked executable, skipping. Dec 15 04:42:53 localhost systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload Dec 15 04:42:53 localhost systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload Dec 15 04:42:53 localhost systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload Dec 15 04:42:53 localhost systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload Dec 15 04:42:53 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. 
Dec 15 04:42:53 localhost systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload Dec 15 04:42:53 localhost systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload Dec 15 04:42:53 localhost systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload Dec 15 04:42:53 localhost systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload Dec 15 04:42:53 localhost systemd[1]: a4e94bd7aaee6c174c601060fb5f34559019ccea93fc397263ee3735731cfc1e.service: Deactivated successfully. Dec 15 04:42:54 localhost python3.9[272430]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/netns-placeholder.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Dec 15 04:42:54 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:fc:1f:13 MACDST=fa:16:3e:b3:bb:e3 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=35409 DF PROTO=TCP SPT=38304 DPT=9102 SEQ=3326393576 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A83A67DB10000000001030307) Dec 15 04:42:55 localhost python3.9[272487]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system/netns-placeholder.service _original_basename=netns-placeholder-service recurse=False state=file path=/etc/systemd/system/netns-placeholder.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None Dec 15 04:42:55 localhost python3.9[272597]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-netns-placeholder.preset 
follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Dec 15 04:42:55 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:fc:1f:13 MACDST=fa:16:3e:b3:bb:e3 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=35410 DF PROTO=TCP SPT=38304 DPT=9102 SEQ=3326393576 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A83A681A50000000001030307) Dec 15 04:42:56 localhost python3.9[272654]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system-preset/91-netns-placeholder.preset _original_basename=91-netns-placeholder-preset recurse=False state=file path=/etc/systemd/system-preset/91-netns-placeholder.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None Dec 15 04:42:56 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:fc:1f:13 MACDST=fa:16:3e:b3:bb:e3 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=17484 DF PROTO=TCP SPT=36494 DPT=9102 SEQ=4291295771 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A83A685250000000001030307) Dec 15 04:42:57 localhost python3.9[272764]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=netns-placeholder state=started daemon_reexec=False scope=system no_block=False force=None masked=None Dec 15 04:42:57 localhost systemd[1]: Reloading. Dec 15 04:42:57 localhost systemd-rc-local-generator[272785]: /etc/rc.d/rc.local is not marked executable, skipping. Dec 15 04:42:57 localhost systemd-sysv-generator[272790]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. 
Please update package to include a native systemd unit file, in order to make it more safe and robust. Dec 15 04:42:57 localhost systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload Dec 15 04:42:57 localhost systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload Dec 15 04:42:57 localhost systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload Dec 15 04:42:57 localhost systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload Dec 15 04:42:57 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Dec 15 04:42:57 localhost systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload Dec 15 04:42:57 localhost systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload Dec 15 04:42:57 localhost systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload Dec 15 04:42:57 localhost systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload Dec 15 04:42:57 localhost systemd[1]: Starting Create netns directory... Dec 15 04:42:57 localhost systemd[1]: run-netns-placeholder.mount: Deactivated successfully. Dec 15 04:42:57 localhost systemd[1]: netns-placeholder.service: Deactivated successfully. Dec 15 04:42:57 localhost systemd[1]: Finished Create netns directory. 
Dec 15 04:42:57 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:fc:1f:13 MACDST=fa:16:3e:b3:bb:e3 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=35411 DF PROTO=TCP SPT=38304 DPT=9102 SEQ=3326393576 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A83A689A60000000001030307) Dec 15 04:42:58 localhost nova_compute[231752]: 2025-12-15 09:42:58.293 231756 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Dec 15 04:42:58 localhost nova_compute[231752]: 2025-12-15 09:42:58.295 231756 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Dec 15 04:42:58 localhost nova_compute[231752]: 2025-12-15 09:42:58.296 231756 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m Dec 15 04:42:58 localhost nova_compute[231752]: 2025-12-15 09:42:58.296 231756 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Dec 15 04:42:58 localhost nova_compute[231752]: 2025-12-15 09:42:58.324 231756 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 04:42:58 localhost nova_compute[231752]: 2025-12-15 09:42:58.325 231756 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Dec 15 04:42:58 localhost python3.9[272916]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/openstack/healthchecks setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False 
_original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None Dec 15 04:42:58 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:fc:1f:13 MACDST=fa:16:3e:b3:bb:e3 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=49858 DF PROTO=TCP SPT=38476 DPT=9102 SEQ=2681177536 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A83A68D260000000001030307) Dec 15 04:42:59 localhost python3.9[273026]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/healthchecks/multipathd/healthcheck follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Dec 15 04:42:59 localhost python3.9[273083]: ansible-ansible.legacy.file Invoked with group=zuul mode=0700 owner=zuul setype=container_file_t dest=/var/lib/openstack/healthchecks/multipathd/ _original_basename=healthcheck recurse=False state=file path=/var/lib/openstack/healthchecks/multipathd/ force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None Dec 15 04:43:00 localhost python3.9[273193]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/edpm-config recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None Dec 15 04:43:01 localhost python3.9[273303]: ansible-ansible.builtin.file Invoked with path=/var/lib/kolla/config_files recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S 
unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None Dec 15 04:43:01 localhost podman[243449]: time="2025-12-15T09:43:01Z" level=info msg="List containers: received `last` parameter - overwriting `limit`" Dec 15 04:43:01 localhost podman[243449]: @ - - [15/Dec/2025:09:43:01 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 149352 "" "Go-http-client/1.1" Dec 15 04:43:01 localhost podman[243449]: @ - - [15/Dec/2025:09:43:01 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 17221 "" "Go-http-client/1.1" Dec 15 04:43:01 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:fc:1f:13 MACDST=fa:16:3e:b3:bb:e3 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=35412 DF PROTO=TCP SPT=38304 DPT=9102 SEQ=3326393576 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A83A699650000000001030307) Dec 15 04:43:02 localhost python3.9[273460]: ansible-ansible.legacy.stat Invoked with path=/var/lib/kolla/config_files/multipathd.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Dec 15 04:43:02 localhost python3.9[273536]: ansible-ansible.legacy.file Invoked with mode=0600 dest=/var/lib/kolla/config_files/multipathd.json _original_basename=.btg3k5xw recurse=False state=file path=/var/lib/kolla/config_files/multipathd.json force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Dec 15 04:43:03 localhost nova_compute[231752]: 2025-12-15 09:43:03.327 231756 DEBUG ovsdbapp.backend.ovs_idl.vlog 
[-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Dec 15 04:43:03 localhost nova_compute[231752]: 2025-12-15 09:43:03.329 231756 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 04:43:03 localhost nova_compute[231752]: 2025-12-15 09:43:03.329 231756 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m Dec 15 04:43:03 localhost nova_compute[231752]: 2025-12-15 09:43:03.329 231756 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Dec 15 04:43:03 localhost nova_compute[231752]: 2025-12-15 09:43:03.330 231756 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Dec 15 04:43:03 localhost nova_compute[231752]: 2025-12-15 09:43:03.333 231756 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 04:43:03 localhost python3.9[273662]: ansible-ansible.builtin.file Invoked with mode=0755 path=/var/lib/edpm-config/container-startup-config/multipathd state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Dec 15 04:43:04 localhost systemd[1]: Started /usr/bin/podman healthcheck run 67ee72fcd99c896f79e8807764b1d0f04e7d45927d06e306c0986394e8ed42a0. Dec 15 04:43:04 localhost systemd[1]: Started /usr/bin/podman healthcheck run 9f0f481c937606298203797a71924b7614a5d9e3f4a8afd837cfa5b9602aedeb. 
Dec 15 04:43:04 localhost systemd[1]: tmp-crun.BwZIjd.mount: Deactivated successfully. Dec 15 04:43:04 localhost podman[273844]: 2025-12-15 09:43:04.751564495 +0000 UTC m=+0.080547355 container health_status 9f0f481c937606298203797a71924b7614a5d9e3f4a8afd837cfa5b9602aedeb (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, container_name=multipathd, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_id=multipathd, org.label-schema.schema-version=1.0) Dec 15 04:43:04 localhost podman[273844]: 2025-12-15 09:43:04.763045726 
+0000 UTC m=+0.092028656 container exec_died 9f0f481c937606298203797a71924b7614a5d9e3f4a8afd837cfa5b9602aedeb (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=multipathd, container_name=multipathd) Dec 15 04:43:04 localhost systemd[1]: 9f0f481c937606298203797a71924b7614a5d9e3f4a8afd837cfa5b9602aedeb.service: Deactivated successfully. 
Dec 15 04:43:04 localhost openstack_network_exporter[246484]: ERROR 09:43:04 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server Dec 15 04:43:04 localhost openstack_network_exporter[246484]: ERROR 09:43:04 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd Dec 15 04:43:04 localhost openstack_network_exporter[246484]: ERROR 09:43:04 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd Dec 15 04:43:04 localhost openstack_network_exporter[246484]: ERROR 09:43:04 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath Dec 15 04:43:04 localhost openstack_network_exporter[246484]: Dec 15 04:43:04 localhost openstack_network_exporter[246484]: ERROR 09:43:04 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath Dec 15 04:43:04 localhost openstack_network_exporter[246484]: Dec 15 04:43:04 localhost podman[273843]: 2025-12-15 09:43:04.85163228 +0000 UTC m=+0.179847898 container health_status 67ee72fcd99c896f79e8807764b1d0f04e7d45927d06e306c0986394e8ed42a0 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 
'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}) Dec 15 04:43:04 localhost podman[273843]: 2025-12-15 09:43:04.862806031 +0000 UTC m=+0.191021659 container exec_died 67ee72fcd99c896f79e8807764b1d0f04e7d45927d06e306c0986394e8ed42a0 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, 
container_name=node_exporter) Dec 15 04:43:04 localhost systemd[1]: 67ee72fcd99c896f79e8807764b1d0f04e7d45927d06e306c0986394e8ed42a0.service: Deactivated successfully. Dec 15 04:43:05 localhost python3.9[273977]: ansible-container_config_data Invoked with config_overrides={} config_path=/var/lib/edpm-config/container-startup-config/multipathd config_pattern=*.json debug=False Dec 15 04:43:05 localhost systemd[1]: Started /usr/bin/podman healthcheck run b3d8483ade539500e228c2af9c0c3e647febb5ec7903610a83aea32631007d4a. Dec 15 04:43:05 localhost podman[273995]: 2025-12-15 09:43:05.753205416 +0000 UTC m=+0.082957645 container health_status b3d8483ade539500e228c2af9c0c3e647febb5ec7903610a83aea32631007d4a (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '07f3af15d79276418f54a8dde2fc251064527c3571430e14af77aaa2c01f1193-cb1e9478634a00c8fb5ad9cd7fcd38c7b2a1a85187dfaa618bc715c24c2b15fb'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', 
'/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=ceilometer_agent_compute) Dec 15 04:43:05 localhost podman[273995]: 2025-12-15 09:43:05.764297015 +0000 UTC m=+0.094049224 container exec_died b3d8483ade539500e228c2af9c0c3e647febb5ec7903610a83aea32631007d4a (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, container_name=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '07f3af15d79276418f54a8dde2fc251064527c3571430e14af77aaa2c01f1193-cb1e9478634a00c8fb5ad9cd7fcd38c7b2a1a85187dfaa618bc715c24c2b15fb'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', 
'/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=ceilometer_agent_compute, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team) Dec 15 04:43:05 localhost systemd[1]: b3d8483ade539500e228c2af9c0c3e647febb5ec7903610a83aea32631007d4a.service: Deactivated successfully. Dec 15 04:43:06 localhost python3.9[274106]: ansible-container_config_hash Invoked with check_mode=False config_vol_prefix=/var/lib/openstack Dec 15 04:43:07 localhost python3.9[274216]: ansible-containers.podman.podman_container_info Invoked with executable=podman name=None Dec 15 04:43:08 localhost nova_compute[231752]: 2025-12-15 09:43:08.334 231756 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Dec 15 04:43:08 localhost nova_compute[231752]: 2025-12-15 09:43:08.336 231756 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Dec 15 04:43:08 localhost nova_compute[231752]: 2025-12-15 09:43:08.337 231756 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m Dec 15 04:43:08 localhost nova_compute[231752]: 2025-12-15 09:43:08.337 231756 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Dec 15 04:43:08 localhost nova_compute[231752]: 2025-12-15 09:43:08.358 
231756 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 04:43:08 localhost nova_compute[231752]: 2025-12-15 09:43:08.359 231756 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Dec 15 04:43:10 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:fc:1f:13 MACDST=fa:16:3e:b3:bb:e3 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=35413 DF PROTO=TCP SPT=38304 DPT=9102 SEQ=3326393576 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A83A6B9250000000001030307) Dec 15 04:43:10 localhost systemd[1]: Started /usr/bin/podman healthcheck run 730c50020a7d9e2da21d2462d40af20ac303e43f3491f692cbbd87f432ba3e09. Dec 15 04:43:10 localhost podman[274261]: 2025-12-15 09:43:10.753723119 +0000 UTC m=+0.081537154 container health_status 730c50020a7d9e2da21d2462d40af20ac303e43f3491f692cbbd87f432ba3e09 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, name=ubi9-minimal, io.openshift.expose-services=, managed_by=edpm_ansible, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, release=1755695350, url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, build-date=2025-08-20T13:12:41, container_name=openstack_network_exporter, com.redhat.component=ubi9-minimal-container, config_id=openstack_network_exporter, architecture=x86_64, io.buildah.version=1.33.7, maintainer=Red Hat, Inc., config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'cb1e9478634a00c8fb5ad9cd7fcd38c7b2a1a85187dfaa618bc715c24c2b15fb'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, vendor=Red Hat, Inc., distribution-scope=public, version=9.6) Dec 15 04:43:10 localhost podman[274261]: 2025-12-15 09:43:10.770431769 +0000 UTC m=+0.098245804 container exec_died 730c50020a7d9e2da21d2462d40af20ac303e43f3491f692cbbd87f432ba3e09 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, managed_by=edpm_ansible, url=https://catalog.redhat.com/en/search?searchType=containers, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.buildah.version=1.33.7, io.openshift.expose-services=, maintainer=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'cb1e9478634a00c8fb5ad9cd7fcd38c7b2a1a85187dfaa618bc715c24c2b15fb'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': 
['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, container_name=openstack_network_exporter, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, name=ubi9-minimal, vendor=Red Hat, Inc., vcs-type=git, config_id=openstack_network_exporter, architecture=x86_64, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1755695350, version=9.6, build-date=2025-08-20T13:12:41) Dec 15 04:43:10 localhost systemd[1]: 730c50020a7d9e2da21d2462d40af20ac303e43f3491f692cbbd87f432ba3e09.service: Deactivated successfully. Dec 15 04:43:11 localhost systemd[1]: Started /usr/bin/podman healthcheck run ac2eae30a2a7f971398af42ea67b3ed799d4b3341f7688ebcd73f7ca7f66c2a8. 
Dec 15 04:43:11 localhost podman[274370]: 2025-12-15 09:43:11.746547886 +0000 UTC m=+0.072273297 container health_status ac2eae30a2a7f971398af42ea67b3ed799d4b3341f7688ebcd73f7ca7f66c2a8 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, managed_by=edpm_ansible, tcib_managed=true, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '07f3af15d79276418f54a8dde2fc251064527c3571430e14af77aaa2c01f1193'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.license=GPLv2, config_id=ovn_controller, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS) Dec 15 04:43:11 localhost podman[274370]: 2025-12-15 09:43:11.817366602 +0000 UTC m=+0.143092023 container exec_died ac2eae30a2a7f971398af42ea67b3ed799d4b3341f7688ebcd73f7ca7f66c2a8 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 
'07f3af15d79276418f54a8dde2fc251064527c3571430e14af77aaa2c01f1193'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, config_id=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, container_name=ovn_controller, org.label-schema.build-date=20251202) Dec 15 04:43:11 localhost systemd[1]: ac2eae30a2a7f971398af42ea67b3ed799d4b3341f7688ebcd73f7ca7f66c2a8.service: Deactivated successfully. 
Dec 15 04:43:12 localhost python3[274385]: ansible-edpm_container_manage Invoked with concurrency=1 config_dir=/var/lib/edpm-config/container-startup-config/multipathd config_id=multipathd config_overrides={} config_patterns=*.json containers=['multipathd'] log_base_path=/var/log/containers/stdouts debug=False Dec 15 04:43:12 localhost python3[274385]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: [#012 {#012 "Id": "bcd3898ac099c7fff3d2ff3fc32de931119ed36068f8a2617bd8fa95e51d1b81",#012 "Digest": "sha256:df38dbd6b3eccec2abaa8e3618a385405ccec1b73ae8c3573a138b0c961ed31f",#012 "RepoTags": [#012 "quay.io/podified-antelope-centos9/openstack-multipathd:current-podified"#012 ],#012 "RepoDigests": [#012 "quay.io/podified-antelope-centos9/openstack-multipathd@sha256:df38dbd6b3eccec2abaa8e3618a385405ccec1b73ae8c3573a138b0c961ed31f"#012 ],#012 "Parent": "",#012 "Comment": "",#012 "Created": "2025-12-08T06:10:05.835596707Z",#012 "Config": {#012 "User": "root",#012 "Env": [#012 "PATH=/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin",#012 "LANG=en_US.UTF-8",#012 "TZ=UTC",#012 "container=oci"#012 ],#012 "Entrypoint": [#012 "dumb-init",#012 "--single-child",#012 "--"#012 ],#012 "Cmd": [#012 "kolla_start"#012 ],#012 "Labels": {#012 "io.buildah.version": "1.41.3",#012 "maintainer": "OpenStack Kubernetes Operator team",#012 "org.label-schema.build-date": "20251202",#012 "org.label-schema.license": "GPLv2",#012 "org.label-schema.name": "CentOS Stream 9 Base Image",#012 "org.label-schema.schema-version": "1.0",#012 "org.label-schema.vendor": "CentOS",#012 "tcib_build_tag": "c3923531bcda0b0811b2d5053f189beb",#012 "tcib_managed": "true"#012 },#012 "StopSignal": "SIGTERM"#012 },#012 "Version": "",#012 "Author": "",#012 "Architecture": "amd64",#012 "Os": "linux",#012 "Size": 250004968,#012 "VirtualSize": 250004968,#012 "GraphDriver": {#012 "Name": "overlay",#012 "Data": {#012 "LowerDir": 
"/var/lib/containers/storage/overlay/102653142e2259aa6223045dee7736729104ac8aed3ce9b3c87a6d0787e59de8/diff:/var/lib/containers/storage/overlay/a170762be59c15b133bd19c602942600caa3082ffe7158ccee8771dfc16bb660/diff",#012 "UpperDir": "/var/lib/containers/storage/overlay/bd66482ae52e5d60dae4275269d853addcdaa29eea9ccf57897aa9813983d9b9/diff",#012 "WorkDir": "/var/lib/containers/storage/overlay/bd66482ae52e5d60dae4275269d853addcdaa29eea9ccf57897aa9813983d9b9/work"#012 }#012 },#012 "RootFS": {#012 "Type": "layers",#012 "Layers": [#012 "sha256:a170762be59c15b133bd19c602942600caa3082ffe7158ccee8771dfc16bb660",#012 "sha256:47bbb708952ccfdaf6b1a15cd5347cc2e9ee37e63ec65603401dcebf66de9242",#012 "sha256:ccd3d74704a5e6cddf38f6a0093a76e70e771f31b3c4bb1db14346740d014124"#012 ]#012 },#012 "Labels": {#012 "io.buildah.version": "1.41.3",#012 "maintainer": "OpenStack Kubernetes Operator team",#012 "org.label-schema.build-date": "20251202",#012 "org.label-schema.license": "GPLv2",#012 "org.label-schema.name": "CentOS Stream 9 Base Image",#012 "org.label-schema.schema-version": "1.0",#012 "org.label-schema.vendor": "CentOS",#012 "tcib_build_tag": "c3923531bcda0b0811b2d5053f189beb",#012 "tcib_managed": "true"#012 },#012 "Annotations": {},#012 "ManifestType": "application/vnd.docker.distribution.manifest.v2+json",#012 "User": "root",#012 "History": [#012 {#012 "created": "2025-12-02T04:26:51.317229596Z",#012 "created_by": "/bin/sh -c #(nop) ADD file:9f05a0f58e10b77188c7243d914ce56c5ce3e0f2ee7e13a7b0d4990588c97b99 in / ",#012 "empty_layer": true#012 },#012 {#012 "created": "2025-12-02T04:26:51.317315213Z",#012 "created_by": "/bin/sh -c #(nop) LABEL org.label-schema.schema-version=\"1.0\" org.label-schema.name=\"CentOS Stream 9 Base Image\" org.label-schema.vendor=\"CentOS\" org.label-schema.license=\"GPLv2\" org.label-schema.build-date=\"20251202\"",#012 "empty_layer": true#012 },#012 {#012 "created": "2025-12-02T04:26:54.063957926Z",#012 "created_by": "/bin/sh -c #(nop) CMD 
[\"/bin/bash\"]"#012 },#012 {#012 "created": "2025-12-08T06:08:28.750777742Z",#012 "created_by": "/bin/sh -c #(nop) LABEL maintainer=\"OpenStack Kubernetes Operator team\"",#012 "comment": "FROM quay.io/centos/centos:stream9",#012 "empty_layer": true#012 },#012 {#012 "created": "2025-12-08T06:08:28.750791962Z",#012 "created_by": "/bin/sh -c #(nop) LABEL tcib_managed=true",#012 "empty_layer": true#012 },#012 {#012 "created": "2025-12-08T06:08:28.750804372Z",#012 "created_by": "/bin/sh -c #(nop) ENV LANG=\"en_US.UTF-8\"",#012 "empty_layer": true#012 },#012 {#012 "created": "2025-12-08T06:08:28.750813613Z",#012 "created_by": "/bin/sh -c #(nop) ENV TZ=\"UTC\"",#012 "empty_layer": true#012 },#012 {#012 "created": "2025-12-08T06:08:28.750824813Z",#012 "created_by": "/bin/sh -c #(nop) ENV container=\"oci\"",#012 "empty_layer": true#012 },#012 {#012 "created": "2025-12-08T06:08:28.750833663Z",#012 "created_by": "/bin/sh -c #(nop) USER root",#012 "empty_layer": true#012 },#012 {#012 "created": "2025-12-08T06:08:29.160435164Z",#012 "created_by": "/bin/sh -c if [ -f \"/etc/yum.repos.d/ubi.repo\" ]; then rm -f /etc/yum.repos.d/ubi.repo && dnf clean all && rm -rf /var/cache/dnf; fi",#012 "empty_layer": true#012 },#012 {#012 "created": "2025-12-08T06:09:05.859236491Z",#012 "created_by": "/bin/sh -c dnf install -y crudini && crudini --del /etc/dnf/dnf.conf main override_install_langs && crudini --set /etc/dnf/dnf.conf main clean_requirements_on_remove True && crudini --set /etc/dnf/dnf.conf main exactarch 1 && crudini --set /etc/dnf/dnf.conf main gpgcheck 1 && crudini --set /etc/dnf/dnf.conf main install_weak_deps False && if [ 'centos' == 'centos' ];then crudini --set /etc/dnf/dnf.conf main best False; fi && crudini --set /etc/dnf/dnf.conf main installonly_limit 0 && crudini --set /etc/dnf/dnf.conf main keepcache 0 && crudini --set /etc/dnf/dnf.conf main obsoletes 1 && crudini --set /etc/dnf/dnf.conf main plugins 1 && crudini --set /etc/dnf/dnf.conf main 
skip_missing_names_on_install False && crudini --set /etc/dnf/dnf.conf main tsflags nodocs",#012 "empty_layer": true#012 },#012 {#012 "created": "2025-12-08T06:09:09.443839088Z",#012 "created_by": "/bin/sh -c dnf install -y ca-certificates dumb-init glibc-langpack-en procps-ng python3 sudo util-linux-user which python-tcib-containers",#012 "empty_layer": true#012 },#012 {#012 "created": "2025-12-08T06:09:09.832605984Z",#012 Dec 15 04:43:13 localhost python3.9[274568]: ansible-ansible.builtin.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1 Dec 15 04:43:13 localhost nova_compute[231752]: 2025-12-15 09:43:13.358 231756 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 04:43:13 localhost nova_compute[231752]: 2025-12-15 09:43:13.359 231756 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 04:43:13 localhost python3.9[274680]: ansible-file Invoked with path=/etc/systemd/system/edpm_multipathd.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Dec 15 04:43:14 localhost python3.9[274735]: ansible-stat Invoked with path=/etc/systemd/system/edpm_multipathd_healthcheck.timer follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1 Dec 15 04:43:15 localhost python3.9[274844]: ansible-copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1765791794.3753374-1394-36899174956140/source dest=/etc/systemd/system/edpm_multipathd.service mode=0644 
owner=root group=root backup=False force=True remote_src=False follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None Dec 15 04:43:15 localhost python3.9[274899]: ansible-systemd Invoked with state=started name=edpm_multipathd.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None Dec 15 04:43:16 localhost systemd[1]: Started /usr/bin/podman healthcheck run 4d46c406a3855fd1c322e5d86c048188ea5865daa07ba23aaebacc7879668923. Dec 15 04:43:16 localhost podman[274919]: 2025-12-15 09:43:16.732583832 +0000 UTC m=+0.067736477 container health_status 4d46c406a3855fd1c322e5d86c048188ea5865daa07ba23aaebacc7879668923 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.license=GPLv2, container_name=ovn_metadata_agent, config_id=ovn_metadata_agent, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '07f3af15d79276418f54a8dde2fc251064527c3571430e14af77aaa2c01f1193-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', 
'/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0) Dec 15 04:43:16 localhost podman[274919]: 2025-12-15 09:43:16.765449517 +0000 UTC m=+0.100602212 container exec_died 4d46c406a3855fd1c322e5d86c048188ea5865daa07ba23aaebacc7879668923 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '07f3af15d79276418f54a8dde2fc251064527c3571430e14af77aaa2c01f1193-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': 
['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, tcib_managed=true) Dec 15 04:43:16 localhost systemd[1]: 4d46c406a3855fd1c322e5d86c048188ea5865daa07ba23aaebacc7879668923.service: Deactivated successfully. Dec 15 04:43:17 localhost python3.9[275027]: ansible-ansible.builtin.slurp Invoked with src=/var/lib/edpm-config/deployed_services.yaml Dec 15 04:43:18 localhost nova_compute[231752]: 2025-12-15 09:43:18.360 231756 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Dec 15 04:43:18 localhost nova_compute[231752]: 2025-12-15 09:43:18.362 231756 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Dec 15 04:43:18 localhost nova_compute[231752]: 2025-12-15 09:43:18.362 231756 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m Dec 15 04:43:18 localhost nova_compute[231752]: 2025-12-15 09:43:18.363 231756 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Dec 15 04:43:18 localhost nova_compute[231752]: 2025-12-15 09:43:18.380 231756 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup 
/usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 04:43:18 localhost nova_compute[231752]: 2025-12-15 09:43:18.381 231756 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Dec 15 04:43:18 localhost python3.9[275137]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/deployed_services.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Dec 15 04:43:18 localhost python3.9[275227]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/deployed_services.yaml mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1765791797.9506614-1505-92943797182347/.source.yaml _original_basename=.kxfhk2co follow=False checksum=43862d31d8e68081861e6672a0a75a3cd73bea1a backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Dec 15 04:43:19 localhost python3.9[275335]: ansible-ansible.builtin.stat Invoked with path=/etc/multipath/.multipath_restart_required follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1 Dec 15 04:43:20 localhost python3.9[275445]: ansible-ansible.builtin.file Invoked with path=/etc/multipath/.multipath_restart_required state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Dec 15 04:43:21 localhost python3.9[275555]: ansible-ansible.builtin.file Invoked with mode=0755 path=/etc/modules-load.d selevel=s0 setype=etc_t state=directory 
recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None attributes=None Dec 15 04:43:22 localhost python3.9[275665]: ansible-community.general.modprobe Invoked with name=nvme-fabrics state=present params= persistent=disabled Dec 15 04:43:22 localhost python3.9[275775]: ansible-ansible.legacy.stat Invoked with path=/etc/modules-load.d/nvme-fabrics.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Dec 15 04:43:23 localhost python3.9[275832]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/etc/modules-load.d/nvme-fabrics.conf _original_basename=module-load.conf.j2 recurse=False state=file path=/etc/modules-load.d/nvme-fabrics.conf force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Dec 15 04:43:23 localhost nova_compute[231752]: 2025-12-15 09:43:23.382 231756 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Dec 15 04:43:24 localhost python3.9[275942]: ansible-ansible.builtin.lineinfile Invoked with create=True dest=/etc/modules line=nvme-fabrics mode=0644 state=present path=/etc/modules encoding=utf-8 backrefs=False backup=False firstmatch=False unsafe_writes=False regexp=None search_string=None insertafter=None insertbefore=None validate=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Dec 15 04:43:24 localhost systemd[1]: Started /usr/bin/podman healthcheck run a4e94bd7aaee6c174c601060fb5f34559019ccea93fc397263ee3735731cfc1e. 
Dec 15 04:43:24 localhost podman[276053]: 2025-12-15 09:43:24.739759597 +0000 UTC m=+0.080333090 container health_status a4e94bd7aaee6c174c601060fb5f34559019ccea93fc397263ee3735731cfc1e (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible) Dec 15 04:43:24 localhost podman[276053]: 2025-12-15 09:43:24.748257511 +0000 UTC m=+0.088830994 container exec_died a4e94bd7aaee6c174c601060fb5f34559019ccea93fc397263ee3735731cfc1e (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': 
['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter) Dec 15 04:43:24 localhost systemd[1]: a4e94bd7aaee6c174c601060fb5f34559019ccea93fc397263ee3735731cfc1e.service: Deactivated successfully. Dec 15 04:43:24 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:fc:1f:13 MACDST=fa:16:3e:b3:bb:e3 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=34301 DF PROTO=TCP SPT=36460 DPT=9102 SEQ=1386507001 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A83A6F2E20000000001030307) Dec 15 04:43:24 localhost python3.9[276052]: ansible-ansible.legacy.dnf Invoked with name=['nvme-cli'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None Dec 15 04:43:25 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:fc:1f:13 MACDST=fa:16:3e:b3:bb:e3 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=34302 DF PROTO=TCP SPT=36460 DPT=9102 SEQ=1386507001 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A83A6F6E50000000001030307) Dec 15 04:43:26 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:fc:1f:13 MACDST=fa:16:3e:b3:bb:e3 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=35414 DF PROTO=TCP SPT=38304 DPT=9102 SEQ=3326393576 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A83A6F9250000000001030307) Dec 15 04:43:27 localhost kernel: DROPPING: IN=br-ex OUT= 
MACSRC=fa:16:3e:fc:1f:13 MACDST=fa:16:3e:b3:bb:e3 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=34303 DF PROTO=TCP SPT=36460 DPT=9102 SEQ=1386507001 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A83A6FEE60000000001030307) Dec 15 04:43:28 localhost nova_compute[231752]: 2025-12-15 09:43:28.385 231756 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Dec 15 04:43:28 localhost nova_compute[231752]: 2025-12-15 09:43:28.387 231756 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Dec 15 04:43:28 localhost nova_compute[231752]: 2025-12-15 09:43:28.388 231756 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m Dec 15 04:43:28 localhost nova_compute[231752]: 2025-12-15 09:43:28.388 231756 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Dec 15 04:43:28 localhost nova_compute[231752]: 2025-12-15 09:43:28.415 231756 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 04:43:28 localhost nova_compute[231752]: 2025-12-15 09:43:28.416 231756 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Dec 15 04:43:28 localhost python3.9[276183]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d Dec 15 04:43:29 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:fc:1f:13 MACDST=fa:16:3e:b3:bb:e3 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=17485 
DF PROTO=TCP SPT=36494 DPT=9102 SEQ=4291295771 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A83A703250000000001030307) Dec 15 04:43:29 localhost python3.9[276297]: ansible-ansible.builtin.file Invoked with mode=0644 path=/etc/ssh/ssh_known_hosts state=touch recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Dec 15 04:43:30 localhost python3.9[276407]: ansible-ansible.builtin.systemd_service Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None Dec 15 04:43:30 localhost systemd[1]: Reloading. Dec 15 04:43:30 localhost systemd-rc-local-generator[276432]: /etc/rc.d/rc.local is not marked executable, skipping. Dec 15 04:43:30 localhost systemd-sysv-generator[276438]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. Dec 15 04:43:31 localhost systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload Dec 15 04:43:31 localhost systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload Dec 15 04:43:31 localhost systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload Dec 15 04:43:31 localhost systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload Dec 15 04:43:31 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. 
Support for MemoryLimit= will be removed soon. Dec 15 04:43:31 localhost systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload Dec 15 04:43:31 localhost systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload Dec 15 04:43:31 localhost systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload Dec 15 04:43:31 localhost systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload Dec 15 04:43:31 localhost podman[243449]: time="2025-12-15T09:43:31Z" level=info msg="List containers: received `last` parameter - overwriting `limit`" Dec 15 04:43:31 localhost podman[243449]: @ - - [15/Dec/2025:09:43:31 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 149352 "" "Go-http-client/1.1" Dec 15 04:43:31 localhost python3.9[276551]: ansible-ansible.builtin.service_facts Invoked Dec 15 04:43:31 localhost podman[243449]: @ - - [15/Dec/2025:09:43:31 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 17223 "" "Go-http-client/1.1" Dec 15 04:43:31 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:fc:1f:13 MACDST=fa:16:3e:b3:bb:e3 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=34304 DF PROTO=TCP SPT=36460 DPT=9102 SEQ=1386507001 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A83A70EA60000000001030307) Dec 15 04:43:32 localhost network[276568]: You are using 'network' service provided by 'network-scripts', which are now deprecated. Dec 15 04:43:32 localhost network[276569]: 'network-scripts' will be removed from distribution in near future. Dec 15 04:43:32 localhost network[276570]: It is advised to switch to 'NetworkManager' instead for network management. 
Dec 15 04:43:33 localhost nova_compute[231752]: 2025-12-15 09:43:33.418 231756 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 04:43:34 localhost systemd[1]: /usr/lib/systemd/system/insights-client.service:23: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Dec 15 04:43:34 localhost systemd[1]: Started /usr/bin/podman healthcheck run 9f0f481c937606298203797a71924b7614a5d9e3f4a8afd837cfa5b9602aedeb. Dec 15 04:43:34 localhost openstack_network_exporter[246484]: ERROR 09:43:34 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd Dec 15 04:43:34 localhost openstack_network_exporter[246484]: ERROR 09:43:34 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd Dec 15 04:43:34 localhost openstack_network_exporter[246484]: ERROR 09:43:34 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server Dec 15 04:43:34 localhost openstack_network_exporter[246484]: ERROR 09:43:34 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath Dec 15 04:43:34 localhost openstack_network_exporter[246484]: Dec 15 04:43:34 localhost openstack_network_exporter[246484]: ERROR 09:43:34 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath Dec 15 04:43:34 localhost openstack_network_exporter[246484]: Dec 15 04:43:34 localhost systemd[1]: Started /usr/bin/podman healthcheck run 67ee72fcd99c896f79e8807764b1d0f04e7d45927d06e306c0986394e8ed42a0. Dec 15 04:43:34 localhost systemd[1]: tmp-crun.8NypoY.mount: Deactivated successfully. 
Dec 15 04:43:34 localhost podman[276639]: 2025-12-15 09:43:34.933622951 +0000 UTC m=+0.105423069 container health_status 9f0f481c937606298203797a71924b7614a5d9e3f4a8afd837cfa5b9602aedeb (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, container_name=multipathd, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_id=multipathd, io.buildah.version=1.41.3, managed_by=edpm_ansible, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true) Dec 15 04:43:34 localhost systemd[1]: tmp-crun.GC4wGa.mount: Deactivated successfully. 
Dec 15 04:43:34 localhost podman[276654]: 2025-12-15 09:43:34.997636241 +0000 UTC m=+0.083017656 container health_status 67ee72fcd99c896f79e8807764b1d0f04e7d45927d06e306c0986394e8ed42a0 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}) Dec 15 04:43:35 localhost podman[276654]: 2025-12-15 09:43:35.006231537 +0000 UTC m=+0.091612972 container exec_died 67ee72fcd99c896f79e8807764b1d0f04e7d45927d06e306c0986394e8ed42a0 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', 
'--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible) Dec 15 04:43:35 localhost systemd[1]: 67ee72fcd99c896f79e8807764b1d0f04e7d45927d06e306c0986394e8ed42a0.service: Deactivated successfully. 
Dec 15 04:43:35 localhost podman[276639]: 2025-12-15 09:43:35.02615915 +0000 UTC m=+0.197959318 container exec_died 9f0f481c937606298203797a71924b7614a5d9e3f4a8afd837cfa5b9602aedeb (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=multipathd, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, tcib_managed=true) Dec 15 04:43:35 localhost systemd[1]: 9f0f481c937606298203797a71924b7614a5d9e3f4a8afd837cfa5b9602aedeb.service: Deactivated successfully. 
Dec 15 04:43:36 localhost systemd[1]: Started /usr/bin/podman healthcheck run b3d8483ade539500e228c2af9c0c3e647febb5ec7903610a83aea32631007d4a. Dec 15 04:43:36 localhost podman[276692]: 2025-12-15 09:43:36.150640831 +0000 UTC m=+0.085982202 container health_status b3d8483ade539500e228c2af9c0c3e647febb5ec7903610a83aea32631007d4a (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=ceilometer_agent_compute, config_id=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '07f3af15d79276418f54a8dde2fc251064527c3571430e14af77aaa2c01f1193-cb1e9478634a00c8fb5ad9cd7fcd38c7b2a1a85187dfaa618bc715c24c2b15fb'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, managed_by=edpm_ansible) Dec 15 04:43:36 localhost podman[276692]: 2025-12-15 09:43:36.162756749 +0000 UTC m=+0.098098150 container exec_died b3d8483ade539500e228c2af9c0c3e647febb5ec7903610a83aea32631007d4a (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, org.label-schema.schema-version=1.0, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, config_id=ceilometer_agent_compute, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '07f3af15d79276418f54a8dde2fc251064527c3571430e14af77aaa2c01f1193-cb1e9478634a00c8fb5ad9cd7fcd38c7b2a1a85187dfaa618bc715c24c2b15fb'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, tcib_managed=true, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.license=GPLv2) Dec 15 04:43:36 localhost systemd[1]: b3d8483ade539500e228c2af9c0c3e647febb5ec7903610a83aea32631007d4a.service: Deactivated successfully. Dec 15 04:43:36 localhost nova_compute[231752]: 2025-12-15 09:43:36.628 231756 DEBUG oslo_service.periodic_task [None req-c7fd4a3f-c95b-4936-a071-10d43fe8d19c - - - - - -] Running periodic task ComputeManager._sync_power_states run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Dec 15 04:43:36 localhost nova_compute[231752]: 2025-12-15 09:43:36.648 231756 DEBUG nova.compute.manager [None req-c7fd4a3f-c95b-4936-a071-10d43fe8d19c - - - - - -] Triggering sync for uuid 39ff1bd9-6f6b-44c8-bbec-a1fd9d196359 _sync_power_states /usr/lib/python3.9/site-packages/nova/compute/manager.py:10268#033[00m Dec 15 04:43:36 localhost nova_compute[231752]: 2025-12-15 09:43:36.649 231756 DEBUG oslo_concurrency.lockutils [None req-c7fd4a3f-c95b-4936-a071-10d43fe8d19c - - - - - -] Acquiring lock "39ff1bd9-6f6b-44c8-bbec-a1fd9d196359" by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Dec 15 04:43:36 localhost nova_compute[231752]: 2025-12-15 09:43:36.650 231756 DEBUG oslo_concurrency.lockutils [None req-c7fd4a3f-c95b-4936-a071-10d43fe8d19c - - - - - -] Lock "39ff1bd9-6f6b-44c8-bbec-a1fd9d196359" acquired by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Dec 15 04:43:36 localhost nova_compute[231752]: 2025-12-15 09:43:36.682 
231756 DEBUG oslo_concurrency.lockutils [None req-c7fd4a3f-c95b-4936-a071-10d43fe8d19c - - - - - -] Lock "39ff1bd9-6f6b-44c8-bbec-a1fd9d196359" "released" by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" :: held 0.032s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Dec 15 04:43:38 localhost nova_compute[231752]: 2025-12-15 09:43:38.420 231756 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Dec 15 04:43:38 localhost nova_compute[231752]: 2025-12-15 09:43:38.422 231756 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Dec 15 04:43:38 localhost nova_compute[231752]: 2025-12-15 09:43:38.422 231756 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m Dec 15 04:43:38 localhost nova_compute[231752]: 2025-12-15 09:43:38.422 231756 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Dec 15 04:43:38 localhost nova_compute[231752]: 2025-12-15 09:43:38.456 231756 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 04:43:38 localhost nova_compute[231752]: 2025-12-15 09:43:38.457 231756 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Dec 15 04:43:38 localhost python3.9[276864]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_compute.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None Dec 15 04:43:39 localhost python3.9[276975]: 
ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_migration_target.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None Dec 15 04:43:40 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:fc:1f:13 MACDST=fa:16:3e:b3:bb:e3 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=34305 DF PROTO=TCP SPT=36460 DPT=9102 SEQ=1386507001 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A83A72F260000000001030307) Dec 15 04:43:40 localhost python3.9[277086]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_api_cron.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None Dec 15 04:43:40 localhost python3.9[277197]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_api.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None Dec 15 04:43:41 localhost systemd[1]: Started /usr/bin/podman healthcheck run 730c50020a7d9e2da21d2462d40af20ac303e43f3491f692cbbd87f432ba3e09. Dec 15 04:43:41 localhost podman[277199]: 2025-12-15 09:43:41.107344087 +0000 UTC m=+0.080860729 container health_status 730c50020a7d9e2da21d2462d40af20ac303e43f3491f692cbbd87f432ba3e09 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, managed_by=edpm_ansible, release=1755695350, io.buildah.version=1.33.7, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., container_name=openstack_network_exporter, build-date=2025-08-20T13:12:41, io.openshift.expose-services=, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, version=9.6, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'cb1e9478634a00c8fb5ad9cd7fcd38c7b2a1a85187dfaa618bc715c24c2b15fb'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, name=ubi9-minimal, architecture=x86_64, com.redhat.component=ubi9-minimal-container, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=minimal rhel9, vcs-type=git, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., config_id=openstack_network_exporter, distribution-scope=public, vendor=Red Hat, Inc.) Dec 15 04:43:41 localhost podman[277199]: 2025-12-15 09:43:41.124610319 +0000 UTC m=+0.098126951 container exec_died 730c50020a7d9e2da21d2462d40af20ac303e43f3491f692cbbd87f432ba3e09 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, release=1755695350, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'cb1e9478634a00c8fb5ad9cd7fcd38c7b2a1a85187dfaa618bc715c24c2b15fb'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, distribution-scope=public, io.openshift.expose-services=, io.openshift.tags=minimal rhel9, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., 
config_id=openstack_network_exporter, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=ubi9-minimal, architecture=x86_64, url=https://catalog.redhat.com/en/search?searchType=containers, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.33.7, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, com.redhat.component=ubi9-minimal-container, version=9.6, container_name=openstack_network_exporter, managed_by=edpm_ansible, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, maintainer=Red Hat, Inc., vcs-type=git, build-date=2025-08-20T13:12:41) Dec 15 04:43:41 localhost systemd[1]: 730c50020a7d9e2da21d2462d40af20ac303e43f3491f692cbbd87f432ba3e09.service: Deactivated successfully. Dec 15 04:43:41 localhost python3.9[277330]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_conductor.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None Dec 15 04:43:42 localhost systemd[1]: Started /usr/bin/podman healthcheck run ac2eae30a2a7f971398af42ea67b3ed799d4b3341f7688ebcd73f7ca7f66c2a8. 
Dec 15 04:43:42 localhost podman[277441]: 2025-12-15 09:43:42.259155952 +0000 UTC m=+0.084639845 container health_status ac2eae30a2a7f971398af42ea67b3ed799d4b3341f7688ebcd73f7ca7f66c2a8 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '07f3af15d79276418f54a8dde2fc251064527c3571430e14af77aaa2c01f1193'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_managed=true, container_name=ovn_controller)
Dec 15 04:43:42 localhost podman[277441]: 2025-12-15 09:43:42.33179261 +0000 UTC m=+0.157276493 container exec_died ac2eae30a2a7f971398af42ea67b3ed799d4b3341f7688ebcd73f7ca7f66c2a8 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, container_name=ovn_controller, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '07f3af15d79276418f54a8dde2fc251064527c3571430e14af77aaa2c01f1193'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, config_id=ovn_controller)
Dec 15 04:43:42 localhost systemd[1]: ac2eae30a2a7f971398af42ea67b3ed799d4b3341f7688ebcd73f7ca7f66c2a8.service: Deactivated successfully.
Dec 15 04:43:42 localhost python3.9[277442]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_metadata.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 15 04:43:43 localhost nova_compute[231752]: 2025-12-15 09:43:43.458 231756 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Dec 15 04:43:43 localhost nova_compute[231752]: 2025-12-15 09:43:43.460 231756 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Dec 15 04:43:43 localhost nova_compute[231752]: 2025-12-15 09:43:43.460 231756 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m
Dec 15 04:43:43 localhost nova_compute[231752]: 2025-12-15 09:43:43.461 231756 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m
Dec 15 04:43:43 localhost nova_compute[231752]: 2025-12-15 09:43:43.488 231756 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 15 04:43:43 localhost nova_compute[231752]: 2025-12-15 09:43:43.488 231756 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m
Dec 15 04:43:44 localhost python3.9[277577]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_scheduler.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 15 04:43:44 localhost python3.9[277688]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_vnc_proxy.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 15 04:43:47 localhost systemd[1]: Started /usr/bin/podman healthcheck run 4d46c406a3855fd1c322e5d86c048188ea5865daa07ba23aaebacc7879668923.
Dec 15 04:43:47 localhost podman[277799]: 2025-12-15 09:43:47.695154224 +0000 UTC m=+0.074799940 container health_status 4d46c406a3855fd1c322e5d86c048188ea5865daa07ba23aaebacc7879668923 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '07f3af15d79276418f54a8dde2fc251064527c3571430e14af77aaa2c01f1193-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, managed_by=edpm_ansible, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team)
Dec 15 04:43:47 localhost podman[277799]: 2025-12-15 09:43:47.704384992 +0000 UTC m=+0.084030798 container exec_died 4d46c406a3855fd1c322e5d86c048188ea5865daa07ba23aaebacc7879668923 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '07f3af15d79276418f54a8dde2fc251064527c3571430e14af77aaa2c01f1193-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Dec 15 04:43:47 localhost systemd[1]: 4d46c406a3855fd1c322e5d86c048188ea5865daa07ba23aaebacc7879668923.service: Deactivated successfully.
Dec 15 04:43:47 localhost python3.9[277800]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_compute.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 15 04:43:47 localhost nova_compute[231752]: 2025-12-15 09:43:47.969 231756 DEBUG oslo_service.periodic_task [None req-c7fd4a3f-c95b-4936-a071-10d43fe8d19c - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.117 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359', 'name': 'test', 'flavor': {'id': '2da0e147-aaa7-4bb9-a176-5fe1b15a32a0', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'image': {'id': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-00000002', 'OS-EXT-SRV-ATTR:host': 'np0005559462.localdomain', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': 'c785bf23f53946bc99867d8832a50266', 'user_id': '1ba5fce347b64bfebf995f187193f205', 'hostId': 'bf4d507aa00b3fcf643a7e0b883420632ac7fc96e8c7a756982e121d', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228
Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.118 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.error in the context of pollsters
Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.123 12 DEBUG ceilometer.compute.pollsters [-] 39ff1bd9-6f6b-44c8-bbec-a1fd9d196359/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.125 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '240ed196-1d5a-48d3-8b76-acd2908b0042', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '1ba5fce347b64bfebf995f187193f205', 'user_name': None, 'project_id': 'c785bf23f53946bc99867d8832a50266', 'project_name': None, 'resource_id': 'instance-00000002-39ff1bd9-6f6b-44c8-bbec-a1fd9d196359-tap03ef8889-32', 'timestamp': '2025-12-15T09:43:48.118890', 'resource_metadata': {'display_name': 'test', 'name': 'tap03ef8889-32', 'instance_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359', 'instance_type': 'm1.small', 'host': 'bf4d507aa00b3fcf643a7e0b883420632ac7fc96e8c7a756982e121d', 'instance_host': 'np0005559462.localdomain', 'flavor': {'id': '2da0e147-aaa7-4bb9-a176-5fe1b15a32a0', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e'}, 'image_ref': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:74:df:7c', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap03ef8889-32'}, 'message_id': '8e217ea2-d99a-11f0-817e-fa163ebaca0f', 'monotonic_time': 10894.31155988, 'message_signature': 'ce61d50d997d3cbec9ecffe5adcc08098eb2d3b6a2fbd44b67b192f90ebdd217'}]}, 'timestamp': '2025-12-15 09:43:48.123913', '_unique_id': '62ee5691ef1a4f179afcf89e774beb90'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.125 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.125 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.125 12 ERROR oslo_messaging.notify.messaging     yield
Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.125 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.125 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.125 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.125 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.125 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.125 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.125 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.125 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.125 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.125 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.125 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.125 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.125 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.125 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.125 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.125 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.125 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.125 12 ERROR oslo_messaging.notify.messaging 
Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.125 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.125 12 ERROR oslo_messaging.notify.messaging 
Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.125 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.125 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.125 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.125 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.125 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.125 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.125 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.125 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.125 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.125 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.125 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.125 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.125 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.125 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.125 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.125 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.125 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.125 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.125 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.125 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.125 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.125 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.125 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.125 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.125 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.125 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.125 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.125 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.125 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.125 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.125 12 ERROR oslo_messaging.notify.messaging 
Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.126 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.latency in the context of pollsters
Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.162 12 DEBUG ceilometer.compute.pollsters [-] 39ff1bd9-6f6b-44c8-bbec-a1fd9d196359/disk.device.read.latency volume: 937264501 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.163 12 DEBUG ceilometer.compute.pollsters [-] 39ff1bd9-6f6b-44c8-bbec-a1fd9d196359/disk.device.read.latency volume: 204572919 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.165 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'e83093af-fcf4-49b7-a32a-30fe66d0688d', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 937264501, 'user_id': '1ba5fce347b64bfebf995f187193f205', 'user_name': None, 'project_id': 'c785bf23f53946bc99867d8832a50266', 'project_name': None, 'resource_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359-vda', 'timestamp': '2025-12-15T09:43:48.127090', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359', 'instance_type': 'm1.small', 'host': 'bf4d507aa00b3fcf643a7e0b883420632ac7fc96e8c7a756982e121d', 'instance_host': 'np0005559462.localdomain', 'flavor': {'id': '2da0e147-aaa7-4bb9-a176-5fe1b15a32a0', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e'}, 'image_ref': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '8e277d70-d99a-11f0-817e-fa163ebaca0f', 'monotonic_time': 10894.319766699, 'message_signature': 'dbc04a712c4441aa827cdf9debe0c5488683d6e9f69ff366b651f2bec7622a5b'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 204572919, 'user_id': '1ba5fce347b64bfebf995f187193f205', 'user_name': None, 'project_id': 'c785bf23f53946bc99867d8832a50266', 'project_name': None, 'resource_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359-vdb', 'timestamp': '2025-12-15T09:43:48.127090', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359', 'instance_type': 'm1.small', 'host': 'bf4d507aa00b3fcf643a7e0b883420632ac7fc96e8c7a756982e121d', 'instance_host': 'np0005559462.localdomain', 'flavor': {'id': '2da0e147-aaa7-4bb9-a176-5fe1b15a32a0', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e'}, 'image_ref': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '8e27979c-d99a-11f0-817e-fa163ebaca0f', 'monotonic_time': 10894.319766699, 'message_signature': 'ff5cea60ef7e4956546839101d92520abe7e9429997a66a9fe733111e863b659'}]}, 'timestamp': '2025-12-15 09:43:48.163884', '_unique_id': 'b654f9da686040d4b370b33bec295a19'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.165 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.165 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.165 12 ERROR oslo_messaging.notify.messaging     yield
Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.165 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.165 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.165 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.165 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.165 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.165 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.165 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.165 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.165 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.165 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.165 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.165 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.165 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.165 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.165 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.165 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.165 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.165 12 ERROR oslo_messaging.notify.messaging 
Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.165 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.165 12 ERROR oslo_messaging.notify.messaging 
Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.165 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.165 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.165 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.165 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.165 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.165 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.165 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.165 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.165 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.165 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.165 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.165 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.165 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.165 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.165 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.165 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.165 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.165 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.165 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.165 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.165 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.165 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.165 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.165 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.165 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.165 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.165 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.165 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.165 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.165 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.165 12 ERROR oslo_messaging.notify.messaging 
Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.166 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.usage in the context of pollsters
Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.179 12 DEBUG ceilometer.compute.pollsters [-] 39ff1bd9-6f6b-44c8-bbec-a1fd9d196359/disk.device.usage volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.180 12 DEBUG ceilometer.compute.pollsters [-] 39ff1bd9-6f6b-44c8-bbec-a1fd9d196359/disk.device.usage volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 15 04:43:48 localhost
ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.182 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'e1969090-98f5-47f9-9ac9-f569e0683770', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '1ba5fce347b64bfebf995f187193f205', 'user_name': None, 'project_id': 'c785bf23f53946bc99867d8832a50266', 'project_name': None, 'resource_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359-vda', 'timestamp': '2025-12-15T09:43:48.167034', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359', 'instance_type': 'm1.small', 'host': 'bf4d507aa00b3fcf643a7e0b883420632ac7fc96e8c7a756982e121d', 'instance_host': 'np0005559462.localdomain', 'flavor': {'id': '2da0e147-aaa7-4bb9-a176-5fe1b15a32a0', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e'}, 'image_ref': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '8e2a1d6e-d99a-11f0-817e-fa163ebaca0f', 'monotonic_time': 10894.359738235, 'message_signature': '75c1086d0ac87fcf7a00ae64b322205e52b8f788677d268b54fadcfeae0d8c10'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '1ba5fce347b64bfebf995f187193f205', 'user_name': None, 'project_id': 'c785bf23f53946bc99867d8832a50266', 'project_name': None, 'resource_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359-vdb', 'timestamp': 
'2025-12-15T09:43:48.167034', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359', 'instance_type': 'm1.small', 'host': 'bf4d507aa00b3fcf643a7e0b883420632ac7fc96e8c7a756982e121d', 'instance_host': 'np0005559462.localdomain', 'flavor': {'id': '2da0e147-aaa7-4bb9-a176-5fe1b15a32a0', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e'}, 'image_ref': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '8e2a3222-d99a-11f0-817e-fa163ebaca0f', 'monotonic_time': 10894.359738235, 'message_signature': 'e2d8edc3292b04925da4deb82c21cddaed3b48eacacdcc997145973be70a3f78'}]}, 'timestamp': '2025-12-15 09:43:48.180854', '_unique_id': 'ae658394df354341beeb90e2338660a2'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.182 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.182 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.182 12 ERROR oslo_messaging.notify.messaging yield Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.182 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.182 12 ERROR oslo_messaging.notify.messaging return retry_over_time( 
Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.182 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.182 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.182 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.182 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.182 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.182 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.182 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.182 12 ERROR oslo_messaging.notify.messaging conn.connect() Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.182 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.182 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.182 12 ERROR 
oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.182 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.182 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.182 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.182 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.182 12 ERROR oslo_messaging.notify.messaging Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.182 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.182 12 ERROR oslo_messaging.notify.messaging Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.182 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.182 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.182 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.182 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.182 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.182 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.182 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.182 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.182 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.182 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.182 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.182 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.182 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.182 
12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.182 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.182 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.182 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.182 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.182 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.182 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.182 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.182 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.182 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.182 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.182 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.182 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.182 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.182 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.182 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.182 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.182 12 ERROR oslo_messaging.notify.messaging Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.183 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes in the context of pollsters Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.183 12 DEBUG ceilometer.compute.pollsters [-] 39ff1bd9-6f6b-44c8-bbec-a1fd9d196359/network.incoming.bytes volume: 9229 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.185 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': 'ea90d758-176a-469e-8a4d-469b74c9d701', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 9229, 'user_id': '1ba5fce347b64bfebf995f187193f205', 'user_name': None, 'project_id': 'c785bf23f53946bc99867d8832a50266', 'project_name': None, 'resource_id': 'instance-00000002-39ff1bd9-6f6b-44c8-bbec-a1fd9d196359-tap03ef8889-32', 'timestamp': '2025-12-15T09:43:48.183457', 'resource_metadata': {'display_name': 'test', 'name': 'tap03ef8889-32', 'instance_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359', 'instance_type': 'm1.small', 'host': 'bf4d507aa00b3fcf643a7e0b883420632ac7fc96e8c7a756982e121d', 'instance_host': 'np0005559462.localdomain', 'flavor': {'id': '2da0e147-aaa7-4bb9-a176-5fe1b15a32a0', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e'}, 'image_ref': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:74:df:7c', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap03ef8889-32'}, 'message_id': '8e2aac2a-d99a-11f0-817e-fa163ebaca0f', 'monotonic_time': 10894.31155988, 'message_signature': '009221bddaa0f88f02031e30e01bde7cb608c64c0676d4e966f2dc2e2e9011dc'}]}, 'timestamp': '2025-12-15 09:43:48.184047', '_unique_id': '09430a0699d44a02b23276fd0edf02eb'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.185 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 
2025-12-15 09:43:48.185 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.185 12 ERROR oslo_messaging.notify.messaging yield Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.185 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.185 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.185 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.185 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.185 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.185 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.185 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.185 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.185 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.185 12 ERROR oslo_messaging.notify.messaging conn.connect() Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.185 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.185 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.185 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.185 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.185 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.185 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.185 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.185 12 ERROR oslo_messaging.notify.messaging Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.185 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.185 12 ERROR oslo_messaging.notify.messaging Dec 15 04:43:48 localhost 
ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.185 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.185 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.185 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.185 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.185 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.185 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.185 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.185 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.185 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.185 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection 
Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.185 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.185 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.185 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.185 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.185 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.185 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.185 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.185 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.185 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.185 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Dec 15 04:43:48 
localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.185 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.185 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.185 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.185 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.185 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.185 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.185 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.185 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.185 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.185 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.185 12 ERROR oslo_messaging.notify.messaging Dec 15 04:43:48 localhost 
ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.186 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.capacity in the context of pollsters Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.186 12 DEBUG ceilometer.compute.pollsters [-] 39ff1bd9-6f6b-44c8-bbec-a1fd9d196359/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.186 12 DEBUG ceilometer.compute.pollsters [-] 39ff1bd9-6f6b-44c8-bbec-a1fd9d196359/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.188 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'f1571b9a-2576-41d1-bf60-6a1261f77917', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '1ba5fce347b64bfebf995f187193f205', 'user_name': None, 'project_id': 'c785bf23f53946bc99867d8832a50266', 'project_name': None, 'resource_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359-vda', 'timestamp': '2025-12-15T09:43:48.186441', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359', 'instance_type': 'm1.small', 'host': 'bf4d507aa00b3fcf643a7e0b883420632ac7fc96e8c7a756982e121d', 'instance_host': 'np0005559462.localdomain', 'flavor': {'id': '2da0e147-aaa7-4bb9-a176-5fe1b15a32a0', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 
'7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e'}, 'image_ref': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '8e2b1f16-d99a-11f0-817e-fa163ebaca0f', 'monotonic_time': 10894.359738235, 'message_signature': '8ae95ca7501c7599933466ab1e000209c188a6402d55dbbb36fd75bbe68278d5'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '1ba5fce347b64bfebf995f187193f205', 'user_name': None, 'project_id': 'c785bf23f53946bc99867d8832a50266', 'project_name': None, 'resource_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359-vdb', 'timestamp': '2025-12-15T09:43:48.186441', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359', 'instance_type': 'm1.small', 'host': 'bf4d507aa00b3fcf643a7e0b883420632ac7fc96e8c7a756982e121d', 'instance_host': 'np0005559462.localdomain', 'flavor': {'id': '2da0e147-aaa7-4bb9-a176-5fe1b15a32a0', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e'}, 'image_ref': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '8e2b34c4-d99a-11f0-817e-fa163ebaca0f', 'monotonic_time': 10894.359738235, 'message_signature': 'b2da180cd317f99a07a7cf46b735e6b5f94efcd652328ade6ee9927b7981078b'}]}, 'timestamp': '2025-12-15 09:43:48.187478', '_unique_id': '771b4371124f4847ae12ed11aab6f0eb'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.188 12 ERROR 
oslo_messaging.notify.messaging Traceback (most recent call last): Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.188 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.188 12 ERROR oslo_messaging.notify.messaging yield Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.188 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.188 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.188 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.188 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.188 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.188 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.188 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.188 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Dec 15 04:43:48 localhost 
ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.188 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.188 12 ERROR oslo_messaging.notify.messaging conn.connect() Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.188 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.188 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.188 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.188 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.188 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.188 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.188 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.188 12 ERROR oslo_messaging.notify.messaging Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.188 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 
2025-12-15 09:43:48.188 12 ERROR oslo_messaging.notify.messaging Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.188 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.188 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.188 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.188 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.188 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.188 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.188 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.188 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.188 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.188 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.188 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.188 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.188 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.188 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.188 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.188 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.188 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.188 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.188 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.188 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.188 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.188 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.188 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.188 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.188 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.188 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.188 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.188 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.188 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.188 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 15 04:43:48 localhost 
ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.188 12 ERROR oslo_messaging.notify.messaging Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.189 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no new resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193 Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.190 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.delta in the context of pollsters Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.190 12 DEBUG ceilometer.compute.pollsters [-] 39ff1bd9-6f6b-44c8-bbec-a1fd9d196359/network.incoming.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.191 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': '6ab263ea-8fbf-48c9-98d6-6727274bfcf9', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '1ba5fce347b64bfebf995f187193f205', 'user_name': None, 'project_id': 'c785bf23f53946bc99867d8832a50266', 'project_name': None, 'resource_id': 'instance-00000002-39ff1bd9-6f6b-44c8-bbec-a1fd9d196359-tap03ef8889-32', 'timestamp': '2025-12-15T09:43:48.190219', 'resource_metadata': {'display_name': 'test', 'name': 'tap03ef8889-32', 'instance_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359', 'instance_type': 'm1.small', 'host': 'bf4d507aa00b3fcf643a7e0b883420632ac7fc96e8c7a756982e121d', 'instance_host': 'np0005559462.localdomain', 'flavor': {'id': '2da0e147-aaa7-4bb9-a176-5fe1b15a32a0', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e'}, 'image_ref': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:74:df:7c', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap03ef8889-32'}, 'message_id': '8e2bb296-d99a-11f0-817e-fa163ebaca0f', 'monotonic_time': 10894.31155988, 'message_signature': 'fa023c9f310f5dd159869b509056ed2b8c5e484aa60b1130f626b6468b127ac6'}]}, 'timestamp': '2025-12-15 09:43:48.190724', '_unique_id': '2f02dd7b990d44daa0964958621b69c1'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.191 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 
2025-12-15 09:43:48.191 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.191 12 ERROR oslo_messaging.notify.messaging yield Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.191 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.191 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.191 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.191 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.191 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.191 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.191 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.191 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.191 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.191 12 ERROR oslo_messaging.notify.messaging conn.connect() Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.191 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.191 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.191 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.191 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.191 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.191 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.191 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.191 12 ERROR oslo_messaging.notify.messaging Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.191 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.191 12 ERROR oslo_messaging.notify.messaging Dec 15 04:43:48 localhost 
ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.191 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.191 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.191 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.191 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.191 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.191 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.191 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.191 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.191 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.191 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection 
Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.191 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.191 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.191 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.191 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.191 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.191 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.191 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.191 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.191 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.191 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Dec 15 04:43:48 
localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.191 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.191 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.191 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.191 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.191 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.191 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.191 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.191 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.191 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.191 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.191 12 ERROR oslo_messaging.notify.messaging Dec 15 04:43:48 localhost 
ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.193 12 INFO ceilometer.polling.manager [-] Polling pollster cpu in the context of pollsters Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.216 12 DEBUG ceilometer.compute.pollsters [-] 39ff1bd9-6f6b-44c8-bbec-a1fd9d196359/cpu volume: 58360000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.218 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'e42ddd1c-35f1-457d-9625-2c135281b3a8', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 58360000000, 'user_id': '1ba5fce347b64bfebf995f187193f205', 'user_name': None, 'project_id': 'c785bf23f53946bc99867d8832a50266', 'project_name': None, 'resource_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359', 'timestamp': '2025-12-15T09:43:48.193419', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359', 'instance_type': 'm1.small', 'host': 'bf4d507aa00b3fcf643a7e0b883420632ac7fc96e8c7a756982e121d', 'instance_host': 'np0005559462.localdomain', 'flavor': {'id': '2da0e147-aaa7-4bb9-a176-5fe1b15a32a0', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e'}, 'image_ref': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'cpu_number': 1}, 'message_id': '8e2facfc-d99a-11f0-817e-fa163ebaca0f', 'monotonic_time': 10894.408799876, 
'message_signature': 'af1d8a496e3949d53349482ee16a2fc0f2367c7f4f2ec64f67e68b319b9b8792'}]}, 'timestamp': '2025-12-15 09:43:48.216858', '_unique_id': 'ca5687455fab4896a27c0f572e0f92ff'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.218 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.218 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.218 12 ERROR oslo_messaging.notify.messaging yield Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.218 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.218 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.218 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.218 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.218 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.218 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.218 12 ERROR 
oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.218 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.218 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.218 12 ERROR oslo_messaging.notify.messaging conn.connect() Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.218 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.218 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.218 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.218 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.218 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.218 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.218 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Dec 15 04:43:48 localhost 
ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.218 12 ERROR oslo_messaging.notify.messaging Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.218 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.218 12 ERROR oslo_messaging.notify.messaging Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.218 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.218 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.218 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.218 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.218 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.218 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.218 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.218 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", 
line 653, in _send Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.218 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.218 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.218 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.218 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.218 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.218 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.218 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.218 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.218 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.218 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.218 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.218 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.218 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.218 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.218 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.218 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.218 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.218 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.218 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.218 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Dec 15 04:43:48 localhost 
ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.218 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.218 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.218 12 ERROR oslo_messaging.notify.messaging Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.219 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.requests in the context of pollsters Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.219 12 DEBUG ceilometer.compute.pollsters [-] 39ff1bd9-6f6b-44c8-bbec-a1fd9d196359/disk.device.read.requests volume: 1064 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.220 12 DEBUG ceilometer.compute.pollsters [-] 39ff1bd9-6f6b-44c8-bbec-a1fd9d196359/disk.device.read.requests volume: 222 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.222 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': 'b01edc30-0899-47d5-85e5-bf809d71c2c6', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1064, 'user_id': '1ba5fce347b64bfebf995f187193f205', 'user_name': None, 'project_id': 'c785bf23f53946bc99867d8832a50266', 'project_name': None, 'resource_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359-vda', 'timestamp': '2025-12-15T09:43:48.219748', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359', 'instance_type': 'm1.small', 'host': 'bf4d507aa00b3fcf643a7e0b883420632ac7fc96e8c7a756982e121d', 'instance_host': 'np0005559462.localdomain', 'flavor': {'id': '2da0e147-aaa7-4bb9-a176-5fe1b15a32a0', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e'}, 'image_ref': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '8e303582-d99a-11f0-817e-fa163ebaca0f', 'monotonic_time': 10894.319766699, 'message_signature': '8e476d3a9653dbce3c028e4df7b7f85779550c31e8e67084e304872cf6dc8ffb'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 222, 'user_id': '1ba5fce347b64bfebf995f187193f205', 'user_name': None, 'project_id': 'c785bf23f53946bc99867d8832a50266', 'project_name': None, 'resource_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359-vdb', 'timestamp': '2025-12-15T09:43:48.219748', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 
'39ff1bd9-6f6b-44c8-bbec-a1fd9d196359', 'instance_type': 'm1.small', 'host': 'bf4d507aa00b3fcf643a7e0b883420632ac7fc96e8c7a756982e121d', 'instance_host': 'np0005559462.localdomain', 'flavor': {'id': '2da0e147-aaa7-4bb9-a176-5fe1b15a32a0', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e'}, 'image_ref': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '8e304888-d99a-11f0-817e-fa163ebaca0f', 'monotonic_time': 10894.319766699, 'message_signature': '9954e1a2813dd98b6723784d3e9eacdf9c7e91ff8fa295601d3fa140d8a0a809'}]}, 'timestamp': '2025-12-15 09:43:48.220838', '_unique_id': 'd40b8da7798745af8ecc8096a5ffc76f'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.222 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.222 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.222 12 ERROR oslo_messaging.notify.messaging yield Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.222 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.222 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.222 12 ERROR 
oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.222 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.222 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.222 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.222 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.222 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.222 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.222 12 ERROR oslo_messaging.notify.messaging conn.connect() Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.222 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.222 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.222 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Dec 15 
04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.222 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.222 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.222 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.222 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.222 12 ERROR oslo_messaging.notify.messaging Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.222 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.222 12 ERROR oslo_messaging.notify.messaging Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.222 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.222 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.222 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.222 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Dec 15 04:43:48 localhost 
ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.222 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.222 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.222 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.222 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.222 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.222 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.222 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.222 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.222 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.222 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, 
in get Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.222 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.222 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.222 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.222 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.222 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.222 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.222 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.222 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.222 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.222 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 15 04:43:48 localhost 
ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.222 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.222 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.222 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.222 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.222 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.222 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.222 12 ERROR oslo_messaging.notify.messaging Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.223 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.bytes in the context of pollsters Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.224 12 DEBUG ceilometer.compute.pollsters [-] 39ff1bd9-6f6b-44c8-bbec-a1fd9d196359/disk.device.read.bytes volume: 29130240 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.224 12 DEBUG ceilometer.compute.pollsters [-] 39ff1bd9-6f6b-44c8-bbec-a1fd9d196359/disk.device.read.bytes volume: 4300800 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 15 04:43:48 localhost 
ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.226 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '48d47b85-684a-4a80-9403-ba7e9ff85ee4', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 29130240, 'user_id': '1ba5fce347b64bfebf995f187193f205', 'user_name': None, 'project_id': 'c785bf23f53946bc99867d8832a50266', 'project_name': None, 'resource_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359-vda', 'timestamp': '2025-12-15T09:43:48.224147', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359', 'instance_type': 'm1.small', 'host': 'bf4d507aa00b3fcf643a7e0b883420632ac7fc96e8c7a756982e121d', 'instance_host': 'np0005559462.localdomain', 'flavor': {'id': '2da0e147-aaa7-4bb9-a176-5fe1b15a32a0', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e'}, 'image_ref': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '8e30e37e-d99a-11f0-817e-fa163ebaca0f', 'monotonic_time': 10894.319766699, 'message_signature': '5ae3f0ac3934aa8219870caad02bb6a4174aaa9f9e4b41702fac47e63457e02c'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 4300800, 'user_id': '1ba5fce347b64bfebf995f187193f205', 'user_name': None, 'project_id': 'c785bf23f53946bc99867d8832a50266', 'project_name': None, 'resource_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359-vdb', 'timestamp': 
'2025-12-15T09:43:48.224147', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359', 'instance_type': 'm1.small', 'host': 'bf4d507aa00b3fcf643a7e0b883420632ac7fc96e8c7a756982e121d', 'instance_host': 'np0005559462.localdomain', 'flavor': {'id': '2da0e147-aaa7-4bb9-a176-5fe1b15a32a0', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e'}, 'image_ref': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '8e30fe04-d99a-11f0-817e-fa163ebaca0f', 'monotonic_time': 10894.319766699, 'message_signature': 'c9c287bacc697ba35bc1d532760c9c0e174b3401ecd78b2d6d76c2b57d1e7c0e'}]}, 'timestamp': '2025-12-15 09:43:48.225507', '_unique_id': '923cf37c495e4ea493a10367eba23c18'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.226 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.226 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.226 12 ERROR oslo_messaging.notify.messaging yield Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.226 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.226 12 ERROR oslo_messaging.notify.messaging return retry_over_time( 
Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.226 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.226 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.226 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.226 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.226 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.226 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.226 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.226 12 ERROR oslo_messaging.notify.messaging conn.connect() Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.226 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.226 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.226 12 ERROR 
oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.226 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.226 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.226 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.226 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.226 12 ERROR oslo_messaging.notify.messaging Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.226 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.226 12 ERROR oslo_messaging.notify.messaging Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.226 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.226 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.226 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.226 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.226 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.226 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.226 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.226 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.226 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.226 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.226 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.226 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.226 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.226 
12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.226 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.226 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.226 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.226 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.226 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.226 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.226 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.226 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.226 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.226 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.226 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.226 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.226 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.226 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.226 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.226 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.226 12 ERROR oslo_messaging.notify.messaging Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.228 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.requests in the context of pollsters Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.228 12 DEBUG ceilometer.compute.pollsters [-] 39ff1bd9-6f6b-44c8-bbec-a1fd9d196359/disk.device.write.requests volume: 497 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.229 12 DEBUG ceilometer.compute.pollsters [-] 39ff1bd9-6f6b-44c8-bbec-a1fd9d196359/disk.device.write.requests volume: 1 
_stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.231 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'bf5713c9-9f5d-48dc-954c-b2009dba3d18', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 497, 'user_id': '1ba5fce347b64bfebf995f187193f205', 'user_name': None, 'project_id': 'c785bf23f53946bc99867d8832a50266', 'project_name': None, 'resource_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359-vda', 'timestamp': '2025-12-15T09:43:48.228734', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359', 'instance_type': 'm1.small', 'host': 'bf4d507aa00b3fcf643a7e0b883420632ac7fc96e8c7a756982e121d', 'instance_host': 'np0005559462.localdomain', 'flavor': {'id': '2da0e147-aaa7-4bb9-a176-5fe1b15a32a0', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e'}, 'image_ref': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '8e3197ce-d99a-11f0-817e-fa163ebaca0f', 'monotonic_time': 10894.319766699, 'message_signature': '9bd390f94f30cbb974b7069d7caa98ea81c42df96195556c5f8d383cdaf177b5'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1, 'user_id': '1ba5fce347b64bfebf995f187193f205', 'user_name': None, 'project_id': 
'c785bf23f53946bc99867d8832a50266', 'project_name': None, 'resource_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359-vdb', 'timestamp': '2025-12-15T09:43:48.228734', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359', 'instance_type': 'm1.small', 'host': 'bf4d507aa00b3fcf643a7e0b883420632ac7fc96e8c7a756982e121d', 'instance_host': 'np0005559462.localdomain', 'flavor': {'id': '2da0e147-aaa7-4bb9-a176-5fe1b15a32a0', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e'}, 'image_ref': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '8e31b164-d99a-11f0-817e-fa163ebaca0f', 'monotonic_time': 10894.319766699, 'message_signature': 'a84e53c831cc7cf23ed5206a7d58e01517e4917f8744858ed528ce1b4ae29668'}]}, 'timestamp': '2025-12-15 09:43:48.230120', '_unique_id': 'e885ccb4c0c8478681b98a735844b023'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.231 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.231 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.231 12 ERROR oslo_messaging.notify.messaging yield Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.231 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 15 04:43:48 
localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.231 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.231 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.231 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.231 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.231 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.231 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.231 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.231 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.231 12 ERROR oslo_messaging.notify.messaging conn.connect() Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.231 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.231 12 ERROR oslo_messaging.notify.messaging 
self.transport.connect() Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.231 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.231 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.231 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.231 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.231 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.231 12 ERROR oslo_messaging.notify.messaging Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.231 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.231 12 ERROR oslo_messaging.notify.messaging Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.231 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.231 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.231 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Dec 15 04:43:48 localhost 
ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.231 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.231 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.231 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.231 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.231 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.231 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.231 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.231 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.231 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.231 12 ERROR oslo_messaging.notify.messaging self.connection = 
connection_pool.get(retry=retry) Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.231 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.231 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.231 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.231 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.231 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.231 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.231 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.231 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.231 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.231 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Dec 15 04:43:48 localhost 
ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.231 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.231 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.231 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.231 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.231 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.231 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.231 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.231 12 ERROR oslo_messaging.notify.messaging Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.233 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets in the context of pollsters Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.233 12 DEBUG ceilometer.compute.pollsters [-] 39ff1bd9-6f6b-44c8-bbec-a1fd9d196359/network.outgoing.packets volume: 129 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.235 12 ERROR 
oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'f7f4e6cb-818b-4b4c-9ea3-c04eb4879e18', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 129, 'user_id': '1ba5fce347b64bfebf995f187193f205', 'user_name': None, 'project_id': 'c785bf23f53946bc99867d8832a50266', 'project_name': None, 'resource_id': 'instance-00000002-39ff1bd9-6f6b-44c8-bbec-a1fd9d196359-tap03ef8889-32', 'timestamp': '2025-12-15T09:43:48.233382', 'resource_metadata': {'display_name': 'test', 'name': 'tap03ef8889-32', 'instance_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359', 'instance_type': 'm1.small', 'host': 'bf4d507aa00b3fcf643a7e0b883420632ac7fc96e8c7a756982e121d', 'instance_host': 'np0005559462.localdomain', 'flavor': {'id': '2da0e147-aaa7-4bb9-a176-5fe1b15a32a0', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e'}, 'image_ref': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:74:df:7c', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap03ef8889-32'}, 'message_id': '8e324c8c-d99a-11f0-817e-fa163ebaca0f', 'monotonic_time': 10894.31155988, 'message_signature': 'e8e520de8b6787ff5d42891b81ad95581727c9a805a9fb76a530cd451860e7ff'}]}, 'timestamp': '2025-12-15 09:43:48.234135', '_unique_id': 'b89ac278adfb40f79dcd5c8749c66c69'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.235 12 ERROR oslo_messaging.notify.messaging 
Traceback (most recent call last): Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.235 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.235 12 ERROR oslo_messaging.notify.messaging yield Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.235 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.235 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.235 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.235 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.235 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.235 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.235 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.235 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.235 
12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.235 12 ERROR oslo_messaging.notify.messaging conn.connect() Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.235 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.235 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.235 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.235 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.235 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.235 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.235 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.235 12 ERROR oslo_messaging.notify.messaging Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.235 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.235 12 ERROR 
oslo_messaging.notify.messaging Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.235 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.235 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.235 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.235 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.235 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.235 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.235 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.235 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.235 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.235 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.235 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.235 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.235 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.235 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.235 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.235 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.235 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.235 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.235 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.235 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.235 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.235 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.235 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.235 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.235 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.235 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.235 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.235 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.235 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.235 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 15 04:43:48 localhost 
ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.235 12 ERROR oslo_messaging.notify.messaging Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.237 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.allocation in the context of pollsters Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.237 12 DEBUG ceilometer.compute.pollsters [-] 39ff1bd9-6f6b-44c8-bbec-a1fd9d196359/disk.device.allocation volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.238 12 DEBUG ceilometer.compute.pollsters [-] 39ff1bd9-6f6b-44c8-bbec-a1fd9d196359/disk.device.allocation volume: 512 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.240 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': 'e7be8509-653a-42c4-9549-3ec2cf7565ec', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '1ba5fce347b64bfebf995f187193f205', 'user_name': None, 'project_id': 'c785bf23f53946bc99867d8832a50266', 'project_name': None, 'resource_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359-vda', 'timestamp': '2025-12-15T09:43:48.237331', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359', 'instance_type': 'm1.small', 'host': 'bf4d507aa00b3fcf643a7e0b883420632ac7fc96e8c7a756982e121d', 'instance_host': 'np0005559462.localdomain', 'flavor': {'id': '2da0e147-aaa7-4bb9-a176-5fe1b15a32a0', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e'}, 'image_ref': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '8e32e692-d99a-11f0-817e-fa163ebaca0f', 'monotonic_time': 10894.359738235, 'message_signature': '05942a3ae7be47b0ca202c8d62fcc7329299f8ee530200da09357020dd860b72'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 512, 'user_id': '1ba5fce347b64bfebf995f187193f205', 'user_name': None, 'project_id': 'c785bf23f53946bc99867d8832a50266', 'project_name': None, 'resource_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359-vdb', 'timestamp': '2025-12-15T09:43:48.237331', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359', 
'instance_type': 'm1.small', 'host': 'bf4d507aa00b3fcf643a7e0b883420632ac7fc96e8c7a756982e121d', 'instance_host': 'np0005559462.localdomain', 'flavor': {'id': '2da0e147-aaa7-4bb9-a176-5fe1b15a32a0', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e'}, 'image_ref': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '8e330078-d99a-11f0-817e-fa163ebaca0f', 'monotonic_time': 10894.359738235, 'message_signature': 'e1ad41f376aeced4340429df89fdc460e5b9011e08bcbb967b27195bd743a9d8'}]}, 'timestamp': '2025-12-15 09:43:48.238668', '_unique_id': '98eba6237f8b4357a8c7a619c3c95884'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.240 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.240 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.240 12 ERROR oslo_messaging.notify.messaging yield Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.240 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.240 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.240 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.240 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.240 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.240 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.240 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.240 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.240 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.240 12 ERROR oslo_messaging.notify.messaging conn.connect() Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.240 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.240 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.240 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Dec 15 04:43:48 localhost 
ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.240 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.240 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.240 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.240 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.240 12 ERROR oslo_messaging.notify.messaging Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.240 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.240 12 ERROR oslo_messaging.notify.messaging Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.240 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.240 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.240 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.240 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 
09:43:48.240 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.240 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.240 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.240 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.240 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.240 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.240 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.240 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.240 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.240 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Dec 15 04:43:48 localhost 
ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.240 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.240 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.240 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.240 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.240 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.240 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.240 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.240 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.240 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.240 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 
09:43:48.240 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.240 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.240 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.240 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.240 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.240 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.240 12 ERROR oslo_messaging.notify.messaging Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.241 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.drop in the context of pollsters Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.242 12 DEBUG ceilometer.compute.pollsters [-] 39ff1bd9-6f6b-44c8-bbec-a1fd9d196359/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.244 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': 'f78ce27a-8b32-457b-a13e-cc5b3ab3c1c9', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '1ba5fce347b64bfebf995f187193f205', 'user_name': None, 'project_id': 'c785bf23f53946bc99867d8832a50266', 'project_name': None, 'resource_id': 'instance-00000002-39ff1bd9-6f6b-44c8-bbec-a1fd9d196359-tap03ef8889-32', 'timestamp': '2025-12-15T09:43:48.241939', 'resource_metadata': {'display_name': 'test', 'name': 'tap03ef8889-32', 'instance_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359', 'instance_type': 'm1.small', 'host': 'bf4d507aa00b3fcf643a7e0b883420632ac7fc96e8c7a756982e121d', 'instance_host': 'np0005559462.localdomain', 'flavor': {'id': '2da0e147-aaa7-4bb9-a176-5fe1b15a32a0', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e'}, 'image_ref': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:74:df:7c', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap03ef8889-32'}, 'message_id': '8e339c22-d99a-11f0-817e-fa163ebaca0f', 'monotonic_time': 10894.31155988, 'message_signature': '50a52fc061f1155983ac682d61567cf2735bb625c10b3188fe550659ff3231c7'}]}, 'timestamp': '2025-12-15 09:43:48.242685', '_unique_id': 'efc6d40738d041fe8bc64aa836193803'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.244 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 15 04:43:48 localhost 
ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.244 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.244 12 ERROR oslo_messaging.notify.messaging yield Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.244 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.244 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.244 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.244 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.244 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.244 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.244 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.244 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.244 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.244 12 ERROR oslo_messaging.notify.messaging conn.connect() Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.244 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.244 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.244 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.244 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.244 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.244 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.244 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.244 12 ERROR oslo_messaging.notify.messaging Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.244 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.244 12 ERROR oslo_messaging.notify.messaging Dec 15 04:43:48 localhost 
ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.244 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.244 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.244 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.244 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.244 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.244 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.244 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.244 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.244 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.244 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection 
Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.244 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.244 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.244 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.244 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.244 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.244 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.244 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.244 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.244 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.244 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Dec 15 04:43:48 
localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.244 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.244 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.244 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.244 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.244 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.244 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.244 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.244 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.244 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.244 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.244 12 ERROR oslo_messaging.notify.messaging Dec 15 04:43:48 localhost 
ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.245 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no new resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193 Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.245 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.delta in the context of pollsters Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.246 12 DEBUG ceilometer.compute.pollsters [-] 39ff1bd9-6f6b-44c8-bbec-a1fd9d196359/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.248 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '47fde64b-e9db-4b08-9921-d45bc59e933b', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '1ba5fce347b64bfebf995f187193f205', 'user_name': None, 'project_id': 'c785bf23f53946bc99867d8832a50266', 'project_name': None, 'resource_id': 'instance-00000002-39ff1bd9-6f6b-44c8-bbec-a1fd9d196359-tap03ef8889-32', 'timestamp': '2025-12-15T09:43:48.246162', 'resource_metadata': {'display_name': 'test', 'name': 'tap03ef8889-32', 'instance_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359', 'instance_type': 'm1.small', 'host': 'bf4d507aa00b3fcf643a7e0b883420632ac7fc96e8c7a756982e121d', 'instance_host': 'np0005559462.localdomain', 'flavor': {'id': '2da0e147-aaa7-4bb9-a176-5fe1b15a32a0', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 
'7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e'}, 'image_ref': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:74:df:7c', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap03ef8889-32'}, 'message_id': '8e343f60-d99a-11f0-817e-fa163ebaca0f', 'monotonic_time': 10894.31155988, 'message_signature': '463719ddbee216575b14d4e37c9b2a30ba903b3be2319ac771b87c406157d27e'}]}, 'timestamp': '2025-12-15 09:43:48.246867', '_unique_id': 'b7876b10660c464c9de52228be0e7f46'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.248 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.248 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.248 12 ERROR oslo_messaging.notify.messaging yield Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.248 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.248 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.248 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.248 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Dec 15 04:43:48 localhost 
ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.248 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.248 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.248 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.248 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.248 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.248 12 ERROR oslo_messaging.notify.messaging conn.connect() Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.248 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.248 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.248 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.248 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.248 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.248 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.248 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.248 12 ERROR oslo_messaging.notify.messaging Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.248 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.248 12 ERROR oslo_messaging.notify.messaging Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.248 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.248 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.248 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.248 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.248 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.248 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.248 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.248 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.248 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.248 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.248 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.248 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.248 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.248 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.248 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.248 12 ERROR oslo_messaging.notify.messaging 
File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.248 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.248 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.248 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.248 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.248 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.248 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.248 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.248 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.248 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.248 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in 
__exit__ Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.248 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.248 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.248 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.248 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.248 12 ERROR oslo_messaging.notify.messaging Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.249 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets in the context of pollsters Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.249 12 DEBUG ceilometer.compute.pollsters [-] 39ff1bd9-6f6b-44c8-bbec-a1fd9d196359/network.incoming.packets volume: 87 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.251 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': 'b824a6bb-454d-4044-ae7c-f0b4b59ca5da', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 87, 'user_id': '1ba5fce347b64bfebf995f187193f205', 'user_name': None, 'project_id': 'c785bf23f53946bc99867d8832a50266', 'project_name': None, 'resource_id': 'instance-00000002-39ff1bd9-6f6b-44c8-bbec-a1fd9d196359-tap03ef8889-32', 'timestamp': '2025-12-15T09:43:48.249837', 'resource_metadata': {'display_name': 'test', 'name': 'tap03ef8889-32', 'instance_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359', 'instance_type': 'm1.small', 'host': 'bf4d507aa00b3fcf643a7e0b883420632ac7fc96e8c7a756982e121d', 'instance_host': 'np0005559462.localdomain', 'flavor': {'id': '2da0e147-aaa7-4bb9-a176-5fe1b15a32a0', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e'}, 'image_ref': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:74:df:7c', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap03ef8889-32'}, 'message_id': '8e34ca48-d99a-11f0-817e-fa163ebaca0f', 'monotonic_time': 10894.31155988, 'message_signature': '68fc0fa4a80ed3261bed361fac1ab528f348501da74560432dc2b10361214ee9'}]}, 'timestamp': '2025-12-15 09:43:48.250287', '_unique_id': 'c472dbdad1b8455bb59b54e880b15f78'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.251 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 15 04:43:48 localhost 
ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.251 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.251 12 ERROR oslo_messaging.notify.messaging yield Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.251 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.251 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.251 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.251 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.251 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.251 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.251 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.251 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.251 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.251 12 ERROR oslo_messaging.notify.messaging conn.connect() Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.251 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.251 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.251 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.251 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.251 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.251 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.251 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.251 12 ERROR oslo_messaging.notify.messaging Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.251 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.251 12 ERROR oslo_messaging.notify.messaging Dec 15 04:43:48 localhost 
ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.251 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.251 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.251 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.251 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.251 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.251 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.251 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.251 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.251 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.251 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection 
Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.251 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.251 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.251 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.251 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.251 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.251 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.251 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.251 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.251 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.251 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Dec 15 04:43:48 
localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.251 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.251 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.251 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.251 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.251 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.251 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.251 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.251 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.251 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.251 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.251 12 ERROR oslo_messaging.notify.messaging Dec 15 04:43:48 localhost 
ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.252 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no new resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193 Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.252 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes in the context of pollsters Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.252 12 DEBUG ceilometer.compute.pollsters [-] 39ff1bd9-6f6b-44c8-bbec-a1fd9d196359/network.outgoing.bytes volume: 11272 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.253 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'aecc434e-eaf8-4734-92e7-a1513c768a2e', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 11272, 'user_id': '1ba5fce347b64bfebf995f187193f205', 'user_name': None, 'project_id': 'c785bf23f53946bc99867d8832a50266', 'project_name': None, 'resource_id': 'instance-00000002-39ff1bd9-6f6b-44c8-bbec-a1fd9d196359-tap03ef8889-32', 'timestamp': '2025-12-15T09:43:48.252362', 'resource_metadata': {'display_name': 'test', 'name': 'tap03ef8889-32', 'instance_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359', 'instance_type': 'm1.small', 'host': 'bf4d507aa00b3fcf643a7e0b883420632ac7fc96e8c7a756982e121d', 'instance_host': 'np0005559462.localdomain', 'flavor': {'id': '2da0e147-aaa7-4bb9-a176-5fe1b15a32a0', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 
'7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e'}, 'image_ref': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:74:df:7c', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap03ef8889-32'}, 'message_id': '8e352c68-d99a-11f0-817e-fa163ebaca0f', 'monotonic_time': 10894.31155988, 'message_signature': 'd0377263a9bc948ceb7b7c0daf71eb3b1d4d26b55e478ff4558bce6a49abe403'}]}, 'timestamp': '2025-12-15 09:43:48.252799', '_unique_id': 'd714f8b119124bd9aa61f2a1c487b078'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.253 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.253 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.253 12 ERROR oslo_messaging.notify.messaging yield Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.253 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.253 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.253 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.253 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Dec 15 04:43:48 localhost 
ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.253 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.253 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.253 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.253 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.253 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.253 12 ERROR oslo_messaging.notify.messaging conn.connect() Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.253 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.253 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.253 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.253 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.253 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.253 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.253 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.253 12 ERROR oslo_messaging.notify.messaging Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.253 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.253 12 ERROR oslo_messaging.notify.messaging Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.253 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.253 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.253 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.253 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.253 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.253 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.253 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.253 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.253 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.253 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.253 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.253 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.253 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.253 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.253 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.253 12 ERROR oslo_messaging.notify.messaging 
File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.253 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.253 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.253 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.253 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.253 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.253 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.253 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.253 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.253 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.253 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in 
__exit__ Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.253 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.253 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.253 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.253 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.253 12 ERROR oslo_messaging.notify.messaging Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.254 12 INFO ceilometer.polling.manager [-] Polling pollster memory.usage in the context of pollsters Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.254 12 DEBUG ceilometer.compute.pollsters [-] 39ff1bd9-6f6b-44c8-bbec-a1fd9d196359/memory.usage volume: 52.3125 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.255 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': 'a30c4da7-a382-4b50-8d0e-230e7bd3f125', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'memory.usage', 'counter_type': 'gauge', 'counter_unit': 'MB', 'counter_volume': 52.3125, 'user_id': '1ba5fce347b64bfebf995f187193f205', 'user_name': None, 'project_id': 'c785bf23f53946bc99867d8832a50266', 'project_name': None, 'resource_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359', 'timestamp': '2025-12-15T09:43:48.254764', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359', 'instance_type': 'm1.small', 'host': 'bf4d507aa00b3fcf643a7e0b883420632ac7fc96e8c7a756982e121d', 'instance_host': 'np0005559462.localdomain', 'flavor': {'id': '2da0e147-aaa7-4bb9-a176-5fe1b15a32a0', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e'}, 'image_ref': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0}, 'message_id': '8e3589d8-d99a-11f0-817e-fa163ebaca0f', 'monotonic_time': 10894.408799876, 'message_signature': '40187e9941e375e937bad215626b9aaf634db2b3ad4f927c8d28e97ae8122855'}]}, 'timestamp': '2025-12-15 09:43:48.255198', '_unique_id': '2a18f2e41b79464491b3aa4c7eb31b65'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.255 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.255 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors 
Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.255 12 ERROR oslo_messaging.notify.messaging yield Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.255 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.255 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.255 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.255 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.255 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.255 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.255 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.255 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.255 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.255 12 ERROR 
oslo_messaging.notify.messaging conn.connect() Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.255 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.255 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.255 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.255 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.255 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.255 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.255 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.255 12 ERROR oslo_messaging.notify.messaging Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.255 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.255 12 ERROR oslo_messaging.notify.messaging Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.255 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 
09:43:48.255 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.255 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.255 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.255 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.255 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.255 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.255 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.255 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.255 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.255 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Dec 15 04:43:48 
localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.255 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.255 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.255 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.255 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.255 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.255 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.255 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.255 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.255 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.255 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 
2025-12-15 09:43:48.255 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.255 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.255 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.255 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.255 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.255 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.255 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.255 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.255 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.255 12 ERROR oslo_messaging.notify.messaging Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.256 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.drop in the context of pollsters Dec 15 04:43:48 localhost 
ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.257 12 DEBUG ceilometer.compute.pollsters [-] 39ff1bd9-6f6b-44c8-bbec-a1fd9d196359/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.258 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'ad3028d4-029a-496e-9784-d793d91931bb', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '1ba5fce347b64bfebf995f187193f205', 'user_name': None, 'project_id': 'c785bf23f53946bc99867d8832a50266', 'project_name': None, 'resource_id': 'instance-00000002-39ff1bd9-6f6b-44c8-bbec-a1fd9d196359-tap03ef8889-32', 'timestamp': '2025-12-15T09:43:48.257012', 'resource_metadata': {'display_name': 'test', 'name': 'tap03ef8889-32', 'instance_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359', 'instance_type': 'm1.small', 'host': 'bf4d507aa00b3fcf643a7e0b883420632ac7fc96e8c7a756982e121d', 'instance_host': 'np0005559462.localdomain', 'flavor': {'id': '2da0e147-aaa7-4bb9-a176-5fe1b15a32a0', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e'}, 'image_ref': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:74:df:7c', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap03ef8889-32'}, 'message_id': '8e35e31a-d99a-11f0-817e-fa163ebaca0f', 'monotonic_time': 10894.31155988, 
'message_signature': '2f5e26a914b0cf9518e20b9bf0dc0a2bdca209ecee871a50b2ec44af1384ae86'}]}, 'timestamp': '2025-12-15 09:43:48.257475', '_unique_id': '4772b83aa3974357a97cdf5cce8505a2'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.258 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.258 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.258 12 ERROR oslo_messaging.notify.messaging yield Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.258 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.258 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.258 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.258 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.258 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.258 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.258 12 ERROR 
oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.258 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.258 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.258 12 ERROR oslo_messaging.notify.messaging conn.connect() Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.258 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.258 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.258 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.258 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.258 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.258 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.258 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Dec 15 04:43:48 localhost 
ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.258 12 ERROR oslo_messaging.notify.messaging Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.258 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.258 12 ERROR oslo_messaging.notify.messaging Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.258 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.258 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.258 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.258 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.258 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.258 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.258 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.258 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", 
line 653, in _send Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.258 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.258 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.258 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.258 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.258 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.258 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.258 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.258 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.258 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.258 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.258 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.258 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.258 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.258 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.258 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.258 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.258 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.258 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.258 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.258 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Dec 15 04:43:48 localhost 
ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.258 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.258 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.258 12 ERROR oslo_messaging.notify.messaging Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.259 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.latency in the context of pollsters Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.259 12 DEBUG ceilometer.compute.pollsters [-] 39ff1bd9-6f6b-44c8-bbec-a1fd9d196359/disk.device.write.latency volume: 213002426 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.259 12 DEBUG ceilometer.compute.pollsters [-] 39ff1bd9-6f6b-44c8-bbec-a1fd9d196359/disk.device.write.latency volume: 24733520 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.261 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': 'df4477c2-6230-483e-9d42-ff2bf804b881', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 213002426, 'user_id': '1ba5fce347b64bfebf995f187193f205', 'user_name': None, 'project_id': 'c785bf23f53946bc99867d8832a50266', 'project_name': None, 'resource_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359-vda', 'timestamp': '2025-12-15T09:43:48.259419', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359', 'instance_type': 'm1.small', 'host': 'bf4d507aa00b3fcf643a7e0b883420632ac7fc96e8c7a756982e121d', 'instance_host': 'np0005559462.localdomain', 'flavor': {'id': '2da0e147-aaa7-4bb9-a176-5fe1b15a32a0', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e'}, 'image_ref': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '8e363fae-d99a-11f0-817e-fa163ebaca0f', 'monotonic_time': 10894.319766699, 'message_signature': '03ddc244191c62a20d7115d94a192aeab4efa51c81212d4068c9da7fdbdaf882'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 24733520, 'user_id': '1ba5fce347b64bfebf995f187193f205', 'user_name': None, 'project_id': 'c785bf23f53946bc99867d8832a50266', 'project_name': None, 'resource_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359-vdb', 'timestamp': '2025-12-15T09:43:48.259419', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 
'39ff1bd9-6f6b-44c8-bbec-a1fd9d196359', 'instance_type': 'm1.small', 'host': 'bf4d507aa00b3fcf643a7e0b883420632ac7fc96e8c7a756982e121d', 'instance_host': 'np0005559462.localdomain', 'flavor': {'id': '2da0e147-aaa7-4bb9-a176-5fe1b15a32a0', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e'}, 'image_ref': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '8e364fd0-d99a-11f0-817e-fa163ebaca0f', 'monotonic_time': 10894.319766699, 'message_signature': '2ca9b015e0bb749d7e898d8450152f00f6b9092519886ffb7d5597167d89c01e'}]}, 'timestamp': '2025-12-15 09:43:48.260243', '_unique_id': '0a760c0eea0d4a71b44c2f1c3e561203'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.261 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.261 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.261 12 ERROR oslo_messaging.notify.messaging yield Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.261 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.261 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.261 12 ERROR 
oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.261 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.261 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.261 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.261 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.261 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.261 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.261 12 ERROR oslo_messaging.notify.messaging conn.connect() Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.261 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.261 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.261 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Dec 15 
04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.261 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.261 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.261 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.261 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.261 12 ERROR oslo_messaging.notify.messaging Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.261 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.261 12 ERROR oslo_messaging.notify.messaging Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.261 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.261 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.261 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.261 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Dec 15 04:43:48 localhost 
ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.261 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.261 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.261 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.261 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.261 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.261 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.261 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.261 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.261 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.261 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, 
in get Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.261 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.261 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.261 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.261 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.261 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.261 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.261 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.261 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.261 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.261 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 15 04:43:48 localhost 
ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.261 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.261 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.261 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.261 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.261 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.261 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.261 12 ERROR oslo_messaging.notify.messaging Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.262 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no new resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193 Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.262 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.bytes in the context of pollsters Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.262 12 DEBUG ceilometer.compute.pollsters [-] 39ff1bd9-6f6b-44c8-bbec-a1fd9d196359/disk.device.write.bytes volume: 73912320 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 15 04:43:48 localhost 
ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.262 12 DEBUG ceilometer.compute.pollsters [-] 39ff1bd9-6f6b-44c8-bbec-a1fd9d196359/disk.device.write.bytes volume: 512 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.264 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'f252fded-fafe-4210-b8ad-88aa378fce65', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 73912320, 'user_id': '1ba5fce347b64bfebf995f187193f205', 'user_name': None, 'project_id': 'c785bf23f53946bc99867d8832a50266', 'project_name': None, 'resource_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359-vda', 'timestamp': '2025-12-15T09:43:48.262392', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359', 'instance_type': 'm1.small', 'host': 'bf4d507aa00b3fcf643a7e0b883420632ac7fc96e8c7a756982e121d', 'instance_host': 'np0005559462.localdomain', 'flavor': {'id': '2da0e147-aaa7-4bb9-a176-5fe1b15a32a0', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e'}, 'image_ref': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '8e36b394-d99a-11f0-817e-fa163ebaca0f', 'monotonic_time': 10894.319766699, 'message_signature': '55171ea4f31ce559f0360c36f3956152d4dc688917a9fc1d683795c625196f93'}, {'source': 'openstack', 'counter_name': 
'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 512, 'user_id': '1ba5fce347b64bfebf995f187193f205', 'user_name': None, 'project_id': 'c785bf23f53946bc99867d8832a50266', 'project_name': None, 'resource_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359-vdb', 'timestamp': '2025-12-15T09:43:48.262392', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359', 'instance_type': 'm1.small', 'host': 'bf4d507aa00b3fcf643a7e0b883420632ac7fc96e8c7a756982e121d', 'instance_host': 'np0005559462.localdomain', 'flavor': {'id': '2da0e147-aaa7-4bb9-a176-5fe1b15a32a0', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e'}, 'image_ref': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '8e36c24e-d99a-11f0-817e-fa163ebaca0f', 'monotonic_time': 10894.319766699, 'message_signature': '8194a5994c8de32abfd6e3336cbf0a995a335e4889e894cde1e33d6f4f640cc7'}]}, 'timestamp': '2025-12-15 09:43:48.263190', '_unique_id': '285bdcadea834251b58ca55ffd011e07'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.264 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.264 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.264 12 ERROR oslo_messaging.notify.messaging yield Dec 15 04:43:48 localhost 
ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.264 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.264 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.264 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.264 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.264 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.264 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.264 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.264 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.264 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.264 12 ERROR oslo_messaging.notify.messaging conn.connect() Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.264 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.264 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.264 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.264 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.264 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.264 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.264 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.264 12 ERROR oslo_messaging.notify.messaging Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.264 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.264 12 ERROR oslo_messaging.notify.messaging Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.264 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.264 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Dec 15 04:43:48 localhost 
ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.264 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.264 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.264 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.264 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.264 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.264 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.264 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.264 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.264 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.264 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.264 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.264 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.264 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.264 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.264 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.264 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.264 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.264 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.264 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.264 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.264 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.264 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.264 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.264 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.264 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.264 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.264 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.264 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.264 12 ERROR oslo_messaging.notify.messaging Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.265 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.error in the context of pollsters Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.265 12 DEBUG 
ceilometer.compute.pollsters [-] 39ff1bd9-6f6b-44c8-bbec-a1fd9d196359/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.266 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '4c3e4abd-ca2a-49a2-97b2-d353ba7199e6', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '1ba5fce347b64bfebf995f187193f205', 'user_name': None, 'project_id': 'c785bf23f53946bc99867d8832a50266', 'project_name': None, 'resource_id': 'instance-00000002-39ff1bd9-6f6b-44c8-bbec-a1fd9d196359-tap03ef8889-32', 'timestamp': '2025-12-15T09:43:48.265183', 'resource_metadata': {'display_name': 'test', 'name': 'tap03ef8889-32', 'instance_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359', 'instance_type': 'm1.small', 'host': 'bf4d507aa00b3fcf643a7e0b883420632ac7fc96e8c7a756982e121d', 'instance_host': 'np0005559462.localdomain', 'flavor': {'id': '2da0e147-aaa7-4bb9-a176-5fe1b15a32a0', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e'}, 'image_ref': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:74:df:7c', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap03ef8889-32'}, 'message_id': '8e37211c-d99a-11f0-817e-fa163ebaca0f', 'monotonic_time': 10894.31155988, 'message_signature': 
'7936a21201641e73700b11cea58b6042a1513955f4237d61b2c6308ada5c10bb'}]}, 'timestamp': '2025-12-15 09:43:48.265616', '_unique_id': 'c3973162e7474094aa954ace8181ec68'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.266 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.266 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.266 12 ERROR oslo_messaging.notify.messaging yield Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.266 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.266 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.266 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.266 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.266 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.266 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.266 12 ERROR 
oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.266 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.266 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.266 12 ERROR oslo_messaging.notify.messaging conn.connect() Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.266 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.266 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.266 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.266 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.266 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.266 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.266 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Dec 15 04:43:48 localhost 
ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.266 12 ERROR oslo_messaging.notify.messaging Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.266 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.266 12 ERROR oslo_messaging.notify.messaging Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.266 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.266 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.266 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.266 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.266 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.266 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.266 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.266 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", 
line 653, in _send Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.266 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.266 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.266 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.266 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.266 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.266 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.266 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.266 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.266 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.266 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.266 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.266 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.266 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.266 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.266 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.266 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.266 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.266 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.266 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.266 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Dec 15 04:43:48 localhost 
ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.266 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.266 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 15 04:43:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:43:48.266 12 ERROR oslo_messaging.notify.messaging Dec 15 04:43:48 localhost nova_compute[231752]: 2025-12-15 09:43:48.489 231756 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Dec 15 04:43:48 localhost nova_compute[231752]: 2025-12-15 09:43:48.491 231756 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Dec 15 04:43:48 localhost nova_compute[231752]: 2025-12-15 09:43:48.491 231756 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m Dec 15 04:43:48 localhost nova_compute[231752]: 2025-12-15 09:43:48.492 231756 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Dec 15 04:43:48 localhost python3.9[277926]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_migration_target.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Dec 15 04:43:48 localhost nova_compute[231752]: 2025-12-15 09:43:48.527 231756 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup 
/usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 04:43:48 localhost nova_compute[231752]: 2025-12-15 09:43:48.527 231756 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Dec 15 04:43:49 localhost python3.9[278036]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_api_cron.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Dec 15 04:43:49 localhost python3.9[278146]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_api.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Dec 15 04:43:49 localhost nova_compute[231752]: 2025-12-15 09:43:49.951 231756 DEBUG oslo_service.periodic_task [None req-c7fd4a3f-c95b-4936-a071-10d43fe8d19c - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Dec 15 04:43:49 localhost nova_compute[231752]: 2025-12-15 09:43:49.951 231756 DEBUG oslo_service.periodic_task [None req-c7fd4a3f-c95b-4936-a071-10d43fe8d19c - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Dec 15 04:43:50 localhost python3.9[278256]: ansible-ansible.builtin.file Invoked with 
path=/usr/lib/systemd/system/tripleo_nova_conductor.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Dec 15 04:43:50 localhost python3.9[278366]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_metadata.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Dec 15 04:43:50 localhost nova_compute[231752]: 2025-12-15 09:43:50.952 231756 DEBUG oslo_service.periodic_task [None req-c7fd4a3f-c95b-4936-a071-10d43fe8d19c - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Dec 15 04:43:50 localhost nova_compute[231752]: 2025-12-15 09:43:50.953 231756 DEBUG nova.compute.manager [None req-c7fd4a3f-c95b-4936-a071-10d43fe8d19c - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m Dec 15 04:43:50 localhost nova_compute[231752]: 2025-12-15 09:43:50.953 231756 DEBUG nova.compute.manager [None req-c7fd4a3f-c95b-4936-a071-10d43fe8d19c - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m Dec 15 04:43:51 localhost ovn_metadata_agent[160585]: 2025-12-15 09:43:51.448 160590 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by 
"neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Dec 15 04:43:51 localhost ovn_metadata_agent[160585]: 2025-12-15 09:43:51.449 160590 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Dec 15 04:43:51 localhost ovn_metadata_agent[160585]: 2025-12-15 09:43:51.451 160590 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Dec 15 04:43:51 localhost python3.9[278476]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_scheduler.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Dec 15 04:43:51 localhost nova_compute[231752]: 2025-12-15 09:43:51.586 231756 DEBUG oslo_concurrency.lockutils [None req-c7fd4a3f-c95b-4936-a071-10d43fe8d19c - - - - - -] Acquiring lock "refresh_cache-39ff1bd9-6f6b-44c8-bbec-a1fd9d196359" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m Dec 15 04:43:51 localhost nova_compute[231752]: 2025-12-15 09:43:51.587 231756 DEBUG oslo_concurrency.lockutils [None req-c7fd4a3f-c95b-4936-a071-10d43fe8d19c - - - - - -] Acquired lock "refresh_cache-39ff1bd9-6f6b-44c8-bbec-a1fd9d196359" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m Dec 15 04:43:51 localhost nova_compute[231752]: 
2025-12-15 09:43:51.587 231756 DEBUG nova.network.neutron [None req-c7fd4a3f-c95b-4936-a071-10d43fe8d19c - - - - - -] [instance: 39ff1bd9-6f6b-44c8-bbec-a1fd9d196359] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m Dec 15 04:43:51 localhost nova_compute[231752]: 2025-12-15 09:43:51.587 231756 DEBUG nova.objects.instance [None req-c7fd4a3f-c95b-4936-a071-10d43fe8d19c - - - - - -] Lazy-loading 'info_cache' on Instance uuid 39ff1bd9-6f6b-44c8-bbec-a1fd9d196359 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m Dec 15 04:43:51 localhost nova_compute[231752]: 2025-12-15 09:43:51.992 231756 DEBUG nova.network.neutron [None req-c7fd4a3f-c95b-4936-a071-10d43fe8d19c - - - - - -] [instance: 39ff1bd9-6f6b-44c8-bbec-a1fd9d196359] Updating instance_info_cache with network_info: [{"id": "03ef8889-3216-43fb-8a52-4be17a956ce1", "address": "fa:16:3e:74:df:7c", "network": {"id": "befb7a72-17a9-4bcb-b561-84b8f626685a", "bridge": "br-int", "label": "private", "subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.201", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.20", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.0.3"}}], "meta": {"injected": false, "tenant_id": "c785bf23f53946bc99867d8832a50266", "mtu": 1292, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap03ef8889-32", "ovs_interfaceid": "03ef8889-3216-43fb-8a52-4be17a956ce1", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": 
{}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m Dec 15 04:43:52 localhost nova_compute[231752]: 2025-12-15 09:43:52.009 231756 DEBUG oslo_concurrency.lockutils [None req-c7fd4a3f-c95b-4936-a071-10d43fe8d19c - - - - - -] Releasing lock "refresh_cache-39ff1bd9-6f6b-44c8-bbec-a1fd9d196359" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m Dec 15 04:43:52 localhost nova_compute[231752]: 2025-12-15 09:43:52.009 231756 DEBUG nova.compute.manager [None req-c7fd4a3f-c95b-4936-a071-10d43fe8d19c - - - - - -] [instance: 39ff1bd9-6f6b-44c8-bbec-a1fd9d196359] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m Dec 15 04:43:52 localhost nova_compute[231752]: 2025-12-15 09:43:52.009 231756 DEBUG oslo_service.periodic_task [None req-c7fd4a3f-c95b-4936-a071-10d43fe8d19c - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Dec 15 04:43:52 localhost nova_compute[231752]: 2025-12-15 09:43:52.010 231756 DEBUG oslo_service.periodic_task [None req-c7fd4a3f-c95b-4936-a071-10d43fe8d19c - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Dec 15 04:43:52 localhost nova_compute[231752]: 2025-12-15 09:43:52.010 231756 DEBUG nova.compute.manager [None req-c7fd4a3f-c95b-4936-a071-10d43fe8d19c - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... 
_reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m Dec 15 04:43:52 localhost nova_compute[231752]: 2025-12-15 09:43:52.010 231756 DEBUG oslo_service.periodic_task [None req-c7fd4a3f-c95b-4936-a071-10d43fe8d19c - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Dec 15 04:43:52 localhost nova_compute[231752]: 2025-12-15 09:43:52.025 231756 DEBUG oslo_concurrency.lockutils [None req-c7fd4a3f-c95b-4936-a071-10d43fe8d19c - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Dec 15 04:43:52 localhost nova_compute[231752]: 2025-12-15 09:43:52.026 231756 DEBUG oslo_concurrency.lockutils [None req-c7fd4a3f-c95b-4936-a071-10d43fe8d19c - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Dec 15 04:43:52 localhost nova_compute[231752]: 2025-12-15 09:43:52.026 231756 DEBUG oslo_concurrency.lockutils [None req-c7fd4a3f-c95b-4936-a071-10d43fe8d19c - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Dec 15 04:43:52 localhost nova_compute[231752]: 2025-12-15 09:43:52.026 231756 DEBUG nova.compute.resource_tracker [None req-c7fd4a3f-c95b-4936-a071-10d43fe8d19c - - - - - -] Auditing locally available compute resources for np0005559462.localdomain (node: np0005559462.localdomain) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m Dec 15 04:43:52 localhost nova_compute[231752]: 2025-12-15 
09:43:52.027 231756 DEBUG oslo_concurrency.processutils [None req-c7fd4a3f-c95b-4936-a071-10d43fe8d19c - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m Dec 15 04:43:52 localhost python3.9[278586]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_vnc_proxy.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Dec 15 04:43:52 localhost nova_compute[231752]: 2025-12-15 09:43:52.466 231756 DEBUG oslo_concurrency.processutils [None req-c7fd4a3f-c95b-4936-a071-10d43fe8d19c - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.440s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m Dec 15 04:43:52 localhost nova_compute[231752]: 2025-12-15 09:43:52.532 231756 DEBUG nova.virt.libvirt.driver [None req-c7fd4a3f-c95b-4936-a071-10d43fe8d19c - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m Dec 15 04:43:52 localhost nova_compute[231752]: 2025-12-15 09:43:52.533 231756 DEBUG nova.virt.libvirt.driver [None req-c7fd4a3f-c95b-4936-a071-10d43fe8d19c - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m Dec 15 04:43:52 localhost nova_compute[231752]: 2025-12-15 09:43:52.749 231756 WARNING nova.virt.libvirt.driver [None req-c7fd4a3f-c95b-4936-a071-10d43fe8d19c - - - - - -] This host appears 
to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m Dec 15 04:43:52 localhost nova_compute[231752]: 2025-12-15 09:43:52.751 231756 DEBUG nova.compute.resource_tracker [None req-c7fd4a3f-c95b-4936-a071-10d43fe8d19c - - - - - -] Hypervisor/Node resource view: name=np0005559462.localdomain free_ram=12147MB free_disk=41.83720779418945GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", 
"address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m Dec 15 04:43:52 localhost nova_compute[231752]: 2025-12-15 09:43:52.751 231756 DEBUG oslo_concurrency.lockutils [None req-c7fd4a3f-c95b-4936-a071-10d43fe8d19c - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Dec 15 04:43:52 localhost nova_compute[231752]: 2025-12-15 09:43:52.752 231756 DEBUG oslo_concurrency.lockutils [None req-c7fd4a3f-c95b-4936-a071-10d43fe8d19c - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Dec 15 04:43:52 localhost nova_compute[231752]: 2025-12-15 09:43:52.804 231756 DEBUG nova.compute.resource_tracker [None req-c7fd4a3f-c95b-4936-a071-10d43fe8d19c - - - - - -] Instance 39ff1bd9-6f6b-44c8-bbec-a1fd9d196359 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 2, 'MEMORY_MB': 512, 'VCPU': 1}}. 
_remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m Dec 15 04:43:52 localhost nova_compute[231752]: 2025-12-15 09:43:52.805 231756 DEBUG nova.compute.resource_tracker [None req-c7fd4a3f-c95b-4936-a071-10d43fe8d19c - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m Dec 15 04:43:52 localhost nova_compute[231752]: 2025-12-15 09:43:52.805 231756 DEBUG nova.compute.resource_tracker [None req-c7fd4a3f-c95b-4936-a071-10d43fe8d19c - - - - - -] Final resource view: name=np0005559462.localdomain phys_ram=15738MB used_ram=1024MB phys_disk=41GB used_disk=2GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m Dec 15 04:43:52 localhost python3.9[278718]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_compute.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Dec 15 04:43:52 localhost nova_compute[231752]: 2025-12-15 09:43:52.839 231756 DEBUG oslo_concurrency.processutils [None req-c7fd4a3f-c95b-4936-a071-10d43fe8d19c - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m Dec 15 04:43:53 localhost nova_compute[231752]: 2025-12-15 09:43:53.265 231756 DEBUG oslo_concurrency.processutils [None req-c7fd4a3f-c95b-4936-a071-10d43fe8d19c - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.426s execute 
/usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m Dec 15 04:43:53 localhost nova_compute[231752]: 2025-12-15 09:43:53.273 231756 DEBUG nova.compute.provider_tree [None req-c7fd4a3f-c95b-4936-a071-10d43fe8d19c - - - - - -] Inventory has not changed in ProviderTree for provider: 26c8956b-6742-4951-b566-971b9bbe323b update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m Dec 15 04:43:53 localhost nova_compute[231752]: 2025-12-15 09:43:53.295 231756 DEBUG nova.scheduler.client.report [None req-c7fd4a3f-c95b-4936-a071-10d43fe8d19c - - - - - -] Inventory has not changed for provider 26c8956b-6742-4951-b566-971b9bbe323b based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m Dec 15 04:43:53 localhost nova_compute[231752]: 2025-12-15 09:43:53.299 231756 DEBUG nova.compute.resource_tracker [None req-c7fd4a3f-c95b-4936-a071-10d43fe8d19c - - - - - -] Compute_service record updated for np0005559462.localdomain:np0005559462.localdomain _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m Dec 15 04:43:53 localhost nova_compute[231752]: 2025-12-15 09:43:53.300 231756 DEBUG oslo_concurrency.lockutils [None req-c7fd4a3f-c95b-4936-a071-10d43fe8d19c - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.548s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Dec 15 04:43:53 localhost python3.9[278848]: ansible-ansible.builtin.file Invoked with 
path=/etc/systemd/system/tripleo_nova_migration_target.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Dec 15 04:43:53 localhost nova_compute[231752]: 2025-12-15 09:43:53.526 231756 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 04:43:54 localhost python3.9[278960]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_api_cron.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Dec 15 04:43:54 localhost nova_compute[231752]: 2025-12-15 09:43:54.242 231756 DEBUG oslo_service.periodic_task [None req-c7fd4a3f-c95b-4936-a071-10d43fe8d19c - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Dec 15 04:43:54 localhost nova_compute[231752]: 2025-12-15 09:43:54.243 231756 DEBUG oslo_service.periodic_task [None req-c7fd4a3f-c95b-4936-a071-10d43fe8d19c - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Dec 15 04:43:54 localhost python3.9[279070]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_api.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S 
unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Dec 15 04:43:54 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:fc:1f:13 MACDST=fa:16:3e:b3:bb:e3 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=708 DF PROTO=TCP SPT=32832 DPT=9102 SEQ=3208699565 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A83A768110000000001030307) Dec 15 04:43:55 localhost systemd[1]: Started /usr/bin/podman healthcheck run a4e94bd7aaee6c174c601060fb5f34559019ccea93fc397263ee3735731cfc1e. Dec 15 04:43:55 localhost podman[279180]: 2025-12-15 09:43:55.165674261 +0000 UTC m=+0.084013087 container health_status a4e94bd7aaee6c174c601060fb5f34559019ccea93fc397263ee3735731cfc1e (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}) Dec 15 04:43:55 localhost podman[279180]: 2025-12-15 09:43:55.173881039 +0000 UTC m=+0.092219845 container exec_died a4e94bd7aaee6c174c601060fb5f34559019ccea93fc397263ee3735731cfc1e 
(image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}) Dec 15 04:43:55 localhost systemd[1]: a4e94bd7aaee6c174c601060fb5f34559019ccea93fc397263ee3735731cfc1e.service: Deactivated successfully. 
Dec 15 04:43:55 localhost python3.9[279181]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_conductor.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Dec 15 04:43:55 localhost python3.9[279313]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_metadata.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Dec 15 04:43:55 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:fc:1f:13 MACDST=fa:16:3e:b3:bb:e3 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=709 DF PROTO=TCP SPT=32832 DPT=9102 SEQ=3208699565 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A83A76C250000000001030307) Dec 15 04:43:56 localhost python3.9[279423]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_scheduler.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Dec 15 04:43:56 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:fc:1f:13 MACDST=fa:16:3e:b3:bb:e3 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=34306 DF PROTO=TCP SPT=36460 DPT=9102 SEQ=1386507001 
ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A83A76F260000000001030307) Dec 15 04:43:57 localhost python3.9[279533]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_vnc_proxy.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Dec 15 04:43:57 localhost python3.9[279643]: ansible-ansible.legacy.command Invoked with _raw_params=if systemctl is-active certmonger.service; then#012 systemctl disable --now certmonger.service#012 test -f /etc/systemd/system/certmonger.service || systemctl mask certmonger.service#012fi#012 _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None Dec 15 04:43:57 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:fc:1f:13 MACDST=fa:16:3e:b3:bb:e3 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=710 DF PROTO=TCP SPT=32832 DPT=9102 SEQ=3208699565 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A83A774250000000001030307) Dec 15 04:43:58 localhost nova_compute[231752]: 2025-12-15 09:43:58.530 231756 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Dec 15 04:43:58 localhost nova_compute[231752]: 2025-12-15 09:43:58.532 231756 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Dec 15 04:43:58 localhost nova_compute[231752]: 2025-12-15 09:43:58.532 231756 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run 
/usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m Dec 15 04:43:58 localhost nova_compute[231752]: 2025-12-15 09:43:58.532 231756 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Dec 15 04:43:58 localhost nova_compute[231752]: 2025-12-15 09:43:58.561 231756 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 04:43:58 localhost nova_compute[231752]: 2025-12-15 09:43:58.562 231756 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Dec 15 04:43:58 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:fc:1f:13 MACDST=fa:16:3e:b3:bb:e3 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=35415 DF PROTO=TCP SPT=38304 DPT=9102 SEQ=3326393576 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A83A777250000000001030307) Dec 15 04:43:59 localhost python3.9[279753]: ansible-ansible.builtin.find Invoked with file_type=any hidden=True paths=['/var/lib/certmonger/requests'] patterns=[] read_whole_file=False age_stamp=mtime recurse=False follow=False get_checksum=False checksum_algorithm=sha1 use_regex=False exact_mode=True excludes=None contains=None age=None size=None depth=None mode=None encoding=None limit=None Dec 15 04:44:00 localhost python3.9[279863]: ansible-ansible.builtin.systemd_service Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None Dec 15 04:44:00 localhost systemd[1]: Reloading. Dec 15 04:44:00 localhost systemd-sysv-generator[279893]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. 
Please update package to include a native systemd unit file, in order to make it more safe and robust. Dec 15 04:44:00 localhost systemd-rc-local-generator[279890]: /etc/rc.d/rc.local is not marked executable, skipping. Dec 15 04:44:00 localhost systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload Dec 15 04:44:00 localhost systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload Dec 15 04:44:00 localhost systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload Dec 15 04:44:00 localhost systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload Dec 15 04:44:00 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Dec 15 04:44:00 localhost systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload Dec 15 04:44:00 localhost systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload Dec 15 04:44:00 localhost systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload Dec 15 04:44:00 localhost systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload Dec 15 04:44:01 localhost python3.9[280009]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_compute.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None Dec 15 04:44:01 localhost podman[243449]: time="2025-12-15T09:44:01Z" level=info msg="List containers: received `last` 
parameter - overwriting `limit`" Dec 15 04:44:01 localhost podman[243449]: @ - - [15/Dec/2025:09:44:01 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 149352 "" "Go-http-client/1.1" Dec 15 04:44:01 localhost podman[243449]: @ - - [15/Dec/2025:09:44:01 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 17221 "" "Go-http-client/1.1" Dec 15 04:44:01 localhost python3.9[280120]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_migration_target.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None Dec 15 04:44:01 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:fc:1f:13 MACDST=fa:16:3e:b3:bb:e3 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=711 DF PROTO=TCP SPT=32832 DPT=9102 SEQ=3208699565 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A83A783E50000000001030307) Dec 15 04:44:02 localhost python3.9[280231]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_api_cron.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None Dec 15 04:44:03 localhost python3.9[280342]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_api.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None Dec 15 04:44:03 localhost nova_compute[231752]: 2025-12-15 09:44:03.563 231756 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup 
/usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Dec 15 04:44:03 localhost python3.9[280501]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_conductor.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None Dec 15 04:44:04 localhost python3.9[280631]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_metadata.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None Dec 15 04:44:04 localhost openstack_network_exporter[246484]: ERROR 09:44:04 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server Dec 15 04:44:04 localhost openstack_network_exporter[246484]: ERROR 09:44:04 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd Dec 15 04:44:04 localhost openstack_network_exporter[246484]: ERROR 09:44:04 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd Dec 15 04:44:04 localhost openstack_network_exporter[246484]: ERROR 09:44:04 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath Dec 15 04:44:04 localhost openstack_network_exporter[246484]: Dec 15 04:44:04 localhost openstack_network_exporter[246484]: ERROR 09:44:04 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath Dec 15 04:44:04 localhost openstack_network_exporter[246484]: Dec 15 04:44:05 localhost python3.9[280742]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_scheduler.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None 
executable=None creates=None removes=None stdin=None Dec 15 04:44:05 localhost systemd[1]: Started /usr/bin/podman healthcheck run 67ee72fcd99c896f79e8807764b1d0f04e7d45927d06e306c0986394e8ed42a0. Dec 15 04:44:05 localhost systemd[1]: Started /usr/bin/podman healthcheck run 9f0f481c937606298203797a71924b7614a5d9e3f4a8afd837cfa5b9602aedeb. Dec 15 04:44:05 localhost podman[280745]: 2025-12-15 09:44:05.292508447 +0000 UTC m=+0.075924122 container health_status 9f0f481c937606298203797a71924b7614a5d9e3f4a8afd837cfa5b9602aedeb (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', 
'/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3, container_name=multipathd, managed_by=edpm_ansible, org.label-schema.build-date=20251202) Dec 15 04:44:05 localhost podman[280745]: 2025-12-15 09:44:05.301826877 +0000 UTC m=+0.085242582 container exec_died 9f0f481c937606298203797a71924b7614a5d9e3f4a8afd837cfa5b9602aedeb (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=multipathd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, container_name=multipathd, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', 
'/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}) Dec 15 04:44:05 localhost systemd[1]: 9f0f481c937606298203797a71924b7614a5d9e3f4a8afd837cfa5b9602aedeb.service: Deactivated successfully. Dec 15 04:44:05 localhost podman[280744]: 2025-12-15 09:44:05.342321118 +0000 UTC m=+0.128141020 container health_status 67ee72fcd99c896f79e8807764b1d0f04e7d45927d06e306c0986394e8ed42a0 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, container_name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter) Dec 15 04:44:05 localhost podman[280744]: 2025-12-15 09:44:05.377338486 +0000 UTC m=+0.163158358 container exec_died 67ee72fcd99c896f79e8807764b1d0f04e7d45927d06e306c0986394e8ed42a0 
(image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}) Dec 15 04:44:05 localhost systemd[1]: 67ee72fcd99c896f79e8807764b1d0f04e7d45927d06e306c0986394e8ed42a0.service: Deactivated successfully. Dec 15 04:44:05 localhost python3.9[280895]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_vnc_proxy.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None Dec 15 04:44:06 localhost systemd[1]: Started /usr/bin/podman healthcheck run b3d8483ade539500e228c2af9c0c3e647febb5ec7903610a83aea32631007d4a. 
Dec 15 04:44:06 localhost podman[280914]: 2025-12-15 09:44:06.747171669 +0000 UTC m=+0.081412244 container health_status b3d8483ade539500e228c2af9c0c3e647febb5ec7903610a83aea32631007d4a (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, config_id=ceilometer_agent_compute, container_name=ceilometer_agent_compute, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '07f3af15d79276418f54a8dde2fc251064527c3571430e14af77aaa2c01f1193-cb1e9478634a00c8fb5ad9cd7fcd38c7b2a1a85187dfaa618bc715c24c2b15fb'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, 
org.label-schema.build-date=20251202) Dec 15 04:44:06 localhost podman[280914]: 2025-12-15 09:44:06.762452956 +0000 UTC m=+0.096693531 container exec_died b3d8483ade539500e228c2af9c0c3e647febb5ec7903610a83aea32631007d4a (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, config_id=ceilometer_agent_compute, container_name=ceilometer_agent_compute, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '07f3af15d79276418f54a8dde2fc251064527c3571430e14af77aaa2c01f1193-cb1e9478634a00c8fb5ad9cd7fcd38c7b2a1a85187dfaa618bc715c24c2b15fb'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, 
io.buildah.version=1.41.3, managed_by=edpm_ansible) Dec 15 04:44:06 localhost systemd[1]: b3d8483ade539500e228c2af9c0c3e647febb5ec7903610a83aea32631007d4a.service: Deactivated successfully. Dec 15 04:44:08 localhost python3.9[281043]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/openstack/config/nova setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None Dec 15 04:44:08 localhost nova_compute[231752]: 2025-12-15 09:44:08.565 231756 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Dec 15 04:44:08 localhost nova_compute[231752]: 2025-12-15 09:44:08.567 231756 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Dec 15 04:44:08 localhost nova_compute[231752]: 2025-12-15 09:44:08.567 231756 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m Dec 15 04:44:08 localhost nova_compute[231752]: 2025-12-15 09:44:08.568 231756 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Dec 15 04:44:08 localhost nova_compute[231752]: 2025-12-15 09:44:08.591 231756 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 04:44:08 localhost nova_compute[231752]: 2025-12-15 09:44:08.592 231756 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Dec 15 
04:44:08 localhost python3.9[281153]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/openstack/config/containers setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None Dec 15 04:44:09 localhost python3.9[281263]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/openstack/config/nova_nvme_cleaner setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None Dec 15 04:44:10 localhost python3.9[281373]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/nova setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None Dec 15 04:44:10 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:fc:1f:13 MACDST=fa:16:3e:b3:bb:e3 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=712 DF PROTO=TCP SPT=32832 DPT=9102 SEQ=3208699565 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A83A7A5250000000001030307) Dec 15 04:44:11 localhost python3.9[281483]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/_nova_secontext setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S 
access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None Dec 15 04:44:11 localhost systemd[1]: Started /usr/bin/podman healthcheck run 730c50020a7d9e2da21d2462d40af20ac303e43f3491f692cbbd87f432ba3e09. Dec 15 04:44:11 localhost systemd[1]: tmp-crun.h080rl.mount: Deactivated successfully. Dec 15 04:44:11 localhost podman[281594]: 2025-12-15 09:44:11.556076469 +0000 UTC m=+0.091424773 container health_status 730c50020a7d9e2da21d2462d40af20ac303e43f3491f692cbbd87f432ba3e09 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, maintainer=Red Hat, Inc., managed_by=edpm_ansible, vcs-type=git, name=ubi9-minimal, architecture=x86_64, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.component=ubi9-minimal-container, distribution-scope=public, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., build-date=2025-08-20T13:12:41, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.openshift.tags=minimal rhel9, version=9.6, release=1755695350, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., container_name=openstack_network_exporter, io.buildah.version=1.33.7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=openstack_network_exporter, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'cb1e9478634a00c8fb5ad9cd7fcd38c7b2a1a85187dfaa618bc715c24c2b15fb'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vendor=Red Hat, Inc., io.openshift.expose-services=) Dec 15 04:44:11 localhost podman[281594]: 2025-12-15 09:44:11.569294668 +0000 UTC m=+0.104642902 container exec_died 730c50020a7d9e2da21d2462d40af20ac303e43f3491f692cbbd87f432ba3e09 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, io.openshift.tags=minimal rhel9, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, managed_by=edpm_ansible, name=ubi9-minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., distribution-scope=public, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, version=9.6, maintainer=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, architecture=x86_64, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'cb1e9478634a00c8fb5ad9cd7fcd38c7b2a1a85187dfaa618bc715c24c2b15fb'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, config_id=openstack_network_exporter, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vendor=Red Hat, Inc., release=1755695350, build-date=2025-08-20T13:12:41, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. 
This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.buildah.version=1.33.7, container_name=openstack_network_exporter, io.openshift.expose-services=, url=https://catalog.redhat.com/en/search?searchType=containers) Dec 15 04:44:11 localhost systemd[1]: 730c50020a7d9e2da21d2462d40af20ac303e43f3491f692cbbd87f432ba3e09.service: Deactivated successfully. Dec 15 04:44:11 localhost python3.9[281593]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/nova/instances setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None Dec 15 04:44:12 localhost python3.9[281724]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/etc/ceph setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None Dec 15 04:44:12 localhost systemd[1]: Started /usr/bin/podman healthcheck run ac2eae30a2a7f971398af42ea67b3ed799d4b3341f7688ebcd73f7ca7f66c2a8. Dec 15 04:44:12 localhost systemd[1]: tmp-crun.P2xfpC.mount: Deactivated successfully. 
Dec 15 04:44:12 localhost podman[281834]: 2025-12-15 09:44:12.76502321 +0000 UTC m=+0.090280142 container health_status ac2eae30a2a7f971398af42ea67b3ed799d4b3341f7688ebcd73f7ca7f66c2a8 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_controller, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, container_name=ovn_controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '07f3af15d79276418f54a8dde2fc251064527c3571430e14af77aaa2c01f1193'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2) Dec 15 04:44:12 localhost podman[281834]: 2025-12-15 09:44:12.848405519 +0000 UTC m=+0.173662471 container exec_died ac2eae30a2a7f971398af42ea67b3ed799d4b3341f7688ebcd73f7ca7f66c2a8 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, tcib_managed=true, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, 
maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ovn_controller, container_name=ovn_controller, io.buildah.version=1.41.3, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '07f3af15d79276418f54a8dde2fc251064527c3571430e14af77aaa2c01f1193'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible) Dec 15 04:44:12 localhost systemd[1]: ac2eae30a2a7f971398af42ea67b3ed799d4b3341f7688ebcd73f7ca7f66c2a8.service: Deactivated successfully. 
Dec 15 04:44:12 localhost python3.9[281835]: ansible-ansible.builtin.file Invoked with group=zuul owner=zuul path=/etc/multipath setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None Dec 15 04:44:13 localhost python3.9[281969]: ansible-ansible.builtin.file Invoked with group=zuul owner=zuul path=/etc/nvme setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None Dec 15 04:44:13 localhost nova_compute[231752]: 2025-12-15 09:44:13.592 231756 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 04:44:13 localhost nova_compute[231752]: 2025-12-15 09:44:13.593 231756 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 04:44:14 localhost python3.9[282079]: ansible-ansible.builtin.file Invoked with group=zuul owner=zuul path=/run/openvswitch setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None Dec 15 04:44:18 localhost nova_compute[231752]: 2025-12-15 09:44:18.594 231756 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Dec 15 04:44:18 
localhost nova_compute[231752]: 2025-12-15 09:44:18.596 231756 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Dec 15 04:44:18 localhost nova_compute[231752]: 2025-12-15 09:44:18.596 231756 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m Dec 15 04:44:18 localhost nova_compute[231752]: 2025-12-15 09:44:18.597 231756 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Dec 15 04:44:18 localhost nova_compute[231752]: 2025-12-15 09:44:18.616 231756 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 04:44:18 localhost nova_compute[231752]: 2025-12-15 09:44:18.617 231756 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Dec 15 04:44:18 localhost systemd[1]: Started /usr/bin/podman healthcheck run 4d46c406a3855fd1c322e5d86c048188ea5865daa07ba23aaebacc7879668923. 
Dec 15 04:44:18 localhost podman[282097]: 2025-12-15 09:44:18.750320431 +0000 UTC m=+0.080476417 container health_status 4d46c406a3855fd1c322e5d86c048188ea5865daa07ba23aaebacc7879668923 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_managed=true, managed_by=edpm_ansible, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '07f3af15d79276418f54a8dde2fc251064527c3571430e14af77aaa2c01f1193-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}) Dec 15 04:44:18 localhost 
podman[282097]: 2025-12-15 09:44:18.783354664 +0000 UTC m=+0.113510620 container exec_died 4d46c406a3855fd1c322e5d86c048188ea5865daa07ba23aaebacc7879668923 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '07f3af15d79276418f54a8dde2fc251064527c3571430e14af77aaa2c01f1193-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, container_name=ovn_metadata_agent, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image) Dec 15 04:44:18 localhost systemd[1]: 
4d46c406a3855fd1c322e5d86c048188ea5865daa07ba23aaebacc7879668923.service: Deactivated successfully. Dec 15 04:44:19 localhost python3.9[282209]: ansible-ansible.builtin.getent Invoked with database=passwd key=nova fail_key=True service=None split=None Dec 15 04:44:20 localhost sshd[282228]: main: sshd: ssh-rsa algorithm is disabled Dec 15 04:44:20 localhost systemd-logind[763]: New session 60 of user zuul. Dec 15 04:44:20 localhost systemd[1]: Started Session 60 of User zuul. Dec 15 04:44:21 localhost systemd[1]: session-60.scope: Deactivated successfully. Dec 15 04:44:21 localhost systemd-logind[763]: Session 60 logged out. Waiting for processes to exit. Dec 15 04:44:21 localhost systemd-logind[763]: Removed session 60. Dec 15 04:44:21 localhost python3.9[282339]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/nova/config.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Dec 15 04:44:22 localhost python3.9[282425]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/nova/config.json mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1765791861.195667-3151-244886875961228/.source.json follow=False _original_basename=config.json.j2 checksum=b51012bfb0ca26296dcf3793a2f284446fb1395e backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None Dec 15 04:44:22 localhost python3.9[282533]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/nova/nova-blank.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Dec 15 04:44:23 localhost python3.9[282588]: ansible-ansible.legacy.file Invoked with mode=0644 setype=container_file_t 
dest=/var/lib/openstack/config/nova/nova-blank.conf _original_basename=nova-blank.conf recurse=False state=file path=/var/lib/openstack/config/nova/nova-blank.conf force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None attributes=None Dec 15 04:44:23 localhost nova_compute[231752]: 2025-12-15 09:44:23.617 231756 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Dec 15 04:44:23 localhost nova_compute[231752]: 2025-12-15 09:44:23.621 231756 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 04:44:23 localhost python3.9[282696]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/nova/ssh-config follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Dec 15 04:44:24 localhost python3.9[282782]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/nova/ssh-config mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1765791863.4308343-3151-195886374274380/.source follow=False _original_basename=ssh-config checksum=4297f735c41bdc1ff52d72e6f623a02242f37958 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None Dec 15 04:44:24 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:fc:1f:13 MACDST=fa:16:3e:b3:bb:e3 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=52582 DF PROTO=TCP SPT=53566 DPT=9102 SEQ=4061031728 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT 
(020405500402080A83A7DD420000000001030307) Dec 15 04:44:24 localhost python3.9[282890]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/nova/02-nova-host-specific.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Dec 15 04:44:25 localhost python3.9[282976]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/nova/02-nova-host-specific.conf mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1765791864.482272-3151-73407398485088/.source.conf follow=False _original_basename=02-nova-host-specific.conf.j2 checksum=2665bfc4419dff19b3a41ac57ea64cb1932d7c0f backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None Dec 15 04:44:25 localhost systemd[1]: Started /usr/bin/podman healthcheck run a4e94bd7aaee6c174c601060fb5f34559019ccea93fc397263ee3735731cfc1e. Dec 15 04:44:25 localhost systemd[1]: tmp-crun.cGhwiw.mount: Deactivated successfully. 
Dec 15 04:44:25 localhost podman[283048]: 2025-12-15 09:44:25.729383324 +0000 UTC m=+0.065391407 container health_status a4e94bd7aaee6c174c601060fb5f34559019ccea93fc397263ee3735731cfc1e (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible) Dec 15 04:44:25 localhost podman[283048]: 2025-12-15 09:44:25.742472639 +0000 UTC m=+0.078480672 container exec_died a4e94bd7aaee6c174c601060fb5f34559019ccea93fc397263ee3735731cfc1e (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': 
['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter) Dec 15 04:44:25 localhost systemd[1]: a4e94bd7aaee6c174c601060fb5f34559019ccea93fc397263ee3735731cfc1e.service: Deactivated successfully. Dec 15 04:44:25 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:fc:1f:13 MACDST=fa:16:3e:b3:bb:e3 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=52583 DF PROTO=TCP SPT=53566 DPT=9102 SEQ=4061031728 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A83A7E1650000000001030307) Dec 15 04:44:25 localhost python3.9[283107]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/nova/nova_statedir_ownership.py follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Dec 15 04:44:26 localhost python3.9[283193]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/nova/nova_statedir_ownership.py mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1765791865.5540354-3151-52081267285815/.source.py follow=False _original_basename=nova_statedir_ownership.py checksum=c6c8a3cfefa5efd60ceb1408c4e977becedb71e2 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None Dec 15 04:44:26 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:fc:1f:13 MACDST=fa:16:3e:b3:bb:e3 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=713 DF PROTO=TCP SPT=32832 DPT=9102 SEQ=3208699565 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A83A7E5250000000001030307) Dec 15 04:44:26 localhost python3.9[283301]: ansible-ansible.legacy.stat Invoked with 
path=/var/lib/openstack/config/nova/run-on-host follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Dec 15 04:44:27 localhost python3.9[283387]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/nova/run-on-host mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1765791866.5283449-3151-155382794381630/.source follow=False _original_basename=run-on-host checksum=93aba8edc83d5878604a66d37fea2f12b60bdea2 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None Dec 15 04:44:27 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:fc:1f:13 MACDST=fa:16:3e:b3:bb:e3 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=52584 DF PROTO=TCP SPT=53566 DPT=9102 SEQ=4061031728 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A83A7E9650000000001030307) Dec 15 04:44:28 localhost python3.9[283497]: ansible-ansible.builtin.file Invoked with group=nova mode=0700 owner=nova path=/home/nova/.ssh state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None Dec 15 04:44:28 localhost nova_compute[231752]: 2025-12-15 09:44:28.623 231756 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Dec 15 04:44:28 localhost nova_compute[231752]: 2025-12-15 09:44:28.625 231756 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Dec 15 04:44:28 localhost nova_compute[231752]: 2025-12-15 
09:44:28.625 231756 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m Dec 15 04:44:28 localhost nova_compute[231752]: 2025-12-15 09:44:28.625 231756 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Dec 15 04:44:28 localhost nova_compute[231752]: 2025-12-15 09:44:28.663 231756 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 04:44:28 localhost nova_compute[231752]: 2025-12-15 09:44:28.664 231756 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Dec 15 04:44:28 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:fc:1f:13 MACDST=fa:16:3e:b3:bb:e3 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=34307 DF PROTO=TCP SPT=36460 DPT=9102 SEQ=1386507001 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A83A7ED250000000001030307) Dec 15 04:44:29 localhost python3.9[283607]: ansible-ansible.legacy.copy Invoked with dest=/home/nova/.ssh/authorized_keys group=nova mode=0600 owner=nova remote_src=True src=/var/lib/openstack/config/nova/ssh-publickey backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None Dec 15 04:44:29 localhost python3.9[283717]: ansible-ansible.builtin.stat Invoked with path=/var/lib/nova/compute_id follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1 Dec 15 04:44:30 localhost python3.9[283829]: ansible-ansible.builtin.file Invoked with group=nova mode=0400 
owner=nova path=/var/lib/nova/compute_id state=file recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None Dec 15 04:44:31 localhost python3.9[283937]: ansible-ansible.builtin.stat Invoked with path=/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1 Dec 15 04:44:31 localhost podman[243449]: time="2025-12-15T09:44:31Z" level=info msg="List containers: received `last` parameter - overwriting `limit`" Dec 15 04:44:31 localhost podman[243449]: @ - - [15/Dec/2025:09:44:31 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 149352 "" "Go-http-client/1.1" Dec 15 04:44:31 localhost podman[243449]: @ - - [15/Dec/2025:09:44:31 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 17226 "" "Go-http-client/1.1" Dec 15 04:44:31 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:fc:1f:13 MACDST=fa:16:3e:b3:bb:e3 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=52585 DF PROTO=TCP SPT=53566 DPT=9102 SEQ=4061031728 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A83A7F9250000000001030307) Dec 15 04:44:32 localhost python3.9[284047]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/containers/nova_compute.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Dec 15 04:44:32 localhost python3.9[284102]: ansible-ansible.legacy.file Invoked with mode=0644 setype=container_file_t dest=/var/lib/openstack/config/containers/nova_compute.json _original_basename=nova_compute.json.j2 
recurse=False state=file path=/var/lib/openstack/config/containers/nova_compute.json force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None attributes=None Dec 15 04:44:33 localhost python3.9[284210]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/containers/nova_compute_init.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False Dec 15 04:44:33 localhost python3.9[284265]: ansible-ansible.legacy.file Invoked with mode=0700 setype=container_file_t dest=/var/lib/openstack/config/containers/nova_compute_init.json _original_basename=nova_compute_init.json.j2 recurse=False state=file path=/var/lib/openstack/config/containers/nova_compute_init.json force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None attributes=None Dec 15 04:44:33 localhost nova_compute[231752]: 2025-12-15 09:44:33.664 231756 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Dec 15 04:44:34 localhost python3.9[284375]: ansible-container_config_data Invoked with config_overrides={} config_path=/var/lib/openstack/config/containers config_pattern=nova_compute_init.json debug=False Dec 15 04:44:34 localhost openstack_network_exporter[246484]: ERROR 09:44:34 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd Dec 15 04:44:34 localhost openstack_network_exporter[246484]: ERROR 09:44:34 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server Dec 15 04:44:34 localhost 
openstack_network_exporter[246484]: ERROR 09:44:34 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd Dec 15 04:44:34 localhost openstack_network_exporter[246484]: ERROR 09:44:34 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath Dec 15 04:44:34 localhost openstack_network_exporter[246484]: Dec 15 04:44:34 localhost openstack_network_exporter[246484]: ERROR 09:44:34 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath Dec 15 04:44:34 localhost openstack_network_exporter[246484]: Dec 15 04:44:35 localhost python3.9[284485]: ansible-container_config_hash Invoked with check_mode=False config_vol_prefix=/var/lib/openstack Dec 15 04:44:35 localhost systemd[1]: Started /usr/bin/podman healthcheck run 67ee72fcd99c896f79e8807764b1d0f04e7d45927d06e306c0986394e8ed42a0. Dec 15 04:44:35 localhost systemd[1]: Started /usr/bin/podman healthcheck run 9f0f481c937606298203797a71924b7614a5d9e3f4a8afd837cfa5b9602aedeb. 
Dec 15 04:44:35 localhost podman[284504]: 2025-12-15 09:44:35.739870508 +0000 UTC m=+0.063207985 container health_status 9f0f481c937606298203797a71924b7614a5d9e3f4a8afd837cfa5b9602aedeb (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=multipathd, managed_by=edpm_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, container_name=multipathd, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.schema-version=1.0) Dec 15 04:44:35 localhost systemd[1]: tmp-crun.AJaZ3k.mount: Deactivated successfully. 
Dec 15 04:44:35 localhost podman[284503]: 2025-12-15 09:44:35.810800489 +0000 UTC m=+0.133686164 container health_status 67ee72fcd99c896f79e8807764b1d0f04e7d45927d06e306c0986394e8ed42a0 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible) Dec 15 04:44:35 localhost podman[284503]: 2025-12-15 09:44:35.825403377 +0000 UTC m=+0.148288972 container exec_died 67ee72fcd99c896f79e8807764b1d0f04e7d45927d06e306c0986394e8ed42a0 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, managed_by=edpm_ansible, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', 
'--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors ) Dec 15 04:44:35 localhost podman[284504]: 2025-12-15 09:44:35.832856905 +0000 UTC m=+0.156194432 container exec_died 9f0f481c937606298203797a71924b7614a5d9e3f4a8afd837cfa5b9602aedeb (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, container_name=multipathd, managed_by=edpm_ansible, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, config_id=multipathd) Dec 15 04:44:35 localhost systemd[1]: 67ee72fcd99c896f79e8807764b1d0f04e7d45927d06e306c0986394e8ed42a0.service: Deactivated successfully. Dec 15 04:44:35 localhost systemd[1]: 9f0f481c937606298203797a71924b7614a5d9e3f4a8afd837cfa5b9602aedeb.service: Deactivated successfully. 
Dec 15 04:44:36 localhost python3[284639]: ansible-edpm_container_manage Invoked with concurrency=1 config_dir=/var/lib/openstack/config/containers config_id=edpm config_overrides={} config_patterns=nova_compute_init.json containers=[] log_base_path=/var/log/containers/stdouts debug=False Dec 15 04:44:36 localhost python3[284639]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: [#012 {#012 "Id": "e3166cc074f328e3b121ff82d56ed43a2542af699baffe6874520fe3837c2b18",#012 "Digest": "sha256:526afed30c44ef41d54d63a4f4db122bc603f775243ae350a59d2e0b5050076b",#012 "RepoTags": [#012 "quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified"#012 ],#012 "RepoDigests": [#012 "quay.io/podified-antelope-centos9/openstack-nova-compute@sha256:526afed30c44ef41d54d63a4f4db122bc603f775243ae350a59d2e0b5050076b"#012 ],#012 "Parent": "",#012 "Comment": "",#012 "Created": "2025-12-08T06:29:58.491919425Z",#012 "Config": {#012 "User": "nova",#012 "Env": [#012 "PATH=/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin",#012 "LANG=en_US.UTF-8",#012 "TZ=UTC",#012 "container=oci"#012 ],#012 "Entrypoint": [#012 "dumb-init",#012 "--single-child",#012 "--"#012 ],#012 "Cmd": [#012 "kolla_start"#012 ],#012 "Labels": {#012 "io.buildah.version": "1.41.3",#012 "maintainer": "OpenStack Kubernetes Operator team",#012 "org.label-schema.build-date": "20251202",#012 "org.label-schema.license": "GPLv2",#012 "org.label-schema.name": "CentOS Stream 9 Base Image",#012 "org.label-schema.schema-version": "1.0",#012 "org.label-schema.vendor": "CentOS",#012 "tcib_build_tag": "c3923531bcda0b0811b2d5053f189beb",#012 "tcib_managed": "true"#012 },#012 "StopSignal": "SIGTERM"#012 },#012 "Version": "",#012 "Author": "",#012 "Architecture": "amd64",#012 "Os": "linux",#012 "Size": 1212370809,#012 "VirtualSize": 1212370809,#012 "GraphDriver": {#012 "Name": "overlay",#012 "Data": {#012 "LowerDir": 
"/var/lib/containers/storage/overlay/0825dff58a4425bd2cec24761b25b1273896b2e1fd9e1bbd68a0daa8371ae8a9/diff:/var/lib/containers/storage/overlay/4c2a493cc38fe0c2d274b137f7d549c92d76e83cf216e797584fb8469937682d/diff:/var/lib/containers/storage/overlay/102653142e2259aa6223045dee7736729104ac8aed3ce9b3c87a6d0787e59de8/diff:/var/lib/containers/storage/overlay/a170762be59c15b133bd19c602942600caa3082ffe7158ccee8771dfc16bb660/diff",#012 "UpperDir": "/var/lib/containers/storage/overlay/fa799eba62894ffeea98e2f07b9b6a110cbee9080a6c3af1755332d33ad27617/diff",#012 "WorkDir": "/var/lib/containers/storage/overlay/fa799eba62894ffeea98e2f07b9b6a110cbee9080a6c3af1755332d33ad27617/work"#012 }#012 },#012 "RootFS": {#012 "Type": "layers",#012 "Layers": [#012 "sha256:a170762be59c15b133bd19c602942600caa3082ffe7158ccee8771dfc16bb660",#012 "sha256:47bbb708952ccfdaf6b1a15cd5347cc2e9ee37e63ec65603401dcebf66de9242",#012 "sha256:dde195c4be3ea0882f3029365e3a9510c9e08a199c8a2c93ddc2b8aa725a10f1",#012 "sha256:191522e021d026966b0789970c823d3aa8f268180d3ac4a9714f61201ef3b79e",#012 "sha256:226bf9c0939dc7236a630e18a3cd37bc8e773e86f3cf4ef2cedf4d22a6a7d337"#012 ]#012 },#012 "Labels": {#012 "io.buildah.version": "1.41.3",#012 "maintainer": "OpenStack Kubernetes Operator team",#012 "org.label-schema.build-date": "20251202",#012 "org.label-schema.license": "GPLv2",#012 "org.label-schema.name": "CentOS Stream 9 Base Image",#012 "org.label-schema.schema-version": "1.0",#012 "org.label-schema.vendor": "CentOS",#012 "tcib_build_tag": "c3923531bcda0b0811b2d5053f189beb",#012 "tcib_managed": "true"#012 },#012 "Annotations": {},#012 "ManifestType": "application/vnd.docker.distribution.manifest.v2+json",#012 "User": "nova",#012 "History": [#012 {#012 "created": "2025-12-02T04:26:51.317229596Z",#012 "created_by": "/bin/sh -c #(nop) ADD file:9f05a0f58e10b77188c7243d914ce56c5ce3e0f2ee7e13a7b0d4990588c97b99 in / ",#012 "empty_layer": true#012 },#012 {#012 "created": "2025-12-02T04:26:51.317315213Z",#012 "created_by": 
"/bin/sh -c #(nop) LABEL org.label-schema.schema-version=\"1.0\" org.label-schema.name=\"CentOS Stream 9 Base Image\" org.label-schema.vendor=\"CentOS\" org.label-schema.license=\"GPLv2\" org.label-schema.build-date=\"20251202\"",#012 "empty_layer": true#012 },#012 {#012 "created": "2025-12-02T04:26:54.063957926Z",#012 "created_by": "/bin/sh -c #(nop) CMD [\"/bin/bash\"]"#012 },#012 {#012 "created": "2025-12-08T06:08:28.750777742Z",#012 "created_by": "/bin/sh -c #(nop) LABEL maintainer=\"OpenStack Kubernetes Operator team\"",#012 "comment": "FROM quay.io/centos/centos:stream9",#012 "empty_layer": true#012 },#012 {#012 "created": "2025-12-08T06:08:28.750791962Z",#012 "created_by": "/bin/sh -c #(nop) LABEL tcib_managed=true",#012 "empty_layer": true#012 },#012 {#012 "created": "2025-12-08T06:08:28.750804372Z",#012 "created_by": "/bin/sh -c #(nop) ENV LANG=\"en_US.UTF-8\"",#012 "empty_layer": true#012 },#012 {#012 "created": "2025-12-08T06:08:28.750813613Z",#012 "created_by": "/bin/sh -c #(nop) ENV TZ=\"UTC\"",#012 "empty_layer": true#012 },#012 {#012 "created": "2025-12-08T06:08:28.750824813Z",#012 "created_by": "/bin/sh -c #(nop) ENV container=\"oci\"",#012 "empty_layer": true#012 },#012 {#012 "created": "2025-12-08T06:08:28.750833663Z",#012 "created_by": "/bin/sh -c #(nop) USER root",#012 "empty_layer": true#012 },#012 {#012 "created": "2025-12-08T06:08:29.160435164Z",#012 "created_by": "/bin/sh -c if [ -f \"/etc/yum.repos.d/ubi.repo\" ]; then rm -f /etc/yum.repos.d/ubi.repo && dnf clean all && rm -rf /var/cache/dnf; fi",#012 "empty_layer": true#012 },#012 {#012 "created": "2025-12-08T06:09:05.859236491Z",#012 "created_by": "/bin/sh -c dnf install -y crudini && crudini --del /etc/dnf/dnf.conf main override_install_langs && crudini --set /etc/dnf/dnf.conf main clean_requirements_on_remove True && crudini --set /etc/dnf/dnf.conf main exactarch 1 && crudini --set /etc/dnf/dnf.conf main gpgcheck 1 && crudini --set /etc/dnf/dnf.conf main install_weak_deps False && if [ 
'centos' == 'centos' ];then crudini --set /etc/dnf/dnf.conf main best False; fi && crudini --set /etc/dnf/dnf.conf main installonly_limit 0 && crudini --set /etc/dnf/dnf.conf main keepcache 0 && crudini --set /etc/dnf/dnf.conf main obsoletes 1 && crudini --set /etc/dnf/dnf.conf main plugins 1 && crudini --set /etc/dnf/dnf.conf main skip_missing_names_on_install False && crudini --set /etc/dnf/dnf.conf main tsflags nodocs",#012 "empty_layer": true#012 },#012 {#012 Dec 15 04:44:37 localhost systemd[1]: Started /usr/bin/podman healthcheck run b3d8483ade539500e228c2af9c0c3e647febb5ec7903610a83aea32631007d4a. Dec 15 04:44:37 localhost podman[284810]: 2025-12-15 09:44:37.342142193 +0000 UTC m=+0.082614268 container health_status b3d8483ade539500e228c2af9c0c3e647febb5ec7903610a83aea32631007d4a (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '07f3af15d79276418f54a8dde2fc251064527c3571430e14af77aaa2c01f1193-cb1e9478634a00c8fb5ad9cd7fcd38c7b2a1a85187dfaa618bc715c24c2b15fb'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_id=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, managed_by=edpm_ansible) Dec 15 04:44:37 localhost podman[284810]: 2025-12-15 09:44:37.352292796 +0000 UTC m=+0.092764901 container exec_died b3d8483ade539500e228c2af9c0c3e647febb5ec7903610a83aea32631007d4a (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_id=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '07f3af15d79276418f54a8dde2fc251064527c3571430e14af77aaa2c01f1193-cb1e9478634a00c8fb5ad9cd7fcd38c7b2a1a85187dfaa618bc715c24c2b15fb'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': 
['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}) Dec 15 04:44:37 localhost systemd[1]: b3d8483ade539500e228c2af9c0c3e647febb5ec7903610a83aea32631007d4a.service: Deactivated successfully. Dec 15 04:44:37 localhost python3.9[284809]: ansible-ansible.builtin.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1 Dec 15 04:44:38 localhost python3.9[284940]: ansible-container_config_data Invoked with config_overrides={} config_path=/var/lib/openstack/config/containers config_pattern=nova_compute.json debug=False Dec 15 04:44:38 localhost nova_compute[231752]: 2025-12-15 09:44:38.666 231756 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Dec 15 04:44:38 localhost nova_compute[231752]: 2025-12-15 09:44:38.668 231756 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Dec 15 04:44:38 localhost nova_compute[231752]: 2025-12-15 09:44:38.668 231756 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m Dec 15 04:44:38 localhost nova_compute[231752]: 2025-12-15 09:44:38.668 231756 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 
tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Dec 15 04:44:38 localhost nova_compute[231752]: 2025-12-15 09:44:38.702 231756 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 04:44:38 localhost nova_compute[231752]: 2025-12-15 09:44:38.702 231756 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Dec 15 04:44:39 localhost python3.9[285050]: ansible-container_config_hash Invoked with check_mode=False config_vol_prefix=/var/lib/openstack Dec 15 04:44:40 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:fc:1f:13 MACDST=fa:16:3e:b3:bb:e3 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=52586 DF PROTO=TCP SPT=53566 DPT=9102 SEQ=4061031728 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A83A819260000000001030307) Dec 15 04:44:40 localhost python3[285160]: ansible-edpm_container_manage Invoked with concurrency=1 config_dir=/var/lib/openstack/config/containers config_id=edpm config_overrides={} config_patterns=nova_compute.json containers=[] log_base_path=/var/log/containers/stdouts debug=False Dec 15 04:44:40 localhost python3[285160]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: [#012 {#012 "Id": "e3166cc074f328e3b121ff82d56ed43a2542af699baffe6874520fe3837c2b18",#012 "Digest": "sha256:526afed30c44ef41d54d63a4f4db122bc603f775243ae350a59d2e0b5050076b",#012 "RepoTags": [#012 "quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified"#012 ],#012 "RepoDigests": [#012 "quay.io/podified-antelope-centos9/openstack-nova-compute@sha256:526afed30c44ef41d54d63a4f4db122bc603f775243ae350a59d2e0b5050076b"#012 ],#012 "Parent": "",#012 "Comment": "",#012 "Created": "2025-12-08T06:29:58.491919425Z",#012 "Config": {#012 "User": "nova",#012 "Env": [#012 
"PATH=/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin",#012 "LANG=en_US.UTF-8",#012 "TZ=UTC",#012 "container=oci"#012 ],#012 "Entrypoint": [#012 "dumb-init",#012 "--single-child",#012 "--"#012 ],#012 "Cmd": [#012 "kolla_start"#012 ],#012 "Labels": {#012 "io.buildah.version": "1.41.3",#012 "maintainer": "OpenStack Kubernetes Operator team",#012 "org.label-schema.build-date": "20251202",#012 "org.label-schema.license": "GPLv2",#012 "org.label-schema.name": "CentOS Stream 9 Base Image",#012 "org.label-schema.schema-version": "1.0",#012 "org.label-schema.vendor": "CentOS",#012 "tcib_build_tag": "c3923531bcda0b0811b2d5053f189beb",#012 "tcib_managed": "true"#012 },#012 "StopSignal": "SIGTERM"#012 },#012 "Version": "",#012 "Author": "",#012 "Architecture": "amd64",#012 "Os": "linux",#012 "Size": 1212370809,#012 "VirtualSize": 1212370809,#012 "GraphDriver": {#012 "Name": "overlay",#012 "Data": {#012 "LowerDir": "/var/lib/containers/storage/overlay/0825dff58a4425bd2cec24761b25b1273896b2e1fd9e1bbd68a0daa8371ae8a9/diff:/var/lib/containers/storage/overlay/4c2a493cc38fe0c2d274b137f7d549c92d76e83cf216e797584fb8469937682d/diff:/var/lib/containers/storage/overlay/102653142e2259aa6223045dee7736729104ac8aed3ce9b3c87a6d0787e59de8/diff:/var/lib/containers/storage/overlay/a170762be59c15b133bd19c602942600caa3082ffe7158ccee8771dfc16bb660/diff",#012 "UpperDir": "/var/lib/containers/storage/overlay/fa799eba62894ffeea98e2f07b9b6a110cbee9080a6c3af1755332d33ad27617/diff",#012 "WorkDir": "/var/lib/containers/storage/overlay/fa799eba62894ffeea98e2f07b9b6a110cbee9080a6c3af1755332d33ad27617/work"#012 }#012 },#012 "RootFS": {#012 "Type": "layers",#012 "Layers": [#012 "sha256:a170762be59c15b133bd19c602942600caa3082ffe7158ccee8771dfc16bb660",#012 "sha256:47bbb708952ccfdaf6b1a15cd5347cc2e9ee37e63ec65603401dcebf66de9242",#012 "sha256:dde195c4be3ea0882f3029365e3a9510c9e08a199c8a2c93ddc2b8aa725a10f1",#012 "sha256:191522e021d026966b0789970c823d3aa8f268180d3ac4a9714f61201ef3b79e",#012 
"sha256:226bf9c0939dc7236a630e18a3cd37bc8e773e86f3cf4ef2cedf4d22a6a7d337"#012 ]#012 },#012 "Labels": {#012 "io.buildah.version": "1.41.3",#012 "maintainer": "OpenStack Kubernetes Operator team",#012 "org.label-schema.build-date": "20251202",#012 "org.label-schema.license": "GPLv2",#012 "org.label-schema.name": "CentOS Stream 9 Base Image",#012 "org.label-schema.schema-version": "1.0",#012 "org.label-schema.vendor": "CentOS",#012 "tcib_build_tag": "c3923531bcda0b0811b2d5053f189beb",#012 "tcib_managed": "true"#012 },#012 "Annotations": {},#012 "ManifestType": "application/vnd.docker.distribution.manifest.v2+json",#012 "User": "nova",#012 "History": [#012 {#012 "created": "2025-12-02T04:26:51.317229596Z",#012 "created_by": "/bin/sh -c #(nop) ADD file:9f05a0f58e10b77188c7243d914ce56c5ce3e0f2ee7e13a7b0d4990588c97b99 in / ",#012 "empty_layer": true#012 },#012 {#012 "created": "2025-12-02T04:26:51.317315213Z",#012 "created_by": "/bin/sh -c #(nop) LABEL org.label-schema.schema-version=\"1.0\" org.label-schema.name=\"CentOS Stream 9 Base Image\" org.label-schema.vendor=\"CentOS\" org.label-schema.license=\"GPLv2\" org.label-schema.build-date=\"20251202\"",#012 "empty_layer": true#012 },#012 {#012 "created": "2025-12-02T04:26:54.063957926Z",#012 "created_by": "/bin/sh -c #(nop) CMD [\"/bin/bash\"]"#012 },#012 {#012 "created": "2025-12-08T06:08:28.750777742Z",#012 "created_by": "/bin/sh -c #(nop) LABEL maintainer=\"OpenStack Kubernetes Operator team\"",#012 "comment": "FROM quay.io/centos/centos:stream9",#012 "empty_layer": true#012 },#012 {#012 "created": "2025-12-08T06:08:28.750791962Z",#012 "created_by": "/bin/sh -c #(nop) LABEL tcib_managed=true",#012 "empty_layer": true#012 },#012 {#012 "created": "2025-12-08T06:08:28.750804372Z",#012 "created_by": "/bin/sh -c #(nop) ENV LANG=\"en_US.UTF-8\"",#012 "empty_layer": true#012 },#012 {#012 "created": "2025-12-08T06:08:28.750813613Z",#012 "created_by": "/bin/sh -c #(nop) ENV TZ=\"UTC\"",#012 "empty_layer": true#012 },#012 {#012 
"created": "2025-12-08T06:08:28.750824813Z",#012 "created_by": "/bin/sh -c #(nop) ENV container=\"oci\"",#012 "empty_layer": true#012 },#012 {#012 "created": "2025-12-08T06:08:28.750833663Z",#012 "created_by": "/bin/sh -c #(nop) USER root",#012 "empty_layer": true#012 },#012 {#012 "created": "2025-12-08T06:08:29.160435164Z",#012 "created_by": "/bin/sh -c if [ -f \"/etc/yum.repos.d/ubi.repo\" ]; then rm -f /etc/yum.repos.d/ubi.repo && dnf clean all && rm -rf /var/cache/dnf; fi",#012 "empty_layer": true#012 },#012 {#012 "created": "2025-12-08T06:09:05.859236491Z",#012 "created_by": "/bin/sh -c dnf install -y crudini && crudini --del /etc/dnf/dnf.conf main override_install_langs && crudini --set /etc/dnf/dnf.conf main clean_requirements_on_remove True && crudini --set /etc/dnf/dnf.conf main exactarch 1 && crudini --set /etc/dnf/dnf.conf main gpgcheck 1 && crudini --set /etc/dnf/dnf.conf main install_weak_deps False && if [ 'centos' == 'centos' ];then crudini --set /etc/dnf/dnf.conf main best False; fi && crudini --set /etc/dnf/dnf.conf main installonly_limit 0 && crudini --set /etc/dnf/dnf.conf main keepcache 0 && crudini --set /etc/dnf/dnf.conf main obsoletes 1 && crudini --set /etc/dnf/dnf.conf main plugins 1 && crudini --set /etc/dnf/dnf.conf main skip_missing_names_on_install False && crudini --set /etc/dnf/dnf.conf main tsflags nodocs",#012 "empty_layer": true#012 },#012 {#012 Dec 15 04:44:41 localhost systemd[1]: Started /usr/bin/podman healthcheck run 730c50020a7d9e2da21d2462d40af20ac303e43f3491f692cbbd87f432ba3e09. 
Dec 15 04:44:41 localhost python3.9[285331]: ansible-ansible.builtin.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1 Dec 15 04:44:41 localhost podman[285332]: 2025-12-15 09:44:41.746335412 +0000 UTC m=+0.077559446 container health_status 730c50020a7d9e2da21d2462d40af20ac303e43f3491f692cbbd87f432ba3e09 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, distribution-scope=public, release=1755695350, io.openshift.tags=minimal rhel9, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=openstack_network_exporter, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., version=9.6, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'cb1e9478634a00c8fb5ad9cd7fcd38c7b2a1a85187dfaa618bc715c24c2b15fb'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.display-name=Red Hat Universal Base Image 9 
Minimal, vcs-type=git, build-date=2025-08-20T13:12:41, io.buildah.version=1.33.7, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., architecture=x86_64, managed_by=edpm_ansible, name=ubi9-minimal, url=https://catalog.redhat.com/en/search?searchType=containers, vendor=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, container_name=openstack_network_exporter, maintainer=Red Hat, Inc.) 
Dec 15 04:44:41 localhost podman[285332]: 2025-12-15 09:44:41.788706895 +0000 UTC m=+0.119930939 container exec_died 730c50020a7d9e2da21d2462d40af20ac303e43f3491f692cbbd87f432ba3e09 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, release=1755695350, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'cb1e9478634a00c8fb5ad9cd7fcd38c7b2a1a85187dfaa618bc715c24c2b15fb'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.buildah.version=1.33.7, io.openshift.expose-services=, name=ubi9-minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., distribution-scope=public, vendor=Red Hat, Inc., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, maintainer=Red Hat, Inc., container_name=openstack_network_exporter, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vcs-type=git, build-date=2025-08-20T13:12:41, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=openstack_network_exporter, architecture=x86_64, com.redhat.component=ubi9-minimal-container, version=9.6, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal) Dec 15 04:44:41 localhost systemd[1]: 730c50020a7d9e2da21d2462d40af20ac303e43f3491f692cbbd87f432ba3e09.service: Deactivated successfully. 
Dec 15 04:44:42 localhost python3.9[285463]: ansible-file Invoked with path=/etc/systemd/system/edpm_nova_compute.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Dec 15 04:44:43 localhost python3.9[285572]: ansible-copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1765791882.5073607-3841-15821684824614/source dest=/etc/systemd/system/edpm_nova_compute.service mode=0644 owner=root group=root backup=False force=True remote_src=False follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None Dec 15 04:44:43 localhost systemd[1]: Started /usr/bin/podman healthcheck run ac2eae30a2a7f971398af42ea67b3ed799d4b3341f7688ebcd73f7ca7f66c2a8. Dec 15 04:44:43 localhost systemd[1]: tmp-crun.2tw3yE.mount: Deactivated successfully. 
Dec 15 04:44:43 localhost podman[285628]: 2025-12-15 09:44:43.479258995 +0000 UTC m=+0.093481592 container health_status ac2eae30a2a7f971398af42ea67b3ed799d4b3341f7688ebcd73f7ca7f66c2a8 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, org.label-schema.schema-version=1.0, container_name=ovn_controller, io.buildah.version=1.41.3, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202, managed_by=edpm_ansible, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '07f3af15d79276418f54a8dde2fc251064527c3571430e14af77aaa2c01f1193'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image) Dec 15 04:44:43 localhost podman[285628]: 2025-12-15 09:44:43.521325759 +0000 UTC m=+0.135548366 container exec_died ac2eae30a2a7f971398af42ea67b3ed799d4b3341f7688ebcd73f7ca7f66c2a8 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, tcib_managed=true, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, 
container_name=ovn_controller, io.buildah.version=1.41.3, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '07f3af15d79276418f54a8dde2fc251064527c3571430e14af77aaa2c01f1193'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2) Dec 15 04:44:43 localhost systemd[1]: ac2eae30a2a7f971398af42ea67b3ed799d4b3341f7688ebcd73f7ca7f66c2a8.service: Deactivated successfully. 
Dec 15 04:44:43 localhost python3.9[285627]: ansible-systemd Invoked with state=started name=edpm_nova_compute.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None Dec 15 04:44:43 localhost nova_compute[231752]: 2025-12-15 09:44:43.702 231756 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 04:44:43 localhost nova_compute[231752]: 2025-12-15 09:44:43.704 231756 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 04:44:44 localhost python3.9[285763]: ansible-ansible.builtin.stat Invoked with path=/etc/systemd/system/edpm_nova_nvme_cleaner_healthcheck.service follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1 Dec 15 04:44:45 localhost python3.9[285871]: ansible-ansible.builtin.stat Invoked with path=/etc/systemd/system/edpm_nova_nvme_cleaner.service follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1 Dec 15 04:44:46 localhost python3.9[285979]: ansible-ansible.builtin.stat Invoked with path=/etc/systemd/system/edpm_nova_nvme_cleaner.service.requires follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1 Dec 15 04:44:47 localhost python3.9[286089]: ansible-containers.podman.podman_container Invoked with name=nova_nvme_cleaner state=absent executable=podman detach=True debug=False force_restart=False force_delete=True generate_systemd={} image_strict=False recreate=False image=None annotation=None arch=None attach=None authfile=None blkio_weight=None blkio_weight_device=None cap_add=None cap_drop=None cgroup_conf=None cgroup_parent=None cgroupns=None cgroups=None chrootdirs=None cidfile=None cmd_args=None conmon_pidfile=None command=None cpu_period=None 
cpu_quota=None cpu_rt_period=None cpu_rt_runtime=None cpu_shares=None cpus=None cpuset_cpus=None cpuset_mems=None decryption_key=None delete_depend=None delete_time=None delete_volumes=None detach_keys=None device=None device_cgroup_rule=None device_read_bps=None device_read_iops=None device_write_bps=None device_write_iops=None dns=None dns_option=None dns_search=None entrypoint=None env=None env_file=None env_host=None env_merge=None etc_hosts=None expose=None gidmap=None gpus=None group_add=None group_entry=None healthcheck=None healthcheck_interval=None healthcheck_retries=None healthcheck_start_period=None health_startup_cmd=None health_startup_interval=None health_startup_retries=None health_startup_success=None health_startup_timeout=None healthcheck_timeout=None healthcheck_failure_action=None hooks_dir=None hostname=None hostuser=None http_proxy=None image_volume=None init=None init_ctr=None init_path=None interactive=None ip=None ip6=None ipc=None kernel_memory=None label=None label_file=None log_driver=None log_level=None log_opt=None mac_address=None memory=None memory_reservation=None memory_swap=None memory_swappiness=None mount=None network=None network_aliases=None no_healthcheck=None no_hosts=None oom_kill_disable=None oom_score_adj=None os=None passwd=None passwd_entry=None personality=None pid=None pid_file=None pids_limit=None platform=None pod=None pod_id_file=None preserve_fd=None preserve_fds=None privileged=None publish=None publish_all=None pull=None quadlet_dir=None quadlet_filename=None quadlet_file_mode=None quadlet_options=None rdt_class=None read_only=None read_only_tmpfs=None requires=None restart_policy=None restart_time=None retry=None retry_delay=None rm=None rmi=None rootfs=None seccomp_policy=None secrets=NOT_LOGGING_PARAMETER sdnotify=None security_opt=None shm_size=None shm_size_systemd=None sig_proxy=None stop_signal=None stop_timeout=None stop_time=None subgidname=None subuidname=None sysctl=None systemd=None timeout=None 
timezone=None tls_verify=None tmpfs=None tty=None uidmap=None ulimit=None umask=None unsetenv=None unsetenv_all=None user=None userns=None uts=None variant=None volume=None volumes_from=None workdir=None Dec 15 04:44:48 localhost nova_compute[231752]: 2025-12-15 09:44:48.707 231756 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Dec 15 04:44:48 localhost systemd-journald[47230]: Field hash table of /run/log/journal/738a39f68bc78fb81032e509449fb759/system.journal has a fill level at 119.2 (397 of 333 items), suggesting rotation. Dec 15 04:44:48 localhost systemd-journald[47230]: /run/log/journal/738a39f68bc78fb81032e509449fb759/system.journal: Journal header limits reached or header out-of-date, rotating. Dec 15 04:44:48 localhost nova_compute[231752]: 2025-12-15 09:44:48.708 231756 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Dec 15 04:44:48 localhost nova_compute[231752]: 2025-12-15 09:44:48.708 231756 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m Dec 15 04:44:48 localhost nova_compute[231752]: 2025-12-15 09:44:48.708 231756 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Dec 15 04:44:48 localhost rsyslogd[759]: imjournal: journal files changed, reloading... 
[v8.2102.0-111.el9 try https://www.rsyslog.com/e/0 ] Dec 15 04:44:48 localhost nova_compute[231752]: 2025-12-15 09:44:48.735 231756 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 04:44:48 localhost nova_compute[231752]: 2025-12-15 09:44:48.735 231756 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Dec 15 04:44:48 localhost rsyslogd[759]: imjournal: journal files changed, reloading... [v8.2102.0-111.el9 try https://www.rsyslog.com/e/0 ] Dec 15 04:44:48 localhost nova_compute[231752]: 2025-12-15 09:44:48.947 231756 DEBUG oslo_service.periodic_task [None req-c7fd4a3f-c95b-4936-a071-10d43fe8d19c - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Dec 15 04:44:49 localhost systemd[1]: Started /usr/bin/podman healthcheck run 4d46c406a3855fd1c322e5d86c048188ea5865daa07ba23aaebacc7879668923. 
Dec 15 04:44:49 localhost podman[286224]: 2025-12-15 09:44:49.704213787 +0000 UTC m=+0.067846345 container health_status 4d46c406a3855fd1c322e5d86c048188ea5865daa07ba23aaebacc7879668923 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '07f3af15d79276418f54a8dde2fc251064527c3571430e14af77aaa2c01f1193-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.build-date=20251202) Dec 15 04:44:49 localhost 
podman[286224]: 2025-12-15 09:44:49.73725651 +0000 UTC m=+0.100889048 container exec_died 4d46c406a3855fd1c322e5d86c048188ea5865daa07ba23aaebacc7879668923 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.license=GPLv2, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ovn_metadata_agent, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '07f3af15d79276418f54a8dde2fc251064527c3571430e14af77aaa2c01f1193-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, container_name=ovn_metadata_agent, io.buildah.version=1.41.3) Dec 15 04:44:49 localhost systemd[1]: 
4d46c406a3855fd1c322e5d86c048188ea5865daa07ba23aaebacc7879668923.service: Deactivated successfully. Dec 15 04:44:49 localhost python3.9[286223]: ansible-ansible.builtin.systemd Invoked with name=edpm_nova_compute.service state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None Dec 15 04:44:49 localhost nova_compute[231752]: 2025-12-15 09:44:49.950 231756 DEBUG oslo_service.periodic_task [None req-c7fd4a3f-c95b-4936-a071-10d43fe8d19c - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Dec 15 04:44:50 localhost nova_compute[231752]: 2025-12-15 09:44:50.951 231756 DEBUG oslo_service.periodic_task [None req-c7fd4a3f-c95b-4936-a071-10d43fe8d19c - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Dec 15 04:44:50 localhost nova_compute[231752]: 2025-12-15 09:44:50.952 231756 DEBUG nova.compute.manager [None req-c7fd4a3f-c95b-4936-a071-10d43fe8d19c - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m Dec 15 04:44:50 localhost nova_compute[231752]: 2025-12-15 09:44:50.952 231756 DEBUG nova.compute.manager [None req-c7fd4a3f-c95b-4936-a071-10d43fe8d19c - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m Dec 15 04:44:50 localhost systemd[1]: Stopping nova_compute container... 
Dec 15 04:44:51 localhost nova_compute[231752]: 2025-12-15 09:44:51.049 231756 DEBUG oslo_privsep.comm [-] EOF on privsep read channel _reader_main /usr/lib/python3.9/site-packages/oslo_privsep/comm.py:170#033[00m Dec 15 04:44:51 localhost ovn_metadata_agent[160585]: 2025-12-15 09:44:51.460 160590 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Dec 15 04:44:51 localhost ovn_metadata_agent[160585]: 2025-12-15 09:44:51.460 160590 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Dec 15 04:44:51 localhost ovn_metadata_agent[160585]: 2025-12-15 09:44:51.462 160590 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Dec 15 04:44:51 localhost nova_compute[231752]: 2025-12-15 09:44:51.613 231756 DEBUG oslo_concurrency.lockutils [None req-c7fd4a3f-c95b-4936-a071-10d43fe8d19c - - - - - -] Acquiring lock "refresh_cache-39ff1bd9-6f6b-44c8-bbec-a1fd9d196359" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m Dec 15 04:44:51 localhost nova_compute[231752]: 2025-12-15 09:44:51.613 231756 DEBUG oslo_concurrency.lockutils [None req-c7fd4a3f-c95b-4936-a071-10d43fe8d19c - - - - - -] Acquired lock "refresh_cache-39ff1bd9-6f6b-44c8-bbec-a1fd9d196359" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m Dec 15 04:44:51 localhost nova_compute[231752]: 2025-12-15 09:44:51.613 231756 DEBUG nova.network.neutron [None 
req-c7fd4a3f-c95b-4936-a071-10d43fe8d19c - - - - - -] [instance: 39ff1bd9-6f6b-44c8-bbec-a1fd9d196359] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m Dec 15 04:44:51 localhost nova_compute[231752]: 2025-12-15 09:44:51.613 231756 DEBUG nova.objects.instance [None req-c7fd4a3f-c95b-4936-a071-10d43fe8d19c - - - - - -] Lazy-loading 'info_cache' on Instance uuid 39ff1bd9-6f6b-44c8-bbec-a1fd9d196359 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m Dec 15 04:44:52 localhost nova_compute[231752]: 2025-12-15 09:44:52.022 231756 DEBUG nova.network.neutron [None req-c7fd4a3f-c95b-4936-a071-10d43fe8d19c - - - - - -] [instance: 39ff1bd9-6f6b-44c8-bbec-a1fd9d196359] Updating instance_info_cache with network_info: [{"id": "03ef8889-3216-43fb-8a52-4be17a956ce1", "address": "fa:16:3e:74:df:7c", "network": {"id": "befb7a72-17a9-4bcb-b561-84b8f626685a", "bridge": "br-int", "label": "private", "subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.201", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.20", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.0.3"}}], "meta": {"injected": false, "tenant_id": "c785bf23f53946bc99867d8832a50266", "mtu": 1292, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap03ef8889-32", "ovs_interfaceid": "03ef8889-3216-43fb-8a52-4be17a956ce1", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info 
/usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m Dec 15 04:44:52 localhost nova_compute[231752]: 2025-12-15 09:44:52.048 231756 DEBUG oslo_concurrency.lockutils [None req-c7fd4a3f-c95b-4936-a071-10d43fe8d19c - - - - - -] Releasing lock "refresh_cache-39ff1bd9-6f6b-44c8-bbec-a1fd9d196359" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m Dec 15 04:44:52 localhost nova_compute[231752]: 2025-12-15 09:44:52.049 231756 DEBUG nova.compute.manager [None req-c7fd4a3f-c95b-4936-a071-10d43fe8d19c - - - - - -] [instance: 39ff1bd9-6f6b-44c8-bbec-a1fd9d196359] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m Dec 15 04:44:52 localhost nova_compute[231752]: 2025-12-15 09:44:52.049 231756 DEBUG oslo_service.periodic_task [None req-c7fd4a3f-c95b-4936-a071-10d43fe8d19c - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Dec 15 04:44:52 localhost nova_compute[231752]: 2025-12-15 09:44:52.050 231756 DEBUG oslo_service.periodic_task [None req-c7fd4a3f-c95b-4936-a071-10d43fe8d19c - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Dec 15 04:44:52 localhost nova_compute[231752]: 2025-12-15 09:44:52.065 231756 DEBUG oslo_concurrency.lockutils [None req-c7fd4a3f-c95b-4936-a071-10d43fe8d19c - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Dec 15 04:44:52 localhost nova_compute[231752]: 2025-12-15 09:44:52.065 231756 DEBUG oslo_concurrency.lockutils [None req-c7fd4a3f-c95b-4936-a071-10d43fe8d19c - - - - - -] Lock "compute_resources" acquired by 
"nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Dec 15 04:44:52 localhost nova_compute[231752]: 2025-12-15 09:44:52.066 231756 DEBUG oslo_concurrency.lockutils [None req-c7fd4a3f-c95b-4936-a071-10d43fe8d19c - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Dec 15 04:44:52 localhost nova_compute[231752]: 2025-12-15 09:44:52.066 231756 DEBUG nova.compute.resource_tracker [None req-c7fd4a3f-c95b-4936-a071-10d43fe8d19c - - - - - -] Auditing locally available compute resources for np0005559462.localdomain (node: np0005559462.localdomain) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m Dec 15 04:44:52 localhost nova_compute[231752]: 2025-12-15 09:44:52.067 231756 DEBUG oslo_concurrency.processutils [None req-c7fd4a3f-c95b-4936-a071-10d43fe8d19c - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m Dec 15 04:44:52 localhost nova_compute[231752]: 2025-12-15 09:44:52.507 231756 DEBUG oslo_concurrency.processutils [None req-c7fd4a3f-c95b-4936-a071-10d43fe8d19c - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.440s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m Dec 15 04:44:52 localhost nova_compute[231752]: 2025-12-15 09:44:52.581 231756 DEBUG nova.virt.libvirt.driver [None req-c7fd4a3f-c95b-4936-a071-10d43fe8d19c - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m Dec 15 
04:44:52 localhost nova_compute[231752]: 2025-12-15 09:44:52.582 231756 DEBUG nova.virt.libvirt.driver [None req-c7fd4a3f-c95b-4936-a071-10d43fe8d19c - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m Dec 15 04:44:52 localhost nova_compute[231752]: 2025-12-15 09:44:52.811 231756 WARNING nova.virt.libvirt.driver [None req-c7fd4a3f-c95b-4936-a071-10d43fe8d19c - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m Dec 15 04:44:52 localhost nova_compute[231752]: 2025-12-15 09:44:52.812 231756 DEBUG nova.compute.resource_tracker [None req-c7fd4a3f-c95b-4936-a071-10d43fe8d19c - - - - - -] Hypervisor/Node resource view: name=np0005559462.localdomain free_ram=12133MB free_disk=41.83720779418945GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": 
"pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m Dec 15 04:44:52 localhost nova_compute[231752]: 2025-12-15 09:44:52.813 231756 DEBUG oslo_concurrency.lockutils [None req-c7fd4a3f-c95b-4936-a071-10d43fe8d19c - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Dec 15 04:44:52 localhost nova_compute[231752]: 2025-12-15 09:44:52.813 231756 DEBUG oslo_concurrency.lockutils [None req-c7fd4a3f-c95b-4936-a071-10d43fe8d19c - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Dec 15 04:44:52 localhost nova_compute[231752]: 2025-12-15 09:44:52.872 231756 DEBUG nova.compute.resource_tracker [None req-c7fd4a3f-c95b-4936-a071-10d43fe8d19c - - - - - -] Instance 39ff1bd9-6f6b-44c8-bbec-a1fd9d196359 actively managed on this compute host and has allocations in 
placement: {'resources': {'DISK_GB': 2, 'MEMORY_MB': 512, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m Dec 15 04:44:52 localhost nova_compute[231752]: 2025-12-15 09:44:52.873 231756 DEBUG nova.compute.resource_tracker [None req-c7fd4a3f-c95b-4936-a071-10d43fe8d19c - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m Dec 15 04:44:52 localhost nova_compute[231752]: 2025-12-15 09:44:52.873 231756 DEBUG nova.compute.resource_tracker [None req-c7fd4a3f-c95b-4936-a071-10d43fe8d19c - - - - - -] Final resource view: name=np0005559462.localdomain phys_ram=15738MB used_ram=1024MB phys_disk=41GB used_disk=2GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m Dec 15 04:44:52 localhost nova_compute[231752]: 2025-12-15 09:44:52.914 231756 DEBUG oslo_concurrency.processutils [None req-c7fd4a3f-c95b-4936-a071-10d43fe8d19c - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m Dec 15 04:44:53 localhost nova_compute[231752]: 2025-12-15 09:44:53.421 231756 DEBUG oslo_concurrency.processutils [None req-c7fd4a3f-c95b-4936-a071-10d43fe8d19c - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.507s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m Dec 15 04:44:53 localhost nova_compute[231752]: 2025-12-15 09:44:53.427 231756 DEBUG nova.compute.provider_tree [None req-c7fd4a3f-c95b-4936-a071-10d43fe8d19c - - - - - -] Inventory has not changed in ProviderTree for provider: 26c8956b-6742-4951-b566-971b9bbe323b update_inventory 
/usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m Dec 15 04:44:53 localhost nova_compute[231752]: 2025-12-15 09:44:53.445 231756 DEBUG nova.scheduler.client.report [None req-c7fd4a3f-c95b-4936-a071-10d43fe8d19c - - - - - -] Inventory has not changed for provider 26c8956b-6742-4951-b566-971b9bbe323b based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m Dec 15 04:44:53 localhost nova_compute[231752]: 2025-12-15 09:44:53.448 231756 DEBUG nova.compute.resource_tracker [None req-c7fd4a3f-c95b-4936-a071-10d43fe8d19c - - - - - -] Compute_service record updated for np0005559462.localdomain:np0005559462.localdomain _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m Dec 15 04:44:53 localhost nova_compute[231752]: 2025-12-15 09:44:53.448 231756 DEBUG oslo_concurrency.lockutils [None req-c7fd4a3f-c95b-4936-a071-10d43fe8d19c - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.635s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Dec 15 04:44:53 localhost nova_compute[231752]: 2025-12-15 09:44:53.736 231756 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Dec 15 04:44:53 localhost nova_compute[231752]: 2025-12-15 09:44:53.737 231756 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 04:44:53 localhost 
nova_compute[231752]: 2025-12-15 09:44:53.738 231756 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m Dec 15 04:44:53 localhost nova_compute[231752]: 2025-12-15 09:44:53.738 231756 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Dec 15 04:44:53 localhost nova_compute[231752]: 2025-12-15 09:44:53.739 231756 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Dec 15 04:44:53 localhost nova_compute[231752]: 2025-12-15 09:44:53.742 231756 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 04:44:54 localhost nova_compute[231752]: 2025-12-15 09:44:54.350 231756 DEBUG oslo_service.periodic_task [None req-c7fd4a3f-c95b-4936-a071-10d43fe8d19c - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Dec 15 04:44:54 localhost nova_compute[231752]: 2025-12-15 09:44:54.351 231756 DEBUG oslo_service.periodic_task [None req-c7fd4a3f-c95b-4936-a071-10d43fe8d19c - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Dec 15 04:44:54 localhost nova_compute[231752]: 2025-12-15 09:44:54.351 231756 DEBUG oslo_service.periodic_task [None req-c7fd4a3f-c95b-4936-a071-10d43fe8d19c - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Dec 15 04:44:54 localhost nova_compute[231752]: 2025-12-15 09:44:54.352 231756 DEBUG oslo_service.periodic_task [None 
req-c7fd4a3f-c95b-4936-a071-10d43fe8d19c - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Dec 15 04:44:54 localhost nova_compute[231752]: 2025-12-15 09:44:54.352 231756 DEBUG nova.compute.manager [None req-c7fd4a3f-c95b-4936-a071-10d43fe8d19c - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m Dec 15 04:44:54 localhost nova_compute[231752]: 2025-12-15 09:44:54.607 231756 WARNING amqp [-] Received method (60, 30) during closing channel 1. This method will be ignored#033[00m Dec 15 04:44:54 localhost nova_compute[231752]: 2025-12-15 09:44:54.609 231756 DEBUG oslo_concurrency.lockutils [None req-c9759d09-8117-4682-aa2d-922d8f1f5fc3 - - - - - -] Acquiring lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m Dec 15 04:44:54 localhost nova_compute[231752]: 2025-12-15 09:44:54.610 231756 DEBUG oslo_concurrency.lockutils [None req-c9759d09-8117-4682-aa2d-922d8f1f5fc3 - - - - - -] Acquired lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m Dec 15 04:44:54 localhost nova_compute[231752]: 2025-12-15 09:44:54.610 231756 DEBUG oslo_concurrency.lockutils [None req-c9759d09-8117-4682-aa2d-922d8f1f5fc3 - - - - - -] Releasing lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m Dec 15 04:44:54 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:fc:1f:13 MACDST=fa:16:3e:b3:bb:e3 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=23882 DF PROTO=TCP SPT=49422 DPT=9102 SEQ=2260569640 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A83A852720000000001030307) Dec 15 04:44:54 localhost journal[204381]: End of file while reading data: Input/output error Dec 15 04:44:54 
localhost systemd[1]: libpod-b5d335c872cd511e4f9ef497ade55685828922bcddd4d43ed3b0589e6eb54c6a.scope: Deactivated successfully. Dec 15 04:44:54 localhost systemd[1]: libpod-b5d335c872cd511e4f9ef497ade55685828922bcddd4d43ed3b0589e6eb54c6a.scope: Consumed 20.749s CPU time. Dec 15 04:44:54 localhost podman[286245]: 2025-12-15 09:44:54.957580579 +0000 UTC m=+3.975249141 container died b5d335c872cd511e4f9ef497ade55685828922bcddd4d43ed3b0589e6eb54c6a (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'pid': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']}, io.buildah.version=1.41.3, tcib_managed=true, config_id=edpm, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, container_name=nova_compute, managed_by=edpm_ansible, org.label-schema.vendor=CentOS) Dec 15 04:44:54 localhost systemd[1]: tmp-crun.ruaay2.mount: Deactivated successfully. 
Dec 15 04:44:55 localhost podman[286245]: 2025-12-15 09:44:55.119607354 +0000 UTC m=+4.137275896 container cleanup b5d335c872cd511e4f9ef497ade55685828922bcddd4d43ed3b0589e6eb54c6a (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute, container_name=nova_compute, org.label-schema.license=GPLv2, config_id=edpm, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'pid': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']}, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, managed_by=edpm_ansible) Dec 15 04:44:55 localhost podman[286245]: nova_compute Dec 15 04:44:55 localhost podman[286328]: error opening file `/run/crun/b5d335c872cd511e4f9ef497ade55685828922bcddd4d43ed3b0589e6eb54c6a/status`: No such file or directory Dec 15 04:44:55 localhost podman[286316]: 2025-12-15 09:44:55.219497574 +0000 UTC m=+0.068111804 container cleanup 
b5d335c872cd511e4f9ef497ade55685828922bcddd4d43ed3b0589e6eb54c6a (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute, org.label-schema.schema-version=1.0, config_id=edpm, container_name=nova_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'pid': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']}, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible) Dec 15 04:44:55 localhost podman[286316]: nova_compute Dec 15 04:44:55 localhost systemd[1]: edpm_nova_compute.service: Deactivated successfully. Dec 15 04:44:55 localhost systemd[1]: Stopped nova_compute container. Dec 15 04:44:55 localhost systemd[1]: Starting nova_compute container... Dec 15 04:44:55 localhost systemd[1]: Started libcrun container. 
Dec 15 04:44:55 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/90c3d76968c2380582c34ed47ce3bd3c1017e3f0027bb2b60c7c3fb4e76ff2b1/merged/etc/multipath supports timestamps until 2038 (0x7fffffff) Dec 15 04:44:55 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/90c3d76968c2380582c34ed47ce3bd3c1017e3f0027bb2b60c7c3fb4e76ff2b1/merged/etc/nvme supports timestamps until 2038 (0x7fffffff) Dec 15 04:44:55 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/90c3d76968c2380582c34ed47ce3bd3c1017e3f0027bb2b60c7c3fb4e76ff2b1/merged/var/lib/iscsi supports timestamps until 2038 (0x7fffffff) Dec 15 04:44:55 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/90c3d76968c2380582c34ed47ce3bd3c1017e3f0027bb2b60c7c3fb4e76ff2b1/merged/var/lib/nova supports timestamps until 2038 (0x7fffffff) Dec 15 04:44:55 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/90c3d76968c2380582c34ed47ce3bd3c1017e3f0027bb2b60c7c3fb4e76ff2b1/merged/var/lib/libvirt supports timestamps until 2038 (0x7fffffff) Dec 15 04:44:55 localhost podman[286330]: 2025-12-15 09:44:55.366736495 +0000 UTC m=+0.119641291 container init b5d335c872cd511e4f9ef497ade55685828922bcddd4d43ed3b0589e6eb54c6a (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute, managed_by=edpm_ansible, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'pid': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', 
'/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']}, config_id=edpm, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, container_name=nova_compute) Dec 15 04:44:55 localhost podman[286330]: 2025-12-15 09:44:55.37517234 +0000 UTC m=+0.128077146 container start b5d335c872cd511e4f9ef497ade55685828922bcddd4d43ed3b0589e6eb54c6a (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'pid': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', 
'/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']}, config_id=edpm, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, container_name=nova_compute, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image) Dec 15 04:44:55 localhost podman[286330]: nova_compute Dec 15 04:44:55 localhost nova_compute[286344]: + sudo -E kolla_set_configs Dec 15 04:44:55 localhost systemd[1]: Started nova_compute container. Dec 15 04:44:55 localhost nova_compute[286344]: INFO:__main__:Loading config file at /var/lib/kolla/config_files/config.json Dec 15 04:44:55 localhost nova_compute[286344]: INFO:__main__:Validating config file Dec 15 04:44:55 localhost nova_compute[286344]: INFO:__main__:Kolla config strategy set to: COPY_ALWAYS Dec 15 04:44:55 localhost nova_compute[286344]: INFO:__main__:Copying service configuration files Dec 15 04:44:55 localhost nova_compute[286344]: INFO:__main__:Deleting /etc/nova/nova.conf Dec 15 04:44:55 localhost nova_compute[286344]: INFO:__main__:Copying /var/lib/kolla/config_files/nova-blank.conf to /etc/nova/nova.conf Dec 15 04:44:55 localhost nova_compute[286344]: INFO:__main__:Setting permission for /etc/nova/nova.conf Dec 15 04:44:55 localhost nova_compute[286344]: INFO:__main__:Deleting /etc/nova/nova.conf.d/01-nova.conf Dec 15 04:44:55 localhost nova_compute[286344]: INFO:__main__:Copying /var/lib/kolla/config_files/01-nova.conf to /etc/nova/nova.conf.d/01-nova.conf Dec 15 04:44:55 localhost nova_compute[286344]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/01-nova.conf Dec 15 04:44:55 localhost nova_compute[286344]: INFO:__main__:Deleting /etc/nova/nova.conf.d/03-ceph-nova.conf Dec 15 04:44:55 localhost 
nova_compute[286344]: INFO:__main__:Copying /var/lib/kolla/config_files/03-ceph-nova.conf to /etc/nova/nova.conf.d/03-ceph-nova.conf Dec 15 04:44:55 localhost nova_compute[286344]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/03-ceph-nova.conf Dec 15 04:44:55 localhost nova_compute[286344]: INFO:__main__:Deleting /etc/nova/nova.conf.d/99-nova-compute-cells-workarounds.conf Dec 15 04:44:55 localhost nova_compute[286344]: INFO:__main__:Copying /var/lib/kolla/config_files/99-nova-compute-cells-workarounds.conf to /etc/nova/nova.conf.d/99-nova-compute-cells-workarounds.conf Dec 15 04:44:55 localhost nova_compute[286344]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/99-nova-compute-cells-workarounds.conf Dec 15 04:44:55 localhost nova_compute[286344]: INFO:__main__:Deleting /etc/nova/nova.conf.d/nova-blank.conf Dec 15 04:44:55 localhost nova_compute[286344]: INFO:__main__:Copying /var/lib/kolla/config_files/nova-blank.conf to /etc/nova/nova.conf.d/nova-blank.conf Dec 15 04:44:55 localhost nova_compute[286344]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/nova-blank.conf Dec 15 04:44:55 localhost nova_compute[286344]: INFO:__main__:Deleting /etc/nova/nova.conf.d/02-nova-host-specific.conf Dec 15 04:44:55 localhost nova_compute[286344]: INFO:__main__:Copying /var/lib/kolla/config_files/02-nova-host-specific.conf to /etc/nova/nova.conf.d/02-nova-host-specific.conf Dec 15 04:44:55 localhost nova_compute[286344]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/02-nova-host-specific.conf Dec 15 04:44:55 localhost nova_compute[286344]: INFO:__main__:Deleting /etc/ceph Dec 15 04:44:55 localhost nova_compute[286344]: INFO:__main__:Creating directory /etc/ceph Dec 15 04:44:55 localhost nova_compute[286344]: INFO:__main__:Setting permission for /etc/ceph Dec 15 04:44:55 localhost nova_compute[286344]: INFO:__main__:Copying /var/lib/kolla/config_files/ceph/ceph.conf to /etc/ceph/ceph.conf Dec 15 04:44:55 localhost 
nova_compute[286344]: INFO:__main__:Setting permission for /etc/ceph/ceph.conf Dec 15 04:44:55 localhost nova_compute[286344]: INFO:__main__:Copying /var/lib/kolla/config_files/ceph/ceph.client.openstack.keyring to /etc/ceph/ceph.client.openstack.keyring Dec 15 04:44:55 localhost nova_compute[286344]: INFO:__main__:Setting permission for /etc/ceph/ceph.client.openstack.keyring Dec 15 04:44:55 localhost nova_compute[286344]: INFO:__main__:Deleting /var/lib/nova/.ssh/ssh-privatekey Dec 15 04:44:55 localhost nova_compute[286344]: INFO:__main__:Copying /var/lib/kolla/config_files/ssh-privatekey to /var/lib/nova/.ssh/ssh-privatekey Dec 15 04:44:55 localhost nova_compute[286344]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/ssh-privatekey Dec 15 04:44:55 localhost nova_compute[286344]: INFO:__main__:Deleting /var/lib/nova/.ssh/config Dec 15 04:44:55 localhost nova_compute[286344]: INFO:__main__:Copying /var/lib/kolla/config_files/ssh-config to /var/lib/nova/.ssh/config Dec 15 04:44:55 localhost nova_compute[286344]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/config Dec 15 04:44:55 localhost nova_compute[286344]: INFO:__main__:Deleting /usr/sbin/iscsiadm Dec 15 04:44:55 localhost nova_compute[286344]: INFO:__main__:Copying /var/lib/kolla/config_files/run-on-host to /usr/sbin/iscsiadm Dec 15 04:44:55 localhost nova_compute[286344]: INFO:__main__:Setting permission for /usr/sbin/iscsiadm Dec 15 04:44:55 localhost nova_compute[286344]: INFO:__main__:Writing out command to execute Dec 15 04:44:55 localhost nova_compute[286344]: INFO:__main__:Setting permission for /etc/ceph/ceph.conf Dec 15 04:44:55 localhost nova_compute[286344]: INFO:__main__:Setting permission for /etc/ceph/ceph.client.openstack.keyring Dec 15 04:44:55 localhost nova_compute[286344]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/ Dec 15 04:44:55 localhost nova_compute[286344]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/ssh-privatekey Dec 15 04:44:55 localhost 
nova_compute[286344]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/config Dec 15 04:44:55 localhost nova_compute[286344]: ++ cat /run_command Dec 15 04:44:55 localhost nova_compute[286344]: + CMD=nova-compute Dec 15 04:44:55 localhost nova_compute[286344]: + ARGS= Dec 15 04:44:55 localhost nova_compute[286344]: + sudo kolla_copy_cacerts Dec 15 04:44:55 localhost nova_compute[286344]: + [[ ! -n '' ]] Dec 15 04:44:55 localhost nova_compute[286344]: + . kolla_extend_start Dec 15 04:44:55 localhost nova_compute[286344]: + echo 'Running command: '\''nova-compute'\''' Dec 15 04:44:55 localhost nova_compute[286344]: Running command: 'nova-compute' Dec 15 04:44:55 localhost nova_compute[286344]: + umask 0022 Dec 15 04:44:55 localhost nova_compute[286344]: + exec nova-compute Dec 15 04:44:55 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:fc:1f:13 MACDST=fa:16:3e:b3:bb:e3 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=23883 DF PROTO=TCP SPT=49422 DPT=9102 SEQ=2260569640 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A83A856660000000001030307) Dec 15 04:44:55 localhost systemd[1]: Started /usr/bin/podman healthcheck run a4e94bd7aaee6c174c601060fb5f34559019ccea93fc397263ee3735731cfc1e. 
Dec 15 04:44:56 localhost podman[286465]: 2025-12-15 09:44:56.010720618 +0000 UTC m=+0.086592639 container health_status a4e94bd7aaee6c174c601060fb5f34559019ccea93fc397263ee3735731cfc1e (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}) Dec 15 04:44:56 localhost podman[286465]: 2025-12-15 09:44:56.018533057 +0000 UTC m=+0.094405108 container exec_died a4e94bd7aaee6c174c601060fb5f34559019ccea93fc397263ee3735731cfc1e (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': 
['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter) Dec 15 04:44:56 localhost systemd[1]: a4e94bd7aaee6c174c601060fb5f34559019ccea93fc397263ee3735731cfc1e.service: Deactivated successfully. Dec 15 04:44:56 localhost python3.9[286471]: ansible-containers.podman.podman_container Invoked with name=nova_compute_init state=started executable=podman detach=True debug=False force_restart=False force_delete=True generate_systemd={} image_strict=False recreate=False image=None annotation=None arch=None attach=None authfile=None blkio_weight=None blkio_weight_device=None cap_add=None cap_drop=None cgroup_conf=None cgroup_parent=None cgroupns=None cgroups=None chrootdirs=None cidfile=None cmd_args=None conmon_pidfile=None command=None cpu_period=None cpu_quota=None cpu_rt_period=None cpu_rt_runtime=None cpu_shares=None cpus=None cpuset_cpus=None cpuset_mems=None decryption_key=None delete_depend=None delete_time=None delete_volumes=None detach_keys=None device=None device_cgroup_rule=None device_read_bps=None device_read_iops=None device_write_bps=None device_write_iops=None dns=None dns_option=None dns_search=None entrypoint=None env=None env_file=None env_host=None env_merge=None etc_hosts=None expose=None gidmap=None gpus=None group_add=None group_entry=None healthcheck=None healthcheck_interval=None healthcheck_retries=None healthcheck_start_period=None health_startup_cmd=None health_startup_interval=None health_startup_retries=None health_startup_success=None health_startup_timeout=None healthcheck_timeout=None healthcheck_failure_action=None hooks_dir=None hostname=None hostuser=None http_proxy=None image_volume=None init=None init_ctr=None init_path=None interactive=None ip=None ip6=None ipc=None kernel_memory=None label=None label_file=None log_driver=None log_level=None log_opt=None mac_address=None memory=None memory_reservation=None 
memory_swap=None memory_swappiness=None mount=None network=None network_aliases=None no_healthcheck=None no_hosts=None oom_kill_disable=None oom_score_adj=None os=None passwd=None passwd_entry=None personality=None pid=None pid_file=None pids_limit=None platform=None pod=None pod_id_file=None preserve_fd=None preserve_fds=None privileged=None publish=None publish_all=None pull=None quadlet_dir=None quadlet_filename=None quadlet_file_mode=None quadlet_options=None rdt_class=None read_only=None read_only_tmpfs=None requires=None restart_policy=None restart_time=None retry=None retry_delay=None rm=None rmi=None rootfs=None seccomp_policy=None secrets=NOT_LOGGING_PARAMETER sdnotify=None security_opt=None shm_size=None shm_size_systemd=None sig_proxy=None stop_signal=None stop_timeout=None stop_time=None subgidname=None subuidname=None sysctl=None systemd=None timeout=None timezone=None tls_verify=None tmpfs=None tty=None uidmap=None ulimit=None umask=None unsetenv=None unsetenv_all=None user=None userns=None uts=None variant=None volume=None volumes_from=None workdir=None Dec 15 04:44:56 localhost systemd[1]: Started libpod-conmon-c86c0d58ed395af407f97b4ea9c613ee83c5b4d277ca4b706ab48762a26987aa.scope. Dec 15 04:44:56 localhost systemd[1]: Started libcrun container. 
Dec 15 04:44:56 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/5c523cc249966f9abaa8c86aeb9bc1f9fc67bbfacedf0d474f96681ec85a864c/merged/usr/sbin/nova_statedir_ownership.py supports timestamps until 2038 (0x7fffffff) Dec 15 04:44:56 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/5c523cc249966f9abaa8c86aeb9bc1f9fc67bbfacedf0d474f96681ec85a864c/merged/var/lib/nova supports timestamps until 2038 (0x7fffffff) Dec 15 04:44:56 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/5c523cc249966f9abaa8c86aeb9bc1f9fc67bbfacedf0d474f96681ec85a864c/merged/var/lib/_nova_secontext supports timestamps until 2038 (0x7fffffff) Dec 15 04:44:56 localhost podman[286512]: 2025-12-15 09:44:56.500069354 +0000 UTC m=+0.177613001 container init c86c0d58ed395af407f97b4ea9c613ee83c5b4d277ca4b706ab48762a26987aa (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute_init, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': False, 'user': 'root', 'restart': 'never', 'command': 'bash -c $* -- eval python3 /sbin/nova_statedir_ownership.py | logger -t nova_compute_init', 'net': 'none', 'security_opt': ['label=disable'], 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': '/var/lib/nova/compute_id', '__OS_DEBUG': False}, 'volumes': ['/dev/log:/dev/log', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', '/var/lib/openstack/config/nova/nova_statedir_ownership.py:/sbin/nova_statedir_ownership.py:z']}, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, config_id=edpm, 
container_name=nova_compute_init, managed_by=edpm_ansible, org.label-schema.vendor=CentOS) Dec 15 04:44:56 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:fc:1f:13 MACDST=fa:16:3e:b3:bb:e3 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=52587 DF PROTO=TCP SPT=53566 DPT=9102 SEQ=4061031728 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A83A859250000000001030307) Dec 15 04:44:56 localhost nova_compute_init[286532]: INFO:nova_statedir:Applying nova statedir ownership Dec 15 04:44:56 localhost nova_compute_init[286532]: INFO:nova_statedir:Target ownership for /var/lib/nova: 42436:42436 Dec 15 04:44:56 localhost nova_compute_init[286532]: INFO:nova_statedir:Checking uid: 1000 gid: 1000 path: /var/lib/nova/ Dec 15 04:44:56 localhost nova_compute_init[286532]: INFO:nova_statedir:Changing ownership of /var/lib/nova from 1000:1000 to 42436:42436 Dec 15 04:44:56 localhost nova_compute_init[286532]: INFO:nova_statedir:Setting selinux context of /var/lib/nova to system_u:object_r:container_file_t:s0 Dec 15 04:44:56 localhost nova_compute_init[286532]: INFO:nova_statedir:Checking uid: 1000 gid: 1000 path: /var/lib/nova/instances/ Dec 15 04:44:56 localhost nova_compute_init[286532]: INFO:nova_statedir:Changing ownership of /var/lib/nova/instances from 1000:1000 to 42436:42436 Dec 15 04:44:56 localhost nova_compute_init[286532]: INFO:nova_statedir:Setting selinux context of /var/lib/nova/instances to system_u:object_r:container_file_t:s0 Dec 15 04:44:56 localhost nova_compute_init[286532]: INFO:nova_statedir:Checking uid: 42436 gid: 42436 path: /var/lib/nova/instances/39ff1bd9-6f6b-44c8-bbec-a1fd9d196359/ Dec 15 04:44:56 localhost nova_compute_init[286532]: INFO:nova_statedir:Ownership of /var/lib/nova/instances/39ff1bd9-6f6b-44c8-bbec-a1fd9d196359 already 42436:42436 Dec 15 04:44:56 localhost nova_compute_init[286532]: INFO:nova_statedir:Setting selinux context of 
/var/lib/nova/instances/39ff1bd9-6f6b-44c8-bbec-a1fd9d196359 to system_u:object_r:container_file_t:s0 Dec 15 04:44:56 localhost nova_compute_init[286532]: INFO:nova_statedir:Checking uid: 0 gid: 0 path: /var/lib/nova/instances/39ff1bd9-6f6b-44c8-bbec-a1fd9d196359/console.log Dec 15 04:44:56 localhost nova_compute_init[286532]: INFO:nova_statedir:Checking uid: 42436 gid: 42436 path: /var/lib/nova/instances/_base/ Dec 15 04:44:56 localhost nova_compute_init[286532]: INFO:nova_statedir:Ownership of /var/lib/nova/instances/_base already 42436:42436 Dec 15 04:44:56 localhost nova_compute_init[286532]: INFO:nova_statedir:Setting selinux context of /var/lib/nova/instances/_base to system_u:object_r:container_file_t:s0 Dec 15 04:44:56 localhost nova_compute_init[286532]: INFO:nova_statedir:Checking uid: 42436 gid: 42436 path: /var/lib/nova/instances/_base/f51109af1d3e72d8fb41e75a49bf4f04de3202b1 Dec 15 04:44:56 localhost nova_compute_init[286532]: INFO:nova_statedir:Checking uid: 42436 gid: 42436 path: /var/lib/nova/instances/_base/ephemeral_1_0706d66 Dec 15 04:44:56 localhost nova_compute_init[286532]: INFO:nova_statedir:Checking uid: 42436 gid: 42436 path: /var/lib/nova/instances/locks/ Dec 15 04:44:56 localhost nova_compute_init[286532]: INFO:nova_statedir:Ownership of /var/lib/nova/instances/locks already 42436:42436 Dec 15 04:44:56 localhost nova_compute_init[286532]: INFO:nova_statedir:Setting selinux context of /var/lib/nova/instances/locks to system_u:object_r:container_file_t:s0 Dec 15 04:44:56 localhost nova_compute_init[286532]: INFO:nova_statedir:Checking uid: 42436 gid: 42436 path: /var/lib/nova/instances/locks/nova-f51109af1d3e72d8fb41e75a49bf4f04de3202b1 Dec 15 04:44:56 localhost nova_compute_init[286532]: INFO:nova_statedir:Checking uid: 42436 gid: 42436 path: /var/lib/nova/instances/locks/nova-ephemeral_1_0706d66 Dec 15 04:44:56 localhost nova_compute_init[286532]: INFO:nova_statedir:Checking uid: 0 gid: 0 path: /var/lib/nova/delay-nova-compute Dec 15 
04:44:56 localhost nova_compute_init[286532]: INFO:nova_statedir:Checking uid: 42436 gid: 42436 path: /var/lib/nova/.ssh/ Dec 15 04:44:56 localhost nova_compute_init[286532]: INFO:nova_statedir:Ownership of /var/lib/nova/.ssh already 42436:42436 Dec 15 04:44:56 localhost nova_compute_init[286532]: INFO:nova_statedir:Setting selinux context of /var/lib/nova/.ssh to system_u:object_r:container_file_t:s0 Dec 15 04:44:56 localhost nova_compute_init[286532]: INFO:nova_statedir:Checking uid: 42436 gid: 42436 path: /var/lib/nova/.ssh/ssh-privatekey Dec 15 04:44:56 localhost nova_compute_init[286532]: INFO:nova_statedir:Checking uid: 42436 gid: 42436 path: /var/lib/nova/.ssh/config Dec 15 04:44:56 localhost nova_compute_init[286532]: INFO:nova_statedir:Checking uid: 42436 gid: 42436 path: /var/lib/nova/.cache/ Dec 15 04:44:56 localhost nova_compute_init[286532]: INFO:nova_statedir:Ownership of /var/lib/nova/.cache already 42436:42436 Dec 15 04:44:56 localhost nova_compute_init[286532]: INFO:nova_statedir:Setting selinux context of /var/lib/nova/.cache to system_u:object_r:container_file_t:s0 Dec 15 04:44:56 localhost nova_compute_init[286532]: INFO:nova_statedir:Checking uid: 42436 gid: 42436 path: /var/lib/nova/.cache/python-entrypoints/ Dec 15 04:44:56 localhost nova_compute_init[286532]: INFO:nova_statedir:Ownership of /var/lib/nova/.cache/python-entrypoints already 42436:42436 Dec 15 04:44:56 localhost nova_compute_init[286532]: INFO:nova_statedir:Setting selinux context of /var/lib/nova/.cache/python-entrypoints to system_u:object_r:container_file_t:s0 Dec 15 04:44:56 localhost nova_compute_init[286532]: INFO:nova_statedir:Checking uid: 42436 gid: 42436 path: /var/lib/nova/.cache/python-entrypoints/b234715fc878456b41e32c4fbc669b417044dbe6c6684bbc9059e5c93396ffea Dec 15 04:44:56 localhost nova_compute_init[286532]: INFO:nova_statedir:Checking uid: 42436 gid: 42436 path: 
/var/lib/nova/.cache/python-entrypoints/16958d231615fa2e15154aac2f4371388ef8f2a8455c69ba0e5e08f2c33545f5 Dec 15 04:44:56 localhost nova_compute_init[286532]: INFO:nova_statedir:Nova statedir ownership complete Dec 15 04:44:56 localhost systemd[1]: libpod-c86c0d58ed395af407f97b4ea9c613ee83c5b4d277ca4b706ab48762a26987aa.scope: Deactivated successfully. Dec 15 04:44:56 localhost podman[286512]: 2025-12-15 09:44:56.672616832 +0000 UTC m=+0.350160479 container start c86c0d58ed395af407f97b4ea9c613ee83c5b4d277ca4b706ab48762a26987aa (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute_init, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=edpm, container_name=nova_compute_init, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': False, 'user': 'root', 'restart': 'never', 'command': 'bash -c $* -- eval python3 /sbin/nova_statedir_ownership.py | logger -t nova_compute_init', 'net': 'none', 'security_opt': ['label=disable'], 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': '/var/lib/nova/compute_id', '__OS_DEBUG': False}, 'volumes': ['/dev/log:/dev/log', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', '/var/lib/openstack/config/nova/nova_statedir_ownership.py:/sbin/nova_statedir_ownership.py:z']}, tcib_managed=true, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2) Dec 15 04:44:56 localhost python3.9[286471]: ansible-containers.podman.podman_container PODMAN-CONTAINER-DEBUG: podman start nova_compute_init Dec 15 04:44:56 localhost podman[286533]: 2025-12-15 09:44:56.693692891 +0000 UTC m=+0.098966855 container died 
c86c0d58ed395af407f97b4ea9c613ee83c5b4d277ca4b706ab48762a26987aa (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute_init, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': False, 'user': 'root', 'restart': 'never', 'command': 'bash -c $* -- eval python3 /sbin/nova_statedir_ownership.py | logger -t nova_compute_init', 'net': 'none', 'security_opt': ['label=disable'], 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': '/var/lib/nova/compute_id', '__OS_DEBUG': False}, 'volumes': ['/dev/log:/dev/log', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', '/var/lib/openstack/config/nova/nova_statedir_ownership.py:/sbin/nova_statedir_ownership.py:z']}, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=nova_compute_init, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=edpm, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3) Dec 15 04:44:56 localhost podman[286533]: 2025-12-15 09:44:56.732131985 +0000 UTC m=+0.137405949 container cleanup c86c0d58ed395af407f97b4ea9c613ee83c5b4d277ca4b706ab48762a26987aa (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute_init, container_name=nova_compute_init, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_id=edpm, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': False, 'user': 'root', 
'restart': 'never', 'command': 'bash -c $* -- eval python3 /sbin/nova_statedir_ownership.py | logger -t nova_compute_init', 'net': 'none', 'security_opt': ['label=disable'], 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': '/var/lib/nova/compute_id', '__OS_DEBUG': False}, 'volumes': ['/dev/log:/dev/log', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', '/var/lib/openstack/config/nova/nova_statedir_ownership.py:/sbin/nova_statedir_ownership.py:z']}, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS) Dec 15 04:44:56 localhost systemd[1]: libpod-conmon-c86c0d58ed395af407f97b4ea9c613ee83c5b4d277ca4b706ab48762a26987aa.scope: Deactivated successfully. Dec 15 04:44:56 localhost systemd[1]: tmp-crun.aKWcST.mount: Deactivated successfully. Dec 15 04:44:56 localhost systemd[1]: var-lib-containers-storage-overlay-5c523cc249966f9abaa8c86aeb9bc1f9fc67bbfacedf0d474f96681ec85a864c-merged.mount: Deactivated successfully. Dec 15 04:44:56 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-c86c0d58ed395af407f97b4ea9c613ee83c5b4d277ca4b706ab48762a26987aa-userdata-shm.mount: Deactivated successfully. 
Dec 15 04:44:57 localhost nova_compute[286344]: 2025-12-15 09:44:57.124 286348 DEBUG os_vif [-] Loaded VIF plugin class '' with name 'linux_bridge' initialize /usr/lib/python3.9/site-packages/os_vif/__init__.py:44#033[00m Dec 15 04:44:57 localhost nova_compute[286344]: 2025-12-15 09:44:57.125 286348 DEBUG os_vif [-] Loaded VIF plugin class '' with name 'noop' initialize /usr/lib/python3.9/site-packages/os_vif/__init__.py:44#033[00m Dec 15 04:44:57 localhost nova_compute[286344]: 2025-12-15 09:44:57.125 286348 DEBUG os_vif [-] Loaded VIF plugin class '' with name 'ovs' initialize /usr/lib/python3.9/site-packages/os_vif/__init__.py:44#033[00m Dec 15 04:44:57 localhost nova_compute[286344]: 2025-12-15 09:44:57.125 286348 INFO os_vif [-] Loaded VIF plugins: linux_bridge, noop, ovs#033[00m Dec 15 04:44:57 localhost nova_compute[286344]: 2025-12-15 09:44:57.239 286348 DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): grep -F node.session.scan /sbin/iscsiadm execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m Dec 15 04:44:57 localhost nova_compute[286344]: 2025-12-15 09:44:57.261 286348 DEBUG oslo_concurrency.processutils [-] CMD "grep -F node.session.scan /sbin/iscsiadm" returned: 1 in 0.022s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m Dec 15 04:44:57 localhost nova_compute[286344]: 2025-12-15 09:44:57.262 286348 DEBUG oslo_concurrency.processutils [-] 'grep -F node.session.scan /sbin/iscsiadm' failed. Not Retrying. execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:473#033[00m Dec 15 04:44:57 localhost systemd[1]: session-59.scope: Deactivated successfully. Dec 15 04:44:57 localhost systemd[1]: session-59.scope: Consumed 1min 31.213s CPU time. Dec 15 04:44:57 localhost systemd-logind[763]: Session 59 logged out. Waiting for processes to exit. Dec 15 04:44:57 localhost systemd-logind[763]: Removed session 59. 
Dec 15 04:44:57 localhost nova_compute[286344]: 2025-12-15 09:44:57.679 286348 INFO nova.virt.driver [None req-c282a20c-719d-4b58-81e3-ad3fe1a9c2f4 - - - - - -] Loading compute driver 'libvirt.LibvirtDriver'#033[00m Dec 15 04:44:57 localhost nova_compute[286344]: 2025-12-15 09:44:57.787 286348 INFO nova.compute.provider_config [None req-c282a20c-719d-4b58-81e3-ad3fe1a9c2f4 - - - - - -] No provider configs found in /etc/nova/provider_config/. If files are present, ensure the Nova process has access.#033[00m Dec 15 04:44:57 localhost nova_compute[286344]: 2025-12-15 09:44:57.798 286348 DEBUG oslo_concurrency.lockutils [None req-c282a20c-719d-4b58-81e3-ad3fe1a9c2f4 - - - - - -] Acquiring lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m Dec 15 04:44:57 localhost nova_compute[286344]: 2025-12-15 09:44:57.798 286348 DEBUG oslo_concurrency.lockutils [None req-c282a20c-719d-4b58-81e3-ad3fe1a9c2f4 - - - - - -] Acquired lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m Dec 15 04:44:57 localhost nova_compute[286344]: 2025-12-15 09:44:57.798 286348 DEBUG oslo_concurrency.lockutils [None req-c282a20c-719d-4b58-81e3-ad3fe1a9c2f4 - - - - - -] Releasing lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m Dec 15 04:44:57 localhost nova_compute[286344]: 2025-12-15 09:44:57.798 286348 DEBUG oslo_service.service [None req-c282a20c-719d-4b58-81e3-ad3fe1a9c2f4 - - - - - -] Full set of CONF: _wait_for_exit_or_signal /usr/lib/python3.9/site-packages/oslo_service/service.py:362#033[00m Dec 15 04:44:57 localhost nova_compute[286344]: 2025-12-15 09:44:57.799 286348 DEBUG oslo_service.service [None req-c282a20c-719d-4b58-81e3-ad3fe1a9c2f4 - - - - - -] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2589#033[00m Dec 15 04:44:57 localhost 
nova_compute[286344]: 2025-12-15 09:44:57.799 286348 DEBUG oslo_service.service [None req-c282a20c-719d-4b58-81e3-ad3fe1a9c2f4 - - - - - -] Configuration options gathered from: log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2590#033[00m Dec 15 04:44:57 localhost nova_compute[286344]: 2025-12-15 09:44:57.799 286348 DEBUG oslo_service.service [None req-c282a20c-719d-4b58-81e3-ad3fe1a9c2f4 - - - - - -] command line args: [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2591#033[00m Dec 15 04:44:57 localhost nova_compute[286344]: 2025-12-15 09:44:57.799 286348 DEBUG oslo_service.service [None req-c282a20c-719d-4b58-81e3-ad3fe1a9c2f4 - - - - - -] config files: ['/etc/nova/nova.conf', '/etc/nova/nova-compute.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2592#033[00m Dec 15 04:44:57 localhost nova_compute[286344]: 2025-12-15 09:44:57.799 286348 DEBUG oslo_service.service [None req-c282a20c-719d-4b58-81e3-ad3fe1a9c2f4 - - - - - -] ================================================================================ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2594#033[00m Dec 15 04:44:57 localhost nova_compute[286344]: 2025-12-15 09:44:57.799 286348 DEBUG oslo_service.service [None req-c282a20c-719d-4b58-81e3-ad3fe1a9c2f4 - - - - - -] allow_resize_to_same_host = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 15 04:44:57 localhost nova_compute[286344]: 2025-12-15 09:44:57.799 286348 DEBUG oslo_service.service [None req-c282a20c-719d-4b58-81e3-ad3fe1a9c2f4 - - - - - -] arq_binding_timeout = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 15 04:44:57 localhost nova_compute[286344]: 2025-12-15 09:44:57.799 286348 DEBUG oslo_service.service [None req-c282a20c-719d-4b58-81e3-ad3fe1a9c2f4 - - - - - -] backdoor_port = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 15 
04:44:57 localhost nova_compute[286344]: 2025-12-15 09:44:57.800 286348 DEBUG oslo_service.service [None req-c282a20c-719d-4b58-81e3-ad3fe1a9c2f4 - - - - - -] backdoor_socket = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 15 04:44:57 localhost nova_compute[286344]: 2025-12-15 09:44:57.800 286348 DEBUG oslo_service.service [None req-c282a20c-719d-4b58-81e3-ad3fe1a9c2f4 - - - - - -] block_device_allocate_retries = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 15 04:44:57 localhost nova_compute[286344]: 2025-12-15 09:44:57.800 286348 DEBUG oslo_service.service [None req-c282a20c-719d-4b58-81e3-ad3fe1a9c2f4 - - - - - -] block_device_allocate_retries_interval = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 15 04:44:57 localhost nova_compute[286344]: 2025-12-15 09:44:57.800 286348 DEBUG oslo_service.service [None req-c282a20c-719d-4b58-81e3-ad3fe1a9c2f4 - - - - - -] cert = self.pem log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 15 04:44:57 localhost nova_compute[286344]: 2025-12-15 09:44:57.800 286348 DEBUG oslo_service.service [None req-c282a20c-719d-4b58-81e3-ad3fe1a9c2f4 - - - - - -] compute_driver = libvirt.LibvirtDriver log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 15 04:44:57 localhost nova_compute[286344]: 2025-12-15 09:44:57.800 286348 DEBUG oslo_service.service [None req-c282a20c-719d-4b58-81e3-ad3fe1a9c2f4 - - - - - -] compute_monitors = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 15 04:44:57 localhost nova_compute[286344]: 2025-12-15 09:44:57.800 286348 DEBUG oslo_service.service [None req-c282a20c-719d-4b58-81e3-ad3fe1a9c2f4 - - - - - -] config_dir = ['/etc/nova/nova.conf.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 15 04:44:57 localhost nova_compute[286344]: 2025-12-15 
09:44:57.801 286348 DEBUG oslo_service.service [None req-c282a20c-719d-4b58-81e3-ad3fe1a9c2f4 - - - - - -] config_drive_format = iso9660 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 15 04:44:57 localhost nova_compute[286344]: 2025-12-15 09:44:57.801 286348 DEBUG oslo_service.service [None req-c282a20c-719d-4b58-81e3-ad3fe1a9c2f4 - - - - - -] config_file = ['/etc/nova/nova.conf', '/etc/nova/nova-compute.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 15 04:44:57 localhost nova_compute[286344]: 2025-12-15 09:44:57.801 286348 DEBUG oslo_service.service [None req-c282a20c-719d-4b58-81e3-ad3fe1a9c2f4 - - - - - -] config_source = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 15 04:44:57 localhost nova_compute[286344]: 2025-12-15 09:44:57.801 286348 DEBUG oslo_service.service [None req-c282a20c-719d-4b58-81e3-ad3fe1a9c2f4 - - - - - -] console_host = np0005559462.localdomain log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 15 04:44:57 localhost nova_compute[286344]: 2025-12-15 09:44:57.801 286348 DEBUG oslo_service.service [None req-c282a20c-719d-4b58-81e3-ad3fe1a9c2f4 - - - - - -] control_exchange = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 15 04:44:57 localhost nova_compute[286344]: 2025-12-15 09:44:57.801 286348 DEBUG oslo_service.service [None req-c282a20c-719d-4b58-81e3-ad3fe1a9c2f4 - - - - - -] cpu_allocation_ratio = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 15 04:44:57 localhost nova_compute[286344]: 2025-12-15 09:44:57.801 286348 DEBUG oslo_service.service [None req-c282a20c-719d-4b58-81e3-ad3fe1a9c2f4 - - - - - -] daemon = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 15 04:44:57 localhost nova_compute[286344]: 2025-12-15 09:44:57.802 286348 DEBUG oslo_service.service [None 
req-c282a20c-719d-4b58-81e3-ad3fe1a9c2f4 - - - - - -] debug = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 15 04:44:57 localhost nova_compute[286344]: 2025-12-15 09:44:57.802 286348 DEBUG oslo_service.service [None req-c282a20c-719d-4b58-81e3-ad3fe1a9c2f4 - - - - - -] default_access_ip_network_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 15 04:44:57 localhost nova_compute[286344]: 2025-12-15 09:44:57.802 286348 DEBUG oslo_service.service [None req-c282a20c-719d-4b58-81e3-ad3fe1a9c2f4 - - - - - -] default_availability_zone = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 15 04:44:57 localhost nova_compute[286344]: 2025-12-15 09:44:57.802 286348 DEBUG oslo_service.service [None req-c282a20c-719d-4b58-81e3-ad3fe1a9c2f4 - - - - - -] default_ephemeral_format = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 15 04:44:57 localhost nova_compute[286344]: 2025-12-15 09:44:57.802 286348 DEBUG oslo_service.service [None req-c282a20c-719d-4b58-81e3-ad3fe1a9c2f4 - - - - - -] default_log_levels = ['amqp=WARN', 'amqplib=WARN', 'boto=WARN', 'qpid=WARN', 'sqlalchemy=WARN', 'suds=INFO', 'oslo.messaging=INFO', 'oslo_messaging=INFO', 'iso8601=WARN', 'requests.packages.urllib3.connectionpool=WARN', 'urllib3.connectionpool=WARN', 'websocket=WARN', 'requests.packages.urllib3.util.retry=WARN', 'urllib3.util.retry=WARN', 'keystonemiddleware=WARN', 'routes.middleware=WARN', 'stevedore=WARN', 'taskflow=WARN', 'keystoneauth=WARN', 'oslo.cache=INFO', 'oslo_policy=INFO', 'dogpile.core.dogpile=INFO', 'glanceclient=WARN', 'oslo.privsep.daemon=INFO'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 15 04:44:57 localhost nova_compute[286344]: 2025-12-15 09:44:57.802 286348 DEBUG oslo_service.service [None req-c282a20c-719d-4b58-81e3-ad3fe1a9c2f4 - - - - - -] default_schedule_zone = None 
log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 15 04:44:57 localhost nova_compute[286344]: 2025-12-15 09:44:57.803 286348 DEBUG oslo_service.service [None req-c282a20c-719d-4b58-81e3-ad3fe1a9c2f4 - - - - - -] disk_allocation_ratio = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 15 04:44:57 localhost nova_compute[286344]: 2025-12-15 09:44:57.803 286348 DEBUG oslo_service.service [None req-c282a20c-719d-4b58-81e3-ad3fe1a9c2f4 - - - - - -] enable_new_services = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 15 04:44:57 localhost nova_compute[286344]: 2025-12-15 09:44:57.803 286348 DEBUG oslo_service.service [None req-c282a20c-719d-4b58-81e3-ad3fe1a9c2f4 - - - - - -] enabled_apis = ['osapi_compute', 'metadata'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 15 04:44:57 localhost nova_compute[286344]: 2025-12-15 09:44:57.803 286348 DEBUG oslo_service.service [None req-c282a20c-719d-4b58-81e3-ad3fe1a9c2f4 - - - - - -] enabled_ssl_apis = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 15 04:44:57 localhost nova_compute[286344]: 2025-12-15 09:44:57.803 286348 DEBUG oslo_service.service [None req-c282a20c-719d-4b58-81e3-ad3fe1a9c2f4 - - - - - -] flat_injected = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 15 04:44:57 localhost nova_compute[286344]: 2025-12-15 09:44:57.803 286348 DEBUG oslo_service.service [None req-c282a20c-719d-4b58-81e3-ad3fe1a9c2f4 - - - - - -] force_config_drive = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 15 04:44:57 localhost nova_compute[286344]: 2025-12-15 09:44:57.803 286348 DEBUG oslo_service.service [None req-c282a20c-719d-4b58-81e3-ad3fe1a9c2f4 - - - - - -] force_raw_images = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m 
Dec 15 04:44:57 localhost nova_compute[286344]: 2025-12-15 09:44:57.804 286348 DEBUG oslo_service.service [None req-c282a20c-719d-4b58-81e3-ad3fe1a9c2f4 - - - - - -] graceful_shutdown_timeout = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 15 04:44:57 localhost nova_compute[286344]: 2025-12-15 09:44:57.804 286348 DEBUG oslo_service.service [None req-c282a20c-719d-4b58-81e3-ad3fe1a9c2f4 - - - - - -] heal_instance_info_cache_interval = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 15 04:44:57 localhost nova_compute[286344]: 2025-12-15 09:44:57.804 286348 DEBUG oslo_service.service [None req-c282a20c-719d-4b58-81e3-ad3fe1a9c2f4 - - - - - -] host = np0005559462.localdomain log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 15 04:44:57 localhost nova_compute[286344]: 2025-12-15 09:44:57.804 286348 DEBUG oslo_service.service [None req-c282a20c-719d-4b58-81e3-ad3fe1a9c2f4 - - - - - -] initial_cpu_allocation_ratio = 4.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 15 04:44:57 localhost nova_compute[286344]: 2025-12-15 09:44:57.804 286348 DEBUG oslo_service.service [None req-c282a20c-719d-4b58-81e3-ad3fe1a9c2f4 - - - - - -] initial_disk_allocation_ratio = 0.9 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 15 04:44:57 localhost nova_compute[286344]: 2025-12-15 09:44:57.804 286348 DEBUG oslo_service.service [None req-c282a20c-719d-4b58-81e3-ad3fe1a9c2f4 - - - - - -] initial_ram_allocation_ratio = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 15 04:44:57 localhost nova_compute[286344]: 2025-12-15 09:44:57.804 286348 DEBUG oslo_service.service [None req-c282a20c-719d-4b58-81e3-ad3fe1a9c2f4 - - - - - -] injected_network_template = /usr/lib/python3.9/site-packages/nova/virt/interfaces.template log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 15 04:44:57 localhost nova_compute[286344]: 2025-12-15 09:44:57.805 286348 DEBUG oslo_service.service [None req-c282a20c-719d-4b58-81e3-ad3fe1a9c2f4 - - - - - -] instance_build_timeout = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 15 04:44:57 localhost nova_compute[286344]: 2025-12-15 09:44:57.805 286348 DEBUG oslo_service.service [None req-c282a20c-719d-4b58-81e3-ad3fe1a9c2f4 - - - - - -] instance_delete_interval = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 15 04:44:57 localhost nova_compute[286344]: 2025-12-15 09:44:57.805 286348 DEBUG oslo_service.service [None req-c282a20c-719d-4b58-81e3-ad3fe1a9c2f4 - - - - - -] instance_format = [instance: %(uuid)s] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 15 04:44:57 localhost nova_compute[286344]: 2025-12-15 09:44:57.805 286348 DEBUG oslo_service.service [None req-c282a20c-719d-4b58-81e3-ad3fe1a9c2f4 - - - - - -] instance_name_template = instance-%08x log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 15 04:44:57 localhost nova_compute[286344]: 2025-12-15 09:44:57.805 286348 DEBUG oslo_service.service [None req-c282a20c-719d-4b58-81e3-ad3fe1a9c2f4 - - - - - -] instance_usage_audit = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 15 04:44:57 localhost nova_compute[286344]: 2025-12-15 09:44:57.805 286348 DEBUG oslo_service.service [None req-c282a20c-719d-4b58-81e3-ad3fe1a9c2f4 - - - - - -] instance_usage_audit_period = month log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 15 04:44:57 localhost nova_compute[286344]: 2025-12-15 09:44:57.805 286348 DEBUG oslo_service.service [None req-c282a20c-719d-4b58-81e3-ad3fe1a9c2f4 - - - - - -] instance_uuid_format = [instance: %(uuid)s] log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 15 04:44:57 localhost nova_compute[286344]: 2025-12-15 09:44:57.806 286348 DEBUG oslo_service.service [None req-c282a20c-719d-4b58-81e3-ad3fe1a9c2f4 - - - - - -] instances_path = /var/lib/nova/instances log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 15 04:44:57 localhost nova_compute[286344]: 2025-12-15 09:44:57.806 286348 DEBUG oslo_service.service [None req-c282a20c-719d-4b58-81e3-ad3fe1a9c2f4 - - - - - -] internal_service_availability_zone = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 15 04:44:57 localhost nova_compute[286344]: 2025-12-15 09:44:57.806 286348 DEBUG oslo_service.service [None req-c282a20c-719d-4b58-81e3-ad3fe1a9c2f4 - - - - - -] key = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 15 04:44:57 localhost nova_compute[286344]: 2025-12-15 09:44:57.806 286348 DEBUG oslo_service.service [None req-c282a20c-719d-4b58-81e3-ad3fe1a9c2f4 - - - - - -] live_migration_retry_count = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 15 04:44:57 localhost nova_compute[286344]: 2025-12-15 09:44:57.806 286348 DEBUG oslo_service.service [None req-c282a20c-719d-4b58-81e3-ad3fe1a9c2f4 - - - - - -] log_config_append = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 15 04:44:57 localhost nova_compute[286344]: 2025-12-15 09:44:57.806 286348 DEBUG oslo_service.service [None req-c282a20c-719d-4b58-81e3-ad3fe1a9c2f4 - - - - - -] log_date_format = %Y-%m-%d %H:%M:%S log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 15 04:44:57 localhost nova_compute[286344]: 2025-12-15 09:44:57.806 286348 DEBUG oslo_service.service [None req-c282a20c-719d-4b58-81e3-ad3fe1a9c2f4 - - - - - -] log_dir = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 15 
04:44:57 localhost nova_compute[286344]: 2025-12-15 09:44:57.806 286348 DEBUG oslo_service.service [None req-c282a20c-719d-4b58-81e3-ad3fe1a9c2f4 - - - - - -] log_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 15 04:44:57 localhost nova_compute[286344]: 2025-12-15 09:44:57.807 286348 DEBUG oslo_service.service [None req-c282a20c-719d-4b58-81e3-ad3fe1a9c2f4 - - - - - -] log_options = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 15 04:44:57 localhost nova_compute[286344]: 2025-12-15 09:44:57.807 286348 DEBUG oslo_service.service [None req-c282a20c-719d-4b58-81e3-ad3fe1a9c2f4 - - - - - -] log_rotate_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 15 04:44:57 localhost nova_compute[286344]: 2025-12-15 09:44:57.807 286348 DEBUG oslo_service.service [None req-c282a20c-719d-4b58-81e3-ad3fe1a9c2f4 - - - - - -] log_rotate_interval_type = days log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 15 04:44:57 localhost nova_compute[286344]: 2025-12-15 09:44:57.807 286348 DEBUG oslo_service.service [None req-c282a20c-719d-4b58-81e3-ad3fe1a9c2f4 - - - - - -] log_rotation_type = size log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 15 04:44:57 localhost nova_compute[286344]: 2025-12-15 09:44:57.807 286348 DEBUG oslo_service.service [None req-c282a20c-719d-4b58-81e3-ad3fe1a9c2f4 - - - - - -] logging_context_format_string = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [%(global_request_id)s %(request_id)s %(user_identity)s] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 15 04:44:57 localhost nova_compute[286344]: 2025-12-15 09:44:57.807 286348 DEBUG oslo_service.service [None req-c282a20c-719d-4b58-81e3-ad3fe1a9c2f4 - - - - - -] logging_debug_format_suffix = %(funcName)s %(pathname)s:%(lineno)d 
log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 15 04:44:57 localhost nova_compute[286344]: 2025-12-15 09:44:57.807 286348 DEBUG oslo_service.service [None req-c282a20c-719d-4b58-81e3-ad3fe1a9c2f4 - - - - - -] logging_default_format_string = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [-] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 15 04:44:57 localhost nova_compute[286344]: 2025-12-15 09:44:57.807 286348 DEBUG oslo_service.service [None req-c282a20c-719d-4b58-81e3-ad3fe1a9c2f4 - - - - - -] logging_exception_prefix = %(asctime)s.%(msecs)03d %(process)d ERROR %(name)s %(instance)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 15 04:44:57 localhost nova_compute[286344]: 2025-12-15 09:44:57.808 286348 DEBUG oslo_service.service [None req-c282a20c-719d-4b58-81e3-ad3fe1a9c2f4 - - - - - -] logging_user_identity_format = %(user)s %(project)s %(domain)s %(system_scope)s %(user_domain)s %(project_domain)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 15 04:44:57 localhost nova_compute[286344]: 2025-12-15 09:44:57.808 286348 DEBUG oslo_service.service [None req-c282a20c-719d-4b58-81e3-ad3fe1a9c2f4 - - - - - -] long_rpc_timeout = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 15 04:44:57 localhost nova_compute[286344]: 2025-12-15 09:44:57.808 286348 DEBUG oslo_service.service [None req-c282a20c-719d-4b58-81e3-ad3fe1a9c2f4 - - - - - -] max_concurrent_builds = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 15 04:44:57 localhost nova_compute[286344]: 2025-12-15 09:44:57.808 286348 DEBUG oslo_service.service [None req-c282a20c-719d-4b58-81e3-ad3fe1a9c2f4 - - - - - -] max_concurrent_live_migrations = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 15 04:44:57 localhost 
nova_compute[286344]: 2025-12-15 09:44:57.808 286348 DEBUG oslo_service.service [None req-c282a20c-719d-4b58-81e3-ad3fe1a9c2f4 - - - - - -] max_concurrent_snapshots = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 15 04:44:57 localhost nova_compute[286344]: 2025-12-15 09:44:57.808 286348 DEBUG oslo_service.service [None req-c282a20c-719d-4b58-81e3-ad3fe1a9c2f4 - - - - - -] max_local_block_devices = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 15 04:44:57 localhost nova_compute[286344]: 2025-12-15 09:44:57.808 286348 DEBUG oslo_service.service [None req-c282a20c-719d-4b58-81e3-ad3fe1a9c2f4 - - - - - -] max_logfile_count = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 15 04:44:57 localhost nova_compute[286344]: 2025-12-15 09:44:57.808 286348 DEBUG oslo_service.service [None req-c282a20c-719d-4b58-81e3-ad3fe1a9c2f4 - - - - - -] max_logfile_size_mb = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 15 04:44:57 localhost nova_compute[286344]: 2025-12-15 09:44:57.809 286348 DEBUG oslo_service.service [None req-c282a20c-719d-4b58-81e3-ad3fe1a9c2f4 - - - - - -] maximum_instance_delete_attempts = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 15 04:44:57 localhost nova_compute[286344]: 2025-12-15 09:44:57.809 286348 DEBUG oslo_service.service [None req-c282a20c-719d-4b58-81e3-ad3fe1a9c2f4 - - - - - -] metadata_listen = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 15 04:44:57 localhost nova_compute[286344]: 2025-12-15 09:44:57.809 286348 DEBUG oslo_service.service [None req-c282a20c-719d-4b58-81e3-ad3fe1a9c2f4 - - - - - -] metadata_listen_port = 8775 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 15 04:44:57 localhost nova_compute[286344]: 2025-12-15 09:44:57.809 286348 DEBUG oslo_service.service 
[None req-c282a20c-719d-4b58-81e3-ad3fe1a9c2f4 - - - - - -] metadata_workers = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 15 04:44:57 localhost nova_compute[286344]: 2025-12-15 09:44:57.809 286348 DEBUG oslo_service.service [None req-c282a20c-719d-4b58-81e3-ad3fe1a9c2f4 - - - - - -] migrate_max_retries = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 15 04:44:57 localhost nova_compute[286344]: 2025-12-15 09:44:57.809 286348 DEBUG oslo_service.service [None req-c282a20c-719d-4b58-81e3-ad3fe1a9c2f4 - - - - - -] mkisofs_cmd = /usr/bin/mkisofs log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 15 04:44:57 localhost nova_compute[286344]: 2025-12-15 09:44:57.809 286348 DEBUG oslo_service.service [None req-c282a20c-719d-4b58-81e3-ad3fe1a9c2f4 - - - - - -] my_block_storage_ip = 192.168.122.106 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 15 04:44:57 localhost nova_compute[286344]: 2025-12-15 09:44:57.810 286348 DEBUG oslo_service.service [None req-c282a20c-719d-4b58-81e3-ad3fe1a9c2f4 - - - - - -] my_ip = 192.168.122.106 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 15 04:44:57 localhost nova_compute[286344]: 2025-12-15 09:44:57.810 286348 DEBUG oslo_service.service [None req-c282a20c-719d-4b58-81e3-ad3fe1a9c2f4 - - - - - -] network_allocate_retries = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 15 04:44:57 localhost nova_compute[286344]: 2025-12-15 09:44:57.810 286348 DEBUG oslo_service.service [None req-c282a20c-719d-4b58-81e3-ad3fe1a9c2f4 - - - - - -] non_inheritable_image_properties = ['cache_in_nova', 'bittorrent'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 15 04:44:57 localhost nova_compute[286344]: 2025-12-15 09:44:57.810 286348 DEBUG oslo_service.service [None 
req-c282a20c-719d-4b58-81e3-ad3fe1a9c2f4 - - - - - -] osapi_compute_listen = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 15 04:44:57 localhost nova_compute[286344]: 2025-12-15 09:44:57.810 286348 DEBUG oslo_service.service [None req-c282a20c-719d-4b58-81e3-ad3fe1a9c2f4 - - - - - -] osapi_compute_listen_port = 8774 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 15 04:44:57 localhost nova_compute[286344]: 2025-12-15 09:44:57.810 286348 DEBUG oslo_service.service [None req-c282a20c-719d-4b58-81e3-ad3fe1a9c2f4 - - - - - -] osapi_compute_unique_server_name_scope = log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 15 04:44:57 localhost nova_compute[286344]: 2025-12-15 09:44:57.810 286348 DEBUG oslo_service.service [None req-c282a20c-719d-4b58-81e3-ad3fe1a9c2f4 - - - - - -] osapi_compute_workers = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 15 04:44:57 localhost nova_compute[286344]: 2025-12-15 09:44:57.810 286348 DEBUG oslo_service.service [None req-c282a20c-719d-4b58-81e3-ad3fe1a9c2f4 - - - - - -] password_length = 12 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 15 04:44:57 localhost nova_compute[286344]: 2025-12-15 09:44:57.811 286348 DEBUG oslo_service.service [None req-c282a20c-719d-4b58-81e3-ad3fe1a9c2f4 - - - - - -] periodic_enable = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 15 04:44:57 localhost nova_compute[286344]: 2025-12-15 09:44:57.811 286348 DEBUG oslo_service.service [None req-c282a20c-719d-4b58-81e3-ad3fe1a9c2f4 - - - - - -] periodic_fuzzy_delay = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 15 04:44:57 localhost nova_compute[286344]: 2025-12-15 09:44:57.811 286348 DEBUG oslo_service.service [None req-c282a20c-719d-4b58-81e3-ad3fe1a9c2f4 - - - - - -] pointer_model = 
usbtablet log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 15 04:44:57 localhost nova_compute[286344]: 2025-12-15 09:44:57.811 286348 DEBUG oslo_service.service [None req-c282a20c-719d-4b58-81e3-ad3fe1a9c2f4 - - - - - -] preallocate_images = none log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 15 04:44:57 localhost nova_compute[286344]: 2025-12-15 09:44:57.811 286348 DEBUG oslo_service.service [None req-c282a20c-719d-4b58-81e3-ad3fe1a9c2f4 - - - - - -] publish_errors = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 15 04:44:57 localhost nova_compute[286344]: 2025-12-15 09:44:57.811 286348 DEBUG oslo_service.service [None req-c282a20c-719d-4b58-81e3-ad3fe1a9c2f4 - - - - - -] pybasedir = /usr/lib/python3.9/site-packages log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 15 04:44:57 localhost nova_compute[286344]: 2025-12-15 09:44:57.811 286348 DEBUG oslo_service.service [None req-c282a20c-719d-4b58-81e3-ad3fe1a9c2f4 - - - - - -] ram_allocation_ratio = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 15 04:44:57 localhost nova_compute[286344]: 2025-12-15 09:44:57.811 286348 DEBUG oslo_service.service [None req-c282a20c-719d-4b58-81e3-ad3fe1a9c2f4 - - - - - -] rate_limit_burst = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 15 04:44:57 localhost nova_compute[286344]: 2025-12-15 09:44:57.812 286348 DEBUG oslo_service.service [None req-c282a20c-719d-4b58-81e3-ad3fe1a9c2f4 - - - - - -] rate_limit_except_level = CRITICAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 15 04:44:57 localhost nova_compute[286344]: 2025-12-15 09:44:57.812 286348 DEBUG oslo_service.service [None req-c282a20c-719d-4b58-81e3-ad3fe1a9c2f4 - - - - - -] rate_limit_interval = 0 log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 15 04:44:57 localhost nova_compute[286344]: 2025-12-15 09:44:57.812 286348 DEBUG oslo_service.service [None req-c282a20c-719d-4b58-81e3-ad3fe1a9c2f4 - - - - - -] reboot_timeout = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 15 04:44:57 localhost nova_compute[286344]: 2025-12-15 09:44:57.812 286348 DEBUG oslo_service.service [None req-c282a20c-719d-4b58-81e3-ad3fe1a9c2f4 - - - - - -] reclaim_instance_interval = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 15 04:44:57 localhost nova_compute[286344]: 2025-12-15 09:44:57.813 286348 DEBUG oslo_service.service [None req-c282a20c-719d-4b58-81e3-ad3fe1a9c2f4 - - - - - -] record = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 15 04:44:57 localhost nova_compute[286344]: 2025-12-15 09:44:57.813 286348 DEBUG oslo_service.service [None req-c282a20c-719d-4b58-81e3-ad3fe1a9c2f4 - - - - - -] reimage_timeout_per_gb = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 15 04:44:57 localhost nova_compute[286344]: 2025-12-15 09:44:57.813 286348 DEBUG oslo_service.service [None req-c282a20c-719d-4b58-81e3-ad3fe1a9c2f4 - - - - - -] report_interval = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 15 04:44:57 localhost nova_compute[286344]: 2025-12-15 09:44:57.813 286348 DEBUG oslo_service.service [None req-c282a20c-719d-4b58-81e3-ad3fe1a9c2f4 - - - - - -] rescue_timeout = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 15 04:44:57 localhost nova_compute[286344]: 2025-12-15 09:44:57.813 286348 DEBUG oslo_service.service [None req-c282a20c-719d-4b58-81e3-ad3fe1a9c2f4 - - - - - -] reserved_host_cpus = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 15 04:44:57 localhost nova_compute[286344]: 2025-12-15 
09:44:57.813 286348 DEBUG oslo_service.service [None req-c282a20c-719d-4b58-81e3-ad3fe1a9c2f4 - - - - - -] reserved_host_disk_mb = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 15 04:44:57 localhost nova_compute[286344]: 2025-12-15 09:44:57.813 286348 DEBUG oslo_service.service [None req-c282a20c-719d-4b58-81e3-ad3fe1a9c2f4 - - - - - -] reserved_host_memory_mb = 512 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 15 04:44:57 localhost nova_compute[286344]: 2025-12-15 09:44:57.814 286348 DEBUG oslo_service.service [None req-c282a20c-719d-4b58-81e3-ad3fe1a9c2f4 - - - - - -] reserved_huge_pages = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 15 04:44:57 localhost nova_compute[286344]: 2025-12-15 09:44:57.814 286348 DEBUG oslo_service.service [None req-c282a20c-719d-4b58-81e3-ad3fe1a9c2f4 - - - - - -] resize_confirm_window = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 15 04:44:57 localhost nova_compute[286344]: 2025-12-15 09:44:57.814 286348 DEBUG oslo_service.service [None req-c282a20c-719d-4b58-81e3-ad3fe1a9c2f4 - - - - - -] resize_fs_using_block_device = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 15 04:44:57 localhost nova_compute[286344]: 2025-12-15 09:44:57.814 286348 DEBUG oslo_service.service [None req-c282a20c-719d-4b58-81e3-ad3fe1a9c2f4 - - - - - -] resume_guests_state_on_host_boot = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 15 04:44:57 localhost nova_compute[286344]: 2025-12-15 09:44:57.814 286348 DEBUG oslo_service.service [None req-c282a20c-719d-4b58-81e3-ad3fe1a9c2f4 - - - - - -] rootwrap_config = /etc/nova/rootwrap.conf log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 15 04:44:57 localhost nova_compute[286344]: 2025-12-15 09:44:57.814 286348 DEBUG oslo_service.service 
[None req-c282a20c-719d-4b58-81e3-ad3fe1a9c2f4 - - - - - -] rpc_response_timeout = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 15 04:44:57 localhost nova_compute[286344]: 2025-12-15 09:44:57.814 286348 DEBUG oslo_service.service [None req-c282a20c-719d-4b58-81e3-ad3fe1a9c2f4 - - - - - -] run_external_periodic_tasks = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 15 04:44:57 localhost nova_compute[286344]: 2025-12-15 09:44:57.815 286348 DEBUG oslo_service.service [None req-c282a20c-719d-4b58-81e3-ad3fe1a9c2f4 - - - - - -] running_deleted_instance_action = reap log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 15 04:44:57 localhost nova_compute[286344]: 2025-12-15 09:44:57.815 286348 DEBUG oslo_service.service [None req-c282a20c-719d-4b58-81e3-ad3fe1a9c2f4 - - - - - -] running_deleted_instance_poll_interval = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 15 04:44:57 localhost nova_compute[286344]: 2025-12-15 09:44:57.815 286348 DEBUG oslo_service.service [None req-c282a20c-719d-4b58-81e3-ad3fe1a9c2f4 - - - - - -] running_deleted_instance_timeout = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 15 04:44:57 localhost nova_compute[286344]: 2025-12-15 09:44:57.815 286348 DEBUG oslo_service.service [None req-c282a20c-719d-4b58-81e3-ad3fe1a9c2f4 - - - - - -] scheduler_instance_sync_interval = 120 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 15 04:44:57 localhost nova_compute[286344]: 2025-12-15 09:44:57.815 286348 DEBUG oslo_service.service [None req-c282a20c-719d-4b58-81e3-ad3fe1a9c2f4 - - - - - -] service_down_time = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 15 04:44:57 localhost nova_compute[286344]: 2025-12-15 09:44:57.815 286348 DEBUG oslo_service.service [None 
req-c282a20c-719d-4b58-81e3-ad3fe1a9c2f4 - - - - - -] servicegroup_driver = db log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 15 04:44:57 localhost nova_compute[286344]: 2025-12-15 09:44:57.815 286348 DEBUG oslo_service.service [None req-c282a20c-719d-4b58-81e3-ad3fe1a9c2f4 - - - - - -] shelved_offload_time = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 15 04:44:57 localhost nova_compute[286344]: 2025-12-15 09:44:57.816 286348 DEBUG oslo_service.service [None req-c282a20c-719d-4b58-81e3-ad3fe1a9c2f4 - - - - - -] shelved_poll_interval = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 15 04:44:57 localhost nova_compute[286344]: 2025-12-15 09:44:57.816 286348 DEBUG oslo_service.service [None req-c282a20c-719d-4b58-81e3-ad3fe1a9c2f4 - - - - - -] shutdown_timeout = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 15 04:44:57 localhost nova_compute[286344]: 2025-12-15 09:44:57.816 286348 DEBUG oslo_service.service [None req-c282a20c-719d-4b58-81e3-ad3fe1a9c2f4 - - - - - -] source_is_ipv6 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 15 04:44:57 localhost nova_compute[286344]: 2025-12-15 09:44:57.816 286348 DEBUG oslo_service.service [None req-c282a20c-719d-4b58-81e3-ad3fe1a9c2f4 - - - - - -] ssl_only = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 15 04:44:57 localhost nova_compute[286344]: 2025-12-15 09:44:57.816 286348 DEBUG oslo_service.service [None req-c282a20c-719d-4b58-81e3-ad3fe1a9c2f4 - - - - - -] state_path = /var/lib/nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 15 04:44:57 localhost nova_compute[286344]: 2025-12-15 09:44:57.816 286348 DEBUG oslo_service.service [None req-c282a20c-719d-4b58-81e3-ad3fe1a9c2f4 - - - - - -] sync_power_state_interval = 600 log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 15 04:44:57 localhost nova_compute[286344]: 2025-12-15 09:44:57.816 286348 DEBUG oslo_service.service [None req-c282a20c-719d-4b58-81e3-ad3fe1a9c2f4 - - - - - -] sync_power_state_pool_size = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 15 04:44:57 localhost nova_compute[286344]: 2025-12-15 09:44:57.816 286348 DEBUG oslo_service.service [None req-c282a20c-719d-4b58-81e3-ad3fe1a9c2f4 - - - - - -] syslog_log_facility = LOG_USER log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 15 04:44:57 localhost nova_compute[286344]: 2025-12-15 09:44:57.817 286348 DEBUG oslo_service.service [None req-c282a20c-719d-4b58-81e3-ad3fe1a9c2f4 - - - - - -] tempdir = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 15 04:44:57 localhost nova_compute[286344]: 2025-12-15 09:44:57.817 286348 DEBUG oslo_service.service [None req-c282a20c-719d-4b58-81e3-ad3fe1a9c2f4 - - - - - -] timeout_nbd = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 15 04:44:57 localhost nova_compute[286344]: 2025-12-15 09:44:57.817 286348 DEBUG oslo_service.service [None req-c282a20c-719d-4b58-81e3-ad3fe1a9c2f4 - - - - - -] transport_url = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 15 04:44:57 localhost nova_compute[286344]: 2025-12-15 09:44:57.817 286348 DEBUG oslo_service.service [None req-c282a20c-719d-4b58-81e3-ad3fe1a9c2f4 - - - - - -] update_resources_interval = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 15 04:44:57 localhost nova_compute[286344]: 2025-12-15 09:44:57.817 286348 DEBUG oslo_service.service [None req-c282a20c-719d-4b58-81e3-ad3fe1a9c2f4 - - - - - -] use_cow_images = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 15 04:44:57 localhost 
nova_compute[286344]: 2025-12-15 09:44:57.817 286348 DEBUG oslo_service.service [None req-c282a20c-719d-4b58-81e3-ad3fe1a9c2f4 - - - - - -] use_eventlog = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 15 04:44:57 localhost nova_compute[286344]: 2025-12-15 09:44:57.817 286348 DEBUG oslo_service.service [None req-c282a20c-719d-4b58-81e3-ad3fe1a9c2f4 - - - - - -] use_journal = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 15 04:44:57 localhost nova_compute[286344]: 2025-12-15 09:44:57.817 286348 DEBUG oslo_service.service [None req-c282a20c-719d-4b58-81e3-ad3fe1a9c2f4 - - - - - -] use_json = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 15 04:44:57 localhost nova_compute[286344]: 2025-12-15 09:44:57.818 286348 DEBUG oslo_service.service [None req-c282a20c-719d-4b58-81e3-ad3fe1a9c2f4 - - - - - -] use_rootwrap_daemon = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 15 04:44:57 localhost nova_compute[286344]: 2025-12-15 09:44:57.818 286348 DEBUG oslo_service.service [None req-c282a20c-719d-4b58-81e3-ad3fe1a9c2f4 - - - - - -] use_stderr = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 15 04:44:57 localhost nova_compute[286344]: 2025-12-15 09:44:57.818 286348 DEBUG oslo_service.service [None req-c282a20c-719d-4b58-81e3-ad3fe1a9c2f4 - - - - - -] use_syslog = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 15 04:44:57 localhost nova_compute[286344]: 2025-12-15 09:44:57.818 286348 DEBUG oslo_service.service [None req-c282a20c-719d-4b58-81e3-ad3fe1a9c2f4 - - - - - -] vcpu_pin_set = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 15 04:44:57 localhost nova_compute[286344]: 2025-12-15 09:44:57.818 286348 DEBUG oslo_service.service [None req-c282a20c-719d-4b58-81e3-ad3fe1a9c2f4 - - - 
- - -] vif_plugging_is_fatal = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 15 04:44:57 localhost nova_compute[286344]: 2025-12-15 09:44:57.818 286348 DEBUG oslo_service.service [None req-c282a20c-719d-4b58-81e3-ad3fe1a9c2f4 - - - - - -] vif_plugging_timeout = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 15 04:44:57 localhost nova_compute[286344]: 2025-12-15 09:44:57.818 286348 DEBUG oslo_service.service [None req-c282a20c-719d-4b58-81e3-ad3fe1a9c2f4 - - - - - -] virt_mkfs = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 15 04:44:57 localhost nova_compute[286344]: 2025-12-15 09:44:57.818 286348 DEBUG oslo_service.service [None req-c282a20c-719d-4b58-81e3-ad3fe1a9c2f4 - - - - - -] volume_usage_poll_interval = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 15 04:44:57 localhost nova_compute[286344]: 2025-12-15 09:44:57.819 286348 DEBUG oslo_service.service [None req-c282a20c-719d-4b58-81e3-ad3fe1a9c2f4 - - - - - -] watch_log_file = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 15 04:44:57 localhost nova_compute[286344]: 2025-12-15 09:44:57.819 286348 DEBUG oslo_service.service [None req-c282a20c-719d-4b58-81e3-ad3fe1a9c2f4 - - - - - -] web = /usr/share/spice-html5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m Dec 15 04:44:57 localhost nova_compute[286344]: 2025-12-15 09:44:57.819 286348 DEBUG oslo_service.service [None req-c282a20c-719d-4b58-81e3-ad3fe1a9c2f4 - - - - - -] oslo_concurrency.disable_process_locking = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:44:57 localhost nova_compute[286344]: 2025-12-15 09:44:57.819 286348 DEBUG oslo_service.service [None req-c282a20c-719d-4b58-81e3-ad3fe1a9c2f4 - - - - - -] oslo_concurrency.lock_path = /var/lib/nova/tmp log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:44:57 localhost nova_compute[286344]: 2025-12-15 09:44:57.819 286348 DEBUG oslo_service.service [None req-c282a20c-719d-4b58-81e3-ad3fe1a9c2f4 - - - - - -] oslo_messaging_metrics.metrics_buffer_size = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:44:57 localhost nova_compute[286344]: 2025-12-15 09:44:57.819 286348 DEBUG oslo_service.service [None req-c282a20c-719d-4b58-81e3-ad3fe1a9c2f4 - - - - - -] oslo_messaging_metrics.metrics_enabled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:44:57 localhost nova_compute[286344]: 2025-12-15 09:44:57.819 286348 DEBUG oslo_service.service [None req-c282a20c-719d-4b58-81e3-ad3fe1a9c2f4 - - - - - -] oslo_messaging_metrics.metrics_process_name = log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:44:57 localhost nova_compute[286344]: 2025-12-15 09:44:57.820 286348 DEBUG oslo_service.service [None req-c282a20c-719d-4b58-81e3-ad3fe1a9c2f4 - - - - - -] oslo_messaging_metrics.metrics_socket_file = /var/tmp/metrics_collector.sock log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:44:57 localhost nova_compute[286344]: 2025-12-15 09:44:57.820 286348 DEBUG oslo_service.service [None req-c282a20c-719d-4b58-81e3-ad3fe1a9c2f4 - - - - - -] oslo_messaging_metrics.metrics_thread_stop_timeout = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:44:57 localhost nova_compute[286344]: 2025-12-15 09:44:57.820 286348 DEBUG oslo_service.service [None req-c282a20c-719d-4b58-81e3-ad3fe1a9c2f4 - - - - - -] api.auth_strategy = keystone log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:44:57 localhost nova_compute[286344]: 2025-12-15 09:44:57.820 286348 DEBUG oslo_service.service [None req-c282a20c-719d-4b58-81e3-ad3fe1a9c2f4 
- - - - - -] api.compute_link_prefix = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:44:57 localhost nova_compute[286344]: 2025-12-15 09:44:57.820 286348 DEBUG oslo_service.service [None req-c282a20c-719d-4b58-81e3-ad3fe1a9c2f4 - - - - - -] api.config_drive_skip_versions = 1.0 2007-01-19 2007-03-01 2007-08-29 2007-10-10 2007-12-15 2008-02-01 2008-09-01 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:44:57 localhost nova_compute[286344]: 2025-12-15 09:44:57.820 286348 DEBUG oslo_service.service [None req-c282a20c-719d-4b58-81e3-ad3fe1a9c2f4 - - - - - -] api.dhcp_domain = log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:44:57 localhost nova_compute[286344]: 2025-12-15 09:44:57.820 286348 DEBUG oslo_service.service [None req-c282a20c-719d-4b58-81e3-ad3fe1a9c2f4 - - - - - -] api.enable_instance_password = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:44:57 localhost nova_compute[286344]: 2025-12-15 09:44:57.821 286348 DEBUG oslo_service.service [None req-c282a20c-719d-4b58-81e3-ad3fe1a9c2f4 - - - - - -] api.glance_link_prefix = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:44:57 localhost nova_compute[286344]: 2025-12-15 09:44:57.821 286348 DEBUG oslo_service.service [None req-c282a20c-719d-4b58-81e3-ad3fe1a9c2f4 - - - - - -] api.instance_list_cells_batch_fixed_size = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:44:57 localhost nova_compute[286344]: 2025-12-15 09:44:57.821 286348 DEBUG oslo_service.service [None req-c282a20c-719d-4b58-81e3-ad3fe1a9c2f4 - - - - - -] api.instance_list_cells_batch_strategy = distributed log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:44:57 localhost nova_compute[286344]: 2025-12-15 09:44:57.821 286348 DEBUG 
oslo_service.service [None req-c282a20c-719d-4b58-81e3-ad3fe1a9c2f4 - - - - - -] api.instance_list_per_project_cells = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:44:57 localhost nova_compute[286344]: 2025-12-15 09:44:57.821 286348 DEBUG oslo_service.service [None req-c282a20c-719d-4b58-81e3-ad3fe1a9c2f4 - - - - - -] api.list_records_by_skipping_down_cells = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:44:57 localhost nova_compute[286344]: 2025-12-15 09:44:57.821 286348 DEBUG oslo_service.service [None req-c282a20c-719d-4b58-81e3-ad3fe1a9c2f4 - - - - - -] api.local_metadata_per_cell = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:44:57 localhost nova_compute[286344]: 2025-12-15 09:44:57.821 286348 DEBUG oslo_service.service [None req-c282a20c-719d-4b58-81e3-ad3fe1a9c2f4 - - - - - -] api.max_limit = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:44:57 localhost nova_compute[286344]: 2025-12-15 09:44:57.822 286348 DEBUG oslo_service.service [None req-c282a20c-719d-4b58-81e3-ad3fe1a9c2f4 - - - - - -] api.metadata_cache_expiration = 15 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:44:57 localhost nova_compute[286344]: 2025-12-15 09:44:57.822 286348 DEBUG oslo_service.service [None req-c282a20c-719d-4b58-81e3-ad3fe1a9c2f4 - - - - - -] api.neutron_default_tenant_id = default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:44:57 localhost nova_compute[286344]: 2025-12-15 09:44:57.822 286348 DEBUG oslo_service.service [None req-c282a20c-719d-4b58-81e3-ad3fe1a9c2f4 - - - - - -] api.use_forwarded_for = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:44:57 localhost nova_compute[286344]: 2025-12-15 09:44:57.822 286348 DEBUG oslo_service.service 
[None req-c282a20c-719d-4b58-81e3-ad3fe1a9c2f4 - - - - - -] api.use_neutron_default_nets = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:44:57 localhost nova_compute[286344]: 2025-12-15 09:44:57.822 286348 DEBUG oslo_service.service [None req-c282a20c-719d-4b58-81e3-ad3fe1a9c2f4 - - - - - -] api.vendordata_dynamic_connect_timeout = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:44:57 localhost nova_compute[286344]: 2025-12-15 09:44:57.822 286348 DEBUG oslo_service.service [None req-c282a20c-719d-4b58-81e3-ad3fe1a9c2f4 - - - - - -] api.vendordata_dynamic_failure_fatal = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:44:57 localhost nova_compute[286344]: 2025-12-15 09:44:57.822 286348 DEBUG oslo_service.service [None req-c282a20c-719d-4b58-81e3-ad3fe1a9c2f4 - - - - - -] api.vendordata_dynamic_read_timeout = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:44:57 localhost nova_compute[286344]: 2025-12-15 09:44:57.822 286348 DEBUG oslo_service.service [None req-c282a20c-719d-4b58-81e3-ad3fe1a9c2f4 - - - - - -] api.vendordata_dynamic_ssl_certfile = log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:44:57 localhost nova_compute[286344]: 2025-12-15 09:44:57.823 286348 DEBUG oslo_service.service [None req-c282a20c-719d-4b58-81e3-ad3fe1a9c2f4 - - - - - -] api.vendordata_dynamic_targets = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:44:57 localhost nova_compute[286344]: 2025-12-15 09:44:57.823 286348 DEBUG oslo_service.service [None req-c282a20c-719d-4b58-81e3-ad3fe1a9c2f4 - - - - - -] api.vendordata_jsonfile_path = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:44:57 localhost nova_compute[286344]: 2025-12-15 09:44:57.823 286348 DEBUG oslo_service.service 
[None req-c282a20c-719d-4b58-81e3-ad3fe1a9c2f4 - - - - - -] api.vendordata_providers = ['StaticJSON'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:44:57 localhost nova_compute[286344]: 2025-12-15 09:44:57.823 286348 DEBUG oslo_service.service [None req-c282a20c-719d-4b58-81e3-ad3fe1a9c2f4 - - - - - -] cache.backend = oslo_cache.dict log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:44:57 localhost nova_compute[286344]: 2025-12-15 09:44:57.823 286348 DEBUG oslo_service.service [None req-c282a20c-719d-4b58-81e3-ad3fe1a9c2f4 - - - - - -] cache.backend_argument = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:44:57 localhost nova_compute[286344]: 2025-12-15 09:44:57.823 286348 DEBUG oslo_service.service [None req-c282a20c-719d-4b58-81e3-ad3fe1a9c2f4 - - - - - -] cache.config_prefix = cache.oslo log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:44:57 localhost nova_compute[286344]: 2025-12-15 09:44:57.823 286348 DEBUG oslo_service.service [None req-c282a20c-719d-4b58-81e3-ad3fe1a9c2f4 - - - - - -] cache.dead_timeout = 60.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:44:57 localhost nova_compute[286344]: 2025-12-15 09:44:57.824 286348 DEBUG oslo_service.service [None req-c282a20c-719d-4b58-81e3-ad3fe1a9c2f4 - - - - - -] cache.debug_cache_backend = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:44:57 localhost nova_compute[286344]: 2025-12-15 09:44:57.824 286348 DEBUG oslo_service.service [None req-c282a20c-719d-4b58-81e3-ad3fe1a9c2f4 - - - - - -] cache.enable_retry_client = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:44:57 localhost nova_compute[286344]: 2025-12-15 09:44:57.824 286348 DEBUG oslo_service.service [None req-c282a20c-719d-4b58-81e3-ad3fe1a9c2f4 
- - - - - -] cache.enable_socket_keepalive = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:44:57 localhost nova_compute[286344]: 2025-12-15 09:44:57.824 286348 DEBUG oslo_service.service [None req-c282a20c-719d-4b58-81e3-ad3fe1a9c2f4 - - - - - -] cache.enabled = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:44:57 localhost nova_compute[286344]: 2025-12-15 09:44:57.824 286348 DEBUG oslo_service.service [None req-c282a20c-719d-4b58-81e3-ad3fe1a9c2f4 - - - - - -] cache.expiration_time = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:44:57 localhost nova_compute[286344]: 2025-12-15 09:44:57.824 286348 DEBUG oslo_service.service [None req-c282a20c-719d-4b58-81e3-ad3fe1a9c2f4 - - - - - -] cache.hashclient_retry_attempts = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:44:57 localhost nova_compute[286344]: 2025-12-15 09:44:57.824 286348 DEBUG oslo_service.service [None req-c282a20c-719d-4b58-81e3-ad3fe1a9c2f4 - - - - - -] cache.hashclient_retry_delay = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:44:57 localhost nova_compute[286344]: 2025-12-15 09:44:57.824 286348 DEBUG oslo_service.service [None req-c282a20c-719d-4b58-81e3-ad3fe1a9c2f4 - - - - - -] cache.memcache_dead_retry = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:44:57 localhost nova_compute[286344]: 2025-12-15 09:44:57.825 286348 DEBUG oslo_service.service [None req-c282a20c-719d-4b58-81e3-ad3fe1a9c2f4 - - - - - -] cache.memcache_password = log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:44:57 localhost nova_compute[286344]: 2025-12-15 09:44:57.825 286348 DEBUG oslo_service.service [None req-c282a20c-719d-4b58-81e3-ad3fe1a9c2f4 - - - - - -] cache.memcache_pool_connection_get_timeout = 10 
log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:44:57 localhost nova_compute[286344]: 2025-12-15 09:44:57.825 286348 DEBUG oslo_service.service [None req-c282a20c-719d-4b58-81e3-ad3fe1a9c2f4 - - - - - -] cache.memcache_pool_flush_on_reconnect = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:44:57 localhost nova_compute[286344]: 2025-12-15 09:44:57.825 286348 DEBUG oslo_service.service [None req-c282a20c-719d-4b58-81e3-ad3fe1a9c2f4 - - - - - -] cache.memcache_pool_maxsize = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:44:57 localhost nova_compute[286344]: 2025-12-15 09:44:57.825 286348 DEBUG oslo_service.service [None req-c282a20c-719d-4b58-81e3-ad3fe1a9c2f4 - - - - - -] cache.memcache_pool_unused_timeout = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:44:57 localhost nova_compute[286344]: 2025-12-15 09:44:57.825 286348 DEBUG oslo_service.service [None req-c282a20c-719d-4b58-81e3-ad3fe1a9c2f4 - - - - - -] cache.memcache_sasl_enabled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:44:57 localhost nova_compute[286344]: 2025-12-15 09:44:57.825 286348 DEBUG oslo_service.service [None req-c282a20c-719d-4b58-81e3-ad3fe1a9c2f4 - - - - - -] cache.memcache_servers = ['localhost:11211'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:44:57 localhost nova_compute[286344]: 2025-12-15 09:44:57.826 286348 DEBUG oslo_service.service [None req-c282a20c-719d-4b58-81e3-ad3fe1a9c2f4 - - - - - -] cache.memcache_socket_timeout = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:44:57 localhost nova_compute[286344]: 2025-12-15 09:44:57.826 286348 DEBUG oslo_service.service [None req-c282a20c-719d-4b58-81e3-ad3fe1a9c2f4 - - - - - -] cache.memcache_username = log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:44:57 localhost nova_compute[286344]: 2025-12-15 09:44:57.826 286348 DEBUG oslo_service.service [None req-c282a20c-719d-4b58-81e3-ad3fe1a9c2f4 - - - - - -] cache.proxies = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:44:57 localhost nova_compute[286344]: 2025-12-15 09:44:57.826 286348 DEBUG oslo_service.service [None req-c282a20c-719d-4b58-81e3-ad3fe1a9c2f4 - - - - - -] cache.retry_attempts = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:44:57 localhost nova_compute[286344]: 2025-12-15 09:44:57.826 286348 DEBUG oslo_service.service [None req-c282a20c-719d-4b58-81e3-ad3fe1a9c2f4 - - - - - -] cache.retry_delay = 0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:44:57 localhost nova_compute[286344]: 2025-12-15 09:44:57.826 286348 DEBUG oslo_service.service [None req-c282a20c-719d-4b58-81e3-ad3fe1a9c2f4 - - - - - -] cache.socket_keepalive_count = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:44:57 localhost nova_compute[286344]: 2025-12-15 09:44:57.826 286348 DEBUG oslo_service.service [None req-c282a20c-719d-4b58-81e3-ad3fe1a9c2f4 - - - - - -] cache.socket_keepalive_idle = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:44:57 localhost nova_compute[286344]: 2025-12-15 09:44:57.826 286348 DEBUG oslo_service.service [None req-c282a20c-719d-4b58-81e3-ad3fe1a9c2f4 - - - - - -] cache.socket_keepalive_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:44:57 localhost nova_compute[286344]: 2025-12-15 09:44:57.827 286348 DEBUG oslo_service.service [None req-c282a20c-719d-4b58-81e3-ad3fe1a9c2f4 - - - - - -] cache.tls_allowed_ciphers = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 
04:44:57 localhost nova_compute[286344]: 2025-12-15 09:44:57.827 286348 DEBUG oslo_service.service [None req-c282a20c-719d-4b58-81e3-ad3fe1a9c2f4 - - - - - -] cache.tls_cafile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:44:57 localhost nova_compute[286344]: 2025-12-15 09:44:57.827 286348 DEBUG oslo_service.service [None req-c282a20c-719d-4b58-81e3-ad3fe1a9c2f4 - - - - - -] cache.tls_certfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:44:57 localhost nova_compute[286344]: 2025-12-15 09:44:57.827 286348 DEBUG oslo_service.service [None req-c282a20c-719d-4b58-81e3-ad3fe1a9c2f4 - - - - - -] cache.tls_enabled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:44:57 localhost nova_compute[286344]: 2025-12-15 09:44:57.827 286348 DEBUG oslo_service.service [None req-c282a20c-719d-4b58-81e3-ad3fe1a9c2f4 - - - - - -] cache.tls_keyfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:44:57 localhost nova_compute[286344]: 2025-12-15 09:44:57.827 286348 DEBUG oslo_service.service [None req-c282a20c-719d-4b58-81e3-ad3fe1a9c2f4 - - - - - -] cinder.auth_section = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:44:57 localhost nova_compute[286344]: 2025-12-15 09:44:57.827 286348 DEBUG oslo_service.service [None req-c282a20c-719d-4b58-81e3-ad3fe1a9c2f4 - - - - - -] cinder.auth_type = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:44:57 localhost nova_compute[286344]: 2025-12-15 09:44:57.828 286348 DEBUG oslo_service.service [None req-c282a20c-719d-4b58-81e3-ad3fe1a9c2f4 - - - - - -] cinder.cafile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:44:57 localhost nova_compute[286344]: 2025-12-15 09:44:57.828 286348 DEBUG oslo_service.service 
[None req-c282a20c-719d-4b58-81e3-ad3fe1a9c2f4 - - - - - -] cinder.catalog_info = volumev3:cinderv3:internalURL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:44:57 localhost nova_compute[286344]: 2025-12-15 09:44:57.828 286348 DEBUG oslo_service.service [None req-c282a20c-719d-4b58-81e3-ad3fe1a9c2f4 - - - - - -] cinder.certfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:44:57 localhost nova_compute[286344]: 2025-12-15 09:44:57.828 286348 DEBUG oslo_service.service [None req-c282a20c-719d-4b58-81e3-ad3fe1a9c2f4 - - - - - -] cinder.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:44:57 localhost nova_compute[286344]: 2025-12-15 09:44:57.828 286348 DEBUG oslo_service.service [None req-c282a20c-719d-4b58-81e3-ad3fe1a9c2f4 - - - - - -] cinder.cross_az_attach = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:44:57 localhost nova_compute[286344]: 2025-12-15 09:44:57.828 286348 DEBUG oslo_service.service [None req-c282a20c-719d-4b58-81e3-ad3fe1a9c2f4 - - - - - -] cinder.debug = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:44:57 localhost nova_compute[286344]: 2025-12-15 09:44:57.828 286348 DEBUG oslo_service.service [None req-c282a20c-719d-4b58-81e3-ad3fe1a9c2f4 - - - - - -] cinder.endpoint_template = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:44:57 localhost nova_compute[286344]: 2025-12-15 09:44:57.828 286348 DEBUG oslo_service.service [None req-c282a20c-719d-4b58-81e3-ad3fe1a9c2f4 - - - - - -] cinder.http_retries = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:44:57 localhost nova_compute[286344]: 2025-12-15 09:44:57.829 286348 DEBUG oslo_service.service [None req-c282a20c-719d-4b58-81e3-ad3fe1a9c2f4 - - - - - -] 
cinder.insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:44:57 localhost nova_compute[286344]: 2025-12-15 09:44:57.829 286348 DEBUG oslo_service.service [None req-c282a20c-719d-4b58-81e3-ad3fe1a9c2f4 - - - - - -] cinder.keyfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:44:57 localhost nova_compute[286344]: 2025-12-15 09:44:57.829 286348 DEBUG oslo_service.service [None req-c282a20c-719d-4b58-81e3-ad3fe1a9c2f4 - - - - - -] cinder.os_region_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:44:57 localhost nova_compute[286344]: 2025-12-15 09:44:57.829 286348 DEBUG oslo_service.service [None req-c282a20c-719d-4b58-81e3-ad3fe1a9c2f4 - - - - - -] cinder.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:44:57 localhost nova_compute[286344]: 2025-12-15 09:44:57.829 286348 DEBUG oslo_service.service [None req-c282a20c-719d-4b58-81e3-ad3fe1a9c2f4 - - - - - -] cinder.timeout = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:44:57 localhost nova_compute[286344]: 2025-12-15 09:44:57.829 286348 DEBUG oslo_service.service [None req-c282a20c-719d-4b58-81e3-ad3fe1a9c2f4 - - - - - -] compute.consecutive_build_service_disable_threshold = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:44:57 localhost nova_compute[286344]: 2025-12-15 09:44:57.829 286348 DEBUG oslo_service.service [None req-c282a20c-719d-4b58-81e3-ad3fe1a9c2f4 - - - - - -] compute.cpu_dedicated_set = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:44:57 localhost nova_compute[286344]: 2025-12-15 09:44:57.830 286348 DEBUG oslo_service.service [None req-c282a20c-719d-4b58-81e3-ad3fe1a9c2f4 - - - - - -] compute.cpu_shared_set = None log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:44:57 localhost nova_compute[286344]: 2025-12-15 09:44:57.830 286348 DEBUG oslo_service.service [None req-c282a20c-719d-4b58-81e3-ad3fe1a9c2f4 - - - - - -] compute.image_type_exclude_list = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:44:57 localhost nova_compute[286344]: 2025-12-15 09:44:57.830 286348 DEBUG oslo_service.service [None req-c282a20c-719d-4b58-81e3-ad3fe1a9c2f4 - - - - - -] compute.live_migration_wait_for_vif_plug = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:44:57 localhost nova_compute[286344]: 2025-12-15 09:44:57.830 286348 DEBUG oslo_service.service [None req-c282a20c-719d-4b58-81e3-ad3fe1a9c2f4 - - - - - -] compute.max_concurrent_disk_ops = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:44:57 localhost nova_compute[286344]: 2025-12-15 09:44:57.830 286348 DEBUG oslo_service.service [None req-c282a20c-719d-4b58-81e3-ad3fe1a9c2f4 - - - - - -] compute.max_disk_devices_to_attach = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:44:57 localhost nova_compute[286344]: 2025-12-15 09:44:57.830 286348 DEBUG oslo_service.service [None req-c282a20c-719d-4b58-81e3-ad3fe1a9c2f4 - - - - - -] compute.packing_host_numa_cells_allocation_strategy = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:44:57 localhost nova_compute[286344]: 2025-12-15 09:44:57.830 286348 DEBUG oslo_service.service [None req-c282a20c-719d-4b58-81e3-ad3fe1a9c2f4 - - - - - -] compute.provider_config_location = /etc/nova/provider_config/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:44:57 localhost nova_compute[286344]: 2025-12-15 09:44:57.830 286348 DEBUG oslo_service.service [None req-c282a20c-719d-4b58-81e3-ad3fe1a9c2f4 - - - - - -] 
compute.resource_provider_association_refresh = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:44:57 localhost nova_compute[286344]: 2025-12-15 09:44:57.831 286348 DEBUG oslo_service.service [None req-c282a20c-719d-4b58-81e3-ad3fe1a9c2f4 - - - - - -] compute.shutdown_retry_interval = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:44:57 localhost nova_compute[286344]: 2025-12-15 09:44:57.831 286348 DEBUG oslo_service.service [None req-c282a20c-719d-4b58-81e3-ad3fe1a9c2f4 - - - - - -] compute.vmdk_allowed_types = ['streamOptimized', 'monolithicSparse'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:44:57 localhost nova_compute[286344]: 2025-12-15 09:44:57.831 286348 DEBUG oslo_service.service [None req-c282a20c-719d-4b58-81e3-ad3fe1a9c2f4 - - - - - -] conductor.workers = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:44:57 localhost nova_compute[286344]: 2025-12-15 09:44:57.831 286348 DEBUG oslo_service.service [None req-c282a20c-719d-4b58-81e3-ad3fe1a9c2f4 - - - - - -] console.allowed_origins = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:44:57 localhost nova_compute[286344]: 2025-12-15 09:44:57.831 286348 DEBUG oslo_service.service [None req-c282a20c-719d-4b58-81e3-ad3fe1a9c2f4 - - - - - -] console.ssl_ciphers = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:44:57 localhost nova_compute[286344]: 2025-12-15 09:44:57.831 286348 DEBUG oslo_service.service [None req-c282a20c-719d-4b58-81e3-ad3fe1a9c2f4 - - - - - -] console.ssl_minimum_version = default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:44:57 localhost nova_compute[286344]: 2025-12-15 09:44:57.831 286348 DEBUG oslo_service.service [None req-c282a20c-719d-4b58-81e3-ad3fe1a9c2f4 - - - - - -] 
consoleauth.token_ttl = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:44:57 localhost nova_compute[286344]: 2025-12-15 09:44:57.832 286348 DEBUG oslo_service.service [None req-c282a20c-719d-4b58-81e3-ad3fe1a9c2f4 - - - - - -] cyborg.cafile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:44:57 localhost nova_compute[286344]: 2025-12-15 09:44:57.832 286348 DEBUG oslo_service.service [None req-c282a20c-719d-4b58-81e3-ad3fe1a9c2f4 - - - - - -] cyborg.certfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:44:57 localhost nova_compute[286344]: 2025-12-15 09:44:57.832 286348 DEBUG oslo_service.service [None req-c282a20c-719d-4b58-81e3-ad3fe1a9c2f4 - - - - - -] cyborg.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:44:57 localhost nova_compute[286344]: 2025-12-15 09:44:57.832 286348 DEBUG oslo_service.service [None req-c282a20c-719d-4b58-81e3-ad3fe1a9c2f4 - - - - - -] cyborg.connect_retries = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:44:57 localhost nova_compute[286344]: 2025-12-15 09:44:57.832 286348 DEBUG oslo_service.service [None req-c282a20c-719d-4b58-81e3-ad3fe1a9c2f4 - - - - - -] cyborg.connect_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:44:57 localhost nova_compute[286344]: 2025-12-15 09:44:57.832 286348 DEBUG oslo_service.service [None req-c282a20c-719d-4b58-81e3-ad3fe1a9c2f4 - - - - - -] cyborg.endpoint_override = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:44:57 localhost nova_compute[286344]: 2025-12-15 09:44:57.832 286348 DEBUG oslo_service.service [None req-c282a20c-719d-4b58-81e3-ad3fe1a9c2f4 - - - - - -] cyborg.insecure = False log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:44:57 localhost nova_compute[286344]: 2025-12-15 09:44:57.832 286348 DEBUG oslo_service.service [None req-c282a20c-719d-4b58-81e3-ad3fe1a9c2f4 - - - - - -] cyborg.keyfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:44:57 localhost nova_compute[286344]: 2025-12-15 09:44:57.833 286348 DEBUG oslo_service.service [None req-c282a20c-719d-4b58-81e3-ad3fe1a9c2f4 - - - - - -] cyborg.max_version = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:44:57 localhost nova_compute[286344]: 2025-12-15 09:44:57.833 286348 DEBUG oslo_service.service [None req-c282a20c-719d-4b58-81e3-ad3fe1a9c2f4 - - - - - -] cyborg.min_version = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:44:57 localhost nova_compute[286344]: 2025-12-15 09:44:57.833 286348 DEBUG oslo_service.service [None req-c282a20c-719d-4b58-81e3-ad3fe1a9c2f4 - - - - - -] cyborg.region_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:44:57 localhost nova_compute[286344]: 2025-12-15 09:44:57.833 286348 DEBUG oslo_service.service [None req-c282a20c-719d-4b58-81e3-ad3fe1a9c2f4 - - - - - -] cyborg.service_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:44:57 localhost nova_compute[286344]: 2025-12-15 09:44:57.833 286348 DEBUG oslo_service.service [None req-c282a20c-719d-4b58-81e3-ad3fe1a9c2f4 - - - - - -] cyborg.service_type = accelerator log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:44:57 localhost nova_compute[286344]: 2025-12-15 09:44:57.833 286348 DEBUG oslo_service.service [None req-c282a20c-719d-4b58-81e3-ad3fe1a9c2f4 - - - - - -] cyborg.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:44:57 
localhost nova_compute[286344]: 2025-12-15 09:44:57.834 286348 DEBUG oslo_service.service [None req-c282a20c-719d-4b58-81e3-ad3fe1a9c2f4 - - - - - -] cyborg.status_code_retries = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:44:57 localhost nova_compute[286344]: 2025-12-15 09:44:57.834 286348 DEBUG oslo_service.service [None req-c282a20c-719d-4b58-81e3-ad3fe1a9c2f4 - - - - - -] cyborg.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:44:57 localhost nova_compute[286344]: 2025-12-15 09:44:57.834 286348 DEBUG oslo_service.service [None req-c282a20c-719d-4b58-81e3-ad3fe1a9c2f4 - - - - - -] cyborg.timeout = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:44:57 localhost nova_compute[286344]: 2025-12-15 09:44:57.834 286348 DEBUG oslo_service.service [None req-c282a20c-719d-4b58-81e3-ad3fe1a9c2f4 - - - - - -] cyborg.valid_interfaces = ['internal', 'public'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:44:57 localhost nova_compute[286344]: 2025-12-15 09:44:57.834 286348 DEBUG oslo_service.service [None req-c282a20c-719d-4b58-81e3-ad3fe1a9c2f4 - - - - - -] cyborg.version = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:44:57 localhost nova_compute[286344]: 2025-12-15 09:44:57.834 286348 DEBUG oslo_service.service [None req-c282a20c-719d-4b58-81e3-ad3fe1a9c2f4 - - - - - -] database.backend = sqlalchemy log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:44:57 localhost nova_compute[286344]: 2025-12-15 09:44:57.834 286348 DEBUG oslo_service.service [None req-c282a20c-719d-4b58-81e3-ad3fe1a9c2f4 - - - - - -] database.connection = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:44:57 localhost nova_compute[286344]: 2025-12-15 09:44:57.835 
286348 DEBUG oslo_service.service [None req-c282a20c-719d-4b58-81e3-ad3fe1a9c2f4 - - - - - -] database.connection_debug = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:44:57 localhost nova_compute[286344]: 2025-12-15 09:44:57.835 286348 DEBUG oslo_service.service [None req-c282a20c-719d-4b58-81e3-ad3fe1a9c2f4 - - - - - -] database.connection_parameters = log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:44:57 localhost nova_compute[286344]: 2025-12-15 09:44:57.835 286348 DEBUG oslo_service.service [None req-c282a20c-719d-4b58-81e3-ad3fe1a9c2f4 - - - - - -] database.connection_recycle_time = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:44:57 localhost nova_compute[286344]: 2025-12-15 09:44:57.835 286348 DEBUG oslo_service.service [None req-c282a20c-719d-4b58-81e3-ad3fe1a9c2f4 - - - - - -] database.connection_trace = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:44:57 localhost nova_compute[286344]: 2025-12-15 09:44:57.835 286348 DEBUG oslo_service.service [None req-c282a20c-719d-4b58-81e3-ad3fe1a9c2f4 - - - - - -] database.db_inc_retry_interval = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:44:57 localhost nova_compute[286344]: 2025-12-15 09:44:57.835 286348 DEBUG oslo_service.service [None req-c282a20c-719d-4b58-81e3-ad3fe1a9c2f4 - - - - - -] database.db_max_retries = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:44:57 localhost nova_compute[286344]: 2025-12-15 09:44:57.835 286348 DEBUG oslo_service.service [None req-c282a20c-719d-4b58-81e3-ad3fe1a9c2f4 - - - - - -] database.db_max_retry_interval = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:44:57 localhost nova_compute[286344]: 2025-12-15 09:44:57.835 286348 DEBUG oslo_service.service 
[None req-c282a20c-719d-4b58-81e3-ad3fe1a9c2f4 - - - - - -] database.db_retry_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:44:57 localhost nova_compute[286344]: 2025-12-15 09:44:57.836 286348 DEBUG oslo_service.service [None req-c282a20c-719d-4b58-81e3-ad3fe1a9c2f4 - - - - - -] database.max_overflow = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:44:57 localhost nova_compute[286344]: 2025-12-15 09:44:57.836 286348 DEBUG oslo_service.service [None req-c282a20c-719d-4b58-81e3-ad3fe1a9c2f4 - - - - - -] database.max_pool_size = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:44:57 localhost nova_compute[286344]: 2025-12-15 09:44:57.836 286348 DEBUG oslo_service.service [None req-c282a20c-719d-4b58-81e3-ad3fe1a9c2f4 - - - - - -] database.max_retries = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:44:57 localhost nova_compute[286344]: 2025-12-15 09:44:57.836 286348 DEBUG oslo_service.service [None req-c282a20c-719d-4b58-81e3-ad3fe1a9c2f4 - - - - - -] database.mysql_enable_ndb = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:44:57 localhost nova_compute[286344]: 2025-12-15 09:44:57.836 286348 DEBUG oslo_service.service [None req-c282a20c-719d-4b58-81e3-ad3fe1a9c2f4 - - - - - -] database.mysql_sql_mode = TRADITIONAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:44:57 localhost nova_compute[286344]: 2025-12-15 09:44:57.836 286348 DEBUG oslo_service.service [None req-c282a20c-719d-4b58-81e3-ad3fe1a9c2f4 - - - - - -] database.mysql_wsrep_sync_wait = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:44:57 localhost nova_compute[286344]: 2025-12-15 09:44:57.836 286348 DEBUG oslo_service.service [None req-c282a20c-719d-4b58-81e3-ad3fe1a9c2f4 - - - - - 
-] database.pool_timeout = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:44:57 localhost nova_compute[286344]: 2025-12-15 09:44:57.837 286348 DEBUG oslo_service.service [None req-c282a20c-719d-4b58-81e3-ad3fe1a9c2f4 - - - - - -] database.retry_interval = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:44:57 localhost nova_compute[286344]: 2025-12-15 09:44:57.837 286348 DEBUG oslo_service.service [None req-c282a20c-719d-4b58-81e3-ad3fe1a9c2f4 - - - - - -] database.slave_connection = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:44:57 localhost nova_compute[286344]: 2025-12-15 09:44:57.837 286348 DEBUG oslo_service.service [None req-c282a20c-719d-4b58-81e3-ad3fe1a9c2f4 - - - - - -] database.sqlite_synchronous = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:44:57 localhost nova_compute[286344]: 2025-12-15 09:44:57.837 286348 DEBUG oslo_service.service [None req-c282a20c-719d-4b58-81e3-ad3fe1a9c2f4 - - - - - -] api_database.backend = sqlalchemy log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:44:57 localhost nova_compute[286344]: 2025-12-15 09:44:57.837 286348 DEBUG oslo_service.service [None req-c282a20c-719d-4b58-81e3-ad3fe1a9c2f4 - - - - - -] api_database.connection = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:44:57 localhost nova_compute[286344]: 2025-12-15 09:44:57.837 286348 DEBUG oslo_service.service [None req-c282a20c-719d-4b58-81e3-ad3fe1a9c2f4 - - - - - -] api_database.connection_debug = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:44:57 localhost nova_compute[286344]: 2025-12-15 09:44:57.837 286348 DEBUG oslo_service.service [None req-c282a20c-719d-4b58-81e3-ad3fe1a9c2f4 - - - - - -] api_database.connection_parameters = 
log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:44:57 localhost nova_compute[286344]: 2025-12-15 09:44:57.838 286348 DEBUG oslo_service.service [None req-c282a20c-719d-4b58-81e3-ad3fe1a9c2f4 - - - - - -] api_database.connection_recycle_time = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:44:57 localhost nova_compute[286344]: 2025-12-15 09:44:57.838 286348 DEBUG oslo_service.service [None req-c282a20c-719d-4b58-81e3-ad3fe1a9c2f4 - - - - - -] api_database.connection_trace = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:44:57 localhost nova_compute[286344]: 2025-12-15 09:44:57.838 286348 DEBUG oslo_service.service [None req-c282a20c-719d-4b58-81e3-ad3fe1a9c2f4 - - - - - -] api_database.db_inc_retry_interval = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:44:57 localhost nova_compute[286344]: 2025-12-15 09:44:57.838 286348 DEBUG oslo_service.service [None req-c282a20c-719d-4b58-81e3-ad3fe1a9c2f4 - - - - - -] api_database.db_max_retries = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:44:57 localhost nova_compute[286344]: 2025-12-15 09:44:57.838 286348 DEBUG oslo_service.service [None req-c282a20c-719d-4b58-81e3-ad3fe1a9c2f4 - - - - - -] api_database.db_max_retry_interval = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:44:57 localhost nova_compute[286344]: 2025-12-15 09:44:57.838 286348 DEBUG oslo_service.service [None req-c282a20c-719d-4b58-81e3-ad3fe1a9c2f4 - - - - - -] api_database.db_retry_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:44:57 localhost nova_compute[286344]: 2025-12-15 09:44:57.838 286348 DEBUG oslo_service.service [None req-c282a20c-719d-4b58-81e3-ad3fe1a9c2f4 - - - - - -] api_database.max_overflow = 50 log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:44:57 localhost nova_compute[286344]: 2025-12-15 09:44:57.838 286348 DEBUG oslo_service.service [None req-c282a20c-719d-4b58-81e3-ad3fe1a9c2f4 - - - - - -] api_database.max_pool_size = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:44:57 localhost nova_compute[286344]: 2025-12-15 09:44:57.839 286348 DEBUG oslo_service.service [None req-c282a20c-719d-4b58-81e3-ad3fe1a9c2f4 - - - - - -] api_database.max_retries = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:44:57 localhost nova_compute[286344]: 2025-12-15 09:44:57.839 286348 DEBUG oslo_service.service [None req-c282a20c-719d-4b58-81e3-ad3fe1a9c2f4 - - - - - -] api_database.mysql_enable_ndb = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:44:57 localhost nova_compute[286344]: 2025-12-15 09:44:57.839 286348 DEBUG oslo_service.service [None req-c282a20c-719d-4b58-81e3-ad3fe1a9c2f4 - - - - - -] api_database.mysql_sql_mode = TRADITIONAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:44:57 localhost nova_compute[286344]: 2025-12-15 09:44:57.839 286348 DEBUG oslo_service.service [None req-c282a20c-719d-4b58-81e3-ad3fe1a9c2f4 - - - - - -] api_database.mysql_wsrep_sync_wait = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:44:57 localhost nova_compute[286344]: 2025-12-15 09:44:57.839 286348 DEBUG oslo_service.service [None req-c282a20c-719d-4b58-81e3-ad3fe1a9c2f4 - - - - - -] api_database.pool_timeout = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:44:57 localhost nova_compute[286344]: 2025-12-15 09:44:57.839 286348 DEBUG oslo_service.service [None req-c282a20c-719d-4b58-81e3-ad3fe1a9c2f4 - - - - - -] api_database.retry_interval = 10 log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:44:57 localhost nova_compute[286344]: 2025-12-15 09:44:57.839 286348 DEBUG oslo_service.service [None req-c282a20c-719d-4b58-81e3-ad3fe1a9c2f4 - - - - - -] api_database.slave_connection = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:44:57 localhost nova_compute[286344]: 2025-12-15 09:44:57.840 286348 DEBUG oslo_service.service [None req-c282a20c-719d-4b58-81e3-ad3fe1a9c2f4 - - - - - -] api_database.sqlite_synchronous = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:44:57 localhost nova_compute[286344]: 2025-12-15 09:44:57.840 286348 DEBUG oslo_service.service [None req-c282a20c-719d-4b58-81e3-ad3fe1a9c2f4 - - - - - -] devices.enabled_mdev_types = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:44:57 localhost nova_compute[286344]: 2025-12-15 09:44:57.840 286348 DEBUG oslo_service.service [None req-c282a20c-719d-4b58-81e3-ad3fe1a9c2f4 - - - - - -] ephemeral_storage_encryption.cipher = aes-xts-plain64 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:44:57 localhost nova_compute[286344]: 2025-12-15 09:44:57.840 286348 DEBUG oslo_service.service [None req-c282a20c-719d-4b58-81e3-ad3fe1a9c2f4 - - - - - -] ephemeral_storage_encryption.enabled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:44:57 localhost nova_compute[286344]: 2025-12-15 09:44:57.840 286348 DEBUG oslo_service.service [None req-c282a20c-719d-4b58-81e3-ad3fe1a9c2f4 - - - - - -] ephemeral_storage_encryption.key_size = 512 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:44:57 localhost nova_compute[286344]: 2025-12-15 09:44:57.840 286348 DEBUG oslo_service.service [None req-c282a20c-719d-4b58-81e3-ad3fe1a9c2f4 - - - - - -] glance.api_servers = None log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:44:57 localhost nova_compute[286344]: 2025-12-15 09:44:57.840 286348 DEBUG oslo_service.service [None req-c282a20c-719d-4b58-81e3-ad3fe1a9c2f4 - - - - - -] glance.cafile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:44:57 localhost nova_compute[286344]: 2025-12-15 09:44:57.841 286348 DEBUG oslo_service.service [None req-c282a20c-719d-4b58-81e3-ad3fe1a9c2f4 - - - - - -] glance.certfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:44:57 localhost nova_compute[286344]: 2025-12-15 09:44:57.841 286348 DEBUG oslo_service.service [None req-c282a20c-719d-4b58-81e3-ad3fe1a9c2f4 - - - - - -] glance.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:44:57 localhost nova_compute[286344]: 2025-12-15 09:44:57.841 286348 DEBUG oslo_service.service [None req-c282a20c-719d-4b58-81e3-ad3fe1a9c2f4 - - - - - -] glance.connect_retries = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:44:57 localhost nova_compute[286344]: 2025-12-15 09:44:57.841 286348 DEBUG oslo_service.service [None req-c282a20c-719d-4b58-81e3-ad3fe1a9c2f4 - - - - - -] glance.connect_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:44:57 localhost nova_compute[286344]: 2025-12-15 09:44:57.841 286348 DEBUG oslo_service.service [None req-c282a20c-719d-4b58-81e3-ad3fe1a9c2f4 - - - - - -] glance.debug = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:44:57 localhost nova_compute[286344]: 2025-12-15 09:44:57.841 286348 DEBUG oslo_service.service [None req-c282a20c-719d-4b58-81e3-ad3fe1a9c2f4 - - - - - -] glance.default_trusted_certificate_ids = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 
04:44:57 localhost nova_compute[286344]: 2025-12-15 09:44:57.841 286348 DEBUG oslo_service.service [None req-c282a20c-719d-4b58-81e3-ad3fe1a9c2f4 - - - - - -] glance.enable_certificate_validation = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:44:57 localhost nova_compute[286344]: 2025-12-15 09:44:57.842 286348 DEBUG oslo_service.service [None req-c282a20c-719d-4b58-81e3-ad3fe1a9c2f4 - - - - - -] glance.enable_rbd_download = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:44:57 localhost nova_compute[286344]: 2025-12-15 09:44:57.842 286348 DEBUG oslo_service.service [None req-c282a20c-719d-4b58-81e3-ad3fe1a9c2f4 - - - - - -] glance.endpoint_override = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:44:57 localhost nova_compute[286344]: 2025-12-15 09:44:57.842 286348 DEBUG oslo_service.service [None req-c282a20c-719d-4b58-81e3-ad3fe1a9c2f4 - - - - - -] glance.insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:44:57 localhost nova_compute[286344]: 2025-12-15 09:44:57.842 286348 DEBUG oslo_service.service [None req-c282a20c-719d-4b58-81e3-ad3fe1a9c2f4 - - - - - -] glance.keyfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:44:57 localhost nova_compute[286344]: 2025-12-15 09:44:57.842 286348 DEBUG oslo_service.service [None req-c282a20c-719d-4b58-81e3-ad3fe1a9c2f4 - - - - - -] glance.max_version = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:44:57 localhost nova_compute[286344]: 2025-12-15 09:44:57.842 286348 DEBUG oslo_service.service [None req-c282a20c-719d-4b58-81e3-ad3fe1a9c2f4 - - - - - -] glance.min_version = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:44:57 localhost nova_compute[286344]: 2025-12-15 09:44:57.842 
286348 DEBUG oslo_service.service [None req-c282a20c-719d-4b58-81e3-ad3fe1a9c2f4 - - - - - -] glance.num_retries = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:44:57 localhost nova_compute[286344]: 2025-12-15 09:44:57.842 286348 DEBUG oslo_service.service [None req-c282a20c-719d-4b58-81e3-ad3fe1a9c2f4 - - - - - -] glance.rbd_ceph_conf = log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:44:57 localhost nova_compute[286344]: 2025-12-15 09:44:57.843 286348 DEBUG oslo_service.service [None req-c282a20c-719d-4b58-81e3-ad3fe1a9c2f4 - - - - - -] glance.rbd_connect_timeout = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:44:57 localhost nova_compute[286344]: 2025-12-15 09:44:57.843 286348 DEBUG oslo_service.service [None req-c282a20c-719d-4b58-81e3-ad3fe1a9c2f4 - - - - - -] glance.rbd_pool = log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:44:57 localhost nova_compute[286344]: 2025-12-15 09:44:57.843 286348 DEBUG oslo_service.service [None req-c282a20c-719d-4b58-81e3-ad3fe1a9c2f4 - - - - - -] glance.rbd_user = log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:44:57 localhost nova_compute[286344]: 2025-12-15 09:44:57.843 286348 DEBUG oslo_service.service [None req-c282a20c-719d-4b58-81e3-ad3fe1a9c2f4 - - - - - -] glance.region_name = regionOne log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:44:57 localhost nova_compute[286344]: 2025-12-15 09:44:57.843 286348 DEBUG oslo_service.service [None req-c282a20c-719d-4b58-81e3-ad3fe1a9c2f4 - - - - - -] glance.service_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:44:57 localhost nova_compute[286344]: 2025-12-15 09:44:57.843 286348 DEBUG oslo_service.service [None req-c282a20c-719d-4b58-81e3-ad3fe1a9c2f4 - - - - - -] 
glance.service_type = image log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:44:57 localhost nova_compute[286344]: 2025-12-15 09:44:57.843 286348 DEBUG oslo_service.service [None req-c282a20c-719d-4b58-81e3-ad3fe1a9c2f4 - - - - - -] glance.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:44:57 localhost nova_compute[286344]: 2025-12-15 09:44:57.844 286348 DEBUG oslo_service.service [None req-c282a20c-719d-4b58-81e3-ad3fe1a9c2f4 - - - - - -] glance.status_code_retries = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:44:57 localhost nova_compute[286344]: 2025-12-15 09:44:57.844 286348 DEBUG oslo_service.service [None req-c282a20c-719d-4b58-81e3-ad3fe1a9c2f4 - - - - - -] glance.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:44:57 localhost nova_compute[286344]: 2025-12-15 09:44:57.844 286348 DEBUG oslo_service.service [None req-c282a20c-719d-4b58-81e3-ad3fe1a9c2f4 - - - - - -] glance.timeout = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:44:57 localhost nova_compute[286344]: 2025-12-15 09:44:57.844 286348 DEBUG oslo_service.service [None req-c282a20c-719d-4b58-81e3-ad3fe1a9c2f4 - - - - - -] glance.valid_interfaces = ['internal'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:44:57 localhost nova_compute[286344]: 2025-12-15 09:44:57.844 286348 DEBUG oslo_service.service [None req-c282a20c-719d-4b58-81e3-ad3fe1a9c2f4 - - - - - -] glance.verify_glance_signatures = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:44:57 localhost nova_compute[286344]: 2025-12-15 09:44:57.844 286348 DEBUG oslo_service.service [None req-c282a20c-719d-4b58-81e3-ad3fe1a9c2f4 - - - - - -] glance.version = None log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:44:57 localhost nova_compute[286344]: 2025-12-15 09:44:57.844 286348 DEBUG oslo_service.service [None req-c282a20c-719d-4b58-81e3-ad3fe1a9c2f4 - - - - - -] guestfs.debug = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:44:57 localhost nova_compute[286344]: 2025-12-15 09:44:57.844 286348 DEBUG oslo_service.service [None req-c282a20c-719d-4b58-81e3-ad3fe1a9c2f4 - - - - - -] hyperv.config_drive_cdrom = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:44:57 localhost nova_compute[286344]: 2025-12-15 09:44:57.845 286348 DEBUG oslo_service.service [None req-c282a20c-719d-4b58-81e3-ad3fe1a9c2f4 - - - - - -] hyperv.config_drive_inject_password = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:44:57 localhost nova_compute[286344]: 2025-12-15 09:44:57.845 286348 DEBUG oslo_service.service [None req-c282a20c-719d-4b58-81e3-ad3fe1a9c2f4 - - - - - -] hyperv.dynamic_memory_ratio = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:44:57 localhost nova_compute[286344]: 2025-12-15 09:44:57.845 286348 DEBUG oslo_service.service [None req-c282a20c-719d-4b58-81e3-ad3fe1a9c2f4 - - - - - -] hyperv.enable_instance_metrics_collection = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:44:57 localhost nova_compute[286344]: 2025-12-15 09:44:57.845 286348 DEBUG oslo_service.service [None req-c282a20c-719d-4b58-81e3-ad3fe1a9c2f4 - - - - - -] hyperv.enable_remotefx = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:44:57 localhost nova_compute[286344]: 2025-12-15 09:44:57.845 286348 DEBUG oslo_service.service [None req-c282a20c-719d-4b58-81e3-ad3fe1a9c2f4 - - - - - -] hyperv.instances_path_share = log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:44:57 localhost nova_compute[286344]: 2025-12-15 09:44:57.845 286348 DEBUG oslo_service.service [None req-c282a20c-719d-4b58-81e3-ad3fe1a9c2f4 - - - - - -] hyperv.iscsi_initiator_list = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:44:57 localhost nova_compute[286344]: 2025-12-15 09:44:57.845 286348 DEBUG oslo_service.service [None req-c282a20c-719d-4b58-81e3-ad3fe1a9c2f4 - - - - - -] hyperv.limit_cpu_features = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:44:57 localhost nova_compute[286344]: 2025-12-15 09:44:57.846 286348 DEBUG oslo_service.service [None req-c282a20c-719d-4b58-81e3-ad3fe1a9c2f4 - - - - - -] hyperv.mounted_disk_query_retry_count = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:44:57 localhost nova_compute[286344]: 2025-12-15 09:44:57.846 286348 DEBUG oslo_service.service [None req-c282a20c-719d-4b58-81e3-ad3fe1a9c2f4 - - - - - -] hyperv.mounted_disk_query_retry_interval = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:44:57 localhost nova_compute[286344]: 2025-12-15 09:44:57.846 286348 DEBUG oslo_service.service [None req-c282a20c-719d-4b58-81e3-ad3fe1a9c2f4 - - - - - -] hyperv.power_state_check_timeframe = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:44:57 localhost nova_compute[286344]: 2025-12-15 09:44:57.846 286348 DEBUG oslo_service.service [None req-c282a20c-719d-4b58-81e3-ad3fe1a9c2f4 - - - - - -] hyperv.power_state_event_polling_interval = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:44:57 localhost nova_compute[286344]: 2025-12-15 09:44:57.846 286348 DEBUG oslo_service.service [None req-c282a20c-719d-4b58-81e3-ad3fe1a9c2f4 - - - - - -] hyperv.qemu_img_cmd = qemu-img.exe log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:44:57 localhost nova_compute[286344]: 2025-12-15 09:44:57.846 286348 DEBUG oslo_service.service [None req-c282a20c-719d-4b58-81e3-ad3fe1a9c2f4 - - - - - -] hyperv.use_multipath_io = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:44:57 localhost nova_compute[286344]: 2025-12-15 09:44:57.846 286348 DEBUG oslo_service.service [None req-c282a20c-719d-4b58-81e3-ad3fe1a9c2f4 - - - - - -] hyperv.volume_attach_retry_count = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:44:57 localhost nova_compute[286344]: 2025-12-15 09:44:57.846 286348 DEBUG oslo_service.service [None req-c282a20c-719d-4b58-81e3-ad3fe1a9c2f4 - - - - - -] hyperv.volume_attach_retry_interval = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:44:57 localhost nova_compute[286344]: 2025-12-15 09:44:57.847 286348 DEBUG oslo_service.service [None req-c282a20c-719d-4b58-81e3-ad3fe1a9c2f4 - - - - - -] hyperv.vswitch_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:44:57 localhost nova_compute[286344]: 2025-12-15 09:44:57.847 286348 DEBUG oslo_service.service [None req-c282a20c-719d-4b58-81e3-ad3fe1a9c2f4 - - - - - -] hyperv.wait_soft_reboot_seconds = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:44:57 localhost nova_compute[286344]: 2025-12-15 09:44:57.847 286348 DEBUG oslo_service.service [None req-c282a20c-719d-4b58-81e3-ad3fe1a9c2f4 - - - - - -] mks.enabled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:44:57 localhost nova_compute[286344]: 2025-12-15 09:44:57.847 286348 DEBUG oslo_service.service [None req-c282a20c-719d-4b58-81e3-ad3fe1a9c2f4 - - - - - -] mks.mksproxy_base_url = http://127.0.0.1:6090/ log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:44:57 localhost nova_compute[286344]: 2025-12-15 09:44:57.847 286348 DEBUG oslo_service.service [None req-c282a20c-719d-4b58-81e3-ad3fe1a9c2f4 - - - - - -] image_cache.manager_interval = 2400 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:44:57 localhost nova_compute[286344]: 2025-12-15 09:44:57.847 286348 DEBUG oslo_service.service [None req-c282a20c-719d-4b58-81e3-ad3fe1a9c2f4 - - - - - -] image_cache.precache_concurrency = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:44:57 localhost nova_compute[286344]: 2025-12-15 09:44:57.848 286348 DEBUG oslo_service.service [None req-c282a20c-719d-4b58-81e3-ad3fe1a9c2f4 - - - - - -] image_cache.remove_unused_base_images = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:44:57 localhost nova_compute[286344]: 2025-12-15 09:44:57.848 286348 DEBUG oslo_service.service [None req-c282a20c-719d-4b58-81e3-ad3fe1a9c2f4 - - - - - -] image_cache.remove_unused_original_minimum_age_seconds = 86400 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:44:57 localhost nova_compute[286344]: 2025-12-15 09:44:57.848 286348 DEBUG oslo_service.service [None req-c282a20c-719d-4b58-81e3-ad3fe1a9c2f4 - - - - - -] image_cache.remove_unused_resized_minimum_age_seconds = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:44:57 localhost nova_compute[286344]: 2025-12-15 09:44:57.848 286348 DEBUG oslo_service.service [None req-c282a20c-719d-4b58-81e3-ad3fe1a9c2f4 - - - - - -] image_cache.subdirectory_name = _base log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:44:57 localhost nova_compute[286344]: 2025-12-15 09:44:57.848 286348 DEBUG oslo_service.service [None req-c282a20c-719d-4b58-81e3-ad3fe1a9c2f4 - - - - - -] 
ironic.api_max_retries = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:44:57 localhost nova_compute[286344]: 2025-12-15 09:44:57.848 286348 DEBUG oslo_service.service [None req-c282a20c-719d-4b58-81e3-ad3fe1a9c2f4 - - - - - -] ironic.api_retry_interval = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:44:57 localhost nova_compute[286344]: 2025-12-15 09:44:57.848 286348 DEBUG oslo_service.service [None req-c282a20c-719d-4b58-81e3-ad3fe1a9c2f4 - - - - - -] ironic.auth_section = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:44:57 localhost nova_compute[286344]: 2025-12-15 09:44:57.849 286348 DEBUG oslo_service.service [None req-c282a20c-719d-4b58-81e3-ad3fe1a9c2f4 - - - - - -] ironic.auth_type = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:44:57 localhost nova_compute[286344]: 2025-12-15 09:44:57.849 286348 DEBUG oslo_service.service [None req-c282a20c-719d-4b58-81e3-ad3fe1a9c2f4 - - - - - -] ironic.cafile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:44:57 localhost nova_compute[286344]: 2025-12-15 09:44:57.849 286348 DEBUG oslo_service.service [None req-c282a20c-719d-4b58-81e3-ad3fe1a9c2f4 - - - - - -] ironic.certfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:44:57 localhost nova_compute[286344]: 2025-12-15 09:44:57.849 286348 DEBUG oslo_service.service [None req-c282a20c-719d-4b58-81e3-ad3fe1a9c2f4 - - - - - -] ironic.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:44:57 localhost nova_compute[286344]: 2025-12-15 09:44:57.849 286348 DEBUG oslo_service.service [None req-c282a20c-719d-4b58-81e3-ad3fe1a9c2f4 - - - - - -] ironic.connect_retries = None log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:44:57 localhost nova_compute[286344]: 2025-12-15 09:44:57.849 286348 DEBUG oslo_service.service [None req-c282a20c-719d-4b58-81e3-ad3fe1a9c2f4 - - - - - -] ironic.connect_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:44:57 localhost nova_compute[286344]: 2025-12-15 09:44:57.849 286348 DEBUG oslo_service.service [None req-c282a20c-719d-4b58-81e3-ad3fe1a9c2f4 - - - - - -] ironic.endpoint_override = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:44:57 localhost nova_compute[286344]: 2025-12-15 09:44:57.849 286348 DEBUG oslo_service.service [None req-c282a20c-719d-4b58-81e3-ad3fe1a9c2f4 - - - - - -] ironic.insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:44:57 localhost nova_compute[286344]: 2025-12-15 09:44:57.850 286348 DEBUG oslo_service.service [None req-c282a20c-719d-4b58-81e3-ad3fe1a9c2f4 - - - - - -] ironic.keyfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:44:57 localhost nova_compute[286344]: 2025-12-15 09:44:57.850 286348 DEBUG oslo_service.service [None req-c282a20c-719d-4b58-81e3-ad3fe1a9c2f4 - - - - - -] ironic.max_version = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:44:57 localhost nova_compute[286344]: 2025-12-15 09:44:57.850 286348 DEBUG oslo_service.service [None req-c282a20c-719d-4b58-81e3-ad3fe1a9c2f4 - - - - - -] ironic.min_version = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:44:57 localhost nova_compute[286344]: 2025-12-15 09:44:57.850 286348 DEBUG oslo_service.service [None req-c282a20c-719d-4b58-81e3-ad3fe1a9c2f4 - - - - - -] ironic.partition_key = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:44:57 
localhost nova_compute[286344]: 2025-12-15 09:44:57.850 286348 DEBUG oslo_service.service [None req-c282a20c-719d-4b58-81e3-ad3fe1a9c2f4 - - - - - -] ironic.peer_list = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:44:57 localhost nova_compute[286344]: 2025-12-15 09:44:57.850 286348 DEBUG oslo_service.service [None req-c282a20c-719d-4b58-81e3-ad3fe1a9c2f4 - - - - - -] ironic.region_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:44:57 localhost nova_compute[286344]: 2025-12-15 09:44:57.850 286348 DEBUG oslo_service.service [None req-c282a20c-719d-4b58-81e3-ad3fe1a9c2f4 - - - - - -] ironic.serial_console_state_timeout = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:44:57 localhost nova_compute[286344]: 2025-12-15 09:44:57.850 286348 DEBUG oslo_service.service [None req-c282a20c-719d-4b58-81e3-ad3fe1a9c2f4 - - - - - -] ironic.service_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:44:57 localhost nova_compute[286344]: 2025-12-15 09:44:57.851 286348 DEBUG oslo_service.service [None req-c282a20c-719d-4b58-81e3-ad3fe1a9c2f4 - - - - - -] ironic.service_type = baremetal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:44:57 localhost nova_compute[286344]: 2025-12-15 09:44:57.851 286348 DEBUG oslo_service.service [None req-c282a20c-719d-4b58-81e3-ad3fe1a9c2f4 - - - - - -] ironic.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:44:57 localhost nova_compute[286344]: 2025-12-15 09:44:57.851 286348 DEBUG oslo_service.service [None req-c282a20c-719d-4b58-81e3-ad3fe1a9c2f4 - - - - - -] ironic.status_code_retries = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:44:57 localhost nova_compute[286344]: 2025-12-15 09:44:57.851 286348 
DEBUG oslo_service.service [None req-c282a20c-719d-4b58-81e3-ad3fe1a9c2f4 - - - - - -] ironic.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:44:57 localhost nova_compute[286344]: 2025-12-15 09:44:57.851 286348 DEBUG oslo_service.service [None req-c282a20c-719d-4b58-81e3-ad3fe1a9c2f4 - - - - - -] ironic.timeout = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:44:57 localhost nova_compute[286344]: 2025-12-15 09:44:57.851 286348 DEBUG oslo_service.service [None req-c282a20c-719d-4b58-81e3-ad3fe1a9c2f4 - - - - - -] ironic.valid_interfaces = ['internal', 'public'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:44:57 localhost nova_compute[286344]: 2025-12-15 09:44:57.851 286348 DEBUG oslo_service.service [None req-c282a20c-719d-4b58-81e3-ad3fe1a9c2f4 - - - - - -] ironic.version = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:44:57 localhost nova_compute[286344]: 2025-12-15 09:44:57.852 286348 DEBUG oslo_service.service [None req-c282a20c-719d-4b58-81e3-ad3fe1a9c2f4 - - - - - -] key_manager.backend = barbican log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:44:57 localhost nova_compute[286344]: 2025-12-15 09:44:57.852 286348 DEBUG oslo_service.service [None req-c282a20c-719d-4b58-81e3-ad3fe1a9c2f4 - - - - - -] key_manager.fixed_key = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:44:57 localhost nova_compute[286344]: 2025-12-15 09:44:57.852 286348 DEBUG oslo_service.service [None req-c282a20c-719d-4b58-81e3-ad3fe1a9c2f4 - - - - - -] barbican.auth_endpoint = http://localhost/identity/v3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:44:57 localhost nova_compute[286344]: 2025-12-15 09:44:57.852 286348 DEBUG oslo_service.service [None 
req-c282a20c-719d-4b58-81e3-ad3fe1a9c2f4 - - - - - -] barbican.barbican_api_version = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:44:57 localhost nova_compute[286344]: 2025-12-15 09:44:57.852 286348 DEBUG oslo_service.service [None req-c282a20c-719d-4b58-81e3-ad3fe1a9c2f4 - - - - - -] barbican.barbican_endpoint = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:44:57 localhost nova_compute[286344]: 2025-12-15 09:44:57.852 286348 DEBUG oslo_service.service [None req-c282a20c-719d-4b58-81e3-ad3fe1a9c2f4 - - - - - -] barbican.barbican_endpoint_type = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:44:57 localhost nova_compute[286344]: 2025-12-15 09:44:57.852 286348 DEBUG oslo_service.service [None req-c282a20c-719d-4b58-81e3-ad3fe1a9c2f4 - - - - - -] barbican.barbican_region_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:44:57 localhost nova_compute[286344]: 2025-12-15 09:44:57.852 286348 DEBUG oslo_service.service [None req-c282a20c-719d-4b58-81e3-ad3fe1a9c2f4 - - - - - -] barbican.cafile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:44:57 localhost nova_compute[286344]: 2025-12-15 09:44:57.853 286348 DEBUG oslo_service.service [None req-c282a20c-719d-4b58-81e3-ad3fe1a9c2f4 - - - - - -] barbican.certfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:44:57 localhost nova_compute[286344]: 2025-12-15 09:44:57.853 286348 DEBUG oslo_service.service [None req-c282a20c-719d-4b58-81e3-ad3fe1a9c2f4 - - - - - -] barbican.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:44:57 localhost nova_compute[286344]: 2025-12-15 09:44:57.853 286348 DEBUG oslo_service.service [None req-c282a20c-719d-4b58-81e3-ad3fe1a9c2f4 - - - 
- - -] barbican.insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:44:57 localhost nova_compute[286344]: 2025-12-15 09:44:57.853 286348 DEBUG oslo_service.service [None req-c282a20c-719d-4b58-81e3-ad3fe1a9c2f4 - - - - - -] barbican.keyfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:44:57 localhost nova_compute[286344]: 2025-12-15 09:44:57.853 286348 DEBUG oslo_service.service [None req-c282a20c-719d-4b58-81e3-ad3fe1a9c2f4 - - - - - -] barbican.number_of_retries = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:44:57 localhost nova_compute[286344]: 2025-12-15 09:44:57.853 286348 DEBUG oslo_service.service [None req-c282a20c-719d-4b58-81e3-ad3fe1a9c2f4 - - - - - -] barbican.retry_delay = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:44:57 localhost nova_compute[286344]: 2025-12-15 09:44:57.853 286348 DEBUG oslo_service.service [None req-c282a20c-719d-4b58-81e3-ad3fe1a9c2f4 - - - - - -] barbican.send_service_user_token = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:44:57 localhost nova_compute[286344]: 2025-12-15 09:44:57.854 286348 DEBUG oslo_service.service [None req-c282a20c-719d-4b58-81e3-ad3fe1a9c2f4 - - - - - -] barbican.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:44:57 localhost nova_compute[286344]: 2025-12-15 09:44:57.854 286348 DEBUG oslo_service.service [None req-c282a20c-719d-4b58-81e3-ad3fe1a9c2f4 - - - - - -] barbican.timeout = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:44:57 localhost nova_compute[286344]: 2025-12-15 09:44:57.854 286348 DEBUG oslo_service.service [None req-c282a20c-719d-4b58-81e3-ad3fe1a9c2f4 - - - - - -] barbican.verify_ssl = True log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:44:57 localhost nova_compute[286344]: 2025-12-15 09:44:57.854 286348 DEBUG oslo_service.service [None req-c282a20c-719d-4b58-81e3-ad3fe1a9c2f4 - - - - - -] barbican.verify_ssl_path = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:44:57 localhost nova_compute[286344]: 2025-12-15 09:44:57.854 286348 DEBUG oslo_service.service [None req-c282a20c-719d-4b58-81e3-ad3fe1a9c2f4 - - - - - -] barbican_service_user.auth_section = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:44:57 localhost nova_compute[286344]: 2025-12-15 09:44:57.854 286348 DEBUG oslo_service.service [None req-c282a20c-719d-4b58-81e3-ad3fe1a9c2f4 - - - - - -] barbican_service_user.auth_type = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:44:57 localhost nova_compute[286344]: 2025-12-15 09:44:57.854 286348 DEBUG oslo_service.service [None req-c282a20c-719d-4b58-81e3-ad3fe1a9c2f4 - - - - - -] barbican_service_user.cafile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:44:57 localhost nova_compute[286344]: 2025-12-15 09:44:57.854 286348 DEBUG oslo_service.service [None req-c282a20c-719d-4b58-81e3-ad3fe1a9c2f4 - - - - - -] barbican_service_user.certfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:44:57 localhost nova_compute[286344]: 2025-12-15 09:44:57.855 286348 DEBUG oslo_service.service [None req-c282a20c-719d-4b58-81e3-ad3fe1a9c2f4 - - - - - -] barbican_service_user.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:44:57 localhost nova_compute[286344]: 2025-12-15 09:44:57.855 286348 DEBUG oslo_service.service [None req-c282a20c-719d-4b58-81e3-ad3fe1a9c2f4 - - - - - -] barbican_service_user.insecure = False log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:44:57 localhost nova_compute[286344]: 2025-12-15 09:44:57.855 286348 DEBUG oslo_service.service [None req-c282a20c-719d-4b58-81e3-ad3fe1a9c2f4 - - - - - -] barbican_service_user.keyfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:44:57 localhost nova_compute[286344]: 2025-12-15 09:44:57.855 286348 DEBUG oslo_service.service [None req-c282a20c-719d-4b58-81e3-ad3fe1a9c2f4 - - - - - -] barbican_service_user.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:44:57 localhost nova_compute[286344]: 2025-12-15 09:44:57.855 286348 DEBUG oslo_service.service [None req-c282a20c-719d-4b58-81e3-ad3fe1a9c2f4 - - - - - -] barbican_service_user.timeout = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:44:57 localhost nova_compute[286344]: 2025-12-15 09:44:57.855 286348 DEBUG oslo_service.service [None req-c282a20c-719d-4b58-81e3-ad3fe1a9c2f4 - - - - - -] vault.approle_role_id = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:44:57 localhost nova_compute[286344]: 2025-12-15 09:44:57.855 286348 DEBUG oslo_service.service [None req-c282a20c-719d-4b58-81e3-ad3fe1a9c2f4 - - - - - -] vault.approle_secret_id = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:44:57 localhost nova_compute[286344]: 2025-12-15 09:44:57.855 286348 DEBUG oslo_service.service [None req-c282a20c-719d-4b58-81e3-ad3fe1a9c2f4 - - - - - -] vault.cafile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:44:57 localhost nova_compute[286344]: 2025-12-15 09:44:57.856 286348 DEBUG oslo_service.service [None req-c282a20c-719d-4b58-81e3-ad3fe1a9c2f4 - - - - - -] vault.certfile = None log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:44:57 localhost nova_compute[286344]: 2025-12-15 09:44:57.856 286348 DEBUG oslo_service.service [None req-c282a20c-719d-4b58-81e3-ad3fe1a9c2f4 - - - - - -] vault.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:44:57 localhost nova_compute[286344]: 2025-12-15 09:44:57.856 286348 DEBUG oslo_service.service [None req-c282a20c-719d-4b58-81e3-ad3fe1a9c2f4 - - - - - -] vault.insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:44:57 localhost nova_compute[286344]: 2025-12-15 09:44:57.856 286348 DEBUG oslo_service.service [None req-c282a20c-719d-4b58-81e3-ad3fe1a9c2f4 - - - - - -] vault.keyfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:44:57 localhost nova_compute[286344]: 2025-12-15 09:44:57.856 286348 DEBUG oslo_service.service [None req-c282a20c-719d-4b58-81e3-ad3fe1a9c2f4 - - - - - -] vault.kv_mountpoint = secret log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:44:57 localhost nova_compute[286344]: 2025-12-15 09:44:57.856 286348 DEBUG oslo_service.service [None req-c282a20c-719d-4b58-81e3-ad3fe1a9c2f4 - - - - - -] vault.kv_version = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:44:57 localhost nova_compute[286344]: 2025-12-15 09:44:57.856 286348 DEBUG oslo_service.service [None req-c282a20c-719d-4b58-81e3-ad3fe1a9c2f4 - - - - - -] vault.namespace = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:44:57 localhost nova_compute[286344]: 2025-12-15 09:44:57.856 286348 DEBUG oslo_service.service [None req-c282a20c-719d-4b58-81e3-ad3fe1a9c2f4 - - - - - -] vault.root_token_id = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:44:57 localhost 
nova_compute[286344]: 2025-12-15 09:44:57.857 286348 DEBUG oslo_service.service [None req-c282a20c-719d-4b58-81e3-ad3fe1a9c2f4 - - - - - -] vault.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:44:57 localhost nova_compute[286344]: 2025-12-15 09:44:57.857 286348 DEBUG oslo_service.service [None req-c282a20c-719d-4b58-81e3-ad3fe1a9c2f4 - - - - - -] vault.ssl_ca_crt_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:44:57 localhost nova_compute[286344]: 2025-12-15 09:44:57.857 286348 DEBUG oslo_service.service [None req-c282a20c-719d-4b58-81e3-ad3fe1a9c2f4 - - - - - -] vault.timeout = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:44:57 localhost nova_compute[286344]: 2025-12-15 09:44:57.857 286348 DEBUG oslo_service.service [None req-c282a20c-719d-4b58-81e3-ad3fe1a9c2f4 - - - - - -] vault.use_ssl = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:44:57 localhost nova_compute[286344]: 2025-12-15 09:44:57.857 286348 DEBUG oslo_service.service [None req-c282a20c-719d-4b58-81e3-ad3fe1a9c2f4 - - - - - -] vault.vault_url = http://127.0.0.1:8200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:44:57 localhost nova_compute[286344]: 2025-12-15 09:44:57.857 286348 DEBUG oslo_service.service [None req-c282a20c-719d-4b58-81e3-ad3fe1a9c2f4 - - - - - -] keystone.cafile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:44:57 localhost nova_compute[286344]: 2025-12-15 09:44:57.857 286348 DEBUG oslo_service.service [None req-c282a20c-719d-4b58-81e3-ad3fe1a9c2f4 - - - - - -] keystone.certfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:44:57 localhost nova_compute[286344]: 2025-12-15 09:44:57.858 286348 DEBUG oslo_service.service [None 
req-c282a20c-719d-4b58-81e3-ad3fe1a9c2f4 - - - - - -] keystone.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:44:57 localhost nova_compute[286344]: 2025-12-15 09:44:57.858 286348 DEBUG oslo_service.service [None req-c282a20c-719d-4b58-81e3-ad3fe1a9c2f4 - - - - - -] keystone.connect_retries = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:44:57 localhost nova_compute[286344]: 2025-12-15 09:44:57.858 286348 DEBUG oslo_service.service [None req-c282a20c-719d-4b58-81e3-ad3fe1a9c2f4 - - - - - -] keystone.connect_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:44:57 localhost nova_compute[286344]: 2025-12-15 09:44:57.858 286348 DEBUG oslo_service.service [None req-c282a20c-719d-4b58-81e3-ad3fe1a9c2f4 - - - - - -] keystone.endpoint_override = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:44:57 localhost nova_compute[286344]: 2025-12-15 09:44:57.858 286348 DEBUG oslo_service.service [None req-c282a20c-719d-4b58-81e3-ad3fe1a9c2f4 - - - - - -] keystone.insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:44:57 localhost nova_compute[286344]: 2025-12-15 09:44:57.858 286348 DEBUG oslo_service.service [None req-c282a20c-719d-4b58-81e3-ad3fe1a9c2f4 - - - - - -] keystone.keyfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:44:57 localhost nova_compute[286344]: 2025-12-15 09:44:57.858 286348 DEBUG oslo_service.service [None req-c282a20c-719d-4b58-81e3-ad3fe1a9c2f4 - - - - - -] keystone.max_version = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:44:57 localhost nova_compute[286344]: 2025-12-15 09:44:57.858 286348 DEBUG oslo_service.service [None req-c282a20c-719d-4b58-81e3-ad3fe1a9c2f4 - - - - - -] 
keystone.min_version = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:44:57 localhost nova_compute[286344]: 2025-12-15 09:44:57.859 286348 DEBUG oslo_service.service [None req-c282a20c-719d-4b58-81e3-ad3fe1a9c2f4 - - - - - -] keystone.region_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:44:57 localhost nova_compute[286344]: 2025-12-15 09:44:57.859 286348 DEBUG oslo_service.service [None req-c282a20c-719d-4b58-81e3-ad3fe1a9c2f4 - - - - - -] keystone.service_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:44:57 localhost nova_compute[286344]: 2025-12-15 09:44:57.859 286348 DEBUG oslo_service.service [None req-c282a20c-719d-4b58-81e3-ad3fe1a9c2f4 - - - - - -] keystone.service_type = identity log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:44:57 localhost nova_compute[286344]: 2025-12-15 09:44:57.859 286348 DEBUG oslo_service.service [None req-c282a20c-719d-4b58-81e3-ad3fe1a9c2f4 - - - - - -] keystone.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:44:57 localhost nova_compute[286344]: 2025-12-15 09:44:57.859 286348 DEBUG oslo_service.service [None req-c282a20c-719d-4b58-81e3-ad3fe1a9c2f4 - - - - - -] keystone.status_code_retries = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:44:57 localhost nova_compute[286344]: 2025-12-15 09:44:57.859 286348 DEBUG oslo_service.service [None req-c282a20c-719d-4b58-81e3-ad3fe1a9c2f4 - - - - - -] keystone.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:44:57 localhost nova_compute[286344]: 2025-12-15 09:44:57.859 286348 DEBUG oslo_service.service [None req-c282a20c-719d-4b58-81e3-ad3fe1a9c2f4 - - - - - -] keystone.timeout = None log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:44:57 localhost nova_compute[286344]: 2025-12-15 09:44:57.859 286348 DEBUG oslo_service.service [None req-c282a20c-719d-4b58-81e3-ad3fe1a9c2f4 - - - - - -] keystone.valid_interfaces = ['internal', 'public'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:44:57 localhost nova_compute[286344]: 2025-12-15 09:44:57.860 286348 DEBUG oslo_service.service [None req-c282a20c-719d-4b58-81e3-ad3fe1a9c2f4 - - - - - -] keystone.version = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:44:57 localhost nova_compute[286344]: 2025-12-15 09:44:57.860 286348 DEBUG oslo_service.service [None req-c282a20c-719d-4b58-81e3-ad3fe1a9c2f4 - - - - - -] libvirt.connection_uri = log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:44:57 localhost nova_compute[286344]: 2025-12-15 09:44:57.860 286348 DEBUG oslo_service.service [None req-c282a20c-719d-4b58-81e3-ad3fe1a9c2f4 - - - - - -] libvirt.cpu_mode = host-model log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:44:57 localhost nova_compute[286344]: 2025-12-15 09:44:57.860 286348 DEBUG oslo_service.service [None req-c282a20c-719d-4b58-81e3-ad3fe1a9c2f4 - - - - - -] libvirt.cpu_model_extra_flags = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:44:57 localhost nova_compute[286344]: 2025-12-15 09:44:57.860 286348 DEBUG oslo_service.service [None req-c282a20c-719d-4b58-81e3-ad3fe1a9c2f4 - - - - - -] libvirt.cpu_models = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:44:57 localhost nova_compute[286344]: 2025-12-15 09:44:57.860 286348 DEBUG oslo_service.service [None req-c282a20c-719d-4b58-81e3-ad3fe1a9c2f4 - - - - - -] libvirt.cpu_power_governor_high = performance log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:44:57 localhost nova_compute[286344]: 2025-12-15 09:44:57.860 286348 DEBUG oslo_service.service [None req-c282a20c-719d-4b58-81e3-ad3fe1a9c2f4 - - - - - -] libvirt.cpu_power_governor_low = powersave log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:44:57 localhost nova_compute[286344]: 2025-12-15 09:44:57.861 286348 DEBUG oslo_service.service [None req-c282a20c-719d-4b58-81e3-ad3fe1a9c2f4 - - - - - -] libvirt.cpu_power_management = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:44:57 localhost nova_compute[286344]: 2025-12-15 09:44:57.861 286348 DEBUG oslo_service.service [None req-c282a20c-719d-4b58-81e3-ad3fe1a9c2f4 - - - - - -] libvirt.cpu_power_management_strategy = cpu_state log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:44:57 localhost nova_compute[286344]: 2025-12-15 09:44:57.861 286348 DEBUG oslo_service.service [None req-c282a20c-719d-4b58-81e3-ad3fe1a9c2f4 - - - - - -] libvirt.device_detach_attempts = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:44:57 localhost nova_compute[286344]: 2025-12-15 09:44:57.861 286348 DEBUG oslo_service.service [None req-c282a20c-719d-4b58-81e3-ad3fe1a9c2f4 - - - - - -] libvirt.device_detach_timeout = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:44:57 localhost nova_compute[286344]: 2025-12-15 09:44:57.861 286348 DEBUG oslo_service.service [None req-c282a20c-719d-4b58-81e3-ad3fe1a9c2f4 - - - - - -] libvirt.disk_cachemodes = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:44:57 localhost nova_compute[286344]: 2025-12-15 09:44:57.861 286348 DEBUG oslo_service.service [None req-c282a20c-719d-4b58-81e3-ad3fe1a9c2f4 - - - - - -] libvirt.disk_prefix = None log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:44:57 localhost nova_compute[286344]: 2025-12-15 09:44:57.861 286348 DEBUG oslo_service.service [None req-c282a20c-719d-4b58-81e3-ad3fe1a9c2f4 - - - - - -] libvirt.enabled_perf_events = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:44:57 localhost nova_compute[286344]: 2025-12-15 09:44:57.862 286348 DEBUG oslo_service.service [None req-c282a20c-719d-4b58-81e3-ad3fe1a9c2f4 - - - - - -] libvirt.file_backed_memory = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:44:57 localhost nova_compute[286344]: 2025-12-15 09:44:57.862 286348 DEBUG oslo_service.service [None req-c282a20c-719d-4b58-81e3-ad3fe1a9c2f4 - - - - - -] libvirt.gid_maps = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:44:57 localhost nova_compute[286344]: 2025-12-15 09:44:57.862 286348 DEBUG oslo_service.service [None req-c282a20c-719d-4b58-81e3-ad3fe1a9c2f4 - - - - - -] libvirt.hw_disk_discard = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:44:57 localhost nova_compute[286344]: 2025-12-15 09:44:57.862 286348 DEBUG oslo_service.service [None req-c282a20c-719d-4b58-81e3-ad3fe1a9c2f4 - - - - - -] libvirt.hw_machine_type = ['x86_64=q35'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:44:57 localhost nova_compute[286344]: 2025-12-15 09:44:57.862 286348 DEBUG oslo_service.service [None req-c282a20c-719d-4b58-81e3-ad3fe1a9c2f4 - - - - - -] libvirt.images_rbd_ceph_conf = /etc/ceph/ceph.conf log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:44:57 localhost nova_compute[286344]: 2025-12-15 09:44:57.862 286348 DEBUG oslo_service.service [None req-c282a20c-719d-4b58-81e3-ad3fe1a9c2f4 - - - - - -] libvirt.images_rbd_glance_copy_poll_interval = 15 log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:44:57 localhost nova_compute[286344]: 2025-12-15 09:44:57.862 286348 DEBUG oslo_service.service [None req-c282a20c-719d-4b58-81e3-ad3fe1a9c2f4 - - - - - -] libvirt.images_rbd_glance_copy_timeout = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:44:57 localhost nova_compute[286344]: 2025-12-15 09:44:57.862 286348 DEBUG oslo_service.service [None req-c282a20c-719d-4b58-81e3-ad3fe1a9c2f4 - - - - - -] libvirt.images_rbd_glance_store_name = default_backend log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:44:57 localhost nova_compute[286344]: 2025-12-15 09:44:57.863 286348 DEBUG oslo_service.service [None req-c282a20c-719d-4b58-81e3-ad3fe1a9c2f4 - - - - - -] libvirt.images_rbd_pool = vms log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:44:57 localhost nova_compute[286344]: 2025-12-15 09:44:57.863 286348 DEBUG oslo_service.service [None req-c282a20c-719d-4b58-81e3-ad3fe1a9c2f4 - - - - - -] libvirt.images_type = rbd log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:44:57 localhost nova_compute[286344]: 2025-12-15 09:44:57.863 286348 DEBUG oslo_service.service [None req-c282a20c-719d-4b58-81e3-ad3fe1a9c2f4 - - - - - -] libvirt.images_volume_group = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:44:57 localhost nova_compute[286344]: 2025-12-15 09:44:57.863 286348 DEBUG oslo_service.service [None req-c282a20c-719d-4b58-81e3-ad3fe1a9c2f4 - - - - - -] libvirt.inject_key = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:44:57 localhost nova_compute[286344]: 2025-12-15 09:44:57.863 286348 DEBUG oslo_service.service [None req-c282a20c-719d-4b58-81e3-ad3fe1a9c2f4 - - - - - -] libvirt.inject_partition = -2 log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:44:57 localhost nova_compute[286344]: 2025-12-15 09:44:57.863 286348 DEBUG oslo_service.service [None req-c282a20c-719d-4b58-81e3-ad3fe1a9c2f4 - - - - - -] libvirt.inject_password = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:44:57 localhost nova_compute[286344]: 2025-12-15 09:44:57.863 286348 DEBUG oslo_service.service [None req-c282a20c-719d-4b58-81e3-ad3fe1a9c2f4 - - - - - -] libvirt.iscsi_iface = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:44:57 localhost nova_compute[286344]: 2025-12-15 09:44:57.864 286348 DEBUG oslo_service.service [None req-c282a20c-719d-4b58-81e3-ad3fe1a9c2f4 - - - - - -] libvirt.iser_use_multipath = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:44:57 localhost nova_compute[286344]: 2025-12-15 09:44:57.864 286348 DEBUG oslo_service.service [None req-c282a20c-719d-4b58-81e3-ad3fe1a9c2f4 - - - - - -] libvirt.live_migration_bandwidth = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:44:57 localhost nova_compute[286344]: 2025-12-15 09:44:57.864 286348 DEBUG oslo_service.service [None req-c282a20c-719d-4b58-81e3-ad3fe1a9c2f4 - - - - - -] libvirt.live_migration_completion_timeout = 800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:44:57 localhost nova_compute[286344]: 2025-12-15 09:44:57.864 286348 DEBUG oslo_service.service [None req-c282a20c-719d-4b58-81e3-ad3fe1a9c2f4 - - - - - -] libvirt.live_migration_downtime = 500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:44:57 localhost nova_compute[286344]: 2025-12-15 09:44:57.864 286348 DEBUG oslo_service.service [None req-c282a20c-719d-4b58-81e3-ad3fe1a9c2f4 - - - - - -] libvirt.live_migration_downtime_delay = 75 log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:44:57 localhost nova_compute[286344]: 2025-12-15 09:44:57.864 286348 DEBUG oslo_service.service [None req-c282a20c-719d-4b58-81e3-ad3fe1a9c2f4 - - - - - -] libvirt.live_migration_downtime_steps = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:44:57 localhost nova_compute[286344]: 2025-12-15 09:44:57.864 286348 DEBUG oslo_service.service [None req-c282a20c-719d-4b58-81e3-ad3fe1a9c2f4 - - - - - -] libvirt.live_migration_inbound_addr = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:44:57 localhost nova_compute[286344]: 2025-12-15 09:44:57.864 286348 DEBUG oslo_service.service [None req-c282a20c-719d-4b58-81e3-ad3fe1a9c2f4 - - - - - -] libvirt.live_migration_permit_auto_converge = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:44:57 localhost nova_compute[286344]: 2025-12-15 09:44:57.865 286348 DEBUG oslo_service.service [None req-c282a20c-719d-4b58-81e3-ad3fe1a9c2f4 - - - - - -] libvirt.live_migration_permit_post_copy = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:44:57 localhost nova_compute[286344]: 2025-12-15 09:44:57.865 286348 DEBUG oslo_service.service [None req-c282a20c-719d-4b58-81e3-ad3fe1a9c2f4 - - - - - -] libvirt.live_migration_scheme = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:44:57 localhost nova_compute[286344]: 2025-12-15 09:44:57.865 286348 DEBUG oslo_service.service [None req-c282a20c-719d-4b58-81e3-ad3fe1a9c2f4 - - - - - -] libvirt.live_migration_timeout_action = force_complete log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:44:57 localhost nova_compute[286344]: 2025-12-15 09:44:57.865 286348 DEBUG oslo_service.service [None req-c282a20c-719d-4b58-81e3-ad3fe1a9c2f4 - - - - - -] 
libvirt.live_migration_tunnelled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:44:57 localhost nova_compute[286344]: 2025-12-15 09:44:57.865 286348 WARNING oslo_config.cfg [None req-c282a20c-719d-4b58-81e3-ad3fe1a9c2f4 - - - - - -] Deprecated: Option "live_migration_uri" from group "libvirt" is deprecated for removal ( Dec 15 04:44:57 localhost nova_compute[286344]: live_migration_uri is deprecated for removal in favor of two other options that Dec 15 04:44:57 localhost nova_compute[286344]: allow to change live migration scheme and target URI: ``live_migration_scheme`` Dec 15 04:44:57 localhost nova_compute[286344]: and ``live_migration_inbound_addr`` respectively. Dec 15 04:44:57 localhost nova_compute[286344]: ). Its value may be silently ignored in the future.#033[00m Dec 15 04:44:57 localhost nova_compute[286344]: 2025-12-15 09:44:57.865 286348 DEBUG oslo_service.service [None req-c282a20c-719d-4b58-81e3-ad3fe1a9c2f4 - - - - - -] libvirt.live_migration_uri = qemu+ssh://nova@%s/system?keyfile=/var/lib/nova/.ssh/ssh-privatekey log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:44:57 localhost nova_compute[286344]: 2025-12-15 09:44:57.865 286348 DEBUG oslo_service.service [None req-c282a20c-719d-4b58-81e3-ad3fe1a9c2f4 - - - - - -] libvirt.live_migration_with_native_tls = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:44:57 localhost nova_compute[286344]: 2025-12-15 09:44:57.866 286348 DEBUG oslo_service.service [None req-c282a20c-719d-4b58-81e3-ad3fe1a9c2f4 - - - - - -] libvirt.max_queues = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:44:57 localhost nova_compute[286344]: 2025-12-15 09:44:57.866 286348 DEBUG oslo_service.service [None req-c282a20c-719d-4b58-81e3-ad3fe1a9c2f4 - - - - - -] libvirt.mem_stats_period_seconds = 10 log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:44:57 localhost nova_compute[286344]: 2025-12-15 09:44:57.866 286348 DEBUG oslo_service.service [None req-c282a20c-719d-4b58-81e3-ad3fe1a9c2f4 - - - - - -] libvirt.nfs_mount_options = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:44:57 localhost nova_compute[286344]: 2025-12-15 09:44:57.866 286348 DEBUG oslo_service.service [None req-c282a20c-719d-4b58-81e3-ad3fe1a9c2f4 - - - - - -] libvirt.nfs_mount_point_base = /var/lib/nova/mnt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:44:57 localhost nova_compute[286344]: 2025-12-15 09:44:57.866 286348 DEBUG oslo_service.service [None req-c282a20c-719d-4b58-81e3-ad3fe1a9c2f4 - - - - - -] libvirt.num_aoe_discover_tries = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:44:57 localhost nova_compute[286344]: 2025-12-15 09:44:57.866 286348 DEBUG oslo_service.service [None req-c282a20c-719d-4b58-81e3-ad3fe1a9c2f4 - - - - - -] libvirt.num_iser_scan_tries = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:44:57 localhost nova_compute[286344]: 2025-12-15 09:44:57.866 286348 DEBUG oslo_service.service [None req-c282a20c-719d-4b58-81e3-ad3fe1a9c2f4 - - - - - -] libvirt.num_memory_encrypted_guests = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:44:57 localhost nova_compute[286344]: 2025-12-15 09:44:57.867 286348 DEBUG oslo_service.service [None req-c282a20c-719d-4b58-81e3-ad3fe1a9c2f4 - - - - - -] libvirt.num_nvme_discover_tries = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:44:57 localhost nova_compute[286344]: 2025-12-15 09:44:57.867 286348 DEBUG oslo_service.service [None req-c282a20c-719d-4b58-81e3-ad3fe1a9c2f4 - - - - - -] libvirt.num_pcie_ports = 24 log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 15 04:44:57 localhost nova_compute[286344]: 2025-12-15 09:44:57.867 286348 DEBUG oslo_service.service [None req-c282a20c-719d-4b58-81e3-ad3fe1a9c2f4 - - - - - -] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609, one DEBUG record per option, 09:44:57.867 through 09:44:57.887:
    libvirt.num_volume_scan_tries = 5
    libvirt.pmem_namespaces = []
    libvirt.quobyte_client_cfg = None
    libvirt.quobyte_mount_point_base = /var/lib/nova/mnt
    libvirt.rbd_connect_timeout = 5
    libvirt.rbd_destroy_volume_retries = 12
    libvirt.rbd_destroy_volume_retry_interval = 5
    libvirt.rbd_secret_uuid = bce17446-41b5-5408-a23e-0b011906b44a
    libvirt.rbd_user = openstack
    libvirt.realtime_scheduler_priority = 1
    libvirt.remote_filesystem_transport = ssh
    libvirt.rescue_image_id = None
    libvirt.rescue_kernel_id = None
    libvirt.rescue_ramdisk_id = None
    libvirt.rng_dev_path = /dev/urandom
    libvirt.rx_queue_size = 512
    libvirt.smbfs_mount_options =
    libvirt.smbfs_mount_point_base = /var/lib/nova/mnt
    libvirt.snapshot_compression = False
    libvirt.snapshot_image_format = None
    libvirt.snapshots_directory = /var/lib/nova/instances/snapshots
    libvirt.sparse_logical_volumes = False
    libvirt.swtpm_enabled = True
    libvirt.swtpm_group = tss
    libvirt.swtpm_user = tss
    libvirt.sysinfo_serial = unique
    libvirt.tx_queue_size = 512
    libvirt.uid_maps = []
    libvirt.use_virtio_for_bridges = True
    libvirt.virt_type = kvm
    libvirt.volume_clear = zero
    libvirt.volume_clear_size = 0
    libvirt.volume_use_multipath = True
    libvirt.vzstorage_cache_path = None
    libvirt.vzstorage_log_path = /var/log/vstorage/%(cluster_name)s/nova.log.gz
    libvirt.vzstorage_mount_group = qemu
    libvirt.vzstorage_mount_opts = []
    libvirt.vzstorage_mount_perms = 0770
    libvirt.vzstorage_mount_point_base = /var/lib/nova/mnt
    libvirt.vzstorage_mount_user = stack
    libvirt.wait_soft_reboot_seconds = 120
    neutron.auth_section = None
    neutron.auth_type = password
    neutron.cafile = None
    neutron.certfile = None
    neutron.collect_timing = False
    neutron.connect_retries = None
    neutron.connect_retry_delay = None
    neutron.default_floating_pool = nova
    neutron.endpoint_override = None
    neutron.extension_sync_interval = 600
    neutron.http_retries = 3
    neutron.insecure = False
    neutron.keyfile = None
    neutron.max_version = None
    neutron.metadata_proxy_shared_secret = ****
    neutron.min_version = None
    neutron.ovs_bridge = br-int
    neutron.physnets = []
    neutron.region_name = regionOne
    neutron.service_metadata_proxy = True
    neutron.service_name = None
    neutron.service_type = network
    neutron.split_loggers = False
    neutron.status_code_retries = None
    neutron.status_code_retry_delay = None
    neutron.timeout = None
    neutron.valid_interfaces = ['internal']
    neutron.version = None
    notifications.bdms_in_notifications = False
    notifications.default_level = INFO
    notifications.notification_format = unversioned
    notifications.notify_on_state_change = None
    notifications.versioned_notifications_topics = ['versioned_notifications']
    pci.alias = []
    pci.device_spec = []
    pci.report_in_placement = False
    placement.auth_section = None
    placement.auth_type = password
    placement.auth_url = http://keystone-internal.openstack.svc:5000
    placement.cafile = None
    placement.certfile = None
    placement.collect_timing = False
    placement.connect_retries = None
    placement.connect_retry_delay = None
    placement.default_domain_id = None
    placement.default_domain_name = None
    placement.domain_id = None
    placement.domain_name = None
    placement.endpoint_override = None
    placement.insecure = False
    placement.keyfile = None
    placement.max_version = None
    placement.min_version = None
    placement.password = ****
    placement.project_domain_id = None
    placement.project_domain_name = Default
    placement.project_id = None
    placement.project_name = service
    placement.region_name = regionOne
    placement.service_name = None
    placement.service_type = placement
    placement.split_loggers = False
    placement.status_code_retries = None
    placement.status_code_retry_delay = None
    placement.system_scope = None
    placement.timeout = None
    placement.trust_id = None
    placement.user_domain_id = None
    placement.user_domain_name = Default
    placement.user_id = None
    placement.username = nova
    placement.valid_interfaces = ['internal']
    placement.version = None
    quota.cores = 20
    quota.count_usage_from_placement = False
    quota.driver = nova.quota.DbQuotaDriver
    quota.injected_file_content_bytes = 10240
    quota.injected_file_path_length = 255
    quota.injected_files = 5
    quota.instances = 10
    quota.key_pairs = 100
    quota.metadata_items = 128
    quota.ram = 51200
    quota.recheck_quota = True
    quota.server_group_members = 10
    quota.server_groups = 10
    rdp.enabled = False
    rdp.html5_proxy_base_url = http://127.0.0.1:6083/
    scheduler.discover_hosts_in_cells_interval = -1
    scheduler.enable_isolated_aggregate_filtering = False
    scheduler.image_metadata_prefilter = False
    scheduler.limit_tenants_to_placement_aggregate = False
    scheduler.max_attempts = 3
    scheduler.max_placement_results = 1000
    scheduler.placement_aggregate_required_for_tenants = False
    scheduler.query_placement_for_availability_zone = True
    scheduler.query_placement_for_image_type_support = False
    scheduler.query_placement_for_routed_network_aggregates = False
    scheduler.workers = None
    filter_scheduler.aggregate_image_properties_isolation_namespace = None
    filter_scheduler.aggregate_image_properties_isolation_separator = .
    filter_scheduler.available_filters = ['nova.scheduler.filters.all_filters']
    filter_scheduler.build_failure_weight_multiplier = 1000000.0
    filter_scheduler.cpu_weight_multiplier = 1.0
    filter_scheduler.cross_cell_move_weight_multiplier = 1000000.0
    filter_scheduler.disk_weight_multiplier = 1.0
Dec 15 04:44:57 localhost nova_compute[286344]: 2025-12-15 09:44:57.887 286348 DEBUG oslo_service.service [None req-c282a20c-719d-4b58-81e3-ad3fe1a9c2f4 - - - - - -]
filter_scheduler.enabled_filters = ['ComputeFilter', 'ComputeCapabilitiesFilter', 'ImagePropertiesFilter', 'ServerGroupAntiAffinityFilter', 'ServerGroupAffinityFilter'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:44:57 localhost nova_compute[286344]: 2025-12-15 09:44:57.887 286348 DEBUG oslo_service.service [None req-c282a20c-719d-4b58-81e3-ad3fe1a9c2f4 - - - - - -] filter_scheduler.host_subset_size = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:44:57 localhost nova_compute[286344]: 2025-12-15 09:44:57.887 286348 DEBUG oslo_service.service [None req-c282a20c-719d-4b58-81e3-ad3fe1a9c2f4 - - - - - -] filter_scheduler.image_properties_default_architecture = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:44:57 localhost nova_compute[286344]: 2025-12-15 09:44:57.887 286348 DEBUG oslo_service.service [None req-c282a20c-719d-4b58-81e3-ad3fe1a9c2f4 - - - - - -] filter_scheduler.io_ops_weight_multiplier = -1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:44:57 localhost nova_compute[286344]: 2025-12-15 09:44:57.887 286348 DEBUG oslo_service.service [None req-c282a20c-719d-4b58-81e3-ad3fe1a9c2f4 - - - - - -] filter_scheduler.isolated_hosts = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:44:57 localhost nova_compute[286344]: 2025-12-15 09:44:57.887 286348 DEBUG oslo_service.service [None req-c282a20c-719d-4b58-81e3-ad3fe1a9c2f4 - - - - - -] filter_scheduler.isolated_images = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:44:57 localhost nova_compute[286344]: 2025-12-15 09:44:57.888 286348 DEBUG oslo_service.service [None req-c282a20c-719d-4b58-81e3-ad3fe1a9c2f4 - - - - - -] filter_scheduler.max_instances_per_host = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 
15 04:44:57 localhost nova_compute[286344]: 2025-12-15 09:44:57.888 286348 DEBUG oslo_service.service [None req-c282a20c-719d-4b58-81e3-ad3fe1a9c2f4 - - - - - -] filter_scheduler.max_io_ops_per_host = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:44:57 localhost nova_compute[286344]: 2025-12-15 09:44:57.888 286348 DEBUG oslo_service.service [None req-c282a20c-719d-4b58-81e3-ad3fe1a9c2f4 - - - - - -] filter_scheduler.pci_in_placement = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:44:57 localhost nova_compute[286344]: 2025-12-15 09:44:57.888 286348 DEBUG oslo_service.service [None req-c282a20c-719d-4b58-81e3-ad3fe1a9c2f4 - - - - - -] filter_scheduler.pci_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:44:57 localhost nova_compute[286344]: 2025-12-15 09:44:57.888 286348 DEBUG oslo_service.service [None req-c282a20c-719d-4b58-81e3-ad3fe1a9c2f4 - - - - - -] filter_scheduler.ram_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:44:57 localhost nova_compute[286344]: 2025-12-15 09:44:57.888 286348 DEBUG oslo_service.service [None req-c282a20c-719d-4b58-81e3-ad3fe1a9c2f4 - - - - - -] filter_scheduler.restrict_isolated_hosts_to_isolated_images = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:44:57 localhost nova_compute[286344]: 2025-12-15 09:44:57.888 286348 DEBUG oslo_service.service [None req-c282a20c-719d-4b58-81e3-ad3fe1a9c2f4 - - - - - -] filter_scheduler.shuffle_best_same_weighed_hosts = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:44:57 localhost nova_compute[286344]: 2025-12-15 09:44:57.889 286348 DEBUG oslo_service.service [None req-c282a20c-719d-4b58-81e3-ad3fe1a9c2f4 - - - - - -] filter_scheduler.soft_affinity_weight_multiplier = 1.0 
log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:44:57 localhost nova_compute[286344]: 2025-12-15 09:44:57.889 286348 DEBUG oslo_service.service [None req-c282a20c-719d-4b58-81e3-ad3fe1a9c2f4 - - - - - -] filter_scheduler.soft_anti_affinity_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:44:57 localhost nova_compute[286344]: 2025-12-15 09:44:57.889 286348 DEBUG oslo_service.service [None req-c282a20c-719d-4b58-81e3-ad3fe1a9c2f4 - - - - - -] filter_scheduler.track_instance_changes = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:44:57 localhost nova_compute[286344]: 2025-12-15 09:44:57.889 286348 DEBUG oslo_service.service [None req-c282a20c-719d-4b58-81e3-ad3fe1a9c2f4 - - - - - -] filter_scheduler.weight_classes = ['nova.scheduler.weights.all_weighers'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:44:57 localhost nova_compute[286344]: 2025-12-15 09:44:57.889 286348 DEBUG oslo_service.service [None req-c282a20c-719d-4b58-81e3-ad3fe1a9c2f4 - - - - - -] metrics.required = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:44:57 localhost nova_compute[286344]: 2025-12-15 09:44:57.889 286348 DEBUG oslo_service.service [None req-c282a20c-719d-4b58-81e3-ad3fe1a9c2f4 - - - - - -] metrics.weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:44:57 localhost nova_compute[286344]: 2025-12-15 09:44:57.889 286348 DEBUG oslo_service.service [None req-c282a20c-719d-4b58-81e3-ad3fe1a9c2f4 - - - - - -] metrics.weight_of_unavailable = -10000.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:44:57 localhost nova_compute[286344]: 2025-12-15 09:44:57.890 286348 DEBUG oslo_service.service [None req-c282a20c-719d-4b58-81e3-ad3fe1a9c2f4 - - - - - -] 
metrics.weight_setting = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:44:57 localhost nova_compute[286344]: 2025-12-15 09:44:57.890 286348 DEBUG oslo_service.service [None req-c282a20c-719d-4b58-81e3-ad3fe1a9c2f4 - - - - - -] serial_console.base_url = ws://127.0.0.1:6083/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:44:57 localhost nova_compute[286344]: 2025-12-15 09:44:57.890 286348 DEBUG oslo_service.service [None req-c282a20c-719d-4b58-81e3-ad3fe1a9c2f4 - - - - - -] serial_console.enabled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:44:57 localhost nova_compute[286344]: 2025-12-15 09:44:57.890 286348 DEBUG oslo_service.service [None req-c282a20c-719d-4b58-81e3-ad3fe1a9c2f4 - - - - - -] serial_console.port_range = 10000:20000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:44:57 localhost nova_compute[286344]: 2025-12-15 09:44:57.890 286348 DEBUG oslo_service.service [None req-c282a20c-719d-4b58-81e3-ad3fe1a9c2f4 - - - - - -] serial_console.proxyclient_address = 127.0.0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:44:57 localhost nova_compute[286344]: 2025-12-15 09:44:57.890 286348 DEBUG oslo_service.service [None req-c282a20c-719d-4b58-81e3-ad3fe1a9c2f4 - - - - - -] serial_console.serialproxy_host = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:44:57 localhost nova_compute[286344]: 2025-12-15 09:44:57.890 286348 DEBUG oslo_service.service [None req-c282a20c-719d-4b58-81e3-ad3fe1a9c2f4 - - - - - -] serial_console.serialproxy_port = 6083 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:44:57 localhost nova_compute[286344]: 2025-12-15 09:44:57.891 286348 DEBUG oslo_service.service [None req-c282a20c-719d-4b58-81e3-ad3fe1a9c2f4 - - - - - -] 
service_user.auth_section = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:44:57 localhost nova_compute[286344]: 2025-12-15 09:44:57.891 286348 DEBUG oslo_service.service [None req-c282a20c-719d-4b58-81e3-ad3fe1a9c2f4 - - - - - -] service_user.auth_type = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:44:57 localhost nova_compute[286344]: 2025-12-15 09:44:57.891 286348 DEBUG oslo_service.service [None req-c282a20c-719d-4b58-81e3-ad3fe1a9c2f4 - - - - - -] service_user.cafile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:44:57 localhost nova_compute[286344]: 2025-12-15 09:44:57.891 286348 DEBUG oslo_service.service [None req-c282a20c-719d-4b58-81e3-ad3fe1a9c2f4 - - - - - -] service_user.certfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:44:57 localhost nova_compute[286344]: 2025-12-15 09:44:57.891 286348 DEBUG oslo_service.service [None req-c282a20c-719d-4b58-81e3-ad3fe1a9c2f4 - - - - - -] service_user.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:44:57 localhost nova_compute[286344]: 2025-12-15 09:44:57.891 286348 DEBUG oslo_service.service [None req-c282a20c-719d-4b58-81e3-ad3fe1a9c2f4 - - - - - -] service_user.insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:44:57 localhost nova_compute[286344]: 2025-12-15 09:44:57.891 286348 DEBUG oslo_service.service [None req-c282a20c-719d-4b58-81e3-ad3fe1a9c2f4 - - - - - -] service_user.keyfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:44:57 localhost nova_compute[286344]: 2025-12-15 09:44:57.891 286348 DEBUG oslo_service.service [None req-c282a20c-719d-4b58-81e3-ad3fe1a9c2f4 - - - - - -] service_user.send_service_user_token = True log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:44:57 localhost nova_compute[286344]: 2025-12-15 09:44:57.892 286348 DEBUG oslo_service.service [None req-c282a20c-719d-4b58-81e3-ad3fe1a9c2f4 - - - - - -] service_user.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:44:57 localhost nova_compute[286344]: 2025-12-15 09:44:57.892 286348 DEBUG oslo_service.service [None req-c282a20c-719d-4b58-81e3-ad3fe1a9c2f4 - - - - - -] service_user.timeout = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:44:57 localhost nova_compute[286344]: 2025-12-15 09:44:57.892 286348 DEBUG oslo_service.service [None req-c282a20c-719d-4b58-81e3-ad3fe1a9c2f4 - - - - - -] spice.agent_enabled = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:44:57 localhost nova_compute[286344]: 2025-12-15 09:44:57.892 286348 DEBUG oslo_service.service [None req-c282a20c-719d-4b58-81e3-ad3fe1a9c2f4 - - - - - -] spice.enabled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:44:57 localhost nova_compute[286344]: 2025-12-15 09:44:57.892 286348 DEBUG oslo_service.service [None req-c282a20c-719d-4b58-81e3-ad3fe1a9c2f4 - - - - - -] spice.html5proxy_base_url = http://127.0.0.1:6082/spice_auto.html log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:44:57 localhost nova_compute[286344]: 2025-12-15 09:44:57.892 286348 DEBUG oslo_service.service [None req-c282a20c-719d-4b58-81e3-ad3fe1a9c2f4 - - - - - -] spice.html5proxy_host = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:44:57 localhost nova_compute[286344]: 2025-12-15 09:44:57.892 286348 DEBUG oslo_service.service [None req-c282a20c-719d-4b58-81e3-ad3fe1a9c2f4 - - - - - -] spice.html5proxy_port = 6082 log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:44:57 localhost nova_compute[286344]: 2025-12-15 09:44:57.893 286348 DEBUG oslo_service.service [None req-c282a20c-719d-4b58-81e3-ad3fe1a9c2f4 - - - - - -] spice.image_compression = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:44:57 localhost nova_compute[286344]: 2025-12-15 09:44:57.893 286348 DEBUG oslo_service.service [None req-c282a20c-719d-4b58-81e3-ad3fe1a9c2f4 - - - - - -] spice.jpeg_compression = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:44:57 localhost nova_compute[286344]: 2025-12-15 09:44:57.893 286348 DEBUG oslo_service.service [None req-c282a20c-719d-4b58-81e3-ad3fe1a9c2f4 - - - - - -] spice.playback_compression = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:44:57 localhost nova_compute[286344]: 2025-12-15 09:44:57.893 286348 DEBUG oslo_service.service [None req-c282a20c-719d-4b58-81e3-ad3fe1a9c2f4 - - - - - -] spice.server_listen = 127.0.0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:44:57 localhost nova_compute[286344]: 2025-12-15 09:44:57.893 286348 DEBUG oslo_service.service [None req-c282a20c-719d-4b58-81e3-ad3fe1a9c2f4 - - - - - -] spice.server_proxyclient_address = 127.0.0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:44:57 localhost nova_compute[286344]: 2025-12-15 09:44:57.893 286348 DEBUG oslo_service.service [None req-c282a20c-719d-4b58-81e3-ad3fe1a9c2f4 - - - - - -] spice.streaming_mode = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:44:57 localhost nova_compute[286344]: 2025-12-15 09:44:57.893 286348 DEBUG oslo_service.service [None req-c282a20c-719d-4b58-81e3-ad3fe1a9c2f4 - - - - - -] spice.zlib_compression = None log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:44:57 localhost nova_compute[286344]: 2025-12-15 09:44:57.894 286348 DEBUG oslo_service.service [None req-c282a20c-719d-4b58-81e3-ad3fe1a9c2f4 - - - - - -] upgrade_levels.baseapi = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:44:57 localhost nova_compute[286344]: 2025-12-15 09:44:57.894 286348 DEBUG oslo_service.service [None req-c282a20c-719d-4b58-81e3-ad3fe1a9c2f4 - - - - - -] upgrade_levels.cert = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:44:57 localhost nova_compute[286344]: 2025-12-15 09:44:57.894 286348 DEBUG oslo_service.service [None req-c282a20c-719d-4b58-81e3-ad3fe1a9c2f4 - - - - - -] upgrade_levels.compute = auto log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:44:57 localhost nova_compute[286344]: 2025-12-15 09:44:57.894 286348 DEBUG oslo_service.service [None req-c282a20c-719d-4b58-81e3-ad3fe1a9c2f4 - - - - - -] upgrade_levels.conductor = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:44:57 localhost nova_compute[286344]: 2025-12-15 09:44:57.894 286348 DEBUG oslo_service.service [None req-c282a20c-719d-4b58-81e3-ad3fe1a9c2f4 - - - - - -] upgrade_levels.scheduler = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:44:57 localhost nova_compute[286344]: 2025-12-15 09:44:57.894 286348 DEBUG oslo_service.service [None req-c282a20c-719d-4b58-81e3-ad3fe1a9c2f4 - - - - - -] vendordata_dynamic_auth.auth_section = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:44:57 localhost nova_compute[286344]: 2025-12-15 09:44:57.894 286348 DEBUG oslo_service.service [None req-c282a20c-719d-4b58-81e3-ad3fe1a9c2f4 - - - - - -] vendordata_dynamic_auth.auth_type = None log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:44:57 localhost nova_compute[286344]: 2025-12-15 09:44:57.894 286348 DEBUG oslo_service.service [None req-c282a20c-719d-4b58-81e3-ad3fe1a9c2f4 - - - - - -] vendordata_dynamic_auth.cafile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:44:57 localhost nova_compute[286344]: 2025-12-15 09:44:57.895 286348 DEBUG oslo_service.service [None req-c282a20c-719d-4b58-81e3-ad3fe1a9c2f4 - - - - - -] vendordata_dynamic_auth.certfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:44:57 localhost nova_compute[286344]: 2025-12-15 09:44:57.895 286348 DEBUG oslo_service.service [None req-c282a20c-719d-4b58-81e3-ad3fe1a9c2f4 - - - - - -] vendordata_dynamic_auth.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:44:57 localhost nova_compute[286344]: 2025-12-15 09:44:57.895 286348 DEBUG oslo_service.service [None req-c282a20c-719d-4b58-81e3-ad3fe1a9c2f4 - - - - - -] vendordata_dynamic_auth.insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:44:57 localhost nova_compute[286344]: 2025-12-15 09:44:57.895 286348 DEBUG oslo_service.service [None req-c282a20c-719d-4b58-81e3-ad3fe1a9c2f4 - - - - - -] vendordata_dynamic_auth.keyfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:44:57 localhost nova_compute[286344]: 2025-12-15 09:44:57.895 286348 DEBUG oslo_service.service [None req-c282a20c-719d-4b58-81e3-ad3fe1a9c2f4 - - - - - -] vendordata_dynamic_auth.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:44:57 localhost nova_compute[286344]: 2025-12-15 09:44:57.895 286348 DEBUG oslo_service.service [None req-c282a20c-719d-4b58-81e3-ad3fe1a9c2f4 - - - - - -] vendordata_dynamic_auth.timeout = None 
log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:44:57 localhost nova_compute[286344]: 2025-12-15 09:44:57.895 286348 DEBUG oslo_service.service [None req-c282a20c-719d-4b58-81e3-ad3fe1a9c2f4 - - - - - -] vmware.api_retry_count = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:44:57 localhost nova_compute[286344]: 2025-12-15 09:44:57.895 286348 DEBUG oslo_service.service [None req-c282a20c-719d-4b58-81e3-ad3fe1a9c2f4 - - - - - -] vmware.ca_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:44:57 localhost nova_compute[286344]: 2025-12-15 09:44:57.896 286348 DEBUG oslo_service.service [None req-c282a20c-719d-4b58-81e3-ad3fe1a9c2f4 - - - - - -] vmware.cache_prefix = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:44:57 localhost nova_compute[286344]: 2025-12-15 09:44:57.896 286348 DEBUG oslo_service.service [None req-c282a20c-719d-4b58-81e3-ad3fe1a9c2f4 - - - - - -] vmware.cluster_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:44:57 localhost nova_compute[286344]: 2025-12-15 09:44:57.896 286348 DEBUG oslo_service.service [None req-c282a20c-719d-4b58-81e3-ad3fe1a9c2f4 - - - - - -] vmware.connection_pool_size = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:44:57 localhost nova_compute[286344]: 2025-12-15 09:44:57.896 286348 DEBUG oslo_service.service [None req-c282a20c-719d-4b58-81e3-ad3fe1a9c2f4 - - - - - -] vmware.console_delay_seconds = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:44:57 localhost nova_compute[286344]: 2025-12-15 09:44:57.896 286348 DEBUG oslo_service.service [None req-c282a20c-719d-4b58-81e3-ad3fe1a9c2f4 - - - - - -] vmware.datastore_regex = None log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:44:57 localhost nova_compute[286344]: 2025-12-15 09:44:57.896 286348 DEBUG oslo_service.service [None req-c282a20c-719d-4b58-81e3-ad3fe1a9c2f4 - - - - - -] vmware.host_ip = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:44:57 localhost nova_compute[286344]: 2025-12-15 09:44:57.896 286348 DEBUG oslo_service.service [None req-c282a20c-719d-4b58-81e3-ad3fe1a9c2f4 - - - - - -] vmware.host_password = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:44:57 localhost nova_compute[286344]: 2025-12-15 09:44:57.897 286348 DEBUG oslo_service.service [None req-c282a20c-719d-4b58-81e3-ad3fe1a9c2f4 - - - - - -] vmware.host_port = 443 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:44:57 localhost nova_compute[286344]: 2025-12-15 09:44:57.897 286348 DEBUG oslo_service.service [None req-c282a20c-719d-4b58-81e3-ad3fe1a9c2f4 - - - - - -] vmware.host_username = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:44:57 localhost nova_compute[286344]: 2025-12-15 09:44:57.897 286348 DEBUG oslo_service.service [None req-c282a20c-719d-4b58-81e3-ad3fe1a9c2f4 - - - - - -] vmware.insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:44:57 localhost nova_compute[286344]: 2025-12-15 09:44:57.897 286348 DEBUG oslo_service.service [None req-c282a20c-719d-4b58-81e3-ad3fe1a9c2f4 - - - - - -] vmware.integration_bridge = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:44:57 localhost nova_compute[286344]: 2025-12-15 09:44:57.897 286348 DEBUG oslo_service.service [None req-c282a20c-719d-4b58-81e3-ad3fe1a9c2f4 - - - - - -] vmware.maximum_objects = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:44:57 localhost 
nova_compute[286344]: 2025-12-15 09:44:57.897 286348 DEBUG oslo_service.service [None req-c282a20c-719d-4b58-81e3-ad3fe1a9c2f4 - - - - - -] vmware.pbm_default_policy = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:44:57 localhost nova_compute[286344]: 2025-12-15 09:44:57.897 286348 DEBUG oslo_service.service [None req-c282a20c-719d-4b58-81e3-ad3fe1a9c2f4 - - - - - -] vmware.pbm_enabled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:44:57 localhost nova_compute[286344]: 2025-12-15 09:44:57.897 286348 DEBUG oslo_service.service [None req-c282a20c-719d-4b58-81e3-ad3fe1a9c2f4 - - - - - -] vmware.pbm_wsdl_location = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:44:57 localhost nova_compute[286344]: 2025-12-15 09:44:57.898 286348 DEBUG oslo_service.service [None req-c282a20c-719d-4b58-81e3-ad3fe1a9c2f4 - - - - - -] vmware.serial_log_dir = /opt/vmware/vspc log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:44:57 localhost nova_compute[286344]: 2025-12-15 09:44:57.898 286348 DEBUG oslo_service.service [None req-c282a20c-719d-4b58-81e3-ad3fe1a9c2f4 - - - - - -] vmware.serial_port_proxy_uri = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:44:57 localhost nova_compute[286344]: 2025-12-15 09:44:57.898 286348 DEBUG oslo_service.service [None req-c282a20c-719d-4b58-81e3-ad3fe1a9c2f4 - - - - - -] vmware.serial_port_service_uri = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:44:57 localhost nova_compute[286344]: 2025-12-15 09:44:57.898 286348 DEBUG oslo_service.service [None req-c282a20c-719d-4b58-81e3-ad3fe1a9c2f4 - - - - - -] vmware.task_poll_interval = 0.5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:44:57 localhost nova_compute[286344]: 2025-12-15 
09:44:57.898 286348 DEBUG oslo_service.service [None req-c282a20c-719d-4b58-81e3-ad3fe1a9c2f4 - - - - - -] vmware.use_linked_clone = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:44:57 localhost nova_compute[286344]: 2025-12-15 09:44:57.898 286348 DEBUG oslo_service.service [None req-c282a20c-719d-4b58-81e3-ad3fe1a9c2f4 - - - - - -] vmware.vnc_keymap = en-us log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:44:57 localhost nova_compute[286344]: 2025-12-15 09:44:57.898 286348 DEBUG oslo_service.service [None req-c282a20c-719d-4b58-81e3-ad3fe1a9c2f4 - - - - - -] vmware.vnc_port = 5900 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:44:57 localhost nova_compute[286344]: 2025-12-15 09:44:57.899 286348 DEBUG oslo_service.service [None req-c282a20c-719d-4b58-81e3-ad3fe1a9c2f4 - - - - - -] vmware.vnc_port_total = 10000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:44:57 localhost nova_compute[286344]: 2025-12-15 09:44:57.899 286348 DEBUG oslo_service.service [None req-c282a20c-719d-4b58-81e3-ad3fe1a9c2f4 - - - - - -] vnc.auth_schemes = ['none'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:44:57 localhost nova_compute[286344]: 2025-12-15 09:44:57.899 286348 DEBUG oslo_service.service [None req-c282a20c-719d-4b58-81e3-ad3fe1a9c2f4 - - - - - -] vnc.enabled = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:44:57 localhost nova_compute[286344]: 2025-12-15 09:44:57.899 286348 DEBUG oslo_service.service [None req-c282a20c-719d-4b58-81e3-ad3fe1a9c2f4 - - - - - -] vnc.novncproxy_base_url = http://nova-novncproxy-cell1-public-openstack.apps-crc.testing/vnc_lite.html log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:44:57 localhost nova_compute[286344]: 2025-12-15 09:44:57.899 
286348 DEBUG oslo_service.service [None req-c282a20c-719d-4b58-81e3-ad3fe1a9c2f4 - - - - - -] vnc.novncproxy_host = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:44:57 localhost nova_compute[286344]: 2025-12-15 09:44:57.899 286348 DEBUG oslo_service.service [None req-c282a20c-719d-4b58-81e3-ad3fe1a9c2f4 - - - - - -] vnc.novncproxy_port = 6080 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:44:57 localhost nova_compute[286344]: 2025-12-15 09:44:57.899 286348 DEBUG oslo_service.service [None req-c282a20c-719d-4b58-81e3-ad3fe1a9c2f4 - - - - - -] vnc.server_listen = ::0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:44:57 localhost nova_compute[286344]: 2025-12-15 09:44:57.900 286348 DEBUG oslo_service.service [None req-c282a20c-719d-4b58-81e3-ad3fe1a9c2f4 - - - - - -] vnc.server_proxyclient_address = 192.168.122.106 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:44:57 localhost nova_compute[286344]: 2025-12-15 09:44:57.900 286348 DEBUG oslo_service.service [None req-c282a20c-719d-4b58-81e3-ad3fe1a9c2f4 - - - - - -] vnc.vencrypt_ca_certs = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:44:57 localhost nova_compute[286344]: 2025-12-15 09:44:57.900 286348 DEBUG oslo_service.service [None req-c282a20c-719d-4b58-81e3-ad3fe1a9c2f4 - - - - - -] vnc.vencrypt_client_cert = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:44:57 localhost nova_compute[286344]: 2025-12-15 09:44:57.900 286348 DEBUG oslo_service.service [None req-c282a20c-719d-4b58-81e3-ad3fe1a9c2f4 - - - - - -] vnc.vencrypt_client_key = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:44:57 localhost nova_compute[286344]: 2025-12-15 09:44:57.900 286348 DEBUG oslo_service.service [None 
req-c282a20c-719d-4b58-81e3-ad3fe1a9c2f4 - - - - - -] workarounds.disable_compute_service_check_for_ffu = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:44:57 localhost nova_compute[286344]: 2025-12-15 09:44:57.900 286348 DEBUG oslo_service.service [None req-c282a20c-719d-4b58-81e3-ad3fe1a9c2f4 - - - - - -] workarounds.disable_deep_image_inspection = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:44:57 localhost nova_compute[286344]: 2025-12-15 09:44:57.900 286348 DEBUG oslo_service.service [None req-c282a20c-719d-4b58-81e3-ad3fe1a9c2f4 - - - - - -] workarounds.disable_fallback_pcpu_query = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:44:57 localhost nova_compute[286344]: 2025-12-15 09:44:57.901 286348 DEBUG oslo_service.service [None req-c282a20c-719d-4b58-81e3-ad3fe1a9c2f4 - - - - - -] workarounds.disable_group_policy_check_upcall = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:44:57 localhost nova_compute[286344]: 2025-12-15 09:44:57.901 286348 DEBUG oslo_service.service [None req-c282a20c-719d-4b58-81e3-ad3fe1a9c2f4 - - - - - -] workarounds.disable_libvirt_livesnapshot = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:44:57 localhost nova_compute[286344]: 2025-12-15 09:44:57.901 286348 DEBUG oslo_service.service [None req-c282a20c-719d-4b58-81e3-ad3fe1a9c2f4 - - - - - -] workarounds.disable_rootwrap = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:44:57 localhost nova_compute[286344]: 2025-12-15 09:44:57.901 286348 DEBUG oslo_service.service [None req-c282a20c-719d-4b58-81e3-ad3fe1a9c2f4 - - - - - -] workarounds.enable_numa_live_migration = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:44:57 localhost 
nova_compute[286344]: 2025-12-15 09:44:57.901 286348 DEBUG oslo_service.service [None req-c282a20c-719d-4b58-81e3-ad3fe1a9c2f4 - - - - - -] workarounds.enable_qemu_monitor_announce_self = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:44:57 localhost nova_compute[286344]: 2025-12-15 09:44:57.901 286348 DEBUG oslo_service.service [None req-c282a20c-719d-4b58-81e3-ad3fe1a9c2f4 - - - - - -] workarounds.ensure_libvirt_rbd_instance_dir_cleanup = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:44:57 localhost nova_compute[286344]: 2025-12-15 09:44:57.901 286348 DEBUG oslo_service.service [None req-c282a20c-719d-4b58-81e3-ad3fe1a9c2f4 - - - - - -] workarounds.handle_virt_lifecycle_events = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:44:57 localhost nova_compute[286344]: 2025-12-15 09:44:57.902 286348 DEBUG oslo_service.service [None req-c282a20c-719d-4b58-81e3-ad3fe1a9c2f4 - - - - - -] workarounds.libvirt_disable_apic = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:44:57 localhost nova_compute[286344]: 2025-12-15 09:44:57.902 286348 DEBUG oslo_service.service [None req-c282a20c-719d-4b58-81e3-ad3fe1a9c2f4 - - - - - -] workarounds.never_download_image_if_on_rbd = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:44:57 localhost nova_compute[286344]: 2025-12-15 09:44:57.902 286348 DEBUG oslo_service.service [None req-c282a20c-719d-4b58-81e3-ad3fe1a9c2f4 - - - - - -] workarounds.qemu_monitor_announce_self_count = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:44:57 localhost nova_compute[286344]: 2025-12-15 09:44:57.902 286348 DEBUG oslo_service.service [None req-c282a20c-719d-4b58-81e3-ad3fe1a9c2f4 - - - - - -] workarounds.qemu_monitor_announce_self_interval = 1 log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:44:57 localhost nova_compute[286344]: 2025-12-15 09:44:57.902 286348 DEBUG oslo_service.service [None req-c282a20c-719d-4b58-81e3-ad3fe1a9c2f4 - - - - - -] workarounds.reserve_disk_resource_for_image_cache = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:44:57 localhost nova_compute[286344]: 2025-12-15 09:44:57.902 286348 DEBUG oslo_service.service [None req-c282a20c-719d-4b58-81e3-ad3fe1a9c2f4 - - - - - -] workarounds.skip_cpu_compare_at_startup = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:44:57 localhost nova_compute[286344]: 2025-12-15 09:44:57.902 286348 DEBUG oslo_service.service [None req-c282a20c-719d-4b58-81e3-ad3fe1a9c2f4 - - - - - -] workarounds.skip_cpu_compare_on_dest = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:44:57 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:fc:1f:13 MACDST=fa:16:3e:b3:bb:e3 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=23884 DF PROTO=TCP SPT=49422 DPT=9102 SEQ=2260569640 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A83A85E650000000001030307) Dec 15 04:44:57 localhost nova_compute[286344]: 2025-12-15 09:44:57.902 286348 DEBUG oslo_service.service [None req-c282a20c-719d-4b58-81e3-ad3fe1a9c2f4 - - - - - -] workarounds.skip_hypervisor_version_check_on_lm = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:44:57 localhost nova_compute[286344]: 2025-12-15 09:44:57.903 286348 DEBUG oslo_service.service [None req-c282a20c-719d-4b58-81e3-ad3fe1a9c2f4 - - - - - -] workarounds.skip_reserve_in_use_ironic_nodes = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:44:57 localhost nova_compute[286344]: 2025-12-15 09:44:57.903 286348 DEBUG oslo_service.service 
[None req-c282a20c-719d-4b58-81e3-ad3fe1a9c2f4 - - - - - -] workarounds.unified_limits_count_pcpu_as_vcpu = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:44:57 localhost nova_compute[286344]: 2025-12-15 09:44:57.903 286348 DEBUG oslo_service.service [None req-c282a20c-719d-4b58-81e3-ad3fe1a9c2f4 - - - - - -] workarounds.wait_for_vif_plugged_event_during_hard_reboot = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:44:57 localhost nova_compute[286344]: 2025-12-15 09:44:57.903 286348 DEBUG oslo_service.service [None req-c282a20c-719d-4b58-81e3-ad3fe1a9c2f4 - - - - - -] wsgi.api_paste_config = api-paste.ini log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:44:57 localhost nova_compute[286344]: 2025-12-15 09:44:57.903 286348 DEBUG oslo_service.service [None req-c282a20c-719d-4b58-81e3-ad3fe1a9c2f4 - - - - - -] wsgi.client_socket_timeout = 900 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:44:57 localhost nova_compute[286344]: 2025-12-15 09:44:57.903 286348 DEBUG oslo_service.service [None req-c282a20c-719d-4b58-81e3-ad3fe1a9c2f4 - - - - - -] wsgi.default_pool_size = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:44:57 localhost nova_compute[286344]: 2025-12-15 09:44:57.903 286348 DEBUG oslo_service.service [None req-c282a20c-719d-4b58-81e3-ad3fe1a9c2f4 - - - - - -] wsgi.keep_alive = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:44:57 localhost nova_compute[286344]: 2025-12-15 09:44:57.904 286348 DEBUG oslo_service.service [None req-c282a20c-719d-4b58-81e3-ad3fe1a9c2f4 - - - - - -] wsgi.max_header_line = 16384 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:44:57 localhost nova_compute[286344]: 2025-12-15 09:44:57.904 286348 DEBUG oslo_service.service [None 
req-c282a20c-719d-4b58-81e3-ad3fe1a9c2f4 - - - - - -] wsgi.secure_proxy_ssl_header = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:44:57 localhost nova_compute[286344]: 2025-12-15 09:44:57.904 286348 DEBUG oslo_service.service [None req-c282a20c-719d-4b58-81e3-ad3fe1a9c2f4 - - - - - -] wsgi.ssl_ca_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:44:57 localhost nova_compute[286344]: 2025-12-15 09:44:57.904 286348 DEBUG oslo_service.service [None req-c282a20c-719d-4b58-81e3-ad3fe1a9c2f4 - - - - - -] wsgi.ssl_cert_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:44:57 localhost nova_compute[286344]: 2025-12-15 09:44:57.904 286348 DEBUG oslo_service.service [None req-c282a20c-719d-4b58-81e3-ad3fe1a9c2f4 - - - - - -] wsgi.ssl_key_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:44:57 localhost nova_compute[286344]: 2025-12-15 09:44:57.904 286348 DEBUG oslo_service.service [None req-c282a20c-719d-4b58-81e3-ad3fe1a9c2f4 - - - - - -] wsgi.tcp_keepidle = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:44:57 localhost nova_compute[286344]: 2025-12-15 09:44:57.904 286348 DEBUG oslo_service.service [None req-c282a20c-719d-4b58-81e3-ad3fe1a9c2f4 - - - - - -] wsgi.wsgi_log_format = %(client_ip)s "%(request_line)s" status: %(status_code)s len: %(body_length)s time: %(wall_seconds).7f log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:44:57 localhost nova_compute[286344]: 2025-12-15 09:44:57.905 286348 DEBUG oslo_service.service [None req-c282a20c-719d-4b58-81e3-ad3fe1a9c2f4 - - - - - -] zvm.ca_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:44:57 localhost nova_compute[286344]: 2025-12-15 09:44:57.905 286348 DEBUG oslo_service.service 
[None req-c282a20c-719d-4b58-81e3-ad3fe1a9c2f4 - - - - - -] zvm.cloud_connector_url = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:44:57 localhost nova_compute[286344]: 2025-12-15 09:44:57.905 286348 DEBUG oslo_service.service [None req-c282a20c-719d-4b58-81e3-ad3fe1a9c2f4 - - - - - -] zvm.image_tmp_path = /var/lib/nova/images log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:44:57 localhost nova_compute[286344]: 2025-12-15 09:44:57.905 286348 DEBUG oslo_service.service [None req-c282a20c-719d-4b58-81e3-ad3fe1a9c2f4 - - - - - -] zvm.reachable_timeout = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:44:57 localhost nova_compute[286344]: 2025-12-15 09:44:57.905 286348 DEBUG oslo_service.service [None req-c282a20c-719d-4b58-81e3-ad3fe1a9c2f4 - - - - - -] oslo_policy.enforce_new_defaults = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:44:57 localhost nova_compute[286344]: 2025-12-15 09:44:57.905 286348 DEBUG oslo_service.service [None req-c282a20c-719d-4b58-81e3-ad3fe1a9c2f4 - - - - - -] oslo_policy.enforce_scope = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:44:57 localhost nova_compute[286344]: 2025-12-15 09:44:57.905 286348 DEBUG oslo_service.service [None req-c282a20c-719d-4b58-81e3-ad3fe1a9c2f4 - - - - - -] oslo_policy.policy_default_rule = default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:44:57 localhost nova_compute[286344]: 2025-12-15 09:44:57.906 286348 DEBUG oslo_service.service [None req-c282a20c-719d-4b58-81e3-ad3fe1a9c2f4 - - - - - -] oslo_policy.policy_dirs = ['policy.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:44:57 localhost nova_compute[286344]: 2025-12-15 09:44:57.906 286348 DEBUG oslo_service.service [None 
req-c282a20c-719d-4b58-81e3-ad3fe1a9c2f4 - - - - - -] oslo_policy.policy_file = policy.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:44:57 localhost nova_compute[286344]: 2025-12-15 09:44:57.906 286348 DEBUG oslo_service.service [None req-c282a20c-719d-4b58-81e3-ad3fe1a9c2f4 - - - - - -] oslo_policy.remote_content_type = application/x-www-form-urlencoded log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:44:57 localhost nova_compute[286344]: 2025-12-15 09:44:57.906 286348 DEBUG oslo_service.service [None req-c282a20c-719d-4b58-81e3-ad3fe1a9c2f4 - - - - - -] oslo_policy.remote_ssl_ca_crt_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:44:57 localhost nova_compute[286344]: 2025-12-15 09:44:57.906 286348 DEBUG oslo_service.service [None req-c282a20c-719d-4b58-81e3-ad3fe1a9c2f4 - - - - - -] oslo_policy.remote_ssl_client_crt_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:44:57 localhost nova_compute[286344]: 2025-12-15 09:44:57.906 286348 DEBUG oslo_service.service [None req-c282a20c-719d-4b58-81e3-ad3fe1a9c2f4 - - - - - -] oslo_policy.remote_ssl_client_key_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:44:57 localhost nova_compute[286344]: 2025-12-15 09:44:57.906 286348 DEBUG oslo_service.service [None req-c282a20c-719d-4b58-81e3-ad3fe1a9c2f4 - - - - - -] oslo_policy.remote_ssl_verify_server_crt = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:44:57 localhost nova_compute[286344]: 2025-12-15 09:44:57.907 286348 DEBUG oslo_service.service [None req-c282a20c-719d-4b58-81e3-ad3fe1a9c2f4 - - - - - -] oslo_versionedobjects.fatal_exception_format_errors = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:44:57 localhost 
nova_compute[286344]: 2025-12-15 09:44:57.907 286348 DEBUG oslo_service.service [None req-c282a20c-719d-4b58-81e3-ad3fe1a9c2f4 - - - - - -] oslo_middleware.http_basic_auth_user_file = /etc/htpasswd log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:44:57 localhost nova_compute[286344]: 2025-12-15 09:44:57.907 286348 DEBUG oslo_service.service [None req-c282a20c-719d-4b58-81e3-ad3fe1a9c2f4 - - - - - -] remote_debug.host = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:44:57 localhost nova_compute[286344]: 2025-12-15 09:44:57.907 286348 DEBUG oslo_service.service [None req-c282a20c-719d-4b58-81e3-ad3fe1a9c2f4 - - - - - -] remote_debug.port = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:44:57 localhost nova_compute[286344]: 2025-12-15 09:44:57.907 286348 DEBUG oslo_service.service [None req-c282a20c-719d-4b58-81e3-ad3fe1a9c2f4 - - - - - -] oslo_messaging_rabbit.amqp_auto_delete = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:44:57 localhost nova_compute[286344]: 2025-12-15 09:44:57.907 286348 DEBUG oslo_service.service [None req-c282a20c-719d-4b58-81e3-ad3fe1a9c2f4 - - - - - -] oslo_messaging_rabbit.amqp_durable_queues = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:44:57 localhost nova_compute[286344]: 2025-12-15 09:44:57.907 286348 DEBUG oslo_service.service [None req-c282a20c-719d-4b58-81e3-ad3fe1a9c2f4 - - - - - -] oslo_messaging_rabbit.conn_pool_min_size = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:44:57 localhost nova_compute[286344]: 2025-12-15 09:44:57.908 286348 DEBUG oslo_service.service [None req-c282a20c-719d-4b58-81e3-ad3fe1a9c2f4 - - - - - -] oslo_messaging_rabbit.conn_pool_ttl = 1200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 
04:44:57 localhost nova_compute[286344]: 2025-12-15 09:44:57.908 286348 DEBUG oslo_service.service [None req-c282a20c-719d-4b58-81e3-ad3fe1a9c2f4 - - - - - -] oslo_messaging_rabbit.direct_mandatory_flag = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:44:57 localhost nova_compute[286344]: 2025-12-15 09:44:57.908 286348 DEBUG oslo_service.service [None req-c282a20c-719d-4b58-81e3-ad3fe1a9c2f4 - - - - - -] oslo_messaging_rabbit.enable_cancel_on_failover = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:44:57 localhost nova_compute[286344]: 2025-12-15 09:44:57.908 286348 DEBUG oslo_service.service [None req-c282a20c-719d-4b58-81e3-ad3fe1a9c2f4 - - - - - -] oslo_messaging_rabbit.heartbeat_in_pthread = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:44:57 localhost nova_compute[286344]: 2025-12-15 09:44:57.908 286348 DEBUG oslo_service.service [None req-c282a20c-719d-4b58-81e3-ad3fe1a9c2f4 - - - - - -] oslo_messaging_rabbit.heartbeat_rate = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:44:57 localhost nova_compute[286344]: 2025-12-15 09:44:57.908 286348 DEBUG oslo_service.service [None req-c282a20c-719d-4b58-81e3-ad3fe1a9c2f4 - - - - - -] oslo_messaging_rabbit.heartbeat_timeout_threshold = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:44:57 localhost nova_compute[286344]: 2025-12-15 09:44:57.908 286348 DEBUG oslo_service.service [None req-c282a20c-719d-4b58-81e3-ad3fe1a9c2f4 - - - - - -] oslo_messaging_rabbit.kombu_compression = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:44:57 localhost nova_compute[286344]: 2025-12-15 09:44:57.908 286348 DEBUG oslo_service.service [None req-c282a20c-719d-4b58-81e3-ad3fe1a9c2f4 - - - - - -] oslo_messaging_rabbit.kombu_failover_strategy = round-robin 
log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:44:57 localhost nova_compute[286344]: 2025-12-15 09:44:57.909 286348 DEBUG oslo_service.service [None req-c282a20c-719d-4b58-81e3-ad3fe1a9c2f4 - - - - - -] oslo_messaging_rabbit.kombu_missing_consumer_retry_timeout = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:44:57 localhost nova_compute[286344]: 2025-12-15 09:44:57.909 286348 DEBUG oslo_service.service [None req-c282a20c-719d-4b58-81e3-ad3fe1a9c2f4 - - - - - -] oslo_messaging_rabbit.kombu_reconnect_delay = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:44:57 localhost nova_compute[286344]: 2025-12-15 09:44:57.909 286348 DEBUG oslo_service.service [None req-c282a20c-719d-4b58-81e3-ad3fe1a9c2f4 - - - - - -] oslo_messaging_rabbit.rabbit_ha_queues = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:44:57 localhost nova_compute[286344]: 2025-12-15 09:44:57.909 286348 DEBUG oslo_service.service [None req-c282a20c-719d-4b58-81e3-ad3fe1a9c2f4 - - - - - -] oslo_messaging_rabbit.rabbit_interval_max = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:44:57 localhost nova_compute[286344]: 2025-12-15 09:44:57.909 286348 DEBUG oslo_service.service [None req-c282a20c-719d-4b58-81e3-ad3fe1a9c2f4 - - - - - -] oslo_messaging_rabbit.rabbit_login_method = AMQPLAIN log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:44:57 localhost nova_compute[286344]: 2025-12-15 09:44:57.909 286348 DEBUG oslo_service.service [None req-c282a20c-719d-4b58-81e3-ad3fe1a9c2f4 - - - - - -] oslo_messaging_rabbit.rabbit_qos_prefetch_count = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:44:57 localhost nova_compute[286344]: 2025-12-15 09:44:57.909 286348 DEBUG oslo_service.service [None 
req-c282a20c-719d-4b58-81e3-ad3fe1a9c2f4 - - - - - -] oslo_messaging_rabbit.rabbit_quorum_delivery_limit = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:44:57 localhost nova_compute[286344]: 2025-12-15 09:44:57.910 286348 DEBUG oslo_service.service [None req-c282a20c-719d-4b58-81e3-ad3fe1a9c2f4 - - - - - -] oslo_messaging_rabbit.rabbit_quorum_max_memory_bytes = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:44:57 localhost nova_compute[286344]: 2025-12-15 09:44:57.910 286348 DEBUG oslo_service.service [None req-c282a20c-719d-4b58-81e3-ad3fe1a9c2f4 - - - - - -] oslo_messaging_rabbit.rabbit_quorum_max_memory_length = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:44:57 localhost nova_compute[286344]: 2025-12-15 09:44:57.910 286348 DEBUG oslo_service.service [None req-c282a20c-719d-4b58-81e3-ad3fe1a9c2f4 - - - - - -] oslo_messaging_rabbit.rabbit_quorum_queue = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:44:57 localhost nova_compute[286344]: 2025-12-15 09:44:57.910 286348 DEBUG oslo_service.service [None req-c282a20c-719d-4b58-81e3-ad3fe1a9c2f4 - - - - - -] oslo_messaging_rabbit.rabbit_retry_backoff = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:44:57 localhost nova_compute[286344]: 2025-12-15 09:44:57.910 286348 DEBUG oslo_service.service [None req-c282a20c-719d-4b58-81e3-ad3fe1a9c2f4 - - - - - -] oslo_messaging_rabbit.rabbit_retry_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:44:57 localhost nova_compute[286344]: 2025-12-15 09:44:57.910 286348 DEBUG oslo_service.service [None req-c282a20c-719d-4b58-81e3-ad3fe1a9c2f4 - - - - - -] oslo_messaging_rabbit.rabbit_transient_queues_ttl = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:44:57 
localhost nova_compute[286344]: 2025-12-15 09:44:57.910 286348 DEBUG oslo_service.service [None req-c282a20c-719d-4b58-81e3-ad3fe1a9c2f4 - - - - - -] oslo_messaging_rabbit.rpc_conn_pool_size = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:44:57 localhost nova_compute[286344]: 2025-12-15 09:44:57.910 286348 DEBUG oslo_service.service [None req-c282a20c-719d-4b58-81e3-ad3fe1a9c2f4 - - - - - -] oslo_messaging_rabbit.ssl = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:44:57 localhost nova_compute[286344]: 2025-12-15 09:44:57.911 286348 DEBUG oslo_service.service [None req-c282a20c-719d-4b58-81e3-ad3fe1a9c2f4 - - - - - -] oslo_messaging_rabbit.ssl_ca_file = log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:44:57 localhost nova_compute[286344]: 2025-12-15 09:44:57.911 286348 DEBUG oslo_service.service [None req-c282a20c-719d-4b58-81e3-ad3fe1a9c2f4 - - - - - -] oslo_messaging_rabbit.ssl_cert_file = log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:44:57 localhost nova_compute[286344]: 2025-12-15 09:44:57.911 286348 DEBUG oslo_service.service [None req-c282a20c-719d-4b58-81e3-ad3fe1a9c2f4 - - - - - -] oslo_messaging_rabbit.ssl_enforce_fips_mode = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:44:57 localhost nova_compute[286344]: 2025-12-15 09:44:57.911 286348 DEBUG oslo_service.service [None req-c282a20c-719d-4b58-81e3-ad3fe1a9c2f4 - - - - - -] oslo_messaging_rabbit.ssl_key_file = log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:44:57 localhost nova_compute[286344]: 2025-12-15 09:44:57.911 286348 DEBUG oslo_service.service [None req-c282a20c-719d-4b58-81e3-ad3fe1a9c2f4 - - - - - -] oslo_messaging_rabbit.ssl_version = log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 
04:44:57 localhost nova_compute[286344]: 2025-12-15 09:44:57.911 286348 DEBUG oslo_service.service [None req-c282a20c-719d-4b58-81e3-ad3fe1a9c2f4 - - - - - -] oslo_messaging_notifications.driver = ['noop'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:44:57 localhost nova_compute[286344]: 2025-12-15 09:44:57.911 286348 DEBUG oslo_service.service [None req-c282a20c-719d-4b58-81e3-ad3fe1a9c2f4 - - - - - -] oslo_messaging_notifications.retry = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:44:57 localhost nova_compute[286344]: 2025-12-15 09:44:57.912 286348 DEBUG oslo_service.service [None req-c282a20c-719d-4b58-81e3-ad3fe1a9c2f4 - - - - - -] oslo_messaging_notifications.topics = ['notifications'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:44:57 localhost nova_compute[286344]: 2025-12-15 09:44:57.912 286348 DEBUG oslo_service.service [None req-c282a20c-719d-4b58-81e3-ad3fe1a9c2f4 - - - - - -] oslo_messaging_notifications.transport_url = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:44:57 localhost nova_compute[286344]: 2025-12-15 09:44:57.912 286348 DEBUG oslo_service.service [None req-c282a20c-719d-4b58-81e3-ad3fe1a9c2f4 - - - - - -] oslo_limit.auth_section = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:44:57 localhost nova_compute[286344]: 2025-12-15 09:44:57.912 286348 DEBUG oslo_service.service [None req-c282a20c-719d-4b58-81e3-ad3fe1a9c2f4 - - - - - -] oslo_limit.auth_type = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:44:57 localhost nova_compute[286344]: 2025-12-15 09:44:57.912 286348 DEBUG oslo_service.service [None req-c282a20c-719d-4b58-81e3-ad3fe1a9c2f4 - - - - - -] oslo_limit.auth_url = http://keystone-internal.openstack.svc:5000 log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:44:57 localhost nova_compute[286344]: 2025-12-15 09:44:57.912 286348 DEBUG oslo_service.service [None req-c282a20c-719d-4b58-81e3-ad3fe1a9c2f4 - - - - - -] oslo_limit.cafile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:44:57 localhost nova_compute[286344]: 2025-12-15 09:44:57.912 286348 DEBUG oslo_service.service [None req-c282a20c-719d-4b58-81e3-ad3fe1a9c2f4 - - - - - -] oslo_limit.certfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:44:57 localhost nova_compute[286344]: 2025-12-15 09:44:57.913 286348 DEBUG oslo_service.service [None req-c282a20c-719d-4b58-81e3-ad3fe1a9c2f4 - - - - - -] oslo_limit.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:44:57 localhost nova_compute[286344]: 2025-12-15 09:44:57.913 286348 DEBUG oslo_service.service [None req-c282a20c-719d-4b58-81e3-ad3fe1a9c2f4 - - - - - -] oslo_limit.connect_retries = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:44:57 localhost nova_compute[286344]: 2025-12-15 09:44:57.913 286348 DEBUG oslo_service.service [None req-c282a20c-719d-4b58-81e3-ad3fe1a9c2f4 - - - - - -] oslo_limit.connect_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:44:57 localhost nova_compute[286344]: 2025-12-15 09:44:57.913 286348 DEBUG oslo_service.service [None req-c282a20c-719d-4b58-81e3-ad3fe1a9c2f4 - - - - - -] oslo_limit.default_domain_id = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:44:57 localhost nova_compute[286344]: 2025-12-15 09:44:57.913 286348 DEBUG oslo_service.service [None req-c282a20c-719d-4b58-81e3-ad3fe1a9c2f4 - - - - - -] oslo_limit.default_domain_name = None log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:44:57 localhost nova_compute[286344]: 2025-12-15 09:44:57.913 286348 DEBUG oslo_service.service [None req-c282a20c-719d-4b58-81e3-ad3fe1a9c2f4 - - - - - -] oslo_limit.domain_id = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:44:57 localhost nova_compute[286344]: 2025-12-15 09:44:57.913 286348 DEBUG oslo_service.service [None req-c282a20c-719d-4b58-81e3-ad3fe1a9c2f4 - - - - - -] oslo_limit.domain_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:44:57 localhost nova_compute[286344]: 2025-12-15 09:44:57.913 286348 DEBUG oslo_service.service [None req-c282a20c-719d-4b58-81e3-ad3fe1a9c2f4 - - - - - -] oslo_limit.endpoint_id = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:44:57 localhost nova_compute[286344]: 2025-12-15 09:44:57.914 286348 DEBUG oslo_service.service [None req-c282a20c-719d-4b58-81e3-ad3fe1a9c2f4 - - - - - -] oslo_limit.endpoint_override = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:44:57 localhost nova_compute[286344]: 2025-12-15 09:44:57.914 286348 DEBUG oslo_service.service [None req-c282a20c-719d-4b58-81e3-ad3fe1a9c2f4 - - - - - -] oslo_limit.insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:44:57 localhost nova_compute[286344]: 2025-12-15 09:44:57.914 286348 DEBUG oslo_service.service [None req-c282a20c-719d-4b58-81e3-ad3fe1a9c2f4 - - - - - -] oslo_limit.keyfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:44:57 localhost nova_compute[286344]: 2025-12-15 09:44:57.914 286348 DEBUG oslo_service.service [None req-c282a20c-719d-4b58-81e3-ad3fe1a9c2f4 - - - - - -] oslo_limit.max_version = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 
15 04:44:57 localhost nova_compute[286344]: 2025-12-15 09:44:57.914 286348 DEBUG oslo_service.service [None req-c282a20c-719d-4b58-81e3-ad3fe1a9c2f4 - - - - - -] oslo_limit.min_version = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:44:57 localhost nova_compute[286344]: 2025-12-15 09:44:57.914 286348 DEBUG oslo_service.service [None req-c282a20c-719d-4b58-81e3-ad3fe1a9c2f4 - - - - - -] oslo_limit.password = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:44:57 localhost nova_compute[286344]: 2025-12-15 09:44:57.914 286348 DEBUG oslo_service.service [None req-c282a20c-719d-4b58-81e3-ad3fe1a9c2f4 - - - - - -] oslo_limit.project_domain_id = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:44:57 localhost nova_compute[286344]: 2025-12-15 09:44:57.914 286348 DEBUG oslo_service.service [None req-c282a20c-719d-4b58-81e3-ad3fe1a9c2f4 - - - - - -] oslo_limit.project_domain_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:44:57 localhost nova_compute[286344]: 2025-12-15 09:44:57.915 286348 DEBUG oslo_service.service [None req-c282a20c-719d-4b58-81e3-ad3fe1a9c2f4 - - - - - -] oslo_limit.project_id = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:44:57 localhost nova_compute[286344]: 2025-12-15 09:44:57.915 286348 DEBUG oslo_service.service [None req-c282a20c-719d-4b58-81e3-ad3fe1a9c2f4 - - - - - -] oslo_limit.project_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:44:57 localhost nova_compute[286344]: 2025-12-15 09:44:57.915 286348 DEBUG oslo_service.service [None req-c282a20c-719d-4b58-81e3-ad3fe1a9c2f4 - - - - - -] oslo_limit.region_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:44:57 localhost nova_compute[286344]: 2025-12-15 
09:44:57.915 286348 DEBUG oslo_service.service [None req-c282a20c-719d-4b58-81e3-ad3fe1a9c2f4 - - - - - -] oslo_limit.service_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:44:57 localhost nova_compute[286344]: 2025-12-15 09:44:57.915 286348 DEBUG oslo_service.service [None req-c282a20c-719d-4b58-81e3-ad3fe1a9c2f4 - - - - - -] oslo_limit.service_type = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:44:57 localhost nova_compute[286344]: 2025-12-15 09:44:57.915 286348 DEBUG oslo_service.service [None req-c282a20c-719d-4b58-81e3-ad3fe1a9c2f4 - - - - - -] oslo_limit.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:44:57 localhost nova_compute[286344]: 2025-12-15 09:44:57.915 286348 DEBUG oslo_service.service [None req-c282a20c-719d-4b58-81e3-ad3fe1a9c2f4 - - - - - -] oslo_limit.status_code_retries = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:44:57 localhost nova_compute[286344]: 2025-12-15 09:44:57.915 286348 DEBUG oslo_service.service [None req-c282a20c-719d-4b58-81e3-ad3fe1a9c2f4 - - - - - -] oslo_limit.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:44:57 localhost nova_compute[286344]: 2025-12-15 09:44:57.916 286348 DEBUG oslo_service.service [None req-c282a20c-719d-4b58-81e3-ad3fe1a9c2f4 - - - - - -] oslo_limit.system_scope = all log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:44:57 localhost nova_compute[286344]: 2025-12-15 09:44:57.916 286348 DEBUG oslo_service.service [None req-c282a20c-719d-4b58-81e3-ad3fe1a9c2f4 - - - - - -] oslo_limit.timeout = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:44:57 localhost nova_compute[286344]: 2025-12-15 09:44:57.916 286348 DEBUG oslo_service.service 
[None req-c282a20c-719d-4b58-81e3-ad3fe1a9c2f4 - - - - - -] oslo_limit.trust_id = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:44:57 localhost nova_compute[286344]: 2025-12-15 09:44:57.916 286348 DEBUG oslo_service.service [None req-c282a20c-719d-4b58-81e3-ad3fe1a9c2f4 - - - - - -] oslo_limit.user_domain_id = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:44:57 localhost nova_compute[286344]: 2025-12-15 09:44:57.916 286348 DEBUG oslo_service.service [None req-c282a20c-719d-4b58-81e3-ad3fe1a9c2f4 - - - - - -] oslo_limit.user_domain_name = Default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:44:57 localhost nova_compute[286344]: 2025-12-15 09:44:57.916 286348 DEBUG oslo_service.service [None req-c282a20c-719d-4b58-81e3-ad3fe1a9c2f4 - - - - - -] oslo_limit.user_id = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:44:57 localhost nova_compute[286344]: 2025-12-15 09:44:57.916 286348 DEBUG oslo_service.service [None req-c282a20c-719d-4b58-81e3-ad3fe1a9c2f4 - - - - - -] oslo_limit.username = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:44:57 localhost nova_compute[286344]: 2025-12-15 09:44:57.917 286348 DEBUG oslo_service.service [None req-c282a20c-719d-4b58-81e3-ad3fe1a9c2f4 - - - - - -] oslo_limit.valid_interfaces = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:44:57 localhost nova_compute[286344]: 2025-12-15 09:44:57.917 286348 DEBUG oslo_service.service [None req-c282a20c-719d-4b58-81e3-ad3fe1a9c2f4 - - - - - -] oslo_limit.version = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:44:57 localhost nova_compute[286344]: 2025-12-15 09:44:57.917 286348 DEBUG oslo_service.service [None req-c282a20c-719d-4b58-81e3-ad3fe1a9c2f4 - - - - - -] 
oslo_reports.file_event_handler = /var/lib/nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:44:57 localhost nova_compute[286344]: 2025-12-15 09:44:57.917 286348 DEBUG oslo_service.service [None req-c282a20c-719d-4b58-81e3-ad3fe1a9c2f4 - - - - - -] oslo_reports.file_event_handler_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:44:57 localhost nova_compute[286344]: 2025-12-15 09:44:57.917 286348 DEBUG oslo_service.service [None req-c282a20c-719d-4b58-81e3-ad3fe1a9c2f4 - - - - - -] oslo_reports.log_dir = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:44:57 localhost nova_compute[286344]: 2025-12-15 09:44:57.917 286348 DEBUG oslo_service.service [None req-c282a20c-719d-4b58-81e3-ad3fe1a9c2f4 - - - - - -] vif_plug_linux_bridge_privileged.capabilities = [12] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:44:57 localhost nova_compute[286344]: 2025-12-15 09:44:57.917 286348 DEBUG oslo_service.service [None req-c282a20c-719d-4b58-81e3-ad3fe1a9c2f4 - - - - - -] vif_plug_linux_bridge_privileged.group = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:44:57 localhost nova_compute[286344]: 2025-12-15 09:44:57.917 286348 DEBUG oslo_service.service [None req-c282a20c-719d-4b58-81e3-ad3fe1a9c2f4 - - - - - -] vif_plug_linux_bridge_privileged.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:44:57 localhost nova_compute[286344]: 2025-12-15 09:44:57.918 286348 DEBUG oslo_service.service [None req-c282a20c-719d-4b58-81e3-ad3fe1a9c2f4 - - - - - -] vif_plug_linux_bridge_privileged.logger_name = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:44:57 localhost nova_compute[286344]: 2025-12-15 09:44:57.918 286348 DEBUG 
oslo_service.service [None req-c282a20c-719d-4b58-81e3-ad3fe1a9c2f4 - - - - - -] vif_plug_linux_bridge_privileged.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:44:57 localhost nova_compute[286344]: 2025-12-15 09:44:57.918 286348 DEBUG oslo_service.service [None req-c282a20c-719d-4b58-81e3-ad3fe1a9c2f4 - - - - - -] vif_plug_linux_bridge_privileged.user = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:44:57 localhost nova_compute[286344]: 2025-12-15 09:44:57.918 286348 DEBUG oslo_service.service [None req-c282a20c-719d-4b58-81e3-ad3fe1a9c2f4 - - - - - -] vif_plug_ovs_privileged.capabilities = [12, 1] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:44:57 localhost nova_compute[286344]: 2025-12-15 09:44:57.918 286348 DEBUG oslo_service.service [None req-c282a20c-719d-4b58-81e3-ad3fe1a9c2f4 - - - - - -] vif_plug_ovs_privileged.group = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:44:57 localhost nova_compute[286344]: 2025-12-15 09:44:57.918 286348 DEBUG oslo_service.service [None req-c282a20c-719d-4b58-81e3-ad3fe1a9c2f4 - - - - - -] vif_plug_ovs_privileged.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:44:57 localhost nova_compute[286344]: 2025-12-15 09:44:57.918 286348 DEBUG oslo_service.service [None req-c282a20c-719d-4b58-81e3-ad3fe1a9c2f4 - - - - - -] vif_plug_ovs_privileged.logger_name = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:44:57 localhost nova_compute[286344]: 2025-12-15 09:44:57.919 286348 DEBUG oslo_service.service [None req-c282a20c-719d-4b58-81e3-ad3fe1a9c2f4 - - - - - -] vif_plug_ovs_privileged.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:44:57 localhost 
nova_compute[286344]: 2025-12-15 09:44:57.919 286348 DEBUG oslo_service.service [None req-c282a20c-719d-4b58-81e3-ad3fe1a9c2f4 - - - - - -] vif_plug_ovs_privileged.user = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:44:57 localhost nova_compute[286344]: 2025-12-15 09:44:57.919 286348 DEBUG oslo_service.service [None req-c282a20c-719d-4b58-81e3-ad3fe1a9c2f4 - - - - - -] os_vif_linux_bridge.flat_interface = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:44:57 localhost nova_compute[286344]: 2025-12-15 09:44:57.919 286348 DEBUG oslo_service.service [None req-c282a20c-719d-4b58-81e3-ad3fe1a9c2f4 - - - - - -] os_vif_linux_bridge.forward_bridge_interface = ['all'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:44:57 localhost nova_compute[286344]: 2025-12-15 09:44:57.919 286348 DEBUG oslo_service.service [None req-c282a20c-719d-4b58-81e3-ad3fe1a9c2f4 - - - - - -] os_vif_linux_bridge.iptables_bottom_regex = log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:44:57 localhost nova_compute[286344]: 2025-12-15 09:44:57.919 286348 DEBUG oslo_service.service [None req-c282a20c-719d-4b58-81e3-ad3fe1a9c2f4 - - - - - -] os_vif_linux_bridge.iptables_drop_action = DROP log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:44:57 localhost nova_compute[286344]: 2025-12-15 09:44:57.919 286348 DEBUG oslo_service.service [None req-c282a20c-719d-4b58-81e3-ad3fe1a9c2f4 - - - - - -] os_vif_linux_bridge.iptables_top_regex = log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:44:57 localhost nova_compute[286344]: 2025-12-15 09:44:57.919 286348 DEBUG oslo_service.service [None req-c282a20c-719d-4b58-81e3-ad3fe1a9c2f4 - - - - - -] os_vif_linux_bridge.network_device_mtu = 1500 log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:44:57 localhost nova_compute[286344]: 2025-12-15 09:44:57.920 286348 DEBUG oslo_service.service [None req-c282a20c-719d-4b58-81e3-ad3fe1a9c2f4 - - - - - -] os_vif_linux_bridge.use_ipv6 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:44:57 localhost nova_compute[286344]: 2025-12-15 09:44:57.920 286348 DEBUG oslo_service.service [None req-c282a20c-719d-4b58-81e3-ad3fe1a9c2f4 - - - - - -] os_vif_linux_bridge.vlan_interface = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:44:57 localhost nova_compute[286344]: 2025-12-15 09:44:57.920 286348 DEBUG oslo_service.service [None req-c282a20c-719d-4b58-81e3-ad3fe1a9c2f4 - - - - - -] os_vif_ovs.isolate_vif = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:44:57 localhost nova_compute[286344]: 2025-12-15 09:44:57.920 286348 DEBUG oslo_service.service [None req-c282a20c-719d-4b58-81e3-ad3fe1a9c2f4 - - - - - -] os_vif_ovs.network_device_mtu = 1500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:44:57 localhost nova_compute[286344]: 2025-12-15 09:44:57.920 286348 DEBUG oslo_service.service [None req-c282a20c-719d-4b58-81e3-ad3fe1a9c2f4 - - - - - -] os_vif_ovs.ovs_vsctl_timeout = 120 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:44:57 localhost nova_compute[286344]: 2025-12-15 09:44:57.920 286348 DEBUG oslo_service.service [None req-c282a20c-719d-4b58-81e3-ad3fe1a9c2f4 - - - - - -] os_vif_ovs.ovsdb_connection = tcp:127.0.0.1:6640 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:44:57 localhost nova_compute[286344]: 2025-12-15 09:44:57.920 286348 DEBUG oslo_service.service [None req-c282a20c-719d-4b58-81e3-ad3fe1a9c2f4 - - - - - -] os_vif_ovs.ovsdb_interface = native log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:44:57 localhost nova_compute[286344]: 2025-12-15 09:44:57.921 286348 DEBUG oslo_service.service [None req-c282a20c-719d-4b58-81e3-ad3fe1a9c2f4 - - - - - -] os_vif_ovs.per_port_bridge = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:44:57 localhost nova_compute[286344]: 2025-12-15 09:44:57.921 286348 DEBUG oslo_service.service [None req-c282a20c-719d-4b58-81e3-ad3fe1a9c2f4 - - - - - -] os_brick.lock_path = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:44:57 localhost nova_compute[286344]: 2025-12-15 09:44:57.921 286348 DEBUG oslo_service.service [None req-c282a20c-719d-4b58-81e3-ad3fe1a9c2f4 - - - - - -] os_brick.wait_mpath_device_attempts = 4 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:44:57 localhost nova_compute[286344]: 2025-12-15 09:44:57.921 286348 DEBUG oslo_service.service [None req-c282a20c-719d-4b58-81e3-ad3fe1a9c2f4 - - - - - -] os_brick.wait_mpath_device_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:44:57 localhost nova_compute[286344]: 2025-12-15 09:44:57.921 286348 DEBUG oslo_service.service [None req-c282a20c-719d-4b58-81e3-ad3fe1a9c2f4 - - - - - -] privsep_osbrick.capabilities = [21] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:44:57 localhost nova_compute[286344]: 2025-12-15 09:44:57.921 286348 DEBUG oslo_service.service [None req-c282a20c-719d-4b58-81e3-ad3fe1a9c2f4 - - - - - -] privsep_osbrick.group = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:44:57 localhost nova_compute[286344]: 2025-12-15 09:44:57.921 286348 DEBUG oslo_service.service [None req-c282a20c-719d-4b58-81e3-ad3fe1a9c2f4 - - - - - -] privsep_osbrick.helper_command = None log_opt_values 
/usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:44:57 localhost nova_compute[286344]: 2025-12-15 09:44:57.921 286348 DEBUG oslo_service.service [None req-c282a20c-719d-4b58-81e3-ad3fe1a9c2f4 - - - - - -] privsep_osbrick.logger_name = os_brick.privileged log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:44:57 localhost nova_compute[286344]: 2025-12-15 09:44:57.922 286348 DEBUG oslo_service.service [None req-c282a20c-719d-4b58-81e3-ad3fe1a9c2f4 - - - - - -] privsep_osbrick.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:44:57 localhost nova_compute[286344]: 2025-12-15 09:44:57.922 286348 DEBUG oslo_service.service [None req-c282a20c-719d-4b58-81e3-ad3fe1a9c2f4 - - - - - -] privsep_osbrick.user = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:44:57 localhost nova_compute[286344]: 2025-12-15 09:44:57.922 286348 DEBUG oslo_service.service [None req-c282a20c-719d-4b58-81e3-ad3fe1a9c2f4 - - - - - -] nova_sys_admin.capabilities = [0, 1, 2, 3, 12, 21] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:44:57 localhost nova_compute[286344]: 2025-12-15 09:44:57.922 286348 DEBUG oslo_service.service [None req-c282a20c-719d-4b58-81e3-ad3fe1a9c2f4 - - - - - -] nova_sys_admin.group = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:44:57 localhost nova_compute[286344]: 2025-12-15 09:44:57.922 286348 DEBUG oslo_service.service [None req-c282a20c-719d-4b58-81e3-ad3fe1a9c2f4 - - - - - -] nova_sys_admin.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:44:57 localhost nova_compute[286344]: 2025-12-15 09:44:57.922 286348 DEBUG oslo_service.service [None req-c282a20c-719d-4b58-81e3-ad3fe1a9c2f4 - - - - - -] nova_sys_admin.logger_name = oslo_privsep.daemon 
log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:44:57 localhost nova_compute[286344]: 2025-12-15 09:44:57.922 286348 DEBUG oslo_service.service [None req-c282a20c-719d-4b58-81e3-ad3fe1a9c2f4 - - - - - -] nova_sys_admin.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:44:57 localhost nova_compute[286344]: 2025-12-15 09:44:57.922 286348 DEBUG oslo_service.service [None req-c282a20c-719d-4b58-81e3-ad3fe1a9c2f4 - - - - - -] nova_sys_admin.user = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m Dec 15 04:44:57 localhost nova_compute[286344]: 2025-12-15 09:44:57.923 286348 DEBUG oslo_service.service [None req-c282a20c-719d-4b58-81e3-ad3fe1a9c2f4 - - - - - -] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613#033[00m Dec 15 04:44:57 localhost nova_compute[286344]: 2025-12-15 09:44:57.923 286348 INFO nova.service [-] Starting compute node (version 27.5.2-0.20250829104910.6f8decf.el9)#033[00m Dec 15 04:44:57 localhost nova_compute[286344]: 2025-12-15 09:44:57.936 286348 INFO nova.virt.node [None req-ed090d16-496a-473a-be36-97cf3cf6502a - - - - - -] Determined node identity 26c8956b-6742-4951-b566-971b9bbe323b from /var/lib/nova/compute_id#033[00m Dec 15 04:44:57 localhost nova_compute[286344]: 2025-12-15 09:44:57.936 286348 DEBUG nova.virt.libvirt.host [None req-ed090d16-496a-473a-be36-97cf3cf6502a - - - - - -] Starting native event thread _init_events /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:492#033[00m Dec 15 04:44:57 localhost nova_compute[286344]: 2025-12-15 09:44:57.937 286348 DEBUG nova.virt.libvirt.host [None req-ed090d16-496a-473a-be36-97cf3cf6502a - - - - - -] Starting green dispatch thread _init_events /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:498#033[00m Dec 15 04:44:57 localhost 
nova_compute[286344]: 2025-12-15 09:44:57.937 286348 DEBUG nova.virt.libvirt.host [None req-ed090d16-496a-473a-be36-97cf3cf6502a - - - - - -] Starting connection event dispatch thread initialize /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:620#033[00m Dec 15 04:44:57 localhost nova_compute[286344]: 2025-12-15 09:44:57.937 286348 DEBUG nova.virt.libvirt.host [None req-ed090d16-496a-473a-be36-97cf3cf6502a - - - - - -] Connecting to libvirt: qemu:///system _get_new_connection /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:503#033[00m Dec 15 04:44:57 localhost nova_compute[286344]: 2025-12-15 09:44:57.948 286348 DEBUG nova.virt.libvirt.host [None req-ed090d16-496a-473a-be36-97cf3cf6502a - - - - - -] Registering for lifecycle events _get_new_connection /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:509#033[00m Dec 15 04:44:57 localhost nova_compute[286344]: 2025-12-15 09:44:57.950 286348 DEBUG nova.virt.libvirt.host [None req-ed090d16-496a-473a-be36-97cf3cf6502a - - - - - -] Registering for connection events: _get_new_connection /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:530#033[00m Dec 15 04:44:57 localhost nova_compute[286344]: 2025-12-15 09:44:57.950 286348 INFO nova.virt.libvirt.driver [None req-ed090d16-496a-473a-be36-97cf3cf6502a - - - - - -] Connection event '1' reason 'None'#033[00m Dec 15 04:44:57 localhost nova_compute[286344]: 2025-12-15 09:44:57.957 286348 INFO nova.virt.libvirt.host [None req-ed090d16-496a-473a-be36-97cf3cf6502a - - - - - -] Libvirt host capabilities
[libvirt host capabilities XML elided: the markup was stripped in this capture, leaving only bare values. Recoverable values: host UUID 12c7b589-8d2b-44b6-80e1-1f4b0f34f69b; arch x86_64; CPU model EPYC-Rome-v4, vendor AMD; live-migration transports tcp and rdma; NUMA cell memory 16116604 KiB with page counts 4029151, 0, 0; secmodels selinux (doi 0, base labels system_u:system_r:svirt_t:s0 and system_u:system_r:svirt_tcg_t:s0) and dac (doi 0, labels +107:+107); hvm guests at wordsize 32 and 64 via emulator /usr/libexec/qemu-kvm, machine types pc-i440fx-rhel7.6.0 (alias pc) and pc-q35-rhel7.6.0 through pc-q35-rhel9.8.0 (alias q35, including rhel8.0.0-8.6.0, rhel9.0.0, 9.2.0, 9.4.0, 9.6.0)]#033[00m Dec 15 04:44:57 localhost nova_compute[286344]: 2025-12-15 09:44:57.962 286348 DEBUG nova.virt.libvirt.volume.mount [None req-ed090d16-496a-473a-be36-97cf3cf6502a - - - - - -] Initialising _HostMountState generation 0 host_up /usr/lib/python3.9/site-packages/nova/virt/libvirt/volume/mount.py:130#033[00m Dec 15 04:44:57 localhost nova_compute[286344]: 2025-12-15 09:44:57.965 286348 DEBUG nova.virt.libvirt.host [None req-ed090d16-496a-473a-be36-97cf3cf6502a - - - - - -] Getting domain capabilities for i686 via machine types: {'q35', 'pc'} _get_machine_types /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:952#033[00m Dec 15 04:44:57 localhost nova_compute[286344]: 2025-12-15 09:44:57.970 286348 DEBUG nova.virt.libvirt.host [None req-ed090d16-496a-473a-be36-97cf3cf6502a - - - - - -] Libvirt host hypervisor capabilities for arch=i686 and machine_type=q35:
[libvirt domain capabilities XML elided: the markup was stripped in this capture, leaving only bare values. Recoverable values: emulator /usr/libexec/qemu-kvm; domain type kvm; machine pc-q35-rhel9.8.0; arch i686; firmware loader /usr/share/OVMF/OVMF_CODE.secboot.fd with loader types rom and pflash (readonly yes/no, secure no); host-model CPU EPYC-Rome, vendor AMD; named CPU models include 486, 486-v1, Broadwell, Broadwell-IBRS, Broadwell-noTSX, Broadwell-noTSX-IBRS, Broadwell-v1 through Broadwell-v4, Cascadelake-Server, Cascadelake-Server-noTSX, Cascadelake-Server-v1 through Cascadelake-Server-v5, Conroe, Conroe-v1, Cooperlake; the model list is truncated here and the capture continues past this point]
nova_compute[286344]: Cooperlake-v1 Dec 15 04:44:57 localhost nova_compute[286344]: Dec 15 04:44:57 localhost nova_compute[286344]: Dec 15 04:44:57 localhost nova_compute[286344]: Dec 15 04:44:57 localhost nova_compute[286344]: Dec 15 04:44:57 localhost nova_compute[286344]: Dec 15 04:44:57 localhost nova_compute[286344]: Dec 15 04:44:57 localhost nova_compute[286344]: Dec 15 04:44:57 localhost nova_compute[286344]: Dec 15 04:44:57 localhost nova_compute[286344]: Dec 15 04:44:57 localhost nova_compute[286344]: Dec 15 04:44:57 localhost nova_compute[286344]: Dec 15 04:44:57 localhost nova_compute[286344]: Dec 15 04:44:57 localhost nova_compute[286344]: Dec 15 04:44:57 localhost nova_compute[286344]: Dec 15 04:44:57 localhost nova_compute[286344]: Dec 15 04:44:57 localhost nova_compute[286344]: Dec 15 04:44:57 localhost nova_compute[286344]: Dec 15 04:44:57 localhost nova_compute[286344]: Cooperlake-v2 Dec 15 04:44:57 localhost nova_compute[286344]: Dec 15 04:44:57 localhost nova_compute[286344]: Dec 15 04:44:57 localhost nova_compute[286344]: Dec 15 04:44:57 localhost nova_compute[286344]: Dec 15 04:44:57 localhost nova_compute[286344]: Dec 15 04:44:57 localhost nova_compute[286344]: Dec 15 04:44:57 localhost nova_compute[286344]: Dec 15 04:44:57 localhost nova_compute[286344]: Dec 15 04:44:57 localhost nova_compute[286344]: Dec 15 04:44:57 localhost nova_compute[286344]: Dec 15 04:44:57 localhost nova_compute[286344]: Dec 15 04:44:57 localhost nova_compute[286344]: Dec 15 04:44:57 localhost nova_compute[286344]: Dec 15 04:44:57 localhost nova_compute[286344]: Dec 15 04:44:57 localhost nova_compute[286344]: Dec 15 04:44:57 localhost nova_compute[286344]: Dec 15 04:44:57 localhost nova_compute[286344]: Dec 15 04:44:57 localhost nova_compute[286344]: Dec 15 04:44:57 localhost nova_compute[286344]: Denverton Dec 15 04:44:57 localhost nova_compute[286344]: Dec 15 04:44:57 localhost nova_compute[286344]: Dec 15 04:44:57 localhost nova_compute[286344]: Dec 15 04:44:57 
localhost nova_compute[286344]: Dec 15 04:44:57 localhost nova_compute[286344]: Denverton-v1 Dec 15 04:44:57 localhost nova_compute[286344]: Dec 15 04:44:57 localhost nova_compute[286344]: Dec 15 04:44:57 localhost nova_compute[286344]: Dec 15 04:44:57 localhost nova_compute[286344]: Dec 15 04:44:57 localhost nova_compute[286344]: Denverton-v2 Dec 15 04:44:57 localhost nova_compute[286344]: Dec 15 04:44:57 localhost nova_compute[286344]: Dec 15 04:44:57 localhost nova_compute[286344]: Dec 15 04:44:57 localhost nova_compute[286344]: Denverton-v3 Dec 15 04:44:57 localhost nova_compute[286344]: Dec 15 04:44:57 localhost nova_compute[286344]: Dec 15 04:44:57 localhost nova_compute[286344]: Dec 15 04:44:57 localhost nova_compute[286344]: Dec 15 04:44:57 localhost nova_compute[286344]: Dhyana Dec 15 04:44:57 localhost nova_compute[286344]: Dhyana-v1 Dec 15 04:44:57 localhost nova_compute[286344]: Dhyana-v2 Dec 15 04:44:57 localhost nova_compute[286344]: Dec 15 04:44:57 localhost nova_compute[286344]: Dec 15 04:44:57 localhost nova_compute[286344]: Dec 15 04:44:57 localhost nova_compute[286344]: EPYC Dec 15 04:44:57 localhost nova_compute[286344]: EPYC-Genoa Dec 15 04:44:57 localhost nova_compute[286344]: Dec 15 04:44:57 localhost nova_compute[286344]: Dec 15 04:44:57 localhost nova_compute[286344]: Dec 15 04:44:57 localhost nova_compute[286344]: Dec 15 04:44:57 localhost nova_compute[286344]: Dec 15 04:44:57 localhost nova_compute[286344]: Dec 15 04:44:57 localhost nova_compute[286344]: Dec 15 04:44:57 localhost nova_compute[286344]: Dec 15 04:44:57 localhost nova_compute[286344]: Dec 15 04:44:57 localhost nova_compute[286344]: Dec 15 04:44:57 localhost nova_compute[286344]: Dec 15 04:44:57 localhost nova_compute[286344]: Dec 15 04:44:57 localhost nova_compute[286344]: Dec 15 04:44:57 localhost nova_compute[286344]: Dec 15 04:44:57 localhost nova_compute[286344]: Dec 15 04:44:57 localhost nova_compute[286344]: Dec 15 04:44:57 localhost nova_compute[286344]: Dec 15 
04:44:57 localhost nova_compute[286344]: Dec 15 04:44:57 localhost nova_compute[286344]: Dec 15 04:44:57 localhost nova_compute[286344]: Dec 15 04:44:57 localhost nova_compute[286344]: Dec 15 04:44:57 localhost nova_compute[286344]: Dec 15 04:44:57 localhost nova_compute[286344]: Dec 15 04:44:57 localhost nova_compute[286344]: Dec 15 04:44:57 localhost nova_compute[286344]: Dec 15 04:44:57 localhost nova_compute[286344]: Dec 15 04:44:57 localhost nova_compute[286344]: Dec 15 04:44:57 localhost nova_compute[286344]: Dec 15 04:44:57 localhost nova_compute[286344]: Dec 15 04:44:57 localhost nova_compute[286344]: EPYC-Genoa-v1 Dec 15 04:44:57 localhost nova_compute[286344]: Dec 15 04:44:57 localhost nova_compute[286344]: Dec 15 04:44:57 localhost nova_compute[286344]: Dec 15 04:44:57 localhost nova_compute[286344]: Dec 15 04:44:57 localhost nova_compute[286344]: Dec 15 04:44:57 localhost nova_compute[286344]: Dec 15 04:44:57 localhost nova_compute[286344]: Dec 15 04:44:57 localhost nova_compute[286344]: Dec 15 04:44:57 localhost nova_compute[286344]: Dec 15 04:44:57 localhost nova_compute[286344]: Dec 15 04:44:57 localhost nova_compute[286344]: Dec 15 04:44:57 localhost nova_compute[286344]: Dec 15 04:44:57 localhost nova_compute[286344]: Dec 15 04:44:57 localhost nova_compute[286344]: Dec 15 04:44:57 localhost nova_compute[286344]: Dec 15 04:44:57 localhost nova_compute[286344]: Dec 15 04:44:57 localhost nova_compute[286344]: Dec 15 04:44:57 localhost nova_compute[286344]: Dec 15 04:44:57 localhost nova_compute[286344]: Dec 15 04:44:57 localhost nova_compute[286344]: Dec 15 04:44:57 localhost nova_compute[286344]: Dec 15 04:44:57 localhost nova_compute[286344]: Dec 15 04:44:57 localhost nova_compute[286344]: Dec 15 04:44:57 localhost nova_compute[286344]: Dec 15 04:44:57 localhost nova_compute[286344]: Dec 15 04:44:57 localhost nova_compute[286344]: Dec 15 04:44:57 localhost nova_compute[286344]: Dec 15 04:44:57 localhost nova_compute[286344]: Dec 15 04:44:57 
localhost nova_compute[286344]: Dec 15 04:44:57 localhost nova_compute[286344]: EPYC-IBPB Dec 15 04:44:57 localhost nova_compute[286344]: EPYC-Milan Dec 15 04:44:57 localhost nova_compute[286344]: Dec 15 04:44:57 localhost nova_compute[286344]: Dec 15 04:44:57 localhost nova_compute[286344]: Dec 15 04:44:57 localhost nova_compute[286344]: Dec 15 04:44:57 localhost nova_compute[286344]: Dec 15 04:44:57 localhost nova_compute[286344]: Dec 15 04:44:57 localhost nova_compute[286344]: Dec 15 04:44:57 localhost nova_compute[286344]: Dec 15 04:44:57 localhost nova_compute[286344]: EPYC-Milan-v1 Dec 15 04:44:57 localhost nova_compute[286344]: Dec 15 04:44:57 localhost nova_compute[286344]: Dec 15 04:44:57 localhost nova_compute[286344]: Dec 15 04:44:57 localhost nova_compute[286344]: Dec 15 04:44:57 localhost nova_compute[286344]: Dec 15 04:44:57 localhost nova_compute[286344]: Dec 15 04:44:57 localhost nova_compute[286344]: Dec 15 04:44:57 localhost nova_compute[286344]: Dec 15 04:44:57 localhost nova_compute[286344]: EPYC-Milan-v2 Dec 15 04:44:57 localhost nova_compute[286344]: Dec 15 04:44:57 localhost nova_compute[286344]: Dec 15 04:44:57 localhost nova_compute[286344]: Dec 15 04:44:57 localhost nova_compute[286344]: Dec 15 04:44:57 localhost nova_compute[286344]: Dec 15 04:44:57 localhost nova_compute[286344]: Dec 15 04:44:57 localhost nova_compute[286344]: Dec 15 04:44:57 localhost nova_compute[286344]: Dec 15 04:44:57 localhost nova_compute[286344]: Dec 15 04:44:57 localhost nova_compute[286344]: Dec 15 04:44:57 localhost nova_compute[286344]: Dec 15 04:44:57 localhost nova_compute[286344]: Dec 15 04:44:57 localhost nova_compute[286344]: Dec 15 04:44:57 localhost nova_compute[286344]: Dec 15 04:44:57 localhost nova_compute[286344]: EPYC-Rome Dec 15 04:44:57 localhost nova_compute[286344]: Dec 15 04:44:57 localhost nova_compute[286344]: Dec 15 04:44:57 localhost nova_compute[286344]: Dec 15 04:44:57 localhost nova_compute[286344]: EPYC-Rome-v1 Dec 15 04:44:57 
localhost nova_compute[286344]: Dec 15 04:44:57 localhost nova_compute[286344]: Dec 15 04:44:57 localhost nova_compute[286344]: Dec 15 04:44:57 localhost nova_compute[286344]: EPYC-Rome-v2 Dec 15 04:44:57 localhost nova_compute[286344]: Dec 15 04:44:57 localhost nova_compute[286344]: Dec 15 04:44:57 localhost nova_compute[286344]: Dec 15 04:44:57 localhost nova_compute[286344]: EPYC-Rome-v3 Dec 15 04:44:57 localhost nova_compute[286344]: Dec 15 04:44:57 localhost nova_compute[286344]: Dec 15 04:44:57 localhost nova_compute[286344]: Dec 15 04:44:57 localhost nova_compute[286344]: EPYC-Rome-v4 Dec 15 04:44:57 localhost nova_compute[286344]: EPYC-v1 Dec 15 04:44:57 localhost nova_compute[286344]: EPYC-v2 Dec 15 04:44:57 localhost nova_compute[286344]: EPYC-v3 Dec 15 04:44:57 localhost nova_compute[286344]: Dec 15 04:44:57 localhost nova_compute[286344]: Dec 15 04:44:57 localhost nova_compute[286344]: Dec 15 04:44:57 localhost nova_compute[286344]: EPYC-v4 Dec 15 04:44:57 localhost nova_compute[286344]: Dec 15 04:44:57 localhost nova_compute[286344]: Dec 15 04:44:57 localhost nova_compute[286344]: Dec 15 04:44:57 localhost nova_compute[286344]: GraniteRapids Dec 15 04:44:57 localhost nova_compute[286344]: Dec 15 04:44:57 localhost nova_compute[286344]: Dec 15 04:44:57 localhost nova_compute[286344]: Dec 15 04:44:57 localhost nova_compute[286344]: Dec 15 04:44:57 localhost nova_compute[286344]: Dec 15 04:44:57 localhost nova_compute[286344]: Dec 15 04:44:57 localhost nova_compute[286344]: Dec 15 04:44:57 localhost nova_compute[286344]: Dec 15 04:44:57 localhost nova_compute[286344]: Dec 15 04:44:57 localhost nova_compute[286344]: Dec 15 04:44:57 localhost nova_compute[286344]: Dec 15 04:44:57 localhost nova_compute[286344]: Dec 15 04:44:57 localhost nova_compute[286344]: Dec 15 04:44:57 localhost nova_compute[286344]: Dec 15 04:44:57 localhost nova_compute[286344]: Dec 15 04:44:57 localhost nova_compute[286344]: Dec 15 04:44:57 localhost nova_compute[286344]: Dec 15 
04:44:57 localhost nova_compute[286344]: Dec 15 04:44:57 localhost nova_compute[286344]: Dec 15 04:44:57 localhost nova_compute[286344]: Dec 15 04:44:57 localhost nova_compute[286344]: Dec 15 04:44:57 localhost nova_compute[286344]: Dec 15 04:44:57 localhost nova_compute[286344]: Dec 15 04:44:57 localhost nova_compute[286344]: Dec 15 04:44:57 localhost nova_compute[286344]: Dec 15 04:44:57 localhost nova_compute[286344]: Dec 15 04:44:57 localhost nova_compute[286344]: Dec 15 04:44:57 localhost nova_compute[286344]: Dec 15 04:44:57 localhost nova_compute[286344]: Dec 15 04:44:57 localhost nova_compute[286344]: Dec 15 04:44:57 localhost nova_compute[286344]: Dec 15 04:44:57 localhost nova_compute[286344]: Dec 15 04:44:57 localhost nova_compute[286344]: Dec 15 04:44:57 localhost nova_compute[286344]: Dec 15 04:44:57 localhost nova_compute[286344]: Dec 15 04:44:57 localhost nova_compute[286344]: Dec 15 04:44:57 localhost nova_compute[286344]: Dec 15 04:44:57 localhost nova_compute[286344]: Dec 15 04:44:57 localhost nova_compute[286344]: Dec 15 04:44:57 localhost nova_compute[286344]: Dec 15 04:44:57 localhost nova_compute[286344]: Dec 15 04:44:57 localhost nova_compute[286344]: Dec 15 04:44:57 localhost nova_compute[286344]: Dec 15 04:44:57 localhost nova_compute[286344]: Dec 15 04:44:57 localhost nova_compute[286344]: Dec 15 04:44:57 localhost nova_compute[286344]: Dec 15 04:44:57 localhost nova_compute[286344]: Dec 15 04:44:57 localhost nova_compute[286344]: GraniteRapids-v1 Dec 15 04:44:57 localhost nova_compute[286344]: Dec 15 04:44:57 localhost nova_compute[286344]: Dec 15 04:44:57 localhost nova_compute[286344]: Dec 15 04:44:57 localhost nova_compute[286344]: Dec 15 04:44:57 localhost nova_compute[286344]: Dec 15 04:44:57 localhost nova_compute[286344]: Dec 15 04:44:57 localhost nova_compute[286344]: Dec 15 04:44:57 localhost nova_compute[286344]: Dec 15 04:44:57 localhost nova_compute[286344]: Dec 15 04:44:57 localhost nova_compute[286344]: Dec 15 04:44:57 
localhost nova_compute[286344]: Dec 15 04:44:57 localhost nova_compute[286344]: Dec 15 04:44:57 localhost nova_compute[286344]: Dec 15 04:44:57 localhost nova_compute[286344]: Dec 15 04:44:57 localhost nova_compute[286344]: Dec 15 04:44:57 localhost nova_compute[286344]: Dec 15 04:44:57 localhost nova_compute[286344]: Dec 15 04:44:57 localhost nova_compute[286344]: Dec 15 04:44:57 localhost nova_compute[286344]: Dec 15 04:44:57 localhost nova_compute[286344]: Dec 15 04:44:57 localhost nova_compute[286344]: Dec 15 04:44:57 localhost nova_compute[286344]: Dec 15 04:44:57 localhost nova_compute[286344]: Dec 15 04:44:57 localhost nova_compute[286344]: Dec 15 04:44:57 localhost nova_compute[286344]: Dec 15 04:44:57 localhost nova_compute[286344]: Dec 15 04:44:57 localhost nova_compute[286344]: Dec 15 04:44:57 localhost nova_compute[286344]: Dec 15 04:44:57 localhost nova_compute[286344]: Dec 15 04:44:57 localhost nova_compute[286344]: Dec 15 04:44:57 localhost nova_compute[286344]: Dec 15 04:44:57 localhost nova_compute[286344]: Dec 15 04:44:57 localhost nova_compute[286344]: Dec 15 04:44:57 localhost nova_compute[286344]: Dec 15 04:44:57 localhost nova_compute[286344]: Dec 15 04:44:57 localhost nova_compute[286344]: Dec 15 04:44:57 localhost nova_compute[286344]: Dec 15 04:44:57 localhost nova_compute[286344]: Dec 15 04:44:57 localhost nova_compute[286344]: Dec 15 04:44:57 localhost nova_compute[286344]: Dec 15 04:44:57 localhost nova_compute[286344]: Dec 15 04:44:57 localhost nova_compute[286344]: Dec 15 04:44:57 localhost nova_compute[286344]: Dec 15 04:44:57 localhost nova_compute[286344]: Dec 15 04:44:57 localhost nova_compute[286344]: Dec 15 04:44:57 localhost nova_compute[286344]: Dec 15 04:44:57 localhost nova_compute[286344]: Dec 15 04:44:57 localhost nova_compute[286344]: GraniteRapids-v2 Dec 15 04:44:57 localhost nova_compute[286344]: Dec 15 04:44:57 localhost nova_compute[286344]: Dec 15 04:44:57 localhost nova_compute[286344]: Dec 15 04:44:57 localhost 
nova_compute[286344]: Dec 15 04:44:57 localhost nova_compute[286344]: Dec 15 04:44:57 localhost nova_compute[286344]: Dec 15 04:44:57 localhost nova_compute[286344]: Dec 15 04:44:57 localhost nova_compute[286344]: Dec 15 04:44:57 localhost nova_compute[286344]: Dec 15 04:44:57 localhost nova_compute[286344]: Dec 15 04:44:57 localhost nova_compute[286344]: Dec 15 04:44:57 localhost nova_compute[286344]: Dec 15 04:44:57 localhost nova_compute[286344]: Dec 15 04:44:57 localhost nova_compute[286344]: Dec 15 04:44:57 localhost nova_compute[286344]: Dec 15 04:44:57 localhost nova_compute[286344]: Dec 15 04:44:57 localhost nova_compute[286344]: Dec 15 04:44:57 localhost nova_compute[286344]: Dec 15 04:44:57 localhost nova_compute[286344]: Dec 15 04:44:57 localhost nova_compute[286344]: Dec 15 04:44:57 localhost nova_compute[286344]: Dec 15 04:44:57 localhost nova_compute[286344]: Dec 15 04:44:57 localhost nova_compute[286344]: Dec 15 04:44:57 localhost nova_compute[286344]: Dec 15 04:44:57 localhost nova_compute[286344]: Dec 15 04:44:57 localhost nova_compute[286344]: Dec 15 04:44:57 localhost nova_compute[286344]: Dec 15 04:44:57 localhost nova_compute[286344]: Dec 15 04:44:57 localhost nova_compute[286344]: Dec 15 04:44:57 localhost nova_compute[286344]: Dec 15 04:44:57 localhost nova_compute[286344]: Dec 15 04:44:57 localhost nova_compute[286344]: Dec 15 04:44:57 localhost nova_compute[286344]: Dec 15 04:44:57 localhost nova_compute[286344]: Dec 15 04:44:57 localhost nova_compute[286344]: Dec 15 04:44:57 localhost nova_compute[286344]: Dec 15 04:44:57 localhost nova_compute[286344]: Dec 15 04:44:57 localhost nova_compute[286344]: Dec 15 04:44:57 localhost nova_compute[286344]: Dec 15 04:44:57 localhost nova_compute[286344]: Dec 15 04:44:57 localhost nova_compute[286344]: Dec 15 04:44:57 localhost nova_compute[286344]: Dec 15 04:44:57 localhost nova_compute[286344]: Dec 15 04:44:57 localhost nova_compute[286344]: Dec 15 04:44:57 localhost nova_compute[286344]: Dec 15 
04:44:57 localhost nova_compute[286344]: Dec 15 04:44:57 localhost nova_compute[286344]: Dec 15 04:44:57 localhost nova_compute[286344]: Dec 15 04:44:57 localhost nova_compute[286344]: Dec 15 04:44:57 localhost nova_compute[286344]: Dec 15 04:44:57 localhost nova_compute[286344]: Dec 15 04:44:57 localhost nova_compute[286344]: Dec 15 04:44:57 localhost nova_compute[286344]: Dec 15 04:44:57 localhost nova_compute[286344]: Dec 15 04:44:57 localhost nova_compute[286344]: Dec 15 04:44:57 localhost nova_compute[286344]: Haswell Dec 15 04:44:57 localhost nova_compute[286344]: Dec 15 04:44:57 localhost nova_compute[286344]: Dec 15 04:44:57 localhost nova_compute[286344]: Dec 15 04:44:57 localhost nova_compute[286344]: Dec 15 04:44:57 localhost nova_compute[286344]: Dec 15 04:44:57 localhost nova_compute[286344]: Dec 15 04:44:57 localhost nova_compute[286344]: Dec 15 04:44:57 localhost nova_compute[286344]: Haswell-IBRS Dec 15 04:44:57 localhost nova_compute[286344]: Dec 15 04:44:57 localhost nova_compute[286344]: Dec 15 04:44:57 localhost nova_compute[286344]: Dec 15 04:44:57 localhost nova_compute[286344]: Dec 15 04:44:57 localhost nova_compute[286344]: Dec 15 04:44:57 localhost nova_compute[286344]: Dec 15 04:44:57 localhost nova_compute[286344]: Dec 15 04:44:57 localhost nova_compute[286344]: Haswell-noTSX Dec 15 04:44:57 localhost nova_compute[286344]: Dec 15 04:44:57 localhost nova_compute[286344]: Dec 15 04:44:57 localhost nova_compute[286344]: Dec 15 04:44:57 localhost nova_compute[286344]: Dec 15 04:44:57 localhost nova_compute[286344]: Dec 15 04:44:57 localhost nova_compute[286344]: Haswell-noTSX-IBRS Dec 15 04:44:57 localhost nova_compute[286344]: Dec 15 04:44:57 localhost nova_compute[286344]: Dec 15 04:44:57 localhost nova_compute[286344]: Dec 15 04:44:57 localhost nova_compute[286344]: Dec 15 04:44:57 localhost nova_compute[286344]: Dec 15 04:44:57 localhost nova_compute[286344]: Haswell-v1 Dec 15 04:44:57 localhost nova_compute[286344]: Dec 15 04:44:57 
localhost nova_compute[286344]: Dec 15 04:44:57 localhost nova_compute[286344]: Dec 15 04:44:57 localhost nova_compute[286344]: Dec 15 04:44:57 localhost nova_compute[286344]: Dec 15 04:44:57 localhost nova_compute[286344]: Dec 15 04:44:57 localhost nova_compute[286344]: Dec 15 04:44:57 localhost nova_compute[286344]: Haswell-v2 Dec 15 04:44:57 localhost nova_compute[286344]: Dec 15 04:44:57 localhost nova_compute[286344]: Dec 15 04:44:57 localhost nova_compute[286344]: Dec 15 04:44:57 localhost nova_compute[286344]: Dec 15 04:44:57 localhost nova_compute[286344]: Dec 15 04:44:57 localhost nova_compute[286344]: Haswell-v3 Dec 15 04:44:57 localhost nova_compute[286344]: Dec 15 04:44:57 localhost nova_compute[286344]: Dec 15 04:44:57 localhost nova_compute[286344]: Dec 15 04:44:57 localhost nova_compute[286344]: Dec 15 04:44:57 localhost nova_compute[286344]: Dec 15 04:44:57 localhost nova_compute[286344]: Dec 15 04:44:57 localhost nova_compute[286344]: Dec 15 04:44:57 localhost nova_compute[286344]: Haswell-v4 Dec 15 04:44:57 localhost nova_compute[286344]: Dec 15 04:44:57 localhost nova_compute[286344]: Dec 15 04:44:57 localhost nova_compute[286344]: Dec 15 04:44:57 localhost nova_compute[286344]: Dec 15 04:44:57 localhost nova_compute[286344]: Dec 15 04:44:57 localhost nova_compute[286344]: Icelake-Server Dec 15 04:44:57 localhost nova_compute[286344]: Dec 15 04:44:57 localhost nova_compute[286344]: Dec 15 04:44:57 localhost nova_compute[286344]: Dec 15 04:44:57 localhost nova_compute[286344]: Dec 15 04:44:57 localhost nova_compute[286344]: Dec 15 04:44:57 localhost nova_compute[286344]: Dec 15 04:44:57 localhost nova_compute[286344]: Dec 15 04:44:57 localhost nova_compute[286344]: Dec 15 04:44:57 localhost nova_compute[286344]: Dec 15 04:44:57 localhost nova_compute[286344]: Dec 15 04:44:57 localhost nova_compute[286344]: Dec 15 04:44:57 localhost nova_compute[286344]: Dec 15 04:44:57 localhost nova_compute[286344]: Dec 15 04:44:57 localhost nova_compute[286344]: 
Dec 15 04:44:57 localhost nova_compute[286344]: Dec 15 04:44:57 localhost nova_compute[286344]: Dec 15 04:44:57 localhost nova_compute[286344]: Dec 15 04:44:57 localhost nova_compute[286344]: Dec 15 04:44:57 localhost nova_compute[286344]: Dec 15 04:44:57 localhost nova_compute[286344]: Dec 15 04:44:57 localhost nova_compute[286344]: Dec 15 04:44:57 localhost nova_compute[286344]: Dec 15 04:44:57 localhost nova_compute[286344]: Icelake-Server-noTSX Dec 15 04:44:57 localhost nova_compute[286344]: Dec 15 04:44:57 localhost nova_compute[286344]: Dec 15 04:44:57 localhost nova_compute[286344]: Dec 15 04:44:57 localhost nova_compute[286344]: Dec 15 04:44:57 localhost nova_compute[286344]: Dec 15 04:44:57 localhost nova_compute[286344]: Dec 15 04:44:57 localhost nova_compute[286344]: Dec 15 04:44:57 localhost nova_compute[286344]: Dec 15 04:44:57 localhost nova_compute[286344]: Dec 15 04:44:57 localhost nova_compute[286344]: Dec 15 04:44:57 localhost nova_compute[286344]: Dec 15 04:44:57 localhost nova_compute[286344]: Dec 15 04:44:57 localhost nova_compute[286344]: Dec 15 04:44:57 localhost nova_compute[286344]: Dec 15 04:44:57 localhost nova_compute[286344]: Dec 15 04:44:57 localhost nova_compute[286344]: Dec 15 04:44:57 localhost nova_compute[286344]: Dec 15 04:44:57 localhost nova_compute[286344]: Dec 15 04:44:57 localhost nova_compute[286344]: Dec 15 04:44:57 localhost nova_compute[286344]: Dec 15 04:44:57 localhost nova_compute[286344]: Icelake-Server-v1 Dec 15 04:44:57 localhost nova_compute[286344]: Dec 15 04:44:57 localhost nova_compute[286344]: Dec 15 04:44:57 localhost nova_compute[286344]: Dec 15 04:44:57 localhost nova_compute[286344]: Dec 15 04:44:57 localhost nova_compute[286344]: Dec 15 04:44:57 localhost nova_compute[286344]: Dec 15 04:44:57 localhost nova_compute[286344]: Dec 15 04:44:57 localhost nova_compute[286344]: Dec 15 04:44:57 localhost nova_compute[286344]: Dec 15 04:44:57 localhost nova_compute[286344]: Dec 15 04:44:57 localhost 
nova_compute[286344]: Dec 15 04:44:57 localhost nova_compute[286344]: Dec 15 04:44:57 localhost nova_compute[286344]: Dec 15 04:44:57 localhost nova_compute[286344]: Dec 15 04:44:57 localhost nova_compute[286344]: Dec 15 04:44:57 localhost nova_compute[286344]: Dec 15 04:44:57 localhost nova_compute[286344]: Dec 15 04:44:57 localhost nova_compute[286344]: Dec 15 04:44:57 localhost nova_compute[286344]: Dec 15 04:44:57 localhost nova_compute[286344]: Dec 15 04:44:57 localhost nova_compute[286344]: Dec 15 04:44:57 localhost nova_compute[286344]: Dec 15 04:44:57 localhost nova_compute[286344]: Icelake-Server-v2 Dec 15 04:44:57 localhost nova_compute[286344]: Dec 15 04:44:57 localhost nova_compute[286344]: Dec 15 04:44:57 localhost nova_compute[286344]: Dec 15 04:44:57 localhost nova_compute[286344]: Dec 15 04:44:57 localhost nova_compute[286344]: Dec 15 04:44:57 localhost nova_compute[286344]: Dec 15 04:44:57 localhost nova_compute[286344]: Dec 15 04:44:57 localhost nova_compute[286344]: Dec 15 04:44:57 localhost nova_compute[286344]: Dec 15 04:44:57 localhost nova_compute[286344]: Dec 15 04:44:57 localhost nova_compute[286344]: Dec 15 04:44:57 localhost nova_compute[286344]: Dec 15 04:44:57 localhost nova_compute[286344]: Dec 15 04:44:57 localhost nova_compute[286344]: Dec 15 04:44:57 localhost nova_compute[286344]: Dec 15 04:44:57 localhost nova_compute[286344]: Dec 15 04:44:57 localhost nova_compute[286344]: Dec 15 04:44:57 localhost nova_compute[286344]: Dec 15 04:44:57 localhost nova_compute[286344]: Dec 15 04:44:57 localhost nova_compute[286344]: Dec 15 04:44:57 localhost nova_compute[286344]: Icelake-Server-v3 Dec 15 04:44:57 localhost nova_compute[286344]: Dec 15 04:44:57 localhost nova_compute[286344]: Dec 15 04:44:57 localhost nova_compute[286344]: Dec 15 04:44:57 localhost nova_compute[286344]: Dec 15 04:44:57 localhost nova_compute[286344]: Dec 15 04:44:57 localhost nova_compute[286344]: Dec 15 04:44:57 localhost nova_compute[286344]: Dec 15 04:44:57 
localhost nova_compute[286344]: Dec 15 04:44:57 localhost nova_compute[286344]: Dec 15 04:44:57 localhost nova_compute[286344]: Dec 15 04:44:57 localhost nova_compute[286344]: Dec 15 04:44:57 localhost nova_compute[286344]: Dec 15 04:44:57 localhost nova_compute[286344]: Dec 15 04:44:57 localhost nova_compute[286344]: Dec 15 04:44:57 localhost nova_compute[286344]: Dec 15 04:44:57 localhost nova_compute[286344]: Dec 15 04:44:57 localhost nova_compute[286344]: Dec 15 04:44:57 localhost nova_compute[286344]: Dec 15 04:44:57 localhost nova_compute[286344]: Dec 15 04:44:57 localhost nova_compute[286344]: Dec 15 04:44:57 localhost nova_compute[286344]: Dec 15 04:44:57 localhost nova_compute[286344]: Dec 15 04:44:57 localhost nova_compute[286344]: Icelake-Server-v4 Dec 15 04:44:57 localhost nova_compute[286344]: Dec 15 04:44:57 localhost nova_compute[286344]: Dec 15 04:44:57 localhost nova_compute[286344]: Dec 15 04:44:57 localhost nova_compute[286344]: Dec 15 04:44:57 localhost nova_compute[286344]: Dec 15 04:44:57 localhost nova_compute[286344]: Dec 15 04:44:57 localhost nova_compute[286344]: Dec 15 04:44:57 localhost nova_compute[286344]: Dec 15 04:44:57 localhost nova_compute[286344]: Dec 15 04:44:57 localhost nova_compute[286344]: Dec 15 04:44:57 localhost nova_compute[286344]: Dec 15 04:44:57 localhost nova_compute[286344]: Dec 15 04:44:57 localhost nova_compute[286344]: Dec 15 04:44:57 localhost nova_compute[286344]: Dec 15 04:44:57 localhost nova_compute[286344]: Dec 15 04:44:57 localhost nova_compute[286344]: Dec 15 04:44:57 localhost nova_compute[286344]: Dec 15 04:44:57 localhost nova_compute[286344]: Dec 15 04:44:57 localhost nova_compute[286344]: Dec 15 04:44:57 localhost nova_compute[286344]: Dec 15 04:44:57 localhost nova_compute[286344]: Dec 15 04:44:57 localhost nova_compute[286344]: Dec 15 04:44:57 localhost nova_compute[286344]: Dec 15 04:44:57 localhost nova_compute[286344]: Dec 15 04:44:57 localhost nova_compute[286344]: Icelake-Server-v5 Dec 15 
04:44:57 localhost nova_compute[286344]: Dec 15 04:44:57 localhost nova_compute[286344]: Dec 15 04:44:57 localhost nova_compute[286344]: Dec 15 04:44:57 localhost nova_compute[286344]: Dec 15 04:44:57 localhost nova_compute[286344]: Dec 15 04:44:57 localhost nova_compute[286344]: Dec 15 04:44:57 localhost nova_compute[286344]: Dec 15 04:44:57 localhost nova_compute[286344]: Dec 15 04:44:57 localhost nova_compute[286344]: Dec 15 04:44:57 localhost nova_compute[286344]: Dec 15 04:44:57 localhost nova_compute[286344]: Dec 15 04:44:57 localhost nova_compute[286344]: Dec 15 04:44:57 localhost nova_compute[286344]: Dec 15 04:44:57 localhost nova_compute[286344]: Dec 15 04:44:57 localhost nova_compute[286344]: Dec 15 04:44:57 localhost nova_compute[286344]: Dec 15 04:44:57 localhost nova_compute[286344]: Dec 15 04:44:57 localhost nova_compute[286344]: Dec 15 04:44:57 localhost nova_compute[286344]: Dec 15 04:44:57 localhost nova_compute[286344]: Dec 15 04:44:57 localhost nova_compute[286344]: Dec 15 04:44:57 localhost nova_compute[286344]: Dec 15 04:44:57 localhost nova_compute[286344]: Dec 15 04:44:57 localhost nova_compute[286344]: Dec 15 04:44:57 localhost nova_compute[286344]: Dec 15 04:44:57 localhost nova_compute[286344]: Icelake-Server-v6 Dec 15 04:44:57 localhost nova_compute[286344]: Dec 15 04:44:57 localhost nova_compute[286344]: Dec 15 04:44:57 localhost nova_compute[286344]: Dec 15 04:44:57 localhost nova_compute[286344]: Dec 15 04:44:57 localhost nova_compute[286344]: Dec 15 04:44:57 localhost nova_compute[286344]: Dec 15 04:44:57 localhost nova_compute[286344]: Dec 15 04:44:57 localhost nova_compute[286344]: Dec 15 04:44:57 localhost nova_compute[286344]: Dec 15 04:44:57 localhost nova_compute[286344]: Dec 15 04:44:57 localhost nova_compute[286344]: Dec 15 04:44:57 localhost nova_compute[286344]: Dec 15 04:44:57 localhost nova_compute[286344]: Dec 15 04:44:57 localhost nova_compute[286344]: Dec 15 04:44:57 localhost nova_compute[286344]: Dec 15 04:44:57 
localhost nova_compute[286344]: Dec 15 04:44:57 localhost nova_compute[286344]: Dec 15 04:44:57 localhost nova_compute[286344]: Dec 15 04:44:57 localhost nova_compute[286344]: Dec 15 04:44:57 localhost nova_compute[286344]: Dec 15 04:44:57 localhost nova_compute[286344]: Dec 15 04:44:57 localhost nova_compute[286344]: Dec 15 04:44:57 localhost nova_compute[286344]: Dec 15 04:44:57 localhost nova_compute[286344]: Dec 15 04:44:57 localhost nova_compute[286344]: Dec 15 04:44:57 localhost nova_compute[286344]: Icelake-Server-v7 Dec 15 04:44:57 localhost nova_compute[286344]: Dec 15 04:44:57 localhost nova_compute[286344]: Dec 15 04:44:57 localhost nova_compute[286344]: Dec 15 04:44:57 localhost nova_compute[286344]: Dec 15 04:44:57 localhost nova_compute[286344]: Dec 15 04:44:57 localhost nova_compute[286344]: Dec 15 04:44:57 localhost nova_compute[286344]: Dec 15 04:44:57 localhost nova_compute[286344]: Dec 15 04:44:57 localhost nova_compute[286344]: Dec 15 04:44:57 localhost nova_compute[286344]: Dec 15 04:44:57 localhost nova_compute[286344]: Dec 15 04:44:57 localhost nova_compute[286344]: Dec 15 04:44:57 localhost nova_compute[286344]: Dec 15 04:44:57 localhost nova_compute[286344]: Dec 15 04:44:57 localhost nova_compute[286344]: Dec 15 04:44:57 localhost nova_compute[286344]: Dec 15 04:44:57 localhost nova_compute[286344]: Dec 15 04:44:57 localhost nova_compute[286344]: Dec 15 04:44:57 localhost nova_compute[286344]: Dec 15 04:44:57 localhost nova_compute[286344]: Dec 15 04:44:57 localhost nova_compute[286344]: Dec 15 04:44:57 localhost nova_compute[286344]: Dec 15 04:44:57 localhost nova_compute[286344]: Dec 15 04:44:57 localhost nova_compute[286344]: Dec 15 04:44:57 localhost nova_compute[286344]: Dec 15 04:44:57 localhost nova_compute[286344]: Dec 15 04:44:57 localhost nova_compute[286344]: Dec 15 04:44:57 localhost nova_compute[286344]: IvyBridge Dec 15 04:44:57 localhost nova_compute[286344]: Dec 15 04:44:57 localhost nova_compute[286344]: Dec 15 04:44:57 
localhost nova_compute[286344]: Dec 15 04:44:57 localhost nova_compute[286344]: IvyBridge-IBRS Dec 15 04:44:57 localhost nova_compute[286344]: Dec 15 04:44:57 localhost nova_compute[286344]: Dec 15 04:44:57 localhost nova_compute[286344]: Dec 15 04:44:57 localhost nova_compute[286344]: IvyBridge-v1 Dec 15 04:44:57 localhost nova_compute[286344]: Dec 15 04:44:57 localhost nova_compute[286344]: Dec 15 04:44:57 localhost nova_compute[286344]: Dec 15 04:44:57 localhost nova_compute[286344]: IvyBridge-v2 Dec 15 04:44:57 localhost nova_compute[286344]: Dec 15 04:44:57 localhost nova_compute[286344]: Dec 15 04:44:57 localhost nova_compute[286344]: Dec 15 04:44:57 localhost nova_compute[286344]: KnightsMill Dec 15 04:44:57 localhost nova_compute[286344]: Dec 15 04:44:57 localhost nova_compute[286344]: Dec 15 04:44:57 localhost nova_compute[286344]: Dec 15 04:44:57 localhost nova_compute[286344]: Dec 15 04:44:57 localhost nova_compute[286344]: Dec 15 04:44:57 localhost nova_compute[286344]: Dec 15 04:44:57 localhost nova_compute[286344]: Dec 15 04:44:57 localhost nova_compute[286344]: Dec 15 04:44:57 localhost nova_compute[286344]: Dec 15 04:44:57 localhost nova_compute[286344]: Dec 15 04:44:57 localhost nova_compute[286344]: Dec 15 04:44:57 localhost nova_compute[286344]: KnightsMill-v1 Dec 15 04:44:57 localhost nova_compute[286344]: Dec 15 04:44:57 localhost nova_compute[286344]: Dec 15 04:44:57 localhost nova_compute[286344]: Dec 15 04:44:57 localhost nova_compute[286344]: Dec 15 04:44:57 localhost nova_compute[286344]: Dec 15 04:44:57 localhost nova_compute[286344]: Dec 15 04:44:57 localhost nova_compute[286344]: Dec 15 04:44:57 localhost nova_compute[286344]: Dec 15 04:44:57 localhost nova_compute[286344]: Dec 15 04:44:57 localhost nova_compute[286344]: Dec 15 04:44:57 localhost nova_compute[286344]: Dec 15 04:44:57 localhost nova_compute[286344]: Nehalem Dec 15 04:44:57 localhost nova_compute[286344]: Nehalem-IBRS Dec 15 04:44:57 localhost nova_compute[286344]: 
Nehalem-v1 Dec 15 04:44:57 localhost nova_compute[286344]: Nehalem-v2 Dec 15 04:44:57 localhost nova_compute[286344]: Opteron_G1 Dec 15 04:44:57 localhost nova_compute[286344]: Opteron_G1-v1 Dec 15 04:44:57 localhost nova_compute[286344]: Opteron_G2 Dec 15 04:44:57 localhost nova_compute[286344]: Opteron_G2-v1 Dec 15 04:44:57 localhost nova_compute[286344]: Opteron_G3 Dec 15 04:44:57 localhost nova_compute[286344]: Opteron_G3-v1 Dec 15 04:44:57 localhost nova_compute[286344]: Opteron_G4 Dec 15 04:44:57 localhost nova_compute[286344]: Dec 15 04:44:57 localhost nova_compute[286344]: Dec 15 04:44:57 localhost nova_compute[286344]: Dec 15 04:44:57 localhost nova_compute[286344]: Dec 15 04:44:57 localhost nova_compute[286344]: Opteron_G4-v1 Dec 15 04:44:57 localhost nova_compute[286344]: Dec 15 04:44:57 localhost nova_compute[286344]: Dec 15 04:44:57 localhost nova_compute[286344]: Dec 15 04:44:57 localhost nova_compute[286344]: Dec 15 04:44:57 localhost nova_compute[286344]: Opteron_G5 Dec 15 04:44:57 localhost nova_compute[286344]: Dec 15 04:44:57 localhost nova_compute[286344]: Dec 15 04:44:57 localhost nova_compute[286344]: Dec 15 04:44:57 localhost nova_compute[286344]: Dec 15 04:44:57 localhost nova_compute[286344]: Dec 15 04:44:57 localhost nova_compute[286344]: Opteron_G5-v1 Dec 15 04:44:57 localhost nova_compute[286344]: Dec 15 04:44:57 localhost nova_compute[286344]: Dec 15 04:44:57 localhost nova_compute[286344]: Dec 15 04:44:57 localhost nova_compute[286344]: Dec 15 04:44:57 localhost nova_compute[286344]: Dec 15 04:44:57 localhost nova_compute[286344]: Penryn Dec 15 04:44:57 localhost nova_compute[286344]: Penryn-v1 Dec 15 04:44:57 localhost nova_compute[286344]: SandyBridge Dec 15 04:44:57 localhost nova_compute[286344]: SandyBridge-IBRS Dec 15 04:44:57 localhost nova_compute[286344]: SandyBridge-v1 Dec 15 04:44:57 localhost nova_compute[286344]: SandyBridge-v2 Dec 15 04:44:57 localhost nova_compute[286344]: SapphireRapids Dec 15 04:44:57 localhost 
nova_compute[286344]: Dec 15 04:44:57 localhost nova_compute[286344]: Dec 15 04:44:57 localhost nova_compute[286344]: Dec 15 04:44:57 localhost nova_compute[286344]: Dec 15 04:44:57 localhost nova_compute[286344]: Dec 15 04:44:57 localhost nova_compute[286344]: Dec 15 04:44:57 localhost nova_compute[286344]: Dec 15 04:44:57 localhost nova_compute[286344]: Dec 15 04:44:57 localhost nova_compute[286344]: Dec 15 04:44:57 localhost nova_compute[286344]: Dec 15 04:44:57 localhost nova_compute[286344]: Dec 15 04:44:57 localhost nova_compute[286344]: Dec 15 04:44:57 localhost nova_compute[286344]: Dec 15 04:44:57 localhost nova_compute[286344]: Dec 15 04:44:57 localhost nova_compute[286344]: Dec 15 04:44:57 localhost nova_compute[286344]: Dec 15 04:44:57 localhost nova_compute[286344]: Dec 15 04:44:57 localhost nova_compute[286344]: Dec 15 04:44:57 localhost nova_compute[286344]: Dec 15 04:44:57 localhost nova_compute[286344]: Dec 15 04:44:57 localhost nova_compute[286344]: Dec 15 04:44:57 localhost nova_compute[286344]: Dec 15 04:44:57 localhost nova_compute[286344]: Dec 15 04:44:57 localhost nova_compute[286344]: Dec 15 04:44:57 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: SapphireRapids-v1 Dec 15 04:44:58 localhost 
nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: SapphireRapids-v2 Dec 15 04:44:58 localhost 
nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 
04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: SapphireRapids-v3 Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 
localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: SierraForest Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: SierraForest-v1 Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 
localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Skylake-Client Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Skylake-Client-IBRS Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Skylake-Client-noTSX-IBRS Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost 
nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Skylake-Client-v1 Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Skylake-Client-v2 Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Skylake-Client-v3 Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Skylake-Client-v4 Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Skylake-Server Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 
localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Skylake-Server-IBRS Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Skylake-Server-noTSX-IBRS Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Skylake-Server-v1 Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost 
nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Skylake-Server-v2 Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Skylake-Server-v3 Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Skylake-Server-v4 Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 
15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Skylake-Server-v5 Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Snowridge Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Snowridge-v1 Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Snowridge-v2 Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost 
nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Snowridge-v3 Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Snowridge-v4 Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Westmere Dec 15 04:44:58 localhost nova_compute[286344]: Westmere-IBRS Dec 15 04:44:58 localhost nova_compute[286344]: Westmere-v1 Dec 15 04:44:58 localhost nova_compute[286344]: Westmere-v2 Dec 15 04:44:58 localhost nova_compute[286344]: athlon Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: athlon-v1 Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 
localhost nova_compute[286344]: core2duo Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: core2duo-v1 Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: coreduo Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: coreduo-v1 Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: kvm32 Dec 15 04:44:58 localhost nova_compute[286344]: kvm32-v1 Dec 15 04:44:58 localhost nova_compute[286344]: kvm64 Dec 15 04:44:58 localhost nova_compute[286344]: kvm64-v1 Dec 15 04:44:58 localhost nova_compute[286344]: n270 Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: n270-v1 Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: pentium Dec 15 04:44:58 localhost nova_compute[286344]: pentium-v1 Dec 15 04:44:58 localhost nova_compute[286344]: pentium2 Dec 15 04:44:58 localhost nova_compute[286344]: pentium2-v1 Dec 15 04:44:58 localhost nova_compute[286344]: pentium3 Dec 15 04:44:58 localhost nova_compute[286344]: pentium3-v1 Dec 15 04:44:58 localhost nova_compute[286344]: phenom Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost 
nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: phenom-v1 Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: qemu32 Dec 15 04:44:58 localhost nova_compute[286344]: qemu32-v1 Dec 15 04:44:58 localhost nova_compute[286344]: qemu64 Dec 15 04:44:58 localhost nova_compute[286344]: qemu64-v1 Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: file Dec 15 04:44:58 localhost nova_compute[286344]: anonymous Dec 15 04:44:58 localhost nova_compute[286344]: memfd Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: disk Dec 15 04:44:58 localhost nova_compute[286344]: cdrom Dec 15 04:44:58 localhost nova_compute[286344]: floppy Dec 15 04:44:58 localhost nova_compute[286344]: lun Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: fdc Dec 15 04:44:58 localhost nova_compute[286344]: scsi Dec 15 04:44:58 localhost nova_compute[286344]: virtio Dec 15 04:44:58 localhost nova_compute[286344]: usb Dec 15 04:44:58 localhost nova_compute[286344]: sata Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: virtio Dec 15 04:44:58 localhost nova_compute[286344]: virtio-transitional Dec 15 04:44:58 localhost nova_compute[286344]: virtio-non-transitional Dec 15 04:44:58 localhost 
Dec 15 04:44:58 localhost nova_compute[286344]: [libvirt domain-capabilities XML dump; the XML markup was lost in this capture and each element value was logged on its own prefixed line — repeated syslog prefixes stripped below, values kept in original order]
    graphics types: vnc, egl-headless, dbus
    hostdev mode: subsystem; startupPolicy: default, mandatory, requisite, optional; subsystem types: usb, pci, scsi
    virtio model variants: virtio, virtio-transitional, virtio-non-transitional
    rng backend models: random, egd, builtin
    filesystem driver types: path, handle, virtiofs
    TPM models: tpm-tis, tpm-crb; backends: emulator, external; backend version: 2.0
    redirdev bus: usb; channel types: pty, unix; crypto: qemu, builtin
    interface backends: default, passt
    panic models: isa, hyperv
    character device types: null, vc, pty, dev, file, pipe, stdio, udp, tcp, unix, qemu-vdagent, dbus
    Hyper-V enlightenments: relaxed, vapic, spinlocks, vpindex, runtime, synic, stimer, reset, vendor_id, frequencies, reenlightenment, tlbflush, ipi, avic, emsr_bitmap, xmm_input
    4095, on, off, off, Linux KVM Hv
    launch security: tdx
    _get_domain_capabilities /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1037
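[editor's note] The dump above is the raw XML that nova's `_get_domain_capabilities` receives from libvirt, flattened to element text by the log capture. A minimal, hypothetical sketch of how such a document can be inspected with only the Python standard library — the XML fragment below is hand-written to follow the shape of libvirt's domainCapabilities schema, reusing values seen in this log (`/usr/libexec/qemu-kvm`, `vnc`, `egl-headless`, `dbus`), and is not output from this host:

```python
import xml.etree.ElementTree as ET

# Hand-written fragment in the shape of a libvirt domainCapabilities
# document; values are taken from this log but the markup is illustrative.
DOMCAPS = """
<domainCapabilities>
  <path>/usr/libexec/qemu-kvm</path>
  <domain>kvm</domain>
  <devices>
    <graphics supported='yes'>
      <enum name='type'>
        <value>vnc</value>
        <value>egl-headless</value>
        <value>dbus</value>
      </enum>
    </graphics>
  </devices>
</domainCapabilities>
"""

def graphics_types(xml_text):
    """Return the graphics types advertised in a domainCapabilities doc."""
    root = ET.fromstring(xml_text)
    enum = root.find("./devices/graphics/enum[@name='type']")
    return [v.text for v in enum.findall("value")] if enum is not None else []

print(graphics_types(DOMCAPS))  # ['vnc', 'egl-headless', 'dbus']
```

The same traversal applies to any other `<enum>` in the document (TPM models, rng backends, and so on) by changing the XPath.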
Dec 15 04:44:58 localhost nova_compute[286344]: 2025-12-15 09:44:57.976 286348 DEBUG nova.virt.libvirt.host [None req-ed090d16-496a-473a-be36-97cf3cf6502a - - - - - -] Libvirt host hypervisor capabilities for arch=i686 and machine_type=pc: [XML markup lost in this capture — repeated syslog prefixes stripped below, element values kept in original order]
    emulator path: /usr/libexec/qemu-kvm; domain type: kvm; machine: pc-i440fx-rhel7.6.0; arch: i686
    firmware loader: /usr/share/OVMF/OVMF_CODE.secboot.fd; loader types: rom, pflash; readonly: yes, no; secure: no
    feature toggles: on, off; on, off
    host CPU model: EPYC-Rome (vendor: AMD)
    supported CPU models: 486, 486-v1, Broadwell, Broadwell-IBRS, Broadwell-noTSX, Broadwell-noTSX-IBRS, Broadwell-v1, Broadwell-v2, Broadwell-v3, Broadwell-v4, Cascadelake-Server, Cascadelake-Server-noTSX, Cascadelake-Server-v1, Cascadelake-Server-v2, Cascadelake-Server-v3, Cascadelake-Server-v4, Cascadelake-Server-v5, Conroe, Conroe-v1, Cooperlake, Cooperlake-v1, Cooperlake-v2, Denverton, Denverton-v1, Denverton-v2, Denverton-v3, Dhyana, Dhyana-v1, Dhyana-v2, EPYC, EPYC-Genoa, EPYC-Genoa-v1, EPYC-IBPB, EPYC-Milan, EPYC-Milan-v1, EPYC-Milan-v2, EPYC-Rome, EPYC-Rome-v1, EPYC-Rome-v2, EPYC-Rome-v3, EPYC-Rome-v4, EPYC-v1, EPYC-v2, EPYC-v3, EPYC-v4, GraniteRapids, GraniteRapids-v1, GraniteRapids-v2, Haswell, Haswell-IBRS, Haswell-noTSX, Haswell-noTSX-IBRS, Haswell-v1, Haswell-v2, Haswell-v3, Haswell-v4, Icelake-Server, Icelake-Server-noTSX, Icelake-Server-v1 (list continues)
Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Icelake-Server-v2 Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 
04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Icelake-Server-v3 Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Icelake-Server-v4 Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: 
Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Icelake-Server-v5 Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Icelake-Server-v6 Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost 
nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Icelake-Server-v7 Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost 
nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: IvyBridge Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: IvyBridge-IBRS Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: IvyBridge-v1 Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: IvyBridge-v2 Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: KnightsMill Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: KnightsMill-v1 Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost 
nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Nehalem Dec 15 04:44:58 localhost nova_compute[286344]: Nehalem-IBRS Dec 15 04:44:58 localhost nova_compute[286344]: Nehalem-v1 Dec 15 04:44:58 localhost nova_compute[286344]: Nehalem-v2 Dec 15 04:44:58 localhost nova_compute[286344]: Opteron_G1 Dec 15 04:44:58 localhost nova_compute[286344]: Opteron_G1-v1 Dec 15 04:44:58 localhost nova_compute[286344]: Opteron_G2 Dec 15 04:44:58 localhost nova_compute[286344]: Opteron_G2-v1 Dec 15 04:44:58 localhost nova_compute[286344]: Opteron_G3 Dec 15 04:44:58 localhost nova_compute[286344]: Opteron_G3-v1 Dec 15 04:44:58 localhost nova_compute[286344]: Opteron_G4 Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Opteron_G4-v1 Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Opteron_G5 Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Opteron_G5-v1 Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost 
nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Penryn Dec 15 04:44:58 localhost nova_compute[286344]: Penryn-v1 Dec 15 04:44:58 localhost nova_compute[286344]: SandyBridge Dec 15 04:44:58 localhost nova_compute[286344]: SandyBridge-IBRS Dec 15 04:44:58 localhost nova_compute[286344]: SandyBridge-v1 Dec 15 04:44:58 localhost nova_compute[286344]: SandyBridge-v2 Dec 15 04:44:58 localhost nova_compute[286344]: SapphireRapids Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 
04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: SapphireRapids-v1 Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 
localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: SapphireRapids-v2 Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost 
nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: SapphireRapids-v3 Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost 
nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: SierraForest Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost 
nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: SierraForest-v1 Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Skylake-Client Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Skylake-Client-IBRS Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 
04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Skylake-Client-noTSX-IBRS Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Skylake-Client-v1 Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Skylake-Client-v2 Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Skylake-Client-v3 Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Skylake-Client-v4 Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost 
nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Skylake-Server Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Skylake-Server-IBRS Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Skylake-Server-noTSX-IBRS Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost 
nova_compute[286344]:
Dec 15 04:44:58 localhost nova_compute[286344]: Skylake-Server-v1
Dec 15 04:44:58 localhost nova_compute[286344]: Skylake-Server-v2
Dec 15 04:44:58 localhost nova_compute[286344]: Skylake-Server-v3
Dec 15 04:44:58 localhost nova_compute[286344]: Skylake-Server-v4
Dec 15 04:44:58 localhost nova_compute[286344]: Skylake-Server-v5
Dec 15 04:44:58 localhost nova_compute[286344]: Snowridge
Dec 15 04:44:58 localhost nova_compute[286344]: Snowridge-v1
Dec 15 04:44:58 localhost nova_compute[286344]: Snowridge-v2
Dec 15 04:44:58 localhost nova_compute[286344]: Snowridge-v3
Dec 15 04:44:58 localhost nova_compute[286344]: Snowridge-v4
Dec 15 04:44:58 localhost nova_compute[286344]: Westmere
Dec 15 04:44:58 localhost nova_compute[286344]: Westmere-IBRS
Dec 15 04:44:58 localhost nova_compute[286344]: Westmere-v1
Dec 15 04:44:58 localhost nova_compute[286344]: Westmere-v2
Dec 15 04:44:58 localhost nova_compute[286344]: athlon
Dec 15 04:44:58 localhost nova_compute[286344]: athlon-v1
Dec 15 04:44:58 localhost nova_compute[286344]: core2duo
Dec 15 04:44:58 localhost nova_compute[286344]: core2duo-v1
Dec 15 04:44:58 localhost nova_compute[286344]: coreduo
Dec 15 04:44:58 localhost nova_compute[286344]: coreduo-v1
Dec 15 04:44:58 localhost nova_compute[286344]: kvm32
Dec 15 04:44:58 localhost nova_compute[286344]: kvm32-v1
Dec 15 04:44:58 localhost nova_compute[286344]: kvm64
Dec 15 04:44:58 localhost nova_compute[286344]: kvm64-v1
Dec 15 04:44:58 localhost nova_compute[286344]: n270
Dec 15 04:44:58 localhost nova_compute[286344]: n270-v1
Dec 15 04:44:58 localhost nova_compute[286344]: pentium
Dec 15 04:44:58 localhost nova_compute[286344]: pentium-v1
Dec 15 04:44:58 localhost nova_compute[286344]: pentium2
Dec 15 04:44:58 localhost nova_compute[286344]: pentium2-v1
Dec 15 04:44:58 localhost nova_compute[286344]: pentium3
Dec 15 04:44:58 localhost nova_compute[286344]: pentium3-v1
Dec 15 04:44:58 localhost nova_compute[286344]: phenom
Dec 15 04:44:58 localhost nova_compute[286344]: phenom-v1
Dec 15 04:44:58 localhost nova_compute[286344]: qemu32
Dec 15 04:44:58 localhost nova_compute[286344]: qemu32-v1
Dec 15 04:44:58 localhost nova_compute[286344]: qemu64
Dec 15 04:44:58 localhost nova_compute[286344]: qemu64-v1
Dec 15 04:44:58 localhost nova_compute[286344]: file
Dec 15 04:44:58 localhost nova_compute[286344]: anonymous
Dec 15 04:44:58 localhost nova_compute[286344]: memfd
Dec 15 04:44:58 localhost nova_compute[286344]: disk
Dec 15 04:44:58 localhost nova_compute[286344]: cdrom
Dec 15 04:44:58 localhost nova_compute[286344]: floppy
Dec 15 04:44:58 localhost nova_compute[286344]: lun
Dec 15 04:44:58 localhost nova_compute[286344]: ide
Dec 15 04:44:58 localhost nova_compute[286344]: fdc
Dec 15 04:44:58 localhost nova_compute[286344]: scsi
Dec 15 04:44:58 localhost nova_compute[286344]: virtio
Dec 15 04:44:58 localhost nova_compute[286344]: usb
Dec 15 04:44:58 localhost nova_compute[286344]: sata
Dec 15 04:44:58 localhost nova_compute[286344]: virtio
Dec 15 04:44:58 localhost nova_compute[286344]: virtio-transitional
Dec 15 04:44:58 localhost nova_compute[286344]: virtio-non-transitional
Dec 15 04:44:58 localhost nova_compute[286344]: vnc
Dec 15 04:44:58 localhost nova_compute[286344]: egl-headless
Dec 15 04:44:58 localhost nova_compute[286344]: dbus
Dec 15 04:44:58 localhost nova_compute[286344]: subsystem
Dec 15 04:44:58 localhost nova_compute[286344]: default
Dec 15 04:44:58 localhost nova_compute[286344]: mandatory
Dec 15 04:44:58 localhost nova_compute[286344]: requisite
Dec 15 04:44:58 localhost nova_compute[286344]: optional
Dec 15 04:44:58 localhost nova_compute[286344]: usb
Dec 15 04:44:58 localhost nova_compute[286344]: pci
Dec 15 04:44:58 localhost nova_compute[286344]: scsi
Dec 15 04:44:58 localhost nova_compute[286344]: virtio
Dec 15 04:44:58 localhost nova_compute[286344]: virtio-transitional
Dec 15 04:44:58 localhost nova_compute[286344]: virtio-non-transitional
Dec 15 04:44:58 localhost nova_compute[286344]: random
Dec 15 04:44:58 localhost nova_compute[286344]: egd
Dec 15 04:44:58 localhost nova_compute[286344]: builtin
Dec 15 04:44:58 localhost nova_compute[286344]: path
Dec 15 04:44:58 localhost nova_compute[286344]: handle
Dec 15 04:44:58 localhost nova_compute[286344]: virtiofs
Dec 15 04:44:58 localhost nova_compute[286344]: tpm-tis
Dec 15 04:44:58 localhost nova_compute[286344]: tpm-crb
Dec 15 04:44:58 localhost nova_compute[286344]: emulator
Dec 15 04:44:58 localhost nova_compute[286344]: external
Dec 15 04:44:58 localhost nova_compute[286344]: 2.0
Dec 15 04:44:58 localhost nova_compute[286344]: usb
Dec 15 04:44:58 localhost nova_compute[286344]: pty
Dec 15 04:44:58 localhost nova_compute[286344]: unix
Dec 15 04:44:58 localhost nova_compute[286344]: qemu
Dec 15 04:44:58 localhost nova_compute[286344]: builtin
Dec 15 04:44:58 localhost nova_compute[286344]: default
Dec 15 04:44:58 localhost nova_compute[286344]: passt
Dec 15 04:44:58 localhost nova_compute[286344]: isa
Dec 15 04:44:58 localhost nova_compute[286344]: hyperv
Dec 15 04:44:58 localhost nova_compute[286344]: null
Dec 15 04:44:58 localhost nova_compute[286344]: vc
Dec 15 04:44:58 localhost nova_compute[286344]: pty
Dec 15 04:44:58 localhost nova_compute[286344]: dev
Dec 15 04:44:58 localhost nova_compute[286344]: file
Dec 15 04:44:58 localhost nova_compute[286344]: pipe
Dec 15 04:44:58 localhost nova_compute[286344]: stdio
Dec 15 04:44:58 localhost nova_compute[286344]: udp
Dec 15 04:44:58 localhost nova_compute[286344]: tcp
Dec 15 04:44:58 localhost nova_compute[286344]: unix
Dec 15 04:44:58 localhost nova_compute[286344]: qemu-vdagent
Dec 15 04:44:58 localhost nova_compute[286344]: dbus
Dec 15 04:44:58 localhost nova_compute[286344]: relaxed
Dec 15 04:44:58 localhost nova_compute[286344]: vapic
Dec 15 04:44:58 localhost nova_compute[286344]: spinlocks
Dec 15 04:44:58 localhost nova_compute[286344]: vpindex
Dec 15 04:44:58 localhost nova_compute[286344]: runtime
Dec 15 04:44:58 localhost nova_compute[286344]: synic
Dec 15 04:44:58 localhost nova_compute[286344]: stimer
Dec 15 04:44:58 localhost nova_compute[286344]: reset
Dec 15 04:44:58 localhost nova_compute[286344]: vendor_id
Dec 15 04:44:58 localhost nova_compute[286344]: frequencies
Dec 15 04:44:58 localhost nova_compute[286344]: reenlightenment
Dec 15 04:44:58 localhost nova_compute[286344]: tlbflush
Dec 15 04:44:58 localhost nova_compute[286344]: ipi
Dec 15 04:44:58 localhost nova_compute[286344]: avic
Dec 15 04:44:58 localhost nova_compute[286344]: emsr_bitmap
Dec 15 04:44:58 localhost nova_compute[286344]: xmm_input
Dec 15 04:44:58 localhost nova_compute[286344]: 4095
Dec 15 04:44:58 localhost nova_compute[286344]: on
Dec 15 04:44:58 localhost nova_compute[286344]: off
Dec 15 04:44:58 localhost nova_compute[286344]: off
Dec 15 04:44:58 localhost nova_compute[286344]: Linux KVM Hv
Dec 15 04:44:58 localhost nova_compute[286344]: tdx
Dec 15 04:44:58 localhost nova_compute[286344]: _get_domain_capabilities /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1037#033[00m
Dec 15 04:44:58 localhost nova_compute[286344]: 2025-12-15 09:44:57.997 286348 DEBUG nova.virt.libvirt.host [None req-ed090d16-496a-473a-be36-97cf3cf6502a - - - - - -] Getting domain capabilities for x86_64 via machine types: {'q35', 'pc'} _get_machine_types /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:952#033[00m
Dec 15 04:44:58 localhost nova_compute[286344]: 2025-12-15 09:44:58.001 286348 DEBUG nova.virt.libvirt.host [None req-ed090d16-496a-473a-be36-97cf3cf6502a - - - - - -] Libvirt host hypervisor capabilities for arch=x86_64 and machine_type=q35:
Dec 15 04:44:58 localhost nova_compute[286344]: /usr/libexec/qemu-kvm
Dec 15 04:44:58 localhost nova_compute[286344]: kvm
Dec 15 04:44:58 localhost nova_compute[286344]: pc-q35-rhel9.8.0
Dec 15 04:44:58 localhost nova_compute[286344]: x86_64
Dec 15 04:44:58 localhost nova_compute[286344]: efi
Dec 15 04:44:58 localhost nova_compute[286344]: /usr/share/edk2/ovmf/OVMF_CODE.secboot.fd
Dec 15 04:44:58 localhost nova_compute[286344]: /usr/share/edk2/ovmf/OVMF_CODE.fd
Dec 15 04:44:58 localhost nova_compute[286344]: /usr/share/edk2/ovmf/OVMF.amdsev.fd
Dec 15 04:44:58 localhost nova_compute[286344]: /usr/share/edk2/ovmf/OVMF.inteltdx.secboot.fd
Dec 15 04:44:58 localhost nova_compute[286344]: rom
Dec 15 04:44:58 localhost nova_compute[286344]: pflash
Dec 15 04:44:58 localhost nova_compute[286344]: yes
Dec 15 04:44:58 localhost nova_compute[286344]: no
Dec 15 04:44:58 localhost nova_compute[286344]: yes
Dec 15 04:44:58 localhost nova_compute[286344]: no
Dec 15 04:44:58 localhost nova_compute[286344]: on
Dec 15 04:44:58 localhost nova_compute[286344]: off
Dec 15 04:44:58 localhost nova_compute[286344]: on
Dec 15 04:44:58 localhost nova_compute[286344]: off
Dec 15 04:44:58 localhost nova_compute[286344]: EPYC-Rome
Dec 15 04:44:58 localhost nova_compute[286344]: AMD
Dec 15 04:44:58 localhost nova_compute[286344]: 486
Dec 15 04:44:58 localhost nova_compute[286344]: 486-v1
Dec 15 04:44:58 localhost nova_compute[286344]: Broadwell
Dec 15 04:44:58 localhost nova_compute[286344]: Broadwell-IBRS
Dec 15 04:44:58 localhost nova_compute[286344]: Broadwell-noTSX
Dec 15 04:44:58 localhost nova_compute[286344]: Broadwell-noTSX-IBRS
Dec 15 04:44:58 localhost nova_compute[286344]: Broadwell-v1
Dec 15 04:44:58 localhost nova_compute[286344]: Broadwell-v2
Dec 15 04:44:58 localhost nova_compute[286344]: Broadwell-v3
Dec 15 04:44:58 localhost nova_compute[286344]: Broadwell-v4
Dec 15 04:44:58 localhost nova_compute[286344]: Cascadelake-Server
Dec 15 04:44:58 localhost nova_compute[286344]: Cascadelake-Server-noTSX
Dec 15 04:44:58 localhost nova_compute[286344]: Cascadelake-Server-v1
Dec 15 04:44:58 localhost nova_compute[286344]: Cascadelake-Server-v2
Dec 15 04:44:58 localhost nova_compute[286344]: Cascadelake-Server-v3
Dec 15 04:44:58 localhost nova_compute[286344]: Cascadelake-Server-v4
Dec 15 04:44:58 localhost nova_compute[286344]: Cascadelake-Server-v5
Dec 15 04:44:58 localhost nova_compute[286344]: Conroe
Dec 15 04:44:58 localhost nova_compute[286344]: Conroe-v1
Dec 15 04:44:58 localhost nova_compute[286344]: Cooperlake
Dec 15 04:44:58 localhost nova_compute[286344]: Cooperlake-v1
Dec 15 04:44:58 localhost nova_compute[286344]: Cooperlake-v2
Dec 15 04:44:58 localhost nova_compute[286344]: Denverton
Dec 15 04:44:58 localhost nova_compute[286344]: Denverton-v1
Dec 15 04:44:58 localhost nova_compute[286344]: Denverton-v2
Dec 15 04:44:58 localhost nova_compute[286344]: Denverton-v3
Dec 15 04:44:58 localhost nova_compute[286344]: Dhyana
Dec 15 04:44:58 localhost nova_compute[286344]: Dhyana-v1
Dec 15 04:44:58 localhost nova_compute[286344]: Dhyana-v2
Dec 15 04:44:58 localhost nova_compute[286344]: EPYC
Dec 15 04:44:58 localhost nova_compute[286344]: EPYC-Genoa
Dec 15 04:44:58 localhost nova_compute[286344]: EPYC-Genoa-v1
Dec 15 04:44:58 localhost nova_compute[286344]: EPYC-IBPB
Dec 15 04:44:58 localhost nova_compute[286344]: EPYC-Milan
Dec 15 04:44:58 localhost nova_compute[286344]: EPYC-Milan-v1
Dec 15 04:44:58 localhost nova_compute[286344]: EPYC-Milan-v2
Dec 15 04:44:58 localhost nova_compute[286344]: EPYC-Rome
Dec 15 04:44:58 localhost nova_compute[286344]: EPYC-Rome-v1
Dec 15 04:44:58 localhost
nova_compute[286344]: EPYC-Rome-v2 Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: EPYC-Rome-v3 Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: EPYC-Rome-v4 Dec 15 04:44:58 localhost nova_compute[286344]: EPYC-v1 Dec 15 04:44:58 localhost nova_compute[286344]: EPYC-v2 Dec 15 04:44:58 localhost nova_compute[286344]: EPYC-v3 Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: EPYC-v4 Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: GraniteRapids Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 
localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: GraniteRapids-v1 Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost 
nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: GraniteRapids-v2 Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost 
nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 
04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Haswell Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Haswell-IBRS Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Haswell-noTSX Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Haswell-noTSX-IBRS Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Haswell-v1 Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 
localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Haswell-v2 Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Haswell-v3 Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Haswell-v4 Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Icelake-Server Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: 
Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Icelake-Server-noTSX Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Icelake-Server-v1 Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost 
nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Icelake-Server-v2 Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Icelake-Server-v3 Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 
localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Icelake-Server-v4 Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Icelake-Server-v5 Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 
04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Icelake-Server-v6 Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 
localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Icelake-Server-v7 Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: IvyBridge Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: IvyBridge-IBRS Dec 15 04:44:58 localhost nova_compute[286344]: Dec 
15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: IvyBridge-v1 Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: IvyBridge-v2 Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: KnightsMill Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: KnightsMill-v1 Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Nehalem Dec 15 04:44:58 localhost nova_compute[286344]: Nehalem-IBRS Dec 15 04:44:58 localhost nova_compute[286344]: Nehalem-v1 Dec 15 04:44:58 localhost nova_compute[286344]: Nehalem-v2 Dec 15 04:44:58 localhost nova_compute[286344]: Opteron_G1 Dec 15 04:44:58 
localhost nova_compute[286344]: Opteron_G1-v1 Dec 15 04:44:58 localhost nova_compute[286344]: Opteron_G2 Dec 15 04:44:58 localhost nova_compute[286344]: Opteron_G2-v1 Dec 15 04:44:58 localhost nova_compute[286344]: Opteron_G3 Dec 15 04:44:58 localhost nova_compute[286344]: Opteron_G3-v1 Dec 15 04:44:58 localhost nova_compute[286344]: Opteron_G4 Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Opteron_G4-v1 Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Opteron_G5 Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Opteron_G5-v1 Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Penryn Dec 15 04:44:58 localhost nova_compute[286344]: Penryn-v1 Dec 15 04:44:58 localhost nova_compute[286344]: SandyBridge Dec 15 04:44:58 localhost nova_compute[286344]: SandyBridge-IBRS Dec 15 04:44:58 localhost nova_compute[286344]: SandyBridge-v1 Dec 15 04:44:58 localhost nova_compute[286344]: SandyBridge-v2 Dec 15 04:44:58 localhost nova_compute[286344]: SapphireRapids Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost 
Dec 15 04:44:58 localhost nova_compute[286344]: SapphireRapids-v1 SapphireRapids-v2 SapphireRapids-v3 SierraForest SierraForest-v1 Skylake-Client Skylake-Client-IBRS Skylake-Client-noTSX-IBRS Skylake-Client-v1 Skylake-Client-v2 Skylake-Client-v3 Skylake-Client-v4 Skylake-Server Skylake-Server-IBRS Skylake-Server-noTSX-IBRS Skylake-Server-v1 Skylake-Server-v2 Skylake-Server-v3 Skylake-Server-v4 Skylake-Server-v5 Snowridge Snowridge-v1 Snowridge-v2 Snowridge-v3 Snowridge-v4 Westmere Westmere-IBRS Westmere-v1 Westmere-v2 athlon athlon-v1 core2duo core2duo-v1 coreduo coreduo-v1 kvm32 kvm32-v1 kvm64 kvm64-v1 n270 n270-v1 pentium pentium-v1 pentium2 pentium2-v1 pentium3 pentium3-v1 phenom phenom-v1 qemu32 qemu32-v1 qemu64 qemu64-v1
Dec 15 04:44:58 localhost nova_compute[286344]: file anonymous memfd
Dec 15 04:44:58 localhost nova_compute[286344]: disk cdrom floppy lun
Dec 15 04:44:58 localhost nova_compute[286344]: fdc scsi virtio usb sata
Dec 15 04:44:58 localhost nova_compute[286344]: virtio virtio-transitional virtio-non-transitional
Dec 15 04:44:58 localhost nova_compute[286344]: vnc egl-headless dbus
Dec 15 04:44:58 localhost nova_compute[286344]: subsystem
Dec 15 04:44:58 localhost nova_compute[286344]: default mandatory requisite optional
Dec 15 04:44:58 localhost nova_compute[286344]: usb pci scsi
Dec 15 04:44:58 localhost nova_compute[286344]: virtio virtio-transitional virtio-non-transitional
Dec 15 04:44:58 localhost nova_compute[286344]: random egd builtin
Dec 15 04:44:58 localhost nova_compute[286344]: path handle virtiofs
Dec 15 04:44:58 localhost nova_compute[286344]: tpm-tis tpm-crb
Dec 15 04:44:58 localhost nova_compute[286344]: emulator external
Dec 15 04:44:58 localhost nova_compute[286344]: 2.0
Dec 15 04:44:58 localhost nova_compute[286344]: usb
Dec 15 04:44:58 localhost nova_compute[286344]: pty unix
Dec 15 04:44:58 localhost nova_compute[286344]: qemu
Dec 15 04:44:58 localhost nova_compute[286344]: builtin
Dec 15 04:44:58 localhost nova_compute[286344]: default passt
Dec 15 04:44:58 localhost nova_compute[286344]: isa hyperv
Dec 15 04:44:58 localhost nova_compute[286344]: null vc pty dev file pipe stdio udp tcp unix qemu-vdagent dbus
Dec 15 04:44:58 localhost nova_compute[286344]: relaxed vapic spinlocks vpindex runtime synic stimer reset vendor_id frequencies reenlightenment tlbflush ipi avic emsr_bitmap xmm_input
Dec 15 04:44:58 localhost nova_compute[286344]: 4095 on off off Linux KVM Hv
Dec 15 04:44:58 localhost nova_compute[286344]: tdx
Dec 15 04:44:58 localhost nova_compute[286344]: _get_domain_capabilities /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1037#033[00m
Dec 15 04:44:58 localhost nova_compute[286344]: 2025-12-15 09:44:58.047 286348 DEBUG nova.virt.libvirt.host [None req-ed090d16-496a-473a-be36-97cf3cf6502a - - - - - -] Libvirt host hypervisor capabilities for arch=x86_64 and machine_type=pc:
Dec 15 04:44:58 localhost nova_compute[286344]: /usr/libexec/qemu-kvm kvm pc-i440fx-rhel7.6.0 x86_64
Dec 15 04:44:58 localhost nova_compute[286344]: /usr/share/OVMF/OVMF_CODE.secboot.fd
Dec 15 04:44:58 localhost nova_compute[286344]: rom pflash
Dec 15 04:44:58 localhost nova_compute[286344]: yes no
Dec 15 04:44:58 localhost nova_compute[286344]: no
Dec 15 04:44:58 localhost nova_compute[286344]: on off
Dec 15 04:44:58 localhost nova_compute[286344]: on off
Dec 15 04:44:58 localhost nova_compute[286344]: EPYC-Rome AMD
Dec 15 04:44:58 localhost nova_compute[286344]: 486 486-v1 Broadwell Broadwell-IBRS Broadwell-noTSX Broadwell-noTSX-IBRS Broadwell-v1 Broadwell-v2 Broadwell-v3 Broadwell-v4
localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Cascadelake-Server Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Cascadelake-Server-noTSX Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Cascadelake-Server-v1 Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost 
nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Cascadelake-Server-v2 Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Cascadelake-Server-v3 Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Cascadelake-Server-v4 Dec 15 04:44:58 localhost 
nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Cascadelake-Server-v5 Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Conroe Dec 15 04:44:58 localhost nova_compute[286344]: Conroe-v1 Dec 15 04:44:58 localhost nova_compute[286344]: Cooperlake Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 
04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Cooperlake-v1 Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Cooperlake-v2 Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 
04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Denverton Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Denverton-v1 Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Denverton-v2 Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Denverton-v3 Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dhyana Dec 15 04:44:58 localhost nova_compute[286344]: Dhyana-v1 Dec 15 04:44:58 localhost nova_compute[286344]: Dhyana-v2 Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: EPYC Dec 15 04:44:58 localhost nova_compute[286344]: EPYC-Genoa Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost 
nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: EPYC-Genoa-v1 Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost 
nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: EPYC-IBPB Dec 15 04:44:58 localhost nova_compute[286344]: EPYC-Milan Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: EPYC-Milan-v1 Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: EPYC-Milan-v2 Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 
04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: EPYC-Rome Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: EPYC-Rome-v1 Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: EPYC-Rome-v2 Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: EPYC-Rome-v3 Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: EPYC-Rome-v4 Dec 15 04:44:58 localhost nova_compute[286344]: EPYC-v1 Dec 15 04:44:58 localhost nova_compute[286344]: EPYC-v2 Dec 15 04:44:58 localhost nova_compute[286344]: EPYC-v3 Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: EPYC-v4 Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: GraniteRapids Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost 
nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: GraniteRapids-v1 Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost 
nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 
04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: GraniteRapids-v2 Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 
localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Haswell Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Haswell-IBRS Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Haswell-noTSX Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: 
Haswell-noTSX-IBRS Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Haswell-v1 Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Haswell-v2 Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Haswell-v3 Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Haswell-v4 Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Icelake-Server Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: 
Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Icelake-Server-noTSX Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Icelake-Server-v1 Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost 
Dec 15 04:44:58 localhost nova_compute[286344]: [libvirt domain capabilities output; XML element markup was lost in log extraction — only the text values survive. Values are listed below in original order; group labels are inferred from the libvirt domainCapabilities schema.]
Dec 15 04:44:58 localhost nova_compute[286344]: cpu custom models (continued): Icelake-Server-v2 Icelake-Server-v3 Icelake-Server-v4 Icelake-Server-v5 Icelake-Server-v6 Icelake-Server-v7 IvyBridge IvyBridge-IBRS IvyBridge-v1 IvyBridge-v2 KnightsMill KnightsMill-v1 Nehalem Nehalem-IBRS Nehalem-v1 Nehalem-v2 Opteron_G1 Opteron_G1-v1 Opteron_G2 Opteron_G2-v1 Opteron_G3 Opteron_G3-v1 Opteron_G4 Opteron_G4-v1 Opteron_G5 Opteron_G5-v1 Penryn Penryn-v1 SandyBridge SandyBridge-IBRS SandyBridge-v1 SandyBridge-v2 SapphireRapids SapphireRapids-v1 SapphireRapids-v2 SapphireRapids-v3 SierraForest SierraForest-v1 Skylake-Client Skylake-Client-IBRS Skylake-Client-noTSX-IBRS Skylake-Client-v1 Skylake-Client-v2 Skylake-Client-v3 Skylake-Client-v4 Skylake-Server Skylake-Server-IBRS Skylake-Server-noTSX-IBRS Skylake-Server-v1 Skylake-Server-v2 Skylake-Server-v3 Skylake-Server-v4 Skylake-Server-v5 Snowridge Snowridge-v1 Snowridge-v2 Snowridge-v3 Snowridge-v4 Westmere Westmere-IBRS Westmere-v1 Westmere-v2 athlon athlon-v1 core2duo core2duo-v1 coreduo coreduo-v1 kvm32 kvm32-v1 kvm64 kvm64-v1 n270 n270-v1 pentium pentium-v1 pentium2 pentium2-v1 pentium3 pentium3-v1 phenom phenom-v1 qemu32 qemu32-v1 qemu64 qemu64-v1
Dec 15 04:44:58 localhost nova_compute[286344]: memoryBacking sourceType: file anonymous memfd
Dec 15 04:44:58 localhost nova_compute[286344]: disk diskDevice: disk cdrom floppy lun
Dec 15 04:44:58 localhost nova_compute[286344]: disk bus: ide fdc scsi virtio usb sata
Dec 15 04:44:58 localhost nova_compute[286344]: disk model: virtio virtio-transitional virtio-non-transitional
Dec 15 04:44:58 localhost nova_compute[286344]: graphics type: vnc egl-headless dbus
Dec 15 04:44:58 localhost nova_compute[286344]: hostdev mode: subsystem; startupPolicy: default mandatory requisite optional; subsysType: usb pci scsi
Dec 15 04:44:58 localhost nova_compute[286344]: rng model: virtio virtio-transitional virtio-non-transitional; rng backendModel: random egd builtin
Dec 15 04:44:58 localhost nova_compute[286344]: filesystem driverType: path handle virtiofs
Dec 15 04:44:58 localhost nova_compute[286344]: tpm model: tpm-tis tpm-crb; tpm backendModel: emulator external; tpm backendVersion: 2.0
Dec 15 04:44:58 localhost nova_compute[286344]: redirdev bus: usb
Dec 15 04:44:58 localhost nova_compute[286344]: channel type: pty unix [entry truncated at end of captured span]
nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: qemu Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: builtin Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: default Dec 15 04:44:58 localhost nova_compute[286344]: passt Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: isa Dec 15 04:44:58 localhost nova_compute[286344]: hyperv Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: null Dec 15 04:44:58 localhost nova_compute[286344]: vc Dec 15 04:44:58 localhost nova_compute[286344]: pty Dec 15 04:44:58 localhost nova_compute[286344]: dev Dec 15 04:44:58 localhost nova_compute[286344]: file Dec 15 04:44:58 localhost nova_compute[286344]: pipe Dec 15 04:44:58 localhost nova_compute[286344]: stdio Dec 15 04:44:58 localhost nova_compute[286344]: udp Dec 15 04:44:58 localhost nova_compute[286344]: tcp Dec 15 04:44:58 localhost nova_compute[286344]: unix Dec 15 04:44:58 localhost nova_compute[286344]: qemu-vdagent Dec 15 04:44:58 localhost nova_compute[286344]: dbus Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 
15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: relaxed Dec 15 04:44:58 localhost nova_compute[286344]: vapic Dec 15 04:44:58 localhost nova_compute[286344]: spinlocks Dec 15 04:44:58 localhost nova_compute[286344]: vpindex Dec 15 04:44:58 localhost nova_compute[286344]: runtime Dec 15 04:44:58 localhost nova_compute[286344]: synic Dec 15 04:44:58 localhost nova_compute[286344]: stimer Dec 15 04:44:58 localhost nova_compute[286344]: reset Dec 15 04:44:58 localhost nova_compute[286344]: vendor_id Dec 15 04:44:58 localhost nova_compute[286344]: frequencies Dec 15 04:44:58 localhost nova_compute[286344]: reenlightenment Dec 15 04:44:58 localhost nova_compute[286344]: tlbflush Dec 15 04:44:58 localhost nova_compute[286344]: ipi Dec 15 04:44:58 localhost nova_compute[286344]: avic Dec 15 04:44:58 localhost nova_compute[286344]: emsr_bitmap Dec 15 04:44:58 localhost nova_compute[286344]: xmm_input Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: 4095 Dec 15 04:44:58 localhost nova_compute[286344]: on Dec 15 04:44:58 localhost nova_compute[286344]: off Dec 15 04:44:58 localhost nova_compute[286344]: off Dec 15 04:44:58 localhost nova_compute[286344]: Linux KVM Hv Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 
04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: tdx Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: Dec 15 04:44:58 localhost nova_compute[286344]: _get_domain_capabilities /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1037#033[00m Dec 15 04:44:58 localhost nova_compute[286344]: 2025-12-15 09:44:58.103 286348 DEBUG nova.virt.libvirt.host [None req-ed090d16-496a-473a-be36-97cf3cf6502a - - - - - -] Checking secure boot support for host arch (x86_64) supports_secure_boot /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1782#033[00m Dec 15 04:44:58 localhost nova_compute[286344]: 2025-12-15 09:44:58.103 286348 INFO nova.virt.libvirt.host [None req-ed090d16-496a-473a-be36-97cf3cf6502a - - - - - -] Secure Boot support detected#033[00m Dec 15 04:44:58 localhost nova_compute[286344]: 2025-12-15 09:44:58.104 286348 INFO nova.virt.libvirt.driver [None req-ed090d16-496a-473a-be36-97cf3cf6502a - - - - - -] The live_migration_permit_post_copy is set to True and post copy live migration is available so auto-converge will not be in use.#033[00m Dec 15 04:44:58 localhost nova_compute[286344]: 2025-12-15 09:44:58.104 286348 INFO nova.virt.libvirt.driver [None req-ed090d16-496a-473a-be36-97cf3cf6502a - - - - - -] The live_migration_permit_post_copy is set to True and post copy live migration is available so auto-converge will not be in use.#033[00m Dec 15 04:44:58 localhost nova_compute[286344]: 2025-12-15 09:44:58.113 286348 DEBUG nova.virt.libvirt.driver [None req-ed090d16-496a-473a-be36-97cf3cf6502a - - - - - -] Enabling emulated TPM support _check_vtpm_support /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:1097#033[00m Dec 15 04:44:58 localhost nova_compute[286344]: 2025-12-15 09:44:58.134 286348 INFO nova.virt.node [None 
req-ed090d16-496a-473a-be36-97cf3cf6502a - - - - - -] Determined node identity 26c8956b-6742-4951-b566-971b9bbe323b from /var/lib/nova/compute_id#033[00m Dec 15 04:44:58 localhost nova_compute[286344]: 2025-12-15 09:44:58.149 286348 DEBUG nova.compute.manager [None req-ed090d16-496a-473a-be36-97cf3cf6502a - - - - - -] Verified node 26c8956b-6742-4951-b566-971b9bbe323b matches my host np0005559462.localdomain _check_for_host_rename /usr/lib/python3.9/site-packages/nova/compute/manager.py:1568#033[00m Dec 15 04:44:58 localhost nova_compute[286344]: 2025-12-15 09:44:58.170 286348 DEBUG nova.compute.manager [None req-ed090d16-496a-473a-be36-97cf3cf6502a - - - - - -] [instance: 39ff1bd9-6f6b-44c8-bbec-a1fd9d196359] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m Dec 15 04:44:58 localhost nova_compute[286344]: 2025-12-15 09:44:58.174 286348 DEBUG nova.virt.libvirt.vif [None req-ed090d16-496a-473a-be36-97cf3cf6502a - - - - - -] vif_type=ovs 
instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-12-15T08:29:49Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=,disable_terminate=False,display_description='test',display_name='test',ec2_ids=,ephemeral_gb=1,ephemeral_key_uuid=None,fault=,flavor=,hidden=False,host='np0005559462.localdomain',hostname='test',id=2,image_ref='7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e',info_cache=InstanceInfoCache,instance_type_id=2,kernel_id='',key_data=None,key_name=None,keypairs=,launch_index=0,launched_at=2025-12-15T08:30:01Z,launched_on='np0005559462.localdomain',locked=False,locked_by=None,memory_mb=512,metadata={},migration_context=,new_flavor=,node='np0005559462.localdomain',numa_topology=None,old_flavor=,os_type=None,pci_devices=,pci_requests=,power_state=1,progress=0,project_id='c785bf23f53946bc99867d8832a50266',ramdisk_id='',reservation_id='r-e1tbwc05',resources=,root_device_name='/dev/vda',root_gb=1,security_groups=,services=,shutdown_terminate=False,system_metadata=,tags=,task_state=None,terminated_at=None,trusted_certs=,updated_at=2025-12-15T08:30:01Z,user_data=None,user_id='1ba5fce347b64bfebf995f187193f205',uuid=39ff1bd9-6f6b-44c8-bbec-a1fd9d196359,vcpu_model=,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "03ef8889-3216-43fb-8a52-4be17a956ce1", "address": "fa:16:3e:74:df:7c", "network": {"id": "befb7a72-17a9-4bcb-b561-84b8f626685a", "bridge": "br-int", "label": "private", "subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.201", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.20", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.0.3"}}], "meta": {"injected": 
false, "tenant_id": "c785bf23f53946bc99867d8832a50266", "mtu": 1292, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap03ef8889-32", "ovs_interfaceid": "03ef8889-3216-43fb-8a52-4be17a956ce1", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m Dec 15 04:44:58 localhost nova_compute[286344]: 2025-12-15 09:44:58.174 286348 DEBUG nova.network.os_vif_util [None req-ed090d16-496a-473a-be36-97cf3cf6502a - - - - - -] Converting VIF {"id": "03ef8889-3216-43fb-8a52-4be17a956ce1", "address": "fa:16:3e:74:df:7c", "network": {"id": "befb7a72-17a9-4bcb-b561-84b8f626685a", "bridge": "br-int", "label": "private", "subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.201", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.20", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.0.3"}}], "meta": {"injected": false, "tenant_id": "c785bf23f53946bc99867d8832a50266", "mtu": 1292, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap03ef8889-32", "ovs_interfaceid": "03ef8889-3216-43fb-8a52-4be17a956ce1", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m Dec 15 04:44:58 localhost nova_compute[286344]: 2025-12-15 
09:44:58.175 286348 DEBUG nova.network.os_vif_util [None req-ed090d16-496a-473a-be36-97cf3cf6502a - - - - - -] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:74:df:7c,bridge_name='br-int',has_traffic_filtering=True,id=03ef8889-3216-43fb-8a52-4be17a956ce1,network=Network(befb7a72-17a9-4bcb-b561-84b8f626685a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap03ef8889-32') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m Dec 15 04:44:58 localhost nova_compute[286344]: 2025-12-15 09:44:58.176 286348 DEBUG os_vif [None req-ed090d16-496a-473a-be36-97cf3cf6502a - - - - - -] Plugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:74:df:7c,bridge_name='br-int',has_traffic_filtering=True,id=03ef8889-3216-43fb-8a52-4be17a956ce1,network=Network(befb7a72-17a9-4bcb-b561-84b8f626685a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap03ef8889-32') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m Dec 15 04:44:58 localhost nova_compute[286344]: 2025-12-15 09:44:58.258 286348 DEBUG ovsdbapp.backend.ovs_idl [None req-ed090d16-496a-473a-be36-97cf3cf6502a - - - - - -] Created schema index Interface.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106#033[00m Dec 15 04:44:58 localhost nova_compute[286344]: 2025-12-15 09:44:58.259 286348 DEBUG ovsdbapp.backend.ovs_idl [None req-ed090d16-496a-473a-be36-97cf3cf6502a - - - - - -] Created schema index Port.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106#033[00m Dec 15 04:44:58 localhost nova_compute[286344]: 2025-12-15 09:44:58.259 286348 DEBUG ovsdbapp.backend.ovs_idl [None req-ed090d16-496a-473a-be36-97cf3cf6502a - - - - - -] Created schema index Bridge.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106#033[00m Dec 15 04:44:58 localhost 
nova_compute[286344]: 2025-12-15 09:44:58.259 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [None req-ed090d16-496a-473a-be36-97cf3cf6502a - - - - - -] tcp:127.0.0.1:6640: entering CONNECTING _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Dec 15 04:44:58 localhost nova_compute[286344]: 2025-12-15 09:44:58.259 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [None req-ed090d16-496a-473a-be36-97cf3cf6502a - - - - - -] [POLLOUT] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 04:44:58 localhost nova_compute[286344]: 2025-12-15 09:44:58.260 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [None req-ed090d16-496a-473a-be36-97cf3cf6502a - - - - - -] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Dec 15 04:44:58 localhost nova_compute[286344]: 2025-12-15 09:44:58.260 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [None req-ed090d16-496a-473a-be36-97cf3cf6502a - - - - - -] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 04:44:58 localhost nova_compute[286344]: 2025-12-15 09:44:58.261 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [None req-ed090d16-496a-473a-be36-97cf3cf6502a - - - - - -] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 04:44:58 localhost nova_compute[286344]: 2025-12-15 09:44:58.264 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [None req-ed090d16-496a-473a-be36-97cf3cf6502a - - - - - -] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 04:44:58 localhost nova_compute[286344]: 2025-12-15 09:44:58.279 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 04:44:58 localhost nova_compute[286344]: 2025-12-15 09:44:58.279 286348 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): 
AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m Dec 15 04:44:58 localhost nova_compute[286344]: 2025-12-15 09:44:58.280 286348 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m Dec 15 04:44:58 localhost nova_compute[286344]: 2025-12-15 09:44:58.281 286348 INFO oslo.privsep.daemon [None req-ed090d16-496a-473a-be36-97cf3cf6502a - - - - - -] Running privsep helper: ['sudo', 'nova-rootwrap', '/etc/nova/rootwrap.conf', 'privsep-helper', '--config-file', '/etc/nova/nova.conf', '--config-file', '/etc/nova/nova-compute.conf', '--config-dir', '/etc/nova/nova.conf.d', '--privsep_context', 'vif_plug_ovs.privsep.vif_plug', '--privsep_sock_path', '/tmp/tmp3s3d6h5r/privsep.sock']#033[00m Dec 15 04:44:58 localhost nova_compute[286344]: 2025-12-15 09:44:58.740 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 04:44:58 localhost nova_compute[286344]: 2025-12-15 09:44:58.842 286348 INFO oslo.privsep.daemon [None req-ed090d16-496a-473a-be36-97cf3cf6502a - - - - - -] Spawned new privsep daemon via rootwrap#033[00m Dec 15 04:44:58 localhost nova_compute[286344]: 2025-12-15 09:44:58.736 286618 INFO oslo.privsep.daemon [-] privsep daemon starting#033[00m Dec 15 04:44:58 localhost nova_compute[286344]: 2025-12-15 09:44:58.744 286618 INFO oslo.privsep.daemon [-] privsep process running with uid/gid: 0/0#033[00m Dec 15 04:44:58 localhost nova_compute[286344]: 2025-12-15 09:44:58.747 286618 INFO oslo.privsep.daemon [-] privsep process running with capabilities (eff/prm/inh): CAP_DAC_OVERRIDE|CAP_NET_ADMIN/CAP_DAC_OVERRIDE|CAP_NET_ADMIN/none#033[00m Dec 15 04:44:58 localhost nova_compute[286344]: 2025-12-15 09:44:58.747 286618 INFO oslo.privsep.daemon 
[-] privsep daemon running as pid 286618#033[00m Dec 15 04:44:59 localhost nova_compute[286344]: 2025-12-15 09:44:59.102 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 04:44:59 localhost nova_compute[286344]: 2025-12-15 09:44:59.103 286348 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap03ef8889-32, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m Dec 15 04:44:59 localhost nova_compute[286344]: 2025-12-15 09:44:59.103 286348 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap03ef8889-32, col_values=(('external_ids', {'iface-id': '03ef8889-3216-43fb-8a52-4be17a956ce1', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:74:df:7c', 'vm-uuid': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m Dec 15 04:44:59 localhost nova_compute[286344]: 2025-12-15 09:44:59.104 286348 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m Dec 15 04:44:59 localhost nova_compute[286344]: 2025-12-15 09:44:59.105 286348 INFO os_vif [None req-ed090d16-496a-473a-be36-97cf3cf6502a - - - - - -] Successfully plugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:74:df:7c,bridge_name='br-int',has_traffic_filtering=True,id=03ef8889-3216-43fb-8a52-4be17a956ce1,network=Network(befb7a72-17a9-4bcb-b561-84b8f626685a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap03ef8889-32')#033[00m Dec 15 04:44:59 localhost nova_compute[286344]: 2025-12-15 09:44:59.106 286348 DEBUG nova.compute.manager [None req-ed090d16-496a-473a-be36-97cf3cf6502a - 
- - - - -] [instance: 39ff1bd9-6f6b-44c8-bbec-a1fd9d196359] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m Dec 15 04:44:59 localhost nova_compute[286344]: 2025-12-15 09:44:59.109 286348 DEBUG nova.compute.manager [None req-ed090d16-496a-473a-be36-97cf3cf6502a - - - - - -] [instance: 39ff1bd9-6f6b-44c8-bbec-a1fd9d196359] Current state is 1, state in DB is 1. _init_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:1304#033[00m Dec 15 04:44:59 localhost nova_compute[286344]: 2025-12-15 09:44:59.109 286348 INFO nova.compute.manager [None req-ed090d16-496a-473a-be36-97cf3cf6502a - - - - - -] Looking for unclaimed instances stuck in BUILDING status for nodes managed by this host#033[00m Dec 15 04:44:59 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:fc:1f:13 MACDST=fa:16:3e:b3:bb:e3 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=714 DF PROTO=TCP SPT=32832 DPT=9102 SEQ=3208699565 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A83A863250000000001030307) Dec 15 04:44:59 localhost nova_compute[286344]: 2025-12-15 09:44:59.230 286348 DEBUG oslo_concurrency.lockutils [None req-ed090d16-496a-473a-be36-97cf3cf6502a - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Dec 15 04:44:59 localhost nova_compute[286344]: 2025-12-15 09:44:59.230 286348 DEBUG oslo_concurrency.lockutils [None req-ed090d16-496a-473a-be36-97cf3cf6502a - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Dec 15 04:44:59 localhost nova_compute[286344]: 2025-12-15 09:44:59.231 286348 DEBUG oslo_concurrency.lockutils [None req-ed090d16-496a-473a-be36-97cf3cf6502a - 
- - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Dec 15 04:44:59 localhost nova_compute[286344]: 2025-12-15 09:44:59.231 286348 DEBUG nova.compute.resource_tracker [None req-ed090d16-496a-473a-be36-97cf3cf6502a - - - - - -] Auditing locally available compute resources for np0005559462.localdomain (node: np0005559462.localdomain) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m Dec 15 04:44:59 localhost nova_compute[286344]: 2025-12-15 09:44:59.232 286348 DEBUG oslo_concurrency.processutils [None req-ed090d16-496a-473a-be36-97cf3cf6502a - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m Dec 15 04:44:59 localhost nova_compute[286344]: 2025-12-15 09:44:59.688 286348 DEBUG oslo_concurrency.processutils [None req-ed090d16-496a-473a-be36-97cf3cf6502a - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.456s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m Dec 15 04:44:59 localhost nova_compute[286344]: 2025-12-15 09:44:59.749 286348 DEBUG nova.virt.libvirt.driver [None req-ed090d16-496a-473a-be36-97cf3cf6502a - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m Dec 15 04:44:59 localhost nova_compute[286344]: 2025-12-15 09:44:59.749 286348 DEBUG nova.virt.libvirt.driver [None req-ed090d16-496a-473a-be36-97cf3cf6502a - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m 
Dec 15 04:44:59 localhost nova_compute[286344]: 2025-12-15 09:44:59.966 286348 WARNING nova.virt.libvirt.driver [None req-ed090d16-496a-473a-be36-97cf3cf6502a - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m Dec 15 04:44:59 localhost nova_compute[286344]: 2025-12-15 09:44:59.968 286348 DEBUG nova.compute.resource_tracker [None req-ed090d16-496a-473a-be36-97cf3cf6502a - - - - - -] Hypervisor/Node resource view: name=np0005559462.localdomain free_ram=12163MB free_disk=41.83720779418945GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": 
"pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m Dec 15 04:44:59 localhost nova_compute[286344]: 2025-12-15 09:44:59.968 286348 DEBUG oslo_concurrency.lockutils [None req-ed090d16-496a-473a-be36-97cf3cf6502a - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Dec 15 04:44:59 localhost nova_compute[286344]: 2025-12-15 09:44:59.969 286348 DEBUG oslo_concurrency.lockutils [None req-ed090d16-496a-473a-be36-97cf3cf6502a - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Dec 15 04:45:00 localhost nova_compute[286344]: 2025-12-15 09:45:00.154 286348 DEBUG nova.compute.resource_tracker [None req-ed090d16-496a-473a-be36-97cf3cf6502a - - - - - -] Instance 39ff1bd9-6f6b-44c8-bbec-a1fd9d196359 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 2, 'MEMORY_MB': 512, 'VCPU': 1}}. 
_remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m Dec 15 04:45:00 localhost nova_compute[286344]: 2025-12-15 09:45:00.155 286348 DEBUG nova.compute.resource_tracker [None req-ed090d16-496a-473a-be36-97cf3cf6502a - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m Dec 15 04:45:00 localhost nova_compute[286344]: 2025-12-15 09:45:00.155 286348 DEBUG nova.compute.resource_tracker [None req-ed090d16-496a-473a-be36-97cf3cf6502a - - - - - -] Final resource view: name=np0005559462.localdomain phys_ram=15738MB used_ram=1024MB phys_disk=41GB used_disk=2GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m Dec 15 04:45:00 localhost nova_compute[286344]: 2025-12-15 09:45:00.231 286348 DEBUG nova.scheduler.client.report [None req-ed090d16-496a-473a-be36-97cf3cf6502a - - - - - -] Refreshing inventories for resource provider 26c8956b-6742-4951-b566-971b9bbe323b _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804#033[00m Dec 15 04:45:00 localhost nova_compute[286344]: 2025-12-15 09:45:00.248 286348 DEBUG nova.scheduler.client.report [None req-ed090d16-496a-473a-be36-97cf3cf6502a - - - - - -] Updating ProviderTree inventory for provider 26c8956b-6742-4951-b566-971b9bbe323b from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768#033[00m Dec 15 
04:45:00 localhost nova_compute[286344]: 2025-12-15 09:45:00.249 286348 DEBUG nova.compute.provider_tree [None req-ed090d16-496a-473a-be36-97cf3cf6502a - - - - - -] Updating inventory in ProviderTree for provider 26c8956b-6742-4951-b566-971b9bbe323b with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176#033[00m Dec 15 04:45:00 localhost nova_compute[286344]: 2025-12-15 09:45:00.263 286348 DEBUG nova.scheduler.client.report [None req-ed090d16-496a-473a-be36-97cf3cf6502a - - - - - -] Refreshing aggregate associations for resource provider 26c8956b-6742-4951-b566-971b9bbe323b, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813#033[00m Dec 15 04:45:00 localhost nova_compute[286344]: 2025-12-15 09:45:00.284 286348 DEBUG nova.scheduler.client.report [None req-ed090d16-496a-473a-be36-97cf3cf6502a - - - - - -] Refreshing trait associations for resource provider 26c8956b-6742-4951-b566-971b9bbe323b, traits: 
COMPUTE_NET_VIF_MODEL_RTL8139,COMPUTE_STORAGE_BUS_IDE,COMPUTE_DEVICE_TAGGING,COMPUTE_NET_VIF_MODEL_PCNET,COMPUTE_NODE,COMPUTE_STORAGE_BUS_USB,COMPUTE_IMAGE_TYPE_AKI,COMPUTE_NET_ATTACH_INTERFACE,HW_CPU_X86_CLMUL,HW_CPU_X86_ABM,HW_CPU_X86_SSE2,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_NET_VIF_MODEL_E1000,HW_CPU_X86_SSE41,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_NET_VIF_MODEL_VIRTIO,HW_CPU_X86_BMI2,HW_CPU_X86_SSE42,HW_CPU_X86_FMA3,COMPUTE_SECURITY_UEFI_SECURE_BOOT,COMPUTE_IMAGE_TYPE_RAW,HW_CPU_X86_BMI,COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_ACCELERATORS,COMPUTE_IMAGE_TYPE_ARI,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_STORAGE_BUS_FDC,COMPUTE_VIOMMU_MODEL_INTEL,COMPUTE_GRAPHICS_MODEL_NONE,HW_CPU_X86_AMD_SVM,COMPUTE_NET_VIF_MODEL_E1000E,HW_CPU_X86_SSE,HW_CPU_X86_AVX,HW_CPU_X86_SVM,COMPUTE_SECURITY_TPM_2_0,COMPUTE_TRUSTED_CERTS,COMPUTE_RESCUE_BFV,HW_CPU_X86_AVX2,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_VIOMMU_MODEL_AUTO,COMPUTE_VIOMMU_MODEL_VIRTIO,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_NET_VIF_MODEL_NE2K_PCI,HW_CPU_X86_SSSE3,COMPUTE_IMAGE_TYPE_AMI,HW_CPU_X86_AESNI,HW_CPU_X86_SSE4A,HW_CPU_X86_MMX,COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_VOLUME_EXTEND,HW_CPU_X86_F16C,HW_CPU_X86_SHA,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_STORAGE_BUS_VIRTIO,COMPUTE_SECURITY_TPM_1_2,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,COMPUTE_STORAGE_BUS_SATA _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825#033[00m Dec 15 04:45:00 localhost nova_compute[286344]: 2025-12-15 09:45:00.312 286348 DEBUG oslo_concurrency.processutils [None req-ed090d16-496a-473a-be36-97cf3cf6502a - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m Dec 15 04:45:01 localhost nova_compute[286344]: 2025-12-15 09:45:01.583 286348 DEBUG oslo_concurrency.processutils [None 
req-ed090d16-496a-473a-be36-97cf3cf6502a - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 1.271s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m Dec 15 04:45:01 localhost nova_compute[286344]: 2025-12-15 09:45:01.588 286348 DEBUG nova.virt.libvirt.host [None req-ed090d16-496a-473a-be36-97cf3cf6502a - - - - - -] /sys/module/kvm_amd/parameters/sev contains [N Dec 15 04:45:01 localhost nova_compute[286344]: ] _kernel_supports_amd_sev /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1803#033[00m Dec 15 04:45:01 localhost nova_compute[286344]: 2025-12-15 09:45:01.588 286348 INFO nova.virt.libvirt.host [None req-ed090d16-496a-473a-be36-97cf3cf6502a - - - - - -] kernel doesn't support AMD SEV#033[00m Dec 15 04:45:01 localhost nova_compute[286344]: 2025-12-15 09:45:01.589 286348 DEBUG nova.compute.provider_tree [None req-ed090d16-496a-473a-be36-97cf3cf6502a - - - - - -] Inventory has not changed in ProviderTree for provider: 26c8956b-6742-4951-b566-971b9bbe323b update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m Dec 15 04:45:01 localhost nova_compute[286344]: 2025-12-15 09:45:01.590 286348 DEBUG nova.virt.libvirt.driver [None req-ed090d16-496a-473a-be36-97cf3cf6502a - - - - - -] CPU mode 'host-model' models '' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m Dec 15 04:45:01 localhost nova_compute[286344]: 2025-12-15 09:45:01.606 286348 DEBUG nova.scheduler.client.report [None req-ed090d16-496a-473a-be36-97cf3cf6502a - - - - - -] Inventory has not changed for provider 26c8956b-6742-4951-b566-971b9bbe323b based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 
1.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m Dec 15 04:45:01 localhost nova_compute[286344]: 2025-12-15 09:45:01.626 286348 DEBUG nova.compute.resource_tracker [None req-ed090d16-496a-473a-be36-97cf3cf6502a - - - - - -] Compute_service record updated for np0005559462.localdomain:np0005559462.localdomain _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m Dec 15 04:45:01 localhost nova_compute[286344]: 2025-12-15 09:45:01.626 286348 DEBUG oslo_concurrency.lockutils [None req-ed090d16-496a-473a-be36-97cf3cf6502a - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 1.657s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Dec 15 04:45:01 localhost nova_compute[286344]: 2025-12-15 09:45:01.626 286348 DEBUG nova.service [None req-ed090d16-496a-473a-be36-97cf3cf6502a - - - - - -] Creating RPC server for service compute start /usr/lib/python3.9/site-packages/nova/service.py:182#033[00m Dec 15 04:45:01 localhost nova_compute[286344]: 2025-12-15 09:45:01.644 286348 DEBUG nova.service [None req-ed090d16-496a-473a-be36-97cf3cf6502a - - - - - -] Join ServiceGroup membership for this service compute start /usr/lib/python3.9/site-packages/nova/service.py:199#033[00m Dec 15 04:45:01 localhost nova_compute[286344]: 2025-12-15 09:45:01.644 286348 DEBUG nova.servicegroup.drivers.db [None req-ed090d16-496a-473a-be36-97cf3cf6502a - - - - - -] DB_Driver: join new ServiceGroup member np0005559462.localdomain to the compute group, service = join /usr/lib/python3.9/site-packages/nova/servicegroup/drivers/db.py:44#033[00m Dec 15 04:45:01 localhost podman[243449]: time="2025-12-15T09:45:01Z" level=info msg="List containers: received `last` parameter - 
overwriting `limit`" Dec 15 04:45:01 localhost podman[243449]: @ - - [15/Dec/2025:09:45:01 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 149352 "" "Go-http-client/1.1" Dec 15 04:45:01 localhost podman[243449]: @ - - [15/Dec/2025:09:45:01 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 17220 "" "Go-http-client/1.1" Dec 15 04:45:01 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:fc:1f:13 MACDST=fa:16:3e:b3:bb:e3 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=23885 DF PROTO=TCP SPT=49422 DPT=9102 SEQ=2260569640 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A83A86E250000000001030307) Dec 15 04:45:03 localhost nova_compute[286344]: 2025-12-15 09:45:03.280 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 04:45:03 localhost nova_compute[286344]: 2025-12-15 09:45:03.744 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 04:45:04 localhost openstack_network_exporter[246484]: ERROR 09:45:04 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server Dec 15 04:45:04 localhost openstack_network_exporter[246484]: ERROR 09:45:04 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd Dec 15 04:45:04 localhost openstack_network_exporter[246484]: ERROR 09:45:04 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd Dec 15 04:45:04 localhost openstack_network_exporter[246484]: ERROR 09:45:04 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath Dec 15 04:45:04 localhost openstack_network_exporter[246484]: Dec 15 04:45:04 localhost 
openstack_network_exporter[246484]: ERROR 09:45:04 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath Dec 15 04:45:04 localhost openstack_network_exporter[246484]: Dec 15 04:45:05 localhost sshd[286666]: main: sshd: ssh-rsa algorithm is disabled Dec 15 04:45:05 localhost ceph-osd[31375]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS ------- Dec 15 04:45:05 localhost ceph-osd[31375]: rocksdb: [db/db_impl/db_impl.cc:1111] #012** DB Stats **#012Uptime(secs): 7200.1 total, 600.0 interval#012Cumulative writes: 4815 writes, 21K keys, 4815 commit groups, 1.0 writes per commit group, ingest: 0.02 GB, 0.00 MB/s#012Cumulative WAL: 4815 writes, 628 syncs, 7.67 writes per sync, written: 0.02 GB, 0.00 MB/s#012Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent#012Interval writes: 0 writes, 0 keys, 0 commit groups, 0.0 writes per commit group, ingest: 0.00 MB, 0.00 MB/s#012Interval WAL: 0 writes, 0 syncs, 0.00 writes per sync, written: 0.00 GB, 0.00 MB/s#012Interval stall: 00:00:0.000 H:M:S, 0.0 percent Dec 15 04:45:06 localhost systemd[1]: Started /usr/bin/podman healthcheck run 67ee72fcd99c896f79e8807764b1d0f04e7d45927d06e306c0986394e8ed42a0. Dec 15 04:45:06 localhost systemd[1]: Started /usr/bin/podman healthcheck run 9f0f481c937606298203797a71924b7614a5d9e3f4a8afd837cfa5b9602aedeb. 
Dec 15 04:45:06 localhost podman[286669]: 2025-12-15 09:45:06.765549662 +0000 UTC m=+0.092473364 container health_status 9f0f481c937606298203797a71924b7614a5d9e3f4a8afd837cfa5b9602aedeb (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, container_name=multipathd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=multipathd, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, managed_by=edpm_ansible) Dec 15 04:45:06 localhost podman[286668]: 2025-12-15 09:45:06.80307451 +0000 UTC m=+0.129783906 container health_status 
67ee72fcd99c896f79e8807764b1d0f04e7d45927d06e306c0986394e8ed42a0 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible) Dec 15 04:45:06 localhost podman[286668]: 2025-12-15 09:45:06.813236863 +0000 UTC m=+0.139946259 container exec_died 67ee72fcd99c896f79e8807764b1d0f04e7d45927d06e306c0986394e8ed42a0 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, managed_by=edpm_ansible, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\\\.service', '--no-collector.dmi', '--no-collector.entropy', 
'--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors ) Dec 15 04:45:06 localhost systemd[1]: 67ee72fcd99c896f79e8807764b1d0f04e7d45927d06e306c0986394e8ed42a0.service: Deactivated successfully. 
Dec 15 04:45:06 localhost podman[286669]: 2025-12-15 09:45:06.855955177 +0000 UTC m=+0.182878879 container exec_died 9f0f481c937606298203797a71924b7614a5d9e3f4a8afd837cfa5b9602aedeb (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, container_name=multipathd, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=multipathd, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb) Dec 15 04:45:06 localhost systemd[1]: 9f0f481c937606298203797a71924b7614a5d9e3f4a8afd837cfa5b9602aedeb.service: Deactivated successfully. 
Dec 15 04:45:07 localhost systemd[1]: Started /usr/bin/podman healthcheck run b3d8483ade539500e228c2af9c0c3e647febb5ec7903610a83aea32631007d4a. Dec 15 04:45:07 localhost podman[286710]: 2025-12-15 09:45:07.561598951 +0000 UTC m=+0.080109547 container health_status b3d8483ade539500e228c2af9c0c3e647febb5ec7903610a83aea32631007d4a (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ceilometer_agent_compute, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '07f3af15d79276418f54a8dde2fc251064527c3571430e14af77aaa2c01f1193-cb1e9478634a00c8fb5ad9cd7fcd38c7b2a1a85187dfaa618bc715c24c2b15fb'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.schema-version=1.0) Dec 15 04:45:07 localhost podman[286710]: 2025-12-15 09:45:07.600459616 +0000 UTC m=+0.118970202 container exec_died b3d8483ade539500e228c2af9c0c3e647febb5ec7903610a83aea32631007d4a (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '07f3af15d79276418f54a8dde2fc251064527c3571430e14af77aaa2c01f1193-cb1e9478634a00c8fb5ad9cd7fcd38c7b2a1a85187dfaa618bc715c24c2b15fb'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ceilometer_agent_compute, container_name=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.schema-version=1.0) Dec 15 04:45:07 localhost systemd[1]: b3d8483ade539500e228c2af9c0c3e647febb5ec7903610a83aea32631007d4a.service: Deactivated successfully. Dec 15 04:45:08 localhost nova_compute[286344]: 2025-12-15 09:45:08.322 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 04:45:08 localhost nova_compute[286344]: 2025-12-15 09:45:08.747 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 04:45:10 localhost ceph-osd[32311]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS ------- Dec 15 04:45:10 localhost ceph-osd[32311]: rocksdb: [db/db_impl/db_impl.cc:1111] #012** DB Stats **#012Uptime(secs): 7200.2 total, 600.0 interval#012Cumulative writes: 5745 writes, 25K keys, 5745 commit groups, 1.0 writes per commit group, ingest: 0.02 GB, 0.00 MB/s#012Cumulative WAL: 5745 writes, 763 syncs, 7.53 writes per sync, written: 0.02 GB, 0.00 MB/s#012Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent#012Interval writes: 0 writes, 0 keys, 0 commit groups, 0.0 writes per commit group, ingest: 0.00 MB, 0.00 MB/s#012Interval WAL: 0 writes, 0 syncs, 0.00 writes per sync, written: 0.00 GB, 0.00 MB/s#012Interval stall: 00:00:0.000 H:M:S, 0.0 percent Dec 15 04:45:10 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:fc:1f:13 MACDST=fa:16:3e:b3:bb:e3 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=23886 DF PROTO=TCP SPT=49422 DPT=9102 SEQ=2260569640 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A83A88F250000000001030307) Dec 15 04:45:12 localhost systemd[1]: 
Started /usr/bin/podman healthcheck run 730c50020a7d9e2da21d2462d40af20ac303e43f3491f692cbbd87f432ba3e09. Dec 15 04:45:12 localhost podman[286871]: 2025-12-15 09:45:12.716244437 +0000 UTC m=+0.080416556 container health_status 730c50020a7d9e2da21d2462d40af20ac303e43f3491f692cbbd87f432ba3e09 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, config_id=openstack_network_exporter, name=ubi9-minimal, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=openstack_network_exporter, com.redhat.component=ubi9-minimal-container, release=1755695350, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., architecture=x86_64, io.openshift.expose-services=, vcs-type=git, vendor=Red Hat, Inc., distribution-scope=public, io.openshift.tags=minimal rhel9, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'cb1e9478634a00c8fb5ad9cd7fcd38c7b2a1a85187dfaa618bc715c24c2b15fb'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.description=The Universal Base 
Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., build-date=2025-08-20T13:12:41, url=https://catalog.redhat.com/en/search?searchType=containers, io.buildah.version=1.33.7, managed_by=edpm_ansible, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, version=9.6) Dec 15 04:45:12 localhost podman[286871]: 2025-12-15 09:45:12.733468709 +0000 UTC m=+0.097640798 container exec_died 730c50020a7d9e2da21d2462d40af20ac303e43f3491f692cbbd87f432ba3e09 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'cb1e9478634a00c8fb5ad9cd7fcd38c7b2a1a85187dfaa618bc715c24c2b15fb'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': 
['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vendor=Red Hat, Inc., io.openshift.expose-services=, vcs-type=git, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., config_id=openstack_network_exporter, container_name=openstack_network_exporter, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=9.6, com.redhat.component=ubi9-minimal-container, managed_by=edpm_ansible, name=ubi9-minimal, distribution-scope=public, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., build-date=2025-08-20T13:12:41, url=https://catalog.redhat.com/en/search?searchType=containers, io.buildah.version=1.33.7, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.openshift.tags=minimal rhel9, release=1755695350, architecture=x86_64) Dec 15 04:45:12 localhost systemd[1]: 730c50020a7d9e2da21d2462d40af20ac303e43f3491f692cbbd87f432ba3e09.service: Deactivated successfully. 
Dec 15 04:45:13 localhost nova_compute[286344]: 2025-12-15 09:45:13.326 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 04:45:13 localhost systemd[1]: Started /usr/bin/podman healthcheck run ac2eae30a2a7f971398af42ea67b3ed799d4b3341f7688ebcd73f7ca7f66c2a8. Dec 15 04:45:13 localhost systemd[1]: tmp-crun.UlaoQs.mount: Deactivated successfully. Dec 15 04:45:13 localhost nova_compute[286344]: 2025-12-15 09:45:13.751 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 04:45:13 localhost podman[286891]: 2025-12-15 09:45:13.760192621 +0000 UTC m=+0.092288079 container health_status ac2eae30a2a7f971398af42ea67b3ed799d4b3341f7688ebcd73f7ca7f66c2a8 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, container_name=ovn_controller, org.label-schema.vendor=CentOS, config_id=ovn_controller, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '07f3af15d79276418f54a8dde2fc251064527c3571430e14af77aaa2c01f1193'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team) Dec 15 04:45:13 localhost podman[286891]: 2025-12-15 09:45:13.817788529 +0000 UTC m=+0.149883977 container exec_died ac2eae30a2a7f971398af42ea67b3ed799d4b3341f7688ebcd73f7ca7f66c2a8 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '07f3af15d79276418f54a8dde2fc251064527c3571430e14af77aaa2c01f1193'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, org.label-schema.license=GPLv2, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202) Dec 15 04:45:13 localhost systemd[1]: ac2eae30a2a7f971398af42ea67b3ed799d4b3341f7688ebcd73f7ca7f66c2a8.service: Deactivated successfully. 
Dec 15 04:45:15 localhost ovn_metadata_agent[160585]: 2025-12-15 09:45:15.669 160590 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=6, ssl=[], options={'arp_ns_explicit_output': 'true', 'fdb_removal_limit': '0', 'ignore_lsp_down': 'false', 'mac_binding_removal_limit': '0', 'mac_prefix': 'fe:17:e3', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': 'fe:55:2b:86:15:b5'}, ipsec=False) old=SB_Global(nb_cfg=5) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Dec 15 04:45:15 localhost nova_compute[286344]: 2025-12-15 09:45:15.670 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 04:45:15 localhost ovn_metadata_agent[160585]: 2025-12-15 09:45:15.671 160590 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 9 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m Dec 15 04:45:18 localhost nova_compute[286344]: 2025-12-15 09:45:18.362 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 04:45:18 localhost nova_compute[286344]: 2025-12-15 09:45:18.753 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 04:45:20 localhost systemd[1]: Started /usr/bin/podman healthcheck run 4d46c406a3855fd1c322e5d86c048188ea5865daa07ba23aaebacc7879668923. 
Dec 15 04:45:20 localhost podman[286919]: 2025-12-15 09:45:20.745016643 +0000 UTC m=+0.080371385 container health_status 4d46c406a3855fd1c322e5d86c048188ea5865daa07ba23aaebacc7879668923 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_metadata_agent, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '07f3af15d79276418f54a8dde2fc251064527c3571430e14af77aaa2c01f1193-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}) Dec 15 04:45:20 localhost 
podman[286919]: 2025-12-15 09:45:20.75422414 +0000 UTC m=+0.089578862 container exec_died 4d46c406a3855fd1c322e5d86c048188ea5865daa07ba23aaebacc7879668923 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251202, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '07f3af15d79276418f54a8dde2fc251064527c3571430e14af77aaa2c01f1193-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent) Dec 15 04:45:20 localhost systemd[1]: 
4d46c406a3855fd1c322e5d86c048188ea5865daa07ba23aaebacc7879668923.service: Deactivated successfully. Dec 15 04:45:23 localhost nova_compute[286344]: 2025-12-15 09:45:23.365 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 04:45:23 localhost nova_compute[286344]: 2025-12-15 09:45:23.756 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 04:45:24 localhost ovn_metadata_agent[160585]: 2025-12-15 09:45:24.673 160590 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=12d96d64-e862-4f68-81e5-8d9ec5d3a5e2, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '6'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m Dec 15 04:45:24 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:fc:1f:13 MACDST=fa:16:3e:b3:bb:e3 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=54628 DF PROTO=TCP SPT=54598 DPT=9102 SEQ=3163513306 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A83A8C7A20000000001030307) Dec 15 04:45:25 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:fc:1f:13 MACDST=fa:16:3e:b3:bb:e3 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=54629 DF PROTO=TCP SPT=54598 DPT=9102 SEQ=3163513306 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A83A8CBA60000000001030307) Dec 15 04:45:26 localhost nova_compute[286344]: 2025-12-15 09:45:26.646 286348 DEBUG oslo_service.periodic_task [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Running periodic task ComputeManager._sync_power_states run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Dec 15 04:45:26 localhost systemd[1]: 
Started /usr/bin/podman healthcheck run a4e94bd7aaee6c174c601060fb5f34559019ccea93fc397263ee3735731cfc1e. Dec 15 04:45:26 localhost nova_compute[286344]: 2025-12-15 09:45:26.671 286348 DEBUG nova.compute.manager [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Triggering sync for uuid 39ff1bd9-6f6b-44c8-bbec-a1fd9d196359 _sync_power_states /usr/lib/python3.9/site-packages/nova/compute/manager.py:10268#033[00m Dec 15 04:45:26 localhost nova_compute[286344]: 2025-12-15 09:45:26.672 286348 DEBUG oslo_concurrency.lockutils [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Acquiring lock "39ff1bd9-6f6b-44c8-bbec-a1fd9d196359" by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Dec 15 04:45:26 localhost nova_compute[286344]: 2025-12-15 09:45:26.673 286348 DEBUG oslo_concurrency.lockutils [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Lock "39ff1bd9-6f6b-44c8-bbec-a1fd9d196359" acquired by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Dec 15 04:45:26 localhost nova_compute[286344]: 2025-12-15 09:45:26.673 286348 DEBUG oslo_service.periodic_task [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Running periodic task ComputeManager._cleanup_running_deleted_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Dec 15 04:45:26 localhost nova_compute[286344]: 2025-12-15 09:45:26.730 286348 DEBUG oslo_concurrency.lockutils [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Lock "39ff1bd9-6f6b-44c8-bbec-a1fd9d196359" "released" by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" :: held 0.057s inner 
/usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Dec 15 04:45:26 localhost systemd[1]: tmp-crun.Z6ceS4.mount: Deactivated successfully. Dec 15 04:45:26 localhost podman[286936]: 2025-12-15 09:45:26.750265422 +0000 UTC m=+0.085776676 container health_status a4e94bd7aaee6c174c601060fb5f34559019ccea93fc397263ee3735731cfc1e (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible) Dec 15 04:45:26 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:fc:1f:13 MACDST=fa:16:3e:b3:bb:e3 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=23887 DF PROTO=TCP SPT=49422 DPT=9102 SEQ=2260569640 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A83A8CF250000000001030307) Dec 15 04:45:26 localhost podman[286936]: 2025-12-15 09:45:26.78421625 +0000 UTC m=+0.119727494 container exec_died a4e94bd7aaee6c174c601060fb5f34559019ccea93fc397263ee3735731cfc1e (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible, 
config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter) Dec 15 04:45:26 localhost systemd[1]: a4e94bd7aaee6c174c601060fb5f34559019ccea93fc397263ee3735731cfc1e.service: Deactivated successfully. Dec 15 04:45:27 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:fc:1f:13 MACDST=fa:16:3e:b3:bb:e3 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=54630 DF PROTO=TCP SPT=54598 DPT=9102 SEQ=3163513306 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A83A8D3A60000000001030307) Dec 15 04:45:28 localhost nova_compute[286344]: 2025-12-15 09:45:28.413 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 04:45:28 localhost nova_compute[286344]: 2025-12-15 09:45:28.760 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 04:45:28 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:fc:1f:13 MACDST=fa:16:3e:b3:bb:e3 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=52588 DF PROTO=TCP SPT=53566 DPT=9102 SEQ=4061031728 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A83A8D7250000000001030307) Dec 15 04:45:31 localhost nova_compute[286344]: 2025-12-15 09:45:31.636 286348 DEBUG nova.compute.manager [None 
req-f12c7f9a-a546-4422-8ed0-f49bd5a433e3 1ba5fce347b64bfebf995f187193f205 c785bf23f53946bc99867d8832a50266 - - default default] [instance: 39ff1bd9-6f6b-44c8-bbec-a1fd9d196359] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m Dec 15 04:45:31 localhost nova_compute[286344]: 2025-12-15 09:45:31.640 286348 INFO nova.compute.manager [None req-f12c7f9a-a546-4422-8ed0-f49bd5a433e3 1ba5fce347b64bfebf995f187193f205 c785bf23f53946bc99867d8832a50266 - - default default] [instance: 39ff1bd9-6f6b-44c8-bbec-a1fd9d196359] Retrieving diagnostics#033[00m Dec 15 04:45:31 localhost podman[243449]: time="2025-12-15T09:45:31Z" level=info msg="List containers: received `last` parameter - overwriting `limit`" Dec 15 04:45:31 localhost podman[243449]: @ - - [15/Dec/2025:09:45:31 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 149352 "" "Go-http-client/1.1" Dec 15 04:45:31 localhost podman[243449]: @ - - [15/Dec/2025:09:45:31 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 17224 "" "Go-http-client/1.1" Dec 15 04:45:31 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:fc:1f:13 MACDST=fa:16:3e:b3:bb:e3 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=54631 DF PROTO=TCP SPT=54598 DPT=9102 SEQ=3163513306 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A83A8E3660000000001030307) Dec 15 04:45:33 localhost nova_compute[286344]: 2025-12-15 09:45:33.417 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 04:45:33 localhost nova_compute[286344]: 2025-12-15 09:45:33.762 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 04:45:34 localhost openstack_network_exporter[246484]: ERROR 
09:45:34 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd Dec 15 04:45:34 localhost openstack_network_exporter[246484]: ERROR 09:45:34 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd Dec 15 04:45:34 localhost openstack_network_exporter[246484]: ERROR 09:45:34 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server Dec 15 04:45:34 localhost openstack_network_exporter[246484]: ERROR 09:45:34 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath Dec 15 04:45:34 localhost openstack_network_exporter[246484]: Dec 15 04:45:34 localhost openstack_network_exporter[246484]: ERROR 09:45:34 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath Dec 15 04:45:34 localhost openstack_network_exporter[246484]: Dec 15 04:45:37 localhost systemd[1]: Started /usr/bin/podman healthcheck run 67ee72fcd99c896f79e8807764b1d0f04e7d45927d06e306c0986394e8ed42a0. Dec 15 04:45:37 localhost systemd[1]: Started /usr/bin/podman healthcheck run 9f0f481c937606298203797a71924b7614a5d9e3f4a8afd837cfa5b9602aedeb. Dec 15 04:45:37 localhost systemd[1]: Started /usr/bin/podman healthcheck run b3d8483ade539500e228c2af9c0c3e647febb5ec7903610a83aea32631007d4a. 
Dec 15 04:45:37 localhost podman[286962]: 2025-12-15 09:45:37.760689581 +0000 UTC m=+0.090349554 container health_status 67ee72fcd99c896f79e8807764b1d0f04e7d45927d06e306c0986394e8ed42a0 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, container_name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter) Dec 15 04:45:37 localhost podman[286962]: 2025-12-15 09:45:37.773371656 +0000 UTC m=+0.103031659 container exec_died 67ee72fcd99c896f79e8807764b1d0f04e7d45927d06e306c0986394e8ed42a0 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible, config_data={'command': 
['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}) Dec 15 04:45:37 localhost systemd[1]: 67ee72fcd99c896f79e8807764b1d0f04e7d45927d06e306c0986394e8ed42a0.service: Deactivated successfully. 
Dec 15 04:45:37 localhost podman[286964]: 2025-12-15 09:45:37.874458939 +0000 UTC m=+0.195878491 container health_status b3d8483ade539500e228c2af9c0c3e647febb5ec7903610a83aea32631007d4a (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '07f3af15d79276418f54a8dde2fc251064527c3571430e14af77aaa2c01f1193-cb1e9478634a00c8fb5ad9cd7fcd38c7b2a1a85187dfaa618bc715c24c2b15fb'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, managed_by=edpm_ansible, tcib_managed=true, 
config_id=ceilometer_agent_compute) Dec 15 04:45:37 localhost podman[286964]: 2025-12-15 09:45:37.917596713 +0000 UTC m=+0.239016225 container exec_died b3d8483ade539500e228c2af9c0c3e647febb5ec7903610a83aea32631007d4a (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_id=ceilometer_agent_compute, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '07f3af15d79276418f54a8dde2fc251064527c3571430e14af77aaa2c01f1193-cb1e9478634a00c8fb5ad9cd7fcd38c7b2a1a85187dfaa618bc715c24c2b15fb'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, 
io.buildah.version=1.41.3, managed_by=edpm_ansible) Dec 15 04:45:37 localhost systemd[1]: b3d8483ade539500e228c2af9c0c3e647febb5ec7903610a83aea32631007d4a.service: Deactivated successfully. Dec 15 04:45:37 localhost podman[286963]: 2025-12-15 09:45:37.919127096 +0000 UTC m=+0.244499148 container health_status 9f0f481c937606298203797a71924b7614a5d9e3f4a8afd837cfa5b9602aedeb (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, config_id=multipathd, org.label-schema.schema-version=1.0, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.build-date=20251202, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, 
tcib_build_tag=c3923531bcda0b0811b2d5053f189beb) Dec 15 04:45:37 localhost nova_compute[286344]: 2025-12-15 09:45:37.971 286348 DEBUG oslo_concurrency.lockutils [None req-95854b29-0d11-4468-9b4b-db2ee9a7eb55 1ba5fce347b64bfebf995f187193f205 c785bf23f53946bc99867d8832a50266 - - default default] Acquiring lock "39ff1bd9-6f6b-44c8-bbec-a1fd9d196359" by "nova.compute.manager.ComputeManager.stop_instance..do_stop_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Dec 15 04:45:37 localhost nova_compute[286344]: 2025-12-15 09:45:37.972 286348 DEBUG oslo_concurrency.lockutils [None req-95854b29-0d11-4468-9b4b-db2ee9a7eb55 1ba5fce347b64bfebf995f187193f205 c785bf23f53946bc99867d8832a50266 - - default default] Lock "39ff1bd9-6f6b-44c8-bbec-a1fd9d196359" acquired by "nova.compute.manager.ComputeManager.stop_instance..do_stop_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Dec 15 04:45:37 localhost nova_compute[286344]: 2025-12-15 09:45:37.972 286348 DEBUG nova.compute.manager [None req-95854b29-0d11-4468-9b4b-db2ee9a7eb55 1ba5fce347b64bfebf995f187193f205 c785bf23f53946bc99867d8832a50266 - - default default] [instance: 39ff1bd9-6f6b-44c8-bbec-a1fd9d196359] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m Dec 15 04:45:37 localhost nova_compute[286344]: 2025-12-15 09:45:37.977 286348 DEBUG nova.compute.manager [None req-95854b29-0d11-4468-9b4b-db2ee9a7eb55 1ba5fce347b64bfebf995f187193f205 c785bf23f53946bc99867d8832a50266 - - default default] [instance: 39ff1bd9-6f6b-44c8-bbec-a1fd9d196359] Stopping instance; current vm_state: active, current task_state: powering-off, current DB power_state: 1, current VM power_state: 1 do_stop_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3338#033[00m Dec 15 04:45:37 localhost nova_compute[286344]: 2025-12-15 09:45:37.983 286348 DEBUG nova.objects.instance [None 
req-95854b29-0d11-4468-9b4b-db2ee9a7eb55 1ba5fce347b64bfebf995f187193f205 c785bf23f53946bc99867d8832a50266 - - default default] Lazy-loading 'flavor' on Instance uuid 39ff1bd9-6f6b-44c8-bbec-a1fd9d196359 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m Dec 15 04:45:38 localhost podman[286963]: 2025-12-15 09:45:38.00452259 +0000 UTC m=+0.329894692 container exec_died 9f0f481c937606298203797a71924b7614a5d9e3f4a8afd837cfa5b9602aedeb (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, io.buildah.version=1.41.3, managed_by=edpm_ansible, tcib_managed=true, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, config_id=multipathd, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, maintainer=OpenStack 
Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image) Dec 15 04:45:38 localhost systemd[1]: 9f0f481c937606298203797a71924b7614a5d9e3f4a8afd837cfa5b9602aedeb.service: Deactivated successfully. Dec 15 04:45:38 localhost nova_compute[286344]: 2025-12-15 09:45:38.026 286348 DEBUG nova.virt.libvirt.driver [None req-95854b29-0d11-4468-9b4b-db2ee9a7eb55 1ba5fce347b64bfebf995f187193f205 c785bf23f53946bc99867d8832a50266 - - default default] [instance: 39ff1bd9-6f6b-44c8-bbec-a1fd9d196359] Shutting down instance from state 1 _clean_shutdown /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4071#033[00m Dec 15 04:45:38 localhost nova_compute[286344]: 2025-12-15 09:45:38.453 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 04:45:38 localhost nova_compute[286344]: 2025-12-15 09:45:38.765 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 04:45:40 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:fc:1f:13 MACDST=fa:16:3e:b3:bb:e3 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=54632 DF PROTO=TCP SPT=54598 DPT=9102 SEQ=3163513306 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A83A903250000000001030307) Dec 15 04:45:40 localhost kernel: device tap03ef8889-32 left promiscuous mode Dec 15 04:45:40 localhost NetworkManager[5963]: [1765791940.5016] device (tap03ef8889-32): state change: disconnected -> unmanaged (reason 'unmanaged', sys-iface-state: 'removed') Dec 15 04:45:40 localhost ovn_controller[154603]: 2025-12-15T09:45:40Z|00049|binding|INFO|Releasing lport 03ef8889-3216-43fb-8a52-4be17a956ce1 from this chassis (sb_readonly=0) Dec 15 04:45:40 localhost ovn_controller[154603]: 2025-12-15T09:45:40Z|00050|binding|INFO|Setting lport 03ef8889-3216-43fb-8a52-4be17a956ce1 
down in Southbound Dec 15 04:45:40 localhost ovn_controller[154603]: 2025-12-15T09:45:40Z|00051|binding|INFO|Removing iface tap03ef8889-32 ovn-installed in OVS Dec 15 04:45:40 localhost nova_compute[286344]: 2025-12-15 09:45:40.517 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 04:45:40 localhost ovn_metadata_agent[160585]: 2025-12-15 09:45:40.523 160590 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:74:df:7c 192.168.0.201'], port_security=['fa:16:3e:74:df:7c 192.168.0.201'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005559462.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '192.168.0.201/24', 'neutron:device_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359', 'neutron:device_owner': 'compute:nova', 'neutron:host_id': 'np0005559462.localdomain', 'neutron:mtu': '', 'neutron:network_name': 'neutron-befb7a72-17a9-4bcb-b561-84b8f626685a', 'neutron:port_capabilities': '', 'neutron:port_fip': '192.168.122.20', 'neutron:port_name': '', 'neutron:project_id': 'c785bf23f53946bc99867d8832a50266', 'neutron:revision_number': '7', 'neutron:security_group_ids': 'adeef2d9-3b61-4849-9b44-ac3bff90d0cd fa685b85-67a9-4a56-ba21-4767a05c4811', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=56a5044a-5384-46d9-b45d-bcd5602105ab, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[], logical_port=03ef8889-3216-43fb-8a52-4be17a956ce1) old=Port_Binding(up=[True], chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Dec 15 
04:45:40 localhost systemd[1]: machine-qemu\x2d1\x2dinstance\x2d00000002.scope: Deactivated successfully. Dec 15 04:45:40 localhost systemd[1]: machine-qemu\x2d1\x2dinstance\x2d00000002.scope: Consumed 3min 52.414s CPU time. Dec 15 04:45:40 localhost ovn_metadata_agent[160585]: 2025-12-15 09:45:40.525 160590 INFO neutron.agent.ovn.metadata.agent [-] Port 03ef8889-3216-43fb-8a52-4be17a956ce1 in datapath befb7a72-17a9-4bcb-b561-84b8f626685a unbound from our chassis#033[00m Dec 15 04:45:40 localhost ovn_metadata_agent[160585]: 2025-12-15 09:45:40.526 160590 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network befb7a72-17a9-4bcb-b561-84b8f626685a, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m Dec 15 04:45:40 localhost nova_compute[286344]: 2025-12-15 09:45:40.528 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 04:45:40 localhost systemd-machined[84011]: Machine qemu-1-instance-00000002 terminated. 
Dec 15 04:45:40 localhost ovn_controller[154603]: 2025-12-15T09:45:40Z|00052|ovn_bfd|INFO|Disabled BFD on interface ovn-9f826b-0 Dec 15 04:45:40 localhost ovn_controller[154603]: 2025-12-15T09:45:40Z|00053|ovn_bfd|INFO|Disabled BFD on interface ovn-843308-0 Dec 15 04:45:40 localhost ovn_controller[154603]: 2025-12-15T09:45:40Z|00054|ovn_bfd|INFO|Disabled BFD on interface ovn-c1fd65-0 Dec 15 04:45:40 localhost ovn_metadata_agent[160585]: 2025-12-15 09:45:40.533 160858 DEBUG oslo.privsep.daemon [-] privsep: reply[0d4af6e2-78d4-4f1a-9237-8aa57810eb58]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Dec 15 04:45:40 localhost ovn_metadata_agent[160585]: 2025-12-15 09:45:40.534 160590 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-befb7a72-17a9-4bcb-b561-84b8f626685a namespace which is not needed anymore#033[00m Dec 15 04:45:40 localhost ovn_controller[154603]: 2025-12-15T09:45:40Z|00055|binding|INFO|Releasing lport b35254ad-12eb-47bb-92be-44fefe0694f0 from this chassis (sb_readonly=0) Dec 15 04:45:40 localhost nova_compute[286344]: 2025-12-15 09:45:40.537 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 04:45:40 localhost nova_compute[286344]: 2025-12-15 09:45:40.540 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 04:45:40 localhost nova_compute[286344]: 2025-12-15 09:45:40.580 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 04:45:40 localhost ovn_controller[154603]: 2025-12-15T09:45:40Z|00056|binding|INFO|Releasing lport b35254ad-12eb-47bb-92be-44fefe0694f0 from this chassis (sb_readonly=0) Dec 15 04:45:40 localhost nova_compute[286344]: 2025-12-15 09:45:40.588 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 
__log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 04:45:40 localhost systemd[1]: libpod-9e7442899f892fc0ac4ff30dac8e7344498d50f9a93adf21950447c50c9a3168.scope: Deactivated successfully. Dec 15 04:45:40 localhost podman[287048]: 2025-12-15 09:45:40.721178744 +0000 UTC m=+0.078166824 container died 9e7442899f892fc0ac4ff30dac8e7344498d50f9a93adf21950447c50c9a3168 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=neutron-haproxy-ovnmeta-befb7a72-17a9-4bcb-b561-84b8f626685a, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, url=https://www.redhat.com, build-date=2025-11-19T00:14:25Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, konflux.additional-tags=17.1.12 17.1_20251118.1, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, io.openshift.expose-services=, release=1761123044, name=rhosp17/openstack-neutron-metadata-agent-ovn, version=17.1.12, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, tcib_managed=true, architecture=x86_64, batch=17.1_20251118.1, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, distribution-scope=public, io.buildah.version=1.41.4, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn) Dec 15 04:45:40 localhost nova_compute[286344]: 2025-12-15 09:45:40.724 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup 
/usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 04:45:40 localhost nova_compute[286344]: 2025-12-15 09:45:40.729 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 04:45:40 localhost podman[287048]: 2025-12-15 09:45:40.870644898 +0000 UTC m=+0.227632968 container cleanup 9e7442899f892fc0ac4ff30dac8e7344498d50f9a93adf21950447c50c9a3168 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=neutron-haproxy-ovnmeta-befb7a72-17a9-4bcb-b561-84b8f626685a, architecture=x86_64, io.buildah.version=1.41.4, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, release=1761123044, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, build-date=2025-11-19T00:14:25Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-type=git, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, distribution-scope=public, batch=17.1_20251118.1, io.openshift.expose-services=, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, name=rhosp17/openstack-neutron-metadata-agent-ovn, version=17.1.12, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, konflux.additional-tags=17.1.12 17.1_20251118.1, tcib_managed=true, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c) Dec 15 04:45:40 localhost nova_compute[286344]: 2025-12-15 09:45:40.884 286348 DEBUG nova.compute.manager 
[req-ac380fcc-7f2c-4c95-a149-f7bf611f24be req-6d2c1f3c-b6df-433a-bb1c-f98d0b0c3790 01c6e0cd81404938a10f65d81ef6d178 2f96e5f995e6442f94f881dabbe4dbf7 - - default default] [instance: 39ff1bd9-6f6b-44c8-bbec-a1fd9d196359] Received event network-vif-unplugged-03ef8889-3216-43fb-8a52-4be17a956ce1 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m Dec 15 04:45:40 localhost nova_compute[286344]: 2025-12-15 09:45:40.885 286348 DEBUG oslo_concurrency.lockutils [req-ac380fcc-7f2c-4c95-a149-f7bf611f24be req-6d2c1f3c-b6df-433a-bb1c-f98d0b0c3790 01c6e0cd81404938a10f65d81ef6d178 2f96e5f995e6442f94f881dabbe4dbf7 - - default default] Acquiring lock "39ff1bd9-6f6b-44c8-bbec-a1fd9d196359-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Dec 15 04:45:40 localhost nova_compute[286344]: 2025-12-15 09:45:40.886 286348 DEBUG oslo_concurrency.lockutils [req-ac380fcc-7f2c-4c95-a149-f7bf611f24be req-6d2c1f3c-b6df-433a-bb1c-f98d0b0c3790 01c6e0cd81404938a10f65d81ef6d178 2f96e5f995e6442f94f881dabbe4dbf7 - - default default] Lock "39ff1bd9-6f6b-44c8-bbec-a1fd9d196359-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Dec 15 04:45:40 localhost nova_compute[286344]: 2025-12-15 09:45:40.886 286348 DEBUG oslo_concurrency.lockutils [req-ac380fcc-7f2c-4c95-a149-f7bf611f24be req-6d2c1f3c-b6df-433a-bb1c-f98d0b0c3790 01c6e0cd81404938a10f65d81ef6d178 2f96e5f995e6442f94f881dabbe4dbf7 - - default default] Lock "39ff1bd9-6f6b-44c8-bbec-a1fd9d196359-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Dec 15 04:45:40 localhost nova_compute[286344]: 2025-12-15 09:45:40.886 286348 DEBUG 
nova.compute.manager [req-ac380fcc-7f2c-4c95-a149-f7bf611f24be req-6d2c1f3c-b6df-433a-bb1c-f98d0b0c3790 01c6e0cd81404938a10f65d81ef6d178 2f96e5f995e6442f94f881dabbe4dbf7 - - default default] [instance: 39ff1bd9-6f6b-44c8-bbec-a1fd9d196359] No waiting events found dispatching network-vif-unplugged-03ef8889-3216-43fb-8a52-4be17a956ce1 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m Dec 15 04:45:40 localhost nova_compute[286344]: 2025-12-15 09:45:40.887 286348 WARNING nova.compute.manager [req-ac380fcc-7f2c-4c95-a149-f7bf611f24be req-6d2c1f3c-b6df-433a-bb1c-f98d0b0c3790 01c6e0cd81404938a10f65d81ef6d178 2f96e5f995e6442f94f881dabbe4dbf7 - - default default] [instance: 39ff1bd9-6f6b-44c8-bbec-a1fd9d196359] Received unexpected event network-vif-unplugged-03ef8889-3216-43fb-8a52-4be17a956ce1 for instance with vm_state active and task_state powering-off.#033[00m Dec 15 04:45:40 localhost podman[287064]: 2025-12-15 09:45:40.888741333 +0000 UTC m=+0.154339771 container cleanup 9e7442899f892fc0ac4ff30dac8e7344498d50f9a93adf21950447c50c9a3168 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=neutron-haproxy-ovnmeta-befb7a72-17a9-4bcb-b561-84b8f626685a, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, release=1761123044, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, name=rhosp17/openstack-neutron-metadata-agent-ovn, batch=17.1_20251118.1, io.buildah.version=1.41.4, io.openshift.expose-services=, tcib_managed=true, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05, build-date=2025-11-19T00:14:25Z, vcs-type=git, 
architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, distribution-scope=public, version=17.1.12, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, konflux.additional-tags=17.1.12 17.1_20251118.1, vendor=Red Hat, Inc.) Dec 15 04:45:40 localhost systemd[1]: libpod-conmon-9e7442899f892fc0ac4ff30dac8e7344498d50f9a93adf21950447c50c9a3168.scope: Deactivated successfully. Dec 15 04:45:40 localhost podman[287088]: 2025-12-15 09:45:40.960165698 +0000 UTC m=+0.067784924 container remove 9e7442899f892fc0ac4ff30dac8e7344498d50f9a93adf21950447c50c9a3168 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=neutron-haproxy-ovnmeta-befb7a72-17a9-4bcb-b561-84b8f626685a, name=rhosp17/openstack-neutron-metadata-agent-ovn, url=https://www.redhat.com, version=17.1.12, konflux.additional-tags=17.1.12 17.1_20251118.1, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=89d55f10f82ff50b4f24de36868d7c635c279c7c, tcib_managed=true, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, build-date=2025-11-19T00:14:25Z, vendor=Red Hat, Inc., release=1761123044, vcs-ref=89d55f10f82ff50b4f24de36868d7c635c279c7c, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, batch=17.1_20251118.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:rhel_e4s:9.2::appstream, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, 
vcs-type=git, io.buildah.version=1.41.4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:dd3e22348293588538689be8c51c23472fd4ca53650b3898401947ef9c7e1a05) Dec 15 04:45:40 localhost ovn_metadata_agent[160585]: 2025-12-15 09:45:40.964 160858 DEBUG oslo.privsep.daemon [-] privsep: reply[f3084f66-3680-4f5f-a33a-3d940154ad7b]: (4, ('Mon Dec 15 09:45:40 AM UTC 2025 Stopping container neutron-haproxy-ovnmeta-befb7a72-17a9-4bcb-b561-84b8f626685a (9e7442899f892fc0ac4ff30dac8e7344498d50f9a93adf21950447c50c9a3168)\n9e7442899f892fc0ac4ff30dac8e7344498d50f9a93adf21950447c50c9a3168\nMon Dec 15 09:45:40 AM UTC 2025 Deleting container neutron-haproxy-ovnmeta-befb7a72-17a9-4bcb-b561-84b8f626685a (9e7442899f892fc0ac4ff30dac8e7344498d50f9a93adf21950447c50c9a3168)\n9e7442899f892fc0ac4ff30dac8e7344498d50f9a93adf21950447c50c9a3168\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Dec 15 04:45:40 localhost ovn_metadata_agent[160585]: 2025-12-15 09:45:40.967 160858 DEBUG oslo.privsep.daemon [-] privsep: reply[8cab7f62-7c41-4b7b-a2e1-abd4ad2a64b9]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Dec 15 04:45:40 localhost ovn_metadata_agent[160585]: 2025-12-15 09:45:40.968 160590 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapbefb7a72-10, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m Dec 15 04:45:40 localhost nova_compute[286344]: 2025-12-15 09:45:40.970 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 04:45:40 localhost kernel: device tapbefb7a72-10 left promiscuous mode Dec 15 04:45:40 localhost nova_compute[286344]: 2025-12-15 09:45:40.978 286348 DEBUG 
ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 04:45:40 localhost ovn_metadata_agent[160585]: 2025-12-15 09:45:40.981 160858 DEBUG oslo.privsep.daemon [-] privsep: reply[4251bb85-89bc-41a8-9bfb-0ad8a55b4f34]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Dec 15 04:45:40 localhost ovn_metadata_agent[160585]: 2025-12-15 09:45:40.996 160858 DEBUG oslo.privsep.daemon [-] privsep: reply[742e8fc8-6fb6-45ec-9838-88addfc082fe]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Dec 15 04:45:40 localhost ovn_metadata_agent[160585]: 2025-12-15 09:45:40.997 160858 DEBUG oslo.privsep.daemon [-] privsep: reply[e2a011c7-573a-4933-83ca-834e79d55831]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Dec 15 04:45:41 localhost ovn_metadata_agent[160585]: 2025-12-15 09:45:41.012 160858 DEBUG oslo.privsep.daemon [-] privsep: reply[eeabb113-df60-4e2d-990a-1100bdfb8fc7]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_QDISC', 'noqueue'], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 
28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 1, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 646659, 'reachable_time': 24728, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 
4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 37, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}]], 'header': {'length': 1356, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 287109, 'error': None, 'target': 'ovnmeta-befb7a72-17a9-4bcb-b561-84b8f626685a', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Dec 15 04:45:41 localhost ovn_metadata_agent[160585]: 2025-12-15 09:45:41.021 160979 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-befb7a72-17a9-4bcb-b561-84b8f626685a deleted. 
remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m Dec 15 04:45:41 localhost ovn_metadata_agent[160585]: 2025-12-15 09:45:41.022 160979 DEBUG oslo.privsep.daemon [-] privsep: reply[0e188d1a-59d7-49c9-a9d0-17208d8a20e4]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Dec 15 04:45:41 localhost nova_compute[286344]: 2025-12-15 09:45:41.045 286348 INFO nova.virt.libvirt.driver [None req-95854b29-0d11-4468-9b4b-db2ee9a7eb55 1ba5fce347b64bfebf995f187193f205 c785bf23f53946bc99867d8832a50266 - - default default] [instance: 39ff1bd9-6f6b-44c8-bbec-a1fd9d196359] Instance shutdown successfully after 3 seconds.#033[00m Dec 15 04:45:41 localhost nova_compute[286344]: 2025-12-15 09:45:41.053 286348 INFO nova.virt.libvirt.driver [-] [instance: 39ff1bd9-6f6b-44c8-bbec-a1fd9d196359] Instance destroyed successfully.#033[00m Dec 15 04:45:41 localhost nova_compute[286344]: 2025-12-15 09:45:41.054 286348 DEBUG nova.objects.instance [None req-95854b29-0d11-4468-9b4b-db2ee9a7eb55 1ba5fce347b64bfebf995f187193f205 c785bf23f53946bc99867d8832a50266 - - default default] Lazy-loading 'numa_topology' on Instance uuid 39ff1bd9-6f6b-44c8-bbec-a1fd9d196359 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m Dec 15 04:45:41 localhost nova_compute[286344]: 2025-12-15 09:45:41.070 286348 DEBUG nova.compute.manager [None req-95854b29-0d11-4468-9b4b-db2ee9a7eb55 1ba5fce347b64bfebf995f187193f205 c785bf23f53946bc99867d8832a50266 - - default default] [instance: 39ff1bd9-6f6b-44c8-bbec-a1fd9d196359] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m Dec 15 04:45:41 localhost nova_compute[286344]: 2025-12-15 09:45:41.128 286348 DEBUG oslo_concurrency.lockutils [None req-95854b29-0d11-4468-9b4b-db2ee9a7eb55 1ba5fce347b64bfebf995f187193f205 c785bf23f53946bc99867d8832a50266 - - default default] Lock "39ff1bd9-6f6b-44c8-bbec-a1fd9d196359" 
"released" by "nova.compute.manager.ComputeManager.stop_instance..do_stop_instance" :: held 3.156s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Dec 15 04:45:41 localhost systemd[1]: var-lib-containers-storage-overlay-666eed5dd2f565d13149f1b7f3b5648936bfa7703003f9a7dc35b189f28bc0ec-merged.mount: Deactivated successfully. Dec 15 04:45:41 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-9e7442899f892fc0ac4ff30dac8e7344498d50f9a93adf21950447c50c9a3168-userdata-shm.mount: Deactivated successfully. Dec 15 04:45:41 localhost systemd[1]: run-netns-ovnmeta\x2dbefb7a72\x2d17a9\x2d4bcb\x2db561\x2d84b8f626685a.mount: Deactivated successfully. Dec 15 04:45:42 localhost nova_compute[286344]: 2025-12-15 09:45:42.933 286348 DEBUG nova.compute.manager [req-9db28735-e52a-4e41-b1ad-83e2136c0854 req-60b38806-dc04-4e33-84f7-124dfc2995d1 01c6e0cd81404938a10f65d81ef6d178 2f96e5f995e6442f94f881dabbe4dbf7 - - default default] [instance: 39ff1bd9-6f6b-44c8-bbec-a1fd9d196359] Received event network-vif-plugged-03ef8889-3216-43fb-8a52-4be17a956ce1 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m Dec 15 04:45:42 localhost nova_compute[286344]: 2025-12-15 09:45:42.934 286348 DEBUG oslo_concurrency.lockutils [req-9db28735-e52a-4e41-b1ad-83e2136c0854 req-60b38806-dc04-4e33-84f7-124dfc2995d1 01c6e0cd81404938a10f65d81ef6d178 2f96e5f995e6442f94f881dabbe4dbf7 - - default default] Acquiring lock "39ff1bd9-6f6b-44c8-bbec-a1fd9d196359-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Dec 15 04:45:42 localhost nova_compute[286344]: 2025-12-15 09:45:42.935 286348 DEBUG oslo_concurrency.lockutils [req-9db28735-e52a-4e41-b1ad-83e2136c0854 req-60b38806-dc04-4e33-84f7-124dfc2995d1 01c6e0cd81404938a10f65d81ef6d178 2f96e5f995e6442f94f881dabbe4dbf7 - - default default] Lock 
"39ff1bd9-6f6b-44c8-bbec-a1fd9d196359-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Dec 15 04:45:42 localhost nova_compute[286344]: 2025-12-15 09:45:42.935 286348 DEBUG oslo_concurrency.lockutils [req-9db28735-e52a-4e41-b1ad-83e2136c0854 req-60b38806-dc04-4e33-84f7-124dfc2995d1 01c6e0cd81404938a10f65d81ef6d178 2f96e5f995e6442f94f881dabbe4dbf7 - - default default] Lock "39ff1bd9-6f6b-44c8-bbec-a1fd9d196359-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Dec 15 04:45:42 localhost nova_compute[286344]: 2025-12-15 09:45:42.935 286348 DEBUG nova.compute.manager [req-9db28735-e52a-4e41-b1ad-83e2136c0854 req-60b38806-dc04-4e33-84f7-124dfc2995d1 01c6e0cd81404938a10f65d81ef6d178 2f96e5f995e6442f94f881dabbe4dbf7 - - default default] [instance: 39ff1bd9-6f6b-44c8-bbec-a1fd9d196359] No waiting events found dispatching network-vif-plugged-03ef8889-3216-43fb-8a52-4be17a956ce1 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m Dec 15 04:45:42 localhost nova_compute[286344]: 2025-12-15 09:45:42.935 286348 WARNING nova.compute.manager [req-9db28735-e52a-4e41-b1ad-83e2136c0854 req-60b38806-dc04-4e33-84f7-124dfc2995d1 01c6e0cd81404938a10f65d81ef6d178 2f96e5f995e6442f94f881dabbe4dbf7 - - default default] [instance: 39ff1bd9-6f6b-44c8-bbec-a1fd9d196359] Received unexpected event network-vif-plugged-03ef8889-3216-43fb-8a52-4be17a956ce1 for instance with vm_state stopped and task_state None.#033[00m Dec 15 04:45:43 localhost nova_compute[286344]: 2025-12-15 09:45:43.457 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 04:45:43 localhost systemd[1]: Started /usr/bin/podman 
healthcheck run 730c50020a7d9e2da21d2462d40af20ac303e43f3491f692cbbd87f432ba3e09. Dec 15 04:45:43 localhost podman[287111]: 2025-12-15 09:45:43.747097794 +0000 UTC m=+0.080450717 container health_status 730c50020a7d9e2da21d2462d40af20ac303e43f3491f692cbbd87f432ba3e09 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, maintainer=Red Hat, Inc., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, name=ubi9-minimal, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'cb1e9478634a00c8fb5ad9cd7fcd38c7b2a1a85187dfaa618bc715c24c2b15fb'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, config_id=openstack_network_exporter, vcs-type=git, version=9.6, io.openshift.tags=minimal rhel9, managed_by=edpm_ansible, 
build-date=2025-08-20T13:12:41, io.buildah.version=1.33.7, architecture=x86_64, vendor=Red Hat, Inc., io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, url=https://catalog.redhat.com/en/search?searchType=containers, container_name=openstack_network_exporter, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.component=ubi9-minimal-container, release=1755695350, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b) Dec 15 04:45:43 localhost podman[287111]: 2025-12-15 09:45:43.762432073 +0000 UTC m=+0.095784996 container exec_died 730c50020a7d9e2da21d2462d40af20ac303e43f3491f692cbbd87f432ba3e09 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., name=ubi9-minimal, maintainer=Red Hat, Inc., io.buildah.version=1.33.7, vcs-type=git, managed_by=edpm_ansible, url=https://catalog.redhat.com/en/search?searchType=containers, container_name=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., build-date=2025-08-20T13:12:41, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, io.openshift.tags=minimal rhel9, io.openshift.expose-services=, release=1755695350, architecture=x86_64, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., version=9.6, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'cb1e9478634a00c8fb5ad9cd7fcd38c7b2a1a85187dfaa618bc715c24c2b15fb'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, config_id=openstack_network_exporter, com.redhat.component=ubi9-minimal-container) Dec 15 04:45:43 localhost nova_compute[286344]: 2025-12-15 09:45:43.768 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 04:45:43 localhost 
systemd[1]: 730c50020a7d9e2da21d2462d40af20ac303e43f3491f692cbbd87f432ba3e09.service: Deactivated successfully. Dec 15 04:45:43 localhost nova_compute[286344]: 2025-12-15 09:45:43.982 286348 DEBUG nova.compute.manager [None req-f1cca4c1-61a0-4ff5-8783-3b495a18d2db 1ba5fce347b64bfebf995f187193f205 c785bf23f53946bc99867d8832a50266 - - default default] [instance: 39ff1bd9-6f6b-44c8-bbec-a1fd9d196359] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m Dec 15 04:45:44 localhost nova_compute[286344]: 2025-12-15 09:45:44.004 286348 ERROR oslo_messaging.rpc.server [None req-f1cca4c1-61a0-4ff5-8783-3b495a18d2db 1ba5fce347b64bfebf995f187193f205 c785bf23f53946bc99867d8832a50266 - - default default] Exception during message handling: nova.exception.InstanceInvalidState: Instance 39ff1bd9-6f6b-44c8-bbec-a1fd9d196359 in power state shutdown. Cannot get_diagnostics while the instance is in this state. Dec 15 04:45:44 localhost nova_compute[286344]: 2025-12-15 09:45:44.004 286348 ERROR oslo_messaging.rpc.server Traceback (most recent call last): Dec 15 04:45:44 localhost nova_compute[286344]: 2025-12-15 09:45:44.004 286348 ERROR oslo_messaging.rpc.server File "/usr/lib/python3.9/site-packages/oslo_messaging/rpc/server.py", line 165, in _process_incoming Dec 15 04:45:44 localhost nova_compute[286344]: 2025-12-15 09:45:44.004 286348 ERROR oslo_messaging.rpc.server res = self.dispatcher.dispatch(message) Dec 15 04:45:44 localhost nova_compute[286344]: 2025-12-15 09:45:44.004 286348 ERROR oslo_messaging.rpc.server File "/usr/lib/python3.9/site-packages/oslo_messaging/rpc/dispatcher.py", line 309, in dispatch Dec 15 04:45:44 localhost nova_compute[286344]: 2025-12-15 09:45:44.004 286348 ERROR oslo_messaging.rpc.server return self._do_dispatch(endpoint, method, ctxt, args) Dec 15 04:45:44 localhost nova_compute[286344]: 2025-12-15 09:45:44.004 286348 ERROR oslo_messaging.rpc.server File 
"/usr/lib/python3.9/site-packages/oslo_messaging/rpc/dispatcher.py", line 229, in _do_dispatch Dec 15 04:45:44 localhost nova_compute[286344]: 2025-12-15 09:45:44.004 286348 ERROR oslo_messaging.rpc.server result = func(ctxt, **new_args) Dec 15 04:45:44 localhost nova_compute[286344]: 2025-12-15 09:45:44.004 286348 ERROR oslo_messaging.rpc.server File "/usr/lib/python3.9/site-packages/nova/exception_wrapper.py", line 71, in wrapped Dec 15 04:45:44 localhost nova_compute[286344]: 2025-12-15 09:45:44.004 286348 ERROR oslo_messaging.rpc.server _emit_versioned_exception_notification( Dec 15 04:45:44 localhost nova_compute[286344]: 2025-12-15 09:45:44.004 286348 ERROR oslo_messaging.rpc.server File "/usr/lib/python3.9/site-packages/oslo_utils/excutils.py", line 227, in __exit__ Dec 15 04:45:44 localhost nova_compute[286344]: 2025-12-15 09:45:44.004 286348 ERROR oslo_messaging.rpc.server self.force_reraise() Dec 15 04:45:44 localhost nova_compute[286344]: 2025-12-15 09:45:44.004 286348 ERROR oslo_messaging.rpc.server File "/usr/lib/python3.9/site-packages/oslo_utils/excutils.py", line 200, in force_reraise Dec 15 04:45:44 localhost nova_compute[286344]: 2025-12-15 09:45:44.004 286348 ERROR oslo_messaging.rpc.server raise self.value Dec 15 04:45:44 localhost nova_compute[286344]: 2025-12-15 09:45:44.004 286348 ERROR oslo_messaging.rpc.server File "/usr/lib/python3.9/site-packages/nova/exception_wrapper.py", line 63, in wrapped Dec 15 04:45:44 localhost nova_compute[286344]: 2025-12-15 09:45:44.004 286348 ERROR oslo_messaging.rpc.server return f(self, context, *args, **kw) Dec 15 04:45:44 localhost nova_compute[286344]: 2025-12-15 09:45:44.004 286348 ERROR oslo_messaging.rpc.server File "/usr/lib/python3.9/site-packages/nova/compute/manager.py", line 214, in decorated_function Dec 15 04:45:44 localhost nova_compute[286344]: 2025-12-15 09:45:44.004 286348 ERROR oslo_messaging.rpc.server compute_utils.add_instance_fault_from_exc(context, Dec 15 04:45:44 localhost 
nova_compute[286344]: 2025-12-15 09:45:44.004 286348 ERROR oslo_messaging.rpc.server File "/usr/lib/python3.9/site-packages/oslo_utils/excutils.py", line 227, in __exit__ Dec 15 04:45:44 localhost nova_compute[286344]: 2025-12-15 09:45:44.004 286348 ERROR oslo_messaging.rpc.server self.force_reraise() Dec 15 04:45:44 localhost nova_compute[286344]: 2025-12-15 09:45:44.004 286348 ERROR oslo_messaging.rpc.server File "/usr/lib/python3.9/site-packages/oslo_utils/excutils.py", line 200, in force_reraise Dec 15 04:45:44 localhost nova_compute[286344]: 2025-12-15 09:45:44.004 286348 ERROR oslo_messaging.rpc.server raise self.value Dec 15 04:45:44 localhost nova_compute[286344]: 2025-12-15 09:45:44.004 286348 ERROR oslo_messaging.rpc.server File "/usr/lib/python3.9/site-packages/nova/compute/manager.py", line 203, in decorated_function Dec 15 04:45:44 localhost nova_compute[286344]: 2025-12-15 09:45:44.004 286348 ERROR oslo_messaging.rpc.server return function(self, context, *args, **kwargs) Dec 15 04:45:44 localhost nova_compute[286344]: 2025-12-15 09:45:44.004 286348 ERROR oslo_messaging.rpc.server File "/usr/lib/python3.9/site-packages/nova/compute/manager.py", line 6739, in get_instance_diagnostics Dec 15 04:45:44 localhost nova_compute[286344]: 2025-12-15 09:45:44.004 286348 ERROR oslo_messaging.rpc.server raise exception.InstanceInvalidState( Dec 15 04:45:44 localhost nova_compute[286344]: 2025-12-15 09:45:44.004 286348 ERROR oslo_messaging.rpc.server nova.exception.InstanceInvalidState: Instance 39ff1bd9-6f6b-44c8-bbec-a1fd9d196359 in power state shutdown. Cannot get_diagnostics while the instance is in this state. Dec 15 04:45:44 localhost nova_compute[286344]: 2025-12-15 09:45:44.004 286348 ERROR oslo_messaging.rpc.server #033[00m Dec 15 04:45:44 localhost systemd[1]: Started /usr/bin/podman healthcheck run ac2eae30a2a7f971398af42ea67b3ed799d4b3341f7688ebcd73f7ca7f66c2a8. 
Dec 15 04:45:44 localhost podman[287131]: 2025-12-15 09:45:44.747857391 +0000 UTC m=+0.080103878 container health_status ac2eae30a2a7f971398af42ea67b3ed799d4b3341f7688ebcd73f7ca7f66c2a8 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '07f3af15d79276418f54a8dde2fc251064527c3571430e14af77aaa2c01f1193'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, container_name=ovn_controller, tcib_managed=true, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.license=GPLv2) Dec 15 04:45:44 localhost podman[287131]: 2025-12-15 09:45:44.81445011 +0000 UTC m=+0.146696627 container exec_died ac2eae30a2a7f971398af42ea67b3ed799d4b3341f7688ebcd73f7ca7f66c2a8 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202, config_id=ovn_controller, org.label-schema.vendor=CentOS, 
org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '07f3af15d79276418f54a8dde2fc251064527c3571430e14af77aaa2c01f1193'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_managed=true) Dec 15 04:45:44 localhost systemd[1]: ac2eae30a2a7f971398af42ea67b3ed799d4b3341f7688ebcd73f7ca7f66c2a8.service: Deactivated successfully. 
Dec 15 04:45:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:45:48.119 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359', 'name': 'test', 'flavor': {'id': '2da0e147-aaa7-4bb9-a176-5fe1b15a32a0', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'image': {'id': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-00000002', 'OS-EXT-SRV-ATTR:host': 'np0005559462.localdomain', 'OS-EXT-STS:vm_state': 'shutdown', 'tenant_id': 'c785bf23f53946bc99867d8832a50266', 'user_id': '1ba5fce347b64bfebf995f187193f205', 'hostId': 'bf4d507aa00b3fcf643a7e0b883420632ac7fc96e8c7a756982e121d', 'status': 'stopped', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228 Dec 15 04:45:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:45:48.119 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.usage in the context of pollsters Dec 15 04:45:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:45:48.121 12 DEBUG ceilometer.compute.pollsters [-] Instance 39ff1bd9-6f6b-44c8-bbec-a1fd9d196359 was shut off while getting sample of disk.device.usage: Failed to inspect data of instance , domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152 Dec 15 04:45:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:45:48.122 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.delta in the context of pollsters Dec 15 04:45:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:45:48.122 12 DEBUG ceilometer.compute.pollsters [-] Instance 39ff1bd9-6f6b-44c8-bbec-a1fd9d196359 was shut off while getting sample of network.incoming.bytes.delta: Failed to inspect data of instance , domain state is SHUTOFF. 
get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152 Dec 15 04:45:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:45:48.123 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.latency in the context of pollsters Dec 15 04:45:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:45:48.124 12 DEBUG ceilometer.compute.pollsters [-] Instance 39ff1bd9-6f6b-44c8-bbec-a1fd9d196359 was shut off while getting sample of disk.device.read.latency: Failed to inspect data of instance , domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152 Dec 15 04:45:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:45:48.124 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets in the context of pollsters Dec 15 04:45:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:45:48.125 12 DEBUG ceilometer.compute.pollsters [-] Instance 39ff1bd9-6f6b-44c8-bbec-a1fd9d196359 was shut off while getting sample of network.incoming.packets: Failed to inspect data of instance , domain state is SHUTOFF. 
get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152 Dec 15 04:45:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:45:48.125 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no new resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193 Dec 15 04:45:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:45:48.125 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.capacity in the context of pollsters Dec 15 04:45:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:45:48.126 12 DEBUG ceilometer.compute.pollsters [-] Instance 39ff1bd9-6f6b-44c8-bbec-a1fd9d196359 was shut off while getting sample of disk.device.capacity: Failed to inspect data of instance , domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152 Dec 15 04:45:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:45:48.126 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.requests in the context of pollsters Dec 15 04:45:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:45:48.127 12 DEBUG ceilometer.compute.pollsters [-] Instance 39ff1bd9-6f6b-44c8-bbec-a1fd9d196359 was shut off while getting sample of disk.device.write.requests: Failed to inspect data of instance , domain state is SHUTOFF. 
get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152 Dec 15 04:45:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:45:48.127 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.latency in the context of pollsters Dec 15 04:45:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:45:48.128 12 DEBUG ceilometer.compute.pollsters [-] Instance 39ff1bd9-6f6b-44c8-bbec-a1fd9d196359 was shut off while getting sample of disk.device.write.latency: Failed to inspect data of instance , domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152 Dec 15 04:45:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:45:48.129 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.error in the context of pollsters Dec 15 04:45:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:45:48.130 12 DEBUG ceilometer.compute.pollsters [-] Instance 39ff1bd9-6f6b-44c8-bbec-a1fd9d196359 was shut off while getting sample of network.outgoing.packets.error: Failed to inspect data of instance , domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152 Dec 15 04:45:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:45:48.130 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.error in the context of pollsters Dec 15 04:45:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:45:48.131 12 DEBUG ceilometer.compute.pollsters [-] Instance 39ff1bd9-6f6b-44c8-bbec-a1fd9d196359 was shut off while getting sample of network.incoming.packets.error: Failed to inspect data of instance , domain state is SHUTOFF. 
get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152 Dec 15 04:45:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:45:48.131 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.bytes in the context of pollsters Dec 15 04:45:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:45:48.132 12 DEBUG ceilometer.compute.pollsters [-] Instance 39ff1bd9-6f6b-44c8-bbec-a1fd9d196359 was shut off while getting sample of disk.device.read.bytes: Failed to inspect data of instance , domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152 Dec 15 04:45:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:45:48.132 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no new resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193 Dec 15 04:45:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:45:48.132 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.drop in the context of pollsters Dec 15 04:45:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:45:48.133 12 DEBUG ceilometer.compute.pollsters [-] Instance 39ff1bd9-6f6b-44c8-bbec-a1fd9d196359 was shut off while getting sample of network.outgoing.packets.drop: Failed to inspect data of instance , domain state is SHUTOFF. 
get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152 Dec 15 04:45:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:45:48.134 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes in the context of pollsters Dec 15 04:45:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:45:48.134 12 DEBUG ceilometer.compute.pollsters [-] Instance 39ff1bd9-6f6b-44c8-bbec-a1fd9d196359 was shut off while getting sample of network.outgoing.bytes: Failed to inspect data of instance , domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152 Dec 15 04:45:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:45:48.135 12 INFO ceilometer.polling.manager [-] Polling pollster cpu in the context of pollsters Dec 15 04:45:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:45:48.136 12 DEBUG ceilometer.compute.pollsters [-] Instance 39ff1bd9-6f6b-44c8-bbec-a1fd9d196359 was shut off while getting sample of cpu: Failed to inspect data of instance , domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152 Dec 15 04:45:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:45:48.136 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.allocation in the context of pollsters Dec 15 04:45:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:45:48.137 12 DEBUG ceilometer.compute.pollsters [-] Instance 39ff1bd9-6f6b-44c8-bbec-a1fd9d196359 was shut off while getting sample of disk.device.allocation: Failed to inspect data of instance , domain state is SHUTOFF. 
get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152 Dec 15 04:45:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:45:48.137 12 INFO ceilometer.polling.manager [-] Polling pollster memory.usage in the context of pollsters Dec 15 04:45:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:45:48.138 12 DEBUG ceilometer.compute.pollsters [-] Instance 39ff1bd9-6f6b-44c8-bbec-a1fd9d196359 was shut off while getting sample of memory.usage: Failed to inspect data of instance , domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152 Dec 15 04:45:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:45:48.138 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.delta in the context of pollsters Dec 15 04:45:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:45:48.139 12 DEBUG ceilometer.compute.pollsters [-] Instance 39ff1bd9-6f6b-44c8-bbec-a1fd9d196359 was shut off while getting sample of network.outgoing.bytes.delta: Failed to inspect data of instance , domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152 Dec 15 04:45:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:45:48.139 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.requests in the context of pollsters Dec 15 04:45:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:45:48.140 12 DEBUG ceilometer.compute.pollsters [-] Instance 39ff1bd9-6f6b-44c8-bbec-a1fd9d196359 was shut off while getting sample of disk.device.read.requests: Failed to inspect data of instance , domain state is SHUTOFF. 
get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152 Dec 15 04:45:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:45:48.141 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.drop in the context of pollsters Dec 15 04:45:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:45:48.142 12 DEBUG ceilometer.compute.pollsters [-] Instance 39ff1bd9-6f6b-44c8-bbec-a1fd9d196359 was shut off while getting sample of network.incoming.packets.drop: Failed to inspect data of instance , domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152 Dec 15 04:45:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:45:48.142 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes in the context of pollsters Dec 15 04:45:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:45:48.143 12 DEBUG ceilometer.compute.pollsters [-] Instance 39ff1bd9-6f6b-44c8-bbec-a1fd9d196359 was shut off while getting sample of network.incoming.bytes: Failed to inspect data of instance , domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152 Dec 15 04:45:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:45:48.143 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.bytes in the context of pollsters Dec 15 04:45:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:45:48.144 12 DEBUG ceilometer.compute.pollsters [-] Instance 39ff1bd9-6f6b-44c8-bbec-a1fd9d196359 was shut off while getting sample of disk.device.write.bytes: Failed to inspect data of instance , domain state is SHUTOFF. 
get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152 Dec 15 04:45:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:45:48.144 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets in the context of pollsters Dec 15 04:45:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:45:48.145 12 DEBUG ceilometer.compute.pollsters [-] Instance 39ff1bd9-6f6b-44c8-bbec-a1fd9d196359 was shut off while getting sample of network.outgoing.packets: Failed to inspect data of instance , domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152 Dec 15 04:45:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:45:48.145 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no new resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193 Dec 15 04:45:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:45:48.145 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no new resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193 Dec 15 04:45:48 localhost nova_compute[286344]: 2025-12-15 09:45:48.496 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 04:45:48 localhost nova_compute[286344]: 2025-12-15 09:45:48.771 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 04:45:51 localhost ovn_metadata_agent[160585]: 2025-12-15 09:45:51.461 160590 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Dec 15 04:45:51 localhost 
ovn_metadata_agent[160585]: 2025-12-15 09:45:51.462 160590 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Dec 15 04:45:51 localhost ovn_metadata_agent[160585]: 2025-12-15 09:45:51.462 160590 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Dec 15 04:45:51 localhost systemd[1]: Started /usr/bin/podman healthcheck run 4d46c406a3855fd1c322e5d86c048188ea5865daa07ba23aaebacc7879668923. Dec 15 04:45:51 localhost systemd[1]: tmp-crun.8FSxxI.mount: Deactivated successfully. Dec 15 04:45:51 localhost podman[287156]: 2025-12-15 09:45:51.788348383 +0000 UTC m=+0.096784545 container health_status 4d46c406a3855fd1c322e5d86c048188ea5865daa07ba23aaebacc7879668923 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '07f3af15d79276418f54a8dde2fc251064527c3571430e14af77aaa2c01f1193-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', 
'/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2) Dec 15 04:45:51 localhost podman[287156]: 2025-12-15 09:45:51.824449495 +0000 UTC m=+0.132885607 container exec_died 4d46c406a3855fd1c322e5d86c048188ea5865daa07ba23aaebacc7879668923 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, tcib_managed=true, config_id=ovn_metadata_agent, managed_by=edpm_ansible, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '07f3af15d79276418f54a8dde2fc251064527c3571430e14af77aaa2c01f1193-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 
'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent) Dec 15 04:45:51 localhost systemd[1]: 4d46c406a3855fd1c322e5d86c048188ea5865daa07ba23aaebacc7879668923.service: Deactivated successfully. Dec 15 04:45:53 localhost nova_compute[286344]: 2025-12-15 09:45:53.500 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 04:45:53 localhost nova_compute[286344]: 2025-12-15 09:45:53.775 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 04:45:54 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:fc:1f:13 MACDST=fa:16:3e:b3:bb:e3 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=42872 DF PROTO=TCP SPT=59076 DPT=9102 SEQ=2096573310 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A83A93CD20000000001030307) Dec 15 04:45:55 localhost nova_compute[286344]: 2025-12-15 09:45:55.738 286348 DEBUG nova.virt.driver [-] Emitting event Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m Dec 15 04:45:55 localhost nova_compute[286344]: 2025-12-15 09:45:55.739 286348 INFO 
nova.compute.manager [-] [instance: 39ff1bd9-6f6b-44c8-bbec-a1fd9d196359] VM Stopped (Lifecycle Event)#033[00m Dec 15 04:45:55 localhost nova_compute[286344]: 2025-12-15 09:45:55.756 286348 DEBUG nova.compute.manager [None req-3d2e521f-d890-4462-ae2f-b8df27be0d60 - - - - - -] [instance: 39ff1bd9-6f6b-44c8-bbec-a1fd9d196359] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m Dec 15 04:45:55 localhost nova_compute[286344]: 2025-12-15 09:45:55.760 286348 DEBUG nova.compute.manager [None req-3d2e521f-d890-4462-ae2f-b8df27be0d60 - - - - - -] [instance: 39ff1bd9-6f6b-44c8-bbec-a1fd9d196359] Synchronizing instance power state after lifecycle event "Stopped"; current vm_state: stopped, current task_state: None, current DB power_state: 4, VM power_state: 4 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m Dec 15 04:45:55 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:fc:1f:13 MACDST=fa:16:3e:b3:bb:e3 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=42873 DF PROTO=TCP SPT=59076 DPT=9102 SEQ=2096573310 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A83A940E50000000001030307) Dec 15 04:45:56 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:fc:1f:13 MACDST=fa:16:3e:b3:bb:e3 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=54633 DF PROTO=TCP SPT=54598 DPT=9102 SEQ=3163513306 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A83A943250000000001030307) Dec 15 04:45:57 localhost nova_compute[286344]: 2025-12-15 09:45:57.329 286348 DEBUG oslo_service.periodic_task [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Dec 15 04:45:57 localhost nova_compute[286344]: 2025-12-15 09:45:57.329 286348 DEBUG 
oslo_service.periodic_task [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Dec 15 04:45:57 localhost nova_compute[286344]: 2025-12-15 09:45:57.330 286348 DEBUG nova.compute.manager [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m Dec 15 04:45:57 localhost nova_compute[286344]: 2025-12-15 09:45:57.330 286348 DEBUG nova.compute.manager [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m Dec 15 04:45:57 localhost systemd[1]: Started /usr/bin/podman healthcheck run a4e94bd7aaee6c174c601060fb5f34559019ccea93fc397263ee3735731cfc1e. Dec 15 04:45:57 localhost podman[287176]: 2025-12-15 09:45:57.755878283 +0000 UTC m=+0.083803270 container health_status a4e94bd7aaee6c174c601060fb5f34559019ccea93fc397263ee3735731cfc1e (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, 
container_name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible) Dec 15 04:45:57 localhost podman[287176]: 2025-12-15 09:45:57.766748178 +0000 UTC m=+0.094673205 container exec_died a4e94bd7aaee6c174c601060fb5f34559019ccea93fc397263ee3735731cfc1e (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible) Dec 15 04:45:57 localhost systemd[1]: a4e94bd7aaee6c174c601060fb5f34559019ccea93fc397263ee3735731cfc1e.service: Deactivated successfully. 
Dec 15 04:45:57 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:fc:1f:13 MACDST=fa:16:3e:b3:bb:e3 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=42874 DF PROTO=TCP SPT=59076 DPT=9102 SEQ=2096573310 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A83A948E50000000001030307) Dec 15 04:45:58 localhost nova_compute[286344]: 2025-12-15 09:45:58.542 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 04:45:58 localhost nova_compute[286344]: 2025-12-15 09:45:58.631 286348 DEBUG oslo_concurrency.lockutils [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Acquiring lock "refresh_cache-39ff1bd9-6f6b-44c8-bbec-a1fd9d196359" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m Dec 15 04:45:58 localhost nova_compute[286344]: 2025-12-15 09:45:58.631 286348 DEBUG oslo_concurrency.lockutils [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Acquired lock "refresh_cache-39ff1bd9-6f6b-44c8-bbec-a1fd9d196359" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m Dec 15 04:45:58 localhost nova_compute[286344]: 2025-12-15 09:45:58.632 286348 DEBUG nova.network.neutron [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] [instance: 39ff1bd9-6f6b-44c8-bbec-a1fd9d196359] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m Dec 15 04:45:58 localhost nova_compute[286344]: 2025-12-15 09:45:58.632 286348 DEBUG nova.objects.instance [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Lazy-loading 'info_cache' on Instance uuid 39ff1bd9-6f6b-44c8-bbec-a1fd9d196359 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m Dec 15 04:45:58 localhost nova_compute[286344]: 2025-12-15 09:45:58.776 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 
[POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 04:45:59 localhost nova_compute[286344]: 2025-12-15 09:45:59.019 286348 DEBUG nova.network.neutron [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] [instance: 39ff1bd9-6f6b-44c8-bbec-a1fd9d196359] Updating instance_info_cache with network_info: [{"id": "03ef8889-3216-43fb-8a52-4be17a956ce1", "address": "fa:16:3e:74:df:7c", "network": {"id": "befb7a72-17a9-4bcb-b561-84b8f626685a", "bridge": "br-int", "label": "private", "subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.201", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.20", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.0.3"}}], "meta": {"injected": false, "tenant_id": "c785bf23f53946bc99867d8832a50266", "mtu": 1292, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap03ef8889-32", "ovs_interfaceid": "03ef8889-3216-43fb-8a52-4be17a956ce1", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m Dec 15 04:45:59 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:fc:1f:13 MACDST=fa:16:3e:b3:bb:e3 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=23888 DF PROTO=TCP SPT=49422 DPT=9102 SEQ=2260569640 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A83A94D250000000001030307) Dec 15 04:45:59 localhost nova_compute[286344]: 2025-12-15 09:45:59.046 286348 DEBUG 
oslo_concurrency.lockutils [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Releasing lock "refresh_cache-39ff1bd9-6f6b-44c8-bbec-a1fd9d196359" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m Dec 15 04:45:59 localhost nova_compute[286344]: 2025-12-15 09:45:59.047 286348 DEBUG nova.compute.manager [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] [instance: 39ff1bd9-6f6b-44c8-bbec-a1fd9d196359] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m Dec 15 04:45:59 localhost nova_compute[286344]: 2025-12-15 09:45:59.048 286348 DEBUG oslo_service.periodic_task [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Dec 15 04:45:59 localhost nova_compute[286344]: 2025-12-15 09:45:59.048 286348 DEBUG oslo_service.periodic_task [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Dec 15 04:45:59 localhost nova_compute[286344]: 2025-12-15 09:45:59.049 286348 DEBUG oslo_service.periodic_task [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Dec 15 04:45:59 localhost nova_compute[286344]: 2025-12-15 09:45:59.049 286348 DEBUG oslo_service.periodic_task [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Dec 15 04:45:59 localhost nova_compute[286344]: 2025-12-15 09:45:59.050 286348 DEBUG 
oslo_service.periodic_task [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Dec 15 04:45:59 localhost nova_compute[286344]: 2025-12-15 09:45:59.050 286348 DEBUG oslo_service.periodic_task [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Dec 15 04:45:59 localhost nova_compute[286344]: 2025-12-15 09:45:59.051 286348 DEBUG nova.compute.manager [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m Dec 15 04:45:59 localhost nova_compute[286344]: 2025-12-15 09:45:59.051 286348 DEBUG oslo_service.periodic_task [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Dec 15 04:45:59 localhost nova_compute[286344]: 2025-12-15 09:45:59.070 286348 DEBUG oslo_concurrency.lockutils [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Dec 15 04:45:59 localhost nova_compute[286344]: 2025-12-15 09:45:59.071 286348 DEBUG oslo_concurrency.lockutils [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Dec 15 04:45:59 localhost nova_compute[286344]: 
2025-12-15 09:45:59.071 286348 DEBUG oslo_concurrency.lockutils [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Dec 15 04:45:59 localhost nova_compute[286344]: 2025-12-15 09:45:59.071 286348 DEBUG nova.compute.resource_tracker [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Auditing locally available compute resources for np0005559462.localdomain (node: np0005559462.localdomain) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m Dec 15 04:45:59 localhost nova_compute[286344]: 2025-12-15 09:45:59.072 286348 DEBUG oslo_concurrency.processutils [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m Dec 15 04:45:59 localhost nova_compute[286344]: 2025-12-15 09:45:59.523 286348 DEBUG oslo_concurrency.processutils [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.451s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m Dec 15 04:45:59 localhost nova_compute[286344]: 2025-12-15 09:45:59.580 286348 DEBUG nova.virt.libvirt.driver [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m Dec 15 04:45:59 localhost nova_compute[286344]: 2025-12-15 09:45:59.581 286348 DEBUG nova.virt.libvirt.driver [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] skipping disk for instance-00000002 as it does not have a path 
_get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m Dec 15 04:45:59 localhost nova_compute[286344]: 2025-12-15 09:45:59.783 286348 WARNING nova.virt.libvirt.driver [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m Dec 15 04:45:59 localhost nova_compute[286344]: 2025-12-15 09:45:59.785 286348 DEBUG nova.compute.resource_tracker [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Hypervisor/Node resource view: name=np0005559462.localdomain free_ram=12601MB free_disk=41.8370475769043GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": 
"1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m Dec 15 04:45:59 localhost nova_compute[286344]: 2025-12-15 09:45:59.786 286348 DEBUG oslo_concurrency.lockutils [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Dec 15 04:45:59 localhost nova_compute[286344]: 2025-12-15 09:45:59.786 286348 DEBUG oslo_concurrency.lockutils [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Dec 15 04:46:00 localhost nova_compute[286344]: 2025-12-15 09:46:00.070 286348 DEBUG nova.compute.resource_tracker [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Instance 39ff1bd9-6f6b-44c8-bbec-a1fd9d196359 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 2, 'MEMORY_MB': 512, 'VCPU': 1}}. 
_remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m Dec 15 04:46:00 localhost nova_compute[286344]: 2025-12-15 09:46:00.070 286348 DEBUG nova.compute.resource_tracker [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m Dec 15 04:46:00 localhost nova_compute[286344]: 2025-12-15 09:46:00.071 286348 DEBUG nova.compute.resource_tracker [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Final resource view: name=np0005559462.localdomain phys_ram=15738MB used_ram=1024MB phys_disk=41GB used_disk=2GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m Dec 15 04:46:00 localhost nova_compute[286344]: 2025-12-15 09:46:00.110 286348 DEBUG oslo_concurrency.processutils [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m Dec 15 04:46:00 localhost nova_compute[286344]: 2025-12-15 09:46:00.573 286348 DEBUG oslo_concurrency.processutils [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.463s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m Dec 15 04:46:00 localhost nova_compute[286344]: 2025-12-15 09:46:00.580 286348 DEBUG nova.compute.provider_tree [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Inventory has not changed in ProviderTree for provider: 26c8956b-6742-4951-b566-971b9bbe323b update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m Dec 15 04:46:00 localhost nova_compute[286344]: 
2025-12-15 09:46:00.596 286348 DEBUG nova.scheduler.client.report [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Inventory has not changed for provider 26c8956b-6742-4951-b566-971b9bbe323b based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m Dec 15 04:46:00 localhost nova_compute[286344]: 2025-12-15 09:46:00.621 286348 DEBUG nova.compute.resource_tracker [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Compute_service record updated for np0005559462.localdomain:np0005559462.localdomain _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m Dec 15 04:46:00 localhost nova_compute[286344]: 2025-12-15 09:46:00.622 286348 DEBUG oslo_concurrency.lockutils [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.836s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Dec 15 04:46:01 localhost nova_compute[286344]: 2025-12-15 09:46:01.826 286348 DEBUG nova.compute.manager [None req-f3672543-d91e-41ff-9a9d-2d05687a3760 1ba5fce347b64bfebf995f187193f205 c785bf23f53946bc99867d8832a50266 - - default default] [instance: 39ff1bd9-6f6b-44c8-bbec-a1fd9d196359] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m Dec 15 04:46:01 localhost nova_compute[286344]: 2025-12-15 09:46:01.847 286348 ERROR oslo_messaging.rpc.server [None req-f3672543-d91e-41ff-9a9d-2d05687a3760 
1ba5fce347b64bfebf995f187193f205 c785bf23f53946bc99867d8832a50266 - - default default] Exception during message handling: nova.exception.InstanceInvalidState: Instance 39ff1bd9-6f6b-44c8-bbec-a1fd9d196359 in power state shutdown. Cannot get_diagnostics while the instance is in this state. Dec 15 04:46:01 localhost nova_compute[286344]: 2025-12-15 09:46:01.847 286348 ERROR oslo_messaging.rpc.server Traceback (most recent call last): Dec 15 04:46:01 localhost nova_compute[286344]: 2025-12-15 09:46:01.847 286348 ERROR oslo_messaging.rpc.server File "/usr/lib/python3.9/site-packages/oslo_messaging/rpc/server.py", line 165, in _process_incoming Dec 15 04:46:01 localhost nova_compute[286344]: 2025-12-15 09:46:01.847 286348 ERROR oslo_messaging.rpc.server res = self.dispatcher.dispatch(message) Dec 15 04:46:01 localhost nova_compute[286344]: 2025-12-15 09:46:01.847 286348 ERROR oslo_messaging.rpc.server File "/usr/lib/python3.9/site-packages/oslo_messaging/rpc/dispatcher.py", line 309, in dispatch Dec 15 04:46:01 localhost nova_compute[286344]: 2025-12-15 09:46:01.847 286348 ERROR oslo_messaging.rpc.server return self._do_dispatch(endpoint, method, ctxt, args) Dec 15 04:46:01 localhost nova_compute[286344]: 2025-12-15 09:46:01.847 286348 ERROR oslo_messaging.rpc.server File "/usr/lib/python3.9/site-packages/oslo_messaging/rpc/dispatcher.py", line 229, in _do_dispatch Dec 15 04:46:01 localhost nova_compute[286344]: 2025-12-15 09:46:01.847 286348 ERROR oslo_messaging.rpc.server result = func(ctxt, **new_args) Dec 15 04:46:01 localhost nova_compute[286344]: 2025-12-15 09:46:01.847 286348 ERROR oslo_messaging.rpc.server File "/usr/lib/python3.9/site-packages/nova/exception_wrapper.py", line 71, in wrapped Dec 15 04:46:01 localhost nova_compute[286344]: 2025-12-15 09:46:01.847 286348 ERROR oslo_messaging.rpc.server _emit_versioned_exception_notification( Dec 15 04:46:01 localhost nova_compute[286344]: 2025-12-15 09:46:01.847 286348 ERROR oslo_messaging.rpc.server File 
"/usr/lib/python3.9/site-packages/oslo_utils/excutils.py", line 227, in __exit__ Dec 15 04:46:01 localhost nova_compute[286344]: 2025-12-15 09:46:01.847 286348 ERROR oslo_messaging.rpc.server self.force_reraise() Dec 15 04:46:01 localhost nova_compute[286344]: 2025-12-15 09:46:01.847 286348 ERROR oslo_messaging.rpc.server File "/usr/lib/python3.9/site-packages/oslo_utils/excutils.py", line 200, in force_reraise Dec 15 04:46:01 localhost nova_compute[286344]: 2025-12-15 09:46:01.847 286348 ERROR oslo_messaging.rpc.server raise self.value Dec 15 04:46:01 localhost nova_compute[286344]: 2025-12-15 09:46:01.847 286348 ERROR oslo_messaging.rpc.server File "/usr/lib/python3.9/site-packages/nova/exception_wrapper.py", line 63, in wrapped Dec 15 04:46:01 localhost nova_compute[286344]: 2025-12-15 09:46:01.847 286348 ERROR oslo_messaging.rpc.server return f(self, context, *args, **kw) Dec 15 04:46:01 localhost nova_compute[286344]: 2025-12-15 09:46:01.847 286348 ERROR oslo_messaging.rpc.server File "/usr/lib/python3.9/site-packages/nova/compute/manager.py", line 214, in decorated_function Dec 15 04:46:01 localhost nova_compute[286344]: 2025-12-15 09:46:01.847 286348 ERROR oslo_messaging.rpc.server compute_utils.add_instance_fault_from_exc(context, Dec 15 04:46:01 localhost nova_compute[286344]: 2025-12-15 09:46:01.847 286348 ERROR oslo_messaging.rpc.server File "/usr/lib/python3.9/site-packages/oslo_utils/excutils.py", line 227, in __exit__ Dec 15 04:46:01 localhost nova_compute[286344]: 2025-12-15 09:46:01.847 286348 ERROR oslo_messaging.rpc.server self.force_reraise() Dec 15 04:46:01 localhost nova_compute[286344]: 2025-12-15 09:46:01.847 286348 ERROR oslo_messaging.rpc.server File "/usr/lib/python3.9/site-packages/oslo_utils/excutils.py", line 200, in force_reraise Dec 15 04:46:01 localhost nova_compute[286344]: 2025-12-15 09:46:01.847 286348 ERROR oslo_messaging.rpc.server raise self.value Dec 15 04:46:01 localhost nova_compute[286344]: 2025-12-15 09:46:01.847 286348 
ERROR oslo_messaging.rpc.server File "/usr/lib/python3.9/site-packages/nova/compute/manager.py", line 203, in decorated_function Dec 15 04:46:01 localhost nova_compute[286344]: 2025-12-15 09:46:01.847 286348 ERROR oslo_messaging.rpc.server return function(self, context, *args, **kwargs) Dec 15 04:46:01 localhost nova_compute[286344]: 2025-12-15 09:46:01.847 286348 ERROR oslo_messaging.rpc.server File "/usr/lib/python3.9/site-packages/nova/compute/manager.py", line 6739, in get_instance_diagnostics Dec 15 04:46:01 localhost nova_compute[286344]: 2025-12-15 09:46:01.847 286348 ERROR oslo_messaging.rpc.server raise exception.InstanceInvalidState( Dec 15 04:46:01 localhost nova_compute[286344]: 2025-12-15 09:46:01.847 286348 ERROR oslo_messaging.rpc.server nova.exception.InstanceInvalidState: Instance 39ff1bd9-6f6b-44c8-bbec-a1fd9d196359 in power state shutdown. Cannot get_diagnostics while the instance is in this state. Dec 15 04:46:01 localhost nova_compute[286344]: 2025-12-15 09:46:01.847 286348 ERROR oslo_messaging.rpc.server #033[00m Dec 15 04:46:01 localhost podman[243449]: time="2025-12-15T09:46:01Z" level=info msg="List containers: received `last` parameter - overwriting `limit`" Dec 15 04:46:01 localhost podman[243449]: @ - - [15/Dec/2025:09:46:01 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 147078 "" "Go-http-client/1.1" Dec 15 04:46:01 localhost podman[243449]: @ - - [15/Dec/2025:09:46:01 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 16745 "" "Go-http-client/1.1" Dec 15 04:46:01 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:fc:1f:13 MACDST=fa:16:3e:b3:bb:e3 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=42875 DF PROTO=TCP SPT=59076 DPT=9102 SEQ=2096573310 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A83A958A60000000001030307) Dec 15 04:46:03 localhost 
nova_compute[286344]: 2025-12-15 09:46:03.545 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 04:46:03 localhost nova_compute[286344]: 2025-12-15 09:46:03.780 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 04:46:04 localhost openstack_network_exporter[246484]: ERROR 09:46:04 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server Dec 15 04:46:04 localhost openstack_network_exporter[246484]: ERROR 09:46:04 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd Dec 15 04:46:04 localhost openstack_network_exporter[246484]: ERROR 09:46:04 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd Dec 15 04:46:04 localhost openstack_network_exporter[246484]: ERROR 09:46:04 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath Dec 15 04:46:04 localhost openstack_network_exporter[246484]: Dec 15 04:46:04 localhost openstack_network_exporter[246484]: ERROR 09:46:04 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath Dec 15 04:46:04 localhost openstack_network_exporter[246484]: Dec 15 04:46:07 localhost nova_compute[286344]: 2025-12-15 09:46:07.881 286348 DEBUG nova.objects.instance [None req-be2ed574-8520-4b6c-9168-297eba4d6469 1ba5fce347b64bfebf995f187193f205 c785bf23f53946bc99867d8832a50266 - - default default] Lazy-loading 'flavor' on Instance uuid 39ff1bd9-6f6b-44c8-bbec-a1fd9d196359 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m Dec 15 04:46:07 localhost nova_compute[286344]: 2025-12-15 09:46:07.901 286348 DEBUG oslo_concurrency.lockutils [None req-be2ed574-8520-4b6c-9168-297eba4d6469 1ba5fce347b64bfebf995f187193f205 c785bf23f53946bc99867d8832a50266 - - 
default default] Acquiring lock "refresh_cache-39ff1bd9-6f6b-44c8-bbec-a1fd9d196359" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m Dec 15 04:46:07 localhost nova_compute[286344]: 2025-12-15 09:46:07.902 286348 DEBUG oslo_concurrency.lockutils [None req-be2ed574-8520-4b6c-9168-297eba4d6469 1ba5fce347b64bfebf995f187193f205 c785bf23f53946bc99867d8832a50266 - - default default] Acquired lock "refresh_cache-39ff1bd9-6f6b-44c8-bbec-a1fd9d196359" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m Dec 15 04:46:07 localhost nova_compute[286344]: 2025-12-15 09:46:07.902 286348 DEBUG nova.network.neutron [None req-be2ed574-8520-4b6c-9168-297eba4d6469 1ba5fce347b64bfebf995f187193f205 c785bf23f53946bc99867d8832a50266 - - default default] [instance: 39ff1bd9-6f6b-44c8-bbec-a1fd9d196359] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m Dec 15 04:46:07 localhost nova_compute[286344]: 2025-12-15 09:46:07.903 286348 DEBUG nova.objects.instance [None req-be2ed574-8520-4b6c-9168-297eba4d6469 1ba5fce347b64bfebf995f187193f205 c785bf23f53946bc99867d8832a50266 - - default default] Lazy-loading 'info_cache' on Instance uuid 39ff1bd9-6f6b-44c8-bbec-a1fd9d196359 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m Dec 15 04:46:08 localhost nova_compute[286344]: 2025-12-15 09:46:08.589 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 04:46:08 localhost systemd[1]: Started /usr/bin/podman healthcheck run 67ee72fcd99c896f79e8807764b1d0f04e7d45927d06e306c0986394e8ed42a0. Dec 15 04:46:08 localhost systemd[1]: Started /usr/bin/podman healthcheck run 9f0f481c937606298203797a71924b7614a5d9e3f4a8afd837cfa5b9602aedeb. 
Dec 15 04:46:08 localhost nova_compute[286344]: 2025-12-15 09:46:08.663 286348 DEBUG nova.network.neutron [None req-be2ed574-8520-4b6c-9168-297eba4d6469 1ba5fce347b64bfebf995f187193f205 c785bf23f53946bc99867d8832a50266 - - default default] [instance: 39ff1bd9-6f6b-44c8-bbec-a1fd9d196359] Updating instance_info_cache with network_info: [{"id": "03ef8889-3216-43fb-8a52-4be17a956ce1", "address": "fa:16:3e:74:df:7c", "network": {"id": "befb7a72-17a9-4bcb-b561-84b8f626685a", "bridge": "br-int", "label": "private", "subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.201", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.20", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.0.3"}}], "meta": {"injected": false, "tenant_id": "c785bf23f53946bc99867d8832a50266", "mtu": 1292, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap03ef8889-32", "ovs_interfaceid": "03ef8889-3216-43fb-8a52-4be17a956ce1", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m Dec 15 04:46:08 localhost systemd[1]: Started /usr/bin/podman healthcheck run b3d8483ade539500e228c2af9c0c3e647febb5ec7903610a83aea32631007d4a. 
Dec 15 04:46:08 localhost nova_compute[286344]: 2025-12-15 09:46:08.680 286348 DEBUG oslo_concurrency.lockutils [None req-be2ed574-8520-4b6c-9168-297eba4d6469 1ba5fce347b64bfebf995f187193f205 c785bf23f53946bc99867d8832a50266 - - default default] Releasing lock "refresh_cache-39ff1bd9-6f6b-44c8-bbec-a1fd9d196359" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m Dec 15 04:46:08 localhost nova_compute[286344]: 2025-12-15 09:46:08.707 286348 INFO nova.virt.libvirt.driver [-] [instance: 39ff1bd9-6f6b-44c8-bbec-a1fd9d196359] Instance destroyed successfully.#033[00m Dec 15 04:46:08 localhost nova_compute[286344]: 2025-12-15 09:46:08.708 286348 DEBUG nova.objects.instance [None req-be2ed574-8520-4b6c-9168-297eba4d6469 1ba5fce347b64bfebf995f187193f205 c785bf23f53946bc99867d8832a50266 - - default default] Lazy-loading 'numa_topology' on Instance uuid 39ff1bd9-6f6b-44c8-bbec-a1fd9d196359 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m Dec 15 04:46:08 localhost nova_compute[286344]: 2025-12-15 09:46:08.722 286348 DEBUG nova.objects.instance [None req-be2ed574-8520-4b6c-9168-297eba4d6469 1ba5fce347b64bfebf995f187193f205 c785bf23f53946bc99867d8832a50266 - - default default] Lazy-loading 'resources' on Instance uuid 39ff1bd9-6f6b-44c8-bbec-a1fd9d196359 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m Dec 15 04:46:08 localhost podman[287244]: 2025-12-15 09:46:08.736303267 +0000 UTC m=+0.064047877 container health_status 9f0f481c937606298203797a71924b7614a5d9e3f4a8afd837cfa5b9602aedeb (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 
'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, container_name=multipathd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0) Dec 15 04:46:08 localhost nova_compute[286344]: 2025-12-15 09:46:08.741 286348 DEBUG nova.virt.libvirt.vif [None req-be2ed574-8520-4b6c-9168-297eba4d6469 1ba5fce347b64bfebf995f187193f205 c785bf23f53946bc99867d8832a50266 - - default default] vif_type=ovs 
instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-12-15T08:29:49Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=,disable_terminate=False,display_description='test',display_name='test',ec2_ids=,ephemeral_gb=1,ephemeral_key_uuid=None,fault=,flavor=Flavor(2),hidden=False,host='np0005559462.localdomain',hostname='test',id=2,image_ref='7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e',info_cache=InstanceInfoCache,instance_type_id=2,kernel_id='',key_data=None,key_name=None,keypairs=,launch_index=0,launched_at=2025-12-15T08:30:01Z,launched_on='np0005559462.localdomain',locked=False,locked_by=None,memory_mb=512,metadata={},migration_context=,new_flavor=None,node='np0005559462.localdomain',numa_topology=None,old_flavor=None,os_type=None,pci_devices=,pci_requests=,power_state=4,progress=0,project_id='c785bf23f53946bc99867d8832a50266',ramdisk_id='',reservation_id='r-e1tbwc05',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=,services=,shutdown_terminate=False,system_metadata={boot_roles='admin,member,reader',image_base_image_ref='7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='pc-q35-rhel9.0.0',image_hw_pointer_model='usbtablet',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/cirros',image_owner_specified.openstack.sha256='',owner_project_name='admin',owner_user_name='admin'},tags=,task_state='powering-on',terminated_at=None,trusted_certs=,updated_at=2025-12-15T09:45:41Z,user_data=None,user_id='1ba5fce347b64bfebf995f187193f205',uuid=39ff1bd9-6f6b-44c8-bbec-a1fd9d196359,vcpu_model=,vcpus=1,vm_mode=None,vm_state='
stopped') vif={"id": "03ef8889-3216-43fb-8a52-4be17a956ce1", "address": "fa:16:3e:74:df:7c", "network": {"id": "befb7a72-17a9-4bcb-b561-84b8f626685a", "bridge": "br-int", "label": "private", "subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.201", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.20", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.0.3"}}], "meta": {"injected": false, "tenant_id": "c785bf23f53946bc99867d8832a50266", "mtu": 1292, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap03ef8889-32", "ovs_interfaceid": "03ef8889-3216-43fb-8a52-4be17a956ce1", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m Dec 15 04:46:08 localhost nova_compute[286344]: 2025-12-15 09:46:08.741 286348 DEBUG nova.network.os_vif_util [None req-be2ed574-8520-4b6c-9168-297eba4d6469 1ba5fce347b64bfebf995f187193f205 c785bf23f53946bc99867d8832a50266 - - default default] Converting VIF {"id": "03ef8889-3216-43fb-8a52-4be17a956ce1", "address": "fa:16:3e:74:df:7c", "network": {"id": "befb7a72-17a9-4bcb-b561-84b8f626685a", "bridge": "br-int", "label": "private", "subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.201", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.20", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, 
"dhcp_server": "192.168.0.3"}}], "meta": {"injected": false, "tenant_id": "c785bf23f53946bc99867d8832a50266", "mtu": 1292, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap03ef8889-32", "ovs_interfaceid": "03ef8889-3216-43fb-8a52-4be17a956ce1", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m Dec 15 04:46:08 localhost nova_compute[286344]: 2025-12-15 09:46:08.742 286348 DEBUG nova.network.os_vif_util [None req-be2ed574-8520-4b6c-9168-297eba4d6469 1ba5fce347b64bfebf995f187193f205 c785bf23f53946bc99867d8832a50266 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:74:df:7c,bridge_name='br-int',has_traffic_filtering=True,id=03ef8889-3216-43fb-8a52-4be17a956ce1,network=Network(befb7a72-17a9-4bcb-b561-84b8f626685a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap03ef8889-32') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m Dec 15 04:46:08 localhost nova_compute[286344]: 2025-12-15 09:46:08.743 286348 DEBUG os_vif [None req-be2ed574-8520-4b6c-9168-297eba4d6469 1ba5fce347b64bfebf995f187193f205 c785bf23f53946bc99867d8832a50266 - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:74:df:7c,bridge_name='br-int',has_traffic_filtering=True,id=03ef8889-3216-43fb-8a52-4be17a956ce1,network=Network(befb7a72-17a9-4bcb-b561-84b8f626685a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap03ef8889-32') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m Dec 15 04:46:08 localhost nova_compute[286344]: 2025-12-15 09:46:08.746 286348 DEBUG 
ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 04:46:08 localhost nova_compute[286344]: 2025-12-15 09:46:08.746 286348 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap03ef8889-32, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m Dec 15 04:46:08 localhost nova_compute[286344]: 2025-12-15 09:46:08.748 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 04:46:08 localhost nova_compute[286344]: 2025-12-15 09:46:08.750 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 04:46:08 localhost nova_compute[286344]: 2025-12-15 09:46:08.753 286348 INFO os_vif [None req-be2ed574-8520-4b6c-9168-297eba4d6469 1ba5fce347b64bfebf995f187193f205 c785bf23f53946bc99867d8832a50266 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:74:df:7c,bridge_name='br-int',has_traffic_filtering=True,id=03ef8889-3216-43fb-8a52-4be17a956ce1,network=Network(befb7a72-17a9-4bcb-b561-84b8f626685a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap03ef8889-32')#033[00m Dec 15 04:46:08 localhost nova_compute[286344]: 2025-12-15 09:46:08.756 286348 DEBUG nova.virt.libvirt.host [None req-be2ed574-8520-4b6c-9168-297eba4d6469 1ba5fce347b64bfebf995f187193f205 c785bf23f53946bc99867d8832a50266 - - default default] Checking UEFI support for host arch (x86_64) supports_uefi /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1754#033[00m Dec 15 04:46:08 localhost nova_compute[286344]: 2025-12-15 09:46:08.756 286348 INFO nova.virt.libvirt.host [None req-be2ed574-8520-4b6c-9168-297eba4d6469 
1ba5fce347b64bfebf995f187193f205 c785bf23f53946bc99867d8832a50266 - - default default] UEFI support detected#033[00m Dec 15 04:46:08 localhost nova_compute[286344]: 2025-12-15 09:46:08.764 286348 DEBUG nova.virt.libvirt.driver [None req-be2ed574-8520-4b6c-9168-297eba4d6469 1ba5fce347b64bfebf995f187193f205 c785bf23f53946bc99867d8832a50266 - - default default] [instance: 39ff1bd9-6f6b-44c8-bbec-a1fd9d196359] Start _get_guest_xml network_info=[{"id": "03ef8889-3216-43fb-8a52-4be17a956ce1", "address": "fa:16:3e:74:df:7c", "network": {"id": "befb7a72-17a9-4bcb-b561-84b8f626685a", "bridge": "br-int", "label": "private", "subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.201", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.20", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.0.3"}}], "meta": {"injected": false, "tenant_id": "c785bf23f53946bc99867d8832a50266", "mtu": 1292, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap03ef8889-32", "ovs_interfaceid": "03ef8889-3216-43fb-8a52-4be17a956ce1", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.eph0': {'bus': 'virtio', 'dev': 'vdb', 'type': 'disk'}}} 
image_meta=ImageMeta(checksum=,container_format='bare',created_at=,direct_url=,disk_format='qcow2',id=7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e,min_disk=1,min_ram=0,name=,owner=,properties=ImageMetaProps,protected=,size=,status=,tags=,updated_at=,virtual_size=,visibility=) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encryption_secret_uuid': None, 'size': 0, 'device_name': '/dev/vda', 'encrypted': False, 'device_type': 'disk', 'encryption_format': None, 'boot_index': 0, 'guest_format': None, 'encryption_options': None, 'disk_bus': 'virtio', 'image_id': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e'}], 'ephemerals': [{'encryption_secret_uuid': None, 'size': 1, 'device_name': '/dev/vdb', 'encrypted': False, 'device_type': 'disk', 'encryption_format': None, 'guest_format': None, 'encryption_options': None, 'disk_bus': 'virtio'}], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m Dec 15 04:46:08 localhost podman[287243]: 2025-12-15 09:46:08.764658142 +0000 UTC m=+0.091003153 container health_status 67ee72fcd99c896f79e8807764b1d0f04e7d45927d06e306c0986394e8ed42a0 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, maintainer=The Prometheus Authors , managed_by=edpm_ansible, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': 
'/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter) Dec 15 04:46:08 localhost nova_compute[286344]: 2025-12-15 09:46:08.769 286348 WARNING nova.virt.libvirt.driver [None req-be2ed574-8520-4b6c-9168-297eba4d6469 1ba5fce347b64bfebf995f187193f205 c785bf23f53946bc99867d8832a50266 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m Dec 15 04:46:08 localhost nova_compute[286344]: 2025-12-15 09:46:08.771 286348 DEBUG nova.virt.libvirt.host [None req-be2ed574-8520-4b6c-9168-297eba4d6469 1ba5fce347b64bfebf995f187193f205 c785bf23f53946bc99867d8832a50266 - - default default] Searching host: 'np0005559462.localdomain' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m Dec 15 04:46:08 localhost nova_compute[286344]: 2025-12-15 09:46:08.772 286348 DEBUG nova.virt.libvirt.host [None req-be2ed574-8520-4b6c-9168-297eba4d6469 1ba5fce347b64bfebf995f187193f205 c785bf23f53946bc99867d8832a50266 - - default default] CPU controller missing on host. 
_has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m Dec 15 04:46:08 localhost nova_compute[286344]: 2025-12-15 09:46:08.774 286348 DEBUG nova.virt.libvirt.host [None req-be2ed574-8520-4b6c-9168-297eba4d6469 1ba5fce347b64bfebf995f187193f205 c785bf23f53946bc99867d8832a50266 - - default default] Searching host: 'np0005559462.localdomain' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m Dec 15 04:46:08 localhost nova_compute[286344]: 2025-12-15 09:46:08.775 286348 DEBUG nova.virt.libvirt.host [None req-be2ed574-8520-4b6c-9168-297eba4d6469 1ba5fce347b64bfebf995f187193f205 c785bf23f53946bc99867d8832a50266 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m Dec 15 04:46:08 localhost nova_compute[286344]: 2025-12-15 09:46:08.775 286348 DEBUG nova.virt.libvirt.driver [None req-be2ed574-8520-4b6c-9168-297eba4d6469 1ba5fce347b64bfebf995f187193f205 c785bf23f53946bc99867d8832a50266 - - default default] CPU mode 'host-model' models '' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m Dec 15 04:46:08 localhost nova_compute[286344]: 2025-12-15 09:46:08.776 286348 DEBUG nova.virt.hardware [None req-be2ed574-8520-4b6c-9168-297eba4d6469 1ba5fce347b64bfebf995f187193f205 c785bf23f53946bc99867d8832a50266 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-12-15T08:28:54Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=1,extra_specs={},flavorid='2da0e147-aaa7-4bb9-a176-5fe1b15a32a0',id=2,is_public=True,memory_mb=512,name='m1.small',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta 
ImageMeta(checksum=,container_format='bare',created_at=,direct_url=,disk_format='qcow2',id=7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e,min_disk=1,min_ram=0,name=,owner=,properties=ImageMetaProps,protected=,size=,status=,tags=,updated_at=,virtual_size=,visibility=), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m Dec 15 04:46:08 localhost nova_compute[286344]: 2025-12-15 09:46:08.777 286348 DEBUG nova.virt.hardware [None req-be2ed574-8520-4b6c-9168-297eba4d6469 1ba5fce347b64bfebf995f187193f205 c785bf23f53946bc99867d8832a50266 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m Dec 15 04:46:08 localhost nova_compute[286344]: 2025-12-15 09:46:08.777 286348 DEBUG nova.virt.hardware [None req-be2ed574-8520-4b6c-9168-297eba4d6469 1ba5fce347b64bfebf995f187193f205 c785bf23f53946bc99867d8832a50266 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m Dec 15 04:46:08 localhost nova_compute[286344]: 2025-12-15 09:46:08.777 286348 DEBUG nova.virt.hardware [None req-be2ed574-8520-4b6c-9168-297eba4d6469 1ba5fce347b64bfebf995f187193f205 c785bf23f53946bc99867d8832a50266 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m Dec 15 04:46:08 localhost nova_compute[286344]: 2025-12-15 09:46:08.778 286348 DEBUG nova.virt.hardware [None req-be2ed574-8520-4b6c-9168-297eba4d6469 1ba5fce347b64bfebf995f187193f205 c785bf23f53946bc99867d8832a50266 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m Dec 15 04:46:08 localhost nova_compute[286344]: 2025-12-15 09:46:08.778 286348 DEBUG nova.virt.hardware [None req-be2ed574-8520-4b6c-9168-297eba4d6469 1ba5fce347b64bfebf995f187193f205 
c785bf23f53946bc99867d8832a50266 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m Dec 15 04:46:08 localhost nova_compute[286344]: 2025-12-15 09:46:08.779 286348 DEBUG nova.virt.hardware [None req-be2ed574-8520-4b6c-9168-297eba4d6469 1ba5fce347b64bfebf995f187193f205 c785bf23f53946bc99867d8832a50266 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m Dec 15 04:46:08 localhost nova_compute[286344]: 2025-12-15 09:46:08.779 286348 DEBUG nova.virt.hardware [None req-be2ed574-8520-4b6c-9168-297eba4d6469 1ba5fce347b64bfebf995f187193f205 c785bf23f53946bc99867d8832a50266 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m Dec 15 04:46:08 localhost nova_compute[286344]: 2025-12-15 09:46:08.779 286348 DEBUG nova.virt.hardware [None req-be2ed574-8520-4b6c-9168-297eba4d6469 1ba5fce347b64bfebf995f187193f205 c785bf23f53946bc99867d8832a50266 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m Dec 15 04:46:08 localhost nova_compute[286344]: 2025-12-15 09:46:08.780 286348 DEBUG nova.virt.hardware [None req-be2ed574-8520-4b6c-9168-297eba4d6469 1ba5fce347b64bfebf995f187193f205 c785bf23f53946bc99867d8832a50266 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m Dec 15 04:46:08 localhost nova_compute[286344]: 2025-12-15 09:46:08.780 286348 DEBUG nova.virt.hardware [None req-be2ed574-8520-4b6c-9168-297eba4d6469 
1ba5fce347b64bfebf995f187193f205 c785bf23f53946bc99867d8832a50266 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m Dec 15 04:46:08 localhost nova_compute[286344]: 2025-12-15 09:46:08.781 286348 DEBUG nova.objects.instance [None req-be2ed574-8520-4b6c-9168-297eba4d6469 1ba5fce347b64bfebf995f187193f205 c785bf23f53946bc99867d8832a50266 - - default default] Lazy-loading 'vcpu_model' on Instance uuid 39ff1bd9-6f6b-44c8-bbec-a1fd9d196359 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m Dec 15 04:46:08 localhost nova_compute[286344]: 2025-12-15 09:46:08.782 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 04:46:08 localhost nova_compute[286344]: 2025-12-15 09:46:08.801 286348 DEBUG nova.privsep.utils [None req-be2ed574-8520-4b6c-9168-297eba4d6469 1ba5fce347b64bfebf995f187193f205 c785bf23f53946bc99867d8832a50266 - - default default] Path '/var/lib/nova/instances' supports direct I/O supports_direct_io /usr/lib/python3.9/site-packages/nova/privsep/utils.py:63#033[00m Dec 15 04:46:08 localhost nova_compute[286344]: 2025-12-15 09:46:08.801 286348 DEBUG oslo_concurrency.processutils [None req-be2ed574-8520-4b6c-9168-297eba4d6469 1ba5fce347b64bfebf995f187193f205 c785bf23f53946bc99867d8832a50266 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m Dec 15 04:46:08 localhost podman[287243]: 2025-12-15 09:46:08.808429228 +0000 UTC m=+0.134774199 container exec_died 67ee72fcd99c896f79e8807764b1d0f04e7d45927d06e306c0986394e8ed42a0 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, 
config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible) Dec 15 04:46:08 localhost podman[287245]: 2025-12-15 09:46:08.82061861 +0000 UTC m=+0.143343599 container health_status b3d8483ade539500e228c2af9c0c3e647febb5ec7903610a83aea32631007d4a (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ceilometer_agent_compute, tcib_managed=true, container_name=ceilometer_agent_compute, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'command': 'kolla_start', 'environment': 
{'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '07f3af15d79276418f54a8dde2fc251064527c3571430e14af77aaa2c01f1193-cb1e9478634a00c8fb5ad9cd7fcd38c7b2a1a85187dfaa618bc715c24c2b15fb'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.vendor=CentOS) Dec 15 04:46:08 localhost systemd[1]: 67ee72fcd99c896f79e8807764b1d0f04e7d45927d06e306c0986394e8ed42a0.service: Deactivated successfully. 
Dec 15 04:46:08 localhost podman[287245]: 2025-12-15 09:46:08.859393877 +0000 UTC m=+0.182118836 container exec_died b3d8483ade539500e228c2af9c0c3e647febb5ec7903610a83aea32631007d4a (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '07f3af15d79276418f54a8dde2fc251064527c3571430e14af77aaa2c01f1193-cb1e9478634a00c8fb5ad9cd7fcd38c7b2a1a85187dfaa618bc715c24c2b15fb'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, config_id=ceilometer_agent_compute, io.buildah.version=1.41.3, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0) Dec 
15 04:46:08 localhost podman[287244]: 2025-12-15 09:46:08.869325785 +0000 UTC m=+0.197070455 container exec_died 9f0f481c937606298203797a71924b7614a5d9e3f4a8afd837cfa5b9602aedeb (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, container_name=multipathd, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.schema-version=1.0, managed_by=edpm_ansible) Dec 15 04:46:08 localhost systemd[1]: b3d8483ade539500e228c2af9c0c3e647febb5ec7903610a83aea32631007d4a.service: Deactivated successfully. 
Dec 15 04:46:08 localhost systemd[1]: 9f0f481c937606298203797a71924b7614a5d9e3f4a8afd837cfa5b9602aedeb.service: Deactivated successfully. Dec 15 04:46:09 localhost nova_compute[286344]: 2025-12-15 09:46:09.243 286348 DEBUG oslo_concurrency.processutils [None req-be2ed574-8520-4b6c-9168-297eba4d6469 1ba5fce347b64bfebf995f187193f205 c785bf23f53946bc99867d8832a50266 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.442s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m Dec 15 04:46:09 localhost nova_compute[286344]: 2025-12-15 09:46:09.246 286348 DEBUG oslo_concurrency.processutils [None req-be2ed574-8520-4b6c-9168-297eba4d6469 1ba5fce347b64bfebf995f187193f205 c785bf23f53946bc99867d8832a50266 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m Dec 15 04:46:09 localhost nova_compute[286344]: 2025-12-15 09:46:09.660 286348 DEBUG oslo_concurrency.processutils [None req-be2ed574-8520-4b6c-9168-297eba4d6469 1ba5fce347b64bfebf995f187193f205 c785bf23f53946bc99867d8832a50266 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.414s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m Dec 15 04:46:09 localhost nova_compute[286344]: 2025-12-15 09:46:09.662 286348 DEBUG nova.virt.libvirt.vif [None req-be2ed574-8520-4b6c-9168-297eba4d6469 1ba5fce347b64bfebf995f187193f205 c785bf23f53946bc99867d8832a50266 - - default default] vif_type=ovs 
instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-12-15T08:29:49Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=,disable_terminate=False,display_description='test',display_name='test',ec2_ids=,ephemeral_gb=1,ephemeral_key_uuid=None,fault=,flavor=Flavor(2),hidden=False,host='np0005559462.localdomain',hostname='test',id=2,image_ref='7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e',info_cache=InstanceInfoCache,instance_type_id=2,kernel_id='',key_data=None,key_name=None,keypairs=,launch_index=0,launched_at=2025-12-15T08:30:01Z,launched_on='np0005559462.localdomain',locked=False,locked_by=None,memory_mb=512,metadata={},migration_context=,new_flavor=None,node='np0005559462.localdomain',numa_topology=None,old_flavor=None,os_type=None,pci_devices=,pci_requests=,power_state=4,progress=0,project_id='c785bf23f53946bc99867d8832a50266',ramdisk_id='',reservation_id='r-e1tbwc05',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=,services=,shutdown_terminate=False,system_metadata={boot_roles='admin,member,reader',image_base_image_ref='7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='pc-q35-rhel9.0.0',image_hw_pointer_model='usbtablet',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/cirros',image_owner_specified.openstack.sha256='',owner_project_name='admin',owner_user_name='admin'},tags=,task_state='powering-on',terminated_at=None,trusted_certs=,updated_at=2025-12-15T09:45:41Z,user_data=None,user_id='1ba5fce347b64bfebf995f187193f205',uuid=39ff1bd9-6f6b-44c8-bbec-a1fd9d196359,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=Non
e,vm_state='stopped') vif={"id": "03ef8889-3216-43fb-8a52-4be17a956ce1", "address": "fa:16:3e:74:df:7c", "network": {"id": "befb7a72-17a9-4bcb-b561-84b8f626685a", "bridge": "br-int", "label": "private", "subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.201", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.20", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.0.3"}}], "meta": {"injected": false, "tenant_id": "c785bf23f53946bc99867d8832a50266", "mtu": 1292, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap03ef8889-32", "ovs_interfaceid": "03ef8889-3216-43fb-8a52-4be17a956ce1", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m Dec 15 04:46:09 localhost nova_compute[286344]: 2025-12-15 09:46:09.662 286348 DEBUG nova.network.os_vif_util [None req-be2ed574-8520-4b6c-9168-297eba4d6469 1ba5fce347b64bfebf995f187193f205 c785bf23f53946bc99867d8832a50266 - - default default] Converting VIF {"id": "03ef8889-3216-43fb-8a52-4be17a956ce1", "address": "fa:16:3e:74:df:7c", "network": {"id": "befb7a72-17a9-4bcb-b561-84b8f626685a", "bridge": "br-int", "label": "private", "subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.201", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.20", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": 
{"enable_dhcp": true, "dhcp_server": "192.168.0.3"}}], "meta": {"injected": false, "tenant_id": "c785bf23f53946bc99867d8832a50266", "mtu": 1292, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap03ef8889-32", "ovs_interfaceid": "03ef8889-3216-43fb-8a52-4be17a956ce1", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m Dec 15 04:46:09 localhost nova_compute[286344]: 2025-12-15 09:46:09.663 286348 DEBUG nova.network.os_vif_util [None req-be2ed574-8520-4b6c-9168-297eba4d6469 1ba5fce347b64bfebf995f187193f205 c785bf23f53946bc99867d8832a50266 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:74:df:7c,bridge_name='br-int',has_traffic_filtering=True,id=03ef8889-3216-43fb-8a52-4be17a956ce1,network=Network(befb7a72-17a9-4bcb-b561-84b8f626685a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap03ef8889-32') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m Dec 15 04:46:09 localhost nova_compute[286344]: 2025-12-15 09:46:09.664 286348 DEBUG nova.objects.instance [None req-be2ed574-8520-4b6c-9168-297eba4d6469 1ba5fce347b64bfebf995f187193f205 c785bf23f53946bc99867d8832a50266 - - default default] Lazy-loading 'pci_devices' on Instance uuid 39ff1bd9-6f6b-44c8-bbec-a1fd9d196359 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m Dec 15 04:46:09 localhost nova_compute[286344]: 2025-12-15 09:46:09.677 286348 DEBUG nova.virt.libvirt.driver [None req-be2ed574-8520-4b6c-9168-297eba4d6469 1ba5fce347b64bfebf995f187193f205 c785bf23f53946bc99867d8832a50266 - - default default] [instance: 
39ff1bd9-6f6b-44c8-bbec-a1fd9d196359] End _get_guest_xml xml= [guest domain XML omitted: the libvirt XML markup was stripped during log extraction, leaving only bare nova_compute continuation prefixes; surviving fragments include uuid 39ff1bd9-6f6b-44c8-bbec-a1fd9d196359, name instance-00000002, memory 524288, vcpus 1, title test, creationTime 2025-12-15 09:46:08, sysinfo entries RDO / OpenStack Compute / 27.5.2-0.20250829104910.6f8decf.el9 / Virtual Machine, owner admin/admin, os type hvm, rng backend /dev/urandom] _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m Dec 15 04:46:09 localhost nova_compute[286344]: 2025-12-15 09:46:09.680 286348 DEBUG nova.virt.libvirt.driver [None req-be2ed574-8520-4b6c-9168-297eba4d6469 1ba5fce347b64bfebf995f187193f205 c785bf23f53946bc99867d8832a50266 - - default default] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m Dec 15 04:46:09 localhost nova_compute[286344]: 2025-12-15 09:46:09.681 286348 DEBUG nova.virt.libvirt.driver [None req-be2ed574-8520-4b6c-9168-297eba4d6469 1ba5fce347b64bfebf995f187193f205 c785bf23f53946bc99867d8832a50266 - - default default] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m Dec 15 04:46:09 localhost nova_compute[286344]: 2025-12-15 09:46:09.683 286348 DEBUG nova.virt.libvirt.vif [None req-be2ed574-8520-4b6c-9168-297eba4d6469 1ba5fce347b64bfebf995f187193f205 c785bf23f53946bc99867d8832a50266 - - default default] vif_type=ovs
instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-12-15T08:29:49Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=,disable_terminate=False,display_description='test',display_name='test',ec2_ids=,ephemeral_gb=1,ephemeral_key_uuid=None,fault=,flavor=Flavor(2),hidden=False,host='np0005559462.localdomain',hostname='test',id=2,image_ref='7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e',info_cache=InstanceInfoCache,instance_type_id=2,kernel_id='',key_data=None,key_name=None,keypairs=,launch_index=0,launched_at=2025-12-15T08:30:01Z,launched_on='np0005559462.localdomain',locked=False,locked_by=None,memory_mb=512,metadata={},migration_context=,new_flavor=None,node='np0005559462.localdomain',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=,power_state=4,progress=0,project_id='c785bf23f53946bc99867d8832a50266',ramdisk_id='',reservation_id='r-e1tbwc05',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=,services=,shutdown_terminate=False,system_metadata={boot_roles='admin,member,reader',image_base_image_ref='7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='pc-q35-rhel9.0.0',image_hw_pointer_model='usbtablet',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/cirros',image_owner_specified.openstack.sha256='',owner_project_name='admin',owner_user_name='admin'},tags=,task_state='powering-on',terminated_at=None,trusted_certs=,updated_at=2025-12-15T09:45:41Z,user_data=None,user_id='1ba5fce347b64bfebf995f187193f205',uuid=39ff1bd9-6f6b-44c8-bbec-a1fd9d196359,vcpu_model=VirtCPUModel,vcpus=
1,vm_mode=None,vm_state='stopped') vif={"id": "03ef8889-3216-43fb-8a52-4be17a956ce1", "address": "fa:16:3e:74:df:7c", "network": {"id": "befb7a72-17a9-4bcb-b561-84b8f626685a", "bridge": "br-int", "label": "private", "subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.201", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.20", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.0.3"}}], "meta": {"injected": false, "tenant_id": "c785bf23f53946bc99867d8832a50266", "mtu": 1292, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap03ef8889-32", "ovs_interfaceid": "03ef8889-3216-43fb-8a52-4be17a956ce1", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m Dec 15 04:46:09 localhost nova_compute[286344]: 2025-12-15 09:46:09.683 286348 DEBUG nova.network.os_vif_util [None req-be2ed574-8520-4b6c-9168-297eba4d6469 1ba5fce347b64bfebf995f187193f205 c785bf23f53946bc99867d8832a50266 - - default default] Converting VIF {"id": "03ef8889-3216-43fb-8a52-4be17a956ce1", "address": "fa:16:3e:74:df:7c", "network": {"id": "befb7a72-17a9-4bcb-b561-84b8f626685a", "bridge": "br-int", "label": "private", "subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.201", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.20", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": 
{"enable_dhcp": true, "dhcp_server": "192.168.0.3"}}], "meta": {"injected": false, "tenant_id": "c785bf23f53946bc99867d8832a50266", "mtu": 1292, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap03ef8889-32", "ovs_interfaceid": "03ef8889-3216-43fb-8a52-4be17a956ce1", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m Dec 15 04:46:09 localhost nova_compute[286344]: 2025-12-15 09:46:09.685 286348 DEBUG nova.network.os_vif_util [None req-be2ed574-8520-4b6c-9168-297eba4d6469 1ba5fce347b64bfebf995f187193f205 c785bf23f53946bc99867d8832a50266 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:74:df:7c,bridge_name='br-int',has_traffic_filtering=True,id=03ef8889-3216-43fb-8a52-4be17a956ce1,network=Network(befb7a72-17a9-4bcb-b561-84b8f626685a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap03ef8889-32') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m Dec 15 04:46:09 localhost nova_compute[286344]: 2025-12-15 09:46:09.685 286348 DEBUG os_vif [None req-be2ed574-8520-4b6c-9168-297eba4d6469 1ba5fce347b64bfebf995f187193f205 c785bf23f53946bc99867d8832a50266 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:74:df:7c,bridge_name='br-int',has_traffic_filtering=True,id=03ef8889-3216-43fb-8a52-4be17a956ce1,network=Network(befb7a72-17a9-4bcb-b561-84b8f626685a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap03ef8889-32') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m Dec 15 04:46:09 localhost nova_compute[286344]: 2025-12-15 09:46:09.686 286348 
DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 04:46:09 localhost nova_compute[286344]: 2025-12-15 09:46:09.687 286348 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m Dec 15 04:46:09 localhost nova_compute[286344]: 2025-12-15 09:46:09.688 286348 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m Dec 15 04:46:09 localhost nova_compute[286344]: 2025-12-15 09:46:09.692 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 04:46:09 localhost nova_compute[286344]: 2025-12-15 09:46:09.692 286348 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap03ef8889-32, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m Dec 15 04:46:09 localhost nova_compute[286344]: 2025-12-15 09:46:09.693 286348 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap03ef8889-32, col_values=(('external_ids', {'iface-id': '03ef8889-3216-43fb-8a52-4be17a956ce1', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:74:df:7c', 'vm-uuid': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m Dec 15 04:46:09 localhost systemd[1]: tmp-crun.1cN0ei.mount: Deactivated successfully. 
Dec 15 04:46:09 localhost nova_compute[286344]: 2025-12-15 09:46:09.736 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 04:46:09 localhost nova_compute[286344]: 2025-12-15 09:46:09.741 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Dec 15 04:46:09 localhost nova_compute[286344]: 2025-12-15 09:46:09.742 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 04:46:09 localhost nova_compute[286344]: 2025-12-15 09:46:09.743 286348 INFO os_vif [None req-be2ed574-8520-4b6c-9168-297eba4d6469 1ba5fce347b64bfebf995f187193f205 c785bf23f53946bc99867d8832a50266 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:74:df:7c,bridge_name='br-int',has_traffic_filtering=True,id=03ef8889-3216-43fb-8a52-4be17a956ce1,network=Network(befb7a72-17a9-4bcb-b561-84b8f626685a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap03ef8889-32')#033[00m Dec 15 04:46:09 localhost systemd[1]: Started libvirt secret daemon. Dec 15 04:46:09 localhost kernel: device tap03ef8889-32 entered promiscuous mode Dec 15 04:46:09 localhost NetworkManager[5963]: [1765791969.8446] manager: (tap03ef8889-32): new Tun device (/org/freedesktop/NetworkManager/Devices/15) Dec 15 04:46:09 localhost ovn_controller[154603]: 2025-12-15T09:46:09Z|00057|binding|INFO|Claiming lport 03ef8889-3216-43fb-8a52-4be17a956ce1 for this chassis. 
Dec 15 04:46:09 localhost ovn_controller[154603]: 2025-12-15T09:46:09Z|00058|binding|INFO|03ef8889-3216-43fb-8a52-4be17a956ce1: Claiming fa:16:3e:74:df:7c 192.168.0.201 Dec 15 04:46:09 localhost nova_compute[286344]: 2025-12-15 09:46:09.844 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 04:46:09 localhost systemd-udevd[287377]: Network interface NamePolicy= disabled on kernel command line. Dec 15 04:46:09 localhost nova_compute[286344]: 2025-12-15 09:46:09.852 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 04:46:09 localhost nova_compute[286344]: 2025-12-15 09:46:09.857 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 04:46:09 localhost ovn_metadata_agent[160585]: 2025-12-15 09:46:09.863 160590 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:74:df:7c 192.168.0.201'], port_security=['fa:16:3e:74:df:7c 192.168.0.201'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005559462.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '192.168.0.201/24', 'neutron:device_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-befb7a72-17a9-4bcb-b561-84b8f626685a', 'neutron:port_capabilities': '', 'neutron:port_fip': '192.168.122.20', 'neutron:port_name': '', 'neutron:project_id': 'c785bf23f53946bc99867d8832a50266', 'neutron:revision_number': '8', 'neutron:security_group_ids': 'adeef2d9-3b61-4849-9b44-ac3bff90d0cd 
fa685b85-67a9-4a56-ba21-4767a05c4811', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=56a5044a-5384-46d9-b45d-bcd5602105ab, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[], logical_port=03ef8889-3216-43fb-8a52-4be17a956ce1) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Dec 15 04:46:09 localhost NetworkManager[5963]: [1765791969.8658] device (tap03ef8889-32): state change: unmanaged -> unavailable (reason 'connection-assumed', sys-iface-state: 'external') Dec 15 04:46:09 localhost NetworkManager[5963]: [1765791969.8673] device (tap03ef8889-32): state change: unavailable -> disconnected (reason 'none', sys-iface-state: 'external') Dec 15 04:46:09 localhost ovn_controller[154603]: 2025-12-15T09:46:09Z|00059|ovn_bfd|INFO|Enabled BFD on interface ovn-9f826b-0 Dec 15 04:46:09 localhost ovn_controller[154603]: 2025-12-15T09:46:09Z|00060|ovn_bfd|INFO|Enabled BFD on interface ovn-843308-0 Dec 15 04:46:09 localhost ovn_controller[154603]: 2025-12-15T09:46:09Z|00061|ovn_bfd|INFO|Enabled BFD on interface ovn-c1fd65-0 Dec 15 04:46:09 localhost ovn_metadata_agent[160585]: 2025-12-15 09:46:09.865 160590 INFO neutron.agent.ovn.metadata.agent [-] Port 03ef8889-3216-43fb-8a52-4be17a956ce1 in datapath befb7a72-17a9-4bcb-b561-84b8f626685a bound to our chassis#033[00m Dec 15 04:46:09 localhost ovn_metadata_agent[160585]: 2025-12-15 09:46:09.866 160590 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network befb7a72-17a9-4bcb-b561-84b8f626685a#033[00m Dec 15 04:46:09 localhost nova_compute[286344]: 2025-12-15 09:46:09.868 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 04:46:09 localhost ovn_metadata_agent[160585]: 2025-12-15 
09:46:09.876 160858 DEBUG oslo.privsep.daemon [-] privsep: reply[c1cf56ee-9862-4fb4-92e8-91f490346fa1]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Dec 15 04:46:09 localhost ovn_metadata_agent[160585]: 2025-12-15 09:46:09.877 160590 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapbefb7a72-11 in ovnmeta-befb7a72-17a9-4bcb-b561-84b8f626685a namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m Dec 15 04:46:09 localhost ovn_metadata_agent[160585]: 2025-12-15 09:46:09.879 160858 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapbefb7a72-10 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m Dec 15 04:46:09 localhost ovn_metadata_agent[160585]: 2025-12-15 09:46:09.879 160858 DEBUG oslo.privsep.daemon [-] privsep: reply[345975bd-4465-4ce3-83df-f0cda7bef805]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Dec 15 04:46:09 localhost ovn_metadata_agent[160585]: 2025-12-15 09:46:09.880 160858 DEBUG oslo.privsep.daemon [-] privsep: reply[50235ad0-7141-4402-86c4-9f3574231a5c]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Dec 15 04:46:09 localhost ovn_metadata_agent[160585]: 2025-12-15 09:46:09.890 160979 DEBUG oslo.privsep.daemon [-] privsep: reply[5c97935f-95f9-44d3-86dc-adb729950c3c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Dec 15 04:46:09 localhost nova_compute[286344]: 2025-12-15 09:46:09.894 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 04:46:09 localhost systemd-machined[84011]: New machine qemu-2-instance-00000002. 
Dec 15 04:46:09 localhost ovn_metadata_agent[160585]: 2025-12-15 09:46:09.904 160858 DEBUG oslo.privsep.daemon [-] privsep: reply[45ff2a7a-d17f-440f-8518-78c9f3469f63]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Dec 15 04:46:09 localhost nova_compute[286344]: 2025-12-15 09:46:09.909 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 04:46:09 localhost ovn_controller[154603]: 2025-12-15T09:46:09Z|00062|binding|INFO|Setting lport 03ef8889-3216-43fb-8a52-4be17a956ce1 ovn-installed in OVS Dec 15 04:46:09 localhost ovn_controller[154603]: 2025-12-15T09:46:09Z|00063|binding|INFO|Setting lport 03ef8889-3216-43fb-8a52-4be17a956ce1 up in Southbound Dec 15 04:46:09 localhost nova_compute[286344]: 2025-12-15 09:46:09.912 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 04:46:09 localhost systemd[1]: Started Virtual Machine qemu-2-instance-00000002. Dec 15 04:46:09 localhost ovn_metadata_agent[160585]: 2025-12-15 09:46:09.923 160959 DEBUG oslo.privsep.daemon [-] privsep: reply[4886ee21-ef91-4d28-b7bb-fec4180e9e8c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Dec 15 04:46:09 localhost systemd-udevd[287380]: Network interface NamePolicy= disabled on kernel command line. 
Dec 15 04:46:09 localhost NetworkManager[5963]: [1765791969.9342] manager: (tapbefb7a72-10): new Veth device (/org/freedesktop/NetworkManager/Devices/16) Dec 15 04:46:09 localhost ovn_metadata_agent[160585]: 2025-12-15 09:46:09.932 160858 DEBUG oslo.privsep.daemon [-] privsep: reply[e9d431ac-8aa3-4f2b-9cea-67e9f39dc6e5]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Dec 15 04:46:09 localhost ovn_metadata_agent[160585]: 2025-12-15 09:46:09.954 160959 DEBUG oslo.privsep.daemon [-] privsep: reply[b164a087-b774-454e-a2ab-e86ec3f8b956]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Dec 15 04:46:09 localhost ovn_metadata_agent[160585]: 2025-12-15 09:46:09.956 160959 DEBUG oslo.privsep.daemon [-] privsep: reply[5c450f5a-36a1-4427-adea-208a9aba11a2]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Dec 15 04:46:09 localhost nova_compute[286344]: 2025-12-15 09:46:09.963 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 04:46:09 localhost NetworkManager[5963]: [1765791969.9708] device (tapbefb7a72-10): carrier: link connected Dec 15 04:46:09 localhost kernel: IPv6: ADDRCONF(NETDEV_CHANGE): tapbefb7a72-11: link becomes ready Dec 15 04:46:09 localhost kernel: IPv6: ADDRCONF(NETDEV_CHANGE): tapbefb7a72-10: link becomes ready Dec 15 04:46:09 localhost ovn_metadata_agent[160585]: 2025-12-15 09:46:09.977 160959 DEBUG oslo.privsep.daemon [-] privsep: reply[cbb7409e-ec48-411e-b87e-813fa3f05c29]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Dec 15 04:46:09 localhost nova_compute[286344]: 2025-12-15 09:46:09.987 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 04:46:09 localhost ovn_metadata_agent[160585]: 2025-12-15 
09:46:09.992 160858 DEBUG oslo.privsep.daemon [-] privsep: reply[943ad149-c746-4555-814a-df54a6575be8]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapbefb7a72-11'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_QDISC', 'noqueue'], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['IFLA_ADDRESS', 'fa:16:3e:d6:2e:fb'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': 
[['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 17], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 1, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 1103608, 'reachable_time': 16906, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 37, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 
0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}]], 'header': {'length': 1400, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 287415, 'error': None, 'target': 'ovnmeta-befb7a72-17a9-4bcb-b561-84b8f626685a', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Dec 15 04:46:10 localhost nova_compute[286344]: 2025-12-15 09:46:10.001 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 04:46:10 localhost ovn_metadata_agent[160585]: 2025-12-15 09:46:10.003 160858 DEBUG oslo.privsep.daemon [-] privsep: reply[1985c2e0-20b0-40ac-9f85-1cca3a8b63b7]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fed6:2efb'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 1103608, 'tstamp': 1103608}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 287416, 'error': None, 'target': 'ovnmeta-befb7a72-17a9-4bcb-b561-84b8f626685a', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Dec 15 04:46:10 localhost ovn_metadata_agent[160585]: 2025-12-15 09:46:10.012 160858 DEBUG oslo.privsep.daemon [-] privsep: reply[0a1fd956-5003-47ed-b831-13791e2e02be]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapbefb7a72-11'], ['IFLA_TXQLEN', 
1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_QDISC', 'noqueue'], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['IFLA_ADDRESS', 'fa:16:3e:d6:2e:fb'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 17], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 
'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 1, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 1103608, 'reachable_time': 16906, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 37, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 
0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}]], 'header': {'length': 1400, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 287421, 'error': None, 'target': 'ovnmeta-befb7a72-17a9-4bcb-b561-84b8f626685a', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Dec 15 04:46:10 localhost ovn_metadata_agent[160585]: 2025-12-15 09:46:10.032 160858 DEBUG oslo.privsep.daemon [-] privsep: reply[007137bf-2ae4-46dd-80f8-b819b05382e4]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Dec 15 04:46:10 localhost ovn_metadata_agent[160585]: 2025-12-15 09:46:10.079 160858 DEBUG oslo.privsep.daemon [-] privsep: reply[fb4b0756-5e0e-4a0d-bbc0-5ac765dc0ed2]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Dec 15 04:46:10 localhost ovn_metadata_agent[160585]: 2025-12-15 09:46:10.081 160590 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapbefb7a72-10, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m Dec 15 04:46:10 localhost ovn_metadata_agent[160585]: 2025-12-15 09:46:10.082 160590 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m Dec 15 04:46:10 localhost ovn_metadata_agent[160585]: 2025-12-15 09:46:10.083 160590 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapbefb7a72-10, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m Dec 15 04:46:10 localhost kernel: device tapbefb7a72-10 
entered promiscuous mode Dec 15 04:46:10 localhost ovn_metadata_agent[160585]: 2025-12-15 09:46:10.089 160590 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapbefb7a72-10, col_values=(('external_ids', {'iface-id': 'b35254ad-12eb-47bb-92be-44fefe0694f0'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m Dec 15 04:46:10 localhost ovn_controller[154603]: 2025-12-15T09:46:10Z|00064|binding|INFO|Releasing lport b35254ad-12eb-47bb-92be-44fefe0694f0 from this chassis (sb_readonly=0) Dec 15 04:46:10 localhost nova_compute[286344]: 2025-12-15 09:46:10.091 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 04:46:10 localhost ovn_metadata_agent[160585]: 2025-12-15 09:46:10.092 160590 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/befb7a72-17a9-4bcb-b561-84b8f626685a.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/befb7a72-17a9-4bcb-b561-84b8f626685a.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m Dec 15 04:46:10 localhost ovn_metadata_agent[160585]: 2025-12-15 09:46:10.093 160858 DEBUG oslo.privsep.daemon [-] privsep: reply[3f8edaa8-3d5b-4eef-bc38-0be51d87f286]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Dec 15 04:46:10 localhost ovn_metadata_agent[160585]: 2025-12-15 09:46:10.095 160590 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = Dec 15 04:46:10 localhost ovn_metadata_agent[160585]: global Dec 15 04:46:10 localhost ovn_metadata_agent[160585]: log /dev/log local0 debug Dec 15 04:46:10 localhost ovn_metadata_agent[160585]: log-tag haproxy-metadata-proxy-befb7a72-17a9-4bcb-b561-84b8f626685a Dec 15 04:46:10 localhost ovn_metadata_agent[160585]: user root Dec 15 
04:46:10 localhost ovn_metadata_agent[160585]: group root Dec 15 04:46:10 localhost ovn_metadata_agent[160585]: maxconn 1024 Dec 15 04:46:10 localhost ovn_metadata_agent[160585]: pidfile /var/lib/neutron/external/pids/befb7a72-17a9-4bcb-b561-84b8f626685a.pid.haproxy Dec 15 04:46:10 localhost ovn_metadata_agent[160585]: daemon Dec 15 04:46:10 localhost ovn_metadata_agent[160585]: Dec 15 04:46:10 localhost ovn_metadata_agent[160585]: defaults Dec 15 04:46:10 localhost ovn_metadata_agent[160585]: log global Dec 15 04:46:10 localhost ovn_metadata_agent[160585]: mode http Dec 15 04:46:10 localhost ovn_metadata_agent[160585]: option httplog Dec 15 04:46:10 localhost ovn_metadata_agent[160585]: option dontlognull Dec 15 04:46:10 localhost ovn_metadata_agent[160585]: option http-server-close Dec 15 04:46:10 localhost ovn_metadata_agent[160585]: option forwardfor Dec 15 04:46:10 localhost ovn_metadata_agent[160585]: retries 3 Dec 15 04:46:10 localhost ovn_metadata_agent[160585]: timeout http-request 30s Dec 15 04:46:10 localhost ovn_metadata_agent[160585]: timeout connect 30s Dec 15 04:46:10 localhost ovn_metadata_agent[160585]: timeout client 32s Dec 15 04:46:10 localhost ovn_metadata_agent[160585]: timeout server 32s Dec 15 04:46:10 localhost ovn_metadata_agent[160585]: timeout http-keep-alive 30s Dec 15 04:46:10 localhost ovn_metadata_agent[160585]: Dec 15 04:46:10 localhost ovn_metadata_agent[160585]: Dec 15 04:46:10 localhost ovn_metadata_agent[160585]: listen listener Dec 15 04:46:10 localhost ovn_metadata_agent[160585]: bind 169.254.169.254:80 Dec 15 04:46:10 localhost ovn_metadata_agent[160585]: server metadata /var/lib/neutron/metadata_proxy Dec 15 04:46:10 localhost ovn_metadata_agent[160585]: http-request add-header X-OVN-Network-ID befb7a72-17a9-4bcb-b561-84b8f626685a Dec 15 04:46:10 localhost ovn_metadata_agent[160585]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m Dec 15 04:46:10 localhost 
ovn_metadata_agent[160585]: 2025-12-15 09:46:10.097 160590 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-befb7a72-17a9-4bcb-b561-84b8f626685a', 'env', 'PROCESS_TAG=haproxy-befb7a72-17a9-4bcb-b561-84b8f626685a', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/befb7a72-17a9-4bcb-b561-84b8f626685a.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m Dec 15 04:46:10 localhost nova_compute[286344]: 2025-12-15 09:46:10.271 286348 DEBUG nova.virt.driver [None req-ed090d16-496a-473a-be36-97cf3cf6502a - - - - - -] Emitting event Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m Dec 15 04:46:10 localhost nova_compute[286344]: 2025-12-15 09:46:10.272 286348 INFO nova.compute.manager [None req-ed090d16-496a-473a-be36-97cf3cf6502a - - - - - -] [instance: 39ff1bd9-6f6b-44c8-bbec-a1fd9d196359] VM Resumed (Lifecycle Event)#033[00m Dec 15 04:46:10 localhost nova_compute[286344]: 2025-12-15 09:46:10.275 286348 DEBUG nova.compute.manager [None req-be2ed574-8520-4b6c-9168-297eba4d6469 1ba5fce347b64bfebf995f187193f205 c785bf23f53946bc99867d8832a50266 - - default default] [instance: 39ff1bd9-6f6b-44c8-bbec-a1fd9d196359] Instance event wait completed in 0 seconds for wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m Dec 15 04:46:10 localhost nova_compute[286344]: 2025-12-15 09:46:10.279 286348 INFO nova.virt.libvirt.driver [-] [instance: 39ff1bd9-6f6b-44c8-bbec-a1fd9d196359] Instance rebooted successfully.#033[00m Dec 15 04:46:10 localhost nova_compute[286344]: 2025-12-15 09:46:10.280 286348 DEBUG nova.compute.manager [None req-be2ed574-8520-4b6c-9168-297eba4d6469 1ba5fce347b64bfebf995f187193f205 c785bf23f53946bc99867d8832a50266 - - default default] [instance: 39ff1bd9-6f6b-44c8-bbec-a1fd9d196359] Checking state _get_power_state 
/usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m Dec 15 04:46:10 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:fc:1f:13 MACDST=fa:16:3e:b3:bb:e3 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=42876 DF PROTO=TCP SPT=59076 DPT=9102 SEQ=2096573310 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A83A979260000000001030307) Dec 15 04:46:10 localhost nova_compute[286344]: 2025-12-15 09:46:10.294 286348 DEBUG nova.compute.manager [None req-ed090d16-496a-473a-be36-97cf3cf6502a - - - - - -] [instance: 39ff1bd9-6f6b-44c8-bbec-a1fd9d196359] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m Dec 15 04:46:10 localhost nova_compute[286344]: 2025-12-15 09:46:10.302 286348 DEBUG nova.compute.manager [None req-ed090d16-496a-473a-be36-97cf3cf6502a - - - - - -] [instance: 39ff1bd9-6f6b-44c8-bbec-a1fd9d196359] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: stopped, current task_state: powering-on, current DB power_state: 4, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m Dec 15 04:46:10 localhost nova_compute[286344]: 2025-12-15 09:46:10.325 286348 INFO nova.compute.manager [None req-ed090d16-496a-473a-be36-97cf3cf6502a - - - - - -] [instance: 39ff1bd9-6f6b-44c8-bbec-a1fd9d196359] During sync_power_state the instance has a pending task (powering-on). 
Skip.#033[00m Dec 15 04:46:10 localhost nova_compute[286344]: 2025-12-15 09:46:10.326 286348 DEBUG nova.virt.driver [None req-ed090d16-496a-473a-be36-97cf3cf6502a - - - - - -] Emitting event Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m Dec 15 04:46:10 localhost nova_compute[286344]: 2025-12-15 09:46:10.326 286348 INFO nova.compute.manager [None req-ed090d16-496a-473a-be36-97cf3cf6502a - - - - - -] [instance: 39ff1bd9-6f6b-44c8-bbec-a1fd9d196359] VM Started (Lifecycle Event)#033[00m Dec 15 04:46:10 localhost nova_compute[286344]: 2025-12-15 09:46:10.377 286348 DEBUG nova.compute.manager [None req-ed090d16-496a-473a-be36-97cf3cf6502a - - - - - -] [instance: 39ff1bd9-6f6b-44c8-bbec-a1fd9d196359] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m Dec 15 04:46:10 localhost nova_compute[286344]: 2025-12-15 09:46:10.381 286348 DEBUG nova.compute.manager [None req-ed090d16-496a-473a-be36-97cf3cf6502a - - - - - -] [instance: 39ff1bd9-6f6b-44c8-bbec-a1fd9d196359] Synchronizing instance power state after lifecycle event "Started"; current vm_state: stopped, current task_state: powering-on, current DB power_state: 4, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m Dec 15 04:46:10 localhost nova_compute[286344]: 2025-12-15 09:46:10.422 286348 INFO nova.compute.manager [None req-ed090d16-496a-473a-be36-97cf3cf6502a - - - - - -] [instance: 39ff1bd9-6f6b-44c8-bbec-a1fd9d196359] During sync_power_state the instance has a pending task (powering-on). 
Skip.#033[00m Dec 15 04:46:10 localhost podman[287491]: Dec 15 04:46:10 localhost podman[287491]: 2025-12-15 09:46:10.531806845 +0000 UTC m=+0.087964047 container create c435cae79ba246ae145a8d0e68debe32f1b3cf4cf977b8153ea785dc477a48a9 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-befb7a72-17a9-4bcb-b561-84b8f626685a, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202, tcib_managed=true, io.buildah.version=1.41.3) Dec 15 04:46:10 localhost systemd[1]: Started libpod-conmon-c435cae79ba246ae145a8d0e68debe32f1b3cf4cf977b8153ea785dc477a48a9.scope. Dec 15 04:46:10 localhost podman[287491]: 2025-12-15 09:46:10.488601123 +0000 UTC m=+0.044758355 image pull quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified Dec 15 04:46:10 localhost systemd[1]: Started libcrun container. 
Dec 15 04:46:10 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/052c6914a31d00cdac70e0ab49449b7cbaa60ade99ff6abb20807b01c882d1a2/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff) Dec 15 04:46:10 localhost podman[287491]: 2025-12-15 09:46:10.605292585 +0000 UTC m=+0.161449787 container init c435cae79ba246ae145a8d0e68debe32f1b3cf4cf977b8153ea785dc477a48a9 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-befb7a72-17a9-4bcb-b561-84b8f626685a, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, io.buildah.version=1.41.3) Dec 15 04:46:10 localhost podman[287491]: 2025-12-15 09:46:10.614207074 +0000 UTC m=+0.170364306 container start c435cae79ba246ae145a8d0e68debe32f1b3cf4cf977b8153ea785dc477a48a9 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-befb7a72-17a9-4bcb-b561-84b8f626685a, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team) Dec 15 04:46:10 localhost neutron-haproxy-ovnmeta-befb7a72-17a9-4bcb-b561-84b8f626685a[287505]: [NOTICE] (287509) : New worker (287511) forked Dec 15 04:46:10 localhost neutron-haproxy-ovnmeta-befb7a72-17a9-4bcb-b561-84b8f626685a[287505]: [NOTICE] (287509) : Loading success. 
Dec 15 04:46:10 localhost ovn_controller[154603]: 2025-12-15T09:46:10Z|00065|binding|INFO|Releasing lport b35254ad-12eb-47bb-92be-44fefe0694f0 from this chassis (sb_readonly=0) Dec 15 04:46:10 localhost nova_compute[286344]: 2025-12-15 09:46:10.912 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 04:46:10 localhost ovn_controller[154603]: 2025-12-15T09:46:10Z|00066|binding|INFO|Releasing lport b35254ad-12eb-47bb-92be-44fefe0694f0 from this chassis (sb_readonly=0) Dec 15 04:46:10 localhost nova_compute[286344]: 2025-12-15 09:46:10.945 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 04:46:10 localhost ovn_controller[154603]: 2025-12-15T09:46:10Z|00067|binding|INFO|Releasing lport b35254ad-12eb-47bb-92be-44fefe0694f0 from this chassis (sb_readonly=0) Dec 15 04:46:10 localhost nova_compute[286344]: 2025-12-15 09:46:10.973 286348 DEBUG nova.compute.manager [req-e3d6e8be-6435-40cc-9416-69a615a3076c req-5922e4aa-5611-4a1b-8328-d80a34f8540c 01c6e0cd81404938a10f65d81ef6d178 2f96e5f995e6442f94f881dabbe4dbf7 - - default default] [instance: 39ff1bd9-6f6b-44c8-bbec-a1fd9d196359] Received event network-vif-plugged-03ef8889-3216-43fb-8a52-4be17a956ce1 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m Dec 15 04:46:10 localhost nova_compute[286344]: 2025-12-15 09:46:10.973 286348 DEBUG oslo_concurrency.lockutils [req-e3d6e8be-6435-40cc-9416-69a615a3076c req-5922e4aa-5611-4a1b-8328-d80a34f8540c 01c6e0cd81404938a10f65d81ef6d178 2f96e5f995e6442f94f881dabbe4dbf7 - - default default] Acquiring lock "39ff1bd9-6f6b-44c8-bbec-a1fd9d196359-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Dec 15 04:46:10 localhost nova_compute[286344]: 2025-12-15 
09:46:10.974 286348 DEBUG oslo_concurrency.lockutils [req-e3d6e8be-6435-40cc-9416-69a615a3076c req-5922e4aa-5611-4a1b-8328-d80a34f8540c 01c6e0cd81404938a10f65d81ef6d178 2f96e5f995e6442f94f881dabbe4dbf7 - - default default] Lock "39ff1bd9-6f6b-44c8-bbec-a1fd9d196359-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Dec 15 04:46:10 localhost nova_compute[286344]: 2025-12-15 09:46:10.974 286348 DEBUG oslo_concurrency.lockutils [req-e3d6e8be-6435-40cc-9416-69a615a3076c req-5922e4aa-5611-4a1b-8328-d80a34f8540c 01c6e0cd81404938a10f65d81ef6d178 2f96e5f995e6442f94f881dabbe4dbf7 - - default default] Lock "39ff1bd9-6f6b-44c8-bbec-a1fd9d196359-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Dec 15 04:46:10 localhost nova_compute[286344]: 2025-12-15 09:46:10.974 286348 DEBUG nova.compute.manager [req-e3d6e8be-6435-40cc-9416-69a615a3076c req-5922e4aa-5611-4a1b-8328-d80a34f8540c 01c6e0cd81404938a10f65d81ef6d178 2f96e5f995e6442f94f881dabbe4dbf7 - - default default] [instance: 39ff1bd9-6f6b-44c8-bbec-a1fd9d196359] No waiting events found dispatching network-vif-plugged-03ef8889-3216-43fb-8a52-4be17a956ce1 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m Dec 15 04:46:10 localhost nova_compute[286344]: 2025-12-15 09:46:10.974 286348 WARNING nova.compute.manager [req-e3d6e8be-6435-40cc-9416-69a615a3076c req-5922e4aa-5611-4a1b-8328-d80a34f8540c 01c6e0cd81404938a10f65d81ef6d178 2f96e5f995e6442f94f881dabbe4dbf7 - - default default] [instance: 39ff1bd9-6f6b-44c8-bbec-a1fd9d196359] Received unexpected event network-vif-plugged-03ef8889-3216-43fb-8a52-4be17a956ce1 for instance with vm_state active and task_state None.#033[00m Dec 15 04:46:11 localhost snmpd[69387]: IfIndex of an 
interface changed. Such interfaces will appear multiple times in IF-MIB. Dec 15 04:46:13 localhost nova_compute[286344]: 2025-12-15 09:46:13.015 286348 DEBUG nova.compute.manager [req-66741461-d6d9-4a83-b94a-85c97c33920b req-4d6718c9-b189-4889-9d59-10c08fb53460 01c6e0cd81404938a10f65d81ef6d178 2f96e5f995e6442f94f881dabbe4dbf7 - - default default] [instance: 39ff1bd9-6f6b-44c8-bbec-a1fd9d196359] Received event network-vif-plugged-03ef8889-3216-43fb-8a52-4be17a956ce1 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m Dec 15 04:46:13 localhost nova_compute[286344]: 2025-12-15 09:46:13.016 286348 DEBUG oslo_concurrency.lockutils [req-66741461-d6d9-4a83-b94a-85c97c33920b req-4d6718c9-b189-4889-9d59-10c08fb53460 01c6e0cd81404938a10f65d81ef6d178 2f96e5f995e6442f94f881dabbe4dbf7 - - default default] Acquiring lock "39ff1bd9-6f6b-44c8-bbec-a1fd9d196359-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Dec 15 04:46:13 localhost nova_compute[286344]: 2025-12-15 09:46:13.017 286348 DEBUG oslo_concurrency.lockutils [req-66741461-d6d9-4a83-b94a-85c97c33920b req-4d6718c9-b189-4889-9d59-10c08fb53460 01c6e0cd81404938a10f65d81ef6d178 2f96e5f995e6442f94f881dabbe4dbf7 - - default default] Lock "39ff1bd9-6f6b-44c8-bbec-a1fd9d196359-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Dec 15 04:46:13 localhost nova_compute[286344]: 2025-12-15 09:46:13.017 286348 DEBUG oslo_concurrency.lockutils [req-66741461-d6d9-4a83-b94a-85c97c33920b req-4d6718c9-b189-4889-9d59-10c08fb53460 01c6e0cd81404938a10f65d81ef6d178 2f96e5f995e6442f94f881dabbe4dbf7 - - default default] Lock "39ff1bd9-6f6b-44c8-bbec-a1fd9d196359-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 
0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Dec 15 04:46:13 localhost nova_compute[286344]: 2025-12-15 09:46:13.018 286348 DEBUG nova.compute.manager [req-66741461-d6d9-4a83-b94a-85c97c33920b req-4d6718c9-b189-4889-9d59-10c08fb53460 01c6e0cd81404938a10f65d81ef6d178 2f96e5f995e6442f94f881dabbe4dbf7 - - default default] [instance: 39ff1bd9-6f6b-44c8-bbec-a1fd9d196359] No waiting events found dispatching network-vif-plugged-03ef8889-3216-43fb-8a52-4be17a956ce1 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m Dec 15 04:46:13 localhost nova_compute[286344]: 2025-12-15 09:46:13.019 286348 WARNING nova.compute.manager [req-66741461-d6d9-4a83-b94a-85c97c33920b req-4d6718c9-b189-4889-9d59-10c08fb53460 01c6e0cd81404938a10f65d81ef6d178 2f96e5f995e6442f94f881dabbe4dbf7 - - default default] [instance: 39ff1bd9-6f6b-44c8-bbec-a1fd9d196359] Received unexpected event network-vif-plugged-03ef8889-3216-43fb-8a52-4be17a956ce1 for instance with vm_state active and task_state None.#033[00m Dec 15 04:46:13 localhost nova_compute[286344]: 2025-12-15 09:46:13.785 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 04:46:13 localhost systemd[1]: Started /usr/bin/podman healthcheck run 730c50020a7d9e2da21d2462d40af20ac303e43f3491f692cbbd87f432ba3e09. Dec 15 04:46:14 localhost systemd[1]: tmp-crun.2noynL.mount: Deactivated successfully. 
Dec 15 04:46:14 localhost podman[287606]: 2025-12-15 09:46:14.118790368 +0000 UTC m=+0.120935751 container health_status 730c50020a7d9e2da21d2462d40af20ac303e43f3491f692cbbd87f432ba3e09 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, vendor=Red Hat, Inc., distribution-scope=public, version=9.6, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'cb1e9478634a00c8fb5ad9cd7fcd38c7b2a1a85187dfaa618bc715c24c2b15fb'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, name=ubi9-minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., managed_by=edpm_ansible, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, container_name=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. 
This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.component=ubi9-minimal-container, config_id=openstack_network_exporter, io.buildah.version=1.33.7, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, release=1755695350, url=https://catalog.redhat.com/en/search?searchType=containers, build-date=2025-08-20T13:12:41, io.openshift.expose-services=, vcs-type=git, maintainer=Red Hat, Inc., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9) Dec 15 04:46:14 localhost podman[287606]: 2025-12-15 09:46:14.162259357 +0000 UTC m=+0.164404730 container exec_died 730c50020a7d9e2da21d2462d40af20ac303e43f3491f692cbbd87f432ba3e09 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.tags=minimal rhel9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., io.buildah.version=1.33.7, container_name=openstack_network_exporter, maintainer=Red Hat, Inc., build-date=2025-08-20T13:12:41, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., distribution-scope=public, com.redhat.component=ubi9-minimal-container, managed_by=edpm_ansible, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, name=ubi9-minimal, version=9.6, architecture=x86_64, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., config_id=openstack_network_exporter, release=1755695350, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vcs-type=git, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'cb1e9478634a00c8fb5ad9cd7fcd38c7b2a1a85187dfaa618bc715c24c2b15fb'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.expose-services=) Dec 15 04:46:14 localhost systemd[1]: 730c50020a7d9e2da21d2462d40af20ac303e43f3491f692cbbd87f432ba3e09.service: Deactivated successfully. 
Dec 15 04:46:14 localhost podman[287687]: Dec 15 04:46:14 localhost podman[287687]: 2025-12-15 09:46:14.676553252 +0000 UTC m=+0.077608487 container create 3496832304bfc1113fabe1bc91e39742d8adfcbd403c4dfc18e67b000a5bd40c (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=competent_khorana, io.openshift.tags=rhceph ceph, vendor=Red Hat, Inc., summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., version=7, release=1763362218, io.openshift.expose-services=, build-date=2025-11-26T19:44:28Z, ceph=True, maintainer=Guillaume Abrioux , description=Red Hat Ceph Storage 7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.k8s.description=Red Hat Ceph Storage 7, vcs-type=git, GIT_CLEAN=True, CEPH_POINT_RELEASE=, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, GIT_REPO=https://github.com/ceph/ceph-container.git, url=https://catalog.redhat.com/en/search?searchType=containers, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.buildah.version=1.41.4, com.redhat.component=rhceph-container, distribution-scope=public, GIT_BRANCH=main, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, name=rhceph, RELEASE=main, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0) Dec 15 04:46:14 localhost systemd[1]: Started libpod-conmon-3496832304bfc1113fabe1bc91e39742d8adfcbd403c4dfc18e67b000a5bd40c.scope. Dec 15 04:46:14 localhost systemd[1]: Started libcrun container. 
Dec 15 04:46:14 localhost podman[287687]: 2025-12-15 09:46:14.642418065 +0000 UTC m=+0.043473350 image pull registry.redhat.io/rhceph/rhceph-7-rhel9:latest Dec 15 04:46:14 localhost nova_compute[286344]: 2025-12-15 09:46:14.774 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 04:46:14 localhost podman[287687]: 2025-12-15 09:46:14.779133118 +0000 UTC m=+0.180188323 container init 3496832304bfc1113fabe1bc91e39742d8adfcbd403c4dfc18e67b000a5bd40c (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=competent_khorana, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, build-date=2025-11-26T19:44:28Z, io.buildah.version=1.41.4, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vendor=Red Hat, Inc., org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, release=1763362218, com.redhat.component=rhceph-container, io.k8s.description=Red Hat Ceph Storage 7, io.openshift.expose-services=, GIT_CLEAN=True, GIT_REPO=https://github.com/ceph/ceph-container.git, description=Red Hat Ceph Storage 7, distribution-scope=public, maintainer=Guillaume Abrioux , CEPH_POINT_RELEASE=, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_BRANCH=main, io.openshift.tags=rhceph ceph, version=7, url=https://catalog.redhat.com/en/search?searchType=containers, name=rhceph, RELEASE=main, vcs-type=git, ceph=True) Dec 15 04:46:14 localhost podman[287687]: 2025-12-15 09:46:14.79026314 +0000 UTC m=+0.191318345 container start 3496832304bfc1113fabe1bc91e39742d8adfcbd403c4dfc18e67b000a5bd40c (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=competent_khorana, 
com.redhat.component=rhceph-container, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, GIT_REPO=https://github.com/ceph/ceph-container.git, io.buildah.version=1.41.4, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, architecture=x86_64, build-date=2025-11-26T19:44:28Z, description=Red Hat Ceph Storage 7, name=rhceph, GIT_BRANCH=main, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vendor=Red Hat, Inc., version=7, io.k8s.description=Red Hat Ceph Storage 7, release=1763362218, RELEASE=main, io.openshift.tags=rhceph ceph, io.openshift.expose-services=, GIT_CLEAN=True, distribution-scope=public, maintainer=Guillaume Abrioux , com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, CEPH_POINT_RELEASE=, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vcs-type=git, url=https://catalog.redhat.com/en/search?searchType=containers, ceph=True, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9) Dec 15 04:46:14 localhost podman[287687]: 2025-12-15 09:46:14.790382993 +0000 UTC m=+0.191438198 container attach 3496832304bfc1113fabe1bc91e39742d8adfcbd403c4dfc18e67b000a5bd40c (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=competent_khorana, architecture=x86_64, io.openshift.expose-services=, release=1763362218, description=Red Hat Ceph Storage 7, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, version=7, GIT_BRANCH=main, build-date=2025-11-26T19:44:28Z, maintainer=Guillaume Abrioux , io.openshift.tags=rhceph ceph, io.k8s.description=Red Hat Ceph Storage 7, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, name=rhceph, vcs-type=git, GIT_REPO=https://github.com/ceph/ceph-container.git, CEPH_POINT_RELEASE=, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.buildah.version=1.41.4, RELEASE=main, 
com.redhat.component=rhceph-container, vendor=Red Hat, Inc., org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, url=https://catalog.redhat.com/en/search?searchType=containers, GIT_CLEAN=True, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., ceph=True, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI) Dec 15 04:46:14 localhost competent_khorana[287701]: 167 167 Dec 15 04:46:14 localhost systemd[1]: libpod-3496832304bfc1113fabe1bc91e39742d8adfcbd403c4dfc18e67b000a5bd40c.scope: Deactivated successfully. Dec 15 04:46:14 localhost podman[287687]: 2025-12-15 09:46:14.796767482 +0000 UTC m=+0.197822777 container died 3496832304bfc1113fabe1bc91e39742d8adfcbd403c4dfc18e67b000a5bd40c (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=competent_khorana, release=1763362218, io.buildah.version=1.41.4, com.redhat.component=rhceph-container, vcs-type=git, io.openshift.expose-services=, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_REPO=https://github.com/ceph/ceph-container.git, description=Red Hat Ceph Storage 7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, ceph=True, url=https://catalog.redhat.com/en/search?searchType=containers, CEPH_POINT_RELEASE=, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, RELEASE=main, maintainer=Guillaume Abrioux , architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, vendor=Red Hat, Inc., io.openshift.tags=rhceph ceph, version=7, distribution-scope=public, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, build-date=2025-11-26T19:44:28Z, GIT_CLEAN=True, name=rhceph, GIT_BRANCH=main, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, io.k8s.description=Red Hat Ceph 
Storage 7) Dec 15 04:46:14 localhost systemd[1]: Started /usr/bin/podman healthcheck run ac2eae30a2a7f971398af42ea67b3ed799d4b3341f7688ebcd73f7ca7f66c2a8. Dec 15 04:46:14 localhost podman[287706]: 2025-12-15 09:46:14.902142516 +0000 UTC m=+0.094997124 container remove 3496832304bfc1113fabe1bc91e39742d8adfcbd403c4dfc18e67b000a5bd40c (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=competent_khorana, architecture=x86_64, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, com.redhat.component=rhceph-container, version=7, url=https://catalog.redhat.com/en/search?searchType=containers, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, GIT_CLEAN=True, io.openshift.tags=rhceph ceph, build-date=2025-11-26T19:44:28Z, ceph=True, vcs-type=git, CEPH_POINT_RELEASE=, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, io.openshift.expose-services=, maintainer=Guillaume Abrioux , RELEASE=main, description=Red Hat Ceph Storage 7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_REPO=https://github.com/ceph/ceph-container.git, io.k8s.description=Red Hat Ceph Storage 7, vendor=Red Hat, Inc., vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, GIT_BRANCH=main, release=1763362218, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, name=rhceph) Dec 15 04:46:14 localhost systemd[1]: libpod-conmon-3496832304bfc1113fabe1bc91e39742d8adfcbd403c4dfc18e67b000a5bd40c.scope: Deactivated successfully. 
Dec 15 04:46:14 localhost podman[287720]: 2025-12-15 09:46:14.998344942 +0000 UTC m=+0.107204786 container health_status ac2eae30a2a7f971398af42ea67b3ed799d4b3341f7688ebcd73f7ca7f66c2a8 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, org.label-schema.license=GPLv2, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '07f3af15d79276418f54a8dde2fc251064527c3571430e14af77aaa2c01f1193'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.build-date=20251202) Dec 15 04:46:15 localhost podman[287720]: 2025-12-15 09:46:15.038379574 +0000 UTC m=+0.147239468 container exec_died ac2eae30a2a7f971398af42ea67b3ed799d4b3341f7688ebcd73f7ca7f66c2a8 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, container_name=ovn_controller, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_controller, 
io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251202, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '07f3af15d79276418f54a8dde2fc251064527c3571430e14af77aaa2c01f1193'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, org.label-schema.vendor=CentOS) Dec 15 04:46:15 localhost systemd[1]: ac2eae30a2a7f971398af42ea67b3ed799d4b3341f7688ebcd73f7ca7f66c2a8.service: Deactivated successfully. Dec 15 04:46:15 localhost podman[287752]: Dec 15 04:46:15 localhost systemd[1]: var-lib-containers-storage-overlay-4de228ada009ccd2206737874a5b62d0abed09f36bb0f05c508d8af714c71a4c-merged.mount: Deactivated successfully. 
Dec 15 04:46:15 localhost podman[287752]: 2025-12-15 09:46:15.110200057 +0000 UTC m=+0.069048936 container create 9090db0fdbb8c9e5395cda50908e6d3c78c4b32388597563f2f8b70a10515d85 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=zen_elion, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, ceph=True, build-date=2025-11-26T19:44:28Z, io.buildah.version=1.41.4, description=Red Hat Ceph Storage 7, architecture=x86_64, CEPH_POINT_RELEASE=, GIT_CLEAN=True, maintainer=Guillaume Abrioux , distribution-scope=public, GIT_REPO=https://github.com/ceph/ceph-container.git, GIT_BRANCH=main, io.k8s.description=Red Hat Ceph Storage 7, version=7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.openshift.expose-services=, vendor=Red Hat, Inc., com.redhat.component=rhceph-container, RELEASE=main, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.openshift.tags=rhceph ceph, vcs-type=git, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., name=rhceph, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1763362218, url=https://catalog.redhat.com/en/search?searchType=containers) Dec 15 04:46:15 localhost systemd[1]: Started libpod-conmon-9090db0fdbb8c9e5395cda50908e6d3c78c4b32388597563f2f8b70a10515d85.scope. Dec 15 04:46:15 localhost podman[287752]: 2025-12-15 09:46:15.075780273 +0000 UTC m=+0.034629222 image pull registry.redhat.io/rhceph/rhceph-7-rhel9:latest Dec 15 04:46:15 localhost systemd[1]: Started libcrun container. 
Dec 15 04:46:15 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/929beded6df297ab5ef6457cc90d67b4c61fd5f82ba7216fb1e9310fb09ab6c2/merged/rootfs supports timestamps until 2038 (0x7fffffff) Dec 15 04:46:15 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/929beded6df297ab5ef6457cc90d67b4c61fd5f82ba7216fb1e9310fb09ab6c2/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff) Dec 15 04:46:15 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/929beded6df297ab5ef6457cc90d67b4c61fd5f82ba7216fb1e9310fb09ab6c2/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff) Dec 15 04:46:15 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/929beded6df297ab5ef6457cc90d67b4c61fd5f82ba7216fb1e9310fb09ab6c2/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff) Dec 15 04:46:15 localhost podman[287752]: 2025-12-15 09:46:15.191571638 +0000 UTC m=+0.150420547 container init 9090db0fdbb8c9e5395cda50908e6d3c78c4b32388597563f2f8b70a10515d85 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=zen_elion, version=7, description=Red Hat Ceph Storage 7, name=rhceph, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., distribution-scope=public, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_CLEAN=True, GIT_REPO=https://github.com/ceph/ceph-container.git, io.openshift.expose-services=, vendor=Red Hat, Inc., org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, RELEASE=main, release=1763362218, com.redhat.component=rhceph-container, GIT_BRANCH=main, architecture=x86_64, build-date=2025-11-26T19:44:28Z, CEPH_POINT_RELEASE=, io.buildah.version=1.41.4, 
io.openshift.tags=rhceph ceph, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=Guillaume Abrioux , io.k8s.description=Red Hat Ceph Storage 7, vcs-type=git, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, ceph=True) Dec 15 04:46:15 localhost podman[287752]: 2025-12-15 09:46:15.212111624 +0000 UTC m=+0.170960543 container start 9090db0fdbb8c9e5395cda50908e6d3c78c4b32388597563f2f8b70a10515d85 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=zen_elion, version=7, vcs-type=git, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, maintainer=Guillaume Abrioux , io.k8s.description=Red Hat Ceph Storage 7, com.redhat.component=rhceph-container, distribution-scope=public, RELEASE=main, release=1763362218, description=Red Hat Ceph Storage 7, architecture=x86_64, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, GIT_BRANCH=main, vendor=Red Hat, Inc., summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.openshift.expose-services=, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, build-date=2025-11-26T19:44:28Z, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, io.openshift.tags=rhceph ceph, name=rhceph, ceph=True, io.buildah.version=1.41.4, GIT_CLEAN=True, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_REPO=https://github.com/ceph/ceph-container.git, url=https://catalog.redhat.com/en/search?searchType=containers, CEPH_POINT_RELEASE=) Dec 15 04:46:15 localhost podman[287752]: 2025-12-15 09:46:15.21233874 +0000 UTC m=+0.171187689 container attach 9090db0fdbb8c9e5395cda50908e6d3c78c4b32388597563f2f8b70a10515d85 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=zen_elion, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and 
supported base image., GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, maintainer=Guillaume Abrioux , io.openshift.tags=rhceph ceph, io.buildah.version=1.41.4, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_REPO=https://github.com/ceph/ceph-container.git, description=Red Hat Ceph Storage 7, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, CEPH_POINT_RELEASE=, name=rhceph, io.openshift.expose-services=, ceph=True, RELEASE=main, release=1763362218, distribution-scope=public, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, url=https://catalog.redhat.com/en/search?searchType=containers, GIT_BRANCH=main, io.k8s.description=Red Hat Ceph Storage 7, version=7, vendor=Red Hat, Inc., GIT_CLEAN=True, vcs-type=git, build-date=2025-11-26T19:44:28Z, com.redhat.component=rhceph-container) Dec 15 04:46:16 localhost systemd[1]: tmp-crun.bGi3WH.mount: Deactivated successfully. 
Dec 15 04:46:16 localhost zen_elion[287768]: [ Dec 15 04:46:16 localhost zen_elion[287768]: { Dec 15 04:46:16 localhost zen_elion[287768]: "available": false, Dec 15 04:46:16 localhost zen_elion[287768]: "ceph_device": false, Dec 15 04:46:16 localhost zen_elion[287768]: "device_id": "QEMU_DVD-ROM_QM00001", Dec 15 04:46:16 localhost zen_elion[287768]: "lsm_data": {}, Dec 15 04:46:16 localhost zen_elion[287768]: "lvs": [], Dec 15 04:46:16 localhost zen_elion[287768]: "path": "/dev/sr0", Dec 15 04:46:16 localhost zen_elion[287768]: "rejected_reasons": [ Dec 15 04:46:16 localhost zen_elion[287768]: "Insufficient space (<5GB)", Dec 15 04:46:16 localhost zen_elion[287768]: "Has a FileSystem" Dec 15 04:46:16 localhost zen_elion[287768]: ], Dec 15 04:46:16 localhost zen_elion[287768]: "sys_api": { Dec 15 04:46:16 localhost zen_elion[287768]: "actuators": null, Dec 15 04:46:16 localhost zen_elion[287768]: "device_nodes": "sr0", Dec 15 04:46:16 localhost zen_elion[287768]: "human_readable_size": "482.00 KB", Dec 15 04:46:16 localhost zen_elion[287768]: "id_bus": "ata", Dec 15 04:46:16 localhost zen_elion[287768]: "model": "QEMU DVD-ROM", Dec 15 04:46:16 localhost zen_elion[287768]: "nr_requests": "2", Dec 15 04:46:16 localhost zen_elion[287768]: "partitions": {}, Dec 15 04:46:16 localhost zen_elion[287768]: "path": "/dev/sr0", Dec 15 04:46:16 localhost zen_elion[287768]: "removable": "1", Dec 15 04:46:16 localhost zen_elion[287768]: "rev": "2.5+", Dec 15 04:46:16 localhost zen_elion[287768]: "ro": "0", Dec 15 04:46:16 localhost zen_elion[287768]: "rotational": "1", Dec 15 04:46:16 localhost zen_elion[287768]: "sas_address": "", Dec 15 04:46:16 localhost zen_elion[287768]: "sas_device_handle": "", Dec 15 04:46:16 localhost zen_elion[287768]: "scheduler_mode": "mq-deadline", Dec 15 04:46:16 localhost zen_elion[287768]: "sectors": 0, Dec 15 04:46:16 localhost zen_elion[287768]: "sectorsize": "2048", Dec 15 04:46:16 localhost zen_elion[287768]: "size": 493568.0, Dec 15 04:46:16 
localhost zen_elion[287768]: "support_discard": "0", Dec 15 04:46:16 localhost zen_elion[287768]: "type": "disk", Dec 15 04:46:16 localhost zen_elion[287768]: "vendor": "QEMU" Dec 15 04:46:16 localhost zen_elion[287768]: } Dec 15 04:46:16 localhost zen_elion[287768]: } Dec 15 04:46:16 localhost zen_elion[287768]: ] Dec 15 04:46:16 localhost systemd[1]: libpod-9090db0fdbb8c9e5395cda50908e6d3c78c4b32388597563f2f8b70a10515d85.scope: Deactivated successfully. Dec 15 04:46:16 localhost podman[287752]: 2025-12-15 09:46:16.152225196 +0000 UTC m=+1.111074125 container died 9090db0fdbb8c9e5395cda50908e6d3c78c4b32388597563f2f8b70a10515d85 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=zen_elion, url=https://catalog.redhat.com/en/search?searchType=containers, vendor=Red Hat, Inc., RELEASE=main, release=1763362218, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat Ceph Storage 7, architecture=x86_64, vcs-type=git, ceph=True, com.redhat.component=rhceph-container, GIT_BRANCH=main, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, build-date=2025-11-26T19:44:28Z, GIT_CLEAN=True, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.openshift.expose-services=, CEPH_POINT_RELEASE=, GIT_REPO=https://github.com/ceph/ceph-container.git, maintainer=Guillaume Abrioux , version=7, description=Red Hat Ceph Storage 7, io.openshift.tags=rhceph ceph, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, distribution-scope=public, name=rhceph, io.buildah.version=1.41.4, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9) Dec 15 04:46:16 localhost systemd[1]: var-lib-containers-storage-overlay-929beded6df297ab5ef6457cc90d67b4c61fd5f82ba7216fb1e9310fb09ab6c2-merged.mount: Deactivated successfully. 
Dec 15 04:46:16 localhost podman[289473]: 2025-12-15 09:46:16.250389497 +0000 UTC m=+0.091202038 container remove 9090db0fdbb8c9e5395cda50908e6d3c78c4b32388597563f2f8b70a10515d85 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=zen_elion, name=rhceph, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.buildah.version=1.41.4, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, description=Red Hat Ceph Storage 7, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, url=https://catalog.redhat.com/en/search?searchType=containers, build-date=2025-11-26T19:44:28Z, GIT_REPO=https://github.com/ceph/ceph-container.git, com.redhat.component=rhceph-container, CEPH_POINT_RELEASE=, RELEASE=main, release=1763362218, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, ceph=True, vendor=Red Hat, Inc., io.openshift.expose-services=, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat Ceph Storage 7, maintainer=Guillaume Abrioux , version=7, vcs-type=git, architecture=x86_64, distribution-scope=public, GIT_CLEAN=True, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.openshift.tags=rhceph ceph, GIT_BRANCH=main) Dec 15 04:46:16 localhost systemd[1]: libpod-conmon-9090db0fdbb8c9e5395cda50908e6d3c78c4b32388597563f2f8b70a10515d85.scope: Deactivated successfully. 
Dec 15 04:46:18 localhost nova_compute[286344]: 2025-12-15 09:46:18.813 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 04:46:19 localhost nova_compute[286344]: 2025-12-15 09:46:19.776 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 04:46:21 localhost ovn_controller[154603]: 2025-12-15T09:46:21Z|00004|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:74:df:7c 192.168.0.201 Dec 15 04:46:22 localhost systemd[1]: Started /usr/bin/podman healthcheck run 4d46c406a3855fd1c322e5d86c048188ea5865daa07ba23aaebacc7879668923. Dec 15 04:46:22 localhost podman[289504]: 2025-12-15 09:46:22.74925145 +0000 UTC m=+0.080684183 container health_status 4d46c406a3855fd1c322e5d86c048188ea5865daa07ba23aaebacc7879668923 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '07f3af15d79276418f54a8dde2fc251064527c3571430e14af77aaa2c01f1193-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', 
'/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_metadata_agent, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true) Dec 15 04:46:22 localhost podman[289504]: 2025-12-15 09:46:22.782554244 +0000 UTC m=+0.113987067 container exec_died 4d46c406a3855fd1c322e5d86c048188ea5865daa07ba23aaebacc7879668923 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '07f3af15d79276418f54a8dde2fc251064527c3571430e14af77aaa2c01f1193-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', 
'/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_metadata_agent, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team) Dec 15 04:46:22 localhost systemd[1]: 4d46c406a3855fd1c322e5d86c048188ea5865daa07ba23aaebacc7879668923.service: Deactivated successfully. Dec 15 04:46:23 localhost nova_compute[286344]: 2025-12-15 09:46:23.817 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 04:46:24 localhost nova_compute[286344]: 2025-12-15 09:46:24.813 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 04:46:24 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:fc:1f:13 MACDST=fa:16:3e:b3:bb:e3 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=26398 DF PROTO=TCP SPT=47226 DPT=9102 SEQ=13051780 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A83A9B2010000000001030307) Dec 15 04:46:25 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:fc:1f:13 MACDST=fa:16:3e:b3:bb:e3 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=26399 DF PROTO=TCP SPT=47226 DPT=9102 SEQ=13051780 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A83A9B6250000000001030307) Dec 15 04:46:26 localhost ovn_metadata_agent[160585]: 2025-12-15 
09:46:26.441 160779 DEBUG eventlet.wsgi.server [-] (160779) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004#033[00m Dec 15 04:46:26 localhost ovn_metadata_agent[160585]: 2025-12-15 09:46:26.443 160779 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/instance-id HTTP/1.0#015 Dec 15 04:46:26 localhost ovn_metadata_agent[160585]: Accept: */*#015 Dec 15 04:46:26 localhost ovn_metadata_agent[160585]: Connection: close#015 Dec 15 04:46:26 localhost ovn_metadata_agent[160585]: Content-Type: text/plain#015 Dec 15 04:46:26 localhost ovn_metadata_agent[160585]: Host: 169.254.169.254#015 Dec 15 04:46:26 localhost ovn_metadata_agent[160585]: User-Agent: curl/7.84.0#015 Dec 15 04:46:26 localhost ovn_metadata_agent[160585]: X-Forwarded-For: 192.168.0.201#015 Dec 15 04:46:26 localhost ovn_metadata_agent[160585]: X-Ovn-Network-Id: befb7a72-17a9-4bcb-b561-84b8f626685a __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82#033[00m Dec 15 04:46:26 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:fc:1f:13 MACDST=fa:16:3e:b3:bb:e3 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=42877 DF PROTO=TCP SPT=59076 DPT=9102 SEQ=2096573310 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A83A9B9250000000001030307) Dec 15 04:46:27 localhost ovn_metadata_agent[160585]: 2025-12-15 09:46:27.770 160779 DEBUG neutron.agent.ovn.metadata.server [-] _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161#033[00m Dec 15 04:46:27 localhost ovn_metadata_agent[160585]: 2025-12-15 09:46:27.771 160779 INFO eventlet.wsgi.server [-] 192.168.0.201, "GET /2009-04-04/meta-data/instance-id HTTP/1.1" status: 200 len: 146 time: 1.3283246#033[00m Dec 15 04:46:27 localhost haproxy-metadata-proxy-befb7a72-17a9-4bcb-b561-84b8f626685a[287511]: 192.168.0.201:51558 [15/Dec/2025:09:46:26.440] listener listener/metadata 0/0/0/1330/1330 200 130 - - ---- 
1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/instance-id HTTP/1.1" Dec 15 04:46:27 localhost ovn_metadata_agent[160585]: 2025-12-15 09:46:27.787 160779 DEBUG eventlet.wsgi.server [-] (160779) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004#033[00m Dec 15 04:46:27 localhost ovn_metadata_agent[160585]: 2025-12-15 09:46:27.788 160779 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/public-keys HTTP/1.0#015 Dec 15 04:46:27 localhost ovn_metadata_agent[160585]: Accept: */*#015 Dec 15 04:46:27 localhost ovn_metadata_agent[160585]: Connection: close#015 Dec 15 04:46:27 localhost ovn_metadata_agent[160585]: Content-Type: text/plain#015 Dec 15 04:46:27 localhost ovn_metadata_agent[160585]: Host: 169.254.169.254#015 Dec 15 04:46:27 localhost ovn_metadata_agent[160585]: User-Agent: curl/7.84.0#015 Dec 15 04:46:27 localhost ovn_metadata_agent[160585]: X-Forwarded-For: 192.168.0.201#015 Dec 15 04:46:27 localhost ovn_metadata_agent[160585]: X-Ovn-Network-Id: befb7a72-17a9-4bcb-b561-84b8f626685a __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82#033[00m Dec 15 04:46:27 localhost haproxy-metadata-proxy-befb7a72-17a9-4bcb-b561-84b8f626685a[287511]: 192.168.0.201:51570 [15/Dec/2025:09:46:27.786] listener listener/metadata 0/0/0/21/21 404 281 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/public-keys HTTP/1.1" Dec 15 04:46:27 localhost ovn_metadata_agent[160585]: 2025-12-15 09:46:27.808 160779 INFO eventlet.wsgi.server [-] 192.168.0.201, "GET /2009-04-04/meta-data/public-keys HTTP/1.1" status: 404 len: 297 time: 0.0199349#033[00m Dec 15 04:46:27 localhost ovn_metadata_agent[160585]: 2025-12-15 09:46:27.823 160779 DEBUG eventlet.wsgi.server [-] (160779) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004#033[00m Dec 15 04:46:27 localhost ovn_metadata_agent[160585]: 2025-12-15 09:46:27.824 160779 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET 
/2009-04-04/meta-data/instance-id HTTP/1.0#015 Dec 15 04:46:27 localhost ovn_metadata_agent[160585]: Accept: */*#015 Dec 15 04:46:27 localhost ovn_metadata_agent[160585]: Connection: close#015 Dec 15 04:46:27 localhost ovn_metadata_agent[160585]: Content-Type: text/plain#015 Dec 15 04:46:27 localhost ovn_metadata_agent[160585]: Host: 169.254.169.254#015 Dec 15 04:46:27 localhost ovn_metadata_agent[160585]: User-Agent: curl/7.84.0#015 Dec 15 04:46:27 localhost ovn_metadata_agent[160585]: X-Forwarded-For: 192.168.0.201#015 Dec 15 04:46:27 localhost ovn_metadata_agent[160585]: X-Ovn-Network-Id: befb7a72-17a9-4bcb-b561-84b8f626685a __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82#033[00m Dec 15 04:46:27 localhost ovn_metadata_agent[160585]: 2025-12-15 09:46:27.838 160779 DEBUG neutron.agent.ovn.metadata.server [-] _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161#033[00m Dec 15 04:46:27 localhost haproxy-metadata-proxy-befb7a72-17a9-4bcb-b561-84b8f626685a[287511]: 192.168.0.201:51576 [15/Dec/2025:09:46:27.823] listener listener/metadata 0/0/0/15/15 200 130 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/instance-id HTTP/1.1" Dec 15 04:46:27 localhost ovn_metadata_agent[160585]: 2025-12-15 09:46:27.839 160779 INFO eventlet.wsgi.server [-] 192.168.0.201, "GET /2009-04-04/meta-data/instance-id HTTP/1.1" status: 200 len: 146 time: 0.0148685#033[00m Dec 15 04:46:27 localhost ovn_metadata_agent[160585]: 2025-12-15 09:46:27.845 160779 DEBUG eventlet.wsgi.server [-] (160779) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004#033[00m Dec 15 04:46:27 localhost ovn_metadata_agent[160585]: 2025-12-15 09:46:27.846 160779 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/ami-launch-index HTTP/1.0#015 Dec 15 04:46:27 localhost ovn_metadata_agent[160585]: Accept: */*#015 Dec 15 04:46:27 localhost ovn_metadata_agent[160585]: Connection: close#015 Dec 15 04:46:27 
localhost ovn_metadata_agent[160585]: Content-Type: text/plain#015 Dec 15 04:46:27 localhost ovn_metadata_agent[160585]: Host: 169.254.169.254#015 Dec 15 04:46:27 localhost ovn_metadata_agent[160585]: User-Agent: curl/7.84.0#015 Dec 15 04:46:27 localhost ovn_metadata_agent[160585]: X-Forwarded-For: 192.168.0.201#015 Dec 15 04:46:27 localhost ovn_metadata_agent[160585]: X-Ovn-Network-Id: befb7a72-17a9-4bcb-b561-84b8f626685a __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82#033[00m Dec 15 04:46:27 localhost ovn_metadata_agent[160585]: 2025-12-15 09:46:27.858 160779 DEBUG neutron.agent.ovn.metadata.server [-] _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161#033[00m Dec 15 04:46:27 localhost ovn_metadata_agent[160585]: 2025-12-15 09:46:27.859 160779 INFO eventlet.wsgi.server [-] 192.168.0.201, "GET /2009-04-04/meta-data/ami-launch-index HTTP/1.1" status: 200 len: 136 time: 0.0131168#033[00m Dec 15 04:46:27 localhost haproxy-metadata-proxy-befb7a72-17a9-4bcb-b561-84b8f626685a[287511]: 192.168.0.201:51588 [15/Dec/2025:09:46:27.845] listener listener/metadata 0/0/0/14/14 200 120 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/ami-launch-index HTTP/1.1" Dec 15 04:46:27 localhost ovn_metadata_agent[160585]: 2025-12-15 09:46:27.866 160779 DEBUG eventlet.wsgi.server [-] (160779) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004#033[00m Dec 15 04:46:27 localhost ovn_metadata_agent[160585]: 2025-12-15 09:46:27.866 160779 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/instance-type HTTP/1.0#015 Dec 15 04:46:27 localhost ovn_metadata_agent[160585]: Accept: */*#015 Dec 15 04:46:27 localhost ovn_metadata_agent[160585]: Connection: close#015 Dec 15 04:46:27 localhost ovn_metadata_agent[160585]: Content-Type: text/plain#015 Dec 15 04:46:27 localhost ovn_metadata_agent[160585]: Host: 169.254.169.254#015 Dec 15 04:46:27 localhost ovn_metadata_agent[160585]: 
User-Agent: curl/7.84.0#015 Dec 15 04:46:27 localhost ovn_metadata_agent[160585]: X-Forwarded-For: 192.168.0.201#015 Dec 15 04:46:27 localhost ovn_metadata_agent[160585]: X-Ovn-Network-Id: befb7a72-17a9-4bcb-b561-84b8f626685a __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82#033[00m Dec 15 04:46:27 localhost ovn_metadata_agent[160585]: 2025-12-15 09:46:27.878 160779 DEBUG neutron.agent.ovn.metadata.server [-] _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161#033[00m Dec 15 04:46:27 localhost haproxy-metadata-proxy-befb7a72-17a9-4bcb-b561-84b8f626685a[287511]: 192.168.0.201:51592 [15/Dec/2025:09:46:27.865] listener listener/metadata 0/0/0/13/13 200 127 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/instance-type HTTP/1.1" Dec 15 04:46:27 localhost ovn_metadata_agent[160585]: 2025-12-15 09:46:27.878 160779 INFO eventlet.wsgi.server [-] 192.168.0.201, "GET /2009-04-04/meta-data/instance-type HTTP/1.1" status: 200 len: 143 time: 0.0117610#033[00m Dec 15 04:46:27 localhost ovn_metadata_agent[160585]: 2025-12-15 09:46:27.885 160779 DEBUG eventlet.wsgi.server [-] (160779) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004#033[00m Dec 15 04:46:27 localhost ovn_metadata_agent[160585]: 2025-12-15 09:46:27.886 160779 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/local-ipv4 HTTP/1.0#015 Dec 15 04:46:27 localhost ovn_metadata_agent[160585]: Accept: */*#015 Dec 15 04:46:27 localhost ovn_metadata_agent[160585]: Connection: close#015 Dec 15 04:46:27 localhost ovn_metadata_agent[160585]: Content-Type: text/plain#015 Dec 15 04:46:27 localhost ovn_metadata_agent[160585]: Host: 169.254.169.254#015 Dec 15 04:46:27 localhost ovn_metadata_agent[160585]: User-Agent: curl/7.84.0#015 Dec 15 04:46:27 localhost ovn_metadata_agent[160585]: X-Forwarded-For: 192.168.0.201#015 Dec 15 04:46:27 localhost ovn_metadata_agent[160585]: X-Ovn-Network-Id: 
befb7a72-17a9-4bcb-b561-84b8f626685a __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82#033[00m Dec 15 04:46:27 localhost ovn_metadata_agent[160585]: 2025-12-15 09:46:27.901 160779 DEBUG neutron.agent.ovn.metadata.server [-] _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161#033[00m Dec 15 04:46:27 localhost haproxy-metadata-proxy-befb7a72-17a9-4bcb-b561-84b8f626685a[287511]: 192.168.0.201:51606 [15/Dec/2025:09:46:27.884] listener listener/metadata 0/0/0/16/16 200 133 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/local-ipv4 HTTP/1.1" Dec 15 04:46:27 localhost ovn_metadata_agent[160585]: 2025-12-15 09:46:27.901 160779 INFO eventlet.wsgi.server [-] 192.168.0.201, "GET /2009-04-04/meta-data/local-ipv4 HTTP/1.1" status: 200 len: 149 time: 0.0148394#033[00m Dec 15 04:46:27 localhost ovn_metadata_agent[160585]: 2025-12-15 09:46:27.909 160779 DEBUG eventlet.wsgi.server [-] (160779) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004#033[00m Dec 15 04:46:27 localhost ovn_metadata_agent[160585]: 2025-12-15 09:46:27.910 160779 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/public-ipv4 HTTP/1.0#015 Dec 15 04:46:27 localhost ovn_metadata_agent[160585]: Accept: */*#015 Dec 15 04:46:27 localhost ovn_metadata_agent[160585]: Connection: close#015 Dec 15 04:46:27 localhost ovn_metadata_agent[160585]: Content-Type: text/plain#015 Dec 15 04:46:27 localhost ovn_metadata_agent[160585]: Host: 169.254.169.254#015 Dec 15 04:46:27 localhost ovn_metadata_agent[160585]: User-Agent: curl/7.84.0#015 Dec 15 04:46:27 localhost ovn_metadata_agent[160585]: X-Forwarded-For: 192.168.0.201#015 Dec 15 04:46:27 localhost ovn_metadata_agent[160585]: X-Ovn-Network-Id: befb7a72-17a9-4bcb-b561-84b8f626685a __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82#033[00m Dec 15 04:46:27 localhost ovn_metadata_agent[160585]: 2025-12-15 09:46:27.921 160779 
DEBUG neutron.agent.ovn.metadata.server [-] _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161#033[00m Dec 15 04:46:27 localhost haproxy-metadata-proxy-befb7a72-17a9-4bcb-b561-84b8f626685a[287511]: 192.168.0.201:51610 [15/Dec/2025:09:46:27.908] listener listener/metadata 0/0/0/13/13 200 134 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/public-ipv4 HTTP/1.1" Dec 15 04:46:27 localhost ovn_metadata_agent[160585]: 2025-12-15 09:46:27.922 160779 INFO eventlet.wsgi.server [-] 192.168.0.201, "GET /2009-04-04/meta-data/public-ipv4 HTTP/1.1" status: 200 len: 150 time: 0.0123756#033[00m Dec 15 04:46:27 localhost ovn_metadata_agent[160585]: 2025-12-15 09:46:27.929 160779 DEBUG eventlet.wsgi.server [-] (160779) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004#033[00m Dec 15 04:46:27 localhost ovn_metadata_agent[160585]: 2025-12-15 09:46:27.930 160779 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/hostname HTTP/1.0#015 Dec 15 04:46:27 localhost ovn_metadata_agent[160585]: Accept: */*#015 Dec 15 04:46:27 localhost ovn_metadata_agent[160585]: Connection: close#015 Dec 15 04:46:27 localhost ovn_metadata_agent[160585]: Content-Type: text/plain#015 Dec 15 04:46:27 localhost ovn_metadata_agent[160585]: Host: 169.254.169.254#015 Dec 15 04:46:27 localhost ovn_metadata_agent[160585]: User-Agent: curl/7.84.0#015 Dec 15 04:46:27 localhost ovn_metadata_agent[160585]: X-Forwarded-For: 192.168.0.201#015 Dec 15 04:46:27 localhost ovn_metadata_agent[160585]: X-Ovn-Network-Id: befb7a72-17a9-4bcb-b561-84b8f626685a __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82#033[00m Dec 15 04:46:27 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:fc:1f:13 MACDST=fa:16:3e:b3:bb:e3 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=26400 DF PROTO=TCP SPT=47226 DPT=9102 SEQ=13051780 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT 
(020405500402080A83A9BE250000000001030307) Dec 15 04:46:27 localhost ovn_metadata_agent[160585]: 2025-12-15 09:46:27.951 160779 DEBUG neutron.agent.ovn.metadata.server [-] _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161#033[00m Dec 15 04:46:27 localhost haproxy-metadata-proxy-befb7a72-17a9-4bcb-b561-84b8f626685a[287511]: 192.168.0.201:51620 [15/Dec/2025:09:46:27.928] listener listener/metadata 0/0/0/23/23 200 123 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/hostname HTTP/1.1" Dec 15 04:46:27 localhost ovn_metadata_agent[160585]: 2025-12-15 09:46:27.951 160779 INFO eventlet.wsgi.server [-] 192.168.0.201, "GET /2009-04-04/meta-data/hostname HTTP/1.1" status: 200 len: 139 time: 0.0219848#033[00m Dec 15 04:46:27 localhost ovn_metadata_agent[160585]: 2025-12-15 09:46:27.958 160779 DEBUG eventlet.wsgi.server [-] (160779) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004#033[00m Dec 15 04:46:27 localhost ovn_metadata_agent[160585]: 2025-12-15 09:46:27.959 160779 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/local-hostname HTTP/1.0#015 Dec 15 04:46:27 localhost ovn_metadata_agent[160585]: Accept: */*#015 Dec 15 04:46:27 localhost ovn_metadata_agent[160585]: Connection: close#015 Dec 15 04:46:27 localhost ovn_metadata_agent[160585]: Content-Type: text/plain#015 Dec 15 04:46:27 localhost ovn_metadata_agent[160585]: Host: 169.254.169.254#015 Dec 15 04:46:27 localhost ovn_metadata_agent[160585]: User-Agent: curl/7.84.0#015 Dec 15 04:46:27 localhost ovn_metadata_agent[160585]: X-Forwarded-For: 192.168.0.201#015 Dec 15 04:46:27 localhost ovn_metadata_agent[160585]: X-Ovn-Network-Id: befb7a72-17a9-4bcb-b561-84b8f626685a __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82#033[00m Dec 15 04:46:27 localhost ovn_metadata_agent[160585]: 2025-12-15 09:46:27.973 160779 DEBUG neutron.agent.ovn.metadata.server [-] _proxy_request 
/usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161#033[00m Dec 15 04:46:27 localhost haproxy-metadata-proxy-befb7a72-17a9-4bcb-b561-84b8f626685a[287511]: 192.168.0.201:51630 [15/Dec/2025:09:46:27.958] listener listener/metadata 0/0/0/15/15 200 123 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/local-hostname HTTP/1.1" Dec 15 04:46:27 localhost ovn_metadata_agent[160585]: 2025-12-15 09:46:27.973 160779 INFO eventlet.wsgi.server [-] 192.168.0.201, "GET /2009-04-04/meta-data/local-hostname HTTP/1.1" status: 200 len: 139 time: 0.0144749#033[00m Dec 15 04:46:27 localhost ovn_metadata_agent[160585]: 2025-12-15 09:46:27.980 160779 DEBUG eventlet.wsgi.server [-] (160779) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004#033[00m Dec 15 04:46:27 localhost ovn_metadata_agent[160585]: 2025-12-15 09:46:27.981 160779 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/user-data HTTP/1.0#015 Dec 15 04:46:27 localhost ovn_metadata_agent[160585]: Accept: */*#015 Dec 15 04:46:27 localhost ovn_metadata_agent[160585]: Connection: close#015 Dec 15 04:46:27 localhost ovn_metadata_agent[160585]: Content-Type: text/plain#015 Dec 15 04:46:27 localhost ovn_metadata_agent[160585]: Host: 169.254.169.254#015 Dec 15 04:46:27 localhost ovn_metadata_agent[160585]: User-Agent: curl/7.84.0#015 Dec 15 04:46:27 localhost ovn_metadata_agent[160585]: X-Forwarded-For: 192.168.0.201#015 Dec 15 04:46:27 localhost ovn_metadata_agent[160585]: X-Ovn-Network-Id: befb7a72-17a9-4bcb-b561-84b8f626685a __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82#033[00m Dec 15 04:46:27 localhost haproxy-metadata-proxy-befb7a72-17a9-4bcb-b561-84b8f626685a[287511]: 192.168.0.201:51644 [15/Dec/2025:09:46:27.979] listener listener/metadata 0/0/0/13/13 404 281 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/user-data HTTP/1.1" Dec 15 04:46:27 localhost ovn_metadata_agent[160585]: 2025-12-15 09:46:27.993 160779 INFO eventlet.wsgi.server [-] 
192.168.0.201, "GET /2009-04-04/user-data HTTP/1.1" status: 404 len: 297 time: 0.0127912#033[00m Dec 15 04:46:28 localhost ovn_metadata_agent[160585]: 2025-12-15 09:46:28.007 160779 DEBUG eventlet.wsgi.server [-] (160779) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004#033[00m Dec 15 04:46:28 localhost ovn_metadata_agent[160585]: 2025-12-15 09:46:28.007 160779 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/block-device-mapping HTTP/1.0#015 Dec 15 04:46:28 localhost ovn_metadata_agent[160585]: Accept: */*#015 Dec 15 04:46:28 localhost ovn_metadata_agent[160585]: Connection: close#015 Dec 15 04:46:28 localhost ovn_metadata_agent[160585]: Content-Type: text/plain#015 Dec 15 04:46:28 localhost ovn_metadata_agent[160585]: Host: 169.254.169.254#015 Dec 15 04:46:28 localhost ovn_metadata_agent[160585]: User-Agent: curl/7.84.0#015 Dec 15 04:46:28 localhost ovn_metadata_agent[160585]: X-Forwarded-For: 192.168.0.201#015 Dec 15 04:46:28 localhost ovn_metadata_agent[160585]: X-Ovn-Network-Id: befb7a72-17a9-4bcb-b561-84b8f626685a __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82#033[00m Dec 15 04:46:28 localhost ovn_metadata_agent[160585]: 2025-12-15 09:46:28.020 160779 DEBUG neutron.agent.ovn.metadata.server [-] _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161#033[00m Dec 15 04:46:28 localhost haproxy-metadata-proxy-befb7a72-17a9-4bcb-b561-84b8f626685a[287511]: 192.168.0.201:51660 [15/Dec/2025:09:46:28.006] listener listener/metadata 0/0/0/14/14 200 139 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/block-device-mapping HTTP/1.1" Dec 15 04:46:28 localhost ovn_metadata_agent[160585]: 2025-12-15 09:46:28.021 160779 INFO eventlet.wsgi.server [-] 192.168.0.201, "GET /2009-04-04/meta-data/block-device-mapping HTTP/1.1" status: 200 len: 155 time: 0.0131769#033[00m Dec 15 04:46:28 localhost ovn_metadata_agent[160585]: 2025-12-15 09:46:28.026 160779 
DEBUG eventlet.wsgi.server [-] (160779) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004#033[00m Dec 15 04:46:28 localhost ovn_metadata_agent[160585]: 2025-12-15 09:46:28.027 160779 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/block-device-mapping/ami HTTP/1.0#015 Dec 15 04:46:28 localhost ovn_metadata_agent[160585]: Accept: */*#015 Dec 15 04:46:28 localhost ovn_metadata_agent[160585]: Connection: close#015 Dec 15 04:46:28 localhost ovn_metadata_agent[160585]: Content-Type: text/plain#015 Dec 15 04:46:28 localhost ovn_metadata_agent[160585]: Host: 169.254.169.254#015 Dec 15 04:46:28 localhost ovn_metadata_agent[160585]: User-Agent: curl/7.84.0#015 Dec 15 04:46:28 localhost ovn_metadata_agent[160585]: X-Forwarded-For: 192.168.0.201#015 Dec 15 04:46:28 localhost ovn_metadata_agent[160585]: X-Ovn-Network-Id: befb7a72-17a9-4bcb-b561-84b8f626685a __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82#033[00m Dec 15 04:46:28 localhost ovn_metadata_agent[160585]: 2025-12-15 09:46:28.039 160779 DEBUG neutron.agent.ovn.metadata.server [-] _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161#033[00m Dec 15 04:46:28 localhost ovn_metadata_agent[160585]: 2025-12-15 09:46:28.039 160779 INFO eventlet.wsgi.server [-] 192.168.0.201, "GET /2009-04-04/meta-data/block-device-mapping/ami HTTP/1.1" status: 200 len: 138 time: 0.0122070#033[00m Dec 15 04:46:28 localhost haproxy-metadata-proxy-befb7a72-17a9-4bcb-b561-84b8f626685a[287511]: 192.168.0.201:51666 [15/Dec/2025:09:46:28.026] listener listener/metadata 0/0/0/13/13 200 122 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/block-device-mapping/ami HTTP/1.1" Dec 15 04:46:28 localhost ovn_metadata_agent[160585]: 2025-12-15 09:46:28.044 160779 DEBUG eventlet.wsgi.server [-] (160779) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004#033[00m Dec 15 04:46:28 localhost 
ovn_metadata_agent[160585]: 2025-12-15 09:46:28.045 160779 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/block-device-mapping/ephemeral0 HTTP/1.0#015 Dec 15 04:46:28 localhost ovn_metadata_agent[160585]: Accept: */*#015 Dec 15 04:46:28 localhost ovn_metadata_agent[160585]: Connection: close#015 Dec 15 04:46:28 localhost ovn_metadata_agent[160585]: Content-Type: text/plain#015 Dec 15 04:46:28 localhost ovn_metadata_agent[160585]: Host: 169.254.169.254#015 Dec 15 04:46:28 localhost ovn_metadata_agent[160585]: User-Agent: curl/7.84.0#015 Dec 15 04:46:28 localhost ovn_metadata_agent[160585]: X-Forwarded-For: 192.168.0.201#015 Dec 15 04:46:28 localhost ovn_metadata_agent[160585]: X-Ovn-Network-Id: befb7a72-17a9-4bcb-b561-84b8f626685a __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82#033[00m Dec 15 04:46:28 localhost ovn_metadata_agent[160585]: 2025-12-15 09:46:28.057 160779 DEBUG neutron.agent.ovn.metadata.server [-] _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161#033[00m Dec 15 04:46:28 localhost haproxy-metadata-proxy-befb7a72-17a9-4bcb-b561-84b8f626685a[287511]: 192.168.0.201:51676 [15/Dec/2025:09:46:28.044] listener listener/metadata 0/0/0/13/13 200 127 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/block-device-mapping/ephemeral0 HTTP/1.1" Dec 15 04:46:28 localhost ovn_metadata_agent[160585]: 2025-12-15 09:46:28.057 160779 INFO eventlet.wsgi.server [-] 192.168.0.201, "GET /2009-04-04/meta-data/block-device-mapping/ephemeral0 HTTP/1.1" status: 200 len: 143 time: 0.0121768#033[00m Dec 15 04:46:28 localhost ovn_metadata_agent[160585]: 2025-12-15 09:46:28.063 160779 DEBUG eventlet.wsgi.server [-] (160779) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004#033[00m Dec 15 04:46:28 localhost ovn_metadata_agent[160585]: 2025-12-15 09:46:28.063 160779 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET 
/2009-04-04/meta-data/block-device-mapping/root HTTP/1.0#015 Dec 15 04:46:28 localhost ovn_metadata_agent[160585]: Accept: */*#015 Dec 15 04:46:28 localhost ovn_metadata_agent[160585]: Connection: close#015 Dec 15 04:46:28 localhost ovn_metadata_agent[160585]: Content-Type: text/plain#015 Dec 15 04:46:28 localhost ovn_metadata_agent[160585]: Host: 169.254.169.254#015 Dec 15 04:46:28 localhost ovn_metadata_agent[160585]: User-Agent: curl/7.84.0#015 Dec 15 04:46:28 localhost ovn_metadata_agent[160585]: X-Forwarded-For: 192.168.0.201#015 Dec 15 04:46:28 localhost ovn_metadata_agent[160585]: X-Ovn-Network-Id: befb7a72-17a9-4bcb-b561-84b8f626685a __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82#033[00m Dec 15 04:46:28 localhost ovn_metadata_agent[160585]: 2025-12-15 09:46:28.075 160779 DEBUG neutron.agent.ovn.metadata.server [-] _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161#033[00m Dec 15 04:46:28 localhost ovn_metadata_agent[160585]: 2025-12-15 09:46:28.075 160779 INFO eventlet.wsgi.server [-] 192.168.0.201, "GET /2009-04-04/meta-data/block-device-mapping/root HTTP/1.1" status: 200 len: 143 time: 0.0120099#033[00m Dec 15 04:46:28 localhost haproxy-metadata-proxy-befb7a72-17a9-4bcb-b561-84b8f626685a[287511]: 192.168.0.201:51686 [15/Dec/2025:09:46:28.062] listener listener/metadata 0/0/0/13/13 200 127 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/block-device-mapping/root HTTP/1.1" Dec 15 04:46:28 localhost ovn_metadata_agent[160585]: 2025-12-15 09:46:28.082 160779 DEBUG eventlet.wsgi.server [-] (160779) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004#033[00m Dec 15 04:46:28 localhost ovn_metadata_agent[160585]: 2025-12-15 09:46:28.083 160779 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/public-hostname HTTP/1.0#015 Dec 15 04:46:28 localhost ovn_metadata_agent[160585]: Accept: */*#015 Dec 15 04:46:28 localhost 
ovn_metadata_agent[160585]: Connection: close#015 Dec 15 04:46:28 localhost ovn_metadata_agent[160585]: Content-Type: text/plain#015 Dec 15 04:46:28 localhost ovn_metadata_agent[160585]: Host: 169.254.169.254#015 Dec 15 04:46:28 localhost ovn_metadata_agent[160585]: User-Agent: curl/7.84.0#015 Dec 15 04:46:28 localhost ovn_metadata_agent[160585]: X-Forwarded-For: 192.168.0.201#015 Dec 15 04:46:28 localhost ovn_metadata_agent[160585]: X-Ovn-Network-Id: befb7a72-17a9-4bcb-b561-84b8f626685a __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82#033[00m Dec 15 04:46:28 localhost ovn_metadata_agent[160585]: 2025-12-15 09:46:28.108 160779 DEBUG neutron.agent.ovn.metadata.server [-] _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161#033[00m Dec 15 04:46:28 localhost haproxy-metadata-proxy-befb7a72-17a9-4bcb-b561-84b8f626685a[287511]: 192.168.0.201:51692 [15/Dec/2025:09:46:28.081] listener listener/metadata 0/0/0/27/27 200 123 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/public-hostname HTTP/1.1" Dec 15 04:46:28 localhost ovn_metadata_agent[160585]: 2025-12-15 09:46:28.109 160779 INFO eventlet.wsgi.server [-] 192.168.0.201, "GET /2009-04-04/meta-data/public-hostname HTTP/1.1" status: 200 len: 139 time: 0.0262125#033[00m Dec 15 04:46:28 localhost ovn_metadata_agent[160585]: 2025-12-15 09:46:28.115 160779 DEBUG eventlet.wsgi.server [-] (160779) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004#033[00m Dec 15 04:46:28 localhost ovn_metadata_agent[160585]: 2025-12-15 09:46:28.116 160779 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/placement/availability-zone HTTP/1.0#015 Dec 15 04:46:28 localhost ovn_metadata_agent[160585]: Accept: */*#015 Dec 15 04:46:28 localhost ovn_metadata_agent[160585]: Connection: close#015 Dec 15 04:46:28 localhost ovn_metadata_agent[160585]: Content-Type: text/plain#015 Dec 15 04:46:28 localhost ovn_metadata_agent[160585]: 
Host: 169.254.169.254#015 Dec 15 04:46:28 localhost ovn_metadata_agent[160585]: User-Agent: curl/7.84.0#015 Dec 15 04:46:28 localhost ovn_metadata_agent[160585]: X-Forwarded-For: 192.168.0.201#015 Dec 15 04:46:28 localhost ovn_metadata_agent[160585]: X-Ovn-Network-Id: befb7a72-17a9-4bcb-b561-84b8f626685a __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82#033[00m Dec 15 04:46:28 localhost ovn_metadata_agent[160585]: 2025-12-15 09:46:28.134 160779 DEBUG neutron.agent.ovn.metadata.server [-] _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161#033[00m Dec 15 04:46:28 localhost haproxy-metadata-proxy-befb7a72-17a9-4bcb-b561-84b8f626685a[287511]: 192.168.0.201:51702 [15/Dec/2025:09:46:28.115] listener listener/metadata 0/0/0/19/19 200 123 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/placement/availability-zone HTTP/1.1" Dec 15 04:46:28 localhost ovn_metadata_agent[160585]: 2025-12-15 09:46:28.134 160779 INFO eventlet.wsgi.server [-] 192.168.0.201, "GET /2009-04-04/meta-data/placement/availability-zone HTTP/1.1" status: 200 len: 139 time: 0.0184495#033[00m Dec 15 04:46:28 localhost systemd[1]: Started /usr/bin/podman healthcheck run a4e94bd7aaee6c174c601060fb5f34559019ccea93fc397263ee3735731cfc1e. 
Dec 15 04:46:28 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:fc:1f:13 MACDST=fa:16:3e:b3:bb:e3 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=54634 DF PROTO=TCP SPT=54598 DPT=9102 SEQ=3163513306 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A83A9C1260000000001030307) Dec 15 04:46:28 localhost podman[289522]: 2025-12-15 09:46:28.748486548 +0000 UTC m=+0.079307733 container health_status a4e94bd7aaee6c174c601060fb5f34559019ccea93fc397263ee3735731cfc1e (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, managed_by=edpm_ansible, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi ) Dec 15 04:46:28 localhost podman[289522]: 2025-12-15 09:46:28.761636607 +0000 UTC m=+0.092457832 container exec_died a4e94bd7aaee6c174c601060fb5f34559019ccea93fc397263ee3735731cfc1e (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': 
'/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter) Dec 15 04:46:28 localhost systemd[1]: a4e94bd7aaee6c174c601060fb5f34559019ccea93fc397263ee3735731cfc1e.service: Deactivated successfully. Dec 15 04:46:28 localhost nova_compute[286344]: 2025-12-15 09:46:28.821 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 04:46:29 localhost nova_compute[286344]: 2025-12-15 09:46:29.855 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 04:46:29 localhost nova_compute[286344]: 2025-12-15 09:46:29.962 286348 DEBUG nova.compute.manager [None req-cebdd535-43c1-45d3-bcb0-34633885f21b 1ba5fce347b64bfebf995f187193f205 c785bf23f53946bc99867d8832a50266 - - default default] [instance: 39ff1bd9-6f6b-44c8-bbec-a1fd9d196359] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m Dec 15 04:46:29 localhost nova_compute[286344]: 2025-12-15 09:46:29.967 286348 INFO nova.compute.manager [None req-cebdd535-43c1-45d3-bcb0-34633885f21b 1ba5fce347b64bfebf995f187193f205 c785bf23f53946bc99867d8832a50266 - - default default] [instance: 39ff1bd9-6f6b-44c8-bbec-a1fd9d196359] Retrieving diagnostics#033[00m Dec 15 04:46:31 localhost podman[243449]: time="2025-12-15T09:46:31Z" level=info msg="List containers: received `last` parameter - overwriting `limit`" Dec 15 04:46:31 localhost podman[243449]: @ - - [15/Dec/2025:09:46:31 
+0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 148265 "" "Go-http-client/1.1" Dec 15 04:46:31 localhost podman[243449]: @ - - [15/Dec/2025:09:46:31 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 17229 "" "Go-http-client/1.1" Dec 15 04:46:31 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:fc:1f:13 MACDST=fa:16:3e:b3:bb:e3 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=26401 DF PROTO=TCP SPT=47226 DPT=9102 SEQ=13051780 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A83A9CDE50000000001030307) Dec 15 04:46:33 localhost nova_compute[286344]: 2025-12-15 09:46:33.825 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 04:46:34 localhost openstack_network_exporter[246484]: ERROR 09:46:34 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server Dec 15 04:46:34 localhost openstack_network_exporter[246484]: ERROR 09:46:34 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd Dec 15 04:46:34 localhost openstack_network_exporter[246484]: ERROR 09:46:34 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd Dec 15 04:46:34 localhost openstack_network_exporter[246484]: ERROR 09:46:34 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath Dec 15 04:46:34 localhost openstack_network_exporter[246484]: Dec 15 04:46:34 localhost openstack_network_exporter[246484]: ERROR 09:46:34 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath Dec 15 04:46:34 localhost openstack_network_exporter[246484]: Dec 15 04:46:34 localhost nova_compute[286344]: 2025-12-15 09:46:34.856 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 
20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 04:46:38 localhost nova_compute[286344]: 2025-12-15 09:46:38.864 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 04:46:39 localhost systemd[1]: Started /usr/bin/podman healthcheck run 67ee72fcd99c896f79e8807764b1d0f04e7d45927d06e306c0986394e8ed42a0. Dec 15 04:46:39 localhost systemd[1]: Started /usr/bin/podman healthcheck run 9f0f481c937606298203797a71924b7614a5d9e3f4a8afd837cfa5b9602aedeb. Dec 15 04:46:39 localhost systemd[1]: Started /usr/bin/podman healthcheck run b3d8483ade539500e228c2af9c0c3e647febb5ec7903610a83aea32631007d4a. Dec 15 04:46:39 localhost podman[289547]: 2025-12-15 09:46:39.757405921 +0000 UTC m=+0.091624540 container health_status 67ee72fcd99c896f79e8807764b1d0f04e7d45927d06e306c0986394e8ed42a0 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 
'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible) Dec 15 04:46:39 localhost podman[289554]: 2025-12-15 09:46:39.804697096 +0000 UTC m=+0.130248041 container health_status b3d8483ade539500e228c2af9c0c3e647febb5ec7903610a83aea32631007d4a (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '07f3af15d79276418f54a8dde2fc251064527c3571430e14af77aaa2c01f1193-cb1e9478634a00c8fb5ad9cd7fcd38c7b2a1a85187dfaa618bc715c24c2b15fb'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.schema-version=1.0, 
org.label-schema.license=GPLv2, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, container_name=ceilometer_agent_compute, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ceilometer_agent_compute, io.buildah.version=1.41.3) Dec 15 04:46:39 localhost podman[289554]: 2025-12-15 09:46:39.815720755 +0000 UTC m=+0.141271650 container exec_died b3d8483ade539500e228c2af9c0c3e647febb5ec7903610a83aea32631007d4a (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '07f3af15d79276418f54a8dde2fc251064527c3571430e14af77aaa2c01f1193-cb1e9478634a00c8fb5ad9cd7fcd38c7b2a1a85187dfaa618bc715c24c2b15fb'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', 
'/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, tcib_managed=true, config_id=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS) Dec 15 04:46:39 localhost systemd[1]: b3d8483ade539500e228c2af9c0c3e647febb5ec7903610a83aea32631007d4a.service: Deactivated successfully. Dec 15 04:46:39 localhost nova_compute[286344]: 2025-12-15 09:46:39.858 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 04:46:39 localhost podman[289548]: 2025-12-15 09:46:39.866267382 +0000 UTC m=+0.191474397 container health_status 9f0f481c937606298203797a71924b7614a5d9e3f4a8afd837cfa5b9602aedeb (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, container_name=multipathd, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, tcib_managed=true, config_id=multipathd, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, managed_by=edpm_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}) Dec 15 04:46:39 localhost podman[289547]: 2025-12-15 09:46:39.871618702 +0000 UTC m=+0.205837361 container exec_died 67ee72fcd99c896f79e8807764b1d0f04e7d45927d06e306c0986394e8ed42a0 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, 
config_id=node_exporter, container_name=node_exporter) Dec 15 04:46:39 localhost podman[289548]: 2025-12-15 09:46:39.882751254 +0000 UTC m=+0.207958239 container exec_died 9f0f481c937606298203797a71924b7614a5d9e3f4a8afd837cfa5b9602aedeb (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, container_name=multipathd) Dec 15 04:46:39 localhost systemd[1]: 67ee72fcd99c896f79e8807764b1d0f04e7d45927d06e306c0986394e8ed42a0.service: Deactivated 
successfully. Dec 15 04:46:39 localhost systemd[1]: 9f0f481c937606298203797a71924b7614a5d9e3f4a8afd837cfa5b9602aedeb.service: Deactivated successfully. Dec 15 04:46:39 localhost ovn_controller[154603]: 2025-12-15T09:46:39Z|00068|memory_trim|INFO|Detected inactivity (last active 30001 ms ago): trimming memory Dec 15 04:46:40 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:fc:1f:13 MACDST=fa:16:3e:b3:bb:e3 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=26402 DF PROTO=TCP SPT=47226 DPT=9102 SEQ=13051780 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A83A9EF250000000001030307) Dec 15 04:46:43 localhost nova_compute[286344]: 2025-12-15 09:46:43.871 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 04:46:44 localhost systemd[1]: Started /usr/bin/podman healthcheck run 730c50020a7d9e2da21d2462d40af20ac303e43f3491f692cbbd87f432ba3e09. Dec 15 04:46:44 localhost podman[289607]: 2025-12-15 09:46:44.748352199 +0000 UTC m=+0.083810721 container health_status 730c50020a7d9e2da21d2462d40af20ac303e43f3491f692cbbd87f432ba3e09 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, name=ubi9-minimal, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vendor=Red Hat, Inc., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, version=9.6, com.redhat.component=ubi9-minimal-container, build-date=2025-08-20T13:12:41, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-type=git, io.buildah.version=1.33.7, maintainer=Red Hat, Inc., architecture=x86_64, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., release=1755695350, io.openshift.tags=minimal rhel9, managed_by=edpm_ansible, container_name=openstack_network_exporter, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'cb1e9478634a00c8fb5ad9cd7fcd38c7b2a1a85187dfaa618bc715c24c2b15fb'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, config_id=openstack_network_exporter, distribution-scope=public) Dec 15 
04:46:44 localhost podman[289607]: 2025-12-15 09:46:44.785448418 +0000 UTC m=+0.120906900 container exec_died 730c50020a7d9e2da21d2462d40af20ac303e43f3491f692cbbd87f432ba3e09 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, architecture=x86_64, build-date=2025-08-20T13:12:41, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=ubi9-minimal, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, version=9.6, config_id=openstack_network_exporter, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'cb1e9478634a00c8fb5ad9cd7fcd38c7b2a1a85187dfaa618bc715c24c2b15fb'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, 
summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., release=1755695350, url=https://catalog.redhat.com/en/search?searchType=containers, vendor=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, container_name=openstack_network_exporter, io.buildah.version=1.33.7, distribution-scope=public, maintainer=Red Hat, Inc., vcs-type=git, io.openshift.tags=minimal rhel9, managed_by=edpm_ansible) Dec 15 04:46:44 localhost systemd[1]: 730c50020a7d9e2da21d2462d40af20ac303e43f3491f692cbbd87f432ba3e09.service: Deactivated successfully. Dec 15 04:46:44 localhost nova_compute[286344]: 2025-12-15 09:46:44.864 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 04:46:45 localhost systemd[1]: Started /usr/bin/podman healthcheck run ac2eae30a2a7f971398af42ea67b3ed799d4b3341f7688ebcd73f7ca7f66c2a8. 
Dec 15 04:46:45 localhost podman[289628]: 2025-12-15 09:46:45.744631853 +0000 UTC m=+0.076309299 container health_status ac2eae30a2a7f971398af42ea67b3ed799d4b3341f7688ebcd73f7ca7f66c2a8 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_id=ovn_controller, container_name=ovn_controller, io.buildah.version=1.41.3, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '07f3af15d79276418f54a8dde2fc251064527c3571430e14af77aaa2c01f1193'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_managed=true) Dec 15 04:46:45 localhost podman[289628]: 2025-12-15 09:46:45.783521614 +0000 UTC m=+0.115199070 container exec_died ac2eae30a2a7f971398af42ea67b3ed799d4b3341f7688ebcd73f7ca7f66c2a8 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, org.label-schema.build-date=20251202, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, io.buildah.version=1.41.3, 
org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '07f3af15d79276418f54a8dde2fc251064527c3571430e14af77aaa2c01f1193'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS) Dec 15 04:46:45 localhost systemd[1]: ac2eae30a2a7f971398af42ea67b3ed799d4b3341f7688ebcd73f7ca7f66c2a8.service: Deactivated successfully. 
Dec 15 04:46:48 localhost nova_compute[286344]: 2025-12-15 09:46:48.895 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 04:46:49 localhost nova_compute[286344]: 2025-12-15 09:46:49.865 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 04:46:51 localhost ovn_metadata_agent[160585]: 2025-12-15 09:46:51.463 160590 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Dec 15 04:46:51 localhost ovn_metadata_agent[160585]: 2025-12-15 09:46:51.463 160590 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Dec 15 04:46:51 localhost ovn_metadata_agent[160585]: 2025-12-15 09:46:51.464 160590 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Dec 15 04:46:53 localhost systemd[1]: Started /usr/bin/podman healthcheck run 4d46c406a3855fd1c322e5d86c048188ea5865daa07ba23aaebacc7879668923. 
Dec 15 04:46:53 localhost podman[289652]: 2025-12-15 09:46:53.396643752 +0000 UTC m=+0.082339168 container health_status 4d46c406a3855fd1c322e5d86c048188ea5865daa07ba23aaebacc7879668923 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '07f3af15d79276418f54a8dde2fc251064527c3571430e14af77aaa2c01f1193-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, tcib_managed=true, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_metadata_agent) Dec 15 04:46:53 localhost 
podman[289652]: 2025-12-15 09:46:53.431358095 +0000 UTC m=+0.117053471 container exec_died 4d46c406a3855fd1c322e5d86c048188ea5865daa07ba23aaebacc7879668923 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '07f3af15d79276418f54a8dde2fc251064527c3571430e14af77aaa2c01f1193-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb) Dec 15 04:46:53 localhost systemd[1]: 
4d46c406a3855fd1c322e5d86c048188ea5865daa07ba23aaebacc7879668923.service: Deactivated successfully. Dec 15 04:46:53 localhost nova_compute[286344]: 2025-12-15 09:46:53.895 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 04:46:54 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:fc:1f:13 MACDST=fa:16:3e:b3:bb:e3 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=17471 DF PROTO=TCP SPT=47126 DPT=9102 SEQ=2216522171 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A83AA27320000000001030307) Dec 15 04:46:54 localhost nova_compute[286344]: 2025-12-15 09:46:54.868 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 04:46:55 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:fc:1f:13 MACDST=fa:16:3e:b3:bb:e3 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=17472 DF PROTO=TCP SPT=47126 DPT=9102 SEQ=2216522171 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A83AA2B250000000001030307) Dec 15 04:46:56 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:fc:1f:13 MACDST=fa:16:3e:b3:bb:e3 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=26403 DF PROTO=TCP SPT=47226 DPT=9102 SEQ=13051780 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A83AA2F260000000001030307) Dec 15 04:46:57 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:fc:1f:13 MACDST=fa:16:3e:b3:bb:e3 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=17473 DF PROTO=TCP SPT=47126 DPT=9102 SEQ=2216522171 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A83AA33250000000001030307) Dec 15 04:46:58 localhost nova_compute[286344]: 2025-12-15 09:46:58.898 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 
20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 04:46:58 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:fc:1f:13 MACDST=fa:16:3e:b3:bb:e3 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=42878 DF PROTO=TCP SPT=59076 DPT=9102 SEQ=2096573310 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A83AA37250000000001030307) Dec 15 04:46:59 localhost systemd[1]: Started /usr/bin/podman healthcheck run a4e94bd7aaee6c174c601060fb5f34559019ccea93fc397263ee3735731cfc1e. Dec 15 04:46:59 localhost systemd[1]: tmp-crun.qyNZ3a.mount: Deactivated successfully. Dec 15 04:46:59 localhost podman[289670]: 2025-12-15 09:46:59.759310839 +0000 UTC m=+0.089525841 container health_status a4e94bd7aaee6c174c601060fb5f34559019ccea93fc397263ee3735731cfc1e (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, managed_by=edpm_ansible, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi ) Dec 15 04:46:59 localhost podman[289670]: 2025-12-15 09:46:59.764115474 +0000 UTC m=+0.094330526 container exec_died a4e94bd7aaee6c174c601060fb5f34559019ccea93fc397263ee3735731cfc1e 
(image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}) Dec 15 04:46:59 localhost systemd[1]: a4e94bd7aaee6c174c601060fb5f34559019ccea93fc397263ee3735731cfc1e.service: Deactivated successfully. 
Dec 15 04:46:59 localhost nova_compute[286344]: 2025-12-15 09:46:59.908 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 04:47:00 localhost nova_compute[286344]: 2025-12-15 09:47:00.559 286348 DEBUG oslo_service.periodic_task [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Dec 15 04:47:00 localhost nova_compute[286344]: 2025-12-15 09:47:00.560 286348 DEBUG oslo_service.periodic_task [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Dec 15 04:47:00 localhost nova_compute[286344]: 2025-12-15 09:47:00.589 286348 DEBUG oslo_service.periodic_task [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Dec 15 04:47:00 localhost nova_compute[286344]: 2025-12-15 09:47:00.590 286348 DEBUG nova.compute.manager [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m Dec 15 04:47:00 localhost nova_compute[286344]: 2025-12-15 09:47:00.590 286348 DEBUG nova.compute.manager [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m Dec 15 04:47:00 localhost nova_compute[286344]: 2025-12-15 09:47:00.727 286348 DEBUG oslo_concurrency.lockutils [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Acquiring lock 
"refresh_cache-39ff1bd9-6f6b-44c8-bbec-a1fd9d196359" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m Dec 15 04:47:00 localhost nova_compute[286344]: 2025-12-15 09:47:00.728 286348 DEBUG oslo_concurrency.lockutils [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Acquired lock "refresh_cache-39ff1bd9-6f6b-44c8-bbec-a1fd9d196359" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m Dec 15 04:47:00 localhost nova_compute[286344]: 2025-12-15 09:47:00.728 286348 DEBUG nova.network.neutron [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] [instance: 39ff1bd9-6f6b-44c8-bbec-a1fd9d196359] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m Dec 15 04:47:00 localhost nova_compute[286344]: 2025-12-15 09:47:00.729 286348 DEBUG nova.objects.instance [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Lazy-loading 'info_cache' on Instance uuid 39ff1bd9-6f6b-44c8-bbec-a1fd9d196359 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m Dec 15 04:47:01 localhost nova_compute[286344]: 2025-12-15 09:47:01.147 286348 DEBUG nova.network.neutron [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] [instance: 39ff1bd9-6f6b-44c8-bbec-a1fd9d196359] Updating instance_info_cache with network_info: [{"id": "03ef8889-3216-43fb-8a52-4be17a956ce1", "address": "fa:16:3e:74:df:7c", "network": {"id": "befb7a72-17a9-4bcb-b561-84b8f626685a", "bridge": "br-int", "label": "private", "subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.201", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.20", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.0.3"}}], "meta": 
{"injected": false, "tenant_id": "c785bf23f53946bc99867d8832a50266", "mtu": 1292, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap03ef8889-32", "ovs_interfaceid": "03ef8889-3216-43fb-8a52-4be17a956ce1", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m Dec 15 04:47:01 localhost nova_compute[286344]: 2025-12-15 09:47:01.170 286348 DEBUG oslo_concurrency.lockutils [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Releasing lock "refresh_cache-39ff1bd9-6f6b-44c8-bbec-a1fd9d196359" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m Dec 15 04:47:01 localhost nova_compute[286344]: 2025-12-15 09:47:01.170 286348 DEBUG nova.compute.manager [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] [instance: 39ff1bd9-6f6b-44c8-bbec-a1fd9d196359] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m Dec 15 04:47:01 localhost nova_compute[286344]: 2025-12-15 09:47:01.171 286348 DEBUG oslo_service.periodic_task [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Dec 15 04:47:01 localhost nova_compute[286344]: 2025-12-15 09:47:01.171 286348 DEBUG oslo_service.periodic_task [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Dec 15 04:47:01 localhost nova_compute[286344]: 2025-12-15 
09:47:01.172 286348 DEBUG oslo_service.periodic_task [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Dec 15 04:47:01 localhost nova_compute[286344]: 2025-12-15 09:47:01.172 286348 DEBUG oslo_service.periodic_task [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Dec 15 04:47:01 localhost nova_compute[286344]: 2025-12-15 09:47:01.172 286348 DEBUG oslo_service.periodic_task [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Dec 15 04:47:01 localhost nova_compute[286344]: 2025-12-15 09:47:01.173 286348 DEBUG oslo_service.periodic_task [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Dec 15 04:47:01 localhost nova_compute[286344]: 2025-12-15 09:47:01.173 286348 DEBUG nova.compute.manager [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... 
_reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m Dec 15 04:47:01 localhost nova_compute[286344]: 2025-12-15 09:47:01.174 286348 DEBUG oslo_service.periodic_task [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Dec 15 04:47:01 localhost nova_compute[286344]: 2025-12-15 09:47:01.190 286348 DEBUG oslo_concurrency.lockutils [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Dec 15 04:47:01 localhost nova_compute[286344]: 2025-12-15 09:47:01.190 286348 DEBUG oslo_concurrency.lockutils [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Dec 15 04:47:01 localhost nova_compute[286344]: 2025-12-15 09:47:01.190 286348 DEBUG oslo_concurrency.lockutils [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Dec 15 04:47:01 localhost nova_compute[286344]: 2025-12-15 09:47:01.191 286348 DEBUG nova.compute.resource_tracker [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Auditing locally available compute resources for np0005559462.localdomain (node: np0005559462.localdomain) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m Dec 15 04:47:01 localhost nova_compute[286344]: 2025-12-15 
09:47:01.191 286348 DEBUG oslo_concurrency.processutils [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m Dec 15 04:47:01 localhost nova_compute[286344]: 2025-12-15 09:47:01.664 286348 DEBUG oslo_concurrency.processutils [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.472s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m Dec 15 04:47:01 localhost nova_compute[286344]: 2025-12-15 09:47:01.725 286348 DEBUG nova.virt.libvirt.driver [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m Dec 15 04:47:01 localhost nova_compute[286344]: 2025-12-15 09:47:01.726 286348 DEBUG nova.virt.libvirt.driver [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m Dec 15 04:47:01 localhost podman[243449]: time="2025-12-15T09:47:01Z" level=info msg="List containers: received `last` parameter - overwriting `limit`" Dec 15 04:47:01 localhost podman[243449]: @ - - [15/Dec/2025:09:47:01 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 148265 "" "Go-http-client/1.1" Dec 15 04:47:01 localhost podman[243449]: @ - - [15/Dec/2025:09:47:01 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 17236 "" "Go-http-client/1.1" Dec 15 04:47:01 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:fc:1f:13 
MACDST=fa:16:3e:b3:bb:e3 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=17474 DF PROTO=TCP SPT=47126 DPT=9102 SEQ=2216522171 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A83AA42E50000000001030307) Dec 15 04:47:01 localhost nova_compute[286344]: 2025-12-15 09:47:01.974 286348 WARNING nova.virt.libvirt.driver [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m Dec 15 04:47:01 localhost nova_compute[286344]: 2025-12-15 09:47:01.976 286348 DEBUG nova.compute.resource_tracker [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Hypervisor/Node resource view: name=np0005559462.localdomain free_ram=12313MB free_disk=41.836978912353516GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", 
"numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m Dec 15 04:47:01 localhost nova_compute[286344]: 2025-12-15 09:47:01.976 286348 DEBUG oslo_concurrency.lockutils [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Dec 15 04:47:01 localhost nova_compute[286344]: 2025-12-15 09:47:01.977 286348 DEBUG oslo_concurrency.lockutils [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Dec 15 04:47:02 localhost nova_compute[286344]: 2025-12-15 09:47:02.047 286348 DEBUG nova.compute.resource_tracker [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Instance 39ff1bd9-6f6b-44c8-bbec-a1fd9d196359 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 2, 'MEMORY_MB': 512, 'VCPU': 1}}. 
_remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m Dec 15 04:47:02 localhost nova_compute[286344]: 2025-12-15 09:47:02.047 286348 DEBUG nova.compute.resource_tracker [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m Dec 15 04:47:02 localhost nova_compute[286344]: 2025-12-15 09:47:02.048 286348 DEBUG nova.compute.resource_tracker [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Final resource view: name=np0005559462.localdomain phys_ram=15738MB used_ram=1024MB phys_disk=41GB used_disk=2GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m Dec 15 04:47:02 localhost nova_compute[286344]: 2025-12-15 09:47:02.082 286348 DEBUG oslo_concurrency.processutils [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m Dec 15 04:47:02 localhost nova_compute[286344]: 2025-12-15 09:47:02.538 286348 DEBUG oslo_concurrency.processutils [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.456s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m Dec 15 04:47:02 localhost nova_compute[286344]: 2025-12-15 09:47:02.545 286348 DEBUG nova.compute.provider_tree [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Inventory has not changed in ProviderTree for provider: 26c8956b-6742-4951-b566-971b9bbe323b update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m Dec 15 04:47:02 localhost nova_compute[286344]: 
2025-12-15 09:47:02.566 286348 DEBUG nova.scheduler.client.report [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Inventory has not changed for provider 26c8956b-6742-4951-b566-971b9bbe323b based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m Dec 15 04:47:02 localhost nova_compute[286344]: 2025-12-15 09:47:02.590 286348 DEBUG nova.compute.resource_tracker [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Compute_service record updated for np0005559462.localdomain:np0005559462.localdomain _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m Dec 15 04:47:02 localhost nova_compute[286344]: 2025-12-15 09:47:02.591 286348 DEBUG oslo_concurrency.lockutils [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.614s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Dec 15 04:47:03 localhost nova_compute[286344]: 2025-12-15 09:47:03.901 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 04:47:04 localhost openstack_network_exporter[246484]: ERROR 09:47:04 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd Dec 15 04:47:04 localhost openstack_network_exporter[246484]: ERROR 09:47:04 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd Dec 15 04:47:04 
localhost openstack_network_exporter[246484]: ERROR 09:47:04 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server Dec 15 04:47:04 localhost openstack_network_exporter[246484]: ERROR 09:47:04 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath Dec 15 04:47:04 localhost openstack_network_exporter[246484]: Dec 15 04:47:04 localhost openstack_network_exporter[246484]: ERROR 09:47:04 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath Dec 15 04:47:04 localhost openstack_network_exporter[246484]: Dec 15 04:47:04 localhost nova_compute[286344]: 2025-12-15 09:47:04.909 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 04:47:08 localhost nova_compute[286344]: 2025-12-15 09:47:08.930 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 04:47:09 localhost nova_compute[286344]: 2025-12-15 09:47:09.911 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 04:47:10 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:fc:1f:13 MACDST=fa:16:3e:b3:bb:e3 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=17475 DF PROTO=TCP SPT=47126 DPT=9102 SEQ=2216522171 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A83AA63250000000001030307) Dec 15 04:47:10 localhost systemd[1]: Started /usr/bin/podman healthcheck run 67ee72fcd99c896f79e8807764b1d0f04e7d45927d06e306c0986394e8ed42a0. Dec 15 04:47:10 localhost systemd[1]: Started /usr/bin/podman healthcheck run 9f0f481c937606298203797a71924b7614a5d9e3f4a8afd837cfa5b9602aedeb. 
Dec 15 04:47:10 localhost systemd[1]: Started /usr/bin/podman healthcheck run b3d8483ade539500e228c2af9c0c3e647febb5ec7903610a83aea32631007d4a. Dec 15 04:47:10 localhost systemd[1]: tmp-crun.0JsR6a.mount: Deactivated successfully. Dec 15 04:47:10 localhost podman[289739]: 2025-12-15 09:47:10.764754515 +0000 UTC m=+0.091484136 container health_status 67ee72fcd99c896f79e8807764b1d0f04e7d45927d06e306c0986394e8ed42a0 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible) Dec 15 04:47:10 localhost podman[289739]: 2025-12-15 09:47:10.803610554 +0000 UTC m=+0.130340205 container exec_died 67ee72fcd99c896f79e8807764b1d0f04e7d45927d06e306c0986394e8ed42a0 
(image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter) Dec 15 04:47:10 localhost systemd[1]: tmp-crun.2Ns9hl.mount: Deactivated successfully. 
Dec 15 04:47:10 localhost podman[289740]: 2025-12-15 09:47:10.819749936 +0000 UTC m=+0.142672421 container health_status 9f0f481c937606298203797a71924b7614a5d9e3f4a8afd837cfa5b9602aedeb (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, container_name=multipathd, org.label-schema.build-date=20251202, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team) Dec 15 04:47:10 localhost systemd[1]: 67ee72fcd99c896f79e8807764b1d0f04e7d45927d06e306c0986394e8ed42a0.service: Deactivated successfully. 
Dec 15 04:47:10 localhost podman[289740]: 2025-12-15 09:47:10.860514039 +0000 UTC m=+0.183436534 container exec_died 9f0f481c937606298203797a71924b7614a5d9e3f4a8afd837cfa5b9602aedeb (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, tcib_managed=true, container_name=multipathd, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.vendor=CentOS, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202) Dec 15 04:47:10 localhost podman[289741]: 2025-12-15 09:47:10.868631206 +0000 UTC m=+0.186721045 container health_status b3d8483ade539500e228c2af9c0c3e647febb5ec7903610a83aea32631007d4a 
(image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, managed_by=edpm_ansible, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '07f3af15d79276418f54a8dde2fc251064527c3571430e14af77aaa2c01f1193-cb1e9478634a00c8fb5ad9cd7fcd38c7b2a1a85187dfaa618bc715c24c2b15fb'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=ceilometer_agent_compute, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true) Dec 15 04:47:10 localhost systemd[1]: 9f0f481c937606298203797a71924b7614a5d9e3f4a8afd837cfa5b9602aedeb.service: Deactivated successfully. 
Dec 15 04:47:10 localhost podman[289741]: 2025-12-15 09:47:10.88230535 +0000 UTC m=+0.200395189 container exec_died b3d8483ade539500e228c2af9c0c3e647febb5ec7903610a83aea32631007d4a (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '07f3af15d79276418f54a8dde2fc251064527c3571430e14af77aaa2c01f1193-cb1e9478634a00c8fb5ad9cd7fcd38c7b2a1a85187dfaa618bc715c24c2b15fb'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, config_id=ceilometer_agent_compute, container_name=ceilometer_agent_compute, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, managed_by=edpm_ansible) Dec 
15 04:47:10 localhost systemd[1]: b3d8483ade539500e228c2af9c0c3e647febb5ec7903610a83aea32631007d4a.service: Deactivated successfully. Dec 15 04:47:10 localhost snmpd[69387]: empty variable list in _query Dec 15 04:47:10 localhost snmpd[69387]: empty variable list in _query Dec 15 04:47:13 localhost nova_compute[286344]: 2025-12-15 09:47:13.933 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 04:47:14 localhost sshd[289801]: main: sshd: ssh-rsa algorithm is disabled Dec 15 04:47:14 localhost nova_compute[286344]: 2025-12-15 09:47:14.913 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 04:47:15 localhost systemd[1]: Started /usr/bin/podman healthcheck run 730c50020a7d9e2da21d2462d40af20ac303e43f3491f692cbbd87f432ba3e09. Dec 15 04:47:15 localhost podman[289803]: 2025-12-15 09:47:15.752643886 +0000 UTC m=+0.077008509 container health_status 730c50020a7d9e2da21d2462d40af20ac303e43f3491f692cbbd87f432ba3e09 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, vendor=Red Hat, Inc., distribution-scope=public, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, maintainer=Red Hat, Inc., name=ubi9-minimal, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=edpm_ansible, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., build-date=2025-08-20T13:12:41, url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. 
This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, io.openshift.tags=minimal rhel9, release=1755695350, vcs-type=git, com.redhat.component=ubi9-minimal-container, container_name=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., config_id=openstack_network_exporter, architecture=x86_64, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'cb1e9478634a00c8fb5ad9cd7fcd38c7b2a1a85187dfaa618bc715c24c2b15fb'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.expose-services=, version=9.6) Dec 15 04:47:15 localhost podman[289803]: 2025-12-15 09:47:15.768295946 +0000 UTC m=+0.092660569 container exec_died 730c50020a7d9e2da21d2462d40af20ac303e43f3491f692cbbd87f432ba3e09 
(image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, architecture=x86_64, distribution-scope=public, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, build-date=2025-08-20T13:12:41, com.redhat.component=ubi9-minimal-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'cb1e9478634a00c8fb5ad9cd7fcd38c7b2a1a85187dfaa618bc715c24c2b15fb'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, managed_by=edpm_ansible, io.buildah.version=1.33.7, release=1755695350, name=ubi9-minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vendor=Red Hat, Inc., vcs-type=git, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. 
This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, version=9.6, container_name=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., config_id=openstack_network_exporter, io.openshift.tags=minimal rhel9) Dec 15 04:47:15 localhost systemd[1]: 730c50020a7d9e2da21d2462d40af20ac303e43f3491f692cbbd87f432ba3e09.service: Deactivated successfully. Dec 15 04:47:16 localhost systemd[1]: Started /usr/bin/podman healthcheck run ac2eae30a2a7f971398af42ea67b3ed799d4b3341f7688ebcd73f7ca7f66c2a8. Dec 15 04:47:16 localhost systemd[1]: tmp-crun.AnCXId.mount: Deactivated successfully. 
Dec 15 04:47:16 localhost podman[289823]: 2025-12-15 09:47:16.599120613 +0000 UTC m=+0.069311693 container health_status ac2eae30a2a7f971398af42ea67b3ed799d4b3341f7688ebcd73f7ca7f66c2a8 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, org.label-schema.build-date=20251202, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '07f3af15d79276418f54a8dde2fc251064527c3571430e14af77aaa2c01f1193'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_id=ovn_controller, io.buildah.version=1.41.3) Dec 15 04:47:16 localhost podman[289823]: 2025-12-15 09:47:16.633292022 +0000 UTC m=+0.103483062 container exec_died ac2eae30a2a7f971398af42ea67b3ed799d4b3341f7688ebcd73f7ca7f66c2a8 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, container_name=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, 
org.label-schema.schema-version=1.0, config_id=ovn_controller, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '07f3af15d79276418f54a8dde2fc251064527c3571430e14af77aaa2c01f1193'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true) Dec 15 04:47:16 localhost systemd[1]: ac2eae30a2a7f971398af42ea67b3ed799d4b3341f7688ebcd73f7ca7f66c2a8.service: Deactivated successfully. Dec 15 04:47:18 localhost nova_compute[286344]: 2025-12-15 09:47:18.963 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 04:47:19 localhost nova_compute[286344]: 2025-12-15 09:47:19.917 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 04:47:23 localhost systemd[1]: Started /usr/bin/podman healthcheck run 4d46c406a3855fd1c322e5d86c048188ea5865daa07ba23aaebacc7879668923. 
Dec 15 04:47:23 localhost podman[289935]: 2025-12-15 09:47:23.753259504 +0000 UTC m=+0.086027552 container health_status 4d46c406a3855fd1c322e5d86c048188ea5865daa07ba23aaebacc7879668923 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '07f3af15d79276418f54a8dde2fc251064527c3571430e14af77aaa2c01f1193-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, config_id=ovn_metadata_agent, org.label-schema.schema-version=1.0, tcib_managed=true) Dec 15 04:47:23 localhost 
podman[289935]: 2025-12-15 09:47:23.78734321 +0000 UTC m=+0.120111158 container exec_died 4d46c406a3855fd1c322e5d86c048188ea5865daa07ba23aaebacc7879668923 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, container_name=ovn_metadata_agent, managed_by=edpm_ansible, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, maintainer=OpenStack Kubernetes Operator team, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '07f3af15d79276418f54a8dde2fc251064527c3571430e14af77aaa2c01f1193-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image) Dec 15 04:47:23 localhost systemd[1]: 
4d46c406a3855fd1c322e5d86c048188ea5865daa07ba23aaebacc7879668923.service: Deactivated successfully. Dec 15 04:47:23 localhost nova_compute[286344]: 2025-12-15 09:47:23.965 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 04:47:24 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:fc:1f:13 MACDST=fa:16:3e:b3:bb:e3 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=8248 DF PROTO=TCP SPT=37004 DPT=9102 SEQ=3550340854 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A83AA9C610000000001030307) Dec 15 04:47:24 localhost nova_compute[286344]: 2025-12-15 09:47:24.918 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 04:47:25 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:fc:1f:13 MACDST=fa:16:3e:b3:bb:e3 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=8249 DF PROTO=TCP SPT=37004 DPT=9102 SEQ=3550340854 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A83AAA0650000000001030307) Dec 15 04:47:26 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:fc:1f:13 MACDST=fa:16:3e:b3:bb:e3 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=17476 DF PROTO=TCP SPT=47126 DPT=9102 SEQ=2216522171 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A83AAA3250000000001030307) Dec 15 04:47:27 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:fc:1f:13 MACDST=fa:16:3e:b3:bb:e3 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=8250 DF PROTO=TCP SPT=37004 DPT=9102 SEQ=3550340854 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A83AAA8650000000001030307) Dec 15 04:47:28 localhost nova_compute[286344]: 2025-12-15 09:47:28.968 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 
__log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 04:47:29 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:fc:1f:13 MACDST=fa:16:3e:b3:bb:e3 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=26404 DF PROTO=TCP SPT=47226 DPT=9102 SEQ=13051780 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A83AAAD250000000001030307) Dec 15 04:47:29 localhost nova_compute[286344]: 2025-12-15 09:47:29.954 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 04:47:30 localhost systemd[1]: Started /usr/bin/podman healthcheck run a4e94bd7aaee6c174c601060fb5f34559019ccea93fc397263ee3735731cfc1e. Dec 15 04:47:30 localhost systemd[1]: tmp-crun.NAJlPm.mount: Deactivated successfully. Dec 15 04:47:30 localhost podman[289953]: 2025-12-15 09:47:30.755415206 +0000 UTC m=+0.086054313 container health_status a4e94bd7aaee6c174c601060fb5f34559019ccea93fc397263ee3735731cfc1e (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, managed_by=edpm_ansible, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi ) Dec 15 04:47:30 localhost podman[289953]: 2025-12-15 
09:47:30.765020925 +0000 UTC m=+0.095660012 container exec_died a4e94bd7aaee6c174c601060fb5f34559019ccea93fc397263ee3735731cfc1e (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, managed_by=edpm_ansible, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi ) Dec 15 04:47:30 localhost systemd[1]: a4e94bd7aaee6c174c601060fb5f34559019ccea93fc397263ee3735731cfc1e.service: Deactivated successfully. 
Dec 15 04:47:31 localhost podman[243449]: time="2025-12-15T09:47:31Z" level=info msg="List containers: received `last` parameter - overwriting `limit`" Dec 15 04:47:31 localhost podman[243449]: @ - - [15/Dec/2025:09:47:31 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 148265 "" "Go-http-client/1.1" Dec 15 04:47:31 localhost podman[243449]: @ - - [15/Dec/2025:09:47:31 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 17230 "" "Go-http-client/1.1" Dec 15 04:47:31 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:fc:1f:13 MACDST=fa:16:3e:b3:bb:e3 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=8251 DF PROTO=TCP SPT=37004 DPT=9102 SEQ=3550340854 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A83AAB8250000000001030307) Dec 15 04:47:33 localhost nova_compute[286344]: 2025-12-15 09:47:33.971 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 04:47:34 localhost openstack_network_exporter[246484]: ERROR 09:47:34 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server Dec 15 04:47:34 localhost openstack_network_exporter[246484]: ERROR 09:47:34 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd Dec 15 04:47:34 localhost openstack_network_exporter[246484]: ERROR 09:47:34 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd Dec 15 04:47:34 localhost openstack_network_exporter[246484]: ERROR 09:47:34 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath Dec 15 04:47:34 localhost openstack_network_exporter[246484]: Dec 15 04:47:34 localhost openstack_network_exporter[246484]: ERROR 09:47:34 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please 
specify an existing datapath Dec 15 04:47:34 localhost openstack_network_exporter[246484]: Dec 15 04:47:34 localhost nova_compute[286344]: 2025-12-15 09:47:34.956 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 04:47:39 localhost nova_compute[286344]: 2025-12-15 09:47:39.004 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 04:47:39 localhost sshd[289975]: main: sshd: ssh-rsa algorithm is disabled Dec 15 04:47:39 localhost systemd-logind[763]: New session 61 of user zuul. Dec 15 04:47:39 localhost systemd[1]: Started Session 61 of User zuul. Dec 15 04:47:39 localhost nova_compute[286344]: 2025-12-15 09:47:39.959 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 04:47:40 localhost python3[289997]: ansible-ansible.legacy.command Invoked with _raw_params=subscription-manager unregister _uses_shell=True zuul_log_id=in-loop-ignore zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None Dec 15 04:47:40 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:fc:1f:13 MACDST=fa:16:3e:b3:bb:e3 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=8252 DF PROTO=TCP SPT=37004 DPT=9102 SEQ=3550340854 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A83AAD9250000000001030307) Dec 15 04:47:40 localhost systemd-journald[47230]: Field hash table of /run/log/journal/738a39f68bc78fb81032e509449fb759/system.journal has a fill level at 75.4 (251 of 333 items), suggesting rotation. 
Dec 15 04:47:40 localhost systemd-journald[47230]: /run/log/journal/738a39f68bc78fb81032e509449fb759/system.journal: Journal header limits reached or header out-of-date, rotating. Dec 15 04:47:40 localhost rsyslogd[759]: imjournal: journal files changed, reloading... [v8.2102.0-111.el9 try https://www.rsyslog.com/e/0 ] Dec 15 04:47:40 localhost rsyslogd[759]: imjournal: journal files changed, reloading... [v8.2102.0-111.el9 try https://www.rsyslog.com/e/0 ] Dec 15 04:47:40 localhost rsyslogd[759]: imjournal: journal files changed, reloading... [v8.2102.0-111.el9 try https://www.rsyslog.com/e/0 ] Dec 15 04:47:40 localhost subscription-manager[289998]: Unregistered machine with identity: d8870b96-e9e0-4007-a17e-134952a74633 Dec 15 04:47:41 localhost systemd[1]: Started /usr/bin/podman healthcheck run 67ee72fcd99c896f79e8807764b1d0f04e7d45927d06e306c0986394e8ed42a0. Dec 15 04:47:41 localhost systemd[1]: Started /usr/bin/podman healthcheck run 9f0f481c937606298203797a71924b7614a5d9e3f4a8afd837cfa5b9602aedeb. Dec 15 04:47:41 localhost systemd[1]: Started /usr/bin/podman healthcheck run b3d8483ade539500e228c2af9c0c3e647febb5ec7903610a83aea32631007d4a. Dec 15 04:47:41 localhost systemd[1]: tmp-crun.OcmJJd.mount: Deactivated successfully. 
Dec 15 04:47:41 localhost podman[290001]: 2025-12-15 09:47:41.810573443 +0000 UTC m=+0.142685930 container health_status 67ee72fcd99c896f79e8807764b1d0f04e7d45927d06e306c0986394e8ed42a0 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible) Dec 15 04:47:41 localhost podman[290003]: 2025-12-15 09:47:41.774999746 +0000 UTC m=+0.102452812 container health_status b3d8483ade539500e228c2af9c0c3e647febb5ec7903610a83aea32631007d4a (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, org.label-schema.license=GPLv2, tcib_managed=true, container_name=ceilometer_agent_compute, managed_by=edpm_ansible, 
org.label-schema.schema-version=1.0, config_id=ceilometer_agent_compute, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '07f3af15d79276418f54a8dde2fc251064527c3571430e14af77aaa2c01f1193-cb1e9478634a00c8fb5ad9cd7fcd38c7b2a1a85187dfaa618bc715c24c2b15fb'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}) Dec 15 04:47:41 localhost podman[290001]: 2025-12-15 09:47:41.845157192 +0000 UTC m=+0.177269629 container exec_died 67ee72fcd99c896f79e8807764b1d0f04e7d45927d06e306c0986394e8ed42a0 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible, 
config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter) Dec 15 04:47:41 localhost podman[290003]: 2025-12-15 09:47:41.854854985 +0000 UTC m=+0.182308071 container exec_died b3d8483ade539500e228c2af9c0c3e647febb5ec7903610a83aea32631007d4a (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, managed_by=edpm_ansible, tcib_managed=true, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, config_id=ceilometer_agent_compute, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 
'07f3af15d79276418f54a8dde2fc251064527c3571430e14af77aaa2c01f1193-cb1e9478634a00c8fb5ad9cd7fcd38c7b2a1a85187dfaa618bc715c24c2b15fb'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}) Dec 15 04:47:41 localhost systemd[1]: 67ee72fcd99c896f79e8807764b1d0f04e7d45927d06e306c0986394e8ed42a0.service: Deactivated successfully. Dec 15 04:47:41 localhost systemd[1]: b3d8483ade539500e228c2af9c0c3e647febb5ec7903610a83aea32631007d4a.service: Deactivated successfully. 
Dec 15 04:47:41 localhost podman[290002]: 2025-12-15 09:47:41.909186177 +0000 UTC m=+0.237119627 container health_status 9f0f481c937606298203797a71924b7614a5d9e3f4a8afd837cfa5b9602aedeb (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, managed_by=edpm_ansible, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=multipathd, container_name=multipathd, tcib_managed=true, io.buildah.version=1.41.3) Dec 15 04:47:41 localhost podman[290002]: 2025-12-15 09:47:41.925650339 +0000 UTC m=+0.253583789 container exec_died 
9f0f481c937606298203797a71924b7614a5d9e3f4a8afd837cfa5b9602aedeb (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.license=GPLv2, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible) Dec 15 04:47:41 localhost systemd[1]: 9f0f481c937606298203797a71924b7614a5d9e3f4a8afd837cfa5b9602aedeb.service: Deactivated successfully. 
Dec 15 04:47:44 localhost nova_compute[286344]: 2025-12-15 09:47:44.006 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 04:47:44 localhost nova_compute[286344]: 2025-12-15 09:47:44.961 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 04:47:46 localhost systemd[1]: Started /usr/bin/podman healthcheck run 730c50020a7d9e2da21d2462d40af20ac303e43f3491f692cbbd87f432ba3e09. Dec 15 04:47:46 localhost systemd[1]: Started /usr/bin/podman healthcheck run ac2eae30a2a7f971398af42ea67b3ed799d4b3341f7688ebcd73f7ca7f66c2a8. Dec 15 04:47:46 localhost systemd[1]: tmp-crun.mBKqep.mount: Deactivated successfully. Dec 15 04:47:46 localhost podman[290066]: 2025-12-15 09:47:46.75334891 +0000 UTC m=+0.081384103 container health_status 730c50020a7d9e2da21d2462d40af20ac303e43f3491f692cbbd87f432ba3e09 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, architecture=x86_64, container_name=openstack_network_exporter, io.openshift.expose-services=, vcs-type=git, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vendor=Red Hat, Inc., build-date=2025-08-20T13:12:41, managed_by=edpm_ansible, io.buildah.version=1.33.7, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'cb1e9478634a00c8fb5ad9cd7fcd38c7b2a1a85187dfaa618bc715c24c2b15fb'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., distribution-scope=public, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.tags=minimal rhel9, config_id=openstack_network_exporter, name=ubi9-minimal, url=https://catalog.redhat.com/en/search?searchType=containers, maintainer=Red Hat, Inc., release=1755695350, version=9.6, com.redhat.component=ubi9-minimal-container) Dec 15 04:47:46 localhost podman[290066]: 2025-12-15 09:47:46.771704455 +0000 UTC m=+0.099739678 container exec_died 730c50020a7d9e2da21d2462d40af20ac303e43f3491f692cbbd87f432ba3e09 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, container_name=openstack_network_exporter, maintainer=Red Hat, Inc., vendor=Red 
Hat, Inc., build-date=2025-08-20T13:12:41, config_id=openstack_network_exporter, com.redhat.component=ubi9-minimal-container, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-type=git, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.expose-services=, release=1755695350, version=9.6, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'cb1e9478634a00c8fb5ad9cd7fcd38c7b2a1a85187dfaa618bc715c24c2b15fb'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', 
'/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, managed_by=edpm_ansible, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.buildah.version=1.33.7, io.openshift.tags=minimal rhel9, distribution-scope=public, name=ubi9-minimal) Dec 15 04:47:46 localhost systemd[1]: 730c50020a7d9e2da21d2462d40af20ac303e43f3491f692cbbd87f432ba3e09.service: Deactivated successfully. Dec 15 04:47:46 localhost podman[290067]: 2025-12-15 09:47:46.856410869 +0000 UTC m=+0.180810799 container health_status ac2eae30a2a7f971398af42ea67b3ed799d4b3341f7688ebcd73f7ca7f66c2a8 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, tcib_managed=true, container_name=ovn_controller, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '07f3af15d79276418f54a8dde2fc251064527c3571430e14af77aaa2c01f1193'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, 
config_id=ovn_controller) Dec 15 04:47:46 localhost podman[290067]: 2025-12-15 09:47:46.925733622 +0000 UTC m=+0.250133572 container exec_died ac2eae30a2a7f971398af42ea67b3ed799d4b3341f7688ebcd73f7ca7f66c2a8 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, container_name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_id=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '07f3af15d79276418f54a8dde2fc251064527c3571430e14af77aaa2c01f1193'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}) Dec 15 04:47:46 localhost systemd[1]: ac2eae30a2a7f971398af42ea67b3ed799d4b3341f7688ebcd73f7ca7f66c2a8.service: Deactivated successfully. 
Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.119 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359', 'name': 'test', 'flavor': {'id': '2da0e147-aaa7-4bb9-a176-5fe1b15a32a0', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'image': {'id': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-00000002', 'OS-EXT-SRV-ATTR:host': 'np0005559462.localdomain', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': 'c785bf23f53946bc99867d8832a50266', 'user_id': '1ba5fce347b64bfebf995f187193f205', 'hostId': 'bf4d507aa00b3fcf643a7e0b883420632ac7fc96e8c7a756982e121d', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228 Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.120 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.latency in the context of pollsters Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.150 12 DEBUG ceilometer.compute.pollsters [-] 39ff1bd9-6f6b-44c8-bbec-a1fd9d196359/disk.device.read.latency volume: 1342134926 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.151 12 DEBUG ceilometer.compute.pollsters [-] 39ff1bd9-6f6b-44c8-bbec-a1fd9d196359/disk.device.read.latency volume: 123356132 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.153 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': '868d063f-a431-4ae9-af03-e081c0f99196', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 1342134926, 'user_id': '1ba5fce347b64bfebf995f187193f205', 'user_name': None, 'project_id': 'c785bf23f53946bc99867d8832a50266', 'project_name': None, 'resource_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359-vda', 'timestamp': '2025-12-15T09:47:48.120740', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359', 'instance_type': 'm1.small', 'host': 'bf4d507aa00b3fcf643a7e0b883420632ac7fc96e8c7a756982e121d', 'instance_host': 'np0005559462.localdomain', 'flavor': {'id': '2da0e147-aaa7-4bb9-a176-5fe1b15a32a0', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e'}, 'image_ref': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '1d32c952-d99b-11f0-817e-fa163ebaca0f', 'monotonic_time': 11134.313428532, 'message_signature': '20e36aa347c94bc466ceeeb71a432891de1ace10c706e1b674e41f92edb20df1'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 123356132, 'user_id': '1ba5fce347b64bfebf995f187193f205', 'user_name': None, 'project_id': 'c785bf23f53946bc99867d8832a50266', 'project_name': None, 'resource_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359-vdb', 'timestamp': '2025-12-15T09:47:48.120740', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 
'39ff1bd9-6f6b-44c8-bbec-a1fd9d196359', 'instance_type': 'm1.small', 'host': 'bf4d507aa00b3fcf643a7e0b883420632ac7fc96e8c7a756982e121d', 'instance_host': 'np0005559462.localdomain', 'flavor': {'id': '2da0e147-aaa7-4bb9-a176-5fe1b15a32a0', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e'}, 'image_ref': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '1d32dff0-d99b-11f0-817e-fa163ebaca0f', 'monotonic_time': 11134.313428532, 'message_signature': '8782531f87be6f51315d0c096642932602bf67f3fb980909d98b9e51ca9a9e59'}]}, 'timestamp': '2025-12-15 09:47:48.152065', '_unique_id': '5b5eb40916f14b37a738704ffffdea0d'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.153 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.153 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.153 12 ERROR oslo_messaging.notify.messaging yield Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.153 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.153 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.153 12 ERROR 
oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.153 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.153 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.153 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.153 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.153 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.153 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.153 12 ERROR oslo_messaging.notify.messaging conn.connect() Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.153 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.153 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.153 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Dec 15 
04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.153 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.153 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.153 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.153 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.153 12 ERROR oslo_messaging.notify.messaging Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.153 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.153 12 ERROR oslo_messaging.notify.messaging Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.153 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.153 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.153 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.153 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Dec 15 04:47:48 localhost 
ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.153 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.153 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.153 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.153 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.153 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.153 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.153 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.153 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.153 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.153 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, 
in get Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.153 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.153 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.153 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.153 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.153 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.153 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.153 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.153 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.153 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.153 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 15 04:47:48 localhost 
ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.153 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.153 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.153 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.153 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.153 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.153 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.153 12 ERROR oslo_messaging.notify.messaging Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.155 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no new resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193 Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.155 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.drop in the context of pollsters Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.159 12 DEBUG ceilometer.compute.pollsters [-] 39ff1bd9-6f6b-44c8-bbec-a1fd9d196359/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 15 04:47:48 localhost 
ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.161 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'b1403e94-bb20-497a-a3c9-2f082ae680df', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '1ba5fce347b64bfebf995f187193f205', 'user_name': None, 'project_id': 'c785bf23f53946bc99867d8832a50266', 'project_name': None, 'resource_id': 'instance-00000002-39ff1bd9-6f6b-44c8-bbec-a1fd9d196359-tap03ef8889-32', 'timestamp': '2025-12-15T09:47:48.155642', 'resource_metadata': {'display_name': 'test', 'name': 'tap03ef8889-32', 'instance_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359', 'instance_type': 'm1.small', 'host': 'bf4d507aa00b3fcf643a7e0b883420632ac7fc96e8c7a756982e121d', 'instance_host': 'np0005559462.localdomain', 'flavor': {'id': '2da0e147-aaa7-4bb9-a176-5fe1b15a32a0', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e'}, 'image_ref': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:74:df:7c', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap03ef8889-32'}, 'message_id': '1d3429d2-d99b-11f0-817e-fa163ebaca0f', 'monotonic_time': 11134.348360061, 'message_signature': '68cfc7156ba58a5ae67930c6b706cadd19b0f26de6723297f8e3e470674a286b'}]}, 'timestamp': '2025-12-15 09:47:48.160412', '_unique_id': '5cbbd9bd837644a3bc0fa994da644e14'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 
2025-12-15 09:47:48.161 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.161 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.161 12 ERROR oslo_messaging.notify.messaging yield Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.161 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.161 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.161 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.161 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.161 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.161 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.161 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.161 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Dec 15 04:47:48 
localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.161 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.161 12 ERROR oslo_messaging.notify.messaging conn.connect() Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.161 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.161 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.161 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.161 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.161 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.161 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.161 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.161 12 ERROR oslo_messaging.notify.messaging Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.161 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Dec 15 04:47:48 localhost 
ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.161 12 ERROR oslo_messaging.notify.messaging Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.161 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.161 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.161 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.161 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.161 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.161 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.161 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.161 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.161 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.161 12 ERROR 
oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.161 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.161 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.161 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.161 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.161 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.161 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.161 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.161 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.161 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.161 12 ERROR 
oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.161 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.161 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.161 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.161 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.161 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.161 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.161 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.161 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.161 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.161 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 15 
04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.161 12 ERROR oslo_messaging.notify.messaging Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.162 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.drop in the context of pollsters Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.162 12 DEBUG ceilometer.compute.pollsters [-] 39ff1bd9-6f6b-44c8-bbec-a1fd9d196359/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.164 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '1a3f6d47-8699-454c-b663-c5a8e2aebbd2', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '1ba5fce347b64bfebf995f187193f205', 'user_name': None, 'project_id': 'c785bf23f53946bc99867d8832a50266', 'project_name': None, 'resource_id': 'instance-00000002-39ff1bd9-6f6b-44c8-bbec-a1fd9d196359-tap03ef8889-32', 'timestamp': '2025-12-15T09:47:48.162762', 'resource_metadata': {'display_name': 'test', 'name': 'tap03ef8889-32', 'instance_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359', 'instance_type': 'm1.small', 'host': 'bf4d507aa00b3fcf643a7e0b883420632ac7fc96e8c7a756982e121d', 'instance_host': 'np0005559462.localdomain', 'flavor': {'id': '2da0e147-aaa7-4bb9-a176-5fe1b15a32a0', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e'}, 'image_ref': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e', 'image_ref_url': None, 
'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:74:df:7c', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap03ef8889-32'}, 'message_id': '1d349b42-d99b-11f0-817e-fa163ebaca0f', 'monotonic_time': 11134.348360061, 'message_signature': '43502d53049babf7ac96dff1f592f4fdedf2666a65c94ec2cc3f9f89b2c51d9a'}]}, 'timestamp': '2025-12-15 09:47:48.163290', '_unique_id': '29ea58365a1647edbd984907b9b71dbe'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.164 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.164 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.164 12 ERROR oslo_messaging.notify.messaging yield Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.164 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.164 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.164 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.164 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.164 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.164 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.164 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.164 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.164 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.164 12 ERROR oslo_messaging.notify.messaging conn.connect() Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.164 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.164 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.164 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.164 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.164 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Dec 15 04:47:48 localhost 
ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.164 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.164 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.164 12 ERROR oslo_messaging.notify.messaging Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.164 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.164 12 ERROR oslo_messaging.notify.messaging Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.164 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.164 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.164 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.164 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.164 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.164 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Dec 15 04:47:48 localhost 
ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.164 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.164 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.164 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.164 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.164 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.164 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.164 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.164 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.164 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.164 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Dec 15 04:47:48 localhost 
ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.164 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.164 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.164 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.164 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.164 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.164 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.164 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.164 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.164 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.164 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.164 12 ERROR 
oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.164 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.164 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.164 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.164 12 ERROR oslo_messaging.notify.messaging Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.165 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.requests in the context of pollsters Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.165 12 DEBUG ceilometer.compute.pollsters [-] 39ff1bd9-6f6b-44c8-bbec-a1fd9d196359/disk.device.write.requests volume: 47 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.166 12 DEBUG ceilometer.compute.pollsters [-] 39ff1bd9-6f6b-44c8-bbec-a1fd9d196359/disk.device.write.requests volume: 1 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.168 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': '13d2dad0-253f-4237-b89e-5d45bb7fd7ff', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 47, 'user_id': '1ba5fce347b64bfebf995f187193f205', 'user_name': None, 'project_id': 'c785bf23f53946bc99867d8832a50266', 'project_name': None, 'resource_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359-vda', 'timestamp': '2025-12-15T09:47:48.165747', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359', 'instance_type': 'm1.small', 'host': 'bf4d507aa00b3fcf643a7e0b883420632ac7fc96e8c7a756982e121d', 'instance_host': 'np0005559462.localdomain', 'flavor': {'id': '2da0e147-aaa7-4bb9-a176-5fe1b15a32a0', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e'}, 'image_ref': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '1d350d2a-d99b-11f0-817e-fa163ebaca0f', 'monotonic_time': 11134.313428532, 'message_signature': '0c72c1cb6200177d9c745f395e8d9f1fbea4e9c6316af205842243cd15ee0a36'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1, 'user_id': '1ba5fce347b64bfebf995f187193f205', 'user_name': None, 'project_id': 'c785bf23f53946bc99867d8832a50266', 'project_name': None, 'resource_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359-vdb', 'timestamp': '2025-12-15T09:47:48.165747', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 
'39ff1bd9-6f6b-44c8-bbec-a1fd9d196359', 'instance_type': 'm1.small', 'host': 'bf4d507aa00b3fcf643a7e0b883420632ac7fc96e8c7a756982e121d', 'instance_host': 'np0005559462.localdomain', 'flavor': {'id': '2da0e147-aaa7-4bb9-a176-5fe1b15a32a0', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e'}, 'image_ref': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '1d353462-d99b-11f0-817e-fa163ebaca0f', 'monotonic_time': 11134.313428532, 'message_signature': '46b440e2d03445bcc4755c2ac55baa1acf14285676bea53c9e710a4701e1044f'}]}, 'timestamp': '2025-12-15 09:47:48.167325', '_unique_id': '2d79049bf9024a2fae055c985268a12d'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.168 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.168 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.168 12 ERROR oslo_messaging.notify.messaging yield Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.168 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.168 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.168 12 ERROR 
oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.168 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.168 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.168 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.168 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.168 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.168 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.168 12 ERROR oslo_messaging.notify.messaging conn.connect() Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.168 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.168 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.168 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Dec 15 
04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.168 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.168 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.168 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.168 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.168 12 ERROR oslo_messaging.notify.messaging Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.168 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.168 12 ERROR oslo_messaging.notify.messaging Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.168 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.168 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.168 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.168 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Dec 15 04:47:48 localhost 
ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.168 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.168 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.168 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.168 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.168 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.168 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.168 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.168 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.168 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.168 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, 
in get Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.168 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.168 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.168 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.168 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.168 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.168 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.168 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.168 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.168 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.168 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 15 04:47:48 localhost 
ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.168 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.168 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.168 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.168 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.168 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.168 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.168 12 ERROR oslo_messaging.notify.messaging Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.170 12 INFO ceilometer.polling.manager [-] Polling pollster memory.usage in the context of pollsters Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.189 12 DEBUG ceilometer.compute.pollsters [-] 39ff1bd9-6f6b-44c8-bbec-a1fd9d196359/memory.usage volume: 51.73828125 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.190 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': 'aeee6ae2-e1b1-4348-af8c-883263e70bbf', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'memory.usage', 'counter_type': 'gauge', 'counter_unit': 'MB', 'counter_volume': 51.73828125, 'user_id': '1ba5fce347b64bfebf995f187193f205', 'user_name': None, 'project_id': 'c785bf23f53946bc99867d8832a50266', 'project_name': None, 'resource_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359', 'timestamp': '2025-12-15T09:47:48.170280', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359', 'instance_type': 'm1.small', 'host': 'bf4d507aa00b3fcf643a7e0b883420632ac7fc96e8c7a756982e121d', 'instance_host': 'np0005559462.localdomain', 'flavor': {'id': '2da0e147-aaa7-4bb9-a176-5fe1b15a32a0', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e'}, 'image_ref': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0}, 'message_id': '1d38a854-d99b-11f0-817e-fa163ebaca0f', 'monotonic_time': 11134.381893011, 'message_signature': '85ae9131691b2138f68b7be02296ab5ea5d1c152c2904ce4cf73957acd77656a'}]}, 'timestamp': '2025-12-15 09:47:48.189840', '_unique_id': '931a9577b34148a4b89094cc71b66197'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.190 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.190 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in 
_reraise_as_library_errors Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.190 12 ERROR oslo_messaging.notify.messaging yield Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.190 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.190 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.190 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.190 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.190 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.190 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.190 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.190 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.190 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.190 12 
ERROR oslo_messaging.notify.messaging conn.connect() Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.190 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.190 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.190 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.190 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.190 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.190 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.190 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.190 12 ERROR oslo_messaging.notify.messaging Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.190 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.190 12 ERROR oslo_messaging.notify.messaging Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.190 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 
2025-12-15 09:47:48.190 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.190 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.190 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.190 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.190 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.190 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.190 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.190 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.190 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.190 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Dec 15 
04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.190 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.190 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.190 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.190 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.190 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.190 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.190 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.190 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.190 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.190 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Dec 15 04:47:48 localhost 
ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.190 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.190 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.190 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.190 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.190 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.190 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.190 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.190 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.190 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.190 12 ERROR oslo_messaging.notify.messaging Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.192 12 INFO ceilometer.polling.manager [-] Polling pollster cpu in the context of pollsters Dec 15 04:47:48 localhost 
ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.192 12 DEBUG ceilometer.compute.pollsters [-] 39ff1bd9-6f6b-44c8-bbec-a1fd9d196359/cpu volume: 9830000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.193 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '9278ff96-f69a-4b84-b657-708a23489573', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 9830000000, 'user_id': '1ba5fce347b64bfebf995f187193f205', 'user_name': None, 'project_id': 'c785bf23f53946bc99867d8832a50266', 'project_name': None, 'resource_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359', 'timestamp': '2025-12-15T09:47:48.192220', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359', 'instance_type': 'm1.small', 'host': 'bf4d507aa00b3fcf643a7e0b883420632ac7fc96e8c7a756982e121d', 'instance_host': 'np0005559462.localdomain', 'flavor': {'id': '2da0e147-aaa7-4bb9-a176-5fe1b15a32a0', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e'}, 'image_ref': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'cpu_number': 1}, 'message_id': '1d391758-d99b-11f0-817e-fa163ebaca0f', 'monotonic_time': 11134.381893011, 'message_signature': '0bd43a01d61cc0774ed8abeafc2091b18698b47337d110c254594033a0f33c9f'}]}, 'timestamp': '2025-12-15 09:47:48.192660', '_unique_id': '38299e3270744a56a73d867897e66353'}: 
kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.193 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.193 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.193 12 ERROR oslo_messaging.notify.messaging yield Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.193 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.193 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.193 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.193 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.193 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.193 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.193 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 
2025-12-15 09:47:48.193 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.193 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.193 12 ERROR oslo_messaging.notify.messaging conn.connect() Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.193 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.193 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.193 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.193 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.193 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.193 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.193 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.193 12 ERROR oslo_messaging.notify.messaging Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.193 12 ERROR 
oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.193 12 ERROR oslo_messaging.notify.messaging Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.193 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.193 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.193 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.193 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.193 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.193 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.193 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.193 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.193 12 ERROR oslo_messaging.notify.messaging with 
self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.193 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.193 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.193 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.193 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.193 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.193 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.193 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.193 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.193 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.193 12 ERROR 
oslo_messaging.notify.messaging self.ensure_connection() Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.193 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.193 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.193 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.193 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.193 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.193 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.193 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.193 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.193 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.193 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Dec 15 04:47:48 localhost 
ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.193 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.193 12 ERROR oslo_messaging.notify.messaging
Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.194 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.capacity in the context of pollsters
Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.205 12 DEBUG ceilometer.compute.pollsters [-] 39ff1bd9-6f6b-44c8-bbec-a1fd9d196359/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.206 12 DEBUG ceilometer.compute.pollsters [-] 39ff1bd9-6f6b-44c8-bbec-a1fd9d196359/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.207 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '6e451740-fa33-4cda-b61b-d49b4cb08f90', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '1ba5fce347b64bfebf995f187193f205', 'user_name': None, 'project_id': 'c785bf23f53946bc99867d8832a50266', 'project_name': None, 'resource_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359-vda', 'timestamp': '2025-12-15T09:47:48.194758', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359', 'instance_type': 'm1.small', 'host': 'bf4d507aa00b3fcf643a7e0b883420632ac7fc96e8c7a756982e121d', 'instance_host': 'np0005559462.localdomain', 'flavor': {'id': '2da0e147-aaa7-4bb9-a176-5fe1b15a32a0', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e'}, 'image_ref': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '1d3b232c-d99b-11f0-817e-fa163ebaca0f', 'monotonic_time': 11134.387427406, 'message_signature': '235da3fd6c283c9f9360c6fdbbe84361abe6871cc68575998359a7ed74c24e1f'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '1ba5fce347b64bfebf995f187193f205', 'user_name': None, 'project_id': 'c785bf23f53946bc99867d8832a50266', 'project_name': None, 'resource_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359-vdb', 'timestamp': '2025-12-15T09:47:48.194758', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359', 'instance_type': 'm1.small', 'host': 'bf4d507aa00b3fcf643a7e0b883420632ac7fc96e8c7a756982e121d', 'instance_host': 'np0005559462.localdomain', 'flavor': {'id': '2da0e147-aaa7-4bb9-a176-5fe1b15a32a0', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e'}, 'image_ref': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '1d3b3538-d99b-11f0-817e-fa163ebaca0f', 'monotonic_time': 11134.387427406, 'message_signature': 'd42752c08e37a601dd19a48ed05a38197598aac438706826a5399e02d0570c72'}]}, 'timestamp': '2025-12-15 09:47:48.206519', '_unique_id': 'eda6151440b34af8afc270a6949f2ce0'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.207 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.207 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.207 12 ERROR oslo_messaging.notify.messaging yield
Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.207 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.207 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.207 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.207 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.207 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.207 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.207 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.207 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.207 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.207 12 ERROR oslo_messaging.notify.messaging conn.connect()
Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.207 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.207 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.207 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.207 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.207 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.207 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.207 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.207 12 ERROR oslo_messaging.notify.messaging
Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.207 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.207 12 ERROR oslo_messaging.notify.messaging
Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.207 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.207 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.207 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.207 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.207 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.207 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.207 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.207 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.207 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.207 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.207 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.207 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.207 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.207 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.207 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.207 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.207 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.207 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.207 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.207 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.207 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.207 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.207 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.207 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.207 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.207 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.207 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.207 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.207 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.207 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.207 12 ERROR oslo_messaging.notify.messaging
Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.208 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no new resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.208 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.usage in the context of pollsters
Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.208 12 DEBUG ceilometer.compute.pollsters [-] 39ff1bd9-6f6b-44c8-bbec-a1fd9d196359/disk.device.usage volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.209 12 DEBUG ceilometer.compute.pollsters [-] 39ff1bd9-6f6b-44c8-bbec-a1fd9d196359/disk.device.usage volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.210 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '1674147c-4727-49b3-9f42-9def3c9ef8fa', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '1ba5fce347b64bfebf995f187193f205', 'user_name': None, 'project_id': 'c785bf23f53946bc99867d8832a50266', 'project_name': None, 'resource_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359-vda', 'timestamp': '2025-12-15T09:47:48.208908', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359', 'instance_type': 'm1.small', 'host': 'bf4d507aa00b3fcf643a7e0b883420632ac7fc96e8c7a756982e121d', 'instance_host': 'np0005559462.localdomain', 'flavor': {'id': '2da0e147-aaa7-4bb9-a176-5fe1b15a32a0', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e'}, 'image_ref': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '1d3ba45a-d99b-11f0-817e-fa163ebaca0f', 'monotonic_time': 11134.387427406, 'message_signature': 'd3f070c90b14231d576b0b2e82fd05171cfc505b77997bb3283a21af9c411e63'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '1ba5fce347b64bfebf995f187193f205', 'user_name': None, 'project_id': 'c785bf23f53946bc99867d8832a50266', 'project_name': None, 'resource_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359-vdb', 'timestamp': '2025-12-15T09:47:48.208908', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359', 'instance_type': 'm1.small', 'host': 'bf4d507aa00b3fcf643a7e0b883420632ac7fc96e8c7a756982e121d', 'instance_host': 'np0005559462.localdomain', 'flavor': {'id': '2da0e147-aaa7-4bb9-a176-5fe1b15a32a0', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e'}, 'image_ref': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '1d3bb45e-d99b-11f0-817e-fa163ebaca0f', 'monotonic_time': 11134.387427406, 'message_signature': 'bdd0c7b44ee84ff4d5d23681c91b22252606a553a51078643cf71bd4c5351d99'}]}, 'timestamp': '2025-12-15 09:47:48.209769', '_unique_id': '32e5fd910cd8457d8ea3b33462880d6e'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.210 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.210 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.210 12 ERROR oslo_messaging.notify.messaging yield
Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.210 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.210 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.210 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.210 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.210 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.210 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.210 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.210 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.210 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.210 12 ERROR oslo_messaging.notify.messaging conn.connect()
Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.210 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.210 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.210 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.210 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.210 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.210 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.210 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.210 12 ERROR oslo_messaging.notify.messaging
Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.210 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.210 12 ERROR oslo_messaging.notify.messaging
Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.210 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.210 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.210 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.210 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.210 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.210 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.210 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.210 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.210 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.210 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.210 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.210 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.210 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.210 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.210 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.210 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.210 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.210 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.210 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.210 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.210 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.210 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.210 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.210 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.210 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.210 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.210 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.210 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.210 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.210 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.210 12 ERROR oslo_messaging.notify.messaging
Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.211 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no new resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.212 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets in the context of pollsters
Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.212 12 DEBUG ceilometer.compute.pollsters [-] 39ff1bd9-6f6b-44c8-bbec-a1fd9d196359/network.outgoing.packets volume: 114 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.213 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'db3a108b-6a2b-4fdf-9136-9fb36b867241', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 114, 'user_id': '1ba5fce347b64bfebf995f187193f205', 'user_name': None, 'project_id': 'c785bf23f53946bc99867d8832a50266', 'project_name': None, 'resource_id': 'instance-00000002-39ff1bd9-6f6b-44c8-bbec-a1fd9d196359-tap03ef8889-32', 'timestamp': '2025-12-15T09:47:48.212191', 'resource_metadata': {'display_name': 'test', 'name': 'tap03ef8889-32', 'instance_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359', 'instance_type': 'm1.small', 'host': 'bf4d507aa00b3fcf643a7e0b883420632ac7fc96e8c7a756982e121d', 'instance_host': 'np0005559462.localdomain', 'flavor': {'id': '2da0e147-aaa7-4bb9-a176-5fe1b15a32a0', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e'}, 'image_ref': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:74:df:7c', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap03ef8889-32'}, 'message_id': '1d3c2402-d99b-11f0-817e-fa163ebaca0f', 'monotonic_time': 11134.348360061, 'message_signature': 'e0f32c994add8c647dbc0cd22ce3189938bf4fe533c2eece0850663cc74e43af'}]}, 'timestamp': '2025-12-15 09:47:48.212659', '_unique_id': 'd070afb875044de5a7469a5f945edcc2'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.213 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.213 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.213 12 ERROR oslo_messaging.notify.messaging yield
Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.213 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.213 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.213 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.213 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.213 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.213 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.213 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.213 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.213 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.213 12 ERROR oslo_messaging.notify.messaging conn.connect()
Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.213 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.213 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.213 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.213 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.213 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.213 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.213 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.213 12 ERROR oslo_messaging.notify.messaging
Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.213 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.213 12 ERROR oslo_messaging.notify.messaging
Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.213 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.213 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.213 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.213 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.213 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.213 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.213 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.213 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.213 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.213 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.213 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.213 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.213 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.213 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.213 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.213 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.213 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.213 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.213 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.213 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.213 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.213 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.213 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.213 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.213 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.213 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.213 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.213 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 15 04:47:48 localhost
ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.213 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.213 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.213 12 ERROR oslo_messaging.notify.messaging Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.214 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.delta in the context of pollsters Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.214 12 DEBUG ceilometer.compute.pollsters [-] 39ff1bd9-6f6b-44c8-bbec-a1fd9d196359/network.incoming.bytes.delta volume: 6809 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.216 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': '2f68c168-38df-4c35-a196-8bcb6b66fe83', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 6809, 'user_id': '1ba5fce347b64bfebf995f187193f205', 'user_name': None, 'project_id': 'c785bf23f53946bc99867d8832a50266', 'project_name': None, 'resource_id': 'instance-00000002-39ff1bd9-6f6b-44c8-bbec-a1fd9d196359-tap03ef8889-32', 'timestamp': '2025-12-15T09:47:48.214724', 'resource_metadata': {'display_name': 'test', 'name': 'tap03ef8889-32', 'instance_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359', 'instance_type': 'm1.small', 'host': 'bf4d507aa00b3fcf643a7e0b883420632ac7fc96e8c7a756982e121d', 'instance_host': 'np0005559462.localdomain', 'flavor': {'id': '2da0e147-aaa7-4bb9-a176-5fe1b15a32a0', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e'}, 'image_ref': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:74:df:7c', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap03ef8889-32'}, 'message_id': '1d3c864a-d99b-11f0-817e-fa163ebaca0f', 'monotonic_time': 11134.348360061, 'message_signature': '8be93ed236dfab74d84536a89d0ca6a52010b807629dbc7c3ac0b3c97dc7bc44'}]}, 'timestamp': '2025-12-15 09:47:48.215224', '_unique_id': '386772ffc5204ce1bec485ab98f1fc97'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.216 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 15 04:47:48 localhost 
ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.216 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.216 12 ERROR oslo_messaging.notify.messaging yield Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.216 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.216 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.216 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.216 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.216 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.216 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.216 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.216 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.216 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.216 12 ERROR oslo_messaging.notify.messaging conn.connect() Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.216 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.216 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.216 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.216 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.216 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.216 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.216 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.216 12 ERROR oslo_messaging.notify.messaging Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.216 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.216 12 ERROR oslo_messaging.notify.messaging Dec 15 04:47:48 localhost 
ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.216 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.216 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.216 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.216 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.216 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.216 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.216 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.216 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.216 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.216 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection 
Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.216 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.216 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.216 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.216 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.216 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.216 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.216 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.216 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.216 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.216 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Dec 15 04:47:48 
localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.216 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.216 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.216 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.216 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.216 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.216 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.216 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.216 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.216 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.216 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.216 12 ERROR oslo_messaging.notify.messaging Dec 15 04:47:48 localhost 
ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.217 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets in the context of pollsters Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.217 12 DEBUG ceilometer.compute.pollsters [-] 39ff1bd9-6f6b-44c8-bbec-a1fd9d196359/network.incoming.packets volume: 60 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.219 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '3b566bab-9930-4587-8c01-fdb3b2bd4e4b', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 60, 'user_id': '1ba5fce347b64bfebf995f187193f205', 'user_name': None, 'project_id': 'c785bf23f53946bc99867d8832a50266', 'project_name': None, 'resource_id': 'instance-00000002-39ff1bd9-6f6b-44c8-bbec-a1fd9d196359-tap03ef8889-32', 'timestamp': '2025-12-15T09:47:48.217445', 'resource_metadata': {'display_name': 'test', 'name': 'tap03ef8889-32', 'instance_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359', 'instance_type': 'm1.small', 'host': 'bf4d507aa00b3fcf643a7e0b883420632ac7fc96e8c7a756982e121d', 'instance_host': 'np0005559462.localdomain', 'flavor': {'id': '2da0e147-aaa7-4bb9-a176-5fe1b15a32a0', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e'}, 'image_ref': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:74:df:7c', 'fref': None, 
'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap03ef8889-32'}, 'message_id': '1d3cf2c4-d99b-11f0-817e-fa163ebaca0f', 'monotonic_time': 11134.348360061, 'message_signature': '2bcc7fa2cedea31a88abed9f046c05be4f02253b59543ac95fd66b7edc98f855'}]}, 'timestamp': '2025-12-15 09:47:48.218041', '_unique_id': 'f92b681a16d84116bf8a243957c07b59'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.219 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.219 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.219 12 ERROR oslo_messaging.notify.messaging yield Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.219 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.219 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.219 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.219 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.219 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.219 12 ERROR 
oslo_messaging.notify.messaging self._connection = self._establish_connection() Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.219 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.219 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.219 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.219 12 ERROR oslo_messaging.notify.messaging conn.connect() Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.219 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.219 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.219 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.219 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.219 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.219 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 
2025-12-15 09:47:48.219 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.219 12 ERROR oslo_messaging.notify.messaging Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.219 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.219 12 ERROR oslo_messaging.notify.messaging Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.219 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.219 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.219 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.219 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.219 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.219 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.219 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Dec 15 04:47:48 localhost 
ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.219 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.219 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.219 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.219 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.219 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.219 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.219 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.219 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.219 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.219 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Dec 15 
04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.219 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.219 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.219 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.219 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.219 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.219 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.219 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.219 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.219 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.219 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.219 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.219 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.219 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.219 12 ERROR oslo_messaging.notify.messaging Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.220 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.allocation in the context of pollsters Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.220 12 DEBUG ceilometer.compute.pollsters [-] 39ff1bd9-6f6b-44c8-bbec-a1fd9d196359/disk.device.allocation volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.220 12 DEBUG ceilometer.compute.pollsters [-] 39ff1bd9-6f6b-44c8-bbec-a1fd9d196359/disk.device.allocation volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.222 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': 'a3210ea9-4557-4dd0-a8d8-aad395f4a0f1', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '1ba5fce347b64bfebf995f187193f205', 'user_name': None, 'project_id': 'c785bf23f53946bc99867d8832a50266', 'project_name': None, 'resource_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359-vda', 'timestamp': '2025-12-15T09:47:48.220395', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359', 'instance_type': 'm1.small', 'host': 'bf4d507aa00b3fcf643a7e0b883420632ac7fc96e8c7a756982e121d', 'instance_host': 'np0005559462.localdomain', 'flavor': {'id': '2da0e147-aaa7-4bb9-a176-5fe1b15a32a0', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e'}, 'image_ref': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '1d3d639e-d99b-11f0-817e-fa163ebaca0f', 'monotonic_time': 11134.387427406, 'message_signature': '9e1f15503eb31f3aa1774f2eb1cda1f4d9b3094a973087118bfca5a4d0ef9555'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '1ba5fce347b64bfebf995f187193f205', 'user_name': None, 'project_id': 'c785bf23f53946bc99867d8832a50266', 'project_name': None, 'resource_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359-vdb', 'timestamp': '2025-12-15T09:47:48.220395', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359', 'instance_type': 'm1.small', 'host': 'bf4d507aa00b3fcf643a7e0b883420632ac7fc96e8c7a756982e121d', 'instance_host': 'np0005559462.localdomain', 'flavor': {'id': '2da0e147-aaa7-4bb9-a176-5fe1b15a32a0', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e'}, 'image_ref': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '1d3d74b0-d99b-11f0-817e-fa163ebaca0f', 'monotonic_time': 11134.387427406, 'message_signature': '6d5df57731a55b1bde149bc03a8f72f1136019424481c0a81b36dd4de9698874'}]}, 'timestamp': '2025-12-15 09:47:48.221252', '_unique_id': 'a46839d1780f4979bdf0048ba4a994dc'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.222 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.222 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.222 12 ERROR oslo_messaging.notify.messaging yield
Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.222 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.222 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.222 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.222 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.222 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.222 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.222 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.222 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.222 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.222 12 ERROR oslo_messaging.notify.messaging conn.connect()
Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.222 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.222 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.222 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.222 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.222 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.222 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.222 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.222 12 ERROR oslo_messaging.notify.messaging
Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.222 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.222 12 ERROR oslo_messaging.notify.messaging
Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.222 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.222 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.222 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.222 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.222 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.222 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.222 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.222 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.222 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.222 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.222 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.222 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.222 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.222 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.222 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.222 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.222 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.222 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.222 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.222 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.222 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.222 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.222 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.222 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.222 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.222 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.222 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.222 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.222 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.222 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.222 12 ERROR oslo_messaging.notify.messaging
Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.223 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.requests in the context of pollsters
Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.223 12 DEBUG ceilometer.compute.pollsters [-] 39ff1bd9-6f6b-44c8-bbec-a1fd9d196359/disk.device.read.requests volume: 1272 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.223 12 DEBUG ceilometer.compute.pollsters [-] 39ff1bd9-6f6b-44c8-bbec-a1fd9d196359/disk.device.read.requests volume: 124 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.225 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '89dad406-334a-4afb-9889-4535bd3400dc', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1272, 'user_id': '1ba5fce347b64bfebf995f187193f205', 'user_name': None, 'project_id': 'c785bf23f53946bc99867d8832a50266', 'project_name': None, 'resource_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359-vda', 'timestamp': '2025-12-15T09:47:48.223390', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359', 'instance_type': 'm1.small', 'host': 'bf4d507aa00b3fcf643a7e0b883420632ac7fc96e8c7a756982e121d', 'instance_host': 'np0005559462.localdomain', 'flavor': {'id': '2da0e147-aaa7-4bb9-a176-5fe1b15a32a0', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e'}, 'image_ref': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '1d3dd8a6-d99b-11f0-817e-fa163ebaca0f', 'monotonic_time': 11134.313428532, 'message_signature': 'd9a73a5e9596162cde755a39138445b0e3492ad8e11ad42de407d09824615949'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 124, 'user_id': '1ba5fce347b64bfebf995f187193f205', 'user_name': None, 'project_id': 'c785bf23f53946bc99867d8832a50266', 'project_name': None, 'resource_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359-vdb', 'timestamp': '2025-12-15T09:47:48.223390', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359', 'instance_type': 'm1.small', 'host': 'bf4d507aa00b3fcf643a7e0b883420632ac7fc96e8c7a756982e121d', 'instance_host': 'np0005559462.localdomain', 'flavor': {'id': '2da0e147-aaa7-4bb9-a176-5fe1b15a32a0', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e'}, 'image_ref': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '1d3de9ea-d99b-11f0-817e-fa163ebaca0f', 'monotonic_time': 11134.313428532, 'message_signature': '986d676ca87c35f934d75b32ed69195a1e113593b72063d9735cf3773b4a5525'}]}, 'timestamp': '2025-12-15 09:47:48.224251', '_unique_id': '5e8d6526c0454377b885fcef784d497a'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.225 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.225 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.225 12 ERROR oslo_messaging.notify.messaging yield
Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.225 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.225 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.225 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.225 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.225 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.225 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.225 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.225 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.225 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.225 12 ERROR oslo_messaging.notify.messaging conn.connect()
Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.225 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.225 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.225 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.225 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.225 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.225 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.225 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.225 12 ERROR oslo_messaging.notify.messaging
Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.225 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.225 12 ERROR oslo_messaging.notify.messaging
Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.225 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.225 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.225 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.225 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.225 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.225 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.225 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.225 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.225 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.225 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.225 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.225 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.225 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.225 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.225 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.225 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.225 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.225 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.225 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.225 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.225 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.225 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.225 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.225 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.225 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.225 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.225 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.225 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.225 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.225 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.225 12 ERROR oslo_messaging.notify.messaging
Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.226 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.error in the context of pollsters
Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.226 12 DEBUG ceilometer.compute.pollsters [-] 39ff1bd9-6f6b-44c8-bbec-a1fd9d196359/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.227 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'f8d5bf22-8de3-4dfa-ab0e-02d77a68f2d7', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '1ba5fce347b64bfebf995f187193f205', 'user_name': None, 'project_id': 'c785bf23f53946bc99867d8832a50266', 'project_name': None, 'resource_id': 'instance-00000002-39ff1bd9-6f6b-44c8-bbec-a1fd9d196359-tap03ef8889-32', 'timestamp': '2025-12-15T09:47:48.226352', 'resource_metadata': {'display_name': 'test', 'name': 'tap03ef8889-32', 'instance_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359', 'instance_type': 'm1.small', 'host': 'bf4d507aa00b3fcf643a7e0b883420632ac7fc96e8c7a756982e121d', 'instance_host': 'np0005559462.localdomain', 'flavor': {'id': '2da0e147-aaa7-4bb9-a176-5fe1b15a32a0', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e'}, 'image_ref': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:74:df:7c', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap03ef8889-32'}, 'message_id': '1d3e4c96-d99b-11f0-817e-fa163ebaca0f', 'monotonic_time': 11134.348360061, 'message_signature': '6d100d5a40dd32e48d12e889d7e336d0b0510563837b191d993d2a5c2018747e'}]}, 'timestamp': '2025-12-15 09:47:48.226807', '_unique_id': 'd42b1866b7fc4d9c8e944f0f96b478ee'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.227 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.227 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.227 12 ERROR oslo_messaging.notify.messaging yield
Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.227 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.227 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.227 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.227 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.227 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.227 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.227 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.227 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.227 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.227 12 ERROR oslo_messaging.notify.messaging conn.connect()
Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.227 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.227 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.227 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.227 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.227 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.227 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.227 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.227 12 ERROR oslo_messaging.notify.messaging
Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.227 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.227 12 ERROR oslo_messaging.notify.messaging
Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.227 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.227 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.227 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.227 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.227 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.227 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.227 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.227 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.227 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.227 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.227 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.227 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.227 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.227 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.227 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.227 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.227 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.227 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.227 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.227 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.227 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.227 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.227 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.227 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.227 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.227 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.227 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.227 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.227 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.227 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.227 12 ERROR oslo_messaging.notify.messaging
Dec 15 04:47:48 localhost
ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.228 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes in the context of pollsters Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.228 12 DEBUG ceilometer.compute.pollsters [-] 39ff1bd9-6f6b-44c8-bbec-a1fd9d196359/network.outgoing.bytes volume: 9770 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.230 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'a3301868-a49c-45fd-af42-209f61cf71da', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 9770, 'user_id': '1ba5fce347b64bfebf995f187193f205', 'user_name': None, 'project_id': 'c785bf23f53946bc99867d8832a50266', 'project_name': None, 'resource_id': 'instance-00000002-39ff1bd9-6f6b-44c8-bbec-a1fd9d196359-tap03ef8889-32', 'timestamp': '2025-12-15T09:47:48.228902', 'resource_metadata': {'display_name': 'test', 'name': 'tap03ef8889-32', 'instance_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359', 'instance_type': 'm1.small', 'host': 'bf4d507aa00b3fcf643a7e0b883420632ac7fc96e8c7a756982e121d', 'instance_host': 'np0005559462.localdomain', 'flavor': {'id': '2da0e147-aaa7-4bb9-a176-5fe1b15a32a0', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e'}, 'image_ref': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:74:df:7c', 'fref': None, 
'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap03ef8889-32'}, 'message_id': '1d3eb168-d99b-11f0-817e-fa163ebaca0f', 'monotonic_time': 11134.348360061, 'message_signature': '0db2b5512bf1895cfe946366634662d13ab5a993e85637ba25546639ede3ad4d'}]}, 'timestamp': '2025-12-15 09:47:48.229388', '_unique_id': 'd3e405672ba44e928b1930b689ba0b2f'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.230 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.230 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.230 12 ERROR oslo_messaging.notify.messaging yield Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.230 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.230 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.230 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.230 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.230 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.230 12 ERROR 
oslo_messaging.notify.messaging self._connection = self._establish_connection() Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.230 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.230 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.230 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.230 12 ERROR oslo_messaging.notify.messaging conn.connect() Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.230 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.230 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.230 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.230 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.230 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.230 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 
2025-12-15 09:47:48.230 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.230 12 ERROR oslo_messaging.notify.messaging Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.230 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.230 12 ERROR oslo_messaging.notify.messaging Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.230 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.230 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.230 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.230 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.230 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.230 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.230 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Dec 15 04:47:48 localhost 
ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.230 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.230 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.230 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.230 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.230 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.230 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.230 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.230 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.230 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.230 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Dec 15 
04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.230 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.230 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.230 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.230 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.230 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.230 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.230 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.230 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.230 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.230 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.230 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.230 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.230 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.230 12 ERROR oslo_messaging.notify.messaging Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.231 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes in the context of pollsters Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.231 12 DEBUG ceilometer.compute.pollsters [-] 39ff1bd9-6f6b-44c8-bbec-a1fd9d196359/network.incoming.bytes volume: 6809 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.232 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': '238b5eb1-6a81-4bd8-91ea-5abd6f81de9f', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 6809, 'user_id': '1ba5fce347b64bfebf995f187193f205', 'user_name': None, 'project_id': 'c785bf23f53946bc99867d8832a50266', 'project_name': None, 'resource_id': 'instance-00000002-39ff1bd9-6f6b-44c8-bbec-a1fd9d196359-tap03ef8889-32', 'timestamp': '2025-12-15T09:47:48.231446', 'resource_metadata': {'display_name': 'test', 'name': 'tap03ef8889-32', 'instance_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359', 'instance_type': 'm1.small', 'host': 'bf4d507aa00b3fcf643a7e0b883420632ac7fc96e8c7a756982e121d', 'instance_host': 'np0005559462.localdomain', 'flavor': {'id': '2da0e147-aaa7-4bb9-a176-5fe1b15a32a0', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e'}, 'image_ref': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:74:df:7c', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap03ef8889-32'}, 'message_id': '1d3f139c-d99b-11f0-817e-fa163ebaca0f', 'monotonic_time': 11134.348360061, 'message_signature': '13818e66cf66974481e3d79d89257b29c261d535c787ec3495ac646f73f73e08'}]}, 'timestamp': '2025-12-15 09:47:48.231900', '_unique_id': '22484251caa74ee5a352d4ff1660a5c3'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.232 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 15 04:47:48 localhost 
ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.232 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.232 12 ERROR oslo_messaging.notify.messaging yield Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.232 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.232 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.232 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.232 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.232 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.232 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.232 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.232 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.232 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.232 12 ERROR oslo_messaging.notify.messaging conn.connect() Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.232 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.232 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.232 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.232 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.232 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.232 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.232 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.232 12 ERROR oslo_messaging.notify.messaging Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.232 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.232 12 ERROR oslo_messaging.notify.messaging Dec 15 04:47:48 localhost 
ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.232 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.232 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.232 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.232 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.232 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.232 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.232 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.232 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.232 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.232 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection 
Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.232 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.232 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.232 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.232 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.232 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.232 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.232 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.232 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.232 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.232 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Dec 15 04:47:48 
localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.232 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.232 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.232 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.232 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.232 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.232 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.232 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.232 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.232 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.232 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.232 12 ERROR oslo_messaging.notify.messaging Dec 15 04:47:48 localhost 
ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.233 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.delta in the context of pollsters Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.234 12 DEBUG ceilometer.compute.pollsters [-] 39ff1bd9-6f6b-44c8-bbec-a1fd9d196359/network.outgoing.bytes.delta volume: 9770 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.235 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'e9f36b4f-ffd8-4220-a229-e3bd69e8ef6b', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 9770, 'user_id': '1ba5fce347b64bfebf995f187193f205', 'user_name': None, 'project_id': 'c785bf23f53946bc99867d8832a50266', 'project_name': None, 'resource_id': 'instance-00000002-39ff1bd9-6f6b-44c8-bbec-a1fd9d196359-tap03ef8889-32', 'timestamp': '2025-12-15T09:47:48.233968', 'resource_metadata': {'display_name': 'test', 'name': 'tap03ef8889-32', 'instance_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359', 'instance_type': 'm1.small', 'host': 'bf4d507aa00b3fcf643a7e0b883420632ac7fc96e8c7a756982e121d', 'instance_host': 'np0005559462.localdomain', 'flavor': {'id': '2da0e147-aaa7-4bb9-a176-5fe1b15a32a0', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e'}, 'image_ref': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:74:df:7c', 'fref': 
None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap03ef8889-32'}, 'message_id': '1d3f774c-d99b-11f0-817e-fa163ebaca0f', 'monotonic_time': 11134.348360061, 'message_signature': 'f2ebad2206d62d9b148d8a3f42c482db91ce9a52cf86018cbf3ab6f4f9993560'}]}, 'timestamp': '2025-12-15 09:47:48.234451', '_unique_id': 'd9918e3f24e44a5db4edafcef8daae79'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.235 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.235 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.235 12 ERROR oslo_messaging.notify.messaging yield Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.235 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.235 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.235 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.235 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.235 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.235 12 
ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.235 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.235 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.235 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.235 12 ERROR oslo_messaging.notify.messaging conn.connect() Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.235 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.235 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.235 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.235 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.235 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.235 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 
2025-12-15 09:47:48.235 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.235 12 ERROR oslo_messaging.notify.messaging Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.235 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.235 12 ERROR oslo_messaging.notify.messaging Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.235 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.235 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.235 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.235 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.235 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.235 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.235 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Dec 15 04:47:48 localhost 
ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.235 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.235 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.235 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.235 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.235 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.235 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.235 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.235 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.235 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.235 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Dec 15 
04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.235 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.235 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.235 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.235 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.235 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.235 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.235 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.235 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.235 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.235 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.235 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.235 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.235 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.235 12 ERROR oslo_messaging.notify.messaging Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.236 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.latency in the context of pollsters Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.236 12 DEBUG ceilometer.compute.pollsters [-] 39ff1bd9-6f6b-44c8-bbec-a1fd9d196359/disk.device.write.latency volume: 1243487016 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.236 12 DEBUG ceilometer.compute.pollsters [-] 39ff1bd9-6f6b-44c8-bbec-a1fd9d196359/disk.device.write.latency volume: 24779175 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.238 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': 'fe5ea9bb-8b2d-43cf-b5df-ad6557eb92f1', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 1243487016, 'user_id': '1ba5fce347b64bfebf995f187193f205', 'user_name': None, 'project_id': 'c785bf23f53946bc99867d8832a50266', 'project_name': None, 'resource_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359-vda', 'timestamp': '2025-12-15T09:47:48.236516', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359', 'instance_type': 'm1.small', 'host': 'bf4d507aa00b3fcf643a7e0b883420632ac7fc96e8c7a756982e121d', 'instance_host': 'np0005559462.localdomain', 'flavor': {'id': '2da0e147-aaa7-4bb9-a176-5fe1b15a32a0', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e'}, 'image_ref': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '1d3fd962-d99b-11f0-817e-fa163ebaca0f', 'monotonic_time': 11134.313428532, 'message_signature': 'bf4a237014e118dd2ccd9b6d24d028897984a2a3427bbdda67e4566c9ed75ce5'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 24779175, 'user_id': '1ba5fce347b64bfebf995f187193f205', 'user_name': None, 'project_id': 'c785bf23f53946bc99867d8832a50266', 'project_name': None, 'resource_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359-vdb', 'timestamp': '2025-12-15T09:47:48.236516', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 
'39ff1bd9-6f6b-44c8-bbec-a1fd9d196359', 'instance_type': 'm1.small', 'host': 'bf4d507aa00b3fcf643a7e0b883420632ac7fc96e8c7a756982e121d', 'instance_host': 'np0005559462.localdomain', 'flavor': {'id': '2da0e147-aaa7-4bb9-a176-5fe1b15a32a0', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e'}, 'image_ref': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '1d3fea9c-d99b-11f0-817e-fa163ebaca0f', 'monotonic_time': 11134.313428532, 'message_signature': '5961933e748064d51bc87b7496d9b23e5b38e8defacce5d815fe89466f154090'}]}, 'timestamp': '2025-12-15 09:47:48.237377', '_unique_id': '7116881bafcd4abfa6757200bba4ac30'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.238 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.238 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.238 12 ERROR oslo_messaging.notify.messaging yield Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.238 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.238 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.238 12 ERROR 
oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.238 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.238 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.238 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.238 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.238 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.238 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.238 12 ERROR oslo_messaging.notify.messaging conn.connect() Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.238 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.238 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.238 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Dec 15 
04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.238 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.238 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.238 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.238 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.238 12 ERROR oslo_messaging.notify.messaging Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.238 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.238 12 ERROR oslo_messaging.notify.messaging Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.238 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.238 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.238 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.238 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Dec 15 04:47:48 localhost 
ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.238 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.238 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.238 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.238 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.238 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.238 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.238 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.238 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.238 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.238 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, 
in get Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.238 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.238 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.238 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.238 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.238 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.238 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.238 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.238 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.238 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.238 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 15 04:47:48 localhost 
ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.238 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.238 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.238 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.238 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.238 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.238 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.238 12 ERROR oslo_messaging.notify.messaging Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.239 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.bytes in the context of pollsters Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.239 12 DEBUG ceilometer.compute.pollsters [-] 39ff1bd9-6f6b-44c8-bbec-a1fd9d196359/disk.device.write.bytes volume: 389120 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.239 12 DEBUG ceilometer.compute.pollsters [-] 39ff1bd9-6f6b-44c8-bbec-a1fd9d196359/disk.device.write.bytes volume: 512 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 15 04:47:48 localhost 
ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.241 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '400dbd3a-387d-4f10-be14-82522f04edd4', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 389120, 'user_id': '1ba5fce347b64bfebf995f187193f205', 'user_name': None, 'project_id': 'c785bf23f53946bc99867d8832a50266', 'project_name': None, 'resource_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359-vda', 'timestamp': '2025-12-15T09:47:48.239512', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359', 'instance_type': 'm1.small', 'host': 'bf4d507aa00b3fcf643a7e0b883420632ac7fc96e8c7a756982e121d', 'instance_host': 'np0005559462.localdomain', 'flavor': {'id': '2da0e147-aaa7-4bb9-a176-5fe1b15a32a0', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e'}, 'image_ref': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '1d404e60-d99b-11f0-817e-fa163ebaca0f', 'monotonic_time': 11134.313428532, 'message_signature': '8e22cd98e5da3dbd77c4e087602e305d141cc717845d8b0d63595abc1ae1758a'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 512, 'user_id': '1ba5fce347b64bfebf995f187193f205', 'user_name': None, 'project_id': 'c785bf23f53946bc99867d8832a50266', 'project_name': None, 'resource_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359-vdb', 'timestamp': 
'2025-12-15T09:47:48.239512', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359', 'instance_type': 'm1.small', 'host': 'bf4d507aa00b3fcf643a7e0b883420632ac7fc96e8c7a756982e121d', 'instance_host': 'np0005559462.localdomain', 'flavor': {'id': '2da0e147-aaa7-4bb9-a176-5fe1b15a32a0', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e'}, 'image_ref': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '1d405fe0-d99b-11f0-817e-fa163ebaca0f', 'monotonic_time': 11134.313428532, 'message_signature': 'e8c2a3fdd61718695c622f60c8d3baa3af4a2691d8ac4b99cb71f0b41012316d'}]}, 'timestamp': '2025-12-15 09:47:48.240378', '_unique_id': '52ea20c1bea44e918869affd12596b8f'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.241 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.241 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.241 12 ERROR oslo_messaging.notify.messaging yield Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.241 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.241 12 ERROR oslo_messaging.notify.messaging return retry_over_time( 
Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.241 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.241 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.241 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.241 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.241 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.241 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.241 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.241 12 ERROR oslo_messaging.notify.messaging conn.connect() Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.241 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.241 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.241 12 ERROR 
oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.241 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.241 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.241 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.241 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.241 12 ERROR oslo_messaging.notify.messaging Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.241 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.241 12 ERROR oslo_messaging.notify.messaging Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.241 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.241 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.241 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.241 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.241 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.241 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.241 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.241 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.241 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.241 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.241 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.241 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.241 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.241 
12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.241 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.241 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.241 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.241 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.241 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.241 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.241 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.241 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.241 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.241 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.241 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.241 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.241 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.241 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.241 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.241 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.241 12 ERROR oslo_messaging.notify.messaging Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.242 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no new resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193 Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.242 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.bytes in the context of pollsters Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.242 12 DEBUG ceilometer.compute.pollsters [-] 39ff1bd9-6f6b-44c8-bbec-a1fd9d196359/disk.device.read.bytes volume: 35560448 _stats_to_sample 
/usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.243 12 DEBUG ceilometer.compute.pollsters [-] 39ff1bd9-6f6b-44c8-bbec-a1fd9d196359/disk.device.read.bytes volume: 2154496 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.244 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'e62ebfb6-aa2f-4998-bbe2-55c4c64ada4c', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 35560448, 'user_id': '1ba5fce347b64bfebf995f187193f205', 'user_name': None, 'project_id': 'c785bf23f53946bc99867d8832a50266', 'project_name': None, 'resource_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359-vda', 'timestamp': '2025-12-15T09:47:48.242592', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359', 'instance_type': 'm1.small', 'host': 'bf4d507aa00b3fcf643a7e0b883420632ac7fc96e8c7a756982e121d', 'instance_host': 'np0005559462.localdomain', 'flavor': {'id': '2da0e147-aaa7-4bb9-a176-5fe1b15a32a0', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e'}, 'image_ref': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '1d40c6ec-d99b-11f0-817e-fa163ebaca0f', 'monotonic_time': 11134.313428532, 'message_signature': 
'ace60e62d255bc7341703ef47b7b29e30e5541c0774ffb97f74bb6fafc894acb'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 2154496, 'user_id': '1ba5fce347b64bfebf995f187193f205', 'user_name': None, 'project_id': 'c785bf23f53946bc99867d8832a50266', 'project_name': None, 'resource_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359-vdb', 'timestamp': '2025-12-15T09:47:48.242592', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359', 'instance_type': 'm1.small', 'host': 'bf4d507aa00b3fcf643a7e0b883420632ac7fc96e8c7a756982e121d', 'instance_host': 'np0005559462.localdomain', 'flavor': {'id': '2da0e147-aaa7-4bb9-a176-5fe1b15a32a0', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e'}, 'image_ref': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '1d40d8b2-d99b-11f0-817e-fa163ebaca0f', 'monotonic_time': 11134.313428532, 'message_signature': '02d11a0b4857fdbde7443b9c4334332d9567aa5522db0442ff3149232a1e40e4'}]}, 'timestamp': '2025-12-15 09:47:48.243470', '_unique_id': '5b5f4eb2338544b49bd4ebbb8d96760d'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.244 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.244 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 
09:47:48.244 12 ERROR oslo_messaging.notify.messaging yield Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.244 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.244 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.244 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.244 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.244 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.244 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.244 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.244 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.244 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.244 12 ERROR oslo_messaging.notify.messaging conn.connect() Dec 15 04:47:48 localhost 
ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.244 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.244 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.244 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.244 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.244 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.244 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.244 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.244 12 ERROR oslo_messaging.notify.messaging Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.244 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.244 12 ERROR oslo_messaging.notify.messaging Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.244 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.244 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.244 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.244 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.244 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.244 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.244 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.244 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.244 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.244 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.244 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 
09:47:48.244 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.244 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.244 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.244 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.244 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.244 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.244 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.244 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.244 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.244 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.244 12 ERROR 
oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.244 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.244 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.244 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.244 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.244 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.244 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.244 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.244 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.244 12 ERROR oslo_messaging.notify.messaging Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.245 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.error in the context of pollsters Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 
09:47:48.245 12 DEBUG ceilometer.compute.pollsters [-] 39ff1bd9-6f6b-44c8-bbec-a1fd9d196359/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.246 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'f1fb9cdd-ccb9-49f3-a609-ee8e27946f0a', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '1ba5fce347b64bfebf995f187193f205', 'user_name': None, 'project_id': 'c785bf23f53946bc99867d8832a50266', 'project_name': None, 'resource_id': 'instance-00000002-39ff1bd9-6f6b-44c8-bbec-a1fd9d196359-tap03ef8889-32', 'timestamp': '2025-12-15T09:47:48.245546', 'resource_metadata': {'display_name': 'test', 'name': 'tap03ef8889-32', 'instance_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359', 'instance_type': 'm1.small', 'host': 'bf4d507aa00b3fcf643a7e0b883420632ac7fc96e8c7a756982e121d', 'instance_host': 'np0005559462.localdomain', 'flavor': {'id': '2da0e147-aaa7-4bb9-a176-5fe1b15a32a0', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e'}, 'image_ref': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:74:df:7c', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap03ef8889-32'}, 'message_id': '1d413a50-d99b-11f0-817e-fa163ebaca0f', 'monotonic_time': 11134.348360061, 'message_signature': 
'8643651d4a5955119d9d34fd653b8cd3aac3b8de6edcd149836683f3f73f527a'}]}, 'timestamp': '2025-12-15 09:47:48.246026', '_unique_id': '090d24b8429c4214b74ff1f3ca33f808'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.246 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.246 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.246 12 ERROR oslo_messaging.notify.messaging yield Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.246 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.246 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.246 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.246 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.246 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.246 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.246 12 ERROR 
oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.246 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.246 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.246 12 ERROR oslo_messaging.notify.messaging conn.connect() Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.246 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.246 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.246 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.246 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.246 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.246 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.246 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Dec 15 04:47:48 localhost 
ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.246 12 ERROR oslo_messaging.notify.messaging Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.246 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.246 12 ERROR oslo_messaging.notify.messaging Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.246 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.246 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.246 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.246 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.246 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.246 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.246 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.246 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", 
line 653, in _send Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.246 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.246 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.246 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.246 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.246 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.246 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.246 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.246 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.246 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.246 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.246 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.246 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.246 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.246 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.246 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.246 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.246 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.246 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.246 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.246 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Dec 15 04:47:48 localhost 
ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.246 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.246 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 15 04:47:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:47:48.246 12 ERROR oslo_messaging.notify.messaging Dec 15 04:47:49 localhost nova_compute[286344]: 2025-12-15 09:47:49.049 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 04:47:49 localhost nova_compute[286344]: 2025-12-15 09:47:49.964 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 04:47:51 localhost ovn_metadata_agent[160585]: 2025-12-15 09:47:51.464 160590 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Dec 15 04:47:51 localhost ovn_metadata_agent[160585]: 2025-12-15 09:47:51.465 160590 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Dec 15 04:47:51 localhost ovn_metadata_agent[160585]: 2025-12-15 09:47:51.465 160590 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Dec 15 04:47:54 localhost nova_compute[286344]: 2025-12-15 09:47:54.052 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 
[POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 04:47:54 localhost systemd[1]: Started /usr/bin/podman healthcheck run 4d46c406a3855fd1c322e5d86c048188ea5865daa07ba23aaebacc7879668923. Dec 15 04:47:54 localhost podman[290111]: 2025-12-15 09:47:54.743052282 +0000 UTC m=+0.076368561 container health_status 4d46c406a3855fd1c322e5d86c048188ea5865daa07ba23aaebacc7879668923 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '07f3af15d79276418f54a8dde2fc251064527c3571430e14af77aaa2c01f1193-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, 
config_id=ovn_metadata_agent, managed_by=edpm_ansible, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=ovn_metadata_agent, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.license=GPLv2) Dec 15 04:47:54 localhost podman[290111]: 2025-12-15 09:47:54.781375307 +0000 UTC m=+0.114691566 container exec_died 4d46c406a3855fd1c322e5d86c048188ea5865daa07ba23aaebacc7879668923 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, config_id=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '07f3af15d79276418f54a8dde2fc251064527c3571430e14af77aaa2c01f1193-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, 
org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, tcib_managed=true, container_name=ovn_metadata_agent) Dec 15 04:47:54 localhost systemd[1]: 4d46c406a3855fd1c322e5d86c048188ea5865daa07ba23aaebacc7879668923.service: Deactivated successfully. Dec 15 04:47:54 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:fc:1f:13 MACDST=fa:16:3e:b3:bb:e3 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=42630 DF PROTO=TCP SPT=33952 DPT=9102 SEQ=753804607 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A83AB11910000000001030307) Dec 15 04:47:54 localhost nova_compute[286344]: 2025-12-15 09:47:54.965 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 04:47:55 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:fc:1f:13 MACDST=fa:16:3e:b3:bb:e3 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=42631 DF PROTO=TCP SPT=33952 DPT=9102 SEQ=753804607 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A83AB15A50000000001030307) Dec 15 04:47:56 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:fc:1f:13 MACDST=fa:16:3e:b3:bb:e3 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=8253 DF PROTO=TCP SPT=37004 DPT=9102 SEQ=3550340854 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A83AB19250000000001030307) Dec 15 04:47:57 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:fc:1f:13 MACDST=fa:16:3e:b3:bb:e3 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=42632 DF PROTO=TCP SPT=33952 DPT=9102 SEQ=753804607 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A83AB1DA50000000001030307) Dec 15 04:47:58 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:fc:1f:13 MACDST=fa:16:3e:b3:bb:e3 MACPROTO=0800 SRC=192.168.122.10 
DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=17477 DF PROTO=TCP SPT=47126 DPT=9102 SEQ=2216522171 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A83AB21250000000001030307) Dec 15 04:47:59 localhost nova_compute[286344]: 2025-12-15 09:47:59.054 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 04:48:00 localhost nova_compute[286344]: 2025-12-15 09:48:00.009 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 04:48:01 localhost systemd[1]: Started /usr/bin/podman healthcheck run a4e94bd7aaee6c174c601060fb5f34559019ccea93fc397263ee3735731cfc1e. Dec 15 04:48:01 localhost podman[290128]: 2025-12-15 09:48:01.747021882 +0000 UTC m=+0.082111817 container health_status a4e94bd7aaee6c174c601060fb5f34559019ccea93fc397263ee3735731cfc1e (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}) Dec 15 04:48:01 localhost podman[290128]: 2025-12-15 09:48:01.754269695 +0000 UTC m=+0.089359620 container exec_died 
a4e94bd7aaee6c174c601060fb5f34559019ccea93fc397263ee3735731cfc1e (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible) Dec 15 04:48:01 localhost systemd[1]: a4e94bd7aaee6c174c601060fb5f34559019ccea93fc397263ee3735731cfc1e.service: Deactivated successfully. 
Dec 15 04:48:01 localhost podman[243449]: time="2025-12-15T09:48:01Z" level=info msg="List containers: received `last` parameter - overwriting `limit`" Dec 15 04:48:01 localhost podman[243449]: @ - - [15/Dec/2025:09:48:01 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 148265 "" "Go-http-client/1.1" Dec 15 04:48:01 localhost podman[243449]: @ - - [15/Dec/2025:09:48:01 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 17226 "" "Go-http-client/1.1" Dec 15 04:48:01 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:fc:1f:13 MACDST=fa:16:3e:b3:bb:e3 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=42633 DF PROTO=TCP SPT=33952 DPT=9102 SEQ=753804607 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A83AB2D650000000001030307) Dec 15 04:48:02 localhost nova_compute[286344]: 2025-12-15 09:48:02.593 286348 DEBUG oslo_service.periodic_task [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Dec 15 04:48:02 localhost nova_compute[286344]: 2025-12-15 09:48:02.593 286348 DEBUG oslo_service.periodic_task [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Dec 15 04:48:02 localhost nova_compute[286344]: 2025-12-15 09:48:02.593 286348 DEBUG nova.compute.manager [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m Dec 15 04:48:02 localhost nova_compute[286344]: 2025-12-15 09:48:02.594 286348 DEBUG nova.compute.manager [None 
req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m Dec 15 04:48:02 localhost nova_compute[286344]: 2025-12-15 09:48:02.747 286348 DEBUG oslo_concurrency.lockutils [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Acquiring lock "refresh_cache-39ff1bd9-6f6b-44c8-bbec-a1fd9d196359" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m Dec 15 04:48:02 localhost nova_compute[286344]: 2025-12-15 09:48:02.747 286348 DEBUG oslo_concurrency.lockutils [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Acquired lock "refresh_cache-39ff1bd9-6f6b-44c8-bbec-a1fd9d196359" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m Dec 15 04:48:02 localhost nova_compute[286344]: 2025-12-15 09:48:02.748 286348 DEBUG nova.network.neutron [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] [instance: 39ff1bd9-6f6b-44c8-bbec-a1fd9d196359] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m Dec 15 04:48:02 localhost nova_compute[286344]: 2025-12-15 09:48:02.748 286348 DEBUG nova.objects.instance [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Lazy-loading 'info_cache' on Instance uuid 39ff1bd9-6f6b-44c8-bbec-a1fd9d196359 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m Dec 15 04:48:03 localhost nova_compute[286344]: 2025-12-15 09:48:03.105 286348 DEBUG nova.network.neutron [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] [instance: 39ff1bd9-6f6b-44c8-bbec-a1fd9d196359] Updating instance_info_cache with network_info: [{"id": "03ef8889-3216-43fb-8a52-4be17a956ce1", "address": "fa:16:3e:74:df:7c", "network": {"id": "befb7a72-17a9-4bcb-b561-84b8f626685a", "bridge": "br-int", "label": "private", "subnets": [{"cidr": 
"192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.201", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.20", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.0.3"}}], "meta": {"injected": false, "tenant_id": "c785bf23f53946bc99867d8832a50266", "mtu": 1292, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap03ef8889-32", "ovs_interfaceid": "03ef8889-3216-43fb-8a52-4be17a956ce1", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m Dec 15 04:48:03 localhost nova_compute[286344]: 2025-12-15 09:48:03.140 286348 DEBUG oslo_concurrency.lockutils [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Releasing lock "refresh_cache-39ff1bd9-6f6b-44c8-bbec-a1fd9d196359" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m Dec 15 04:48:03 localhost nova_compute[286344]: 2025-12-15 09:48:03.140 286348 DEBUG nova.compute.manager [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] [instance: 39ff1bd9-6f6b-44c8-bbec-a1fd9d196359] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m Dec 15 04:48:03 localhost nova_compute[286344]: 2025-12-15 09:48:03.141 286348 DEBUG oslo_service.periodic_task [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks 
/usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Dec 15 04:48:03 localhost nova_compute[286344]: 2025-12-15 09:48:03.141 286348 DEBUG oslo_service.periodic_task [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Dec 15 04:48:03 localhost nova_compute[286344]: 2025-12-15 09:48:03.142 286348 DEBUG oslo_service.periodic_task [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Dec 15 04:48:03 localhost nova_compute[286344]: 2025-12-15 09:48:03.142 286348 DEBUG oslo_service.periodic_task [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Dec 15 04:48:03 localhost nova_compute[286344]: 2025-12-15 09:48:03.143 286348 DEBUG oslo_service.periodic_task [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Dec 15 04:48:03 localhost nova_compute[286344]: 2025-12-15 09:48:03.143 286348 DEBUG oslo_service.periodic_task [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Dec 15 04:48:03 localhost nova_compute[286344]: 2025-12-15 09:48:03.144 286348 DEBUG nova.compute.manager [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... 
_reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m Dec 15 04:48:03 localhost nova_compute[286344]: 2025-12-15 09:48:03.144 286348 DEBUG oslo_service.periodic_task [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Dec 15 04:48:03 localhost nova_compute[286344]: 2025-12-15 09:48:03.275 286348 DEBUG oslo_concurrency.lockutils [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Dec 15 04:48:03 localhost nova_compute[286344]: 2025-12-15 09:48:03.275 286348 DEBUG oslo_concurrency.lockutils [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Dec 15 04:48:03 localhost nova_compute[286344]: 2025-12-15 09:48:03.276 286348 DEBUG oslo_concurrency.lockutils [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Dec 15 04:48:03 localhost nova_compute[286344]: 2025-12-15 09:48:03.276 286348 DEBUG nova.compute.resource_tracker [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Auditing locally available compute resources for np0005559462.localdomain (node: np0005559462.localdomain) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m Dec 15 04:48:03 localhost nova_compute[286344]: 2025-12-15 
09:48:03.277 286348 DEBUG oslo_concurrency.processutils [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m Dec 15 04:48:03 localhost nova_compute[286344]: 2025-12-15 09:48:03.734 286348 DEBUG oslo_concurrency.processutils [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.457s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m Dec 15 04:48:03 localhost nova_compute[286344]: 2025-12-15 09:48:03.798 286348 DEBUG nova.virt.libvirt.driver [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m Dec 15 04:48:03 localhost nova_compute[286344]: 2025-12-15 09:48:03.799 286348 DEBUG nova.virt.libvirt.driver [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m Dec 15 04:48:04 localhost nova_compute[286344]: 2025-12-15 09:48:04.007 286348 WARNING nova.virt.libvirt.driver [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] This host appears to have multiple sockets per NUMA node. 
The `socket` PCI NUMA affinity will not be supported.#033[00m Dec 15 04:48:04 localhost nova_compute[286344]: 2025-12-15 09:48:04.009 286348 DEBUG nova.compute.resource_tracker [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Hypervisor/Node resource view: name=np0005559462.localdomain free_ram=12315MB free_disk=41.836978912353516GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", 
"product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m Dec 15 04:48:04 localhost nova_compute[286344]: 2025-12-15 09:48:04.009 286348 DEBUG oslo_concurrency.lockutils [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Dec 15 04:48:04 localhost nova_compute[286344]: 2025-12-15 09:48:04.009 286348 DEBUG oslo_concurrency.lockutils [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Dec 15 04:48:04 localhost nova_compute[286344]: 2025-12-15 09:48:04.058 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 04:48:04 localhost nova_compute[286344]: 2025-12-15 09:48:04.129 286348 DEBUG nova.compute.resource_tracker [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Instance 39ff1bd9-6f6b-44c8-bbec-a1fd9d196359 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 2, 'MEMORY_MB': 512, 'VCPU': 1}}. 
_remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m Dec 15 04:48:04 localhost nova_compute[286344]: 2025-12-15 09:48:04.129 286348 DEBUG nova.compute.resource_tracker [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m Dec 15 04:48:04 localhost nova_compute[286344]: 2025-12-15 09:48:04.130 286348 DEBUG nova.compute.resource_tracker [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Final resource view: name=np0005559462.localdomain phys_ram=15738MB used_ram=1024MB phys_disk=41GB used_disk=2GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m Dec 15 04:48:04 localhost nova_compute[286344]: 2025-12-15 09:48:04.164 286348 DEBUG oslo_concurrency.processutils [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m Dec 15 04:48:04 localhost nova_compute[286344]: 2025-12-15 09:48:04.626 286348 DEBUG oslo_concurrency.processutils [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.462s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m Dec 15 04:48:04 localhost nova_compute[286344]: 2025-12-15 09:48:04.634 286348 DEBUG nova.compute.provider_tree [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Inventory has not changed in ProviderTree for provider: 26c8956b-6742-4951-b566-971b9bbe323b update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m Dec 15 04:48:04 localhost nova_compute[286344]: 
2025-12-15 09:48:04.686 286348 DEBUG nova.scheduler.client.report [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Inventory has not changed for provider 26c8956b-6742-4951-b566-971b9bbe323b based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m Dec 15 04:48:04 localhost nova_compute[286344]: 2025-12-15 09:48:04.689 286348 DEBUG nova.compute.resource_tracker [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Compute_service record updated for np0005559462.localdomain:np0005559462.localdomain _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m Dec 15 04:48:04 localhost nova_compute[286344]: 2025-12-15 09:48:04.689 286348 DEBUG oslo_concurrency.lockutils [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.680s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Dec 15 04:48:04 localhost openstack_network_exporter[246484]: ERROR 09:48:04 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd Dec 15 04:48:04 localhost openstack_network_exporter[246484]: ERROR 09:48:04 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd Dec 15 04:48:04 localhost openstack_network_exporter[246484]: ERROR 09:48:04 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server Dec 15 04:48:04 localhost 
openstack_network_exporter[246484]: ERROR 09:48:04 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath Dec 15 04:48:04 localhost openstack_network_exporter[246484]: Dec 15 04:48:04 localhost openstack_network_exporter[246484]: ERROR 09:48:04 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath Dec 15 04:48:04 localhost openstack_network_exporter[246484]: Dec 15 04:48:05 localhost nova_compute[286344]: 2025-12-15 09:48:05.012 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 04:48:09 localhost nova_compute[286344]: 2025-12-15 09:48:09.088 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 04:48:09 localhost systemd[1]: virtsecretd.service: Deactivated successfully. Dec 15 04:48:10 localhost nova_compute[286344]: 2025-12-15 09:48:10.014 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 04:48:10 localhost kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:fc:1f:13 MACDST=fa:16:3e:b3:bb:e3 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.106 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=42634 DF PROTO=TCP SPT=33952 DPT=9102 SEQ=753804607 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A83AB4D250000000001030307) Dec 15 04:48:12 localhost systemd[1]: Started /usr/bin/podman healthcheck run 67ee72fcd99c896f79e8807764b1d0f04e7d45927d06e306c0986394e8ed42a0. Dec 15 04:48:12 localhost systemd[1]: Started /usr/bin/podman healthcheck run 9f0f481c937606298203797a71924b7614a5d9e3f4a8afd837cfa5b9602aedeb. Dec 15 04:48:12 localhost systemd[1]: Started /usr/bin/podman healthcheck run b3d8483ade539500e228c2af9c0c3e647febb5ec7903610a83aea32631007d4a. 
Dec 15 04:48:12 localhost podman[290197]: 2025-12-15 09:48:12.766244395 +0000 UTC m=+0.088292450 container health_status 9f0f481c937606298203797a71924b7614a5d9e3f4a8afd837cfa5b9602aedeb (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.schema-version=1.0, container_name=multipathd, io.buildah.version=1.41.3, managed_by=edpm_ansible) Dec 15 04:48:12 localhost podman[290197]: 2025-12-15 09:48:12.800810089 +0000 UTC m=+0.122858174 container exec_died 
9f0f481c937606298203797a71924b7614a5d9e3f4a8afd837cfa5b9602aedeb (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=multipathd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, tcib_managed=true) Dec 15 04:48:12 localhost podman[290198]: 2025-12-15 09:48:12.818169339 +0000 UTC m=+0.136921811 container health_status b3d8483ade539500e228c2af9c0c3e647febb5ec7903610a83aea32631007d4a (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, 
name=ceilometer_agent_compute, health_status=healthy, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '07f3af15d79276418f54a8dde2fc251064527c3571430e14af77aaa2c01f1193-cb1e9478634a00c8fb5ad9cd7fcd38c7b2a1a85187dfaa618bc715c24c2b15fb'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, config_id=ceilometer_agent_compute, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS) Dec 15 04:48:12 localhost systemd[1]: 9f0f481c937606298203797a71924b7614a5d9e3f4a8afd837cfa5b9602aedeb.service: Deactivated successfully. 
Dec 15 04:48:12 localhost podman[290198]: 2025-12-15 09:48:12.827890813 +0000 UTC m=+0.146643305 container exec_died b3d8483ade539500e228c2af9c0c3e647febb5ec7903610a83aea32631007d4a (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, config_id=ceilometer_agent_compute, org.label-schema.vendor=CentOS, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '07f3af15d79276418f54a8dde2fc251064527c3571430e14af77aaa2c01f1193-cb1e9478634a00c8fb5ad9cd7fcd38c7b2a1a85187dfaa618bc715c24c2b15fb'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, tcib_managed=true, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb) Dec 
15 04:48:12 localhost systemd[1]: b3d8483ade539500e228c2af9c0c3e647febb5ec7903610a83aea32631007d4a.service: Deactivated successfully. Dec 15 04:48:12 localhost podman[290196]: 2025-12-15 09:48:12.922539202 +0000 UTC m=+0.246370737 container health_status 67ee72fcd99c896f79e8807764b1d0f04e7d45927d06e306c0986394e8ed42a0 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, maintainer=The Prometheus Authors , managed_by=edpm_ansible, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter) Dec 15 04:48:12 localhost podman[290196]: 2025-12-15 09:48:12.934428697 +0000 UTC m=+0.258260292 container exec_died 67ee72fcd99c896f79e8807764b1d0f04e7d45927d06e306c0986394e8ed42a0 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, 
config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible) Dec 15 04:48:12 localhost systemd[1]: 67ee72fcd99c896f79e8807764b1d0f04e7d45927d06e306c0986394e8ed42a0.service: Deactivated successfully. Dec 15 04:48:14 localhost nova_compute[286344]: 2025-12-15 09:48:14.089 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 04:48:14 localhost sshd[290275]: main: sshd: ssh-rsa algorithm is disabled Dec 15 04:48:14 localhost systemd[1]: Created slice User Slice of UID 1003. Dec 15 04:48:14 localhost systemd[1]: Starting User Runtime Directory /run/user/1003... Dec 15 04:48:14 localhost systemd-logind[763]: New session 62 of user tripleo-admin. Dec 15 04:48:14 localhost systemd[1]: Finished User Runtime Directory /run/user/1003. 
Dec 15 04:48:14 localhost systemd[1]: Starting User Manager for UID 1003... Dec 15 04:48:14 localhost systemd[290297]: Queued start job for default target Main User Target. Dec 15 04:48:14 localhost systemd[290297]: Created slice User Application Slice. Dec 15 04:48:14 localhost systemd[290297]: Started Mark boot as successful after the user session has run 2 minutes. Dec 15 04:48:14 localhost systemd[290297]: Started Daily Cleanup of User's Temporary Directories. Dec 15 04:48:14 localhost systemd[290297]: Reached target Paths. Dec 15 04:48:14 localhost systemd[290297]: Reached target Timers. Dec 15 04:48:14 localhost systemd[290297]: Starting D-Bus User Message Bus Socket... Dec 15 04:48:14 localhost systemd[290297]: Starting Create User's Volatile Files and Directories... Dec 15 04:48:14 localhost systemd[290297]: Finished Create User's Volatile Files and Directories. Dec 15 04:48:14 localhost systemd[290297]: Listening on D-Bus User Message Bus Socket. Dec 15 04:48:14 localhost systemd[290297]: Reached target Sockets. Dec 15 04:48:14 localhost systemd[290297]: Reached target Basic System. Dec 15 04:48:14 localhost systemd[290297]: Reached target Main User Target. Dec 15 04:48:14 localhost systemd[290297]: Startup finished in 169ms. Dec 15 04:48:14 localhost systemd[1]: Started User Manager for UID 1003. Dec 15 04:48:14 localhost systemd[1]: Started Session 62 of User tripleo-admin. 
Dec 15 04:48:15 localhost nova_compute[286344]: 2025-12-15 09:48:15.016 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 04:48:15 localhost python3[290440]: ansible-ansible.builtin.blockinfile Invoked with marker_begin=BEGIN ceph firewall rules marker_end=END ceph firewall rules path=/etc/nftables/edpm-rules.nft mode=0644 block=# 100 ceph_alertmanager (9093)#012add rule inet filter EDPM_INPUT tcp dport { 9093 } ct state new counter accept comment "100 ceph_alertmanager"#012# 100 ceph_dashboard (8443)#012add rule inet filter EDPM_INPUT tcp dport { 8443 } ct state new counter accept comment "100 ceph_dashboard"#012# 100 ceph_grafana (3100)#012add rule inet filter EDPM_INPUT tcp dport { 3100 } ct state new counter accept comment "100 ceph_grafana"#012# 100 ceph_prometheus (9092)#012add rule inet filter EDPM_INPUT tcp dport { 9092 } ct state new counter accept comment "100 ceph_prometheus"#012# 100 ceph_rgw (8080)#012add rule inet filter EDPM_INPUT tcp dport { 8080 } ct state new counter accept comment "100 ceph_rgw"#012# 110 ceph_mon (6789, 3300, 9100)#012add rule inet filter EDPM_INPUT tcp dport { 6789,3300,9100 } ct state new counter accept comment "110 ceph_mon"#012# 112 ceph_mds (6800-7300, 9100)#012add rule inet filter EDPM_INPUT tcp dport { 6800-7300,9100 } ct state new counter accept comment "112 ceph_mds"#012# 113 ceph_mgr (6800-7300, 8444)#012add rule inet filter EDPM_INPUT tcp dport { 6800-7300,8444 } ct state new counter accept comment "113 ceph_mgr"#012# 120 ceph_nfs (2049, 12049)#012add rule inet filter EDPM_INPUT tcp dport { 2049,12049 } ct state new counter accept comment "120 ceph_nfs"#012# 123 ceph_dashboard (9090, 9094, 9283)#012add rule inet filter EDPM_INPUT tcp dport { 9090,9094,9283 } ct state new counter accept comment "123 ceph_dashboard"#012 insertbefore=^# Lock down INPUT chains state=present marker=# {mark} ANSIBLE MANAGED BLOCK create=False 
backup=False unsafe_writes=False insertafter=None validate=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Dec 15 04:48:16 localhost python3[290584]: ansible-ansible.builtin.systemd Invoked with name=nftables state=restarted enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None Dec 15 04:48:16 localhost systemd[1]: Stopping Netfilter Tables... Dec 15 04:48:16 localhost systemd[1]: nftables.service: Deactivated successfully. Dec 15 04:48:16 localhost systemd[1]: Stopped Netfilter Tables. Dec 15 04:48:16 localhost systemd[1]: Started /usr/bin/podman healthcheck run 730c50020a7d9e2da21d2462d40af20ac303e43f3491f692cbbd87f432ba3e09. Dec 15 04:48:16 localhost systemd[1]: Starting Netfilter Tables... Dec 15 04:48:16 localhost systemd[1]: Finished Netfilter Tables. Dec 15 04:48:16 localhost podman[290590]: 2025-12-15 09:48:16.916792063 +0000 UTC m=+0.083364012 container health_status 730c50020a7d9e2da21d2462d40af20ac303e43f3491f692cbbd87f432ba3e09 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, build-date=2025-08-20T13:12:41, io.openshift.expose-services=, io.openshift.tags=minimal rhel9, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, version=9.6, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'cb1e9478634a00c8fb5ad9cd7fcd38c7b2a1a85187dfaa618bc715c24c2b15fb'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 
'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.buildah.version=1.33.7, architecture=x86_64, url=https://catalog.redhat.com/en/search?searchType=containers, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, com.redhat.component=ubi9-minimal-container, maintainer=Red Hat, Inc., release=1755695350, vcs-type=git, config_id=openstack_network_exporter, distribution-scope=public, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vendor=Red Hat, Inc., container_name=openstack_network_exporter, name=ubi9-minimal) Dec 15 04:48:16 localhost podman[290590]: 2025-12-15 09:48:16.939851533 +0000 UTC m=+0.106423482 container exec_died 730c50020a7d9e2da21d2462d40af20ac303e43f3491f692cbbd87f432ba3e09 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, com.redhat.component=ubi9-minimal-container, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., architecture=x86_64, version=9.6, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=Red Hat, Inc., release=1755695350, build-date=2025-08-20T13:12:41, config_id=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., vendor=Red Hat, Inc., io.openshift.tags=minimal rhel9, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'cb1e9478634a00c8fb5ad9cd7fcd38c7b2a1a85187dfaa618bc715c24c2b15fb'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.expose-services=, managed_by=edpm_ansible, container_name=openstack_network_exporter, name=ubi9-minimal, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.buildah.version=1.33.7) Dec 15 04:48:16 localhost systemd[1]: 730c50020a7d9e2da21d2462d40af20ac303e43f3491f692cbbd87f432ba3e09.service: Deactivated successfully. Dec 15 04:48:16 localhost systemd[1]: Started /usr/bin/podman healthcheck run ac2eae30a2a7f971398af42ea67b3ed799d4b3341f7688ebcd73f7ca7f66c2a8. 
Dec 15 04:48:17 localhost podman[290626]: 2025-12-15 09:48:17.078712358 +0000 UTC m=+0.094197407 container health_status ac2eae30a2a7f971398af42ea67b3ed799d4b3341f7688ebcd73f7ca7f66c2a8 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '07f3af15d79276418f54a8dde2fc251064527c3571430e14af77aaa2c01f1193'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.license=GPLv2, container_name=ovn_controller, config_id=ovn_controller, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb) Dec 15 04:48:17 localhost podman[290626]: 2025-12-15 09:48:17.123446709 +0000 UTC m=+0.138931738 container exec_died ac2eae30a2a7f971398af42ea67b3ed799d4b3341f7688ebcd73f7ca7f66c2a8 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 
'EDPM_CONFIG_HASH': '07f3af15d79276418f54a8dde2fc251064527c3571430e14af77aaa2c01f1193'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, container_name=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, org.label-schema.vendor=CentOS) Dec 15 04:48:17 localhost systemd[1]: ac2eae30a2a7f971398af42ea67b3ed799d4b3341f7688ebcd73f7ca7f66c2a8.service: Deactivated successfully. 
Dec 15 04:48:19 localhost nova_compute[286344]: 2025-12-15 09:48:19.122 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 04:48:20 localhost nova_compute[286344]: 2025-12-15 09:48:20.018 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 04:48:24 localhost nova_compute[286344]: 2025-12-15 09:48:24.123 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 04:48:25 localhost nova_compute[286344]: 2025-12-15 09:48:25.020 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 04:48:25 localhost systemd[1]: Started /usr/bin/podman healthcheck run 4d46c406a3855fd1c322e5d86c048188ea5865daa07ba23aaebacc7879668923. Dec 15 04:48:25 localhost systemd[1]: tmp-crun.ePYOPc.mount: Deactivated successfully. 
Dec 15 04:48:25 localhost podman[290741]: 2025-12-15 09:48:25.222613478 +0000 UTC m=+0.091816550 container health_status 4d46c406a3855fd1c322e5d86c048188ea5865daa07ba23aaebacc7879668923 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '07f3af15d79276418f54a8dde2fc251064527c3571430e14af77aaa2c01f1193-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202) Dec 15 04:48:25 localhost 
podman[290741]: 2025-12-15 09:48:25.231503999 +0000 UTC m=+0.100707041 container exec_died 4d46c406a3855fd1c322e5d86c048188ea5865daa07ba23aaebacc7879668923 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '07f3af15d79276418f54a8dde2fc251064527c3571430e14af77aaa2c01f1193-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team) Dec 15 04:48:25 localhost systemd[1]: 
4d46c406a3855fd1c322e5d86c048188ea5865daa07ba23aaebacc7879668923.service: Deactivated successfully. Dec 15 04:48:29 localhost nova_compute[286344]: 2025-12-15 09:48:29.162 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 04:48:30 localhost nova_compute[286344]: 2025-12-15 09:48:30.023 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 04:48:31 localhost podman[243449]: time="2025-12-15T09:48:31Z" level=info msg="List containers: received `last` parameter - overwriting `limit`" Dec 15 04:48:31 localhost podman[243449]: @ - - [15/Dec/2025:09:48:31 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 148265 "" "Go-http-client/1.1" Dec 15 04:48:31 localhost podman[243449]: @ - - [15/Dec/2025:09:48:31 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 17226 "" "Go-http-client/1.1" Dec 15 04:48:32 localhost systemd[1]: Started /usr/bin/podman healthcheck run a4e94bd7aaee6c174c601060fb5f34559019ccea93fc397263ee3735731cfc1e. 
Dec 15 04:48:32 localhost podman[290850]: 2025-12-15 09:48:32.750524783 +0000 UTC m=+0.085586835 container health_status a4e94bd7aaee6c174c601060fb5f34559019ccea93fc397263ee3735731cfc1e (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, container_name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter) Dec 15 04:48:32 localhost podman[290850]: 2025-12-15 09:48:32.759133975 +0000 UTC m=+0.094196037 container exec_died a4e94bd7aaee6c174c601060fb5f34559019ccea93fc397263ee3735731cfc1e (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 
'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}) Dec 15 04:48:32 localhost systemd[1]: a4e94bd7aaee6c174c601060fb5f34559019ccea93fc397263ee3735731cfc1e.service: Deactivated successfully. Dec 15 04:48:34 localhost nova_compute[286344]: 2025-12-15 09:48:34.165 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 04:48:34 localhost openstack_network_exporter[246484]: ERROR 09:48:34 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server Dec 15 04:48:34 localhost openstack_network_exporter[246484]: ERROR 09:48:34 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd Dec 15 04:48:34 localhost openstack_network_exporter[246484]: ERROR 09:48:34 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd Dec 15 04:48:34 localhost openstack_network_exporter[246484]: ERROR 09:48:34 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath Dec 15 04:48:34 localhost openstack_network_exporter[246484]: Dec 15 04:48:34 localhost openstack_network_exporter[246484]: ERROR 09:48:34 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath Dec 15 04:48:34 localhost openstack_network_exporter[246484]: Dec 15 04:48:35 localhost nova_compute[286344]: 2025-12-15 09:48:35.024 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 04:48:37 localhost podman[290950]: Dec 15 04:48:37 localhost podman[290950]: 2025-12-15 09:48:37.200560802 +0000 UTC m=+0.074302785 container create 9fb58aa1f9aeba149dbfbdda098c6b027d17545135be27cd794f5d173fbab661 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=optimistic_merkle, 
vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, vcs-type=git, com.redhat.component=rhceph-container, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, distribution-scope=public, description=Red Hat Ceph Storage 7, architecture=x86_64, maintainer=Guillaume Abrioux , summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_BRANCH=main, vendor=Red Hat, Inc., io.openshift.expose-services=, url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.tags=rhceph ceph, GIT_CLEAN=True, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, GIT_REPO=https://github.com/ceph/ceph-container.git, CEPH_POINT_RELEASE=, name=rhceph, io.k8s.description=Red Hat Ceph Storage 7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, ceph=True, RELEASE=main, io.buildah.version=1.41.4, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, build-date=2025-11-26T19:44:28Z, version=7, release=1763362218, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0) Dec 15 04:48:37 localhost systemd[1]: Started libpod-conmon-9fb58aa1f9aeba149dbfbdda098c6b027d17545135be27cd794f5d173fbab661.scope. Dec 15 04:48:37 localhost podman[290950]: 2025-12-15 09:48:37.170516606 +0000 UTC m=+0.044258609 image pull registry.redhat.io/rhceph/rhceph-7-rhel9:latest Dec 15 04:48:37 localhost systemd[1]: Started libcrun container. 
Dec 15 04:48:37 localhost podman[290950]: 2025-12-15 09:48:37.287188064 +0000 UTC m=+0.160930037 container init 9fb58aa1f9aeba149dbfbdda098c6b027d17545135be27cd794f5d173fbab661 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=optimistic_merkle, description=Red Hat Ceph Storage 7, release=1763362218, build-date=2025-11-26T19:44:28Z, CEPH_POINT_RELEASE=, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, distribution-scope=public, name=rhceph, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_BRANCH=main, maintainer=Guillaume Abrioux , GIT_REPO=https://github.com/ceph/ceph-container.git, ceph=True, RELEASE=main, version=7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.openshift.tags=rhceph ceph, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, GIT_CLEAN=True, url=https://catalog.redhat.com/en/search?searchType=containers, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, io.buildah.version=1.41.4, vcs-type=git, io.k8s.description=Red Hat Ceph Storage 7, io.openshift.expose-services=, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, vendor=Red Hat, Inc., com.redhat.component=rhceph-container, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image.) 
Dec 15 04:48:37 localhost podman[290950]: 2025-12-15 09:48:37.299280816 +0000 UTC m=+0.173022799 container start 9fb58aa1f9aeba149dbfbdda098c6b027d17545135be27cd794f5d173fbab661 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=optimistic_merkle, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, io.k8s.description=Red Hat Ceph Storage 7, architecture=x86_64, build-date=2025-11-26T19:44:28Z, name=rhceph, maintainer=Guillaume Abrioux , vendor=Red Hat, Inc., com.redhat.component=rhceph-container, GIT_BRANCH=main, url=https://catalog.redhat.com/en/search?searchType=containers, GIT_REPO=https://github.com/ceph/ceph-container.git, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, io.openshift.tags=rhceph ceph, vcs-type=git, description=Red Hat Ceph Storage 7, release=1763362218, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, ceph=True, RELEASE=main, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, CEPH_POINT_RELEASE=, io.openshift.expose-services=, io.buildah.version=1.41.4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., version=7, GIT_CLEAN=True, distribution-scope=public) Dec 15 04:48:37 localhost podman[290950]: 2025-12-15 09:48:37.300045727 +0000 UTC m=+0.173787720 container attach 9fb58aa1f9aeba149dbfbdda098c6b027d17545135be27cd794f5d173fbab661 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=optimistic_merkle, RELEASE=main, distribution-scope=public, name=rhceph, description=Red Hat Ceph Storage 7, release=1763362218, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, maintainer=Guillaume Abrioux , com.redhat.component=rhceph-container, version=7, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, vendor=Red Hat, Inc., io.openshift.tags=rhceph ceph, io.buildah.version=1.41.4, CEPH_POINT_RELEASE=, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, build-date=2025-11-26T19:44:28Z, GIT_BRANCH=main, GIT_REPO=https://github.com/ceph/ceph-container.git, architecture=x86_64, io.openshift.expose-services=, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, ceph=True, io.k8s.description=Red Hat Ceph Storage 7, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, GIT_CLEAN=True, url=https://catalog.redhat.com/en/search?searchType=containers, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image.) Dec 15 04:48:37 localhost optimistic_merkle[290965]: 167 167 Dec 15 04:48:37 localhost systemd[1]: libpod-9fb58aa1f9aeba149dbfbdda098c6b027d17545135be27cd794f5d173fbab661.scope: Deactivated successfully. Dec 15 04:48:37 localhost podman[290950]: 2025-12-15 09:48:37.302330552 +0000 UTC m=+0.176072595 container died 9fb58aa1f9aeba149dbfbdda098c6b027d17545135be27cd794f5d173fbab661 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=optimistic_merkle, description=Red Hat Ceph Storage 7, maintainer=Guillaume Abrioux , distribution-scope=public, io.openshift.tags=rhceph ceph, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, name=rhceph, GIT_CLEAN=True, release=1763362218, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, RELEASE=main, io.buildah.version=1.41.4, version=7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_REPO=https://github.com/ceph/ceph-container.git, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vendor=Red Hat, Inc., architecture=x86_64, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.openshift.expose-services=, build-date=2025-11-26T19:44:28Z, GIT_BRANCH=main, 
cpe=cpe:/a:redhat:enterprise_linux:9::appstream, ceph=True, io.k8s.description=Red Hat Ceph Storage 7, vcs-type=git, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.component=rhceph-container, CEPH_POINT_RELEASE=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI) Dec 15 04:48:37 localhost podman[290970]: 2025-12-15 09:48:37.399243763 +0000 UTC m=+0.088162606 container remove 9fb58aa1f9aeba149dbfbdda098c6b027d17545135be27cd794f5d173fbab661 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=optimistic_merkle, io.openshift.expose-services=, RELEASE=main, GIT_CLEAN=True, GIT_REPO=https://github.com/ceph/ceph-container.git, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, architecture=x86_64, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.buildah.version=1.41.4, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, distribution-scope=public, maintainer=Guillaume Abrioux , io.k8s.description=Red Hat Ceph Storage 7, description=Red Hat Ceph Storage 7, build-date=2025-11-26T19:44:28Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., io.openshift.tags=rhceph ceph, GIT_BRANCH=main, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, release=1763362218, ceph=True, name=rhceph, com.redhat.component=rhceph-container, version=7, vcs-type=git, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, CEPH_POINT_RELEASE=) Dec 15 04:48:37 localhost systemd[1]: libpod-conmon-9fb58aa1f9aeba149dbfbdda098c6b027d17545135be27cd794f5d173fbab661.scope: Deactivated successfully. Dec 15 04:48:37 localhost systemd[1]: Reloading. Dec 15 04:48:37 localhost systemd-rc-local-generator[291013]: /etc/rc.d/rc.local is not marked executable, skipping. 
Dec 15 04:48:37 localhost systemd-sysv-generator[291016]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. Dec 15 04:48:37 localhost systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload Dec 15 04:48:37 localhost systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload Dec 15 04:48:37 localhost systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload Dec 15 04:48:37 localhost systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload Dec 15 04:48:37 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Dec 15 04:48:37 localhost systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload Dec 15 04:48:37 localhost systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload Dec 15 04:48:37 localhost systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload Dec 15 04:48:37 localhost systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload Dec 15 04:48:37 localhost systemd[1]: tmp-crun.KwSQBy.mount: Deactivated successfully. Dec 15 04:48:37 localhost systemd[1]: var-lib-containers-storage-overlay-eac54044eda8f8f7a0f949af836d13667471936187a62584f82c57c4b61030bb-merged.mount: Deactivated successfully. Dec 15 04:48:37 localhost systemd[1]: Reloading. 
Dec 15 04:48:37 localhost systemd-sysv-generator[291057]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. Dec 15 04:48:37 localhost systemd-rc-local-generator[291054]: /etc/rc.d/rc.local is not marked executable, skipping. Dec 15 04:48:38 localhost systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload Dec 15 04:48:38 localhost systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload Dec 15 04:48:38 localhost systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload Dec 15 04:48:38 localhost systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload Dec 15 04:48:38 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Dec 15 04:48:38 localhost systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload Dec 15 04:48:38 localhost systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload Dec 15 04:48:38 localhost systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload Dec 15 04:48:38 localhost systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload Dec 15 04:48:38 localhost systemd[1]: Starting Ceph mds.mds.np0005559462.mhigvc for bce17446-41b5-5408-a23e-0b011906b44a... 
Dec 15 04:48:38 localhost podman[291116]: Dec 15 04:48:38 localhost podman[291116]: 2025-12-15 09:48:38.596247281 +0000 UTC m=+0.076314103 container create 579f9ed995d353795b14a844045b1d996814f67b92bb72b83d22bc9bac593044 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-bce17446-41b5-5408-a23e-0b011906b44a-mds-mds-np0005559462-mhigvc, io.openshift.expose-services=, ceph=True, build-date=2025-11-26T19:44:28Z, GIT_CLEAN=True, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, RELEASE=main, com.redhat.component=rhceph-container, io.openshift.tags=rhceph ceph, CEPH_POINT_RELEASE=, vendor=Red Hat, Inc., io.k8s.description=Red Hat Ceph Storage 7, distribution-scope=public, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, GIT_REPO=https://github.com/ceph/ceph-container.git, version=7, vcs-type=git, maintainer=Guillaume Abrioux , GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, description=Red Hat Ceph Storage 7, io.buildah.version=1.41.4, url=https://catalog.redhat.com/en/search?searchType=containers, architecture=x86_64, GIT_BRANCH=main, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhceph, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., release=1763362218) Dec 15 04:48:38 localhost systemd[1]: tmp-crun.nbIW1I.mount: Deactivated successfully. 
Dec 15 04:48:38 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/4ec1a80c913becc41fd39a92125537ea3cb0bc59f65acaae40933a63af11105f/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff) Dec 15 04:48:38 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/4ec1a80c913becc41fd39a92125537ea3cb0bc59f65acaae40933a63af11105f/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff) Dec 15 04:48:38 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/4ec1a80c913becc41fd39a92125537ea3cb0bc59f65acaae40933a63af11105f/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff) Dec 15 04:48:38 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/4ec1a80c913becc41fd39a92125537ea3cb0bc59f65acaae40933a63af11105f/merged/var/lib/ceph/mds/ceph-mds.np0005559462.mhigvc supports timestamps until 2038 (0x7fffffff) Dec 15 04:48:38 localhost podman[291116]: 2025-12-15 09:48:38.659028691 +0000 UTC m=+0.139095513 container init 579f9ed995d353795b14a844045b1d996814f67b92bb72b83d22bc9bac593044 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-bce17446-41b5-5408-a23e-0b011906b44a-mds-mds-np0005559462-mhigvc, GIT_BRANCH=main, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.buildah.version=1.41.4, GIT_REPO=https://github.com/ceph/ceph-container.git, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, url=https://catalog.redhat.com/en/search?searchType=containers, distribution-scope=public, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., architecture=x86_64, GIT_CLEAN=True, version=7, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, name=rhceph, io.k8s.description=Red Hat Ceph Storage 7, release=1763362218, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., description=Red Hat Ceph Storage 7, ceph=True, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, io.openshift.expose-services=, com.redhat.component=rhceph-container, build-date=2025-11-26T19:44:28Z, vcs-type=git, maintainer=Guillaume Abrioux , RELEASE=main, io.openshift.tags=rhceph ceph, CEPH_POINT_RELEASE=) Dec 15 04:48:38 localhost podman[291116]: 2025-12-15 09:48:38.565360741 +0000 UTC m=+0.045427603 image pull registry.redhat.io/rhceph/rhceph-7-rhel9:latest Dec 15 04:48:38 localhost podman[291116]: 2025-12-15 09:48:38.671911245 +0000 UTC m=+0.151978077 container start 579f9ed995d353795b14a844045b1d996814f67b92bb72b83d22bc9bac593044 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-bce17446-41b5-5408-a23e-0b011906b44a-mds-mds-np0005559462-mhigvc, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, build-date=2025-11-26T19:44:28Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, vcs-type=git, version=7, maintainer=Guillaume Abrioux , io.k8s.description=Red Hat Ceph Storage 7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vendor=Red Hat, Inc., description=Red Hat Ceph Storage 7, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, release=1763362218, ceph=True, RELEASE=main, io.buildah.version=1.41.4, GIT_REPO=https://github.com/ceph/ceph-container.git, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_CLEAN=True, io.openshift.expose-services=, com.redhat.component=rhceph-container, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, architecture=x86_64, io.openshift.tags=rhceph ceph, CEPH_POINT_RELEASE=, GIT_BRANCH=main, url=https://catalog.redhat.com/en/search?searchType=containers, name=rhceph, 
org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0) Dec 15 04:48:38 localhost bash[291116]: 579f9ed995d353795b14a844045b1d996814f67b92bb72b83d22bc9bac593044 Dec 15 04:48:38 localhost systemd[1]: Started Ceph mds.mds.np0005559462.mhigvc for bce17446-41b5-5408-a23e-0b011906b44a. Dec 15 04:48:38 localhost ceph-mds[291134]: set uid:gid to 167:167 (ceph:ceph) Dec 15 04:48:38 localhost ceph-mds[291134]: ceph version 18.2.1-361.el9cp (439dcd6094d413840eb2ec590fe2194ec616687f) reef (stable), process ceph-mds, pid 2 Dec 15 04:48:38 localhost ceph-mds[291134]: main not setting numa affinity Dec 15 04:48:38 localhost ceph-mds[291134]: pidfile_write: ignore empty --pid-file Dec 15 04:48:38 localhost ceph-bce17446-41b5-5408-a23e-0b011906b44a-mds-mds-np0005559462-mhigvc[291130]: starting mds.mds.np0005559462.mhigvc at Dec 15 04:48:38 localhost ceph-mds[291134]: mds.mds.np0005559462.mhigvc Updating MDS map to version 9 from mon.0 Dec 15 04:48:39 localhost nova_compute[286344]: 2025-12-15 09:48:39.189 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 04:48:39 localhost ceph-mds[291134]: mds.mds.np0005559462.mhigvc Updating MDS map to version 10 from mon.0 Dec 15 04:48:39 localhost ceph-mds[291134]: mds.mds.np0005559462.mhigvc Monitors have assigned me to become a standby. 
Dec 15 04:48:40 localhost nova_compute[286344]: 2025-12-15 09:48:40.027 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 04:48:40 localhost podman[291280]: 2025-12-15 09:48:40.217212071 +0000 UTC m=+0.100775282 container exec 8dcda56b365b42dc8758aab77a9ec80db304780e449052738f7e4e648ae1ecaf (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-bce17446-41b5-5408-a23e-0b011906b44a-crash-np0005559462, vcs-type=git, RELEASE=main, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, release=1763362218, GIT_REPO=https://github.com/ceph/ceph-container.git, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, vendor=Red Hat, Inc., io.openshift.tags=rhceph ceph, url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.expose-services=, build-date=2025-11-26T19:44:28Z, version=7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, ceph=True, architecture=x86_64, description=Red Hat Ceph Storage 7, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, CEPH_POINT_RELEASE=, GIT_BRANCH=main, GIT_CLEAN=True, maintainer=Guillaume Abrioux , io.k8s.description=Red Hat Ceph Storage 7, io.buildah.version=1.41.4, name=rhceph, com.redhat.component=rhceph-container, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., distribution-scope=public) Dec 15 04:48:40 localhost podman[291280]: 2025-12-15 09:48:40.34237072 +0000 UTC m=+0.225933911 container exec_died 8dcda56b365b42dc8758aab77a9ec80db304780e449052738f7e4e648ae1ecaf (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-bce17446-41b5-5408-a23e-0b011906b44a-crash-np0005559462, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, CEPH_POINT_RELEASE=, vcs-type=git, 
io.buildah.version=1.41.4, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.description=Red Hat Ceph Storage 7, GIT_CLEAN=True, version=7, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, com.redhat.component=rhceph-container, io.openshift.expose-services=, name=rhceph, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_REPO=https://github.com/ceph/ceph-container.git, distribution-scope=public, vendor=Red Hat, Inc., ceph=True, release=1763362218, RELEASE=main, maintainer=Guillaume Abrioux , io.openshift.tags=rhceph ceph, GIT_BRANCH=main, architecture=x86_64, description=Red Hat Ceph Storage 7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, build-date=2025-11-26T19:44:28Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI) Dec 15 04:48:40 localhost systemd[1]: session-61.scope: Deactivated successfully. Dec 15 04:48:40 localhost systemd-logind[763]: Session 61 logged out. Waiting for processes to exit. Dec 15 04:48:40 localhost systemd-logind[763]: Removed session 61. Dec 15 04:48:43 localhost systemd[1]: Started /usr/bin/podman healthcheck run 67ee72fcd99c896f79e8807764b1d0f04e7d45927d06e306c0986394e8ed42a0. Dec 15 04:48:43 localhost systemd[1]: Started /usr/bin/podman healthcheck run 9f0f481c937606298203797a71924b7614a5d9e3f4a8afd837cfa5b9602aedeb. Dec 15 04:48:43 localhost systemd[1]: Started /usr/bin/podman healthcheck run b3d8483ade539500e228c2af9c0c3e647febb5ec7903610a83aea32631007d4a. 
Dec 15 04:48:43 localhost podman[291403]: 2025-12-15 09:48:43.759373796 +0000 UTC m=+0.085557524 container health_status 9f0f481c937606298203797a71924b7614a5d9e3f4a8afd837cfa5b9602aedeb (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=multipathd, container_name=multipathd, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, managed_by=edpm_ansible, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true) Dec 15 04:48:43 localhost podman[291402]: 2025-12-15 09:48:43.815439126 +0000 UTC m=+0.142922200 container health_status 
67ee72fcd99c896f79e8807764b1d0f04e7d45927d06e306c0986394e8ed42a0 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible) Dec 15 04:48:43 localhost podman[291403]: 2025-12-15 09:48:43.84678594 +0000 UTC m=+0.172969668 container exec_died 9f0f481c937606298203797a71924b7614a5d9e3f4a8afd837cfa5b9602aedeb (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 
'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, container_name=multipathd, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0) Dec 15 04:48:43 localhost systemd[1]: 9f0f481c937606298203797a71924b7614a5d9e3f4a8afd837cfa5b9602aedeb.service: Deactivated successfully. 
Dec 15 04:48:43 localhost podman[291404]: 2025-12-15 09:48:43.866027303 +0000 UTC m=+0.187906419 container health_status b3d8483ade539500e228c2af9c0c3e647febb5ec7903610a83aea32631007d4a (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '07f3af15d79276418f54a8dde2fc251064527c3571430e14af77aaa2c01f1193-cb1e9478634a00c8fb5ad9cd7fcd38c7b2a1a85187dfaa618bc715c24c2b15fb'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_id=ceilometer_agent_compute, container_name=ceilometer_agent_compute, 
io.buildah.version=1.41.3) Dec 15 04:48:43 localhost podman[291402]: 2025-12-15 09:48:43.88011907 +0000 UTC m=+0.207602134 container exec_died 67ee72fcd99c896f79e8807764b1d0f04e7d45927d06e306c0986394e8ed42a0 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter) Dec 15 04:48:43 localhost systemd[1]: 67ee72fcd99c896f79e8807764b1d0f04e7d45927d06e306c0986394e8ed42a0.service: Deactivated successfully. 
Dec 15 04:48:43 localhost podman[291404]: 2025-12-15 09:48:43.904373493 +0000 UTC m=+0.226252579 container exec_died b3d8483ade539500e228c2af9c0c3e647febb5ec7903610a83aea32631007d4a (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_id=ceilometer_agent_compute, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.vendor=CentOS, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '07f3af15d79276418f54a8dde2fc251064527c3571430e14af77aaa2c01f1193-cb1e9478634a00c8fb5ad9cd7fcd38c7b2a1a85187dfaa618bc715c24c2b15fb'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb) Dec 
15 04:48:43 localhost systemd[1]: b3d8483ade539500e228c2af9c0c3e647febb5ec7903610a83aea32631007d4a.service: Deactivated successfully. Dec 15 04:48:44 localhost nova_compute[286344]: 2025-12-15 09:48:44.189 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 04:48:45 localhost nova_compute[286344]: 2025-12-15 09:48:45.028 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 04:48:47 localhost systemd[1]: Started /usr/bin/podman healthcheck run 730c50020a7d9e2da21d2462d40af20ac303e43f3491f692cbbd87f432ba3e09. Dec 15 04:48:47 localhost systemd[1]: Started /usr/bin/podman healthcheck run ac2eae30a2a7f971398af42ea67b3ed799d4b3341f7688ebcd73f7ca7f66c2a8. Dec 15 04:48:47 localhost podman[291464]: 2025-12-15 09:48:47.76971302 +0000 UTC m=+0.089304529 container health_status ac2eae30a2a7f971398af42ea67b3ed799d4b3341f7688ebcd73f7ca7f66c2a8 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ovn_controller, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, managed_by=edpm_ansible, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '07f3af15d79276418f54a8dde2fc251064527c3571430e14af77aaa2c01f1193'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 
'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3) Dec 15 04:48:47 localhost podman[291464]: 2025-12-15 09:48:47.815550683 +0000 UTC m=+0.135142232 container exec_died ac2eae30a2a7f971398af42ea67b3ed799d4b3341f7688ebcd73f7ca7f66c2a8 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '07f3af15d79276418f54a8dde2fc251064527c3571430e14af77aaa2c01f1193'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_id=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251202) Dec 15 04:48:47 localhost systemd[1]: 
ac2eae30a2a7f971398af42ea67b3ed799d4b3341f7688ebcd73f7ca7f66c2a8.service: Deactivated successfully. Dec 15 04:48:47 localhost podman[291463]: 2025-12-15 09:48:47.828165668 +0000 UTC m=+0.150660038 container health_status 730c50020a7d9e2da21d2462d40af20ac303e43f3491f692cbbd87f432ba3e09 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'cb1e9478634a00c8fb5ad9cd7fcd38c7b2a1a85187dfaa618bc715c24c2b15fb'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., release=1755695350, vcs-type=git, architecture=x86_64, io.buildah.version=1.33.7, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., build-date=2025-08-20T13:12:41, vendor=Red Hat, Inc., container_name=openstack_network_exporter, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=minimal rhel9, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., config_id=openstack_network_exporter, distribution-scope=public, com.redhat.component=ubi9-minimal-container, name=ubi9-minimal, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, managed_by=edpm_ansible, version=9.6, io.openshift.expose-services=) Dec 15 04:48:47 localhost podman[291463]: 2025-12-15 09:48:47.841385431 +0000 UTC m=+0.163879801 container exec_died 730c50020a7d9e2da21d2462d40af20ac303e43f3491f692cbbd87f432ba3e09 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., build-date=2025-08-20T13:12:41, architecture=x86_64, vcs-type=git, io.buildah.version=1.33.7, release=1755695350, name=ubi9-minimal, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.tags=minimal rhel9, container_name=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., distribution-scope=public, com.redhat.component=ubi9-minimal-container, url=https://catalog.redhat.com/en/search?searchType=containers, maintainer=Red Hat, Inc., managed_by=edpm_ansible, io.openshift.expose-services=, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'cb1e9478634a00c8fb5ad9cd7fcd38c7b2a1a85187dfaa618bc715c24c2b15fb'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, config_id=openstack_network_exporter, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vendor=Red Hat, Inc., version=9.6) Dec 15 04:48:47 localhost systemd[1]: 730c50020a7d9e2da21d2462d40af20ac303e43f3491f692cbbd87f432ba3e09.service: Deactivated successfully. 
Dec 15 04:48:49 localhost nova_compute[286344]: 2025-12-15 09:48:49.213 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 04:48:50 localhost nova_compute[286344]: 2025-12-15 09:48:50.030 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 04:48:51 localhost ovn_metadata_agent[160585]: 2025-12-15 09:48:51.466 160590 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Dec 15 04:48:51 localhost ovn_metadata_agent[160585]: 2025-12-15 09:48:51.467 160590 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Dec 15 04:48:51 localhost ovn_metadata_agent[160585]: 2025-12-15 09:48:51.468 160590 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Dec 15 04:48:54 localhost nova_compute[286344]: 2025-12-15 09:48:54.216 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 04:48:55 localhost nova_compute[286344]: 2025-12-15 09:48:55.031 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 04:48:55 localhost systemd[1]: Started /usr/bin/podman healthcheck run 4d46c406a3855fd1c322e5d86c048188ea5865daa07ba23aaebacc7879668923. 
Dec 15 04:48:55 localhost systemd[1]: tmp-crun.ZRi8Wl.mount: Deactivated successfully. Dec 15 04:48:55 localhost podman[291506]: 2025-12-15 09:48:55.77448943 +0000 UTC m=+0.102587034 container health_status 4d46c406a3855fd1c322e5d86c048188ea5865daa07ba23aaebacc7879668923 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, container_name=ovn_metadata_agent, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '07f3af15d79276418f54a8dde2fc251064527c3571430e14af77aaa2c01f1193-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, config_id=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, 
org.label-schema.schema-version=1.0, tcib_managed=true) Dec 15 04:48:55 localhost podman[291506]: 2025-12-15 09:48:55.783403781 +0000 UTC m=+0.111501375 container exec_died 4d46c406a3855fd1c322e5d86c048188ea5865daa07ba23aaebacc7879668923 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '07f3af15d79276418f54a8dde2fc251064527c3571430e14af77aaa2c01f1193-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3) Dec 15 
04:48:55 localhost systemd[1]: 4d46c406a3855fd1c322e5d86c048188ea5865daa07ba23aaebacc7879668923.service: Deactivated successfully. Dec 15 04:48:59 localhost nova_compute[286344]: 2025-12-15 09:48:59.217 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 04:49:00 localhost nova_compute[286344]: 2025-12-15 09:49:00.033 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 04:49:01 localhost podman[243449]: time="2025-12-15T09:49:01Z" level=info msg="List containers: received `last` parameter - overwriting `limit`" Dec 15 04:49:01 localhost podman[243449]: @ - - [15/Dec/2025:09:49:01 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 150413 "" "Go-http-client/1.1" Dec 15 04:49:01 localhost podman[243449]: @ - - [15/Dec/2025:09:49:01 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 17712 "" "Go-http-client/1.1" Dec 15 04:49:02 localhost nova_compute[286344]: 2025-12-15 09:49:02.363 286348 DEBUG oslo_service.periodic_task [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Dec 15 04:49:02 localhost nova_compute[286344]: 2025-12-15 09:49:02.364 286348 DEBUG oslo_service.periodic_task [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Dec 15 04:49:02 localhost nova_compute[286344]: 2025-12-15 09:49:02.387 286348 DEBUG oslo_service.periodic_task [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Running periodic task 
ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Dec 15 04:49:02 localhost nova_compute[286344]: 2025-12-15 09:49:02.387 286348 DEBUG nova.compute.manager [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m Dec 15 04:49:02 localhost nova_compute[286344]: 2025-12-15 09:49:02.387 286348 DEBUG nova.compute.manager [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m Dec 15 04:49:02 localhost nova_compute[286344]: 2025-12-15 09:49:02.784 286348 DEBUG oslo_concurrency.lockutils [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Acquiring lock "refresh_cache-39ff1bd9-6f6b-44c8-bbec-a1fd9d196359" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m Dec 15 04:49:02 localhost nova_compute[286344]: 2025-12-15 09:49:02.785 286348 DEBUG oslo_concurrency.lockutils [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Acquired lock "refresh_cache-39ff1bd9-6f6b-44c8-bbec-a1fd9d196359" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m Dec 15 04:49:02 localhost nova_compute[286344]: 2025-12-15 09:49:02.785 286348 DEBUG nova.network.neutron [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] [instance: 39ff1bd9-6f6b-44c8-bbec-a1fd9d196359] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m Dec 15 04:49:02 localhost nova_compute[286344]: 2025-12-15 09:49:02.785 286348 DEBUG nova.objects.instance [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Lazy-loading 'info_cache' on Instance uuid 39ff1bd9-6f6b-44c8-bbec-a1fd9d196359 obj_load_attr 
/usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m Dec 15 04:49:03 localhost nova_compute[286344]: 2025-12-15 09:49:03.253 286348 DEBUG nova.network.neutron [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] [instance: 39ff1bd9-6f6b-44c8-bbec-a1fd9d196359] Updating instance_info_cache with network_info: [{"id": "03ef8889-3216-43fb-8a52-4be17a956ce1", "address": "fa:16:3e:74:df:7c", "network": {"id": "befb7a72-17a9-4bcb-b561-84b8f626685a", "bridge": "br-int", "label": "private", "subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.201", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.20", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.0.3"}}], "meta": {"injected": false, "tenant_id": "c785bf23f53946bc99867d8832a50266", "mtu": 1292, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap03ef8889-32", "ovs_interfaceid": "03ef8889-3216-43fb-8a52-4be17a956ce1", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m Dec 15 04:49:03 localhost nova_compute[286344]: 2025-12-15 09:49:03.271 286348 DEBUG oslo_concurrency.lockutils [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Releasing lock "refresh_cache-39ff1bd9-6f6b-44c8-bbec-a1fd9d196359" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m Dec 15 04:49:03 localhost nova_compute[286344]: 2025-12-15 09:49:03.272 286348 DEBUG nova.compute.manager [None 
req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] [instance: 39ff1bd9-6f6b-44c8-bbec-a1fd9d196359] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m Dec 15 04:49:03 localhost nova_compute[286344]: 2025-12-15 09:49:03.272 286348 DEBUG oslo_service.periodic_task [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Dec 15 04:49:03 localhost nova_compute[286344]: 2025-12-15 09:49:03.273 286348 DEBUG oslo_service.periodic_task [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Dec 15 04:49:03 localhost nova_compute[286344]: 2025-12-15 09:49:03.273 286348 DEBUG oslo_service.periodic_task [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Dec 15 04:49:03 localhost nova_compute[286344]: 2025-12-15 09:49:03.273 286348 DEBUG oslo_service.periodic_task [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Dec 15 04:49:03 localhost nova_compute[286344]: 2025-12-15 09:49:03.273 286348 DEBUG oslo_service.periodic_task [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Dec 15 04:49:03 localhost nova_compute[286344]: 2025-12-15 09:49:03.274 286348 DEBUG oslo_service.periodic_task [None 
req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Dec 15 04:49:03 localhost nova_compute[286344]: 2025-12-15 09:49:03.274 286348 DEBUG nova.compute.manager [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m Dec 15 04:49:03 localhost nova_compute[286344]: 2025-12-15 09:49:03.275 286348 DEBUG oslo_service.periodic_task [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Dec 15 04:49:03 localhost nova_compute[286344]: 2025-12-15 09:49:03.337 286348 DEBUG oslo_concurrency.lockutils [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Dec 15 04:49:03 localhost nova_compute[286344]: 2025-12-15 09:49:03.337 286348 DEBUG oslo_concurrency.lockutils [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Dec 15 04:49:03 localhost nova_compute[286344]: 2025-12-15 09:49:03.338 286348 DEBUG oslo_concurrency.lockutils [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Dec 15 04:49:03 localhost 
nova_compute[286344]: 2025-12-15 09:49:03.338 286348 DEBUG nova.compute.resource_tracker [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Auditing locally available compute resources for np0005559462.localdomain (node: np0005559462.localdomain) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m Dec 15 04:49:03 localhost nova_compute[286344]: 2025-12-15 09:49:03.338 286348 DEBUG oslo_concurrency.processutils [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m Dec 15 04:49:03 localhost systemd[1]: Started /usr/bin/podman healthcheck run a4e94bd7aaee6c174c601060fb5f34559019ccea93fc397263ee3735731cfc1e. Dec 15 04:49:03 localhost systemd[1]: tmp-crun.lJ9UVj.mount: Deactivated successfully. Dec 15 04:49:03 localhost podman[291545]: 2025-12-15 09:49:03.765259976 +0000 UTC m=+0.094367841 container health_status a4e94bd7aaee6c174c601060fb5f34559019ccea93fc397263ee3735731cfc1e (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, container_name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, 
config_id=podman_exporter) Dec 15 04:49:03 localhost nova_compute[286344]: 2025-12-15 09:49:03.772 286348 DEBUG oslo_concurrency.processutils [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.434s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m Dec 15 04:49:03 localhost podman[291545]: 2025-12-15 09:49:03.803511255 +0000 UTC m=+0.132619150 container exec_died a4e94bd7aaee6c174c601060fb5f34559019ccea93fc397263ee3735731cfc1e (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, managed_by=edpm_ansible, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi ) Dec 15 04:49:03 localhost systemd[1]: a4e94bd7aaee6c174c601060fb5f34559019ccea93fc397263ee3735731cfc1e.service: Deactivated successfully. 
Dec 15 04:49:03 localhost nova_compute[286344]: 2025-12-15 09:49:03.892 286348 DEBUG nova.virt.libvirt.driver [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m Dec 15 04:49:03 localhost nova_compute[286344]: 2025-12-15 09:49:03.893 286348 DEBUG nova.virt.libvirt.driver [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m Dec 15 04:49:04 localhost nova_compute[286344]: 2025-12-15 09:49:04.113 286348 WARNING nova.virt.libvirt.driver [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m Dec 15 04:49:04 localhost nova_compute[286344]: 2025-12-15 09:49:04.116 286348 DEBUG nova.compute.resource_tracker [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Hypervisor/Node resource view: name=np0005559462.localdomain free_ram=12265MB free_disk=41.836978912353516GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": 
"pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m Dec 15 04:49:04 localhost nova_compute[286344]: 2025-12-15 09:49:04.116 286348 DEBUG oslo_concurrency.lockutils [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Dec 15 04:49:04 localhost nova_compute[286344]: 2025-12-15 09:49:04.117 286348 DEBUG oslo_concurrency.lockutils [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner 
/usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Dec 15 04:49:04 localhost nova_compute[286344]: 2025-12-15 09:49:04.177 286348 DEBUG nova.compute.resource_tracker [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Instance 39ff1bd9-6f6b-44c8-bbec-a1fd9d196359 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 2, 'MEMORY_MB': 512, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m Dec 15 04:49:04 localhost nova_compute[286344]: 2025-12-15 09:49:04.178 286348 DEBUG nova.compute.resource_tracker [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m Dec 15 04:49:04 localhost nova_compute[286344]: 2025-12-15 09:49:04.178 286348 DEBUG nova.compute.resource_tracker [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Final resource view: name=np0005559462.localdomain phys_ram=15738MB used_ram=1024MB phys_disk=41GB used_disk=2GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m Dec 15 04:49:04 localhost nova_compute[286344]: 2025-12-15 09:49:04.208 286348 DEBUG oslo_concurrency.processutils [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m Dec 15 04:49:04 localhost nova_compute[286344]: 2025-12-15 09:49:04.260 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 04:49:04 localhost nova_compute[286344]: 2025-12-15 09:49:04.666 286348 DEBUG oslo_concurrency.processutils [None 
req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.458s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m Dec 15 04:49:04 localhost nova_compute[286344]: 2025-12-15 09:49:04.672 286348 DEBUG nova.compute.provider_tree [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Inventory has not changed in ProviderTree for provider: 26c8956b-6742-4951-b566-971b9bbe323b update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m Dec 15 04:49:04 localhost nova_compute[286344]: 2025-12-15 09:49:04.707 286348 DEBUG nova.scheduler.client.report [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Inventory has not changed for provider 26c8956b-6742-4951-b566-971b9bbe323b based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m Dec 15 04:49:04 localhost nova_compute[286344]: 2025-12-15 09:49:04.709 286348 DEBUG nova.compute.resource_tracker [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Compute_service record updated for np0005559462.localdomain:np0005559462.localdomain _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m Dec 15 04:49:04 localhost nova_compute[286344]: 2025-12-15 09:49:04.710 286348 DEBUG oslo_concurrency.lockutils [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.593s inner 
/usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Dec 15 04:49:04 localhost openstack_network_exporter[246484]: ERROR 09:49:04 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server Dec 15 04:49:04 localhost openstack_network_exporter[246484]: ERROR 09:49:04 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd Dec 15 04:49:04 localhost openstack_network_exporter[246484]: ERROR 09:49:04 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd Dec 15 04:49:04 localhost openstack_network_exporter[246484]: ERROR 09:49:04 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath Dec 15 04:49:04 localhost openstack_network_exporter[246484]: Dec 15 04:49:04 localhost openstack_network_exporter[246484]: ERROR 09:49:04 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath Dec 15 04:49:04 localhost openstack_network_exporter[246484]: Dec 15 04:49:05 localhost nova_compute[286344]: 2025-12-15 09:49:05.035 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 04:49:09 localhost nova_compute[286344]: 2025-12-15 09:49:09.285 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 04:49:10 localhost nova_compute[286344]: 2025-12-15 09:49:10.037 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 04:49:14 localhost nova_compute[286344]: 2025-12-15 09:49:14.302 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 04:49:14 localhost systemd[1]: Started /usr/bin/podman healthcheck run 
67ee72fcd99c896f79e8807764b1d0f04e7d45927d06e306c0986394e8ed42a0. Dec 15 04:49:14 localhost systemd[1]: Started /usr/bin/podman healthcheck run 9f0f481c937606298203797a71924b7614a5d9e3f4a8afd837cfa5b9602aedeb. Dec 15 04:49:14 localhost systemd[1]: Started /usr/bin/podman healthcheck run b3d8483ade539500e228c2af9c0c3e647febb5ec7903610a83aea32631007d4a. Dec 15 04:49:14 localhost podman[291668]: 2025-12-15 09:49:14.763051018 +0000 UTC m=+0.087481737 container health_status 9f0f481c937606298203797a71924b7614a5d9e3f4a8afd837cfa5b9602aedeb (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, tcib_managed=true, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, config_id=multipathd, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', 
'/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, managed_by=edpm_ansible) Dec 15 04:49:14 localhost podman[291669]: 2025-12-15 09:49:14.81522143 +0000 UTC m=+0.135931314 container health_status b3d8483ade539500e228c2af9c0c3e647febb5ec7903610a83aea32631007d4a (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '07f3af15d79276418f54a8dde2fc251064527c3571430e14af77aaa2c01f1193-cb1e9478634a00c8fb5ad9cd7fcd38c7b2a1a85187dfaa618bc715c24c2b15fb'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=ceilometer_agent_compute, 
io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0) Dec 15 04:49:14 localhost podman[291669]: 2025-12-15 09:49:14.823812782 +0000 UTC m=+0.144522666 container exec_died b3d8483ade539500e228c2af9c0c3e647febb5ec7903610a83aea32631007d4a (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '07f3af15d79276418f54a8dde2fc251064527c3571430e14af77aaa2c01f1193-cb1e9478634a00c8fb5ad9cd7fcd38c7b2a1a85187dfaa618bc715c24c2b15fb'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute) Dec 15 04:49:14 localhost systemd[1]: b3d8483ade539500e228c2af9c0c3e647febb5ec7903610a83aea32631007d4a.service: Deactivated successfully. Dec 15 04:49:14 localhost podman[291667]: 2025-12-15 09:49:14.917833042 +0000 UTC m=+0.243698152 container health_status 67ee72fcd99c896f79e8807764b1d0f04e7d45927d06e306c0986394e8ed42a0 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, managed_by=edpm_ansible, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors ) Dec 15 04:49:14 localhost podman[291667]: 2025-12-15 
09:49:14.928073011 +0000 UTC m=+0.253938121 container exec_died 67ee72fcd99c896f79e8807764b1d0f04e7d45927d06e306c0986394e8ed42a0 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter) Dec 15 04:49:14 localhost systemd[1]: 67ee72fcd99c896f79e8807764b1d0f04e7d45927d06e306c0986394e8ed42a0.service: Deactivated successfully. 
Dec 15 04:49:14 localhost podman[291668]: 2025-12-15 09:49:14.98195205 +0000 UTC m=+0.306382749 container exec_died 9f0f481c937606298203797a71924b7614a5d9e3f4a8afd837cfa5b9602aedeb (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, container_name=multipathd, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, config_id=multipathd, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb) Dec 15 04:49:14 localhost systemd[1]: 9f0f481c937606298203797a71924b7614a5d9e3f4a8afd837cfa5b9602aedeb.service: Deactivated successfully. 
Dec 15 04:49:15 localhost nova_compute[286344]: 2025-12-15 09:49:15.039 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 04:49:15 localhost systemd[1]: tmp-crun.pbXMqP.mount: Deactivated successfully. Dec 15 04:49:16 localhost systemd[1]: session-62.scope: Deactivated successfully. Dec 15 04:49:16 localhost systemd[1]: session-62.scope: Consumed 1.345s CPU time. Dec 15 04:49:16 localhost systemd-logind[763]: Session 62 logged out. Waiting for processes to exit. Dec 15 04:49:16 localhost systemd-logind[763]: Removed session 62. Dec 15 04:49:18 localhost systemd[1]: Started /usr/bin/podman healthcheck run 730c50020a7d9e2da21d2462d40af20ac303e43f3491f692cbbd87f432ba3e09. Dec 15 04:49:18 localhost systemd[1]: Started /usr/bin/podman healthcheck run ac2eae30a2a7f971398af42ea67b3ed799d4b3341f7688ebcd73f7ca7f66c2a8. Dec 15 04:49:18 localhost systemd[1]: tmp-crun.RnSGPO.mount: Deactivated successfully. 
Dec 15 04:49:18 localhost podman[291730]: 2025-12-15 09:49:18.771532369 +0000 UTC m=+0.090886174 container health_status ac2eae30a2a7f971398af42ea67b3ed799d4b3341f7688ebcd73f7ca7f66c2a8 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '07f3af15d79276418f54a8dde2fc251064527c3571430e14af77aaa2c01f1193'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, managed_by=edpm_ansible, config_id=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true) Dec 15 04:49:18 localhost podman[291729]: 2025-12-15 09:49:18.871420304 +0000 UTC m=+0.191139599 container health_status 730c50020a7d9e2da21d2462d40af20ac303e43f3491f692cbbd87f432ba3e09 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, container_name=openstack_network_exporter, 
managed_by=edpm_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=ubi9-minimal-container, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, architecture=x86_64, version=9.6, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'cb1e9478634a00c8fb5ad9cd7fcd38c7b2a1a85187dfaa618bc715c24c2b15fb'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.buildah.version=1.33.7, vcs-type=git, vendor=Red Hat, Inc., build-date=2025-08-20T13:12:41, maintainer=Red Hat, Inc., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.tags=minimal rhel9, name=ubi9-minimal, distribution-scope=public, io.k8s.description=The Universal Base Image Minimal is a stripped down 
image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., config_id=openstack_network_exporter, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, release=1755695350) Dec 15 04:49:18 localhost podman[291729]: 2025-12-15 09:49:18.887351594 +0000 UTC m=+0.207070919 container exec_died 730c50020a7d9e2da21d2462d40af20ac303e43f3491f692cbbd87f432ba3e09 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, build-date=2025-08-20T13:12:41, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, release=1755695350, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'cb1e9478634a00c8fb5ad9cd7fcd38c7b2a1a85187dfaa618bc715c24c2b15fb'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': 
['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., version=9.6, distribution-scope=public, io.openshift.tags=minimal rhel9, io.openshift.expose-services=, vendor=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, managed_by=edpm_ansible, name=ubi9-minimal, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vcs-type=git, maintainer=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, config_id=openstack_network_exporter, container_name=openstack_network_exporter) Dec 15 04:49:18 localhost systemd[1]: 730c50020a7d9e2da21d2462d40af20ac303e43f3491f692cbbd87f432ba3e09.service: Deactivated successfully. 
Dec 15 04:49:18 localhost podman[291730]: 2025-12-15 09:49:18.939298879 +0000 UTC m=+0.258652704 container exec_died ac2eae30a2a7f971398af42ea67b3ed799d4b3341f7688ebcd73f7ca7f66c2a8 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=ovn_controller, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '07f3af15d79276418f54a8dde2fc251064527c3571430e14af77aaa2c01f1193'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, managed_by=edpm_ansible, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.name=CentOS Stream 9 Base Image) Dec 15 04:49:18 localhost systemd[1]: ac2eae30a2a7f971398af42ea67b3ed799d4b3341f7688ebcd73f7ca7f66c2a8.service: Deactivated successfully. Dec 15 04:49:19 localhost nova_compute[286344]: 2025-12-15 09:49:19.326 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 04:49:19 localhost systemd[1]: tmp-crun.jq5Kxi.mount: Deactivated successfully. 
Dec 15 04:49:20 localhost nova_compute[286344]: 2025-12-15 09:49:20.040 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 04:49:24 localhost nova_compute[286344]: 2025-12-15 09:49:24.379 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 04:49:25 localhost sshd[291774]: main: sshd: ssh-rsa algorithm is disabled Dec 15 04:49:25 localhost nova_compute[286344]: 2025-12-15 09:49:25.042 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 04:49:26 localhost systemd[1]: Started /usr/bin/podman healthcheck run 4d46c406a3855fd1c322e5d86c048188ea5865daa07ba23aaebacc7879668923. Dec 15 04:49:26 localhost systemd[1]: Stopping User Manager for UID 1003... Dec 15 04:49:26 localhost systemd[290297]: Activating special unit Exit the Session... Dec 15 04:49:26 localhost systemd[290297]: Stopped target Main User Target. Dec 15 04:49:26 localhost systemd[290297]: Stopped target Basic System. Dec 15 04:49:26 localhost systemd[290297]: Stopped target Paths. Dec 15 04:49:26 localhost systemd[290297]: Stopped target Sockets. Dec 15 04:49:26 localhost systemd[290297]: Stopped target Timers. Dec 15 04:49:26 localhost systemd[290297]: Stopped Mark boot as successful after the user session has run 2 minutes. Dec 15 04:49:26 localhost systemd[290297]: Stopped Daily Cleanup of User's Temporary Directories. Dec 15 04:49:26 localhost systemd[290297]: Closed D-Bus User Message Bus Socket. Dec 15 04:49:26 localhost systemd[290297]: Stopped Create User's Volatile Files and Directories. Dec 15 04:49:26 localhost systemd[290297]: Removed slice User Application Slice. Dec 15 04:49:26 localhost systemd[290297]: Reached target Shutdown. Dec 15 04:49:26 localhost systemd[290297]: Finished Exit the Session. 
Dec 15 04:49:26 localhost systemd[290297]: Reached target Exit the Session. Dec 15 04:49:26 localhost systemd[1]: user@1003.service: Deactivated successfully. Dec 15 04:49:26 localhost systemd[1]: Stopped User Manager for UID 1003. Dec 15 04:49:26 localhost systemd[1]: Stopping User Runtime Directory /run/user/1003... Dec 15 04:49:26 localhost systemd[1]: run-user-1003.mount: Deactivated successfully. Dec 15 04:49:26 localhost systemd[1]: user-runtime-dir@1003.service: Deactivated successfully. Dec 15 04:49:26 localhost systemd[1]: Stopped User Runtime Directory /run/user/1003. Dec 15 04:49:26 localhost systemd[1]: Removed slice User Slice of UID 1003. Dec 15 04:49:26 localhost systemd[1]: user-1003.slice: Consumed 1.780s CPU time. Dec 15 04:49:26 localhost podman[291793]: 2025-12-15 09:49:26.659842853 +0000 UTC m=+0.083618439 container health_status 4d46c406a3855fd1c322e5d86c048188ea5865daa07ba23aaebacc7879668923 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, managed_by=edpm_ansible, tcib_managed=true, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '07f3af15d79276418f54a8dde2fc251064527c3571430e14af77aaa2c01f1193-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', 
'/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team) Dec 15 04:49:26 localhost podman[291793]: 2025-12-15 09:49:26.69343191 +0000 UTC m=+0.117207496 container exec_died 4d46c406a3855fd1c322e5d86c048188ea5865daa07ba23aaebacc7879668923 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, container_name=ovn_metadata_agent, managed_by=edpm_ansible, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '07f3af15d79276418f54a8dde2fc251064527c3571430e14af77aaa2c01f1193-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', 
'/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202) Dec 15 04:49:26 localhost systemd[1]: 4d46c406a3855fd1c322e5d86c048188ea5865daa07ba23aaebacc7879668923.service: Deactivated successfully. Dec 15 04:49:29 localhost nova_compute[286344]: 2025-12-15 09:49:29.423 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 04:49:30 localhost nova_compute[286344]: 2025-12-15 09:49:30.044 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 04:49:31 localhost podman[243449]: time="2025-12-15T09:49:31Z" level=info msg="List containers: received `last` parameter - overwriting `limit`" Dec 15 04:49:31 localhost podman[243449]: @ - - [15/Dec/2025:09:49:31 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 150413 "" "Go-http-client/1.1" Dec 15 04:49:31 localhost podman[243449]: @ - - [15/Dec/2025:09:49:31 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 17718 "" "Go-http-client/1.1" Dec 15 04:49:34 localhost nova_compute[286344]: 2025-12-15 09:49:34.451 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup 
/usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 04:49:34 localhost systemd[1]: Started /usr/bin/podman healthcheck run a4e94bd7aaee6c174c601060fb5f34559019ccea93fc397263ee3735731cfc1e. Dec 15 04:49:34 localhost podman[291916]: 2025-12-15 09:49:34.74748171 +0000 UTC m=+0.081580861 container health_status a4e94bd7aaee6c174c601060fb5f34559019ccea93fc397263ee3735731cfc1e (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, managed_by=edpm_ansible, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi ) Dec 15 04:49:34 localhost podman[291916]: 2025-12-15 09:49:34.781201891 +0000 UTC m=+0.115301052 container exec_died a4e94bd7aaee6c174c601060fb5f34559019ccea93fc397263ee3735731cfc1e (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 
'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter) Dec 15 04:49:34 localhost systemd[1]: a4e94bd7aaee6c174c601060fb5f34559019ccea93fc397263ee3735731cfc1e.service: Deactivated successfully. Dec 15 04:49:34 localhost openstack_network_exporter[246484]: ERROR 09:49:34 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd Dec 15 04:49:34 localhost openstack_network_exporter[246484]: ERROR 09:49:34 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd Dec 15 04:49:34 localhost openstack_network_exporter[246484]: ERROR 09:49:34 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server Dec 15 04:49:34 localhost openstack_network_exporter[246484]: ERROR 09:49:34 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath Dec 15 04:49:34 localhost openstack_network_exporter[246484]: Dec 15 04:49:34 localhost openstack_network_exporter[246484]: ERROR 09:49:34 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath Dec 15 04:49:34 localhost openstack_network_exporter[246484]: Dec 15 04:49:35 localhost nova_compute[286344]: 2025-12-15 09:49:35.045 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 04:49:39 localhost nova_compute[286344]: 2025-12-15 09:49:39.478 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 04:49:40 localhost 
nova_compute[286344]: 2025-12-15 09:49:40.047 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 04:49:44 localhost nova_compute[286344]: 2025-12-15 09:49:44.507 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 04:49:45 localhost nova_compute[286344]: 2025-12-15 09:49:45.049 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 04:49:45 localhost systemd[1]: Started /usr/bin/podman healthcheck run 67ee72fcd99c896f79e8807764b1d0f04e7d45927d06e306c0986394e8ed42a0. Dec 15 04:49:45 localhost systemd[1]: Started /usr/bin/podman healthcheck run 9f0f481c937606298203797a71924b7614a5d9e3f4a8afd837cfa5b9602aedeb. Dec 15 04:49:45 localhost systemd[1]: Started /usr/bin/podman healthcheck run b3d8483ade539500e228c2af9c0c3e647febb5ec7903610a83aea32631007d4a. 
Dec 15 04:49:45 localhost podman[291939]: 2025-12-15 09:49:45.76404247 +0000 UTC m=+0.090856872 container health_status 67ee72fcd99c896f79e8807764b1d0f04e7d45927d06e306c0986394e8ed42a0 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible) Dec 15 04:49:45 localhost podman[291939]: 2025-12-15 09:49:45.79703897 +0000 UTC m=+0.123853352 container exec_died 67ee72fcd99c896f79e8807764b1d0f04e7d45927d06e306c0986394e8ed42a0 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible, config_data={'command': 
['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}) Dec 15 04:49:45 localhost systemd[1]: 67ee72fcd99c896f79e8807764b1d0f04e7d45927d06e306c0986394e8ed42a0.service: Deactivated successfully. 
Dec 15 04:49:45 localhost podman[291940]: 2025-12-15 09:49:45.816848719 +0000 UTC m=+0.140745329 container health_status 9f0f481c937606298203797a71924b7614a5d9e3f4a8afd837cfa5b9602aedeb (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, config_id=multipathd, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251202, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=multipathd, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2) Dec 15 04:49:45 localhost podman[291940]: 2025-12-15 09:49:45.82826256 +0000 UTC m=+0.152159200 container exec_died 
9f0f481c937606298203797a71924b7614a5d9e3f4a8afd837cfa5b9602aedeb (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251202, tcib_managed=true, config_id=multipathd, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=multipathd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.vendor=CentOS) Dec 15 04:49:45 localhost systemd[1]: 9f0f481c937606298203797a71924b7614a5d9e3f4a8afd837cfa5b9602aedeb.service: Deactivated successfully. 
Dec 15 04:49:45 localhost podman[291941]: 2025-12-15 09:49:45.926925932 +0000 UTC m=+0.242509158 container health_status b3d8483ade539500e228c2af9c0c3e647febb5ec7903610a83aea32631007d4a (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_id=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '07f3af15d79276418f54a8dde2fc251064527c3571430e14af77aaa2c01f1193-cb1e9478634a00c8fb5ad9cd7fcd38c7b2a1a85187dfaa618bc715c24c2b15fb'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, 
org.label-schema.license=GPLv2) Dec 15 04:49:45 localhost podman[291941]: 2025-12-15 09:49:45.942384728 +0000 UTC m=+0.257967944 container exec_died b3d8483ade539500e228c2af9c0c3e647febb5ec7903610a83aea32631007d4a (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '07f3af15d79276418f54a8dde2fc251064527c3571430e14af77aaa2c01f1193-cb1e9478634a00c8fb5ad9cd7fcd38c7b2a1a85187dfaa618bc715c24c2b15fb'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=ceilometer_agent_compute, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.license=GPLv2, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, 
org.label-schema.schema-version=1.0) Dec 15 04:49:45 localhost systemd[1]: b3d8483ade539500e228c2af9c0c3e647febb5ec7903610a83aea32631007d4a.service: Deactivated successfully. Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.120 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359', 'name': 'test', 'flavor': {'id': '2da0e147-aaa7-4bb9-a176-5fe1b15a32a0', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'image': {'id': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-00000002', 'OS-EXT-SRV-ATTR:host': 'np0005559462.localdomain', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': 'c785bf23f53946bc99867d8832a50266', 'user_id': '1ba5fce347b64bfebf995f187193f205', 'hostId': 'bf4d507aa00b3fcf643a7e0b883420632ac7fc96e8c7a756982e121d', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228 Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.120 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.delta in the context of pollsters Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.126 12 DEBUG ceilometer.compute.pollsters [-] 39ff1bd9-6f6b-44c8-bbec-a1fd9d196359/network.incoming.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.129 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': '482bd4b7-9bfa-410f-9d29-1edd95210a77', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '1ba5fce347b64bfebf995f187193f205', 'user_name': None, 'project_id': 'c785bf23f53946bc99867d8832a50266', 'project_name': None, 'resource_id': 'instance-00000002-39ff1bd9-6f6b-44c8-bbec-a1fd9d196359-tap03ef8889-32', 'timestamp': '2025-12-15T09:49:48.121120', 'resource_metadata': {'display_name': 'test', 'name': 'tap03ef8889-32', 'instance_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359', 'instance_type': 'm1.small', 'host': 'bf4d507aa00b3fcf643a7e0b883420632ac7fc96e8c7a756982e121d', 'instance_host': 'np0005559462.localdomain', 'flavor': {'id': '2da0e147-aaa7-4bb9-a176-5fe1b15a32a0', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e'}, 'image_ref': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:74:df:7c', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap03ef8889-32'}, 'message_id': '64b5b078-d99b-11f0-817e-fa163ebaca0f', 'monotonic_time': 11254.313786527, 'message_signature': 'df6a7758d523dbc393a899a0ddbb24eb4f0bf03b3f3749c026eb3a2f1ac2318e'}]}, 'timestamp': '2025-12-15 09:49:48.127538', '_unique_id': '54578dba69374bf9baea2c805da3ba1a'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.129 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 
2025-12-15 09:49:48.129 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.129 12 ERROR oslo_messaging.notify.messaging yield Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.129 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.129 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.129 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.129 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.129 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.129 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.129 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.129 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.129 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.129 12 ERROR oslo_messaging.notify.messaging conn.connect() Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.129 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.129 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.129 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.129 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.129 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.129 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.129 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.129 12 ERROR oslo_messaging.notify.messaging Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.129 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.129 12 ERROR oslo_messaging.notify.messaging Dec 15 04:49:48 localhost 
ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.129 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.129 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.129 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.129 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.129 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.129 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.129 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.129 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.129 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.129 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection 
Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.129 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.129 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.129 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.129 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.129 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.129 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.129 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.129 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.129 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.129 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Dec 15 04:49:48 
localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.129 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.129 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.129 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.129 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.129 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.129 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.129 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.129 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.129 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.129 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.129 12 ERROR oslo_messaging.notify.messaging Dec 15 04:49:48 localhost 
ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.130 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.drop in the context of pollsters Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.130 12 DEBUG ceilometer.compute.pollsters [-] 39ff1bd9-6f6b-44c8-bbec-a1fd9d196359/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.131 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '30006289-d139-451d-8a77-3989ab2d8084', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '1ba5fce347b64bfebf995f187193f205', 'user_name': None, 'project_id': 'c785bf23f53946bc99867d8832a50266', 'project_name': None, 'resource_id': 'instance-00000002-39ff1bd9-6f6b-44c8-bbec-a1fd9d196359-tap03ef8889-32', 'timestamp': '2025-12-15T09:49:48.130475', 'resource_metadata': {'display_name': 'test', 'name': 'tap03ef8889-32', 'instance_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359', 'instance_type': 'm1.small', 'host': 'bf4d507aa00b3fcf643a7e0b883420632ac7fc96e8c7a756982e121d', 'instance_host': 'np0005559462.localdomain', 'flavor': {'id': '2da0e147-aaa7-4bb9-a176-5fe1b15a32a0', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e'}, 'image_ref': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:74:df:7c', 
'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap03ef8889-32'}, 'message_id': '64b637dc-d99b-11f0-817e-fa163ebaca0f', 'monotonic_time': 11254.313786527, 'message_signature': '33115df0562110e04e888cee0ffdbf24f1cba0762acad8aa00fc14e1fa38a5f8'}]}, 'timestamp': '2025-12-15 09:49:48.130950', '_unique_id': 'ed44db35591344c7869a46daf3b165a8'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.131 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.131 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.131 12 ERROR oslo_messaging.notify.messaging yield Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.131 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.131 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.131 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.131 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.131 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 
09:49:48.131 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.131 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.131 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.131 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.131 12 ERROR oslo_messaging.notify.messaging conn.connect() Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.131 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.131 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.131 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.131 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.131 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.131 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Dec 15 04:49:48 localhost 
ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.131 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.131 12 ERROR oslo_messaging.notify.messaging Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.131 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.131 12 ERROR oslo_messaging.notify.messaging Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.131 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.131 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.131 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.131 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.131 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.131 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.131 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Dec 15 04:49:48 
localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.131 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.131 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.131 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.131 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.131 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.131 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.131 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.131 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.131 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.131 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) 
Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.131 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.131 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.131 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.131 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.131 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.131 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.131 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.131 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.131 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.131 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.131 12 ERROR 
oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.131 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.131 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.131 12 ERROR oslo_messaging.notify.messaging Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.132 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.usage in the context of pollsters Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.145 12 DEBUG ceilometer.compute.pollsters [-] 39ff1bd9-6f6b-44c8-bbec-a1fd9d196359/disk.device.usage volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.145 12 DEBUG ceilometer.compute.pollsters [-] 39ff1bd9-6f6b-44c8-bbec-a1fd9d196359/disk.device.usage volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.147 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': '181b8e53-30fc-4dc9-9f3a-9e0ce7a89629', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '1ba5fce347b64bfebf995f187193f205', 'user_name': None, 'project_id': 'c785bf23f53946bc99867d8832a50266', 'project_name': None, 'resource_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359-vda', 'timestamp': '2025-12-15T09:49:48.133143', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359', 'instance_type': 'm1.small', 'host': 'bf4d507aa00b3fcf643a7e0b883420632ac7fc96e8c7a756982e121d', 'instance_host': 'np0005559462.localdomain', 'flavor': {'id': '2da0e147-aaa7-4bb9-a176-5fe1b15a32a0', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e'}, 'image_ref': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '64b8823a-d99b-11f0-817e-fa163ebaca0f', 'monotonic_time': 11254.325811526, 'message_signature': 'abfbdcfbaf53933cc0d8d46649f2493c28c556ae266995d26973255eff9dcd28'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '1ba5fce347b64bfebf995f187193f205', 'user_name': None, 'project_id': 'c785bf23f53946bc99867d8832a50266', 'project_name': None, 'resource_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359-vdb', 'timestamp': '2025-12-15T09:49:48.133143', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359', 
'instance_type': 'm1.small', 'host': 'bf4d507aa00b3fcf643a7e0b883420632ac7fc96e8c7a756982e121d', 'instance_host': 'np0005559462.localdomain', 'flavor': {'id': '2da0e147-aaa7-4bb9-a176-5fe1b15a32a0', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e'}, 'image_ref': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '64b894d2-d99b-11f0-817e-fa163ebaca0f', 'monotonic_time': 11254.325811526, 'message_signature': 'a230d89958c92523e3d024b1641676d2382375e1803ac2f1e143527b482b5ade'}]}, 'timestamp': '2025-12-15 09:49:48.146428', '_unique_id': 'e907ec3fe6494518a4cec4c908e58a3a'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.147 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.147 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.147 12 ERROR oslo_messaging.notify.messaging yield Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.147 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.147 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.147 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.147 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.147 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.147 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.147 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.147 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.147 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.147 12 ERROR oslo_messaging.notify.messaging conn.connect() Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.147 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.147 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.147 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Dec 15 04:49:48 localhost 
ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.147 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.147 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.147 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.147 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.147 12 ERROR oslo_messaging.notify.messaging Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.147 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.147 12 ERROR oslo_messaging.notify.messaging Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.147 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.147 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.147 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.147 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 
09:49:48.147 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.147 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.147 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.147 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.147 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.147 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.147 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.147 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.147 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.147 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Dec 15 04:49:48 localhost 
ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.147 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.147 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.147 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.147 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.147 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.147 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.147 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.147 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.147 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.147 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 
09:49:48.147 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.147 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.147 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.147 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.147 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.147 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.147 12 ERROR oslo_messaging.notify.messaging Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.148 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes in the context of pollsters Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.148 12 DEBUG ceilometer.compute.pollsters [-] 39ff1bd9-6f6b-44c8-bbec-a1fd9d196359/network.incoming.bytes volume: 6809 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.150 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': '148439db-b595-4c4e-b108-5b2d1c4a5fcf', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 6809, 'user_id': '1ba5fce347b64bfebf995f187193f205', 'user_name': None, 'project_id': 'c785bf23f53946bc99867d8832a50266', 'project_name': None, 'resource_id': 'instance-00000002-39ff1bd9-6f6b-44c8-bbec-a1fd9d196359-tap03ef8889-32', 'timestamp': '2025-12-15T09:49:48.148916', 'resource_metadata': {'display_name': 'test', 'name': 'tap03ef8889-32', 'instance_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359', 'instance_type': 'm1.small', 'host': 'bf4d507aa00b3fcf643a7e0b883420632ac7fc96e8c7a756982e121d', 'instance_host': 'np0005559462.localdomain', 'flavor': {'id': '2da0e147-aaa7-4bb9-a176-5fe1b15a32a0', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e'}, 'image_ref': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:74:df:7c', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap03ef8889-32'}, 'message_id': '64b9099e-d99b-11f0-817e-fa163ebaca0f', 'monotonic_time': 11254.313786527, 'message_signature': '7fc3fcb10375ebb5490812f3a868cb6cc3f0e32ca4a972f8f410e53a209d0e56'}]}, 'timestamp': '2025-12-15 09:49:48.149423', '_unique_id': 'e3eaf7481e944817b262205a5449a674'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.150 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 15 04:49:48 localhost 
ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.150 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.150 12 ERROR oslo_messaging.notify.messaging yield Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.150 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.150 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.150 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.150 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.150 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.150 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.150 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.150 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.150 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.150 12 ERROR oslo_messaging.notify.messaging conn.connect() Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.150 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.150 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.150 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.150 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.150 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.150 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.150 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.150 12 ERROR oslo_messaging.notify.messaging Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.150 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.150 12 ERROR oslo_messaging.notify.messaging Dec 15 04:49:48 localhost 
ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.150 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.150 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.150 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.150 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.150 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.150 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.150 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.150 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.150 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.150 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection 
Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.150 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.150 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.150 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.150 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.150 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.150 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.150 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.150 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.150 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.150 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Dec 15 04:49:48 
localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.150 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.150 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.150 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.150 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.150 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.150 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.150 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.150 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.150 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.150 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.150 12 ERROR oslo_messaging.notify.messaging Dec 15 04:49:48 localhost 
ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.151 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no new resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193 Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.151 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no new resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193 Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.151 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.allocation in the context of pollsters Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.151 12 DEBUG ceilometer.compute.pollsters [-] 39ff1bd9-6f6b-44c8-bbec-a1fd9d196359/disk.device.allocation volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.152 12 DEBUG ceilometer.compute.pollsters [-] 39ff1bd9-6f6b-44c8-bbec-a1fd9d196359/disk.device.allocation volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.153 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': '67d0c827-c790-4845-8692-886c860115da', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '1ba5fce347b64bfebf995f187193f205', 'user_name': None, 'project_id': 'c785bf23f53946bc99867d8832a50266', 'project_name': None, 'resource_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359-vda', 'timestamp': '2025-12-15T09:49:48.151847', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359', 'instance_type': 'm1.small', 'host': 'bf4d507aa00b3fcf643a7e0b883420632ac7fc96e8c7a756982e121d', 'instance_host': 'np0005559462.localdomain', 'flavor': {'id': '2da0e147-aaa7-4bb9-a176-5fe1b15a32a0', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e'}, 'image_ref': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '64b97b9a-d99b-11f0-817e-fa163ebaca0f', 'monotonic_time': 11254.325811526, 'message_signature': '845ab6598c93dc0723a125f9ca7eeebba223dbde08ed6876d74c4bba605bf7fe'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '1ba5fce347b64bfebf995f187193f205', 'user_name': None, 'project_id': 'c785bf23f53946bc99867d8832a50266', 'project_name': None, 'resource_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359-vdb', 'timestamp': '2025-12-15T09:49:48.151847', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359', 
'instance_type': 'm1.small', 'host': 'bf4d507aa00b3fcf643a7e0b883420632ac7fc96e8c7a756982e121d', 'instance_host': 'np0005559462.localdomain', 'flavor': {'id': '2da0e147-aaa7-4bb9-a176-5fe1b15a32a0', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e'}, 'image_ref': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '64b98bda-d99b-11f0-817e-fa163ebaca0f', 'monotonic_time': 11254.325811526, 'message_signature': 'c9982505a8b81ff8b289287dce49664bc2c508309c9723721bef6bbe8043c7ea'}]}, 'timestamp': '2025-12-15 09:49:48.152721', '_unique_id': 'bd9cb01c67de47e784e3869f29547bf4'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.153 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.153 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.153 12 ERROR oslo_messaging.notify.messaging yield Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.153 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.153 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.153 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.153 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.153 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.153 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.153 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.153 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.153 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.153 12 ERROR oslo_messaging.notify.messaging conn.connect() Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.153 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.153 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.153 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Dec 15 04:49:48 localhost 
ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.153 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.153 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.153 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.153 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.153 12 ERROR oslo_messaging.notify.messaging Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.153 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.153 12 ERROR oslo_messaging.notify.messaging Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.153 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.153 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.153 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.153 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 
09:49:48.153 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.153 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.153 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.153 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.153 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.153 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.153 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.153 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.153 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.153 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Dec 15 04:49:48 localhost 
ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.153 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.153 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.153 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.153 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.153 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.153 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.153 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.153 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.153 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.153 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 
09:49:48.153 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.153 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.153 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.153 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.153 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.153 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.153 12 ERROR oslo_messaging.notify.messaging Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.154 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.error in the context of pollsters Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.154 12 DEBUG ceilometer.compute.pollsters [-] 39ff1bd9-6f6b-44c8-bbec-a1fd9d196359/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.156 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': '212fe3b9-28f0-42ba-8a11-8d29de18af33', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '1ba5fce347b64bfebf995f187193f205', 'user_name': None, 'project_id': 'c785bf23f53946bc99867d8832a50266', 'project_name': None, 'resource_id': 'instance-00000002-39ff1bd9-6f6b-44c8-bbec-a1fd9d196359-tap03ef8889-32', 'timestamp': '2025-12-15T09:49:48.154917', 'resource_metadata': {'display_name': 'test', 'name': 'tap03ef8889-32', 'instance_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359', 'instance_type': 'm1.small', 'host': 'bf4d507aa00b3fcf643a7e0b883420632ac7fc96e8c7a756982e121d', 'instance_host': 'np0005559462.localdomain', 'flavor': {'id': '2da0e147-aaa7-4bb9-a176-5fe1b15a32a0', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e'}, 'image_ref': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:74:df:7c', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap03ef8889-32'}, 'message_id': '64b9f3c2-d99b-11f0-817e-fa163ebaca0f', 'monotonic_time': 11254.313786527, 'message_signature': '49049b91a8d798a32e2dae12eca8ef82c0a7af8a78b01a66b9bfdef3ec97fbe0'}]}, 'timestamp': '2025-12-15 09:49:48.155412', '_unique_id': '20a167a5b3324c0b9add0dcc9b9f80dd'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.156 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 15 04:49:48 localhost 
ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.156 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.156 12 ERROR oslo_messaging.notify.messaging yield Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.156 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.156 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.156 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.156 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.156 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.156 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.156 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.156 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.156 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.156 12 ERROR oslo_messaging.notify.messaging conn.connect() Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.156 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.156 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.156 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.156 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.156 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.156 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.156 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.156 12 ERROR oslo_messaging.notify.messaging Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.156 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.156 12 ERROR oslo_messaging.notify.messaging Dec 15 04:49:48 localhost 
ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.156 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.156 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.156 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.156 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.156 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.156 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.156 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.156 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.156 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.156 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection 
Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.156 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.156 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.156 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.156 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.156 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.156 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.156 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.156 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.156 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.156 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Dec 15 04:49:48 
localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.156 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.156 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.156 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.156 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.156 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.156 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.156 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.156 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.156 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.156 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.156 12 ERROR oslo_messaging.notify.messaging Dec 15 04:49:48 localhost 
ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.157 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets in the context of pollsters Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.157 12 DEBUG ceilometer.compute.pollsters [-] 39ff1bd9-6f6b-44c8-bbec-a1fd9d196359/network.outgoing.packets volume: 114 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.158 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '4c651cf3-33ea-435d-a2f1-387a3cb36d10', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 114, 'user_id': '1ba5fce347b64bfebf995f187193f205', 'user_name': None, 'project_id': 'c785bf23f53946bc99867d8832a50266', 'project_name': None, 'resource_id': 'instance-00000002-39ff1bd9-6f6b-44c8-bbec-a1fd9d196359-tap03ef8889-32', 'timestamp': '2025-12-15T09:49:48.157488', 'resource_metadata': {'display_name': 'test', 'name': 'tap03ef8889-32', 'instance_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359', 'instance_type': 'm1.small', 'host': 'bf4d507aa00b3fcf643a7e0b883420632ac7fc96e8c7a756982e121d', 'instance_host': 'np0005559462.localdomain', 'flavor': {'id': '2da0e147-aaa7-4bb9-a176-5fe1b15a32a0', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e'}, 'image_ref': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:74:df:7c', 'fref': None, 
'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap03ef8889-32'}, 'message_id': '64ba5682-d99b-11f0-817e-fa163ebaca0f', 'monotonic_time': 11254.313786527, 'message_signature': 'caaa7cc2b7863f198757ca491656b0d9f17199a04ca9eeb6aab6ff93c0da6afd'}]}, 'timestamp': '2025-12-15 09:49:48.157937', '_unique_id': 'e3fbff74808348cf911b070d43a33cf8'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.158 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.158 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.158 12 ERROR oslo_messaging.notify.messaging yield Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.158 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.158 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.158 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.158 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.158 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.158 12 ERROR 
oslo_messaging.notify.messaging self._connection = self._establish_connection() Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.158 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.158 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.158 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.158 12 ERROR oslo_messaging.notify.messaging conn.connect() Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.158 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.158 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.158 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.158 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.158 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.158 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 
2025-12-15 09:49:48.158 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.158 12 ERROR oslo_messaging.notify.messaging Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.158 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.158 12 ERROR oslo_messaging.notify.messaging Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.158 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.158 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.158 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.158 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.158 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.158 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.158 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Dec 15 04:49:48 localhost 
ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.158 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.158 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.158 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.158 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.158 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.158 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.158 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.158 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.158 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.158 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Dec 15 
04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.158 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.158 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.158 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.158 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.158 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.158 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.158 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.158 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.158 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.158 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.158 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.158 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.158 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.158 12 ERROR oslo_messaging.notify.messaging Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.159 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.requests in the context of pollsters Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.189 12 DEBUG ceilometer.compute.pollsters [-] 39ff1bd9-6f6b-44c8-bbec-a1fd9d196359/disk.device.read.requests volume: 1272 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.189 12 DEBUG ceilometer.compute.pollsters [-] 39ff1bd9-6f6b-44c8-bbec-a1fd9d196359/disk.device.read.requests volume: 124 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.191 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': '8500e78a-633a-428d-b989-8f231b47188c', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1272, 'user_id': '1ba5fce347b64bfebf995f187193f205', 'user_name': None, 'project_id': 'c785bf23f53946bc99867d8832a50266', 'project_name': None, 'resource_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359-vda', 'timestamp': '2025-12-15T09:49:48.160060', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359', 'instance_type': 'm1.small', 'host': 'bf4d507aa00b3fcf643a7e0b883420632ac7fc96e8c7a756982e121d', 'instance_host': 'np0005559462.localdomain', 'flavor': {'id': '2da0e147-aaa7-4bb9-a176-5fe1b15a32a0', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e'}, 'image_ref': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '64bf2c84-d99b-11f0-817e-fa163ebaca0f', 'monotonic_time': 11254.352733855, 'message_signature': '31f38669d7ddf208895178eacebb0b658fad0531201702e41360a3e9709c0971'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 124, 'user_id': '1ba5fce347b64bfebf995f187193f205', 'user_name': None, 'project_id': 'c785bf23f53946bc99867d8832a50266', 'project_name': None, 'resource_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359-vdb', 'timestamp': '2025-12-15T09:49:48.160060', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 
'39ff1bd9-6f6b-44c8-bbec-a1fd9d196359', 'instance_type': 'm1.small', 'host': 'bf4d507aa00b3fcf643a7e0b883420632ac7fc96e8c7a756982e121d', 'instance_host': 'np0005559462.localdomain', 'flavor': {'id': '2da0e147-aaa7-4bb9-a176-5fe1b15a32a0', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e'}, 'image_ref': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '64bf3dc8-d99b-11f0-817e-fa163ebaca0f', 'monotonic_time': 11254.352733855, 'message_signature': 'd970e8d3730ec4d38c63c9f6555fd9c5aba67c78867e1f4358f8df1aa7a8308d'}]}, 'timestamp': '2025-12-15 09:49:48.190085', '_unique_id': 'cde4b09c69c94384b474249fe5b4a8b8'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.191 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.191 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.191 12 ERROR oslo_messaging.notify.messaging yield Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.191 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.191 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.191 12 ERROR 
oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.191 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.191 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.191 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.191 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.191 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.191 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.191 12 ERROR oslo_messaging.notify.messaging conn.connect() Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.191 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.191 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.191 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Dec 15 
04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.191 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.191 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.191 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.191 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.191 12 ERROR oslo_messaging.notify.messaging Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.191 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.191 12 ERROR oslo_messaging.notify.messaging Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.191 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.191 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.191 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.191 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Dec 15 04:49:48 localhost 
ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.191 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.191 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.191 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.191 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.191 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.191 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.191 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.191 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.191 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.191 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, 
in get Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.191 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.191 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.191 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.191 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.191 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.191 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.191 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.191 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.191 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.191 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 15 04:49:48 localhost 
ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.191 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.191 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.191 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.191 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.191 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.191 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.191 12 ERROR oslo_messaging.notify.messaging Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.192 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets in the context of pollsters Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.192 12 DEBUG ceilometer.compute.pollsters [-] 39ff1bd9-6f6b-44c8-bbec-a1fd9d196359/network.incoming.packets volume: 60 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.193 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': '4a575806-9aa1-4a1c-a3bc-7c9992bf1987', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 60, 'user_id': '1ba5fce347b64bfebf995f187193f205', 'user_name': None, 'project_id': 'c785bf23f53946bc99867d8832a50266', 'project_name': None, 'resource_id': 'instance-00000002-39ff1bd9-6f6b-44c8-bbec-a1fd9d196359-tap03ef8889-32', 'timestamp': '2025-12-15T09:49:48.192533', 'resource_metadata': {'display_name': 'test', 'name': 'tap03ef8889-32', 'instance_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359', 'instance_type': 'm1.small', 'host': 'bf4d507aa00b3fcf643a7e0b883420632ac7fc96e8c7a756982e121d', 'instance_host': 'np0005559462.localdomain', 'flavor': {'id': '2da0e147-aaa7-4bb9-a176-5fe1b15a32a0', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e'}, 'image_ref': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:74:df:7c', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap03ef8889-32'}, 'message_id': '64bfaff6-d99b-11f0-817e-fa163ebaca0f', 'monotonic_time': 11254.313786527, 'message_signature': '99b530dc80f335e20c3ec31669790fc7d9d26740d2d4109a6650bb4f0d76621b'}]}, 'timestamp': '2025-12-15 09:49:48.193035', '_unique_id': '6ad1b0c4f1e845dfb3bcfcb49093bfdb'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.193 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 15 04:49:48 localhost 
ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.193 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.193 12 ERROR oslo_messaging.notify.messaging yield Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.193 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.193 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.193 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.193 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.193 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.193 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.193 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.193 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.193 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.193 12 ERROR oslo_messaging.notify.messaging conn.connect() Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.193 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.193 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.193 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.193 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.193 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.193 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.193 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.193 12 ERROR oslo_messaging.notify.messaging Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.193 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.193 12 ERROR oslo_messaging.notify.messaging Dec 15 04:49:48 localhost 
ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.193 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.193 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.193 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.193 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.193 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.193 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.193 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.193 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.193 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.193 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.193 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.193 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.193 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.193 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.193 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.193 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.193 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.193 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.193 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.193 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.193 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.193 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.193 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.193 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.193 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.193 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.193 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.193 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.193 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.193 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.193 12 ERROR oslo_messaging.notify.messaging
Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.194 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.capacity in the context of pollsters
Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.195 12 DEBUG ceilometer.compute.pollsters [-] 39ff1bd9-6f6b-44c8-bbec-a1fd9d196359/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.195 12 DEBUG ceilometer.compute.pollsters [-] 39ff1bd9-6f6b-44c8-bbec-a1fd9d196359/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.196 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '89571c97-e025-4e08-b619-a6c1a3ffcfcf', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '1ba5fce347b64bfebf995f187193f205', 'user_name': None, 'project_id': 'c785bf23f53946bc99867d8832a50266', 'project_name': None, 'resource_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359-vda', 'timestamp': '2025-12-15T09:49:48.195125', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359', 'instance_type': 'm1.small', 'host': 'bf4d507aa00b3fcf643a7e0b883420632ac7fc96e8c7a756982e121d', 'instance_host': 'np0005559462.localdomain', 'flavor': {'id': '2da0e147-aaa7-4bb9-a176-5fe1b15a32a0', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e'}, 'image_ref': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '64c01482-d99b-11f0-817e-fa163ebaca0f', 'monotonic_time': 11254.325811526, 'message_signature': '4e3d9b785db368ce9b0c39325ea198808f93973da6b2181da481c26c22de651e'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '1ba5fce347b64bfebf995f187193f205', 'user_name': None, 'project_id': 'c785bf23f53946bc99867d8832a50266', 'project_name': None, 'resource_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359-vdb', 'timestamp': '2025-12-15T09:49:48.195125', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359', 'instance_type': 'm1.small', 'host': 'bf4d507aa00b3fcf643a7e0b883420632ac7fc96e8c7a756982e121d', 'instance_host': 'np0005559462.localdomain', 'flavor': {'id': '2da0e147-aaa7-4bb9-a176-5fe1b15a32a0', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e'}, 'image_ref': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '64c02436-d99b-11f0-817e-fa163ebaca0f', 'monotonic_time': 11254.325811526, 'message_signature': '79c04741a5375bd9f28360d59957ad59c2bcd3dfee59e9717ebbe92bb1703239'}]}, 'timestamp': '2025-12-15 09:49:48.195943', '_unique_id': 'c40c7e3639ab41dd97880858c9797b5e'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.196 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.196 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.196 12 ERROR oslo_messaging.notify.messaging yield
Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.196 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.196 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.196 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.196 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.196 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.196 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.196 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.196 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.196 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.196 12 ERROR oslo_messaging.notify.messaging conn.connect()
Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.196 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.196 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.196 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.196 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.196 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.196 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.196 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.196 12 ERROR oslo_messaging.notify.messaging
Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.196 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.196 12 ERROR oslo_messaging.notify.messaging
Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.196 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.196 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.196 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.196 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.196 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.196 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.196 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.196 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.196 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.196 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.196 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.196 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.196 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.196 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.196 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.196 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.196 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.196 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.196 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.196 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.196 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.196 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.196 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.196 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.196 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.196 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.196 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.196 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.196 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.196 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.196 12 ERROR oslo_messaging.notify.messaging
Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.197 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.requests in the context of pollsters
Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.198 12 DEBUG ceilometer.compute.pollsters [-] 39ff1bd9-6f6b-44c8-bbec-a1fd9d196359/disk.device.write.requests volume: 47 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.198 12 DEBUG ceilometer.compute.pollsters [-] 39ff1bd9-6f6b-44c8-bbec-a1fd9d196359/disk.device.write.requests volume: 1 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.202 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'a2ce35ce-c4b9-4277-91e2-ecb9e0bbc475', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 47, 'user_id': '1ba5fce347b64bfebf995f187193f205', 'user_name': None, 'project_id': 'c785bf23f53946bc99867d8832a50266', 'project_name': None, 'resource_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359-vda', 'timestamp': '2025-12-15T09:49:48.198068', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359', 'instance_type': 'm1.small', 'host': 'bf4d507aa00b3fcf643a7e0b883420632ac7fc96e8c7a756982e121d', 'instance_host': 'np0005559462.localdomain', 'flavor': {'id': '2da0e147-aaa7-4bb9-a176-5fe1b15a32a0', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e'}, 'image_ref': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '64c08796-d99b-11f0-817e-fa163ebaca0f', 'monotonic_time': 11254.352733855, 'message_signature': 'b1c613542876c005188422e4b5a68488d611e2effb45f26be4476fb0f22beb73'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1, 'user_id': '1ba5fce347b64bfebf995f187193f205', 'user_name': None, 'project_id': 'c785bf23f53946bc99867d8832a50266', 'project_name': None, 'resource_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359-vdb', 'timestamp': '2025-12-15T09:49:48.198068', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359', 'instance_type': 'm1.small', 'host': 'bf4d507aa00b3fcf643a7e0b883420632ac7fc96e8c7a756982e121d', 'instance_host': 'np0005559462.localdomain', 'flavor': {'id': '2da0e147-aaa7-4bb9-a176-5fe1b15a32a0', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e'}, 'image_ref': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '64c09826-d99b-11f0-817e-fa163ebaca0f', 'monotonic_time': 11254.352733855, 'message_signature': '02dd21e777b9e63a272fcbc39e9ac70892c5131eafb982d410c2c114cbbbc708'}]}, 'timestamp': '2025-12-15 09:49:48.198950', '_unique_id': '74a2b408ca394a15a37321ef1fa34471'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.202 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.202 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.202 12 ERROR oslo_messaging.notify.messaging yield
Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.202 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.202 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.202 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.202 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.202 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.202 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.202 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.202 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.202 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.202 12 ERROR oslo_messaging.notify.messaging conn.connect()
Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.202 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.202 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.202 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.202 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.202 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.202 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.202 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.202 12 ERROR oslo_messaging.notify.messaging
Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.202 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.202 12 ERROR oslo_messaging.notify.messaging
Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.202 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.202 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.202 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.202 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.202 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.202 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.202 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.202 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.202 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.202 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.202 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.202 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.202 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.202 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.202 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.202 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.202 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.202 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.202 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.202 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.202 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.202 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.202 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.202 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.202 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.202 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.202 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.202 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.202 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.202 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.202 12 ERROR oslo_messaging.notify.messaging
Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.204 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.error in the context of pollsters
Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.204 12 DEBUG ceilometer.compute.pollsters [-] 39ff1bd9-6f6b-44c8-bbec-a1fd9d196359/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.206 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'cad6429b-b664-4a47-93d6-0a6672b4778a', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '1ba5fce347b64bfebf995f187193f205', 'user_name': None, 'project_id': 'c785bf23f53946bc99867d8832a50266', 'project_name': None, 'resource_id': 'instance-00000002-39ff1bd9-6f6b-44c8-bbec-a1fd9d196359-tap03ef8889-32', 'timestamp': '2025-12-15T09:49:48.204603', 'resource_metadata': {'display_name': 'test', 'name': 'tap03ef8889-32', 'instance_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359', 'instance_type': 'm1.small', 'host': 'bf4d507aa00b3fcf643a7e0b883420632ac7fc96e8c7a756982e121d', 'instance_host': 'np0005559462.localdomain', 'flavor': {'id': '2da0e147-aaa7-4bb9-a176-5fe1b15a32a0', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e'}, 'image_ref': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:74:df:7c', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap03ef8889-32'}, 'message_id': '64c18c90-d99b-11f0-817e-fa163ebaca0f', 'monotonic_time': 11254.313786527, 'message_signature': 'b1145ba475983f9ff2939acf35f724a4d43900c045dfc036395df3497f677863'}]}, 'timestamp': '2025-12-15 09:49:48.205338', '_unique_id': '7741faa6feb7447a920ec3696725a9e1'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.206 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.206 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.206 12 ERROR oslo_messaging.notify.messaging yield
Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.206 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.206 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.206 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.206 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.206 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.206 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.206 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.206 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.206 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.206 12 ERROR oslo_messaging.notify.messaging conn.connect()
Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.206 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.206 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.206 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.206 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.206 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.206 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.206 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.206 12 ERROR oslo_messaging.notify.messaging
Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.206 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.206 12 ERROR oslo_messaging.notify.messaging
Dec 15 04:49:48 localhost
ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.206 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.206 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.206 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.206 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.206 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.206 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.206 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.206 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.206 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.206 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection 
Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.206 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.206 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.206 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.206 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.206 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.206 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.206 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.206 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.206 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.206 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Dec 15 04:49:48 
localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.206 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.206 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.206 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.206 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.206 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.206 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.206 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.206 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.206 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.206 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.206 12 ERROR oslo_messaging.notify.messaging Dec 15 04:49:48 localhost 
ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.207 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes in the context of pollsters Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.207 12 DEBUG ceilometer.compute.pollsters [-] 39ff1bd9-6f6b-44c8-bbec-a1fd9d196359/network.outgoing.bytes volume: 9770 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.209 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'f7f42402-bc02-4f71-b880-8ab7258bd91a', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 9770, 'user_id': '1ba5fce347b64bfebf995f187193f205', 'user_name': None, 'project_id': 'c785bf23f53946bc99867d8832a50266', 'project_name': None, 'resource_id': 'instance-00000002-39ff1bd9-6f6b-44c8-bbec-a1fd9d196359-tap03ef8889-32', 'timestamp': '2025-12-15T09:49:48.207900', 'resource_metadata': {'display_name': 'test', 'name': 'tap03ef8889-32', 'instance_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359', 'instance_type': 'm1.small', 'host': 'bf4d507aa00b3fcf643a7e0b883420632ac7fc96e8c7a756982e121d', 'instance_host': 'np0005559462.localdomain', 'flavor': {'id': '2da0e147-aaa7-4bb9-a176-5fe1b15a32a0', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e'}, 'image_ref': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:74:df:7c', 'fref': None, 
'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap03ef8889-32'}, 'message_id': '64c20a58-d99b-11f0-817e-fa163ebaca0f', 'monotonic_time': 11254.313786527, 'message_signature': 'd8d00f173652da3ae51e4ae8d5c6dbf568798c9786f900ccf467c5468012beb3'}]}, 'timestamp': '2025-12-15 09:49:48.208433', '_unique_id': '0dd94fa3a1264df0b51ab2089db08063'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.209 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.209 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.209 12 ERROR oslo_messaging.notify.messaging yield Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.209 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.209 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.209 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.209 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.209 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.209 12 ERROR 
oslo_messaging.notify.messaging self._connection = self._establish_connection() Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.209 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.209 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.209 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.209 12 ERROR oslo_messaging.notify.messaging conn.connect() Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.209 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.209 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.209 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.209 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.209 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.209 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 
2025-12-15 09:49:48.209 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.209 12 ERROR oslo_messaging.notify.messaging Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.209 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.209 12 ERROR oslo_messaging.notify.messaging Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.209 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.209 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.209 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.209 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.209 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.209 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.209 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Dec 15 04:49:48 localhost 
ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.209 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.209 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.209 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.209 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.209 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.209 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.209 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.209 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.209 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.209 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Dec 15 
04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.209 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.209 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.209 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.209 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.209 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.209 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.209 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.209 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.209 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.209 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.209 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.209 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.209 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.209 12 ERROR oslo_messaging.notify.messaging Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.210 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.latency in the context of pollsters Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.210 12 DEBUG ceilometer.compute.pollsters [-] 39ff1bd9-6f6b-44c8-bbec-a1fd9d196359/disk.device.write.latency volume: 1243487016 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.211 12 DEBUG ceilometer.compute.pollsters [-] 39ff1bd9-6f6b-44c8-bbec-a1fd9d196359/disk.device.write.latency volume: 24779175 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.212 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': '7b42b86b-6859-41fd-a965-b8e5ff35acc9', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 1243487016, 'user_id': '1ba5fce347b64bfebf995f187193f205', 'user_name': None, 'project_id': 'c785bf23f53946bc99867d8832a50266', 'project_name': None, 'resource_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359-vda', 'timestamp': '2025-12-15T09:49:48.210760', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359', 'instance_type': 'm1.small', 'host': 'bf4d507aa00b3fcf643a7e0b883420632ac7fc96e8c7a756982e121d', 'instance_host': 'np0005559462.localdomain', 'flavor': {'id': '2da0e147-aaa7-4bb9-a176-5fe1b15a32a0', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e'}, 'image_ref': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '64c279ac-d99b-11f0-817e-fa163ebaca0f', 'monotonic_time': 11254.352733855, 'message_signature': '25f6560517fa6c12af1f36d28f60cae171982790608a129450b74011f2615339'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 24779175, 'user_id': '1ba5fce347b64bfebf995f187193f205', 'user_name': None, 'project_id': 'c785bf23f53946bc99867d8832a50266', 'project_name': None, 'resource_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359-vdb', 'timestamp': '2025-12-15T09:49:48.210760', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 
'39ff1bd9-6f6b-44c8-bbec-a1fd9d196359', 'instance_type': 'm1.small', 'host': 'bf4d507aa00b3fcf643a7e0b883420632ac7fc96e8c7a756982e121d', 'instance_host': 'np0005559462.localdomain', 'flavor': {'id': '2da0e147-aaa7-4bb9-a176-5fe1b15a32a0', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e'}, 'image_ref': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '64c28a96-d99b-11f0-817e-fa163ebaca0f', 'monotonic_time': 11254.352733855, 'message_signature': '2f70e48c68050f7a55a21c1a6bc0c14ebc8fe0a634862fd254430fa1994b9f69'}]}, 'timestamp': '2025-12-15 09:49:48.211683', '_unique_id': '42733218da354f29a870067575807713'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.212 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.212 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.212 12 ERROR oslo_messaging.notify.messaging yield Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.212 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.212 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.212 12 ERROR 
oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.212 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.212 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.212 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.212 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.212 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.212 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.212 12 ERROR oslo_messaging.notify.messaging conn.connect() Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.212 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.212 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.212 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Dec 15 
04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.212 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.212 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.212 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.212 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.212 12 ERROR oslo_messaging.notify.messaging Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.212 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.212 12 ERROR oslo_messaging.notify.messaging Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.212 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.212 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.212 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.212 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Dec 15 04:49:48 localhost 
ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.212 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.212 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.212 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.212 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.212 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.212 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.212 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.212 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.212 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.212 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.212 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.212 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.212 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.212 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.212 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.212 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.212 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.212 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.212 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.212 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.212 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.212 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.212 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.212 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.212 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.212 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.212 12 ERROR oslo_messaging.notify.messaging
Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.213 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.drop in the context of pollsters
Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.214 12 DEBUG ceilometer.compute.pollsters [-] 39ff1bd9-6f6b-44c8-bbec-a1fd9d196359/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.215 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '96ad9496-360a-4a34-a85d-8ab61029645a', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '1ba5fce347b64bfebf995f187193f205', 'user_name': None, 'project_id': 'c785bf23f53946bc99867d8832a50266', 'project_name': None, 'resource_id': 'instance-00000002-39ff1bd9-6f6b-44c8-bbec-a1fd9d196359-tap03ef8889-32', 'timestamp': '2025-12-15T09:49:48.214101', 'resource_metadata': {'display_name': 'test', 'name': 'tap03ef8889-32', 'instance_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359', 'instance_type': 'm1.small', 'host': 'bf4d507aa00b3fcf643a7e0b883420632ac7fc96e8c7a756982e121d', 'instance_host': 'np0005559462.localdomain', 'flavor': {'id': '2da0e147-aaa7-4bb9-a176-5fe1b15a32a0', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e'}, 'image_ref': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:74:df:7c', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap03ef8889-32'}, 'message_id': '64c2fcb0-d99b-11f0-817e-fa163ebaca0f', 'monotonic_time': 11254.313786527, 'message_signature': '0cf178c5d4319b661ca3564025c9961d25f9fb58bcd7b0b405eb7fd8825ed579'}]}, 'timestamp': '2025-12-15 09:49:48.214636', '_unique_id': 'b5aaaa9833cb4281a4e70b2a8f3732e4'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.215 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.215 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.215 12 ERROR oslo_messaging.notify.messaging     yield
Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.215 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.215 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.215 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.215 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.215 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.215 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.215 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.215 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.215 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.215 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.215 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.215 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.215 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.215 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.215 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.215 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.215 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.215 12 ERROR oslo_messaging.notify.messaging
Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.215 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.215 12 ERROR oslo_messaging.notify.messaging
Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.215 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.215 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.215 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.215 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.215 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.215 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.215 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.215 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.215 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.215 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.215 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.215 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.215 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.215 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.215 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.215 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.215 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.215 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.215 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.215 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.215 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.215 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.215 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.215 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.215 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.215 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.215 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.215 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.215 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.215 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.215 12 ERROR oslo_messaging.notify.messaging
Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.216 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no new resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.217 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.latency in the context of pollsters
Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.217 12 DEBUG ceilometer.compute.pollsters [-] 39ff1bd9-6f6b-44c8-bbec-a1fd9d196359/disk.device.read.latency volume: 1342134926 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.217 12 DEBUG ceilometer.compute.pollsters [-] 39ff1bd9-6f6b-44c8-bbec-a1fd9d196359/disk.device.read.latency volume: 123356132 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.219 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '3273c938-5aff-4274-b32c-8fd5c6054d8e', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 1342134926, 'user_id': '1ba5fce347b64bfebf995f187193f205', 'user_name': None, 'project_id': 'c785bf23f53946bc99867d8832a50266', 'project_name': None, 'resource_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359-vda', 'timestamp': '2025-12-15T09:49:48.217188', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359', 'instance_type': 'm1.small', 'host': 'bf4d507aa00b3fcf643a7e0b883420632ac7fc96e8c7a756982e121d', 'instance_host': 'np0005559462.localdomain', 'flavor': {'id': '2da0e147-aaa7-4bb9-a176-5fe1b15a32a0', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e'}, 'image_ref': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '64c37622-d99b-11f0-817e-fa163ebaca0f', 'monotonic_time': 11254.352733855, 'message_signature': '34a01163bbf72dfa0ab730ce9d88e678728c9e4f72ef31580736d5bb9024f35f'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 123356132, 'user_id': '1ba5fce347b64bfebf995f187193f205', 'user_name': None, 'project_id': 'c785bf23f53946bc99867d8832a50266', 'project_name': None, 'resource_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359-vdb', 'timestamp': '2025-12-15T09:49:48.217188', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359', 'instance_type': 'm1.small', 'host': 'bf4d507aa00b3fcf643a7e0b883420632ac7fc96e8c7a756982e121d', 'instance_host': 'np0005559462.localdomain', 'flavor': {'id': '2da0e147-aaa7-4bb9-a176-5fe1b15a32a0', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e'}, 'image_ref': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '64c38752-d99b-11f0-817e-fa163ebaca0f', 'monotonic_time': 11254.352733855, 'message_signature': '4004bc0cd31e06e33c46067c7b98dc71dec2c42da65299b5b64031f540390d4f'}]}, 'timestamp': '2025-12-15 09:49:48.218187', '_unique_id': 'acc5c4f5a6d4448eb04f446e9560afb3'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.219 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.219 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.219 12 ERROR oslo_messaging.notify.messaging     yield
Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.219 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.219 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.219 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.219 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.219 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.219 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.219 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.219 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.219 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.219 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.219 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.219 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.219 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.219 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.219 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.219 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.219 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.219 12 ERROR oslo_messaging.notify.messaging
Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.219 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.219 12 ERROR oslo_messaging.notify.messaging
Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.219 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.219 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.219 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.219 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.219 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.219 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.219 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.219 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.219 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.219 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.219 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.219 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.219 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.219 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.219 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.219 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.219 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.219 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.219 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.219 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.219 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.219 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.219 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.219 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.219 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.219 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.219 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.219 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.219 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.219 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.219 12 ERROR oslo_messaging.notify.messaging
Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.220 12 INFO ceilometer.polling.manager [-] Polling pollster memory.usage in the context of pollsters
Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.237 12 DEBUG ceilometer.compute.pollsters [-] 39ff1bd9-6f6b-44c8-bbec-a1fd9d196359/memory.usage volume: 51.73828125 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.238 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'b5f50ed4-b971-44b8-af3c-9c1d3a942227', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'memory.usage', 'counter_type': 'gauge', 'counter_unit': 'MB', 'counter_volume': 51.73828125, 'user_id': '1ba5fce347b64bfebf995f187193f205', 'user_name': None, 'project_id': 'c785bf23f53946bc99867d8832a50266', 'project_name': None, 'resource_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359', 'timestamp': '2025-12-15T09:49:48.220488', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359', 'instance_type': 'm1.small', 'host': 'bf4d507aa00b3fcf643a7e0b883420632ac7fc96e8c7a756982e121d', 'instance_host': 'np0005559462.localdomain', 'flavor': {'id': '2da0e147-aaa7-4bb9-a176-5fe1b15a32a0', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e'}, 'image_ref': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0}, 'message_id': '64c6813c-d99b-11f0-817e-fa163ebaca0f', 'monotonic_time': 11254.429691654, 'message_signature': '3e06d70d23ed668d2c3571ed8dfeb7223083736bb10bc54402064f323ed94e9d'}]}, 'timestamp': '2025-12-15 09:49:48.237747', '_unique_id': 'af1bdcce227f478c92b01c0fe463ec0b'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.238 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.238 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.238 12 ERROR oslo_messaging.notify.messaging     yield
Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.238 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.238 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.238 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.238 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.238 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.238 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.238 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.238 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.238 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.238 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.238 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.238 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.238 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.238 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.238 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.238 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.238 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.238 12 ERROR oslo_messaging.notify.messaging
Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.238 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.238 12 ERROR oslo_messaging.notify.messaging
Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.238 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]:
2025-12-15 09:49:48.238 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.238 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.238 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.238 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.238 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.238 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.238 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.238 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.238 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.238 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Dec 15 
04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.238 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.238 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.238 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.238 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.238 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.238 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.238 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.238 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.238 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.238 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Dec 15 04:49:48 localhost 
ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.238 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.238 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.238 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.238 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.238 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.238 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.238 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.238 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.238 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.238 12 ERROR oslo_messaging.notify.messaging Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.240 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.delta in the context of pollsters Dec 15 04:49:48 
localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.240 12 DEBUG ceilometer.compute.pollsters [-] 39ff1bd9-6f6b-44c8-bbec-a1fd9d196359/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.241 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '35e026aa-5dc2-436b-bde0-b5451ccaedd0', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '1ba5fce347b64bfebf995f187193f205', 'user_name': None, 'project_id': 'c785bf23f53946bc99867d8832a50266', 'project_name': None, 'resource_id': 'instance-00000002-39ff1bd9-6f6b-44c8-bbec-a1fd9d196359-tap03ef8889-32', 'timestamp': '2025-12-15T09:49:48.240272', 'resource_metadata': {'display_name': 'test', 'name': 'tap03ef8889-32', 'instance_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359', 'instance_type': 'm1.small', 'host': 'bf4d507aa00b3fcf643a7e0b883420632ac7fc96e8c7a756982e121d', 'instance_host': 'np0005559462.localdomain', 'flavor': {'id': '2da0e147-aaa7-4bb9-a176-5fe1b15a32a0', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e'}, 'image_ref': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:74:df:7c', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap03ef8889-32'}, 'message_id': '64c6f932-d99b-11f0-817e-fa163ebaca0f', 'monotonic_time': 11254.313786527, 
'message_signature': 'd44e98b0b36e85a2b5849652cf15d9418e6c472ead537f5fbac6945e7fd54019'}]}, 'timestamp': '2025-12-15 09:49:48.240801', '_unique_id': '97e1cbca66414c3c8af27770a4508176'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.241 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.241 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.241 12 ERROR oslo_messaging.notify.messaging yield Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.241 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.241 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.241 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.241 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.241 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.241 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.241 12 ERROR 
oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.241 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.241 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.241 12 ERROR oslo_messaging.notify.messaging conn.connect() Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.241 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.241 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.241 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.241 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.241 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.241 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.241 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Dec 15 04:49:48 localhost 
ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.241 12 ERROR oslo_messaging.notify.messaging Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.241 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.241 12 ERROR oslo_messaging.notify.messaging Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.241 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.241 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.241 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.241 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.241 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.241 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.241 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.241 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", 
line 653, in _send Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.241 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.241 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.241 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.241 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.241 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.241 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.241 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.241 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.241 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.241 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.241 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.241 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.241 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.241 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.241 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.241 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.241 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.241 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.241 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.241 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Dec 15 04:49:48 localhost 
ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.241 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.241 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.241 12 ERROR oslo_messaging.notify.messaging Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.242 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no new resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193 Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.243 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.bytes in the context of pollsters Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.243 12 DEBUG ceilometer.compute.pollsters [-] 39ff1bd9-6f6b-44c8-bbec-a1fd9d196359/disk.device.write.bytes volume: 389120 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.243 12 DEBUG ceilometer.compute.pollsters [-] 39ff1bd9-6f6b-44c8-bbec-a1fd9d196359/disk.device.write.bytes volume: 512 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.245 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': 'b5f054dc-10e8-41c3-ba93-fc32bfb19d85', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 389120, 'user_id': '1ba5fce347b64bfebf995f187193f205', 'user_name': None, 'project_id': 'c785bf23f53946bc99867d8832a50266', 'project_name': None, 'resource_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359-vda', 'timestamp': '2025-12-15T09:49:48.243299', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359', 'instance_type': 'm1.small', 'host': 'bf4d507aa00b3fcf643a7e0b883420632ac7fc96e8c7a756982e121d', 'instance_host': 'np0005559462.localdomain', 'flavor': {'id': '2da0e147-aaa7-4bb9-a176-5fe1b15a32a0', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e'}, 'image_ref': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '64c76ef8-d99b-11f0-817e-fa163ebaca0f', 'monotonic_time': 11254.352733855, 'message_signature': '7c3b9122894996a2a0ee4a92aee7f4a7c001d0801f83e3935f40f0167b974a64'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 512, 'user_id': '1ba5fce347b64bfebf995f187193f205', 'user_name': None, 'project_id': 'c785bf23f53946bc99867d8832a50266', 'project_name': None, 'resource_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359-vdb', 'timestamp': '2025-12-15T09:49:48.243299', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 
'39ff1bd9-6f6b-44c8-bbec-a1fd9d196359', 'instance_type': 'm1.small', 'host': 'bf4d507aa00b3fcf643a7e0b883420632ac7fc96e8c7a756982e121d', 'instance_host': 'np0005559462.localdomain', 'flavor': {'id': '2da0e147-aaa7-4bb9-a176-5fe1b15a32a0', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e'}, 'image_ref': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '64c78050-d99b-11f0-817e-fa163ebaca0f', 'monotonic_time': 11254.352733855, 'message_signature': '4ce419e8bf304fe5ed88414ce93cdbe4ae0b20e3acf2a6381d0f5097a77f4b11'}]}, 'timestamp': '2025-12-15 09:49:48.244219', '_unique_id': 'd1c5d09d0d7b4eb59aad3b4ea3920959'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.245 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.245 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.245 12 ERROR oslo_messaging.notify.messaging yield Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.245 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.245 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.245 12 ERROR 
oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.245 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.245 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.245 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.245 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.245 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.245 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.245 12 ERROR oslo_messaging.notify.messaging conn.connect() Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.245 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.245 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.245 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Dec 15 
04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.245 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.245 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.245 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.245 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.245 12 ERROR oslo_messaging.notify.messaging Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.245 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.245 12 ERROR oslo_messaging.notify.messaging Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.245 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.245 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.245 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.245 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Dec 15 04:49:48 localhost 
ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.245 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.245 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.245 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.245 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.245 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.245 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.245 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.245 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.245 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.245 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.245 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.245 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.245 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.245 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.245 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.245 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.245 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.245 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.245 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.245 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.245 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.245 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.245 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.245 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.245 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.245 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.245 12 ERROR oslo_messaging.notify.messaging
Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.246 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.bytes in the context of pollsters
Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.246 12 DEBUG ceilometer.compute.pollsters [-] 39ff1bd9-6f6b-44c8-bbec-a1fd9d196359/disk.device.read.bytes volume: 35560448 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.247 12 DEBUG ceilometer.compute.pollsters [-] 39ff1bd9-6f6b-44c8-bbec-a1fd9d196359/disk.device.read.bytes volume: 2154496 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.248 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '6d13f54a-279f-49a5-8851-cb37508c16c0', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 35560448, 'user_id': '1ba5fce347b64bfebf995f187193f205', 'user_name': None, 'project_id': 'c785bf23f53946bc99867d8832a50266', 'project_name': None, 'resource_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359-vda', 'timestamp': '2025-12-15T09:49:48.246751', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359', 'instance_type': 'm1.small', 'host': 'bf4d507aa00b3fcf643a7e0b883420632ac7fc96e8c7a756982e121d', 'instance_host': 'np0005559462.localdomain', 'flavor': {'id': '2da0e147-aaa7-4bb9-a176-5fe1b15a32a0', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e'}, 'image_ref': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '64c7f9c2-d99b-11f0-817e-fa163ebaca0f', 'monotonic_time': 11254.352733855, 'message_signature': '52da736987808643fd8c06304432b2834c394dc0af755e80284de0f5092850c3'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 2154496, 'user_id': '1ba5fce347b64bfebf995f187193f205', 'user_name': None, 'project_id': 'c785bf23f53946bc99867d8832a50266', 'project_name': None, 'resource_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359-vdb', 'timestamp': '2025-12-15T09:49:48.246751', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359', 'instance_type': 'm1.small', 'host': 'bf4d507aa00b3fcf643a7e0b883420632ac7fc96e8c7a756982e121d', 'instance_host': 'np0005559462.localdomain', 'flavor': {'id': '2da0e147-aaa7-4bb9-a176-5fe1b15a32a0', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e'}, 'image_ref': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '64c80b6a-d99b-11f0-817e-fa163ebaca0f', 'monotonic_time': 11254.352733855, 'message_signature': '83bd80b6d1698e783ae5d993d09fbab86436baeefe5dba7bf9b026d0c2b662e3'}]}, 'timestamp': '2025-12-15 09:49:48.247747', '_unique_id': '086b2294cf2c46c49ee57090b0b1f176'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.248 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.248 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.248 12 ERROR oslo_messaging.notify.messaging yield
Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.248 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.248 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.248 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.248 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.248 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.248 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.248 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.248 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.248 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.248 12 ERROR oslo_messaging.notify.messaging conn.connect()
Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.248 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.248 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.248 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.248 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.248 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.248 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.248 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.248 12 ERROR oslo_messaging.notify.messaging
Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.248 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.248 12 ERROR oslo_messaging.notify.messaging
Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.248 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.248 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.248 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.248 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.248 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.248 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.248 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.248 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.248 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.248 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.248 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.248 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.248 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.248 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.248 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.248 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.248 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.248 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.248 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.248 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.248 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.248 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.248 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.248 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.248 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.248 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.248 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.248 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.248 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.248 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.248 12 ERROR oslo_messaging.notify.messaging
Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.250 12 INFO ceilometer.polling.manager [-] Polling pollster cpu in the context of pollsters
Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.250 12 DEBUG ceilometer.compute.pollsters [-] 39ff1bd9-6f6b-44c8-bbec-a1fd9d196359/cpu volume: 10420000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.251 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '93ce84a7-0b18-4304-b73f-c8b08cfa8ca9', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 10420000000, 'user_id': '1ba5fce347b64bfebf995f187193f205', 'user_name': None, 'project_id': 'c785bf23f53946bc99867d8832a50266', 'project_name': None, 'resource_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359', 'timestamp': '2025-12-15T09:49:48.250325', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359', 'instance_type': 'm1.small', 'host': 'bf4d507aa00b3fcf643a7e0b883420632ac7fc96e8c7a756982e121d', 'instance_host': 'np0005559462.localdomain', 'flavor': {'id': '2da0e147-aaa7-4bb9-a176-5fe1b15a32a0', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e'}, 'image_ref': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'cpu_number': 1}, 'message_id': '64c88202-d99b-11f0-817e-fa163ebaca0f', 'monotonic_time': 11254.429691654, 'message_signature': '79a68b3c3dd59a03cab841e893953a0495666cef69c9ddeefcfb5e7403055f00'}]}, 'timestamp': '2025-12-15 09:49:48.250800', '_unique_id': '3ad6b36f7122479199b4b21dd61975e7'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.251 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.251 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.251 12 ERROR oslo_messaging.notify.messaging yield
Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.251 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.251 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.251 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.251 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.251 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.251 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.251 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.251 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.251 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.251 12 ERROR oslo_messaging.notify.messaging conn.connect()
Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.251 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.251 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.251 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.251 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.251 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.251 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.251 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.251 12 ERROR oslo_messaging.notify.messaging
Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.251 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.251 12 ERROR oslo_messaging.notify.messaging
Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.251 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.251 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.251 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.251 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.251 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.251 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.251 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.251 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.251 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.251 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.251 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.251 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.251 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.251 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.251 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.251 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.251 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.251 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.251 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.251 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.251 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.251 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.251 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.251 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.251 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.251 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.251 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.251 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.251 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.251 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 15 04:49:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:49:48.251 12 ERROR oslo_messaging.notify.messaging
Dec 15 04:49:49 localhost nova_compute[286344]: 2025-12-15 09:49:49.551 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 15 04:49:49 localhost systemd[1]: Started /usr/bin/podman healthcheck run 730c50020a7d9e2da21d2462d40af20ac303e43f3491f692cbbd87f432ba3e09.
Dec 15 04:49:49 localhost systemd[1]: Started /usr/bin/podman healthcheck run ac2eae30a2a7f971398af42ea67b3ed799d4b3341f7688ebcd73f7ca7f66c2a8.
Dec 15 04:49:49 localhost systemd[1]: tmp-crun.r0EZU9.mount: Deactivated successfully.
Dec 15 04:49:49 localhost podman[291999]: 2025-12-15 09:49:49.752723455 +0000 UTC m=+0.076096617 container health_status ac2eae30a2a7f971398af42ea67b3ed799d4b3341f7688ebcd73f7ca7f66c2a8 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_controller, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251202, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '07f3af15d79276418f54a8dde2fc251064527c3571430e14af77aaa2c01f1193'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, io.buildah.version=1.41.3, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, managed_by=edpm_ansible, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_managed=true)
Dec 15 04:49:49 localhost podman[291999]: 2025-12-15 09:49:49.816738629 +0000 UTC m=+0.140111811 container exec_died ac2eae30a2a7f971398af42ea67b3ed799d4b3341f7688ebcd73f7ca7f66c2a8 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_id=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '07f3af15d79276418f54a8dde2fc251064527c3571430e14af77aaa2c01f1193'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 15 04:49:49 localhost systemd[1]: ac2eae30a2a7f971398af42ea67b3ed799d4b3341f7688ebcd73f7ca7f66c2a8.service: Deactivated successfully.
Dec 15 04:49:49 localhost podman[291998]: 2025-12-15 09:49:49.823623733 +0000 UTC m=+0.146012667 container health_status 730c50020a7d9e2da21d2462d40af20ac303e43f3491f692cbbd87f432ba3e09 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vendor=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, config_id=openstack_network_exporter, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.openshift.expose-services=, io.openshift.tags=minimal rhel9, architecture=x86_64, build-date=2025-08-20T13:12:41, name=ubi9-minimal, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, managed_by=edpm_ansible, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'cb1e9478634a00c8fb5ad9cd7fcd38c7b2a1a85187dfaa618bc715c24c2b15fb'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, release=1755695350, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=9.6, container_name=openstack_network_exporter, maintainer=Red Hat, Inc., distribution-scope=public, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git)
Dec 15 04:49:49 localhost podman[291998]: 2025-12-15 09:49:49.908671171 +0000 UTC m=+0.231060095 container exec_died 730c50020a7d9e2da21d2462d40af20ac303e43f3491f692cbbd87f432ba3e09 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7,
name=openstack_network_exporter, vendor=Red Hat, Inc., managed_by=edpm_ansible, url=https://catalog.redhat.com/en/search?searchType=containers, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'cb1e9478634a00c8fb5ad9cd7fcd38c7b2a1a85187dfaa618bc715c24c2b15fb'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, container_name=openstack_network_exporter, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.openshift.tags=minimal rhel9, com.redhat.component=ubi9-minimal-container, config_id=openstack_network_exporter, io.buildah.version=1.33.7, release=1755695350, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., version=9.6, distribution-scope=public, name=ubi9-minimal, io.openshift.expose-services=, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, build-date=2025-08-20T13:12:41, architecture=x86_64, maintainer=Red Hat, Inc., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9.) Dec 15 04:49:49 localhost systemd[1]: 730c50020a7d9e2da21d2462d40af20ac303e43f3491f692cbbd87f432ba3e09.service: Deactivated successfully. Dec 15 04:49:50 localhost nova_compute[286344]: 2025-12-15 09:49:50.051 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 04:49:51 localhost ovn_metadata_agent[160585]: 2025-12-15 09:49:51.466 160590 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Dec 15 04:49:51 localhost ovn_metadata_agent[160585]: 2025-12-15 09:49:51.468 160590 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Dec 15 04:49:51 localhost ovn_metadata_agent[160585]: 2025-12-15 09:49:51.469 160590 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner 
/usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Dec 15 04:49:54 localhost nova_compute[286344]: 2025-12-15 09:49:54.578 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 04:49:55 localhost nova_compute[286344]: 2025-12-15 09:49:55.053 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 04:49:57 localhost nova_compute[286344]: 2025-12-15 09:49:57.270 286348 DEBUG oslo_service.periodic_task [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Dec 15 04:49:57 localhost nova_compute[286344]: 2025-12-15 09:49:57.271 286348 DEBUG nova.compute.manager [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11145#033[00m Dec 15 04:49:57 localhost nova_compute[286344]: 2025-12-15 09:49:57.311 286348 DEBUG nova.compute.manager [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11154#033[00m Dec 15 04:49:57 localhost nova_compute[286344]: 2025-12-15 09:49:57.311 286348 DEBUG oslo_service.periodic_task [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Dec 15 04:49:57 localhost nova_compute[286344]: 2025-12-15 09:49:57.311 286348 DEBUG nova.compute.manager [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Cleaning up deleted instances with incomplete migration _cleanup_incomplete_migrations 
/usr/lib/python3.9/site-packages/nova/compute/manager.py:11183#033[00m Dec 15 04:49:57 localhost nova_compute[286344]: 2025-12-15 09:49:57.326 286348 DEBUG oslo_service.periodic_task [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Dec 15 04:49:57 localhost systemd[1]: Started /usr/bin/podman healthcheck run 4d46c406a3855fd1c322e5d86c048188ea5865daa07ba23aaebacc7879668923. Dec 15 04:49:57 localhost podman[292079]: 2025-12-15 09:49:57.714016147 +0000 UTC m=+0.079222894 container health_status 4d46c406a3855fd1c322e5d86c048188ea5865daa07ba23aaebacc7879668923 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, io.buildah.version=1.41.3, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '07f3af15d79276418f54a8dde2fc251064527c3571430e14af77aaa2c01f1193-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', 
'/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team) Dec 15 04:49:57 localhost podman[292079]: 2025-12-15 09:49:57.747403469 +0000 UTC m=+0.112610206 container exec_died 4d46c406a3855fd1c322e5d86c048188ea5865daa07ba23aaebacc7879668923 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, managed_by=edpm_ansible, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '07f3af15d79276418f54a8dde2fc251064527c3571430e14af77aaa2c01f1193-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', 
'/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, tcib_managed=true) Dec 15 04:49:57 localhost systemd[1]: 4d46c406a3855fd1c322e5d86c048188ea5865daa07ba23aaebacc7879668923.service: Deactivated successfully. Dec 15 04:49:59 localhost nova_compute[286344]: 2025-12-15 09:49:59.369 286348 DEBUG oslo_service.periodic_task [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Dec 15 04:49:59 localhost nova_compute[286344]: 2025-12-15 09:49:59.396 286348 DEBUG oslo_concurrency.lockutils [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Dec 15 04:49:59 localhost nova_compute[286344]: 2025-12-15 09:49:59.397 286348 DEBUG oslo_concurrency.lockutils [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Dec 15 04:49:59 localhost nova_compute[286344]: 2025-12-15 09:49:59.397 286348 DEBUG oslo_concurrency.lockutils [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Lock "compute_resources" 
"released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Dec 15 04:49:59 localhost nova_compute[286344]: 2025-12-15 09:49:59.398 286348 DEBUG nova.compute.resource_tracker [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Auditing locally available compute resources for np0005559462.localdomain (node: np0005559462.localdomain) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m Dec 15 04:49:59 localhost nova_compute[286344]: 2025-12-15 09:49:59.398 286348 DEBUG oslo_concurrency.processutils [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m Dec 15 04:49:59 localhost nova_compute[286344]: 2025-12-15 09:49:59.616 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 04:49:59 localhost nova_compute[286344]: 2025-12-15 09:49:59.853 286348 DEBUG oslo_concurrency.processutils [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.455s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m Dec 15 04:49:59 localhost nova_compute[286344]: 2025-12-15 09:49:59.915 286348 DEBUG nova.virt.libvirt.driver [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m Dec 15 04:49:59 localhost nova_compute[286344]: 2025-12-15 09:49:59.915 286348 DEBUG nova.virt.libvirt.driver [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] 
skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m Dec 15 04:50:00 localhost nova_compute[286344]: 2025-12-15 09:50:00.054 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 04:50:00 localhost nova_compute[286344]: 2025-12-15 09:50:00.127 286348 WARNING nova.virt.libvirt.driver [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m Dec 15 04:50:00 localhost nova_compute[286344]: 2025-12-15 09:50:00.129 286348 DEBUG nova.compute.resource_tracker [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Hypervisor/Node resource view: name=np0005559462.localdomain free_ram=12294MB free_disk=41.836978912353516GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", 
"dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m Dec 15 04:50:00 localhost nova_compute[286344]: 2025-12-15 09:50:00.129 286348 DEBUG oslo_concurrency.lockutils [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Dec 15 04:50:00 localhost nova_compute[286344]: 2025-12-15 09:50:00.130 286348 DEBUG oslo_concurrency.lockutils [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Dec 15 04:50:00 localhost nova_compute[286344]: 2025-12-15 09:50:00.240 286348 DEBUG nova.compute.resource_tracker [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Instance 39ff1bd9-6f6b-44c8-bbec-a1fd9d196359 actively managed on this 
compute host and has allocations in placement: {'resources': {'DISK_GB': 2, 'MEMORY_MB': 512, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m Dec 15 04:50:00 localhost nova_compute[286344]: 2025-12-15 09:50:00.240 286348 DEBUG nova.compute.resource_tracker [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m Dec 15 04:50:00 localhost nova_compute[286344]: 2025-12-15 09:50:00.241 286348 DEBUG nova.compute.resource_tracker [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Final resource view: name=np0005559462.localdomain phys_ram=15738MB used_ram=1024MB phys_disk=41GB used_disk=2GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m Dec 15 04:50:00 localhost nova_compute[286344]: 2025-12-15 09:50:00.347 286348 DEBUG oslo_concurrency.processutils [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m Dec 15 04:50:00 localhost podman[292219]: Dec 15 04:50:00 localhost podman[292219]: 2025-12-15 09:50:00.530225555 +0000 UTC m=+0.085326056 container create 6bd0f71249a06afb515c57faf14e61d94f3e463df9629ca02675b52623b2efa5 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=lucid_brahmagupta, release=1763362218, version=7, CEPH_POINT_RELEASE=, GIT_BRANCH=main, RELEASE=main, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., distribution-scope=public, io.k8s.description=Red Hat Ceph Storage 7, name=rhceph, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vendor=Red Hat, Inc., 
ceph=True, com.redhat.component=rhceph-container, io.openshift.tags=rhceph ceph, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, io.openshift.expose-services=, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, architecture=x86_64, GIT_CLEAN=True, maintainer=Guillaume Abrioux , com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-11-26T19:44:28Z, vcs-type=git, url=https://catalog.redhat.com/en/search?searchType=containers, io.buildah.version=1.41.4, GIT_REPO=https://github.com/ceph/ceph-container.git, description=Red Hat Ceph Storage 7) Dec 15 04:50:00 localhost systemd[1]: Started libpod-conmon-6bd0f71249a06afb515c57faf14e61d94f3e463df9629ca02675b52623b2efa5.scope. Dec 15 04:50:00 localhost systemd[1]: Started libcrun container. Dec 15 04:50:00 localhost podman[292219]: 2025-12-15 09:50:00.491581256 +0000 UTC m=+0.046681787 image pull registry.redhat.io/rhceph/rhceph-7-rhel9:latest Dec 15 04:50:00 localhost podman[292219]: 2025-12-15 09:50:00.601146195 +0000 UTC m=+0.156246716 container init 6bd0f71249a06afb515c57faf14e61d94f3e463df9629ca02675b52623b2efa5 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=lucid_brahmagupta, CEPH_POINT_RELEASE=, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, io.openshift.expose-services=, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_BRANCH=main, build-date=2025-11-26T19:44:28Z, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.openshift.tags=rhceph ceph, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.k8s.description=Red Hat Ceph Storage 7, RELEASE=main, architecture=x86_64, vcs-type=git, maintainer=Guillaume Abrioux , distribution-scope=public, name=rhceph, 
url=https://catalog.redhat.com/en/search?searchType=containers, GIT_REPO=https://github.com/ceph/ceph-container.git, io.buildah.version=1.41.4, version=7, ceph=True, description=Red Hat Ceph Storage 7, release=1763362218, GIT_CLEAN=True, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., com.redhat.component=rhceph-container, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0) Dec 15 04:50:00 localhost systemd[1]: tmp-crun.IKtlUH.mount: Deactivated successfully. Dec 15 04:50:00 localhost podman[292219]: 2025-12-15 09:50:00.625012838 +0000 UTC m=+0.180113399 container start 6bd0f71249a06afb515c57faf14e61d94f3e463df9629ca02675b52623b2efa5 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=lucid_brahmagupta, build-date=2025-11-26T19:44:28Z, com.redhat.component=rhceph-container, io.openshift.tags=rhceph ceph, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, version=7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, distribution-scope=public, name=rhceph, io.buildah.version=1.41.4, release=1763362218, vcs-type=git, maintainer=Guillaume Abrioux , summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., RELEASE=main, GIT_REPO=https://github.com/ceph/ceph-container.git, GIT_BRANCH=main, vendor=Red Hat, Inc., ceph=True, url=https://catalog.redhat.com/en/search?searchType=containers, architecture=x86_64, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, CEPH_POINT_RELEASE=, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, io.openshift.expose-services=, GIT_CLEAN=True, io.k8s.description=Red Hat Ceph Storage 7, description=Red Hat Ceph Storage 7) Dec 15 04:50:00 localhost podman[292219]: 2025-12-15 09:50:00.62544483 +0000 UTC m=+0.180545381 container attach 
6bd0f71249a06afb515c57faf14e61d94f3e463df9629ca02675b52623b2efa5 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=lucid_brahmagupta, description=Red Hat Ceph Storage 7, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, build-date=2025-11-26T19:44:28Z, vcs-type=git, GIT_CLEAN=True, release=1763362218, version=7, io.openshift.tags=rhceph ceph, maintainer=Guillaume Abrioux , io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, name=rhceph, ceph=True, distribution-scope=public, io.k8s.description=Red Hat Ceph Storage 7, vendor=Red Hat, Inc., summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_REPO=https://github.com/ceph/ceph-container.git, CEPH_POINT_RELEASE=, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.component=rhceph-container, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.buildah.version=1.41.4, io.openshift.expose-services=, RELEASE=main, architecture=x86_64, GIT_BRANCH=main) Dec 15 04:50:00 localhost lucid_brahmagupta[292251]: 167 167 Dec 15 04:50:00 localhost systemd[1]: libpod-6bd0f71249a06afb515c57faf14e61d94f3e463df9629ca02675b52623b2efa5.scope: Deactivated successfully. 
Dec 15 04:50:00 localhost podman[292219]: 2025-12-15 09:50:00.627315403 +0000 UTC m=+0.182415894 container died 6bd0f71249a06afb515c57faf14e61d94f3e463df9629ca02675b52623b2efa5 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=lucid_brahmagupta, distribution-scope=public, ceph=True, url=https://catalog.redhat.com/en/search?searchType=containers, release=1763362218, maintainer=Guillaume Abrioux , cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.openshift.expose-services=, GIT_CLEAN=True, build-date=2025-11-26T19:44:28Z, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vcs-type=git, io.openshift.tags=rhceph ceph, io.buildah.version=1.41.4, GIT_REPO=https://github.com/ceph/ceph-container.git, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, io.k8s.description=Red Hat Ceph Storage 7, name=rhceph, version=7, CEPH_POINT_RELEASE=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., architecture=x86_64, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_BRANCH=main, vendor=Red Hat, Inc., com.redhat.component=rhceph-container, RELEASE=main, description=Red Hat Ceph Storage 7) Dec 15 04:50:00 localhost podman[292257]: 2025-12-15 09:50:00.712797932 +0000 UTC m=+0.076118977 container remove 6bd0f71249a06afb515c57faf14e61d94f3e463df9629ca02675b52623b2efa5 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=lucid_brahmagupta, architecture=x86_64, version=7, GIT_REPO=https://github.com/ceph/ceph-container.git, release=1763362218, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, CEPH_POINT_RELEASE=, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, RELEASE=main, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, name=rhceph, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, 
url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.description=Red Hat Ceph Storage 7, description=Red Hat Ceph Storage 7, distribution-scope=public, vendor=Red Hat, Inc., GIT_CLEAN=True, build-date=2025-11-26T19:44:28Z, io.openshift.tags=rhceph ceph, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, ceph=True, com.redhat.component=rhceph-container, io.openshift.expose-services=, vcs-type=git, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., maintainer=Guillaume Abrioux , GIT_BRANCH=main, io.buildah.version=1.41.4, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0) Dec 15 04:50:00 localhost systemd[1]: libpod-conmon-6bd0f71249a06afb515c57faf14e61d94f3e463df9629ca02675b52623b2efa5.scope: Deactivated successfully. Dec 15 04:50:00 localhost systemd[1]: Reloading. Dec 15 04:50:00 localhost nova_compute[286344]: 2025-12-15 09:50:00.817 286348 DEBUG oslo_concurrency.processutils [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.470s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m Dec 15 04:50:00 localhost nova_compute[286344]: 2025-12-15 09:50:00.826 286348 DEBUG nova.compute.provider_tree [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Inventory has not changed in ProviderTree for provider: 26c8956b-6742-4951-b566-971b9bbe323b update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m Dec 15 04:50:00 localhost nova_compute[286344]: 2025-12-15 09:50:00.847 286348 DEBUG nova.scheduler.client.report [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Inventory has not changed for provider 26c8956b-6742-4951-b566-971b9bbe323b based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 
'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m Dec 15 04:50:00 localhost nova_compute[286344]: 2025-12-15 09:50:00.851 286348 DEBUG nova.compute.resource_tracker [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Compute_service record updated for np0005559462.localdomain:np0005559462.localdomain _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m Dec 15 04:50:00 localhost nova_compute[286344]: 2025-12-15 09:50:00.852 286348 DEBUG oslo_concurrency.lockutils [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.723s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Dec 15 04:50:00 localhost systemd-rc-local-generator[292300]: /etc/rc.d/rc.local is not marked executable, skipping. Dec 15 04:50:00 localhost systemd-sysv-generator[292304]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. 
Dec 15 04:50:00 localhost systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload Dec 15 04:50:00 localhost systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload Dec 15 04:50:00 localhost systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload Dec 15 04:50:00 localhost systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload Dec 15 04:50:00 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Dec 15 04:50:00 localhost systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload Dec 15 04:50:00 localhost systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload Dec 15 04:50:00 localhost systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload Dec 15 04:50:00 localhost systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload Dec 15 04:50:01 localhost systemd[1]: var-lib-containers-storage-overlay-e915a929931f1ddbd20a59db5552a193107fed4ac4e71d3f628ba9f725a2e782-merged.mount: Deactivated successfully. Dec 15 04:50:01 localhost systemd[1]: Reloading. Dec 15 04:50:01 localhost systemd-rc-local-generator[292345]: /etc/rc.d/rc.local is not marked executable, skipping. Dec 15 04:50:01 localhost systemd-sysv-generator[292349]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. 
Dec 15 04:50:01 localhost systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload Dec 15 04:50:01 localhost systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload Dec 15 04:50:01 localhost systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload Dec 15 04:50:01 localhost systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload Dec 15 04:50:01 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Dec 15 04:50:01 localhost systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload Dec 15 04:50:01 localhost systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload Dec 15 04:50:01 localhost systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload Dec 15 04:50:01 localhost systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload Dec 15 04:50:01 localhost systemd[1]: Starting Ceph mgr.np0005559462.fudvyx for bce17446-41b5-5408-a23e-0b011906b44a... 
Dec 15 04:50:01 localhost nova_compute[286344]: 2025-12-15 09:50:01.755 286348 DEBUG oslo_service.periodic_task [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Dec 15 04:50:01 localhost nova_compute[286344]: 2025-12-15 09:50:01.757 286348 DEBUG oslo_service.periodic_task [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Dec 15 04:50:01 localhost nova_compute[286344]: 2025-12-15 09:50:01.757 286348 DEBUG nova.compute.manager [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m Dec 15 04:50:01 localhost nova_compute[286344]: 2025-12-15 09:50:01.757 286348 DEBUG nova.compute.manager [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m Dec 15 04:50:01 localhost podman[243449]: time="2025-12-15T09:50:01Z" level=info msg="List containers: received `last` parameter - overwriting `limit`" Dec 15 04:50:01 localhost podman[292403]: Dec 15 04:50:01 localhost podman[243449]: @ - - [15/Dec/2025:09:50:01 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 150413 "" "Go-http-client/1.1" Dec 15 04:50:01 localhost podman[243449]: @ - - [15/Dec/2025:09:50:01 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 17721 "" "Go-http-client/1.1" Dec 15 04:50:01 localhost podman[292403]: 2025-12-15 09:50:01.945170737 +0000 UTC m=+0.130495610 container create 
a55c05987e9a315ed16f4c782a575a004d46a41e9204fd5e3d3220235c317a12 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-bce17446-41b5-5408-a23e-0b011906b44a-mgr-np0005559462-fudvyx, distribution-scope=public, version=7, GIT_BRANCH=main, io.openshift.tags=rhceph ceph, com.redhat.component=rhceph-container, maintainer=Guillaume Abrioux , name=rhceph, io.openshift.expose-services=, io.buildah.version=1.41.4, build-date=2025-11-26T19:44:28Z, CEPH_POINT_RELEASE=, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_CLEAN=True, release=1763362218, vendor=Red Hat, Inc., io.k8s.description=Red Hat Ceph Storage 7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., RELEASE=main, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, architecture=x86_64, GIT_REPO=https://github.com/ceph/ceph-container.git, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, description=Red Hat Ceph Storage 7, ceph=True) Dec 15 04:50:01 localhost podman[292403]: 2025-12-15 09:50:01.860787848 +0000 UTC m=+0.046112761 image pull registry.redhat.io/rhceph/rhceph-7-rhel9:latest Dec 15 04:50:01 localhost systemd[1]: tmp-crun.avu4qC.mount: Deactivated successfully. 
Dec 15 04:50:01 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/174bfae49418540a5a772cdcfb4aca08df98172f582aa29ca05ba2a45d5c3e71/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff) Dec 15 04:50:01 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/174bfae49418540a5a772cdcfb4aca08df98172f582aa29ca05ba2a45d5c3e71/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff) Dec 15 04:50:01 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/174bfae49418540a5a772cdcfb4aca08df98172f582aa29ca05ba2a45d5c3e71/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff) Dec 15 04:50:02 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/174bfae49418540a5a772cdcfb4aca08df98172f582aa29ca05ba2a45d5c3e71/merged/var/lib/ceph/mgr/ceph-np0005559462.fudvyx supports timestamps until 2038 (0x7fffffff) Dec 15 04:50:02 localhost podman[292403]: 2025-12-15 09:50:02.007110723 +0000 UTC m=+0.192435586 container init a55c05987e9a315ed16f4c782a575a004d46a41e9204fd5e3d3220235c317a12 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-bce17446-41b5-5408-a23e-0b011906b44a-mgr-np0005559462-fudvyx, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, distribution-scope=public, io.openshift.tags=rhceph ceph, release=1763362218, io.k8s.description=Red Hat Ceph Storage 7, io.buildah.version=1.41.4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://catalog.redhat.com/en/search?searchType=containers, maintainer=Guillaume Abrioux , io.openshift.expose-services=, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, GIT_REPO=https://github.com/ceph/ceph-container.git, ceph=True, GIT_CLEAN=True, version=7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, RELEASE=main, architecture=x86_64, name=rhceph, vendor=Red Hat, Inc., 
org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, com.redhat.component=rhceph-container, build-date=2025-11-26T19:44:28Z, CEPH_POINT_RELEASE=, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_BRANCH=main, description=Red Hat Ceph Storage 7, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, vcs-type=git) Dec 15 04:50:02 localhost podman[292403]: 2025-12-15 09:50:02.015359406 +0000 UTC m=+0.200684269 container start a55c05987e9a315ed16f4c782a575a004d46a41e9204fd5e3d3220235c317a12 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-bce17446-41b5-5408-a23e-0b011906b44a-mgr-np0005559462-fudvyx, release=1763362218, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.openshift.tags=rhceph ceph, com.redhat.component=rhceph-container, CEPH_POINT_RELEASE=, architecture=x86_64, maintainer=Guillaume Abrioux , vendor=Red Hat, Inc., GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_CLEAN=True, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, RELEASE=main, vcs-type=git, io.k8s.description=Red Hat Ceph Storage 7, ceph=True, distribution-scope=public, version=7, build-date=2025-11-26T19:44:28Z, GIT_BRANCH=main, io.openshift.expose-services=, GIT_REPO=https://github.com/ceph/ceph-container.git, url=https://catalog.redhat.com/en/search?searchType=containers, name=rhceph, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, description=Red Hat Ceph Storage 7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI) Dec 15 04:50:02 localhost bash[292403]: a55c05987e9a315ed16f4c782a575a004d46a41e9204fd5e3d3220235c317a12 Dec 15 04:50:02 localhost systemd[1]: Started Ceph mgr.np0005559462.fudvyx for 
bce17446-41b5-5408-a23e-0b011906b44a. Dec 15 04:50:02 localhost ceph-mgr[292421]: set uid:gid to 167:167 (ceph:ceph) Dec 15 04:50:02 localhost ceph-mgr[292421]: ceph version 18.2.1-361.el9cp (439dcd6094d413840eb2ec590fe2194ec616687f) reef (stable), process ceph-mgr, pid 2 Dec 15 04:50:02 localhost ceph-mgr[292421]: pidfile_write: ignore empty --pid-file Dec 15 04:50:02 localhost ceph-mgr[292421]: mgr[py] Loading python module 'alerts' Dec 15 04:50:02 localhost ceph-mgr[292421]: mgr[py] Module alerts has missing NOTIFY_TYPES member Dec 15 04:50:02 localhost ceph-mgr[292421]: mgr[py] Loading python module 'balancer' Dec 15 04:50:02 localhost ceph-bce17446-41b5-5408-a23e-0b011906b44a-mgr-np0005559462-fudvyx[292417]: 2025-12-15T09:50:02.192+0000 7ff1f5c65140 -1 mgr[py] Module alerts has missing NOTIFY_TYPES member Dec 15 04:50:02 localhost ceph-mgr[292421]: mgr[py] Module balancer has missing NOTIFY_TYPES member Dec 15 04:50:02 localhost ceph-bce17446-41b5-5408-a23e-0b011906b44a-mgr-np0005559462-fudvyx[292417]: 2025-12-15T09:50:02.262+0000 7ff1f5c65140 -1 mgr[py] Module balancer has missing NOTIFY_TYPES member Dec 15 04:50:02 localhost ceph-mgr[292421]: mgr[py] Loading python module 'cephadm' Dec 15 04:50:02 localhost nova_compute[286344]: 2025-12-15 09:50:02.829 286348 DEBUG oslo_concurrency.lockutils [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Acquiring lock "refresh_cache-39ff1bd9-6f6b-44c8-bbec-a1fd9d196359" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m Dec 15 04:50:02 localhost nova_compute[286344]: 2025-12-15 09:50:02.830 286348 DEBUG oslo_concurrency.lockutils [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Acquired lock "refresh_cache-39ff1bd9-6f6b-44c8-bbec-a1fd9d196359" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m Dec 15 04:50:02 localhost nova_compute[286344]: 2025-12-15 09:50:02.830 286348 DEBUG nova.network.neutron [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 
- - - - - -] [instance: 39ff1bd9-6f6b-44c8-bbec-a1fd9d196359] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m Dec 15 04:50:02 localhost nova_compute[286344]: 2025-12-15 09:50:02.830 286348 DEBUG nova.objects.instance [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Lazy-loading 'info_cache' on Instance uuid 39ff1bd9-6f6b-44c8-bbec-a1fd9d196359 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m Dec 15 04:50:02 localhost ceph-mgr[292421]: mgr[py] Loading python module 'crash' Dec 15 04:50:02 localhost ceph-mgr[292421]: mgr[py] Module crash has missing NOTIFY_TYPES member Dec 15 04:50:02 localhost ceph-bce17446-41b5-5408-a23e-0b011906b44a-mgr-np0005559462-fudvyx[292417]: 2025-12-15T09:50:02.916+0000 7ff1f5c65140 -1 mgr[py] Module crash has missing NOTIFY_TYPES member Dec 15 04:50:02 localhost ceph-mgr[292421]: mgr[py] Loading python module 'dashboard' Dec 15 04:50:03 localhost ceph-mgr[292421]: mgr[py] Loading python module 'devicehealth' Dec 15 04:50:03 localhost ceph-mgr[292421]: mgr[py] Module devicehealth has missing NOTIFY_TYPES member Dec 15 04:50:03 localhost ceph-bce17446-41b5-5408-a23e-0b011906b44a-mgr-np0005559462-fudvyx[292417]: 2025-12-15T09:50:03.456+0000 7ff1f5c65140 -1 mgr[py] Module devicehealth has missing NOTIFY_TYPES member Dec 15 04:50:03 localhost ceph-mgr[292421]: mgr[py] Loading python module 'diskprediction_local' Dec 15 04:50:03 localhost ceph-bce17446-41b5-5408-a23e-0b011906b44a-mgr-np0005559462-fudvyx[292417]: /lib64/python3.9/site-packages/scipy/__init__.py:73: UserWarning: NumPy was imported from a Python sub-interpreter but NumPy does not properly support sub-interpreters. This will likely work for most users but might cause hard to track down issues or subtle bugs. A common user of the rare sub-interpreter feature is wsgi which also allows single-interpreter mode. 
Dec 15 04:50:03 localhost ceph-bce17446-41b5-5408-a23e-0b011906b44a-mgr-np0005559462-fudvyx[292417]: Improvements in the case of bugs are welcome, but is not on the NumPy roadmap, and full support may require significant effort to achieve. Dec 15 04:50:03 localhost ceph-bce17446-41b5-5408-a23e-0b011906b44a-mgr-np0005559462-fudvyx[292417]: from numpy import show_config as show_numpy_config Dec 15 04:50:03 localhost ceph-mgr[292421]: mgr[py] Module diskprediction_local has missing NOTIFY_TYPES member Dec 15 04:50:03 localhost ceph-bce17446-41b5-5408-a23e-0b011906b44a-mgr-np0005559462-fudvyx[292417]: 2025-12-15T09:50:03.589+0000 7ff1f5c65140 -1 mgr[py] Module diskprediction_local has missing NOTIFY_TYPES member Dec 15 04:50:03 localhost ceph-mgr[292421]: mgr[py] Loading python module 'influx' Dec 15 04:50:03 localhost ceph-mgr[292421]: mgr[py] Module influx has missing NOTIFY_TYPES member Dec 15 04:50:03 localhost ceph-mgr[292421]: mgr[py] Loading python module 'insights' Dec 15 04:50:03 localhost ceph-bce17446-41b5-5408-a23e-0b011906b44a-mgr-np0005559462-fudvyx[292417]: 2025-12-15T09:50:03.646+0000 7ff1f5c65140 -1 mgr[py] Module influx has missing NOTIFY_TYPES member Dec 15 04:50:03 localhost ceph-mgr[292421]: mgr[py] Loading python module 'iostat' Dec 15 04:50:03 localhost ceph-mgr[292421]: mgr[py] Module iostat has missing NOTIFY_TYPES member Dec 15 04:50:03 localhost ceph-mgr[292421]: mgr[py] Loading python module 'k8sevents' Dec 15 04:50:03 localhost ceph-bce17446-41b5-5408-a23e-0b011906b44a-mgr-np0005559462-fudvyx[292417]: 2025-12-15T09:50:03.757+0000 7ff1f5c65140 -1 mgr[py] Module iostat has missing NOTIFY_TYPES member Dec 15 04:50:04 localhost ceph-mgr[292421]: mgr[py] Loading python module 'localpool' Dec 15 04:50:04 localhost ceph-mgr[292421]: mgr[py] Loading python module 'mds_autoscaler' Dec 15 04:50:04 localhost ceph-mgr[292421]: mgr[py] Loading python module 'mirroring' Dec 15 04:50:04 localhost ceph-mgr[292421]: mgr[py] Loading python module 'nfs' Dec 
15 04:50:04 localhost ceph-mgr[292421]: mgr[py] Module nfs has missing NOTIFY_TYPES member Dec 15 04:50:04 localhost ceph-bce17446-41b5-5408-a23e-0b011906b44a-mgr-np0005559462-fudvyx[292417]: 2025-12-15T09:50:04.485+0000 7ff1f5c65140 -1 mgr[py] Module nfs has missing NOTIFY_TYPES member Dec 15 04:50:04 localhost ceph-mgr[292421]: mgr[py] Loading python module 'orchestrator' Dec 15 04:50:04 localhost ceph-mgr[292421]: mgr[py] Module orchestrator has missing NOTIFY_TYPES member Dec 15 04:50:04 localhost ceph-mgr[292421]: mgr[py] Loading python module 'osd_perf_query' Dec 15 04:50:04 localhost ceph-bce17446-41b5-5408-a23e-0b011906b44a-mgr-np0005559462-fudvyx[292417]: 2025-12-15T09:50:04.627+0000 7ff1f5c65140 -1 mgr[py] Module orchestrator has missing NOTIFY_TYPES member Dec 15 04:50:04 localhost nova_compute[286344]: 2025-12-15 09:50:04.656 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 04:50:04 localhost ceph-mgr[292421]: mgr[py] Module osd_perf_query has missing NOTIFY_TYPES member Dec 15 04:50:04 localhost ceph-bce17446-41b5-5408-a23e-0b011906b44a-mgr-np0005559462-fudvyx[292417]: 2025-12-15T09:50:04.689+0000 7ff1f5c65140 -1 mgr[py] Module osd_perf_query has missing NOTIFY_TYPES member Dec 15 04:50:04 localhost ceph-mgr[292421]: mgr[py] Loading python module 'osd_support' Dec 15 04:50:04 localhost ceph-mgr[292421]: mgr[py] Module osd_support has missing NOTIFY_TYPES member Dec 15 04:50:04 localhost ceph-bce17446-41b5-5408-a23e-0b011906b44a-mgr-np0005559462-fudvyx[292417]: 2025-12-15T09:50:04.743+0000 7ff1f5c65140 -1 mgr[py] Module osd_support has missing NOTIFY_TYPES member Dec 15 04:50:04 localhost ceph-mgr[292421]: mgr[py] Loading python module 'pg_autoscaler' Dec 15 04:50:04 localhost ceph-mgr[292421]: mgr[py] Module pg_autoscaler has missing NOTIFY_TYPES member Dec 15 04:50:04 localhost ceph-bce17446-41b5-5408-a23e-0b011906b44a-mgr-np0005559462-fudvyx[292417]: 
2025-12-15T09:50:04.809+0000 7ff1f5c65140 -1 mgr[py] Module pg_autoscaler has missing NOTIFY_TYPES member Dec 15 04:50:04 localhost ceph-mgr[292421]: mgr[py] Loading python module 'progress' Dec 15 04:50:04 localhost openstack_network_exporter[246484]: ERROR 09:50:04 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server Dec 15 04:50:04 localhost openstack_network_exporter[246484]: ERROR 09:50:04 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd Dec 15 04:50:04 localhost openstack_network_exporter[246484]: ERROR 09:50:04 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd Dec 15 04:50:04 localhost openstack_network_exporter[246484]: ERROR 09:50:04 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath Dec 15 04:50:04 localhost openstack_network_exporter[246484]: Dec 15 04:50:04 localhost openstack_network_exporter[246484]: ERROR 09:50:04 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath Dec 15 04:50:04 localhost openstack_network_exporter[246484]: Dec 15 04:50:04 localhost ceph-mgr[292421]: mgr[py] Module progress has missing NOTIFY_TYPES member Dec 15 04:50:04 localhost ceph-bce17446-41b5-5408-a23e-0b011906b44a-mgr-np0005559462-fudvyx[292417]: 2025-12-15T09:50:04.902+0000 7ff1f5c65140 -1 mgr[py] Module progress has missing NOTIFY_TYPES member Dec 15 04:50:04 localhost ceph-mgr[292421]: mgr[py] Loading python module 'prometheus' Dec 15 04:50:05 localhost nova_compute[286344]: 2025-12-15 09:50:05.015 286348 DEBUG nova.network.neutron [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] [instance: 39ff1bd9-6f6b-44c8-bbec-a1fd9d196359] Updating instance_info_cache with network_info: [{"id": "03ef8889-3216-43fb-8a52-4be17a956ce1", "address": "fa:16:3e:74:df:7c", "network": {"id": "befb7a72-17a9-4bcb-b561-84b8f626685a", "bridge": "br-int", "label": "private", 
"subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.201", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.20", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.0.3"}}], "meta": {"injected": false, "tenant_id": "c785bf23f53946bc99867d8832a50266", "mtu": 1292, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap03ef8889-32", "ovs_interfaceid": "03ef8889-3216-43fb-8a52-4be17a956ce1", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m Dec 15 04:50:05 localhost nova_compute[286344]: 2025-12-15 09:50:05.031 286348 DEBUG oslo_concurrency.lockutils [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Releasing lock "refresh_cache-39ff1bd9-6f6b-44c8-bbec-a1fd9d196359" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m Dec 15 04:50:05 localhost nova_compute[286344]: 2025-12-15 09:50:05.032 286348 DEBUG nova.compute.manager [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] [instance: 39ff1bd9-6f6b-44c8-bbec-a1fd9d196359] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m Dec 15 04:50:05 localhost nova_compute[286344]: 2025-12-15 09:50:05.033 286348 DEBUG oslo_service.periodic_task [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks 
/usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Dec 15 04:50:05 localhost nova_compute[286344]: 2025-12-15 09:50:05.033 286348 DEBUG oslo_service.periodic_task [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Dec 15 04:50:05 localhost nova_compute[286344]: 2025-12-15 09:50:05.034 286348 DEBUG oslo_service.periodic_task [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Dec 15 04:50:05 localhost nova_compute[286344]: 2025-12-15 09:50:05.034 286348 DEBUG oslo_service.periodic_task [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Dec 15 04:50:05 localhost nova_compute[286344]: 2025-12-15 09:50:05.035 286348 DEBUG oslo_service.periodic_task [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Dec 15 04:50:05 localhost nova_compute[286344]: 2025-12-15 09:50:05.036 286348 DEBUG oslo_service.periodic_task [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Dec 15 04:50:05 localhost nova_compute[286344]: 2025-12-15 09:50:05.036 286348 DEBUG nova.compute.manager [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... 
_reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m Dec 15 04:50:05 localhost nova_compute[286344]: 2025-12-15 09:50:05.056 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 04:50:05 localhost ceph-mgr[292421]: mgr[py] Module prometheus has missing NOTIFY_TYPES member Dec 15 04:50:05 localhost ceph-bce17446-41b5-5408-a23e-0b011906b44a-mgr-np0005559462-fudvyx[292417]: 2025-12-15T09:50:05.216+0000 7ff1f5c65140 -1 mgr[py] Module prometheus has missing NOTIFY_TYPES member Dec 15 04:50:05 localhost ceph-mgr[292421]: mgr[py] Loading python module 'rbd_support' Dec 15 04:50:05 localhost ceph-mgr[292421]: mgr[py] Module rbd_support has missing NOTIFY_TYPES member Dec 15 04:50:05 localhost ceph-bce17446-41b5-5408-a23e-0b011906b44a-mgr-np0005559462-fudvyx[292417]: 2025-12-15T09:50:05.301+0000 7ff1f5c65140 -1 mgr[py] Module rbd_support has missing NOTIFY_TYPES member Dec 15 04:50:05 localhost ceph-mgr[292421]: mgr[py] Loading python module 'restful' Dec 15 04:50:05 localhost ceph-mgr[292421]: mgr[py] Loading python module 'rgw' Dec 15 04:50:05 localhost ceph-mgr[292421]: mgr[py] Module rgw has missing NOTIFY_TYPES member Dec 15 04:50:05 localhost ceph-bce17446-41b5-5408-a23e-0b011906b44a-mgr-np0005559462-fudvyx[292417]: 2025-12-15T09:50:05.624+0000 7ff1f5c65140 -1 mgr[py] Module rgw has missing NOTIFY_TYPES member Dec 15 04:50:05 localhost ceph-mgr[292421]: mgr[py] Loading python module 'rook' Dec 15 04:50:05 localhost systemd[1]: Started /usr/bin/podman healthcheck run a4e94bd7aaee6c174c601060fb5f34559019ccea93fc397263ee3735731cfc1e. Dec 15 04:50:05 localhost systemd[1]: tmp-crun.VjQ01B.mount: Deactivated successfully. 
Dec 15 04:50:05 localhost podman[292451]: 2025-12-15 09:50:05.761180462 +0000 UTC m=+0.088955348 container health_status a4e94bd7aaee6c174c601060fb5f34559019ccea93fc397263ee3735731cfc1e (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, managed_by=edpm_ansible, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi ) Dec 15 04:50:05 localhost podman[292451]: 2025-12-15 09:50:05.772345557 +0000 UTC m=+0.100120493 container exec_died a4e94bd7aaee6c174c601060fb5f34559019ccea93fc397263ee3735731cfc1e (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': 
['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter) Dec 15 04:50:05 localhost systemd[1]: a4e94bd7aaee6c174c601060fb5f34559019ccea93fc397263ee3735731cfc1e.service: Deactivated successfully. Dec 15 04:50:06 localhost ceph-mgr[292421]: mgr[py] Module rook has missing NOTIFY_TYPES member Dec 15 04:50:06 localhost ceph-mgr[292421]: mgr[py] Loading python module 'selftest' Dec 15 04:50:06 localhost ceph-bce17446-41b5-5408-a23e-0b011906b44a-mgr-np0005559462-fudvyx[292417]: 2025-12-15T09:50:06.058+0000 7ff1f5c65140 -1 mgr[py] Module rook has missing NOTIFY_TYPES member Dec 15 04:50:06 localhost ceph-mgr[292421]: mgr[py] Module selftest has missing NOTIFY_TYPES member Dec 15 04:50:06 localhost ceph-mgr[292421]: mgr[py] Loading python module 'snap_schedule' Dec 15 04:50:06 localhost ceph-bce17446-41b5-5408-a23e-0b011906b44a-mgr-np0005559462-fudvyx[292417]: 2025-12-15T09:50:06.117+0000 7ff1f5c65140 -1 mgr[py] Module selftest has missing NOTIFY_TYPES member Dec 15 04:50:06 localhost ceph-mgr[292421]: mgr[py] Loading python module 'stats' Dec 15 04:50:06 localhost ceph-mgr[292421]: mgr[py] Loading python module 'status' Dec 15 04:50:06 localhost ceph-mgr[292421]: mgr[py] Module status has missing NOTIFY_TYPES member Dec 15 04:50:06 localhost ceph-mgr[292421]: mgr[py] Loading python module 'telegraf' Dec 15 04:50:06 localhost ceph-bce17446-41b5-5408-a23e-0b011906b44a-mgr-np0005559462-fudvyx[292417]: 2025-12-15T09:50:06.304+0000 7ff1f5c65140 -1 mgr[py] Module status has missing NOTIFY_TYPES member Dec 15 04:50:06 localhost ceph-mgr[292421]: mgr[py] Module telegraf has missing NOTIFY_TYPES member Dec 15 04:50:06 localhost ceph-mgr[292421]: mgr[py] Loading python module 'telemetry' Dec 15 04:50:06 localhost ceph-bce17446-41b5-5408-a23e-0b011906b44a-mgr-np0005559462-fudvyx[292417]: 2025-12-15T09:50:06.362+0000 7ff1f5c65140 -1 mgr[py] Module telegraf has missing NOTIFY_TYPES member 
Dec 15 04:50:06 localhost ceph-mgr[292421]: mgr[py] Module telemetry has missing NOTIFY_TYPES member Dec 15 04:50:06 localhost ceph-bce17446-41b5-5408-a23e-0b011906b44a-mgr-np0005559462-fudvyx[292417]: 2025-12-15T09:50:06.498+0000 7ff1f5c65140 -1 mgr[py] Module telemetry has missing NOTIFY_TYPES member Dec 15 04:50:06 localhost ceph-mgr[292421]: mgr[py] Loading python module 'test_orchestrator' Dec 15 04:50:06 localhost ceph-mgr[292421]: mgr[py] Module test_orchestrator has missing NOTIFY_TYPES member Dec 15 04:50:06 localhost ceph-bce17446-41b5-5408-a23e-0b011906b44a-mgr-np0005559462-fudvyx[292417]: 2025-12-15T09:50:06.646+0000 7ff1f5c65140 -1 mgr[py] Module test_orchestrator has missing NOTIFY_TYPES member Dec 15 04:50:06 localhost ceph-mgr[292421]: mgr[py] Loading python module 'volumes' Dec 15 04:50:06 localhost ceph-mgr[292421]: mgr[py] Module volumes has missing NOTIFY_TYPES member Dec 15 04:50:06 localhost ceph-mgr[292421]: mgr[py] Loading python module 'zabbix' Dec 15 04:50:06 localhost ceph-bce17446-41b5-5408-a23e-0b011906b44a-mgr-np0005559462-fudvyx[292417]: 2025-12-15T09:50:06.840+0000 7ff1f5c65140 -1 mgr[py] Module volumes has missing NOTIFY_TYPES member Dec 15 04:50:06 localhost ceph-mgr[292421]: mgr[py] Module zabbix has missing NOTIFY_TYPES member Dec 15 04:50:06 localhost ceph-bce17446-41b5-5408-a23e-0b011906b44a-mgr-np0005559462-fudvyx[292417]: 2025-12-15T09:50:06.898+0000 7ff1f5c65140 -1 mgr[py] Module zabbix has missing NOTIFY_TYPES member Dec 15 04:50:06 localhost ceph-mgr[292421]: ms_deliver_dispatch: unhandled message 0x55975582b1e0 mon_map magic: 0 from mon.2 v2:172.18.0.104:3300/0 Dec 15 04:50:06 localhost ceph-mgr[292421]: client.0 ms_handle_reset on v2:172.18.0.103:6800/2662188067 Dec 15 04:50:08 localhost systemd[1]: tmp-crun.3N5W2W.mount: Deactivated successfully. 
Dec 15 04:50:08 localhost podman[292598]: 2025-12-15 09:50:08.265815925 +0000 UTC m=+0.106552651 container exec 8dcda56b365b42dc8758aab77a9ec80db304780e449052738f7e4e648ae1ecaf (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-bce17446-41b5-5408-a23e-0b011906b44a-crash-np0005559462, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_CLEAN=True, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, GIT_REPO=https://github.com/ceph/ceph-container.git, build-date=2025-11-26T19:44:28Z, release=1763362218, io.openshift.tags=rhceph ceph, maintainer=Guillaume Abrioux , distribution-scope=public, io.openshift.expose-services=, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.buildah.version=1.41.4, vendor=Red Hat, Inc., RELEASE=main, architecture=x86_64, name=rhceph, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.k8s.description=Red Hat Ceph Storage 7, GIT_BRANCH=main, version=7, com.redhat.component=rhceph-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, description=Red Hat Ceph Storage 7, CEPH_POINT_RELEASE=, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, ceph=True, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9) Dec 15 04:50:08 localhost podman[292598]: 2025-12-15 09:50:08.364343042 +0000 UTC m=+0.205079758 container exec_died 8dcda56b365b42dc8758aab77a9ec80db304780e449052738f7e4e648ae1ecaf (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-bce17446-41b5-5408-a23e-0b011906b44a-crash-np0005559462, io.buildah.version=1.41.4, io.openshift.expose-services=, io.k8s.description=Red Hat Ceph Storage 7, GIT_BRANCH=main, CEPH_POINT_RELEASE=, build-date=2025-11-26T19:44:28Z, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., RELEASE=main, version=7, 
io.openshift.tags=rhceph ceph, url=https://catalog.redhat.com/en/search?searchType=containers, GIT_CLEAN=True, distribution-scope=public, vcs-type=git, com.redhat.component=rhceph-container, architecture=x86_64, GIT_REPO=https://github.com/ceph/ceph-container.git, vendor=Red Hat, Inc., org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, release=1763362218, ceph=True, name=rhceph, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, maintainer=Guillaume Abrioux , com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat Ceph Storage 7) Dec 15 04:50:09 localhost nova_compute[286344]: 2025-12-15 09:50:09.684 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 04:50:10 localhost nova_compute[286344]: 2025-12-15 09:50:10.058 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 04:50:14 localhost nova_compute[286344]: 2025-12-15 09:50:14.723 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 04:50:15 localhost nova_compute[286344]: 2025-12-15 09:50:15.059 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 04:50:16 localhost systemd[1]: Started /usr/bin/podman healthcheck run 67ee72fcd99c896f79e8807764b1d0f04e7d45927d06e306c0986394e8ed42a0. Dec 15 04:50:16 localhost systemd[1]: Started /usr/bin/podman healthcheck run 9f0f481c937606298203797a71924b7614a5d9e3f4a8afd837cfa5b9602aedeb. 
Dec 15 04:50:16 localhost systemd[1]: Started /usr/bin/podman healthcheck run b3d8483ade539500e228c2af9c0c3e647febb5ec7903610a83aea32631007d4a. Dec 15 04:50:16 localhost systemd[1]: tmp-crun.wiwunh.mount: Deactivated successfully. Dec 15 04:50:16 localhost podman[293415]: 2025-12-15 09:50:16.796050381 +0000 UTC m=+0.117867168 container health_status b3d8483ade539500e228c2af9c0c3e647febb5ec7903610a83aea32631007d4a (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, managed_by=edpm_ansible, tcib_managed=true, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, container_name=ceilometer_agent_compute, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '07f3af15d79276418f54a8dde2fc251064527c3571430e14af77aaa2c01f1193-cb1e9478634a00c8fb5ad9cd7fcd38c7b2a1a85187dfaa618bc715c24c2b15fb'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team) Dec 15 04:50:16 localhost podman[293413]: 2025-12-15 09:50:16.825596544 +0000 UTC m=+0.151524025 container health_status 67ee72fcd99c896f79e8807764b1d0f04e7d45927d06e306c0986394e8ed42a0 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible) Dec 15 04:50:16 localhost podman[293414]: 2025-12-15 09:50:16.871619267 +0000 UTC m=+0.196389367 
container health_status 9f0f481c937606298203797a71924b7614a5d9e3f4a8afd837cfa5b9602aedeb (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, container_name=multipathd, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, tcib_managed=true) Dec 15 04:50:16 localhost podman[293414]: 2025-12-15 09:50:16.884514816 +0000 UTC m=+0.209284936 container exec_died 9f0f481c937606298203797a71924b7614a5d9e3f4a8afd837cfa5b9602aedeb 
(image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_id=multipathd, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.license=GPLv2, container_name=multipathd, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3) Dec 15 04:50:16 localhost systemd[1]: 9f0f481c937606298203797a71924b7614a5d9e3f4a8afd837cfa5b9602aedeb.service: Deactivated successfully. 
Dec 15 04:50:16 localhost podman[293415]: 2025-12-15 09:50:16.933450901 +0000 UTC m=+0.255267648 container exec_died b3d8483ade539500e228c2af9c0c3e647febb5ec7903610a83aea32631007d4a (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, managed_by=edpm_ansible, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, org.label-schema.schema-version=1.0, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '07f3af15d79276418f54a8dde2fc251064527c3571430e14af77aaa2c01f1193-cb1e9478634a00c8fb5ad9cd7fcd38c7b2a1a85187dfaa618bc715c24c2b15fb'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, container_name=ceilometer_agent_compute, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_id=ceilometer_agent_compute) Dec 
15 04:50:16 localhost systemd[1]: b3d8483ade539500e228c2af9c0c3e647febb5ec7903610a83aea32631007d4a.service: Deactivated successfully. Dec 15 04:50:16 localhost podman[293413]: 2025-12-15 09:50:16.990878772 +0000 UTC m=+0.316806303 container exec_died 67ee72fcd99c896f79e8807764b1d0f04e7d45927d06e306c0986394e8ed42a0 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, managed_by=edpm_ansible, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors ) Dec 15 04:50:17 localhost systemd[1]: 67ee72fcd99c896f79e8807764b1d0f04e7d45927d06e306c0986394e8ed42a0.service: Deactivated successfully. 
Dec 15 04:50:19 localhost nova_compute[286344]: 2025-12-15 09:50:19.745 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 04:50:20 localhost nova_compute[286344]: 2025-12-15 09:50:20.061 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 04:50:20 localhost systemd[1]: Started /usr/bin/podman healthcheck run 730c50020a7d9e2da21d2462d40af20ac303e43f3491f692cbbd87f432ba3e09. Dec 15 04:50:20 localhost systemd[1]: Started /usr/bin/podman healthcheck run ac2eae30a2a7f971398af42ea67b3ed799d4b3341f7688ebcd73f7ca7f66c2a8. Dec 15 04:50:20 localhost systemd[1]: tmp-crun.FdzL1D.mount: Deactivated successfully. Dec 15 04:50:20 localhost podman[293493]: 2025-12-15 09:50:20.746642389 +0000 UTC m=+0.078392666 container health_status 730c50020a7d9e2da21d2462d40af20ac303e43f3491f692cbbd87f432ba3e09 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, maintainer=Red Hat, Inc., container_name=openstack_network_exporter, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., build-date=2025-08-20T13:12:41, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, name=ubi9-minimal, vcs-type=git, distribution-scope=public, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.component=ubi9-minimal-container, io.openshift.expose-services=, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'cb1e9478634a00c8fb5ad9cd7fcd38c7b2a1a85187dfaa618bc715c24c2b15fb'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vendor=Red Hat, Inc., io.buildah.version=1.33.7, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, managed_by=edpm_ansible, architecture=x86_64, version=9.6, release=1755695350, url=https://catalog.redhat.com/en/search?searchType=containers, config_id=openstack_network_exporter) Dec 15 04:50:20 localhost podman[293494]: 2025-12-15 09:50:20.791222862 +0000 UTC m=+0.120187572 container health_status ac2eae30a2a7f971398af42ea67b3ed799d4b3341f7688ebcd73f7ca7f66c2a8 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, 
name=ovn_controller, health_status=healthy, container_name=ovn_controller, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '07f3af15d79276418f54a8dde2fc251064527c3571430e14af77aaa2c01f1193'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=ovn_controller, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, org.label-schema.license=GPLv2) Dec 15 04:50:20 localhost podman[293493]: 2025-12-15 09:50:20.871104429 +0000 UTC m=+0.202854716 container exec_died 730c50020a7d9e2da21d2462d40af20ac303e43f3491f692cbbd87f432ba3e09 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, managed_by=edpm_ansible, vcs-type=git, release=1755695350, config_id=openstack_network_exporter, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, 
name=ubi9-minimal, architecture=x86_64, maintainer=Red Hat, Inc., config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'cb1e9478634a00c8fb5ad9cd7fcd38c7b2a1a85187dfaa618bc715c24c2b15fb'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.buildah.version=1.33.7, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, vendor=Red Hat, Inc., build-date=2025-08-20T13:12:41, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.component=ubi9-minimal-container, io.openshift.expose-services=, version=9.6, url=https://catalog.redhat.com/en/search?searchType=containers, container_name=openstack_network_exporter) Dec 15 04:50:20 localhost systemd[1]: 730c50020a7d9e2da21d2462d40af20ac303e43f3491f692cbbd87f432ba3e09.service: Deactivated successfully. Dec 15 04:50:20 localhost podman[293494]: 2025-12-15 09:50:20.887166497 +0000 UTC m=+0.216131197 container exec_died ac2eae30a2a7f971398af42ea67b3ed799d4b3341f7688ebcd73f7ca7f66c2a8 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '07f3af15d79276418f54a8dde2fc251064527c3571430e14af77aaa2c01f1193'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, io.buildah.version=1.41.3, 
tcib_build_tag=c3923531bcda0b0811b2d5053f189beb) Dec 15 04:50:20 localhost systemd[1]: ac2eae30a2a7f971398af42ea67b3ed799d4b3341f7688ebcd73f7ca7f66c2a8.service: Deactivated successfully. Dec 15 04:50:23 localhost ceph-mgr[292421]: ms_deliver_dispatch: unhandled message 0x55975582b1e0 mon_map magic: 0 from mon.2 v2:172.18.0.104:3300/0 Dec 15 04:50:24 localhost nova_compute[286344]: 2025-12-15 09:50:24.774 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 04:50:25 localhost nova_compute[286344]: 2025-12-15 09:50:25.064 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 04:50:28 localhost systemd[1]: Started /usr/bin/podman healthcheck run 4d46c406a3855fd1c322e5d86c048188ea5865daa07ba23aaebacc7879668923. Dec 15 04:50:28 localhost podman[293540]: 2025-12-15 09:50:28.754021416 +0000 UTC m=+0.082977914 container health_status 4d46c406a3855fd1c322e5d86c048188ea5865daa07ba23aaebacc7879668923 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '07f3af15d79276418f54a8dde2fc251064527c3571430e14af77aaa2c01f1193-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', 
'/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, managed_by=edpm_ansible, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=ovn_metadata_agent, io.buildah.version=1.41.3) Dec 15 04:50:28 localhost podman[293540]: 2025-12-15 09:50:28.759165459 +0000 UTC m=+0.088121997 container exec_died 4d46c406a3855fd1c322e5d86c048188ea5865daa07ba23aaebacc7879668923 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, managed_by=edpm_ansible, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '07f3af15d79276418f54a8dde2fc251064527c3571430e14af77aaa2c01f1193-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': 
'/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.schema-version=1.0) Dec 15 04:50:28 localhost systemd[1]: 4d46c406a3855fd1c322e5d86c048188ea5865daa07ba23aaebacc7879668923.service: Deactivated successfully. 
Dec 15 04:50:29 localhost podman[293638]: Dec 15 04:50:29 localhost podman[293638]: 2025-12-15 09:50:29.612812128 +0000 UTC m=+0.071127644 container create 5521123bf1699d88858617b32d35a776f40483c84eed74e939026e5850a640f6 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=nifty_yalow, com.redhat.component=rhceph-container, io.openshift.expose-services=, GIT_BRANCH=main, CEPH_POINT_RELEASE=, io.buildah.version=1.41.4, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, ceph=True, io.openshift.tags=rhceph ceph, GIT_REPO=https://github.com/ceph/ceph-container.git, name=rhceph, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, build-date=2025-11-26T19:44:28Z, vendor=Red Hat, Inc., version=7, GIT_CLEAN=True, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, release=1763362218, distribution-scope=public, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, io.k8s.description=Red Hat Ceph Storage 7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=Guillaume Abrioux , description=Red Hat Ceph Storage 7, architecture=x86_64, RELEASE=main) Dec 15 04:50:29 localhost systemd[1]: Started libpod-conmon-5521123bf1699d88858617b32d35a776f40483c84eed74e939026e5850a640f6.scope. Dec 15 04:50:29 localhost podman[293638]: 2025-12-15 09:50:29.577112833 +0000 UTC m=+0.035428379 image pull registry.redhat.io/rhceph/rhceph-7-rhel9:latest Dec 15 04:50:29 localhost systemd[1]: Started libcrun container. 
Dec 15 04:50:29 localhost podman[293638]: 2025-12-15 09:50:29.694579998 +0000 UTC m=+0.152895504 container init 5521123bf1699d88858617b32d35a776f40483c84eed74e939026e5850a640f6 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=nifty_yalow, url=https://catalog.redhat.com/en/search?searchType=containers, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.k8s.description=Red Hat Ceph Storage 7, io.buildah.version=1.41.4, architecture=x86_64, description=Red Hat Ceph Storage 7, io.openshift.expose-services=, CEPH_POINT_RELEASE=, ceph=True, vendor=Red Hat, Inc., GIT_CLEAN=True, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhceph, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, release=1763362218, maintainer=Guillaume Abrioux , com.redhat.component=rhceph-container, version=7, io.openshift.tags=rhceph ceph, GIT_BRANCH=main, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_REPO=https://github.com/ceph/ceph-container.git, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, vcs-type=git, build-date=2025-11-26T19:44:28Z, distribution-scope=public, RELEASE=main, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0) Dec 15 04:50:29 localhost podman[293638]: 2025-12-15 09:50:29.706241193 +0000 UTC m=+0.164556699 container start 5521123bf1699d88858617b32d35a776f40483c84eed74e939026e5850a640f6 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=nifty_yalow, io.openshift.expose-services=, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, architecture=x86_64, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_REPO=https://github.com/ceph/ceph-container.git, io.buildah.version=1.41.4, url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.description=Red Hat Ceph Storage 7, version=7, 
CEPH_POINT_RELEASE=, release=1763362218, build-date=2025-11-26T19:44:28Z, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, RELEASE=main, name=rhceph, ceph=True, io.openshift.tags=rhceph ceph, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, GIT_BRANCH=main, GIT_CLEAN=True, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, maintainer=Guillaume Abrioux , vcs-type=git, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., description=Red Hat Ceph Storage 7, com.redhat.component=rhceph-container) Dec 15 04:50:29 localhost podman[293638]: 2025-12-15 09:50:29.70650598 +0000 UTC m=+0.164821486 container attach 5521123bf1699d88858617b32d35a776f40483c84eed74e939026e5850a640f6 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=nifty_yalow, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, io.k8s.description=Red Hat Ceph Storage 7, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, release=1763362218, architecture=x86_64, ceph=True, io.openshift.tags=rhceph ceph, description=Red Hat Ceph Storage 7, GIT_REPO=https://github.com/ceph/ceph-container.git, GIT_CLEAN=True, name=rhceph, io.openshift.expose-services=, distribution-scope=public, RELEASE=main, vcs-type=git, build-date=2025-11-26T19:44:28Z, maintainer=Guillaume Abrioux , url=https://catalog.redhat.com/en/search?searchType=containers, CEPH_POINT_RELEASE=, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, com.redhat.component=rhceph-container, version=7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_BRANCH=main, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, 
cpe=cpe:/a:redhat:enterprise_linux:9::appstream) Dec 15 04:50:29 localhost nifty_yalow[293653]: 167 167 Dec 15 04:50:29 localhost systemd[1]: libpod-5521123bf1699d88858617b32d35a776f40483c84eed74e939026e5850a640f6.scope: Deactivated successfully. Dec 15 04:50:29 localhost podman[293638]: 2025-12-15 09:50:29.710749119 +0000 UTC m=+0.169064645 container died 5521123bf1699d88858617b32d35a776f40483c84eed74e939026e5850a640f6 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=nifty_yalow, io.buildah.version=1.41.4, architecture=x86_64, url=https://catalog.redhat.com/en/search?searchType=containers, build-date=2025-11-26T19:44:28Z, io.openshift.expose-services=, GIT_BRANCH=main, distribution-scope=public, io.openshift.tags=rhceph ceph, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., com.redhat.component=rhceph-container, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, maintainer=Guillaume Abrioux , description=Red Hat Ceph Storage 7, GIT_REPO=https://github.com/ceph/ceph-container.git, io.k8s.description=Red Hat Ceph Storage 7, name=rhceph, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, CEPH_POINT_RELEASE=, GIT_CLEAN=True, release=1763362218, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, RELEASE=main, ceph=True, vcs-type=git, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, version=7) Dec 15 04:50:29 localhost nova_compute[286344]: 2025-12-15 09:50:29.820 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 04:50:29 localhost systemd[1]: var-lib-containers-storage-overlay-1088f06447f7670e348264e4e3d0c5c56418772712eb82f88c4649f6c8e22d57-merged.mount: Deactivated successfully. 
Dec 15 04:50:29 localhost podman[293658]: 2025-12-15 09:50:29.848918421 +0000 UTC m=+0.125654305 container remove 5521123bf1699d88858617b32d35a776f40483c84eed74e939026e5850a640f6 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=nifty_yalow, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, distribution-scope=public, GIT_REPO=https://github.com/ceph/ceph-container.git, vcs-type=git, io.k8s.description=Red Hat Ceph Storage 7, release=1763362218, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, GIT_CLEAN=True, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, RELEASE=main, description=Red Hat Ceph Storage 7, io.openshift.tags=rhceph ceph, maintainer=Guillaume Abrioux , vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, GIT_BRANCH=main, io.openshift.expose-services=, build-date=2025-11-26T19:44:28Z, version=7, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., com.redhat.component=rhceph-container, CEPH_POINT_RELEASE=, architecture=x86_64, ceph=True, io.buildah.version=1.41.4, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, name=rhceph) Dec 15 04:50:29 localhost systemd[1]: libpod-conmon-5521123bf1699d88858617b32d35a776f40483c84eed74e939026e5850a640f6.scope: Deactivated successfully. 
Dec 15 04:50:29 localhost podman[293674]: Dec 15 04:50:29 localhost podman[293674]: 2025-12-15 09:50:29.962606771 +0000 UTC m=+0.077183834 container create 564af69eec1a96fb0db8467197b247540b861a70c19bbd8c9d02312f38052f84 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=jolly_payne, ceph=True, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, RELEASE=main, distribution-scope=public, name=rhceph, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, description=Red Hat Ceph Storage 7, GIT_REPO=https://github.com/ceph/ceph-container.git, maintainer=Guillaume Abrioux , io.k8s.description=Red Hat Ceph Storage 7, vendor=Red Hat, Inc., GIT_CLEAN=True, url=https://catalog.redhat.com/en/search?searchType=containers, architecture=x86_64, io.openshift.expose-services=, io.openshift.tags=rhceph ceph, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, build-date=2025-11-26T19:44:28Z, GIT_BRANCH=main, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, com.redhat.component=rhceph-container, release=1763362218, version=7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vcs-type=git, CEPH_POINT_RELEASE=) Dec 15 04:50:30 localhost systemd[1]: Started libpod-conmon-564af69eec1a96fb0db8467197b247540b861a70c19bbd8c9d02312f38052f84.scope. Dec 15 04:50:30 localhost systemd[1]: Started libcrun container. 
Dec 15 04:50:30 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/a31a293ad76058e5d25646f51c922c94e063f1e4b95a2b49a2b2ccb8e2b38443/merged/tmp/config supports timestamps until 2038 (0x7fffffff) Dec 15 04:50:30 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/a31a293ad76058e5d25646f51c922c94e063f1e4b95a2b49a2b2ccb8e2b38443/merged/tmp/keyring supports timestamps until 2038 (0x7fffffff) Dec 15 04:50:30 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/a31a293ad76058e5d25646f51c922c94e063f1e4b95a2b49a2b2ccb8e2b38443/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff) Dec 15 04:50:30 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/a31a293ad76058e5d25646f51c922c94e063f1e4b95a2b49a2b2ccb8e2b38443/merged/var/lib/ceph/mon/ceph-np0005559462 supports timestamps until 2038 (0x7fffffff) Dec 15 04:50:30 localhost podman[293674]: 2025-12-15 09:50:29.932152111 +0000 UTC m=+0.046729184 image pull registry.redhat.io/rhceph/rhceph-7-rhel9:latest Dec 15 04:50:30 localhost podman[293674]: 2025-12-15 09:50:30.032652343 +0000 UTC m=+0.147229416 container init 564af69eec1a96fb0db8467197b247540b861a70c19bbd8c9d02312f38052f84 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=jolly_payne, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, CEPH_POINT_RELEASE=, release=1763362218, com.redhat.component=rhceph-container, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., description=Red Hat Ceph Storage 7, build-date=2025-11-26T19:44:28Z, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vcs-type=git, GIT_REPO=https://github.com/ceph/ceph-container.git, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, 
GIT_BRANCH=main, maintainer=Guillaume Abrioux , name=rhceph, version=7, GIT_CLEAN=True, url=https://catalog.redhat.com/en/search?searchType=containers, ceph=True, RELEASE=main, distribution-scope=public, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, architecture=x86_64, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, io.buildah.version=1.41.4, io.openshift.expose-services=, io.k8s.description=Red Hat Ceph Storage 7, io.openshift.tags=rhceph ceph) Dec 15 04:50:30 localhost podman[293674]: 2025-12-15 09:50:30.044604636 +0000 UTC m=+0.159181699 container start 564af69eec1a96fb0db8467197b247540b861a70c19bbd8c9d02312f38052f84 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=jolly_payne, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.openshift.expose-services=, CEPH_POINT_RELEASE=, description=Red Hat Ceph Storage 7, GIT_BRANCH=main, RELEASE=main, vcs-type=git, name=rhceph, build-date=2025-11-26T19:44:28Z, io.openshift.tags=rhceph ceph, distribution-scope=public, ceph=True, GIT_REPO=https://github.com/ceph/ceph-container.git, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1763362218, io.k8s.description=Red Hat Ceph Storage 7, vendor=Red Hat, Inc., GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, maintainer=Guillaume Abrioux , vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.buildah.version=1.41.4, url=https://catalog.redhat.com/en/search?searchType=containers, GIT_CLEAN=True, version=7, architecture=x86_64, com.redhat.component=rhceph-container) Dec 15 04:50:30 localhost podman[293674]: 2025-12-15 09:50:30.045545612 +0000 UTC m=+0.160122715 container attach 564af69eec1a96fb0db8467197b247540b861a70c19bbd8c9d02312f38052f84 
(image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=jolly_payne, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., build-date=2025-11-26T19:44:28Z, io.openshift.tags=rhceph ceph, GIT_BRANCH=main, io.k8s.description=Red Hat Ceph Storage 7, maintainer=Guillaume Abrioux , org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, io.openshift.expose-services=, GIT_REPO=https://github.com/ceph/ceph-container.git, GIT_CLEAN=True, com.redhat.component=rhceph-container, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, description=Red Hat Ceph Storage 7, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, CEPH_POINT_RELEASE=, io.buildah.version=1.41.4, ceph=True, release=1763362218, RELEASE=main, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, architecture=x86_64, name=rhceph, version=7, vendor=Red Hat, Inc.) Dec 15 04:50:30 localhost nova_compute[286344]: 2025-12-15 09:50:30.066 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 04:50:30 localhost systemd[1]: libpod-564af69eec1a96fb0db8467197b247540b861a70c19bbd8c9d02312f38052f84.scope: Deactivated successfully. 
Dec 15 04:50:30 localhost podman[293674]: 2025-12-15 09:50:30.136961271 +0000 UTC m=+0.251538374 container died 564af69eec1a96fb0db8467197b247540b861a70c19bbd8c9d02312f38052f84 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=jolly_payne, GIT_REPO=https://github.com/ceph/ceph-container.git, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, GIT_CLEAN=True, io.k8s.description=Red Hat Ceph Storage 7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., description=Red Hat Ceph Storage 7, GIT_BRANCH=main, io.openshift.expose-services=, io.openshift.tags=rhceph ceph, vendor=Red Hat, Inc., org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, RELEASE=main, architecture=x86_64, release=1763362218, url=https://catalog.redhat.com/en/search?searchType=containers, distribution-scope=public, ceph=True, CEPH_POINT_RELEASE=, io.buildah.version=1.41.4, vcs-type=git, version=7, maintainer=Guillaume Abrioux , io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, build-date=2025-11-26T19:44:28Z, com.redhat.component=rhceph-container, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, name=rhceph, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0) Dec 15 04:50:30 localhost podman[293716]: 2025-12-15 09:50:30.231412145 +0000 UTC m=+0.082247905 container remove 564af69eec1a96fb0db8467197b247540b861a70c19bbd8c9d02312f38052f84 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=jolly_payne, RELEASE=main, distribution-scope=public, description=Red Hat Ceph Storage 7, url=https://catalog.redhat.com/en/search?searchType=containers, build-date=2025-11-26T19:44:28Z, vcs-type=git, io.openshift.tags=rhceph ceph, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, architecture=x86_64, CEPH_POINT_RELEASE=, name=rhceph, release=1763362218, vendor=Red Hat, Inc., version=7, io.buildah.version=1.41.4, 
vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, GIT_BRANCH=main, io.k8s.description=Red Hat Ceph Storage 7, GIT_REPO=https://github.com/ceph/ceph-container.git, maintainer=Guillaume Abrioux , io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_CLEAN=True, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, ceph=True, com.redhat.component=rhceph-container, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=)
Dec 15 04:50:30 localhost systemd[1]: libpod-conmon-564af69eec1a96fb0db8467197b247540b861a70c19bbd8c9d02312f38052f84.scope: Deactivated successfully.
Dec 15 04:50:30 localhost systemd[1]: Reloading.
Dec 15 04:50:30 localhost systemd-rc-local-generator[293753]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 15 04:50:30 localhost systemd-sysv-generator[293756]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 15 04:50:30 localhost systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 15 04:50:30 localhost systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload
Dec 15 04:50:30 localhost systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 15 04:50:30 localhost systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 15 04:50:30 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Dec 15 04:50:30 localhost systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload
Dec 15 04:50:30 localhost systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 15 04:50:30 localhost systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 15 04:50:30 localhost systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload
Dec 15 04:50:30 localhost ceph-mgr[292421]: ms_deliver_dispatch: unhandled message 0x55975582af20 mon_map magic: 0 from mon.2 v2:172.18.0.104:3300/0
Dec 15 04:50:30 localhost systemd[1]: var-lib-containers-storage-overlay-a31a293ad76058e5d25646f51c922c94e063f1e4b95a2b49a2b2ccb8e2b38443-merged.mount: Deactivated successfully.
Dec 15 04:50:30 localhost systemd[1]: Reloading.
Dec 15 04:50:30 localhost systemd-sysv-generator[293797]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 15 04:50:30 localhost systemd-rc-local-generator[293793]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 15 04:50:30 localhost systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 15 04:50:30 localhost systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload
Dec 15 04:50:30 localhost systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 15 04:50:30 localhost systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 15 04:50:30 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Dec 15 04:50:30 localhost systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload
Dec 15 04:50:30 localhost systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 15 04:50:30 localhost systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload
Dec 15 04:50:30 localhost systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload
Dec 15 04:50:31 localhost systemd[1]: Starting Ceph mon.np0005559462 for bce17446-41b5-5408-a23e-0b011906b44a...
Dec 15 04:50:31 localhost podman[293857]: Dec 15 04:50:31 localhost podman[293857]: 2025-12-15 09:50:31.391297321 +0000 UTC m=+0.082051079 container create 69c9f90fbffad2f38f53a53e2be5c38fb80cf84a5c01e1ea6e8b126de4a3eecf (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-bce17446-41b5-5408-a23e-0b011906b44a-mon-np0005559462, RELEASE=main, version=7, io.openshift.expose-services=, vcs-type=git, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=Guillaume Abrioux , io.openshift.tags=rhceph ceph, build-date=2025-11-26T19:44:28Z, io.buildah.version=1.41.4, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, architecture=x86_64, com.redhat.component=rhceph-container, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, distribution-scope=public, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., release=1763362218, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, ceph=True, GIT_REPO=https://github.com/ceph/ceph-container.git, io.k8s.description=Red Hat Ceph Storage 7, description=Red Hat Ceph Storage 7, CEPH_POINT_RELEASE=, GIT_BRANCH=main, vendor=Red Hat, Inc., GIT_CLEAN=True, name=rhceph, url=https://catalog.redhat.com/en/search?searchType=containers) Dec 15 04:50:31 localhost podman[293857]: 2025-12-15 09:50:31.356670755 +0000 UTC m=+0.047424493 image pull registry.redhat.io/rhceph/rhceph-7-rhel9:latest Dec 15 04:50:31 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/78e8bfed3971c8ae628e85288100cabc11c29a2235c1f0c43453dda465da0d92/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff) Dec 15 04:50:31 localhost kernel: xfs filesystem being remounted at 
/var/lib/containers/storage/overlay/78e8bfed3971c8ae628e85288100cabc11c29a2235c1f0c43453dda465da0d92/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff) Dec 15 04:50:31 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/78e8bfed3971c8ae628e85288100cabc11c29a2235c1f0c43453dda465da0d92/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff) Dec 15 04:50:31 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/78e8bfed3971c8ae628e85288100cabc11c29a2235c1f0c43453dda465da0d92/merged/var/lib/ceph/mon/ceph-np0005559462 supports timestamps until 2038 (0x7fffffff) Dec 15 04:50:31 localhost podman[293857]: 2025-12-15 09:50:31.467946398 +0000 UTC m=+0.158700146 container init 69c9f90fbffad2f38f53a53e2be5c38fb80cf84a5c01e1ea6e8b126de4a3eecf (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-bce17446-41b5-5408-a23e-0b011906b44a-mon-np0005559462, vcs-type=git, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.k8s.description=Red Hat Ceph Storage 7, io.buildah.version=1.41.4, io.openshift.expose-services=, GIT_REPO=https://github.com/ceph/ceph-container.git, ceph=True, distribution-scope=public, RELEASE=main, url=https://catalog.redhat.com/en/search?searchType=containers, maintainer=Guillaume Abrioux , GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, architecture=x86_64, description=Red Hat Ceph Storage 7, CEPH_POINT_RELEASE=, io.openshift.tags=rhceph ceph, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, release=1763362218, com.redhat.component=rhceph-container, GIT_BRANCH=main, build-date=2025-11-26T19:44:28Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_CLEAN=True, name=rhceph, 
vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, vendor=Red Hat, Inc.) Dec 15 04:50:31 localhost podman[293857]: 2025-12-15 09:50:31.476432304 +0000 UTC m=+0.167186032 container start 69c9f90fbffad2f38f53a53e2be5c38fb80cf84a5c01e1ea6e8b126de4a3eecf (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-bce17446-41b5-5408-a23e-0b011906b44a-mon-np0005559462, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., name=rhceph, url=https://catalog.redhat.com/en/search?searchType=containers, CEPH_POINT_RELEASE=, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, version=7, io.buildah.version=1.41.4, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, vendor=Red Hat, Inc., GIT_BRANCH=main, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.openshift.expose-services=, GIT_REPO=https://github.com/ceph/ceph-container.git, io.k8s.description=Red Hat Ceph Storage 7, GIT_CLEAN=True, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, RELEASE=main, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhceph ceph, description=Red Hat Ceph Storage 7, vcs-type=git, com.redhat.component=rhceph-container, ceph=True, maintainer=Guillaume Abrioux , build-date=2025-11-26T19:44:28Z, release=1763362218, distribution-scope=public) Dec 15 04:50:31 localhost bash[293857]: 69c9f90fbffad2f38f53a53e2be5c38fb80cf84a5c01e1ea6e8b126de4a3eecf Dec 15 04:50:31 localhost systemd[1]: Started Ceph mon.np0005559462 for bce17446-41b5-5408-a23e-0b011906b44a. 
Dec 15 04:50:31 localhost ceph-mon[293875]: set uid:gid to 167:167 (ceph:ceph)
Dec 15 04:50:31 localhost ceph-mon[293875]: ceph version 18.2.1-361.el9cp (439dcd6094d413840eb2ec590fe2194ec616687f) reef (stable), process ceph-mon, pid 2
Dec 15 04:50:31 localhost ceph-mon[293875]: pidfile_write: ignore empty --pid-file
Dec 15 04:50:31 localhost ceph-mon[293875]: load: jerasure load: lrc
Dec 15 04:50:31 localhost ceph-mon[293875]: rocksdb: RocksDB version: 7.9.2
Dec 15 04:50:31 localhost ceph-mon[293875]: rocksdb: Git sha 0
Dec 15 04:50:31 localhost ceph-mon[293875]: rocksdb: Compile date 2025-09-23 00:00:00
Dec 15 04:50:31 localhost ceph-mon[293875]: rocksdb: DB SUMMARY
Dec 15 04:50:31 localhost ceph-mon[293875]: rocksdb: DB Session ID: 227D9X1HL8Q2PJIK8L6U
Dec 15 04:50:31 localhost ceph-mon[293875]: rocksdb: CURRENT file: CURRENT
Dec 15 04:50:31 localhost ceph-mon[293875]: rocksdb: IDENTITY file: IDENTITY
Dec 15 04:50:31 localhost ceph-mon[293875]: rocksdb: MANIFEST file: MANIFEST-000005 size: 59 Bytes
Dec 15 04:50:31 localhost ceph-mon[293875]: rocksdb: SST files in /var/lib/ceph/mon/ceph-np0005559462/store.db dir, Total Num: 0, files:
Dec 15 04:50:31 localhost ceph-mon[293875]: rocksdb: Write Ahead Log file in /var/lib/ceph/mon/ceph-np0005559462/store.db: 000004.log size: 886 ;
Dec 15 04:50:31 localhost ceph-mon[293875]: rocksdb: Options.error_if_exists: 0
Dec 15 04:50:31 localhost ceph-mon[293875]: rocksdb: Options.create_if_missing: 0
Dec 15 04:50:31 localhost ceph-mon[293875]: rocksdb: Options.paranoid_checks: 1
Dec 15 04:50:31 localhost ceph-mon[293875]: rocksdb: Options.flush_verify_memtable_count: 1
Dec 15 04:50:31 localhost ceph-mon[293875]: rocksdb: Options.track_and_verify_wals_in_manifest: 0
Dec 15 04:50:31 localhost ceph-mon[293875]: rocksdb: Options.verify_sst_unique_id_in_manifest: 1
Dec 15 04:50:31 localhost ceph-mon[293875]: rocksdb: Options.env: 0x562d870269e0
Dec 15 04:50:31 localhost ceph-mon[293875]: rocksdb: Options.fs: PosixFileSystem
Dec 15 04:50:31 localhost ceph-mon[293875]: rocksdb: Options.info_log: 0x562d8838cd20
Dec 15 04:50:31 localhost ceph-mon[293875]: rocksdb: Options.max_file_opening_threads: 16
Dec 15 04:50:31 localhost ceph-mon[293875]: rocksdb: Options.statistics: (nil)
Dec 15 04:50:31 localhost ceph-mon[293875]: rocksdb: Options.use_fsync: 0
Dec 15 04:50:31 localhost ceph-mon[293875]: rocksdb: Options.max_log_file_size: 0
Dec 15 04:50:31 localhost ceph-mon[293875]: rocksdb: Options.max_manifest_file_size: 1073741824
Dec 15 04:50:31 localhost ceph-mon[293875]: rocksdb: Options.log_file_time_to_roll: 0
Dec 15 04:50:31 localhost ceph-mon[293875]: rocksdb: Options.keep_log_file_num: 1000
Dec 15 04:50:31 localhost ceph-mon[293875]: rocksdb: Options.recycle_log_file_num: 0
Dec 15 04:50:31 localhost ceph-mon[293875]: rocksdb: Options.allow_fallocate: 1
Dec 15 04:50:31 localhost ceph-mon[293875]: rocksdb: Options.allow_mmap_reads: 0
Dec 15 04:50:31 localhost ceph-mon[293875]: rocksdb: Options.allow_mmap_writes: 0
Dec 15 04:50:31 localhost ceph-mon[293875]: rocksdb: Options.use_direct_reads: 0
Dec 15 04:50:31 localhost ceph-mon[293875]: rocksdb: Options.use_direct_io_for_flush_and_compaction: 0
Dec 15 04:50:31 localhost ceph-mon[293875]: rocksdb: Options.create_missing_column_families: 0
Dec 15 04:50:31 localhost ceph-mon[293875]: rocksdb: Options.db_log_dir:
Dec 15 04:50:31 localhost ceph-mon[293875]: rocksdb: Options.wal_dir:
Dec 15 04:50:31 localhost ceph-mon[293875]: rocksdb: Options.table_cache_numshardbits: 6
Dec 15 04:50:31 localhost ceph-mon[293875]: rocksdb: Options.WAL_ttl_seconds: 0
Dec 15 04:50:31 localhost ceph-mon[293875]: rocksdb: Options.WAL_size_limit_MB: 0
Dec 15 04:50:31 localhost ceph-mon[293875]: rocksdb: Options.max_write_batch_group_size_bytes: 1048576
Dec 15 04:50:31 localhost ceph-mon[293875]: rocksdb: Options.manifest_preallocation_size: 4194304
Dec 15 04:50:31 localhost ceph-mon[293875]: rocksdb: Options.is_fd_close_on_exec: 1
Dec 15 04:50:31 localhost ceph-mon[293875]: rocksdb: Options.advise_random_on_open: 1
Dec 15 04:50:31 localhost ceph-mon[293875]: rocksdb: Options.db_write_buffer_size: 0
Dec 15 04:50:31 localhost ceph-mon[293875]: rocksdb: Options.write_buffer_manager: 0x562d8839d540
Dec 15 04:50:31 localhost ceph-mon[293875]: rocksdb: Options.access_hint_on_compaction_start: 1
Dec 15 04:50:31 localhost ceph-mon[293875]: rocksdb: Options.random_access_max_buffer_size: 1048576
Dec 15 04:50:31 localhost ceph-mon[293875]: rocksdb: Options.use_adaptive_mutex: 0
Dec 15 04:50:31 localhost ceph-mon[293875]: rocksdb: Options.rate_limiter: (nil)
Dec 15 04:50:31 localhost ceph-mon[293875]: rocksdb: Options.sst_file_manager.rate_bytes_per_sec: 0
Dec 15 04:50:31 localhost ceph-mon[293875]: rocksdb: Options.wal_recovery_mode: 2
Dec 15 04:50:31 localhost ceph-mon[293875]: rocksdb: Options.enable_thread_tracking: 0
Dec 15 04:50:31 localhost ceph-mon[293875]: rocksdb: Options.enable_pipelined_write: 0
Dec 15 04:50:31 localhost ceph-mon[293875]: rocksdb: Options.unordered_write: 0
Dec 15 04:50:31 localhost ceph-mon[293875]: rocksdb: Options.allow_concurrent_memtable_write: 1
Dec 15 04:50:31 localhost ceph-mon[293875]: rocksdb: Options.enable_write_thread_adaptive_yield: 1
Dec 15 04:50:31 localhost ceph-mon[293875]: rocksdb: Options.write_thread_max_yield_usec: 100
Dec 15 04:50:31 localhost ceph-mon[293875]: rocksdb: Options.write_thread_slow_yield_usec: 3
Dec 15 04:50:31 localhost ceph-mon[293875]: rocksdb: Options.row_cache: None
Dec 15 04:50:31 localhost ceph-mon[293875]: rocksdb: Options.wal_filter: None
Dec 15 04:50:31 localhost ceph-mon[293875]: rocksdb: Options.avoid_flush_during_recovery: 0
Dec 15 04:50:31 localhost ceph-mon[293875]: rocksdb: Options.allow_ingest_behind: 0
Dec 15 04:50:31 localhost ceph-mon[293875]: rocksdb: Options.two_write_queues: 0
Dec 15 04:50:31 localhost ceph-mon[293875]: rocksdb: Options.manual_wal_flush: 0
Dec 15 04:50:31 localhost ceph-mon[293875]: rocksdb: Options.wal_compression: 0
Dec 15 04:50:31 localhost ceph-mon[293875]: rocksdb: Options.atomic_flush: 0
Dec 15 04:50:31 localhost ceph-mon[293875]: rocksdb: Options.avoid_unnecessary_blocking_io: 0
Dec 15 04:50:31 localhost ceph-mon[293875]: rocksdb: Options.persist_stats_to_disk: 0
Dec 15 04:50:31 localhost ceph-mon[293875]: rocksdb: Options.write_dbid_to_manifest: 0
Dec 15 04:50:31 localhost ceph-mon[293875]: rocksdb: Options.log_readahead_size: 0
Dec 15 04:50:31 localhost ceph-mon[293875]: rocksdb: Options.file_checksum_gen_factory: Unknown
Dec 15 04:50:31 localhost ceph-mon[293875]: rocksdb: Options.best_efforts_recovery: 0
Dec 15 04:50:31 localhost ceph-mon[293875]: rocksdb: Options.max_bgerror_resume_count: 2147483647
Dec 15 04:50:31 localhost ceph-mon[293875]: rocksdb: Options.bgerror_resume_retry_interval: 1000000
Dec 15 04:50:31 localhost ceph-mon[293875]: rocksdb: Options.allow_data_in_errors: 0
Dec 15 04:50:31 localhost ceph-mon[293875]: rocksdb: Options.db_host_id: __hostname__
Dec 15 04:50:31 localhost ceph-mon[293875]: rocksdb: Options.enforce_single_del_contracts: true
Dec 15 04:50:31 localhost ceph-mon[293875]: rocksdb: Options.max_background_jobs: 2
Dec 15 04:50:31 localhost ceph-mon[293875]: rocksdb: Options.max_background_compactions: -1
Dec 15 04:50:31 localhost ceph-mon[293875]: rocksdb: Options.max_subcompactions: 1
Dec 15 04:50:31 localhost ceph-mon[293875]: rocksdb: Options.avoid_flush_during_shutdown: 0
Dec 15 04:50:31 localhost ceph-mon[293875]: rocksdb: Options.writable_file_max_buffer_size: 1048576
Dec 15 04:50:31 localhost ceph-mon[293875]: rocksdb: Options.delayed_write_rate : 16777216
Dec 15 04:50:31 localhost ceph-mon[293875]: rocksdb: Options.max_total_wal_size: 0
Dec 15 04:50:31 localhost ceph-mon[293875]: rocksdb: Options.delete_obsolete_files_period_micros: 21600000000
Dec 15 04:50:31 localhost ceph-mon[293875]: rocksdb: Options.stats_dump_period_sec: 600
Dec 15 04:50:31 localhost ceph-mon[293875]: rocksdb: Options.stats_persist_period_sec: 600
Dec 15 04:50:31 localhost ceph-mon[293875]: rocksdb: Options.stats_history_buffer_size: 1048576
Dec 15 04:50:31 localhost ceph-mon[293875]: rocksdb: Options.max_open_files: -1
Dec 15 04:50:31 localhost ceph-mon[293875]: rocksdb: Options.bytes_per_sync: 0
Dec 15 04:50:31 localhost ceph-mon[293875]: rocksdb: Options.wal_bytes_per_sync: 0
Dec 15 04:50:31 localhost ceph-mon[293875]: rocksdb: Options.strict_bytes_per_sync: 0
Dec 15 04:50:31 localhost ceph-mon[293875]: rocksdb: Options.compaction_readahead_size: 0
Dec 15 04:50:31 localhost ceph-mon[293875]: rocksdb: Options.max_background_flushes: -1
Dec 15 04:50:31 localhost ceph-mon[293875]: rocksdb: Compression algorithms supported:
Dec 15 04:50:31 localhost ceph-mon[293875]: rocksdb: #011kZSTD supported: 0
Dec 15 04:50:31 localhost ceph-mon[293875]: rocksdb: #011kXpressCompression supported: 0
Dec 15 04:50:31 localhost ceph-mon[293875]: rocksdb: #011kBZip2Compression supported: 0
Dec 15 04:50:31 localhost ceph-mon[293875]: rocksdb: #011kZSTDNotFinalCompression supported: 0
Dec 15 04:50:31 localhost ceph-mon[293875]: rocksdb: #011kLZ4Compression supported: 1
Dec 15 04:50:31 localhost ceph-mon[293875]: rocksdb: #011kZlibCompression supported: 1
Dec 15 04:50:31 localhost ceph-mon[293875]: rocksdb: #011kLZ4HCCompression supported: 1
Dec 15 04:50:31 localhost ceph-mon[293875]: rocksdb: #011kSnappyCompression supported: 1
Dec 15 04:50:31 localhost ceph-mon[293875]: rocksdb: Fast CRC32 supported: Supported on x86
Dec 15 04:50:31 localhost ceph-mon[293875]: rocksdb: DMutex implementation: pthread_mutex_t
Dec 15 04:50:31 localhost ceph-mon[293875]: rocksdb: [db/version_set.cc:5527] Recovering from manifest file: /var/lib/ceph/mon/ceph-np0005559462/store.db/MANIFEST-000005
Dec 15 04:50:31 localhost ceph-mon[293875]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [default]:
Dec 15 04:50:31 localhost ceph-mon[293875]: rocksdb: Options.comparator: leveldb.BytewiseComparator
Dec 15 04:50:31 localhost ceph-mon[293875]: rocksdb: Options.merge_operator:
Dec 15 04:50:31 localhost ceph-mon[293875]: rocksdb: Options.compaction_filter: None
Dec 15 04:50:31 localhost ceph-mon[293875]: rocksdb: Options.compaction_filter_factory: None
Dec 15 04:50:31 localhost ceph-mon[293875]: rocksdb: Options.sst_partitioner_factory: None
Dec 15 04:50:31 localhost ceph-mon[293875]: rocksdb: Options.memtable_factory: SkipListFactory
Dec 15 04:50:31 localhost ceph-mon[293875]: rocksdb: Options.table_factory: BlockBasedTable
Dec 15 04:50:31 localhost ceph-mon[293875]: rocksdb: table_factory options: flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x562d8838c980)#012 cache_index_and_filter_blocks: 1#012 cache_index_and_filter_blocks_with_high_priority: 0#012 pin_l0_filter_and_index_blocks_in_cache: 0#012 pin_top_level_index_and_filter: 1#012 index_type: 0#012 data_block_index_type: 0#012 index_shortening: 1#012 data_block_hash_table_util_ratio: 0.750000#012 checksum: 4#012 no_block_cache: 0#012 block_cache: 0x562d88389350#012 block_cache_name: BinnedLRUCache#012 block_cache_options:#012 capacity : 536870912#012 num_shard_bits : 4#012 strict_capacity_limit : 0#012 high_pri_pool_ratio: 0.000#012 block_cache_compressed: (nil)#012 persistent_cache: (nil)#012 block_size: 4096#012 block_size_deviation: 10#012 block_restart_interval: 16#012 index_block_restart_interval: 1#012 metadata_block_size: 4096#012 partition_filters: 0#012 use_delta_encoding: 1#012 filter_policy: bloomfilter#012 whole_key_filtering: 1#012 verify_compression: 0#012 read_amp_bytes_per_bit: 0#012 format_version: 5#012 enable_index_compression: 1#012 block_align: 0#012 max_auto_readahead_size: 262144#012 prepopulate_block_cache: 0#012 initial_auto_readahead_size: 8192#012 num_file_reads_for_auto_readahead: 2
Dec 15 04:50:31 localhost ceph-mon[293875]: rocksdb: Options.write_buffer_size: 33554432
Dec 15 04:50:31 localhost ceph-mon[293875]: rocksdb: Options.max_write_buffer_number: 2
Dec 15 04:50:31 localhost
ceph-mon[293875]: rocksdb: Options.compression: NoCompression Dec 15 04:50:31 localhost ceph-mon[293875]: rocksdb: Options.bottommost_compression: Disabled Dec 15 04:50:31 localhost ceph-mon[293875]: rocksdb: Options.prefix_extractor: nullptr Dec 15 04:50:31 localhost ceph-mon[293875]: rocksdb: Options.memtable_insert_with_hint_prefix_extractor: nullptr Dec 15 04:50:31 localhost ceph-mon[293875]: rocksdb: Options.num_levels: 7 Dec 15 04:50:31 localhost ceph-mon[293875]: rocksdb: Options.min_write_buffer_number_to_merge: 1 Dec 15 04:50:31 localhost ceph-mon[293875]: rocksdb: Options.max_write_buffer_number_to_maintain: 0 Dec 15 04:50:31 localhost ceph-mon[293875]: rocksdb: Options.max_write_buffer_size_to_maintain: 0 Dec 15 04:50:31 localhost ceph-mon[293875]: rocksdb: Options.bottommost_compression_opts.window_bits: -14 Dec 15 04:50:31 localhost ceph-mon[293875]: rocksdb: Options.bottommost_compression_opts.level: 32767 Dec 15 04:50:31 localhost ceph-mon[293875]: rocksdb: Options.bottommost_compression_opts.strategy: 0 Dec 15 04:50:31 localhost ceph-mon[293875]: rocksdb: Options.bottommost_compression_opts.max_dict_bytes: 0 Dec 15 04:50:31 localhost ceph-mon[293875]: rocksdb: Options.bottommost_compression_opts.zstd_max_train_bytes: 0 Dec 15 04:50:31 localhost ceph-mon[293875]: rocksdb: Options.bottommost_compression_opts.parallel_threads: 1 Dec 15 04:50:31 localhost ceph-mon[293875]: rocksdb: Options.bottommost_compression_opts.enabled: false Dec 15 04:50:31 localhost ceph-mon[293875]: rocksdb: Options.bottommost_compression_opts.max_dict_buffer_bytes: 0 Dec 15 04:50:31 localhost ceph-mon[293875]: rocksdb: Options.bottommost_compression_opts.use_zstd_dict_trainer: true Dec 15 04:50:31 localhost ceph-mon[293875]: rocksdb: Options.compression_opts.window_bits: -14 Dec 15 04:50:31 localhost ceph-mon[293875]: rocksdb: Options.compression_opts.level: 32767 Dec 15 04:50:31 localhost ceph-mon[293875]: rocksdb: Options.compression_opts.strategy: 0 Dec 15 04:50:31 
localhost ceph-mon[293875]: rocksdb: Options.compression_opts.max_dict_bytes: 0 Dec 15 04:50:31 localhost ceph-mon[293875]: rocksdb: Options.compression_opts.zstd_max_train_bytes: 0 Dec 15 04:50:31 localhost ceph-mon[293875]: rocksdb: Options.compression_opts.use_zstd_dict_trainer: true Dec 15 04:50:31 localhost ceph-mon[293875]: rocksdb: Options.compression_opts.parallel_threads: 1 Dec 15 04:50:31 localhost ceph-mon[293875]: rocksdb: Options.compression_opts.enabled: false Dec 15 04:50:31 localhost ceph-mon[293875]: rocksdb: Options.compression_opts.max_dict_buffer_bytes: 0 Dec 15 04:50:31 localhost ceph-mon[293875]: rocksdb: Options.level0_file_num_compaction_trigger: 4 Dec 15 04:50:31 localhost ceph-mon[293875]: rocksdb: Options.level0_slowdown_writes_trigger: 20 Dec 15 04:50:31 localhost ceph-mon[293875]: rocksdb: Options.level0_stop_writes_trigger: 36 Dec 15 04:50:31 localhost ceph-mon[293875]: rocksdb: Options.target_file_size_base: 67108864 Dec 15 04:50:31 localhost ceph-mon[293875]: rocksdb: Options.target_file_size_multiplier: 1 Dec 15 04:50:31 localhost ceph-mon[293875]: rocksdb: Options.max_bytes_for_level_base: 268435456 Dec 15 04:50:31 localhost ceph-mon[293875]: rocksdb: Options.level_compaction_dynamic_level_bytes: 1 Dec 15 04:50:31 localhost ceph-mon[293875]: rocksdb: Options.max_bytes_for_level_multiplier: 10.000000 Dec 15 04:50:31 localhost ceph-mon[293875]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1 Dec 15 04:50:31 localhost ceph-mon[293875]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1 Dec 15 04:50:31 localhost ceph-mon[293875]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1 Dec 15 04:50:31 localhost ceph-mon[293875]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1 Dec 15 04:50:31 localhost ceph-mon[293875]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1 Dec 15 04:50:31 localhost ceph-mon[293875]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1 Dec 15 04:50:31 
localhost ceph-mon[293875]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1 Dec 15 04:50:31 localhost ceph-mon[293875]: rocksdb: Options.max_sequential_skip_in_iterations: 8 Dec 15 04:50:31 localhost ceph-mon[293875]: rocksdb: Options.max_compaction_bytes: 1677721600 Dec 15 04:50:31 localhost ceph-mon[293875]: rocksdb: Options.ignore_max_compaction_bytes_for_input: true Dec 15 04:50:31 localhost ceph-mon[293875]: rocksdb: Options.arena_block_size: 1048576 Dec 15 04:50:31 localhost ceph-mon[293875]: rocksdb: Options.soft_pending_compaction_bytes_limit: 68719476736 Dec 15 04:50:31 localhost ceph-mon[293875]: rocksdb: Options.hard_pending_compaction_bytes_limit: 274877906944 Dec 15 04:50:31 localhost ceph-mon[293875]: rocksdb: Options.disable_auto_compactions: 0 Dec 15 04:50:31 localhost ceph-mon[293875]: rocksdb: Options.compaction_style: kCompactionStyleLevel Dec 15 04:50:31 localhost ceph-mon[293875]: rocksdb: Options.compaction_pri: kMinOverlappingRatio Dec 15 04:50:31 localhost ceph-mon[293875]: rocksdb: Options.compaction_options_universal.size_ratio: 1 Dec 15 04:50:31 localhost ceph-mon[293875]: rocksdb: Options.compaction_options_universal.min_merge_width: 2 Dec 15 04:50:31 localhost ceph-mon[293875]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295 Dec 15 04:50:31 localhost ceph-mon[293875]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200 Dec 15 04:50:31 localhost ceph-mon[293875]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1 Dec 15 04:50:31 localhost ceph-mon[293875]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize Dec 15 04:50:31 localhost ceph-mon[293875]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824 Dec 15 04:50:31 localhost ceph-mon[293875]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0 Dec 15 04:50:31 localhost ceph-mon[293875]: rocksdb: 
Options.table_properties_collectors: Dec 15 04:50:31 localhost ceph-mon[293875]: rocksdb: Options.inplace_update_support: 0 Dec 15 04:50:31 localhost ceph-mon[293875]: rocksdb: Options.inplace_update_num_locks: 10000 Dec 15 04:50:31 localhost ceph-mon[293875]: rocksdb: Options.memtable_prefix_bloom_size_ratio: 0.000000 Dec 15 04:50:31 localhost ceph-mon[293875]: rocksdb: Options.memtable_whole_key_filtering: 0 Dec 15 04:50:31 localhost ceph-mon[293875]: rocksdb: Options.memtable_huge_page_size: 0 Dec 15 04:50:31 localhost ceph-mon[293875]: rocksdb: Options.bloom_locality: 0 Dec 15 04:50:31 localhost ceph-mon[293875]: rocksdb: Options.max_successive_merges: 0 Dec 15 04:50:31 localhost ceph-mon[293875]: rocksdb: Options.optimize_filters_for_hits: 0 Dec 15 04:50:31 localhost ceph-mon[293875]: rocksdb: Options.paranoid_file_checks: 0 Dec 15 04:50:31 localhost ceph-mon[293875]: rocksdb: Options.force_consistency_checks: 1 Dec 15 04:50:31 localhost ceph-mon[293875]: rocksdb: Options.report_bg_io_stats: 0 Dec 15 04:50:31 localhost ceph-mon[293875]: rocksdb: Options.ttl: 2592000 Dec 15 04:50:31 localhost ceph-mon[293875]: rocksdb: Options.periodic_compaction_seconds: 0 Dec 15 04:50:31 localhost ceph-mon[293875]: rocksdb: Options.preclude_last_level_data_seconds: 0 Dec 15 04:50:31 localhost ceph-mon[293875]: rocksdb: Options.preserve_internal_time_seconds: 0 Dec 15 04:50:31 localhost ceph-mon[293875]: rocksdb: Options.enable_blob_files: false Dec 15 04:50:31 localhost ceph-mon[293875]: rocksdb: Options.min_blob_size: 0 Dec 15 04:50:31 localhost ceph-mon[293875]: rocksdb: Options.blob_file_size: 268435456 Dec 15 04:50:31 localhost ceph-mon[293875]: rocksdb: Options.blob_compression_type: NoCompression Dec 15 04:50:31 localhost ceph-mon[293875]: rocksdb: Options.enable_blob_garbage_collection: false Dec 15 04:50:31 localhost ceph-mon[293875]: rocksdb: Options.blob_garbage_collection_age_cutoff: 0.250000 Dec 15 04:50:31 localhost ceph-mon[293875]: rocksdb: 
Options.blob_garbage_collection_force_threshold: 1.000000 Dec 15 04:50:31 localhost ceph-mon[293875]: rocksdb: Options.blob_compaction_readahead_size: 0 Dec 15 04:50:31 localhost ceph-mon[293875]: rocksdb: Options.blob_file_starting_level: 0 Dec 15 04:50:31 localhost ceph-mon[293875]: rocksdb: Options.experimental_mempurge_threshold: 0.000000 Dec 15 04:50:31 localhost ceph-mon[293875]: rocksdb: [db/version_set.cc:5566] Recovered from manifest file:/var/lib/ceph/mon/ceph-np0005559462/store.db/MANIFEST-000005 succeeded,manifest_file_number is 5, next_file_number is 7, last_sequence is 0, log_number is 0,prev_log_number is 0,max_column_family is 0,min_log_number_to_keep is 0 Dec 15 04:50:31 localhost ceph-mon[293875]: rocksdb: [db/version_set.cc:5581] Column family [default] (ID 0), log number is 0 Dec 15 04:50:31 localhost ceph-mon[293875]: rocksdb: [db/db_impl/db_impl_open.cc:539] DB ID: 8a79eac5-c138-4069-90b4-52dac737d5cc Dec 15 04:50:31 localhost ceph-mon[293875]: rocksdb: EVENT_LOG_v1 {"time_micros": 1765792231532712, "job": 1, "event": "recovery_started", "wal_files": [4]} Dec 15 04:50:31 localhost ceph-mon[293875]: rocksdb: [db/db_impl/db_impl_open.cc:1043] Recovering log #4 mode 2 Dec 15 04:50:31 localhost ceph-mon[293875]: rocksdb: EVENT_LOG_v1 {"time_micros": 1765792231535535, "cf_name": "default", "job": 1, "event": "table_file_creation", "file_number": 8, "file_size": 2012, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 1, "largest_seqno": 5, "table_properties": {"data_size": 898, "index_size": 31, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 69, "raw_key_size": 115, "raw_average_key_size": 23, "raw_value_size": 776, "raw_average_value_size": 155, "num_data_blocks": 1, "num_entries": 5, "num_filter_entries": 5, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": 
"bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1765792231, "oldest_key_time": 0, "file_creation_time": 0, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "8a79eac5-c138-4069-90b4-52dac737d5cc", "db_session_id": "227D9X1HL8Q2PJIK8L6U", "orig_file_number": 8, "seqno_to_time_mapping": "N/A"}} Dec 15 04:50:31 localhost ceph-mon[293875]: rocksdb: EVENT_LOG_v1 {"time_micros": 1765792231535713, "job": 1, "event": "recovery_finished"} Dec 15 04:50:31 localhost ceph-mon[293875]: rocksdb: [db/version_set.cc:5047] Creating manifest 10 Dec 15 04:50:31 localhost ceph-mon[293875]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005559462/store.db/000004.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000 Dec 15 04:50:31 localhost ceph-mon[293875]: rocksdb: [db/db_impl/db_impl_open.cc:1987] SstFileManager instance 0x562d883b0e00 Dec 15 04:50:31 localhost ceph-mon[293875]: rocksdb: DB pointer 0x562d884a6000 Dec 15 04:50:31 localhost ceph-mon[293875]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS ------- Dec 15 04:50:31 localhost ceph-mon[293875]: rocksdb: [db/db_impl/db_impl.cc:1111] #012** DB Stats **#012Uptime(secs): 0.0 total, 0.0 interval#012Cumulative writes: 0 writes, 0 keys, 0 commit groups, 0.0 writes per commit group, ingest: 0.00 GB, 0.00 MB/s#012Cumulative WAL: 0 writes, 0 syncs, 0.00 writes per sync, written: 0.00 GB, 0.00 MB/s#012Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent#012Interval writes: 0 writes, 0 keys, 0 commit groups, 0.0 writes per commit group, ingest: 0.00 MB, 
0.00 MB/s#012Interval WAL: 0 writes, 0 syncs, 0.00 writes per sync, written: 0.00 GB, 0.00 MB/s#012Interval stall: 00:00:0.000 H:M:S, 0.0 percent#012#012** Compaction Stats [default] **#012Level Files Size Score Read(GB) Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012 L0 1/0 1.96 KB 0.2 0.0 0.0 0.0 0.0 0.0 0.0 1.0 0.0 0.7 0.00 0.00 1 0.003 0 0 0.0 0.0#012 Sum 1/0 1.96 KB 0.0 0.0 0.0 0.0 0.0 0.0 0.0 1.0 0.0 0.7 0.00 0.00 1 0.003 0 0 0.0 0.0#012 Int 0/0 0.00 KB 0.0 0.0 0.0 0.0 0.0 0.0 0.0 1.0 0.0 0.7 0.00 0.00 1 0.003 0 0 0.0 0.0#012#012** Compaction Stats [default] **#012Priority Files Size Score Read(GB) Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012User 0/0 0.00 KB 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.7 0.00 0.00 1 0.003 0 0 0.0 0.0#012#012Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0#012#012Uptime(secs): 0.0 total, 0.0 interval#012Flush(GB): cumulative 0.000, interval 0.000#012AddFile(GB): cumulative 0.000, interval 0.000#012AddFile(Total Files): cumulative 0, interval 0#012AddFile(L0 Files): cumulative 0, interval 0#012AddFile(Keys): cumulative 0, interval 0#012Cumulative compaction: 0.00 GB write, 0.13 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds#012Interval compaction: 0.00 GB write, 0.13 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds#012Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 
level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count#012Block cache BinnedLRUCache@0x562d88389350#2 capacity: 512.00 MB usage: 0.22 KB table_size: 0 occupancy: 18446744073709551615 collections: 1 last_copies: 0 last_secs: 1.9e-05 secs_since: 0#012Block cache entry stats(count,size,portion): FilterBlock(1,0.11 KB,2.08616e-05%) IndexBlock(1,0.11 KB,2.08616e-05%) Misc(1,0.00 KB,0%)#012#012** File Read Latency Histogram By Level [default] ** Dec 15 04:50:31 localhost ceph-mon[293875]: mon.np0005559462 does not exist in monmap, will attempt to join an existing cluster Dec 15 04:50:31 localhost ceph-mon[293875]: using public_addr v2:172.18.0.106:0/0 -> [v2:172.18.0.106:3300/0,v1:172.18.0.106:6789/0] Dec 15 04:50:31 localhost ceph-mon[293875]: starting mon.np0005559462 rank -1 at public addrs [v2:172.18.0.106:3300/0,v1:172.18.0.106:6789/0] at bind addrs [v2:172.18.0.106:3300/0,v1:172.18.0.106:6789/0] mon_data /var/lib/ceph/mon/ceph-np0005559462 fsid bce17446-41b5-5408-a23e-0b011906b44a Dec 15 04:50:31 localhost ceph-mon[293875]: mon.np0005559462@-1(???) 
e0 preinit fsid bce17446-41b5-5408-a23e-0b011906b44a Dec 15 04:50:31 localhost ceph-mon[293875]: mon.np0005559462@-1(synchronizing) e5 sync_obtain_latest_monmap Dec 15 04:50:31 localhost ceph-mon[293875]: mon.np0005559462@-1(synchronizing) e5 sync_obtain_latest_monmap obtained monmap e5 Dec 15 04:50:31 localhost ceph-mon[293875]: mon.np0005559462@-1(synchronizing).mds e17 new map Dec 15 04:50:31 localhost ceph-mon[293875]: mon.np0005559462@-1(synchronizing).mds e17 print_map#012e17#012enable_multiple, ever_enabled_multiple: 1,1#012default compat: compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,8=no anchor table,9=file layout v2,10=snaprealm v2,12=quiesce subvolumes}#012legacy client fscid: 1#012 #012Filesystem 'cephfs' (1)#012fs_name#011cephfs#012epoch#01116#012flags#01112 joinable allow_snaps allow_multimds_snaps#012created#0112025-12-15T08:04:09.300216+0000#012modified#0112025-12-15T09:49:18.667189+0000#012tableserver#0110#012root#0110#012session_timeout#01160#012session_autoclose#011300#012max_file_size#0111099511627776#012required_client_features#011{}#012last_failure#0110#012last_failure_osd_epoch#01182#012compat#011compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,7=mds uses inline data,8=no anchor table,9=file layout v2,10=snaprealm v2,12=quiesce subvolumes}#012max_mds#0111#012in#0110#012up#011{0=26777}#012failed#011#012damaged#011#012stopped#011#012data_pools#011[6]#012metadata_pool#0117#012inline_data#011disabled#012balancer#011#012bal_rank_mask#011-1#012standby_count_wanted#0111#012qdb_cluster#011leader: 26777 members: 26777#012[mds.mds.np0005559463.rdpgze{0:26777} state up:active seq 13 addr [v2:172.18.0.107:6808/434194913,v1:172.18.0.107:6809/434194913] compat 
{c=[1],r=[1],i=[17ff]}]#012 #012 #012Standby daemons:#012 #012[mds.mds.np0005559462.mhigvc{-1:16959} state up:standby seq 1 addr [v2:172.18.0.106:6808/1713185344,v1:172.18.0.106:6809/1713185344] compat {c=[1],r=[1],i=[17ff]}]#012[mds.mds.np0005559464.piyuji{-1:26458} state up:standby seq 1 addr [v2:172.18.0.108:6808/2660138834,v1:172.18.0.108:6809/2660138834] compat {c=[1],r=[1],i=[17ff]}] Dec 15 04:50:31 localhost ceph-mon[293875]: mon.np0005559462@-1(synchronizing).osd e83 crush map has features 3314933000854323200, adjusting msgr requires Dec 15 04:50:31 localhost ceph-mon[293875]: mon.np0005559462@-1(synchronizing).osd e83 crush map has features 432629239337189376, adjusting msgr requires Dec 15 04:50:31 localhost ceph-mon[293875]: mon.np0005559462@-1(synchronizing).osd e83 crush map has features 432629239337189376, adjusting msgr requires Dec 15 04:50:31 localhost ceph-mon[293875]: mon.np0005559462@-1(synchronizing).osd e83 crush map has features 432629239337189376, adjusting msgr requires Dec 15 04:50:31 localhost ceph-mon[293875]: from='mgr.14120 172.18.0.103:0/3499672413' entity='mgr.np0005559459.hhnowu' cmd={"prefix": "auth rm", "entity": "mds.mds.np0005559461.pmmvjk"} : dispatch Dec 15 04:50:31 localhost ceph-mon[293875]: from='mgr.14120 172.18.0.103:0/3499672413' entity='mgr.np0005559459.hhnowu' cmd='[{"prefix": "auth rm", "entity": "mds.mds.np0005559461.pmmvjk"}]': finished Dec 15 04:50:31 localhost ceph-mon[293875]: from='mgr.14120 172.18.0.103:0/3499672413' entity='mgr.np0005559459.hhnowu' Dec 15 04:50:31 localhost ceph-mon[293875]: Removing key for mds.mds.np0005559461.pmmvjk Dec 15 04:50:31 localhost ceph-mon[293875]: Removing daemon mds.mds.np0005559460.lblagm from np0005559460.localdomain -- ports [] Dec 15 04:50:31 localhost ceph-mon[293875]: Removing key for mds.mds.np0005559460.lblagm Dec 15 04:50:31 localhost ceph-mon[293875]: from='mgr.14120 172.18.0.103:0/3499672413' entity='mgr.np0005559459.hhnowu' cmd={"prefix": "auth rm", "entity": 
"mds.mds.np0005559460.lblagm"} : dispatch Dec 15 04:50:31 localhost ceph-mon[293875]: from='mgr.14120 172.18.0.103:0/3499672413' entity='mgr.np0005559459.hhnowu' cmd='[{"prefix": "auth rm", "entity": "mds.mds.np0005559460.lblagm"}]': finished Dec 15 04:50:31 localhost ceph-mon[293875]: from='mgr.14120 172.18.0.103:0/3499672413' entity='mgr.np0005559459.hhnowu' Dec 15 04:50:31 localhost ceph-mon[293875]: from='mgr.14120 172.18.0.103:0/3499672413' entity='mgr.np0005559459.hhnowu' Dec 15 04:50:31 localhost ceph-mon[293875]: from='mgr.14120 172.18.0.103:0/3499672413' entity='mgr.np0005559459.hhnowu' Dec 15 04:50:31 localhost ceph-mon[293875]: from='mgr.14120 172.18.0.103:0/3499672413' entity='mgr.np0005559459.hhnowu' Dec 15 04:50:31 localhost ceph-mon[293875]: from='mgr.14120 172.18.0.103:0/3499672413' entity='mgr.np0005559459.hhnowu' Dec 15 04:50:31 localhost ceph-mon[293875]: from='mgr.14120 172.18.0.103:0/3499672413' entity='mgr.np0005559459.hhnowu' Dec 15 04:50:31 localhost ceph-mon[293875]: from='mgr.14120 172.18.0.103:0/3499672413' entity='mgr.np0005559459.hhnowu' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch Dec 15 04:50:31 localhost ceph-mon[293875]: from='mgr.14120 172.18.0.103:0/3499672413' entity='mgr.np0005559459.hhnowu' Dec 15 04:50:31 localhost ceph-mon[293875]: from='mgr.14120 172.18.0.103:0/3499672413' entity='mgr.np0005559459.hhnowu' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch Dec 15 04:50:31 localhost ceph-mon[293875]: from='mgr.14120 172.18.0.103:0/3499672413' entity='mgr.np0005559459.hhnowu' Dec 15 04:50:31 localhost ceph-mon[293875]: from='mgr.14120 172.18.0.103:0/3499672413' entity='mgr.np0005559459.hhnowu' Dec 15 04:50:31 localhost ceph-mon[293875]: from='mgr.14120 172.18.0.103:0/3499672413' entity='mgr.np0005559459.hhnowu' Dec 15 04:50:31 localhost ceph-mon[293875]: from='mgr.14120 172.18.0.103:0/3499672413' entity='mgr.np0005559459.hhnowu' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch Dec 
15 04:50:31 localhost ceph-mon[293875]: from='mgr.14120 172.18.0.103:0/3499672413' entity='mgr.np0005559459.hhnowu' Dec 15 04:50:31 localhost ceph-mon[293875]: Added label mgr to host np0005559462.localdomain Dec 15 04:50:31 localhost ceph-mon[293875]: from='mgr.14120 172.18.0.103:0/3499672413' entity='mgr.np0005559459.hhnowu' Dec 15 04:50:31 localhost ceph-mon[293875]: from='mgr.14120 172.18.0.103:0/3499672413' entity='mgr.np0005559459.hhnowu' cmd={"prefix": "config rm", "who": "osd.0", "name": "osd_memory_target"} : dispatch Dec 15 04:50:31 localhost ceph-mon[293875]: from='mgr.14120 172.18.0.103:0/3499672413' entity='mgr.np0005559459.hhnowu' cmd={"prefix": "config rm", "who": "osd.1", "name": "osd_memory_target"} : dispatch Dec 15 04:50:31 localhost ceph-mon[293875]: from='mgr.14120 172.18.0.103:0/3499672413' entity='mgr.np0005559459.hhnowu' cmd={"prefix": "config rm", "who": "osd.2", "name": "osd_memory_target"} : dispatch Dec 15 04:50:31 localhost ceph-mon[293875]: from='mgr.14120 172.18.0.103:0/3499672413' entity='mgr.np0005559459.hhnowu' cmd={"prefix": "config rm", "who": "osd.3", "name": "osd_memory_target"} : dispatch Dec 15 04:50:31 localhost ceph-mon[293875]: from='mgr.14120 172.18.0.103:0/3499672413' entity='mgr.np0005559459.hhnowu' cmd={"prefix": "config rm", "who": "osd.4", "name": "osd_memory_target"} : dispatch Dec 15 04:50:31 localhost ceph-mon[293875]: from='mgr.14120 172.18.0.103:0/3499672413' entity='mgr.np0005559459.hhnowu' cmd={"prefix": "config rm", "who": "osd.5", "name": "osd_memory_target"} : dispatch Dec 15 04:50:31 localhost ceph-mon[293875]: Added label mgr to host np0005559463.localdomain Dec 15 04:50:31 localhost ceph-mon[293875]: Adjusting osd_memory_target on np0005559462.localdomain to 3396M Dec 15 04:50:31 localhost ceph-mon[293875]: Adjusting osd_memory_target on np0005559464.localdomain to 3396M Dec 15 04:50:31 localhost ceph-mon[293875]: from='mgr.14120 172.18.0.103:0/3499672413' entity='mgr.np0005559459.hhnowu' Dec 15 04:50:31 
localhost ceph-mon[293875]: Adjusting osd_memory_target on np0005559463.localdomain to 3396M Dec 15 04:50:31 localhost ceph-mon[293875]: from='mgr.14120 172.18.0.103:0/3499672413' entity='mgr.np0005559459.hhnowu' Dec 15 04:50:31 localhost ceph-mon[293875]: from='mgr.14120 172.18.0.103:0/3499672413' entity='mgr.np0005559459.hhnowu' Dec 15 04:50:31 localhost ceph-mon[293875]: from='mgr.14120 172.18.0.103:0/3499672413' entity='mgr.np0005559459.hhnowu' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch Dec 15 04:50:31 localhost ceph-mon[293875]: from='mgr.14120 172.18.0.103:0/3499672413' entity='mgr.np0005559459.hhnowu' Dec 15 04:50:31 localhost ceph-mon[293875]: from='mgr.14120 172.18.0.103:0/3499672413' entity='mgr.np0005559459.hhnowu' Dec 15 04:50:31 localhost ceph-mon[293875]: Added label mgr to host np0005559464.localdomain Dec 15 04:50:31 localhost ceph-mon[293875]: from='mgr.14120 172.18.0.103:0/3499672413' entity='mgr.np0005559459.hhnowu' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch Dec 15 04:50:31 localhost ceph-mon[293875]: from='mgr.14120 172.18.0.103:0/3499672413' entity='mgr.np0005559459.hhnowu' Dec 15 04:50:31 localhost ceph-mon[293875]: Saving service mgr spec with placement label:mgr Dec 15 04:50:31 localhost ceph-mon[293875]: from='mgr.14120 172.18.0.103:0/3499672413' entity='mgr.np0005559459.hhnowu' Dec 15 04:50:31 localhost ceph-mon[293875]: from='mgr.14120 172.18.0.103:0/3499672413' entity='mgr.np0005559459.hhnowu' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch Dec 15 04:50:31 localhost ceph-mon[293875]: from='mgr.14120 172.18.0.103:0/3499672413' entity='mgr.np0005559459.hhnowu' Dec 15 04:50:31 localhost ceph-mon[293875]: from='mgr.14120 172.18.0.103:0/3499672413' entity='mgr.np0005559459.hhnowu' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005559462.fudvyx", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch Dec 15 04:50:31 localhost ceph-mon[293875]: 
from='mgr.14120 172.18.0.103:0/3499672413' entity='mgr.np0005559459.hhnowu' cmd='[{"prefix": "auth get-or-create", "entity": "mgr.np0005559462.fudvyx", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]}]': finished Dec 15 04:50:31 localhost ceph-mon[293875]: Deploying daemon mgr.np0005559462.fudvyx on np0005559462.localdomain Dec 15 04:50:31 localhost ceph-mon[293875]: overall HEALTH_OK Dec 15 04:50:31 localhost ceph-mon[293875]: from='mgr.14120 172.18.0.103:0/3499672413' entity='mgr.np0005559459.hhnowu' Dec 15 04:50:31 localhost ceph-mon[293875]: from='mgr.14120 172.18.0.103:0/3499672413' entity='mgr.np0005559459.hhnowu' Dec 15 04:50:31 localhost ceph-mon[293875]: from='mgr.14120 172.18.0.103:0/3499672413' entity='mgr.np0005559459.hhnowu' Dec 15 04:50:31 localhost ceph-mon[293875]: from='mgr.14120 172.18.0.103:0/3499672413' entity='mgr.np0005559459.hhnowu' Dec 15 04:50:31 localhost ceph-mon[293875]: from='mgr.14120 172.18.0.103:0/3499672413' entity='mgr.np0005559459.hhnowu' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005559463.daptkf", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch Dec 15 04:50:31 localhost ceph-mon[293875]: from='mgr.14120 172.18.0.103:0/3499672413' entity='mgr.np0005559459.hhnowu' cmd='[{"prefix": "auth get-or-create", "entity": "mgr.np0005559463.daptkf", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]}]': finished Dec 15 04:50:31 localhost ceph-mon[293875]: Deploying daemon mgr.np0005559463.daptkf on np0005559463.localdomain Dec 15 04:50:31 localhost ceph-mon[293875]: from='mgr.14120 172.18.0.103:0/3499672413' entity='mgr.np0005559459.hhnowu' Dec 15 04:50:31 localhost ceph-mon[293875]: Added label mon to host np0005559459.localdomain Dec 15 04:50:31 localhost ceph-mon[293875]: from='mgr.14120 172.18.0.103:0/3499672413' entity='mgr.np0005559459.hhnowu' Dec 15 04:50:31 localhost ceph-mon[293875]: Added label _admin to host np0005559459.localdomain Dec 15 04:50:31 
localhost ceph-mon[293875]: from='mgr.14120 172.18.0.103:0/3499672413' entity='mgr.np0005559459.hhnowu' Dec 15 04:50:31 localhost ceph-mon[293875]: from='mgr.14120 172.18.0.103:0/3499672413' entity='mgr.np0005559459.hhnowu' Dec 15 04:50:31 localhost ceph-mon[293875]: from='mgr.14120 172.18.0.103:0/3499672413' entity='mgr.np0005559459.hhnowu' Dec 15 04:50:31 localhost ceph-mon[293875]: from='mgr.14120 172.18.0.103:0/3499672413' entity='mgr.np0005559459.hhnowu' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005559464.aomnqe", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch Dec 15 04:50:31 localhost ceph-mon[293875]: from='mgr.14120 172.18.0.103:0/3499672413' entity='mgr.np0005559459.hhnowu' cmd='[{"prefix": "auth get-or-create", "entity": "mgr.np0005559464.aomnqe", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]}]': finished Dec 15 04:50:31 localhost ceph-mon[293875]: from='mgr.14120 172.18.0.103:0/3499672413' entity='mgr.np0005559459.hhnowu' Dec 15 04:50:31 localhost ceph-mon[293875]: Deploying daemon mgr.np0005559464.aomnqe on np0005559464.localdomain Dec 15 04:50:31 localhost ceph-mon[293875]: Added label mon to host np0005559460.localdomain Dec 15 04:50:31 localhost ceph-mon[293875]: from='mgr.14120 172.18.0.103:0/3499672413' entity='mgr.np0005559459.hhnowu' Dec 15 04:50:31 localhost ceph-mon[293875]: Added label _admin to host np0005559460.localdomain Dec 15 04:50:31 localhost ceph-mon[293875]: from='mgr.14120 172.18.0.103:0/3499672413' entity='mgr.np0005559459.hhnowu' Dec 15 04:50:31 localhost ceph-mon[293875]: from='mgr.14120 172.18.0.103:0/3499672413' entity='mgr.np0005559459.hhnowu' Dec 15 04:50:31 localhost ceph-mon[293875]: from='mgr.14120 172.18.0.103:0/3499672413' entity='mgr.np0005559459.hhnowu' Dec 15 04:50:31 localhost ceph-mon[293875]: from='mgr.14120 172.18.0.103:0/3499672413' entity='mgr.np0005559459.hhnowu' Dec 15 04:50:31 localhost ceph-mon[293875]: from='mgr.14120 
172.18.0.103:0/3499672413' entity='mgr.np0005559459.hhnowu' Dec 15 04:50:31 localhost ceph-mon[293875]: Added label mon to host np0005559461.localdomain Dec 15 04:50:31 localhost ceph-mon[293875]: from='mgr.14120 172.18.0.103:0/3499672413' entity='mgr.np0005559459.hhnowu' Dec 15 04:50:31 localhost ceph-mon[293875]: from='mgr.14120 172.18.0.103:0/3499672413' entity='mgr.np0005559459.hhnowu' Dec 15 04:50:31 localhost ceph-mon[293875]: from='mgr.14120 172.18.0.103:0/3499672413' entity='mgr.np0005559459.hhnowu' Dec 15 04:50:31 localhost ceph-mon[293875]: from='mgr.14120 172.18.0.103:0/3499672413' entity='mgr.np0005559459.hhnowu' Dec 15 04:50:31 localhost ceph-mon[293875]: from='mgr.14120 172.18.0.103:0/3499672413' entity='mgr.np0005559459.hhnowu' Dec 15 04:50:31 localhost ceph-mon[293875]: from='mgr.14120 172.18.0.103:0/3499672413' entity='mgr.np0005559459.hhnowu' Dec 15 04:50:31 localhost ceph-mon[293875]: from='mgr.14120 172.18.0.103:0/3499672413' entity='mgr.np0005559459.hhnowu' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch Dec 15 04:50:31 localhost ceph-mon[293875]: from='mgr.14120 172.18.0.103:0/3499672413' entity='mgr.np0005559459.hhnowu' Dec 15 04:50:31 localhost ceph-mon[293875]: from='mgr.14120 172.18.0.103:0/3499672413' entity='mgr.np0005559459.hhnowu' Dec 15 04:50:31 localhost ceph-mon[293875]: Added label _admin to host np0005559461.localdomain Dec 15 04:50:31 localhost ceph-mon[293875]: from='mgr.14120 172.18.0.103:0/3499672413' entity='mgr.np0005559459.hhnowu' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch Dec 15 04:50:31 localhost ceph-mon[293875]: from='mgr.14120 172.18.0.103:0/3499672413' entity='mgr.np0005559459.hhnowu' Dec 15 04:50:31 localhost ceph-mon[293875]: from='mgr.14120 172.18.0.103:0/3499672413' entity='mgr.np0005559459.hhnowu' Dec 15 04:50:31 localhost ceph-mon[293875]: from='mgr.14120 172.18.0.103:0/3499672413' entity='mgr.np0005559459.hhnowu' Dec 15 04:50:31 localhost ceph-mon[293875]: from='mgr.14120 
172.18.0.103:0/3499672413' entity='mgr.np0005559459.hhnowu' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch Dec 15 04:50:31 localhost ceph-mon[293875]: from='mgr.14120 172.18.0.103:0/3499672413' entity='mgr.np0005559459.hhnowu' Dec 15 04:50:31 localhost ceph-mon[293875]: Added label mon to host np0005559462.localdomain Dec 15 04:50:31 localhost ceph-mon[293875]: from='mgr.14120 172.18.0.103:0/3499672413' entity='mgr.np0005559459.hhnowu' Dec 15 04:50:31 localhost ceph-mon[293875]: from='mgr.14120 172.18.0.103:0/3499672413' entity='mgr.np0005559459.hhnowu' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch Dec 15 04:50:31 localhost ceph-mon[293875]: Added label _admin to host np0005559462.localdomain Dec 15 04:50:31 localhost ceph-mon[293875]: Updating np0005559462.localdomain:/etc/ceph/ceph.conf Dec 15 04:50:31 localhost ceph-mon[293875]: Updating np0005559462.localdomain:/var/lib/ceph/bce17446-41b5-5408-a23e-0b011906b44a/config/ceph.conf Dec 15 04:50:31 localhost ceph-mon[293875]: from='mgr.14120 172.18.0.103:0/3499672413' entity='mgr.np0005559459.hhnowu' Dec 15 04:50:31 localhost ceph-mon[293875]: Added label mon to host np0005559463.localdomain Dec 15 04:50:31 localhost ceph-mon[293875]: Updating np0005559462.localdomain:/etc/ceph/ceph.client.admin.keyring Dec 15 04:50:31 localhost ceph-mon[293875]: from='mgr.14120 172.18.0.103:0/3499672413' entity='mgr.np0005559459.hhnowu' Dec 15 04:50:31 localhost ceph-mon[293875]: Updating np0005559462.localdomain:/var/lib/ceph/bce17446-41b5-5408-a23e-0b011906b44a/config/ceph.client.admin.keyring Dec 15 04:50:31 localhost ceph-mon[293875]: Added label _admin to host np0005559463.localdomain Dec 15 04:50:31 localhost ceph-mon[293875]: from='mgr.14120 172.18.0.103:0/3499672413' entity='mgr.np0005559459.hhnowu' Dec 15 04:50:31 localhost ceph-mon[293875]: from='mgr.14120 172.18.0.103:0/3499672413' entity='mgr.np0005559459.hhnowu' Dec 15 04:50:31 localhost ceph-mon[293875]: from='mgr.14120 
172.18.0.103:0/3499672413' entity='mgr.np0005559459.hhnowu' Dec 15 04:50:31 localhost ceph-mon[293875]: from='mgr.14120 172.18.0.103:0/3499672413' entity='mgr.np0005559459.hhnowu' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch Dec 15 04:50:31 localhost ceph-mon[293875]: Updating np0005559463.localdomain:/etc/ceph/ceph.conf Dec 15 04:50:31 localhost ceph-mon[293875]: from='mgr.14120 172.18.0.103:0/3499672413' entity='mgr.np0005559459.hhnowu' Dec 15 04:50:31 localhost ceph-mon[293875]: from='mgr.14120 172.18.0.103:0/3499672413' entity='mgr.np0005559459.hhnowu' Dec 15 04:50:31 localhost ceph-mon[293875]: Added label mon to host np0005559464.localdomain Dec 15 04:50:31 localhost ceph-mon[293875]: Updating np0005559463.localdomain:/var/lib/ceph/bce17446-41b5-5408-a23e-0b011906b44a/config/ceph.conf Dec 15 04:50:31 localhost ceph-mon[293875]: Updating np0005559463.localdomain:/etc/ceph/ceph.client.admin.keyring Dec 15 04:50:31 localhost ceph-mon[293875]: from='mgr.14120 172.18.0.103:0/3499672413' entity='mgr.np0005559459.hhnowu' Dec 15 04:50:31 localhost ceph-mon[293875]: Added label _admin to host np0005559464.localdomain Dec 15 04:50:31 localhost ceph-mon[293875]: Updating np0005559463.localdomain:/var/lib/ceph/bce17446-41b5-5408-a23e-0b011906b44a/config/ceph.client.admin.keyring Dec 15 04:50:31 localhost ceph-mon[293875]: from='mgr.14120 172.18.0.103:0/3499672413' entity='mgr.np0005559459.hhnowu' Dec 15 04:50:31 localhost ceph-mon[293875]: from='mgr.14120 172.18.0.103:0/3499672413' entity='mgr.np0005559459.hhnowu' Dec 15 04:50:31 localhost ceph-mon[293875]: from='mgr.14120 172.18.0.103:0/3499672413' entity='mgr.np0005559459.hhnowu' Dec 15 04:50:31 localhost ceph-mon[293875]: Saving service mon spec with placement label:mon Dec 15 04:50:31 localhost ceph-mon[293875]: from='mgr.14120 172.18.0.103:0/3499672413' entity='mgr.np0005559459.hhnowu' Dec 15 04:50:31 localhost ceph-mon[293875]: from='mgr.14120 172.18.0.103:0/3499672413' 
entity='mgr.np0005559459.hhnowu' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch Dec 15 04:50:31 localhost ceph-mon[293875]: Updating np0005559464.localdomain:/etc/ceph/ceph.conf Dec 15 04:50:31 localhost ceph-mon[293875]: Updating np0005559464.localdomain:/var/lib/ceph/bce17446-41b5-5408-a23e-0b011906b44a/config/ceph.conf Dec 15 04:50:31 localhost ceph-mon[293875]: Updating np0005559464.localdomain:/etc/ceph/ceph.client.admin.keyring Dec 15 04:50:31 localhost ceph-mon[293875]: Updating np0005559464.localdomain:/var/lib/ceph/bce17446-41b5-5408-a23e-0b011906b44a/config/ceph.client.admin.keyring Dec 15 04:50:31 localhost ceph-mon[293875]: from='mgr.14120 172.18.0.103:0/3499672413' entity='mgr.np0005559459.hhnowu' Dec 15 04:50:31 localhost ceph-mon[293875]: from='mgr.14120 172.18.0.103:0/3499672413' entity='mgr.np0005559459.hhnowu' Dec 15 04:50:31 localhost ceph-mon[293875]: from='mgr.14120 172.18.0.103:0/3499672413' entity='mgr.np0005559459.hhnowu' Dec 15 04:50:31 localhost ceph-mon[293875]: from='mgr.14120 172.18.0.103:0/3499672413' entity='mgr.np0005559459.hhnowu' Dec 15 04:50:31 localhost ceph-mon[293875]: from='mgr.14120 172.18.0.103:0/3499672413' entity='mgr.np0005559459.hhnowu' cmd={"prefix": "auth get", "entity": "mon."} : dispatch Dec 15 04:50:31 localhost ceph-mon[293875]: Deploying daemon mon.np0005559464 on np0005559464.localdomain Dec 15 04:50:31 localhost ceph-mon[293875]: Deploying daemon mon.np0005559463 on np0005559463.localdomain Dec 15 04:50:31 localhost ceph-mon[293875]: mon.np0005559459 calling monitor election Dec 15 04:50:31 localhost ceph-mon[293875]: mon.np0005559461 calling monitor election Dec 15 04:50:31 localhost ceph-mon[293875]: mon.np0005559460 calling monitor election Dec 15 04:50:31 localhost ceph-mon[293875]: mon.np0005559464 calling monitor election Dec 15 04:50:31 localhost ceph-mon[293875]: mon.np0005559459 is new leader, mons np0005559459,np0005559461,np0005559460,np0005559464 in quorum (ranks 0,1,2,3) Dec 15 
04:50:31 localhost ceph-mon[293875]: overall HEALTH_OK Dec 15 04:50:31 localhost ceph-mon[293875]: from='mgr.14120 172.18.0.103:0/3499672413' entity='mgr.np0005559459.hhnowu' Dec 15 04:50:31 localhost ceph-mon[293875]: from='mgr.14120 172.18.0.103:0/3499672413' entity='mgr.np0005559459.hhnowu' Dec 15 04:50:31 localhost ceph-mon[293875]: from='mgr.14120 172.18.0.103:0/3499672413' entity='mgr.np0005559459.hhnowu' Dec 15 04:50:31 localhost ceph-mon[293875]: from='mgr.14120 172.18.0.103:0/3499672413' entity='mgr.np0005559459.hhnowu' cmd={"prefix": "auth get", "entity": "mon."} : dispatch Dec 15 04:50:31 localhost ceph-mon[293875]: Deploying daemon mon.np0005559462 on np0005559462.localdomain Dec 15 04:50:31 localhost ceph-mon[293875]: mon.np0005559462@-1(synchronizing).paxosservice(auth 1..34) refresh upgraded, format 0 -> 3 Dec 15 04:50:31 localhost podman[243449]: time="2025-12-15T09:50:31Z" level=info msg="List containers: received `last` parameter - overwriting `limit`" Dec 15 04:50:31 localhost podman[243449]: @ - - [15/Dec/2025:09:50:31 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 154814 "" "Go-http-client/1.1" Dec 15 04:50:31 localhost podman[243449]: @ - - [15/Dec/2025:09:50:31 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 18684 "" "Go-http-client/1.1" Dec 15 04:50:34 localhost ceph-mds[291134]: mds.beacon.mds.np0005559462.mhigvc missed beacon ack from the monitors Dec 15 04:50:34 localhost openstack_network_exporter[246484]: ERROR 09:50:34 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server Dec 15 04:50:34 localhost openstack_network_exporter[246484]: ERROR 09:50:34 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd Dec 15 04:50:34 localhost openstack_network_exporter[246484]: ERROR 09:50:34 appctl.go:144: Failed to get PID for ovn-northd: 
no control socket files found for ovn-northd Dec 15 04:50:34 localhost openstack_network_exporter[246484]: ERROR 09:50:34 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath Dec 15 04:50:34 localhost openstack_network_exporter[246484]: Dec 15 04:50:34 localhost nova_compute[286344]: 2025-12-15 09:50:34.853 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 04:50:34 localhost openstack_network_exporter[246484]: ERROR 09:50:34 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath Dec 15 04:50:34 localhost openstack_network_exporter[246484]: Dec 15 04:50:35 localhost nova_compute[286344]: 2025-12-15 09:50:35.067 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 04:50:35 localhost ceph-mgr[292421]: ms_deliver_dispatch: unhandled message 0x55975582b1e0 mon_map magic: 0 from mon.2 v2:172.18.0.104:3300/0 Dec 15 04:50:36 localhost systemd[1]: Started /usr/bin/podman healthcheck run a4e94bd7aaee6c174c601060fb5f34559019ccea93fc397263ee3735731cfc1e. 
Dec 15 04:50:36 localhost podman[293931]: 2025-12-15 09:50:36.231565211 +0000 UTC m=+0.076377720 container health_status a4e94bd7aaee6c174c601060fb5f34559019ccea93fc397263ee3735731cfc1e (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, maintainer=Navid Yaghoobi , managed_by=edpm_ansible, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter) Dec 15 04:50:36 localhost podman[293931]: 2025-12-15 09:50:36.24549941 +0000 UTC m=+0.090311909 container exec_died a4e94bd7aaee6c174c601060fb5f34559019ccea93fc397263ee3735731cfc1e (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': 
['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter) Dec 15 04:50:36 localhost systemd[1]: a4e94bd7aaee6c174c601060fb5f34559019ccea93fc397263ee3735731cfc1e.service: Deactivated successfully. Dec 15 04:50:37 localhost podman[294062]: 2025-12-15 09:50:37.271301709 +0000 UTC m=+0.073621734 container exec 8dcda56b365b42dc8758aab77a9ec80db304780e449052738f7e4e648ae1ecaf (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-bce17446-41b5-5408-a23e-0b011906b44a-crash-np0005559462, architecture=x86_64, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, description=Red Hat Ceph Storage 7, CEPH_POINT_RELEASE=, RELEASE=main, url=https://catalog.redhat.com/en/search?searchType=containers, release=1763362218, io.openshift.tags=rhceph ceph, GIT_BRANCH=main, ceph=True, build-date=2025-11-26T19:44:28Z, distribution-scope=public, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vendor=Red Hat, Inc., maintainer=Guillaume Abrioux , cpe=cpe:/a:redhat:enterprise_linux:9::appstream, name=rhceph, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, version=7, io.k8s.description=Red Hat Ceph Storage 7, GIT_CLEAN=True, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, io.buildah.version=1.41.4, io.openshift.expose-services=, vcs-type=git, GIT_REPO=https://github.com/ceph/ceph-container.git, com.redhat.component=rhceph-container) Dec 15 04:50:37 localhost podman[294062]: 2025-12-15 09:50:37.372062378 +0000 UTC m=+0.174382453 container exec_died 8dcda56b365b42dc8758aab77a9ec80db304780e449052738f7e4e648ae1ecaf (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, 
name=ceph-bce17446-41b5-5408-a23e-0b011906b44a-crash-np0005559462, ceph=True, vcs-type=git, name=rhceph, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, CEPH_POINT_RELEASE=, description=Red Hat Ceph Storage 7, release=1763362218, com.redhat.component=rhceph-container, io.openshift.expose-services=, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., maintainer=Guillaume Abrioux , version=7, url=https://catalog.redhat.com/en/search?searchType=containers, GIT_CLEAN=True, GIT_REPO=https://github.com/ceph/ceph-container.git, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, io.openshift.tags=rhceph ceph, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.buildah.version=1.41.4, architecture=x86_64, GIT_BRANCH=main, distribution-scope=public, io.k8s.description=Red Hat Ceph Storage 7, RELEASE=main, vendor=Red Hat, Inc., build-date=2025-11-26T19:44:28Z) Dec 15 04:50:37 localhost ceph-mon[293875]: mon.np0005559462@-1(probing) e6 my rank is now 5 (was -1) Dec 15 04:50:37 localhost ceph-mon[293875]: log_channel(cluster) log [INF] : mon.np0005559462 calling monitor election Dec 15 04:50:37 localhost ceph-mon[293875]: paxos.5).electionLogic(0) init, first boot, initializing epoch at 1 Dec 15 04:50:37 localhost ceph-mon[293875]: mon.np0005559462@5(electing) e6 collect_metadata vda: no unique device id for vda: fallback method has no model nor serial Dec 15 04:50:39 localhost nova_compute[286344]: 2025-12-15 09:50:39.895 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 04:50:40 localhost nova_compute[286344]: 2025-12-15 09:50:40.069 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 
__log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 04:50:42 localhost ceph-mon[293875]: mon.np0005559459 calling monitor election Dec 15 04:50:42 localhost ceph-mon[293875]: mon.np0005559461 calling monitor election Dec 15 04:50:42 localhost ceph-mon[293875]: mon.np0005559464 calling monitor election Dec 15 04:50:42 localhost ceph-mon[293875]: mon.np0005559460 calling monitor election Dec 15 04:50:42 localhost ceph-mon[293875]: mon.np0005559463 calling monitor election Dec 15 04:50:42 localhost ceph-mon[293875]: mon.np0005559459 is new leader, mons np0005559459,np0005559461,np0005559460,np0005559464,np0005559463 in quorum (ranks 0,1,2,3,4) Dec 15 04:50:42 localhost ceph-mon[293875]: overall HEALTH_OK Dec 15 04:50:42 localhost ceph-mon[293875]: mon.np0005559461 calling monitor election Dec 15 04:50:42 localhost ceph-mon[293875]: mon.np0005559464 calling monitor election Dec 15 04:50:42 localhost ceph-mon[293875]: mon.np0005559460 calling monitor election Dec 15 04:50:42 localhost ceph-mon[293875]: mon.np0005559463 calling monitor election Dec 15 04:50:42 localhost ceph-mon[293875]: mon.np0005559459 calling monitor election Dec 15 04:50:42 localhost ceph-mon[293875]: mon.np0005559459 is new leader, mons np0005559459,np0005559461,np0005559460,np0005559464,np0005559463 in quorum (ranks 0,1,2,3,4) Dec 15 04:50:42 localhost ceph-mon[293875]: Health check failed: 1/6 mons down, quorum np0005559459,np0005559461,np0005559460,np0005559464,np0005559463 (MON_DOWN) Dec 15 04:50:42 localhost ceph-mon[293875]: Health detail: HEALTH_WARN 1/6 mons down, quorum np0005559459,np0005559461,np0005559460,np0005559464,np0005559463 Dec 15 04:50:42 localhost ceph-mon[293875]: [WRN] MON_DOWN: 1/6 mons down, quorum np0005559459,np0005559461,np0005559460,np0005559464,np0005559463 Dec 15 04:50:42 localhost ceph-mon[293875]: mon.np0005559462 (rank 5) addr [v2:172.18.0.106:3300/0,v1:172.18.0.106:6789/0] is down (out of quorum) Dec 15 04:50:42 localhost 
ceph-mon[293875]: from='mgr.14120 172.18.0.103:0/3499672413' entity='mgr.np0005559459.hhnowu' Dec 15 04:50:42 localhost ceph-mon[293875]: from='mgr.14120 172.18.0.103:0/3499672413' entity='mgr.np0005559459.hhnowu' Dec 15 04:50:42 localhost ceph-mon[293875]: from='mgr.14120 172.18.0.103:0/3499672413' entity='mgr.np0005559459.hhnowu' Dec 15 04:50:42 localhost ceph-mon[293875]: from='mgr.14120 172.18.0.103:0/3499672413' entity='mgr.np0005559459.hhnowu' Dec 15 04:50:42 localhost ceph-mon[293875]: from='mgr.14120 172.18.0.103:0/3499672413' entity='mgr.np0005559459.hhnowu' Dec 15 04:50:42 localhost ceph-mon[293875]: from='mgr.14120 172.18.0.103:0/3499672413' entity='mgr.np0005559459.hhnowu' Dec 15 04:50:42 localhost ceph-mon[293875]: from='mgr.14120 172.18.0.103:0/3499672413' entity='mgr.np0005559459.hhnowu' Dec 15 04:50:42 localhost ceph-mon[293875]: log_channel(cluster) log [INF] : mon.np0005559462 calling monitor election Dec 15 04:50:42 localhost ceph-mon[293875]: paxos.5).electionLogic(0) init, first boot, initializing epoch at 1 Dec 15 04:50:42 localhost ceph-mon[293875]: mon.np0005559462@5(electing) e6 collect_metadata vda: no unique device id for vda: fallback method has no model nor serial Dec 15 04:50:42 localhost ceph-mon[293875]: mon.np0005559462@5(electing) e6 collect_metadata vda: no unique device id for vda: fallback method has no model nor serial Dec 15 04:50:42 localhost ceph-mon[293875]: mon.np0005559462@5(electing) e6 collect_metadata vda: no unique device id for vda: fallback method has no model nor serial Dec 15 04:50:42 localhost ceph-mon[293875]: mon.np0005559462@5(peon) e6 _apply_compatset_features enabling new quorum features: compat={},rocompat={},incompat={4=support erasure code pools,5=new-style osdmap encoding,6=support isa/lrc erasure code,7=support shec erasure code} Dec 15 04:50:42 localhost ceph-mon[293875]: mon.np0005559462@5(peon) e6 _apply_compatset_features enabling new quorum features: compat={},rocompat={},incompat={8=support monmap 
features,9=luminous ondisk layout,10=mimic ondisk layout,11=nautilus ondisk layout,12=octopus ondisk layout,13=pacific ondisk layout,14=quincy ondisk layout,15=reef ondisk layout} Dec 15 04:50:42 localhost ceph-mon[293875]: mon.np0005559462@5(peon) e6 collect_metadata vda: no unique device id for vda: fallback method has no model nor serial Dec 15 04:50:42 localhost ceph-mon[293875]: mgrc update_daemon_metadata mon.np0005559462 metadata {addrs=[v2:172.18.0.106:3300/0,v1:172.18.0.106:6789/0],arch=x86_64,ceph_release=reef,ceph_version=ceph version 18.2.1-361.el9cp (439dcd6094d413840eb2ec590fe2194ec616687f) reef (stable),ceph_version_short=18.2.1-361.el9cp,compression_algorithms=none, snappy, zlib, zstd, lz4,container_hostname=np0005559462.localdomain,container_image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest,cpu=AMD EPYC-Rome Processor,device_ids=,device_paths=vda=/dev/disk/by-path/pci-0000:00:04.0,devices=vda,distro=rhel,distro_description=Red Hat Enterprise Linux 9.7 (Plow),distro_version=9.7,hostname=np0005559462.localdomain,kernel_description=#1 SMP PREEMPT_DYNAMIC Wed Apr 12 10:45:03 EDT 2023,kernel_version=5.14.0-284.11.1.el9_2.x86_64,mem_swap_kb=1048572,mem_total_kb=16116604,os=Linux} Dec 15 04:50:43 localhost ceph-mon[293875]: mon.np0005559462 calling monitor election Dec 15 04:50:43 localhost ceph-mon[293875]: mon.np0005559461 calling monitor election Dec 15 04:50:43 localhost ceph-mon[293875]: mon.np0005559464 calling monitor election Dec 15 04:50:43 localhost ceph-mon[293875]: mon.np0005559459 calling monitor election Dec 15 04:50:43 localhost ceph-mon[293875]: mon.np0005559463 calling monitor election Dec 15 04:50:43 localhost ceph-mon[293875]: mon.np0005559460 calling monitor election Dec 15 04:50:43 localhost ceph-mon[293875]: mon.np0005559462 calling monitor election Dec 15 04:50:43 localhost ceph-mon[293875]: mon.np0005559459 is new leader, mons np0005559459,np0005559461,np0005559460,np0005559464,np0005559463,np0005559462 in quorum (ranks 
0,1,2,3,4,5) Dec 15 04:50:43 localhost ceph-mon[293875]: Health check cleared: MON_DOWN (was: 1/6 mons down, quorum np0005559459,np0005559461,np0005559460,np0005559464,np0005559463) Dec 15 04:50:43 localhost ceph-mon[293875]: Cluster is now healthy Dec 15 04:50:43 localhost ceph-mon[293875]: overall HEALTH_OK Dec 15 04:50:44 localhost ceph-mon[293875]: Updating np0005559461.localdomain:/var/lib/ceph/bce17446-41b5-5408-a23e-0b011906b44a/config/ceph.conf Dec 15 04:50:44 localhost ceph-mon[293875]: Updating np0005559459.localdomain:/var/lib/ceph/bce17446-41b5-5408-a23e-0b011906b44a/config/ceph.conf Dec 15 04:50:44 localhost ceph-mon[293875]: Updating np0005559463.localdomain:/var/lib/ceph/bce17446-41b5-5408-a23e-0b011906b44a/config/ceph.conf Dec 15 04:50:44 localhost ceph-mon[293875]: Updating np0005559460.localdomain:/var/lib/ceph/bce17446-41b5-5408-a23e-0b011906b44a/config/ceph.conf Dec 15 04:50:44 localhost ceph-mon[293875]: Updating np0005559464.localdomain:/var/lib/ceph/bce17446-41b5-5408-a23e-0b011906b44a/config/ceph.conf Dec 15 04:50:44 localhost ceph-mon[293875]: Updating np0005559462.localdomain:/var/lib/ceph/bce17446-41b5-5408-a23e-0b011906b44a/config/ceph.conf Dec 15 04:50:44 localhost ceph-mon[293875]: from='mgr.14120 172.18.0.103:0/3499672413' entity='mgr.np0005559459.hhnowu' Dec 15 04:50:44 localhost ceph-mon[293875]: from='mgr.14120 172.18.0.103:0/3499672413' entity='mgr.np0005559459.hhnowu' Dec 15 04:50:44 localhost ceph-mon[293875]: from='mgr.14120 172.18.0.103:0/3499672413' entity='mgr.np0005559459.hhnowu' Dec 15 04:50:44 localhost ceph-mon[293875]: from='mgr.14120 172.18.0.103:0/3499672413' entity='mgr.np0005559459.hhnowu' Dec 15 04:50:44 localhost ceph-mon[293875]: from='mgr.14120 172.18.0.103:0/3499672413' entity='mgr.np0005559459.hhnowu' Dec 15 04:50:44 localhost ceph-mon[293875]: from='mgr.14120 172.18.0.103:0/3499672413' entity='mgr.np0005559459.hhnowu' Dec 15 04:50:44 localhost ceph-mon[293875]: from='mgr.14120 172.18.0.103:0/3499672413' 
entity='mgr.np0005559459.hhnowu' Dec 15 04:50:44 localhost ceph-mon[293875]: from='mgr.14120 172.18.0.103:0/3499672413' entity='mgr.np0005559459.hhnowu' Dec 15 04:50:44 localhost ceph-mon[293875]: from='mgr.14120 172.18.0.103:0/3499672413' entity='mgr.np0005559459.hhnowu' Dec 15 04:50:44 localhost ceph-mon[293875]: from='mgr.14120 172.18.0.103:0/3499672413' entity='mgr.np0005559459.hhnowu' Dec 15 04:50:44 localhost ceph-mon[293875]: from='mgr.14120 172.18.0.103:0/3499672413' entity='mgr.np0005559459.hhnowu' Dec 15 04:50:44 localhost ceph-mon[293875]: from='mgr.14120 172.18.0.103:0/3499672413' entity='mgr.np0005559459.hhnowu' Dec 15 04:50:44 localhost ceph-mon[293875]: from='mgr.14120 172.18.0.103:0/3499672413' entity='mgr.np0005559459.hhnowu' Dec 15 04:50:44 localhost nova_compute[286344]: 2025-12-15 09:50:44.944 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 04:50:45 localhost nova_compute[286344]: 2025-12-15 09:50:45.071 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 04:50:45 localhost ceph-mon[293875]: Reconfiguring mon.np0005559459 (monmap changed)... Dec 15 04:50:45 localhost ceph-mon[293875]: from='mgr.14120 172.18.0.103:0/3499672413' entity='mgr.np0005559459.hhnowu' cmd={"prefix": "auth get", "entity": "mon."} : dispatch Dec 15 04:50:45 localhost ceph-mon[293875]: Reconfiguring daemon mon.np0005559459 on np0005559459.localdomain Dec 15 04:50:46 localhost ceph-mon[293875]: from='mgr.14120 172.18.0.103:0/3499672413' entity='mgr.np0005559459.hhnowu' Dec 15 04:50:46 localhost ceph-mon[293875]: from='mgr.14120 172.18.0.103:0/3499672413' entity='mgr.np0005559459.hhnowu' Dec 15 04:50:46 localhost ceph-mon[293875]: Reconfiguring mgr.np0005559459.hhnowu (monmap changed)... 
Dec 15 04:50:46 localhost ceph-mon[293875]: from='mgr.14120 172.18.0.103:0/3499672413' entity='mgr.np0005559459.hhnowu' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005559459.hhnowu", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch Dec 15 04:50:46 localhost ceph-mon[293875]: Reconfiguring daemon mgr.np0005559459.hhnowu on np0005559459.localdomain Dec 15 04:50:46 localhost ceph-mon[293875]: from='mgr.14120 172.18.0.103:0/3499672413' entity='mgr.np0005559459.hhnowu' Dec 15 04:50:46 localhost ceph-mon[293875]: from='mgr.14120 172.18.0.103:0/3499672413' entity='mgr.np0005559459.hhnowu' Dec 15 04:50:47 localhost ceph-mon[293875]: from='mgr.14120 172.18.0.103:0/3499672413' entity='mgr.np0005559459.hhnowu' Dec 15 04:50:47 localhost ceph-mon[293875]: Reconfiguring crash.np0005559459 (monmap changed)... Dec 15 04:50:47 localhost ceph-mon[293875]: from='mgr.14120 172.18.0.103:0/3499672413' entity='mgr.np0005559459.hhnowu' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005559459.localdomain", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch Dec 15 04:50:47 localhost ceph-mon[293875]: Reconfiguring daemon crash.np0005559459 on np0005559459.localdomain Dec 15 04:50:47 localhost ceph-mon[293875]: from='mgr.14120 172.18.0.103:0/3499672413' entity='mgr.np0005559459.hhnowu' Dec 15 04:50:47 localhost ceph-mon[293875]: from='mgr.14120 172.18.0.103:0/3499672413' entity='mgr.np0005559459.hhnowu' Dec 15 04:50:47 localhost ceph-mon[293875]: from='mgr.14120 172.18.0.103:0/3499672413' entity='mgr.np0005559459.hhnowu' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005559460.localdomain", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch Dec 15 04:50:47 localhost systemd[1]: Started /usr/bin/podman healthcheck run 67ee72fcd99c896f79e8807764b1d0f04e7d45927d06e306c0986394e8ed42a0. 
Dec 15 04:50:47 localhost systemd[1]: Started /usr/bin/podman healthcheck run 9f0f481c937606298203797a71924b7614a5d9e3f4a8afd837cfa5b9602aedeb.
Dec 15 04:50:47 localhost systemd[1]: Started /usr/bin/podman healthcheck run b3d8483ade539500e228c2af9c0c3e647febb5ec7903610a83aea32631007d4a.
Dec 15 04:50:47 localhost systemd[1]: tmp-crun.U5nNNh.mount: Deactivated successfully.
Dec 15 04:50:47 localhost podman[294589]: 2025-12-15 09:50:47.775138414 +0000 UTC m=+0.097857679 container health_status 9f0f481c937606298203797a71924b7614a5d9e3f4a8afd837cfa5b9602aedeb (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, org.label-schema.build-date=20251202, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=multipathd, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3)
Dec 15 04:50:47 localhost systemd[1]: tmp-crun.OQwaj1.mount: Deactivated successfully.
Dec 15 04:50:47 localhost podman[294588]: 2025-12-15 09:50:47.816489427 +0000 UTC m=+0.140067176 container health_status 67ee72fcd99c896f79e8807764b1d0f04e7d45927d06e306c0986394e8ed42a0 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, maintainer=The Prometheus Authors , managed_by=edpm_ansible, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter)
Dec 15 04:50:47 localhost podman[294588]: 2025-12-15 09:50:47.850969718 +0000 UTC m=+0.174547457 container exec_died 67ee72fcd99c896f79e8807764b1d0f04e7d45927d06e306c0986394e8ed42a0 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, managed_by=edpm_ansible, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors )
Dec 15 04:50:47 localhost systemd[1]: 67ee72fcd99c896f79e8807764b1d0f04e7d45927d06e306c0986394e8ed42a0.service: Deactivated successfully.
Dec 15 04:50:47 localhost podman[294590]: 2025-12-15 09:50:47.869921507 +0000 UTC m=+0.189295109 container health_status b3d8483ade539500e228c2af9c0c3e647febb5ec7903610a83aea32631007d4a (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, container_name=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '07f3af15d79276418f54a8dde2fc251064527c3571430e14af77aaa2c01f1193-cb1e9478634a00c8fb5ad9cd7fcd38c7b2a1a85187dfaa618bc715c24c2b15fb'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=ceilometer_agent_compute, org.label-schema.license=GPLv2, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, io.buildah.version=1.41.3)
Dec 15 04:50:47 localhost podman[294589]: 2025-12-15 09:50:47.89336933 +0000 UTC m=+0.216088595 container exec_died 9f0f481c937606298203797a71924b7614a5d9e3f4a8afd837cfa5b9602aedeb (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, org.label-schema.build-date=20251202, config_id=multipathd, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, io.buildah.version=1.41.3, managed_by=edpm_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 15 04:50:47 localhost systemd[1]: 9f0f481c937606298203797a71924b7614a5d9e3f4a8afd837cfa5b9602aedeb.service: Deactivated successfully.
Dec 15 04:50:47 localhost podman[294590]: 2025-12-15 09:50:47.909442108 +0000 UTC m=+0.228815790 container exec_died b3d8483ade539500e228c2af9c0c3e647febb5ec7903610a83aea32631007d4a (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '07f3af15d79276418f54a8dde2fc251064527c3571430e14af77aaa2c01f1193-cb1e9478634a00c8fb5ad9cd7fcd38c7b2a1a85187dfaa618bc715c24c2b15fb'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_managed=true, container_name=ceilometer_agent_compute, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Dec 15 04:50:47 localhost systemd[1]: b3d8483ade539500e228c2af9c0c3e647febb5ec7903610a83aea32631007d4a.service: Deactivated successfully.
Dec 15 04:50:48 localhost ceph-mon[293875]: Reconfiguring crash.np0005559460 (monmap changed)...
Dec 15 04:50:48 localhost ceph-mon[293875]: Reconfiguring daemon crash.np0005559460 on np0005559460.localdomain
Dec 15 04:50:48 localhost ceph-mon[293875]: from='mgr.14120 172.18.0.103:0/3499672413' entity='mgr.np0005559459.hhnowu'
Dec 15 04:50:48 localhost ceph-mon[293875]: from='mgr.14120 172.18.0.103:0/3499672413' entity='mgr.np0005559459.hhnowu'
Dec 15 04:50:48 localhost ceph-mon[293875]: from='mgr.14120 172.18.0.103:0/3499672413' entity='mgr.np0005559459.hhnowu' cmd={"prefix": "auth get", "entity": "mon."} : dispatch
Dec 15 04:50:49 localhost ceph-mon[293875]: Reconfiguring mon.np0005559460 (monmap changed)...
Dec 15 04:50:49 localhost ceph-mon[293875]: Reconfiguring daemon mon.np0005559460 on np0005559460.localdomain
Dec 15 04:50:49 localhost ceph-mon[293875]: from='mgr.14120 172.18.0.103:0/3499672413' entity='mgr.np0005559459.hhnowu'
Dec 15 04:50:49 localhost ceph-mon[293875]: from='mgr.14120 172.18.0.103:0/3499672413' entity='mgr.np0005559459.hhnowu'
Dec 15 04:50:49 localhost ceph-mon[293875]: from='mgr.14120 172.18.0.103:0/3499672413' entity='mgr.np0005559459.hhnowu' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005559460.oexkup", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch
Dec 15 04:50:49 localhost ceph-mon[293875]: mon.np0005559462@5(peon).osd e83 _set_cache_ratios kv ratio 0.25 inc ratio 0.375 full ratio 0.375
Dec 15 04:50:49 localhost ceph-mon[293875]: mon.np0005559462@5(peon).osd e83 register_cache_with_pcm pcm target: 2147483648 pcm max: 1020054732 pcm min: 134217728 inc_osd_cache size: 1
Dec 15 04:50:49 localhost ceph-mon[293875]: mon.np0005559462@5(peon).osd e84 e84: 6 total, 6 up, 6 in
Dec 15 04:50:49 localhost systemd[1]: session-18.scope: Deactivated successfully.
Dec 15 04:50:49 localhost systemd[1]: session-20.scope: Deactivated successfully.
Dec 15 04:50:49 localhost systemd[1]: session-24.scope: Deactivated successfully.
Dec 15 04:50:49 localhost systemd[1]: session-14.scope: Deactivated successfully.
Dec 15 04:50:49 localhost systemd[1]: session-26.scope: Deactivated successfully.
Dec 15 04:50:49 localhost systemd[1]: session-26.scope: Consumed 3min 33.781s CPU time.
Dec 15 04:50:49 localhost systemd[1]: session-16.scope: Deactivated successfully.
Dec 15 04:50:49 localhost systemd[1]: session-23.scope: Deactivated successfully.
Dec 15 04:50:49 localhost systemd[1]: session-25.scope: Deactivated successfully.
Dec 15 04:50:49 localhost systemd[1]: session-19.scope: Deactivated successfully.
Dec 15 04:50:49 localhost systemd[1]: session-22.scope: Deactivated successfully.
Dec 15 04:50:49 localhost systemd-logind[763]: Session 25 logged out. Waiting for processes to exit.
Dec 15 04:50:49 localhost systemd-logind[763]: Session 23 logged out. Waiting for processes to exit.
Dec 15 04:50:49 localhost systemd-logind[763]: Session 22 logged out. Waiting for processes to exit.
Dec 15 04:50:49 localhost systemd-logind[763]: Session 20 logged out. Waiting for processes to exit.
Dec 15 04:50:49 localhost systemd-logind[763]: Session 18 logged out. Waiting for processes to exit.
Dec 15 04:50:49 localhost systemd-logind[763]: Session 24 logged out. Waiting for processes to exit.
Dec 15 04:50:49 localhost systemd-logind[763]: Session 14 logged out. Waiting for processes to exit.
Dec 15 04:50:49 localhost systemd-logind[763]: Session 26 logged out. Waiting for processes to exit.
Dec 15 04:50:49 localhost systemd-logind[763]: Session 19 logged out. Waiting for processes to exit.
Dec 15 04:50:49 localhost systemd-logind[763]: Session 16 logged out. Waiting for processes to exit.
Dec 15 04:50:49 localhost systemd-logind[763]: Removed session 18.
Dec 15 04:50:49 localhost systemd[1]: session-17.scope: Deactivated successfully.
Dec 15 04:50:49 localhost systemd[1]: session-21.scope: Deactivated successfully.
Dec 15 04:50:49 localhost systemd-logind[763]: Removed session 20.
Dec 15 04:50:49 localhost systemd-logind[763]: Session 21 logged out. Waiting for processes to exit.
Dec 15 04:50:49 localhost systemd-logind[763]: Session 17 logged out. Waiting for processes to exit.
Dec 15 04:50:49 localhost systemd-logind[763]: Removed session 24.
Dec 15 04:50:49 localhost systemd-logind[763]: Removed session 14.
Dec 15 04:50:49 localhost systemd-logind[763]: Removed session 26.
Dec 15 04:50:49 localhost systemd-logind[763]: Removed session 16.
Dec 15 04:50:49 localhost systemd-logind[763]: Removed session 23.
Dec 15 04:50:49 localhost systemd-logind[763]: Removed session 25.
Dec 15 04:50:49 localhost systemd-logind[763]: Removed session 19.
Dec 15 04:50:49 localhost systemd-logind[763]: Removed session 22.
Dec 15 04:50:49 localhost systemd-logind[763]: Removed session 17.
Dec 15 04:50:49 localhost systemd-logind[763]: Removed session 21.
Dec 15 04:50:49 localhost nova_compute[286344]: 2025-12-15 09:50:49.986 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 15 04:50:50 localhost sshd[294650]: main: sshd: ssh-rsa algorithm is disabled
Dec 15 04:50:50 localhost nova_compute[286344]: 2025-12-15 09:50:50.073 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 15 04:50:50 localhost systemd-logind[763]: New session 64 of user ceph-admin.
Dec 15 04:50:50 localhost systemd[1]: Started Session 64 of User ceph-admin.
Dec 15 04:50:50 localhost ceph-mon[293875]: Reconfiguring mgr.np0005559460.oexkup (monmap changed)...
Dec 15 04:50:50 localhost ceph-mon[293875]: Reconfiguring daemon mgr.np0005559460.oexkup on np0005559460.localdomain
Dec 15 04:50:50 localhost ceph-mon[293875]: from='client.? 172.18.0.103:0/2265348493' entity='client.admin' cmd={"prefix": "mgr fail"} : dispatch
Dec 15 04:50:50 localhost ceph-mon[293875]: Activating manager daemon np0005559461.egwgzn
Dec 15 04:50:50 localhost ceph-mon[293875]: from='client.? 172.18.0.103:0/2265348493' entity='client.admin' cmd='[{"prefix": "mgr fail"}]': finished
Dec 15 04:50:50 localhost ceph-mon[293875]: Manager daemon np0005559461.egwgzn is now available
Dec 15 04:50:50 localhost ceph-mon[293875]: from='mgr.24103 172.18.0.105:0/2912776587' entity='mgr.np0005559461.egwgzn' cmd={"prefix":"config rm","who":"mgr","name":"mgr/rbd_support/np0005559461.egwgzn/mirror_snapshot_schedule"} : dispatch
Dec 15 04:50:50 localhost ceph-mon[293875]: from='mgr.24103 ' entity='mgr.np0005559461.egwgzn' cmd={"prefix":"config rm","who":"mgr","name":"mgr/rbd_support/np0005559461.egwgzn/mirror_snapshot_schedule"} : dispatch
Dec 15 04:50:50 localhost ceph-mon[293875]: from='mgr.24103 172.18.0.105:0/2912776587' entity='mgr.np0005559461.egwgzn' cmd={"prefix":"config rm","who":"mgr","name":"mgr/rbd_support/np0005559461.egwgzn/trash_purge_schedule"} : dispatch
Dec 15 04:50:50 localhost ceph-mon[293875]: from='mgr.24103 ' entity='mgr.np0005559461.egwgzn' cmd={"prefix":"config rm","who":"mgr","name":"mgr/rbd_support/np0005559461.egwgzn/trash_purge_schedule"} : dispatch
Dec 15 04:50:50 localhost systemd[1]: Started /usr/bin/podman healthcheck run 730c50020a7d9e2da21d2462d40af20ac303e43f3491f692cbbd87f432ba3e09.
Dec 15 04:50:50 localhost systemd[1]: Started /usr/bin/podman healthcheck run ac2eae30a2a7f971398af42ea67b3ed799d4b3341f7688ebcd73f7ca7f66c2a8.
Dec 15 04:50:51 localhost podman[294732]: 2025-12-15 09:50:51.091216363 +0000 UTC m=+0.093805716 container health_status 730c50020a7d9e2da21d2462d40af20ac303e43f3491f692cbbd87f432ba3e09 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, io.buildah.version=1.33.7, build-date=2025-08-20T13:12:41, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1755695350, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vendor=Red Hat, Inc., config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'cb1e9478634a00c8fb5ad9cd7fcd38c7b2a1a85187dfaa618bc715c24c2b15fb'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.tags=minimal rhel9, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, config_id=openstack_network_exporter, architecture=x86_64, vcs-type=git, version=9.6, com.redhat.component=ubi9-minimal-container, url=https://catalog.redhat.com/en/search?searchType=containers, container_name=openstack_network_exporter, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, distribution-scope=public, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., name=ubi9-minimal, io.openshift.expose-services=, maintainer=Red Hat, Inc.)
Dec 15 04:50:51 localhost podman[294732]: 2025-12-15 09:50:51.178806585 +0000 UTC m=+0.181395898 container exec_died 730c50020a7d9e2da21d2462d40af20ac303e43f3491f692cbbd87f432ba3e09 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, maintainer=Red Hat, Inc., io.openshift.tags=minimal rhel9, release=1755695350, vcs-type=git, vendor=Red Hat, Inc., name=ubi9-minimal, io.buildah.version=1.33.7, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, build-date=2025-08-20T13:12:41, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=ubi9-minimal-container, config_id=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., version=9.6, io.openshift.expose-services=, url=https://catalog.redhat.com/en/search?searchType=containers, distribution-scope=public, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'cb1e9478634a00c8fb5ad9cd7fcd38c7b2a1a85187dfaa618bc715c24c2b15fb'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, container_name=openstack_network_exporter, architecture=x86_64)
Dec 15 04:50:51 localhost systemd[1]: 730c50020a7d9e2da21d2462d40af20ac303e43f3491f692cbbd87f432ba3e09.service: Deactivated successfully.
Dec 15 04:50:51 localhost podman[294733]: 2025-12-15 09:50:51.186297414 +0000 UTC m=+0.187129908 container health_status ac2eae30a2a7f971398af42ea67b3ed799d4b3341f7688ebcd73f7ca7f66c2a8 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, io.buildah.version=1.41.3, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, container_name=ovn_controller, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '07f3af15d79276418f54a8dde2fc251064527c3571430e14af77aaa2c01f1193'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Dec 15 04:50:51 localhost podman[294733]: 2025-12-15 09:50:51.265924594 +0000 UTC m=+0.266757068 container exec_died ac2eae30a2a7f971398af42ea67b3ed799d4b3341f7688ebcd73f7ca7f66c2a8 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=ovn_controller, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '07f3af15d79276418f54a8dde2fc251064527c3571430e14af77aaa2c01f1193'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Dec 15 04:50:51 localhost systemd[1]: ac2eae30a2a7f971398af42ea67b3ed799d4b3341f7688ebcd73f7ca7f66c2a8.service: Deactivated successfully.
Dec 15 04:50:51 localhost systemd[1]: tmp-crun.Y0qaMd.mount: Deactivated successfully.
Dec 15 04:50:51 localhost podman[294804]: 2025-12-15 09:50:51.322949503 +0000 UTC m=+0.090599346 container exec 8dcda56b365b42dc8758aab77a9ec80db304780e449052738f7e4e648ae1ecaf (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-bce17446-41b5-5408-a23e-0b011906b44a-crash-np0005559462, io.buildah.version=1.41.4, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, RELEASE=main, version=7, CEPH_POINT_RELEASE=, com.redhat.component=rhceph-container, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, GIT_BRANCH=main, maintainer=Guillaume Abrioux , build-date=2025-11-26T19:44:28Z, url=https://catalog.redhat.com/en/search?searchType=containers, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vcs-type=git, name=rhceph, io.openshift.tags=rhceph ceph, io.openshift.expose-services=, vendor=Red Hat, Inc., architecture=x86_64, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, ceph=True, GIT_CLEAN=True, io.k8s.description=Red Hat Ceph Storage 7, GIT_REPO=https://github.com/ceph/ceph-container.git, release=1763362218, description=Red Hat Ceph Storage 7)
Dec 15 04:50:51 localhost podman[294804]: 2025-12-15 09:50:51.427455627 +0000 UTC m=+0.195105520 container exec_died 8dcda56b365b42dc8758aab77a9ec80db304780e449052738f7e4e648ae1ecaf (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-bce17446-41b5-5408-a23e-0b011906b44a-crash-np0005559462, vcs-type=git, maintainer=Guillaume Abrioux , architecture=x86_64, description=Red Hat Ceph Storage 7, io.buildah.version=1.41.4, GIT_BRANCH=main, io.k8s.description=Red Hat Ceph Storage 7, com.redhat.component=rhceph-container, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, url=https://catalog.redhat.com/en/search?searchType=containers, RELEASE=main, name=rhceph, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, build-date=2025-11-26T19:44:28Z, io.openshift.expose-services=, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., version=7, distribution-scope=public, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, GIT_REPO=https://github.com/ceph/ceph-container.git, release=1763362218, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, vendor=Red Hat, Inc., ceph=True, io.openshift.tags=rhceph ceph, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, CEPH_POINT_RELEASE=, GIT_CLEAN=True)
Dec 15 04:50:51 localhost ovn_metadata_agent[160585]: 2025-12-15 09:50:51.467 160590 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec 15 04:50:51 localhost ovn_metadata_agent[160585]: 2025-12-15 09:50:51.467 160590 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec 15 04:50:51 localhost ovn_metadata_agent[160585]: 2025-12-15 09:50:51.468 160590 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec 15 04:50:51 localhost ceph-mon[293875]: mon.np0005559462@5(peon).osd e84 _set_new_cache_sizes cache_size:1019627702 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 15 04:50:51 localhost ceph-mon[293875]: [15/Dec/2025:09:50:50] ENGINE Bus STARTING
Dec 15 04:50:51 localhost ceph-mon[293875]: [15/Dec/2025:09:50:51] ENGINE Serving on https://172.18.0.105:7150
Dec 15 04:50:51 localhost ceph-mon[293875]: [15/Dec/2025:09:50:51] ENGINE Client ('172.18.0.105', 34708) lost — peer dropped the TLS connection suddenly, during handshake: (6, 'TLS/SSL connection has been closed (EOF) (_ssl.c:1147)')
Dec 15 04:50:51 localhost ceph-mon[293875]: [15/Dec/2025:09:50:51] ENGINE Serving on http://172.18.0.105:8765
Dec 15 04:50:51 localhost ceph-mon[293875]: [15/Dec/2025:09:50:51] ENGINE Bus STARTED
Dec 15 04:50:52 localhost ceph-mon[293875]: from='mgr.24103 ' entity='mgr.np0005559461.egwgzn'
Dec 15 04:50:52 localhost ceph-mon[293875]: from='mgr.24103 ' entity='mgr.np0005559461.egwgzn'
Dec 15 04:50:52 localhost ceph-mon[293875]: from='mgr.24103 ' entity='mgr.np0005559461.egwgzn'
Dec 15 04:50:52 localhost ceph-mon[293875]: from='mgr.24103 ' entity='mgr.np0005559461.egwgzn'
Dec 15 04:50:52 localhost ceph-mon[293875]: from='mgr.24103 ' entity='mgr.np0005559461.egwgzn'
Dec 15 04:50:52 localhost ceph-mon[293875]: from='mgr.24103 ' entity='mgr.np0005559461.egwgzn'
Dec 15 04:50:52 localhost ceph-mon[293875]: from='mgr.24103 ' entity='mgr.np0005559461.egwgzn'
Dec 15 04:50:52 localhost ceph-mon[293875]: from='mgr.24103 ' entity='mgr.np0005559461.egwgzn'
Dec 15 04:50:52 localhost ceph-mon[293875]: from='mgr.24103 ' entity='mgr.np0005559461.egwgzn'
Dec 15 04:50:52 localhost ceph-mon[293875]: from='mgr.24103 ' entity='mgr.np0005559461.egwgzn'
Dec 15 04:50:52 localhost ceph-mon[293875]: from='mgr.24103 ' entity='mgr.np0005559461.egwgzn'
Dec 15 04:50:52 localhost ceph-mon[293875]: from='mgr.24103 ' entity='mgr.np0005559461.egwgzn'
Dec 15 04:50:54 localhost ceph-mon[293875]: from='mgr.24103 ' entity='mgr.np0005559461.egwgzn'
Dec 15 04:50:54 localhost ceph-mon[293875]: from='mgr.24103 ' entity='mgr.np0005559461.egwgzn'
Dec 15 04:50:54 localhost ceph-mon[293875]: from='mgr.24103 ' entity='mgr.np0005559461.egwgzn'
Dec 15 04:50:54 localhost ceph-mon[293875]: from='mgr.24103 ' entity='mgr.np0005559461.egwgzn'
Dec 15 04:50:54 localhost ceph-mon[293875]: from='mgr.24103 ' entity='mgr.np0005559461.egwgzn'
Dec 15 04:50:54 localhost ceph-mon[293875]: from='mgr.24103 172.18.0.105:0/2912776587' entity='mgr.np0005559461.egwgzn' cmd={"prefix": "config rm", "who": "osd/host:np0005559461", "name": "osd_memory_target"} : dispatch
Dec 15 04:50:54 localhost ceph-mon[293875]: from='mgr.24103 ' entity='mgr.np0005559461.egwgzn'
Dec 15 04:50:54 localhost ceph-mon[293875]: from='mgr.24103 ' entity='mgr.np0005559461.egwgzn' cmd={"prefix": "config rm", "who": "osd/host:np0005559461", "name": "osd_memory_target"} : dispatch
Dec 15 04:50:54 localhost ceph-mon[293875]: from='mgr.24103 ' entity='mgr.np0005559461.egwgzn'
Dec 15 04:50:54 localhost ceph-mon[293875]: from='mgr.24103 172.18.0.105:0/2912776587' entity='mgr.np0005559461.egwgzn' cmd={"prefix": "config rm", "who": "osd/host:np0005559459", "name": "osd_memory_target"} : dispatch
Dec 15 04:50:54 localhost ceph-mon[293875]: from='mgr.24103 ' entity='mgr.np0005559461.egwgzn'
Dec 15 04:50:54 localhost ceph-mon[293875]: from='mgr.24103 ' entity='mgr.np0005559461.egwgzn' cmd={"prefix": "config rm", "who": "osd/host:np0005559459", "name": "osd_memory_target"} : dispatch
Dec 15 04:50:54 localhost ceph-mon[293875]: from='mgr.24103 ' entity='mgr.np0005559461.egwgzn'
Dec 15 04:50:54 localhost ceph-mon[293875]: from='mgr.24103 172.18.0.105:0/2912776587' entity='mgr.np0005559461.egwgzn' cmd={"prefix": "config rm", "who": "osd.1", "name": "osd_memory_target"} : dispatch
Dec 15 04:50:54 localhost ceph-mon[293875]: from='mgr.24103 ' entity='mgr.np0005559461.egwgzn'
Dec 15 04:50:54 localhost ceph-mon[293875]: from='mgr.24103 172.18.0.105:0/2912776587' entity='mgr.np0005559461.egwgzn' cmd={"prefix": "config rm", "who": "osd.0", "name": "osd_memory_target"} : dispatch
Dec 15 04:50:54 localhost ceph-mon[293875]: from='mgr.24103 ' entity='mgr.np0005559461.egwgzn' cmd={"prefix": "config rm", "who": "osd.1", "name": "osd_memory_target"} : dispatch
Dec 15 04:50:54 localhost ceph-mon[293875]: from='mgr.24103 ' entity='mgr.np0005559461.egwgzn' cmd={"prefix": "config rm", "who": "osd.0", "name": "osd_memory_target"} : dispatch
Dec 15 04:50:54 localhost ceph-mon[293875]: from='mgr.24103 ' entity='mgr.np0005559461.egwgzn'
Dec 15 04:50:54 localhost ceph-mon[293875]: from='mgr.24103 172.18.0.105:0/2912776587' entity='mgr.np0005559461.egwgzn' cmd={"prefix": "config rm", "who": "osd/host:np0005559460", "name": "osd_memory_target"} : dispatch
Dec 15 04:50:54 localhost ceph-mon[293875]: from='mgr.24103 ' entity='mgr.np0005559461.egwgzn'
Dec 15 04:50:54 localhost ceph-mon[293875]: from='mgr.24103 ' entity='mgr.np0005559461.egwgzn' cmd={"prefix": "config rm", "who": "osd/host:np0005559460", "name": "osd_memory_target"} : dispatch
Dec 15 04:50:54 localhost ceph-mon[293875]: from='mgr.24103 172.18.0.105:0/2912776587' entity='mgr.np0005559461.egwgzn' cmd={"prefix": "config rm", "who": "osd.4", "name": "osd_memory_target"} : dispatch
Dec 15 04:50:54 localhost ceph-mon[293875]: from='mgr.24103 172.18.0.105:0/2912776587' entity='mgr.np0005559461.egwgzn' cmd={"prefix": "config rm", "who": "osd.2", "name": "osd_memory_target"} : dispatch
Dec 15 04:50:54 localhost ceph-mon[293875]: from='mgr.24103 ' entity='mgr.np0005559461.egwgzn' cmd={"prefix": "config rm", "who": "osd.4", "name": "osd_memory_target"} : dispatch
Dec 15 04:50:54 localhost ceph-mon[293875]: from='mgr.24103 172.18.0.105:0/2912776587' entity='mgr.np0005559461.egwgzn' cmd={"prefix": "config rm", "who": "osd.3", "name": "osd_memory_target"} : dispatch
Dec 15 04:50:54 localhost ceph-mon[293875]: from='mgr.24103 ' entity='mgr.np0005559461.egwgzn' cmd={"prefix": "config rm", "who": "osd.2", "name": "osd_memory_target"} : dispatch
Dec 15 04:50:54 localhost ceph-mon[293875]: from='mgr.24103 ' entity='mgr.np0005559461.egwgzn' cmd={"prefix":
"config rm", "who": "osd.3", "name": "osd_memory_target"} : dispatch Dec 15 04:50:54 localhost ceph-mon[293875]: Adjusting osd_memory_target on np0005559464.localdomain to 836.6M Dec 15 04:50:54 localhost ceph-mon[293875]: from='mgr.24103 172.18.0.105:0/2912776587' entity='mgr.np0005559461.egwgzn' cmd={"prefix": "config rm", "who": "osd.5", "name": "osd_memory_target"} : dispatch Dec 15 04:50:54 localhost ceph-mon[293875]: Adjusting osd_memory_target on np0005559462.localdomain to 836.6M Dec 15 04:50:54 localhost ceph-mon[293875]: from='mgr.24103 ' entity='mgr.np0005559461.egwgzn' cmd={"prefix": "config rm", "who": "osd.5", "name": "osd_memory_target"} : dispatch Dec 15 04:50:54 localhost ceph-mon[293875]: Adjusting osd_memory_target on np0005559463.localdomain to 836.6M Dec 15 04:50:54 localhost ceph-mon[293875]: Unable to set osd_memory_target on np0005559464.localdomain to 877246668: error parsing value: Value '877246668' is below minimum 939524096 Dec 15 04:50:54 localhost ceph-mon[293875]: Unable to set osd_memory_target on np0005559462.localdomain to 877243801: error parsing value: Value '877243801' is below minimum 939524096 Dec 15 04:50:54 localhost ceph-mon[293875]: Unable to set osd_memory_target on np0005559463.localdomain to 877246668: error parsing value: Value '877246668' is below minimum 939524096 Dec 15 04:50:54 localhost ceph-mon[293875]: from='mgr.24103 172.18.0.105:0/2912776587' entity='mgr.np0005559461.egwgzn' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch Dec 15 04:50:54 localhost ceph-mon[293875]: Updating np0005559459.localdomain:/etc/ceph/ceph.conf Dec 15 04:50:54 localhost ceph-mon[293875]: Updating np0005559460.localdomain:/etc/ceph/ceph.conf Dec 15 04:50:54 localhost ceph-mon[293875]: Updating np0005559461.localdomain:/etc/ceph/ceph.conf Dec 15 04:50:54 localhost ceph-mon[293875]: Updating np0005559462.localdomain:/etc/ceph/ceph.conf Dec 15 04:50:54 localhost ceph-mon[293875]: Updating 
np0005559463.localdomain:/etc/ceph/ceph.conf Dec 15 04:50:54 localhost ceph-mon[293875]: Updating np0005559464.localdomain:/etc/ceph/ceph.conf Dec 15 04:50:54 localhost ceph-mon[293875]: Updating np0005559463.localdomain:/var/lib/ceph/bce17446-41b5-5408-a23e-0b011906b44a/config/ceph.conf Dec 15 04:50:54 localhost ceph-mon[293875]: Updating np0005559460.localdomain:/var/lib/ceph/bce17446-41b5-5408-a23e-0b011906b44a/config/ceph.conf Dec 15 04:50:54 localhost ceph-mon[293875]: Updating np0005559464.localdomain:/var/lib/ceph/bce17446-41b5-5408-a23e-0b011906b44a/config/ceph.conf Dec 15 04:50:54 localhost ceph-mon[293875]: Updating np0005559461.localdomain:/var/lib/ceph/bce17446-41b5-5408-a23e-0b011906b44a/config/ceph.conf Dec 15 04:50:54 localhost ceph-mon[293875]: Updating np0005559462.localdomain:/var/lib/ceph/bce17446-41b5-5408-a23e-0b011906b44a/config/ceph.conf Dec 15 04:50:54 localhost ceph-mon[293875]: Updating np0005559459.localdomain:/var/lib/ceph/bce17446-41b5-5408-a23e-0b011906b44a/config/ceph.conf Dec 15 04:50:55 localhost nova_compute[286344]: 2025-12-15 09:50:55.019 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 04:50:55 localhost nova_compute[286344]: 2025-12-15 09:50:55.074 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 04:50:55 localhost ceph-mon[293875]: Updating np0005559460.localdomain:/etc/ceph/ceph.client.admin.keyring Dec 15 04:50:55 localhost ceph-mon[293875]: Updating np0005559463.localdomain:/etc/ceph/ceph.client.admin.keyring Dec 15 04:50:55 localhost ceph-mon[293875]: Updating np0005559461.localdomain:/etc/ceph/ceph.client.admin.keyring Dec 15 04:50:55 localhost ceph-mon[293875]: Updating np0005559462.localdomain:/etc/ceph/ceph.client.admin.keyring Dec 15 04:50:55 localhost ceph-mon[293875]: Updating 
np0005559459.localdomain:/etc/ceph/ceph.client.admin.keyring Dec 15 04:50:55 localhost ceph-mon[293875]: Updating np0005559464.localdomain:/etc/ceph/ceph.client.admin.keyring Dec 15 04:50:56 localhost ceph-mon[293875]: mon.np0005559462@5(peon) e6 handle_command mon_command({"prefix": "df", "format": "json"} v 0) Dec 15 04:50:56 localhost ceph-mon[293875]: log_channel(audit) log [DBG] : from='client.? 172.18.0.108:0/2503775088' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch Dec 15 04:50:56 localhost ceph-mon[293875]: mon.np0005559462@5(peon).osd e84 _set_new_cache_sizes cache_size:1020044617 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Dec 15 04:50:56 localhost ceph-mon[293875]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #13. Immutable memtables: 0. Dec 15 04:50:56 localhost ceph-mon[293875]: rocksdb: (Original Log Time 2025/12/15-09:50:56.623526) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0 Dec 15 04:50:56 localhost ceph-mon[293875]: rocksdb: [db/flush_job.cc:856] [default] [JOB 3] Flushing memtable with next log file: 13 Dec 15 04:50:56 localhost ceph-mon[293875]: rocksdb: EVENT_LOG_v1 {"time_micros": 1765792256623642, "job": 3, "event": "flush_started", "num_memtables": 1, "num_entries": 10771, "num_deletes": 530, "total_data_size": 15269408, "memory_usage": 15977584, "flush_reason": "Manual Compaction"} Dec 15 04:50:56 localhost ceph-mon[293875]: rocksdb: [db/flush_job.cc:885] [default] [JOB 3] Level-0 flush table #14: started Dec 15 04:50:56 localhost ceph-mon[293875]: rocksdb: EVENT_LOG_v1 {"time_micros": 1765792256710212, "cf_name": "default", "job": 3, "event": "table_file_creation", "file_number": 14, "file_size": 11017538, "file_checksum": "", "file_checksum_func_name": "Unknown", 
"smallest_seqno": 6, "largest_seqno": 10776, "table_properties": {"data_size": 10965140, "index_size": 27140, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 24197, "raw_key_size": 254452, "raw_average_key_size": 26, "raw_value_size": 10802341, "raw_average_value_size": 1117, "num_data_blocks": 1020, "num_entries": 9670, "num_filter_entries": 9670, "num_deletions": 526, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1765792231, "oldest_key_time": 1765792231, "file_creation_time": 1765792256, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "8a79eac5-c138-4069-90b4-52dac737d5cc", "db_session_id": "227D9X1HL8Q2PJIK8L6U", "orig_file_number": 14, "seqno_to_time_mapping": "N/A"}} Dec 15 04:50:56 localhost ceph-mon[293875]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 3] Flush lasted 86763 microseconds, and 24940 cpu microseconds. 
Dec 15 04:50:56 localhost ceph-mon[293875]: rocksdb: (Original Log Time 2025/12/15-09:50:56.710290) [db/flush_job.cc:967] [default] [JOB 3] Level-0 flush table #14: 11017538 bytes OK Dec 15 04:50:56 localhost ceph-mon[293875]: rocksdb: (Original Log Time 2025/12/15-09:50:56.710317) [db/memtable_list.cc:519] [default] Level-0 commit table #14 started Dec 15 04:50:56 localhost ceph-mon[293875]: rocksdb: (Original Log Time 2025/12/15-09:50:56.712173) [db/memtable_list.cc:722] [default] Level-0 commit table #14: memtable #1 done Dec 15 04:50:56 localhost ceph-mon[293875]: rocksdb: (Original Log Time 2025/12/15-09:50:56.712195) EVENT_LOG_v1 {"time_micros": 1765792256712188, "job": 3, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [2, 0, 0, 0, 0, 0, 0], "immutable_memtables": 0} Dec 15 04:50:56 localhost ceph-mon[293875]: rocksdb: (Original Log Time 2025/12/15-09:50:56.712218) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: files[2 0 0 0 0 0 0] max score 0.50 Dec 15 04:50:56 localhost ceph-mon[293875]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 3] Try to delete WAL files size 15195833, prev total WAL file size 15197705, number of live WAL files 2. Dec 15 04:50:56 localhost ceph-mon[293875]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005559462/store.db/000009.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000 Dec 15 04:50:56 localhost ceph-mon[293875]: rocksdb: (Original Log Time 2025/12/15-09:50:56.714664) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '7061786F73003130303430' seq:72057594037927935, type:22 .. 
'7061786F73003130323932' seq:0, type:0; will stop at (end) Dec 15 04:50:56 localhost ceph-mon[293875]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 4] Compacting 2@0 files to L6, score -1.00 Dec 15 04:50:56 localhost ceph-mon[293875]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 3 Base level 0, inputs: [14(10MB) 8(2012B)] Dec 15 04:50:56 localhost ceph-mon[293875]: rocksdb: EVENT_LOG_v1 {"time_micros": 1765792256714788, "job": 4, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [14, 8], "score": -1, "input_data_size": 11019550, "oldest_snapshot_seqno": -1} Dec 15 04:50:56 localhost ceph-mon[293875]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 4] Generated table #15: 9148 keys, 11010212 bytes, temperature: kUnknown Dec 15 04:50:56 localhost ceph-mon[293875]: rocksdb: EVENT_LOG_v1 {"time_micros": 1765792256818573, "cf_name": "default", "job": 4, "event": "table_file_creation", "file_number": 15, "file_size": 11010212, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 10959111, "index_size": 27128, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 22917, "raw_key_size": 245852, "raw_average_key_size": 26, "raw_value_size": 10802996, "raw_average_value_size": 1180, "num_data_blocks": 1018, "num_entries": 9148, "num_filter_entries": 9148, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; 
zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1765792231, "oldest_key_time": 0, "file_creation_time": 1765792256, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "8a79eac5-c138-4069-90b4-52dac737d5cc", "db_session_id": "227D9X1HL8Q2PJIK8L6U", "orig_file_number": 15, "seqno_to_time_mapping": "N/A"}} Dec 15 04:50:56 localhost ceph-mon[293875]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed. Dec 15 04:50:56 localhost ceph-mon[293875]: rocksdb: (Original Log Time 2025/12/15-09:50:56.819797) [db/compaction/compaction_job.cc:1663] [default] [JOB 4] Compacted 2@0 files to L6 => 11010212 bytes Dec 15 04:50:56 localhost ceph-mon[293875]: rocksdb: (Original Log Time 2025/12/15-09:50:56.821825) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 105.9 rd, 105.8 wr, level 6, files in(2, 0) out(1 +0 blob) MB in(10.5, 0.0 +0.0 blob) out(10.5 +0.0 blob), read-write-amplify(2.0) write-amplify(1.0) OK, records in: 9675, records dropped: 527 output_compression: NoCompression Dec 15 04:50:56 localhost ceph-mon[293875]: rocksdb: (Original Log Time 2025/12/15-09:50:56.821858) EVENT_LOG_v1 {"time_micros": 1765792256821843, "job": 4, "event": "compaction_finished", "compaction_time_micros": 104034, "compaction_time_cpu_micros": 34229, "output_level": 6, "num_output_files": 1, "total_output_size": 11010212, "num_input_records": 9675, "num_output_records": 9148, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]} Dec 15 04:50:56 localhost ceph-mon[293875]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005559462/store.db/000014.sst immediately, 
rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000 Dec 15 04:50:56 localhost ceph-mon[293875]: rocksdb: EVENT_LOG_v1 {"time_micros": 1765792256824360, "job": 4, "event": "table_file_deletion", "file_number": 14} Dec 15 04:50:56 localhost ceph-mon[293875]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005559462/store.db/000008.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000 Dec 15 04:50:56 localhost ceph-mon[293875]: rocksdb: EVENT_LOG_v1 {"time_micros": 1765792256824608, "job": 4, "event": "table_file_deletion", "file_number": 8} Dec 15 04:50:56 localhost ceph-mon[293875]: rocksdb: (Original Log Time 2025/12/15-09:50:56.714500) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting Dec 15 04:50:56 localhost ceph-mon[293875]: Updating np0005559460.localdomain:/var/lib/ceph/bce17446-41b5-5408-a23e-0b011906b44a/config/ceph.client.admin.keyring Dec 15 04:50:56 localhost ceph-mon[293875]: Updating np0005559461.localdomain:/var/lib/ceph/bce17446-41b5-5408-a23e-0b011906b44a/config/ceph.client.admin.keyring Dec 15 04:50:56 localhost ceph-mon[293875]: Updating np0005559462.localdomain:/var/lib/ceph/bce17446-41b5-5408-a23e-0b011906b44a/config/ceph.client.admin.keyring Dec 15 04:50:56 localhost ceph-mon[293875]: Updating np0005559464.localdomain:/var/lib/ceph/bce17446-41b5-5408-a23e-0b011906b44a/config/ceph.client.admin.keyring Dec 15 04:50:56 localhost ceph-mon[293875]: Updating np0005559463.localdomain:/var/lib/ceph/bce17446-41b5-5408-a23e-0b011906b44a/config/ceph.client.admin.keyring Dec 15 04:50:56 localhost ceph-mon[293875]: Updating np0005559459.localdomain:/var/lib/ceph/bce17446-41b5-5408-a23e-0b011906b44a/config/ceph.client.admin.keyring Dec 15 04:50:56 localhost ceph-mon[293875]: from='mgr.24103 ' entity='mgr.np0005559461.egwgzn' Dec 15 04:50:56 localhost ceph-mon[293875]: from='mgr.24103 ' entity='mgr.np0005559461.egwgzn' Dec 15 04:50:56 localhost 
ceph-mon[293875]: from='mgr.24103 ' entity='mgr.np0005559461.egwgzn' Dec 15 04:50:56 localhost ceph-mon[293875]: from='mgr.24103 ' entity='mgr.np0005559461.egwgzn' Dec 15 04:50:56 localhost ceph-mon[293875]: from='mgr.24103 ' entity='mgr.np0005559461.egwgzn' Dec 15 04:50:56 localhost ceph-mon[293875]: from='mgr.24103 ' entity='mgr.np0005559461.egwgzn' Dec 15 04:50:57 localhost ceph-mon[293875]: mon.np0005559462@5(peon) e6 handle_command mon_command({"prefix": "df", "format": "json"} v 0) Dec 15 04:50:57 localhost ceph-mon[293875]: log_channel(audit) log [DBG] : from='client.? 172.18.0.108:0/2404875704' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch Dec 15 04:50:58 localhost ceph-mon[293875]: from='mgr.24103 ' entity='mgr.np0005559461.egwgzn' Dec 15 04:50:58 localhost ceph-mon[293875]: from='mgr.24103 ' entity='mgr.np0005559461.egwgzn' Dec 15 04:50:58 localhost ceph-mon[293875]: from='mgr.24103 ' entity='mgr.np0005559461.egwgzn' Dec 15 04:50:58 localhost ceph-mon[293875]: from='mgr.24103 ' entity='mgr.np0005559461.egwgzn' Dec 15 04:50:58 localhost ceph-mon[293875]: from='mgr.24103 ' entity='mgr.np0005559461.egwgzn' Dec 15 04:50:58 localhost ceph-mon[293875]: from='mgr.24103 ' entity='mgr.np0005559461.egwgzn' Dec 15 04:50:58 localhost ceph-mon[293875]: from='mgr.24103 ' entity='mgr.np0005559461.egwgzn' Dec 15 04:50:58 localhost ceph-mon[293875]: Reconfiguring mgr.np0005559460.oexkup (monmap changed)... 
Dec 15 04:50:58 localhost ceph-mon[293875]: from='mgr.24103 172.18.0.105:0/2912776587' entity='mgr.np0005559461.egwgzn' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005559460.oexkup", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch Dec 15 04:50:58 localhost ceph-mon[293875]: from='mgr.24103 ' entity='mgr.np0005559461.egwgzn' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005559460.oexkup", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch Dec 15 04:50:58 localhost ceph-mon[293875]: Reconfiguring daemon mgr.np0005559460.oexkup on np0005559460.localdomain Dec 15 04:50:59 localhost ceph-mon[293875]: from='mgr.24103 ' entity='mgr.np0005559461.egwgzn' Dec 15 04:50:59 localhost ceph-mon[293875]: from='mgr.24103 ' entity='mgr.np0005559461.egwgzn' Dec 15 04:50:59 localhost ceph-mon[293875]: Reconfiguring mon.np0005559461 (monmap changed)... Dec 15 04:50:59 localhost ceph-mon[293875]: from='mgr.24103 172.18.0.105:0/2912776587' entity='mgr.np0005559461.egwgzn' cmd={"prefix": "auth get", "entity": "mon."} : dispatch Dec 15 04:50:59 localhost ceph-mon[293875]: Reconfiguring daemon mon.np0005559461 on np0005559461.localdomain Dec 15 04:50:59 localhost ceph-mon[293875]: from='mgr.24103 ' entity='mgr.np0005559461.egwgzn' Dec 15 04:50:59 localhost ceph-mon[293875]: from='mgr.24103 ' entity='mgr.np0005559461.egwgzn' Dec 15 04:50:59 localhost ceph-mon[293875]: from='mgr.24103 172.18.0.105:0/2912776587' entity='mgr.np0005559461.egwgzn' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005559461.egwgzn", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch Dec 15 04:50:59 localhost ceph-mon[293875]: from='mgr.24103 ' entity='mgr.np0005559461.egwgzn' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005559461.egwgzn", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch Dec 15 04:50:59 localhost sshd[295710]: main: sshd: ssh-rsa algorithm is 
disabled Dec 15 04:50:59 localhost systemd[1]: Started /usr/bin/podman healthcheck run 4d46c406a3855fd1c322e5d86c048188ea5865daa07ba23aaebacc7879668923. Dec 15 04:50:59 localhost podman[295712]: 2025-12-15 09:50:59.831896494 +0000 UTC m=+0.086548604 container health_status 4d46c406a3855fd1c322e5d86c048188ea5865daa07ba23aaebacc7879668923 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '07f3af15d79276418f54a8dde2fc251064527c3571430e14af77aaa2c01f1193-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_managed=true, 
maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0) Dec 15 04:50:59 localhost podman[295712]: 2025-12-15 09:50:59.841342227 +0000 UTC m=+0.095994317 container exec_died 4d46c406a3855fd1c322e5d86c048188ea5865daa07ba23aaebacc7879668923 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, container_name=ovn_metadata_agent, config_id=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '07f3af15d79276418f54a8dde2fc251064527c3571430e14af77aaa2c01f1193-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, 
io.buildah.version=1.41.3, org.label-schema.build-date=20251202) Dec 15 04:50:59 localhost systemd[1]: 4d46c406a3855fd1c322e5d86c048188ea5865daa07ba23aaebacc7879668923.service: Deactivated successfully. Dec 15 04:51:00 localhost nova_compute[286344]: 2025-12-15 09:51:00.058 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 04:51:00 localhost nova_compute[286344]: 2025-12-15 09:51:00.076 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 04:51:00 localhost nova_compute[286344]: 2025-12-15 09:51:00.270 286348 DEBUG oslo_service.periodic_task [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Dec 15 04:51:00 localhost nova_compute[286344]: 2025-12-15 09:51:00.338 286348 DEBUG oslo_concurrency.lockutils [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Dec 15 04:51:00 localhost nova_compute[286344]: 2025-12-15 09:51:00.338 286348 DEBUG oslo_concurrency.lockutils [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Dec 15 04:51:00 localhost nova_compute[286344]: 2025-12-15 09:51:00.338 286348 DEBUG oslo_concurrency.lockutils [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s 
inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Dec 15 04:51:00 localhost nova_compute[286344]: 2025-12-15 09:51:00.339 286348 DEBUG nova.compute.resource_tracker [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Auditing locally available compute resources for np0005559462.localdomain (node: np0005559462.localdomain) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m Dec 15 04:51:00 localhost nova_compute[286344]: 2025-12-15 09:51:00.339 286348 DEBUG oslo_concurrency.processutils [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m Dec 15 04:51:00 localhost ceph-mon[293875]: Reconfiguring mgr.np0005559461.egwgzn (monmap changed)... Dec 15 04:51:00 localhost ceph-mon[293875]: Reconfiguring daemon mgr.np0005559461.egwgzn on np0005559461.localdomain Dec 15 04:51:00 localhost ceph-mon[293875]: from='mgr.24103 ' entity='mgr.np0005559461.egwgzn' Dec 15 04:51:00 localhost ceph-mon[293875]: from='mgr.24103 ' entity='mgr.np0005559461.egwgzn' Dec 15 04:51:00 localhost ceph-mon[293875]: from='mgr.24103 ' entity='mgr.np0005559461.egwgzn' Dec 15 04:51:00 localhost ceph-mon[293875]: from='mgr.24103 172.18.0.105:0/2912776587' entity='mgr.np0005559461.egwgzn' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005559461.localdomain", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch Dec 15 04:51:00 localhost ceph-mon[293875]: from='mgr.24103 ' entity='mgr.np0005559461.egwgzn' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005559461.localdomain", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch Dec 15 04:51:00 localhost nova_compute[286344]: 2025-12-15 09:51:00.791 286348 DEBUG oslo_concurrency.processutils [None 
req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.452s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m Dec 15 04:51:00 localhost nova_compute[286344]: 2025-12-15 09:51:00.893 286348 DEBUG nova.virt.libvirt.driver [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m Dec 15 04:51:00 localhost nova_compute[286344]: 2025-12-15 09:51:00.893 286348 DEBUG nova.virt.libvirt.driver [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m Dec 15 04:51:01 localhost nova_compute[286344]: 2025-12-15 09:51:01.113 286348 WARNING nova.virt.libvirt.driver [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] This host appears to have multiple sockets per NUMA node. 
The `socket` PCI NUMA affinity will not be supported.#033[00m Dec 15 04:51:01 localhost nova_compute[286344]: 2025-12-15 09:51:01.114 286348 DEBUG nova.compute.resource_tracker [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Hypervisor/Node resource view: name=np0005559462.localdomain free_ram=11844MB free_disk=41.836978912353516GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", 
"product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m Dec 15 04:51:01 localhost nova_compute[286344]: 2025-12-15 09:51:01.115 286348 DEBUG oslo_concurrency.lockutils [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Dec 15 04:51:01 localhost nova_compute[286344]: 2025-12-15 09:51:01.115 286348 DEBUG oslo_concurrency.lockutils [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Dec 15 04:51:01 localhost nova_compute[286344]: 2025-12-15 09:51:01.173 286348 DEBUG nova.compute.resource_tracker [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Instance 39ff1bd9-6f6b-44c8-bbec-a1fd9d196359 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 2, 'MEMORY_MB': 512, 'VCPU': 1}}. 
_remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m Dec 15 04:51:01 localhost nova_compute[286344]: 2025-12-15 09:51:01.173 286348 DEBUG nova.compute.resource_tracker [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m Dec 15 04:51:01 localhost nova_compute[286344]: 2025-12-15 09:51:01.174 286348 DEBUG nova.compute.resource_tracker [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Final resource view: name=np0005559462.localdomain phys_ram=15738MB used_ram=1024MB phys_disk=41GB used_disk=2GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m Dec 15 04:51:01 localhost nova_compute[286344]: 2025-12-15 09:51:01.191 286348 DEBUG nova.scheduler.client.report [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Refreshing inventories for resource provider 26c8956b-6742-4951-b566-971b9bbe323b _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804#033[00m Dec 15 04:51:01 localhost nova_compute[286344]: 2025-12-15 09:51:01.253 286348 DEBUG nova.scheduler.client.report [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Updating ProviderTree inventory for provider 26c8956b-6742-4951-b566-971b9bbe323b from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768#033[00m Dec 15 
04:51:01 localhost nova_compute[286344]: 2025-12-15 09:51:01.253 286348 DEBUG nova.compute.provider_tree [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Updating inventory in ProviderTree for provider 26c8956b-6742-4951-b566-971b9bbe323b with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176#033[00m Dec 15 04:51:01 localhost nova_compute[286344]: 2025-12-15 09:51:01.272 286348 DEBUG nova.scheduler.client.report [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Refreshing aggregate associations for resource provider 26c8956b-6742-4951-b566-971b9bbe323b, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813#033[00m Dec 15 04:51:01 localhost nova_compute[286344]: 2025-12-15 09:51:01.307 286348 DEBUG nova.scheduler.client.report [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Refreshing trait associations for resource provider 26c8956b-6742-4951-b566-971b9bbe323b, traits: 
COMPUTE_NET_VIF_MODEL_RTL8139,COMPUTE_STORAGE_BUS_IDE,COMPUTE_DEVICE_TAGGING,COMPUTE_NET_VIF_MODEL_PCNET,COMPUTE_NODE,COMPUTE_STORAGE_BUS_USB,COMPUTE_IMAGE_TYPE_AKI,COMPUTE_NET_ATTACH_INTERFACE,HW_CPU_X86_CLMUL,HW_CPU_X86_ABM,HW_CPU_X86_SSE2,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_NET_VIF_MODEL_E1000,HW_CPU_X86_SSE41,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_NET_VIF_MODEL_VIRTIO,HW_CPU_X86_BMI2,HW_CPU_X86_SSE42,HW_CPU_X86_FMA3,COMPUTE_SECURITY_UEFI_SECURE_BOOT,COMPUTE_IMAGE_TYPE_RAW,HW_CPU_X86_BMI,COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_ACCELERATORS,COMPUTE_IMAGE_TYPE_ARI,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_STORAGE_BUS_FDC,COMPUTE_VIOMMU_MODEL_INTEL,COMPUTE_GRAPHICS_MODEL_NONE,HW_CPU_X86_AMD_SVM,COMPUTE_NET_VIF_MODEL_E1000E,HW_CPU_X86_SSE,HW_CPU_X86_AVX,HW_CPU_X86_SVM,COMPUTE_SECURITY_TPM_2_0,COMPUTE_TRUSTED_CERTS,COMPUTE_RESCUE_BFV,HW_CPU_X86_AVX2,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_VIOMMU_MODEL_AUTO,COMPUTE_VIOMMU_MODEL_VIRTIO,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_NET_VIF_MODEL_NE2K_PCI,HW_CPU_X86_SSSE3,COMPUTE_IMAGE_TYPE_AMI,HW_CPU_X86_AESNI,HW_CPU_X86_SSE4A,HW_CPU_X86_MMX,COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_VOLUME_EXTEND,HW_CPU_X86_F16C,HW_CPU_X86_SHA,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_STORAGE_BUS_VIRTIO,COMPUTE_SECURITY_TPM_1_2,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,COMPUTE_STORAGE_BUS_SATA _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825#033[00m Dec 15 04:51:01 localhost nova_compute[286344]: 2025-12-15 09:51:01.349 286348 DEBUG oslo_concurrency.processutils [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m Dec 15 04:51:01 localhost ceph-mon[293875]: Reconfiguring crash.np0005559461 (monmap changed)... 
Dec 15 04:51:01 localhost ceph-mon[293875]: Reconfiguring daemon crash.np0005559461 on np0005559461.localdomain Dec 15 04:51:01 localhost ceph-mon[293875]: from='mgr.24103 ' entity='mgr.np0005559461.egwgzn' Dec 15 04:51:01 localhost ceph-mon[293875]: from='mgr.24103 ' entity='mgr.np0005559461.egwgzn' Dec 15 04:51:01 localhost ceph-mon[293875]: from='mgr.24103 172.18.0.105:0/2912776587' entity='mgr.np0005559461.egwgzn' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005559462.localdomain", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch Dec 15 04:51:01 localhost ceph-mon[293875]: from='mgr.24103 ' entity='mgr.np0005559461.egwgzn' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005559462.localdomain", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch Dec 15 04:51:01 localhost ceph-mon[293875]: mon.np0005559462@5(peon).osd e84 _set_new_cache_sizes cache_size:1020054569 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Dec 15 04:51:01 localhost sshd[295820]: main: sshd: ssh-rsa algorithm is disabled Dec 15 04:51:01 localhost ceph-mon[293875]: mon.np0005559462@5(peon) e6 handle_command mon_command({"prefix": "df", "format": "json"} v 0) Dec 15 04:51:01 localhost ceph-mon[293875]: log_channel(audit) log [DBG] : from='client.? 
172.18.0.106:0/3255370743' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch Dec 15 04:51:01 localhost podman[295826]: Dec 15 04:51:01 localhost nova_compute[286344]: 2025-12-15 09:51:01.807 286348 DEBUG oslo_concurrency.processutils [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.458s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m Dec 15 04:51:01 localhost podman[295826]: 2025-12-15 09:51:01.809453976 +0000 UTC m=+0.076478822 container create f91f025db34b7a86b8107dd34bc320c12ea0f5bb6e20614b5839477958a22d4a (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=admiring_diffie, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, com.redhat.component=rhceph-container, GIT_REPO=https://github.com/ceph/ceph-container.git, RELEASE=main, CEPH_POINT_RELEASE=, ceph=True, io.k8s.description=Red Hat Ceph Storage 7, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, vcs-type=git, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, name=rhceph, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., distribution-scope=public, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, build-date=2025-11-26T19:44:28Z, io.buildah.version=1.41.4, architecture=x86_64, io.openshift.expose-services=, url=https://catalog.redhat.com/en/search?searchType=containers, release=1763362218, GIT_CLEAN=True, GIT_BRANCH=main, maintainer=Guillaume Abrioux , io.openshift.tags=rhceph ceph, version=7, description=Red Hat Ceph Storage 7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0) Dec 15 04:51:01 localhost nova_compute[286344]: 2025-12-15 09:51:01.813 286348 DEBUG nova.compute.provider_tree [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - 
- - - - -] Inventory has not changed in ProviderTree for provider: 26c8956b-6742-4951-b566-971b9bbe323b update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m Dec 15 04:51:01 localhost nova_compute[286344]: 2025-12-15 09:51:01.831 286348 DEBUG nova.scheduler.client.report [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Inventory has not changed for provider 26c8956b-6742-4951-b566-971b9bbe323b based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m Dec 15 04:51:01 localhost nova_compute[286344]: 2025-12-15 09:51:01.833 286348 DEBUG nova.compute.resource_tracker [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Compute_service record updated for np0005559462.localdomain:np0005559462.localdomain _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m Dec 15 04:51:01 localhost nova_compute[286344]: 2025-12-15 09:51:01.833 286348 DEBUG oslo_concurrency.lockutils [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.718s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Dec 15 04:51:01 localhost systemd[1]: Started libpod-conmon-f91f025db34b7a86b8107dd34bc320c12ea0f5bb6e20614b5839477958a22d4a.scope. Dec 15 04:51:01 localhost systemd[1]: Started libcrun container. 
Dec 15 04:51:01 localhost podman[243449]: time="2025-12-15T09:51:01Z" level=info msg="List containers: received `last` parameter - overwriting `limit`" Dec 15 04:51:01 localhost podman[295826]: 2025-12-15 09:51:01.778833193 +0000 UTC m=+0.045858039 image pull registry.redhat.io/rhceph/rhceph-7-rhel9:latest Dec 15 04:51:01 localhost podman[295826]: 2025-12-15 09:51:01.942956168 +0000 UTC m=+0.209981014 container init f91f025db34b7a86b8107dd34bc320c12ea0f5bb6e20614b5839477958a22d4a (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=admiring_diffie, name=rhceph, ceph=True, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, url=https://catalog.redhat.com/en/search?searchType=containers, maintainer=Guillaume Abrioux , com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=rhceph-container, io.openshift.expose-services=, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_REPO=https://github.com/ceph/ceph-container.git, description=Red Hat Ceph Storage 7, RELEASE=main, GIT_CLEAN=True, io.openshift.tags=rhceph ceph, io.buildah.version=1.41.4, architecture=x86_64, vcs-type=git, distribution-scope=public, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.k8s.description=Red Hat Ceph Storage 7, build-date=2025-11-26T19:44:28Z, release=1763362218, vendor=Red Hat, Inc., GIT_BRANCH=main, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, CEPH_POINT_RELEASE=, version=7, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0) Dec 15 04:51:01 localhost podman[295826]: 2025-12-15 09:51:01.953462081 +0000 UTC m=+0.220486927 container start f91f025db34b7a86b8107dd34bc320c12ea0f5bb6e20614b5839477958a22d4a (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=admiring_diffie, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat Ceph Storage 7, RELEASE=main, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, CEPH_POINT_RELEASE=, name=rhceph, io.buildah.version=1.41.4, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, distribution-scope=public, maintainer=Guillaume Abrioux , vcs-type=git, com.redhat.component=rhceph-container, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, ceph=True, io.k8s.description=Red Hat Ceph Storage 7, vendor=Red Hat, Inc., summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.tags=rhceph ceph, GIT_REPO=https://github.com/ceph/ceph-container.git, GIT_CLEAN=True, build-date=2025-11-26T19:44:28Z, release=1763362218, version=7, architecture=x86_64, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.openshift.expose-services=, GIT_BRANCH=main) Dec 15 04:51:01 localhost podman[295826]: 2025-12-15 09:51:01.953738348 +0000 UTC m=+0.220763205 container attach f91f025db34b7a86b8107dd34bc320c12ea0f5bb6e20614b5839477958a22d4a (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=admiring_diffie, build-date=2025-11-26T19:44:28Z, name=rhceph, version=7, description=Red Hat Ceph Storage 7, distribution-scope=public, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_BRANCH=main, CEPH_POINT_RELEASE=, GIT_REPO=https://github.com/ceph/ceph-container.git, vcs-type=git, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, url=https://catalog.redhat.com/en/search?searchType=containers, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, com.redhat.component=rhceph-container, vendor=Red Hat, Inc., GIT_CLEAN=True, 
io.openshift.expose-services=, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, RELEASE=main, maintainer=Guillaume Abrioux , ceph=True, io.k8s.description=Red Hat Ceph Storage 7, io.buildah.version=1.41.4, release=1763362218, io.openshift.tags=rhceph ceph) Dec 15 04:51:01 localhost admiring_diffie[295844]: 167 167 Dec 15 04:51:01 localhost systemd[1]: libpod-f91f025db34b7a86b8107dd34bc320c12ea0f5bb6e20614b5839477958a22d4a.scope: Deactivated successfully. Dec 15 04:51:01 localhost podman[243449]: @ - - [15/Dec/2025:09:51:01 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 156600 "" "Go-http-client/1.1" Dec 15 04:51:01 localhost podman[295826]: 2025-12-15 09:51:01.957197256 +0000 UTC m=+0.224222102 container died f91f025db34b7a86b8107dd34bc320c12ea0f5bb6e20614b5839477958a22d4a (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=admiring_diffie, GIT_BRANCH=main, RELEASE=main, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, CEPH_POINT_RELEASE=, maintainer=Guillaume Abrioux , build-date=2025-11-26T19:44:28Z, name=rhceph, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, distribution-scope=public, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.k8s.description=Red Hat Ceph Storage 7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat Ceph Storage 7, url=https://catalog.redhat.com/en/search?searchType=containers, GIT_REPO=https://github.com/ceph/ceph-container.git, vcs-type=git, release=1763362218, io.openshift.expose-services=, GIT_CLEAN=True, version=7, architecture=x86_64, vendor=Red Hat, Inc., summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., 
io.openshift.tags=rhceph ceph, ceph=True, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, com.redhat.component=rhceph-container) Dec 15 04:51:01 localhost podman[243449]: @ - - [15/Dec/2025:09:51:01 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 18997 "" "Go-http-client/1.1" Dec 15 04:51:02 localhost podman[295849]: 2025-12-15 09:51:02.060862946 +0000 UTC m=+0.092161851 container remove f91f025db34b7a86b8107dd34bc320c12ea0f5bb6e20614b5839477958a22d4a (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=admiring_diffie, build-date=2025-11-26T19:44:28Z, release=1763362218, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, architecture=x86_64, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, GIT_BRANCH=main, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.openshift.expose-services=, url=https://catalog.redhat.com/en/search?searchType=containers, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, version=7, com.redhat.component=rhceph-container, vcs-type=git, io.buildah.version=1.41.4, GIT_REPO=https://github.com/ceph/ceph-container.git, ceph=True, name=rhceph, distribution-scope=public, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, GIT_CLEAN=True, maintainer=Guillaume Abrioux , io.openshift.tags=rhceph ceph, vendor=Red Hat, Inc., description=Red Hat Ceph Storage 7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.k8s.description=Red Hat Ceph Storage 7, CEPH_POINT_RELEASE=, RELEASE=main) Dec 15 04:51:02 localhost systemd[1]: libpod-conmon-f91f025db34b7a86b8107dd34bc320c12ea0f5bb6e20614b5839477958a22d4a.scope: Deactivated successfully. Dec 15 04:51:02 localhost ceph-mon[293875]: Reconfiguring crash.np0005559462 (monmap changed)... 
Dec 15 04:51:02 localhost ceph-mon[293875]: Reconfiguring daemon crash.np0005559462 on np0005559462.localdomain Dec 15 04:51:02 localhost ceph-mon[293875]: from='mgr.24103 ' entity='mgr.np0005559461.egwgzn' Dec 15 04:51:02 localhost ceph-mon[293875]: from='mgr.24103 ' entity='mgr.np0005559461.egwgzn' Dec 15 04:51:02 localhost ceph-mon[293875]: from='mgr.24103 172.18.0.105:0/2912776587' entity='mgr.np0005559461.egwgzn' cmd={"prefix": "auth get", "entity": "osd.0"} : dispatch Dec 15 04:51:02 localhost podman[295918]: Dec 15 04:51:02 localhost podman[295918]: 2025-12-15 09:51:02.814793024 +0000 UTC m=+0.075215718 container create fd71d81a0fa87943463c9546410108e6a21dd66c0ef5a340be66569bbc276e7a (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=musing_mirzakhani, ceph=True, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, version=7, maintainer=Guillaume Abrioux , release=1763362218, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, distribution-scope=public, name=rhceph, GIT_REPO=https://github.com/ceph/ceph-container.git, build-date=2025-11-26T19:44:28Z, com.redhat.component=rhceph-container, io.openshift.expose-services=, description=Red Hat Ceph Storage 7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.buildah.version=1.41.4, RELEASE=main, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, GIT_CLEAN=True, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, io.k8s.description=Red Hat Ceph Storage 7, url=https://catalog.redhat.com/en/search?searchType=containers, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.openshift.tags=rhceph ceph, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, architecture=x86_64, GIT_BRANCH=main, CEPH_POINT_RELEASE=) Dec 15 04:51:02 localhost systemd[1]: 
var-lib-containers-storage-overlay-56e2106bbd64f44baab6cd5c5b45f67245c27d9f8d9db1daab1c68e3e5785d45-merged.mount: Deactivated successfully. Dec 15 04:51:02 localhost nova_compute[286344]: 2025-12-15 09:51:02.833 286348 DEBUG oslo_service.periodic_task [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Dec 15 04:51:02 localhost nova_compute[286344]: 2025-12-15 09:51:02.834 286348 DEBUG oslo_service.periodic_task [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Dec 15 04:51:02 localhost systemd[1]: Started libpod-conmon-fd71d81a0fa87943463c9546410108e6a21dd66c0ef5a340be66569bbc276e7a.scope. Dec 15 04:51:02 localhost systemd[1]: Started libcrun container. Dec 15 04:51:02 localhost podman[295918]: 2025-12-15 09:51:02.78379555 +0000 UTC m=+0.044218284 image pull registry.redhat.io/rhceph/rhceph-7-rhel9:latest Dec 15 04:51:02 localhost podman[295918]: 2025-12-15 09:51:02.886412621 +0000 UTC m=+0.146835305 container init fd71d81a0fa87943463c9546410108e6a21dd66c0ef5a340be66569bbc276e7a (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=musing_mirzakhani, release=1763362218, vendor=Red Hat, Inc., RELEASE=main, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., url=https://catalog.redhat.com/en/search?searchType=containers, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, maintainer=Guillaume Abrioux , io.k8s.description=Red Hat Ceph Storage 7, description=Red Hat Ceph Storage 7, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, version=7, com.redhat.component=rhceph-container, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, io.buildah.version=1.41.4, io.openshift.expose-services=, CEPH_POINT_RELEASE=, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, name=rhceph, io.openshift.tags=rhceph ceph, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, architecture=x86_64, distribution-scope=public, ceph=True, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_BRANCH=main, GIT_CLEAN=True, GIT_REPO=https://github.com/ceph/ceph-container.git, build-date=2025-11-26T19:44:28Z) Dec 15 04:51:02 localhost podman[295918]: 2025-12-15 09:51:02.896271826 +0000 UTC m=+0.156694520 container start fd71d81a0fa87943463c9546410108e6a21dd66c0ef5a340be66569bbc276e7a (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=musing_mirzakhani, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, maintainer=Guillaume Abrioux , com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, ceph=True, io.openshift.tags=rhceph ceph, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., RELEASE=main, vendor=Red Hat, Inc., vcs-type=git, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, name=rhceph, io.k8s.description=Red Hat Ceph Storage 7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_BRANCH=main, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, CEPH_POINT_RELEASE=, description=Red Hat Ceph Storage 7, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, architecture=x86_64, io.openshift.expose-services=, build-date=2025-11-26T19:44:28Z, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.component=rhceph-container, release=1763362218, GIT_CLEAN=True, io.buildah.version=1.41.4, version=7, distribution-scope=public, GIT_REPO=https://github.com/ceph/ceph-container.git) Dec 15 04:51:02 localhost podman[295918]: 2025-12-15 
09:51:02.896526303 +0000 UTC m=+0.156949037 container attach fd71d81a0fa87943463c9546410108e6a21dd66c0ef5a340be66569bbc276e7a (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=musing_mirzakhani, name=rhceph, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, distribution-scope=public, vendor=Red Hat, Inc., io.k8s.description=Red Hat Ceph Storage 7, com.redhat.component=rhceph-container, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, description=Red Hat Ceph Storage 7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.buildah.version=1.41.4, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, GIT_REPO=https://github.com/ceph/ceph-container.git, io.openshift.tags=rhceph ceph, architecture=x86_64, CEPH_POINT_RELEASE=, GIT_CLEAN=True, RELEASE=main, version=7, maintainer=Guillaume Abrioux , build-date=2025-11-26T19:44:28Z, release=1763362218, ceph=True, url=https://catalog.redhat.com/en/search?searchType=containers, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, GIT_BRANCH=main) Dec 15 04:51:02 localhost musing_mirzakhani[295933]: 167 167 Dec 15 04:51:02 localhost systemd[1]: libpod-fd71d81a0fa87943463c9546410108e6a21dd66c0ef5a340be66569bbc276e7a.scope: Deactivated successfully. 
Dec 15 04:51:02 localhost podman[295918]: 2025-12-15 09:51:02.89966449 +0000 UTC m=+0.160087214 container died fd71d81a0fa87943463c9546410108e6a21dd66c0ef5a340be66569bbc276e7a (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=musing_mirzakhani, maintainer=Guillaume Abrioux , io.openshift.expose-services=, vendor=Red Hat, Inc., build-date=2025-11-26T19:44:28Z, architecture=x86_64, name=rhceph, GIT_CLEAN=True, ceph=True, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., com.redhat.component=rhceph-container, vcs-type=git, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.openshift.tags=rhceph ceph, description=Red Hat Ceph Storage 7, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, GIT_REPO=https://github.com/ceph/ceph-container.git, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.buildah.version=1.41.4, RELEASE=main, distribution-scope=public, GIT_BRANCH=main, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, release=1763362218, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat Ceph Storage 7, CEPH_POINT_RELEASE=, url=https://catalog.redhat.com/en/search?searchType=containers, version=7) Dec 15 04:51:02 localhost nova_compute[286344]: 2025-12-15 09:51:02.978 286348 DEBUG oslo_service.periodic_task [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Dec 15 04:51:02 localhost nova_compute[286344]: 2025-12-15 09:51:02.979 286348 DEBUG oslo_service.periodic_task [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Dec 
15 04:51:02 localhost nova_compute[286344]: 2025-12-15 09:51:02.980 286348 DEBUG oslo_service.periodic_task [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Dec 15 04:51:03 localhost podman[295938]: 2025-12-15 09:51:03.009384399 +0000 UTC m=+0.095964306 container remove fd71d81a0fa87943463c9546410108e6a21dd66c0ef5a340be66569bbc276e7a (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=musing_mirzakhani, vendor=Red Hat, Inc., release=1763362218, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, GIT_CLEAN=True, io.openshift.tags=rhceph ceph, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhceph, com.redhat.component=rhceph-container, GIT_BRANCH=main, io.buildah.version=1.41.4, CEPH_POINT_RELEASE=, description=Red Hat Ceph Storage 7, distribution-scope=public, version=7, RELEASE=main, maintainer=Guillaume Abrioux , url=https://catalog.redhat.com/en/search?searchType=containers, architecture=x86_64, build-date=2025-11-26T19:44:28Z, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, GIT_REPO=https://github.com/ceph/ceph-container.git, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vcs-type=git, io.k8s.description=Red Hat Ceph Storage 7, ceph=True, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, io.openshift.expose-services=) Dec 15 04:51:03 localhost systemd[1]: libpod-conmon-fd71d81a0fa87943463c9546410108e6a21dd66c0ef5a340be66569bbc276e7a.scope: Deactivated successfully. 
Dec 15 04:51:03 localhost nova_compute[286344]: 2025-12-15 09:51:03.269 286348 DEBUG oslo_service.periodic_task [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Dec 15 04:51:03 localhost nova_compute[286344]: 2025-12-15 09:51:03.271 286348 DEBUG nova.compute.manager [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m Dec 15 04:51:03 localhost nova_compute[286344]: 2025-12-15 09:51:03.271 286348 DEBUG nova.compute.manager [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m Dec 15 04:51:03 localhost ceph-mon[293875]: Reconfiguring osd.0 (monmap changed)... Dec 15 04:51:03 localhost ceph-mon[293875]: Reconfiguring daemon osd.0 on np0005559462.localdomain Dec 15 04:51:03 localhost ceph-mon[293875]: from='mgr.24103 ' entity='mgr.np0005559461.egwgzn' Dec 15 04:51:03 localhost ceph-mon[293875]: from='mgr.24103 ' entity='mgr.np0005559461.egwgzn' Dec 15 04:51:03 localhost ceph-mon[293875]: from='mgr.24103 172.18.0.105:0/2912776587' entity='mgr.np0005559461.egwgzn' cmd={"prefix": "auth get", "entity": "osd.3"} : dispatch Dec 15 04:51:03 localhost podman[296014]: Dec 15 04:51:03 localhost systemd[1]: var-lib-containers-storage-overlay-d8b3b546d956a0e26dbc5a4263fabd8ddb4d9a3ab2cf42bab3ab115a4691d3af-merged.mount: Deactivated successfully. 
Dec 15 04:51:03 localhost podman[296014]: 2025-12-15 09:51:03.820825202 +0000 UTC m=+0.077299207 container create 349ca66b1b561ad893fb37673397218e9f9c35cd0fdf8ec3160d7fdb9269d5e5 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=vigilant_mirzakhani, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_REPO=https://github.com/ceph/ceph-container.git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., io.openshift.tags=rhceph ceph, io.buildah.version=1.41.4, url=https://catalog.redhat.com/en/search?searchType=containers, architecture=x86_64, RELEASE=main, GIT_BRANCH=main, ceph=True, distribution-scope=public, version=7, CEPH_POINT_RELEASE=, vcs-type=git, release=1763362218, maintainer=Guillaume Abrioux , name=rhceph, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., description=Red Hat Ceph Storage 7, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, GIT_CLEAN=True, build-date=2025-11-26T19:44:28Z, io.k8s.description=Red Hat Ceph Storage 7, io.openshift.expose-services=, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, com.redhat.component=rhceph-container, cpe=cpe:/a:redhat:enterprise_linux:9::appstream) Dec 15 04:51:03 localhost systemd[1]: Started libpod-conmon-349ca66b1b561ad893fb37673397218e9f9c35cd0fdf8ec3160d7fdb9269d5e5.scope. Dec 15 04:51:03 localhost systemd[1]: Started libcrun container. 
Dec 15 04:51:03 localhost podman[296014]: 2025-12-15 09:51:03.789316643 +0000 UTC m=+0.045790718 image pull registry.redhat.io/rhceph/rhceph-7-rhel9:latest Dec 15 04:51:03 localhost podman[296014]: 2025-12-15 09:51:03.894016672 +0000 UTC m=+0.150490727 container init 349ca66b1b561ad893fb37673397218e9f9c35cd0fdf8ec3160d7fdb9269d5e5 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=vigilant_mirzakhani, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, com.redhat.component=rhceph-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, version=7, release=1763362218, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, name=rhceph, url=https://catalog.redhat.com/en/search?searchType=containers, GIT_REPO=https://github.com/ceph/ceph-container.git, ceph=True, io.openshift.expose-services=, architecture=x86_64, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, description=Red Hat Ceph Storage 7, vcs-type=git, vendor=Red Hat, Inc., CEPH_POINT_RELEASE=, maintainer=Guillaume Abrioux , GIT_BRANCH=main, build-date=2025-11-26T19:44:28Z, io.k8s.description=Red Hat Ceph Storage 7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.buildah.version=1.41.4, GIT_CLEAN=True, io.openshift.tags=rhceph ceph, RELEASE=main, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0) Dec 15 04:51:03 localhost podman[296014]: 2025-12-15 09:51:03.905184823 +0000 UTC m=+0.161658848 container start 349ca66b1b561ad893fb37673397218e9f9c35cd0fdf8ec3160d7fdb9269d5e5 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=vigilant_mirzakhani, maintainer=Guillaume Abrioux , GIT_CLEAN=True, version=7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 
9 in a fully featured and supported base image., cpe=cpe:/a:redhat:enterprise_linux:9::appstream, name=rhceph, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.k8s.description=Red Hat Ceph Storage 7, io.buildah.version=1.41.4, CEPH_POINT_RELEASE=, url=https://catalog.redhat.com/en/search?searchType=containers, GIT_REPO=https://github.com/ceph/ceph-container.git, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, ceph=True, description=Red Hat Ceph Storage 7, vcs-type=git, vendor=Red Hat, Inc., vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, build-date=2025-11-26T19:44:28Z, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, io.openshift.tags=rhceph ceph, RELEASE=main, GIT_BRANCH=main, com.redhat.component=rhceph-container, architecture=x86_64, io.openshift.expose-services=, release=1763362218) Dec 15 04:51:03 localhost podman[296014]: 2025-12-15 09:51:03.905524913 +0000 UTC m=+0.161998988 container attach 349ca66b1b561ad893fb37673397218e9f9c35cd0fdf8ec3160d7fdb9269d5e5 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=vigilant_mirzakhani, com.redhat.component=rhceph-container, GIT_REPO=https://github.com/ceph/ceph-container.git, GIT_CLEAN=True, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, io.openshift.expose-services=, release=1763362218, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, ceph=True, name=rhceph, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vendor=Red Hat, Inc., build-date=2025-11-26T19:44:28Z, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, version=7, io.openshift.tags=rhceph ceph, io.k8s.description=Red Hat Ceph Storage 7, architecture=x86_64, io.buildah.version=1.41.4, 
url=https://catalog.redhat.com/en/search?searchType=containers, maintainer=Guillaume Abrioux , GIT_BRANCH=main, distribution-scope=public, description=Red Hat Ceph Storage 7, RELEASE=main, CEPH_POINT_RELEASE=) Dec 15 04:51:03 localhost vigilant_mirzakhani[296030]: 167 167 Dec 15 04:51:03 localhost systemd[1]: libpod-349ca66b1b561ad893fb37673397218e9f9c35cd0fdf8ec3160d7fdb9269d5e5.scope: Deactivated successfully. Dec 15 04:51:03 localhost podman[296014]: 2025-12-15 09:51:03.908405783 +0000 UTC m=+0.164879838 container died 349ca66b1b561ad893fb37673397218e9f9c35cd0fdf8ec3160d7fdb9269d5e5 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=vigilant_mirzakhani, io.k8s.description=Red Hat Ceph Storage 7, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., name=rhceph, build-date=2025-11-26T19:44:28Z, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.openshift.expose-services=, CEPH_POINT_RELEASE=, GIT_REPO=https://github.com/ceph/ceph-container.git, distribution-scope=public, io.buildah.version=1.41.4, url=https://catalog.redhat.com/en/search?searchType=containers, maintainer=Guillaume Abrioux , GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, version=7, GIT_CLEAN=True, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., com.redhat.component=rhceph-container, release=1763362218, io.openshift.tags=rhceph ceph, ceph=True, vcs-type=git, description=Red Hat Ceph Storage 7, RELEASE=main, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, GIT_BRANCH=main) Dec 15 04:51:03 localhost nova_compute[286344]: 2025-12-15 09:51:03.910 286348 DEBUG oslo_concurrency.lockutils [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Acquiring lock "refresh_cache-39ff1bd9-6f6b-44c8-bbec-a1fd9d196359" 
lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m Dec 15 04:51:03 localhost nova_compute[286344]: 2025-12-15 09:51:03.912 286348 DEBUG oslo_concurrency.lockutils [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Acquired lock "refresh_cache-39ff1bd9-6f6b-44c8-bbec-a1fd9d196359" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m Dec 15 04:51:03 localhost nova_compute[286344]: 2025-12-15 09:51:03.912 286348 DEBUG nova.network.neutron [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] [instance: 39ff1bd9-6f6b-44c8-bbec-a1fd9d196359] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m Dec 15 04:51:03 localhost nova_compute[286344]: 2025-12-15 09:51:03.913 286348 DEBUG nova.objects.instance [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Lazy-loading 'info_cache' on Instance uuid 39ff1bd9-6f6b-44c8-bbec-a1fd9d196359 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m Dec 15 04:51:04 localhost podman[296035]: 2025-12-15 09:51:04.007030702 +0000 UTC m=+0.089297000 container remove 349ca66b1b561ad893fb37673397218e9f9c35cd0fdf8ec3160d7fdb9269d5e5 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=vigilant_mirzakhani, release=1763362218, io.openshift.tags=rhceph ceph, ceph=True, vendor=Red Hat, Inc., CEPH_POINT_RELEASE=, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.openshift.expose-services=, vcs-type=git, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, GIT_REPO=https://github.com/ceph/ceph-container.git, name=rhceph, RELEASE=main, architecture=x86_64, com.redhat.component=rhceph-container, url=https://catalog.redhat.com/en/search?searchType=containers, description=Red Hat Ceph Storage 7, GIT_BRANCH=main, GIT_CLEAN=True, io.k8s.description=Red Hat Ceph Storage 7, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, 
io.buildah.version=1.41.4, maintainer=Guillaume Abrioux , summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., distribution-scope=public, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, build-date=2025-11-26T19:44:28Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0) Dec 15 04:51:04 localhost systemd[1]: libpod-conmon-349ca66b1b561ad893fb37673397218e9f9c35cd0fdf8ec3160d7fdb9269d5e5.scope: Deactivated successfully. Dec 15 04:51:04 localhost nova_compute[286344]: 2025-12-15 09:51:04.302 286348 DEBUG nova.network.neutron [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] [instance: 39ff1bd9-6f6b-44c8-bbec-a1fd9d196359] Updating instance_info_cache with network_info: [{"id": "03ef8889-3216-43fb-8a52-4be17a956ce1", "address": "fa:16:3e:74:df:7c", "network": {"id": "befb7a72-17a9-4bcb-b561-84b8f626685a", "bridge": "br-int", "label": "private", "subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.201", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.20", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.0.3"}}], "meta": {"injected": false, "tenant_id": "c785bf23f53946bc99867d8832a50266", "mtu": 1292, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap03ef8889-32", "ovs_interfaceid": "03ef8889-3216-43fb-8a52-4be17a956ce1", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info 
/usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m Dec 15 04:51:04 localhost nova_compute[286344]: 2025-12-15 09:51:04.321 286348 DEBUG oslo_concurrency.lockutils [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Releasing lock "refresh_cache-39ff1bd9-6f6b-44c8-bbec-a1fd9d196359" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m Dec 15 04:51:04 localhost nova_compute[286344]: 2025-12-15 09:51:04.322 286348 DEBUG nova.compute.manager [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] [instance: 39ff1bd9-6f6b-44c8-bbec-a1fd9d196359] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m Dec 15 04:51:04 localhost nova_compute[286344]: 2025-12-15 09:51:04.323 286348 DEBUG oslo_service.periodic_task [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Dec 15 04:51:04 localhost nova_compute[286344]: 2025-12-15 09:51:04.323 286348 DEBUG oslo_service.periodic_task [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Dec 15 04:51:04 localhost nova_compute[286344]: 2025-12-15 09:51:04.324 286348 DEBUG nova.compute.manager [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m Dec 15 04:51:04 localhost ceph-mon[293875]: Reconfiguring osd.3 (monmap changed)... 
Dec 15 04:51:04 localhost ceph-mon[293875]: Reconfiguring daemon osd.3 on np0005559462.localdomain Dec 15 04:51:04 localhost ceph-mon[293875]: from='mgr.24103 ' entity='mgr.np0005559461.egwgzn' Dec 15 04:51:04 localhost ceph-mon[293875]: from='mgr.24103 ' entity='mgr.np0005559461.egwgzn' Dec 15 04:51:04 localhost ceph-mon[293875]: from='mgr.24103 172.18.0.105:0/2912776587' entity='mgr.np0005559461.egwgzn' cmd={"prefix": "auth get-or-create", "entity": "mds.mds.np0005559462.mhigvc", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]} : dispatch Dec 15 04:51:04 localhost ceph-mon[293875]: from='mgr.24103 ' entity='mgr.np0005559461.egwgzn' cmd={"prefix": "auth get-or-create", "entity": "mds.mds.np0005559462.mhigvc", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]} : dispatch Dec 15 04:51:04 localhost systemd[1]: var-lib-containers-storage-overlay-e9097f16148f03eb21369b9e9a3dc15158609c5d3747691d4a52bfc0c93e8a5d-merged.mount: Deactivated successfully. 
Dec 15 04:51:04 localhost openstack_network_exporter[246484]: ERROR 09:51:04 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server Dec 15 04:51:04 localhost openstack_network_exporter[246484]: ERROR 09:51:04 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd Dec 15 04:51:04 localhost openstack_network_exporter[246484]: ERROR 09:51:04 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath Dec 15 04:51:04 localhost openstack_network_exporter[246484]: Dec 15 04:51:04 localhost openstack_network_exporter[246484]: ERROR 09:51:04 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd Dec 15 04:51:04 localhost openstack_network_exporter[246484]: ERROR 09:51:04 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath Dec 15 04:51:04 localhost openstack_network_exporter[246484]: Dec 15 04:51:04 localhost podman[296111]: Dec 15 04:51:04 localhost podman[296111]: 2025-12-15 09:51:04.917377602 +0000 UTC m=+0.098881467 container create 8e6388df43dde04aa15f3c46a84b8566db1475272c171ca319730d4cad0bb9c6 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=optimistic_keldysh, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.buildah.version=1.41.4, io.openshift.expose-services=, version=7, distribution-scope=public, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.tags=rhceph ceph, GIT_BRANCH=main, build-date=2025-11-26T19:44:28Z, RELEASE=main, description=Red Hat Ceph Storage 7, maintainer=Guillaume Abrioux , CEPH_POINT_RELEASE=, ceph=True, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., architecture=x86_64, vcs-type=git, GIT_CLEAN=True, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, io.k8s.description=Red Hat Ceph Storage 7, com.redhat.component=rhceph-container, release=1763362218, name=rhceph, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, GIT_REPO=https://github.com/ceph/ceph-container.git) Dec 15 04:51:04 localhost systemd[1]: Started libpod-conmon-8e6388df43dde04aa15f3c46a84b8566db1475272c171ca319730d4cad0bb9c6.scope. Dec 15 04:51:04 localhost systemd[1]: Started libcrun container. Dec 15 04:51:04 localhost podman[296111]: 2025-12-15 09:51:04.987116596 +0000 UTC m=+0.168620471 container init 8e6388df43dde04aa15f3c46a84b8566db1475272c171ca319730d4cad0bb9c6 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=optimistic_keldysh, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, GIT_REPO=https://github.com/ceph/ceph-container.git, url=https://catalog.redhat.com/en/search?searchType=containers, name=rhceph, CEPH_POINT_RELEASE=, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, ceph=True, io.openshift.tags=rhceph ceph, release=1763362218, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, com.redhat.component=rhceph-container, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., version=7, io.k8s.description=Red Hat Ceph Storage 7, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, vendor=Red Hat, Inc., vcs-type=git, build-date=2025-11-26T19:44:28Z, GIT_BRANCH=main, RELEASE=main, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, description=Red Hat Ceph Storage 7, GIT_CLEAN=True, architecture=x86_64, maintainer=Guillaume Abrioux ) Dec 15 04:51:04 
localhost podman[296111]: 2025-12-15 09:51:04.889746912 +0000 UTC m=+0.071250807 image pull registry.redhat.io/rhceph/rhceph-7-rhel9:latest Dec 15 04:51:04 localhost podman[296111]: 2025-12-15 09:51:04.997468305 +0000 UTC m=+0.178972180 container start 8e6388df43dde04aa15f3c46a84b8566db1475272c171ca319730d4cad0bb9c6 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=optimistic_keldysh, distribution-scope=public, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.buildah.version=1.41.4, version=7, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, RELEASE=main, io.k8s.description=Red Hat Ceph Storage 7, release=1763362218, name=rhceph, GIT_CLEAN=True, architecture=x86_64, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, maintainer=Guillaume Abrioux , CEPH_POINT_RELEASE=, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.openshift.expose-services=, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., build-date=2025-11-26T19:44:28Z, io.openshift.tags=rhceph ceph, com.redhat.component=rhceph-container, vendor=Red Hat, Inc., GIT_REPO=https://github.com/ceph/ceph-container.git, vcs-type=git, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, GIT_BRANCH=main, description=Red Hat Ceph Storage 7, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, ceph=True) Dec 15 04:51:04 localhost podman[296111]: 2025-12-15 09:51:04.997735953 +0000 UTC m=+0.179239878 container attach 8e6388df43dde04aa15f3c46a84b8566db1475272c171ca319730d4cad0bb9c6 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=optimistic_keldysh, name=rhceph, RELEASE=main, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, distribution-scope=public, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, vcs-type=git, ceph=True, 
architecture=x86_64, build-date=2025-11-26T19:44:28Z, maintainer=Guillaume Abrioux , io.buildah.version=1.41.4, version=7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, GIT_BRANCH=main, com.redhat.component=rhceph-container, vendor=Red Hat, Inc., vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, io.k8s.description=Red Hat Ceph Storage 7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., url=https://catalog.redhat.com/en/search?searchType=containers, GIT_REPO=https://github.com/ceph/ceph-container.git, io.openshift.tags=rhceph ceph, CEPH_POINT_RELEASE=, description=Red Hat Ceph Storage 7, io.openshift.expose-services=, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_CLEAN=True, release=1763362218) Dec 15 04:51:04 localhost optimistic_keldysh[296127]: 167 167 Dec 15 04:51:05 localhost systemd[1]: libpod-8e6388df43dde04aa15f3c46a84b8566db1475272c171ca319730d4cad0bb9c6.scope: Deactivated successfully. 
Dec 15 04:51:05 localhost podman[296111]: 2025-12-15 09:51:05.000465169 +0000 UTC m=+0.181969044 container died 8e6388df43dde04aa15f3c46a84b8566db1475272c171ca319730d4cad0bb9c6 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=optimistic_keldysh, io.openshift.expose-services=, io.buildah.version=1.41.4, description=Red Hat Ceph Storage 7, name=rhceph, vcs-type=git, ceph=True, RELEASE=main, distribution-scope=public, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat Ceph Storage 7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, com.redhat.component=rhceph-container, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.openshift.tags=rhceph ceph, vendor=Red Hat, Inc., release=1763362218, GIT_CLEAN=True, GIT_REPO=https://github.com/ceph/ceph-container.git, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, CEPH_POINT_RELEASE=, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, GIT_BRANCH=main, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, url=https://catalog.redhat.com/en/search?searchType=containers, version=7, build-date=2025-11-26T19:44:28Z, maintainer=Guillaume Abrioux ) Dec 15 04:51:05 localhost nova_compute[286344]: 2025-12-15 09:51:05.062 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 04:51:05 localhost nova_compute[286344]: 2025-12-15 09:51:05.078 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 04:51:05 localhost podman[296132]: 2025-12-15 09:51:05.091204718 +0000 UTC m=+0.081734839 container remove 8e6388df43dde04aa15f3c46a84b8566db1475272c171ca319730d4cad0bb9c6 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, 
name=optimistic_keldysh, distribution-scope=public, io.openshift.tags=rhceph ceph, architecture=x86_64, io.k8s.description=Red Hat Ceph Storage 7, url=https://catalog.redhat.com/en/search?searchType=containers, maintainer=Guillaume Abrioux , name=rhceph, vendor=Red Hat, Inc., description=Red Hat Ceph Storage 7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.openshift.expose-services=, ceph=True, release=1763362218, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, RELEASE=main, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, com.redhat.component=rhceph-container, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.buildah.version=1.41.4, CEPH_POINT_RELEASE=, version=7, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, GIT_CLEAN=True, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, vcs-type=git, build-date=2025-11-26T19:44:28Z, GIT_BRANCH=main, GIT_REPO=https://github.com/ceph/ceph-container.git) Dec 15 04:51:05 localhost systemd[1]: libpod-conmon-8e6388df43dde04aa15f3c46a84b8566db1475272c171ca319730d4cad0bb9c6.scope: Deactivated successfully. Dec 15 04:51:05 localhost nova_compute[286344]: 2025-12-15 09:51:05.271 286348 DEBUG oslo_service.periodic_task [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Dec 15 04:51:05 localhost ceph-mon[293875]: Reconfiguring mds.mds.np0005559462.mhigvc (monmap changed)... 
Dec 15 04:51:05 localhost ceph-mon[293875]: Reconfiguring daemon mds.mds.np0005559462.mhigvc on np0005559462.localdomain Dec 15 04:51:05 localhost ceph-mon[293875]: from='mgr.24103 ' entity='mgr.np0005559461.egwgzn' Dec 15 04:51:05 localhost ceph-mon[293875]: from='mgr.24103 ' entity='mgr.np0005559461.egwgzn' Dec 15 04:51:05 localhost ceph-mon[293875]: from='mgr.24103 172.18.0.105:0/2912776587' entity='mgr.np0005559461.egwgzn' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005559462.fudvyx", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch Dec 15 04:51:05 localhost ceph-mon[293875]: from='mgr.24103 ' entity='mgr.np0005559461.egwgzn' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005559462.fudvyx", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch Dec 15 04:51:05 localhost ceph-mgr[292421]: ms_deliver_dispatch: unhandled message 0x55975582b1e0 mon_map magic: 0 from mon.2 v2:172.18.0.104:3300/0 Dec 15 04:51:05 localhost ceph-mgr[292421]: client.0 ms_handle_reset on v2:172.18.0.104:3300/0 Dec 15 04:51:05 localhost ceph-mgr[292421]: client.0 ms_handle_reset on v2:172.18.0.104:3300/0 Dec 15 04:51:05 localhost ceph-mon[293875]: mon.np0005559462@5(peon) e7 my rank is now 4 (was 5) Dec 15 04:51:05 localhost ceph-mgr[292421]: client.0 ms_handle_reset on v2:172.18.0.106:3300/0 Dec 15 04:51:05 localhost ceph-mgr[292421]: client.0 ms_handle_reset on v2:172.18.0.106:3300/0 Dec 15 04:51:05 localhost ceph-mgr[292421]: ms_deliver_dispatch: unhandled message 0x55975582b080 mon_map magic: 0 from mon.3 v2:172.18.0.107:3300/0 Dec 15 04:51:05 localhost ceph-mon[293875]: log_channel(cluster) log [INF] : mon.np0005559462 calling monitor election Dec 15 04:51:05 localhost ceph-mon[293875]: paxos.4).electionLogic(26) init, last seen epoch 26 Dec 15 04:51:05 localhost ceph-mon[293875]: mon.np0005559462@4(electing) e7 collect_metadata vda: no unique device id for vda: fallback method has no model nor serial Dec 15 
04:51:05 localhost ceph-mon[293875]: mon.np0005559462@4(electing) e7 collect_metadata vda: no unique device id for vda: fallback method has no model nor serial Dec 15 04:51:05 localhost podman[296202]: Dec 15 04:51:05 localhost podman[296202]: 2025-12-15 09:51:05.79611201 +0000 UTC m=+0.046189039 container create b94810e10b3cc52e797fe2afeaad3ae4509a6edc304f29cfbf7a3addc7b10aed (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=nervous_einstein, GIT_CLEAN=True, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, url=https://catalog.redhat.com/en/search?searchType=containers, name=rhceph, build-date=2025-11-26T19:44:28Z, GIT_REPO=https://github.com/ceph/ceph-container.git, vcs-type=git, RELEASE=main, maintainer=Guillaume Abrioux , version=7, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhceph ceph, description=Red Hat Ceph Storage 7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, release=1763362218, distribution-scope=public, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, com.redhat.component=rhceph-container, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, ceph=True, vendor=Red Hat, Inc., GIT_BRANCH=main, io.openshift.expose-services=, io.buildah.version=1.41.4, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, io.k8s.description=Red Hat Ceph Storage 7, CEPH_POINT_RELEASE=) Dec 15 04:51:05 localhost systemd[1]: var-lib-containers-storage-overlay-849d6887a9e0315dc99d1d8973b5a1dc8deeb8e17f05ab4e3e5bee0d45ffb4cf-merged.mount: Deactivated successfully. Dec 15 04:51:05 localhost systemd[1]: Started libpod-conmon-b94810e10b3cc52e797fe2afeaad3ae4509a6edc304f29cfbf7a3addc7b10aed.scope. Dec 15 04:51:05 localhost systemd[1]: Started libcrun container. 
Dec 15 04:51:05 localhost podman[296202]: 2025-12-15 09:51:05.863732256 +0000 UTC m=+0.113809305 container init b94810e10b3cc52e797fe2afeaad3ae4509a6edc304f29cfbf7a3addc7b10aed (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=nervous_einstein, GIT_REPO=https://github.com/ceph/ceph-container.git, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, GIT_CLEAN=True, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, com.redhat.component=rhceph-container, GIT_BRANCH=main, distribution-scope=public, maintainer=Guillaume Abrioux , summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., release=1763362218, io.openshift.tags=rhceph ceph, vendor=Red Hat, Inc., RELEASE=main, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.openshift.expose-services=, io.k8s.description=Red Hat Ceph Storage 7, vcs-type=git, name=rhceph, description=Red Hat Ceph Storage 7, architecture=x86_64, build-date=2025-11-26T19:44:28Z, version=7, ceph=True, io.buildah.version=1.41.4, url=https://catalog.redhat.com/en/search?searchType=containers, CEPH_POINT_RELEASE=) Dec 15 04:51:05 localhost podman[296202]: 2025-12-15 09:51:05.872441348 +0000 UTC m=+0.122518397 container start b94810e10b3cc52e797fe2afeaad3ae4509a6edc304f29cfbf7a3addc7b10aed (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=nervous_einstein, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vendor=Red Hat, Inc., GIT_REPO=https://github.com/ceph/ceph-container.git, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, CEPH_POINT_RELEASE=, io.openshift.tags=rhceph ceph, name=rhceph, url=https://catalog.redhat.com/en/search?searchType=containers, 
cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.openshift.expose-services=, io.buildah.version=1.41.4, vcs-type=git, RELEASE=main, maintainer=Guillaume Abrioux , com.redhat.component=rhceph-container, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, architecture=x86_64, GIT_CLEAN=True, build-date=2025-11-26T19:44:28Z, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, version=7, release=1763362218, ceph=True, description=Red Hat Ceph Storage 7, io.k8s.description=Red Hat Ceph Storage 7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, GIT_BRANCH=main, distribution-scope=public) Dec 15 04:51:05 localhost podman[296202]: 2025-12-15 09:51:05.872671595 +0000 UTC m=+0.122748664 container attach b94810e10b3cc52e797fe2afeaad3ae4509a6edc304f29cfbf7a3addc7b10aed (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=nervous_einstein, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, name=rhceph, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_REPO=https://github.com/ceph/ceph-container.git, vcs-type=git, RELEASE=main, description=Red Hat Ceph Storage 7, maintainer=Guillaume Abrioux , com.redhat.component=rhceph-container, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, io.k8s.description=Red Hat Ceph Storage 7, version=7, GIT_BRANCH=main, ceph=True, io.openshift.expose-services=, build-date=2025-11-26T19:44:28Z, architecture=x86_64, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, url=https://catalog.redhat.com/en/search?searchType=containers, GIT_CLEAN=True, CEPH_POINT_RELEASE=, release=1763362218, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhceph ceph, vendor=Red Hat, 
Inc., distribution-scope=public, io.buildah.version=1.41.4) Dec 15 04:51:05 localhost podman[296202]: 2025-12-15 09:51:05.774892769 +0000 UTC m=+0.024969828 image pull registry.redhat.io/rhceph/rhceph-7-rhel9:latest Dec 15 04:51:05 localhost nervous_einstein[296218]: 167 167 Dec 15 04:51:05 localhost systemd[1]: libpod-b94810e10b3cc52e797fe2afeaad3ae4509a6edc304f29cfbf7a3addc7b10aed.scope: Deactivated successfully. Dec 15 04:51:05 localhost podman[296202]: 2025-12-15 09:51:05.876959285 +0000 UTC m=+0.127036344 container died b94810e10b3cc52e797fe2afeaad3ae4509a6edc304f29cfbf7a3addc7b10aed (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=nervous_einstein, vcs-type=git, io.k8s.description=Red Hat Ceph Storage 7, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, name=rhceph, GIT_REPO=https://github.com/ceph/ceph-container.git, RELEASE=main, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, architecture=x86_64, io.openshift.expose-services=, url=https://catalog.redhat.com/en/search?searchType=containers, GIT_CLEAN=True, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., release=1763362218, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, GIT_BRANCH=main, io.openshift.tags=rhceph ceph, io.buildah.version=1.41.4, version=7, ceph=True, build-date=2025-11-26T19:44:28Z, CEPH_POINT_RELEASE=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, maintainer=Guillaume Abrioux , vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, distribution-scope=public, description=Red Hat Ceph Storage 7, vendor=Red Hat, Inc., com.redhat.component=rhceph-container) Dec 15 04:51:05 localhost ceph-mon[293875]: mon.np0005559462@4(electing) e7 collect_metadata vda: no unique device id for vda: fallback method has no model nor serial Dec 15 04:51:05 localhost podman[296223]: 2025-12-15 09:51:05.980908523 
+0000 UTC m=+0.089005673 container remove b94810e10b3cc52e797fe2afeaad3ae4509a6edc304f29cfbf7a3addc7b10aed (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=nervous_einstein, architecture=x86_64, vcs-type=git, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., url=https://catalog.redhat.com/en/search?searchType=containers, version=7, vendor=Red Hat, Inc., GIT_REPO=https://github.com/ceph/ceph-container.git, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, build-date=2025-11-26T19:44:28Z, maintainer=Guillaume Abrioux , com.redhat.component=rhceph-container, io.openshift.expose-services=, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, distribution-scope=public, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, ceph=True, release=1763362218, io.openshift.tags=rhceph ceph, description=Red Hat Ceph Storage 7, io.k8s.description=Red Hat Ceph Storage 7, name=rhceph, GIT_BRANCH=main, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, RELEASE=main, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, io.buildah.version=1.41.4, GIT_CLEAN=True, CEPH_POINT_RELEASE=) Dec 15 04:51:05 localhost systemd[1]: libpod-conmon-b94810e10b3cc52e797fe2afeaad3ae4509a6edc304f29cfbf7a3addc7b10aed.scope: Deactivated successfully. Dec 15 04:51:06 localhost systemd[1]: Started /usr/bin/podman healthcheck run a4e94bd7aaee6c174c601060fb5f34559019ccea93fc397263ee3735731cfc1e. 
Dec 15 04:51:06 localhost podman[296239]: 2025-12-15 09:51:06.76349478 +0000 UTC m=+0.090385231 container health_status a4e94bd7aaee6c174c601060fb5f34559019ccea93fc397263ee3735731cfc1e (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, container_name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter) Dec 15 04:51:06 localhost podman[296239]: 2025-12-15 09:51:06.771900794 +0000 UTC m=+0.098791235 container exec_died a4e94bd7aaee6c174c601060fb5f34559019ccea93fc397263ee3735731cfc1e (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', 
'/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible) Dec 15 04:51:06 localhost systemd[1]: a4e94bd7aaee6c174c601060fb5f34559019ccea93fc397263ee3735731cfc1e.service: Deactivated successfully. Dec 15 04:51:06 localhost systemd[1]: var-lib-containers-storage-overlay-d3da53344c895e7edef5349b370b4122f7abc46c4882be5ee11166f2f0b29b51-merged.mount: Deactivated successfully. Dec 15 04:51:10 localhost nova_compute[286344]: 2025-12-15 09:51:10.083 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Dec 15 04:51:10 localhost nova_compute[286344]: 2025-12-15 09:51:10.085 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Dec 15 04:51:10 localhost nova_compute[286344]: 2025-12-15 09:51:10.085 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5006 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m Dec 15 04:51:10 localhost nova_compute[286344]: 2025-12-15 09:51:10.085 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Dec 15 04:51:10 localhost nova_compute[286344]: 2025-12-15 09:51:10.086 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Dec 15 04:51:10 localhost nova_compute[286344]: 2025-12-15 09:51:10.089 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 04:51:10 localhost ceph-mds[291134]: mds.beacon.mds.np0005559462.mhigvc missed beacon ack from the monitors Dec 15 04:51:10 localhost ceph-mon[293875]: 
mon.np0005559462@4(electing) e7 collect_metadata vda: no unique device id for vda: fallback method has no model nor serial Dec 15 04:51:10 localhost ceph-mon[293875]: mon.np0005559462@4(electing) e7 collect_metadata vda: no unique device id for vda: fallback method has no model nor serial Dec 15 04:51:10 localhost ceph-mon[293875]: Reconfiguring daemon mgr.np0005559462.fudvyx on np0005559462.localdomain Dec 15 04:51:10 localhost ceph-mon[293875]: Remove daemons mon.np0005559459 Dec 15 04:51:10 localhost ceph-mon[293875]: Safe to remove mon.np0005559459: new quorum should be ['np0005559461', 'np0005559460', 'np0005559464', 'np0005559463', 'np0005559462'] (from ['np0005559461', 'np0005559460', 'np0005559464', 'np0005559463', 'np0005559462']) Dec 15 04:51:10 localhost ceph-mon[293875]: Removing monitor np0005559459 from monmap... Dec 15 04:51:10 localhost ceph-mon[293875]: from='mgr.24103 172.18.0.105:0/2912776587' entity='mgr.np0005559461.egwgzn' cmd={"prefix": "mon rm", "name": "np0005559459"} : dispatch Dec 15 04:51:10 localhost ceph-mon[293875]: Removing daemon mon.np0005559459 from np0005559459.localdomain -- ports [] Dec 15 04:51:10 localhost ceph-mon[293875]: mon.np0005559460 calling monitor election Dec 15 04:51:10 localhost ceph-mon[293875]: mon.np0005559461 calling monitor election Dec 15 04:51:10 localhost ceph-mon[293875]: mon.np0005559463 calling monitor election Dec 15 04:51:10 localhost ceph-mon[293875]: mon.np0005559461 is new leader, mons np0005559461,np0005559460,np0005559463 in quorum (ranks 0,1,3) Dec 15 04:51:10 localhost ceph-mon[293875]: Health check failed: 2/5 mons down, quorum np0005559461,np0005559460,np0005559463 (MON_DOWN) Dec 15 04:51:10 localhost ceph-mon[293875]: Health detail: HEALTH_WARN 2/5 mons down, quorum np0005559461,np0005559460,np0005559463 Dec 15 04:51:10 localhost ceph-mon[293875]: [WRN] MON_DOWN: 2/5 mons down, quorum np0005559461,np0005559460,np0005559463 Dec 15 04:51:10 localhost ceph-mon[293875]: mon.np0005559464 (rank 2) 
addr [v2:172.18.0.108:3300/0,v1:172.18.0.108:6789/0] is down (out of quorum) Dec 15 04:51:10 localhost ceph-mon[293875]: mon.np0005559462 (rank 4) addr [v2:172.18.0.106:3300/0,v1:172.18.0.106:6789/0] is down (out of quorum) Dec 15 04:51:10 localhost ceph-mon[293875]: from='mgr.24103 ' entity='mgr.np0005559461.egwgzn' Dec 15 04:51:10 localhost ceph-mon[293875]: mon.np0005559462@4(peon) e7 collect_metadata vda: no unique device id for vda: fallback method has no model nor serial Dec 15 04:51:11 localhost podman[296315]: Dec 15 04:51:11 localhost podman[296315]: 2025-12-15 09:51:11.445612373 +0000 UTC m=+0.074445317 container create abdfc3533566a658557c97b9a0998528ca28c2f07b099255057ed2cd18cd1d80 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=beautiful_swartz, description=Red Hat Ceph Storage 7, io.k8s.description=Red Hat Ceph Storage 7, distribution-scope=public, release=1763362218, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_REPO=https://github.com/ceph/ceph-container.git, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.buildah.version=1.41.4, io.openshift.expose-services=, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, version=7, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, build-date=2025-11-26T19:44:28Z, name=rhceph, GIT_BRANCH=main, GIT_CLEAN=True, RELEASE=main, maintainer=Guillaume Abrioux , summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vendor=Red Hat, Inc., architecture=x86_64, ceph=True, com.redhat.component=rhceph-container, io.openshift.tags=rhceph ceph, CEPH_POINT_RELEASE=) Dec 15 04:51:11 localhost systemd[1]: Started libpod-conmon-abdfc3533566a658557c97b9a0998528ca28c2f07b099255057ed2cd18cd1d80.scope. 
Dec 15 04:51:11 localhost systemd[1]: Started libcrun container. Dec 15 04:51:11 localhost podman[296315]: 2025-12-15 09:51:11.415879634 +0000 UTC m=+0.044712598 image pull registry.redhat.io/rhceph/rhceph-7-rhel9:latest Dec 15 04:51:11 localhost podman[296315]: 2025-12-15 09:51:11.51832296 +0000 UTC m=+0.147155894 container init abdfc3533566a658557c97b9a0998528ca28c2f07b099255057ed2cd18cd1d80 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=beautiful_swartz, GIT_CLEAN=True, distribution-scope=public, GIT_REPO=https://github.com/ceph/ceph-container.git, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, vendor=Red Hat, Inc., architecture=x86_64, vcs-type=git, io.k8s.description=Red Hat Ceph Storage 7, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, name=rhceph, CEPH_POINT_RELEASE=, release=1763362218, version=7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=Guillaume Abrioux , RELEASE=main, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.openshift.expose-services=, ceph=True, io.openshift.tags=rhceph ceph, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, build-date=2025-11-26T19:44:28Z, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.component=rhceph-container, description=Red Hat Ceph Storage 7, GIT_BRANCH=main, io.buildah.version=1.41.4) Dec 15 04:51:11 localhost systemd[1]: tmp-crun.KPiS9G.mount: Deactivated successfully. 
Dec 15 04:51:11 localhost podman[296315]: 2025-12-15 09:51:11.529944404 +0000 UTC m=+0.158777348 container start abdfc3533566a658557c97b9a0998528ca28c2f07b099255057ed2cd18cd1d80 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=beautiful_swartz, vendor=Red Hat, Inc., io.buildah.version=1.41.4, description=Red Hat Ceph Storage 7, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, com.redhat.component=rhceph-container, io.openshift.expose-services=, io.openshift.tags=rhceph ceph, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, GIT_CLEAN=True, build-date=2025-11-26T19:44:28Z, vcs-type=git, ceph=True, GIT_BRANCH=main, distribution-scope=public, CEPH_POINT_RELEASE=, version=7, io.k8s.description=Red Hat Ceph Storage 7, architecture=x86_64, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., name=rhceph, release=1763362218, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_REPO=https://github.com/ceph/ceph-container.git, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, url=https://catalog.redhat.com/en/search?searchType=containers, maintainer=Guillaume Abrioux , RELEASE=main) Dec 15 04:51:11 localhost podman[296315]: 2025-12-15 09:51:11.531061865 +0000 UTC m=+0.159894869 container attach abdfc3533566a658557c97b9a0998528ca28c2f07b099255057ed2cd18cd1d80 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=beautiful_swartz, CEPH_POINT_RELEASE=, architecture=x86_64, vcs-type=git, com.redhat.component=rhceph-container, io.openshift.tags=rhceph ceph, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, io.k8s.description=Red Hat Ceph Storage 7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, summary=Provides the latest Red Hat Ceph Storage 7 on 
RHEL 9 in a fully featured and supported base image., org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, release=1763362218, description=Red Hat Ceph Storage 7, io.openshift.expose-services=, name=rhceph, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, GIT_REPO=https://github.com/ceph/ceph-container.git, RELEASE=main, maintainer=Guillaume Abrioux , url=https://catalog.redhat.com/en/search?searchType=containers, ceph=True, GIT_CLEAN=True, version=7, build-date=2025-11-26T19:44:28Z, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, vendor=Red Hat, Inc., GIT_BRANCH=main, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, distribution-scope=public) Dec 15 04:51:11 localhost beautiful_swartz[296330]: 167 167 Dec 15 04:51:11 localhost systemd[1]: libpod-abdfc3533566a658557c97b9a0998528ca28c2f07b099255057ed2cd18cd1d80.scope: Deactivated successfully. Dec 15 04:51:11 localhost podman[296315]: 2025-12-15 09:51:11.535754606 +0000 UTC m=+0.164587550 container died abdfc3533566a658557c97b9a0998528ca28c2f07b099255057ed2cd18cd1d80 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=beautiful_swartz, url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.tags=rhceph ceph, RELEASE=main, CEPH_POINT_RELEASE=, io.buildah.version=1.41.4, name=rhceph, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., cpe=cpe:/a:redhat:enterprise_linux:9::appstream, architecture=x86_64, description=Red Hat Ceph Storage 7, GIT_CLEAN=True, distribution-scope=public, vendor=Red Hat, Inc., com.redhat.component=rhceph-container, io.openshift.expose-services=, GIT_BRANCH=main, build-date=2025-11-26T19:44:28Z, version=7, vcs-type=git, ceph=True, release=1763362218, maintainer=Guillaume Abrioux , GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, 
org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, io.k8s.description=Red Hat Ceph Storage 7, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_REPO=https://github.com/ceph/ceph-container.git) Dec 15 04:51:11 localhost ceph-mon[293875]: mon.np0005559462@4(peon).osd e84 _set_new_cache_sizes cache_size:1020054729 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Dec 15 04:51:11 localhost podman[296335]: 2025-12-15 09:51:11.633734857 +0000 UTC m=+0.083282023 container remove abdfc3533566a658557c97b9a0998528ca28c2f07b099255057ed2cd18cd1d80 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=beautiful_swartz, io.k8s.description=Red Hat Ceph Storage 7, vendor=Red Hat, Inc., GIT_REPO=https://github.com/ceph/ceph-container.git, description=Red Hat Ceph Storage 7, vcs-type=git, build-date=2025-11-26T19:44:28Z, GIT_BRANCH=main, io.buildah.version=1.41.4, io.openshift.tags=rhceph ceph, GIT_CLEAN=True, com.redhat.component=rhceph-container, RELEASE=main, maintainer=Guillaume Abrioux , release=1763362218, name=rhceph, url=https://catalog.redhat.com/en/search?searchType=containers, ceph=True, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, architecture=x86_64, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, CEPH_POINT_RELEASE=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, io.openshift.expose-services=, version=7, distribution-scope=public) Dec 15 04:51:11 localhost systemd[1]: libpod-conmon-abdfc3533566a658557c97b9a0998528ca28c2f07b099255057ed2cd18cd1d80.scope: Deactivated successfully. 
Dec 15 04:51:11 localhost ceph-mon[293875]: mon.np0005559462 calling monitor election Dec 15 04:51:11 localhost ceph-mon[293875]: mon.np0005559464 calling monitor election Dec 15 04:51:11 localhost ceph-mon[293875]: mon.np0005559461 calling monitor election Dec 15 04:51:11 localhost ceph-mon[293875]: mon.np0005559460 calling monitor election Dec 15 04:51:11 localhost ceph-mon[293875]: mon.np0005559461 is new leader, mons np0005559461,np0005559460,np0005559464,np0005559463,np0005559462 in quorum (ranks 0,1,2,3,4) Dec 15 04:51:11 localhost ceph-mon[293875]: Health check cleared: MON_DOWN (was: 2/5 mons down, quorum np0005559461,np0005559460,np0005559463) Dec 15 04:51:11 localhost ceph-mon[293875]: Cluster is now healthy Dec 15 04:51:11 localhost ceph-mon[293875]: overall HEALTH_OK Dec 15 04:51:11 localhost ceph-mon[293875]: from='mgr.24103 ' entity='mgr.np0005559461.egwgzn' Dec 15 04:51:11 localhost ceph-mon[293875]: from='mgr.24103 ' entity='mgr.np0005559461.egwgzn' Dec 15 04:51:11 localhost ceph-mon[293875]: from='mgr.24103 172.18.0.105:0/2912776587' entity='mgr.np0005559461.egwgzn' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005559463.localdomain", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch Dec 15 04:51:11 localhost ceph-mon[293875]: from='mgr.24103 ' entity='mgr.np0005559461.egwgzn' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005559463.localdomain", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch Dec 15 04:51:12 localhost systemd[1]: var-lib-containers-storage-overlay-51af433f3881cf2f88565f903dee195142f5b1244a9319e05a993ae254fdb184-merged.mount: Deactivated successfully. Dec 15 04:51:13 localhost ceph-mon[293875]: Reconfiguring mon.np0005559462 (monmap changed)... Dec 15 04:51:13 localhost ceph-mon[293875]: Reconfiguring daemon mon.np0005559462 on np0005559462.localdomain Dec 15 04:51:13 localhost ceph-mon[293875]: Reconfiguring crash.np0005559463 (monmap changed)... 
Dec 15 04:51:13 localhost ceph-mon[293875]: Reconfiguring daemon crash.np0005559463 on np0005559463.localdomain Dec 15 04:51:13 localhost ceph-mon[293875]: from='mgr.24103 ' entity='mgr.np0005559461.egwgzn' Dec 15 04:51:13 localhost ceph-mon[293875]: Removed label mon from host np0005559459.localdomain Dec 15 04:51:13 localhost ceph-mon[293875]: from='mgr.24103 ' entity='mgr.np0005559461.egwgzn' Dec 15 04:51:13 localhost ceph-mon[293875]: from='mgr.24103 ' entity='mgr.np0005559461.egwgzn' Dec 15 04:51:13 localhost ceph-mon[293875]: from='mgr.24103 172.18.0.105:0/2912776587' entity='mgr.np0005559461.egwgzn' cmd={"prefix": "auth get", "entity": "osd.2"} : dispatch Dec 15 04:51:14 localhost ceph-mon[293875]: Reconfiguring osd.2 (monmap changed)... Dec 15 04:51:14 localhost ceph-mon[293875]: Reconfiguring daemon osd.2 on np0005559463.localdomain Dec 15 04:51:14 localhost ceph-mon[293875]: from='mgr.24103 ' entity='mgr.np0005559461.egwgzn' Dec 15 04:51:14 localhost ceph-mon[293875]: from='mgr.24103 ' entity='mgr.np0005559461.egwgzn' Dec 15 04:51:14 localhost ceph-mon[293875]: from='mgr.24103 ' entity='mgr.np0005559461.egwgzn' Dec 15 04:51:14 localhost ceph-mon[293875]: from='mgr.24103 172.18.0.105:0/2912776587' entity='mgr.np0005559461.egwgzn' cmd={"prefix": "auth get", "entity": "osd.5"} : dispatch Dec 15 04:51:15 localhost nova_compute[286344]: 2025-12-15 09:51:15.090 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Dec 15 04:51:15 localhost ceph-mon[293875]: Removed label mgr from host np0005559459.localdomain Dec 15 04:51:15 localhost ceph-mon[293875]: Reconfiguring osd.5 (monmap changed)... 
Dec 15 04:51:15 localhost ceph-mon[293875]: Reconfiguring daemon osd.5 on np0005559463.localdomain Dec 15 04:51:15 localhost ceph-mon[293875]: from='mgr.24103 ' entity='mgr.np0005559461.egwgzn' Dec 15 04:51:15 localhost ceph-mon[293875]: from='mgr.24103 ' entity='mgr.np0005559461.egwgzn' Dec 15 04:51:15 localhost ceph-mon[293875]: from='mgr.24103 172.18.0.105:0/2912776587' entity='mgr.np0005559461.egwgzn' cmd={"prefix": "auth get-or-create", "entity": "mds.mds.np0005559463.rdpgze", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]} : dispatch Dec 15 04:51:15 localhost ceph-mon[293875]: from='mgr.24103 ' entity='mgr.np0005559461.egwgzn' cmd={"prefix": "auth get-or-create", "entity": "mds.mds.np0005559463.rdpgze", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]} : dispatch Dec 15 04:51:15 localhost ceph-mon[293875]: from='mgr.24103 ' entity='mgr.np0005559461.egwgzn' Dec 15 04:51:16 localhost ceph-mon[293875]: Reconfiguring mds.mds.np0005559463.rdpgze (monmap changed)... 
Dec 15 04:51:16 localhost ceph-mon[293875]: Reconfiguring daemon mds.mds.np0005559463.rdpgze on np0005559463.localdomain Dec 15 04:51:16 localhost ceph-mon[293875]: Removed label _admin from host np0005559459.localdomain Dec 15 04:51:16 localhost ceph-mon[293875]: from='mgr.24103 ' entity='mgr.np0005559461.egwgzn' Dec 15 04:51:16 localhost ceph-mon[293875]: from='mgr.24103 ' entity='mgr.np0005559461.egwgzn' Dec 15 04:51:16 localhost ceph-mon[293875]: from='mgr.24103 172.18.0.105:0/2912776587' entity='mgr.np0005559461.egwgzn' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005559463.daptkf", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch Dec 15 04:51:16 localhost ceph-mon[293875]: from='mgr.24103 ' entity='mgr.np0005559461.egwgzn' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005559463.daptkf", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch Dec 15 04:51:16 localhost ceph-mon[293875]: mon.np0005559462@4(peon).osd e84 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Dec 15 04:51:17 localhost ceph-mon[293875]: Reconfiguring mgr.np0005559463.daptkf (monmap changed)... 
Dec 15 04:51:17 localhost ceph-mon[293875]: Reconfiguring daemon mgr.np0005559463.daptkf on np0005559463.localdomain Dec 15 04:51:17 localhost ceph-mon[293875]: from='mgr.24103 ' entity='mgr.np0005559461.egwgzn' Dec 15 04:51:17 localhost ceph-mon[293875]: from='mgr.24103 ' entity='mgr.np0005559461.egwgzn' Dec 15 04:51:17 localhost ceph-mon[293875]: from='mgr.24103 172.18.0.105:0/2912776587' entity='mgr.np0005559461.egwgzn' cmd={"prefix": "auth get", "entity": "mon."} : dispatch Dec 15 04:51:17 localhost ceph-mon[293875]: from='mgr.24103 ' entity='mgr.np0005559461.egwgzn' Dec 15 04:51:17 localhost ceph-mon[293875]: from='mgr.24103 ' entity='mgr.np0005559461.egwgzn' Dec 15 04:51:17 localhost ceph-mon[293875]: from='mgr.24103 172.18.0.105:0/2912776587' entity='mgr.np0005559461.egwgzn' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005559464.localdomain", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch Dec 15 04:51:17 localhost ceph-mon[293875]: from='mgr.24103 ' entity='mgr.np0005559461.egwgzn' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005559464.localdomain", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch Dec 15 04:51:18 localhost ceph-mon[293875]: Reconfiguring mon.np0005559463 (monmap changed)... Dec 15 04:51:18 localhost ceph-mon[293875]: Reconfiguring daemon mon.np0005559463 on np0005559463.localdomain Dec 15 04:51:18 localhost ceph-mon[293875]: Reconfiguring crash.np0005559464 (monmap changed)... Dec 15 04:51:18 localhost ceph-mon[293875]: Reconfiguring daemon crash.np0005559464 on np0005559464.localdomain Dec 15 04:51:18 localhost ceph-mon[293875]: from='mgr.24103 ' entity='mgr.np0005559461.egwgzn' Dec 15 04:51:18 localhost ceph-mon[293875]: from='mgr.24103 ' entity='mgr.np0005559461.egwgzn' Dec 15 04:51:18 localhost ceph-mon[293875]: Reconfiguring osd.1 (monmap changed)... 
Dec 15 04:51:18 localhost ceph-mon[293875]: from='mgr.24103 172.18.0.105:0/2912776587' entity='mgr.np0005559461.egwgzn' cmd={"prefix": "auth get", "entity": "osd.1"} : dispatch Dec 15 04:51:18 localhost ceph-mon[293875]: Reconfiguring daemon osd.1 on np0005559464.localdomain Dec 15 04:51:18 localhost systemd[1]: Started /usr/bin/podman healthcheck run 67ee72fcd99c896f79e8807764b1d0f04e7d45927d06e306c0986394e8ed42a0. Dec 15 04:51:18 localhost systemd[1]: Started /usr/bin/podman healthcheck run 9f0f481c937606298203797a71924b7614a5d9e3f4a8afd837cfa5b9602aedeb. Dec 15 04:51:18 localhost systemd[1]: Started /usr/bin/podman healthcheck run b3d8483ade539500e228c2af9c0c3e647febb5ec7903610a83aea32631007d4a. Dec 15 04:51:18 localhost podman[296351]: 2025-12-15 09:51:18.749392043 +0000 UTC m=+0.079153878 container health_status 67ee72fcd99c896f79e8807764b1d0f04e7d45927d06e306c0986394e8ed42a0 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 
'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}) Dec 15 04:51:18 localhost podman[296351]: 2025-12-15 09:51:18.764485253 +0000 UTC m=+0.094247088 container exec_died 67ee72fcd99c896f79e8807764b1d0f04e7d45927d06e306c0986394e8ed42a0 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter) Dec 15 04:51:18 localhost systemd[1]: 67ee72fcd99c896f79e8807764b1d0f04e7d45927d06e306c0986394e8ed42a0.service: Deactivated successfully. Dec 15 04:51:18 localhost systemd[1]: tmp-crun.O9v2tl.mount: Deactivated successfully. 
Dec 15 04:51:18 localhost systemd[1]: tmp-crun.Hdl3OK.mount: Deactivated successfully.
Dec 15 04:51:18 localhost podman[296352]: 2025-12-15 09:51:18.865637223 +0000 UTC m=+0.191520390 container health_status 9f0f481c937606298203797a71924b7614a5d9e3f4a8afd837cfa5b9602aedeb (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=multipathd, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, config_id=multipathd)
Dec 15 04:51:18 localhost podman[296353]: 2025-12-15 09:51:18.827210322 +0000 UTC m=+0.148553212 container health_status b3d8483ade539500e228c2af9c0c3e647febb5ec7903610a83aea32631007d4a (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, config_id=ceilometer_agent_compute, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '07f3af15d79276418f54a8dde2fc251064527c3571430e14af77aaa2c01f1193-cb1e9478634a00c8fb5ad9cd7fcd38c7b2a1a85187dfaa618bc715c24c2b15fb'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']})
Dec 15 04:51:18 localhost podman[296352]: 2025-12-15 09:51:18.907235493 +0000 UTC m=+0.233118630 container exec_died 9f0f481c937606298203797a71924b7614a5d9e3f4a8afd837cfa5b9602aedeb (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, config_id=multipathd, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, container_name=multipathd, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Dec 15 04:51:18 localhost systemd[1]: 9f0f481c937606298203797a71924b7614a5d9e3f4a8afd837cfa5b9602aedeb.service: Deactivated successfully.
Dec 15 04:51:18 localhost podman[296353]: 2025-12-15 09:51:18.958287896 +0000 UTC m=+0.279630826 container exec_died b3d8483ade539500e228c2af9c0c3e647febb5ec7903610a83aea32631007d4a (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, config_id=ceilometer_agent_compute, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '07f3af15d79276418f54a8dde2fc251064527c3571430e14af77aaa2c01f1193-cb1e9478634a00c8fb5ad9cd7fcd38c7b2a1a85187dfaa618bc715c24c2b15fb'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.schema-version=1.0)
Dec 15 04:51:18 localhost systemd[1]: b3d8483ade539500e228c2af9c0c3e647febb5ec7903610a83aea32631007d4a.service: Deactivated successfully.
Dec 15 04:51:20 localhost nova_compute[286344]: 2025-12-15 09:51:20.095 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Dec 15 04:51:20 localhost nova_compute[286344]: 2025-12-15 09:51:20.097 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Dec 15 04:51:20 localhost nova_compute[286344]: 2025-12-15 09:51:20.098 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m
Dec 15 04:51:20 localhost nova_compute[286344]: 2025-12-15 09:51:20.098 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m
Dec 15 04:51:20 localhost nova_compute[286344]: 2025-12-15 09:51:20.115 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 15 04:51:20 localhost nova_compute[286344]: 2025-12-15 09:51:20.116 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m
Dec 15 04:51:20 localhost ceph-mon[293875]: from='mgr.24103 ' entity='mgr.np0005559461.egwgzn'
Dec 15 04:51:20 localhost ceph-mon[293875]: from='mgr.24103 ' entity='mgr.np0005559461.egwgzn'
Dec 15 04:51:20 localhost ceph-mon[293875]: Reconfiguring osd.4 (monmap changed)...
Dec 15 04:51:20 localhost ceph-mon[293875]: from='mgr.24103 172.18.0.105:0/2912776587' entity='mgr.np0005559461.egwgzn' cmd={"prefix": "auth get", "entity": "osd.4"} : dispatch
Dec 15 04:51:20 localhost ceph-mon[293875]: Reconfiguring daemon osd.4 on np0005559464.localdomain
Dec 15 04:51:21 localhost ceph-mon[293875]: from='mgr.24103 ' entity='mgr.np0005559461.egwgzn'
Dec 15 04:51:21 localhost ceph-mon[293875]: from='mgr.24103 ' entity='mgr.np0005559461.egwgzn'
Dec 15 04:51:21 localhost ceph-mon[293875]: Reconfiguring mds.mds.np0005559464.piyuji (monmap changed)...
Dec 15 04:51:21 localhost ceph-mon[293875]: from='mgr.24103 ' entity='mgr.np0005559461.egwgzn' cmd={"prefix": "auth get-or-create", "entity": "mds.mds.np0005559464.piyuji", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]} : dispatch
Dec 15 04:51:21 localhost ceph-mon[293875]: from='mgr.24103 172.18.0.105:0/2912776587' entity='mgr.np0005559461.egwgzn' cmd={"prefix": "auth get-or-create", "entity": "mds.mds.np0005559464.piyuji", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]} : dispatch
Dec 15 04:51:21 localhost ceph-mon[293875]: Reconfiguring daemon mds.mds.np0005559464.piyuji on np0005559464.localdomain
Dec 15 04:51:21 localhost ceph-mon[293875]: from='mgr.24103 ' entity='mgr.np0005559461.egwgzn'
Dec 15 04:51:21 localhost ceph-mon[293875]: from='mgr.24103 ' entity='mgr.np0005559461.egwgzn'
Dec 15 04:51:21 localhost ceph-mon[293875]: from='mgr.24103 ' entity='mgr.np0005559461.egwgzn' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005559464.aomnqe", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch
Dec 15 04:51:21 localhost ceph-mon[293875]: from='mgr.24103 172.18.0.105:0/2912776587' entity='mgr.np0005559461.egwgzn' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005559464.aomnqe", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch
Dec 15 04:51:21 localhost ceph-mon[293875]: mon.np0005559462@4(peon).osd e84 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 15 04:51:21 localhost systemd[1]: Started /usr/bin/podman healthcheck run 730c50020a7d9e2da21d2462d40af20ac303e43f3491f692cbbd87f432ba3e09.
Dec 15 04:51:21 localhost systemd[1]: Started /usr/bin/podman healthcheck run ac2eae30a2a7f971398af42ea67b3ed799d4b3341f7688ebcd73f7ca7f66c2a8.
Dec 15 04:51:21 localhost podman[296405]: 2025-12-15 09:51:21.737390324 +0000 UTC m=+0.068030347 container health_status ac2eae30a2a7f971398af42ea67b3ed799d4b3341f7688ebcd73f7ca7f66c2a8 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, config_id=ovn_controller, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=ovn_controller, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '07f3af15d79276418f54a8dde2fc251064527c3571430e14af77aaa2c01f1193'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Dec 15 04:51:21 localhost podman[296404]: 2025-12-15 09:51:21.801603895 +0000 UTC m=+0.131092005 container health_status 730c50020a7d9e2da21d2462d40af20ac303e43f3491f692cbbd87f432ba3e09 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.openshift.expose-services=, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., config_id=openstack_network_exporter, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., architecture=x86_64, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., distribution-scope=public, io.openshift.tags=minimal rhel9, build-date=2025-08-20T13:12:41, vendor=Red Hat, Inc., version=9.6, io.buildah.version=1.33.7, container_name=openstack_network_exporter, release=1755695350, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'cb1e9478634a00c8fb5ad9cd7fcd38c7b2a1a85187dfaa618bc715c24c2b15fb'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, maintainer=Red Hat, Inc., name=ubi9-minimal, com.redhat.component=ubi9-minimal-container, managed_by=edpm_ansible)
Dec 15 04:51:21 localhost podman[296405]: 2025-12-15 09:51:21.822783216 +0000 UTC m=+0.153423239 container exec_died ac2eae30a2a7f971398af42ea67b3ed799d4b3341f7688ebcd73f7ca7f66c2a8 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_controller, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=ovn_controller, managed_by=edpm_ansible, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '07f3af15d79276418f54a8dde2fc251064527c3571430e14af77aaa2c01f1193'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, tcib_managed=true, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Dec 15 04:51:21 localhost systemd[1]: ac2eae30a2a7f971398af42ea67b3ed799d4b3341f7688ebcd73f7ca7f66c2a8.service: Deactivated successfully.
Dec 15 04:51:21 localhost podman[296404]: 2025-12-15 09:51:21.83942989 +0000 UTC m=+0.168918020 container exec_died 730c50020a7d9e2da21d2462d40af20ac303e43f3491f692cbbd87f432ba3e09 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., build-date=2025-08-20T13:12:41, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'cb1e9478634a00c8fb5ad9cd7fcd38c7b2a1a85187dfaa618bc715c24c2b15fb'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vendor=Red Hat, Inc., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, com.redhat.component=ubi9-minimal-container, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, name=ubi9-minimal, config_id=openstack_network_exporter, release=1755695350, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, architecture=x86_64, io.buildah.version=1.33.7, distribution-scope=public, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=openstack_network_exporter, io.openshift.tags=minimal rhel9, maintainer=Red Hat, Inc., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., version=9.6, io.openshift.expose-services=)
Dec 15 04:51:21 localhost systemd[1]: 730c50020a7d9e2da21d2462d40af20ac303e43f3491f692cbbd87f432ba3e09.service: Deactivated successfully.
Dec 15 04:51:22 localhost ceph-mon[293875]: Reconfiguring mgr.np0005559464.aomnqe (monmap changed)...
Dec 15 04:51:22 localhost ceph-mon[293875]: Reconfiguring daemon mgr.np0005559464.aomnqe on np0005559464.localdomain
Dec 15 04:51:22 localhost ceph-mon[293875]: from='mgr.24103 ' entity='mgr.np0005559461.egwgzn'
Dec 15 04:51:22 localhost ceph-mon[293875]: from='mgr.24103 ' entity='mgr.np0005559461.egwgzn'
Dec 15 04:51:22 localhost ceph-mon[293875]: from='mgr.24103 172.18.0.105:0/2912776587' entity='mgr.np0005559461.egwgzn' cmd={"prefix": "auth get", "entity": "mon."} : dispatch
Dec 15 04:51:23 localhost ceph-mon[293875]: Reconfiguring mon.np0005559464 (monmap changed)...
Dec 15 04:51:23 localhost ceph-mon[293875]: Reconfiguring daemon mon.np0005559464 on np0005559464.localdomain
Dec 15 04:51:23 localhost ceph-mon[293875]: from='mgr.24103 ' entity='mgr.np0005559461.egwgzn'
Dec 15 04:51:23 localhost ceph-mon[293875]: from='mgr.24103 ' entity='mgr.np0005559461.egwgzn'
Dec 15 04:51:25 localhost ceph-mon[293875]: from='mgr.24103 ' entity='mgr.np0005559461.egwgzn'
Dec 15 04:51:25 localhost ceph-mon[293875]: from='mgr.24103 ' entity='mgr.np0005559461.egwgzn'
Dec 15 04:51:25 localhost ceph-mon[293875]: from='mgr.24103 172.18.0.105:0/2912776587' entity='mgr.np0005559461.egwgzn' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch
Dec 15 04:51:25 localhost nova_compute[286344]: 2025-12-15 09:51:25.116 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Dec 15 04:51:25 localhost nova_compute[286344]: 2025-12-15 09:51:25.118 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 15 04:51:25 localhost nova_compute[286344]: 2025-12-15 09:51:25.118 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m
Dec 15 04:51:25 localhost nova_compute[286344]: 2025-12-15 09:51:25.118 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m
Dec 15 04:51:25 localhost nova_compute[286344]: 2025-12-15 09:51:25.119 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m
Dec 15 04:51:25 localhost nova_compute[286344]: 2025-12-15 09:51:25.121 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 15 04:51:26 localhost ceph-mon[293875]: Removing np0005559459.localdomain:/var/lib/ceph/bce17446-41b5-5408-a23e-0b011906b44a/config/ceph.conf
Dec 15 04:51:26 localhost ceph-mon[293875]: Updating np0005559460.localdomain:/etc/ceph/ceph.conf
Dec 15 04:51:26 localhost ceph-mon[293875]: Updating np0005559461.localdomain:/etc/ceph/ceph.conf
Dec 15 04:51:26 localhost ceph-mon[293875]: Updating np0005559462.localdomain:/etc/ceph/ceph.conf
Dec 15 04:51:26 localhost ceph-mon[293875]: Updating np0005559463.localdomain:/etc/ceph/ceph.conf
Dec 15 04:51:26 localhost ceph-mon[293875]: Updating np0005559464.localdomain:/etc/ceph/ceph.conf
Dec 15 04:51:26 localhost ceph-mon[293875]: Removing np0005559459.localdomain:/etc/ceph/ceph.client.admin.keyring
Dec 15 04:51:26 localhost ceph-mon[293875]: Removing np0005559459.localdomain:/var/lib/ceph/bce17446-41b5-5408-a23e-0b011906b44a/config/ceph.client.admin.keyring
Dec 15 04:51:26 localhost ceph-mon[293875]: from='mgr.24103 ' entity='mgr.np0005559461.egwgzn'
Dec 15 04:51:26 localhost ceph-mon[293875]: from='mgr.24103 ' entity='mgr.np0005559461.egwgzn'
Dec 15 04:51:26 localhost ceph-mon[293875]: from='mgr.24103 ' entity='mgr.np0005559461.egwgzn'
Dec 15 04:51:26 localhost ceph-mon[293875]: mon.np0005559462@4(peon).osd e84 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 15 04:51:27 localhost ceph-mon[293875]: Updating np0005559461.localdomain:/var/lib/ceph/bce17446-41b5-5408-a23e-0b011906b44a/config/ceph.conf
Dec 15 04:51:27 localhost ceph-mon[293875]: Updating np0005559462.localdomain:/var/lib/ceph/bce17446-41b5-5408-a23e-0b011906b44a/config/ceph.conf
Dec 15 04:51:27 localhost ceph-mon[293875]: Updating np0005559460.localdomain:/var/lib/ceph/bce17446-41b5-5408-a23e-0b011906b44a/config/ceph.conf
Dec 15 04:51:27 localhost ceph-mon[293875]: Updating np0005559464.localdomain:/var/lib/ceph/bce17446-41b5-5408-a23e-0b011906b44a/config/ceph.conf
Dec 15 04:51:27 localhost ceph-mon[293875]: Updating np0005559463.localdomain:/var/lib/ceph/bce17446-41b5-5408-a23e-0b011906b44a/config/ceph.conf
Dec 15 04:51:27 localhost ceph-mon[293875]: Added label _no_schedule to host np0005559459.localdomain
Dec 15 04:51:27 localhost ceph-mon[293875]: from='mgr.24103 ' entity='mgr.np0005559461.egwgzn'
Dec 15 04:51:27 localhost ceph-mon[293875]: from='mgr.24103 ' entity='mgr.np0005559461.egwgzn'
Dec 15 04:51:27 localhost ceph-mon[293875]: Added label SpecialHostLabels.DRAIN_CONF_KEYRING to host np0005559459.localdomain
Dec 15 04:51:27 localhost ceph-mon[293875]: from='mgr.24103 ' entity='mgr.np0005559461.egwgzn'
Dec 15 04:51:27 localhost ceph-mon[293875]: from='mgr.24103 ' entity='mgr.np0005559461.egwgzn'
Dec 15 04:51:27 localhost ceph-mon[293875]: from='mgr.24103 ' entity='mgr.np0005559461.egwgzn'
Dec 15 04:51:27 localhost ceph-mon[293875]: from='mgr.24103 ' entity='mgr.np0005559461.egwgzn'
Dec 15 04:51:27 localhost ceph-mon[293875]: from='mgr.24103 ' entity='mgr.np0005559461.egwgzn'
Dec 15 04:51:27 localhost ceph-mon[293875]: from='mgr.24103 ' entity='mgr.np0005559461.egwgzn'
Dec 15 04:51:27 localhost ceph-mon[293875]: from='mgr.24103 ' entity='mgr.np0005559461.egwgzn'
Dec 15 04:51:27 localhost ceph-mon[293875]: from='mgr.24103 ' entity='mgr.np0005559461.egwgzn'
Dec 15 04:51:27 localhost ceph-mon[293875]: from='mgr.24103 ' entity='mgr.np0005559461.egwgzn'
Dec 15 04:51:27 localhost ceph-mon[293875]: from='mgr.24103 ' entity='mgr.np0005559461.egwgzn'
Dec 15 04:51:28 localhost ceph-mon[293875]: Removing daemon crash.np0005559459 from np0005559459.localdomain -- ports []
Dec 15 04:51:29 localhost ceph-mon[293875]: from='mgr.24103 172.18.0.105:0/2912776587' entity='mgr.np0005559461.egwgzn' cmd={"prefix": "auth rm", "entity": "client.crash.np0005559459.localdomain"} : dispatch
Dec 15 04:51:29 localhost ceph-mon[293875]: from='mgr.24103 ' entity='mgr.np0005559461.egwgzn' cmd={"prefix": "auth rm", "entity": "client.crash.np0005559459.localdomain"} : dispatch
Dec 15 04:51:29 localhost ceph-mon[293875]: from='mgr.24103 ' entity='mgr.np0005559461.egwgzn' cmd='[{"prefix": "auth rm", "entity": "client.crash.np0005559459.localdomain"}]': finished
Dec 15 04:51:29 localhost ceph-mon[293875]: from='mgr.24103 ' entity='mgr.np0005559461.egwgzn'
Dec 15 04:51:29 localhost ceph-mon[293875]: from='mgr.24103 ' entity='mgr.np0005559461.egwgzn'
Dec 15 04:51:29 localhost ceph-mon[293875]: from='mgr.24103 ' entity='mgr.np0005559461.egwgzn'
Dec 15 04:51:29 localhost ceph-mon[293875]: from='mgr.24103 ' entity='mgr.np0005559461.egwgzn' cmd={"prefix":"config-key del","key":"mgr/cephadm/host.np0005559459.localdomain"} : dispatch
Dec 15 04:51:29 localhost ceph-mon[293875]: from='mgr.24103 172.18.0.105:0/2912776587' entity='mgr.np0005559461.egwgzn' cmd={"prefix":"config-key del","key":"mgr/cephadm/host.np0005559459.localdomain"} : dispatch
Dec 15 04:51:29 localhost ceph-mon[293875]: from='mgr.24103 ' entity='mgr.np0005559461.egwgzn' cmd='[{"prefix":"config-key del","key":"mgr/cephadm/host.np0005559459.localdomain"}]': finished
Dec 15 04:51:29 localhost ceph-mon[293875]: from='mgr.24103 ' entity='mgr.np0005559461.egwgzn' cmd={"prefix": "auth rm", "entity": "mgr.np0005559459.hhnowu"} : dispatch
Dec 15 04:51:29 localhost ceph-mon[293875]: from='mgr.24103 172.18.0.105:0/2912776587' entity='mgr.np0005559461.egwgzn' cmd={"prefix": "auth rm", "entity": "mgr.np0005559459.hhnowu"} : dispatch
Dec 15 04:51:29 localhost ceph-mon[293875]: from='mgr.24103 ' entity='mgr.np0005559461.egwgzn' cmd='[{"prefix": "auth rm", "entity": "mgr.np0005559459.hhnowu"}]': finished
Dec 15 04:51:30 localhost systemd[1]: Started /usr/bin/podman healthcheck run 4d46c406a3855fd1c322e5d86c048188ea5865daa07ba23aaebacc7879668923.
Dec 15 04:51:30 localhost nova_compute[286344]: 2025-12-15 09:51:30.122 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Dec 15 04:51:30 localhost nova_compute[286344]: 2025-12-15 09:51:30.124 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Dec 15 04:51:30 localhost nova_compute[286344]: 2025-12-15 09:51:30.125 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m
Dec 15 04:51:30 localhost nova_compute[286344]: 2025-12-15 09:51:30.125 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m
Dec 15 04:51:30 localhost nova_compute[286344]: 2025-12-15 09:51:30.150 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 15 04:51:30 localhost nova_compute[286344]: 2025-12-15 09:51:30.151 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m
Dec 15 04:51:30 localhost systemd[1]: tmp-crun.w2adLX.mount: Deactivated successfully.
Dec 15 04:51:30 localhost podman[296805]: 2025-12-15 09:51:30.210179309 +0000 UTC m=+0.097467919 container health_status 4d46c406a3855fd1c322e5d86c048188ea5865daa07ba23aaebacc7879668923 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '07f3af15d79276418f54a8dde2fc251064527c3571430e14af77aaa2c01f1193-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, managed_by=edpm_ansible, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0)
Dec 15 04:51:30 localhost podman[296805]: 2025-12-15 09:51:30.221866654 +0000 UTC m=+0.109155244 container exec_died 4d46c406a3855fd1c322e5d86c048188ea5865daa07ba23aaebacc7879668923 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '07f3af15d79276418f54a8dde2fc251064527c3571430e14af77aaa2c01f1193-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Dec 15 04:51:30 localhost systemd[1]: 4d46c406a3855fd1c322e5d86c048188ea5865daa07ba23aaebacc7879668923.service: Deactivated successfully.
Dec 15 04:51:30 localhost ceph-mon[293875]: Removing key for client.crash.np0005559459.localdomain
Dec 15 04:51:30 localhost ceph-mon[293875]: Removing daemon mgr.np0005559459.hhnowu from np0005559459.localdomain -- ports [9283, 8765]
Dec 15 04:51:30 localhost ceph-mon[293875]: Removed host np0005559459.localdomain
Dec 15 04:51:30 localhost ceph-mon[293875]: Removing key for mgr.np0005559459.hhnowu
Dec 15 04:51:30 localhost ceph-mon[293875]: from='mgr.24103 ' entity='mgr.np0005559461.egwgzn'
Dec 15 04:51:30 localhost ceph-mon[293875]: from='mgr.24103 ' entity='mgr.np0005559461.egwgzn'
Dec 15 04:51:30 localhost ceph-mon[293875]: from='mgr.24103 172.18.0.105:0/2912776587' entity='mgr.np0005559461.egwgzn' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch
Dec 15 04:51:30 localhost ceph-mon[293875]: from='mgr.24103 ' entity='mgr.np0005559461.egwgzn'
Dec 15 04:51:30 localhost ceph-mon[293875]: from='mgr.24103 ' entity='mgr.np0005559461.egwgzn'
Dec 15 04:51:30 localhost ceph-mon[293875]: from='mgr.24103 172.18.0.105:0/2912776587' entity='mgr.np0005559461.egwgzn' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005559460.localdomain", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch
Dec 15 04:51:30 localhost ceph-mon[293875]: from='mgr.24103 ' entity='mgr.np0005559461.egwgzn' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005559460.localdomain", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch
Dec 15 04:51:31 localhost ceph-mon[293875]: Reconfiguring crash.np0005559460 (monmap changed)...
Dec 15 04:51:31 localhost ceph-mon[293875]: Reconfiguring daemon crash.np0005559460 on np0005559460.localdomain Dec 15 04:51:31 localhost ceph-mon[293875]: from='mgr.24103 ' entity='mgr.np0005559461.egwgzn' Dec 15 04:51:31 localhost ceph-mon[293875]: from='mgr.24103 ' entity='mgr.np0005559461.egwgzn' Dec 15 04:51:31 localhost ceph-mon[293875]: from='mgr.24103 172.18.0.105:0/2912776587' entity='mgr.np0005559461.egwgzn' cmd={"prefix": "auth get", "entity": "mon."} : dispatch Dec 15 04:51:31 localhost ceph-mon[293875]: mon.np0005559462@4(peon).osd e84 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Dec 15 04:51:31 localhost podman[243449]: time="2025-12-15T09:51:31Z" level=info msg="List containers: received `last` parameter - overwriting `limit`" Dec 15 04:51:31 localhost podman[243449]: @ - - [15/Dec/2025:09:51:31 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 154814 "" "Go-http-client/1.1" Dec 15 04:51:31 localhost podman[243449]: @ - - [15/Dec/2025:09:51:31 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 18694 "" "Go-http-client/1.1" Dec 15 04:51:32 localhost sshd[296823]: main: sshd: ssh-rsa algorithm is disabled Dec 15 04:51:32 localhost systemd[1]: Created slice User Slice of UID 1003. Dec 15 04:51:32 localhost systemd[1]: Starting User Runtime Directory /run/user/1003... Dec 15 04:51:32 localhost systemd-logind[763]: New session 65 of user tripleo-admin. Dec 15 04:51:32 localhost ceph-mon[293875]: Reconfiguring mon.np0005559460 (monmap changed)... Dec 15 04:51:32 localhost ceph-mon[293875]: Reconfiguring daemon mon.np0005559460 on np0005559460.localdomain Dec 15 04:51:32 localhost systemd[1]: Finished User Runtime Directory /run/user/1003. Dec 15 04:51:32 localhost systemd[1]: Starting User Manager for UID 1003... 
Dec 15 04:51:32 localhost systemd[296827]: Queued start job for default target Main User Target. Dec 15 04:51:32 localhost systemd[296827]: Created slice User Application Slice. Dec 15 04:51:32 localhost systemd[296827]: Started Mark boot as successful after the user session has run 2 minutes. Dec 15 04:51:32 localhost systemd[296827]: Started Daily Cleanup of User's Temporary Directories. Dec 15 04:51:32 localhost systemd[296827]: Reached target Paths. Dec 15 04:51:32 localhost systemd[296827]: Reached target Timers. Dec 15 04:51:32 localhost systemd[296827]: Starting D-Bus User Message Bus Socket... Dec 15 04:51:32 localhost systemd[296827]: Starting Create User's Volatile Files and Directories... Dec 15 04:51:32 localhost systemd[296827]: Listening on D-Bus User Message Bus Socket. Dec 15 04:51:32 localhost systemd[296827]: Finished Create User's Volatile Files and Directories. Dec 15 04:51:32 localhost systemd[296827]: Reached target Sockets. Dec 15 04:51:32 localhost systemd[296827]: Reached target Basic System. Dec 15 04:51:32 localhost systemd[296827]: Reached target Main User Target. Dec 15 04:51:32 localhost systemd[296827]: Startup finished in 144ms. Dec 15 04:51:32 localhost systemd[1]: Started User Manager for UID 1003. Dec 15 04:51:32 localhost systemd[1]: Started Session 65 of User tripleo-admin. 
Dec 15 04:51:33 localhost python3[296969]: ansible-ansible.builtin.lineinfile Invoked with dest=/etc/os-net-config/tripleo_config.yaml insertafter=172.18.0 line= - ip_netmask: 172.18.0.103/24 backup=True path=/etc/os-net-config/tripleo_config.yaml state=present backrefs=False create=False firstmatch=False unsafe_writes=False regexp=None search_string=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None Dec 15 04:51:33 localhost ceph-mon[293875]: from='mgr.24103 ' entity='mgr.np0005559461.egwgzn' Dec 15 04:51:33 localhost ceph-mon[293875]: from='mgr.24103 ' entity='mgr.np0005559461.egwgzn' Dec 15 04:51:33 localhost ceph-mon[293875]: Reconfiguring mgr.np0005559460.oexkup (monmap changed)... Dec 15 04:51:33 localhost ceph-mon[293875]: from='mgr.24103 172.18.0.105:0/2912776587' entity='mgr.np0005559461.egwgzn' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005559460.oexkup", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch Dec 15 04:51:33 localhost ceph-mon[293875]: from='mgr.24103 ' entity='mgr.np0005559461.egwgzn' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005559460.oexkup", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch Dec 15 04:51:33 localhost ceph-mon[293875]: Reconfiguring daemon mgr.np0005559460.oexkup on np0005559460.localdomain Dec 15 04:51:33 localhost ceph-mon[293875]: from='mgr.24103 ' entity='mgr.np0005559461.egwgzn' Dec 15 04:51:33 localhost ceph-mon[293875]: from='mgr.24103 ' entity='mgr.np0005559461.egwgzn' Dec 15 04:51:33 localhost ceph-mon[293875]: from='mgr.24103 172.18.0.105:0/2912776587' entity='mgr.np0005559461.egwgzn' cmd={"prefix": "auth get", "entity": "mon."} : dispatch Dec 15 04:51:34 localhost python3[297115]: ansible-ansible.legacy.command Invoked with _raw_params=ip a add 172.18.0.103/24 dev vlan21 _uses_shell=False stdin_add_newline=True strip_empty_ends=True argv=None 
chdir=None executable=None creates=None removes=None stdin=None Dec 15 04:51:34 localhost openstack_network_exporter[246484]: ERROR 09:51:34 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd Dec 15 04:51:34 localhost openstack_network_exporter[246484]: ERROR 09:51:34 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd Dec 15 04:51:34 localhost openstack_network_exporter[246484]: ERROR 09:51:34 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server Dec 15 04:51:34 localhost openstack_network_exporter[246484]: ERROR 09:51:34 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath Dec 15 04:51:34 localhost openstack_network_exporter[246484]: Dec 15 04:51:34 localhost openstack_network_exporter[246484]: ERROR 09:51:34 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath Dec 15 04:51:34 localhost openstack_network_exporter[246484]: Dec 15 04:51:34 localhost python3[297260]: ansible-ansible.legacy.command Invoked with _raw_params=ping -W1 -c 3 172.18.0.103 _uses_shell=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None Dec 15 04:51:35 localhost nova_compute[286344]: 2025-12-15 09:51:35.151 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 04:51:35 localhost nova_compute[286344]: 2025-12-15 09:51:35.156 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 04:51:35 localhost ceph-mon[293875]: Reconfiguring mon.np0005559461 (monmap changed)... 
Dec 15 04:51:35 localhost ceph-mon[293875]: Reconfiguring daemon mon.np0005559461 on np0005559461.localdomain Dec 15 04:51:35 localhost ceph-mon[293875]: from='mgr.24103 ' entity='mgr.np0005559461.egwgzn' Dec 15 04:51:35 localhost ceph-mon[293875]: from='mgr.24103 ' entity='mgr.np0005559461.egwgzn' Dec 15 04:51:35 localhost ceph-mon[293875]: Reconfiguring mgr.np0005559461.egwgzn (monmap changed)... Dec 15 04:51:35 localhost ceph-mon[293875]: from='mgr.24103 ' entity='mgr.np0005559461.egwgzn' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005559461.egwgzn", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch Dec 15 04:51:35 localhost ceph-mon[293875]: from='mgr.24103 172.18.0.105:0/2912776587' entity='mgr.np0005559461.egwgzn' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005559461.egwgzn", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch Dec 15 04:51:35 localhost ceph-mon[293875]: Reconfiguring daemon mgr.np0005559461.egwgzn on np0005559461.localdomain Dec 15 04:51:35 localhost ceph-mon[293875]: from='mgr.24103 ' entity='mgr.np0005559461.egwgzn' Dec 15 04:51:35 localhost ceph-mon[293875]: from='mgr.24103 ' entity='mgr.np0005559461.egwgzn' Dec 15 04:51:35 localhost ceph-mon[293875]: from='mgr.24103 172.18.0.105:0/2912776587' entity='mgr.np0005559461.egwgzn' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005559461.localdomain", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch Dec 15 04:51:35 localhost ceph-mon[293875]: from='mgr.24103 ' entity='mgr.np0005559461.egwgzn' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005559461.localdomain", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch Dec 15 04:51:35 localhost sshd[297262]: main: sshd: ssh-rsa algorithm is disabled Dec 15 04:51:36 localhost ceph-mon[293875]: Reconfiguring crash.np0005559461 (monmap changed)... 
Dec 15 04:51:36 localhost ceph-mon[293875]: Reconfiguring daemon crash.np0005559461 on np0005559461.localdomain Dec 15 04:51:36 localhost ceph-mon[293875]: from='mgr.24103 ' entity='mgr.np0005559461.egwgzn' Dec 15 04:51:36 localhost ceph-mon[293875]: from='mgr.24103 ' entity='mgr.np0005559461.egwgzn' Dec 15 04:51:36 localhost ceph-mon[293875]: from='mgr.24103 ' entity='mgr.np0005559461.egwgzn' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005559462.localdomain", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch Dec 15 04:51:36 localhost ceph-mon[293875]: from='mgr.24103 172.18.0.105:0/2912776587' entity='mgr.np0005559461.egwgzn' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005559462.localdomain", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch Dec 15 04:51:36 localhost ceph-mon[293875]: mon.np0005559462@4(peon).osd e84 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Dec 15 04:51:36 localhost podman[297316]: Dec 15 04:51:36 localhost podman[297316]: 2025-12-15 09:51:36.848633774 +0000 UTC m=+0.086243166 container create 513cb56f6d7a36e3a44807314319e1fea0d339978bd301ab63640d639ce8db63 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=reverent_chandrasekhar, GIT_REPO=https://github.com/ceph/ceph-container.git, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, distribution-scope=public, version=7, CEPH_POINT_RELEASE=, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_CLEAN=True, vendor=Red Hat, Inc., release=1763362218, build-date=2025-11-26T19:44:28Z, RELEASE=main, name=rhceph, description=Red Hat Ceph Storage 7, maintainer=Guillaume Abrioux , org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, com.redhat.component=rhceph-container, ceph=True, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, GIT_BRANCH=main, 
cpe=cpe:/a:redhat:enterprise_linux:9::appstream, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, io.buildah.version=1.41.4, vcs-type=git, io.openshift.expose-services=, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.k8s.description=Red Hat Ceph Storage 7, io.openshift.tags=rhceph ceph, architecture=x86_64, url=https://catalog.redhat.com/en/search?searchType=containers) Dec 15 04:51:36 localhost systemd[1]: Started /usr/bin/podman healthcheck run a4e94bd7aaee6c174c601060fb5f34559019ccea93fc397263ee3735731cfc1e. Dec 15 04:51:36 localhost systemd[1]: Started libpod-conmon-513cb56f6d7a36e3a44807314319e1fea0d339978bd301ab63640d639ce8db63.scope. Dec 15 04:51:36 localhost systemd[1]: Started libcrun container. Dec 15 04:51:36 localhost podman[297316]: 2025-12-15 09:51:36.815253403 +0000 UTC m=+0.052862835 image pull registry.redhat.io/rhceph/rhceph-7-rhel9:latest Dec 15 04:51:36 localhost podman[297316]: 2025-12-15 09:51:36.933295054 +0000 UTC m=+0.170904446 container init 513cb56f6d7a36e3a44807314319e1fea0d339978bd301ab63640d639ce8db63 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=reverent_chandrasekhar, version=7, ceph=True, com.redhat.component=rhceph-container, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., CEPH_POINT_RELEASE=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.openshift.tags=rhceph ceph, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, url=https://catalog.redhat.com/en/search?searchType=containers, GIT_CLEAN=True, io.k8s.description=Red Hat Ceph Storage 7, RELEASE=main, architecture=x86_64, distribution-scope=public, build-date=2025-11-26T19:44:28Z, GIT_REPO=https://github.com/ceph/ceph-container.git, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, 
maintainer=Guillaume Abrioux , GIT_BRANCH=main, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, name=rhceph, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.openshift.expose-services=, description=Red Hat Ceph Storage 7, io.buildah.version=1.41.4, vcs-type=git, release=1763362218) Dec 15 04:51:36 localhost podman[297316]: 2025-12-15 09:51:36.943442936 +0000 UTC m=+0.181052328 container start 513cb56f6d7a36e3a44807314319e1fea0d339978bd301ab63640d639ce8db63 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=reverent_chandrasekhar, com.redhat.component=rhceph-container, GIT_BRANCH=main, io.k8s.description=Red Hat Ceph Storage 7, maintainer=Guillaume Abrioux , GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vcs-type=git, release=1763362218, GIT_CLEAN=True, version=7, io.openshift.expose-services=, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, GIT_REPO=https://github.com/ceph/ceph-container.git, vendor=Red Hat, Inc., description=Red Hat Ceph Storage 7, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, ceph=True, RELEASE=main, io.buildah.version=1.41.4, CEPH_POINT_RELEASE=, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, name=rhceph, io.openshift.tags=rhceph ceph, architecture=x86_64, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, url=https://catalog.redhat.com/en/search?searchType=containers, distribution-scope=public, build-date=2025-11-26T19:44:28Z, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image.) 
Dec 15 04:51:36 localhost podman[297316]: 2025-12-15 09:51:36.9439126 +0000 UTC m=+0.181522042 container attach 513cb56f6d7a36e3a44807314319e1fea0d339978bd301ab63640d639ce8db63 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=reverent_chandrasekhar, CEPH_POINT_RELEASE=, version=7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, GIT_CLEAN=True, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.openshift.expose-services=, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, maintainer=Guillaume Abrioux , cpe=cpe:/a:redhat:enterprise_linux:9::appstream, RELEASE=main, description=Red Hat Ceph Storage 7, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, vendor=Red Hat, Inc., name=rhceph, url=https://catalog.redhat.com/en/search?searchType=containers, release=1763362218, io.openshift.tags=rhceph ceph, io.buildah.version=1.41.4, vcs-type=git, distribution-scope=public, ceph=True, GIT_REPO=https://github.com/ceph/ceph-container.git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=rhceph-container, GIT_BRANCH=main, build-date=2025-11-26T19:44:28Z, io.k8s.description=Red Hat Ceph Storage 7, architecture=x86_64) Dec 15 04:51:36 localhost reverent_chandrasekhar[297332]: 167 167 Dec 15 04:51:36 localhost systemd[1]: libpod-513cb56f6d7a36e3a44807314319e1fea0d339978bd301ab63640d639ce8db63.scope: Deactivated successfully. 
Dec 15 04:51:36 localhost podman[297316]: 2025-12-15 09:51:36.949400823 +0000 UTC m=+0.187010235 container died 513cb56f6d7a36e3a44807314319e1fea0d339978bd301ab63640d639ce8db63 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=reverent_chandrasekhar, name=rhceph, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, GIT_CLEAN=True, GIT_BRANCH=main, version=7, ceph=True, io.openshift.expose-services=, vendor=Red Hat, Inc., architecture=x86_64, url=https://catalog.redhat.com/en/search?searchType=containers, GIT_REPO=https://github.com/ceph/ceph-container.git, io.k8s.description=Red Hat Ceph Storage 7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, description=Red Hat Ceph Storage 7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., RELEASE=main, vcs-type=git, release=1763362218, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, CEPH_POINT_RELEASE=, com.redhat.component=rhceph-container, maintainer=Guillaume Abrioux , build-date=2025-11-26T19:44:28Z, io.openshift.tags=rhceph ceph, io.buildah.version=1.41.4, cpe=cpe:/a:redhat:enterprise_linux:9::appstream) Dec 15 04:51:37 localhost podman[297331]: 2025-12-15 09:51:37.040400639 +0000 UTC m=+0.152337237 container health_status a4e94bd7aaee6c174c601060fb5f34559019ccea93fc397263ee3735731cfc1e (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 
'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible) Dec 15 04:51:37 localhost podman[297348]: 2025-12-15 09:51:37.082009409 +0000 UTC m=+0.121036345 container remove 513cb56f6d7a36e3a44807314319e1fea0d339978bd301ab63640d639ce8db63 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=reverent_chandrasekhar, io.k8s.description=Red Hat Ceph Storage 7, vendor=Red Hat, Inc., CEPH_POINT_RELEASE=, maintainer=Guillaume Abrioux , distribution-scope=public, GIT_BRANCH=main, GIT_CLEAN=True, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, io.openshift.tags=rhceph ceph, name=rhceph, description=Red Hat Ceph Storage 7, url=https://catalog.redhat.com/en/search?searchType=containers, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, architecture=x86_64, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.openshift.expose-services=, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, vcs-type=git, com.redhat.component=rhceph-container, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_REPO=https://github.com/ceph/ceph-container.git, RELEASE=main, build-date=2025-11-26T19:44:28Z, ceph=True, release=1763362218, version=7) Dec 15 04:51:37 localhost systemd[1]: libpod-conmon-513cb56f6d7a36e3a44807314319e1fea0d339978bd301ab63640d639ce8db63.scope: Deactivated successfully. 
Dec 15 04:51:37 localhost podman[297331]: 2025-12-15 09:51:37.125244024 +0000 UTC m=+0.237180672 container exec_died a4e94bd7aaee6c174c601060fb5f34559019ccea93fc397263ee3735731cfc1e (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible) Dec 15 04:51:37 localhost systemd[1]: a4e94bd7aaee6c174c601060fb5f34559019ccea93fc397263ee3735731cfc1e.service: Deactivated successfully. Dec 15 04:51:37 localhost ceph-mon[293875]: Reconfiguring crash.np0005559462 (monmap changed)... 
Dec 15 04:51:37 localhost ceph-mon[293875]: Reconfiguring daemon crash.np0005559462 on np0005559462.localdomain Dec 15 04:51:37 localhost ceph-mon[293875]: from='mgr.24103 ' entity='mgr.np0005559461.egwgzn' Dec 15 04:51:37 localhost ceph-mon[293875]: from='mgr.24103 ' entity='mgr.np0005559461.egwgzn' Dec 15 04:51:37 localhost ceph-mon[293875]: from='mgr.24103 172.18.0.105:0/2912776587' entity='mgr.np0005559461.egwgzn' cmd={"prefix": "auth get", "entity": "osd.0"} : dispatch Dec 15 04:51:37 localhost sshd[297440]: main: sshd: ssh-rsa algorithm is disabled Dec 15 04:51:37 localhost podman[297446]: Dec 15 04:51:37 localhost podman[297446]: 2025-12-15 09:51:37.769346621 +0000 UTC m=+0.058876762 container create 25575d86cfc8f44b0fbe02d0cab7b6450d25758ddcf5f27b25f247ffe4e6d4cd (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=keen_carson, io.k8s.description=Red Hat Ceph Storage 7, io.buildah.version=1.41.4, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat Ceph Storage 7, name=rhceph, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, vendor=Red Hat, Inc., GIT_CLEAN=True, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, version=7, RELEASE=main, GIT_BRANCH=main, vcs-type=git, com.redhat.component=rhceph-container, build-date=2025-11-26T19:44:28Z, GIT_REPO=https://github.com/ceph/ceph-container.git, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, CEPH_POINT_RELEASE=, architecture=x86_64, io.openshift.tags=rhceph ceph, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, distribution-scope=public, release=1763362218, maintainer=Guillaume Abrioux , url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.expose-services=, ceph=True) Dec 15 04:51:37 localhost systemd[1]: Started 
libpod-conmon-25575d86cfc8f44b0fbe02d0cab7b6450d25758ddcf5f27b25f247ffe4e6d4cd.scope. Dec 15 04:51:37 localhost systemd[1]: Started libcrun container. Dec 15 04:51:37 localhost podman[297446]: 2025-12-15 09:51:37.741336061 +0000 UTC m=+0.030866202 image pull registry.redhat.io/rhceph/rhceph-7-rhel9:latest Dec 15 04:51:37 localhost podman[297446]: 2025-12-15 09:51:37.848290822 +0000 UTC m=+0.137820963 container init 25575d86cfc8f44b0fbe02d0cab7b6450d25758ddcf5f27b25f247ffe4e6d4cd (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=keen_carson, CEPH_POINT_RELEASE=, vcs-type=git, io.openshift.expose-services=, maintainer=Guillaume Abrioux , url=https://catalog.redhat.com/en/search?searchType=containers, GIT_CLEAN=True, io.k8s.description=Red Hat Ceph Storage 7, name=rhceph, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., build-date=2025-11-26T19:44:28Z, description=Red Hat Ceph Storage 7, ceph=True, com.redhat.component=rhceph-container, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, GIT_BRANCH=main, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, GIT_REPO=https://github.com/ceph/ceph-container.git, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, distribution-scope=public, io.buildah.version=1.41.4, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.openshift.tags=rhceph ceph, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, version=7, release=1763362218, architecture=x86_64, vendor=Red Hat, Inc., RELEASE=main) Dec 15 04:51:37 localhost systemd[1]: var-lib-containers-storage-overlay-d1cdc3edf6594936220d881002ebba335e6eded24ec6fabe6e7c0a7823be2c82-merged.mount: Deactivated successfully. 
Dec 15 04:51:37 localhost keen_carson[297463]: 167 167 Dec 15 04:51:37 localhost podman[297446]: 2025-12-15 09:51:37.862969062 +0000 UTC m=+0.152499203 container start 25575d86cfc8f44b0fbe02d0cab7b6450d25758ddcf5f27b25f247ffe4e6d4cd (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=keen_carson, io.openshift.tags=rhceph ceph, io.openshift.expose-services=, name=rhceph, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_REPO=https://github.com/ceph/ceph-container.git, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, io.buildah.version=1.41.4, GIT_BRANCH=main, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, ceph=True, vcs-type=git, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, maintainer=Guillaume Abrioux , build-date=2025-11-26T19:44:28Z, description=Red Hat Ceph Storage 7, version=7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vendor=Red Hat, Inc., distribution-scope=public, CEPH_POINT_RELEASE=, com.redhat.component=rhceph-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, GIT_CLEAN=True, RELEASE=main, io.k8s.description=Red Hat Ceph Storage 7, url=https://catalog.redhat.com/en/search?searchType=containers, release=1763362218, architecture=x86_64) Dec 15 04:51:37 localhost systemd[1]: libpod-25575d86cfc8f44b0fbe02d0cab7b6450d25758ddcf5f27b25f247ffe4e6d4cd.scope: Deactivated successfully. 
Dec 15 04:51:37 localhost podman[297446]: 2025-12-15 09:51:37.863399794 +0000 UTC m=+0.152929995 container attach 25575d86cfc8f44b0fbe02d0cab7b6450d25758ddcf5f27b25f247ffe4e6d4cd (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=keen_carson, GIT_REPO=https://github.com/ceph/ceph-container.git, com.redhat.component=rhceph-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, ceph=True, CEPH_POINT_RELEASE=, io.buildah.version=1.41.4, maintainer=Guillaume Abrioux , description=Red Hat Ceph Storage 7, io.openshift.expose-services=, io.k8s.description=Red Hat Ceph Storage 7, name=rhceph, vcs-type=git, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, version=7, RELEASE=main, distribution-scope=public, build-date=2025-11-26T19:44:28Z, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, GIT_CLEAN=True, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, release=1763362218, GIT_BRANCH=main, architecture=x86_64, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.tags=rhceph ceph, vendor=Red Hat, Inc.) 
Dec 15 04:51:37 localhost podman[297446]: 2025-12-15 09:51:37.866313275 +0000 UTC m=+0.155843446 container died 25575d86cfc8f44b0fbe02d0cab7b6450d25758ddcf5f27b25f247ffe4e6d4cd (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=keen_carson, url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.tags=rhceph ceph, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.openshift.expose-services=, GIT_BRANCH=main, name=rhceph, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, maintainer=Guillaume Abrioux , com.redhat.component=rhceph-container, GIT_CLEAN=True, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, GIT_REPO=https://github.com/ceph/ceph-container.git, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vcs-type=git, vendor=Red Hat, Inc., build-date=2025-11-26T19:44:28Z, description=Red Hat Ceph Storage 7, RELEASE=main, release=1763362218, version=7, ceph=True, CEPH_POINT_RELEASE=, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.k8s.description=Red Hat Ceph Storage 7, io.buildah.version=1.41.4, architecture=x86_64, distribution-scope=public) Dec 15 04:51:37 localhost systemd[1]: var-lib-containers-storage-overlay-67fe1c2b1d942c07dddf7cf8985e45a5d6423a52a65bb622dd54514a15a267df-merged.mount: Deactivated successfully. 
Dec 15 04:51:37 localhost podman[297468]: 2025-12-15 09:51:37.967463915 +0000 UTC m=+0.093672152 container remove 25575d86cfc8f44b0fbe02d0cab7b6450d25758ddcf5f27b25f247ffe4e6d4cd (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=keen_carson, architecture=x86_64, RELEASE=main, build-date=2025-11-26T19:44:28Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, vendor=Red Hat, Inc., description=Red Hat Ceph Storage 7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.openshift.expose-services=, maintainer=Guillaume Abrioux , ceph=True, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, url=https://catalog.redhat.com/en/search?searchType=containers, GIT_BRANCH=main, com.redhat.component=rhceph-container, vcs-type=git, io.k8s.description=Red Hat Ceph Storage 7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, distribution-scope=public, GIT_CLEAN=True, name=rhceph, io.buildah.version=1.41.4, CEPH_POINT_RELEASE=, version=7, release=1763362218, io.openshift.tags=rhceph ceph, GIT_REPO=https://github.com/ceph/ceph-container.git, cpe=cpe:/a:redhat:enterprise_linux:9::appstream) Dec 15 04:51:37 localhost systemd[1]: libpod-conmon-25575d86cfc8f44b0fbe02d0cab7b6450d25758ddcf5f27b25f247ffe4e6d4cd.scope: Deactivated successfully. Dec 15 04:51:38 localhost ceph-mon[293875]: Reconfiguring osd.0 (monmap changed)... 
Dec 15 04:51:38 localhost ceph-mon[293875]: Reconfiguring daemon osd.0 on np0005559462.localdomain Dec 15 04:51:38 localhost ceph-mon[293875]: from='mgr.24103 ' entity='mgr.np0005559461.egwgzn' Dec 15 04:51:38 localhost ceph-mon[293875]: from='mgr.24103 ' entity='mgr.np0005559461.egwgzn' Dec 15 04:51:38 localhost ceph-mon[293875]: from='mgr.24103 ' entity='mgr.np0005559461.egwgzn' Dec 15 04:51:38 localhost ceph-mon[293875]: from='mgr.24103 172.18.0.105:0/2912776587' entity='mgr.np0005559461.egwgzn' cmd={"prefix": "auth get", "entity": "osd.3"} : dispatch Dec 15 04:51:38 localhost podman[297543]: Dec 15 04:51:38 localhost podman[297543]: 2025-12-15 09:51:38.795511151 +0000 UTC m=+0.078067388 container create 1de68abd1d758e06929556898dc3c85b0d78eb565c5d49f31346418e5a6242fc (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=festive_shannon, vcs-type=git, io.k8s.description=Red Hat Ceph Storage 7, CEPH_POINT_RELEASE=, architecture=x86_64, com.redhat.component=rhceph-container, distribution-scope=public, vendor=Red Hat, Inc., description=Red Hat Ceph Storage 7, maintainer=Guillaume Abrioux , com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=7, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, release=1763362218, GIT_CLEAN=True, name=rhceph, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, ceph=True, RELEASE=main, io.openshift.tags=rhceph ceph, build-date=2025-11-26T19:44:28Z, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.openshift.expose-services=, GIT_BRANCH=main, GIT_REPO=https://github.com/ceph/ceph-container.git, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, url=https://catalog.redhat.com/en/search?searchType=containers, io.buildah.version=1.41.4, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0) Dec 15 04:51:38 localhost systemd[1]: Started 
libpod-conmon-1de68abd1d758e06929556898dc3c85b0d78eb565c5d49f31346418e5a6242fc.scope. Dec 15 04:51:38 localhost systemd[1]: Started libcrun container. Dec 15 04:51:38 localhost podman[297543]: 2025-12-15 09:51:38.85792118 +0000 UTC m=+0.140477427 container init 1de68abd1d758e06929556898dc3c85b0d78eb565c5d49f31346418e5a6242fc (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=festive_shannon, CEPH_POINT_RELEASE=, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, com.redhat.component=rhceph-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, GIT_REPO=https://github.com/ceph/ceph-container.git, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., name=rhceph, description=Red Hat Ceph Storage 7, ceph=True, RELEASE=main, GIT_CLEAN=True, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.openshift.tags=rhceph ceph, io.openshift.expose-services=, version=7, io.k8s.description=Red Hat Ceph Storage 7, io.buildah.version=1.41.4, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, release=1763362218, architecture=x86_64, vcs-type=git, maintainer=Guillaume Abrioux , distribution-scope=public, GIT_BRANCH=main, build-date=2025-11-26T19:44:28Z, url=https://catalog.redhat.com/en/search?searchType=containers) Dec 15 04:51:38 localhost podman[297543]: 2025-12-15 09:51:38.762521011 +0000 UTC m=+0.045077248 image pull registry.redhat.io/rhceph/rhceph-7-rhel9:latest Dec 15 04:51:38 localhost podman[297543]: 2025-12-15 09:51:38.867127577 +0000 UTC m=+0.149683804 container start 1de68abd1d758e06929556898dc3c85b0d78eb565c5d49f31346418e5a6242fc (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=festive_shannon, GIT_CLEAN=True, vcs-type=git, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, 
CEPH_POINT_RELEASE=, io.openshift.expose-services=, maintainer=Guillaume Abrioux , ceph=True, com.redhat.component=rhceph-container, architecture=x86_64, RELEASE=main, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., release=1763362218, io.openshift.tags=rhceph ceph, version=7, name=rhceph, io.k8s.description=Red Hat Ceph Storage 7, description=Red Hat Ceph Storage 7, build-date=2025-11-26T19:44:28Z, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_REPO=https://github.com/ceph/ceph-container.git, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, GIT_BRANCH=main, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., distribution-scope=public, url=https://catalog.redhat.com/en/search?searchType=containers, io.buildah.version=1.41.4) Dec 15 04:51:38 localhost podman[297543]: 2025-12-15 09:51:38.867357363 +0000 UTC m=+0.149913610 container attach 1de68abd1d758e06929556898dc3c85b0d78eb565c5d49f31346418e5a6242fc (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=festive_shannon, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.openshift.tags=rhceph ceph, url=https://catalog.redhat.com/en/search?searchType=containers, GIT_REPO=https://github.com/ceph/ceph-container.git, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, maintainer=Guillaume Abrioux , RELEASE=main, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, GIT_BRANCH=main, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, architecture=x86_64, release=1763362218, com.redhat.component=rhceph-container, version=7, io.k8s.description=Red Hat Ceph Storage 7, io.openshift.expose-services=, vendor=Red Hat, Inc., org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, GIT_CLEAN=True, 
build-date=2025-11-26T19:44:28Z, name=rhceph, description=Red Hat Ceph Storage 7, io.buildah.version=1.41.4, ceph=True, distribution-scope=public, vcs-type=git, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, CEPH_POINT_RELEASE=, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image.) Dec 15 04:51:38 localhost festive_shannon[297558]: 167 167 Dec 15 04:51:38 localhost systemd[1]: libpod-1de68abd1d758e06929556898dc3c85b0d78eb565c5d49f31346418e5a6242fc.scope: Deactivated successfully. Dec 15 04:51:38 localhost podman[297543]: 2025-12-15 09:51:38.871086928 +0000 UTC m=+0.153643195 container died 1de68abd1d758e06929556898dc3c85b0d78eb565c5d49f31346418e5a6242fc (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=festive_shannon, GIT_CLEAN=True, build-date=2025-11-26T19:44:28Z, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vendor=Red Hat, Inc., org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, ceph=True, release=1763362218, name=rhceph, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, com.redhat.component=rhceph-container, CEPH_POINT_RELEASE=, distribution-scope=public, version=7, vcs-type=git, io.k8s.description=Red Hat Ceph Storage 7, url=https://catalog.redhat.com/en/search?searchType=containers, description=Red Hat Ceph Storage 7, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, RELEASE=main, io.openshift.tags=rhceph ceph, GIT_BRANCH=main, io.openshift.expose-services=, io.buildah.version=1.41.4, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_REPO=https://github.com/ceph/ceph-container.git, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=Guillaume Abrioux , architecture=x86_64) Dec 15 04:51:38 localhost systemd[1]: tmp-crun.Btlb2C.mount: Deactivated successfully. 
Dec 15 04:51:38 localhost podman[297563]: 2025-12-15 09:51:38.973822941 +0000 UTC m=+0.090102403 container remove 1de68abd1d758e06929556898dc3c85b0d78eb565c5d49f31346418e5a6242fc (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=festive_shannon, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, version=7, build-date=2025-11-26T19:44:28Z, CEPH_POINT_RELEASE=, ceph=True, GIT_REPO=https://github.com/ceph/ceph-container.git, vendor=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, name=rhceph, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, vcs-type=git, GIT_CLEAN=True, description=Red Hat Ceph Storage 7, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.openshift.tags=rhceph ceph, com.redhat.component=rhceph-container, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, RELEASE=main, io.k8s.description=Red Hat Ceph Storage 7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_BRANCH=main, io.openshift.expose-services=, maintainer=Guillaume Abrioux , release=1763362218, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image.) Dec 15 04:51:38 localhost systemd[1]: libpod-conmon-1de68abd1d758e06929556898dc3c85b0d78eb565c5d49f31346418e5a6242fc.scope: Deactivated successfully. Dec 15 04:51:39 localhost ceph-mon[293875]: Saving service mon spec with placement label:mon Dec 15 04:51:39 localhost ceph-mon[293875]: Reconfiguring osd.3 (monmap changed)... 
Dec 15 04:51:39 localhost ceph-mon[293875]: Reconfiguring daemon osd.3 on np0005559462.localdomain Dec 15 04:51:39 localhost ceph-mon[293875]: from='mgr.24103 ' entity='mgr.np0005559461.egwgzn' Dec 15 04:51:39 localhost ceph-mon[293875]: from='mgr.24103 ' entity='mgr.np0005559461.egwgzn' Dec 15 04:51:39 localhost ceph-mon[293875]: from='mgr.24103 172.18.0.105:0/2912776587' entity='mgr.np0005559461.egwgzn' cmd={"prefix": "auth get-or-create", "entity": "mds.mds.np0005559462.mhigvc", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]} : dispatch Dec 15 04:51:39 localhost ceph-mon[293875]: from='mgr.24103 ' entity='mgr.np0005559461.egwgzn' cmd={"prefix": "auth get-or-create", "entity": "mds.mds.np0005559462.mhigvc", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]} : dispatch Dec 15 04:51:39 localhost podman[297639]: Dec 15 04:51:39 localhost podman[297639]: 2025-12-15 09:51:39.801348622 +0000 UTC m=+0.074778076 container create caf59bac52d39bc7b26fa948c1452a64ba8db34ecede72b3b13528e5ebb5e50a (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=thirsty_allen, maintainer=Guillaume Abrioux , vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, build-date=2025-11-26T19:44:28Z, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://catalog.redhat.com/en/search?searchType=containers, distribution-scope=public, io.k8s.description=Red Hat Ceph Storage 7, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.buildah.version=1.41.4, version=7, release=1763362218, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., name=rhceph, io.openshift.tags=rhceph ceph, GIT_REPO=https://github.com/ceph/ceph-container.git, io.openshift.expose-services=, ceph=True, GIT_BRANCH=main, description=Red Hat Ceph Storage 7, com.redhat.component=rhceph-container, GIT_CLEAN=True, 
GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, CEPH_POINT_RELEASE=, vcs-type=git, RELEASE=main, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, cpe=cpe:/a:redhat:enterprise_linux:9::appstream) Dec 15 04:51:39 localhost systemd[1]: Started libpod-conmon-caf59bac52d39bc7b26fa948c1452a64ba8db34ecede72b3b13528e5ebb5e50a.scope. Dec 15 04:51:39 localhost systemd[1]: Started libcrun container. Dec 15 04:51:39 localhost systemd[1]: var-lib-containers-storage-overlay-55b16356f55f1e646e74569bbf223c28dfa4ceba4fccc0ecec0a716a03ea5ab2-merged.mount: Deactivated successfully. Dec 15 04:51:39 localhost podman[297639]: 2025-12-15 09:51:39.86583742 +0000 UTC m=+0.139266874 container init caf59bac52d39bc7b26fa948c1452a64ba8db34ecede72b3b13528e5ebb5e50a (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=thirsty_allen, release=1763362218, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.k8s.description=Red Hat Ceph Storage 7, CEPH_POINT_RELEASE=, RELEASE=main, io.buildah.version=1.41.4, url=https://catalog.redhat.com/en/search?searchType=containers, description=Red Hat Ceph Storage 7, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, distribution-scope=public, build-date=2025-11-26T19:44:28Z, GIT_CLEAN=True, ceph=True, io.openshift.tags=rhceph ceph, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_BRANCH=main, version=7, vendor=Red Hat, Inc., io.openshift.expose-services=, architecture=x86_64, vcs-type=git, com.redhat.component=rhceph-container, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, GIT_REPO=https://github.com/ceph/ceph-container.git, name=rhceph, maintainer=Guillaume Abrioux ) Dec 15 04:51:39 localhost podman[297639]: 2025-12-15 
09:51:39.769965407 +0000 UTC m=+0.043394891 image pull registry.redhat.io/rhceph/rhceph-7-rhel9:latest Dec 15 04:51:39 localhost thirsty_allen[297654]: 167 167 Dec 15 04:51:39 localhost podman[297639]: 2025-12-15 09:51:39.877512336 +0000 UTC m=+0.150941790 container start caf59bac52d39bc7b26fa948c1452a64ba8db34ecede72b3b13528e5ebb5e50a (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=thirsty_allen, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, RELEASE=main, io.openshift.expose-services=, ceph=True, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.buildah.version=1.41.4, name=rhceph, version=7, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, vendor=Red Hat, Inc., org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, com.redhat.component=rhceph-container, io.openshift.tags=rhceph ceph, vcs-type=git, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, release=1763362218, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, CEPH_POINT_RELEASE=, io.k8s.description=Red Hat Ceph Storage 7, GIT_REPO=https://github.com/ceph/ceph-container.git, url=https://catalog.redhat.com/en/search?searchType=containers, distribution-scope=public, build-date=2025-11-26T19:44:28Z, architecture=x86_64, GIT_BRANCH=main, maintainer=Guillaume Abrioux , description=Red Hat Ceph Storage 7, GIT_CLEAN=True) Dec 15 04:51:39 localhost podman[297639]: 2025-12-15 09:51:39.877827584 +0000 UTC m=+0.151257078 container attach caf59bac52d39bc7b26fa948c1452a64ba8db34ecede72b3b13528e5ebb5e50a (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=thirsty_allen, vcs-type=git, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, description=Red Hat Ceph Storage 7, architecture=x86_64, GIT_BRANCH=main, vendor=Red Hat, Inc., release=1763362218, io.buildah.version=1.41.4, io.k8s.description=Red Hat Ceph Storage 
7, distribution-scope=public, maintainer=Guillaume Abrioux , GIT_CLEAN=True, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.tags=rhceph ceph, RELEASE=main, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., CEPH_POINT_RELEASE=, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhceph, version=7, com.redhat.component=rhceph-container, GIT_REPO=https://github.com/ceph/ceph-container.git, io.openshift.expose-services=, ceph=True, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, build-date=2025-11-26T19:44:28Z) Dec 15 04:51:39 localhost systemd[1]: libpod-caf59bac52d39bc7b26fa948c1452a64ba8db34ecede72b3b13528e5ebb5e50a.scope: Deactivated successfully. Dec 15 04:51:39 localhost podman[297639]: 2025-12-15 09:51:39.880021796 +0000 UTC m=+0.153451280 container died caf59bac52d39bc7b26fa948c1452a64ba8db34ecede72b3b13528e5ebb5e50a (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=thirsty_allen, GIT_CLEAN=True, release=1763362218, vcs-type=git, io.k8s.description=Red Hat Ceph Storage 7, RELEASE=main, GIT_BRANCH=main, version=7, maintainer=Guillaume Abrioux , GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, url=https://catalog.redhat.com/en/search?searchType=containers, io.buildah.version=1.41.4, description=Red Hat Ceph Storage 7, com.redhat.component=rhceph-container, vendor=Red Hat, Inc., architecture=x86_64, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-11-26T19:44:28Z, distribution-scope=public, 
io.openshift.tags=rhceph ceph, io.openshift.expose-services=, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, CEPH_POINT_RELEASE=, GIT_REPO=https://github.com/ceph/ceph-container.git, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., ceph=True, name=rhceph) Dec 15 04:51:39 localhost systemd[1]: tmp-crun.toPtpY.mount: Deactivated successfully. Dec 15 04:51:39 localhost podman[297659]: 2025-12-15 09:51:39.975641001 +0000 UTC m=+0.083512989 container remove caf59bac52d39bc7b26fa948c1452a64ba8db34ecede72b3b13528e5ebb5e50a (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=thirsty_allen, io.buildah.version=1.41.4, io.openshift.expose-services=, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1763362218, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.k8s.description=Red Hat Ceph Storage 7, GIT_BRANCH=main, io.openshift.tags=rhceph ceph, CEPH_POINT_RELEASE=, vcs-type=git, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_CLEAN=True, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, distribution-scope=public, build-date=2025-11-26T19:44:28Z, description=Red Hat Ceph Storage 7, com.redhat.component=rhceph-container, maintainer=Guillaume Abrioux , architecture=x86_64, version=7, RELEASE=main, ceph=True, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., name=rhceph, GIT_REPO=https://github.com/ceph/ceph-container.git) Dec 15 04:51:39 localhost systemd[1]: libpod-conmon-caf59bac52d39bc7b26fa948c1452a64ba8db34ecede72b3b13528e5ebb5e50a.scope: Deactivated successfully. 
Dec 15 04:51:40 localhost nova_compute[286344]: 2025-12-15 09:51:40.157 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Dec 15 04:51:40 localhost nova_compute[286344]: 2025-12-15 09:51:40.161 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Dec 15 04:51:40 localhost nova_compute[286344]: 2025-12-15 09:51:40.161 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5004 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m Dec 15 04:51:40 localhost nova_compute[286344]: 2025-12-15 09:51:40.162 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Dec 15 04:51:40 localhost nova_compute[286344]: 2025-12-15 09:51:40.202 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 04:51:40 localhost nova_compute[286344]: 2025-12-15 09:51:40.203 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Dec 15 04:51:40 localhost ceph-mon[293875]: Reconfiguring mds.mds.np0005559462.mhigvc (monmap changed)... 
Dec 15 04:51:40 localhost ceph-mon[293875]: Reconfiguring daemon mds.mds.np0005559462.mhigvc on np0005559462.localdomain Dec 15 04:51:40 localhost ceph-mon[293875]: from='mgr.24103 ' entity='mgr.np0005559461.egwgzn' Dec 15 04:51:40 localhost ceph-mon[293875]: from='mgr.24103 ' entity='mgr.np0005559461.egwgzn' Dec 15 04:51:40 localhost ceph-mon[293875]: from='mgr.24103 ' entity='mgr.np0005559461.egwgzn' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005559462.fudvyx", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch Dec 15 04:51:40 localhost ceph-mon[293875]: from='mgr.24103 172.18.0.105:0/2912776587' entity='mgr.np0005559461.egwgzn' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005559462.fudvyx", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch Dec 15 04:51:40 localhost podman[297729]: Dec 15 04:51:40 localhost podman[297729]: 2025-12-15 09:51:40.698958686 +0000 UTC m=+0.075914637 container create 8faeb1055e7a016e59e52a2e54cc3fb01c1a824922159950f5f6f657e8d494de (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=amazing_morse, CEPH_POINT_RELEASE=, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., RELEASE=main, distribution-scope=public, name=rhceph, GIT_CLEAN=True, ceph=True, url=https://catalog.redhat.com/en/search?searchType=containers, io.buildah.version=1.41.4, release=1763362218, architecture=x86_64, io.openshift.tags=rhceph ceph, vendor=Red Hat, Inc., GIT_REPO=https://github.com/ceph/ceph-container.git, version=7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_BRANCH=main, maintainer=Guillaume Abrioux , vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, io.k8s.description=Red Hat Ceph Storage 7, com.redhat.component=rhceph-container, description=Red Hat Ceph Storage 7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vcs-type=git, io.openshift.expose-services=, build-date=2025-11-26T19:44:28Z, 
cpe=cpe:/a:redhat:enterprise_linux:9::appstream, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0) Dec 15 04:51:40 localhost systemd[1]: Started libpod-conmon-8faeb1055e7a016e59e52a2e54cc3fb01c1a824922159950f5f6f657e8d494de.scope. Dec 15 04:51:40 localhost ceph-mgr[292421]: ms_deliver_dispatch: unhandled message 0x55975582b080 mon_map magic: 0 from mon.3 v2:172.18.0.107:3300/0 Dec 15 04:51:40 localhost ceph-mon[293875]: mon.np0005559462@4(peon) e8 removed from monmap, suicide. Dec 15 04:51:40 localhost systemd[1]: Started libcrun container. Dec 15 04:51:40 localhost podman[297729]: 2025-12-15 09:51:40.671389548 +0000 UTC m=+0.048345549 image pull registry.redhat.io/rhceph/rhceph-7-rhel9:latest Dec 15 04:51:40 localhost podman[297729]: 2025-12-15 09:51:40.773976738 +0000 UTC m=+0.150932689 container init 8faeb1055e7a016e59e52a2e54cc3fb01c1a824922159950f5f6f657e8d494de (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=amazing_morse, architecture=x86_64, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, RELEASE=main, GIT_BRANCH=main, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, version=7, com.redhat.component=rhceph-container, ceph=True, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., cpe=cpe:/a:redhat:enterprise_linux:9::appstream, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, vcs-type=git, io.openshift.expose-services=, CEPH_POINT_RELEASE=, GIT_REPO=https://github.com/ceph/ceph-container.git, build-date=2025-11-26T19:44:28Z, release=1763362218, name=rhceph, io.openshift.tags=rhceph ceph, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vendor=Red Hat, Inc., io.k8s.description=Red Hat Ceph Storage 7, 
description=Red Hat Ceph Storage 7, url=https://catalog.redhat.com/en/search?searchType=containers, io.buildah.version=1.41.4, GIT_CLEAN=True, maintainer=Guillaume Abrioux ) Dec 15 04:51:40 localhost podman[297729]: 2025-12-15 09:51:40.784349287 +0000 UTC m=+0.161305238 container start 8faeb1055e7a016e59e52a2e54cc3fb01c1a824922159950f5f6f657e8d494de (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=amazing_morse, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, ceph=True, GIT_REPO=https://github.com/ceph/ceph-container.git, GIT_CLEAN=True, CEPH_POINT_RELEASE=, distribution-scope=public, description=Red Hat Ceph Storage 7, io.buildah.version=1.41.4, version=7, vcs-type=git, release=1763362218, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.openshift.expose-services=, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, io.k8s.description=Red Hat Ceph Storage 7, build-date=2025-11-26T19:44:28Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, RELEASE=main, architecture=x86_64, io.openshift.tags=rhceph ceph, name=rhceph, maintainer=Guillaume Abrioux , summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., com.redhat.component=rhceph-container, GIT_BRANCH=main) Dec 15 04:51:40 localhost podman[297729]: 2025-12-15 09:51:40.784625055 +0000 UTC m=+0.161581036 container attach 8faeb1055e7a016e59e52a2e54cc3fb01c1a824922159950f5f6f657e8d494de (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=amazing_morse, version=7, io.k8s.description=Red Hat Ceph Storage 7, vcs-type=git, maintainer=Guillaume Abrioux , url=https://catalog.redhat.com/en/search?searchType=containers, distribution-scope=public, 
GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, release=1763362218, RELEASE=main, io.buildah.version=1.41.4, GIT_BRANCH=main, name=rhceph, io.openshift.expose-services=, GIT_CLEAN=True, ceph=True, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, architecture=x86_64, vendor=Red Hat, Inc., GIT_REPO=https://github.com/ceph/ceph-container.git, build-date=2025-11-26T19:44:28Z, CEPH_POINT_RELEASE=, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat Ceph Storage 7, com.redhat.component=rhceph-container, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.openshift.tags=rhceph ceph) Dec 15 04:51:40 localhost amazing_morse[297748]: 167 167 Dec 15 04:51:40 localhost systemd[1]: libpod-8faeb1055e7a016e59e52a2e54cc3fb01c1a824922159950f5f6f657e8d494de.scope: Deactivated successfully. 
Dec 15 04:51:40 localhost podman[297729]: 2025-12-15 09:51:40.788574635 +0000 UTC m=+0.165530616 container died 8faeb1055e7a016e59e52a2e54cc3fb01c1a824922159950f5f6f657e8d494de (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=amazing_morse, architecture=x86_64, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, io.k8s.description=Red Hat Ceph Storage 7, ceph=True, release=1763362218, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, com.redhat.component=rhceph-container, vcs-type=git, build-date=2025-11-26T19:44:28Z, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, url=https://catalog.redhat.com/en/search?searchType=containers, vendor=Red Hat, Inc., io.openshift.tags=rhceph ceph, name=rhceph, RELEASE=main, io.openshift.expose-services=, GIT_BRANCH=main, maintainer=Guillaume Abrioux , distribution-scope=public, description=Red Hat Ceph Storage 7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, version=7, CEPH_POINT_RELEASE=, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_REPO=https://github.com/ceph/ceph-container.git, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.buildah.version=1.41.4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, GIT_CLEAN=True) Dec 15 04:51:40 localhost podman[297762]: 2025-12-15 09:51:40.831706287 +0000 UTC m=+0.063474940 container died 69c9f90fbffad2f38f53a53e2be5c38fb80cf84a5c01e1ea6e8b126de4a3eecf (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-bce17446-41b5-5408-a23e-0b011906b44a-mon-np0005559462, GIT_BRANCH=main, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, vcs-type=git, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, url=https://catalog.redhat.com/en/search?searchType=containers, vendor=Red Hat, Inc., CEPH_POINT_RELEASE=, io.buildah.version=1.41.4, 
io.openshift.expose-services=, name=rhceph, release=1763362218, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, RELEASE=main, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., com.redhat.component=rhceph-container, ceph=True, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, distribution-scope=public, architecture=x86_64, io.openshift.tags=rhceph ceph, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, GIT_REPO=https://github.com/ceph/ceph-container.git, build-date=2025-11-26T19:44:28Z, io.k8s.description=Red Hat Ceph Storage 7, GIT_CLEAN=True, description=Red Hat Ceph Storage 7, version=7, maintainer=Guillaume Abrioux ) Dec 15 04:51:40 localhost systemd[1]: var-lib-containers-storage-overlay-5436ce0658c0c7a85047c2b3437e2c14f640eecbcf5b02933e36b7789dce8a41-merged.mount: Deactivated successfully. Dec 15 04:51:40 localhost systemd[1]: var-lib-containers-storage-overlay-78e8bfed3971c8ae628e85288100cabc11c29a2235c1f0c43453dda465da0d92-merged.mount: Deactivated successfully. Dec 15 04:51:40 localhost systemd[1]: var-lib-containers-storage-overlay-05611fbd93d7e8ef9543ca1a23958810fe87bdc36fd5a45e03753ac006e5ff96-merged.mount: Deactivated successfully. 
Dec 15 04:51:40 localhost podman[297775]: 2025-12-15 09:51:40.900036592 +0000 UTC m=+0.099664030 container remove 8faeb1055e7a016e59e52a2e54cc3fb01c1a824922159950f5f6f657e8d494de (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=amazing_morse, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=rhceph-container, release=1763362218, description=Red Hat Ceph Storage 7, io.openshift.tags=rhceph ceph, vcs-type=git, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_BRANCH=main, ceph=True, distribution-scope=public, maintainer=Guillaume Abrioux , architecture=x86_64, name=rhceph, url=https://catalog.redhat.com/en/search?searchType=containers, GIT_REPO=https://github.com/ceph/ceph-container.git, vendor=Red Hat, Inc., RELEASE=main, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, io.buildah.version=1.41.4, CEPH_POINT_RELEASE=, build-date=2025-11-26T19:44:28Z, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, io.openshift.expose-services=, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, GIT_CLEAN=True, io.k8s.description=Red Hat Ceph Storage 7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, version=7) Dec 15 04:51:40 localhost systemd[1]: libpod-conmon-8faeb1055e7a016e59e52a2e54cc3fb01c1a824922159950f5f6f657e8d494de.scope: Deactivated successfully. 
Dec 15 04:51:40 localhost podman[297762]: 2025-12-15 09:51:40.930096671 +0000 UTC m=+0.161865294 container remove 69c9f90fbffad2f38f53a53e2be5c38fb80cf84a5c01e1ea6e8b126de4a3eecf (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-bce17446-41b5-5408-a23e-0b011906b44a-mon-np0005559462, io.openshift.tags=rhceph ceph, description=Red Hat Ceph Storage 7, release=1763362218, distribution-scope=public, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.k8s.description=Red Hat Ceph Storage 7, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, GIT_CLEAN=True, vendor=Red Hat, Inc., name=rhceph, url=https://catalog.redhat.com/en/search?searchType=containers, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, version=7, build-date=2025-11-26T19:44:28Z, ceph=True, maintainer=Guillaume Abrioux , RELEASE=main, io.openshift.expose-services=, vcs-type=git, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, com.redhat.component=rhceph-container, CEPH_POINT_RELEASE=, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, GIT_BRANCH=main, GIT_REPO=https://github.com/ceph/ceph-container.git, io.buildah.version=1.41.4) Dec 15 04:51:41 localhost systemd[1]: ceph-bce17446-41b5-5408-a23e-0b011906b44a@mon.np0005559462.service: Deactivated successfully. Dec 15 04:51:41 localhost systemd[1]: Stopped Ceph mon.np0005559462 for bce17446-41b5-5408-a23e-0b011906b44a. Dec 15 04:51:41 localhost systemd[1]: ceph-bce17446-41b5-5408-a23e-0b011906b44a@mon.np0005559462.service: Consumed 3.834s CPU time. Dec 15 04:51:41 localhost systemd[1]: Reloading. Dec 15 04:51:42 localhost systemd-rc-local-generator[297935]: /etc/rc.d/rc.local is not marked executable, skipping. 
Dec 15 04:51:42 localhost systemd-sysv-generator[297939]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. Dec 15 04:51:42 localhost systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload Dec 15 04:51:42 localhost systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload Dec 15 04:51:42 localhost systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload Dec 15 04:51:42 localhost systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload Dec 15 04:51:42 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. 
Dec 15 04:51:42 localhost systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload Dec 15 04:51:42 localhost systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload Dec 15 04:51:42 localhost systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload Dec 15 04:51:42 localhost systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload Dec 15 04:51:45 localhost nova_compute[286344]: 2025-12-15 09:51:45.203 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Dec 15 04:51:45 localhost nova_compute[286344]: 2025-12-15 09:51:45.206 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Dec 15 04:51:45 localhost nova_compute[286344]: 2025-12-15 09:51:45.207 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m Dec 15 04:51:45 localhost nova_compute[286344]: 2025-12-15 09:51:45.207 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Dec 15 04:51:45 localhost nova_compute[286344]: 2025-12-15 09:51:45.242 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 04:51:45 localhost nova_compute[286344]: 2025-12-15 09:51:45.242 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.121 12 DEBUG 
ceilometer.compute.discovery [-] instance data: {'id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359', 'name': 'test', 'flavor': {'id': '2da0e147-aaa7-4bb9-a176-5fe1b15a32a0', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'image': {'id': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-00000002', 'OS-EXT-SRV-ATTR:host': 'np0005559462.localdomain', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': 'c785bf23f53946bc99867d8832a50266', 'user_id': '1ba5fce347b64bfebf995f187193f205', 'hostId': 'bf4d507aa00b3fcf643a7e0b883420632ac7fc96e8c7a756982e121d', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228 Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.122 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no new resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193 Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.122 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no new resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193 Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.122 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.allocation in the context of pollsters Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.135 12 DEBUG ceilometer.compute.pollsters [-] 39ff1bd9-6f6b-44c8-bbec-a1fd9d196359/disk.device.allocation volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.136 12 DEBUG ceilometer.compute.pollsters [-] 
39ff1bd9-6f6b-44c8-bbec-a1fd9d196359/disk.device.allocation volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.137 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'bd19b4f6-4874-4435-9280-c4ea490f5500', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '1ba5fce347b64bfebf995f187193f205', 'user_name': None, 'project_id': 'c785bf23f53946bc99867d8832a50266', 'project_name': None, 'resource_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359-vda', 'timestamp': '2025-12-15T09:51:48.123094', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359', 'instance_type': 'm1.small', 'host': 'bf4d507aa00b3fcf643a7e0b883420632ac7fc96e8c7a756982e121d', 'instance_host': 'np0005559462.localdomain', 'flavor': {'id': '2da0e147-aaa7-4bb9-a176-5fe1b15a32a0', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e'}, 'image_ref': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'ac3d8fec-d99b-11f0-817e-fa163ebaca0f', 'monotonic_time': 11374.315784188, 'message_signature': '18a3e881582d3dce2bdd4a4a6da7a6f4287850bec67652d57d5d5386fd9e15d8'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': 
'1ba5fce347b64bfebf995f187193f205', 'user_name': None, 'project_id': 'c785bf23f53946bc99867d8832a50266', 'project_name': None, 'resource_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359-vdb', 'timestamp': '2025-12-15T09:51:48.123094', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359', 'instance_type': 'm1.small', 'host': 'bf4d507aa00b3fcf643a7e0b883420632ac7fc96e8c7a756982e121d', 'instance_host': 'np0005559462.localdomain', 'flavor': {'id': '2da0e147-aaa7-4bb9-a176-5fe1b15a32a0', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e'}, 'image_ref': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'ac3da2c0-d99b-11f0-817e-fa163ebaca0f', 'monotonic_time': 11374.315784188, 'message_signature': '06d5e6ba9f092b537b0b92ed7ef9be1edd5e4673cf8fe3aa22ba87f39c52e4e4'}]}, 'timestamp': '2025-12-15 09:51:48.136620', '_unique_id': 'ec21adcc39b043abb6acc34effb224d7'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.137 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.137 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.137 12 ERROR oslo_messaging.notify.messaging yield Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.137 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.137 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.137 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.137 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.137 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.137 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.137 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.137 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.137 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.137 12 ERROR oslo_messaging.notify.messaging conn.connect() Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.137 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Dec 15 04:51:48 localhost 
ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.137 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.137 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.137 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.137 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.137 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.137 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.137 12 ERROR oslo_messaging.notify.messaging Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.137 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.137 12 ERROR oslo_messaging.notify.messaging Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.137 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.137 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.137 12 ERROR oslo_messaging.notify.messaging 
self.transport._send_notification(target, ctxt, message, Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.137 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.137 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.137 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.137 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.137 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.137 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.137 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.137 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.137 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 
09:51:48.137 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.137 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.137 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.137 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.137 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.137 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.137 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.137 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.137 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.137 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.137 12 ERROR 
oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.137 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.137 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.137 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.137 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.137 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.137 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.137 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.137 12 ERROR oslo_messaging.notify.messaging Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.139 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.requests in the context of pollsters Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.163 12 DEBUG ceilometer.compute.pollsters [-] 39ff1bd9-6f6b-44c8-bbec-a1fd9d196359/disk.device.write.requests volume: 47 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 15 
04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.163 12 DEBUG ceilometer.compute.pollsters [-] 39ff1bd9-6f6b-44c8-bbec-a1fd9d196359/disk.device.write.requests volume: 1 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.164 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '0e148347-284d-453d-821f-9c79949671ad', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 47, 'user_id': '1ba5fce347b64bfebf995f187193f205', 'user_name': None, 'project_id': 'c785bf23f53946bc99867d8832a50266', 'project_name': None, 'resource_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359-vda', 'timestamp': '2025-12-15T09:51:48.139303', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359', 'instance_type': 'm1.small', 'host': 'bf4d507aa00b3fcf643a7e0b883420632ac7fc96e8c7a756982e121d', 'instance_host': 'np0005559462.localdomain', 'flavor': {'id': '2da0e147-aaa7-4bb9-a176-5fe1b15a32a0', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e'}, 'image_ref': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'ac41bda6-d99b-11f0-817e-fa163ebaca0f', 'monotonic_time': 11374.3319679, 'message_signature': '50be33e147973b44a23e6aa3039110724071137034bb899936577d7460af4dd0'}, {'source': 'openstack', 'counter_name': 
'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1, 'user_id': '1ba5fce347b64bfebf995f187193f205', 'user_name': None, 'project_id': 'c785bf23f53946bc99867d8832a50266', 'project_name': None, 'resource_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359-vdb', 'timestamp': '2025-12-15T09:51:48.139303', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359', 'instance_type': 'm1.small', 'host': 'bf4d507aa00b3fcf643a7e0b883420632ac7fc96e8c7a756982e121d', 'instance_host': 'np0005559462.localdomain', 'flavor': {'id': '2da0e147-aaa7-4bb9-a176-5fe1b15a32a0', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e'}, 'image_ref': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'ac41c65c-d99b-11f0-817e-fa163ebaca0f', 'monotonic_time': 11374.3319679, 'message_signature': '74117ce3a20b1b48a6a79c96cd76fa43b00d4ff67f7d106e7a4b269003ade22c'}]}, 'timestamp': '2025-12-15 09:51:48.163643', '_unique_id': 'd6f723d2c7a44d41acfef93495806d95'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.164 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.164 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.164 12 ERROR oslo_messaging.notify.messaging yield Dec 15 04:51:48 localhost 
ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.164 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.164 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.164 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.164 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.164 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.164 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.164 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.164 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.164 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.164 12 ERROR oslo_messaging.notify.messaging conn.connect() Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.164 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.164 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.164 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.164 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.164 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.164 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.164 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.164 12 ERROR oslo_messaging.notify.messaging Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.164 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.164 12 ERROR oslo_messaging.notify.messaging Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.164 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.164 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Dec 15 04:51:48 localhost 
ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.164 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.164 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.164 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.164 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.164 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.164 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.164 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.164 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.164 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.164 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.164 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.164 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.164 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.164 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.164 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.164 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.164 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.164 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.164 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.164 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.164 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.164 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.164 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.164 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.164 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.164 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.164 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.164 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.164 12 ERROR oslo_messaging.notify.messaging Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.164 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.error in the context of pollsters Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.168 12 DEBUG 
ceilometer.compute.pollsters [-] 39ff1bd9-6f6b-44c8-bbec-a1fd9d196359/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.168 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '3b9541f9-04de-4710-bd71-014eb68ba72b', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '1ba5fce347b64bfebf995f187193f205', 'user_name': None, 'project_id': 'c785bf23f53946bc99867d8832a50266', 'project_name': None, 'resource_id': 'instance-00000002-39ff1bd9-6f6b-44c8-bbec-a1fd9d196359-tap03ef8889-32', 'timestamp': '2025-12-15T09:51:48.164944', 'resource_metadata': {'display_name': 'test', 'name': 'tap03ef8889-32', 'instance_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359', 'instance_type': 'm1.small', 'host': 'bf4d507aa00b3fcf643a7e0b883420632ac7fc96e8c7a756982e121d', 'instance_host': 'np0005559462.localdomain', 'flavor': {'id': '2da0e147-aaa7-4bb9-a176-5fe1b15a32a0', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e'}, 'image_ref': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:74:df:7c', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap03ef8889-32'}, 'message_id': 'ac42831c-d99b-11f0-817e-fa163ebaca0f', 'monotonic_time': 11374.357607975, 'message_signature': 
'843c4012beeee99960ed7e46fc3c80a6a45d10e7025d9440923ba497213945c1'}]}, 'timestamp': '2025-12-15 09:51:48.168485', '_unique_id': '3da02ad4597b4f808fce75db89a488fb'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.168 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.168 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.168 12 ERROR oslo_messaging.notify.messaging yield Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.168 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.168 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.168 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.168 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.168 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.168 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.168 12 ERROR 
oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.168 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.168 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.168 12 ERROR oslo_messaging.notify.messaging conn.connect() Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.168 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.168 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.168 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.168 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.168 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.168 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.168 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Dec 15 04:51:48 localhost 
ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.168 12 ERROR oslo_messaging.notify.messaging Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.168 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.168 12 ERROR oslo_messaging.notify.messaging Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.168 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.168 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.168 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.168 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.168 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.168 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.168 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.168 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", 
line 653, in _send Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.168 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.168 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.168 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.168 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.168 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.168 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.168 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.168 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.168 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.168 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.168 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.168 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.168 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.168 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.168 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.168 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.168 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.168 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.168 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.168 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Dec 15 04:51:48 localhost 
ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.168 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.168 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.168 12 ERROR oslo_messaging.notify.messaging Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.169 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes in the context of pollsters Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.169 12 DEBUG ceilometer.compute.pollsters [-] 39ff1bd9-6f6b-44c8-bbec-a1fd9d196359/network.incoming.bytes volume: 6809 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.170 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': '94ede338-2a3e-4ba0-8cc2-18daa151e296', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 6809, 'user_id': '1ba5fce347b64bfebf995f187193f205', 'user_name': None, 'project_id': 'c785bf23f53946bc99867d8832a50266', 'project_name': None, 'resource_id': 'instance-00000002-39ff1bd9-6f6b-44c8-bbec-a1fd9d196359-tap03ef8889-32', 'timestamp': '2025-12-15T09:51:48.169632', 'resource_metadata': {'display_name': 'test', 'name': 'tap03ef8889-32', 'instance_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359', 'instance_type': 'm1.small', 'host': 'bf4d507aa00b3fcf643a7e0b883420632ac7fc96e8c7a756982e121d', 'instance_host': 'np0005559462.localdomain', 'flavor': {'id': '2da0e147-aaa7-4bb9-a176-5fe1b15a32a0', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e'}, 'image_ref': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:74:df:7c', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap03ef8889-32'}, 'message_id': 'ac42b832-d99b-11f0-817e-fa163ebaca0f', 'monotonic_time': 11374.357607975, 'message_signature': 'ab12b7cc25bfa8bcd5d0dd199bbb25280a183f4cbcf2d548553aebef61cd3e19'}]}, 'timestamp': '2025-12-15 09:51:48.169839', '_unique_id': '1a8ef66f832d414e8d940c90042685b8'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.170 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 15 04:51:48 localhost 
ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.170 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.170 12 ERROR oslo_messaging.notify.messaging yield Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.170 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.170 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.170 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.170 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.170 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.170 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.170 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.170 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.170 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.170 12 ERROR oslo_messaging.notify.messaging conn.connect() Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.170 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.170 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.170 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.170 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.170 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.170 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.170 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.170 12 ERROR oslo_messaging.notify.messaging Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.170 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.170 12 ERROR oslo_messaging.notify.messaging Dec 15 04:51:48 localhost 
ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.170 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.170 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.170 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.170 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.170 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.170 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.170 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.170 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.170 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.170 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection 
Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.170 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.170 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.170 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.170 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.170 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.170 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.170 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.170 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.170 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.170 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Dec 15 04:51:48 
localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.170 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.170 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.170 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.170 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.170 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.170 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.170 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.170 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.170 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.170 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.170 12 ERROR oslo_messaging.notify.messaging Dec 15 04:51:48 localhost 
ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.170 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no new resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193 Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.170 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.error in the context of pollsters Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.170 12 DEBUG ceilometer.compute.pollsters [-] 39ff1bd9-6f6b-44c8-bbec-a1fd9d196359/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.171 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '1ed7623e-9571-498a-8399-b3550cdbca6d', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '1ba5fce347b64bfebf995f187193f205', 'user_name': None, 'project_id': 'c785bf23f53946bc99867d8832a50266', 'project_name': None, 'resource_id': 'instance-00000002-39ff1bd9-6f6b-44c8-bbec-a1fd9d196359-tap03ef8889-32', 'timestamp': '2025-12-15T09:51:48.170838', 'resource_metadata': {'display_name': 'test', 'name': 'tap03ef8889-32', 'instance_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359', 'instance_type': 'm1.small', 'host': 'bf4d507aa00b3fcf643a7e0b883420632ac7fc96e8c7a756982e121d', 'instance_host': 'np0005559462.localdomain', 'flavor': {'id': '2da0e147-aaa7-4bb9-a176-5fe1b15a32a0', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': 
{'id': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e'}, 'image_ref': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:74:df:7c', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap03ef8889-32'}, 'message_id': 'ac42e730-d99b-11f0-817e-fa163ebaca0f', 'monotonic_time': 11374.357607975, 'message_signature': '47581f7eb5a0278d4f7853f56301e34dd90552d5b370e04a893cbc49b1c122b9'}]}, 'timestamp': '2025-12-15 09:51:48.171065', '_unique_id': 'f49f518ac00f44a6a93affaefd90e380'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.171 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.171 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.171 12 ERROR oslo_messaging.notify.messaging yield Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.171 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.171 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.171 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.171 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Dec 15 04:51:48 localhost 
ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.171 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.171 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.171 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.171 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.171 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.171 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.171 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.171 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.171 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.171 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.171 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.171 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.171 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.171 12 ERROR oslo_messaging.notify.messaging
Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.171 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.171 12 ERROR oslo_messaging.notify.messaging
Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.171 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.171 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.171 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.171 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.171 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.171 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.171 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.171 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.171 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.171 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.171 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.171 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.171 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.171 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.171 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.171 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.171 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.171 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.171 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.171 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.171 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.171 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.171 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.171 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.171 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.171 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.171 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.171 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.171 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.171 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.171 12 ERROR oslo_messaging.notify.messaging
Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.172 12 INFO ceilometer.polling.manager [-] Polling pollster memory.usage in the context of pollsters
Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.183 12 DEBUG ceilometer.compute.pollsters [-] 39ff1bd9-6f6b-44c8-bbec-a1fd9d196359/memory.usage volume: 51.73828125 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.184 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications.
Payload={'message_id': '5a6fc36e-ab44-469b-a1f8-d44c9cc353f3', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'memory.usage', 'counter_type': 'gauge', 'counter_unit': 'MB', 'counter_volume': 51.73828125, 'user_id': '1ba5fce347b64bfebf995f187193f205', 'user_name': None, 'project_id': 'c785bf23f53946bc99867d8832a50266', 'project_name': None, 'resource_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359', 'timestamp': '2025-12-15T09:51:48.172092', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359', 'instance_type': 'm1.small', 'host': 'bf4d507aa00b3fcf643a7e0b883420632ac7fc96e8c7a756982e121d', 'instance_host': 'np0005559462.localdomain', 'flavor': {'id': '2da0e147-aaa7-4bb9-a176-5fe1b15a32a0', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e'}, 'image_ref': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0}, 'message_id': 'ac44e3f0-d99b-11f0-817e-fa163ebaca0f', 'monotonic_time': 11374.376411349, 'message_signature': 'bd8b569e930f87e268eb24838ba6f29bb7c362a0ae616c5ec405f59d4ee96b1f'}]}, 'timestamp': '2025-12-15 09:51:48.184085', '_unique_id': '98bcd22c634a4b9c93ec57829cc90b61'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.185 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no new resources found this cycle
poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.185 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.drop in the context of pollsters
Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.185 12 DEBUG ceilometer.compute.pollsters [-] 39ff1bd9-6f6b-44c8-bbec-a1fd9d196359/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.185 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'cefce05b-189b-4011-bbdb-b09fa5952cd9', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '1ba5fce347b64bfebf995f187193f205', 'user_name': None, 'project_id': 'c785bf23f53946bc99867d8832a50266', 'project_name': None, 'resource_id': 'instance-00000002-39ff1bd9-6f6b-44c8-bbec-a1fd9d196359-tap03ef8889-32', 'timestamp': '2025-12-15T09:51:48.185138', 'resource_metadata': {'display_name': 'test', 'name': 'tap03ef8889-32', 'instance_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359', 'instance_type': 'm1.small', 'host': 'bf4d507aa00b3fcf643a7e0b883420632ac7fc96e8c7a756982e121d', 'instance_host': 'np0005559462.localdomain', 'flavor': {'id': '2da0e147-aaa7-4bb9-a176-5fe1b15a32a0', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e'}, 'image_ref': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:74:df:7c', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap03ef8889-32'}, 'message_id': 'ac451618-d99b-11f0-817e-fa163ebaca0f', 'monotonic_time': 11374.357607975, 'message_signature': '96b89451d0c0908c1d7f452f82bd075f7922e407e86d97930fcdafd073f609ff'}]}, 'timestamp': '2025-12-15 09:51:48.185483', '_unique_id': '037fe98df9a84673b2a14434ebfad1db'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.186 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.delta in the context of pollsters
Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.186 12 DEBUG ceilometer.compute.pollsters [-] 39ff1bd9-6f6b-44c8-bbec-a1fd9d196359/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.187 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications.
Payload={'message_id': '194b717f-abca-4e97-89cd-f252b6ce2361', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '1ba5fce347b64bfebf995f187193f205', 'user_name': None, 'project_id': 'c785bf23f53946bc99867d8832a50266', 'project_name': None, 'resource_id': 'instance-00000002-39ff1bd9-6f6b-44c8-bbec-a1fd9d196359-tap03ef8889-32', 'timestamp': '2025-12-15T09:51:48.186418', 'resource_metadata': {'display_name': 'test', 'name': 'tap03ef8889-32', 'instance_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359', 'instance_type': 'm1.small', 'host': 'bf4d507aa00b3fcf643a7e0b883420632ac7fc96e8c7a756982e121d', 'instance_host': 'np0005559462.localdomain', 'flavor': {'id': '2da0e147-aaa7-4bb9-a176-5fe1b15a32a0', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e'}, 'image_ref': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:74:df:7c', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap03ef8889-32'}, 'message_id': 'ac4547f0-d99b-11f0-817e-fa163ebaca0f', 'monotonic_time': 11374.357607975, 'message_signature': 'a83c8c67f235f69be3079ac4b18b710cf01e85673966b4a2ee9426af250f9666'}]}, 'timestamp': '2025-12-15 09:51:48.186625', '_unique_id': '9f6e070824a04750ba715e63eb1970c6'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.187 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.187 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.187 12 ERROR oslo_messaging.notify.messaging     yield
Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.187 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.187 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.187 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.187 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.187 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.187 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.187 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.187 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.187 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.187 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.187 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.187 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.187 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.187 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.187 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.187 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.187 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.187 12 ERROR oslo_messaging.notify.messaging
Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.187 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.187 12 ERROR oslo_messaging.notify.messaging
Dec 15 04:51:48 localhost
ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.187 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.187 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.187 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.187 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.187 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.187 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.187 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.187 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.187 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.187 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection 
Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.187 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.187 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.187 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.187 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.187 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.187 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.187 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.187 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.187 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.187 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Dec 15 04:51:48 
localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.187 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.187 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.187 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.187 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.187 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.187 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.187 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.187 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.187 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.187 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.187 12 ERROR oslo_messaging.notify.messaging Dec 15 04:51:48 localhost 
ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.187 12 INFO ceilometer.polling.manager [-] Polling pollster cpu in the context of pollsters Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.187 12 DEBUG ceilometer.compute.pollsters [-] 39ff1bd9-6f6b-44c8-bbec-a1fd9d196359/cpu volume: 11000000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.188 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'ca33e646-2479-44e2-9966-43fd4e6bf7d3', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 11000000000, 'user_id': '1ba5fce347b64bfebf995f187193f205', 'user_name': None, 'project_id': 'c785bf23f53946bc99867d8832a50266', 'project_name': None, 'resource_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359', 'timestamp': '2025-12-15T09:51:48.187633', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359', 'instance_type': 'm1.small', 'host': 'bf4d507aa00b3fcf643a7e0b883420632ac7fc96e8c7a756982e121d', 'instance_host': 'np0005559462.localdomain', 'flavor': {'id': '2da0e147-aaa7-4bb9-a176-5fe1b15a32a0', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e'}, 'image_ref': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'cpu_number': 1}, 'message_id': 'ac457770-d99b-11f0-817e-fa163ebaca0f', 'monotonic_time': 11374.376411349, 
'message_signature': '5831198ddddb3e585c3be40cb304190205589f4b234460479d7cc77bbb6ddeb8'}]}, 'timestamp': '2025-12-15 09:51:48.187836', '_unique_id': '1efce63430ae41d78f3f088eb1885a16'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.188 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.188 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.188 12 ERROR oslo_messaging.notify.messaging yield Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.188 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.188 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.188 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.188 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.188 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.188 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.188 12 ERROR 
oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.188 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.188 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.188 12 ERROR oslo_messaging.notify.messaging conn.connect() Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.188 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.188 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.188 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.188 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.188 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.188 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.188 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Dec 15 04:51:48 localhost 
ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.188 12 ERROR oslo_messaging.notify.messaging Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.188 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.188 12 ERROR oslo_messaging.notify.messaging Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.188 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.188 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.188 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.188 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.188 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.188 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.188 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.188 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", 
line 653, in _send Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.188 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.188 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.188 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.188 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.188 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.188 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.188 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.188 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.188 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.188 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.188 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.188 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.188 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.188 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.188 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.188 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.188 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.188 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.188 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.188 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Dec 15 04:51:48 localhost 
ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.188 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.188 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.188 12 ERROR oslo_messaging.notify.messaging Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.188 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets in the context of pollsters Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.188 12 DEBUG ceilometer.compute.pollsters [-] 39ff1bd9-6f6b-44c8-bbec-a1fd9d196359/network.outgoing.packets volume: 114 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.189 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': 'bee9371b-a369-44be-bff0-129a26d6ace8', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 114, 'user_id': '1ba5fce347b64bfebf995f187193f205', 'user_name': None, 'project_id': 'c785bf23f53946bc99867d8832a50266', 'project_name': None, 'resource_id': 'instance-00000002-39ff1bd9-6f6b-44c8-bbec-a1fd9d196359-tap03ef8889-32', 'timestamp': '2025-12-15T09:51:48.188759', 'resource_metadata': {'display_name': 'test', 'name': 'tap03ef8889-32', 'instance_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359', 'instance_type': 'm1.small', 'host': 'bf4d507aa00b3fcf643a7e0b883420632ac7fc96e8c7a756982e121d', 'instance_host': 'np0005559462.localdomain', 'flavor': {'id': '2da0e147-aaa7-4bb9-a176-5fe1b15a32a0', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e'}, 'image_ref': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:74:df:7c', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap03ef8889-32'}, 'message_id': 'ac45a358-d99b-11f0-817e-fa163ebaca0f', 'monotonic_time': 11374.357607975, 'message_signature': '3cb3d5c604ffdb4666dcf97e8b001987b0ce699f08243ceafdc09281a30cb572'}]}, 'timestamp': '2025-12-15 09:51:48.188966', '_unique_id': 'f17087eeb83e46e58a137ee759b74adc'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.189 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 15 04:51:48 localhost 
ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.189 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.189 12 ERROR oslo_messaging.notify.messaging yield Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.189 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.189 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.189 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.189 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.189 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.189 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.189 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.189 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.189 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.189 12 ERROR oslo_messaging.notify.messaging conn.connect() Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.189 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.189 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.189 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.189 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.189 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.189 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.189 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.189 12 ERROR oslo_messaging.notify.messaging Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.189 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.189 12 ERROR oslo_messaging.notify.messaging Dec 15 04:51:48 localhost 
ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.189 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.189 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.189 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.189 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.189 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.189 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.189 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.189 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.189 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.189 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection 
Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.189 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.189 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.189 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.189 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.189 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.189 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.189 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.189 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.189 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.189 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Dec 15 04:51:48 
localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.189 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.189 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.189 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.189 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.189 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.189 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.189 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.189 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.189 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.189 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.189 12 ERROR oslo_messaging.notify.messaging Dec 15 04:51:48 localhost 
ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.189 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.latency in the context of pollsters Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.189 12 DEBUG ceilometer.compute.pollsters [-] 39ff1bd9-6f6b-44c8-bbec-a1fd9d196359/disk.device.write.latency volume: 1243487016 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.190 12 DEBUG ceilometer.compute.pollsters [-] 39ff1bd9-6f6b-44c8-bbec-a1fd9d196359/disk.device.write.latency volume: 24779175 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.190 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'f12ce94e-5f26-4579-9b2c-44c55d39d1d9', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 1243487016, 'user_id': '1ba5fce347b64bfebf995f187193f205', 'user_name': None, 'project_id': 'c785bf23f53946bc99867d8832a50266', 'project_name': None, 'resource_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359-vda', 'timestamp': '2025-12-15T09:51:48.189908', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359', 'instance_type': 'm1.small', 'host': 'bf4d507aa00b3fcf643a7e0b883420632ac7fc96e8c7a756982e121d', 'instance_host': 'np0005559462.localdomain', 'flavor': {'id': '2da0e147-aaa7-4bb9-a176-5fe1b15a32a0', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': 
{'id': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e'}, 'image_ref': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'ac45d0bc-d99b-11f0-817e-fa163ebaca0f', 'monotonic_time': 11374.3319679, 'message_signature': 'b928c0addb1766b4c31304dae0e85e761840d5f0f2f5b81f0f6fc217cc4e7e76'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 24779175, 'user_id': '1ba5fce347b64bfebf995f187193f205', 'user_name': None, 'project_id': 'c785bf23f53946bc99867d8832a50266', 'project_name': None, 'resource_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359-vdb', 'timestamp': '2025-12-15T09:51:48.189908', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359', 'instance_type': 'm1.small', 'host': 'bf4d507aa00b3fcf643a7e0b883420632ac7fc96e8c7a756982e121d', 'instance_host': 'np0005559462.localdomain', 'flavor': {'id': '2da0e147-aaa7-4bb9-a176-5fe1b15a32a0', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e'}, 'image_ref': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'ac45d7ce-d99b-11f0-817e-fa163ebaca0f', 'monotonic_time': 11374.3319679, 'message_signature': '464fe7e5b30c0f86609bdf8c84b1ebb25c470b07f8a37d579165739d1f977a19'}]}, 'timestamp': '2025-12-15 09:51:48.190294', '_unique_id': '70b8b2f7b72447ce97c739fe8763366e'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.190 
12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.190 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.190 12 ERROR oslo_messaging.notify.messaging yield Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.190 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.190 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.190 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.190 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.190 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.190 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.190 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.190 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Dec 15 04:51:48 localhost 
ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.190 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.190 12 ERROR oslo_messaging.notify.messaging conn.connect() Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.190 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.190 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.190 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.190 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.190 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.190 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.190 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.190 12 ERROR oslo_messaging.notify.messaging Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.190 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 
2025-12-15 09:51:48.190 12 ERROR oslo_messaging.notify.messaging Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.190 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.190 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.190 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.190 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.190 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.190 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.190 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.190 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.190 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.190 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.190 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.190 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.190 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.190 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.190 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.190 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.190 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.190 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.190 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.190 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.190 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.190 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.190 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.190 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.190 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.190 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.190 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.190 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.190 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.190 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 15 04:51:48 localhost 
ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.190 12 ERROR oslo_messaging.notify.messaging Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.191 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets in the context of pollsters Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.191 12 DEBUG ceilometer.compute.pollsters [-] 39ff1bd9-6f6b-44c8-bbec-a1fd9d196359/network.incoming.packets volume: 60 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.191 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'baca8454-dbeb-4744-984f-2ec3262067d2', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 60, 'user_id': '1ba5fce347b64bfebf995f187193f205', 'user_name': None, 'project_id': 'c785bf23f53946bc99867d8832a50266', 'project_name': None, 'resource_id': 'instance-00000002-39ff1bd9-6f6b-44c8-bbec-a1fd9d196359-tap03ef8889-32', 'timestamp': '2025-12-15T09:51:48.191249', 'resource_metadata': {'display_name': 'test', 'name': 'tap03ef8889-32', 'instance_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359', 'instance_type': 'm1.small', 'host': 'bf4d507aa00b3fcf643a7e0b883420632ac7fc96e8c7a756982e121d', 'instance_host': 'np0005559462.localdomain', 'flavor': {'id': '2da0e147-aaa7-4bb9-a176-5fe1b15a32a0', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e'}, 'image_ref': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 
'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:74:df:7c', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap03ef8889-32'}, 'message_id': 'ac4604ba-d99b-11f0-817e-fa163ebaca0f', 'monotonic_time': 11374.357607975, 'message_signature': '94ceeda07725ff5dde63ba0affd439a905a0c6d10ffa3d18323ea6cb83bdb3b1'}]}, 'timestamp': '2025-12-15 09:51:48.191458', '_unique_id': '7018b19426eb4a77ac6d8cbcf5175705'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.191 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.191 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.191 12 ERROR oslo_messaging.notify.messaging yield Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.191 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.191 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.191 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.191 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.191 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", 
line 877, in _connection_factory Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.191 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.191 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.191 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.191 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.191 12 ERROR oslo_messaging.notify.messaging conn.connect() Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.191 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.191 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.191 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.191 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.191 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.191 12 
ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.191 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.191 12 ERROR oslo_messaging.notify.messaging Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.191 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.191 12 ERROR oslo_messaging.notify.messaging Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.191 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.191 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.191 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.191 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.191 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.191 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.191 12 ERROR 
oslo_messaging.notify.messaging return self._send(target, ctxt, message, Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.191 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.191 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.191 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.191 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.191 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.191 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.191 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.191 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.191 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.191 12 ERROR 
oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.191 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.191 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.191 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.191 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.191 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.191 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.191 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.191 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.191 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.191 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Dec 15 04:51:48 
localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.191 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.191 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.191 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.191 12 ERROR oslo_messaging.notify.messaging Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.192 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.capacity in the context of pollsters Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.192 12 DEBUG ceilometer.compute.pollsters [-] 39ff1bd9-6f6b-44c8-bbec-a1fd9d196359/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.192 12 DEBUG ceilometer.compute.pollsters [-] 39ff1bd9-6f6b-44c8-bbec-a1fd9d196359/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.193 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': '8f5d06c4-c765-49eb-8bec-b005aa4e4577', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '1ba5fce347b64bfebf995f187193f205', 'user_name': None, 'project_id': 'c785bf23f53946bc99867d8832a50266', 'project_name': None, 'resource_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359-vda', 'timestamp': '2025-12-15T09:51:48.192369', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359', 'instance_type': 'm1.small', 'host': 'bf4d507aa00b3fcf643a7e0b883420632ac7fc96e8c7a756982e121d', 'instance_host': 'np0005559462.localdomain', 'flavor': {'id': '2da0e147-aaa7-4bb9-a176-5fe1b15a32a0', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e'}, 'image_ref': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'ac463048-d99b-11f0-817e-fa163ebaca0f', 'monotonic_time': 11374.315784188, 'message_signature': '5a0e6a621320d9c719f23fb76a7b4fc8e361dfc8db7a0d613ea21ecab2237b74'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '1ba5fce347b64bfebf995f187193f205', 'user_name': None, 'project_id': 'c785bf23f53946bc99867d8832a50266', 'project_name': None, 'resource_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359-vdb', 'timestamp': '2025-12-15T09:51:48.192369', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359', 
'instance_type': 'm1.small', 'host': 'bf4d507aa00b3fcf643a7e0b883420632ac7fc96e8c7a756982e121d', 'instance_host': 'np0005559462.localdomain', 'flavor': {'id': '2da0e147-aaa7-4bb9-a176-5fe1b15a32a0', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e'}, 'image_ref': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'ac463728-d99b-11f0-817e-fa163ebaca0f', 'monotonic_time': 11374.315784188, 'message_signature': '3f74df483b05d5e9904de28c64ded72448beb7ddd323215ae9b91ce6dc8f40cc'}]}, 'timestamp': '2025-12-15 09:51:48.192734', '_unique_id': '7e9dc84cce8a4d95bde2e94500e300b6'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.193 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.193 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.193 12 ERROR oslo_messaging.notify.messaging yield Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.193 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.193 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.193 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.193 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.193 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.193 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.193 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.193 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.193 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.193 12 ERROR oslo_messaging.notify.messaging conn.connect() Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.193 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.193 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.193 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Dec 15 04:51:48 localhost 
ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.193 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.193 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.193 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.193 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.193 12 ERROR oslo_messaging.notify.messaging Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.193 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.193 12 ERROR oslo_messaging.notify.messaging Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.193 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.193 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.193 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.193 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 
09:51:48.193 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.193 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.193 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.193 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.193 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.193 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.193 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.193 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.193 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.193 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Dec 15 04:51:48 localhost 
ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.193 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.193 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.193 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.193 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.193 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.193 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.193 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.193 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.193 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.193 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 
09:51:48.193 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.193 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.193 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.193 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.193 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.193 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.193 12 ERROR oslo_messaging.notify.messaging Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.193 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.latency in the context of pollsters Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.193 12 DEBUG ceilometer.compute.pollsters [-] 39ff1bd9-6f6b-44c8-bbec-a1fd9d196359/disk.device.read.latency volume: 1342134926 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.193 12 DEBUG ceilometer.compute.pollsters [-] 39ff1bd9-6f6b-44c8-bbec-a1fd9d196359/disk.device.read.latency volume: 123356132 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 
2025-12-15 09:51:48.194 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '391ec5f0-866b-4b37-98f0-be2881cab4d4', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 1342134926, 'user_id': '1ba5fce347b64bfebf995f187193f205', 'user_name': None, 'project_id': 'c785bf23f53946bc99867d8832a50266', 'project_name': None, 'resource_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359-vda', 'timestamp': '2025-12-15T09:51:48.193658', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359', 'instance_type': 'm1.small', 'host': 'bf4d507aa00b3fcf643a7e0b883420632ac7fc96e8c7a756982e121d', 'instance_host': 'np0005559462.localdomain', 'flavor': {'id': '2da0e147-aaa7-4bb9-a176-5fe1b15a32a0', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e'}, 'image_ref': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'ac466338-d99b-11f0-817e-fa163ebaca0f', 'monotonic_time': 11374.3319679, 'message_signature': 'cb49dff024a8b50c7f5153ab703b77c21587ca19c7c8974cd4459489835771cd'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 123356132, 'user_id': '1ba5fce347b64bfebf995f187193f205', 'user_name': None, 'project_id': 'c785bf23f53946bc99867d8832a50266', 'project_name': None, 'resource_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359-vdb', 'timestamp': '2025-12-15T09:51:48.193658', 
'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359', 'instance_type': 'm1.small', 'host': 'bf4d507aa00b3fcf643a7e0b883420632ac7fc96e8c7a756982e121d', 'instance_host': 'np0005559462.localdomain', 'flavor': {'id': '2da0e147-aaa7-4bb9-a176-5fe1b15a32a0', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e'}, 'image_ref': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'ac466a68-d99b-11f0-817e-fa163ebaca0f', 'monotonic_time': 11374.3319679, 'message_signature': '1b60fc8ad20b962ccf0a86c72168d45a7410d315fe9b94e29f39f230ba897a5e'}]}, 'timestamp': '2025-12-15 09:51:48.194064', '_unique_id': '6e5bfa915e614d66afdc29396cbe7a96'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.194 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.194 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.194 12 ERROR oslo_messaging.notify.messaging yield Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.194 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.194 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 15 04:51:48 localhost 
ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.194 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.194 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.194 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.194 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.194 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.194 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.194 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.194 12 ERROR oslo_messaging.notify.messaging conn.connect() Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.194 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.194 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.194 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.194 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.194 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.194 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.194 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.194 12 ERROR oslo_messaging.notify.messaging Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.194 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.194 12 ERROR oslo_messaging.notify.messaging Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.194 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.194 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.194 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.194 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 
134, in _send_notification Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.194 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.194 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.194 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.194 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.194 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.194 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.194 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.194 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.194 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.194 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.194 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.194 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.194 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.194 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.194 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.194 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.194 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.194 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.194 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.194 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", 
line 433, in _ensure_connection Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.194 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.194 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.194 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.194 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.194 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.194 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.194 12 ERROR oslo_messaging.notify.messaging Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.194 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.bytes in the context of pollsters Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.195 12 DEBUG ceilometer.compute.pollsters [-] 39ff1bd9-6f6b-44c8-bbec-a1fd9d196359/disk.device.read.bytes volume: 35560448 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.195 12 DEBUG ceilometer.compute.pollsters [-] 39ff1bd9-6f6b-44c8-bbec-a1fd9d196359/disk.device.read.bytes volume: 2154496 _stats_to_sample 
/usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.195 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'a43d9b76-e162-42c0-889a-968eca46df40', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 35560448, 'user_id': '1ba5fce347b64bfebf995f187193f205', 'user_name': None, 'project_id': 'c785bf23f53946bc99867d8832a50266', 'project_name': None, 'resource_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359-vda', 'timestamp': '2025-12-15T09:51:48.194975', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359', 'instance_type': 'm1.small', 'host': 'bf4d507aa00b3fcf643a7e0b883420632ac7fc96e8c7a756982e121d', 'instance_host': 'np0005559462.localdomain', 'flavor': {'id': '2da0e147-aaa7-4bb9-a176-5fe1b15a32a0', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e'}, 'image_ref': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'ac469740-d99b-11f0-817e-fa163ebaca0f', 'monotonic_time': 11374.3319679, 'message_signature': '766bb7a53bc3139a1a46903d56284e638c10b2839561bc4847ecebe3bb5a1bed'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 2154496, 'user_id': '1ba5fce347b64bfebf995f187193f205', 'user_name': None, 'project_id': 
'c785bf23f53946bc99867d8832a50266', 'project_name': None, 'resource_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359-vdb', 'timestamp': '2025-12-15T09:51:48.194975', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359', 'instance_type': 'm1.small', 'host': 'bf4d507aa00b3fcf643a7e0b883420632ac7fc96e8c7a756982e121d', 'instance_host': 'np0005559462.localdomain', 'flavor': {'id': '2da0e147-aaa7-4bb9-a176-5fe1b15a32a0', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e'}, 'image_ref': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'ac469f60-d99b-11f0-817e-fa163ebaca0f', 'monotonic_time': 11374.3319679, 'message_signature': '15b6c22c4c5cebe6c4ce77e28ed2bab40357f6046f086789947c1930f36c8771'}]}, 'timestamp': '2025-12-15 09:51:48.195402', '_unique_id': '40ad7c2d88bb44eabe6b1693e9a0926a'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.195 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.195 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.195 12 ERROR oslo_messaging.notify.messaging yield Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.195 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 15 04:51:48 
localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.195 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.195 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.195 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.195 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.195 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.195 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.195 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.195 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.195 12 ERROR oslo_messaging.notify.messaging conn.connect() Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.195 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.195 12 ERROR oslo_messaging.notify.messaging 
self.transport.connect()
Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.195 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.195 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.195 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.195 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.195 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.195 12 ERROR oslo_messaging.notify.messaging
Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.195 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.195 12 ERROR oslo_messaging.notify.messaging
Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.195 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.195 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.195 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.195 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.195 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.195 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.195 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.195 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.195 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.195 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.195 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.195 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.195 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.195 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.195 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.195 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.195 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.195 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.195 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.195 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.195 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.195 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.195 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.195 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.195 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.195 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.195 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.195 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.195 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.195 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.195 12 ERROR oslo_messaging.notify.messaging
Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.196 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.usage in the context of pollsters
Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.196 12 DEBUG ceilometer.compute.pollsters [-] 39ff1bd9-6f6b-44c8-bbec-a1fd9d196359/disk.device.usage volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.196 12 DEBUG ceilometer.compute.pollsters [-] 39ff1bd9-6f6b-44c8-bbec-a1fd9d196359/disk.device.usage volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.197 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'a64caeb4-747e-4967-b017-a630fbdfa606', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '1ba5fce347b64bfebf995f187193f205', 'user_name': None, 'project_id': 'c785bf23f53946bc99867d8832a50266', 'project_name': None, 'resource_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359-vda', 'timestamp': '2025-12-15T09:51:48.196334', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359', 'instance_type': 'm1.small', 'host': 'bf4d507aa00b3fcf643a7e0b883420632ac7fc96e8c7a756982e121d', 'instance_host': 'np0005559462.localdomain', 'flavor': {'id': '2da0e147-aaa7-4bb9-a176-5fe1b15a32a0', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e'}, 'image_ref': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'ac46cb16-d99b-11f0-817e-fa163ebaca0f', 'monotonic_time': 11374.315784188, 'message_signature': '26cb11fc3e366479719c3e73d5859262fdc494d7cf9da39838597b0f892e595e'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '1ba5fce347b64bfebf995f187193f205', 'user_name': None, 'project_id': 'c785bf23f53946bc99867d8832a50266', 'project_name': None, 'resource_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359-vdb', 'timestamp': '2025-12-15T09:51:48.196334', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359', 'instance_type': 'm1.small', 'host': 'bf4d507aa00b3fcf643a7e0b883420632ac7fc96e8c7a756982e121d', 'instance_host': 'np0005559462.localdomain', 'flavor': {'id': '2da0e147-aaa7-4bb9-a176-5fe1b15a32a0', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e'}, 'image_ref': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'ac46d1d8-d99b-11f0-817e-fa163ebaca0f', 'monotonic_time': 11374.315784188, 'message_signature': '6946c4858e1535ca7ad2d65a95ebabfbe7402454f307464d49b8781f6da995e9'}]}, 'timestamp': '2025-12-15 09:51:48.196694', '_unique_id': '27a6cc6ec08c4a2883fc8fd5e2a6ee43'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.197 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.197 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.197 12 ERROR oslo_messaging.notify.messaging yield
Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.197 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.197 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.197 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.197 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.197 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.197 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.197 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.197 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.197 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.197 12 ERROR oslo_messaging.notify.messaging conn.connect()
Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.197 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.197 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.197 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.197 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.197 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.197 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.197 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.197 12 ERROR oslo_messaging.notify.messaging
Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.197 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.197 12 ERROR oslo_messaging.notify.messaging
Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.197 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.197 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.197 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.197 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.197 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.197 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.197 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.197 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.197 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.197 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.197 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.197 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.197 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.197 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.197 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.197 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.197 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.197 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.197 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.197 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.197 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.197 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.197 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.197 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.197 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.197 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.197 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.197 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.197 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.197 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.197 12 ERROR oslo_messaging.notify.messaging
Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.197 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.drop in the context of pollsters
Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.197 12 DEBUG ceilometer.compute.pollsters [-] 39ff1bd9-6f6b-44c8-bbec-a1fd9d196359/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.198 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'bf8a5fea-33a1-431d-8fab-c178a7d4db12', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '1ba5fce347b64bfebf995f187193f205', 'user_name': None, 'project_id': 'c785bf23f53946bc99867d8832a50266', 'project_name': None, 'resource_id': 'instance-00000002-39ff1bd9-6f6b-44c8-bbec-a1fd9d196359-tap03ef8889-32', 'timestamp': '2025-12-15T09:51:48.197831', 'resource_metadata': {'display_name': 'test', 'name': 'tap03ef8889-32', 'instance_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359', 'instance_type': 'm1.small', 'host': 'bf4d507aa00b3fcf643a7e0b883420632ac7fc96e8c7a756982e121d', 'instance_host': 'np0005559462.localdomain', 'flavor': {'id': '2da0e147-aaa7-4bb9-a176-5fe1b15a32a0', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e'}, 'image_ref': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:74:df:7c', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap03ef8889-32'}, 'message_id': 'ac4705cc-d99b-11f0-817e-fa163ebaca0f', 'monotonic_time': 11374.357607975, 'message_signature': 'b5793646b0f0b88218b3d308f8542d326811bdbc9457a955be2e414a37c16c52'}]}, 'timestamp': '2025-12-15 09:51:48.198056', '_unique_id': '5cf670cc2e534fec91dac2b178423516'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.198 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.198 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.198 12 ERROR oslo_messaging.notify.messaging yield
Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.198 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.198 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.198 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.198 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.198 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.198 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.198 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.198 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.198 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.198 12 ERROR oslo_messaging.notify.messaging conn.connect()
Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.198 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.198 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.198 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.198 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.198 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.198 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.198 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.198 12 ERROR oslo_messaging.notify.messaging
Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.198 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.198 12 ERROR oslo_messaging.notify.messaging
Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.198 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.198 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.198 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.198 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.198 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.198 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.198 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.198 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.198 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.198 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.198 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.198 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.198 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.198 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.198 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.198 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.198 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.198 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.198 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.198 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.198 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.198 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.198 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.198 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.198 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.198 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.198 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.198 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.198 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.198 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.198 12 ERROR oslo_messaging.notify.messaging
Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.199 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.bytes in the context of pollsters
Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.199 12 DEBUG ceilometer.compute.pollsters [-] 39ff1bd9-6f6b-44c8-bbec-a1fd9d196359/disk.device.write.bytes volume: 389120 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.199 12 DEBUG ceilometer.compute.pollsters [-] 39ff1bd9-6f6b-44c8-bbec-a1fd9d196359/disk.device.write.bytes volume: 512 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.200 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '1de8a9a4-e51e-4ab6-a518-c42a4de5eae5', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 389120, 'user_id': '1ba5fce347b64bfebf995f187193f205', 'user_name': None, 'project_id': 'c785bf23f53946bc99867d8832a50266', 'project_name': None, 'resource_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359-vda', 'timestamp': '2025-12-15T09:51:48.199199', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359', 'instance_type': 'm1.small', 'host': 'bf4d507aa00b3fcf643a7e0b883420632ac7fc96e8c7a756982e121d', 'instance_host': 'np0005559462.localdomain', 'flavor': {'id': '2da0e147-aaa7-4bb9-a176-5fe1b15a32a0', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e'}, 'image_ref': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'ac473cb8-d99b-11f0-817e-fa163ebaca0f', 'monotonic_time': 11374.3319679, 'message_signature': '54ca979d4c29394c18d6aa35f7ceecf905f0d1c2cee9ff447605be84638abc3d'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 512, 'user_id': '1ba5fce347b64bfebf995f187193f205', 'user_name': None, 'project_id': 'c785bf23f53946bc99867d8832a50266', 'project_name': None, 'resource_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359-vdb', 'timestamp': '2025-12-15T09:51:48.199199', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359', 'instance_type': 'm1.small', 'host': 'bf4d507aa00b3fcf643a7e0b883420632ac7fc96e8c7a756982e121d', 'instance_host': 'np0005559462.localdomain', 'flavor': {'id': '2da0e147-aaa7-4bb9-a176-5fe1b15a32a0', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e'}, 'image_ref': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'ac4746b8-d99b-11f0-817e-fa163ebaca0f', 'monotonic_time': 11374.3319679, 'message_signature': '7bbe5c5535b7a7bc9696cf89070d132767c37e35d874a7de9c2a5efadefb5a58'}]}, 'timestamp': '2025-12-15 09:51:48.199723', '_unique_id': '872f41569a3d419daac4ebc0ac8e4a62'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.200 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.200 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.200 12 ERROR oslo_messaging.notify.messaging yield
Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.200 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.200 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.200 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.200 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.200 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.200 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.200 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.200 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.200 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.200 12 ERROR oslo_messaging.notify.messaging conn.connect()
Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.200 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.200 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.200 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 15 04:51:48 localhost
ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.200 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.200 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.200 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.200 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.200 12 ERROR oslo_messaging.notify.messaging Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.200 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.200 12 ERROR oslo_messaging.notify.messaging Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.200 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.200 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.200 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.200 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 
09:51:48.200 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.200 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.200 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.200 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.200 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.200 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.200 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.200 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.200 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.200 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Dec 15 04:51:48 localhost 
ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.200 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.200 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.200 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.200 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.200 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.200 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.200 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.200 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.200 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.200 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 
09:51:48.200 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.200 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.200 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.200 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.200 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.200 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.200 12 ERROR oslo_messaging.notify.messaging Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.201 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.delta in the context of pollsters Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.201 12 DEBUG ceilometer.compute.pollsters [-] 39ff1bd9-6f6b-44c8-bbec-a1fd9d196359/network.incoming.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.201 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': '47b8cacd-0d32-4e68-a904-52c8fd0c9bd9', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '1ba5fce347b64bfebf995f187193f205', 'user_name': None, 'project_id': 'c785bf23f53946bc99867d8832a50266', 'project_name': None, 'resource_id': 'instance-00000002-39ff1bd9-6f6b-44c8-bbec-a1fd9d196359-tap03ef8889-32', 'timestamp': '2025-12-15T09:51:48.201114', 'resource_metadata': {'display_name': 'test', 'name': 'tap03ef8889-32', 'instance_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359', 'instance_type': 'm1.small', 'host': 'bf4d507aa00b3fcf643a7e0b883420632ac7fc96e8c7a756982e121d', 'instance_host': 'np0005559462.localdomain', 'flavor': {'id': '2da0e147-aaa7-4bb9-a176-5fe1b15a32a0', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e'}, 'image_ref': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:74:df:7c', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap03ef8889-32'}, 'message_id': 'ac47883a-d99b-11f0-817e-fa163ebaca0f', 'monotonic_time': 11374.357607975, 'message_signature': '971f93f494607beef2cd5f450e718641f87dc0c430c3ab9efceb1add7c9984e9'}]}, 'timestamp': '2025-12-15 09:51:48.201427', '_unique_id': 'e144f6a45595460d93aa06af72dbbd1c'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.201 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 
2025-12-15 09:51:48.201 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.201 12 ERROR oslo_messaging.notify.messaging yield Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.201 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.201 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.201 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.201 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.201 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.201 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.201 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.201 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.201 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.201 12 ERROR oslo_messaging.notify.messaging conn.connect() Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.201 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.201 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.201 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.201 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.201 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.201 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.201 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.201 12 ERROR oslo_messaging.notify.messaging Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.201 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.201 12 ERROR oslo_messaging.notify.messaging Dec 15 04:51:48 localhost 
ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.201 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.201 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.201 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.201 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.201 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.201 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.201 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.201 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.201 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.201 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection 
Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.201 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.201 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.201 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.201 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.201 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.201 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.201 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.201 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.201 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.201 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Dec 15 04:51:48 
localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.201 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.201 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.201 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.201 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.201 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.201 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.201 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.201 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.201 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.201 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.201 12 ERROR oslo_messaging.notify.messaging Dec 15 04:51:48 localhost 
ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.202 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes in the context of pollsters Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.202 12 DEBUG ceilometer.compute.pollsters [-] 39ff1bd9-6f6b-44c8-bbec-a1fd9d196359/network.outgoing.bytes volume: 9770 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.203 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '9e8bb3c7-eed3-4777-9464-d58c35406c3d', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 9770, 'user_id': '1ba5fce347b64bfebf995f187193f205', 'user_name': None, 'project_id': 'c785bf23f53946bc99867d8832a50266', 'project_name': None, 'resource_id': 'instance-00000002-39ff1bd9-6f6b-44c8-bbec-a1fd9d196359-tap03ef8889-32', 'timestamp': '2025-12-15T09:51:48.202709', 'resource_metadata': {'display_name': 'test', 'name': 'tap03ef8889-32', 'instance_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359', 'instance_type': 'm1.small', 'host': 'bf4d507aa00b3fcf643a7e0b883420632ac7fc96e8c7a756982e121d', 'instance_host': 'np0005559462.localdomain', 'flavor': {'id': '2da0e147-aaa7-4bb9-a176-5fe1b15a32a0', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e'}, 'image_ref': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:74:df:7c', 'fref': None, 
'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap03ef8889-32'}, 'message_id': 'ac47c692-d99b-11f0-817e-fa163ebaca0f', 'monotonic_time': 11374.357607975, 'message_signature': '44a73b9ac0430d38fa8220df55004533e3973a7863a0cf007e2991d93ef6d323'}]}, 'timestamp': '2025-12-15 09:51:48.203048', '_unique_id': 'fee8fc35365349149ec48dbedd5fba8f'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.203 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.203 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.203 12 ERROR oslo_messaging.notify.messaging yield Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.203 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.203 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.203 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.203 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.203 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.203 12 ERROR 
oslo_messaging.notify.messaging self._connection = self._establish_connection() Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.203 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.203 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.203 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.203 12 ERROR oslo_messaging.notify.messaging conn.connect() Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.203 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.203 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.203 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.203 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.203 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.203 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 
2025-12-15 09:51:48.203 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.203 12 ERROR oslo_messaging.notify.messaging Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.203 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.203 12 ERROR oslo_messaging.notify.messaging Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.203 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.203 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.203 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.203 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.203 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.203 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.203 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Dec 15 04:51:48 localhost 
ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.203 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.203 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.203 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.203 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.203 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.203 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.203 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.203 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.203 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.203 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Dec 15 
04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.203 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.203 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.203 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.203 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.203 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.203 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.203 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.203 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.203 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.203 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.203 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.203 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.203 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.203 12 ERROR oslo_messaging.notify.messaging Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.204 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.requests in the context of pollsters Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.204 12 DEBUG ceilometer.compute.pollsters [-] 39ff1bd9-6f6b-44c8-bbec-a1fd9d196359/disk.device.read.requests volume: 1272 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.204 12 DEBUG ceilometer.compute.pollsters [-] 39ff1bd9-6f6b-44c8-bbec-a1fd9d196359/disk.device.read.requests volume: 124 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.205 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': '29a153ee-0a09-4e1e-8129-491f19496995', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1272, 'user_id': '1ba5fce347b64bfebf995f187193f205', 'user_name': None, 'project_id': 'c785bf23f53946bc99867d8832a50266', 'project_name': None, 'resource_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359-vda', 'timestamp': '2025-12-15T09:51:48.204403', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359', 'instance_type': 'm1.small', 'host': 'bf4d507aa00b3fcf643a7e0b883420632ac7fc96e8c7a756982e121d', 'instance_host': 'np0005559462.localdomain', 'flavor': {'id': '2da0e147-aaa7-4bb9-a176-5fe1b15a32a0', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e'}, 'image_ref': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'ac480800-d99b-11f0-817e-fa163ebaca0f', 'monotonic_time': 11374.3319679, 'message_signature': '5d5d36b97062044f98b7cfe40ea54d91c58a20524c9f0eca832bd3f3dc533a61'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 124, 'user_id': '1ba5fce347b64bfebf995f187193f205', 'user_name': None, 'project_id': 'c785bf23f53946bc99867d8832a50266', 'project_name': None, 'resource_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359-vdb', 'timestamp': '2025-12-15T09:51:48.204403', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 
'39ff1bd9-6f6b-44c8-bbec-a1fd9d196359', 'instance_type': 'm1.small', 'host': 'bf4d507aa00b3fcf643a7e0b883420632ac7fc96e8c7a756982e121d', 'instance_host': 'np0005559462.localdomain', 'flavor': {'id': '2da0e147-aaa7-4bb9-a176-5fe1b15a32a0', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e'}, 'image_ref': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'ac4811f6-d99b-11f0-817e-fa163ebaca0f', 'monotonic_time': 11374.3319679, 'message_signature': 'b301ce75847caa788ab84241a047a3b962f91003f461c1351cd46d3e194a0ef7'}]}, 'timestamp': '2025-12-15 09:51:48.204929', '_unique_id': '2121e135a5d04af1bad7a68f6bbd52ed'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.205 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.205 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.205 12 ERROR oslo_messaging.notify.messaging yield Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.205 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.205 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.205 12 ERROR 
oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.205 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.205 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.205 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.205 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.205 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.205 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.205 12 ERROR oslo_messaging.notify.messaging conn.connect() Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.205 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.205 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.205 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Dec 15 
04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.205 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.205 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.205 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.205 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.205 12 ERROR oslo_messaging.notify.messaging Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.205 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.205 12 ERROR oslo_messaging.notify.messaging Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.205 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.205 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.205 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.205 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Dec 15 04:51:48 localhost 
ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.205 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.205 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.205 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.205 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.205 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.205 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.205 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.205 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.205 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.205 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, 
in get Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.205 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.205 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.205 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.205 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.205 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.205 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.205 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.205 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.205 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.205 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 15 04:51:48 localhost 
ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.205 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.205 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.205 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.205 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.205 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.205 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 15 04:51:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:51:48.205 12 ERROR oslo_messaging.notify.messaging Dec 15 04:51:49 localhost systemd[1]: Started /usr/bin/podman healthcheck run 67ee72fcd99c896f79e8807764b1d0f04e7d45927d06e306c0986394e8ed42a0. Dec 15 04:51:49 localhost systemd[1]: Started /usr/bin/podman healthcheck run 9f0f481c937606298203797a71924b7614a5d9e3f4a8afd837cfa5b9602aedeb. Dec 15 04:51:49 localhost systemd[1]: Started /usr/bin/podman healthcheck run b3d8483ade539500e228c2af9c0c3e647febb5ec7903610a83aea32631007d4a. 
Dec 15 04:51:49 localhost podman[297948]: 2025-12-15 09:51:49.754973581 +0000 UTC m=+0.083264013 container health_status 67ee72fcd99c896f79e8807764b1d0f04e7d45927d06e306c0986394e8ed42a0 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}) Dec 15 04:51:49 localhost podman[297949]: 2025-12-15 09:51:49.808903894 +0000 UTC m=+0.130317904 container health_status 9f0f481c937606298203797a71924b7614a5d9e3f4a8afd837cfa5b9602aedeb (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 
'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, org.label-schema.build-date=20251202, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_id=multipathd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team) Dec 15 04:51:49 localhost podman[297949]: 2025-12-15 09:51:49.822431841 +0000 UTC m=+0.143845891 container exec_died 9f0f481c937606298203797a71924b7614a5d9e3f4a8afd837cfa5b9602aedeb (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, container_name=multipathd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': 
'/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_id=multipathd) Dec 15 04:51:49 localhost systemd[1]: 9f0f481c937606298203797a71924b7614a5d9e3f4a8afd837cfa5b9602aedeb.service: Deactivated successfully. 
Dec 15 04:51:49 localhost podman[297950]: 2025-12-15 09:51:49.871716305 +0000 UTC m=+0.190464871 container health_status b3d8483ade539500e228c2af9c0c3e647febb5ec7903610a83aea32631007d4a (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, managed_by=edpm_ansible, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '07f3af15d79276418f54a8dde2fc251064527c3571430e14af77aaa2c01f1193-cb1e9478634a00c8fb5ad9cd7fcd38c7b2a1a85187dfaa618bc715c24c2b15fb'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ceilometer_agent_compute, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, org.label-schema.schema-version=1.0, 
org.label-schema.vendor=CentOS) Dec 15 04:51:49 localhost podman[297950]: 2025-12-15 09:51:49.882033283 +0000 UTC m=+0.200781839 container exec_died b3d8483ade539500e228c2af9c0c3e647febb5ec7903610a83aea32631007d4a (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, org.label-schema.build-date=20251202, tcib_managed=true, config_id=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '07f3af15d79276418f54a8dde2fc251064527c3571430e14af77aaa2c01f1193-cb1e9478634a00c8fb5ad9cd7fcd38c7b2a1a85187dfaa618bc715c24c2b15fb'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, 
io.buildah.version=1.41.3, managed_by=edpm_ansible) Dec 15 04:51:49 localhost podman[297948]: 2025-12-15 09:51:49.89017908 +0000 UTC m=+0.218469502 container exec_died 67ee72fcd99c896f79e8807764b1d0f04e7d45927d06e306c0986394e8ed42a0 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible) Dec 15 04:51:49 localhost systemd[1]: b3d8483ade539500e228c2af9c0c3e647febb5ec7903610a83aea32631007d4a.service: Deactivated successfully. Dec 15 04:51:49 localhost systemd[1]: 67ee72fcd99c896f79e8807764b1d0f04e7d45927d06e306c0986394e8ed42a0.service: Deactivated successfully. 
Dec 15 04:51:50 localhost nova_compute[286344]: 2025-12-15 09:51:50.243 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Dec 15 04:51:50 localhost nova_compute[286344]: 2025-12-15 09:51:50.245 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Dec 15 04:51:50 localhost nova_compute[286344]: 2025-12-15 09:51:50.245 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m Dec 15 04:51:50 localhost nova_compute[286344]: 2025-12-15 09:51:50.245 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Dec 15 04:51:50 localhost nova_compute[286344]: 2025-12-15 09:51:50.280 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 04:51:50 localhost nova_compute[286344]: 2025-12-15 09:51:50.281 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Dec 15 04:51:51 localhost ovn_metadata_agent[160585]: 2025-12-15 09:51:51.468 160590 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Dec 15 04:51:51 localhost ovn_metadata_agent[160585]: 2025-12-15 09:51:51.469 160590 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Dec 
15 04:51:51 localhost ovn_metadata_agent[160585]: 2025-12-15 09:51:51.469 160590 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Dec 15 04:51:52 localhost systemd[1]: Started /usr/bin/podman healthcheck run 730c50020a7d9e2da21d2462d40af20ac303e43f3491f692cbbd87f432ba3e09. Dec 15 04:51:52 localhost systemd[1]: Started /usr/bin/podman healthcheck run ac2eae30a2a7f971398af42ea67b3ed799d4b3341f7688ebcd73f7ca7f66c2a8. Dec 15 04:51:52 localhost podman[298008]: 2025-12-15 09:51:52.754823634 +0000 UTC m=+0.080245169 container health_status 730c50020a7d9e2da21d2462d40af20ac303e43f3491f692cbbd87f432ba3e09 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, release=1755695350, build-date=2025-08-20T13:12:41, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'cb1e9478634a00c8fb5ad9cd7fcd38c7b2a1a85187dfaa618bc715c24c2b15fb'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', 
'/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vendor=Red Hat, Inc., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=ubi9-minimal, config_id=openstack_network_exporter, io.openshift.expose-services=, io.openshift.tags=minimal rhel9, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, url=https://catalog.redhat.com/en/search?searchType=containers, io.buildah.version=1.33.7, managed_by=edpm_ansible, com.redhat.component=ubi9-minimal-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, container_name=openstack_network_exporter, vcs-type=git, version=9.6, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., distribution-scope=public) Dec 15 04:51:52 localhost podman[298008]: 2025-12-15 09:51:52.772125646 +0000 UTC m=+0.097547201 container exec_died 730c50020a7d9e2da21d2462d40af20ac303e43f3491f692cbbd87f432ba3e09 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 
'cb1e9478634a00c8fb5ad9cd7fcd38c7b2a1a85187dfaa618bc715c24c2b15fb'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-type=git, com.redhat.component=ubi9-minimal-container, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, url=https://catalog.redhat.com/en/search?searchType=containers, name=ubi9-minimal, version=9.6, build-date=2025-08-20T13:12:41, architecture=x86_64, io.openshift.expose-services=, io.openshift.tags=minimal rhel9, vendor=Red Hat, Inc., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., distribution-scope=public, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, container_name=openstack_network_exporter, release=1755695350, io.buildah.version=1.33.7, managed_by=edpm_ansible, maintainer=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=openstack_network_exporter) Dec 15 04:51:52 localhost systemd[1]: 730c50020a7d9e2da21d2462d40af20ac303e43f3491f692cbbd87f432ba3e09.service: Deactivated successfully. Dec 15 04:51:52 localhost podman[298009]: 2025-12-15 09:51:52.860431847 +0000 UTC m=+0.183236359 container health_status ac2eae30a2a7f971398af42ea67b3ed799d4b3341f7688ebcd73f7ca7f66c2a8 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, config_id=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=ovn_controller, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '07f3af15d79276418f54a8dde2fc251064527c3571430e14af77aaa2c01f1193'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, 
managed_by=edpm_ansible, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team) Dec 15 04:51:52 localhost podman[298009]: 2025-12-15 09:51:52.922749035 +0000 UTC m=+0.245553567 container exec_died ac2eae30a2a7f971398af42ea67b3ed799d4b3341f7688ebcd73f7ca7f66c2a8 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '07f3af15d79276418f54a8dde2fc251064527c3571430e14af77aaa2c01f1193'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_id=ovn_controller, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb) Dec 15 04:51:52 localhost systemd[1]: ac2eae30a2a7f971398af42ea67b3ed799d4b3341f7688ebcd73f7ca7f66c2a8.service: Deactivated successfully. 
Dec 15 04:51:55 localhost nova_compute[286344]: 2025-12-15 09:51:55.282 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248 Dec 15 04:51:55 localhost nova_compute[286344]: 2025-12-15 09:51:55.284 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248 Dec 15 04:51:55 localhost nova_compute[286344]: 2025-12-15 09:51:55.284 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117 Dec 15 04:51:55 localhost nova_compute[286344]: 2025-12-15 09:51:55.285 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519 Dec 15 04:51:55 localhost nova_compute[286344]: 2025-12-15 09:51:55.285 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519 Dec 15 04:51:55 localhost nova_compute[286344]: 2025-12-15 09:51:55.288 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263 Dec 15 04:51:56 localhost systemd[1]: tmp-crun.L9TEoS.mount: Deactivated successfully. 
Dec 15 04:51:56 localhost podman[298159]: 2025-12-15 09:51:56.792425948 +0000 UTC m=+0.098182668 container exec 8dcda56b365b42dc8758aab77a9ec80db304780e449052738f7e4e648ae1ecaf (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-bce17446-41b5-5408-a23e-0b011906b44a-crash-np0005559462, version=7, build-date=2025-11-26T19:44:28Z, io.openshift.tags=rhceph ceph, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, com.redhat.component=rhceph-container, io.k8s.description=Red Hat Ceph Storage 7, RELEASE=main, name=rhceph, GIT_REPO=https://github.com/ceph/ceph-container.git, vendor=Red Hat, Inc., org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_CLEAN=True, distribution-scope=public, ceph=True, description=Red Hat Ceph Storage 7, CEPH_POINT_RELEASE=, release=1763362218, architecture=x86_64, maintainer=Guillaume Abrioux , url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, vcs-type=git, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.buildah.version=1.41.4, GIT_BRANCH=main, io.openshift.expose-services=) Dec 15 04:51:56 localhost podman[298159]: 2025-12-15 09:51:56.919107269 +0000 UTC m=+0.224863989 container exec_died 8dcda56b365b42dc8758aab77a9ec80db304780e449052738f7e4e648ae1ecaf (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-bce17446-41b5-5408-a23e-0b011906b44a-crash-np0005559462, maintainer=Guillaume Abrioux , cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.buildah.version=1.41.4, CEPH_POINT_RELEASE=, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, 
vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, com.redhat.component=rhceph-container, GIT_REPO=https://github.com/ceph/ceph-container.git, architecture=x86_64, release=1763362218, io.openshift.tags=rhceph ceph, ceph=True, build-date=2025-11-26T19:44:28Z, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat Ceph Storage 7, GIT_BRANCH=main, vcs-type=git, RELEASE=main, io.openshift.expose-services=, GIT_CLEAN=True, distribution-scope=public, name=rhceph, vendor=Red Hat, Inc., version=7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, description=Red Hat Ceph Storage 7, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0) Dec 15 04:51:58 localhost podman[298392]: Dec 15 04:51:58 localhost podman[298392]: 2025-12-15 09:51:58.059141932 +0000 UTC m=+0.069083647 container create d5a924eb7043dde20f79d3a5c20041e68a42601a1164adb93a6d8ec84666554a (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=trusting_merkle, release=1763362218, io.k8s.description=Red Hat Ceph Storage 7, build-date=2025-11-26T19:44:28Z, io.openshift.expose-services=, vcs-type=git, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, io.openshift.tags=rhceph ceph, GIT_REPO=https://github.com/ceph/ceph-container.git, description=Red Hat Ceph Storage 7, ceph=True, io.buildah.version=1.41.4, distribution-scope=public, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, maintainer=Guillaume Abrioux , GIT_CLEAN=True, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, url=https://catalog.redhat.com/en/search?searchType=containers, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:enterprise_linux:9::appstream, CEPH_POINT_RELEASE=, version=7, com.redhat.component=rhceph-container, architecture=x86_64, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, name=rhceph, GIT_BRANCH=main, RELEASE=main) Dec 15 04:51:58 localhost systemd[1]: Started libpod-conmon-d5a924eb7043dde20f79d3a5c20041e68a42601a1164adb93a6d8ec84666554a.scope. Dec 15 04:51:58 localhost systemd[1]: Started libcrun container. Dec 15 04:51:58 localhost podman[298392]: 2025-12-15 09:51:58.129261947 +0000 UTC m=+0.139203672 container init d5a924eb7043dde20f79d3a5c20041e68a42601a1164adb93a6d8ec84666554a (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=trusting_merkle, io.openshift.expose-services=, GIT_CLEAN=True, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhceph ceph, GIT_REPO=https://github.com/ceph/ceph-container.git, architecture=x86_64, ceph=True, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, build-date=2025-11-26T19:44:28Z, description=Red Hat Ceph Storage 7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, distribution-scope=public, name=rhceph, version=7, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, com.redhat.component=rhceph-container, io.buildah.version=1.41.4, CEPH_POINT_RELEASE=, vcs-type=git, io.k8s.description=Red Hat Ceph Storage 7, RELEASE=main, url=https://catalog.redhat.com/en/search?searchType=containers, vendor=Red Hat, Inc., vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, release=1763362218, maintainer=Guillaume Abrioux , GIT_BRANCH=main) Dec 15 04:51:58 localhost podman[298392]: 2025-12-15 09:51:58.030517025 +0000 UTC m=+0.040458760 image pull registry.redhat.io/rhceph/rhceph-7-rhel9:latest Dec 15 04:51:58 localhost podman[298392]: 2025-12-15 09:51:58.139461642 +0000 UTC m=+0.149403327 container start 
d5a924eb7043dde20f79d3a5c20041e68a42601a1164adb93a6d8ec84666554a (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=trusting_merkle, ceph=True, RELEASE=main, version=7, com.redhat.component=rhceph-container, GIT_REPO=https://github.com/ceph/ceph-container.git, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, vcs-type=git, io.k8s.description=Red Hat Ceph Storage 7, io.openshift.tags=rhceph ceph, build-date=2025-11-26T19:44:28Z, url=https://catalog.redhat.com/en/search?searchType=containers, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, description=Red Hat Ceph Storage 7, architecture=x86_64, release=1763362218, io.buildah.version=1.41.4, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vendor=Red Hat, Inc., GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, GIT_BRANCH=main, io.openshift.expose-services=, CEPH_POINT_RELEASE=, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, name=rhceph, maintainer=Guillaume Abrioux , GIT_CLEAN=True, distribution-scope=public) Dec 15 04:51:58 localhost podman[298392]: 2025-12-15 09:51:58.141239071 +0000 UTC m=+0.151180866 container attach d5a924eb7043dde20f79d3a5c20041e68a42601a1164adb93a6d8ec84666554a (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=trusting_merkle, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vendor=Red Hat, Inc., io.k8s.description=Red Hat Ceph Storage 7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, name=rhceph, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.openshift.expose-services=, architecture=x86_64, io.buildah.version=1.41.4, distribution-scope=public, version=7, maintainer=Guillaume Abrioux , vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, release=1763362218, 
com.redhat.component=rhceph-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-11-26T19:44:28Z, description=Red Hat Ceph Storage 7, vcs-type=git, io.openshift.tags=rhceph ceph, CEPH_POINT_RELEASE=, RELEASE=main, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_BRANCH=main, ceph=True, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, GIT_CLEAN=True, GIT_REPO=https://github.com/ceph/ceph-container.git, url=https://catalog.redhat.com/en/search?searchType=containers) Dec 15 04:51:58 localhost trusting_merkle[298411]: 167 167 Dec 15 04:51:58 localhost systemd[1]: libpod-d5a924eb7043dde20f79d3a5c20041e68a42601a1164adb93a6d8ec84666554a.scope: Deactivated successfully. Dec 15 04:51:58 localhost podman[298392]: 2025-12-15 09:51:58.143407162 +0000 UTC m=+0.153348907 container died d5a924eb7043dde20f79d3a5c20041e68a42601a1164adb93a6d8ec84666554a (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=trusting_merkle, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.openshift.expose-services=, release=1763362218, io.openshift.tags=rhceph ceph, distribution-scope=public, architecture=x86_64, version=7, url=https://catalog.redhat.com/en/search?searchType=containers, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, build-date=2025-11-26T19:44:28Z, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, GIT_CLEAN=True, vcs-type=git, GIT_BRANCH=main, GIT_REPO=https://github.com/ceph/ceph-container.git, CEPH_POINT_RELEASE=, ceph=True, name=rhceph, description=Red Hat Ceph Storage 7, RELEASE=main, maintainer=Guillaume Abrioux , vendor=Red Hat, Inc., io.buildah.version=1.41.4, io.k8s.description=Red Hat Ceph Storage 7, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=rhceph-container) Dec 15 04:51:58 localhost podman[298418]: 2025-12-15 09:51:58.244632333 +0000 UTC m=+0.088132437 container remove d5a924eb7043dde20f79d3a5c20041e68a42601a1164adb93a6d8ec84666554a (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=trusting_merkle, com.redhat.component=rhceph-container, RELEASE=main, description=Red Hat Ceph Storage 7, build-date=2025-11-26T19:44:28Z, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, architecture=x86_64, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, ceph=True, url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.tags=rhceph ceph, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., distribution-scope=public, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, release=1763362218, io.buildah.version=1.41.4, GIT_BRANCH=main, GIT_CLEAN=True, io.openshift.expose-services=, vcs-type=git, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, CEPH_POINT_RELEASE=, version=7, GIT_REPO=https://github.com/ceph/ceph-container.git, io.k8s.description=Red Hat Ceph Storage 7, maintainer=Guillaume Abrioux , name=rhceph) Dec 15 04:51:58 localhost systemd[1]: libpod-conmon-d5a924eb7043dde20f79d3a5c20041e68a42601a1164adb93a6d8ec84666554a.scope: Deactivated successfully. 
Dec 15 04:51:58 localhost podman[298447]: Dec 15 04:51:58 localhost podman[298447]: 2025-12-15 09:51:58.329908841 +0000 UTC m=+0.060478537 container create 4c461342c818cb2ad217ac2998a4db8be2814319e5957c9ccd213b29273f6b30 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=wonderful_ishizaka, architecture=x86_64, maintainer=Guillaume Abrioux , GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, version=7, io.openshift.expose-services=, io.k8s.description=Red Hat Ceph Storage 7, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, io.openshift.tags=rhceph ceph, name=rhceph, RELEASE=main, GIT_CLEAN=True, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, distribution-scope=public, vcs-type=git, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, com.redhat.component=rhceph-container, CEPH_POINT_RELEASE=, io.buildah.version=1.41.4, ceph=True, GIT_REPO=https://github.com/ceph/ceph-container.git, GIT_BRANCH=main, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., build-date=2025-11-26T19:44:28Z, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, description=Red Hat Ceph Storage 7, vendor=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, release=1763362218) Dec 15 04:51:58 localhost systemd[1]: Started libpod-conmon-4c461342c818cb2ad217ac2998a4db8be2814319e5957c9ccd213b29273f6b30.scope. Dec 15 04:51:58 localhost systemd[1]: Started libcrun container. 
Dec 15 04:51:58 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/08b561b96016d94b14ca5a67592cc74073be861e71555bb633672123caba2837/merged/tmp/keyring supports timestamps until 2038 (0x7fffffff) Dec 15 04:51:58 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/08b561b96016d94b14ca5a67592cc74073be861e71555bb633672123caba2837/merged/tmp/config supports timestamps until 2038 (0x7fffffff) Dec 15 04:51:58 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/08b561b96016d94b14ca5a67592cc74073be861e71555bb633672123caba2837/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff) Dec 15 04:51:58 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/08b561b96016d94b14ca5a67592cc74073be861e71555bb633672123caba2837/merged/var/lib/ceph/mon/ceph-np0005559462 supports timestamps until 2038 (0x7fffffff) Dec 15 04:51:58 localhost podman[298447]: 2025-12-15 09:51:58.390884881 +0000 UTC m=+0.121454577 container init 4c461342c818cb2ad217ac2998a4db8be2814319e5957c9ccd213b29273f6b30 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=wonderful_ishizaka, io.openshift.expose-services=, release=1763362218, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, ceph=True, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, url=https://catalog.redhat.com/en/search?searchType=containers, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., build-date=2025-11-26T19:44:28Z, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, GIT_CLEAN=True, io.openshift.tags=rhceph ceph, distribution-scope=public, io.k8s.description=Red Hat Ceph Storage 7, com.redhat.component=rhceph-container, description=Red Hat Ceph Storage 7, CEPH_POINT_RELEASE=, version=7, 
org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, RELEASE=main, io.buildah.version=1.41.4, maintainer=Guillaume Abrioux , GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vcs-type=git, GIT_REPO=https://github.com/ceph/ceph-container.git, GIT_BRANCH=main, name=rhceph) Dec 15 04:51:58 localhost podman[298447]: 2025-12-15 09:51:58.297295412 +0000 UTC m=+0.027865188 image pull registry.redhat.io/rhceph/rhceph-7-rhel9:latest Dec 15 04:51:58 localhost podman[298447]: 2025-12-15 09:51:58.401184718 +0000 UTC m=+0.131754424 container start 4c461342c818cb2ad217ac2998a4db8be2814319e5957c9ccd213b29273f6b30 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=wonderful_ishizaka, ceph=True, GIT_REPO=https://github.com/ceph/ceph-container.git, version=7, io.buildah.version=1.41.4, CEPH_POINT_RELEASE=, vcs-type=git, architecture=x86_64, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, description=Red Hat Ceph Storage 7, io.k8s.description=Red Hat Ceph Storage 7, release=1763362218, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.openshift.tags=rhceph ceph, distribution-scope=public, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, GIT_CLEAN=True, url=https://catalog.redhat.com/en/search?searchType=containers, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.openshift.expose-services=, maintainer=Guillaume Abrioux , com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, RELEASE=main, name=rhceph, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, com.redhat.component=rhceph-container, vendor=Red Hat, Inc., build-date=2025-11-26T19:44:28Z, GIT_BRANCH=main) Dec 15 04:51:58 localhost podman[298447]: 2025-12-15 09:51:58.40301888 +0000 UTC m=+0.133588556 container attach 4c461342c818cb2ad217ac2998a4db8be2814319e5957c9ccd213b29273f6b30 
(image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=wonderful_ishizaka, vendor=Red Hat, Inc., io.k8s.description=Red Hat Ceph Storage 7, description=Red Hat Ceph Storage 7, GIT_CLEAN=True, RELEASE=main, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=rhceph-container, io.openshift.tags=rhceph ceph, CEPH_POINT_RELEASE=, url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.expose-services=, io.buildah.version=1.41.4, ceph=True, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., distribution-scope=public, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, architecture=x86_64, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, version=7, GIT_BRANCH=main, GIT_REPO=https://github.com/ceph/ceph-container.git, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, name=rhceph, maintainer=Guillaume Abrioux , cpe=cpe:/a:redhat:enterprise_linux:9::appstream, vcs-type=git, build-date=2025-11-26T19:44:28Z, release=1763362218) Dec 15 04:51:58 localhost systemd[1]: libpod-4c461342c818cb2ad217ac2998a4db8be2814319e5957c9ccd213b29273f6b30.scope: Deactivated successfully. 
Dec 15 04:51:58 localhost podman[298447]: 2025-12-15 09:51:58.496602388 +0000 UTC m=+0.227172104 container died 4c461342c818cb2ad217ac2998a4db8be2814319e5957c9ccd213b29273f6b30 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=wonderful_ishizaka, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, CEPH_POINT_RELEASE=, maintainer=Guillaume Abrioux , ceph=True, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, RELEASE=main, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, build-date=2025-11-26T19:44:28Z, name=rhceph, io.buildah.version=1.41.4, url=https://catalog.redhat.com/en/search?searchType=containers, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_REPO=https://github.com/ceph/ceph-container.git, com.redhat.component=rhceph-container, GIT_BRANCH=main, io.k8s.description=Red Hat Ceph Storage 7, description=Red Hat Ceph Storage 7, io.openshift.tags=rhceph ceph, vendor=Red Hat, Inc., distribution-scope=public, vcs-type=git, GIT_CLEAN=True, version=7, io.openshift.expose-services=, architecture=x86_64, release=1763362218) Dec 15 04:51:58 localhost podman[298540]: 2025-12-15 09:51:58.592641195 +0000 UTC m=+0.084936148 container remove 4c461342c818cb2ad217ac2998a4db8be2814319e5957c9ccd213b29273f6b30 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=wonderful_ishizaka, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, build-date=2025-11-26T19:44:28Z, GIT_CLEAN=True, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, RELEASE=main, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, release=1763362218, vendor=Red Hat, Inc., vcs-type=git, version=7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and 
supported base image., CEPH_POINT_RELEASE=, ceph=True, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.openshift.tags=rhceph ceph, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=rhceph-container, architecture=x86_64, io.openshift.expose-services=, name=rhceph, distribution-scope=public, maintainer=Guillaume Abrioux , url=https://catalog.redhat.com/en/search?searchType=containers, io.buildah.version=1.41.4, io.k8s.description=Red Hat Ceph Storage 7, description=Red Hat Ceph Storage 7, GIT_BRANCH=main, GIT_REPO=https://github.com/ceph/ceph-container.git) Dec 15 04:51:58 localhost systemd[1]: libpod-conmon-4c461342c818cb2ad217ac2998a4db8be2814319e5957c9ccd213b29273f6b30.scope: Deactivated successfully. Dec 15 04:51:58 localhost systemd[1]: Reloading. Dec 15 04:51:58 localhost systemd-rc-local-generator[298612]: /etc/rc.d/rc.local is not marked executable, skipping. Dec 15 04:51:58 localhost systemd-sysv-generator[298619]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. 
Dec 15 04:51:58 localhost systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload Dec 15 04:51:58 localhost systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload Dec 15 04:51:58 localhost systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload Dec 15 04:51:58 localhost systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload Dec 15 04:51:58 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Dec 15 04:51:58 localhost systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload Dec 15 04:51:58 localhost systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload Dec 15 04:51:58 localhost systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload Dec 15 04:51:58 localhost systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload Dec 15 04:51:58 localhost systemd[1]: var-lib-containers-storage-overlay-38d1358b040d09cfc71a7815a5a170ecab28319230833b09345aa0986f741eaa-merged.mount: Deactivated successfully. Dec 15 04:51:59 localhost systemd[1]: Reloading. Dec 15 04:51:59 localhost systemd-rc-local-generator[298679]: /etc/rc.d/rc.local is not marked executable, skipping. Dec 15 04:51:59 localhost systemd-sysv-generator[298683]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust. 
Dec 15 04:51:59 localhost systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload Dec 15 04:51:59 localhost systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload Dec 15 04:51:59 localhost systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload Dec 15 04:51:59 localhost systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload Dec 15 04:51:59 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Dec 15 04:51:59 localhost systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload Dec 15 04:51:59 localhost systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload Dec 15 04:51:59 localhost systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload Dec 15 04:51:59 localhost systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload Dec 15 04:51:59 localhost systemd[1]: Starting Ceph mon.np0005559462 for bce17446-41b5-5408-a23e-0b011906b44a... 
Dec 15 04:51:59 localhost podman[298860]: Dec 15 04:51:59 localhost podman[298860]: 2025-12-15 09:51:59.901208698 +0000 UTC m=+0.077031709 container create 98760c33c2641506a01dd5eb78a8bea3d6828496dba275bde8c5bc6b941094e3 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-bce17446-41b5-5408-a23e-0b011906b44a-mon-np0005559462, io.openshift.tags=rhceph ceph, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, GIT_CLEAN=True, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., ceph=True, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.buildah.version=1.41.4, GIT_REPO=https://github.com/ceph/ceph-container.git, distribution-scope=public, maintainer=Guillaume Abrioux , CEPH_POINT_RELEASE=, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vcs-type=git, architecture=x86_64, com.redhat.component=rhceph-container, build-date=2025-11-26T19:44:28Z, url=https://catalog.redhat.com/en/search?searchType=containers, release=1763362218, version=7, io.k8s.description=Red Hat Ceph Storage 7, GIT_BRANCH=main, name=rhceph, description=Red Hat Ceph Storage 7, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, RELEASE=main) Dec 15 04:51:59 localhost podman[298860]: 2025-12-15 09:51:59.869511933 +0000 UTC m=+0.045334974 image pull registry.redhat.io/rhceph/rhceph-7-rhel9:latest Dec 15 04:51:59 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/648d80df1de9be973f774bb97157b352cf608cc240cae4432210c611289b11cf/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff) Dec 15 04:51:59 localhost kernel: xfs filesystem being remounted at 
/var/lib/containers/storage/overlay/648d80df1de9be973f774bb97157b352cf608cc240cae4432210c611289b11cf/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff) Dec 15 04:51:59 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/648d80df1de9be973f774bb97157b352cf608cc240cae4432210c611289b11cf/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff) Dec 15 04:51:59 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/648d80df1de9be973f774bb97157b352cf608cc240cae4432210c611289b11cf/merged/var/lib/ceph/mon/ceph-np0005559462 supports timestamps until 2038 (0x7fffffff) Dec 15 04:51:59 localhost podman[298860]: 2025-12-15 09:51:59.979702766 +0000 UTC m=+0.155525757 container init 98760c33c2641506a01dd5eb78a8bea3d6828496dba275bde8c5bc6b941094e3 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-bce17446-41b5-5408-a23e-0b011906b44a-mon-np0005559462, version=7, GIT_CLEAN=True, io.openshift.tags=rhceph ceph, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vcs-type=git, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, RELEASE=main, CEPH_POINT_RELEASE=, ceph=True, build-date=2025-11-26T19:44:28Z, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, release=1763362218, io.k8s.description=Red Hat Ceph Storage 7, vendor=Red Hat, Inc., summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_BRANCH=main, architecture=x86_64, maintainer=Guillaume Abrioux , io.openshift.expose-services=, name=rhceph, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, com.redhat.component=rhceph-container, url=https://catalog.redhat.com/en/search?searchType=containers, description=Red Hat Ceph Storage 7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, GIT_REPO=https://github.com/ceph/ceph-container.git, distribution-scope=public, 
vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0) Dec 15 04:51:59 localhost podman[298860]: 2025-12-15 09:51:59.988691757 +0000 UTC m=+0.164514748 container start 98760c33c2641506a01dd5eb78a8bea3d6828496dba275bde8c5bc6b941094e3 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-bce17446-41b5-5408-a23e-0b011906b44a-mon-np0005559462, release=1763362218, io.k8s.description=Red Hat Ceph Storage 7, architecture=x86_64, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, vcs-type=git, url=https://catalog.redhat.com/en/search?searchType=containers, vendor=Red Hat, Inc., CEPH_POINT_RELEASE=, GIT_CLEAN=True, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, maintainer=Guillaume Abrioux , com.redhat.component=rhceph-container, io.openshift.tags=rhceph ceph, GIT_BRANCH=main, ceph=True, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, GIT_REPO=https://github.com/ceph/ceph-container.git, io.openshift.expose-services=, distribution-scope=public, RELEASE=main, version=7, name=rhceph, description=Red Hat Ceph Storage 7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.buildah.version=1.41.4, build-date=2025-11-26T19:44:28Z, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image.) Dec 15 04:51:59 localhost bash[298860]: 98760c33c2641506a01dd5eb78a8bea3d6828496dba275bde8c5bc6b941094e3 Dec 15 04:51:59 localhost systemd[1]: Started Ceph mon.np0005559462 for bce17446-41b5-5408-a23e-0b011906b44a. 
Dec 15 04:52:00 localhost ceph-mon[298913]: set uid:gid to 167:167 (ceph:ceph) Dec 15 04:52:00 localhost ceph-mon[298913]: ceph version 18.2.1-361.el9cp (439dcd6094d413840eb2ec590fe2194ec616687f) reef (stable), process ceph-mon, pid 2 Dec 15 04:52:00 localhost ceph-mon[298913]: pidfile_write: ignore empty --pid-file Dec 15 04:52:00 localhost ceph-mon[298913]: load: jerasure load: lrc Dec 15 04:52:00 localhost ceph-mon[298913]: rocksdb: RocksDB version: 7.9.2 Dec 15 04:52:00 localhost ceph-mon[298913]: rocksdb: Git sha 0 Dec 15 04:52:00 localhost ceph-mon[298913]: rocksdb: Compile date 2025-09-23 00:00:00 Dec 15 04:52:00 localhost ceph-mon[298913]: rocksdb: DB SUMMARY Dec 15 04:52:00 localhost ceph-mon[298913]: rocksdb: DB Session ID: 0OJRM9SCUA16EXV0VQZ2 Dec 15 04:52:00 localhost ceph-mon[298913]: rocksdb: CURRENT file: CURRENT Dec 15 04:52:00 localhost ceph-mon[298913]: rocksdb: IDENTITY file: IDENTITY Dec 15 04:52:00 localhost ceph-mon[298913]: rocksdb: MANIFEST file: MANIFEST-000005 size: 59 Bytes Dec 15 04:52:00 localhost ceph-mon[298913]: rocksdb: SST files in /var/lib/ceph/mon/ceph-np0005559462/store.db dir, Total Num: 0, files: Dec 15 04:52:00 localhost ceph-mon[298913]: rocksdb: Write Ahead Log file in /var/lib/ceph/mon/ceph-np0005559462/store.db: 000004.log size: 886 ; Dec 15 04:52:00 localhost ceph-mon[298913]: rocksdb: Options.error_if_exists: 0 Dec 15 04:52:00 localhost ceph-mon[298913]: rocksdb: Options.create_if_missing: 0 Dec 15 04:52:00 localhost ceph-mon[298913]: rocksdb: Options.paranoid_checks: 1 Dec 15 04:52:00 localhost ceph-mon[298913]: rocksdb: Options.flush_verify_memtable_count: 1 Dec 15 04:52:00 localhost ceph-mon[298913]: rocksdb: Options.track_and_verify_wals_in_manifest: 0 Dec 15 04:52:00 localhost ceph-mon[298913]: rocksdb: Options.verify_sst_unique_id_in_manifest: 1 Dec 15 04:52:00 localhost ceph-mon[298913]: rocksdb: Options.env: 0x55e4c3e1d9e0 Dec 15 04:52:00 localhost ceph-mon[298913]: rocksdb: Options.fs: PosixFileSystem Dec 15 
04:52:00 localhost ceph-mon[298913]: rocksdb: Options.info_log: 0x55e4c4b00d20 Dec 15 04:52:00 localhost ceph-mon[298913]: rocksdb: Options.max_file_opening_threads: 16 Dec 15 04:52:00 localhost ceph-mon[298913]: rocksdb: Options.statistics: (nil) Dec 15 04:52:00 localhost ceph-mon[298913]: rocksdb: Options.use_fsync: 0 Dec 15 04:52:00 localhost ceph-mon[298913]: rocksdb: Options.max_log_file_size: 0 Dec 15 04:52:00 localhost ceph-mon[298913]: rocksdb: Options.max_manifest_file_size: 1073741824 Dec 15 04:52:00 localhost ceph-mon[298913]: rocksdb: Options.log_file_time_to_roll: 0 Dec 15 04:52:00 localhost ceph-mon[298913]: rocksdb: Options.keep_log_file_num: 1000 Dec 15 04:52:00 localhost ceph-mon[298913]: rocksdb: Options.recycle_log_file_num: 0 Dec 15 04:52:00 localhost ceph-mon[298913]: rocksdb: Options.allow_fallocate: 1 Dec 15 04:52:00 localhost ceph-mon[298913]: rocksdb: Options.allow_mmap_reads: 0 Dec 15 04:52:00 localhost ceph-mon[298913]: rocksdb: Options.allow_mmap_writes: 0 Dec 15 04:52:00 localhost ceph-mon[298913]: rocksdb: Options.use_direct_reads: 0 Dec 15 04:52:00 localhost ceph-mon[298913]: rocksdb: Options.use_direct_io_for_flush_and_compaction: 0 Dec 15 04:52:00 localhost ceph-mon[298913]: rocksdb: Options.create_missing_column_families: 0 Dec 15 04:52:00 localhost ceph-mon[298913]: rocksdb: Options.db_log_dir: Dec 15 04:52:00 localhost ceph-mon[298913]: rocksdb: Options.wal_dir: Dec 15 04:52:00 localhost ceph-mon[298913]: rocksdb: Options.table_cache_numshardbits: 6 Dec 15 04:52:00 localhost ceph-mon[298913]: rocksdb: Options.WAL_ttl_seconds: 0 Dec 15 04:52:00 localhost ceph-mon[298913]: rocksdb: Options.WAL_size_limit_MB: 0 Dec 15 04:52:00 localhost ceph-mon[298913]: rocksdb: Options.max_write_batch_group_size_bytes: 1048576 Dec 15 04:52:00 localhost ceph-mon[298913]: rocksdb: Options.manifest_preallocation_size: 4194304 Dec 15 04:52:00 localhost ceph-mon[298913]: rocksdb: Options.is_fd_close_on_exec: 1 Dec 15 04:52:00 localhost 
ceph-mon[298913]: rocksdb: Options.advise_random_on_open: 1 Dec 15 04:52:00 localhost ceph-mon[298913]: rocksdb: Options.db_write_buffer_size: 0 Dec 15 04:52:00 localhost ceph-mon[298913]: rocksdb: Options.write_buffer_manager: 0x55e4c4b11540 Dec 15 04:52:00 localhost ceph-mon[298913]: rocksdb: Options.access_hint_on_compaction_start: 1 Dec 15 04:52:00 localhost ceph-mon[298913]: rocksdb: Options.random_access_max_buffer_size: 1048576 Dec 15 04:52:00 localhost ceph-mon[298913]: rocksdb: Options.use_adaptive_mutex: 0 Dec 15 04:52:00 localhost ceph-mon[298913]: rocksdb: Options.rate_limiter: (nil) Dec 15 04:52:00 localhost ceph-mon[298913]: rocksdb: Options.sst_file_manager.rate_bytes_per_sec: 0 Dec 15 04:52:00 localhost ceph-mon[298913]: rocksdb: Options.wal_recovery_mode: 2 Dec 15 04:52:00 localhost ceph-mon[298913]: rocksdb: Options.enable_thread_tracking: 0 Dec 15 04:52:00 localhost ceph-mon[298913]: rocksdb: Options.enable_pipelined_write: 0 Dec 15 04:52:00 localhost ceph-mon[298913]: rocksdb: Options.unordered_write: 0 Dec 15 04:52:00 localhost ceph-mon[298913]: rocksdb: Options.allow_concurrent_memtable_write: 1 Dec 15 04:52:00 localhost ceph-mon[298913]: rocksdb: Options.enable_write_thread_adaptive_yield: 1 Dec 15 04:52:00 localhost ceph-mon[298913]: rocksdb: Options.write_thread_max_yield_usec: 100 Dec 15 04:52:00 localhost ceph-mon[298913]: rocksdb: Options.write_thread_slow_yield_usec: 3 Dec 15 04:52:00 localhost ceph-mon[298913]: rocksdb: Options.row_cache: None Dec 15 04:52:00 localhost ceph-mon[298913]: rocksdb: Options.wal_filter: None Dec 15 04:52:00 localhost ceph-mon[298913]: rocksdb: Options.avoid_flush_during_recovery: 0 Dec 15 04:52:00 localhost ceph-mon[298913]: rocksdb: Options.allow_ingest_behind: 0 Dec 15 04:52:00 localhost ceph-mon[298913]: rocksdb: Options.two_write_queues: 0 Dec 15 04:52:00 localhost ceph-mon[298913]: rocksdb: Options.manual_wal_flush: 0 Dec 15 04:52:00 localhost ceph-mon[298913]: rocksdb: Options.wal_compression: 0 Dec 
15 04:52:00 localhost ceph-mon[298913]: rocksdb: Options.atomic_flush: 0 Dec 15 04:52:00 localhost ceph-mon[298913]: rocksdb: Options.avoid_unnecessary_blocking_io: 0 Dec 15 04:52:00 localhost ceph-mon[298913]: rocksdb: Options.persist_stats_to_disk: 0 Dec 15 04:52:00 localhost ceph-mon[298913]: rocksdb: Options.write_dbid_to_manifest: 0 Dec 15 04:52:00 localhost ceph-mon[298913]: rocksdb: Options.log_readahead_size: 0 Dec 15 04:52:00 localhost ceph-mon[298913]: rocksdb: Options.file_checksum_gen_factory: Unknown Dec 15 04:52:00 localhost ceph-mon[298913]: rocksdb: Options.best_efforts_recovery: 0 Dec 15 04:52:00 localhost ceph-mon[298913]: rocksdb: Options.max_bgerror_resume_count: 2147483647 Dec 15 04:52:00 localhost ceph-mon[298913]: rocksdb: Options.bgerror_resume_retry_interval: 1000000 Dec 15 04:52:00 localhost ceph-mon[298913]: rocksdb: Options.allow_data_in_errors: 0 Dec 15 04:52:00 localhost ceph-mon[298913]: rocksdb: Options.db_host_id: __hostname__ Dec 15 04:52:00 localhost ceph-mon[298913]: rocksdb: Options.enforce_single_del_contracts: true Dec 15 04:52:00 localhost ceph-mon[298913]: rocksdb: Options.max_background_jobs: 2 Dec 15 04:52:00 localhost ceph-mon[298913]: rocksdb: Options.max_background_compactions: -1 Dec 15 04:52:00 localhost ceph-mon[298913]: rocksdb: Options.max_subcompactions: 1 Dec 15 04:52:00 localhost ceph-mon[298913]: rocksdb: Options.avoid_flush_during_shutdown: 0 Dec 15 04:52:00 localhost ceph-mon[298913]: rocksdb: Options.writable_file_max_buffer_size: 1048576 Dec 15 04:52:00 localhost ceph-mon[298913]: rocksdb: Options.delayed_write_rate : 16777216 Dec 15 04:52:00 localhost ceph-mon[298913]: rocksdb: Options.max_total_wal_size: 0 Dec 15 04:52:00 localhost ceph-mon[298913]: rocksdb: Options.delete_obsolete_files_period_micros: 21600000000 Dec 15 04:52:00 localhost ceph-mon[298913]: rocksdb: Options.stats_dump_period_sec: 600 Dec 15 04:52:00 localhost ceph-mon[298913]: rocksdb: Options.stats_persist_period_sec: 600 Dec 15 04:52:00 
localhost ceph-mon[298913]: rocksdb: Options.stats_history_buffer_size: 1048576 Dec 15 04:52:00 localhost ceph-mon[298913]: rocksdb: Options.max_open_files: -1 Dec 15 04:52:00 localhost ceph-mon[298913]: rocksdb: Options.bytes_per_sync: 0 Dec 15 04:52:00 localhost ceph-mon[298913]: rocksdb: Options.wal_bytes_per_sync: 0 Dec 15 04:52:00 localhost ceph-mon[298913]: rocksdb: Options.strict_bytes_per_sync: 0 Dec 15 04:52:00 localhost ceph-mon[298913]: rocksdb: Options.compaction_readahead_size: 0 Dec 15 04:52:00 localhost ceph-mon[298913]: rocksdb: Options.max_background_flushes: -1 Dec 15 04:52:00 localhost ceph-mon[298913]: rocksdb: Compression algorithms supported: Dec 15 04:52:00 localhost ceph-mon[298913]: rocksdb: #011kZSTD supported: 0 Dec 15 04:52:00 localhost ceph-mon[298913]: rocksdb: #011kXpressCompression supported: 0 Dec 15 04:52:00 localhost ceph-mon[298913]: rocksdb: #011kBZip2Compression supported: 0 Dec 15 04:52:00 localhost ceph-mon[298913]: rocksdb: #011kZSTDNotFinalCompression supported: 0 Dec 15 04:52:00 localhost ceph-mon[298913]: rocksdb: #011kLZ4Compression supported: 1 Dec 15 04:52:00 localhost ceph-mon[298913]: rocksdb: #011kZlibCompression supported: 1 Dec 15 04:52:00 localhost ceph-mon[298913]: rocksdb: #011kLZ4HCCompression supported: 1 Dec 15 04:52:00 localhost ceph-mon[298913]: rocksdb: #011kSnappyCompression supported: 1 Dec 15 04:52:00 localhost ceph-mon[298913]: rocksdb: Fast CRC32 supported: Supported on x86 Dec 15 04:52:00 localhost ceph-mon[298913]: rocksdb: DMutex implementation: pthread_mutex_t Dec 15 04:52:00 localhost ceph-mon[298913]: rocksdb: [db/version_set.cc:5527] Recovering from manifest file: /var/lib/ceph/mon/ceph-np0005559462/store.db/MANIFEST-000005 Dec 15 04:52:00 localhost ceph-mon[298913]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [default]: Dec 15 04:52:00 localhost ceph-mon[298913]: rocksdb: Options.comparator: leveldb.BytewiseComparator Dec 15 04:52:00 localhost 
ceph-mon[298913]: rocksdb: Options.merge_operator: Dec 15 04:52:00 localhost ceph-mon[298913]: rocksdb: Options.compaction_filter: None Dec 15 04:52:00 localhost ceph-mon[298913]: rocksdb: Options.compaction_filter_factory: None Dec 15 04:52:00 localhost ceph-mon[298913]: rocksdb: Options.sst_partitioner_factory: None Dec 15 04:52:00 localhost ceph-mon[298913]: rocksdb: Options.memtable_factory: SkipListFactory Dec 15 04:52:00 localhost ceph-mon[298913]: rocksdb: Options.table_factory: BlockBasedTable Dec 15 04:52:00 localhost ceph-mon[298913]: rocksdb: table_factory options: flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x55e4c4b00980)#012 cache_index_and_filter_blocks: 1#012 cache_index_and_filter_blocks_with_high_priority: 0#012 pin_l0_filter_and_index_blocks_in_cache: 0#012 pin_top_level_index_and_filter: 1#012 index_type: 0#012 data_block_index_type: 0#012 index_shortening: 1#012 data_block_hash_table_util_ratio: 0.750000#012 checksum: 4#012 no_block_cache: 0#012 block_cache: 0x55e4c4afd350#012 block_cache_name: BinnedLRUCache#012 block_cache_options:#012 capacity : 536870912#012 num_shard_bits : 4#012 strict_capacity_limit : 0#012 high_pri_pool_ratio: 0.000#012 block_cache_compressed: (nil)#012 persistent_cache: (nil)#012 block_size: 4096#012 block_size_deviation: 10#012 block_restart_interval: 16#012 index_block_restart_interval: 1#012 metadata_block_size: 4096#012 partition_filters: 0#012 use_delta_encoding: 1#012 filter_policy: bloomfilter#012 whole_key_filtering: 1#012 verify_compression: 0#012 read_amp_bytes_per_bit: 0#012 format_version: 5#012 enable_index_compression: 1#012 block_align: 0#012 max_auto_readahead_size: 262144#012 prepopulate_block_cache: 0#012 initial_auto_readahead_size: 8192#012 num_file_reads_for_auto_readahead: 2 Dec 15 04:52:00 localhost ceph-mon[298913]: rocksdb: Options.write_buffer_size: 33554432 Dec 15 04:52:00 localhost ceph-mon[298913]: rocksdb: Options.max_write_buffer_number: 2 Dec 15 04:52:00 localhost 
ceph-mon[298913]: rocksdb: Options.compression: NoCompression Dec 15 04:52:00 localhost ceph-mon[298913]: rocksdb: Options.bottommost_compression: Disabled Dec 15 04:52:00 localhost ceph-mon[298913]: rocksdb: Options.prefix_extractor: nullptr Dec 15 04:52:00 localhost ceph-mon[298913]: rocksdb: Options.memtable_insert_with_hint_prefix_extractor: nullptr Dec 15 04:52:00 localhost ceph-mon[298913]: rocksdb: Options.num_levels: 7 Dec 15 04:52:00 localhost ceph-mon[298913]: rocksdb: Options.min_write_buffer_number_to_merge: 1 Dec 15 04:52:00 localhost ceph-mon[298913]: rocksdb: Options.max_write_buffer_number_to_maintain: 0 Dec 15 04:52:00 localhost ceph-mon[298913]: rocksdb: Options.max_write_buffer_size_to_maintain: 0 Dec 15 04:52:00 localhost ceph-mon[298913]: rocksdb: Options.bottommost_compression_opts.window_bits: -14 Dec 15 04:52:00 localhost ceph-mon[298913]: rocksdb: Options.bottommost_compression_opts.level: 32767 Dec 15 04:52:00 localhost ceph-mon[298913]: rocksdb: Options.bottommost_compression_opts.strategy: 0 Dec 15 04:52:00 localhost ceph-mon[298913]: rocksdb: Options.bottommost_compression_opts.max_dict_bytes: 0 Dec 15 04:52:00 localhost ceph-mon[298913]: rocksdb: Options.bottommost_compression_opts.zstd_max_train_bytes: 0 Dec 15 04:52:00 localhost ceph-mon[298913]: rocksdb: Options.bottommost_compression_opts.parallel_threads: 1 Dec 15 04:52:00 localhost ceph-mon[298913]: rocksdb: Options.bottommost_compression_opts.enabled: false Dec 15 04:52:00 localhost ceph-mon[298913]: rocksdb: Options.bottommost_compression_opts.max_dict_buffer_bytes: 0 Dec 15 04:52:00 localhost ceph-mon[298913]: rocksdb: Options.bottommost_compression_opts.use_zstd_dict_trainer: true Dec 15 04:52:00 localhost ceph-mon[298913]: rocksdb: Options.compression_opts.window_bits: -14 Dec 15 04:52:00 localhost ceph-mon[298913]: rocksdb: Options.compression_opts.level: 32767 Dec 15 04:52:00 localhost ceph-mon[298913]: rocksdb: Options.compression_opts.strategy: 0 Dec 15 04:52:00 
localhost ceph-mon[298913]: rocksdb: Options.compression_opts.max_dict_bytes: 0 Dec 15 04:52:00 localhost ceph-mon[298913]: rocksdb: Options.compression_opts.zstd_max_train_bytes: 0 Dec 15 04:52:00 localhost ceph-mon[298913]: rocksdb: Options.compression_opts.use_zstd_dict_trainer: true Dec 15 04:52:00 localhost ceph-mon[298913]: rocksdb: Options.compression_opts.parallel_threads: 1 Dec 15 04:52:00 localhost ceph-mon[298913]: rocksdb: Options.compression_opts.enabled: false Dec 15 04:52:00 localhost ceph-mon[298913]: rocksdb: Options.compression_opts.max_dict_buffer_bytes: 0 Dec 15 04:52:00 localhost ceph-mon[298913]: rocksdb: Options.level0_file_num_compaction_trigger: 4 Dec 15 04:52:00 localhost ceph-mon[298913]: rocksdb: Options.level0_slowdown_writes_trigger: 20 Dec 15 04:52:00 localhost ceph-mon[298913]: rocksdb: Options.level0_stop_writes_trigger: 36 Dec 15 04:52:00 localhost ceph-mon[298913]: rocksdb: Options.target_file_size_base: 67108864 Dec 15 04:52:00 localhost ceph-mon[298913]: rocksdb: Options.target_file_size_multiplier: 1 Dec 15 04:52:00 localhost ceph-mon[298913]: rocksdb: Options.max_bytes_for_level_base: 268435456 Dec 15 04:52:00 localhost ceph-mon[298913]: rocksdb: Options.level_compaction_dynamic_level_bytes: 1 Dec 15 04:52:00 localhost ceph-mon[298913]: rocksdb: Options.max_bytes_for_level_multiplier: 10.000000 Dec 15 04:52:00 localhost ceph-mon[298913]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1 Dec 15 04:52:00 localhost ceph-mon[298913]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1 Dec 15 04:52:00 localhost ceph-mon[298913]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1 Dec 15 04:52:00 localhost ceph-mon[298913]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1 Dec 15 04:52:00 localhost ceph-mon[298913]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1 Dec 15 04:52:00 localhost ceph-mon[298913]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1 Dec 15 04:52:00 
localhost ceph-mon[298913]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1 Dec 15 04:52:00 localhost ceph-mon[298913]: rocksdb: Options.max_sequential_skip_in_iterations: 8 Dec 15 04:52:00 localhost ceph-mon[298913]: rocksdb: Options.max_compaction_bytes: 1677721600 Dec 15 04:52:00 localhost ceph-mon[298913]: rocksdb: Options.ignore_max_compaction_bytes_for_input: true Dec 15 04:52:00 localhost ceph-mon[298913]: rocksdb: Options.arena_block_size: 1048576 Dec 15 04:52:00 localhost ceph-mon[298913]: rocksdb: Options.soft_pending_compaction_bytes_limit: 68719476736 Dec 15 04:52:00 localhost ceph-mon[298913]: rocksdb: Options.hard_pending_compaction_bytes_limit: 274877906944 Dec 15 04:52:00 localhost ceph-mon[298913]: rocksdb: Options.disable_auto_compactions: 0 Dec 15 04:52:00 localhost ceph-mon[298913]: rocksdb: Options.compaction_style: kCompactionStyleLevel Dec 15 04:52:00 localhost ceph-mon[298913]: rocksdb: Options.compaction_pri: kMinOverlappingRatio Dec 15 04:52:00 localhost ceph-mon[298913]: rocksdb: Options.compaction_options_universal.size_ratio: 1 Dec 15 04:52:00 localhost ceph-mon[298913]: rocksdb: Options.compaction_options_universal.min_merge_width: 2 Dec 15 04:52:00 localhost ceph-mon[298913]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295 Dec 15 04:52:00 localhost ceph-mon[298913]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200 Dec 15 04:52:00 localhost ceph-mon[298913]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1 Dec 15 04:52:00 localhost ceph-mon[298913]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize Dec 15 04:52:00 localhost ceph-mon[298913]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824 Dec 15 04:52:00 localhost ceph-mon[298913]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0 Dec 15 04:52:00 localhost ceph-mon[298913]: rocksdb: 
Options.table_properties_collectors: Dec 15 04:52:00 localhost ceph-mon[298913]: rocksdb: Options.inplace_update_support: 0 Dec 15 04:52:00 localhost ceph-mon[298913]: rocksdb: Options.inplace_update_num_locks: 10000 Dec 15 04:52:00 localhost ceph-mon[298913]: rocksdb: Options.memtable_prefix_bloom_size_ratio: 0.000000 Dec 15 04:52:00 localhost ceph-mon[298913]: rocksdb: Options.memtable_whole_key_filtering: 0 Dec 15 04:52:00 localhost ceph-mon[298913]: rocksdb: Options.memtable_huge_page_size: 0 Dec 15 04:52:00 localhost ceph-mon[298913]: rocksdb: Options.bloom_locality: 0 Dec 15 04:52:00 localhost ceph-mon[298913]: rocksdb: Options.max_successive_merges: 0 Dec 15 04:52:00 localhost ceph-mon[298913]: rocksdb: Options.optimize_filters_for_hits: 0 Dec 15 04:52:00 localhost ceph-mon[298913]: rocksdb: Options.paranoid_file_checks: 0 Dec 15 04:52:00 localhost ceph-mon[298913]: rocksdb: Options.force_consistency_checks: 1 Dec 15 04:52:00 localhost ceph-mon[298913]: rocksdb: Options.report_bg_io_stats: 0 Dec 15 04:52:00 localhost ceph-mon[298913]: rocksdb: Options.ttl: 2592000 Dec 15 04:52:00 localhost ceph-mon[298913]: rocksdb: Options.periodic_compaction_seconds: 0 Dec 15 04:52:00 localhost ceph-mon[298913]: rocksdb: Options.preclude_last_level_data_seconds: 0 Dec 15 04:52:00 localhost ceph-mon[298913]: rocksdb: Options.preserve_internal_time_seconds: 0 Dec 15 04:52:00 localhost ceph-mon[298913]: rocksdb: Options.enable_blob_files: false Dec 15 04:52:00 localhost ceph-mon[298913]: rocksdb: Options.min_blob_size: 0 Dec 15 04:52:00 localhost ceph-mon[298913]: rocksdb: Options.blob_file_size: 268435456 Dec 15 04:52:00 localhost ceph-mon[298913]: rocksdb: Options.blob_compression_type: NoCompression Dec 15 04:52:00 localhost ceph-mon[298913]: rocksdb: Options.enable_blob_garbage_collection: false Dec 15 04:52:00 localhost ceph-mon[298913]: rocksdb: Options.blob_garbage_collection_age_cutoff: 0.250000 Dec 15 04:52:00 localhost ceph-mon[298913]: rocksdb: 
Options.blob_garbage_collection_force_threshold: 1.000000 Dec 15 04:52:00 localhost ceph-mon[298913]: rocksdb: Options.blob_compaction_readahead_size: 0 Dec 15 04:52:00 localhost ceph-mon[298913]: rocksdb: Options.blob_file_starting_level: 0 Dec 15 04:52:00 localhost ceph-mon[298913]: rocksdb: Options.experimental_mempurge_threshold: 0.000000 Dec 15 04:52:00 localhost ceph-mon[298913]: rocksdb: [db/version_set.cc:5566] Recovered from manifest file:/var/lib/ceph/mon/ceph-np0005559462/store.db/MANIFEST-000005 succeeded,manifest_file_number is 5, next_file_number is 7, last_sequence is 0, log_number is 0,prev_log_number is 0,max_column_family is 0,min_log_number_to_keep is 0 Dec 15 04:52:00 localhost ceph-mon[298913]: rocksdb: [db/version_set.cc:5581] Column family [default] (ID 0), log number is 0 Dec 15 04:52:00 localhost ceph-mon[298913]: rocksdb: [db/db_impl/db_impl_open.cc:539] DB ID: 603b24af-e2be-4214-bc56-9e652eb4af3d Dec 15 04:52:00 localhost ceph-mon[298913]: rocksdb: EVENT_LOG_v1 {"time_micros": 1765792320037593, "job": 1, "event": "recovery_started", "wal_files": [4]} Dec 15 04:52:00 localhost ceph-mon[298913]: rocksdb: [db/db_impl/db_impl_open.cc:1043] Recovering log #4 mode 2 Dec 15 04:52:00 localhost ceph-mon[298913]: rocksdb: EVENT_LOG_v1 {"time_micros": 1765792320041819, "cf_name": "default", "job": 1, "event": "table_file_creation", "file_number": 8, "file_size": 2012, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 1, "largest_seqno": 5, "table_properties": {"data_size": 898, "index_size": 31, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 69, "raw_key_size": 115, "raw_average_key_size": 23, "raw_value_size": 776, "raw_average_value_size": 155, "num_data_blocks": 1, "num_entries": 5, "num_filter_entries": 5, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": 
"bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1765792320, "oldest_key_time": 0, "file_creation_time": 0, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "603b24af-e2be-4214-bc56-9e652eb4af3d", "db_session_id": "0OJRM9SCUA16EXV0VQZ2", "orig_file_number": 8, "seqno_to_time_mapping": "N/A"}} Dec 15 04:52:00 localhost ceph-mon[298913]: rocksdb: EVENT_LOG_v1 {"time_micros": 1765792320042079, "job": 1, "event": "recovery_finished"} Dec 15 04:52:00 localhost ceph-mon[298913]: rocksdb: [db/version_set.cc:5047] Creating manifest 10 Dec 15 04:52:00 localhost ceph-mon[298913]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005559462/store.db/000004.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000 Dec 15 04:52:00 localhost ceph-mon[298913]: rocksdb: [db/db_impl/db_impl_open.cc:1987] SstFileManager instance 0x55e4c4b24e00 Dec 15 04:52:00 localhost ceph-mon[298913]: rocksdb: DB pointer 0x55e4c4c1a000 Dec 15 04:52:00 localhost ceph-mon[298913]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS ------- Dec 15 04:52:00 localhost ceph-mon[298913]: rocksdb: [db/db_impl/db_impl.cc:1111] #012** DB Stats **#012Uptime(secs): 0.0 total, 0.0 interval#012Cumulative writes: 0 writes, 0 keys, 0 commit groups, 0.0 writes per commit group, ingest: 0.00 GB, 0.00 MB/s#012Cumulative WAL: 0 writes, 0 syncs, 0.00 writes per sync, written: 0.00 GB, 0.00 MB/s#012Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent#012Interval writes: 0 writes, 0 keys, 0 commit groups, 0.0 writes per commit group, ingest: 0.00 MB, 
0.00 MB/s#012Interval WAL: 0 writes, 0 syncs, 0.00 writes per sync, written: 0.00 GB, 0.00 MB/s#012Interval stall: 00:00:0.000 H:M:S, 0.0 percent#012#012** Compaction Stats [default] **#012Level Files Size Score Read(GB) Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012 L0 1/0 1.96 KB 0.2 0.0 0.0 0.0 0.0 0.0 0.0 1.0 0.0 0.5 0.00 0.00 1 0.004 0 0 0.0 0.0#012 Sum 1/0 1.96 KB 0.0 0.0 0.0 0.0 0.0 0.0 0.0 1.0 0.0 0.5 0.00 0.00 1 0.004 0 0 0.0 0.0#012 Int 0/0 0.00 KB 0.0 0.0 0.0 0.0 0.0 0.0 0.0 1.0 0.0 0.5 0.00 0.00 1 0.004 0 0 0.0 0.0#012#012** Compaction Stats [default] **#012Priority Files Size Score Read(GB) Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012User 0/0 0.00 KB 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.5 0.00 0.00 1 0.004 0 0 0.0 0.0#012#012Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0#012#012Uptime(secs): 0.0 total, 0.0 interval#012Flush(GB): cumulative 0.000, interval 0.000#012AddFile(GB): cumulative 0.000, interval 0.000#012AddFile(Total Files): cumulative 0, interval 0#012AddFile(L0 Files): cumulative 0, interval 0#012AddFile(Keys): cumulative 0, interval 0#012Cumulative compaction: 0.00 GB write, 0.11 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds#012Interval compaction: 0.00 GB write, 0.11 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds#012Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 
level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count#012Block cache BinnedLRUCache@0x55e4c4afd350#2 capacity: 512.00 MB usage: 0.22 KB table_size: 0 occupancy: 18446744073709551615 collections: 1 last_copies: 0 last_secs: 1.5e-05 secs_since: 0#012Block cache entry stats(count,size,portion): FilterBlock(1,0.11 KB,2.08616e-05%) IndexBlock(1,0.11 KB,2.08616e-05%) Misc(1,0.00 KB,0%)#012#012** File Read Latency Histogram By Level [default] ** Dec 15 04:52:00 localhost ceph-mon[298913]: mon.np0005559462 does not exist in monmap, will attempt to join an existing cluster Dec 15 04:52:00 localhost ceph-mon[298913]: using public_addr v2:172.18.0.103:0/0 -> [v2:172.18.0.103:3300/0,v1:172.18.0.103:6789/0] Dec 15 04:52:00 localhost ceph-mon[298913]: starting mon.np0005559462 rank -1 at public addrs [v2:172.18.0.103:3300/0,v1:172.18.0.103:6789/0] at bind addrs [v2:172.18.0.103:3300/0,v1:172.18.0.103:6789/0] mon_data /var/lib/ceph/mon/ceph-np0005559462 fsid bce17446-41b5-5408-a23e-0b011906b44a Dec 15 04:52:00 localhost ceph-mon[298913]: mon.np0005559462@-1(???) 
e0 preinit fsid bce17446-41b5-5408-a23e-0b011906b44a Dec 15 04:52:00 localhost ceph-mon[298913]: mon.np0005559462@-1(synchronizing) e8 sync_obtain_latest_monmap Dec 15 04:52:00 localhost ceph-mon[298913]: mon.np0005559462@-1(synchronizing) e8 sync_obtain_latest_monmap obtained monmap e8 Dec 15 04:52:00 localhost nova_compute[286344]: 2025-12-15 09:52:00.289 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Dec 15 04:52:00 localhost nova_compute[286344]: 2025-12-15 09:52:00.291 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Dec 15 04:52:00 localhost nova_compute[286344]: 2025-12-15 09:52:00.291 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m Dec 15 04:52:00 localhost nova_compute[286344]: 2025-12-15 09:52:00.291 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Dec 15 04:52:00 localhost nova_compute[286344]: 2025-12-15 09:52:00.319 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 04:52:00 localhost nova_compute[286344]: 2025-12-15 09:52:00.319 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Dec 15 04:52:00 localhost ceph-mon[298913]: mon.np0005559462@-1(synchronizing).mds e17 new map Dec 15 04:52:00 localhost ceph-mon[298913]: mon.np0005559462@-1(synchronizing).mds e17 print_map#012e17#012enable_multiple, ever_enabled_multiple: 1,1#012default compat: compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode 
in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,8=no anchor table,9=file layout v2,10=snaprealm v2,12=quiesce subvolumes}#012legacy client fscid: 1#012 #012Filesystem 'cephfs' (1)#012fs_name#011cephfs#012epoch#01116#012flags#01112 joinable allow_snaps allow_multimds_snaps#012created#0112025-12-15T08:04:09.300216+0000#012modified#0112025-12-15T09:49:18.667189+0000#012tableserver#0110#012root#0110#012session_timeout#01160#012session_autoclose#011300#012max_file_size#0111099511627776#012required_client_features#011{}#012last_failure#0110#012last_failure_osd_epoch#01182#012compat#011compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,7=mds uses inline data,8=no anchor table,9=file layout v2,10=snaprealm v2,12=quiesce subvolumes}#012max_mds#0111#012in#0110#012up#011{0=26777}#012failed#011#012damaged#011#012stopped#011#012data_pools#011[6]#012metadata_pool#0117#012inline_data#011disabled#012balancer#011#012bal_rank_mask#011-1#012standby_count_wanted#0111#012qdb_cluster#011leader: 26777 members: 26777#012[mds.mds.np0005559463.rdpgze{0:26777} state up:active seq 13 addr [v2:172.18.0.107:6808/434194913,v1:172.18.0.107:6809/434194913] compat {c=[1],r=[1],i=[17ff]}]#012 #012 #012Standby daemons:#012 #012[mds.mds.np0005559462.mhigvc{-1:16959} state up:standby seq 1 addr [v2:172.18.0.106:6808/1713185344,v1:172.18.0.106:6809/1713185344] compat {c=[1],r=[1],i=[17ff]}]#012[mds.mds.np0005559464.piyuji{-1:26458} state up:standby seq 1 addr [v2:172.18.0.108:6808/2660138834,v1:172.18.0.108:6809/2660138834] compat {c=[1],r=[1],i=[17ff]}] Dec 15 04:52:00 localhost ceph-mon[298913]: mon.np0005559462@-1(synchronizing).osd e84 crush map has features 3314933000854323200, adjusting msgr requires Dec 15 04:52:00 localhost ceph-mon[298913]: mon.np0005559462@-1(synchronizing).osd e84 crush map has features 432629239337189376, 
adjusting msgr requires Dec 15 04:52:00 localhost ceph-mon[298913]: mon.np0005559462@-1(synchronizing).osd e84 crush map has features 432629239337189376, adjusting msgr requires Dec 15 04:52:00 localhost ceph-mon[298913]: mon.np0005559462@-1(synchronizing).osd e84 crush map has features 432629239337189376, adjusting msgr requires Dec 15 04:52:00 localhost ceph-mon[298913]: Reconfiguring mgr.np0005559463.daptkf (monmap changed)... Dec 15 04:52:00 localhost ceph-mon[298913]: Reconfiguring daemon mgr.np0005559463.daptkf on np0005559463.localdomain Dec 15 04:52:00 localhost ceph-mon[298913]: from='mgr.24103 ' entity='mgr.np0005559461.egwgzn' Dec 15 04:52:00 localhost ceph-mon[298913]: from='mgr.24103 ' entity='mgr.np0005559461.egwgzn' Dec 15 04:52:00 localhost ceph-mon[298913]: from='mgr.24103 172.18.0.105:0/2912776587' entity='mgr.np0005559461.egwgzn' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005559464.localdomain", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch Dec 15 04:52:00 localhost ceph-mon[298913]: from='mgr.24103 ' entity='mgr.np0005559461.egwgzn' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005559464.localdomain", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch Dec 15 04:52:00 localhost ceph-mon[298913]: Reconfiguring crash.np0005559464 (monmap changed)... Dec 15 04:52:00 localhost ceph-mon[298913]: Reconfiguring daemon crash.np0005559464 on np0005559464.localdomain Dec 15 04:52:00 localhost ceph-mon[298913]: from='mgr.24103 ' entity='mgr.np0005559461.egwgzn' Dec 15 04:52:00 localhost ceph-mon[298913]: from='mgr.24103 ' entity='mgr.np0005559461.egwgzn' Dec 15 04:52:00 localhost ceph-mon[298913]: from='mgr.24103 172.18.0.105:0/2912776587' entity='mgr.np0005559461.egwgzn' cmd={"prefix": "auth get", "entity": "osd.1"} : dispatch Dec 15 04:52:00 localhost ceph-mon[298913]: Reconfiguring osd.1 (monmap changed)... 
Dec 15 04:52:00 localhost ceph-mon[298913]: Reconfiguring daemon osd.1 on np0005559464.localdomain Dec 15 04:52:00 localhost ceph-mon[298913]: from='mgr.24103 ' entity='mgr.np0005559461.egwgzn' Dec 15 04:52:00 localhost ceph-mon[298913]: from='mgr.24103 ' entity='mgr.np0005559461.egwgzn' Dec 15 04:52:00 localhost ceph-mon[298913]: from='mgr.24103 172.18.0.105:0/2912776587' entity='mgr.np0005559461.egwgzn' cmd={"prefix": "auth get", "entity": "osd.4"} : dispatch Dec 15 04:52:00 localhost ceph-mon[298913]: Reconfiguring osd.4 (monmap changed)... Dec 15 04:52:00 localhost ceph-mon[298913]: Reconfiguring daemon osd.4 on np0005559464.localdomain Dec 15 04:52:00 localhost ceph-mon[298913]: from='mgr.24103 ' entity='mgr.np0005559461.egwgzn' Dec 15 04:52:00 localhost ceph-mon[298913]: from='mgr.24103 ' entity='mgr.np0005559461.egwgzn' Dec 15 04:52:00 localhost ceph-mon[298913]: from='mgr.24103 172.18.0.105:0/2912776587' entity='mgr.np0005559461.egwgzn' cmd={"prefix": "auth get-or-create", "entity": "mds.mds.np0005559464.piyuji", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]} : dispatch Dec 15 04:52:00 localhost ceph-mon[298913]: from='mgr.24103 ' entity='mgr.np0005559461.egwgzn' cmd={"prefix": "auth get-or-create", "entity": "mds.mds.np0005559464.piyuji", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]} : dispatch Dec 15 04:52:00 localhost ceph-mon[298913]: Reconfiguring mds.mds.np0005559464.piyuji (monmap changed)... 
Dec 15 04:52:00 localhost ceph-mon[298913]: Reconfiguring daemon mds.mds.np0005559464.piyuji on np0005559464.localdomain Dec 15 04:52:00 localhost ceph-mon[298913]: from='mgr.24103 ' entity='mgr.np0005559461.egwgzn' Dec 15 04:52:00 localhost ceph-mon[298913]: from='mgr.24103 ' entity='mgr.np0005559461.egwgzn' Dec 15 04:52:00 localhost ceph-mon[298913]: from='mgr.24103 172.18.0.105:0/2912776587' entity='mgr.np0005559461.egwgzn' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005559464.aomnqe", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch Dec 15 04:52:00 localhost ceph-mon[298913]: from='mgr.24103 ' entity='mgr.np0005559461.egwgzn' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005559464.aomnqe", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch Dec 15 04:52:00 localhost ceph-mon[298913]: Reconfiguring mgr.np0005559464.aomnqe (monmap changed)... Dec 15 04:52:00 localhost ceph-mon[298913]: Reconfiguring daemon mgr.np0005559464.aomnqe on np0005559464.localdomain Dec 15 04:52:00 localhost ceph-mon[298913]: from='mgr.24103 ' entity='mgr.np0005559461.egwgzn' Dec 15 04:52:00 localhost ceph-mon[298913]: from='mgr.24103 ' entity='mgr.np0005559461.egwgzn' Dec 15 04:52:00 localhost ceph-mon[298913]: from='mgr.24103 ' entity='mgr.np0005559461.egwgzn' Dec 15 04:52:00 localhost ceph-mon[298913]: from='mgr.24103 172.18.0.105:0/2912776587' entity='mgr.np0005559461.egwgzn' cmd={"prefix": "auth get", "entity": "mon."} : dispatch Dec 15 04:52:00 localhost ceph-mon[298913]: Deploying daemon mon.np0005559462 on np0005559462.localdomain Dec 15 04:52:00 localhost ceph-mon[298913]: from='mgr.24103 ' entity='mgr.np0005559461.egwgzn' Dec 15 04:52:00 localhost ceph-mon[298913]: from='mgr.24103 ' entity='mgr.np0005559461.egwgzn' Dec 15 04:52:00 localhost ceph-mon[298913]: from='mgr.24103 172.18.0.105:0/2912776587' entity='mgr.np0005559461.egwgzn' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch Dec 15 
04:52:00 localhost ceph-mon[298913]: Updating np0005559460.localdomain:/etc/ceph/ceph.conf Dec 15 04:52:00 localhost ceph-mon[298913]: Updating np0005559461.localdomain:/etc/ceph/ceph.conf Dec 15 04:52:00 localhost ceph-mon[298913]: Updating np0005559462.localdomain:/etc/ceph/ceph.conf Dec 15 04:52:00 localhost ceph-mon[298913]: Updating np0005559463.localdomain:/etc/ceph/ceph.conf Dec 15 04:52:00 localhost ceph-mon[298913]: Updating np0005559464.localdomain:/etc/ceph/ceph.conf Dec 15 04:52:00 localhost ceph-mon[298913]: Updating np0005559461.localdomain:/var/lib/ceph/bce17446-41b5-5408-a23e-0b011906b44a/config/ceph.conf Dec 15 04:52:00 localhost ceph-mon[298913]: Updating np0005559463.localdomain:/var/lib/ceph/bce17446-41b5-5408-a23e-0b011906b44a/config/ceph.conf Dec 15 04:52:00 localhost ceph-mon[298913]: Updating np0005559464.localdomain:/var/lib/ceph/bce17446-41b5-5408-a23e-0b011906b44a/config/ceph.conf Dec 15 04:52:00 localhost ceph-mon[298913]: Updating np0005559460.localdomain:/var/lib/ceph/bce17446-41b5-5408-a23e-0b011906b44a/config/ceph.conf Dec 15 04:52:00 localhost ceph-mon[298913]: from='mgr.24103 ' entity='mgr.np0005559461.egwgzn' Dec 15 04:52:00 localhost ceph-mon[298913]: from='mgr.24103 ' entity='mgr.np0005559461.egwgzn' Dec 15 04:52:00 localhost ceph-mon[298913]: from='mgr.24103 ' entity='mgr.np0005559461.egwgzn' Dec 15 04:52:00 localhost ceph-mon[298913]: from='mgr.24103 ' entity='mgr.np0005559461.egwgzn' Dec 15 04:52:00 localhost ceph-mon[298913]: from='mgr.24103 ' entity='mgr.np0005559461.egwgzn' Dec 15 04:52:00 localhost ceph-mon[298913]: from='mgr.24103 ' entity='mgr.np0005559461.egwgzn' Dec 15 04:52:00 localhost ceph-mon[298913]: from='mgr.24103 ' entity='mgr.np0005559461.egwgzn' Dec 15 04:52:00 localhost ceph-mon[298913]: from='mgr.24103 ' entity='mgr.np0005559461.egwgzn' Dec 15 04:52:00 localhost ceph-mon[298913]: from='mgr.24103 ' entity='mgr.np0005559461.egwgzn' Dec 15 04:52:00 localhost ceph-mon[298913]: from='mgr.24103 ' 
entity='mgr.np0005559461.egwgzn' Dec 15 04:52:00 localhost ceph-mon[298913]: mon.np0005559462@-1(synchronizing).paxosservice(auth 1..37) refresh upgraded, format 0 -> 3 Dec 15 04:52:00 localhost ceph-mgr[292421]: ms_deliver_dispatch: unhandled message 0x55975f446000 mon_map magic: 0 from mon.3 v2:172.18.0.107:3300/0 Dec 15 04:52:00 localhost systemd[1]: Started /usr/bin/podman healthcheck run 4d46c406a3855fd1c322e5d86c048188ea5865daa07ba23aaebacc7879668923. Dec 15 04:52:00 localhost podman[299006]: 2025-12-15 09:52:00.745264099 +0000 UTC m=+0.076851875 container health_status 4d46c406a3855fd1c322e5d86c048188ea5865daa07ba23aaebacc7879668923 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '07f3af15d79276418f54a8dde2fc251064527c3571430e14af77aaa2c01f1193-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', 
'/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3) Dec 15 04:52:00 localhost ceph-mon[298913]: mon.np0005559462@-1(probing) e8 handle_auth_request failed to assign global_id Dec 15 04:52:00 localhost podman[299006]: 2025-12-15 09:52:00.778459724 +0000 UTC m=+0.110047510 container exec_died 4d46c406a3855fd1c322e5d86c048188ea5865daa07ba23aaebacc7879668923 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '07f3af15d79276418f54a8dde2fc251064527c3571430e14af77aaa2c01f1193-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', 
'/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_managed=true) Dec 15 04:52:00 localhost systemd[1]: 4d46c406a3855fd1c322e5d86c048188ea5865daa07ba23aaebacc7879668923.service: Deactivated successfully. Dec 15 04:52:00 localhost ceph-mon[298913]: mon.np0005559462@-1(probing) e8 handle_auth_request failed to assign global_id Dec 15 04:52:01 localhost nova_compute[286344]: 2025-12-15 09:52:01.270 286348 DEBUG oslo_service.periodic_task [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Dec 15 04:52:01 localhost nova_compute[286344]: 2025-12-15 09:52:01.271 286348 DEBUG oslo_service.periodic_task [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Dec 15 04:52:01 localhost nova_compute[286344]: 2025-12-15 09:52:01.299 286348 DEBUG oslo_concurrency.lockutils [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Dec 15 04:52:01 localhost nova_compute[286344]: 2025-12-15 09:52:01.300 286348 DEBUG oslo_concurrency.lockutils [None 
req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Dec 15 04:52:01 localhost nova_compute[286344]: 2025-12-15 09:52:01.300 286348 DEBUG oslo_concurrency.lockutils [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Dec 15 04:52:01 localhost nova_compute[286344]: 2025-12-15 09:52:01.300 286348 DEBUG nova.compute.resource_tracker [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Auditing locally available compute resources for np0005559462.localdomain (node: np0005559462.localdomain) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m Dec 15 04:52:01 localhost nova_compute[286344]: 2025-12-15 09:52:01.301 286348 DEBUG oslo_concurrency.processutils [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m Dec 15 04:52:01 localhost ceph-mon[298913]: mon.np0005559462@-1(probing) e8 handle_auth_request failed to assign global_id Dec 15 04:52:01 localhost ceph-mon[298913]: mon.np0005559462@-1(probing) e8 handle_auth_request failed to assign global_id Dec 15 04:52:01 localhost ceph-mon[298913]: mon.np0005559462@-1(probing) e8 handle_auth_request failed to assign global_id Dec 15 04:52:01 localhost podman[243449]: time="2025-12-15T09:52:01Z" level=info msg="List containers: received `last` parameter - overwriting `limit`" Dec 15 04:52:01 localhost podman[243449]: @ - - [15/Dec/2025:09:52:01 +0000] "GET 
/v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 154814 "" "Go-http-client/1.1" Dec 15 04:52:01 localhost podman[243449]: @ - - [15/Dec/2025:09:52:01 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 18684 "" "Go-http-client/1.1" Dec 15 04:52:02 localhost ceph-mon[298913]: mon.np0005559462@-1(probing) e8 handle_auth_request failed to assign global_id Dec 15 04:52:02 localhost ceph-mon[298913]: mon.np0005559462@-1(probing) e8 handle_auth_request failed to assign global_id Dec 15 04:52:02 localhost ceph-mon[298913]: mon.np0005559462@-1(probing) e9 my rank is now 4 (was -1) Dec 15 04:52:02 localhost ceph-mon[298913]: log_channel(cluster) log [INF] : mon.np0005559462 calling monitor election Dec 15 04:52:02 localhost ceph-mon[298913]: paxos.4).electionLogic(0) init, first boot, initializing epoch at 1 Dec 15 04:52:02 localhost ceph-mon[298913]: mon.np0005559462@4(electing) e9 collect_metadata vda: no unique device id for vda: fallback method has no model nor serial Dec 15 04:52:02 localhost ceph-mon[298913]: mon.np0005559462@4(electing) e9 handle_auth_request failed to assign global_id Dec 15 04:52:03 localhost ceph-mon[298913]: mon.np0005559462@4(electing) e9 handle_auth_request failed to assign global_id Dec 15 04:52:03 localhost ceph-mon[298913]: mon.np0005559462@4(electing) e9 handle_auth_request failed to assign global_id Dec 15 04:52:04 localhost ceph-mon[298913]: mon.np0005559462@4(electing) e9 handle_auth_request failed to assign global_id Dec 15 04:52:04 localhost ceph-mon[298913]: mon.np0005559462@4(electing) e9 handle_auth_request failed to assign global_id Dec 15 04:52:04 localhost ceph-mon[298913]: mon.np0005559462@4(electing) e9 handle_auth_request failed to assign global_id Dec 15 04:52:04 localhost openstack_network_exporter[246484]: ERROR 09:52:04 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd Dec 15 
04:52:04 localhost openstack_network_exporter[246484]: ERROR 09:52:04 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server Dec 15 04:52:04 localhost openstack_network_exporter[246484]: ERROR 09:52:04 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd Dec 15 04:52:04 localhost openstack_network_exporter[246484]: ERROR 09:52:04 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath Dec 15 04:52:04 localhost openstack_network_exporter[246484]: Dec 15 04:52:04 localhost openstack_network_exporter[246484]: ERROR 09:52:04 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath Dec 15 04:52:04 localhost openstack_network_exporter[246484]: Dec 15 04:52:05 localhost ceph-mon[298913]: mon.np0005559462@4(electing) e9 handle_auth_request failed to assign global_id Dec 15 04:52:05 localhost ceph-mon[298913]: mon.np0005559462@4(electing) e9 handle_auth_request failed to assign global_id Dec 15 04:52:05 localhost ceph-mon[298913]: mon.np0005559462@4(electing) e9 handle_auth_request failed to assign global_id Dec 15 04:52:05 localhost nova_compute[286344]: 2025-12-15 09:52:05.343 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Dec 15 04:52:05 localhost nova_compute[286344]: 2025-12-15 09:52:05.346 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 04:52:05 localhost nova_compute[286344]: 2025-12-15 09:52:05.346 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5027 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m Dec 15 04:52:05 localhost nova_compute[286344]: 2025-12-15 09:52:05.347 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition 
/usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Dec 15 04:52:05 localhost nova_compute[286344]: 2025-12-15 09:52:05.348 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Dec 15 04:52:05 localhost nova_compute[286344]: 2025-12-15 09:52:05.352 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 04:52:05 localhost ceph-mon[298913]: mon.np0005559462@4(electing) e9 handle_auth_request failed to assign global_id Dec 15 04:52:05 localhost ceph-mon[298913]: mon.np0005559462@4(electing) e9 handle_auth_request failed to assign global_id Dec 15 04:52:05 localhost ceph-mon[298913]: mon.np0005559462@4(electing) e9 handle_auth_request failed to assign global_id Dec 15 04:52:05 localhost ceph-mon[298913]: mon.np0005559462@4(electing) e9 handle_auth_request failed to assign global_id Dec 15 04:52:06 localhost nova_compute[286344]: 2025-12-15 09:52:06.179 286348 DEBUG oslo_concurrency.processutils [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 4.879s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m Dec 15 04:52:06 localhost nova_compute[286344]: 2025-12-15 09:52:06.251 286348 DEBUG nova.virt.libvirt.driver [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m Dec 15 04:52:06 localhost nova_compute[286344]: 2025-12-15 09:52:06.251 286348 DEBUG nova.virt.libvirt.driver [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config 
/usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m Dec 15 04:52:06 localhost nova_compute[286344]: 2025-12-15 09:52:06.456 286348 WARNING nova.virt.libvirt.driver [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m Dec 15 04:52:06 localhost nova_compute[286344]: 2025-12-15 09:52:06.458 286348 DEBUG nova.compute.resource_tracker [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Hypervisor/Node resource view: name=np0005559462.localdomain free_ram=11760MB free_disk=41.836978912353516GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", 
"numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m Dec 15 04:52:06 localhost nova_compute[286344]: 2025-12-15 09:52:06.459 286348 DEBUG oslo_concurrency.lockutils [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Dec 15 04:52:06 localhost nova_compute[286344]: 2025-12-15 09:52:06.459 286348 DEBUG oslo_concurrency.lockutils [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Dec 15 04:52:06 localhost nova_compute[286344]: 2025-12-15 09:52:06.544 286348 DEBUG nova.compute.resource_tracker [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Instance 39ff1bd9-6f6b-44c8-bbec-a1fd9d196359 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 2, 'MEMORY_MB': 512, 'VCPU': 1}}. 
_remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m Dec 15 04:52:06 localhost nova_compute[286344]: 2025-12-15 09:52:06.544 286348 DEBUG nova.compute.resource_tracker [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m Dec 15 04:52:06 localhost nova_compute[286344]: 2025-12-15 09:52:06.545 286348 DEBUG nova.compute.resource_tracker [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Final resource view: name=np0005559462.localdomain phys_ram=15738MB used_ram=1024MB phys_disk=41GB used_disk=2GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m Dec 15 04:52:06 localhost nova_compute[286344]: 2025-12-15 09:52:06.591 286348 DEBUG oslo_concurrency.processutils [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m Dec 15 04:52:06 localhost ceph-mon[298913]: mon.np0005559462@4(electing) e9 collect_metadata vda: no unique device id for vda: fallback method has no model nor serial Dec 15 04:52:06 localhost ceph-mon[298913]: mon.np0005559462@4(electing) e9 collect_metadata vda: no unique device id for vda: fallback method has no model nor serial Dec 15 04:52:06 localhost ceph-mon[298913]: mon.np0005559462@4(electing) e9 collect_metadata vda: no unique device id for vda: fallback method has no model nor serial Dec 15 04:52:06 localhost ceph-mon[298913]: mon.np0005559462@4(electing) e9 collect_metadata vda: no unique device id for vda: fallback method has no model nor serial Dec 15 04:52:06 localhost ceph-mon[298913]: mon.np0005559462@4(peon) e9 _apply_compatset_features 
enabling new quorum features: compat={},rocompat={},incompat={4=support erasure code pools,5=new-style osdmap encoding,6=support isa/lrc erasure code,7=support shec erasure code} Dec 15 04:52:06 localhost ceph-mon[298913]: mon.np0005559462@4(peon) e9 _apply_compatset_features enabling new quorum features: compat={},rocompat={},incompat={8=support monmap features,9=luminous ondisk layout,10=mimic ondisk layout,11=nautilus ondisk layout,12=octopus ondisk layout,13=pacific ondisk layout,14=quincy ondisk layout,15=reef ondisk layout} Dec 15 04:52:06 localhost ceph-mon[298913]: mon.np0005559464 calling monitor election Dec 15 04:52:06 localhost ceph-mon[298913]: mon.np0005559461 calling monitor election Dec 15 04:52:06 localhost ceph-mon[298913]: mon.np0005559460 calling monitor election Dec 15 04:52:06 localhost ceph-mon[298913]: mon.np0005559461 is new leader, mons np0005559461,np0005559460,np0005559464 in quorum (ranks 0,1,2) Dec 15 04:52:06 localhost ceph-mon[298913]: Health check failed: 2/5 mons down, quorum np0005559461,np0005559460,np0005559464 (MON_DOWN) Dec 15 04:52:06 localhost ceph-mon[298913]: Health detail: HEALTH_WARN 2/5 mons down, quorum np0005559461,np0005559460,np0005559464 Dec 15 04:52:06 localhost ceph-mon[298913]: [WRN] MON_DOWN: 2/5 mons down, quorum np0005559461,np0005559460,np0005559464 Dec 15 04:52:06 localhost ceph-mon[298913]: mon.np0005559463 (rank 3) addr [v2:172.18.0.107:3300/0,v1:172.18.0.107:6789/0] is down (out of quorum) Dec 15 04:52:06 localhost ceph-mon[298913]: mon.np0005559462 (rank 4) addr [v2:172.18.0.103:3300/0,v1:172.18.0.103:6789/0] is down (out of quorum) Dec 15 04:52:06 localhost ceph-mon[298913]: from='mgr.24103 ' entity='mgr.np0005559461.egwgzn' Dec 15 04:52:06 localhost ceph-mon[298913]: mon.np0005559462@4(peon) e9 collect_metadata vda: no unique device id for vda: fallback method has no model nor serial Dec 15 04:52:06 localhost ceph-mon[298913]: mgrc update_daemon_metadata mon.np0005559462 metadata 
{addrs=[v2:172.18.0.103:3300/0,v1:172.18.0.103:6789/0],arch=x86_64,ceph_release=reef,ceph_version=ceph version 18.2.1-361.el9cp (439dcd6094d413840eb2ec590fe2194ec616687f) reef (stable),ceph_version_short=18.2.1-361.el9cp,compression_algorithms=none, snappy, zlib, zstd, lz4,container_hostname=np0005559462.localdomain,container_image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest,cpu=AMD EPYC-Rome Processor,device_ids=,device_paths=vda=/dev/disk/by-path/pci-0000:00:04.0,devices=vda,distro=rhel,distro_description=Red Hat Enterprise Linux 9.7 (Plow),distro_version=9.7,hostname=np0005559462.localdomain,kernel_description=#1 SMP PREEMPT_DYNAMIC Wed Apr 12 10:45:03 EDT 2023,kernel_version=5.14.0-284.11.1.el9_2.x86_64,mem_swap_kb=1048572,mem_total_kb=16116604,os=Linux} Dec 15 04:52:06 localhost ceph-mon[298913]: mon.np0005559462@4(peon) e9 handle_auth_request failed to assign global_id Dec 15 04:52:06 localhost ceph-mon[298913]: mon.np0005559462@4(peon) e9 handle_auth_request failed to assign global_id Dec 15 04:52:06 localhost ceph-mon[298913]: mon.np0005559462@4(peon) e9 handle_auth_request failed to assign global_id Dec 15 04:52:06 localhost ceph-mon[298913]: mon.np0005559463 calling monitor election Dec 15 04:52:06 localhost ceph-mon[298913]: mon.np0005559462 calling monitor election Dec 15 04:52:06 localhost ceph-mon[298913]: mon.np0005559464 calling monitor election Dec 15 04:52:06 localhost ceph-mon[298913]: mon.np0005559460 calling monitor election Dec 15 04:52:06 localhost ceph-mon[298913]: mon.np0005559461 calling monitor election Dec 15 04:52:06 localhost ceph-mon[298913]: mon.np0005559461 is new leader, mons np0005559461,np0005559460,np0005559464,np0005559463,np0005559462 in quorum (ranks 0,1,2,3,4) Dec 15 04:52:06 localhost ceph-mon[298913]: Health check cleared: MON_DOWN (was: 2/5 mons down, quorum np0005559461,np0005559460,np0005559464) Dec 15 04:52:06 localhost ceph-mon[298913]: Cluster is now healthy Dec 15 04:52:06 localhost ceph-mon[298913]: overall 
HEALTH_OK Dec 15 04:52:07 localhost nova_compute[286344]: 2025-12-15 09:52:07.041 286348 DEBUG oslo_concurrency.processutils [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.449s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422 Dec 15 04:52:07 localhost nova_compute[286344]: 2025-12-15 09:52:07.048 286348 DEBUG nova.compute.provider_tree [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Inventory has not changed in ProviderTree for provider: 26c8956b-6742-4951-b566-971b9bbe323b update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180 Dec 15 04:52:07 localhost nova_compute[286344]: 2025-12-15 09:52:07.066 286348 DEBUG nova.scheduler.client.report [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Inventory has not changed for provider 26c8956b-6742-4951-b566-971b9bbe323b based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940 Dec 15 04:52:07 localhost nova_compute[286344]: 2025-12-15 09:52:07.068 286348 DEBUG nova.compute.resource_tracker [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Compute_service record updated for np0005559462.localdomain:np0005559462.localdomain _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995 Dec 15 04:52:07 localhost nova_compute[286344]: 2025-12-15 09:52:07.069 286348 DEBUG oslo_concurrency.lockutils [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Lock 
"compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.610s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423 Dec 15 04:52:07 localhost systemd[1]: Started /usr/bin/podman healthcheck run a4e94bd7aaee6c174c601060fb5f34559019ccea93fc397263ee3735731cfc1e. Dec 15 04:52:07 localhost podman[299086]: 2025-12-15 09:52:07.750597271 +0000 UTC m=+0.080306830 container health_status a4e94bd7aaee6c174c601060fb5f34559019ccea93fc397263ee3735731cfc1e (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, managed_by=edpm_ansible, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi ) Dec 15 04:52:07 localhost podman[299086]: 2025-12-15 09:52:07.764726335 +0000 UTC m=+0.094435894 container exec_died a4e94bd7aaee6c174c601060fb5f34559019ccea93fc397263ee3735731cfc1e (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck 
podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible) Dec 15 04:52:07 localhost systemd[1]: a4e94bd7aaee6c174c601060fb5f34559019ccea93fc397263ee3735731cfc1e.service: Deactivated successfully. Dec 15 04:52:08 localhost ceph-mon[298913]: from='mgr.24103 ' entity='mgr.np0005559461.egwgzn' Dec 15 04:52:08 localhost ceph-mon[298913]: from='mgr.24103 ' entity='mgr.np0005559461.egwgzn' Dec 15 04:52:08 localhost ceph-mon[298913]: Reconfiguring mgr.np0005559460.oexkup (monmap changed)... Dec 15 04:52:08 localhost ceph-mon[298913]: from='mgr.24103 172.18.0.105:0/2912776587' entity='mgr.np0005559461.egwgzn' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005559460.oexkup", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch Dec 15 04:52:08 localhost ceph-mon[298913]: from='mgr.24103 ' entity='mgr.np0005559461.egwgzn' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005559460.oexkup", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch Dec 15 04:52:08 localhost ceph-mon[298913]: Reconfiguring daemon mgr.np0005559460.oexkup on np0005559460.localdomain Dec 15 04:52:08 localhost ceph-mon[298913]: from='mgr.24103 ' entity='mgr.np0005559461.egwgzn' Dec 15 04:52:08 localhost ceph-mon[298913]: from='mgr.24103 ' entity='mgr.np0005559461.egwgzn' Dec 15 04:52:08 localhost ceph-mon[298913]: from='mgr.24103 ' entity='mgr.np0005559461.egwgzn' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005559461.egwgzn", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow 
*"]} : dispatch Dec 15 04:52:08 localhost ceph-mon[298913]: from='mgr.24103 172.18.0.105:0/2912776587' entity='mgr.np0005559461.egwgzn' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005559461.egwgzn", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch Dec 15 04:52:08 localhost nova_compute[286344]: 2025-12-15 09:52:08.066 286348 DEBUG oslo_service.periodic_task [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210 Dec 15 04:52:08 localhost nova_compute[286344]: 2025-12-15 09:52:08.067 286348 DEBUG oslo_service.periodic_task [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210 Dec 15 04:52:08 localhost nova_compute[286344]: 2025-12-15 09:52:08.067 286348 DEBUG nova.compute.manager [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858 Dec 15 04:52:08 localhost nova_compute[286344]: 2025-12-15 09:52:08.067 286348 DEBUG nova.compute.manager [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862 Dec 15 04:52:08 localhost nova_compute[286344]: 2025-12-15 09:52:08.654 286348 DEBUG oslo_concurrency.lockutils [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Acquiring lock "refresh_cache-39ff1bd9-6f6b-44c8-bbec-a1fd9d196359" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312 Dec 15 04:52:08 localhost nova_compute[286344]: 2025-12-15 09:52:08.654 286348 DEBUG oslo_concurrency.lockutils [None 
req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Acquired lock "refresh_cache-39ff1bd9-6f6b-44c8-bbec-a1fd9d196359" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315 Dec 15 04:52:08 localhost nova_compute[286344]: 2025-12-15 09:52:08.654 286348 DEBUG nova.network.neutron [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] [instance: 39ff1bd9-6f6b-44c8-bbec-a1fd9d196359] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004 Dec 15 04:52:08 localhost nova_compute[286344]: 2025-12-15 09:52:08.654 286348 DEBUG nova.objects.instance [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Lazy-loading 'info_cache' on Instance uuid 39ff1bd9-6f6b-44c8-bbec-a1fd9d196359 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105 Dec 15 04:52:09 localhost ceph-mon[298913]: Reconfiguring mgr.np0005559461.egwgzn (monmap changed)... Dec 15 04:52:09 localhost ceph-mon[298913]: Reconfiguring daemon mgr.np0005559461.egwgzn on np0005559461.localdomain Dec 15 04:52:09 localhost ceph-mon[298913]: from='mgr.24103 ' entity='mgr.np0005559461.egwgzn' Dec 15 04:52:09 localhost ceph-mon[298913]: from='mgr.24103 ' entity='mgr.np0005559461.egwgzn' Dec 15 04:52:09 localhost ceph-mon[298913]: from='mgr.24103 172.18.0.105:0/2912776587' entity='mgr.np0005559461.egwgzn' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005559461.localdomain", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch Dec 15 04:52:09 localhost ceph-mon[298913]: from='mgr.24103 ' entity='mgr.np0005559461.egwgzn' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005559461.localdomain", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch Dec 15 04:52:09 localhost nova_compute[286344]: 2025-12-15 09:52:09.044 286348 DEBUG nova.network.neutron [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] 
[instance: 39ff1bd9-6f6b-44c8-bbec-a1fd9d196359] Updating instance_info_cache with network_info: [{"id": "03ef8889-3216-43fb-8a52-4be17a956ce1", "address": "fa:16:3e:74:df:7c", "network": {"id": "befb7a72-17a9-4bcb-b561-84b8f626685a", "bridge": "br-int", "label": "private", "subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.201", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.20", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.0.3"}}], "meta": {"injected": false, "tenant_id": "c785bf23f53946bc99867d8832a50266", "mtu": 1292, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap03ef8889-32", "ovs_interfaceid": "03ef8889-3216-43fb-8a52-4be17a956ce1", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116 Dec 15 04:52:09 localhost nova_compute[286344]: 2025-12-15 09:52:09.063 286348 DEBUG oslo_concurrency.lockutils [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Releasing lock "refresh_cache-39ff1bd9-6f6b-44c8-bbec-a1fd9d196359" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333 Dec 15 04:52:09 localhost nova_compute[286344]: 2025-12-15 09:52:09.063 286348 DEBUG nova.compute.manager [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] [instance: 39ff1bd9-6f6b-44c8-bbec-a1fd9d196359] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929 Dec 15 04:52:09 localhost 
nova_compute[286344]: 2025-12-15 09:52:09.064 286348 DEBUG oslo_service.periodic_task [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210 Dec 15 04:52:09 localhost nova_compute[286344]: 2025-12-15 09:52:09.065 286348 DEBUG oslo_service.periodic_task [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210 Dec 15 04:52:09 localhost nova_compute[286344]: 2025-12-15 09:52:09.065 286348 DEBUG oslo_service.periodic_task [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210 Dec 15 04:52:09 localhost nova_compute[286344]: 2025-12-15 09:52:09.065 286348 DEBUG oslo_service.periodic_task [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210 Dec 15 04:52:09 localhost nova_compute[286344]: 2025-12-15 09:52:09.066 286348 DEBUG oslo_service.periodic_task [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210 Dec 15 04:52:09 localhost nova_compute[286344]: 2025-12-15 09:52:09.066 286348 DEBUG nova.compute.manager [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... 
_reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477 Dec 15 04:52:09 localhost ceph-mon[298913]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #13. Immutable memtables: 0. Dec 15 04:52:09 localhost ceph-mon[298913]: rocksdb: (Original Log Time 2025/12/15-09:52:09.706853) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0 Dec 15 04:52:09 localhost ceph-mon[298913]: rocksdb: [db/flush_job.cc:856] [default] [JOB 3] Flushing memtable with next log file: 13 Dec 15 04:52:09 localhost ceph-mon[298913]: rocksdb: EVENT_LOG_v1 {"time_micros": 1765792329706958, "job": 3, "event": "flush_started", "num_memtables": 1, "num_entries": 10642, "num_deletes": 254, "total_data_size": 14112158, "memory_usage": 14740112, "flush_reason": "Manual Compaction"} Dec 15 04:52:09 localhost ceph-mon[298913]: rocksdb: [db/flush_job.cc:885] [default] [JOB 3] Level-0 flush table #14: started Dec 15 04:52:09 localhost ceph-mon[298913]: rocksdb: EVENT_LOG_v1 {"time_micros": 1765792329786679, "cf_name": "default", "job": 3, "event": "table_file_creation", "file_number": 14, "file_size": 12756900, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 6, "largest_seqno": 10647, "table_properties": {"data_size": 12698425, "index_size": 31681, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 25733, "raw_key_size": 272124, "raw_average_key_size": 26, "raw_value_size": 12522816, "raw_average_value_size": 1219, "num_data_blocks": 1212, "num_entries": 10271, "num_filter_entries": 10271, "num_deletions": 254, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": 
"default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1765792320, "oldest_key_time": 1765792320, "file_creation_time": 1765792329, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "603b24af-e2be-4214-bc56-9e652eb4af3d", "db_session_id": "0OJRM9SCUA16EXV0VQZ2", "orig_file_number": 14, "seqno_to_time_mapping": "N/A"}} Dec 15 04:52:09 localhost ceph-mon[298913]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 3] Flush lasted 79905 microseconds, and 28921 cpu microseconds. Dec 15 04:52:09 localhost ceph-mon[298913]: rocksdb: (Original Log Time 2025/12/15-09:52:09.786760) [db/flush_job.cc:967] [default] [JOB 3] Level-0 flush table #14: 12756900 bytes OK Dec 15 04:52:09 localhost ceph-mon[298913]: rocksdb: (Original Log Time 2025/12/15-09:52:09.786789) [db/memtable_list.cc:519] [default] Level-0 commit table #14 started Dec 15 04:52:09 localhost ceph-mon[298913]: rocksdb: (Original Log Time 2025/12/15-09:52:09.788692) [db/memtable_list.cc:722] [default] Level-0 commit table #14: memtable #1 done Dec 15 04:52:09 localhost ceph-mon[298913]: rocksdb: (Original Log Time 2025/12/15-09:52:09.788720) EVENT_LOG_v1 {"time_micros": 1765792329788711, "job": 3, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [2, 0, 0, 0, 0, 0, 0], "immutable_memtables": 0} Dec 15 04:52:09 localhost ceph-mon[298913]: rocksdb: (Original Log Time 2025/12/15-09:52:09.788739) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: files[2 0 0 0 0 0 0] max score 0.50 Dec 15 04:52:09 localhost ceph-mon[298913]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 3] Try to delete WAL files size 
14038106, prev total WAL file size 14068667, number of live WAL files 2. Dec 15 04:52:09 localhost ceph-mon[298913]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005559462/store.db/000009.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000 Dec 15 04:52:09 localhost ceph-mon[298913]: rocksdb: (Original Log Time 2025/12/15-09:52:09.791273) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '7061786F73003130323931' seq:72057594037927935, type:22 .. '7061786F73003130353433' seq:0, type:0; will stop at (end) Dec 15 04:52:09 localhost ceph-mon[298913]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 4] Compacting 2@0 files to L6, score -1.00 Dec 15 04:52:09 localhost ceph-mon[298913]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 3 Base level 0, inputs: [14(12MB) 8(2012B)] Dec 15 04:52:09 localhost ceph-mon[298913]: rocksdb: EVENT_LOG_v1 {"time_micros": 1765792329791393, "job": 4, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [14, 8], "score": -1, "input_data_size": 12758912, "oldest_snapshot_seqno": -1} Dec 15 04:52:09 localhost ceph-mon[298913]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 4] Generated table #15: 10021 keys, 12753679 bytes, temperature: kUnknown Dec 15 04:52:09 localhost ceph-mon[298913]: rocksdb: EVENT_LOG_v1 {"time_micros": 1765792329890888, "cf_name": "default", "job": 4, "event": "table_file_creation", "file_number": 15, "file_size": 12753679, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 12695845, "index_size": 31685, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 25093, "raw_key_size": 267355, "raw_average_key_size": 26, "raw_value_size": 
12523471, "raw_average_value_size": 1249, "num_data_blocks": 1211, "num_entries": 10021, "num_filter_entries": 10021, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1765792320, "oldest_key_time": 0, "file_creation_time": 1765792329, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "603b24af-e2be-4214-bc56-9e652eb4af3d", "db_session_id": "0OJRM9SCUA16EXV0VQZ2", "orig_file_number": 15, "seqno_to_time_mapping": "N/A"}} Dec 15 04:52:09 localhost ceph-mon[298913]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed. 
Dec 15 04:52:09 localhost ceph-mon[298913]: rocksdb: (Original Log Time 2025/12/15-09:52:09.891540) [db/compaction/compaction_job.cc:1663] [default] [JOB 4] Compacted 2@0 files to L6 => 12753679 bytes Dec 15 04:52:09 localhost ceph-mon[298913]: rocksdb: (Original Log Time 2025/12/15-09:52:09.893389) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 127.8 rd, 127.7 wr, level 6, files in(2, 0) out(1 +0 blob) MB in(12.2, 0.0 +0.0 blob) out(12.2 +0.0 blob), read-write-amplify(2.0) write-amplify(1.0) OK, records in: 10276, records dropped: 255 output_compression: NoCompression Dec 15 04:52:09 localhost ceph-mon[298913]: rocksdb: (Original Log Time 2025/12/15-09:52:09.893419) EVENT_LOG_v1 {"time_micros": 1765792329893405, "job": 4, "event": "compaction_finished", "compaction_time_micros": 99867, "compaction_time_cpu_micros": 41918, "output_level": 6, "num_output_files": 1, "total_output_size": 12753679, "num_input_records": 10276, "num_output_records": 10021, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]} Dec 15 04:52:09 localhost ceph-mon[298913]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005559462/store.db/000014.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000 Dec 15 04:52:09 localhost ceph-mon[298913]: rocksdb: EVENT_LOG_v1 {"time_micros": 1765792329895665, "job": 4, "event": "table_file_deletion", "file_number": 14} Dec 15 04:52:09 localhost ceph-mon[298913]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005559462/store.db/000008.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000 Dec 15 04:52:09 localhost ceph-mon[298913]: rocksdb: EVENT_LOG_v1 {"time_micros": 1765792329895749, "job": 4, 
"event": "table_file_deletion", "file_number": 8} Dec 15 04:52:09 localhost ceph-mon[298913]: rocksdb: (Original Log Time 2025/12/15-09:52:09.791176) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting Dec 15 04:52:10 localhost ceph-mon[298913]: Reconfiguring crash.np0005559461 (monmap changed)... Dec 15 04:52:10 localhost ceph-mon[298913]: Reconfiguring daemon crash.np0005559461 on np0005559461.localdomain Dec 15 04:52:10 localhost ceph-mon[298913]: from='mgr.24103 ' entity='mgr.np0005559461.egwgzn' Dec 15 04:52:10 localhost ceph-mon[298913]: from='mgr.24103 ' entity='mgr.np0005559461.egwgzn' Dec 15 04:52:10 localhost ceph-mon[298913]: from='mgr.24103 172.18.0.105:0/2912776587' entity='mgr.np0005559461.egwgzn' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005559462.localdomain", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch Dec 15 04:52:10 localhost ceph-mon[298913]: from='mgr.24103 ' entity='mgr.np0005559461.egwgzn' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005559462.localdomain", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch Dec 15 04:52:10 localhost podman[299160]: Dec 15 04:52:10 localhost podman[299160]: 2025-12-15 09:52:10.366684974 +0000 UTC m=+0.076583086 container create e14481db088b0cabfc56f6ca0bf22217e18364d417d3e664fc6e9d83cac1d7dc (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=epic_edison, GIT_REPO=https://github.com/ceph/ceph-container.git, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, CEPH_POINT_RELEASE=, RELEASE=main, architecture=x86_64, release=1763362218, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, maintainer=Guillaume Abrioux , GIT_CLEAN=True, io.openshift.tags=rhceph ceph, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, vcs-type=git, build-date=2025-11-26T19:44:28Z, version=7, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, com.redhat.component=rhceph-container, 
description=Red Hat Ceph Storage 7, io.openshift.expose-services=, GIT_BRANCH=main, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.k8s.description=Red Hat Ceph Storage 7, ceph=True, io.buildah.version=1.41.4, name=rhceph, vendor=Red Hat, Inc.) Dec 15 04:52:10 localhost nova_compute[286344]: 2025-12-15 09:52:10.384 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248 Dec 15 04:52:10 localhost systemd[1]: Started libpod-conmon-e14481db088b0cabfc56f6ca0bf22217e18364d417d3e664fc6e9d83cac1d7dc.scope. Dec 15 04:52:10 localhost podman[299160]: 2025-12-15 09:52:10.334702253 +0000 UTC m=+0.044600405 image pull registry.redhat.io/rhceph/rhceph-7-rhel9:latest Dec 15 04:52:10 localhost systemd[1]: Started libcrun container. 
Dec 15 04:52:10 localhost podman[299160]: 2025-12-15 09:52:10.454889884 +0000 UTC m=+0.164787946 container init e14481db088b0cabfc56f6ca0bf22217e18364d417d3e664fc6e9d83cac1d7dc (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=epic_edison, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.openshift.tags=rhceph ceph, name=rhceph, ceph=True, build-date=2025-11-26T19:44:28Z, GIT_REPO=https://github.com/ceph/ceph-container.git, url=https://catalog.redhat.com/en/search?searchType=containers, RELEASE=main, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, release=1763362218, io.k8s.description=Red Hat Ceph Storage 7, io.buildah.version=1.41.4, CEPH_POINT_RELEASE=, maintainer=Guillaume Abrioux , io.openshift.expose-services=, distribution-scope=public, GIT_BRANCH=main, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, architecture=x86_64, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vcs-type=git, vendor=Red Hat, Inc., vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, com.redhat.component=rhceph-container, GIT_CLEAN=True, version=7, description=Red Hat Ceph Storage 7) Dec 15 04:52:10 localhost epic_edison[299173]: 167 167 Dec 15 04:52:10 localhost systemd[1]: libpod-e14481db088b0cabfc56f6ca0bf22217e18364d417d3e664fc6e9d83cac1d7dc.scope: Deactivated successfully. 
Dec 15 04:52:10 localhost podman[299160]: 2025-12-15 09:52:10.465451478 +0000 UTC m=+0.175349540 container start e14481db088b0cabfc56f6ca0bf22217e18364d417d3e664fc6e9d83cac1d7dc (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=epic_edison, build-date=2025-11-26T19:44:28Z, description=Red Hat Ceph Storage 7, vendor=Red Hat, Inc., io.openshift.expose-services=, CEPH_POINT_RELEASE=, io.openshift.tags=rhceph ceph, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, GIT_CLEAN=True, maintainer=Guillaume Abrioux , GIT_REPO=https://github.com/ceph/ceph-container.git, RELEASE=main, vcs-type=git, url=https://catalog.redhat.com/en/search?searchType=containers, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, com.redhat.component=rhceph-container, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, version=7, GIT_BRANCH=main, ceph=True, name=rhceph, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, release=1763362218, io.k8s.description=Red Hat Ceph Storage 7, io.buildah.version=1.41.4, distribution-scope=public, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image.) 
Dec 15 04:52:10 localhost podman[299160]: 2025-12-15 09:52:10.465623253 +0000 UTC m=+0.175521335 container attach e14481db088b0cabfc56f6ca0bf22217e18364d417d3e664fc6e9d83cac1d7dc (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=epic_edison, url=https://catalog.redhat.com/en/search?searchType=containers, RELEASE=main, GIT_CLEAN=True, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., cpe=cpe:/a:redhat:enterprise_linux:9::appstream, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, description=Red Hat Ceph Storage 7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, CEPH_POINT_RELEASE=, vendor=Red Hat, Inc., io.buildah.version=1.41.4, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, vcs-type=git, version=7, GIT_REPO=https://github.com/ceph/ceph-container.git, ceph=True, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, distribution-scope=public, io.k8s.description=Red Hat Ceph Storage 7, maintainer=Guillaume Abrioux , io.openshift.expose-services=, architecture=x86_64, GIT_BRANCH=main, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, com.redhat.component=rhceph-container, build-date=2025-11-26T19:44:28Z, name=rhceph, release=1763362218, io.openshift.tags=rhceph ceph) Dec 15 04:52:10 localhost podman[299160]: 2025-12-15 09:52:10.470436077 +0000 UTC m=+0.180334169 container died e14481db088b0cabfc56f6ca0bf22217e18364d417d3e664fc6e9d83cac1d7dc (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=epic_edison, architecture=x86_64, vcs-type=git, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, version=7, io.openshift.tags=rhceph ceph, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, CEPH_POINT_RELEASE=, RELEASE=main, name=rhceph, io.openshift.expose-services=, io.buildah.version=1.41.4, GIT_BRANCH=main, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, 
vendor=Red Hat, Inc., maintainer=Guillaume Abrioux , release=1763362218, ceph=True, com.redhat.component=rhceph-container, url=https://catalog.redhat.com/en/search?searchType=containers, GIT_CLEAN=True, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, GIT_REPO=https://github.com/ceph/ceph-container.git, io.k8s.description=Red Hat Ceph Storage 7, build-date=2025-11-26T19:44:28Z, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., description=Red Hat Ceph Storage 7, distribution-scope=public) Dec 15 04:52:10 localhost podman[299178]: 2025-12-15 09:52:10.566703701 +0000 UTC m=+0.090966997 container remove e14481db088b0cabfc56f6ca0bf22217e18364d417d3e664fc6e9d83cac1d7dc (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=epic_edison, GIT_CLEAN=True, io.openshift.expose-services=, com.redhat.component=rhceph-container, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, name=rhceph, RELEASE=main, build-date=2025-11-26T19:44:28Z, vcs-type=git, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.buildah.version=1.41.4, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.k8s.description=Red Hat Ceph Storage 7, architecture=x86_64, release=1763362218, maintainer=Guillaume Abrioux , description=Red Hat Ceph Storage 7, version=7, io.openshift.tags=rhceph ceph, GIT_REPO=https://github.com/ceph/ceph-container.git, distribution-scope=public, url=https://catalog.redhat.com/en/search?searchType=containers, ceph=True, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, CEPH_POINT_RELEASE=, vendor=Red Hat, Inc., org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, GIT_BRANCH=main, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI) Dec 15 04:52:10 localhost systemd[1]: libpod-conmon-e14481db088b0cabfc56f6ca0bf22217e18364d417d3e664fc6e9d83cac1d7dc.scope: Deactivated successfully. Dec 15 04:52:11 localhost ceph-mon[298913]: Reconfiguring crash.np0005559462 (monmap changed)... Dec 15 04:52:11 localhost ceph-mon[298913]: Reconfiguring daemon crash.np0005559462 on np0005559462.localdomain Dec 15 04:52:11 localhost ceph-mon[298913]: from='mgr.24103 ' entity='mgr.np0005559461.egwgzn' Dec 15 04:52:11 localhost ceph-mon[298913]: from='mgr.24103 ' entity='mgr.np0005559461.egwgzn' Dec 15 04:52:11 localhost ceph-mon[298913]: from='mgr.24103 172.18.0.105:0/2912776587' entity='mgr.np0005559461.egwgzn' cmd={"prefix": "auth get", "entity": "osd.0"} : dispatch Dec 15 04:52:11 localhost podman[299247]: Dec 15 04:52:11 localhost podman[299247]: 2025-12-15 09:52:11.261164072 +0000 UTC m=+0.078297004 container create 8700ab85b273415466dc8df3e5e21f14262baf04fd5af2a3c48d3abca0a362df (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=cool_chandrasekhar, GIT_CLEAN=True, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.openshift.tags=rhceph ceph, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, RELEASE=main, io.openshift.expose-services=, vcs-type=git, architecture=x86_64, ceph=True, description=Red Hat Ceph Storage 7, CEPH_POINT_RELEASE=, io.buildah.version=1.41.4, maintainer=Guillaume Abrioux , version=7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.k8s.description=Red Hat Ceph Storage 7, GIT_REPO=https://github.com/ceph/ceph-container.git, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, vendor=Red Hat, Inc., build-date=2025-11-26T19:44:28Z, com.redhat.component=rhceph-container, GIT_BRANCH=main, distribution-scope=public, release=1763362218, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, name=rhceph, url=https://catalog.redhat.com/en/search?searchType=containers) Dec 15 04:52:11 localhost systemd[1]: Started libpod-conmon-8700ab85b273415466dc8df3e5e21f14262baf04fd5af2a3c48d3abca0a362df.scope. Dec 15 04:52:11 localhost systemd[1]: Started libcrun container. Dec 15 04:52:11 localhost podman[299247]: 2025-12-15 09:52:11.324408975 +0000 UTC m=+0.141541917 container init 8700ab85b273415466dc8df3e5e21f14262baf04fd5af2a3c48d3abca0a362df (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=cool_chandrasekhar, ceph=True, GIT_BRANCH=main, io.openshift.tags=rhceph ceph, maintainer=Guillaume Abrioux , com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, GIT_REPO=https://github.com/ceph/ceph-container.git, com.redhat.component=rhceph-container, GIT_CLEAN=True, description=Red Hat Ceph Storage 7, io.buildah.version=1.41.4, architecture=x86_64, distribution-scope=public, url=https://catalog.redhat.com/en/search?searchType=containers, release=1763362218, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, CEPH_POINT_RELEASE=, RELEASE=main, io.k8s.description=Red Hat Ceph Storage 7, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.openshift.expose-services=, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, name=rhceph, version=7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vendor=Red Hat, Inc., build-date=2025-11-26T19:44:28Z, vcs-type=git) Dec 15 04:52:11 localhost podman[299247]: 2025-12-15 09:52:11.231223337 +0000 UTC m=+0.048356299 image pull registry.redhat.io/rhceph/rhceph-7-rhel9:latest Dec 15 04:52:11 localhost podman[299247]: 2025-12-15 
09:52:11.333452167 +0000 UTC m=+0.150585109 container start 8700ab85b273415466dc8df3e5e21f14262baf04fd5af2a3c48d3abca0a362df (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=cool_chandrasekhar, maintainer=Guillaume Abrioux , CEPH_POINT_RELEASE=, build-date=2025-11-26T19:44:28Z, GIT_CLEAN=True, version=7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, architecture=x86_64, ceph=True, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, io.openshift.tags=rhceph ceph, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, name=rhceph, vendor=Red Hat, Inc., io.openshift.expose-services=, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, url=https://catalog.redhat.com/en/search?searchType=containers, release=1763362218, GIT_REPO=https://github.com/ceph/ceph-container.git, io.buildah.version=1.41.4, RELEASE=main, io.k8s.description=Red Hat Ceph Storage 7, distribution-scope=public, com.redhat.component=rhceph-container, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, GIT_BRANCH=main, description=Red Hat Ceph Storage 7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image.) 
Dec 15 04:52:11 localhost podman[299247]: 2025-12-15 09:52:11.333700394 +0000 UTC m=+0.150833416 container attach 8700ab85b273415466dc8df3e5e21f14262baf04fd5af2a3c48d3abca0a362df (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=cool_chandrasekhar, GIT_CLEAN=True, io.buildah.version=1.41.4, GIT_REPO=https://github.com/ceph/ceph-container.git, ceph=True, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://catalog.redhat.com/en/search?searchType=containers, maintainer=Guillaume Abrioux , io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.openshift.tags=rhceph ceph, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, distribution-scope=public, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, release=1763362218, build-date=2025-11-26T19:44:28Z, GIT_BRANCH=main, RELEASE=main, architecture=x86_64, version=7, io.openshift.expose-services=, io.k8s.description=Red Hat Ceph Storage 7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, name=rhceph, description=Red Hat Ceph Storage 7, CEPH_POINT_RELEASE=, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, vendor=Red Hat, Inc., vcs-type=git, com.redhat.component=rhceph-container) Dec 15 04:52:11 localhost cool_chandrasekhar[299262]: 167 167 Dec 15 04:52:11 localhost systemd[1]: libpod-8700ab85b273415466dc8df3e5e21f14262baf04fd5af2a3c48d3abca0a362df.scope: Deactivated successfully. 
Dec 15 04:52:11 localhost podman[299247]: 2025-12-15 09:52:11.337226962 +0000 UTC m=+0.154359944 container died 8700ab85b273415466dc8df3e5e21f14262baf04fd5af2a3c48d3abca0a362df (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=cool_chandrasekhar, GIT_CLEAN=True, GIT_REPO=https://github.com/ceph/ceph-container.git, io.k8s.description=Red Hat Ceph Storage 7, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.component=rhceph-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, RELEASE=main, GIT_BRANCH=main, vendor=Red Hat, Inc., io.openshift.tags=rhceph ceph, io.openshift.expose-services=, name=rhceph, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., CEPH_POINT_RELEASE=, release=1763362218, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, description=Red Hat Ceph Storage 7, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, io.buildah.version=1.41.4, ceph=True, version=7, architecture=x86_64, maintainer=Guillaume Abrioux , build-date=2025-11-26T19:44:28Z, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, vcs-type=git) Dec 15 04:52:11 localhost systemd[1]: var-lib-containers-storage-overlay-be094c71f48c88dacf58ef486941f3022bcac43e39ac8a3a542b15cba010b735-merged.mount: Deactivated successfully. Dec 15 04:52:11 localhost systemd[1]: tmp-crun.vjpoZP.mount: Deactivated successfully. Dec 15 04:52:11 localhost systemd[1]: var-lib-containers-storage-overlay-dcac27f8536e4093b6e5447f414fc999fa61ed549390a04d93ce0bcdc22aadbd-merged.mount: Deactivated successfully. 
Dec 15 04:52:11 localhost podman[299267]: 2025-12-15 09:52:11.442365943 +0000 UTC m=+0.092697005 container remove 8700ab85b273415466dc8df3e5e21f14262baf04fd5af2a3c48d3abca0a362df (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=cool_chandrasekhar, io.k8s.description=Red Hat Ceph Storage 7, version=7, GIT_BRANCH=main, distribution-scope=public, architecture=x86_64, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, release=1763362218, vcs-type=git, GIT_CLEAN=True, description=Red Hat Ceph Storage 7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, CEPH_POINT_RELEASE=, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, build-date=2025-11-26T19:44:28Z, com.redhat.component=rhceph-container, name=rhceph, io.openshift.tags=rhceph ceph, vendor=Red Hat, Inc., RELEASE=main, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, maintainer=Guillaume Abrioux , ceph=True, GIT_REPO=https://github.com/ceph/ceph-container.git, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, url=https://catalog.redhat.com/en/search?searchType=containers, io.buildah.version=1.41.4, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image.) Dec 15 04:52:11 localhost systemd[1]: libpod-conmon-8700ab85b273415466dc8df3e5e21f14262baf04fd5af2a3c48d3abca0a362df.scope: Deactivated successfully. Dec 15 04:52:11 localhost ceph-mon[298913]: mon.np0005559462@4(peon) e9 handle_command mon_command({"prefix": "status", "format": "json"} v 0) Dec 15 04:52:11 localhost ceph-mon[298913]: log_channel(audit) log [DBG] : from='client.? 172.18.0.200:0/2747521294' entity='client.admin' cmd={"prefix": "status", "format": "json"} : dispatch Dec 15 04:52:11 localhost sshd[299284]: main: sshd: ssh-rsa algorithm is disabled Dec 15 04:52:12 localhost ceph-mon[298913]: Reconfiguring osd.0 (monmap changed)... 
Dec 15 04:52:12 localhost ceph-mon[298913]: Reconfiguring daemon osd.0 on np0005559462.localdomain Dec 15 04:52:12 localhost ceph-mon[298913]: from='mgr.24103 ' entity='mgr.np0005559461.egwgzn' Dec 15 04:52:12 localhost ceph-mon[298913]: from='mgr.24103 ' entity='mgr.np0005559461.egwgzn' Dec 15 04:52:12 localhost ceph-mon[298913]: from='mgr.24103 172.18.0.105:0/2912776587' entity='mgr.np0005559461.egwgzn' cmd={"prefix": "auth get", "entity": "osd.3"} : dispatch Dec 15 04:52:12 localhost podman[299343]: Dec 15 04:52:12 localhost podman[299343]: 2025-12-15 09:52:12.345220904 +0000 UTC m=+0.077547703 container create cb90b7aed22204b983b3ada7e73ad3c64b0b25d6c5f27d45993b9903a84f6330 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=beautiful_wiles, description=Red Hat Ceph Storage 7, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, name=rhceph, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, url=https://catalog.redhat.com/en/search?searchType=containers, build-date=2025-11-26T19:44:28Z, RELEASE=main, GIT_BRANCH=main, release=1763362218, io.buildah.version=1.41.4, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., distribution-scope=public, io.openshift.expose-services=, GIT_REPO=https://github.com/ceph/ceph-container.git, CEPH_POINT_RELEASE=, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=rhceph-container, ceph=True, version=7, architecture=x86_64, GIT_CLEAN=True, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, vendor=Red Hat, Inc., maintainer=Guillaume Abrioux , io.openshift.tags=rhceph ceph, vcs-type=git, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, io.k8s.description=Red Hat Ceph Storage 7) Dec 15 04:52:12 localhost systemd[1]: Started libpod-conmon-cb90b7aed22204b983b3ada7e73ad3c64b0b25d6c5f27d45993b9903a84f6330.scope. 
Dec 15 04:52:12 localhost systemd[1]: Started libcrun container. Dec 15 04:52:12 localhost podman[299343]: 2025-12-15 09:52:12.410883814 +0000 UTC m=+0.143210613 container init cb90b7aed22204b983b3ada7e73ad3c64b0b25d6c5f27d45993b9903a84f6330 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=beautiful_wiles, description=Red Hat Ceph Storage 7, io.k8s.description=Red Hat Ceph Storage 7, io.openshift.expose-services=, architecture=x86_64, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., RELEASE=main, distribution-scope=public, GIT_REPO=https://github.com/ceph/ceph-container.git, GIT_CLEAN=True, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, version=7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=Guillaume Abrioux , cpe=cpe:/a:redhat:enterprise_linux:9::appstream, com.redhat.component=rhceph-container, io.buildah.version=1.41.4, build-date=2025-11-26T19:44:28Z, name=rhceph, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, ceph=True, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, release=1763362218, vcs-type=git, GIT_BRANCH=main, url=https://catalog.redhat.com/en/search?searchType=containers, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, vendor=Red Hat, Inc., io.openshift.tags=rhceph ceph, CEPH_POINT_RELEASE=) Dec 15 04:52:12 localhost podman[299343]: 2025-12-15 09:52:12.31456797 +0000 UTC m=+0.046894809 image pull registry.redhat.io/rhceph/rhceph-7-rhel9:latest Dec 15 04:52:12 localhost podman[299343]: 2025-12-15 09:52:12.419846994 +0000 UTC m=+0.152173803 container start cb90b7aed22204b983b3ada7e73ad3c64b0b25d6c5f27d45993b9903a84f6330 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=beautiful_wiles, ceph=True, vendor=Red Hat, Inc., description=Red Hat Ceph Storage 7, maintainer=Guillaume Abrioux , 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhceph, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, io.k8s.description=Red Hat Ceph Storage 7, GIT_CLEAN=True, version=7, release=1763362218, io.buildah.version=1.41.4, vcs-type=git, build-date=2025-11-26T19:44:28Z, distribution-scope=public, CEPH_POINT_RELEASE=, url=https://catalog.redhat.com/en/search?searchType=containers, GIT_REPO=https://github.com/ceph/ceph-container.git, io.openshift.tags=rhceph ceph, architecture=x86_64, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_BRANCH=main, com.redhat.component=rhceph-container, RELEASE=main, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, io.openshift.expose-services=, cpe=cpe:/a:redhat:enterprise_linux:9::appstream) Dec 15 04:52:12 localhost podman[299343]: 2025-12-15 09:52:12.420139392 +0000 UTC m=+0.152466211 container attach cb90b7aed22204b983b3ada7e73ad3c64b0b25d6c5f27d45993b9903a84f6330 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=beautiful_wiles, GIT_CLEAN=True, version=7, GIT_REPO=https://github.com/ceph/ceph-container.git, architecture=x86_64, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:enterprise_linux:9::appstream, vcs-type=git, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, RELEASE=main, ceph=True, io.openshift.expose-services=, com.redhat.component=rhceph-container, release=1763362218, name=rhceph, build-date=2025-11-26T19:44:28Z, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, GIT_BRANCH=main, io.buildah.version=1.41.4, description=Red Hat Ceph Storage 7, io.k8s.display-name=Red Hat 
Ceph Storage 7 on RHEL 9, maintainer=Guillaume Abrioux , summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.k8s.description=Red Hat Ceph Storage 7, io.openshift.tags=rhceph ceph, CEPH_POINT_RELEASE=, url=https://catalog.redhat.com/en/search?searchType=containers) Dec 15 04:52:12 localhost beautiful_wiles[299358]: 167 167 Dec 15 04:52:12 localhost systemd[1]: libpod-cb90b7aed22204b983b3ada7e73ad3c64b0b25d6c5f27d45993b9903a84f6330.scope: Deactivated successfully. Dec 15 04:52:12 localhost podman[299343]: 2025-12-15 09:52:12.42544874 +0000 UTC m=+0.157775569 container died cb90b7aed22204b983b3ada7e73ad3c64b0b25d6c5f27d45993b9903a84f6330 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=beautiful_wiles, maintainer=Guillaume Abrioux , release=1763362218, io.buildah.version=1.41.4, url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.openshift.tags=rhceph ceph, name=rhceph, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, architecture=x86_64, vcs-type=git, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_REPO=https://github.com/ceph/ceph-container.git, com.redhat.component=rhceph-container, RELEASE=main, ceph=True, version=7, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, distribution-scope=public, io.openshift.expose-services=, GIT_BRANCH=main, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, CEPH_POINT_RELEASE=, build-date=2025-11-26T19:44:28Z, io.k8s.description=Red Hat Ceph Storage 7, description=Red Hat Ceph Storage 7, GIT_CLEAN=True) Dec 15 04:52:12 localhost podman[299363]: 2025-12-15 09:52:12.521240602 +0000 UTC m=+0.089707272 container remove 
cb90b7aed22204b983b3ada7e73ad3c64b0b25d6c5f27d45993b9903a84f6330 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=beautiful_wiles, GIT_BRANCH=main, com.redhat.component=rhceph-container, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, url=https://catalog.redhat.com/en/search?searchType=containers, GIT_REPO=https://github.com/ceph/ceph-container.git, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, name=rhceph, RELEASE=main, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, ceph=True, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.k8s.description=Red Hat Ceph Storage 7, io.openshift.tags=rhceph ceph, GIT_CLEAN=True, vcs-type=git, release=1763362218, build-date=2025-11-26T19:44:28Z, version=7, vendor=Red Hat, Inc., io.openshift.expose-services=, CEPH_POINT_RELEASE=, distribution-scope=public, io.buildah.version=1.41.4, architecture=x86_64, maintainer=Guillaume Abrioux , description=Red Hat Ceph Storage 7) Dec 15 04:52:12 localhost systemd[1]: libpod-conmon-cb90b7aed22204b983b3ada7e73ad3c64b0b25d6c5f27d45993b9903a84f6330.scope: Deactivated successfully. Dec 15 04:52:12 localhost sshd[299423]: main: sshd: ssh-rsa algorithm is disabled Dec 15 04:52:13 localhost ceph-mon[298913]: Reconfiguring osd.3 (monmap changed)... 
Dec 15 04:52:13 localhost ceph-mon[298913]: Reconfiguring daemon osd.3 on np0005559462.localdomain Dec 15 04:52:13 localhost ceph-mon[298913]: from='mgr.24103 ' entity='mgr.np0005559461.egwgzn' Dec 15 04:52:13 localhost ceph-mon[298913]: from='mgr.24103 ' entity='mgr.np0005559461.egwgzn' Dec 15 04:52:13 localhost ceph-mon[298913]: from='mgr.24103 172.18.0.105:0/2912776587' entity='mgr.np0005559461.egwgzn' cmd={"prefix": "auth get-or-create", "entity": "mds.mds.np0005559462.mhigvc", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]} : dispatch Dec 15 04:52:13 localhost ceph-mon[298913]: from='mgr.24103 ' entity='mgr.np0005559461.egwgzn' cmd={"prefix": "auth get-or-create", "entity": "mds.mds.np0005559462.mhigvc", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]} : dispatch Dec 15 04:52:13 localhost ceph-mon[298913]: from='mgr.24103 ' entity='mgr.np0005559461.egwgzn' Dec 15 04:52:13 localhost ceph-mon[298913]: from='mgr.24103 ' entity='mgr.np0005559461.egwgzn' Dec 15 04:52:13 localhost ceph-mon[298913]: from='mgr.24103 ' entity='mgr.np0005559461.egwgzn' Dec 15 04:52:13 localhost ceph-mon[298913]: from='mgr.24103 ' entity='mgr.np0005559461.egwgzn' Dec 15 04:52:13 localhost ceph-mon[298913]: from='mgr.24103 ' entity='mgr.np0005559461.egwgzn' Dec 15 04:52:13 localhost ceph-mon[298913]: from='mgr.24103 ' entity='mgr.np0005559461.egwgzn' Dec 15 04:52:13 localhost ceph-mon[298913]: from='mgr.24103 ' entity='mgr.np0005559461.egwgzn' Dec 15 04:52:13 localhost ceph-mon[298913]: from='mgr.24103 ' entity='mgr.np0005559461.egwgzn' Dec 15 04:52:13 localhost ceph-mon[298913]: from='mgr.24103 ' entity='mgr.np0005559461.egwgzn' Dec 15 04:52:13 localhost ceph-mon[298913]: from='mgr.24103 ' entity='mgr.np0005559461.egwgzn' Dec 15 04:52:13 localhost ceph-mon[298913]: from='mgr.24103 ' entity='mgr.np0005559461.egwgzn' Dec 15 04:52:13 localhost ceph-mon[298913]: from='mgr.24103 ' entity='mgr.np0005559461.egwgzn' Dec 15 
04:52:13 localhost systemd[1]: tmp-crun.Yx4yt1.mount: Deactivated successfully. Dec 15 04:52:13 localhost systemd[1]: var-lib-containers-storage-overlay-deace98334ffb3d0b5e7fb8a0894c78fa18cb5ee26c7de20f2c64a774f9678af-merged.mount: Deactivated successfully. Dec 15 04:52:13 localhost podman[299442]: Dec 15 04:52:13 localhost podman[299442]: 2025-12-15 09:52:13.419316378 +0000 UTC m=+0.090305908 container create 69b85197c16c2a85d3b8e3235e292140c6b2207318335c787573a08a509543b2 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=busy_jennings, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, RELEASE=main, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, io.openshift.expose-services=, name=rhceph, GIT_REPO=https://github.com/ceph/ceph-container.git, version=7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vcs-type=git, vendor=Red Hat, Inc., CEPH_POINT_RELEASE=, GIT_BRANCH=main, build-date=2025-11-26T19:44:28Z, maintainer=Guillaume Abrioux , io.openshift.tags=rhceph ceph, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat Ceph Storage 7, distribution-scope=public, io.buildah.version=1.41.4, architecture=x86_64, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.description=Red Hat Ceph Storage 7, GIT_CLEAN=True, com.redhat.component=rhceph-container, release=1763362218, ceph=True, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9) Dec 15 04:52:13 localhost systemd[1]: Started libpod-conmon-69b85197c16c2a85d3b8e3235e292140c6b2207318335c787573a08a509543b2.scope. Dec 15 04:52:13 localhost systemd[1]: Started libcrun container. 
Dec 15 04:52:13 localhost podman[299442]: 2025-12-15 09:52:13.379738595 +0000 UTC m=+0.050728165 image pull registry.redhat.io/rhceph/rhceph-7-rhel9:latest Dec 15 04:52:13 localhost podman[299442]: 2025-12-15 09:52:13.490666938 +0000 UTC m=+0.161656468 container init 69b85197c16c2a85d3b8e3235e292140c6b2207318335c787573a08a509543b2 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=busy_jennings, io.buildah.version=1.41.4, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.openshift.expose-services=, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, url=https://catalog.redhat.com/en/search?searchType=containers, version=7, GIT_BRANCH=main, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, vendor=Red Hat, Inc., release=1763362218, vcs-type=git, GIT_CLEAN=True, RELEASE=main, build-date=2025-11-26T19:44:28Z, GIT_REPO=https://github.com/ceph/ceph-container.git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=Guillaume Abrioux , io.k8s.description=Red Hat Ceph Storage 7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., com.redhat.component=rhceph-container, distribution-scope=public, CEPH_POINT_RELEASE=, name=rhceph, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, io.openshift.tags=rhceph ceph, description=Red Hat Ceph Storage 7, ceph=True, architecture=x86_64) Dec 15 04:52:13 localhost podman[299442]: 2025-12-15 09:52:13.500136432 +0000 UTC m=+0.171125972 container start 69b85197c16c2a85d3b8e3235e292140c6b2207318335c787573a08a509543b2 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=busy_jennings, distribution-scope=public, description=Red Hat Ceph Storage 7, io.openshift.expose-services=, build-date=2025-11-26T19:44:28Z, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, 
vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, io.openshift.tags=rhceph ceph, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, name=rhceph, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, com.redhat.component=rhceph-container, version=7, release=1763362218, io.k8s.description=Red Hat Ceph Storage 7, GIT_REPO=https://github.com/ceph/ceph-container.git, CEPH_POINT_RELEASE=, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vendor=Red Hat, Inc., summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., maintainer=Guillaume Abrioux , url=https://catalog.redhat.com/en/search?searchType=containers, RELEASE=main, GIT_BRANCH=main, ceph=True, GIT_CLEAN=True, architecture=x86_64) Dec 15 04:52:13 localhost podman[299442]: 2025-12-15 09:52:13.500501182 +0000 UTC m=+0.171490762 container attach 69b85197c16c2a85d3b8e3235e292140c6b2207318335c787573a08a509543b2 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=busy_jennings, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., maintainer=Guillaume Abrioux , url=https://catalog.redhat.com/en/search?searchType=containers, CEPH_POINT_RELEASE=, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, release=1763362218, GIT_CLEAN=True, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, build-date=2025-11-26T19:44:28Z, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, io.openshift.tags=rhceph ceph, io.buildah.version=1.41.4, com.redhat.component=rhceph-container, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, vendor=Red Hat, Inc., GIT_REPO=https://github.com/ceph/ceph-container.git, version=7, io.openshift.expose-services=, RELEASE=main, name=rhceph, 
ceph=True, architecture=x86_64, io.k8s.description=Red Hat Ceph Storage 7, distribution-scope=public, description=Red Hat Ceph Storage 7, vcs-type=git, GIT_BRANCH=main) Dec 15 04:52:13 localhost busy_jennings[299457]: 167 167 Dec 15 04:52:13 localhost systemd[1]: libpod-69b85197c16c2a85d3b8e3235e292140c6b2207318335c787573a08a509543b2.scope: Deactivated successfully. Dec 15 04:52:13 localhost podman[299442]: 2025-12-15 09:52:13.50367387 +0000 UTC m=+0.174663410 container died 69b85197c16c2a85d3b8e3235e292140c6b2207318335c787573a08a509543b2 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=busy_jennings, RELEASE=main, architecture=x86_64, vendor=Red Hat, Inc., description=Red Hat Ceph Storage 7, url=https://catalog.redhat.com/en/search?searchType=containers, distribution-scope=public, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, ceph=True, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, release=1763362218, GIT_BRANCH=main, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.buildah.version=1.41.4, io.k8s.description=Red Hat Ceph Storage 7, GIT_REPO=https://github.com/ceph/ceph-container.git, com.redhat.component=rhceph-container, maintainer=Guillaume Abrioux , build-date=2025-11-26T19:44:28Z, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, GIT_CLEAN=True, io.openshift.tags=rhceph ceph, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.openshift.expose-services=, name=rhceph, CEPH_POINT_RELEASE=, version=7) Dec 15 04:52:13 localhost podman[299462]: 2025-12-15 09:52:13.608225596 +0000 UTC m=+0.090335891 container remove 69b85197c16c2a85d3b8e3235e292140c6b2207318335c787573a08a509543b2 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=busy_jennings, maintainer=Guillaume Abrioux , 
cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.buildah.version=1.41.4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_REPO=https://github.com/ceph/ceph-container.git, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, name=rhceph, version=7, GIT_CLEAN=True, RELEASE=main, io.openshift.expose-services=, GIT_BRANCH=main, com.redhat.component=rhceph-container, architecture=x86_64, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, url=https://catalog.redhat.com/en/search?searchType=containers, CEPH_POINT_RELEASE=, ceph=True, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, io.openshift.tags=rhceph ceph, vcs-type=git, build-date=2025-11-26T19:44:28Z, description=Red Hat Ceph Storage 7, io.k8s.description=Red Hat Ceph Storage 7, release=1763362218, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vendor=Red Hat, Inc., distribution-scope=public) Dec 15 04:52:13 localhost systemd[1]: libpod-conmon-69b85197c16c2a85d3b8e3235e292140c6b2207318335c787573a08a509543b2.scope: Deactivated successfully. Dec 15 04:52:14 localhost ceph-mon[298913]: Reconfiguring mds.mds.np0005559462.mhigvc (monmap changed)... 
Dec 15 04:52:14 localhost ceph-mon[298913]: Reconfiguring daemon mds.mds.np0005559462.mhigvc on np0005559462.localdomain Dec 15 04:52:14 localhost ceph-mon[298913]: Reconfig service osd.default_drive_group Dec 15 04:52:14 localhost ceph-mon[298913]: from='mgr.24103 ' entity='mgr.np0005559461.egwgzn' Dec 15 04:52:14 localhost ceph-mon[298913]: from='mgr.24103 ' entity='mgr.np0005559461.egwgzn' Dec 15 04:52:14 localhost ceph-mon[298913]: from='mgr.24103 172.18.0.105:0/2912776587' entity='mgr.np0005559461.egwgzn' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005559462.fudvyx", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch Dec 15 04:52:14 localhost ceph-mon[298913]: from='mgr.24103 ' entity='mgr.np0005559461.egwgzn' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005559462.fudvyx", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch Dec 15 04:52:14 localhost podman[299531]: Dec 15 04:52:14 localhost podman[299531]: 2025-12-15 09:52:14.296455512 +0000 UTC m=+0.066314849 container create 50cfb6143362a5649b6efba6ebb3a5fe58e36ed988b1959e14cd415bc7363ddf (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=elastic_darwin, CEPH_POINT_RELEASE=, RELEASE=main, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, distribution-scope=public, architecture=x86_64, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, GIT_BRANCH=main, version=7, com.redhat.component=rhceph-container, GIT_CLEAN=True, build-date=2025-11-26T19:44:28Z, io.buildah.version=1.41.4, ceph=True, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, name=rhceph, io.openshift.tags=rhceph ceph, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.openshift.expose-services=, maintainer=Guillaume Abrioux , vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, 
cpe=cpe:/a:redhat:enterprise_linux:9::appstream, description=Red Hat Ceph Storage 7, io.k8s.description=Red Hat Ceph Storage 7, url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, release=1763362218, vendor=Red Hat, Inc., GIT_REPO=https://github.com/ceph/ceph-container.git) Dec 15 04:52:14 localhost systemd[1]: Started libpod-conmon-50cfb6143362a5649b6efba6ebb3a5fe58e36ed988b1959e14cd415bc7363ddf.scope. Dec 15 04:52:14 localhost systemd[1]: Started libcrun container. Dec 15 04:52:14 localhost podman[299531]: 2025-12-15 09:52:14.26444143 +0000 UTC m=+0.034300807 image pull registry.redhat.io/rhceph/rhceph-7-rhel9:latest Dec 15 04:52:14 localhost podman[299531]: 2025-12-15 09:52:14.36416686 +0000 UTC m=+0.134026207 container init 50cfb6143362a5649b6efba6ebb3a5fe58e36ed988b1959e14cd415bc7363ddf (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=elastic_darwin, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_REPO=https://github.com/ceph/ceph-container.git, com.redhat.component=rhceph-container, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, GIT_CLEAN=True, io.k8s.description=Red Hat Ceph Storage 7, GIT_BRANCH=main, release=1763362218, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=Guillaume Abrioux , version=7, io.openshift.expose-services=, vendor=Red Hat, Inc., distribution-scope=public, description=Red Hat Ceph Storage 7, build-date=2025-11-26T19:44:28Z, io.openshift.tags=rhceph ceph, url=https://catalog.redhat.com/en/search?searchType=containers, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.buildah.version=1.41.4, vcs-type=git, RELEASE=main, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, architecture=x86_64, ceph=True, 
CEPH_POINT_RELEASE=, name=rhceph) Dec 15 04:52:14 localhost elastic_darwin[299546]: 167 167 Dec 15 04:52:14 localhost systemd[1]: libpod-50cfb6143362a5649b6efba6ebb3a5fe58e36ed988b1959e14cd415bc7363ddf.scope: Deactivated successfully. Dec 15 04:52:14 localhost podman[299531]: 2025-12-15 09:52:14.373593223 +0000 UTC m=+0.143452540 container start 50cfb6143362a5649b6efba6ebb3a5fe58e36ed988b1959e14cd415bc7363ddf (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=elastic_darwin, architecture=x86_64, ceph=True, io.openshift.expose-services=, GIT_CLEAN=True, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.buildah.version=1.41.4, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, url=https://catalog.redhat.com/en/search?searchType=containers, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, description=Red Hat Ceph Storage 7, GIT_BRANCH=main, CEPH_POINT_RELEASE=, RELEASE=main, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1763362218, maintainer=Guillaume Abrioux , version=7, distribution-scope=public, name=rhceph, vendor=Red Hat, Inc., vcs-type=git, GIT_REPO=https://github.com/ceph/ceph-container.git, build-date=2025-11-26T19:44:28Z, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., com.redhat.component=rhceph-container, io.k8s.description=Red Hat Ceph Storage 7, io.openshift.tags=rhceph ceph) Dec 15 04:52:14 localhost podman[299531]: 2025-12-15 09:52:14.375448154 +0000 UTC m=+0.145307531 container attach 50cfb6143362a5649b6efba6ebb3a5fe58e36ed988b1959e14cd415bc7363ddf (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=elastic_darwin, maintainer=Guillaume Abrioux , version=7, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, io.openshift.expose-services=, 
vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, GIT_CLEAN=True, build-date=2025-11-26T19:44:28Z, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, architecture=x86_64, io.buildah.version=1.41.4, GIT_REPO=https://github.com/ceph/ceph-container.git, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vendor=Red Hat, Inc., ceph=True, com.redhat.component=rhceph-container, RELEASE=main, name=rhceph, io.openshift.tags=rhceph ceph, description=Red Hat Ceph Storage 7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., distribution-scope=public, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.k8s.description=Red Hat Ceph Storage 7, CEPH_POINT_RELEASE=, GIT_BRANCH=main, release=1763362218) Dec 15 04:52:14 localhost podman[299531]: 2025-12-15 09:52:14.378192821 +0000 UTC m=+0.148052168 container died 50cfb6143362a5649b6efba6ebb3a5fe58e36ed988b1959e14cd415bc7363ddf (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=elastic_darwin, CEPH_POINT_RELEASE=, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, maintainer=Guillaume Abrioux , distribution-scope=public, com.redhat.component=rhceph-container, RELEASE=main, name=rhceph, io.buildah.version=1.41.4, vcs-type=git, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.openshift.tags=rhceph ceph, GIT_BRANCH=main, vendor=Red Hat, Inc., version=7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, description=Red Hat Ceph Storage 7, release=1763362218, GIT_REPO=https://github.com/ceph/ceph-container.git, ceph=True, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, 
build-date=2025-11-26T19:44:28Z, url=https://catalog.redhat.com/en/search?searchType=containers, architecture=x86_64, GIT_CLEAN=True, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.k8s.description=Red Hat Ceph Storage 7) Dec 15 04:52:14 localhost systemd[1]: var-lib-containers-storage-overlay-7a94e0161273f99b3178133d8e76b2b808fb2f4d611eeaae0306284670863d93-merged.mount: Deactivated successfully. Dec 15 04:52:14 localhost systemd[1]: var-lib-containers-storage-overlay-5f05c61c9dd597d171f99fe8933d85f353fe19e209b98c5d2f1a0b08af3ed268-merged.mount: Deactivated successfully. Dec 15 04:52:14 localhost podman[299551]: 2025-12-15 09:52:14.468699674 +0000 UTC m=+0.081363089 container remove 50cfb6143362a5649b6efba6ebb3a5fe58e36ed988b1959e14cd415bc7363ddf (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=elastic_darwin, build-date=2025-11-26T19:44:28Z, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, ceph=True, maintainer=Guillaume Abrioux , name=rhceph, GIT_BRANCH=main, io.openshift.tags=rhceph ceph, GIT_CLEAN=True, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, architecture=x86_64, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, io.buildah.version=1.41.4, CEPH_POINT_RELEASE=, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., url=https://catalog.redhat.com/en/search?searchType=containers, version=7, GIT_REPO=https://github.com/ceph/ceph-container.git, io.k8s.description=Red Hat Ceph Storage 7, description=Red Hat Ceph Storage 7, vendor=Red Hat, Inc., distribution-scope=public, com.redhat.component=rhceph-container, io.openshift.expose-services=, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, release=1763362218, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, RELEASE=main, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git) Dec 15 04:52:14 
localhost systemd[1]: libpod-conmon-50cfb6143362a5649b6efba6ebb3a5fe58e36ed988b1959e14cd415bc7363ddf.scope: Deactivated successfully. Dec 15 04:52:14 localhost ceph-mon[298913]: mon.np0005559462@4(peon).osd e84 _set_cache_ratios kv ratio 0.25 inc ratio 0.375 full ratio 0.375 Dec 15 04:52:14 localhost ceph-mon[298913]: mon.np0005559462@4(peon).osd e84 register_cache_with_pcm pcm target: 2147483648 pcm max: 1020054732 pcm min: 134217728 inc_osd_cache size: 1 Dec 15 04:52:14 localhost ceph-mon[298913]: mon.np0005559462@4(peon).osd e85 e85: 6 total, 6 up, 6 in Dec 15 04:52:14 localhost systemd-logind[763]: Session 64 logged out. Waiting for processes to exit. Dec 15 04:52:14 localhost systemd[1]: session-64.scope: Deactivated successfully. Dec 15 04:52:14 localhost systemd[1]: session-64.scope: Consumed 25.431s CPU time. Dec 15 04:52:14 localhost systemd-logind[763]: Removed session 64. Dec 15 04:52:15 localhost ceph-mon[298913]: mon.np0005559462@4(peon).osd e85 _set_new_cache_sizes cache_size:1019734272 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Dec 15 04:52:15 localhost ceph-mon[298913]: Reconfiguring mgr.np0005559462.fudvyx (monmap changed)... 
Dec 15 04:52:15 localhost ceph-mon[298913]: Reconfiguring daemon mgr.np0005559462.fudvyx on np0005559462.localdomain Dec 15 04:52:15 localhost ceph-mon[298913]: from='mgr.24103 ' entity='mgr.np0005559461.egwgzn' Dec 15 04:52:15 localhost ceph-mon[298913]: from='mgr.24103 ' entity='mgr.np0005559461.egwgzn' Dec 15 04:52:15 localhost ceph-mon[298913]: from='mgr.24103 172.18.0.105:0/2912776587' entity='mgr.np0005559461.egwgzn' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005559463.localdomain", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch Dec 15 04:52:15 localhost ceph-mon[298913]: from='mgr.24103 ' entity='mgr.np0005559461.egwgzn' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005559463.localdomain", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch Dec 15 04:52:15 localhost ceph-mon[298913]: from='client.? 172.18.0.200:0/1372257756' entity='client.admin' cmd={"prefix": "mgr fail"} : dispatch Dec 15 04:52:15 localhost ceph-mon[298913]: from='client.? ' entity='client.admin' cmd={"prefix": "mgr fail"} : dispatch Dec 15 04:52:15 localhost ceph-mon[298913]: Activating manager daemon np0005559459.hhnowu Dec 15 04:52:15 localhost ceph-mon[298913]: from='client.? 
' entity='client.admin' cmd='[{"prefix": "mgr fail"}]': finished Dec 15 04:52:15 localhost nova_compute[286344]: 2025-12-15 09:52:15.390 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Dec 15 04:52:15 localhost nova_compute[286344]: 2025-12-15 09:52:15.394 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Dec 15 04:52:15 localhost nova_compute[286344]: 2025-12-15 09:52:15.395 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5006 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m Dec 15 04:52:15 localhost nova_compute[286344]: 2025-12-15 09:52:15.395 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Dec 15 04:52:15 localhost nova_compute[286344]: 2025-12-15 09:52:15.415 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 04:52:15 localhost nova_compute[286344]: 2025-12-15 09:52:15.416 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Dec 15 04:52:20 localhost ceph-mon[298913]: mon.np0005559462@4(peon).osd e85 _set_new_cache_sizes cache_size:1020048650 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Dec 15 04:52:20 localhost nova_compute[286344]: 2025-12-15 09:52:20.417 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Dec 15 04:52:20 localhost nova_compute[286344]: 2025-12-15 09:52:20.419 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Dec 
15 04:52:20 localhost nova_compute[286344]: 2025-12-15 09:52:20.419 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m Dec 15 04:52:20 localhost nova_compute[286344]: 2025-12-15 09:52:20.420 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Dec 15 04:52:20 localhost nova_compute[286344]: 2025-12-15 09:52:20.453 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 04:52:20 localhost nova_compute[286344]: 2025-12-15 09:52:20.454 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Dec 15 04:52:20 localhost systemd[1]: Started /usr/bin/podman healthcheck run 67ee72fcd99c896f79e8807764b1d0f04e7d45927d06e306c0986394e8ed42a0. Dec 15 04:52:20 localhost systemd[1]: Started /usr/bin/podman healthcheck run 9f0f481c937606298203797a71924b7614a5d9e3f4a8afd837cfa5b9602aedeb. Dec 15 04:52:20 localhost systemd[1]: Started /usr/bin/podman healthcheck run b3d8483ade539500e228c2af9c0c3e647febb5ec7903610a83aea32631007d4a. 
Dec 15 04:52:20 localhost podman[299571]: 2025-12-15 09:52:20.759222667 +0000 UTC m=+0.079575120 container health_status b3d8483ade539500e228c2af9c0c3e647febb5ec7903610a83aea32631007d4a (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, config_id=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '07f3af15d79276418f54a8dde2fc251064527c3571430e14af77aaa2c01f1193-cb1e9478634a00c8fb5ad9cd7fcd38c7b2a1a85187dfaa618bc715c24c2b15fb'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, 
managed_by=edpm_ansible) Dec 15 04:52:20 localhost systemd[1]: tmp-crun.HCVWHC.mount: Deactivated successfully. Dec 15 04:52:20 localhost podman[299570]: 2025-12-15 09:52:20.813014786 +0000 UTC m=+0.134910392 container health_status 9f0f481c937606298203797a71924b7614a5d9e3f4a8afd837cfa5b9602aedeb (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, config_id=multipathd, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, io.buildah.version=1.41.3, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible) Dec 15 04:52:20 localhost podman[299570]: 
2025-12-15 09:52:20.824328771 +0000 UTC m=+0.146224407 container exec_died 9f0f481c937606298203797a71924b7614a5d9e3f4a8afd837cfa5b9602aedeb (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=multipathd, container_name=multipathd, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2) Dec 15 04:52:20 localhost systemd[1]: 9f0f481c937606298203797a71924b7614a5d9e3f4a8afd837cfa5b9602aedeb.service: Deactivated successfully. 
Dec 15 04:52:20 localhost podman[299571]: 2025-12-15 09:52:20.876676801 +0000 UTC m=+0.197029294 container exec_died b3d8483ade539500e228c2af9c0c3e647febb5ec7903610a83aea32631007d4a (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=ceilometer_agent_compute, tcib_managed=true, org.label-schema.vendor=CentOS, config_id=ceilometer_agent_compute, org.label-schema.license=GPLv2, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '07f3af15d79276418f54a8dde2fc251064527c3571430e14af77aaa2c01f1193-cb1e9478634a00c8fb5ad9cd7fcd38c7b2a1a85187dfaa618bc715c24c2b15fb'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.build-date=20251202) Dec 
15 04:52:20 localhost systemd[1]: b3d8483ade539500e228c2af9c0c3e647febb5ec7903610a83aea32631007d4a.service: Deactivated successfully. Dec 15 04:52:20 localhost podman[299569]: 2025-12-15 09:52:20.968560103 +0000 UTC m=+0.288956587 container health_status 67ee72fcd99c896f79e8807764b1d0f04e7d45927d06e306c0986394e8ed42a0 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible) Dec 15 04:52:20 localhost podman[299569]: 2025-12-15 09:52:20.981513964 +0000 UTC m=+0.301910428 container exec_died 67ee72fcd99c896f79e8807764b1d0f04e7d45927d06e306c0986394e8ed42a0 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, 
maintainer=The Prometheus Authors , managed_by=edpm_ansible, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter) Dec 15 04:52:20 localhost systemd[1]: 67ee72fcd99c896f79e8807764b1d0f04e7d45927d06e306c0986394e8ed42a0.service: Deactivated successfully. Dec 15 04:52:23 localhost systemd[1]: Started /usr/bin/podman healthcheck run 730c50020a7d9e2da21d2462d40af20ac303e43f3491f692cbbd87f432ba3e09. Dec 15 04:52:23 localhost systemd[1]: Started /usr/bin/podman healthcheck run ac2eae30a2a7f971398af42ea67b3ed799d4b3341f7688ebcd73f7ca7f66c2a8. Dec 15 04:52:23 localhost systemd[1]: tmp-crun.4Dlrz5.mount: Deactivated successfully. 
Dec 15 04:52:23 localhost podman[299630]: 2025-12-15 09:52:23.418170405 +0000 UTC m=+0.097578450 container health_status ac2eae30a2a7f971398af42ea67b3ed799d4b3341f7688ebcd73f7ca7f66c2a8 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_controller, org.label-schema.build-date=20251202, managed_by=edpm_ansible, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '07f3af15d79276418f54a8dde2fc251064527c3571430e14af77aaa2c01f1193'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.license=GPLv2) Dec 15 04:52:23 localhost systemd[1]: tmp-crun.6ZoskX.mount: Deactivated successfully. 
Dec 15 04:52:23 localhost podman[299629]: 2025-12-15 09:52:23.468437597 +0000 UTC m=+0.150293811 container health_status 730c50020a7d9e2da21d2462d40af20ac303e43f3491f692cbbd87f432ba3e09 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, maintainer=Red Hat, Inc., config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'cb1e9478634a00c8fb5ad9cd7fcd38c7b2a1a85187dfaa618bc715c24c2b15fb'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, release=1755695350, container_name=openstack_network_exporter, io.openshift.tags=minimal rhel9, name=ubi9-minimal, build-date=2025-08-20T13:12:41, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-type=git, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, version=9.6, url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=ubi9-minimal-container, managed_by=edpm_ansible, vendor=Red Hat, Inc., config_id=openstack_network_exporter, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., distribution-scope=public, architecture=x86_64) Dec 15 04:52:23 localhost podman[299630]: 2025-12-15 09:52:23.49543477 +0000 UTC m=+0.174842795 container exec_died ac2eae30a2a7f971398af42ea67b3ed799d4b3341f7688ebcd73f7ca7f66c2a8 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, container_name=ovn_controller, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '07f3af15d79276418f54a8dde2fc251064527c3571430e14af77aaa2c01f1193'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202) Dec 15 04:52:23 localhost systemd[1]: ac2eae30a2a7f971398af42ea67b3ed799d4b3341f7688ebcd73f7ca7f66c2a8.service: Deactivated successfully. Dec 15 04:52:23 localhost podman[299629]: 2025-12-15 09:52:23.511298062 +0000 UTC m=+0.193154226 container exec_died 730c50020a7d9e2da21d2462d40af20ac303e43f3491f692cbbd87f432ba3e09 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, release=1755695350, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, architecture=x86_64, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'cb1e9478634a00c8fb5ad9cd7fcd38c7b2a1a85187dfaa618bc715c24c2b15fb'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.description=The 
Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://catalog.redhat.com/en/search?searchType=containers, config_id=openstack_network_exporter, container_name=openstack_network_exporter, io.openshift.tags=minimal rhel9, maintainer=Red Hat, Inc., vendor=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, name=ubi9-minimal, io.buildah.version=1.33.7, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., managed_by=edpm_ansible, version=9.6, build-date=2025-08-20T13:12:41, distribution-scope=public, vcs-type=git, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=) Dec 15 04:52:23 localhost systemd[1]: 730c50020a7d9e2da21d2462d40af20ac303e43f3491f692cbbd87f432ba3e09.service: Deactivated successfully. Dec 15 04:52:24 localhost systemd[1]: Stopping User Manager for UID 1002... Dec 15 04:52:24 localhost systemd[25924]: Activating special unit Exit the Session... Dec 15 04:52:24 localhost systemd[25924]: Removed slice User Background Tasks Slice. Dec 15 04:52:24 localhost systemd[25924]: Stopped target Main User Target. Dec 15 04:52:24 localhost systemd[25924]: Stopped target Basic System. Dec 15 04:52:24 localhost systemd[25924]: Stopped target Paths. Dec 15 04:52:24 localhost systemd[25924]: Stopped target Sockets. 
Dec 15 04:52:24 localhost systemd[25924]: Stopped target Timers. Dec 15 04:52:24 localhost systemd[25924]: Stopped Mark boot as successful after the user session has run 2 minutes. Dec 15 04:52:24 localhost systemd[25924]: Stopped Daily Cleanup of User's Temporary Directories. Dec 15 04:52:24 localhost systemd[25924]: Closed D-Bus User Message Bus Socket. Dec 15 04:52:24 localhost systemd[25924]: Stopped Create User's Volatile Files and Directories. Dec 15 04:52:24 localhost systemd[25924]: Removed slice User Application Slice. Dec 15 04:52:24 localhost systemd[25924]: Reached target Shutdown. Dec 15 04:52:24 localhost systemd[25924]: Finished Exit the Session. Dec 15 04:52:24 localhost systemd[25924]: Reached target Exit the Session. Dec 15 04:52:24 localhost systemd[1]: user@1002.service: Deactivated successfully. Dec 15 04:52:24 localhost systemd[1]: Stopped User Manager for UID 1002. Dec 15 04:52:24 localhost systemd[1]: user@1002.service: Consumed 12.074s CPU time, read 0B from disk, written 7.0K to disk. Dec 15 04:52:24 localhost systemd[1]: Stopping User Runtime Directory /run/user/1002... Dec 15 04:52:24 localhost systemd[1]: run-user-1002.mount: Deactivated successfully. Dec 15 04:52:24 localhost systemd[1]: user-runtime-dir@1002.service: Deactivated successfully. Dec 15 04:52:24 localhost systemd[1]: Stopped User Runtime Directory /run/user/1002. Dec 15 04:52:24 localhost systemd[1]: Removed slice User Slice of UID 1002. Dec 15 04:52:24 localhost systemd[1]: user-1002.slice: Consumed 4min 14.921s CPU time. 
Dec 15 04:52:25 localhost ceph-mon[298913]: mon.np0005559462@4(peon).osd e85 _set_new_cache_sizes cache_size:1020054615 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Dec 15 04:52:25 localhost nova_compute[286344]: 2025-12-15 09:52:25.455 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Dec 15 04:52:25 localhost nova_compute[286344]: 2025-12-15 09:52:25.457 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Dec 15 04:52:25 localhost nova_compute[286344]: 2025-12-15 09:52:25.457 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m Dec 15 04:52:25 localhost nova_compute[286344]: 2025-12-15 09:52:25.457 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Dec 15 04:52:25 localhost nova_compute[286344]: 2025-12-15 09:52:25.491 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 04:52:25 localhost nova_compute[286344]: 2025-12-15 09:52:25.492 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Dec 15 04:52:30 localhost ceph-mon[298913]: mon.np0005559462@4(peon).osd e85 _set_new_cache_sizes cache_size:1020054729 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Dec 15 04:52:30 localhost nova_compute[286344]: 2025-12-15 09:52:30.493 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Dec 15 04:52:30 localhost nova_compute[286344]: 2025-12-15 09:52:30.495 286348 DEBUG 
ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Dec 15 04:52:30 localhost nova_compute[286344]: 2025-12-15 09:52:30.495 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m Dec 15 04:52:30 localhost nova_compute[286344]: 2025-12-15 09:52:30.495 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Dec 15 04:52:30 localhost nova_compute[286344]: 2025-12-15 09:52:30.532 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 04:52:30 localhost nova_compute[286344]: 2025-12-15 09:52:30.533 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Dec 15 04:52:31 localhost systemd[1]: Started /usr/bin/podman healthcheck run 4d46c406a3855fd1c322e5d86c048188ea5865daa07ba23aaebacc7879668923. Dec 15 04:52:31 localhost systemd[1]: tmp-crun.mi3Ei3.mount: Deactivated successfully. 
Dec 15 04:52:31 localhost podman[299677]: 2025-12-15 09:52:31.776499427 +0000 UTC m=+0.100790521 container health_status 4d46c406a3855fd1c322e5d86c048188ea5865daa07ba23aaebacc7879668923 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '07f3af15d79276418f54a8dde2fc251064527c3571430e14af77aaa2c01f1193-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb) Dec 15 04:52:31 localhost 
podman[299677]: 2025-12-15 09:52:31.784255564 +0000 UTC m=+0.108546668 container exec_died 4d46c406a3855fd1c322e5d86c048188ea5865daa07ba23aaebacc7879668923 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '07f3af15d79276418f54a8dde2fc251064527c3571430e14af77aaa2c01f1193-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.build-date=20251202, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible, config_id=ovn_metadata_agent) Dec 15 04:52:31 localhost systemd[1]: 
4d46c406a3855fd1c322e5d86c048188ea5865daa07ba23aaebacc7879668923.service: Deactivated successfully. Dec 15 04:52:31 localhost podman[243449]: time="2025-12-15T09:52:31Z" level=info msg="List containers: received `last` parameter - overwriting `limit`" Dec 15 04:52:31 localhost podman[243449]: @ - - [15/Dec/2025:09:52:31 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 154814 "" "Go-http-client/1.1" Dec 15 04:52:31 localhost podman[243449]: @ - - [15/Dec/2025:09:52:31 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 18705 "" "Go-http-client/1.1" Dec 15 04:52:34 localhost openstack_network_exporter[246484]: ERROR 09:52:34 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd Dec 15 04:52:34 localhost openstack_network_exporter[246484]: ERROR 09:52:34 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd Dec 15 04:52:34 localhost openstack_network_exporter[246484]: ERROR 09:52:34 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server Dec 15 04:52:34 localhost openstack_network_exporter[246484]: ERROR 09:52:34 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath Dec 15 04:52:34 localhost openstack_network_exporter[246484]: Dec 15 04:52:34 localhost openstack_network_exporter[246484]: ERROR 09:52:34 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath Dec 15 04:52:34 localhost openstack_network_exporter[246484]: Dec 15 04:52:35 localhost ceph-mon[298913]: mon.np0005559462@4(peon).osd e85 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Dec 15 04:52:35 localhost nova_compute[286344]: 2025-12-15 09:52:35.534 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup 
/usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Dec 15 04:52:35 localhost nova_compute[286344]: 2025-12-15 09:52:35.536 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Dec 15 04:52:35 localhost nova_compute[286344]: 2025-12-15 09:52:35.536 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m Dec 15 04:52:35 localhost nova_compute[286344]: 2025-12-15 09:52:35.537 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Dec 15 04:52:35 localhost nova_compute[286344]: 2025-12-15 09:52:35.564 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 04:52:35 localhost nova_compute[286344]: 2025-12-15 09:52:35.565 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Dec 15 04:52:35 localhost nova_compute[286344]: 2025-12-15 09:52:35.568 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 04:52:36 localhost systemd[1]: session-65.scope: Deactivated successfully. Dec 15 04:52:36 localhost systemd[1]: session-65.scope: Consumed 1.715s CPU time. Dec 15 04:52:36 localhost systemd-logind[763]: Session 65 logged out. Waiting for processes to exit. Dec 15 04:52:36 localhost systemd-logind[763]: Removed session 65. Dec 15 04:52:37 localhost sshd[299695]: main: sshd: ssh-rsa algorithm is disabled Dec 15 04:52:38 localhost systemd[1]: Started /usr/bin/podman healthcheck run a4e94bd7aaee6c174c601060fb5f34559019ccea93fc397263ee3735731cfc1e. 
Dec 15 04:52:38 localhost podman[299697]: 2025-12-15 09:52:38.760428792 +0000 UTC m=+0.092786267 container health_status a4e94bd7aaee6c174c601060fb5f34559019ccea93fc397263ee3735731cfc1e (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible) Dec 15 04:52:38 localhost podman[299697]: 2025-12-15 09:52:38.79765389 +0000 UTC m=+0.130011335 container exec_died a4e94bd7aaee6c174c601060fb5f34559019ccea93fc397263ee3735731cfc1e (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', 
'/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible) Dec 15 04:52:38 localhost systemd[1]: a4e94bd7aaee6c174c601060fb5f34559019ccea93fc397263ee3735731cfc1e.service: Deactivated successfully. Dec 15 04:52:40 localhost ceph-mon[298913]: mon.np0005559462@4(peon).osd e85 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Dec 15 04:52:40 localhost nova_compute[286344]: 2025-12-15 09:52:40.570 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Dec 15 04:52:40 localhost nova_compute[286344]: 2025-12-15 09:52:40.571 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Dec 15 04:52:40 localhost nova_compute[286344]: 2025-12-15 09:52:40.572 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m Dec 15 04:52:40 localhost nova_compute[286344]: 2025-12-15 09:52:40.572 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Dec 15 04:52:40 localhost nova_compute[286344]: 2025-12-15 09:52:40.600 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 04:52:40 localhost nova_compute[286344]: 2025-12-15 09:52:40.600 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Dec 15 04:52:45 localhost ceph-mon[298913]: mon.np0005559462@4(peon).osd e85 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 
kv_alloc: 322961408 Dec 15 04:52:45 localhost nova_compute[286344]: 2025-12-15 09:52:45.601 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Dec 15 04:52:45 localhost nova_compute[286344]: 2025-12-15 09:52:45.603 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Dec 15 04:52:45 localhost nova_compute[286344]: 2025-12-15 09:52:45.603 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m Dec 15 04:52:45 localhost nova_compute[286344]: 2025-12-15 09:52:45.604 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Dec 15 04:52:45 localhost nova_compute[286344]: 2025-12-15 09:52:45.637 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 04:52:45 localhost nova_compute[286344]: 2025-12-15 09:52:45.638 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Dec 15 04:52:46 localhost systemd[1]: Stopping User Manager for UID 1003... Dec 15 04:52:46 localhost systemd[296827]: Activating special unit Exit the Session... Dec 15 04:52:46 localhost systemd[296827]: Stopped target Main User Target. Dec 15 04:52:46 localhost systemd[296827]: Stopped target Basic System. Dec 15 04:52:46 localhost systemd[296827]: Stopped target Paths. Dec 15 04:52:46 localhost systemd[296827]: Stopped target Sockets. Dec 15 04:52:46 localhost systemd[296827]: Stopped target Timers. Dec 15 04:52:46 localhost systemd[296827]: Stopped Mark boot as successful after the user session has run 2 minutes. 
Dec 15 04:52:46 localhost systemd[296827]: Stopped Daily Cleanup of User's Temporary Directories. Dec 15 04:52:46 localhost systemd[296827]: Closed D-Bus User Message Bus Socket. Dec 15 04:52:46 localhost systemd[296827]: Stopped Create User's Volatile Files and Directories. Dec 15 04:52:46 localhost systemd[296827]: Removed slice User Application Slice. Dec 15 04:52:46 localhost systemd[296827]: Reached target Shutdown. Dec 15 04:52:46 localhost systemd[296827]: Finished Exit the Session. Dec 15 04:52:46 localhost systemd[296827]: Reached target Exit the Session. Dec 15 04:52:46 localhost systemd[1]: user@1003.service: Deactivated successfully. Dec 15 04:52:46 localhost systemd[1]: Stopped User Manager for UID 1003. Dec 15 04:52:46 localhost systemd[1]: Stopping User Runtime Directory /run/user/1003... Dec 15 04:52:46 localhost systemd[1]: run-user-1003.mount: Deactivated successfully. Dec 15 04:52:46 localhost systemd[1]: user-runtime-dir@1003.service: Deactivated successfully. Dec 15 04:52:46 localhost systemd[1]: Stopped User Runtime Directory /run/user/1003. Dec 15 04:52:46 localhost systemd[1]: Removed slice User Slice of UID 1003. Dec 15 04:52:46 localhost systemd[1]: user-1003.slice: Consumed 2.259s CPU time. Dec 15 04:52:49 localhost ceph-mon[298913]: mon.np0005559462@4(peon).osd e86 e86: 6 total, 6 up, 6 in Dec 15 04:52:49 localhost ceph-mon[298913]: Activating manager daemon np0005559460.oexkup Dec 15 04:52:49 localhost ceph-mon[298913]: Manager daemon np0005559459.hhnowu is unresponsive, replacing it with standby daemon np0005559460.oexkup Dec 15 04:52:49 localhost sshd[299721]: main: sshd: ssh-rsa algorithm is disabled Dec 15 04:52:49 localhost systemd-logind[763]: New session 67 of user ceph-admin. Dec 15 04:52:49 localhost systemd[1]: Created slice User Slice of UID 1002. Dec 15 04:52:49 localhost systemd[1]: Starting User Runtime Directory /run/user/1002... Dec 15 04:52:49 localhost systemd[1]: Finished User Runtime Directory /run/user/1002. 
Dec 15 04:52:49 localhost systemd[1]: Starting User Manager for UID 1002... Dec 15 04:52:49 localhost systemd[299725]: Queued start job for default target Main User Target. Dec 15 04:52:49 localhost systemd[299725]: Created slice User Application Slice. Dec 15 04:52:49 localhost systemd[299725]: Started Mark boot as successful after the user session has run 2 minutes. Dec 15 04:52:49 localhost systemd[299725]: Started Daily Cleanup of User's Temporary Directories. Dec 15 04:52:49 localhost systemd[299725]: Reached target Paths. Dec 15 04:52:49 localhost systemd[299725]: Reached target Timers. Dec 15 04:52:49 localhost systemd[299725]: Starting D-Bus User Message Bus Socket... Dec 15 04:52:49 localhost systemd[299725]: Starting Create User's Volatile Files and Directories... Dec 15 04:52:50 localhost systemd[299725]: Listening on D-Bus User Message Bus Socket. Dec 15 04:52:50 localhost systemd[299725]: Reached target Sockets. Dec 15 04:52:50 localhost systemd[299725]: Finished Create User's Volatile Files and Directories. Dec 15 04:52:50 localhost systemd[299725]: Reached target Basic System. Dec 15 04:52:50 localhost systemd[299725]: Reached target Main User Target. Dec 15 04:52:50 localhost systemd[299725]: Startup finished in 164ms. Dec 15 04:52:50 localhost systemd[1]: Started User Manager for UID 1002. Dec 15 04:52:50 localhost systemd[1]: Started Session 67 of User ceph-admin. 
Dec 15 04:52:50 localhost ceph-mon[298913]: mon.np0005559462@4(peon).osd e86 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Dec 15 04:52:50 localhost ceph-mon[298913]: Manager daemon np0005559460.oexkup is now available Dec 15 04:52:50 localhost ceph-mon[298913]: removing stray HostCache host record np0005559459.localdomain.devices.0 Dec 15 04:52:50 localhost ceph-mon[298913]: from='mgr.24104 172.18.0.104:0/2825492105' entity='mgr.np0005559460.oexkup' cmd={"prefix":"config-key del","key":"mgr/cephadm/host.np0005559459.localdomain.devices.0"} : dispatch Dec 15 04:52:50 localhost ceph-mon[298913]: from='mgr.24104 ' entity='mgr.np0005559460.oexkup' cmd={"prefix":"config-key del","key":"mgr/cephadm/host.np0005559459.localdomain.devices.0"} : dispatch Dec 15 04:52:50 localhost ceph-mon[298913]: from='mgr.24104 ' entity='mgr.np0005559460.oexkup' cmd='[{"prefix":"config-key del","key":"mgr/cephadm/host.np0005559459.localdomain.devices.0"}]': finished Dec 15 04:52:50 localhost ceph-mon[298913]: from='mgr.24104 ' entity='mgr.np0005559460.oexkup' cmd={"prefix":"config-key del","key":"mgr/cephadm/host.np0005559459.localdomain.devices.0"} : dispatch Dec 15 04:52:50 localhost ceph-mon[298913]: from='mgr.24104 172.18.0.104:0/2825492105' entity='mgr.np0005559460.oexkup' cmd={"prefix":"config-key del","key":"mgr/cephadm/host.np0005559459.localdomain.devices.0"} : dispatch Dec 15 04:52:50 localhost ceph-mon[298913]: from='mgr.24104 ' entity='mgr.np0005559460.oexkup' cmd='[{"prefix":"config-key del","key":"mgr/cephadm/host.np0005559459.localdomain.devices.0"}]': finished Dec 15 04:52:50 localhost ceph-mon[298913]: from='mgr.24104 ' entity='mgr.np0005559460.oexkup' cmd={"prefix":"config rm","who":"mgr","name":"mgr/rbd_support/np0005559460.oexkup/mirror_snapshot_schedule"} : dispatch Dec 15 04:52:50 localhost ceph-mon[298913]: from='mgr.24104 172.18.0.104:0/2825492105' entity='mgr.np0005559460.oexkup' cmd={"prefix":"config 
rm","who":"mgr","name":"mgr/rbd_support/np0005559460.oexkup/mirror_snapshot_schedule"} : dispatch Dec 15 04:52:50 localhost ceph-mon[298913]: from='mgr.24104 ' entity='mgr.np0005559460.oexkup' cmd={"prefix":"config rm","who":"mgr","name":"mgr/rbd_support/np0005559460.oexkup/trash_purge_schedule"} : dispatch Dec 15 04:52:50 localhost ceph-mon[298913]: from='mgr.24104 172.18.0.104:0/2825492105' entity='mgr.np0005559460.oexkup' cmd={"prefix":"config rm","who":"mgr","name":"mgr/rbd_support/np0005559460.oexkup/trash_purge_schedule"} : dispatch Dec 15 04:52:50 localhost nova_compute[286344]: 2025-12-15 09:52:50.639 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4996-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Dec 15 04:52:50 localhost nova_compute[286344]: 2025-12-15 09:52:50.642 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Dec 15 04:52:50 localhost nova_compute[286344]: 2025-12-15 09:52:50.643 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5004 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m Dec 15 04:52:50 localhost nova_compute[286344]: 2025-12-15 09:52:50.643 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Dec 15 04:52:50 localhost nova_compute[286344]: 2025-12-15 09:52:50.673 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 04:52:50 localhost nova_compute[286344]: 2025-12-15 09:52:50.674 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Dec 15 04:52:50 localhost systemd[1]: Started /usr/bin/podman healthcheck run 
9f0f481c937606298203797a71924b7614a5d9e3f4a8afd837cfa5b9602aedeb. Dec 15 04:52:50 localhost systemd[1]: Started /usr/bin/podman healthcheck run b3d8483ade539500e228c2af9c0c3e647febb5ec7903610a83aea32631007d4a. Dec 15 04:52:51 localhost podman[299820]: 2025-12-15 09:52:51.061283946 +0000 UTC m=+0.146814154 container health_status b3d8483ade539500e228c2af9c0c3e647febb5ec7903610a83aea32631007d4a (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, config_id=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '07f3af15d79276418f54a8dde2fc251064527c3571430e14af77aaa2c01f1193-cb1e9478634a00c8fb5ad9cd7fcd38c7b2a1a85187dfaa618bc715c24c2b15fb'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.license=GPLv2) Dec 15 04:52:51 localhost systemd[1]: Started /usr/bin/podman healthcheck run 67ee72fcd99c896f79e8807764b1d0f04e7d45927d06e306c0986394e8ed42a0. Dec 15 04:52:51 localhost podman[299820]: 2025-12-15 09:52:51.102508106 +0000 UTC m=+0.188038374 container exec_died b3d8483ade539500e228c2af9c0c3e647febb5ec7903610a83aea32631007d4a (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, config_id=ceilometer_agent_compute, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, container_name=ceilometer_agent_compute, managed_by=edpm_ansible, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '07f3af15d79276418f54a8dde2fc251064527c3571430e14af77aaa2c01f1193-cb1e9478634a00c8fb5ad9cd7fcd38c7b2a1a85187dfaa618bc715c24c2b15fb'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', 
'/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3) Dec 15 04:52:51 localhost systemd[1]: b3d8483ade539500e228c2af9c0c3e647febb5ec7903610a83aea32631007d4a.service: Deactivated successfully. Dec 15 04:52:51 localhost podman[299819]: 2025-12-15 09:52:51.109458329 +0000 UTC m=+0.198178336 container health_status 9f0f481c937606298203797a71924b7614a5d9e3f4a8afd837cfa5b9602aedeb (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', 
'/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, container_name=multipathd, org.label-schema.name=CentOS Stream 9 Base Image) Dec 15 04:52:51 localhost podman[299876]: 2025-12-15 09:52:51.166669424 +0000 UTC m=+0.089150306 container health_status 67ee72fcd99c896f79e8807764b1d0f04e7d45927d06e306c0986394e8ed42a0 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, managed_by=edpm_ansible, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors ) Dec 15 
04:52:51 localhost podman[299819]: 2025-12-15 09:52:51.193454781 +0000 UTC m=+0.282174818 container exec_died 9f0f481c937606298203797a71924b7614a5d9e3f4a8afd837cfa5b9602aedeb (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, container_name=multipathd, org.label-schema.vendor=CentOS, tcib_managed=true) Dec 15 04:52:51 localhost podman[299876]: 2025-12-15 09:52:51.205380793 +0000 UTC m=+0.127861685 container exec_died 67ee72fcd99c896f79e8807764b1d0f04e7d45927d06e306c0986394e8ed42a0 
(image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible) Dec 15 04:52:51 localhost systemd[1]: 9f0f481c937606298203797a71924b7614a5d9e3f4a8afd837cfa5b9602aedeb.service: Deactivated successfully. Dec 15 04:52:51 localhost systemd[1]: 67ee72fcd99c896f79e8807764b1d0f04e7d45927d06e306c0986394e8ed42a0.service: Deactivated successfully. 
Dec 15 04:52:51 localhost podman[299895]: 2025-12-15 09:52:51.318803805 +0000 UTC m=+0.163710005 container exec 8dcda56b365b42dc8758aab77a9ec80db304780e449052738f7e4e648ae1ecaf (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-bce17446-41b5-5408-a23e-0b011906b44a-crash-np0005559462, GIT_CLEAN=True, vendor=Red Hat, Inc., build-date=2025-11-26T19:44:28Z, name=rhceph, maintainer=Guillaume Abrioux , io.openshift.tags=rhceph ceph, url=https://catalog.redhat.com/en/search?searchType=containers, io.buildah.version=1.41.4, io.k8s.description=Red Hat Ceph Storage 7, RELEASE=main, ceph=True, io.openshift.expose-services=, vcs-type=git, release=1763362218, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, description=Red Hat Ceph Storage 7, distribution-scope=public, version=7, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, GIT_REPO=https://github.com/ceph/ceph-container.git, architecture=x86_64, com.redhat.component=rhceph-container, GIT_BRANCH=main, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, CEPH_POINT_RELEASE=, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI) Dec 15 04:52:51 localhost ovn_metadata_agent[160585]: 2025-12-15 09:52:51.469 160590 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Dec 15 04:52:51 localhost ovn_metadata_agent[160585]: 2025-12-15 09:52:51.470 160590 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.002s inner 
/usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Dec 15 04:52:51 localhost ovn_metadata_agent[160585]: 2025-12-15 09:52:51.472 160590 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Dec 15 04:52:51 localhost podman[299895]: 2025-12-15 09:52:51.498561027 +0000 UTC m=+0.343467177 container exec_died 8dcda56b365b42dc8758aab77a9ec80db304780e449052738f7e4e648ae1ecaf (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-bce17446-41b5-5408-a23e-0b011906b44a-crash-np0005559462, GIT_CLEAN=True, io.openshift.expose-services=, RELEASE=main, release=1763362218, vcs-type=git, vendor=Red Hat, Inc., name=rhceph, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, ceph=True, io.openshift.tags=rhceph ceph, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, com.redhat.component=rhceph-container, distribution-scope=public, architecture=x86_64, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, GIT_REPO=https://github.com/ceph/ceph-container.git, version=7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, CEPH_POINT_RELEASE=, build-date=2025-11-26T19:44:28Z, io.buildah.version=1.41.4, io.k8s.description=Red Hat Ceph Storage 7, maintainer=Guillaume Abrioux , GIT_BRANCH=main, url=https://catalog.redhat.com/en/search?searchType=containers, description=Red Hat Ceph Storage 7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0) Dec 15 04:52:52 localhost ceph-mon[298913]: [15/Dec/2025:09:52:50] ENGINE Bus STARTING Dec 15 04:52:52 localhost ceph-mon[298913]: [15/Dec/2025:09:52:50] ENGINE Serving on https://172.18.0.104:7150 
Dec 15 04:52:52 localhost ceph-mon[298913]: [15/Dec/2025:09:52:50] ENGINE Client ('172.18.0.104', 60646) lost — peer dropped the TLS connection suddenly, during handshake: (6, 'TLS/SSL connection has been closed (EOF) (_ssl.c:1147)') Dec 15 04:52:52 localhost ceph-mon[298913]: [15/Dec/2025:09:52:50] ENGINE Serving on http://172.18.0.104:8765 Dec 15 04:52:52 localhost ceph-mon[298913]: [15/Dec/2025:09:52:50] ENGINE Bus STARTED Dec 15 04:52:52 localhost ceph-mon[298913]: from='mgr.24104 ' entity='mgr.np0005559460.oexkup' Dec 15 04:52:52 localhost ceph-mon[298913]: from='mgr.24104 ' entity='mgr.np0005559460.oexkup' Dec 15 04:52:52 localhost ceph-mon[298913]: from='mgr.24104 ' entity='mgr.np0005559460.oexkup' Dec 15 04:52:52 localhost ceph-mon[298913]: from='mgr.24104 ' entity='mgr.np0005559460.oexkup' Dec 15 04:52:52 localhost ceph-mon[298913]: from='mgr.24104 ' entity='mgr.np0005559460.oexkup' Dec 15 04:52:52 localhost ceph-mon[298913]: from='mgr.24104 ' entity='mgr.np0005559460.oexkup' Dec 15 04:52:52 localhost ceph-mon[298913]: from='mgr.24104 ' entity='mgr.np0005559460.oexkup' Dec 15 04:52:52 localhost ceph-mon[298913]: from='mgr.24104 ' entity='mgr.np0005559460.oexkup' Dec 15 04:52:52 localhost ceph-mon[298913]: from='mgr.24104 ' entity='mgr.np0005559460.oexkup' Dec 15 04:52:52 localhost ceph-mon[298913]: from='mgr.24104 ' entity='mgr.np0005559460.oexkup' Dec 15 04:52:53 localhost systemd[1]: Started /usr/bin/podman healthcheck run 730c50020a7d9e2da21d2462d40af20ac303e43f3491f692cbbd87f432ba3e09. Dec 15 04:52:53 localhost systemd[1]: Started /usr/bin/podman healthcheck run ac2eae30a2a7f971398af42ea67b3ed799d4b3341f7688ebcd73f7ca7f66c2a8. 
Dec 15 04:52:53 localhost podman[300146]: 2025-12-15 09:52:53.780273919 +0000 UTC m=+0.101677056 container health_status 730c50020a7d9e2da21d2462d40af20ac303e43f3491f692cbbd87f432ba3e09 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, build-date=2025-08-20T13:12:41, com.redhat.component=ubi9-minimal-container, io.openshift.expose-services=, release=1755695350, maintainer=Red Hat, Inc., config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'cb1e9478634a00c8fb5ad9cd7fcd38c7b2a1a85187dfaa618bc715c24c2b15fb'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vcs-type=git, architecture=x86_64, io.buildah.version=1.33.7, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. 
This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=ubi9-minimal, vendor=Red Hat, Inc., config_id=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, url=https://catalog.redhat.com/en/search?searchType=containers, container_name=openstack_network_exporter, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, version=9.6, managed_by=edpm_ansible, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9.) Dec 15 04:52:53 localhost podman[300146]: 2025-12-15 09:52:53.82158038 +0000 UTC m=+0.142983527 container exec_died 730c50020a7d9e2da21d2462d40af20ac303e43f3491f692cbbd87f432ba3e09 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, distribution-scope=public, managed_by=edpm_ansible, config_id=openstack_network_exporter, name=ubi9-minimal, vcs-type=git, version=9.6, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., release=1755695350, url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.expose-services=, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., architecture=x86_64, build-date=2025-08-20T13:12:41, io.buildah.version=1.33.7, com.redhat.component=ubi9-minimal-container, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, container_name=openstack_network_exporter, maintainer=Red Hat, Inc., config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'cb1e9478634a00c8fb5ad9cd7fcd38c7b2a1a85187dfaa618bc715c24c2b15fb'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}) Dec 15 04:52:53 localhost systemd[1]: tmp-crun.k28X3f.mount: Deactivated successfully. Dec 15 04:52:53 localhost systemd[1]: 730c50020a7d9e2da21d2462d40af20ac303e43f3491f692cbbd87f432ba3e09.service: Deactivated successfully. 
Dec 15 04:52:53 localhost podman[300148]: 2025-12-15 09:52:53.835615261 +0000 UTC m=+0.156896645 container health_status ac2eae30a2a7f971398af42ea67b3ed799d4b3341f7688ebcd73f7ca7f66c2a8 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_id=ovn_controller, container_name=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '07f3af15d79276418f54a8dde2fc251064527c3571430e14af77aaa2c01f1193'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb) Dec 15 04:52:53 localhost podman[300148]: 2025-12-15 09:52:53.903642488 +0000 UTC m=+0.224923902 container exec_died ac2eae30a2a7f971398af42ea67b3ed799d4b3341f7688ebcd73f7ca7f66c2a8 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, org.label-schema.schema-version=1.0, tcib_managed=true, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, 
maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_id=ovn_controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '07f3af15d79276418f54a8dde2fc251064527c3571430e14af77aaa2c01f1193'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20251202, container_name=ovn_controller) Dec 15 04:52:53 localhost systemd[1]: ac2eae30a2a7f971398af42ea67b3ed799d4b3341f7688ebcd73f7ca7f66c2a8.service: Deactivated successfully. 
Dec 15 04:52:54 localhost ceph-mon[298913]: from='mgr.24104 ' entity='mgr.np0005559460.oexkup' Dec 15 04:52:54 localhost ceph-mon[298913]: from='mgr.24104 ' entity='mgr.np0005559460.oexkup' Dec 15 04:52:54 localhost ceph-mon[298913]: from='mgr.24104 172.18.0.104:0/2825492105' entity='mgr.np0005559460.oexkup' cmd={"prefix": "config rm", "who": "osd/host:np0005559460", "name": "osd_memory_target"} : dispatch Dec 15 04:52:54 localhost ceph-mon[298913]: from='mgr.24104 ' entity='mgr.np0005559460.oexkup' cmd={"prefix": "config rm", "who": "osd/host:np0005559460", "name": "osd_memory_target"} : dispatch Dec 15 04:52:54 localhost ceph-mon[298913]: Saving service mon spec with placement label:mon Dec 15 04:52:54 localhost ceph-mon[298913]: from='mgr.24104 ' entity='mgr.np0005559460.oexkup' Dec 15 04:52:54 localhost ceph-mon[298913]: from='mgr.24104 ' entity='mgr.np0005559460.oexkup' Dec 15 04:52:54 localhost ceph-mon[298913]: from='mgr.24104 ' entity='mgr.np0005559460.oexkup' Dec 15 04:52:54 localhost ceph-mon[298913]: from='mgr.24104 ' entity='mgr.np0005559460.oexkup' cmd={"prefix": "config rm", "who": "osd/host:np0005559461", "name": "osd_memory_target"} : dispatch Dec 15 04:52:54 localhost ceph-mon[298913]: from='mgr.24104 172.18.0.104:0/2825492105' entity='mgr.np0005559460.oexkup' cmd={"prefix": "config rm", "who": "osd/host:np0005559461", "name": "osd_memory_target"} : dispatch Dec 15 04:52:54 localhost ceph-mon[298913]: from='mgr.24104 ' entity='mgr.np0005559460.oexkup' Dec 15 04:52:54 localhost ceph-mon[298913]: from='mgr.24104 ' entity='mgr.np0005559460.oexkup' Dec 15 04:52:54 localhost ceph-mon[298913]: from='mgr.24104 172.18.0.104:0/2825492105' entity='mgr.np0005559460.oexkup' cmd={"prefix": "config rm", "who": "osd.2", "name": "osd_memory_target"} : dispatch Dec 15 04:52:54 localhost ceph-mon[298913]: from='mgr.24104 ' entity='mgr.np0005559460.oexkup' cmd={"prefix": "config rm", "who": "osd.2", "name": "osd_memory_target"} : dispatch Dec 15 04:52:54 localhost 
ceph-mon[298913]: from='mgr.24104 ' entity='mgr.np0005559460.oexkup' Dec 15 04:52:54 localhost ceph-mon[298913]: from='mgr.24104 172.18.0.104:0/2825492105' entity='mgr.np0005559460.oexkup' cmd={"prefix": "config rm", "who": "osd.5", "name": "osd_memory_target"} : dispatch Dec 15 04:52:54 localhost ceph-mon[298913]: from='mgr.24104 ' entity='mgr.np0005559460.oexkup' cmd={"prefix": "config rm", "who": "osd.5", "name": "osd_memory_target"} : dispatch Dec 15 04:52:54 localhost ceph-mon[298913]: from='mgr.24104 ' entity='mgr.np0005559460.oexkup' Dec 15 04:52:54 localhost ceph-mon[298913]: from='mgr.24104 ' entity='mgr.np0005559460.oexkup' cmd={"prefix": "config rm", "who": "osd.1", "name": "osd_memory_target"} : dispatch Dec 15 04:52:54 localhost ceph-mon[298913]: from='mgr.24104 172.18.0.104:0/2825492105' entity='mgr.np0005559460.oexkup' cmd={"prefix": "config rm", "who": "osd.1", "name": "osd_memory_target"} : dispatch Dec 15 04:52:54 localhost ceph-mon[298913]: Adjusting osd_memory_target on np0005559463.localdomain to 836.6M Dec 15 04:52:54 localhost ceph-mon[298913]: from='mgr.24104 ' entity='mgr.np0005559460.oexkup' cmd={"prefix": "config rm", "who": "osd.4", "name": "osd_memory_target"} : dispatch Dec 15 04:52:54 localhost ceph-mon[298913]: from='mgr.24104 172.18.0.104:0/2825492105' entity='mgr.np0005559460.oexkup' cmd={"prefix": "config rm", "who": "osd.4", "name": "osd_memory_target"} : dispatch Dec 15 04:52:54 localhost ceph-mon[298913]: Unable to set osd_memory_target on np0005559463.localdomain to 877246668: error parsing value: Value '877246668' is below minimum 939524096 Dec 15 04:52:54 localhost ceph-mon[298913]: Adjusting osd_memory_target on np0005559464.localdomain to 836.6M Dec 15 04:52:54 localhost ceph-mon[298913]: Unable to set osd_memory_target on np0005559464.localdomain to 877246668: error parsing value: Value '877246668' is below minimum 939524096 Dec 15 04:52:54 localhost ceph-mon[298913]: from='mgr.24104 ' entity='mgr.np0005559460.oexkup' Dec 
15 04:52:54 localhost ceph-mon[298913]: from='mgr.24104 ' entity='mgr.np0005559460.oexkup' Dec 15 04:52:54 localhost ceph-mon[298913]: from='mgr.24104 ' entity='mgr.np0005559460.oexkup' cmd={"prefix": "config rm", "who": "osd.0", "name": "osd_memory_target"} : dispatch Dec 15 04:52:54 localhost ceph-mon[298913]: from='mgr.24104 172.18.0.104:0/2825492105' entity='mgr.np0005559460.oexkup' cmd={"prefix": "config rm", "who": "osd.0", "name": "osd_memory_target"} : dispatch Dec 15 04:52:54 localhost ceph-mon[298913]: from='mgr.24104 ' entity='mgr.np0005559460.oexkup' cmd={"prefix": "config rm", "who": "osd.3", "name": "osd_memory_target"} : dispatch Dec 15 04:52:54 localhost ceph-mon[298913]: from='mgr.24104 172.18.0.104:0/2825492105' entity='mgr.np0005559460.oexkup' cmd={"prefix": "config rm", "who": "osd.3", "name": "osd_memory_target"} : dispatch Dec 15 04:52:54 localhost ceph-mon[298913]: from='mgr.24104 172.18.0.104:0/2825492105' entity='mgr.np0005559460.oexkup' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch Dec 15 04:52:55 localhost ceph-mon[298913]: mon.np0005559462@4(peon).osd e86 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Dec 15 04:52:55 localhost ceph-mon[298913]: Adjusting osd_memory_target on np0005559462.localdomain to 836.6M Dec 15 04:52:55 localhost ceph-mon[298913]: Unable to set osd_memory_target on np0005559462.localdomain to 877243801: error parsing value: Value '877243801' is below minimum 939524096 Dec 15 04:52:55 localhost ceph-mon[298913]: Updating np0005559460.localdomain:/etc/ceph/ceph.conf Dec 15 04:52:55 localhost ceph-mon[298913]: Updating np0005559461.localdomain:/etc/ceph/ceph.conf Dec 15 04:52:55 localhost ceph-mon[298913]: Updating np0005559462.localdomain:/etc/ceph/ceph.conf Dec 15 04:52:55 localhost ceph-mon[298913]: Updating np0005559463.localdomain:/etc/ceph/ceph.conf Dec 15 04:52:55 localhost ceph-mon[298913]: Updating 
np0005559464.localdomain:/etc/ceph/ceph.conf Dec 15 04:52:55 localhost ceph-mon[298913]: Updating np0005559464.localdomain:/var/lib/ceph/bce17446-41b5-5408-a23e-0b011906b44a/config/ceph.conf Dec 15 04:52:55 localhost ceph-mon[298913]: Updating np0005559461.localdomain:/var/lib/ceph/bce17446-41b5-5408-a23e-0b011906b44a/config/ceph.conf Dec 15 04:52:55 localhost ceph-mon[298913]: Updating np0005559460.localdomain:/var/lib/ceph/bce17446-41b5-5408-a23e-0b011906b44a/config/ceph.conf Dec 15 04:52:55 localhost ceph-mon[298913]: Updating np0005559463.localdomain:/var/lib/ceph/bce17446-41b5-5408-a23e-0b011906b44a/config/ceph.conf Dec 15 04:52:55 localhost ceph-mon[298913]: Updating np0005559462.localdomain:/var/lib/ceph/bce17446-41b5-5408-a23e-0b011906b44a/config/ceph.conf Dec 15 04:52:55 localhost nova_compute[286344]: 2025-12-15 09:52:55.674 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Dec 15 04:52:55 localhost nova_compute[286344]: 2025-12-15 09:52:55.676 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Dec 15 04:52:55 localhost nova_compute[286344]: 2025-12-15 09:52:55.676 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m Dec 15 04:52:55 localhost nova_compute[286344]: 2025-12-15 09:52:55.677 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Dec 15 04:52:55 localhost nova_compute[286344]: 2025-12-15 09:52:55.704 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 04:52:55 localhost nova_compute[286344]: 2025-12-15 09:52:55.705 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 
tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Dec 15 04:52:56 localhost ceph-mon[298913]: Updating np0005559464.localdomain:/etc/ceph/ceph.client.admin.keyring Dec 15 04:52:56 localhost ceph-mon[298913]: Updating np0005559460.localdomain:/etc/ceph/ceph.client.admin.keyring Dec 15 04:52:56 localhost ceph-mon[298913]: Updating np0005559461.localdomain:/etc/ceph/ceph.client.admin.keyring Dec 15 04:52:56 localhost ceph-mon[298913]: Updating np0005559463.localdomain:/etc/ceph/ceph.client.admin.keyring Dec 15 04:52:56 localhost ceph-mon[298913]: Updating np0005559462.localdomain:/etc/ceph/ceph.client.admin.keyring Dec 15 04:52:57 localhost ceph-mon[298913]: Updating np0005559464.localdomain:/var/lib/ceph/bce17446-41b5-5408-a23e-0b011906b44a/config/ceph.client.admin.keyring Dec 15 04:52:57 localhost ceph-mon[298913]: Updating np0005559460.localdomain:/var/lib/ceph/bce17446-41b5-5408-a23e-0b011906b44a/config/ceph.client.admin.keyring Dec 15 04:52:57 localhost ceph-mon[298913]: Updating np0005559461.localdomain:/var/lib/ceph/bce17446-41b5-5408-a23e-0b011906b44a/config/ceph.client.admin.keyring Dec 15 04:52:57 localhost ceph-mon[298913]: Updating np0005559463.localdomain:/var/lib/ceph/bce17446-41b5-5408-a23e-0b011906b44a/config/ceph.client.admin.keyring Dec 15 04:52:57 localhost ceph-mon[298913]: Updating np0005559462.localdomain:/var/lib/ceph/bce17446-41b5-5408-a23e-0b011906b44a/config/ceph.client.admin.keyring Dec 15 04:52:57 localhost ceph-mon[298913]: from='mgr.24104 ' entity='mgr.np0005559460.oexkup' Dec 15 04:52:57 localhost ceph-mon[298913]: from='mgr.24104 ' entity='mgr.np0005559460.oexkup' Dec 15 04:52:57 localhost ceph-mon[298913]: from='mgr.24104 ' entity='mgr.np0005559460.oexkup' Dec 15 04:52:57 localhost ceph-mon[298913]: from='mgr.24104 ' entity='mgr.np0005559460.oexkup' Dec 15 04:52:57 localhost ceph-mon[298913]: from='mgr.24104 ' entity='mgr.np0005559460.oexkup' Dec 15 04:52:57 localhost 
ceph-mon[298913]: from='mgr.24104 ' entity='mgr.np0005559460.oexkup' Dec 15 04:52:57 localhost ceph-mon[298913]: from='mgr.24104 ' entity='mgr.np0005559460.oexkup' Dec 15 04:52:57 localhost ceph-mon[298913]: from='mgr.24104 ' entity='mgr.np0005559460.oexkup' Dec 15 04:52:57 localhost ceph-mon[298913]: from='mgr.24104 ' entity='mgr.np0005559460.oexkup' Dec 15 04:52:57 localhost ceph-mon[298913]: from='mgr.24104 ' entity='mgr.np0005559460.oexkup' Dec 15 04:52:57 localhost ceph-mon[298913]: from='mgr.24104 ' entity='mgr.np0005559460.oexkup' Dec 15 04:52:57 localhost ceph-mon[298913]: from='mgr.24104 ' entity='mgr.np0005559460.oexkup' Dec 15 04:52:57 localhost ceph-mon[298913]: from='mgr.24104 172.18.0.104:0/2825492105' entity='mgr.np0005559460.oexkup' cmd={"prefix": "auth get", "entity": "mon."} : dispatch Dec 15 04:52:58 localhost ceph-mon[298913]: Reconfiguring mon.np0005559460 (monmap changed)... Dec 15 04:52:58 localhost ceph-mon[298913]: Reconfiguring daemon mon.np0005559460 on np0005559460.localdomain Dec 15 04:52:58 localhost ceph-mon[298913]: from='mgr.24104 ' entity='mgr.np0005559460.oexkup' Dec 15 04:52:58 localhost ceph-mon[298913]: from='mgr.24104 ' entity='mgr.np0005559460.oexkup' Dec 15 04:52:58 localhost ceph-mon[298913]: from='mgr.24104 172.18.0.104:0/2825492105' entity='mgr.np0005559460.oexkup' cmd={"prefix": "auth get", "entity": "mon."} : dispatch Dec 15 04:52:58 localhost ceph-mon[298913]: mon.np0005559462@4(peon) e9 handle_command mon_command({"prefix": "df", "format": "json"} v 0) Dec 15 04:52:58 localhost ceph-mon[298913]: log_channel(audit) log [DBG] : from='client.? 172.18.0.108:0/537385078' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch Dec 15 04:52:59 localhost ceph-mon[298913]: Reconfiguring mon.np0005559461 (monmap changed)... 
Dec 15 04:52:59 localhost ceph-mon[298913]: Reconfiguring daemon mon.np0005559461 on np0005559461.localdomain Dec 15 04:52:59 localhost ceph-mon[298913]: from='mgr.24104 ' entity='mgr.np0005559460.oexkup' Dec 15 04:52:59 localhost ceph-mon[298913]: from='mgr.24104 ' entity='mgr.np0005559460.oexkup' Dec 15 04:52:59 localhost ceph-mon[298913]: from='mgr.24104 172.18.0.104:0/2825492105' entity='mgr.np0005559460.oexkup' cmd={"prefix": "auth get", "entity": "osd.0"} : dispatch Dec 15 04:52:59 localhost podman[300902]: Dec 15 04:52:59 localhost podman[300902]: 2025-12-15 09:52:59.440058397 +0000 UTC m=+0.046739134 image pull registry.redhat.io/rhceph/rhceph-7-rhel9:latest Dec 15 04:52:59 localhost podman[300902]: 2025-12-15 09:52:59.673005381 +0000 UTC m=+0.279686048 container create 55801f788dcc3a1639c13718b23ddc9b99784624e211e503be07a9f6d1e6cce1 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=elegant_bell, GIT_BRANCH=main, name=rhceph, maintainer=Guillaume Abrioux , url=https://catalog.redhat.com/en/search?searchType=containers, version=7, description=Red Hat Ceph Storage 7, com.redhat.component=rhceph-container, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.k8s.description=Red Hat Ceph Storage 7, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, io.openshift.expose-services=, vendor=Red Hat, Inc., distribution-scope=public, ceph=True, release=1763362218, vcs-type=git, RELEASE=main, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.openshift.tags=rhceph ceph, CEPH_POINT_RELEASE=, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_CLEAN=True, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, architecture=x86_64, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, build-date=2025-11-26T19:44:28Z, io.buildah.version=1.41.4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, 
GIT_REPO=https://github.com/ceph/ceph-container.git) Dec 15 04:52:59 localhost systemd[1]: Started libpod-conmon-55801f788dcc3a1639c13718b23ddc9b99784624e211e503be07a9f6d1e6cce1.scope. Dec 15 04:52:59 localhost systemd[1]: Started libcrun container. Dec 15 04:52:59 localhost podman[300902]: 2025-12-15 09:52:59.74720322 +0000 UTC m=+0.353883927 container init 55801f788dcc3a1639c13718b23ddc9b99784624e211e503be07a9f6d1e6cce1 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=elegant_bell, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, architecture=x86_64, vendor=Red Hat, Inc., build-date=2025-11-26T19:44:28Z, GIT_REPO=https://github.com/ceph/ceph-container.git, io.openshift.expose-services=, description=Red Hat Ceph Storage 7, name=rhceph, io.k8s.description=Red Hat Ceph Storage 7, vcs-type=git, RELEASE=main, version=7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.openshift.tags=rhceph ceph, release=1763362218, GIT_BRANCH=main, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, ceph=True, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, maintainer=Guillaume Abrioux , url=https://catalog.redhat.com/en/search?searchType=containers, CEPH_POINT_RELEASE=, GIT_CLEAN=True, com.redhat.component=rhceph-container, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, io.buildah.version=1.41.4) Dec 15 04:52:59 localhost podman[300902]: 2025-12-15 09:52:59.759837282 +0000 UTC m=+0.366517979 container start 55801f788dcc3a1639c13718b23ddc9b99784624e211e503be07a9f6d1e6cce1 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=elegant_bell, maintainer=Guillaume Abrioux , description=Red Hat Ceph Storage 7, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, release=1763362218, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, build-date=2025-11-26T19:44:28Z, CEPH_POINT_RELEASE=, GIT_CLEAN=True, GIT_BRANCH=main, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., architecture=x86_64, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:enterprise_linux:9::appstream, version=7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vcs-type=git, GIT_REPO=https://github.com/ceph/ceph-container.git, RELEASE=main, com.redhat.component=rhceph-container, name=rhceph, io.openshift.tags=rhceph ceph, url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.expose-services=, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, ceph=True, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.k8s.description=Red Hat Ceph Storage 7, io.buildah.version=1.41.4) Dec 15 04:52:59 localhost podman[300902]: 2025-12-15 09:52:59.760198572 +0000 UTC m=+0.366879329 container attach 55801f788dcc3a1639c13718b23ddc9b99784624e211e503be07a9f6d1e6cce1 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=elegant_bell, GIT_BRANCH=main, name=rhceph, com.redhat.component=rhceph-container, io.k8s.description=Red Hat Ceph Storage 7, release=1763362218, maintainer=Guillaume Abrioux , url=https://catalog.redhat.com/en/search?searchType=containers, build-date=2025-11-26T19:44:28Z, CEPH_POINT_RELEASE=, distribution-scope=public, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, io.openshift.tags=rhceph ceph, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, GIT_REPO=https://github.com/ceph/ceph-container.git, io.openshift.expose-services=, version=7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., architecture=x86_64, io.buildah.version=1.41.4, description=Red Hat Ceph Storage 7, 
GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vcs-type=git, RELEASE=main, GIT_CLEAN=True, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, ceph=True, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:enterprise_linux:9::appstream) Dec 15 04:52:59 localhost elegant_bell[300916]: 167 167 Dec 15 04:52:59 localhost systemd[1]: libpod-55801f788dcc3a1639c13718b23ddc9b99784624e211e503be07a9f6d1e6cce1.scope: Deactivated successfully. Dec 15 04:52:59 localhost podman[300902]: 2025-12-15 09:52:59.765563631 +0000 UTC m=+0.372244338 container died 55801f788dcc3a1639c13718b23ddc9b99784624e211e503be07a9f6d1e6cce1 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=elegant_bell, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=rhceph-container, release=1763362218, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vendor=Red Hat, Inc., architecture=x86_64, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, vcs-type=git, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, url=https://catalog.redhat.com/en/search?searchType=containers, distribution-scope=public, version=7, maintainer=Guillaume Abrioux , name=rhceph, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, GIT_REPO=https://github.com/ceph/ceph-container.git, build-date=2025-11-26T19:44:28Z, io.openshift.expose-services=, GIT_BRANCH=main, ceph=True, description=Red Hat Ceph Storage 7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.k8s.description=Red Hat Ceph Storage 7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.openshift.tags=rhceph ceph, CEPH_POINT_RELEASE=, RELEASE=main, GIT_CLEAN=True) Dec 15 04:52:59 localhost podman[300921]: 2025-12-15 09:52:59.87815046 +0000 UTC m=+0.097616123 container remove 
55801f788dcc3a1639c13718b23ddc9b99784624e211e503be07a9f6d1e6cce1 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=elegant_bell, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, name=rhceph, release=1763362218, io.openshift.expose-services=, GIT_CLEAN=True, com.redhat.component=rhceph-container, build-date=2025-11-26T19:44:28Z, maintainer=Guillaume Abrioux , vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, distribution-scope=public, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., version=7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, RELEASE=main, io.buildah.version=1.41.4, io.k8s.description=Red Hat Ceph Storage 7, url=https://catalog.redhat.com/en/search?searchType=containers, GIT_REPO=https://github.com/ceph/ceph-container.git, vcs-type=git, vendor=Red Hat, Inc., io.openshift.tags=rhceph ceph, CEPH_POINT_RELEASE=, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, description=Red Hat Ceph Storage 7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, ceph=True, GIT_BRANCH=main, architecture=x86_64) Dec 15 04:52:59 localhost systemd[1]: libpod-conmon-55801f788dcc3a1639c13718b23ddc9b99784624e211e503be07a9f6d1e6cce1.scope: Deactivated successfully. 
Dec 15 04:53:00 localhost ceph-mon[298913]: mon.np0005559462@4(peon).osd e86 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Dec 15 04:53:00 localhost ceph-mon[298913]: Reconfiguring daemon osd.0 on np0005559462.localdomain Dec 15 04:53:00 localhost ceph-mon[298913]: from='mgr.24104 ' entity='mgr.np0005559460.oexkup' Dec 15 04:53:00 localhost ceph-mon[298913]: from='mgr.24104 ' entity='mgr.np0005559460.oexkup' Dec 15 04:53:00 localhost ceph-mon[298913]: from='mgr.24104 ' entity='mgr.np0005559460.oexkup' Dec 15 04:53:00 localhost ceph-mon[298913]: from='mgr.24104 ' entity='mgr.np0005559460.oexkup' Dec 15 04:53:00 localhost ceph-mon[298913]: from='mgr.24104 ' entity='mgr.np0005559460.oexkup' Dec 15 04:53:00 localhost ceph-mon[298913]: from='mgr.24104 172.18.0.104:0/2825492105' entity='mgr.np0005559460.oexkup' cmd={"prefix": "auth get", "entity": "osd.3"} : dispatch Dec 15 04:53:00 localhost systemd[1]: tmp-crun.HfCaqz.mount: Deactivated successfully. Dec 15 04:53:00 localhost systemd[1]: var-lib-containers-storage-overlay-92836cb5e7587fca80867ba9cc9608bf1907dbdca629ef01fd16d2f7df4c7eec-merged.mount: Deactivated successfully. 
Dec 15 04:53:00 localhost nova_compute[286344]: 2025-12-15 09:53:00.705 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Dec 15 04:53:00 localhost nova_compute[286344]: 2025-12-15 09:53:00.708 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Dec 15 04:53:00 localhost nova_compute[286344]: 2025-12-15 09:53:00.708 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m Dec 15 04:53:00 localhost nova_compute[286344]: 2025-12-15 09:53:00.709 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Dec 15 04:53:00 localhost nova_compute[286344]: 2025-12-15 09:53:00.742 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 04:53:00 localhost nova_compute[286344]: 2025-12-15 09:53:00.743 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Dec 15 04:53:00 localhost podman[301000]: Dec 15 04:53:00 localhost podman[301000]: 2025-12-15 09:53:00.781417023 +0000 UTC m=+0.083236772 container create 6358f75c8365436482bd50d8a27bedecec72c2f9bb342ec7ade073f7387c154f (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=serene_bohr, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, io.openshift.expose-services=, com.redhat.component=rhceph-container, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, io.openshift.tags=rhceph ceph, RELEASE=main, io.buildah.version=1.41.4, GIT_REPO=https://github.com/ceph/ceph-container.git, CEPH_POINT_RELEASE=, io.k8s.description=Red Hat 
Ceph Storage 7, vendor=Red Hat, Inc., architecture=x86_64, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_BRANCH=main, maintainer=Guillaume Abrioux , io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_CLEAN=True, version=7, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, build-date=2025-11-26T19:44:28Z, name=rhceph, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1763362218, vcs-type=git, distribution-scope=public, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, ceph=True, description=Red Hat Ceph Storage 7) Dec 15 04:53:00 localhost systemd[1]: Started libpod-conmon-6358f75c8365436482bd50d8a27bedecec72c2f9bb342ec7ade073f7387c154f.scope. Dec 15 04:53:00 localhost systemd[1]: Started libcrun container. Dec 15 04:53:00 localhost podman[301000]: 2025-12-15 09:53:00.847496524 +0000 UTC m=+0.149316273 container init 6358f75c8365436482bd50d8a27bedecec72c2f9bb342ec7ade073f7387c154f (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=serene_bohr, CEPH_POINT_RELEASE=, description=Red Hat Ceph Storage 7, vendor=Red Hat, Inc., io.buildah.version=1.41.4, url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.tags=rhceph ceph, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_REPO=https://github.com/ceph/ceph-container.git, vcs-type=git, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., architecture=x86_64, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, GIT_BRANCH=main, name=rhceph, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, RELEASE=main, ceph=True, version=7, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, io.openshift.expose-services=, 
maintainer=Guillaume Abrioux , distribution-scope=public, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, com.redhat.component=rhceph-container, release=1763362218, io.k8s.description=Red Hat Ceph Storage 7, GIT_CLEAN=True, build-date=2025-11-26T19:44:28Z) Dec 15 04:53:00 localhost podman[301000]: 2025-12-15 09:53:00.748090804 +0000 UTC m=+0.049910613 image pull registry.redhat.io/rhceph/rhceph-7-rhel9:latest Dec 15 04:53:00 localhost podman[301000]: 2025-12-15 09:53:00.85736508 +0000 UTC m=+0.159184799 container start 6358f75c8365436482bd50d8a27bedecec72c2f9bb342ec7ade073f7387c154f (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=serene_bohr, io.openshift.tags=rhceph ceph, GIT_BRANCH=main, io.openshift.expose-services=, com.redhat.component=rhceph-container, ceph=True, RELEASE=main, io.buildah.version=1.41.4, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, version=7, release=1763362218, GIT_REPO=https://github.com/ceph/ceph-container.git, vcs-type=git, io.k8s.description=Red Hat Ceph Storage 7, maintainer=Guillaume Abrioux , cpe=cpe:/a:redhat:enterprise_linux:9::appstream, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, name=rhceph, url=https://catalog.redhat.com/en/search?searchType=containers, build-date=2025-11-26T19:44:28Z, CEPH_POINT_RELEASE=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_CLEAN=True, architecture=x86_64, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, description=Red Hat Ceph Storage 7, vendor=Red Hat, Inc., distribution-scope=public) Dec 15 04:53:00 localhost podman[301000]: 2025-12-15 09:53:00.857582916 +0000 UTC m=+0.159402715 container attach 6358f75c8365436482bd50d8a27bedecec72c2f9bb342ec7ade073f7387c154f (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, 
name=serene_bohr, GIT_CLEAN=True, io.openshift.tags=rhceph ceph, distribution-scope=public, architecture=x86_64, release=1763362218, vcs-type=git, url=https://catalog.redhat.com/en/search?searchType=containers, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, RELEASE=main, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, CEPH_POINT_RELEASE=, io.openshift.expose-services=, io.buildah.version=1.41.4, GIT_BRANCH=main, com.redhat.component=rhceph-container, maintainer=Guillaume Abrioux , io.k8s.description=Red Hat Ceph Storage 7, version=7, name=rhceph, description=Red Hat Ceph Storage 7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., ceph=True, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, build-date=2025-11-26T19:44:28Z, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, GIT_REPO=https://github.com/ceph/ceph-container.git) Dec 15 04:53:00 localhost serene_bohr[301015]: 167 167 Dec 15 04:53:00 localhost systemd[1]: libpod-6358f75c8365436482bd50d8a27bedecec72c2f9bb342ec7ade073f7387c154f.scope: Deactivated successfully. 
Dec 15 04:53:00 localhost podman[301000]: 2025-12-15 09:53:00.860645381 +0000 UTC m=+0.162465170 container died 6358f75c8365436482bd50d8a27bedecec72c2f9bb342ec7ade073f7387c154f (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=serene_bohr, ceph=True, architecture=x86_64, io.buildah.version=1.41.4, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, GIT_REPO=https://github.com/ceph/ceph-container.git, CEPH_POINT_RELEASE=, maintainer=Guillaume Abrioux , description=Red Hat Ceph Storage 7, url=https://catalog.redhat.com/en/search?searchType=containers, build-date=2025-11-26T19:44:28Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=rhceph-container, release=1763362218, GIT_BRANCH=main, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.openshift.expose-services=, distribution-scope=public, version=7, vcs-type=git, vendor=Red Hat, Inc., GIT_CLEAN=True, io.openshift.tags=rhceph ceph, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, name=rhceph, io.k8s.description=Red Hat Ceph Storage 7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, RELEASE=main, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0) Dec 15 04:53:00 localhost podman[301020]: 2025-12-15 09:53:00.939916371 +0000 UTC m=+0.073285834 container remove 6358f75c8365436482bd50d8a27bedecec72c2f9bb342ec7ade073f7387c154f (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=serene_bohr, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, ceph=True, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, url=https://catalog.redhat.com/en/search?searchType=containers, description=Red Hat Ceph Storage 7, vendor=Red Hat, Inc., GIT_REPO=https://github.com/ceph/ceph-container.git, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, vcs-type=git, GIT_CLEAN=True, 
build-date=2025-11-26T19:44:28Z, distribution-scope=public, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, architecture=x86_64, RELEASE=main, name=rhceph, io.openshift.expose-services=, io.k8s.description=Red Hat Ceph Storage 7, io.openshift.tags=rhceph ceph, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, version=7, maintainer=Guillaume Abrioux , GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, com.redhat.component=rhceph-container, GIT_BRANCH=main, CEPH_POINT_RELEASE=, release=1763362218) Dec 15 04:53:00 localhost systemd[1]: libpod-conmon-6358f75c8365436482bd50d8a27bedecec72c2f9bb342ec7ade073f7387c154f.scope: Deactivated successfully. Dec 15 04:53:01 localhost nova_compute[286344]: 2025-12-15 09:53:01.271 286348 DEBUG oslo_service.periodic_task [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Dec 15 04:53:01 localhost ceph-mon[298913]: Reconfiguring daemon osd.3 on np0005559462.localdomain Dec 15 04:53:01 localhost ceph-mon[298913]: from='mgr.24104 ' entity='mgr.np0005559460.oexkup' Dec 15 04:53:01 localhost ceph-mon[298913]: from='mgr.24104 ' entity='mgr.np0005559460.oexkup' Dec 15 04:53:01 localhost ceph-mon[298913]: from='mgr.24104 ' entity='mgr.np0005559460.oexkup' Dec 15 04:53:01 localhost ceph-mon[298913]: from='mgr.24104 ' entity='mgr.np0005559460.oexkup' Dec 15 04:53:01 localhost ceph-mon[298913]: from='mgr.24104 172.18.0.104:0/2825492105' entity='mgr.np0005559460.oexkup' cmd={"prefix": "auth get", "entity": "mon."} : dispatch Dec 15 04:53:01 localhost systemd[1]: var-lib-containers-storage-overlay-4edb4099b9345ac1c92cf7e9f86718ddbfa62cc3e3cf114d65000a5fdffeca26-merged.mount: Deactivated successfully. 
Dec 15 04:53:01 localhost podman[301098]: Dec 15 04:53:01 localhost podman[301098]: 2025-12-15 09:53:01.782152092 +0000 UTC m=+0.076426411 container create ca008247fb997db68892c3b2043d9747a0f1840a952e8334a261bec880605a6d (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=frosty_lalande, GIT_REPO=https://github.com/ceph/ceph-container.git, io.k8s.description=Red Hat Ceph Storage 7, ceph=True, io.openshift.tags=rhceph ceph, name=rhceph, GIT_CLEAN=True, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vcs-type=git, distribution-scope=public, vendor=Red Hat, Inc., build-date=2025-11-26T19:44:28Z, io.openshift.expose-services=, RELEASE=main, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, release=1763362218, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, description=Red Hat Ceph Storage 7, maintainer=Guillaume Abrioux , GIT_BRANCH=main, CEPH_POINT_RELEASE=, com.redhat.component=rhceph-container, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, version=7, io.buildah.version=1.41.4, cpe=cpe:/a:redhat:enterprise_linux:9::appstream) Dec 15 04:53:01 localhost systemd[1]: Started libpod-conmon-ca008247fb997db68892c3b2043d9747a0f1840a952e8334a261bec880605a6d.scope. Dec 15 04:53:01 localhost systemd[1]: Started /usr/bin/podman healthcheck run 4d46c406a3855fd1c322e5d86c048188ea5865daa07ba23aaebacc7879668923. Dec 15 04:53:01 localhost systemd[1]: Started libcrun container. 
Dec 15 04:53:01 localhost podman[301098]: 2025-12-15 09:53:01.749734158 +0000 UTC m=+0.044008527 image pull registry.redhat.io/rhceph/rhceph-7-rhel9:latest Dec 15 04:53:01 localhost podman[301098]: 2025-12-15 09:53:01.861686759 +0000 UTC m=+0.155961078 container init ca008247fb997db68892c3b2043d9747a0f1840a952e8334a261bec880605a6d (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=frosty_lalande, distribution-scope=public, vendor=Red Hat, Inc., vcs-type=git, maintainer=Guillaume Abrioux , architecture=x86_64, GIT_REPO=https://github.com/ceph/ceph-container.git, description=Red Hat Ceph Storage 7, io.openshift.tags=rhceph ceph, url=https://catalog.redhat.com/en/search?searchType=containers, ceph=True, RELEASE=main, build-date=2025-11-26T19:44:28Z, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, GIT_CLEAN=True, version=7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.openshift.expose-services=, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, release=1763362218, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat Ceph Storage 7, GIT_BRANCH=main, CEPH_POINT_RELEASE=, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, com.redhat.component=rhceph-container, name=rhceph, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0) Dec 15 04:53:01 localhost podman[301098]: 2025-12-15 09:53:01.873242952 +0000 UTC m=+0.167517241 container start ca008247fb997db68892c3b2043d9747a0f1840a952e8334a261bec880605a6d (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=frosty_lalande, maintainer=Guillaume Abrioux , RELEASE=main, vendor=Red Hat, Inc., name=rhceph, vcs-type=git, GIT_REPO=https://github.com/ceph/ceph-container.git, io.openshift.tags=rhceph ceph, GIT_CLEAN=True, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, 
io.k8s.description=Red Hat Ceph Storage 7, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, ceph=True, build-date=2025-11-26T19:44:28Z, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, com.redhat.component=rhceph-container, version=7, distribution-scope=public, CEPH_POINT_RELEASE=, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, release=1763362218, architecture=x86_64, description=Red Hat Ceph Storage 7, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, GIT_BRANCH=main, io.buildah.version=1.41.4, io.openshift.expose-services=, url=https://catalog.redhat.com/en/search?searchType=containers) Dec 15 04:53:01 localhost podman[301098]: 2025-12-15 09:53:01.873473768 +0000 UTC m=+0.167748137 container attach ca008247fb997db68892c3b2043d9747a0f1840a952e8334a261bec880605a6d (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=frosty_lalande, io.openshift.tags=rhceph ceph, GIT_CLEAN=True, version=7, architecture=x86_64, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, name=rhceph, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, ceph=True, description=Red Hat Ceph Storage 7, com.redhat.component=rhceph-container, GIT_BRANCH=main, build-date=2025-11-26T19:44:28Z, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, release=1763362218, maintainer=Guillaume Abrioux , GIT_REPO=https://github.com/ceph/ceph-container.git, io.buildah.version=1.41.4, io.openshift.expose-services=, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, io.k8s.description=Red Hat 
Ceph Storage 7, distribution-scope=public, vcs-type=git, url=https://catalog.redhat.com/en/search?searchType=containers, CEPH_POINT_RELEASE=, RELEASE=main, vendor=Red Hat, Inc.) Dec 15 04:53:01 localhost frosty_lalande[301113]: 167 167 Dec 15 04:53:01 localhost systemd[1]: libpod-ca008247fb997db68892c3b2043d9747a0f1840a952e8334a261bec880605a6d.scope: Deactivated successfully. Dec 15 04:53:01 localhost podman[301098]: 2025-12-15 09:53:01.877363246 +0000 UTC m=+0.171637575 container died ca008247fb997db68892c3b2043d9747a0f1840a952e8334a261bec880605a6d (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=frosty_lalande, io.openshift.tags=rhceph ceph, name=rhceph, maintainer=Guillaume Abrioux , ceph=True, vendor=Red Hat, Inc., vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, GIT_CLEAN=True, version=7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, com.redhat.component=rhceph-container, release=1763362218, description=Red Hat Ceph Storage 7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_BRANCH=main, build-date=2025-11-26T19:44:28Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, CEPH_POINT_RELEASE=, architecture=x86_64, io.k8s.description=Red Hat Ceph Storage 7, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, GIT_REPO=https://github.com/ceph/ceph-container.git, io.buildah.version=1.41.4, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, RELEASE=main, distribution-scope=public, io.openshift.expose-services=) Dec 15 04:53:01 localhost podman[243449]: time="2025-12-15T09:53:01Z" level=info msg="List containers: received `last` parameter - overwriting `limit`" Dec 15 04:53:01 localhost podman[301114]: 2025-12-15 09:53:01.932053352 +0000 UTC m=+0.091668407 container health_status 
4d46c406a3855fd1c322e5d86c048188ea5865daa07ba23aaebacc7879668923 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, managed_by=edpm_ansible, tcib_managed=true, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '07f3af15d79276418f54a8dde2fc251064527c3571430e14af77aaa2c01f1193-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image) Dec 15 04:53:01 localhost podman[243449]: @ - - [15/Dec/2025:09:53:01 +0000] "GET 
/v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 156594 "" "Go-http-client/1.1" Dec 15 04:53:02 localhost podman[301114]: 2025-12-15 09:53:02.012817342 +0000 UTC m=+0.172432387 container exec_died 4d46c406a3855fd1c322e5d86c048188ea5865daa07ba23aaebacc7879668923 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.license=GPLv2, managed_by=edpm_ansible, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '07f3af15d79276418f54a8dde2fc251064527c3571430e14af77aaa2c01f1193-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, 
org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0) Dec 15 04:53:02 localhost systemd[1]: 4d46c406a3855fd1c322e5d86c048188ea5865daa07ba23aaebacc7879668923.service: Deactivated successfully. Dec 15 04:53:02 localhost podman[301127]: 2025-12-15 09:53:02.080473268 +0000 UTC m=+0.193942327 container remove ca008247fb997db68892c3b2043d9747a0f1840a952e8334a261bec880605a6d (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=frosty_lalande, vendor=Red Hat, Inc., com.redhat.component=rhceph-container, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1763362218, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., distribution-scope=public, GIT_CLEAN=True, io.buildah.version=1.41.4, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, io.k8s.description=Red Hat Ceph Storage 7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, ceph=True, name=rhceph, version=7, vcs-type=git, RELEASE=main, CEPH_POINT_RELEASE=, GIT_BRANCH=main, description=Red Hat Ceph Storage 7, build-date=2025-11-26T19:44:28Z, maintainer=Guillaume Abrioux , GIT_REPO=https://github.com/ceph/ceph-container.git, io.openshift.tags=rhceph ceph, io.openshift.expose-services=, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, architecture=x86_64, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, url=https://catalog.redhat.com/en/search?searchType=containers) Dec 15 04:53:02 localhost systemd[1]: libpod-conmon-ca008247fb997db68892c3b2043d9747a0f1840a952e8334a261bec880605a6d.scope: Deactivated successfully. 
Dec 15 04:53:02 localhost podman[243449]: @ - - [15/Dec/2025:09:53:01 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 18707 "" "Go-http-client/1.1" Dec 15 04:53:02 localhost ceph-mon[298913]: mon.np0005559462@4(peon).osd e87 e87: 6 total, 6 up, 6 in Dec 15 04:53:02 localhost nova_compute[286344]: 2025-12-15 09:53:02.266 286348 DEBUG oslo_service.periodic_task [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Dec 15 04:53:02 localhost nova_compute[286344]: 2025-12-15 09:53:02.270 286348 DEBUG oslo_service.periodic_task [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Dec 15 04:53:02 localhost nova_compute[286344]: 2025-12-15 09:53:02.299 286348 DEBUG oslo_concurrency.lockutils [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Dec 15 04:53:02 localhost nova_compute[286344]: 2025-12-15 09:53:02.300 286348 DEBUG oslo_concurrency.lockutils [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Dec 15 04:53:02 localhost nova_compute[286344]: 2025-12-15 09:53:02.300 286348 DEBUG oslo_concurrency.lockutils [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s 
inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Dec 15 04:53:02 localhost nova_compute[286344]: 2025-12-15 09:53:02.300 286348 DEBUG nova.compute.resource_tracker [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Auditing locally available compute resources for np0005559462.localdomain (node: np0005559462.localdomain) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m Dec 15 04:53:02 localhost nova_compute[286344]: 2025-12-15 09:53:02.301 286348 DEBUG oslo_concurrency.processutils [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m Dec 15 04:53:02 localhost systemd[1]: session-67.scope: Deactivated successfully. Dec 15 04:53:02 localhost systemd[1]: session-67.scope: Consumed 8.565s CPU time. Dec 15 04:53:02 localhost systemd-logind[763]: Session 67 logged out. Waiting for processes to exit. Dec 15 04:53:02 localhost systemd-logind[763]: Removed session 67. Dec 15 04:53:02 localhost ceph-mon[298913]: Reconfiguring mon.np0005559462 (monmap changed)... Dec 15 04:53:02 localhost ceph-mon[298913]: Reconfiguring daemon mon.np0005559462 on np0005559462.localdomain Dec 15 04:53:02 localhost ceph-mon[298913]: from='mgr.24104 ' entity='mgr.np0005559460.oexkup' Dec 15 04:53:02 localhost ceph-mon[298913]: from='mgr.24104 ' entity='mgr.np0005559460.oexkup' Dec 15 04:53:02 localhost ceph-mon[298913]: from='client.? ' entity='client.admin' cmd={"prefix": "mgr fail"} : dispatch Dec 15 04:53:02 localhost ceph-mon[298913]: from='client.? 
172.18.0.200:0/2648200823' entity='client.admin' cmd={"prefix": "mgr fail"} : dispatch Dec 15 04:53:02 localhost ceph-mon[298913]: Activating manager daemon np0005559464.aomnqe Dec 15 04:53:02 localhost ceph-mon[298913]: from='mgr.24104 172.18.0.104:0/2825492105' entity='mgr.np0005559460.oexkup' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005559463.localdomain", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch Dec 15 04:53:02 localhost ceph-mon[298913]: from='mgr.24104 ' entity='mgr.np0005559460.oexkup' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005559463.localdomain", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch Dec 15 04:53:02 localhost ceph-mon[298913]: from='client.? ' entity='client.admin' cmd='[{"prefix": "mgr fail"}]': finished Dec 15 04:53:02 localhost ceph-mon[298913]: Manager daemon np0005559464.aomnqe is now available Dec 15 04:53:02 localhost sshd[301170]: main: sshd: ssh-rsa algorithm is disabled Dec 15 04:53:02 localhost systemd[1]: var-lib-containers-storage-overlay-ff55957868cdd6c28d4556d573b14aa33ee71317402666c16572f850638b5e07-merged.mount: Deactivated successfully. Dec 15 04:53:02 localhost systemd-logind[763]: New session 69 of user ceph-admin. Dec 15 04:53:02 localhost systemd[1]: Started Session 69 of User ceph-admin. 
Dec 15 04:53:02 localhost nova_compute[286344]: 2025-12-15 09:53:02.743 286348 DEBUG oslo_concurrency.processutils [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.442s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m Dec 15 04:53:02 localhost nova_compute[286344]: 2025-12-15 09:53:02.811 286348 DEBUG nova.virt.libvirt.driver [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m Dec 15 04:53:02 localhost nova_compute[286344]: 2025-12-15 09:53:02.811 286348 DEBUG nova.virt.libvirt.driver [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m Dec 15 04:53:03 localhost nova_compute[286344]: 2025-12-15 09:53:03.037 286348 WARNING nova.virt.libvirt.driver [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] This host appears to have multiple sockets per NUMA node. 
The `socket` PCI NUMA affinity will not be supported.#033[00m Dec 15 04:53:03 localhost nova_compute[286344]: 2025-12-15 09:53:03.038 286348 DEBUG nova.compute.resource_tracker [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Hypervisor/Node resource view: name=np0005559462.localdomain free_ram=11780MB free_disk=41.836978912353516GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", 
"product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m Dec 15 04:53:03 localhost nova_compute[286344]: 2025-12-15 09:53:03.039 286348 DEBUG oslo_concurrency.lockutils [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Dec 15 04:53:03 localhost nova_compute[286344]: 2025-12-15 09:53:03.039 286348 DEBUG oslo_concurrency.lockutils [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Dec 15 04:53:03 localhost nova_compute[286344]: 2025-12-15 09:53:03.123 286348 DEBUG nova.compute.resource_tracker [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Instance 39ff1bd9-6f6b-44c8-bbec-a1fd9d196359 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 2, 'MEMORY_MB': 512, 'VCPU': 1}}. 
_remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m Dec 15 04:53:03 localhost nova_compute[286344]: 2025-12-15 09:53:03.123 286348 DEBUG nova.compute.resource_tracker [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m Dec 15 04:53:03 localhost nova_compute[286344]: 2025-12-15 09:53:03.124 286348 DEBUG nova.compute.resource_tracker [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Final resource view: name=np0005559462.localdomain phys_ram=15738MB used_ram=1024MB phys_disk=41GB used_disk=2GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m Dec 15 04:53:03 localhost nova_compute[286344]: 2025-12-15 09:53:03.163 286348 DEBUG oslo_concurrency.processutils [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m Dec 15 04:53:03 localhost ceph-mon[298913]: from='mgr.26593 172.18.0.108:0/1692918800' entity='mgr.np0005559464.aomnqe' cmd={"prefix":"config rm","who":"mgr","name":"mgr/rbd_support/np0005559464.aomnqe/mirror_snapshot_schedule"} : dispatch Dec 15 04:53:03 localhost ceph-mon[298913]: from='mgr.26593 ' entity='mgr.np0005559464.aomnqe' cmd={"prefix":"config rm","who":"mgr","name":"mgr/rbd_support/np0005559464.aomnqe/mirror_snapshot_schedule"} : dispatch Dec 15 04:53:03 localhost ceph-mon[298913]: from='mgr.26593 172.18.0.108:0/1692918800' entity='mgr.np0005559464.aomnqe' cmd={"prefix":"config rm","who":"mgr","name":"mgr/rbd_support/np0005559464.aomnqe/trash_purge_schedule"} : dispatch Dec 15 04:53:03 localhost ceph-mon[298913]: from='mgr.26593 ' 
entity='mgr.np0005559464.aomnqe' cmd={"prefix":"config rm","who":"mgr","name":"mgr/rbd_support/np0005559464.aomnqe/trash_purge_schedule"} : dispatch Dec 15 04:53:03 localhost ceph-mon[298913]: mon.np0005559462@4(peon) e9 handle_command mon_command({"prefix": "df", "format": "json"} v 0) Dec 15 04:53:03 localhost ceph-mon[298913]: log_channel(audit) log [DBG] : from='client.? 172.18.0.106:0/3645845023' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch Dec 15 04:53:03 localhost nova_compute[286344]: 2025-12-15 09:53:03.643 286348 DEBUG oslo_concurrency.processutils [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.480s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m Dec 15 04:53:03 localhost nova_compute[286344]: 2025-12-15 09:53:03.650 286348 DEBUG nova.compute.provider_tree [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Inventory has not changed in ProviderTree for provider: 26c8956b-6742-4951-b566-971b9bbe323b update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m Dec 15 04:53:03 localhost nova_compute[286344]: 2025-12-15 09:53:03.664 286348 DEBUG nova.scheduler.client.report [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Inventory has not changed for provider 26c8956b-6742-4951-b566-971b9bbe323b based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m Dec 15 04:53:03 localhost nova_compute[286344]: 2025-12-15 09:53:03.667 286348 
DEBUG nova.compute.resource_tracker [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Compute_service record updated for np0005559462.localdomain:np0005559462.localdomain _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m Dec 15 04:53:03 localhost nova_compute[286344]: 2025-12-15 09:53:03.667 286348 DEBUG oslo_concurrency.lockutils [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.628s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Dec 15 04:53:03 localhost podman[301306]: 2025-12-15 09:53:03.808252548 +0000 UTC m=+0.099559896 container exec 8dcda56b365b42dc8758aab77a9ec80db304780e449052738f7e4e648ae1ecaf (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-bce17446-41b5-5408-a23e-0b011906b44a-crash-np0005559462, version=7, description=Red Hat Ceph Storage 7, vcs-type=git, io.buildah.version=1.41.4, io.openshift.tags=rhceph ceph, name=rhceph, GIT_CLEAN=True, url=https://catalog.redhat.com/en/search?searchType=containers, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat Ceph Storage 7, RELEASE=main, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.openshift.expose-services=, GIT_BRANCH=main, GIT_REPO=https://github.com/ceph/ceph-container.git, architecture=x86_64, ceph=True, build-date=2025-11-26T19:44:28Z, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, release=1763362218, maintainer=Guillaume Abrioux , com.redhat.component=rhceph-container, distribution-scope=public, CEPH_POINT_RELEASE=, summary=Provides the latest Red Hat Ceph Storage 7 on 
RHEL 9 in a fully featured and supported base image.) Dec 15 04:53:03 localhost podman[301306]: 2025-12-15 09:53:03.907766022 +0000 UTC m=+0.199073350 container exec_died 8dcda56b365b42dc8758aab77a9ec80db304780e449052738f7e4e648ae1ecaf (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-bce17446-41b5-5408-a23e-0b011906b44a-crash-np0005559462, name=rhceph, description=Red Hat Ceph Storage 7, vendor=Red Hat, Inc., org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, build-date=2025-11-26T19:44:28Z, CEPH_POINT_RELEASE=, io.k8s.description=Red Hat Ceph Storage 7, release=1763362218, vcs-type=git, url=https://catalog.redhat.com/en/search?searchType=containers, GIT_REPO=https://github.com/ceph/ceph-container.git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=7, maintainer=Guillaume Abrioux , GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.openshift.tags=rhceph ceph, com.redhat.component=rhceph-container, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, RELEASE=main, architecture=x86_64, ceph=True, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, io.buildah.version=1.41.4, GIT_CLEAN=True, distribution-scope=public, io.openshift.expose-services=, GIT_BRANCH=main) Dec 15 04:53:04 localhost ceph-mon[298913]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #16. Immutable memtables: 0. 
Dec 15 04:53:04 localhost ceph-mon[298913]: rocksdb: (Original Log Time 2025/12/15-09:53:04.173878) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0 Dec 15 04:53:04 localhost ceph-mon[298913]: rocksdb: [db/flush_job.cc:856] [default] [JOB 5] Flushing memtable with next log file: 16 Dec 15 04:53:04 localhost ceph-mon[298913]: rocksdb: EVENT_LOG_v1 {"time_micros": 1765792384173930, "job": 5, "event": "flush_started", "num_memtables": 1, "num_entries": 1577, "num_deletes": 264, "total_data_size": 8598776, "memory_usage": 9211984, "flush_reason": "Manual Compaction"} Dec 15 04:53:04 localhost ceph-mon[298913]: rocksdb: [db/flush_job.cc:885] [default] [JOB 5] Level-0 flush table #17: started Dec 15 04:53:04 localhost ceph-mon[298913]: rocksdb: EVENT_LOG_v1 {"time_micros": 1765792384210593, "cf_name": "default", "job": 5, "event": "table_file_creation", "file_number": 17, "file_size": 5372848, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 10652, "largest_seqno": 12224, "table_properties": {"data_size": 5365836, "index_size": 3837, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 2117, "raw_key_size": 18041, "raw_average_key_size": 21, "raw_value_size": 5350568, "raw_average_value_size": 6501, "num_data_blocks": 161, "num_entries": 823, "num_filter_entries": 823, "num_deletions": 262, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; 
max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1765792329, "oldest_key_time": 1765792329, "file_creation_time": 1765792384, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "603b24af-e2be-4214-bc56-9e652eb4af3d", "db_session_id": "0OJRM9SCUA16EXV0VQZ2", "orig_file_number": 17, "seqno_to_time_mapping": "N/A"}} Dec 15 04:53:04 localhost ceph-mon[298913]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 5] Flush lasted 36809 microseconds, and 9979 cpu microseconds. Dec 15 04:53:04 localhost ceph-mon[298913]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed. Dec 15 04:53:04 localhost ceph-mon[298913]: rocksdb: (Original Log Time 2025/12/15-09:53:04.210675) [db/flush_job.cc:967] [default] [JOB 5] Level-0 flush table #17: 5372848 bytes OK Dec 15 04:53:04 localhost ceph-mon[298913]: rocksdb: (Original Log Time 2025/12/15-09:53:04.210710) [db/memtable_list.cc:519] [default] Level-0 commit table #17 started Dec 15 04:53:04 localhost ceph-mon[298913]: rocksdb: (Original Log Time 2025/12/15-09:53:04.213100) [db/memtable_list.cc:722] [default] Level-0 commit table #17: memtable #1 done Dec 15 04:53:04 localhost ceph-mon[298913]: rocksdb: (Original Log Time 2025/12/15-09:53:04.213124) EVENT_LOG_v1 {"time_micros": 1765792384213118, "job": 5, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0} Dec 15 04:53:04 localhost ceph-mon[298913]: rocksdb: (Original Log Time 2025/12/15-09:53:04.213147) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25 Dec 15 04:53:04 localhost ceph-mon[298913]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 5] Try to delete WAL files size 8590725, prev total WAL file size 
8591474, number of live WAL files 2. Dec 15 04:53:04 localhost ceph-mon[298913]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005559462/store.db/000013.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000 Dec 15 04:53:04 localhost ceph-mon[298913]: rocksdb: (Original Log Time 2025/12/15-09:53:04.215063) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '6C6F676D0033353134' seq:72057594037927935, type:22 .. '6C6F676D0033373639' seq:0, type:0; will stop at (end) Dec 15 04:53:04 localhost ceph-mon[298913]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 6] Compacting 1@0 + 1@6 files to L6, score -1.00 Dec 15 04:53:04 localhost ceph-mon[298913]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 5 Base level 0, inputs: [17(5246KB)], [15(12MB)] Dec 15 04:53:04 localhost ceph-mon[298913]: rocksdb: EVENT_LOG_v1 {"time_micros": 1765792384215138, "job": 6, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [17], "files_L6": [15], "score": -1, "input_data_size": 18126527, "oldest_snapshot_seqno": -1} Dec 15 04:53:04 localhost ceph-mon[298913]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 6] Generated table #18: 10281 keys, 17877183 bytes, temperature: kUnknown Dec 15 04:53:04 localhost ceph-mon[298913]: rocksdb: EVENT_LOG_v1 {"time_micros": 1765792384341288, "cf_name": "default", "job": 6, "event": "table_file_creation", "file_number": 18, "file_size": 17877183, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 17815650, "index_size": 34744, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 25733, "raw_key_size": 274727, "raw_average_key_size": 26, "raw_value_size": 17636832, 
"raw_average_value_size": 1715, "num_data_blocks": 1341, "num_entries": 10281, "num_filter_entries": 10281, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1765792320, "oldest_key_time": 0, "file_creation_time": 1765792384, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "603b24af-e2be-4214-bc56-9e652eb4af3d", "db_session_id": "0OJRM9SCUA16EXV0VQZ2", "orig_file_number": 18, "seqno_to_time_mapping": "N/A"}} Dec 15 04:53:04 localhost ceph-mon[298913]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed. 
Dec 15 04:53:04 localhost ceph-mon[298913]: rocksdb: (Original Log Time 2025/12/15-09:53:04.341916) [db/compaction/compaction_job.cc:1663] [default] [JOB 6] Compacted 1@0 + 1@6 files to L6 => 17877183 bytes
Dec 15 04:53:04 localhost ceph-mon[298913]: rocksdb: (Original Log Time 2025/12/15-09:53:04.343758) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 143.2 rd, 141.3 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(5.1, 12.2 +0.0 blob) out(17.0 +0.0 blob), read-write-amplify(6.7) write-amplify(3.3) OK, records in: 10844, records dropped: 563 output_compression: NoCompression
Dec 15 04:53:04 localhost ceph-mon[298913]: rocksdb: (Original Log Time 2025/12/15-09:53:04.343789) EVENT_LOG_v1 {"time_micros": 1765792384343776, "job": 6, "event": "compaction_finished", "compaction_time_micros": 126562, "compaction_time_cpu_micros": 45594, "output_level": 6, "num_output_files": 1, "total_output_size": 17877183, "num_input_records": 10844, "num_output_records": 10281, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Dec 15 04:53:04 localhost ceph-mon[298913]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005559462/store.db/000017.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Dec 15 04:53:04 localhost ceph-mon[298913]: rocksdb: EVENT_LOG_v1 {"time_micros": 1765792384344916, "job": 6, "event": "table_file_deletion", "file_number": 17}
Dec 15 04:53:04 localhost ceph-mon[298913]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005559462/store.db/000015.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Dec 15 04:53:04 localhost ceph-mon[298913]: rocksdb: EVENT_LOG_v1 {"time_micros": 1765792384347107, "job": 6, "event": "table_file_deletion", "file_number": 15}
Dec 15 04:53:04 localhost ceph-mon[298913]: rocksdb: (Original Log Time 2025/12/15-09:53:04.214928) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec 15 04:53:04 localhost ceph-mon[298913]: rocksdb: (Original Log Time 2025/12/15-09:53:04.347331) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec 15 04:53:04 localhost ceph-mon[298913]: rocksdb: (Original Log Time 2025/12/15-09:53:04.347353) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec 15 04:53:04 localhost ceph-mon[298913]: rocksdb: (Original Log Time 2025/12/15-09:53:04.347356) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec 15 04:53:04 localhost ceph-mon[298913]: rocksdb: (Original Log Time 2025/12/15-09:53:04.347359) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec 15 04:53:04 localhost ceph-mon[298913]: rocksdb: (Original Log Time 2025/12/15-09:53:04.347362) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec 15 04:53:04 localhost ceph-mon[298913]: [15/Dec/2025:09:53:03] ENGINE Bus STARTING
Dec 15 04:53:04 localhost ceph-mon[298913]: [15/Dec/2025:09:53:03] ENGINE Serving on https://172.18.0.108:7150
Dec 15 04:53:04 localhost ceph-mon[298913]: [15/Dec/2025:09:53:03] ENGINE Client ('172.18.0.108', 56240) lost — peer dropped the TLS connection suddenly, during handshake: (6, 'TLS/SSL connection has been closed (EOF) (_ssl.c:1147)')
Dec 15 04:53:04 localhost ceph-mon[298913]: [15/Dec/2025:09:53:03] ENGINE Serving on http://172.18.0.108:8765
Dec 15 04:53:04 localhost ceph-mon[298913]: [15/Dec/2025:09:53:03] ENGINE Bus STARTED
Dec 15 04:53:04 localhost ceph-mon[298913]: from='mgr.26593 ' entity='mgr.np0005559464.aomnqe'
Dec 15 04:53:04 localhost ceph-mon[298913]: from='mgr.26593 ' entity='mgr.np0005559464.aomnqe'
Dec 15 04:53:04 localhost
ceph-mon[298913]: from='mgr.26593 ' entity='mgr.np0005559464.aomnqe'
Dec 15 04:53:04 localhost ceph-mon[298913]: from='mgr.26593 ' entity='mgr.np0005559464.aomnqe'
Dec 15 04:53:04 localhost nova_compute[286344]: 2025-12-15 09:53:04.668 286348 DEBUG oslo_service.periodic_task [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec 15 04:53:04 localhost nova_compute[286344]: 2025-12-15 09:53:04.668 286348 DEBUG nova.compute.manager [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Dec 15 04:53:04 localhost nova_compute[286344]: 2025-12-15 09:53:04.668 286348 DEBUG nova.compute.manager [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Dec 15 04:53:04 localhost openstack_network_exporter[246484]: ERROR 09:53:04 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Dec 15 04:53:04 localhost openstack_network_exporter[246484]: ERROR 09:53:04 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 15 04:53:04 localhost openstack_network_exporter[246484]: ERROR 09:53:04 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 15 04:53:04 localhost openstack_network_exporter[246484]: ERROR 09:53:04 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Dec 15 04:53:04 localhost openstack_network_exporter[246484]:
Dec 15 04:53:04 localhost openstack_network_exporter[246484]: ERROR 09:53:04 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Dec 15 04:53:04 localhost openstack_network_exporter[246484]:
Dec 15 04:53:05 localhost nova_compute[286344]: 2025-12-15 09:53:05.037 286348 DEBUG oslo_concurrency.lockutils [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Acquiring lock "refresh_cache-39ff1bd9-6f6b-44c8-bbec-a1fd9d196359" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Dec 15 04:53:05 localhost nova_compute[286344]: 2025-12-15 09:53:05.037 286348 DEBUG oslo_concurrency.lockutils [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Acquired lock "refresh_cache-39ff1bd9-6f6b-44c8-bbec-a1fd9d196359" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Dec 15 04:53:05 localhost nova_compute[286344]: 2025-12-15 09:53:05.038 286348 DEBUG nova.network.neutron [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] [instance: 39ff1bd9-6f6b-44c8-bbec-a1fd9d196359] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m
Dec 15 04:53:05 localhost nova_compute[286344]: 2025-12-15 09:53:05.038 286348 DEBUG nova.objects.instance [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Lazy-loading 'info_cache' on Instance uuid 39ff1bd9-6f6b-44c8-bbec-a1fd9d196359 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Dec 15 04:53:05 localhost ceph-mon[298913]: mon.np0005559462@4(peon).osd e87 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 15 04:53:05 localhost ceph-mon[298913]: from='mgr.26593 ' entity='mgr.np0005559464.aomnqe'
Dec 15 04:53:05 localhost ceph-mon[298913]: from='mgr.26593 ' entity='mgr.np0005559464.aomnqe'
Dec 15 04:53:05 localhost ceph-mon[298913]: from='mgr.26593 ' entity='mgr.np0005559464.aomnqe'
Dec 15 04:53:05 localhost ceph-mon[298913]: from='mgr.26593 ' entity='mgr.np0005559464.aomnqe'
Dec 15 04:53:05 localhost
ceph-mon[298913]: from='mgr.26593 ' entity='mgr.np0005559464.aomnqe'
Dec 15 04:53:05 localhost ceph-mon[298913]: from='mgr.26593 ' entity='mgr.np0005559464.aomnqe'
Dec 15 04:53:05 localhost nova_compute[286344]: 2025-12-15 09:53:05.488 286348 DEBUG nova.network.neutron [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] [instance: 39ff1bd9-6f6b-44c8-bbec-a1fd9d196359] Updating instance_info_cache with network_info: [{"id": "03ef8889-3216-43fb-8a52-4be17a956ce1", "address": "fa:16:3e:74:df:7c", "network": {"id": "befb7a72-17a9-4bcb-b561-84b8f626685a", "bridge": "br-int", "label": "private", "subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.201", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.20", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.0.3"}}], "meta": {"injected": false, "tenant_id": "c785bf23f53946bc99867d8832a50266", "mtu": 1292, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap03ef8889-32", "ovs_interfaceid": "03ef8889-3216-43fb-8a52-4be17a956ce1", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Dec 15 04:53:05 localhost nova_compute[286344]: 2025-12-15 09:53:05.505 286348 DEBUG oslo_concurrency.lockutils [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Releasing lock "refresh_cache-39ff1bd9-6f6b-44c8-bbec-a1fd9d196359" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Dec 15 04:53:05 localhost nova_compute[286344]: 2025-12-15 09:53:05.506 286348 DEBUG nova.compute.manager [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] [instance: 39ff1bd9-6f6b-44c8-bbec-a1fd9d196359] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m
Dec 15 04:53:05 localhost nova_compute[286344]: 2025-12-15 09:53:05.506 286348 DEBUG oslo_service.periodic_task [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec 15 04:53:05 localhost nova_compute[286344]: 2025-12-15 09:53:05.507 286348 DEBUG oslo_service.periodic_task [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec 15 04:53:05 localhost nova_compute[286344]: 2025-12-15 09:53:05.507 286348 DEBUG oslo_service.periodic_task [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec 15 04:53:05 localhost nova_compute[286344]: 2025-12-15 09:53:05.742 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 15 04:53:05 localhost nova_compute[286344]: 2025-12-15 09:53:05.744 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 15 04:53:06 localhost nova_compute[286344]: 2025-12-15 09:53:06.270 286348 DEBUG oslo_service.periodic_task [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks
/usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec 15 04:53:06 localhost nova_compute[286344]: 2025-12-15 09:53:06.288 286348 DEBUG oslo_service.periodic_task [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec 15 04:53:06 localhost nova_compute[286344]: 2025-12-15 09:53:06.289 286348 DEBUG oslo_service.periodic_task [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec 15 04:53:06 localhost nova_compute[286344]: 2025-12-15 09:53:06.290 286348 DEBUG nova.compute.manager [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Dec 15 04:53:06 localhost ceph-mon[298913]: from='mgr.26593 ' entity='mgr.np0005559464.aomnqe'
Dec 15 04:53:06 localhost ceph-mon[298913]: from='mgr.26593 ' entity='mgr.np0005559464.aomnqe'
Dec 15 04:53:06 localhost ceph-mon[298913]: from='mgr.26593 ' entity='mgr.np0005559464.aomnqe' cmd={"prefix": "config rm", "who": "osd/host:np0005559460", "name": "osd_memory_target"} : dispatch
Dec 15 04:53:06 localhost ceph-mon[298913]: from='mgr.26593 172.18.0.108:0/1692918800' entity='mgr.np0005559464.aomnqe' cmd={"prefix": "config rm", "who": "osd/host:np0005559460", "name": "osd_memory_target"} : dispatch
Dec 15 04:53:06 localhost ceph-mon[298913]: from='mgr.26593 ' entity='mgr.np0005559464.aomnqe'
Dec 15 04:53:06 localhost ceph-mon[298913]: from='mgr.26593 ' entity='mgr.np0005559464.aomnqe'
Dec 15 04:53:06 localhost ceph-mon[298913]: from='mgr.26593 172.18.0.108:0/1692918800' entity='mgr.np0005559464.aomnqe' cmd={"prefix": "config rm", "who": "osd.2", "name": "osd_memory_target"} : dispatch
Dec 15 04:53:06 localhost ceph-mon[298913]: from='mgr.26593 ' entity='mgr.np0005559464.aomnqe' cmd={"prefix": "config rm", "who": "osd.2", "name": "osd_memory_target"} : dispatch
Dec 15 04:53:06 localhost ceph-mon[298913]: from='mgr.26593 ' entity='mgr.np0005559464.aomnqe'
Dec 15 04:53:06 localhost ceph-mon[298913]: from='mgr.26593 172.18.0.108:0/1692918800' entity='mgr.np0005559464.aomnqe' cmd={"prefix": "config rm", "who": "osd.5", "name": "osd_memory_target"} : dispatch
Dec 15 04:53:06 localhost ceph-mon[298913]: from='mgr.26593 ' entity='mgr.np0005559464.aomnqe' cmd={"prefix": "config rm", "who": "osd.5", "name": "osd_memory_target"} : dispatch
Dec 15 04:53:06 localhost ceph-mon[298913]: from='mgr.26593 ' entity='mgr.np0005559464.aomnqe'
Dec 15 04:53:06 localhost ceph-mon[298913]: Adjusting osd_memory_target on np0005559463.localdomain to 836.6M
Dec 15 04:53:06 localhost ceph-mon[298913]: from='mgr.26593 172.18.0.108:0/1692918800' entity='mgr.np0005559464.aomnqe' cmd={"prefix": "config rm", "who": "osd/host:np0005559461", "name": "osd_memory_target"} : dispatch
Dec 15 04:53:06 localhost ceph-mon[298913]: from='mgr.26593 ' entity='mgr.np0005559464.aomnqe' cmd={"prefix": "config rm", "who": "osd/host:np0005559461", "name": "osd_memory_target"} : dispatch
Dec 15 04:53:06 localhost ceph-mon[298913]: Unable to set osd_memory_target on np0005559463.localdomain to 877246668: error parsing value: Value '877246668' is below minimum 939524096
Dec 15 04:53:06 localhost ceph-mon[298913]: from='mgr.26593 ' entity='mgr.np0005559464.aomnqe'
Dec 15 04:53:06 localhost ceph-mon[298913]: from='mgr.26593 ' entity='mgr.np0005559464.aomnqe'
Dec 15 04:53:06 localhost ceph-mon[298913]: from='mgr.26593 ' entity='mgr.np0005559464.aomnqe' cmd={"prefix": "config rm", "who": "osd.1", "name": "osd_memory_target"} : dispatch
Dec 15 04:53:06 localhost ceph-mon[298913]: from='mgr.26593 172.18.0.108:0/1692918800' entity='mgr.np0005559464.aomnqe'
cmd={"prefix": "config rm", "who": "osd.1", "name": "osd_memory_target"} : dispatch
Dec 15 04:53:06 localhost ceph-mon[298913]: from='mgr.26593 ' entity='mgr.np0005559464.aomnqe' cmd={"prefix": "config rm", "who": "osd.4", "name": "osd_memory_target"} : dispatch
Dec 15 04:53:06 localhost ceph-mon[298913]: from='mgr.26593 172.18.0.108:0/1692918800' entity='mgr.np0005559464.aomnqe' cmd={"prefix": "config rm", "who": "osd.4", "name": "osd_memory_target"} : dispatch
Dec 15 04:53:06 localhost ceph-mon[298913]: Adjusting osd_memory_target on np0005559464.localdomain to 836.6M
Dec 15 04:53:06 localhost ceph-mon[298913]: Unable to set osd_memory_target on np0005559464.localdomain to 877246668: error parsing value: Value '877246668' is below minimum 939524096
Dec 15 04:53:06 localhost ceph-mon[298913]: from='mgr.26593 ' entity='mgr.np0005559464.aomnqe'
Dec 15 04:53:06 localhost ceph-mon[298913]: from='mgr.26593 ' entity='mgr.np0005559464.aomnqe'
Dec 15 04:53:06 localhost ceph-mon[298913]: from='mgr.26593 ' entity='mgr.np0005559464.aomnqe' cmd={"prefix": "config rm", "who": "osd.0", "name": "osd_memory_target"} : dispatch
Dec 15 04:53:06 localhost ceph-mon[298913]: from='mgr.26593 172.18.0.108:0/1692918800' entity='mgr.np0005559464.aomnqe' cmd={"prefix": "config rm", "who": "osd.0", "name": "osd_memory_target"} : dispatch
Dec 15 04:53:06 localhost ceph-mon[298913]: from='mgr.26593 ' entity='mgr.np0005559464.aomnqe' cmd={"prefix": "config rm", "who": "osd.3", "name": "osd_memory_target"} : dispatch
Dec 15 04:53:06 localhost ceph-mon[298913]: from='mgr.26593 172.18.0.108:0/1692918800' entity='mgr.np0005559464.aomnqe' cmd={"prefix": "config rm", "who": "osd.3", "name": "osd_memory_target"} : dispatch
Dec 15 04:53:06 localhost ceph-mon[298913]: Adjusting osd_memory_target on np0005559462.localdomain to 836.6M
Dec 15 04:53:06 localhost ceph-mon[298913]: Unable to set osd_memory_target on np0005559462.localdomain to 877243801: error parsing value: Value '877243801' is below minimum 939524096
Dec 15 04:53:06 localhost ceph-mon[298913]: from='mgr.26593 172.18.0.108:0/1692918800' entity='mgr.np0005559464.aomnqe' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch
Dec 15 04:53:06 localhost ceph-mon[298913]: Updating np0005559460.localdomain:/etc/ceph/ceph.conf
Dec 15 04:53:06 localhost ceph-mon[298913]: Updating np0005559461.localdomain:/etc/ceph/ceph.conf
Dec 15 04:53:06 localhost ceph-mon[298913]: Updating np0005559462.localdomain:/etc/ceph/ceph.conf
Dec 15 04:53:06 localhost ceph-mon[298913]: Updating np0005559463.localdomain:/etc/ceph/ceph.conf
Dec 15 04:53:06 localhost ceph-mon[298913]: Updating np0005559464.localdomain:/etc/ceph/ceph.conf
Dec 15 04:53:08 localhost ceph-mon[298913]: Updating np0005559460.localdomain:/var/lib/ceph/bce17446-41b5-5408-a23e-0b011906b44a/config/ceph.conf
Dec 15 04:53:08 localhost ceph-mon[298913]: Updating np0005559462.localdomain:/var/lib/ceph/bce17446-41b5-5408-a23e-0b011906b44a/config/ceph.conf
Dec 15 04:53:08 localhost ceph-mon[298913]: Updating np0005559461.localdomain:/var/lib/ceph/bce17446-41b5-5408-a23e-0b011906b44a/config/ceph.conf
Dec 15 04:53:08 localhost ceph-mon[298913]: Updating np0005559463.localdomain:/var/lib/ceph/bce17446-41b5-5408-a23e-0b011906b44a/config/ceph.conf
Dec 15 04:53:08 localhost ceph-mon[298913]: Updating np0005559464.localdomain:/var/lib/ceph/bce17446-41b5-5408-a23e-0b011906b44a/config/ceph.conf
Dec 15 04:53:08 localhost ceph-mon[298913]: Updating np0005559460.localdomain:/etc/ceph/ceph.client.admin.keyring
Dec 15 04:53:08 localhost ceph-mon[298913]: Updating np0005559461.localdomain:/etc/ceph/ceph.client.admin.keyring
Dec 15 04:53:08 localhost ceph-mon[298913]: Updating np0005559464.localdomain:/etc/ceph/ceph.client.admin.keyring
Dec 15 04:53:08 localhost ceph-mon[298913]: Updating np0005559462.localdomain:/etc/ceph/ceph.client.admin.keyring
Dec 15 04:53:08 localhost ceph-mon[298913]: Updating np0005559463.localdomain:/etc/ceph/ceph.client.admin.keyring
Dec 15
04:53:09 localhost systemd[1]: Started /usr/bin/podman healthcheck run a4e94bd7aaee6c174c601060fb5f34559019ccea93fc397263ee3735731cfc1e.
Dec 15 04:53:09 localhost ceph-mon[298913]: Updating np0005559460.localdomain:/var/lib/ceph/bce17446-41b5-5408-a23e-0b011906b44a/config/ceph.client.admin.keyring
Dec 15 04:53:09 localhost ceph-mon[298913]: Updating np0005559464.localdomain:/var/lib/ceph/bce17446-41b5-5408-a23e-0b011906b44a/config/ceph.client.admin.keyring
Dec 15 04:53:09 localhost ceph-mon[298913]: Updating np0005559461.localdomain:/var/lib/ceph/bce17446-41b5-5408-a23e-0b011906b44a/config/ceph.client.admin.keyring
Dec 15 04:53:09 localhost ceph-mon[298913]: Updating np0005559462.localdomain:/var/lib/ceph/bce17446-41b5-5408-a23e-0b011906b44a/config/ceph.client.admin.keyring
Dec 15 04:53:09 localhost ceph-mon[298913]: Updating np0005559463.localdomain:/var/lib/ceph/bce17446-41b5-5408-a23e-0b011906b44a/config/ceph.client.admin.keyring
Dec 15 04:53:09 localhost ceph-mon[298913]: from='mgr.26593 ' entity='mgr.np0005559464.aomnqe'
Dec 15 04:53:09 localhost ceph-mon[298913]: from='mgr.26593 ' entity='mgr.np0005559464.aomnqe'
Dec 15 04:53:09 localhost ceph-mon[298913]: from='mgr.26593 ' entity='mgr.np0005559464.aomnqe'
Dec 15 04:53:09 localhost ceph-mon[298913]: from='mgr.26593 ' entity='mgr.np0005559464.aomnqe'
Dec 15 04:53:09 localhost ceph-mon[298913]: from='mgr.26593 ' entity='mgr.np0005559464.aomnqe'
Dec 15 04:53:09 localhost ceph-mon[298913]: from='mgr.26593 ' entity='mgr.np0005559464.aomnqe'
Dec 15 04:53:09 localhost ceph-mon[298913]: from='mgr.26593 ' entity='mgr.np0005559464.aomnqe'
Dec 15 04:53:09 localhost ceph-mon[298913]: from='mgr.26593 ' entity='mgr.np0005559464.aomnqe'
Dec 15 04:53:09 localhost ceph-mon[298913]: from='mgr.26593 ' entity='mgr.np0005559464.aomnqe'
Dec 15 04:53:09 localhost ceph-mon[298913]: from='mgr.26593 ' entity='mgr.np0005559464.aomnqe'
Dec 15 04:53:09 localhost ceph-mon[298913]: from='mgr.26593 ' entity='mgr.np0005559464.aomnqe'
Dec 15 04:53:09 localhost ceph-mon[298913]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #19. Immutable memtables: 0.
Dec 15 04:53:09 localhost ceph-mon[298913]: rocksdb: (Original Log Time 2025/12/15-09:53:09.195101) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Dec 15 04:53:09 localhost ceph-mon[298913]: rocksdb: [db/flush_job.cc:856] [default] [JOB 7] Flushing memtable with next log file: 19
Dec 15 04:53:09 localhost ceph-mon[298913]: rocksdb: EVENT_LOG_v1 {"time_micros": 1765792389195166, "job": 7, "event": "flush_started", "num_memtables": 1, "num_entries": 702, "num_deletes": 257, "total_data_size": 2770614, "memory_usage": 2888176, "flush_reason": "Manual Compaction"}
Dec 15 04:53:09 localhost ceph-mon[298913]: rocksdb: [db/flush_job.cc:885] [default] [JOB 7] Level-0 flush table #20: started
Dec 15 04:53:09 localhost ceph-mon[298913]: rocksdb: EVENT_LOG_v1 {"time_micros": 1765792389215357, "cf_name": "default", "job": 7, "event": "table_file_creation", "file_number": 20, "file_size": 1764631, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 12229, "largest_seqno": 12926, "table_properties": {"data_size": 1760781, "index_size": 1577, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 1221, "raw_key_size": 9092, "raw_average_key_size": 19, "raw_value_size": 1752705, "raw_average_value_size": 3785, "num_data_blocks": 63, "num_entries": 463, "num_filter_entries": 463, "num_deletions": 257, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "",
"prefix_extractor_name": "nullptr", "property_collectors": "[]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1765792384, "oldest_key_time": 1765792384, "file_creation_time": 1765792389, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "603b24af-e2be-4214-bc56-9e652eb4af3d", "db_session_id": "0OJRM9SCUA16EXV0VQZ2", "orig_file_number": 20, "seqno_to_time_mapping": "N/A"}}
Dec 15 04:53:09 localhost ceph-mon[298913]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 7] Flush lasted 20351 microseconds, and 4873 cpu microseconds.
Dec 15 04:53:09 localhost ceph-mon[298913]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Dec 15 04:53:09 localhost ceph-mon[298913]: rocksdb: (Original Log Time 2025/12/15-09:53:09.215449) [db/flush_job.cc:967] [default] [JOB 7] Level-0 flush table #20: 1764631 bytes OK
Dec 15 04:53:09 localhost ceph-mon[298913]: rocksdb: (Original Log Time 2025/12/15-09:53:09.215473) [db/memtable_list.cc:519] [default] Level-0 commit table #20 started
Dec 15 04:53:09 localhost ceph-mon[298913]: rocksdb: (Original Log Time 2025/12/15-09:53:09.220230) [db/memtable_list.cc:722] [default] Level-0 commit table #20: memtable #1 done
Dec 15 04:53:09 localhost ceph-mon[298913]: rocksdb: (Original Log Time 2025/12/15-09:53:09.220279) EVENT_LOG_v1 {"time_micros": 1765792389220272, "job": 7, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Dec 15 04:53:09 localhost ceph-mon[298913]: rocksdb: (Original Log Time 2025/12/15-09:53:09.220302) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Dec 15 04:53:09 localhost ceph-mon[298913]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 7] Try to delete WAL files size 2766499, prev total WAL file size 2766499, number of live WAL files 2.
Dec 15 04:53:09 localhost ceph-mon[298913]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005559462/store.db/000016.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Dec 15 04:53:09 localhost ceph-mon[298913]: rocksdb: (Original Log Time 2025/12/15-09:53:09.221112) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '6B760031303232' seq:72057594037927935, type:22 .. '6B760031323830' seq:0, type:0; will stop at (end)
Dec 15 04:53:09 localhost ceph-mon[298913]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 8] Compacting 1@0 + 1@6 files to L6, score -1.00
Dec 15 04:53:09 localhost ceph-mon[298913]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 7 Base level 0, inputs: [20(1723KB)], [18(17MB)]
Dec 15 04:53:09 localhost ceph-mon[298913]: rocksdb: EVENT_LOG_v1 {"time_micros": 1765792389221152, "job": 8, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [20], "files_L6": [18], "score": -1, "input_data_size": 19641814, "oldest_snapshot_seqno": -1}
Dec 15 04:53:09 localhost systemd[1]: tmp-crun.qrXMDt.mount: Deactivated successfully.
Dec 15 04:53:09 localhost podman[302204]: 2025-12-15 09:53:09.24076171 +0000 UTC m=+0.092833769 container health_status a4e94bd7aaee6c174c601060fb5f34559019ccea93fc397263ee3735731cfc1e (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, container_name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter)
Dec 15 04:53:09 localhost podman[302204]: 2025-12-15 09:53:09.250367418 +0000 UTC m=+0.102439487 container exec_died a4e94bd7aaee6c174c601060fb5f34559019ccea93fc397263ee3735731cfc1e (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter)
Dec 15 04:53:09 localhost systemd[1]: a4e94bd7aaee6c174c601060fb5f34559019ccea93fc397263ee3735731cfc1e.service: Deactivated successfully.
Dec 15 04:53:09 localhost ceph-mon[298913]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 8] Generated table #21: 10202 keys, 18642481 bytes, temperature: kUnknown
Dec 15 04:53:09 localhost ceph-mon[298913]: rocksdb: EVENT_LOG_v1 {"time_micros": 1765792389364097, "cf_name": "default", "job": 8, "event": "table_file_creation", "file_number": 21, "file_size": 18642481, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 18582236, "index_size": 33648, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 25541, "raw_key_size": 274941, "raw_average_key_size": 26, "raw_value_size": 18405437, "raw_average_value_size": 1804, "num_data_blocks": 1278, "num_entries": 10202, "num_filter_entries": 10202, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1765792320, "oldest_key_time": 0, "file_creation_time": 1765792389, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "603b24af-e2be-4214-bc56-9e652eb4af3d", "db_session_id": "0OJRM9SCUA16EXV0VQZ2", "orig_file_number": 21,
"seqno_to_time_mapping": "N/A"}}
Dec 15 04:53:09 localhost ceph-mon[298913]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Dec 15 04:53:09 localhost ceph-mon[298913]: rocksdb: (Original Log Time 2025/12/15-09:53:09.364840) [db/compaction/compaction_job.cc:1663] [default] [JOB 8] Compacted 1@0 + 1@6 files to L6 => 18642481 bytes
Dec 15 04:53:09 localhost ceph-mon[298913]: rocksdb: (Original Log Time 2025/12/15-09:53:09.368910) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 136.9 rd, 130.0 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(1.7, 17.0 +0.0 blob) out(17.8 +0.0 blob), read-write-amplify(21.7) write-amplify(10.6) OK, records in: 10744, records dropped: 542 output_compression: NoCompression
Dec 15 04:53:09 localhost ceph-mon[298913]: rocksdb: (Original Log Time 2025/12/15-09:53:09.368932) EVENT_LOG_v1 {"time_micros": 1765792389368922, "job": 8, "event": "compaction_finished", "compaction_time_micros": 143451, "compaction_time_cpu_micros": 48943, "output_level": 6, "num_output_files": 1, "total_output_size": 18642481, "num_input_records": 10744, "num_output_records": 10202, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Dec 15 04:53:09 localhost ceph-mon[298913]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005559462/store.db/000020.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Dec 15 04:53:09 localhost ceph-mon[298913]: rocksdb: EVENT_LOG_v1 {"time_micros": 1765792389369327, "job": 8, "event": "table_file_deletion", "file_number": 20}
Dec 15 04:53:09 localhost ceph-mon[298913]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005559462/store.db/000018.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Dec 15 04:53:09 localhost ceph-mon[298913]: rocksdb: EVENT_LOG_v1 {"time_micros": 1765792389371089, "job": 8, "event": "table_file_deletion", "file_number": 18}
Dec 15 04:53:09 localhost ceph-mon[298913]: rocksdb: (Original Log Time 2025/12/15-09:53:09.221048) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec 15 04:53:09 localhost ceph-mon[298913]: rocksdb: (Original Log Time 2025/12/15-09:53:09.371206) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec 15 04:53:09 localhost ceph-mon[298913]: rocksdb: (Original Log Time 2025/12/15-09:53:09.371212) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec 15 04:53:09 localhost ceph-mon[298913]: rocksdb: (Original Log Time 2025/12/15-09:53:09.371215) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec 15 04:53:09 localhost ceph-mon[298913]: rocksdb: (Original Log Time 2025/12/15-09:53:09.371218) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec 15 04:53:09 localhost ceph-mon[298913]: rocksdb: (Original Log Time 2025/12/15-09:53:09.371231) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec 15 04:53:10 localhost ceph-mon[298913]: mon.np0005559462@4(peon).osd e87 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 15 04:53:10 localhost ceph-mon[298913]: Reconfiguring crash.np0005559463 (monmap changed)...
Dec 15 04:53:10 localhost ceph-mon[298913]: from='mgr.26593 172.18.0.108:0/1692918800' entity='mgr.np0005559464.aomnqe' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005559463.localdomain", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch
Dec 15 04:53:10 localhost ceph-mon[298913]: from='mgr.26593 ' entity='mgr.np0005559464.aomnqe' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005559463.localdomain", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch
Dec 15 04:53:10 localhost ceph-mon[298913]: Reconfiguring daemon crash.np0005559463 on np0005559463.localdomain
Dec 15 04:53:10 localhost nova_compute[286344]: 2025-12-15 09:53:10.744 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 15 04:53:10 localhost nova_compute[286344]: 2025-12-15 09:53:10.747 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 15 04:53:11 localhost ceph-mon[298913]: from='mgr.26593 ' entity='mgr.np0005559464.aomnqe'
Dec 15 04:53:11 localhost ceph-mon[298913]: from='mgr.26593 ' entity='mgr.np0005559464.aomnqe'
Dec 15 04:53:11 localhost ceph-mon[298913]: Reconfiguring osd.2 (monmap changed)...
Dec 15 04:53:11 localhost ceph-mon[298913]: from='mgr.26593 172.18.0.108:0/1692918800' entity='mgr.np0005559464.aomnqe' cmd={"prefix": "auth get", "entity": "osd.2"} : dispatch
Dec 15 04:53:11 localhost ceph-mon[298913]: Reconfiguring daemon osd.2 on np0005559463.localdomain
Dec 15 04:53:11 localhost ceph-mon[298913]: from='mgr.26593 ' entity='mgr.np0005559464.aomnqe'
Dec 15 04:53:12 localhost ceph-mon[298913]: from='mgr.26593 ' entity='mgr.np0005559464.aomnqe'
Dec 15 04:53:12 localhost ceph-mon[298913]: from='mgr.26593 ' entity='mgr.np0005559464.aomnqe'
Dec 15 04:53:12 localhost ceph-mon[298913]: from='mgr.26593 ' entity='mgr.np0005559464.aomnqe'
Dec 15 04:53:12 localhost ceph-mon[298913]: Reconfiguring osd.5 (monmap changed)...
Dec 15 04:53:12 localhost ceph-mon[298913]: from='mgr.26593 172.18.0.108:0/1692918800' entity='mgr.np0005559464.aomnqe' cmd={"prefix": "auth get", "entity": "osd.5"} : dispatch
Dec 15 04:53:12 localhost ceph-mon[298913]: Reconfiguring daemon osd.5 on np0005559463.localdomain
Dec 15 04:53:13 localhost ceph-mon[298913]: from='mgr.26593 ' entity='mgr.np0005559464.aomnqe'
Dec 15 04:53:13 localhost ceph-mon[298913]: from='mgr.26593 ' entity='mgr.np0005559464.aomnqe'
Dec 15 04:53:13 localhost ceph-mon[298913]: from='mgr.26593 ' entity='mgr.np0005559464.aomnqe'
Dec 15 04:53:13 localhost ceph-mon[298913]: from='mgr.26593 ' entity='mgr.np0005559464.aomnqe'
Dec 15 04:53:13 localhost ceph-mon[298913]: Reconfiguring mds.mds.np0005559463.rdpgze (monmap changed)...
Dec 15 04:53:13 localhost ceph-mon[298913]: from='mgr.26593 ' entity='mgr.np0005559464.aomnqe' cmd={"prefix": "auth get-or-create", "entity": "mds.mds.np0005559463.rdpgze", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]} : dispatch
Dec 15 04:53:13 localhost ceph-mon[298913]: from='mgr.26593 172.18.0.108:0/1692918800' entity='mgr.np0005559464.aomnqe' cmd={"prefix": "auth get-or-create", "entity": "mds.mds.np0005559463.rdpgze", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]} : dispatch
Dec 15 04:53:13 localhost ceph-mon[298913]: Reconfiguring daemon mds.mds.np0005559463.rdpgze on np0005559463.localdomain
Dec 15 04:53:13 localhost ceph-mon[298913]: from='mgr.26593 ' entity='mgr.np0005559464.aomnqe'
Dec 15 04:53:13 localhost ceph-mon[298913]: from='mgr.26593 ' entity='mgr.np0005559464.aomnqe'
Dec 15 04:53:14 localhost ceph-mon[298913]: from='mgr.26593 ' entity='mgr.np0005559464.aomnqe'
Dec 15 04:53:14 localhost ceph-mon[298913]: Reconfiguring mgr.np0005559463.daptkf (monmap changed)...
Dec 15 04:53:14 localhost ceph-mon[298913]: from='mgr.26593 172.18.0.108:0/1692918800' entity='mgr.np0005559464.aomnqe' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005559463.daptkf", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch
Dec 15 04:53:14 localhost ceph-mon[298913]: from='mgr.26593 ' entity='mgr.np0005559464.aomnqe' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005559463.daptkf", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch
Dec 15 04:53:14 localhost ceph-mon[298913]: Reconfiguring daemon mgr.np0005559463.daptkf on np0005559463.localdomain
Dec 15 04:53:14 localhost ceph-mon[298913]: from='mgr.26593 ' entity='mgr.np0005559464.aomnqe'
Dec 15 04:53:14 localhost ceph-mon[298913]: from='mgr.26593 ' entity='mgr.np0005559464.aomnqe'
Dec 15 04:53:14 localhost ceph-mon[298913]: from='mgr.26593 172.18.0.108:0/1692918800' entity='mgr.np0005559464.aomnqe' cmd={"prefix": "auth get", "entity": "mon."} : dispatch
Dec 15 04:53:15 localhost ceph-mon[298913]: mon.np0005559462@4(peon).osd e87 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 15 04:53:15 localhost ceph-mon[298913]: Reconfiguring mon.np0005559463 (monmap changed)...
Dec 15 04:53:15 localhost ceph-mon[298913]: Reconfiguring daemon mon.np0005559463 on np0005559463.localdomain
Dec 15 04:53:15 localhost ceph-mon[298913]: from='mgr.26593 ' entity='mgr.np0005559464.aomnqe'
Dec 15 04:53:15 localhost ceph-mon[298913]: from='mgr.26593 ' entity='mgr.np0005559464.aomnqe'
Dec 15 04:53:15 localhost ceph-mon[298913]: from='mgr.26593 172.18.0.108:0/1692918800' entity='mgr.np0005559464.aomnqe' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005559464.localdomain", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch
Dec 15 04:53:15 localhost ceph-mon[298913]: from='mgr.26593 ' entity='mgr.np0005559464.aomnqe' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005559464.localdomain", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch
Dec 15 04:53:15 localhost nova_compute[286344]: 2025-12-15 09:53:15.748 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Dec 15 04:53:15 localhost nova_compute[286344]: 2025-12-15 09:53:15.750 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Dec 15 04:53:15 localhost nova_compute[286344]: 2025-12-15 09:53:15.750 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m
Dec 15 04:53:15 localhost nova_compute[286344]: 2025-12-15 09:53:15.750 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m
Dec 15 04:53:15 localhost nova_compute[286344]: 2025-12-15 09:53:15.751 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m
Dec 15 04:53:15 localhost nova_compute[286344]: 2025-12-15 09:53:15.754 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 15 04:53:16 localhost ceph-mon[298913]: Reconfiguring crash.np0005559464 (monmap changed)...
Dec 15 04:53:16 localhost ceph-mon[298913]: Reconfiguring daemon crash.np0005559464 on np0005559464.localdomain
Dec 15 04:53:16 localhost ceph-mon[298913]: from='mgr.26593 ' entity='mgr.np0005559464.aomnqe'
Dec 15 04:53:16 localhost ceph-mon[298913]: from='mgr.26593 ' entity='mgr.np0005559464.aomnqe'
Dec 15 04:53:16 localhost ceph-mon[298913]: from='mgr.26593 172.18.0.108:0/1692918800' entity='mgr.np0005559464.aomnqe' cmd={"prefix": "auth get", "entity": "osd.1"} : dispatch
Dec 15 04:53:17 localhost ceph-mon[298913]: Reconfiguring osd.1 (monmap changed)...
Dec 15 04:53:17 localhost ceph-mon[298913]: Reconfiguring daemon osd.1 on np0005559464.localdomain
Dec 15 04:53:17 localhost ceph-mon[298913]: from='mgr.26593 ' entity='mgr.np0005559464.aomnqe'
Dec 15 04:53:17 localhost ceph-mon[298913]: from='mgr.26593 ' entity='mgr.np0005559464.aomnqe'
Dec 15 04:53:17 localhost ceph-mon[298913]: from='mgr.26593 ' entity='mgr.np0005559464.aomnqe'
Dec 15 04:53:17 localhost ceph-mon[298913]: from='mgr.26593 ' entity='mgr.np0005559464.aomnqe'
Dec 15 04:53:17 localhost ceph-mon[298913]: from='mgr.26593 172.18.0.108:0/1692918800' entity='mgr.np0005559464.aomnqe' cmd={"prefix": "auth get", "entity": "osd.4"} : dispatch
Dec 15 04:53:18 localhost ceph-mgr[292421]: ms_deliver_dispatch: unhandled message 0x55975582b1e0 mon_map magic: 0 from mon.3 v2:172.18.0.107:3300/0
Dec 15 04:53:18 localhost ceph-mon[298913]: mon.np0005559462@4(peon) e10 my rank is now 3 (was 4)
Dec 15 04:53:18 localhost ceph-mgr[292421]: client.0 ms_handle_reset on v2:172.18.0.107:3300/0
Dec 15 04:53:18 localhost ceph-mgr[292421]: client.0 ms_handle_reset on v2:172.18.0.107:3300/0
Dec 15 04:53:18 localhost ceph-mon[298913]: log_channel(cluster) log [INF] : mon.np0005559462 calling monitor election
Dec 15 04:53:18 localhost ceph-mon[298913]: paxos.3).electionLogic(44) init, last seen epoch 44
Dec 15 04:53:18 localhost ceph-mgr[292421]: ms_deliver_dispatch: unhandled message 0x55975582b600 mon_map magic: 0 from mon.1 v2:172.18.0.108:3300/0
Dec 15 04:53:18 localhost ceph-mon[298913]: mon.np0005559462@3(electing) e10 collect_metadata vda: no unique device id for vda: fallback method has no model nor serial
Dec 15 04:53:18 localhost ceph-mon[298913]: mon.np0005559462@3(electing) e10 collect_metadata vda: no unique device id for vda: fallback method has no model nor serial
Dec 15 04:53:18 localhost ceph-mon[298913]: mon.np0005559462@3(electing) e10 collect_metadata vda: no unique device id for vda: fallback method has no model nor serial
Dec 15 04:53:20 localhost ceph-mon[298913]: mon.np0005559462@3(electing) e10 collect_metadata vda: no unique device id for vda: fallback method has no model nor serial
Dec 15 04:53:20 localhost ceph-mon[298913]: mon.np0005559462@3(peon) e10 collect_metadata vda: no unique device id for vda: fallback method has no model nor serial
Dec 15 04:53:20 localhost ceph-mon[298913]: mon.np0005559462@3(peon).osd e87 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 15 04:53:20 localhost ceph-mon[298913]: Reconfiguring daemon osd.4 on np0005559464.localdomain
Dec 15 04:53:20 localhost ceph-mon[298913]: Remove daemons mon.np0005559460
Dec 15 04:53:20 localhost ceph-mon[298913]: Safe to remove mon.np0005559460: new quorum should be ['np0005559461', 'np0005559464', 'np0005559463', 'np0005559462'] (from ['np0005559461', 'np0005559464', 'np0005559463', 'np0005559462'])
Dec 15 04:53:20 localhost ceph-mon[298913]: Removing monitor np0005559460 from monmap...
Dec 15 04:53:20 localhost ceph-mon[298913]: Removing daemon mon.np0005559460 from np0005559460.localdomain -- ports []
Dec 15 04:53:20 localhost ceph-mon[298913]: mon.np0005559463 calling monitor election
Dec 15 04:53:20 localhost ceph-mon[298913]: mon.np0005559462 calling monitor election
Dec 15 04:53:20 localhost ceph-mon[298913]: mon.np0005559464 calling monitor election
Dec 15 04:53:20 localhost ceph-mon[298913]: mon.np0005559461 calling monitor election
Dec 15 04:53:20 localhost ceph-mon[298913]: mon.np0005559461 is new leader, mons np0005559461,np0005559464,np0005559463,np0005559462 in quorum (ranks 0,1,2,3)
Dec 15 04:53:20 localhost ceph-mon[298913]: overall HEALTH_OK
Dec 15 04:53:20 localhost ceph-mon[298913]: from='mgr.26593 172.18.0.108:0/1692918800' entity='mgr.np0005559464.aomnqe'
Dec 15 04:53:20 localhost ceph-mon[298913]: from='mgr.26593 172.18.0.108:0/1692918800' entity='mgr.np0005559464.aomnqe'
Dec 15 04:53:20 localhost nova_compute[286344]: 2025-12-15 09:53:20.754 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 15 04:53:21 localhost ceph-mon[298913]: from='mgr.26593 172.18.0.108:0/1692918800' entity='mgr.np0005559464.aomnqe'
Dec 15 04:53:21 localhost ceph-mon[298913]: from='mgr.26593 172.18.0.108:0/1692918800' entity='mgr.np0005559464.aomnqe'
Dec 15 04:53:21 localhost ceph-mon[298913]: Reconfiguring mds.mds.np0005559464.piyuji (monmap changed)...
Dec 15 04:53:21 localhost ceph-mon[298913]: from='mgr.26593 172.18.0.108:0/1692918800' entity='mgr.np0005559464.aomnqe' cmd={"prefix": "auth get-or-create", "entity": "mds.mds.np0005559464.piyuji", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]} : dispatch
Dec 15 04:53:21 localhost ceph-mon[298913]: Reconfiguring daemon mds.mds.np0005559464.piyuji on np0005559464.localdomain
Dec 15 04:53:21 localhost ceph-mon[298913]: from='mgr.26593 172.18.0.108:0/1692918800' entity='mgr.np0005559464.aomnqe'
Dec 15 04:53:21 localhost ceph-mon[298913]: from='mgr.26593 172.18.0.108:0/1692918800' entity='mgr.np0005559464.aomnqe'
Dec 15 04:53:21 localhost ceph-mon[298913]: from='mgr.26593 172.18.0.108:0/1692918800' entity='mgr.np0005559464.aomnqe' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005559464.aomnqe", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch
Dec 15 04:53:21 localhost systemd[1]: Started /usr/bin/podman healthcheck run 67ee72fcd99c896f79e8807764b1d0f04e7d45927d06e306c0986394e8ed42a0.
Dec 15 04:53:21 localhost systemd[1]: Started /usr/bin/podman healthcheck run 9f0f481c937606298203797a71924b7614a5d9e3f4a8afd837cfa5b9602aedeb.
Dec 15 04:53:21 localhost systemd[1]: Started /usr/bin/podman healthcheck run b3d8483ade539500e228c2af9c0c3e647febb5ec7903610a83aea32631007d4a.
Dec 15 04:53:21 localhost systemd[1]: tmp-crun.XCvuWB.mount: Deactivated successfully.
Dec 15 04:53:21 localhost podman[302229]: 2025-12-15 09:53:21.776579486 +0000 UTC m=+0.099604768 container health_status 67ee72fcd99c896f79e8807764b1d0f04e7d45927d06e306c0986394e8ed42a0 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Dec 15 04:53:21 localhost podman[302230]: 2025-12-15 09:53:21.817063155 +0000 UTC m=+0.140432896 container health_status 9f0f481c937606298203797a71924b7614a5d9e3f4a8afd837cfa5b9602aedeb (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, container_name=multipathd, org.label-schema.license=GPLv2, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Dec 15 04:53:21 localhost podman[302229]: 2025-12-15 09:53:21.820469529 +0000 UTC m=+0.143494781 container exec_died 67ee72fcd99c896f79e8807764b1d0f04e7d45927d06e306c0986394e8ed42a0 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible)
Dec 15 04:53:21 localhost podman[302230]: 2025-12-15 09:53:21.832448333 +0000 UTC m=+0.155818094 container exec_died 9f0f481c937606298203797a71924b7614a5d9e3f4a8afd837cfa5b9602aedeb (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_id=multipathd, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3)
Dec 15 04:53:21 localhost systemd[1]: 67ee72fcd99c896f79e8807764b1d0f04e7d45927d06e306c0986394e8ed42a0.service: Deactivated successfully.
Dec 15 04:53:21 localhost podman[302231]: 2025-12-15 09:53:21.874000112 +0000 UTC m=+0.195732158 container health_status b3d8483ade539500e228c2af9c0c3e647febb5ec7903610a83aea32631007d4a (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ceilometer_agent_compute, container_name=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '07f3af15d79276418f54a8dde2fc251064527c3571430e14af77aaa2c01f1193-cb1e9478634a00c8fb5ad9cd7fcd38c7b2a1a85187dfaa618bc715c24c2b15fb'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']})
Dec 15 04:53:21 localhost podman[302231]: 2025-12-15 09:53:21.884783353 +0000 UTC m=+0.206515379 container exec_died b3d8483ade539500e228c2af9c0c3e647febb5ec7903610a83aea32631007d4a (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, container_name=ceilometer_agent_compute, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, tcib_managed=true, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '07f3af15d79276418f54a8dde2fc251064527c3571430e14af77aaa2c01f1193-cb1e9478634a00c8fb5ad9cd7fcd38c7b2a1a85187dfaa618bc715c24c2b15fb'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=ceilometer_agent_compute, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Dec 15 04:53:21 localhost systemd[1]: b3d8483ade539500e228c2af9c0c3e647febb5ec7903610a83aea32631007d4a.service: Deactivated successfully.
Dec 15 04:53:21 localhost systemd[1]: 9f0f481c937606298203797a71924b7614a5d9e3f4a8afd837cfa5b9602aedeb.service: Deactivated successfully.
Dec 15 04:53:22 localhost ceph-mon[298913]: Reconfiguring mgr.np0005559464.aomnqe (monmap changed)...
Dec 15 04:53:22 localhost ceph-mon[298913]: Reconfiguring daemon mgr.np0005559464.aomnqe on np0005559464.localdomain
Dec 15 04:53:22 localhost ceph-mon[298913]: from='mgr.26593 172.18.0.108:0/1692918800' entity='mgr.np0005559464.aomnqe'
Dec 15 04:53:22 localhost ceph-mon[298913]: from='mgr.26593 172.18.0.108:0/1692918800' entity='mgr.np0005559464.aomnqe'
Dec 15 04:53:22 localhost ceph-mon[298913]: from='mgr.26593 172.18.0.108:0/1692918800' entity='mgr.np0005559464.aomnqe' cmd={"prefix": "auth get", "entity": "mon."} : dispatch
Dec 15 04:53:23 localhost ceph-mon[298913]: Reconfiguring mon.np0005559464 (monmap changed)...
Dec 15 04:53:23 localhost ceph-mon[298913]: Reconfiguring daemon mon.np0005559464 on np0005559464.localdomain
Dec 15 04:53:23 localhost ceph-mon[298913]: from='mgr.26593 172.18.0.108:0/1692918800' entity='mgr.np0005559464.aomnqe'
Dec 15 04:53:23 localhost ceph-mon[298913]: from='mgr.26593 172.18.0.108:0/1692918800' entity='mgr.np0005559464.aomnqe'
Dec 15 04:53:23 localhost ceph-mon[298913]: from='mgr.26593 172.18.0.108:0/1692918800' entity='mgr.np0005559464.aomnqe'
Dec 15 04:53:24 localhost ceph-mon[298913]: Removed label mon from host np0005559460.localdomain
Dec 15 04:53:24 localhost ceph-mon[298913]: from='mgr.26593 172.18.0.108:0/1692918800' entity='mgr.np0005559464.aomnqe'
Dec 15 04:53:24 localhost systemd[1]: Started /usr/bin/podman healthcheck run 730c50020a7d9e2da21d2462d40af20ac303e43f3491f692cbbd87f432ba3e09.
Dec 15 04:53:24 localhost systemd[1]: Started /usr/bin/podman healthcheck run ac2eae30a2a7f971398af42ea67b3ed799d4b3341f7688ebcd73f7ca7f66c2a8.
Dec 15 04:53:24 localhost podman[302308]: 2025-12-15 09:53:24.527106597 +0000 UTC m=+0.066952697 container health_status ac2eae30a2a7f971398af42ea67b3ed799d4b3341f7688ebcd73f7ca7f66c2a8 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '07f3af15d79276418f54a8dde2fc251064527c3571430e14af77aaa2c01f1193'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.build-date=20251202, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3)
Dec 15 04:53:24 localhost systemd[1]: tmp-crun.gF69wC.mount: Deactivated successfully.
Dec 15 04:53:24 localhost podman[302307]: 2025-12-15 09:53:24.630035217 +0000 UTC m=+0.170652799 container health_status 730c50020a7d9e2da21d2462d40af20ac303e43f3491f692cbbd87f432ba3e09 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, url=https://catalog.redhat.com/en/search?searchType=containers, build-date=2025-08-20T13:12:41, com.redhat.component=ubi9-minimal-container, distribution-scope=public, vcs-type=git, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.buildah.version=1.33.7, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=9.6, config_id=openstack_network_exporter, container_name=openstack_network_exporter, maintainer=Red Hat, Inc., architecture=x86_64, managed_by=edpm_ansible, name=ubi9-minimal, release=1755695350, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'cb1e9478634a00c8fb5ad9cd7fcd38c7b2a1a85187dfaa618bc715c24c2b15fb'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.tags=minimal rhel9, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vendor=Red Hat, Inc., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal)
Dec 15 04:53:24 localhost podman[302308]: 2025-12-15 09:53:24.658587403 +0000 UTC m=+0.198433453 container exec_died ac2eae30a2a7f971398af42ea67b3ed799d4b3341f7688ebcd73f7ca7f66c2a8 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '07f3af15d79276418f54a8dde2fc251064527c3571430e14af77aaa2c01f1193'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, config_id=ovn_controller, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true)
Dec 15 04:53:24 localhost systemd[1]: ac2eae30a2a7f971398af42ea67b3ed799d4b3341f7688ebcd73f7ca7f66c2a8.service: Deactivated successfully.
Dec 15 04:53:24 localhost podman[302307]: 2025-12-15 09:53:24.667531182 +0000 UTC m=+0.208148764 container exec_died 730c50020a7d9e2da21d2462d40af20ac303e43f3491f692cbbd87f432ba3e09 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, io.buildah.version=1.33.7, architecture=x86_64, managed_by=edpm_ansible, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vendor=Red Hat, Inc., config_id=openstack_network_exporter, name=ubi9-minimal, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., version=9.6, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'cb1e9478634a00c8fb5ad9cd7fcd38c7b2a1a85187dfaa618bc715c24c2b15fb'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, distribution-scope=public, io.openshift.expose-services=, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., release=1755695350, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=openstack_network_exporter, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., build-date=2025-08-20T13:12:41, com.redhat.component=ubi9-minimal-container, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.tags=minimal rhel9, vcs-type=git, maintainer=Red Hat, Inc.)
Dec 15 04:53:24 localhost systemd[1]: 730c50020a7d9e2da21d2462d40af20ac303e43f3491f692cbbd87f432ba3e09.service: Deactivated successfully.
Dec 15 04:53:25 localhost ceph-mon[298913]: mon.np0005559462@3(peon).osd e87 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 15 04:53:25 localhost ceph-mon[298913]: Removed label mgr from host np0005559460.localdomain
Dec 15 04:53:25 localhost ceph-mon[298913]: from='mgr.26593 172.18.0.108:0/1692918800' entity='mgr.np0005559464.aomnqe'
Dec 15 04:53:25 localhost ceph-mon[298913]: from='mgr.26593 172.18.0.108:0/1692918800' entity='mgr.np0005559464.aomnqe'
Dec 15 04:53:25 localhost ceph-mon[298913]: from='mgr.26593 172.18.0.108:0/1692918800' entity='mgr.np0005559464.aomnqe' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch
Dec 15 04:53:25 localhost ceph-mon[298913]: Updating np0005559460.localdomain:/etc/ceph/ceph.conf
Dec 15 04:53:25 localhost ceph-mon[298913]: Updating np0005559461.localdomain:/etc/ceph/ceph.conf
Dec 15 04:53:25 localhost ceph-mon[298913]: Updating np0005559462.localdomain:/etc/ceph/ceph.conf
Dec 15 04:53:25 localhost ceph-mon[298913]: Updating np0005559463.localdomain:/etc/ceph/ceph.conf
Dec 15 04:53:25 localhost ceph-mon[298913]: Updating np0005559464.localdomain:/etc/ceph/ceph.conf
Dec 15 04:53:25 localhost ceph-mon[298913]: from='mgr.26593 172.18.0.108:0/1692918800' entity='mgr.np0005559464.aomnqe'
Dec 15 04:53:25 localhost nova_compute[286344]: 2025-12-15 09:53:25.756 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4996-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Dec 15 04:53:26 localhost sshd[302652]: main: sshd: ssh-rsa
algorithm is disabled Dec 15 04:53:26 localhost ceph-mon[298913]: Updating np0005559462.localdomain:/var/lib/ceph/bce17446-41b5-5408-a23e-0b011906b44a/config/ceph.conf Dec 15 04:53:26 localhost ceph-mon[298913]: Updating np0005559461.localdomain:/var/lib/ceph/bce17446-41b5-5408-a23e-0b011906b44a/config/ceph.conf Dec 15 04:53:26 localhost ceph-mon[298913]: Updating np0005559460.localdomain:/var/lib/ceph/bce17446-41b5-5408-a23e-0b011906b44a/config/ceph.conf Dec 15 04:53:26 localhost ceph-mon[298913]: Updating np0005559464.localdomain:/var/lib/ceph/bce17446-41b5-5408-a23e-0b011906b44a/config/ceph.conf Dec 15 04:53:26 localhost ceph-mon[298913]: Updating np0005559463.localdomain:/var/lib/ceph/bce17446-41b5-5408-a23e-0b011906b44a/config/ceph.conf Dec 15 04:53:26 localhost ceph-mon[298913]: Removed label _admin from host np0005559460.localdomain Dec 15 04:53:26 localhost ceph-mon[298913]: from='mgr.26593 172.18.0.108:0/1692918800' entity='mgr.np0005559464.aomnqe' Dec 15 04:53:26 localhost ceph-mon[298913]: from='mgr.26593 172.18.0.108:0/1692918800' entity='mgr.np0005559464.aomnqe' Dec 15 04:53:26 localhost ceph-mon[298913]: from='mgr.26593 172.18.0.108:0/1692918800' entity='mgr.np0005559464.aomnqe' Dec 15 04:53:26 localhost ceph-mon[298913]: from='mgr.26593 172.18.0.108:0/1692918800' entity='mgr.np0005559464.aomnqe' Dec 15 04:53:26 localhost ceph-mon[298913]: from='mgr.26593 172.18.0.108:0/1692918800' entity='mgr.np0005559464.aomnqe' Dec 15 04:53:26 localhost ceph-mon[298913]: from='mgr.26593 172.18.0.108:0/1692918800' entity='mgr.np0005559464.aomnqe' Dec 15 04:53:26 localhost ceph-mon[298913]: from='mgr.26593 172.18.0.108:0/1692918800' entity='mgr.np0005559464.aomnqe' Dec 15 04:53:26 localhost ceph-mon[298913]: from='mgr.26593 172.18.0.108:0/1692918800' entity='mgr.np0005559464.aomnqe' Dec 15 04:53:26 localhost ceph-mon[298913]: from='mgr.26593 172.18.0.108:0/1692918800' entity='mgr.np0005559464.aomnqe' Dec 15 04:53:26 localhost ceph-mon[298913]: from='mgr.26593 
172.18.0.108:0/1692918800' entity='mgr.np0005559464.aomnqe' Dec 15 04:53:26 localhost ceph-mon[298913]: from='mgr.26593 172.18.0.108:0/1692918800' entity='mgr.np0005559464.aomnqe' Dec 15 04:53:27 localhost ceph-mon[298913]: Removing daemon mgr.np0005559460.oexkup from np0005559460.localdomain -- ports [8765] Dec 15 04:53:28 localhost ceph-mon[298913]: from='mgr.26593 172.18.0.108:0/1692918800' entity='mgr.np0005559464.aomnqe' cmd={"prefix": "auth rm", "entity": "mgr.np0005559460.oexkup"} : dispatch Dec 15 04:53:28 localhost ceph-mon[298913]: from='mgr.26593 172.18.0.108:0/1692918800' entity='mgr.np0005559464.aomnqe' cmd='[{"prefix": "auth rm", "entity": "mgr.np0005559460.oexkup"}]': finished Dec 15 04:53:28 localhost ceph-mon[298913]: from='mgr.26593 172.18.0.108:0/1692918800' entity='mgr.np0005559464.aomnqe' Dec 15 04:53:28 localhost ceph-mon[298913]: from='mgr.26593 172.18.0.108:0/1692918800' entity='mgr.np0005559464.aomnqe' Dec 15 04:53:28 localhost sshd[302672]: main: sshd: ssh-rsa algorithm is disabled Dec 15 04:53:29 localhost ceph-mon[298913]: Removing key for mgr.np0005559460.oexkup Dec 15 04:53:29 localhost ceph-mon[298913]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #22. Immutable memtables: 0. 
Dec 15 04:53:29 localhost ceph-mon[298913]: rocksdb: (Original Log Time 2025/12/15-09:53:29.699367) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0 Dec 15 04:53:29 localhost ceph-mon[298913]: rocksdb: [db/flush_job.cc:856] [default] [JOB 9] Flushing memtable with next log file: 22 Dec 15 04:53:29 localhost ceph-mon[298913]: rocksdb: EVENT_LOG_v1 {"time_micros": 1765792409699408, "job": 9, "event": "flush_started", "num_memtables": 1, "num_entries": 1106, "num_deletes": 252, "total_data_size": 1798526, "memory_usage": 1822176, "flush_reason": "Manual Compaction"} Dec 15 04:53:29 localhost ceph-mon[298913]: rocksdb: [db/flush_job.cc:885] [default] [JOB 9] Level-0 flush table #23: started Dec 15 04:53:29 localhost ceph-mon[298913]: rocksdb: EVENT_LOG_v1 {"time_micros": 1765792409709949, "cf_name": "default", "job": 9, "event": "table_file_creation", "file_number": 23, "file_size": 1036682, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 12931, "largest_seqno": 14032, "table_properties": {"data_size": 1031400, "index_size": 2625, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 1605, "raw_key_size": 13898, "raw_average_key_size": 22, "raw_value_size": 1019965, "raw_average_value_size": 1631, "num_data_blocks": 110, "num_entries": 625, "num_filter_entries": 625, "num_deletions": 252, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; 
max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1765792390, "oldest_key_time": 1765792390, "file_creation_time": 1765792409, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "603b24af-e2be-4214-bc56-9e652eb4af3d", "db_session_id": "0OJRM9SCUA16EXV0VQZ2", "orig_file_number": 23, "seqno_to_time_mapping": "N/A"}} Dec 15 04:53:29 localhost ceph-mon[298913]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 9] Flush lasted 10672 microseconds, and 3955 cpu microseconds. Dec 15 04:53:29 localhost ceph-mon[298913]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed. Dec 15 04:53:29 localhost ceph-mon[298913]: rocksdb: (Original Log Time 2025/12/15-09:53:29.710037) [db/flush_job.cc:967] [default] [JOB 9] Level-0 flush table #23: 1036682 bytes OK Dec 15 04:53:29 localhost ceph-mon[298913]: rocksdb: (Original Log Time 2025/12/15-09:53:29.710061) [db/memtable_list.cc:519] [default] Level-0 commit table #23 started Dec 15 04:53:29 localhost ceph-mon[298913]: rocksdb: (Original Log Time 2025/12/15-09:53:29.712741) [db/memtable_list.cc:722] [default] Level-0 commit table #23: memtable #1 done Dec 15 04:53:29 localhost ceph-mon[298913]: rocksdb: (Original Log Time 2025/12/15-09:53:29.712762) EVENT_LOG_v1 {"time_micros": 1765792409712756, "job": 9, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0} Dec 15 04:53:29 localhost ceph-mon[298913]: rocksdb: (Original Log Time 2025/12/15-09:53:29.712786) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25 Dec 15 04:53:29 localhost ceph-mon[298913]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 9] Try to delete WAL files size 1792680, prev total WAL file size 
1793097, number of live WAL files 2. Dec 15 04:53:29 localhost ceph-mon[298913]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005559462/store.db/000019.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000 Dec 15 04:53:29 localhost ceph-mon[298913]: rocksdb: (Original Log Time 2025/12/15-09:53:29.713513) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '7061786F73003130353432' seq:72057594037927935, type:22 .. '7061786F73003130373934' seq:0, type:0; will stop at (end) Dec 15 04:53:29 localhost ceph-mon[298913]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 10] Compacting 1@0 + 1@6 files to L6, score -1.00 Dec 15 04:53:29 localhost ceph-mon[298913]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 9 Base level 0, inputs: [23(1012KB)], [21(17MB)] Dec 15 04:53:29 localhost ceph-mon[298913]: rocksdb: EVENT_LOG_v1 {"time_micros": 1765792409713551, "job": 10, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [23], "files_L6": [21], "score": -1, "input_data_size": 19679163, "oldest_snapshot_seqno": -1} Dec 15 04:53:29 localhost ceph-mon[298913]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 10] Generated table #24: 10286 keys, 16101650 bytes, temperature: kUnknown Dec 15 04:53:29 localhost ceph-mon[298913]: rocksdb: EVENT_LOG_v1 {"time_micros": 1765792409786971, "cf_name": "default", "job": 10, "event": "table_file_creation", "file_number": 24, "file_size": 16101650, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 16042004, "index_size": 32857, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 25733, "raw_key_size": 277798, "raw_average_key_size": 27, "raw_value_size": 15864783, 
"raw_average_value_size": 1542, "num_data_blocks": 1243, "num_entries": 10286, "num_filter_entries": 10286, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1765792320, "oldest_key_time": 0, "file_creation_time": 1765792409, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "603b24af-e2be-4214-bc56-9e652eb4af3d", "db_session_id": "0OJRM9SCUA16EXV0VQZ2", "orig_file_number": 24, "seqno_to_time_mapping": "N/A"}} Dec 15 04:53:29 localhost ceph-mon[298913]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed. 
Dec 15 04:53:29 localhost ceph-mon[298913]: rocksdb: (Original Log Time 2025/12/15-09:53:29.787768) [db/compaction/compaction_job.cc:1663] [default] [JOB 10] Compacted 1@0 + 1@6 files to L6 => 16101650 bytes Dec 15 04:53:29 localhost ceph-mon[298913]: rocksdb: (Original Log Time 2025/12/15-09:53:29.789708) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 266.0 rd, 217.6 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(1.0, 17.8 +0.0 blob) out(15.4 +0.0 blob), read-write-amplify(34.5) write-amplify(15.5) OK, records in: 10827, records dropped: 541 output_compression: NoCompression Dec 15 04:53:29 localhost ceph-mon[298913]: rocksdb: (Original Log Time 2025/12/15-09:53:29.789741) EVENT_LOG_v1 {"time_micros": 1765792409789727, "job": 10, "event": "compaction_finished", "compaction_time_micros": 73990, "compaction_time_cpu_micros": 32143, "output_level": 6, "num_output_files": 1, "total_output_size": 16101650, "num_input_records": 10827, "num_output_records": 10286, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]} Dec 15 04:53:29 localhost ceph-mon[298913]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005559462/store.db/000023.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000 Dec 15 04:53:29 localhost ceph-mon[298913]: rocksdb: EVENT_LOG_v1 {"time_micros": 1765792409790306, "job": 10, "event": "table_file_deletion", "file_number": 23} Dec 15 04:53:29 localhost ceph-mon[298913]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005559462/store.db/000021.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000 Dec 15 04:53:29 localhost ceph-mon[298913]: rocksdb: EVENT_LOG_v1 {"time_micros": 1765792409793143, 
"job": 10, "event": "table_file_deletion", "file_number": 21} Dec 15 04:53:29 localhost ceph-mon[298913]: rocksdb: (Original Log Time 2025/12/15-09:53:29.713432) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting Dec 15 04:53:29 localhost ceph-mon[298913]: rocksdb: (Original Log Time 2025/12/15-09:53:29.793228) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting Dec 15 04:53:29 localhost ceph-mon[298913]: rocksdb: (Original Log Time 2025/12/15-09:53:29.793234) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting Dec 15 04:53:29 localhost ceph-mon[298913]: rocksdb: (Original Log Time 2025/12/15-09:53:29.793237) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting Dec 15 04:53:29 localhost ceph-mon[298913]: rocksdb: (Original Log Time 2025/12/15-09:53:29.793240) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting Dec 15 04:53:29 localhost ceph-mon[298913]: rocksdb: (Original Log Time 2025/12/15-09:53:29.793243) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting Dec 15 04:53:30 localhost ceph-mon[298913]: mon.np0005559462@3(peon).osd e87 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Dec 15 04:53:30 localhost ceph-mon[298913]: from='mgr.26593 172.18.0.108:0/1692918800' entity='mgr.np0005559464.aomnqe' Dec 15 04:53:30 localhost ceph-mon[298913]: from='mgr.26593 172.18.0.108:0/1692918800' entity='mgr.np0005559464.aomnqe' Dec 15 04:53:30 localhost ceph-mon[298913]: from='mgr.26593 172.18.0.108:0/1692918800' entity='mgr.np0005559464.aomnqe' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch Dec 15 04:53:30 localhost ceph-mon[298913]: Removing np0005559460.localdomain:/var/lib/ceph/bce17446-41b5-5408-a23e-0b011906b44a/config/ceph.conf Dec 15 04:53:30 localhost ceph-mon[298913]: Removing 
np0005559460.localdomain:/etc/ceph/ceph.client.admin.keyring Dec 15 04:53:30 localhost ceph-mon[298913]: Removing np0005559460.localdomain:/var/lib/ceph/bce17446-41b5-5408-a23e-0b011906b44a/config/ceph.client.admin.keyring Dec 15 04:53:30 localhost ceph-mon[298913]: from='mgr.26593 172.18.0.108:0/1692918800' entity='mgr.np0005559464.aomnqe' Dec 15 04:53:30 localhost ceph-mon[298913]: from='mgr.26593 172.18.0.108:0/1692918800' entity='mgr.np0005559464.aomnqe' Dec 15 04:53:30 localhost ceph-mon[298913]: from='mgr.26593 172.18.0.108:0/1692918800' entity='mgr.np0005559464.aomnqe' Dec 15 04:53:30 localhost nova_compute[286344]: 2025-12-15 09:53:30.759 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 04:53:31 localhost ceph-mon[298913]: Reconfiguring crash.np0005559460 (monmap changed)... Dec 15 04:53:31 localhost ceph-mon[298913]: from='mgr.26593 172.18.0.108:0/1692918800' entity='mgr.np0005559464.aomnqe' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005559460.localdomain", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch Dec 15 04:53:31 localhost ceph-mon[298913]: Reconfiguring daemon crash.np0005559460 on np0005559460.localdomain Dec 15 04:53:31 localhost podman[243449]: time="2025-12-15T09:53:31Z" level=info msg="List containers: received `last` parameter - overwriting `limit`" Dec 15 04:53:31 localhost podman[243449]: @ - - [15/Dec/2025:09:53:31 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 154814 "" "Go-http-client/1.1" Dec 15 04:53:31 localhost podman[243449]: @ - - [15/Dec/2025:09:53:31 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 18704 "" "Go-http-client/1.1" Dec 15 04:53:32 localhost systemd[1]: Started /usr/bin/podman healthcheck run 4d46c406a3855fd1c322e5d86c048188ea5865daa07ba23aaebacc7879668923. 
Dec 15 04:53:32 localhost podman[302692]: 2025-12-15 09:53:32.762717018 +0000 UTC m=+0.092773748 container health_status 4d46c406a3855fd1c322e5d86c048188ea5865daa07ba23aaebacc7879668923 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '07f3af15d79276418f54a8dde2fc251064527c3571430e14af77aaa2c01f1193-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, managed_by=edpm_ansible) Dec 15 04:53:32 localhost 
podman[302692]: 2025-12-15 09:53:32.767074419 +0000 UTC m=+0.097131139 container exec_died 4d46c406a3855fd1c322e5d86c048188ea5865daa07ba23aaebacc7879668923 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251202, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '07f3af15d79276418f54a8dde2fc251064527c3571430e14af77aaa2c01f1193-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, tcib_managed=true) Dec 15 04:53:32 localhost systemd[1]: 
4d46c406a3855fd1c322e5d86c048188ea5865daa07ba23aaebacc7879668923.service: Deactivated successfully. Dec 15 04:53:32 localhost ceph-mon[298913]: from='mgr.26593 172.18.0.108:0/1692918800' entity='mgr.np0005559464.aomnqe' Dec 15 04:53:32 localhost ceph-mon[298913]: from='mgr.26593 172.18.0.108:0/1692918800' entity='mgr.np0005559464.aomnqe' Dec 15 04:53:32 localhost ceph-mon[298913]: Reconfiguring mon.np0005559461 (monmap changed)... Dec 15 04:53:32 localhost ceph-mon[298913]: from='mgr.26593 172.18.0.108:0/1692918800' entity='mgr.np0005559464.aomnqe' cmd={"prefix": "auth get", "entity": "mon."} : dispatch Dec 15 04:53:32 localhost ceph-mon[298913]: Reconfiguring daemon mon.np0005559461 on np0005559461.localdomain Dec 15 04:53:32 localhost ceph-mon[298913]: from='mgr.26593 172.18.0.108:0/1692918800' entity='mgr.np0005559464.aomnqe' Dec 15 04:53:32 localhost ceph-mon[298913]: from='mgr.26593 172.18.0.108:0/1692918800' entity='mgr.np0005559464.aomnqe' Dec 15 04:53:32 localhost ceph-mon[298913]: from='mgr.26593 172.18.0.108:0/1692918800' entity='mgr.np0005559464.aomnqe' Dec 15 04:53:32 localhost ceph-mon[298913]: from='mgr.26593 172.18.0.108:0/1692918800' entity='mgr.np0005559464.aomnqe' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005559461.egwgzn", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch Dec 15 04:53:34 localhost ceph-mon[298913]: Reconfiguring mgr.np0005559461.egwgzn (monmap changed)... Dec 15 04:53:34 localhost ceph-mon[298913]: Reconfiguring daemon mgr.np0005559461.egwgzn on np0005559461.localdomain Dec 15 04:53:34 localhost ceph-mon[298913]: from='mgr.26593 172.18.0.108:0/1692918800' entity='mgr.np0005559464.aomnqe' Dec 15 04:53:34 localhost ceph-mon[298913]: from='mgr.26593 172.18.0.108:0/1692918800' entity='mgr.np0005559464.aomnqe' Dec 15 04:53:34 localhost ceph-mon[298913]: Reconfiguring crash.np0005559461 (monmap changed)... 
Dec 15 04:53:34 localhost ceph-mon[298913]: from='mgr.26593 172.18.0.108:0/1692918800' entity='mgr.np0005559464.aomnqe' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005559461.localdomain", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch Dec 15 04:53:34 localhost ceph-mon[298913]: Reconfiguring daemon crash.np0005559461 on np0005559461.localdomain Dec 15 04:53:34 localhost ceph-mon[298913]: from='mgr.26593 172.18.0.108:0/1692918800' entity='mgr.np0005559464.aomnqe' Dec 15 04:53:34 localhost ceph-mon[298913]: from='mgr.26593 172.18.0.108:0/1692918800' entity='mgr.np0005559464.aomnqe' Dec 15 04:53:34 localhost ceph-mon[298913]: from='mgr.26593 172.18.0.108:0/1692918800' entity='mgr.np0005559464.aomnqe' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005559462.localdomain", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch Dec 15 04:53:34 localhost openstack_network_exporter[246484]: ERROR 09:53:34 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server Dec 15 04:53:34 localhost openstack_network_exporter[246484]: ERROR 09:53:34 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd Dec 15 04:53:34 localhost openstack_network_exporter[246484]: ERROR 09:53:34 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd Dec 15 04:53:34 localhost openstack_network_exporter[246484]: ERROR 09:53:34 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath Dec 15 04:53:34 localhost openstack_network_exporter[246484]: Dec 15 04:53:34 localhost openstack_network_exporter[246484]: ERROR 09:53:34 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath Dec 15 04:53:34 localhost openstack_network_exporter[246484]: Dec 15 04:53:35 localhost ceph-mon[298913]: mon.np0005559462@3(peon).osd e87 _set_new_cache_sizes cache_size:1020054731 
inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Dec 15 04:53:35 localhost podman[302765]: Dec 15 04:53:35 localhost podman[302765]: 2025-12-15 09:53:35.201787986 +0000 UTC m=+0.062405921 container create cb96d016944b4b67295b9b9a6026036edef00e91592038dd89a6469134b60d34 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=vibrant_pasteur, GIT_BRANCH=main, io.openshift.expose-services=, io.openshift.tags=rhceph ceph, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., version=7, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, io.k8s.description=Red Hat Ceph Storage 7, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, description=Red Hat Ceph Storage 7, com.redhat.component=rhceph-container, vcs-type=git, maintainer=Guillaume Abrioux , GIT_CLEAN=True, CEPH_POINT_RELEASE=, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, name=rhceph, ceph=True, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-11-26T19:44:28Z, GIT_REPO=https://github.com/ceph/ceph-container.git, distribution-scope=public, url=https://catalog.redhat.com/en/search?searchType=containers, architecture=x86_64, release=1763362218, RELEASE=main, vendor=Red Hat, Inc., io.buildah.version=1.41.4) Dec 15 04:53:35 localhost systemd[1]: Started libpod-conmon-cb96d016944b4b67295b9b9a6026036edef00e91592038dd89a6469134b60d34.scope. Dec 15 04:53:35 localhost systemd[1]: Started libcrun container. 
Dec 15 04:53:35 localhost podman[302765]: 2025-12-15 09:53:35.175543504 +0000 UTC m=+0.036161449 image pull registry.redhat.io/rhceph/rhceph-7-rhel9:latest Dec 15 04:53:35 localhost podman[302765]: 2025-12-15 09:53:35.280678966 +0000 UTC m=+0.141296941 container init cb96d016944b4b67295b9b9a6026036edef00e91592038dd89a6469134b60d34 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=vibrant_pasteur, maintainer=Guillaume Abrioux , description=Red Hat Ceph Storage 7, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, io.openshift.tags=rhceph ceph, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_REPO=https://github.com/ceph/ceph-container.git, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, GIT_BRANCH=main, name=rhceph, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1763362218, architecture=x86_64, vcs-type=git, RELEASE=main, io.buildah.version=1.41.4, io.k8s.description=Red Hat Ceph Storage 7, io.openshift.expose-services=, ceph=True, com.redhat.component=rhceph-container, version=7, distribution-scope=public, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, url=https://catalog.redhat.com/en/search?searchType=containers, vendor=Red Hat, Inc., CEPH_POINT_RELEASE=, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, GIT_CLEAN=True, build-date=2025-11-26T19:44:28Z) Dec 15 04:53:35 localhost podman[302765]: 2025-12-15 09:53:35.293675427 +0000 UTC m=+0.154293392 container start cb96d016944b4b67295b9b9a6026036edef00e91592038dd89a6469134b60d34 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=vibrant_pasteur, GIT_CLEAN=True, io.openshift.tags=rhceph ceph, build-date=2025-11-26T19:44:28Z, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, ceph=True, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, architecture=x86_64, CEPH_POINT_RELEASE=, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., com.redhat.component=rhceph-container, release=1763362218, version=7, distribution-scope=public, RELEASE=main, maintainer=Guillaume Abrioux , url=https://catalog.redhat.com/en/search?searchType=containers, GIT_BRANCH=main, io.openshift.expose-services=, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, name=rhceph, GIT_REPO=https://github.com/ceph/ceph-container.git, description=Red Hat Ceph Storage 7, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, vendor=Red Hat, Inc., io.buildah.version=1.41.4, vcs-type=git, io.k8s.description=Red Hat Ceph Storage 7) Dec 15 04:53:35 localhost podman[302765]: 2025-12-15 09:53:35.293995547 +0000 UTC m=+0.154613482 container attach cb96d016944b4b67295b9b9a6026036edef00e91592038dd89a6469134b60d34 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=vibrant_pasteur, vendor=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.expose-services=, io.openshift.tags=rhceph ceph, io.buildah.version=1.41.4, ceph=True, CEPH_POINT_RELEASE=, maintainer=Guillaume Abrioux , architecture=x86_64, build-date=2025-11-26T19:44:28Z, GIT_REPO=https://github.com/ceph/ceph-container.git, version=7, distribution-scope=public, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_CLEAN=True, com.redhat.component=rhceph-container, RELEASE=main, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1763362218, GIT_BRANCH=main, description=Red Hat Ceph Storage 7, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, io.k8s.description=Red Hat Ceph Storage 7, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully 
featured and supported base image., name=rhceph, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, vcs-type=git, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9) Dec 15 04:53:35 localhost vibrant_pasteur[302780]: 167 167 Dec 15 04:53:35 localhost systemd[1]: libpod-cb96d016944b4b67295b9b9a6026036edef00e91592038dd89a6469134b60d34.scope: Deactivated successfully. Dec 15 04:53:35 localhost podman[302765]: 2025-12-15 09:53:35.30306989 +0000 UTC m=+0.163687835 container died cb96d016944b4b67295b9b9a6026036edef00e91592038dd89a6469134b60d34 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=vibrant_pasteur, distribution-scope=public, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, GIT_BRANCH=main, io.k8s.description=Red Hat Ceph Storage 7, io.openshift.expose-services=, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, maintainer=Guillaume Abrioux , GIT_REPO=https://github.com/ceph/ceph-container.git, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, com.redhat.component=rhceph-container, build-date=2025-11-26T19:44:28Z, release=1763362218, io.openshift.tags=rhceph ceph, vcs-type=git, io.buildah.version=1.41.4, version=7, url=https://catalog.redhat.com/en/search?searchType=containers, GIT_CLEAN=True, RELEASE=main, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vendor=Red Hat, Inc., summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, architecture=x86_64, name=rhceph, CEPH_POINT_RELEASE=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat Ceph Storage 7, ceph=True) Dec 15 04:53:35 localhost podman[302785]: 2025-12-15 09:53:35.403833639 +0000 UTC m=+0.095599616 container remove cb96d016944b4b67295b9b9a6026036edef00e91592038dd89a6469134b60d34 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=vibrant_pasteur, 
io.buildah.version=1.41.4, ceph=True, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_CLEAN=True, architecture=x86_64, GIT_REPO=https://github.com/ceph/ceph-container.git, release=1763362218, com.redhat.component=rhceph-container, vcs-type=git, CEPH_POINT_RELEASE=, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, RELEASE=main, GIT_BRANCH=main, description=Red Hat Ceph Storage 7, maintainer=Guillaume Abrioux , org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhceph ceph, name=rhceph, version=7, io.k8s.description=Red Hat Ceph Storage 7, vendor=Red Hat, Inc., io.openshift.expose-services=, url=https://catalog.redhat.com/en/search?searchType=containers, build-date=2025-11-26T19:44:28Z, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, distribution-scope=public, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0) Dec 15 04:53:35 localhost systemd[1]: libpod-conmon-cb96d016944b4b67295b9b9a6026036edef00e91592038dd89a6469134b60d34.scope: Deactivated successfully. Dec 15 04:53:35 localhost ceph-mon[298913]: Reconfiguring crash.np0005559462 (monmap changed)... 
Dec 15 04:53:35 localhost ceph-mon[298913]: Reconfiguring daemon crash.np0005559462 on np0005559462.localdomain Dec 15 04:53:35 localhost ceph-mon[298913]: from='mgr.26593 172.18.0.108:0/1692918800' entity='mgr.np0005559464.aomnqe' Dec 15 04:53:35 localhost ceph-mon[298913]: from='mgr.26593 172.18.0.108:0/1692918800' entity='mgr.np0005559464.aomnqe' Dec 15 04:53:35 localhost ceph-mon[298913]: from='mgr.26593 172.18.0.108:0/1692918800' entity='mgr.np0005559464.aomnqe' cmd={"prefix": "auth get", "entity": "osd.0"} : dispatch Dec 15 04:53:35 localhost nova_compute[286344]: 2025-12-15 09:53:35.761 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 04:53:36 localhost podman[302855]: Dec 15 04:53:36 localhost podman[302855]: 2025-12-15 09:53:36.134468268 +0000 UTC m=+0.078418567 container create 5780553c5bc0fbe0953d3d753af6a48da45f032657f878196d6a6c98ba0c0ce9 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=reverent_golick, vendor=Red Hat, Inc., architecture=x86_64, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, maintainer=Guillaume Abrioux , io.k8s.description=Red Hat Ceph Storage 7, distribution-scope=public, name=rhceph, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, ceph=True, vcs-type=git, build-date=2025-11-26T19:44:28Z, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, GIT_BRANCH=main, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, GIT_REPO=https://github.com/ceph/ceph-container.git, release=1763362218, GIT_CLEAN=True, url=https://catalog.redhat.com/en/search?searchType=containers, version=7, com.redhat.component=rhceph-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, RELEASE=main, description=Red Hat Ceph Storage 7, io.buildah.version=1.41.4, CEPH_POINT_RELEASE=, io.openshift.expose-services=, 
io.openshift.tags=rhceph ceph, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image.) Dec 15 04:53:36 localhost systemd[1]: Started libpod-conmon-5780553c5bc0fbe0953d3d753af6a48da45f032657f878196d6a6c98ba0c0ce9.scope. Dec 15 04:53:36 localhost systemd[1]: Started libcrun container. Dec 15 04:53:36 localhost podman[302855]: 2025-12-15 09:53:36.199211133 +0000 UTC m=+0.143161432 container init 5780553c5bc0fbe0953d3d753af6a48da45f032657f878196d6a6c98ba0c0ce9 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=reverent_golick, release=1763362218, com.redhat.component=rhceph-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://catalog.redhat.com/en/search?searchType=containers, description=Red Hat Ceph Storage 7, io.buildah.version=1.41.4, maintainer=Guillaume Abrioux , GIT_CLEAN=True, io.openshift.expose-services=, build-date=2025-11-26T19:44:28Z, GIT_REPO=https://github.com/ceph/ceph-container.git, version=7, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, io.k8s.description=Red Hat Ceph Storage 7, RELEASE=main, vcs-type=git, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, architecture=x86_64, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_BRANCH=main, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, CEPH_POINT_RELEASE=, distribution-scope=public, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., name=rhceph, ceph=True, io.openshift.tags=rhceph ceph, cpe=cpe:/a:redhat:enterprise_linux:9::appstream) Dec 15 04:53:36 localhost podman[302855]: 2025-12-15 09:53:36.100949814 +0000 UTC m=+0.044900133 image pull registry.redhat.io/rhceph/rhceph-7-rhel9:latest Dec 15 04:53:36 localhost systemd[1]: var-lib-containers-storage-overlay-5ae3844efbc7d069e662816dda76c6e3f435b84b891ca9b68c01c4ab5c3f673a-merged.mount: Deactivated 
successfully. Dec 15 04:53:36 localhost podman[302855]: 2025-12-15 09:53:36.209399426 +0000 UTC m=+0.153349745 container start 5780553c5bc0fbe0953d3d753af6a48da45f032657f878196d6a6c98ba0c0ce9 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=reverent_golick, vendor=Red Hat, Inc., GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, version=7, io.openshift.tags=rhceph ceph, release=1763362218, url=https://catalog.redhat.com/en/search?searchType=containers, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, maintainer=Guillaume Abrioux , io.k8s.description=Red Hat Ceph Storage 7, io.openshift.expose-services=, vcs-type=git, GIT_CLEAN=True, ceph=True, architecture=x86_64, CEPH_POINT_RELEASE=, description=Red Hat Ceph Storage 7, distribution-scope=public, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, GIT_REPO=https://github.com/ceph/ceph-container.git, GIT_BRANCH=main, build-date=2025-11-26T19:44:28Z, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., com.redhat.component=rhceph-container, RELEASE=main, name=rhceph, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.buildah.version=1.41.4) Dec 15 04:53:36 localhost podman[302855]: 2025-12-15 09:53:36.210013814 +0000 UTC m=+0.153964123 container attach 5780553c5bc0fbe0953d3d753af6a48da45f032657f878196d6a6c98ba0c0ce9 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=reverent_golick, vcs-type=git, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, version=7, maintainer=Guillaume Abrioux , io.openshift.expose-services=, GIT_BRANCH=main, RELEASE=main, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., distribution-scope=public, url=https://catalog.redhat.com/en/search?searchType=containers, 
org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, architecture=x86_64, release=1763362218, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-11-26T19:44:28Z, io.buildah.version=1.41.4, description=Red Hat Ceph Storage 7, GIT_CLEAN=True, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, GIT_REPO=https://github.com/ceph/ceph-container.git, CEPH_POINT_RELEASE=, com.redhat.component=rhceph-container, io.openshift.tags=rhceph ceph, io.k8s.description=Red Hat Ceph Storage 7, vendor=Red Hat, Inc., name=rhceph, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, ceph=True) Dec 15 04:53:36 localhost reverent_golick[302870]: 167 167 Dec 15 04:53:36 localhost systemd[1]: libpod-5780553c5bc0fbe0953d3d753af6a48da45f032657f878196d6a6c98ba0c0ce9.scope: Deactivated successfully. Dec 15 04:53:36 localhost podman[302855]: 2025-12-15 09:53:36.212538564 +0000 UTC m=+0.156488923 container died 5780553c5bc0fbe0953d3d753af6a48da45f032657f878196d6a6c98ba0c0ce9 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=reverent_golick, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., version=7, GIT_BRANCH=main, RELEASE=main, io.openshift.expose-services=, build-date=2025-11-26T19:44:28Z, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, CEPH_POINT_RELEASE=, GIT_CLEAN=True, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, name=rhceph, io.buildah.version=1.41.4, maintainer=Guillaume Abrioux , io.openshift.tags=rhceph ceph, com.redhat.component=rhceph-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, distribution-scope=public, vendor=Red Hat, Inc., io.k8s.description=Red Hat Ceph Storage 7, 
url=https://catalog.redhat.com/en/search?searchType=containers, GIT_REPO=https://github.com/ceph/ceph-container.git, release=1763362218, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, description=Red Hat Ceph Storage 7, architecture=x86_64, ceph=True) Dec 15 04:53:36 localhost systemd[1]: var-lib-containers-storage-overlay-2d0c7dcdf8d57914bde082ec4f17ac750563faa068798c34aea457c07717d7cb-merged.mount: Deactivated successfully. Dec 15 04:53:36 localhost podman[302875]: 2025-12-15 09:53:36.311217625 +0000 UTC m=+0.085906816 container remove 5780553c5bc0fbe0953d3d753af6a48da45f032657f878196d6a6c98ba0c0ce9 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=reverent_golick, version=7, vendor=Red Hat, Inc., description=Red Hat Ceph Storage 7, ceph=True, io.openshift.expose-services=, build-date=2025-11-26T19:44:28Z, release=1763362218, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., name=rhceph, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, GIT_BRANCH=main, architecture=x86_64, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, distribution-scope=public, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, RELEASE=main, CEPH_POINT_RELEASE=, maintainer=Guillaume Abrioux , io.openshift.tags=rhceph ceph, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, GIT_CLEAN=True, GIT_REPO=https://github.com/ceph/ceph-container.git, com.redhat.component=rhceph-container, io.buildah.version=1.41.4, io.k8s.description=Red Hat Ceph Storage 7, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git) Dec 15 04:53:36 localhost systemd[1]: libpod-conmon-5780553c5bc0fbe0953d3d753af6a48da45f032657f878196d6a6c98ba0c0ce9.scope: Deactivated successfully. 
Dec 15 04:53:36 localhost ceph-mon[298913]: Reconfiguring osd.0 (monmap changed)... Dec 15 04:53:36 localhost ceph-mon[298913]: Reconfiguring daemon osd.0 on np0005559462.localdomain Dec 15 04:53:36 localhost ceph-mon[298913]: from='mgr.26593 172.18.0.108:0/1692918800' entity='mgr.np0005559464.aomnqe' Dec 15 04:53:36 localhost ceph-mon[298913]: from='mgr.26593 172.18.0.108:0/1692918800' entity='mgr.np0005559464.aomnqe' Dec 15 04:53:36 localhost ceph-mon[298913]: from='mgr.26593 172.18.0.108:0/1692918800' entity='mgr.np0005559464.aomnqe' cmd={"prefix": "auth get", "entity": "osd.3"} : dispatch Dec 15 04:53:36 localhost ceph-mon[298913]: from='mgr.26593 172.18.0.108:0/1692918800' entity='mgr.np0005559464.aomnqe' Dec 15 04:53:36 localhost ceph-mon[298913]: from='mgr.26593 172.18.0.108:0/1692918800' entity='mgr.np0005559464.aomnqe' Dec 15 04:53:37 localhost podman[302950]: Dec 15 04:53:37 localhost podman[302950]: 2025-12-15 09:53:37.131120413 +0000 UTC m=+0.084053115 container create 45c639ff5b56a5f9ebc69cb2d494204259579aebf6dbb86074d1a73aec5713b8 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=practical_lovelace, com.redhat.component=rhceph-container, io.openshift.tags=rhceph ceph, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:enterprise_linux:9::appstream, vcs-type=git, CEPH_POINT_RELEASE=, GIT_BRANCH=main, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, maintainer=Guillaume Abrioux , name=rhceph, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_CLEAN=True, io.k8s.description=Red Hat Ceph Storage 7, version=7, ceph=True, io.buildah.version=1.41.4, io.openshift.expose-services=, distribution-scope=public, description=Red Hat Ceph Storage 7, RELEASE=main, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, build-date=2025-11-26T19:44:28Z, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, release=1763362218, GIT_REPO=https://github.com/ceph/ceph-container.git, url=https://catalog.redhat.com/en/search?searchType=containers) Dec 15 04:53:37 localhost systemd[1]: Started libpod-conmon-45c639ff5b56a5f9ebc69cb2d494204259579aebf6dbb86074d1a73aec5713b8.scope. Dec 15 04:53:37 localhost systemd[1]: Started libcrun container. Dec 15 04:53:37 localhost podman[302950]: 2025-12-15 09:53:37.100437287 +0000 UTC m=+0.053369989 image pull registry.redhat.io/rhceph/rhceph-7-rhel9:latest Dec 15 04:53:37 localhost podman[302950]: 2025-12-15 09:53:37.200352393 +0000 UTC m=+0.153285075 container init 45c639ff5b56a5f9ebc69cb2d494204259579aebf6dbb86074d1a73aec5713b8 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=practical_lovelace, vendor=Red Hat, Inc., io.k8s.description=Red Hat Ceph Storage 7, ceph=True, version=7, io.buildah.version=1.41.4, io.openshift.tags=rhceph ceph, com.redhat.component=rhceph-container, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, maintainer=Guillaume Abrioux , build-date=2025-11-26T19:44:28Z, CEPH_POINT_RELEASE=, architecture=x86_64, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_CLEAN=True, url=https://catalog.redhat.com/en/search?searchType=containers, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, vcs-type=git, GIT_BRANCH=main, io.openshift.expose-services=, name=rhceph, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, RELEASE=main, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, GIT_REPO=https://github.com/ceph/ceph-container.git, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, distribution-scope=public, description=Red Hat Ceph Storage 7, release=1763362218) Dec 15 04:53:37 localhost podman[302950]: 
2025-12-15 09:53:37.210076105 +0000 UTC m=+0.163008747 container start 45c639ff5b56a5f9ebc69cb2d494204259579aebf6dbb86074d1a73aec5713b8 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=practical_lovelace, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, io.k8s.description=Red Hat Ceph Storage 7, RELEASE=main, architecture=x86_64, maintainer=Guillaume Abrioux , com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, GIT_BRANCH=main, io.buildah.version=1.41.4, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, distribution-scope=public, GIT_CLEAN=True, GIT_REPO=https://github.com/ceph/ceph-container.git, build-date=2025-11-26T19:44:28Z, vcs-type=git, vendor=Red Hat, Inc., description=Red Hat Ceph Storage 7, io.openshift.expose-services=, ceph=True, com.redhat.component=rhceph-container, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, CEPH_POINT_RELEASE=, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, version=7, name=rhceph, io.openshift.tags=rhceph ceph, release=1763362218, url=https://catalog.redhat.com/en/search?searchType=containers) Dec 15 04:53:37 localhost podman[302950]: 2025-12-15 09:53:37.21064177 +0000 UTC m=+0.163574402 container attach 45c639ff5b56a5f9ebc69cb2d494204259579aebf6dbb86074d1a73aec5713b8 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=practical_lovelace, maintainer=Guillaume Abrioux , name=rhceph, io.openshift.expose-services=, RELEASE=main, release=1763362218, com.redhat.component=rhceph-container, architecture=x86_64, ceph=True, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.tags=rhceph ceph, description=Red Hat Ceph Storage 7, 
vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, io.k8s.description=Red Hat Ceph Storage 7, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, GIT_CLEAN=True, version=7, vcs-type=git, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, GIT_BRANCH=main, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, build-date=2025-11-26T19:44:28Z, distribution-scope=public, vendor=Red Hat, Inc., GIT_REPO=https://github.com/ceph/ceph-container.git, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, CEPH_POINT_RELEASE=) Dec 15 04:53:37 localhost practical_lovelace[302965]: 167 167 Dec 15 04:53:37 localhost systemd[1]: libpod-45c639ff5b56a5f9ebc69cb2d494204259579aebf6dbb86074d1a73aec5713b8.scope: Deactivated successfully. Dec 15 04:53:37 localhost podman[302950]: 2025-12-15 09:53:37.215366331 +0000 UTC m=+0.168299023 container died 45c639ff5b56a5f9ebc69cb2d494204259579aebf6dbb86074d1a73aec5713b8 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=practical_lovelace, io.k8s.description=Red Hat Ceph Storage 7, build-date=2025-11-26T19:44:28Z, distribution-scope=public, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, io.buildah.version=1.41.4, com.redhat.component=rhceph-container, name=rhceph, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:enterprise_linux:9::appstream, release=1763362218, vcs-type=git, CEPH_POINT_RELEASE=, GIT_CLEAN=True, ceph=True, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=Guillaume Abrioux , GIT_REPO=https://github.com/ceph/ceph-container.git, url=https://catalog.redhat.com/en/search?searchType=containers, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, description=Red Hat Ceph Storage 7, RELEASE=main, 
io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, version=7, GIT_BRANCH=main, architecture=x86_64, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.openshift.tags=rhceph ceph) Dec 15 04:53:37 localhost systemd[1]: var-lib-containers-storage-overlay-82a86ae61dbed095f1f87b7a43633686c2d2775e960e85d85553d1d7202ae5c3-merged.mount: Deactivated successfully. Dec 15 04:53:37 localhost podman[302971]: 2025-12-15 09:53:37.313782466 +0000 UTC m=+0.090665489 container remove 45c639ff5b56a5f9ebc69cb2d494204259579aebf6dbb86074d1a73aec5713b8 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=practical_lovelace, io.openshift.expose-services=, name=rhceph, io.openshift.tags=rhceph ceph, ceph=True, io.k8s.description=Red Hat Ceph Storage 7, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, maintainer=Guillaume Abrioux , GIT_REPO=https://github.com/ceph/ceph-container.git, com.redhat.component=rhceph-container, GIT_CLEAN=True, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, vendor=Red Hat, Inc., description=Red Hat Ceph Storage 7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_BRANCH=main, vcs-type=git, build-date=2025-11-26T19:44:28Z, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, release=1763362218, version=7, architecture=x86_64, io.buildah.version=1.41.4, CEPH_POINT_RELEASE=, RELEASE=main, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0) Dec 15 04:53:37 localhost systemd[1]: libpod-conmon-45c639ff5b56a5f9ebc69cb2d494204259579aebf6dbb86074d1a73aec5713b8.scope: Deactivated successfully. Dec 15 04:53:37 localhost ceph-mon[298913]: Reconfiguring osd.3 (monmap changed)... 
Dec 15 04:53:37 localhost ceph-mon[298913]: Reconfiguring daemon osd.3 on np0005559462.localdomain Dec 15 04:53:37 localhost ceph-mon[298913]: Added label _no_schedule to host np0005559460.localdomain Dec 15 04:53:37 localhost ceph-mon[298913]: Added label SpecialHostLabels.DRAIN_CONF_KEYRING to host np0005559460.localdomain Dec 15 04:53:37 localhost ceph-mon[298913]: from='mgr.26593 172.18.0.108:0/1692918800' entity='mgr.np0005559464.aomnqe' Dec 15 04:53:37 localhost ceph-mon[298913]: from='mgr.26593 172.18.0.108:0/1692918800' entity='mgr.np0005559464.aomnqe' Dec 15 04:53:37 localhost ceph-mon[298913]: from='mgr.26593 172.18.0.108:0/1692918800' entity='mgr.np0005559464.aomnqe' cmd={"prefix": "auth get-or-create", "entity": "mds.mds.np0005559462.mhigvc", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]} : dispatch Dec 15 04:53:38 localhost podman[303048]: Dec 15 04:53:38 localhost podman[303048]: 2025-12-15 09:53:38.088468333 +0000 UTC m=+0.074639203 container create d7a8671e8e2691830d71702b59e4d541cada8da3fa79b93dee23bd61e4d4d8c8 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=angry_clarke, io.buildah.version=1.41.4, architecture=x86_64, com.redhat.component=rhceph-container, GIT_REPO=https://github.com/ceph/ceph-container.git, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, build-date=2025-11-26T19:44:28Z, ceph=True, vcs-type=git, io.openshift.expose-services=, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, GIT_BRANCH=main, maintainer=Guillaume Abrioux , version=7, io.openshift.tags=rhceph ceph, CEPH_POINT_RELEASE=, vendor=Red Hat, Inc., GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, GIT_CLEAN=True, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, url=https://catalog.redhat.com/en/search?searchType=containers, release=1763362218, 
cpe=cpe:/a:redhat:enterprise_linux:9::appstream, name=rhceph, io.k8s.description=Red Hat Ceph Storage 7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., description=Red Hat Ceph Storage 7, RELEASE=main) Dec 15 04:53:38 localhost systemd[1]: Started libpod-conmon-d7a8671e8e2691830d71702b59e4d541cada8da3fa79b93dee23bd61e4d4d8c8.scope. Dec 15 04:53:38 localhost systemd[1]: Started libcrun container. Dec 15 04:53:38 localhost podman[303048]: 2025-12-15 09:53:38.144035432 +0000 UTC m=+0.130206322 container init d7a8671e8e2691830d71702b59e4d541cada8da3fa79b93dee23bd61e4d4d8c8 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=angry_clarke, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, com.redhat.component=rhceph-container, maintainer=Guillaume Abrioux , io.openshift.tags=rhceph ceph, vcs-type=git, build-date=2025-11-26T19:44:28Z, release=1763362218, RELEASE=main, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, GIT_BRANCH=main, CEPH_POINT_RELEASE=, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, io.k8s.description=Red Hat Ceph Storage 7, GIT_REPO=https://github.com/ceph/ceph-container.git, io.buildah.version=1.41.4, name=rhceph, architecture=x86_64, distribution-scope=public, ceph=True, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.expose-services=, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., version=7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, description=Red Hat Ceph Storage 7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, GIT_CLEAN=True, vendor=Red Hat, Inc.) 
Dec 15 04:53:38 localhost podman[303048]: 2025-12-15 09:53:38.152543519 +0000 UTC m=+0.138714369 container start d7a8671e8e2691830d71702b59e4d541cada8da3fa79b93dee23bd61e4d4d8c8 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=angry_clarke, GIT_REPO=https://github.com/ceph/ceph-container.git, GIT_CLEAN=True, CEPH_POINT_RELEASE=, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, distribution-scope=public, release=1763362218, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=Guillaume Abrioux , GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, name=rhceph, vcs-type=git, build-date=2025-11-26T19:44:28Z, io.buildah.version=1.41.4, io.openshift.tags=rhceph ceph, com.redhat.component=rhceph-container, description=Red Hat Ceph Storage 7, RELEASE=main, architecture=x86_64, GIT_BRANCH=main, version=7, vendor=Red Hat, Inc., io.k8s.description=Red Hat Ceph Storage 7, io.openshift.expose-services=, ceph=True, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, url=https://catalog.redhat.com/en/search?searchType=containers) Dec 15 04:53:38 localhost angry_clarke[303063]: 167 167 Dec 15 04:53:38 localhost systemd[1]: libpod-d7a8671e8e2691830d71702b59e4d541cada8da3fa79b93dee23bd61e4d4d8c8.scope: Deactivated successfully. 
Dec 15 04:53:38 localhost podman[303048]: 2025-12-15 09:53:38.153915147 +0000 UTC m=+0.140086027 container attach d7a8671e8e2691830d71702b59e4d541cada8da3fa79b93dee23bd61e4d4d8c8 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=angry_clarke, GIT_REPO=https://github.com/ceph/ceph-container.git, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.k8s.description=Red Hat Ceph Storage 7, build-date=2025-11-26T19:44:28Z, ceph=True, maintainer=Guillaume Abrioux , version=7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, architecture=x86_64, GIT_CLEAN=True, vendor=Red Hat, Inc., distribution-scope=public, RELEASE=main, CEPH_POINT_RELEASE=, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.component=rhceph-container, release=1763362218, GIT_BRANCH=main, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vcs-type=git, io.openshift.expose-services=, io.buildah.version=1.41.4, description=Red Hat Ceph Storage 7, name=rhceph, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhceph ceph, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image.) 
Dec 15 04:53:38 localhost podman[303048]: 2025-12-15 09:53:38.058162168 +0000 UTC m=+0.044333088 image pull registry.redhat.io/rhceph/rhceph-7-rhel9:latest Dec 15 04:53:38 localhost podman[303048]: 2025-12-15 09:53:38.157291132 +0000 UTC m=+0.143461982 container died d7a8671e8e2691830d71702b59e4d541cada8da3fa79b93dee23bd61e4d4d8c8 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=angry_clarke, distribution-scope=public, GIT_BRANCH=main, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., CEPH_POINT_RELEASE=, name=rhceph, vcs-type=git, io.buildah.version=1.41.4, io.openshift.expose-services=, ceph=True, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, version=7, io.k8s.description=Red Hat Ceph Storage 7, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.component=rhceph-container, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, maintainer=Guillaume Abrioux , io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.openshift.tags=rhceph ceph, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, architecture=x86_64, release=1763362218, description=Red Hat Ceph Storage 7, GIT_CLEAN=True, vendor=Red Hat, Inc., GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, build-date=2025-11-26T19:44:28Z, GIT_REPO=https://github.com/ceph/ceph-container.git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, RELEASE=main) Dec 15 04:53:38 localhost systemd[1]: var-lib-containers-storage-overlay-1b8fd0f1cd07900e3bcae293e9e44730d96b9d92ae946ddbcd2c7b63173a538b-merged.mount: Deactivated successfully. 
Dec 15 04:53:38 localhost podman[303068]: 2025-12-15 09:53:38.240196333 +0000 UTC m=+0.075242169 container remove d7a8671e8e2691830d71702b59e4d541cada8da3fa79b93dee23bd61e4d4d8c8 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=angry_clarke, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1763362218, GIT_CLEAN=True, maintainer=Guillaume Abrioux , GIT_REPO=https://github.com/ceph/ceph-container.git, com.redhat.component=rhceph-container, url=https://catalog.redhat.com/en/search?searchType=containers, build-date=2025-11-26T19:44:28Z, description=Red Hat Ceph Storage 7, io.buildah.version=1.41.4, version=7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.k8s.description=Red Hat Ceph Storage 7, distribution-scope=public, CEPH_POINT_RELEASE=, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, vcs-type=git, ceph=True, architecture=x86_64, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, GIT_BRANCH=main, vendor=Red Hat, Inc., io.openshift.tags=rhceph ceph, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.openshift.expose-services=, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., RELEASE=main, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, name=rhceph) Dec 15 04:53:38 localhost systemd[1]: libpod-conmon-d7a8671e8e2691830d71702b59e4d541cada8da3fa79b93dee23bd61e4d4d8c8.scope: Deactivated successfully. Dec 15 04:53:38 localhost ceph-mon[298913]: Reconfiguring mds.mds.np0005559462.mhigvc (monmap changed)... 
Dec 15 04:53:38 localhost ceph-mon[298913]: Reconfiguring daemon mds.mds.np0005559462.mhigvc on np0005559462.localdomain Dec 15 04:53:38 localhost ceph-mon[298913]: from='mgr.26593 172.18.0.108:0/1692918800' entity='mgr.np0005559464.aomnqe' Dec 15 04:53:38 localhost ceph-mon[298913]: from='mgr.26593 172.18.0.108:0/1692918800' entity='mgr.np0005559464.aomnqe' Dec 15 04:53:38 localhost ceph-mon[298913]: from='mgr.26593 172.18.0.108:0/1692918800' entity='mgr.np0005559464.aomnqe' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005559462.fudvyx", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch Dec 15 04:53:38 localhost podman[303139]: Dec 15 04:53:38 localhost podman[303139]: 2025-12-15 09:53:38.89830833 +0000 UTC m=+0.064866489 container create ea919ee717b4535929913548e8c0f00e31e08b15173b4e25fdf27f9252a9effc (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=xenodochial_goldwasser, name=rhceph, url=https://catalog.redhat.com/en/search?searchType=containers, ceph=True, io.openshift.tags=rhceph ceph, io.buildah.version=1.41.4, GIT_REPO=https://github.com/ceph/ceph-container.git, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.k8s.description=Red Hat Ceph Storage 7, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, maintainer=Guillaume Abrioux , GIT_BRANCH=main, description=Red Hat Ceph Storage 7, vcs-type=git, CEPH_POINT_RELEASE=, vendor=Red Hat, Inc., com.redhat.component=rhceph-container, version=7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, release=1763362218, build-date=2025-11-26T19:44:28Z, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, GIT_CLEAN=True, 
RELEASE=main, architecture=x86_64) Dec 15 04:53:38 localhost systemd[1]: Started libpod-conmon-ea919ee717b4535929913548e8c0f00e31e08b15173b4e25fdf27f9252a9effc.scope. Dec 15 04:53:38 localhost systemd[1]: Started libcrun container. Dec 15 04:53:38 localhost podman[303139]: 2025-12-15 09:53:38.954077755 +0000 UTC m=+0.120635884 container init ea919ee717b4535929913548e8c0f00e31e08b15173b4e25fdf27f9252a9effc (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=xenodochial_goldwasser, release=1763362218, vendor=Red Hat, Inc., vcs-type=git, io.k8s.description=Red Hat Ceph Storage 7, io.openshift.tags=rhceph ceph, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, ceph=True, build-date=2025-11-26T19:44:28Z, name=rhceph, url=https://catalog.redhat.com/en/search?searchType=containers, RELEASE=main, io.buildah.version=1.41.4, GIT_REPO=https://github.com/ceph/ceph-container.git, GIT_BRANCH=main, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=Guillaume Abrioux , version=7, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, GIT_CLEAN=True, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.openshift.expose-services=, description=Red Hat Ceph Storage 7, com.redhat.component=rhceph-container, architecture=x86_64, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, distribution-scope=public, CEPH_POINT_RELEASE=, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0) Dec 15 04:53:38 localhost podman[303139]: 2025-12-15 09:53:38.9628766 +0000 UTC m=+0.129434719 container start ea919ee717b4535929913548e8c0f00e31e08b15173b4e25fdf27f9252a9effc (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=xenodochial_goldwasser, architecture=x86_64, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, RELEASE=main, build-date=2025-11-26T19:44:28Z, vendor=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, maintainer=Guillaume Abrioux , io.openshift.expose-services=, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vcs-type=git, io.k8s.description=Red Hat Ceph Storage 7, ceph=True, version=7, CEPH_POINT_RELEASE=, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., description=Red Hat Ceph Storage 7, io.openshift.tags=rhceph ceph, GIT_REPO=https://github.com/ceph/ceph-container.git, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, com.redhat.component=rhceph-container, release=1763362218, name=rhceph, distribution-scope=public, io.buildah.version=1.41.4, GIT_BRANCH=main, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, GIT_CLEAN=True, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0) Dec 15 04:53:38 localhost podman[303139]: 2025-12-15 09:53:38.963186759 +0000 UTC m=+0.129744888 container attach ea919ee717b4535929913548e8c0f00e31e08b15173b4e25fdf27f9252a9effc (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=xenodochial_goldwasser, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vcs-type=git, distribution-scope=public, architecture=x86_64, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, build-date=2025-11-26T19:44:28Z, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, version=7, name=rhceph, url=https://catalog.redhat.com/en/search?searchType=containers, CEPH_POINT_RELEASE=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=Guillaume Abrioux , summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., ceph=True, RELEASE=main, io.k8s.description=Red 
Hat Ceph Storage 7, io.openshift.tags=rhceph ceph, description=Red Hat Ceph Storage 7, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, GIT_BRANCH=main, release=1763362218, GIT_CLEAN=True, com.redhat.component=rhceph-container, io.buildah.version=1.41.4, vendor=Red Hat, Inc., GIT_REPO=https://github.com/ceph/ceph-container.git, io.openshift.expose-services=) Dec 15 04:53:38 localhost xenodochial_goldwasser[303154]: 167 167 Dec 15 04:53:38 localhost systemd[1]: libpod-ea919ee717b4535929913548e8c0f00e31e08b15173b4e25fdf27f9252a9effc.scope: Deactivated successfully. Dec 15 04:53:38 localhost podman[303139]: 2025-12-15 09:53:38.966594675 +0000 UTC m=+0.133152884 container died ea919ee717b4535929913548e8c0f00e31e08b15173b4e25fdf27f9252a9effc (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=xenodochial_goldwasser, release=1763362218, build-date=2025-11-26T19:44:28Z, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, io.openshift.tags=rhceph ceph, description=Red Hat Ceph Storage 7, vcs-type=git, io.k8s.description=Red Hat Ceph Storage 7, GIT_REPO=https://github.com/ceph/ceph-container.git, GIT_BRANCH=main, RELEASE=main, ceph=True, CEPH_POINT_RELEASE=, com.redhat.component=rhceph-container, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, name=rhceph, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=Guillaume Abrioux , io.openshift.expose-services=, GIT_CLEAN=True, io.buildah.version=1.41.4, version=7, vendor=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, distribution-scope=public, architecture=x86_64, cpe=cpe:/a:redhat:enterprise_linux:9::appstream) Dec 15 04:53:38 localhost podman[303139]: 2025-12-15 09:53:38.86851543 +0000 UTC 
m=+0.035073599 image pull registry.redhat.io/rhceph/rhceph-7-rhel9:latest Dec 15 04:53:39 localhost podman[303159]: 2025-12-15 09:53:39.060648856 +0000 UTC m=+0.083226491 container remove ea919ee717b4535929913548e8c0f00e31e08b15173b4e25fdf27f9252a9effc (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=xenodochial_goldwasser, maintainer=Guillaume Abrioux , vcs-type=git, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.openshift.expose-services=, description=Red Hat Ceph Storage 7, io.buildah.version=1.41.4, com.redhat.component=rhceph-container, RELEASE=main, vendor=Red Hat, Inc., release=1763362218, url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.tags=rhceph ceph, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., version=7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, GIT_REPO=https://github.com/ceph/ceph-container.git, GIT_CLEAN=True, distribution-scope=public, build-date=2025-11-26T19:44:28Z, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, CEPH_POINT_RELEASE=, io.k8s.description=Red Hat Ceph Storage 7, name=rhceph, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, GIT_BRANCH=main, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, architecture=x86_64, ceph=True, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0) Dec 15 04:53:39 localhost systemd[1]: libpod-conmon-ea919ee717b4535929913548e8c0f00e31e08b15173b4e25fdf27f9252a9effc.scope: Deactivated successfully. Dec 15 04:53:39 localhost systemd[1]: var-lib-containers-storage-overlay-6e010636bfb040fa6983a467e8d988c73f90127c9389d84359f7fbca0a967e7f-merged.mount: Deactivated successfully. Dec 15 04:53:39 localhost systemd[1]: Started /usr/bin/podman healthcheck run a4e94bd7aaee6c174c601060fb5f34559019ccea93fc397263ee3735731cfc1e. 
Dec 15 04:53:39 localhost podman[303211]: 2025-12-15 09:53:39.425461676 +0000 UTC m=+0.089120544 container health_status a4e94bd7aaee6c174c601060fb5f34559019ccea93fc397263ee3735731cfc1e (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible) Dec 15 04:53:39 localhost podman[303211]: 2025-12-15 09:53:39.44135515 +0000 UTC m=+0.105014018 container exec_died a4e94bd7aaee6c174c601060fb5f34559019ccea93fc397263ee3735731cfc1e (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', 
'/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible) Dec 15 04:53:39 localhost systemd[1]: a4e94bd7aaee6c174c601060fb5f34559019ccea93fc397263ee3735731cfc1e.service: Deactivated successfully. Dec 15 04:53:39 localhost podman[303251]: Dec 15 04:53:39 localhost podman[303251]: 2025-12-15 09:53:39.740033517 +0000 UTC m=+0.057345200 container create 29a691d3852e329405e7dc6d2fe3d4629f85d264ba9ed41f7319628bdbee2cbd (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=frosty_lederberg, description=Red Hat Ceph Storage 7, vendor=Red Hat, Inc., maintainer=Guillaume Abrioux , name=rhceph, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, build-date=2025-11-26T19:44:28Z, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, RELEASE=main, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, GIT_BRANCH=main, io.openshift.expose-services=, GIT_REPO=https://github.com/ceph/ceph-container.git, io.k8s.description=Red Hat Ceph Storage 7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vcs-type=git, architecture=x86_64, com.redhat.component=rhceph-container, ceph=True, GIT_CLEAN=True, release=1763362218, io.openshift.tags=rhceph ceph, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, version=7, distribution-scope=public, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, io.buildah.version=1.41.4, url=https://catalog.redhat.com/en/search?searchType=containers, CEPH_POINT_RELEASE=, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0) Dec 15 04:53:39 localhost ceph-mon[298913]: Reconfiguring mgr.np0005559462.fudvyx (monmap changed)... 
Dec 15 04:53:39 localhost ceph-mon[298913]: Reconfiguring daemon mgr.np0005559462.fudvyx on np0005559462.localdomain Dec 15 04:53:39 localhost ceph-mon[298913]: from='mgr.26593 172.18.0.108:0/1692918800' entity='mgr.np0005559464.aomnqe' Dec 15 04:53:39 localhost ceph-mon[298913]: from='mgr.26593 172.18.0.108:0/1692918800' entity='mgr.np0005559464.aomnqe' Dec 15 04:53:39 localhost ceph-mon[298913]: Reconfiguring mon.np0005559462 (monmap changed)... Dec 15 04:53:39 localhost ceph-mon[298913]: from='mgr.26593 172.18.0.108:0/1692918800' entity='mgr.np0005559464.aomnqe' cmd={"prefix": "auth get", "entity": "mon."} : dispatch Dec 15 04:53:39 localhost ceph-mon[298913]: Reconfiguring daemon mon.np0005559462 on np0005559462.localdomain Dec 15 04:53:39 localhost ceph-mon[298913]: from='mgr.26593 172.18.0.108:0/1692918800' entity='mgr.np0005559464.aomnqe' Dec 15 04:53:39 localhost ceph-mon[298913]: from='mgr.26593 172.18.0.108:0/1692918800' entity='mgr.np0005559464.aomnqe' cmd={"prefix":"config-key del","key":"mgr/cephadm/host.np0005559460.localdomain"} : dispatch Dec 15 04:53:39 localhost ceph-mon[298913]: from='mgr.26593 172.18.0.108:0/1692918800' entity='mgr.np0005559464.aomnqe' cmd='[{"prefix":"config-key del","key":"mgr/cephadm/host.np0005559460.localdomain"}]': finished Dec 15 04:53:39 localhost ceph-mon[298913]: Removed host np0005559460.localdomain Dec 15 04:53:39 localhost systemd[1]: Started libpod-conmon-29a691d3852e329405e7dc6d2fe3d4629f85d264ba9ed41f7319628bdbee2cbd.scope. Dec 15 04:53:39 localhost systemd[1]: Started libcrun container. 
Dec 15 04:53:39 localhost podman[303251]: 2025-12-15 09:53:39.809547374 +0000 UTC m=+0.126859107 container init 29a691d3852e329405e7dc6d2fe3d4629f85d264ba9ed41f7319628bdbee2cbd (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=frosty_lederberg, distribution-scope=public, io.k8s.description=Red Hat Ceph Storage 7, GIT_REPO=https://github.com/ceph/ceph-container.git, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_BRANCH=main, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.openshift.expose-services=, url=https://catalog.redhat.com/en/search?searchType=containers, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, build-date=2025-11-26T19:44:28Z, vendor=Red Hat, Inc., org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, com.redhat.component=rhceph-container, maintainer=Guillaume Abrioux , io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, CEPH_POINT_RELEASE=, name=rhceph, io.openshift.tags=rhceph ceph, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, architecture=x86_64, version=7, io.buildah.version=1.41.4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat Ceph Storage 7, release=1763362218, vcs-type=git, ceph=True, RELEASE=main, GIT_CLEAN=True) Dec 15 04:53:39 localhost podman[303251]: 2025-12-15 09:53:39.71504612 +0000 UTC m=+0.032357843 image pull registry.redhat.io/rhceph/rhceph-7-rhel9:latest Dec 15 04:53:39 localhost podman[303251]: 2025-12-15 09:53:39.818845344 +0000 UTC m=+0.136157037 container start 29a691d3852e329405e7dc6d2fe3d4629f85d264ba9ed41f7319628bdbee2cbd (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=frosty_lederberg, maintainer=Guillaume Abrioux , release=1763362218, description=Red Hat Ceph Storage 7, version=7, com.redhat.component=rhceph-container, GIT_REPO=https://github.com/ceph/ceph-container.git, name=rhceph, 
org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, ceph=True, io.buildah.version=1.41.4, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, CEPH_POINT_RELEASE=, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.openshift.expose-services=, io.openshift.tags=rhceph ceph, build-date=2025-11-26T19:44:28Z, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, distribution-scope=public, vcs-type=git, io.k8s.description=Red Hat Ceph Storage 7, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_BRANCH=main, RELEASE=main, GIT_CLEAN=True, architecture=x86_64, vendor=Red Hat, Inc.) Dec 15 04:53:39 localhost podman[303251]: 2025-12-15 09:53:39.819337348 +0000 UTC m=+0.136649071 container attach 29a691d3852e329405e7dc6d2fe3d4629f85d264ba9ed41f7319628bdbee2cbd (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=frosty_lederberg, url=https://catalog.redhat.com/en/search?searchType=containers, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.openshift.expose-services=, build-date=2025-11-26T19:44:28Z, CEPH_POINT_RELEASE=, release=1763362218, name=rhceph, version=7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, ceph=True, com.redhat.component=rhceph-container, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, RELEASE=main, maintainer=Guillaume Abrioux , io.k8s.description=Red Hat Ceph Storage 7, description=Red Hat Ceph Storage 7, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, GIT_CLEAN=True, vcs-type=git, GIT_REPO=https://github.com/ceph/ceph-container.git, distribution-scope=public, io.buildah.version=1.41.4, GIT_BRANCH=main, io.openshift.tags=rhceph ceph, 
org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, architecture=x86_64, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI) Dec 15 04:53:39 localhost frosty_lederberg[303267]: 167 167 Dec 15 04:53:39 localhost systemd[1]: libpod-29a691d3852e329405e7dc6d2fe3d4629f85d264ba9ed41f7319628bdbee2cbd.scope: Deactivated successfully. Dec 15 04:53:39 localhost podman[303251]: 2025-12-15 09:53:39.824385529 +0000 UTC m=+0.141697212 container died 29a691d3852e329405e7dc6d2fe3d4629f85d264ba9ed41f7319628bdbee2cbd (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=frosty_lederberg, name=rhceph, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, maintainer=Guillaume Abrioux , com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., ceph=True, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, version=7, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, vendor=Red Hat, Inc., com.redhat.component=rhceph-container, GIT_CLEAN=True, GIT_BRANCH=main, io.k8s.description=Red Hat Ceph Storage 7, architecture=x86_64, RELEASE=main, CEPH_POINT_RELEASE=, release=1763362218, vcs-type=git, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, GIT_REPO=https://github.com/ceph/ceph-container.git, distribution-scope=public, io.buildah.version=1.41.4, build-date=2025-11-26T19:44:28Z, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, io.openshift.expose-services=, io.openshift.tags=rhceph ceph, url=https://catalog.redhat.com/en/search?searchType=containers, description=Red Hat Ceph Storage 7) Dec 15 04:53:39 localhost podman[303272]: 2025-12-15 09:53:39.921353111 +0000 UTC m=+0.088452936 container remove 29a691d3852e329405e7dc6d2fe3d4629f85d264ba9ed41f7319628bdbee2cbd 
(image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=frosty_lederberg, description=Red Hat Ceph Storage 7, io.buildah.version=1.41.4, distribution-scope=public, version=7, GIT_REPO=https://github.com/ceph/ceph-container.git, io.openshift.tags=rhceph ceph, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.k8s.description=Red Hat Ceph Storage 7, vendor=Red Hat, Inc., org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, GIT_CLEAN=True, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, com.redhat.component=rhceph-container, build-date=2025-11-26T19:44:28Z, RELEASE=main, maintainer=Guillaume Abrioux , url=https://catalog.redhat.com/en/search?searchType=containers, name=rhceph, architecture=x86_64, vcs-type=git, CEPH_POINT_RELEASE=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, GIT_BRANCH=main, ceph=True, io.openshift.expose-services=, release=1763362218, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0) Dec 15 04:53:39 localhost systemd[1]: libpod-conmon-29a691d3852e329405e7dc6d2fe3d4629f85d264ba9ed41f7319628bdbee2cbd.scope: Deactivated successfully. Dec 15 04:53:40 localhost ceph-mon[298913]: mon.np0005559462@3(peon).osd e87 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Dec 15 04:53:40 localhost systemd[1]: var-lib-containers-storage-overlay-70a8a815d3d8594d32d7c76a59a445a0a7823beacccde28990913fe5697ee520-merged.mount: Deactivated successfully. 
Dec 15 04:53:40 localhost nova_compute[286344]: 2025-12-15 09:53:40.763 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 04:53:41 localhost ceph-mon[298913]: from='mgr.26593 172.18.0.108:0/1692918800' entity='mgr.np0005559464.aomnqe' Dec 15 04:53:41 localhost ceph-mon[298913]: from='mgr.26593 172.18.0.108:0/1692918800' entity='mgr.np0005559464.aomnqe' Dec 15 04:53:41 localhost ceph-mon[298913]: Reconfiguring crash.np0005559463 (monmap changed)... Dec 15 04:53:41 localhost ceph-mon[298913]: from='mgr.26593 172.18.0.108:0/1692918800' entity='mgr.np0005559464.aomnqe' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005559463.localdomain", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch Dec 15 04:53:41 localhost ceph-mon[298913]: Reconfiguring daemon crash.np0005559463 on np0005559463.localdomain Dec 15 04:53:41 localhost ceph-mon[298913]: from='mgr.26593 172.18.0.108:0/1692918800' entity='mgr.np0005559464.aomnqe' Dec 15 04:53:41 localhost ceph-mon[298913]: from='mgr.26593 172.18.0.108:0/1692918800' entity='mgr.np0005559464.aomnqe' Dec 15 04:53:41 localhost ceph-mon[298913]: from='mgr.26593 172.18.0.108:0/1692918800' entity='mgr.np0005559464.aomnqe' cmd={"prefix": "auth get", "entity": "osd.2"} : dispatch Dec 15 04:53:42 localhost ceph-mon[298913]: Reconfiguring osd.2 (monmap changed)... 
Dec 15 04:53:42 localhost ceph-mon[298913]: Reconfiguring daemon osd.2 on np0005559463.localdomain Dec 15 04:53:42 localhost ceph-mon[298913]: from='mgr.26593 172.18.0.108:0/1692918800' entity='mgr.np0005559464.aomnqe' Dec 15 04:53:42 localhost ceph-mon[298913]: from='mgr.26593 172.18.0.108:0/1692918800' entity='mgr.np0005559464.aomnqe' Dec 15 04:53:42 localhost ceph-mon[298913]: from='mgr.26593 172.18.0.108:0/1692918800' entity='mgr.np0005559464.aomnqe' cmd={"prefix": "auth get", "entity": "osd.5"} : dispatch Dec 15 04:53:43 localhost ceph-mon[298913]: Reconfiguring osd.5 (monmap changed)... Dec 15 04:53:43 localhost ceph-mon[298913]: Reconfiguring daemon osd.5 on np0005559463.localdomain Dec 15 04:53:43 localhost ceph-mon[298913]: from='mgr.26593 172.18.0.108:0/1692918800' entity='mgr.np0005559464.aomnqe' Dec 15 04:53:43 localhost ceph-mon[298913]: from='mgr.26593 172.18.0.108:0/1692918800' entity='mgr.np0005559464.aomnqe' Dec 15 04:53:43 localhost ceph-mon[298913]: from='mgr.26593 172.18.0.108:0/1692918800' entity='mgr.np0005559464.aomnqe' cmd={"prefix": "auth get-or-create", "entity": "mds.mds.np0005559463.rdpgze", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]} : dispatch Dec 15 04:53:44 localhost ceph-mon[298913]: Reconfiguring mds.mds.np0005559463.rdpgze (monmap changed)... Dec 15 04:53:44 localhost ceph-mon[298913]: Reconfiguring daemon mds.mds.np0005559463.rdpgze on np0005559463.localdomain Dec 15 04:53:44 localhost ceph-mon[298913]: from='mgr.26593 172.18.0.108:0/1692918800' entity='mgr.np0005559464.aomnqe' Dec 15 04:53:44 localhost ceph-mon[298913]: from='mgr.26593 172.18.0.108:0/1692918800' entity='mgr.np0005559464.aomnqe' Dec 15 04:53:44 localhost ceph-mon[298913]: Reconfiguring mgr.np0005559463.daptkf (monmap changed)... 
Dec 15 04:53:44 localhost ceph-mon[298913]: from='mgr.26593 172.18.0.108:0/1692918800' entity='mgr.np0005559464.aomnqe' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005559463.daptkf", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch Dec 15 04:53:44 localhost ceph-mon[298913]: Reconfiguring daemon mgr.np0005559463.daptkf on np0005559463.localdomain Dec 15 04:53:44 localhost ceph-mon[298913]: from='mgr.26593 172.18.0.108:0/1692918800' entity='mgr.np0005559464.aomnqe' Dec 15 04:53:44 localhost ceph-mon[298913]: from='mgr.26593 172.18.0.108:0/1692918800' entity='mgr.np0005559464.aomnqe' Dec 15 04:53:44 localhost ceph-mon[298913]: from='mgr.26593 172.18.0.108:0/1692918800' entity='mgr.np0005559464.aomnqe' cmd={"prefix": "auth get", "entity": "mon."} : dispatch Dec 15 04:53:44 localhost sshd[303288]: main: sshd: ssh-rsa algorithm is disabled Dec 15 04:53:45 localhost ceph-mon[298913]: mon.np0005559462@3(peon).osd e87 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Dec 15 04:53:45 localhost ceph-mon[298913]: Reconfiguring mon.np0005559463 (monmap changed)... 
Dec 15 04:53:45 localhost ceph-mon[298913]: Reconfiguring daemon mon.np0005559463 on np0005559463.localdomain Dec 15 04:53:45 localhost ceph-mon[298913]: from='mgr.26593 172.18.0.108:0/1692918800' entity='mgr.np0005559464.aomnqe' Dec 15 04:53:45 localhost ceph-mon[298913]: from='mgr.26593 172.18.0.108:0/1692918800' entity='mgr.np0005559464.aomnqe' Dec 15 04:53:45 localhost ceph-mon[298913]: from='mgr.26593 172.18.0.108:0/1692918800' entity='mgr.np0005559464.aomnqe' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005559464.localdomain", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch Dec 15 04:53:45 localhost nova_compute[286344]: 2025-12-15 09:53:45.767 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Dec 15 04:53:45 localhost nova_compute[286344]: 2025-12-15 09:53:45.768 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 04:53:45 localhost nova_compute[286344]: 2025-12-15 09:53:45.769 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5001 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m Dec 15 04:53:45 localhost nova_compute[286344]: 2025-12-15 09:53:45.769 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Dec 15 04:53:45 localhost nova_compute[286344]: 2025-12-15 09:53:45.769 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Dec 15 04:53:45 localhost nova_compute[286344]: 2025-12-15 09:53:45.772 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 04:53:46 localhost 
ceph-mon[298913]: Reconfiguring crash.np0005559464 (monmap changed)... Dec 15 04:53:46 localhost ceph-mon[298913]: Reconfiguring daemon crash.np0005559464 on np0005559464.localdomain Dec 15 04:53:46 localhost ceph-mon[298913]: from='mgr.26593 172.18.0.108:0/1692918800' entity='mgr.np0005559464.aomnqe' Dec 15 04:53:46 localhost ceph-mon[298913]: from='mgr.26593 172.18.0.108:0/1692918800' entity='mgr.np0005559464.aomnqe' Dec 15 04:53:46 localhost ceph-mon[298913]: from='mgr.26593 172.18.0.108:0/1692918800' entity='mgr.np0005559464.aomnqe' cmd={"prefix": "auth get", "entity": "osd.1"} : dispatch Dec 15 04:53:47 localhost ceph-mon[298913]: Reconfiguring osd.1 (monmap changed)... Dec 15 04:53:47 localhost ceph-mon[298913]: Reconfiguring daemon osd.1 on np0005559464.localdomain Dec 15 04:53:47 localhost ceph-mon[298913]: from='mgr.26593 172.18.0.108:0/1692918800' entity='mgr.np0005559464.aomnqe' Dec 15 04:53:47 localhost ceph-mon[298913]: from='mgr.26593 172.18.0.108:0/1692918800' entity='mgr.np0005559464.aomnqe' Dec 15 04:53:47 localhost ceph-mon[298913]: from='mgr.26593 172.18.0.108:0/1692918800' entity='mgr.np0005559464.aomnqe' Dec 15 04:53:47 localhost ceph-mon[298913]: from='mgr.26593 172.18.0.108:0/1692918800' entity='mgr.np0005559464.aomnqe' cmd={"prefix": "auth get", "entity": "osd.4"} : dispatch Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.121 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359', 'name': 'test', 'flavor': {'id': '2da0e147-aaa7-4bb9-a176-5fe1b15a32a0', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'image': {'id': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-00000002', 'OS-EXT-SRV-ATTR:host': 'np0005559462.localdomain', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': 'c785bf23f53946bc99867d8832a50266', 'user_id': 
'1ba5fce347b64bfebf995f187193f205', 'hostId': 'bf4d507aa00b3fcf643a7e0b883420632ac7fc96e8c7a756982e121d', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228 Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.122 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.error in the context of pollsters Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.127 12 DEBUG ceilometer.compute.pollsters [-] 39ff1bd9-6f6b-44c8-bbec-a1fd9d196359/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.129 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'ef153117-98eb-4c44-b124-9663bdbc0548', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '1ba5fce347b64bfebf995f187193f205', 'user_name': None, 'project_id': 'c785bf23f53946bc99867d8832a50266', 'project_name': None, 'resource_id': 'instance-00000002-39ff1bd9-6f6b-44c8-bbec-a1fd9d196359-tap03ef8889-32', 'timestamp': '2025-12-15T09:53:48.122529', 'resource_metadata': {'display_name': 'test', 'name': 'tap03ef8889-32', 'instance_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359', 'instance_type': 'm1.small', 'host': 'bf4d507aa00b3fcf643a7e0b883420632ac7fc96e8c7a756982e121d', 'instance_host': 'np0005559462.localdomain', 'flavor': {'id': '2da0e147-aaa7-4bb9-a176-5fe1b15a32a0', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 
'7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e'}, 'image_ref': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:74:df:7c', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap03ef8889-32'}, 'message_id': 'f3c2dca0-d99b-11f0-817e-fa163ebaca0f', 'monotonic_time': 11494.315194234, 'message_signature': 'b7f00b2edaa8969383a068674a2a7e868d914acd9e63fc77d7bb891f122ba99d'}]}, 'timestamp': '2025-12-15 09:53:48.128082', '_unique_id': '3fdfb6338f784365b3a194767bdc583e'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.129 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.129 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.129 12 ERROR oslo_messaging.notify.messaging yield Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.129 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.129 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.129 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.129 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Dec 15 04:53:48 localhost 
ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.129 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.129 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.129 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.129 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.129 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.129 12 ERROR oslo_messaging.notify.messaging conn.connect() Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.129 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.129 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.129 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.129 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.129 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.129 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.129 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.129 12 ERROR oslo_messaging.notify.messaging Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.129 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.129 12 ERROR oslo_messaging.notify.messaging Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.129 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.129 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.129 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.129 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.129 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.129 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.129 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.129 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.129 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.129 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.129 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.129 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.129 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.129 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.129 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.129 12 ERROR oslo_messaging.notify.messaging 
File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.129 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.129 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.129 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.129 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.129 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.129 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.129 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.129 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.129 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.129 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in 
__exit__ Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.129 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.129 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.129 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.129 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.129 12 ERROR oslo_messaging.notify.messaging Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.130 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.delta in the context of pollsters Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.131 12 DEBUG ceilometer.compute.pollsters [-] 39ff1bd9-6f6b-44c8-bbec-a1fd9d196359/network.incoming.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.132 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': 'dc503f7f-6c48-480a-8967-02610c1060c6', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '1ba5fce347b64bfebf995f187193f205', 'user_name': None, 'project_id': 'c785bf23f53946bc99867d8832a50266', 'project_name': None, 'resource_id': 'instance-00000002-39ff1bd9-6f6b-44c8-bbec-a1fd9d196359-tap03ef8889-32', 'timestamp': '2025-12-15T09:53:48.131151', 'resource_metadata': {'display_name': 'test', 'name': 'tap03ef8889-32', 'instance_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359', 'instance_type': 'm1.small', 'host': 'bf4d507aa00b3fcf643a7e0b883420632ac7fc96e8c7a756982e121d', 'instance_host': 'np0005559462.localdomain', 'flavor': {'id': '2da0e147-aaa7-4bb9-a176-5fe1b15a32a0', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e'}, 'image_ref': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:74:df:7c', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap03ef8889-32'}, 'message_id': 'f3c36a44-d99b-11f0-817e-fa163ebaca0f', 'monotonic_time': 11494.315194234, 'message_signature': 'a97899b098acb19c55fbdc072a47da4404bca5b81bed07922f5682e2afd349ae'}]}, 'timestamp': '2025-12-15 09:53:48.131642', '_unique_id': 'fd23a67b964245a2a1e3b95e56df3fc9'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.132 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 
2025-12-15 09:53:48.132 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.132 12 ERROR oslo_messaging.notify.messaging yield Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.132 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.132 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.132 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.132 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.132 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.132 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.132 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.132 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.132 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.132 12 ERROR oslo_messaging.notify.messaging conn.connect() Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.132 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.132 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.132 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.132 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.132 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.132 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.132 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.132 12 ERROR oslo_messaging.notify.messaging Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.132 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.132 12 ERROR oslo_messaging.notify.messaging Dec 15 04:53:48 localhost 
ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.132 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.132 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.132 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.132 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.132 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.132 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.132 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.132 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.132 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.132 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection 
Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.132 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.132 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.132 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.132 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.132 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.132 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.132 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.132 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.132 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.132 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Dec 15 04:53:48 
localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.132 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.132 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.132 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.132 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.132 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.132 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.132 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.132 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.132 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.132 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.132 12 ERROR oslo_messaging.notify.messaging Dec 15 04:53:48 localhost 
ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.133 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.drop in the context of pollsters Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.133 12 DEBUG ceilometer.compute.pollsters [-] 39ff1bd9-6f6b-44c8-bbec-a1fd9d196359/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.135 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'cb7e96f0-23ea-4c77-8389-eabde36b52e7', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '1ba5fce347b64bfebf995f187193f205', 'user_name': None, 'project_id': 'c785bf23f53946bc99867d8832a50266', 'project_name': None, 'resource_id': 'instance-00000002-39ff1bd9-6f6b-44c8-bbec-a1fd9d196359-tap03ef8889-32', 'timestamp': '2025-12-15T09:53:48.133891', 'resource_metadata': {'display_name': 'test', 'name': 'tap03ef8889-32', 'instance_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359', 'instance_type': 'm1.small', 'host': 'bf4d507aa00b3fcf643a7e0b883420632ac7fc96e8c7a756982e121d', 'instance_host': 'np0005559462.localdomain', 'flavor': {'id': '2da0e147-aaa7-4bb9-a176-5fe1b15a32a0', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e'}, 'image_ref': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:74:df:7c', 
'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap03ef8889-32'}, 'message_id': 'f3c3d682-d99b-11f0-817e-fa163ebaca0f', 'monotonic_time': 11494.315194234, 'message_signature': 'a5abcfd49def5a6d4c572efaa9d06a1004034017e0d3f0c267857b491ef94281'}]}, 'timestamp': '2025-12-15 09:53:48.134395', '_unique_id': '09ce4eff9fb545aa839fbebd9de93220'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.135 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.135 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.135 12 ERROR oslo_messaging.notify.messaging yield Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.135 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.135 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.135 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.135 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.135 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 
09:53:48.135 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.135 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.135 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.135 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.135 12 ERROR oslo_messaging.notify.messaging conn.connect() Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.135 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.135 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.135 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.135 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.135 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.135 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Dec 15 04:53:48 localhost 
ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.135 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.135 12 ERROR oslo_messaging.notify.messaging
Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.135 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.135 12 ERROR oslo_messaging.notify.messaging
Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.135 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.135 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.135 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.135 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.135 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.135 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.135 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.135 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.135 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.135 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.135 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.135 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.135 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.135 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.135 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.135 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.135 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.135 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.135 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.135 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.135 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.135 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.135 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.135 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.135 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.135 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.135 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.135 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.135 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.135 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.135 12 ERROR oslo_messaging.notify.messaging
Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.136 12 INFO ceilometer.polling.manager [-] Polling pollster memory.usage in the context of pollsters
Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.153 12 DEBUG ceilometer.compute.pollsters [-] 39ff1bd9-6f6b-44c8-bbec-a1fd9d196359/memory.usage volume: 51.73828125 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.154 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '023c71bd-50e8-41c4-aace-b17698287678', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'memory.usage', 'counter_type': 'gauge', 'counter_unit': 'MB', 'counter_volume': 51.73828125, 'user_id': '1ba5fce347b64bfebf995f187193f205', 'user_name': None, 'project_id': 'c785bf23f53946bc99867d8832a50266', 'project_name': None, 'resource_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359', 'timestamp': '2025-12-15T09:53:48.136871', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359', 'instance_type': 'm1.small', 'host': 'bf4d507aa00b3fcf643a7e0b883420632ac7fc96e8c7a756982e121d', 'instance_host': 'np0005559462.localdomain', 'flavor': {'id': '2da0e147-aaa7-4bb9-a176-5fe1b15a32a0', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e'}, 'image_ref': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0}, 'message_id': 'f3c6c720-d99b-11f0-817e-fa163ebaca0f', 'monotonic_time': 11494.345737536, 'message_signature': 'e21055a5cad9509bd6e901eddda0068121badf721f799960511020d61a07f811'}]}, 'timestamp': '2025-12-15 09:53:48.153648', '_unique_id': '02d1dfb9b1894e0cb132703a9c476e6d'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.154 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.154 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.154 12 ERROR oslo_messaging.notify.messaging     yield
Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.154 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.154 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.154 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.154 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.154 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.154 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.154 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.154 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.154 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.154 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.154 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.154 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.154 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.154 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.154 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.154 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.154 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.154 12 ERROR oslo_messaging.notify.messaging
Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.154 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.154 12 ERROR oslo_messaging.notify.messaging
Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.154 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.154 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.154 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.154 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.154 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.154 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.154 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.154 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.154 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.154 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.154 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.154 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.154 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.154 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.154 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.154 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.154 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.154 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.154 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.154 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.154 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.154 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.154 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.154 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.154 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.154 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.154 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.154 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.154 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.154 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.154 12 ERROR oslo_messaging.notify.messaging
Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.155 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes in the context of pollsters
Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.155 12 DEBUG ceilometer.compute.pollsters [-] 39ff1bd9-6f6b-44c8-bbec-a1fd9d196359/network.outgoing.bytes volume: 9770 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.157 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'ab957fa2-dc07-484c-8d63-c3e02fb0f5e9', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 9770, 'user_id': '1ba5fce347b64bfebf995f187193f205', 'user_name': None, 'project_id': 'c785bf23f53946bc99867d8832a50266', 'project_name': None, 'resource_id': 'instance-00000002-39ff1bd9-6f6b-44c8-bbec-a1fd9d196359-tap03ef8889-32', 'timestamp': '2025-12-15T09:53:48.155838', 'resource_metadata': {'display_name': 'test', 'name': 'tap03ef8889-32', 'instance_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359', 'instance_type': 'm1.small', 'host': 'bf4d507aa00b3fcf643a7e0b883420632ac7fc96e8c7a756982e121d', 'instance_host': 'np0005559462.localdomain', 'flavor': {'id': '2da0e147-aaa7-4bb9-a176-5fe1b15a32a0', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e'}, 'image_ref': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:74:df:7c', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap03ef8889-32'}, 'message_id': 'f3c72f6c-d99b-11f0-817e-fa163ebaca0f', 'monotonic_time': 11494.315194234, 'message_signature': 'be4f35ffbeb7b3888014ce0c3bd959da265682f11df1e1363ad448cc85d9a169'}]}, 'timestamp': '2025-12-15 09:53:48.156348', '_unique_id': '9cff54570092481dbf7974366285b1f0'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.157 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.157 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.157 12 ERROR oslo_messaging.notify.messaging     yield
Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.157 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.157 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.157 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.157 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.157 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.157 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.157 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.157 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.157 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.157 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.157 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.157 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.157 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.157 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.157 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.157 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.157 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.157 12 ERROR oslo_messaging.notify.messaging
Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.157 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.157 12 ERROR oslo_messaging.notify.messaging
Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.157 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.157 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.157 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.157 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.157 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.157 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.157 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.157 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.157 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.157 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.157 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.157 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.157 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.157 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.157 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.157 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.157 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.157 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.157 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.157 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.157 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.157 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.157 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.157 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.157 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.157 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.157 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.157 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.157 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.157 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.157 12 ERROR oslo_messaging.notify.messaging
Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.158 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets in the context of pollsters
Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.158 12 DEBUG ceilometer.compute.pollsters [-] 39ff1bd9-6f6b-44c8-bbec-a1fd9d196359/network.incoming.packets volume: 60 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.159 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '920816f6-e4a9-479b-a6df-da5a5b615b0b', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 60, 'user_id': '1ba5fce347b64bfebf995f187193f205', 'user_name': None, 'project_id': 'c785bf23f53946bc99867d8832a50266', 'project_name': None, 'resource_id': 'instance-00000002-39ff1bd9-6f6b-44c8-bbec-a1fd9d196359-tap03ef8889-32', 'timestamp': '2025-12-15T09:53:48.158492', 'resource_metadata': {'display_name': 'test', 'name': 'tap03ef8889-32', 'instance_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359', 'instance_type': 'm1.small', 'host': 'bf4d507aa00b3fcf643a7e0b883420632ac7fc96e8c7a756982e121d', 'instance_host': 'np0005559462.localdomain', 'flavor': {'id': '2da0e147-aaa7-4bb9-a176-5fe1b15a32a0', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e'}, 'image_ref': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:74:df:7c', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap03ef8889-32'}, 'message_id': 'f3c795d8-d99b-11f0-817e-fa163ebaca0f', 'monotonic_time': 11494.315194234, 'message_signature': 'c06ca4e9f10cd21d91161605666301d31ba35e5eff280217d185120924206cbe'}]}, 'timestamp': '2025-12-15 09:53:48.158949', '_unique_id': '8ec3f9bd3ddc4a6997941cc37004ce1d'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.159 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.159 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.159 12 ERROR oslo_messaging.notify.messaging     yield
Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.159 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.159 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.159 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.159 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.159 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.159 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.159 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.159 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.159 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.159 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.159 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.159 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.159 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.159 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.159 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.159 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.159 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.159 12 ERROR oslo_messaging.notify.messaging
Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.159 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.159 12 ERROR oslo_messaging.notify.messaging
Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.159 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.159 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.159 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.159 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.159 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.159 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.159 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.159 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.159 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.159 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.159 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.159 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.159 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.159 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.159 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.159 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.159 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.159 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.159 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.159 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Dec 15 04:53:48 
localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.159 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.159 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.159 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.159 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.159 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.159 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.159 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.159 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.159 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.159 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.159 12 ERROR oslo_messaging.notify.messaging Dec 15 04:53:48 localhost 
ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.160 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.drop in the context of pollsters Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.161 12 DEBUG ceilometer.compute.pollsters [-] 39ff1bd9-6f6b-44c8-bbec-a1fd9d196359/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.162 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '660d35f7-cf27-44a4-abee-37935bdc0c38', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '1ba5fce347b64bfebf995f187193f205', 'user_name': None, 'project_id': 'c785bf23f53946bc99867d8832a50266', 'project_name': None, 'resource_id': 'instance-00000002-39ff1bd9-6f6b-44c8-bbec-a1fd9d196359-tap03ef8889-32', 'timestamp': '2025-12-15T09:53:48.161092', 'resource_metadata': {'display_name': 'test', 'name': 'tap03ef8889-32', 'instance_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359', 'instance_type': 'm1.small', 'host': 'bf4d507aa00b3fcf643a7e0b883420632ac7fc96e8c7a756982e121d', 'instance_host': 'np0005559462.localdomain', 'flavor': {'id': '2da0e147-aaa7-4bb9-a176-5fe1b15a32a0', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e'}, 'image_ref': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:74:df:7c', 
'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap03ef8889-32'}, 'message_id': 'f3c7fb7c-d99b-11f0-817e-fa163ebaca0f', 'monotonic_time': 11494.315194234, 'message_signature': '0d0dd3df3daa5460c597b1668048365ee154790a1c9e750c107603025580e681'}]}, 'timestamp': '2025-12-15 09:53:48.161548', '_unique_id': '6782b04d48834136a3874cc3d556611f'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.162 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.162 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.162 12 ERROR oslo_messaging.notify.messaging yield Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.162 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.162 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.162 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.162 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.162 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 
09:53:48.162 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.162 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.162 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.162 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.162 12 ERROR oslo_messaging.notify.messaging conn.connect() Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.162 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.162 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.162 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.162 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.162 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.162 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Dec 15 04:53:48 localhost 
ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.162 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.162 12 ERROR oslo_messaging.notify.messaging Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.162 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.162 12 ERROR oslo_messaging.notify.messaging Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.162 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.162 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.162 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.162 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.162 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.162 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.162 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Dec 15 04:53:48 
localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.162 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.162 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.162 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.162 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.162 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.162 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.162 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.162 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.162 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.162 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) 
Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.162 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.162 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.162 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.162 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.162 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.162 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.162 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.162 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.162 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.162 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.162 12 ERROR 
oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.162 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.162 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.162 12 ERROR oslo_messaging.notify.messaging Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.163 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.bytes in the context of pollsters Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.190 12 DEBUG ceilometer.compute.pollsters [-] 39ff1bd9-6f6b-44c8-bbec-a1fd9d196359/disk.device.write.bytes volume: 389120 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.190 12 DEBUG ceilometer.compute.pollsters [-] 39ff1bd9-6f6b-44c8-bbec-a1fd9d196359/disk.device.write.bytes volume: 512 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.192 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': '57faa027-1f31-4caf-91e0-226d37161b8e', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 389120, 'user_id': '1ba5fce347b64bfebf995f187193f205', 'user_name': None, 'project_id': 'c785bf23f53946bc99867d8832a50266', 'project_name': None, 'resource_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359-vda', 'timestamp': '2025-12-15T09:53:48.163698', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359', 'instance_type': 'm1.small', 'host': 'bf4d507aa00b3fcf643a7e0b883420632ac7fc96e8c7a756982e121d', 'instance_host': 'np0005559462.localdomain', 'flavor': {'id': '2da0e147-aaa7-4bb9-a176-5fe1b15a32a0', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e'}, 'image_ref': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'f3cc763e-d99b-11f0-817e-fa163ebaca0f', 'monotonic_time': 11494.356365232, 'message_signature': 'c8df3ca6105603f0470f9aeec177736c5a1cf9119defe66fe587fb4d638a9383'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 512, 'user_id': '1ba5fce347b64bfebf995f187193f205', 'user_name': None, 'project_id': 'c785bf23f53946bc99867d8832a50266', 'project_name': None, 'resource_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359-vdb', 'timestamp': '2025-12-15T09:53:48.163698', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 
'39ff1bd9-6f6b-44c8-bbec-a1fd9d196359', 'instance_type': 'm1.small', 'host': 'bf4d507aa00b3fcf643a7e0b883420632ac7fc96e8c7a756982e121d', 'instance_host': 'np0005559462.localdomain', 'flavor': {'id': '2da0e147-aaa7-4bb9-a176-5fe1b15a32a0', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e'}, 'image_ref': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'f3cc88e0-d99b-11f0-817e-fa163ebaca0f', 'monotonic_time': 11494.356365232, 'message_signature': 'd546a0c8317ae00c0f4d9365d075877e659b45fd03439aa24b993eaef1b09804'}]}, 'timestamp': '2025-12-15 09:53:48.191357', '_unique_id': '5d75023dc58b4ecaad77c070f24cd2be'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.192 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.192 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.192 12 ERROR oslo_messaging.notify.messaging yield Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.192 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.192 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.192 12 ERROR 
oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.192 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.192 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.192 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.192 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.192 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.192 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.192 12 ERROR oslo_messaging.notify.messaging conn.connect() Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.192 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.192 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.192 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Dec 15 
04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.192 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.192 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.192 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.192 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.192 12 ERROR oslo_messaging.notify.messaging Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.192 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.192 12 ERROR oslo_messaging.notify.messaging Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.192 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.192 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.192 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.192 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Dec 15 04:53:48 localhost 
ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.192 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.192 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.192 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.192 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.192 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.192 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.192 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.192 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.192 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.192 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.192 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.192 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.192 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.192 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.192 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.192 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.192 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.192 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.192 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.192 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.192 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.192 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.192 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.192 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.192 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.192 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.192 12 ERROR oslo_messaging.notify.messaging
Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.193 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.requests in the context of pollsters
Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.193 12 DEBUG ceilometer.compute.pollsters [-] 39ff1bd9-6f6b-44c8-bbec-a1fd9d196359/disk.device.write.requests volume: 47 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.194 12 DEBUG ceilometer.compute.pollsters [-] 39ff1bd9-6f6b-44c8-bbec-a1fd9d196359/disk.device.write.requests volume: 1 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.195 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '1be07df4-acef-4968-b87b-ea5e0494f4d1', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 47, 'user_id': '1ba5fce347b64bfebf995f187193f205', 'user_name': None, 'project_id': 'c785bf23f53946bc99867d8832a50266', 'project_name': None, 'resource_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359-vda', 'timestamp': '2025-12-15T09:53:48.193545', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359', 'instance_type': 'm1.small', 'host': 'bf4d507aa00b3fcf643a7e0b883420632ac7fc96e8c7a756982e121d', 'instance_host': 'np0005559462.localdomain', 'flavor': {'id': '2da0e147-aaa7-4bb9-a176-5fe1b15a32a0', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e'}, 'image_ref': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'f3cceede-d99b-11f0-817e-fa163ebaca0f', 'monotonic_time': 11494.356365232, 'message_signature': 'd09880e7ce1da1d1dcd5ecb19133323a3811f8e0c0ab91bcdf8050984c5d1880'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1, 'user_id': '1ba5fce347b64bfebf995f187193f205', 'user_name': None, 'project_id': 'c785bf23f53946bc99867d8832a50266', 'project_name': None, 'resource_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359-vdb', 'timestamp': '2025-12-15T09:53:48.193545', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359', 'instance_type': 'm1.small', 'host': 'bf4d507aa00b3fcf643a7e0b883420632ac7fc96e8c7a756982e121d', 'instance_host': 'np0005559462.localdomain', 'flavor': {'id': '2da0e147-aaa7-4bb9-a176-5fe1b15a32a0', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e'}, 'image_ref': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'f3cd0144-d99b-11f0-817e-fa163ebaca0f', 'monotonic_time': 11494.356365232, 'message_signature': 'a8ee9d0cc7d9188127dbee1b87a53585437c4d3854412deb9d3156c97844ead0'}]}, 'timestamp': '2025-12-15 09:53:48.194437', '_unique_id': 'ef4b25778ed247a49a3552e110a571c7'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.195 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.195 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.195 12 ERROR oslo_messaging.notify.messaging yield
Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.195 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.195 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.195 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.195 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.195 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.195 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.195 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.195 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.195 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.195 12 ERROR oslo_messaging.notify.messaging conn.connect()
Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.195 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.195 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.195 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.195 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.195 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.195 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.195 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.195 12 ERROR oslo_messaging.notify.messaging
Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.195 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.195 12 ERROR oslo_messaging.notify.messaging
Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.195 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.195 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.195 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.195 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.195 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.195 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.195 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.195 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.195 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.195 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.195 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.195 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.195 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.195 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.195 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.195 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.195 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.195 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.195 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.195 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.195 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.195 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.195 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.195 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.195 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.195 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.195 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.195 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.195 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.195 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.195 12 ERROR oslo_messaging.notify.messaging
Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.196 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no new resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.196 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes in the context of pollsters
Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.196 12 DEBUG ceilometer.compute.pollsters [-] 39ff1bd9-6f6b-44c8-bbec-a1fd9d196359/network.incoming.bytes volume: 6809 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.198 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '90bd8d28-1a86-424d-af77-8c29097c222a', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 6809, 'user_id': '1ba5fce347b64bfebf995f187193f205', 'user_name': None, 'project_id': 'c785bf23f53946bc99867d8832a50266', 'project_name': None, 'resource_id': 'instance-00000002-39ff1bd9-6f6b-44c8-bbec-a1fd9d196359-tap03ef8889-32', 'timestamp': '2025-12-15T09:53:48.196704', 'resource_metadata': {'display_name': 'test', 'name': 'tap03ef8889-32', 'instance_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359', 'instance_type': 'm1.small', 'host': 'bf4d507aa00b3fcf643a7e0b883420632ac7fc96e8c7a756982e121d', 'instance_host': 'np0005559462.localdomain', 'flavor': {'id': '2da0e147-aaa7-4bb9-a176-5fe1b15a32a0', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e'}, 'image_ref': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:74:df:7c', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap03ef8889-32'}, 'message_id': 'f3cd6a44-d99b-11f0-817e-fa163ebaca0f', 'monotonic_time': 11494.315194234, 'message_signature': 'c7c75e926a47335004185d450e183f460418290fe295c337c41dbb4b4c379e0e'}]}, 'timestamp': '2025-12-15 09:53:48.197185', '_unique_id': '7a9ea657d3414dfdb34108850715b071'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.198 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.198 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.198 12 ERROR oslo_messaging.notify.messaging yield
Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.198 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.198 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.198 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.198 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.198 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.198 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.198 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.198 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.198 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.198 12 ERROR oslo_messaging.notify.messaging conn.connect()
Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.198 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.198 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.198 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.198 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.198 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.198 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.198 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.198 12 ERROR oslo_messaging.notify.messaging
Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.198 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.198 12 ERROR oslo_messaging.notify.messaging
Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.198 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.198 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.198 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.198 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.198 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.198 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.198 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.198 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.198 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.198 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.198 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.198 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.198 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.198 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.198 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.198 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.198 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.198 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.198 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.198 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.198 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.198 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.198 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.198 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.198 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.198 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.198 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.198 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.198 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.198 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.198 12 ERROR oslo_messaging.notify.messaging
Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.199 12 INFO ceilometer.polling.manager [-] Polling pollster cpu in the context of pollsters
Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.199 12 DEBUG ceilometer.compute.pollsters [-] 39ff1bd9-6f6b-44c8-bbec-a1fd9d196359/cpu volume: 11610000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.200 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '2c2db8d7-c98a-4c8a-818d-503a45d0b9eb', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 11610000000, 'user_id': '1ba5fce347b64bfebf995f187193f205', 'user_name': None, 'project_id': 'c785bf23f53946bc99867d8832a50266', 'project_name': None, 'resource_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359', 'timestamp': '2025-12-15T09:53:48.199323', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359', 'instance_type': 'm1.small', 'host': 'bf4d507aa00b3fcf643a7e0b883420632ac7fc96e8c7a756982e121d', 'instance_host': 'np0005559462.localdomain', 'flavor': {'id': '2da0e147-aaa7-4bb9-a176-5fe1b15a32a0', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e'}, 'image_ref': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'cpu_number': 1}, 'message_id': 'f3cdd074-d99b-11f0-817e-fa163ebaca0f', 'monotonic_time': 11494.345737536, 'message_signature': '27e1077e5e6ae22837fdb39c48591d11082cf43ad0dd4d89836a9d4719704783'}]}, 'timestamp': '2025-12-15 09:53:48.199751', '_unique_id': '64d5b854c05f403fbcdf86cde606f908'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.200 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.200 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.200 12 ERROR oslo_messaging.notify.messaging yield
Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.200 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.200 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.200 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.200 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.200 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.200 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.200 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.200 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.200 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.200 12 ERROR oslo_messaging.notify.messaging conn.connect()
Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.200 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.200 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.200 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.200 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.200 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.200 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.200 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.200 12 ERROR oslo_messaging.notify.messaging
Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.200 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.200 12 ERROR oslo_messaging.notify.messaging
Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.200 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.200 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.200 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.200 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.200 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.200 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.200 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.200 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.200 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.200 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.200 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.200 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.200 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.200 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.200 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.200 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.200 12 ERROR
oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.200 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.200 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.200 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.200 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.200 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.200 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.200 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.200 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.200 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.200 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Dec 15 04:53:48 
localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.200 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.200 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.200 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.200 12 ERROR oslo_messaging.notify.messaging Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.201 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.latency in the context of pollsters Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.201 12 DEBUG ceilometer.compute.pollsters [-] 39ff1bd9-6f6b-44c8-bbec-a1fd9d196359/disk.device.write.latency volume: 1243487016 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.202 12 DEBUG ceilometer.compute.pollsters [-] 39ff1bd9-6f6b-44c8-bbec-a1fd9d196359/disk.device.write.latency volume: 24779175 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.203 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': '48091ed7-1339-4c4e-ae6f-940a3030fbed', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 1243487016, 'user_id': '1ba5fce347b64bfebf995f187193f205', 'user_name': None, 'project_id': 'c785bf23f53946bc99867d8832a50266', 'project_name': None, 'resource_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359-vda', 'timestamp': '2025-12-15T09:53:48.201811', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359', 'instance_type': 'm1.small', 'host': 'bf4d507aa00b3fcf643a7e0b883420632ac7fc96e8c7a756982e121d', 'instance_host': 'np0005559462.localdomain', 'flavor': {'id': '2da0e147-aaa7-4bb9-a176-5fe1b15a32a0', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e'}, 'image_ref': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'f3ce349c-d99b-11f0-817e-fa163ebaca0f', 'monotonic_time': 11494.356365232, 'message_signature': '6dde8aa2aaea6d0c5ea71d497f6eb7101c4ebee2498b2f2a5b11ec43e44b1495'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 24779175, 'user_id': '1ba5fce347b64bfebf995f187193f205', 'user_name': None, 'project_id': 'c785bf23f53946bc99867d8832a50266', 'project_name': None, 'resource_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359-vdb', 'timestamp': '2025-12-15T09:53:48.201811', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 
'39ff1bd9-6f6b-44c8-bbec-a1fd9d196359', 'instance_type': 'm1.small', 'host': 'bf4d507aa00b3fcf643a7e0b883420632ac7fc96e8c7a756982e121d', 'instance_host': 'np0005559462.localdomain', 'flavor': {'id': '2da0e147-aaa7-4bb9-a176-5fe1b15a32a0', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e'}, 'image_ref': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'f3ce446e-d99b-11f0-817e-fa163ebaca0f', 'monotonic_time': 11494.356365232, 'message_signature': 'c65f4ea42bfaef365fde18667ae1e77625a8ae02b5ef9bb6a4008f58d6c55d54'}]}, 'timestamp': '2025-12-15 09:53:48.202706', '_unique_id': '408206ec63d843828a993965165e8165'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.203 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.203 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.203 12 ERROR oslo_messaging.notify.messaging yield Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.203 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.203 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.203 12 ERROR 
oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.203 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.203 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.203 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.203 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.203 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.203 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.203 12 ERROR oslo_messaging.notify.messaging conn.connect() Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.203 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.203 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.203 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Dec 15 
04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.203 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.203 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.203 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.203 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.203 12 ERROR oslo_messaging.notify.messaging Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.203 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.203 12 ERROR oslo_messaging.notify.messaging Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.203 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.203 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.203 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.203 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Dec 15 04:53:48 localhost 
ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.203 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.203 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.203 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.203 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.203 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.203 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.203 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.203 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.203 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.203 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, 
in get Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.203 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.203 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.203 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.203 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.203 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.203 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.203 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.203 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.203 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.203 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 15 04:53:48 localhost 
ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.203 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.203 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.203 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.203 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.203 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.203 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.203 12 ERROR oslo_messaging.notify.messaging Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.204 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.error in the context of pollsters Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.204 12 DEBUG ceilometer.compute.pollsters [-] 39ff1bd9-6f6b-44c8-bbec-a1fd9d196359/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.206 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': 'af5d8e67-7595-4772-822f-a1d332bb573b', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '1ba5fce347b64bfebf995f187193f205', 'user_name': None, 'project_id': 'c785bf23f53946bc99867d8832a50266', 'project_name': None, 'resource_id': 'instance-00000002-39ff1bd9-6f6b-44c8-bbec-a1fd9d196359-tap03ef8889-32', 'timestamp': '2025-12-15T09:53:48.204792', 'resource_metadata': {'display_name': 'test', 'name': 'tap03ef8889-32', 'instance_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359', 'instance_type': 'm1.small', 'host': 'bf4d507aa00b3fcf643a7e0b883420632ac7fc96e8c7a756982e121d', 'instance_host': 'np0005559462.localdomain', 'flavor': {'id': '2da0e147-aaa7-4bb9-a176-5fe1b15a32a0', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e'}, 'image_ref': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:74:df:7c', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap03ef8889-32'}, 'message_id': 'f3cea878-d99b-11f0-817e-fa163ebaca0f', 'monotonic_time': 11494.315194234, 'message_signature': 'ad7e07f95e45024b67fa3b4eb027da2c934f81efeb4ecd7f2ddd1e426f3b3aa3'}]}, 'timestamp': '2025-12-15 09:53:48.205307', '_unique_id': 'f9edc88a2e824079878d8bd357063e65'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.206 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 15 04:53:48 localhost 
ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.206 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.206 12 ERROR oslo_messaging.notify.messaging yield Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.206 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.206 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.206 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.206 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.206 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.206 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.206 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.206 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.206 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.206 12 ERROR oslo_messaging.notify.messaging conn.connect() Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.206 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.206 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.206 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.206 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.206 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.206 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.206 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.206 12 ERROR oslo_messaging.notify.messaging Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.206 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.206 12 ERROR oslo_messaging.notify.messaging Dec 15 04:53:48 localhost 
ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.206 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.206 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.206 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.206 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.206 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.206 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.206 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.206 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.206 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.206 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection 
Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.206 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.206 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.206 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.206 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.206 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.206 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.206 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.206 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.206 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.206 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Dec 15 04:53:48 
localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.206 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.206 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.206 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.206 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.206 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.206 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.206 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.206 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.206 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.206 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.206 12 ERROR oslo_messaging.notify.messaging Dec 15 04:53:48 localhost 
ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.207 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.requests in the context of pollsters Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.207 12 DEBUG ceilometer.compute.pollsters [-] 39ff1bd9-6f6b-44c8-bbec-a1fd9d196359/disk.device.read.requests volume: 1272 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.207 12 DEBUG ceilometer.compute.pollsters [-] 39ff1bd9-6f6b-44c8-bbec-a1fd9d196359/disk.device.read.requests volume: 124 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.209 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '318dfcaf-e1fd-446c-acf6-c8a69abe089b', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1272, 'user_id': '1ba5fce347b64bfebf995f187193f205', 'user_name': None, 'project_id': 'c785bf23f53946bc99867d8832a50266', 'project_name': None, 'resource_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359-vda', 'timestamp': '2025-12-15T09:53:48.207496', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359', 'instance_type': 'm1.small', 'host': 'bf4d507aa00b3fcf643a7e0b883420632ac7fc96e8c7a756982e121d', 'instance_host': 'np0005559462.localdomain', 'flavor': {'id': '2da0e147-aaa7-4bb9-a176-5fe1b15a32a0', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 
'7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e'}, 'image_ref': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'f3cf0f8e-d99b-11f0-817e-fa163ebaca0f', 'monotonic_time': 11494.356365232, 'message_signature': '9046682a38e0336acccd33e07785cf2bc9c5e0f8f3d3de3c4aa7443128139f6e'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 124, 'user_id': '1ba5fce347b64bfebf995f187193f205', 'user_name': None, 'project_id': 'c785bf23f53946bc99867d8832a50266', 'project_name': None, 'resource_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359-vdb', 'timestamp': '2025-12-15T09:53:48.207496', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359', 'instance_type': 'm1.small', 'host': 'bf4d507aa00b3fcf643a7e0b883420632ac7fc96e8c7a756982e121d', 'instance_host': 'np0005559462.localdomain', 'flavor': {'id': '2da0e147-aaa7-4bb9-a176-5fe1b15a32a0', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e'}, 'image_ref': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'f3cf20a0-d99b-11f0-817e-fa163ebaca0f', 'monotonic_time': 11494.356365232, 'message_signature': 'b117bb18ab98f55ad8231fcda91694fcecdb32724a702def70c647ea5974ba7a'}]}, 'timestamp': '2025-12-15 09:53:48.208351', '_unique_id': 'cbb873be7821446180ace540865ed2b5'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.209 12 
ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.209 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.209 12 ERROR oslo_messaging.notify.messaging yield Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.209 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.209 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.209 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.209 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.209 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.209 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.209 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.209 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Dec 15 04:53:48 localhost 
ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.209 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.209 12 ERROR oslo_messaging.notify.messaging conn.connect() Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.209 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.209 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.209 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.209 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.209 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.209 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.209 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.209 12 ERROR oslo_messaging.notify.messaging Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.209 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 
2025-12-15 09:53:48.209 12 ERROR oslo_messaging.notify.messaging Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.209 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.209 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.209 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.209 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.209 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.209 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.209 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.209 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.209 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.209 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.209 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.209 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.209 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.209 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.209 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.209 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.209 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.209 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.209 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.209 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.209 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.209 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.209 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.209 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.209 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.209 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.209 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.209 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.209 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.209 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 15 04:53:48 localhost 
ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.209 12 ERROR oslo_messaging.notify.messaging Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.210 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.bytes in the context of pollsters Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.210 12 DEBUG ceilometer.compute.pollsters [-] 39ff1bd9-6f6b-44c8-bbec-a1fd9d196359/disk.device.read.bytes volume: 35560448 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.210 12 DEBUG ceilometer.compute.pollsters [-] 39ff1bd9-6f6b-44c8-bbec-a1fd9d196359/disk.device.read.bytes volume: 2154496 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.211 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': '1bd3dd56-8c8c-47a2-8735-cd347ef95144', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 35560448, 'user_id': '1ba5fce347b64bfebf995f187193f205', 'user_name': None, 'project_id': 'c785bf23f53946bc99867d8832a50266', 'project_name': None, 'resource_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359-vda', 'timestamp': '2025-12-15T09:53:48.210418', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359', 'instance_type': 'm1.small', 'host': 'bf4d507aa00b3fcf643a7e0b883420632ac7fc96e8c7a756982e121d', 'instance_host': 'np0005559462.localdomain', 'flavor': {'id': '2da0e147-aaa7-4bb9-a176-5fe1b15a32a0', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e'}, 'image_ref': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'f3cf7ee2-d99b-11f0-817e-fa163ebaca0f', 'monotonic_time': 11494.356365232, 'message_signature': '5edf621521f54f2d25bb873ca6a5915ec7ba066c96d4043a930ff6db009a362b'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 2154496, 'user_id': '1ba5fce347b64bfebf995f187193f205', 'user_name': None, 'project_id': 'c785bf23f53946bc99867d8832a50266', 'project_name': None, 'resource_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359-vdb', 'timestamp': '2025-12-15T09:53:48.210418', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 
'39ff1bd9-6f6b-44c8-bbec-a1fd9d196359', 'instance_type': 'm1.small', 'host': 'bf4d507aa00b3fcf643a7e0b883420632ac7fc96e8c7a756982e121d', 'instance_host': 'np0005559462.localdomain', 'flavor': {'id': '2da0e147-aaa7-4bb9-a176-5fe1b15a32a0', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e'}, 'image_ref': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'f3cf88a6-d99b-11f0-817e-fa163ebaca0f', 'monotonic_time': 11494.356365232, 'message_signature': 'b9f9d6010895d05631137c907bcea68830e313dd8138176b46cbbae187eeee50'}]}, 'timestamp': '2025-12-15 09:53:48.210935', '_unique_id': 'c0bd482ddee64c3a8a5df77d7fe8380d'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.211 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.211 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.211 12 ERROR oslo_messaging.notify.messaging yield Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.211 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.211 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.211 12 ERROR 
oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.211 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.211 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.211 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.211 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.211 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.211 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.211 12 ERROR oslo_messaging.notify.messaging conn.connect() Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.211 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.211 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.211 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Dec 15 
04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.211 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.211 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.211 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.211 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.211 12 ERROR oslo_messaging.notify.messaging Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.211 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.211 12 ERROR oslo_messaging.notify.messaging Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.211 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.211 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.211 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.211 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Dec 15 04:53:48 localhost 
ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.211 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.211 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.211 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.211 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.211 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.211 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.211 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.211 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.211 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.211 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, 
in get Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.211 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.211 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.211 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.211 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.211 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.211 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.211 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.211 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.211 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.211 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 15 04:53:48 localhost 
ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.211 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.211 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.211 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.211 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.211 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.211 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.211 12 ERROR oslo_messaging.notify.messaging Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.212 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.allocation in the context of pollsters Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.220 12 DEBUG ceilometer.compute.pollsters [-] 39ff1bd9-6f6b-44c8-bbec-a1fd9d196359/disk.device.allocation volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.220 12 DEBUG ceilometer.compute.pollsters [-] 39ff1bd9-6f6b-44c8-bbec-a1fd9d196359/disk.device.allocation volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 15 04:53:48 
localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.221 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'c0d75ef6-3635-45aa-8445-432af503e006', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '1ba5fce347b64bfebf995f187193f205', 'user_name': None, 'project_id': 'c785bf23f53946bc99867d8832a50266', 'project_name': None, 'resource_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359-vda', 'timestamp': '2025-12-15T09:53:48.212275', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359', 'instance_type': 'm1.small', 'host': 'bf4d507aa00b3fcf643a7e0b883420632ac7fc96e8c7a756982e121d', 'instance_host': 'np0005559462.localdomain', 'flavor': {'id': '2da0e147-aaa7-4bb9-a176-5fe1b15a32a0', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e'}, 'image_ref': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'f3d10e24-d99b-11f0-817e-fa163ebaca0f', 'monotonic_time': 11494.404924926, 'message_signature': '402a3714df409f69d96e7e2fd29e7ee46e0f5e14ba20dd27d0e4f11b6a3e889e'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '1ba5fce347b64bfebf995f187193f205', 'user_name': None, 'project_id': 'c785bf23f53946bc99867d8832a50266', 'project_name': None, 'resource_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359-vdb', 'timestamp': 
'2025-12-15T09:53:48.212275', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359', 'instance_type': 'm1.small', 'host': 'bf4d507aa00b3fcf643a7e0b883420632ac7fc96e8c7a756982e121d', 'instance_host': 'np0005559462.localdomain', 'flavor': {'id': '2da0e147-aaa7-4bb9-a176-5fe1b15a32a0', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e'}, 'image_ref': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'f3d119a0-d99b-11f0-817e-fa163ebaca0f', 'monotonic_time': 11494.404924926, 'message_signature': '30df4390a2f624fdcd3c6ad2ecea5e55ee02fd011012d3a386ed76c12cdd0dfd'}]}, 'timestamp': '2025-12-15 09:53:48.221201', '_unique_id': '42a8f8d1dec0456bb8a194d0282e805f'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.221 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.221 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.221 12 ERROR oslo_messaging.notify.messaging yield Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.221 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.221 12 ERROR oslo_messaging.notify.messaging return retry_over_time( 
Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.221 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.221 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.221 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.221 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.221 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.221 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.221 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.221 12 ERROR oslo_messaging.notify.messaging conn.connect() Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.221 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.221 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.221 12 ERROR 
oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.221 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.221 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.221 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.221 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.221 12 ERROR oslo_messaging.notify.messaging Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.221 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.221 12 ERROR oslo_messaging.notify.messaging Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.221 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.221 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.221 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.221 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.221 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.221 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.221 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.221 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.221 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.221 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.221 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.221 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.221 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.221 
12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.221 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.221 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.221 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.221 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.221 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.221 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.221 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.221 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.221 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.221 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.221 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.221 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.221 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.221 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.221 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.221 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.221 12 ERROR oslo_messaging.notify.messaging Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.222 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no new resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193 Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.222 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no new resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193 Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.222 12 INFO ceilometer.polling.manager [-] Polling pollster 
disk.device.usage in the context of pollsters Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.222 12 DEBUG ceilometer.compute.pollsters [-] 39ff1bd9-6f6b-44c8-bbec-a1fd9d196359/disk.device.usage volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.223 12 DEBUG ceilometer.compute.pollsters [-] 39ff1bd9-6f6b-44c8-bbec-a1fd9d196359/disk.device.usage volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.223 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '972f0736-4a00-4d06-86d3-0ccbeb4e8aca', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '1ba5fce347b64bfebf995f187193f205', 'user_name': None, 'project_id': 'c785bf23f53946bc99867d8832a50266', 'project_name': None, 'resource_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359-vda', 'timestamp': '2025-12-15T09:53:48.222746', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359', 'instance_type': 'm1.small', 'host': 'bf4d507aa00b3fcf643a7e0b883420632ac7fc96e8c7a756982e121d', 'instance_host': 'np0005559462.localdomain', 'flavor': {'id': '2da0e147-aaa7-4bb9-a176-5fe1b15a32a0', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e'}, 'image_ref': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e', 'image_ref_url': None, 'architecture': 'x86_64', 
'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'f3d16036-d99b-11f0-817e-fa163ebaca0f', 'monotonic_time': 11494.404924926, 'message_signature': '73a24d20e6edcbd11584adac8b7be18131042975e362c0251f6e4ca839144214'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '1ba5fce347b64bfebf995f187193f205', 'user_name': None, 'project_id': 'c785bf23f53946bc99867d8832a50266', 'project_name': None, 'resource_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359-vdb', 'timestamp': '2025-12-15T09:53:48.222746', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359', 'instance_type': 'm1.small', 'host': 'bf4d507aa00b3fcf643a7e0b883420632ac7fc96e8c7a756982e121d', 'instance_host': 'np0005559462.localdomain', 'flavor': {'id': '2da0e147-aaa7-4bb9-a176-5fe1b15a32a0', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e'}, 'image_ref': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'f3d16aae-d99b-11f0-817e-fa163ebaca0f', 'monotonic_time': 11494.404924926, 'message_signature': 'c5790ac953d225827bec8210b111bd8bcb73283f502550108795517a7a83d293'}]}, 'timestamp': '2025-12-15 09:53:48.223273', '_unique_id': 'b7694fdf3469447590bb928f569d81b5'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.223 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 
09:53:48.223 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.223 12 ERROR oslo_messaging.notify.messaging yield Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.223 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.223 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.223 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.223 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.223 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.223 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.223 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.223 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.223 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.223 12 ERROR oslo_messaging.notify.messaging conn.connect() Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.223 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.223 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.223 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.223 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.223 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.223 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.223 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.223 12 ERROR oslo_messaging.notify.messaging Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.223 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.223 12 ERROR oslo_messaging.notify.messaging Dec 15 04:53:48 localhost 
ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.223 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.223 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.223 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.223 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.223 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.223 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.223 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.223 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.223 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.223 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection 
Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.223 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.223 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.223 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.223 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.223 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.223 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.223 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.223 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.223 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.223 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Dec 15 04:53:48 
localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.223 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.223 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.223 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.223 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.223 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.223 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.223 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.223 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.223 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.223 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.223 12 ERROR oslo_messaging.notify.messaging Dec 15 04:53:48 localhost 
ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.224 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.delta in the context of pollsters Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.224 12 DEBUG ceilometer.compute.pollsters [-] 39ff1bd9-6f6b-44c8-bbec-a1fd9d196359/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.225 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '29ddc1c3-7a69-4338-9926-d2d977c1f940', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '1ba5fce347b64bfebf995f187193f205', 'user_name': None, 'project_id': 'c785bf23f53946bc99867d8832a50266', 'project_name': None, 'resource_id': 'instance-00000002-39ff1bd9-6f6b-44c8-bbec-a1fd9d196359-tap03ef8889-32', 'timestamp': '2025-12-15T09:53:48.224595', 'resource_metadata': {'display_name': 'test', 'name': 'tap03ef8889-32', 'instance_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359', 'instance_type': 'm1.small', 'host': 'bf4d507aa00b3fcf643a7e0b883420632ac7fc96e8c7a756982e121d', 'instance_host': 'np0005559462.localdomain', 'flavor': {'id': '2da0e147-aaa7-4bb9-a176-5fe1b15a32a0', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e'}, 'image_ref': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:74:df:7c', 'fref': None, 
'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap03ef8889-32'}, 'message_id': 'f3d1a906-d99b-11f0-817e-fa163ebaca0f', 'monotonic_time': 11494.315194234, 'message_signature': '47ed55cbbd016f5bbd2eecb3377694bbe12b9093cdf9fd5b79eec49532cd5166'}]}, 'timestamp': '2025-12-15 09:53:48.224885', '_unique_id': 'f4b7d68ab0544213a48282502b101f83'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.225 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.225 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.225 12 ERROR oslo_messaging.notify.messaging yield Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.225 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.225 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.225 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.225 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.225 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.225 12 ERROR 
oslo_messaging.notify.messaging self._connection = self._establish_connection() Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.225 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.225 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.225 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.225 12 ERROR oslo_messaging.notify.messaging conn.connect() Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.225 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.225 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.225 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.225 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.225 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.225 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 
2025-12-15 09:53:48.225 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.225 12 ERROR oslo_messaging.notify.messaging Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.225 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.225 12 ERROR oslo_messaging.notify.messaging Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.225 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.225 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.225 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.225 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.225 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.225 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.225 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Dec 15 04:53:48 localhost 
ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.225 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.225 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.225 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.225 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.225 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.225 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.225 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.225 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.225 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.225 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Dec 15 
04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.225 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.225 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.225 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.225 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.225 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.225 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.225 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.225 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.225 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.225 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.225 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.225 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.225 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.225 12 ERROR oslo_messaging.notify.messaging Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.226 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.latency in the context of pollsters Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.226 12 DEBUG ceilometer.compute.pollsters [-] 39ff1bd9-6f6b-44c8-bbec-a1fd9d196359/disk.device.read.latency volume: 1342134926 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.226 12 DEBUG ceilometer.compute.pollsters [-] 39ff1bd9-6f6b-44c8-bbec-a1fd9d196359/disk.device.read.latency volume: 123356132 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.227 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': 'bb5be00f-01a8-49f9-a1f0-c52ae24bf876', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 1342134926, 'user_id': '1ba5fce347b64bfebf995f187193f205', 'user_name': None, 'project_id': 'c785bf23f53946bc99867d8832a50266', 'project_name': None, 'resource_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359-vda', 'timestamp': '2025-12-15T09:53:48.226181', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359', 'instance_type': 'm1.small', 'host': 'bf4d507aa00b3fcf643a7e0b883420632ac7fc96e8c7a756982e121d', 'instance_host': 'np0005559462.localdomain', 'flavor': {'id': '2da0e147-aaa7-4bb9-a176-5fe1b15a32a0', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e'}, 'image_ref': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'f3d1e65a-d99b-11f0-817e-fa163ebaca0f', 'monotonic_time': 11494.356365232, 'message_signature': '3e89118549c2662e31c0686a6484a78c421d6d75cd13143ef660087108d77335'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 123356132, 'user_id': '1ba5fce347b64bfebf995f187193f205', 'user_name': None, 'project_id': 'c785bf23f53946bc99867d8832a50266', 'project_name': None, 'resource_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359-vdb', 'timestamp': '2025-12-15T09:53:48.226181', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 
'39ff1bd9-6f6b-44c8-bbec-a1fd9d196359', 'instance_type': 'm1.small', 'host': 'bf4d507aa00b3fcf643a7e0b883420632ac7fc96e8c7a756982e121d', 'instance_host': 'np0005559462.localdomain', 'flavor': {'id': '2da0e147-aaa7-4bb9-a176-5fe1b15a32a0', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e'}, 'image_ref': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'f3d1f0aa-d99b-11f0-817e-fa163ebaca0f', 'monotonic_time': 11494.356365232, 'message_signature': '796cc8674435e93aa12d0779d50d372b8d1bf9cbd4d085bf9906fbea3e680c74'}]}, 'timestamp': '2025-12-15 09:53:48.226702', '_unique_id': 'fe17290e86af4715bdc5a399bebc9700'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.227 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.227 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.227 12 ERROR oslo_messaging.notify.messaging yield Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.227 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.227 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.227 12 ERROR 
oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.227 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.227 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.227 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.227 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.227 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.227 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.227 12 ERROR oslo_messaging.notify.messaging conn.connect() Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.227 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.227 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.227 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Dec 15 
04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.227 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.227 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.227 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.227 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.227 12 ERROR oslo_messaging.notify.messaging Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.227 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.227 12 ERROR oslo_messaging.notify.messaging Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.227 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.227 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.227 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.227 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Dec 15 04:53:48 localhost 
ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.227 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.227 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.227 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.227 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.227 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.227 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.227 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.227 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.227 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.227 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, 
in get Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.227 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.227 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.227 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.227 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.227 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.227 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.227 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.227 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.227 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.227 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 15 04:53:48 localhost 
ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.227 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.227 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.227 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.227 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.227 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.227 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.227 12 ERROR oslo_messaging.notify.messaging Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.227 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets in the context of pollsters Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.228 12 DEBUG ceilometer.compute.pollsters [-] 39ff1bd9-6f6b-44c8-bbec-a1fd9d196359/network.outgoing.packets volume: 114 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.228 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': 'f8470d9d-e642-41b1-bc9d-531a60f8551f', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 114, 'user_id': '1ba5fce347b64bfebf995f187193f205', 'user_name': None, 'project_id': 'c785bf23f53946bc99867d8832a50266', 'project_name': None, 'resource_id': 'instance-00000002-39ff1bd9-6f6b-44c8-bbec-a1fd9d196359-tap03ef8889-32', 'timestamp': '2025-12-15T09:53:48.228019', 'resource_metadata': {'display_name': 'test', 'name': 'tap03ef8889-32', 'instance_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359', 'instance_type': 'm1.small', 'host': 'bf4d507aa00b3fcf643a7e0b883420632ac7fc96e8c7a756982e121d', 'instance_host': 'np0005559462.localdomain', 'flavor': {'id': '2da0e147-aaa7-4bb9-a176-5fe1b15a32a0', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e'}, 'image_ref': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:74:df:7c', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap03ef8889-32'}, 'message_id': 'f3d22e8a-d99b-11f0-817e-fa163ebaca0f', 'monotonic_time': 11494.315194234, 'message_signature': '116dcdcbe815083123fe5985ce0b7399352a91704b73ba34683016100c2bfd56'}]}, 'timestamp': '2025-12-15 09:53:48.228302', '_unique_id': 'bc35e94abdc8449fbdafb2e29c597d7f'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.228 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 15 04:53:48 localhost 
ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.228 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.228 12 ERROR oslo_messaging.notify.messaging yield Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.228 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.228 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.228 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.228 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.228 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.228 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.228 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.228 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.228 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.228 12 ERROR oslo_messaging.notify.messaging conn.connect() Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.228 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.228 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.228 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.228 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.228 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.228 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.228 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.228 12 ERROR oslo_messaging.notify.messaging Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.228 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.228 12 ERROR oslo_messaging.notify.messaging Dec 15 04:53:48 localhost 
ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.228 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.228 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.228 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.228 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.228 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.228 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.228 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.228 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.228 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.228 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection 
Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.228 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.228 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.228 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.228 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.228 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.228 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.228 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.228 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.228 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.228 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Dec 15 04:53:48 
localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.228 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.228 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.228 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.228 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.228 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.228 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.228 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.228 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.228 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.228 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.228 12 ERROR oslo_messaging.notify.messaging Dec 15 04:53:48 localhost 
ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.229 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.capacity in the context of pollsters Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.229 12 DEBUG ceilometer.compute.pollsters [-] 39ff1bd9-6f6b-44c8-bbec-a1fd9d196359/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.229 12 DEBUG ceilometer.compute.pollsters [-] 39ff1bd9-6f6b-44c8-bbec-a1fd9d196359/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.230 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '3fd46b79-1568-4b91-a84e-bdaaf3cbd61e', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '1ba5fce347b64bfebf995f187193f205', 'user_name': None, 'project_id': 'c785bf23f53946bc99867d8832a50266', 'project_name': None, 'resource_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359-vda', 'timestamp': '2025-12-15T09:53:48.229602', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359', 'instance_type': 'm1.small', 'host': 'bf4d507aa00b3fcf643a7e0b883420632ac7fc96e8c7a756982e121d', 'instance_host': 'np0005559462.localdomain', 'flavor': {'id': '2da0e147-aaa7-4bb9-a176-5fe1b15a32a0', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 
'7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e'}, 'image_ref': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'f3d26c9c-d99b-11f0-817e-fa163ebaca0f', 'monotonic_time': 11494.404924926, 'message_signature': '98141d097e068af51e526e24de9725cf0cc50ca3cb9a12940ffacd368231e12c'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '1ba5fce347b64bfebf995f187193f205', 'user_name': None, 'project_id': 'c785bf23f53946bc99867d8832a50266', 'project_name': None, 'resource_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359-vdb', 'timestamp': '2025-12-15T09:53:48.229602', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359', 'instance_type': 'm1.small', 'host': 'bf4d507aa00b3fcf643a7e0b883420632ac7fc96e8c7a756982e121d', 'instance_host': 'np0005559462.localdomain', 'flavor': {'id': '2da0e147-aaa7-4bb9-a176-5fe1b15a32a0', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e'}, 'image_ref': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'f3d27854-d99b-11f0-817e-fa163ebaca0f', 'monotonic_time': 11494.404924926, 'message_signature': '00d3e3d2bb6be23ccc9243dbdc5a9ec6124e692ff5a766672cb94c8446dc06b4'}]}, 'timestamp': '2025-12-15 09:53:48.230179', '_unique_id': '61fa77f0b910456185d2be567613410a'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.230 12 ERROR 
oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.230 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.230 12 ERROR oslo_messaging.notify.messaging     yield
Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.230 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.230 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.230 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.230 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.230 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.230 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.230 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.230 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.230 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.230 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.230 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.230 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.230 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.230 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.230 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.230 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.230 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.230 12 ERROR oslo_messaging.notify.messaging
Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.230 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.230 12 ERROR oslo_messaging.notify.messaging
Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.230 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.230 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.230 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.230 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.230 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.230 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.230 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.230 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.230 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.230 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.230 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.230 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.230 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.230 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.230 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.230 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.230 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.230 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.230 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.230 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.230 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.230 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.230 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.230 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.230 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.230 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.230 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.230 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.230 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.230 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.230 12 ERROR oslo_messaging.notify.messaging
Dec 15 04:53:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:53:48.231 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no new resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 15 04:53:48 localhost ceph-mon[298913]: Saving service mon spec with placement label:mon
Dec 15 04:53:48 localhost ceph-mon[298913]: Reconfiguring osd.4 (monmap changed)...
Dec 15 04:53:48 localhost ceph-mon[298913]: Reconfiguring daemon osd.4 on np0005559464.localdomain
Dec 15 04:53:48 localhost ceph-mon[298913]: from='mgr.26593 172.18.0.108:0/1692918800' entity='mgr.np0005559464.aomnqe'
Dec 15 04:53:48 localhost ceph-mon[298913]: from='mgr.26593 172.18.0.108:0/1692918800' entity='mgr.np0005559464.aomnqe'
Dec 15 04:53:48 localhost ceph-mon[298913]: from='mgr.26593 172.18.0.108:0/1692918800' entity='mgr.np0005559464.aomnqe' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch
Dec 15 04:53:48 localhost ceph-mon[298913]: from='mgr.26593 172.18.0.108:0/1692918800' entity='mgr.np0005559464.aomnqe'
Dec 15 04:53:49 localhost ceph-mgr[292421]: ms_deliver_dispatch: unhandled message 0x55975582af20 mon_map magic: 0 from mon.1 v2:172.18.0.108:3300/0
Dec 15 04:53:49 localhost ceph-mon[298913]: mon.np0005559462@3(peon) e11 my rank is now 2 (was 3)
Dec 15 04:53:49 localhost ceph-mon[298913]: log_channel(cluster) log [INF] : mon.np0005559462 calling monitor election
Dec 15 04:53:49 localhost ceph-mon[298913]: paxos.2).electionLogic(46) init, last seen epoch 46
Dec 15 04:53:49 localhost ceph-mon[298913]: mon.np0005559462@2(electing) e11 collect_metadata vda: no unique device id for vda: fallback method has no model nor serial
Dec 15 04:53:50 localhost nova_compute[286344]: 2025-12-15 09:53:50.770 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup
/usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 15 04:53:51 localhost ovn_metadata_agent[160585]: 2025-12-15 09:53:51.470 160590 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec 15 04:53:51 localhost ovn_metadata_agent[160585]: 2025-12-15 09:53:51.470 160590 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec 15 04:53:51 localhost ovn_metadata_agent[160585]: 2025-12-15 09:53:51.472 160590 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec 15 04:53:52 localhost systemd[1]: Started /usr/bin/podman healthcheck run 67ee72fcd99c896f79e8807764b1d0f04e7d45927d06e306c0986394e8ed42a0.
Dec 15 04:53:52 localhost systemd[1]: Started /usr/bin/podman healthcheck run 9f0f481c937606298203797a71924b7614a5d9e3f4a8afd837cfa5b9602aedeb.
Dec 15 04:53:52 localhost systemd[1]: Started /usr/bin/podman healthcheck run b3d8483ade539500e228c2af9c0c3e647febb5ec7903610a83aea32631007d4a.
Dec 15 04:53:52 localhost podman[303308]: 2025-12-15 09:53:52.75466098 +0000 UTC m=+0.081430671 container health_status 67ee72fcd99c896f79e8807764b1d0f04e7d45927d06e306c0986394e8ed42a0 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, maintainer=The Prometheus Authors , managed_by=edpm_ansible, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter) Dec 15 04:53:52 localhost podman[303308]: 2025-12-15 09:53:52.762800607 +0000 UTC m=+0.089570298 container exec_died 67ee72fcd99c896f79e8807764b1d0f04e7d45927d06e306c0986394e8ed42a0 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible, config_data={'command': 
['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter) Dec 15 04:53:52 localhost systemd[1]: 67ee72fcd99c896f79e8807764b1d0f04e7d45927d06e306c0986394e8ed42a0.service: Deactivated successfully. 
Dec 15 04:53:52 localhost podman[303310]: 2025-12-15 09:53:52.820855065 +0000 UTC m=+0.142989487 container health_status b3d8483ade539500e228c2af9c0c3e647febb5ec7903610a83aea32631007d4a (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.vendor=CentOS, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '07f3af15d79276418f54a8dde2fc251064527c3571430e14af77aaa2c01f1193-cb1e9478634a00c8fb5ad9cd7fcd38c7b2a1a85187dfaa618bc715c24c2b15fb'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=ceilometer_agent_compute, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, 
org.label-schema.schema-version=1.0) Dec 15 04:53:52 localhost systemd[1]: tmp-crun.4mmooG.mount: Deactivated successfully. Dec 15 04:53:52 localhost podman[303309]: 2025-12-15 09:53:52.869417379 +0000 UTC m=+0.193207587 container health_status 9f0f481c937606298203797a71924b7614a5d9e3f4a8afd837cfa5b9602aedeb (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, tcib_managed=true, container_name=multipathd, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, config_id=multipathd, org.label-schema.build-date=20251202) Dec 15 04:53:52 localhost 
podman[303309]: 2025-12-15 09:53:52.879428549 +0000 UTC m=+0.203218727 container exec_died 9f0f481c937606298203797a71924b7614a5d9e3f4a8afd837cfa5b9602aedeb (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, org.label-schema.name=CentOS Stream 9 Base Image, config_id=multipathd, container_name=multipathd, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, managed_by=edpm_ansible, tcib_managed=true, io.buildah.version=1.41.3, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team) Dec 15 04:53:52 localhost systemd[1]: 9f0f481c937606298203797a71924b7614a5d9e3f4a8afd837cfa5b9602aedeb.service: Deactivated successfully. 
Dec 15 04:53:52 localhost podman[303310]: 2025-12-15 09:53:52.936255783 +0000 UTC m=+0.258390255 container exec_died b3d8483ade539500e228c2af9c0c3e647febb5ec7903610a83aea32631007d4a (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, container_name=ceilometer_agent_compute, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, org.label-schema.build-date=20251202, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '07f3af15d79276418f54a8dde2fc251064527c3571430e14af77aaa2c01f1193-cb1e9478634a00c8fb5ad9cd7fcd38c7b2a1a85187dfaa618bc715c24c2b15fb'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=ceilometer_agent_compute) Dec 
15 04:53:52 localhost systemd[1]: b3d8483ade539500e228c2af9c0c3e647febb5ec7903610a83aea32631007d4a.service: Deactivated successfully.
Dec 15 04:53:54 localhost ceph-mon[298913]: paxos.2).electionLogic(47) init, last seen epoch 47, mid-election, bumping
Dec 15 04:53:54 localhost ceph-mon[298913]: mon.np0005559462@2(electing) e11 collect_metadata vda: no unique device id for vda: fallback method has no model nor serial
Dec 15 04:53:54 localhost ceph-mon[298913]: mon.np0005559462@2(electing) e11 collect_metadata vda: no unique device id for vda: fallback method has no model nor serial
Dec 15 04:53:54 localhost ceph-mon[298913]: mon.np0005559462@2(electing) e11 collect_metadata vda: no unique device id for vda: fallback method has no model nor serial
Dec 15 04:53:54 localhost ceph-mon[298913]: mon.np0005559462@2(peon) e11 collect_metadata vda: no unique device id for vda: fallback method has no model nor serial
Dec 15 04:53:54 localhost systemd[1]: Started /usr/bin/podman healthcheck run 730c50020a7d9e2da21d2462d40af20ac303e43f3491f692cbbd87f432ba3e09.
Dec 15 04:53:54 localhost systemd[1]: Started /usr/bin/podman healthcheck run ac2eae30a2a7f971398af42ea67b3ed799d4b3341f7688ebcd73f7ca7f66c2a8.
Dec 15 04:53:55 localhost systemd[1]: tmp-crun.1iSzWM.mount: Deactivated successfully.
Dec 15 04:53:55 localhost podman[303388]: 2025-12-15 09:53:55.030775216 +0000 UTC m=+0.089214568 container health_status 730c50020a7d9e2da21d2462d40af20ac303e43f3491f692cbbd87f432ba3e09 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, io.buildah.version=1.33.7, vcs-type=git, version=9.6, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products.
This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, build-date=2025-08-20T13:12:41, io.openshift.expose-services=, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, maintainer=Red Hat, Inc., managed_by=edpm_ansible, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., config_id=openstack_network_exporter, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'cb1e9478634a00c8fb5ad9cd7fcd38c7b2a1a85187dfaa618bc715c24c2b15fb'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, name=ubi9-minimal, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vendor=Red Hat, Inc., release=1755695350, architecture=x86_64, distribution-scope=public, com.redhat.component=ubi9-minimal-container, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=openstack_network_exporter) Dec 15 04:53:55 localhost podman[303388]: 2025-12-15 09:53:55.070242177 +0000 UTC m=+0.128681529 container exec_died 730c50020a7d9e2da21d2462d40af20ac303e43f3491f692cbbd87f432ba3e09 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, managed_by=edpm_ansible, build-date=2025-08-20T13:12:41, name=ubi9-minimal, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'cb1e9478634a00c8fb5ad9cd7fcd38c7b2a1a85187dfaa618bc715c24c2b15fb'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., distribution-scope=public, io.openshift.tags=minimal rhel9, com.redhat.component=ubi9-minimal-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, config_id=openstack_network_exporter, io.buildah.version=1.33.7, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., version=9.6, url=https://catalog.redhat.com/en/search?searchType=containers, maintainer=Red Hat, Inc., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vendor=Red Hat, Inc., container_name=openstack_network_exporter, io.openshift.expose-services=, release=1755695350, vcs-type=git, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b) Dec 15 04:53:55 localhost podman[303389]: 2025-12-15 09:53:55.083775154 +0000 UTC m=+0.140032485 container health_status ac2eae30a2a7f971398af42ea67b3ed799d4b3341f7688ebcd73f7ca7f66c2a8 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '07f3af15d79276418f54a8dde2fc251064527c3571430e14af77aaa2c01f1193'}, 'healthcheck': 
{'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller) Dec 15 04:53:55 localhost ceph-mon[298913]: mon.np0005559462@2(peon).osd e87 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Dec 15 04:53:55 localhost systemd[1]: 730c50020a7d9e2da21d2462d40af20ac303e43f3491f692cbbd87f432ba3e09.service: Deactivated successfully. Dec 15 04:53:55 localhost podman[303389]: 2025-12-15 09:53:55.124279483 +0000 UTC m=+0.180536794 container exec_died ac2eae30a2a7f971398af42ea67b3ed799d4b3341f7688ebcd73f7ca7f66c2a8 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, container_name=ovn_controller, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251202, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '07f3af15d79276418f54a8dde2fc251064527c3571430e14af77aaa2c01f1193'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 
'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.license=GPLv2) Dec 15 04:53:55 localhost systemd[1]: ac2eae30a2a7f971398af42ea67b3ed799d4b3341f7688ebcd73f7ca7f66c2a8.service: Deactivated successfully. Dec 15 04:53:55 localhost ceph-mon[298913]: mon.np0005559464 calling monitor election Dec 15 04:53:55 localhost ceph-mon[298913]: mon.np0005559462 calling monitor election Dec 15 04:53:55 localhost ceph-mon[298913]: mon.np0005559464 calling monitor election Dec 15 04:53:55 localhost ceph-mon[298913]: Health check failed: 1/3 mons down, quorum np0005559461,np0005559464 (MON_DOWN) Dec 15 04:53:55 localhost ceph-mon[298913]: overall HEALTH_OK Dec 15 04:53:55 localhost ceph-mon[298913]: mon.np0005559461 calling monitor election Dec 15 04:53:55 localhost ceph-mon[298913]: mon.np0005559461 is new leader, mons np0005559461,np0005559464,np0005559462 in quorum (ranks 0,1,2) Dec 15 04:53:55 localhost ceph-mon[298913]: Health check cleared: MON_DOWN (was: 1/3 mons down, quorum np0005559461,np0005559464) Dec 15 04:53:55 localhost ceph-mon[298913]: Cluster is now healthy Dec 15 04:53:55 localhost ceph-mon[298913]: overall HEALTH_OK Dec 15 04:53:55 localhost ceph-mon[298913]: from='mgr.26593 172.18.0.108:0/1692918800' entity='mgr.np0005559464.aomnqe' Dec 15 04:53:55 localhost ceph-mon[298913]: from='mgr.26593 172.18.0.108:0/1692918800' entity='mgr.np0005559464.aomnqe' Dec 15 04:53:55 localhost ceph-mon[298913]: from='mgr.26593 172.18.0.108:0/1692918800' entity='mgr.np0005559464.aomnqe' Dec 15 
04:53:55 localhost ceph-mon[298913]: from='mgr.26593 172.18.0.108:0/1692918800' entity='mgr.np0005559464.aomnqe' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch Dec 15 04:53:55 localhost ceph-mon[298913]: Updating np0005559461.localdomain:/etc/ceph/ceph.conf Dec 15 04:53:55 localhost ceph-mon[298913]: Updating np0005559462.localdomain:/etc/ceph/ceph.conf Dec 15 04:53:55 localhost ceph-mon[298913]: Updating np0005559463.localdomain:/etc/ceph/ceph.conf Dec 15 04:53:55 localhost ceph-mon[298913]: Updating np0005559464.localdomain:/etc/ceph/ceph.conf Dec 15 04:53:55 localhost ceph-mon[298913]: Updating np0005559461.localdomain:/var/lib/ceph/bce17446-41b5-5408-a23e-0b011906b44a/config/ceph.conf Dec 15 04:53:55 localhost ceph-mon[298913]: Updating np0005559464.localdomain:/var/lib/ceph/bce17446-41b5-5408-a23e-0b011906b44a/config/ceph.conf Dec 15 04:53:55 localhost ceph-mon[298913]: Updating np0005559463.localdomain:/var/lib/ceph/bce17446-41b5-5408-a23e-0b011906b44a/config/ceph.conf Dec 15 04:53:55 localhost ceph-mon[298913]: Updating np0005559462.localdomain:/var/lib/ceph/bce17446-41b5-5408-a23e-0b011906b44a/config/ceph.conf Dec 15 04:53:55 localhost nova_compute[286344]: 2025-12-15 09:53:55.772 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 04:53:56 localhost ceph-mon[298913]: from='mgr.26593 172.18.0.108:0/1692918800' entity='mgr.np0005559464.aomnqe' Dec 15 04:53:56 localhost ceph-mon[298913]: from='mgr.26593 172.18.0.108:0/1692918800' entity='mgr.np0005559464.aomnqe' Dec 15 04:53:56 localhost ceph-mon[298913]: from='mgr.26593 172.18.0.108:0/1692918800' entity='mgr.np0005559464.aomnqe' Dec 15 04:53:56 localhost ceph-mon[298913]: from='mgr.26593 172.18.0.108:0/1692918800' entity='mgr.np0005559464.aomnqe' Dec 15 04:53:56 localhost ceph-mon[298913]: from='mgr.26593 172.18.0.108:0/1692918800' entity='mgr.np0005559464.aomnqe' Dec 15 04:53:56 localhost 
ceph-mon[298913]: from='mgr.26593 172.18.0.108:0/1692918800' entity='mgr.np0005559464.aomnqe' Dec 15 04:53:56 localhost ceph-mon[298913]: from='mgr.26593 172.18.0.108:0/1692918800' entity='mgr.np0005559464.aomnqe' Dec 15 04:53:56 localhost ceph-mon[298913]: from='mgr.26593 172.18.0.108:0/1692918800' entity='mgr.np0005559464.aomnqe' Dec 15 04:53:56 localhost ceph-mon[298913]: from='mgr.26593 172.18.0.108:0/1692918800' entity='mgr.np0005559464.aomnqe' Dec 15 04:53:56 localhost ceph-mon[298913]: Reconfiguring mgr.np0005559461.egwgzn (monmap changed)... Dec 15 04:53:56 localhost ceph-mon[298913]: from='mgr.26593 172.18.0.108:0/1692918800' entity='mgr.np0005559464.aomnqe' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005559461.egwgzn", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch Dec 15 04:53:56 localhost ceph-mon[298913]: Reconfiguring daemon mgr.np0005559461.egwgzn on np0005559461.localdomain Dec 15 04:53:58 localhost ceph-mon[298913]: from='mgr.26593 172.18.0.108:0/1692918800' entity='mgr.np0005559464.aomnqe' Dec 15 04:53:58 localhost ceph-mon[298913]: from='mgr.26593 172.18.0.108:0/1692918800' entity='mgr.np0005559464.aomnqe' Dec 15 04:53:58 localhost ceph-mon[298913]: Reconfiguring crash.np0005559461 (monmap changed)... 
Dec 15 04:53:58 localhost ceph-mon[298913]: from='mgr.26593 172.18.0.108:0/1692918800' entity='mgr.np0005559464.aomnqe' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005559461.localdomain", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch Dec 15 04:53:58 localhost ceph-mon[298913]: Reconfiguring daemon crash.np0005559461 on np0005559461.localdomain Dec 15 04:53:58 localhost ceph-mon[298913]: from='mgr.26593 172.18.0.108:0/1692918800' entity='mgr.np0005559464.aomnqe' Dec 15 04:53:58 localhost ceph-mon[298913]: from='mgr.26593 172.18.0.108:0/1692918800' entity='mgr.np0005559464.aomnqe' Dec 15 04:53:58 localhost ceph-mon[298913]: from='mgr.26593 172.18.0.108:0/1692918800' entity='mgr.np0005559464.aomnqe' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005559462.localdomain", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch Dec 15 04:53:58 localhost podman[303806]: Dec 15 04:53:58 localhost podman[303806]: 2025-12-15 09:53:58.887849448 +0000 UTC m=+0.075719502 container create 3d05eb89bd58cca3d50c354e9b2ceac5baf2ae00e083be2ea13285672e070e70 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=unruffled_davinci, release=1763362218, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, vcs-type=git, io.openshift.expose-services=, RELEASE=main, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, com.redhat.component=rhceph-container, GIT_REPO=https://github.com/ceph/ceph-container.git, url=https://catalog.redhat.com/en/search?searchType=containers, name=rhceph, vendor=Red Hat, Inc., GIT_BRANCH=main, build-date=2025-11-26T19:44:28Z, distribution-scope=public, ceph=True, GIT_CLEAN=True, io.openshift.tags=rhceph ceph, io.buildah.version=1.41.4, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, version=7, 
description=Red Hat Ceph Storage 7, io.k8s.description=Red Hat Ceph Storage 7, architecture=x86_64, CEPH_POINT_RELEASE=, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=Guillaume Abrioux ) Dec 15 04:53:58 localhost systemd[1]: Started libpod-conmon-3d05eb89bd58cca3d50c354e9b2ceac5baf2ae00e083be2ea13285672e070e70.scope. Dec 15 04:53:58 localhost systemd[1]: Started libcrun container. Dec 15 04:53:58 localhost podman[303806]: 2025-12-15 09:53:58.955084792 +0000 UTC m=+0.142954846 container init 3d05eb89bd58cca3d50c354e9b2ceac5baf2ae00e083be2ea13285672e070e70 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=unruffled_davinci, io.openshift.expose-services=, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, architecture=x86_64, CEPH_POINT_RELEASE=, GIT_REPO=https://github.com/ceph/ceph-container.git, vcs-type=git, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, release=1763362218, build-date=2025-11-26T19:44:28Z, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, name=rhceph, vendor=Red Hat, Inc., io.openshift.tags=rhceph ceph, RELEASE=main, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, description=Red Hat Ceph Storage 7, ceph=True, version=7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat Ceph Storage 7, url=https://catalog.redhat.com/en/search?searchType=containers, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, distribution-scope=public, com.redhat.component=rhceph-container, GIT_BRANCH=main, maintainer=Guillaume Abrioux , GIT_CLEAN=True, io.buildah.version=1.41.4) Dec 15 04:53:58 localhost podman[303806]: 2025-12-15 09:53:58.856838053 +0000 UTC m=+0.044708117 image pull registry.redhat.io/rhceph/rhceph-7-rhel9:latest 
Dec 15 04:53:58 localhost podman[303806]: 2025-12-15 09:53:58.971441938 +0000 UTC m=+0.159311992 container start 3d05eb89bd58cca3d50c354e9b2ceac5baf2ae00e083be2ea13285672e070e70 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=unruffled_davinci, architecture=x86_64, com.redhat.component=rhceph-container, CEPH_POINT_RELEASE=, url=https://catalog.redhat.com/en/search?searchType=containers, ceph=True, distribution-scope=public, io.k8s.description=Red Hat Ceph Storage 7, io.buildah.version=1.41.4, name=rhceph, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, version=7, GIT_REPO=https://github.com/ceph/ceph-container.git, description=Red Hat Ceph Storage 7, GIT_BRANCH=main, vendor=Red Hat, Inc., release=1763362218, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, GIT_CLEAN=True, io.openshift.expose-services=, RELEASE=main, vcs-type=git, io.openshift.tags=rhceph ceph, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, maintainer=Guillaume Abrioux , summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., build-date=2025-11-26T19:44:28Z, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0) Dec 15 04:53:58 localhost podman[303806]: 2025-12-15 09:53:58.973054203 +0000 UTC m=+0.160924307 container attach 3d05eb89bd58cca3d50c354e9b2ceac5baf2ae00e083be2ea13285672e070e70 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=unruffled_davinci, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, ceph=True, GIT_REPO=https://github.com/ceph/ceph-container.git, io.openshift.expose-services=, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://catalog.redhat.com/en/search?searchType=containers, GIT_BRANCH=main, 
io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, distribution-scope=public, description=Red Hat Ceph Storage 7, name=rhceph, build-date=2025-11-26T19:44:28Z, GIT_CLEAN=True, maintainer=Guillaume Abrioux , org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, CEPH_POINT_RELEASE=, io.k8s.description=Red Hat Ceph Storage 7, architecture=x86_64, com.redhat.component=rhceph-container, RELEASE=main, version=7, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, io.openshift.tags=rhceph ceph, vcs-type=git, release=1763362218, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.buildah.version=1.41.4) Dec 15 04:53:58 localhost systemd[1]: libpod-3d05eb89bd58cca3d50c354e9b2ceac5baf2ae00e083be2ea13285672e070e70.scope: Deactivated successfully. Dec 15 04:53:58 localhost unruffled_davinci[303822]: 167 167 Dec 15 04:53:58 localhost podman[303806]: 2025-12-15 09:53:58.975787429 +0000 UTC m=+0.163657513 container died 3d05eb89bd58cca3d50c354e9b2ceac5baf2ae00e083be2ea13285672e070e70 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=unruffled_davinci, description=Red Hat Ceph Storage 7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vendor=Red Hat, Inc., maintainer=Guillaume Abrioux , com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://catalog.redhat.com/en/search?searchType=containers, build-date=2025-11-26T19:44:28Z, io.buildah.version=1.41.4, com.redhat.component=rhceph-container, release=1763362218, version=7, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, GIT_CLEAN=True, distribution-scope=public, io.openshift.tags=rhceph ceph, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, architecture=x86_64, io.k8s.description=Red Hat Ceph Storage 7, RELEASE=main, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, CEPH_POINT_RELEASE=, 
cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.openshift.expose-services=, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_BRANCH=main, GIT_REPO=https://github.com/ceph/ceph-container.git, vcs-type=git, name=rhceph, ceph=True) Dec 15 04:53:59 localhost podman[303827]: 2025-12-15 09:53:59.071636211 +0000 UTC m=+0.086825292 container remove 3d05eb89bd58cca3d50c354e9b2ceac5baf2ae00e083be2ea13285672e070e70 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=unruffled_davinci, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, architecture=x86_64, maintainer=Guillaume Abrioux , ceph=True, build-date=2025-11-26T19:44:28Z, CEPH_POINT_RELEASE=, com.redhat.component=rhceph-container, RELEASE=main, description=Red Hat Ceph Storage 7, vcs-type=git, GIT_CLEAN=True, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, url=https://catalog.redhat.com/en/search?searchType=containers, distribution-scope=public, name=rhceph, version=7, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, GIT_REPO=https://github.com/ceph/ceph-container.git, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, io.openshift.tags=rhceph ceph, io.k8s.description=Red Hat Ceph Storage 7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, GIT_BRANCH=main, release=1763362218, vendor=Red Hat, Inc., io.openshift.expose-services=, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, io.buildah.version=1.41.4) Dec 15 04:53:59 localhost systemd[1]: libpod-conmon-3d05eb89bd58cca3d50c354e9b2ceac5baf2ae00e083be2ea13285672e070e70.scope: Deactivated successfully. Dec 15 04:53:59 localhost ceph-mon[298913]: Reconfiguring crash.np0005559462 (monmap changed)... 
Dec 15 04:53:59 localhost ceph-mon[298913]: Reconfiguring daemon crash.np0005559462 on np0005559462.localdomain Dec 15 04:53:59 localhost ceph-mon[298913]: from='mgr.26593 172.18.0.108:0/1692918800' entity='mgr.np0005559464.aomnqe' Dec 15 04:53:59 localhost ceph-mon[298913]: from='mgr.26593 172.18.0.108:0/1692918800' entity='mgr.np0005559464.aomnqe' Dec 15 04:53:59 localhost ceph-mon[298913]: from='mgr.26593 172.18.0.108:0/1692918800' entity='mgr.np0005559464.aomnqe' cmd={"prefix": "auth get", "entity": "osd.0"} : dispatch Dec 15 04:53:59 localhost podman[303898]: Dec 15 04:53:59 localhost podman[303898]: 2025-12-15 09:53:59.712753525 +0000 UTC m=+0.042720552 image pull registry.redhat.io/rhceph/rhceph-7-rhel9:latest Dec 15 04:53:59 localhost systemd[1]: tmp-crun.0zb1hv.mount: Deactivated successfully. Dec 15 04:53:59 localhost systemd[1]: var-lib-containers-storage-overlay-bf9afd3fd56ae09b0a3012237e82bb44e7d024322fd7ee276c66314012f8c2ac-merged.mount: Deactivated successfully. Dec 15 04:53:59 localhost podman[303898]: 2025-12-15 09:53:59.943474838 +0000 UTC m=+0.273441995 container create 12e98ba8cf53343a7ac22c9ab7fada5bd1496ef3bb3e21296865f8b84c7be95d (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=compassionate_morse, architecture=x86_64, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vcs-type=git, url=https://catalog.redhat.com/en/search?searchType=containers, maintainer=Guillaume Abrioux , cpe=cpe:/a:redhat:enterprise_linux:9::appstream, build-date=2025-11-26T19:44:28Z, distribution-scope=public, io.openshift.tags=rhceph ceph, name=rhceph, io.k8s.description=Red Hat Ceph Storage 7, release=1763362218, vendor=Red Hat, Inc., io.buildah.version=1.41.4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., ceph=True, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, 
org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, version=7, GIT_CLEAN=True, RELEASE=main, CEPH_POINT_RELEASE=, GIT_REPO=https://github.com/ceph/ceph-container.git, GIT_BRANCH=main, description=Red Hat Ceph Storage 7, io.openshift.expose-services=, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, com.redhat.component=rhceph-container) Dec 15 04:53:59 localhost systemd[1]: Started libpod-conmon-12e98ba8cf53343a7ac22c9ab7fada5bd1496ef3bb3e21296865f8b84c7be95d.scope. Dec 15 04:54:00 localhost systemd[1]: Started libcrun container. Dec 15 04:54:00 localhost podman[303898]: 2025-12-15 09:54:00.014002254 +0000 UTC m=+0.343969221 container init 12e98ba8cf53343a7ac22c9ab7fada5bd1496ef3bb3e21296865f8b84c7be95d (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=compassionate_morse, io.openshift.tags=rhceph ceph, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, name=rhceph, vcs-type=git, RELEASE=main, url=https://catalog.redhat.com/en/search?searchType=containers, maintainer=Guillaume Abrioux , io.k8s.description=Red Hat Ceph Storage 7, distribution-scope=public, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, com.redhat.component=rhceph-container, CEPH_POINT_RELEASE=, GIT_REPO=https://github.com/ceph/ceph-container.git, architecture=x86_64, io.openshift.expose-services=, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., cpe=cpe:/a:redhat:enterprise_linux:9::appstream, vendor=Red Hat, Inc., GIT_BRANCH=main, io.buildah.version=1.41.4, build-date=2025-11-26T19:44:28Z, version=7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat Ceph Storage 7, release=1763362218, GIT_CLEAN=True, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, ceph=True) Dec 15 04:54:00 localhost podman[303898]: 2025-12-15 09:54:00.029037042 +0000 UTC 
m=+0.359004009 container start 12e98ba8cf53343a7ac22c9ab7fada5bd1496ef3bb3e21296865f8b84c7be95d (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=compassionate_morse, GIT_BRANCH=main, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, build-date=2025-11-26T19:44:28Z, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, maintainer=Guillaume Abrioux , description=Red Hat Ceph Storage 7, CEPH_POINT_RELEASE=, url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.tags=rhceph ceph, io.openshift.expose-services=, vendor=Red Hat, Inc., io.buildah.version=1.41.4, GIT_CLEAN=True, release=1763362218, architecture=x86_64, io.k8s.description=Red Hat Ceph Storage 7, RELEASE=main, ceph=True, distribution-scope=public, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, name=rhceph, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=rhceph-container, GIT_REPO=https://github.com/ceph/ceph-container.git, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vcs-type=git, version=7) Dec 15 04:54:00 localhost podman[303898]: 2025-12-15 09:54:00.029262529 +0000 UTC m=+0.359229506 container attach 12e98ba8cf53343a7ac22c9ab7fada5bd1496ef3bb3e21296865f8b84c7be95d (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=compassionate_morse, io.openshift.tags=rhceph ceph, com.redhat.component=rhceph-container, name=rhceph, maintainer=Guillaume Abrioux , CEPH_POINT_RELEASE=, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, url=https://catalog.redhat.com/en/search?searchType=containers, ceph=True, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, distribution-scope=public, vcs-type=git, GIT_BRANCH=main, architecture=x86_64, 
io.buildah.version=1.41.4, build-date=2025-11-26T19:44:28Z, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.k8s.description=Red Hat Ceph Storage 7, io.openshift.expose-services=, release=1763362218, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, GIT_CLEAN=True, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., description=Red Hat Ceph Storage 7, version=7, RELEASE=main, GIT_REPO=https://github.com/ceph/ceph-container.git) Dec 15 04:54:00 localhost compassionate_morse[303913]: 167 167 Dec 15 04:54:00 localhost systemd[1]: libpod-12e98ba8cf53343a7ac22c9ab7fada5bd1496ef3bb3e21296865f8b84c7be95d.scope: Deactivated successfully. Dec 15 04:54:00 localhost podman[303898]: 2025-12-15 09:54:00.032348515 +0000 UTC m=+0.362315502 container died 12e98ba8cf53343a7ac22c9ab7fada5bd1496ef3bb3e21296865f8b84c7be95d (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=compassionate_morse, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, name=rhceph, maintainer=Guillaume Abrioux , distribution-scope=public, io.k8s.description=Red Hat Ceph Storage 7, vcs-type=git, com.redhat.component=rhceph-container, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_REPO=https://github.com/ceph/ceph-container.git, build-date=2025-11-26T19:44:28Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, ceph=True, architecture=x86_64, vendor=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, release=1763362218, io.openshift.tags=rhceph ceph, io.openshift.expose-services=, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, description=Red Hat Ceph Storage 7, io.buildah.version=1.41.4, 
GIT_CLEAN=True, CEPH_POINT_RELEASE=, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, GIT_BRANCH=main, version=7, RELEASE=main) Dec 15 04:54:00 localhost ceph-mon[298913]: mon.np0005559462@2(peon).osd e87 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Dec 15 04:54:00 localhost podman[303918]: 2025-12-15 09:54:00.134197495 +0000 UTC m=+0.087716968 container remove 12e98ba8cf53343a7ac22c9ab7fada5bd1496ef3bb3e21296865f8b84c7be95d (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=compassionate_morse, name=rhceph, release=1763362218, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_CLEAN=True, io.k8s.description=Red Hat Ceph Storage 7, io.openshift.tags=rhceph ceph, com.redhat.component=rhceph-container, build-date=2025-11-26T19:44:28Z, distribution-scope=public, ceph=True, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, CEPH_POINT_RELEASE=, GIT_BRANCH=main, GIT_REPO=https://github.com/ceph/ceph-container.git, architecture=x86_64, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, maintainer=Guillaume Abrioux , vendor=Red Hat, Inc., RELEASE=main, version=7, io.openshift.expose-services=, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, url=https://catalog.redhat.com/en/search?searchType=containers, description=Red Hat Ceph Storage 7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0) Dec 15 04:54:00 localhost systemd[1]: libpod-conmon-12e98ba8cf53343a7ac22c9ab7fada5bd1496ef3bb3e21296865f8b84c7be95d.scope: Deactivated successfully. Dec 15 04:54:00 localhost ceph-mon[298913]: Reconfiguring osd.0 (monmap changed)... 
Dec 15 04:54:00 localhost ceph-mon[298913]: Reconfiguring daemon osd.0 on np0005559462.localdomain Dec 15 04:54:00 localhost ceph-mon[298913]: from='mgr.26593 172.18.0.108:0/1692918800' entity='mgr.np0005559464.aomnqe' Dec 15 04:54:00 localhost ceph-mon[298913]: from='mgr.26593 172.18.0.108:0/1692918800' entity='mgr.np0005559464.aomnqe' Dec 15 04:54:00 localhost ceph-mon[298913]: from='mgr.26593 172.18.0.108:0/1692918800' entity='mgr.np0005559464.aomnqe' Dec 15 04:54:00 localhost ceph-mon[298913]: from='mgr.26593 172.18.0.108:0/1692918800' entity='mgr.np0005559464.aomnqe' cmd={"prefix": "auth get", "entity": "osd.3"} : dispatch Dec 15 04:54:00 localhost nova_compute[286344]: 2025-12-15 09:54:00.774 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 04:54:00 localhost podman[303994]: Dec 15 04:54:00 localhost podman[303994]: 2025-12-15 09:54:00.893431521 +0000 UTC m=+0.072215574 container create 7060a6e519fbd38f496a3d13414eab96f5d847d41ceeeb35ee6ddc370b665747 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=silly_mirzakhani, maintainer=Guillaume Abrioux , com.redhat.component=rhceph-container, GIT_REPO=https://github.com/ceph/ceph-container.git, ceph=True, vcs-type=git, CEPH_POINT_RELEASE=, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, release=1763362218, name=rhceph, version=7, GIT_CLEAN=True, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat Ceph Storage 7, io.openshift.tags=rhceph ceph, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vendor=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, io.buildah.version=1.41.4, GIT_BRANCH=main, build-date=2025-11-26T19:44:28Z, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, 
io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, architecture=x86_64, description=Red Hat Ceph Storage 7, io.openshift.expose-services=, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, RELEASE=main, distribution-scope=public) Dec 15 04:54:00 localhost systemd[1]: tmp-crun.Gi3jBs.mount: Deactivated successfully. Dec 15 04:54:00 localhost systemd[1]: var-lib-containers-storage-overlay-5df8a75e0ff98ca054c99208c8672b995beb9317ac4545b636e5c2a3b0ac412e-merged.mount: Deactivated successfully. Dec 15 04:54:00 localhost systemd[1]: Started libpod-conmon-7060a6e519fbd38f496a3d13414eab96f5d847d41ceeeb35ee6ddc370b665747.scope. Dec 15 04:54:00 localhost systemd[1]: Started libcrun container. Dec 15 04:54:00 localhost podman[303994]: 2025-12-15 09:54:00.947724525 +0000 UTC m=+0.126508568 container init 7060a6e519fbd38f496a3d13414eab96f5d847d41ceeeb35ee6ddc370b665747 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=silly_mirzakhani, vendor=Red Hat, Inc., vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, RELEASE=main, GIT_CLEAN=True, architecture=x86_64, com.redhat.component=rhceph-container, distribution-scope=public, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, GIT_BRANCH=main, io.openshift.expose-services=, name=rhceph, io.openshift.tags=rhceph ceph, description=Red Hat Ceph Storage 7, io.k8s.description=Red Hat Ceph Storage 7, version=7, url=https://catalog.redhat.com/en/search?searchType=containers, build-date=2025-11-26T19:44:28Z, release=1763362218, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, GIT_REPO=https://github.com/ceph/ceph-container.git, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vcs-type=git, maintainer=Guillaume Abrioux , CEPH_POINT_RELEASE=, ceph=True, io.k8s.display-name=Red Hat Ceph Storage 7 
on RHEL 9, io.buildah.version=1.41.4) Dec 15 04:54:00 localhost systemd[1]: tmp-crun.cmwBX2.mount: Deactivated successfully. Dec 15 04:54:00 localhost podman[303994]: 2025-12-15 09:54:00.957813486 +0000 UTC m=+0.136597559 container start 7060a6e519fbd38f496a3d13414eab96f5d847d41ceeeb35ee6ddc370b665747 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=silly_mirzakhani, GIT_REPO=https://github.com/ceph/ceph-container.git, io.buildah.version=1.41.4, GIT_CLEAN=True, io.openshift.tags=rhceph ceph, vcs-type=git, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vendor=Red Hat, Inc., vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, url=https://catalog.redhat.com/en/search?searchType=containers, RELEASE=main, architecture=x86_64, name=rhceph, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, ceph=True, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, CEPH_POINT_RELEASE=, maintainer=Guillaume Abrioux , distribution-scope=public, build-date=2025-11-26T19:44:28Z, release=1763362218, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat Ceph Storage 7, GIT_BRANCH=main, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, version=7, io.openshift.expose-services=, com.redhat.component=rhceph-container, io.k8s.description=Red Hat Ceph Storage 7) Dec 15 04:54:00 localhost podman[303994]: 2025-12-15 09:54:00.958168756 +0000 UTC m=+0.136952819 container attach 7060a6e519fbd38f496a3d13414eab96f5d847d41ceeeb35ee6ddc370b665747 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=silly_mirzakhani, build-date=2025-11-26T19:44:28Z, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, io.k8s.description=Red Hat Ceph Storage 7, CEPH_POINT_RELEASE=, GIT_CLEAN=True, distribution-scope=public, version=7, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, 
RELEASE=main, name=rhceph, com.redhat.component=rhceph-container, io.openshift.tags=rhceph ceph, description=Red Hat Ceph Storage 7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, ceph=True, GIT_BRANCH=main, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, GIT_REPO=https://github.com/ceph/ceph-container.git, architecture=x86_64, vendor=Red Hat, Inc., io.openshift.expose-services=, release=1763362218, maintainer=Guillaume Abrioux , url=https://catalog.redhat.com/en/search?searchType=containers, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.buildah.version=1.41.4, vcs-type=git) Dec 15 04:54:00 localhost silly_mirzakhani[304009]: 167 167 Dec 15 04:54:00 localhost systemd[1]: libpod-7060a6e519fbd38f496a3d13414eab96f5d847d41ceeeb35ee6ddc370b665747.scope: Deactivated successfully. 
Dec 15 04:54:00 localhost podman[303994]: 2025-12-15 09:54:00.960648035 +0000 UTC m=+0.139432138 container died 7060a6e519fbd38f496a3d13414eab96f5d847d41ceeeb35ee6ddc370b665747 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=silly_mirzakhani, GIT_REPO=https://github.com/ceph/ceph-container.git, GIT_CLEAN=True, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, description=Red Hat Ceph Storage 7, RELEASE=main, release=1763362218, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., io.buildah.version=1.41.4, com.redhat.component=rhceph-container, url=https://catalog.redhat.com/en/search?searchType=containers, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, maintainer=Guillaume Abrioux , name=rhceph, distribution-scope=public, CEPH_POINT_RELEASE=, version=7, io.openshift.expose-services=, vcs-type=git, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, GIT_BRANCH=main, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., ceph=True, build-date=2025-11-26T19:44:28Z, io.k8s.description=Red Hat Ceph Storage 7, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, io.openshift.tags=rhceph ceph) Dec 15 04:54:00 localhost podman[303994]: 2025-12-15 09:54:00.864372681 +0000 UTC m=+0.043156794 image pull registry.redhat.io/rhceph/rhceph-7-rhel9:latest Dec 15 04:54:01 localhost podman[304014]: 2025-12-15 09:54:01.029264038 +0000 UTC m=+0.059059398 container remove 7060a6e519fbd38f496a3d13414eab96f5d847d41ceeeb35ee6ddc370b665747 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=silly_mirzakhani, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., build-date=2025-11-26T19:44:28Z, io.buildah.version=1.41.4, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, 
org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, io.openshift.tags=rhceph ceph, ceph=True, name=rhceph, RELEASE=main, description=Red Hat Ceph Storage 7, GIT_REPO=https://github.com/ceph/ceph-container.git, release=1763362218, maintainer=Guillaume Abrioux , GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vendor=Red Hat, Inc., com.redhat.component=rhceph-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=7, io.k8s.description=Red Hat Ceph Storage 7, io.openshift.expose-services=, GIT_BRANCH=main, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, distribution-scope=public, GIT_CLEAN=True, url=https://catalog.redhat.com/en/search?searchType=containers, architecture=x86_64, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, vcs-type=git, CEPH_POINT_RELEASE=) Dec 15 04:54:01 localhost systemd[1]: libpod-conmon-7060a6e519fbd38f496a3d13414eab96f5d847d41ceeeb35ee6ddc370b665747.scope: Deactivated successfully. Dec 15 04:54:01 localhost nova_compute[286344]: 2025-12-15 09:54:01.273 286348 DEBUG oslo_service.periodic_task [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Dec 15 04:54:01 localhost ceph-mon[298913]: Reconfiguring osd.3 (monmap changed)... 
Dec 15 04:54:01 localhost ceph-mon[298913]: Reconfiguring daemon osd.3 on np0005559462.localdomain Dec 15 04:54:01 localhost ceph-mon[298913]: from='mgr.26593 172.18.0.108:0/1692918800' entity='mgr.np0005559464.aomnqe' Dec 15 04:54:01 localhost ceph-mon[298913]: from='mgr.26593 172.18.0.108:0/1692918800' entity='mgr.np0005559464.aomnqe' Dec 15 04:54:01 localhost ceph-mon[298913]: from='mgr.26593 172.18.0.108:0/1692918800' entity='mgr.np0005559464.aomnqe' cmd={"prefix": "auth get-or-create", "entity": "mds.mds.np0005559462.mhigvc", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]} : dispatch Dec 15 04:54:01 localhost podman[304087]: Dec 15 04:54:01 localhost podman[304087]: 2025-12-15 09:54:01.811092584 +0000 UTC m=+0.073820268 container create 12f8053bd876f2c7e2d26bb5049730acad9f5400e32bffcb5e506a44a5f0c654 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=practical_swartz, io.buildah.version=1.41.4, architecture=x86_64, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, description=Red Hat Ceph Storage 7, release=1763362218, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, name=rhceph, vcs-type=git, CEPH_POINT_RELEASE=, RELEASE=main, io.openshift.expose-services=, vendor=Red Hat, Inc., distribution-scope=public, com.redhat.component=rhceph-container, io.openshift.tags=rhceph ceph, GIT_REPO=https://github.com/ceph/ceph-container.git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=7, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, io.k8s.description=Red Hat Ceph Storage 7, ceph=True, url=https://catalog.redhat.com/en/search?searchType=containers, build-date=2025-11-26T19:44:28Z, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., maintainer=Guillaume Abrioux , GIT_CLEAN=True, 
GIT_BRANCH=main) Dec 15 04:54:01 localhost systemd[1]: Started libpod-conmon-12f8053bd876f2c7e2d26bb5049730acad9f5400e32bffcb5e506a44a5f0c654.scope. Dec 15 04:54:01 localhost systemd[1]: Started libcrun container. Dec 15 04:54:01 localhost podman[304087]: 2025-12-15 09:54:01.780927784 +0000 UTC m=+0.043655508 image pull registry.redhat.io/rhceph/rhceph-7-rhel9:latest Dec 15 04:54:01 localhost podman[243449]: time="2025-12-15T09:54:01Z" level=info msg="List containers: received `last` parameter - overwriting `limit`" Dec 15 04:54:01 localhost podman[304087]: 2025-12-15 09:54:01.881853338 +0000 UTC m=+0.144581032 container init 12f8053bd876f2c7e2d26bb5049730acad9f5400e32bffcb5e506a44a5f0c654 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=practical_swartz, CEPH_POINT_RELEASE=, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, architecture=x86_64, url=https://catalog.redhat.com/en/search?searchType=containers, GIT_CLEAN=True, distribution-scope=public, GIT_REPO=https://github.com/ceph/ceph-container.git, name=rhceph, version=7, io.k8s.description=Red Hat Ceph Storage 7, GIT_BRANCH=main, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, ceph=True, vendor=Red Hat, Inc., io.buildah.version=1.41.4, com.redhat.component=rhceph-container, io.openshift.tags=rhceph ceph, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.openshift.expose-services=, vcs-type=git, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., release=1763362218, maintainer=Guillaume Abrioux , description=Red Hat Ceph Storage 7, build-date=2025-11-26T19:44:28Z, RELEASE=main) Dec 15 04:54:01 localhost podman[304087]: 2025-12-15 09:54:01.890708284 +0000 UTC m=+0.153435968 container start 
12f8053bd876f2c7e2d26bb5049730acad9f5400e32bffcb5e506a44a5f0c654 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=practical_swartz, build-date=2025-11-26T19:44:28Z, ceph=True, name=rhceph, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, GIT_BRANCH=main, io.k8s.description=Red Hat Ceph Storage 7, maintainer=Guillaume Abrioux , com.redhat.component=rhceph-container, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_REPO=https://github.com/ceph/ceph-container.git, architecture=x86_64, distribution-scope=public, description=Red Hat Ceph Storage 7, release=1763362218, version=7, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.expose-services=, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_CLEAN=True, CEPH_POINT_RELEASE=, RELEASE=main, io.openshift.tags=rhceph ceph, io.buildah.version=1.41.4, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0) Dec 15 04:54:01 localhost podman[304087]: 2025-12-15 09:54:01.890974412 +0000 UTC m=+0.153702146 container attach 12f8053bd876f2c7e2d26bb5049730acad9f5400e32bffcb5e506a44a5f0c654 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=practical_swartz, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, io.k8s.description=Red Hat Ceph Storage 7, distribution-scope=public, GIT_BRANCH=main, ceph=True, GIT_CLEAN=True, RELEASE=main, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vcs-type=git, vendor=Red Hat, Inc., GIT_REPO=https://github.com/ceph/ceph-container.git, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, 
maintainer=Guillaume Abrioux , com.redhat.component=rhceph-container, io.buildah.version=1.41.4, build-date=2025-11-26T19:44:28Z, description=Red Hat Ceph Storage 7, release=1763362218, name=rhceph, version=7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, CEPH_POINT_RELEASE=, url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.expose-services=, io.openshift.tags=rhceph ceph, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, architecture=x86_64) Dec 15 04:54:01 localhost practical_swartz[304102]: 167 167 Dec 15 04:54:01 localhost systemd[1]: var-lib-containers-storage-overlay-036a0f9adaea89454f4d8fe3813db3d8fda881f1a901262d835815203e449ebd-merged.mount: Deactivated successfully. Dec 15 04:54:01 localhost podman[304087]: 2025-12-15 09:54:01.899951592 +0000 UTC m=+0.162679326 container died 12f8053bd876f2c7e2d26bb5049730acad9f5400e32bffcb5e506a44a5f0c654 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=practical_swartz, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, CEPH_POINT_RELEASE=, architecture=x86_64, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, build-date=2025-11-26T19:44:28Z, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.buildah.version=1.41.4, GIT_BRANCH=main, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, description=Red Hat Ceph Storage 7, name=rhceph, version=7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=rhceph-container, io.k8s.description=Red Hat Ceph Storage 7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., ceph=True, RELEASE=main, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, vendor=Red Hat, Inc., GIT_CLEAN=True, url=https://catalog.redhat.com/en/search?searchType=containers, release=1763362218, maintainer=Guillaume 
Abrioux , io.openshift.tags=rhceph ceph, vcs-type=git, distribution-scope=public, GIT_REPO=https://github.com/ceph/ceph-container.git, io.openshift.expose-services=) Dec 15 04:54:01 localhost systemd[1]: libpod-12f8053bd876f2c7e2d26bb5049730acad9f5400e32bffcb5e506a44a5f0c654.scope: Deactivated successfully. Dec 15 04:54:01 localhost podman[243449]: @ - - [15/Dec/2025:09:54:01 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 156595 "" "Go-http-client/1.1" Dec 15 04:54:01 localhost podman[243449]: @ - - [15/Dec/2025:09:54:01 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 19032 "" "Go-http-client/1.1" Dec 15 04:54:02 localhost systemd[1]: var-lib-containers-storage-overlay-07d7ba6db239dcefa08a9c17a462efc66e2ede4c1ecad1ebd11bb7507b3646ed-merged.mount: Deactivated successfully. Dec 15 04:54:02 localhost podman[304107]: 2025-12-15 09:54:02.06488498 +0000 UTC m=+0.154927280 container remove 12f8053bd876f2c7e2d26bb5049730acad9f5400e32bffcb5e506a44a5f0c654 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=practical_swartz, vcs-type=git, name=rhceph, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, build-date=2025-11-26T19:44:28Z, url=https://catalog.redhat.com/en/search?searchType=containers, maintainer=Guillaume Abrioux , CEPH_POINT_RELEASE=, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, GIT_REPO=https://github.com/ceph/ceph-container.git, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, distribution-scope=public, RELEASE=main, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, ceph=True, architecture=x86_64, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., com.redhat.component=rhceph-container, GIT_BRANCH=main, io.buildah.version=1.41.4, vendor=Red Hat, Inc., description=Red Hat Ceph Storage 7, io.openshift.expose-services=, GIT_CLEAN=True, 
io.openshift.tags=rhceph ceph, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, release=1763362218, io.k8s.description=Red Hat Ceph Storage 7) Dec 15 04:54:02 localhost systemd[1]: libpod-conmon-12f8053bd876f2c7e2d26bb5049730acad9f5400e32bffcb5e506a44a5f0c654.scope: Deactivated successfully. Dec 15 04:54:02 localhost nova_compute[286344]: 2025-12-15 09:54:02.266 286348 DEBUG oslo_service.periodic_task [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Dec 15 04:54:02 localhost ceph-mon[298913]: Reconfiguring mds.mds.np0005559462.mhigvc (monmap changed)... Dec 15 04:54:02 localhost ceph-mon[298913]: Reconfiguring daemon mds.mds.np0005559462.mhigvc on np0005559462.localdomain Dec 15 04:54:02 localhost ceph-mon[298913]: from='mgr.26593 172.18.0.108:0/1692918800' entity='mgr.np0005559464.aomnqe' Dec 15 04:54:02 localhost ceph-mon[298913]: from='mgr.26593 172.18.0.108:0/1692918800' entity='mgr.np0005559464.aomnqe' Dec 15 04:54:02 localhost ceph-mon[298913]: from='mgr.26593 172.18.0.108:0/1692918800' entity='mgr.np0005559464.aomnqe' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005559462.fudvyx", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch Dec 15 04:54:02 localhost podman[304175]: Dec 15 04:54:02 localhost podman[304175]: 2025-12-15 09:54:02.78335312 +0000 UTC m=+0.076453742 container create 0fa8a6517d1e7d78434bdb804f43c53c6e06c75f42386b45db92d5a2e11628ef (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=gracious_rubin, com.redhat.component=rhceph-container, build-date=2025-11-26T19:44:28Z, io.openshift.tags=rhceph ceph, io.k8s.description=Red Hat Ceph Storage 7, vcs-type=git, name=rhceph, vendor=Red Hat, Inc., io.buildah.version=1.41.4, 
GIT_CLEAN=True, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, url=https://catalog.redhat.com/en/search?searchType=containers, maintainer=Guillaume Abrioux , io.openshift.expose-services=, CEPH_POINT_RELEASE=, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, release=1763362218, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., description=Red Hat Ceph Storage 7, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, GIT_REPO=https://github.com/ceph/ceph-container.git, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, GIT_BRANCH=main, RELEASE=main, version=7, ceph=True, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, distribution-scope=public, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0) Dec 15 04:54:02 localhost systemd[1]: Started /usr/bin/podman healthcheck run 4d46c406a3855fd1c322e5d86c048188ea5865daa07ba23aaebacc7879668923. Dec 15 04:54:02 localhost systemd[1]: Started libpod-conmon-0fa8a6517d1e7d78434bdb804f43c53c6e06c75f42386b45db92d5a2e11628ef.scope. Dec 15 04:54:02 localhost systemd[1]: Started libcrun container. 
Dec 15 04:54:02 localhost podman[304175]: 2025-12-15 09:54:02.846877801 +0000 UTC m=+0.139978433 container init 0fa8a6517d1e7d78434bdb804f43c53c6e06c75f42386b45db92d5a2e11628ef (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=gracious_rubin, ceph=True, RELEASE=main, io.k8s.description=Red Hat Ceph Storage 7, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, build-date=2025-11-26T19:44:28Z, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, GIT_CLEAN=True, io.openshift.tags=rhceph ceph, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, url=https://catalog.redhat.com/en/search?searchType=containers, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., com.redhat.component=rhceph-container, vcs-type=git, name=rhceph, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, maintainer=Guillaume Abrioux , CEPH_POINT_RELEASE=, description=Red Hat Ceph Storage 7, release=1763362218, version=7, GIT_REPO=https://github.com/ceph/ceph-container.git, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, architecture=x86_64, GIT_BRANCH=main, io.openshift.expose-services=, io.buildah.version=1.41.4, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0) Dec 15 04:54:02 localhost podman[304175]: 2025-12-15 09:54:02.751367649 +0000 UTC m=+0.044468291 image pull registry.redhat.io/rhceph/rhceph-7-rhel9:latest Dec 15 04:54:02 localhost podman[304175]: 2025-12-15 09:54:02.856401186 +0000 UTC m=+0.149501808 container start 0fa8a6517d1e7d78434bdb804f43c53c6e06c75f42386b45db92d5a2e11628ef (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=gracious_rubin, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, distribution-scope=public, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, 
vcs-type=git, description=Red Hat Ceph Storage 7, release=1763362218, name=rhceph, GIT_REPO=https://github.com/ceph/ceph-container.git, CEPH_POINT_RELEASE=, io.buildah.version=1.41.4, io.openshift.tags=rhceph ceph, url=https://catalog.redhat.com/en/search?searchType=containers, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., ceph=True, GIT_BRANCH=main, io.k8s.description=Red Hat Ceph Storage 7, io.openshift.expose-services=, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, maintainer=Guillaume Abrioux , GIT_CLEAN=True, build-date=2025-11-26T19:44:28Z, version=7, com.redhat.component=rhceph-container, RELEASE=main, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, architecture=x86_64) Dec 15 04:54:02 localhost podman[304175]: 2025-12-15 09:54:02.856698826 +0000 UTC m=+0.149799488 container attach 0fa8a6517d1e7d78434bdb804f43c53c6e06c75f42386b45db92d5a2e11628ef (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=gracious_rubin, GIT_BRANCH=main, vcs-type=git, io.buildah.version=1.41.4, io.openshift.tags=rhceph ceph, architecture=x86_64, RELEASE=main, io.openshift.expose-services=, GIT_CLEAN=True, version=7, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, io.k8s.description=Red Hat Ceph Storage 7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, ceph=True, description=Red Hat Ceph Storage 7, name=rhceph, vendor=Red Hat, Inc., maintainer=Guillaume Abrioux , io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_REPO=https://github.com/ceph/ceph-container.git, release=1763362218, CEPH_POINT_RELEASE=, summary=Provides the latest Red Hat 
Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., com.redhat.component=rhceph-container, build-date=2025-11-26T19:44:28Z, distribution-scope=public) Dec 15 04:54:02 localhost gracious_rubin[304190]: 167 167 Dec 15 04:54:02 localhost systemd[1]: libpod-0fa8a6517d1e7d78434bdb804f43c53c6e06c75f42386b45db92d5a2e11628ef.scope: Deactivated successfully. Dec 15 04:54:02 localhost podman[304175]: 2025-12-15 09:54:02.861327704 +0000 UTC m=+0.154428356 container died 0fa8a6517d1e7d78434bdb804f43c53c6e06c75f42386b45db92d5a2e11628ef (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=gracious_rubin, GIT_CLEAN=True, description=Red Hat Ceph Storage 7, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, CEPH_POINT_RELEASE=, architecture=x86_64, url=https://catalog.redhat.com/en/search?searchType=containers, maintainer=Guillaume Abrioux , org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., ceph=True, io.openshift.tags=rhceph ceph, release=1763362218, GIT_BRANCH=main, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_REPO=https://github.com/ceph/ceph-container.git, vendor=Red Hat, Inc., io.openshift.expose-services=, vcs-type=git, build-date=2025-11-26T19:44:28Z, version=7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.buildah.version=1.41.4, name=rhceph, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, distribution-scope=public, io.k8s.description=Red Hat Ceph Storage 7, com.redhat.component=rhceph-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, RELEASE=main) Dec 15 04:54:02 localhost podman[304189]: 2025-12-15 09:54:02.907195123 +0000 UTC m=+0.088411426 container health_status 4d46c406a3855fd1c322e5d86c048188ea5865daa07ba23aaebacc7879668923 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, 
name=ovn_metadata_agent, health_status=healthy, managed_by=edpm_ansible, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '07f3af15d79276418f54a8dde2fc251064527c3571430e14af77aaa2c01f1193-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, tcib_managed=true) Dec 15 04:54:02 localhost podman[304205]: 2025-12-15 09:54:02.952051504 +0000 UTC m=+0.084174118 container remove 0fa8a6517d1e7d78434bdb804f43c53c6e06c75f42386b45db92d5a2e11628ef (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=gracious_rubin, vcs-type=git, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://catalog.redhat.com/en/search?searchType=containers, name=rhceph, GIT_CLEAN=True, description=Red Hat Ceph Storage 7, version=7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., maintainer=Guillaume Abrioux , io.openshift.expose-services=, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, CEPH_POINT_RELEASE=, com.redhat.component=rhceph-container, architecture=x86_64, vendor=Red Hat, Inc., io.k8s.description=Red Hat Ceph Storage 7, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, build-date=2025-11-26T19:44:28Z, GIT_REPO=https://github.com/ceph/ceph-container.git, distribution-scope=public, RELEASE=main, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, ceph=True, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, release=1763362218, io.buildah.version=1.41.4, GIT_BRANCH=main, io.openshift.tags=rhceph ceph, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0) Dec 15 04:54:02 localhost systemd[1]: libpod-conmon-0fa8a6517d1e7d78434bdb804f43c53c6e06c75f42386b45db92d5a2e11628ef.scope: Deactivated successfully. 
Dec 15 04:54:02 localhost podman[304189]: 2025-12-15 09:54:02.969034577 +0000 UTC m=+0.150250840 container exec_died 4d46c406a3855fd1c322e5d86c048188ea5865daa07ba23aaebacc7879668923 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '07f3af15d79276418f54a8dde2fc251064527c3571430e14af77aaa2c01f1193-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.build-date=20251202, tcib_managed=true, container_name=ovn_metadata_agent) Dec 15 04:54:02 localhost systemd[1]: 
4d46c406a3855fd1c322e5d86c048188ea5865daa07ba23aaebacc7879668923.service: Deactivated successfully. Dec 15 04:54:03 localhost sshd[304228]: main: sshd: ssh-rsa algorithm is disabled Dec 15 04:54:03 localhost nova_compute[286344]: 2025-12-15 09:54:03.270 286348 DEBUG oslo_service.periodic_task [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Dec 15 04:54:03 localhost nova_compute[286344]: 2025-12-15 09:54:03.289 286348 DEBUG oslo_concurrency.lockutils [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Dec 15 04:54:03 localhost nova_compute[286344]: 2025-12-15 09:54:03.290 286348 DEBUG oslo_concurrency.lockutils [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Dec 15 04:54:03 localhost nova_compute[286344]: 2025-12-15 09:54:03.290 286348 DEBUG oslo_concurrency.lockutils [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Dec 15 04:54:03 localhost nova_compute[286344]: 2025-12-15 09:54:03.290 286348 DEBUG nova.compute.resource_tracker [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Auditing locally available compute resources for np0005559462.localdomain (node: np0005559462.localdomain) update_available_resource 
/usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m Dec 15 04:54:03 localhost nova_compute[286344]: 2025-12-15 09:54:03.291 286348 DEBUG oslo_concurrency.processutils [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m Dec 15 04:54:03 localhost ceph-mon[298913]: Reconfiguring mgr.np0005559462.fudvyx (monmap changed)... Dec 15 04:54:03 localhost ceph-mon[298913]: Reconfiguring daemon mgr.np0005559462.fudvyx on np0005559462.localdomain Dec 15 04:54:03 localhost ceph-mon[298913]: from='mgr.26593 172.18.0.108:0/1692918800' entity='mgr.np0005559464.aomnqe' Dec 15 04:54:03 localhost ceph-mon[298913]: from='mgr.26593 172.18.0.108:0/1692918800' entity='mgr.np0005559464.aomnqe' cmd={"prefix": "auth get", "entity": "mon."} : dispatch Dec 15 04:54:03 localhost ceph-mon[298913]: from='mgr.26593 172.18.0.108:0/1692918800' entity='mgr.np0005559464.aomnqe' Dec 15 04:54:03 localhost ceph-mon[298913]: from='mgr.26593 172.18.0.108:0/1692918800' entity='mgr.np0005559464.aomnqe' Dec 15 04:54:03 localhost ceph-mon[298913]: from='mgr.26593 172.18.0.108:0/1692918800' entity='mgr.np0005559464.aomnqe' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005559463.localdomain", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch Dec 15 04:54:03 localhost ceph-mon[298913]: mon.np0005559462@2(peon) e11 handle_command mon_command({"prefix": "df", "format": "json"} v 0) Dec 15 04:54:03 localhost ceph-mon[298913]: log_channel(audit) log [DBG] : from='client.? 
172.18.0.106:0/2043294572' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch Dec 15 04:54:03 localhost nova_compute[286344]: 2025-12-15 09:54:03.677 286348 DEBUG oslo_concurrency.processutils [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.386s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m Dec 15 04:54:03 localhost nova_compute[286344]: 2025-12-15 09:54:03.730 286348 DEBUG nova.virt.libvirt.driver [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m Dec 15 04:54:03 localhost nova_compute[286344]: 2025-12-15 09:54:03.730 286348 DEBUG nova.virt.libvirt.driver [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m Dec 15 04:54:03 localhost systemd[1]: var-lib-containers-storage-overlay-5f608ad3f95f3de716f52ec295530687c86348411526a3bdd53e7ce2689c967d-merged.mount: Deactivated successfully. Dec 15 04:54:03 localhost nova_compute[286344]: 2025-12-15 09:54:03.953 286348 WARNING nova.virt.libvirt.driver [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] This host appears to have multiple sockets per NUMA node. 
The `socket` PCI NUMA affinity will not be supported.#033[00m Dec 15 04:54:03 localhost nova_compute[286344]: 2025-12-15 09:54:03.955 286348 DEBUG nova.compute.resource_tracker [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Hypervisor/Node resource view: name=np0005559462.localdomain free_ram=11783MB free_disk=41.836978912353516GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", 
"product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m Dec 15 04:54:03 localhost nova_compute[286344]: 2025-12-15 09:54:03.955 286348 DEBUG oslo_concurrency.lockutils [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Dec 15 04:54:03 localhost nova_compute[286344]: 2025-12-15 09:54:03.956 286348 DEBUG oslo_concurrency.lockutils [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Dec 15 04:54:04 localhost nova_compute[286344]: 2025-12-15 09:54:04.037 286348 DEBUG nova.compute.resource_tracker [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Instance 39ff1bd9-6f6b-44c8-bbec-a1fd9d196359 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 2, 'MEMORY_MB': 512, 'VCPU': 1}}. 
_remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m Dec 15 04:54:04 localhost nova_compute[286344]: 2025-12-15 09:54:04.038 286348 DEBUG nova.compute.resource_tracker [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m Dec 15 04:54:04 localhost nova_compute[286344]: 2025-12-15 09:54:04.038 286348 DEBUG nova.compute.resource_tracker [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Final resource view: name=np0005559462.localdomain phys_ram=15738MB used_ram=1024MB phys_disk=41GB used_disk=2GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m Dec 15 04:54:04 localhost nova_compute[286344]: 2025-12-15 09:54:04.093 286348 DEBUG oslo_concurrency.processutils [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m Dec 15 04:54:04 localhost ceph-mon[298913]: Deploying daemon mon.np0005559463 on np0005559463.localdomain Dec 15 04:54:04 localhost ceph-mon[298913]: Reconfiguring crash.np0005559463 (monmap changed)... Dec 15 04:54:04 localhost ceph-mon[298913]: Reconfiguring daemon crash.np0005559463 on np0005559463.localdomain Dec 15 04:54:04 localhost ceph-mon[298913]: mon.np0005559462@2(peon) e11 handle_command mon_command({"prefix": "df", "format": "json"} v 0) Dec 15 04:54:04 localhost ceph-mon[298913]: log_channel(audit) log [DBG] : from='client.? 
172.18.0.106:0/1408708198' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch Dec 15 04:54:04 localhost nova_compute[286344]: 2025-12-15 09:54:04.548 286348 DEBUG oslo_concurrency.processutils [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.455s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m Dec 15 04:54:04 localhost nova_compute[286344]: 2025-12-15 09:54:04.554 286348 DEBUG nova.compute.provider_tree [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Inventory has not changed in ProviderTree for provider: 26c8956b-6742-4951-b566-971b9bbe323b update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m Dec 15 04:54:04 localhost nova_compute[286344]: 2025-12-15 09:54:04.572 286348 DEBUG nova.scheduler.client.report [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Inventory has not changed for provider 26c8956b-6742-4951-b566-971b9bbe323b based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m Dec 15 04:54:04 localhost nova_compute[286344]: 2025-12-15 09:54:04.575 286348 DEBUG nova.compute.resource_tracker [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Compute_service record updated for np0005559462.localdomain:np0005559462.localdomain _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m Dec 15 04:54:04 localhost nova_compute[286344]: 2025-12-15 09:54:04.575 286348 DEBUG 
oslo_concurrency.lockutils [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.619s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Dec 15 04:54:04 localhost openstack_network_exporter[246484]: ERROR 09:54:04 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server Dec 15 04:54:04 localhost openstack_network_exporter[246484]: ERROR 09:54:04 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd Dec 15 04:54:04 localhost openstack_network_exporter[246484]: ERROR 09:54:04 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd Dec 15 04:54:04 localhost openstack_network_exporter[246484]: ERROR 09:54:04 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath Dec 15 04:54:04 localhost openstack_network_exporter[246484]: Dec 15 04:54:04 localhost openstack_network_exporter[246484]: ERROR 09:54:04 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath Dec 15 04:54:04 localhost openstack_network_exporter[246484]: Dec 15 04:54:05 localhost ceph-mon[298913]: mon.np0005559462@2(peon).osd e87 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Dec 15 04:54:05 localhost ceph-mon[298913]: mon.np0005559462@2(peon) e11 handle_command mon_command({"prefix":"df", "format":"json"} v 0) Dec 15 04:54:05 localhost ceph-mon[298913]: log_channel(audit) log [DBG] : from='client.? 
172.18.0.32:0/3802037974' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch Dec 15 04:54:05 localhost ceph-mon[298913]: mon.np0005559462@2(peon) e11 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0) Dec 15 04:54:05 localhost ceph-mon[298913]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/3802037974' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch Dec 15 04:54:05 localhost ceph-mon[298913]: mon.np0005559462@2(peon) e11 adding peer [v2:172.18.0.104:3300/0,v1:172.18.0.104:6789/0] to list of hints Dec 15 04:54:05 localhost ceph-mon[298913]: mon.np0005559462@2(peon) e11 adding peer [v2:172.18.0.104:3300/0,v1:172.18.0.104:6789/0] to list of hints Dec 15 04:54:05 localhost ceph-mon[298913]: from='mgr.26593 172.18.0.108:0/1692918800' entity='mgr.np0005559464.aomnqe' Dec 15 04:54:05 localhost ceph-mon[298913]: from='mgr.26593 172.18.0.108:0/1692918800' entity='mgr.np0005559464.aomnqe' Dec 15 04:54:05 localhost nova_compute[286344]: 2025-12-15 09:54:05.576 286348 DEBUG oslo_service.periodic_task [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Dec 15 04:54:05 localhost nova_compute[286344]: 2025-12-15 09:54:05.577 286348 DEBUG nova.compute.manager [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m Dec 15 04:54:05 localhost nova_compute[286344]: 2025-12-15 09:54:05.577 286348 DEBUG nova.compute.manager [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m Dec 15 04:54:05 localhost 
nova_compute[286344]: 2025-12-15 09:54:05.776 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Dec 15 04:54:06 localhost nova_compute[286344]: 2025-12-15 09:54:06.004 286348 DEBUG oslo_concurrency.lockutils [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Acquiring lock "refresh_cache-39ff1bd9-6f6b-44c8-bbec-a1fd9d196359" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m Dec 15 04:54:06 localhost nova_compute[286344]: 2025-12-15 09:54:06.005 286348 DEBUG oslo_concurrency.lockutils [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Acquired lock "refresh_cache-39ff1bd9-6f6b-44c8-bbec-a1fd9d196359" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m Dec 15 04:54:06 localhost nova_compute[286344]: 2025-12-15 09:54:06.005 286348 DEBUG nova.network.neutron [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] [instance: 39ff1bd9-6f6b-44c8-bbec-a1fd9d196359] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m Dec 15 04:54:06 localhost nova_compute[286344]: 2025-12-15 09:54:06.006 286348 DEBUG nova.objects.instance [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Lazy-loading 'info_cache' on Instance uuid 39ff1bd9-6f6b-44c8-bbec-a1fd9d196359 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m Dec 15 04:54:06 localhost ceph-mon[298913]: mon.np0005559462@2(peon) e11 adding peer [v2:172.18.0.104:3300/0,v1:172.18.0.104:6789/0] to list of hints Dec 15 04:54:06 localhost nova_compute[286344]: 2025-12-15 09:54:06.377 286348 DEBUG nova.network.neutron [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] [instance: 39ff1bd9-6f6b-44c8-bbec-a1fd9d196359] Updating instance_info_cache with network_info: [{"id": "03ef8889-3216-43fb-8a52-4be17a956ce1", "address": 
"fa:16:3e:74:df:7c", "network": {"id": "befb7a72-17a9-4bcb-b561-84b8f626685a", "bridge": "br-int", "label": "private", "subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.201", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.20", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.0.3"}}], "meta": {"injected": false, "tenant_id": "c785bf23f53946bc99867d8832a50266", "mtu": 1292, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap03ef8889-32", "ovs_interfaceid": "03ef8889-3216-43fb-8a52-4be17a956ce1", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m Dec 15 04:54:06 localhost nova_compute[286344]: 2025-12-15 09:54:06.390 286348 DEBUG oslo_concurrency.lockutils [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Releasing lock "refresh_cache-39ff1bd9-6f6b-44c8-bbec-a1fd9d196359" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m Dec 15 04:54:06 localhost nova_compute[286344]: 2025-12-15 09:54:06.391 286348 DEBUG nova.compute.manager [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] [instance: 39ff1bd9-6f6b-44c8-bbec-a1fd9d196359] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m Dec 15 04:54:06 localhost nova_compute[286344]: 2025-12-15 09:54:06.392 286348 DEBUG oslo_service.periodic_task [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Running 
periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Dec 15 04:54:06 localhost nova_compute[286344]: 2025-12-15 09:54:06.393 286348 DEBUG oslo_service.periodic_task [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Dec 15 04:54:06 localhost nova_compute[286344]: 2025-12-15 09:54:06.393 286348 DEBUG oslo_service.periodic_task [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Dec 15 04:54:06 localhost nova_compute[286344]: 2025-12-15 09:54:06.394 286348 DEBUG oslo_service.periodic_task [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Dec 15 04:54:06 localhost ceph-mon[298913]: from='mgr.26593 172.18.0.108:0/1692918800' entity='mgr.np0005559464.aomnqe' Dec 15 04:54:06 localhost ceph-mon[298913]: from='mgr.26593 172.18.0.108:0/1692918800' entity='mgr.np0005559464.aomnqe' Dec 15 04:54:06 localhost ceph-mon[298913]: Reconfiguring osd.2 (monmap changed)... 
Dec 15 04:54:06 localhost ceph-mon[298913]: from='mgr.26593 172.18.0.108:0/1692918800' entity='mgr.np0005559464.aomnqe' cmd={"prefix": "auth get", "entity": "osd.2"} : dispatch Dec 15 04:54:06 localhost ceph-mon[298913]: Reconfiguring daemon osd.2 on np0005559463.localdomain Dec 15 04:54:07 localhost nova_compute[286344]: 2025-12-15 09:54:07.270 286348 DEBUG oslo_service.periodic_task [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Dec 15 04:54:07 localhost nova_compute[286344]: 2025-12-15 09:54:07.271 286348 DEBUG nova.compute.manager [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m Dec 15 04:54:07 localhost ceph-mon[298913]: from='mgr.26593 172.18.0.108:0/1692918800' entity='mgr.np0005559464.aomnqe' Dec 15 04:54:07 localhost ceph-mon[298913]: from='mgr.26593 172.18.0.108:0/1692918800' entity='mgr.np0005559464.aomnqe' Dec 15 04:54:07 localhost ceph-mon[298913]: Reconfiguring osd.5 (monmap changed)... 
Dec 15 04:54:07 localhost ceph-mon[298913]: from='mgr.26593 172.18.0.108:0/1692918800' entity='mgr.np0005559464.aomnqe' cmd={"prefix": "auth get", "entity": "osd.5"} : dispatch Dec 15 04:54:07 localhost ceph-mon[298913]: Reconfiguring daemon osd.5 on np0005559463.localdomain Dec 15 04:54:08 localhost ceph-mon[298913]: mon.np0005559462@2(peon) e11 adding peer [v2:172.18.0.104:3300/0,v1:172.18.0.104:6789/0] to list of hints Dec 15 04:54:08 localhost ceph-mon[298913]: from='mgr.26593 172.18.0.108:0/1692918800' entity='mgr.np0005559464.aomnqe' Dec 15 04:54:08 localhost ceph-mon[298913]: from='mgr.26593 172.18.0.108:0/1692918800' entity='mgr.np0005559464.aomnqe' Dec 15 04:54:08 localhost ceph-mon[298913]: Reconfiguring mds.mds.np0005559463.rdpgze (monmap changed)... Dec 15 04:54:08 localhost ceph-mon[298913]: from='mgr.26593 172.18.0.108:0/1692918800' entity='mgr.np0005559464.aomnqe' cmd={"prefix": "auth get-or-create", "entity": "mds.mds.np0005559463.rdpgze", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]} : dispatch Dec 15 04:54:08 localhost ceph-mon[298913]: Reconfiguring daemon mds.mds.np0005559463.rdpgze on np0005559463.localdomain Dec 15 04:54:08 localhost ceph-mon[298913]: from='mgr.26593 172.18.0.108:0/1692918800' entity='mgr.np0005559464.aomnqe' Dec 15 04:54:08 localhost ceph-mon[298913]: from='mgr.26593 172.18.0.108:0/1692918800' entity='mgr.np0005559464.aomnqe' Dec 15 04:54:08 localhost ceph-mon[298913]: from='mgr.26593 172.18.0.108:0/1692918800' entity='mgr.np0005559464.aomnqe' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005559463.daptkf", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch Dec 15 04:54:09 localhost systemd[1]: Started /usr/bin/podman healthcheck run a4e94bd7aaee6c174c601060fb5f34559019ccea93fc397263ee3735731cfc1e. 
Dec 15 04:54:09 localhost podman[304274]: 2025-12-15 09:54:09.763384856 +0000 UTC m=+0.091767569 container health_status a4e94bd7aaee6c174c601060fb5f34559019ccea93fc397263ee3735731cfc1e (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, maintainer=Navid Yaghoobi , managed_by=edpm_ansible, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter) Dec 15 04:54:09 localhost podman[304274]: 2025-12-15 09:54:09.774355002 +0000 UTC m=+0.102737665 container exec_died a4e94bd7aaee6c174c601060fb5f34559019ccea93fc397263ee3735731cfc1e (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', 
'/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible) Dec 15 04:54:09 localhost systemd[1]: a4e94bd7aaee6c174c601060fb5f34559019ccea93fc397263ee3735731cfc1e.service: Deactivated successfully. Dec 15 04:54:09 localhost ceph-mon[298913]: Reconfiguring mgr.np0005559463.daptkf (monmap changed)... Dec 15 04:54:09 localhost ceph-mon[298913]: Reconfiguring daemon mgr.np0005559463.daptkf on np0005559463.localdomain Dec 15 04:54:09 localhost ceph-mon[298913]: from='mgr.26593 172.18.0.108:0/1692918800' entity='mgr.np0005559464.aomnqe' Dec 15 04:54:09 localhost ceph-mon[298913]: from='mgr.26593 172.18.0.108:0/1692918800' entity='mgr.np0005559464.aomnqe' Dec 15 04:54:09 localhost ceph-mon[298913]: Reconfiguring crash.np0005559464 (monmap changed)... Dec 15 04:54:09 localhost ceph-mon[298913]: from='mgr.26593 172.18.0.108:0/1692918800' entity='mgr.np0005559464.aomnqe' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005559464.localdomain", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch Dec 15 04:54:09 localhost ceph-mon[298913]: Reconfiguring daemon crash.np0005559464 on np0005559464.localdomain Dec 15 04:54:10 localhost ceph-mon[298913]: mon.np0005559462@2(peon) e11 adding peer [v2:172.18.0.104:3300/0,v1:172.18.0.104:6789/0] to list of hints Dec 15 04:54:10 localhost ceph-mon[298913]: mon.np0005559462@2(peon) e11 adding peer [v2:172.18.0.104:3300/0,v1:172.18.0.104:6789/0] to list of hints Dec 15 04:54:10 localhost ceph-mon[298913]: mon.np0005559462@2(peon).osd e87 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Dec 15 04:54:10 localhost nova_compute[286344]: 2025-12-15 09:54:10.778 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Dec 15 04:54:10 localhost 
nova_compute[286344]: 2025-12-15 09:54:10.779 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Dec 15 04:54:10 localhost nova_compute[286344]: 2025-12-15 09:54:10.780 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m Dec 15 04:54:10 localhost nova_compute[286344]: 2025-12-15 09:54:10.780 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Dec 15 04:54:10 localhost nova_compute[286344]: 2025-12-15 09:54:10.780 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Dec 15 04:54:10 localhost nova_compute[286344]: 2025-12-15 09:54:10.783 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 04:54:10 localhost ceph-mon[298913]: from='mgr.26593 172.18.0.108:0/1692918800' entity='mgr.np0005559464.aomnqe' Dec 15 04:54:10 localhost ceph-mon[298913]: from='mgr.26593 172.18.0.108:0/1692918800' entity='mgr.np0005559464.aomnqe' Dec 15 04:54:10 localhost ceph-mon[298913]: Reconfiguring osd.1 (monmap changed)... 
Dec 15 04:54:10 localhost ceph-mon[298913]: from='mgr.26593 172.18.0.108:0/1692918800' entity='mgr.np0005559464.aomnqe' cmd={"prefix": "auth get", "entity": "osd.1"} : dispatch Dec 15 04:54:10 localhost ceph-mon[298913]: Reconfiguring daemon osd.1 on np0005559464.localdomain Dec 15 04:54:11 localhost ceph-mon[298913]: from='mgr.26593 172.18.0.108:0/1692918800' entity='mgr.np0005559464.aomnqe' Dec 15 04:54:11 localhost ceph-mon[298913]: from='mgr.26593 172.18.0.108:0/1692918800' entity='mgr.np0005559464.aomnqe' Dec 15 04:54:11 localhost ceph-mon[298913]: Reconfiguring osd.4 (monmap changed)... Dec 15 04:54:11 localhost ceph-mon[298913]: from='mgr.26593 172.18.0.108:0/1692918800' entity='mgr.np0005559464.aomnqe' cmd={"prefix": "auth get", "entity": "osd.4"} : dispatch Dec 15 04:54:11 localhost ceph-mon[298913]: Reconfiguring daemon osd.4 on np0005559464.localdomain Dec 15 04:54:12 localhost ceph-mon[298913]: mon.np0005559462@2(peon) e11 adding peer [v2:172.18.0.104:3300/0,v1:172.18.0.104:6789/0] to list of hints Dec 15 04:54:13 localhost ceph-mon[298913]: from='mgr.26593 172.18.0.108:0/1692918800' entity='mgr.np0005559464.aomnqe' Dec 15 04:54:13 localhost ceph-mon[298913]: from='mgr.26593 172.18.0.108:0/1692918800' entity='mgr.np0005559464.aomnqe' Dec 15 04:54:13 localhost ceph-mon[298913]: Reconfiguring mds.mds.np0005559464.piyuji (monmap changed)... 
Dec 15 04:54:13 localhost ceph-mon[298913]: from='mgr.26593 172.18.0.108:0/1692918800' entity='mgr.np0005559464.aomnqe' cmd={"prefix": "auth get-or-create", "entity": "mds.mds.np0005559464.piyuji", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]} : dispatch Dec 15 04:54:13 localhost ceph-mon[298913]: Reconfiguring daemon mds.mds.np0005559464.piyuji on np0005559464.localdomain Dec 15 04:54:14 localhost ceph-mon[298913]: from='mgr.26593 172.18.0.108:0/1692918800' entity='mgr.np0005559464.aomnqe' Dec 15 04:54:14 localhost ceph-mon[298913]: from='mgr.26593 172.18.0.108:0/1692918800' entity='mgr.np0005559464.aomnqe' Dec 15 04:54:14 localhost ceph-mon[298913]: Reconfiguring mgr.np0005559464.aomnqe (monmap changed)... Dec 15 04:54:14 localhost ceph-mon[298913]: from='mgr.26593 172.18.0.108:0/1692918800' entity='mgr.np0005559464.aomnqe' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005559464.aomnqe", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch Dec 15 04:54:14 localhost ceph-mon[298913]: Reconfiguring daemon mgr.np0005559464.aomnqe on np0005559464.localdomain Dec 15 04:54:14 localhost ceph-mon[298913]: mon.np0005559462@2(peon) e11 adding peer [v2:172.18.0.104:3300/0,v1:172.18.0.104:6789/0] to list of hints Dec 15 04:54:14 localhost ceph-mon[298913]: mon.np0005559462@2(peon) e11 adding peer [v2:172.18.0.104:3300/0,v1:172.18.0.104:6789/0] to list of hints Dec 15 04:54:15 localhost ceph-mon[298913]: from='mgr.26593 172.18.0.108:0/1692918800' entity='mgr.np0005559464.aomnqe' Dec 15 04:54:15 localhost ceph-mon[298913]: from='mgr.26593 172.18.0.108:0/1692918800' entity='mgr.np0005559464.aomnqe' Dec 15 04:54:15 localhost ceph-mon[298913]: mon.np0005559462@2(peon).osd e87 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Dec 15 04:54:15 localhost nova_compute[286344]: 2025-12-15 09:54:15.781 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on 
fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 15 04:54:16 localhost ceph-mon[298913]: mon.np0005559462@2(peon) e11 adding peer [v2:172.18.0.104:3300/0,v1:172.18.0.104:6789/0] to list of hints
Dec 15 04:54:16 localhost ceph-mon[298913]: from='mgr.26593 172.18.0.108:0/1692918800' entity='mgr.np0005559464.aomnqe'
Dec 15 04:54:16 localhost ceph-mon[298913]: from='mgr.26593 172.18.0.108:0/1692918800' entity='mgr.np0005559464.aomnqe'
Dec 15 04:54:17 localhost ceph-mon[298913]: from='mgr.26593 172.18.0.108:0/1692918800' entity='mgr.np0005559464.aomnqe' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch
Dec 15 04:54:17 localhost ceph-mon[298913]: from='mgr.26593 172.18.0.108:0/1692918800' entity='mgr.np0005559464.aomnqe'
Dec 15 04:54:18 localhost ceph-mon[298913]: mon.np0005559462@2(peon) e11 adding peer [v2:172.18.0.104:3300/0,v1:172.18.0.104:6789/0] to list of hints
Dec 15 04:54:18 localhost ceph-mon[298913]: mon.np0005559462@2(peon) e11 adding peer [v2:172.18.0.104:3300/0,v1:172.18.0.104:6789/0] to list of hints
Dec 15 04:54:19 localhost podman[304454]:
Dec 15 04:54:19 localhost podman[304454]: 2025-12-15 09:54:19.281208793 +0000 UTC m=+0.083493359 container create f849a056b115e7bd6180f349d45d14ffbad572ec629483e234971c45f8b28eaf (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=unruffled_chaplygin, release=1763362218, io.openshift.expose-services=, io.buildah.version=1.41.4, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vendor=Red Hat, Inc., GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.k8s.description=Red Hat Ceph Storage 7, architecture=x86_64, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, description=Red Hat Ceph Storage 7, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, url=https://catalog.redhat.com/en/search?searchType=containers, CEPH_POINT_RELEASE=, name=rhceph, GIT_REPO=https://github.com/ceph/ceph-container.git, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, maintainer=Guillaume Abrioux , RELEASE=main, io.openshift.tags=rhceph ceph, build-date=2025-11-26T19:44:28Z, com.redhat.component=rhceph-container, version=7, vcs-type=git, distribution-scope=public, GIT_BRANCH=main, ceph=True, GIT_CLEAN=True, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Dec 15 04:54:19 localhost systemd[1]: Started libpod-conmon-f849a056b115e7bd6180f349d45d14ffbad572ec629483e234971c45f8b28eaf.scope.
Dec 15 04:54:19 localhost systemd[1]: Started libcrun container.
Dec 15 04:54:19 localhost podman[304454]: 2025-12-15 09:54:19.246849584 +0000 UTC m=+0.049134200 image pull registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Dec 15 04:54:19 localhost podman[304454]: 2025-12-15 09:54:19.352263263 +0000 UTC m=+0.154547839 container init f849a056b115e7bd6180f349d45d14ffbad572ec629483e234971c45f8b28eaf (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=unruffled_chaplygin, io.buildah.version=1.41.4, GIT_REPO=https://github.com/ceph/ceph-container.git, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, ceph=True, vendor=Red Hat, Inc., io.k8s.description=Red Hat Ceph Storage 7, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, GIT_CLEAN=True, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vcs-type=git, architecture=x86_64, com.redhat.component=rhceph-container, RELEASE=main, version=7, io.openshift.expose-services=, maintainer=Guillaume Abrioux , GIT_BRANCH=main, build-date=2025-11-26T19:44:28Z, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1763362218, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, name=rhceph, description=Red Hat Ceph Storage 7, io.openshift.tags=rhceph ceph, distribution-scope=public, CEPH_POINT_RELEASE=)
Dec 15 04:54:19 localhost ceph-mon[298913]: Reconfig service osd.default_drive_group
Dec 15 04:54:19 localhost ceph-mon[298913]: from='mgr.26593 172.18.0.108:0/1692918800' entity='mgr.np0005559464.aomnqe'
Dec 15 04:54:19 localhost ceph-mon[298913]: from='mgr.26593 172.18.0.108:0/1692918800' entity='mgr.np0005559464.aomnqe'
Dec 15 04:54:19 localhost ceph-mon[298913]: from='mgr.26593 172.18.0.108:0/1692918800' entity='mgr.np0005559464.aomnqe' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch
Dec 15 04:54:19 localhost ceph-mon[298913]: from='mgr.26593 172.18.0.108:0/1692918800' entity='mgr.np0005559464.aomnqe'
Dec 15 04:54:19 localhost ceph-mon[298913]: from='mgr.26593 172.18.0.108:0/1692918800' entity='mgr.np0005559464.aomnqe'
Dec 15 04:54:19 localhost ceph-mon[298913]: from='mgr.26593 172.18.0.108:0/1692918800' entity='mgr.np0005559464.aomnqe'
Dec 15 04:54:19 localhost ceph-mon[298913]: from='mgr.26593 172.18.0.108:0/1692918800' entity='mgr.np0005559464.aomnqe'
Dec 15 04:54:19 localhost ceph-mon[298913]: from='mgr.26593 172.18.0.108:0/1692918800' entity='mgr.np0005559464.aomnqe'
Dec 15 04:54:19 localhost ceph-mon[298913]: from='mgr.26593 172.18.0.108:0/1692918800' entity='mgr.np0005559464.aomnqe'
Dec 15 04:54:19 localhost ceph-mon[298913]: from='mgr.26593 172.18.0.108:0/1692918800' entity='mgr.np0005559464.aomnqe'
Dec 15 04:54:19 localhost ceph-mon[298913]: from='mgr.26593 172.18.0.108:0/1692918800' entity='mgr.np0005559464.aomnqe'
Dec 15 04:54:19 localhost ceph-mon[298913]: from='mgr.26593 172.18.0.108:0/1692918800' entity='mgr.np0005559464.aomnqe'
Dec 15 04:54:19 localhost ceph-mon[298913]: from='mgr.26593 172.18.0.108:0/1692918800' entity='mgr.np0005559464.aomnqe'
Dec 15 04:54:19 localhost ceph-mon[298913]: from='mgr.26593 172.18.0.108:0/1692918800' entity='mgr.np0005559464.aomnqe'
Dec 15 04:54:19 localhost ceph-mon[298913]: from='mgr.26593 172.18.0.108:0/1692918800' entity='mgr.np0005559464.aomnqe' cmd={"prefix": "auth get", "entity": "osd.0"} : dispatch
Dec 15 04:54:19 localhost ceph-mon[298913]: Reconfiguring daemon osd.0 on np0005559462.localdomain
Dec 15 04:54:19 localhost podman[304454]: 2025-12-15 09:54:19.362805167 +0000 UTC m=+0.165089733 container start f849a056b115e7bd6180f349d45d14ffbad572ec629483e234971c45f8b28eaf (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=unruffled_chaplygin, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vendor=Red Hat, Inc., GIT_REPO=https://github.com/ceph/ceph-container.git, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, distribution-scope=public, io.buildah.version=1.41.4, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, release=1763362218, GIT_BRANCH=main, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.k8s.description=Red Hat Ceph Storage 7, RELEASE=main, vcs-type=git, io.openshift.expose-services=, GIT_CLEAN=True, version=7, url=https://catalog.redhat.com/en/search?searchType=containers, maintainer=Guillaume Abrioux , org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, io.openshift.tags=rhceph ceph, name=rhceph, com.redhat.component=rhceph-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, CEPH_POINT_RELEASE=, ceph=True, description=Red Hat Ceph Storage 7, architecture=x86_64, build-date=2025-11-26T19:44:28Z)
Dec 15 04:54:19 localhost podman[304454]: 2025-12-15 09:54:19.363857536 +0000 UTC m=+0.166142102 container attach f849a056b115e7bd6180f349d45d14ffbad572ec629483e234971c45f8b28eaf (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=unruffled_chaplygin, vcs-type=git, RELEASE=main, release=1763362218, architecture=x86_64, url=https://catalog.redhat.com/en/search?searchType=containers, version=7, GIT_CLEAN=True, build-date=2025-11-26T19:44:28Z, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, ceph=True, GIT_REPO=https://github.com/ceph/ceph-container.git, maintainer=Guillaume Abrioux , distribution-scope=public, CEPH_POINT_RELEASE=, GIT_BRANCH=main, description=Red Hat Ceph Storage 7, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, vendor=Red Hat, Inc., com.redhat.component=rhceph-container, io.k8s.description=Red Hat Ceph Storage 7, io.openshift.tags=rhceph ceph, name=rhceph, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.openshift.expose-services=)
Dec 15 04:54:19 localhost unruffled_chaplygin[304469]: 167 167
Dec 15 04:54:19 localhost systemd[1]: libpod-f849a056b115e7bd6180f349d45d14ffbad572ec629483e234971c45f8b28eaf.scope: Deactivated successfully.
Dec 15 04:54:19 localhost podman[304454]: 2025-12-15 09:54:19.367944 +0000 UTC m=+0.170228626 container died f849a056b115e7bd6180f349d45d14ffbad572ec629483e234971c45f8b28eaf (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=unruffled_chaplygin, description=Red Hat Ceph Storage 7, release=1763362218, io.openshift.expose-services=, ceph=True, io.k8s.description=Red Hat Ceph Storage 7, architecture=x86_64, GIT_BRANCH=main, vcs-type=git, url=https://catalog.redhat.com/en/search?searchType=containers, version=7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.buildah.version=1.41.4, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, GIT_CLEAN=True, vendor=Red Hat, Inc., maintainer=Guillaume Abrioux , distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhceph ceph, CEPH_POINT_RELEASE=, com.redhat.component=rhceph-container, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, name=rhceph, build-date=2025-11-26T19:44:28Z, GIT_REPO=https://github.com/ceph/ceph-container.git, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., cpe=cpe:/a:redhat:enterprise_linux:9::appstream, RELEASE=main)
Dec 15 04:54:19 localhost podman[304474]: 2025-12-15 09:54:19.473282917 +0000 UTC m=+0.090438812 container remove f849a056b115e7bd6180f349d45d14ffbad572ec629483e234971c45f8b28eaf (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=unruffled_chaplygin, url=https://catalog.redhat.com/en/search?searchType=containers, maintainer=Guillaume Abrioux , version=7, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vcs-type=git, GIT_BRANCH=main, GIT_CLEAN=True, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, name=rhceph, description=Red Hat Ceph Storage 7, vendor=Red Hat, Inc., io.openshift.tags=rhceph ceph, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, com.redhat.component=rhceph-container, io.buildah.version=1.41.4, CEPH_POINT_RELEASE=, build-date=2025-11-26T19:44:28Z, RELEASE=main, io.openshift.expose-services=, ceph=True, release=1763362218, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, architecture=x86_64, io.k8s.description=Red Hat Ceph Storage 7, GIT_REPO=https://github.com/ceph/ceph-container.git)
Dec 15 04:54:19 localhost systemd[1]: libpod-conmon-f849a056b115e7bd6180f349d45d14ffbad572ec629483e234971c45f8b28eaf.scope: Deactivated successfully.
Dec 15 04:54:20 localhost ceph-mon[298913]: mon.np0005559462@2(peon).osd e88 e88: 6 total, 6 up, 6 in
Dec 15 04:54:20 localhost ceph-mon[298913]: mon.np0005559462@2(peon).osd e88 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 15 04:54:20 localhost ceph-mgr[292421]: mgr handle_mgr_map Activating!
Dec 15 04:54:20 localhost ceph-mgr[292421]: mgr handle_mgr_map I am now activating
Dec 15 04:54:20 localhost ceph-mon[298913]: mon.np0005559462@2(peon) e11 adding peer [v2:172.18.0.104:3300/0,v1:172.18.0.104:6789/0] to list of hints
Dec 15 04:54:20 localhost ceph-mgr[292421]: [balancer DEBUG root] setting log level based on debug_mgr: INFO (2/5)
Dec 15 04:54:20 localhost ceph-mgr[292421]: mgr load Constructed class from module: balancer
Dec 15 04:54:20 localhost ceph-mgr[292421]: [balancer INFO root] Starting
Dec 15 04:54:20 localhost ceph-mgr[292421]: [cephadm DEBUG root] setting log level based on debug_mgr: INFO (2/5)
Dec 15 04:54:20 localhost ceph-mgr[292421]: [balancer INFO root] Optimize plan auto_2025-12-15_09:54:20
Dec 15 04:54:20 localhost ceph-mgr[292421]: [balancer INFO root] Mode upmap, max misplaced 0.050000
Dec 15 04:54:20 localhost ceph-mgr[292421]: [balancer INFO root] Some PGs (1.000000) are unknown; try again later
Dec 15 04:54:20 localhost ceph-mon[298913]: mon.np0005559462@2(peon) e11 adding peer [v2:172.18.0.104:3300/0,v1:172.18.0.104:6789/0] to list of hints
Dec 15 04:54:20 localhost systemd-logind[763]: Session 69 logged out. Waiting for processes to exit.
Dec 15 04:54:20 localhost ceph-mgr[292421]: [cephadm WARNING root] removing stray HostCache host record np0005559460.localdomain.devices.0
Dec 15 04:54:20 localhost ceph-mgr[292421]: log_channel(cephadm) log [WRN] : removing stray HostCache host record np0005559460.localdomain.devices.0
Dec 15 04:54:20 localhost ceph-mgr[292421]: mgr load Constructed class from module: cephadm
Dec 15 04:54:20 localhost ceph-mgr[292421]: [crash DEBUG root] setting log level based on debug_mgr: INFO (2/5)
Dec 15 04:54:20 localhost ceph-mgr[292421]: mgr load Constructed class from module: crash
Dec 15 04:54:20 localhost ceph-mgr[292421]: [devicehealth DEBUG root] setting log level based on debug_mgr: INFO (2/5)
Dec 15 04:54:20 localhost ceph-mgr[292421]: mgr load Constructed class from module: devicehealth
Dec 15 04:54:20 localhost ceph-mgr[292421]: [devicehealth INFO root] Starting
Dec 15 04:54:20 localhost ceph-mgr[292421]: [iostat DEBUG root] setting log level based on debug_mgr: INFO (2/5)
Dec 15 04:54:20 localhost ceph-mgr[292421]: mgr load Constructed class from module: iostat
Dec 15 04:54:20 localhost ceph-mgr[292421]: [nfs DEBUG root] setting log level based on debug_mgr: INFO (2/5)
Dec 15 04:54:20 localhost ceph-mgr[292421]: mgr load Constructed class from module: nfs
Dec 15 04:54:20 localhost ceph-mgr[292421]: [orchestrator DEBUG root] setting log level based on debug_mgr: INFO (2/5)
Dec 15 04:54:20 localhost ceph-mgr[292421]: mgr load Constructed class from module: orchestrator
Dec 15 04:54:20 localhost ceph-mgr[292421]: [pg_autoscaler DEBUG root] setting log level based on debug_mgr: INFO (2/5)
Dec 15 04:54:20 localhost ceph-mgr[292421]: mgr load Constructed class from module: pg_autoscaler
Dec 15 04:54:20 localhost ceph-mgr[292421]: [pg_autoscaler INFO root] _maybe_adjust
Dec 15 04:54:20 localhost ceph-mgr[292421]: [progress DEBUG root] setting log level based on debug_mgr: INFO (2/5)
Dec 15 04:54:20 localhost ceph-mgr[292421]: mgr load Constructed class from module: progress
Dec 15 04:54:20 localhost ceph-mgr[292421]: [progress INFO root] Loading...
Dec 15 04:54:20 localhost ceph-mgr[292421]: [progress INFO root] Loaded [, , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , ] historic events
Dec 15 04:54:20 localhost ceph-mgr[292421]: [rbd_support DEBUG root] setting log level based on debug_mgr: INFO (2/5)
Dec 15 04:54:20 localhost ceph-mgr[292421]: [progress INFO root] Loaded OSDMap, ready.
Dec 15 04:54:20 localhost systemd[1]: var-lib-containers-storage-overlay-9ae4a4bd72c7e08f8a479e3134b185eb4415aab564c0e9ffca6877fcdba1f89a-merged.mount: Deactivated successfully.
Dec 15 04:54:20 localhost ceph-mgr[292421]: [rbd_support INFO root] recovery thread starting
Dec 15 04:54:20 localhost ceph-mgr[292421]: [rbd_support INFO root] starting setup
Dec 15 04:54:20 localhost ceph-mgr[292421]: mgr load Constructed class from module: rbd_support
Dec 15 04:54:20 localhost ceph-mgr[292421]: [restful DEBUG root] setting log level based on debug_mgr: INFO (2/5)
Dec 15 04:54:20 localhost ceph-mgr[292421]: mgr load Constructed class from module: restful
Dec 15 04:54:20 localhost ceph-mgr[292421]: [restful INFO root] server_addr: :: server_port: 8003
Dec 15 04:54:20 localhost ceph-mgr[292421]: [status DEBUG root] setting log level based on debug_mgr: INFO (2/5)
Dec 15 04:54:20 localhost ceph-mgr[292421]: mgr load Constructed class from module: status
Dec 15 04:54:20 localhost ceph-mgr[292421]: [telemetry DEBUG root] setting log level based on debug_mgr: INFO (2/5)
Dec 15 04:54:20 localhost ceph-mgr[292421]: [restful WARNING root] server not running: no certificate configured
Dec 15 04:54:20 localhost ceph-mgr[292421]: mgr load Constructed class from module: telemetry
Dec 15 04:54:20 localhost ceph-mgr[292421]: [volumes DEBUG root] setting log level based on debug_mgr: INFO (2/5)
Dec 15 04:54:20 localhost podman[304597]:
Dec 15 04:54:20 localhost ceph-mgr[292421]: [rbd_support INFO root] MirrorSnapshotScheduleHandler: load_schedules
Dec 15 04:54:20 localhost ceph-mgr[292421]: [rbd_support INFO root] load_schedules: vms, start_after=
Dec 15 04:54:20 localhost ceph-mgr[292421]: [rbd_support INFO root] load_schedules: volumes, start_after=
Dec 15 04:54:20 localhost podman[304597]: 2025-12-15 09:54:20.31954583 +0000 UTC m=+0.089164527 container create 913f24d573ddc097b05372ed070ad0f04c833ee535799b06bdcf97408c4bc5fb (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=busy_bartik, GIT_CLEAN=True, vcs-type=git, io.openshift.tags=rhceph ceph, CEPH_POINT_RELEASE=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, ceph=True, distribution-scope=public, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, description=Red Hat Ceph Storage 7, url=https://catalog.redhat.com/en/search?searchType=containers, build-date=2025-11-26T19:44:28Z, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, vendor=Red Hat, Inc., maintainer=Guillaume Abrioux , io.k8s.description=Red Hat Ceph Storage 7, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, GIT_REPO=https://github.com/ceph/ceph-container.git, architecture=x86_64, GIT_BRANCH=main, version=7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, name=rhceph, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, release=1763362218, RELEASE=main, io.buildah.version=1.41.4, com.redhat.component=rhceph-container)
Dec 15 04:54:20 localhost ceph-mgr[292421]: [rbd_support INFO root] load_schedules: images, start_after=
Dec 15 04:54:20 localhost ceph-mgr[292421]: [rbd_support INFO root] load_schedules: backups, start_after=
Dec 15 04:54:20 localhost ceph-mgr[292421]: [rbd_support INFO root] MirrorSnapshotScheduleHandler: starting
Dec 15 04:54:20 localhost ceph-mgr[292421]: [rbd_support INFO root] PerfHandler: starting
Dec 15 04:54:20 localhost ceph-mgr[292421]: [rbd_support INFO root] load_task_task: vms, start_after=
Dec 15 04:54:20 localhost ceph-mgr[292421]: [rbd_support INFO root] load_task_task: volumes, start_after=
Dec 15 04:54:20 localhost ceph-mgr[292421]: [rbd_support INFO root] load_task_task: images, start_after=
Dec 15 04:54:20 localhost ceph-mgr[292421]: [rbd_support INFO root] load_task_task: backups, start_after=
Dec 15 04:54:20 localhost ceph-mgr[292421]: [rbd_support INFO root] TaskHandler: starting
Dec 15 04:54:20 localhost ceph-mgr[292421]: [rbd_support INFO root] TrashPurgeScheduleHandler: load_schedules
Dec 15 04:54:20 localhost ceph-mgr[292421]: [rbd_support INFO root] load_schedules: vms, start_after=
Dec 15 04:54:20 localhost ceph-mgr[292421]: [rbd_support INFO root] load_schedules: volumes, start_after=
Dec 15 04:54:20 localhost ceph-mgr[292421]: [rbd_support INFO root] load_schedules: images, start_after=
Dec 15 04:54:20 localhost ceph-mgr[292421]: [rbd_support INFO root] load_schedules: backups, start_after=
Dec 15 04:54:20 localhost ceph-mgr[292421]: [volumes INFO volumes.fs.async_job] queuing job for volume 'cephfs'
Dec 15 04:54:20 localhost ceph-mgr[292421]: [volumes INFO volumes.fs.async_job] queuing job for volume 'cephfs'
Dec 15 04:54:20 localhost ceph-mgr[292421]: mgr load Constructed class from module: volumes
Dec 15 04:54:20 localhost ceph-mgr[292421]: [rbd_support INFO root] TrashPurgeScheduleHandler: starting
Dec 15 04:54:20 localhost ceph-mgr[292421]: [rbd_support INFO root] setup complete
Dec 15 04:54:20 localhost ceph-mgr[292421]: client.0 error registering admin socket command: (17) File exists
Dec 15 04:54:20 localhost ceph-mgr[292421]: client.0 error registering admin socket command: (17) File exists
Dec 15 04:54:20 localhost ceph-mgr[292421]: client.0 error registering admin socket command: (17) File exists
Dec 15 04:54:20 localhost ceph-mgr[292421]: client.0 error registering admin socket command: (17) File exists
Dec 15 04:54:20 localhost ceph-mgr[292421]: client.0 error registering admin socket command: (17) File exists
Dec 15 04:54:20 localhost systemd[1]: Started libpod-conmon-913f24d573ddc097b05372ed070ad0f04c833ee535799b06bdcf97408c4bc5fb.scope.
Dec 15 04:54:20 localhost ceph-bce17446-41b5-5408-a23e-0b011906b44a-mgr-np0005559462-fudvyx[292417]: 2025-12-15T09:54:20.374+0000 7ff167cca640 -1 client.0 error registering admin socket command: (17) File exists
Dec 15 04:54:20 localhost ceph-bce17446-41b5-5408-a23e-0b011906b44a-mgr-np0005559462-fudvyx[292417]: 2025-12-15T09:54:20.374+0000 7ff167cca640 -1 client.0 error registering admin socket command: (17) File exists
Dec 15 04:54:20 localhost ceph-bce17446-41b5-5408-a23e-0b011906b44a-mgr-np0005559462-fudvyx[292417]: 2025-12-15T09:54:20.374+0000 7ff167cca640 -1 client.0 error registering admin socket command: (17) File exists
Dec 15 04:54:20 localhost ceph-bce17446-41b5-5408-a23e-0b011906b44a-mgr-np0005559462-fudvyx[292417]: 2025-12-15T09:54:20.374+0000 7ff167cca640 -1 client.0 error registering admin socket command: (17) File exists
Dec 15 04:54:20 localhost ceph-bce17446-41b5-5408-a23e-0b011906b44a-mgr-np0005559462-fudvyx[292417]: 2025-12-15T09:54:20.374+0000 7ff167cca640 -1 client.0 error registering admin socket command: (17) File exists
Dec 15 04:54:20 localhost podman[304597]: 2025-12-15 09:54:20.278867726 +0000 UTC m=+0.048486463 image pull registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Dec 15 04:54:20 localhost ceph-mgr[292421]: client.0 error registering admin socket command: (17) File exists
Dec 15 04:54:20 localhost ceph-bce17446-41b5-5408-a23e-0b011906b44a-mgr-np0005559462-fudvyx[292417]: 2025-12-15T09:54:20.380+0000 7ff1644c3640 -1 client.0 error registering admin socket command: (17) File exists
Dec 15 04:54:20 localhost ceph-mgr[292421]: client.0 error registering admin socket command: (17) File exists
Dec 15 04:54:20 localhost ceph-bce17446-41b5-5408-a23e-0b011906b44a-mgr-np0005559462-fudvyx[292417]: 2025-12-15T09:54:20.380+0000 7ff1644c3640 -1 client.0 error registering admin socket command: (17) File exists
Dec 15 04:54:20 localhost ceph-mgr[292421]: client.0 error registering admin socket command: (17) File exists
Dec 15 04:54:20 localhost ceph-bce17446-41b5-5408-a23e-0b011906b44a-mgr-np0005559462-fudvyx[292417]: 2025-12-15T09:54:20.380+0000 7ff1644c3640 -1 client.0 error registering admin socket command: (17) File exists
Dec 15 04:54:20 localhost ceph-bce17446-41b5-5408-a23e-0b011906b44a-mgr-np0005559462-fudvyx[292417]: 2025-12-15T09:54:20.380+0000 7ff1644c3640 -1 client.0 error registering admin socket command: (17) File exists
Dec 15 04:54:20 localhost ceph-mgr[292421]: client.0 error registering admin socket command: (17) File exists
Dec 15 04:54:20 localhost ceph-mgr[292421]: client.0 error registering admin socket command: (17) File exists
Dec 15 04:54:20 localhost ceph-bce17446-41b5-5408-a23e-0b011906b44a-mgr-np0005559462-fudvyx[292417]: 2025-12-15T09:54:20.380+0000 7ff1644c3640 -1 client.0 error registering admin socket command: (17) File exists
Dec 15 04:54:20 localhost systemd[1]: Started libcrun container.
Dec 15 04:54:20 localhost podman[304597]: 2025-12-15 09:54:20.409339623 +0000 UTC m=+0.178958300 container init 913f24d573ddc097b05372ed070ad0f04c833ee535799b06bdcf97408c4bc5fb (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=busy_bartik, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.k8s.description=Red Hat Ceph Storage 7, version=7, build-date=2025-11-26T19:44:28Z, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, GIT_BRANCH=main, io.openshift.tags=rhceph ceph, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, release=1763362218, io.buildah.version=1.41.4, name=rhceph, com.redhat.component=rhceph-container, description=Red Hat Ceph Storage 7, url=https://catalog.redhat.com/en/search?searchType=containers, GIT_CLEAN=True, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, maintainer=Guillaume Abrioux , distribution-scope=public, io.openshift.expose-services=, ceph=True, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, CEPH_POINT_RELEASE=, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., RELEASE=main, vendor=Red Hat, Inc., vcs-type=git, GIT_REPO=https://github.com/ceph/ceph-container.git)
Dec 15 04:54:20 localhost podman[304597]: 2025-12-15 09:54:20.42178125 +0000 UTC m=+0.191399947 container start 913f24d573ddc097b05372ed070ad0f04c833ee535799b06bdcf97408c4bc5fb (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=busy_bartik, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, io.buildah.version=1.41.4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat Ceph Storage 7, release=1763362218, io.openshift.expose-services=, vendor=Red Hat, Inc., GIT_BRANCH=main, maintainer=Guillaume Abrioux , build-date=2025-11-26T19:44:28Z, CEPH_POINT_RELEASE=, RELEASE=main, vcs-type=git, com.redhat.component=rhceph-container, architecture=x86_64, version=7, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, GIT_REPO=https://github.com/ceph/ceph-container.git, description=Red Hat Ceph Storage 7, ceph=True, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, url=https://catalog.redhat.com/en/search?searchType=containers, distribution-scope=public, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, name=rhceph, GIT_CLEAN=True, io.openshift.tags=rhceph ceph, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9)
Dec 15 04:54:20 localhost podman[304597]: 2025-12-15 09:54:20.422070098 +0000 UTC m=+0.191688765 container attach 913f24d573ddc097b05372ed070ad0f04c833ee535799b06bdcf97408c4bc5fb (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=busy_bartik, io.buildah.version=1.41.4, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, CEPH_POINT_RELEASE=, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1763362218, io.openshift.expose-services=, GIT_REPO=https://github.com/ceph/ceph-container.git, build-date=2025-11-26T19:44:28Z, io.openshift.tags=rhceph ceph, vcs-type=git, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, com.redhat.component=rhceph-container, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, RELEASE=main, name=rhceph, maintainer=Guillaume Abrioux , GIT_BRANCH=main, io.k8s.description=Red Hat Ceph Storage 7, distribution-scope=public, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:enterprise_linux:9::appstream, url=https://catalog.redhat.com/en/search?searchType=containers, description=Red Hat Ceph Storage 7, GIT_CLEAN=True, version=7, ceph=True)
Dec 15 04:54:20 localhost busy_bartik[304700]: 167 167
Dec 15 04:54:20 localhost systemd[1]: libpod-913f24d573ddc097b05372ed070ad0f04c833ee535799b06bdcf97408c4bc5fb.scope: Deactivated successfully.
Dec 15 04:54:20 localhost podman[304597]: 2025-12-15 09:54:20.426405639 +0000 UTC m=+0.196024306 container died 913f24d573ddc097b05372ed070ad0f04c833ee535799b06bdcf97408c4bc5fb (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=busy_bartik, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, GIT_BRANCH=main, maintainer=Guillaume Abrioux , description=Red Hat Ceph Storage 7, io.openshift.expose-services=, io.k8s.description=Red Hat Ceph Storage 7, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, release=1763362218, vcs-type=git, ceph=True, version=7, build-date=2025-11-26T19:44:28Z, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., CEPH_POINT_RELEASE=, architecture=x86_64, vendor=Red Hat, Inc., distribution-scope=public, url=https://catalog.redhat.com/en/search?searchType=containers, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, name=rhceph, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, com.redhat.component=rhceph-container, RELEASE=main, io.openshift.tags=rhceph ceph, GIT_REPO=https://github.com/ceph/ceph-container.git, GIT_CLEAN=True, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, io.buildah.version=1.41.4)
Dec 15 04:54:20 localhost sshd[304716]: main: sshd: ssh-rsa algorithm is disabled
Dec 15 04:54:20 localhost podman[304710]: 2025-12-15 09:54:20.502043047 +0000 UTC m=+0.069816297 container remove 913f24d573ddc097b05372ed070ad0f04c833ee535799b06bdcf97408c4bc5fb (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=busy_bartik, version=7, CEPH_POINT_RELEASE=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, io.buildah.version=1.41.4, vcs-type=git, io.openshift.expose-services=, com.redhat.component=rhceph-container, architecture=x86_64, name=rhceph, GIT_BRANCH=main, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, io.openshift.tags=rhceph ceph, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, RELEASE=main, GIT_REPO=https://github.com/ceph/ceph-container.git, release=1763362218, url=https://catalog.redhat.com/en/search?searchType=containers, description=Red Hat Ceph Storage 7, maintainer=Guillaume Abrioux , summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., build-date=2025-11-26T19:44:28Z, vendor=Red Hat, Inc., GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, ceph=True, GIT_CLEAN=True, io.k8s.description=Red Hat Ceph Storage 7, distribution-scope=public)
Dec 15 04:54:20 localhost systemd[1]: libpod-conmon-913f24d573ddc097b05372ed070ad0f04c833ee535799b06bdcf97408c4bc5fb.scope: Deactivated successfully.
Dec 15 04:54:20 localhost systemd-logind[763]: New session 70 of user ceph-admin.
Dec 15 04:54:20 localhost systemd[1]: Started Session 70 of User ceph-admin.
Dec 15 04:54:20 localhost systemd[1]: session-69.scope: Deactivated successfully.
Dec 15 04:54:20 localhost systemd[1]: session-69.scope: Consumed 19.592s CPU time.
Dec 15 04:54:20 localhost systemd-logind[763]: Removed session 69.
Dec 15 04:54:20 localhost ceph-mon[298913]: from='mgr.26593 172.18.0.108:0/1692918800' entity='mgr.np0005559464.aomnqe'
Dec 15 04:54:20 localhost ceph-mon[298913]: from='mgr.26593 172.18.0.108:0/1692918800' entity='mgr.np0005559464.aomnqe'
Dec 15 04:54:20 localhost ceph-mon[298913]: from='mgr.26593 172.18.0.108:0/1692918800' entity='mgr.np0005559464.aomnqe'
Dec 15 04:54:20 localhost ceph-mon[298913]: from='mgr.26593 172.18.0.108:0/1692918800' entity='mgr.np0005559464.aomnqe'
Dec 15 04:54:20 localhost ceph-mon[298913]: from='mgr.26593 172.18.0.108:0/1692918800' entity='mgr.np0005559464.aomnqe' cmd={"prefix": "auth get", "entity": "osd.3"} : dispatch
Dec 15 04:54:20 localhost ceph-mon[298913]: from='mgr.26593 172.18.0.108:0/1692918800' entity='mgr.np0005559464.aomnqe'
Dec 15 04:54:20 localhost ceph-mon[298913]: from='client.? 172.18.0.200:0/2577178447' entity='client.admin' cmd={"prefix": "mgr fail"} : dispatch
Dec 15 04:54:20 localhost ceph-mon[298913]: from='client.? ' entity='client.admin' cmd={"prefix": "mgr fail"} : dispatch
Dec 15 04:54:20 localhost ceph-mon[298913]: Activating manager daemon np0005559462.fudvyx
Dec 15 04:54:20 localhost ceph-mon[298913]: from='client.? ' entity='client.admin' cmd='[{"prefix": "mgr fail"}]': finished
Dec 15 04:54:20 localhost ceph-mon[298913]: Manager daemon np0005559462.fudvyx is now available
Dec 15 04:54:20 localhost ceph-mon[298913]: from='mgr.26879 172.18.0.106:0/2534740212' entity='mgr.np0005559462.fudvyx' cmd={"prefix":"config-key del","key":"mgr/cephadm/host.np0005559460.localdomain.devices.0"} : dispatch
Dec 15 04:54:20 localhost ceph-mon[298913]: from='mgr.26879 172.18.0.106:0/2534740212' entity='mgr.np0005559462.fudvyx' cmd='[{"prefix":"config-key del","key":"mgr/cephadm/host.np0005559460.localdomain.devices.0"}]': finished
Dec 15 04:54:20 localhost ceph-mon[298913]: from='mgr.26879 172.18.0.106:0/2534740212' entity='mgr.np0005559462.fudvyx' cmd={"prefix":"config-key del","key":"mgr/cephadm/host.np0005559460.localdomain.devices.0"} : dispatch
Dec 15 04:54:20 localhost ceph-mon[298913]: from='mgr.26879 172.18.0.106:0/2534740212' entity='mgr.np0005559462.fudvyx' cmd='[{"prefix":"config-key del","key":"mgr/cephadm/host.np0005559460.localdomain.devices.0"}]': finished
Dec 15 04:54:20 localhost ceph-mon[298913]: from='mgr.26879 172.18.0.106:0/2534740212' entity='mgr.np0005559462.fudvyx' cmd={"prefix":"config rm","who":"mgr","name":"mgr/rbd_support/np0005559462.fudvyx/mirror_snapshot_schedule"} : dispatch
Dec 15 04:54:20 localhost ceph-mon[298913]: from='mgr.26879 172.18.0.106:0/2534740212' entity='mgr.np0005559462.fudvyx' cmd={"prefix":"config rm","who":"mgr","name":"mgr/rbd_support/np0005559462.fudvyx/trash_purge_schedule"} : dispatch
Dec 15 04:54:20 localhost nova_compute[286344]: 2025-12-15 09:54:20.783 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 15 04:54:20 localhost nova_compute[286344]: 2025-12-15 09:54:20.786 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 15 04:54:21 localhost ceph-mgr[292421]: log_channel(cluster) log [DBG] : pgmap v3: 177 pgs: 177 active+clean; 104 MiB data, 548 MiB used, 41 GiB / 42 GiB avail
Dec 15 04:54:21 localhost systemd[1]: tmp-crun.SpH5lZ.mount: Deactivated successfully.
Dec 15 04:54:21 localhost systemd[1]: var-lib-containers-storage-overlay-80ee44dec4a55ee90cb283a4ac97be4fef9a4ae65763f5431624065cc548f1bc-merged.mount: Deactivated successfully.
Dec 15 04:54:21 localhost podman[304845]: 2025-12-15 09:54:21.56313892 +0000 UTC m=+0.092279974 container exec 8dcda56b365b42dc8758aab77a9ec80db304780e449052738f7e4e648ae1ecaf (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-bce17446-41b5-5408-a23e-0b011906b44a-crash-np0005559462, GIT_CLEAN=True, RELEASE=main, com.redhat.component=rhceph-container, maintainer=Guillaume Abrioux , name=rhceph, io.openshift.tags=rhceph ceph, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_BRANCH=main, distribution-scope=public, vcs-type=git, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., description=Red Hat Ceph Storage 7, CEPH_POINT_RELEASE=, io.openshift.expose-services=, io.k8s.description=Red Hat Ceph Storage 7, build-date=2025-11-26T19:44:28Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1763362218, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, io.buildah.version=1.41.4, ceph=True, GIT_REPO=https://github.com/ceph/ceph-container.git, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, architecture=x86_64, url=https://catalog.redhat.com/en/search?searchType=containers, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, version=7)
Dec 15 04:54:21 localhost podman[304845]: 2025-12-15 09:54:21.691837208 +0000 UTC m=+0.220978302 container exec_died 8dcda56b365b42dc8758aab77a9ec80db304780e449052738f7e4e648ae1ecaf (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-bce17446-41b5-5408-a23e-0b011906b44a-crash-np0005559462, io.openshift.tags=rhceph ceph, GIT_CLEAN=True, ceph=True, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhceph, distribution-scope=public, io.k8s.description=Red Hat Ceph Storage 7, io.buildah.version=1.41.4, io.openshift.expose-services=, architecture=x86_64, vendor=Red Hat, Inc., maintainer=Guillaume Abrioux , org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, build-date=2025-11-26T19:44:28Z, RELEASE=main, CEPH_POINT_RELEASE=, vcs-type=git, com.redhat.component=rhceph-container, release=1763362218, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, GIT_BRANCH=main, GIT_REPO=https://github.com/ceph/ceph-container.git, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, description=Red Hat Ceph Storage 7, url=https://catalog.redhat.com/en/search?searchType=containers, version=7)
Dec 15 04:54:21 localhost ceph-mgr[292421]: [cephadm INFO cherrypy.error] [15/Dec/2025:09:54:21] ENGINE Bus STARTING
Dec 15 04:54:21 localhost ceph-mgr[292421]: log_channel(cephadm) log [INF] : [15/Dec/2025:09:54:21] ENGINE Bus STARTING
Dec 15 04:54:21 localhost ceph-mgr[292421]: [cephadm INFO cherrypy.error] [15/Dec/2025:09:54:21] ENGINE Serving on https://172.18.0.106:7150
Dec 15 04:54:21 localhost ceph-mgr[292421]: log_channel(cephadm) log [INF] : [15/Dec/2025:09:54:21] ENGINE Serving on https://172.18.0.106:7150
Dec 15 04:54:21 localhost ceph-mgr[292421]: [cephadm INFO cherrypy.error] [15/Dec/2025:09:54:21] ENGINE Client ('172.18.0.106', 56084) lost — peer dropped the TLS connection suddenly, during handshake: (6, 'TLS/SSL connection has been closed (EOF) (_ssl.c:1147)')
Dec 15
04:54:21 localhost ceph-mgr[292421]: log_channel(cephadm) log [INF] : [15/Dec/2025:09:54:21] ENGINE Client ('172.18.0.106', 56084) lost — peer dropped the TLS connection suddenly, during handshake: (6, 'TLS/SSL connection has been closed (EOF) (_ssl.c:1147)') Dec 15 04:54:21 localhost ceph-mgr[292421]: [cephadm INFO cherrypy.error] [15/Dec/2025:09:54:21] ENGINE Serving on http://172.18.0.106:8765 Dec 15 04:54:21 localhost ceph-mgr[292421]: log_channel(cephadm) log [INF] : [15/Dec/2025:09:54:21] ENGINE Serving on http://172.18.0.106:8765 Dec 15 04:54:21 localhost ceph-mgr[292421]: [cephadm INFO cherrypy.error] [15/Dec/2025:09:54:21] ENGINE Bus STARTED Dec 15 04:54:21 localhost ceph-mgr[292421]: log_channel(cephadm) log [INF] : [15/Dec/2025:09:54:21] ENGINE Bus STARTED Dec 15 04:54:22 localhost ceph-mgr[292421]: log_channel(cluster) log [DBG] : pgmap v4: 177 pgs: 177 active+clean; 104 MiB data, 548 MiB used, 41 GiB / 42 GiB avail Dec 15 04:54:22 localhost ceph-mon[298913]: mon.np0005559462@2(peon) e11 adding peer [v2:172.18.0.104:3300/0,v1:172.18.0.104:6789/0] to list of hints Dec 15 04:54:22 localhost ceph-mon[298913]: removing stray HostCache host record np0005559460.localdomain.devices.0 Dec 15 04:54:22 localhost ceph-mgr[292421]: [devicehealth INFO root] Check health Dec 15 04:54:23 localhost ceph-mon[298913]: [15/Dec/2025:09:54:21] ENGINE Bus STARTING Dec 15 04:54:23 localhost ceph-mon[298913]: [15/Dec/2025:09:54:21] ENGINE Serving on https://172.18.0.106:7150 Dec 15 04:54:23 localhost ceph-mon[298913]: [15/Dec/2025:09:54:21] ENGINE Client ('172.18.0.106', 56084) lost — peer dropped the TLS connection suddenly, during handshake: (6, 'TLS/SSL connection has been closed (EOF) (_ssl.c:1147)') Dec 15 04:54:23 localhost ceph-mon[298913]: [15/Dec/2025:09:54:21] ENGINE Serving on http://172.18.0.106:8765 Dec 15 04:54:23 localhost ceph-mon[298913]: [15/Dec/2025:09:54:21] ENGINE Bus STARTED Dec 15 04:54:23 localhost ceph-mon[298913]: from='mgr.26879 
172.18.0.106:0/2534740212' entity='mgr.np0005559462.fudvyx' Dec 15 04:54:23 localhost ceph-mon[298913]: from='mgr.26879 172.18.0.106:0/2534740212' entity='mgr.np0005559462.fudvyx' Dec 15 04:54:23 localhost ceph-mon[298913]: from='mgr.26879 172.18.0.106:0/2534740212' entity='mgr.np0005559462.fudvyx' Dec 15 04:54:23 localhost ceph-mon[298913]: from='mgr.26879 172.18.0.106:0/2534740212' entity='mgr.np0005559462.fudvyx' Dec 15 04:54:23 localhost ceph-mon[298913]: from='mgr.26879 172.18.0.106:0/2534740212' entity='mgr.np0005559462.fudvyx' Dec 15 04:54:23 localhost ceph-mon[298913]: from='mgr.26879 172.18.0.106:0/2534740212' entity='mgr.np0005559462.fudvyx' Dec 15 04:54:23 localhost ceph-mon[298913]: from='mgr.26879 172.18.0.106:0/2534740212' entity='mgr.np0005559462.fudvyx' Dec 15 04:54:23 localhost ceph-mon[298913]: from='mgr.26879 172.18.0.106:0/2534740212' entity='mgr.np0005559462.fudvyx' Dec 15 04:54:23 localhost systemd[1]: Started /usr/bin/podman healthcheck run 67ee72fcd99c896f79e8807764b1d0f04e7d45927d06e306c0986394e8ed42a0. Dec 15 04:54:23 localhost systemd[1]: Started /usr/bin/podman healthcheck run 9f0f481c937606298203797a71924b7614a5d9e3f4a8afd837cfa5b9602aedeb. Dec 15 04:54:23 localhost systemd[1]: Started /usr/bin/podman healthcheck run b3d8483ade539500e228c2af9c0c3e647febb5ec7903610a83aea32631007d4a. 
Dec 15 04:54:23 localhost podman[305086]: 2025-12-15 09:54:23.44983227 +0000 UTC m=+0.081278837 container health_status b3d8483ade539500e228c2af9c0c3e647febb5ec7903610a83aea32631007d4a (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, managed_by=edpm_ansible, org.label-schema.build-date=20251202, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '07f3af15d79276418f54a8dde2fc251064527c3571430e14af77aaa2c01f1193-cb1e9478634a00c8fb5ad9cd7fcd38c7b2a1a85187dfaa618bc715c24c2b15fb'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ceilometer_agent_compute, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, 
container_name=ceilometer_agent_compute) Dec 15 04:54:23 localhost podman[305086]: 2025-12-15 09:54:23.461579887 +0000 UTC m=+0.093026434 container exec_died b3d8483ade539500e228c2af9c0c3e647febb5ec7903610a83aea32631007d4a (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '07f3af15d79276418f54a8dde2fc251064527c3571430e14af77aaa2c01f1193-cb1e9478634a00c8fb5ad9cd7fcd38c7b2a1a85187dfaa618bc715c24c2b15fb'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, config_id=ceilometer_agent_compute, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, 
container_name=ceilometer_agent_compute, io.buildah.version=1.41.3) Dec 15 04:54:23 localhost systemd[1]: b3d8483ade539500e228c2af9c0c3e647febb5ec7903610a83aea32631007d4a.service: Deactivated successfully. Dec 15 04:54:23 localhost podman[305078]: 2025-12-15 09:54:23.465520836 +0000 UTC m=+0.088270951 container health_status 67ee72fcd99c896f79e8807764b1d0f04e7d45927d06e306c0986394e8ed42a0 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, managed_by=edpm_ansible, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors ) Dec 15 04:54:23 localhost podman[305078]: 2025-12-15 09:54:23.557189292 +0000 UTC m=+0.179939457 container exec_died 67ee72fcd99c896f79e8807764b1d0f04e7d45927d06e306c0986394e8ed42a0 
(image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, managed_by=edpm_ansible, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors ) Dec 15 04:54:23 localhost systemd[1]: 67ee72fcd99c896f79e8807764b1d0f04e7d45927d06e306c0986394e8ed42a0.service: Deactivated successfully. Dec 15 04:54:23 localhost systemd[1]: tmp-crun.H9xyAH.mount: Deactivated successfully. 
Dec 15 04:54:23 localhost podman[305080]: 2025-12-15 09:54:23.620673422 +0000 UTC m=+0.248887190 container health_status 9f0f481c937606298203797a71924b7614a5d9e3f4a8afd837cfa5b9602aedeb (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=multipathd, io.buildah.version=1.41.3, managed_by=edpm_ansible, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, container_name=multipathd, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0) Dec 15 04:54:23 localhost podman[305080]: 2025-12-15 09:54:23.634362143 +0000 UTC m=+0.262575921 container exec_died 
9f0f481c937606298203797a71924b7614a5d9e3f4a8afd837cfa5b9602aedeb (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_managed=true) Dec 15 04:54:23 localhost systemd[1]: 9f0f481c937606298203797a71924b7614a5d9e3f4a8afd837cfa5b9602aedeb.service: Deactivated successfully. 
Dec 15 04:54:23 localhost ceph-mgr[292421]: [cephadm INFO root] Adjusting osd_memory_target on np0005559463.localdomain to 836.6M Dec 15 04:54:23 localhost ceph-mgr[292421]: log_channel(cephadm) log [INF] : Adjusting osd_memory_target on np0005559463.localdomain to 836.6M Dec 15 04:54:23 localhost ceph-mgr[292421]: [cephadm INFO root] Adjusting osd_memory_target on np0005559464.localdomain to 836.6M Dec 15 04:54:23 localhost ceph-mgr[292421]: log_channel(cephadm) log [INF] : Adjusting osd_memory_target on np0005559464.localdomain to 836.6M Dec 15 04:54:23 localhost ceph-mgr[292421]: [cephadm WARNING cephadm.serve] Unable to set osd_memory_target on np0005559463.localdomain to 877246668: error parsing value: Value '877246668' is below minimum 939524096 Dec 15 04:54:23 localhost ceph-mgr[292421]: log_channel(cephadm) log [WRN] : Unable to set osd_memory_target on np0005559463.localdomain to 877246668: error parsing value: Value '877246668' is below minimum 939524096 Dec 15 04:54:23 localhost ceph-mgr[292421]: [cephadm INFO root] Adjusting osd_memory_target on np0005559462.localdomain to 836.6M Dec 15 04:54:23 localhost ceph-mgr[292421]: log_channel(cephadm) log [INF] : Adjusting osd_memory_target on np0005559462.localdomain to 836.6M Dec 15 04:54:23 localhost ceph-mgr[292421]: [cephadm WARNING cephadm.serve] Unable to set osd_memory_target on np0005559464.localdomain to 877246668: error parsing value: Value '877246668' is below minimum 939524096 Dec 15 04:54:23 localhost ceph-mgr[292421]: log_channel(cephadm) log [WRN] : Unable to set osd_memory_target on np0005559464.localdomain to 877246668: error parsing value: Value '877246668' is below minimum 939524096 Dec 15 04:54:23 localhost ceph-mgr[292421]: [cephadm WARNING cephadm.serve] Unable to set osd_memory_target on np0005559462.localdomain to 877243801: error parsing value: Value '877243801' is below minimum 939524096 Dec 15 04:54:23 localhost ceph-mgr[292421]: log_channel(cephadm) log [WRN] : Unable to set 
osd_memory_target on np0005559462.localdomain to 877243801: error parsing value: Value '877243801' is below minimum 939524096 Dec 15 04:54:23 localhost ceph-mgr[292421]: [cephadm INFO cephadm.serve] Updating np0005559461.localdomain:/etc/ceph/ceph.conf Dec 15 04:54:23 localhost ceph-mgr[292421]: log_channel(cephadm) log [INF] : Updating np0005559461.localdomain:/etc/ceph/ceph.conf Dec 15 04:54:23 localhost ceph-mgr[292421]: [cephadm INFO cephadm.serve] Updating np0005559462.localdomain:/etc/ceph/ceph.conf Dec 15 04:54:23 localhost ceph-mgr[292421]: [cephadm INFO cephadm.serve] Updating np0005559463.localdomain:/etc/ceph/ceph.conf Dec 15 04:54:23 localhost ceph-mgr[292421]: log_channel(cephadm) log [INF] : Updating np0005559462.localdomain:/etc/ceph/ceph.conf Dec 15 04:54:23 localhost ceph-mgr[292421]: [cephadm INFO cephadm.serve] Updating np0005559464.localdomain:/etc/ceph/ceph.conf Dec 15 04:54:23 localhost ceph-mgr[292421]: log_channel(cephadm) log [INF] : Updating np0005559463.localdomain:/etc/ceph/ceph.conf Dec 15 04:54:23 localhost ceph-mgr[292421]: log_channel(cephadm) log [INF] : Updating np0005559464.localdomain:/etc/ceph/ceph.conf Dec 15 04:54:24 localhost ceph-mgr[292421]: log_channel(cluster) log [DBG] : pgmap v5: 177 pgs: 177 active+clean; 104 MiB data, 548 MiB used, 41 GiB / 42 GiB avail Dec 15 04:54:24 localhost ceph-mon[298913]: mon.np0005559462@2(peon) e11 adding peer [v2:172.18.0.104:3300/0,v1:172.18.0.104:6789/0] to list of hints Dec 15 04:54:24 localhost ceph-mon[298913]: mon.np0005559462@2(peon) e11 adding peer [v2:172.18.0.104:3300/0,v1:172.18.0.104:6789/0] to list of hints Dec 15 04:54:24 localhost ceph-mgr[292421]: mgr.server handle_open ignoring open from mon.np0005559463 172.18.0.107:0/2300540652; not ready for session (expect reconnect) Dec 15 04:54:24 localhost ceph-mgr[292421]: mgr finish mon failed to return metadata for mon.np0005559463: (2) No such file or directory Dec 15 04:54:24 localhost ceph-mon[298913]: from='mgr.26879 
172.18.0.106:0/2534740212' entity='mgr.np0005559462.fudvyx' Dec 15 04:54:24 localhost ceph-mon[298913]: from='mgr.26879 172.18.0.106:0/2534740212' entity='mgr.np0005559462.fudvyx' Dec 15 04:54:24 localhost ceph-mon[298913]: from='mgr.26879 172.18.0.106:0/2534740212' entity='mgr.np0005559462.fudvyx' Dec 15 04:54:24 localhost ceph-mon[298913]: from='mgr.26879 172.18.0.106:0/2534740212' entity='mgr.np0005559462.fudvyx' Dec 15 04:54:24 localhost ceph-mon[298913]: from='mgr.26879 172.18.0.106:0/2534740212' entity='mgr.np0005559462.fudvyx' cmd={"prefix": "config rm", "who": "osd.2", "name": "osd_memory_target"} : dispatch Dec 15 04:54:24 localhost ceph-mon[298913]: from='mgr.26879 172.18.0.106:0/2534740212' entity='mgr.np0005559462.fudvyx' Dec 15 04:54:24 localhost ceph-mon[298913]: from='mgr.26879 172.18.0.106:0/2534740212' entity='mgr.np0005559462.fudvyx' cmd={"prefix": "config rm", "who": "osd/host:np0005559461", "name": "osd_memory_target"} : dispatch Dec 15 04:54:24 localhost ceph-mon[298913]: from='mgr.26879 172.18.0.106:0/2534740212' entity='mgr.np0005559462.fudvyx' Dec 15 04:54:24 localhost ceph-mon[298913]: from='mgr.26879 172.18.0.106:0/2534740212' entity='mgr.np0005559462.fudvyx' Dec 15 04:54:24 localhost ceph-mon[298913]: from='mgr.26879 172.18.0.106:0/2534740212' entity='mgr.np0005559462.fudvyx' cmd={"prefix": "config rm", "who": "osd.1", "name": "osd_memory_target"} : dispatch Dec 15 04:54:24 localhost ceph-mon[298913]: from='mgr.26879 172.18.0.106:0/2534740212' entity='mgr.np0005559462.fudvyx' cmd={"prefix": "config rm", "who": "osd.5", "name": "osd_memory_target"} : dispatch Dec 15 04:54:24 localhost ceph-mon[298913]: from='mgr.26879 172.18.0.106:0/2534740212' entity='mgr.np0005559462.fudvyx' Dec 15 04:54:24 localhost ceph-mon[298913]: from='mgr.26879 172.18.0.106:0/2534740212' entity='mgr.np0005559462.fudvyx' cmd={"prefix": "config rm", "who": "osd.0", "name": "osd_memory_target"} : dispatch Dec 15 04:54:24 localhost ceph-mon[298913]: from='mgr.26879 
172.18.0.106:0/2534740212' entity='mgr.np0005559462.fudvyx' cmd={"prefix": "config rm", "who": "osd.4", "name": "osd_memory_target"} : dispatch Dec 15 04:54:24 localhost ceph-mon[298913]: from='mgr.26879 172.18.0.106:0/2534740212' entity='mgr.np0005559462.fudvyx' cmd={"prefix": "config rm", "who": "osd.3", "name": "osd_memory_target"} : dispatch Dec 15 04:54:24 localhost ceph-mon[298913]: Adjusting osd_memory_target on np0005559463.localdomain to 836.6M Dec 15 04:54:24 localhost ceph-mon[298913]: Adjusting osd_memory_target on np0005559464.localdomain to 836.6M Dec 15 04:54:24 localhost ceph-mon[298913]: from='mgr.26879 172.18.0.106:0/2534740212' entity='mgr.np0005559462.fudvyx' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch Dec 15 04:54:24 localhost ceph-mon[298913]: Unable to set osd_memory_target on np0005559463.localdomain to 877246668: error parsing value: Value '877246668' is below minimum 939524096 Dec 15 04:54:24 localhost ceph-mon[298913]: Adjusting osd_memory_target on np0005559462.localdomain to 836.6M Dec 15 04:54:24 localhost ceph-mon[298913]: Unable to set osd_memory_target on np0005559464.localdomain to 877246668: error parsing value: Value '877246668' is below minimum 939524096 Dec 15 04:54:24 localhost ceph-mon[298913]: Unable to set osd_memory_target on np0005559462.localdomain to 877243801: error parsing value: Value '877243801' is below minimum 939524096 Dec 15 04:54:24 localhost ceph-mon[298913]: Updating np0005559461.localdomain:/etc/ceph/ceph.conf Dec 15 04:54:24 localhost ceph-mon[298913]: Updating np0005559462.localdomain:/etc/ceph/ceph.conf Dec 15 04:54:24 localhost ceph-mon[298913]: Updating np0005559463.localdomain:/etc/ceph/ceph.conf Dec 15 04:54:24 localhost ceph-mon[298913]: Updating np0005559464.localdomain:/etc/ceph/ceph.conf Dec 15 04:54:24 localhost ceph-mgr[292421]: [cephadm INFO cephadm.serve] Updating np0005559461.localdomain:/var/lib/ceph/bce17446-41b5-5408-a23e-0b011906b44a/config/ceph.conf Dec 15 04:54:24 
localhost ceph-mgr[292421]: log_channel(cephadm) log [INF] : Updating np0005559461.localdomain:/var/lib/ceph/bce17446-41b5-5408-a23e-0b011906b44a/config/ceph.conf Dec 15 04:54:24 localhost ceph-mgr[292421]: [cephadm INFO cephadm.serve] Updating np0005559464.localdomain:/var/lib/ceph/bce17446-41b5-5408-a23e-0b011906b44a/config/ceph.conf Dec 15 04:54:24 localhost ceph-mgr[292421]: log_channel(cephadm) log [INF] : Updating np0005559464.localdomain:/var/lib/ceph/bce17446-41b5-5408-a23e-0b011906b44a/config/ceph.conf Dec 15 04:54:24 localhost ceph-mgr[292421]: [cephadm INFO cephadm.serve] Updating np0005559463.localdomain:/var/lib/ceph/bce17446-41b5-5408-a23e-0b011906b44a/config/ceph.conf Dec 15 04:54:24 localhost ceph-mgr[292421]: log_channel(cephadm) log [INF] : Updating np0005559463.localdomain:/var/lib/ceph/bce17446-41b5-5408-a23e-0b011906b44a/config/ceph.conf Dec 15 04:54:24 localhost ceph-mgr[292421]: [cephadm INFO cephadm.serve] Updating np0005559462.localdomain:/var/lib/ceph/bce17446-41b5-5408-a23e-0b011906b44a/config/ceph.conf Dec 15 04:54:24 localhost ceph-mgr[292421]: log_channel(cephadm) log [INF] : Updating np0005559462.localdomain:/var/lib/ceph/bce17446-41b5-5408-a23e-0b011906b44a/config/ceph.conf Dec 15 04:54:25 localhost ceph-mon[298913]: mon.np0005559462@2(peon).osd e88 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Dec 15 04:54:25 localhost ceph-mgr[292421]: [cephadm INFO cephadm.serve] Updating np0005559461.localdomain:/etc/ceph/ceph.client.admin.keyring Dec 15 04:54:25 localhost ceph-mgr[292421]: log_channel(cephadm) log [INF] : Updating np0005559461.localdomain:/etc/ceph/ceph.client.admin.keyring Dec 15 04:54:25 localhost systemd[1]: Started /usr/bin/podman healthcheck run 730c50020a7d9e2da21d2462d40af20ac303e43f3491f692cbbd87f432ba3e09. Dec 15 04:54:25 localhost systemd[1]: Started /usr/bin/podman healthcheck run ac2eae30a2a7f971398af42ea67b3ed799d4b3341f7688ebcd73f7ca7f66c2a8. 
Dec 15 04:54:25 localhost ceph-mgr[292421]: mgr.server handle_open ignoring open from mgr.np0005559464.aomnqe 172.18.0.108:0/929118679; not ready for session (expect reconnect) Dec 15 04:54:25 localhost ceph-mgr[292421]: mgr.server handle_open ignoring open from mon.np0005559463 172.18.0.107:0/2300540652; not ready for session (expect reconnect) Dec 15 04:54:25 localhost ceph-mgr[292421]: mgr finish mon failed to return metadata for mon.np0005559463: (2) No such file or directory Dec 15 04:54:25 localhost ceph-mgr[292421]: [cephadm INFO cephadm.serve] Updating np0005559463.localdomain:/etc/ceph/ceph.client.admin.keyring Dec 15 04:54:25 localhost ceph-mgr[292421]: log_channel(cephadm) log [INF] : Updating np0005559463.localdomain:/etc/ceph/ceph.client.admin.keyring Dec 15 04:54:25 localhost podman[305468]: 2025-12-15 09:54:25.288325865 +0000 UTC m=+0.086061931 container health_status 730c50020a7d9e2da21d2462d40af20ac303e43f3491f692cbbd87f432ba3e09 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, vcs-type=git, container_name=openstack_network_exporter, vendor=Red Hat, Inc., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'cb1e9478634a00c8fb5ad9cd7fcd38c7b2a1a85187dfaa618bc715c24c2b15fb'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': 
['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.buildah.version=1.33.7, config_id=openstack_network_exporter, io.openshift.expose-services=, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, architecture=x86_64, name=ubi9-minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.component=ubi9-minimal-container, url=https://catalog.redhat.com/en/search?searchType=containers, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., build-date=2025-08-20T13:12:41, distribution-scope=public, release=1755695350, version=9.6, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=Red Hat, Inc., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, managed_by=edpm_ansible) Dec 15 04:54:25 localhost ceph-mgr[292421]: [cephadm INFO cephadm.serve] Updating np0005559464.localdomain:/etc/ceph/ceph.client.admin.keyring Dec 15 04:54:25 localhost ceph-mgr[292421]: log_channel(cephadm) log [INF] : Updating np0005559464.localdomain:/etc/ceph/ceph.client.admin.keyring Dec 15 04:54:25 localhost podman[305468]: 2025-12-15 09:54:25.330284534 +0000 UTC m=+0.128020540 container exec_died 730c50020a7d9e2da21d2462d40af20ac303e43f3491f692cbbd87f432ba3e09 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'cb1e9478634a00c8fb5ad9cd7fcd38c7b2a1a85187dfaa618bc715c24c2b15fb'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, 
io.buildah.version=1.33.7, build-date=2025-08-20T13:12:41, vcs-type=git, managed_by=edpm_ansible, com.redhat.component=ubi9-minimal-container, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., architecture=x86_64, release=1755695350, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.expose-services=, container_name=openstack_network_exporter, config_id=openstack_network_exporter, distribution-scope=public, vendor=Red Hat, Inc., io.openshift.tags=minimal rhel9, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=ubi9-minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., version=9.6, maintainer=Red Hat, Inc.) Dec 15 04:54:25 localhost systemd[1]: 730c50020a7d9e2da21d2462d40af20ac303e43f3491f692cbbd87f432ba3e09.service: Deactivated successfully. 
Dec 15 04:54:25 localhost podman[305469]: 2025-12-15 09:54:25.35417347 +0000 UTC m=+0.149848868 container health_status ac2eae30a2a7f971398af42ea67b3ed799d4b3341f7688ebcd73f7ca7f66c2a8 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, managed_by=edpm_ansible, config_id=ovn_controller, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '07f3af15d79276418f54a8dde2fc251064527c3571430e14af77aaa2c01f1193'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true) Dec 15 04:54:25 localhost ceph-mgr[292421]: [cephadm INFO cephadm.serve] Updating np0005559462.localdomain:/etc/ceph/ceph.client.admin.keyring Dec 15 04:54:25 localhost ceph-mgr[292421]: log_channel(cephadm) log [INF] : Updating np0005559462.localdomain:/etc/ceph/ceph.client.admin.keyring Dec 15 04:54:25 localhost podman[305469]: 2025-12-15 09:54:25.398393213 +0000 UTC m=+0.194068611 container exec_died 
ac2eae30a2a7f971398af42ea67b3ed799d4b3341f7688ebcd73f7ca7f66c2a8 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, config_id=ovn_controller, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, container_name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '07f3af15d79276418f54a8dde2fc251064527c3571430e14af77aaa2c01f1193'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, managed_by=edpm_ansible, org.label-schema.build-date=20251202) Dec 15 04:54:25 localhost systemd[1]: ac2eae30a2a7f971398af42ea67b3ed799d4b3341f7688ebcd73f7ca7f66c2a8.service: Deactivated successfully. 
Dec 15 04:54:25 localhost nova_compute[286344]: 2025-12-15 09:54:25.787 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 04:54:25 localhost ceph-mgr[292421]: [cephadm INFO cephadm.serve] Updating np0005559461.localdomain:/var/lib/ceph/bce17446-41b5-5408-a23e-0b011906b44a/config/ceph.client.admin.keyring Dec 15 04:54:25 localhost ceph-mgr[292421]: log_channel(cephadm) log [INF] : Updating np0005559461.localdomain:/var/lib/ceph/bce17446-41b5-5408-a23e-0b011906b44a/config/ceph.client.admin.keyring Dec 15 04:54:25 localhost ceph-mgr[292421]: [cephadm INFO cephadm.serve] Updating np0005559463.localdomain:/var/lib/ceph/bce17446-41b5-5408-a23e-0b011906b44a/config/ceph.client.admin.keyring Dec 15 04:54:25 localhost ceph-mgr[292421]: log_channel(cephadm) log [INF] : Updating np0005559463.localdomain:/var/lib/ceph/bce17446-41b5-5408-a23e-0b011906b44a/config/ceph.client.admin.keyring Dec 15 04:54:25 localhost ceph-mon[298913]: Updating np0005559461.localdomain:/var/lib/ceph/bce17446-41b5-5408-a23e-0b011906b44a/config/ceph.conf Dec 15 04:54:25 localhost ceph-mon[298913]: Updating np0005559464.localdomain:/var/lib/ceph/bce17446-41b5-5408-a23e-0b011906b44a/config/ceph.conf Dec 15 04:54:25 localhost ceph-mon[298913]: Updating np0005559463.localdomain:/var/lib/ceph/bce17446-41b5-5408-a23e-0b011906b44a/config/ceph.conf Dec 15 04:54:25 localhost ceph-mon[298913]: Updating np0005559462.localdomain:/var/lib/ceph/bce17446-41b5-5408-a23e-0b011906b44a/config/ceph.conf Dec 15 04:54:25 localhost ceph-mgr[292421]: [cephadm INFO cephadm.serve] Updating np0005559464.localdomain:/var/lib/ceph/bce17446-41b5-5408-a23e-0b011906b44a/config/ceph.client.admin.keyring Dec 15 04:54:25 localhost ceph-mgr[292421]: log_channel(cephadm) log [INF] : Updating np0005559464.localdomain:/var/lib/ceph/bce17446-41b5-5408-a23e-0b011906b44a/config/ceph.client.admin.keyring Dec 15 04:54:26 localhost ceph-mgr[292421]: 
[cephadm INFO cephadm.serve] Updating np0005559462.localdomain:/var/lib/ceph/bce17446-41b5-5408-a23e-0b011906b44a/config/ceph.client.admin.keyring Dec 15 04:54:26 localhost ceph-mgr[292421]: log_channel(cephadm) log [INF] : Updating np0005559462.localdomain:/var/lib/ceph/bce17446-41b5-5408-a23e-0b011906b44a/config/ceph.client.admin.keyring Dec 15 04:54:26 localhost ceph-mgr[292421]: log_channel(cluster) log [DBG] : pgmap v6: 177 pgs: 177 active+clean; 104 MiB data, 548 MiB used, 41 GiB / 42 GiB avail Dec 15 04:54:26 localhost ceph-mon[298913]: mon.np0005559462@2(peon) e11 adding peer [v2:172.18.0.104:3300/0,v1:172.18.0.104:6789/0] to list of hints Dec 15 04:54:26 localhost ceph-mgr[292421]: mgr.server handle_open ignoring open from mon.np0005559463 172.18.0.107:0/2300540652; not ready for session (expect reconnect) Dec 15 04:54:26 localhost ceph-mgr[292421]: mgr finish mon failed to return metadata for mon.np0005559463: (2) No such file or directory Dec 15 04:54:26 localhost ceph-mgr[292421]: log_channel(cluster) log [DBG] : pgmap v7: 177 pgs: 177 active+clean; 104 MiB data, 548 MiB used, 41 GiB / 42 GiB avail; 37 KiB/s rd, 0 B/s wr, 20 op/s Dec 15 04:54:26 localhost ceph-mgr[292421]: [progress INFO root] update: starting ev 1e7599fd-5702-4117-be1c-d62112b4ceb6 (Updating node-proxy deployment (+4 -> 4)) Dec 15 04:54:26 localhost ceph-mgr[292421]: [progress INFO root] complete: finished ev 1e7599fd-5702-4117-be1c-d62112b4ceb6 (Updating node-proxy deployment (+4 -> 4)) Dec 15 04:54:26 localhost ceph-mgr[292421]: [progress INFO root] Completed event 1e7599fd-5702-4117-be1c-d62112b4ceb6 (Updating node-proxy deployment (+4 -> 4)) in 0 seconds Dec 15 04:54:26 localhost ceph-mon[298913]: Updating np0005559461.localdomain:/etc/ceph/ceph.client.admin.keyring Dec 15 04:54:26 localhost ceph-mon[298913]: Updating np0005559463.localdomain:/etc/ceph/ceph.client.admin.keyring Dec 15 04:54:26 localhost ceph-mon[298913]: Updating 
np0005559464.localdomain:/etc/ceph/ceph.client.admin.keyring Dec 15 04:54:26 localhost ceph-mon[298913]: Updating np0005559462.localdomain:/etc/ceph/ceph.client.admin.keyring Dec 15 04:54:26 localhost ceph-mon[298913]: Updating np0005559461.localdomain:/var/lib/ceph/bce17446-41b5-5408-a23e-0b011906b44a/config/ceph.client.admin.keyring Dec 15 04:54:26 localhost ceph-mon[298913]: Updating np0005559463.localdomain:/var/lib/ceph/bce17446-41b5-5408-a23e-0b011906b44a/config/ceph.client.admin.keyring Dec 15 04:54:27 localhost ceph-mon[298913]: Updating np0005559464.localdomain:/var/lib/ceph/bce17446-41b5-5408-a23e-0b011906b44a/config/ceph.client.admin.keyring Dec 15 04:54:27 localhost ceph-mon[298913]: Updating np0005559462.localdomain:/var/lib/ceph/bce17446-41b5-5408-a23e-0b011906b44a/config/ceph.client.admin.keyring Dec 15 04:54:27 localhost ceph-mon[298913]: from='mgr.26879 172.18.0.106:0/2534740212' entity='mgr.np0005559462.fudvyx' Dec 15 04:54:27 localhost ceph-mon[298913]: from='mgr.26879 172.18.0.106:0/2534740212' entity='mgr.np0005559462.fudvyx' Dec 15 04:54:27 localhost ceph-mon[298913]: from='mgr.26879 172.18.0.106:0/2534740212' entity='mgr.np0005559462.fudvyx' Dec 15 04:54:27 localhost ceph-mon[298913]: from='mgr.26879 172.18.0.106:0/2534740212' entity='mgr.np0005559462.fudvyx' Dec 15 04:54:27 localhost ceph-mon[298913]: from='mgr.26879 172.18.0.106:0/2534740212' entity='mgr.np0005559462.fudvyx' Dec 15 04:54:27 localhost ceph-mon[298913]: from='mgr.26879 172.18.0.106:0/2534740212' entity='mgr.np0005559462.fudvyx' Dec 15 04:54:27 localhost ceph-mon[298913]: from='mgr.26879 172.18.0.106:0/2534740212' entity='mgr.np0005559462.fudvyx' Dec 15 04:54:27 localhost ceph-mon[298913]: from='mgr.26879 172.18.0.106:0/2534740212' entity='mgr.np0005559462.fudvyx' Dec 15 04:54:27 localhost ceph-mon[298913]: from='mgr.26879 172.18.0.106:0/2534740212' entity='mgr.np0005559462.fudvyx' Dec 15 04:54:27 localhost ceph-mgr[292421]: [cephadm INFO cephadm.serve] Reconfiguring daemon 
osd.3 on np0005559462.localdomain Dec 15 04:54:27 localhost ceph-mgr[292421]: log_channel(cephadm) log [INF] : Reconfiguring daemon osd.3 on np0005559462.localdomain Dec 15 04:54:27 localhost ceph-mgr[292421]: mgr.server handle_open ignoring open from mon.np0005559463 172.18.0.107:0/2300540652; not ready for session (expect reconnect) Dec 15 04:54:27 localhost ceph-mgr[292421]: mgr finish mon failed to return metadata for mon.np0005559463: (2) No such file or directory Dec 15 04:54:27 localhost podman[305940]: Dec 15 04:54:27 localhost podman[305940]: 2025-12-15 09:54:27.74011453 +0000 UTC m=+0.081942885 container create b2e021159c915f26b0ffc8175f3bc9e3e3d756ca88045e36e944f417151d3353 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=inspiring_sanderson, url=https://catalog.redhat.com/en/search?searchType=containers, ceph=True, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_REPO=https://github.com/ceph/ceph-container.git, io.buildah.version=1.41.4, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, vcs-type=git, io.openshift.tags=rhceph ceph, GIT_BRANCH=main, vendor=Red Hat, Inc., maintainer=Guillaume Abrioux , CEPH_POINT_RELEASE=, GIT_CLEAN=True, build-date=2025-11-26T19:44:28Z, io.k8s.description=Red Hat Ceph Storage 7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.openshift.expose-services=, release=1763362218, version=7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, description=Red Hat Ceph Storage 7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, architecture=x86_64, RELEASE=main, com.redhat.component=rhceph-container, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, name=rhceph) Dec 15 04:54:27 localhost systemd[1]: Started libpod-conmon-b2e021159c915f26b0ffc8175f3bc9e3e3d756ca88045e36e944f417151d3353.scope. 
Dec 15 04:54:27 localhost systemd[1]: Started libcrun container. Dec 15 04:54:27 localhost podman[305940]: 2025-12-15 09:54:27.708426991 +0000 UTC m=+0.050255356 image pull registry.redhat.io/rhceph/rhceph-7-rhel9:latest Dec 15 04:54:27 localhost podman[305940]: 2025-12-15 09:54:27.809805725 +0000 UTC m=+0.151634090 container init b2e021159c915f26b0ffc8175f3bc9e3e3d756ca88045e36e944f417151d3353 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=inspiring_sanderson, version=7, url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.expose-services=, name=rhceph, architecture=x86_64, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, GIT_CLEAN=True, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.buildah.version=1.41.4, io.openshift.tags=rhceph ceph, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, vcs-type=git, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, description=Red Hat Ceph Storage 7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-11-26T19:44:28Z, release=1763362218, io.k8s.description=Red Hat Ceph Storage 7, com.redhat.component=rhceph-container, GIT_REPO=https://github.com/ceph/ceph-container.git, vendor=Red Hat, Inc., summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., RELEASE=main, maintainer=Guillaume Abrioux , org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, distribution-scope=public, ceph=True, CEPH_POINT_RELEASE=, GIT_BRANCH=main) Dec 15 04:54:27 localhost podman[305940]: 2025-12-15 09:54:27.819977707 +0000 UTC m=+0.161806062 container start b2e021159c915f26b0ffc8175f3bc9e3e3d756ca88045e36e944f417151d3353 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=inspiring_sanderson, vendor=Red Hat, Inc., com.redhat.component=rhceph-container, CEPH_POINT_RELEASE=, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported 
base image., version=7, architecture=x86_64, description=Red Hat Ceph Storage 7, GIT_BRANCH=main, name=rhceph, RELEASE=main, io.k8s.description=Red Hat Ceph Storage 7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.openshift.expose-services=, GIT_REPO=https://github.com/ceph/ceph-container.git, release=1763362218, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, maintainer=Guillaume Abrioux , vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, build-date=2025-11-26T19:44:28Z, ceph=True, io.openshift.tags=rhceph ceph, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, GIT_CLEAN=True, io.buildah.version=1.41.4, url=https://catalog.redhat.com/en/search?searchType=containers) Dec 15 04:54:27 localhost podman[305940]: 2025-12-15 09:54:27.820326807 +0000 UTC m=+0.162155202 container attach b2e021159c915f26b0ffc8175f3bc9e3e3d756ca88045e36e944f417151d3353 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=inspiring_sanderson, io.k8s.description=Red Hat Ceph Storage 7, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, description=Red Hat Ceph Storage 7, release=1763362218, io.openshift.tags=rhceph ceph, CEPH_POINT_RELEASE=, url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., version=7, build-date=2025-11-26T19:44:28Z, GIT_CLEAN=True, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, com.redhat.component=rhceph-container, distribution-scope=public, architecture=x86_64, io.buildah.version=1.41.4, ceph=True, name=rhceph, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, 
cpe=cpe:/a:redhat:enterprise_linux:9::appstream, GIT_REPO=https://github.com/ceph/ceph-container.git, RELEASE=main, maintainer=Guillaume Abrioux , summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vcs-type=git, GIT_BRANCH=main) Dec 15 04:54:27 localhost inspiring_sanderson[305955]: 167 167 Dec 15 04:54:27 localhost systemd[1]: libpod-b2e021159c915f26b0ffc8175f3bc9e3e3d756ca88045e36e944f417151d3353.scope: Deactivated successfully. Dec 15 04:54:27 localhost podman[305940]: 2025-12-15 09:54:27.824678437 +0000 UTC m=+0.166506802 container died b2e021159c915f26b0ffc8175f3bc9e3e3d756ca88045e36e944f417151d3353 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=inspiring_sanderson, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1763362218, GIT_REPO=https://github.com/ceph/ceph-container.git, GIT_BRANCH=main, version=7, GIT_CLEAN=True, io.openshift.expose-services=, ceph=True, maintainer=Guillaume Abrioux , cpe=cpe:/a:redhat:enterprise_linux:9::appstream, description=Red Hat Ceph Storage 7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, RELEASE=main, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., com.redhat.component=rhceph-container, url=https://catalog.redhat.com/en/search?searchType=containers, build-date=2025-11-26T19:44:28Z, io.k8s.description=Red Hat Ceph Storage 7, vcs-type=git, vendor=Red Hat, Inc., distribution-scope=public, io.openshift.tags=rhceph ceph, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, CEPH_POINT_RELEASE=, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.buildah.version=1.41.4, name=rhceph, architecture=x86_64) Dec 15 04:54:27 localhost podman[305960]: 2025-12-15 09:54:27.930114334 +0000 UTC m=+0.093620980 container remove 
b2e021159c915f26b0ffc8175f3bc9e3e3d756ca88045e36e944f417151d3353 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=inspiring_sanderson, vendor=Red Hat, Inc., RELEASE=main, GIT_CLEAN=True, url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.tags=rhceph ceph, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, maintainer=Guillaume Abrioux , build-date=2025-11-26T19:44:28Z, com.redhat.component=rhceph-container, ceph=True, architecture=x86_64, description=Red Hat Ceph Storage 7, distribution-scope=public, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_BRANCH=main, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., name=rhceph, io.openshift.expose-services=, io.k8s.description=Red Hat Ceph Storage 7, version=7, CEPH_POINT_RELEASE=, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, vcs-type=git, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, io.buildah.version=1.41.4, GIT_REPO=https://github.com/ceph/ceph-container.git, release=1763362218) Dec 15 04:54:27 localhost systemd[1]: libpod-conmon-b2e021159c915f26b0ffc8175f3bc9e3e3d756ca88045e36e944f417151d3353.scope: Deactivated successfully. 
Dec 15 04:54:28 localhost ceph-mon[298913]: Health check failed: 1 stray daemon(s) not managed by cephadm (CEPHADM_STRAY_DAEMON) Dec 15 04:54:28 localhost ceph-mon[298913]: Health check failed: 1 stray host(s) with 1 daemon(s) not managed by cephadm (CEPHADM_STRAY_HOST) Dec 15 04:54:28 localhost ceph-mon[298913]: from='mgr.26879 172.18.0.106:0/2534740212' entity='mgr.np0005559462.fudvyx' cmd={"prefix": "auth get", "entity": "osd.3"} : dispatch Dec 15 04:54:28 localhost ceph-mon[298913]: mon.np0005559462@2(peon) e11 adding peer [v2:172.18.0.104:3300/0,v1:172.18.0.104:6789/0] to list of hints Dec 15 04:54:28 localhost ceph-mgr[292421]: [cephadm INFO cephadm.serve] Reconfiguring daemon osd.2 on np0005559463.localdomain Dec 15 04:54:28 localhost ceph-mgr[292421]: log_channel(cephadm) log [INF] : Reconfiguring daemon osd.2 on np0005559463.localdomain Dec 15 04:54:28 localhost ceph-mgr[292421]: mgr.server handle_open ignoring open from mon.np0005559463 172.18.0.107:0/2300540652; not ready for session (expect reconnect) Dec 15 04:54:28 localhost ceph-mgr[292421]: mgr finish mon failed to return metadata for mon.np0005559463: (2) No such file or directory Dec 15 04:54:28 localhost ceph-mon[298913]: mon.np0005559462@2(peon) e11 adding peer [v2:172.18.0.104:3300/0,v1:172.18.0.104:6789/0] to list of hints Dec 15 04:54:28 localhost systemd[1]: var-lib-containers-storage-overlay-46950ef1e031afe689afb31e778a122b92fd82b991468fab3b5cee384bae6261-merged.mount: Deactivated successfully. 
Dec 15 04:54:28 localhost ceph-mgr[292421]: log_channel(cluster) log [DBG] : pgmap v8: 177 pgs: 177 active+clean; 104 MiB data, 548 MiB used, 41 GiB / 42 GiB avail; 27 KiB/s rd, 0 B/s wr, 15 op/s Dec 15 04:54:29 localhost ceph-mon[298913]: Reconfiguring daemon osd.3 on np0005559462.localdomain Dec 15 04:54:29 localhost ceph-mon[298913]: from='mgr.26879 172.18.0.106:0/2534740212' entity='mgr.np0005559462.fudvyx' Dec 15 04:54:29 localhost ceph-mon[298913]: from='mgr.26879 172.18.0.106:0/2534740212' entity='mgr.np0005559462.fudvyx' Dec 15 04:54:29 localhost ceph-mon[298913]: from='mgr.26879 172.18.0.106:0/2534740212' entity='mgr.np0005559462.fudvyx' Dec 15 04:54:29 localhost ceph-mon[298913]: from='mgr.26879 172.18.0.106:0/2534740212' entity='mgr.np0005559462.fudvyx' Dec 15 04:54:29 localhost ceph-mon[298913]: from='mgr.26879 172.18.0.106:0/2534740212' entity='mgr.np0005559462.fudvyx' cmd={"prefix": "auth get", "entity": "osd.2"} : dispatch Dec 15 04:54:29 localhost ceph-mgr[292421]: [cephadm INFO cephadm.serve] Reconfiguring daemon osd.5 on np0005559463.localdomain Dec 15 04:54:29 localhost ceph-mgr[292421]: log_channel(cephadm) log [INF] : Reconfiguring daemon osd.5 on np0005559463.localdomain Dec 15 04:54:29 localhost ceph-mgr[292421]: mgr.server handle_open ignoring open from mon.np0005559463 172.18.0.107:0/2300540652; not ready for session (expect reconnect) Dec 15 04:54:29 localhost ceph-mgr[292421]: mgr finish mon failed to return metadata for mon.np0005559463: (2) No such file or directory Dec 15 04:54:30 localhost ceph-mon[298913]: Reconfiguring daemon osd.2 on np0005559463.localdomain Dec 15 04:54:30 localhost ceph-mon[298913]: from='mgr.26879 172.18.0.106:0/2534740212' entity='mgr.np0005559462.fudvyx' Dec 15 04:54:30 localhost ceph-mon[298913]: from='mgr.26879 172.18.0.106:0/2534740212' entity='mgr.np0005559462.fudvyx' Dec 15 04:54:30 localhost ceph-mon[298913]: from='mgr.26879 172.18.0.106:0/2534740212' entity='mgr.np0005559462.fudvyx' Dec 15 04:54:30 
localhost ceph-mon[298913]: from='mgr.26879 172.18.0.106:0/2534740212' entity='mgr.np0005559462.fudvyx' Dec 15 04:54:30 localhost ceph-mon[298913]: from='mgr.26879 172.18.0.106:0/2534740212' entity='mgr.np0005559462.fudvyx' cmd={"prefix": "auth get", "entity": "osd.5"} : dispatch Dec 15 04:54:30 localhost ceph-mon[298913]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #25. Immutable memtables: 0. Dec 15 04:54:30 localhost ceph-mon[298913]: rocksdb: (Original Log Time 2025/12/15-09:54:30.077683) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0 Dec 15 04:54:30 localhost ceph-mon[298913]: rocksdb: [db/flush_job.cc:856] [default] [JOB 11] Flushing memtable with next log file: 25 Dec 15 04:54:30 localhost ceph-mon[298913]: rocksdb: EVENT_LOG_v1 {"time_micros": 1765792470077785, "job": 11, "event": "flush_started", "num_memtables": 1, "num_entries": 2820, "num_deletes": 256, "total_data_size": 9240566, "memory_usage": 9490248, "flush_reason": "Manual Compaction"} Dec 15 04:54:30 localhost ceph-mon[298913]: rocksdb: [db/flush_job.cc:885] [default] [JOB 11] Level-0 flush table #26: started Dec 15 04:54:30 localhost ceph-mon[298913]: mon.np0005559462@2(peon).osd e88 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Dec 15 04:54:30 localhost ceph-mon[298913]: rocksdb: EVENT_LOG_v1 {"time_micros": 1765792470111463, "cf_name": "default", "job": 11, "event": "table_file_creation", "file_number": 26, "file_size": 5551166, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 14038, "largest_seqno": 16852, "table_properties": {"data_size": 5539396, "index_size": 7379, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, 
"filter_size": 3333, "raw_key_size": 29929, "raw_average_key_size": 22, "raw_value_size": 5513833, "raw_average_value_size": 4167, "num_data_blocks": 321, "num_entries": 1323, "num_filter_entries": 1323, "num_deletions": 254, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1765792409, "oldest_key_time": 1765792409, "file_creation_time": 1765792470, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "603b24af-e2be-4214-bc56-9e652eb4af3d", "db_session_id": "0OJRM9SCUA16EXV0VQZ2", "orig_file_number": 26, "seqno_to_time_mapping": "N/A"}} Dec 15 04:54:30 localhost ceph-mon[298913]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 11] Flush lasted 35139 microseconds, and 10219 cpu microseconds. Dec 15 04:54:30 localhost ceph-mon[298913]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed. 
Dec 15 04:54:30 localhost ceph-mon[298913]: rocksdb: (Original Log Time 2025/12/15-09:54:30.111543) [db/flush_job.cc:967] [default] [JOB 11] Level-0 flush table #26: 5551166 bytes OK Dec 15 04:54:30 localhost ceph-mon[298913]: rocksdb: (Original Log Time 2025/12/15-09:54:30.112856) [db/memtable_list.cc:519] [default] Level-0 commit table #26 started Dec 15 04:54:30 localhost ceph-mon[298913]: rocksdb: (Original Log Time 2025/12/15-09:54:30.114794) [db/memtable_list.cc:722] [default] Level-0 commit table #26: memtable #1 done Dec 15 04:54:30 localhost ceph-mon[298913]: rocksdb: (Original Log Time 2025/12/15-09:54:30.114818) EVENT_LOG_v1 {"time_micros": 1765792470114811, "job": 11, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0} Dec 15 04:54:30 localhost ceph-mon[298913]: rocksdb: (Original Log Time 2025/12/15-09:54:30.114842) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25 Dec 15 04:54:30 localhost ceph-mon[298913]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 11] Try to delete WAL files size 9226846, prev total WAL file size 9228718, number of live WAL files 2. Dec 15 04:54:30 localhost ceph-mon[298913]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005559462/store.db/000022.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000 Dec 15 04:54:30 localhost ceph-mon[298913]: rocksdb: (Original Log Time 2025/12/15-09:54:30.116721) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '7061786F73003130373933' seq:72057594037927935, type:22 .. 
'7061786F73003131303435' seq:0, type:0; will stop at (end) Dec 15 04:54:30 localhost ceph-mon[298913]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 12] Compacting 1@0 + 1@6 files to L6, score -1.00 Dec 15 04:54:30 localhost ceph-mon[298913]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 11 Base level 0, inputs: [26(5421KB)], [24(15MB)] Dec 15 04:54:30 localhost ceph-mon[298913]: rocksdb: EVENT_LOG_v1 {"time_micros": 1765792470116790, "job": 12, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [26], "files_L6": [24], "score": -1, "input_data_size": 21652816, "oldest_snapshot_seqno": -1} Dec 15 04:54:30 localhost ceph-mon[298913]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 12] Generated table #27: 11060 keys, 19731922 bytes, temperature: kUnknown Dec 15 04:54:30 localhost ceph-mon[298913]: rocksdb: EVENT_LOG_v1 {"time_micros": 1765792470202929, "cf_name": "default", "job": 12, "event": "table_file_creation", "file_number": 27, "file_size": 19731922, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 19666660, "index_size": 36489, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 27717, "raw_key_size": 296330, "raw_average_key_size": 26, "raw_value_size": 19475672, "raw_average_value_size": 1760, "num_data_blocks": 1402, "num_entries": 11060, "num_filter_entries": 11060, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; 
strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1765792320, "oldest_key_time": 0, "file_creation_time": 1765792470, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "603b24af-e2be-4214-bc56-9e652eb4af3d", "db_session_id": "0OJRM9SCUA16EXV0VQZ2", "orig_file_number": 27, "seqno_to_time_mapping": "N/A"}} Dec 15 04:54:30 localhost ceph-mon[298913]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed. Dec 15 04:54:30 localhost ceph-mon[298913]: rocksdb: (Original Log Time 2025/12/15-09:54:30.203215) [db/compaction/compaction_job.cc:1663] [default] [JOB 12] Compacted 1@0 + 1@6 files to L6 => 19731922 bytes Dec 15 04:54:30 localhost ceph-mon[298913]: rocksdb: (Original Log Time 2025/12/15-09:54:30.204730) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 251.1 rd, 228.8 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(5.3, 15.4 +0.0 blob) out(18.8 +0.0 blob), read-write-amplify(7.5) write-amplify(3.6) OK, records in: 11609, records dropped: 549 output_compression: NoCompression Dec 15 04:54:30 localhost ceph-mon[298913]: rocksdb: (Original Log Time 2025/12/15-09:54:30.204750) EVENT_LOG_v1 {"time_micros": 1765792470204741, "job": 12, "event": "compaction_finished", "compaction_time_micros": 86249, "compaction_time_cpu_micros": 27934, "output_level": 6, "num_output_files": 1, "total_output_size": 19731922, "num_input_records": 11609, "num_output_records": 11060, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]} Dec 15 04:54:30 localhost ceph-mon[298913]: rocksdb: [file/delete_scheduler.cc:74] Deleted file 
/var/lib/ceph/mon/ceph-np0005559462/store.db/000026.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000 Dec 15 04:54:30 localhost ceph-mon[298913]: rocksdb: EVENT_LOG_v1 {"time_micros": 1765792470205331, "job": 12, "event": "table_file_deletion", "file_number": 26} Dec 15 04:54:30 localhost ceph-mon[298913]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005559462/store.db/000024.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000 Dec 15 04:54:30 localhost ceph-mon[298913]: rocksdb: EVENT_LOG_v1 {"time_micros": 1765792470206989, "job": 12, "event": "table_file_deletion", "file_number": 24} Dec 15 04:54:30 localhost ceph-mon[298913]: rocksdb: (Original Log Time 2025/12/15-09:54:30.116617) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting Dec 15 04:54:30 localhost ceph-mon[298913]: rocksdb: (Original Log Time 2025/12/15-09:54:30.207015) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting Dec 15 04:54:30 localhost ceph-mon[298913]: rocksdb: (Original Log Time 2025/12/15-09:54:30.207020) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting Dec 15 04:54:30 localhost ceph-mon[298913]: rocksdb: (Original Log Time 2025/12/15-09:54:30.207022) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting Dec 15 04:54:30 localhost ceph-mon[298913]: rocksdb: (Original Log Time 2025/12/15-09:54:30.207023) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting Dec 15 04:54:30 localhost ceph-mon[298913]: rocksdb: (Original Log Time 2025/12/15-09:54:30.207025) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting Dec 15 04:54:30 localhost ceph-mgr[292421]: mgr.server handle_open ignoring open from mon.np0005559463 172.18.0.107:0/2300540652; not ready for session (expect reconnect) Dec 15 04:54:30 localhost 
ceph-mgr[292421]: mgr finish mon failed to return metadata for mon.np0005559463: (2) No such file or directory Dec 15 04:54:30 localhost ceph-mon[298913]: mon.np0005559462@2(peon) e11 adding peer [v2:172.18.0.104:3300/0,v1:172.18.0.104:6789/0] to list of hints Dec 15 04:54:30 localhost ceph-mgr[292421]: [progress INFO root] Writing back 50 completed events Dec 15 04:54:30 localhost ceph-mgr[292421]: [cephadm INFO cephadm.serve] Reconfiguring daemon osd.1 on np0005559464.localdomain Dec 15 04:54:30 localhost ceph-mgr[292421]: log_channel(cephadm) log [INF] : Reconfiguring daemon osd.1 on np0005559464.localdomain Dec 15 04:54:30 localhost ceph-mon[298913]: mon.np0005559462@2(peon) e11 adding peer [v2:172.18.0.104:3300/0,v1:172.18.0.104:6789/0] to list of hints Dec 15 04:54:30 localhost ceph-mgr[292421]: log_channel(cluster) log [DBG] : pgmap v9: 177 pgs: 177 active+clean; 104 MiB data, 548 MiB used, 41 GiB / 42 GiB avail; 21 KiB/s rd, 0 B/s wr, 12 op/s Dec 15 04:54:30 localhost nova_compute[286344]: 2025-12-15 09:54:30.788 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Dec 15 04:54:30 localhost nova_compute[286344]: 2025-12-15 09:54:30.789 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 04:54:30 localhost nova_compute[286344]: 2025-12-15 09:54:30.790 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m Dec 15 04:54:30 localhost nova_compute[286344]: 2025-12-15 09:54:30.790 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Dec 15 04:54:30 localhost nova_compute[286344]: 2025-12-15 09:54:30.791 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: 
entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Dec 15 04:54:30 localhost nova_compute[286344]: 2025-12-15 09:54:30.793 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 04:54:31 localhost ceph-mon[298913]: Reconfiguring daemon osd.5 on np0005559463.localdomain Dec 15 04:54:31 localhost ceph-mon[298913]: from='mgr.26879 172.18.0.106:0/2534740212' entity='mgr.np0005559462.fudvyx' Dec 15 04:54:31 localhost ceph-mon[298913]: from='mgr.26879 172.18.0.106:0/2534740212' entity='mgr.np0005559462.fudvyx' Dec 15 04:54:31 localhost ceph-mon[298913]: from='mgr.26879 172.18.0.106:0/2534740212' entity='mgr.np0005559462.fudvyx' Dec 15 04:54:31 localhost ceph-mon[298913]: from='mgr.26879 172.18.0.106:0/2534740212' entity='mgr.np0005559462.fudvyx' Dec 15 04:54:31 localhost ceph-mon[298913]: from='mgr.26879 172.18.0.106:0/2534740212' entity='mgr.np0005559462.fudvyx' cmd={"prefix": "auth get", "entity": "osd.1"} : dispatch Dec 15 04:54:31 localhost ceph-mon[298913]: from='mgr.26879 172.18.0.106:0/2534740212' entity='mgr.np0005559462.fudvyx' Dec 15 04:54:31 localhost ceph-mgr[292421]: mgr.server handle_open ignoring open from mon.np0005559463 172.18.0.107:0/2300540652; not ready for session (expect reconnect) Dec 15 04:54:31 localhost ceph-mgr[292421]: mgr finish mon failed to return metadata for mon.np0005559463: (2) No such file or directory Dec 15 04:54:31 localhost ceph-mgr[292421]: [cephadm INFO cephadm.serve] Reconfiguring daemon osd.4 on np0005559464.localdomain Dec 15 04:54:31 localhost ceph-mgr[292421]: log_channel(cephadm) log [INF] : Reconfiguring daemon osd.4 on np0005559464.localdomain Dec 15 04:54:31 localhost ceph-mgr[292421]: log_channel(audit) log [DBG] : from='client.34438 -' entity='client.admin' cmd=[{"prefix": "orch status", "target": ["mon-mgr", ""], "format": "json"}]: dispatch Dec 15 04:54:31 localhost podman[243449]: 
time="2025-12-15T09:54:31Z" level=info msg="List containers: received `last` parameter - overwriting `limit`" Dec 15 04:54:31 localhost podman[243449]: @ - - [15/Dec/2025:09:54:31 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 154814 "" "Go-http-client/1.1" Dec 15 04:54:31 localhost podman[243449]: @ - - [15/Dec/2025:09:54:31 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 18709 "" "Go-http-client/1.1" Dec 15 04:54:32 localhost ceph-mon[298913]: Reconfiguring daemon osd.1 on np0005559464.localdomain Dec 15 04:54:32 localhost ceph-mon[298913]: from='mgr.26879 172.18.0.106:0/2534740212' entity='mgr.np0005559462.fudvyx' Dec 15 04:54:32 localhost ceph-mon[298913]: from='mgr.26879 172.18.0.106:0/2534740212' entity='mgr.np0005559462.fudvyx' Dec 15 04:54:32 localhost ceph-mon[298913]: from='mgr.26879 172.18.0.106:0/2534740212' entity='mgr.np0005559462.fudvyx' Dec 15 04:54:32 localhost ceph-mon[298913]: from='mgr.26879 172.18.0.106:0/2534740212' entity='mgr.np0005559462.fudvyx' Dec 15 04:54:32 localhost ceph-mon[298913]: from='mgr.26879 172.18.0.106:0/2534740212' entity='mgr.np0005559462.fudvyx' cmd={"prefix": "auth get", "entity": "osd.4"} : dispatch Dec 15 04:54:32 localhost ceph-mon[298913]: Reconfiguring daemon osd.4 on np0005559464.localdomain Dec 15 04:54:32 localhost ceph-mgr[292421]: mgr.server handle_open ignoring open from mon.np0005559463 172.18.0.107:0/2300540652; not ready for session (expect reconnect) Dec 15 04:54:32 localhost ceph-mgr[292421]: mgr finish mon failed to return metadata for mon.np0005559463: (2) No such file or directory Dec 15 04:54:32 localhost ceph-mon[298913]: mon.np0005559462@2(peon) e11 adding peer [v2:172.18.0.104:3300/0,v1:172.18.0.104:6789/0] to list of hints Dec 15 04:54:32 localhost ceph-mon[298913]: mon.np0005559462@2(peon) e11 adding peer [v2:172.18.0.104:3300/0,v1:172.18.0.104:6789/0] to list of hints Dec 15 
04:54:32 localhost ceph-mgr[292421]: [progress INFO root] update: starting ev d0e0e465-89ac-44ed-9c4c-66d4002c1a9c (Updating node-proxy deployment (+4 -> 4)) Dec 15 04:54:32 localhost ceph-mgr[292421]: [progress INFO root] complete: finished ev d0e0e465-89ac-44ed-9c4c-66d4002c1a9c (Updating node-proxy deployment (+4 -> 4)) Dec 15 04:54:32 localhost ceph-mgr[292421]: [progress INFO root] Completed event d0e0e465-89ac-44ed-9c4c-66d4002c1a9c (Updating node-proxy deployment (+4 -> 4)) in 0 seconds Dec 15 04:54:32 localhost ceph-mgr[292421]: log_channel(cluster) log [DBG] : pgmap v10: 177 pgs: 177 active+clean; 104 MiB data, 548 MiB used, 41 GiB / 42 GiB avail; 19 KiB/s rd, 0 B/s wr, 10 op/s Dec 15 04:54:33 localhost ceph-mon[298913]: from='mgr.26879 172.18.0.106:0/2534740212' entity='mgr.np0005559462.fudvyx' Dec 15 04:54:33 localhost ceph-mon[298913]: from='mgr.26879 172.18.0.106:0/2534740212' entity='mgr.np0005559462.fudvyx' Dec 15 04:54:33 localhost ceph-mon[298913]: from='mgr.26879 172.18.0.106:0/2534740212' entity='mgr.np0005559462.fudvyx' Dec 15 04:54:33 localhost ceph-mon[298913]: from='mgr.26879 172.18.0.106:0/2534740212' entity='mgr.np0005559462.fudvyx' Dec 15 04:54:33 localhost ceph-mon[298913]: from='mgr.26879 172.18.0.106:0/2534740212' entity='mgr.np0005559462.fudvyx' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch Dec 15 04:54:33 localhost ceph-mon[298913]: from='mgr.26879 172.18.0.106:0/2534740212' entity='mgr.np0005559462.fudvyx' Dec 15 04:54:33 localhost ceph-mgr[292421]: mgr.server handle_open ignoring open from mon.np0005559463 172.18.0.107:0/2300540652; not ready for session (expect reconnect) Dec 15 04:54:33 localhost ceph-mgr[292421]: mgr finish mon failed to return metadata for mon.np0005559463: (2) No such file or directory Dec 15 04:54:33 localhost systemd[1]: Started /usr/bin/podman healthcheck run 4d46c406a3855fd1c322e5d86c048188ea5865daa07ba23aaebacc7879668923. 
Dec 15 04:54:33 localhost podman[306001]: 2025-12-15 09:54:33.753348299 +0000 UTC m=+0.078773518 container health_status 4d46c406a3855fd1c322e5d86c048188ea5865daa07ba23aaebacc7879668923 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '07f3af15d79276418f54a8dde2fc251064527c3571430e14af77aaa2c01f1193-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, tcib_managed=true) Dec 15 04:54:33 localhost 
podman[306001]: 2025-12-15 09:54:33.787490117 +0000 UTC m=+0.112915316 container exec_died 4d46c406a3855fd1c322e5d86c048188ea5865daa07ba23aaebacc7879668923 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '07f3af15d79276418f54a8dde2fc251064527c3571430e14af77aaa2c01f1193-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent) Dec 15 04:54:33 localhost systemd[1]: 
4d46c406a3855fd1c322e5d86c048188ea5865daa07ba23aaebacc7879668923.service: Deactivated successfully. Dec 15 04:54:34 localhost ceph-mgr[292421]: log_channel(audit) log [DBG] : from='client.27192 -' entity='client.admin' cmd=[{"prefix": "orch apply", "target": ["mon-mgr", ""]}]: dispatch Dec 15 04:54:34 localhost ceph-mgr[292421]: [cephadm INFO root] Saving service mon spec with placement label:mon Dec 15 04:54:34 localhost ceph-mgr[292421]: log_channel(cephadm) log [INF] : Saving service mon spec with placement label:mon Dec 15 04:54:34 localhost ceph-mgr[292421]: mgr.server handle_open ignoring open from mon.np0005559463 172.18.0.107:0/2300540652; not ready for session (expect reconnect) Dec 15 04:54:34 localhost ceph-mgr[292421]: mgr finish mon failed to return metadata for mon.np0005559463: (2) No such file or directory Dec 15 04:54:34 localhost ceph-mgr[292421]: [progress INFO root] update: starting ev 35a49559-2999-485d-84c7-455983d22f38 (Updating node-proxy deployment (+4 -> 4)) Dec 15 04:54:34 localhost ceph-mgr[292421]: [progress INFO root] complete: finished ev 35a49559-2999-485d-84c7-455983d22f38 (Updating node-proxy deployment (+4 -> 4)) Dec 15 04:54:34 localhost ceph-mgr[292421]: [progress INFO root] Completed event 35a49559-2999-485d-84c7-455983d22f38 (Updating node-proxy deployment (+4 -> 4)) in 0 seconds Dec 15 04:54:34 localhost ceph-mon[298913]: mon.np0005559462@2(peon) e11 adding peer [v2:172.18.0.104:3300/0,v1:172.18.0.104:6789/0] to list of hints Dec 15 04:54:34 localhost ceph-mgr[292421]: [cephadm INFO cephadm.serve] Reconfiguring mon.np0005559461 (monmap changed)... Dec 15 04:54:34 localhost ceph-mgr[292421]: log_channel(cephadm) log [INF] : Reconfiguring mon.np0005559461 (monmap changed)... 
Dec 15 04:54:34 localhost ceph-mgr[292421]: [cephadm INFO cephadm.serve] Reconfiguring daemon mon.np0005559461 on np0005559461.localdomain Dec 15 04:54:34 localhost ceph-mgr[292421]: log_channel(cephadm) log [INF] : Reconfiguring daemon mon.np0005559461 on np0005559461.localdomain Dec 15 04:54:34 localhost ceph-mgr[292421]: log_channel(cluster) log [DBG] : pgmap v11: 177 pgs: 177 active+clean; 104 MiB data, 548 MiB used, 41 GiB / 42 GiB avail; 19 KiB/s rd, 0 B/s wr, 10 op/s Dec 15 04:54:34 localhost openstack_network_exporter[246484]: ERROR 09:54:34 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd Dec 15 04:54:34 localhost openstack_network_exporter[246484]: ERROR 09:54:34 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd Dec 15 04:54:34 localhost openstack_network_exporter[246484]: ERROR 09:54:34 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server Dec 15 04:54:34 localhost openstack_network_exporter[246484]: ERROR 09:54:34 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath Dec 15 04:54:34 localhost openstack_network_exporter[246484]: Dec 15 04:54:34 localhost openstack_network_exporter[246484]: ERROR 09:54:34 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath Dec 15 04:54:34 localhost openstack_network_exporter[246484]: Dec 15 04:54:35 localhost ceph-mon[298913]: mon.np0005559462@2(peon).osd e88 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Dec 15 04:54:35 localhost ceph-mgr[292421]: mgr.server handle_open ignoring open from mon.np0005559463 172.18.0.107:0/2300540652; not ready for session (expect reconnect) Dec 15 04:54:35 localhost ceph-mon[298913]: Saving service mon spec with placement label:mon Dec 15 04:54:35 localhost ceph-mon[298913]: from='mgr.26879 172.18.0.106:0/2534740212' 
entity='mgr.np0005559462.fudvyx' Dec 15 04:54:35 localhost ceph-mon[298913]: from='mgr.26879 172.18.0.106:0/2534740212' entity='mgr.np0005559462.fudvyx' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch Dec 15 04:54:35 localhost ceph-mon[298913]: from='mgr.26879 172.18.0.106:0/2534740212' entity='mgr.np0005559462.fudvyx' Dec 15 04:54:35 localhost ceph-mon[298913]: from='mgr.26879 172.18.0.106:0/2534740212' entity='mgr.np0005559462.fudvyx' Dec 15 04:54:35 localhost ceph-mon[298913]: from='mgr.26879 172.18.0.106:0/2534740212' entity='mgr.np0005559462.fudvyx' cmd={"prefix": "auth get", "entity": "mon."} : dispatch Dec 15 04:54:35 localhost ceph-mon[298913]: Reconfiguring mon.np0005559461 (monmap changed)... Dec 15 04:54:35 localhost ceph-mon[298913]: Reconfiguring daemon mon.np0005559461 on np0005559461.localdomain Dec 15 04:54:35 localhost ceph-mgr[292421]: mgr finish mon failed to return metadata for mon.np0005559463: (2) No such file or directory Dec 15 04:54:35 localhost ceph-mgr[292421]: [progress INFO root] Writing back 50 completed events Dec 15 04:54:35 localhost ceph-mgr[292421]: [cephadm INFO cephadm.serve] Reconfiguring mon.np0005559462 (monmap changed)... Dec 15 04:54:35 localhost ceph-mgr[292421]: log_channel(cephadm) log [INF] : Reconfiguring mon.np0005559462 (monmap changed)... 
Dec 15 04:54:35 localhost ceph-mgr[292421]: [cephadm INFO cephadm.serve] Reconfiguring daemon mon.np0005559462 on np0005559462.localdomain Dec 15 04:54:35 localhost ceph-mgr[292421]: log_channel(cephadm) log [INF] : Reconfiguring daemon mon.np0005559462 on np0005559462.localdomain Dec 15 04:54:35 localhost ceph-mgr[292421]: log_channel(audit) log [DBG] : from='client.27198 -' entity='client.admin' cmd=[{"prefix": "orch ps", "daemon_type": "mon", "daemon_id": "np0005559463", "target": ["mon-mgr", ""], "format": "json"}]: dispatch Dec 15 04:54:35 localhost nova_compute[286344]: 2025-12-15 09:54:35.795 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 04:54:36 localhost podman[306090]: Dec 15 04:54:36 localhost podman[306090]: 2025-12-15 09:54:36.054592864 +0000 UTC m=+0.081476233 container create 3ce5d225d53f1056291055f54a4c1a4f19a54c07742224897b874db4d9c8b922 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=competent_jones, release=1763362218, description=Red Hat Ceph Storage 7, GIT_BRANCH=main, name=rhceph, distribution-scope=public, vendor=Red Hat, Inc., ceph=True, url=https://catalog.redhat.com/en/search?searchType=containers, io.buildah.version=1.41.4, GIT_CLEAN=True, io.openshift.expose-services=, version=7, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, io.k8s.description=Red Hat Ceph Storage 7, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, CEPH_POINT_RELEASE=, GIT_REPO=https://github.com/ceph/ceph-container.git, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, build-date=2025-11-26T19:44:28Z, com.redhat.component=rhceph-container, architecture=x86_64, 
io.openshift.tags=rhceph ceph, maintainer=Guillaume Abrioux , vcs-type=git, RELEASE=main, cpe=cpe:/a:redhat:enterprise_linux:9::appstream) Dec 15 04:54:36 localhost systemd[1]: Started libpod-conmon-3ce5d225d53f1056291055f54a4c1a4f19a54c07742224897b874db4d9c8b922.scope. Dec 15 04:54:36 localhost systemd[1]: Started libcrun container. Dec 15 04:54:36 localhost podman[306090]: 2025-12-15 09:54:36.021841894 +0000 UTC m=+0.048725283 image pull registry.redhat.io/rhceph/rhceph-7-rhel9:latest Dec 15 04:54:36 localhost podman[306090]: 2025-12-15 09:54:36.131174409 +0000 UTC m=+0.158057778 container init 3ce5d225d53f1056291055f54a4c1a4f19a54c07742224897b874db4d9c8b922 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=competent_jones, name=rhceph, GIT_CLEAN=True, io.openshift.expose-services=, vcs-type=git, ceph=True, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, release=1763362218, io.openshift.tags=rhceph ceph, distribution-scope=public, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, RELEASE=main, io.k8s.description=Red Hat Ceph Storage 7, io.buildah.version=1.41.4, architecture=x86_64, version=7, vendor=Red Hat, Inc., org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., build-date=2025-11-26T19:44:28Z, description=Red Hat Ceph Storage 7, GIT_REPO=https://github.com/ceph/ceph-container.git, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, GIT_BRANCH=main, maintainer=Guillaume Abrioux , url=https://catalog.redhat.com/en/search?searchType=containers, CEPH_POINT_RELEASE=, com.redhat.component=rhceph-container) Dec 15 04:54:36 localhost podman[306090]: 2025-12-15 09:54:36.14343342 +0000 UTC m=+0.170316789 container start 
3ce5d225d53f1056291055f54a4c1a4f19a54c07742224897b874db4d9c8b922 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=competent_jones, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., description=Red Hat Ceph Storage 7, io.openshift.tags=rhceph ceph, build-date=2025-11-26T19:44:28Z, vendor=Red Hat, Inc., GIT_CLEAN=True, CEPH_POINT_RELEASE=, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, distribution-scope=public, io.openshift.expose-services=, name=rhceph, com.redhat.component=rhceph-container, io.k8s.description=Red Hat Ceph Storage 7, io.buildah.version=1.41.4, maintainer=Guillaume Abrioux , cpe=cpe:/a:redhat:enterprise_linux:9::appstream, GIT_BRANCH=main, architecture=x86_64, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, RELEASE=main, version=7, release=1763362218, vcs-type=git, ceph=True, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_REPO=https://github.com/ceph/ceph-container.git) Dec 15 04:54:36 localhost competent_jones[306105]: 167 167 Dec 15 04:54:36 localhost systemd[1]: libpod-3ce5d225d53f1056291055f54a4c1a4f19a54c07742224897b874db4d9c8b922.scope: Deactivated successfully. 
Dec 15 04:54:36 localhost podman[306090]: 2025-12-15 09:54:36.143740848 +0000 UTC m=+0.170624217 container attach 3ce5d225d53f1056291055f54a4c1a4f19a54c07742224897b874db4d9c8b922 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=competent_jones, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, io.k8s.description=Red Hat Ceph Storage 7, release=1763362218, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., com.redhat.component=rhceph-container, distribution-scope=public, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, io.buildah.version=1.41.4, name=rhceph, maintainer=Guillaume Abrioux , architecture=x86_64, vcs-type=git, ceph=True, GIT_BRANCH=main, io.openshift.expose-services=, io.openshift.tags=rhceph ceph, build-date=2025-11-26T19:44:28Z, GIT_CLEAN=True, CEPH_POINT_RELEASE=, RELEASE=main, GIT_REPO=https://github.com/ceph/ceph-container.git, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., description=Red Hat Ceph Storage 7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, version=7) Dec 15 04:54:36 localhost podman[306090]: 2025-12-15 09:54:36.151398981 +0000 UTC m=+0.178282370 container died 3ce5d225d53f1056291055f54a4c1a4f19a54c07742224897b874db4d9c8b922 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=competent_jones, maintainer=Guillaume Abrioux , ceph=True, GIT_CLEAN=True, vcs-type=git, GIT_REPO=https://github.com/ceph/ceph-container.git, version=7, name=rhceph, vendor=Red Hat, Inc., io.k8s.description=Red Hat Ceph Storage 7, RELEASE=main, description=Red Hat Ceph Storage 7, CEPH_POINT_RELEASE=, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, GIT_BRANCH=main, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, 
io.openshift.expose-services=, com.redhat.component=rhceph-container, distribution-scope=public, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., url=https://catalog.redhat.com/en/search?searchType=containers, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, io.openshift.tags=rhceph ceph, io.buildah.version=1.41.4, release=1763362218, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, build-date=2025-11-26T19:44:28Z, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0) Dec 15 04:54:36 localhost ceph-mgr[292421]: mgr.server handle_open ignoring open from mon.np0005559463 172.18.0.107:0/2300540652; not ready for session (expect reconnect) Dec 15 04:54:36 localhost ceph-mgr[292421]: mgr finish mon failed to return metadata for mon.np0005559463: (2) No such file or directory Dec 15 04:54:36 localhost ceph-mon[298913]: from='mgr.26879 172.18.0.106:0/2534740212' entity='mgr.np0005559462.fudvyx' Dec 15 04:54:36 localhost ceph-mon[298913]: from='mgr.26879 172.18.0.106:0/2534740212' entity='mgr.np0005559462.fudvyx' Dec 15 04:54:36 localhost ceph-mon[298913]: from='mgr.26879 172.18.0.106:0/2534740212' entity='mgr.np0005559462.fudvyx' Dec 15 04:54:36 localhost ceph-mon[298913]: from='mgr.26879 172.18.0.106:0/2534740212' entity='mgr.np0005559462.fudvyx' cmd={"prefix": "auth get", "entity": "mon."} : dispatch Dec 15 04:54:36 localhost ceph-mon[298913]: Reconfiguring mon.np0005559462 (monmap changed)... 
Dec 15 04:54:36 localhost ceph-mon[298913]: Reconfiguring daemon mon.np0005559462 on np0005559462.localdomain Dec 15 04:54:36 localhost podman[306110]: 2025-12-15 09:54:36.262634508 +0000 UTC m=+0.098560897 container remove 3ce5d225d53f1056291055f54a4c1a4f19a54c07742224897b874db4d9c8b922 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=competent_jones, CEPH_POINT_RELEASE=, maintainer=Guillaume Abrioux , release=1763362218, io.k8s.description=Red Hat Ceph Storage 7, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, name=rhceph, RELEASE=main, GIT_REPO=https://github.com/ceph/ceph-container.git, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.openshift.tags=rhceph ceph, description=Red Hat Ceph Storage 7, GIT_CLEAN=True, architecture=x86_64, vendor=Red Hat, Inc., io.openshift.expose-services=, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.component=rhceph-container, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, io.buildah.version=1.41.4, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_BRANCH=main, ceph=True, version=7, distribution-scope=public, build-date=2025-11-26T19:44:28Z) Dec 15 04:54:36 localhost systemd[1]: libpod-conmon-3ce5d225d53f1056291055f54a4c1a4f19a54c07742224897b874db4d9c8b922.scope: Deactivated successfully. 
Dec 15 04:54:36 localhost ceph-mon[298913]: mon.np0005559462@2(peon) e11 adding peer [v2:172.18.0.104:3300/0,v1:172.18.0.104:6789/0] to list of hints Dec 15 04:54:36 localhost ceph-mon[298913]: mon.np0005559462@2(peon) e11 adding peer [v2:172.18.0.104:3300/0,v1:172.18.0.104:6789/0] to list of hints Dec 15 04:54:36 localhost ceph-mgr[292421]: [cephadm INFO cephadm.serve] Reconfiguring mon.np0005559464 (monmap changed)... Dec 15 04:54:36 localhost ceph-mgr[292421]: log_channel(cephadm) log [INF] : Reconfiguring mon.np0005559464 (monmap changed)... Dec 15 04:54:36 localhost ceph-mgr[292421]: [cephadm INFO cephadm.serve] Reconfiguring daemon mon.np0005559464 on np0005559464.localdomain Dec 15 04:54:36 localhost ceph-mgr[292421]: log_channel(cephadm) log [INF] : Reconfiguring daemon mon.np0005559464 on np0005559464.localdomain Dec 15 04:54:36 localhost ceph-mgr[292421]: log_channel(cluster) log [DBG] : pgmap v12: 177 pgs: 177 active+clean; 104 MiB data, 548 MiB used, 41 GiB / 42 GiB avail; 19 KiB/s rd, 0 B/s wr, 10 op/s Dec 15 04:54:37 localhost systemd[1]: var-lib-containers-storage-overlay-136f783894189efb2cea5ffa1cac4488ce2013e784b8c6b73de4f5ba49f79cf5-merged.mount: Deactivated successfully. 
Dec 15 04:54:37 localhost ceph-mgr[292421]: mgr.server handle_open ignoring open from mon.np0005559463 172.18.0.107:0/2300540652; not ready for session (expect reconnect)
Dec 15 04:54:37 localhost ceph-mgr[292421]: mgr finish mon failed to return metadata for mon.np0005559463: (2) No such file or directory
Dec 15 04:54:37 localhost ceph-mon[298913]: from='mgr.26879 172.18.0.106:0/2534740212' entity='mgr.np0005559462.fudvyx'
Dec 15 04:54:37 localhost ceph-mon[298913]: from='mgr.26879 172.18.0.106:0/2534740212' entity='mgr.np0005559462.fudvyx'
Dec 15 04:54:37 localhost ceph-mon[298913]: from='mgr.26879 172.18.0.106:0/2534740212' entity='mgr.np0005559462.fudvyx' cmd={"prefix": "auth get", "entity": "mon."} : dispatch
Dec 15 04:54:37 localhost ceph-mon[298913]: Reconfiguring mon.np0005559464 (monmap changed)...
Dec 15 04:54:37 localhost ceph-mon[298913]: Reconfiguring daemon mon.np0005559464 on np0005559464.localdomain
Dec 15 04:54:38 localhost ceph-mgr[292421]: mgr.server handle_open ignoring open from mon.np0005559463 172.18.0.107:0/2300540652; not ready for session (expect reconnect)
Dec 15 04:54:38 localhost ceph-mgr[292421]: mgr finish mon failed to return metadata for mon.np0005559463: (2) No such file or directory
Dec 15 04:54:38 localhost ceph-mon[298913]: from='mgr.26879 172.18.0.106:0/2534740212' entity='mgr.np0005559462.fudvyx'
Dec 15 04:54:38 localhost ceph-mon[298913]: from='mgr.26879 172.18.0.106:0/2534740212' entity='mgr.np0005559462.fudvyx'
Dec 15 04:54:38 localhost ceph-mon[298913]: mon.np0005559462@2(peon) e11 adding peer [v2:172.18.0.104:3300/0,v1:172.18.0.104:6789/0] to list of hints
Dec 15 04:54:38 localhost ceph-mgr[292421]: log_channel(cluster) log [DBG] : pgmap v13: 177 pgs: 177 active+clean; 104 MiB data, 548 MiB used, 41 GiB / 42 GiB avail
Dec 15 04:54:39 localhost ceph-mgr[292421]: mgr.server handle_open ignoring open from mon.np0005559463 172.18.0.107:0/2300540652; not ready for session (expect reconnect)
Dec 15 04:54:39 localhost ceph-mgr[292421]: mgr finish mon failed to return metadata for mon.np0005559463: (2) No such file or directory
Dec 15 04:54:40 localhost ceph-mon[298913]: mon.np0005559462@2(peon).osd e88 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 15 04:54:40 localhost ceph-mgr[292421]: mgr.server handle_open ignoring open from mon.np0005559463 172.18.0.107:0/2300540652; not ready for session (expect reconnect)
Dec 15 04:54:40 localhost ceph-mgr[292421]: mgr finish mon failed to return metadata for mon.np0005559463: (2) No such file or directory
Dec 15 04:54:40 localhost ceph-mon[298913]: mon.np0005559462@2(peon) e11 adding peer [v2:172.18.0.104:3300/0,v1:172.18.0.104:6789/0] to list of hints
Dec 15 04:54:40 localhost systemd[1]: Started /usr/bin/podman healthcheck run a4e94bd7aaee6c174c601060fb5f34559019ccea93fc397263ee3735731cfc1e.
Dec 15 04:54:40 localhost podman[306126]: 2025-12-15 09:54:40.751886235 +0000 UTC m=+0.080075134 container health_status a4e94bd7aaee6c174c601060fb5f34559019ccea93fc397263ee3735731cfc1e (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, maintainer=Navid Yaghoobi , managed_by=edpm_ansible, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter)
Dec 15 04:54:40 localhost podman[306126]: 2025-12-15 09:54:40.762331215 +0000 UTC m=+0.090520094 container exec_died a4e94bd7aaee6c174c601060fb5f34559019ccea93fc397263ee3735731cfc1e (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter)
Dec 15 04:54:40 localhost systemd[1]: a4e94bd7aaee6c174c601060fb5f34559019ccea93fc397263ee3735731cfc1e.service: Deactivated successfully.
Dec 15 04:54:40 localhost ceph-mgr[292421]: log_channel(cluster) log [DBG] : pgmap v14: 177 pgs: 177 active+clean; 104 MiB data, 548 MiB used, 41 GiB / 42 GiB avail
Dec 15 04:54:40 localhost nova_compute[286344]: 2025-12-15 09:54:40.797 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4996-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Dec 15 04:54:40 localhost nova_compute[286344]: 2025-12-15 09:54:40.798 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 15 04:54:40 localhost nova_compute[286344]: 2025-12-15 09:54:40.799 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m
Dec 15 04:54:40 localhost nova_compute[286344]: 2025-12-15 09:54:40.799 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m
Dec 15 04:54:40 localhost nova_compute[286344]: 2025-12-15 09:54:40.800 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m
Dec 15 04:54:41 localhost ceph-mgr[292421]: mgr.server handle_open ignoring open from mon.np0005559463 172.18.0.107:0/2300540652; not ready for session (expect reconnect)
Dec 15 04:54:41 localhost ceph-mgr[292421]: mgr finish mon failed to return metadata for mon.np0005559463: (2) No such file or directory
Dec 15 04:54:42 localhost ceph-mgr[292421]: mgr.server handle_open ignoring open from mon.np0005559463 172.18.0.107:0/2300540652; not ready for session (expect reconnect)
Dec 15 04:54:42 localhost ceph-mgr[292421]: mgr finish mon failed to return metadata for mon.np0005559463: (2) No such file or directory
Dec 15 04:54:42 localhost ceph-mon[298913]: mon.np0005559462@2(peon) e11 adding peer [v2:172.18.0.104:3300/0,v1:172.18.0.104:6789/0] to list of hints
Dec 15 04:54:42 localhost ceph-mon[298913]: mon.np0005559462@2(peon) e11 adding peer [v2:172.18.0.104:3300/0,v1:172.18.0.104:6789/0] to list of hints
Dec 15 04:54:42 localhost ceph-mgr[292421]: log_channel(cluster) log [DBG] : pgmap v15: 177 pgs: 177 active+clean; 104 MiB data, 548 MiB used, 41 GiB / 42 GiB avail
Dec 15 04:54:43 localhost ceph-mgr[292421]: mgr.server handle_open ignoring open from mon.np0005559463 172.18.0.107:0/2300540652; not ready for session (expect reconnect)
Dec 15 04:54:43 localhost ceph-mgr[292421]: mgr finish mon failed to return metadata for mon.np0005559463: (2) No such file or directory
Dec 15 04:54:43 localhost ceph-mgr[292421]: log_channel(audit) log [DBG] : from='client.27213 -' entity='client.admin' cmd=[{"prefix": "orch ps", "daemon_type": "mon", "daemon_id": "np0005559461", "target": ["mon-mgr", ""], "format": "json"}]: dispatch
Dec 15 04:54:44 localhost ceph-mgr[292421]: mgr.server handle_open ignoring open from mon.np0005559463 172.18.0.107:0/2300540652; not ready for session (expect reconnect)
Dec 15 04:54:44 localhost ceph-mgr[292421]: mgr finish mon failed to return metadata for mon.np0005559463: (2) No such file or directory
Dec 15 04:54:44 localhost ceph-mon[298913]: mon.np0005559462@2(peon) e11 adding peer [v2:172.18.0.104:3300/0,v1:172.18.0.104:6789/0] to list of hints
Dec 15 04:54:44 localhost ceph-mgr[292421]: log_channel(cluster) log [DBG] : pgmap v16: 177 pgs: 177 active+clean; 104 MiB data, 548 MiB used, 41 GiB / 42 GiB avail
Dec 15 04:54:44 localhost ceph-mgr[292421]: log_channel(audit) log [DBG] : from='client.27219 -' entity='client.admin' cmd=[{"prefix": "orch daemon rm", "names": ["mon.np0005559461"], "force": true, "target": ["mon-mgr", ""]}]: dispatch
Dec 15 04:54:44 localhost ceph-mgr[292421]: [cephadm INFO root] Remove daemons mon.np0005559461
Dec 15 04:54:44 localhost ceph-mgr[292421]: log_channel(cephadm) log [INF] : Remove daemons mon.np0005559461
Dec 15 04:54:44 localhost ceph-mgr[292421]: [cephadm INFO cephadm.services.cephadmservice] Safe to remove mon.np0005559461: new quorum should be ['np0005559464', 'np0005559462'] (from ['np0005559464', 'np0005559462'])
Dec 15 04:54:44 localhost ceph-mgr[292421]: log_channel(cephadm) log [INF] : Safe to remove mon.np0005559461: new quorum should be ['np0005559464', 'np0005559462'] (from ['np0005559464', 'np0005559462'])
Dec 15 04:54:44 localhost ceph-mgr[292421]: [cephadm INFO cephadm.services.cephadmservice] Removing monitor np0005559461 from monmap...
Dec 15 04:54:44 localhost ceph-mgr[292421]: log_channel(cephadm) log [INF] : Removing monitor np0005559461 from monmap...
Dec 15 04:54:44 localhost ceph-mgr[292421]: [cephadm INFO cephadm.serve] Removing daemon mon.np0005559461 from np0005559461.localdomain -- ports []
Dec 15 04:54:44 localhost ceph-mgr[292421]: log_channel(cephadm) log [INF] : Removing daemon mon.np0005559461 from np0005559461.localdomain -- ports []
Dec 15 04:54:44 localhost ceph-mon[298913]: mon.np0005559462@2(peon) e12 my rank is now 1 (was 2)
Dec 15 04:54:44 localhost ceph-mgr[292421]: client.0 ms_handle_reset on v2:172.18.0.103:3300/0
Dec 15 04:54:44 localhost ceph-mgr[292421]: client.0 ms_handle_reset on v2:172.18.0.103:3300/0
Dec 15 04:54:44 localhost ceph-mgr[292421]: client.27183 ms_handle_reset on v2:172.18.0.103:3300/0
Dec 15 04:54:44 localhost ceph-mgr[292421]: client.44351 ms_handle_reset on v2:172.18.0.103:3300/0
Dec 15 04:54:44 localhost ceph-mon[298913]: mon.np0005559462@1(probing) e12 handle_command mon_command({"prefix": "mon metadata", "id": "np0005559462"} v 0)
Dec 15 04:54:44 localhost ceph-mon[298913]: log_channel(audit) log [DBG] : from='mgr.26879 172.18.0.106:0/2534740212' entity='mgr.np0005559462.fudvyx' cmd={"prefix": "mon metadata", "id": "np0005559462"} : dispatch
Dec 15 04:54:44 localhost ceph-mon[298913]: mon.np0005559462@1(probing) e12 handle_command mon_command({"prefix": "mon metadata", "id": "np0005559464"} v 0)
Dec 15 04:54:44 localhost ceph-mon[298913]: log_channel(audit) log [DBG] : from='mgr.26879 172.18.0.106:0/2534740212' entity='mgr.np0005559462.fudvyx' cmd={"prefix": "mon metadata", "id": "np0005559464"} : dispatch
Dec 15 04:54:44 localhost ceph-mon[298913]: mon.np0005559462@1(probing) e12 handle_command mon_command({"prefix": "config generate-minimal-conf"} v 0)
Dec 15 04:54:44 localhost ceph-mon[298913]: log_channel(audit) log [DBG] : from='mgr.26879 172.18.0.106:0/2534740212' entity='mgr.np0005559462.fudvyx' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 15 04:54:45 localhost ceph-mon[298913]: log_channel(cluster) log [INF] : mon.np0005559462 calling monitor election
Dec 15 04:54:45 localhost ceph-mon[298913]: paxos.1).electionLogic(52) init, last seen epoch 52
Dec 15 04:54:45 localhost ceph-mon[298913]: mon.np0005559462@1(electing) e12 collect_metadata vda: no unique device id for vda: fallback method has no model nor serial
Dec 15 04:54:45 localhost ceph-mon[298913]: mon.np0005559462@1(electing) e12 collect_metadata vda: no unique device id for vda: fallback method has no model nor serial
Dec 15 04:54:45 localhost ceph-mon[298913]: mon.np0005559462@1(peon) e12 collect_metadata vda: no unique device id for vda: fallback method has no model nor serial
Dec 15 04:54:45 localhost ceph-mon[298913]: mon.np0005559462@1(peon) e12 handle_command mon_command({"prefix": "auth get", "entity": "client.admin"} v 0)
Dec 15 04:54:45 localhost ceph-mon[298913]: log_channel(audit) log [INF] : from='mgr.26879 172.18.0.106:0/2534740212' entity='mgr.np0005559462.fudvyx' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch
Dec 15 04:54:45 localhost ceph-mgr[292421]: [cephadm INFO cephadm.serve] Updating np0005559461.localdomain:/etc/ceph/ceph.conf
Dec 15 04:54:45 localhost ceph-mgr[292421]: log_channel(cephadm) log [INF] : Updating np0005559461.localdomain:/etc/ceph/ceph.conf
Dec 15 04:54:45 localhost ceph-mgr[292421]: [cephadm INFO cephadm.serve] Updating np0005559462.localdomain:/etc/ceph/ceph.conf
Dec 15 04:54:45 localhost ceph-mgr[292421]: log_channel(cephadm) log [INF] : Updating np0005559462.localdomain:/etc/ceph/ceph.conf
Dec 15 04:54:45 localhost ceph-mgr[292421]: [cephadm INFO cephadm.serve] Updating np0005559463.localdomain:/etc/ceph/ceph.conf
Dec 15 04:54:45 localhost ceph-mgr[292421]: log_channel(cephadm) log [INF] : Updating np0005559463.localdomain:/etc/ceph/ceph.conf
Dec 15 04:54:45 localhost ceph-mgr[292421]: [cephadm INFO cephadm.serve] Updating np0005559464.localdomain:/etc/ceph/ceph.conf
Dec 15 04:54:45 localhost ceph-mgr[292421]: log_channel(cephadm) log [INF] : Updating np0005559464.localdomain:/etc/ceph/ceph.conf
Dec 15 04:54:45 localhost ceph-mon[298913]: Remove daemons mon.np0005559461
Dec 15 04:54:45 localhost ceph-mon[298913]: Safe to remove mon.np0005559461: new quorum should be ['np0005559464', 'np0005559462'] (from ['np0005559464', 'np0005559462'])
Dec 15 04:54:45 localhost ceph-mon[298913]: Removing monitor np0005559461 from monmap...
Dec 15 04:54:45 localhost ceph-mon[298913]: Removing daemon mon.np0005559461 from np0005559461.localdomain -- ports []
Dec 15 04:54:45 localhost ceph-mon[298913]: mon.np0005559464 calling monitor election
Dec 15 04:54:45 localhost ceph-mon[298913]: mon.np0005559462 calling monitor election
Dec 15 04:54:45 localhost ceph-mon[298913]: mon.np0005559464 is new leader, mons np0005559464,np0005559462 in quorum (ranks 0,1)
Dec 15 04:54:45 localhost ceph-mon[298913]: from='mgr.26879 172.18.0.106:0/2534740212' entity='mgr.np0005559462.fudvyx' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch
Dec 15 04:54:45 localhost ceph-mon[298913]: Updating np0005559461.localdomain:/etc/ceph/ceph.conf
Dec 15 04:54:45 localhost ceph-mon[298913]: Updating np0005559462.localdomain:/etc/ceph/ceph.conf
Dec 15 04:54:45 localhost ceph-mon[298913]: Updating np0005559463.localdomain:/etc/ceph/ceph.conf
Dec 15 04:54:45 localhost ceph-mon[298913]: Updating np0005559464.localdomain:/etc/ceph/ceph.conf
Dec 15 04:54:45 localhost ceph-mon[298913]: Health detail: HEALTH_WARN 1 stray daemon(s) not managed by cephadm; 1 stray host(s) with 1 daemon(s) not managed by cephadm
Dec 15 04:54:45 localhost ceph-mon[298913]: [WRN] CEPHADM_STRAY_DAEMON: 1 stray daemon(s) not managed by cephadm
Dec 15 04:54:45 localhost ceph-mon[298913]: stray daemon mgr.np0005559460.oexkup on host np0005559460.localdomain not managed by cephadm
Dec 15 04:54:45 localhost ceph-mon[298913]: [WRN] CEPHADM_STRAY_HOST: 1 stray host(s) with 1 daemon(s) not managed by cephadm
Dec 15 04:54:45 localhost ceph-mon[298913]: stray host np0005559460.localdomain has 1 stray daemons: ['mgr.np0005559460.oexkup']
Dec 15 04:54:45 localhost ceph-mgr[292421]: mgr.server handle_open ignoring open from mon.np0005559463 172.18.0.107:0/2300540652; not ready for session (expect reconnect)
Dec 15 04:54:45 localhost ceph-mon[298913]: mon.np0005559462@1(peon) e12 handle_command mon_command({"prefix": "mon metadata", "id": "np0005559463"} v 0)
Dec 15 04:54:45 localhost ceph-mon[298913]: log_channel(audit) log [DBG] : from='mgr.26879 172.18.0.106:0/2534740212' entity='mgr.np0005559462.fudvyx' cmd={"prefix": "mon metadata", "id": "np0005559463"} : dispatch
Dec 15 04:54:45 localhost ceph-mgr[292421]: mgr finish mon failed to return metadata for mon.np0005559463: (2) No such file or directory
Dec 15 04:54:45 localhost ceph-mgr[292421]: [cephadm INFO cephadm.serve] Updating np0005559461.localdomain:/var/lib/ceph/bce17446-41b5-5408-a23e-0b011906b44a/config/ceph.conf
Dec 15 04:54:45 localhost ceph-mgr[292421]: log_channel(cephadm) log [INF] : Updating np0005559461.localdomain:/var/lib/ceph/bce17446-41b5-5408-a23e-0b011906b44a/config/ceph.conf
Dec 15 04:54:45 localhost ceph-mgr[292421]: [cephadm INFO cephadm.serve] Updating np0005559463.localdomain:/var/lib/ceph/bce17446-41b5-5408-a23e-0b011906b44a/config/ceph.conf
Dec 15 04:54:45 localhost ceph-mgr[292421]: log_channel(cephadm) log [INF] : Updating np0005559463.localdomain:/var/lib/ceph/bce17446-41b5-5408-a23e-0b011906b44a/config/ceph.conf
Dec 15 04:54:45 localhost ceph-mgr[292421]: [cephadm INFO cephadm.serve] Updating np0005559462.localdomain:/var/lib/ceph/bce17446-41b5-5408-a23e-0b011906b44a/config/ceph.conf
Dec 15 04:54:45 localhost ceph-mgr[292421]: log_channel(cephadm) log [INF] : Updating np0005559462.localdomain:/var/lib/ceph/bce17446-41b5-5408-a23e-0b011906b44a/config/ceph.conf
Dec 15 04:54:45 localhost ceph-mgr[292421]: [cephadm INFO cephadm.serve] Updating np0005559464.localdomain:/var/lib/ceph/bce17446-41b5-5408-a23e-0b011906b44a/config/ceph.conf
Dec 15 04:54:45 localhost ceph-mgr[292421]: log_channel(cephadm) log [INF] : Updating np0005559464.localdomain:/var/lib/ceph/bce17446-41b5-5408-a23e-0b011906b44a/config/ceph.conf
Dec 15 04:54:45 localhost nova_compute[286344]: 2025-12-15 09:54:45.802 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 15 04:54:46 localhost ceph-mon[298913]: Updating np0005559461.localdomain:/var/lib/ceph/bce17446-41b5-5408-a23e-0b011906b44a/config/ceph.conf
Dec 15 04:54:46 localhost ceph-mon[298913]: Updating np0005559463.localdomain:/var/lib/ceph/bce17446-41b5-5408-a23e-0b011906b44a/config/ceph.conf
Dec 15 04:54:46 localhost ceph-mon[298913]: Updating np0005559462.localdomain:/var/lib/ceph/bce17446-41b5-5408-a23e-0b011906b44a/config/ceph.conf
Dec 15 04:54:46 localhost ceph-mon[298913]: Updating np0005559464.localdomain:/var/lib/ceph/bce17446-41b5-5408-a23e-0b011906b44a/config/ceph.conf
Dec 15 04:54:46 localhost ceph-mgr[292421]: mgr.server handle_open ignoring open from mon.np0005559463 172.18.0.107:0/2300540652; not ready for session (expect reconnect)
Dec 15 04:54:46 localhost ceph-mon[298913]: mon.np0005559462@1(peon) e12 handle_command mon_command({"prefix": "mon metadata", "id": "np0005559463"} v 0)
Dec 15 04:54:46 localhost ceph-mon[298913]: log_channel(audit) log [DBG] : from='mgr.26879 172.18.0.106:0/2534740212' entity='mgr.np0005559462.fudvyx' cmd={"prefix": "mon metadata", "id": "np0005559463"} : dispatch
Dec 15 04:54:46 localhost ceph-mgr[292421]: mgr finish mon failed to return metadata for mon.np0005559463: (2) No such file or directory
Dec 15 04:54:46 localhost ceph-mon[298913]: mon.np0005559462@1(peon) e12 adding peer [v2:172.18.0.104:3300/0,v1:172.18.0.104:6789/0] to list of hints
Dec 15 04:54:46 localhost ceph-mon[298913]: mon.np0005559462@1(peon) e12 adding peer [v2:172.18.0.104:3300/0,v1:172.18.0.104:6789/0] to list of hints
Dec 15 04:54:46 localhost ceph-mon[298913]: mon.np0005559462@1(probing) e13 handle_command mon_command({"prefix": "mon metadata", "id": "np0005559462"} v 0)
Dec 15 04:54:46 localhost ceph-mon[298913]: log_channel(audit) log [DBG] : from='mgr.26879 172.18.0.106:0/2534740212' entity='mgr.np0005559462.fudvyx' cmd={"prefix": "mon metadata", "id": "np0005559462"} : dispatch
Dec 15 04:54:46 localhost ceph-mon[298913]: mon.np0005559462@1(probing) e13 handle_command mon_command({"prefix": "mon metadata", "id": "np0005559463"} v 0)
Dec 15 04:54:46 localhost ceph-mon[298913]: log_channel(audit) log [DBG] : from='mgr.26879 172.18.0.106:0/2534740212' entity='mgr.np0005559462.fudvyx' cmd={"prefix": "mon metadata", "id": "np0005559463"} : dispatch
Dec 15 04:54:46 localhost ceph-mon[298913]: log_channel(cluster) log [INF] : mon.np0005559462 calling monitor election
Dec 15 04:54:46 localhost ceph-mon[298913]: paxos.1).electionLogic(54) init, last seen epoch 54
Dec 15 04:54:46 localhost ceph-mgr[292421]: mgr finish mon failed to return metadata for mon.np0005559463: (22) Invalid argument
Dec 15 04:54:46 localhost ceph-mon[298913]: mon.np0005559462@1(electing) e13 collect_metadata vda: no unique device id for vda: fallback method has no model nor serial
Dec 15 04:54:46 localhost ceph-mon[298913]: mon.np0005559462@1(electing) e13 collect_metadata vda: no unique device id for vda: fallback method has no model nor serial
Dec 15 04:54:46 localhost ceph-mon[298913]: mon.np0005559462@1(electing) e13 handle_command mon_command({"prefix": "mon metadata", "id": "np0005559464"} v 0)
Dec 15 04:54:46 localhost ceph-mon[298913]: log_channel(audit) log [DBG] : from='mgr.26879 172.18.0.106:0/2534740212' entity='mgr.np0005559462.fudvyx' cmd={"prefix": "mon metadata", "id": "np0005559464"} : dispatch
Dec 15 04:54:46 localhost ceph-mon[298913]: mon.np0005559462@1(electing) e13 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005559463.localdomain.devices.0}] v 0)
Dec 15 04:54:46 localhost ceph-mon[298913]: mon.np0005559462@1(electing) e13 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005559462.localdomain.devices.0}] v 0)
Dec 15 04:54:46 localhost ceph-mon[298913]: mon.np0005559462@1(electing) e13 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005559464.localdomain.devices.0}] v 0)
Dec 15 04:54:46 localhost ceph-mon[298913]: mon.np0005559462@1(electing) e13 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005559461.localdomain.devices.0}] v 0)
Dec 15 04:54:46 localhost ceph-mgr[292421]: log_channel(cluster) log [DBG] : pgmap v17: 177 pgs: 177 active+clean; 104 MiB data, 548 MiB used, 41 GiB / 42 GiB avail
Dec 15 04:54:47 localhost ceph-mgr[292421]: mgr.server handle_open ignoring open from mon.np0005559463 172.18.0.107:0/2300540652; not ready for session (expect reconnect)
Dec 15 04:54:47 localhost ceph-mon[298913]: mon.np0005559462@1(electing) e13 handle_command mon_command({"prefix": "mon metadata", "id": "np0005559463"} v 0)
Dec 15 04:54:47 localhost ceph-mon[298913]: log_channel(audit) log [DBG] : from='mgr.26879 172.18.0.106:0/2534740212' entity='mgr.np0005559462.fudvyx' cmd={"prefix": "mon metadata", "id": "np0005559463"} : dispatch
Dec 15 04:54:47 localhost ceph-mgr[292421]: mgr finish mon failed to return metadata for mon.np0005559463: (22) Invalid argument
Dec 15 04:54:48 localhost ceph-mgr[292421]: mgr.server handle_open ignoring open from mon.np0005559463 172.18.0.107:0/2300540652; not ready for session (expect reconnect)
Dec 15 04:54:48 localhost ceph-mon[298913]: mon.np0005559462@1(electing) e13 handle_command mon_command({"prefix": "mon metadata", "id": "np0005559463"} v 0)
Dec 15 04:54:48 localhost ceph-mon[298913]: log_channel(audit) log [DBG] : from='mgr.26879 172.18.0.106:0/2534740212' entity='mgr.np0005559462.fudvyx' cmd={"prefix": "mon metadata", "id": "np0005559463"} : dispatch
Dec 15 04:54:48 localhost ceph-mgr[292421]: mgr finish mon failed to return metadata for mon.np0005559463: (22) Invalid argument
Dec 15 04:54:48 localhost ceph-mgr[292421]: log_channel(cluster) log [DBG] : pgmap v18: 177 pgs: 177 active+clean; 104 MiB data, 548 MiB used, 41 GiB / 42 GiB avail
Dec 15 04:54:49 localhost ceph-mgr[292421]: mgr.server handle_open ignoring open from mon.np0005559463 172.18.0.107:0/2300540652; not ready for session (expect reconnect)
Dec 15 04:54:49 localhost ceph-mon[298913]: mon.np0005559462@1(electing) e13 handle_command mon_command({"prefix": "mon metadata", "id": "np0005559463"} v 0)
Dec 15 04:54:49 localhost ceph-mon[298913]: log_channel(audit) log [DBG] : from='mgr.26879 172.18.0.106:0/2534740212' entity='mgr.np0005559462.fudvyx' cmd={"prefix": "mon metadata", "id": "np0005559463"} : dispatch
Dec 15 04:54:49 localhost ceph-mgr[292421]: mgr finish mon failed to return metadata for mon.np0005559463: (22) Invalid argument
Dec 15 04:54:50 localhost ceph-mgr[292421]: mgr.server handle_open ignoring open from mon.np0005559463 172.18.0.107:0/2300540652; not ready for session (expect reconnect)
Dec 15 04:54:50 localhost ceph-mon[298913]: mon.np0005559462@1(electing) e13 handle_command mon_command({"prefix": "mon metadata", "id": "np0005559463"} v 0)
Dec 15 04:54:50 localhost ceph-mon[298913]: log_channel(audit) log [DBG] : from='mgr.26879 172.18.0.106:0/2534740212' entity='mgr.np0005559462.fudvyx' cmd={"prefix": "mon metadata", "id": "np0005559463"} : dispatch
Dec 15 04:54:50 localhost ceph-mgr[292421]: mgr finish mon failed to return metadata for mon.np0005559463: (22) Invalid argument
Dec 15 04:54:50 localhost ceph-mgr[292421]: [volumes INFO mgr_util] scanning for idle connections..
Dec 15 04:54:50 localhost ceph-mgr[292421]: [volumes INFO mgr_util] cleaning up connections: []
Dec 15 04:54:50 localhost ceph-mgr[292421]: [volumes INFO mgr_util] scanning for idle connections..
Dec 15 04:54:50 localhost ceph-mgr[292421]: [volumes INFO mgr_util] cleaning up connections: []
Dec 15 04:54:50 localhost ceph-mgr[292421]: [volumes INFO mgr_util] scanning for idle connections..
Dec 15 04:54:50 localhost ceph-mgr[292421]: [volumes INFO mgr_util] cleaning up connections: []
Dec 15 04:54:50 localhost ceph-mds[291134]: mds.beacon.mds.np0005559462.mhigvc missed beacon ack from the monitors
Dec 15 04:54:50 localhost ceph-mgr[292421]: log_channel(cluster) log [DBG] : pgmap v19: 177 pgs: 177 active+clean; 104 MiB data, 548 MiB used, 41 GiB / 42 GiB avail
Dec 15 04:54:50 localhost nova_compute[286344]: 2025-12-15 09:54:50.804 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4996-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Dec 15 04:54:51 localhost ceph-mgr[292421]: mgr.server handle_open ignoring open from mon.np0005559463 172.18.0.107:0/2300540652; not ready for session (expect reconnect)
Dec 15 04:54:51 localhost ceph-mon[298913]: mon.np0005559462@1(electing) e13 handle_command mon_command({"prefix": "mon metadata", "id": "np0005559463"} v 0)
Dec 15 04:54:51 localhost ceph-mon[298913]: log_channel(audit) log [DBG] : from='mgr.26879 172.18.0.106:0/2534740212' entity='mgr.np0005559462.fudvyx' cmd={"prefix": "mon metadata", "id": "np0005559463"} : dispatch
Dec 15 04:54:51 localhost ceph-mgr[292421]: mgr finish mon failed to return metadata for mon.np0005559463: (22) Invalid argument
Dec 15 04:54:51 localhost ceph-mon[298913]: mon.np0005559462@1(peon) e13 collect_metadata vda: no unique device id for vda: fallback method has no model nor serial
Dec 15 04:54:51 localhost ceph-mon[298913]: mon.np0005559462@1(peon) e13 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005559463.localdomain}] v 0)
Dec 15 04:54:51 localhost ceph-mon[298913]: mon.np0005559462@1(peon) e13 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005559462.localdomain}] v 0)
Dec 15 04:54:51 localhost ceph-mon[298913]: mon.np0005559462@1(peon) e13 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005559464.localdomain}] v 0)
Dec 15 04:54:51 localhost ceph-mon[298913]: mon.np0005559462@1(peon) e13 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005559461.localdomain}] v 0)
Dec 15 04:54:51 localhost ceph-mon[298913]: mon.np0005559464 calling monitor election
Dec 15 04:54:51 localhost ceph-mon[298913]: mon.np0005559462 calling monitor election
Dec 15 04:54:51 localhost ceph-mon[298913]: mon.np0005559464 is new leader, mons np0005559464,np0005559462 in quorum (ranks 0,1)
Dec 15 04:54:51 localhost ceph-mon[298913]: Health check failed: 1/3 mons down, quorum np0005559464,np0005559462 (MON_DOWN)
Dec 15 04:54:51 localhost ceph-mon[298913]: Health detail: HEALTH_WARN 1 stray daemon(s) not managed by cephadm; 1 stray host(s) with 1 daemon(s) not managed by cephadm; 1/3 mons down, quorum np0005559464,np0005559462
Dec 15 04:54:51 localhost ceph-mon[298913]: [WRN] CEPHADM_STRAY_DAEMON: 1 stray daemon(s) not managed by cephadm
Dec 15 04:54:51 localhost ceph-mon[298913]: stray daemon mgr.np0005559460.oexkup on host np0005559460.localdomain not managed by cephadm
Dec 15 04:54:51 localhost ceph-mon[298913]: [WRN] CEPHADM_STRAY_HOST: 1 stray host(s) with 1 daemon(s) not managed by cephadm
Dec 15 04:54:51 localhost ceph-mon[298913]: stray host np0005559460.localdomain has 1 stray daemons: ['mgr.np0005559460.oexkup']
Dec 15 04:54:51 localhost ceph-mon[298913]: [WRN] MON_DOWN: 1/3 mons down, quorum np0005559464,np0005559462
Dec 15 04:54:51 localhost ceph-mon[298913]: mon.np0005559463 (rank 2) addr [v2:172.18.0.104:3300/0,v1:172.18.0.104:6789/0] is down (out of quorum)
Dec 15 04:54:51 localhost ceph-mon[298913]: from='mgr.26879 ' entity='mgr.np0005559462.fudvyx'
Dec 15 04:54:51 localhost ceph-mon[298913]: from='mgr.26879 ' entity='mgr.np0005559462.fudvyx'
Dec 15 04:54:51 localhost ceph-mon[298913]: from='mgr.26879 ' entity='mgr.np0005559462.fudvyx'
Dec 15 04:54:51 localhost ovn_metadata_agent[160585]: 2025-12-15 09:54:51.471 160590 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec 15 04:54:51 localhost ovn_metadata_agent[160585]: 2025-12-15 09:54:51.472 160590 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec 15 04:54:51 localhost ovn_metadata_agent[160585]: 2025-12-15 09:54:51.472 160590 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec 15 04:54:51 localhost ceph-mon[298913]: mon.np0005559462@1(peon) e13 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/osd_remove_queue}] v 0)
Dec 15 04:54:51 localhost ceph-mgr[292421]: [progress INFO root] update: starting ev bfd1151a-c95d-4fdc-a68c-b492e57c8a6d (Updating mon deployment (+1 -> 4))
Dec 15 04:54:51 localhost ceph-mon[298913]: mon.np0005559462@1(peon) e13 handle_command mon_command({"prefix": "auth get", "entity": "mon."} v 0)
Dec 15 04:54:51 localhost ceph-mon[298913]: log_channel(audit) log [INF] : from='mgr.26879 172.18.0.106:0/2534740212' entity='mgr.np0005559462.fudvyx' cmd={"prefix": "auth get", "entity": "mon."} : dispatch
Dec 15 04:54:51 localhost ceph-mon[298913]: mon.np0005559462@1(peon) e13 handle_command mon_command({"prefix": "config get", "who": "mon", "key": "public_network"} v 0)
Dec 15 04:54:51 localhost ceph-mon[298913]: log_channel(audit) log [DBG] : from='mgr.26879 172.18.0.106:0/2534740212' entity='mgr.np0005559462.fudvyx' cmd={"prefix": "config get", "who": "mon", "key": "public_network"} : dispatch
Dec 15 04:54:51 localhost ceph-mon[298913]: mon.np0005559462@1(peon) e13 handle_command mon_command({"prefix": "config generate-minimal-conf"} v 0)
Dec 15 04:54:51 localhost ceph-mon[298913]: log_channel(audit) log [DBG] : from='mgr.26879 172.18.0.106:0/2534740212' entity='mgr.np0005559462.fudvyx' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 15 04:54:51 localhost ceph-mgr[292421]: [cephadm INFO cephadm.serve] Deploying daemon mon.np0005559461 on np0005559461.localdomain
Dec 15 04:54:51 localhost ceph-mgr[292421]: log_channel(cephadm) log [INF] : Deploying daemon mon.np0005559461 on np0005559461.localdomain
Dec 15 04:54:52 localhost ceph-mgr[292421]: mgr.server handle_open ignoring open from mon.np0005559463 172.18.0.107:0/2300540652; not ready for session (expect reconnect)
Dec 15 04:54:52 localhost ceph-mon[298913]: mon.np0005559462@1(peon) e13 handle_command mon_command({"prefix": "mon metadata", "id": "np0005559463"} v 0)
Dec 15 04:54:52 localhost ceph-mon[298913]: log_channel(audit) log [DBG] : from='mgr.26879 172.18.0.106:0/2534740212' entity='mgr.np0005559462.fudvyx' cmd={"prefix": "mon metadata", "id": "np0005559463"} : dispatch
Dec 15 04:54:52 localhost ceph-mgr[292421]: mgr finish mon failed to return metadata for mon.np0005559463: (22) Invalid argument
Dec 15 04:54:52 localhost ceph-mgr[292421]: log_channel(audit) log [DBG] : from='client.34464 -' entity='client.admin' cmd=[{"prefix": "orch host label rm", "hostname": "np0005559461.localdomain", "label": "mon", "target": ["mon-mgr", ""]}]: dispatch
Dec 15 04:54:52 localhost ceph-mon[298913]: mon.np0005559462@1(peon) e13 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/inventory}] v 0)
Dec 15 04:54:52 localhost ceph-mgr[292421]: [cephadm INFO root] Removed label mon from host np0005559461.localdomain
Dec 15 04:54:52 localhost ceph-mgr[292421]: log_channel(cephadm) log [INF] : Removed label mon from host np0005559461.localdomain
Dec 15 04:54:52 localhost ceph-mon[298913]: from='mgr.26879 ' entity='mgr.np0005559462.fudvyx'
Dec 15 04:54:52 localhost ceph-mon[298913]: from='mgr.26879 ' entity='mgr.np0005559462.fudvyx'
Dec 15 04:54:52 localhost ceph-mon[298913]: from='mgr.26879 ' entity='mgr.np0005559462.fudvyx'
Dec 15 04:54:52 localhost ceph-mon[298913]: from='mgr.26879 ' entity='mgr.np0005559462.fudvyx'
Dec 15 04:54:52 localhost ceph-mon[298913]: from='mgr.26879 ' entity='mgr.np0005559462.fudvyx'
Dec 15 04:54:52 localhost ceph-mon[298913]: from='mgr.26879 ' entity='mgr.np0005559462.fudvyx'
Dec 15 04:54:52 localhost ceph-mon[298913]: from='mgr.26879 172.18.0.106:0/2534740212' entity='mgr.np0005559462.fudvyx' cmd={"prefix": "auth get", "entity": "mon."} : dispatch
Dec 15 04:54:52 localhost ceph-mon[298913]: Deploying daemon mon.np0005559461 on np0005559461.localdomain
Dec 15 04:54:52 localhost ceph-mon[298913]: from='mgr.26879 ' entity='mgr.np0005559462.fudvyx'
Dec 15 04:54:52 localhost ceph-mgr[292421]: log_channel(cluster) log [DBG] : pgmap v20: 177 pgs: 177 active+clean; 104 MiB data, 548 MiB used, 41 GiB / 42 GiB avail
Dec 15 04:54:53 localhost ceph-mgr[292421]: mgr.server handle_open ignoring open from mon.np0005559463 172.18.0.107:0/2300540652; not ready for session (expect reconnect)
Dec 15 04:54:53 localhost ceph-mon[298913]: mon.np0005559462@1(peon) e13 handle_command mon_command({"prefix": "mon metadata", "id": "np0005559463"} v 0)
Dec 15 04:54:53 localhost ceph-mon[298913]: log_channel(audit) log [DBG] : from='mgr.26879 172.18.0.106:0/2534740212' entity='mgr.np0005559462.fudvyx' cmd={"prefix": "mon metadata", "id": "np0005559463"} : dispatch
Dec 15 04:54:53 localhost ceph-mgr[292421]: mgr finish mon failed to return metadata for mon.np0005559463: (22) Invalid argument
Dec 15 04:54:53 localhost ceph-mon[298913]: log_channel(cluster) log [INF] : mon.np0005559462 calling monitor election
Dec 15 04:54:53 localhost ceph-mon[298913]: paxos.1).electionLogic(56) init, last seen epoch 56
Dec 15 04:54:53 localhost ceph-mon[298913]: mon.np0005559462@1(electing) e13 collect_metadata vda: no unique device id for vda: fallback method has no model nor serial
Dec 15 04:54:53 localhost ceph-mon[298913]: mon.np0005559462@1(electing) e13 collect_metadata vda: no unique device id for vda: fallback method has no model nor serial
Dec 15 04:54:53 localhost ceph-mon[298913]: mon.np0005559462@1(peon) e13 collect_metadata vda: no unique device id for vda: fallback method has no model nor serial
Dec 15 04:54:53 localhost ceph-mon[298913]: mon.np0005559463 calling monitor election
Dec 15 04:54:53 localhost ceph-mon[298913]: mon.np0005559463 calling monitor election
Dec 15 04:54:53 localhost ceph-mon[298913]: mon.np0005559464 calling monitor election
Dec 15 04:54:53 localhost ceph-mon[298913]: mon.np0005559462 calling monitor election
Dec 15 04:54:53 localhost ceph-mon[298913]: mon.np0005559464 is new leader, mons np0005559464,np0005559462,np0005559463 in quorum (ranks 0,1,2)
Dec 15 04:54:53 localhost ceph-mon[298913]: Health check cleared: MON_DOWN (was: 1/3 mons down, quorum np0005559464,np0005559462)
Dec 15 04:54:53 localhost ceph-mon[298913]: Health detail: HEALTH_WARN 1 stray daemon(s) not managed by cephadm; 1 stray host(s) with 1 daemon(s) not managed by cephadm
Dec 15 04:54:53 localhost ceph-mon[298913]: [WRN] CEPHADM_STRAY_DAEMON: 1 stray daemon(s) not managed by cephadm
Dec 15 04:54:53 localhost ceph-mon[298913]: stray daemon mgr.np0005559460.oexkup on host np0005559460.localdomain not managed by cephadm
Dec 15 04:54:53 localhost ceph-mon[298913]: [WRN] CEPHADM_STRAY_HOST: 1 stray host(s) with 1 daemon(s) not managed by cephadm
Dec 15 04:54:53 localhost ceph-mon[298913]: stray host np0005559460.localdomain has 1 stray daemons: ['mgr.np0005559460.oexkup']
Dec 15 04:54:53 localhost systemd[1]: Started /usr/bin/podman healthcheck run 67ee72fcd99c896f79e8807764b1d0f04e7d45927d06e306c0986394e8ed42a0.
Dec 15 04:54:53 localhost systemd[1]: Started /usr/bin/podman healthcheck run b3d8483ade539500e228c2af9c0c3e647febb5ec7903610a83aea32631007d4a.
Dec 15 04:54:53 localhost systemd[1]: Started /usr/bin/podman healthcheck run 9f0f481c937606298203797a71924b7614a5d9e3f4a8afd837cfa5b9602aedeb. Dec 15 04:54:53 localhost systemd[299725]: Starting Mark boot as successful... Dec 15 04:54:53 localhost podman[306469]: 2025-12-15 09:54:53.750326358 +0000 UTC m=+0.077090561 container health_status 67ee72fcd99c896f79e8807764b1d0f04e7d45927d06e306c0986394e8ed42a0 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible) Dec 15 04:54:53 localhost systemd[299725]: Finished Mark boot as successful. 
Dec 15 04:54:53 localhost podman[306470]: 2025-12-15 09:54:53.80444645 +0000 UTC m=+0.131493441 container health_status b3d8483ade539500e228c2af9c0c3e647febb5ec7903610a83aea32631007d4a (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ceilometer_agent_compute, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '07f3af15d79276418f54a8dde2fc251064527c3571430e14af77aaa2c01f1193-cb1e9478634a00c8fb5ad9cd7fcd38c7b2a1a85187dfaa618bc715c24c2b15fb'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202, 
org.label-schema.vendor=CentOS) Dec 15 04:54:53 localhost podman[306469]: 2025-12-15 09:54:53.835522182 +0000 UTC m=+0.162286395 container exec_died 67ee72fcd99c896f79e8807764b1d0f04e7d45927d06e306c0986394e8ed42a0 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible) Dec 15 04:54:53 localhost systemd[1]: 67ee72fcd99c896f79e8807764b1d0f04e7d45927d06e306c0986394e8ed42a0.service: Deactivated successfully. 
Dec 15 04:54:53 localhost podman[306471]: 2025-12-15 09:54:53.911430919 +0000 UTC m=+0.231687251 container health_status 9f0f481c937606298203797a71924b7614a5d9e3f4a8afd837cfa5b9602aedeb (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, config_id=multipathd, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=multipathd, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, tcib_managed=true, org.label-schema.schema-version=1.0) Dec 15 04:54:53 localhost podman[306470]: 2025-12-15 09:54:53.941497744 +0000 UTC m=+0.268544695 container exec_died 
b3d8483ade539500e228c2af9c0c3e647febb5ec7903610a83aea32631007d4a (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ceilometer_agent_compute, tcib_managed=true, managed_by=edpm_ansible, io.buildah.version=1.41.3, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '07f3af15d79276418f54a8dde2fc251064527c3571430e14af77aaa2c01f1193-cb1e9478634a00c8fb5ad9cd7fcd38c7b2a1a85187dfaa618bc715c24c2b15fb'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, container_name=ceilometer_agent_compute) Dec 15 04:54:53 localhost systemd[1]: b3d8483ade539500e228c2af9c0c3e647febb5ec7903610a83aea32631007d4a.service: 
Deactivated successfully. Dec 15 04:54:54 localhost podman[306471]: 2025-12-15 09:54:53.999758141 +0000 UTC m=+0.320014473 container exec_died 9f0f481c937606298203797a71924b7614a5d9e3f4a8afd837cfa5b9602aedeb (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=multipathd, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image) Dec 15 04:54:54 localhost systemd[1]: 9f0f481c937606298203797a71924b7614a5d9e3f4a8afd837cfa5b9602aedeb.service: Deactivated successfully. 
Dec 15 04:54:54 localhost ceph-mon[298913]: mon.np0005559462@1(peon) e13 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005559461.localdomain.devices.0}] v 0) Dec 15 04:54:54 localhost ceph-mon[298913]: mon.np0005559462@1(peon) e13 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005559461.localdomain}] v 0) Dec 15 04:54:54 localhost ceph-mon[298913]: mon.np0005559462@1(peon) e13 adding peer [v2:172.18.0.105:3300/0,v1:172.18.0.105:6789/0] to list of hints Dec 15 04:54:54 localhost ceph-mon[298913]: mon.np0005559462@1(peon) e13 adding peer [v2:172.18.0.105:3300/0,v1:172.18.0.105:6789/0] to list of hints Dec 15 04:54:54 localhost ceph-mon[298913]: mon.np0005559462@1(peon) e13 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/spec.mon}] v 0) Dec 15 04:54:54 localhost ceph-mgr[292421]: [progress INFO root] complete: finished ev bfd1151a-c95d-4fdc-a68c-b492e57c8a6d (Updating mon deployment (+1 -> 4)) Dec 15 04:54:54 localhost ceph-mgr[292421]: [progress INFO root] Completed event bfd1151a-c95d-4fdc-a68c-b492e57c8a6d (Updating mon deployment (+1 -> 4)) in 3 seconds Dec 15 04:54:54 localhost ceph-mon[298913]: mon.np0005559462@1(peon) e13 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/spec.mon}] v 0) Dec 15 04:54:54 localhost ceph-mgr[292421]: [progress INFO root] update: starting ev 5d13f653-d2db-4b2a-b76d-1e9abe2a018a (Updating node-proxy deployment (+4 -> 4)) Dec 15 04:54:54 localhost ceph-mgr[292421]: [progress INFO root] complete: finished ev 5d13f653-d2db-4b2a-b76d-1e9abe2a018a (Updating node-proxy deployment (+4 -> 4)) Dec 15 04:54:54 localhost ceph-mgr[292421]: [progress INFO root] Completed event 5d13f653-d2db-4b2a-b76d-1e9abe2a018a (Updating node-proxy deployment (+4 -> 4)) in 0 seconds Dec 15 04:54:54 localhost ceph-mon[298913]: mon.np0005559462@1(peon) e13 handle_command mon_command({"prefix": "osd tree", "states": ["destroyed"], "format": "json"} v 0) Dec 15 04:54:54 
localhost ceph-mon[298913]: log_channel(audit) log [DBG] : from='mgr.26879 172.18.0.106:0/2534740212' entity='mgr.np0005559462.fudvyx' cmd={"prefix": "osd tree", "states": ["destroyed"], "format": "json"} : dispatch Dec 15 04:54:54 localhost ceph-mgr[292421]: mgr.server handle_open ignoring open from mon.np0005559463 172.18.0.107:0/2300540652; not ready for session (expect reconnect) Dec 15 04:54:54 localhost ceph-mon[298913]: mon.np0005559462@1(peon) e13 handle_command mon_command({"prefix": "mon metadata", "id": "np0005559463"} v 0) Dec 15 04:54:54 localhost ceph-mon[298913]: log_channel(audit) log [DBG] : from='mgr.26879 172.18.0.106:0/2534740212' entity='mgr.np0005559462.fudvyx' cmd={"prefix": "mon metadata", "id": "np0005559463"} : dispatch Dec 15 04:54:54 localhost ceph-mon[298913]: mon.np0005559462@1(peon) e13 adding peer [v2:172.18.0.105:3300/0,v1:172.18.0.105:6789/0] to list of hints Dec 15 04:54:54 localhost ceph-mgr[292421]: mgr.server handle_open ignoring open from mon.np0005559461 172.18.0.105:0/1113426387; not ready for session (expect reconnect) Dec 15 04:54:54 localhost ceph-mon[298913]: mon.np0005559462@1(peon) e13 handle_command mon_command({"prefix": "mon metadata", "id": "np0005559461"} v 0) Dec 15 04:54:54 localhost ceph-mon[298913]: log_channel(audit) log [DBG] : from='mgr.26879 172.18.0.106:0/2534740212' entity='mgr.np0005559462.fudvyx' cmd={"prefix": "mon metadata", "id": "np0005559461"} : dispatch Dec 15 04:54:54 localhost ceph-mgr[292421]: mgr finish mon failed to return metadata for mon.np0005559461: (2) No such file or directory Dec 15 04:54:54 localhost ceph-mon[298913]: mon.np0005559462@1(probing) e14 handle_command mon_command({"prefix": "mon metadata", "id": "np0005559461"} v 0) Dec 15 04:54:54 localhost ceph-mon[298913]: log_channel(audit) log [DBG] : from='mgr.26879 172.18.0.106:0/2534740212' entity='mgr.np0005559462.fudvyx' cmd={"prefix": "mon metadata", "id": "np0005559461"} : dispatch Dec 15 04:54:54 localhost ceph-mon[298913]: 
mon.np0005559462@1(probing) e14 handle_command mon_command({"prefix": "mon metadata", "id": "np0005559462"} v 0) Dec 15 04:54:54 localhost ceph-mon[298913]: log_channel(audit) log [DBG] : from='mgr.26879 172.18.0.106:0/2534740212' entity='mgr.np0005559462.fudvyx' cmd={"prefix": "mon metadata", "id": "np0005559462"} : dispatch Dec 15 04:54:54 localhost ceph-mgr[292421]: mgr finish mon failed to return metadata for mon.np0005559461: (22) Invalid argument Dec 15 04:54:54 localhost ceph-mon[298913]: mon.np0005559462@1(probing) e14 handle_command mon_command({"prefix": "mon metadata", "id": "np0005559463"} v 0) Dec 15 04:54:54 localhost ceph-mon[298913]: log_channel(audit) log [DBG] : from='mgr.26879 172.18.0.106:0/2534740212' entity='mgr.np0005559462.fudvyx' cmd={"prefix": "mon metadata", "id": "np0005559463"} : dispatch Dec 15 04:54:54 localhost ceph-mon[298913]: log_channel(cluster) log [INF] : mon.np0005559462 calling monitor election Dec 15 04:54:54 localhost ceph-mon[298913]: paxos.1).electionLogic(58) init, last seen epoch 58 Dec 15 04:54:54 localhost ceph-mon[298913]: mon.np0005559462@1(electing) e14 collect_metadata vda: no unique device id for vda: fallback method has no model nor serial Dec 15 04:54:54 localhost ceph-mon[298913]: mon.np0005559462@1(electing) e14 collect_metadata vda: no unique device id for vda: fallback method has no model nor serial Dec 15 04:54:54 localhost ceph-mon[298913]: mon.np0005559462@1(electing) e14 handle_command mon_command({"prefix": "mon metadata", "id": "np0005559464"} v 0) Dec 15 04:54:54 localhost ceph-mon[298913]: log_channel(audit) log [DBG] : from='mgr.26879 172.18.0.106:0/2534740212' entity='mgr.np0005559462.fudvyx' cmd={"prefix": "mon metadata", "id": "np0005559464"} : dispatch Dec 15 04:54:54 localhost ceph-mgr[292421]: log_channel(cluster) log [DBG] : pgmap v21: 177 pgs: 177 active+clean; 104 MiB data, 548 MiB used, 41 GiB / 42 GiB avail Dec 15 04:54:55 localhost ceph-mgr[292421]: mgr.server handle_report got status 
from non-daemon mon.np0005559463 Dec 15 04:54:55 localhost ceph-bce17446-41b5-5408-a23e-0b011906b44a-mgr-np0005559462-fudvyx[292417]: 2025-12-15T09:54:55.213+0000 7ff1946a3640 -1 mgr.server handle_report got status from non-daemon mon.np0005559463 Dec 15 04:54:55 localhost ceph-mgr[292421]: [progress INFO root] Writing back 50 completed events Dec 15 04:54:55 localhost ceph-mon[298913]: mon.np0005559462@1(electing) e14 handle_command mon_command([{prefix=config-key set, key=mgr/progress/completed}] v 0) Dec 15 04:54:55 localhost systemd[1]: Started /usr/bin/podman healthcheck run 730c50020a7d9e2da21d2462d40af20ac303e43f3491f692cbbd87f432ba3e09. Dec 15 04:54:55 localhost systemd[1]: Started /usr/bin/podman healthcheck run ac2eae30a2a7f971398af42ea67b3ed799d4b3341f7688ebcd73f7ca7f66c2a8. Dec 15 04:54:55 localhost ceph-mgr[292421]: mgr.server handle_open ignoring open from mon.np0005559461 172.18.0.105:0/1113426387; not ready for session (expect reconnect) Dec 15 04:54:55 localhost ceph-mon[298913]: mon.np0005559462@1(electing) e14 handle_command mon_command({"prefix": "mon metadata", "id": "np0005559461"} v 0) Dec 15 04:54:55 localhost ceph-mon[298913]: log_channel(audit) log [DBG] : from='mgr.26879 172.18.0.106:0/2534740212' entity='mgr.np0005559462.fudvyx' cmd={"prefix": "mon metadata", "id": "np0005559461"} : dispatch Dec 15 04:54:55 localhost ceph-mgr[292421]: mgr finish mon failed to return metadata for mon.np0005559461: (22) Invalid argument Dec 15 04:54:55 localhost podman[306549]: 2025-12-15 09:54:55.747850332 +0000 UTC m=+0.076400991 container health_status 730c50020a7d9e2da21d2462d40af20ac303e43f3491f692cbbd87f432ba3e09 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, com.redhat.component=ubi9-minimal-container, version=9.6, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': 
'/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'cb1e9478634a00c8fb5ad9cd7fcd38c7b2a1a85187dfaa618bc715c24c2b15fb'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., config_id=openstack_network_exporter, architecture=x86_64, vendor=Red Hat, Inc., distribution-scope=public, vcs-type=git, maintainer=Red Hat, Inc., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, url=https://catalog.redhat.com/en/search?searchType=containers, managed_by=edpm_ansible, name=ubi9-minimal, build-date=2025-08-20T13:12:41, io.buildah.version=1.33.7, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, release=1755695350, container_name=openstack_network_exporter, io.openshift.expose-services=, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly.) 
Dec 15 04:54:55 localhost podman[306549]: 2025-12-15 09:54:55.763469725 +0000 UTC m=+0.092020454 container exec_died 730c50020a7d9e2da21d2462d40af20ac303e43f3491f692cbbd87f432ba3e09 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, version=9.6, build-date=2025-08-20T13:12:41, io.buildah.version=1.33.7, release=1755695350, io.openshift.tags=minimal rhel9, container_name=openstack_network_exporter, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vcs-type=git, name=ubi9-minimal, architecture=x86_64, vendor=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.component=ubi9-minimal-container, maintainer=Red Hat, Inc., config_id=openstack_network_exporter, io.openshift.expose-services=, managed_by=edpm_ansible, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'cb1e9478634a00c8fb5ad9cd7fcd38c7b2a1a85187dfaa618bc715c24c2b15fb'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., distribution-scope=public, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9.) Dec 15 04:54:55 localhost systemd[1]: 730c50020a7d9e2da21d2462d40af20ac303e43f3491f692cbbd87f432ba3e09.service: Deactivated successfully. Dec 15 04:54:55 localhost nova_compute[286344]: 2025-12-15 09:54:55.807 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 04:54:55 localhost systemd[1]: tmp-crun.HJ8N6R.mount: Deactivated successfully. 
Dec 15 04:54:55 localhost podman[306550]: 2025-12-15 09:54:55.826834934 +0000 UTC m=+0.152510634 container health_status ac2eae30a2a7f971398af42ea67b3ed799d4b3341f7688ebcd73f7ca7f66c2a8 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '07f3af15d79276418f54a8dde2fc251064527c3571430e14af77aaa2c01f1193'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.license=GPLv2, config_id=ovn_controller, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, container_name=ovn_controller, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team) Dec 15 04:54:55 localhost podman[306550]: 2025-12-15 09:54:55.857258269 +0000 UTC m=+0.182933969 container exec_died ac2eae30a2a7f971398af42ea67b3ed799d4b3341f7688ebcd73f7ca7f66c2a8 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, 
config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '07f3af15d79276418f54a8dde2fc251064527c3571430e14af77aaa2c01f1193'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3) Dec 15 04:54:55 localhost systemd[1]: ac2eae30a2a7f971398af42ea67b3ed799d4b3341f7688ebcd73f7ca7f66c2a8.service: Deactivated successfully. 
Dec 15 04:54:55 localhost ceph-mon[298913]: mon.np0005559462@1(electing) e14 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005559461.localdomain.devices.0}] v 0) Dec 15 04:54:56 localhost ceph-mgr[292421]: mgr.server handle_open ignoring open from mon.np0005559461 172.18.0.105:0/1113426387; not ready for session (expect reconnect) Dec 15 04:54:56 localhost ceph-mon[298913]: mon.np0005559462@1(electing) e14 handle_command mon_command({"prefix": "mon metadata", "id": "np0005559461"} v 0) Dec 15 04:54:56 localhost ceph-mon[298913]: log_channel(audit) log [DBG] : from='mgr.26879 172.18.0.106:0/2534740212' entity='mgr.np0005559462.fudvyx' cmd={"prefix": "mon metadata", "id": "np0005559461"} : dispatch Dec 15 04:54:56 localhost ceph-mgr[292421]: mgr finish mon failed to return metadata for mon.np0005559461: (22) Invalid argument Dec 15 04:54:56 localhost ceph-mgr[292421]: log_channel(cluster) log [DBG] : pgmap v22: 177 pgs: 177 active+clean; 104 MiB data, 548 MiB used, 41 GiB / 42 GiB avail Dec 15 04:54:57 localhost ceph-mgr[292421]: mgr.server handle_open ignoring open from mon.np0005559461 172.18.0.105:0/1113426387; not ready for session (expect reconnect) Dec 15 04:54:57 localhost ceph-mon[298913]: mon.np0005559462@1(electing) e14 handle_command mon_command({"prefix": "mon metadata", "id": "np0005559461"} v 0) Dec 15 04:54:57 localhost ceph-mon[298913]: log_channel(audit) log [DBG] : from='mgr.26879 172.18.0.106:0/2534740212' entity='mgr.np0005559462.fudvyx' cmd={"prefix": "mon metadata", "id": "np0005559461"} : dispatch Dec 15 04:54:57 localhost ceph-mgr[292421]: mgr finish mon failed to return metadata for mon.np0005559461: (22) Invalid argument Dec 15 04:54:58 localhost ceph-mgr[292421]: mgr.server handle_open ignoring open from mon.np0005559461 172.18.0.105:0/1113426387; not ready for session (expect reconnect) Dec 15 04:54:58 localhost ceph-mon[298913]: mon.np0005559462@1(electing) e14 handle_command mon_command({"prefix": "mon 
metadata", "id": "np0005559461"} v 0) Dec 15 04:54:58 localhost ceph-mon[298913]: log_channel(audit) log [DBG] : from='mgr.26879 172.18.0.106:0/2534740212' entity='mgr.np0005559462.fudvyx' cmd={"prefix": "mon metadata", "id": "np0005559461"} : dispatch Dec 15 04:54:58 localhost ceph-mgr[292421]: mgr finish mon failed to return metadata for mon.np0005559461: (22) Invalid argument Dec 15 04:54:58 localhost ceph-mgr[292421]: log_channel(cluster) log [DBG] : pgmap v23: 177 pgs: 177 active+clean; 104 MiB data, 548 MiB used, 41 GiB / 42 GiB avail Dec 15 04:54:59 localhost nova_compute[286344]: 2025-12-15 09:54:59.270 286348 DEBUG oslo_service.periodic_task [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Dec 15 04:54:59 localhost nova_compute[286344]: 2025-12-15 09:54:59.271 286348 DEBUG nova.compute.manager [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Cleaning up deleted instances with incomplete migration _cleanup_incomplete_migrations /usr/lib/python3.9/site-packages/nova/compute/manager.py:11183#033[00m Dec 15 04:54:59 localhost ceph-mon[298913]: mon.np0005559462@1(electing) e14 handle_auth_request failed to assign global_id Dec 15 04:54:59 localhost ceph-mon[298913]: mon.np0005559462@1(electing) e14 handle_auth_request failed to assign global_id Dec 15 04:54:59 localhost ceph-mgr[292421]: mgr.server handle_open ignoring open from mon.np0005559461 172.18.0.105:0/1113426387; not ready for session (expect reconnect) Dec 15 04:54:59 localhost ceph-mon[298913]: mon.np0005559462@1(electing) e14 handle_command mon_command({"prefix": "mon metadata", "id": "np0005559461"} v 0) Dec 15 04:54:59 localhost ceph-mon[298913]: log_channel(audit) log [DBG] : from='mgr.26879 172.18.0.106:0/2534740212' entity='mgr.np0005559462.fudvyx' cmd={"prefix": "mon metadata", "id": "np0005559461"} : dispatch 
Dec 15 04:54:59 localhost ceph-mgr[292421]: mgr finish mon failed to return metadata for mon.np0005559461: (22) Invalid argument Dec 15 04:54:59 localhost ceph-mon[298913]: mon.np0005559462@1(electing) e14 collect_metadata vda: no unique device id for vda: fallback method has no model nor serial Dec 15 04:54:59 localhost ceph-mon[298913]: mon.np0005559462@1(peon) e14 collect_metadata vda: no unique device id for vda: fallback method has no model nor serial Dec 15 04:54:59 localhost ceph-mon[298913]: mon.np0005559462@1(peon) e14 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005559461.localdomain}] v 0) Dec 15 04:54:59 localhost ceph-mon[298913]: mon.np0005559464 calling monitor election Dec 15 04:54:59 localhost ceph-mon[298913]: mon.np0005559463 calling monitor election Dec 15 04:54:59 localhost ceph-mon[298913]: mon.np0005559462 calling monitor election Dec 15 04:54:59 localhost ceph-mon[298913]: mon.np0005559461 calling monitor election Dec 15 04:54:59 localhost ceph-mon[298913]: mon.np0005559464 is new leader, mons np0005559464,np0005559462,np0005559463,np0005559461 in quorum (ranks 0,1,2,3) Dec 15 04:54:59 localhost ceph-mon[298913]: Health detail: HEALTH_WARN 1 stray daemon(s) not managed by cephadm; 1 stray host(s) with 1 daemon(s) not managed by cephadm Dec 15 04:54:59 localhost ceph-mon[298913]: [WRN] CEPHADM_STRAY_DAEMON: 1 stray daemon(s) not managed by cephadm Dec 15 04:54:59 localhost ceph-mon[298913]: stray daemon mgr.np0005559460.oexkup on host np0005559460.localdomain not managed by cephadm Dec 15 04:54:59 localhost ceph-mon[298913]: [WRN] CEPHADM_STRAY_HOST: 1 stray host(s) with 1 daemon(s) not managed by cephadm Dec 15 04:54:59 localhost ceph-mon[298913]: stray host np0005559460.localdomain has 1 stray daemons: ['mgr.np0005559460.oexkup'] Dec 15 04:54:59 localhost ceph-mon[298913]: from='mgr.26879 ' entity='mgr.np0005559462.fudvyx' Dec 15 04:55:00 localhost ceph-mon[298913]: mon.np0005559462@1(peon) e14 handle_command 
mon_command({"prefix": "config generate-minimal-conf"} v 0) Dec 15 04:55:00 localhost ceph-mon[298913]: log_channel(audit) log [DBG] : from='mgr.26879 172.18.0.106:0/2534740212' entity='mgr.np0005559462.fudvyx' cmd={"prefix": "config generate-minimal-conf"} : dispatch Dec 15 04:55:00 localhost ceph-mon[298913]: mon.np0005559462@1(peon) e14 handle_command mon_command({"prefix": "auth get", "entity": "client.admin"} v 0) Dec 15 04:55:00 localhost ceph-mon[298913]: log_channel(audit) log [INF] : from='mgr.26879 172.18.0.106:0/2534740212' entity='mgr.np0005559462.fudvyx' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch Dec 15 04:55:00 localhost ceph-mgr[292421]: [cephadm INFO cephadm.serve] Updating np0005559461.localdomain:/etc/ceph/ceph.conf Dec 15 04:55:00 localhost ceph-mgr[292421]: log_channel(cephadm) log [INF] : Updating np0005559461.localdomain:/etc/ceph/ceph.conf Dec 15 04:55:00 localhost ceph-mgr[292421]: [cephadm INFO cephadm.serve] Updating np0005559462.localdomain:/etc/ceph/ceph.conf Dec 15 04:55:00 localhost ceph-mgr[292421]: log_channel(cephadm) log [INF] : Updating np0005559462.localdomain:/etc/ceph/ceph.conf Dec 15 04:55:00 localhost ceph-mgr[292421]: [cephadm INFO cephadm.serve] Updating np0005559463.localdomain:/etc/ceph/ceph.conf Dec 15 04:55:00 localhost ceph-mgr[292421]: log_channel(cephadm) log [INF] : Updating np0005559463.localdomain:/etc/ceph/ceph.conf Dec 15 04:55:00 localhost ceph-mgr[292421]: [cephadm INFO cephadm.serve] Updating np0005559464.localdomain:/etc/ceph/ceph.conf Dec 15 04:55:00 localhost ceph-mgr[292421]: log_channel(cephadm) log [INF] : Updating np0005559464.localdomain:/etc/ceph/ceph.conf Dec 15 04:55:00 localhost ceph-mon[298913]: mon.np0005559462@1(peon).osd e88 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Dec 15 04:55:00 localhost ceph-mgr[292421]: log_channel(audit) log [DBG] : from='client.54107 -' entity='client.admin' cmd=[{"prefix": "orch host 
label rm", "hostname": "np0005559461.localdomain", "label": "mgr", "target": ["mon-mgr", ""]}]: dispatch Dec 15 04:55:00 localhost ceph-mon[298913]: mon.np0005559462@1(peon) e14 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/inventory}] v 0) Dec 15 04:55:00 localhost ceph-mgr[292421]: [cephadm INFO root] Removed label mgr from host np0005559461.localdomain Dec 15 04:55:00 localhost ceph-mgr[292421]: log_channel(cephadm) log [INF] : Removed label mgr from host np0005559461.localdomain Dec 15 04:55:00 localhost ceph-mgr[292421]: [cephadm INFO cephadm.serve] Updating np0005559461.localdomain:/var/lib/ceph/bce17446-41b5-5408-a23e-0b011906b44a/config/ceph.conf Dec 15 04:55:00 localhost ceph-mgr[292421]: log_channel(cephadm) log [INF] : Updating np0005559461.localdomain:/var/lib/ceph/bce17446-41b5-5408-a23e-0b011906b44a/config/ceph.conf Dec 15 04:55:00 localhost ceph-mgr[292421]: [cephadm INFO cephadm.serve] Updating np0005559462.localdomain:/var/lib/ceph/bce17446-41b5-5408-a23e-0b011906b44a/config/ceph.conf Dec 15 04:55:00 localhost ceph-mgr[292421]: log_channel(cephadm) log [INF] : Updating np0005559462.localdomain:/var/lib/ceph/bce17446-41b5-5408-a23e-0b011906b44a/config/ceph.conf Dec 15 04:55:00 localhost ceph-mgr[292421]: [cephadm INFO cephadm.serve] Updating np0005559463.localdomain:/var/lib/ceph/bce17446-41b5-5408-a23e-0b011906b44a/config/ceph.conf Dec 15 04:55:00 localhost ceph-mgr[292421]: log_channel(cephadm) log [INF] : Updating np0005559463.localdomain:/var/lib/ceph/bce17446-41b5-5408-a23e-0b011906b44a/config/ceph.conf Dec 15 04:55:00 localhost ceph-mgr[292421]: mgr.server handle_open ignoring open from mon.np0005559461 172.18.0.105:0/1113426387; not ready for session (expect reconnect) Dec 15 04:55:00 localhost ceph-mon[298913]: mon.np0005559462@1(peon) e14 handle_command mon_command({"prefix": "mon metadata", "id": "np0005559461"} v 0) Dec 15 04:55:00 localhost ceph-mon[298913]: log_channel(audit) log [DBG] : from='mgr.26879 
172.18.0.106:0/2534740212' entity='mgr.np0005559462.fudvyx' cmd={"prefix": "mon metadata", "id": "np0005559461"} : dispatch Dec 15 04:55:00 localhost ceph-mgr[292421]: [cephadm INFO cephadm.serve] Updating np0005559464.localdomain:/var/lib/ceph/bce17446-41b5-5408-a23e-0b011906b44a/config/ceph.conf Dec 15 04:55:00 localhost ceph-mgr[292421]: log_channel(cephadm) log [INF] : Updating np0005559464.localdomain:/var/lib/ceph/bce17446-41b5-5408-a23e-0b011906b44a/config/ceph.conf Dec 15 04:55:00 localhost ceph-mgr[292421]: log_channel(cluster) log [DBG] : pgmap v24: 177 pgs: 177 active+clean; 104 MiB data, 548 MiB used, 41 GiB / 42 GiB avail Dec 15 04:55:00 localhost nova_compute[286344]: 2025-12-15 09:55:00.810 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 04:55:00 localhost nova_compute[286344]: 2025-12-15 09:55:00.815 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 04:55:00 localhost ceph-mon[298913]: from='mgr.26879 ' entity='mgr.np0005559462.fudvyx' Dec 15 04:55:00 localhost ceph-mon[298913]: from='mgr.26879 ' entity='mgr.np0005559462.fudvyx' Dec 15 04:55:00 localhost ceph-mon[298913]: from='mgr.26879 172.18.0.106:0/2534740212' entity='mgr.np0005559462.fudvyx' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch Dec 15 04:55:00 localhost ceph-mon[298913]: Updating np0005559461.localdomain:/etc/ceph/ceph.conf Dec 15 04:55:00 localhost ceph-mon[298913]: Updating np0005559462.localdomain:/etc/ceph/ceph.conf Dec 15 04:55:00 localhost ceph-mon[298913]: Updating np0005559463.localdomain:/etc/ceph/ceph.conf Dec 15 04:55:00 localhost ceph-mon[298913]: Updating np0005559464.localdomain:/etc/ceph/ceph.conf Dec 15 04:55:00 localhost ceph-mon[298913]: from='mgr.26879 ' entity='mgr.np0005559462.fudvyx' Dec 15 04:55:01 localhost nova_compute[286344]: 2025-12-15 09:55:01.293 
286348 DEBUG oslo_service.periodic_task [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Dec 15 04:55:01 localhost ceph-mon[298913]: mon.np0005559462@1(peon) e14 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005559464.localdomain.devices.0}] v 0) Dec 15 04:55:01 localhost ceph-mon[298913]: mon.np0005559462@1(peon) e14 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005559464.localdomain}] v 0) Dec 15 04:55:01 localhost ceph-mon[298913]: mon.np0005559462@1(peon) e14 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005559462.localdomain.devices.0}] v 0) Dec 15 04:55:01 localhost ceph-mon[298913]: mon.np0005559462@1(peon) e14 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005559461.localdomain.devices.0}] v 0) Dec 15 04:55:01 localhost ceph-mon[298913]: mon.np0005559462@1(peon) e14 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005559462.localdomain}] v 0) Dec 15 04:55:01 localhost ceph-mon[298913]: mon.np0005559462@1(peon) e14 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005559463.localdomain.devices.0}] v 0) Dec 15 04:55:01 localhost ceph-mon[298913]: mon.np0005559462@1(peon) e14 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005559461.localdomain}] v 0) Dec 15 04:55:01 localhost ceph-mon[298913]: mon.np0005559462@1(peon) e14 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005559463.localdomain}] v 0) Dec 15 04:55:01 localhost ceph-mon[298913]: mon.np0005559462@1(peon) e14 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/osd_remove_queue}] v 0) Dec 15 04:55:01 localhost ceph-mgr[292421]: [progress INFO root] update: starting ev 
2a372bca-86a5-468e-8b8f-8a69595a47ea (Updating mgr deployment (-1 -> 3)) Dec 15 04:55:01 localhost ceph-mgr[292421]: [cephadm INFO cephadm.serve] Removing daemon mgr.np0005559461.egwgzn from np0005559461.localdomain -- ports [8765] Dec 15 04:55:01 localhost ceph-mgr[292421]: log_channel(cephadm) log [INF] : Removing daemon mgr.np0005559461.egwgzn from np0005559461.localdomain -- ports [8765] Dec 15 04:55:01 localhost ceph-mgr[292421]: log_channel(audit) log [DBG] : from='client.54115 -' entity='client.admin' cmd=[{"prefix": "orch host label rm", "hostname": "np0005559461.localdomain", "label": "_admin", "target": ["mon-mgr", ""]}]: dispatch Dec 15 04:55:01 localhost ceph-mon[298913]: mon.np0005559462@1(peon) e14 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/inventory}] v 0) Dec 15 04:55:01 localhost ceph-mgr[292421]: [cephadm INFO root] Removed label _admin from host np0005559461.localdomain Dec 15 04:55:01 localhost ceph-mgr[292421]: log_channel(cephadm) log [INF] : Removed label _admin from host np0005559461.localdomain Dec 15 04:55:01 localhost ceph-mgr[292421]: mgr.server handle_report got status from non-daemon mon.np0005559461 Dec 15 04:55:01 localhost ceph-bce17446-41b5-5408-a23e-0b011906b44a-mgr-np0005559462-fudvyx[292417]: 2025-12-15T09:55:01.740+0000 7ff1946a3640 -1 mgr.server handle_report got status from non-daemon mon.np0005559461 Dec 15 04:55:01 localhost podman[243449]: time="2025-12-15T09:55:01Z" level=info msg="List containers: received `last` parameter - overwriting `limit`" Dec 15 04:55:01 localhost podman[243449]: @ - - [15/Dec/2025:09:55:01 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 154814 "" "Go-http-client/1.1" Dec 15 04:55:01 localhost podman[243449]: @ - - [15/Dec/2025:09:55:01 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 18724 "" "Go-http-client/1.1" Dec 15 04:55:02 localhost 
ceph-mon[298913]: Removed label mgr from host np0005559461.localdomain Dec 15 04:55:02 localhost ceph-mon[298913]: Updating np0005559461.localdomain:/var/lib/ceph/bce17446-41b5-5408-a23e-0b011906b44a/config/ceph.conf Dec 15 04:55:02 localhost ceph-mon[298913]: Updating np0005559462.localdomain:/var/lib/ceph/bce17446-41b5-5408-a23e-0b011906b44a/config/ceph.conf Dec 15 04:55:02 localhost ceph-mon[298913]: Updating np0005559463.localdomain:/var/lib/ceph/bce17446-41b5-5408-a23e-0b011906b44a/config/ceph.conf Dec 15 04:55:02 localhost ceph-mon[298913]: Updating np0005559464.localdomain:/var/lib/ceph/bce17446-41b5-5408-a23e-0b011906b44a/config/ceph.conf Dec 15 04:55:02 localhost ceph-mon[298913]: from='mgr.26879 ' entity='mgr.np0005559462.fudvyx' Dec 15 04:55:02 localhost ceph-mon[298913]: from='mgr.26879 ' entity='mgr.np0005559462.fudvyx' Dec 15 04:55:02 localhost ceph-mon[298913]: from='mgr.26879 ' entity='mgr.np0005559462.fudvyx' Dec 15 04:55:02 localhost ceph-mon[298913]: from='mgr.26879 ' entity='mgr.np0005559462.fudvyx' Dec 15 04:55:02 localhost ceph-mon[298913]: from='mgr.26879 ' entity='mgr.np0005559462.fudvyx' Dec 15 04:55:02 localhost ceph-mon[298913]: from='mgr.26879 ' entity='mgr.np0005559462.fudvyx' Dec 15 04:55:02 localhost ceph-mon[298913]: from='mgr.26879 ' entity='mgr.np0005559462.fudvyx' Dec 15 04:55:02 localhost ceph-mon[298913]: from='mgr.26879 ' entity='mgr.np0005559462.fudvyx' Dec 15 04:55:02 localhost ceph-mon[298913]: from='mgr.26879 ' entity='mgr.np0005559462.fudvyx' Dec 15 04:55:02 localhost ceph-mon[298913]: from='mgr.26879 ' entity='mgr.np0005559462.fudvyx' Dec 15 04:55:02 localhost nova_compute[286344]: 2025-12-15 09:55:02.266 286348 DEBUG oslo_service.periodic_task [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Dec 15 04:55:02 localhost ceph-mgr[292421]: log_channel(cluster) 
log [DBG] : pgmap v25: 177 pgs: 177 active+clean; 104 MiB data, 548 MiB used, 41 GiB / 42 GiB avail Dec 15 04:55:03 localhost ceph-mon[298913]: Removing daemon mgr.np0005559461.egwgzn from np0005559461.localdomain -- ports [8765] Dec 15 04:55:03 localhost ceph-mon[298913]: Removed label _admin from host np0005559461.localdomain Dec 15 04:55:03 localhost ceph-mon[298913]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #28. Immutable memtables: 0. Dec 15 04:55:03 localhost ceph-mon[298913]: rocksdb: (Original Log Time 2025/12/15-09:55:03.526526) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0 Dec 15 04:55:03 localhost ceph-mon[298913]: rocksdb: [db/flush_job.cc:856] [default] [JOB 13] Flushing memtable with next log file: 28 Dec 15 04:55:03 localhost ceph-mon[298913]: rocksdb: EVENT_LOG_v1 {"time_micros": 1765792503526606, "job": 13, "event": "flush_started", "num_memtables": 1, "num_entries": 1331, "num_deletes": 259, "total_data_size": 1915934, "memory_usage": 1943168, "flush_reason": "Manual Compaction"} Dec 15 04:55:03 localhost ceph-mon[298913]: rocksdb: [db/flush_job.cc:885] [default] [JOB 13] Level-0 flush table #29: started Dec 15 04:55:03 localhost ceph-mon[298913]: rocksdb: EVENT_LOG_v1 {"time_micros": 1765792503535651, "cf_name": "default", "job": 13, "event": "table_file_creation", "file_number": 29, "file_size": 1104905, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 16857, "largest_seqno": 18183, "table_properties": {"data_size": 1099275, "index_size": 2717, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 1861, "raw_key_size": 15750, "raw_average_key_size": 21, "raw_value_size": 1086265, "raw_average_value_size": 1469, 
"num_data_blocks": 114, "num_entries": 739, "num_filter_entries": 739, "num_deletions": 259, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1765792470, "oldest_key_time": 1765792470, "file_creation_time": 1765792503, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "603b24af-e2be-4214-bc56-9e652eb4af3d", "db_session_id": "0OJRM9SCUA16EXV0VQZ2", "orig_file_number": 29, "seqno_to_time_mapping": "N/A"}} Dec 15 04:55:03 localhost ceph-mon[298913]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 13] Flush lasted 9173 microseconds, and 4174 cpu microseconds. Dec 15 04:55:03 localhost ceph-mon[298913]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed. 
Dec 15 04:55:03 localhost ceph-mon[298913]: rocksdb: (Original Log Time 2025/12/15-09:55:03.535705) [db/flush_job.cc:967] [default] [JOB 13] Level-0 flush table #29: 1104905 bytes OK Dec 15 04:55:03 localhost ceph-mon[298913]: rocksdb: (Original Log Time 2025/12/15-09:55:03.535728) [db/memtable_list.cc:519] [default] Level-0 commit table #29 started Dec 15 04:55:03 localhost ceph-mon[298913]: rocksdb: (Original Log Time 2025/12/15-09:55:03.537299) [db/memtable_list.cc:722] [default] Level-0 commit table #29: memtable #1 done Dec 15 04:55:03 localhost ceph-mon[298913]: rocksdb: (Original Log Time 2025/12/15-09:55:03.537324) EVENT_LOG_v1 {"time_micros": 1765792503537317, "job": 13, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0} Dec 15 04:55:03 localhost ceph-mon[298913]: rocksdb: (Original Log Time 2025/12/15-09:55:03.537349) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25 Dec 15 04:55:03 localhost ceph-mon[298913]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 13] Try to delete WAL files size 1908890, prev total WAL file size 1908890, number of live WAL files 2. Dec 15 04:55:03 localhost ceph-mon[298913]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005559462/store.db/000025.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000 Dec 15 04:55:03 localhost ceph-mon[298913]: rocksdb: (Original Log Time 2025/12/15-09:55:03.538133) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '6B760031323739' seq:72057594037927935, type:22 .. 
'6B760031353339' seq:0, type:0; will stop at (end) Dec 15 04:55:03 localhost ceph-mon[298913]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 14] Compacting 1@0 + 1@6 files to L6, score -1.00 Dec 15 04:55:03 localhost ceph-mon[298913]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 13 Base level 0, inputs: [29(1079KB)], [27(18MB)] Dec 15 04:55:03 localhost ceph-mon[298913]: rocksdb: EVENT_LOG_v1 {"time_micros": 1765792503538178, "job": 14, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [29], "files_L6": [27], "score": -1, "input_data_size": 20836827, "oldest_snapshot_seqno": -1} Dec 15 04:55:03 localhost ceph-mon[298913]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 14] Generated table #30: 11248 keys, 19768388 bytes, temperature: kUnknown Dec 15 04:55:03 localhost ceph-mon[298913]: rocksdb: EVENT_LOG_v1 {"time_micros": 1765792503678932, "cf_name": "default", "job": 14, "event": "table_file_creation", "file_number": 30, "file_size": 19768388, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 19702524, "index_size": 36643, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 28165, "raw_key_size": 303025, "raw_average_key_size": 26, "raw_value_size": 19508556, "raw_average_value_size": 1734, "num_data_blocks": 1389, "num_entries": 11248, "num_filter_entries": 11248, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; 
max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1765792320, "oldest_key_time": 0, "file_creation_time": 1765792503, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "603b24af-e2be-4214-bc56-9e652eb4af3d", "db_session_id": "0OJRM9SCUA16EXV0VQZ2", "orig_file_number": 30, "seqno_to_time_mapping": "N/A"}} Dec 15 04:55:03 localhost ceph-mon[298913]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed. Dec 15 04:55:03 localhost ceph-mon[298913]: rocksdb: (Original Log Time 2025/12/15-09:55:03.679535) [db/compaction/compaction_job.cc:1663] [default] [JOB 14] Compacted 1@0 + 1@6 files to L6 => 19768388 bytes Dec 15 04:55:03 localhost ceph-mon[298913]: rocksdb: (Original Log Time 2025/12/15-09:55:03.681535) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 147.7 rd, 140.1 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(1.1, 18.8 +0.0 blob) out(18.9 +0.0 blob), read-write-amplify(36.7) write-amplify(17.9) OK, records in: 11799, records dropped: 551 output_compression: NoCompression Dec 15 04:55:03 localhost ceph-mon[298913]: rocksdb: (Original Log Time 2025/12/15-09:55:03.681567) EVENT_LOG_v1 {"time_micros": 1765792503681554, "job": 14, "event": "compaction_finished", "compaction_time_micros": 141105, "compaction_time_cpu_micros": 36393, "output_level": 6, "num_output_files": 1, "total_output_size": 19768388, "num_input_records": 11799, "num_output_records": 11248, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]} Dec 15 04:55:03 localhost ceph-mon[298913]: rocksdb: [file/delete_scheduler.cc:74] Deleted file 
/var/lib/ceph/mon/ceph-np0005559462/store.db/000029.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000 Dec 15 04:55:03 localhost ceph-mon[298913]: rocksdb: EVENT_LOG_v1 {"time_micros": 1765792503682358, "job": 14, "event": "table_file_deletion", "file_number": 29} Dec 15 04:55:03 localhost ceph-mon[298913]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005559462/store.db/000027.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000 Dec 15 04:55:03 localhost ceph-mon[298913]: rocksdb: EVENT_LOG_v1 {"time_micros": 1765792503685615, "job": 14, "event": "table_file_deletion", "file_number": 27} Dec 15 04:55:03 localhost ceph-mon[298913]: rocksdb: (Original Log Time 2025/12/15-09:55:03.538074) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting Dec 15 04:55:03 localhost ceph-mon[298913]: rocksdb: (Original Log Time 2025/12/15-09:55:03.685834) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting Dec 15 04:55:03 localhost ceph-mon[298913]: rocksdb: (Original Log Time 2025/12/15-09:55:03.685841) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting Dec 15 04:55:03 localhost ceph-mon[298913]: rocksdb: (Original Log Time 2025/12/15-09:55:03.685844) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting Dec 15 04:55:03 localhost ceph-mon[298913]: rocksdb: (Original Log Time 2025/12/15-09:55:03.685847) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting Dec 15 04:55:03 localhost ceph-mon[298913]: rocksdb: (Original Log Time 2025/12/15-09:55:03.685850) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting Dec 15 04:55:03 localhost ceph-mgr[292421]: [cephadm INFO cephadm.services.cephadmservice] Removing key for mgr.np0005559461.egwgzn Dec 15 04:55:03 localhost ceph-mgr[292421]: log_channel(cephadm) log 
[INF] : Removing key for mgr.np0005559461.egwgzn Dec 15 04:55:03 localhost ceph-mon[298913]: mon.np0005559462@1(peon) e14 handle_command mon_command({"prefix": "auth rm", "entity": "mgr.np0005559461.egwgzn"} v 0) Dec 15 04:55:03 localhost ceph-mon[298913]: log_channel(audit) log [INF] : from='mgr.26879 172.18.0.106:0/2534740212' entity='mgr.np0005559462.fudvyx' cmd={"prefix": "auth rm", "entity": "mgr.np0005559461.egwgzn"} : dispatch Dec 15 04:55:03 localhost ceph-mon[298913]: mon.np0005559462@1(peon) e14 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/spec.mgr}] v 0) Dec 15 04:55:03 localhost ceph-mgr[292421]: [progress INFO root] complete: finished ev 2a372bca-86a5-468e-8b8f-8a69595a47ea (Updating mgr deployment (-1 -> 3)) Dec 15 04:55:03 localhost ceph-mgr[292421]: [progress INFO root] Completed event 2a372bca-86a5-468e-8b8f-8a69595a47ea (Updating mgr deployment (-1 -> 3)) in 2 seconds Dec 15 04:55:03 localhost ceph-mon[298913]: mon.np0005559462@1(peon) e14 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/spec.mgr}] v 0) Dec 15 04:55:03 localhost ceph-mgr[292421]: [progress INFO root] update: starting ev ad2a5e7d-c014-4d79-8aff-9590225e0392 (Updating mon deployment (-1 -> 3)) Dec 15 04:55:03 localhost ceph-mon[298913]: mon.np0005559462@1(peon) e14 handle_command mon_command({"prefix": "mon ok-to-stop", "ids": ["np0005559461"]} v 0) Dec 15 04:55:03 localhost ceph-mon[298913]: log_channel(audit) log [DBG] : from='mgr.26879 172.18.0.106:0/2534740212' entity='mgr.np0005559462.fudvyx' cmd={"prefix": "mon ok-to-stop", "ids": ["np0005559461"]} : dispatch Dec 15 04:55:03 localhost ceph-mon[298913]: mon.np0005559462@1(peon) e14 handle_command mon_command({"prefix": "quorum_status"} v 0) Dec 15 04:55:03 localhost ceph-mon[298913]: log_channel(audit) log [DBG] : from='mgr.26879 172.18.0.106:0/2534740212' entity='mgr.np0005559462.fudvyx' cmd={"prefix": "quorum_status"} : dispatch Dec 15 04:55:03 localhost ceph-mgr[292421]: [cephadm 
INFO cephadm.services.cephadmservice] Safe to remove mon.np0005559461: new quorum should be ['np0005559464', 'np0005559462', 'np0005559463'] (from ['np0005559464', 'np0005559462', 'np0005559463']) Dec 15 04:55:03 localhost ceph-mgr[292421]: log_channel(cephadm) log [INF] : Safe to remove mon.np0005559461: new quorum should be ['np0005559464', 'np0005559462', 'np0005559463'] (from ['np0005559464', 'np0005559462', 'np0005559463']) Dec 15 04:55:03 localhost ceph-mgr[292421]: [cephadm INFO cephadm.services.cephadmservice] Removing monitor np0005559461 from monmap... Dec 15 04:55:03 localhost ceph-mgr[292421]: log_channel(cephadm) log [INF] : Removing monitor np0005559461 from monmap... Dec 15 04:55:03 localhost ceph-mon[298913]: mon.np0005559462@1(peon) e14 handle_command mon_command({"prefix": "mon rm", "name": "np0005559461"} v 0) Dec 15 04:55:03 localhost ceph-mon[298913]: log_channel(audit) log [INF] : from='mgr.26879 172.18.0.106:0/2534740212' entity='mgr.np0005559462.fudvyx' cmd={"prefix": "mon rm", "name": "np0005559461"} : dispatch Dec 15 04:55:03 localhost ceph-mgr[292421]: [cephadm INFO cephadm.serve] Removing daemon mon.np0005559461 from np0005559461.localdomain -- ports [] Dec 15 04:55:03 localhost ceph-mgr[292421]: log_channel(cephadm) log [INF] : Removing daemon mon.np0005559461 from np0005559461.localdomain -- ports [] Dec 15 04:55:03 localhost ceph-mon[298913]: mon.np0005559462@1(probing) e15 handle_command mon_command({"prefix": "mon metadata", "id": "np0005559462"} v 0) Dec 15 04:55:03 localhost ceph-mon[298913]: log_channel(audit) log [DBG] : from='mgr.26879 172.18.0.106:0/2534740212' entity='mgr.np0005559462.fudvyx' cmd={"prefix": "mon metadata", "id": "np0005559462"} : dispatch Dec 15 04:55:03 localhost ceph-mon[298913]: mon.np0005559462@1(probing) e15 handle_command mon_command({"prefix": "mon metadata", "id": "np0005559463"} v 0) Dec 15 04:55:03 localhost ceph-mon[298913]: log_channel(audit) log [DBG] : from='mgr.26879 172.18.0.106:0/2534740212' 
entity='mgr.np0005559462.fudvyx' cmd={"prefix": "mon metadata", "id": "np0005559463"} : dispatch Dec 15 04:55:03 localhost ceph-mon[298913]: mon.np0005559462@1(probing) e15 handle_command mon_command({"prefix": "mon metadata", "id": "np0005559464"} v 0) Dec 15 04:55:03 localhost ceph-mon[298913]: log_channel(audit) log [DBG] : from='mgr.26879 172.18.0.106:0/2534740212' entity='mgr.np0005559462.fudvyx' cmd={"prefix": "mon metadata", "id": "np0005559464"} : dispatch Dec 15 04:55:03 localhost ceph-mon[298913]: log_channel(cluster) log [INF] : mon.np0005559462 calling monitor election Dec 15 04:55:03 localhost ceph-mon[298913]: paxos.1).electionLogic(62) init, last seen epoch 62 Dec 15 04:55:03 localhost ceph-mon[298913]: mon.np0005559462@1(electing) e15 collect_metadata vda: no unique device id for vda: fallback method has no model nor serial Dec 15 04:55:04 localhost nova_compute[286344]: 2025-12-15 09:55:04.270 286348 DEBUG oslo_service.periodic_task [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Dec 15 04:55:04 localhost nova_compute[286344]: 2025-12-15 09:55:04.272 286348 DEBUG oslo_service.periodic_task [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Dec 15 04:55:04 localhost nova_compute[286344]: 2025-12-15 09:55:04.294 286348 DEBUG oslo_concurrency.lockutils [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Dec 15 04:55:04 localhost nova_compute[286344]: 2025-12-15 09:55:04.294 286348 DEBUG oslo_concurrency.lockutils [None 
req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Dec 15 04:55:04 localhost nova_compute[286344]: 2025-12-15 09:55:04.295 286348 DEBUG oslo_concurrency.lockutils [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Dec 15 04:55:04 localhost nova_compute[286344]: 2025-12-15 09:55:04.296 286348 DEBUG nova.compute.resource_tracker [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Auditing locally available compute resources for np0005559462.localdomain (node: np0005559462.localdomain) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m Dec 15 04:55:04 localhost nova_compute[286344]: 2025-12-15 09:55:04.296 286348 DEBUG oslo_concurrency.processutils [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m Dec 15 04:55:04 localhost ceph-mon[298913]: mon.np0005559462@1(electing) e15 handle_auth_request failed to assign global_id Dec 15 04:55:04 localhost ceph-mon[298913]: mon.np0005559462@1(electing) e15 handle_auth_request failed to assign global_id Dec 15 04:55:04 localhost systemd[1]: Started /usr/bin/podman healthcheck run 4d46c406a3855fd1c322e5d86c048188ea5865daa07ba23aaebacc7879668923. 
Dec 15 04:55:04 localhost podman[306924]: 2025-12-15 09:55:04.733189154 +0000 UTC m=+0.068725648 container health_status 4d46c406a3855fd1c322e5d86c048188ea5865daa07ba23aaebacc7879668923 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '07f3af15d79276418f54a8dde2fc251064527c3571430e14af77aaa2c01f1193-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.schema-version=1.0) Dec 15 04:55:04 localhost 
podman[306924]: 2025-12-15 09:55:04.766444717 +0000 UTC m=+0.101981171 container exec_died 4d46c406a3855fd1c322e5d86c048188ea5865daa07ba23aaebacc7879668923 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '07f3af15d79276418f54a8dde2fc251064527c3571430e14af77aaa2c01f1193-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251202) Dec 15 04:55:04 localhost systemd[1]: 
4d46c406a3855fd1c322e5d86c048188ea5865daa07ba23aaebacc7879668923.service: Deactivated successfully. Dec 15 04:55:04 localhost ceph-mgr[292421]: log_channel(cluster) log [DBG] : pgmap v26: 177 pgs: 177 active+clean; 104 MiB data, 548 MiB used, 41 GiB / 42 GiB avail Dec 15 04:55:04 localhost openstack_network_exporter[246484]: ERROR 09:55:04 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd Dec 15 04:55:04 localhost openstack_network_exporter[246484]: ERROR 09:55:04 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server Dec 15 04:55:04 localhost openstack_network_exporter[246484]: ERROR 09:55:04 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd Dec 15 04:55:04 localhost openstack_network_exporter[246484]: ERROR 09:55:04 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath Dec 15 04:55:04 localhost openstack_network_exporter[246484]: Dec 15 04:55:04 localhost openstack_network_exporter[246484]: ERROR 09:55:04 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath Dec 15 04:55:04 localhost openstack_network_exporter[246484]: Dec 15 04:55:04 localhost ceph-mgr[292421]: [progress INFO root] Writing back 50 completed events Dec 15 04:55:04 localhost ceph-mon[298913]: mon.np0005559462@1(electing) e15 handle_command mon_command([{prefix=config-key set, key=mgr/progress/completed}] v 0) Dec 15 04:55:05 localhost ceph-mon[298913]: mon.np0005559462@1(electing) e15 handle_auth_request failed to assign global_id Dec 15 04:55:05 localhost ceph-mon[298913]: mon.np0005559462@1(electing) e15 handle_auth_request failed to assign global_id Dec 15 04:55:05 localhost ceph-mon[298913]: mon.np0005559462@1(electing) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/spec.mon}] v 0) Dec 15 04:55:05 localhost ceph-mon[298913]: mon.np0005559462@1(electing) e15 
handle_auth_request failed to assign global_id
Dec 15 04:55:05 localhost nova_compute[286344]: 2025-12-15 09:55:05.812 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 15 04:55:05 localhost ceph-mon[298913]: mon.np0005559462@1(electing) e15 handle_auth_request failed to assign global_id
Dec 15 04:55:05 localhost ceph-osd[31375]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Dec 15 04:55:05 localhost ceph-osd[31375]: rocksdb: [db/db_impl/db_impl.cc:1111]
** DB Stats **
Uptime(secs): 7800.1 total, 600.0 interval
Cumulative writes: 5049 writes, 22K keys, 5049 commit groups, 1.0 writes per commit group, ingest: 0.02 GB, 0.00 MB/s
Cumulative WAL: 5049 writes, 730 syncs, 6.92 writes per sync, written: 0.02 GB, 0.00 MB/s
Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent
Interval writes: 234 writes, 629 keys, 234 commit groups, 1.0 writes per commit group, ingest: 0.79 MB, 0.00 MB/s
Interval WAL: 234 writes, 102 syncs, 2.29 writes per sync, written: 0.00 GB, 0.00 MB/s
Interval stall: 00:00:0.000 H:M:S, 0.0 percent
Dec 15 04:55:05 localhost ceph-mon[298913]: mon.np0005559462@1(electing) e15 handle_auth_request failed to assign global_id
Dec 15 04:55:06 localhost ceph-mon[298913]: mon.np0005559462@1(electing) e15 handle_auth_request failed to assign global_id
Dec 15 04:55:06 localhost ceph-mgr[292421]: log_channel(cluster) log [DBG] : pgmap v27: 177 pgs: 177 active+clean; 104 MiB data, 548 MiB used, 41 GiB / 42 GiB avail
Dec 15 04:55:07 localhost ceph-mon[298913]: mon.np0005559462@1(electing) e15 handle_auth_request failed to assign global_id
Dec 15 04:55:07 localhost ceph-mon[298913]: mon.np0005559462@1(electing) e15 handle_auth_request failed to assign global_id
Dec 15 04:55:08 localhost ceph-mon[298913]: mon.np0005559462@1(electing) e15 handle_auth_request failed to assign global_id
Dec 15 04:55:08 localhost ceph-mon[298913]: 
mon.np0005559462@1(electing) e15 handle_auth_request failed to assign global_id Dec 15 04:55:08 localhost ceph-mon[298913]: mon.np0005559462@1(electing) e15 handle_auth_request failed to assign global_id Dec 15 04:55:08 localhost ceph-mgr[292421]: log_channel(cluster) log [DBG] : pgmap v28: 177 pgs: 177 active+clean; 104 MiB data, 548 MiB used, 41 GiB / 42 GiB avail Dec 15 04:55:08 localhost ceph-mon[298913]: mon.np0005559462@1(electing) e15 handle_auth_request failed to assign global_id Dec 15 04:55:08 localhost ceph-mon[298913]: paxos.1).electionLogic(63) init, last seen epoch 63, mid-election, bumping Dec 15 04:55:08 localhost ceph-mon[298913]: mon.np0005559462@1(electing) e15 collect_metadata vda: no unique device id for vda: fallback method has no model nor serial Dec 15 04:55:08 localhost ceph-mon[298913]: mon.np0005559462@1(electing) e15 collect_metadata vda: no unique device id for vda: fallback method has no model nor serial Dec 15 04:55:08 localhost ceph-mon[298913]: mon.np0005559462@1(peon) e15 collect_metadata vda: no unique device id for vda: fallback method has no model nor serial Dec 15 04:55:08 localhost ceph-mgr[292421]: [progress INFO root] complete: finished ev ad2a5e7d-c014-4d79-8aff-9590225e0392 (Updating mon deployment (-1 -> 3)) Dec 15 04:55:08 localhost ceph-mgr[292421]: [progress INFO root] Completed event ad2a5e7d-c014-4d79-8aff-9590225e0392 (Updating mon deployment (-1 -> 3)) in 5 seconds Dec 15 04:55:08 localhost ceph-mon[298913]: mon.np0005559462@1(peon) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/spec.mon}] v 0) Dec 15 04:55:08 localhost ceph-mgr[292421]: [progress INFO root] update: starting ev 7d1064b2-ab60-4a1f-ab55-0fa7c6e3bacc (Updating node-proxy deployment (+4 -> 4)) Dec 15 04:55:08 localhost ceph-mgr[292421]: [progress INFO root] complete: finished ev 7d1064b2-ab60-4a1f-ab55-0fa7c6e3bacc (Updating node-proxy deployment (+4 -> 4)) Dec 15 04:55:08 localhost ceph-mgr[292421]: [progress INFO root] 
Completed event 7d1064b2-ab60-4a1f-ab55-0fa7c6e3bacc (Updating node-proxy deployment (+4 -> 4)) in 0 seconds Dec 15 04:55:08 localhost ceph-mon[298913]: mon.np0005559462@1(peon) e15 handle_command mon_command({"prefix": "osd tree", "states": ["destroyed"], "format": "json"} v 0) Dec 15 04:55:08 localhost ceph-mon[298913]: log_channel(audit) log [DBG] : from='mgr.26879 172.18.0.106:0/2534740212' entity='mgr.np0005559462.fudvyx' cmd={"prefix": "osd tree", "states": ["destroyed"], "format": "json"} : dispatch Dec 15 04:55:08 localhost ceph-mon[298913]: Removing key for mgr.np0005559461.egwgzn Dec 15 04:55:08 localhost ceph-mon[298913]: Safe to remove mon.np0005559461: new quorum should be ['np0005559464', 'np0005559462', 'np0005559463'] (from ['np0005559464', 'np0005559462', 'np0005559463']) Dec 15 04:55:08 localhost ceph-mon[298913]: Removing monitor np0005559461 from monmap... Dec 15 04:55:08 localhost ceph-mon[298913]: Removing daemon mon.np0005559461 from np0005559461.localdomain -- ports [] Dec 15 04:55:08 localhost ceph-mon[298913]: mon.np0005559464 calling monitor election Dec 15 04:55:08 localhost ceph-mon[298913]: mon.np0005559463 calling monitor election Dec 15 04:55:08 localhost ceph-mon[298913]: mon.np0005559462 calling monitor election Dec 15 04:55:08 localhost ceph-mon[298913]: mon.np0005559464 is new leader, mons np0005559464,np0005559463 in quorum (ranks 0,2) Dec 15 04:55:08 localhost ceph-mon[298913]: Health detail: HEALTH_WARN 1 stray daemon(s) not managed by cephadm; 1 stray host(s) with 1 daemon(s) not managed by cephadm Dec 15 04:55:08 localhost ceph-mon[298913]: [WRN] CEPHADM_STRAY_DAEMON: 1 stray daemon(s) not managed by cephadm Dec 15 04:55:08 localhost ceph-mon[298913]: stray daemon mgr.np0005559460.oexkup on host np0005559460.localdomain not managed by cephadm Dec 15 04:55:08 localhost ceph-mon[298913]: [WRN] CEPHADM_STRAY_HOST: 1 stray host(s) with 1 daemon(s) not managed by cephadm Dec 15 04:55:08 localhost ceph-mon[298913]: stray host 
np0005559460.localdomain has 1 stray daemons: ['mgr.np0005559460.oexkup']
Dec 15 04:55:08 localhost ceph-mon[298913]: mon.np0005559464 calling monitor election
Dec 15 04:55:08 localhost ceph-mon[298913]: mon.np0005559464 is new leader, mons np0005559464,np0005559462,np0005559463 in quorum (ranks 0,1,2)
Dec 15 04:55:08 localhost ceph-mon[298913]: Health detail: HEALTH_WARN 1 stray daemon(s) not managed by cephadm; 1 stray host(s) with 1 daemon(s) not managed by cephadm
Dec 15 04:55:08 localhost ceph-mon[298913]: [WRN] CEPHADM_STRAY_DAEMON: 1 stray daemon(s) not managed by cephadm
Dec 15 04:55:08 localhost ceph-mon[298913]: stray daemon mgr.np0005559460.oexkup on host np0005559460.localdomain not managed by cephadm
Dec 15 04:55:08 localhost ceph-mon[298913]: [WRN] CEPHADM_STRAY_HOST: 1 stray host(s) with 1 daemon(s) not managed by cephadm
Dec 15 04:55:08 localhost ceph-mon[298913]: stray host np0005559460.localdomain has 1 stray daemons: ['mgr.np0005559460.oexkup']
Dec 15 04:55:08 localhost ceph-mon[298913]: from='mgr.26879 ' entity='mgr.np0005559462.fudvyx'
Dec 15 04:55:08 localhost ceph-mon[298913]: from='mgr.26879 ' entity='mgr.np0005559462.fudvyx'
Dec 15 04:55:08 localhost ceph-mon[298913]: from='mgr.26879 ' entity='mgr.np0005559462.fudvyx'
Dec 15 04:55:10 localhost ceph-mon[298913]: mon.np0005559462@1(peon).osd e88 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 15 04:55:10 localhost ceph-osd[32311]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Dec 15 04:55:10 localhost ceph-osd[32311]: rocksdb: [db/db_impl/db_impl.cc:1111]
** DB Stats **
Uptime(secs): 7800.2 total, 600.0 interval
Cumulative writes: 5830 writes, 25K keys, 5830 commit groups, 1.0 writes per commit group, ingest: 0.02 GB, 0.00 MB/s
Cumulative WAL: 5830 writes, 805 syncs, 7.24 writes per sync, written: 0.02 GB, 0.00 MB/s
Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent
Interval writes: 85 writes, 258 keys, 85 commit groups, 1.0 writes per commit group, ingest: 0.32 MB, 0.00 MB/s
Interval WAL: 85 writes, 42 syncs, 2.02 writes per sync, written: 0.00 GB, 0.00 MB/s
Interval stall: 00:00:0.000 H:M:S, 0.0 percent
Dec 15 04:55:10 localhost ceph-mon[298913]: mon.np0005559462@1(peon) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005559461.localdomain.devices.0}] v 0)
Dec 15 04:55:10 localhost ceph-mon[298913]: mon.np0005559462@1(peon) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005559461.localdomain}] v 0)
Dec 15 04:55:10 localhost ceph-mon[298913]: mon.np0005559462@1(peon) e15 handle_command mon_command({"prefix": "config generate-minimal-conf"} v 0)
Dec 15 04:55:10 localhost ceph-mon[298913]: log_channel(audit) log [DBG] : from='mgr.26879 172.18.0.106:0/2534740212' entity='mgr.np0005559462.fudvyx' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 15 04:55:10 localhost ceph-mon[298913]: mon.np0005559462@1(peon) e15 handle_command mon_command({"prefix": "auth get", "entity": "client.admin"} v 0)
Dec 15 04:55:10 localhost ceph-mon[298913]: log_channel(audit) log [INF] : from='mgr.26879 172.18.0.106:0/2534740212' entity='mgr.np0005559462.fudvyx' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch
Dec 15 04:55:10 localhost ceph-mgr[292421]: [cephadm INFO cephadm.serve] Removing np0005559461.localdomain:/var/lib/ceph/bce17446-41b5-5408-a23e-0b011906b44a/config/ceph.conf
Dec 15 04:55:10 localhost ceph-mgr[292421]: log_channel(cephadm) log [INF] : Removing np0005559461.localdomain:/var/lib/ceph/bce17446-41b5-5408-a23e-0b011906b44a/config/ceph.conf
Dec 15 04:55:10 localhost ceph-mgr[292421]: [cephadm INFO cephadm.serve] Updating np0005559462.localdomain:/etc/ceph/ceph.conf
Dec 15 04:55:10 localhost ceph-mgr[292421]: [cephadm INFO cephadm.serve] Updating np0005559463.localdomain:/etc/ceph/ceph.conf
Dec 15 04:55:10 localhost ceph-mgr[292421]: log_channel(cephadm) log [INF] : 
Updating np0005559462.localdomain:/etc/ceph/ceph.conf Dec 15 04:55:10 localhost ceph-mgr[292421]: [cephadm INFO cephadm.serve] Updating np0005559464.localdomain:/etc/ceph/ceph.conf Dec 15 04:55:10 localhost ceph-mgr[292421]: log_channel(cephadm) log [INF] : Updating np0005559463.localdomain:/etc/ceph/ceph.conf Dec 15 04:55:10 localhost ceph-mgr[292421]: log_channel(cephadm) log [INF] : Updating np0005559464.localdomain:/etc/ceph/ceph.conf Dec 15 04:55:10 localhost ceph-mon[298913]: mon.np0005559462@1(peon) e15 handle_command mon_command({"prefix": "df", "format": "json"} v 0) Dec 15 04:55:10 localhost ceph-mon[298913]: log_channel(audit) log [DBG] : from='client.? 172.18.0.106:0/4039382621' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch Dec 15 04:55:10 localhost nova_compute[286344]: 2025-12-15 09:55:10.771 286348 DEBUG oslo_concurrency.processutils [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 6.474s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m Dec 15 04:55:10 localhost ceph-mgr[292421]: log_channel(cluster) log [DBG] : pgmap v29: 177 pgs: 177 active+clean; 104 MiB data, 548 MiB used, 41 GiB / 42 GiB avail Dec 15 04:55:10 localhost systemd[1]: Started /usr/bin/podman healthcheck run a4e94bd7aaee6c174c601060fb5f34559019ccea93fc397263ee3735731cfc1e. 
Dec 15 04:55:10 localhost ceph-mgr[292421]: [cephadm INFO cephadm.serve] Removing np0005559461.localdomain:/etc/ceph/ceph.client.admin.keyring Dec 15 04:55:10 localhost ceph-mgr[292421]: log_channel(cephadm) log [INF] : Removing np0005559461.localdomain:/etc/ceph/ceph.client.admin.keyring Dec 15 04:55:10 localhost nova_compute[286344]: 2025-12-15 09:55:10.818 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 04:55:10 localhost nova_compute[286344]: 2025-12-15 09:55:10.836 286348 DEBUG nova.virt.libvirt.driver [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m Dec 15 04:55:10 localhost nova_compute[286344]: 2025-12-15 09:55:10.836 286348 DEBUG nova.virt.libvirt.driver [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m Dec 15 04:55:10 localhost podman[306989]: 2025-12-15 09:55:10.885236824 +0000 UTC m=+0.078164180 container health_status a4e94bd7aaee6c174c601060fb5f34559019ccea93fc397263ee3735731cfc1e (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, container_name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 
'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter) Dec 15 04:55:10 localhost podman[306989]: 2025-12-15 09:55:10.898280257 +0000 UTC m=+0.091207563 container exec_died a4e94bd7aaee6c174c601060fb5f34559019ccea93fc397263ee3735731cfc1e (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter) Dec 15 04:55:10 localhost ceph-mgr[292421]: [cephadm INFO cephadm.serve] Removing np0005559461.localdomain:/var/lib/ceph/bce17446-41b5-5408-a23e-0b011906b44a/config/ceph.client.admin.keyring Dec 15 04:55:10 localhost ceph-mgr[292421]: log_channel(cephadm) log [INF] : Removing np0005559461.localdomain:/var/lib/ceph/bce17446-41b5-5408-a23e-0b011906b44a/config/ceph.client.admin.keyring Dec 15 04:55:10 localhost systemd[1]: a4e94bd7aaee6c174c601060fb5f34559019ccea93fc397263ee3735731cfc1e.service: Deactivated successfully. 
Dec 15 04:55:10 localhost ceph-mon[298913]: mon.np0005559462@1(peon) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005559461.localdomain.devices.0}] v 0) Dec 15 04:55:11 localhost ceph-mon[298913]: mon.np0005559462@1(peon) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005559461.localdomain}] v 0) Dec 15 04:55:11 localhost nova_compute[286344]: 2025-12-15 09:55:11.042 286348 WARNING nova.virt.libvirt.driver [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m Dec 15 04:55:11 localhost nova_compute[286344]: 2025-12-15 09:55:11.043 286348 DEBUG nova.compute.resource_tracker [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Hypervisor/Node resource view: name=np0005559462.localdomain free_ram=11721MB free_disk=41.836978912353516GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": 
"type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m Dec 15 04:55:11 localhost nova_compute[286344]: 2025-12-15 09:55:11.043 286348 DEBUG oslo_concurrency.lockutils [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Dec 15 04:55:11 localhost nova_compute[286344]: 2025-12-15 09:55:11.043 286348 DEBUG oslo_concurrency.lockutils [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Dec 15 04:55:11 localhost nova_compute[286344]: 2025-12-15 09:55:11.152 286348 DEBUG nova.compute.resource_tracker [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Instance 39ff1bd9-6f6b-44c8-bbec-a1fd9d196359 actively managed on this compute host 
and has allocations in placement: {'resources': {'DISK_GB': 2, 'MEMORY_MB': 512, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m Dec 15 04:55:11 localhost nova_compute[286344]: 2025-12-15 09:55:11.153 286348 DEBUG nova.compute.resource_tracker [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m Dec 15 04:55:11 localhost nova_compute[286344]: 2025-12-15 09:55:11.153 286348 DEBUG nova.compute.resource_tracker [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Final resource view: name=np0005559462.localdomain phys_ram=15738MB used_ram=1024MB phys_disk=41GB used_disk=2GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m Dec 15 04:55:11 localhost nova_compute[286344]: 2025-12-15 09:55:11.268 286348 DEBUG oslo_concurrency.processutils [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m Dec 15 04:55:11 localhost ceph-mgr[292421]: [cephadm INFO cephadm.serve] Updating np0005559464.localdomain:/var/lib/ceph/bce17446-41b5-5408-a23e-0b011906b44a/config/ceph.conf Dec 15 04:55:11 localhost ceph-mgr[292421]: log_channel(cephadm) log [INF] : Updating np0005559464.localdomain:/var/lib/ceph/bce17446-41b5-5408-a23e-0b011906b44a/config/ceph.conf Dec 15 04:55:11 localhost ceph-mgr[292421]: [cephadm INFO cephadm.serve] Updating np0005559463.localdomain:/var/lib/ceph/bce17446-41b5-5408-a23e-0b011906b44a/config/ceph.conf Dec 15 04:55:11 localhost ceph-mgr[292421]: log_channel(cephadm) log [INF] : Updating 
np0005559463.localdomain:/var/lib/ceph/bce17446-41b5-5408-a23e-0b011906b44a/config/ceph.conf Dec 15 04:55:11 localhost ceph-mgr[292421]: [cephadm INFO cephadm.serve] Updating np0005559462.localdomain:/var/lib/ceph/bce17446-41b5-5408-a23e-0b011906b44a/config/ceph.conf Dec 15 04:55:11 localhost ceph-mgr[292421]: log_channel(cephadm) log [INF] : Updating np0005559462.localdomain:/var/lib/ceph/bce17446-41b5-5408-a23e-0b011906b44a/config/ceph.conf Dec 15 04:55:11 localhost ceph-mon[298913]: from='mgr.26879 ' entity='mgr.np0005559462.fudvyx' Dec 15 04:55:11 localhost ceph-mon[298913]: from='mgr.26879 ' entity='mgr.np0005559462.fudvyx' Dec 15 04:55:11 localhost ceph-mon[298913]: from='mgr.26879 172.18.0.106:0/2534740212' entity='mgr.np0005559462.fudvyx' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch Dec 15 04:55:11 localhost ceph-mon[298913]: Removing np0005559461.localdomain:/var/lib/ceph/bce17446-41b5-5408-a23e-0b011906b44a/config/ceph.conf Dec 15 04:55:11 localhost ceph-mon[298913]: Updating np0005559462.localdomain:/etc/ceph/ceph.conf Dec 15 04:55:11 localhost ceph-mon[298913]: Updating np0005559463.localdomain:/etc/ceph/ceph.conf Dec 15 04:55:11 localhost ceph-mon[298913]: Updating np0005559464.localdomain:/etc/ceph/ceph.conf Dec 15 04:55:11 localhost ceph-mon[298913]: Removing np0005559461.localdomain:/etc/ceph/ceph.client.admin.keyring Dec 15 04:55:11 localhost ceph-mon[298913]: Removing np0005559461.localdomain:/var/lib/ceph/bce17446-41b5-5408-a23e-0b011906b44a/config/ceph.client.admin.keyring Dec 15 04:55:11 localhost ceph-mon[298913]: from='mgr.26879 ' entity='mgr.np0005559462.fudvyx' Dec 15 04:55:11 localhost ceph-mon[298913]: from='mgr.26879 ' entity='mgr.np0005559462.fudvyx' Dec 15 04:55:11 localhost ceph-mon[298913]: mon.np0005559462@1(peon) e15 handle_command mon_command({"prefix": "df", "format": "json"} v 0) Dec 15 04:55:11 localhost ceph-mon[298913]: log_channel(audit) log [DBG] : from='client.? 
172.18.0.106:0/3443481974' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch Dec 15 04:55:11 localhost nova_compute[286344]: 2025-12-15 09:55:11.730 286348 DEBUG oslo_concurrency.processutils [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.462s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m Dec 15 04:55:11 localhost nova_compute[286344]: 2025-12-15 09:55:11.737 286348 DEBUG nova.compute.provider_tree [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Inventory has not changed in ProviderTree for provider: 26c8956b-6742-4951-b566-971b9bbe323b update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m Dec 15 04:55:11 localhost nova_compute[286344]: 2025-12-15 09:55:11.759 286348 DEBUG nova.scheduler.client.report [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Inventory has not changed for provider 26c8956b-6742-4951-b566-971b9bbe323b based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m Dec 15 04:55:11 localhost nova_compute[286344]: 2025-12-15 09:55:11.762 286348 DEBUG nova.compute.resource_tracker [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Compute_service record updated for np0005559462.localdomain:np0005559462.localdomain _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m Dec 15 04:55:11 localhost nova_compute[286344]: 2025-12-15 09:55:11.762 286348 DEBUG 
oslo_concurrency.lockutils [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.719s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Dec 15 04:55:11 localhost nova_compute[286344]: 2025-12-15 09:55:11.763 286348 DEBUG oslo_service.periodic_task [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Dec 15 04:55:11 localhost nova_compute[286344]: 2025-12-15 09:55:11.763 286348 DEBUG nova.compute.manager [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11145#033[00m Dec 15 04:55:11 localhost nova_compute[286344]: 2025-12-15 09:55:11.775 286348 DEBUG nova.compute.manager [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11154#033[00m Dec 15 04:55:11 localhost nova_compute[286344]: 2025-12-15 09:55:11.775 286348 DEBUG oslo_service.periodic_task [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Dec 15 04:55:11 localhost ceph-mon[298913]: mon.np0005559462@1(peon) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005559464.localdomain.devices.0}] v 0) Dec 15 04:55:11 localhost ceph-mon[298913]: mon.np0005559462@1(peon) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005559464.localdomain}] v 0) Dec 15 04:55:12 localhost ceph-mon[298913]: mon.np0005559462@1(peon) e15 handle_command 
mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005559463.localdomain.devices.0}] v 0) Dec 15 04:55:12 localhost ceph-mon[298913]: mon.np0005559462@1(peon) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005559462.localdomain.devices.0}] v 0) Dec 15 04:55:12 localhost ceph-mon[298913]: mon.np0005559462@1(peon) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005559463.localdomain}] v 0) Dec 15 04:55:12 localhost ceph-mon[298913]: mon.np0005559462@1(peon) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005559462.localdomain}] v 0) Dec 15 04:55:12 localhost ceph-mon[298913]: mon.np0005559462@1(peon) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/osd_remove_queue}] v 0) Dec 15 04:55:12 localhost ceph-mgr[292421]: [progress INFO root] update: starting ev 3cd1cdbb-2be7-4353-824a-3e5d56222569 (Updating node-proxy deployment (+4 -> 4)) Dec 15 04:55:12 localhost ceph-mgr[292421]: [progress INFO root] complete: finished ev 3cd1cdbb-2be7-4353-824a-3e5d56222569 (Updating node-proxy deployment (+4 -> 4)) Dec 15 04:55:12 localhost ceph-mgr[292421]: [progress INFO root] Completed event 3cd1cdbb-2be7-4353-824a-3e5d56222569 (Updating node-proxy deployment (+4 -> 4)) in 0 seconds Dec 15 04:55:12 localhost ceph-mon[298913]: mon.np0005559462@1(peon) e15 handle_command mon_command({"prefix": "osd tree", "states": ["destroyed"], "format": "json"} v 0) Dec 15 04:55:12 localhost ceph-mon[298913]: log_channel(audit) log [DBG] : from='mgr.26879 172.18.0.106:0/2534740212' entity='mgr.np0005559462.fudvyx' cmd={"prefix": "osd tree", "states": ["destroyed"], "format": "json"} : dispatch Dec 15 04:55:12 localhost ceph-mgr[292421]: [cephadm INFO cephadm.serve] Reconfiguring crash.np0005559461 (monmap changed)... Dec 15 04:55:12 localhost ceph-mgr[292421]: log_channel(cephadm) log [INF] : Reconfiguring crash.np0005559461 (monmap changed)... 
Dec 15 04:55:12 localhost ceph-mon[298913]: mon.np0005559462@1(peon) e15 handle_command mon_command({"prefix": "auth get-or-create", "entity": "client.crash.np0005559461.localdomain", "caps": ["mon", "profile crash", "mgr", "profile crash"]} v 0) Dec 15 04:55:12 localhost ceph-mon[298913]: log_channel(audit) log [INF] : from='mgr.26879 172.18.0.106:0/2534740212' entity='mgr.np0005559462.fudvyx' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005559461.localdomain", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch Dec 15 04:55:12 localhost ceph-mon[298913]: mon.np0005559462@1(peon) e15 handle_command mon_command({"prefix": "config generate-minimal-conf"} v 0) Dec 15 04:55:12 localhost ceph-mon[298913]: log_channel(audit) log [DBG] : from='mgr.26879 172.18.0.106:0/2534740212' entity='mgr.np0005559462.fudvyx' cmd={"prefix": "config generate-minimal-conf"} : dispatch Dec 15 04:55:12 localhost ceph-mgr[292421]: [cephadm INFO cephadm.serve] Reconfiguring daemon crash.np0005559461 on np0005559461.localdomain Dec 15 04:55:12 localhost ceph-mgr[292421]: log_channel(cephadm) log [INF] : Reconfiguring daemon crash.np0005559461 on np0005559461.localdomain Dec 15 04:55:12 localhost ceph-mon[298913]: Updating np0005559464.localdomain:/var/lib/ceph/bce17446-41b5-5408-a23e-0b011906b44a/config/ceph.conf Dec 15 04:55:12 localhost ceph-mon[298913]: Updating np0005559463.localdomain:/var/lib/ceph/bce17446-41b5-5408-a23e-0b011906b44a/config/ceph.conf Dec 15 04:55:12 localhost ceph-mon[298913]: Updating np0005559462.localdomain:/var/lib/ceph/bce17446-41b5-5408-a23e-0b011906b44a/config/ceph.conf Dec 15 04:55:12 localhost ceph-mon[298913]: from='mgr.26879 ' entity='mgr.np0005559462.fudvyx' Dec 15 04:55:12 localhost ceph-mon[298913]: from='mgr.26879 ' entity='mgr.np0005559462.fudvyx' Dec 15 04:55:12 localhost ceph-mon[298913]: from='mgr.26879 ' entity='mgr.np0005559462.fudvyx' Dec 15 04:55:12 localhost ceph-mon[298913]: from='mgr.26879 ' 
entity='mgr.np0005559462.fudvyx' Dec 15 04:55:12 localhost ceph-mon[298913]: from='mgr.26879 ' entity='mgr.np0005559462.fudvyx' Dec 15 04:55:12 localhost ceph-mon[298913]: from='mgr.26879 ' entity='mgr.np0005559462.fudvyx' Dec 15 04:55:12 localhost ceph-mon[298913]: from='mgr.26879 ' entity='mgr.np0005559462.fudvyx' Dec 15 04:55:12 localhost ceph-mon[298913]: from='mgr.26879 ' entity='mgr.np0005559462.fudvyx' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005559461.localdomain", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch Dec 15 04:55:12 localhost ceph-mon[298913]: from='mgr.26879 172.18.0.106:0/2534740212' entity='mgr.np0005559462.fudvyx' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005559461.localdomain", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch Dec 15 04:55:12 localhost ceph-mgr[292421]: log_channel(cluster) log [DBG] : pgmap v30: 177 pgs: 177 active+clean; 104 MiB data, 548 MiB used, 41 GiB / 42 GiB avail Dec 15 04:55:13 localhost ceph-mgr[292421]: log_channel(audit) log [DBG] : from='client.44440 -' entity='client.admin' cmd=[{"prefix": "orch host drain", "hostname": "np0005559461.localdomain", "target": ["mon-mgr", ""]}]: dispatch Dec 15 04:55:13 localhost ceph-mon[298913]: mon.np0005559462@1(peon) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/inventory}] v 0) Dec 15 04:55:13 localhost ceph-mgr[292421]: [cephadm INFO root] Added label _no_schedule to host np0005559461.localdomain Dec 15 04:55:13 localhost ceph-mgr[292421]: log_channel(cephadm) log [INF] : Added label _no_schedule to host np0005559461.localdomain Dec 15 04:55:13 localhost ceph-mon[298913]: mon.np0005559462@1(peon) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/inventory}] v 0) Dec 15 04:55:13 localhost ceph-mgr[292421]: [cephadm INFO root] Added label SpecialHostLabels.DRAIN_CONF_KEYRING to host np0005559461.localdomain Dec 15 04:55:13 localhost 
ceph-mgr[292421]: log_channel(cephadm) log [INF] : Added label SpecialHostLabels.DRAIN_CONF_KEYRING to host np0005559461.localdomain Dec 15 04:55:13 localhost ceph-mon[298913]: mon.np0005559462@1(peon) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005559461.localdomain.devices.0}] v 0) Dec 15 04:55:13 localhost ceph-mon[298913]: mon.np0005559462@1(peon) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005559461.localdomain}] v 0) Dec 15 04:55:13 localhost ceph-mgr[292421]: [cephadm INFO cephadm.serve] Reconfiguring crash.np0005559462 (monmap changed)... Dec 15 04:55:13 localhost ceph-mgr[292421]: log_channel(cephadm) log [INF] : Reconfiguring crash.np0005559462 (monmap changed)... Dec 15 04:55:13 localhost ceph-mon[298913]: mon.np0005559462@1(peon) e15 handle_command mon_command({"prefix": "auth get-or-create", "entity": "client.crash.np0005559462.localdomain", "caps": ["mon", "profile crash", "mgr", "profile crash"]} v 0) Dec 15 04:55:13 localhost ceph-mon[298913]: log_channel(audit) log [INF] : from='mgr.26879 172.18.0.106:0/2534740212' entity='mgr.np0005559462.fudvyx' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005559462.localdomain", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch Dec 15 04:55:13 localhost ceph-mon[298913]: mon.np0005559462@1(peon) e15 handle_command mon_command({"prefix": "config generate-minimal-conf"} v 0) Dec 15 04:55:13 localhost ceph-mon[298913]: log_channel(audit) log [DBG] : from='mgr.26879 172.18.0.106:0/2534740212' entity='mgr.np0005559462.fudvyx' cmd={"prefix": "config generate-minimal-conf"} : dispatch Dec 15 04:55:13 localhost ceph-mgr[292421]: [cephadm INFO cephadm.serve] Reconfiguring daemon crash.np0005559462 on np0005559462.localdomain Dec 15 04:55:13 localhost ceph-mgr[292421]: log_channel(cephadm) log [INF] : Reconfiguring daemon crash.np0005559462 on np0005559462.localdomain Dec 15 04:55:13 localhost 
nova_compute[286344]: 2025-12-15 09:55:13.786 286348 DEBUG oslo_service.periodic_task [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Dec 15 04:55:13 localhost nova_compute[286344]: 2025-12-15 09:55:13.809 286348 DEBUG oslo_service.periodic_task [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Dec 15 04:55:13 localhost nova_compute[286344]: 2025-12-15 09:55:13.809 286348 DEBUG nova.compute.manager [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m Dec 15 04:55:13 localhost nova_compute[286344]: 2025-12-15 09:55:13.809 286348 DEBUG nova.compute.manager [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m Dec 15 04:55:13 localhost ceph-mgr[292421]: [progress INFO root] Writing back 50 completed events Dec 15 04:55:13 localhost ceph-mon[298913]: mon.np0005559462@1(peon) e15 handle_command mon_command([{prefix=config-key set, key=mgr/progress/completed}] v 0) Dec 15 04:55:14 localhost nova_compute[286344]: 2025-12-15 09:55:14.027 286348 DEBUG oslo_concurrency.lockutils [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Acquiring lock "refresh_cache-39ff1bd9-6f6b-44c8-bbec-a1fd9d196359" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m Dec 15 04:55:14 localhost nova_compute[286344]: 2025-12-15 09:55:14.027 286348 DEBUG oslo_concurrency.lockutils [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Acquired lock 
"refresh_cache-39ff1bd9-6f6b-44c8-bbec-a1fd9d196359" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m Dec 15 04:55:14 localhost nova_compute[286344]: 2025-12-15 09:55:14.028 286348 DEBUG nova.network.neutron [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] [instance: 39ff1bd9-6f6b-44c8-bbec-a1fd9d196359] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m Dec 15 04:55:14 localhost nova_compute[286344]: 2025-12-15 09:55:14.028 286348 DEBUG nova.objects.instance [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Lazy-loading 'info_cache' on Instance uuid 39ff1bd9-6f6b-44c8-bbec-a1fd9d196359 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m Dec 15 04:55:14 localhost ceph-mon[298913]: Reconfiguring crash.np0005559461 (monmap changed)... Dec 15 04:55:14 localhost ceph-mon[298913]: Reconfiguring daemon crash.np0005559461 on np0005559461.localdomain Dec 15 04:55:14 localhost ceph-mon[298913]: from='mgr.26879 ' entity='mgr.np0005559462.fudvyx' Dec 15 04:55:14 localhost ceph-mon[298913]: Added label _no_schedule to host np0005559461.localdomain Dec 15 04:55:14 localhost ceph-mon[298913]: from='mgr.26879 ' entity='mgr.np0005559462.fudvyx' Dec 15 04:55:14 localhost ceph-mon[298913]: Added label SpecialHostLabels.DRAIN_CONF_KEYRING to host np0005559461.localdomain Dec 15 04:55:14 localhost ceph-mon[298913]: from='mgr.26879 ' entity='mgr.np0005559462.fudvyx' Dec 15 04:55:14 localhost ceph-mon[298913]: from='mgr.26879 ' entity='mgr.np0005559462.fudvyx' Dec 15 04:55:14 localhost ceph-mon[298913]: from='mgr.26879 ' entity='mgr.np0005559462.fudvyx' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005559462.localdomain", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch Dec 15 04:55:14 localhost ceph-mon[298913]: from='mgr.26879 172.18.0.106:0/2534740212' 
entity='mgr.np0005559462.fudvyx' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005559462.localdomain", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch Dec 15 04:55:14 localhost ceph-mon[298913]: from='mgr.26879 ' entity='mgr.np0005559462.fudvyx' Dec 15 04:55:14 localhost podman[307407]: Dec 15 04:55:14 localhost podman[307407]: 2025-12-15 09:55:14.126208033 +0000 UTC m=+0.075531057 container create c53a6ed00adde343482f627dd0f0b6325a55cd92343794286489dab08b5d2b5b (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=intelligent_spence, vendor=Red Hat, Inc., GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_REPO=https://github.com/ceph/ceph-container.git, description=Red Hat Ceph Storage 7, release=1763362218, ceph=True, architecture=x86_64, com.redhat.component=rhceph-container, build-date=2025-11-26T19:44:28Z, io.openshift.expose-services=, io.k8s.description=Red Hat Ceph Storage 7, RELEASE=main, CEPH_POINT_RELEASE=, distribution-scope=public, vcs-type=git, name=rhceph, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, url=https://catalog.redhat.com/en/search?searchType=containers, version=7, GIT_CLEAN=True, io.openshift.tags=rhceph ceph, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.buildah.version=1.41.4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, maintainer=Guillaume Abrioux , GIT_BRANCH=main, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9) Dec 15 04:55:14 localhost systemd[1]: Started libpod-conmon-c53a6ed00adde343482f627dd0f0b6325a55cd92343794286489dab08b5d2b5b.scope. Dec 15 04:55:14 localhost systemd[1]: Started libcrun container. 
Dec 15 04:55:14 localhost podman[307407]: 2025-12-15 09:55:14.097684312 +0000 UTC m=+0.047007386 image pull registry.redhat.io/rhceph/rhceph-7-rhel9:latest Dec 15 04:55:14 localhost podman[307407]: 2025-12-15 09:55:14.203675344 +0000 UTC m=+0.152998368 container init c53a6ed00adde343482f627dd0f0b6325a55cd92343794286489dab08b5d2b5b (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=intelligent_spence, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_REPO=https://github.com/ceph/ceph-container.git, RELEASE=main, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_CLEAN=True, name=rhceph, io.buildah.version=1.41.4, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., description=Red Hat Ceph Storage 7, version=7, maintainer=Guillaume Abrioux , vendor=Red Hat, Inc., ceph=True, build-date=2025-11-26T19:44:28Z, CEPH_POINT_RELEASE=, io.openshift.tags=rhceph ceph, distribution-scope=public, vcs-type=git, release=1763362218, GIT_BRANCH=main, url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat Ceph Storage 7, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, architecture=x86_64, com.redhat.component=rhceph-container) Dec 15 04:55:14 localhost systemd[1]: tmp-crun.jSrNoi.mount: Deactivated successfully. 
Dec 15 04:55:14 localhost podman[307407]: 2025-12-15 09:55:14.217162648 +0000 UTC m=+0.166485702 container start c53a6ed00adde343482f627dd0f0b6325a55cd92343794286489dab08b5d2b5b (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=intelligent_spence, ceph=True, GIT_BRANCH=main, name=rhceph, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.k8s.description=Red Hat Ceph Storage 7, io.buildah.version=1.41.4, build-date=2025-11-26T19:44:28Z, distribution-scope=public, description=Red Hat Ceph Storage 7, GIT_CLEAN=True, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, version=7, RELEASE=main, io.openshift.tags=rhceph ceph, vendor=Red Hat, Inc., vcs-type=git, com.redhat.component=rhceph-container, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, GIT_REPO=https://github.com/ceph/ceph-container.git, architecture=x86_64, release=1763362218, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, CEPH_POINT_RELEASE=, maintainer=Guillaume Abrioux ) Dec 15 04:55:14 localhost intelligent_spence[307422]: 167 167 Dec 15 04:55:14 localhost systemd[1]: libpod-c53a6ed00adde343482f627dd0f0b6325a55cd92343794286489dab08b5d2b5b.scope: Deactivated successfully. 
Dec 15 04:55:14 localhost podman[307407]: 2025-12-15 09:55:14.217838707 +0000 UTC m=+0.167161751 container attach c53a6ed00adde343482f627dd0f0b6325a55cd92343794286489dab08b5d2b5b (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=intelligent_spence, description=Red Hat Ceph Storage 7, release=1763362218, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat Ceph Storage 7, RELEASE=main, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, maintainer=Guillaume Abrioux , GIT_BRANCH=main, io.buildah.version=1.41.4, GIT_REPO=https://github.com/ceph/ceph-container.git, vcs-type=git, com.redhat.component=rhceph-container, distribution-scope=public, io.openshift.expose-services=, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:enterprise_linux:9::appstream, version=7, GIT_CLEAN=True, io.openshift.tags=rhceph ceph, build-date=2025-11-26T19:44:28Z, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, ceph=True, CEPH_POINT_RELEASE=, architecture=x86_64, url=https://catalog.redhat.com/en/search?searchType=containers, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., name=rhceph) Dec 15 04:55:14 localhost podman[307407]: 2025-12-15 09:55:14.22225083 +0000 UTC m=+0.171573874 container died c53a6ed00adde343482f627dd0f0b6325a55cd92343794286489dab08b5d2b5b (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=intelligent_spence, com.redhat.component=rhceph-container, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:enterprise_linux:9::appstream, description=Red Hat Ceph Storage 7, maintainer=Guillaume Abrioux , io.openshift.tags=rhceph ceph, build-date=2025-11-26T19:44:28Z, io.buildah.version=1.41.4, CEPH_POINT_RELEASE=, GIT_BRANCH=main, name=rhceph, distribution-scope=public, GIT_REPO=https://github.com/ceph/ceph-container.git, 
RELEASE=main, ceph=True, io.openshift.expose-services=, io.k8s.description=Red Hat Ceph Storage 7, vcs-type=git, architecture=x86_64, version=7, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., release=1763362218, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_CLEAN=True, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI) Dec 15 04:55:14 localhost podman[307427]: 2025-12-15 09:55:14.296462919 +0000 UTC m=+0.068286496 container remove c53a6ed00adde343482f627dd0f0b6325a55cd92343794286489dab08b5d2b5b (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=intelligent_spence, GIT_BRANCH=main, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, CEPH_POINT_RELEASE=, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, GIT_REPO=https://github.com/ceph/ceph-container.git, distribution-scope=public, RELEASE=main, vendor=Red Hat, Inc., build-date=2025-11-26T19:44:28Z, com.redhat.component=rhceph-container, architecture=x86_64, release=1763362218, io.k8s.description=Red Hat Ceph Storage 7, name=rhceph, io.openshift.tags=rhceph ceph, version=7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, GIT_CLEAN=True, maintainer=Guillaume Abrioux , io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.openshift.expose-services=, description=Red Hat Ceph Storage 7, io.buildah.version=1.41.4, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., ceph=True, 
url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git) Dec 15 04:55:14 localhost systemd[1]: libpod-conmon-c53a6ed00adde343482f627dd0f0b6325a55cd92343794286489dab08b5d2b5b.scope: Deactivated successfully. Dec 15 04:55:14 localhost ceph-mon[298913]: mon.np0005559462@1(peon) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005559462.localdomain.devices.0}] v 0) Dec 15 04:55:14 localhost ceph-mon[298913]: mon.np0005559462@1(peon) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005559462.localdomain}] v 0) Dec 15 04:55:14 localhost ceph-mgr[292421]: [cephadm INFO cephadm.serve] Reconfiguring osd.0 (monmap changed)... Dec 15 04:55:14 localhost ceph-mgr[292421]: log_channel(cephadm) log [INF] : Reconfiguring osd.0 (monmap changed)... Dec 15 04:55:14 localhost ceph-mon[298913]: mon.np0005559462@1(peon) e15 handle_command mon_command({"prefix": "auth get", "entity": "osd.0"} v 0) Dec 15 04:55:14 localhost ceph-mon[298913]: log_channel(audit) log [INF] : from='mgr.26879 172.18.0.106:0/2534740212' entity='mgr.np0005559462.fudvyx' cmd={"prefix": "auth get", "entity": "osd.0"} : dispatch Dec 15 04:55:14 localhost ceph-mon[298913]: mon.np0005559462@1(peon) e15 handle_command mon_command({"prefix": "config generate-minimal-conf"} v 0) Dec 15 04:55:14 localhost ceph-mon[298913]: log_channel(audit) log [DBG] : from='mgr.26879 172.18.0.106:0/2534740212' entity='mgr.np0005559462.fudvyx' cmd={"prefix": "config generate-minimal-conf"} : dispatch Dec 15 04:55:14 localhost ceph-mgr[292421]: [cephadm INFO cephadm.serve] Reconfiguring daemon osd.0 on np0005559462.localdomain Dec 15 04:55:14 localhost ceph-mgr[292421]: log_channel(cephadm) log [INF] : Reconfiguring daemon osd.0 on np0005559462.localdomain Dec 15 04:55:14 localhost nova_compute[286344]: 2025-12-15 09:55:14.423 286348 DEBUG nova.network.neutron [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] [instance: 
39ff1bd9-6f6b-44c8-bbec-a1fd9d196359] Updating instance_info_cache with network_info: [{"id": "03ef8889-3216-43fb-8a52-4be17a956ce1", "address": "fa:16:3e:74:df:7c", "network": {"id": "befb7a72-17a9-4bcb-b561-84b8f626685a", "bridge": "br-int", "label": "private", "subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.201", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.20", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.0.3"}}], "meta": {"injected": false, "tenant_id": "c785bf23f53946bc99867d8832a50266", "mtu": 1292, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap03ef8889-32", "ovs_interfaceid": "03ef8889-3216-43fb-8a52-4be17a956ce1", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m Dec 15 04:55:14 localhost nova_compute[286344]: 2025-12-15 09:55:14.439 286348 DEBUG oslo_concurrency.lockutils [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Releasing lock "refresh_cache-39ff1bd9-6f6b-44c8-bbec-a1fd9d196359" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m Dec 15 04:55:14 localhost nova_compute[286344]: 2025-12-15 09:55:14.439 286348 DEBUG nova.compute.manager [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] [instance: 39ff1bd9-6f6b-44c8-bbec-a1fd9d196359] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m Dec 15 04:55:14 localhost 
nova_compute[286344]: 2025-12-15 09:55:14.440 286348 DEBUG oslo_service.periodic_task [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Dec 15 04:55:14 localhost nova_compute[286344]: 2025-12-15 09:55:14.440 286348 DEBUG oslo_service.periodic_task [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Dec 15 04:55:14 localhost nova_compute[286344]: 2025-12-15 09:55:14.441 286348 DEBUG oslo_service.periodic_task [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Dec 15 04:55:14 localhost nova_compute[286344]: 2025-12-15 09:55:14.441 286348 DEBUG oslo_service.periodic_task [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Dec 15 04:55:14 localhost nova_compute[286344]: 2025-12-15 09:55:14.441 286348 DEBUG nova.compute.manager [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... 
_reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m Dec 15 04:55:14 localhost ceph-mgr[292421]: log_channel(audit) log [DBG] : from='client.44446 -' entity='client.admin' cmd=[{"prefix": "orch host ls", "host_pattern": "np0005559461.localdomain", "target": ["mon-mgr", ""], "format": "json"}]: dispatch Dec 15 04:55:14 localhost ceph-mgr[292421]: log_channel(cluster) log [DBG] : pgmap v31: 177 pgs: 177 active+clean; 104 MiB data, 548 MiB used, 41 GiB / 42 GiB avail Dec 15 04:55:14 localhost podman[307495]: Dec 15 04:55:14 localhost podman[307495]: 2025-12-15 09:55:14.981046131 +0000 UTC m=+0.073355057 container create 4fcdde651885710eb8944adee25061d3992fc14df0de0def1b3d997db46d199e (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=sad_bardeen, GIT_REPO=https://github.com/ceph/ceph-container.git, com.redhat.component=rhceph-container, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhceph ceph, GIT_BRANCH=main, maintainer=Guillaume Abrioux , cpe=cpe:/a:redhat:enterprise_linux:9::appstream, ceph=True, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, build-date=2025-11-26T19:44:28Z, architecture=x86_64, io.openshift.expose-services=, distribution-scope=public, RELEASE=main, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, vendor=Red Hat, Inc., CEPH_POINT_RELEASE=, vcs-type=git, io.k8s.description=Red Hat Ceph Storage 7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, release=1763362218, description=Red Hat Ceph Storage 7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., name=rhceph, io.buildah.version=1.41.4, url=https://catalog.redhat.com/en/search?searchType=containers, version=7, GIT_CLEAN=True) Dec 15 04:55:15 localhost systemd[1]: Started 
libpod-conmon-4fcdde651885710eb8944adee25061d3992fc14df0de0def1b3d997db46d199e.scope. Dec 15 04:55:15 localhost systemd[1]: Started libcrun container. Dec 15 04:55:15 localhost podman[307495]: 2025-12-15 09:55:15.046614201 +0000 UTC m=+0.138923137 container init 4fcdde651885710eb8944adee25061d3992fc14df0de0def1b3d997db46d199e (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=sad_bardeen, CEPH_POINT_RELEASE=, distribution-scope=public, io.openshift.expose-services=, description=Red Hat Ceph Storage 7, RELEASE=main, architecture=x86_64, build-date=2025-11-26T19:44:28Z, release=1763362218, GIT_CLEAN=True, version=7, io.buildah.version=1.41.4, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.openshift.tags=rhceph ceph, vcs-type=git, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, maintainer=Guillaume Abrioux , com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, GIT_REPO=https://github.com/ceph/ceph-container.git, com.redhat.component=rhceph-container, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vendor=Red Hat, Inc., ceph=True, io.k8s.description=Red Hat Ceph Storage 7, name=rhceph, url=https://catalog.redhat.com/en/search?searchType=containers, GIT_BRANCH=main) Dec 15 04:55:15 localhost podman[307495]: 2025-12-15 09:55:14.95183515 +0000 UTC m=+0.044144076 image pull registry.redhat.io/rhceph/rhceph-7-rhel9:latest Dec 15 04:55:15 localhost podman[307495]: 2025-12-15 09:55:15.05811445 +0000 UTC m=+0.150423376 container start 4fcdde651885710eb8944adee25061d3992fc14df0de0def1b3d997db46d199e (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=sad_bardeen, CEPH_POINT_RELEASE=, build-date=2025-11-26T19:44:28Z, com.redhat.component=rhceph-container, 
vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, RELEASE=main, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, ceph=True, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, name=rhceph, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, GIT_REPO=https://github.com/ceph/ceph-container.git, architecture=x86_64, GIT_CLEAN=True, io.k8s.description=Red Hat Ceph Storage 7, GIT_BRANCH=main, vendor=Red Hat, Inc., io.openshift.expose-services=, vcs-type=git, distribution-scope=public, io.openshift.tags=rhceph ceph, url=https://catalog.redhat.com/en/search?searchType=containers, version=7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., cpe=cpe:/a:redhat:enterprise_linux:9::appstream, description=Red Hat Ceph Storage 7, release=1763362218, maintainer=Guillaume Abrioux ) Dec 15 04:55:15 localhost podman[307495]: 2025-12-15 09:55:15.058559712 +0000 UTC m=+0.150868668 container attach 4fcdde651885710eb8944adee25061d3992fc14df0de0def1b3d997db46d199e (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=sad_bardeen, io.openshift.expose-services=, version=7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, com.redhat.component=rhceph-container, io.openshift.tags=rhceph ceph, io.buildah.version=1.41.4, RELEASE=main, io.k8s.description=Red Hat Ceph Storage 7, maintainer=Guillaume Abrioux , GIT_CLEAN=True, description=Red Hat Ceph Storage 7, build-date=2025-11-26T19:44:28Z, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1763362218, name=rhceph, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat 
Ceph Storage 7 on RHEL 9, GIT_BRANCH=main, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, vcs-type=git, architecture=x86_64, url=https://catalog.redhat.com/en/search?searchType=containers, ceph=True, CEPH_POINT_RELEASE=, GIT_REPO=https://github.com/ceph/ceph-container.git) Dec 15 04:55:15 localhost sad_bardeen[307510]: 167 167 Dec 15 04:55:15 localhost systemd[1]: libpod-4fcdde651885710eb8944adee25061d3992fc14df0de0def1b3d997db46d199e.scope: Deactivated successfully. Dec 15 04:55:15 localhost podman[307495]: 2025-12-15 09:55:15.062294906 +0000 UTC m=+0.154603862 container died 4fcdde651885710eb8944adee25061d3992fc14df0de0def1b3d997db46d199e (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=sad_bardeen, maintainer=Guillaume Abrioux , version=7, io.buildah.version=1.41.4, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.k8s.description=Red Hat Ceph Storage 7, url=https://catalog.redhat.com/en/search?searchType=containers, architecture=x86_64, io.openshift.expose-services=, GIT_CLEAN=True, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, CEPH_POINT_RELEASE=, distribution-scope=public, GIT_BRANCH=main, vendor=Red Hat, Inc., summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_REPO=https://github.com/ceph/ceph-container.git, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, build-date=2025-11-26T19:44:28Z, vcs-type=git, com.redhat.component=rhceph-container, name=rhceph, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.openshift.tags=rhceph ceph, description=Red Hat Ceph Storage 7, RELEASE=main, ceph=True, release=1763362218) Dec 15 04:55:15 localhost ceph-mon[298913]: Reconfiguring crash.np0005559462 (monmap changed)... 
Dec 15 04:55:15 localhost ceph-mon[298913]: Reconfiguring daemon crash.np0005559462 on np0005559462.localdomain Dec 15 04:55:15 localhost ceph-mon[298913]: from='mgr.26879 ' entity='mgr.np0005559462.fudvyx' Dec 15 04:55:15 localhost ceph-mon[298913]: from='mgr.26879 ' entity='mgr.np0005559462.fudvyx' Dec 15 04:55:15 localhost ceph-mon[298913]: from='mgr.26879 172.18.0.106:0/2534740212' entity='mgr.np0005559462.fudvyx' cmd={"prefix": "auth get", "entity": "osd.0"} : dispatch Dec 15 04:55:15 localhost ceph-mon[298913]: mon.np0005559462@1(peon).osd e88 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Dec 15 04:55:15 localhost systemd[1]: var-lib-containers-storage-overlay-fff4e4f8420462270f2e4ee8f8e92b957295e026511b8ba0674fcab20adda2a6-merged.mount: Deactivated successfully. Dec 15 04:55:15 localhost systemd[1]: var-lib-containers-storage-overlay-ef0abbf5e30e17923d508d731aa352ee4ee2739e3686c9dcc9507e236b579bea-merged.mount: Deactivated successfully. 
Dec 15 04:55:15 localhost podman[307517]: 2025-12-15 09:55:15.146318138 +0000 UTC m=+0.077966904 container remove 4fcdde651885710eb8944adee25061d3992fc14df0de0def1b3d997db46d199e (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=sad_bardeen, url=https://catalog.redhat.com/en/search?searchType=containers, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, build-date=2025-11-26T19:44:28Z, ceph=True, io.k8s.description=Red Hat Ceph Storage 7, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, RELEASE=main, io.buildah.version=1.41.4, vendor=Red Hat, Inc., vcs-type=git, description=Red Hat Ceph Storage 7, CEPH_POINT_RELEASE=, io.openshift.tags=rhceph ceph, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, com.redhat.component=rhceph-container, name=rhceph, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., maintainer=Guillaume Abrioux , release=1763362218, GIT_REPO=https://github.com/ceph/ceph-container.git, distribution-scope=public, GIT_CLEAN=True, architecture=x86_64, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, version=7, GIT_BRANCH=main) Dec 15 04:55:15 localhost systemd[1]: libpod-conmon-4fcdde651885710eb8944adee25061d3992fc14df0de0def1b3d997db46d199e.scope: Deactivated successfully. Dec 15 04:55:15 localhost ceph-mon[298913]: mon.np0005559462@1(peon) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005559462.localdomain.devices.0}] v 0) Dec 15 04:55:15 localhost ceph-mon[298913]: mon.np0005559462@1(peon) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005559462.localdomain}] v 0) Dec 15 04:55:15 localhost ceph-mgr[292421]: [cephadm INFO cephadm.serve] Reconfiguring osd.3 (monmap changed)... 
Dec 15 04:55:15 localhost ceph-mgr[292421]: log_channel(cephadm) log [INF] : Reconfiguring osd.3 (monmap changed)... Dec 15 04:55:15 localhost ceph-mon[298913]: mon.np0005559462@1(peon) e15 handle_command mon_command({"prefix": "auth get", "entity": "osd.3"} v 0) Dec 15 04:55:15 localhost ceph-mon[298913]: log_channel(audit) log [INF] : from='mgr.26879 172.18.0.106:0/2534740212' entity='mgr.np0005559462.fudvyx' cmd={"prefix": "auth get", "entity": "osd.3"} : dispatch Dec 15 04:55:15 localhost ceph-mon[298913]: mon.np0005559462@1(peon) e15 handle_command mon_command({"prefix": "config generate-minimal-conf"} v 0) Dec 15 04:55:15 localhost ceph-mon[298913]: log_channel(audit) log [DBG] : from='mgr.26879 172.18.0.106:0/2534740212' entity='mgr.np0005559462.fudvyx' cmd={"prefix": "config generate-minimal-conf"} : dispatch Dec 15 04:55:15 localhost ceph-mgr[292421]: [cephadm INFO cephadm.serve] Reconfiguring daemon osd.3 on np0005559462.localdomain Dec 15 04:55:15 localhost ceph-mgr[292421]: log_channel(cephadm) log [INF] : Reconfiguring daemon osd.3 on np0005559462.localdomain Dec 15 04:55:15 localhost ceph-mgr[292421]: log_channel(audit) log [DBG] : from='client.34494 -' entity='client.admin' cmd=[{"prefix": "orch host rm", "hostname": "np0005559461.localdomain", "force": true, "target": ["mon-mgr", ""]}]: dispatch Dec 15 04:55:15 localhost ceph-mon[298913]: mon.np0005559462@1(peon) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/inventory}] v 0) Dec 15 04:55:15 localhost ceph-mon[298913]: mon.np0005559462@1(peon) e15 handle_command mon_command({"prefix":"config-key del","key":"mgr/cephadm/host.np0005559461.localdomain"} v 0) Dec 15 04:55:15 localhost ceph-mon[298913]: log_channel(audit) log [INF] : from='mgr.26879 172.18.0.106:0/2534740212' entity='mgr.np0005559462.fudvyx' cmd={"prefix":"config-key del","key":"mgr/cephadm/host.np0005559461.localdomain"} : dispatch Dec 15 04:55:15 localhost ceph-mgr[292421]: [cephadm INFO root] Removed host 
np0005559461.localdomain Dec 15 04:55:15 localhost ceph-mgr[292421]: log_channel(cephadm) log [INF] : Removed host np0005559461.localdomain Dec 15 04:55:15 localhost nova_compute[286344]: 2025-12-15 09:55:15.822 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Dec 15 04:55:16 localhost podman[307591]: Dec 15 04:55:16 localhost podman[307591]: 2025-12-15 09:55:16.023134966 +0000 UTC m=+0.080397612 container create f1c7df1d1ccefb2f795662d3b98529ab33a89e3cf7a9627c6743753f89234eb1 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=silly_blackburn, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vcs-type=git, description=Red Hat Ceph Storage 7, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, release=1763362218, io.k8s.description=Red Hat Ceph Storage 7, url=https://catalog.redhat.com/en/search?searchType=containers, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:enterprise_linux:9::appstream, com.redhat.component=rhceph-container, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, RELEASE=main, maintainer=Guillaume Abrioux , org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, version=7, GIT_BRANCH=main, GIT_CLEAN=True, io.buildah.version=1.41.4, name=rhceph, GIT_REPO=https://github.com/ceph/ceph-container.git, build-date=2025-11-26T19:44:28Z, architecture=x86_64, io.openshift.expose-services=, io.openshift.tags=rhceph ceph, CEPH_POINT_RELEASE=, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, ceph=True) Dec 15 04:55:16 localhost systemd[1]: Started libpod-conmon-f1c7df1d1ccefb2f795662d3b98529ab33a89e3cf7a9627c6743753f89234eb1.scope. Dec 15 04:55:16 localhost systemd[1]: Started libcrun container. 
Dec 15 04:55:16 localhost podman[307591]: 2025-12-15 09:55:16.08811873 +0000 UTC m=+0.145381386 container init f1c7df1d1ccefb2f795662d3b98529ab33a89e3cf7a9627c6743753f89234eb1 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=silly_blackburn, vcs-type=git, description=Red Hat Ceph Storage 7, name=rhceph, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, com.redhat.component=rhceph-container, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, version=7, GIT_CLEAN=True, io.openshift.tags=rhceph ceph, url=https://catalog.redhat.com/en/search?searchType=containers, RELEASE=main, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.k8s.description=Red Hat Ceph Storage 7, GIT_BRANCH=main, CEPH_POINT_RELEASE=, distribution-scope=public, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=Guillaume Abrioux , cpe=cpe:/a:redhat:enterprise_linux:9::appstream, vendor=Red Hat, Inc., build-date=2025-11-26T19:44:28Z, release=1763362218, ceph=True, GIT_REPO=https://github.com/ceph/ceph-container.git, io.openshift.expose-services=, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0) Dec 15 04:55:16 localhost podman[307591]: 2025-12-15 09:55:15.989806191 +0000 UTC m=+0.047068857 image pull registry.redhat.io/rhceph/rhceph-7-rhel9:latest Dec 15 04:55:16 localhost podman[307591]: 2025-12-15 09:55:16.098410515 +0000 UTC m=+0.155673161 container start f1c7df1d1ccefb2f795662d3b98529ab33a89e3cf7a9627c6743753f89234eb1 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=silly_blackburn, CEPH_POINT_RELEASE=, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, description=Red Hat Ceph Storage 7, io.openshift.expose-services=, maintainer=Guillaume Abrioux , vendor=Red Hat, Inc., io.k8s.description=Red Hat Ceph Storage 7, 
cpe=cpe:/a:redhat:enterprise_linux:9::appstream, GIT_BRANCH=main, GIT_CLEAN=True, distribution-scope=public, ceph=True, io.openshift.tags=rhceph ceph, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, io.buildah.version=1.41.4, com.redhat.component=rhceph-container, GIT_REPO=https://github.com/ceph/ceph-container.git, vcs-type=git, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., architecture=x86_64, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhceph, version=7, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, RELEASE=main, release=1763362218, build-date=2025-11-26T19:44:28Z) Dec 15 04:55:16 localhost podman[307591]: 2025-12-15 09:55:16.098681953 +0000 UTC m=+0.155944699 container attach f1c7df1d1ccefb2f795662d3b98529ab33a89e3cf7a9627c6743753f89234eb1 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=silly_blackburn, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat Ceph Storage 7, ceph=True, RELEASE=main, vcs-type=git, architecture=x86_64, io.openshift.expose-services=, io.openshift.tags=rhceph ceph, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., build-date=2025-11-26T19:44:28Z, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vendor=Red Hat, Inc., description=Red Hat Ceph Storage 7, maintainer=Guillaume Abrioux , name=rhceph, GIT_REPO=https://github.com/ceph/ceph-container.git, url=https://catalog.redhat.com/en/search?searchType=containers, release=1763362218, com.redhat.component=rhceph-container, GIT_BRANCH=main, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, CEPH_POINT_RELEASE=, 
vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, version=7, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_CLEAN=True, distribution-scope=public) Dec 15 04:55:16 localhost silly_blackburn[307606]: 167 167 Dec 15 04:55:16 localhost systemd[1]: libpod-f1c7df1d1ccefb2f795662d3b98529ab33a89e3cf7a9627c6743753f89234eb1.scope: Deactivated successfully. Dec 15 04:55:16 localhost podman[307591]: 2025-12-15 09:55:16.100928265 +0000 UTC m=+0.158190981 container died f1c7df1d1ccefb2f795662d3b98529ab33a89e3cf7a9627c6743753f89234eb1 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=silly_blackburn, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhceph, description=Red Hat Ceph Storage 7, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, version=7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, com.redhat.component=rhceph-container, io.buildah.version=1.41.4, ceph=True, io.openshift.expose-services=, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, RELEASE=main, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vendor=Red Hat, Inc., io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vcs-type=git, GIT_REPO=https://github.com/ceph/ceph-container.git, maintainer=Guillaume Abrioux , url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.tags=rhceph ceph, build-date=2025-11-26T19:44:28Z, release=1763362218, io.k8s.description=Red Hat Ceph Storage 7, GIT_BRANCH=main, CEPH_POINT_RELEASE=, GIT_CLEAN=True, distribution-scope=public) Dec 15 04:55:16 localhost ceph-mon[298913]: Reconfiguring osd.0 (monmap changed)... 
Dec 15 04:55:16 localhost ceph-mon[298913]: Reconfiguring daemon osd.0 on np0005559462.localdomain Dec 15 04:55:16 localhost ceph-mon[298913]: from='mgr.26879 ' entity='mgr.np0005559462.fudvyx' Dec 15 04:55:16 localhost ceph-mon[298913]: from='mgr.26879 ' entity='mgr.np0005559462.fudvyx' Dec 15 04:55:16 localhost ceph-mon[298913]: from='mgr.26879 172.18.0.106:0/2534740212' entity='mgr.np0005559462.fudvyx' cmd={"prefix": "auth get", "entity": "osd.3"} : dispatch Dec 15 04:55:16 localhost ceph-mon[298913]: from='mgr.26879 ' entity='mgr.np0005559462.fudvyx' Dec 15 04:55:16 localhost ceph-mon[298913]: from='mgr.26879 ' entity='mgr.np0005559462.fudvyx' cmd={"prefix":"config-key del","key":"mgr/cephadm/host.np0005559461.localdomain"} : dispatch Dec 15 04:55:16 localhost ceph-mon[298913]: from='mgr.26879 172.18.0.106:0/2534740212' entity='mgr.np0005559462.fudvyx' cmd={"prefix":"config-key del","key":"mgr/cephadm/host.np0005559461.localdomain"} : dispatch Dec 15 04:55:16 localhost ceph-mon[298913]: from='mgr.26879 ' entity='mgr.np0005559462.fudvyx' cmd='[{"prefix":"config-key del","key":"mgr/cephadm/host.np0005559461.localdomain"}]': finished Dec 15 04:55:16 localhost systemd[1]: var-lib-containers-storage-overlay-75460e41116650a915249d60956bf29aa49486c02f92e0f9a37ebfa7aeafdf7a-merged.mount: Deactivated successfully. 
Dec 15 04:55:16 localhost podman[307611]: 2025-12-15 09:55:16.195509421 +0000 UTC m=+0.084395594 container remove f1c7df1d1ccefb2f795662d3b98529ab33a89e3cf7a9627c6743753f89234eb1 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=silly_blackburn, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, version=7, RELEASE=main, architecture=x86_64, description=Red Hat Ceph Storage 7, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, io.openshift.tags=rhceph ceph, io.openshift.expose-services=, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, vcs-type=git, build-date=2025-11-26T19:44:28Z, maintainer=Guillaume Abrioux , vendor=Red Hat, Inc., ceph=True, io.k8s.description=Red Hat Ceph Storage 7, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, com.redhat.component=rhceph-container, url=https://catalog.redhat.com/en/search?searchType=containers, release=1763362218, name=rhceph, GIT_REPO=https://github.com/ceph/ceph-container.git, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_BRANCH=main, CEPH_POINT_RELEASE=, GIT_CLEAN=True, io.buildah.version=1.41.4) Dec 15 04:55:16 localhost systemd[1]: libpod-conmon-f1c7df1d1ccefb2f795662d3b98529ab33a89e3cf7a9627c6743753f89234eb1.scope: Deactivated successfully. Dec 15 04:55:16 localhost ceph-mon[298913]: mon.np0005559462@1(peon) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005559462.localdomain.devices.0}] v 0) Dec 15 04:55:16 localhost ceph-mon[298913]: mon.np0005559462@1(peon) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005559462.localdomain}] v 0) Dec 15 04:55:16 localhost ceph-mgr[292421]: [cephadm INFO cephadm.serve] Reconfiguring mds.mds.np0005559462.mhigvc (monmap changed)... 
Dec 15 04:55:16 localhost ceph-mgr[292421]: log_channel(cephadm) log [INF] : Reconfiguring mds.mds.np0005559462.mhigvc (monmap changed)... Dec 15 04:55:16 localhost ceph-mon[298913]: mon.np0005559462@1(peon) e15 handle_command mon_command({"prefix": "auth get-or-create", "entity": "mds.mds.np0005559462.mhigvc", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]} v 0) Dec 15 04:55:16 localhost ceph-mon[298913]: log_channel(audit) log [INF] : from='mgr.26879 172.18.0.106:0/2534740212' entity='mgr.np0005559462.fudvyx' cmd={"prefix": "auth get-or-create", "entity": "mds.mds.np0005559462.mhigvc", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]} : dispatch Dec 15 04:55:16 localhost ceph-mon[298913]: mon.np0005559462@1(peon) e15 handle_command mon_command({"prefix": "config generate-minimal-conf"} v 0) Dec 15 04:55:16 localhost ceph-mon[298913]: log_channel(audit) log [DBG] : from='mgr.26879 172.18.0.106:0/2534740212' entity='mgr.np0005559462.fudvyx' cmd={"prefix": "config generate-minimal-conf"} : dispatch Dec 15 04:55:16 localhost ceph-mgr[292421]: [cephadm INFO cephadm.serve] Reconfiguring daemon mds.mds.np0005559462.mhigvc on np0005559462.localdomain Dec 15 04:55:16 localhost ceph-mgr[292421]: log_channel(cephadm) log [INF] : Reconfiguring daemon mds.mds.np0005559462.mhigvc on np0005559462.localdomain Dec 15 04:55:16 localhost ceph-mgr[292421]: log_channel(cluster) log [DBG] : pgmap v32: 177 pgs: 177 active+clean; 104 MiB data, 548 MiB used, 41 GiB / 42 GiB avail Dec 15 04:55:16 localhost podman[307688]: Dec 15 04:55:16 localhost podman[307688]: 2025-12-15 09:55:16.998834409 +0000 UTC m=+0.069549982 container create 118feddb604d525ae51de76f1e7b1dc98bb51d1021273a40b2d7a771453d5add (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=blissful_nash, io.openshift.expose-services=, url=https://catalog.redhat.com/en/search?searchType=containers, RELEASE=main, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, GIT_REPO=https://github.com/ceph/ceph-container.git, com.redhat.component=rhceph-container, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, GIT_CLEAN=True, io.openshift.tags=rhceph ceph, maintainer=Guillaume Abrioux , GIT_BRANCH=main, build-date=2025-11-26T19:44:28Z, name=rhceph, io.buildah.version=1.41.4, architecture=x86_64, release=1763362218, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vcs-type=git, description=Red Hat Ceph Storage 7, ceph=True, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, io.k8s.description=Red Hat Ceph Storage 7, version=7, CEPH_POINT_RELEASE=, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, distribution-scope=public, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vendor=Red Hat, Inc.) Dec 15 04:55:17 localhost systemd[1]: Started libpod-conmon-118feddb604d525ae51de76f1e7b1dc98bb51d1021273a40b2d7a771453d5add.scope. Dec 15 04:55:17 localhost systemd[1]: Started libcrun container. 
Dec 15 04:55:17 localhost podman[307688]: 2025-12-15 09:55:17.058253258 +0000 UTC m=+0.128968841 container init 118feddb604d525ae51de76f1e7b1dc98bb51d1021273a40b2d7a771453d5add (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=blissful_nash, name=rhceph, description=Red Hat Ceph Storage 7, io.buildah.version=1.41.4, vcs-type=git, GIT_CLEAN=True, io.openshift.tags=rhceph ceph, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_BRANCH=main, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, GIT_REPO=https://github.com/ceph/ceph-container.git, version=7, architecture=x86_64, CEPH_POINT_RELEASE=, com.redhat.component=rhceph-container, maintainer=Guillaume Abrioux , release=1763362218, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, build-date=2025-11-26T19:44:28Z, io.openshift.expose-services=, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.k8s.description=Red Hat Ceph Storage 7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, vendor=Red Hat, Inc., RELEASE=main, ceph=True, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0) Dec 15 04:55:17 localhost podman[307688]: 2025-12-15 09:55:17.06737027 +0000 UTC m=+0.138085854 container start 118feddb604d525ae51de76f1e7b1dc98bb51d1021273a40b2d7a771453d5add (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=blissful_nash, io.buildah.version=1.41.4, version=7, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, description=Red Hat Ceph Storage 7, io.openshift.tags=rhceph ceph, distribution-scope=public, url=https://catalog.redhat.com/en/search?searchType=containers, GIT_BRANCH=main, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, 
com.redhat.component=rhceph-container, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_CLEAN=True, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_REPO=https://github.com/ceph/ceph-container.git, vendor=Red Hat, Inc., io.openshift.expose-services=, io.k8s.description=Red Hat Ceph Storage 7, maintainer=Guillaume Abrioux , release=1763362218, CEPH_POINT_RELEASE=, RELEASE=main, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vcs-type=git, ceph=True, architecture=x86_64, build-date=2025-11-26T19:44:28Z, name=rhceph, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI) Dec 15 04:55:17 localhost podman[307688]: 2025-12-15 09:55:17.067669289 +0000 UTC m=+0.138384902 container attach 118feddb604d525ae51de76f1e7b1dc98bb51d1021273a40b2d7a771453d5add (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=blissful_nash, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, architecture=x86_64, build-date=2025-11-26T19:44:28Z, GIT_REPO=https://github.com/ceph/ceph-container.git, version=7, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, io.openshift.tags=rhceph ceph, io.k8s.description=Red Hat Ceph Storage 7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.buildah.version=1.41.4, ceph=True, vendor=Red Hat, Inc., release=1763362218, CEPH_POINT_RELEASE=, io.openshift.expose-services=, name=rhceph, maintainer=Guillaume Abrioux , GIT_BRANCH=main, RELEASE=main, distribution-scope=public, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=rhceph-container, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, description=Red Hat Ceph Storage 7, vcs-type=git, GIT_CLEAN=True, 
url=https://catalog.redhat.com/en/search?searchType=containers) Dec 15 04:55:17 localhost blissful_nash[307703]: 167 167 Dec 15 04:55:17 localhost systemd[1]: libpod-118feddb604d525ae51de76f1e7b1dc98bb51d1021273a40b2d7a771453d5add.scope: Deactivated successfully. Dec 15 04:55:17 localhost podman[307688]: 2025-12-15 09:55:17.072647697 +0000 UTC m=+0.143363290 container died 118feddb604d525ae51de76f1e7b1dc98bb51d1021273a40b2d7a771453d5add (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=blissful_nash, description=Red Hat Ceph Storage 7, vendor=Red Hat, Inc., GIT_REPO=https://github.com/ceph/ceph-container.git, build-date=2025-11-26T19:44:28Z, vcs-type=git, GIT_CLEAN=True, version=7, io.openshift.tags=rhceph ceph, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, release=1763362218, io.k8s.description=Red Hat Ceph Storage 7, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, GIT_BRANCH=main, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., distribution-scope=public, maintainer=Guillaume Abrioux , url=https://catalog.redhat.com/en/search?searchType=containers, name=rhceph, io.buildah.version=1.41.4, CEPH_POINT_RELEASE=, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, com.redhat.component=rhceph-container, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, ceph=True, RELEASE=main) Dec 15 04:55:17 localhost podman[307688]: 2025-12-15 09:55:16.976273902 +0000 UTC m=+0.046989455 image pull registry.redhat.io/rhceph/rhceph-7-rhel9:latest Dec 15 04:55:17 localhost ceph-mon[298913]: Reconfiguring osd.3 (monmap changed)... 
Dec 15 04:55:17 localhost ceph-mon[298913]: Reconfiguring daemon osd.3 on np0005559462.localdomain Dec 15 04:55:17 localhost ceph-mon[298913]: Removed host np0005559461.localdomain Dec 15 04:55:17 localhost ceph-mon[298913]: from='mgr.26879 ' entity='mgr.np0005559462.fudvyx' Dec 15 04:55:17 localhost ceph-mon[298913]: from='mgr.26879 ' entity='mgr.np0005559462.fudvyx' Dec 15 04:55:17 localhost ceph-mon[298913]: from='mgr.26879 ' entity='mgr.np0005559462.fudvyx' cmd={"prefix": "auth get-or-create", "entity": "mds.mds.np0005559462.mhigvc", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]} : dispatch Dec 15 04:55:17 localhost ceph-mon[298913]: from='mgr.26879 172.18.0.106:0/2534740212' entity='mgr.np0005559462.fudvyx' cmd={"prefix": "auth get-or-create", "entity": "mds.mds.np0005559462.mhigvc", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]} : dispatch Dec 15 04:55:17 localhost systemd[1]: tmp-crun.FY7ef7.mount: Deactivated successfully. Dec 15 04:55:17 localhost systemd[1]: var-lib-containers-storage-overlay-94c32204b06cc14224b551d9bab6a2ecc8bdda9b460e98f93482893ee7370dfb-merged.mount: Deactivated successfully. 
Dec 15 04:55:17 localhost podman[307708]: 2025-12-15 09:55:17.186676652 +0000 UTC m=+0.100897151 container remove 118feddb604d525ae51de76f1e7b1dc98bb51d1021273a40b2d7a771453d5add (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=blissful_nash, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, CEPH_POINT_RELEASE=, GIT_BRANCH=main, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, build-date=2025-11-26T19:44:28Z, maintainer=Guillaume Abrioux , name=rhceph, io.buildah.version=1.41.4, ceph=True, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vcs-type=git, io.openshift.expose-services=, architecture=x86_64, release=1763362218, com.redhat.component=rhceph-container, RELEASE=main, GIT_CLEAN=True, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.openshift.tags=rhceph ceph, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.k8s.description=Red Hat Ceph Storage 7, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, url=https://catalog.redhat.com/en/search?searchType=containers, GIT_REPO=https://github.com/ceph/ceph-container.git, description=Red Hat Ceph Storage 7, version=7) Dec 15 04:55:17 localhost systemd[1]: libpod-conmon-118feddb604d525ae51de76f1e7b1dc98bb51d1021273a40b2d7a771453d5add.scope: Deactivated successfully. Dec 15 04:55:17 localhost ceph-mon[298913]: mon.np0005559462@1(peon) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005559462.localdomain.devices.0}] v 0) Dec 15 04:55:17 localhost ceph-mon[298913]: mon.np0005559462@1(peon) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005559462.localdomain}] v 0) Dec 15 04:55:17 localhost ceph-mgr[292421]: [cephadm INFO cephadm.serve] Reconfiguring mgr.np0005559462.fudvyx (monmap changed)... 
Dec 15 04:55:17 localhost ceph-mgr[292421]: log_channel(cephadm) log [INF] : Reconfiguring mgr.np0005559462.fudvyx (monmap changed)...
Dec 15 04:55:17 localhost ceph-mon[298913]: mon.np0005559462@1(peon) e15 handle_command mon_command({"prefix": "auth get-or-create", "entity": "mgr.np0005559462.fudvyx", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} v 0)
Dec 15 04:55:17 localhost ceph-mon[298913]: log_channel(audit) log [INF] : from='mgr.26879 172.18.0.106:0/2534740212' entity='mgr.np0005559462.fudvyx' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005559462.fudvyx", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch
Dec 15 04:55:17 localhost ceph-mon[298913]: mon.np0005559462@1(peon) e15 handle_command mon_command({"prefix": "mgr services"} v 0)
Dec 15 04:55:17 localhost ceph-mon[298913]: log_channel(audit) log [DBG] : from='mgr.26879 172.18.0.106:0/2534740212' entity='mgr.np0005559462.fudvyx' cmd={"prefix": "mgr services"} : dispatch
Dec 15 04:55:17 localhost ceph-mon[298913]: mon.np0005559462@1(peon) e15 handle_command mon_command({"prefix": "config generate-minimal-conf"} v 0)
Dec 15 04:55:17 localhost ceph-mon[298913]: log_channel(audit) log [DBG] : from='mgr.26879 172.18.0.106:0/2534740212' entity='mgr.np0005559462.fudvyx' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 15 04:55:17 localhost ceph-mgr[292421]: [cephadm INFO cephadm.serve] Reconfiguring daemon mgr.np0005559462.fudvyx on np0005559462.localdomain
Dec 15 04:55:17 localhost ceph-mgr[292421]: log_channel(cephadm) log [INF] : Reconfiguring daemon mgr.np0005559462.fudvyx on np0005559462.localdomain
Dec 15 04:55:17 localhost podman[307778]:
Dec 15 04:55:17 localhost podman[307778]: 2025-12-15 09:55:17.875525953 +0000 UTC m=+0.073681317 container create 13620529f6c1be4550f43e87db810f7cdf2e7b2e4510fb1dcf83269eb00b8d85 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=musing_franklin, release=1763362218, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, architecture=x86_64, distribution-scope=public, vcs-type=git, description=Red Hat Ceph Storage 7, version=7, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, io.openshift.expose-services=, CEPH_POINT_RELEASE=, io.buildah.version=1.41.4, GIT_BRANCH=main, com.redhat.component=rhceph-container, ceph=True, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.openshift.tags=rhceph ceph, build-date=2025-11-26T19:44:28Z, GIT_REPO=https://github.com/ceph/ceph-container.git, GIT_CLEAN=True, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, name=rhceph, maintainer=Guillaume Abrioux , vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, RELEASE=main, url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.description=Red Hat Ceph Storage 7)
Dec 15 04:55:17 localhost systemd[1]: Started libpod-conmon-13620529f6c1be4550f43e87db810f7cdf2e7b2e4510fb1dcf83269eb00b8d85.scope.
Dec 15 04:55:17 localhost systemd[1]: Started libcrun container.
Dec 15 04:55:17 localhost podman[307778]: 2025-12-15 09:55:17.932127094 +0000 UTC m=+0.130282448 container init 13620529f6c1be4550f43e87db810f7cdf2e7b2e4510fb1dcf83269eb00b8d85 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=musing_franklin, ceph=True, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, com.redhat.component=rhceph-container, GIT_REPO=https://github.com/ceph/ceph-container.git, io.openshift.tags=rhceph ceph, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.k8s.description=Red Hat Ceph Storage 7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1763362218, vendor=Red Hat, Inc., org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., url=https://catalog.redhat.com/en/search?searchType=containers, distribution-scope=public, build-date=2025-11-26T19:44:28Z, vcs-type=git, io.buildah.version=1.41.4, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, description=Red Hat Ceph Storage 7, GIT_CLEAN=True, CEPH_POINT_RELEASE=, maintainer=Guillaume Abrioux , name=rhceph, GIT_BRANCH=main, RELEASE=main, version=7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, architecture=x86_64, io.openshift.expose-services=)
Dec 15 04:55:17 localhost podman[307778]: 2025-12-15 09:55:17.939779446 +0000 UTC m=+0.137934800 container start 13620529f6c1be4550f43e87db810f7cdf2e7b2e4510fb1dcf83269eb00b8d85 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=musing_franklin, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_CLEAN=True, url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.description=Red Hat Ceph Storage 7, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, CEPH_POINT_RELEASE=, RELEASE=main, io.openshift.tags=rhceph ceph, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.openshift.expose-services=, release=1763362218, com.redhat.component=rhceph-container, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.buildah.version=1.41.4, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, description=Red Hat Ceph Storage 7, build-date=2025-11-26T19:44:28Z, distribution-scope=public, GIT_REPO=https://github.com/ceph/ceph-container.git, vcs-type=git, vendor=Red Hat, Inc., org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhceph, GIT_BRANCH=main, version=7, architecture=x86_64, ceph=True, maintainer=Guillaume Abrioux )
Dec 15 04:55:17 localhost musing_franklin[307794]: 167 167
Dec 15 04:55:17 localhost systemd[1]: libpod-13620529f6c1be4550f43e87db810f7cdf2e7b2e4510fb1dcf83269eb00b8d85.scope: Deactivated successfully.
Dec 15 04:55:17 localhost podman[307778]: 2025-12-15 09:55:17.940173777 +0000 UTC m=+0.138329171 container attach 13620529f6c1be4550f43e87db810f7cdf2e7b2e4510fb1dcf83269eb00b8d85 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=musing_franklin, build-date=2025-11-26T19:44:28Z, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, maintainer=Guillaume Abrioux , GIT_REPO=https://github.com/ceph/ceph-container.git, RELEASE=main, version=7, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.k8s.description=Red Hat Ceph Storage 7, architecture=x86_64, name=rhceph, CEPH_POINT_RELEASE=, io.openshift.expose-services=, release=1763362218, io.openshift.tags=rhceph ceph, vendor=Red Hat, Inc., vcs-type=git, distribution-scope=public, ceph=True, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, GIT_BRANCH=main, com.redhat.component=rhceph-container, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_CLEAN=True, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, description=Red Hat Ceph Storage 7, url=https://catalog.redhat.com/en/search?searchType=containers)
Dec 15 04:55:17 localhost podman[307778]: 2025-12-15 09:55:17.944537538 +0000 UTC m=+0.142692892 container died 13620529f6c1be4550f43e87db810f7cdf2e7b2e4510fb1dcf83269eb00b8d85 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=musing_franklin, vendor=Red Hat, Inc., GIT_REPO=https://github.com/ceph/ceph-container.git, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., RELEASE=main, distribution-scope=public, com.redhat.component=rhceph-container, description=Red Hat Ceph Storage 7, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, name=rhceph, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, ceph=True, CEPH_POINT_RELEASE=, maintainer=Guillaume Abrioux , com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1763362218, GIT_CLEAN=True, build-date=2025-11-26T19:44:28Z, io.openshift.tags=rhceph ceph, architecture=x86_64, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.k8s.description=Red Hat Ceph Storage 7, GIT_BRANCH=main, version=7, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, io.openshift.expose-services=, io.buildah.version=1.41.4)
Dec 15 04:55:17 localhost podman[307778]: 2025-12-15 09:55:17.845439097 +0000 UTC m=+0.043594481 image pull registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Dec 15 04:55:18 localhost podman[307799]: 2025-12-15 09:55:18.037442277 +0000 UTC m=+0.083730355 container remove 13620529f6c1be4550f43e87db810f7cdf2e7b2e4510fb1dcf83269eb00b8d85 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=musing_franklin, GIT_BRANCH=main, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, name=rhceph, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, GIT_REPO=https://github.com/ceph/ceph-container.git, maintainer=Guillaume Abrioux , summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.buildah.version=1.41.4, build-date=2025-11-26T19:44:28Z, io.openshift.tags=rhceph ceph, distribution-scope=public, version=7, vendor=Red Hat, Inc., org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, io.k8s.description=Red Hat Ceph Storage 7, GIT_CLEAN=True, io.openshift.expose-services=, RELEASE=main, release=1763362218, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, vcs-type=git, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, url=https://catalog.redhat.com/en/search?searchType=containers, description=Red Hat Ceph Storage 7, com.redhat.component=rhceph-container, CEPH_POINT_RELEASE=, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, ceph=True)
Dec 15 04:55:18 localhost systemd[1]: libpod-conmon-13620529f6c1be4550f43e87db810f7cdf2e7b2e4510fb1dcf83269eb00b8d85.scope: Deactivated successfully.
Dec 15 04:55:18 localhost ceph-mon[298913]: mon.np0005559462@1(peon) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005559462.localdomain.devices.0}] v 0)
Dec 15 04:55:18 localhost ceph-mon[298913]: mon.np0005559462@1(peon) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005559462.localdomain}] v 0)
Dec 15 04:55:18 localhost ceph-mon[298913]: Reconfiguring mds.mds.np0005559462.mhigvc (monmap changed)...
Dec 15 04:55:18 localhost ceph-mon[298913]: Reconfiguring daemon mds.mds.np0005559462.mhigvc on np0005559462.localdomain
Dec 15 04:55:18 localhost ceph-mon[298913]: from='mgr.26879 ' entity='mgr.np0005559462.fudvyx'
Dec 15 04:55:18 localhost ceph-mon[298913]: from='mgr.26879 ' entity='mgr.np0005559462.fudvyx'
Dec 15 04:55:18 localhost ceph-mon[298913]: from='mgr.26879 ' entity='mgr.np0005559462.fudvyx' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005559462.fudvyx", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch
Dec 15 04:55:18 localhost ceph-mon[298913]: from='mgr.26879 172.18.0.106:0/2534740212' entity='mgr.np0005559462.fudvyx' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005559462.fudvyx", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch
Dec 15 04:55:18 localhost ceph-mgr[292421]: [cephadm INFO cephadm.serve] Reconfiguring mon.np0005559462 (monmap changed)...
Dec 15 04:55:18 localhost ceph-mgr[292421]: log_channel(cephadm) log [INF] : Reconfiguring mon.np0005559462 (monmap changed)...
Dec 15 04:55:18 localhost ceph-mon[298913]: mon.np0005559462@1(peon) e15 handle_command mon_command({"prefix": "auth get", "entity": "mon."} v 0)
Dec 15 04:55:18 localhost ceph-mon[298913]: log_channel(audit) log [INF] : from='mgr.26879 172.18.0.106:0/2534740212' entity='mgr.np0005559462.fudvyx' cmd={"prefix": "auth get", "entity": "mon."} : dispatch
Dec 15 04:55:18 localhost ceph-mon[298913]: mon.np0005559462@1(peon) e15 handle_command mon_command({"prefix": "config get", "who": "mon", "key": "public_network"} v 0)
Dec 15 04:55:18 localhost ceph-mon[298913]: log_channel(audit) log [DBG] : from='mgr.26879 172.18.0.106:0/2534740212' entity='mgr.np0005559462.fudvyx' cmd={"prefix": "config get", "who": "mon", "key": "public_network"} : dispatch
Dec 15 04:55:18 localhost ceph-mon[298913]: mon.np0005559462@1(peon) e15 handle_command mon_command({"prefix": "config generate-minimal-conf"} v 0)
Dec 15 04:55:18 localhost ceph-mon[298913]: log_channel(audit) log [DBG] : from='mgr.26879 172.18.0.106:0/2534740212' entity='mgr.np0005559462.fudvyx' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 15 04:55:18 localhost ceph-mgr[292421]: [cephadm INFO cephadm.serve] Reconfiguring daemon mon.np0005559462 on np0005559462.localdomain
Dec 15 04:55:18 localhost ceph-mgr[292421]: log_channel(cephadm) log [INF] : Reconfiguring daemon mon.np0005559462 on np0005559462.localdomain
Dec 15 04:55:18 localhost podman[307870]:
Dec 15 04:55:18 localhost podman[307870]: 2025-12-15 09:55:18.754021256 +0000 UTC m=+0.075884487 container create 262c558de32167fe926ee7f03723596423bc004cbdf8e1587b78c082e8c872e8 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=sweet_elion, vcs-type=git, CEPH_POINT_RELEASE=, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.openshift.tags=rhceph ceph, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, vendor=Red Hat, Inc., io.buildah.version=1.41.4, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, RELEASE=main, description=Red Hat Ceph Storage 7, GIT_BRANCH=main, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, url=https://catalog.redhat.com/en/search?searchType=containers, name=rhceph, version=7, com.redhat.component=rhceph-container, release=1763362218, ceph=True, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.openshift.expose-services=, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, GIT_REPO=https://github.com/ceph/ceph-container.git, maintainer=Guillaume Abrioux , GIT_CLEAN=True, io.k8s.description=Red Hat Ceph Storage 7, distribution-scope=public, build-date=2025-11-26T19:44:28Z)
Dec 15 04:55:18 localhost systemd[1]: Started libpod-conmon-262c558de32167fe926ee7f03723596423bc004cbdf8e1587b78c082e8c872e8.scope.
Dec 15 04:55:18 localhost ceph-mgr[292421]: log_channel(cluster) log [DBG] : pgmap v33: 177 pgs: 177 active+clean; 104 MiB data, 548 MiB used, 41 GiB / 42 GiB avail
Dec 15 04:55:18 localhost systemd[1]: Started libcrun container.
Dec 15 04:55:18 localhost podman[307870]: 2025-12-15 09:55:18.811971395 +0000 UTC m=+0.133834626 container init 262c558de32167fe926ee7f03723596423bc004cbdf8e1587b78c082e8c872e8 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=sweet_elion, GIT_BRANCH=main, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, ceph=True, architecture=x86_64, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vcs-type=git, maintainer=Guillaume Abrioux , GIT_REPO=https://github.com/ceph/ceph-container.git, io.openshift.tags=rhceph ceph, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.component=rhceph-container, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, description=Red Hat Ceph Storage 7, GIT_CLEAN=True, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.k8s.description=Red Hat Ceph Storage 7, vendor=Red Hat, Inc., org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, io.openshift.expose-services=, build-date=2025-11-26T19:44:28Z, CEPH_POINT_RELEASE=, name=rhceph, RELEASE=main, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, release=1763362218, version=7)
Dec 15 04:55:18 localhost podman[307870]: 2025-12-15 09:55:18.72278858 +0000 UTC m=+0.044651831 image pull registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Dec 15 04:55:18 localhost podman[307870]: 2025-12-15 09:55:18.824520823 +0000 UTC m=+0.146384064 container start 262c558de32167fe926ee7f03723596423bc004cbdf8e1587b78c082e8c872e8 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=sweet_elion, version=7, GIT_REPO=https://github.com/ceph/ceph-container.git, build-date=2025-11-26T19:44:28Z, io.buildah.version=1.41.4, maintainer=Guillaume Abrioux , CEPH_POINT_RELEASE=, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, RELEASE=main, io.openshift.tags=rhceph ceph, vendor=Red Hat, Inc., GIT_BRANCH=main, architecture=x86_64, vcs-type=git, release=1763362218, description=Red Hat Ceph Storage 7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.openshift.expose-services=, url=https://catalog.redhat.com/en/search?searchType=containers, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.k8s.description=Red Hat Ceph Storage 7, name=rhceph, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_CLEAN=True, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=rhceph-container, ceph=True, distribution-scope=public)
Dec 15 04:55:18 localhost podman[307870]: 2025-12-15 09:55:18.824752921 +0000 UTC m=+0.146616202 container attach 262c558de32167fe926ee7f03723596423bc004cbdf8e1587b78c082e8c872e8 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=sweet_elion, io.k8s.description=Red Hat Ceph Storage 7, release=1763362218, GIT_REPO=https://github.com/ceph/ceph-container.git, io.openshift.tags=rhceph ceph, maintainer=Guillaume Abrioux , summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, description=Red Hat Ceph Storage 7, io.openshift.expose-services=, GIT_CLEAN=True, GIT_BRANCH=main, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, com.redhat.component=rhceph-container, ceph=True, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, name=rhceph, build-date=2025-11-26T19:44:28Z, vendor=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, version=7, RELEASE=main, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, CEPH_POINT_RELEASE=, architecture=x86_64)
Dec 15 04:55:18 localhost sweet_elion[307885]: 167 167
Dec 15 04:55:18 localhost systemd[1]: libpod-262c558de32167fe926ee7f03723596423bc004cbdf8e1587b78c082e8c872e8.scope: Deactivated successfully.
Dec 15 04:55:18 localhost podman[307870]: 2025-12-15 09:55:18.827495857 +0000 UTC m=+0.149359088 container died 262c558de32167fe926ee7f03723596423bc004cbdf8e1587b78c082e8c872e8 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=sweet_elion, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, distribution-scope=public, maintainer=Guillaume Abrioux , CEPH_POINT_RELEASE=, vcs-type=git, GIT_REPO=https://github.com/ceph/ceph-container.git, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, release=1763362218, RELEASE=main, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.openshift.expose-services=, io.k8s.description=Red Hat Ceph Storage 7, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, build-date=2025-11-26T19:44:28Z, io.openshift.tags=rhceph ceph, vendor=Red Hat, Inc., io.buildah.version=1.41.4, ceph=True, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., version=7, name=rhceph, com.redhat.component=rhceph-container, url=https://catalog.redhat.com/en/search?searchType=containers, description=Red Hat Ceph Storage 7, GIT_CLEAN=True, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, GIT_BRANCH=main)
Dec 15 04:55:18 localhost podman[307890]: 2025-12-15 09:55:18.920248841 +0000 UTC m=+0.081888125 container remove 262c558de32167fe926ee7f03723596423bc004cbdf8e1587b78c082e8c872e8 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=sweet_elion, com.redhat.component=rhceph-container, vendor=Red Hat, Inc., io.buildah.version=1.41.4, GIT_REPO=https://github.com/ceph/ceph-container.git, release=1763362218, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.k8s.description=Red Hat Ceph Storage 7, GIT_BRANCH=main, GIT_CLEAN=True, RELEASE=main, vcs-type=git, distribution-scope=public, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, architecture=x86_64, io.openshift.tags=rhceph ceph, build-date=2025-11-26T19:44:28Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, CEPH_POINT_RELEASE=, maintainer=Guillaume Abrioux , io.openshift.expose-services=, url=https://catalog.redhat.com/en/search?searchType=containers, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, version=7, ceph=True, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, description=Red Hat Ceph Storage 7, name=rhceph)
Dec 15 04:55:18 localhost systemd[1]: libpod-conmon-262c558de32167fe926ee7f03723596423bc004cbdf8e1587b78c082e8c872e8.scope: Deactivated successfully.
Dec 15 04:55:18 localhost ceph-mon[298913]: mon.np0005559462@1(peon) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005559463.localdomain.devices.0}] v 0)
Dec 15 04:55:19 localhost ceph-mon[298913]: mon.np0005559462@1(peon) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005559463.localdomain}] v 0)
Dec 15 04:55:19 localhost ceph-mgr[292421]: [cephadm INFO cephadm.serve] Reconfiguring crash.np0005559463 (monmap changed)...
Dec 15 04:55:19 localhost ceph-mgr[292421]: log_channel(cephadm) log [INF] : Reconfiguring crash.np0005559463 (monmap changed)...
Dec 15 04:55:19 localhost ceph-mon[298913]: mon.np0005559462@1(peon) e15 handle_command mon_command({"prefix": "auth get-or-create", "entity": "client.crash.np0005559463.localdomain", "caps": ["mon", "profile crash", "mgr", "profile crash"]} v 0)
Dec 15 04:55:19 localhost ceph-mon[298913]: log_channel(audit) log [INF] : from='mgr.26879 172.18.0.106:0/2534740212' entity='mgr.np0005559462.fudvyx' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005559463.localdomain", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch
Dec 15 04:55:19 localhost ceph-mon[298913]: mon.np0005559462@1(peon) e15 handle_command mon_command({"prefix": "config generate-minimal-conf"} v 0)
Dec 15 04:55:19 localhost ceph-mon[298913]: log_channel(audit) log [DBG] : from='mgr.26879 172.18.0.106:0/2534740212' entity='mgr.np0005559462.fudvyx' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 15 04:55:19 localhost ceph-mgr[292421]: [cephadm INFO cephadm.serve] Reconfiguring daemon crash.np0005559463 on np0005559463.localdomain
Dec 15 04:55:19 localhost ceph-mgr[292421]: log_channel(cephadm) log [INF] : Reconfiguring daemon crash.np0005559463 on np0005559463.localdomain
Dec 15 04:55:19 localhost systemd[1]: tmp-crun.KshAlG.mount: Deactivated successfully.
Dec 15 04:55:19 localhost systemd[1]: var-lib-containers-storage-overlay-ff84ebe222181b1f28125a8d2c2ed492b7dc5445d8e71c828b7770603b7e27b0-merged.mount: Deactivated successfully.
Dec 15 04:55:19 localhost ceph-mon[298913]: Reconfiguring mgr.np0005559462.fudvyx (monmap changed)...
Dec 15 04:55:19 localhost ceph-mon[298913]: Reconfiguring daemon mgr.np0005559462.fudvyx on np0005559462.localdomain
Dec 15 04:55:19 localhost ceph-mon[298913]: from='mgr.26879 ' entity='mgr.np0005559462.fudvyx'
Dec 15 04:55:19 localhost ceph-mon[298913]: from='mgr.26879 ' entity='mgr.np0005559462.fudvyx'
Dec 15 04:55:19 localhost ceph-mon[298913]: from='mgr.26879 172.18.0.106:0/2534740212' entity='mgr.np0005559462.fudvyx' cmd={"prefix": "auth get", "entity": "mon."} : dispatch
Dec 15 04:55:19 localhost ceph-mon[298913]: from='mgr.26879 ' entity='mgr.np0005559462.fudvyx'
Dec 15 04:55:19 localhost ceph-mon[298913]: from='mgr.26879 ' entity='mgr.np0005559462.fudvyx'
Dec 15 04:55:19 localhost ceph-mon[298913]: from='mgr.26879 ' entity='mgr.np0005559462.fudvyx' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005559463.localdomain", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch
Dec 15 04:55:19 localhost ceph-mon[298913]: from='mgr.26879 172.18.0.106:0/2534740212' entity='mgr.np0005559462.fudvyx' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005559463.localdomain", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch
Dec 15 04:55:19 localhost ceph-mon[298913]: mon.np0005559462@1(peon) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005559463.localdomain.devices.0}] v 0)
Dec 15 04:55:19 localhost ceph-mon[298913]: mon.np0005559462@1(peon) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005559463.localdomain}] v 0)
Dec 15 04:55:19 localhost ceph-mgr[292421]: [cephadm INFO cephadm.serve] Reconfiguring osd.2 (monmap changed)...
Dec 15 04:55:19 localhost ceph-mgr[292421]: log_channel(cephadm) log [INF] : Reconfiguring osd.2 (monmap changed)...
Dec 15 04:55:19 localhost ceph-mon[298913]: mon.np0005559462@1(peon) e15 handle_command mon_command({"prefix": "auth get", "entity": "osd.2"} v 0)
Dec 15 04:55:19 localhost ceph-mon[298913]: log_channel(audit) log [INF] : from='mgr.26879 172.18.0.106:0/2534740212' entity='mgr.np0005559462.fudvyx' cmd={"prefix": "auth get", "entity": "osd.2"} : dispatch
Dec 15 04:55:19 localhost ceph-mon[298913]: mon.np0005559462@1(peon) e15 handle_command mon_command({"prefix": "config generate-minimal-conf"} v 0)
Dec 15 04:55:19 localhost ceph-mon[298913]: log_channel(audit) log [DBG] : from='mgr.26879 172.18.0.106:0/2534740212' entity='mgr.np0005559462.fudvyx' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 15 04:55:19 localhost ceph-mgr[292421]: [cephadm INFO cephadm.serve] Reconfiguring daemon osd.2 on np0005559463.localdomain
Dec 15 04:55:19 localhost ceph-mgr[292421]: log_channel(cephadm) log [INF] : Reconfiguring daemon osd.2 on np0005559463.localdomain
Dec 15 04:55:20 localhost ceph-mon[298913]: mon.np0005559462@1(peon).osd e88 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 15 04:55:20 localhost ceph-mon[298913]: Reconfiguring mon.np0005559462 (monmap changed)...
Dec 15 04:55:20 localhost ceph-mon[298913]: Reconfiguring daemon mon.np0005559462 on np0005559462.localdomain
Dec 15 04:55:20 localhost ceph-mon[298913]: Reconfiguring crash.np0005559463 (monmap changed)...
Dec 15 04:55:20 localhost ceph-mon[298913]: Reconfiguring daemon crash.np0005559463 on np0005559463.localdomain
Dec 15 04:55:20 localhost ceph-mgr[292421]: [balancer INFO root] Optimize plan auto_2025-12-15_09:55:20
Dec 15 04:55:20 localhost ceph-mon[298913]: from='mgr.26879 ' entity='mgr.np0005559462.fudvyx'
Dec 15 04:55:20 localhost ceph-mon[298913]: from='mgr.26879 ' entity='mgr.np0005559462.fudvyx'
Dec 15 04:55:20 localhost ceph-mgr[292421]: [balancer INFO root] Mode upmap, max misplaced 0.050000
Dec 15 04:55:20 localhost ceph-mon[298913]: from='mgr.26879 172.18.0.106:0/2534740212' entity='mgr.np0005559462.fudvyx' cmd={"prefix": "auth get", "entity": "osd.2"} : dispatch
Dec 15 04:55:20 localhost ceph-mgr[292421]: [balancer INFO root] do_upmap
Dec 15 04:55:20 localhost ceph-mgr[292421]: [balancer INFO root] pools ['manila_data', 'volumes', 'manila_metadata', 'images', '.mgr', 'vms', 'backups']
Dec 15 04:55:20 localhost ceph-mgr[292421]: [balancer INFO root] prepared 0/10 changes
Dec 15 04:55:20 localhost ceph-mgr[292421]: [pg_autoscaler INFO root] _maybe_adjust
Dec 15 04:55:20 localhost ceph-mgr[292421]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 45071990784
Dec 15 04:55:20 localhost ceph-mgr[292421]: [pg_autoscaler INFO root] Pool '.mgr' root_id -1 using 3.080724804578448e-05 of space, bias 1.0, pg target 0.006161449609156895 quantized to 1 (current 1)
Dec 15 04:55:20 localhost ceph-mgr[292421]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 45071990784
Dec 15 04:55:20 localhost ceph-mgr[292421]: [pg_autoscaler INFO root] Pool 'vms' root_id -1 using 0.003325274375348967 of space, bias 1.0, pg target 0.6650548750697934 quantized to 32 (current 32)
Dec 15 04:55:20 localhost ceph-mgr[292421]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 45071990784
Dec 15 04:55:20 localhost ceph-mgr[292421]: [pg_autoscaler INFO root] Pool 'volumes' root_id -1 using 0.0 of space, bias 1.0, pg target 0.0 quantized to 32 (current 32)
Dec 15 04:55:20 localhost ceph-mgr[292421]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 45071990784
Dec 15 04:55:20 localhost ceph-mgr[292421]: [pg_autoscaler INFO root] Pool 'images' root_id -1 using 0.0014449417225013959 of space, bias 1.0, pg target 0.2885066972594454 quantized to 32 (current 32)
Dec 15 04:55:20 localhost ceph-mgr[292421]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 45071990784
Dec 15 04:55:20 localhost ceph-mgr[292421]: [pg_autoscaler INFO root] Pool 'backups' root_id -1 using 0.0 of space, bias 1.0, pg target 0.0 quantized to 32 (current 32)
Dec 15 04:55:20 localhost ceph-mgr[292421]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 45071990784
Dec 15 04:55:20 localhost ceph-mgr[292421]: [pg_autoscaler INFO root] Pool 'manila_data' root_id -1 using 0.0 of space, bias 1.0, pg target 0.0 quantized to 32 (current 32)
Dec 15 04:55:20 localhost ceph-mgr[292421]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 45071990784
Dec 15 04:55:20 localhost ceph-mgr[292421]: [pg_autoscaler INFO root] Pool 'manila_metadata' root_id -1 using 2.453674623115578e-06 of space, bias 4.0, pg target 0.0019596681323283084 quantized to 16 (current 16)
Dec 15 04:55:20 localhost ceph-mgr[292421]: [volumes INFO mgr_util] scanning for idle connections..
Dec 15 04:55:20 localhost ceph-mgr[292421]: [volumes INFO mgr_util] cleaning up connections: []
Dec 15 04:55:20 localhost ceph-mgr[292421]: [rbd_support INFO root] MirrorSnapshotScheduleHandler: load_schedules
Dec 15 04:55:20 localhost ceph-mgr[292421]: [rbd_support INFO root] load_schedules: vms, start_after=
Dec 15 04:55:20 localhost ceph-mgr[292421]: [rbd_support INFO root] load_schedules: volumes, start_after=
Dec 15 04:55:20 localhost ceph-mgr[292421]: [rbd_support INFO root] load_schedules: images, start_after=
Dec 15 04:55:20 localhost ceph-mgr[292421]: [rbd_support INFO root] load_schedules: backups, start_after=
Dec 15 04:55:20 localhost ceph-mgr[292421]: [volumes INFO mgr_util] scanning for idle connections..
Dec 15 04:55:20 localhost ceph-mgr[292421]: [volumes INFO mgr_util] cleaning up connections: []
Dec 15 04:55:20 localhost ceph-mgr[292421]: [volumes INFO mgr_util] scanning for idle connections..
Dec 15 04:55:20 localhost ceph-mgr[292421]: [volumes INFO mgr_util] cleaning up connections: []
Dec 15 04:55:20 localhost ceph-mgr[292421]: [rbd_support INFO root] TrashPurgeScheduleHandler: load_schedules
Dec 15 04:55:20 localhost ceph-mgr[292421]: [rbd_support INFO root] load_schedules: vms, start_after=
Dec 15 04:55:20 localhost ceph-mgr[292421]: [rbd_support INFO root] load_schedules: volumes, start_after=
Dec 15 04:55:20 localhost ceph-mgr[292421]: [rbd_support INFO root] load_schedules: images, start_after=
Dec 15 04:55:20 localhost ceph-mgr[292421]: [rbd_support INFO root] load_schedules: backups, start_after=
Dec 15 04:55:20 localhost ceph-mgr[292421]: log_channel(cluster) log [DBG] : pgmap v34: 177 pgs: 177 active+clean; 104 MiB data, 548 MiB used, 41 GiB / 42 GiB avail
Dec 15 04:55:20 localhost nova_compute[286344]: 2025-12-15 09:55:20.824 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 15 04:55:20 localhost nova_compute[286344]: 2025-12-15 09:55:20.834 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 15 04:55:20 localhost ceph-mon[298913]: mon.np0005559462@1(peon) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005559463.localdomain.devices.0}] v 0)
Dec 15 04:55:20 localhost ceph-mon[298913]: mon.np0005559462@1(peon) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005559463.localdomain}] v 0)
Dec 15 04:55:20 localhost ceph-mgr[292421]: [cephadm INFO cephadm.serve] Reconfiguring osd.5 (monmap changed)...
Dec 15 04:55:20 localhost ceph-mgr[292421]: log_channel(cephadm) log [INF] : Reconfiguring osd.5 (monmap changed)...
Dec 15 04:55:20 localhost ceph-mon[298913]: mon.np0005559462@1(peon) e15 handle_command mon_command({"prefix": "auth get", "entity": "osd.5"} v 0)
Dec 15 04:55:20 localhost ceph-mon[298913]: log_channel(audit) log [INF] : from='mgr.26879 172.18.0.106:0/2534740212' entity='mgr.np0005559462.fudvyx' cmd={"prefix": "auth get", "entity": "osd.5"} : dispatch
Dec 15 04:55:20 localhost ceph-mon[298913]: mon.np0005559462@1(peon) e15 handle_command mon_command({"prefix": "config generate-minimal-conf"} v 0)
Dec 15 04:55:20 localhost ceph-mon[298913]: log_channel(audit) log [DBG] : from='mgr.26879 172.18.0.106:0/2534740212' entity='mgr.np0005559462.fudvyx' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 15 04:55:20 localhost ceph-mgr[292421]: [cephadm INFO cephadm.serve] Reconfiguring daemon osd.5 on np0005559463.localdomain
Dec 15 04:55:20 localhost ceph-mgr[292421]: log_channel(cephadm) log [INF] : Reconfiguring daemon osd.5 on np0005559463.localdomain
Dec 15 04:55:21 localhost ceph-mon[298913]: Reconfiguring osd.2 (monmap changed)...
Dec 15 04:55:21 localhost ceph-mon[298913]: Reconfiguring daemon osd.2 on np0005559463.localdomain Dec 15 04:55:21 localhost ceph-mon[298913]: from='mgr.26879 ' entity='mgr.np0005559462.fudvyx' Dec 15 04:55:21 localhost ceph-mon[298913]: from='mgr.26879 ' entity='mgr.np0005559462.fudvyx' Dec 15 04:55:21 localhost ceph-mon[298913]: Reconfiguring osd.5 (monmap changed)... Dec 15 04:55:21 localhost ceph-mon[298913]: from='mgr.26879 172.18.0.106:0/2534740212' entity='mgr.np0005559462.fudvyx' cmd={"prefix": "auth get", "entity": "osd.5"} : dispatch Dec 15 04:55:21 localhost ceph-mon[298913]: Reconfiguring daemon osd.5 on np0005559463.localdomain Dec 15 04:55:21 localhost ceph-mon[298913]: mon.np0005559462@1(peon) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005559463.localdomain.devices.0}] v 0) Dec 15 04:55:21 localhost ceph-mon[298913]: mon.np0005559462@1(peon) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005559463.localdomain}] v 0) Dec 15 04:55:21 localhost ceph-mgr[292421]: [cephadm INFO cephadm.serve] Reconfiguring mds.mds.np0005559463.rdpgze (monmap changed)... Dec 15 04:55:21 localhost ceph-mgr[292421]: log_channel(cephadm) log [INF] : Reconfiguring mds.mds.np0005559463.rdpgze (monmap changed)... 
Dec 15 04:55:21 localhost ceph-mon[298913]: mon.np0005559462@1(peon) e15 handle_command mon_command({"prefix": "auth get-or-create", "entity": "mds.mds.np0005559463.rdpgze", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]} v 0) Dec 15 04:55:21 localhost ceph-mon[298913]: log_channel(audit) log [INF] : from='mgr.26879 172.18.0.106:0/2534740212' entity='mgr.np0005559462.fudvyx' cmd={"prefix": "auth get-or-create", "entity": "mds.mds.np0005559463.rdpgze", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]} : dispatch Dec 15 04:55:21 localhost ceph-mon[298913]: mon.np0005559462@1(peon) e15 handle_command mon_command({"prefix": "config generate-minimal-conf"} v 0) Dec 15 04:55:21 localhost ceph-mon[298913]: log_channel(audit) log [DBG] : from='mgr.26879 172.18.0.106:0/2534740212' entity='mgr.np0005559462.fudvyx' cmd={"prefix": "config generate-minimal-conf"} : dispatch Dec 15 04:55:21 localhost ceph-mgr[292421]: [cephadm INFO cephadm.serve] Reconfiguring daemon mds.mds.np0005559463.rdpgze on np0005559463.localdomain Dec 15 04:55:21 localhost ceph-mgr[292421]: log_channel(cephadm) log [INF] : Reconfiguring daemon mds.mds.np0005559463.rdpgze on np0005559463.localdomain Dec 15 04:55:22 localhost ceph-mgr[292421]: log_channel(cluster) log [DBG] : pgmap v35: 177 pgs: 177 active+clean; 104 MiB data, 548 MiB used, 41 GiB / 42 GiB avail Dec 15 04:55:22 localhost ceph-mon[298913]: mon.np0005559462@1(peon) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005559463.localdomain.devices.0}] v 0) Dec 15 04:55:22 localhost ceph-mon[298913]: mon.np0005559462@1(peon) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005559463.localdomain}] v 0) Dec 15 04:55:22 localhost ceph-mgr[292421]: [cephadm INFO cephadm.serve] Reconfiguring mgr.np0005559463.daptkf (monmap changed)... 
Dec 15 04:55:22 localhost ceph-mgr[292421]: log_channel(cephadm) log [INF] : Reconfiguring mgr.np0005559463.daptkf (monmap changed)... Dec 15 04:55:22 localhost ceph-mon[298913]: mon.np0005559462@1(peon) e15 handle_command mon_command({"prefix": "auth get-or-create", "entity": "mgr.np0005559463.daptkf", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} v 0) Dec 15 04:55:22 localhost ceph-mon[298913]: log_channel(audit) log [INF] : from='mgr.26879 172.18.0.106:0/2534740212' entity='mgr.np0005559462.fudvyx' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005559463.daptkf", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch Dec 15 04:55:22 localhost ceph-mon[298913]: mon.np0005559462@1(peon) e15 handle_command mon_command({"prefix": "mgr services"} v 0) Dec 15 04:55:22 localhost ceph-mon[298913]: log_channel(audit) log [DBG] : from='mgr.26879 172.18.0.106:0/2534740212' entity='mgr.np0005559462.fudvyx' cmd={"prefix": "mgr services"} : dispatch Dec 15 04:55:22 localhost ceph-mon[298913]: mon.np0005559462@1(peon) e15 handle_command mon_command({"prefix": "config generate-minimal-conf"} v 0) Dec 15 04:55:22 localhost ceph-mon[298913]: log_channel(audit) log [DBG] : from='mgr.26879 172.18.0.106:0/2534740212' entity='mgr.np0005559462.fudvyx' cmd={"prefix": "config generate-minimal-conf"} : dispatch Dec 15 04:55:22 localhost ceph-mgr[292421]: [cephadm INFO cephadm.serve] Reconfiguring daemon mgr.np0005559463.daptkf on np0005559463.localdomain Dec 15 04:55:22 localhost ceph-mgr[292421]: log_channel(cephadm) log [INF] : Reconfiguring daemon mgr.np0005559463.daptkf on np0005559463.localdomain Dec 15 04:55:22 localhost ceph-mon[298913]: from='mgr.26879 ' entity='mgr.np0005559462.fudvyx' Dec 15 04:55:22 localhost ceph-mon[298913]: from='mgr.26879 ' entity='mgr.np0005559462.fudvyx' Dec 15 04:55:22 localhost ceph-mon[298913]: from='mgr.26879 ' entity='mgr.np0005559462.fudvyx' cmd={"prefix": "auth get-or-create", "entity": 
"mds.mds.np0005559463.rdpgze", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]} : dispatch Dec 15 04:55:22 localhost ceph-mon[298913]: Reconfiguring mds.mds.np0005559463.rdpgze (monmap changed)... Dec 15 04:55:22 localhost ceph-mon[298913]: from='mgr.26879 172.18.0.106:0/2534740212' entity='mgr.np0005559462.fudvyx' cmd={"prefix": "auth get-or-create", "entity": "mds.mds.np0005559463.rdpgze", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]} : dispatch Dec 15 04:55:22 localhost ceph-mon[298913]: Reconfiguring daemon mds.mds.np0005559463.rdpgze on np0005559463.localdomain Dec 15 04:55:22 localhost ceph-mon[298913]: from='mgr.26879 ' entity='mgr.np0005559462.fudvyx' Dec 15 04:55:22 localhost ceph-mon[298913]: from='mgr.26879 ' entity='mgr.np0005559462.fudvyx' Dec 15 04:55:22 localhost ceph-mon[298913]: from='mgr.26879 ' entity='mgr.np0005559462.fudvyx' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005559463.daptkf", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch Dec 15 04:55:22 localhost ceph-mon[298913]: from='mgr.26879 172.18.0.106:0/2534740212' entity='mgr.np0005559462.fudvyx' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005559463.daptkf", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch Dec 15 04:55:23 localhost ceph-mgr[292421]: log_channel(audit) log [DBG] : from='client.54146 -' entity='client.admin' cmd=[{"prefix": "orch apply", "target": ["mon-mgr", ""]}]: dispatch Dec 15 04:55:23 localhost ceph-mgr[292421]: [cephadm INFO root] Saving service mon spec with placement label:mon Dec 15 04:55:23 localhost ceph-mgr[292421]: log_channel(cephadm) log [INF] : Saving service mon spec with placement label:mon Dec 15 04:55:23 localhost ceph-mon[298913]: mon.np0005559462@1(peon) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/spec.mon}] v 0) Dec 15 04:55:23 localhost ceph-mon[298913]: 
mon.np0005559462@1(peon) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005559463.localdomain.devices.0}] v 0) Dec 15 04:55:23 localhost ceph-mon[298913]: mon.np0005559462@1(peon) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005559463.localdomain}] v 0) Dec 15 04:55:23 localhost ceph-mgr[292421]: [cephadm INFO cephadm.serve] Reconfiguring crash.np0005559464 (monmap changed)... Dec 15 04:55:23 localhost ceph-mgr[292421]: log_channel(cephadm) log [INF] : Reconfiguring crash.np0005559464 (monmap changed)... Dec 15 04:55:23 localhost ceph-mon[298913]: mon.np0005559462@1(peon) e15 handle_command mon_command({"prefix": "auth get-or-create", "entity": "client.crash.np0005559464.localdomain", "caps": ["mon", "profile crash", "mgr", "profile crash"]} v 0) Dec 15 04:55:23 localhost ceph-mon[298913]: log_channel(audit) log [INF] : from='mgr.26879 172.18.0.106:0/2534740212' entity='mgr.np0005559462.fudvyx' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005559464.localdomain", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch Dec 15 04:55:23 localhost ceph-mon[298913]: mon.np0005559462@1(peon) e15 handle_command mon_command({"prefix": "config generate-minimal-conf"} v 0) Dec 15 04:55:23 localhost ceph-mon[298913]: log_channel(audit) log [DBG] : from='mgr.26879 172.18.0.106:0/2534740212' entity='mgr.np0005559462.fudvyx' cmd={"prefix": "config generate-minimal-conf"} : dispatch Dec 15 04:55:23 localhost ceph-mgr[292421]: [cephadm INFO cephadm.serve] Reconfiguring daemon crash.np0005559464 on np0005559464.localdomain Dec 15 04:55:23 localhost ceph-mgr[292421]: log_channel(cephadm) log [INF] : Reconfiguring daemon crash.np0005559464 on np0005559464.localdomain Dec 15 04:55:23 localhost ceph-mon[298913]: Reconfiguring mgr.np0005559463.daptkf (monmap changed)... 
Dec 15 04:55:23 localhost ceph-mon[298913]: Reconfiguring daemon mgr.np0005559463.daptkf on np0005559463.localdomain Dec 15 04:55:23 localhost ceph-mon[298913]: from='mgr.26879 ' entity='mgr.np0005559462.fudvyx' Dec 15 04:55:23 localhost ceph-mon[298913]: from='mgr.26879 ' entity='mgr.np0005559462.fudvyx' Dec 15 04:55:23 localhost ceph-mon[298913]: from='mgr.26879 ' entity='mgr.np0005559462.fudvyx' Dec 15 04:55:23 localhost ceph-mon[298913]: from='mgr.26879 ' entity='mgr.np0005559462.fudvyx' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005559464.localdomain", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch Dec 15 04:55:23 localhost ceph-mon[298913]: from='mgr.26879 172.18.0.106:0/2534740212' entity='mgr.np0005559462.fudvyx' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005559464.localdomain", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch Dec 15 04:55:24 localhost systemd[1]: Started /usr/bin/podman healthcheck run 67ee72fcd99c896f79e8807764b1d0f04e7d45927d06e306c0986394e8ed42a0. Dec 15 04:55:24 localhost systemd[1]: Started /usr/bin/podman healthcheck run 9f0f481c937606298203797a71924b7614a5d9e3f4a8afd837cfa5b9602aedeb. Dec 15 04:55:24 localhost systemd[1]: Started /usr/bin/podman healthcheck run b3d8483ade539500e228c2af9c0c3e647febb5ec7903610a83aea32631007d4a. 
Dec 15 04:55:24 localhost ceph-mon[298913]: mon.np0005559462@1(peon) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005559464.localdomain.devices.0}] v 0) Dec 15 04:55:24 localhost ceph-mon[298913]: mon.np0005559462@1(peon) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005559464.localdomain}] v 0) Dec 15 04:55:24 localhost podman[307906]: 2025-12-15 09:55:24.772812169 +0000 UTC m=+0.098069373 container health_status 67ee72fcd99c896f79e8807764b1d0f04e7d45927d06e306c0986394e8ed42a0 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}) Dec 15 04:55:24 localhost ceph-mgr[292421]: [cephadm INFO cephadm.serve] Reconfiguring osd.1 
(monmap changed)... Dec 15 04:55:24 localhost ceph-mgr[292421]: log_channel(cephadm) log [INF] : Reconfiguring osd.1 (monmap changed)... Dec 15 04:55:24 localhost ceph-mon[298913]: mon.np0005559462@1(peon) e15 handle_command mon_command({"prefix": "auth get", "entity": "osd.1"} v 0) Dec 15 04:55:24 localhost ceph-mon[298913]: log_channel(audit) log [INF] : from='mgr.26879 172.18.0.106:0/2534740212' entity='mgr.np0005559462.fudvyx' cmd={"prefix": "auth get", "entity": "osd.1"} : dispatch Dec 15 04:55:24 localhost ceph-mon[298913]: mon.np0005559462@1(peon) e15 handle_command mon_command({"prefix": "config generate-minimal-conf"} v 0) Dec 15 04:55:24 localhost ceph-mon[298913]: log_channel(audit) log [DBG] : from='mgr.26879 172.18.0.106:0/2534740212' entity='mgr.np0005559462.fudvyx' cmd={"prefix": "config generate-minimal-conf"} : dispatch Dec 15 04:55:24 localhost ceph-mgr[292421]: [cephadm INFO cephadm.serve] Reconfiguring daemon osd.1 on np0005559464.localdomain Dec 15 04:55:24 localhost ceph-mgr[292421]: log_channel(cephadm) log [INF] : Reconfiguring daemon osd.1 on np0005559464.localdomain Dec 15 04:55:24 localhost ceph-mgr[292421]: log_channel(cluster) log [DBG] : pgmap v36: 177 pgs: 177 active+clean; 104 MiB data, 548 MiB used, 41 GiB / 42 GiB avail Dec 15 04:55:24 localhost podman[307907]: 2025-12-15 09:55:24.816719777 +0000 UTC m=+0.141384375 container health_status 9f0f481c937606298203797a71924b7614a5d9e3f4a8afd837cfa5b9602aedeb (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, container_name=multipathd, managed_by=edpm_ansible, org.label-schema.build-date=20251202, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': 
{'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, org.label-schema.name=CentOS Stream 9 Base Image) Dec 15 04:55:24 localhost podman[307907]: 2025-12-15 09:55:24.832268259 +0000 UTC m=+0.156932877 container exec_died 9f0f481c937606298203797a71924b7614a5d9e3f4a8afd837cfa5b9602aedeb (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, config_id=multipathd, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, container_name=multipathd, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, managed_by=edpm_ansible) Dec 15 04:55:24 localhost ceph-mgr[292421]: log_channel(audit) log [DBG] : from='client.44461 -' entity='client.admin' cmd=[{"prefix": "orch ps", "daemon_type": "mon", "daemon_id": "np0005559464", "target": ["mon-mgr", ""], "format": "json"}]: dispatch Dec 15 04:55:24 localhost systemd[1]: 9f0f481c937606298203797a71924b7614a5d9e3f4a8afd837cfa5b9602aedeb.service: Deactivated successfully. 
Dec 15 04:55:24 localhost podman[307906]: 2025-12-15 09:55:24.887019189 +0000 UTC m=+0.212276373 container exec_died 67ee72fcd99c896f79e8807764b1d0f04e7d45927d06e306c0986394e8ed42a0 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible) Dec 15 04:55:24 localhost systemd[1]: 67ee72fcd99c896f79e8807764b1d0f04e7d45927d06e306c0986394e8ed42a0.service: Deactivated successfully. 
Dec 15 04:55:24 localhost podman[307908]: 2025-12-15 09:55:24.962563406 +0000 UTC m=+0.282650226 container health_status b3d8483ade539500e228c2af9c0c3e647febb5ec7903610a83aea32631007d4a (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ceilometer_agent_compute, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '07f3af15d79276418f54a8dde2fc251064527c3571430e14af77aaa2c01f1193-cb1e9478634a00c8fb5ad9cd7fcd38c7b2a1a85187dfaa618bc715c24c2b15fb'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, 
managed_by=edpm_ansible) Dec 15 04:55:24 localhost podman[307908]: 2025-12-15 09:55:24.999270195 +0000 UTC m=+0.319357015 container exec_died b3d8483ade539500e228c2af9c0c3e647febb5ec7903610a83aea32631007d4a (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '07f3af15d79276418f54a8dde2fc251064527c3571430e14af77aaa2c01f1193-cb1e9478634a00c8fb5ad9cd7fcd38c7b2a1a85187dfaa618bc715c24c2b15fb'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.build-date=20251202, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_id=ceilometer_agent_compute, container_name=ceilometer_agent_compute, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS 
Stream 9 Base Image) Dec 15 04:55:25 localhost ceph-mon[298913]: Saving service mon spec with placement label:mon Dec 15 04:55:25 localhost ceph-mon[298913]: Reconfiguring crash.np0005559464 (monmap changed)... Dec 15 04:55:25 localhost ceph-mon[298913]: Reconfiguring daemon crash.np0005559464 on np0005559464.localdomain Dec 15 04:55:25 localhost ceph-mon[298913]: from='mgr.26879 ' entity='mgr.np0005559462.fudvyx' Dec 15 04:55:25 localhost ceph-mon[298913]: from='mgr.26879 ' entity='mgr.np0005559462.fudvyx' Dec 15 04:55:25 localhost ceph-mon[298913]: from='mgr.26879 172.18.0.106:0/2534740212' entity='mgr.np0005559462.fudvyx' cmd={"prefix": "auth get", "entity": "osd.1"} : dispatch Dec 15 04:55:25 localhost systemd[1]: b3d8483ade539500e228c2af9c0c3e647febb5ec7903610a83aea32631007d4a.service: Deactivated successfully. Dec 15 04:55:25 localhost ceph-mon[298913]: mon.np0005559462@1(peon).osd e88 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Dec 15 04:55:25 localhost ceph-mon[298913]: mon.np0005559462@1(peon) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005559464.localdomain.devices.0}] v 0) Dec 15 04:55:25 localhost ceph-mon[298913]: mon.np0005559462@1(peon) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005559464.localdomain}] v 0) Dec 15 04:55:25 localhost ceph-mgr[292421]: [cephadm INFO cephadm.serve] Reconfiguring osd.4 (monmap changed)... Dec 15 04:55:25 localhost ceph-mgr[292421]: log_channel(cephadm) log [INF] : Reconfiguring osd.4 (monmap changed)... 
Dec 15 04:55:25 localhost ceph-mon[298913]: mon.np0005559462@1(peon) e15 handle_command mon_command({"prefix": "auth get", "entity": "osd.4"} v 0) Dec 15 04:55:25 localhost ceph-mon[298913]: log_channel(audit) log [INF] : from='mgr.26879 172.18.0.106:0/2534740212' entity='mgr.np0005559462.fudvyx' cmd={"prefix": "auth get", "entity": "osd.4"} : dispatch Dec 15 04:55:25 localhost ceph-mon[298913]: mon.np0005559462@1(peon) e15 handle_command mon_command({"prefix": "config generate-minimal-conf"} v 0) Dec 15 04:55:25 localhost ceph-mon[298913]: log_channel(audit) log [DBG] : from='mgr.26879 172.18.0.106:0/2534740212' entity='mgr.np0005559462.fudvyx' cmd={"prefix": "config generate-minimal-conf"} : dispatch Dec 15 04:55:25 localhost ceph-mgr[292421]: [cephadm INFO cephadm.serve] Reconfiguring daemon osd.4 on np0005559464.localdomain Dec 15 04:55:25 localhost ceph-mgr[292421]: log_channel(cephadm) log [INF] : Reconfiguring daemon osd.4 on np0005559464.localdomain Dec 15 04:55:25 localhost nova_compute[286344]: 2025-12-15 09:55:25.828 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 04:55:25 localhost nova_compute[286344]: 2025-12-15 09:55:25.835 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 04:55:26 localhost ceph-mon[298913]: Reconfiguring osd.1 (monmap changed)... 
Dec 15 04:55:26 localhost ceph-mon[298913]: Reconfiguring daemon osd.1 on np0005559464.localdomain Dec 15 04:55:26 localhost ceph-mon[298913]: from='mgr.26879 ' entity='mgr.np0005559462.fudvyx' Dec 15 04:55:26 localhost ceph-mon[298913]: from='mgr.26879 ' entity='mgr.np0005559462.fudvyx' Dec 15 04:55:26 localhost ceph-mon[298913]: from='mgr.26879 172.18.0.106:0/2534740212' entity='mgr.np0005559462.fudvyx' cmd={"prefix": "auth get", "entity": "osd.4"} : dispatch Dec 15 04:55:26 localhost ceph-mgr[292421]: log_channel(audit) log [DBG] : from='client.44470 -' entity='client.admin' cmd=[{"prefix": "orch daemon rm", "names": ["mon.np0005559464"], "force": true, "target": ["mon-mgr", ""]}]: dispatch Dec 15 04:55:26 localhost ceph-mgr[292421]: [cephadm INFO root] Remove daemons mon.np0005559464 Dec 15 04:55:26 localhost ceph-mgr[292421]: log_channel(cephadm) log [INF] : Remove daemons mon.np0005559464 Dec 15 04:55:26 localhost ceph-mon[298913]: mon.np0005559462@1(peon) e15 handle_command mon_command({"prefix": "quorum_status"} v 0) Dec 15 04:55:26 localhost ceph-mon[298913]: log_channel(audit) log [DBG] : from='mgr.26879 172.18.0.106:0/2534740212' entity='mgr.np0005559462.fudvyx' cmd={"prefix": "quorum_status"} : dispatch Dec 15 04:55:26 localhost ceph-mgr[292421]: [cephadm INFO cephadm.services.cephadmservice] Safe to remove mon.np0005559464: new quorum should be ['np0005559462', 'np0005559463'] (from ['np0005559462', 'np0005559463']) Dec 15 04:55:26 localhost ceph-mgr[292421]: log_channel(cephadm) log [INF] : Safe to remove mon.np0005559464: new quorum should be ['np0005559462', 'np0005559463'] (from ['np0005559462', 'np0005559463']) Dec 15 04:55:26 localhost ceph-mgr[292421]: [cephadm INFO cephadm.services.cephadmservice] Removing monitor np0005559464 from monmap... Dec 15 04:55:26 localhost ceph-mgr[292421]: log_channel(cephadm) log [INF] : Removing monitor np0005559464 from monmap... 
Dec 15 04:55:26 localhost ceph-mon[298913]: mon.np0005559462@1(peon) e15 handle_command mon_command({"prefix": "mon rm", "name": "np0005559464"} v 0) Dec 15 04:55:26 localhost ceph-mon[298913]: log_channel(audit) log [INF] : from='mgr.26879 172.18.0.106:0/2534740212' entity='mgr.np0005559462.fudvyx' cmd={"prefix": "mon rm", "name": "np0005559464"} : dispatch Dec 15 04:55:26 localhost ceph-mgr[292421]: [cephadm INFO cephadm.serve] Removing daemon mon.np0005559464 from np0005559464.localdomain -- ports [] Dec 15 04:55:26 localhost ceph-mgr[292421]: log_channel(cephadm) log [INF] : Removing daemon mon.np0005559464 from np0005559464.localdomain -- ports [] Dec 15 04:55:26 localhost ceph-mon[298913]: mon.np0005559462@1(peon) e16 my rank is now 0 (was 1) Dec 15 04:55:26 localhost ceph-mgr[292421]: client.0 ms_handle_reset on v2:172.18.0.103:3300/0 Dec 15 04:55:26 localhost ceph-mgr[292421]: client.0 ms_handle_reset on v2:172.18.0.103:3300/0 Dec 15 04:55:26 localhost ceph-mgr[292421]: client.27183 ms_handle_reset on v2:172.18.0.103:3300/0 Dec 15 04:55:26 localhost ceph-mgr[292421]: client.44351 ms_handle_reset on v2:172.18.0.103:3300/0 Dec 15 04:55:26 localhost ceph-mgr[292421]: client.44351 ms_handle_reset on v2:172.18.0.104:3300/0 Dec 15 04:55:26 localhost ceph-mon[298913]: mon.np0005559462@0(probing) e16 handle_command mon_command({"prefix": "mon metadata", "id": "np0005559462"} v 0) Dec 15 04:55:26 localhost ceph-mon[298913]: log_channel(audit) log [DBG] : from='mgr.26879 172.18.0.106:0/2534740212' entity='mgr.np0005559462.fudvyx' cmd={"prefix": "mon metadata", "id": "np0005559462"} : dispatch Dec 15 04:55:26 localhost ceph-mon[298913]: mon.np0005559462@0(probing) e16 handle_command mon_command({"prefix": "mon metadata", "id": "np0005559463"} v 0) Dec 15 04:55:26 localhost ceph-mon[298913]: log_channel(audit) log [DBG] : from='mgr.26879 172.18.0.106:0/2534740212' entity='mgr.np0005559462.fudvyx' cmd={"prefix": "mon metadata", "id": "np0005559463"} : dispatch Dec 15 
04:55:26 localhost ceph-mgr[292421]: --2- 172.18.0.106:0/3470317100 >> [v2:172.18.0.104:3300/0,v1:172.18.0.104:6789/0] conn(0x55975f52a800 0x55975eebb080 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).send_auth_request get_initial_auth_request returned -2 Dec 15 04:55:26 localhost ceph-mgr[292421]: --2- 172.18.0.106:0/2986203701 >> [v2:172.18.0.104:3300/0,v1:172.18.0.104:6789/0] conn(0x55975f6ff000 0x55975f903b80 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).send_auth_request get_initial_auth_request returned -2 Dec 15 04:55:26 localhost ceph-mgr[292421]: client.27183 ms_handle_reset on v2:172.18.0.104:3300/0 Dec 15 04:55:26 localhost nova_compute[286344]: 2025-12-15 09:55:26.650 286348 DEBUG oslo_service.periodic_task [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Running periodic task ComputeManager._sync_power_states run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Dec 15 04:55:26 localhost systemd[1]: Started /usr/bin/podman healthcheck run 730c50020a7d9e2da21d2462d40af20ac303e43f3491f692cbbd87f432ba3e09. Dec 15 04:55:26 localhost systemd[1]: Started /usr/bin/podman healthcheck run ac2eae30a2a7f971398af42ea67b3ed799d4b3341f7688ebcd73f7ca7f66c2a8. 
Dec 15 04:55:26 localhost nova_compute[286344]: 2025-12-15 09:55:26.668 286348 DEBUG nova.compute.manager [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Triggering sync for uuid 39ff1bd9-6f6b-44c8-bbec-a1fd9d196359 _sync_power_states /usr/lib/python3.9/site-packages/nova/compute/manager.py:10268#033[00m Dec 15 04:55:26 localhost nova_compute[286344]: 2025-12-15 09:55:26.669 286348 DEBUG oslo_concurrency.lockutils [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Acquiring lock "39ff1bd9-6f6b-44c8-bbec-a1fd9d196359" by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Dec 15 04:55:26 localhost nova_compute[286344]: 2025-12-15 09:55:26.670 286348 DEBUG oslo_concurrency.lockutils [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Lock "39ff1bd9-6f6b-44c8-bbec-a1fd9d196359" acquired by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Dec 15 04:55:26 localhost nova_compute[286344]: 2025-12-15 09:55:26.746 286348 DEBUG oslo_concurrency.lockutils [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Lock "39ff1bd9-6f6b-44c8-bbec-a1fd9d196359" "released" by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" :: held 0.076s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Dec 15 04:55:26 localhost podman[307965]: 2025-12-15 09:55:26.765273443 +0000 UTC m=+0.093472575 container health_status 730c50020a7d9e2da21d2462d40af20ac303e43f3491f692cbbd87f432ba3e09 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, vendor=Red Hat, Inc., release=1755695350, 
vcs-type=git, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, managed_by=edpm_ansible, maintainer=Red Hat, Inc., io.buildah.version=1.33.7, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., config_id=openstack_network_exporter, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'cb1e9478634a00c8fb5ad9cd7fcd38c7b2a1a85187dfaa618bc715c24c2b15fb'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.tags=minimal rhel9, build-date=2025-08-20T13:12:41, 
version=9.6, com.redhat.component=ubi9-minimal-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, name=ubi9-minimal, architecture=x86_64, container_name=openstack_network_exporter, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9.) Dec 15 04:55:26 localhost ceph-mon[298913]: mon.np0005559462@0(probing) e16 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005559464.localdomain.devices.0}] v 0) Dec 15 04:55:26 localhost podman[307965]: 2025-12-15 09:55:26.778288625 +0000 UTC m=+0.106487767 container exec_died 730c50020a7d9e2da21d2462d40af20ac303e43f3491f692cbbd87f432ba3e09 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, architecture=x86_64, managed_by=edpm_ansible, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., distribution-scope=public, io.openshift.expose-services=, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, container_name=openstack_network_exporter, io.openshift.tags=minimal rhel9, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., build-date=2025-08-20T13:12:41, vendor=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, maintainer=Red Hat, Inc., io.buildah.version=1.33.7, name=ubi9-minimal, config_id=openstack_network_exporter, version=9.6, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1755695350, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'cb1e9478634a00c8fb5ad9cd7fcd38c7b2a1a85187dfaa618bc715c24c2b15fb'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. 
This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly.) Dec 15 04:55:26 localhost systemd[1]: 730c50020a7d9e2da21d2462d40af20ac303e43f3491f692cbbd87f432ba3e09.service: Deactivated successfully. Dec 15 04:55:26 localhost ceph-mgr[292421]: log_channel(cluster) log [DBG] : pgmap v37: 177 pgs: 177 active+clean; 104 MiB data, 548 MiB used, 41 GiB / 42 GiB avail Dec 15 04:55:26 localhost systemd[1]: tmp-crun.R5yHNU.mount: Deactivated successfully. Dec 15 04:55:26 localhost podman[307966]: 2025-12-15 09:55:26.872780617 +0000 UTC m=+0.197637487 container health_status ac2eae30a2a7f971398af42ea67b3ed799d4b3341f7688ebcd73f7ca7f66c2a8 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, container_name=ovn_controller, org.label-schema.vendor=CentOS, tcib_managed=true, managed_by=edpm_ansible, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '07f3af15d79276418f54a8dde2fc251064527c3571430e14af77aaa2c01f1193'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3) Dec 15 04:55:26 localhost podman[307966]: 2025-12-15 09:55:26.987342537 +0000 UTC m=+0.312199367 container exec_died ac2eae30a2a7f971398af42ea67b3ed799d4b3341f7688ebcd73f7ca7f66c2a8 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, container_name=ovn_controller, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '07f3af15d79276418f54a8dde2fc251064527c3571430e14af77aaa2c01f1193'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0) Dec 15 04:55:26 localhost systemd[1]: ac2eae30a2a7f971398af42ea67b3ed799d4b3341f7688ebcd73f7ca7f66c2a8.service: Deactivated successfully. 
Dec 15 04:55:28 localhost ceph-mon[298913]: log_channel(cluster) log [INF] : mon.np0005559462 calling monitor election Dec 15 04:55:28 localhost ceph-mon[298913]: paxos.0).electionLogic(68) init, last seen epoch 68 Dec 15 04:55:28 localhost ceph-mon[298913]: mon.np0005559462@0(electing) e16 collect_metadata vda: no unique device id for vda: fallback method has no model nor serial Dec 15 04:55:28 localhost ceph-mon[298913]: log_channel(cluster) log [INF] : mon.np0005559462 is new leader, mons np0005559462,np0005559463 in quorum (ranks 0,1) Dec 15 04:55:28 localhost ceph-mon[298913]: log_channel(cluster) log [DBG] : monmap epoch 16 Dec 15 04:55:28 localhost ceph-mon[298913]: log_channel(cluster) log [DBG] : fsid bce17446-41b5-5408-a23e-0b011906b44a Dec 15 04:55:28 localhost ceph-mon[298913]: log_channel(cluster) log [DBG] : last_changed 2025-12-15T09:55:26.160367+0000 Dec 15 04:55:28 localhost ceph-mon[298913]: log_channel(cluster) log [DBG] : created 2025-12-15T07:42:11.600132+0000 Dec 15 04:55:28 localhost ceph-mon[298913]: log_channel(cluster) log [DBG] : min_mon_release 18 (reef) Dec 15 04:55:28 localhost ceph-mon[298913]: log_channel(cluster) log [DBG] : election_strategy: 1 Dec 15 04:55:28 localhost ceph-mon[298913]: log_channel(cluster) log [DBG] : 0: [v2:172.18.0.103:3300/0,v1:172.18.0.103:6789/0] mon.np0005559462 Dec 15 04:55:28 localhost ceph-mon[298913]: log_channel(cluster) log [DBG] : 1: [v2:172.18.0.104:3300/0,v1:172.18.0.104:6789/0] mon.np0005559463 Dec 15 04:55:28 localhost ceph-mon[298913]: mon.np0005559462@0(leader) e16 collect_metadata vda: no unique device id for vda: fallback method has no model nor serial Dec 15 04:55:28 localhost ceph-mon[298913]: log_channel(cluster) log [DBG] : fsmap cephfs:1 {0=mds.np0005559463.rdpgze=up:active} 2 up:standby Dec 15 04:55:28 localhost ceph-mon[298913]: log_channel(cluster) log [DBG] : osdmap e88: 6 total, 6 up, 6 in Dec 15 04:55:28 localhost ceph-mon[298913]: log_channel(cluster) log [DBG] : mgrmap e35: 
np0005559462.fudvyx(active, since 68s), standbys: np0005559463.daptkf, np0005559461.egwgzn, np0005559464.aomnqe Dec 15 04:55:28 localhost ceph-mon[298913]: log_channel(cluster) log [WRN] : Health detail: HEALTH_WARN 1 stray daemon(s) not managed by cephadm; 1 stray host(s) with 1 daemon(s) not managed by cephadm Dec 15 04:55:28 localhost ceph-mon[298913]: log_channel(cluster) log [WRN] : [WRN] CEPHADM_STRAY_DAEMON: 1 stray daemon(s) not managed by cephadm Dec 15 04:55:28 localhost ceph-mon[298913]: log_channel(cluster) log [WRN] : stray daemon mgr.np0005559460.oexkup on host np0005559460.localdomain not managed by cephadm Dec 15 04:55:28 localhost ceph-mon[298913]: log_channel(cluster) log [WRN] : [WRN] CEPHADM_STRAY_HOST: 1 stray host(s) with 1 daemon(s) not managed by cephadm Dec 15 04:55:28 localhost ceph-mon[298913]: log_channel(cluster) log [WRN] : stray host np0005559460.localdomain has 1 stray daemons: ['mgr.np0005559460.oexkup'] Dec 15 04:55:28 localhost ceph-mon[298913]: log_channel(audit) log [INF] : from='mgr.26879 172.18.0.106:0/2534740212' entity='mgr.np0005559462.fudvyx' Dec 15 04:55:28 localhost ceph-mon[298913]: mon.np0005559462@0(leader) e16 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005559464.localdomain}] v 0) Dec 15 04:55:28 localhost ceph-mon[298913]: log_channel(audit) log [INF] : from='mgr.26879 172.18.0.106:0/2534740212' entity='mgr.np0005559462.fudvyx' Dec 15 04:55:28 localhost ceph-mgr[292421]: [cephadm INFO cephadm.serve] Reconfiguring mds.mds.np0005559464.piyuji (monmap changed)... Dec 15 04:55:28 localhost ceph-mgr[292421]: log_channel(cephadm) log [INF] : Reconfiguring mds.mds.np0005559464.piyuji (monmap changed)... 
Dec 15 04:55:28 localhost ceph-mon[298913]: mon.np0005559462@0(leader) e16 handle_command mon_command({"prefix": "auth get-or-create", "entity": "mds.mds.np0005559464.piyuji", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]} v 0) Dec 15 04:55:28 localhost ceph-mon[298913]: log_channel(audit) log [INF] : from='mgr.26879 172.18.0.106:0/2534740212' entity='mgr.np0005559462.fudvyx' cmd={"prefix": "auth get-or-create", "entity": "mds.mds.np0005559464.piyuji", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]} : dispatch Dec 15 04:55:28 localhost ceph-mon[298913]: mon.np0005559462@0(leader) e16 handle_command mon_command({"prefix": "config generate-minimal-conf"} v 0) Dec 15 04:55:28 localhost ceph-mon[298913]: log_channel(audit) log [DBG] : from='mgr.26879 172.18.0.106:0/2534740212' entity='mgr.np0005559462.fudvyx' cmd={"prefix": "config generate-minimal-conf"} : dispatch Dec 15 04:55:28 localhost ceph-mgr[292421]: [cephadm INFO cephadm.serve] Reconfiguring daemon mds.mds.np0005559464.piyuji on np0005559464.localdomain Dec 15 04:55:28 localhost ceph-mgr[292421]: log_channel(cephadm) log [INF] : Reconfiguring daemon mds.mds.np0005559464.piyuji on np0005559464.localdomain Dec 15 04:55:28 localhost ceph-mon[298913]: Reconfiguring daemon osd.4 on np0005559464.localdomain Dec 15 04:55:28 localhost ceph-mon[298913]: Remove daemons mon.np0005559464 Dec 15 04:55:28 localhost ceph-mon[298913]: Safe to remove mon.np0005559464: new quorum should be ['np0005559462', 'np0005559463'] (from ['np0005559462', 'np0005559463']) Dec 15 04:55:28 localhost ceph-mon[298913]: Removing monitor np0005559464 from monmap... 
Dec 15 04:55:28 localhost ceph-mon[298913]: from='mgr.26879 172.18.0.106:0/2534740212' entity='mgr.np0005559462.fudvyx' cmd={"prefix": "mon rm", "name": "np0005559464"} : dispatch Dec 15 04:55:28 localhost ceph-mon[298913]: Removing daemon mon.np0005559464 from np0005559464.localdomain -- ports [] Dec 15 04:55:28 localhost ceph-mon[298913]: mon.np0005559463 calling monitor election Dec 15 04:55:28 localhost ceph-mon[298913]: mon.np0005559462 calling monitor election Dec 15 04:55:28 localhost ceph-mon[298913]: mon.np0005559462 is new leader, mons np0005559462,np0005559463 in quorum (ranks 0,1) Dec 15 04:55:28 localhost ceph-mon[298913]: Health detail: HEALTH_WARN 1 stray daemon(s) not managed by cephadm; 1 stray host(s) with 1 daemon(s) not managed by cephadm Dec 15 04:55:28 localhost ceph-mon[298913]: [WRN] CEPHADM_STRAY_DAEMON: 1 stray daemon(s) not managed by cephadm Dec 15 04:55:28 localhost ceph-mon[298913]: stray daemon mgr.np0005559460.oexkup on host np0005559460.localdomain not managed by cephadm Dec 15 04:55:28 localhost ceph-mon[298913]: [WRN] CEPHADM_STRAY_HOST: 1 stray host(s) with 1 daemon(s) not managed by cephadm Dec 15 04:55:28 localhost ceph-mon[298913]: stray host np0005559460.localdomain has 1 stray daemons: ['mgr.np0005559460.oexkup'] Dec 15 04:55:28 localhost ceph-mon[298913]: from='mgr.26879 172.18.0.106:0/2534740212' entity='mgr.np0005559462.fudvyx' Dec 15 04:55:28 localhost ceph-mon[298913]: from='mgr.26879 172.18.0.106:0/2534740212' entity='mgr.np0005559462.fudvyx' Dec 15 04:55:28 localhost ceph-mon[298913]: from='mgr.26879 172.18.0.106:0/2534740212' entity='mgr.np0005559462.fudvyx' cmd={"prefix": "auth get-or-create", "entity": "mds.mds.np0005559464.piyuji", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]} : dispatch Dec 15 04:55:28 localhost ceph-mgr[292421]: log_channel(cluster) log [DBG] : pgmap v38: 177 pgs: 177 active+clean; 104 MiB data, 548 MiB used, 41 GiB / 42 GiB avail Dec 15 04:55:29 localhost 
ceph-mon[298913]: mon.np0005559462@0(leader) e16 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005559464.localdomain.devices.0}] v 0) Dec 15 04:55:29 localhost ceph-mon[298913]: log_channel(audit) log [INF] : from='mgr.26879 172.18.0.106:0/2534740212' entity='mgr.np0005559462.fudvyx' Dec 15 04:55:29 localhost ceph-mon[298913]: mon.np0005559462@0(leader) e16 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005559464.localdomain}] v 0) Dec 15 04:55:29 localhost ceph-mon[298913]: log_channel(audit) log [INF] : from='mgr.26879 172.18.0.106:0/2534740212' entity='mgr.np0005559462.fudvyx' Dec 15 04:55:29 localhost ceph-mgr[292421]: [cephadm INFO cephadm.serve] Reconfiguring mgr.np0005559464.aomnqe (monmap changed)... Dec 15 04:55:29 localhost ceph-mgr[292421]: log_channel(cephadm) log [INF] : Reconfiguring mgr.np0005559464.aomnqe (monmap changed)... Dec 15 04:55:29 localhost ceph-mon[298913]: mon.np0005559462@0(leader) e16 handle_command mon_command({"prefix": "auth get-or-create", "entity": "mgr.np0005559464.aomnqe", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} v 0) Dec 15 04:55:29 localhost ceph-mon[298913]: log_channel(audit) log [INF] : from='mgr.26879 172.18.0.106:0/2534740212' entity='mgr.np0005559462.fudvyx' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005559464.aomnqe", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch Dec 15 04:55:29 localhost ceph-mon[298913]: mon.np0005559462@0(leader) e16 handle_command mon_command({"prefix": "mgr services"} v 0) Dec 15 04:55:29 localhost ceph-mon[298913]: log_channel(audit) log [DBG] : from='mgr.26879 172.18.0.106:0/2534740212' entity='mgr.np0005559462.fudvyx' cmd={"prefix": "mgr services"} : dispatch Dec 15 04:55:29 localhost ceph-mon[298913]: mon.np0005559462@0(leader) e16 handle_command mon_command({"prefix": "config generate-minimal-conf"} v 0) Dec 15 04:55:29 localhost ceph-mon[298913]: 
log_channel(audit) log [DBG] : from='mgr.26879 172.18.0.106:0/2534740212' entity='mgr.np0005559462.fudvyx' cmd={"prefix": "config generate-minimal-conf"} : dispatch Dec 15 04:55:29 localhost ceph-mgr[292421]: [cephadm INFO cephadm.serve] Reconfiguring daemon mgr.np0005559464.aomnqe on np0005559464.localdomain Dec 15 04:55:29 localhost ceph-mgr[292421]: log_channel(cephadm) log [INF] : Reconfiguring daemon mgr.np0005559464.aomnqe on np0005559464.localdomain Dec 15 04:55:29 localhost ceph-mon[298913]: mon.np0005559462@0(leader) e16 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005559464.localdomain.devices.0}] v 0) Dec 15 04:55:29 localhost ceph-mon[298913]: log_channel(audit) log [INF] : from='mgr.26879 172.18.0.106:0/2534740212' entity='mgr.np0005559462.fudvyx' Dec 15 04:55:29 localhost ceph-mon[298913]: mon.np0005559462@0(leader) e16 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005559464.localdomain}] v 0) Dec 15 04:55:29 localhost ceph-mon[298913]: log_channel(audit) log [INF] : from='mgr.26879 172.18.0.106:0/2534740212' entity='mgr.np0005559462.fudvyx' Dec 15 04:55:30 localhost ceph-mon[298913]: Reconfiguring mds.mds.np0005559464.piyuji (monmap changed)... Dec 15 04:55:30 localhost ceph-mon[298913]: Reconfiguring daemon mds.mds.np0005559464.piyuji on np0005559464.localdomain Dec 15 04:55:30 localhost ceph-mon[298913]: from='mgr.26879 172.18.0.106:0/2534740212' entity='mgr.np0005559462.fudvyx' Dec 15 04:55:30 localhost ceph-mon[298913]: from='mgr.26879 172.18.0.106:0/2534740212' entity='mgr.np0005559462.fudvyx' Dec 15 04:55:30 localhost ceph-mon[298913]: Reconfiguring mgr.np0005559464.aomnqe (monmap changed)... 
Dec 15 04:55:30 localhost ceph-mon[298913]: from='mgr.26879 172.18.0.106:0/2534740212' entity='mgr.np0005559462.fudvyx' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005559464.aomnqe", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch Dec 15 04:55:30 localhost ceph-mon[298913]: Reconfiguring daemon mgr.np0005559464.aomnqe on np0005559464.localdomain Dec 15 04:55:30 localhost ceph-mon[298913]: from='mgr.26879 172.18.0.106:0/2534740212' entity='mgr.np0005559462.fudvyx' Dec 15 04:55:30 localhost ceph-mon[298913]: from='mgr.26879 172.18.0.106:0/2534740212' entity='mgr.np0005559462.fudvyx' Dec 15 04:55:30 localhost ceph-mon[298913]: mon.np0005559462@0(leader).osd e88 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Dec 15 04:55:30 localhost ceph-mgr[292421]: log_channel(cluster) log [DBG] : pgmap v39: 177 pgs: 177 active+clean; 104 MiB data, 548 MiB used, 41 GiB / 42 GiB avail Dec 15 04:55:30 localhost nova_compute[286344]: 2025-12-15 09:55:30.830 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 04:55:30 localhost nova_compute[286344]: 2025-12-15 09:55:30.836 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 04:55:31 localhost ceph-mon[298913]: mon.np0005559462@0(leader) e16 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005559464.localdomain.devices.0}] v 0) Dec 15 04:55:31 localhost ceph-mon[298913]: log_channel(audit) log [INF] : from='mgr.26879 172.18.0.106:0/2534740212' entity='mgr.np0005559462.fudvyx' Dec 15 04:55:31 localhost ceph-mon[298913]: mon.np0005559462@0(leader) e16 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005559464.localdomain}] v 0) Dec 15 04:55:31 localhost ceph-mon[298913]: log_channel(audit) log [INF] : 
from='mgr.26879 172.18.0.106:0/2534740212' entity='mgr.np0005559462.fudvyx' Dec 15 04:55:31 localhost podman[243449]: time="2025-12-15T09:55:31Z" level=info msg="List containers: received `last` parameter - overwriting `limit`" Dec 15 04:55:31 localhost podman[243449]: @ - - [15/Dec/2025:09:55:31 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 154814 "" "Go-http-client/1.1" Dec 15 04:55:31 localhost podman[243449]: @ - - [15/Dec/2025:09:55:31 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 18731 "" "Go-http-client/1.1" Dec 15 04:55:32 localhost ceph-mon[298913]: from='mgr.26879 172.18.0.106:0/2534740212' entity='mgr.np0005559462.fudvyx' Dec 15 04:55:32 localhost ceph-mon[298913]: from='mgr.26879 172.18.0.106:0/2534740212' entity='mgr.np0005559462.fudvyx' Dec 15 04:55:32 localhost ceph-mon[298913]: mon.np0005559462@0(leader) e16 handle_command mon_command({"prefix": "config generate-minimal-conf"} v 0) Dec 15 04:55:32 localhost ceph-mon[298913]: log_channel(audit) log [DBG] : from='mgr.26879 172.18.0.106:0/2534740212' entity='mgr.np0005559462.fudvyx' cmd={"prefix": "config generate-minimal-conf"} : dispatch Dec 15 04:55:32 localhost ceph-mon[298913]: mon.np0005559462@0(leader) e16 handle_command mon_command({"prefix": "auth get", "entity": "client.admin"} v 0) Dec 15 04:55:32 localhost ceph-mon[298913]: log_channel(audit) log [INF] : from='mgr.26879 172.18.0.106:0/2534740212' entity='mgr.np0005559462.fudvyx' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch Dec 15 04:55:32 localhost ceph-mgr[292421]: [cephadm INFO cephadm.serve] Updating np0005559462.localdomain:/etc/ceph/ceph.conf Dec 15 04:55:32 localhost ceph-mgr[292421]: log_channel(cephadm) log [INF] : Updating np0005559462.localdomain:/etc/ceph/ceph.conf Dec 15 04:55:32 localhost ceph-mgr[292421]: [cephadm INFO cephadm.serve] Updating 
np0005559463.localdomain:/etc/ceph/ceph.conf Dec 15 04:55:32 localhost ceph-mgr[292421]: [cephadm INFO cephadm.serve] Updating np0005559464.localdomain:/etc/ceph/ceph.conf Dec 15 04:55:32 localhost ceph-mgr[292421]: log_channel(cephadm) log [INF] : Updating np0005559463.localdomain:/etc/ceph/ceph.conf Dec 15 04:55:32 localhost ceph-mgr[292421]: log_channel(cephadm) log [INF] : Updating np0005559464.localdomain:/etc/ceph/ceph.conf Dec 15 04:55:32 localhost ceph-mgr[292421]: log_channel(cluster) log [DBG] : pgmap v40: 177 pgs: 177 active+clean; 104 MiB data, 548 MiB used, 41 GiB / 42 GiB avail Dec 15 04:55:32 localhost ceph-mgr[292421]: [cephadm INFO cephadm.serve] Updating np0005559463.localdomain:/var/lib/ceph/bce17446-41b5-5408-a23e-0b011906b44a/config/ceph.conf Dec 15 04:55:32 localhost ceph-mgr[292421]: log_channel(cephadm) log [INF] : Updating np0005559463.localdomain:/var/lib/ceph/bce17446-41b5-5408-a23e-0b011906b44a/config/ceph.conf Dec 15 04:55:33 localhost ceph-mgr[292421]: [cephadm INFO cephadm.serve] Updating np0005559464.localdomain:/var/lib/ceph/bce17446-41b5-5408-a23e-0b011906b44a/config/ceph.conf Dec 15 04:55:33 localhost ceph-mgr[292421]: log_channel(cephadm) log [INF] : Updating np0005559464.localdomain:/var/lib/ceph/bce17446-41b5-5408-a23e-0b011906b44a/config/ceph.conf Dec 15 04:55:33 localhost ceph-mgr[292421]: [cephadm INFO cephadm.serve] Updating np0005559462.localdomain:/var/lib/ceph/bce17446-41b5-5408-a23e-0b011906b44a/config/ceph.conf Dec 15 04:55:33 localhost ceph-mgr[292421]: log_channel(cephadm) log [INF] : Updating np0005559462.localdomain:/var/lib/ceph/bce17446-41b5-5408-a23e-0b011906b44a/config/ceph.conf Dec 15 04:55:33 localhost ceph-mon[298913]: from='mgr.26879 172.18.0.106:0/2534740212' entity='mgr.np0005559462.fudvyx' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch Dec 15 04:55:33 localhost ceph-mon[298913]: Updating np0005559462.localdomain:/etc/ceph/ceph.conf Dec 15 04:55:33 localhost ceph-mon[298913]: Updating 
np0005559463.localdomain:/etc/ceph/ceph.conf Dec 15 04:55:33 localhost ceph-mon[298913]: Updating np0005559464.localdomain:/etc/ceph/ceph.conf Dec 15 04:55:33 localhost ceph-mon[298913]: Updating np0005559463.localdomain:/var/lib/ceph/bce17446-41b5-5408-a23e-0b011906b44a/config/ceph.conf Dec 15 04:55:33 localhost ceph-mon[298913]: Updating np0005559464.localdomain:/var/lib/ceph/bce17446-41b5-5408-a23e-0b011906b44a/config/ceph.conf Dec 15 04:55:33 localhost ceph-mon[298913]: Updating np0005559462.localdomain:/var/lib/ceph/bce17446-41b5-5408-a23e-0b011906b44a/config/ceph.conf Dec 15 04:55:33 localhost ceph-mon[298913]: mon.np0005559462@0(leader) e16 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005559463.localdomain.devices.0}] v 0) Dec 15 04:55:33 localhost ceph-mon[298913]: mon.np0005559462@0(leader) e16 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005559464.localdomain.devices.0}] v 0) Dec 15 04:55:33 localhost ceph-mon[298913]: log_channel(audit) log [INF] : from='mgr.26879 172.18.0.106:0/2534740212' entity='mgr.np0005559462.fudvyx' Dec 15 04:55:33 localhost ceph-mon[298913]: mon.np0005559462@0(leader) e16 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005559463.localdomain}] v 0) Dec 15 04:55:33 localhost ceph-mon[298913]: log_channel(audit) log [INF] : from='mgr.26879 172.18.0.106:0/2534740212' entity='mgr.np0005559462.fudvyx' Dec 15 04:55:33 localhost ceph-mon[298913]: mon.np0005559462@0(leader) e16 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005559464.localdomain}] v 0) Dec 15 04:55:33 localhost ceph-mon[298913]: log_channel(audit) log [INF] : from='mgr.26879 172.18.0.106:0/2534740212' entity='mgr.np0005559462.fudvyx' Dec 15 04:55:33 localhost ceph-mon[298913]: log_channel(audit) log [INF] : from='mgr.26879 172.18.0.106:0/2534740212' entity='mgr.np0005559462.fudvyx' Dec 15 04:55:33 localhost ceph-mon[298913]: mon.np0005559462@0(leader) e16 
handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005559462.localdomain.devices.0}] v 0) Dec 15 04:55:33 localhost ceph-mon[298913]: log_channel(audit) log [INF] : from='mgr.26879 172.18.0.106:0/2534740212' entity='mgr.np0005559462.fudvyx' Dec 15 04:55:33 localhost ceph-mon[298913]: mon.np0005559462@0(leader) e16 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005559462.localdomain}] v 0) Dec 15 04:55:33 localhost ceph-mon[298913]: log_channel(audit) log [INF] : from='mgr.26879 172.18.0.106:0/2534740212' entity='mgr.np0005559462.fudvyx' Dec 15 04:55:33 localhost ceph-mon[298913]: mon.np0005559462@0(leader) e16 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/osd_remove_queue}] v 0) Dec 15 04:55:33 localhost ceph-mon[298913]: log_channel(audit) log [INF] : from='mgr.26879 172.18.0.106:0/2534740212' entity='mgr.np0005559462.fudvyx' Dec 15 04:55:33 localhost ceph-mgr[292421]: [progress INFO root] update: starting ev 4fba00e8-3113-41bf-a6d6-d35f706fde26 (Updating node-proxy deployment (+3 -> 3)) Dec 15 04:55:33 localhost ceph-mgr[292421]: [progress INFO root] complete: finished ev 4fba00e8-3113-41bf-a6d6-d35f706fde26 (Updating node-proxy deployment (+3 -> 3)) Dec 15 04:55:33 localhost ceph-mgr[292421]: [progress INFO root] Completed event 4fba00e8-3113-41bf-a6d6-d35f706fde26 (Updating node-proxy deployment (+3 -> 3)) in 0 seconds Dec 15 04:55:33 localhost ceph-mon[298913]: mon.np0005559462@0(leader) e16 handle_command mon_command({"prefix": "osd tree", "states": ["destroyed"], "format": "json"} v 0) Dec 15 04:55:33 localhost ceph-mon[298913]: log_channel(audit) log [DBG] : from='mgr.26879 172.18.0.106:0/2534740212' entity='mgr.np0005559462.fudvyx' cmd={"prefix": "osd tree", "states": ["destroyed"], "format": "json"} : dispatch Dec 15 04:55:33 localhost ceph-mgr[292421]: [progress INFO root] Writing back 50 completed events Dec 15 04:55:33 localhost ceph-mon[298913]: mon.np0005559462@0(leader) e16 
handle_command mon_command([{prefix=config-key set, key=mgr/progress/completed}] v 0) Dec 15 04:55:33 localhost ceph-mon[298913]: log_channel(audit) log [INF] : from='mgr.26879 172.18.0.106:0/2534740212' entity='mgr.np0005559462.fudvyx' Dec 15 04:55:33 localhost ceph-mgr[292421]: [cephadm INFO cephadm.serve] Reconfiguring crash.np0005559462 (monmap changed)... Dec 15 04:55:33 localhost ceph-mgr[292421]: log_channel(cephadm) log [INF] : Reconfiguring crash.np0005559462 (monmap changed)... Dec 15 04:55:33 localhost ceph-mon[298913]: mon.np0005559462@0(leader) e16 handle_command mon_command({"prefix": "auth get-or-create", "entity": "client.crash.np0005559462.localdomain", "caps": ["mon", "profile crash", "mgr", "profile crash"]} v 0) Dec 15 04:55:33 localhost ceph-mon[298913]: log_channel(audit) log [INF] : from='mgr.26879 172.18.0.106:0/2534740212' entity='mgr.np0005559462.fudvyx' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005559462.localdomain", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch Dec 15 04:55:33 localhost ceph-mon[298913]: mon.np0005559462@0(leader) e16 handle_command mon_command({"prefix": "config generate-minimal-conf"} v 0) Dec 15 04:55:33 localhost ceph-mon[298913]: log_channel(audit) log [DBG] : from='mgr.26879 172.18.0.106:0/2534740212' entity='mgr.np0005559462.fudvyx' cmd={"prefix": "config generate-minimal-conf"} : dispatch Dec 15 04:55:33 localhost ceph-mgr[292421]: [cephadm INFO cephadm.serve] Reconfiguring daemon crash.np0005559462 on np0005559462.localdomain Dec 15 04:55:33 localhost ceph-mgr[292421]: log_channel(cephadm) log [INF] : Reconfiguring daemon crash.np0005559462 on np0005559462.localdomain Dec 15 04:55:34 localhost podman[308471]: Dec 15 04:55:34 localhost podman[308471]: 2025-12-15 09:55:34.542115504 +0000 UTC m=+0.076456623 container create ee1b17d7d63e9278b413be2abe95159c0253389110c7014d69dfdccf8d4af774 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=determined_torvalds, 
CEPH_POINT_RELEASE=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, io.k8s.description=Red Hat Ceph Storage 7, RELEASE=main, GIT_REPO=https://github.com/ceph/ceph-container.git, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., description=Red Hat Ceph Storage 7, distribution-scope=public, vcs-type=git, ceph=True, GIT_CLEAN=True, url=https://catalog.redhat.com/en/search?searchType=containers, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, name=rhceph, release=1763362218, build-date=2025-11-26T19:44:28Z, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, io.openshift.tags=rhceph ceph, vendor=Red Hat, Inc., io.buildah.version=1.41.4, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, com.redhat.component=rhceph-container, version=7, architecture=x86_64, GIT_BRANCH=main, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, maintainer=Guillaume Abrioux ) Dec 15 04:55:34 localhost systemd[1]: Started libpod-conmon-ee1b17d7d63e9278b413be2abe95159c0253389110c7014d69dfdccf8d4af774.scope. Dec 15 04:55:34 localhost systemd[1]: Started libcrun container. 
Dec 15 04:55:34 localhost podman[308471]: 2025-12-15 09:55:34.508612294 +0000 UTC m=+0.042953423 image pull registry.redhat.io/rhceph/rhceph-7-rhel9:latest Dec 15 04:55:34 localhost podman[308471]: 2025-12-15 09:55:34.623476942 +0000 UTC m=+0.157818071 container init ee1b17d7d63e9278b413be2abe95159c0253389110c7014d69dfdccf8d4af774 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=determined_torvalds, io.openshift.tags=rhceph ceph, ceph=True, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, architecture=x86_64, io.buildah.version=1.41.4, description=Red Hat Ceph Storage 7, name=rhceph, version=7, distribution-scope=public, RELEASE=main, GIT_REPO=https://github.com/ceph/ceph-container.git, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_CLEAN=True, release=1763362218, io.k8s.description=Red Hat Ceph Storage 7, com.redhat.component=rhceph-container, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_BRANCH=main, url=https://catalog.redhat.com/en/search?searchType=containers, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, CEPH_POINT_RELEASE=, io.openshift.expose-services=, vcs-type=git, build-date=2025-11-26T19:44:28Z, vendor=Red Hat, Inc., maintainer=Guillaume Abrioux ) Dec 15 04:55:34 localhost podman[308471]: 2025-12-15 09:55:34.634105847 +0000 UTC m=+0.168446966 container start ee1b17d7d63e9278b413be2abe95159c0253389110c7014d69dfdccf8d4af774 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=determined_torvalds, io.openshift.expose-services=, io.buildah.version=1.41.4, com.redhat.component=rhceph-container, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, ceph=True, GIT_CLEAN=True, maintainer=Guillaume Abrioux , vendor=Red Hat, Inc., 
CEPH_POINT_RELEASE=, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_BRANCH=main, RELEASE=main, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, description=Red Hat Ceph Storage 7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, architecture=x86_64, url=https://catalog.redhat.com/en/search?searchType=containers, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, GIT_REPO=https://github.com/ceph/ceph-container.git, version=7, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat Ceph Storage 7, release=1763362218, io.openshift.tags=rhceph ceph, name=rhceph, build-date=2025-11-26T19:44:28Z, vcs-type=git) Dec 15 04:55:34 localhost podman[308471]: 2025-12-15 09:55:34.634361434 +0000 UTC m=+0.168702613 container attach ee1b17d7d63e9278b413be2abe95159c0253389110c7014d69dfdccf8d4af774 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=determined_torvalds, vcs-type=git, io.openshift.expose-services=, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, description=Red Hat Ceph Storage 7, name=rhceph, RELEASE=main, url=https://catalog.redhat.com/en/search?searchType=containers, release=1763362218, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat Ceph Storage 7, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, architecture=x86_64, vendor=Red Hat, Inc., build-date=2025-11-26T19:44:28Z, maintainer=Guillaume Abrioux , summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., version=7, distribution-scope=public, CEPH_POINT_RELEASE=, 
io.openshift.tags=rhceph ceph, com.redhat.component=rhceph-container, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_CLEAN=True, ceph=True, GIT_BRANCH=main, GIT_REPO=https://github.com/ceph/ceph-container.git) Dec 15 04:55:34 localhost determined_torvalds[308486]: 167 167 Dec 15 04:55:34 localhost systemd[1]: libpod-ee1b17d7d63e9278b413be2abe95159c0253389110c7014d69dfdccf8d4af774.scope: Deactivated successfully. Dec 15 04:55:34 localhost podman[308471]: 2025-12-15 09:55:34.637552713 +0000 UTC m=+0.171893842 container died ee1b17d7d63e9278b413be2abe95159c0253389110c7014d69dfdccf8d4af774 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=determined_torvalds, CEPH_POINT_RELEASE=, version=7, com.redhat.component=rhceph-container, description=Red Hat Ceph Storage 7, GIT_CLEAN=True, release=1763362218, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.k8s.description=Red Hat Ceph Storage 7, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, architecture=x86_64, GIT_BRANCH=main, RELEASE=main, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, name=rhceph, build-date=2025-11-26T19:44:28Z, distribution-scope=public, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, ceph=True, io.openshift.expose-services=, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, maintainer=Guillaume Abrioux , GIT_REPO=https://github.com/ceph/ceph-container.git, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, io.openshift.tags=rhceph ceph, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4) Dec 15 04:55:34 localhost ceph-mon[298913]: from='mgr.26879 172.18.0.106:0/2534740212' entity='mgr.np0005559462.fudvyx' Dec 15 04:55:34 localhost ceph-mon[298913]: from='mgr.26879 172.18.0.106:0/2534740212' 
entity='mgr.np0005559462.fudvyx' Dec 15 04:55:34 localhost ceph-mon[298913]: from='mgr.26879 172.18.0.106:0/2534740212' entity='mgr.np0005559462.fudvyx' Dec 15 04:55:34 localhost ceph-mon[298913]: from='mgr.26879 172.18.0.106:0/2534740212' entity='mgr.np0005559462.fudvyx' Dec 15 04:55:34 localhost ceph-mon[298913]: from='mgr.26879 172.18.0.106:0/2534740212' entity='mgr.np0005559462.fudvyx' Dec 15 04:55:34 localhost ceph-mon[298913]: from='mgr.26879 172.18.0.106:0/2534740212' entity='mgr.np0005559462.fudvyx' Dec 15 04:55:34 localhost ceph-mon[298913]: from='mgr.26879 172.18.0.106:0/2534740212' entity='mgr.np0005559462.fudvyx' Dec 15 04:55:34 localhost ceph-mon[298913]: from='mgr.26879 172.18.0.106:0/2534740212' entity='mgr.np0005559462.fudvyx' Dec 15 04:55:34 localhost ceph-mon[298913]: Reconfiguring crash.np0005559462 (monmap changed)... Dec 15 04:55:34 localhost ceph-mon[298913]: from='mgr.26879 172.18.0.106:0/2534740212' entity='mgr.np0005559462.fudvyx' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005559462.localdomain", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch Dec 15 04:55:34 localhost ceph-mon[298913]: Reconfiguring daemon crash.np0005559462 on np0005559462.localdomain Dec 15 04:55:34 localhost podman[308491]: 2025-12-15 09:55:34.741234011 +0000 UTC m=+0.090459682 container remove ee1b17d7d63e9278b413be2abe95159c0253389110c7014d69dfdccf8d4af774 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=determined_torvalds, CEPH_POINT_RELEASE=, io.buildah.version=1.41.4, build-date=2025-11-26T19:44:28Z, version=7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.openshift.expose-services=, io.openshift.tags=rhceph ceph, vendor=Red Hat, Inc., vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, architecture=x86_64, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, url=https://catalog.redhat.com/en/search?searchType=containers, release=1763362218, 
GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, name=rhceph, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat Ceph Storage 7, GIT_CLEAN=True, GIT_REPO=https://github.com/ceph/ceph-container.git, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.k8s.description=Red Hat Ceph Storage 7, com.redhat.component=rhceph-container, maintainer=Guillaume Abrioux , GIT_BRANCH=main, ceph=True, distribution-scope=public, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., RELEASE=main) Dec 15 04:55:34 localhost systemd[1]: libpod-conmon-ee1b17d7d63e9278b413be2abe95159c0253389110c7014d69dfdccf8d4af774.scope: Deactivated successfully. Dec 15 04:55:34 localhost ceph-mgr[292421]: log_channel(cluster) log [DBG] : pgmap v41: 177 pgs: 177 active+clean; 104 MiB data, 548 MiB used, 41 GiB / 42 GiB avail Dec 15 04:55:34 localhost ceph-mon[298913]: mon.np0005559462@0(leader) e16 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005559462.localdomain.devices.0}] v 0) Dec 15 04:55:34 localhost ceph-mon[298913]: log_channel(audit) log [INF] : from='mgr.26879 172.18.0.106:0/2534740212' entity='mgr.np0005559462.fudvyx' Dec 15 04:55:34 localhost ceph-mon[298913]: mon.np0005559462@0(leader) e16 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005559462.localdomain}] v 0) Dec 15 04:55:34 localhost ceph-mon[298913]: log_channel(audit) log [INF] : from='mgr.26879 172.18.0.106:0/2534740212' entity='mgr.np0005559462.fudvyx' Dec 15 04:55:34 localhost ceph-mgr[292421]: [cephadm INFO cephadm.serve] Reconfiguring osd.0 (monmap changed)... Dec 15 04:55:34 localhost ceph-mgr[292421]: log_channel(cephadm) log [INF] : Reconfiguring osd.0 (monmap changed)... 
Dec 15 04:55:34 localhost ceph-mon[298913]: mon.np0005559462@0(leader) e16 handle_command mon_command({"prefix": "auth get", "entity": "osd.0"} v 0) Dec 15 04:55:34 localhost ceph-mon[298913]: log_channel(audit) log [INF] : from='mgr.26879 172.18.0.106:0/2534740212' entity='mgr.np0005559462.fudvyx' cmd={"prefix": "auth get", "entity": "osd.0"} : dispatch Dec 15 04:55:34 localhost ceph-mon[298913]: mon.np0005559462@0(leader) e16 handle_command mon_command({"prefix": "config generate-minimal-conf"} v 0) Dec 15 04:55:34 localhost ceph-mon[298913]: log_channel(audit) log [DBG] : from='mgr.26879 172.18.0.106:0/2534740212' entity='mgr.np0005559462.fudvyx' cmd={"prefix": "config generate-minimal-conf"} : dispatch Dec 15 04:55:34 localhost openstack_network_exporter[246484]: ERROR 09:55:34 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server Dec 15 04:55:34 localhost openstack_network_exporter[246484]: ERROR 09:55:34 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd Dec 15 04:55:34 localhost openstack_network_exporter[246484]: ERROR 09:55:34 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd Dec 15 04:55:34 localhost openstack_network_exporter[246484]: ERROR 09:55:34 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath Dec 15 04:55:34 localhost openstack_network_exporter[246484]: Dec 15 04:55:34 localhost openstack_network_exporter[246484]: ERROR 09:55:34 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath Dec 15 04:55:34 localhost openstack_network_exporter[246484]: Dec 15 04:55:34 localhost ceph-mgr[292421]: [cephadm INFO cephadm.serve] Reconfiguring daemon osd.0 on np0005559462.localdomain Dec 15 04:55:34 localhost ceph-mgr[292421]: log_channel(cephadm) log [INF] : Reconfiguring daemon osd.0 on np0005559462.localdomain Dec 15 04:55:34 localhost systemd[1]: Started 
/usr/bin/podman healthcheck run 4d46c406a3855fd1c322e5d86c048188ea5865daa07ba23aaebacc7879668923. Dec 15 04:55:35 localhost podman[308524]: 2025-12-15 09:55:35.072556747 +0000 UTC m=+0.089175316 container health_status 4d46c406a3855fd1c322e5d86c048188ea5865daa07ba23aaebacc7879668923 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, config_id=ovn_metadata_agent, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '07f3af15d79276418f54a8dde2fc251064527c3571430e14af77aaa2c01f1193-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}) Dec 15 04:55:35 localhost podman[308524]: 2025-12-15 09:55:35.105176123 +0000 UTC m=+0.121794742 container exec_died 4d46c406a3855fd1c322e5d86c048188ea5865daa07ba23aaebacc7879668923 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, config_id=ovn_metadata_agent, managed_by=edpm_ansible, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '07f3af15d79276418f54a8dde2fc251064527c3571430e14af77aaa2c01f1193-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, 
tcib_build_tag=c3923531bcda0b0811b2d5053f189beb) Dec 15 04:55:35 localhost ceph-mon[298913]: mon.np0005559462@0(leader).osd e88 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Dec 15 04:55:35 localhost systemd[1]: 4d46c406a3855fd1c322e5d86c048188ea5865daa07ba23aaebacc7879668923.service: Deactivated successfully. Dec 15 04:55:35 localhost podman[308577]: Dec 15 04:55:35 localhost podman[308577]: 2025-12-15 09:55:35.470023719 +0000 UTC m=+0.079652691 container create 057898e6efae9d51f950d70b7e1d7db1dd087993257777f85550429e510ea8c1 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=youthful_wing, RELEASE=main, CEPH_POINT_RELEASE=, distribution-scope=public, description=Red Hat Ceph Storage 7, GIT_BRANCH=main, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_CLEAN=True, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.openshift.expose-services=, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.openshift.tags=rhceph ceph, io.buildah.version=1.41.4, io.k8s.description=Red Hat Ceph Storage 7, GIT_REPO=https://github.com/ceph/ceph-container.git, version=7, build-date=2025-11-26T19:44:28Z, name=rhceph, maintainer=Guillaume Abrioux , org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:enterprise_linux:9::appstream, architecture=x86_64, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=rhceph-container, ceph=True, release=1763362218, vcs-type=git, url=https://catalog.redhat.com/en/search?searchType=containers) Dec 15 04:55:35 localhost systemd[1]: Started libpod-conmon-057898e6efae9d51f950d70b7e1d7db1dd087993257777f85550429e510ea8c1.scope. Dec 15 04:55:35 localhost systemd[1]: Started libcrun container. 
Dec 15 04:55:35 localhost podman[308577]: 2025-12-15 09:55:35.534316044 +0000 UTC m=+0.143945016 container init 057898e6efae9d51f950d70b7e1d7db1dd087993257777f85550429e510ea8c1 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=youthful_wing, io.buildah.version=1.41.4, vendor=Red Hat, Inc., org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, io.openshift.expose-services=, com.redhat.component=rhceph-container, version=7, maintainer=Guillaume Abrioux , cpe=cpe:/a:redhat:enterprise_linux:9::appstream, GIT_REPO=https://github.com/ceph/ceph-container.git, build-date=2025-11-26T19:44:28Z, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, ceph=True, architecture=x86_64, RELEASE=main, description=Red Hat Ceph Storage 7, io.openshift.tags=rhceph ceph, GIT_BRANCH=main, GIT_CLEAN=True, distribution-scope=public, CEPH_POINT_RELEASE=, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, name=rhceph, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat Ceph Storage 7, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, release=1763362218, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0) Dec 15 04:55:35 localhost podman[308577]: 2025-12-15 09:55:35.440803089 +0000 UTC m=+0.050432081 image pull registry.redhat.io/rhceph/rhceph-7-rhel9:latest Dec 15 04:55:35 localhost podman[308577]: 2025-12-15 09:55:35.543617483 +0000 UTC m=+0.153246455 container start 057898e6efae9d51f950d70b7e1d7db1dd087993257777f85550429e510ea8c1 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=youthful_wing, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-11-26T19:44:28Z, com.redhat.component=rhceph-container, name=rhceph, io.buildah.version=1.41.4, version=7, maintainer=Guillaume Abrioux , 
architecture=x86_64, RELEASE=main, vendor=Red Hat, Inc., CEPH_POINT_RELEASE=, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, description=Red Hat Ceph Storage 7, io.openshift.tags=rhceph ceph, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, release=1763362218, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, vcs-type=git, io.openshift.expose-services=, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_CLEAN=True, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, ceph=True, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, url=https://catalog.redhat.com/en/search?searchType=containers, distribution-scope=public, io.k8s.description=Red Hat Ceph Storage 7, GIT_REPO=https://github.com/ceph/ceph-container.git, GIT_BRANCH=main) Dec 15 04:55:35 localhost podman[308577]: 2025-12-15 09:55:35.54387062 +0000 UTC m=+0.153499582 container attach 057898e6efae9d51f950d70b7e1d7db1dd087993257777f85550429e510ea8c1 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=youthful_wing, io.openshift.tags=rhceph ceph, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.openshift.expose-services=, version=7, description=Red Hat Ceph Storage 7, maintainer=Guillaume Abrioux , build-date=2025-11-26T19:44:28Z, io.buildah.version=1.41.4, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, distribution-scope=public, ceph=True, architecture=x86_64, vendor=Red Hat, Inc., io.k8s.description=Red Hat Ceph Storage 7, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, GIT_REPO=https://github.com/ceph/ceph-container.git, vcs-type=git, url=https://catalog.redhat.com/en/search?searchType=containers, release=1763362218, GIT_CLEAN=True, CEPH_POINT_RELEASE=, name=rhceph, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., com.redhat.component=rhceph-container, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, 
RELEASE=main, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, GIT_BRANCH=main, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI) Dec 15 04:55:35 localhost youthful_wing[308592]: 167 167 Dec 15 04:55:35 localhost systemd[1]: var-lib-containers-storage-overlay-0ee9524d906f9dcc5236fa4b079b33857ea7d1cf427c53ca4d45ce8f9f57c3db-merged.mount: Deactivated successfully. Dec 15 04:55:35 localhost systemd[1]: libpod-057898e6efae9d51f950d70b7e1d7db1dd087993257777f85550429e510ea8c1.scope: Deactivated successfully. Dec 15 04:55:35 localhost podman[308577]: 2025-12-15 09:55:35.553144377 +0000 UTC m=+0.162773369 container died 057898e6efae9d51f950d70b7e1d7db1dd087993257777f85550429e510ea8c1 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=youthful_wing, RELEASE=main, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, GIT_REPO=https://github.com/ceph/ceph-container.git, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_BRANCH=main, version=7, io.k8s.description=Red Hat Ceph Storage 7, io.openshift.expose-services=, description=Red Hat Ceph Storage 7, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, release=1763362218, maintainer=Guillaume Abrioux , com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, com.redhat.component=rhceph-container, distribution-scope=public, CEPH_POINT_RELEASE=, architecture=x86_64, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, ceph=True, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.openshift.tags=rhceph ceph, vendor=Red Hat, Inc., GIT_CLEAN=True, io.buildah.version=1.41.4, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, build-date=2025-11-26T19:44:28Z, name=rhceph) Dec 15 04:55:35 localhost systemd[1]: 
var-lib-containers-storage-overlay-2a01b0f125ab6af05348de83029b1725e9de66219a4ffa0264cb902f8af89873-merged.mount: Deactivated successfully. Dec 15 04:55:35 localhost podman[308598]: 2025-12-15 09:55:35.65160713 +0000 UTC m=+0.090273077 container remove 057898e6efae9d51f950d70b7e1d7db1dd087993257777f85550429e510ea8c1 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=youthful_wing, maintainer=Guillaume Abrioux , release=1763362218, build-date=2025-11-26T19:44:28Z, com.redhat.component=rhceph-container, architecture=x86_64, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, vcs-type=git, CEPH_POINT_RELEASE=, description=Red Hat Ceph Storage 7, GIT_REPO=https://github.com/ceph/ceph-container.git, vendor=Red Hat, Inc., name=rhceph, ceph=True, distribution-scope=public, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.openshift.expose-services=, GIT_BRANCH=main, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhceph ceph, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.k8s.description=Red Hat Ceph Storage 7, url=https://catalog.redhat.com/en/search?searchType=containers, RELEASE=main, GIT_CLEAN=True, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.buildah.version=1.41.4, version=7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0) Dec 15 04:55:35 localhost systemd[1]: libpod-conmon-057898e6efae9d51f950d70b7e1d7db1dd087993257777f85550429e510ea8c1.scope: Deactivated successfully. 
Dec 15 04:55:35 localhost ceph-mon[298913]: from='mgr.26879 172.18.0.106:0/2534740212' entity='mgr.np0005559462.fudvyx' Dec 15 04:55:35 localhost ceph-mon[298913]: from='mgr.26879 172.18.0.106:0/2534740212' entity='mgr.np0005559462.fudvyx' Dec 15 04:55:35 localhost nova_compute[286344]: 2025-12-15 09:55:35.835 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 04:55:35 localhost ceph-mon[298913]: Reconfiguring osd.0 (monmap changed)... Dec 15 04:55:35 localhost ceph-mon[298913]: from='mgr.26879 172.18.0.106:0/2534740212' entity='mgr.np0005559462.fudvyx' cmd={"prefix": "auth get", "entity": "osd.0"} : dispatch Dec 15 04:55:35 localhost ceph-mon[298913]: Reconfiguring daemon osd.0 on np0005559462.localdomain Dec 15 04:55:35 localhost ceph-mon[298913]: mon.np0005559462@0(leader) e16 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005559462.localdomain.devices.0}] v 0) Dec 15 04:55:35 localhost ceph-mon[298913]: log_channel(audit) log [INF] : from='mgr.26879 172.18.0.106:0/2534740212' entity='mgr.np0005559462.fudvyx' Dec 15 04:55:35 localhost ceph-mon[298913]: mon.np0005559462@0(leader) e16 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005559462.localdomain}] v 0) Dec 15 04:55:35 localhost ceph-mon[298913]: log_channel(audit) log [INF] : from='mgr.26879 172.18.0.106:0/2534740212' entity='mgr.np0005559462.fudvyx' Dec 15 04:55:35 localhost ceph-mgr[292421]: [cephadm INFO cephadm.serve] Reconfiguring osd.3 (monmap changed)... Dec 15 04:55:35 localhost ceph-mgr[292421]: log_channel(cephadm) log [INF] : Reconfiguring osd.3 (monmap changed)... 
Dec 15 04:55:35 localhost ceph-mon[298913]: mon.np0005559462@0(leader) e16 handle_command mon_command({"prefix": "auth get", "entity": "osd.3"} v 0) Dec 15 04:55:35 localhost ceph-mon[298913]: log_channel(audit) log [INF] : from='mgr.26879 172.18.0.106:0/2534740212' entity='mgr.np0005559462.fudvyx' cmd={"prefix": "auth get", "entity": "osd.3"} : dispatch Dec 15 04:55:35 localhost ceph-mon[298913]: mon.np0005559462@0(leader) e16 handle_command mon_command({"prefix": "config generate-minimal-conf"} v 0) Dec 15 04:55:35 localhost ceph-mon[298913]: log_channel(audit) log [DBG] : from='mgr.26879 172.18.0.106:0/2534740212' entity='mgr.np0005559462.fudvyx' cmd={"prefix": "config generate-minimal-conf"} : dispatch Dec 15 04:55:35 localhost ceph-mgr[292421]: [cephadm INFO cephadm.serve] Reconfiguring daemon osd.3 on np0005559462.localdomain Dec 15 04:55:35 localhost ceph-mgr[292421]: log_channel(cephadm) log [INF] : Reconfiguring daemon osd.3 on np0005559462.localdomain Dec 15 04:55:36 localhost podman[308675]: Dec 15 04:55:36 localhost podman[308675]: 2025-12-15 09:55:36.578290762 +0000 UTC m=+0.080442454 container create 11b8949d9a582ac555927d4c58a9dceda15d91520e32cc6c397f3b2d87844511 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=awesome_wu, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.openshift.tags=rhceph ceph, architecture=x86_64, io.k8s.description=Red Hat Ceph Storage 7, vendor=Red Hat, Inc., RELEASE=main, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, GIT_BRANCH=main, release=1763362218, maintainer=Guillaume Abrioux , CEPH_POINT_RELEASE=, com.redhat.component=rhceph-container, version=7, GIT_CLEAN=True, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, 
url=https://catalog.redhat.com/en/search?searchType=containers, name=rhceph, io.openshift.expose-services=, build-date=2025-11-26T19:44:28Z, description=Red Hat Ceph Storage 7, io.buildah.version=1.41.4, ceph=True, GIT_REPO=https://github.com/ceph/ceph-container.git, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, distribution-scope=public, vcs-type=git) Dec 15 04:55:36 localhost systemd[1]: Started libpod-conmon-11b8949d9a582ac555927d4c58a9dceda15d91520e32cc6c397f3b2d87844511.scope. Dec 15 04:55:36 localhost systemd[1]: Started libcrun container. Dec 15 04:55:36 localhost podman[308675]: 2025-12-15 09:55:36.638298897 +0000 UTC m=+0.140450589 container init 11b8949d9a582ac555927d4c58a9dceda15d91520e32cc6c397f3b2d87844511 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=awesome_wu, ceph=True, RELEASE=main, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, vcs-type=git, maintainer=Guillaume Abrioux , CEPH_POINT_RELEASE=, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, release=1763362218, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.component=rhceph-container, io.openshift.tags=rhceph ceph, vendor=Red Hat, Inc., summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_REPO=https://github.com/ceph/ceph-container.git, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, build-date=2025-11-26T19:44:28Z, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, GIT_CLEAN=True, io.openshift.expose-services=, description=Red Hat Ceph Storage 7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, name=rhceph, io.buildah.version=1.41.4, architecture=x86_64, io.k8s.description=Red Hat Ceph Storage 7, version=7, GIT_BRANCH=main, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public) Dec 15 04:55:36 localhost podman[308675]: 2025-12-15 
09:55:36.54510296 +0000 UTC m=+0.047254722 image pull registry.redhat.io/rhceph/rhceph-7-rhel9:latest Dec 15 04:55:36 localhost podman[308675]: 2025-12-15 09:55:36.648746087 +0000 UTC m=+0.150897779 container start 11b8949d9a582ac555927d4c58a9dceda15d91520e32cc6c397f3b2d87844511 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=awesome_wu, maintainer=Guillaume Abrioux , GIT_BRANCH=main, RELEASE=main, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, vcs-type=git, release=1763362218, description=Red Hat Ceph Storage 7, vendor=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.expose-services=, GIT_CLEAN=True, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.openshift.tags=rhceph ceph, architecture=x86_64, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, version=7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, build-date=2025-11-26T19:44:28Z, GIT_REPO=https://github.com/ceph/ceph-container.git, name=rhceph, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat Ceph Storage 7, distribution-scope=public, io.buildah.version=1.41.4, ceph=True, com.redhat.component=rhceph-container, CEPH_POINT_RELEASE=, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0) Dec 15 04:55:36 localhost podman[308675]: 2025-12-15 09:55:36.649056916 +0000 UTC m=+0.151208608 container attach 11b8949d9a582ac555927d4c58a9dceda15d91520e32cc6c397f3b2d87844511 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=awesome_wu, distribution-scope=public, maintainer=Guillaume Abrioux , com.redhat.component=rhceph-container, description=Red Hat Ceph Storage 7, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, RELEASE=main, release=1763362218, url=https://catalog.redhat.com/en/search?searchType=containers, architecture=x86_64, 
io.buildah.version=1.41.4, name=rhceph, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, vcs-type=git, CEPH_POINT_RELEASE=, GIT_REPO=https://github.com/ceph/ceph-container.git, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, GIT_CLEAN=True, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., ceph=True, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_BRANCH=main, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.openshift.expose-services=, version=7, build-date=2025-11-26T19:44:28Z, io.k8s.description=Red Hat Ceph Storage 7, io.openshift.tags=rhceph ceph) Dec 15 04:55:36 localhost awesome_wu[308690]: 167 167 Dec 15 04:55:36 localhost systemd[1]: libpod-11b8949d9a582ac555927d4c58a9dceda15d91520e32cc6c397f3b2d87844511.scope: Deactivated successfully. Dec 15 04:55:36 localhost podman[308675]: 2025-12-15 09:55:36.652590284 +0000 UTC m=+0.154742046 container died 11b8949d9a582ac555927d4c58a9dceda15d91520e32cc6c397f3b2d87844511 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=awesome_wu, maintainer=Guillaume Abrioux , url=https://catalog.redhat.com/en/search?searchType=containers, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vendor=Red Hat, Inc., io.openshift.expose-services=, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, name=rhceph, release=1763362218, CEPH_POINT_RELEASE=, com.redhat.component=rhceph-container, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.buildah.version=1.41.4, description=Red Hat Ceph Storage 7, version=7, RELEASE=main, ceph=True, io.openshift.tags=rhceph ceph, architecture=x86_64, io.k8s.description=Red Hat Ceph Storage 7, GIT_BRANCH=main, GIT_CLEAN=True, distribution-scope=public, 
build-date=2025-11-26T19:44:28Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, GIT_REPO=https://github.com/ceph/ceph-container.git, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, vcs-type=git, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0) Dec 15 04:55:36 localhost podman[308695]: 2025-12-15 09:55:36.744125585 +0000 UTC m=+0.081646037 container remove 11b8949d9a582ac555927d4c58a9dceda15d91520e32cc6c397f3b2d87844511 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=awesome_wu, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vcs-type=git, io.openshift.expose-services=, GIT_BRANCH=main, com.redhat.component=rhceph-container, CEPH_POINT_RELEASE=, build-date=2025-11-26T19:44:28Z, vendor=Red Hat, Inc., vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, maintainer=Guillaume Abrioux , org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, url=https://catalog.redhat.com/en/search?searchType=containers, RELEASE=main, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, GIT_CLEAN=True, release=1763362218, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat Ceph Storage 7, io.buildah.version=1.41.4, description=Red Hat Ceph Storage 7, distribution-scope=public, io.openshift.tags=rhceph ceph, ceph=True, GIT_REPO=https://github.com/ceph/ceph-container.git, version=7, architecture=x86_64, name=rhceph) Dec 15 04:55:36 localhost systemd[1]: libpod-conmon-11b8949d9a582ac555927d4c58a9dceda15d91520e32cc6c397f3b2d87844511.scope: Deactivated successfully. 
Dec 15 04:55:36 localhost ceph-mgr[292421]: log_channel(cluster) log [DBG] : pgmap v42: 177 pgs: 177 active+clean; 104 MiB data, 548 MiB used, 41 GiB / 42 GiB avail Dec 15 04:55:36 localhost ceph-mon[298913]: from='mgr.26879 172.18.0.106:0/2534740212' entity='mgr.np0005559462.fudvyx' Dec 15 04:55:36 localhost ceph-mon[298913]: from='mgr.26879 172.18.0.106:0/2534740212' entity='mgr.np0005559462.fudvyx' Dec 15 04:55:36 localhost ceph-mon[298913]: Reconfiguring osd.3 (monmap changed)... Dec 15 04:55:36 localhost ceph-mon[298913]: from='mgr.26879 172.18.0.106:0/2534740212' entity='mgr.np0005559462.fudvyx' cmd={"prefix": "auth get", "entity": "osd.3"} : dispatch Dec 15 04:55:36 localhost ceph-mon[298913]: Reconfiguring daemon osd.3 on np0005559462.localdomain Dec 15 04:55:36 localhost ceph-mon[298913]: mon.np0005559462@0(leader) e16 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005559462.localdomain.devices.0}] v 0) Dec 15 04:55:36 localhost ceph-mon[298913]: log_channel(audit) log [INF] : from='mgr.26879 172.18.0.106:0/2534740212' entity='mgr.np0005559462.fudvyx' Dec 15 04:55:36 localhost ceph-mon[298913]: mon.np0005559462@0(leader) e16 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005559462.localdomain}] v 0) Dec 15 04:55:36 localhost ceph-mon[298913]: log_channel(audit) log [INF] : from='mgr.26879 172.18.0.106:0/2534740212' entity='mgr.np0005559462.fudvyx' Dec 15 04:55:36 localhost ceph-mgr[292421]: [cephadm INFO cephadm.serve] Reconfiguring mds.mds.np0005559462.mhigvc (monmap changed)... Dec 15 04:55:36 localhost ceph-mgr[292421]: log_channel(cephadm) log [INF] : Reconfiguring mds.mds.np0005559462.mhigvc (monmap changed)... 
Dec 15 04:55:36 localhost ceph-mon[298913]: mon.np0005559462@0(leader) e16 handle_command mon_command({"prefix": "auth get-or-create", "entity": "mds.mds.np0005559462.mhigvc", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]} v 0) Dec 15 04:55:36 localhost ceph-mon[298913]: log_channel(audit) log [INF] : from='mgr.26879 172.18.0.106:0/2534740212' entity='mgr.np0005559462.fudvyx' cmd={"prefix": "auth get-or-create", "entity": "mds.mds.np0005559462.mhigvc", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]} : dispatch Dec 15 04:55:36 localhost ceph-mon[298913]: mon.np0005559462@0(leader) e16 handle_command mon_command({"prefix": "config generate-minimal-conf"} v 0) Dec 15 04:55:36 localhost ceph-mon[298913]: log_channel(audit) log [DBG] : from='mgr.26879 172.18.0.106:0/2534740212' entity='mgr.np0005559462.fudvyx' cmd={"prefix": "config generate-minimal-conf"} : dispatch Dec 15 04:55:36 localhost ceph-mgr[292421]: [cephadm INFO cephadm.serve] Reconfiguring daemon mds.mds.np0005559462.mhigvc on np0005559462.localdomain Dec 15 04:55:36 localhost ceph-mgr[292421]: log_channel(cephadm) log [INF] : Reconfiguring daemon mds.mds.np0005559462.mhigvc on np0005559462.localdomain Dec 15 04:55:37 localhost systemd[1]: var-lib-containers-storage-overlay-11a1c0d5665546851d71e533d414f6d3b757d30872ecb24d529da64289901d14-merged.mount: Deactivated successfully. 
Dec 15 04:55:37 localhost podman[308773]: Dec 15 04:55:37 localhost podman[308773]: 2025-12-15 09:55:37.649005651 +0000 UTC m=+0.074554200 container create 547814fc4a6469bcebccdf4bd85ebe60426986bb78cf25a26a2b5dd0dfd82bec (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=recursing_euclid, io.k8s.description=Red Hat Ceph Storage 7, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, vendor=Red Hat, Inc., RELEASE=main, GIT_BRANCH=main, com.redhat.component=rhceph-container, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-11-26T19:44:28Z, architecture=x86_64, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, name=rhceph, io.openshift.expose-services=, GIT_CLEAN=True, vcs-type=git, CEPH_POINT_RELEASE=, release=1763362218, GIT_REPO=https://github.com/ceph/ceph-container.git, distribution-scope=public, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.buildah.version=1.41.4, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.openshift.tags=rhceph ceph, version=7, maintainer=Guillaume Abrioux , description=Red Hat Ceph Storage 7, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, ceph=True) Dec 15 04:55:37 localhost systemd[1]: Started libpod-conmon-547814fc4a6469bcebccdf4bd85ebe60426986bb78cf25a26a2b5dd0dfd82bec.scope. Dec 15 04:55:37 localhost systemd[1]: Started libcrun container. 
Dec 15 04:55:37 localhost podman[308773]: 2025-12-15 09:55:37.709296975 +0000 UTC m=+0.134845514 container init 547814fc4a6469bcebccdf4bd85ebe60426986bb78cf25a26a2b5dd0dfd82bec (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=recursing_euclid, ceph=True, CEPH_POINT_RELEASE=, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vcs-type=git, description=Red Hat Ceph Storage 7, version=7, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, distribution-scope=public, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.openshift.expose-services=, maintainer=Guillaume Abrioux , GIT_REPO=https://github.com/ceph/ceph-container.git, GIT_CLEAN=True, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, vendor=Red Hat, Inc., com.redhat.component=rhceph-container, url=https://catalog.redhat.com/en/search?searchType=containers, io.buildah.version=1.41.4, name=rhceph, RELEASE=main, build-date=2025-11-26T19:44:28Z, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, release=1763362218, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat Ceph Storage 7, GIT_BRANCH=main, io.openshift.tags=rhceph ceph, architecture=x86_64) Dec 15 04:55:37 localhost podman[308773]: 2025-12-15 09:55:37.619207515 +0000 UTC m=+0.044756124 image pull registry.redhat.io/rhceph/rhceph-7-rhel9:latest Dec 15 04:55:37 localhost podman[308773]: 2025-12-15 09:55:37.71884372 +0000 UTC m=+0.144392259 container start 547814fc4a6469bcebccdf4bd85ebe60426986bb78cf25a26a2b5dd0dfd82bec (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=recursing_euclid, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, ceph=True, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, RELEASE=main, url=https://catalog.redhat.com/en/search?searchType=containers, 
description=Red Hat Ceph Storage 7, build-date=2025-11-26T19:44:28Z, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, GIT_BRANCH=main, com.redhat.component=rhceph-container, io.buildah.version=1.41.4, io.openshift.expose-services=, CEPH_POINT_RELEASE=, vendor=Red Hat, Inc., release=1763362218, distribution-scope=public, vcs-type=git, version=7, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.k8s.description=Red Hat Ceph Storage 7, maintainer=Guillaume Abrioux , architecture=x86_64, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.openshift.tags=rhceph ceph, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, name=rhceph, GIT_REPO=https://github.com/ceph/ceph-container.git, GIT_CLEAN=True) Dec 15 04:55:37 localhost podman[308773]: 2025-12-15 09:55:37.71920077 +0000 UTC m=+0.144749359 container attach 547814fc4a6469bcebccdf4bd85ebe60426986bb78cf25a26a2b5dd0dfd82bec (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=recursing_euclid, vcs-type=git, io.buildah.version=1.41.4, io.openshift.tags=rhceph ceph, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, GIT_REPO=https://github.com/ceph/ceph-container.git, architecture=x86_64, io.k8s.description=Red Hat Ceph Storage 7, maintainer=Guillaume Abrioux , GIT_BRANCH=main, GIT_CLEAN=True, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, ceph=True, distribution-scope=public, io.openshift.expose-services=, RELEASE=main, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, url=https://catalog.redhat.com/en/search?searchType=containers, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., com.redhat.component=rhceph-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhceph, 
GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, build-date=2025-11-26T19:44:28Z, vendor=Red Hat, Inc., CEPH_POINT_RELEASE=, release=1763362218, description=Red Hat Ceph Storage 7, version=7) Dec 15 04:55:37 localhost recursing_euclid[308788]: 167 167 Dec 15 04:55:37 localhost systemd[1]: libpod-547814fc4a6469bcebccdf4bd85ebe60426986bb78cf25a26a2b5dd0dfd82bec.scope: Deactivated successfully. Dec 15 04:55:37 localhost podman[308773]: 2025-12-15 09:55:37.722278115 +0000 UTC m=+0.147826664 container died 547814fc4a6469bcebccdf4bd85ebe60426986bb78cf25a26a2b5dd0dfd82bec (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=recursing_euclid, name=rhceph, io.k8s.description=Red Hat Ceph Storage 7, architecture=x86_64, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, description=Red Hat Ceph Storage 7, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, vcs-type=git, GIT_REPO=https://github.com/ceph/ceph-container.git, distribution-scope=public, build-date=2025-11-26T19:44:28Z, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.openshift.expose-services=, GIT_BRANCH=main, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=7, GIT_CLEAN=True, io.buildah.version=1.41.4, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., com.redhat.component=rhceph-container, io.openshift.tags=rhceph ceph, maintainer=Guillaume Abrioux , ceph=True, CEPH_POINT_RELEASE=, release=1763362218, url=https://catalog.redhat.com/en/search?searchType=containers, RELEASE=main) Dec 15 04:55:37 localhost podman[308793]: 2025-12-15 09:55:37.841890535 +0000 UTC m=+0.104859551 container remove 547814fc4a6469bcebccdf4bd85ebe60426986bb78cf25a26a2b5dd0dfd82bec (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=recursing_euclid, 
io.buildah.version=1.41.4, com.redhat.component=rhceph-container, GIT_CLEAN=True, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat Ceph Storage 7, ceph=True, GIT_REPO=https://github.com/ceph/ceph-container.git, name=rhceph, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, GIT_BRANCH=main, vcs-type=git, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, version=7, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, description=Red Hat Ceph Storage 7, distribution-scope=public, io.openshift.expose-services=, build-date=2025-11-26T19:44:28Z, RELEASE=main, CEPH_POINT_RELEASE=, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., maintainer=Guillaume Abrioux , release=1763362218, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.openshift.tags=rhceph ceph, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, url=https://catalog.redhat.com/en/search?searchType=containers, architecture=x86_64) Dec 15 04:55:37 localhost systemd[1]: libpod-conmon-547814fc4a6469bcebccdf4bd85ebe60426986bb78cf25a26a2b5dd0dfd82bec.scope: Deactivated successfully. 
Dec 15 04:55:37 localhost ceph-mon[298913]: mon.np0005559462@0(leader) e16 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005559462.localdomain.devices.0}] v 0) Dec 15 04:55:37 localhost ceph-mon[298913]: log_channel(audit) log [INF] : from='mgr.26879 172.18.0.106:0/2534740212' entity='mgr.np0005559462.fudvyx' Dec 15 04:55:37 localhost ceph-mon[298913]: mon.np0005559462@0(leader) e16 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005559462.localdomain}] v 0) Dec 15 04:55:37 localhost ceph-mon[298913]: log_channel(audit) log [INF] : from='mgr.26879 172.18.0.106:0/2534740212' entity='mgr.np0005559462.fudvyx' Dec 15 04:55:37 localhost ceph-mgr[292421]: [cephadm INFO cephadm.serve] Reconfiguring mgr.np0005559462.fudvyx (monmap changed)... Dec 15 04:55:37 localhost ceph-mgr[292421]: log_channel(cephadm) log [INF] : Reconfiguring mgr.np0005559462.fudvyx (monmap changed)... Dec 15 04:55:37 localhost ceph-mon[298913]: mon.np0005559462@0(leader) e16 handle_command mon_command({"prefix": "auth get-or-create", "entity": "mgr.np0005559462.fudvyx", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} v 0) Dec 15 04:55:37 localhost ceph-mon[298913]: log_channel(audit) log [INF] : from='mgr.26879 172.18.0.106:0/2534740212' entity='mgr.np0005559462.fudvyx' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005559462.fudvyx", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch Dec 15 04:55:37 localhost ceph-mon[298913]: mon.np0005559462@0(leader) e16 handle_command mon_command({"prefix": "mgr services"} v 0) Dec 15 04:55:37 localhost ceph-mon[298913]: log_channel(audit) log [DBG] : from='mgr.26879 172.18.0.106:0/2534740212' entity='mgr.np0005559462.fudvyx' cmd={"prefix": "mgr services"} : dispatch Dec 15 04:55:37 localhost ceph-mon[298913]: mon.np0005559462@0(leader) e16 handle_command mon_command({"prefix": "config generate-minimal-conf"} v 0) Dec 15 04:55:37 localhost 
ceph-mon[298913]: log_channel(audit) log [DBG] : from='mgr.26879 172.18.0.106:0/2534740212' entity='mgr.np0005559462.fudvyx' cmd={"prefix": "config generate-minimal-conf"} : dispatch Dec 15 04:55:37 localhost ceph-mgr[292421]: [cephadm INFO cephadm.serve] Reconfiguring daemon mgr.np0005559462.fudvyx on np0005559462.localdomain Dec 15 04:55:37 localhost ceph-mgr[292421]: log_channel(cephadm) log [INF] : Reconfiguring daemon mgr.np0005559462.fudvyx on np0005559462.localdomain Dec 15 04:55:37 localhost ceph-mon[298913]: from='mgr.26879 172.18.0.106:0/2534740212' entity='mgr.np0005559462.fudvyx' Dec 15 04:55:37 localhost ceph-mon[298913]: from='mgr.26879 172.18.0.106:0/2534740212' entity='mgr.np0005559462.fudvyx' Dec 15 04:55:37 localhost ceph-mon[298913]: Reconfiguring mds.mds.np0005559462.mhigvc (monmap changed)... Dec 15 04:55:37 localhost ceph-mon[298913]: from='mgr.26879 172.18.0.106:0/2534740212' entity='mgr.np0005559462.fudvyx' cmd={"prefix": "auth get-or-create", "entity": "mds.mds.np0005559462.mhigvc", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]} : dispatch Dec 15 04:55:37 localhost ceph-mon[298913]: Reconfiguring daemon mds.mds.np0005559462.mhigvc on np0005559462.localdomain Dec 15 04:55:37 localhost ceph-mon[298913]: from='mgr.26879 172.18.0.106:0/2534740212' entity='mgr.np0005559462.fudvyx' Dec 15 04:55:37 localhost ceph-mon[298913]: from='mgr.26879 172.18.0.106:0/2534740212' entity='mgr.np0005559462.fudvyx' Dec 15 04:55:38 localhost podman[308863]: Dec 15 04:55:38 localhost podman[308863]: 2025-12-15 09:55:38.538254824 +0000 UTC m=+0.082177942 container create bab075d8106c6dfa5e05a862a9b1bfe9dfeee3b00848989c47206ee9f75a5cd2 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=nostalgic_keldysh, io.openshift.tags=rhceph ceph, CEPH_POINT_RELEASE=, maintainer=Guillaume Abrioux , build-date=2025-11-26T19:44:28Z, com.redhat.component=rhceph-container, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, GIT_CLEAN=True, 
io.buildah.version=1.41.4, distribution-scope=public, release=1763362218, name=rhceph, description=Red Hat Ceph Storage 7, version=7, vendor=Red Hat, Inc., summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_BRANCH=main, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, architecture=x86_64, io.openshift.expose-services=, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, url=https://catalog.redhat.com/en/search?searchType=containers, RELEASE=main, vcs-type=git, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, GIT_REPO=https://github.com/ceph/ceph-container.git, ceph=True, io.k8s.description=Red Hat Ceph Storage 7) Dec 15 04:55:38 localhost systemd[1]: Started libpod-conmon-bab075d8106c6dfa5e05a862a9b1bfe9dfeee3b00848989c47206ee9f75a5cd2.scope. Dec 15 04:55:38 localhost systemd[1]: var-lib-containers-storage-overlay-7029b9301f10c09893d8c2024b60b7cbf9ce2b98ee39ac323382596f3bde3be6-merged.mount: Deactivated successfully. Dec 15 04:55:38 localhost systemd[1]: Started libcrun container. 
Dec 15 04:55:38 localhost podman[308863]: 2025-12-15 09:55:38.603183157 +0000 UTC m=+0.147106275 container init bab075d8106c6dfa5e05a862a9b1bfe9dfeee3b00848989c47206ee9f75a5cd2 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=nostalgic_keldysh, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.openshift.expose-services=, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, maintainer=Guillaume Abrioux , distribution-scope=public, io.k8s.description=Red Hat Ceph Storage 7, RELEASE=main, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, io.buildah.version=1.41.4, release=1763362218, description=Red Hat Ceph Storage 7, vcs-type=git, architecture=x86_64, io.openshift.tags=rhceph ceph, GIT_REPO=https://github.com/ceph/ceph-container.git, vendor=Red Hat, Inc., GIT_CLEAN=True, GIT_BRANCH=main, com.redhat.component=rhceph-container, ceph=True, version=7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-11-26T19:44:28Z, CEPH_POINT_RELEASE=, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, url=https://catalog.redhat.com/en/search?searchType=containers, name=rhceph, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9) Dec 15 04:55:38 localhost podman[308863]: 2025-12-15 09:55:38.507273255 +0000 UTC m=+0.051196403 image pull registry.redhat.io/rhceph/rhceph-7-rhel9:latest Dec 15 04:55:38 localhost nostalgic_keldysh[308879]: 167 167 Dec 15 04:55:38 localhost systemd[1]: libpod-bab075d8106c6dfa5e05a862a9b1bfe9dfeee3b00848989c47206ee9f75a5cd2.scope: Deactivated successfully. 
Dec 15 04:55:38 localhost podman[308863]: 2025-12-15 09:55:38.613827872 +0000 UTC m=+0.157750990 container start bab075d8106c6dfa5e05a862a9b1bfe9dfeee3b00848989c47206ee9f75a5cd2 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=nostalgic_keldysh, url=https://catalog.redhat.com/en/search?searchType=containers, build-date=2025-11-26T19:44:28Z, io.k8s.description=Red Hat Ceph Storage 7, io.openshift.expose-services=, io.openshift.tags=rhceph ceph, name=rhceph, io.buildah.version=1.41.4, GIT_REPO=https://github.com/ceph/ceph-container.git, com.redhat.component=rhceph-container, maintainer=Guillaume Abrioux , io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, distribution-scope=public, version=7, vcs-type=git, GIT_BRANCH=main, description=Red Hat Ceph Storage 7, vendor=Red Hat, Inc., GIT_CLEAN=True, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, release=1763362218, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, RELEASE=main, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, ceph=True, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., architecture=x86_64, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, CEPH_POINT_RELEASE=) Dec 15 04:55:38 localhost podman[308863]: 2025-12-15 09:55:38.614134501 +0000 UTC m=+0.158057629 container attach bab075d8106c6dfa5e05a862a9b1bfe9dfeee3b00848989c47206ee9f75a5cd2 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=nostalgic_keldysh, build-date=2025-11-26T19:44:28Z, io.k8s.description=Red Hat Ceph Storage 7, GIT_REPO=https://github.com/ceph/ceph-container.git, CEPH_POINT_RELEASE=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, vendor=Red Hat, Inc., vcs-type=git, architecture=x86_64, version=7, summary=Provides the latest Red Hat Ceph Storage 7 on 
RHEL 9 in a fully featured and supported base image., GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, name=rhceph, io.buildah.version=1.41.4, com.redhat.component=rhceph-container, description=Red Hat Ceph Storage 7, release=1763362218, GIT_CLEAN=True, RELEASE=main, maintainer=Guillaume Abrioux , GIT_BRANCH=main, io.openshift.tags=rhceph ceph, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, ceph=True, io.openshift.expose-services=, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0) Dec 15 04:55:38 localhost podman[308863]: 2025-12-15 09:55:38.616929978 +0000 UTC m=+0.160853106 container died bab075d8106c6dfa5e05a862a9b1bfe9dfeee3b00848989c47206ee9f75a5cd2 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=nostalgic_keldysh, architecture=x86_64, maintainer=Guillaume Abrioux , RELEASE=main, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.openshift.tags=rhceph ceph, com.redhat.component=rhceph-container, ceph=True, description=Red Hat Ceph Storage 7, distribution-scope=public, io.k8s.description=Red Hat Ceph Storage 7, GIT_CLEAN=True, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., CEPH_POINT_RELEASE=, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, vendor=Red Hat, Inc., name=rhceph, io.buildah.version=1.41.4, version=7, GIT_REPO=https://github.com/ceph/ceph-container.git, release=1763362218, io.openshift.expose-services=, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, build-date=2025-11-26T19:44:28Z, 
vcs-type=git, GIT_BRANCH=main) Dec 15 04:55:38 localhost podman[308884]: 2025-12-15 09:55:38.709420945 +0000 UTC m=+0.082793679 container remove bab075d8106c6dfa5e05a862a9b1bfe9dfeee3b00848989c47206ee9f75a5cd2 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=nostalgic_keldysh, GIT_BRANCH=main, com.redhat.component=rhceph-container, io.k8s.description=Red Hat Ceph Storage 7, vendor=Red Hat, Inc., CEPH_POINT_RELEASE=, RELEASE=main, io.openshift.tags=rhceph ceph, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, build-date=2025-11-26T19:44:28Z, GIT_CLEAN=True, url=https://catalog.redhat.com/en/search?searchType=containers, ceph=True, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, description=Red Hat Ceph Storage 7, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., distribution-scope=public, architecture=x86_64, name=rhceph, io.buildah.version=1.41.4, version=7, maintainer=Guillaume Abrioux , vcs-type=git, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, GIT_REPO=https://github.com/ceph/ceph-container.git, release=1763362218, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0) Dec 15 04:55:38 localhost systemd[1]: libpod-conmon-bab075d8106c6dfa5e05a862a9b1bfe9dfeee3b00848989c47206ee9f75a5cd2.scope: Deactivated successfully. 
Dec 15 04:55:38 localhost ceph-mon[298913]: mon.np0005559462@0(leader) e16 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005559462.localdomain.devices.0}] v 0) Dec 15 04:55:38 localhost ceph-mon[298913]: log_channel(audit) log [INF] : from='mgr.26879 172.18.0.106:0/2534740212' entity='mgr.np0005559462.fudvyx' Dec 15 04:55:38 localhost ceph-mon[298913]: mon.np0005559462@0(leader) e16 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005559462.localdomain}] v 0) Dec 15 04:55:38 localhost ceph-mgr[292421]: log_channel(cluster) log [DBG] : pgmap v43: 177 pgs: 177 active+clean; 104 MiB data, 548 MiB used, 41 GiB / 42 GiB avail Dec 15 04:55:38 localhost ceph-mon[298913]: log_channel(audit) log [INF] : from='mgr.26879 172.18.0.106:0/2534740212' entity='mgr.np0005559462.fudvyx' Dec 15 04:55:38 localhost ceph-mgr[292421]: [cephadm INFO cephadm.serve] Reconfiguring crash.np0005559463 (monmap changed)... Dec 15 04:55:38 localhost ceph-mgr[292421]: log_channel(cephadm) log [INF] : Reconfiguring crash.np0005559463 (monmap changed)... 
Dec 15 04:55:38 localhost ceph-mon[298913]: mon.np0005559462@0(leader) e16 handle_command mon_command({"prefix": "auth get-or-create", "entity": "client.crash.np0005559463.localdomain", "caps": ["mon", "profile crash", "mgr", "profile crash"]} v 0) Dec 15 04:55:38 localhost ceph-mon[298913]: log_channel(audit) log [INF] : from='mgr.26879 172.18.0.106:0/2534740212' entity='mgr.np0005559462.fudvyx' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005559463.localdomain", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch Dec 15 04:55:38 localhost ceph-mon[298913]: mon.np0005559462@0(leader) e16 handle_command mon_command({"prefix": "config generate-minimal-conf"} v 0) Dec 15 04:55:38 localhost ceph-mon[298913]: log_channel(audit) log [DBG] : from='mgr.26879 172.18.0.106:0/2534740212' entity='mgr.np0005559462.fudvyx' cmd={"prefix": "config generate-minimal-conf"} : dispatch Dec 15 04:55:38 localhost ceph-mgr[292421]: [cephadm INFO cephadm.serve] Reconfiguring daemon crash.np0005559463 on np0005559463.localdomain Dec 15 04:55:38 localhost ceph-mgr[292421]: log_channel(cephadm) log [INF] : Reconfiguring daemon crash.np0005559463 on np0005559463.localdomain Dec 15 04:55:38 localhost ceph-mon[298913]: Reconfiguring mgr.np0005559462.fudvyx (monmap changed)... 
Dec 15 04:55:38 localhost ceph-mon[298913]: from='mgr.26879 172.18.0.106:0/2534740212' entity='mgr.np0005559462.fudvyx' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005559462.fudvyx", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch Dec 15 04:55:38 localhost ceph-mon[298913]: Reconfiguring daemon mgr.np0005559462.fudvyx on np0005559462.localdomain Dec 15 04:55:38 localhost ceph-mon[298913]: from='mgr.26879 172.18.0.106:0/2534740212' entity='mgr.np0005559462.fudvyx' Dec 15 04:55:38 localhost ceph-mon[298913]: from='mgr.26879 172.18.0.106:0/2534740212' entity='mgr.np0005559462.fudvyx' Dec 15 04:55:38 localhost ceph-mon[298913]: from='mgr.26879 172.18.0.106:0/2534740212' entity='mgr.np0005559462.fudvyx' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005559463.localdomain", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch Dec 15 04:55:39 localhost systemd[1]: var-lib-containers-storage-overlay-96ab797859387190f21ab1c3fb6594e9db3f4232083f8b9c1509645aa196e336-merged.mount: Deactivated successfully. Dec 15 04:55:39 localhost ceph-mon[298913]: mon.np0005559462@0(leader) e16 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005559463.localdomain.devices.0}] v 0) Dec 15 04:55:39 localhost ceph-mon[298913]: log_channel(audit) log [INF] : from='mgr.26879 172.18.0.106:0/2534740212' entity='mgr.np0005559462.fudvyx' Dec 15 04:55:39 localhost ceph-mon[298913]: mon.np0005559462@0(leader) e16 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005559463.localdomain}] v 0) Dec 15 04:55:39 localhost ceph-mon[298913]: log_channel(audit) log [INF] : from='mgr.26879 172.18.0.106:0/2534740212' entity='mgr.np0005559462.fudvyx' Dec 15 04:55:39 localhost ceph-mgr[292421]: [cephadm INFO cephadm.serve] Reconfiguring osd.2 (monmap changed)... Dec 15 04:55:39 localhost ceph-mgr[292421]: log_channel(cephadm) log [INF] : Reconfiguring osd.2 (monmap changed)... 
Dec 15 04:55:39 localhost ceph-mon[298913]: mon.np0005559462@0(leader) e16 handle_command mon_command({"prefix": "auth get", "entity": "osd.2"} v 0)
Dec 15 04:55:39 localhost ceph-mon[298913]: log_channel(audit) log [INF] : from='mgr.26879 172.18.0.106:0/2534740212' entity='mgr.np0005559462.fudvyx' cmd={"prefix": "auth get", "entity": "osd.2"} : dispatch
Dec 15 04:55:39 localhost ceph-mon[298913]: mon.np0005559462@0(leader) e16 handle_command mon_command({"prefix": "config generate-minimal-conf"} v 0)
Dec 15 04:55:39 localhost ceph-mon[298913]: log_channel(audit) log [DBG] : from='mgr.26879 172.18.0.106:0/2534740212' entity='mgr.np0005559462.fudvyx' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 15 04:55:39 localhost ceph-mgr[292421]: log_channel(audit) log [DBG] : from='client.54157 -' entity='client.admin' cmd=[{"prefix": "orch daemon add", "daemon_type": "mon", "placement": "np0005559464.localdomain:172.18.0.105", "target": ["mon-mgr", ""]}]: dispatch
Dec 15 04:55:39 localhost ceph-mon[298913]: mon.np0005559462@0(leader) e16 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/spec.mon}] v 0)
Dec 15 04:55:39 localhost ceph-mgr[292421]: [cephadm INFO cephadm.serve] Reconfiguring daemon osd.2 on np0005559463.localdomain
Dec 15 04:55:39 localhost ceph-mgr[292421]: log_channel(cephadm) log [INF] : Reconfiguring daemon osd.2 on np0005559463.localdomain
Dec 15 04:55:39 localhost ceph-mon[298913]: log_channel(audit) log [INF] : from='mgr.26879 172.18.0.106:0/2534740212' entity='mgr.np0005559462.fudvyx'
Dec 15 04:55:39 localhost ceph-mon[298913]: mon.np0005559462@0(leader) e16 handle_command mon_command({"prefix": "auth get", "entity": "mon."} v 0)
Dec 15 04:55:39 localhost ceph-mon[298913]: log_channel(audit) log [INF] : from='mgr.26879 172.18.0.106:0/2534740212' entity='mgr.np0005559462.fudvyx' cmd={"prefix": "auth get", "entity": "mon."} : dispatch
Dec 15 04:55:39 localhost ceph-mon[298913]: mon.np0005559462@0(leader) e16 handle_command mon_command({"prefix": "config generate-minimal-conf"} v 0)
Dec 15 04:55:39 localhost ceph-mon[298913]: log_channel(audit) log [DBG] : from='mgr.26879 172.18.0.106:0/2534740212' entity='mgr.np0005559462.fudvyx' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 15 04:55:39 localhost ceph-mgr[292421]: [cephadm INFO cephadm.serve] Deploying daemon mon.np0005559464 on np0005559464.localdomain
Dec 15 04:55:39 localhost ceph-mgr[292421]: log_channel(cephadm) log [INF] : Deploying daemon mon.np0005559464 on np0005559464.localdomain
Dec 15 04:55:39 localhost ceph-mon[298913]: Reconfiguring crash.np0005559463 (monmap changed)...
Dec 15 04:55:39 localhost ceph-mon[298913]: Reconfiguring daemon crash.np0005559463 on np0005559463.localdomain
Dec 15 04:55:39 localhost ceph-mon[298913]: from='mgr.26879 172.18.0.106:0/2534740212' entity='mgr.np0005559462.fudvyx'
Dec 15 04:55:39 localhost ceph-mon[298913]: from='mgr.26879 172.18.0.106:0/2534740212' entity='mgr.np0005559462.fudvyx'
Dec 15 04:55:39 localhost ceph-mon[298913]: from='mgr.26879 172.18.0.106:0/2534740212' entity='mgr.np0005559462.fudvyx' cmd={"prefix": "auth get", "entity": "osd.2"} : dispatch
Dec 15 04:55:39 localhost ceph-mon[298913]: from='mgr.26879 172.18.0.106:0/2534740212' entity='mgr.np0005559462.fudvyx'
Dec 15 04:55:39 localhost ceph-mon[298913]: from='mgr.26879 172.18.0.106:0/2534740212' entity='mgr.np0005559462.fudvyx' cmd={"prefix": "auth get", "entity": "mon."} : dispatch
Dec 15 04:55:40 localhost ceph-mon[298913]: mon.np0005559462@0(leader).osd e88 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 15 04:55:40 localhost ceph-mon[298913]: mon.np0005559462@0(leader) e16 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005559463.localdomain.devices.0}] v 0)
Dec 15 04:55:40 localhost ceph-mon[298913]: log_channel(audit) log [INF] : from='mgr.26879 172.18.0.106:0/2534740212' entity='mgr.np0005559462.fudvyx'
Dec 15 04:55:40 localhost ceph-mon[298913]: mon.np0005559462@0(leader) e16 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005559463.localdomain}] v 0)
Dec 15 04:55:40 localhost ceph-mon[298913]: log_channel(audit) log [INF] : from='mgr.26879 172.18.0.106:0/2534740212' entity='mgr.np0005559462.fudvyx'
Dec 15 04:55:40 localhost ceph-mgr[292421]: [cephadm INFO cephadm.serve] Reconfiguring osd.5 (monmap changed)...
Dec 15 04:55:40 localhost ceph-mgr[292421]: log_channel(cephadm) log [INF] : Reconfiguring osd.5 (monmap changed)...
Dec 15 04:55:40 localhost ceph-mon[298913]: mon.np0005559462@0(leader) e16 handle_command mon_command({"prefix": "auth get", "entity": "osd.5"} v 0)
Dec 15 04:55:40 localhost ceph-mon[298913]: log_channel(audit) log [INF] : from='mgr.26879 172.18.0.106:0/2534740212' entity='mgr.np0005559462.fudvyx' cmd={"prefix": "auth get", "entity": "osd.5"} : dispatch
Dec 15 04:55:40 localhost ceph-mon[298913]: mon.np0005559462@0(leader) e16 handle_command mon_command({"prefix": "config generate-minimal-conf"} v 0)
Dec 15 04:55:40 localhost ceph-mon[298913]: log_channel(audit) log [DBG] : from='mgr.26879 172.18.0.106:0/2534740212' entity='mgr.np0005559462.fudvyx' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 15 04:55:40 localhost ceph-mgr[292421]: [cephadm INFO cephadm.serve] Reconfiguring daemon osd.5 on np0005559463.localdomain
Dec 15 04:55:40 localhost ceph-mgr[292421]: log_channel(cephadm) log [INF] : Reconfiguring daemon osd.5 on np0005559463.localdomain
Dec 15 04:55:40 localhost ceph-mgr[292421]: log_channel(cluster) log [DBG] : pgmap v44: 177 pgs: 177 active+clean; 104 MiB data, 548 MiB used, 41 GiB / 42 GiB avail
Dec 15 04:55:40 localhost nova_compute[286344]: 2025-12-15 09:55:40.836 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 15 04:55:40 localhost nova_compute[286344]: 2025-12-15 09:55:40.840 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 15 04:55:41 localhost ceph-mon[298913]: Reconfiguring osd.2 (monmap changed)...
Dec 15 04:55:41 localhost ceph-mon[298913]: Reconfiguring daemon osd.2 on np0005559463.localdomain
Dec 15 04:55:41 localhost ceph-mon[298913]: Deploying daemon mon.np0005559464 on np0005559464.localdomain
Dec 15 04:55:41 localhost ceph-mon[298913]: from='mgr.26879 172.18.0.106:0/2534740212' entity='mgr.np0005559462.fudvyx'
Dec 15 04:55:41 localhost ceph-mon[298913]: from='mgr.26879 172.18.0.106:0/2534740212' entity='mgr.np0005559462.fudvyx'
Dec 15 04:55:41 localhost ceph-mon[298913]: from='mgr.26879 172.18.0.106:0/2534740212' entity='mgr.np0005559462.fudvyx' cmd={"prefix": "auth get", "entity": "osd.5"} : dispatch
Dec 15 04:55:41 localhost ceph-mon[298913]: mon.np0005559462@0(leader) e16 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005559463.localdomain.devices.0}] v 0)
Dec 15 04:55:41 localhost ceph-mon[298913]: log_channel(audit) log [INF] : from='mgr.26879 172.18.0.106:0/2534740212' entity='mgr.np0005559462.fudvyx'
Dec 15 04:55:41 localhost ceph-mon[298913]: mon.np0005559462@0(leader) e16 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005559463.localdomain}] v 0)
Dec 15 04:55:41 localhost systemd[1]: Started /usr/bin/podman healthcheck run a4e94bd7aaee6c174c601060fb5f34559019ccea93fc397263ee3735731cfc1e.
Dec 15 04:55:41 localhost ceph-mon[298913]: log_channel(audit) log [INF] : from='mgr.26879 172.18.0.106:0/2534740212' entity='mgr.np0005559462.fudvyx'
Dec 15 04:55:41 localhost ceph-mgr[292421]: [cephadm INFO cephadm.serve] Reconfiguring mds.mds.np0005559463.rdpgze (monmap changed)...
Dec 15 04:55:41 localhost ceph-mgr[292421]: log_channel(cephadm) log [INF] : Reconfiguring mds.mds.np0005559463.rdpgze (monmap changed)...
Dec 15 04:55:41 localhost ceph-mon[298913]: mon.np0005559462@0(leader) e16 handle_command mon_command({"prefix": "auth get-or-create", "entity": "mds.mds.np0005559463.rdpgze", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]} v 0)
Dec 15 04:55:41 localhost ceph-mon[298913]: log_channel(audit) log [INF] : from='mgr.26879 172.18.0.106:0/2534740212' entity='mgr.np0005559462.fudvyx' cmd={"prefix": "auth get-or-create", "entity": "mds.mds.np0005559463.rdpgze", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]} : dispatch
Dec 15 04:55:41 localhost ceph-mon[298913]: mon.np0005559462@0(leader) e16 handle_command mon_command({"prefix": "config generate-minimal-conf"} v 0)
Dec 15 04:55:41 localhost ceph-mon[298913]: log_channel(audit) log [DBG] : from='mgr.26879 172.18.0.106:0/2534740212' entity='mgr.np0005559462.fudvyx' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 15 04:55:41 localhost ceph-mgr[292421]: [cephadm INFO cephadm.serve] Reconfiguring daemon mds.mds.np0005559463.rdpgze on np0005559463.localdomain
Dec 15 04:55:41 localhost ceph-mgr[292421]: log_channel(cephadm) log [INF] : Reconfiguring daemon mds.mds.np0005559463.rdpgze on np0005559463.localdomain
Dec 15 04:55:41 localhost podman[308900]: 2025-12-15 09:55:41.767752564 +0000 UTC m=+0.093568508 container health_status a4e94bd7aaee6c174c601060fb5f34559019ccea93fc397263ee3735731cfc1e (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, container_name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter)
Dec 15 04:55:41 localhost podman[308900]: 2025-12-15 09:55:41.78271769 +0000 UTC m=+0.108533634 container exec_died a4e94bd7aaee6c174c601060fb5f34559019ccea93fc397263ee3735731cfc1e (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter)
Dec 15 04:55:41 localhost systemd[1]: a4e94bd7aaee6c174c601060fb5f34559019ccea93fc397263ee3735731cfc1e.service: Deactivated successfully.
Dec 15 04:55:42 localhost ceph-mon[298913]: Reconfiguring osd.5 (monmap changed)...
Dec 15 04:55:42 localhost ceph-mon[298913]: Reconfiguring daemon osd.5 on np0005559463.localdomain
Dec 15 04:55:42 localhost ceph-mon[298913]: from='mgr.26879 172.18.0.106:0/2534740212' entity='mgr.np0005559462.fudvyx'
Dec 15 04:55:42 localhost ceph-mon[298913]: from='mgr.26879 172.18.0.106:0/2534740212' entity='mgr.np0005559462.fudvyx'
Dec 15 04:55:42 localhost ceph-mon[298913]: from='mgr.26879 172.18.0.106:0/2534740212' entity='mgr.np0005559462.fudvyx' cmd={"prefix": "auth get-or-create", "entity": "mds.mds.np0005559463.rdpgze", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]} : dispatch
Dec 15 04:55:42 localhost ceph-mon[298913]: mon.np0005559462@0(leader) e16 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005559464.localdomain.devices.0}] v 0)
Dec 15 04:55:42 localhost ceph-mon[298913]: log_channel(audit) log [INF] : from='mgr.26879 172.18.0.106:0/2534740212' entity='mgr.np0005559462.fudvyx'
Dec 15 04:55:42 localhost ceph-mon[298913]: mon.np0005559462@0(leader) e16 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005559464.localdomain}] v 0)
Dec 15 04:55:42 localhost ceph-mon[298913]: log_channel(audit) log [INF] : from='mgr.26879 172.18.0.106:0/2534740212' entity='mgr.np0005559462.fudvyx'
Dec 15 04:55:42 localhost ceph-mon[298913]: mon.np0005559462@0(leader) e16 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005559463.localdomain.devices.0}] v 0)
Dec 15 04:55:42 localhost ceph-mon[298913]: log_channel(audit) log [INF] : from='mgr.26879 172.18.0.106:0/2534740212' entity='mgr.np0005559462.fudvyx'
Dec 15 04:55:42 localhost ceph-mon[298913]: mon.np0005559462@0(leader) e16 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005559463.localdomain}] v 0)
Dec 15 04:55:42 localhost ceph-mon[298913]: log_channel(audit) log [INF] : from='mgr.26879 172.18.0.106:0/2534740212' entity='mgr.np0005559462.fudvyx'
Dec 15 04:55:42 localhost ceph-mgr[292421]: [cephadm INFO cephadm.serve] Reconfiguring mgr.np0005559463.daptkf (monmap changed)...
Dec 15 04:55:42 localhost ceph-mgr[292421]: log_channel(cephadm) log [INF] : Reconfiguring mgr.np0005559463.daptkf (monmap changed)...
Dec 15 04:55:42 localhost ceph-mon[298913]: mon.np0005559462@0(leader) e16 handle_command mon_command({"prefix": "auth get-or-create", "entity": "mgr.np0005559463.daptkf", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} v 0)
Dec 15 04:55:42 localhost ceph-mon[298913]: log_channel(audit) log [INF] : from='mgr.26879 172.18.0.106:0/2534740212' entity='mgr.np0005559462.fudvyx' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005559463.daptkf", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch
Dec 15 04:55:42 localhost ceph-mon[298913]: mon.np0005559462@0(leader) e16 handle_command mon_command({"prefix": "mgr services"} v 0)
Dec 15 04:55:42 localhost ceph-mon[298913]: log_channel(audit) log [DBG] : from='mgr.26879 172.18.0.106:0/2534740212' entity='mgr.np0005559462.fudvyx' cmd={"prefix": "mgr services"} : dispatch
Dec 15 04:55:42 localhost ceph-mon[298913]: mon.np0005559462@0(leader) e16 handle_command mon_command({"prefix": "config generate-minimal-conf"} v 0)
Dec 15 04:55:42 localhost ceph-mon[298913]: log_channel(audit) log [DBG] : from='mgr.26879 172.18.0.106:0/2534740212' entity='mgr.np0005559462.fudvyx' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 15 04:55:42 localhost ceph-mgr[292421]: [cephadm INFO cephadm.serve] Reconfiguring daemon mgr.np0005559463.daptkf on np0005559463.localdomain
Dec 15 04:55:42 localhost ceph-mgr[292421]: log_channel(cephadm) log [INF] : Reconfiguring daemon mgr.np0005559463.daptkf on np0005559463.localdomain
Dec 15 04:55:42 localhost ceph-mgr[292421]: log_channel(cluster) log [DBG] : pgmap v45: 177 pgs: 177 active+clean; 104 MiB data, 548 MiB used, 41 GiB / 42 GiB avail
Dec 15 04:55:43 localhost ceph-mon[298913]: Reconfiguring mds.mds.np0005559463.rdpgze (monmap changed)...
Dec 15 04:55:43 localhost ceph-mon[298913]: Reconfiguring daemon mds.mds.np0005559463.rdpgze on np0005559463.localdomain
Dec 15 04:55:43 localhost ceph-mon[298913]: from='mgr.26879 172.18.0.106:0/2534740212' entity='mgr.np0005559462.fudvyx'
Dec 15 04:55:43 localhost ceph-mon[298913]: from='mgr.26879 172.18.0.106:0/2534740212' entity='mgr.np0005559462.fudvyx'
Dec 15 04:55:43 localhost ceph-mon[298913]: from='mgr.26879 172.18.0.106:0/2534740212' entity='mgr.np0005559462.fudvyx'
Dec 15 04:55:43 localhost ceph-mon[298913]: from='mgr.26879 172.18.0.106:0/2534740212' entity='mgr.np0005559462.fudvyx'
Dec 15 04:55:43 localhost ceph-mon[298913]: Reconfiguring mgr.np0005559463.daptkf (monmap changed)...
Dec 15 04:55:43 localhost ceph-mon[298913]: from='mgr.26879 172.18.0.106:0/2534740212' entity='mgr.np0005559462.fudvyx' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005559463.daptkf", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch
Dec 15 04:55:43 localhost ceph-mon[298913]: Reconfiguring daemon mgr.np0005559463.daptkf on np0005559463.localdomain
Dec 15 04:55:43 localhost ceph-mon[298913]: mon.np0005559462@0(leader) e16 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005559463.localdomain.devices.0}] v 0)
Dec 15 04:55:43 localhost ceph-mon[298913]: log_channel(audit) log [INF] : from='mgr.26879 172.18.0.106:0/2534740212' entity='mgr.np0005559462.fudvyx'
Dec 15 04:55:43 localhost ceph-mon[298913]: mon.np0005559462@0(leader) e16 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005559463.localdomain}] v 0)
Dec 15 04:55:43 localhost ceph-mon[298913]: log_channel(audit) log [INF] : from='mgr.26879 172.18.0.106:0/2534740212' entity='mgr.np0005559462.fudvyx'
Dec 15 04:55:43 localhost ceph-mgr[292421]: [cephadm INFO cephadm.serve] Reconfiguring crash.np0005559464 (monmap changed)...
Dec 15 04:55:43 localhost ceph-mgr[292421]: log_channel(cephadm) log [INF] : Reconfiguring crash.np0005559464 (monmap changed)...
Dec 15 04:55:43 localhost ceph-mon[298913]: mon.np0005559462@0(leader) e16 handle_command mon_command({"prefix": "auth get-or-create", "entity": "client.crash.np0005559464.localdomain", "caps": ["mon", "profile crash", "mgr", "profile crash"]} v 0)
Dec 15 04:55:43 localhost ceph-mon[298913]: log_channel(audit) log [INF] : from='mgr.26879 172.18.0.106:0/2534740212' entity='mgr.np0005559462.fudvyx' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005559464.localdomain", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch
Dec 15 04:55:43 localhost ceph-mon[298913]: mon.np0005559462@0(leader) e16 handle_command mon_command({"prefix": "config generate-minimal-conf"} v 0)
Dec 15 04:55:43 localhost ceph-mon[298913]: log_channel(audit) log [DBG] : from='mgr.26879 172.18.0.106:0/2534740212' entity='mgr.np0005559462.fudvyx' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 15 04:55:43 localhost ceph-mgr[292421]: [cephadm INFO cephadm.serve] Reconfiguring daemon crash.np0005559464 on np0005559464.localdomain
Dec 15 04:55:43 localhost ceph-mgr[292421]: log_channel(cephadm) log [INF] : Reconfiguring daemon crash.np0005559464 on np0005559464.localdomain
Dec 15 04:55:44 localhost ceph-mon[298913]: mon.np0005559462@0(leader) e16 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005559464.localdomain.devices.0}] v 0)
Dec 15 04:55:44 localhost ceph-mon[298913]: log_channel(audit) log [INF] : from='mgr.26879 172.18.0.106:0/2534740212' entity='mgr.np0005559462.fudvyx'
Dec 15 04:55:44 localhost ceph-mon[298913]: mon.np0005559462@0(leader) e16 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005559464.localdomain}] v 0)
Dec 15 04:55:44 localhost ceph-mon[298913]: log_channel(audit) log [INF] : from='mgr.26879 172.18.0.106:0/2534740212' entity='mgr.np0005559462.fudvyx'
Dec 15 04:55:44 localhost ceph-mgr[292421]: [cephadm INFO cephadm.serve] Reconfiguring osd.1 (monmap changed)...
Dec 15 04:55:44 localhost ceph-mgr[292421]: log_channel(cephadm) log [INF] : Reconfiguring osd.1 (monmap changed)...
Dec 15 04:55:44 localhost ceph-mon[298913]: mon.np0005559462@0(leader) e16 handle_command mon_command({"prefix": "auth get", "entity": "osd.1"} v 0)
Dec 15 04:55:44 localhost ceph-mon[298913]: log_channel(audit) log [INF] : from='mgr.26879 172.18.0.106:0/2534740212' entity='mgr.np0005559462.fudvyx' cmd={"prefix": "auth get", "entity": "osd.1"} : dispatch
Dec 15 04:55:44 localhost ceph-mon[298913]: mon.np0005559462@0(leader) e16 handle_command mon_command({"prefix": "config generate-minimal-conf"} v 0)
Dec 15 04:55:44 localhost ceph-mon[298913]: log_channel(audit) log [DBG] : from='mgr.26879 172.18.0.106:0/2534740212' entity='mgr.np0005559462.fudvyx' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 15 04:55:44 localhost ceph-mgr[292421]: [cephadm INFO cephadm.serve] Reconfiguring daemon osd.1 on np0005559464.localdomain
Dec 15 04:55:44 localhost ceph-mgr[292421]: log_channel(cephadm) log [INF] : Reconfiguring daemon osd.1 on np0005559464.localdomain
Dec 15 04:55:44 localhost ceph-mon[298913]: from='mgr.26879 172.18.0.106:0/2534740212' entity='mgr.np0005559462.fudvyx'
Dec 15 04:55:44 localhost ceph-mon[298913]: from='mgr.26879 172.18.0.106:0/2534740212' entity='mgr.np0005559462.fudvyx'
Dec 15 04:55:44 localhost ceph-mon[298913]: Reconfiguring crash.np0005559464 (monmap changed)...
Dec 15 04:55:44 localhost ceph-mon[298913]: from='mgr.26879 172.18.0.106:0/2534740212' entity='mgr.np0005559462.fudvyx' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005559464.localdomain", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch
Dec 15 04:55:44 localhost ceph-mon[298913]: Reconfiguring daemon crash.np0005559464 on np0005559464.localdomain
Dec 15 04:55:44 localhost ceph-mon[298913]: from='mgr.26879 172.18.0.106:0/2534740212' entity='mgr.np0005559462.fudvyx'
Dec 15 04:55:44 localhost ceph-mon[298913]: from='mgr.26879 172.18.0.106:0/2534740212' entity='mgr.np0005559462.fudvyx'
Dec 15 04:55:44 localhost ceph-mon[298913]: from='mgr.26879 172.18.0.106:0/2534740212' entity='mgr.np0005559462.fudvyx' cmd={"prefix": "auth get", "entity": "osd.1"} : dispatch
Dec 15 04:55:44 localhost ceph-mgr[292421]: log_channel(cluster) log [DBG] : pgmap v46: 177 pgs: 177 active+clean; 104 MiB data, 548 MiB used, 41 GiB / 42 GiB avail
Dec 15 04:55:45 localhost ceph-mon[298913]: mon.np0005559462@0(leader).osd e88 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 15 04:55:45 localhost ceph-mon[298913]: mon.np0005559462@0(leader) e16 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005559464.localdomain.devices.0}] v 0)
Dec 15 04:55:45 localhost ceph-mon[298913]: log_channel(audit) log [INF] : from='mgr.26879 172.18.0.106:0/2534740212' entity='mgr.np0005559462.fudvyx'
Dec 15 04:55:45 localhost ceph-mon[298913]: mon.np0005559462@0(leader) e16 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005559464.localdomain}] v 0)
Dec 15 04:55:45 localhost ceph-mon[298913]: log_channel(audit) log [INF] : from='mgr.26879 172.18.0.106:0/2534740212' entity='mgr.np0005559462.fudvyx'
Dec 15 04:55:45 localhost ceph-mgr[292421]: [cephadm INFO cephadm.serve] Reconfiguring osd.4 (monmap changed)...
Dec 15 04:55:45 localhost ceph-mgr[292421]: log_channel(cephadm) log [INF] : Reconfiguring osd.4 (monmap changed)...
Dec 15 04:55:45 localhost ceph-mon[298913]: mon.np0005559462@0(leader) e16 handle_command mon_command({"prefix": "auth get", "entity": "osd.4"} v 0)
Dec 15 04:55:45 localhost ceph-mon[298913]: log_channel(audit) log [INF] : from='mgr.26879 172.18.0.106:0/2534740212' entity='mgr.np0005559462.fudvyx' cmd={"prefix": "auth get", "entity": "osd.4"} : dispatch
Dec 15 04:55:45 localhost ceph-mon[298913]: mon.np0005559462@0(leader) e16 handle_command mon_command({"prefix": "config generate-minimal-conf"} v 0)
Dec 15 04:55:45 localhost ceph-mon[298913]: log_channel(audit) log [DBG] : from='mgr.26879 172.18.0.106:0/2534740212' entity='mgr.np0005559462.fudvyx' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 15 04:55:45 localhost ceph-mon[298913]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #31. Immutable memtables: 0.
Dec 15 04:55:45 localhost ceph-mon[298913]: rocksdb: (Original Log Time 2025/12/15-09:55:45.259273) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Dec 15 04:55:45 localhost ceph-mon[298913]: rocksdb: [db/flush_job.cc:856] [default] [JOB 15] Flushing memtable with next log file: 31
Dec 15 04:55:45 localhost ceph-mon[298913]: rocksdb: EVENT_LOG_v1 {"time_micros": 1765792545259323, "job": 15, "event": "flush_started", "num_memtables": 1, "num_entries": 1767, "num_deletes": 252, "total_data_size": 2762319, "memory_usage": 2816856, "flush_reason": "Manual Compaction"}
Dec 15 04:55:45 localhost ceph-mon[298913]: rocksdb: [db/flush_job.cc:885] [default] [JOB 15] Level-0 flush table #32: started
Dec 15 04:55:45 localhost ceph-mgr[292421]: [cephadm INFO cephadm.serve] Reconfiguring daemon osd.4 on np0005559464.localdomain
Dec 15 04:55:45 localhost ceph-mgr[292421]: log_channel(cephadm) log [INF] : Reconfiguring daemon osd.4 on np0005559464.localdomain
Dec 15 04:55:45 localhost ceph-mon[298913]: rocksdb: EVENT_LOG_v1 {"time_micros": 1765792545273735, "cf_name": "default", "job": 15, "event": "table_file_creation", "file_number": 32, "file_size": 1846149, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 18189, "largest_seqno": 19950, "table_properties": {"data_size": 1838267, "index_size": 4455, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 2373, "raw_key_size": 20973, "raw_average_key_size": 22, "raw_value_size": 1820889, "raw_average_value_size": 1957, "num_data_blocks": 196, "num_entries": 930, "num_filter_entries": 930, "num_deletions": 252, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1765792503, "oldest_key_time": 1765792503, "file_creation_time": 1765792545, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "603b24af-e2be-4214-bc56-9e652eb4af3d", "db_session_id": "0OJRM9SCUA16EXV0VQZ2", "orig_file_number": 32, "seqno_to_time_mapping": "N/A"}}
Dec 15 04:55:45 localhost ceph-mon[298913]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 15] Flush lasted 14512 microseconds, and 5151 cpu microseconds.
Dec 15 04:55:45 localhost ceph-mon[298913]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Dec 15 04:55:45 localhost ceph-mon[298913]: rocksdb: (Original Log Time 2025/12/15-09:55:45.273783) [db/flush_job.cc:967] [default] [JOB 15] Level-0 flush table #32: 1846149 bytes OK
Dec 15 04:55:45 localhost ceph-mon[298913]: rocksdb: (Original Log Time 2025/12/15-09:55:45.273806) [db/memtable_list.cc:519] [default] Level-0 commit table #32 started
Dec 15 04:55:45 localhost ceph-mon[298913]: rocksdb: (Original Log Time 2025/12/15-09:55:45.275571) [db/memtable_list.cc:722] [default] Level-0 commit table #32: memtable #1 done
Dec 15 04:55:45 localhost ceph-mon[298913]: rocksdb: (Original Log Time 2025/12/15-09:55:45.275591) EVENT_LOG_v1 {"time_micros": 1765792545275585, "job": 15, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Dec 15 04:55:45 localhost ceph-mon[298913]: rocksdb: (Original Log Time 2025/12/15-09:55:45.275612) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Dec 15 04:55:45 localhost ceph-mon[298913]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 15] Try to delete WAL files size 2753574, prev total WAL file size 2753574, number of live WAL files 2.
Dec 15 04:55:45 localhost ceph-mon[298913]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005559462/store.db/000028.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Dec 15 04:55:45 localhost ceph-mon[298913]: rocksdb: (Original Log Time 2025/12/15-09:55:45.276431) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '7061786F73003131303434' seq:72057594037927935, type:22 .. '7061786F73003131323936' seq:0, type:0; will stop at (end)
Dec 15 04:55:45 localhost ceph-mon[298913]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 16] Compacting 1@0 + 1@6 files to L6, score -1.00
Dec 15 04:55:45 localhost ceph-mon[298913]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 15 Base level 0, inputs: [32(1802KB)], [30(18MB)]
Dec 15 04:55:45 localhost ceph-mon[298913]: rocksdb: EVENT_LOG_v1 {"time_micros": 1765792545276491, "job": 16, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [32], "files_L6": [30], "score": -1, "input_data_size": 21614537, "oldest_snapshot_seqno": -1}
Dec 15 04:55:45 localhost ceph-mon[298913]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 16] Generated table #33: 11637 keys, 17167944 bytes, temperature: kUnknown
Dec 15 04:55:45 localhost ceph-mon[298913]: rocksdb: EVENT_LOG_v1 {"time_micros": 1765792545373874, "cf_name": "default", "job": 16, "event": "table_file_creation", "file_number": 33, "file_size": 17167944, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 17100132, "index_size": 37631, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 29125, "raw_key_size": 312981, "raw_average_key_size": 26, "raw_value_size": 16899953, "raw_average_value_size": 1452, "num_data_blocks": 1432, "num_entries": 11637, "num_filter_entries": 11637, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1765792320, "oldest_key_time": 0, "file_creation_time": 1765792545, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "603b24af-e2be-4214-bc56-9e652eb4af3d", "db_session_id": "0OJRM9SCUA16EXV0VQZ2", "orig_file_number": 33, "seqno_to_time_mapping": "N/A"}}
Dec 15 04:55:45 localhost ceph-mon[298913]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Dec 15 04:55:45 localhost ceph-mon[298913]: rocksdb: (Original Log Time 2025/12/15-09:55:45.374308) [db/compaction/compaction_job.cc:1663] [default] [JOB 16] Compacted 1@0 + 1@6 files to L6 => 17167944 bytes
Dec 15 04:55:45 localhost ceph-mon[298913]: rocksdb: (Original Log Time 2025/12/15-09:55:45.375924) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 221.5 rd, 176.0 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(1.8, 18.9 +0.0 blob) out(16.4 +0.0 blob), read-write-amplify(21.0) write-amplify(9.3) OK, records in: 12178, records dropped: 541 output_compression: NoCompression
Dec 15 04:55:45 localhost ceph-mon[298913]: rocksdb: (Original Log Time 2025/12/15-09:55:45.376060) EVENT_LOG_v1 {"time_micros": 1765792545376043, "job": 16, "event": "compaction_finished", "compaction_time_micros": 97562, "compaction_time_cpu_micros": 44760, "output_level": 6, "num_output_files": 1, "total_output_size": 17167944, "num_input_records": 12178, "num_output_records": 11637, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Dec 15 04:55:45 localhost ceph-mon[298913]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005559462/store.db/000032.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Dec 15 04:55:45 localhost ceph-mon[298913]: rocksdb: EVENT_LOG_v1 {"time_micros": 1765792545376484, "job": 16, "event": "table_file_deletion", "file_number": 32}
Dec 15 04:55:45 localhost ceph-mon[298913]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005559462/store.db/000030.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Dec 15 04:55:45 localhost ceph-mon[298913]: rocksdb: EVENT_LOG_v1 {"time_micros": 1765792545379590, "job": 16, "event": "table_file_deletion", "file_number": 30}
Dec 15 04:55:45 localhost ceph-mon[298913]: rocksdb: (Original Log Time 2025/12/15-09:55:45.276341) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec 15 04:55:45 localhost ceph-mon[298913]: rocksdb: (Original Log Time 2025/12/15-09:55:45.379643) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec 15 04:55:45 localhost ceph-mon[298913]: rocksdb: (Original Log Time 2025/12/15-09:55:45.379648) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec 15 04:55:45 localhost ceph-mon[298913]: rocksdb: (Original Log Time 2025/12/15-09:55:45.379650) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec 15 04:55:45 localhost ceph-mon[298913]: rocksdb: (Original Log Time 2025/12/15-09:55:45.379652) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec 15 04:55:45 localhost ceph-mon[298913]: rocksdb: (Original Log Time 2025/12/15-09:55:45.379654) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec 15 04:55:45 localhost nova_compute[286344]: 2025-12-15 09:55:45.840 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Dec 15 04:55:46 localhost ceph-mon[298913]: Reconfiguring osd.1 (monmap changed)...
Dec 15 04:55:46 localhost ceph-mon[298913]: Reconfiguring daemon osd.1 on np0005559464.localdomain
Dec 15 04:55:46 localhost ceph-mon[298913]: from='mgr.26879 172.18.0.106:0/2534740212' entity='mgr.np0005559462.fudvyx'
Dec 15 04:55:46 localhost ceph-mon[298913]: from='mgr.26879 172.18.0.106:0/2534740212' entity='mgr.np0005559462.fudvyx'
Dec 15 04:55:46 localhost ceph-mon[298913]: from='mgr.26879 172.18.0.106:0/2534740212' entity='mgr.np0005559462.fudvyx' cmd={"prefix": "auth get", "entity": "osd.4"} : dispatch
Dec 15 04:55:46 localhost ceph-mon[298913]: mon.np0005559462@0(leader) e16 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005559464.localdomain.devices.0}] v 0)
Dec 15 04:55:46 localhost ceph-mon[298913]: log_channel(audit) log [INF] : from='mgr.26879 172.18.0.106:0/2534740212' entity='mgr.np0005559462.fudvyx'
Dec 15 04:55:46 localhost ceph-mon[298913]: mon.np0005559462@0(leader) e16 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005559464.localdomain}] v 0)
Dec 15 04:55:46 localhost ceph-mon[298913]: log_channel(audit) log [INF] : from='mgr.26879 172.18.0.106:0/2534740212' entity='mgr.np0005559462.fudvyx'
Dec 15 04:55:46 localhost ceph-mgr[292421]: log_channel(cluster) log [DBG] : pgmap v47: 177 pgs: 177 active+clean; 104 MiB data, 548 MiB used, 41 GiB / 42 GiB avail
Dec 15 04:55:47 localhost ceph-mon[298913]: Reconfiguring osd.4 (monmap changed)...
Dec 15 04:55:47 localhost ceph-mon[298913]: Reconfiguring daemon osd.4 on np0005559464.localdomain Dec 15 04:55:47 localhost ceph-mon[298913]: from='mgr.26879 172.18.0.106:0/2534740212' entity='mgr.np0005559462.fudvyx' Dec 15 04:55:47 localhost ceph-mon[298913]: from='mgr.26879 172.18.0.106:0/2534740212' entity='mgr.np0005559462.fudvyx' Dec 15 04:55:47 localhost ceph-mon[298913]: mon.np0005559462@0(leader) e16 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005559464.localdomain.devices.0}] v 0) Dec 15 04:55:47 localhost ceph-mon[298913]: log_channel(audit) log [INF] : from='mgr.26879 172.18.0.106:0/2534740212' entity='mgr.np0005559462.fudvyx' Dec 15 04:55:47 localhost ceph-mon[298913]: mon.np0005559462@0(leader) e16 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005559464.localdomain}] v 0) Dec 15 04:55:47 localhost ceph-mon[298913]: log_channel(audit) log [INF] : from='mgr.26879 172.18.0.106:0/2534740212' entity='mgr.np0005559462.fudvyx' Dec 15 04:55:47 localhost ceph-mon[298913]: mon.np0005559462@0(leader) e16 handle_command mon_command({"prefix": "config generate-minimal-conf"} v 0) Dec 15 04:55:47 localhost ceph-mon[298913]: log_channel(audit) log [DBG] : from='mgr.26879 172.18.0.106:0/2534740212' entity='mgr.np0005559462.fudvyx' cmd={"prefix": "config generate-minimal-conf"} : dispatch Dec 15 04:55:47 localhost ceph-mon[298913]: mon.np0005559462@0(leader) e16 handle_command mon_command({"prefix": "auth get", "entity": "client.admin"} v 0) Dec 15 04:55:47 localhost ceph-mon[298913]: log_channel(audit) log [INF] : from='mgr.26879 172.18.0.106:0/2534740212' entity='mgr.np0005559462.fudvyx' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch Dec 15 04:55:47 localhost ceph-mon[298913]: mon.np0005559462@0(leader) e16 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/osd_remove_queue}] v 0) Dec 15 04:55:47 localhost ceph-mon[298913]: log_channel(audit) log [INF] : from='mgr.26879 
172.18.0.106:0/2534740212' entity='mgr.np0005559462.fudvyx' Dec 15 04:55:47 localhost ceph-mgr[292421]: [progress INFO root] update: starting ev 19d1e734-af70-48a6-abfd-b53e43c9f17b (Updating node-proxy deployment (+3 -> 3)) Dec 15 04:55:47 localhost ceph-mgr[292421]: [progress INFO root] complete: finished ev 19d1e734-af70-48a6-abfd-b53e43c9f17b (Updating node-proxy deployment (+3 -> 3)) Dec 15 04:55:47 localhost ceph-mgr[292421]: [progress INFO root] Completed event 19d1e734-af70-48a6-abfd-b53e43c9f17b (Updating node-proxy deployment (+3 -> 3)) in 0 seconds Dec 15 04:55:47 localhost ceph-mon[298913]: mon.np0005559462@0(leader) e16 handle_command mon_command({"prefix": "osd tree", "states": ["destroyed"], "format": "json"} v 0) Dec 15 04:55:47 localhost ceph-mon[298913]: log_channel(audit) log [DBG] : from='mgr.26879 172.18.0.106:0/2534740212' entity='mgr.np0005559462.fudvyx' cmd={"prefix": "osd tree", "states": ["destroyed"], "format": "json"} : dispatch Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.121 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359', 'name': 'test', 'flavor': {'id': '2da0e147-aaa7-4bb9-a176-5fe1b15a32a0', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'image': {'id': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-00000002', 'OS-EXT-SRV-ATTR:host': 'np0005559462.localdomain', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': 'c785bf23f53946bc99867d8832a50266', 'user_id': '1ba5fce347b64bfebf995f187193f205', 'hostId': 'bf4d507aa00b3fcf643a7e0b883420632ac7fc96e8c7a756982e121d', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228 Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.122 12 INFO ceilometer.polling.manager [-] Polling pollster 
disk.device.allocation in the context of pollsters Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.134 12 DEBUG ceilometer.compute.pollsters [-] 39ff1bd9-6f6b-44c8-bbec-a1fd9d196359/disk.device.allocation volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.135 12 DEBUG ceilometer.compute.pollsters [-] 39ff1bd9-6f6b-44c8-bbec-a1fd9d196359/disk.device.allocation volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.137 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '7b4503da-891c-4e9c-9bde-46293c12780b', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '1ba5fce347b64bfebf995f187193f205', 'user_name': None, 'project_id': 'c785bf23f53946bc99867d8832a50266', 'project_name': None, 'resource_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359-vda', 'timestamp': '2025-12-15T09:55:48.122268', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359', 'instance_type': 'm1.small', 'host': 'bf4d507aa00b3fcf643a7e0b883420632ac7fc96e8c7a756982e121d', 'instance_host': 'np0005559462.localdomain', 'flavor': {'id': '2da0e147-aaa7-4bb9-a176-5fe1b15a32a0', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e'}, 'image_ref': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e', 'image_ref_url': None, 
'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '3b4a9068-d99c-11f0-817e-fa163ebaca0f', 'monotonic_time': 11614.314935328, 'message_signature': '40706fc42fdd235f056d3ca67a5192b570c19a0ba9618dd1747cf3a3d602f581'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '1ba5fce347b64bfebf995f187193f205', 'user_name': None, 'project_id': 'c785bf23f53946bc99867d8832a50266', 'project_name': None, 'resource_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359-vdb', 'timestamp': '2025-12-15T09:55:48.122268', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359', 'instance_type': 'm1.small', 'host': 'bf4d507aa00b3fcf643a7e0b883420632ac7fc96e8c7a756982e121d', 'instance_host': 'np0005559462.localdomain', 'flavor': {'id': '2da0e147-aaa7-4bb9-a176-5fe1b15a32a0', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e'}, 'image_ref': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '3b4aa3f0-d99c-11f0-817e-fa163ebaca0f', 'monotonic_time': 11614.314935328, 'message_signature': 'f7166d49cf9b63c7d32129e60d0c9a149c3aec4fdfc61f850af1e83421b68fe8'}]}, 'timestamp': '2025-12-15 09:55:48.136091', '_unique_id': '91cac42171904f02bbd5ef4919f075fb'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.137 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 15 04:55:48 localhost 
ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.137 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.137 12 ERROR oslo_messaging.notify.messaging yield Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.137 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.137 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.137 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.137 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.137 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.137 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.137 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.137 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.137 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.137 12 ERROR oslo_messaging.notify.messaging conn.connect() Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.137 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.137 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.137 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.137 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.137 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.137 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.137 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.137 12 ERROR oslo_messaging.notify.messaging Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.137 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.137 12 ERROR oslo_messaging.notify.messaging Dec 15 04:55:48 localhost 
ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.137 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.137 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.137 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.137 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.137 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.137 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.137 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.137 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.137 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.137 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection 
Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.137 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.137 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.137 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.137 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.137 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.137 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.137 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.137 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.137 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.137 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Dec 15 04:55:48 
localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.137 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.137 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.137 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.137 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.137 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.137 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.137 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.137 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.137 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.137 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.137 12 ERROR oslo_messaging.notify.messaging Dec 15 04:55:48 localhost 
ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.138 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.error in the context of pollsters Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.142 12 DEBUG ceilometer.compute.pollsters [-] 39ff1bd9-6f6b-44c8-bbec-a1fd9d196359/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.143 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'f0b578ad-92e2-42e1-9694-489dd95df173', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '1ba5fce347b64bfebf995f187193f205', 'user_name': None, 'project_id': 'c785bf23f53946bc99867d8832a50266', 'project_name': None, 'resource_id': 'instance-00000002-39ff1bd9-6f6b-44c8-bbec-a1fd9d196359-tap03ef8889-32', 'timestamp': '2025-12-15T09:55:48.138976', 'resource_metadata': {'display_name': 'test', 'name': 'tap03ef8889-32', 'instance_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359', 'instance_type': 'm1.small', 'host': 'bf4d507aa00b3fcf643a7e0b883420632ac7fc96e8c7a756982e121d', 'instance_host': 'np0005559462.localdomain', 'flavor': {'id': '2da0e147-aaa7-4bb9-a176-5fe1b15a32a0', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e'}, 'image_ref': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:74:df:7c', 
'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap03ef8889-32'}, 'message_id': '3b4ba570-d99c-11f0-817e-fa163ebaca0f', 'monotonic_time': 11614.331678533, 'message_signature': 'a5bacbe980a69034923a60db46a46b165ee6d2e2039345d16d6dc88056979602'}]}, 'timestamp': '2025-12-15 09:55:48.142672', '_unique_id': 'd8f2cdd8184a485191835cb27559f28a'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.143 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.143 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.143 12 ERROR oslo_messaging.notify.messaging yield Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.143 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.143 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.143 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.143 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.143 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 
09:55:48.143 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.143 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.143 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.143 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.143 12 ERROR oslo_messaging.notify.messaging conn.connect() Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.143 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.143 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.143 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.143 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.143 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.143 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Dec 15 04:55:48 localhost 
ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.143 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.143 12 ERROR oslo_messaging.notify.messaging Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.143 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.143 12 ERROR oslo_messaging.notify.messaging Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.143 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.143 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.143 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.143 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.143 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.143 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.143 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Dec 15 04:55:48 
localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.143 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.143 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.143 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.143 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.143 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.143 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.143 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.143 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.143 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.143 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.143 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.143 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.143 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.143 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.143 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.143 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.143 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.143 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.143 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.143 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.143 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.143 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.143 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.143 12 ERROR oslo_messaging.notify.messaging
Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.144 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no new resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.144 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.requests in the context of pollsters
Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.173 12 DEBUG ceilometer.compute.pollsters [-] 39ff1bd9-6f6b-44c8-bbec-a1fd9d196359/disk.device.read.requests volume: 1272 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.173 12 DEBUG ceilometer.compute.pollsters [-] 39ff1bd9-6f6b-44c8-bbec-a1fd9d196359/disk.device.read.requests volume: 124 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.175 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '6d1b8563-71d2-4b0c-8f56-2f067317bf3e', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1272, 'user_id': '1ba5fce347b64bfebf995f187193f205', 'user_name': None, 'project_id': 'c785bf23f53946bc99867d8832a50266', 'project_name': None, 'resource_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359-vda', 'timestamp': '2025-12-15T09:55:48.145106', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359', 'instance_type': 'm1.small', 'host': 'bf4d507aa00b3fcf643a7e0b883420632ac7fc96e8c7a756982e121d', 'instance_host': 'np0005559462.localdomain', 'flavor': {'id': '2da0e147-aaa7-4bb9-a176-5fe1b15a32a0', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e'}, 'image_ref': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '3b5067a4-d99c-11f0-817e-fa163ebaca0f', 'monotonic_time': 11614.337776042, 'message_signature': '1f9503210f14be6ff66a6463f719643723af394d07b3dba798715368a9544249'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 124, 'user_id': '1ba5fce347b64bfebf995f187193f205', 'user_name': None, 'project_id': 'c785bf23f53946bc99867d8832a50266', 'project_name': None, 'resource_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359-vdb', 'timestamp': '2025-12-15T09:55:48.145106', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359', 'instance_type': 'm1.small', 'host': 'bf4d507aa00b3fcf643a7e0b883420632ac7fc96e8c7a756982e121d', 'instance_host': 'np0005559462.localdomain', 'flavor': {'id': '2da0e147-aaa7-4bb9-a176-5fe1b15a32a0', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e'}, 'image_ref': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '3b507a96-d99c-11f0-817e-fa163ebaca0f', 'monotonic_time': 11614.337776042, 'message_signature': '5bbd3c700833698f977719da34b8ccb604cefc6a127c4d214326c50ba11b548a'}]}, 'timestamp': '2025-12-15 09:55:48.174315', '_unique_id': '677369da56d74d99a0037c729252b7df'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.175 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.175 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.175 12 ERROR oslo_messaging.notify.messaging yield
Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.175 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.175 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.175 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.175 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.175 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.175 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.175 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.175 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.175 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.175 12 ERROR oslo_messaging.notify.messaging conn.connect()
Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.175 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.175 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.175 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.175 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.175 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.175 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.175 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.175 12 ERROR oslo_messaging.notify.messaging
Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.175 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.175 12 ERROR oslo_messaging.notify.messaging
Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.175 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.175 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.175 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.175 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.175 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.175 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.175 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.175 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.175 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.175 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.175 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.175 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.175 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.175 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.175 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.175 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.175 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.175 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.175 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.175 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.175 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.175 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.175 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.175 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.175 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.175 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.175 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.175 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.175 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.175 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.175 12 ERROR oslo_messaging.notify.messaging
Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.176 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no new resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.176 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets in the context of pollsters
Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.177 12 DEBUG ceilometer.compute.pollsters [-] 39ff1bd9-6f6b-44c8-bbec-a1fd9d196359/network.outgoing.packets volume: 114 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.178 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'e86a5d49-befb-4745-b65f-0f437e13981f', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 114, 'user_id': '1ba5fce347b64bfebf995f187193f205', 'user_name': None, 'project_id': 'c785bf23f53946bc99867d8832a50266', 'project_name': None, 'resource_id': 'instance-00000002-39ff1bd9-6f6b-44c8-bbec-a1fd9d196359-tap03ef8889-32', 'timestamp': '2025-12-15T09:55:48.177071', 'resource_metadata': {'display_name': 'test', 'name': 'tap03ef8889-32', 'instance_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359', 'instance_type': 'm1.small', 'host': 'bf4d507aa00b3fcf643a7e0b883420632ac7fc96e8c7a756982e121d', 'instance_host': 'np0005559462.localdomain', 'flavor': {'id': '2da0e147-aaa7-4bb9-a176-5fe1b15a32a0', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e'}, 'image_ref': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:74:df:7c', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap03ef8889-32'}, 'message_id': '3b50f85e-d99c-11f0-817e-fa163ebaca0f', 'monotonic_time': 11614.331678533, 'message_signature': '3a290730283ab718f76b60cd0ee22979e418a615ed72ee5fad9d903adf3a9753'}]}, 'timestamp': '2025-12-15 09:55:48.177553', '_unique_id': '356bb7bc1d184789bbb4f0bfda87a9d4'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.178 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.178 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.178 12 ERROR oslo_messaging.notify.messaging yield
Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.178 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.178 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.178 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.178 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.178 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.178 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.178 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.178 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.178 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.178 12 ERROR oslo_messaging.notify.messaging conn.connect()
Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.178 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.178 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.178 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.178 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.178 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.178 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.178 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.178 12 ERROR oslo_messaging.notify.messaging
Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.178 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.178 12 ERROR oslo_messaging.notify.messaging
Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.178 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.178 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.178 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.178 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.178 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.178 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.178 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.178 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.178 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.178 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.178 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.178 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.178 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.178 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.178 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.178 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.178 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.178 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.178 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.178 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.178 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.178 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.178 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.178 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.178 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.178 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.178 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.178 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.178 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.178 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.178 12 ERROR oslo_messaging.notify.messaging
Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.179 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.drop in the context of pollsters
Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.179 12 DEBUG ceilometer.compute.pollsters [-] 39ff1bd9-6f6b-44c8-bbec-a1fd9d196359/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.181 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'c4236123-b548-445d-a0cf-a306cabb7f60', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '1ba5fce347b64bfebf995f187193f205', 'user_name': None, 'project_id': 'c785bf23f53946bc99867d8832a50266', 'project_name': None, 'resource_id': 'instance-00000002-39ff1bd9-6f6b-44c8-bbec-a1fd9d196359-tap03ef8889-32', 'timestamp': '2025-12-15T09:55:48.179705', 'resource_metadata': {'display_name': 'test', 'name': 'tap03ef8889-32', 'instance_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359', 'instance_type': 'm1.small', 'host': 'bf4d507aa00b3fcf643a7e0b883420632ac7fc96e8c7a756982e121d', 'instance_host': 'np0005559462.localdomain', 'flavor': {'id': '2da0e147-aaa7-4bb9-a176-5fe1b15a32a0', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e'}, 'image_ref': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:74:df:7c', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap03ef8889-32'}, 'message_id': '3b515e8e-d99c-11f0-817e-fa163ebaca0f', 'monotonic_time': 11614.331678533, 'message_signature': '3c82f1cfacc728afeb456b964934c7dbea58b0a8570b60d58b2815973dd87e43'}]}, 'timestamp': '2025-12-15 09:55:48.180198', '_unique_id': '1bf0568d688d4fb68671eac52481577e'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.181 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.181 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.181 12 ERROR oslo_messaging.notify.messaging yield
Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.181 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.181 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.181 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.181 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.181 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.181 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.181 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.181 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.181 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.181 12 ERROR oslo_messaging.notify.messaging conn.connect()
Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.181 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.181 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.181 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.181 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.181 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.181 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.181 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.181 12 ERROR oslo_messaging.notify.messaging
Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.181 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.181 12 ERROR oslo_messaging.notify.messaging
Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.181 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.181 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.181 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.181 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.181 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.181 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.181 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.181 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.181 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.181 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.181 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.181 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.181 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.181 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.181 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.181 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 15 04:55:48 localhost
ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.181 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.181 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.181 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.181 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.181 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.181 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.181 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.181 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.181 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.181 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.181 12 ERROR 
oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.181 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.181 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.181 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.181 12 ERROR oslo_messaging.notify.messaging Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.182 12 INFO ceilometer.polling.manager [-] Polling pollster cpu in the context of pollsters Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.198 12 DEBUG ceilometer.compute.pollsters [-] 39ff1bd9-6f6b-44c8-bbec-a1fd9d196359/cpu volume: 12200000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.199 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': '5b5ac23f-77f0-4666-946e-e50889a53837', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 12200000000, 'user_id': '1ba5fce347b64bfebf995f187193f205', 'user_name': None, 'project_id': 'c785bf23f53946bc99867d8832a50266', 'project_name': None, 'resource_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359', 'timestamp': '2025-12-15T09:55:48.182287', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359', 'instance_type': 'm1.small', 'host': 'bf4d507aa00b3fcf643a7e0b883420632ac7fc96e8c7a756982e121d', 'instance_host': 'np0005559462.localdomain', 'flavor': {'id': '2da0e147-aaa7-4bb9-a176-5fe1b15a32a0', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e'}, 'image_ref': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'cpu_number': 1}, 'message_id': '3b543992-d99c-11f0-817e-fa163ebaca0f', 'monotonic_time': 11614.390924867, 'message_signature': 'f1ec7851f48509a1c8329ecc4bd4fc51e3f8eed325fcdad47886e08c4ceb9ceb'}]}, 'timestamp': '2025-12-15 09:55:48.198890', '_unique_id': '57808a00489f4af3bdecde72cb170ef0'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.199 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.199 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in 
_reraise_as_library_errors Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.199 12 ERROR oslo_messaging.notify.messaging yield Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.199 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.199 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.199 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.199 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.199 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.199 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.199 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.199 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.199 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.199 12 
ERROR oslo_messaging.notify.messaging conn.connect() Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.199 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.199 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.199 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.199 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.199 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.199 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.199 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.199 12 ERROR oslo_messaging.notify.messaging Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.199 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.199 12 ERROR oslo_messaging.notify.messaging Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.199 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 
2025-12-15 09:55:48.199 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.199 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.199 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.199 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.199 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.199 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.199 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.199 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.199 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.199 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Dec 15 
04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.199 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.199 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.199 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.199 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.199 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.199 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.199 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.199 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.199 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.199 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Dec 15 04:55:48 localhost 
ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.199 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.199 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.199 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.199 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.199 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.199 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.199 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.199 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.199 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.199 12 ERROR oslo_messaging.notify.messaging Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.201 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.usage in the context of pollsters Dec 15 04:55:48 localhost 
ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.201 12 DEBUG ceilometer.compute.pollsters [-] 39ff1bd9-6f6b-44c8-bbec-a1fd9d196359/disk.device.usage volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.201 12 DEBUG ceilometer.compute.pollsters [-] 39ff1bd9-6f6b-44c8-bbec-a1fd9d196359/disk.device.usage volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.203 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '59d770d1-3b3b-4257-a9e6-c55407f60431', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '1ba5fce347b64bfebf995f187193f205', 'user_name': None, 'project_id': 'c785bf23f53946bc99867d8832a50266', 'project_name': None, 'resource_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359-vda', 'timestamp': '2025-12-15T09:55:48.201226', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359', 'instance_type': 'm1.small', 'host': 'bf4d507aa00b3fcf643a7e0b883420632ac7fc96e8c7a756982e121d', 'instance_host': 'np0005559462.localdomain', 'flavor': {'id': '2da0e147-aaa7-4bb9-a176-5fe1b15a32a0', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e'}, 'image_ref': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 
'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '3b54a706-d99c-11f0-817e-fa163ebaca0f', 'monotonic_time': 11614.314935328, 'message_signature': '82b4ff8ca108e68ae31dd23511520550380a5c90c3d2eaf3a8ccf61b9cd7a8db'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '1ba5fce347b64bfebf995f187193f205', 'user_name': None, 'project_id': 'c785bf23f53946bc99867d8832a50266', 'project_name': None, 'resource_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359-vdb', 'timestamp': '2025-12-15T09:55:48.201226', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359', 'instance_type': 'm1.small', 'host': 'bf4d507aa00b3fcf643a7e0b883420632ac7fc96e8c7a756982e121d', 'instance_host': 'np0005559462.localdomain', 'flavor': {'id': '2da0e147-aaa7-4bb9-a176-5fe1b15a32a0', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e'}, 'image_ref': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '3b54b6ec-d99c-11f0-817e-fa163ebaca0f', 'monotonic_time': 11614.314935328, 'message_signature': 'a6f096ba3f2df89a5f9246f22b6bfdbca72f7c9e1bd7061fb88a3617e78053ca'}]}, 'timestamp': '2025-12-15 09:55:48.202085', '_unique_id': 'bc473c66ca90435ea870575761546cde'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.203 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.203 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.203 12 ERROR oslo_messaging.notify.messaging yield Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.203 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.203 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.203 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.203 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.203 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.203 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.203 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.203 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.203 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Dec 15 04:55:48 
localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.203 12 ERROR oslo_messaging.notify.messaging conn.connect() Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.203 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.203 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.203 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.203 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.203 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.203 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.203 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.203 12 ERROR oslo_messaging.notify.messaging Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.203 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.203 12 ERROR oslo_messaging.notify.messaging Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.203 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call 
last): Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.203 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.203 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.203 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.203 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.203 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.203 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.203 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.203 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.203 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.203 12 ERROR oslo_messaging.notify.messaging 
return rpc_common.ConnectionContext(self._connection_pool,
Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.203 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.203 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.203 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.203 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.203 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.203 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.203 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.203 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.203 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.203 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.203 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.203 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.203 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.203 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.203 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.203 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.203 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.203 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.203 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.203 12 ERROR oslo_messaging.notify.messaging
Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.204 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.bytes in the context of pollsters
Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.204 12 DEBUG ceilometer.compute.pollsters [-] 39ff1bd9-6f6b-44c8-bbec-a1fd9d196359/disk.device.write.bytes volume: 389120 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.204 12 DEBUG ceilometer.compute.pollsters [-] 39ff1bd9-6f6b-44c8-bbec-a1fd9d196359/disk.device.write.bytes volume: 512 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.205 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'd50347b3-e4a2-41b9-8472-079473e5957d', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 389120, 'user_id': '1ba5fce347b64bfebf995f187193f205', 'user_name': None, 'project_id': 'c785bf23f53946bc99867d8832a50266', 'project_name': None, 'resource_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359-vda', 'timestamp': '2025-12-15T09:55:48.204224', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359', 'instance_type': 'm1.small', 'host': 'bf4d507aa00b3fcf643a7e0b883420632ac7fc96e8c7a756982e121d', 'instance_host': 'np0005559462.localdomain', 'flavor': {'id': '2da0e147-aaa7-4bb9-a176-5fe1b15a32a0', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e'}, 'image_ref': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '3b551c0e-d99c-11f0-817e-fa163ebaca0f', 'monotonic_time': 11614.337776042, 'message_signature': '1277cc8b6cdf4cc74de681b7bef56a88d2c2c4a2f4272c43b5e9715893bcc5d1'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 512, 'user_id': '1ba5fce347b64bfebf995f187193f205', 'user_name': None, 'project_id': 'c785bf23f53946bc99867d8832a50266', 'project_name': None, 'resource_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359-vdb', 'timestamp': '2025-12-15T09:55:48.204224', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359', 'instance_type': 'm1.small', 'host': 'bf4d507aa00b3fcf643a7e0b883420632ac7fc96e8c7a756982e121d', 'instance_host': 'np0005559462.localdomain', 'flavor': {'id': '2da0e147-aaa7-4bb9-a176-5fe1b15a32a0', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e'}, 'image_ref': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '3b552bc2-d99c-11f0-817e-fa163ebaca0f', 'monotonic_time': 11614.337776042, 'message_signature': '290f29f9747a0cd96c7444eb9c146f59af3a0446c9fc1a3521c29d0b5420d2f2'}]}, 'timestamp': '2025-12-15 09:55:48.205079', '_unique_id': '37b370dd9d3d49069e63fcfa3c1cf5e1'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.205 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.205 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.205 12 ERROR oslo_messaging.notify.messaging yield
Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.205 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.205 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.205 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.205 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.205 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.205 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.205 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.205 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.205 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.205 12 ERROR oslo_messaging.notify.messaging conn.connect()
Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.205 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.205 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.205 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.205 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.205 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.205 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.205 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.205 12 ERROR oslo_messaging.notify.messaging
Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.205 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.205 12 ERROR oslo_messaging.notify.messaging
Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.205 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.205 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.205 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.205 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.205 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.205 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.205 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.205 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.205 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.205 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.205 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.205 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.205 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.205 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.205 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.205 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.205 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.205 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.205 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.205 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.205 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.205 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.205 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.205 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.205 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.205 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.205 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.205 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.205 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.205 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.205 12 ERROR oslo_messaging.notify.messaging
Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.207 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no new resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.207 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.bytes in the context of pollsters
Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.207 12 DEBUG ceilometer.compute.pollsters [-] 39ff1bd9-6f6b-44c8-bbec-a1fd9d196359/disk.device.read.bytes volume: 35560448 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.207 12 DEBUG ceilometer.compute.pollsters [-] 39ff1bd9-6f6b-44c8-bbec-a1fd9d196359/disk.device.read.bytes volume: 2154496 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.209 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '7a208257-e29f-4904-b117-06b708bc3d38', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 35560448, 'user_id': '1ba5fce347b64bfebf995f187193f205', 'user_name': None, 'project_id': 'c785bf23f53946bc99867d8832a50266', 'project_name': None, 'resource_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359-vda', 'timestamp': '2025-12-15T09:55:48.207424', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359', 'instance_type': 'm1.small', 'host': 'bf4d507aa00b3fcf643a7e0b883420632ac7fc96e8c7a756982e121d', 'instance_host': 'np0005559462.localdomain', 'flavor': {'id': '2da0e147-aaa7-4bb9-a176-5fe1b15a32a0', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e'}, 'image_ref': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '3b559940-d99c-11f0-817e-fa163ebaca0f', 'monotonic_time': 11614.337776042, 'message_signature': '20b42ab088db4a0b0e6def6d8d30ed90be9918acfb483986f40eac80d88dc7e1'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 2154496, 'user_id': '1ba5fce347b64bfebf995f187193f205', 'user_name': None, 'project_id': 'c785bf23f53946bc99867d8832a50266', 'project_name': None, 'resource_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359-vdb', 'timestamp': '2025-12-15T09:55:48.207424', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359', 'instance_type': 'm1.small', 'host': 'bf4d507aa00b3fcf643a7e0b883420632ac7fc96e8c7a756982e121d', 'instance_host': 'np0005559462.localdomain', 'flavor': {'id': '2da0e147-aaa7-4bb9-a176-5fe1b15a32a0', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e'}, 'image_ref': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '3b55aa34-d99c-11f0-817e-fa163ebaca0f', 'monotonic_time': 11614.337776042, 'message_signature': '4432c818e5626925ebf1088a4aa92ae25e0bcf2187fd23a4d162afd05302fa05'}]}, 'timestamp': '2025-12-15 09:55:48.208281', '_unique_id': 'a20722def55f4b43918c3ca9f5a480ba'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.209 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.209 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.209 12 ERROR oslo_messaging.notify.messaging yield
Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.209 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.209 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.209 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.209 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.209 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.209 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.209 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.209 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.209 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.209 12 ERROR oslo_messaging.notify.messaging conn.connect()
Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.209 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.209 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.209 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.209 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.209 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.209 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.209 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.209 12 ERROR oslo_messaging.notify.messaging
Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.209 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.209 12 ERROR oslo_messaging.notify.messaging
Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.209 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.209 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.209 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.209 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.209 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.209 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.209 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.209 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.209 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.209 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.209 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.209 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.209 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.209 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.209 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.209 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.209 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.209 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.209 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.209 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.209 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.209 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.209 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.209 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.209 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.209 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.209 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.209 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.209 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.209 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.209 12 ERROR oslo_messaging.notify.messaging
Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.210 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets in the context of pollsters
Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.210 12 DEBUG ceilometer.compute.pollsters [-] 39ff1bd9-6f6b-44c8-bbec-a1fd9d196359/network.incoming.packets volume: 60 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.211 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '07b6efc9-8394-4d39-ae0c-4af7e387d076', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 60, 'user_id': '1ba5fce347b64bfebf995f187193f205', 'user_name': None, 'project_id': 'c785bf23f53946bc99867d8832a50266', 'project_name': None, 'resource_id': 'instance-00000002-39ff1bd9-6f6b-44c8-bbec-a1fd9d196359-tap03ef8889-32', 'timestamp': '2025-12-15T09:55:48.210390', 'resource_metadata': {'display_name': 'test', 'name': 'tap03ef8889-32', 'instance_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359', 'instance_type': 'm1.small', 'host': 'bf4d507aa00b3fcf643a7e0b883420632ac7fc96e8c7a756982e121d', 'instance_host': 'np0005559462.localdomain', 'flavor': {'id': '2da0e147-aaa7-4bb9-a176-5fe1b15a32a0', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e'}, 'image_ref': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:74:df:7c', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap03ef8889-32'}, 'message_id': '3b560d1c-d99c-11f0-817e-fa163ebaca0f', 'monotonic_time': 11614.331678533, 'message_signature': '1b5a2e5ec86ffd0e325d857cbb7e7dea699711dc2f7e44454ddd520064219694'}]}, 'timestamp': '2025-12-15 09:55:48.210845', '_unique_id': 'f9562e99a0754c3a95a3c125fefc2fc8'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.211 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.211 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.211 12 ERROR oslo_messaging.notify.messaging yield
Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.211 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.211 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.211 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.211 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.211 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.211 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.211 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.211 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.211 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.211 12 ERROR oslo_messaging.notify.messaging conn.connect()
Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.211 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.211 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.211 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.211 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.211 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.211 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.211 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.211 12 ERROR oslo_messaging.notify.messaging
Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.211 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.211 12 ERROR oslo_messaging.notify.messaging
Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.211 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.211 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.211 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.211 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.211 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.211 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.211 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.211 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.211 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.211 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.211 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.211 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.211 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.211 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.211 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.211 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.211 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.211 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.211 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.211 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Dec 15 04:55:48 
localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.211 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.211 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.211 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.211 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.211 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.211 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.211 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.211 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.211 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.211 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.211 12 ERROR oslo_messaging.notify.messaging Dec 15 04:55:48 localhost 
ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.212 12 INFO ceilometer.polling.manager [-] Polling pollster memory.usage in the context of pollsters Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.212 12 DEBUG ceilometer.compute.pollsters [-] 39ff1bd9-6f6b-44c8-bbec-a1fd9d196359/memory.usage volume: 51.73828125 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.214 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '0a7269fd-fc12-4b06-88e6-828d9a3efab1', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'memory.usage', 'counter_type': 'gauge', 'counter_unit': 'MB', 'counter_volume': 51.73828125, 'user_id': '1ba5fce347b64bfebf995f187193f205', 'user_name': None, 'project_id': 'c785bf23f53946bc99867d8832a50266', 'project_name': None, 'resource_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359', 'timestamp': '2025-12-15T09:55:48.212919', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359', 'instance_type': 'm1.small', 'host': 'bf4d507aa00b3fcf643a7e0b883420632ac7fc96e8c7a756982e121d', 'instance_host': 'np0005559462.localdomain', 'flavor': {'id': '2da0e147-aaa7-4bb9-a176-5fe1b15a32a0', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e'}, 'image_ref': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0}, 'message_id': '3b5670e0-d99c-11f0-817e-fa163ebaca0f', 'monotonic_time': 11614.390924867, 
'message_signature': 'a0b9477093d2d2902f8dd420e4d602c9fc494e78500b1240b97589bf48c3a11d'}]}, 'timestamp': '2025-12-15 09:55:48.213381', '_unique_id': '2c394a5467e846e69f3ccadb905c5508'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.214 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.214 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.214 12 ERROR oslo_messaging.notify.messaging yield Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.214 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.214 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.214 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.214 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.214 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.214 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.214 12 ERROR 
oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.214 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.214 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.214 12 ERROR oslo_messaging.notify.messaging conn.connect() Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.214 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.214 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.214 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.214 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.214 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.214 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.214 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Dec 15 04:55:48 localhost 
ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.214 12 ERROR oslo_messaging.notify.messaging Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.214 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.214 12 ERROR oslo_messaging.notify.messaging Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.214 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.214 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.214 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.214 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.214 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.214 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.214 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.214 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", 
line 653, in _send Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.214 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.214 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.214 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.214 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.214 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.214 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.214 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.214 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.214 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.214 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.214 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.214 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.214 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.214 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.214 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.214 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.214 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.214 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.214 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.214 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Dec 15 04:55:48 localhost 
ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.214 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.214 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.214 12 ERROR oslo_messaging.notify.messaging Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.215 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes in the context of pollsters Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.215 12 DEBUG ceilometer.compute.pollsters [-] 39ff1bd9-6f6b-44c8-bbec-a1fd9d196359/network.outgoing.bytes volume: 9770 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.216 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': '6c211288-7938-4739-893a-c409d7394df0', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 9770, 'user_id': '1ba5fce347b64bfebf995f187193f205', 'user_name': None, 'project_id': 'c785bf23f53946bc99867d8832a50266', 'project_name': None, 'resource_id': 'instance-00000002-39ff1bd9-6f6b-44c8-bbec-a1fd9d196359-tap03ef8889-32', 'timestamp': '2025-12-15T09:55:48.215452', 'resource_metadata': {'display_name': 'test', 'name': 'tap03ef8889-32', 'instance_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359', 'instance_type': 'm1.small', 'host': 'bf4d507aa00b3fcf643a7e0b883420632ac7fc96e8c7a756982e121d', 'instance_host': 'np0005559462.localdomain', 'flavor': {'id': '2da0e147-aaa7-4bb9-a176-5fe1b15a32a0', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e'}, 'image_ref': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:74:df:7c', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap03ef8889-32'}, 'message_id': '3b56d29c-d99c-11f0-817e-fa163ebaca0f', 'monotonic_time': 11614.331678533, 'message_signature': '2c4073519bd4e1e4531d658f0340685d24aea83fbe062674c3bbe8330a23f055'}]}, 'timestamp': '2025-12-15 09:55:48.215883', '_unique_id': '403d7e9ac37c4a64962abdcb34862166'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.216 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 15 04:55:48 localhost 
ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.216 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.216 12 ERROR oslo_messaging.notify.messaging yield Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.216 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.216 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.216 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.216 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.216 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.216 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.216 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.216 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.216 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.216 12 ERROR oslo_messaging.notify.messaging conn.connect() Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.216 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.216 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.216 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.216 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.216 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.216 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.216 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.216 12 ERROR oslo_messaging.notify.messaging Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.216 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.216 12 ERROR oslo_messaging.notify.messaging Dec 15 04:55:48 localhost 
ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.216 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.216 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.216 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.216 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.216 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.216 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.216 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.216 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.216 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.216 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection 
Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.216 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.216 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.216 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.216 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.216 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.216 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.216 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.216 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.216 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.216 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Dec 15 04:55:48 
localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.216 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.216 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.216 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.216 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.216 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.216 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.216 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.216 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.216 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.216 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.216 12 ERROR oslo_messaging.notify.messaging Dec 15 04:55:48 localhost 
ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.217 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.capacity in the context of pollsters Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.217 12 DEBUG ceilometer.compute.pollsters [-] 39ff1bd9-6f6b-44c8-bbec-a1fd9d196359/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.217 12 DEBUG ceilometer.compute.pollsters [-] 39ff1bd9-6f6b-44c8-bbec-a1fd9d196359/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.218 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '5fea3fb7-b42a-4b3b-8ee5-1db616366c97', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '1ba5fce347b64bfebf995f187193f205', 'user_name': None, 'project_id': 'c785bf23f53946bc99867d8832a50266', 'project_name': None, 'resource_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359-vda', 'timestamp': '2025-12-15T09:55:48.217181', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359', 'instance_type': 'm1.small', 'host': 'bf4d507aa00b3fcf643a7e0b883420632ac7fc96e8c7a756982e121d', 'instance_host': 'np0005559462.localdomain', 'flavor': {'id': '2da0e147-aaa7-4bb9-a176-5fe1b15a32a0', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 
'7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e'}, 'image_ref': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '3b5712de-d99c-11f0-817e-fa163ebaca0f', 'monotonic_time': 11614.314935328, 'message_signature': '6c0b637df7ac23a392bf8208679bb267ff220f5a5718ddd1dcc7a9d4449ebd06'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '1ba5fce347b64bfebf995f187193f205', 'user_name': None, 'project_id': 'c785bf23f53946bc99867d8832a50266', 'project_name': None, 'resource_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359-vdb', 'timestamp': '2025-12-15T09:55:48.217181', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359', 'instance_type': 'm1.small', 'host': 'bf4d507aa00b3fcf643a7e0b883420632ac7fc96e8c7a756982e121d', 'instance_host': 'np0005559462.localdomain', 'flavor': {'id': '2da0e147-aaa7-4bb9-a176-5fe1b15a32a0', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e'}, 'image_ref': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '3b571ce8-d99c-11f0-817e-fa163ebaca0f', 'monotonic_time': 11614.314935328, 'message_signature': 'dd43472ca38ff6be1b20790467f84abbc63b171e2bb85bd341beca558e920966'}]}, 'timestamp': '2025-12-15 09:55:48.217697', '_unique_id': '934007a88cc54ae0883a1fc207107c63'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.218 12 ERROR 
oslo_messaging.notify.messaging Traceback (most recent call last): Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.218 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.218 12 ERROR oslo_messaging.notify.messaging yield Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.218 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.218 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.218 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.218 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.218 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.218 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.218 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.218 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Dec 15 04:55:48 localhost 
ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.218 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.218 12 ERROR oslo_messaging.notify.messaging conn.connect() Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.218 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.218 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.218 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.218 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.218 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.218 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.218 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.218 12 ERROR oslo_messaging.notify.messaging Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.218 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 
2025-12-15 09:55:48.218 12 ERROR oslo_messaging.notify.messaging Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.218 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.218 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.218 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.218 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.218 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.218 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.218 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.218 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.218 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.218 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.218 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.218 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.218 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.218 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.218 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.218 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.218 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.218 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.218 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.218 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.218 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.218 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.218 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.218 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.218 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.218 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.218 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.218 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.218 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.218 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 15 04:55:48 localhost 
ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.218 12 ERROR oslo_messaging.notify.messaging Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.218 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no new resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193 Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.219 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.latency in the context of pollsters Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.219 12 DEBUG ceilometer.compute.pollsters [-] 39ff1bd9-6f6b-44c8-bbec-a1fd9d196359/disk.device.write.latency volume: 1243487016 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.219 12 DEBUG ceilometer.compute.pollsters [-] 39ff1bd9-6f6b-44c8-bbec-a1fd9d196359/disk.device.write.latency volume: 24779175 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.220 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': 'e4f9eaad-3c2a-47ae-b65b-499231e2a98e', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 1243487016, 'user_id': '1ba5fce347b64bfebf995f187193f205', 'user_name': None, 'project_id': 'c785bf23f53946bc99867d8832a50266', 'project_name': None, 'resource_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359-vda', 'timestamp': '2025-12-15T09:55:48.219104', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359', 'instance_type': 'm1.small', 'host': 'bf4d507aa00b3fcf643a7e0b883420632ac7fc96e8c7a756982e121d', 'instance_host': 'np0005559462.localdomain', 'flavor': {'id': '2da0e147-aaa7-4bb9-a176-5fe1b15a32a0', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e'}, 'image_ref': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '3b575de8-d99c-11f0-817e-fa163ebaca0f', 'monotonic_time': 11614.337776042, 'message_signature': 'd319008399e1e96f5c58a0b007d27c0e2c81c9c32c54da52cb2c3b848760567c'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 24779175, 'user_id': '1ba5fce347b64bfebf995f187193f205', 'user_name': None, 'project_id': 'c785bf23f53946bc99867d8832a50266', 'project_name': None, 'resource_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359-vdb', 'timestamp': '2025-12-15T09:55:48.219104', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 
'39ff1bd9-6f6b-44c8-bbec-a1fd9d196359', 'instance_type': 'm1.small', 'host': 'bf4d507aa00b3fcf643a7e0b883420632ac7fc96e8c7a756982e121d', 'instance_host': 'np0005559462.localdomain', 'flavor': {'id': '2da0e147-aaa7-4bb9-a176-5fe1b15a32a0', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e'}, 'image_ref': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '3b57686a-d99c-11f0-817e-fa163ebaca0f', 'monotonic_time': 11614.337776042, 'message_signature': '673be38a244b2baf48880421ac937972764ce8c3c805c348fb9a27a87ff0825b'}]}, 'timestamp': '2025-12-15 09:55:48.219632', '_unique_id': 'c55608d630a24f278de895659de5778f'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.220 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.220 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.220 12 ERROR oslo_messaging.notify.messaging yield Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.220 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.220 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.220 12 ERROR 
oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.220 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.220 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.220 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.220 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.220 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.220 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.220 12 ERROR oslo_messaging.notify.messaging conn.connect() Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.220 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.220 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.220 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Dec 15 
04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.220 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.220 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.220 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.220 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.220 12 ERROR oslo_messaging.notify.messaging Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.220 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.220 12 ERROR oslo_messaging.notify.messaging Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.220 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.220 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.220 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.220 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Dec 15 04:55:48 localhost 
ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.220 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.220 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.220 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.220 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.220 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.220 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.220 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.220 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.220 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.220 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, 
in get Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.220 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.220 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.220 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.220 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.220 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.220 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.220 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.220 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.220 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.220 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 15 04:55:48 localhost 
ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.220 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.220 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.220 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.220 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.220 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.220 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.220 12 ERROR oslo_messaging.notify.messaging Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.220 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.error in the context of pollsters Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.220 12 DEBUG ceilometer.compute.pollsters [-] 39ff1bd9-6f6b-44c8-bbec-a1fd9d196359/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.221 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': 'b4cb59f7-bbd6-4ee1-9fc8-13374c1f9d52', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '1ba5fce347b64bfebf995f187193f205', 'user_name': None, 'project_id': 'c785bf23f53946bc99867d8832a50266', 'project_name': None, 'resource_id': 'instance-00000002-39ff1bd9-6f6b-44c8-bbec-a1fd9d196359-tap03ef8889-32', 'timestamp': '2025-12-15T09:55:48.220914', 'resource_metadata': {'display_name': 'test', 'name': 'tap03ef8889-32', 'instance_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359', 'instance_type': 'm1.small', 'host': 'bf4d507aa00b3fcf643a7e0b883420632ac7fc96e8c7a756982e121d', 'instance_host': 'np0005559462.localdomain', 'flavor': {'id': '2da0e147-aaa7-4bb9-a176-5fe1b15a32a0', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e'}, 'image_ref': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:74:df:7c', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap03ef8889-32'}, 'message_id': '3b57a5b4-d99c-11f0-817e-fa163ebaca0f', 'monotonic_time': 11614.331678533, 'message_signature': 'f811891c31a7fb6f2d76c5b9037060adcc0b56bc1bcff907b1ee71f32a71eb78'}]}, 'timestamp': '2025-12-15 09:55:48.221217', '_unique_id': 'b275a5b8c4eb4b5e835f573587b0e843'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.221 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 15 04:55:48 localhost 
ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.221 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.221 12 ERROR oslo_messaging.notify.messaging yield Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.221 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.221 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.221 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.221 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.221 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.221 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.221 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.221 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.221 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.221 12 ERROR oslo_messaging.notify.messaging conn.connect() Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.221 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.221 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.221 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.221 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.221 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.221 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.221 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.221 12 ERROR oslo_messaging.notify.messaging Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.221 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.221 12 ERROR oslo_messaging.notify.messaging Dec 15 04:55:48 localhost 
ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.221 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.221 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.221 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.221 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.221 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.221 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.221 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.221 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.221 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.221 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection 
Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.221 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.221 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.221 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.221 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.221 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.221 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.221 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.221 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.221 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.221 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Dec 15 04:55:48 
localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.221 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.221 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.221 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.221 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.221 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.221 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.221 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.221 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.221 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.221 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.221 12 ERROR oslo_messaging.notify.messaging Dec 15 04:55:48 localhost 
ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.222 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.delta in the context of pollsters Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.222 12 DEBUG ceilometer.compute.pollsters [-] 39ff1bd9-6f6b-44c8-bbec-a1fd9d196359/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.223 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'ccfd0168-f1fe-4483-9ea5-91ca9416243b', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '1ba5fce347b64bfebf995f187193f205', 'user_name': None, 'project_id': 'c785bf23f53946bc99867d8832a50266', 'project_name': None, 'resource_id': 'instance-00000002-39ff1bd9-6f6b-44c8-bbec-a1fd9d196359-tap03ef8889-32', 'timestamp': '2025-12-15T09:55:48.222460', 'resource_metadata': {'display_name': 'test', 'name': 'tap03ef8889-32', 'instance_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359', 'instance_type': 'm1.small', 'host': 'bf4d507aa00b3fcf643a7e0b883420632ac7fc96e8c7a756982e121d', 'instance_host': 'np0005559462.localdomain', 'flavor': {'id': '2da0e147-aaa7-4bb9-a176-5fe1b15a32a0', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e'}, 'image_ref': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:74:df:7c', 'fref': None, 
'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap03ef8889-32'}, 'message_id': '3b57e132-d99c-11f0-817e-fa163ebaca0f', 'monotonic_time': 11614.331678533, 'message_signature': 'edc64b15429ab76c620b7f2bb2dcc750877752bc8cdbceb0b0082d3ab895fee3'}]}, 'timestamp': '2025-12-15 09:55:48.222738', '_unique_id': 'b42b951291754a27a00f1c832e5cfda2'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.223 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.223 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.223 12 ERROR oslo_messaging.notify.messaging yield Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.223 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.223 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.223 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.223 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.223 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.223 12 ERROR 
oslo_messaging.notify.messaging self._connection = self._establish_connection() Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.223 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.223 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.223 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.223 12 ERROR oslo_messaging.notify.messaging conn.connect() Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.223 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.223 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.223 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.223 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.223 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.223 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 
2025-12-15 09:55:48.223 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.223 12 ERROR oslo_messaging.notify.messaging Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.223 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.223 12 ERROR oslo_messaging.notify.messaging Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.223 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.223 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.223 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.223 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.223 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.223 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.223 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Dec 15 04:55:48 localhost 
ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.223 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.223 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.223 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.223 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.223 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.223 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.223 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.223 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.223 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.223 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Dec 15 
04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.223 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.223 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.223 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.223 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.223 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.223 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.223 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.223 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.223 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.223 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.223 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.223 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.223 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.223 12 ERROR oslo_messaging.notify.messaging Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.223 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.latency in the context of pollsters Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.224 12 DEBUG ceilometer.compute.pollsters [-] 39ff1bd9-6f6b-44c8-bbec-a1fd9d196359/disk.device.read.latency volume: 1342134926 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.224 12 DEBUG ceilometer.compute.pollsters [-] 39ff1bd9-6f6b-44c8-bbec-a1fd9d196359/disk.device.read.latency volume: 123356132 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.225 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': '22b85756-9e4d-4a05-8b7c-3d6e0143f8d3', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 1342134926, 'user_id': '1ba5fce347b64bfebf995f187193f205', 'user_name': None, 'project_id': 'c785bf23f53946bc99867d8832a50266', 'project_name': None, 'resource_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359-vda', 'timestamp': '2025-12-15T09:55:48.224032', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359', 'instance_type': 'm1.small', 'host': 'bf4d507aa00b3fcf643a7e0b883420632ac7fc96e8c7a756982e121d', 'instance_host': 'np0005559462.localdomain', 'flavor': {'id': '2da0e147-aaa7-4bb9-a176-5fe1b15a32a0', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e'}, 'image_ref': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '3b581e9a-d99c-11f0-817e-fa163ebaca0f', 'monotonic_time': 11614.337776042, 'message_signature': '27e783064cd093b870d84a020c02b866915cdcd5e426a9b88469c45637a53c2e'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 123356132, 'user_id': '1ba5fce347b64bfebf995f187193f205', 'user_name': None, 'project_id': 'c785bf23f53946bc99867d8832a50266', 'project_name': None, 'resource_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359-vdb', 'timestamp': '2025-12-15T09:55:48.224032', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 
'39ff1bd9-6f6b-44c8-bbec-a1fd9d196359', 'instance_type': 'm1.small', 'host': 'bf4d507aa00b3fcf643a7e0b883420632ac7fc96e8c7a756982e121d', 'instance_host': 'np0005559462.localdomain', 'flavor': {'id': '2da0e147-aaa7-4bb9-a176-5fe1b15a32a0', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e'}, 'image_ref': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '3b5828a4-d99c-11f0-817e-fa163ebaca0f', 'monotonic_time': 11614.337776042, 'message_signature': '0f73035ecd8278a5661d2f636e6348b52e10c0bfff954c7557ef3f023ac0e637'}]}, 'timestamp': '2025-12-15 09:55:48.224553', '_unique_id': '8920347a65094e87a9585c6fd85e640c'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.225 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.225 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.225 12 ERROR oslo_messaging.notify.messaging yield Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.225 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.225 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.225 12 ERROR 
oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.225 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.225 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.225 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.225 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.225 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.225 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.225 12 ERROR oslo_messaging.notify.messaging conn.connect() Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.225 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.225 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.225 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Dec 15 
04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.225 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.225 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.225 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.225 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.225 12 ERROR oslo_messaging.notify.messaging Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.225 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.225 12 ERROR oslo_messaging.notify.messaging Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.225 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.225 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.225 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.225 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Dec 15 04:55:48 localhost 
ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.225 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.225 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.225 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.225 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.225 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.225 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.225 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.225 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.225 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.225 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, 
in get Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.225 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.225 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.225 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.225 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.225 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.225 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.225 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.225 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.225 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.225 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 15 04:55:48 localhost 
ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.225 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.225 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.225 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.225 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.225 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.225 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.225 12 ERROR oslo_messaging.notify.messaging Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.225 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.drop in the context of pollsters Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.225 12 DEBUG ceilometer.compute.pollsters [-] 39ff1bd9-6f6b-44c8-bbec-a1fd9d196359/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.226 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': 'd2013212-4d17-4dd0-91a5-9f84a3314120', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '1ba5fce347b64bfebf995f187193f205', 'user_name': None, 'project_id': 'c785bf23f53946bc99867d8832a50266', 'project_name': None, 'resource_id': 'instance-00000002-39ff1bd9-6f6b-44c8-bbec-a1fd9d196359-tap03ef8889-32', 'timestamp': '2025-12-15T09:55:48.225833', 'resource_metadata': {'display_name': 'test', 'name': 'tap03ef8889-32', 'instance_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359', 'instance_type': 'm1.small', 'host': 'bf4d507aa00b3fcf643a7e0b883420632ac7fc96e8c7a756982e121d', 'instance_host': 'np0005559462.localdomain', 'flavor': {'id': '2da0e147-aaa7-4bb9-a176-5fe1b15a32a0', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e'}, 'image_ref': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:74:df:7c', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap03ef8889-32'}, 'message_id': '3b586512-d99c-11f0-817e-fa163ebaca0f', 'monotonic_time': 11614.331678533, 'message_signature': '07c11c278598e9fb3ecf9995d2d87986d9bdb69e4c0c8e2809c15f3ae6fd1d3f'}]}, 'timestamp': '2025-12-15 09:55:48.226138', '_unique_id': '2836c3e499e04c1094b5227da2944d70'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.226 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 15 04:55:48 localhost 
ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.226 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.226 12 ERROR oslo_messaging.notify.messaging yield Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.226 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.226 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.226 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.226 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.226 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.226 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.226 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.226 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.226 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.226 12 ERROR oslo_messaging.notify.messaging conn.connect() Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.226 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.226 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.226 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.226 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.226 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.226 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.226 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.226 12 ERROR oslo_messaging.notify.messaging Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.226 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.226 12 ERROR oslo_messaging.notify.messaging Dec 15 04:55:48 localhost 
ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.226 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.226 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.226 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.226 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.226 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.226 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.226 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.226 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.226 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.226 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection 
Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.226 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.226 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.226 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.226 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.226 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.226 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.226 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.226 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.226 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.226 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Dec 15 04:55:48 
localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.226 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.226 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.226 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.226 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.226 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.226 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.226 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.226 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.226 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.226 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.226 12 ERROR oslo_messaging.notify.messaging Dec 15 04:55:48 localhost 
ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.227 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes in the context of pollsters Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.227 12 DEBUG ceilometer.compute.pollsters [-] 39ff1bd9-6f6b-44c8-bbec-a1fd9d196359/network.incoming.bytes volume: 6809 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.228 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '92d3b62e-2857-4e49-a68f-738d2e30929b', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 6809, 'user_id': '1ba5fce347b64bfebf995f187193f205', 'user_name': None, 'project_id': 'c785bf23f53946bc99867d8832a50266', 'project_name': None, 'resource_id': 'instance-00000002-39ff1bd9-6f6b-44c8-bbec-a1fd9d196359-tap03ef8889-32', 'timestamp': '2025-12-15T09:55:48.227418', 'resource_metadata': {'display_name': 'test', 'name': 'tap03ef8889-32', 'instance_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359', 'instance_type': 'm1.small', 'host': 'bf4d507aa00b3fcf643a7e0b883420632ac7fc96e8c7a756982e121d', 'instance_host': 'np0005559462.localdomain', 'flavor': {'id': '2da0e147-aaa7-4bb9-a176-5fe1b15a32a0', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e'}, 'image_ref': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:74:df:7c', 'fref': None, 
'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap03ef8889-32'}, 'message_id': '3b58a2e8-d99c-11f0-817e-fa163ebaca0f', 'monotonic_time': 11614.331678533, 'message_signature': '0d3fb7aca005d9fc032f8cefdfd69e39853be34db748cb30120dc72743c48ed8'}]}, 'timestamp': '2025-12-15 09:55:48.227700', '_unique_id': 'ade904908b6048eb8f6fe62df8251a01'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.228 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.228 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.228 12 ERROR oslo_messaging.notify.messaging yield Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.228 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.228 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.228 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.228 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.228 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.228 12 ERROR 
oslo_messaging.notify.messaging self._connection = self._establish_connection() Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.228 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.228 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.228 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.228 12 ERROR oslo_messaging.notify.messaging conn.connect() Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.228 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.228 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.228 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.228 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.228 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.228 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 
2025-12-15 09:55:48.228 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.228 12 ERROR oslo_messaging.notify.messaging Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.228 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.228 12 ERROR oslo_messaging.notify.messaging Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.228 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.228 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.228 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.228 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.228 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.228 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.228 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Dec 15 04:55:48 localhost 
ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.228 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.228 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.228 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.228 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.228 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.228 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.228 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.228 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.228 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.228 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Dec 15 
04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.228 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.228 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.228 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.228 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.228 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.228 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.228 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.228 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.228 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.228 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.228 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.228 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.228 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.228 12 ERROR oslo_messaging.notify.messaging Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.228 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.requests in the context of pollsters Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.229 12 DEBUG ceilometer.compute.pollsters [-] 39ff1bd9-6f6b-44c8-bbec-a1fd9d196359/disk.device.write.requests volume: 47 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.229 12 DEBUG ceilometer.compute.pollsters [-] 39ff1bd9-6f6b-44c8-bbec-a1fd9d196359/disk.device.write.requests volume: 1 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.230 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': '80981ef3-66a9-40aa-bba0-ff7fa340aef4', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 47, 'user_id': '1ba5fce347b64bfebf995f187193f205', 'user_name': None, 'project_id': 'c785bf23f53946bc99867d8832a50266', 'project_name': None, 'resource_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359-vda', 'timestamp': '2025-12-15T09:55:48.228956', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359', 'instance_type': 'm1.small', 'host': 'bf4d507aa00b3fcf643a7e0b883420632ac7fc96e8c7a756982e121d', 'instance_host': 'np0005559462.localdomain', 'flavor': {'id': '2da0e147-aaa7-4bb9-a176-5fe1b15a32a0', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e'}, 'image_ref': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '3b58dfba-d99c-11f0-817e-fa163ebaca0f', 'monotonic_time': 11614.337776042, 'message_signature': 'a13dcb860f338897ec67ae8db0437c298183cdc2cef5645d479a6ab88748e4f4'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1, 'user_id': '1ba5fce347b64bfebf995f187193f205', 'user_name': None, 'project_id': 'c785bf23f53946bc99867d8832a50266', 'project_name': None, 'resource_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359-vdb', 'timestamp': '2025-12-15T09:55:48.228956', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 
'39ff1bd9-6f6b-44c8-bbec-a1fd9d196359', 'instance_type': 'm1.small', 'host': 'bf4d507aa00b3fcf643a7e0b883420632ac7fc96e8c7a756982e121d', 'instance_host': 'np0005559462.localdomain', 'flavor': {'id': '2da0e147-aaa7-4bb9-a176-5fe1b15a32a0', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e'}, 'image_ref': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '3b58e992-d99c-11f0-817e-fa163ebaca0f', 'monotonic_time': 11614.337776042, 'message_signature': '4908374a8f3b03fc62e114d6e49848f20ea037f633785c649e444ffce2318d7c'}]}, 'timestamp': '2025-12-15 09:55:48.229488', '_unique_id': '1ad6d003bea045129f20eca119028668'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.230 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.230 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.230 12 ERROR oslo_messaging.notify.messaging yield Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.230 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.230 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.230 12 ERROR 
oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.230 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.230 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.230 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.230 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.230 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.230 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.230 12 ERROR oslo_messaging.notify.messaging conn.connect() Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.230 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.230 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.230 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Dec 15 
04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.230 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.230 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.230 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.230 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.230 12 ERROR oslo_messaging.notify.messaging Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.230 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.230 12 ERROR oslo_messaging.notify.messaging Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.230 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.230 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.230 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.230 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Dec 15 04:55:48 localhost 
ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.230 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.230 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.230 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.230 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.230 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.230 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.230 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.230 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.230 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.230 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, 
in get Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.230 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.230 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.230 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.230 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.230 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.230 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.230 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.230 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.230 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.230 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 15 04:55:48 localhost 
ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.230 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.230 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.230 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.230 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.230 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.230 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.230 12 ERROR oslo_messaging.notify.messaging Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.230 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.delta in the context of pollsters Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.230 12 DEBUG ceilometer.compute.pollsters [-] 39ff1bd9-6f6b-44c8-bbec-a1fd9d196359/network.incoming.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.231 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': 'b56a26b0-2d29-4798-b607-14ebd821f3fb', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '1ba5fce347b64bfebf995f187193f205', 'user_name': None, 'project_id': 'c785bf23f53946bc99867d8832a50266', 'project_name': None, 'resource_id': 'instance-00000002-39ff1bd9-6f6b-44c8-bbec-a1fd9d196359-tap03ef8889-32', 'timestamp': '2025-12-15T09:55:48.230759', 'resource_metadata': {'display_name': 'test', 'name': 'tap03ef8889-32', 'instance_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359', 'instance_type': 'm1.small', 'host': 'bf4d507aa00b3fcf643a7e0b883420632ac7fc96e8c7a756982e121d', 'instance_host': 'np0005559462.localdomain', 'flavor': {'id': '2da0e147-aaa7-4bb9-a176-5fe1b15a32a0', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e'}, 'image_ref': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:74:df:7c', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap03ef8889-32'}, 'message_id': '3b59256a-d99c-11f0-817e-fa163ebaca0f', 'monotonic_time': 11614.331678533, 'message_signature': '403103fc5b2c98ea81bcbc35dc65dcf918409cd12c1e5d905093c0fde6fed69f'}]}, 'timestamp': '2025-12-15 09:55:48.231062', '_unique_id': '901e1533c8c54849ba07c67489c5642f'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.231 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 
2025-12-15 09:55:48.231 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.231 12 ERROR oslo_messaging.notify.messaging yield Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.231 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.231 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.231 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.231 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.231 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.231 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.231 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.231 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.231 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.231 12 ERROR oslo_messaging.notify.messaging conn.connect() Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.231 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.231 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.231 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.231 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.231 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.231 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.231 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.231 12 ERROR oslo_messaging.notify.messaging Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.231 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.231 12 ERROR oslo_messaging.notify.messaging Dec 15 04:55:48 localhost 
ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.231 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.231 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.231 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.231 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.231 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.231 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.231 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.231 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.231 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.231 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection 
Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.231 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.231 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.231 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.231 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.231 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.231 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.231 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.231 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.231 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.231 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Dec 15 04:55:48 
localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.231 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.231 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.231 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.231 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.231 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.231 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.231 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.231 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.231 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.231 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 15 04:55:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:55:48.231 12 ERROR oslo_messaging.notify.messaging Dec 15 04:55:48 localhost ceph-mgr[292421]: 
log_channel(cluster) log [DBG] : pgmap v48: 177 pgs: 177 active+clean; 104 MiB data, 548 MiB used, 41 GiB / 42 GiB avail Dec 15 04:55:48 localhost ceph-mgr[292421]: [progress INFO root] Writing back 50 completed events Dec 15 04:55:48 localhost ceph-mon[298913]: mon.np0005559462@0(leader) e16 handle_command mon_command([{prefix=config-key set, key=mgr/progress/completed}] v 0) Dec 15 04:55:48 localhost ceph-mon[298913]: from='mgr.26879 172.18.0.106:0/2534740212' entity='mgr.np0005559462.fudvyx' Dec 15 04:55:48 localhost ceph-mon[298913]: from='mgr.26879 172.18.0.106:0/2534740212' entity='mgr.np0005559462.fudvyx' Dec 15 04:55:48 localhost ceph-mon[298913]: from='mgr.26879 172.18.0.106:0/2534740212' entity='mgr.np0005559462.fudvyx' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch Dec 15 04:55:48 localhost ceph-mon[298913]: from='mgr.26879 172.18.0.106:0/2534740212' entity='mgr.np0005559462.fudvyx' Dec 15 04:55:49 localhost ceph-mon[298913]: log_channel(audit) log [INF] : from='mgr.26879 172.18.0.106:0/2534740212' entity='mgr.np0005559462.fudvyx' Dec 15 04:55:50 localhost ceph-mon[298913]: from='mgr.26879 172.18.0.106:0/2534740212' entity='mgr.np0005559462.fudvyx' Dec 15 04:55:50 localhost ceph-mon[298913]: mon.np0005559462@0(leader).osd e88 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Dec 15 04:55:50 localhost ceph-mgr[292421]: [volumes INFO mgr_util] scanning for idle connections.. Dec 15 04:55:50 localhost ceph-mgr[292421]: [volumes INFO mgr_util] cleaning up connections: [] Dec 15 04:55:50 localhost ceph-mgr[292421]: [volumes INFO mgr_util] scanning for idle connections.. Dec 15 04:55:50 localhost ceph-mgr[292421]: [volumes INFO mgr_util] cleaning up connections: [('cephfs', )] Dec 15 04:55:50 localhost ceph-mgr[292421]: [volumes INFO mgr_util] disconnecting from cephfs 'cephfs' Dec 15 04:55:50 localhost ceph-mgr[292421]: [volumes INFO mgr_util] scanning for idle connections.. 
Dec 15 04:55:50 localhost ceph-mgr[292421]: [volumes INFO mgr_util] cleaning up connections: [('cephfs', )] Dec 15 04:55:50 localhost ceph-mgr[292421]: [volumes INFO mgr_util] disconnecting from cephfs 'cephfs' Dec 15 04:55:50 localhost ceph-mgr[292421]: log_channel(cluster) log [DBG] : pgmap v49: 177 pgs: 177 active+clean; 104 MiB data, 548 MiB used, 41 GiB / 42 GiB avail Dec 15 04:55:50 localhost nova_compute[286344]: 2025-12-15 09:55:50.843 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 04:55:51 localhost ceph-mon[298913]: log_channel(cluster) log [DBG] : mgrmap e36: np0005559462.fudvyx(active, since 91s), standbys: np0005559463.daptkf, np0005559461.egwgzn, np0005559464.aomnqe Dec 15 04:55:51 localhost ovn_metadata_agent[160585]: 2025-12-15 09:55:51.473 160590 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Dec 15 04:55:51 localhost ovn_metadata_agent[160585]: 2025-12-15 09:55:51.473 160590 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Dec 15 04:55:51 localhost ovn_metadata_agent[160585]: 2025-12-15 09:55:51.474 160590 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Dec 15 04:55:52 localhost ceph-mgr[292421]: log_channel(cluster) log [DBG] : pgmap v50: 177 pgs: 177 active+clean; 104 MiB data, 548 MiB used, 41 GiB / 42 GiB avail; 255 B/s wr, 0 op/s Dec 15 04:55:54 localhost 
ceph-mgr[292421]: mgr.server handle_open ignoring open from mon.np0005559464 172.18.0.108:0/769215439; not ready for session (expect reconnect) Dec 15 04:55:54 localhost ceph-mon[298913]: mon.np0005559462@0(leader) e16 handle_command mon_command({"prefix": "mon metadata", "id": "np0005559464"} v 0) Dec 15 04:55:54 localhost ceph-mon[298913]: log_channel(audit) log [DBG] : from='mgr.26879 172.18.0.106:0/2534740212' entity='mgr.np0005559462.fudvyx' cmd={"prefix": "mon metadata", "id": "np0005559464"} : dispatch Dec 15 04:55:54 localhost ceph-mgr[292421]: mgr finish mon failed to return metadata for mon.np0005559464: (2) No such file or directory Dec 15 04:55:54 localhost ceph-mgr[292421]: log_channel(cluster) log [DBG] : pgmap v51: 177 pgs: 177 active+clean; 104 MiB data, 548 MiB used, 41 GiB / 42 GiB avail; 255 B/s wr, 0 op/s Dec 15 04:55:55 localhost ceph-mon[298913]: mon.np0005559462@0(leader).osd e88 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Dec 15 04:55:55 localhost ceph-mgr[292421]: mgr.server handle_open ignoring open from mon.np0005559464 172.18.0.108:0/769215439; not ready for session (expect reconnect) Dec 15 04:55:55 localhost ceph-mon[298913]: mon.np0005559462@0(leader) e16 handle_command mon_command({"prefix": "mon metadata", "id": "np0005559464"} v 0) Dec 15 04:55:55 localhost ceph-mon[298913]: log_channel(audit) log [DBG] : from='mgr.26879 172.18.0.106:0/2534740212' entity='mgr.np0005559462.fudvyx' cmd={"prefix": "mon metadata", "id": "np0005559464"} : dispatch Dec 15 04:55:55 localhost ceph-mgr[292421]: mgr finish mon failed to return metadata for mon.np0005559464: (2) No such file or directory Dec 15 04:55:55 localhost sshd[308940]: main: sshd: ssh-rsa algorithm is disabled Dec 15 04:55:55 localhost systemd[1]: Started /usr/bin/podman healthcheck run 67ee72fcd99c896f79e8807764b1d0f04e7d45927d06e306c0986394e8ed42a0. 
Dec 15 04:55:55 localhost systemd[1]: Started /usr/bin/podman healthcheck run 9f0f481c937606298203797a71924b7614a5d9e3f4a8afd837cfa5b9602aedeb. Dec 15 04:55:55 localhost systemd[1]: Started /usr/bin/podman healthcheck run b3d8483ade539500e228c2af9c0c3e647febb5ec7903610a83aea32631007d4a. Dec 15 04:55:55 localhost podman[308941]: 2025-12-15 09:55:55.76385294 +0000 UTC m=+0.089375151 container health_status 67ee72fcd99c896f79e8807764b1d0f04e7d45927d06e306c0986394e8ed42a0 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}) Dec 15 04:55:55 localhost systemd[1]: tmp-crun.pOQsVk.mount: Deactivated successfully. 
Dec 15 04:55:55 localhost podman[308943]: 2025-12-15 09:55:55.827543948 +0000 UTC m=+0.147643319 container health_status b3d8483ade539500e228c2af9c0c3e647febb5ec7903610a83aea32631007d4a (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '07f3af15d79276418f54a8dde2fc251064527c3571430e14af77aaa2c01f1193-cb1e9478634a00c8fb5ad9cd7fcd38c7b2a1a85187dfaa618bc715c24c2b15fb'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, 
config_id=ceilometer_agent_compute) Dec 15 04:55:55 localhost nova_compute[286344]: 2025-12-15 09:55:55.846 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 04:55:55 localhost podman[308942]: 2025-12-15 09:55:55.873066822 +0000 UTC m=+0.195984661 container health_status 9f0f481c937606298203797a71924b7614a5d9e3f4a8afd837cfa5b9602aedeb (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.vendor=CentOS, config_id=multipathd, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, 
container_name=multipathd, tcib_managed=true, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3) Dec 15 04:55:55 localhost podman[308941]: 2025-12-15 09:55:55.888336425 +0000 UTC m=+0.213858616 container exec_died 67ee72fcd99c896f79e8807764b1d0f04e7d45927d06e306c0986394e8ed42a0 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible) Dec 15 04:55:55 localhost systemd[1]: 67ee72fcd99c896f79e8807764b1d0f04e7d45927d06e306c0986394e8ed42a0.service: Deactivated successfully. 
Dec 15 04:55:55 localhost podman[308942]: 2025-12-15 09:55:55.91337682 +0000 UTC m=+0.236294649 container exec_died 9f0f481c937606298203797a71924b7614a5d9e3f4a8afd837cfa5b9602aedeb (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, managed_by=edpm_ansible, config_id=multipathd, org.label-schema.license=GPLv2, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS) Dec 15 04:55:55 localhost systemd[1]: 9f0f481c937606298203797a71924b7614a5d9e3f4a8afd837cfa5b9602aedeb.service: Deactivated successfully. 
Dec 15 04:55:55 localhost podman[308943]: 2025-12-15 09:55:55.94145258 +0000 UTC m=+0.261551991 container exec_died b3d8483ade539500e228c2af9c0c3e647febb5ec7903610a83aea32631007d4a (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, tcib_managed=true, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '07f3af15d79276418f54a8dde2fc251064527c3571430e14af77aaa2c01f1193-cb1e9478634a00c8fb5ad9cd7fcd38c7b2a1a85187dfaa618bc715c24c2b15fb'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute) Dec 
15 04:55:55 localhost systemd[1]: b3d8483ade539500e228c2af9c0c3e647febb5ec7903610a83aea32631007d4a.service: Deactivated successfully. Dec 15 04:55:56 localhost ceph-mgr[292421]: mgr.server handle_open ignoring open from mon.np0005559464 172.18.0.108:0/769215439; not ready for session (expect reconnect) Dec 15 04:55:56 localhost ceph-mon[298913]: mon.np0005559462@0(leader) e16 handle_command mon_command({"prefix": "mon metadata", "id": "np0005559464"} v 0) Dec 15 04:55:56 localhost ceph-mon[298913]: log_channel(audit) log [DBG] : from='mgr.26879 172.18.0.106:0/2534740212' entity='mgr.np0005559462.fudvyx' cmd={"prefix": "mon metadata", "id": "np0005559464"} : dispatch Dec 15 04:55:56 localhost ceph-mgr[292421]: mgr finish mon failed to return metadata for mon.np0005559464: (2) No such file or directory Dec 15 04:55:56 localhost ceph-mgr[292421]: log_channel(cluster) log [DBG] : pgmap v52: 177 pgs: 177 active+clean; 104 MiB data, 548 MiB used, 41 GiB / 42 GiB avail; 255 B/s wr, 0 op/s Dec 15 04:55:57 localhost ceph-mgr[292421]: mgr.server handle_open ignoring open from mon.np0005559464 172.18.0.108:0/769215439; not ready for session (expect reconnect) Dec 15 04:55:57 localhost ceph-mon[298913]: mon.np0005559462@0(leader) e16 handle_command mon_command({"prefix": "mon metadata", "id": "np0005559464"} v 0) Dec 15 04:55:57 localhost ceph-mon[298913]: log_channel(audit) log [DBG] : from='mgr.26879 172.18.0.106:0/2534740212' entity='mgr.np0005559462.fudvyx' cmd={"prefix": "mon metadata", "id": "np0005559464"} : dispatch Dec 15 04:55:57 localhost ceph-mgr[292421]: mgr finish mon failed to return metadata for mon.np0005559464: (2) No such file or directory Dec 15 04:55:57 localhost systemd[1]: Started /usr/bin/podman healthcheck run 730c50020a7d9e2da21d2462d40af20ac303e43f3491f692cbbd87f432ba3e09. Dec 15 04:55:57 localhost systemd[1]: Started /usr/bin/podman healthcheck run ac2eae30a2a7f971398af42ea67b3ed799d4b3341f7688ebcd73f7ca7f66c2a8. 
Dec 15 04:55:57 localhost systemd[1]: tmp-crun.FFWa12.mount: Deactivated successfully. Dec 15 04:55:57 localhost podman[308998]: 2025-12-15 09:55:57.767392842 +0000 UTC m=+0.096992563 container health_status 730c50020a7d9e2da21d2462d40af20ac303e43f3491f692cbbd87f432ba3e09 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, config_id=openstack_network_exporter, com.redhat.component=ubi9-minimal-container, container_name=openstack_network_exporter, io.buildah.version=1.33.7, name=ubi9-minimal, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, maintainer=Red Hat, Inc., release=1755695350, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., version=9.6, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., managed_by=edpm_ansible, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, url=https://catalog.redhat.com/en/search?searchType=containers, build-date=2025-08-20T13:12:41, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'cb1e9478634a00c8fb5ad9cd7fcd38c7b2a1a85187dfaa618bc715c24c2b15fb'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, distribution-scope=public, io.openshift.expose-services=, io.openshift.tags=minimal rhel9, vcs-type=git) Dec 15 04:55:57 localhost systemd[1]: tmp-crun.UIJogX.mount: Deactivated successfully. 
Dec 15 04:55:57 localhost podman[308999]: 2025-12-15 09:55:57.814210731 +0000 UTC m=+0.140547082 container health_status ac2eae30a2a7f971398af42ea67b3ed799d4b3341f7688ebcd73f7ca7f66c2a8 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '07f3af15d79276418f54a8dde2fc251064527c3571430e14af77aaa2c01f1193'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, io.buildah.version=1.41.3, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2) Dec 15 04:55:57 localhost podman[308998]: 2025-12-15 09:55:57.836081418 +0000 UTC m=+0.165681169 container exec_died 730c50020a7d9e2da21d2462d40af20ac303e43f3491f692cbbd87f432ba3e09 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, vendor=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, 
architecture=x86_64, build-date=2025-08-20T13:12:41, release=1755695350, config_id=openstack_network_exporter, distribution-scope=public, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=openstack_network_exporter, io.buildah.version=1.33.7, io.openshift.expose-services=, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., url=https://catalog.redhat.com/en/search?searchType=containers, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'cb1e9478634a00c8fb5ad9cd7fcd38c7b2a1a85187dfaa618bc715c24c2b15fb'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., version=9.6, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. 
This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=ubi9-minimal, vcs-type=git, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, managed_by=edpm_ansible, io.openshift.tags=minimal rhel9) Dec 15 04:55:57 localhost systemd[1]: 730c50020a7d9e2da21d2462d40af20ac303e43f3491f692cbbd87f432ba3e09.service: Deactivated successfully. Dec 15 04:55:57 localhost podman[308999]: 2025-12-15 09:55:57.879578666 +0000 UTC m=+0.205915097 container exec_died ac2eae30a2a7f971398af42ea67b3ed799d4b3341f7688ebcd73f7ca7f66c2a8 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, config_id=ovn_controller, container_name=ovn_controller, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '07f3af15d79276418f54a8dde2fc251064527c3571430e14af77aaa2c01f1193'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack 
Kubernetes Operator team) Dec 15 04:55:57 localhost systemd[1]: ac2eae30a2a7f971398af42ea67b3ed799d4b3341f7688ebcd73f7ca7f66c2a8.service: Deactivated successfully. Dec 15 04:55:58 localhost ceph-mgr[292421]: mgr.server handle_open ignoring open from mon.np0005559464 172.18.0.108:0/769215439; not ready for session (expect reconnect) Dec 15 04:55:58 localhost ceph-mon[298913]: mon.np0005559462@0(leader) e16 handle_command mon_command({"prefix": "mon metadata", "id": "np0005559464"} v 0) Dec 15 04:55:58 localhost ceph-mon[298913]: log_channel(audit) log [DBG] : from='mgr.26879 172.18.0.106:0/2534740212' entity='mgr.np0005559462.fudvyx' cmd={"prefix": "mon metadata", "id": "np0005559464"} : dispatch Dec 15 04:55:58 localhost ceph-mgr[292421]: mgr finish mon failed to return metadata for mon.np0005559464: (2) No such file or directory Dec 15 04:55:58 localhost ceph-mgr[292421]: log_channel(cluster) log [DBG] : pgmap v53: 177 pgs: 177 active+clean; 104 MiB data, 548 MiB used, 41 GiB / 42 GiB avail; 255 B/s wr, 0 op/s Dec 15 04:55:59 localhost ceph-mgr[292421]: mgr.server handle_open ignoring open from mon.np0005559464 172.18.0.108:0/769215439; not ready for session (expect reconnect) Dec 15 04:55:59 localhost ceph-mon[298913]: mon.np0005559462@0(leader) e16 handle_command mon_command({"prefix": "mon metadata", "id": "np0005559464"} v 0) Dec 15 04:55:59 localhost ceph-mon[298913]: log_channel(audit) log [DBG] : from='mgr.26879 172.18.0.106:0/2534740212' entity='mgr.np0005559462.fudvyx' cmd={"prefix": "mon metadata", "id": "np0005559464"} : dispatch Dec 15 04:55:59 localhost ceph-mgr[292421]: mgr finish mon failed to return metadata for mon.np0005559464: (2) No such file or directory Dec 15 04:56:00 localhost ceph-mon[298913]: mon.np0005559462@0(leader).osd e88 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Dec 15 04:56:00 localhost ceph-mon[298913]: log_channel(cluster) log [DBG] : mgrmap e37: 
np0005559462.fudvyx(active, since 100s), standbys: np0005559463.daptkf, np0005559464.aomnqe Dec 15 04:56:00 localhost ceph-mon[298913]: mon.np0005559462@0(leader) e16 handle_command mon_command({"prefix": "df", "format": "json"} v 0) Dec 15 04:56:00 localhost ceph-mon[298913]: log_channel(audit) log [DBG] : from='client.? 172.18.0.108:0/461008315' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch Dec 15 04:56:00 localhost ceph-mgr[292421]: mgr.server handle_open ignoring open from mon.np0005559464 172.18.0.108:0/769215439; not ready for session (expect reconnect) Dec 15 04:56:00 localhost ceph-mon[298913]: mon.np0005559462@0(leader) e16 handle_command mon_command({"prefix": "mon metadata", "id": "np0005559464"} v 0) Dec 15 04:56:00 localhost ceph-mon[298913]: log_channel(audit) log [DBG] : from='mgr.26879 172.18.0.106:0/2534740212' entity='mgr.np0005559462.fudvyx' cmd={"prefix": "mon metadata", "id": "np0005559464"} : dispatch Dec 15 04:56:00 localhost ceph-mgr[292421]: mgr finish mon failed to return metadata for mon.np0005559464: (2) No such file or directory Dec 15 04:56:00 localhost ceph-mgr[292421]: log_channel(cluster) log [DBG] : pgmap v54: 177 pgs: 177 active+clean; 104 MiB data, 548 MiB used, 41 GiB / 42 GiB avail; 255 B/s wr, 0 op/s Dec 15 04:56:00 localhost nova_compute[286344]: 2025-12-15 09:56:00.851 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Dec 15 04:56:01 localhost ceph-mon[298913]: mon.np0005559462@0(leader) e16 handle_command mon_command({"prefix": "df", "format": "json"} v 0) Dec 15 04:56:01 localhost ceph-mon[298913]: log_channel(audit) log [DBG] : from='client.? 
172.18.0.108:0/3302889907' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch Dec 15 04:56:01 localhost nova_compute[286344]: 2025-12-15 09:56:01.290 286348 DEBUG oslo_service.periodic_task [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Dec 15 04:56:01 localhost ceph-mgr[292421]: mgr.server handle_open ignoring open from mon.np0005559464 172.18.0.108:0/769215439; not ready for session (expect reconnect) Dec 15 04:56:01 localhost ceph-mon[298913]: mon.np0005559462@0(leader) e16 handle_command mon_command({"prefix": "mon metadata", "id": "np0005559464"} v 0) Dec 15 04:56:01 localhost ceph-mon[298913]: log_channel(audit) log [DBG] : from='mgr.26879 172.18.0.106:0/2534740212' entity='mgr.np0005559462.fudvyx' cmd={"prefix": "mon metadata", "id": "np0005559464"} : dispatch Dec 15 04:56:01 localhost ceph-mgr[292421]: mgr finish mon failed to return metadata for mon.np0005559464: (2) No such file or directory Dec 15 04:56:01 localhost podman[243449]: time="2025-12-15T09:56:01Z" level=info msg="List containers: received `last` parameter - overwriting `limit`" Dec 15 04:56:01 localhost podman[243449]: @ - - [15/Dec/2025:09:56:01 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 154814 "" "Go-http-client/1.1" Dec 15 04:56:01 localhost podman[243449]: @ - - [15/Dec/2025:09:56:01 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 18722 "" "Go-http-client/1.1" Dec 15 04:56:02 localhost nova_compute[286344]: 2025-12-15 09:56:02.266 286348 DEBUG oslo_service.periodic_task [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks 
/usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Dec 15 04:56:02 localhost ceph-mgr[292421]: mgr.server handle_open ignoring open from mon.np0005559464 172.18.0.108:0/769215439; not ready for session (expect reconnect) Dec 15 04:56:02 localhost ceph-mon[298913]: mon.np0005559462@0(leader) e16 handle_command mon_command({"prefix": "mon metadata", "id": "np0005559464"} v 0) Dec 15 04:56:02 localhost ceph-mon[298913]: log_channel(audit) log [DBG] : from='mgr.26879 172.18.0.106:0/2534740212' entity='mgr.np0005559462.fudvyx' cmd={"prefix": "mon metadata", "id": "np0005559464"} : dispatch Dec 15 04:56:02 localhost ceph-mgr[292421]: mgr finish mon failed to return metadata for mon.np0005559464: (2) No such file or directory Dec 15 04:56:02 localhost ceph-mgr[292421]: log_channel(cluster) log [DBG] : pgmap v55: 177 pgs: 177 active+clean; 104 MiB data, 548 MiB used, 41 GiB / 42 GiB avail; 255 B/s wr, 0 op/s Dec 15 04:56:03 localhost ceph-mon[298913]: mon.np0005559462@0(leader) e16 handle_command mon_command({"prefix": "status", "format": "json"} v 0) Dec 15 04:56:03 localhost ceph-mon[298913]: log_channel(audit) log [DBG] : from='client.? 
172.18.0.200:0/1418799730' entity='client.admin' cmd={"prefix": "status", "format": "json"} : dispatch Dec 15 04:56:03 localhost ceph-mgr[292421]: mgr.server handle_open ignoring open from mon.np0005559464 172.18.0.108:0/769215439; not ready for session (expect reconnect) Dec 15 04:56:03 localhost ceph-mon[298913]: mon.np0005559462@0(leader) e16 handle_command mon_command({"prefix": "mon metadata", "id": "np0005559464"} v 0) Dec 15 04:56:03 localhost ceph-mon[298913]: log_channel(audit) log [DBG] : from='mgr.26879 172.18.0.106:0/2534740212' entity='mgr.np0005559462.fudvyx' cmd={"prefix": "mon metadata", "id": "np0005559464"} : dispatch Dec 15 04:56:03 localhost ceph-mgr[292421]: mgr finish mon failed to return metadata for mon.np0005559464: (2) No such file or directory Dec 15 04:56:04 localhost ceph-mgr[292421]: mgr.server handle_open ignoring open from mon.np0005559464 172.18.0.108:0/769215439; not ready for session (expect reconnect) Dec 15 04:56:04 localhost ceph-mon[298913]: mon.np0005559462@0(leader) e16 handle_command mon_command({"prefix": "mon metadata", "id": "np0005559464"} v 0) Dec 15 04:56:04 localhost ceph-mon[298913]: log_channel(audit) log [DBG] : from='mgr.26879 172.18.0.106:0/2534740212' entity='mgr.np0005559462.fudvyx' cmd={"prefix": "mon metadata", "id": "np0005559464"} : dispatch Dec 15 04:56:04 localhost ceph-mgr[292421]: mgr finish mon failed to return metadata for mon.np0005559464: (2) No such file or directory Dec 15 04:56:04 localhost ceph-mon[298913]: mon.np0005559462@0(leader) e16 adding peer [v2:172.18.0.105:3300/0,v1:172.18.0.105:6789/0] to list of hints Dec 15 04:56:04 localhost ceph-mon[298913]: mon.np0005559462@0(leader) e16 adding peer [v2:172.18.0.105:3300/0,v1:172.18.0.105:6789/0] to list of hints Dec 15 04:56:04 localhost ceph-mon[298913]: mon.np0005559462@0(leader) e16 adding peer [v2:172.18.0.105:3300/0,v1:172.18.0.105:6789/0] to list of hints Dec 15 04:56:04 localhost ceph-mon[298913]: mon.np0005559462@0(leader) e16 adding 
peer [v2:172.18.0.105:3300/0,v1:172.18.0.105:6789/0] to list of hints Dec 15 04:56:04 localhost ceph-mon[298913]: mon.np0005559462@0(leader) e16 adding peer [v2:172.18.0.105:3300/0,v1:172.18.0.105:6789/0] to list of hints Dec 15 04:56:04 localhost ceph-mon[298913]: mon.np0005559462@0(leader) e16 adding peer [v2:172.18.0.105:3300/0,v1:172.18.0.105:6789/0] to list of hints Dec 15 04:56:04 localhost ceph-mon[298913]: mon.np0005559462@0(leader) e16 adding peer [v2:172.18.0.105:3300/0,v1:172.18.0.105:6789/0] to list of hints Dec 15 04:56:04 localhost ceph-mon[298913]: mon.np0005559462@0(leader) e16 adding peer [v2:172.18.0.105:3300/0,v1:172.18.0.105:6789/0] to list of hints Dec 15 04:56:04 localhost ceph-mon[298913]: mon.np0005559462@0(leader).monmap v16 adding/updating np0005559464 at [v2:172.18.0.105:3300/0,v1:172.18.0.105:6789/0] to monitor cluster Dec 15 04:56:04 localhost ceph-mon[298913]: mon.np0005559462@0(leader) e16 adding peer [v2:172.18.0.105:3300/0,v1:172.18.0.105:6789/0] to list of hints Dec 15 04:56:04 localhost ceph-mon[298913]: mon.np0005559462@0(leader) e16 adding peer [v2:172.18.0.105:3300/0,v1:172.18.0.105:6789/0] to list of hints Dec 15 04:56:04 localhost ceph-mon[298913]: mon.np0005559462@0(leader) e16 adding peer [v2:172.18.0.105:3300/0,v1:172.18.0.105:6789/0] to list of hints Dec 15 04:56:04 localhost ceph-mon[298913]: mon.np0005559462@0(leader) e16 adding peer [v2:172.18.0.105:3300/0,v1:172.18.0.105:6789/0] to list of hints Dec 15 04:56:04 localhost ceph-mon[298913]: mon.np0005559462@0(leader) e16 adding peer [v2:172.18.0.105:3300/0,v1:172.18.0.105:6789/0] to list of hints Dec 15 04:56:04 localhost ceph-mon[298913]: mon.np0005559462@0(leader) e16 adding peer [v2:172.18.0.105:3300/0,v1:172.18.0.105:6789/0] to list of hints Dec 15 04:56:04 localhost ceph-mon[298913]: mon.np0005559462@0(probing) e17 handle_command mon_command({"prefix": "mon metadata", "id": "np0005559462"} v 0) Dec 15 04:56:04 localhost ceph-mon[298913]: log_channel(audit) log 
[DBG] : from='mgr.26879 172.18.0.106:0/2534740212' entity='mgr.np0005559462.fudvyx' cmd={"prefix": "mon metadata", "id": "np0005559462"} : dispatch Dec 15 04:56:04 localhost ceph-mon[298913]: mon.np0005559462@0(probing) e17 handle_command mon_command({"prefix": "mon metadata", "id": "np0005559463"} v 0) Dec 15 04:56:04 localhost ceph-mon[298913]: log_channel(audit) log [DBG] : from='mgr.26879 172.18.0.106:0/2534740212' entity='mgr.np0005559462.fudvyx' cmd={"prefix": "mon metadata", "id": "np0005559463"} : dispatch Dec 15 04:56:04 localhost ceph-mon[298913]: mon.np0005559462@0(probing) e17 handle_command mon_command({"prefix": "mon metadata", "id": "np0005559464"} v 0) Dec 15 04:56:04 localhost ceph-mon[298913]: log_channel(audit) log [DBG] : from='mgr.26879 172.18.0.106:0/2534740212' entity='mgr.np0005559462.fudvyx' cmd={"prefix": "mon metadata", "id": "np0005559464"} : dispatch Dec 15 04:56:04 localhost ceph-mon[298913]: log_channel(cluster) log [INF] : mon.np0005559462 calling monitor election Dec 15 04:56:04 localhost ceph-mon[298913]: paxos.0).electionLogic(70) init, last seen epoch 70 Dec 15 04:56:04 localhost ceph-mgr[292421]: mgr finish mon failed to return metadata for mon.np0005559464: (22) Invalid argument Dec 15 04:56:04 localhost ceph-mon[298913]: mon.np0005559462@0(electing) e17 collect_metadata vda: no unique device id for vda: fallback method has no model nor serial Dec 15 04:56:04 localhost ceph-mgr[292421]: log_channel(cluster) log [DBG] : pgmap v56: 177 pgs: 177 active+clean; 104 MiB data, 548 MiB used, 41 GiB / 42 GiB avail Dec 15 04:56:04 localhost ceph-mon[298913]: mon.np0005559462@0(electing) e17 handle_auth_request failed to assign global_id Dec 15 04:56:04 localhost openstack_network_exporter[246484]: ERROR 09:56:04 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server Dec 15 04:56:04 localhost openstack_network_exporter[246484]: ERROR 09:56:04 appctl.go:144: Failed to get PID for 
ovn-northd: no control socket files found for ovn-northd Dec 15 04:56:04 localhost openstack_network_exporter[246484]: ERROR 09:56:04 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd Dec 15 04:56:04 localhost openstack_network_exporter[246484]: ERROR 09:56:04 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath Dec 15 04:56:04 localhost openstack_network_exporter[246484]: Dec 15 04:56:04 localhost openstack_network_exporter[246484]: ERROR 09:56:04 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath Dec 15 04:56:04 localhost openstack_network_exporter[246484]: Dec 15 04:56:05 localhost ceph-mon[298913]: mon.np0005559462@0(electing) e17 handle_auth_request failed to assign global_id Dec 15 04:56:05 localhost nova_compute[286344]: 2025-12-15 09:56:05.270 286348 DEBUG oslo_service.periodic_task [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Dec 15 04:56:05 localhost nova_compute[286344]: 2025-12-15 09:56:05.271 286348 DEBUG oslo_service.periodic_task [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Dec 15 04:56:05 localhost ceph-mon[298913]: mon.np0005559462@0(electing) e17 handle_auth_request failed to assign global_id Dec 15 04:56:05 localhost ceph-mgr[292421]: mgr.server handle_open ignoring open from mon.np0005559464 172.18.0.108:0/769215439; not ready for session (expect reconnect) Dec 15 04:56:05 localhost ceph-mon[298913]: mon.np0005559462@0(electing) e17 handle_command mon_command({"prefix": "mon metadata", "id": "np0005559464"} v 0) Dec 15 04:56:05 localhost ceph-mon[298913]: log_channel(audit) log [DBG] : from='mgr.26879 
172.18.0.106:0/2534740212' entity='mgr.np0005559462.fudvyx' cmd={"prefix": "mon metadata", "id": "np0005559464"} : dispatch Dec 15 04:56:05 localhost ceph-mgr[292421]: mgr finish mon failed to return metadata for mon.np0005559464: (22) Invalid argument Dec 15 04:56:05 localhost ceph-mon[298913]: mon.np0005559462@0(electing) e17 handle_auth_request failed to assign global_id Dec 15 04:56:05 localhost ceph-mon[298913]: mon.np0005559462@0(electing) e17 handle_auth_request failed to assign global_id Dec 15 04:56:05 localhost nova_compute[286344]: 2025-12-15 09:56:05.520 286348 DEBUG oslo_concurrency.lockutils [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Dec 15 04:56:05 localhost nova_compute[286344]: 2025-12-15 09:56:05.521 286348 DEBUG oslo_concurrency.lockutils [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Dec 15 04:56:05 localhost nova_compute[286344]: 2025-12-15 09:56:05.521 286348 DEBUG oslo_concurrency.lockutils [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Dec 15 04:56:05 localhost nova_compute[286344]: 2025-12-15 09:56:05.522 286348 DEBUG nova.compute.resource_tracker [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Auditing locally available compute resources for np0005559462.localdomain (node: np0005559462.localdomain) update_available_resource 
/usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m Dec 15 04:56:05 localhost nova_compute[286344]: 2025-12-15 09:56:05.522 286348 DEBUG oslo_concurrency.processutils [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m Dec 15 04:56:05 localhost ceph-mon[298913]: mon.np0005559462@0(electing) e17 handle_auth_request failed to assign global_id Dec 15 04:56:05 localhost systemd[1]: Started /usr/bin/podman healthcheck run 4d46c406a3855fd1c322e5d86c048188ea5865daa07ba23aaebacc7879668923. Dec 15 04:56:05 localhost podman[309052]: 2025-12-15 09:56:05.746212678 +0000 UTC m=+0.081804871 container health_status 4d46c406a3855fd1c322e5d86c048188ea5865daa07ba23aaebacc7879668923 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, tcib_managed=true, config_id=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '07f3af15d79276418f54a8dde2fc251064527c3571430e14af77aaa2c01f1193-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=ovn_metadata_agent, managed_by=edpm_ansible) Dec 15 04:56:05 localhost podman[309052]: 2025-12-15 09:56:05.780389447 +0000 UTC m=+0.115981620 container exec_died 4d46c406a3855fd1c322e5d86c048188ea5865daa07ba23aaebacc7879668923 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.build-date=20251202, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '07f3af15d79276418f54a8dde2fc251064527c3571430e14af77aaa2c01f1193-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', 
'/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb) Dec 15 04:56:05 localhost systemd[1]: 4d46c406a3855fd1c322e5d86c048188ea5865daa07ba23aaebacc7879668923.service: Deactivated successfully. Dec 15 04:56:05 localhost ceph-mon[298913]: mon.np0005559462@0(electing) e17 handle_auth_request failed to assign global_id Dec 15 04:56:05 localhost nova_compute[286344]: 2025-12-15 09:56:05.855 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Dec 15 04:56:05 localhost ceph-mon[298913]: mon.np0005559462@0(electing) e17 handle_auth_request failed to assign global_id Dec 15 04:56:06 localhost ceph-mon[298913]: mon.np0005559462@0(electing) e17 handle_auth_request failed to assign global_id Dec 15 04:56:06 localhost ceph-mon[298913]: mon.np0005559462@0(electing) e17 handle_auth_request failed to assign global_id Dec 15 04:56:06 localhost ceph-mgr[292421]: mgr.server handle_open ignoring open from mon.np0005559464 172.18.0.108:0/769215439; not ready for session (expect reconnect) Dec 15 04:56:06 localhost ceph-mon[298913]: mon.np0005559462@0(electing) e17 handle_command mon_command({"prefix": "mon metadata", "id": "np0005559464"} v 0) Dec 15 04:56:06 localhost ceph-mon[298913]: log_channel(audit) log [DBG] : from='mgr.26879 172.18.0.106:0/2534740212' entity='mgr.np0005559462.fudvyx' cmd={"prefix": "mon 
metadata", "id": "np0005559464"} : dispatch Dec 15 04:56:06 localhost ceph-mgr[292421]: mgr finish mon failed to return metadata for mon.np0005559464: (22) Invalid argument Dec 15 04:56:06 localhost ceph-mon[298913]: mon.np0005559462@0(electing) e17 handle_auth_request failed to assign global_id Dec 15 04:56:06 localhost ceph-mgr[292421]: log_channel(cluster) log [DBG] : pgmap v57: 177 pgs: 177 active+clean; 104 MiB data, 548 MiB used, 41 GiB / 42 GiB avail Dec 15 04:56:07 localhost ceph-mon[298913]: mon.np0005559462@0(electing) e17 handle_auth_request failed to assign global_id Dec 15 04:56:07 localhost ceph-mgr[292421]: mgr.server handle_open ignoring open from mon.np0005559464 172.18.0.108:0/769215439; not ready for session (expect reconnect) Dec 15 04:56:07 localhost ceph-mon[298913]: mon.np0005559462@0(electing) e17 handle_command mon_command({"prefix": "mon metadata", "id": "np0005559464"} v 0) Dec 15 04:56:07 localhost ceph-mon[298913]: log_channel(audit) log [DBG] : from='mgr.26879 172.18.0.106:0/2534740212' entity='mgr.np0005559462.fudvyx' cmd={"prefix": "mon metadata", "id": "np0005559464"} : dispatch Dec 15 04:56:07 localhost ceph-mgr[292421]: mgr finish mon failed to return metadata for mon.np0005559464: (22) Invalid argument Dec 15 04:56:07 localhost ceph-mon[298913]: mon.np0005559462@0(electing) e17 handle_auth_request failed to assign global_id Dec 15 04:56:08 localhost ceph-mon[298913]: mon.np0005559462@0(electing) e17 handle_auth_request failed to assign global_id Dec 15 04:56:08 localhost ceph-mon[298913]: mon.np0005559462@0(electing) e17 handle_auth_request failed to assign global_id Dec 15 04:56:08 localhost ceph-mgr[292421]: mgr.server handle_open ignoring open from mon.np0005559464 172.18.0.108:0/769215439; not ready for session (expect reconnect) Dec 15 04:56:08 localhost ceph-mon[298913]: mon.np0005559462@0(electing) e17 handle_command mon_command({"prefix": "mon metadata", "id": "np0005559464"} v 0) Dec 15 04:56:08 localhost 
ceph-mon[298913]: log_channel(audit) log [DBG] : from='mgr.26879 172.18.0.106:0/2534740212' entity='mgr.np0005559462.fudvyx' cmd={"prefix": "mon metadata", "id": "np0005559464"} : dispatch Dec 15 04:56:08 localhost ceph-mgr[292421]: mgr finish mon failed to return metadata for mon.np0005559464: (22) Invalid argument Dec 15 04:56:08 localhost ceph-mon[298913]: mon.np0005559462@0(electing) e17 handle_auth_request failed to assign global_id Dec 15 04:56:08 localhost ceph-mon[298913]: mon.np0005559462@0(electing) e17 handle_auth_request failed to assign global_id Dec 15 04:56:08 localhost ceph-mon[298913]: mon.np0005559462@0(electing) e17 handle_auth_request failed to assign global_id Dec 15 04:56:08 localhost ceph-mgr[292421]: log_channel(cluster) log [DBG] : pgmap v58: 177 pgs: 177 active+clean; 104 MiB data, 548 MiB used, 41 GiB / 42 GiB avail Dec 15 04:56:08 localhost ceph-mon[298913]: mon.np0005559462@0(electing) e17 handle_auth_request failed to assign global_id Dec 15 04:56:08 localhost ceph-mon[298913]: mon.np0005559462@0(electing) e17 handle_auth_request failed to assign global_id Dec 15 04:56:09 localhost ceph-mon[298913]: mon.np0005559462@0(electing) e17 handle_auth_request failed to assign global_id Dec 15 04:56:09 localhost ceph-mon[298913]: mon.np0005559462@0(electing) e17 handle_auth_request failed to assign global_id Dec 15 04:56:09 localhost ceph-mgr[292421]: mgr.server handle_open ignoring open from mon.np0005559464 172.18.0.108:0/769215439; not ready for session (expect reconnect) Dec 15 04:56:09 localhost ceph-mon[298913]: mon.np0005559462@0(electing) e17 handle_command mon_command({"prefix": "mon metadata", "id": "np0005559464"} v 0) Dec 15 04:56:09 localhost ceph-mon[298913]: log_channel(audit) log [DBG] : from='mgr.26879 172.18.0.106:0/2534740212' entity='mgr.np0005559462.fudvyx' cmd={"prefix": "mon metadata", "id": "np0005559464"} : dispatch Dec 15 04:56:09 localhost ceph-mgr[292421]: mgr finish mon failed to return metadata for 
mon.np0005559464: (22) Invalid argument Dec 15 04:56:09 localhost ceph-mon[298913]: mon.np0005559462@0(electing) e17 handle_auth_request failed to assign global_id Dec 15 04:56:09 localhost ceph-mon[298913]: log_channel(cluster) log [INF] : mon.np0005559462 is new leader, mons np0005559462,np0005559463 in quorum (ranks 0,1) Dec 15 04:56:09 localhost ceph-mon[298913]: log_channel(cluster) log [DBG] : monmap epoch 17 Dec 15 04:56:09 localhost ceph-mon[298913]: log_channel(cluster) log [DBG] : fsid bce17446-41b5-5408-a23e-0b011906b44a Dec 15 04:56:09 localhost ceph-mon[298913]: log_channel(cluster) log [DBG] : last_changed 2025-12-15T09:56:04.705631+0000 Dec 15 04:56:09 localhost ceph-mon[298913]: log_channel(cluster) log [DBG] : created 2025-12-15T07:42:11.600132+0000 Dec 15 04:56:09 localhost ceph-mon[298913]: log_channel(cluster) log [DBG] : min_mon_release 18 (reef) Dec 15 04:56:09 localhost ceph-mon[298913]: log_channel(cluster) log [DBG] : election_strategy: 1 Dec 15 04:56:09 localhost ceph-mon[298913]: log_channel(cluster) log [DBG] : 0: [v2:172.18.0.103:3300/0,v1:172.18.0.103:6789/0] mon.np0005559462 Dec 15 04:56:09 localhost ceph-mon[298913]: log_channel(cluster) log [DBG] : 1: [v2:172.18.0.104:3300/0,v1:172.18.0.104:6789/0] mon.np0005559463 Dec 15 04:56:09 localhost ceph-mon[298913]: log_channel(cluster) log [DBG] : 2: [v2:172.18.0.105:3300/0,v1:172.18.0.105:6789/0] mon.np0005559464 Dec 15 04:56:09 localhost ceph-mon[298913]: mon.np0005559462@0(leader) e17 collect_metadata vda: no unique device id for vda: fallback method has no model nor serial Dec 15 04:56:09 localhost ceph-mon[298913]: log_channel(cluster) log [DBG] : fsmap cephfs:1 {0=mds.np0005559463.rdpgze=up:active} 2 up:standby Dec 15 04:56:09 localhost ceph-mon[298913]: log_channel(cluster) log [DBG] : osdmap e88: 6 total, 6 up, 6 in Dec 15 04:56:09 localhost ceph-mon[298913]: log_channel(cluster) log [DBG] : mgrmap e37: np0005559462.fudvyx(active, since 109s), standbys: np0005559463.daptkf, 
np0005559464.aomnqe Dec 15 04:56:09 localhost ceph-mon[298913]: log_channel(cluster) log [WRN] : Health check failed: 1/3 mons down, quorum np0005559462,np0005559463 (MON_DOWN) Dec 15 04:56:09 localhost ceph-mon[298913]: log_channel(cluster) log [WRN] : Health detail: HEALTH_WARN 1 stray daemon(s) not managed by cephadm; 1 stray host(s) with 1 daemon(s) not managed by cephadm; 1/3 mons down, quorum np0005559462,np0005559463 Dec 15 04:56:09 localhost ceph-mon[298913]: log_channel(cluster) log [WRN] : [WRN] CEPHADM_STRAY_DAEMON: 1 stray daemon(s) not managed by cephadm Dec 15 04:56:09 localhost ceph-mon[298913]: log_channel(cluster) log [WRN] : stray daemon mgr.np0005559460.oexkup on host np0005559460.localdomain not managed by cephadm Dec 15 04:56:09 localhost ceph-mon[298913]: log_channel(cluster) log [WRN] : [WRN] CEPHADM_STRAY_HOST: 1 stray host(s) with 1 daemon(s) not managed by cephadm Dec 15 04:56:09 localhost ceph-mon[298913]: log_channel(cluster) log [WRN] : stray host np0005559460.localdomain has 1 stray daemons: ['mgr.np0005559460.oexkup'] Dec 15 04:56:09 localhost ceph-mon[298913]: log_channel(cluster) log [WRN] : [WRN] MON_DOWN: 1/3 mons down, quorum np0005559462,np0005559463 Dec 15 04:56:09 localhost ceph-mon[298913]: log_channel(cluster) log [WRN] : mon.np0005559464 (rank 2) addr [v2:172.18.0.105:3300/0,v1:172.18.0.105:6789/0] is down (out of quorum) Dec 15 04:56:09 localhost ceph-mon[298913]: mon.np0005559463 calling monitor election Dec 15 04:56:09 localhost ceph-mon[298913]: mon.np0005559462 calling monitor election Dec 15 04:56:09 localhost ceph-mon[298913]: mon.np0005559462 is new leader, mons np0005559462,np0005559463 in quorum (ranks 0,1) Dec 15 04:56:09 localhost ceph-mon[298913]: Health check failed: 1/3 mons down, quorum np0005559462,np0005559463 (MON_DOWN) Dec 15 04:56:09 localhost ceph-mon[298913]: Health detail: HEALTH_WARN 1 stray daemon(s) not managed by cephadm; 1 stray host(s) with 1 daemon(s) not managed by cephadm; 1/3 mons down, 
quorum np0005559462,np0005559463 Dec 15 04:56:09 localhost ceph-mon[298913]: [WRN] CEPHADM_STRAY_DAEMON: 1 stray daemon(s) not managed by cephadm Dec 15 04:56:09 localhost ceph-mon[298913]: stray daemon mgr.np0005559460.oexkup on host np0005559460.localdomain not managed by cephadm Dec 15 04:56:09 localhost ceph-mon[298913]: [WRN] CEPHADM_STRAY_HOST: 1 stray host(s) with 1 daemon(s) not managed by cephadm Dec 15 04:56:09 localhost ceph-mon[298913]: stray host np0005559460.localdomain has 1 stray daemons: ['mgr.np0005559460.oexkup'] Dec 15 04:56:09 localhost ceph-mon[298913]: [WRN] MON_DOWN: 1/3 mons down, quorum np0005559462,np0005559463 Dec 15 04:56:09 localhost ceph-mon[298913]: mon.np0005559464 (rank 2) addr [v2:172.18.0.105:3300/0,v1:172.18.0.105:6789/0] is down (out of quorum) Dec 15 04:56:10 localhost ceph-mon[298913]: mon.np0005559462@0(leader).osd e88 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Dec 15 04:56:10 localhost ceph-mgr[292421]: mgr.server handle_open ignoring open from mon.np0005559464 172.18.0.108:0/769215439; not ready for session (expect reconnect) Dec 15 04:56:10 localhost ceph-mon[298913]: mon.np0005559462@0(leader) e17 handle_command mon_command({"prefix": "mon metadata", "id": "np0005559464"} v 0) Dec 15 04:56:10 localhost ceph-mon[298913]: log_channel(audit) log [DBG] : from='mgr.26879 172.18.0.106:0/2534740212' entity='mgr.np0005559462.fudvyx' cmd={"prefix": "mon metadata", "id": "np0005559464"} : dispatch Dec 15 04:56:10 localhost ceph-mon[298913]: mon.np0005559462@0(leader) e17 handle_command mon_command({"prefix": "df", "format": "json"} v 0) Dec 15 04:56:10 localhost ceph-mon[298913]: log_channel(audit) log [DBG] : from='client.? 
172.18.0.106:0/4166638744' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch Dec 15 04:56:10 localhost ceph-mgr[292421]: mgr finish mon failed to return metadata for mon.np0005559464: (22) Invalid argument Dec 15 04:56:10 localhost nova_compute[286344]: 2025-12-15 09:56:10.402 286348 DEBUG oslo_concurrency.processutils [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 4.880s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m Dec 15 04:56:10 localhost ceph-mgr[292421]: log_channel(cluster) log [DBG] : pgmap v59: 177 pgs: 177 active+clean; 104 MiB data, 548 MiB used, 41 GiB / 42 GiB avail Dec 15 04:56:10 localhost nova_compute[286344]: 2025-12-15 09:56:10.857 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Dec 15 04:56:11 localhost nova_compute[286344]: 2025-12-15 09:56:11.191 286348 DEBUG nova.virt.libvirt.driver [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m Dec 15 04:56:11 localhost nova_compute[286344]: 2025-12-15 09:56:11.191 286348 DEBUG nova.virt.libvirt.driver [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m Dec 15 04:56:11 localhost ceph-mon[298913]: mon.np0005559462@0(leader) e17 handle_command mon_command({"prefix":"df", "format":"json"} v 0) Dec 15 04:56:11 localhost ceph-mon[298913]: log_channel(audit) log [DBG] : from='client.? 
172.18.0.32:0/2070990210' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch Dec 15 04:56:11 localhost ceph-mon[298913]: mon.np0005559462@0(leader) e17 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0) Dec 15 04:56:11 localhost ceph-mon[298913]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/2070990210' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch Dec 15 04:56:11 localhost ceph-mgr[292421]: mgr.server handle_open ignoring open from mon.np0005559464 172.18.0.108:0/769215439; not ready for session (expect reconnect) Dec 15 04:56:11 localhost ceph-mon[298913]: mon.np0005559462@0(leader) e17 handle_command mon_command({"prefix": "mon metadata", "id": "np0005559464"} v 0) Dec 15 04:56:11 localhost ceph-mon[298913]: log_channel(audit) log [DBG] : from='mgr.26879 172.18.0.106:0/2534740212' entity='mgr.np0005559462.fudvyx' cmd={"prefix": "mon metadata", "id": "np0005559464"} : dispatch Dec 15 04:56:11 localhost ceph-mgr[292421]: mgr finish mon failed to return metadata for mon.np0005559464: (22) Invalid argument Dec 15 04:56:11 localhost nova_compute[286344]: 2025-12-15 09:56:11.399 286348 WARNING nova.virt.libvirt.driver [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] This host appears to have multiple sockets per NUMA node. 
The `socket` PCI NUMA affinity will not be supported.#033[00m Dec 15 04:56:11 localhost nova_compute[286344]: 2025-12-15 09:56:11.400 286348 DEBUG nova.compute.resource_tracker [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Hypervisor/Node resource view: name=np0005559462.localdomain free_ram=11710MB free_disk=41.836978912353516GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", 
"product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m Dec 15 04:56:11 localhost nova_compute[286344]: 2025-12-15 09:56:11.401 286348 DEBUG oslo_concurrency.lockutils [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Dec 15 04:56:11 localhost nova_compute[286344]: 2025-12-15 09:56:11.401 286348 DEBUG oslo_concurrency.lockutils [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Dec 15 04:56:11 localhost ceph-mon[298913]: log_channel(cluster) log [INF] : mon.np0005559462 calling monitor election Dec 15 04:56:11 localhost ceph-mon[298913]: paxos.0).electionLogic(72) init, last seen epoch 72 Dec 15 04:56:11 localhost ceph-mon[298913]: mon.np0005559462@0(electing) e17 collect_metadata vda: no unique device id for vda: fallback method has no model nor serial Dec 15 04:56:11 localhost ceph-mon[298913]: log_channel(cluster) log [INF] : mon.np0005559462 is new leader, mons np0005559462,np0005559463,np0005559464 in quorum (ranks 0,1,2) Dec 15 04:56:11 localhost ceph-mon[298913]: log_channel(cluster) log [DBG] : monmap epoch 17 Dec 15 04:56:11 localhost ceph-mon[298913]: log_channel(cluster) log [DBG] : fsid bce17446-41b5-5408-a23e-0b011906b44a Dec 15 04:56:11 localhost ceph-mon[298913]: log_channel(cluster) log [DBG] : 
last_changed 2025-12-15T09:56:04.705631+0000 Dec 15 04:56:11 localhost ceph-mon[298913]: log_channel(cluster) log [DBG] : created 2025-12-15T07:42:11.600132+0000 Dec 15 04:56:11 localhost ceph-mon[298913]: log_channel(cluster) log [DBG] : min_mon_release 18 (reef) Dec 15 04:56:11 localhost ceph-mon[298913]: log_channel(cluster) log [DBG] : election_strategy: 1 Dec 15 04:56:11 localhost ceph-mon[298913]: log_channel(cluster) log [DBG] : 0: [v2:172.18.0.103:3300/0,v1:172.18.0.103:6789/0] mon.np0005559462 Dec 15 04:56:11 localhost ceph-mon[298913]: log_channel(cluster) log [DBG] : 1: [v2:172.18.0.104:3300/0,v1:172.18.0.104:6789/0] mon.np0005559463 Dec 15 04:56:11 localhost ceph-mon[298913]: log_channel(cluster) log [DBG] : 2: [v2:172.18.0.105:3300/0,v1:172.18.0.105:6789/0] mon.np0005559464 Dec 15 04:56:11 localhost ceph-mon[298913]: mon.np0005559462@0(leader) e17 collect_metadata vda: no unique device id for vda: fallback method has no model nor serial Dec 15 04:56:11 localhost ceph-mon[298913]: log_channel(cluster) log [DBG] : fsmap cephfs:1 {0=mds.np0005559463.rdpgze=up:active} 2 up:standby Dec 15 04:56:11 localhost ceph-mon[298913]: log_channel(cluster) log [DBG] : osdmap e88: 6 total, 6 up, 6 in Dec 15 04:56:11 localhost ceph-mon[298913]: log_channel(cluster) log [DBG] : mgrmap e37: np0005559462.fudvyx(active, since 111s), standbys: np0005559463.daptkf, np0005559464.aomnqe Dec 15 04:56:11 localhost ceph-mon[298913]: log_channel(cluster) log [INF] : Health check cleared: MON_DOWN (was: 1/3 mons down, quorum np0005559462,np0005559463) Dec 15 04:56:11 localhost ceph-mon[298913]: log_channel(cluster) log [WRN] : Health detail: HEALTH_WARN 1 stray daemon(s) not managed by cephadm; 1 stray host(s) with 1 daemon(s) not managed by cephadm Dec 15 04:56:11 localhost ceph-mon[298913]: log_channel(cluster) log [WRN] : [WRN] CEPHADM_STRAY_DAEMON: 1 stray daemon(s) not managed by cephadm Dec 15 04:56:11 localhost ceph-mon[298913]: log_channel(cluster) log [WRN] : stray daemon 
mgr.np0005559460.oexkup on host np0005559460.localdomain not managed by cephadm Dec 15 04:56:11 localhost ceph-mon[298913]: log_channel(cluster) log [WRN] : [WRN] CEPHADM_STRAY_HOST: 1 stray host(s) with 1 daemon(s) not managed by cephadm Dec 15 04:56:11 localhost ceph-mon[298913]: log_channel(cluster) log [WRN] : stray host np0005559460.localdomain has 1 stray daemons: ['mgr.np0005559460.oexkup'] Dec 15 04:56:11 localhost nova_compute[286344]: 2025-12-15 09:56:11.921 286348 DEBUG nova.compute.resource_tracker [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Instance 39ff1bd9-6f6b-44c8-bbec-a1fd9d196359 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 2, 'MEMORY_MB': 512, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m Dec 15 04:56:11 localhost nova_compute[286344]: 2025-12-15 09:56:11.922 286348 DEBUG nova.compute.resource_tracker [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m Dec 15 04:56:11 localhost nova_compute[286344]: 2025-12-15 09:56:11.922 286348 DEBUG nova.compute.resource_tracker [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Final resource view: name=np0005559462.localdomain phys_ram=15738MB used_ram=1024MB phys_disk=41GB used_disk=2GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m Dec 15 04:56:11 localhost nova_compute[286344]: 2025-12-15 09:56:11.947 286348 DEBUG nova.scheduler.client.report [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Refreshing inventories for resource provider 26c8956b-6742-4951-b566-971b9bbe323b _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804#033[00m Dec 15 
04:56:11 localhost nova_compute[286344]: 2025-12-15 09:56:11.966 286348 DEBUG nova.scheduler.client.report [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Updating ProviderTree inventory for provider 26c8956b-6742-4951-b566-971b9bbe323b from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768#033[00m Dec 15 04:56:11 localhost nova_compute[286344]: 2025-12-15 09:56:11.967 286348 DEBUG nova.compute.provider_tree [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Updating inventory in ProviderTree for provider 26c8956b-6742-4951-b566-971b9bbe323b with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176#033[00m Dec 15 04:56:11 localhost nova_compute[286344]: 2025-12-15 09:56:11.986 286348 DEBUG nova.scheduler.client.report [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Refreshing aggregate associations for resource provider 26c8956b-6742-4951-b566-971b9bbe323b, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813#033[00m Dec 15 04:56:12 localhost nova_compute[286344]: 2025-12-15 09:56:12.009 286348 DEBUG nova.scheduler.client.report [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 
- - - - - -] Refreshing trait associations for resource provider 26c8956b-6742-4951-b566-971b9bbe323b, traits: COMPUTE_NET_VIF_MODEL_RTL8139,COMPUTE_STORAGE_BUS_IDE,COMPUTE_DEVICE_TAGGING,COMPUTE_NET_VIF_MODEL_PCNET,COMPUTE_NODE,COMPUTE_STORAGE_BUS_USB,COMPUTE_IMAGE_TYPE_AKI,COMPUTE_NET_ATTACH_INTERFACE,HW_CPU_X86_CLMUL,HW_CPU_X86_ABM,HW_CPU_X86_SSE2,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_NET_VIF_MODEL_E1000,HW_CPU_X86_SSE41,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_NET_VIF_MODEL_VIRTIO,HW_CPU_X86_BMI2,HW_CPU_X86_SSE42,HW_CPU_X86_FMA3,COMPUTE_SECURITY_UEFI_SECURE_BOOT,COMPUTE_IMAGE_TYPE_RAW,HW_CPU_X86_BMI,COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_ACCELERATORS,COMPUTE_IMAGE_TYPE_ARI,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_STORAGE_BUS_FDC,COMPUTE_VIOMMU_MODEL_INTEL,COMPUTE_GRAPHICS_MODEL_NONE,HW_CPU_X86_AMD_SVM,COMPUTE_NET_VIF_MODEL_E1000E,HW_CPU_X86_SSE,HW_CPU_X86_AVX,HW_CPU_X86_SVM,COMPUTE_SECURITY_TPM_2_0,COMPUTE_TRUSTED_CERTS,COMPUTE_RESCUE_BFV,HW_CPU_X86_AVX2,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_VIOMMU_MODEL_AUTO,COMPUTE_VIOMMU_MODEL_VIRTIO,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_NET_VIF_MODEL_NE2K_PCI,HW_CPU_X86_SSSE3,COMPUTE_IMAGE_TYPE_AMI,HW_CPU_X86_AESNI,HW_CPU_X86_SSE4A,HW_CPU_X86_MMX,COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_VOLUME_EXTEND,HW_CPU_X86_F16C,HW_CPU_X86_SHA,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_STORAGE_BUS_VIRTIO,COMPUTE_SECURITY_TPM_1_2,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,COMPUTE_STORAGE_BUS_SATA _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825#033[00m Dec 15 04:56:12 localhost nova_compute[286344]: 2025-12-15 09:56:12.039 286348 DEBUG oslo_concurrency.processutils [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m Dec 15 04:56:12 
localhost ceph-mon[298913]: mon.np0005559464 calling monitor election
Dec 15 04:56:12 localhost ceph-mon[298913]: mon.np0005559464 calling monitor election
Dec 15 04:56:12 localhost ceph-mon[298913]: mon.np0005559463 calling monitor election
Dec 15 04:56:12 localhost ceph-mon[298913]: mon.np0005559462 calling monitor election
Dec 15 04:56:12 localhost ceph-mon[298913]: mon.np0005559462 is new leader, mons np0005559462,np0005559463,np0005559464 in quorum (ranks 0,1,2)
Dec 15 04:56:12 localhost ceph-mon[298913]: Health check cleared: MON_DOWN (was: 1/3 mons down, quorum np0005559462,np0005559463)
Dec 15 04:56:12 localhost ceph-mon[298913]: Health detail: HEALTH_WARN 1 stray daemon(s) not managed by cephadm; 1 stray host(s) with 1 daemon(s) not managed by cephadm
Dec 15 04:56:12 localhost ceph-mon[298913]: [WRN] CEPHADM_STRAY_DAEMON: 1 stray daemon(s) not managed by cephadm
Dec 15 04:56:12 localhost ceph-mon[298913]: stray daemon mgr.np0005559460.oexkup on host np0005559460.localdomain not managed by cephadm
Dec 15 04:56:12 localhost ceph-mon[298913]: [WRN] CEPHADM_STRAY_HOST: 1 stray host(s) with 1 daemon(s) not managed by cephadm
Dec 15 04:56:12 localhost ceph-mon[298913]: stray host np0005559460.localdomain has 1 stray daemons: ['mgr.np0005559460.oexkup']
Dec 15 04:56:12 localhost ceph-mgr[292421]: mgr.server handle_open ignoring open from mon.np0005559464 172.18.0.108:0/769215439; not ready for session (expect reconnect)
Dec 15 04:56:12 localhost ceph-mon[298913]: mon.np0005559462@0(leader) e17 handle_command mon_command({"prefix": "mon metadata", "id": "np0005559464"} v 0)
Dec 15 04:56:12 localhost ceph-mon[298913]: log_channel(audit) log [DBG] : from='mgr.26879 172.18.0.106:0/2534740212' entity='mgr.np0005559462.fudvyx' cmd={"prefix": "mon metadata", "id": "np0005559464"} : dispatch
Dec 15 04:56:12 localhost ceph-mon[298913]: mon.np0005559462@0(leader) e17 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Dec 15 04:56:12 localhost ceph-mon[298913]: log_channel(audit) log [DBG] : from='client.? 172.18.0.106:0/451065634' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Dec 15 04:56:12 localhost nova_compute[286344]: 2025-12-15 09:56:12.487 286348 DEBUG oslo_concurrency.processutils [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.449s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 15 04:56:12 localhost nova_compute[286344]: 2025-12-15 09:56:12.494 286348 DEBUG nova.compute.provider_tree [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Inventory has not changed in ProviderTree for provider: 26c8956b-6742-4951-b566-971b9bbe323b update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Dec 15 04:56:12 localhost nova_compute[286344]: 2025-12-15 09:56:12.556 286348 DEBUG nova.scheduler.client.report [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Inventory has not changed for provider 26c8956b-6742-4951-b566-971b9bbe323b based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Dec 15 04:56:12 localhost nova_compute[286344]: 2025-12-15 09:56:12.559 286348 DEBUG nova.compute.resource_tracker [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Compute_service record updated for np0005559462.localdomain:np0005559462.localdomain _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Dec 15 04:56:12 localhost nova_compute[286344]: 2025-12-15 09:56:12.559 286348 DEBUG oslo_concurrency.lockutils [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 1.158s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 15 04:56:12 localhost ceph-mon[298913]: mon.np0005559462@0(leader) e17 handle_command mon_command({"prefix": "status", "format": "json"} v 0)
Dec 15 04:56:12 localhost ceph-mon[298913]: log_channel(audit) log [DBG] : from='client.? 172.18.0.200:0/2648535273' entity='client.admin' cmd={"prefix": "status", "format": "json"} : dispatch
Dec 15 04:56:12 localhost systemd[1]: Started /usr/bin/podman healthcheck run a4e94bd7aaee6c174c601060fb5f34559019ccea93fc397263ee3735731cfc1e.
Dec 15 04:56:12 localhost podman[309103]: 2025-12-15 09:56:12.760357317 +0000 UTC m=+0.086716918 container health_status a4e94bd7aaee6c174c601060fb5f34559019ccea93fc397263ee3735731cfc1e (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Dec 15 04:56:12 localhost podman[309103]: 2025-12-15 09:56:12.773395748 +0000 UTC m=+0.099755399 container exec_died a4e94bd7aaee6c174c601060fb5f34559019ccea93fc397263ee3735731cfc1e (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter)
Dec 15 04:56:12 localhost systemd[1]: a4e94bd7aaee6c174c601060fb5f34559019ccea93fc397263ee3735731cfc1e.service: Deactivated successfully.
Dec 15 04:56:12 localhost ceph-mgr[292421]: log_channel(cluster) log [DBG] : pgmap v60: 177 pgs: 177 active+clean; 104 MiB data, 548 MiB used, 41 GiB / 42 GiB avail
Dec 15 04:56:13 localhost ceph-mgr[292421]: log_channel(audit) log [DBG] : from='client.54199 -' entity='client.admin' cmd=[{"prefix": "orch", "action": "reconfig", "service_name": "osd.default_drive_group", "target": ["mon-mgr", ""]}]: dispatch
Dec 15 04:56:13 localhost ceph-mgr[292421]: [cephadm INFO root] Reconfig service osd.default_drive_group
Dec 15 04:56:13 localhost ceph-mgr[292421]: log_channel(cephadm) log [INF] : Reconfig service osd.default_drive_group
Dec 15 04:56:13 localhost ceph-mon[298913]: mon.np0005559462@0(leader) e17 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005559462.localdomain.devices.0}] v 0)
Dec 15 04:56:13 localhost ceph-mon[298913]: log_channel(audit) log [INF] : from='mgr.26879 172.18.0.106:0/2534740212' entity='mgr.np0005559462.fudvyx'
Dec 15 04:56:13 localhost ceph-mon[298913]: mon.np0005559462@0(leader) e17 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005559462.localdomain}] v 0)
Dec 15 04:56:13 localhost ceph-mon[298913]: log_channel(audit) log [INF] : from='mgr.26879 172.18.0.106:0/2534740212' entity='mgr.np0005559462.fudvyx'
Dec 15 04:56:13 localhost ceph-mon[298913]: mon.np0005559462@0(leader) e17 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005559462.localdomain.devices.0}] v 0)
Dec 15 04:56:13 localhost ceph-mon[298913]: mon.np0005559462@0(leader) e17 handle_command mon_command({"prefix": "config generate-minimal-conf"} v 0)
Dec 15 04:56:13 localhost ceph-mon[298913]: log_channel(audit) log [DBG] : from='mgr.26879 172.18.0.106:0/2534740212' entity='mgr.np0005559462.fudvyx' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Dec 15 04:56:13 localhost ceph-mon[298913]: mon.np0005559462@0(leader) e17 handle_command mon_command({"prefix": "auth get", "entity": "client.admin"} v 0)
Dec 15 04:56:13 localhost ceph-mon[298913]: log_channel(audit) log [INF] : from='mgr.26879 172.18.0.106:0/2534740212' entity='mgr.np0005559462.fudvyx' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch
Dec 15 04:56:13 localhost ceph-mgr[292421]: [cephadm INFO cephadm.serve] Updating np0005559462.localdomain:/etc/ceph/ceph.conf
Dec 15 04:56:13 localhost ceph-mgr[292421]: log_channel(cephadm) log [INF] : Updating np0005559462.localdomain:/etc/ceph/ceph.conf
Dec 15 04:56:13 localhost ceph-mgr[292421]: [cephadm INFO cephadm.serve] Updating np0005559463.localdomain:/etc/ceph/ceph.conf
Dec 15 04:56:13 localhost ceph-mgr[292421]: [cephadm INFO cephadm.serve] Updating np0005559464.localdomain:/etc/ceph/ceph.conf
Dec 15 04:56:13 localhost ceph-mgr[292421]: log_channel(cephadm) log [INF] : Updating np0005559463.localdomain:/etc/ceph/ceph.conf
Dec 15 04:56:13 localhost ceph-mgr[292421]: log_channel(cephadm) log [INF] : Updating np0005559464.localdomain:/etc/ceph/ceph.conf
Dec 15 04:56:13 localhost ceph-mon[298913]: log_channel(audit) log [INF] : from='mgr.26879 172.18.0.106:0/2534740212' entity='mgr.np0005559462.fudvyx'
Dec 15 04:56:13 localhost ceph-mon[298913]: mon.np0005559462@0(leader) e17 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005559462.localdomain}] v 0)
Dec 15 04:56:13 localhost ceph-mon[298913]: log_channel(audit) log [INF] : from='mgr.26879 172.18.0.106:0/2534740212' entity='mgr.np0005559462.fudvyx'
Dec 15 04:56:13 localhost ceph-mon[298913]: mon.np0005559462@0(leader) e17 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005559463.localdomain.devices.0}] v 0)
Dec 15 04:56:13 localhost ceph-mon[298913]: log_channel(audit) log [INF] : from='mgr.26879 172.18.0.106:0/2534740212' entity='mgr.np0005559462.fudvyx'
Dec 15 04:56:13 localhost ceph-mon[298913]: mon.np0005559462@0(leader) e17 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005559463.localdomain}] v 0)
Dec 15 04:56:13 localhost ceph-mon[298913]: log_channel(audit) log [INF] : from='mgr.26879 172.18.0.106:0/2534740212' entity='mgr.np0005559462.fudvyx'
Dec 15 04:56:13 localhost ceph-mon[298913]: mon.np0005559462@0(leader) e17 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005559463.localdomain.devices.0}] v 0)
Dec 15 04:56:13 localhost ceph-mon[298913]: log_channel(audit) log [INF] : from='mgr.26879 172.18.0.106:0/2534740212' entity='mgr.np0005559462.fudvyx'
Dec 15 04:56:13 localhost ceph-mon[298913]: mon.np0005559462@0(leader) e17 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005559463.localdomain}] v 0)
Dec 15 04:56:14 localhost ceph-mon[298913]: log_channel(audit) log [INF] : from='mgr.26879 172.18.0.106:0/2534740212' entity='mgr.np0005559462.fudvyx'
Dec 15 04:56:14 localhost ceph-mon[298913]: mon.np0005559462@0(leader) e17 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005559464.localdomain.devices.0}] v 0)
Dec 15 04:56:14 localhost ceph-mon[298913]: log_channel(audit) log [INF] : from='mgr.26879 172.18.0.106:0/2534740212' entity='mgr.np0005559462.fudvyx'
Dec 15 04:56:14 localhost ceph-mon[298913]: mon.np0005559462@0(leader) e17 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005559464.localdomain}] v 0)
Dec 15 04:56:14 localhost ceph-mon[298913]: log_channel(audit) log [INF] : from='mgr.26879 172.18.0.106:0/2534740212' entity='mgr.np0005559462.fudvyx'
Dec 15 04:56:14 localhost ceph-mon[298913]: mon.np0005559462@0(leader) e17 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005559464.localdomain.devices.0}] v 0)
Dec 15 04:56:14 localhost ceph-mon[298913]: log_channel(audit) log [INF] : from='mgr.26879 172.18.0.106:0/2534740212' entity='mgr.np0005559462.fudvyx'
Dec 15 04:56:14 localhost ceph-mon[298913]: mon.np0005559462@0(leader) e17 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005559464.localdomain}] v 0)
Dec 15 04:56:14 localhost ceph-mon[298913]: log_channel(audit) log [INF] : from='mgr.26879 172.18.0.106:0/2534740212' entity='mgr.np0005559462.fudvyx'
Dec 15 04:56:14 localhost nova_compute[286344]: 2025-12-15 09:56:14.559 286348 DEBUG oslo_service.periodic_task [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 15 04:56:14 localhost nova_compute[286344]: 2025-12-15 09:56:14.560 286348 DEBUG nova.compute.manager [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Dec 15 04:56:14 localhost nova_compute[286344]: 2025-12-15 09:56:14.560 286348 DEBUG nova.compute.manager [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Dec 15 04:56:14 localhost ceph-mgr[292421]: [cephadm INFO cephadm.serve] Updating np0005559463.localdomain:/var/lib/ceph/bce17446-41b5-5408-a23e-0b011906b44a/config/ceph.conf
Dec 15 04:56:14 localhost ceph-mgr[292421]: log_channel(cephadm) log [INF] : Updating np0005559463.localdomain:/var/lib/ceph/bce17446-41b5-5408-a23e-0b011906b44a/config/ceph.conf
Dec 15 04:56:14 localhost ceph-mgr[292421]: [cephadm INFO cephadm.serve] Updating np0005559462.localdomain:/var/lib/ceph/bce17446-41b5-5408-a23e-0b011906b44a/config/ceph.conf
Dec 15 04:56:14 localhost ceph-mgr[292421]: log_channel(cephadm) log [INF] : Updating np0005559462.localdomain:/var/lib/ceph/bce17446-41b5-5408-a23e-0b011906b44a/config/ceph.conf
Dec 15 04:56:14 localhost ceph-mgr[292421]: [cephadm INFO cephadm.serve] Updating np0005559464.localdomain:/var/lib/ceph/bce17446-41b5-5408-a23e-0b011906b44a/config/ceph.conf
Dec 15 04:56:14 localhost ceph-mgr[292421]: log_channel(cephadm) log [INF] : Updating np0005559464.localdomain:/var/lib/ceph/bce17446-41b5-5408-a23e-0b011906b44a/config/ceph.conf
Dec 15 04:56:14 localhost ceph-mgr[292421]: log_channel(cluster) log [DBG] : pgmap v61: 177 pgs: 177 active+clean; 104 MiB data, 548 MiB used, 41 GiB / 42 GiB avail
Dec 15 04:56:14 localhost ceph-mon[298913]: Reconfig service osd.default_drive_group
Dec 15 04:56:14 localhost ceph-mon[298913]: from='mgr.26879 172.18.0.106:0/2534740212' entity='mgr.np0005559462.fudvyx'
Dec 15 04:56:14 localhost ceph-mon[298913]: from='mgr.26879 172.18.0.106:0/2534740212' entity='mgr.np0005559462.fudvyx'
Dec 15 04:56:14 localhost ceph-mon[298913]: from='mgr.26879 172.18.0.106:0/2534740212' entity='mgr.np0005559462.fudvyx' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch
Dec 15 04:56:14 localhost ceph-mon[298913]: Updating np0005559462.localdomain:/etc/ceph/ceph.conf
Dec 15 04:56:14 localhost ceph-mon[298913]: Updating np0005559463.localdomain:/etc/ceph/ceph.conf
Dec 15 04:56:14 localhost ceph-mon[298913]: Updating np0005559464.localdomain:/etc/ceph/ceph.conf
Dec 15 04:56:14 localhost ceph-mon[298913]: from='mgr.26879 172.18.0.106:0/2534740212' entity='mgr.np0005559462.fudvyx'
Dec 15 04:56:14 localhost ceph-mon[298913]: from='mgr.26879 172.18.0.106:0/2534740212' entity='mgr.np0005559462.fudvyx'
Dec 15 04:56:14 localhost ceph-mon[298913]: from='mgr.26879 172.18.0.106:0/2534740212' entity='mgr.np0005559462.fudvyx'
Dec 15 04:56:14 localhost ceph-mon[298913]: from='mgr.26879 172.18.0.106:0/2534740212' entity='mgr.np0005559462.fudvyx'
Dec 15 04:56:14 localhost ceph-mon[298913]: from='mgr.26879 172.18.0.106:0/2534740212' entity='mgr.np0005559462.fudvyx'
Dec 15 04:56:14 localhost ceph-mon[298913]: from='mgr.26879 172.18.0.106:0/2534740212' entity='mgr.np0005559462.fudvyx'
Dec 15 04:56:14 localhost ceph-mon[298913]: from='mgr.26879 172.18.0.106:0/2534740212' entity='mgr.np0005559462.fudvyx'
Dec 15 04:56:14 localhost ceph-mon[298913]: from='mgr.26879 172.18.0.106:0/2534740212' entity='mgr.np0005559462.fudvyx'
Dec 15 04:56:14 localhost ceph-mon[298913]: from='mgr.26879 172.18.0.106:0/2534740212' entity='mgr.np0005559462.fudvyx'
Dec 15 04:56:14 localhost ceph-mon[298913]: from='mgr.26879 172.18.0.106:0/2534740212' entity='mgr.np0005559462.fudvyx'
Dec 15 04:56:15 localhost nova_compute[286344]: 2025-12-15 09:56:15.047 286348 DEBUG oslo_concurrency.lockutils [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Acquiring lock "refresh_cache-39ff1bd9-6f6b-44c8-bbec-a1fd9d196359" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Dec 15 04:56:15 localhost nova_compute[286344]: 2025-12-15 09:56:15.048 286348 DEBUG oslo_concurrency.lockutils [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Acquired lock "refresh_cache-39ff1bd9-6f6b-44c8-bbec-a1fd9d196359" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Dec 15 04:56:15 localhost nova_compute[286344]: 2025-12-15 09:56:15.048 286348 DEBUG nova.network.neutron [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] [instance: 39ff1bd9-6f6b-44c8-bbec-a1fd9d196359] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004
Dec 15 04:56:15 localhost nova_compute[286344]: 2025-12-15 09:56:15.048 286348 DEBUG nova.objects.instance [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Lazy-loading 'info_cache' on Instance uuid 39ff1bd9-6f6b-44c8-bbec-a1fd9d196359 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec 15 04:56:15 localhost ceph-mon[298913]: mon.np0005559462@0(leader).osd e88 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 15 04:56:15 localhost ceph-mon[298913]: mon.np0005559462@0(leader) e17 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005559463.localdomain.devices.0}] v 0)
Dec 15 04:56:15 localhost ceph-mon[298913]: log_channel(audit) log [INF] : from='mgr.26879 172.18.0.106:0/2534740212' entity='mgr.np0005559462.fudvyx'
Dec 15 04:56:15 localhost ceph-mon[298913]: mon.np0005559462@0(leader) e17 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005559463.localdomain}] v 0)
Dec 15 04:56:15 localhost ceph-mon[298913]: log_channel(audit) log [INF] : from='mgr.26879 172.18.0.106:0/2534740212' entity='mgr.np0005559462.fudvyx'
Dec 15 04:56:15 localhost ceph-mon[298913]: mon.np0005559462@0(leader) e17 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005559462.localdomain.devices.0}] v 0)
Dec 15 04:56:15 localhost ceph-mon[298913]: mon.np0005559462@0(leader) e17 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005559464.localdomain.devices.0}] v 0)
Dec 15 04:56:15 localhost ceph-mon[298913]: log_channel(audit) log [INF] : from='mgr.26879 172.18.0.106:0/2534740212' entity='mgr.np0005559462.fudvyx'
Dec 15 04:56:15 localhost ceph-mon[298913]: mon.np0005559462@0(leader) e17 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005559462.localdomain}] v 0)
Dec 15 04:56:15 localhost ceph-mon[298913]: log_channel(audit) log [INF] : from='mgr.26879 172.18.0.106:0/2534740212' entity='mgr.np0005559462.fudvyx'
Dec 15 04:56:15 localhost ceph-mon[298913]: mon.np0005559462@0(leader) e17 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005559464.localdomain}] v 0)
Dec 15 04:56:15 localhost ceph-mon[298913]: log_channel(audit) log [INF] : from='mgr.26879 172.18.0.106:0/2534740212' entity='mgr.np0005559462.fudvyx'
Dec 15 04:56:15 localhost ceph-mon[298913]: log_channel(audit) log [INF] : from='mgr.26879 172.18.0.106:0/2534740212' entity='mgr.np0005559462.fudvyx'
Dec 15 04:56:15 localhost ceph-mon[298913]: mon.np0005559462@0(leader) e17 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/osd_remove_queue}] v 0)
Dec 15 04:56:15 localhost ceph-mon[298913]: log_channel(audit) log [INF] : from='mgr.26879 172.18.0.106:0/2534740212' entity='mgr.np0005559462.fudvyx'
Dec 15 04:56:15 localhost ceph-mgr[292421]: [progress INFO root] update: starting ev 568a3b2e-7d7d-40dc-9f04-6cb5c24ecba4 (Updating node-proxy deployment (+3 -> 3))
Dec 15 04:56:15 localhost ceph-mgr[292421]: [progress INFO root] complete: finished ev 568a3b2e-7d7d-40dc-9f04-6cb5c24ecba4 (Updating node-proxy deployment (+3 -> 3))
Dec 15 04:56:15 localhost ceph-mgr[292421]: [progress INFO root] Completed event 568a3b2e-7d7d-40dc-9f04-6cb5c24ecba4 (Updating node-proxy deployment (+3 -> 3)) in 0 seconds
Dec 15 04:56:15 localhost nova_compute[286344]: 2025-12-15 09:56:15.419 286348 DEBUG nova.network.neutron [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] [instance: 39ff1bd9-6f6b-44c8-bbec-a1fd9d196359] Updating instance_info_cache with network_info: [{"id": "03ef8889-3216-43fb-8a52-4be17a956ce1", "address": "fa:16:3e:74:df:7c", "network": {"id": "befb7a72-17a9-4bcb-b561-84b8f626685a", "bridge": "br-int", "label": "private", "subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.201", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.20", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.0.3"}}], "meta": {"injected": false, "tenant_id": "c785bf23f53946bc99867d8832a50266", "mtu": 1292, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap03ef8889-32", "ovs_interfaceid": "03ef8889-3216-43fb-8a52-4be17a956ce1", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Dec 15 04:56:15 localhost ceph-mon[298913]: mon.np0005559462@0(leader) e17 handle_command mon_command({"prefix": "osd tree", "states": ["destroyed"], "format": "json"} v 0)
Dec 15 04:56:15 localhost ceph-mon[298913]: log_channel(audit) log [DBG] : from='mgr.26879 172.18.0.106:0/2534740212' entity='mgr.np0005559462.fudvyx' cmd={"prefix": "osd tree", "states": ["destroyed"], "format": "json"} : dispatch
Dec 15 04:56:15 localhost nova_compute[286344]: 2025-12-15 09:56:15.438 286348 DEBUG oslo_concurrency.lockutils [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Releasing lock "refresh_cache-39ff1bd9-6f6b-44c8-bbec-a1fd9d196359" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Dec 15 04:56:15 localhost nova_compute[286344]: 2025-12-15 09:56:15.438 286348 DEBUG nova.compute.manager [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] [instance: 39ff1bd9-6f6b-44c8-bbec-a1fd9d196359] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929
Dec 15 04:56:15 localhost nova_compute[286344]: 2025-12-15 09:56:15.439 286348 DEBUG oslo_service.periodic_task [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 15 04:56:15 localhost nova_compute[286344]: 2025-12-15 09:56:15.439 286348 DEBUG oslo_service.periodic_task [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 15 04:56:15 localhost nova_compute[286344]: 2025-12-15 09:56:15.440 286348 DEBUG oslo_service.periodic_task [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 15 04:56:15 localhost nova_compute[286344]: 2025-12-15 09:56:15.440 286348 DEBUG oslo_service.periodic_task [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 15 04:56:15 localhost nova_compute[286344]: 2025-12-15 09:56:15.440 286348 DEBUG nova.compute.manager [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Dec 15 04:56:15 localhost ceph-mon[298913]: mon.np0005559462@0(leader) e17 handle_command mon_command({"prefix": "mgr fail"} v 0)
Dec 15 04:56:15 localhost ceph-mon[298913]: log_channel(audit) log [INF] : from='client.? ' entity='client.admin' cmd={"prefix": "mgr fail"} : dispatch
Dec 15 04:56:15 localhost ceph-mon[298913]: mon.np0005559462@0(leader).osd e88 do_prune osdmap full prune enabled
Dec 15 04:56:15 localhost ceph-mon[298913]: log_channel(cluster) log [INF] : Activating manager daemon np0005559463.daptkf
Dec 15 04:56:15 localhost ceph-mon[298913]: mon.np0005559462@0(leader).osd e89 e89: 6 total, 6 up, 6 in
Dec 15 04:56:15 localhost ceph-mgr[292421]: mgr handle_mgr_map I was active but no longer am
Dec 15 04:56:15 localhost ceph-bce17446-41b5-5408-a23e-0b011906b44a-mgr-np0005559462-fudvyx[292417]: 2025-12-15T09:56:15.542+0000 7ff1f0bf5640 -1 mgr handle_mgr_map I was active but no longer am
Dec 15 04:56:15 localhost ceph-mon[298913]: log_channel(cluster) log [DBG] : osdmap e89: 6 total, 6 up, 6 in
Dec 15 04:56:15 localhost ceph-mon[298913]: log_channel(audit) log [INF] : from='client.? ' entity='client.admin' cmd='[{"prefix": "mgr fail"}]': finished
Dec 15 04:56:15 localhost ceph-mon[298913]: log_channel(cluster) log [DBG] : mgrmap e38: np0005559463.daptkf(active, starting, since 0.0348207s), standbys: np0005559464.aomnqe
Dec 15 04:56:15 localhost systemd[1]: session-70.scope: Deactivated successfully.
Dec 15 04:56:15 localhost systemd[1]: session-70.scope: Consumed 23.951s CPU time.
Dec 15 04:56:15 localhost ceph-mon[298913]: log_channel(cluster) log [INF] : Manager daemon np0005559463.daptkf is now available
Dec 15 04:56:15 localhost systemd-logind[763]: Session 70 logged out. Waiting for processes to exit.
Dec 15 04:56:15 localhost systemd-logind[763]: Removed session 70.
Dec 15 04:56:15 localhost ceph-mon[298913]: mon.np0005559462@0(leader) e17 handle_command mon_command({"prefix":"config-key del","key":"mgr/cephadm/host.np0005559461.localdomain.devices.0"} v 0)
Dec 15 04:56:15 localhost ceph-mon[298913]: log_channel(audit) log [INF] : from='mgr.26885 ' entity='mgr.np0005559463.daptkf' cmd={"prefix":"config-key del","key":"mgr/cephadm/host.np0005559461.localdomain.devices.0"} : dispatch
Dec 15 04:56:15 localhost ceph-bce17446-41b5-5408-a23e-0b011906b44a-mgr-np0005559462-fudvyx[292417]: ignoring --setuser ceph since I am not root
Dec 15 04:56:15 localhost ceph-bce17446-41b5-5408-a23e-0b011906b44a-mgr-np0005559462-fudvyx[292417]: ignoring --setgroup ceph since I am not root
Dec 15 04:56:15 localhost ceph-mon[298913]: log_channel(audit) log [INF] : from='mgr.26885 ' entity='mgr.np0005559463.daptkf' cmd='[{"prefix":"config-key del","key":"mgr/cephadm/host.np0005559461.localdomain.devices.0"}]': finished
Dec 15 04:56:15 localhost ceph-mgr[292421]: ceph version 18.2.1-361.el9cp (439dcd6094d413840eb2ec590fe2194ec616687f) reef (stable), process ceph-mgr, pid 2
Dec 15 04:56:15 localhost ceph-mgr[292421]: pidfile_write: ignore empty --pid-file
Dec 15 04:56:15 localhost ceph-mon[298913]: mon.np0005559462@0(leader) e17 handle_command mon_command({"prefix":"config-key del","key":"mgr/cephadm/host.np0005559461.localdomain.devices.0"} v 0)
Dec 15 04:56:15 localhost ceph-mon[298913]: log_channel(audit) log [INF] : from='mgr.26885 ' entity='mgr.np0005559463.daptkf' cmd={"prefix":"config-key del","key":"mgr/cephadm/host.np0005559461.localdomain.devices.0"} : dispatch
Dec 15 04:56:15 localhost ceph-mon[298913]: log_channel(audit) log [INF] : from='mgr.26885 ' entity='mgr.np0005559463.daptkf' cmd='[{"prefix":"config-key del","key":"mgr/cephadm/host.np0005559461.localdomain.devices.0"}]': finished
Dec 15 04:56:15 localhost ceph-mgr[292421]: mgr[py] Loading python module 'alerts'
Dec 15 04:56:15 localhost ceph-mon[298913]: mon.np0005559462@0(leader) e17 handle_command mon_command({"prefix":"config rm","who":"mgr","name":"mgr/rbd_support/np0005559463.daptkf/mirror_snapshot_schedule"} v 0)
Dec 15 04:56:15 localhost ceph-mon[298913]: log_channel(audit) log [INF] : from='mgr.26885 ' entity='mgr.np0005559463.daptkf' cmd={"prefix":"config rm","who":"mgr","name":"mgr/rbd_support/np0005559463.daptkf/mirror_snapshot_schedule"} : dispatch
Dec 15 04:56:15 localhost ceph-mgr[292421]: mgr[py] Module alerts has missing NOTIFY_TYPES member
Dec 15 04:56:15 localhost ceph-mgr[292421]: mgr[py] Loading python module 'balancer'
Dec 15 04:56:15 localhost ceph-bce17446-41b5-5408-a23e-0b011906b44a-mgr-np0005559462-fudvyx[292417]: 2025-12-15T09:56:15.753+0000 7f34c6cac140 -1 mgr[py] Module alerts has missing NOTIFY_TYPES member
Dec 15 04:56:15 localhost ceph-mon[298913]: mon.np0005559462@0(leader) e17 handle_command mon_command({"prefix":"config rm","who":"mgr","name":"mgr/rbd_support/np0005559463.daptkf/trash_purge_schedule"} v 0)
Dec 15 04:56:15 localhost ceph-mon[298913]: log_channel(audit) log [INF] : from='mgr.26885 ' entity='mgr.np0005559463.daptkf' cmd={"prefix":"config rm","who":"mgr","name":"mgr/rbd_support/np0005559463.daptkf/trash_purge_schedule"} : dispatch
Dec 15 04:56:15 localhost ceph-mgr[292421]: mgr[py] Module balancer has missing NOTIFY_TYPES member
Dec 15 04:56:15 localhost ceph-mgr[292421]: mgr[py] Loading python module 'cephadm'
Dec 15 04:56:15 localhost ceph-bce17446-41b5-5408-a23e-0b011906b44a-mgr-np0005559462-fudvyx[292417]: 2025-12-15T09:56:15.824+0000 7f34c6cac140 -1 mgr[py] Module balancer has missing NOTIFY_TYPES member
Dec 15 04:56:15 localhost nova_compute[286344]: 2025-12-15 09:56:15.860 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Dec 15 04:56:15 localhost sshd[309488]: main: sshd: ssh-rsa algorithm is disabled
Dec 15 04:56:15 localhost systemd-logind[763]: New session 71 of user ceph-admin.
Dec 15 04:56:16 localhost systemd[1]: Started Session 71 of User ceph-admin.
Dec 15 04:56:16 localhost ceph-mon[298913]: Updating np0005559463.localdomain:/var/lib/ceph/bce17446-41b5-5408-a23e-0b011906b44a/config/ceph.conf
Dec 15 04:56:16 localhost ceph-mon[298913]: Updating np0005559462.localdomain:/var/lib/ceph/bce17446-41b5-5408-a23e-0b011906b44a/config/ceph.conf
Dec 15 04:56:16 localhost ceph-mon[298913]: Updating np0005559464.localdomain:/var/lib/ceph/bce17446-41b5-5408-a23e-0b011906b44a/config/ceph.conf
Dec 15 04:56:16 localhost ceph-mon[298913]: from='mgr.26879 172.18.0.106:0/2534740212' entity='mgr.np0005559462.fudvyx'
Dec 15 04:56:16 localhost ceph-mon[298913]: from='mgr.26879 172.18.0.106:0/2534740212' entity='mgr.np0005559462.fudvyx'
Dec 15 04:56:16 localhost ceph-mon[298913]: from='mgr.26879 172.18.0.106:0/2534740212' entity='mgr.np0005559462.fudvyx'
Dec 15 04:56:16 localhost ceph-mon[298913]: from='mgr.26879 172.18.0.106:0/2534740212' entity='mgr.np0005559462.fudvyx'
Dec 15 04:56:16 localhost ceph-mon[298913]: from='mgr.26879 172.18.0.106:0/2534740212' entity='mgr.np0005559462.fudvyx'
Dec 15 04:56:16 localhost ceph-mon[298913]: from='mgr.26879 172.18.0.106:0/2534740212' entity='mgr.np0005559462.fudvyx'
Dec 15 04:56:16 localhost ceph-mon[298913]: from='mgr.26879 172.18.0.106:0/2534740212' entity='mgr.np0005559462.fudvyx'
Dec 15 04:56:16 localhost ceph-mon[298913]: from='client.? 172.18.0.200:0/255843959' entity='client.admin' cmd={"prefix": "mgr fail"} : dispatch
Dec 15 04:56:16 localhost ceph-mon[298913]: from='client.? ' entity='client.admin' cmd={"prefix": "mgr fail"} : dispatch
Dec 15 04:56:16 localhost ceph-mon[298913]: Activating manager daemon np0005559463.daptkf
Dec 15 04:56:16 localhost ceph-mon[298913]: from='client.? ' entity='client.admin' cmd='[{"prefix": "mgr fail"}]': finished
Dec 15 04:56:16 localhost ceph-mon[298913]: Manager daemon np0005559463.daptkf is now available
Dec 15 04:56:16 localhost ceph-mon[298913]: from='mgr.26885 172.18.0.107:0/2441414812' entity='mgr.np0005559463.daptkf' cmd={"prefix":"config-key del","key":"mgr/cephadm/host.np0005559461.localdomain.devices.0"} : dispatch
Dec 15 04:56:16 localhost ceph-mon[298913]: from='mgr.26885 ' entity='mgr.np0005559463.daptkf' cmd={"prefix":"config-key del","key":"mgr/cephadm/host.np0005559461.localdomain.devices.0"} : dispatch
Dec 15 04:56:16 localhost ceph-mon[298913]: from='mgr.26885 ' entity='mgr.np0005559463.daptkf' cmd='[{"prefix":"config-key del","key":"mgr/cephadm/host.np0005559461.localdomain.devices.0"}]': finished
Dec 15 04:56:16 localhost ceph-mon[298913]: from='mgr.26885 172.18.0.107:0/2441414812' entity='mgr.np0005559463.daptkf' cmd={"prefix":"config-key del","key":"mgr/cephadm/host.np0005559461.localdomain.devices.0"} : dispatch
Dec 15 04:56:16 localhost ceph-mon[298913]: from='mgr.26885 ' entity='mgr.np0005559463.daptkf' cmd={"prefix":"config-key del","key":"mgr/cephadm/host.np0005559461.localdomain.devices.0"} : dispatch
Dec 15 04:56:16 localhost ceph-mon[298913]: from='mgr.26885 ' entity='mgr.np0005559463.daptkf' cmd='[{"prefix":"config-key del","key":"mgr/cephadm/host.np0005559461.localdomain.devices.0"}]': finished
Dec 15 04:56:16 localhost ceph-mon[298913]: from='mgr.26885 172.18.0.107:0/2441414812' entity='mgr.np0005559463.daptkf' cmd={"prefix":"config rm","who":"mgr","name":"mgr/rbd_support/np0005559463.daptkf/mirror_snapshot_schedule"} : dispatch
Dec 15 04:56:16 localhost ceph-mon[298913]: from='mgr.26885 ' entity='mgr.np0005559463.daptkf' cmd={"prefix":"config rm","who":"mgr","name":"mgr/rbd_support/np0005559463.daptkf/mirror_snapshot_schedule"} : dispatch
Dec 15 04:56:16 localhost ceph-mon[298913]: from='mgr.26885 172.18.0.107:0/2441414812' entity='mgr.np0005559463.daptkf' cmd={"prefix":"config rm","who":"mgr","name":"mgr/rbd_support/np0005559463.daptkf/trash_purge_schedule"} : dispatch
Dec 15 04:56:16 localhost ceph-mon[298913]: from='mgr.26885 ' entity='mgr.np0005559463.daptkf' cmd={"prefix":"config rm","who":"mgr","name":"mgr/rbd_support/np0005559463.daptkf/trash_purge_schedule"} : dispatch
Dec 15 04:56:16 localhost ceph-mgr[292421]: mgr[py] Loading python module 'crash'
Dec 15 04:56:16 localhost ceph-mgr[292421]: mgr[py] Module crash has missing NOTIFY_TYPES member
Dec 15 04:56:16 localhost ceph-mgr[292421]: mgr[py] Loading python module 'dashboard'
Dec 15 04:56:16 localhost ceph-bce17446-41b5-5408-a23e-0b011906b44a-mgr-np0005559462-fudvyx[292417]: 2025-12-15T09:56:16.464+0000 7f34c6cac140 -1 mgr[py] Module crash has missing NOTIFY_TYPES member
Dec 15 04:56:16 localhost ceph-mon[298913]: log_channel(cluster) log [DBG] : mgrmap e39: np0005559463.daptkf(active, since 1.04757s), standbys: np0005559464.aomnqe
Dec 15 04:56:16 localhost ceph-mgr[292421]: mgr[py] Loading python module 'devicehealth'
Dec 15 04:56:17 localhost ceph-mgr[292421]: mgr[py] Module devicehealth has missing NOTIFY_TYPES member
Dec 15 04:56:17 localhost ceph-mgr[292421]: mgr[py] Loading python module 'diskprediction_local'
Dec 15 04:56:17 localhost ceph-bce17446-41b5-5408-a23e-0b011906b44a-mgr-np0005559462-fudvyx[292417]: 2025-12-15T09:56:17.018+0000 7f34c6cac140 -1 mgr[py] Module devicehealth has missing NOTIFY_TYPES member
Dec 15 04:56:17 localhost systemd[1]: tmp-crun.xdM1G3.mount: Deactivated successfully.
Dec 15 04:56:17 localhost podman[309607]: 2025-12-15 09:56:17.057334118 +0000 UTC m=+0.105021017 container exec 8dcda56b365b42dc8758aab77a9ec80db304780e449052738f7e4e648ae1ecaf (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-bce17446-41b5-5408-a23e-0b011906b44a-crash-np0005559462, vcs-type=git, com.redhat.component=rhceph-container, io.openshift.expose-services=, CEPH_POINT_RELEASE=, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, GIT_BRANCH=main, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, url=https://catalog.redhat.com/en/search?searchType=containers, GIT_CLEAN=True, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat Ceph Storage 7, io.buildah.version=1.41.4, release=1763362218, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, maintainer=Guillaume Abrioux , io.openshift.tags=rhceph ceph, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., build-date=2025-11-26T19:44:28Z, RELEASE=main, description=Red Hat Ceph Storage 7, vendor=Red Hat, Inc., distribution-scope=public, architecture=x86_64, name=rhceph, ceph=True, version=7, GIT_REPO=https://github.com/ceph/ceph-container.git) Dec 15 04:56:17 localhost ceph-mon[298913]: removing stray HostCache host record np0005559461.localdomain.devices.0 Dec 15 04:56:17 localhost ceph-bce17446-41b5-5408-a23e-0b011906b44a-mgr-np0005559462-fudvyx[292417]: /lib64/python3.9/site-packages/scipy/__init__.py:73: UserWarning: NumPy was imported from a Python sub-interpreter but NumPy does not properly support sub-interpreters. This will likely work for most users but might cause hard to track down issues or subtle bugs. A common user of the rare sub-interpreter feature is wsgi which also allows single-interpreter mode. 
Dec 15 04:56:17 localhost ceph-bce17446-41b5-5408-a23e-0b011906b44a-mgr-np0005559462-fudvyx[292417]: Improvements in the case of bugs are welcome, but is not on the NumPy roadmap, and full support may require significant effort to achieve. Dec 15 04:56:17 localhost ceph-bce17446-41b5-5408-a23e-0b011906b44a-mgr-np0005559462-fudvyx[292417]: from numpy import show_config as show_numpy_config Dec 15 04:56:17 localhost ceph-bce17446-41b5-5408-a23e-0b011906b44a-mgr-np0005559462-fudvyx[292417]: 2025-12-15T09:56:17.152+0000 7f34c6cac140 -1 mgr[py] Module diskprediction_local has missing NOTIFY_TYPES member Dec 15 04:56:17 localhost ceph-mgr[292421]: mgr[py] Module diskprediction_local has missing NOTIFY_TYPES member Dec 15 04:56:17 localhost ceph-mgr[292421]: mgr[py] Loading python module 'influx' Dec 15 04:56:17 localhost podman[309607]: 2025-12-15 09:56:17.191562624 +0000 UTC m=+0.239249553 container exec_died 8dcda56b365b42dc8758aab77a9ec80db304780e449052738f7e4e648ae1ecaf (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-bce17446-41b5-5408-a23e-0b011906b44a-crash-np0005559462, description=Red Hat Ceph Storage 7, architecture=x86_64, GIT_CLEAN=True, build-date=2025-11-26T19:44:28Z, io.buildah.version=1.41.4, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.openshift.tags=rhceph ceph, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., version=7, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat Ceph Storage 7, vcs-type=git, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, vendor=Red Hat, Inc., release=1763362218, RELEASE=main, url=https://catalog.redhat.com/en/search?searchType=containers, CEPH_POINT_RELEASE=, GIT_BRANCH=main, GIT_REPO=https://github.com/ceph/ceph-container.git, io.openshift.expose-services=, 
io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, ceph=True, name=rhceph, maintainer=Guillaume Abrioux , com.redhat.component=rhceph-container) Dec 15 04:56:17 localhost ceph-mgr[292421]: mgr[py] Module influx has missing NOTIFY_TYPES member Dec 15 04:56:17 localhost ceph-mgr[292421]: mgr[py] Loading python module 'insights' Dec 15 04:56:17 localhost ceph-bce17446-41b5-5408-a23e-0b011906b44a-mgr-np0005559462-fudvyx[292417]: 2025-12-15T09:56:17.211+0000 7f34c6cac140 -1 mgr[py] Module influx has missing NOTIFY_TYPES member Dec 15 04:56:17 localhost ceph-mgr[292421]: mgr[py] Loading python module 'iostat' Dec 15 04:56:17 localhost ceph-mgr[292421]: mgr[py] Module iostat has missing NOTIFY_TYPES member Dec 15 04:56:17 localhost ceph-mgr[292421]: mgr[py] Loading python module 'k8sevents' Dec 15 04:56:17 localhost ceph-bce17446-41b5-5408-a23e-0b011906b44a-mgr-np0005559462-fudvyx[292417]: 2025-12-15T09:56:17.329+0000 7f34c6cac140 -1 mgr[py] Module iostat has missing NOTIFY_TYPES member Dec 15 04:56:17 localhost ceph-mon[298913]: log_channel(cluster) log [INF] : Health check cleared: CEPHADM_STRAY_DAEMON (was: 1 stray daemon(s) not managed by cephadm) Dec 15 04:56:17 localhost ceph-mon[298913]: log_channel(cluster) log [INF] : Health check cleared: CEPHADM_STRAY_HOST (was: 1 stray host(s) with 1 daemon(s) not managed by cephadm) Dec 15 04:56:17 localhost ceph-mon[298913]: log_channel(cluster) log [INF] : Cluster is now healthy Dec 15 04:56:17 localhost ceph-mgr[292421]: mgr[py] Loading python module 'localpool' Dec 15 04:56:17 localhost ceph-mgr[292421]: mgr[py] Loading python module 'mds_autoscaler' Dec 15 04:56:17 localhost ceph-mon[298913]: mon.np0005559462@0(leader) e17 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005559463.localdomain.devices.0}] v 0) Dec 15 04:56:17 localhost ceph-mon[298913]: log_channel(audit) log [INF] : from='mgr.26885 ' entity='mgr.np0005559463.daptkf' Dec 15 
04:56:17 localhost ceph-mon[298913]: mon.np0005559462@0(leader) e17 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005559463.localdomain}] v 0) Dec 15 04:56:17 localhost ceph-mon[298913]: log_channel(audit) log [INF] : from='mgr.26885 ' entity='mgr.np0005559463.daptkf' Dec 15 04:56:17 localhost ceph-mgr[292421]: mgr[py] Loading python module 'mirroring' Dec 15 04:56:17 localhost ceph-mon[298913]: mon.np0005559462@0(leader) e17 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005559462.localdomain.devices.0}] v 0) Dec 15 04:56:17 localhost ceph-mon[298913]: log_channel(audit) log [INF] : from='mgr.26885 ' entity='mgr.np0005559463.daptkf' Dec 15 04:56:17 localhost ceph-mon[298913]: mon.np0005559462@0(leader) e17 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005559462.localdomain}] v 0) Dec 15 04:56:17 localhost ceph-mon[298913]: log_channel(audit) log [INF] : from='mgr.26885 ' entity='mgr.np0005559463.daptkf' Dec 15 04:56:17 localhost ceph-mgr[292421]: mgr[py] Loading python module 'nfs' Dec 15 04:56:18 localhost ceph-mon[298913]: mon.np0005559462@0(leader) e17 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005559464.localdomain.devices.0}] v 0) Dec 15 04:56:18 localhost ceph-mon[298913]: log_channel(audit) log [INF] : from='mgr.26885 ' entity='mgr.np0005559463.daptkf' Dec 15 04:56:18 localhost ceph-mon[298913]: mon.np0005559462@0(leader) e17 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005559464.localdomain}] v 0) Dec 15 04:56:18 localhost ceph-mon[298913]: log_channel(audit) log [INF] : from='mgr.26885 ' entity='mgr.np0005559463.daptkf' Dec 15 04:56:18 localhost ceph-mgr[292421]: mgr[py] Module nfs has missing NOTIFY_TYPES member Dec 15 04:56:18 localhost ceph-mgr[292421]: mgr[py] Loading python module 'orchestrator' Dec 15 04:56:18 localhost ceph-bce17446-41b5-5408-a23e-0b011906b44a-mgr-np0005559462-fudvyx[292417]: 
2025-12-15T09:56:18.065+0000 7f34c6cac140 -1 mgr[py] Module nfs has missing NOTIFY_TYPES member Dec 15 04:56:18 localhost ceph-mon[298913]: Health check cleared: CEPHADM_STRAY_DAEMON (was: 1 stray daemon(s) not managed by cephadm) Dec 15 04:56:18 localhost ceph-mon[298913]: Health check cleared: CEPHADM_STRAY_HOST (was: 1 stray host(s) with 1 daemon(s) not managed by cephadm) Dec 15 04:56:18 localhost ceph-mon[298913]: Cluster is now healthy Dec 15 04:56:18 localhost ceph-mon[298913]: from='mgr.26885 ' entity='mgr.np0005559463.daptkf' Dec 15 04:56:18 localhost ceph-mon[298913]: from='mgr.26885 ' entity='mgr.np0005559463.daptkf' Dec 15 04:56:18 localhost ceph-mon[298913]: from='mgr.26885 ' entity='mgr.np0005559463.daptkf' Dec 15 04:56:18 localhost ceph-mon[298913]: from='mgr.26885 ' entity='mgr.np0005559463.daptkf' Dec 15 04:56:18 localhost ceph-mon[298913]: from='mgr.26885 ' entity='mgr.np0005559463.daptkf' Dec 15 04:56:18 localhost ceph-mon[298913]: from='mgr.26885 ' entity='mgr.np0005559463.daptkf' Dec 15 04:56:18 localhost ceph-mon[298913]: log_channel(cluster) log [DBG] : mgrmap e40: np0005559463.daptkf(active, since 2s), standbys: np0005559464.aomnqe Dec 15 04:56:18 localhost ceph-mgr[292421]: mgr[py] Module orchestrator has missing NOTIFY_TYPES member Dec 15 04:56:18 localhost ceph-mgr[292421]: mgr[py] Loading python module 'osd_perf_query' Dec 15 04:56:18 localhost ceph-bce17446-41b5-5408-a23e-0b011906b44a-mgr-np0005559462-fudvyx[292417]: 2025-12-15T09:56:18.209+0000 7f34c6cac140 -1 mgr[py] Module orchestrator has missing NOTIFY_TYPES member Dec 15 04:56:18 localhost ceph-mgr[292421]: mgr[py] Module osd_perf_query has missing NOTIFY_TYPES member Dec 15 04:56:18 localhost ceph-mgr[292421]: mgr[py] Loading python module 'osd_support' Dec 15 04:56:18 localhost ceph-bce17446-41b5-5408-a23e-0b011906b44a-mgr-np0005559462-fudvyx[292417]: 2025-12-15T09:56:18.271+0000 7f34c6cac140 -1 mgr[py] Module osd_perf_query has missing NOTIFY_TYPES member Dec 15 04:56:18 
localhost ceph-mgr[292421]: mgr[py] Module osd_support has missing NOTIFY_TYPES member Dec 15 04:56:18 localhost ceph-mgr[292421]: mgr[py] Loading python module 'pg_autoscaler' Dec 15 04:56:18 localhost ceph-bce17446-41b5-5408-a23e-0b011906b44a-mgr-np0005559462-fudvyx[292417]: 2025-12-15T09:56:18.324+0000 7f34c6cac140 -1 mgr[py] Module osd_support has missing NOTIFY_TYPES member Dec 15 04:56:18 localhost ceph-bce17446-41b5-5408-a23e-0b011906b44a-mgr-np0005559462-fudvyx[292417]: 2025-12-15T09:56:18.388+0000 7f34c6cac140 -1 mgr[py] Module pg_autoscaler has missing NOTIFY_TYPES member Dec 15 04:56:18 localhost ceph-mgr[292421]: mgr[py] Module pg_autoscaler has missing NOTIFY_TYPES member Dec 15 04:56:18 localhost ceph-mgr[292421]: mgr[py] Loading python module 'progress' Dec 15 04:56:18 localhost ceph-mgr[292421]: mgr[py] Module progress has missing NOTIFY_TYPES member Dec 15 04:56:18 localhost ceph-mgr[292421]: mgr[py] Loading python module 'prometheus' Dec 15 04:56:18 localhost ceph-bce17446-41b5-5408-a23e-0b011906b44a-mgr-np0005559462-fudvyx[292417]: 2025-12-15T09:56:18.445+0000 7f34c6cac140 -1 mgr[py] Module progress has missing NOTIFY_TYPES member Dec 15 04:56:18 localhost ceph-mgr[292421]: mgr[py] Module prometheus has missing NOTIFY_TYPES member Dec 15 04:56:18 localhost ceph-mgr[292421]: mgr[py] Loading python module 'rbd_support' Dec 15 04:56:18 localhost ceph-bce17446-41b5-5408-a23e-0b011906b44a-mgr-np0005559462-fudvyx[292417]: 2025-12-15T09:56:18.734+0000 7f34c6cac140 -1 mgr[py] Module prometheus has missing NOTIFY_TYPES member Dec 15 04:56:18 localhost ceph-mgr[292421]: mgr[py] Module rbd_support has missing NOTIFY_TYPES member Dec 15 04:56:18 localhost ceph-bce17446-41b5-5408-a23e-0b011906b44a-mgr-np0005559462-fudvyx[292417]: 2025-12-15T09:56:18.813+0000 7f34c6cac140 -1 mgr[py] Module rbd_support has missing NOTIFY_TYPES member Dec 15 04:56:18 localhost ceph-mgr[292421]: mgr[py] Loading python module 'restful' Dec 15 04:56:18 localhost ceph-mgr[292421]: 
mgr[py] Loading python module 'rgw' Dec 15 04:56:19 localhost ceph-mgr[292421]: mgr[py] Module rgw has missing NOTIFY_TYPES member Dec 15 04:56:19 localhost ceph-mgr[292421]: mgr[py] Loading python module 'rook' Dec 15 04:56:19 localhost ceph-bce17446-41b5-5408-a23e-0b011906b44a-mgr-np0005559462-fudvyx[292417]: 2025-12-15T09:56:19.127+0000 7f34c6cac140 -1 mgr[py] Module rgw has missing NOTIFY_TYPES member Dec 15 04:56:19 localhost ceph-mon[298913]: [15/Dec/2025:09:56:17] ENGINE Bus STARTING Dec 15 04:56:19 localhost ceph-mon[298913]: [15/Dec/2025:09:56:17] ENGINE Serving on http://172.18.0.107:8765 Dec 15 04:56:19 localhost ceph-mon[298913]: [15/Dec/2025:09:56:17] ENGINE Serving on https://172.18.0.107:7150 Dec 15 04:56:19 localhost ceph-mon[298913]: [15/Dec/2025:09:56:17] ENGINE Bus STARTED Dec 15 04:56:19 localhost ceph-mon[298913]: [15/Dec/2025:09:56:17] ENGINE Client ('172.18.0.107', 39834) lost — peer dropped the TLS connection suddenly, during handshake: (6, 'TLS/SSL connection has been closed (EOF) (_ssl.c:1147)') Dec 15 04:56:19 localhost ceph-mon[298913]: mon.np0005559462@0(leader) e17 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005559463.localdomain.devices.0}] v 0) Dec 15 04:56:19 localhost ceph-mon[298913]: mon.np0005559462@0(leader) e17 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005559462.localdomain.devices.0}] v 0) Dec 15 04:56:19 localhost ceph-mon[298913]: log_channel(audit) log [INF] : from='mgr.26885 ' entity='mgr.np0005559463.daptkf' Dec 15 04:56:19 localhost ceph-mon[298913]: log_channel(audit) log [INF] : from='mgr.26885 ' entity='mgr.np0005559463.daptkf' Dec 15 04:56:19 localhost ceph-mon[298913]: mon.np0005559462@0(leader) e17 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005559463.localdomain}] v 0) Dec 15 04:56:19 localhost ceph-mon[298913]: mon.np0005559462@0(leader) e17 handle_command mon_command([{prefix=config-key set, 
key=mgr/cephadm/host.np0005559462.localdomain}] v 0) Dec 15 04:56:19 localhost ceph-mon[298913]: log_channel(audit) log [INF] : from='mgr.26885 ' entity='mgr.np0005559463.daptkf' Dec 15 04:56:19 localhost ceph-mon[298913]: mon.np0005559462@0(leader) e17 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005559464.localdomain.devices.0}] v 0) Dec 15 04:56:19 localhost ceph-mon[298913]: log_channel(audit) log [INF] : from='mgr.26885 ' entity='mgr.np0005559463.daptkf' Dec 15 04:56:19 localhost ceph-mon[298913]: mon.np0005559462@0(leader) e17 handle_command mon_command({"prefix": "config rm", "who": "osd.2", "name": "osd_memory_target"} v 0) Dec 15 04:56:19 localhost ceph-mon[298913]: log_channel(audit) log [INF] : from='mgr.26885 ' entity='mgr.np0005559463.daptkf' cmd={"prefix": "config rm", "who": "osd.2", "name": "osd_memory_target"} : dispatch Dec 15 04:56:19 localhost ceph-mon[298913]: log_channel(audit) log [INF] : from='mgr.26885 ' entity='mgr.np0005559463.daptkf' Dec 15 04:56:19 localhost ceph-mon[298913]: mon.np0005559462@0(leader) e17 handle_command mon_command({"prefix": "config rm", "who": "osd.0", "name": "osd_memory_target"} v 0) Dec 15 04:56:19 localhost ceph-mon[298913]: log_channel(audit) log [INF] : from='mgr.26885 ' entity='mgr.np0005559463.daptkf' cmd={"prefix": "config rm", "who": "osd.0", "name": "osd_memory_target"} : dispatch Dec 15 04:56:19 localhost ceph-mon[298913]: mon.np0005559462@0(leader) e17 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005559464.localdomain}] v 0) Dec 15 04:56:19 localhost ceph-mon[298913]: mon.np0005559462@0(leader) e17 handle_command mon_command({"prefix": "config rm", "who": "osd.5", "name": "osd_memory_target"} v 0) Dec 15 04:56:19 localhost ceph-mon[298913]: log_channel(audit) log [INF] : from='mgr.26885 ' entity='mgr.np0005559463.daptkf' cmd={"prefix": "config rm", "who": "osd.5", "name": "osd_memory_target"} : dispatch Dec 15 04:56:19 localhost 
ceph-mon[298913]: mon.np0005559462@0(leader) e17 handle_command mon_command({"prefix": "config rm", "who": "osd.3", "name": "osd_memory_target"} v 0) Dec 15 04:56:19 localhost ceph-mon[298913]: log_channel(audit) log [INF] : from='mgr.26885 ' entity='mgr.np0005559463.daptkf' cmd={"prefix": "config rm", "who": "osd.3", "name": "osd_memory_target"} : dispatch Dec 15 04:56:19 localhost ceph-mon[298913]: log_channel(audit) log [INF] : from='mgr.26885 ' entity='mgr.np0005559463.daptkf' Dec 15 04:56:19 localhost ceph-mon[298913]: mon.np0005559462@0(leader) e17 handle_command mon_command({"prefix": "config rm", "who": "osd.1", "name": "osd_memory_target"} v 0) Dec 15 04:56:19 localhost ceph-mon[298913]: log_channel(audit) log [INF] : from='mgr.26885 ' entity='mgr.np0005559463.daptkf' cmd={"prefix": "config rm", "who": "osd.1", "name": "osd_memory_target"} : dispatch Dec 15 04:56:19 localhost ceph-mon[298913]: mon.np0005559462@0(leader) e17 handle_command mon_command([{prefix=config set, name=osd_memory_target}] v 0) Dec 15 04:56:19 localhost ceph-mon[298913]: mon.np0005559462@0(leader) e17 handle_command mon_command([{prefix=config set, name=osd_memory_target}] v 0) Dec 15 04:56:19 localhost ceph-mon[298913]: mon.np0005559462@0(leader) e17 handle_command mon_command({"prefix": "config rm", "who": "osd.4", "name": "osd_memory_target"} v 0) Dec 15 04:56:19 localhost ceph-mon[298913]: log_channel(audit) log [INF] : from='mgr.26885 ' entity='mgr.np0005559463.daptkf' cmd={"prefix": "config rm", "who": "osd.4", "name": "osd_memory_target"} : dispatch Dec 15 04:56:19 localhost ceph-mon[298913]: mon.np0005559462@0(leader) e17 handle_command mon_command([{prefix=config set, name=osd_memory_target}] v 0) Dec 15 04:56:19 localhost ceph-mgr[292421]: mgr[py] Module rook has missing NOTIFY_TYPES member Dec 15 04:56:19 localhost ceph-mgr[292421]: mgr[py] Loading python module 'selftest' Dec 15 04:56:19 localhost ceph-bce17446-41b5-5408-a23e-0b011906b44a-mgr-np0005559462-fudvyx[292417]: 
2025-12-15T09:56:19.534+0000 7f34c6cac140 -1 mgr[py] Module rook has missing NOTIFY_TYPES member Dec 15 04:56:19 localhost ceph-mgr[292421]: mgr[py] Module selftest has missing NOTIFY_TYPES member Dec 15 04:56:19 localhost ceph-mgr[292421]: mgr[py] Loading python module 'snap_schedule' Dec 15 04:56:19 localhost ceph-bce17446-41b5-5408-a23e-0b011906b44a-mgr-np0005559462-fudvyx[292417]: 2025-12-15T09:56:19.593+0000 7f34c6cac140 -1 mgr[py] Module selftest has missing NOTIFY_TYPES member Dec 15 04:56:19 localhost ceph-mgr[292421]: mgr[py] Loading python module 'stats' Dec 15 04:56:19 localhost ceph-mgr[292421]: mgr[py] Loading python module 'status' Dec 15 04:56:19 localhost ceph-mgr[292421]: mgr[py] Module status has missing NOTIFY_TYPES member Dec 15 04:56:19 localhost ceph-mgr[292421]: mgr[py] Loading python module 'telegraf' Dec 15 04:56:19 localhost ceph-bce17446-41b5-5408-a23e-0b011906b44a-mgr-np0005559462-fudvyx[292417]: 2025-12-15T09:56:19.776+0000 7f34c6cac140 -1 mgr[py] Module status has missing NOTIFY_TYPES member Dec 15 04:56:19 localhost ceph-mgr[292421]: mgr[py] Module telegraf has missing NOTIFY_TYPES member Dec 15 04:56:19 localhost ceph-bce17446-41b5-5408-a23e-0b011906b44a-mgr-np0005559462-fudvyx[292417]: 2025-12-15T09:56:19.832+0000 7f34c6cac140 -1 mgr[py] Module telegraf has missing NOTIFY_TYPES member Dec 15 04:56:19 localhost ceph-mgr[292421]: mgr[py] Loading python module 'telemetry' Dec 15 04:56:19 localhost ceph-mgr[292421]: mgr[py] Module telemetry has missing NOTIFY_TYPES member Dec 15 04:56:19 localhost ceph-bce17446-41b5-5408-a23e-0b011906b44a-mgr-np0005559462-fudvyx[292417]: 2025-12-15T09:56:19.957+0000 7f34c6cac140 -1 mgr[py] Module telemetry has missing NOTIFY_TYPES member Dec 15 04:56:19 localhost ceph-mgr[292421]: mgr[py] Loading python module 'test_orchestrator' Dec 15 04:56:20 localhost ceph-mgr[292421]: mgr[py] Module test_orchestrator has missing NOTIFY_TYPES member Dec 15 04:56:20 localhost ceph-mgr[292421]: mgr[py] Loading python 
module 'volumes' Dec 15 04:56:20 localhost ceph-bce17446-41b5-5408-a23e-0b011906b44a-mgr-np0005559462-fudvyx[292417]: 2025-12-15T09:56:20.099+0000 7f34c6cac140 -1 mgr[py] Module test_orchestrator has missing NOTIFY_TYPES member Dec 15 04:56:20 localhost ceph-mon[298913]: log_channel(cluster) log [DBG] : mgrmap e41: np0005559463.daptkf(active, since 4s), standbys: np0005559464.aomnqe Dec 15 04:56:20 localhost ceph-mon[298913]: mon.np0005559462@0(leader).osd e89 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Dec 15 04:56:20 localhost ceph-mgr[292421]: mgr[py] Module volumes has missing NOTIFY_TYPES member Dec 15 04:56:20 localhost ceph-mgr[292421]: mgr[py] Loading python module 'zabbix' Dec 15 04:56:20 localhost ceph-bce17446-41b5-5408-a23e-0b011906b44a-mgr-np0005559462-fudvyx[292417]: 2025-12-15T09:56:20.284+0000 7f34c6cac140 -1 mgr[py] Module volumes has missing NOTIFY_TYPES member Dec 15 04:56:20 localhost ceph-mgr[292421]: mgr[py] Module zabbix has missing NOTIFY_TYPES member Dec 15 04:56:20 localhost ceph-bce17446-41b5-5408-a23e-0b011906b44a-mgr-np0005559462-fudvyx[292417]: 2025-12-15T09:56:20.340+0000 7f34c6cac140 -1 mgr[py] Module zabbix has missing NOTIFY_TYPES member Dec 15 04:56:20 localhost ceph-mon[298913]: log_channel(cluster) log [DBG] : Standby manager daemon np0005559462.fudvyx started Dec 15 04:56:20 localhost ceph-mgr[292421]: ms_deliver_dispatch: unhandled message 0x55cce75251e0 mon_map magic: 0 from mon.0 v2:172.18.0.103:3300/0 Dec 15 04:56:20 localhost ceph-mgr[292421]: client.0 ms_handle_reset on v2:172.18.0.107:6810/1143112299 Dec 15 04:56:20 localhost ceph-mon[298913]: from='mgr.26885 ' entity='mgr.np0005559463.daptkf' Dec 15 04:56:20 localhost ceph-mon[298913]: from='mgr.26885 ' entity='mgr.np0005559463.daptkf' Dec 15 04:56:20 localhost ceph-mon[298913]: from='mgr.26885 ' entity='mgr.np0005559463.daptkf' Dec 15 04:56:20 localhost ceph-mon[298913]: from='mgr.26885 
172.18.0.107:0/2441414812' entity='mgr.np0005559463.daptkf' cmd={"prefix": "config rm", "who": "osd.2", "name": "osd_memory_target"} : dispatch Dec 15 04:56:20 localhost ceph-mon[298913]: from='mgr.26885 172.18.0.107:0/2441414812' entity='mgr.np0005559463.daptkf' cmd={"prefix": "config rm", "who": "osd.0", "name": "osd_memory_target"} : dispatch Dec 15 04:56:20 localhost ceph-mon[298913]: from='mgr.26885 ' entity='mgr.np0005559463.daptkf' Dec 15 04:56:20 localhost ceph-mon[298913]: from='mgr.26885 ' entity='mgr.np0005559463.daptkf' cmd={"prefix": "config rm", "who": "osd.2", "name": "osd_memory_target"} : dispatch Dec 15 04:56:20 localhost ceph-mon[298913]: from='mgr.26885 172.18.0.107:0/2441414812' entity='mgr.np0005559463.daptkf' cmd={"prefix": "config rm", "who": "osd.5", "name": "osd_memory_target"} : dispatch Dec 15 04:56:20 localhost ceph-mon[298913]: from='mgr.26885 ' entity='mgr.np0005559463.daptkf' Dec 15 04:56:20 localhost ceph-mon[298913]: from='mgr.26885 172.18.0.107:0/2441414812' entity='mgr.np0005559463.daptkf' cmd={"prefix": "config rm", "who": "osd.3", "name": "osd_memory_target"} : dispatch Dec 15 04:56:20 localhost ceph-mon[298913]: from='mgr.26885 ' entity='mgr.np0005559463.daptkf' cmd={"prefix": "config rm", "who": "osd.0", "name": "osd_memory_target"} : dispatch Dec 15 04:56:20 localhost ceph-mon[298913]: from='mgr.26885 ' entity='mgr.np0005559463.daptkf' cmd={"prefix": "config rm", "who": "osd.5", "name": "osd_memory_target"} : dispatch Dec 15 04:56:20 localhost ceph-mon[298913]: from='mgr.26885 ' entity='mgr.np0005559463.daptkf' cmd={"prefix": "config rm", "who": "osd.3", "name": "osd_memory_target"} : dispatch Dec 15 04:56:20 localhost ceph-mon[298913]: from='mgr.26885 ' entity='mgr.np0005559463.daptkf' Dec 15 04:56:20 localhost ceph-mon[298913]: Adjusting osd_memory_target on np0005559463.localdomain to 836.6M Dec 15 04:56:20 localhost ceph-mon[298913]: Adjusting osd_memory_target on np0005559462.localdomain to 836.6M Dec 15 04:56:20 
localhost ceph-mon[298913]: from='mgr.26885 172.18.0.107:0/2441414812' entity='mgr.np0005559463.daptkf' cmd={"prefix": "config rm", "who": "osd.1", "name": "osd_memory_target"} : dispatch Dec 15 04:56:20 localhost ceph-mon[298913]: Unable to set osd_memory_target on np0005559463.localdomain to 877246668: error parsing value: Value '877246668' is below minimum 939524096 Dec 15 04:56:20 localhost ceph-mon[298913]: Unable to set osd_memory_target on np0005559462.localdomain to 877243801: error parsing value: Value '877243801' is below minimum 939524096 Dec 15 04:56:20 localhost ceph-mon[298913]: from='mgr.26885 172.18.0.107:0/2441414812' entity='mgr.np0005559463.daptkf' cmd={"prefix": "config rm", "who": "osd.4", "name": "osd_memory_target"} : dispatch Dec 15 04:56:20 localhost ceph-mon[298913]: from='mgr.26885 ' entity='mgr.np0005559463.daptkf' cmd={"prefix": "config rm", "who": "osd.1", "name": "osd_memory_target"} : dispatch Dec 15 04:56:20 localhost ceph-mon[298913]: Adjusting osd_memory_target on np0005559464.localdomain to 836.6M Dec 15 04:56:20 localhost ceph-mon[298913]: Unable to set osd_memory_target on np0005559464.localdomain to 877246668: error parsing value: Value '877246668' is below minimum 939524096 Dec 15 04:56:20 localhost ceph-mon[298913]: from='mgr.26885 ' entity='mgr.np0005559463.daptkf' cmd={"prefix": "config rm", "who": "osd.4", "name": "osd_memory_target"} : dispatch Dec 15 04:56:20 localhost ceph-mon[298913]: from='mgr.26885 172.18.0.107:0/2441414812' entity='mgr.np0005559463.daptkf' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch Dec 15 04:56:20 localhost ceph-mon[298913]: Updating np0005559462.localdomain:/etc/ceph/ceph.conf Dec 15 04:56:20 localhost ceph-mon[298913]: Updating np0005559463.localdomain:/etc/ceph/ceph.conf Dec 15 04:56:20 localhost ceph-mon[298913]: Updating np0005559464.localdomain:/etc/ceph/ceph.conf Dec 15 04:56:20 localhost ceph-mon[298913]: Updating 
np0005559464.localdomain:/var/lib/ceph/bce17446-41b5-5408-a23e-0b011906b44a/config/ceph.conf Dec 15 04:56:20 localhost ceph-mon[298913]: Updating np0005559462.localdomain:/var/lib/ceph/bce17446-41b5-5408-a23e-0b011906b44a/config/ceph.conf Dec 15 04:56:20 localhost ceph-mon[298913]: Updating np0005559463.localdomain:/var/lib/ceph/bce17446-41b5-5408-a23e-0b011906b44a/config/ceph.conf Dec 15 04:56:20 localhost nova_compute[286344]: 2025-12-15 09:56:20.865 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Dec 15 04:56:20 localhost nova_compute[286344]: 2025-12-15 09:56:20.867 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 04:56:20 localhost nova_compute[286344]: 2025-12-15 09:56:20.867 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m Dec 15 04:56:20 localhost nova_compute[286344]: 2025-12-15 09:56:20.867 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Dec 15 04:56:20 localhost nova_compute[286344]: 2025-12-15 09:56:20.868 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Dec 15 04:56:20 localhost nova_compute[286344]: 2025-12-15 09:56:20.871 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 04:56:21 localhost ceph-mon[298913]: log_channel(cluster) log [DBG] : mgrmap e42: np0005559463.daptkf(active, since 5s), standbys: np0005559464.aomnqe, np0005559462.fudvyx Dec 15 04:56:22 localhost ceph-mon[298913]: mon.np0005559462@0(leader) e17 handle_command 
mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005559462.localdomain.devices.0}] v 0) Dec 15 04:56:22 localhost ceph-mon[298913]: log_channel(audit) log [INF] : from='mgr.26885 ' entity='mgr.np0005559463.daptkf' Dec 15 04:56:22 localhost ceph-mon[298913]: mon.np0005559462@0(leader) e17 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005559463.localdomain.devices.0}] v 0) Dec 15 04:56:22 localhost ceph-mon[298913]: mon.np0005559462@0(leader) e17 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005559462.localdomain}] v 0) Dec 15 04:56:22 localhost ceph-mon[298913]: log_channel(audit) log [INF] : from='mgr.26885 ' entity='mgr.np0005559463.daptkf' Dec 15 04:56:22 localhost ceph-mon[298913]: mon.np0005559462@0(leader) e17 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005559463.localdomain}] v 0) Dec 15 04:56:22 localhost ceph-mon[298913]: log_channel(audit) log [INF] : from='mgr.26885 ' entity='mgr.np0005559463.daptkf' Dec 15 04:56:22 localhost ceph-mon[298913]: Updating np0005559464.localdomain:/etc/ceph/ceph.client.admin.keyring Dec 15 04:56:22 localhost ceph-mon[298913]: Updating np0005559462.localdomain:/etc/ceph/ceph.client.admin.keyring Dec 15 04:56:22 localhost ceph-mon[298913]: Updating np0005559463.localdomain:/etc/ceph/ceph.client.admin.keyring Dec 15 04:56:22 localhost ceph-mon[298913]: from='mgr.26885 ' entity='mgr.np0005559463.daptkf' Dec 15 04:56:22 localhost ceph-mon[298913]: log_channel(audit) log [INF] : from='mgr.26885 ' entity='mgr.np0005559463.daptkf' Dec 15 04:56:22 localhost ceph-mon[298913]: mon.np0005559462@0(leader) e17 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005559464.localdomain.devices.0}] v 0) Dec 15 04:56:22 localhost ceph-mon[298913]: log_channel(audit) log [INF] : from='mgr.26885 ' entity='mgr.np0005559463.daptkf' Dec 15 04:56:22 localhost ceph-mon[298913]: mon.np0005559462@0(leader) e17 handle_command 
mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005559464.localdomain}] v 0) Dec 15 04:56:22 localhost ceph-mon[298913]: log_channel(audit) log [INF] : from='mgr.26885 ' entity='mgr.np0005559463.daptkf' Dec 15 04:56:22 localhost ceph-mon[298913]: mon.np0005559462@0(leader) e17 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/osd_remove_queue}] v 0) Dec 15 04:56:22 localhost ceph-mon[298913]: log_channel(audit) log [INF] : from='mgr.26885 ' entity='mgr.np0005559463.daptkf' Dec 15 04:56:22 localhost ceph-mon[298913]: mon.np0005559462@0(leader) e17 handle_command mon_command({"prefix": "auth get-or-create", "entity": "client.crash.np0005559462.localdomain", "caps": ["mon", "profile crash", "mgr", "profile crash"]} v 0) Dec 15 04:56:22 localhost ceph-mon[298913]: log_channel(audit) log [INF] : from='mgr.26885 ' entity='mgr.np0005559463.daptkf' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005559462.localdomain", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch Dec 15 04:56:23 localhost podman[310559]: Dec 15 04:56:23 localhost podman[310559]: 2025-12-15 09:56:23.094908701 +0000 UTC m=+0.075425914 container create 9df5c7492637a03879dacdb71b81d593ede093907ea1700e51e9315b3541c48b (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=crazy_goodall, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., ceph=True, description=Red Hat Ceph Storage 7, io.openshift.tags=rhceph ceph, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, name=rhceph, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, distribution-scope=public, release=1763362218, io.buildah.version=1.41.4, maintainer=Guillaume Abrioux , vendor=Red Hat, Inc., CEPH_POINT_RELEASE=, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.description=Red Hat Ceph Storage 7, GIT_REPO=https://github.com/ceph/ceph-container.git, com.redhat.component=rhceph-container, io.openshift.expose-services=, version=7, build-date=2025-11-26T19:44:28Z, RELEASE=main, GIT_CLEAN=True, GIT_BRANCH=main, architecture=x86_64, vcs-type=git, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9) Dec 15 04:56:23 localhost systemd[1]: Started libpod-conmon-9df5c7492637a03879dacdb71b81d593ede093907ea1700e51e9315b3541c48b.scope. Dec 15 04:56:23 localhost systemd[1]: Started libcrun container. Dec 15 04:56:23 localhost podman[310559]: 2025-12-15 09:56:23.155955956 +0000 UTC m=+0.136473159 container init 9df5c7492637a03879dacdb71b81d593ede093907ea1700e51e9315b3541c48b (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=crazy_goodall, io.k8s.description=Red Hat Ceph Storage 7, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, distribution-scope=public, GIT_REPO=https://github.com/ceph/ceph-container.git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, vendor=Red Hat, Inc., org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, maintainer=Guillaume Abrioux , name=rhceph, build-date=2025-11-26T19:44:28Z, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, RELEASE=main, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., CEPH_POINT_RELEASE=, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, url=https://catalog.redhat.com/en/search?searchType=containers, version=7, description=Red Hat Ceph Storage 7, io.openshift.tags=rhceph ceph, io.openshift.expose-services=, vcs-type=git, com.redhat.component=rhceph-container, ceph=True, GIT_CLEAN=True, 
architecture=x86_64, release=1763362218, GIT_BRANCH=main) Dec 15 04:56:23 localhost podman[310559]: 2025-12-15 09:56:23.065664949 +0000 UTC m=+0.046182182 image pull registry.redhat.io/rhceph/rhceph-7-rhel9:latest Dec 15 04:56:23 localhost podman[310559]: 2025-12-15 09:56:23.16726282 +0000 UTC m=+0.147779993 container start 9df5c7492637a03879dacdb71b81d593ede093907ea1700e51e9315b3541c48b (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=crazy_goodall, name=rhceph, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, distribution-scope=public, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, RELEASE=main, version=7, description=Red Hat Ceph Storage 7, GIT_REPO=https://github.com/ceph/ceph-container.git, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, url=https://catalog.redhat.com/en/search?searchType=containers, release=1763362218, io.buildah.version=1.41.4, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., com.redhat.component=rhceph-container, maintainer=Guillaume Abrioux , org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, vendor=Red Hat, Inc., GIT_BRANCH=main, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, ceph=True, GIT_CLEAN=True, io.k8s.description=Red Hat Ceph Storage 7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, build-date=2025-11-26T19:44:28Z, io.openshift.tags=rhceph ceph, vcs-type=git, CEPH_POINT_RELEASE=, architecture=x86_64) Dec 15 04:56:23 localhost podman[310559]: 2025-12-15 09:56:23.167451345 +0000 UTC m=+0.147968608 container attach 9df5c7492637a03879dacdb71b81d593ede093907ea1700e51e9315b3541c48b (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=crazy_goodall, com.redhat.component=rhceph-container, io.k8s.description=Red Hat Ceph Storage 7, build-date=2025-11-26T19:44:28Z, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.buildah.version=1.41.4, 
cpe=cpe:/a:redhat:enterprise_linux:9::appstream, maintainer=Guillaume Abrioux , distribution-scope=public, GIT_REPO=https://github.com/ceph/ceph-container.git, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, RELEASE=main, ceph=True, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.openshift.tags=rhceph ceph, release=1763362218, vendor=Red Hat, Inc., architecture=x86_64, CEPH_POINT_RELEASE=, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, GIT_CLEAN=True, url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.expose-services=, version=7, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, description=Red Hat Ceph Storage 7, GIT_BRANCH=main, vcs-type=git, name=rhceph) Dec 15 04:56:23 localhost crazy_goodall[310573]: 167 167 Dec 15 04:56:23 localhost systemd[1]: libpod-9df5c7492637a03879dacdb71b81d593ede093907ea1700e51e9315b3541c48b.scope: Deactivated successfully. 
Dec 15 04:56:23 localhost podman[310559]: 2025-12-15 09:56:23.170836249 +0000 UTC m=+0.151353482 container died 9df5c7492637a03879dacdb71b81d593ede093907ea1700e51e9315b3541c48b (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=crazy_goodall, GIT_REPO=https://github.com/ceph/ceph-container.git, build-date=2025-11-26T19:44:28Z, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, ceph=True, CEPH_POINT_RELEASE=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, release=1763362218, io.buildah.version=1.41.4, GIT_CLEAN=True, GIT_BRANCH=main, name=rhceph, vendor=Red Hat, Inc., io.openshift.tags=rhceph ceph, maintainer=Guillaume Abrioux , distribution-scope=public, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, architecture=x86_64, io.k8s.description=Red Hat Ceph Storage 7, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., com.redhat.component=rhceph-container, RELEASE=main, description=Red Hat Ceph Storage 7, vcs-type=git, version=7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, url=https://catalog.redhat.com/en/search?searchType=containers) Dec 15 04:56:23 localhost ceph-mon[298913]: Updating np0005559462.localdomain:/var/lib/ceph/bce17446-41b5-5408-a23e-0b011906b44a/config/ceph.client.admin.keyring Dec 15 04:56:23 localhost ceph-mon[298913]: Updating np0005559463.localdomain:/var/lib/ceph/bce17446-41b5-5408-a23e-0b011906b44a/config/ceph.client.admin.keyring Dec 15 04:56:23 localhost ceph-mon[298913]: Updating np0005559464.localdomain:/var/lib/ceph/bce17446-41b5-5408-a23e-0b011906b44a/config/ceph.client.admin.keyring Dec 15 04:56:23 localhost ceph-mon[298913]: from='mgr.26885 ' entity='mgr.np0005559463.daptkf' Dec 15 04:56:23 localhost ceph-mon[298913]: from='mgr.26885 ' 
entity='mgr.np0005559463.daptkf' Dec 15 04:56:23 localhost ceph-mon[298913]: from='mgr.26885 ' entity='mgr.np0005559463.daptkf' Dec 15 04:56:23 localhost ceph-mon[298913]: from='mgr.26885 ' entity='mgr.np0005559463.daptkf' Dec 15 04:56:23 localhost ceph-mon[298913]: from='mgr.26885 ' entity='mgr.np0005559463.daptkf' Dec 15 04:56:23 localhost ceph-mon[298913]: from='mgr.26885 ' entity='mgr.np0005559463.daptkf' Dec 15 04:56:23 localhost ceph-mon[298913]: from='mgr.26885 172.18.0.107:0/2441414812' entity='mgr.np0005559463.daptkf' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005559462.localdomain", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch Dec 15 04:56:23 localhost ceph-mon[298913]: from='mgr.26885 ' entity='mgr.np0005559463.daptkf' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005559462.localdomain", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch Dec 15 04:56:23 localhost podman[310578]: 2025-12-15 09:56:23.252397812 +0000 UTC m=+0.072886164 container remove 9df5c7492637a03879dacdb71b81d593ede093907ea1700e51e9315b3541c48b (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=crazy_goodall, architecture=x86_64, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, com.redhat.component=rhceph-container, io.k8s.description=Red Hat Ceph Storage 7, CEPH_POINT_RELEASE=, GIT_BRANCH=main, version=7, build-date=2025-11-26T19:44:28Z, description=Red Hat Ceph Storage 7, io.openshift.expose-services=, io.openshift.tags=rhceph ceph, GIT_CLEAN=True, RELEASE=main, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, ceph=True, distribution-scope=public, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vcs-type=git, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhceph, url=https://catalog.redhat.com/en/search?searchType=containers, vendor=Red Hat, Inc., release=1763362218, GIT_REPO=https://github.com/ceph/ceph-container.git, io.buildah.version=1.41.4, maintainer=Guillaume Abrioux ) Dec 15 04:56:23 localhost systemd[1]: libpod-conmon-9df5c7492637a03879dacdb71b81d593ede093907ea1700e51e9315b3541c48b.scope: Deactivated successfully. Dec 15 04:56:23 localhost ceph-mon[298913]: mon.np0005559462@0(leader) e17 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005559462.localdomain.devices.0}] v 0) Dec 15 04:56:23 localhost ceph-mon[298913]: log_channel(audit) log [INF] : from='mgr.26885 ' entity='mgr.np0005559463.daptkf' Dec 15 04:56:23 localhost ceph-mon[298913]: mon.np0005559462@0(leader) e17 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005559462.localdomain}] v 0) Dec 15 04:56:23 localhost ceph-mon[298913]: log_channel(audit) log [INF] : from='mgr.26885 ' entity='mgr.np0005559463.daptkf' Dec 15 04:56:23 localhost podman[310649]: Dec 15 04:56:23 localhost podman[310649]: 2025-12-15 09:56:23.971545254 +0000 UTC m=+0.079667513 container create 9a1e76b85147c74bc338007acba7de5842ac920cd35651e1507c5a2d6cd5ceab (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=cool_lederberg, com.redhat.component=rhceph-container, version=7, url=https://catalog.redhat.com/en/search?searchType=containers, name=rhceph, RELEASE=main, description=Red Hat Ceph Storage 7, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, io.k8s.description=Red Hat Ceph Storage 7, io.openshift.tags=rhceph ceph, CEPH_POINT_RELEASE=, io.openshift.expose-services=, io.buildah.version=1.41.4, vcs-type=git, maintainer=Guillaume Abrioux , GIT_REPO=https://github.com/ceph/ceph-container.git, ceph=True, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vendor=Red Hat, Inc., 
distribution-scope=public, GIT_BRANCH=main, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, GIT_CLEAN=True, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1763362218, build-date=2025-11-26T19:44:28Z, architecture=x86_64, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0) Dec 15 04:56:24 localhost systemd[1]: Started libpod-conmon-9a1e76b85147c74bc338007acba7de5842ac920cd35651e1507c5a2d6cd5ceab.scope. Dec 15 04:56:24 localhost systemd[1]: Started libcrun container. Dec 15 04:56:24 localhost podman[310649]: 2025-12-15 09:56:24.032732343 +0000 UTC m=+0.140854602 container init 9a1e76b85147c74bc338007acba7de5842ac920cd35651e1507c5a2d6cd5ceab (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=cool_lederberg, description=Red Hat Ceph Storage 7, io.openshift.expose-services=, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, GIT_CLEAN=True, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., build-date=2025-11-26T19:44:28Z, architecture=x86_64, CEPH_POINT_RELEASE=, io.buildah.version=1.41.4, GIT_REPO=https://github.com/ceph/ceph-container.git, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, url=https://catalog.redhat.com/en/search?searchType=containers, vendor=Red Hat, Inc., name=rhceph, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, maintainer=Guillaume Abrioux , version=7, ceph=True, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, GIT_BRANCH=main, io.openshift.tags=rhceph ceph, RELEASE=main, vcs-type=git, io.k8s.description=Red Hat Ceph Storage 7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, release=1763362218, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, 
com.redhat.component=rhceph-container) Dec 15 04:56:24 localhost podman[310649]: 2025-12-15 09:56:23.936957933 +0000 UTC m=+0.045080222 image pull registry.redhat.io/rhceph/rhceph-7-rhel9:latest Dec 15 04:56:24 localhost podman[310649]: 2025-12-15 09:56:24.039451099 +0000 UTC m=+0.147573368 container start 9a1e76b85147c74bc338007acba7de5842ac920cd35651e1507c5a2d6cd5ceab (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=cool_lederberg, maintainer=Guillaume Abrioux , GIT_BRANCH=main, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, name=rhceph, com.redhat.component=rhceph-container, description=Red Hat Ceph Storage 7, ceph=True, build-date=2025-11-26T19:44:28Z, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, release=1763362218, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, distribution-scope=public, architecture=x86_64, io.openshift.tags=rhceph ceph, io.buildah.version=1.41.4, url=https://catalog.redhat.com/en/search?searchType=containers, RELEASE=main, vendor=Red Hat, Inc., vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_CLEAN=True, version=7, CEPH_POINT_RELEASE=, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, GIT_REPO=https://github.com/ceph/ceph-container.git, io.k8s.description=Red Hat Ceph Storage 7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image.) 
Dec 15 04:56:24 localhost podman[310649]: 2025-12-15 09:56:24.039664554 +0000 UTC m=+0.147786823 container attach 9a1e76b85147c74bc338007acba7de5842ac920cd35651e1507c5a2d6cd5ceab (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=cool_lederberg, architecture=x86_64, GIT_REPO=https://github.com/ceph/ceph-container.git, url=https://catalog.redhat.com/en/search?searchType=containers, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.k8s.description=Red Hat Ceph Storage 7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1763362218, ceph=True, io.openshift.expose-services=, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, GIT_BRANCH=main, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, name=rhceph, RELEASE=main, io.openshift.tags=rhceph ceph, CEPH_POINT_RELEASE=, distribution-scope=public, com.redhat.component=rhceph-container, vcs-type=git, version=7, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, build-date=2025-11-26T19:44:28Z, maintainer=Guillaume Abrioux , io.buildah.version=1.41.4, description=Red Hat Ceph Storage 7, GIT_CLEAN=True, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vendor=Red Hat, Inc.) Dec 15 04:56:24 localhost cool_lederberg[310664]: 167 167 Dec 15 04:56:24 localhost systemd[1]: libpod-9a1e76b85147c74bc338007acba7de5842ac920cd35651e1507c5a2d6cd5ceab.scope: Deactivated successfully. 
Dec 15 04:56:24 localhost podman[310649]: 2025-12-15 09:56:24.041047203 +0000 UTC m=+0.149169442 container died 9a1e76b85147c74bc338007acba7de5842ac920cd35651e1507c5a2d6cd5ceab (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=cool_lederberg, GIT_REPO=https://github.com/ceph/ceph-container.git, CEPH_POINT_RELEASE=, name=rhceph, io.buildah.version=1.41.4, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, build-date=2025-11-26T19:44:28Z, com.redhat.component=rhceph-container, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat Ceph Storage 7, release=1763362218, url=https://catalog.redhat.com/en/search?searchType=containers, distribution-scope=public, io.k8s.description=Red Hat Ceph Storage 7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.openshift.expose-services=, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vendor=Red Hat, Inc., architecture=x86_64, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, GIT_CLEAN=True, GIT_BRANCH=main, maintainer=Guillaume Abrioux , io.openshift.tags=rhceph ceph, version=7, ceph=True, RELEASE=main) Dec 15 04:56:24 localhost systemd[1]: var-lib-containers-storage-overlay-7849207df4fee9940b4dd7e5401f2ba49c682ceb9c08c2f5742d110ba711ee28-merged.mount: Deactivated successfully. Dec 15 04:56:24 localhost systemd[1]: var-lib-containers-storage-overlay-383800c2d84d20df9dd61d5524af729a2e0314366ccbed4a576280d30779535e-merged.mount: Deactivated successfully. 
Dec 15 04:56:24 localhost podman[310670]: 2025-12-15 09:56:24.130532816 +0000 UTC m=+0.077136391 container remove 9a1e76b85147c74bc338007acba7de5842ac920cd35651e1507c5a2d6cd5ceab (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=cool_lederberg, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, vcs-type=git, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, ceph=True, url=https://catalog.redhat.com/en/search?searchType=containers, GIT_BRANCH=main, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, name=rhceph, distribution-scope=public, GIT_CLEAN=True, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, description=Red Hat Ceph Storage 7, io.openshift.tags=rhceph ceph, architecture=x86_64, RELEASE=main, maintainer=Guillaume Abrioux , CEPH_POINT_RELEASE=, io.buildah.version=1.41.4, GIT_REPO=https://github.com/ceph/ceph-container.git, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., com.redhat.component=rhceph-container, build-date=2025-11-26T19:44:28Z, version=7, io.openshift.expose-services=, release=1763362218, io.k8s.description=Red Hat Ceph Storage 7, vendor=Red Hat, Inc.) Dec 15 04:56:24 localhost systemd[1]: libpod-conmon-9a1e76b85147c74bc338007acba7de5842ac920cd35651e1507c5a2d6cd5ceab.scope: Deactivated successfully. 
Dec 15 04:56:24 localhost ceph-mon[298913]: mon.np0005559462@0(leader) e17 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005559462.localdomain.devices.0}] v 0) Dec 15 04:56:24 localhost ceph-mon[298913]: log_channel(audit) log [INF] : from='mgr.26885 ' entity='mgr.np0005559463.daptkf' Dec 15 04:56:24 localhost ceph-mon[298913]: mon.np0005559462@0(leader) e17 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005559462.localdomain}] v 0) Dec 15 04:56:24 localhost ceph-mon[298913]: log_channel(audit) log [INF] : from='mgr.26885 ' entity='mgr.np0005559463.daptkf' Dec 15 04:56:24 localhost ceph-mon[298913]: Reconfiguring crash.np0005559462 (monmap changed)... Dec 15 04:56:24 localhost ceph-mon[298913]: Reconfiguring daemon crash.np0005559462 on np0005559462.localdomain Dec 15 04:56:24 localhost ceph-mon[298913]: from='mgr.26885 ' entity='mgr.np0005559463.daptkf' Dec 15 04:56:24 localhost ceph-mon[298913]: from='mgr.26885 ' entity='mgr.np0005559463.daptkf' Dec 15 04:56:24 localhost ceph-mon[298913]: Reconfiguring osd.0 (monmap changed)... 
Dec 15 04:56:24 localhost ceph-mon[298913]: from='mgr.26885 172.18.0.107:0/2441414812' entity='mgr.np0005559463.daptkf' cmd={"prefix": "auth get", "entity": "osd.0"} : dispatch Dec 15 04:56:24 localhost ceph-mon[298913]: Reconfiguring daemon osd.0 on np0005559462.localdomain Dec 15 04:56:24 localhost ceph-mon[298913]: from='mgr.26885 ' entity='mgr.np0005559463.daptkf' Dec 15 04:56:24 localhost ceph-mon[298913]: mon.np0005559462@0(leader) e17 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005559462.localdomain.devices.0}] v 0) Dec 15 04:56:24 localhost ceph-mon[298913]: log_channel(audit) log [INF] : from='mgr.26885 ' entity='mgr.np0005559463.daptkf' Dec 15 04:56:24 localhost ceph-mon[298913]: mon.np0005559462@0(leader) e17 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005559462.localdomain}] v 0) Dec 15 04:56:24 localhost ceph-mon[298913]: log_channel(audit) log [INF] : from='mgr.26885 ' entity='mgr.np0005559463.daptkf' Dec 15 04:56:24 localhost podman[310744]: Dec 15 04:56:24 localhost podman[310744]: 2025-12-15 09:56:24.972109946 +0000 UTC m=+0.073006377 container create 121f62eaff0625dcdd0b64e62aa3a8231b9243beef774687423db61f118d7d31 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=romantic_goldwasser, GIT_BRANCH=main, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, RELEASE=main, io.openshift.tags=rhceph ceph, GIT_CLEAN=True, io.openshift.expose-services=, name=rhceph, maintainer=Guillaume Abrioux , release=1763362218, url=https://catalog.redhat.com/en/search?searchType=containers, CEPH_POINT_RELEASE=, architecture=x86_64, distribution-scope=public, build-date=2025-11-26T19:44:28Z, io.buildah.version=1.41.4, vcs-type=git, GIT_REPO=https://github.com/ceph/ceph-container.git, ceph=True, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, com.redhat.component=rhceph-container, vendor=Red Hat, Inc., io.k8s.description=Red Hat Ceph Storage 7, 
org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, version=7, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., cpe=cpe:/a:redhat:enterprise_linux:9::appstream, description=Red Hat Ceph Storage 7) Dec 15 04:56:25 localhost systemd[1]: Started libpod-conmon-121f62eaff0625dcdd0b64e62aa3a8231b9243beef774687423db61f118d7d31.scope. Dec 15 04:56:25 localhost systemd[1]: Started libcrun container. Dec 15 04:56:25 localhost podman[310744]: 2025-12-15 09:56:24.943561004 +0000 UTC m=+0.044457465 image pull registry.redhat.io/rhceph/rhceph-7-rhel9:latest Dec 15 04:56:25 localhost podman[310744]: 2025-12-15 09:56:25.045338399 +0000 UTC m=+0.146234840 container init 121f62eaff0625dcdd0b64e62aa3a8231b9243beef774687423db61f118d7d31 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=romantic_goldwasser, RELEASE=main, GIT_BRANCH=main, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, com.redhat.component=rhceph-container, description=Red Hat Ceph Storage 7, build-date=2025-11-26T19:44:28Z, io.k8s.description=Red Hat Ceph Storage 7, io.openshift.tags=rhceph ceph, release=1763362218, url=https://catalog.redhat.com/en/search?searchType=containers, GIT_CLEAN=True, CEPH_POINT_RELEASE=, ceph=True, version=7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, name=rhceph, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, maintainer=Guillaume Abrioux , summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, architecture=x86_64, GIT_REPO=https://github.com/ceph/ceph-container.git, io.buildah.version=1.41.4, vendor=Red Hat, Inc., 
distribution-scope=public, vcs-type=git, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9) Dec 15 04:56:25 localhost podman[310744]: 2025-12-15 09:56:25.05332252 +0000 UTC m=+0.154218971 container start 121f62eaff0625dcdd0b64e62aa3a8231b9243beef774687423db61f118d7d31 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=romantic_goldwasser, name=rhceph, GIT_REPO=https://github.com/ceph/ceph-container.git, GIT_CLEAN=True, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, maintainer=Guillaume Abrioux , CEPH_POINT_RELEASE=, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vendor=Red Hat, Inc., distribution-scope=public, GIT_BRANCH=main, version=7, build-date=2025-11-26T19:44:28Z, RELEASE=main, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., architecture=x86_64, vcs-type=git, url=https://catalog.redhat.com/en/search?searchType=containers, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, io.openshift.expose-services=, io.k8s.description=Red Hat Ceph Storage 7, ceph=True, io.openshift.tags=rhceph ceph, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, com.redhat.component=rhceph-container, description=Red Hat Ceph Storage 7, release=1763362218, io.buildah.version=1.41.4) Dec 15 04:56:25 localhost podman[310744]: 2025-12-15 09:56:25.053565577 +0000 UTC m=+0.154462018 container attach 121f62eaff0625dcdd0b64e62aa3a8231b9243beef774687423db61f118d7d31 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=romantic_goldwasser, GIT_CLEAN=True, com.redhat.component=rhceph-container, vcs-type=git, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, io.k8s.description=Red Hat Ceph Storage 7, 
summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., CEPH_POINT_RELEASE=, RELEASE=main, ceph=True, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, name=rhceph, maintainer=Guillaume Abrioux , release=1763362218, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, architecture=x86_64, url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.expose-services=, description=Red Hat Ceph Storage 7, GIT_REPO=https://github.com/ceph/ceph-container.git, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, version=7, GIT_BRANCH=main, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:enterprise_linux:9::appstream, build-date=2025-11-26T19:44:28Z, distribution-scope=public, io.openshift.tags=rhceph ceph) Dec 15 04:56:25 localhost romantic_goldwasser[310759]: 167 167 Dec 15 04:56:25 localhost systemd[1]: libpod-121f62eaff0625dcdd0b64e62aa3a8231b9243beef774687423db61f118d7d31.scope: Deactivated successfully. 
Dec 15 04:56:25 localhost podman[310744]: 2025-12-15 09:56:25.057325221 +0000 UTC m=+0.158221732 container died 121f62eaff0625dcdd0b64e62aa3a8231b9243beef774687423db61f118d7d31 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=romantic_goldwasser, version=7, GIT_BRANCH=main, CEPH_POINT_RELEASE=, maintainer=Guillaume Abrioux , GIT_CLEAN=True, io.openshift.expose-services=, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, GIT_REPO=https://github.com/ceph/ceph-container.git, distribution-scope=public, url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.tags=rhceph ceph, vcs-type=git, build-date=2025-11-26T19:44:28Z, release=1763362218, ceph=True, name=rhceph, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, io.k8s.description=Red Hat Ceph Storage 7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vendor=Red Hat, Inc., org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, architecture=x86_64, com.redhat.component=rhceph-container, description=Red Hat Ceph Storage 7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, RELEASE=main, io.buildah.version=1.41.4) Dec 15 04:56:25 localhost systemd[1]: tmp-crun.p1Bj6Q.mount: Deactivated successfully. Dec 15 04:56:25 localhost systemd[1]: var-lib-containers-storage-overlay-5a6eeac167ffbee33e16a08d99bd0c35cb558330b6c13626bbe6155b9d414b12-merged.mount: Deactivated successfully. 
Dec 15 04:56:25 localhost podman[310764]: 2025-12-15 09:56:25.153339707 +0000 UTC m=+0.085753922 container remove 121f62eaff0625dcdd0b64e62aa3a8231b9243beef774687423db61f118d7d31 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=romantic_goldwasser, vcs-type=git, build-date=2025-11-26T19:44:28Z, url=https://catalog.redhat.com/en/search?searchType=containers, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.openshift.expose-services=, GIT_REPO=https://github.com/ceph/ceph-container.git, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat Ceph Storage 7, com.redhat.component=rhceph-container, distribution-scope=public, GIT_CLEAN=True, version=7, architecture=x86_64, maintainer=Guillaume Abrioux , ceph=True, io.k8s.description=Red Hat Ceph Storage 7, name=rhceph, GIT_BRANCH=main, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, vendor=Red Hat, Inc., CEPH_POINT_RELEASE=, io.buildah.version=1.41.4, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, RELEASE=main, io.openshift.tags=rhceph ceph, release=1763362218, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image.) Dec 15 04:56:25 localhost systemd[1]: libpod-conmon-121f62eaff0625dcdd0b64e62aa3a8231b9243beef774687423db61f118d7d31.scope: Deactivated successfully. 
Dec 15 04:56:25 localhost ceph-mon[298913]: mon.np0005559462@0(leader).osd e89 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Dec 15 04:56:25 localhost ceph-mon[298913]: mon.np0005559462@0(leader) e17 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005559462.localdomain.devices.0}] v 0) Dec 15 04:56:25 localhost ceph-mon[298913]: log_channel(audit) log [INF] : from='mgr.26885 ' entity='mgr.np0005559463.daptkf' Dec 15 04:56:25 localhost ceph-mon[298913]: mon.np0005559462@0(leader) e17 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005559462.localdomain}] v 0) Dec 15 04:56:25 localhost ceph-mon[298913]: from='mgr.26885 ' entity='mgr.np0005559463.daptkf' Dec 15 04:56:25 localhost ceph-mon[298913]: from='mgr.26885 ' entity='mgr.np0005559463.daptkf' Dec 15 04:56:25 localhost ceph-mon[298913]: from='mgr.26885 172.18.0.107:0/2441414812' entity='mgr.np0005559463.daptkf' cmd={"prefix": "auth get", "entity": "osd.3"} : dispatch Dec 15 04:56:25 localhost ceph-mon[298913]: from='mgr.26885 ' entity='mgr.np0005559463.daptkf' Dec 15 04:56:25 localhost ceph-mon[298913]: from='mgr.26885 ' entity='mgr.np0005559463.daptkf' Dec 15 04:56:25 localhost ceph-mon[298913]: log_channel(audit) log [INF] : from='mgr.26885 ' entity='mgr.np0005559463.daptkf' Dec 15 04:56:25 localhost ceph-mon[298913]: mon.np0005559462@0(leader) e17 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005559462.localdomain.devices.0}] v 0) Dec 15 04:56:25 localhost ceph-mon[298913]: log_channel(audit) log [INF] : from='mgr.26885 ' entity='mgr.np0005559463.daptkf' Dec 15 04:56:25 localhost ceph-mon[298913]: mon.np0005559462@0(leader) e17 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005559462.localdomain}] v 0) Dec 15 04:56:25 localhost ceph-mon[298913]: log_channel(audit) log [INF] : from='mgr.26885 ' entity='mgr.np0005559463.daptkf' Dec 15 
04:56:25 localhost ceph-mon[298913]: mon.np0005559462@0(leader) e17 handle_command mon_command({"prefix": "auth get-or-create", "entity": "mds.mds.np0005559462.mhigvc", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]} v 0) Dec 15 04:56:25 localhost ceph-mon[298913]: log_channel(audit) log [INF] : from='mgr.26885 ' entity='mgr.np0005559463.daptkf' cmd={"prefix": "auth get-or-create", "entity": "mds.mds.np0005559462.mhigvc", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]} : dispatch Dec 15 04:56:25 localhost ceph-mon[298913]: mon.np0005559462@0(leader) e17 handle_command mon_command([{prefix=config-key set, key=mgr/progress/completed}] v 0) Dec 15 04:56:25 localhost ceph-mon[298913]: log_channel(audit) log [INF] : from='mgr.26885 ' entity='mgr.np0005559463.daptkf' Dec 15 04:56:25 localhost nova_compute[286344]: 2025-12-15 09:56:25.871 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 04:56:25 localhost systemd[1]: Started /usr/bin/podman healthcheck run 67ee72fcd99c896f79e8807764b1d0f04e7d45927d06e306c0986394e8ed42a0. Dec 15 04:56:25 localhost systemd[1]: Started /usr/bin/podman healthcheck run 9f0f481c937606298203797a71924b7614a5d9e3f4a8afd837cfa5b9602aedeb. Dec 15 04:56:25 localhost systemd[1]: Started /usr/bin/podman healthcheck run b3d8483ade539500e228c2af9c0c3e647febb5ec7903610a83aea32631007d4a. 
Dec 15 04:56:26 localhost podman[310838]: 2025-12-15 09:56:26.073192609 +0000 UTC m=+0.107888866 container health_status 9f0f481c937606298203797a71924b7614a5d9e3f4a8afd837cfa5b9602aedeb (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, managed_by=edpm_ansible, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, tcib_managed=true, config_id=multipathd, container_name=multipathd, org.label-schema.schema-version=1.0) Dec 15 04:56:26 localhost podman[310838]: 2025-12-15 09:56:26.084606595 +0000 UTC m=+0.119302882 container exec_died 
9f0f481c937606298203797a71924b7614a5d9e3f4a8afd837cfa5b9602aedeb (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, tcib_managed=true, container_name=multipathd, config_id=multipathd, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202) Dec 15 04:56:26 localhost systemd[1]: 9f0f481c937606298203797a71924b7614a5d9e3f4a8afd837cfa5b9602aedeb.service: Deactivated successfully. 
Dec 15 04:56:26 localhost podman[310837]: 2025-12-15 09:56:26.168077453 +0000 UTC m=+0.202889364 container health_status 67ee72fcd99c896f79e8807764b1d0f04e7d45927d06e306c0986394e8ed42a0 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, managed_by=edpm_ansible, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors ) Dec 15 04:56:26 localhost podman[310837]: 2025-12-15 09:56:26.183385077 +0000 UTC m=+0.218196988 container exec_died 67ee72fcd99c896f79e8807764b1d0f04e7d45927d06e306c0986394e8ed42a0 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', 
'--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible) Dec 15 04:56:26 localhost systemd[1]: 67ee72fcd99c896f79e8807764b1d0f04e7d45927d06e306c0986394e8ed42a0.service: Deactivated successfully. 
Dec 15 04:56:26 localhost podman[310840]: 2025-12-15 09:56:26.282527789 +0000 UTC m=+0.308991488 container health_status b3d8483ade539500e228c2af9c0c3e647febb5ec7903610a83aea32631007d4a (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=ceilometer_agent_compute, io.buildah.version=1.41.3, managed_by=edpm_ansible, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '07f3af15d79276418f54a8dde2fc251064527c3571430e14af77aaa2c01f1193-cb1e9478634a00c8fb5ad9cd7fcd38c7b2a1a85187dfaa618bc715c24c2b15fb'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.build-date=20251202, 
org.label-schema.name=CentOS Stream 9 Base Image) Dec 15 04:56:26 localhost podman[310860]: Dec 15 04:56:26 localhost podman[310840]: 2025-12-15 09:56:26.321422929 +0000 UTC m=+0.347886668 container exec_died b3d8483ade539500e228c2af9c0c3e647febb5ec7903610a83aea32631007d4a (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, managed_by=edpm_ansible, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '07f3af15d79276418f54a8dde2fc251064527c3571430e14af77aaa2c01f1193-cb1e9478634a00c8fb5ad9cd7fcd38c7b2a1a85187dfaa618bc715c24c2b15fb'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', 
'/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=ceilometer_agent_compute) Dec 15 04:56:26 localhost systemd[1]: b3d8483ade539500e228c2af9c0c3e647febb5ec7903610a83aea32631007d4a.service: Deactivated successfully. Dec 15 04:56:26 localhost podman[310860]: 2025-12-15 09:56:26.244116013 +0000 UTC m=+0.248904880 image pull registry.redhat.io/rhceph/rhceph-7-rhel9:latest Dec 15 04:56:26 localhost podman[310860]: 2025-12-15 09:56:26.376039995 +0000 UTC m=+0.380828872 container create 7098fdbfc43b398fde3f20721c615fa99b2d3732638dcde7b2cfee7986fe17ae (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=kind_mahavira, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, maintainer=Guillaume Abrioux , com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-11-26T19:44:28Z, name=rhceph, GIT_BRANCH=main, url=https://catalog.redhat.com/en/search?searchType=containers, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.openshift.expose-services=, architecture=x86_64, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, version=7, vcs-type=git, GIT_CLEAN=True, io.k8s.description=Red Hat Ceph Storage 7, RELEASE=main, description=Red Hat Ceph Storage 7, release=1763362218, CEPH_POINT_RELEASE=, io.openshift.tags=rhceph ceph, io.buildah.version=1.41.4, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, GIT_REPO=https://github.com/ceph/ceph-container.git, distribution-scope=public, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, com.redhat.component=rhceph-container, ceph=True) Dec 15 04:56:26 localhost systemd[1]: Started libpod-conmon-7098fdbfc43b398fde3f20721c615fa99b2d3732638dcde7b2cfee7986fe17ae.scope. Dec 15 04:56:26 localhost systemd[1]: Started libcrun container. 
Dec 15 04:56:26 localhost podman[310860]: 2025-12-15 09:56:26.455244063 +0000 UTC m=+0.460032900 container init 7098fdbfc43b398fde3f20721c615fa99b2d3732638dcde7b2cfee7986fe17ae (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=kind_mahavira, maintainer=Guillaume Abrioux , ceph=True, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, com.redhat.component=rhceph-container, io.buildah.version=1.41.4, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, name=rhceph, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, GIT_CLEAN=True, GIT_REPO=https://github.com/ceph/ceph-container.git, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhceph ceph, GIT_BRANCH=main, version=7, distribution-scope=public, CEPH_POINT_RELEASE=, build-date=2025-11-26T19:44:28Z, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, url=https://catalog.redhat.com/en/search?searchType=containers, description=Red Hat Ceph Storage 7, release=1763362218, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., architecture=x86_64, vendor=Red Hat, Inc., RELEASE=main, io.k8s.description=Red Hat Ceph Storage 7, vcs-type=git) Dec 15 04:56:26 localhost podman[310860]: 2025-12-15 09:56:26.464236813 +0000 UTC m=+0.469025630 container start 7098fdbfc43b398fde3f20721c615fa99b2d3732638dcde7b2cfee7986fe17ae (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=kind_mahavira, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, architecture=x86_64, ceph=True, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, release=1763362218, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, 
url=https://catalog.redhat.com/en/search?searchType=containers, GIT_CLEAN=True, GIT_REPO=https://github.com/ceph/ceph-container.git, build-date=2025-11-26T19:44:28Z, io.k8s.description=Red Hat Ceph Storage 7, distribution-scope=public, vendor=Red Hat, Inc., CEPH_POINT_RELEASE=, io.openshift.tags=rhceph ceph, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., maintainer=Guillaume Abrioux , name=rhceph, GIT_BRANCH=main, io.buildah.version=1.41.4, RELEASE=main, description=Red Hat Ceph Storage 7, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, version=7, com.redhat.component=rhceph-container) Dec 15 04:56:26 localhost podman[310860]: 2025-12-15 09:56:26.46449753 +0000 UTC m=+0.469286387 container attach 7098fdbfc43b398fde3f20721c615fa99b2d3732638dcde7b2cfee7986fe17ae (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=kind_mahavira, ceph=True, name=rhceph, url=https://catalog.redhat.com/en/search?searchType=containers, vendor=Red Hat, Inc., version=7, io.k8s.description=Red Hat Ceph Storage 7, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, distribution-scope=public, com.redhat.component=rhceph-container, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, build-date=2025-11-26T19:44:28Z, description=Red Hat Ceph Storage 7, GIT_REPO=https://github.com/ceph/ceph-container.git, RELEASE=main, io.openshift.expose-services=, CEPH_POINT_RELEASE=, GIT_CLEAN=True, io.openshift.tags=rhceph ceph, vcs-type=git, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., architecture=x86_64, maintainer=Guillaume Abrioux , release=1763362218, GIT_BRANCH=main, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, 
io.buildah.version=1.41.4) Dec 15 04:56:26 localhost kind_mahavira[310918]: 167 167 Dec 15 04:56:26 localhost systemd[1]: libpod-7098fdbfc43b398fde3f20721c615fa99b2d3732638dcde7b2cfee7986fe17ae.scope: Deactivated successfully. Dec 15 04:56:26 localhost podman[310860]: 2025-12-15 09:56:26.469659133 +0000 UTC m=+0.474447940 container died 7098fdbfc43b398fde3f20721c615fa99b2d3732638dcde7b2cfee7986fe17ae (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=kind_mahavira, maintainer=Guillaume Abrioux , cpe=cpe:/a:redhat:enterprise_linux:9::appstream, release=1763362218, com.redhat.component=rhceph-container, io.openshift.expose-services=, GIT_BRANCH=main, build-date=2025-11-26T19:44:28Z, url=https://catalog.redhat.com/en/search?searchType=containers, architecture=x86_64, GIT_CLEAN=True, vcs-type=git, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, CEPH_POINT_RELEASE=, description=Red Hat Ceph Storage 7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, distribution-scope=public, io.openshift.tags=rhceph ceph, name=rhceph, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vendor=Red Hat, Inc., io.buildah.version=1.41.4, RELEASE=main, io.k8s.description=Red Hat Ceph Storage 7, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, version=7, GIT_REPO=https://github.com/ceph/ceph-container.git, ceph=True) Dec 15 04:56:26 localhost ceph-mon[298913]: Reconfiguring osd.3 (monmap changed)... 
Dec 15 04:56:26 localhost ceph-mon[298913]: Reconfiguring daemon osd.3 on np0005559462.localdomain Dec 15 04:56:26 localhost ceph-mon[298913]: from='mgr.26885 ' entity='mgr.np0005559463.daptkf' Dec 15 04:56:26 localhost ceph-mon[298913]: from='mgr.26885 ' entity='mgr.np0005559463.daptkf' Dec 15 04:56:26 localhost ceph-mon[298913]: from='mgr.26885 172.18.0.107:0/2441414812' entity='mgr.np0005559463.daptkf' cmd={"prefix": "auth get-or-create", "entity": "mds.mds.np0005559462.mhigvc", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]} : dispatch Dec 15 04:56:26 localhost ceph-mon[298913]: from='mgr.26885 ' entity='mgr.np0005559463.daptkf' Dec 15 04:56:26 localhost ceph-mon[298913]: from='mgr.26885 ' entity='mgr.np0005559463.daptkf' cmd={"prefix": "auth get-or-create", "entity": "mds.mds.np0005559462.mhigvc", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]} : dispatch Dec 15 04:56:26 localhost ceph-mon[298913]: from='mgr.26885 ' entity='mgr.np0005559463.daptkf' Dec 15 04:56:26 localhost podman[310923]: 2025-12-15 09:56:26.567156849 +0000 UTC m=+0.084793584 container remove 7098fdbfc43b398fde3f20721c615fa99b2d3732638dcde7b2cfee7986fe17ae (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=kind_mahavira, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, RELEASE=main, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., io.openshift.expose-services=, io.k8s.description=Red Hat Ceph Storage 7, ceph=True, build-date=2025-11-26T19:44:28Z, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, com.redhat.component=rhceph-container, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, url=https://catalog.redhat.com/en/search?searchType=containers, GIT_REPO=https://github.com/ceph/ceph-container.git, version=7, CEPH_POINT_RELEASE=, vcs-type=git, distribution-scope=public, GIT_CLEAN=True, name=rhceph, 
maintainer=Guillaume Abrioux , summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.openshift.tags=rhceph ceph, architecture=x86_64, GIT_BRANCH=main, release=1763362218, description=Red Hat Ceph Storage 7, io.buildah.version=1.41.4, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0) Dec 15 04:56:26 localhost systemd[1]: libpod-conmon-7098fdbfc43b398fde3f20721c615fa99b2d3732638dcde7b2cfee7986fe17ae.scope: Deactivated successfully. Dec 15 04:56:26 localhost ceph-mon[298913]: mon.np0005559462@0(leader) e17 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005559462.localdomain.devices.0}] v 0) Dec 15 04:56:26 localhost ceph-mon[298913]: log_channel(audit) log [INF] : from='mgr.26885 ' entity='mgr.np0005559463.daptkf' Dec 15 04:56:26 localhost ceph-mon[298913]: mon.np0005559462@0(leader) e17 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005559462.localdomain}] v 0) Dec 15 04:56:26 localhost ceph-mon[298913]: log_channel(audit) log [INF] : from='mgr.26885 ' entity='mgr.np0005559463.daptkf' Dec 15 04:56:26 localhost ceph-mon[298913]: mon.np0005559462@0(leader) e17 handle_command mon_command({"prefix": "auth get-or-create", "entity": "mgr.np0005559462.fudvyx", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} v 0) Dec 15 04:56:26 localhost ceph-mon[298913]: log_channel(audit) log [INF] : from='mgr.26885 ' entity='mgr.np0005559463.daptkf' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005559462.fudvyx", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch Dec 15 04:56:27 localhost systemd[1]: var-lib-containers-storage-overlay-631be0af81a5543011fe9a5403e7db67f41b58749b2db3f80b140fb96d71b58b-merged.mount: Deactivated successfully. 
Dec 15 04:56:27 localhost podman[310991]: Dec 15 04:56:27 localhost podman[310991]: 2025-12-15 09:56:27.280843409 +0000 UTC m=+0.077719159 container create 79801f698536056390e8ec20ef0cd6e6adcb3b9f909b49f561d0058ddd2111b0 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=infallible_driscoll, io.openshift.tags=rhceph ceph, GIT_REPO=https://github.com/ceph/ceph-container.git, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, RELEASE=main, version=7, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, release=1763362218, architecture=x86_64, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.buildah.version=1.41.4, com.redhat.component=rhceph-container, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, GIT_CLEAN=True, io.k8s.description=Red Hat Ceph Storage 7, vendor=Red Hat, Inc., org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, distribution-scope=public, maintainer=Guillaume Abrioux , CEPH_POINT_RELEASE=, name=rhceph, description=Red Hat Ceph Storage 7, build-date=2025-11-26T19:44:28Z, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, ceph=True, GIT_BRANCH=main) Dec 15 04:56:27 localhost systemd[1]: Started libpod-conmon-79801f698536056390e8ec20ef0cd6e6adcb3b9f909b49f561d0058ddd2111b0.scope. Dec 15 04:56:27 localhost systemd[1]: Started libcrun container. 
Dec 15 04:56:27 localhost podman[310991]: 2025-12-15 09:56:27.344881337 +0000 UTC m=+0.141757097 container init 79801f698536056390e8ec20ef0cd6e6adcb3b9f909b49f561d0058ddd2111b0 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=infallible_driscoll, GIT_CLEAN=True, io.k8s.description=Red Hat Ceph Storage 7, vendor=Red Hat, Inc., RELEASE=main, io.buildah.version=1.41.4, vcs-type=git, io.openshift.tags=rhceph ceph, build-date=2025-11-26T19:44:28Z, version=7, release=1763362218, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, url=https://catalog.redhat.com/en/search?searchType=containers, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.openshift.expose-services=, architecture=x86_64, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., ceph=True, com.redhat.component=rhceph-container, name=rhceph, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, CEPH_POINT_RELEASE=, maintainer=Guillaume Abrioux , io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_REPO=https://github.com/ceph/ceph-container.git, GIT_BRANCH=main, description=Red Hat Ceph Storage 7, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0) Dec 15 04:56:27 localhost podman[310991]: 2025-12-15 09:56:27.24847072 +0000 UTC m=+0.045346510 image pull registry.redhat.io/rhceph/rhceph-7-rhel9:latest Dec 15 04:56:27 localhost podman[310991]: 2025-12-15 09:56:27.354288567 +0000 UTC m=+0.151164327 container start 79801f698536056390e8ec20ef0cd6e6adcb3b9f909b49f561d0058ddd2111b0 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=infallible_driscoll, io.openshift.expose-services=, ceph=True, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=Guillaume Abrioux , summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully 
featured and supported base image., name=rhceph, architecture=x86_64, GIT_REPO=https://github.com/ceph/ceph-container.git, RELEASE=main, CEPH_POINT_RELEASE=, build-date=2025-11-26T19:44:28Z, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, io.openshift.tags=rhceph ceph, version=7, io.buildah.version=1.41.4, distribution-scope=public, GIT_BRANCH=main, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vendor=Red Hat, Inc., release=1763362218, GIT_CLEAN=True, vcs-type=git, com.redhat.component=rhceph-container, description=Red Hat Ceph Storage 7, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.description=Red Hat Ceph Storage 7) Dec 15 04:56:27 localhost podman[310991]: 2025-12-15 09:56:27.354548675 +0000 UTC m=+0.151424465 container attach 79801f698536056390e8ec20ef0cd6e6adcb3b9f909b49f561d0058ddd2111b0 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=infallible_driscoll, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, build-date=2025-11-26T19:44:28Z, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, name=rhceph, GIT_REPO=https://github.com/ceph/ceph-container.git, url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.tags=rhceph ceph, GIT_BRANCH=main, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, vcs-type=git, ceph=True, architecture=x86_64, CEPH_POINT_RELEASE=, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.openshift.expose-services=, version=7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_CLEAN=True, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, RELEASE=main, release=1763362218, 
maintainer=Guillaume Abrioux , description=Red Hat Ceph Storage 7, io.k8s.description=Red Hat Ceph Storage 7, vendor=Red Hat, Inc., io.buildah.version=1.41.4, com.redhat.component=rhceph-container) Dec 15 04:56:27 localhost infallible_driscoll[311006]: 167 167 Dec 15 04:56:27 localhost systemd[1]: libpod-79801f698536056390e8ec20ef0cd6e6adcb3b9f909b49f561d0058ddd2111b0.scope: Deactivated successfully. Dec 15 04:56:27 localhost podman[310991]: 2025-12-15 09:56:27.36085092 +0000 UTC m=+0.157726680 container died 79801f698536056390e8ec20ef0cd6e6adcb3b9f909b49f561d0058ddd2111b0 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=infallible_driscoll, description=Red Hat Ceph Storage 7, RELEASE=main, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=rhceph-container, GIT_BRANCH=main, io.openshift.tags=rhceph ceph, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, version=7, ceph=True, release=1763362218, io.buildah.version=1.41.4, url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.expose-services=, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, vendor=Red Hat, Inc., name=rhceph, GIT_CLEAN=True, build-date=2025-11-26T19:44:28Z, maintainer=Guillaume Abrioux , vcs-type=git, GIT_REPO=https://github.com/ceph/ceph-container.git, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., architecture=x86_64, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, distribution-scope=public, CEPH_POINT_RELEASE=, io.k8s.description=Red Hat Ceph Storage 7) Dec 15 04:56:27 localhost podman[311011]: 2025-12-15 09:56:27.45742189 +0000 UTC m=+0.084698892 container remove 79801f698536056390e8ec20ef0cd6e6adcb3b9f909b49f561d0058ddd2111b0 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, 
name=infallible_driscoll, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, com.redhat.component=rhceph-container, ceph=True, version=7, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat Ceph Storage 7, io.openshift.tags=rhceph ceph, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, vcs-type=git, io.openshift.expose-services=, io.buildah.version=1.41.4, distribution-scope=public, release=1763362218, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, GIT_BRANCH=main, CEPH_POINT_RELEASE=, RELEASE=main, name=rhceph, GIT_CLEAN=True, description=Red Hat Ceph Storage 7, build-date=2025-11-26T19:44:28Z, maintainer=Guillaume Abrioux , GIT_REPO=https://github.com/ceph/ceph-container.git, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:enterprise_linux:9::appstream) Dec 15 04:56:27 localhost systemd[1]: libpod-conmon-79801f698536056390e8ec20ef0cd6e6adcb3b9f909b49f561d0058ddd2111b0.scope: Deactivated successfully. Dec 15 04:56:27 localhost ceph-mon[298913]: Reconfiguring mds.mds.np0005559462.mhigvc (monmap changed)... Dec 15 04:56:27 localhost ceph-mon[298913]: Reconfiguring daemon mds.mds.np0005559462.mhigvc on np0005559462.localdomain Dec 15 04:56:27 localhost ceph-mon[298913]: from='mgr.26885 ' entity='mgr.np0005559463.daptkf' Dec 15 04:56:27 localhost ceph-mon[298913]: from='mgr.26885 ' entity='mgr.np0005559463.daptkf' Dec 15 04:56:27 localhost ceph-mon[298913]: Reconfiguring mgr.np0005559462.fudvyx (monmap changed)... 
Dec 15 04:56:27 localhost ceph-mon[298913]: from='mgr.26885 172.18.0.107:0/2441414812' entity='mgr.np0005559463.daptkf' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005559462.fudvyx", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch Dec 15 04:56:27 localhost ceph-mon[298913]: from='mgr.26885 ' entity='mgr.np0005559463.daptkf' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005559462.fudvyx", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch Dec 15 04:56:27 localhost ceph-mon[298913]: Reconfiguring daemon mgr.np0005559462.fudvyx on np0005559462.localdomain Dec 15 04:56:27 localhost ceph-mon[298913]: mon.np0005559462@0(leader) e17 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005559462.localdomain.devices.0}] v 0) Dec 15 04:56:27 localhost ceph-mon[298913]: log_channel(audit) log [INF] : from='mgr.26885 ' entity='mgr.np0005559463.daptkf' Dec 15 04:56:27 localhost ceph-mon[298913]: mon.np0005559462@0(leader) e17 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005559462.localdomain}] v 0) Dec 15 04:56:27 localhost ceph-mon[298913]: log_channel(audit) log [INF] : from='mgr.26885 ' entity='mgr.np0005559463.daptkf' Dec 15 04:56:27 localhost ceph-mon[298913]: mon.np0005559462@0(leader) e17 handle_command mon_command({"prefix": "auth get-or-create", "entity": "client.crash.np0005559463.localdomain", "caps": ["mon", "profile crash", "mgr", "profile crash"]} v 0) Dec 15 04:56:27 localhost ceph-mon[298913]: log_channel(audit) log [INF] : from='mgr.26885 ' entity='mgr.np0005559463.daptkf' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005559463.localdomain", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch Dec 15 04:56:27 localhost systemd[1]: Started /usr/bin/podman healthcheck run 730c50020a7d9e2da21d2462d40af20ac303e43f3491f692cbbd87f432ba3e09. 
Dec 15 04:56:27 localhost systemd[1]: Started /usr/bin/podman healthcheck run ac2eae30a2a7f971398af42ea67b3ed799d4b3341f7688ebcd73f7ca7f66c2a8. Dec 15 04:56:28 localhost podman[311028]: 2025-12-15 09:56:28.018208166 +0000 UTC m=+0.098126585 container health_status 730c50020a7d9e2da21d2462d40af20ac303e43f3491f692cbbd87f432ba3e09 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.33.7, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.component=ubi9-minimal-container, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, build-date=2025-08-20T13:12:41, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'cb1e9478634a00c8fb5ad9cd7fcd38c7b2a1a85187dfaa618bc715c24c2b15fb'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', 
'/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, maintainer=Red Hat, Inc., vcs-type=git, container_name=openstack_network_exporter, io.openshift.expose-services=, version=9.6, config_id=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vendor=Red Hat, Inc., name=ubi9-minimal, managed_by=edpm_ansible, release=1755695350, architecture=x86_64, url=https://catalog.redhat.com/en/search?searchType=containers, distribution-scope=public) Dec 15 04:56:28 localhost podman[311029]: 2025-12-15 09:56:28.064733488 +0000 UTC m=+0.144604505 container health_status ac2eae30a2a7f971398af42ea67b3ed799d4b3341f7688ebcd73f7ca7f66c2a8 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, org.label-schema.license=GPLv2, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '07f3af15d79276418f54a8dde2fc251064527c3571430e14af77aaa2c01f1193'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image) Dec 15 04:56:28 localhost podman[311028]: 2025-12-15 09:56:28.08355115 +0000 UTC m=+0.163469559 container exec_died 730c50020a7d9e2da21d2462d40af20ac303e43f3491f692cbbd87f432ba3e09 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'cb1e9478634a00c8fb5ad9cd7fcd38c7b2a1a85187dfaa618bc715c24c2b15fb'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.tags=minimal rhel9, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, config_id=openstack_network_exporter, 
name=ubi9-minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., distribution-scope=public, managed_by=edpm_ansible, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vendor=Red Hat, Inc., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., url=https://catalog.redhat.com/en/search?searchType=containers, container_name=openstack_network_exporter, maintainer=Red Hat, Inc., release=1755695350, architecture=x86_64, build-date=2025-08-20T13:12:41, version=9.6, vcs-type=git, com.redhat.component=ubi9-minimal-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.33.7, io.openshift.expose-services=) Dec 15 04:56:28 localhost systemd[1]: 730c50020a7d9e2da21d2462d40af20ac303e43f3491f692cbbd87f432ba3e09.service: Deactivated successfully. 
Dec 15 04:56:28 localhost podman[311029]: 2025-12-15 09:56:28.099581974 +0000 UTC m=+0.179452981 container exec_died ac2eae30a2a7f971398af42ea67b3ed799d4b3341f7688ebcd73f7ca7f66c2a8 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, org.label-schema.license=GPLv2, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '07f3af15d79276418f54a8dde2fc251064527c3571430e14af77aaa2c01f1193'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251202, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.schema-version=1.0, managed_by=edpm_ansible) Dec 15 04:56:28 localhost systemd[1]: var-lib-containers-storage-overlay-20352b815ece5273de4cbd192dc480275f580c0c9bc7688ffd50b60e30c8c329-merged.mount: Deactivated successfully. Dec 15 04:56:28 localhost systemd[1]: ac2eae30a2a7f971398af42ea67b3ed799d4b3341f7688ebcd73f7ca7f66c2a8.service: Deactivated successfully. 
Dec 15 04:56:28 localhost ceph-mon[298913]: mon.np0005559462@0(leader) e17 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005559463.localdomain.devices.0}] v 0) Dec 15 04:56:28 localhost ceph-mon[298913]: log_channel(audit) log [INF] : from='mgr.26885 ' entity='mgr.np0005559463.daptkf' Dec 15 04:56:28 localhost ceph-mon[298913]: mon.np0005559462@0(leader) e17 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005559463.localdomain}] v 0) Dec 15 04:56:28 localhost ceph-mon[298913]: log_channel(audit) log [INF] : from='mgr.26885 ' entity='mgr.np0005559463.daptkf' Dec 15 04:56:28 localhost ceph-mon[298913]: from='mgr.26885 ' entity='mgr.np0005559463.daptkf' Dec 15 04:56:28 localhost ceph-mon[298913]: from='mgr.26885 ' entity='mgr.np0005559463.daptkf' Dec 15 04:56:28 localhost ceph-mon[298913]: Reconfiguring crash.np0005559463 (monmap changed)... Dec 15 04:56:28 localhost ceph-mon[298913]: from='mgr.26885 172.18.0.107:0/2441414812' entity='mgr.np0005559463.daptkf' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005559463.localdomain", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch Dec 15 04:56:28 localhost ceph-mon[298913]: Reconfiguring daemon crash.np0005559463 on np0005559463.localdomain Dec 15 04:56:28 localhost ceph-mon[298913]: from='mgr.26885 ' entity='mgr.np0005559463.daptkf' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005559463.localdomain", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch Dec 15 04:56:28 localhost ceph-mon[298913]: from='mgr.26885 ' entity='mgr.np0005559463.daptkf' Dec 15 04:56:28 localhost ceph-mon[298913]: from='mgr.26885 ' entity='mgr.np0005559463.daptkf' Dec 15 04:56:28 localhost ceph-mon[298913]: from='mgr.26885 172.18.0.107:0/2441414812' entity='mgr.np0005559463.daptkf' cmd={"prefix": "auth get", "entity": "osd.2"} : dispatch Dec 15 04:56:29 localhost ceph-mon[298913]: mon.np0005559462@0(leader) e17 
handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005559463.localdomain.devices.0}] v 0) Dec 15 04:56:29 localhost ceph-mon[298913]: log_channel(audit) log [INF] : from='mgr.26885 ' entity='mgr.np0005559463.daptkf' Dec 15 04:56:29 localhost ceph-mon[298913]: mon.np0005559462@0(leader) e17 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005559463.localdomain}] v 0) Dec 15 04:56:29 localhost ceph-mon[298913]: log_channel(audit) log [INF] : from='mgr.26885 ' entity='mgr.np0005559463.daptkf' Dec 15 04:56:29 localhost ceph-mon[298913]: mon.np0005559462@0(leader) e17 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005559463.localdomain.devices.0}] v 0) Dec 15 04:56:29 localhost ceph-mon[298913]: log_channel(audit) log [INF] : from='mgr.26885 ' entity='mgr.np0005559463.daptkf' Dec 15 04:56:29 localhost ceph-mon[298913]: mon.np0005559462@0(leader) e17 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005559463.localdomain}] v 0) Dec 15 04:56:29 localhost ceph-mon[298913]: log_channel(audit) log [INF] : from='mgr.26885 ' entity='mgr.np0005559463.daptkf' Dec 15 04:56:29 localhost ceph-mon[298913]: mon.np0005559462@0(leader) e17 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/spec.mon}] v 0) Dec 15 04:56:29 localhost ceph-mon[298913]: log_channel(audit) log [INF] : from='mgr.26885 ' entity='mgr.np0005559463.daptkf' Dec 15 04:56:30 localhost ceph-mon[298913]: mon.np0005559462@0(leader).osd e89 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Dec 15 04:56:30 localhost ceph-mon[298913]: mon.np0005559462@0(leader) e17 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005559463.localdomain.devices.0}] v 0) Dec 15 04:56:30 localhost ceph-mon[298913]: log_channel(audit) log [INF] : from='mgr.26885 ' entity='mgr.np0005559463.daptkf' Dec 15 04:56:30 localhost ceph-mon[298913]: 
mon.np0005559462@0(leader) e17 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005559463.localdomain}] v 0) Dec 15 04:56:30 localhost ceph-mon[298913]: log_channel(audit) log [INF] : from='mgr.26885 ' entity='mgr.np0005559463.daptkf' Dec 15 04:56:30 localhost ceph-mon[298913]: Reconfiguring osd.2 (monmap changed)... Dec 15 04:56:30 localhost ceph-mon[298913]: Reconfiguring daemon osd.2 on np0005559463.localdomain Dec 15 04:56:30 localhost ceph-mon[298913]: from='mgr.26885 ' entity='mgr.np0005559463.daptkf' Dec 15 04:56:30 localhost ceph-mon[298913]: from='mgr.26885 ' entity='mgr.np0005559463.daptkf' Dec 15 04:56:30 localhost ceph-mon[298913]: from='mgr.26885 ' entity='mgr.np0005559463.daptkf' Dec 15 04:56:30 localhost ceph-mon[298913]: from='mgr.26885 172.18.0.107:0/2441414812' entity='mgr.np0005559463.daptkf' cmd={"prefix": "auth get", "entity": "osd.5"} : dispatch Dec 15 04:56:30 localhost ceph-mon[298913]: from='mgr.26885 ' entity='mgr.np0005559463.daptkf' Dec 15 04:56:30 localhost ceph-mon[298913]: from='mgr.26885 ' entity='mgr.np0005559463.daptkf' Dec 15 04:56:30 localhost ceph-mon[298913]: from='mgr.26885 ' entity='mgr.np0005559463.daptkf' Dec 15 04:56:30 localhost ceph-mon[298913]: mon.np0005559462@0(leader) e17 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005559463.localdomain.devices.0}] v 0) Dec 15 04:56:30 localhost ceph-mon[298913]: log_channel(audit) log [INF] : from='mgr.26885 ' entity='mgr.np0005559463.daptkf' Dec 15 04:56:30 localhost ceph-mon[298913]: mon.np0005559462@0(leader) e17 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005559463.localdomain}] v 0) Dec 15 04:56:30 localhost ceph-mon[298913]: log_channel(audit) log [INF] : from='mgr.26885 ' entity='mgr.np0005559463.daptkf' Dec 15 04:56:30 localhost ceph-mon[298913]: mon.np0005559462@0(leader) e17 handle_command mon_command({"prefix": "auth get-or-create", "entity": "mds.mds.np0005559463.rdpgze", "caps": 
["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]} v 0) Dec 15 04:56:30 localhost ceph-mon[298913]: log_channel(audit) log [INF] : from='mgr.26885 ' entity='mgr.np0005559463.daptkf' cmd={"prefix": "auth get-or-create", "entity": "mds.mds.np0005559463.rdpgze", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]} : dispatch Dec 15 04:56:30 localhost nova_compute[286344]: 2025-12-15 09:56:30.873 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Dec 15 04:56:30 localhost nova_compute[286344]: 2025-12-15 09:56:30.875 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Dec 15 04:56:30 localhost nova_compute[286344]: 2025-12-15 09:56:30.876 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m Dec 15 04:56:30 localhost nova_compute[286344]: 2025-12-15 09:56:30.876 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Dec 15 04:56:30 localhost nova_compute[286344]: 2025-12-15 09:56:30.877 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Dec 15 04:56:30 localhost nova_compute[286344]: 2025-12-15 09:56:30.881 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 04:56:31 localhost ceph-mon[298913]: mon.np0005559462@0(leader) e17 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005559463.localdomain.devices.0}] v 0) Dec 15 04:56:31 localhost ceph-mon[298913]: Reconfiguring osd.5 (monmap changed)... 
Dec 15 04:56:31 localhost ceph-mon[298913]: Reconfiguring daemon osd.5 on np0005559463.localdomain Dec 15 04:56:31 localhost ceph-mon[298913]: Saving service mon spec with placement label:mon Dec 15 04:56:31 localhost ceph-mon[298913]: from='mgr.26885 ' entity='mgr.np0005559463.daptkf' Dec 15 04:56:31 localhost ceph-mon[298913]: from='mgr.26885 ' entity='mgr.np0005559463.daptkf' Dec 15 04:56:31 localhost ceph-mon[298913]: from='mgr.26885 172.18.0.107:0/2441414812' entity='mgr.np0005559463.daptkf' cmd={"prefix": "auth get-or-create", "entity": "mds.mds.np0005559463.rdpgze", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]} : dispatch Dec 15 04:56:31 localhost ceph-mon[298913]: from='mgr.26885 ' entity='mgr.np0005559463.daptkf' Dec 15 04:56:31 localhost ceph-mon[298913]: from='mgr.26885 ' entity='mgr.np0005559463.daptkf' cmd={"prefix": "auth get-or-create", "entity": "mds.mds.np0005559463.rdpgze", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]} : dispatch Dec 15 04:56:31 localhost ceph-mon[298913]: log_channel(audit) log [INF] : from='mgr.26885 ' entity='mgr.np0005559463.daptkf' Dec 15 04:56:31 localhost ceph-mon[298913]: mon.np0005559462@0(leader) e17 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005559463.localdomain}] v 0) Dec 15 04:56:31 localhost ceph-mon[298913]: log_channel(audit) log [INF] : from='mgr.26885 ' entity='mgr.np0005559463.daptkf' Dec 15 04:56:31 localhost ceph-mon[298913]: mon.np0005559462@0(leader) e17 handle_command mon_command({"prefix": "auth get-or-create", "entity": "mgr.np0005559463.daptkf", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} v 0) Dec 15 04:56:31 localhost ceph-mon[298913]: log_channel(audit) log [INF] : from='mgr.26885 ' entity='mgr.np0005559463.daptkf' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005559463.daptkf", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch Dec 15 
04:56:31 localhost podman[243449]: time="2025-12-15T09:56:31Z" level=info msg="List containers: received `last` parameter - overwriting `limit`" Dec 15 04:56:31 localhost podman[243449]: @ - - [15/Dec/2025:09:56:31 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 154814 "" "Go-http-client/1.1" Dec 15 04:56:31 localhost podman[243449]: @ - - [15/Dec/2025:09:56:31 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 18724 "" "Go-http-client/1.1" Dec 15 04:56:32 localhost ceph-mon[298913]: mon.np0005559462@0(leader) e17 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005559463.localdomain.devices.0}] v 0) Dec 15 04:56:32 localhost ceph-mon[298913]: log_channel(audit) log [INF] : from='mgr.26885 ' entity='mgr.np0005559463.daptkf' Dec 15 04:56:32 localhost ceph-mon[298913]: mon.np0005559462@0(leader) e17 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005559463.localdomain}] v 0) Dec 15 04:56:32 localhost ceph-mon[298913]: log_channel(audit) log [INF] : from='mgr.26885 ' entity='mgr.np0005559463.daptkf' Dec 15 04:56:32 localhost ceph-mon[298913]: Reconfiguring mds.mds.np0005559463.rdpgze (monmap changed)... 
Dec 15 04:56:32 localhost ceph-mon[298913]: Reconfiguring daemon mds.mds.np0005559463.rdpgze on np0005559463.localdomain Dec 15 04:56:32 localhost ceph-mon[298913]: from='mgr.26885 ' entity='mgr.np0005559463.daptkf' Dec 15 04:56:32 localhost ceph-mon[298913]: from='mgr.26885 172.18.0.107:0/2441414812' entity='mgr.np0005559463.daptkf' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005559463.daptkf", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch Dec 15 04:56:32 localhost ceph-mon[298913]: from='mgr.26885 ' entity='mgr.np0005559463.daptkf' Dec 15 04:56:32 localhost ceph-mon[298913]: from='mgr.26885 ' entity='mgr.np0005559463.daptkf' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005559463.daptkf", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch Dec 15 04:56:32 localhost ceph-mon[298913]: from='mgr.26885 ' entity='mgr.np0005559463.daptkf' Dec 15 04:56:32 localhost ceph-mon[298913]: from='mgr.26885 172.18.0.107:0/2441414812' entity='mgr.np0005559463.daptkf' cmd={"prefix": "auth get", "entity": "mon."} : dispatch Dec 15 04:56:32 localhost ceph-mon[298913]: from='mgr.26885 ' entity='mgr.np0005559463.daptkf' Dec 15 04:56:33 localhost ceph-mon[298913]: mon.np0005559462@0(leader) e17 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005559463.localdomain.devices.0}] v 0) Dec 15 04:56:33 localhost ceph-mon[298913]: log_channel(audit) log [INF] : from='mgr.26885 ' entity='mgr.np0005559463.daptkf' Dec 15 04:56:33 localhost ceph-mon[298913]: mon.np0005559462@0(leader) e17 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005559463.localdomain}] v 0) Dec 15 04:56:33 localhost ceph-mon[298913]: log_channel(audit) log [INF] : from='mgr.26885 ' entity='mgr.np0005559463.daptkf' Dec 15 04:56:33 localhost ceph-mon[298913]: mon.np0005559462@0(leader) e17 handle_command mon_command({"prefix": "auth get-or-create", "entity": 
"client.crash.np0005559464.localdomain", "caps": ["mon", "profile crash", "mgr", "profile crash"]} v 0) Dec 15 04:56:33 localhost ceph-mon[298913]: log_channel(audit) log [INF] : from='mgr.26885 ' entity='mgr.np0005559463.daptkf' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005559464.localdomain", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch Dec 15 04:56:33 localhost ceph-mon[298913]: Reconfiguring mgr.np0005559463.daptkf (monmap changed)... Dec 15 04:56:33 localhost ceph-mon[298913]: Reconfiguring daemon mgr.np0005559463.daptkf on np0005559463.localdomain Dec 15 04:56:33 localhost ceph-mon[298913]: Reconfiguring mon.np0005559463 (monmap changed)... Dec 15 04:56:33 localhost ceph-mon[298913]: Reconfiguring daemon mon.np0005559463 on np0005559463.localdomain Dec 15 04:56:33 localhost ceph-mon[298913]: from='mgr.26885 ' entity='mgr.np0005559463.daptkf' Dec 15 04:56:33 localhost ceph-mon[298913]: Reconfiguring crash.np0005559464 (monmap changed)... 
Dec 15 04:56:33 localhost ceph-mon[298913]: from='mgr.26885 172.18.0.107:0/2441414812' entity='mgr.np0005559463.daptkf' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005559464.localdomain", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch Dec 15 04:56:33 localhost ceph-mon[298913]: from='mgr.26885 ' entity='mgr.np0005559463.daptkf' Dec 15 04:56:33 localhost ceph-mon[298913]: Reconfiguring daemon crash.np0005559464 on np0005559464.localdomain Dec 15 04:56:33 localhost ceph-mon[298913]: from='mgr.26885 ' entity='mgr.np0005559463.daptkf' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005559464.localdomain", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch Dec 15 04:56:34 localhost ceph-mon[298913]: mon.np0005559462@0(leader) e17 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005559464.localdomain.devices.0}] v 0) Dec 15 04:56:34 localhost ceph-mon[298913]: log_channel(audit) log [INF] : from='mgr.26885 ' entity='mgr.np0005559463.daptkf' Dec 15 04:56:34 localhost ceph-mon[298913]: mon.np0005559462@0(leader) e17 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005559464.localdomain}] v 0) Dec 15 04:56:34 localhost ceph-mon[298913]: log_channel(audit) log [INF] : from='mgr.26885 ' entity='mgr.np0005559463.daptkf' Dec 15 04:56:34 localhost openstack_network_exporter[246484]: ERROR 09:56:34 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd Dec 15 04:56:34 localhost openstack_network_exporter[246484]: ERROR 09:56:34 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd Dec 15 04:56:34 localhost openstack_network_exporter[246484]: ERROR 09:56:34 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server Dec 15 04:56:34 localhost openstack_network_exporter[246484]: ERROR 09:56:34 appctl.go:174: 
call(dpif-netdev/pmd-perf-show): please specify an existing datapath Dec 15 04:56:34 localhost openstack_network_exporter[246484]: Dec 15 04:56:34 localhost openstack_network_exporter[246484]: ERROR 09:56:34 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath Dec 15 04:56:34 localhost openstack_network_exporter[246484]: Dec 15 04:56:35 localhost ceph-mon[298913]: from='mgr.26885 ' entity='mgr.np0005559463.daptkf' Dec 15 04:56:35 localhost ceph-mon[298913]: from='mgr.26885 ' entity='mgr.np0005559463.daptkf' Dec 15 04:56:35 localhost ceph-mon[298913]: Reconfiguring osd.1 (monmap changed)... Dec 15 04:56:35 localhost ceph-mon[298913]: from='mgr.26885 172.18.0.107:0/2441414812' entity='mgr.np0005559463.daptkf' cmd={"prefix": "auth get", "entity": "osd.1"} : dispatch Dec 15 04:56:35 localhost ceph-mon[298913]: Reconfiguring daemon osd.1 on np0005559464.localdomain Dec 15 04:56:35 localhost ceph-mon[298913]: mon.np0005559462@0(leader) e17 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005559464.localdomain.devices.0}] v 0) Dec 15 04:56:35 localhost ceph-mon[298913]: log_channel(audit) log [INF] : from='mgr.26885 ' entity='mgr.np0005559463.daptkf' Dec 15 04:56:35 localhost ceph-mon[298913]: mon.np0005559462@0(leader) e17 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005559464.localdomain}] v 0) Dec 15 04:56:35 localhost ceph-mon[298913]: log_channel(audit) log [INF] : from='mgr.26885 ' entity='mgr.np0005559463.daptkf' Dec 15 04:56:35 localhost ceph-mon[298913]: mon.np0005559462@0(leader) e17 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005559464.localdomain.devices.0}] v 0) Dec 15 04:56:35 localhost ceph-mon[298913]: log_channel(audit) log [INF] : from='mgr.26885 ' entity='mgr.np0005559463.daptkf' Dec 15 04:56:35 localhost ceph-mon[298913]: mon.np0005559462@0(leader) e17 handle_command mon_command([{prefix=config-key set, 
key=mgr/cephadm/host.np0005559464.localdomain}] v 0) Dec 15 04:56:35 localhost ceph-mon[298913]: log_channel(audit) log [INF] : from='mgr.26885 ' entity='mgr.np0005559463.daptkf' Dec 15 04:56:35 localhost ceph-mon[298913]: mon.np0005559462@0(leader).osd e89 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Dec 15 04:56:35 localhost nova_compute[286344]: 2025-12-15 09:56:35.879 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 04:56:36 localhost ceph-mon[298913]: from='mgr.26885 ' entity='mgr.np0005559463.daptkf' Dec 15 04:56:36 localhost ceph-mon[298913]: from='mgr.26885 ' entity='mgr.np0005559463.daptkf' Dec 15 04:56:36 localhost ceph-mon[298913]: from='mgr.26885 ' entity='mgr.np0005559463.daptkf' Dec 15 04:56:36 localhost ceph-mon[298913]: Reconfiguring osd.4 (monmap changed)... Dec 15 04:56:36 localhost ceph-mon[298913]: from='mgr.26885 172.18.0.107:0/2441414812' entity='mgr.np0005559463.daptkf' cmd={"prefix": "auth get", "entity": "osd.4"} : dispatch Dec 15 04:56:36 localhost ceph-mon[298913]: from='mgr.26885 ' entity='mgr.np0005559463.daptkf' Dec 15 04:56:36 localhost ceph-mon[298913]: Reconfiguring daemon osd.4 on np0005559464.localdomain Dec 15 04:56:36 localhost ceph-mon[298913]: mon.np0005559462@0(leader) e17 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005559464.localdomain.devices.0}] v 0) Dec 15 04:56:36 localhost ceph-mon[298913]: log_channel(audit) log [INF] : from='mgr.26885 ' entity='mgr.np0005559463.daptkf' Dec 15 04:56:36 localhost ceph-mon[298913]: mon.np0005559462@0(leader) e17 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005559464.localdomain}] v 0) Dec 15 04:56:36 localhost ceph-mon[298913]: log_channel(audit) log [INF] : from='mgr.26885 ' entity='mgr.np0005559463.daptkf' Dec 15 04:56:36 localhost ceph-mon[298913]: 
mon.np0005559462@0(leader) e17 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005559464.localdomain.devices.0}] v 0) Dec 15 04:56:36 localhost ceph-mon[298913]: log_channel(audit) log [INF] : from='mgr.26885 ' entity='mgr.np0005559463.daptkf' Dec 15 04:56:36 localhost ceph-mon[298913]: mon.np0005559462@0(leader) e17 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005559464.localdomain}] v 0) Dec 15 04:56:36 localhost ceph-mon[298913]: log_channel(audit) log [INF] : from='mgr.26885 ' entity='mgr.np0005559463.daptkf' Dec 15 04:56:36 localhost ceph-mon[298913]: mon.np0005559462@0(leader) e17 handle_command mon_command({"prefix": "auth get-or-create", "entity": "mds.mds.np0005559464.piyuji", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]} v 0) Dec 15 04:56:36 localhost ceph-mon[298913]: log_channel(audit) log [INF] : from='mgr.26885 ' entity='mgr.np0005559463.daptkf' cmd={"prefix": "auth get-or-create", "entity": "mds.mds.np0005559464.piyuji", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]} : dispatch Dec 15 04:56:36 localhost systemd[1]: Started /usr/bin/podman healthcheck run 4d46c406a3855fd1c322e5d86c048188ea5865daa07ba23aaebacc7879668923. Dec 15 04:56:36 localhost systemd[1]: tmp-crun.nDKK7n.mount: Deactivated successfully. 
Dec 15 04:56:36 localhost podman[311073]: 2025-12-15 09:56:36.765234455 +0000 UTC m=+0.096522331 container health_status 4d46c406a3855fd1c322e5d86c048188ea5865daa07ba23aaebacc7879668923 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '07f3af15d79276418f54a8dde2fc251064527c3571430e14af77aaa2c01f1193-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251202) Dec 15 04:56:36 localhost 
podman[311073]: 2025-12-15 09:56:36.769586235 +0000 UTC m=+0.100874111 container exec_died 4d46c406a3855fd1c322e5d86c048188ea5865daa07ba23aaebacc7879668923 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '07f3af15d79276418f54a8dde2fc251064527c3571430e14af77aaa2c01f1193-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, io.buildah.version=1.41.3) Dec 15 04:56:36 localhost systemd[1]: 
4d46c406a3855fd1c322e5d86c048188ea5865daa07ba23aaebacc7879668923.service: Deactivated successfully. Dec 15 04:56:37 localhost ceph-mon[298913]: mon.np0005559462@0(leader) e17 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005559464.localdomain.devices.0}] v 0) Dec 15 04:56:37 localhost ceph-mon[298913]: log_channel(audit) log [INF] : from='mgr.26885 ' entity='mgr.np0005559463.daptkf' Dec 15 04:56:37 localhost ceph-mon[298913]: mon.np0005559462@0(leader) e17 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005559464.localdomain}] v 0) Dec 15 04:56:37 localhost ceph-mon[298913]: log_channel(audit) log [INF] : from='mgr.26885 ' entity='mgr.np0005559463.daptkf' Dec 15 04:56:37 localhost ceph-mon[298913]: from='mgr.26885 ' entity='mgr.np0005559463.daptkf' Dec 15 04:56:37 localhost ceph-mon[298913]: from='mgr.26885 ' entity='mgr.np0005559463.daptkf' Dec 15 04:56:37 localhost ceph-mon[298913]: from='mgr.26885 ' entity='mgr.np0005559463.daptkf' Dec 15 04:56:37 localhost ceph-mon[298913]: Reconfiguring mds.mds.np0005559464.piyuji (monmap changed)... 
Dec 15 04:56:37 localhost ceph-mon[298913]: from='mgr.26885 172.18.0.107:0/2441414812' entity='mgr.np0005559463.daptkf' cmd={"prefix": "auth get-or-create", "entity": "mds.mds.np0005559464.piyuji", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]} : dispatch Dec 15 04:56:37 localhost ceph-mon[298913]: from='mgr.26885 ' entity='mgr.np0005559463.daptkf' Dec 15 04:56:37 localhost ceph-mon[298913]: Reconfiguring daemon mds.mds.np0005559464.piyuji on np0005559464.localdomain Dec 15 04:56:37 localhost ceph-mon[298913]: from='mgr.26885 ' entity='mgr.np0005559463.daptkf' cmd={"prefix": "auth get-or-create", "entity": "mds.mds.np0005559464.piyuji", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]} : dispatch Dec 15 04:56:37 localhost ceph-mon[298913]: from='mgr.26885 ' entity='mgr.np0005559463.daptkf' Dec 15 04:56:37 localhost ceph-mon[298913]: mon.np0005559462@0(leader) e17 handle_command mon_command({"prefix": "auth get-or-create", "entity": "mgr.np0005559464.aomnqe", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} v 0) Dec 15 04:56:37 localhost ceph-mon[298913]: log_channel(audit) log [INF] : from='mgr.26885 ' entity='mgr.np0005559463.daptkf' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005559464.aomnqe", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch Dec 15 04:56:38 localhost ceph-mon[298913]: mon.np0005559462@0(leader) e17 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005559464.localdomain.devices.0}] v 0) Dec 15 04:56:38 localhost ceph-mon[298913]: log_channel(audit) log [INF] : from='mgr.26885 ' entity='mgr.np0005559463.daptkf' Dec 15 04:56:38 localhost ceph-mon[298913]: mon.np0005559462@0(leader) e17 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005559464.localdomain}] v 0) Dec 15 04:56:38 localhost ceph-mon[298913]: log_channel(audit) log [INF] : from='mgr.26885 ' 
entity='mgr.np0005559463.daptkf' Dec 15 04:56:38 localhost ceph-mon[298913]: from='mgr.26885 ' entity='mgr.np0005559463.daptkf' Dec 15 04:56:38 localhost ceph-mon[298913]: Reconfiguring mgr.np0005559464.aomnqe (monmap changed)... Dec 15 04:56:38 localhost ceph-mon[298913]: from='mgr.26885 172.18.0.107:0/2441414812' entity='mgr.np0005559463.daptkf' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005559464.aomnqe", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch Dec 15 04:56:38 localhost ceph-mon[298913]: from='mgr.26885 ' entity='mgr.np0005559463.daptkf' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005559464.aomnqe", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch Dec 15 04:56:38 localhost ceph-mon[298913]: Reconfiguring daemon mgr.np0005559464.aomnqe on np0005559464.localdomain Dec 15 04:56:38 localhost ceph-mon[298913]: from='mgr.26885 ' entity='mgr.np0005559463.daptkf' Dec 15 04:56:38 localhost ceph-mon[298913]: from='mgr.26885 ' entity='mgr.np0005559463.daptkf' Dec 15 04:56:38 localhost ceph-mon[298913]: from='mgr.26885 172.18.0.107:0/2441414812' entity='mgr.np0005559463.daptkf' cmd={"prefix": "auth get", "entity": "mon."} : dispatch Dec 15 04:56:39 localhost ceph-mon[298913]: mon.np0005559462@0(leader) e17 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005559464.localdomain.devices.0}] v 0) Dec 15 04:56:39 localhost ceph-mon[298913]: log_channel(audit) log [INF] : from='mgr.26885 ' entity='mgr.np0005559463.daptkf' Dec 15 04:56:39 localhost ceph-mon[298913]: mon.np0005559462@0(leader) e17 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005559464.localdomain}] v 0) Dec 15 04:56:39 localhost ceph-mon[298913]: log_channel(audit) log [INF] : from='mgr.26885 ' entity='mgr.np0005559463.daptkf' Dec 15 04:56:39 localhost ceph-mon[298913]: mon.np0005559462@0(leader) e17 handle_command mon_command([{prefix=config-key set, 
key=mgr/cephadm/osd_remove_queue}] v 0) Dec 15 04:56:39 localhost ceph-mon[298913]: log_channel(audit) log [INF] : from='mgr.26885 ' entity='mgr.np0005559463.daptkf' Dec 15 04:56:39 localhost ceph-mon[298913]: mon.np0005559462@0(leader) e17 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/spec.mon}] v 0) Dec 15 04:56:39 localhost ceph-mon[298913]: log_channel(audit) log [INF] : from='mgr.26885 ' entity='mgr.np0005559463.daptkf' Dec 15 04:56:39 localhost ceph-mon[298913]: Reconfiguring mon.np0005559464 (monmap changed)... Dec 15 04:56:39 localhost ceph-mon[298913]: Reconfiguring daemon mon.np0005559464 on np0005559464.localdomain Dec 15 04:56:39 localhost ceph-mon[298913]: from='mgr.26885 ' entity='mgr.np0005559463.daptkf' Dec 15 04:56:39 localhost ceph-mon[298913]: from='mgr.26885 ' entity='mgr.np0005559463.daptkf' Dec 15 04:56:39 localhost ceph-mon[298913]: from='mgr.26885 172.18.0.107:0/2441414812' entity='mgr.np0005559463.daptkf' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch Dec 15 04:56:39 localhost ceph-mon[298913]: from='mgr.26885 ' entity='mgr.np0005559463.daptkf' Dec 15 04:56:39 localhost ceph-mon[298913]: from='mgr.26885 ' entity='mgr.np0005559463.daptkf' Dec 15 04:56:39 localhost podman[311161]: Dec 15 04:56:39 localhost podman[311161]: 2025-12-15 09:56:39.979273146 +0000 UTC m=+0.072567416 container create 5092b8eea7611e3be4cc0f56df19a15f4da675340c454068d315026b7f2ae59d (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=eager_herschel, ceph=True, distribution-scope=public, description=Red Hat Ceph Storage 7, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_REPO=https://github.com/ceph/ceph-container.git, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_BRANCH=main, url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.description=Red Hat Ceph Storage 7, release=1763362218, 
CEPH_POINT_RELEASE=, name=rhceph, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, GIT_CLEAN=True, com.redhat.component=rhceph-container, architecture=x86_64, build-date=2025-11-26T19:44:28Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, version=7, maintainer=Guillaume Abrioux , RELEASE=main, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vcs-type=git, io.openshift.tags=rhceph ceph, io.openshift.expose-services=) Dec 15 04:56:40 localhost systemd[1]: Started libpod-conmon-5092b8eea7611e3be4cc0f56df19a15f4da675340c454068d315026b7f2ae59d.scope. Dec 15 04:56:40 localhost systemd[1]: Started libcrun container. Dec 15 04:56:40 localhost podman[311161]: 2025-12-15 09:56:39.953407768 +0000 UTC m=+0.046702018 image pull registry.redhat.io/rhceph/rhceph-7-rhel9:latest Dec 15 04:56:40 localhost podman[311161]: 2025-12-15 09:56:40.053126525 +0000 UTC m=+0.146420795 container init 5092b8eea7611e3be4cc0f56df19a15f4da675340c454068d315026b7f2ae59d (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=eager_herschel, name=rhceph, CEPH_POINT_RELEASE=, RELEASE=main, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_CLEAN=True, description=Red Hat Ceph Storage 7, build-date=2025-11-26T19:44:28Z, com.redhat.component=rhceph-container, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.buildah.version=1.41.4, vendor=Red Hat, Inc., release=1763362218, GIT_BRANCH=main, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, ceph=True, io.openshift.tags=rhceph ceph, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, io.openshift.expose-services=, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, url=https://catalog.redhat.com/en/search?searchType=containers, 
maintainer=Guillaume Abrioux , com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, GIT_REPO=https://github.com/ceph/ceph-container.git, io.k8s.description=Red Hat Ceph Storage 7, version=7, vcs-type=git, distribution-scope=public, architecture=x86_64) Dec 15 04:56:40 localhost podman[311161]: 2025-12-15 09:56:40.064507621 +0000 UTC m=+0.157801901 container start 5092b8eea7611e3be4cc0f56df19a15f4da675340c454068d315026b7f2ae59d (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=eager_herschel, io.k8s.description=Red Hat Ceph Storage 7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vcs-type=git, maintainer=Guillaume Abrioux , GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, RELEASE=main, GIT_CLEAN=True, io.openshift.tags=rhceph ceph, distribution-scope=public, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, CEPH_POINT_RELEASE=, name=rhceph, io.buildah.version=1.41.4, description=Red Hat Ceph Storage 7, build-date=2025-11-26T19:44:28Z, ceph=True, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, vendor=Red Hat, Inc., release=1763362218, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.openshift.expose-services=, architecture=x86_64, GIT_REPO=https://github.com/ceph/ceph-container.git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, GIT_BRANCH=main, version=7, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.component=rhceph-container, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0) Dec 15 04:56:40 localhost podman[311161]: 2025-12-15 09:56:40.065023386 +0000 UTC m=+0.158317666 container attach 5092b8eea7611e3be4cc0f56df19a15f4da675340c454068d315026b7f2ae59d (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=eager_herschel, io.openshift.tags=rhceph ceph, description=Red Hat Ceph Storage 7, io.k8s.display-name=Red Hat Ceph Storage 
7 on RHEL 9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, vcs-type=git, distribution-scope=public, maintainer=Guillaume Abrioux , ceph=True, io.openshift.expose-services=, name=rhceph, vendor=Red Hat, Inc., io.buildah.version=1.41.4, url=https://catalog.redhat.com/en/search?searchType=containers, CEPH_POINT_RELEASE=, architecture=x86_64, build-date=2025-11-26T19:44:28Z, GIT_REPO=https://github.com/ceph/ceph-container.git, RELEASE=main, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, version=7, com.redhat.component=rhceph-container, GIT_BRANCH=main, GIT_CLEAN=True, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.k8s.description=Red Hat Ceph Storage 7, release=1763362218, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image.) Dec 15 04:56:40 localhost eager_herschel[311176]: 167 167 Dec 15 04:56:40 localhost systemd[1]: libpod-5092b8eea7611e3be4cc0f56df19a15f4da675340c454068d315026b7f2ae59d.scope: Deactivated successfully. 
Dec 15 04:56:40 localhost podman[311161]: 2025-12-15 09:56:40.067548975 +0000 UTC m=+0.160843276 container died 5092b8eea7611e3be4cc0f56df19a15f4da675340c454068d315026b7f2ae59d (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=eager_herschel, build-date=2025-11-26T19:44:28Z, CEPH_POINT_RELEASE=, architecture=x86_64, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vcs-type=git, distribution-scope=public, maintainer=Guillaume Abrioux , url=https://catalog.redhat.com/en/search?searchType=containers, io.buildah.version=1.41.4, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, GIT_CLEAN=True, GIT_REPO=https://github.com/ceph/ceph-container.git, ceph=True, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, com.redhat.component=rhceph-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.k8s.description=Red Hat Ceph Storage 7, GIT_BRANCH=main, version=7, description=Red Hat Ceph Storage 7, name=rhceph, vendor=Red Hat, Inc., release=1763362218, io.openshift.expose-services=, RELEASE=main, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, io.openshift.tags=rhceph ceph, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0) Dec 15 04:56:40 localhost podman[311181]: 2025-12-15 09:56:40.15956014 +0000 UTC m=+0.081806522 container remove 5092b8eea7611e3be4cc0f56df19a15f4da675340c454068d315026b7f2ae59d (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=eager_herschel, vcs-type=git, vendor=Red Hat, Inc., ceph=True, CEPH_POINT_RELEASE=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, GIT_CLEAN=True, name=rhceph, com.redhat.component=rhceph-container, io.k8s.description=Red Hat Ceph Storage 7, maintainer=Guillaume Abrioux , GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, RELEASE=main, build-date=2025-11-26T19:44:28Z, 
io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, architecture=x86_64, url=https://catalog.redhat.com/en/search?searchType=containers, GIT_REPO=https://github.com/ceph/ceph-container.git, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, io.openshift.expose-services=, release=1763362218, version=7, distribution-scope=public, description=Red Hat Ceph Storage 7, io.openshift.tags=rhceph ceph, io.buildah.version=1.41.4, GIT_BRANCH=main, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, cpe=cpe:/a:redhat:enterprise_linux:9::appstream) Dec 15 04:56:40 localhost systemd[1]: libpod-conmon-5092b8eea7611e3be4cc0f56df19a15f4da675340c454068d315026b7f2ae59d.scope: Deactivated successfully. Dec 15 04:56:40 localhost ceph-mon[298913]: mon.np0005559462@0(leader).osd e89 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Dec 15 04:56:40 localhost ceph-mon[298913]: mon.np0005559462@0(leader) e17 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005559462.localdomain.devices.0}] v 0) Dec 15 04:56:40 localhost ceph-mon[298913]: log_channel(audit) log [INF] : from='mgr.26885 ' entity='mgr.np0005559463.daptkf' Dec 15 04:56:40 localhost ceph-mon[298913]: mon.np0005559462@0(leader) e17 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005559462.localdomain}] v 0) Dec 15 04:56:40 localhost ceph-mon[298913]: log_channel(audit) log [INF] : from='mgr.26885 ' entity='mgr.np0005559463.daptkf' Dec 15 04:56:40 localhost ceph-mon[298913]: from='mgr.26885 172.18.0.107:0/2441414812' entity='mgr.np0005559463.daptkf' cmd={"prefix": "auth get", "entity": "mon."} : dispatch Dec 15 04:56:40 localhost ceph-mon[298913]: from='mgr.26885 ' entity='mgr.np0005559463.daptkf' Dec 15 04:56:40 localhost ceph-mon[298913]: from='mgr.26885 ' entity='mgr.np0005559463.daptkf' 
Dec 15 04:56:40 localhost ceph-mon[298913]: mon.np0005559462@0(leader) e17 handle_command mon_command([{prefix=config-key set, key=mgr/progress/completed}] v 0) Dec 15 04:56:40 localhost ceph-mon[298913]: log_channel(audit) log [INF] : from='mgr.26885 ' entity='mgr.np0005559463.daptkf' Dec 15 04:56:40 localhost nova_compute[286344]: 2025-12-15 09:56:40.882 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 04:56:40 localhost systemd[1]: var-lib-containers-storage-overlay-97de0a3dcdab459c7d5d73e27b4447d4797745be9de832eac882f52706256da3-merged.mount: Deactivated successfully. Dec 15 04:56:41 localhost ceph-mon[298913]: Reconfiguring mon.np0005559462 (monmap changed)... Dec 15 04:56:41 localhost ceph-mon[298913]: Reconfiguring daemon mon.np0005559462 on np0005559462.localdomain Dec 15 04:56:41 localhost ceph-mon[298913]: from='mgr.26885 ' entity='mgr.np0005559463.daptkf' Dec 15 04:56:43 localhost systemd[1]: Started /usr/bin/podman healthcheck run a4e94bd7aaee6c174c601060fb5f34559019ccea93fc397263ee3735731cfc1e. 
Dec 15 04:56:43 localhost podman[311197]: 2025-12-15 09:56:43.76044874 +0000 UTC m=+0.090734420 container health_status a4e94bd7aaee6c174c601060fb5f34559019ccea93fc397263ee3735731cfc1e (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}) Dec 15 04:56:43 localhost podman[311197]: 2025-12-15 09:56:43.770007005 +0000 UTC m=+0.100292685 container exec_died a4e94bd7aaee6c174c601060fb5f34559019ccea93fc397263ee3735731cfc1e (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 
'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}) Dec 15 04:56:43 localhost systemd[1]: a4e94bd7aaee6c174c601060fb5f34559019ccea93fc397263ee3735731cfc1e.service: Deactivated successfully. Dec 15 04:56:45 localhost ceph-mon[298913]: mon.np0005559462@0(leader).osd e89 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Dec 15 04:56:45 localhost nova_compute[286344]: 2025-12-15 09:56:45.886 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 04:56:46 localhost sshd[311221]: main: sshd: ssh-rsa algorithm is disabled Dec 15 04:56:50 localhost ceph-mon[298913]: mon.np0005559462@0(leader).osd e89 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Dec 15 04:56:50 localhost nova_compute[286344]: 2025-12-15 09:56:50.890 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 04:56:51 localhost ovn_metadata_agent[160585]: 2025-12-15 09:56:51.474 160590 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Dec 15 04:56:51 localhost ovn_metadata_agent[160585]: 2025-12-15 09:56:51.474 160590 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Dec 15 04:56:51 localhost ovn_metadata_agent[160585]: 2025-12-15 09:56:51.475 160590 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by 
"neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Dec 15 04:56:55 localhost ceph-mon[298913]: mon.np0005559462@0(leader).osd e89 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Dec 15 04:56:55 localhost nova_compute[286344]: 2025-12-15 09:56:55.893 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 04:56:56 localhost systemd[1]: Started /usr/bin/podman healthcheck run 67ee72fcd99c896f79e8807764b1d0f04e7d45927d06e306c0986394e8ed42a0. Dec 15 04:56:56 localhost systemd[1]: Started /usr/bin/podman healthcheck run 9f0f481c937606298203797a71924b7614a5d9e3f4a8afd837cfa5b9602aedeb. Dec 15 04:56:56 localhost systemd[1]: Started /usr/bin/podman healthcheck run b3d8483ade539500e228c2af9c0c3e647febb5ec7903610a83aea32631007d4a. Dec 15 04:56:56 localhost podman[311223]: 2025-12-15 09:56:56.756211829 +0000 UTC m=+0.081503283 container health_status 67ee72fcd99c896f79e8807764b1d0f04e7d45927d06e306c0986394e8ed42a0 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 
'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible) Dec 15 04:56:56 localhost podman[311223]: 2025-12-15 09:56:56.765133776 +0000 UTC m=+0.090425220 container exec_died 67ee72fcd99c896f79e8807764b1d0f04e7d45927d06e306c0986394e8ed42a0 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', 
'/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}) Dec 15 04:56:56 localhost systemd[1]: 67ee72fcd99c896f79e8807764b1d0f04e7d45927d06e306c0986394e8ed42a0.service: Deactivated successfully. Dec 15 04:56:56 localhost systemd[1]: tmp-crun.u1n1ZT.mount: Deactivated successfully. Dec 15 04:56:56 localhost podman[311225]: 2025-12-15 09:56:56.815828683 +0000 UTC m=+0.134604617 container health_status b3d8483ade539500e228c2af9c0c3e647febb5ec7903610a83aea32631007d4a (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, config_id=ceilometer_agent_compute, io.buildah.version=1.41.3, tcib_managed=true, container_name=ceilometer_agent_compute, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '07f3af15d79276418f54a8dde2fc251064527c3571430e14af77aaa2c01f1193-cb1e9478634a00c8fb5ad9cd7fcd38c7b2a1a85187dfaa618bc715c24c2b15fb'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2) Dec 15 04:56:56 localhost systemd[1]: tmp-crun.wjOYac.mount: Deactivated successfully. Dec 15 04:56:56 localhost podman[311224]: 2025-12-15 09:56:56.860138433 +0000 UTC m=+0.181608392 container health_status 9f0f481c937606298203797a71924b7614a5d9e3f4a8afd837cfa5b9602aedeb (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=multipathd, container_name=multipathd, managed_by=edpm_ansible, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', 
'/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0) Dec 15 04:56:56 localhost podman[311224]: 2025-12-15 09:56:56.869671578 +0000 UTC m=+0.191141547 container exec_died 9f0f481c937606298203797a71924b7614a5d9e3f4a8afd837cfa5b9602aedeb (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_id=multipathd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, org.label-schema.vendor=CentOS, 
org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true) Dec 15 04:56:56 localhost podman[311225]: 2025-12-15 09:56:56.879252784 +0000 UTC m=+0.198028748 container exec_died b3d8483ade539500e228c2af9c0c3e647febb5ec7903610a83aea32631007d4a (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, io.buildah.version=1.41.3, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '07f3af15d79276418f54a8dde2fc251064527c3571430e14af77aaa2c01f1193-cb1e9478634a00c8fb5ad9cd7fcd38c7b2a1a85187dfaa618bc715c24c2b15fb'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=ceilometer_agent_compute, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, 
org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=ceilometer_agent_compute, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2) Dec 15 04:56:56 localhost systemd[1]: 9f0f481c937606298203797a71924b7614a5d9e3f4a8afd837cfa5b9602aedeb.service: Deactivated successfully. Dec 15 04:56:56 localhost systemd[1]: b3d8483ade539500e228c2af9c0c3e647febb5ec7903610a83aea32631007d4a.service: Deactivated successfully. Dec 15 04:56:58 localhost systemd[1]: Started /usr/bin/podman healthcheck run 730c50020a7d9e2da21d2462d40af20ac303e43f3491f692cbbd87f432ba3e09. Dec 15 04:56:58 localhost systemd[1]: Started /usr/bin/podman healthcheck run ac2eae30a2a7f971398af42ea67b3ed799d4b3341f7688ebcd73f7ca7f66c2a8. Dec 15 04:56:58 localhost podman[311282]: 2025-12-15 09:56:58.74604572 +0000 UTC m=+0.076269507 container health_status 730c50020a7d9e2da21d2462d40af20ac303e43f3491f692cbbd87f432ba3e09 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, managed_by=edpm_ansible, build-date=2025-08-20T13:12:41, name=ubi9-minimal, url=https://catalog.redhat.com/en/search?searchType=containers, vendor=Red Hat, Inc., maintainer=Red Hat, Inc., architecture=x86_64, distribution-scope=public, io.openshift.tags=minimal rhel9, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., version=9.6, com.redhat.component=ubi9-minimal-container, release=1755695350, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.openshift.expose-services=, vcs-type=git, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'cb1e9478634a00c8fb5ad9cd7fcd38c7b2a1a85187dfaa618bc715c24c2b15fb'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, config_id=openstack_network_exporter, container_name=openstack_network_exporter) Dec 15 04:56:58 localhost podman[311282]: 2025-12-15 09:56:58.835518634 +0000 UTC m=+0.165742371 container exec_died 
730c50020a7d9e2da21d2462d40af20ac303e43f3491f692cbbd87f432ba3e09 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-type=git, build-date=2025-08-20T13:12:41, config_id=openstack_network_exporter, io.openshift.tags=minimal rhel9, io.buildah.version=1.33.7, managed_by=edpm_ansible, com.redhat.component=ubi9-minimal-container, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., architecture=x86_64, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vendor=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.expose-services=, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'cb1e9478634a00c8fb5ad9cd7fcd38c7b2a1a85187dfaa618bc715c24c2b15fb'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', 
'/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, release=1755695350, version=9.6, distribution-scope=public, maintainer=Red Hat, Inc., name=ubi9-minimal, container_name=openstack_network_exporter, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly.) Dec 15 04:56:58 localhost systemd[1]: 730c50020a7d9e2da21d2462d40af20ac303e43f3491f692cbbd87f432ba3e09.service: Deactivated successfully. Dec 15 04:56:58 localhost podman[311283]: 2025-12-15 09:56:58.841678815 +0000 UTC m=+0.166793940 container health_status ac2eae30a2a7f971398af42ea67b3ed799d4b3341f7688ebcd73f7ca7f66c2a8 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '07f3af15d79276418f54a8dde2fc251064527c3571430e14af77aaa2c01f1193'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': 
['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, tcib_managed=true, org.label-schema.build-date=20251202) Dec 15 04:56:58 localhost podman[311283]: 2025-12-15 09:56:58.921612914 +0000 UTC m=+0.246728059 container exec_died ac2eae30a2a7f971398af42ea67b3ed799d4b3341f7688ebcd73f7ca7f66c2a8 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_controller, container_name=ovn_controller, managed_by=edpm_ansible, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '07f3af15d79276418f54a8dde2fc251064527c3571430e14af77aaa2c01f1193'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.license=GPLv2) Dec 15 04:56:58 localhost systemd[1]: 
ac2eae30a2a7f971398af42ea67b3ed799d4b3341f7688ebcd73f7ca7f66c2a8.service: Deactivated successfully. Dec 15 04:57:00 localhost ceph-mon[298913]: mon.np0005559462@0(leader).osd e89 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Dec 15 04:57:00 localhost nova_compute[286344]: 2025-12-15 09:57:00.895 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 04:57:01 localhost podman[243449]: time="2025-12-15T09:57:01Z" level=info msg="List containers: received `last` parameter - overwriting `limit`" Dec 15 04:57:01 localhost podman[243449]: @ - - [15/Dec/2025:09:57:01 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 154814 "" "Go-http-client/1.1" Dec 15 04:57:01 localhost podman[243449]: @ - - [15/Dec/2025:09:57:01 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 18726 "" "Go-http-client/1.1" Dec 15 04:57:03 localhost nova_compute[286344]: 2025-12-15 09:57:03.271 286348 DEBUG oslo_service.periodic_task [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Dec 15 04:57:04 localhost nova_compute[286344]: 2025-12-15 09:57:04.266 286348 DEBUG oslo_service.periodic_task [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Dec 15 04:57:04 localhost openstack_network_exporter[246484]: ERROR 09:57:04 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd Dec 15 04:57:04 localhost openstack_network_exporter[246484]: ERROR 09:57:04 appctl.go:144: Failed to get 
PID for ovn-northd: no control socket files found for ovn-northd Dec 15 04:57:04 localhost openstack_network_exporter[246484]: ERROR 09:57:04 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server Dec 15 04:57:04 localhost openstack_network_exporter[246484]: ERROR 09:57:04 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath Dec 15 04:57:04 localhost openstack_network_exporter[246484]: Dec 15 04:57:04 localhost openstack_network_exporter[246484]: ERROR 09:57:04 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath Dec 15 04:57:04 localhost openstack_network_exporter[246484]: Dec 15 04:57:05 localhost ceph-mon[298913]: mon.np0005559462@0(leader).osd e89 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Dec 15 04:57:05 localhost nova_compute[286344]: 2025-12-15 09:57:05.270 286348 DEBUG oslo_service.periodic_task [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Dec 15 04:57:05 localhost ceph-mon[298913]: mon.np0005559462@0(leader) e17 handle_command mon_command({"prefix":"df", "format":"json"} v 0) Dec 15 04:57:05 localhost ceph-mon[298913]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/584733257' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch Dec 15 04:57:05 localhost ceph-mon[298913]: mon.np0005559462@0(leader) e17 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0) Dec 15 04:57:05 localhost ceph-mon[298913]: log_channel(audit) log [DBG] : from='client.? 
172.18.0.32:0/584733257' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch Dec 15 04:57:05 localhost nova_compute[286344]: 2025-12-15 09:57:05.897 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 04:57:07 localhost nova_compute[286344]: 2025-12-15 09:57:07.266 286348 DEBUG oslo_service.periodic_task [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Dec 15 04:57:07 localhost nova_compute[286344]: 2025-12-15 09:57:07.383 286348 DEBUG oslo_service.periodic_task [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Dec 15 04:57:07 localhost nova_compute[286344]: 2025-12-15 09:57:07.384 286348 DEBUG nova.compute.manager [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m Dec 15 04:57:07 localhost nova_compute[286344]: 2025-12-15 09:57:07.384 286348 DEBUG nova.compute.manager [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m Dec 15 04:57:07 localhost systemd[1]: Started /usr/bin/podman healthcheck run 4d46c406a3855fd1c322e5d86c048188ea5865daa07ba23aaebacc7879668923. Dec 15 04:57:07 localhost systemd[1]: tmp-crun.DJGrbC.mount: Deactivated successfully. 
Dec 15 04:57:07 localhost podman[311325]: 2025-12-15 09:57:07.765593915 +0000 UTC m=+0.094206926 container health_status 4d46c406a3855fd1c322e5d86c048188ea5865daa07ba23aaebacc7879668923 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '07f3af15d79276418f54a8dde2fc251064527c3571430e14af77aaa2c01f1193-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, tcib_managed=true, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb) Dec 15 04:57:07 localhost 
podman[311325]: 2025-12-15 09:57:07.771658574 +0000 UTC m=+0.100271535 container exec_died 4d46c406a3855fd1c322e5d86c048188ea5865daa07ba23aaebacc7879668923 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '07f3af15d79276418f54a8dde2fc251064527c3571430e14af77aaa2c01f1193-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0) Dec 15 04:57:07 localhost systemd[1]: 
4d46c406a3855fd1c322e5d86c048188ea5865daa07ba23aaebacc7879668923.service: Deactivated successfully. Dec 15 04:57:08 localhost nova_compute[286344]: 2025-12-15 09:57:08.132 286348 DEBUG oslo_concurrency.lockutils [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Acquiring lock "refresh_cache-39ff1bd9-6f6b-44c8-bbec-a1fd9d196359" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m Dec 15 04:57:08 localhost nova_compute[286344]: 2025-12-15 09:57:08.132 286348 DEBUG oslo_concurrency.lockutils [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Acquired lock "refresh_cache-39ff1bd9-6f6b-44c8-bbec-a1fd9d196359" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m Dec 15 04:57:08 localhost nova_compute[286344]: 2025-12-15 09:57:08.133 286348 DEBUG nova.network.neutron [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] [instance: 39ff1bd9-6f6b-44c8-bbec-a1fd9d196359] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m Dec 15 04:57:08 localhost nova_compute[286344]: 2025-12-15 09:57:08.134 286348 DEBUG nova.objects.instance [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Lazy-loading 'info_cache' on Instance uuid 39ff1bd9-6f6b-44c8-bbec-a1fd9d196359 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m Dec 15 04:57:08 localhost nova_compute[286344]: 2025-12-15 09:57:08.626 286348 DEBUG nova.network.neutron [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] [instance: 39ff1bd9-6f6b-44c8-bbec-a1fd9d196359] Updating instance_info_cache with network_info: [{"id": "03ef8889-3216-43fb-8a52-4be17a956ce1", "address": "fa:16:3e:74:df:7c", "network": {"id": "befb7a72-17a9-4bcb-b561-84b8f626685a", "bridge": "br-int", "label": "private", "subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, 
"meta": {}}, "ips": [{"address": "192.168.0.201", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.20", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.0.3"}}], "meta": {"injected": false, "tenant_id": "c785bf23f53946bc99867d8832a50266", "mtu": 1292, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap03ef8889-32", "ovs_interfaceid": "03ef8889-3216-43fb-8a52-4be17a956ce1", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m Dec 15 04:57:08 localhost nova_compute[286344]: 2025-12-15 09:57:08.728 286348 DEBUG oslo_concurrency.lockutils [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Releasing lock "refresh_cache-39ff1bd9-6f6b-44c8-bbec-a1fd9d196359" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m Dec 15 04:57:08 localhost nova_compute[286344]: 2025-12-15 09:57:08.728 286348 DEBUG nova.compute.manager [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] [instance: 39ff1bd9-6f6b-44c8-bbec-a1fd9d196359] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m Dec 15 04:57:08 localhost nova_compute[286344]: 2025-12-15 09:57:08.729 286348 DEBUG oslo_service.periodic_task [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Dec 15 04:57:08 localhost nova_compute[286344]: 2025-12-15 09:57:08.730 286348 DEBUG 
oslo_service.periodic_task [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Dec 15 04:57:08 localhost nova_compute[286344]: 2025-12-15 09:57:08.730 286348 DEBUG oslo_service.periodic_task [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Dec 15 04:57:08 localhost nova_compute[286344]: 2025-12-15 09:57:08.730 286348 DEBUG nova.compute.manager [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m Dec 15 04:57:08 localhost nova_compute[286344]: 2025-12-15 09:57:08.731 286348 DEBUG oslo_service.periodic_task [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Dec 15 04:57:09 localhost nova_compute[286344]: 2025-12-15 09:57:09.133 286348 DEBUG oslo_concurrency.lockutils [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Dec 15 04:57:09 localhost nova_compute[286344]: 2025-12-15 09:57:09.134 286348 DEBUG oslo_concurrency.lockutils [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Dec 15 04:57:09 localhost 
nova_compute[286344]: 2025-12-15 09:57:09.134 286348 DEBUG oslo_concurrency.lockutils [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Dec 15 04:57:09 localhost nova_compute[286344]: 2025-12-15 09:57:09.134 286348 DEBUG nova.compute.resource_tracker [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Auditing locally available compute resources for np0005559462.localdomain (node: np0005559462.localdomain) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m Dec 15 04:57:09 localhost nova_compute[286344]: 2025-12-15 09:57:09.135 286348 DEBUG oslo_concurrency.processutils [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m Dec 15 04:57:09 localhost ceph-mon[298913]: mon.np0005559462@0(leader) e17 handle_command mon_command({"prefix": "df", "format": "json"} v 0) Dec 15 04:57:09 localhost ceph-mon[298913]: log_channel(audit) log [DBG] : from='client.? 
172.18.0.106:0/1861283195' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch Dec 15 04:57:09 localhost nova_compute[286344]: 2025-12-15 09:57:09.595 286348 DEBUG oslo_concurrency.processutils [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.460s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m Dec 15 04:57:09 localhost nova_compute[286344]: 2025-12-15 09:57:09.844 286348 DEBUG nova.virt.libvirt.driver [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m Dec 15 04:57:09 localhost nova_compute[286344]: 2025-12-15 09:57:09.845 286348 DEBUG nova.virt.libvirt.driver [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m Dec 15 04:57:10 localhost nova_compute[286344]: 2025-12-15 09:57:10.080 286348 WARNING nova.virt.libvirt.driver [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] This host appears to have multiple sockets per NUMA node. 
The `socket` PCI NUMA affinity will not be supported.#033[00m Dec 15 04:57:10 localhost nova_compute[286344]: 2025-12-15 09:57:10.082 286348 DEBUG nova.compute.resource_tracker [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Hypervisor/Node resource view: name=np0005559462.localdomain free_ram=11750MB free_disk=41.836978912353516GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", 
"product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m Dec 15 04:57:10 localhost nova_compute[286344]: 2025-12-15 09:57:10.082 286348 DEBUG oslo_concurrency.lockutils [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Dec 15 04:57:10 localhost nova_compute[286344]: 2025-12-15 09:57:10.083 286348 DEBUG oslo_concurrency.lockutils [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Dec 15 04:57:10 localhost ceph-mon[298913]: mon.np0005559462@0(leader).osd e89 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Dec 15 04:57:10 localhost nova_compute[286344]: 2025-12-15 09:57:10.351 286348 DEBUG nova.compute.resource_tracker [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Instance 39ff1bd9-6f6b-44c8-bbec-a1fd9d196359 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 2, 'MEMORY_MB': 512, 'VCPU': 1}}. 
_remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m Dec 15 04:57:10 localhost nova_compute[286344]: 2025-12-15 09:57:10.352 286348 DEBUG nova.compute.resource_tracker [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m Dec 15 04:57:10 localhost nova_compute[286344]: 2025-12-15 09:57:10.353 286348 DEBUG nova.compute.resource_tracker [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Final resource view: name=np0005559462.localdomain phys_ram=15738MB used_ram=1024MB phys_disk=41GB used_disk=2GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m Dec 15 04:57:10 localhost nova_compute[286344]: 2025-12-15 09:57:10.405 286348 DEBUG oslo_concurrency.processutils [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m Dec 15 04:57:10 localhost ceph-mon[298913]: mon.np0005559462@0(leader) e17 handle_command mon_command({"prefix": "df", "format": "json"} v 0) Dec 15 04:57:10 localhost ceph-mon[298913]: log_channel(audit) log [DBG] : from='client.? 
172.18.0.106:0/3735652595' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch Dec 15 04:57:10 localhost nova_compute[286344]: 2025-12-15 09:57:10.862 286348 DEBUG oslo_concurrency.processutils [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.457s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m Dec 15 04:57:10 localhost nova_compute[286344]: 2025-12-15 09:57:10.870 286348 DEBUG nova.compute.provider_tree [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Inventory has not changed in ProviderTree for provider: 26c8956b-6742-4951-b566-971b9bbe323b update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m Dec 15 04:57:10 localhost nova_compute[286344]: 2025-12-15 09:57:10.892 286348 DEBUG nova.scheduler.client.report [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Inventory has not changed for provider 26c8956b-6742-4951-b566-971b9bbe323b based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m Dec 15 04:57:10 localhost nova_compute[286344]: 2025-12-15 09:57:10.894 286348 DEBUG nova.compute.resource_tracker [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Compute_service record updated for np0005559462.localdomain:np0005559462.localdomain _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m Dec 15 04:57:10 localhost nova_compute[286344]: 2025-12-15 09:57:10.895 286348 DEBUG 
oslo_concurrency.lockutils [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.812s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Dec 15 04:57:10 localhost nova_compute[286344]: 2025-12-15 09:57:10.900 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 04:57:12 localhost nova_compute[286344]: 2025-12-15 09:57:12.435 286348 DEBUG oslo_service.periodic_task [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Dec 15 04:57:14 localhost systemd[1]: Started /usr/bin/podman healthcheck run a4e94bd7aaee6c174c601060fb5f34559019ccea93fc397263ee3735731cfc1e. Dec 15 04:57:14 localhost systemd[1]: tmp-crun.FkCKsD.mount: Deactivated successfully. 
Dec 15 04:57:14 localhost podman[311387]: 2025-12-15 09:57:14.759472193 +0000 UTC m=+0.084216078 container health_status a4e94bd7aaee6c174c601060fb5f34559019ccea93fc397263ee3735731cfc1e (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}) Dec 15 04:57:14 localhost podman[311387]: 2025-12-15 09:57:14.772291169 +0000 UTC m=+0.097035054 container exec_died a4e94bd7aaee6c174c601060fb5f34559019ccea93fc397263ee3735731cfc1e (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', 
'/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible) Dec 15 04:57:14 localhost systemd[1]: a4e94bd7aaee6c174c601060fb5f34559019ccea93fc397263ee3735731cfc1e.service: Deactivated successfully. Dec 15 04:57:15 localhost ceph-mon[298913]: mon.np0005559462@0(leader).osd e89 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Dec 15 04:57:15 localhost nova_compute[286344]: 2025-12-15 09:57:15.904 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Dec 15 04:57:20 localhost ceph-mon[298913]: mon.np0005559462@0(leader).osd e89 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Dec 15 04:57:20 localhost nova_compute[286344]: 2025-12-15 09:57:20.906 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 04:57:20 localhost nova_compute[286344]: 2025-12-15 09:57:20.909 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 04:57:25 localhost ceph-mon[298913]: mon.np0005559462@0(leader).osd e89 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Dec 15 04:57:25 localhost nova_compute[286344]: 2025-12-15 09:57:25.911 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Dec 15 04:57:27 localhost systemd[1]: Started /usr/bin/podman healthcheck run 67ee72fcd99c896f79e8807764b1d0f04e7d45927d06e306c0986394e8ed42a0. 
Dec 15 04:57:27 localhost systemd[1]: Started /usr/bin/podman healthcheck run 9f0f481c937606298203797a71924b7614a5d9e3f4a8afd837cfa5b9602aedeb. Dec 15 04:57:27 localhost systemd[1]: Started /usr/bin/podman healthcheck run b3d8483ade539500e228c2af9c0c3e647febb5ec7903610a83aea32631007d4a. Dec 15 04:57:27 localhost podman[311411]: 2025-12-15 09:57:27.752032421 +0000 UTC m=+0.074811728 container health_status 9f0f481c937606298203797a71924b7614a5d9e3f4a8afd837cfa5b9602aedeb (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, io.buildah.version=1.41.3, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes 
Operator team, org.label-schema.schema-version=1.0, container_name=multipathd, tcib_managed=true, org.label-schema.vendor=CentOS) Dec 15 04:57:27 localhost podman[311411]: 2025-12-15 09:57:27.791418795 +0000 UTC m=+0.114198111 container exec_died 9f0f481c937606298203797a71924b7614a5d9e3f4a8afd837cfa5b9602aedeb (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, container_name=multipathd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS) Dec 15 04:57:27 localhost podman[311410]: 2025-12-15 
09:57:27.808282073 +0000 UTC m=+0.133057644 container health_status 67ee72fcd99c896f79e8807764b1d0f04e7d45927d06e306c0986394e8ed42a0 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible) Dec 15 04:57:27 localhost systemd[1]: 9f0f481c937606298203797a71924b7614a5d9e3f4a8afd837cfa5b9602aedeb.service: Deactivated successfully. 
Dec 15 04:57:27 localhost podman[311410]: 2025-12-15 09:57:27.817376036 +0000 UTC m=+0.142151607 container exec_died 67ee72fcd99c896f79e8807764b1d0f04e7d45927d06e306c0986394e8ed42a0 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible) Dec 15 04:57:27 localhost systemd[1]: 67ee72fcd99c896f79e8807764b1d0f04e7d45927d06e306c0986394e8ed42a0.service: Deactivated successfully. 
Dec 15 04:57:27 localhost podman[311412]: 2025-12-15 09:57:27.871768715 +0000 UTC m=+0.189871701 container health_status b3d8483ade539500e228c2af9c0c3e647febb5ec7903610a83aea32631007d4a (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ceilometer_agent_compute, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251202, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '07f3af15d79276418f54a8dde2fc251064527c3571430e14af77aaa2c01f1193-cb1e9478634a00c8fb5ad9cd7fcd38c7b2a1a85187dfaa618bc715c24c2b15fb'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', 
'/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}) Dec 15 04:57:27 localhost podman[311412]: 2025-12-15 09:57:27.909497402 +0000 UTC m=+0.227600418 container exec_died b3d8483ade539500e228c2af9c0c3e647febb5ec7903610a83aea32631007d4a (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, tcib_managed=true, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '07f3af15d79276418f54a8dde2fc251064527c3571430e14af77aaa2c01f1193-cb1e9478634a00c8fb5ad9cd7fcd38c7b2a1a85187dfaa618bc715c24c2b15fb'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=ceilometer_agent_compute, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, 
io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image) Dec 15 04:57:27 localhost systemd[1]: b3d8483ade539500e228c2af9c0c3e647febb5ec7903610a83aea32631007d4a.service: Deactivated successfully. Dec 15 04:57:29 localhost ceph-mon[298913]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #34. Immutable memtables: 0. Dec 15 04:57:29 localhost ceph-mon[298913]: rocksdb: (Original Log Time 2025/12/15-09:57:29.387431) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0 Dec 15 04:57:29 localhost ceph-mon[298913]: rocksdb: [db/flush_job.cc:856] [default] [JOB 17] Flushing memtable with next log file: 34 Dec 15 04:57:29 localhost ceph-mon[298913]: rocksdb: EVENT_LOG_v1 {"time_micros": 1765792649387493, "job": 17, "event": "flush_started", "num_memtables": 1, "num_entries": 2469, "num_deletes": 255, "total_data_size": 5018476, "memory_usage": 5213952, "flush_reason": "Manual Compaction"} Dec 15 04:57:29 localhost ceph-mon[298913]: rocksdb: [db/flush_job.cc:885] [default] [JOB 17] Level-0 flush table #35: started Dec 15 04:57:29 localhost ceph-mon[298913]: rocksdb: EVENT_LOG_v1 {"time_micros": 1765792649412642, "cf_name": "default", "job": 17, "event": "table_file_creation", "file_number": 35, "file_size": 4582896, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 19951, "largest_seqno": 22419, "table_properties": {"data_size": 4572134, "index_size": 6627, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 3077, "raw_key_size": 27121, "raw_average_key_size": 22, "raw_value_size": 4548676, "raw_average_value_size": 3740, "num_data_blocks": 288, "num_entries": 1216, "num_filter_entries": 1216, "num_deletions": 253, "num_merge_operands": 0, 
"num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1765792545, "oldest_key_time": 1765792545, "file_creation_time": 1765792649, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "603b24af-e2be-4214-bc56-9e652eb4af3d", "db_session_id": "0OJRM9SCUA16EXV0VQZ2", "orig_file_number": 35, "seqno_to_time_mapping": "N/A"}} Dec 15 04:57:29 localhost ceph-mon[298913]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 17] Flush lasted 25264 microseconds, and 9585 cpu microseconds. Dec 15 04:57:29 localhost ceph-mon[298913]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed. 
Dec 15 04:57:29 localhost ceph-mon[298913]: rocksdb: (Original Log Time 2025/12/15-09:57:29.412698) [db/flush_job.cc:967] [default] [JOB 17] Level-0 flush table #35: 4582896 bytes OK Dec 15 04:57:29 localhost ceph-mon[298913]: rocksdb: (Original Log Time 2025/12/15-09:57:29.412723) [db/memtable_list.cc:519] [default] Level-0 commit table #35 started Dec 15 04:57:29 localhost ceph-mon[298913]: rocksdb: (Original Log Time 2025/12/15-09:57:29.414769) [db/memtable_list.cc:722] [default] Level-0 commit table #35: memtable #1 done Dec 15 04:57:29 localhost ceph-mon[298913]: rocksdb: (Original Log Time 2025/12/15-09:57:29.414792) EVENT_LOG_v1 {"time_micros": 1765792649414786, "job": 17, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0} Dec 15 04:57:29 localhost ceph-mon[298913]: rocksdb: (Original Log Time 2025/12/15-09:57:29.414815) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25 Dec 15 04:57:29 localhost ceph-mon[298913]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 17] Try to delete WAL files size 5007250, prev total WAL file size 5007250, number of live WAL files 2. Dec 15 04:57:29 localhost ceph-mon[298913]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005559462/store.db/000031.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000 Dec 15 04:57:29 localhost ceph-mon[298913]: rocksdb: (Original Log Time 2025/12/15-09:57:29.416108) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '7061786F73003131323935' seq:72057594037927935, type:22 .. 
'7061786F73003131353437' seq:0, type:0; will stop at (end) Dec 15 04:57:29 localhost ceph-mon[298913]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 18] Compacting 1@0 + 1@6 files to L6, score -1.00 Dec 15 04:57:29 localhost ceph-mon[298913]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 17 Base level 0, inputs: [35(4475KB)], [33(16MB)] Dec 15 04:57:29 localhost ceph-mon[298913]: rocksdb: EVENT_LOG_v1 {"time_micros": 1765792649416176, "job": 18, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [35], "files_L6": [33], "score": -1, "input_data_size": 21750840, "oldest_snapshot_seqno": -1} Dec 15 04:57:29 localhost ceph-mon[298913]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 18] Generated table #36: 12306 keys, 18517214 bytes, temperature: kUnknown Dec 15 04:57:29 localhost ceph-mon[298913]: rocksdb: EVENT_LOG_v1 {"time_micros": 1765792649529949, "cf_name": "default", "job": 18, "event": "table_file_creation", "file_number": 36, "file_size": 18517214, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 18444577, "index_size": 40792, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 30789, "raw_key_size": 328946, "raw_average_key_size": 26, "raw_value_size": 18232369, "raw_average_value_size": 1481, "num_data_blocks": 1566, "num_entries": 12306, "num_filter_entries": 12306, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; 
strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1765792320, "oldest_key_time": 0, "file_creation_time": 1765792649, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "603b24af-e2be-4214-bc56-9e652eb4af3d", "db_session_id": "0OJRM9SCUA16EXV0VQZ2", "orig_file_number": 36, "seqno_to_time_mapping": "N/A"}} Dec 15 04:57:29 localhost ceph-mon[298913]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed. Dec 15 04:57:29 localhost ceph-mon[298913]: rocksdb: (Original Log Time 2025/12/15-09:57:29.530401) [db/compaction/compaction_job.cc:1663] [default] [JOB 18] Compacted 1@0 + 1@6 files to L6 => 18517214 bytes Dec 15 04:57:29 localhost ceph-mon[298913]: rocksdb: (Original Log Time 2025/12/15-09:57:29.532290) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 190.9 rd, 162.5 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(4.4, 16.4 +0.0 blob) out(17.7 +0.0 blob), read-write-amplify(8.8) write-amplify(4.0) OK, records in: 12853, records dropped: 547 output_compression: NoCompression Dec 15 04:57:29 localhost ceph-mon[298913]: rocksdb: (Original Log Time 2025/12/15-09:57:29.532338) EVENT_LOG_v1 {"time_micros": 1765792649532308, "job": 18, "event": "compaction_finished", "compaction_time_micros": 113951, "compaction_time_cpu_micros": 50847, "output_level": 6, "num_output_files": 1, "total_output_size": 18517214, "num_input_records": 12853, "num_output_records": 12306, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]} Dec 15 04:57:29 localhost ceph-mon[298913]: rocksdb: [file/delete_scheduler.cc:74] Deleted file 
/var/lib/ceph/mon/ceph-np0005559462/store.db/000035.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000 Dec 15 04:57:29 localhost ceph-mon[298913]: rocksdb: EVENT_LOG_v1 {"time_micros": 1765792649533279, "job": 18, "event": "table_file_deletion", "file_number": 35} Dec 15 04:57:29 localhost ceph-mon[298913]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005559462/store.db/000033.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000 Dec 15 04:57:29 localhost ceph-mon[298913]: rocksdb: EVENT_LOG_v1 {"time_micros": 1765792649535961, "job": 18, "event": "table_file_deletion", "file_number": 33} Dec 15 04:57:29 localhost ceph-mon[298913]: rocksdb: (Original Log Time 2025/12/15-09:57:29.415940) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting Dec 15 04:57:29 localhost ceph-mon[298913]: rocksdb: (Original Log Time 2025/12/15-09:57:29.536043) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting Dec 15 04:57:29 localhost ceph-mon[298913]: rocksdb: (Original Log Time 2025/12/15-09:57:29.536051) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting Dec 15 04:57:29 localhost ceph-mon[298913]: rocksdb: (Original Log Time 2025/12/15-09:57:29.536055) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting Dec 15 04:57:29 localhost ceph-mon[298913]: rocksdb: (Original Log Time 2025/12/15-09:57:29.536059) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting Dec 15 04:57:29 localhost ceph-mon[298913]: rocksdb: (Original Log Time 2025/12/15-09:57:29.536063) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting Dec 15 04:57:29 localhost systemd[1]: Started /usr/bin/podman healthcheck run 730c50020a7d9e2da21d2462d40af20ac303e43f3491f692cbbd87f432ba3e09. 
Dec 15 04:57:29 localhost systemd[1]: Started /usr/bin/podman healthcheck run ac2eae30a2a7f971398af42ea67b3ed799d4b3341f7688ebcd73f7ca7f66c2a8. Dec 15 04:57:29 localhost podman[311468]: 2025-12-15 09:57:29.751831419 +0000 UTC m=+0.083507809 container health_status 730c50020a7d9e2da21d2462d40af20ac303e43f3491f692cbbd87f432ba3e09 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.buildah.version=1.33.7, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vcs-type=git, url=https://catalog.redhat.com/en/search?searchType=containers, architecture=x86_64, com.redhat.component=ubi9-minimal-container, config_id=openstack_network_exporter, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=ubi9-minimal, managed_by=edpm_ansible, release=1755695350, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., container_name=openstack_network_exporter, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., build-date=2025-08-20T13:12:41, distribution-scope=public, io.openshift.expose-services=, version=9.6, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'cb1e9478634a00c8fb5ad9cd7fcd38c7b2a1a85187dfaa618bc715c24c2b15fb'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, maintainer=Red Hat, Inc.) Dec 15 04:57:29 localhost podman[311468]: 2025-12-15 09:57:29.793437614 +0000 UTC m=+0.125114024 container exec_died 730c50020a7d9e2da21d2462d40af20ac303e43f3491f692cbbd87f432ba3e09 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., release=1755695350, com.redhat.component=ubi9-minimal-container, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vendor=Red Hat, Inc., distribution-scope=public, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., container_name=openstack_network_exporter, io.buildah.version=1.33.7, name=ubi9-minimal, io.openshift.expose-services=, build-date=2025-08-20T13:12:41, config_id=openstack_network_exporter, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, version=9.6, vcs-type=git, architecture=x86_64, io.openshift.tags=minimal rhel9, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'cb1e9478634a00c8fb5ad9cd7fcd38c7b2a1a85187dfaa618bc715c24c2b15fb'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, managed_by=edpm_ansible, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9.) 
Dec 15 04:57:29 localhost podman[311469]: 2025-12-15 09:57:29.80301552 +0000 UTC m=+0.129824014 container health_status ac2eae30a2a7f971398af42ea67b3ed799d4b3341f7688ebcd73f7ca7f66c2a8 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, config_id=ovn_controller, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '07f3af15d79276418f54a8dde2fc251064527c3571430e14af77aaa2c01f1193'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, container_name=ovn_controller, io.buildah.version=1.41.3, managed_by=edpm_ansible, tcib_managed=true) Dec 15 04:57:29 localhost systemd[1]: 730c50020a7d9e2da21d2462d40af20ac303e43f3491f692cbbd87f432ba3e09.service: Deactivated successfully. 
Dec 15 04:57:29 localhost podman[311469]: 2025-12-15 09:57:29.864572909 +0000 UTC m=+0.191381343 container exec_died ac2eae30a2a7f971398af42ea67b3ed799d4b3341f7688ebcd73f7ca7f66c2a8 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, org.label-schema.schema-version=1.0, container_name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '07f3af15d79276418f54a8dde2fc251064527c3571430e14af77aaa2c01f1193'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, config_id=ovn_controller, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2) Dec 15 04:57:29 localhost systemd[1]: ac2eae30a2a7f971398af42ea67b3ed799d4b3341f7688ebcd73f7ca7f66c2a8.service: Deactivated successfully. 
Dec 15 04:57:30 localhost ceph-mon[298913]: mon.np0005559462@0(leader).osd e89 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Dec 15 04:57:30 localhost nova_compute[286344]: 2025-12-15 09:57:30.914 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 04:57:31 localhost podman[243449]: time="2025-12-15T09:57:31Z" level=info msg="List containers: received `last` parameter - overwriting `limit`" Dec 15 04:57:31 localhost podman[243449]: @ - - [15/Dec/2025:09:57:31 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 154814 "" "Go-http-client/1.1" Dec 15 04:57:31 localhost podman[243449]: @ - - [15/Dec/2025:09:57:31 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 18733 "" "Go-http-client/1.1" Dec 15 04:57:34 localhost openstack_network_exporter[246484]: ERROR 09:57:34 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath Dec 15 04:57:34 localhost openstack_network_exporter[246484]: Dec 15 04:57:34 localhost openstack_network_exporter[246484]: ERROR 09:57:34 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server Dec 15 04:57:34 localhost openstack_network_exporter[246484]: ERROR 09:57:34 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd Dec 15 04:57:34 localhost openstack_network_exporter[246484]: ERROR 09:57:34 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd Dec 15 04:57:34 localhost openstack_network_exporter[246484]: ERROR 09:57:34 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath Dec 15 04:57:34 localhost openstack_network_exporter[246484]: Dec 15 04:57:35 localhost ceph-mon[298913]: 
mon.np0005559462@0(leader).osd e89 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Dec 15 04:57:35 localhost nova_compute[286344]: 2025-12-15 09:57:35.917 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Dec 15 04:57:38 localhost systemd[1]: Started /usr/bin/podman healthcheck run 4d46c406a3855fd1c322e5d86c048188ea5865daa07ba23aaebacc7879668923. Dec 15 04:57:38 localhost podman[311512]: 2025-12-15 09:57:38.748735705 +0000 UTC m=+0.083254272 container health_status 4d46c406a3855fd1c322e5d86c048188ea5865daa07ba23aaebacc7879668923 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, config_id=ovn_metadata_agent, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '07f3af15d79276418f54a8dde2fc251064527c3571430e14af77aaa2c01f1193-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', 
'/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb) Dec 15 04:57:38 localhost podman[311512]: 2025-12-15 09:57:38.753585679 +0000 UTC m=+0.088104206 container exec_died 4d46c406a3855fd1c322e5d86c048188ea5865daa07ba23aaebacc7879668923 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '07f3af15d79276418f54a8dde2fc251064527c3571430e14af77aaa2c01f1193-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, tcib_managed=true, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image) Dec 15 04:57:38 localhost systemd[1]: 4d46c406a3855fd1c322e5d86c048188ea5865daa07ba23aaebacc7879668923.service: Deactivated successfully. Dec 15 04:57:39 localhost ceph-mon[298913]: mon.np0005559462@0(leader) e17 handle_command mon_command({"prefix": "mgr fail"} v 0) Dec 15 04:57:39 localhost ceph-mon[298913]: log_channel(audit) log [INF] : from='client.? ' entity='client.admin' cmd={"prefix": "mgr fail"} : dispatch Dec 15 04:57:39 localhost ceph-mon[298913]: mon.np0005559462@0(leader).osd e89 do_prune osdmap full prune enabled Dec 15 04:57:39 localhost ceph-mon[298913]: log_channel(cluster) log [INF] : Activating manager daemon np0005559464.aomnqe Dec 15 04:57:39 localhost ceph-mon[298913]: mon.np0005559462@0(leader).osd e90 e90: 6 total, 6 up, 6 in Dec 15 04:57:39 localhost ceph-mon[298913]: log_channel(cluster) log [DBG] : osdmap e90: 6 total, 6 up, 6 in Dec 15 04:57:39 localhost ceph-mon[298913]: log_channel(audit) log [INF] : from='client.? ' entity='client.admin' cmd='[{"prefix": "mgr fail"}]': finished Dec 15 04:57:39 localhost ceph-mon[298913]: log_channel(cluster) log [DBG] : mgrmap e43: np0005559464.aomnqe(active, starting, since 0.0385837s), standbys: np0005559462.fudvyx Dec 15 04:57:39 localhost ceph-mon[298913]: log_channel(cluster) log [INF] : Manager daemon np0005559464.aomnqe is now available Dec 15 04:57:39 localhost systemd[1]: session-71.scope: Deactivated successfully. Dec 15 04:57:39 localhost systemd[1]: session-71.scope: Consumed 10.634s CPU time. 
Dec 15 04:57:39 localhost systemd-logind[763]: Session 71 logged out. Waiting for processes to exit. Dec 15 04:57:39 localhost systemd-logind[763]: Removed session 71. Dec 15 04:57:39 localhost ceph-mon[298913]: from='client.? 172.18.0.200:0/329918378' entity='client.admin' cmd={"prefix": "mgr fail"} : dispatch Dec 15 04:57:39 localhost ceph-mon[298913]: from='client.? ' entity='client.admin' cmd={"prefix": "mgr fail"} : dispatch Dec 15 04:57:39 localhost ceph-mon[298913]: Activating manager daemon np0005559464.aomnqe Dec 15 04:57:39 localhost ceph-mon[298913]: from='client.? ' entity='client.admin' cmd='[{"prefix": "mgr fail"}]': finished Dec 15 04:57:39 localhost ceph-mon[298913]: Manager daemon np0005559464.aomnqe is now available Dec 15 04:57:39 localhost ceph-mon[298913]: mon.np0005559462@0(leader) e17 handle_command mon_command({"prefix":"config rm","who":"mgr","name":"mgr/rbd_support/np0005559464.aomnqe/mirror_snapshot_schedule"} v 0) Dec 15 04:57:39 localhost ceph-mon[298913]: log_channel(audit) log [INF] : from='mgr.34411 ' entity='mgr.np0005559464.aomnqe' cmd={"prefix":"config rm","who":"mgr","name":"mgr/rbd_support/np0005559464.aomnqe/mirror_snapshot_schedule"} : dispatch Dec 15 04:57:39 localhost ceph-mon[298913]: mon.np0005559462@0(leader) e17 handle_command mon_command({"prefix":"config rm","who":"mgr","name":"mgr/rbd_support/np0005559464.aomnqe/trash_purge_schedule"} v 0) Dec 15 04:57:39 localhost ceph-mon[298913]: log_channel(audit) log [INF] : from='mgr.34411 ' entity='mgr.np0005559464.aomnqe' cmd={"prefix":"config rm","who":"mgr","name":"mgr/rbd_support/np0005559464.aomnqe/trash_purge_schedule"} : dispatch Dec 15 04:57:39 localhost sshd[311530]: main: sshd: ssh-rsa algorithm is disabled Dec 15 04:57:39 localhost systemd-logind[763]: New session 72 of user ceph-admin. Dec 15 04:57:39 localhost systemd[1]: Started Session 72 of User ceph-admin. 
Dec 15 04:57:40 localhost ceph-mon[298913]: log_channel(cluster) log [DBG] : mgrmap e44: np0005559464.aomnqe(active, since 1.05866s), standbys: np0005559462.fudvyx Dec 15 04:57:40 localhost ceph-mon[298913]: mon.np0005559462@0(leader).osd e90 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Dec 15 04:57:40 localhost ceph-mon[298913]: from='mgr.34411 172.18.0.108:0/929118679' entity='mgr.np0005559464.aomnqe' cmd={"prefix":"config rm","who":"mgr","name":"mgr/rbd_support/np0005559464.aomnqe/mirror_snapshot_schedule"} : dispatch Dec 15 04:57:40 localhost ceph-mon[298913]: from='mgr.34411 ' entity='mgr.np0005559464.aomnqe' cmd={"prefix":"config rm","who":"mgr","name":"mgr/rbd_support/np0005559464.aomnqe/mirror_snapshot_schedule"} : dispatch Dec 15 04:57:40 localhost ceph-mon[298913]: from='mgr.34411 172.18.0.108:0/929118679' entity='mgr.np0005559464.aomnqe' cmd={"prefix":"config rm","who":"mgr","name":"mgr/rbd_support/np0005559464.aomnqe/trash_purge_schedule"} : dispatch Dec 15 04:57:40 localhost ceph-mon[298913]: from='mgr.34411 ' entity='mgr.np0005559464.aomnqe' cmd={"prefix":"config rm","who":"mgr","name":"mgr/rbd_support/np0005559464.aomnqe/trash_purge_schedule"} : dispatch Dec 15 04:57:40 localhost podman[311643]: 2025-12-15 09:57:40.531053966 +0000 UTC m=+0.080495586 container exec 8dcda56b365b42dc8758aab77a9ec80db304780e449052738f7e4e648ae1ecaf (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-bce17446-41b5-5408-a23e-0b011906b44a-crash-np0005559462, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, version=7, com.redhat.component=rhceph-container, maintainer=Guillaume Abrioux , GIT_BRANCH=main, release=1763362218, ceph=True, CEPH_POINT_RELEASE=, io.buildah.version=1.41.4, io.openshift.tags=rhceph ceph, build-date=2025-11-26T19:44:28Z, GIT_CLEAN=True, 
vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat Ceph Storage 7, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, description=Red Hat Ceph Storage 7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.openshift.expose-services=, architecture=x86_64, vendor=Red Hat, Inc., name=rhceph, distribution-scope=public, url=https://catalog.redhat.com/en/search?searchType=containers, GIT_REPO=https://github.com/ceph/ceph-container.git, RELEASE=main) Dec 15 04:57:40 localhost podman[311643]: 2025-12-15 09:57:40.629526949 +0000 UTC m=+0.178968539 container exec_died 8dcda56b365b42dc8758aab77a9ec80db304780e449052738f7e4e648ae1ecaf (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-bce17446-41b5-5408-a23e-0b011906b44a-crash-np0005559462, io.openshift.expose-services=, vcs-type=git, vendor=Red Hat, Inc., name=rhceph, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, RELEASE=main, distribution-scope=public, release=1763362218, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_CLEAN=True, url=https://catalog.redhat.com/en/search?searchType=containers, maintainer=Guillaume Abrioux , GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhceph ceph, version=7, GIT_REPO=https://github.com/ceph/ceph-container.git, GIT_BRANCH=main, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, description=Red Hat Ceph Storage 7, architecture=x86_64, com.redhat.component=rhceph-container, build-date=2025-11-26T19:44:28Z, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, ceph=True, 
io.k8s.description=Red Hat Ceph Storage 7, CEPH_POINT_RELEASE=) Dec 15 04:57:40 localhost nova_compute[286344]: 2025-12-15 09:57:40.919 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Dec 15 04:57:40 localhost nova_compute[286344]: 2025-12-15 09:57:40.922 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 04:57:40 localhost nova_compute[286344]: 2025-12-15 09:57:40.922 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m Dec 15 04:57:40 localhost nova_compute[286344]: 2025-12-15 09:57:40.922 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Dec 15 04:57:40 localhost nova_compute[286344]: 2025-12-15 09:57:40.923 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Dec 15 04:57:40 localhost nova_compute[286344]: 2025-12-15 09:57:40.926 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 04:57:41 localhost ceph-mon[298913]: mon.np0005559462@0(leader) e17 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005559462.localdomain.devices.0}] v 0) Dec 15 04:57:41 localhost ceph-mon[298913]: log_channel(audit) log [INF] : from='mgr.34411 ' entity='mgr.np0005559464.aomnqe' Dec 15 04:57:41 localhost ceph-mon[298913]: mon.np0005559462@0(leader) e17 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005559462.localdomain}] v 0) Dec 15 04:57:41 localhost ceph-mon[298913]: log_channel(audit) log [INF] : from='mgr.34411 ' 
entity='mgr.np0005559464.aomnqe' Dec 15 04:57:41 localhost ceph-mon[298913]: mon.np0005559462@0(leader) e17 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005559464.localdomain.devices.0}] v 0) Dec 15 04:57:41 localhost ceph-mon[298913]: log_channel(audit) log [INF] : from='mgr.34411 ' entity='mgr.np0005559464.aomnqe' Dec 15 04:57:41 localhost ceph-mon[298913]: mon.np0005559462@0(leader) e17 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005559464.localdomain}] v 0) Dec 15 04:57:41 localhost ceph-mon[298913]: log_channel(audit) log [INF] : from='mgr.34411 ' entity='mgr.np0005559464.aomnqe' Dec 15 04:57:41 localhost ceph-mon[298913]: mon.np0005559462@0(leader) e17 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005559463.localdomain.devices.0}] v 0) Dec 15 04:57:41 localhost ceph-mon[298913]: log_channel(audit) log [INF] : from='mgr.34411 ' entity='mgr.np0005559464.aomnqe' Dec 15 04:57:41 localhost ceph-mon[298913]: mon.np0005559462@0(leader) e17 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005559463.localdomain}] v 0) Dec 15 04:57:41 localhost ceph-mon[298913]: log_channel(audit) log [INF] : from='mgr.34411 ' entity='mgr.np0005559464.aomnqe' Dec 15 04:57:42 localhost ceph-mon[298913]: [15/Dec/2025:09:57:40] ENGINE Bus STARTING Dec 15 04:57:42 localhost ceph-mon[298913]: [15/Dec/2025:09:57:40] ENGINE Serving on http://172.18.0.108:8765 Dec 15 04:57:42 localhost ceph-mon[298913]: [15/Dec/2025:09:57:40] ENGINE Serving on https://172.18.0.108:7150 Dec 15 04:57:42 localhost ceph-mon[298913]: [15/Dec/2025:09:57:40] ENGINE Bus STARTED Dec 15 04:57:42 localhost ceph-mon[298913]: [15/Dec/2025:09:57:40] ENGINE Client ('172.18.0.108', 58624) lost — peer dropped the TLS connection suddenly, during handshake: (6, 'TLS/SSL connection has been closed (EOF) (_ssl.c:1147)') Dec 15 04:57:42 localhost ceph-mon[298913]: from='mgr.34411 ' entity='mgr.np0005559464.aomnqe' 
Dec 15 04:57:42 localhost ceph-mon[298913]: from='mgr.34411 ' entity='mgr.np0005559464.aomnqe' Dec 15 04:57:42 localhost ceph-mon[298913]: from='mgr.34411 ' entity='mgr.np0005559464.aomnqe' Dec 15 04:57:42 localhost ceph-mon[298913]: from='mgr.34411 ' entity='mgr.np0005559464.aomnqe' Dec 15 04:57:42 localhost ceph-mon[298913]: from='mgr.34411 ' entity='mgr.np0005559464.aomnqe' Dec 15 04:57:42 localhost ceph-mon[298913]: from='mgr.34411 ' entity='mgr.np0005559464.aomnqe' Dec 15 04:57:42 localhost ceph-mon[298913]: log_channel(cluster) log [DBG] : mgrmap e45: np0005559464.aomnqe(active, since 3s), standbys: np0005559462.fudvyx Dec 15 04:57:42 localhost ceph-mon[298913]: mon.np0005559462@0(leader) e17 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005559462.localdomain.devices.0}] v 0) Dec 15 04:57:42 localhost ceph-mon[298913]: log_channel(audit) log [INF] : from='mgr.34411 ' entity='mgr.np0005559464.aomnqe' Dec 15 04:57:42 localhost ceph-mon[298913]: mon.np0005559462@0(leader) e17 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005559462.localdomain}] v 0) Dec 15 04:57:42 localhost ceph-mon[298913]: log_channel(audit) log [INF] : from='mgr.34411 ' entity='mgr.np0005559464.aomnqe' Dec 15 04:57:42 localhost ceph-mon[298913]: mon.np0005559462@0(leader) e17 handle_command mon_command({"prefix": "config rm", "who": "osd.0", "name": "osd_memory_target"} v 0) Dec 15 04:57:42 localhost ceph-mon[298913]: log_channel(audit) log [INF] : from='mgr.34411 ' entity='mgr.np0005559464.aomnqe' cmd={"prefix": "config rm", "who": "osd.0", "name": "osd_memory_target"} : dispatch Dec 15 04:57:42 localhost ceph-mon[298913]: mon.np0005559462@0(leader) e17 handle_command mon_command({"prefix": "config rm", "who": "osd.3", "name": "osd_memory_target"} v 0) Dec 15 04:57:42 localhost ceph-mon[298913]: log_channel(audit) log [INF] : from='mgr.34411 ' entity='mgr.np0005559464.aomnqe' cmd={"prefix": "config rm", "who": "osd.3", "name": 
"osd_memory_target"} : dispatch Dec 15 04:57:42 localhost ceph-mon[298913]: mon.np0005559462@0(leader) e17 handle_command mon_command([{prefix=config set, name=osd_memory_target}] v 0) Dec 15 04:57:42 localhost ceph-mon[298913]: mon.np0005559462@0(leader) e17 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005559463.localdomain.devices.0}] v 0) Dec 15 04:57:42 localhost ceph-mon[298913]: log_channel(audit) log [INF] : from='mgr.34411 ' entity='mgr.np0005559464.aomnqe' Dec 15 04:57:42 localhost ceph-mon[298913]: mon.np0005559462@0(leader) e17 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005559463.localdomain}] v 0) Dec 15 04:57:42 localhost ceph-mon[298913]: mon.np0005559462@0(leader) e17 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005559464.localdomain.devices.0}] v 0) Dec 15 04:57:42 localhost ceph-mon[298913]: log_channel(audit) log [INF] : from='mgr.34411 ' entity='mgr.np0005559464.aomnqe' Dec 15 04:57:42 localhost ceph-mon[298913]: log_channel(audit) log [INF] : from='mgr.34411 ' entity='mgr.np0005559464.aomnqe' Dec 15 04:57:42 localhost ceph-mon[298913]: mon.np0005559462@0(leader) e17 handle_command mon_command({"prefix": "config rm", "who": "osd.2", "name": "osd_memory_target"} v 0) Dec 15 04:57:42 localhost ceph-mon[298913]: log_channel(audit) log [INF] : from='mgr.34411 ' entity='mgr.np0005559464.aomnqe' cmd={"prefix": "config rm", "who": "osd.2", "name": "osd_memory_target"} : dispatch Dec 15 04:57:42 localhost ceph-mon[298913]: mon.np0005559462@0(leader) e17 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005559464.localdomain}] v 0) Dec 15 04:57:42 localhost ceph-mon[298913]: mon.np0005559462@0(leader) e17 handle_command mon_command({"prefix": "config rm", "who": "osd.5", "name": "osd_memory_target"} v 0) Dec 15 04:57:42 localhost ceph-mon[298913]: log_channel(audit) log [INF] : from='mgr.34411 ' entity='mgr.np0005559464.aomnqe' 
cmd={"prefix": "config rm", "who": "osd.5", "name": "osd_memory_target"} : dispatch Dec 15 04:57:42 localhost ceph-mon[298913]: log_channel(audit) log [INF] : from='mgr.34411 ' entity='mgr.np0005559464.aomnqe' Dec 15 04:57:42 localhost ceph-mon[298913]: mon.np0005559462@0(leader) e17 handle_command mon_command({"prefix": "config rm", "who": "osd.1", "name": "osd_memory_target"} v 0) Dec 15 04:57:42 localhost ceph-mon[298913]: log_channel(audit) log [INF] : from='mgr.34411 ' entity='mgr.np0005559464.aomnqe' cmd={"prefix": "config rm", "who": "osd.1", "name": "osd_memory_target"} : dispatch Dec 15 04:57:42 localhost ceph-mon[298913]: mon.np0005559462@0(leader) e17 handle_command mon_command([{prefix=config set, name=osd_memory_target}] v 0) Dec 15 04:57:42 localhost ceph-mon[298913]: mon.np0005559462@0(leader) e17 handle_command mon_command({"prefix": "config rm", "who": "osd.4", "name": "osd_memory_target"} v 0) Dec 15 04:57:42 localhost ceph-mon[298913]: log_channel(audit) log [INF] : from='mgr.34411 ' entity='mgr.np0005559464.aomnqe' cmd={"prefix": "config rm", "who": "osd.4", "name": "osd_memory_target"} : dispatch Dec 15 04:57:42 localhost ceph-mon[298913]: mon.np0005559462@0(leader) e17 handle_command mon_command([{prefix=config set, name=osd_memory_target}] v 0) Dec 15 04:57:43 localhost ceph-mon[298913]: from='mgr.34411 ' entity='mgr.np0005559464.aomnqe' Dec 15 04:57:43 localhost ceph-mon[298913]: from='mgr.34411 172.18.0.108:0/929118679' entity='mgr.np0005559464.aomnqe' cmd={"prefix": "config rm", "who": "osd.0", "name": "osd_memory_target"} : dispatch Dec 15 04:57:43 localhost ceph-mon[298913]: from='mgr.34411 ' entity='mgr.np0005559464.aomnqe' Dec 15 04:57:43 localhost ceph-mon[298913]: from='mgr.34411 172.18.0.108:0/929118679' entity='mgr.np0005559464.aomnqe' cmd={"prefix": "config rm", "who": "osd.3", "name": "osd_memory_target"} : dispatch Dec 15 04:57:43 localhost ceph-mon[298913]: from='mgr.34411 ' entity='mgr.np0005559464.aomnqe' cmd={"prefix": 
"config rm", "who": "osd.0", "name": "osd_memory_target"} : dispatch Dec 15 04:57:43 localhost ceph-mon[298913]: from='mgr.34411 ' entity='mgr.np0005559464.aomnqe' cmd={"prefix": "config rm", "who": "osd.3", "name": "osd_memory_target"} : dispatch Dec 15 04:57:43 localhost ceph-mon[298913]: from='mgr.34411 ' entity='mgr.np0005559464.aomnqe' Dec 15 04:57:43 localhost ceph-mon[298913]: from='mgr.34411 ' entity='mgr.np0005559464.aomnqe' Dec 15 04:57:43 localhost ceph-mon[298913]: from='mgr.34411 172.18.0.108:0/929118679' entity='mgr.np0005559464.aomnqe' cmd={"prefix": "config rm", "who": "osd.2", "name": "osd_memory_target"} : dispatch Dec 15 04:57:43 localhost ceph-mon[298913]: from='mgr.34411 172.18.0.108:0/929118679' entity='mgr.np0005559464.aomnqe' cmd={"prefix": "config rm", "who": "osd.5", "name": "osd_memory_target"} : dispatch Dec 15 04:57:43 localhost ceph-mon[298913]: from='mgr.34411 ' entity='mgr.np0005559464.aomnqe' Dec 15 04:57:43 localhost ceph-mon[298913]: from='mgr.34411 ' entity='mgr.np0005559464.aomnqe' cmd={"prefix": "config rm", "who": "osd.2", "name": "osd_memory_target"} : dispatch Dec 15 04:57:43 localhost ceph-mon[298913]: from='mgr.34411 ' entity='mgr.np0005559464.aomnqe' cmd={"prefix": "config rm", "who": "osd.5", "name": "osd_memory_target"} : dispatch Dec 15 04:57:43 localhost ceph-mon[298913]: from='mgr.34411 172.18.0.108:0/929118679' entity='mgr.np0005559464.aomnqe' cmd={"prefix": "config rm", "who": "osd.1", "name": "osd_memory_target"} : dispatch Dec 15 04:57:43 localhost ceph-mon[298913]: from='mgr.34411 ' entity='mgr.np0005559464.aomnqe' Dec 15 04:57:43 localhost ceph-mon[298913]: from='mgr.34411 172.18.0.108:0/929118679' entity='mgr.np0005559464.aomnqe' cmd={"prefix": "config rm", "who": "osd.4", "name": "osd_memory_target"} : dispatch Dec 15 04:57:43 localhost ceph-mon[298913]: from='mgr.34411 ' entity='mgr.np0005559464.aomnqe' cmd={"prefix": "config rm", "who": "osd.1", "name": "osd_memory_target"} : dispatch Dec 15 04:57:43 
localhost ceph-mon[298913]: from='mgr.34411 ' entity='mgr.np0005559464.aomnqe' cmd={"prefix": "config rm", "who": "osd.4", "name": "osd_memory_target"} : dispatch Dec 15 04:57:43 localhost ceph-mon[298913]: from='mgr.34411 172.18.0.108:0/929118679' entity='mgr.np0005559464.aomnqe' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch Dec 15 04:57:44 localhost ceph-mon[298913]: log_channel(cluster) log [DBG] : Standby manager daemon np0005559463.daptkf started Dec 15 04:57:44 localhost ceph-mon[298913]: log_channel(cluster) log [DBG] : mgrmap e46: np0005559464.aomnqe(active, since 5s), standbys: np0005559462.fudvyx, np0005559463.daptkf Dec 15 04:57:44 localhost ceph-mon[298913]: Adjusting osd_memory_target on np0005559462.localdomain to 836.6M Dec 15 04:57:44 localhost ceph-mon[298913]: Unable to set osd_memory_target on np0005559462.localdomain to 877243801: error parsing value: Value '877243801' is below minimum 939524096 Dec 15 04:57:44 localhost ceph-mon[298913]: Adjusting osd_memory_target on np0005559463.localdomain to 836.6M Dec 15 04:57:44 localhost ceph-mon[298913]: Unable to set osd_memory_target on np0005559463.localdomain to 877246668: error parsing value: Value '877246668' is below minimum 939524096 Dec 15 04:57:44 localhost ceph-mon[298913]: Adjusting osd_memory_target on np0005559464.localdomain to 836.6M Dec 15 04:57:44 localhost ceph-mon[298913]: Unable to set osd_memory_target on np0005559464.localdomain to 877246668: error parsing value: Value '877246668' is below minimum 939524096 Dec 15 04:57:44 localhost ceph-mon[298913]: Updating np0005559462.localdomain:/etc/ceph/ceph.conf Dec 15 04:57:44 localhost ceph-mon[298913]: Updating np0005559463.localdomain:/etc/ceph/ceph.conf Dec 15 04:57:44 localhost ceph-mon[298913]: Updating np0005559464.localdomain:/etc/ceph/ceph.conf Dec 15 04:57:44 localhost systemd[1]: Started /usr/bin/podman healthcheck run a4e94bd7aaee6c174c601060fb5f34559019ccea93fc397263ee3735731cfc1e. 
Dec 15 04:57:44 localhost podman[312349]: 2025-12-15 09:57:44.925584264 +0000 UTC m=+0.083457738 container health_status a4e94bd7aaee6c174c601060fb5f34559019ccea93fc397263ee3735731cfc1e (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible) Dec 15 04:57:44 localhost podman[312349]: 2025-12-15 09:57:44.9373481 +0000 UTC m=+0.095221584 container exec_died a4e94bd7aaee6c174c601060fb5f34559019ccea93fc397263ee3735731cfc1e (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, managed_by=edpm_ansible, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', 
'/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi ) Dec 15 04:57:44 localhost systemd[1]: a4e94bd7aaee6c174c601060fb5f34559019ccea93fc397263ee3735731cfc1e.service: Deactivated successfully. Dec 15 04:57:45 localhost ceph-mon[298913]: mon.np0005559462@0(leader).osd e90 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Dec 15 04:57:45 localhost ceph-mon[298913]: Updating np0005559463.localdomain:/var/lib/ceph/bce17446-41b5-5408-a23e-0b011906b44a/config/ceph.conf Dec 15 04:57:45 localhost ceph-mon[298913]: Updating np0005559462.localdomain:/var/lib/ceph/bce17446-41b5-5408-a23e-0b011906b44a/config/ceph.conf Dec 15 04:57:45 localhost ceph-mon[298913]: Updating np0005559464.localdomain:/var/lib/ceph/bce17446-41b5-5408-a23e-0b011906b44a/config/ceph.conf Dec 15 04:57:45 localhost ceph-mon[298913]: Updating np0005559463.localdomain:/etc/ceph/ceph.client.admin.keyring Dec 15 04:57:45 localhost ceph-mon[298913]: Updating np0005559464.localdomain:/etc/ceph/ceph.client.admin.keyring Dec 15 04:57:45 localhost ceph-mon[298913]: Updating np0005559462.localdomain:/etc/ceph/ceph.client.admin.keyring Dec 15 04:57:45 localhost ceph-mon[298913]: mon.np0005559462@0(leader) e17 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005559464.localdomain.devices.0}] v 0) Dec 15 04:57:45 localhost ceph-mon[298913]: log_channel(audit) log [INF] : from='mgr.34411 ' entity='mgr.np0005559464.aomnqe' Dec 15 04:57:45 localhost ceph-mon[298913]: mon.np0005559462@0(leader) e17 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005559464.localdomain}] v 0) Dec 15 04:57:45 localhost ceph-mon[298913]: log_channel(audit) log [INF] : from='mgr.34411 ' entity='mgr.np0005559464.aomnqe' Dec 15 04:57:45 localhost ceph-mon[298913]: mon.np0005559462@0(leader) e17 handle_command 
mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005559463.localdomain.devices.0}] v 0) Dec 15 04:57:45 localhost ceph-mon[298913]: log_channel(audit) log [INF] : from='mgr.34411 ' entity='mgr.np0005559464.aomnqe' Dec 15 04:57:45 localhost ceph-mon[298913]: mon.np0005559462@0(leader) e17 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005559462.localdomain.devices.0}] v 0) Dec 15 04:57:45 localhost ceph-mon[298913]: mon.np0005559462@0(leader) e17 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005559463.localdomain}] v 0) Dec 15 04:57:45 localhost ceph-mon[298913]: log_channel(audit) log [INF] : from='mgr.34411 ' entity='mgr.np0005559464.aomnqe' Dec 15 04:57:45 localhost ceph-mon[298913]: log_channel(audit) log [INF] : from='mgr.34411 ' entity='mgr.np0005559464.aomnqe' Dec 15 04:57:45 localhost ceph-mon[298913]: mon.np0005559462@0(leader) e17 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005559462.localdomain}] v 0) Dec 15 04:57:45 localhost ceph-mon[298913]: log_channel(audit) log [INF] : from='mgr.34411 ' entity='mgr.np0005559464.aomnqe' Dec 15 04:57:45 localhost ceph-mon[298913]: mon.np0005559462@0(leader) e17 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/osd_remove_queue}] v 0) Dec 15 04:57:45 localhost ceph-mon[298913]: log_channel(audit) log [INF] : from='mgr.34411 ' entity='mgr.np0005559464.aomnqe' Dec 15 04:57:45 localhost ceph-mon[298913]: mon.np0005559462@0(leader) e17 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/osd_remove_queue}] v 0) Dec 15 04:57:45 localhost ceph-mon[298913]: log_channel(audit) log [INF] : from='mgr.34411 ' entity='mgr.np0005559464.aomnqe' Dec 15 04:57:45 localhost nova_compute[286344]: 2025-12-15 09:57:45.922 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 04:57:45 localhost nova_compute[286344]: 
2025-12-15 09:57:45.926 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 04:57:46 localhost ceph-mon[298913]: Updating np0005559464.localdomain:/var/lib/ceph/bce17446-41b5-5408-a23e-0b011906b44a/config/ceph.client.admin.keyring Dec 15 04:57:46 localhost ceph-mon[298913]: Updating np0005559463.localdomain:/var/lib/ceph/bce17446-41b5-5408-a23e-0b011906b44a/config/ceph.client.admin.keyring Dec 15 04:57:46 localhost ceph-mon[298913]: Updating np0005559462.localdomain:/var/lib/ceph/bce17446-41b5-5408-a23e-0b011906b44a/config/ceph.client.admin.keyring Dec 15 04:57:46 localhost ceph-mon[298913]: from='mgr.34411 ' entity='mgr.np0005559464.aomnqe' Dec 15 04:57:46 localhost ceph-mon[298913]: from='mgr.34411 ' entity='mgr.np0005559464.aomnqe' Dec 15 04:57:46 localhost ceph-mon[298913]: from='mgr.34411 ' entity='mgr.np0005559464.aomnqe' Dec 15 04:57:46 localhost ceph-mon[298913]: from='mgr.34411 ' entity='mgr.np0005559464.aomnqe' Dec 15 04:57:46 localhost ceph-mon[298913]: from='mgr.34411 ' entity='mgr.np0005559464.aomnqe' Dec 15 04:57:46 localhost ceph-mon[298913]: from='mgr.34411 ' entity='mgr.np0005559464.aomnqe' Dec 15 04:57:46 localhost ceph-mon[298913]: from='mgr.34411 ' entity='mgr.np0005559464.aomnqe' Dec 15 04:57:46 localhost ceph-mon[298913]: from='mgr.34411 172.18.0.108:0/929118679' entity='mgr.np0005559464.aomnqe' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch Dec 15 04:57:46 localhost ceph-mon[298913]: from='mgr.34411 ' entity='mgr.np0005559464.aomnqe' Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.122 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359', 'name': 'test', 'flavor': {'id': '2da0e147-aaa7-4bb9-a176-5fe1b15a32a0', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'image': {'id': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e'}, 
'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-00000002', 'OS-EXT-SRV-ATTR:host': 'np0005559462.localdomain', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': 'c785bf23f53946bc99867d8832a50266', 'user_id': '1ba5fce347b64bfebf995f187193f205', 'hostId': 'bf4d507aa00b3fcf643a7e0b883420632ac7fc96e8c7a756982e121d', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228 Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.124 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no new resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193 Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.124 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.latency in the context of pollsters Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.154 12 DEBUG ceilometer.compute.pollsters [-] 39ff1bd9-6f6b-44c8-bbec-a1fd9d196359/disk.device.write.latency volume: 1243487016 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.155 12 DEBUG ceilometer.compute.pollsters [-] 39ff1bd9-6f6b-44c8-bbec-a1fd9d196359/disk.device.write.latency volume: 24779175 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.157 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': 'c01ec855-e59d-4890-8dad-d01e9f8a592c', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 1243487016, 'user_id': '1ba5fce347b64bfebf995f187193f205', 'user_name': None, 'project_id': 'c785bf23f53946bc99867d8832a50266', 'project_name': None, 'resource_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359-vda', 'timestamp': '2025-12-15T09:57:48.124524', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359', 'instance_type': 'm1.small', 'host': 'bf4d507aa00b3fcf643a7e0b883420632ac7fc96e8c7a756982e121d', 'instance_host': 'np0005559462.localdomain', 'flavor': {'id': '2da0e147-aaa7-4bb9-a176-5fe1b15a32a0', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e'}, 'image_ref': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '82d4278c-d99c-11f0-817e-fa163ebaca0f', 'monotonic_time': 11734.3171933, 'message_signature': '5b054ba815b160fff08647546764f4df8420d32195ea6bc255398b397d3b64f0'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 24779175, 'user_id': '1ba5fce347b64bfebf995f187193f205', 'user_name': None, 'project_id': 'c785bf23f53946bc99867d8832a50266', 'project_name': None, 'resource_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359-vdb', 'timestamp': '2025-12-15T09:57:48.124524', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 
'39ff1bd9-6f6b-44c8-bbec-a1fd9d196359', 'instance_type': 'm1.small', 'host': 'bf4d507aa00b3fcf643a7e0b883420632ac7fc96e8c7a756982e121d', 'instance_host': 'np0005559462.localdomain', 'flavor': {'id': '2da0e147-aaa7-4bb9-a176-5fe1b15a32a0', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e'}, 'image_ref': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '82d43a74-d99c-11f0-817e-fa163ebaca0f', 'monotonic_time': 11734.3171933, 'message_signature': '8d384b385910218d61e64c8c258531f17dee315ec48246d7fdaa425e7038458d'}]}, 'timestamp': '2025-12-15 09:57:48.156031', '_unique_id': '51120c2885de4987be585df407d4214d'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.157 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.157 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.157 12 ERROR oslo_messaging.notify.messaging yield Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.157 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.157 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.157 12 ERROR 
oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.157 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.157 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.157 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.157 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.157 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.157 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.157 12 ERROR oslo_messaging.notify.messaging conn.connect() Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.157 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.157 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.157 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Dec 15 
04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.157 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.157 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.157 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.157 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.157 12 ERROR oslo_messaging.notify.messaging Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.157 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.157 12 ERROR oslo_messaging.notify.messaging Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.157 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.157 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.157 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.157 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Dec 15 04:57:48 localhost 
ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.157 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.157 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.157 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.157 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.157 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.157 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.157 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.157 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.157 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.157 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, 
in get Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.157 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.157 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.157 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.157 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.157 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.157 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.157 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.157 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.157 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.157 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 15 04:57:48 localhost 
ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.157 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.157 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.157 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.157 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.157 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.157 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.157 12 ERROR oslo_messaging.notify.messaging Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.158 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets in the context of pollsters Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.162 12 DEBUG ceilometer.compute.pollsters [-] 39ff1bd9-6f6b-44c8-bbec-a1fd9d196359/network.outgoing.packets volume: 114 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.164 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': 'ca3a5a82-f83f-4866-99f4-50cce63967d6', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 114, 'user_id': '1ba5fce347b64bfebf995f187193f205', 'user_name': None, 'project_id': 'c785bf23f53946bc99867d8832a50266', 'project_name': None, 'resource_id': 'instance-00000002-39ff1bd9-6f6b-44c8-bbec-a1fd9d196359-tap03ef8889-32', 'timestamp': '2025-12-15T09:57:48.159128', 'resource_metadata': {'display_name': 'test', 'name': 'tap03ef8889-32', 'instance_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359', 'instance_type': 'm1.small', 'host': 'bf4d507aa00b3fcf643a7e0b883420632ac7fc96e8c7a756982e121d', 'instance_host': 'np0005559462.localdomain', 'flavor': {'id': '2da0e147-aaa7-4bb9-a176-5fe1b15a32a0', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e'}, 'image_ref': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:74:df:7c', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap03ef8889-32'}, 'message_id': '82d5602a-d99c-11f0-817e-fa163ebaca0f', 'monotonic_time': 11734.35181457, 'message_signature': '78d318007e2be79c303741c230427579c1c694b599b0f3fa47f2f44467cca14e'}]}, 'timestamp': '2025-12-15 09:57:48.163531', '_unique_id': '0aae3f9ebe3a43edbe7156315fea78f2'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.164 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 15 04:57:48 localhost 
ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.164 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.164 12 ERROR oslo_messaging.notify.messaging yield Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.164 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.164 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.164 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.164 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.164 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.164 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.164 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.164 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.164 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.164 12 ERROR oslo_messaging.notify.messaging conn.connect() Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.164 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.164 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.164 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.164 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.164 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.164 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.164 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.164 12 ERROR oslo_messaging.notify.messaging Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.164 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.164 12 ERROR oslo_messaging.notify.messaging Dec 15 04:57:48 localhost 
ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.164 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.164 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.164 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.164 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.164 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.164 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.164 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.164 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.164 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.164 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection 
Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.164 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.164 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.164 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.164 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.164 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.164 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.164 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.164 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.164 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.164 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Dec 15 04:57:48 
localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.164 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.164 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.164 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.164 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.164 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.164 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.164 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.164 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.164 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.164 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.164 12 ERROR oslo_messaging.notify.messaging Dec 15 04:57:48 localhost 
ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.165 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.requests in the context of pollsters Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.165 12 DEBUG ceilometer.compute.pollsters [-] 39ff1bd9-6f6b-44c8-bbec-a1fd9d196359/disk.device.write.requests volume: 47 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.166 12 DEBUG ceilometer.compute.pollsters [-] 39ff1bd9-6f6b-44c8-bbec-a1fd9d196359/disk.device.write.requests volume: 1 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.167 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'e2f1a922-ffb4-4c30-8af7-a9cf76413890', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 47, 'user_id': '1ba5fce347b64bfebf995f187193f205', 'user_name': None, 'project_id': 'c785bf23f53946bc99867d8832a50266', 'project_name': None, 'resource_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359-vda', 'timestamp': '2025-12-15T09:57:48.165768', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359', 'instance_type': 'm1.small', 'host': 'bf4d507aa00b3fcf643a7e0b883420632ac7fc96e8c7a756982e121d', 'instance_host': 'np0005559462.localdomain', 'flavor': {'id': '2da0e147-aaa7-4bb9-a176-5fe1b15a32a0', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 
'7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e'}, 'image_ref': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '82d5cb3c-d99c-11f0-817e-fa163ebaca0f', 'monotonic_time': 11734.3171933, 'message_signature': '6d02b64f97be7b7b8633ff503f7425ca58a44574306b53d3685fe45cfa463b75'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1, 'user_id': '1ba5fce347b64bfebf995f187193f205', 'user_name': None, 'project_id': 'c785bf23f53946bc99867d8832a50266', 'project_name': None, 'resource_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359-vdb', 'timestamp': '2025-12-15T09:57:48.165768', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359', 'instance_type': 'm1.small', 'host': 'bf4d507aa00b3fcf643a7e0b883420632ac7fc96e8c7a756982e121d', 'instance_host': 'np0005559462.localdomain', 'flavor': {'id': '2da0e147-aaa7-4bb9-a176-5fe1b15a32a0', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e'}, 'image_ref': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '82d5dbf4-d99c-11f0-817e-fa163ebaca0f', 'monotonic_time': 11734.3171933, 'message_signature': '0a200353118174fd4cfad88a704dc453ea05e3c357ea4c6adb5b4ccb4961c5d7'}]}, 'timestamp': '2025-12-15 09:57:48.166679', '_unique_id': 'd36ca8f2b7af45a5beab9d99fb54574f'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.167 12 
ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.167 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.167 12 ERROR oslo_messaging.notify.messaging yield Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.167 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.167 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.167 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.167 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.167 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.167 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.167 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.167 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Dec 15 04:57:48 localhost 
ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.167 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.167 12 ERROR oslo_messaging.notify.messaging conn.connect() Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.167 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.167 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.167 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.167 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.167 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.167 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.167 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.167 12 ERROR oslo_messaging.notify.messaging Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.167 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 
2025-12-15 09:57:48.167 12 ERROR oslo_messaging.notify.messaging Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.167 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.167 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.167 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.167 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.167 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.167 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.167 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.167 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.167 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.167 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.167 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.167 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.167 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.167 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.167 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.167 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.167 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.167 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.167 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.167 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.167 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.167 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.167 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.167 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.167 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.167 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.167 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.167 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.167 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.167 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 15 04:57:48 localhost 
ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.167 12 ERROR oslo_messaging.notify.messaging Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.169 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.error in the context of pollsters Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.169 12 DEBUG ceilometer.compute.pollsters [-] 39ff1bd9-6f6b-44c8-bbec-a1fd9d196359/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.170 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'f72b9fd6-c3aa-4731-8837-3e7d9126de20', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '1ba5fce347b64bfebf995f187193f205', 'user_name': None, 'project_id': 'c785bf23f53946bc99867d8832a50266', 'project_name': None, 'resource_id': 'instance-00000002-39ff1bd9-6f6b-44c8-bbec-a1fd9d196359-tap03ef8889-32', 'timestamp': '2025-12-15T09:57:48.169207', 'resource_metadata': {'display_name': 'test', 'name': 'tap03ef8889-32', 'instance_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359', 'instance_type': 'm1.small', 'host': 'bf4d507aa00b3fcf643a7e0b883420632ac7fc96e8c7a756982e121d', 'instance_host': 'np0005559462.localdomain', 'flavor': {'id': '2da0e147-aaa7-4bb9-a176-5fe1b15a32a0', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e'}, 'image_ref': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e', 'image_ref_url': None, 'architecture': 'x86_64', 
'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:74:df:7c', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap03ef8889-32'}, 'message_id': '82d650ac-d99c-11f0-817e-fa163ebaca0f', 'monotonic_time': 11734.35181457, 'message_signature': 'ad3bc3f7ed2219d7120e066612683e0e8a36f8a644e4058f1d225551343e83fc'}]}, 'timestamp': '2025-12-15 09:57:48.169679', '_unique_id': 'ab406315288e4c39b6b562274602f85a'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.170 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.170 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.170 12 ERROR oslo_messaging.notify.messaging yield Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.170 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.170 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.170 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.170 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.170 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.170 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.170 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.170 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.170 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.170 12 ERROR oslo_messaging.notify.messaging conn.connect() Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.170 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.170 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.170 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.170 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.170 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Dec 15 04:57:48 localhost 
ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.170 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.170 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.170 12 ERROR oslo_messaging.notify.messaging Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.170 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.170 12 ERROR oslo_messaging.notify.messaging Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.170 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.170 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.170 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.170 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.170 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.170 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Dec 15 04:57:48 localhost 
ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.170 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.170 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.170 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.170 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.170 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.170 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.170 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.170 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.170 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.170 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Dec 15 04:57:48 localhost 
ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.170 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.170 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.170 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.170 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.170 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.170 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.170 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.170 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.170 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.170 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.170 12 ERROR 
oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.170 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.170 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.170 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.170 12 ERROR oslo_messaging.notify.messaging Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.172 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.latency in the context of pollsters Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.172 12 DEBUG ceilometer.compute.pollsters [-] 39ff1bd9-6f6b-44c8-bbec-a1fd9d196359/disk.device.read.latency volume: 1342134926 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.173 12 DEBUG ceilometer.compute.pollsters [-] 39ff1bd9-6f6b-44c8-bbec-a1fd9d196359/disk.device.read.latency volume: 123356132 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.174 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': '9face88f-4a6c-4016-93f5-2abae3e7fc4f', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 1342134926, 'user_id': '1ba5fce347b64bfebf995f187193f205', 'user_name': None, 'project_id': 'c785bf23f53946bc99867d8832a50266', 'project_name': None, 'resource_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359-vda', 'timestamp': '2025-12-15T09:57:48.172446', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359', 'instance_type': 'm1.small', 'host': 'bf4d507aa00b3fcf643a7e0b883420632ac7fc96e8c7a756982e121d', 'instance_host': 'np0005559462.localdomain', 'flavor': {'id': '2da0e147-aaa7-4bb9-a176-5fe1b15a32a0', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e'}, 'image_ref': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '82d6d0b8-d99c-11f0-817e-fa163ebaca0f', 'monotonic_time': 11734.3171933, 'message_signature': 'f1125c788c0748a00c94f0cd6c45615154063cb8e099497b107cb29fc32b9d1d'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 123356132, 'user_id': '1ba5fce347b64bfebf995f187193f205', 'user_name': None, 'project_id': 'c785bf23f53946bc99867d8832a50266', 'project_name': None, 'resource_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359-vdb', 'timestamp': '2025-12-15T09:57:48.172446', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 
'39ff1bd9-6f6b-44c8-bbec-a1fd9d196359', 'instance_type': 'm1.small', 'host': 'bf4d507aa00b3fcf643a7e0b883420632ac7fc96e8c7a756982e121d', 'instance_host': 'np0005559462.localdomain', 'flavor': {'id': '2da0e147-aaa7-4bb9-a176-5fe1b15a32a0', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e'}, 'image_ref': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '82d6e710-d99c-11f0-817e-fa163ebaca0f', 'monotonic_time': 11734.3171933, 'message_signature': '594aa2e1f52ea76f57fcd1846b011587bfaa83badc1823817802573af8a7bde5'}]}, 'timestamp': '2025-12-15 09:57:48.173650', '_unique_id': 'f9514f7477c84992bd934047454a4164'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.174 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.174 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.174 12 ERROR oslo_messaging.notify.messaging yield Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.174 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.174 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.174 12 ERROR 
oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.174 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.174 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.174 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.174 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.174 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.174 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.174 12 ERROR oslo_messaging.notify.messaging conn.connect() Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.174 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.174 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.174 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Dec 15 
04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.174 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.174 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.174 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.174 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.174 12 ERROR oslo_messaging.notify.messaging Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.174 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.174 12 ERROR oslo_messaging.notify.messaging Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.174 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.174 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.174 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.174 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Dec 15 04:57:48 localhost 
ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.174 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.174 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.174 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.174 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.174 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.174 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.174 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.174 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.174 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.174 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, 
in get Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.174 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.174 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.174 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.174 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.174 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.174 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.174 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.174 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.174 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.174 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 15 04:57:48 localhost 
ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.174 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.174 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.174 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.174 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.174 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.174 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.174 12 ERROR oslo_messaging.notify.messaging Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.176 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.requests in the context of pollsters Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.176 12 DEBUG ceilometer.compute.pollsters [-] 39ff1bd9-6f6b-44c8-bbec-a1fd9d196359/disk.device.read.requests volume: 1272 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.176 12 DEBUG ceilometer.compute.pollsters [-] 39ff1bd9-6f6b-44c8-bbec-a1fd9d196359/disk.device.read.requests volume: 124 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 15 04:57:48 localhost 
ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.178 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'ca72cc7d-2dbc-4849-8225-19f01b05f1d1', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1272, 'user_id': '1ba5fce347b64bfebf995f187193f205', 'user_name': None, 'project_id': 'c785bf23f53946bc99867d8832a50266', 'project_name': None, 'resource_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359-vda', 'timestamp': '2025-12-15T09:57:48.176339', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359', 'instance_type': 'm1.small', 'host': 'bf4d507aa00b3fcf643a7e0b883420632ac7fc96e8c7a756982e121d', 'instance_host': 'np0005559462.localdomain', 'flavor': {'id': '2da0e147-aaa7-4bb9-a176-5fe1b15a32a0', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e'}, 'image_ref': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '82d76884-d99c-11f0-817e-fa163ebaca0f', 'monotonic_time': 11734.3171933, 'message_signature': 'c9fe9b37ab4bcbc14d182f485a7d761a2d9593c7124c8527b01d718201b65d75'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 124, 'user_id': '1ba5fce347b64bfebf995f187193f205', 'user_name': None, 'project_id': 'c785bf23f53946bc99867d8832a50266', 'project_name': None, 'resource_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359-vdb', 
'timestamp': '2025-12-15T09:57:48.176339', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359', 'instance_type': 'm1.small', 'host': 'bf4d507aa00b3fcf643a7e0b883420632ac7fc96e8c7a756982e121d', 'instance_host': 'np0005559462.localdomain', 'flavor': {'id': '2da0e147-aaa7-4bb9-a176-5fe1b15a32a0', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e'}, 'image_ref': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '82d77e50-d99c-11f0-817e-fa163ebaca0f', 'monotonic_time': 11734.3171933, 'message_signature': '89da740418ad2769c5beec95094b25b7fd1d766914076f7626ec071f46997ac5'}]}, 'timestamp': '2025-12-15 09:57:48.177371', '_unique_id': '23fa53c4c5b24cd2b2630d3827c7d2b2'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.178 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.178 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.178 12 ERROR oslo_messaging.notify.messaging yield Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.178 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.178 12 ERROR oslo_messaging.notify.messaging return 
retry_over_time( Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.178 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.178 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.178 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.178 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.178 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.178 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.178 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.178 12 ERROR oslo_messaging.notify.messaging conn.connect() Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.178 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.178 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.178 12 ERROR 
oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.178 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.178 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.178 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.178 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.178 12 ERROR oslo_messaging.notify.messaging Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.178 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.178 12 ERROR oslo_messaging.notify.messaging Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.178 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.178 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.178 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.178 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.178 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.178 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.178 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.178 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.178 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.178 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.178 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.178 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.178 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.178 
12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.178 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.178 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.178 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.178 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.178 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.178 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.178 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.178 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.178 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.178 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.178 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.178 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.178 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.178 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.178 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.178 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.178 12 ERROR oslo_messaging.notify.messaging Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.179 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes in the context of pollsters Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.179 12 DEBUG ceilometer.compute.pollsters [-] 39ff1bd9-6f6b-44c8-bbec-a1fd9d196359/network.incoming.bytes volume: 6809 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.180 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': 'ac79e8b3-6895-44b0-90d9-6f507730732f', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 6809, 'user_id': '1ba5fce347b64bfebf995f187193f205', 'user_name': None, 'project_id': 'c785bf23f53946bc99867d8832a50266', 'project_name': None, 'resource_id': 'instance-00000002-39ff1bd9-6f6b-44c8-bbec-a1fd9d196359-tap03ef8889-32', 'timestamp': '2025-12-15T09:57:48.179561', 'resource_metadata': {'display_name': 'test', 'name': 'tap03ef8889-32', 'instance_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359', 'instance_type': 'm1.small', 'host': 'bf4d507aa00b3fcf643a7e0b883420632ac7fc96e8c7a756982e121d', 'instance_host': 'np0005559462.localdomain', 'flavor': {'id': '2da0e147-aaa7-4bb9-a176-5fe1b15a32a0', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e'}, 'image_ref': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:74:df:7c', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap03ef8889-32'}, 'message_id': '82d7e502-d99c-11f0-817e-fa163ebaca0f', 'monotonic_time': 11734.35181457, 'message_signature': '303967d309bac9b68f8fb494b1ee83d55c7a779f07a223e7f17172c1853099f5'}]}, 'timestamp': '2025-12-15 09:57:48.180057', '_unique_id': '1669da70607240c395ee6bef887442cd'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.180 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 
2025-12-15 09:57:48.180 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.180 12 ERROR oslo_messaging.notify.messaging yield Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.180 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.180 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.180 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.180 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.180 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.180 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.180 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.180 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.180 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.180 12 ERROR oslo_messaging.notify.messaging conn.connect() Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.180 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.180 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.180 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.180 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.180 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.180 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.180 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.180 12 ERROR oslo_messaging.notify.messaging Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.180 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.180 12 ERROR oslo_messaging.notify.messaging Dec 15 04:57:48 localhost 
ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.180 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.180 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.180 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.180 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.180 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.180 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.180 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.180 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.180 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.180 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection 
Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.180 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.180 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.180 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.180 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.180 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.180 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.180 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.180 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.180 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.180 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.180 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.180 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.180 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.180 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.180 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.180 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.180 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.180 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.180 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.180 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.180 12 ERROR oslo_messaging.notify.messaging
Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.182 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.delta in the context of pollsters
Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.182 12 DEBUG ceilometer.compute.pollsters [-] 39ff1bd9-6f6b-44c8-bbec-a1fd9d196359/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.183 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '2d304fea-2210-46c4-9d3f-f3d6d85018b3', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '1ba5fce347b64bfebf995f187193f205', 'user_name': None, 'project_id': 'c785bf23f53946bc99867d8832a50266', 'project_name': None, 'resource_id': 'instance-00000002-39ff1bd9-6f6b-44c8-bbec-a1fd9d196359-tap03ef8889-32', 'timestamp': '2025-12-15T09:57:48.182164', 'resource_metadata': {'display_name': 'test', 'name': 'tap03ef8889-32', 'instance_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359', 'instance_type': 'm1.small', 'host': 'bf4d507aa00b3fcf643a7e0b883420632ac7fc96e8c7a756982e121d', 'instance_host': 'np0005559462.localdomain', 'flavor': {'id': '2da0e147-aaa7-4bb9-a176-5fe1b15a32a0', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e'}, 'image_ref': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:74:df:7c', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap03ef8889-32'}, 'message_id': '82d84aa6-d99c-11f0-817e-fa163ebaca0f', 'monotonic_time': 11734.35181457, 'message_signature': '6c5f1516dc56ecff4399d398c2ff4202e4bd152af5448a5b9dee8b629d981bb9'}]}, 'timestamp': '2025-12-15 09:57:48.182645', '_unique_id': '082dfc4d568e49fa974fdada8a821309'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.183 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.183 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.183 12 ERROR oslo_messaging.notify.messaging yield
Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.183 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.183 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.183 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.183 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.183 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.183 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.183 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.183 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.183 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.183 12 ERROR oslo_messaging.notify.messaging conn.connect()
Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.183 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.183 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.183 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.183 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.183 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.183 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.183 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.183 12 ERROR oslo_messaging.notify.messaging
Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.183 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.183 12 ERROR oslo_messaging.notify.messaging
Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.183 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.183 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.183 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.183 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.183 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.183 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.183 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.183 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.183 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.183 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.183 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.183 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.183 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.183 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.183 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.183 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.183 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.183 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.183 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.183 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.183 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.183 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.183 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.183 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.183 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.183 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.183 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.183 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.183 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.183 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.183 12 ERROR oslo_messaging.notify.messaging
Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.184 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.allocation in the context of pollsters
Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.196 12 DEBUG ceilometer.compute.pollsters [-] 39ff1bd9-6f6b-44c8-bbec-a1fd9d196359/disk.device.allocation volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.197 12 DEBUG ceilometer.compute.pollsters [-] 39ff1bd9-6f6b-44c8-bbec-a1fd9d196359/disk.device.allocation volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.198 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '24eb9520-f129-4733-863a-f2cc53200f32', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '1ba5fce347b64bfebf995f187193f205', 'user_name': None, 'project_id': 'c785bf23f53946bc99867d8832a50266', 'project_name': None, 'resource_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359-vda', 'timestamp': '2025-12-15T09:57:48.184741', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359', 'instance_type': 'm1.small', 'host': 'bf4d507aa00b3fcf643a7e0b883420632ac7fc96e8c7a756982e121d', 'instance_host': 'np0005559462.localdomain', 'flavor': {'id': '2da0e147-aaa7-4bb9-a176-5fe1b15a32a0', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e'}, 'image_ref': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '82da8898-d99c-11f0-817e-fa163ebaca0f', 'monotonic_time': 11734.377421591, 'message_signature': '1d79ee13e99dd3add9a4210bd3e6568448d937656a76afee9f62e7109b001e08'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '1ba5fce347b64bfebf995f187193f205', 'user_name': None, 'project_id': 'c785bf23f53946bc99867d8832a50266', 'project_name': None, 'resource_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359-vdb', 'timestamp': '2025-12-15T09:57:48.184741', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359', 'instance_type': 'm1.small', 'host': 'bf4d507aa00b3fcf643a7e0b883420632ac7fc96e8c7a756982e121d', 'instance_host': 'np0005559462.localdomain', 'flavor': {'id': '2da0e147-aaa7-4bb9-a176-5fe1b15a32a0', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e'}, 'image_ref': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '82daa076-d99c-11f0-817e-fa163ebaca0f', 'monotonic_time': 11734.377421591, 'message_signature': '8b14a3495d4a17fe979e5e4dbe7a61788cba3afa55bf140d97c501769773abdc'}]}, 'timestamp': '2025-12-15 09:57:48.197922', '_unique_id': 'fc1b2279a99b4df3b75f9055b15943e4'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.198 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.198 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.198 12 ERROR oslo_messaging.notify.messaging yield
Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.198 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.198 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.198 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.198 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.198 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.198 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.198 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.198 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.198 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.198 12 ERROR oslo_messaging.notify.messaging conn.connect()
Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.198 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.198 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.198 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.198 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.198 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.198 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.198 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.198 12 ERROR oslo_messaging.notify.messaging
Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.198 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.198 12 ERROR oslo_messaging.notify.messaging
Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.198 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.198 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.198 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.198 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.198 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.198 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.198 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.198 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.198 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.198 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.198 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.198 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.198 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.198 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.198 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.198 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.198 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.198 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.198 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.198 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.198 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.198 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.198 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.198 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.198 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.198 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.198 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.198 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.198 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.198 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.198 12 ERROR oslo_messaging.notify.messaging
Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.201 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.capacity in the context of pollsters
Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.201 12 DEBUG ceilometer.compute.pollsters [-] 39ff1bd9-6f6b-44c8-bbec-a1fd9d196359/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.202 12 DEBUG ceilometer.compute.pollsters [-] 39ff1bd9-6f6b-44c8-bbec-a1fd9d196359/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.204 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '7feabebc-7c8f-4997-bdf8-fd5c6099c0f0', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '1ba5fce347b64bfebf995f187193f205', 'user_name': None, 'project_id': 'c785bf23f53946bc99867d8832a50266', 'project_name': None, 'resource_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359-vda', 'timestamp': '2025-12-15T09:57:48.201548', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359', 'instance_type': 'm1.small', 'host': 'bf4d507aa00b3fcf643a7e0b883420632ac7fc96e8c7a756982e121d', 'instance_host': 'np0005559462.localdomain', 'flavor': {'id': '2da0e147-aaa7-4bb9-a176-5fe1b15a32a0', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e'}, 'image_ref': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '82db44cc-d99c-11f0-817e-fa163ebaca0f', 'monotonic_time': 11734.377421591, 'message_signature': 'f5156ab946e520b903a9098b0bd813d339c1dbec7118efa586d04d2348a52a24'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '1ba5fce347b64bfebf995f187193f205', 'user_name': None, 'project_id': 'c785bf23f53946bc99867d8832a50266', 'project_name': None, 'resource_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359-vdb', 'timestamp': '2025-12-15T09:57:48.201548', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359', 'instance_type': 'm1.small', 'host': 'bf4d507aa00b3fcf643a7e0b883420632ac7fc96e8c7a756982e121d', 'instance_host': 'np0005559462.localdomain', 'flavor': {'id': '2da0e147-aaa7-4bb9-a176-5fe1b15a32a0', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e'}, 'image_ref': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '82db5c14-d99c-11f0-817e-fa163ebaca0f', 'monotonic_time': 11734.377421591, 'message_signature': '33a9b9646d05d9713debf4320ccc25dc120bad600dacc42838367d0be6416331'}]}, 'timestamp': '2025-12-15 09:57:48.202732', '_unique_id': '5182e4d107af4daeb1bddc124a1920b5'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.204 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.204 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.204 12 ERROR oslo_messaging.notify.messaging yield
Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.204 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.204 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.204 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.204 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.204 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.204 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.204 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.204 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.204 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.204 12 ERROR oslo_messaging.notify.messaging conn.connect()
Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.204 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.204 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.204 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.204 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.204 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.204 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.204 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.204 12 ERROR oslo_messaging.notify.messaging
Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.204 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.204 12 ERROR oslo_messaging.notify.messaging
Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.204 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.204 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.204 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.204 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line
134, in _send_notification Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.204 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.204 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.204 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.204 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.204 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.204 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.204 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.204 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.204 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.204 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.204 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.204 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.204 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.204 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.204 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.204 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.204 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.204 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.204 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.204 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", 
line 433, in _ensure_connection Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.204 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.204 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.204 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.204 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.204 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.204 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.204 12 ERROR oslo_messaging.notify.messaging Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.205 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes in the context of pollsters Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.205 12 DEBUG ceilometer.compute.pollsters [-] 39ff1bd9-6f6b-44c8-bbec-a1fd9d196359/network.outgoing.bytes volume: 9770 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.207 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': 'cbecb0b8-b1f7-4349-8e36-b70e10cab003', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 9770, 'user_id': '1ba5fce347b64bfebf995f187193f205', 'user_name': None, 'project_id': 'c785bf23f53946bc99867d8832a50266', 'project_name': None, 'resource_id': 'instance-00000002-39ff1bd9-6f6b-44c8-bbec-a1fd9d196359-tap03ef8889-32', 'timestamp': '2025-12-15T09:57:48.205821', 'resource_metadata': {'display_name': 'test', 'name': 'tap03ef8889-32', 'instance_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359', 'instance_type': 'm1.small', 'host': 'bf4d507aa00b3fcf643a7e0b883420632ac7fc96e8c7a756982e121d', 'instance_host': 'np0005559462.localdomain', 'flavor': {'id': '2da0e147-aaa7-4bb9-a176-5fe1b15a32a0', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e'}, 'image_ref': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:74:df:7c', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap03ef8889-32'}, 'message_id': '82dbe904-d99c-11f0-817e-fa163ebaca0f', 'monotonic_time': 11734.35181457, 'message_signature': '5d202eb3fc66870ca4cb87c95c22d1f684b0907f67b80fd5153aed2e69e90e57'}]}, 'timestamp': '2025-12-15 09:57:48.206430', '_unique_id': '537100e22ecd49c182a4f4bbd6739aeb'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.207 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 
2025-12-15 09:57:48.207 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.207 12 ERROR oslo_messaging.notify.messaging yield Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.207 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.207 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.207 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.207 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.207 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.207 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.207 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.207 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.207 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.207 12 ERROR oslo_messaging.notify.messaging conn.connect() Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.207 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.207 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.207 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.207 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.207 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.207 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.207 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.207 12 ERROR oslo_messaging.notify.messaging Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.207 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.207 12 ERROR oslo_messaging.notify.messaging Dec 15 04:57:48 localhost 
ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.207 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.207 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.207 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.207 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.207 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.207 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.207 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.207 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.207 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.207 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection 
Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.207 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.207 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.207 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.207 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.207 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.207 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.207 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.207 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.207 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.207 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Dec 15 04:57:48 
localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.207 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.207 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.207 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.207 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.207 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.207 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.207 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.207 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.207 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.207 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.207 12 ERROR oslo_messaging.notify.messaging Dec 15 04:57:48 localhost 
ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.208 12 INFO ceilometer.polling.manager [-] Polling pollster memory.usage in the context of pollsters Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.225 12 DEBUG ceilometer.compute.pollsters [-] 39ff1bd9-6f6b-44c8-bbec-a1fd9d196359/memory.usage volume: 51.73828125 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.227 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '72770ac7-ebe5-4f3f-a85d-0ae9c6d3846e', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'memory.usage', 'counter_type': 'gauge', 'counter_unit': 'MB', 'counter_volume': 51.73828125, 'user_id': '1ba5fce347b64bfebf995f187193f205', 'user_name': None, 'project_id': 'c785bf23f53946bc99867d8832a50266', 'project_name': None, 'resource_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359', 'timestamp': '2025-12-15T09:57:48.208811', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359', 'instance_type': 'm1.small', 'host': 'bf4d507aa00b3fcf643a7e0b883420632ac7fc96e8c7a756982e121d', 'instance_host': 'np0005559462.localdomain', 'flavor': {'id': '2da0e147-aaa7-4bb9-a176-5fe1b15a32a0', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e'}, 'image_ref': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0}, 'message_id': '82dee726-d99c-11f0-817e-fa163ebaca0f', 'monotonic_time': 11734.417872373, 
'message_signature': 'd6ffee9bcb32d84329d30fc7a8ed0a485e09d26e55afe53b91650cd10a6ccf99'}]}, 'timestamp': '2025-12-15 09:57:48.225959', '_unique_id': '79cbf884d31e45ecac9ecad1b17e03e8'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.227 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.227 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.227 12 ERROR oslo_messaging.notify.messaging yield Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.227 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.227 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.227 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.227 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.227 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.227 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.227 12 ERROR 
oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.227 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.227 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.227 12 ERROR oslo_messaging.notify.messaging conn.connect() Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.227 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.227 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.227 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.227 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.227 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.227 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.227 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Dec 15 04:57:48 localhost 
ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.227 12 ERROR oslo_messaging.notify.messaging Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.227 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.227 12 ERROR oslo_messaging.notify.messaging Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.227 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.227 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.227 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.227 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.227 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.227 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.227 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.227 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", 
line 653, in _send Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.227 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.227 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.227 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.227 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.227 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.227 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.227 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.227 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.227 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.227 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.227 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.227 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.227 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.227 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.227 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.227 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.227 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.227 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.227 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.227 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Dec 15 04:57:48 localhost 
ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.227 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.227 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.227 12 ERROR oslo_messaging.notify.messaging Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.228 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no new resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193 Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.228 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets in the context of pollsters Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.228 12 DEBUG ceilometer.compute.pollsters [-] 39ff1bd9-6f6b-44c8-bbec-a1fd9d196359/network.incoming.packets volume: 60 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.230 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': 'f665a44d-90ed-43d0-bf90-eba9da9cff18', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 60, 'user_id': '1ba5fce347b64bfebf995f187193f205', 'user_name': None, 'project_id': 'c785bf23f53946bc99867d8832a50266', 'project_name': None, 'resource_id': 'instance-00000002-39ff1bd9-6f6b-44c8-bbec-a1fd9d196359-tap03ef8889-32', 'timestamp': '2025-12-15T09:57:48.228607', 'resource_metadata': {'display_name': 'test', 'name': 'tap03ef8889-32', 'instance_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359', 'instance_type': 'm1.small', 'host': 'bf4d507aa00b3fcf643a7e0b883420632ac7fc96e8c7a756982e121d', 'instance_host': 'np0005559462.localdomain', 'flavor': {'id': '2da0e147-aaa7-4bb9-a176-5fe1b15a32a0', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e'}, 'image_ref': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:74:df:7c', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap03ef8889-32'}, 'message_id': '82df6228-d99c-11f0-817e-fa163ebaca0f', 'monotonic_time': 11734.35181457, 'message_signature': 'b9a76349dbff0a7c00ec0fe4fab1617b278c9c9e8ec6e08cb5a7e5a1c9b1d9a7'}]}, 'timestamp': '2025-12-15 09:57:48.229171', '_unique_id': '9e2a34b5a6a24a3c9c30041ac6f2818b'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.230 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 15 04:57:48 localhost 
ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.230 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.230 12 ERROR oslo_messaging.notify.messaging yield Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.230 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.230 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.230 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.230 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.230 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.230 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.230 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.230 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.230 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.230 12 ERROR oslo_messaging.notify.messaging conn.connect() Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.230 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.230 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.230 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.230 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.230 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.230 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.230 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.230 12 ERROR oslo_messaging.notify.messaging Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.230 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.230 12 ERROR oslo_messaging.notify.messaging Dec 15 04:57:48 localhost 
ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.230 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.230 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.230 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.230 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.230 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.230 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.230 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.230 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.230 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.230 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection 
Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.230 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.230 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.230 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.230 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.230 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.230 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.230 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.230 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.230 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.230 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Dec 15 04:57:48 
localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.230 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.230 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.230 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.230 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.230 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.230 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.230 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.230 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.230 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.230 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.230 12 ERROR oslo_messaging.notify.messaging Dec 15 04:57:48 localhost 
ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.231 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.drop in the context of pollsters Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.231 12 DEBUG ceilometer.compute.pollsters [-] 39ff1bd9-6f6b-44c8-bbec-a1fd9d196359/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.232 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '499936a9-6434-44e8-9e6e-312870cbcfda', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '1ba5fce347b64bfebf995f187193f205', 'user_name': None, 'project_id': 'c785bf23f53946bc99867d8832a50266', 'project_name': None, 'resource_id': 'instance-00000002-39ff1bd9-6f6b-44c8-bbec-a1fd9d196359-tap03ef8889-32', 'timestamp': '2025-12-15T09:57:48.231441', 'resource_metadata': {'display_name': 'test', 'name': 'tap03ef8889-32', 'instance_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359', 'instance_type': 'm1.small', 'host': 'bf4d507aa00b3fcf643a7e0b883420632ac7fc96e8c7a756982e121d', 'instance_host': 'np0005559462.localdomain', 'flavor': {'id': '2da0e147-aaa7-4bb9-a176-5fe1b15a32a0', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e'}, 'image_ref': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:74:df:7c', 
'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap03ef8889-32'}, 'message_id': '82dfcf92-d99c-11f0-817e-fa163ebaca0f', 'monotonic_time': 11734.35181457, 'message_signature': 'bc7b4891676c99cf4dfb3d87c5de0745cf7f8c3ed152f10ed12d59a7d62760c0'}]}, 'timestamp': '2025-12-15 09:57:48.231904', '_unique_id': '283120386ad949d394aa6d56ee9298d9'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.232 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.232 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.232 12 ERROR oslo_messaging.notify.messaging yield Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.232 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.232 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.232 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.232 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.232 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.232 
12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.232 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.232 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.232 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.232 12 ERROR oslo_messaging.notify.messaging conn.connect() Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.232 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.232 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.232 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.232 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.232 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.232 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Dec 15 04:57:48 localhost 
ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.232 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.232 12 ERROR oslo_messaging.notify.messaging Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.232 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.232 12 ERROR oslo_messaging.notify.messaging Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.232 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.232 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.232 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.232 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.232 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.232 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.232 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Dec 15 04:57:48 
localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.232 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.232 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.232 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.232 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.232 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.232 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.232 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.232 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.232 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.232 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) 
Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.232 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.232 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.232 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.232 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.232 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.232 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.232 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.232 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.232 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.232 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.232 12 ERROR 
oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.232 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.232 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.232 12 ERROR oslo_messaging.notify.messaging Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.234 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no new resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193 Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.234 12 INFO ceilometer.polling.manager [-] Polling pollster cpu in the context of pollsters Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.234 12 DEBUG ceilometer.compute.pollsters [-] 39ff1bd9-6f6b-44c8-bbec-a1fd9d196359/cpu volume: 12800000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.235 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': '1a9967f3-9330-41b9-8376-50c4a47adfe4', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 12800000000, 'user_id': '1ba5fce347b64bfebf995f187193f205', 'user_name': None, 'project_id': 'c785bf23f53946bc99867d8832a50266', 'project_name': None, 'resource_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359', 'timestamp': '2025-12-15T09:57:48.234409', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359', 'instance_type': 'm1.small', 'host': 'bf4d507aa00b3fcf643a7e0b883420632ac7fc96e8c7a756982e121d', 'instance_host': 'np0005559462.localdomain', 'flavor': {'id': '2da0e147-aaa7-4bb9-a176-5fe1b15a32a0', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e'}, 'image_ref': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'cpu_number': 1}, 'message_id': '82e043a0-d99c-11f0-817e-fa163ebaca0f', 'monotonic_time': 11734.417872373, 'message_signature': 'cf682c04439f3c30dc80c8c0c097f3afdfcc6f0970c3a1da988b05402a8f5d78'}]}, 'timestamp': '2025-12-15 09:57:48.234860', '_unique_id': 'b45e8f57ea254bd5a457497a402fec25'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.235 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.235 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in 
_reraise_as_library_errors Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.235 12 ERROR oslo_messaging.notify.messaging yield Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.235 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.235 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.235 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.235 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.235 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.235 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.235 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.235 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.235 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.235 12 
ERROR oslo_messaging.notify.messaging conn.connect() Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.235 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.235 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.235 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.235 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.235 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.235 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.235 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.235 12 ERROR oslo_messaging.notify.messaging Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.235 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.235 12 ERROR oslo_messaging.notify.messaging Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.235 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 
2025-12-15 09:57:48.235 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.235 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.235 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.235 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.235 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.235 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.235 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.235 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.235 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.235 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Dec 15 
04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.235 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.235 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.235 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.235 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.235 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.235 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.235 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.235 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.235 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.235 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Dec 15 04:57:48 localhost 
ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.235 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.235 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.235 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.235 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.235 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.235 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.235 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.235 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.235 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.235 12 ERROR oslo_messaging.notify.messaging Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.237 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.usage in the context of pollsters Dec 15 04:57:48 localhost 
ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.237 12 DEBUG ceilometer.compute.pollsters [-] 39ff1bd9-6f6b-44c8-bbec-a1fd9d196359/disk.device.usage volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.237 12 DEBUG ceilometer.compute.pollsters [-] 39ff1bd9-6f6b-44c8-bbec-a1fd9d196359/disk.device.usage volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.239 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '5ed94c00-aff7-4614-a2ee-62f45fa3504f', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '1ba5fce347b64bfebf995f187193f205', 'user_name': None, 'project_id': 'c785bf23f53946bc99867d8832a50266', 'project_name': None, 'resource_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359-vda', 'timestamp': '2025-12-15T09:57:48.237212', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359', 'instance_type': 'm1.small', 'host': 'bf4d507aa00b3fcf643a7e0b883420632ac7fc96e8c7a756982e121d', 'instance_host': 'np0005559462.localdomain', 'flavor': {'id': '2da0e147-aaa7-4bb9-a176-5fe1b15a32a0', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e'}, 'image_ref': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 
'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '82e0b11e-d99c-11f0-817e-fa163ebaca0f', 'monotonic_time': 11734.377421591, 'message_signature': 'c1c77a92f59c0f6ae0cf04a1b87e2e88e70cb9fd9a086f7d666c8877663665b7'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '1ba5fce347b64bfebf995f187193f205', 'user_name': None, 'project_id': 'c785bf23f53946bc99867d8832a50266', 'project_name': None, 'resource_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359-vdb', 'timestamp': '2025-12-15T09:57:48.237212', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359', 'instance_type': 'm1.small', 'host': 'bf4d507aa00b3fcf643a7e0b883420632ac7fc96e8c7a756982e121d', 'instance_host': 'np0005559462.localdomain', 'flavor': {'id': '2da0e147-aaa7-4bb9-a176-5fe1b15a32a0', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e'}, 'image_ref': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '82e0c2da-d99c-11f0-817e-fa163ebaca0f', 'monotonic_time': 11734.377421591, 'message_signature': 'c00f82f862f559ee7015bc4041d45109f98ff0a1071fc6aed3c58f4cf9416e72'}]}, 'timestamp': '2025-12-15 09:57:48.238204', '_unique_id': 'dff3c3cb96a5461e96f6f09695dfcf12'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.239 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.239 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.239 12 ERROR oslo_messaging.notify.messaging yield Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.239 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.239 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.239 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.239 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.239 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.239 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.239 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.239 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.239 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Dec 15 04:57:48 
localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.239 12 ERROR oslo_messaging.notify.messaging conn.connect() Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.239 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.239 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.239 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.239 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.239 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.239 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.239 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.239 12 ERROR oslo_messaging.notify.messaging Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.239 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.239 12 ERROR oslo_messaging.notify.messaging Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.239 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call 
last): Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.239 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.239 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.239 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.239 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.239 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.239 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.239 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.239 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.239 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.239 12 ERROR oslo_messaging.notify.messaging 
return rpc_common.ConnectionContext(self._connection_pool, Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.239 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.239 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.239 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.239 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.239 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.239 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.239 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.239 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.239 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.239 12 ERROR oslo_messaging.notify.messaging 
self.connection.ensure_connection( Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.239 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.239 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.239 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.239 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.239 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.239 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.239 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.239 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.239 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.239 12 ERROR oslo_messaging.notify.messaging Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.240 12 INFO ceilometer.polling.manager [-] Polling pollster 
network.incoming.bytes.delta in the context of pollsters Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.240 12 DEBUG ceilometer.compute.pollsters [-] 39ff1bd9-6f6b-44c8-bbec-a1fd9d196359/network.incoming.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.242 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '4740264f-1f8b-4359-9948-f1619b264166', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '1ba5fce347b64bfebf995f187193f205', 'user_name': None, 'project_id': 'c785bf23f53946bc99867d8832a50266', 'project_name': None, 'resource_id': 'instance-00000002-39ff1bd9-6f6b-44c8-bbec-a1fd9d196359-tap03ef8889-32', 'timestamp': '2025-12-15T09:57:48.240575', 'resource_metadata': {'display_name': 'test', 'name': 'tap03ef8889-32', 'instance_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359', 'instance_type': 'm1.small', 'host': 'bf4d507aa00b3fcf643a7e0b883420632ac7fc96e8c7a756982e121d', 'instance_host': 'np0005559462.localdomain', 'flavor': {'id': '2da0e147-aaa7-4bb9-a176-5fe1b15a32a0', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e'}, 'image_ref': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:74:df:7c', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap03ef8889-32'}, 'message_id': 
'82e134cc-d99c-11f0-817e-fa163ebaca0f', 'monotonic_time': 11734.35181457, 'message_signature': '81b2bdd4e0852542ed07295a1cd637d2b400db7a1846c90c17baa8ec06b095ed'}]}, 'timestamp': '2025-12-15 09:57:48.241116', '_unique_id': '201044441b674b4282a75b254c33362a'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.242 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.242 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.242 12 ERROR oslo_messaging.notify.messaging yield Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.242 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.242 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.242 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.242 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.242 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.242 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Dec 15 04:57:48 
localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.242 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.242 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.242 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.242 12 ERROR oslo_messaging.notify.messaging conn.connect() Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.242 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.242 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.242 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.242 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.242 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.242 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.242 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] 
Connection refused Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.242 12 ERROR oslo_messaging.notify.messaging Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.242 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.242 12 ERROR oslo_messaging.notify.messaging Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.242 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.242 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.242 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.242 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.242 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.242 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.242 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.242 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.242 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.242 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.242 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.242 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.242 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.242 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.242 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.242 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.242 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.242 12 ERROR 
oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.242 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.242 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.242 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.242 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.242 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.242 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.242 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.242 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.242 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.242 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in 
_reraise_as_library_errors Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.242 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.242 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.242 12 ERROR oslo_messaging.notify.messaging Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.243 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.drop in the context of pollsters Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.243 12 DEBUG ceilometer.compute.pollsters [-] 39ff1bd9-6f6b-44c8-bbec-a1fd9d196359/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.244 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': '29f1d09e-0413-4e5b-988a-8cc2b15ba800', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '1ba5fce347b64bfebf995f187193f205', 'user_name': None, 'project_id': 'c785bf23f53946bc99867d8832a50266', 'project_name': None, 'resource_id': 'instance-00000002-39ff1bd9-6f6b-44c8-bbec-a1fd9d196359-tap03ef8889-32', 'timestamp': '2025-12-15T09:57:48.243295', 'resource_metadata': {'display_name': 'test', 'name': 'tap03ef8889-32', 'instance_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359', 'instance_type': 'm1.small', 'host': 'bf4d507aa00b3fcf643a7e0b883420632ac7fc96e8c7a756982e121d', 'instance_host': 'np0005559462.localdomain', 'flavor': {'id': '2da0e147-aaa7-4bb9-a176-5fe1b15a32a0', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e'}, 'image_ref': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:74:df:7c', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap03ef8889-32'}, 'message_id': '82e19e8a-d99c-11f0-817e-fa163ebaca0f', 'monotonic_time': 11734.35181457, 'message_signature': 'aaa69a5870973aed13e6926fa252becc7785ebaf8afb350fd2fb8e58fdfea170'}]}, 'timestamp': '2025-12-15 09:57:48.243803', '_unique_id': '6ec92c2c3e264cdb9fbb048b1777ea3c'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.244 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 15 04:57:48 localhost 
ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.244 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.244 12 ERROR oslo_messaging.notify.messaging yield Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.244 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.244 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.244 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.244 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.244 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.244 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.244 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.244 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.244 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.244 12 ERROR oslo_messaging.notify.messaging conn.connect() Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.244 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.244 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.244 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.244 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.244 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.244 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.244 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.244 12 ERROR oslo_messaging.notify.messaging Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.244 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.244 12 ERROR oslo_messaging.notify.messaging Dec 15 04:57:48 localhost 
ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.244 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.244 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.244 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.244 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.244 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.244 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.244 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.244 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.244 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.244 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection 
Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.244 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.244 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.244 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.244 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.244 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.244 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.244 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.244 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.244 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.244 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Dec 15 04:57:48 
localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.244 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.244 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.244 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.244 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.244 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.244 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.244 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.244 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.244 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.244 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.244 12 ERROR oslo_messaging.notify.messaging Dec 15 04:57:48 localhost 
ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.245 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no new resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193 Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.246 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.bytes in the context of pollsters Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.246 12 DEBUG ceilometer.compute.pollsters [-] 39ff1bd9-6f6b-44c8-bbec-a1fd9d196359/disk.device.write.bytes volume: 389120 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.246 12 DEBUG ceilometer.compute.pollsters [-] 39ff1bd9-6f6b-44c8-bbec-a1fd9d196359/disk.device.write.bytes volume: 512 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.248 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': '62cecbe0-35ec-4c77-bb33-684bfef56800', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 389120, 'user_id': '1ba5fce347b64bfebf995f187193f205', 'user_name': None, 'project_id': 'c785bf23f53946bc99867d8832a50266', 'project_name': None, 'resource_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359-vda', 'timestamp': '2025-12-15T09:57:48.246240', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359', 'instance_type': 'm1.small', 'host': 'bf4d507aa00b3fcf643a7e0b883420632ac7fc96e8c7a756982e121d', 'instance_host': 'np0005559462.localdomain', 'flavor': {'id': '2da0e147-aaa7-4bb9-a176-5fe1b15a32a0', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e'}, 'image_ref': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '82e21194-d99c-11f0-817e-fa163ebaca0f', 'monotonic_time': 11734.3171933, 'message_signature': '3f0364710d809e3c2b36d0d60f3105033a206497b3a64fa286ca70f27ccf591f'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 512, 'user_id': '1ba5fce347b64bfebf995f187193f205', 'user_name': None, 'project_id': 'c785bf23f53946bc99867d8832a50266', 'project_name': None, 'resource_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359-vdb', 'timestamp': '2025-12-15T09:57:48.246240', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359', 
'instance_type': 'm1.small', 'host': 'bf4d507aa00b3fcf643a7e0b883420632ac7fc96e8c7a756982e121d', 'instance_host': 'np0005559462.localdomain', 'flavor': {'id': '2da0e147-aaa7-4bb9-a176-5fe1b15a32a0', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e'}, 'image_ref': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '82e22198-d99c-11f0-817e-fa163ebaca0f', 'monotonic_time': 11734.3171933, 'message_signature': '8e02b716a1d1a7216af767cf16b926422590a3262f60bf793a62b73c3c9f18d5'}]}, 'timestamp': '2025-12-15 09:57:48.247110', '_unique_id': '4ada466bdb6546d691cc551505672a7c'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.248 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.248 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.248 12 ERROR oslo_messaging.notify.messaging yield Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.248 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.248 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.248 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.248 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.248 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.248 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.248 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.248 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.248 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.248 12 ERROR oslo_messaging.notify.messaging conn.connect() Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.248 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.248 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.248 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Dec 15 04:57:48 localhost 
ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.248 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.248 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.248 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.248 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.248 12 ERROR oslo_messaging.notify.messaging Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.248 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.248 12 ERROR oslo_messaging.notify.messaging Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.248 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.248 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.248 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.248 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 
09:57:48.248 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.248 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.248 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.248 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.248 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.248 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.248 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.248 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.248 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.248 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Dec 15 04:57:48 localhost 
ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.248 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.248 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.248 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.248 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.248 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.248 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.248 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.248 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.248 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.248 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 
09:57:48.248 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.248 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.248 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.248 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.248 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.248 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.248 12 ERROR oslo_messaging.notify.messaging Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.249 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.bytes in the context of pollsters Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.249 12 DEBUG ceilometer.compute.pollsters [-] 39ff1bd9-6f6b-44c8-bbec-a1fd9d196359/disk.device.read.bytes volume: 35560448 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.249 12 DEBUG ceilometer.compute.pollsters [-] 39ff1bd9-6f6b-44c8-bbec-a1fd9d196359/disk.device.read.bytes volume: 2154496 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 
09:57:48.251 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'edff37ac-a072-43b4-bbca-37808e36f192', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 35560448, 'user_id': '1ba5fce347b64bfebf995f187193f205', 'user_name': None, 'project_id': 'c785bf23f53946bc99867d8832a50266', 'project_name': None, 'resource_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359-vda', 'timestamp': '2025-12-15T09:57:48.249279', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359', 'instance_type': 'm1.small', 'host': 'bf4d507aa00b3fcf643a7e0b883420632ac7fc96e8c7a756982e121d', 'instance_host': 'np0005559462.localdomain', 'flavor': {'id': '2da0e147-aaa7-4bb9-a176-5fe1b15a32a0', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e'}, 'image_ref': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '82e28822-d99c-11f0-817e-fa163ebaca0f', 'monotonic_time': 11734.3171933, 'message_signature': '73f24f3027e6dd2cb761b1857c43c25bbb853d71d496765f47e830fc504f5a77'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 2154496, 'user_id': '1ba5fce347b64bfebf995f187193f205', 'user_name': None, 'project_id': 'c785bf23f53946bc99867d8832a50266', 'project_name': None, 'resource_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359-vdb', 'timestamp': '2025-12-15T09:57:48.249279', 'resource_metadata': 
{'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359', 'instance_type': 'm1.small', 'host': 'bf4d507aa00b3fcf643a7e0b883420632ac7fc96e8c7a756982e121d', 'instance_host': 'np0005559462.localdomain', 'flavor': {'id': '2da0e147-aaa7-4bb9-a176-5fe1b15a32a0', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e'}, 'image_ref': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '82e299e8-d99c-11f0-817e-fa163ebaca0f', 'monotonic_time': 11734.3171933, 'message_signature': '3d9d91a93d658bea529e5b412d24fad2610fe85308640760557875ced6b42388'}]}, 'timestamp': '2025-12-15 09:57:48.250220', '_unique_id': '12f645c2731c4637961eedf648c05946'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.251 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.251 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.251 12 ERROR oslo_messaging.notify.messaging yield Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.251 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.251 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 15 04:57:48 localhost 
ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.251 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.251 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.251 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.251 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.251 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.251 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.251 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.251 12 ERROR oslo_messaging.notify.messaging conn.connect() Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.251 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.251 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.251 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.251 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.251 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.251 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.251 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.251 12 ERROR oslo_messaging.notify.messaging Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.251 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.251 12 ERROR oslo_messaging.notify.messaging Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.251 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.251 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.251 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.251 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 
134, in _send_notification Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.251 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.251 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.251 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.251 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.251 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.251 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.251 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.251 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.251 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.251 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.251 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.251 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.251 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.251 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.251 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.251 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.251 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.251 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.251 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.251 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", 
line 433, in _ensure_connection Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.251 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.251 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.251 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.251 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.251 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.251 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.251 12 ERROR oslo_messaging.notify.messaging Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.252 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.error in the context of pollsters Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.252 12 DEBUG ceilometer.compute.pollsters [-] 39ff1bd9-6f6b-44c8-bbec-a1fd9d196359/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.254 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': 'befeaab6-3252-4719-afcd-3d15bde399f3', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '1ba5fce347b64bfebf995f187193f205', 'user_name': None, 'project_id': 'c785bf23f53946bc99867d8832a50266', 'project_name': None, 'resource_id': 'instance-00000002-39ff1bd9-6f6b-44c8-bbec-a1fd9d196359-tap03ef8889-32', 'timestamp': '2025-12-15T09:57:48.252577', 'resource_metadata': {'display_name': 'test', 'name': 'tap03ef8889-32', 'instance_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359', 'instance_type': 'm1.small', 'host': 'bf4d507aa00b3fcf643a7e0b883420632ac7fc96e8c7a756982e121d', 'instance_host': 'np0005559462.localdomain', 'flavor': {'id': '2da0e147-aaa7-4bb9-a176-5fe1b15a32a0', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e'}, 'image_ref': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:74:df:7c', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap03ef8889-32'}, 'message_id': '82e30af4-d99c-11f0-817e-fa163ebaca0f', 'monotonic_time': 11734.35181457, 'message_signature': '4bbb4b83d6c7ec142e9dde041eea241b71e75240fde4d2504c2bfdc20b85e7da'}]}, 'timestamp': '2025-12-15 09:57:48.253142', '_unique_id': '43893bd968ae4f1c88e0c643e8152efb'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.254 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 15 04:57:48 localhost 
ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.254 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.254 12 ERROR oslo_messaging.notify.messaging yield Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.254 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.254 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.254 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.254 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.254 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.254 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.254 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.254 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.254 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.254 12 ERROR oslo_messaging.notify.messaging conn.connect() Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.254 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.254 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.254 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.254 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.254 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.254 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.254 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.254 12 ERROR oslo_messaging.notify.messaging Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.254 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.254 12 ERROR oslo_messaging.notify.messaging Dec 15 04:57:48 localhost 
ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.254 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.254 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.254 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.254 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.254 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.254 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.254 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.254 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.254 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.254 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection 
Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.254 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.254 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.254 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.254 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.254 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.254 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.254 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.254 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.254 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.254 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Dec 15 04:57:48 
localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.254 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.254 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.254 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.254 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.254 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.254 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.254 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.254 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.254 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.254 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 15 04:57:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:57:48.254 12 ERROR oslo_messaging.notify.messaging Dec 15 04:57:49 localhost ceph-mon[298913]: 
mon.np0005559462@0(leader) e17 handle_command mon_command([{prefix=config-key set, key=mgr/progress/completed}] v 0) Dec 15 04:57:49 localhost ceph-mon[298913]: log_channel(audit) log [INF] : from='mgr.34411 ' entity='mgr.np0005559464.aomnqe' Dec 15 04:57:50 localhost ceph-mon[298913]: mon.np0005559462@0(leader).osd e90 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Dec 15 04:57:50 localhost ceph-mon[298913]: from='mgr.34411 ' entity='mgr.np0005559464.aomnqe' Dec 15 04:57:50 localhost nova_compute[286344]: 2025-12-15 09:57:50.926 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 04:57:51 localhost ovn_metadata_agent[160585]: 2025-12-15 09:57:51.475 160590 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Dec 15 04:57:51 localhost ovn_metadata_agent[160585]: 2025-12-15 09:57:51.476 160590 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Dec 15 04:57:51 localhost ovn_metadata_agent[160585]: 2025-12-15 09:57:51.476 160590 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Dec 15 04:57:55 localhost ceph-mon[298913]: mon.np0005559462@0(leader).osd e90 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Dec 15 04:57:55 localhost nova_compute[286344]: 2025-12-15 09:57:55.928 286348 
DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 04:57:58 localhost systemd[1]: Started /usr/bin/podman healthcheck run 67ee72fcd99c896f79e8807764b1d0f04e7d45927d06e306c0986394e8ed42a0. Dec 15 04:57:58 localhost systemd[1]: Started /usr/bin/podman healthcheck run 9f0f481c937606298203797a71924b7614a5d9e3f4a8afd837cfa5b9602aedeb. Dec 15 04:57:58 localhost systemd[1]: Started /usr/bin/podman healthcheck run b3d8483ade539500e228c2af9c0c3e647febb5ec7903610a83aea32631007d4a. Dec 15 04:57:58 localhost systemd[299725]: Created slice User Background Tasks Slice. Dec 15 04:57:58 localhost systemd[299725]: Starting Cleanup of User's Temporary Files and Directories... Dec 15 04:57:58 localhost systemd[1]: tmp-crun.Amvp5P.mount: Deactivated successfully. Dec 15 04:57:58 localhost podman[312587]: 2025-12-15 09:57:58.778878365 +0000 UTC m=+0.103767641 container health_status 9f0f481c937606298203797a71924b7614a5d9e3f4a8afd837cfa5b9602aedeb (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, managed_by=edpm_ansible, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd) Dec 15 04:57:58 localhost systemd[299725]: Finished Cleanup of User's Temporary Files and Directories. Dec 15 04:57:58 localhost podman[312586]: 2025-12-15 09:57:58.815659476 +0000 UTC m=+0.140538972 container health_status 67ee72fcd99c896f79e8807764b1d0f04e7d45927d06e306c0986394e8ed42a0 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, maintainer=The Prometheus Authors , managed_by=edpm_ansible, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 
'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter) Dec 15 04:57:58 localhost podman[312586]: 2025-12-15 09:57:58.825316154 +0000 UTC m=+0.150195630 container exec_died 67ee72fcd99c896f79e8807764b1d0f04e7d45927d06e306c0986394e8ed42a0 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible) Dec 15 04:57:58 localhost podman[312587]: 2025-12-15 
09:57:58.83887095 +0000 UTC m=+0.163760236 container exec_died 9f0f481c937606298203797a71924b7614a5d9e3f4a8afd837cfa5b9602aedeb (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=multipathd, container_name=multipathd, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team) Dec 15 04:57:58 localhost systemd[1]: 67ee72fcd99c896f79e8807764b1d0f04e7d45927d06e306c0986394e8ed42a0.service: Deactivated successfully. 
Dec 15 04:57:58 localhost systemd[1]: 9f0f481c937606298203797a71924b7614a5d9e3f4a8afd837cfa5b9602aedeb.service: Deactivated successfully. Dec 15 04:57:58 localhost podman[312588]: 2025-12-15 09:57:58.917124132 +0000 UTC m=+0.238013187 container health_status b3d8483ade539500e228c2af9c0c3e647febb5ec7903610a83aea32631007d4a (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '07f3af15d79276418f54a8dde2fc251064527c3571430e14af77aaa2c01f1193-cb1e9478634a00c8fb5ad9cd7fcd38c7b2a1a85187dfaa618bc715c24c2b15fb'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.schema-version=1.0, container_name=ceilometer_agent_compute, org.label-schema.build-date=20251202, 
org.label-schema.name=CentOS Stream 9 Base Image, config_id=ceilometer_agent_compute, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible) Dec 15 04:57:58 localhost podman[312588]: 2025-12-15 09:57:58.953843821 +0000 UTC m=+0.274732906 container exec_died b3d8483ade539500e228c2af9c0c3e647febb5ec7903610a83aea32631007d4a (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ceilometer_agent_compute, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, container_name=ceilometer_agent_compute, managed_by=edpm_ansible, tcib_managed=true, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '07f3af15d79276418f54a8dde2fc251064527c3571430e14af77aaa2c01f1193-cb1e9478634a00c8fb5ad9cd7fcd38c7b2a1a85187dfaa618bc715c24c2b15fb'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}) Dec 15 04:57:58 localhost systemd[1]: b3d8483ade539500e228c2af9c0c3e647febb5ec7903610a83aea32631007d4a.service: Deactivated successfully. Dec 15 04:58:00 localhost ceph-mon[298913]: mon.np0005559462@0(leader).osd e90 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Dec 15 04:58:00 localhost systemd[1]: Started /usr/bin/podman healthcheck run 730c50020a7d9e2da21d2462d40af20ac303e43f3491f692cbbd87f432ba3e09. Dec 15 04:58:00 localhost systemd[1]: Started /usr/bin/podman healthcheck run ac2eae30a2a7f971398af42ea67b3ed799d4b3341f7688ebcd73f7ca7f66c2a8. Dec 15 04:58:00 localhost podman[312644]: 2025-12-15 09:58:00.758312907 +0000 UTC m=+0.082744137 container health_status ac2eae30a2a7f971398af42ea67b3ed799d4b3341f7688ebcd73f7ca7f66c2a8 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '07f3af15d79276418f54a8dde2fc251064527c3571430e14af77aaa2c01f1193'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, container_name=ovn_controller, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb) Dec 15 04:58:00 localhost podman[312643]: 2025-12-15 09:58:00.805492887 +0000 UTC m=+0.132469728 container health_status 730c50020a7d9e2da21d2462d40af20ac303e43f3491f692cbbd87f432ba3e09 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, com.redhat.component=ubi9-minimal-container, build-date=2025-08-20T13:12:41, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'cb1e9478634a00c8fb5ad9cd7fcd38c7b2a1a85187dfaa618bc715c24c2b15fb'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vcs-type=git, distribution-scope=public, io.buildah.version=1.33.7, io.k8s.display-name=Red Hat Universal Base Image 9 
Minimal, release=1755695350, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=openstack_network_exporter, version=9.6, config_id=openstack_network_exporter, io.openshift.tags=minimal rhel9, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, managed_by=edpm_ansible, url=https://catalog.redhat.com/en/search?searchType=containers, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vendor=Red Hat, Inc., architecture=x86_64, maintainer=Red Hat, Inc., name=ubi9-minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9.) 
Dec 15 04:58:00 localhost podman[312643]: 2025-12-15 09:58:00.820355299 +0000 UTC m=+0.147332170 container exec_died 730c50020a7d9e2da21d2462d40af20ac303e43f3491f692cbbd87f432ba3e09 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, url=https://catalog.redhat.com/en/search?searchType=containers, distribution-scope=public, io.openshift.tags=minimal rhel9, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'cb1e9478634a00c8fb5ad9cd7fcd38c7b2a1a85187dfaa618bc715c24c2b15fb'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.component=ubi9-minimal-container, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., version=9.6, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, architecture=x86_64, maintainer=Red Hat, Inc., io.buildah.version=1.33.7, managed_by=edpm_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., container_name=openstack_network_exporter, build-date=2025-08-20T13:12:41, name=ubi9-minimal, vcs-type=git, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, release=1755695350, config_id=openstack_network_exporter) Dec 15 04:58:00 localhost systemd[1]: 730c50020a7d9e2da21d2462d40af20ac303e43f3491f692cbbd87f432ba3e09.service: Deactivated successfully. 
Dec 15 04:58:00 localhost podman[312644]: 2025-12-15 09:58:00.87113313 +0000 UTC m=+0.195564360 container exec_died ac2eae30a2a7f971398af42ea67b3ed799d4b3341f7688ebcd73f7ca7f66c2a8 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '07f3af15d79276418f54a8dde2fc251064527c3571430e14af77aaa2c01f1193'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251202, config_id=ovn_controller, container_name=ovn_controller) Dec 15 04:58:00 localhost systemd[1]: ac2eae30a2a7f971398af42ea67b3ed799d4b3341f7688ebcd73f7ca7f66c2a8.service: Deactivated successfully. 
Dec 15 04:58:00 localhost nova_compute[286344]: 2025-12-15 09:58:00.929 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 04:58:01 localhost podman[243449]: time="2025-12-15T09:58:01Z" level=info msg="List containers: received `last` parameter - overwriting `limit`" Dec 15 04:58:01 localhost podman[243449]: @ - - [15/Dec/2025:09:58:01 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 154814 "" "Go-http-client/1.1" Dec 15 04:58:01 localhost podman[243449]: @ - - [15/Dec/2025:09:58:01 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 18737 "" "Go-http-client/1.1" Dec 15 04:58:04 localhost nova_compute[286344]: 2025-12-15 09:58:04.266 286348 DEBUG oslo_service.periodic_task [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Dec 15 04:58:04 localhost nova_compute[286344]: 2025-12-15 09:58:04.269 286348 DEBUG oslo_service.periodic_task [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Dec 15 04:58:04 localhost openstack_network_exporter[246484]: ERROR 09:58:04 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd Dec 15 04:58:04 localhost openstack_network_exporter[246484]: ERROR 09:58:04 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd Dec 15 04:58:04 localhost openstack_network_exporter[246484]: ERROR 09:58:04 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server Dec 15 04:58:04 localhost 
openstack_network_exporter[246484]: ERROR 09:58:04 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath Dec 15 04:58:04 localhost openstack_network_exporter[246484]: Dec 15 04:58:04 localhost openstack_network_exporter[246484]: ERROR 09:58:04 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath Dec 15 04:58:04 localhost openstack_network_exporter[246484]: Dec 15 04:58:05 localhost ceph-mon[298913]: mon.np0005559462@0(leader).osd e90 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Dec 15 04:58:05 localhost nova_compute[286344]: 2025-12-15 09:58:05.270 286348 DEBUG oslo_service.periodic_task [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Dec 15 04:58:05 localhost nova_compute[286344]: 2025-12-15 09:58:05.933 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 04:58:07 localhost nova_compute[286344]: 2025-12-15 09:58:07.269 286348 DEBUG oslo_service.periodic_task [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Dec 15 04:58:07 localhost nova_compute[286344]: 2025-12-15 09:58:07.270 286348 DEBUG nova.compute.manager [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m Dec 15 04:58:07 localhost nova_compute[286344]: 2025-12-15 09:58:07.270 286348 DEBUG nova.compute.manager [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Rebuilding the list of instances to heal 
_heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m Dec 15 04:58:07 localhost nova_compute[286344]: 2025-12-15 09:58:07.924 286348 DEBUG oslo_concurrency.lockutils [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Acquiring lock "refresh_cache-39ff1bd9-6f6b-44c8-bbec-a1fd9d196359" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m Dec 15 04:58:07 localhost nova_compute[286344]: 2025-12-15 09:58:07.925 286348 DEBUG oslo_concurrency.lockutils [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Acquired lock "refresh_cache-39ff1bd9-6f6b-44c8-bbec-a1fd9d196359" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m Dec 15 04:58:07 localhost nova_compute[286344]: 2025-12-15 09:58:07.925 286348 DEBUG nova.network.neutron [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] [instance: 39ff1bd9-6f6b-44c8-bbec-a1fd9d196359] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m Dec 15 04:58:07 localhost nova_compute[286344]: 2025-12-15 09:58:07.925 286348 DEBUG nova.objects.instance [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Lazy-loading 'info_cache' on Instance uuid 39ff1bd9-6f6b-44c8-bbec-a1fd9d196359 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m Dec 15 04:58:09 localhost nova_compute[286344]: 2025-12-15 09:58:09.219 286348 DEBUG nova.network.neutron [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] [instance: 39ff1bd9-6f6b-44c8-bbec-a1fd9d196359] Updating instance_info_cache with network_info: [{"id": "03ef8889-3216-43fb-8a52-4be17a956ce1", "address": "fa:16:3e:74:df:7c", "network": {"id": "befb7a72-17a9-4bcb-b561-84b8f626685a", "bridge": "br-int", "label": "private", "subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": 
{}}, "ips": [{"address": "192.168.0.201", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.20", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.0.3"}}], "meta": {"injected": false, "tenant_id": "c785bf23f53946bc99867d8832a50266", "mtu": 1292, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap03ef8889-32", "ovs_interfaceid": "03ef8889-3216-43fb-8a52-4be17a956ce1", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m Dec 15 04:58:09 localhost nova_compute[286344]: 2025-12-15 09:58:09.239 286348 DEBUG oslo_concurrency.lockutils [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Releasing lock "refresh_cache-39ff1bd9-6f6b-44c8-bbec-a1fd9d196359" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m Dec 15 04:58:09 localhost nova_compute[286344]: 2025-12-15 09:58:09.240 286348 DEBUG nova.compute.manager [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] [instance: 39ff1bd9-6f6b-44c8-bbec-a1fd9d196359] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m Dec 15 04:58:09 localhost nova_compute[286344]: 2025-12-15 09:58:09.241 286348 DEBUG oslo_service.periodic_task [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Dec 15 04:58:09 localhost nova_compute[286344]: 2025-12-15 09:58:09.241 286348 DEBUG 
oslo_service.periodic_task [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Dec 15 04:58:09 localhost nova_compute[286344]: 2025-12-15 09:58:09.242 286348 DEBUG oslo_service.periodic_task [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Dec 15 04:58:09 localhost nova_compute[286344]: 2025-12-15 09:58:09.242 286348 DEBUG nova.compute.manager [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m Dec 15 04:58:09 localhost nova_compute[286344]: 2025-12-15 09:58:09.270 286348 DEBUG oslo_service.periodic_task [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Dec 15 04:58:09 localhost nova_compute[286344]: 2025-12-15 09:58:09.288 286348 DEBUG oslo_concurrency.lockutils [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Dec 15 04:58:09 localhost nova_compute[286344]: 2025-12-15 09:58:09.288 286348 DEBUG oslo_concurrency.lockutils [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Dec 15 04:58:09 localhost 
nova_compute[286344]: 2025-12-15 09:58:09.289 286348 DEBUG oslo_concurrency.lockutils [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Dec 15 04:58:09 localhost nova_compute[286344]: 2025-12-15 09:58:09.289 286348 DEBUG nova.compute.resource_tracker [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Auditing locally available compute resources for np0005559462.localdomain (node: np0005559462.localdomain) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m Dec 15 04:58:09 localhost nova_compute[286344]: 2025-12-15 09:58:09.289 286348 DEBUG oslo_concurrency.processutils [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m Dec 15 04:58:09 localhost sshd[312708]: main: sshd: ssh-rsa algorithm is disabled Dec 15 04:58:09 localhost systemd[1]: Started /usr/bin/podman healthcheck run 4d46c406a3855fd1c322e5d86c048188ea5865daa07ba23aaebacc7879668923. Dec 15 04:58:09 localhost ceph-mon[298913]: mon.np0005559462@0(leader) e17 handle_command mon_command({"prefix": "df", "format": "json"} v 0) Dec 15 04:58:09 localhost ceph-mon[298913]: log_channel(audit) log [DBG] : from='client.? 
172.18.0.106:0/3510038993' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch Dec 15 04:58:09 localhost nova_compute[286344]: 2025-12-15 09:58:09.742 286348 DEBUG oslo_concurrency.processutils [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.452s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m Dec 15 04:58:09 localhost systemd[1]: tmp-crun.aFbstq.mount: Deactivated successfully. Dec 15 04:58:09 localhost podman[312709]: 2025-12-15 09:58:09.763692818 +0000 UTC m=+0.090391730 container health_status 4d46c406a3855fd1c322e5d86c048188ea5865daa07ba23aaebacc7879668923 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, config_id=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '07f3af15d79276418f54a8dde2fc251064527c3571430e14af77aaa2c01f1193-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb) Dec 15 04:58:09 localhost podman[312709]: 2025-12-15 09:58:09.77857217 +0000 UTC m=+0.105271022 container exec_died 4d46c406a3855fd1c322e5d86c048188ea5865daa07ba23aaebacc7879668923 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, container_name=ovn_metadata_agent, managed_by=edpm_ansible, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '07f3af15d79276418f54a8dde2fc251064527c3571430e14af77aaa2c01f1193-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', 
'/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0) Dec 15 04:58:09 localhost systemd[1]: 4d46c406a3855fd1c322e5d86c048188ea5865daa07ba23aaebacc7879668923.service: Deactivated successfully. Dec 15 04:58:09 localhost nova_compute[286344]: 2025-12-15 09:58:09.844 286348 DEBUG nova.virt.libvirt.driver [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m Dec 15 04:58:09 localhost nova_compute[286344]: 2025-12-15 09:58:09.844 286348 DEBUG nova.virt.libvirt.driver [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m Dec 15 04:58:10 localhost nova_compute[286344]: 2025-12-15 09:58:10.078 286348 WARNING nova.virt.libvirt.driver [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] This host appears to have multiple sockets per NUMA node. 
The `socket` PCI NUMA affinity will not be supported.#033[00m Dec 15 04:58:10 localhost nova_compute[286344]: 2025-12-15 09:58:10.081 286348 DEBUG nova.compute.resource_tracker [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Hypervisor/Node resource view: name=np0005559462.localdomain free_ram=11734MB free_disk=41.836978912353516GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", 
"product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m Dec 15 04:58:10 localhost nova_compute[286344]: 2025-12-15 09:58:10.081 286348 DEBUG oslo_concurrency.lockutils [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Dec 15 04:58:10 localhost nova_compute[286344]: 2025-12-15 09:58:10.082 286348 DEBUG oslo_concurrency.lockutils [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Dec 15 04:58:10 localhost nova_compute[286344]: 2025-12-15 09:58:10.183 286348 DEBUG nova.compute.resource_tracker [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Instance 39ff1bd9-6f6b-44c8-bbec-a1fd9d196359 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 2, 'MEMORY_MB': 512, 'VCPU': 1}}. 
_remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m Dec 15 04:58:10 localhost nova_compute[286344]: 2025-12-15 09:58:10.184 286348 DEBUG nova.compute.resource_tracker [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m Dec 15 04:58:10 localhost nova_compute[286344]: 2025-12-15 09:58:10.184 286348 DEBUG nova.compute.resource_tracker [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Final resource view: name=np0005559462.localdomain phys_ram=15738MB used_ram=1024MB phys_disk=41GB used_disk=2GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m Dec 15 04:58:10 localhost nova_compute[286344]: 2025-12-15 09:58:10.233 286348 DEBUG oslo_concurrency.processutils [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m Dec 15 04:58:10 localhost ceph-mon[298913]: mon.np0005559462@0(leader).osd e90 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Dec 15 04:58:10 localhost ceph-mon[298913]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #37. Immutable memtables: 0. 
Dec 15 04:58:10 localhost ceph-mon[298913]: rocksdb: (Original Log Time 2025/12/15-09:58:10.267386) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0 Dec 15 04:58:10 localhost ceph-mon[298913]: rocksdb: [db/flush_job.cc:856] [default] [JOB 19] Flushing memtable with next log file: 37 Dec 15 04:58:10 localhost ceph-mon[298913]: rocksdb: EVENT_LOG_v1 {"time_micros": 1765792690267480, "job": 19, "event": "flush_started", "num_memtables": 1, "num_entries": 940, "num_deletes": 251, "total_data_size": 1952683, "memory_usage": 2058072, "flush_reason": "Manual Compaction"} Dec 15 04:58:10 localhost ceph-mon[298913]: rocksdb: [db/flush_job.cc:885] [default] [JOB 19] Level-0 flush table #38: started Dec 15 04:58:10 localhost ceph-mon[298913]: rocksdb: EVENT_LOG_v1 {"time_micros": 1765792690282077, "cf_name": "default", "job": 19, "event": "table_file_creation", "file_number": 38, "file_size": 1760556, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 22420, "largest_seqno": 23359, "table_properties": {"data_size": 1756065, "index_size": 2026, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 1413, "raw_key_size": 12051, "raw_average_key_size": 22, "raw_value_size": 1746344, "raw_average_value_size": 3216, "num_data_blocks": 86, "num_entries": 543, "num_filter_entries": 543, "num_deletions": 250, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; 
max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1765792650, "oldest_key_time": 1765792650, "file_creation_time": 1765792690, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "603b24af-e2be-4214-bc56-9e652eb4af3d", "db_session_id": "0OJRM9SCUA16EXV0VQZ2", "orig_file_number": 38, "seqno_to_time_mapping": "N/A"}} Dec 15 04:58:10 localhost ceph-mon[298913]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 19] Flush lasted 14762 microseconds, and 5963 cpu microseconds. Dec 15 04:58:10 localhost ceph-mon[298913]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed. Dec 15 04:58:10 localhost ceph-mon[298913]: rocksdb: (Original Log Time 2025/12/15-09:58:10.282155) [db/flush_job.cc:967] [default] [JOB 19] Level-0 flush table #38: 1760556 bytes OK Dec 15 04:58:10 localhost ceph-mon[298913]: rocksdb: (Original Log Time 2025/12/15-09:58:10.282219) [db/memtable_list.cc:519] [default] Level-0 commit table #38 started Dec 15 04:58:10 localhost ceph-mon[298913]: rocksdb: (Original Log Time 2025/12/15-09:58:10.284154) [db/memtable_list.cc:722] [default] Level-0 commit table #38: memtable #1 done Dec 15 04:58:10 localhost ceph-mon[298913]: rocksdb: (Original Log Time 2025/12/15-09:58:10.284171) EVENT_LOG_v1 {"time_micros": 1765792690284166, "job": 19, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0} Dec 15 04:58:10 localhost ceph-mon[298913]: rocksdb: (Original Log Time 2025/12/15-09:58:10.284192) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25 Dec 15 04:58:10 localhost ceph-mon[298913]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 19] Try to delete WAL files size 1947856, prev total WAL file 
size 1948180, number of live WAL files 2. Dec 15 04:58:10 localhost ceph-mon[298913]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005559462/store.db/000034.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000 Dec 15 04:58:10 localhost ceph-mon[298913]: rocksdb: (Original Log Time 2025/12/15-09:58:10.284847) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '6D6772737461740033373537' seq:72057594037927935, type:22 .. '6D6772737461740034303038' seq:0, type:0; will stop at (end) Dec 15 04:58:10 localhost ceph-mon[298913]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 20] Compacting 1@0 + 1@6 files to L6, score -1.00 Dec 15 04:58:10 localhost ceph-mon[298913]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 19 Base level 0, inputs: [38(1719KB)], [36(17MB)] Dec 15 04:58:10 localhost ceph-mon[298913]: rocksdb: EVENT_LOG_v1 {"time_micros": 1765792690284936, "job": 20, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [38], "files_L6": [36], "score": -1, "input_data_size": 20277770, "oldest_snapshot_seqno": -1} Dec 15 04:58:10 localhost ceph-mon[298913]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 20] Generated table #39: 12332 keys, 18050094 bytes, temperature: kUnknown Dec 15 04:58:10 localhost ceph-mon[298913]: rocksdb: EVENT_LOG_v1 {"time_micros": 1765792690399673, "cf_name": "default", "job": 20, "event": "table_file_creation", "file_number": 39, "file_size": 18050094, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 17981149, "index_size": 37036, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 30853, "raw_key_size": 330157, "raw_average_key_size": 26, "raw_value_size": 
17772316, "raw_average_value_size": 1441, "num_data_blocks": 1411, "num_entries": 12332, "num_filter_entries": 12332, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1765792320, "oldest_key_time": 0, "file_creation_time": 1765792690, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "603b24af-e2be-4214-bc56-9e652eb4af3d", "db_session_id": "0OJRM9SCUA16EXV0VQZ2", "orig_file_number": 39, "seqno_to_time_mapping": "N/A"}} Dec 15 04:58:10 localhost ceph-mon[298913]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed. 
Dec 15 04:58:10 localhost ceph-mon[298913]: rocksdb: (Original Log Time 2025/12/15-09:58:10.400197) [db/compaction/compaction_job.cc:1663] [default] [JOB 20] Compacted 1@0 + 1@6 files to L6 => 18050094 bytes Dec 15 04:58:10 localhost ceph-mon[298913]: rocksdb: (Original Log Time 2025/12/15-09:58:10.417161) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 176.5 rd, 157.1 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(1.7, 17.7 +0.0 blob) out(17.2 +0.0 blob), read-write-amplify(21.8) write-amplify(10.3) OK, records in: 12849, records dropped: 517 output_compression: NoCompression Dec 15 04:58:10 localhost ceph-mon[298913]: rocksdb: (Original Log Time 2025/12/15-09:58:10.417210) EVENT_LOG_v1 {"time_micros": 1765792690417190, "job": 20, "event": "compaction_finished", "compaction_time_micros": 114891, "compaction_time_cpu_micros": 53927, "output_level": 6, "num_output_files": 1, "total_output_size": 18050094, "num_input_records": 12849, "num_output_records": 12332, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]} Dec 15 04:58:10 localhost ceph-mon[298913]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005559462/store.db/000038.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000 Dec 15 04:58:10 localhost ceph-mon[298913]: rocksdb: EVENT_LOG_v1 {"time_micros": 1765792690417750, "job": 20, "event": "table_file_deletion", "file_number": 38} Dec 15 04:58:10 localhost ceph-mon[298913]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005559462/store.db/000036.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000 Dec 15 04:58:10 localhost ceph-mon[298913]: rocksdb: EVENT_LOG_v1 {"time_micros": 1765792690420674, 
"job": 20, "event": "table_file_deletion", "file_number": 36} Dec 15 04:58:10 localhost ceph-mon[298913]: rocksdb: (Original Log Time 2025/12/15-09:58:10.284747) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting Dec 15 04:58:10 localhost ceph-mon[298913]: rocksdb: (Original Log Time 2025/12/15-09:58:10.420720) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting Dec 15 04:58:10 localhost ceph-mon[298913]: rocksdb: (Original Log Time 2025/12/15-09:58:10.420727) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting Dec 15 04:58:10 localhost ceph-mon[298913]: rocksdb: (Original Log Time 2025/12/15-09:58:10.420730) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting Dec 15 04:58:10 localhost ceph-mon[298913]: rocksdb: (Original Log Time 2025/12/15-09:58:10.420734) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting Dec 15 04:58:10 localhost ceph-mon[298913]: rocksdb: (Original Log Time 2025/12/15-09:58:10.420737) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting Dec 15 04:58:10 localhost ceph-mon[298913]: mon.np0005559462@0(leader) e17 handle_command mon_command({"prefix": "df", "format": "json"} v 0) Dec 15 04:58:10 localhost ceph-mon[298913]: log_channel(audit) log [DBG] : from='client.? 
172.18.0.106:0/2648339520' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch Dec 15 04:58:10 localhost nova_compute[286344]: 2025-12-15 09:58:10.722 286348 DEBUG oslo_concurrency.processutils [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.488s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m Dec 15 04:58:10 localhost nova_compute[286344]: 2025-12-15 09:58:10.728 286348 DEBUG nova.compute.provider_tree [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Inventory has not changed in ProviderTree for provider: 26c8956b-6742-4951-b566-971b9bbe323b update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m Dec 15 04:58:10 localhost nova_compute[286344]: 2025-12-15 09:58:10.935 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 04:58:11 localhost nova_compute[286344]: 2025-12-15 09:58:11.038 286348 DEBUG nova.scheduler.client.report [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Inventory has not changed for provider 26c8956b-6742-4951-b566-971b9bbe323b based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m Dec 15 04:58:11 localhost nova_compute[286344]: 2025-12-15 09:58:11.041 286348 DEBUG nova.compute.resource_tracker [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Compute_service record updated for 
np0005559462.localdomain:np0005559462.localdomain _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m Dec 15 04:58:11 localhost nova_compute[286344]: 2025-12-15 09:58:11.041 286348 DEBUG oslo_concurrency.lockutils [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.960s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Dec 15 04:58:12 localhost nova_compute[286344]: 2025-12-15 09:58:12.042 286348 DEBUG oslo_service.periodic_task [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Dec 15 04:58:15 localhost ceph-mon[298913]: mon.np0005559462@0(leader).osd e90 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Dec 15 04:58:15 localhost systemd[1]: Started /usr/bin/podman healthcheck run a4e94bd7aaee6c174c601060fb5f34559019ccea93fc397263ee3735731cfc1e. 
Dec 15 04:58:15 localhost podman[312752]: 2025-12-15 09:58:15.761012064 +0000 UTC m=+0.087442108 container health_status a4e94bd7aaee6c174c601060fb5f34559019ccea93fc397263ee3735731cfc1e (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, managed_by=edpm_ansible, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi ) Dec 15 04:58:15 localhost podman[312752]: 2025-12-15 09:58:15.799529493 +0000 UTC m=+0.125959547 container exec_died a4e94bd7aaee6c174c601060fb5f34559019ccea93fc397263ee3735731cfc1e (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 
'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}) Dec 15 04:58:15 localhost systemd[1]: a4e94bd7aaee6c174c601060fb5f34559019ccea93fc397263ee3735731cfc1e.service: Deactivated successfully. Dec 15 04:58:15 localhost nova_compute[286344]: 2025-12-15 09:58:15.938 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 04:58:20 localhost ceph-mon[298913]: mon.np0005559462@0(leader).osd e90 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Dec 15 04:58:20 localhost nova_compute[286344]: 2025-12-15 09:58:20.942 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 04:58:25 localhost ceph-mon[298913]: mon.np0005559462@0(leader).osd e90 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Dec 15 04:58:25 localhost nova_compute[286344]: 2025-12-15 09:58:25.944 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Dec 15 04:58:29 localhost systemd[1]: Started /usr/bin/podman healthcheck run 67ee72fcd99c896f79e8807764b1d0f04e7d45927d06e306c0986394e8ed42a0. Dec 15 04:58:29 localhost systemd[1]: Started /usr/bin/podman healthcheck run 9f0f481c937606298203797a71924b7614a5d9e3f4a8afd837cfa5b9602aedeb. Dec 15 04:58:29 localhost systemd[1]: Started /usr/bin/podman healthcheck run b3d8483ade539500e228c2af9c0c3e647febb5ec7903610a83aea32631007d4a. 
Dec 15 04:58:29 localhost podman[312775]: 2025-12-15 09:58:29.763259892 +0000 UTC m=+0.088572570 container health_status 67ee72fcd99c896f79e8807764b1d0f04e7d45927d06e306c0986394e8ed42a0 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, managed_by=edpm_ansible, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors ) Dec 15 04:58:29 localhost podman[312775]: 2025-12-15 09:58:29.795393625 +0000 UTC m=+0.120706283 container exec_died 67ee72fcd99c896f79e8807764b1d0f04e7d45927d06e306c0986394e8ed42a0 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', 
'--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter) Dec 15 04:58:29 localhost systemd[1]: 67ee72fcd99c896f79e8807764b1d0f04e7d45927d06e306c0986394e8ed42a0.service: Deactivated successfully. Dec 15 04:58:29 localhost systemd[1]: tmp-crun.3l9LBY.mount: Deactivated successfully. 
Dec 15 04:58:29 localhost podman[312776]: 2025-12-15 09:58:29.862283431 +0000 UTC m=+0.182459356 container health_status 9f0f481c937606298203797a71924b7614a5d9e3f4a8afd837cfa5b9602aedeb (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, config_id=multipathd, org.label-schema.build-date=20251202, container_name=multipathd, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image) Dec 15 04:58:29 localhost podman[312777]: 2025-12-15 09:58:29.885809384 +0000 UTC m=+0.204082236 container health_status 
b3d8483ade539500e228c2af9c0c3e647febb5ec7903610a83aea32631007d4a (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_id=ceilometer_agent_compute, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251202, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '07f3af15d79276418f54a8dde2fc251064527c3571430e14af77aaa2c01f1193-cb1e9478634a00c8fb5ad9cd7fcd38c7b2a1a85187dfaa618bc715c24c2b15fb'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true) Dec 15 04:58:29 localhost podman[312777]: 2025-12-15 09:58:29.896332206 +0000 UTC m=+0.214605058 
container exec_died b3d8483ade539500e228c2af9c0c3e647febb5ec7903610a83aea32631007d4a (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '07f3af15d79276418f54a8dde2fc251064527c3571430e14af77aaa2c01f1193-cb1e9478634a00c8fb5ad9cd7fcd38c7b2a1a85187dfaa618bc715c24c2b15fb'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ceilometer_agent_compute, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image) Dec 15 04:58:29 localhost podman[312776]: 2025-12-15 09:58:29.906891369 +0000 UTC m=+0.227067284 
container exec_died 9f0f481c937606298203797a71924b7614a5d9e3f4a8afd837cfa5b9602aedeb (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, config_id=multipathd, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, container_name=multipathd, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}) Dec 15 04:58:29 localhost systemd[1]: b3d8483ade539500e228c2af9c0c3e647febb5ec7903610a83aea32631007d4a.service: Deactivated successfully. Dec 15 04:58:29 localhost systemd[1]: 9f0f481c937606298203797a71924b7614a5d9e3f4a8afd837cfa5b9602aedeb.service: Deactivated successfully. 
Dec 15 04:58:30 localhost ceph-mon[298913]: mon.np0005559462@0(leader).osd e90 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Dec 15 04:58:30 localhost systemd[1]: tmp-crun.sGqlWu.mount: Deactivated successfully. Dec 15 04:58:30 localhost nova_compute[286344]: 2025-12-15 09:58:30.947 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Dec 15 04:58:31 localhost systemd[1]: Started /usr/bin/podman healthcheck run 730c50020a7d9e2da21d2462d40af20ac303e43f3491f692cbbd87f432ba3e09. Dec 15 04:58:31 localhost systemd[1]: Started /usr/bin/podman healthcheck run ac2eae30a2a7f971398af42ea67b3ed799d4b3341f7688ebcd73f7ca7f66c2a8. Dec 15 04:58:31 localhost podman[312835]: 2025-12-15 09:58:31.752483116 +0000 UTC m=+0.084795485 container health_status 730c50020a7d9e2da21d2462d40af20ac303e43f3491f692cbbd87f432ba3e09 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., build-date=2025-08-20T13:12:41, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.tags=minimal rhel9, distribution-scope=public, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=edpm_ansible, vendor=Red Hat, Inc., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. 
This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, release=1755695350, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., maintainer=Red Hat, Inc., name=ubi9-minimal, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, com.redhat.component=ubi9-minimal-container, architecture=x86_64, io.openshift.expose-services=, container_name=openstack_network_exporter, version=9.6, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'cb1e9478634a00c8fb5ad9cd7fcd38c7b2a1a85187dfaa618bc715c24c2b15fb'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, config_id=openstack_network_exporter, vcs-type=git) Dec 15 04:58:31 localhost podman[312835]: 2025-12-15 09:58:31.768452529 +0000 UTC m=+0.100764908 container exec_died 730c50020a7d9e2da21d2462d40af20ac303e43f3491f692cbbd87f432ba3e09 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, config_data={'command': [], 
'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'cb1e9478634a00c8fb5ad9cd7fcd38c7b2a1a85187dfaa618bc715c24c2b15fb'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, name=ubi9-minimal, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.openshift.tags=minimal rhel9, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., url=https://catalog.redhat.com/en/search?searchType=containers, distribution-scope=public, maintainer=Red Hat, Inc., vendor=Red Hat, Inc., build-date=2025-08-20T13:12:41, vcs-type=git, com.redhat.component=ubi9-minimal-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, managed_by=edpm_ansible, version=9.6, config_id=openstack_network_exporter, architecture=x86_64, io.openshift.expose-services=, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., container_name=openstack_network_exporter, release=1755695350) Dec 15 04:58:31 localhost systemd[1]: 730c50020a7d9e2da21d2462d40af20ac303e43f3491f692cbbd87f432ba3e09.service: Deactivated successfully. 
Dec 15 04:58:31 localhost podman[312836]: 2025-12-15 09:58:31.863566899 +0000 UTC m=+0.190423686 container health_status ac2eae30a2a7f971398af42ea67b3ed799d4b3341f7688ebcd73f7ca7f66c2a8 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, container_name=ovn_controller, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ovn_controller, org.label-schema.license=GPLv2, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '07f3af15d79276418f54a8dde2fc251064527c3571430e14af77aaa2c01f1193'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.build-date=20251202, maintainer=OpenStack Kubernetes Operator team) Dec 15 04:58:31 localhost podman[243449]: time="2025-12-15T09:58:31Z" level=info msg="List containers: received `last` parameter - overwriting `limit`" Dec 15 04:58:31 localhost podman[243449]: @ - - [15/Dec/2025:09:58:31 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 154814 "" "Go-http-client/1.1" Dec 15 04:58:31 localhost podman[312836]: 2025-12-15 
09:58:31.94035433 +0000 UTC m=+0.267211137 container exec_died ac2eae30a2a7f971398af42ea67b3ed799d4b3341f7688ebcd73f7ca7f66c2a8 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '07f3af15d79276418f54a8dde2fc251064527c3571430e14af77aaa2c01f1193'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, tcib_managed=true, container_name=ovn_controller, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ovn_controller) Dec 15 04:58:31 localhost systemd[1]: ac2eae30a2a7f971398af42ea67b3ed799d4b3341f7688ebcd73f7ca7f66c2a8.service: Deactivated successfully. 
Dec 15 04:58:32 localhost podman[243449]: @ - - [15/Dec/2025:09:58:31 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 18729 "" "Go-http-client/1.1" Dec 15 04:58:34 localhost openstack_network_exporter[246484]: ERROR 09:58:34 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd Dec 15 04:58:34 localhost openstack_network_exporter[246484]: ERROR 09:58:34 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd Dec 15 04:58:34 localhost openstack_network_exporter[246484]: ERROR 09:58:34 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server Dec 15 04:58:34 localhost openstack_network_exporter[246484]: ERROR 09:58:34 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath Dec 15 04:58:34 localhost openstack_network_exporter[246484]: Dec 15 04:58:34 localhost openstack_network_exporter[246484]: ERROR 09:58:34 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath Dec 15 04:58:34 localhost openstack_network_exporter[246484]: Dec 15 04:58:35 localhost ceph-mon[298913]: mon.np0005559462@0(leader).osd e90 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Dec 15 04:58:35 localhost nova_compute[286344]: 2025-12-15 09:58:35.950 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Dec 15 04:58:40 localhost ceph-mon[298913]: mon.np0005559462@0(leader).osd e90 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Dec 15 04:58:40 localhost systemd[1]: Started /usr/bin/podman healthcheck run 4d46c406a3855fd1c322e5d86c048188ea5865daa07ba23aaebacc7879668923. 
Dec 15 04:58:40 localhost podman[312881]: 2025-12-15 09:58:40.743099745 +0000 UTC m=+0.068787281 container health_status 4d46c406a3855fd1c322e5d86c048188ea5865daa07ba23aaebacc7879668923 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '07f3af15d79276418f54a8dde2fc251064527c3571430e14af77aaa2c01f1193-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, tcib_managed=true) Dec 15 04:58:40 localhost 
podman[312881]: 2025-12-15 09:58:40.77645597 +0000 UTC m=+0.102143556 container exec_died 4d46c406a3855fd1c322e5d86c048188ea5865daa07ba23aaebacc7879668923 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, config_id=ovn_metadata_agent, org.label-schema.build-date=20251202, tcib_managed=true, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '07f3af15d79276418f54a8dde2fc251064527c3571430e14af77aaa2c01f1193-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.license=GPLv2) Dec 15 04:58:40 localhost systemd[1]: 
4d46c406a3855fd1c322e5d86c048188ea5865daa07ba23aaebacc7879668923.service: Deactivated successfully. Dec 15 04:58:40 localhost nova_compute[286344]: 2025-12-15 09:58:40.953 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 04:58:45 localhost ceph-mon[298913]: mon.np0005559462@0(leader).osd e90 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Dec 15 04:58:45 localhost nova_compute[286344]: 2025-12-15 09:58:45.956 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Dec 15 04:58:46 localhost systemd[1]: Started /usr/bin/podman healthcheck run a4e94bd7aaee6c174c601060fb5f34559019ccea93fc397263ee3735731cfc1e. Dec 15 04:58:46 localhost systemd[1]: tmp-crun.AYZISH.mount: Deactivated successfully. Dec 15 04:58:46 localhost podman[312920]: 2025-12-15 09:58:46.270268101 +0000 UTC m=+0.082635039 container health_status a4e94bd7aaee6c174c601060fb5f34559019ccea93fc397263ee3735731cfc1e (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', 
'/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}) Dec 15 04:58:46 localhost podman[312920]: 2025-12-15 09:58:46.282482922 +0000 UTC m=+0.094849900 container exec_died a4e94bd7aaee6c174c601060fb5f34559019ccea93fc397263ee3735731cfc1e (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible) Dec 15 04:58:46 localhost systemd[1]: a4e94bd7aaee6c174c601060fb5f34559019ccea93fc397263ee3735731cfc1e.service: Deactivated successfully. 
Dec 15 04:58:47 localhost ceph-mon[298913]: mon.np0005559462@0(leader) e17 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/osd_remove_queue}] v 0) Dec 15 04:58:47 localhost ceph-mon[298913]: log_channel(audit) log [INF] : from='mgr.34411 ' entity='mgr.np0005559464.aomnqe' Dec 15 04:58:47 localhost ceph-mon[298913]: from='mgr.34411 172.18.0.108:0/929118679' entity='mgr.np0005559464.aomnqe' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch Dec 15 04:58:47 localhost ceph-mon[298913]: from='mgr.34411 ' entity='mgr.np0005559464.aomnqe' Dec 15 04:58:49 localhost ceph-mon[298913]: mon.np0005559462@0(leader) e17 handle_command mon_command([{prefix=config-key set, key=mgr/progress/completed}] v 0) Dec 15 04:58:49 localhost ceph-mon[298913]: log_channel(audit) log [INF] : from='mgr.34411 ' entity='mgr.np0005559464.aomnqe' Dec 15 04:58:50 localhost ceph-mon[298913]: mon.np0005559462@0(leader).osd e90 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Dec 15 04:58:50 localhost ceph-mon[298913]: from='mgr.34411 ' entity='mgr.np0005559464.aomnqe' Dec 15 04:58:50 localhost nova_compute[286344]: 2025-12-15 09:58:50.958 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 04:58:51 localhost ovn_metadata_agent[160585]: 2025-12-15 09:58:51.476 160590 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Dec 15 04:58:51 localhost ovn_metadata_agent[160585]: 2025-12-15 09:58:51.477 160590 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner 
/usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Dec 15 04:58:51 localhost ovn_metadata_agent[160585]: 2025-12-15 09:58:51.477 160590 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Dec 15 04:58:55 localhost ceph-mon[298913]: mon.np0005559462@0(leader).osd e90 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Dec 15 04:58:55 localhost nova_compute[286344]: 2025-12-15 09:58:55.962 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 04:59:00 localhost ceph-mon[298913]: mon.np0005559462@0(leader).osd e90 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Dec 15 04:59:00 localhost systemd[1]: Started /usr/bin/podman healthcheck run 67ee72fcd99c896f79e8807764b1d0f04e7d45927d06e306c0986394e8ed42a0. Dec 15 04:59:00 localhost systemd[1]: Started /usr/bin/podman healthcheck run 9f0f481c937606298203797a71924b7614a5d9e3f4a8afd837cfa5b9602aedeb. Dec 15 04:59:00 localhost systemd[1]: Started /usr/bin/podman healthcheck run b3d8483ade539500e228c2af9c0c3e647febb5ec7903610a83aea32631007d4a. Dec 15 04:59:00 localhost systemd[1]: tmp-crun.DZkTvB.mount: Deactivated successfully. 
Dec 15 04:59:00 localhost podman[313010]: 2025-12-15 09:59:00.764888142 +0000 UTC m=+0.086861193 container health_status 67ee72fcd99c896f79e8807764b1d0f04e7d45927d06e306c0986394e8ed42a0 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible) Dec 15 04:59:00 localhost podman[313010]: 2025-12-15 09:59:00.776230341 +0000 UTC m=+0.098203342 container exec_died 67ee72fcd99c896f79e8807764b1d0f04e7d45927d06e306c0986394e8ed42a0 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible, config_data={'command': 
['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}) Dec 15 04:59:00 localhost systemd[1]: 67ee72fcd99c896f79e8807764b1d0f04e7d45927d06e306c0986394e8ed42a0.service: Deactivated successfully. Dec 15 04:59:00 localhost systemd[1]: tmp-crun.8f9SeE.mount: Deactivated successfully. 
Dec 15 04:59:00 localhost podman[313012]: 2025-12-15 09:59:00.826376868 +0000 UTC m=+0.145869278 container health_status b3d8483ade539500e228c2af9c0c3e647febb5ec7903610a83aea32631007d4a (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, org.label-schema.vendor=CentOS, container_name=ceilometer_agent_compute, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251202, tcib_managed=true, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '07f3af15d79276418f54a8dde2fc251064527c3571430e14af77aaa2c01f1193-cb1e9478634a00c8fb5ad9cd7fcd38c7b2a1a85187dfaa618bc715c24c2b15fb'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, 
managed_by=edpm_ansible) Dec 15 04:59:00 localhost podman[313012]: 2025-12-15 09:59:00.833480733 +0000 UTC m=+0.152973173 container exec_died b3d8483ade539500e228c2af9c0c3e647febb5ec7903610a83aea32631007d4a (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '07f3af15d79276418f54a8dde2fc251064527c3571430e14af77aaa2c01f1193-cb1e9478634a00c8fb5ad9cd7fcd38c7b2a1a85187dfaa618bc715c24c2b15fb'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_id=ceilometer_agent_compute, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.build-date=20251202, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, 
tcib_build_tag=c3923531bcda0b0811b2d5053f189beb) Dec 15 04:59:00 localhost systemd[1]: b3d8483ade539500e228c2af9c0c3e647febb5ec7903610a83aea32631007d4a.service: Deactivated successfully. Dec 15 04:59:00 localhost podman[313011]: 2025-12-15 09:59:00.922140769 +0000 UTC m=+0.242378378 container health_status 9f0f481c937606298203797a71924b7614a5d9e3f4a8afd837cfa5b9602aedeb (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202, config_id=multipathd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, 
org.label-schema.license=GPLv2, managed_by=edpm_ansible) Dec 15 04:59:00 localhost podman[313011]: 2025-12-15 09:59:00.96152486 +0000 UTC m=+0.281762449 container exec_died 9f0f481c937606298203797a71924b7614a5d9e3f4a8afd837cfa5b9602aedeb (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=multipathd, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb) Dec 15 04:59:00 localhost nova_compute[286344]: 2025-12-15 09:59:00.965 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] 
on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 04:59:00 localhost systemd[1]: 9f0f481c937606298203797a71924b7614a5d9e3f4a8afd837cfa5b9602aedeb.service: Deactivated successfully. Dec 15 04:59:01 localhost podman[243449]: time="2025-12-15T09:59:01Z" level=info msg="List containers: received `last` parameter - overwriting `limit`" Dec 15 04:59:01 localhost podman[243449]: @ - - [15/Dec/2025:09:59:01 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 154814 "" "Go-http-client/1.1" Dec 15 04:59:01 localhost podman[243449]: @ - - [15/Dec/2025:09:59:01 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 18728 "" "Go-http-client/1.1" Dec 15 04:59:01 localhost nova_compute[286344]: 2025-12-15 09:59:01.995 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 04:59:01 localhost ovn_metadata_agent[160585]: 2025-12-15 09:59:01.994 160590 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=7, ssl=[], options={'arp_ns_explicit_output': 'true', 'fdb_removal_limit': '0', 'ignore_lsp_down': 'false', 'mac_binding_removal_limit': '0', 'mac_prefix': 'fe:17:e3', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': 'fe:55:2b:86:15:b5'}, ipsec=False) old=SB_Global(nb_cfg=6) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Dec 15 04:59:01 localhost ovn_metadata_agent[160585]: 2025-12-15 09:59:01.996 160590 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 0 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m Dec 15 04:59:01 localhost 
ovn_metadata_agent[160585]: 2025-12-15 09:59:01.997 160590 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=12d96d64-e862-4f68-81e5-8d9ec5d3a5e2, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '7'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m Dec 15 04:59:02 localhost systemd[1]: Started /usr/bin/podman healthcheck run 730c50020a7d9e2da21d2462d40af20ac303e43f3491f692cbbd87f432ba3e09. Dec 15 04:59:02 localhost systemd[1]: Started /usr/bin/podman healthcheck run ac2eae30a2a7f971398af42ea67b3ed799d4b3341f7688ebcd73f7ca7f66c2a8. Dec 15 04:59:02 localhost podman[313073]: 2025-12-15 09:59:02.760886284 +0000 UTC m=+0.086543195 container health_status ac2eae30a2a7f971398af42ea67b3ed799d4b3341f7688ebcd73f7ca7f66c2a8 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, tcib_managed=true, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '07f3af15d79276418f54a8dde2fc251064527c3571430e14af77aaa2c01f1193'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image) Dec 15 04:59:02 localhost podman[313072]: 2025-12-15 09:59:02.82395755 +0000 UTC m=+0.150316538 container health_status 730c50020a7d9e2da21d2462d40af20ac303e43f3491f692cbbd87f432ba3e09 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, io.buildah.version=1.33.7, distribution-scope=public, release=1755695350, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., maintainer=Red Hat, Inc., managed_by=edpm_ansible, io.openshift.expose-services=, io.openshift.tags=minimal rhel9, build-date=2025-08-20T13:12:41, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=openstack_network_exporter, version=9.6, name=ubi9-minimal, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'cb1e9478634a00c8fb5ad9cd7fcd38c7b2a1a85187dfaa618bc715c24c2b15fb'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': 
['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, architecture=x86_64, com.redhat.component=ubi9-minimal-container, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-type=git, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., config_id=openstack_network_exporter) Dec 15 04:59:02 localhost podman[313072]: 2025-12-15 09:59:02.836366425 +0000 UTC m=+0.162725433 container exec_died 730c50020a7d9e2da21d2462d40af20ac303e43f3491f692cbbd87f432ba3e09 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, com.redhat.component=ubi9-minimal-container, url=https://catalog.redhat.com/en/search?searchType=containers, architecture=x86_64, distribution-scope=public, vendor=Red Hat, Inc., maintainer=Red Hat, Inc., name=ubi9-minimal, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'cb1e9478634a00c8fb5ad9cd7fcd38c7b2a1a85187dfaa618bc715c24c2b15fb'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. 
This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., build-date=2025-08-20T13:12:41, release=1755695350, config_id=openstack_network_exporter, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, container_name=openstack_network_exporter, io.openshift.expose-services=, io.openshift.tags=minimal rhel9, vcs-type=git, version=9.6, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=edpm_ansible, io.buildah.version=1.33.7) Dec 15 04:59:02 localhost systemd[1]: 730c50020a7d9e2da21d2462d40af20ac303e43f3491f692cbbd87f432ba3e09.service: Deactivated successfully. 
Dec 15 04:59:02 localhost podman[313073]: 2025-12-15 09:59:02.879353625 +0000 UTC m=+0.205010576 container exec_died ac2eae30a2a7f971398af42ea67b3ed799d4b3341f7688ebcd73f7ca7f66c2a8 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.build-date=20251202, tcib_managed=true, io.buildah.version=1.41.3, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '07f3af15d79276418f54a8dde2fc251064527c3571430e14af77aaa2c01f1193'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, container_name=ovn_controller) Dec 15 04:59:02 localhost systemd[1]: ac2eae30a2a7f971398af42ea67b3ed799d4b3341f7688ebcd73f7ca7f66c2a8.service: Deactivated successfully. 
Dec 15 04:59:04 localhost openstack_network_exporter[246484]: ERROR 09:59:04 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd Dec 15 04:59:04 localhost openstack_network_exporter[246484]: ERROR 09:59:04 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd Dec 15 04:59:04 localhost openstack_network_exporter[246484]: ERROR 09:59:04 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server Dec 15 04:59:04 localhost openstack_network_exporter[246484]: ERROR 09:59:04 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath Dec 15 04:59:04 localhost openstack_network_exporter[246484]: Dec 15 04:59:04 localhost openstack_network_exporter[246484]: ERROR 09:59:04 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath Dec 15 04:59:04 localhost openstack_network_exporter[246484]: Dec 15 04:59:05 localhost nova_compute[286344]: 2025-12-15 09:59:05.271 286348 DEBUG oslo_service.periodic_task [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Dec 15 04:59:05 localhost ceph-mon[298913]: mon.np0005559462@0(leader).osd e90 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Dec 15 04:59:05 localhost ceph-mon[298913]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #40. Immutable memtables: 0. 
Dec 15 04:59:05 localhost ceph-mon[298913]: rocksdb: (Original Log Time 2025/12/15-09:59:05.327074) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0 Dec 15 04:59:05 localhost ceph-mon[298913]: rocksdb: [db/flush_job.cc:856] [default] [JOB 21] Flushing memtable with next log file: 40 Dec 15 04:59:05 localhost ceph-mon[298913]: rocksdb: EVENT_LOG_v1 {"time_micros": 1765792745327182, "job": 21, "event": "flush_started", "num_memtables": 1, "num_entries": 791, "num_deletes": 257, "total_data_size": 569625, "memory_usage": 584624, "flush_reason": "Manual Compaction"} Dec 15 04:59:05 localhost ceph-mon[298913]: rocksdb: [db/flush_job.cc:885] [default] [JOB 21] Level-0 flush table #41: started Dec 15 04:59:05 localhost ceph-mon[298913]: rocksdb: EVENT_LOG_v1 {"time_micros": 1765792745335221, "cf_name": "default", "job": 21, "event": "table_file_creation", "file_number": 41, "file_size": 556638, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 23360, "largest_seqno": 24150, "table_properties": {"data_size": 553000, "index_size": 1429, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 1157, "raw_key_size": 8301, "raw_average_key_size": 18, "raw_value_size": 545556, "raw_average_value_size": 1242, "num_data_blocks": 64, "num_entries": 439, "num_filter_entries": 439, "num_deletions": 257, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; 
max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1765792690, "oldest_key_time": 1765792690, "file_creation_time": 1765792745, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "603b24af-e2be-4214-bc56-9e652eb4af3d", "db_session_id": "0OJRM9SCUA16EXV0VQZ2", "orig_file_number": 41, "seqno_to_time_mapping": "N/A"}} Dec 15 04:59:05 localhost ceph-mon[298913]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 21] Flush lasted 8176 microseconds, and 3056 cpu microseconds. Dec 15 04:59:05 localhost ceph-mon[298913]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed. Dec 15 04:59:05 localhost ceph-mon[298913]: rocksdb: (Original Log Time 2025/12/15-09:59:05.335280) [db/flush_job.cc:967] [default] [JOB 21] Level-0 flush table #41: 556638 bytes OK Dec 15 04:59:05 localhost ceph-mon[298913]: rocksdb: (Original Log Time 2025/12/15-09:59:05.335307) [db/memtable_list.cc:519] [default] Level-0 commit table #41 started Dec 15 04:59:05 localhost ceph-mon[298913]: rocksdb: (Original Log Time 2025/12/15-09:59:05.338294) [db/memtable_list.cc:722] [default] Level-0 commit table #41: memtable #1 done Dec 15 04:59:05 localhost ceph-mon[298913]: rocksdb: (Original Log Time 2025/12/15-09:59:05.338315) EVENT_LOG_v1 {"time_micros": 1765792745338308, "job": 21, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0} Dec 15 04:59:05 localhost ceph-mon[298913]: rocksdb: (Original Log Time 2025/12/15-09:59:05.338340) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25 Dec 15 04:59:05 localhost ceph-mon[298913]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 21] Try to delete WAL files size 565667, prev total WAL file size 
565991, number of live WAL files 2. Dec 15 04:59:05 localhost ceph-mon[298913]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005559462/store.db/000037.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000 Dec 15 04:59:05 localhost ceph-mon[298913]: rocksdb: (Original Log Time 2025/12/15-09:59:05.338929) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '6C6F676D0033373638' seq:72057594037927935, type:22 .. '6C6F676D0034303231' seq:0, type:0; will stop at (end) Dec 15 04:59:05 localhost ceph-mon[298913]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 22] Compacting 1@0 + 1@6 files to L6, score -1.00 Dec 15 04:59:05 localhost ceph-mon[298913]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 21 Base level 0, inputs: [41(543KB)], [39(17MB)] Dec 15 04:59:05 localhost ceph-mon[298913]: rocksdb: EVENT_LOG_v1 {"time_micros": 1765792745339016, "job": 22, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [41], "files_L6": [39], "score": -1, "input_data_size": 18606732, "oldest_snapshot_seqno": -1} Dec 15 04:59:05 localhost ceph-mon[298913]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 22] Generated table #42: 12241 keys, 18492084 bytes, temperature: kUnknown Dec 15 04:59:05 localhost ceph-mon[298913]: rocksdb: EVENT_LOG_v1 {"time_micros": 1765792745476908, "cf_name": "default", "job": 22, "event": "table_file_creation", "file_number": 42, "file_size": 18492084, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 18422405, "index_size": 37962, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 30661, "raw_key_size": 329220, "raw_average_key_size": 26, "raw_value_size": 18213863, 
"raw_average_value_size": 1487, "num_data_blocks": 1448, "num_entries": 12241, "num_filter_entries": 12241, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1765792320, "oldest_key_time": 0, "file_creation_time": 1765792745, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "603b24af-e2be-4214-bc56-9e652eb4af3d", "db_session_id": "0OJRM9SCUA16EXV0VQZ2", "orig_file_number": 42, "seqno_to_time_mapping": "N/A"}} Dec 15 04:59:05 localhost ceph-mon[298913]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed. 
Dec 15 04:59:05 localhost ceph-mon[298913]: rocksdb: (Original Log Time 2025/12/15-09:59:05.477328) [db/compaction/compaction_job.cc:1663] [default] [JOB 22] Compacted 1@0 + 1@6 files to L6 => 18492084 bytes Dec 15 04:59:05 localhost ceph-mon[298913]: rocksdb: (Original Log Time 2025/12/15-09:59:05.479033) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 134.7 rd, 133.9 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(0.5, 17.2 +0.0 blob) out(17.6 +0.0 blob), read-write-amplify(66.6) write-amplify(33.2) OK, records in: 12771, records dropped: 530 output_compression: NoCompression Dec 15 04:59:05 localhost ceph-mon[298913]: rocksdb: (Original Log Time 2025/12/15-09:59:05.479064) EVENT_LOG_v1 {"time_micros": 1765792745479051, "job": 22, "event": "compaction_finished", "compaction_time_micros": 138093, "compaction_time_cpu_micros": 50571, "output_level": 6, "num_output_files": 1, "total_output_size": 18492084, "num_input_records": 12771, "num_output_records": 12241, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]} Dec 15 04:59:05 localhost ceph-mon[298913]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005559462/store.db/000041.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000 Dec 15 04:59:05 localhost ceph-mon[298913]: rocksdb: EVENT_LOG_v1 {"time_micros": 1765792745479434, "job": 22, "event": "table_file_deletion", "file_number": 41} Dec 15 04:59:05 localhost ceph-mon[298913]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005559462/store.db/000039.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000 Dec 15 04:59:05 localhost ceph-mon[298913]: rocksdb: EVENT_LOG_v1 {"time_micros": 1765792745482111, 
"job": 22, "event": "table_file_deletion", "file_number": 39} Dec 15 04:59:05 localhost ceph-mon[298913]: rocksdb: (Original Log Time 2025/12/15-09:59:05.338821) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting Dec 15 04:59:05 localhost ceph-mon[298913]: rocksdb: (Original Log Time 2025/12/15-09:59:05.482227) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting Dec 15 04:59:05 localhost ceph-mon[298913]: rocksdb: (Original Log Time 2025/12/15-09:59:05.482232) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting Dec 15 04:59:05 localhost ceph-mon[298913]: rocksdb: (Original Log Time 2025/12/15-09:59:05.482234) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting Dec 15 04:59:05 localhost ceph-mon[298913]: rocksdb: (Original Log Time 2025/12/15-09:59:05.482236) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting Dec 15 04:59:05 localhost ceph-mon[298913]: rocksdb: (Original Log Time 2025/12/15-09:59:05.482238) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting Dec 15 04:59:05 localhost nova_compute[286344]: 2025-12-15 09:59:05.969 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 04:59:06 localhost nova_compute[286344]: 2025-12-15 09:59:06.266 286348 DEBUG oslo_service.periodic_task [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Dec 15 04:59:06 localhost nova_compute[286344]: 2025-12-15 09:59:06.269 286348 DEBUG oslo_service.periodic_task [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks 
/usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Dec 15 04:59:06 localhost neutron_dhcp_agent[267542]: 2025-12-15 09:59:06.820 267546 INFO oslo.privsep.daemon [None req-e9823adb-8b16-4b24-9ec0-6ffb388708e3 - - - - - -] Running privsep helper: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'privsep-helper', '--config-file', '/etc/neutron/neutron.conf', '--config-dir', '/etc/neutron.conf.d', '--privsep_context', 'neutron.privileged.default', '--privsep_sock_path', '/tmp/tmpgn33bagj/privsep.sock']#033[00m Dec 15 04:59:07 localhost neutron_dhcp_agent[267542]: 2025-12-15 09:59:07.413 267546 INFO oslo.privsep.daemon [None req-e9823adb-8b16-4b24-9ec0-6ffb388708e3 - - - - - -] Spawned new privsep daemon via rootwrap#033[00m Dec 15 04:59:07 localhost neutron_dhcp_agent[267542]: 2025-12-15 09:59:07.319 313119 INFO oslo.privsep.daemon [-] privsep daemon starting#033[00m Dec 15 04:59:07 localhost neutron_dhcp_agent[267542]: 2025-12-15 09:59:07.324 313119 INFO oslo.privsep.daemon [-] privsep process running with uid/gid: 0/0#033[00m Dec 15 04:59:07 localhost neutron_dhcp_agent[267542]: 2025-12-15 09:59:07.327 313119 INFO oslo.privsep.daemon [-] privsep process running with capabilities (eff/prm/inh): CAP_DAC_OVERRIDE|CAP_DAC_READ_SEARCH|CAP_NET_ADMIN|CAP_SYS_ADMIN|CAP_SYS_PTRACE/CAP_DAC_OVERRIDE|CAP_DAC_READ_SEARCH|CAP_NET_ADMIN|CAP_SYS_ADMIN|CAP_SYS_PTRACE/none#033[00m Dec 15 04:59:07 localhost neutron_dhcp_agent[267542]: 2025-12-15 09:59:07.328 313119 INFO oslo.privsep.daemon [-] privsep daemon running as pid 313119#033[00m Dec 15 04:59:07 localhost neutron_dhcp_agent[267542]: 2025-12-15 09:59:07.893 267546 INFO oslo.privsep.daemon [None req-e9823adb-8b16-4b24-9ec0-6ffb388708e3 - - - - - -] Running privsep helper: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'privsep-helper', '--config-file', '/etc/neutron/neutron.conf', '--config-dir', '/etc/neutron.conf.d', '--privsep_context', 'neutron.privileged.namespace_cmd', 
'--privsep_sock_path', '/tmp/tmpn4xaecke/privsep.sock']#033[00m Dec 15 04:59:08 localhost nova_compute[286344]: 2025-12-15 09:59:08.270 286348 DEBUG oslo_service.periodic_task [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Dec 15 04:59:08 localhost nova_compute[286344]: 2025-12-15 09:59:08.270 286348 DEBUG nova.compute.manager [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m Dec 15 04:59:08 localhost nova_compute[286344]: 2025-12-15 09:59:08.270 286348 DEBUG nova.compute.manager [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m Dec 15 04:59:08 localhost neutron_dhcp_agent[267542]: 2025-12-15 09:59:08.529 267546 INFO oslo.privsep.daemon [None req-e9823adb-8b16-4b24-9ec0-6ffb388708e3 - - - - - -] Spawned new privsep daemon via rootwrap#033[00m Dec 15 04:59:08 localhost neutron_dhcp_agent[267542]: 2025-12-15 09:59:08.443 313128 INFO oslo.privsep.daemon [-] privsep daemon starting#033[00m Dec 15 04:59:08 localhost neutron_dhcp_agent[267542]: 2025-12-15 09:59:08.446 313128 INFO oslo.privsep.daemon [-] privsep process running with uid/gid: 0/0#033[00m Dec 15 04:59:08 localhost neutron_dhcp_agent[267542]: 2025-12-15 09:59:08.448 313128 INFO oslo.privsep.daemon [-] privsep process running with capabilities (eff/prm/inh): CAP_SYS_ADMIN/CAP_SYS_ADMIN/none#033[00m Dec 15 04:59:08 localhost neutron_dhcp_agent[267542]: 2025-12-15 09:59:08.449 313128 INFO oslo.privsep.daemon [-] privsep daemon running as pid 313128#033[00m Dec 15 04:59:08 localhost nova_compute[286344]: 2025-12-15 09:59:08.567 286348 DEBUG 
oslo_concurrency.lockutils [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Acquiring lock "refresh_cache-39ff1bd9-6f6b-44c8-bbec-a1fd9d196359" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m Dec 15 04:59:08 localhost nova_compute[286344]: 2025-12-15 09:59:08.568 286348 DEBUG oslo_concurrency.lockutils [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Acquired lock "refresh_cache-39ff1bd9-6f6b-44c8-bbec-a1fd9d196359" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m Dec 15 04:59:08 localhost nova_compute[286344]: 2025-12-15 09:59:08.568 286348 DEBUG nova.network.neutron [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] [instance: 39ff1bd9-6f6b-44c8-bbec-a1fd9d196359] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m Dec 15 04:59:08 localhost nova_compute[286344]: 2025-12-15 09:59:08.569 286348 DEBUG nova.objects.instance [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Lazy-loading 'info_cache' on Instance uuid 39ff1bd9-6f6b-44c8-bbec-a1fd9d196359 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m Dec 15 04:59:09 localhost nova_compute[286344]: 2025-12-15 09:59:09.177 286348 DEBUG nova.network.neutron [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] [instance: 39ff1bd9-6f6b-44c8-bbec-a1fd9d196359] Updating instance_info_cache with network_info: [{"id": "03ef8889-3216-43fb-8a52-4be17a956ce1", "address": "fa:16:3e:74:df:7c", "network": {"id": "befb7a72-17a9-4bcb-b561-84b8f626685a", "bridge": "br-int", "label": "private", "subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.201", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.20", "type": "floating", "version": 4, "meta": {}}]}], 
"routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.0.3"}}], "meta": {"injected": false, "tenant_id": "c785bf23f53946bc99867d8832a50266", "mtu": 1292, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap03ef8889-32", "ovs_interfaceid": "03ef8889-3216-43fb-8a52-4be17a956ce1", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m Dec 15 04:59:09 localhost nova_compute[286344]: 2025-12-15 09:59:09.190 286348 DEBUG oslo_concurrency.lockutils [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Releasing lock "refresh_cache-39ff1bd9-6f6b-44c8-bbec-a1fd9d196359" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m Dec 15 04:59:09 localhost nova_compute[286344]: 2025-12-15 09:59:09.191 286348 DEBUG nova.compute.manager [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] [instance: 39ff1bd9-6f6b-44c8-bbec-a1fd9d196359] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m Dec 15 04:59:09 localhost nova_compute[286344]: 2025-12-15 09:59:09.192 286348 DEBUG oslo_service.periodic_task [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Dec 15 04:59:09 localhost nova_compute[286344]: 2025-12-15 09:59:09.270 286348 DEBUG oslo_service.periodic_task [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks 
/usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Dec 15 04:59:09 localhost nova_compute[286344]: 2025-12-15 09:59:09.270 286348 DEBUG nova.compute.manager [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m Dec 15 04:59:09 localhost neutron_dhcp_agent[267542]: 2025-12-15 09:59:09.511 267546 INFO oslo.privsep.daemon [None req-e9823adb-8b16-4b24-9ec0-6ffb388708e3 - - - - - -] Running privsep helper: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'privsep-helper', '--config-file', '/etc/neutron/neutron.conf', '--config-dir', '/etc/neutron.conf.d', '--privsep_context', 'neutron.privileged.link_cmd', '--privsep_sock_path', '/tmp/tmpbfik8fre/privsep.sock']#033[00m Dec 15 04:59:09 localhost ceph-mon[298913]: mon.np0005559462@0(leader).osd e90 do_prune osdmap full prune enabled Dec 15 04:59:09 localhost ceph-mon[298913]: mon.np0005559462@0(leader).osd e91 e91: 6 total, 6 up, 6 in Dec 15 04:59:09 localhost ceph-mon[298913]: log_channel(cluster) log [DBG] : osdmap e91: 6 total, 6 up, 6 in Dec 15 04:59:10 localhost neutron_dhcp_agent[267542]: 2025-12-15 09:59:10.150 267546 INFO oslo.privsep.daemon [None req-e9823adb-8b16-4b24-9ec0-6ffb388708e3 - - - - - -] Spawned new privsep daemon via rootwrap#033[00m Dec 15 04:59:10 localhost neutron_dhcp_agent[267542]: 2025-12-15 09:59:10.040 313140 INFO oslo.privsep.daemon [-] privsep daemon starting#033[00m Dec 15 04:59:10 localhost neutron_dhcp_agent[267542]: 2025-12-15 09:59:10.045 313140 INFO oslo.privsep.daemon [-] privsep process running with uid/gid: 0/0#033[00m Dec 15 04:59:10 localhost neutron_dhcp_agent[267542]: 2025-12-15 09:59:10.047 313140 INFO oslo.privsep.daemon [-] privsep process running with capabilities (eff/prm/inh): CAP_NET_ADMIN|CAP_SYS_ADMIN/CAP_NET_ADMIN|CAP_SYS_ADMIN/none#033[00m Dec 15 04:59:10 localhost 
neutron_dhcp_agent[267542]: 2025-12-15 09:59:10.047 313140 INFO oslo.privsep.daemon [-] privsep daemon running as pid 313140#033[00m Dec 15 04:59:10 localhost nova_compute[286344]: 2025-12-15 09:59:10.266 286348 DEBUG oslo_service.periodic_task [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Dec 15 04:59:10 localhost ceph-mon[298913]: mon.np0005559462@0(leader).osd e91 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Dec 15 04:59:10 localhost nova_compute[286344]: 2025-12-15 09:59:10.578 286348 DEBUG oslo_service.periodic_task [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Dec 15 04:59:10 localhost nova_compute[286344]: 2025-12-15 09:59:10.578 286348 DEBUG oslo_service.periodic_task [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Dec 15 04:59:10 localhost nova_compute[286344]: 2025-12-15 09:59:10.579 286348 DEBUG oslo_service.periodic_task [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Dec 15 04:59:10 localhost nova_compute[286344]: 2025-12-15 09:59:10.597 286348 DEBUG oslo_concurrency.lockutils [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Dec 
15 04:59:10 localhost nova_compute[286344]: 2025-12-15 09:59:10.598 286348 DEBUG oslo_concurrency.lockutils [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Dec 15 04:59:10 localhost nova_compute[286344]: 2025-12-15 09:59:10.599 286348 DEBUG oslo_concurrency.lockutils [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Dec 15 04:59:10 localhost nova_compute[286344]: 2025-12-15 09:59:10.599 286348 DEBUG nova.compute.resource_tracker [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Auditing locally available compute resources for np0005559462.localdomain (node: np0005559462.localdomain) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m Dec 15 04:59:10 localhost nova_compute[286344]: 2025-12-15 09:59:10.599 286348 DEBUG oslo_concurrency.processutils [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m Dec 15 04:59:10 localhost nova_compute[286344]: 2025-12-15 09:59:10.973 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Dec 15 04:59:10 localhost nova_compute[286344]: 2025-12-15 09:59:10.976 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Dec 15 04:59:10 localhost nova_compute[286344]: 2025-12-15 09:59:10.977 286348 DEBUG 
ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5005 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m Dec 15 04:59:10 localhost nova_compute[286344]: 2025-12-15 09:59:10.977 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Dec 15 04:59:11 localhost nova_compute[286344]: 2025-12-15 09:59:11.013 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 04:59:11 localhost nova_compute[286344]: 2025-12-15 09:59:11.016 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Dec 15 04:59:11 localhost ceph-mon[298913]: mon.np0005559462@0(leader) e17 handle_command mon_command({"prefix": "df", "format": "json"} v 0) Dec 15 04:59:11 localhost ceph-mon[298913]: log_channel(audit) log [DBG] : from='client.? 
172.18.0.106:0/259873801' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch Dec 15 04:59:11 localhost nova_compute[286344]: 2025-12-15 09:59:11.127 286348 DEBUG oslo_concurrency.processutils [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.528s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m Dec 15 04:59:11 localhost nova_compute[286344]: 2025-12-15 09:59:11.209 286348 DEBUG nova.virt.libvirt.driver [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m Dec 15 04:59:11 localhost nova_compute[286344]: 2025-12-15 09:59:11.210 286348 DEBUG nova.virt.libvirt.driver [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m Dec 15 04:59:11 localhost nova_compute[286344]: 2025-12-15 09:59:11.409 286348 WARNING nova.virt.libvirt.driver [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] This host appears to have multiple sockets per NUMA node. 
The `socket` PCI NUMA affinity will not be supported.#033[00m Dec 15 04:59:11 localhost nova_compute[286344]: 2025-12-15 09:59:11.410 286348 DEBUG nova.compute.resource_tracker [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Hypervisor/Node resource view: name=np0005559462.localdomain free_ram=11489MB free_disk=41.836978912353516GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", 
"product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m Dec 15 04:59:11 localhost nova_compute[286344]: 2025-12-15 09:59:11.411 286348 DEBUG oslo_concurrency.lockutils [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Dec 15 04:59:11 localhost nova_compute[286344]: 2025-12-15 09:59:11.411 286348 DEBUG oslo_concurrency.lockutils [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Dec 15 04:59:11 localhost nova_compute[286344]: 2025-12-15 09:59:11.475 286348 DEBUG nova.compute.resource_tracker [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Instance 39ff1bd9-6f6b-44c8-bbec-a1fd9d196359 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 2, 'MEMORY_MB': 512, 'VCPU': 1}}. 
_remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m Dec 15 04:59:11 localhost nova_compute[286344]: 2025-12-15 09:59:11.476 286348 DEBUG nova.compute.resource_tracker [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m Dec 15 04:59:11 localhost nova_compute[286344]: 2025-12-15 09:59:11.476 286348 DEBUG nova.compute.resource_tracker [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Final resource view: name=np0005559462.localdomain phys_ram=15738MB used_ram=1024MB phys_disk=41GB used_disk=2GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m Dec 15 04:59:11 localhost nova_compute[286344]: 2025-12-15 09:59:11.506 286348 DEBUG oslo_concurrency.processutils [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m Dec 15 04:59:11 localhost neutron_dhcp_agent[267542]: 2025-12-15 09:59:11.565 267546 INFO neutron.agent.linux.ip_lib [None req-e9823adb-8b16-4b24-9ec0-6ffb388708e3 - - - - - -] Device tap5dec1706-d0 cannot be used as it has no MAC address#033[00m Dec 15 04:59:11 localhost nova_compute[286344]: 2025-12-15 09:59:11.612 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 04:59:11 localhost kernel: device tap5dec1706-d0 entered promiscuous mode Dec 15 04:59:11 localhost NetworkManager[5963]: [1765792751.6211] manager: (tap5dec1706-d0): new Generic device (/org/freedesktop/NetworkManager/Devices/17) Dec 15 04:59:11 localhost systemd-udevd[313177]: Network interface 
NamePolicy= disabled on kernel command line. Dec 15 04:59:11 localhost nova_compute[286344]: 2025-12-15 09:59:11.626 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 04:59:11 localhost ovn_controller[154603]: 2025-12-15T09:59:11Z|00069|binding|INFO|Claiming lport 5dec1706-d09a-430e-b5d3-8a027dffe52b for this chassis. Dec 15 04:59:11 localhost ovn_controller[154603]: 2025-12-15T09:59:11Z|00070|binding|INFO|5dec1706-d09a-430e-b5d3-8a027dffe52b: Claiming unknown Dec 15 04:59:11 localhost systemd[1]: Started /usr/bin/podman healthcheck run 4d46c406a3855fd1c322e5d86c048188ea5865daa07ba23aaebacc7879668923. Dec 15 04:59:11 localhost ovn_metadata_agent[160585]: 2025-12-15 09:59:11.637 160590 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005559462.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '192.168.199.3/24', 'neutron:device_id': 'dhcpce287b4b-a043-5ed9-8e08-4fd555d768a2-bead4979-69b7-4c2f-82ee-3d1038c8a234', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-bead4979-69b7-4c2f-82ee-3d1038c8a234', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'fc88b563064b4928b96adabe64da69d9', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=4b57e44c-db32-4248-a7ba-ac1457a522c0, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], 
logical_port=5dec1706-d09a-430e-b5d3-8a027dffe52b) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Dec 15 04:59:11 localhost ovn_metadata_agent[160585]: 2025-12-15 09:59:11.638 160590 INFO neutron.agent.ovn.metadata.agent [-] Port 5dec1706-d09a-430e-b5d3-8a027dffe52b in datapath bead4979-69b7-4c2f-82ee-3d1038c8a234 bound to our chassis#033[00m Dec 15 04:59:11 localhost ovn_metadata_agent[160585]: 2025-12-15 09:59:11.639 160590 DEBUG neutron.agent.ovn.metadata.agent [-] Port df3d390f-c591-4fa6-862b-72578e1bf6f3 IP addresses were not retrieved from the Port_Binding MAC column ['unknown'] _get_port_ips /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:536#033[00m Dec 15 04:59:11 localhost ovn_metadata_agent[160585]: 2025-12-15 09:59:11.639 160590 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network bead4979-69b7-4c2f-82ee-3d1038c8a234, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m Dec 15 04:59:11 localhost ovn_controller[154603]: 2025-12-15T09:59:11Z|00071|binding|INFO|Setting lport 5dec1706-d09a-430e-b5d3-8a027dffe52b ovn-installed in OVS Dec 15 04:59:11 localhost ovn_controller[154603]: 2025-12-15T09:59:11Z|00072|binding|INFO|Setting lport 5dec1706-d09a-430e-b5d3-8a027dffe52b up in Southbound Dec 15 04:59:11 localhost ovn_metadata_agent[160585]: 2025-12-15 09:59:11.640 160858 DEBUG oslo.privsep.daemon [-] privsep: reply[a9d6ce57-c205-4950-87f0-d6e8477c2553]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Dec 15 04:59:11 localhost nova_compute[286344]: 2025-12-15 09:59:11.644 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 04:59:11 localhost nova_compute[286344]: 2025-12-15 09:59:11.662 286348 DEBUG 
ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 04:59:11 localhost nova_compute[286344]: 2025-12-15 09:59:11.695 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 04:59:11 localhost podman[313181]: 2025-12-15 09:59:11.724057001 +0000 UTC m=+0.080680191 container health_status 4d46c406a3855fd1c322e5d86c048188ea5865daa07ba23aaebacc7879668923 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '07f3af15d79276418f54a8dde2fc251064527c3571430e14af77aaa2c01f1193-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, container_name=ovn_metadata_agent, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS) Dec 15 04:59:11 localhost nova_compute[286344]: 2025-12-15 09:59:11.725 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 04:59:11 localhost podman[313181]: 2025-12-15 09:59:11.758830598 +0000 UTC m=+0.115453718 container exec_died 4d46c406a3855fd1c322e5d86c048188ea5865daa07ba23aaebacc7879668923 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, config_id=ovn_metadata_agent, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '07f3af15d79276418f54a8dde2fc251064527c3571430e14af77aaa2c01f1193-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, tcib_managed=true, container_name=ovn_metadata_agent, org.label-schema.build-date=20251202) Dec 15 04:59:11 localhost systemd[1]: 4d46c406a3855fd1c322e5d86c048188ea5865daa07ba23aaebacc7879668923.service: Deactivated successfully. Dec 15 04:59:11 localhost ceph-mon[298913]: mon.np0005559462@0(leader).osd e91 do_prune osdmap full prune enabled Dec 15 04:59:11 localhost ceph-mon[298913]: mon.np0005559462@0(leader).osd e92 e92: 6 total, 6 up, 6 in Dec 15 04:59:11 localhost ceph-mon[298913]: log_channel(cluster) log [DBG] : osdmap e92: 6 total, 6 up, 6 in Dec 15 04:59:11 localhost ceph-mon[298913]: log_channel(cluster) log [DBG] : mgrmap e47: np0005559464.aomnqe(active, since 92s), standbys: np0005559462.fudvyx, np0005559463.daptkf Dec 15 04:59:11 localhost ceph-mon[298913]: mon.np0005559462@0(leader) e17 handle_command mon_command({"prefix": "df", "format": "json"} v 0) Dec 15 04:59:11 localhost ceph-mon[298913]: log_channel(audit) log [DBG] : from='client.? 
172.18.0.106:0/411972470' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch Dec 15 04:59:12 localhost nova_compute[286344]: 2025-12-15 09:59:12.014 286348 DEBUG oslo_concurrency.processutils [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.507s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m Dec 15 04:59:12 localhost nova_compute[286344]: 2025-12-15 09:59:12.021 286348 DEBUG nova.compute.provider_tree [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Inventory has not changed in ProviderTree for provider: 26c8956b-6742-4951-b566-971b9bbe323b update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m Dec 15 04:59:12 localhost nova_compute[286344]: 2025-12-15 09:59:12.037 286348 DEBUG nova.scheduler.client.report [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Inventory has not changed for provider 26c8956b-6742-4951-b566-971b9bbe323b based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m Dec 15 04:59:12 localhost nova_compute[286344]: 2025-12-15 09:59:12.040 286348 DEBUG nova.compute.resource_tracker [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Compute_service record updated for np0005559462.localdomain:np0005559462.localdomain _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m Dec 15 04:59:12 localhost nova_compute[286344]: 2025-12-15 09:59:12.041 286348 DEBUG 
oslo_concurrency.lockutils [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.630s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Dec 15 04:59:12 localhost podman[313270]: Dec 15 04:59:12 localhost podman[313270]: 2025-12-15 09:59:12.586244498 +0000 UTC m=+0.092989423 container create c7090ff3801894e273883815666acf27382f9f74c5d24839a649d291965745f7 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-bead4979-69b7-4c2f-82ee-3d1038c8a234, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0) Dec 15 04:59:12 localhost systemd[1]: Started libpod-conmon-c7090ff3801894e273883815666acf27382f9f74c5d24839a649d291965745f7.scope. Dec 15 04:59:12 localhost podman[313270]: 2025-12-15 09:59:12.540863929 +0000 UTC m=+0.047608884 image pull quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified Dec 15 04:59:12 localhost systemd[1]: Started libcrun container. 
Dec 15 04:59:12 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/29ff15cb9d942a42a456c8b08f0a3c62a28b4c64cb33b5b26470825f89511fd5/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff) Dec 15 04:59:12 localhost podman[313270]: 2025-12-15 09:59:12.658401117 +0000 UTC m=+0.165146052 container init c7090ff3801894e273883815666acf27382f9f74c5d24839a649d291965745f7 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-bead4979-69b7-4c2f-82ee-3d1038c8a234, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team) Dec 15 04:59:12 localhost podman[313270]: 2025-12-15 09:59:12.6686496 +0000 UTC m=+0.175394535 container start c7090ff3801894e273883815666acf27382f9f74c5d24839a649d291965745f7 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-bead4979-69b7-4c2f-82ee-3d1038c8a234, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0) Dec 15 04:59:12 localhost dnsmasq[313289]: started, version 2.85 cachesize 150 Dec 15 04:59:12 localhost dnsmasq[313289]: DNS service limited to local subnets Dec 15 04:59:12 localhost dnsmasq[313289]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile Dec 15 04:59:12 localhost dnsmasq[313289]: warning: no upstream servers 
configured Dec 15 04:59:12 localhost dnsmasq-dhcp[313289]: DHCP, static leases only on 192.168.199.0, lease time 1d Dec 15 04:59:12 localhost dnsmasq[313289]: read /var/lib/neutron/dhcp/bead4979-69b7-4c2f-82ee-3d1038c8a234/addn_hosts - 0 addresses Dec 15 04:59:12 localhost dnsmasq-dhcp[313289]: read /var/lib/neutron/dhcp/bead4979-69b7-4c2f-82ee-3d1038c8a234/host Dec 15 04:59:12 localhost dnsmasq-dhcp[313289]: read /var/lib/neutron/dhcp/bead4979-69b7-4c2f-82ee-3d1038c8a234/opts Dec 15 04:59:12 localhost neutron_dhcp_agent[267542]: 2025-12-15 09:59:12.886 267546 INFO neutron.agent.dhcp.agent [None req-c989fda9-fd44-4d98-90d5-cdf50d217769 - - - - - -] DHCP configuration for ports {'832f5354-d6ac-4d4f-9f5c-026634fd1f0a'} is completed#033[00m Dec 15 04:59:15 localhost ceph-mon[298913]: mon.np0005559462@0(leader).osd e92 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Dec 15 04:59:16 localhost nova_compute[286344]: 2025-12-15 09:59:16.016 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 04:59:16 localhost systemd[1]: Started /usr/bin/podman healthcheck run a4e94bd7aaee6c174c601060fb5f34559019ccea93fc397263ee3735731cfc1e. 
Dec 15 04:59:16 localhost podman[313290]: 2025-12-15 09:59:16.754102302 +0000 UTC m=+0.083189452 container health_status a4e94bd7aaee6c174c601060fb5f34559019ccea93fc397263ee3735731cfc1e (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, maintainer=Navid Yaghoobi , managed_by=edpm_ansible, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter) Dec 15 04:59:16 localhost podman[313290]: 2025-12-15 09:59:16.768404974 +0000 UTC m=+0.097492104 container exec_died a4e94bd7aaee6c174c601060fb5f34559019ccea93fc397263ee3735731cfc1e (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 
'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}) Dec 15 04:59:16 localhost systemd[1]: a4e94bd7aaee6c174c601060fb5f34559019ccea93fc397263ee3735731cfc1e.service: Deactivated successfully. Dec 15 04:59:20 localhost ceph-mon[298913]: mon.np0005559462@0(leader).osd e92 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Dec 15 04:59:21 localhost nova_compute[286344]: 2025-12-15 09:59:21.039 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Dec 15 04:59:21 localhost nova_compute[286344]: 2025-12-15 09:59:21.041 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Dec 15 04:59:21 localhost nova_compute[286344]: 2025-12-15 09:59:21.041 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5023 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m Dec 15 04:59:21 localhost nova_compute[286344]: 2025-12-15 09:59:21.042 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Dec 15 04:59:21 localhost nova_compute[286344]: 2025-12-15 09:59:21.042 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Dec 15 04:59:21 localhost nova_compute[286344]: 2025-12-15 09:59:21.046 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 04:59:25 localhost ceph-mon[298913]: mon.np0005559462@0(leader).osd e92 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Dec 15 
04:59:26 localhost nova_compute[286344]: 2025-12-15 09:59:26.046 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 04:59:30 localhost ceph-mon[298913]: mon.np0005559462@0(leader).osd e92 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Dec 15 04:59:31 localhost nova_compute[286344]: 2025-12-15 09:59:31.076 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Dec 15 04:59:31 localhost nova_compute[286344]: 2025-12-15 09:59:31.078 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Dec 15 04:59:31 localhost nova_compute[286344]: 2025-12-15 09:59:31.078 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5029 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m Dec 15 04:59:31 localhost nova_compute[286344]: 2025-12-15 09:59:31.078 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Dec 15 04:59:31 localhost nova_compute[286344]: 2025-12-15 09:59:31.079 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Dec 15 04:59:31 localhost nova_compute[286344]: 2025-12-15 09:59:31.083 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 04:59:31 localhost systemd[1]: Started /usr/bin/podman healthcheck run 67ee72fcd99c896f79e8807764b1d0f04e7d45927d06e306c0986394e8ed42a0. 
Dec 15 04:59:31 localhost systemd[1]: Started /usr/bin/podman healthcheck run 9f0f481c937606298203797a71924b7614a5d9e3f4a8afd837cfa5b9602aedeb. Dec 15 04:59:31 localhost systemd[1]: Started /usr/bin/podman healthcheck run b3d8483ade539500e228c2af9c0c3e647febb5ec7903610a83aea32631007d4a. Dec 15 04:59:31 localhost systemd[1]: tmp-crun.RTfBHm.mount: Deactivated successfully. Dec 15 04:59:31 localhost podman[313315]: 2025-12-15 09:59:31.773373148 +0000 UTC m=+0.094014909 container health_status 9f0f481c937606298203797a71924b7614a5d9e3f4a8afd837cfa5b9602aedeb (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, container_name=multipathd, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', 
'/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image) Dec 15 04:59:31 localhost podman[313314]: 2025-12-15 09:59:31.814345618 +0000 UTC m=+0.138818413 container health_status 67ee72fcd99c896f79e8807764b1d0f04e7d45927d06e306c0986394e8ed42a0 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, container_name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter) Dec 15 04:59:31 localhost podman[313314]: 2025-12-15 09:59:31.825414631 +0000 UTC m=+0.149887396 container exec_died 67ee72fcd99c896f79e8807764b1d0f04e7d45927d06e306c0986394e8ed42a0 
(image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter) Dec 15 04:59:31 localhost systemd[1]: 67ee72fcd99c896f79e8807764b1d0f04e7d45927d06e306c0986394e8ed42a0.service: Deactivated successfully. 
Dec 15 04:59:31 localhost podman[313315]: 2025-12-15 09:59:31.86714128 +0000 UTC m=+0.187783041 container exec_died 9f0f481c937606298203797a71924b7614a5d9e3f4a8afd837cfa5b9602aedeb (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, container_name=multipathd, org.label-schema.license=GPLv2, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS) Dec 15 04:59:31 localhost systemd[1]: 9f0f481c937606298203797a71924b7614a5d9e3f4a8afd837cfa5b9602aedeb.service: Deactivated successfully. 
Dec 15 04:59:31 localhost podman[243449]: time="2025-12-15T09:59:31Z" level=info msg="List containers: received `last` parameter - overwriting `limit`" Dec 15 04:59:31 localhost podman[313316]: 2025-12-15 09:59:31.884368834 +0000 UTC m=+0.202602576 container health_status b3d8483ade539500e228c2af9c0c3e647febb5ec7903610a83aea32631007d4a (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=ceilometer_agent_compute, org.label-schema.build-date=20251202, tcib_managed=true, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '07f3af15d79276418f54a8dde2fc251064527c3571430e14af77aaa2c01f1193-cb1e9478634a00c8fb5ad9cd7fcd38c7b2a1a85187dfaa618bc715c24c2b15fb'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0) Dec 15 04:59:31 localhost podman[243449]: @ - - [15/Dec/2025:09:59:31 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 156640 "" "Go-http-client/1.1" Dec 15 04:59:32 localhost podman[313316]: 2025-12-15 09:59:32.017675272 +0000 UTC m=+0.335909014 container exec_died b3d8483ade539500e228c2af9c0c3e647febb5ec7903610a83aea32631007d4a (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, config_id=ceilometer_agent_compute, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.schema-version=1.0, container_name=ceilometer_agent_compute, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '07f3af15d79276418f54a8dde2fc251064527c3571430e14af77aaa2c01f1193-cb1e9478634a00c8fb5ad9cd7fcd38c7b2a1a85187dfaa618bc715c24c2b15fb'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible) Dec 15 04:59:32 localhost podman[243449]: @ - - [15/Dec/2025:09:59:31 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 19220 "" "Go-http-client/1.1" Dec 15 04:59:32 localhost systemd[1]: b3d8483ade539500e228c2af9c0c3e647febb5ec7903610a83aea32631007d4a.service: Deactivated successfully. Dec 15 04:59:32 localhost systemd[1]: tmp-crun.yUUJqx.mount: Deactivated successfully. Dec 15 04:59:33 localhost systemd[1]: Started /usr/bin/podman healthcheck run 730c50020a7d9e2da21d2462d40af20ac303e43f3491f692cbbd87f432ba3e09. Dec 15 04:59:33 localhost systemd[1]: Started /usr/bin/podman healthcheck run ac2eae30a2a7f971398af42ea67b3ed799d4b3341f7688ebcd73f7ca7f66c2a8. 
Dec 15 04:59:33 localhost podman[313377]: 2025-12-15 09:59:33.756156025 +0000 UTC m=+0.084857143 container health_status 730c50020a7d9e2da21d2462d40af20ac303e43f3491f692cbbd87f432ba3e09 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'cb1e9478634a00c8fb5ad9cd7fcd38c7b2a1a85187dfaa618bc715c24c2b15fb'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., version=9.6, io.buildah.version=1.33.7, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., maintainer=Red Hat, Inc., name=ubi9-minimal, vcs-type=git, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, com.redhat.component=ubi9-minimal-container, release=1755695350, io.openshift.expose-services=, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, architecture=x86_64, container_name=openstack_network_exporter, managed_by=edpm_ansible, vendor=Red Hat, Inc., distribution-scope=public, build-date=2025-08-20T13:12:41, config_id=openstack_network_exporter) Dec 15 04:59:33 localhost podman[313377]: 2025-12-15 09:59:33.772455517 +0000 UTC m=+0.101156665 container exec_died 730c50020a7d9e2da21d2462d40af20ac303e43f3491f692cbbd87f432ba3e09 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, vcs-type=git, maintainer=Red Hat, Inc., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., architecture=x86_64, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., version=9.6, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'cb1e9478634a00c8fb5ad9cd7fcd38c7b2a1a85187dfaa618bc715c24c2b15fb'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, distribution-scope=public, io.openshift.expose-services=, io.buildah.version=1.33.7, io.openshift.tags=minimal rhel9, managed_by=edpm_ansible, config_id=openstack_network_exporter, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=ubi9-minimal-container, url=https://catalog.redhat.com/en/search?searchType=containers, name=ubi9-minimal, container_name=openstack_network_exporter, vendor=Red Hat, Inc., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, build-date=2025-08-20T13:12:41, release=1755695350, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b) Dec 15 04:59:33 localhost systemd[1]: 730c50020a7d9e2da21d2462d40af20ac303e43f3491f692cbbd87f432ba3e09.service: Deactivated successfully. Dec 15 04:59:33 localhost podman[313378]: 2025-12-15 09:59:33.865530202 +0000 UTC m=+0.192100917 container health_status ac2eae30a2a7f971398af42ea67b3ed799d4b3341f7688ebcd73f7ca7f66c2a8 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '07f3af15d79276418f54a8dde2fc251064527c3571430e14af77aaa2c01f1193'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=ovn_controller, tcib_managed=true, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.build-date=20251202) Dec 15 04:59:33 localhost podman[313378]: 2025-12-15 09:59:33.924600339 
+0000 UTC m=+0.251171014 container exec_died ac2eae30a2a7f971398af42ea67b3ed799d4b3341f7688ebcd73f7ca7f66c2a8 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.build-date=20251202, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, config_id=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '07f3af15d79276418f54a8dde2fc251064527c3571430e14af77aaa2c01f1193'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}) Dec 15 04:59:33 localhost systemd[1]: ac2eae30a2a7f971398af42ea67b3ed799d4b3341f7688ebcd73f7ca7f66c2a8.service: Deactivated successfully. 
Dec 15 04:59:34 localhost openstack_network_exporter[246484]: ERROR 09:59:34 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Dec 15 04:59:34 localhost openstack_network_exporter[246484]: ERROR 09:59:34 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 15 04:59:34 localhost openstack_network_exporter[246484]: ERROR 09:59:34 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 15 04:59:34 localhost openstack_network_exporter[246484]: ERROR 09:59:34 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Dec 15 04:59:34 localhost openstack_network_exporter[246484]:
Dec 15 04:59:34 localhost openstack_network_exporter[246484]: ERROR 09:59:34 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Dec 15 04:59:34 localhost openstack_network_exporter[246484]:
Dec 15 04:59:35 localhost ceph-mon[298913]: mon.np0005559462@0(leader).osd e92 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 15 04:59:36 localhost nova_compute[286344]: 2025-12-15 09:59:36.081 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 15 04:59:40 localhost ceph-mon[298913]: mon.np0005559462@0(leader).osd e92 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 15 04:59:41 localhost nova_compute[286344]: 2025-12-15 09:59:41.088 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Dec 15 04:59:41 localhost nova_compute[286344]: 2025-12-15 09:59:41.089 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Dec 15 04:59:41 localhost nova_compute[286344]: 2025-12-15 09:59:41.090 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m
Dec 15 04:59:41 localhost nova_compute[286344]: 2025-12-15 09:59:41.090 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m
Dec 15 04:59:41 localhost nova_compute[286344]: 2025-12-15 09:59:41.131 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 15 04:59:41 localhost nova_compute[286344]: 2025-12-15 09:59:41.132 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m
Dec 15 04:59:41 localhost ovn_controller[154603]: 2025-12-15T09:59:41Z|00073|memory_trim|INFO|Detected inactivity (last active 30001 ms ago): trimming memory
Dec 15 04:59:42 localhost systemd[1]: Started /usr/bin/podman healthcheck run 4d46c406a3855fd1c322e5d86c048188ea5865daa07ba23aaebacc7879668923.
Dec 15 04:59:42 localhost podman[313422]: 2025-12-15 09:59:42.731761968 +0000 UTC m=+0.065209649 container health_status 4d46c406a3855fd1c322e5d86c048188ea5865daa07ba23aaebacc7879668923 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '07f3af15d79276418f54a8dde2fc251064527c3571430e14af77aaa2c01f1193-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, io.buildah.version=1.41.3, managed_by=edpm_ansible) Dec 15 04:59:42 localhost 
podman[313422]: 2025-12-15 09:59:42.740383981 +0000 UTC m=+0.073831672 container exec_died 4d46c406a3855fd1c322e5d86c048188ea5865daa07ba23aaebacc7879668923 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '07f3af15d79276418f54a8dde2fc251064527c3571430e14af77aaa2c01f1193-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_metadata_agent, io.buildah.version=1.41.3) Dec 15 04:59:42 localhost systemd[1]: 
4d46c406a3855fd1c322e5d86c048188ea5865daa07ba23aaebacc7879668923.service: Deactivated successfully.
Dec 15 04:59:45 localhost ceph-mon[298913]: mon.np0005559462@0(leader).osd e92 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 15 04:59:46 localhost nova_compute[286344]: 2025-12-15 09:59:46.132 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Dec 15 04:59:46 localhost nova_compute[286344]: 2025-12-15 09:59:46.134 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 15 04:59:46 localhost nova_compute[286344]: 2025-12-15 09:59:46.134 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m
Dec 15 04:59:46 localhost nova_compute[286344]: 2025-12-15 09:59:46.134 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m
Dec 15 04:59:46 localhost nova_compute[286344]: 2025-12-15 09:59:46.135 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m
Dec 15 04:59:46 localhost nova_compute[286344]: 2025-12-15 09:59:46.139 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 15 04:59:47 localhost systemd[1]: Started /usr/bin/podman healthcheck run a4e94bd7aaee6c174c601060fb5f34559019ccea93fc397263ee3735731cfc1e.
Dec 15 04:59:47 localhost podman[313459]: 2025-12-15 09:59:47.565185723 +0000 UTC m=+0.087150920 container health_status a4e94bd7aaee6c174c601060fb5f34559019ccea93fc397263ee3735731cfc1e (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, maintainer=Navid Yaghoobi , managed_by=edpm_ansible, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter) Dec 15 04:59:47 localhost podman[313459]: 2025-12-15 09:59:47.603656121 +0000 UTC m=+0.125621328 container exec_died a4e94bd7aaee6c174c601060fb5f34559019ccea93fc397263ee3735731cfc1e (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': 
['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter)
Dec 15 04:59:47 localhost systemd[1]: a4e94bd7aaee6c174c601060fb5f34559019ccea93fc397263ee3735731cfc1e.service: Deactivated successfully.
Dec 15 04:59:47 localhost ceph-mon[298913]: mon.np0005559462@0(leader) e17 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005559464.localdomain.devices.0}] v 0)
Dec 15 04:59:47 localhost ceph-mon[298913]: log_channel(audit) log [INF] : from='mgr.34411 ' entity='mgr.np0005559464.aomnqe'
Dec 15 04:59:47 localhost ceph-mon[298913]: mon.np0005559462@0(leader) e17 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005559464.localdomain}] v 0)
Dec 15 04:59:47 localhost ceph-mon[298913]: log_channel(audit) log [INF] : from='mgr.34411 ' entity='mgr.np0005559464.aomnqe'
Dec 15 04:59:47 localhost ceph-mon[298913]: mon.np0005559462@0(leader) e17 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005559463.localdomain.devices.0}] v 0)
Dec 15 04:59:47 localhost ceph-mon[298913]: log_channel(audit) log [INF] : from='mgr.34411 ' entity='mgr.np0005559464.aomnqe'
Dec 15 04:59:47 localhost ceph-mon[298913]: mon.np0005559462@0(leader) e17 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005559463.localdomain}] v 0)
Dec 15 04:59:47 localhost ceph-mon[298913]: log_channel(audit) log [INF] : from='mgr.34411 ' entity='mgr.np0005559464.aomnqe'
Dec 15 04:59:48 localhost ceph-mon[298913]: mon.np0005559462@0(leader) e17 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005559462.localdomain.devices.0}] v 0)
Dec 15 04:59:48 localhost ceph-mon[298913]: log_channel(audit) log [INF] : from='mgr.34411 ' entity='mgr.np0005559464.aomnqe'
Dec 15 04:59:48 localhost ceph-mon[298913]: mon.np0005559462@0(leader) e17 handle_command
mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005559462.localdomain}] v 0) Dec 15 04:59:48 localhost ceph-mon[298913]: log_channel(audit) log [INF] : from='mgr.34411 ' entity='mgr.np0005559464.aomnqe' Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.122 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359', 'name': 'test', 'flavor': {'id': '2da0e147-aaa7-4bb9-a176-5fe1b15a32a0', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'image': {'id': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-00000002', 'OS-EXT-SRV-ATTR:host': 'np0005559462.localdomain', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': 'c785bf23f53946bc99867d8832a50266', 'user_id': '1ba5fce347b64bfebf995f187193f205', 'hostId': 'bf4d507aa00b3fcf643a7e0b883420632ac7fc96e8c7a756982e121d', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228 Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.123 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.latency in the context of pollsters Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.151 12 DEBUG ceilometer.compute.pollsters [-] 39ff1bd9-6f6b-44c8-bbec-a1fd9d196359/disk.device.write.latency volume: 1243487016 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.152 12 DEBUG ceilometer.compute.pollsters [-] 39ff1bd9-6f6b-44c8-bbec-a1fd9d196359/disk.device.write.latency volume: 24779175 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.154 12 ERROR 
oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'adffe46c-d16e-42bf-9486-84376cd989eb', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 1243487016, 'user_id': '1ba5fce347b64bfebf995f187193f205', 'user_name': None, 'project_id': 'c785bf23f53946bc99867d8832a50266', 'project_name': None, 'resource_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359-vda', 'timestamp': '2025-12-15T09:59:48.123556', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359', 'instance_type': 'm1.small', 'host': 'bf4d507aa00b3fcf643a7e0b883420632ac7fc96e8c7a756982e121d', 'instance_host': 'np0005559462.localdomain', 'flavor': {'id': '2da0e147-aaa7-4bb9-a176-5fe1b15a32a0', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e'}, 'image_ref': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'ca5a2f5c-d99c-11f0-817e-fa163ebaca0f', 'monotonic_time': 11854.316225942, 'message_signature': '2f7369c636e5731cbd1cff7a86db6102d5af617b353c27fecfa3c17e34bb03ba'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 24779175, 'user_id': '1ba5fce347b64bfebf995f187193f205', 'user_name': None, 'project_id': 'c785bf23f53946bc99867d8832a50266', 'project_name': None, 'resource_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359-vdb', 'timestamp': '2025-12-15T09:59:48.123556', 'resource_metadata': 
{'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359', 'instance_type': 'm1.small', 'host': 'bf4d507aa00b3fcf643a7e0b883420632ac7fc96e8c7a756982e121d', 'instance_host': 'np0005559462.localdomain', 'flavor': {'id': '2da0e147-aaa7-4bb9-a176-5fe1b15a32a0', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e'}, 'image_ref': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'ca5a43a2-d99c-11f0-817e-fa163ebaca0f', 'monotonic_time': 11854.316225942, 'message_signature': '5ff7a66cd40ad3c0a41109cc4736744e3a8e1b306c57e6d890bb38525159d376'}]}, 'timestamp': '2025-12-15 09:59:48.152632', '_unique_id': '622bbb0895e940fb8d58438a7bae5a2f'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.154 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.154 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.154 12 ERROR oslo_messaging.notify.messaging yield Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.154 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.154 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 15 04:59:48 localhost 
ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.154 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.154 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.154 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.154 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.154 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.154 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.154 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.154 12 ERROR oslo_messaging.notify.messaging conn.connect() Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.154 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.154 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.154 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.154 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.154 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.154 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.154 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.154 12 ERROR oslo_messaging.notify.messaging Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.154 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.154 12 ERROR oslo_messaging.notify.messaging Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.154 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.154 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.154 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.154 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 
134, in _send_notification Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.154 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.154 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.154 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.154 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.154 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.154 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.154 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.154 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.154 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.154 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.154 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.154 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.154 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.154 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.154 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.154 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.154 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.154 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.154 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.154 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", 
line 433, in _ensure_connection Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.154 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.154 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.154 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.154 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.154 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.154 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.154 12 ERROR oslo_messaging.notify.messaging Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.155 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets in the context of pollsters Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.159 12 DEBUG ceilometer.compute.pollsters [-] 39ff1bd9-6f6b-44c8-bbec-a1fd9d196359/network.incoming.packets volume: 60 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.160 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': '03e96106-5627-408c-8a26-f42382193abb', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 60, 'user_id': '1ba5fce347b64bfebf995f187193f205', 'user_name': None, 'project_id': 'c785bf23f53946bc99867d8832a50266', 'project_name': None, 'resource_id': 'instance-00000002-39ff1bd9-6f6b-44c8-bbec-a1fd9d196359-tap03ef8889-32', 'timestamp': '2025-12-15T09:59:48.155593', 'resource_metadata': {'display_name': 'test', 'name': 'tap03ef8889-32', 'instance_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359', 'instance_type': 'm1.small', 'host': 'bf4d507aa00b3fcf643a7e0b883420632ac7fc96e8c7a756982e121d', 'instance_host': 'np0005559462.localdomain', 'flavor': {'id': '2da0e147-aaa7-4bb9-a176-5fe1b15a32a0', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e'}, 'image_ref': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:74:df:7c', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap03ef8889-32'}, 'message_id': 'ca5b51b6-d99c-11f0-817e-fa163ebaca0f', 'monotonic_time': 11854.348269613, 'message_signature': 'b93dc614fccd5e4ed5c2ee0dd4a518898b90cb5f40eb73758131b119efeb51bf'}]}, 'timestamp': '2025-12-15 09:59:48.159567', '_unique_id': 'bd84b87c7c4c4285a5515f59380aa1b1'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.160 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 15 04:59:48 localhost 
ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.160 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.160 12 ERROR oslo_messaging.notify.messaging yield Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.160 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.160 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.160 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.160 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.160 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.160 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.160 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.160 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.160 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.160 12 ERROR oslo_messaging.notify.messaging conn.connect() Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.160 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.160 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.160 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.160 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.160 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.160 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.160 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.160 12 ERROR oslo_messaging.notify.messaging Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.160 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.160 12 ERROR oslo_messaging.notify.messaging Dec 15 04:59:48 localhost 
ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.160 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.160 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.160 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.160 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.160 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.160 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.160 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.160 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.160 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.160 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection 
Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.160 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.160 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.160 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.160 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.160 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.160 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.160 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.160 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.160 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.160 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Dec 15 04:59:48 
localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.160 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.160 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.160 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.160 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.160 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.160 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.160 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.160 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.160 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.160 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.160 12 ERROR oslo_messaging.notify.messaging Dec 15 04:59:48 localhost 
ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.161 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.capacity in the context of pollsters Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.171 12 DEBUG ceilometer.compute.pollsters [-] 39ff1bd9-6f6b-44c8-bbec-a1fd9d196359/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.171 12 DEBUG ceilometer.compute.pollsters [-] 39ff1bd9-6f6b-44c8-bbec-a1fd9d196359/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.173 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '94f23bcc-ef67-4ef8-ac1d-2880b134417a', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '1ba5fce347b64bfebf995f187193f205', 'user_name': None, 'project_id': 'c785bf23f53946bc99867d8832a50266', 'project_name': None, 'resource_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359-vda', 'timestamp': '2025-12-15T09:59:48.161751', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359', 'instance_type': 'm1.small', 'host': 'bf4d507aa00b3fcf643a7e0b883420632ac7fc96e8c7a756982e121d', 'instance_host': 'np0005559462.localdomain', 'flavor': {'id': '2da0e147-aaa7-4bb9-a176-5fe1b15a32a0', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 
'7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e'}, 'image_ref': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'ca5d2fd6-d99c-11f0-817e-fa163ebaca0f', 'monotonic_time': 11854.354420415, 'message_signature': '6e5a19a499db12645534947fd21f94dd7c969ea306df243ef82fe23fe02d03e7'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '1ba5fce347b64bfebf995f187193f205', 'user_name': None, 'project_id': 'c785bf23f53946bc99867d8832a50266', 'project_name': None, 'resource_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359-vdb', 'timestamp': '2025-12-15T09:59:48.161751', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359', 'instance_type': 'm1.small', 'host': 'bf4d507aa00b3fcf643a7e0b883420632ac7fc96e8c7a756982e121d', 'instance_host': 'np0005559462.localdomain', 'flavor': {'id': '2da0e147-aaa7-4bb9-a176-5fe1b15a32a0', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e'}, 'image_ref': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'ca5d4142-d99c-11f0-817e-fa163ebaca0f', 'monotonic_time': 11854.354420415, 'message_signature': 'eaf6e9d4a2693105b8ce131df6cd3beb0eda5ee576cb181c44119caa6d508dea'}]}, 'timestamp': '2025-12-15 09:59:48.172221', '_unique_id': '78b0b2f17ff54dcf8fcb1e957301183e'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.173 12 ERROR 
oslo_messaging.notify.messaging Traceback (most recent call last): Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.173 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.173 12 ERROR oslo_messaging.notify.messaging yield Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.173 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.173 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.173 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.173 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.173 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.173 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.173 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.173 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Dec 15 04:59:48 localhost 
ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.173 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.173 12 ERROR oslo_messaging.notify.messaging conn.connect() Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.173 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.173 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.173 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.173 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.173 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.173 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.173 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.173 12 ERROR oslo_messaging.notify.messaging Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.173 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 
2025-12-15 09:59:48.173 12 ERROR oslo_messaging.notify.messaging Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.173 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.173 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.173 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.173 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.173 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.173 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.173 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.173 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.173 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.173 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.173 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.173 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.173 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.173 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.173 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.173 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.173 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.173 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.173 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.173 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.173 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.173 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.173 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.173 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.173 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.173 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.173 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.173 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.173 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.173 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 15 04:59:48 localhost 
ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.173 12 ERROR oslo_messaging.notify.messaging Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.174 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.error in the context of pollsters Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.174 12 DEBUG ceilometer.compute.pollsters [-] 39ff1bd9-6f6b-44c8-bbec-a1fd9d196359/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.176 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'cb51ac01-b933-431e-afec-faf732a39410', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '1ba5fce347b64bfebf995f187193f205', 'user_name': None, 'project_id': 'c785bf23f53946bc99867d8832a50266', 'project_name': None, 'resource_id': 'instance-00000002-39ff1bd9-6f6b-44c8-bbec-a1fd9d196359-tap03ef8889-32', 'timestamp': '2025-12-15T09:59:48.174684', 'resource_metadata': {'display_name': 'test', 'name': 'tap03ef8889-32', 'instance_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359', 'instance_type': 'm1.small', 'host': 'bf4d507aa00b3fcf643a7e0b883420632ac7fc96e8c7a756982e121d', 'instance_host': 'np0005559462.localdomain', 'flavor': {'id': '2da0e147-aaa7-4bb9-a176-5fe1b15a32a0', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e'}, 'image_ref': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e', 'image_ref_url': None, 'architecture': 'x86_64', 
'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:74:df:7c', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap03ef8889-32'}, 'message_id': 'ca5db302-d99c-11f0-817e-fa163ebaca0f', 'monotonic_time': 11854.348269613, 'message_signature': 'efcea19ccc33ba8ecbfa58783c80144b7ed93e15bdf67b682821eb1ceb348dc0'}]}, 'timestamp': '2025-12-15 09:59:48.175192', '_unique_id': 'd86f85964b604fe7bfa05572668ea84b'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.176 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.176 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.176 12 ERROR oslo_messaging.notify.messaging yield Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.176 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.176 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.176 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.176 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.176 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.176 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.176 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.176 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.176 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.176 12 ERROR oslo_messaging.notify.messaging conn.connect() Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.176 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.176 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.176 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.176 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.176 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Dec 15 04:59:48 localhost 
ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.176 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.176 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.176 12 ERROR oslo_messaging.notify.messaging Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.176 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.176 12 ERROR oslo_messaging.notify.messaging Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.176 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.176 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.176 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.176 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.176 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.176 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Dec 15 04:59:48 localhost 
ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.176 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.176 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.176 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.176 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.176 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.176 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.176 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.176 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.176 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.176 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Dec 15 04:59:48 localhost 
ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.176 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.176 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.176 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.176 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.176 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.176 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.176 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.176 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.176 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.176 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.176 12 ERROR 
oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.176 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.176 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.176 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.176 12 ERROR oslo_messaging.notify.messaging Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.177 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes in the context of pollsters Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.177 12 DEBUG ceilometer.compute.pollsters [-] 39ff1bd9-6f6b-44c8-bbec-a1fd9d196359/network.outgoing.bytes volume: 9770 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.178 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': 'e2cda32c-ffa3-4737-b919-375f816bf2a6', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 9770, 'user_id': '1ba5fce347b64bfebf995f187193f205', 'user_name': None, 'project_id': 'c785bf23f53946bc99867d8832a50266', 'project_name': None, 'resource_id': 'instance-00000002-39ff1bd9-6f6b-44c8-bbec-a1fd9d196359-tap03ef8889-32', 'timestamp': '2025-12-15T09:59:48.177437', 'resource_metadata': {'display_name': 'test', 'name': 'tap03ef8889-32', 'instance_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359', 'instance_type': 'm1.small', 'host': 'bf4d507aa00b3fcf643a7e0b883420632ac7fc96e8c7a756982e121d', 'instance_host': 'np0005559462.localdomain', 'flavor': {'id': '2da0e147-aaa7-4bb9-a176-5fe1b15a32a0', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e'}, 'image_ref': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:74:df:7c', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap03ef8889-32'}, 'message_id': 'ca5e1dd8-d99c-11f0-817e-fa163ebaca0f', 'monotonic_time': 11854.348269613, 'message_signature': 'f7906cd2eb76e37320b1b783e0d4b87f08e464c774d81b92d88d4dbbc618b2f6'}]}, 'timestamp': '2025-12-15 09:59:48.177893', '_unique_id': '379b4236a93a411686f802b1b32ad617'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.178 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 15 04:59:48 localhost 
ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.178 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.178 12 ERROR oslo_messaging.notify.messaging yield Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.178 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.178 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.178 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.178 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.178 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.178 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.178 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.178 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.178 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.178 12 ERROR oslo_messaging.notify.messaging conn.connect() Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.178 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.178 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.178 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.178 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.178 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.178 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.178 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.178 12 ERROR oslo_messaging.notify.messaging Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.178 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.178 12 ERROR oslo_messaging.notify.messaging Dec 15 04:59:48 localhost 
ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.178 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.178 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.178 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.178 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.178 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.178 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.178 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.178 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.178 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.178 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection 
Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.178 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.178 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.178 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.178 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.178 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.178 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.178 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.178 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.178 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.178 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Dec 15 04:59:48 
localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.178 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.178 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.178 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.178 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.178 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.178 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.178 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.178 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.178 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.178 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.178 12 ERROR oslo_messaging.notify.messaging Dec 15 04:59:48 localhost 
ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.179 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.drop in the context of pollsters Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.180 12 DEBUG ceilometer.compute.pollsters [-] 39ff1bd9-6f6b-44c8-bbec-a1fd9d196359/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.181 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'fb550c23-65f2-4a73-ae25-95abdccf6fcd', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '1ba5fce347b64bfebf995f187193f205', 'user_name': None, 'project_id': 'c785bf23f53946bc99867d8832a50266', 'project_name': None, 'resource_id': 'instance-00000002-39ff1bd9-6f6b-44c8-bbec-a1fd9d196359-tap03ef8889-32', 'timestamp': '2025-12-15T09:59:48.179963', 'resource_metadata': {'display_name': 'test', 'name': 'tap03ef8889-32', 'instance_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359', 'instance_type': 'm1.small', 'host': 'bf4d507aa00b3fcf643a7e0b883420632ac7fc96e8c7a756982e121d', 'instance_host': 'np0005559462.localdomain', 'flavor': {'id': '2da0e147-aaa7-4bb9-a176-5fe1b15a32a0', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e'}, 'image_ref': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:74:df:7c', 
'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap03ef8889-32'}, 'message_id': 'ca5e81c4-d99c-11f0-817e-fa163ebaca0f', 'monotonic_time': 11854.348269613, 'message_signature': 'bfa14f6eaf77082c17d1c41ef524d0be33a94f97b26344fb990b4a1aaa32564e'}]}, 'timestamp': '2025-12-15 09:59:48.180448', '_unique_id': 'ce9d0aaf8e1c4bf5a4416278f8a9fa65'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.181 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.181 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.181 12 ERROR oslo_messaging.notify.messaging yield Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.181 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.181 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.181 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.181 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.181 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 
09:59:48.181 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.181 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.181 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.181 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.181 12 ERROR oslo_messaging.notify.messaging conn.connect() Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.181 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.181 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.181 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.181 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.181 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.181 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Dec 15 04:59:48 localhost 
ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.181 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.181 12 ERROR oslo_messaging.notify.messaging Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.181 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.181 12 ERROR oslo_messaging.notify.messaging Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.181 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.181 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.181 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.181 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.181 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.181 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.181 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Dec 15 04:59:48 
localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.181 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.181 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.181 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.181 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.181 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.181 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.181 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.181 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.181 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.181 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) 
Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.181 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.181 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.181 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.181 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.181 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.181 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.181 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.181 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.181 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.181 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.181 12 ERROR 
oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.181 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.181 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.181 12 ERROR oslo_messaging.notify.messaging Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.182 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.requests in the context of pollsters Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.182 12 DEBUG ceilometer.compute.pollsters [-] 39ff1bd9-6f6b-44c8-bbec-a1fd9d196359/disk.device.read.requests volume: 1272 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.182 12 DEBUG ceilometer.compute.pollsters [-] 39ff1bd9-6f6b-44c8-bbec-a1fd9d196359/disk.device.read.requests volume: 124 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.184 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': '68f1ed3b-1be2-4430-aeb2-e73e525523ac', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1272, 'user_id': '1ba5fce347b64bfebf995f187193f205', 'user_name': None, 'project_id': 'c785bf23f53946bc99867d8832a50266', 'project_name': None, 'resource_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359-vda', 'timestamp': '2025-12-15T09:59:48.182523', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359', 'instance_type': 'm1.small', 'host': 'bf4d507aa00b3fcf643a7e0b883420632ac7fc96e8c7a756982e121d', 'instance_host': 'np0005559462.localdomain', 'flavor': {'id': '2da0e147-aaa7-4bb9-a176-5fe1b15a32a0', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e'}, 'image_ref': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'ca5ee434-d99c-11f0-817e-fa163ebaca0f', 'monotonic_time': 11854.316225942, 'message_signature': '7cd546cd9984b47d3dc1dec3e5073cf69551aee7f3f4949d5b97377baf641a7e'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 124, 'user_id': '1ba5fce347b64bfebf995f187193f205', 'user_name': None, 'project_id': 'c785bf23f53946bc99867d8832a50266', 'project_name': None, 'resource_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359-vdb', 'timestamp': '2025-12-15T09:59:48.182523', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 
'39ff1bd9-6f6b-44c8-bbec-a1fd9d196359', 'instance_type': 'm1.small', 'host': 'bf4d507aa00b3fcf643a7e0b883420632ac7fc96e8c7a756982e121d', 'instance_host': 'np0005559462.localdomain', 'flavor': {'id': '2da0e147-aaa7-4bb9-a176-5fe1b15a32a0', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e'}, 'image_ref': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'ca5ef5c8-d99c-11f0-817e-fa163ebaca0f', 'monotonic_time': 11854.316225942, 'message_signature': 'af725ee062307771e6414dedc96ecaf7f59b0a820464926dd5ac29fdc734881c'}]}, 'timestamp': '2025-12-15 09:59:48.183392', '_unique_id': 'ba37724872764e2c97e881e71110ffa1'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.184 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.184 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.184 12 ERROR oslo_messaging.notify.messaging yield Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.184 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.184 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.184 12 ERROR 
oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.184 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.184 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.184 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.184 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.184 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.184 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.184 12 ERROR oslo_messaging.notify.messaging conn.connect() Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.184 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.184 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.184 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Dec 15 
04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.184 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.184 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.184 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.184 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.184 12 ERROR oslo_messaging.notify.messaging Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.184 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.184 12 ERROR oslo_messaging.notify.messaging Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.184 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.184 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.184 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.184 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Dec 15 04:59:48 localhost 
ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.184 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.184 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.184 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.184 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.184 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.184 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.184 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.184 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.184 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.184 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, 
in get Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.184 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.184 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.184 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.184 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.184 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.184 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.184 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.184 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.184 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.184 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 15 04:59:48 localhost 
ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.184 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.184 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.184 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.184 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.184 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.184 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.184 12 ERROR oslo_messaging.notify.messaging Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.185 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.bytes in the context of pollsters Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.185 12 DEBUG ceilometer.compute.pollsters [-] 39ff1bd9-6f6b-44c8-bbec-a1fd9d196359/disk.device.read.bytes volume: 35560448 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.185 12 DEBUG ceilometer.compute.pollsters [-] 39ff1bd9-6f6b-44c8-bbec-a1fd9d196359/disk.device.read.bytes volume: 2154496 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 15 04:59:48 localhost 
ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.187 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '8ca49df1-5b82-4861-afcb-90f19cc8e58d', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 35560448, 'user_id': '1ba5fce347b64bfebf995f187193f205', 'user_name': None, 'project_id': 'c785bf23f53946bc99867d8832a50266', 'project_name': None, 'resource_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359-vda', 'timestamp': '2025-12-15T09:59:48.185505', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359', 'instance_type': 'm1.small', 'host': 'bf4d507aa00b3fcf643a7e0b883420632ac7fc96e8c7a756982e121d', 'instance_host': 'np0005559462.localdomain', 'flavor': {'id': '2da0e147-aaa7-4bb9-a176-5fe1b15a32a0', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e'}, 'image_ref': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'ca5f58a6-d99c-11f0-817e-fa163ebaca0f', 'monotonic_time': 11854.316225942, 'message_signature': 'fbba90b4727f5801aa3b580158359ca90a47ae476b63b49d1758bd81f39ca9fa'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 2154496, 'user_id': '1ba5fce347b64bfebf995f187193f205', 'user_name': None, 'project_id': 'c785bf23f53946bc99867d8832a50266', 'project_name': None, 'resource_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359-vdb', 'timestamp': 
'2025-12-15T09:59:48.185505', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359', 'instance_type': 'm1.small', 'host': 'bf4d507aa00b3fcf643a7e0b883420632ac7fc96e8c7a756982e121d', 'instance_host': 'np0005559462.localdomain', 'flavor': {'id': '2da0e147-aaa7-4bb9-a176-5fe1b15a32a0', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e'}, 'image_ref': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'ca5f6a58-d99c-11f0-817e-fa163ebaca0f', 'monotonic_time': 11854.316225942, 'message_signature': 'ce19e397d5355a097a5ba5fc495a9421d93e368ed77a2856f986b2f72156e053'}]}, 'timestamp': '2025-12-15 09:59:48.186375', '_unique_id': 'dc1e00ab13934de88d8ae523ac2d14e7'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.187 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.187 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.187 12 ERROR oslo_messaging.notify.messaging yield Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.187 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.187 12 ERROR oslo_messaging.notify.messaging return retry_over_time( 
Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.187 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.187 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.187 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.187 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.187 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.187 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.187 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.187 12 ERROR oslo_messaging.notify.messaging conn.connect()
Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.187 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.187 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.187 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.187 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.187 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.187 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.187 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.187 12 ERROR oslo_messaging.notify.messaging
Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.187 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.187 12 ERROR oslo_messaging.notify.messaging
Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.187 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.187 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.187 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.187 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.187 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.187 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.187 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.187 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.187 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.187 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.187 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.187 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.187 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.187 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.187 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.187 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.187 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.187 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.187 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.187 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.187 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.187 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.187 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.187 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.187 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.187 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.187 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.187 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.187 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.187 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.187 12 ERROR oslo_messaging.notify.messaging
Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.189 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets in the context of pollsters
Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.189 12 DEBUG ceilometer.compute.pollsters [-] 39ff1bd9-6f6b-44c8-bbec-a1fd9d196359/network.outgoing.packets volume: 114 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.192 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications.
Payload={'message_id': '44c0bd15-3769-41fe-b965-ee6bb39eec36', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 114, 'user_id': '1ba5fce347b64bfebf995f187193f205', 'user_name': None, 'project_id': 'c785bf23f53946bc99867d8832a50266', 'project_name': None, 'resource_id': 'instance-00000002-39ff1bd9-6f6b-44c8-bbec-a1fd9d196359-tap03ef8889-32', 'timestamp': '2025-12-15T09:59:48.189812', 'resource_metadata': {'display_name': 'test', 'name': 'tap03ef8889-32', 'instance_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359', 'instance_type': 'm1.small', 'host': 'bf4d507aa00b3fcf643a7e0b883420632ac7fc96e8c7a756982e121d', 'instance_host': 'np0005559462.localdomain', 'flavor': {'id': '2da0e147-aaa7-4bb9-a176-5fe1b15a32a0', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e'}, 'image_ref': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:74:df:7c', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap03ef8889-32'}, 'message_id': 'ca600896-d99c-11f0-817e-fa163ebaca0f', 'monotonic_time': 11854.348269613, 'message_signature': '5de5aa3290633087c5b6e7cb37c5ae49fd957aa8a4543057c51bde9846afc02d'}]}, 'timestamp': '2025-12-15 09:59:48.190562', '_unique_id': 'ca3cf2294ac645339aa41b35d9df6929'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 15 04:59:48 localhost
ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.193 12 INFO ceilometer.polling.manager [-] Polling pollster memory.usage in the context of pollsters
Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.208 12 DEBUG ceilometer.compute.pollsters [-] 39ff1bd9-6f6b-44c8-bbec-a1fd9d196359/memory.usage volume: 51.73828125 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.210 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '2d7c20d0-522c-4c8d-a5da-283f8de9cf6c', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'memory.usage', 'counter_type': 'gauge', 'counter_unit': 'MB', 'counter_volume': 51.73828125, 'user_id': '1ba5fce347b64bfebf995f187193f205', 'user_name': None, 'project_id': 'c785bf23f53946bc99867d8832a50266', 'project_name': None, 'resource_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359', 'timestamp': '2025-12-15T09:59:48.193457', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359', 'instance_type': 'm1.small', 'host': 'bf4d507aa00b3fcf643a7e0b883420632ac7fc96e8c7a756982e121d', 'instance_host': 'np0005559462.localdomain', 'flavor': {'id': '2da0e147-aaa7-4bb9-a176-5fe1b15a32a0', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e'}, 'image_ref': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0}, 'message_id': 'ca62e002-d99c-11f0-817e-fa163ebaca0f', 'monotonic_time': 11854.401087494, 'message_signature': '753b7451804bedaafe90f863c8dddebfe82e1521767d317903a1abe5f116fa45'}]}, 'timestamp': '2025-12-15 09:59:48.209127', '_unique_id': '64c5658b92c54f44bf1a473fd7d6f6c8'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.211 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no new resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.211 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes in the context of pollsters
Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.211 12 DEBUG ceilometer.compute.pollsters [-] 39ff1bd9-6f6b-44c8-bbec-a1fd9d196359/network.incoming.bytes volume: 6809 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.213 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications.
Payload={'message_id': 'b52576df-0d88-4e52-b8d1-818e76b16d93', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 6809, 'user_id': '1ba5fce347b64bfebf995f187193f205', 'user_name': None, 'project_id': 'c785bf23f53946bc99867d8832a50266', 'project_name': None, 'resource_id': 'instance-00000002-39ff1bd9-6f6b-44c8-bbec-a1fd9d196359-tap03ef8889-32', 'timestamp': '2025-12-15T09:59:48.211706', 'resource_metadata': {'display_name': 'test', 'name': 'tap03ef8889-32', 'instance_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359', 'instance_type': 'm1.small', 'host': 'bf4d507aa00b3fcf643a7e0b883420632ac7fc96e8c7a756982e121d', 'instance_host': 'np0005559462.localdomain', 'flavor': {'id': '2da0e147-aaa7-4bb9-a176-5fe1b15a32a0', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e'}, 'image_ref': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:74:df:7c', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap03ef8889-32'}, 'message_id': 'ca635960-d99c-11f0-817e-fa163ebaca0f', 'monotonic_time': 11854.348269613, 'message_signature': '43db62598be2b81212b66dbb9cd4dd31ac3b8c7e9ac89df49fa07de256c00c74'}]}, 'timestamp': '2025-12-15 09:59:48.212231', '_unique_id': '8cfcc727c52444d794f6f812443d540f'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.213 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.213 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.213 12 ERROR oslo_messaging.notify.messaging yield
Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.213 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.213 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.213 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.213 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.213 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.213 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.213 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.213 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.213 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.213 12 ERROR oslo_messaging.notify.messaging conn.connect()
Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.213 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.213 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.213 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.213 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.213 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.213 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.213 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.213 12 ERROR oslo_messaging.notify.messaging
Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.213 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.213 12 ERROR oslo_messaging.notify.messaging
Dec 15 04:59:48 localhost
ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.213 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.213 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.213 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.213 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.213 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.213 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.213 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.213 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.213 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.213 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection 
Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.213 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.213 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.213 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.213 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.213 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.213 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.213 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.213 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.213 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.213 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Dec 15 04:59:48 
localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.213 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.213 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.213 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.213 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.213 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.213 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.213 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.213 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.213 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.213 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.213 12 ERROR oslo_messaging.notify.messaging Dec 15 04:59:48 localhost 
ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.214 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.requests in the context of pollsters
Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.214 12 DEBUG ceilometer.compute.pollsters [-] 39ff1bd9-6f6b-44c8-bbec-a1fd9d196359/disk.device.write.requests volume: 47 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.214 12 DEBUG ceilometer.compute.pollsters [-] 39ff1bd9-6f6b-44c8-bbec-a1fd9d196359/disk.device.write.requests volume: 1 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.216 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'c369acc1-1039-4547-9f32-655fc7be5f5c', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 47, 'user_id': '1ba5fce347b64bfebf995f187193f205', 'user_name': None, 'project_id': 'c785bf23f53946bc99867d8832a50266', 'project_name': None, 'resource_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359-vda', 'timestamp': '2025-12-15T09:59:48.214446', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359', 'instance_type': 'm1.small', 'host': 'bf4d507aa00b3fcf643a7e0b883420632ac7fc96e8c7a756982e121d', 'instance_host': 'np0005559462.localdomain', 'flavor': {'id': '2da0e147-aaa7-4bb9-a176-5fe1b15a32a0', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e'}, 'image_ref': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'ca63c3d2-d99c-11f0-817e-fa163ebaca0f', 'monotonic_time': 11854.316225942, 'message_signature': 'dc17b1e2aa23f68c88b1b856a8824242a554626208782f224ba1761ea05dfed9'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1, 'user_id': '1ba5fce347b64bfebf995f187193f205', 'user_name': None, 'project_id': 'c785bf23f53946bc99867d8832a50266', 'project_name': None, 'resource_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359-vdb', 'timestamp': '2025-12-15T09:59:48.214446', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359', 'instance_type': 'm1.small', 'host': 'bf4d507aa00b3fcf643a7e0b883420632ac7fc96e8c7a756982e121d', 'instance_host': 'np0005559462.localdomain', 'flavor': {'id': '2da0e147-aaa7-4bb9-a176-5fe1b15a32a0', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e'}, 'image_ref': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'ca63d6c4-d99c-11f0-817e-fa163ebaca0f', 'monotonic_time': 11854.316225942, 'message_signature': '1f09d3439ac3dde7cdd12e9f30ff4eb2000acf8ad9df62bfc260a2a9cb3693d3'}]}, 'timestamp': '2025-12-15 09:59:48.215374', '_unique_id': '2ec613833830420992f4135768c32868'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.216 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.216 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.216 12 ERROR oslo_messaging.notify.messaging yield
Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.216 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.216 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.216 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.216 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.216 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.216 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.216 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.216 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.216 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.216 12 ERROR oslo_messaging.notify.messaging conn.connect()
Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.216 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.216 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.216 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.216 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.216 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.216 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.216 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.216 12 ERROR oslo_messaging.notify.messaging
Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.216 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.216 12 ERROR oslo_messaging.notify.messaging
Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.216 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.216 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.216 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.216 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.216 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.216 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.216 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.216 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.216 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.216 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.216 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.216 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.216 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.216 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.216 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.216 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.216 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.216 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.216 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.216 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.216 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.216 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.216 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.216 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.216 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.216 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.216 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.216 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.216 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.216 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.216 12 ERROR oslo_messaging.notify.messaging
Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.217 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.error in the context of pollsters
Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.217 12 DEBUG ceilometer.compute.pollsters [-] 39ff1bd9-6f6b-44c8-bbec-a1fd9d196359/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.219 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '2945eb07-40b0-455d-989a-0f8533fb889c', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '1ba5fce347b64bfebf995f187193f205', 'user_name': None, 'project_id': 'c785bf23f53946bc99867d8832a50266', 'project_name': None, 'resource_id': 'instance-00000002-39ff1bd9-6f6b-44c8-bbec-a1fd9d196359-tap03ef8889-32', 'timestamp': '2025-12-15T09:59:48.217680', 'resource_metadata': {'display_name': 'test', 'name': 'tap03ef8889-32', 'instance_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359', 'instance_type': 'm1.small', 'host': 'bf4d507aa00b3fcf643a7e0b883420632ac7fc96e8c7a756982e121d', 'instance_host': 'np0005559462.localdomain', 'flavor': {'id': '2da0e147-aaa7-4bb9-a176-5fe1b15a32a0', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e'}, 'image_ref': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e', 'image_ref_url': None, 'architecture': 'x86_64',
'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:74:df:7c', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap03ef8889-32'}, 'message_id': 'ca644280-d99c-11f0-817e-fa163ebaca0f', 'monotonic_time': 11854.348269613, 'message_signature': 'af20134322586c35662e497d0b7f241cddb3c2599ef53d92ce842eafd01fe6a7'}]}, 'timestamp': '2025-12-15 09:59:48.218197', '_unique_id': '15187e1798e64603b6c3c4317f82e584'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.219 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.219 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.219 12 ERROR oslo_messaging.notify.messaging yield
Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.219 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.219 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.219 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.219 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.219 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.219 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.219 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.219 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.219 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.219 12 ERROR oslo_messaging.notify.messaging conn.connect()
Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.219 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.219 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.219 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.219 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.219 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.219 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.219 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.219 12 ERROR oslo_messaging.notify.messaging
Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.219 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.219 12 ERROR oslo_messaging.notify.messaging
Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.219 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.219 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.219 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.219 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.219 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.219 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.219 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.219 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.219 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.219 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.219 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.219 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.219 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.219 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.219 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.219 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.219 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.219 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.219 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.219 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.219 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.219 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.219 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.219 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.219 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.219 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.219 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.219 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.219 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.219 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.219 12 ERROR oslo_messaging.notify.messaging
Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.220 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.drop in the context of pollsters
Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.220 12 DEBUG ceilometer.compute.pollsters [-] 39ff1bd9-6f6b-44c8-bbec-a1fd9d196359/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.221 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications.
Payload={'message_id': '712e169f-2f3d-465f-83fa-bae70882e649', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '1ba5fce347b64bfebf995f187193f205', 'user_name': None, 'project_id': 'c785bf23f53946bc99867d8832a50266', 'project_name': None, 'resource_id': 'instance-00000002-39ff1bd9-6f6b-44c8-bbec-a1fd9d196359-tap03ef8889-32', 'timestamp': '2025-12-15T09:59:48.220429', 'resource_metadata': {'display_name': 'test', 'name': 'tap03ef8889-32', 'instance_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359', 'instance_type': 'm1.small', 'host': 'bf4d507aa00b3fcf643a7e0b883420632ac7fc96e8c7a756982e121d', 'instance_host': 'np0005559462.localdomain', 'flavor': {'id': '2da0e147-aaa7-4bb9-a176-5fe1b15a32a0', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e'}, 'image_ref': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:74:df:7c', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap03ef8889-32'}, 'message_id': 'ca64adec-d99c-11f0-817e-fa163ebaca0f', 'monotonic_time': 11854.348269613, 'message_signature': 'e7ad946ae69bcb8320a9f9178fb1f883f32578b2f0814b4e4daaa6d4b975ad40'}]}, 'timestamp': '2025-12-15 09:59:48.220941', '_unique_id': 'df6c96b2a7dd44af8dd41920fb8b4fb4'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.221 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 15 04:59:48 localhost 
ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.221 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.221 12 ERROR oslo_messaging.notify.messaging yield Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.221 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.221 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.221 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.221 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.221 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.221 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.221 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.221 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.221 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.221 12 ERROR oslo_messaging.notify.messaging conn.connect() Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.221 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.221 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.221 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.221 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.221 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.221 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.221 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.221 12 ERROR oslo_messaging.notify.messaging Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.221 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.221 12 ERROR oslo_messaging.notify.messaging Dec 15 04:59:48 localhost 
ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.221 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.221 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.221 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.221 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.221 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.221 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.221 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.221 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.221 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.221 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection 
Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.221 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.221 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.221 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.221 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.221 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.221 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.221 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.221 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.221 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.221 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Dec 15 04:59:48 
localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.221 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.221 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.221 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.221 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.221 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.221 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.221 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.221 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.221 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.221 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.221 12 ERROR oslo_messaging.notify.messaging Dec 15 04:59:48 localhost 
ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.223 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.delta in the context of pollsters Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.223 12 DEBUG ceilometer.compute.pollsters [-] 39ff1bd9-6f6b-44c8-bbec-a1fd9d196359/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.224 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '624dc4bf-ab35-4276-b90c-cc6d9081ab81', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '1ba5fce347b64bfebf995f187193f205', 'user_name': None, 'project_id': 'c785bf23f53946bc99867d8832a50266', 'project_name': None, 'resource_id': 'instance-00000002-39ff1bd9-6f6b-44c8-bbec-a1fd9d196359-tap03ef8889-32', 'timestamp': '2025-12-15T09:59:48.223165', 'resource_metadata': {'display_name': 'test', 'name': 'tap03ef8889-32', 'instance_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359', 'instance_type': 'm1.small', 'host': 'bf4d507aa00b3fcf643a7e0b883420632ac7fc96e8c7a756982e121d', 'instance_host': 'np0005559462.localdomain', 'flavor': {'id': '2da0e147-aaa7-4bb9-a176-5fe1b15a32a0', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e'}, 'image_ref': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:74:df:7c', 'fref': None, 
'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap03ef8889-32'}, 'message_id': 'ca65189a-d99c-11f0-817e-fa163ebaca0f', 'monotonic_time': 11854.348269613, 'message_signature': '9a6af3563f846441ca9850b5dab0edccd846f47f929caed738792987ea0fcc10'}]}, 'timestamp': '2025-12-15 09:59:48.223657', '_unique_id': '8ec6328c455c417487b7716f81f594ef'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.224 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.224 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.224 12 ERROR oslo_messaging.notify.messaging yield Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.224 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.224 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.224 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.224 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.224 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.224 12 ERROR 
oslo_messaging.notify.messaging self._connection = self._establish_connection() Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.224 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.224 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.224 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.224 12 ERROR oslo_messaging.notify.messaging conn.connect() Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.224 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.224 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.224 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.224 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.224 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.224 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 
2025-12-15 09:59:48.224 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.224 12 ERROR oslo_messaging.notify.messaging Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.224 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.224 12 ERROR oslo_messaging.notify.messaging Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.224 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.224 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.224 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.224 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.224 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.224 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.224 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Dec 15 04:59:48 localhost 
ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.224 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.224 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.224 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.224 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.224 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.224 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.224 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.224 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.224 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.224 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Dec 15 
04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.224 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.224 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.224 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.224 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.224 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.224 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.224 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.224 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.224 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.224 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.224 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.224 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.224 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.224 12 ERROR oslo_messaging.notify.messaging Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.225 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.usage in the context of pollsters Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.226 12 DEBUG ceilometer.compute.pollsters [-] 39ff1bd9-6f6b-44c8-bbec-a1fd9d196359/disk.device.usage volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.226 12 DEBUG ceilometer.compute.pollsters [-] 39ff1bd9-6f6b-44c8-bbec-a1fd9d196359/disk.device.usage volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.227 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': '651b5bae-b958-41f8-b77a-bbabc6fa29c4', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '1ba5fce347b64bfebf995f187193f205', 'user_name': None, 'project_id': 'c785bf23f53946bc99867d8832a50266', 'project_name': None, 'resource_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359-vda', 'timestamp': '2025-12-15T09:59:48.226025', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359', 'instance_type': 'm1.small', 'host': 'bf4d507aa00b3fcf643a7e0b883420632ac7fc96e8c7a756982e121d', 'instance_host': 'np0005559462.localdomain', 'flavor': {'id': '2da0e147-aaa7-4bb9-a176-5fe1b15a32a0', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e'}, 'image_ref': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'ca65883e-d99c-11f0-817e-fa163ebaca0f', 'monotonic_time': 11854.354420415, 'message_signature': '02927ed7dc894c02bbe4493797e333d2c8cb4a8b3f9da2c96e25ab415333927b'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '1ba5fce347b64bfebf995f187193f205', 'user_name': None, 'project_id': 'c785bf23f53946bc99867d8832a50266', 'project_name': None, 'resource_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359-vdb', 'timestamp': '2025-12-15T09:59:48.226025', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359', 
'instance_type': 'm1.small', 'host': 'bf4d507aa00b3fcf643a7e0b883420632ac7fc96e8c7a756982e121d', 'instance_host': 'np0005559462.localdomain', 'flavor': {'id': '2da0e147-aaa7-4bb9-a176-5fe1b15a32a0', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e'}, 'image_ref': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'ca659a4a-d99c-11f0-817e-fa163ebaca0f', 'monotonic_time': 11854.354420415, 'message_signature': '6fc114fcc0ac2c2cb8598a51fd46637c140860c9795f451b53b3c9f93d997287'}]}, 'timestamp': '2025-12-15 09:59:48.226942', '_unique_id': '253bd138bd2145458011d04a3972dbf0'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.227 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.227 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.227 12 ERROR oslo_messaging.notify.messaging yield Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.227 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.227 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.227 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.227 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.227 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.227 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.227 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.227 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.227 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.227 12 ERROR oslo_messaging.notify.messaging conn.connect() Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.227 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.227 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.227 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Dec 15 04:59:48 localhost 
ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.227 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.227 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.227 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.227 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.227 12 ERROR oslo_messaging.notify.messaging Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.227 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.227 12 ERROR oslo_messaging.notify.messaging Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.227 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.227 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.227 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.227 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 
09:59:48.227 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.227 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.227 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.227 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.227 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.227 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.227 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.227 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.227 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.227 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Dec 15 04:59:48 localhost 
ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.227 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.227 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.227 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.227 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.227 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.227 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.227 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.227 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.227 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.227 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 
09:59:48.227 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.227 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.227 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.227 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.227 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.227 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.227 12 ERROR oslo_messaging.notify.messaging Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.229 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no new resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193 Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.229 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.allocation in the context of pollsters Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.229 12 DEBUG ceilometer.compute.pollsters [-] 39ff1bd9-6f6b-44c8-bbec-a1fd9d196359/disk.device.allocation volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.229 
12 DEBUG ceilometer.compute.pollsters [-] 39ff1bd9-6f6b-44c8-bbec-a1fd9d196359/disk.device.allocation volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.231 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'e015c9f1-d681-45ee-9ff0-57d19278f15e', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '1ba5fce347b64bfebf995f187193f205', 'user_name': None, 'project_id': 'c785bf23f53946bc99867d8832a50266', 'project_name': None, 'resource_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359-vda', 'timestamp': '2025-12-15T09:59:48.229342', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359', 'instance_type': 'm1.small', 'host': 'bf4d507aa00b3fcf643a7e0b883420632ac7fc96e8c7a756982e121d', 'instance_host': 'np0005559462.localdomain', 'flavor': {'id': '2da0e147-aaa7-4bb9-a176-5fe1b15a32a0', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e'}, 'image_ref': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'ca6609c6-d99c-11f0-817e-fa163ebaca0f', 'monotonic_time': 11854.354420415, 'message_signature': '4e4bfb9eb70b3db1ca98016a03e71bebdfe537d879e4fd7f87154dadd8b3090d'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 
'counter_volume': 1073741824, 'user_id': '1ba5fce347b64bfebf995f187193f205', 'user_name': None, 'project_id': 'c785bf23f53946bc99867d8832a50266', 'project_name': None, 'resource_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359-vdb', 'timestamp': '2025-12-15T09:59:48.229342', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359', 'instance_type': 'm1.small', 'host': 'bf4d507aa00b3fcf643a7e0b883420632ac7fc96e8c7a756982e121d', 'instance_host': 'np0005559462.localdomain', 'flavor': {'id': '2da0e147-aaa7-4bb9-a176-5fe1b15a32a0', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e'}, 'image_ref': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'ca661baa-d99c-11f0-817e-fa163ebaca0f', 'monotonic_time': 11854.354420415, 'message_signature': '00d64cc7219f9631ccab2606347ca0db50dd6d6ee8483c4859099ed5afb9f52e'}]}, 'timestamp': '2025-12-15 09:59:48.230248', '_unique_id': 'c1c2abe938e549d3be7ab681053b0467'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.231 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.231 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.231 12 ERROR oslo_messaging.notify.messaging yield Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.231 12 ERROR oslo_messaging.notify.messaging 
File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.231 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.231 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.231 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.231 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.231 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.231 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.231 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.231 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.231 12 ERROR oslo_messaging.notify.messaging conn.connect() Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.231 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Dec 15 04:59:48 localhost 
ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.231 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.231 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.231 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.231 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.231 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.231 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.231 12 ERROR oslo_messaging.notify.messaging Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.231 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.231 12 ERROR oslo_messaging.notify.messaging Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.231 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.231 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.231 12 ERROR oslo_messaging.notify.messaging 
self.transport._send_notification(target, ctxt, message, Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.231 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.231 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.231 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.231 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.231 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.231 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.231 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.231 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.231 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 
09:59:48.231 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.231 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.231 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.231 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.231 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.231 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.231 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.231 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.231 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.231 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.231 12 ERROR 
oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.231 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.231 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.231 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.231 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.231 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.231 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.231 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.231 12 ERROR oslo_messaging.notify.messaging Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.232 12 INFO ceilometer.polling.manager [-] Polling pollster cpu in the context of pollsters Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.232 12 DEBUG ceilometer.compute.pollsters [-] 39ff1bd9-6f6b-44c8-bbec-a1fd9d196359/cpu volume: 13410000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 15 04:59:48 localhost 
ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.233 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'c81b1339-45b3-412e-bb4b-1b4d9ce13b3c', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 13410000000, 'user_id': '1ba5fce347b64bfebf995f187193f205', 'user_name': None, 'project_id': 'c785bf23f53946bc99867d8832a50266', 'project_name': None, 'resource_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359', 'timestamp': '2025-12-15T09:59:48.232471', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359', 'instance_type': 'm1.small', 'host': 'bf4d507aa00b3fcf643a7e0b883420632ac7fc96e8c7a756982e121d', 'instance_host': 'np0005559462.localdomain', 'flavor': {'id': '2da0e147-aaa7-4bb9-a176-5fe1b15a32a0', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e'}, 'image_ref': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'cpu_number': 1}, 'message_id': 'ca6683f6-d99c-11f0-817e-fa163ebaca0f', 'monotonic_time': 11854.401087494, 'message_signature': 'cb0db728d85a7d788231726cdf4e9d87b09040a9ec20e7bcac8f2523c1a6dec8'}]}, 'timestamp': '2025-12-15 09:59:48.232956', '_unique_id': '070238cc3daf4ae3a16cdff5e4c026fd'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.233 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 
2025-12-15 09:59:48.233 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.233 12 ERROR oslo_messaging.notify.messaging yield Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.233 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.233 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.233 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.233 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.233 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.233 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.233 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.233 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.233 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.233 12 ERROR oslo_messaging.notify.messaging conn.connect() Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.233 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.233 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.233 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.233 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.233 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.233 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.233 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.233 12 ERROR oslo_messaging.notify.messaging Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.233 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.233 12 ERROR oslo_messaging.notify.messaging Dec 15 04:59:48 localhost 
ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.233 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.233 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.233 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.233 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.233 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.233 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.233 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.233 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.233 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.233 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.233 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.233 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.233 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.233 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.233 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.233 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.233 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.233 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.233 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.233 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.233 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.233 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.233 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.233 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.233 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.233 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.233 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.233 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.233 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.233 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.233 12 ERROR oslo_messaging.notify.messaging
Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.235 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.latency in the context of pollsters
Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.235 12 DEBUG ceilometer.compute.pollsters [-] 39ff1bd9-6f6b-44c8-bbec-a1fd9d196359/disk.device.read.latency volume: 1342134926 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.235 12 DEBUG ceilometer.compute.pollsters [-] 39ff1bd9-6f6b-44c8-bbec-a1fd9d196359/disk.device.read.latency volume: 123356132 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.237 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'e0695492-2d3e-457b-907b-1b88eb7e67ee', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 1342134926, 'user_id': '1ba5fce347b64bfebf995f187193f205', 'user_name': None, 'project_id': 'c785bf23f53946bc99867d8832a50266', 'project_name': None, 'resource_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359-vda', 'timestamp': '2025-12-15T09:59:48.235153', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359', 'instance_type': 'm1.small', 'host': 'bf4d507aa00b3fcf643a7e0b883420632ac7fc96e8c7a756982e121d', 'instance_host': 'np0005559462.localdomain', 'flavor': {'id': '2da0e147-aaa7-4bb9-a176-5fe1b15a32a0', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e'}, 'image_ref': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'ca66ec60-d99c-11f0-817e-fa163ebaca0f', 'monotonic_time': 11854.316225942, 'message_signature': '405c2fbbda0ef552ffa23f3c9a8205c58d9293d02d70d9c148f51566edaabbf1'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 123356132, 'user_id': '1ba5fce347b64bfebf995f187193f205', 'user_name': None, 'project_id': 'c785bf23f53946bc99867d8832a50266', 'project_name': None, 'resource_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359-vdb', 'timestamp': '2025-12-15T09:59:48.235153', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359', 'instance_type': 'm1.small', 'host': 'bf4d507aa00b3fcf643a7e0b883420632ac7fc96e8c7a756982e121d', 'instance_host': 'np0005559462.localdomain', 'flavor': {'id': '2da0e147-aaa7-4bb9-a176-5fe1b15a32a0', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e'}, 'image_ref': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'ca66fd90-d99c-11f0-817e-fa163ebaca0f', 'monotonic_time': 11854.316225942, 'message_signature': '49d0a397d8c7a2762de2e6d69dfc036cef711af4daf6698fd25fa933cca78dd1'}]}, 'timestamp': '2025-12-15 09:59:48.236060', '_unique_id': 'aae85b331ec74841aba744c516591a44'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.237 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.237 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.237 12 ERROR oslo_messaging.notify.messaging yield
Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.237 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.237 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.237 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.237 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.237 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.237 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.237 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.237 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.237 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.237 12 ERROR oslo_messaging.notify.messaging conn.connect()
Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.237 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.237 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.237 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.237 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.237 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.237 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.237 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.237 12 ERROR oslo_messaging.notify.messaging
Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.237 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.237 12 ERROR oslo_messaging.notify.messaging
Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.237 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.237 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.237 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.237 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.237 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.237 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.237 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.237 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.237 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.237 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.237 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.237 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.237 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.237 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.237 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.237 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.237 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.237 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.237 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.237 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.237 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.237 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.237 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.237 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.237 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.237 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.237 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.237 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.237 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.237 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.237 12 ERROR oslo_messaging.notify.messaging
Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.238 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.bytes in the context of pollsters
Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.238 12 DEBUG ceilometer.compute.pollsters [-] 39ff1bd9-6f6b-44c8-bbec-a1fd9d196359/disk.device.write.bytes volume: 389120 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.238 12 DEBUG ceilometer.compute.pollsters [-] 39ff1bd9-6f6b-44c8-bbec-a1fd9d196359/disk.device.write.bytes volume: 512 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.240 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '69725de0-dfd5-4eb0-8038-eb69f348d444', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 389120, 'user_id': '1ba5fce347b64bfebf995f187193f205', 'user_name': None, 'project_id': 'c785bf23f53946bc99867d8832a50266', 'project_name': None, 'resource_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359-vda', 'timestamp': '2025-12-15T09:59:48.238305', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359', 'instance_type': 'm1.small', 'host': 'bf4d507aa00b3fcf643a7e0b883420632ac7fc96e8c7a756982e121d', 'instance_host': 'np0005559462.localdomain', 'flavor': {'id': '2da0e147-aaa7-4bb9-a176-5fe1b15a32a0', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e'}, 'image_ref': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'ca6767da-d99c-11f0-817e-fa163ebaca0f', 'monotonic_time': 11854.316225942, 'message_signature': 'd95f9902f842842cd5d33b939136babfbb980c005d82734c587a948ff344d2f2'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 512, 'user_id': '1ba5fce347b64bfebf995f187193f205', 'user_name': None, 'project_id': 'c785bf23f53946bc99867d8832a50266', 'project_name': None, 'resource_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359-vdb', 'timestamp': '2025-12-15T09:59:48.238305', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359', 'instance_type': 'm1.small', 'host': 'bf4d507aa00b3fcf643a7e0b883420632ac7fc96e8c7a756982e121d', 'instance_host': 'np0005559462.localdomain', 'flavor': {'id': '2da0e147-aaa7-4bb9-a176-5fe1b15a32a0', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e'}, 'image_ref': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'ca67782e-d99c-11f0-817e-fa163ebaca0f', 'monotonic_time': 11854.316225942, 'message_signature': 'f5d8cb866b3fc4ed4dd00f3dc6751c7c4dadd7c8a9d17f243d3dfc78689ac2ac'}]}, 'timestamp': '2025-12-15 09:59:48.239190', '_unique_id': '6e82617d400f444f9aa7e2a939cd363b'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.240 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.240 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.240 12 ERROR oslo_messaging.notify.messaging yield
Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.240 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.240 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.240 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.240 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.240 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.240 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.240 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.240 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.240 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.240 12 ERROR oslo_messaging.notify.messaging conn.connect()
Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.240 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.240 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.240 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.240 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.240 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.240 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.240 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.240 12 ERROR oslo_messaging.notify.messaging
Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.240 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.240 12 ERROR oslo_messaging.notify.messaging
Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.240 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.240 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.240 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.240 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.240 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.240 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.240 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.240 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.240 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.240 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.240 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.240 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.240 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.240 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.240 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.240 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.240 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.240 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.240 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.240 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.240 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.240 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.240 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.240 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.240 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.240 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.240 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.240 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.240 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.240 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.240 12 ERROR oslo_messaging.notify.messaging
Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.241 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no new resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.241 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no new resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.241 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.delta in the context of pollsters
Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.241 12 DEBUG ceilometer.compute.pollsters [-] 39ff1bd9-6f6b-44c8-bbec-a1fd9d196359/network.incoming.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.243 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '66a9a2d0-8dfd-47ff-b0ce-5fd79b1d304b', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '1ba5fce347b64bfebf995f187193f205', 'user_name': None, 'project_id': 'c785bf23f53946bc99867d8832a50266', 'project_name': None, 'resource_id': 'instance-00000002-39ff1bd9-6f6b-44c8-bbec-a1fd9d196359-tap03ef8889-32', 'timestamp': '2025-12-15T09:59:48.241714', 'resource_metadata': {'display_name': 'test', 'name': 'tap03ef8889-32', 'instance_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359', 'instance_type': 'm1.small', 'host': 'bf4d507aa00b3fcf643a7e0b883420632ac7fc96e8c7a756982e121d', 'instance_host': 'np0005559462.localdomain', 'flavor': {'id': '2da0e147-aaa7-4bb9-a176-5fe1b15a32a0', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e'}, 'image_ref': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:74:df:7c', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap03ef8889-32'}, 'message_id': 'ca67ed22-d99c-11f0-817e-fa163ebaca0f', 'monotonic_time': 11854.348269613, 'message_signature': '46fdc8e58c2c852fb9577928e361c187db855c29d293934cb8fe94bf6b8e77ab'}]}, 'timestamp': '2025-12-15 09:59:48.242227', '_unique_id': '746c893a007c437c8678f18bacaeed12'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.243 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.243 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.243 12 ERROR oslo_messaging.notify.messaging yield
Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.243 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.243 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.243 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.243 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.243 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.243 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.243 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.243 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.243 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.243 12 ERROR oslo_messaging.notify.messaging conn.connect()
Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.243 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.243 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.243 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.243 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.243 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.243 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.243 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.243 12 ERROR oslo_messaging.notify.messaging
Dec 15 04:59:48 localhost
ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.243 12 ERROR oslo_messaging.notify.messaging Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.243 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.243 12 ERROR oslo_messaging.notify.messaging Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.243 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.243 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.243 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.243 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.243 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.243 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.243 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.243 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", 
line 653, in _send Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.243 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.243 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.243 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.243 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.243 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.243 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.243 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.243 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.243 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.243 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.243 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.243 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.243 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.243 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.243 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.243 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.243 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.243 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.243 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.243 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Dec 15 04:59:48 localhost 
ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.243 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.243 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 15 04:59:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 09:59:48.243 12 ERROR oslo_messaging.notify.messaging Dec 15 04:59:48 localhost ceph-mon[298913]: from='mgr.34411 ' entity='mgr.np0005559464.aomnqe' Dec 15 04:59:48 localhost ceph-mon[298913]: from='mgr.34411 ' entity='mgr.np0005559464.aomnqe' Dec 15 04:59:48 localhost ceph-mon[298913]: from='mgr.34411 ' entity='mgr.np0005559464.aomnqe' Dec 15 04:59:48 localhost ceph-mon[298913]: from='mgr.34411 ' entity='mgr.np0005559464.aomnqe' Dec 15 04:59:48 localhost ceph-mon[298913]: from='mgr.34411 ' entity='mgr.np0005559464.aomnqe' Dec 15 04:59:48 localhost ceph-mon[298913]: from='mgr.34411 ' entity='mgr.np0005559464.aomnqe' Dec 15 04:59:48 localhost ceph-mon[298913]: mon.np0005559462@0(leader) e17 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/osd_remove_queue}] v 0) Dec 15 04:59:48 localhost ceph-mon[298913]: log_channel(audit) log [INF] : from='mgr.34411 ' entity='mgr.np0005559464.aomnqe' Dec 15 04:59:49 localhost ceph-mon[298913]: mon.np0005559462@0(leader) e17 handle_command mon_command([{prefix=config-key set, key=mgr/progress/completed}] v 0) Dec 15 04:59:49 localhost ceph-mon[298913]: log_channel(audit) log [INF] : from='mgr.34411 ' entity='mgr.np0005559464.aomnqe' Dec 15 04:59:49 localhost ceph-mon[298913]: from='mgr.34411 172.18.0.108:0/929118679' entity='mgr.np0005559464.aomnqe' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch Dec 15 04:59:49 localhost ceph-mon[298913]: from='mgr.34411 ' entity='mgr.np0005559464.aomnqe' Dec 15 04:59:49 localhost ceph-mon[298913]: from='mgr.34411 ' entity='mgr.np0005559464.aomnqe' Dec 15 04:59:50 
localhost ceph-mon[298913]: mon.np0005559462@0(leader).osd e92 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Dec 15 04:59:51 localhost nova_compute[286344]: 2025-12-15 09:59:51.141 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Dec 15 04:59:51 localhost nova_compute[286344]: 2025-12-15 09:59:51.144 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Dec 15 04:59:51 localhost nova_compute[286344]: 2025-12-15 09:59:51.144 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5004 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m Dec 15 04:59:51 localhost nova_compute[286344]: 2025-12-15 09:59:51.145 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Dec 15 04:59:51 localhost nova_compute[286344]: 2025-12-15 09:59:51.164 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 04:59:51 localhost nova_compute[286344]: 2025-12-15 09:59:51.165 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Dec 15 04:59:51 localhost ovn_metadata_agent[160585]: 2025-12-15 09:59:51.477 160590 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Dec 15 04:59:51 localhost ovn_metadata_agent[160585]: 2025-12-15 09:59:51.478 160590 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by 
"neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Dec 15 04:59:51 localhost ovn_metadata_agent[160585]: 2025-12-15 09:59:51.479 160590 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Dec 15 04:59:55 localhost ceph-mon[298913]: mon.np0005559462@0(leader).osd e92 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Dec 15 04:59:56 localhost nova_compute[286344]: 2025-12-15 09:59:56.165 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Dec 15 04:59:56 localhost nova_compute[286344]: 2025-12-15 09:59:56.167 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 04:59:56 localhost nova_compute[286344]: 2025-12-15 09:59:56.167 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m Dec 15 04:59:56 localhost nova_compute[286344]: 2025-12-15 09:59:56.167 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Dec 15 04:59:56 localhost nova_compute[286344]: 2025-12-15 09:59:56.168 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Dec 15 04:59:56 localhost nova_compute[286344]: 2025-12-15 09:59:56.171 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup 
/usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:00:00 localhost ceph-mon[298913]: log_channel(cluster) log [INF] : overall HEALTH_OK Dec 15 05:00:00 localhost ceph-mon[298913]: mon.np0005559462@0(leader).osd e92 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Dec 15 05:00:00 localhost ceph-mon[298913]: overall HEALTH_OK Dec 15 05:00:01 localhost nova_compute[286344]: 2025-12-15 10:00:01.173 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Dec 15 05:00:01 localhost nova_compute[286344]: 2025-12-15 10:00:01.175 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Dec 15 05:00:01 localhost nova_compute[286344]: 2025-12-15 10:00:01.175 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m Dec 15 05:00:01 localhost nova_compute[286344]: 2025-12-15 10:00:01.175 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Dec 15 05:00:01 localhost nova_compute[286344]: 2025-12-15 10:00:01.198 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:00:01 localhost nova_compute[286344]: 2025-12-15 10:00:01.199 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m Dec 15 05:00:01 localhost podman[243449]: time="2025-12-15T10:00:01Z" level=info msg="List containers: received `last` parameter - overwriting `limit`" Dec 15 05:00:01 localhost podman[243449]: @ - - [15/Dec/2025:10:00:01 +0000] "GET 
/v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 156640 "" "Go-http-client/1.1" Dec 15 05:00:01 localhost podman[243449]: @ - - [15/Dec/2025:10:00:01 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 19232 "" "Go-http-client/1.1" Dec 15 05:00:02 localhost nova_compute[286344]: 2025-12-15 10:00:02.168 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:00:02 localhost systemd[1]: Started /usr/bin/podman healthcheck run 67ee72fcd99c896f79e8807764b1d0f04e7d45927d06e306c0986394e8ed42a0. Dec 15 05:00:02 localhost systemd[1]: Started /usr/bin/podman healthcheck run 9f0f481c937606298203797a71924b7614a5d9e3f4a8afd837cfa5b9602aedeb. Dec 15 05:00:02 localhost systemd[1]: Started /usr/bin/podman healthcheck run b3d8483ade539500e228c2af9c0c3e647febb5ec7903610a83aea32631007d4a. Dec 15 05:00:02 localhost podman[313606]: 2025-12-15 10:00:02.761944975 +0000 UTC m=+0.087299324 container health_status 67ee72fcd99c896f79e8807764b1d0f04e7d45927d06e306c0986394e8ed42a0 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': 
'/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible) Dec 15 05:00:02 localhost podman[313606]: 2025-12-15 10:00:02.771447299 +0000 UTC m=+0.096801668 container exec_died 67ee72fcd99c896f79e8807764b1d0f04e7d45927d06e306c0986394e8ed42a0 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, managed_by=edpm_ansible, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, 
container_name=node_exporter, maintainer=The Prometheus Authors ) Dec 15 05:00:02 localhost systemd[1]: 67ee72fcd99c896f79e8807764b1d0f04e7d45927d06e306c0986394e8ed42a0.service: Deactivated successfully. Dec 15 05:00:02 localhost ovn_metadata_agent[160585]: 2025-12-15 10:00:02.792 160590 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=8, ssl=[], options={'arp_ns_explicit_output': 'true', 'fdb_removal_limit': '0', 'ignore_lsp_down': 'false', 'mac_binding_removal_limit': '0', 'mac_prefix': 'fe:17:e3', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': 'fe:55:2b:86:15:b5'}, ipsec=False) old=SB_Global(nb_cfg=7) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Dec 15 05:00:02 localhost ovn_metadata_agent[160585]: 2025-12-15 10:00:02.793 160590 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 7 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m Dec 15 05:00:02 localhost nova_compute[286344]: 2025-12-15 10:00:02.830 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:00:02 localhost systemd[1]: tmp-crun.9USxm0.mount: Deactivated successfully. 
Dec 15 05:00:02 localhost podman[313608]: 2025-12-15 10:00:02.876868239 +0000 UTC m=+0.195649965 container health_status b3d8483ade539500e228c2af9c0c3e647febb5ec7903610a83aea32631007d4a (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, tcib_managed=true, config_id=ceilometer_agent_compute, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '07f3af15d79276418f54a8dde2fc251064527c3571430e14af77aaa2c01f1193-cb1e9478634a00c8fb5ad9cd7fcd38c7b2a1a85187dfaa618bc715c24c2b15fb'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, managed_by=edpm_ansible, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, 
org.label-schema.name=CentOS Stream 9 Base Image) Dec 15 05:00:02 localhost podman[313608]: 2025-12-15 10:00:02.891504689 +0000 UTC m=+0.210286415 container exec_died b3d8483ade539500e228c2af9c0c3e647febb5ec7903610a83aea32631007d4a (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, org.label-schema.vendor=CentOS, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '07f3af15d79276418f54a8dde2fc251064527c3571430e14af77aaa2c01f1193-cb1e9478634a00c8fb5ad9cd7fcd38c7b2a1a85187dfaa618bc715c24c2b15fb'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.build-date=20251202, managed_by=edpm_ansible, config_id=ceilometer_agent_compute, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, 
maintainer=OpenStack Kubernetes Operator team) Dec 15 05:00:02 localhost podman[313607]: 2025-12-15 10:00:02.849854932 +0000 UTC m=+0.172262728 container health_status 9f0f481c937606298203797a71924b7614a5d9e3f4a8afd837cfa5b9602aedeb (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_id=multipathd, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202) Dec 15 05:00:02 localhost systemd[1]: b3d8483ade539500e228c2af9c0c3e647febb5ec7903610a83aea32631007d4a.service: 
Deactivated successfully.
Dec 15 05:00:02 localhost podman[313607]: 2025-12-15 10:00:02.933330701 +0000 UTC m=+0.255738427 container exec_died 9f0f481c937606298203797a71924b7614a5d9e3f4a8afd837cfa5b9602aedeb (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, config_id=multipathd, container_name=multipathd, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']})
Dec 15 05:00:02 localhost systemd[1]: 9f0f481c937606298203797a71924b7614a5d9e3f4a8afd837cfa5b9602aedeb.service: Deactivated successfully.
Dec 15 05:00:04 localhost systemd[1]: Started /usr/bin/podman healthcheck run 730c50020a7d9e2da21d2462d40af20ac303e43f3491f692cbbd87f432ba3e09.
Dec 15 05:00:04 localhost systemd[1]: Started /usr/bin/podman healthcheck run ac2eae30a2a7f971398af42ea67b3ed799d4b3341f7688ebcd73f7ca7f66c2a8.
Dec 15 05:00:04 localhost podman[313666]: 2025-12-15 10:00:04.754439102 +0000 UTC m=+0.080647169 container health_status ac2eae30a2a7f971398af42ea67b3ed799d4b3341f7688ebcd73f7ca7f66c2a8 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, managed_by=edpm_ansible, io.buildah.version=1.41.3, config_id=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.build-date=20251202, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '07f3af15d79276418f54a8dde2fc251064527c3571430e14af77aaa2c01f1193'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Dec 15 05:00:04 localhost podman[313666]: 2025-12-15 10:00:04.796257704 +0000 UTC m=+0.122465741 container exec_died ac2eae30a2a7f971398af42ea67b3ed799d4b3341f7688ebcd73f7ca7f66c2a8 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, container_name=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '07f3af15d79276418f54a8dde2fc251064527c3571430e14af77aaa2c01f1193'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.build-date=20251202)
Dec 15 05:00:04 localhost podman[313665]: 2025-12-15 10:00:04.806078415 +0000 UTC m=+0.133199075 container health_status 730c50020a7d9e2da21d2462d40af20ac303e43f3491f692cbbd87f432ba3e09 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, config_id=openstack_network_exporter, container_name=openstack_network_exporter, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., architecture=x86_64, com.redhat.component=ubi9-minimal-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, managed_by=edpm_ansible, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-type=git, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'cb1e9478634a00c8fb5ad9cd7fcd38c7b2a1a85187dfaa618bc715c24c2b15fb'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, build-date=2025-08-20T13:12:41, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, maintainer=Red Hat, Inc., io.openshift.tags=minimal rhel9, vendor=Red Hat, Inc., distribution-scope=public, name=ubi9-minimal, version=9.6, io.openshift.expose-services=, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, release=1755695350, url=https://catalog.redhat.com/en/search?searchType=containers)
Dec 15 05:00:04 localhost systemd[1]: ac2eae30a2a7f971398af42ea67b3ed799d4b3341f7688ebcd73f7ca7f66c2a8.service: Deactivated successfully.
Dec 15 05:00:04 localhost podman[313665]: 2025-12-15 10:00:04.813250862 +0000 UTC m=+0.140371522 container exec_died 730c50020a7d9e2da21d2462d40af20ac303e43f3491f692cbbd87f432ba3e09 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., distribution-scope=public, io.buildah.version=1.33.7, release=1755695350, vendor=Red Hat, Inc., architecture=x86_64, com.redhat.component=ubi9-minimal-container, io.openshift.expose-services=, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vcs-type=git, managed_by=edpm_ansible, config_id=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, name=ubi9-minimal, url=https://catalog.redhat.com/en/search?searchType=containers, build-date=2025-08-20T13:12:41, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, container_name=openstack_network_exporter, version=9.6, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'cb1e9478634a00c8fb5ad9cd7fcd38c7b2a1a85187dfaa618bc715c24c2b15fb'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, maintainer=Red Hat, Inc., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9.)
Dec 15 05:00:04 localhost systemd[1]: 730c50020a7d9e2da21d2462d40af20ac303e43f3491f692cbbd87f432ba3e09.service: Deactivated successfully.
Dec 15 05:00:04 localhost openstack_network_exporter[246484]: ERROR 10:00:04 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 15 05:00:04 localhost openstack_network_exporter[246484]: ERROR 10:00:04 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 15 05:00:04 localhost openstack_network_exporter[246484]: ERROR 10:00:04 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Dec 15 05:00:04 localhost openstack_network_exporter[246484]: ERROR 10:00:04 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Dec 15 05:00:04 localhost openstack_network_exporter[246484]:
Dec 15 05:00:04 localhost openstack_network_exporter[246484]: ERROR 10:00:04 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Dec 15 05:00:04 localhost openstack_network_exporter[246484]:
Dec 15 05:00:05 localhost nova_compute[286344]: 2025-12-15 10:00:05.257 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 15 05:00:05 localhost ceph-mon[298913]: mon.np0005559462@0(leader).osd e92 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 15 05:00:06 localhost nova_compute[286344]: 2025-12-15 10:00:06.244 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 15 05:00:07 localhost nova_compute[286344]: 2025-12-15 10:00:07.733 286348 DEBUG oslo_service.periodic_task [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec 15 05:00:07 localhost nova_compute[286344]: 2025-12-15 10:00:07.734 286348 DEBUG oslo_service.periodic_task [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec 15 05:00:08 localhost nova_compute[286344]: 2025-12-15 10:00:08.270 286348 DEBUG oslo_service.periodic_task [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec 15 05:00:09 localhost nova_compute[286344]: 2025-12-15 10:00:09.270 286348 DEBUG oslo_service.periodic_task [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec 15 05:00:09 localhost ovn_metadata_agent[160585]: 2025-12-15 10:00:09.795 160590 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=12d96d64-e862-4f68-81e5-8d9ec5d3a5e2, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '8'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec 15 05:00:10 localhost nova_compute[286344]: 2025-12-15 10:00:10.270 286348 DEBUG oslo_service.periodic_task [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec 15 05:00:10 localhost nova_compute[286344]: 2025-12-15 10:00:10.270 286348 DEBUG nova.compute.manager [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Dec 15 05:00:10 localhost nova_compute[286344]: 2025-12-15 10:00:10.271 286348 DEBUG nova.compute.manager [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Dec 15 05:00:10 localhost ceph-mon[298913]: mon.np0005559462@0(leader).osd e92 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 15 05:00:10 localhost nova_compute[286344]: 2025-12-15 10:00:10.413 286348 DEBUG oslo_concurrency.lockutils [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Acquiring lock "refresh_cache-39ff1bd9-6f6b-44c8-bbec-a1fd9d196359" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Dec 15 05:00:10 localhost nova_compute[286344]: 2025-12-15 10:00:10.413 286348 DEBUG oslo_concurrency.lockutils [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Acquired lock "refresh_cache-39ff1bd9-6f6b-44c8-bbec-a1fd9d196359" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Dec 15 05:00:10 localhost nova_compute[286344]: 2025-12-15 10:00:10.414 286348 DEBUG nova.network.neutron [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] [instance: 39ff1bd9-6f6b-44c8-bbec-a1fd9d196359] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m
Dec 15 05:00:10 localhost nova_compute[286344]: 2025-12-15 10:00:10.414 286348 DEBUG nova.objects.instance [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Lazy-loading 'info_cache' on Instance uuid 39ff1bd9-6f6b-44c8-bbec-a1fd9d196359 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Dec 15 05:00:10 localhost neutron_dhcp_agent[267542]: 2025-12-15 10:00:10.885 267546 INFO neutron.agent.linux.ip_lib [None req-e823ee84-17c6-4366-9205-5c56b0dfcced - - - - - -] Device tapf642a259-16 cannot be used as it has no MAC address#033[00m
Dec 15 05:00:10 localhost nova_compute[286344]: 2025-12-15 10:00:10.910 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 15 05:00:10 localhost kernel: device tapf642a259-16 entered promiscuous mode
Dec 15 05:00:10 localhost NetworkManager[5963]: [1765792810.9170] manager: (tapf642a259-16): new Generic device (/org/freedesktop/NetworkManager/Devices/18)
Dec 15 05:00:10 localhost ovn_controller[154603]: 2025-12-15T10:00:10Z|00074|binding|INFO|Claiming lport f642a259-1650-438c-bdfd-d9145a0c7b84 for this chassis.
Dec 15 05:00:10 localhost ovn_controller[154603]: 2025-12-15T10:00:10Z|00075|binding|INFO|f642a259-1650-438c-bdfd-d9145a0c7b84: Claiming unknown
Dec 15 05:00:10 localhost nova_compute[286344]: 2025-12-15 10:00:10.919 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 15 05:00:10 localhost systemd-udevd[313722]: Network interface NamePolicy= disabled on kernel command line.
Dec 15 05:00:10 localhost ovn_metadata_agent[160585]: 2025-12-15 10:00:10.931 160590 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005559462.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.3/28', 'neutron:device_id': 'dhcpce287b4b-a043-5ed9-8e08-4fd555d768a2-0a27b0d3-52a4-4f8c-9083-769f5c00765d', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-0a27b0d3-52a4-4f8c-9083-769f5c00765d', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '7526febbe14640b09a9f0897a2f4af8c', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=fb6271da-cb66-4d8c-bc07-07fd6816385a, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=f642a259-1650-438c-bdfd-d9145a0c7b84) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Dec 15 05:00:10 localhost ovn_metadata_agent[160585]: 2025-12-15 10:00:10.933 160590 INFO neutron.agent.ovn.metadata.agent [-] Port f642a259-1650-438c-bdfd-d9145a0c7b84 in datapath 0a27b0d3-52a4-4f8c-9083-769f5c00765d bound to our chassis#033[00m
Dec 15 05:00:10 localhost ovn_metadata_agent[160585]: 2025-12-15 10:00:10.936 160590 DEBUG neutron.agent.ovn.metadata.agent [-] Port 4c35947b-0bff-4c73-a152-ea54c58ebbf2 IP addresses were not retrieved from the Port_Binding MAC column ['unknown'] _get_port_ips /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:536#033[00m
Dec 15 05:00:10 localhost ovn_metadata_agent[160585]: 2025-12-15 10:00:10.936 160590 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 0a27b0d3-52a4-4f8c-9083-769f5c00765d, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Dec 15 05:00:10 localhost ovn_metadata_agent[160585]: 2025-12-15 10:00:10.937 160858 DEBUG oslo.privsep.daemon [-] privsep: reply[5a1fdb54-2200-4379-84ea-00ed06b252d5]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 15 05:00:10 localhost journal[231322]: libvirt version: 11.9.0, package: 1.el9 (builder@centos.org, 2025-11-04-09:54:50, )
Dec 15 05:00:10 localhost journal[231322]: hostname: np0005559462.localdomain
Dec 15 05:00:10 localhost journal[231322]: ethtool ioctl error on tapf642a259-16: No such device
Dec 15 05:00:10 localhost ovn_controller[154603]: 2025-12-15T10:00:10Z|00076|binding|INFO|Setting lport f642a259-1650-438c-bdfd-d9145a0c7b84 ovn-installed in OVS
Dec 15 05:00:10 localhost ovn_controller[154603]: 2025-12-15T10:00:10Z|00077|binding|INFO|Setting lport f642a259-1650-438c-bdfd-d9145a0c7b84 up in Southbound
Dec 15 05:00:10 localhost nova_compute[286344]: 2025-12-15 10:00:10.955 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 15 05:00:10 localhost journal[231322]: ethtool ioctl error on tapf642a259-16: No such device
Dec 15 05:00:10 localhost journal[231322]: ethtool ioctl error on tapf642a259-16: No such device
Dec 15 05:00:10 localhost journal[231322]: ethtool ioctl error on tapf642a259-16: No such device
Dec 15 05:00:10 localhost journal[231322]: ethtool ioctl error on tapf642a259-16: No such device
Dec 15 05:00:10 localhost journal[231322]: ethtool ioctl error on tapf642a259-16: No such device
Dec 15 05:00:10 localhost journal[231322]: ethtool ioctl error on tapf642a259-16: No such device
Dec 15 05:00:10 localhost journal[231322]: ethtool ioctl error on tapf642a259-16: No such device
Dec 15 05:00:10 localhost nova_compute[286344]: 2025-12-15 10:00:10.995 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 15 05:00:11 localhost nova_compute[286344]: 2025-12-15 10:00:11.023 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 15 05:00:11 localhost nova_compute[286344]: 2025-12-15 10:00:11.122 286348 DEBUG nova.network.neutron [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] [instance: 39ff1bd9-6f6b-44c8-bbec-a1fd9d196359] Updating instance_info_cache with network_info: [{"id": "03ef8889-3216-43fb-8a52-4be17a956ce1", "address": "fa:16:3e:74:df:7c", "network": {"id": "befb7a72-17a9-4bcb-b561-84b8f626685a", "bridge": "br-int", "label": "private", "subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.201", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.20", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.0.3"}}], "meta": {"injected": false, "tenant_id": "c785bf23f53946bc99867d8832a50266", "mtu": 1292, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap03ef8889-32", "ovs_interfaceid": "03ef8889-3216-43fb-8a52-4be17a956ce1", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Dec 15 05:00:11 localhost nova_compute[286344]: 2025-12-15 10:00:11.137 286348 DEBUG oslo_concurrency.lockutils [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Releasing lock "refresh_cache-39ff1bd9-6f6b-44c8-bbec-a1fd9d196359" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Dec 15 05:00:11 localhost nova_compute[286344]: 2025-12-15 10:00:11.137 286348 DEBUG nova.compute.manager [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] [instance: 39ff1bd9-6f6b-44c8-bbec-a1fd9d196359] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m
Dec 15 05:00:11 localhost nova_compute[286344]: 2025-12-15 10:00:11.138 286348 DEBUG oslo_service.periodic_task [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec 15 05:00:11 localhost nova_compute[286344]: 2025-12-15 10:00:11.254 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 15 05:00:11 localhost nova_compute[286344]: 2025-12-15 10:00:11.270 286348 DEBUG oslo_service.periodic_task [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec 15 05:00:11 localhost nova_compute[286344]: 2025-12-15 10:00:11.270 286348 DEBUG nova.compute.manager [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Dec 15 05:00:11 localhost nova_compute[286344]: 2025-12-15 10:00:11.270 286348 DEBUG oslo_service.periodic_task [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec 15 05:00:11 localhost nova_compute[286344]: 2025-12-15 10:00:11.288 286348 DEBUG oslo_concurrency.lockutils [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec 15 05:00:11 localhost nova_compute[286344]: 2025-12-15 10:00:11.288 286348 DEBUG oslo_concurrency.lockutils [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec 15 05:00:11 localhost nova_compute[286344]: 2025-12-15 10:00:11.289 286348 DEBUG oslo_concurrency.lockutils [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec 15 05:00:11 localhost nova_compute[286344]: 2025-12-15 10:00:11.289 286348 DEBUG nova.compute.resource_tracker [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Auditing locally available compute resources for np0005559462.localdomain (node: np0005559462.localdomain) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Dec 15 05:00:11 localhost nova_compute[286344]: 2025-12-15 10:00:11.290 286348 DEBUG oslo_concurrency.processutils [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec 15 05:00:11 localhost ceph-mon[298913]: mon.np0005559462@0(leader) e17 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Dec 15 05:00:11 localhost ceph-mon[298913]: log_channel(audit) log [DBG] : from='client.? 172.18.0.106:0/4161302188' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Dec 15 05:00:11 localhost nova_compute[286344]: 2025-12-15 10:00:11.720 286348 DEBUG oslo_concurrency.processutils [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.430s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec 15 05:00:11 localhost podman[313813]:
Dec 15 05:00:11 localhost podman[313813]: 2025-12-15 10:00:11.734078652 +0000 UTC m=+0.060432470 container create 7b146303a978bcb3df5ff2887887992e6db4ec060d166943872f70bbc5777a10 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-0a27b0d3-52a4-4f8c-9083-769f5c00765d, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2)
Dec 15 05:00:11 localhost systemd[1]: Started libpod-conmon-7b146303a978bcb3df5ff2887887992e6db4ec060d166943872f70bbc5777a10.scope.
Dec 15 05:00:11 localhost systemd[1]: Started libcrun container.
Dec 15 05:00:11 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/593702a771b26b2b4a70e8a165408efa4c4335b8fc45cb653feb8bed49b6b747/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Dec 15 05:00:11 localhost nova_compute[286344]: 2025-12-15 10:00:11.791 286348 DEBUG nova.virt.libvirt.driver [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Dec 15 05:00:11 localhost nova_compute[286344]: 2025-12-15 10:00:11.792 286348 DEBUG nova.virt.libvirt.driver [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Dec 15 05:00:11 localhost podman[313813]: 2025-12-15 10:00:11.698633839 +0000 UTC m=+0.024987677 image pull quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified
Dec 15 05:00:11 localhost podman[313813]: 2025-12-15 10:00:11.801193357 +0000 UTC m=+0.127547185 container init 7b146303a978bcb3df5ff2887887992e6db4ec060d166943872f70bbc5777a10 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-0a27b0d3-52a4-4f8c-9083-769f5c00765d, org.label-schema.build-date=20251202, io.buildah.version=1.41.3, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Dec 15 05:00:11 localhost podman[313813]: 2025-12-15 10:00:11.813107801 +0000 UTC m=+0.139461619 container start 7b146303a978bcb3df5ff2887887992e6db4ec060d166943872f70bbc5777a10 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-0a27b0d3-52a4-4f8c-9083-769f5c00765d, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true)
Dec 15 05:00:11 localhost dnsmasq[313833]: started, version 2.85 cachesize 150
Dec 15 05:00:11 localhost dnsmasq[313833]: DNS service limited to local subnets
Dec 15 05:00:11 localhost dnsmasq[313833]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile
Dec 15 05:00:11 localhost dnsmasq[313833]: warning: no upstream servers configured
Dec 15 05:00:11 localhost dnsmasq-dhcp[313833]: DHCP, static leases only on 10.100.0.0, lease time 1d
Dec 15 05:00:11 localhost dnsmasq[313833]: read /var/lib/neutron/dhcp/0a27b0d3-52a4-4f8c-9083-769f5c00765d/addn_hosts - 0 addresses
Dec 15 05:00:11 localhost dnsmasq-dhcp[313833]: read /var/lib/neutron/dhcp/0a27b0d3-52a4-4f8c-9083-769f5c00765d/host
Dec 15 05:00:11 localhost dnsmasq-dhcp[313833]: read /var/lib/neutron/dhcp/0a27b0d3-52a4-4f8c-9083-769f5c00765d/opts
Dec 15 05:00:11 localhost nova_compute[286344]: 2025-12-15 10:00:11.934 286348 WARNING nova.virt.libvirt.driver [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Dec 15 05:00:11 localhost nova_compute[286344]: 2025-12-15 10:00:11.936 286348 DEBUG nova.compute.resource_tracker [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Hypervisor/Node resource view: name=np0005559462.localdomain free_ram=11430MB free_disk=41.836978912353516GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Dec 15 05:00:11 localhost nova_compute[286344]: 2025-12-15 10:00:11.936 286348 DEBUG oslo_concurrency.lockutils [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec 15 05:00:11 localhost nova_compute[286344]: 2025-12-15 10:00:11.936 286348 DEBUG oslo_concurrency.lockutils [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec 15 05:00:11 localhost neutron_dhcp_agent[267542]: 2025-12-15 10:00:11.986 267546 INFO neutron.agent.dhcp.agent [None req-9b490f23-df74-4f28-abe6-c938fc034344 - - - - - -] DHCP configuration for ports {'f9cedc65-c9c9-4b1d-b399-457547cdf8e7'} is completed#033[00m
Dec 15 05:00:12 localhost nova_compute[286344]: 2025-12-15 10:00:12.028 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 15 05:00:12 localhost nova_compute[286344]: 2025-12-15 10:00:12.234 286348 DEBUG nova.compute.resource_tracker [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Instance 39ff1bd9-6f6b-44c8-bbec-a1fd9d196359 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 2, 'MEMORY_MB': 512, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Dec 15 05:00:12 localhost nova_compute[286344]: 2025-12-15 10:00:12.234 286348 DEBUG nova.compute.resource_tracker [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Dec 15 05:00:12 localhost nova_compute[286344]: 2025-12-15 10:00:12.235 286348 DEBUG nova.compute.resource_tracker [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Final resource view: name=np0005559462.localdomain phys_ram=15738MB used_ram=1024MB phys_disk=41GB used_disk=2GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Dec 15 05:00:12 localhost nova_compute[286344]: 2025-12-15 10:00:12.385 286348 DEBUG oslo_concurrency.processutils [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec 15 05:00:12 localhost ceph-mon[298913]: mon.np0005559462@0(leader) e17 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Dec 15 05:00:12 localhost ceph-mon[298913]: log_channel(audit) log [DBG] : from='client.? 172.18.0.106:0/3566470543' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Dec 15 05:00:12 localhost nova_compute[286344]: 2025-12-15 10:00:12.892 286348 DEBUG oslo_concurrency.processutils [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.507s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec 15 05:00:12 localhost nova_compute[286344]: 2025-12-15 10:00:12.899 286348 DEBUG nova.compute.provider_tree [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Inventory has not changed in ProviderTree for provider: 26c8956b-6742-4951-b566-971b9bbe323b update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Dec 15 05:00:12 localhost nova_compute[286344]: 2025-12-15 10:00:12.916 286348 DEBUG nova.scheduler.client.report [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Inventory has not changed for provider 26c8956b-6742-4951-b566-971b9bbe323b based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Dec 15 05:00:12 localhost nova_compute[286344]: 2025-12-15 10:00:12.918 286348 DEBUG nova.compute.resource_tracker [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Compute_service record updated for np0005559462.localdomain:np0005559462.localdomain _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Dec 15 05:00:12 localhost nova_compute[286344]: 2025-12-15 10:00:12.919 286348 DEBUG
oslo_concurrency.lockutils [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.982s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Dec 15 05:00:13 localhost nova_compute[286344]: 2025-12-15 10:00:13.270 286348 DEBUG oslo_service.periodic_task [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Dec 15 05:00:13 localhost nova_compute[286344]: 2025-12-15 10:00:13.270 286348 DEBUG oslo_service.periodic_task [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Dec 15 05:00:13 localhost nova_compute[286344]: 2025-12-15 10:00:13.271 286348 DEBUG nova.compute.manager [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Cleaning up deleted instances with incomplete migration _cleanup_incomplete_migrations /usr/lib/python3.9/site-packages/nova/compute/manager.py:11183#033[00m Dec 15 05:00:13 localhost neutron_dhcp_agent[267542]: 2025-12-15 10:00:13.465 267546 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-12-15T10:00:12Z, description=, device_id=17fad7bc-03bf-4750-b5a1-50044b4f0a59, device_owner=network:router_interface, dns_assignment=[], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[], id=4053722c-223b-4b1e-aa18-287d7c614227, ip_allocation=immediate, mac_address=fa:16:3e:dd:5c:09, name=, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], 
created_at=2025-12-15T10:00:07Z, description=, dns_domain=, id=0a27b0d3-52a4-4f8c-9083-769f5c00765d, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-LiveMigrationTest-893534742-network, port_security_enabled=True, project_id=7526febbe14640b09a9f0897a2f4af8c, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=59437, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=285, status=ACTIVE, subnets=['92c94f9d-deb3-48de-ab15-aef7a594414d'], tags=[], tenant_id=7526febbe14640b09a9f0897a2f4af8c, updated_at=2025-12-15T10:00:08Z, vlan_transparent=None, network_id=0a27b0d3-52a4-4f8c-9083-769f5c00765d, port_security_enabled=False, project_id=7526febbe14640b09a9f0897a2f4af8c, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=[], standard_attr_id=336, status=DOWN, tags=[], tenant_id=7526febbe14640b09a9f0897a2f4af8c, updated_at=2025-12-15T10:00:13Z on network 0a27b0d3-52a4-4f8c-9083-769f5c00765d#033[00m Dec 15 05:00:13 localhost systemd[1]: Started /usr/bin/podman healthcheck run 4d46c406a3855fd1c322e5d86c048188ea5865daa07ba23aaebacc7879668923. Dec 15 05:00:13 localhost systemd[1]: tmp-crun.oGfpUQ.mount: Deactivated successfully. 
Dec 15 05:00:13 localhost dnsmasq[313833]: read /var/lib/neutron/dhcp/0a27b0d3-52a4-4f8c-9083-769f5c00765d/addn_hosts - 1 addresses Dec 15 05:00:13 localhost dnsmasq-dhcp[313833]: read /var/lib/neutron/dhcp/0a27b0d3-52a4-4f8c-9083-769f5c00765d/host Dec 15 05:00:13 localhost podman[313872]: 2025-12-15 10:00:13.696450927 +0000 UTC m=+0.071927765 container kill 7b146303a978bcb3df5ff2887887992e6db4ec060d166943872f70bbc5777a10 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-0a27b0d3-52a4-4f8c-9083-769f5c00765d, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2) Dec 15 05:00:13 localhost dnsmasq-dhcp[313833]: read /var/lib/neutron/dhcp/0a27b0d3-52a4-4f8c-9083-769f5c00765d/opts Dec 15 05:00:13 localhost podman[313883]: 2025-12-15 10:00:13.777849194 +0000 UTC m=+0.100969971 container health_status 4d46c406a3855fd1c322e5d86c048188ea5865daa07ba23aaebacc7879668923 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '07f3af15d79276418f54a8dde2fc251064527c3571430e14af77aaa2c01f1193-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', 
'/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251202, container_name=ovn_metadata_agent, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, org.label-schema.schema-version=1.0) Dec 15 05:00:13 localhost podman[313883]: 2025-12-15 10:00:13.810425618 +0000 UTC m=+0.133546405 container exec_died 4d46c406a3855fd1c322e5d86c048188ea5865daa07ba23aaebacc7879668923 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '07f3af15d79276418f54a8dde2fc251064527c3571430e14af77aaa2c01f1193-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 
'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=ovn_metadata_agent) Dec 15 05:00:13 localhost systemd[1]: 4d46c406a3855fd1c322e5d86c048188ea5865daa07ba23aaebacc7879668923.service: Deactivated successfully. Dec 15 05:00:13 localhost neutron_dhcp_agent[267542]: 2025-12-15 10:00:13.955 267546 INFO neutron.agent.dhcp.agent [None req-1860f1fe-56c8-4e67-9228-87cbda099784 - - - - - -] DHCP configuration for ports {'4053722c-223b-4b1e-aa18-287d7c614227'} is completed#033[00m Dec 15 05:00:15 localhost ceph-mon[298913]: mon.np0005559462@0(leader).osd e92 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Dec 15 05:00:15 localhost ceph-mon[298913]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #43. Immutable memtables: 0. 
Dec 15 05:00:15 localhost ceph-mon[298913]: rocksdb: (Original Log Time 2025/12/15-10:00:15.409537) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0 Dec 15 05:00:15 localhost ceph-mon[298913]: rocksdb: [db/flush_job.cc:856] [default] [JOB 23] Flushing memtable with next log file: 43 Dec 15 05:00:15 localhost ceph-mon[298913]: rocksdb: EVENT_LOG_v1 {"time_micros": 1765792815409631, "job": 23, "event": "flush_started", "num_memtables": 1, "num_entries": 1040, "num_deletes": 251, "total_data_size": 1170965, "memory_usage": 1191024, "flush_reason": "Manual Compaction"} Dec 15 05:00:15 localhost ceph-mon[298913]: rocksdb: [db/flush_job.cc:885] [default] [JOB 23] Level-0 flush table #44: started Dec 15 05:00:15 localhost ceph-mon[298913]: rocksdb: EVENT_LOG_v1 {"time_micros": 1765792815419524, "cf_name": "default", "job": 23, "event": "table_file_creation", "file_number": 44, "file_size": 1152148, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 24151, "largest_seqno": 25190, "table_properties": {"data_size": 1147437, "index_size": 2310, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 1349, "raw_key_size": 11214, "raw_average_key_size": 20, "raw_value_size": 1137613, "raw_average_value_size": 2118, "num_data_blocks": 97, "num_entries": 537, "num_filter_entries": 537, "num_deletions": 251, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; 
max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1765792745, "oldest_key_time": 1765792745, "file_creation_time": 1765792815, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "603b24af-e2be-4214-bc56-9e652eb4af3d", "db_session_id": "0OJRM9SCUA16EXV0VQZ2", "orig_file_number": 44, "seqno_to_time_mapping": "N/A"}} Dec 15 05:00:15 localhost ceph-mon[298913]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 23] Flush lasted 10031 microseconds, and 4489 cpu microseconds. Dec 15 05:00:15 localhost ceph-mon[298913]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed. Dec 15 05:00:15 localhost ceph-mon[298913]: rocksdb: (Original Log Time 2025/12/15-10:00:15.419576) [db/flush_job.cc:967] [default] [JOB 23] Level-0 flush table #44: 1152148 bytes OK Dec 15 05:00:15 localhost ceph-mon[298913]: rocksdb: (Original Log Time 2025/12/15-10:00:15.419598) [db/memtable_list.cc:519] [default] Level-0 commit table #44 started Dec 15 05:00:15 localhost ceph-mon[298913]: rocksdb: (Original Log Time 2025/12/15-10:00:15.422220) [db/memtable_list.cc:722] [default] Level-0 commit table #44: memtable #1 done Dec 15 05:00:15 localhost ceph-mon[298913]: rocksdb: (Original Log Time 2025/12/15-10:00:15.422242) EVENT_LOG_v1 {"time_micros": 1765792815422236, "job": 23, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0} Dec 15 05:00:15 localhost ceph-mon[298913]: rocksdb: (Original Log Time 2025/12/15-10:00:15.422264) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25 Dec 15 05:00:15 localhost ceph-mon[298913]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 23] Try to delete WAL files size 1166050, prev total WAL file 
size 1166050, number of live WAL files 2. Dec 15 05:00:15 localhost ceph-mon[298913]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005559462/store.db/000040.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000 Dec 15 05:00:15 localhost ceph-mon[298913]: rocksdb: (Original Log Time 2025/12/15-10:00:15.422842) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '7061786F73003131353436' seq:72057594037927935, type:22 .. '7061786F73003131373938' seq:0, type:0; will stop at (end) Dec 15 05:00:15 localhost ceph-mon[298913]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 24] Compacting 1@0 + 1@6 files to L6, score -1.00 Dec 15 05:00:15 localhost ceph-mon[298913]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 23 Base level 0, inputs: [44(1125KB)], [42(17MB)] Dec 15 05:00:15 localhost ceph-mon[298913]: rocksdb: EVENT_LOG_v1 {"time_micros": 1765792815422883, "job": 24, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [44], "files_L6": [42], "score": -1, "input_data_size": 19644232, "oldest_snapshot_seqno": -1} Dec 15 05:00:15 localhost ceph-mon[298913]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 24] Generated table #45: 12246 keys, 17856330 bytes, temperature: kUnknown Dec 15 05:00:15 localhost ceph-mon[298913]: rocksdb: EVENT_LOG_v1 {"time_micros": 1765792815555319, "cf_name": "default", "job": 24, "event": "table_file_creation", "file_number": 45, "file_size": 17856330, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 17787796, "index_size": 36817, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 30661, "raw_key_size": 329824, "raw_average_key_size": 26, "raw_value_size": 
17580257, "raw_average_value_size": 1435, "num_data_blocks": 1397, "num_entries": 12246, "num_filter_entries": 12246, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1765792320, "oldest_key_time": 0, "file_creation_time": 1765792815, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "603b24af-e2be-4214-bc56-9e652eb4af3d", "db_session_id": "0OJRM9SCUA16EXV0VQZ2", "orig_file_number": 45, "seqno_to_time_mapping": "N/A"}} Dec 15 05:00:15 localhost ceph-mon[298913]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed. 
Dec 15 05:00:15 localhost ceph-mon[298913]: rocksdb: (Original Log Time 2025/12/15-10:00:15.555706) [db/compaction/compaction_job.cc:1663] [default] [JOB 24] Compacted 1@0 + 1@6 files to L6 => 17856330 bytes Dec 15 05:00:15 localhost ceph-mon[298913]: rocksdb: (Original Log Time 2025/12/15-10:00:15.557847) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 148.2 rd, 134.7 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(1.1, 17.6 +0.0 blob) out(17.0 +0.0 blob), read-write-amplify(32.5) write-amplify(15.5) OK, records in: 12778, records dropped: 532 output_compression: NoCompression Dec 15 05:00:15 localhost ceph-mon[298913]: rocksdb: (Original Log Time 2025/12/15-10:00:15.557875) EVENT_LOG_v1 {"time_micros": 1765792815557863, "job": 24, "event": "compaction_finished", "compaction_time_micros": 132556, "compaction_time_cpu_micros": 48902, "output_level": 6, "num_output_files": 1, "total_output_size": 17856330, "num_input_records": 12778, "num_output_records": 12246, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]} Dec 15 05:00:15 localhost ceph-mon[298913]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005559462/store.db/000044.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000 Dec 15 05:00:15 localhost ceph-mon[298913]: rocksdb: EVENT_LOG_v1 {"time_micros": 1765792815558350, "job": 24, "event": "table_file_deletion", "file_number": 44} Dec 15 05:00:15 localhost ceph-mon[298913]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005559462/store.db/000042.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000 Dec 15 05:00:15 localhost ceph-mon[298913]: rocksdb: EVENT_LOG_v1 {"time_micros": 1765792815561145, 
"job": 24, "event": "table_file_deletion", "file_number": 42} Dec 15 05:00:15 localhost ceph-mon[298913]: rocksdb: (Original Log Time 2025/12/15-10:00:15.422756) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting Dec 15 05:00:15 localhost ceph-mon[298913]: rocksdb: (Original Log Time 2025/12/15-10:00:15.561198) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting Dec 15 05:00:15 localhost ceph-mon[298913]: rocksdb: (Original Log Time 2025/12/15-10:00:15.561205) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting Dec 15 05:00:15 localhost ceph-mon[298913]: rocksdb: (Original Log Time 2025/12/15-10:00:15.561208) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting Dec 15 05:00:15 localhost ceph-mon[298913]: rocksdb: (Original Log Time 2025/12/15-10:00:15.561211) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting Dec 15 05:00:15 localhost ceph-mon[298913]: rocksdb: (Original Log Time 2025/12/15-10:00:15.561214) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting Dec 15 05:00:15 localhost nova_compute[286344]: 2025-12-15 10:00:15.747 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:00:16 localhost neutron_dhcp_agent[267542]: 2025-12-15 10:00:16.130 267546 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-12-15T10:00:12Z, description=, device_id=17fad7bc-03bf-4750-b5a1-50044b4f0a59, device_owner=network:router_interface, dns_assignment=[], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[], id=4053722c-223b-4b1e-aa18-287d7c614227, ip_allocation=immediate, mac_address=fa:16:3e:dd:5c:09, name=, 
network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-12-15T10:00:07Z, description=, dns_domain=, id=0a27b0d3-52a4-4f8c-9083-769f5c00765d, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-LiveMigrationTest-893534742-network, port_security_enabled=True, project_id=7526febbe14640b09a9f0897a2f4af8c, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=59437, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=285, status=ACTIVE, subnets=['92c94f9d-deb3-48de-ab15-aef7a594414d'], tags=[], tenant_id=7526febbe14640b09a9f0897a2f4af8c, updated_at=2025-12-15T10:00:08Z, vlan_transparent=None, network_id=0a27b0d3-52a4-4f8c-9083-769f5c00765d, port_security_enabled=False, project_id=7526febbe14640b09a9f0897a2f4af8c, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=[], standard_attr_id=336, status=DOWN, tags=[], tenant_id=7526febbe14640b09a9f0897a2f4af8c, updated_at=2025-12-15T10:00:13Z on network 0a27b0d3-52a4-4f8c-9083-769f5c00765d#033[00m Dec 15 05:00:16 localhost nova_compute[286344]: 2025-12-15 10:00:16.285 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:00:16 localhost dnsmasq[313833]: read /var/lib/neutron/dhcp/0a27b0d3-52a4-4f8c-9083-769f5c00765d/addn_hosts - 1 addresses Dec 15 05:00:16 localhost dnsmasq-dhcp[313833]: read /var/lib/neutron/dhcp/0a27b0d3-52a4-4f8c-9083-769f5c00765d/host Dec 15 05:00:16 localhost dnsmasq-dhcp[313833]: read /var/lib/neutron/dhcp/0a27b0d3-52a4-4f8c-9083-769f5c00765d/opts Dec 15 05:00:16 localhost systemd[1]: tmp-crun.Iq2sZz.mount: Deactivated successfully. 
Dec 15 05:00:16 localhost podman[313930]: 2025-12-15 10:00:16.402838776 +0000 UTC m=+0.060529094 container kill 7b146303a978bcb3df5ff2887887992e6db4ec060d166943872f70bbc5777a10 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-0a27b0d3-52a4-4f8c-9083-769f5c00765d, tcib_managed=true, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team) Dec 15 05:00:16 localhost neutron_dhcp_agent[267542]: 2025-12-15 10:00:16.620 267546 INFO neutron.agent.dhcp.agent [None req-9e12a376-3056-4bd9-b262-bd79937f917e - - - - - -] DHCP configuration for ports {'4053722c-223b-4b1e-aa18-287d7c614227'} is completed#033[00m Dec 15 05:00:17 localhost systemd[1]: Started /usr/bin/podman healthcheck run a4e94bd7aaee6c174c601060fb5f34559019ccea93fc397263ee3735731cfc1e. 
Dec 15 05:00:17 localhost podman[313952]: 2025-12-15 10:00:17.746184477 +0000 UTC m=+0.080713800 container health_status a4e94bd7aaee6c174c601060fb5f34559019ccea93fc397263ee3735731cfc1e (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, managed_by=edpm_ansible, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi ) Dec 15 05:00:17 localhost podman[313952]: 2025-12-15 10:00:17.754450791 +0000 UTC m=+0.088980074 container exec_died a4e94bd7aaee6c174c601060fb5f34559019ccea93fc397263ee3735731cfc1e (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', 
'/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible) Dec 15 05:00:17 localhost systemd[1]: a4e94bd7aaee6c174c601060fb5f34559019ccea93fc397263ee3735731cfc1e.service: Deactivated successfully. Dec 15 05:00:19 localhost nova_compute[286344]: 2025-12-15 10:00:19.294 286348 DEBUG oslo_service.periodic_task [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Dec 15 05:00:19 localhost nova_compute[286344]: 2025-12-15 10:00:19.294 286348 DEBUG nova.compute.manager [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11145#033[00m Dec 15 05:00:19 localhost nova_compute[286344]: 2025-12-15 10:00:19.307 286348 DEBUG nova.compute.manager [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11154#033[00m Dec 15 05:00:19 localhost nova_compute[286344]: 2025-12-15 10:00:19.678 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:00:19 localhost neutron_sriov_agent[260044]: 2025-12-15 10:00:19.879 2 INFO neutron.agent.securitygroups_rpc [None req-a278538e-dead-4c83-8c63-42c9855e1b6e 7eba5d40415d4fd7ab565eed1e07b799 4e26599ab4374eb6b7e9d73e7dfebd03 - - default default] Security group member updated ['489315f5-1e14-49f6-8195-de837d23b935']#033[00m Dec 15 05:00:20 localhost ceph-mon[298913]: mon.np0005559462@0(leader).osd e92 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Dec 15 05:00:21 localhost nova_compute[286344]: 
2025-12-15 10:00:21.270 286348 DEBUG oslo_service.periodic_task [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Dec 15 05:00:21 localhost nova_compute[286344]: 2025-12-15 10:00:21.289 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:00:21 localhost nova_compute[286344]: 2025-12-15 10:00:21.294 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:00:22 localhost neutron_dhcp_agent[267542]: 2025-12-15 10:00:22.733 267546 INFO neutron.agent.linux.ip_lib [None req-eb26a6bf-27d9-4fbf-b526-f326c0135062 - - - - - -] Device tap41381e1a-bb cannot be used as it has no MAC address#033[00m Dec 15 05:00:22 localhost nova_compute[286344]: 2025-12-15 10:00:22.797 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:00:22 localhost kernel: device tap41381e1a-bb entered promiscuous mode Dec 15 05:00:22 localhost NetworkManager[5963]: [1765792822.8061] manager: (tap41381e1a-bb): new Generic device (/org/freedesktop/NetworkManager/Devices/19) Dec 15 05:00:22 localhost nova_compute[286344]: 2025-12-15 10:00:22.807 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:00:22 localhost ovn_controller[154603]: 2025-12-15T10:00:22Z|00078|binding|INFO|Claiming lport 41381e1a-bb74-4360-9cd9-17d27f7371d9 for this chassis. 
Dec 15 05:00:22 localhost ovn_controller[154603]: 2025-12-15T10:00:22Z|00079|binding|INFO|41381e1a-bb74-4360-9cd9-17d27f7371d9: Claiming unknown Dec 15 05:00:22 localhost systemd-udevd[313987]: Network interface NamePolicy= disabled on kernel command line. Dec 15 05:00:22 localhost ovn_metadata_agent[160585]: 2025-12-15 10:00:22.819 160590 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005559462.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '19.80.0.2/24', 'neutron:device_id': 'dhcpce287b4b-a043-5ed9-8e08-4fd555d768a2-0874a63c-034d-48db-8f84-a51c1fe90687', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-0874a63c-034d-48db-8f84-a51c1fe90687', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '4e26599ab4374eb6b7e9d73e7dfebd03', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=d3aba774-abb1-4ddc-beaa-cd88849a49e2, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=41381e1a-bb74-4360-9cd9-17d27f7371d9) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Dec 15 05:00:22 localhost ovn_metadata_agent[160585]: 2025-12-15 10:00:22.822 160590 INFO neutron.agent.ovn.metadata.agent [-] Port 41381e1a-bb74-4360-9cd9-17d27f7371d9 in datapath 0874a63c-034d-48db-8f84-a51c1fe90687 bound to our chassis#033[00m Dec 15 05:00:22 localhost ovn_metadata_agent[160585]: 2025-12-15 
10:00:22.824 160590 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network 0874a63c-034d-48db-8f84-a51c1fe90687 or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599#033[00m Dec 15 05:00:22 localhost ovn_metadata_agent[160585]: 2025-12-15 10:00:22.825 160858 DEBUG oslo.privsep.daemon [-] privsep: reply[7418c000-ed1f-48c8-bc08-57e8374aef74]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Dec 15 05:00:22 localhost journal[231322]: ethtool ioctl error on tap41381e1a-bb: No such device Dec 15 05:00:22 localhost ovn_controller[154603]: 2025-12-15T10:00:22Z|00080|binding|INFO|Setting lport 41381e1a-bb74-4360-9cd9-17d27f7371d9 ovn-installed in OVS Dec 15 05:00:22 localhost ovn_controller[154603]: 2025-12-15T10:00:22Z|00081|binding|INFO|Setting lport 41381e1a-bb74-4360-9cd9-17d27f7371d9 up in Southbound Dec 15 05:00:22 localhost journal[231322]: ethtool ioctl error on tap41381e1a-bb: No such device Dec 15 05:00:22 localhost nova_compute[286344]: 2025-12-15 10:00:22.837 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:00:22 localhost journal[231322]: ethtool ioctl error on tap41381e1a-bb: No such device Dec 15 05:00:22 localhost journal[231322]: ethtool ioctl error on tap41381e1a-bb: No such device Dec 15 05:00:22 localhost journal[231322]: ethtool ioctl error on tap41381e1a-bb: No such device Dec 15 05:00:22 localhost journal[231322]: ethtool ioctl error on tap41381e1a-bb: No such device Dec 15 05:00:22 localhost journal[231322]: ethtool ioctl error on tap41381e1a-bb: No such device Dec 15 05:00:22 localhost journal[231322]: ethtool ioctl error on tap41381e1a-bb: No such device Dec 15 05:00:22 localhost nova_compute[286344]: 2025-12-15 10:00:22.877 286348 DEBUG 
ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:00:22 localhost nova_compute[286344]: 2025-12-15 10:00:22.911 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:00:23 localhost podman[314058]: Dec 15 05:00:23 localhost podman[314058]: 2025-12-15 10:00:23.816041806 +0000 UTC m=+0.089221201 container create df71c348ebc76afefa8737120d7d97d99f59bee032910fea1a873136e0df546b (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-0874a63c-034d-48db-8f84-a51c1fe90687, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3) Dec 15 05:00:23 localhost systemd[1]: Started libpod-conmon-df71c348ebc76afefa8737120d7d97d99f59bee032910fea1a873136e0df546b.scope. Dec 15 05:00:23 localhost systemd[1]: Started libcrun container. 
Dec 15 05:00:23 localhost podman[314058]: 2025-12-15 10:00:23.772271278 +0000 UTC m=+0.045450703 image pull quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified Dec 15 05:00:23 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/ec10130acb2b32acd3d8f8f14a6a8a9c76c0a42c2f2b6b52646e3692b0ed4c53/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff) Dec 15 05:00:23 localhost podman[314058]: 2025-12-15 10:00:23.891384524 +0000 UTC m=+0.164563919 container init df71c348ebc76afefa8737120d7d97d99f59bee032910fea1a873136e0df546b (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-0874a63c-034d-48db-8f84-a51c1fe90687, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, tcib_managed=true, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2) Dec 15 05:00:23 localhost podman[314058]: 2025-12-15 10:00:23.909681366 +0000 UTC m=+0.182860761 container start df71c348ebc76afefa8737120d7d97d99f59bee032910fea1a873136e0df546b (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-0874a63c-034d-48db-8f84-a51c1fe90687, org.label-schema.license=GPLv2, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, tcib_managed=true, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3) Dec 15 05:00:23 localhost dnsmasq[314076]: started, version 2.85 cachesize 150 Dec 15 05:00:23 localhost dnsmasq[314076]: DNS service limited to local subnets Dec 15 05:00:23 localhost dnsmasq[314076]: compile time options: IPv6 GNU-getopt DBus no-UBus 
no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile Dec 15 05:00:23 localhost dnsmasq[314076]: warning: no upstream servers configured Dec 15 05:00:23 localhost dnsmasq-dhcp[314076]: DHCP, static leases only on 19.80.0.0, lease time 1d Dec 15 05:00:23 localhost dnsmasq[314076]: read /var/lib/neutron/dhcp/0874a63c-034d-48db-8f84-a51c1fe90687/addn_hosts - 0 addresses Dec 15 05:00:23 localhost dnsmasq-dhcp[314076]: read /var/lib/neutron/dhcp/0874a63c-034d-48db-8f84-a51c1fe90687/host Dec 15 05:00:23 localhost dnsmasq-dhcp[314076]: read /var/lib/neutron/dhcp/0874a63c-034d-48db-8f84-a51c1fe90687/opts Dec 15 05:00:24 localhost neutron_dhcp_agent[267542]: 2025-12-15 10:00:24.173 267546 INFO neutron.agent.dhcp.agent [None req-34c29776-6a6d-4a6a-b2d3-8beb7897aa64 - - - - - -] DHCP configuration for ports {'c3be4d70-09ff-4304-ad4e-0464f0628ae9'} is completed#033[00m Dec 15 05:00:24 localhost neutron_sriov_agent[260044]: 2025-12-15 10:00:24.587 2 INFO neutron.agent.securitygroups_rpc [None req-9e3fc825-11af-4644-82c7-1f2f22fc7454 7eba5d40415d4fd7ab565eed1e07b799 4e26599ab4374eb6b7e9d73e7dfebd03 - - default default] Security group member updated ['489315f5-1e14-49f6-8195-de837d23b935']#033[00m Dec 15 05:00:24 localhost neutron_dhcp_agent[267542]: 2025-12-15 10:00:24.633 267546 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-12-15T10:00:24Z, description=, device_id=, device_owner=, dns_assignment=[], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[], id=fd72ab01-c63c-4f51-ae94-ca02f6f50dfa, ip_allocation=immediate, mac_address=fa:16:3e:79:ba:17, name=tempest-subport-145676246, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-12-15T10:00:20Z, description=, dns_domain=, 
id=0874a63c-034d-48db-8f84-a51c1fe90687, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-subport_net-111449250, port_security_enabled=True, project_id=4e26599ab4374eb6b7e9d73e7dfebd03, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=48429, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=385, status=ACTIVE, subnets=['f242735a-6f42-4cf2-82a4-58c4bfd55da7'], tags=[], tenant_id=4e26599ab4374eb6b7e9d73e7dfebd03, updated_at=2025-12-15T10:00:21Z, vlan_transparent=None, network_id=0874a63c-034d-48db-8f84-a51c1fe90687, port_security_enabled=True, project_id=4e26599ab4374eb6b7e9d73e7dfebd03, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=['489315f5-1e14-49f6-8195-de837d23b935'], standard_attr_id=410, status=DOWN, tags=[], tenant_id=4e26599ab4374eb6b7e9d73e7dfebd03, updated_at=2025-12-15T10:00:24Z on network 0874a63c-034d-48db-8f84-a51c1fe90687#033[00m Dec 15 05:00:24 localhost systemd[1]: tmp-crun.jgn4UV.mount: Deactivated successfully. Dec 15 05:00:24 localhost ceph-mon[298913]: mon.np0005559462@0(leader).osd e92 do_prune osdmap full prune enabled Dec 15 05:00:24 localhost ceph-mon[298913]: mon.np0005559462@0(leader).osd e93 e93: 6 total, 6 up, 6 in Dec 15 05:00:24 localhost ceph-mon[298913]: log_channel(cluster) log [DBG] : osdmap e93: 6 total, 6 up, 6 in Dec 15 05:00:24 localhost systemd[1]: tmp-crun.3p4nyJ.mount: Deactivated successfully. 
Dec 15 05:00:24 localhost dnsmasq[314076]: read /var/lib/neutron/dhcp/0874a63c-034d-48db-8f84-a51c1fe90687/addn_hosts - 1 addresses Dec 15 05:00:24 localhost dnsmasq-dhcp[314076]: read /var/lib/neutron/dhcp/0874a63c-034d-48db-8f84-a51c1fe90687/host Dec 15 05:00:24 localhost dnsmasq-dhcp[314076]: read /var/lib/neutron/dhcp/0874a63c-034d-48db-8f84-a51c1fe90687/opts Dec 15 05:00:24 localhost podman[314094]: 2025-12-15 10:00:24.884473479 +0000 UTC m=+0.075649566 container kill df71c348ebc76afefa8737120d7d97d99f59bee032910fea1a873136e0df546b (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-0874a63c-034d-48db-8f84-a51c1fe90687, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS) Dec 15 05:00:25 localhost neutron_dhcp_agent[267542]: 2025-12-15 10:00:25.221 267546 INFO neutron.agent.dhcp.agent [None req-18dc2393-f604-4d6d-ab8e-3227e0d9c9a3 - - - - - -] DHCP configuration for ports {'fd72ab01-c63c-4f51-ae94-ca02f6f50dfa'} is completed#033[00m Dec 15 05:00:25 localhost ceph-mon[298913]: mon.np0005559462@0(leader).osd e93 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Dec 15 05:00:25 localhost sshd[314115]: main: sshd: ssh-rsa algorithm is disabled Dec 15 05:00:26 localhost nova_compute[286344]: 2025-12-15 10:00:26.324 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:00:29 localhost nova_compute[286344]: 2025-12-15 10:00:29.103 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:00:30 
localhost ceph-mon[298913]: mon.np0005559462@0(leader).osd e93 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Dec 15 05:00:31 localhost ceph-osd[32311]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [L] New memtable created with log file: #43. Immutable memtables: 0. Dec 15 05:00:31 localhost nova_compute[286344]: 2025-12-15 10:00:31.363 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:00:31 localhost ceph-mon[298913]: mon.np0005559462@0(leader).osd e93 do_prune osdmap full prune enabled Dec 15 05:00:31 localhost ceph-mon[298913]: mon.np0005559462@0(leader).osd e94 e94: 6 total, 6 up, 6 in Dec 15 05:00:31 localhost ceph-mon[298913]: log_channel(cluster) log [DBG] : osdmap e94: 6 total, 6 up, 6 in Dec 15 05:00:31 localhost podman[243449]: time="2025-12-15T10:00:31Z" level=info msg="List containers: received `last` parameter - overwriting `limit`" Dec 15 05:00:31 localhost podman[243449]: @ - - [15/Dec/2025:10:00:31 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 160285 "" "Go-http-client/1.1" Dec 15 05:00:31 localhost podman[243449]: @ - - [15/Dec/2025:10:00:31 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 20187 "" "Go-http-client/1.1" Dec 15 05:00:33 localhost systemd[1]: Started /usr/bin/podman healthcheck run 67ee72fcd99c896f79e8807764b1d0f04e7d45927d06e306c0986394e8ed42a0. Dec 15 05:00:33 localhost systemd[1]: Started /usr/bin/podman healthcheck run 9f0f481c937606298203797a71924b7614a5d9e3f4a8afd837cfa5b9602aedeb. Dec 15 05:00:33 localhost systemd[1]: Started /usr/bin/podman healthcheck run b3d8483ade539500e228c2af9c0c3e647febb5ec7903610a83aea32631007d4a. 
Dec 15 05:00:33 localhost podman[314119]: 2025-12-15 10:00:33.761931391 +0000 UTC m=+0.084713131 container health_status 67ee72fcd99c896f79e8807764b1d0f04e7d45927d06e306c0986394e8ed42a0 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, managed_by=edpm_ansible, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors ) Dec 15 05:00:33 localhost podman[314119]: 2025-12-15 10:00:33.772824548 +0000 UTC m=+0.095606328 container exec_died 67ee72fcd99c896f79e8807764b1d0f04e7d45927d06e306c0986394e8ed42a0 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', 
'--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter) Dec 15 05:00:33 localhost systemd[1]: 67ee72fcd99c896f79e8807764b1d0f04e7d45927d06e306c0986394e8ed42a0.service: Deactivated successfully. Dec 15 05:00:33 localhost systemd[1]: tmp-crun.tQlXJh.mount: Deactivated successfully. 
Dec 15 05:00:33 localhost podman[314121]: 2025-12-15 10:00:33.827162819 +0000 UTC m=+0.143692544 container health_status b3d8483ade539500e228c2af9c0c3e647febb5ec7903610a83aea32631007d4a (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, config_id=ceilometer_agent_compute, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '07f3af15d79276418f54a8dde2fc251064527c3571430e14af77aaa2c01f1193-cb1e9478634a00c8fb5ad9cd7fcd38c7b2a1a85187dfaa618bc715c24c2b15fb'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.schema-version=1.0, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, 
managed_by=edpm_ansible) Dec 15 05:00:33 localhost podman[314121]: 2025-12-15 10:00:33.861893285 +0000 UTC m=+0.178422970 container exec_died b3d8483ade539500e228c2af9c0c3e647febb5ec7903610a83aea32631007d4a (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, tcib_managed=true, io.buildah.version=1.41.3, config_id=ceilometer_agent_compute, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '07f3af15d79276418f54a8dde2fc251064527c3571430e14af77aaa2c01f1193-cb1e9478634a00c8fb5ad9cd7fcd38c7b2a1a85187dfaa618bc715c24c2b15fb'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, 
org.label-schema.schema-version=1.0) Dec 15 05:00:33 localhost systemd[1]: b3d8483ade539500e228c2af9c0c3e647febb5ec7903610a83aea32631007d4a.service: Deactivated successfully. Dec 15 05:00:33 localhost podman[314120]: 2025-12-15 10:00:33.886342618 +0000 UTC m=+0.206299498 container health_status 9f0f481c937606298203797a71924b7614a5d9e3f4a8afd837cfa5b9602aedeb (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=multipathd, container_name=multipathd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, 
org.label-schema.license=GPLv2) Dec 15 05:00:33 localhost podman[314120]: 2025-12-15 10:00:33.894420307 +0000 UTC m=+0.214377147 container exec_died 9f0f481c937606298203797a71924b7614a5d9e3f4a8afd837cfa5b9602aedeb (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=multipathd, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, container_name=multipathd, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2) Dec 15 05:00:33 localhost systemd[1]: 9f0f481c937606298203797a71924b7614a5d9e3f4a8afd837cfa5b9602aedeb.service: Deactivated successfully. 
Dec 15 05:00:34 localhost nova_compute[286344]: 2025-12-15 10:00:34.130 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:00:34 localhost systemd[1]: tmp-crun.tsmjOo.mount: Deactivated successfully. Dec 15 05:00:34 localhost openstack_network_exporter[246484]: ERROR 10:00:34 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd Dec 15 05:00:34 localhost openstack_network_exporter[246484]: ERROR 10:00:34 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server Dec 15 05:00:34 localhost openstack_network_exporter[246484]: ERROR 10:00:34 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd Dec 15 05:00:34 localhost openstack_network_exporter[246484]: ERROR 10:00:34 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath Dec 15 05:00:34 localhost openstack_network_exporter[246484]: Dec 15 05:00:34 localhost openstack_network_exporter[246484]: ERROR 10:00:34 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath Dec 15 05:00:34 localhost openstack_network_exporter[246484]: Dec 15 05:00:35 localhost ceph-mon[298913]: mon.np0005559462@0(leader).osd e94 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Dec 15 05:00:35 localhost systemd[1]: Started /usr/bin/podman healthcheck run 730c50020a7d9e2da21d2462d40af20ac303e43f3491f692cbbd87f432ba3e09. Dec 15 05:00:35 localhost systemd[1]: Started /usr/bin/podman healthcheck run ac2eae30a2a7f971398af42ea67b3ed799d4b3341f7688ebcd73f7ca7f66c2a8. 
Dec 15 05:00:35 localhost podman[314182]: 2025-12-15 10:00:35.766079755 +0000 UTC m=+0.094132543 container health_status 730c50020a7d9e2da21d2462d40af20ac303e43f3491f692cbbd87f432ba3e09 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, maintainer=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, release=1755695350, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'cb1e9478634a00c8fb5ad9cd7fcd38c7b2a1a85187dfaa618bc715c24c2b15fb'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, name=ubi9-minimal, io.openshift.tags=minimal rhel9, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., version=9.6, build-date=2025-08-20T13:12:41, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. 
This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vendor=Red Hat, Inc., container_name=openstack_network_exporter, io.buildah.version=1.33.7, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, config_id=openstack_network_exporter, distribution-scope=public, architecture=x86_64, com.redhat.component=ubi9-minimal-container, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.expose-services=, managed_by=edpm_ansible) Dec 15 05:00:35 localhost podman[314183]: 2025-12-15 10:00:35.833941348 +0000 UTC m=+0.159019342 container health_status ac2eae30a2a7f971398af42ea67b3ed799d4b3341f7688ebcd73f7ca7f66c2a8 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '07f3af15d79276418f54a8dde2fc251064527c3571430e14af77aaa2c01f1193'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', 
'/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, config_id=ovn_controller, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image) Dec 15 05:00:35 localhost podman[314182]: 2025-12-15 10:00:35.858486423 +0000 UTC m=+0.186539181 container exec_died 730c50020a7d9e2da21d2462d40af20ac303e43f3491f692cbbd87f432ba3e09 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, config_id=openstack_network_exporter, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, architecture=x86_64, version=9.6, com.redhat.component=ubi9-minimal-container, distribution-scope=public, io.buildah.version=1.33.7, io.openshift.tags=minimal rhel9, vcs-type=git, build-date=2025-08-20T13:12:41, release=1755695350, vendor=Red Hat, Inc., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., name=ubi9-minimal, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'cb1e9478634a00c8fb5ad9cd7fcd38c7b2a1a85187dfaa618bc715c24c2b15fb'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.expose-services=, managed_by=edpm_ansible, container_name=openstack_network_exporter) Dec 15 05:00:35 localhost systemd[1]: 730c50020a7d9e2da21d2462d40af20ac303e43f3491f692cbbd87f432ba3e09.service: Deactivated successfully. 
Dec 15 05:00:35 localhost podman[314183]: 2025-12-15 10:00:35.902492078 +0000 UTC m=+0.227570022 container exec_died ac2eae30a2a7f971398af42ea67b3ed799d4b3341f7688ebcd73f7ca7f66c2a8 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_id=ovn_controller, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, container_name=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '07f3af15d79276418f54a8dde2fc251064527c3571430e14af77aaa2c01f1193'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible) Dec 15 05:00:35 localhost systemd[1]: ac2eae30a2a7f971398af42ea67b3ed799d4b3341f7688ebcd73f7ca7f66c2a8.service: Deactivated successfully. 
Dec 15 05:00:35 localhost neutron_sriov_agent[260044]: 2025-12-15 10:00:35.928 2 INFO neutron.agent.securitygroups_rpc [None req-4b8999d5-ba20-4bb0-aaa5-a530ef72d456 c79d291546244f6a970ffc157036d797 8d2a9ec16aa942ab8315d4057f639915 - - default default] Security group member updated ['48dda613-ea3d-4053-a6ae-35f8ce5dfd37']#033[00m Dec 15 05:00:36 localhost nova_compute[286344]: 2025-12-15 10:00:36.394 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:00:36 localhost nova_compute[286344]: 2025-12-15 10:00:36.398 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:00:39 localhost neutron_dhcp_agent[267542]: 2025-12-15 10:00:39.447 267546 INFO neutron.agent.linux.ip_lib [None req-0cd46ec1-90d2-4bf7-8715-dea9c5b05570 - - - - - -] Device tap410191a3-ad cannot be used as it has no MAC address#033[00m Dec 15 05:00:39 localhost nova_compute[286344]: 2025-12-15 10:00:39.471 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:00:39 localhost kernel: device tap410191a3-ad entered promiscuous mode Dec 15 05:00:39 localhost NetworkManager[5963]: [1765792839.4784] manager: (tap410191a3-ad): new Generic device (/org/freedesktop/NetworkManager/Devices/20) Dec 15 05:00:39 localhost ovn_controller[154603]: 2025-12-15T10:00:39Z|00082|binding|INFO|Claiming lport 410191a3-ad8c-47df-98ef-4c3095e45cae for this chassis. 
Dec 15 05:00:39 localhost ovn_controller[154603]: 2025-12-15T10:00:39Z|00083|binding|INFO|410191a3-ad8c-47df-98ef-4c3095e45cae: Claiming unknown Dec 15 05:00:39 localhost nova_compute[286344]: 2025-12-15 10:00:39.481 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:00:39 localhost systemd-udevd[314235]: Network interface NamePolicy= disabled on kernel command line. Dec 15 05:00:39 localhost ovn_metadata_agent[160585]: 2025-12-15 10:00:39.494 160590 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005559462.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '19.80.0.2/24', 'neutron:device_id': 'dhcpce287b4b-a043-5ed9-8e08-4fd555d768a2-594a8673-651b-4566-92ec-8dbe6ca00b60', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-594a8673-651b-4566-92ec-8dbe6ca00b60', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '8d2a9ec16aa942ab8315d4057f639915', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=8bd87d03-8621-4b45-a769-2f1ac086eff3, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=410191a3-ad8c-47df-98ef-4c3095e45cae) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Dec 15 05:00:39 localhost ovn_metadata_agent[160585]: 2025-12-15 10:00:39.496 160590 INFO 
neutron.agent.ovn.metadata.agent [-] Port 410191a3-ad8c-47df-98ef-4c3095e45cae in datapath 594a8673-651b-4566-92ec-8dbe6ca00b60 bound to our chassis#033[00m Dec 15 05:00:39 localhost ovn_metadata_agent[160585]: 2025-12-15 10:00:39.498 160590 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network 594a8673-651b-4566-92ec-8dbe6ca00b60 or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599#033[00m Dec 15 05:00:39 localhost ovn_metadata_agent[160585]: 2025-12-15 10:00:39.500 160858 DEBUG oslo.privsep.daemon [-] privsep: reply[ea782eef-6d53-419b-9194-86e3b1f5a764]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Dec 15 05:00:39 localhost journal[231322]: ethtool ioctl error on tap410191a3-ad: No such device Dec 15 05:00:39 localhost ovn_controller[154603]: 2025-12-15T10:00:39Z|00084|binding|INFO|Setting lport 410191a3-ad8c-47df-98ef-4c3095e45cae ovn-installed in OVS Dec 15 05:00:39 localhost ovn_controller[154603]: 2025-12-15T10:00:39Z|00085|binding|INFO|Setting lport 410191a3-ad8c-47df-98ef-4c3095e45cae up in Southbound Dec 15 05:00:39 localhost journal[231322]: ethtool ioctl error on tap410191a3-ad: No such device Dec 15 05:00:39 localhost nova_compute[286344]: 2025-12-15 10:00:39.527 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:00:39 localhost journal[231322]: ethtool ioctl error on tap410191a3-ad: No such device Dec 15 05:00:39 localhost journal[231322]: ethtool ioctl error on tap410191a3-ad: No such device Dec 15 05:00:39 localhost journal[231322]: ethtool ioctl error on tap410191a3-ad: No such device Dec 15 05:00:39 localhost journal[231322]: ethtool ioctl error on tap410191a3-ad: No such device Dec 15 05:00:39 localhost journal[231322]: ethtool ioctl error on tap410191a3-ad: 
No such device Dec 15 05:00:39 localhost journal[231322]: ethtool ioctl error on tap410191a3-ad: No such device Dec 15 05:00:39 localhost nova_compute[286344]: 2025-12-15 10:00:39.567 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:00:39 localhost nova_compute[286344]: 2025-12-15 10:00:39.595 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:00:40 localhost nova_compute[286344]: 2025-12-15 10:00:40.101 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:00:40 localhost ceph-mon[298913]: mon.np0005559462@0(leader).osd e94 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Dec 15 05:00:40 localhost ceph-mon[298913]: mon.np0005559462@0(leader).osd e94 do_prune osdmap full prune enabled Dec 15 05:00:40 localhost ceph-mon[298913]: mon.np0005559462@0(leader).osd e95 e95: 6 total, 6 up, 6 in Dec 15 05:00:40 localhost ceph-mon[298913]: log_channel(cluster) log [DBG] : osdmap e95: 6 total, 6 up, 6 in Dec 15 05:00:40 localhost podman[314307]: Dec 15 05:00:40 localhost podman[314307]: 2025-12-15 10:00:40.474611709 +0000 UTC m=+0.093243111 container create e8bff1a5ec450e2a9f85a19f864e875baf89c06f50c11ecc847096467012caed (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-594a8673-651b-4566-92ec-8dbe6ca00b60, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0) Dec 15 05:00:40 localhost systemd[1]: 
Started libpod-conmon-e8bff1a5ec450e2a9f85a19f864e875baf89c06f50c11ecc847096467012caed.scope. Dec 15 05:00:40 localhost podman[314307]: 2025-12-15 10:00:40.431740041 +0000 UTC m=+0.050371463 image pull quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified Dec 15 05:00:40 localhost systemd[1]: Started libcrun container. Dec 15 05:00:40 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/2b2a87798477edb1c4ed240e31aa7d3ba3b4d16e88de519df3a73a3e16e5c420/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff) Dec 15 05:00:40 localhost podman[314307]: 2025-12-15 10:00:40.555706479 +0000 UTC m=+0.174337871 container init e8bff1a5ec450e2a9f85a19f864e875baf89c06f50c11ecc847096467012caed (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-594a8673-651b-4566-92ec-8dbe6ca00b60, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202) Dec 15 05:00:40 localhost podman[314307]: 2025-12-15 10:00:40.565073829 +0000 UTC m=+0.183705221 container start e8bff1a5ec450e2a9f85a19f864e875baf89c06f50c11ecc847096467012caed (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-594a8673-651b-4566-92ec-8dbe6ca00b60, tcib_managed=true, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0) Dec 15 05:00:40 localhost dnsmasq[314325]: started, version 2.85 cachesize 150 Dec 15 
05:00:40 localhost dnsmasq[314325]: DNS service limited to local subnets Dec 15 05:00:40 localhost dnsmasq[314325]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile Dec 15 05:00:40 localhost dnsmasq[314325]: warning: no upstream servers configured Dec 15 05:00:40 localhost dnsmasq-dhcp[314325]: DHCP, static leases only on 19.80.0.0, lease time 1d Dec 15 05:00:40 localhost dnsmasq[314325]: read /var/lib/neutron/dhcp/594a8673-651b-4566-92ec-8dbe6ca00b60/addn_hosts - 0 addresses Dec 15 05:00:40 localhost dnsmasq-dhcp[314325]: read /var/lib/neutron/dhcp/594a8673-651b-4566-92ec-8dbe6ca00b60/host Dec 15 05:00:40 localhost dnsmasq-dhcp[314325]: read /var/lib/neutron/dhcp/594a8673-651b-4566-92ec-8dbe6ca00b60/opts Dec 15 05:00:40 localhost neutron_dhcp_agent[267542]: 2025-12-15 10:00:40.744 267546 INFO neutron.agent.dhcp.agent [None req-8cf2a515-305e-4660-9c17-d4bd2457a7b5 - - - - - -] DHCP configuration for ports {'eb478ea1-29fd-4e9a-85ee-b8b88d82f051'} is completed#033[00m Dec 15 05:00:40 localhost neutron_sriov_agent[260044]: 2025-12-15 10:00:40.950 2 INFO neutron.agent.securitygroups_rpc [None req-d7fe2064-7ac6-4628-869d-74dcaf91ed71 c79d291546244f6a970ffc157036d797 8d2a9ec16aa942ab8315d4057f639915 - - default default] Security group member updated ['48dda613-ea3d-4053-a6ae-35f8ce5dfd37']#033[00m Dec 15 05:00:41 localhost neutron_dhcp_agent[267542]: 2025-12-15 10:00:41.189 267546 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-12-15T10:00:40Z, description=, device_id=, device_owner=, dns_assignment=[], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[], id=d92e338f-e5ff-4170-8303-27f41ee35ef3, ip_allocation=immediate, mac_address=fa:16:3e:75:03:44, 
name=tempest-subport-1273927, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-12-15T10:00:36Z, description=, dns_domain=, id=594a8673-651b-4566-92ec-8dbe6ca00b60, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-subport_net-2115214844, port_security_enabled=True, project_id=8d2a9ec16aa942ab8315d4057f639915, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=39753, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=486, status=ACTIVE, subnets=['5cbb791a-08fe-4211-a38d-eab0baf91959'], tags=[], tenant_id=8d2a9ec16aa942ab8315d4057f639915, updated_at=2025-12-15T10:00:38Z, vlan_transparent=None, network_id=594a8673-651b-4566-92ec-8dbe6ca00b60, port_security_enabled=True, project_id=8d2a9ec16aa942ab8315d4057f639915, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=['48dda613-ea3d-4053-a6ae-35f8ce5dfd37'], standard_attr_id=517, status=DOWN, tags=[], tenant_id=8d2a9ec16aa942ab8315d4057f639915, updated_at=2025-12-15T10:00:40Z on network 594a8673-651b-4566-92ec-8dbe6ca00b60#033[00m Dec 15 05:00:41 localhost nova_compute[286344]: 2025-12-15 10:00:41.430 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:00:41 localhost dnsmasq[314325]: read /var/lib/neutron/dhcp/594a8673-651b-4566-92ec-8dbe6ca00b60/addn_hosts - 1 addresses Dec 15 05:00:41 localhost dnsmasq-dhcp[314325]: read /var/lib/neutron/dhcp/594a8673-651b-4566-92ec-8dbe6ca00b60/host Dec 15 05:00:41 localhost dnsmasq-dhcp[314325]: read /var/lib/neutron/dhcp/594a8673-651b-4566-92ec-8dbe6ca00b60/opts Dec 15 05:00:41 localhost podman[314343]: 2025-12-15 10:00:41.435108191 +0000 UTC m=+0.089998781 container kill e8bff1a5ec450e2a9f85a19f864e875baf89c06f50c11ecc847096467012caed 
(image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-594a8673-651b-4566-92ec-8dbe6ca00b60, tcib_managed=true, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.license=GPLv2) Dec 15 05:00:41 localhost neutron_dhcp_agent[267542]: 2025-12-15 10:00:41.829 267546 INFO neutron.agent.dhcp.agent [None req-8dacdfbe-96c0-49d2-9bdf-3c75d9ed409e - - - - - -] DHCP configuration for ports {'d92e338f-e5ff-4170-8303-27f41ee35ef3'} is completed#033[00m Dec 15 05:00:43 localhost nova_compute[286344]: 2025-12-15 10:00:43.242 286348 DEBUG nova.virt.libvirt.driver [None req-93ddfe1e-4219-4b1f-8860-eb27ca1215ed 58b2de48454141d48ee4d84c6ec84836 7526febbe14640b09a9f0897a2f4af8c - - default default] [instance: 8f5a73ce-a171-4a42-9f60-63c0db9e0a32] Creating tmpfile /var/lib/nova/instances/tmpy9pz01bc to notify to other compute nodes that they should mount the same storage. 
_create_shared_storage_test_file /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10041#033[00m Dec 15 05:00:43 localhost nova_compute[286344]: 2025-12-15 10:00:43.292 286348 DEBUG nova.compute.manager [None req-93ddfe1e-4219-4b1f-8860-eb27ca1215ed 58b2de48454141d48ee4d84c6ec84836 7526febbe14640b09a9f0897a2f4af8c - - default default] destination check data is LibvirtLiveMigrateData(bdms=,block_migration=False,disk_available_mb=13312,disk_over_commit=False,dst_numa_info=,dst_supports_numa_live_migration=,dst_wants_file_backed_memory=False,file_backed_memory_discard=,filename='tmpy9pz01bc',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='rbd',instance_relative_path=,is_shared_block_storage=,is_shared_instance_path=,is_volume_backed=,migration=,old_vol_attachment_ids=,serial_listen_addr=None,serial_listen_ports=,src_supports_native_luks=,src_supports_numa_live_migration=,supported_perf_events=,target_connect_addr=,vifs=[VIFMigrateData],wait_for_vif_plugged=) check_can_live_migrate_destination /usr/lib/python3.9/site-packages/nova/compute/manager.py:8476#033[00m Dec 15 05:00:43 localhost nova_compute[286344]: 2025-12-15 10:00:43.321 286348 DEBUG oslo_concurrency.lockutils [None req-93ddfe1e-4219-4b1f-8860-eb27ca1215ed 58b2de48454141d48ee4d84c6ec84836 7526febbe14640b09a9f0897a2f4af8c - - default default] Acquiring lock "compute-rpcapi-router" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m Dec 15 05:00:43 localhost nova_compute[286344]: 2025-12-15 10:00:43.322 286348 DEBUG oslo_concurrency.lockutils [None req-93ddfe1e-4219-4b1f-8860-eb27ca1215ed 58b2de48454141d48ee4d84c6ec84836 7526febbe14640b09a9f0897a2f4af8c - - default default] Acquired lock "compute-rpcapi-router" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m Dec 15 05:00:43 localhost nova_compute[286344]: 2025-12-15 10:00:43.341 286348 INFO nova.compute.rpcapi [None req-93ddfe1e-4219-4b1f-8860-eb27ca1215ed 
58b2de48454141d48ee4d84c6ec84836 7526febbe14640b09a9f0897a2f4af8c - - default default] Automatically selected compute RPC version 6.2 from minimum service version 66#033[00m Dec 15 05:00:43 localhost nova_compute[286344]: 2025-12-15 10:00:43.342 286348 DEBUG oslo_concurrency.lockutils [None req-93ddfe1e-4219-4b1f-8860-eb27ca1215ed 58b2de48454141d48ee4d84c6ec84836 7526febbe14640b09a9f0897a2f4af8c - - default default] Releasing lock "compute-rpcapi-router" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m Dec 15 05:00:44 localhost systemd[1]: Started /usr/bin/podman healthcheck run 4d46c406a3855fd1c322e5d86c048188ea5865daa07ba23aaebacc7879668923. Dec 15 05:00:44 localhost podman[314363]: 2025-12-15 10:00:44.735394083 +0000 UTC m=+0.073867633 container health_status 4d46c406a3855fd1c322e5d86c048188ea5865daa07ba23aaebacc7879668923 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, managed_by=edpm_ansible, org.label-schema.build-date=20251202, config_id=ovn_metadata_agent, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '07f3af15d79276418f54a8dde2fc251064527c3571430e14af77aaa2c01f1193-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', 
'/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.schema-version=1.0) Dec 15 05:00:44 localhost podman[314363]: 2025-12-15 10:00:44.741292908 +0000 UTC m=+0.079766488 container exec_died 4d46c406a3855fd1c322e5d86c048188ea5865daa07ba23aaebacc7879668923 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.license=GPLv2, config_id=ovn_metadata_agent, org.label-schema.build-date=20251202, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '07f3af15d79276418f54a8dde2fc251064527c3571430e14af77aaa2c01f1193-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', 
'/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team) Dec 15 05:00:44 localhost systemd[1]: 4d46c406a3855fd1c322e5d86c048188ea5865daa07ba23aaebacc7879668923.service: Deactivated successfully. Dec 15 05:00:45 localhost nova_compute[286344]: 2025-12-15 10:00:45.071 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:00:45 localhost nova_compute[286344]: 2025-12-15 10:00:45.396 286348 DEBUG nova.compute.manager [None req-93ddfe1e-4219-4b1f-8860-eb27ca1215ed 58b2de48454141d48ee4d84c6ec84836 7526febbe14640b09a9f0897a2f4af8c - - default default] pre_live_migration data is LibvirtLiveMigrateData(bdms=,block_migration=False,disk_available_mb=13312,disk_over_commit=False,dst_numa_info=,dst_supports_numa_live_migration=,dst_wants_file_backed_memory=False,file_backed_memory_discard=,filename='tmpy9pz01bc',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='rbd',instance_relative_path='8f5a73ce-a171-4a42-9f60-63c0db9e0a32',is_shared_block_storage=True,is_shared_instance_path=False,is_volume_backed=False,migration=,old_vol_attachment_ids=,serial_listen_addr=None,serial_listen_ports=,src_supports_native_luks=,src_supports_numa_live_migration=,supported_perf_events=,target_connect_addr=,vifs=[VIFMigrateData],wait_for_vif_plugged=) pre_live_migration 
/usr/lib/python3.9/site-packages/nova/compute/manager.py:8604#033[00m Dec 15 05:00:45 localhost ceph-mon[298913]: mon.np0005559462@0(leader).osd e95 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Dec 15 05:00:45 localhost nova_compute[286344]: 2025-12-15 10:00:45.440 286348 DEBUG oslo_concurrency.lockutils [None req-93ddfe1e-4219-4b1f-8860-eb27ca1215ed 58b2de48454141d48ee4d84c6ec84836 7526febbe14640b09a9f0897a2f4af8c - - default default] Acquiring lock "refresh_cache-8f5a73ce-a171-4a42-9f60-63c0db9e0a32" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m Dec 15 05:00:45 localhost nova_compute[286344]: 2025-12-15 10:00:45.442 286348 DEBUG oslo_concurrency.lockutils [None req-93ddfe1e-4219-4b1f-8860-eb27ca1215ed 58b2de48454141d48ee4d84c6ec84836 7526febbe14640b09a9f0897a2f4af8c - - default default] Acquired lock "refresh_cache-8f5a73ce-a171-4a42-9f60-63c0db9e0a32" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m Dec 15 05:00:45 localhost nova_compute[286344]: 2025-12-15 10:00:45.443 286348 DEBUG nova.network.neutron [None req-93ddfe1e-4219-4b1f-8860-eb27ca1215ed 58b2de48454141d48ee4d84c6ec84836 7526febbe14640b09a9f0897a2f4af8c - - default default] [instance: 8f5a73ce-a171-4a42-9f60-63c0db9e0a32] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m Dec 15 05:00:46 localhost nova_compute[286344]: 2025-12-15 10:00:46.434 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:00:47 localhost nova_compute[286344]: 2025-12-15 10:00:47.328 286348 DEBUG nova.network.neutron [None req-93ddfe1e-4219-4b1f-8860-eb27ca1215ed 58b2de48454141d48ee4d84c6ec84836 7526febbe14640b09a9f0897a2f4af8c - - default default] [instance: 8f5a73ce-a171-4a42-9f60-63c0db9e0a32] Updating instance_info_cache with 
network_info: [{"id": "19409185-ff01-4d58-bd6e-5d66014dd5c9", "address": "fa:16:3e:df:04:be", "network": {"id": "ff0cb653-0dc9-48ed-8a20-700650a10509", "bridge": "br-int", "label": "tempest-LiveMigrationTest-2038605812-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "10.100.0.3"}}], "meta": {"injected": false, "tenant_id": "4e26599ab4374eb6b7e9d73e7dfebd03", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap19409185-ff", "ovs_interfaceid": "19409185-ff01-4d58-bd6e-5d66014dd5c9", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m Dec 15 05:00:47 localhost ceph-mon[298913]: mon.np0005559462@0(leader).osd e95 do_prune osdmap full prune enabled Dec 15 05:00:47 localhost ceph-mon[298913]: mon.np0005559462@0(leader).osd e96 e96: 6 total, 6 up, 6 in Dec 15 05:00:47 localhost ceph-mon[298913]: log_channel(cluster) log [DBG] : osdmap e96: 6 total, 6 up, 6 in Dec 15 05:00:47 localhost nova_compute[286344]: 2025-12-15 10:00:47.667 286348 DEBUG oslo_concurrency.lockutils [None req-93ddfe1e-4219-4b1f-8860-eb27ca1215ed 58b2de48454141d48ee4d84c6ec84836 7526febbe14640b09a9f0897a2f4af8c - - default default] Releasing lock "refresh_cache-8f5a73ce-a171-4a42-9f60-63c0db9e0a32" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m Dec 15 05:00:47 localhost nova_compute[286344]: 2025-12-15 10:00:47.670 286348 DEBUG 
nova.virt.libvirt.driver [None req-93ddfe1e-4219-4b1f-8860-eb27ca1215ed 58b2de48454141d48ee4d84c6ec84836 7526febbe14640b09a9f0897a2f4af8c - - default default] [instance: 8f5a73ce-a171-4a42-9f60-63c0db9e0a32] migrate_data in pre_live_migration: LibvirtLiveMigrateData(bdms=,block_migration=False,disk_available_mb=13312,disk_over_commit=False,dst_numa_info=,dst_supports_numa_live_migration=,dst_wants_file_backed_memory=False,file_backed_memory_discard=,filename='tmpy9pz01bc',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='rbd',instance_relative_path='8f5a73ce-a171-4a42-9f60-63c0db9e0a32',is_shared_block_storage=True,is_shared_instance_path=False,is_volume_backed=False,migration=,old_vol_attachment_ids={},serial_listen_addr=None,serial_listen_ports=,src_supports_native_luks=,src_supports_numa_live_migration=,supported_perf_events=,target_connect_addr=,vifs=[VIFMigrateData],wait_for_vif_plugged=) pre_live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10827#033[00m Dec 15 05:00:47 localhost nova_compute[286344]: 2025-12-15 10:00:47.671 286348 DEBUG nova.virt.libvirt.driver [None req-93ddfe1e-4219-4b1f-8860-eb27ca1215ed 58b2de48454141d48ee4d84c6ec84836 7526febbe14640b09a9f0897a2f4af8c - - default default] [instance: 8f5a73ce-a171-4a42-9f60-63c0db9e0a32] Creating instance directory: /var/lib/nova/instances/8f5a73ce-a171-4a42-9f60-63c0db9e0a32 pre_live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10840#033[00m Dec 15 05:00:47 localhost nova_compute[286344]: 2025-12-15 10:00:47.671 286348 DEBUG nova.virt.libvirt.driver [None req-93ddfe1e-4219-4b1f-8860-eb27ca1215ed 58b2de48454141d48ee4d84c6ec84836 7526febbe14640b09a9f0897a2f4af8c - - default default] [instance: 8f5a73ce-a171-4a42-9f60-63c0db9e0a32] Ensure instance console log exists: /var/lib/nova/instances/8f5a73ce-a171-4a42-9f60-63c0db9e0a32/console.log _ensure_console_log_for_instance 
/usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m Dec 15 05:00:47 localhost nova_compute[286344]: 2025-12-15 10:00:47.672 286348 DEBUG nova.virt.libvirt.driver [None req-93ddfe1e-4219-4b1f-8860-eb27ca1215ed 58b2de48454141d48ee4d84c6ec84836 7526febbe14640b09a9f0897a2f4af8c - - default default] [instance: 8f5a73ce-a171-4a42-9f60-63c0db9e0a32] Plugging VIFs using destination host port bindings before live migration. _pre_live_migration_plug_vifs /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10794#033[00m Dec 15 05:00:47 localhost nova_compute[286344]: 2025-12-15 10:00:47.674 286348 DEBUG nova.virt.libvirt.vif [None req-93ddfe1e-4219-4b1f-8860-eb27ca1215ed 58b2de48454141d48ee4d84c6ec84836 7526febbe14640b09a9f0897a2f4af8c - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-12-15T10:00:28Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=,disable_terminate=False,display_description='tempest-LiveMigrationTest-server-1394638268',display_name='tempest-LiveMigrationTest-server-1394638268',ec2_ids=,ephemeral_gb=0,ephemeral_key_uuid=None,fault=,flavor=Flavor(5),hidden=False,host='np0005559463.localdomain',hostname='tempest-livemigrationtest-server-1394638268',id=7,image_ref='b48177c8-9d95-4864-913a-a010f9defaa6',info_cache=InstanceInfoCache,instance_type_id=5,kernel_id='',key_data=None,key_name=None,keypairs=,launch_index=0,launched_at=2025-12-15T10:00:39Z,launched_on='np0005559463.localdomain',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=,new_flavor=None,node='np0005559463.localdomain',numa_topology=None,old_flavor=None,os_type=None,pci_devices=,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='4e26599ab4374eb6b7e9d73e7dfebd03',ramdisk_id='',reservation_id='r-g02irkkv',resource
s=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='b48177c8-9d95-4864-913a-a010f9defaa6',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-LiveMigrationTest-134706359',owner_user_name='tempest-LiveMigrationTest-134706359-project-member'},tags=,task_state='migrating',terminated_at=None,trusted_certs=,updated_at=2025-12-15T10:00:39Z,user_data=None,user_id='7eba5d40415d4fd7ab565eed1e07b799',uuid=8f5a73ce-a171-4a42-9f60-63c0db9e0a32,vcpu_model=,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "19409185-ff01-4d58-bd6e-5d66014dd5c9", "address": "fa:16:3e:df:04:be", "network": {"id": "ff0cb653-0dc9-48ed-8a20-700650a10509", "bridge": "br-int", "label": "tempest-LiveMigrationTest-2038605812-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "10.100.0.3"}}], "meta": {"injected": false, "tenant_id": "4e26599ab4374eb6b7e9d73e7dfebd03", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tap19409185-ff", "ovs_interfaceid": "19409185-ff01-4d58-bd6e-5d66014dd5c9", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} plug 
/usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m Dec 15 05:00:47 localhost nova_compute[286344]: 2025-12-15 10:00:47.674 286348 DEBUG nova.network.os_vif_util [None req-93ddfe1e-4219-4b1f-8860-eb27ca1215ed 58b2de48454141d48ee4d84c6ec84836 7526febbe14640b09a9f0897a2f4af8c - - default default] Converting VIF {"id": "19409185-ff01-4d58-bd6e-5d66014dd5c9", "address": "fa:16:3e:df:04:be", "network": {"id": "ff0cb653-0dc9-48ed-8a20-700650a10509", "bridge": "br-int", "label": "tempest-LiveMigrationTest-2038605812-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "10.100.0.3"}}], "meta": {"injected": false, "tenant_id": "4e26599ab4374eb6b7e9d73e7dfebd03", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tap19409185-ff", "ovs_interfaceid": "19409185-ff01-4d58-bd6e-5d66014dd5c9", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m Dec 15 05:00:47 localhost nova_compute[286344]: 2025-12-15 10:00:47.675 286348 DEBUG nova.network.os_vif_util [None req-93ddfe1e-4219-4b1f-8860-eb27ca1215ed 58b2de48454141d48ee4d84c6ec84836 7526febbe14640b09a9f0897a2f4af8c - - default default] Converted object 
VIFOpenVSwitch(active=False,address=fa:16:3e:df:04:be,bridge_name='br-int',has_traffic_filtering=True,id=19409185-ff01-4d58-bd6e-5d66014dd5c9,network=Network(ff0cb653-0dc9-48ed-8a20-700650a10509),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap19409185-ff') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m Dec 15 05:00:47 localhost nova_compute[286344]: 2025-12-15 10:00:47.676 286348 DEBUG os_vif [None req-93ddfe1e-4219-4b1f-8860-eb27ca1215ed 58b2de48454141d48ee4d84c6ec84836 7526febbe14640b09a9f0897a2f4af8c - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:df:04:be,bridge_name='br-int',has_traffic_filtering=True,id=19409185-ff01-4d58-bd6e-5d66014dd5c9,network=Network(ff0cb653-0dc9-48ed-8a20-700650a10509),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap19409185-ff') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m Dec 15 05:00:47 localhost nova_compute[286344]: 2025-12-15 10:00:47.677 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:00:47 localhost nova_compute[286344]: 2025-12-15 10:00:47.678 286348 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m Dec 15 05:00:47 localhost nova_compute[286344]: 2025-12-15 10:00:47.678 286348 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m Dec 15 05:00:47 localhost nova_compute[286344]: 2025-12-15 10:00:47.682 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 __log_wakeup 
/usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:00:47 localhost nova_compute[286344]: 2025-12-15 10:00:47.682 286348 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap19409185-ff, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m Dec 15 05:00:47 localhost nova_compute[286344]: 2025-12-15 10:00:47.683 286348 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap19409185-ff, col_values=(('external_ids', {'iface-id': '19409185-ff01-4d58-bd6e-5d66014dd5c9', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:df:04:be', 'vm-uuid': '8f5a73ce-a171-4a42-9f60-63c0db9e0a32'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m Dec 15 05:00:47 localhost nova_compute[286344]: 2025-12-15 10:00:47.685 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:00:47 localhost nova_compute[286344]: 2025-12-15 10:00:47.687 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Dec 15 05:00:47 localhost nova_compute[286344]: 2025-12-15 10:00:47.694 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:00:47 localhost nova_compute[286344]: 2025-12-15 10:00:47.696 286348 INFO os_vif [None req-93ddfe1e-4219-4b1f-8860-eb27ca1215ed 58b2de48454141d48ee4d84c6ec84836 7526febbe14640b09a9f0897a2f4af8c - - default default] Successfully plugged vif 
VIFOpenVSwitch(active=False,address=fa:16:3e:df:04:be,bridge_name='br-int',has_traffic_filtering=True,id=19409185-ff01-4d58-bd6e-5d66014dd5c9,network=Network(ff0cb653-0dc9-48ed-8a20-700650a10509),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap19409185-ff')#033[00m Dec 15 05:00:47 localhost nova_compute[286344]: 2025-12-15 10:00:47.697 286348 DEBUG nova.virt.libvirt.driver [None req-93ddfe1e-4219-4b1f-8860-eb27ca1215ed 58b2de48454141d48ee4d84c6ec84836 7526febbe14640b09a9f0897a2f4af8c - - default default] No dst_numa_info in migrate_data, no cores to power up in pre_live_migration. pre_live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10954#033[00m Dec 15 05:00:47 localhost nova_compute[286344]: 2025-12-15 10:00:47.698 286348 DEBUG nova.compute.manager [None req-93ddfe1e-4219-4b1f-8860-eb27ca1215ed 58b2de48454141d48ee4d84c6ec84836 7526febbe14640b09a9f0897a2f4af8c - - default default] driver pre_live_migration data is LibvirtLiveMigrateData(bdms=[],block_migration=False,disk_available_mb=13312,disk_over_commit=False,dst_numa_info=,dst_supports_numa_live_migration=,dst_wants_file_backed_memory=False,file_backed_memory_discard=,filename='tmpy9pz01bc',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='rbd',instance_relative_path='8f5a73ce-a171-4a42-9f60-63c0db9e0a32',is_shared_block_storage=True,is_shared_instance_path=False,is_volume_backed=False,migration=,old_vol_attachment_ids={},serial_listen_addr=None,serial_listen_ports=[],src_supports_native_luks=,src_supports_numa_live_migration=,supported_perf_events=[],target_connect_addr=None,vifs=[VIFMigrateData],wait_for_vif_plugged=) pre_live_migration /usr/lib/python3.9/site-packages/nova/compute/manager.py:8668#033[00m Dec 15 05:00:47 localhost nova_compute[286344]: 2025-12-15 10:00:47.933 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m 
Dec 15 05:00:48 localhost nova_compute[286344]: 2025-12-15 10:00:48.520 286348 DEBUG oslo_concurrency.lockutils [None req-b6417b37-68f2-4c3b-be06-66bb73cd87a0 c79d291546244f6a970ffc157036d797 8d2a9ec16aa942ab8315d4057f639915 - - default default] Acquiring lock "51fca94e-ecb9-4350-bafc-765e827a1c7b" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Dec 15 05:00:48 localhost nova_compute[286344]: 2025-12-15 10:00:48.521 286348 DEBUG oslo_concurrency.lockutils [None req-b6417b37-68f2-4c3b-be06-66bb73cd87a0 c79d291546244f6a970ffc157036d797 8d2a9ec16aa942ab8315d4057f639915 - - default default] Lock "51fca94e-ecb9-4350-bafc-765e827a1c7b" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Dec 15 05:00:48 localhost nova_compute[286344]: 2025-12-15 10:00:48.596 286348 DEBUG nova.compute.manager [None req-b6417b37-68f2-4c3b-be06-66bb73cd87a0 c79d291546244f6a970ffc157036d797 8d2a9ec16aa942ab8315d4057f639915 - - default default] [instance: 51fca94e-ecb9-4350-bafc-765e827a1c7b] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m Dec 15 05:00:48 localhost systemd[1]: Started /usr/bin/podman healthcheck run a4e94bd7aaee6c174c601060fb5f34559019ccea93fc397263ee3735731cfc1e. 
Dec 15 05:00:48 localhost nova_compute[286344]: 2025-12-15 10:00:48.706 286348 DEBUG oslo_concurrency.lockutils [None req-b6417b37-68f2-4c3b-be06-66bb73cd87a0 c79d291546244f6a970ffc157036d797 8d2a9ec16aa942ab8315d4057f639915 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Dec 15 05:00:48 localhost nova_compute[286344]: 2025-12-15 10:00:48.706 286348 DEBUG oslo_concurrency.lockutils [None req-b6417b37-68f2-4c3b-be06-66bb73cd87a0 c79d291546244f6a970ffc157036d797 8d2a9ec16aa942ab8315d4057f639915 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Dec 15 05:00:48 localhost nova_compute[286344]: 2025-12-15 10:00:48.713 286348 DEBUG nova.virt.hardware [None req-b6417b37-68f2-4c3b-be06-66bb73cd87a0 c79d291546244f6a970ffc157036d797 8d2a9ec16aa942ab8315d4057f639915 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m Dec 15 05:00:48 localhost nova_compute[286344]: 2025-12-15 10:00:48.714 286348 INFO nova.compute.claims [None req-b6417b37-68f2-4c3b-be06-66bb73cd87a0 c79d291546244f6a970ffc157036d797 8d2a9ec16aa942ab8315d4057f639915 - - default default] [instance: 51fca94e-ecb9-4350-bafc-765e827a1c7b] Claim successful on node np0005559462.localdomain#033[00m Dec 15 05:00:48 localhost systemd[1]: tmp-crun.JbQ1up.mount: Deactivated successfully. 
Dec 15 05:00:48 localhost podman[314384]: 2025-12-15 10:00:48.766874673 +0000 UTC m=+0.097545507 container health_status a4e94bd7aaee6c174c601060fb5f34559019ccea93fc397263ee3735731cfc1e (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, managed_by=edpm_ansible, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi ) Dec 15 05:00:48 localhost podman[314384]: 2025-12-15 10:00:48.775386453 +0000 UTC m=+0.106057197 container exec_died a4e94bd7aaee6c174c601060fb5f34559019ccea93fc397263ee3735731cfc1e (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, managed_by=edpm_ansible, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', 
'/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi ) Dec 15 05:00:48 localhost systemd[1]: a4e94bd7aaee6c174c601060fb5f34559019ccea93fc397263ee3735731cfc1e.service: Deactivated successfully. Dec 15 05:00:49 localhost nova_compute[286344]: 2025-12-15 10:00:49.036 286348 DEBUG oslo_concurrency.processutils [None req-b6417b37-68f2-4c3b-be06-66bb73cd87a0 c79d291546244f6a970ffc157036d797 8d2a9ec16aa942ab8315d4057f639915 - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m Dec 15 05:00:49 localhost ceph-mon[298913]: mon.np0005559462@0(leader) e17 handle_command mon_command({"prefix": "df", "format": "json"} v 0) Dec 15 05:00:49 localhost ceph-mon[298913]: log_channel(audit) log [DBG] : from='client.? 172.18.0.106:0/1183452302' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch Dec 15 05:00:49 localhost nova_compute[286344]: 2025-12-15 10:00:49.491 286348 DEBUG oslo_concurrency.processutils [None req-b6417b37-68f2-4c3b-be06-66bb73cd87a0 c79d291546244f6a970ffc157036d797 8d2a9ec16aa942ab8315d4057f639915 - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.456s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m Dec 15 05:00:49 localhost nova_compute[286344]: 2025-12-15 10:00:49.499 286348 DEBUG nova.compute.provider_tree [None req-b6417b37-68f2-4c3b-be06-66bb73cd87a0 c79d291546244f6a970ffc157036d797 8d2a9ec16aa942ab8315d4057f639915 - - default default] Inventory has not changed in ProviderTree for provider: 26c8956b-6742-4951-b566-971b9bbe323b update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m Dec 15 05:00:49 localhost nova_compute[286344]: 2025-12-15 10:00:49.515 286348 DEBUG 
nova.scheduler.client.report [None req-b6417b37-68f2-4c3b-be06-66bb73cd87a0 c79d291546244f6a970ffc157036d797 8d2a9ec16aa942ab8315d4057f639915 - - default default] Inventory has not changed for provider 26c8956b-6742-4951-b566-971b9bbe323b based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m Dec 15 05:00:49 localhost nova_compute[286344]: 2025-12-15 10:00:49.544 286348 DEBUG oslo_concurrency.lockutils [None req-b6417b37-68f2-4c3b-be06-66bb73cd87a0 c79d291546244f6a970ffc157036d797 8d2a9ec16aa942ab8315d4057f639915 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.838s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Dec 15 05:00:49 localhost nova_compute[286344]: 2025-12-15 10:00:49.545 286348 DEBUG nova.compute.manager [None req-b6417b37-68f2-4c3b-be06-66bb73cd87a0 c79d291546244f6a970ffc157036d797 8d2a9ec16aa942ab8315d4057f639915 - - default default] [instance: 51fca94e-ecb9-4350-bafc-765e827a1c7b] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m Dec 15 05:00:49 localhost nova_compute[286344]: 2025-12-15 10:00:49.607 286348 DEBUG nova.compute.manager [None req-b6417b37-68f2-4c3b-be06-66bb73cd87a0 c79d291546244f6a970ffc157036d797 8d2a9ec16aa942ab8315d4057f639915 - - default default] [instance: 51fca94e-ecb9-4350-bafc-765e827a1c7b] Allocating IP information in the background. 
_allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m Dec 15 05:00:49 localhost nova_compute[286344]: 2025-12-15 10:00:49.608 286348 DEBUG nova.network.neutron [None req-b6417b37-68f2-4c3b-be06-66bb73cd87a0 c79d291546244f6a970ffc157036d797 8d2a9ec16aa942ab8315d4057f639915 - - default default] [instance: 51fca94e-ecb9-4350-bafc-765e827a1c7b] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m Dec 15 05:00:49 localhost nova_compute[286344]: 2025-12-15 10:00:49.627 286348 INFO nova.virt.libvirt.driver [None req-b6417b37-68f2-4c3b-be06-66bb73cd87a0 c79d291546244f6a970ffc157036d797 8d2a9ec16aa942ab8315d4057f639915 - - default default] [instance: 51fca94e-ecb9-4350-bafc-765e827a1c7b] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m Dec 15 05:00:49 localhost nova_compute[286344]: 2025-12-15 10:00:49.644 286348 DEBUG nova.compute.manager [None req-b6417b37-68f2-4c3b-be06-66bb73cd87a0 c79d291546244f6a970ffc157036d797 8d2a9ec16aa942ab8315d4057f639915 - - default default] [instance: 51fca94e-ecb9-4350-bafc-765e827a1c7b] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m Dec 15 05:00:49 localhost nova_compute[286344]: 2025-12-15 10:00:49.749 286348 DEBUG nova.compute.manager [None req-b6417b37-68f2-4c3b-be06-66bb73cd87a0 c79d291546244f6a970ffc157036d797 8d2a9ec16aa942ab8315d4057f639915 - - default default] [instance: 51fca94e-ecb9-4350-bafc-765e827a1c7b] Start spawning the instance on the hypervisor. 
_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m Dec 15 05:00:49 localhost nova_compute[286344]: 2025-12-15 10:00:49.752 286348 DEBUG nova.virt.libvirt.driver [None req-b6417b37-68f2-4c3b-be06-66bb73cd87a0 c79d291546244f6a970ffc157036d797 8d2a9ec16aa942ab8315d4057f639915 - - default default] [instance: 51fca94e-ecb9-4350-bafc-765e827a1c7b] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m Dec 15 05:00:49 localhost nova_compute[286344]: 2025-12-15 10:00:49.753 286348 INFO nova.virt.libvirt.driver [None req-b6417b37-68f2-4c3b-be06-66bb73cd87a0 c79d291546244f6a970ffc157036d797 8d2a9ec16aa942ab8315d4057f639915 - - default default] [instance: 51fca94e-ecb9-4350-bafc-765e827a1c7b] Creating image(s)#033[00m Dec 15 05:00:49 localhost nova_compute[286344]: 2025-12-15 10:00:49.788 286348 DEBUG nova.storage.rbd_utils [None req-b6417b37-68f2-4c3b-be06-66bb73cd87a0 c79d291546244f6a970ffc157036d797 8d2a9ec16aa942ab8315d4057f639915 - - default default] rbd image 51fca94e-ecb9-4350-bafc-765e827a1c7b_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m Dec 15 05:00:49 localhost nova_compute[286344]: 2025-12-15 10:00:49.819 286348 DEBUG nova.storage.rbd_utils [None req-b6417b37-68f2-4c3b-be06-66bb73cd87a0 c79d291546244f6a970ffc157036d797 8d2a9ec16aa942ab8315d4057f639915 - - default default] rbd image 51fca94e-ecb9-4350-bafc-765e827a1c7b_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m Dec 15 05:00:49 localhost nova_compute[286344]: 2025-12-15 10:00:49.844 286348 DEBUG nova.storage.rbd_utils [None req-b6417b37-68f2-4c3b-be06-66bb73cd87a0 c79d291546244f6a970ffc157036d797 8d2a9ec16aa942ab8315d4057f639915 - - default default] rbd image 51fca94e-ecb9-4350-bafc-765e827a1c7b_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m Dec 15 
05:00:49 localhost nova_compute[286344]: 2025-12-15 10:00:49.848 286348 DEBUG oslo_concurrency.lockutils [None req-b6417b37-68f2-4c3b-be06-66bb73cd87a0 c79d291546244f6a970ffc157036d797 8d2a9ec16aa942ab8315d4057f639915 - - default default] Acquiring lock "b8e86f310a6a80f319510a846016b35643e549ce" by "nova.virt.libvirt.imagebackend.Image.cache..fetch_func_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Dec 15 05:00:49 localhost nova_compute[286344]: 2025-12-15 10:00:49.849 286348 DEBUG oslo_concurrency.lockutils [None req-b6417b37-68f2-4c3b-be06-66bb73cd87a0 c79d291546244f6a970ffc157036d797 8d2a9ec16aa942ab8315d4057f639915 - - default default] Lock "b8e86f310a6a80f319510a846016b35643e549ce" acquired by "nova.virt.libvirt.imagebackend.Image.cache..fetch_func_sync" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Dec 15 05:00:49 localhost nova_compute[286344]: 2025-12-15 10:00:49.871 286348 WARNING oslo_policy.policy [None req-b6417b37-68f2-4c3b-be06-66bb73cd87a0 c79d291546244f6a970ffc157036d797 8d2a9ec16aa942ab8315d4057f639915 - - default default] JSON formatted policy_file support is deprecated since Victoria release. You need to use YAML format which will be default in future. You can use ``oslopolicy-convert-json-to-yaml`` tool to convert existing JSON-formatted policy file to YAML-formatted in backward compatible way: https://docs.openstack.org/oslo.policy/latest/cli/oslopolicy-convert-json-to-yaml.html.#033[00m Dec 15 05:00:49 localhost nova_compute[286344]: 2025-12-15 10:00:49.872 286348 WARNING oslo_policy.policy [None req-b6417b37-68f2-4c3b-be06-66bb73cd87a0 c79d291546244f6a970ffc157036d797 8d2a9ec16aa942ab8315d4057f639915 - - default default] JSON formatted policy_file support is deprecated since Victoria release. You need to use YAML format which will be default in future. 
You can use ``oslopolicy-convert-json-to-yaml`` tool to convert existing JSON-formatted policy file to YAML-formatted in backward compatible way: https://docs.openstack.org/oslo.policy/latest/cli/oslopolicy-convert-json-to-yaml.html.#033[00m Dec 15 05:00:49 localhost nova_compute[286344]: 2025-12-15 10:00:49.875 286348 DEBUG nova.policy [None req-b6417b37-68f2-4c3b-be06-66bb73cd87a0 c79d291546244f6a970ffc157036d797 8d2a9ec16aa942ab8315d4057f639915 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'c79d291546244f6a970ffc157036d797', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '8d2a9ec16aa942ab8315d4057f639915', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m Dec 15 05:00:49 localhost nova_compute[286344]: 2025-12-15 10:00:49.910 286348 DEBUG nova.virt.libvirt.imagebackend [None req-b6417b37-68f2-4c3b-be06-66bb73cd87a0 c79d291546244f6a970ffc157036d797 8d2a9ec16aa942ab8315d4057f639915 - - default default] Image locations are: [{'url': 'rbd://bce17446-41b5-5408-a23e-0b011906b44a/images/b48177c8-9d95-4864-913a-a010f9defaa6/snap', 'metadata': {'store': 'default_backend'}}, {'url': 'rbd://bce17446-41b5-5408-a23e-0b011906b44a/images/b48177c8-9d95-4864-913a-a010f9defaa6/snap', 'metadata': {}}] clone /usr/lib/python3.9/site-packages/nova/virt/libvirt/imagebackend.py:1085#033[00m Dec 15 05:00:50 localhost ceph-mon[298913]: mon.np0005559462@0(leader) e17 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/osd_remove_queue}] v 0) Dec 15 05:00:50 localhost ceph-mon[298913]: log_channel(audit) log [INF] : from='mgr.34411 ' entity='mgr.np0005559464.aomnqe' Dec 15 05:00:50 localhost ceph-mon[298913]: 
mon.np0005559462@0(leader).osd e96 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Dec 15 05:00:50 localhost ceph-mon[298913]: mon.np0005559462@0(leader).osd e96 do_prune osdmap full prune enabled Dec 15 05:00:50 localhost ceph-mon[298913]: mon.np0005559462@0(leader).osd e97 e97: 6 total, 6 up, 6 in Dec 15 05:00:50 localhost ceph-mon[298913]: log_channel(cluster) log [DBG] : osdmap e97: 6 total, 6 up, 6 in Dec 15 05:00:50 localhost ceph-mon[298913]: from='mgr.34411 172.18.0.108:0/929118679' entity='mgr.np0005559464.aomnqe' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch Dec 15 05:00:50 localhost ceph-mon[298913]: from='mgr.34411 ' entity='mgr.np0005559464.aomnqe' Dec 15 05:00:50 localhost neutron_sriov_agent[260044]: 2025-12-15 10:00:50.721 2 INFO neutron.agent.securitygroups_rpc [req-616ddb84-cc7f-4096-9d62-038d464e39f7 req-36ce0385-4418-4cc2-94d9-22a3e5515580 35309870d5bd457f95d32ae22fa4f72e b24a65ffabd5489d869d7df73e041227 - - default default] Security group rule updated ['dc23abb8-5c05-4fff-8702-c2e188cf6b85']#033[00m Dec 15 05:00:50 localhost nova_compute[286344]: 2025-12-15 10:00:50.873 286348 DEBUG oslo_concurrency.processutils [None req-b6417b37-68f2-4c3b-be06-66bb73cd87a0 c79d291546244f6a970ffc157036d797 8d2a9ec16aa942ab8315d4057f639915 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/b8e86f310a6a80f319510a846016b35643e549ce.part --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m Dec 15 05:00:50 localhost nova_compute[286344]: 2025-12-15 10:00:50.943 286348 DEBUG oslo_concurrency.processutils [None req-b6417b37-68f2-4c3b-be06-66bb73cd87a0 c79d291546244f6a970ffc157036d797 8d2a9ec16aa942ab8315d4057f639915 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 
--cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/b8e86f310a6a80f319510a846016b35643e549ce.part --force-share --output=json" returned: 0 in 0.070s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m Dec 15 05:00:50 localhost nova_compute[286344]: 2025-12-15 10:00:50.946 286348 DEBUG nova.virt.images [None req-b6417b37-68f2-4c3b-be06-66bb73cd87a0 c79d291546244f6a970ffc157036d797 8d2a9ec16aa942ab8315d4057f639915 - - default default] b48177c8-9d95-4864-913a-a010f9defaa6 was qcow2, converting to raw fetch_to_raw /usr/lib/python3.9/site-packages/nova/virt/images.py:242#033[00m Dec 15 05:00:50 localhost nova_compute[286344]: 2025-12-15 10:00:50.948 286348 DEBUG nova.privsep.utils [None req-b6417b37-68f2-4c3b-be06-66bb73cd87a0 c79d291546244f6a970ffc157036d797 8d2a9ec16aa942ab8315d4057f639915 - - default default] Path '/var/lib/nova/instances' supports direct I/O supports_direct_io /usr/lib/python3.9/site-packages/nova/privsep/utils.py:63#033[00m Dec 15 05:00:50 localhost nova_compute[286344]: 2025-12-15 10:00:50.948 286348 DEBUG oslo_concurrency.processutils [None req-b6417b37-68f2-4c3b-be06-66bb73cd87a0 c79d291546244f6a970ffc157036d797 8d2a9ec16aa942ab8315d4057f639915 - - default default] Running cmd (subprocess): qemu-img convert -t none -O raw -f qcow2 /var/lib/nova/instances/_base/b8e86f310a6a80f319510a846016b35643e549ce.part /var/lib/nova/instances/_base/b8e86f310a6a80f319510a846016b35643e549ce.converted execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m Dec 15 05:00:51 localhost nova_compute[286344]: 2025-12-15 10:00:51.187 286348 DEBUG oslo_concurrency.processutils [None req-b6417b37-68f2-4c3b-be06-66bb73cd87a0 c79d291546244f6a970ffc157036d797 8d2a9ec16aa942ab8315d4057f639915 - - default default] CMD "qemu-img convert -t none -O raw -f qcow2 /var/lib/nova/instances/_base/b8e86f310a6a80f319510a846016b35643e549ce.part 
/var/lib/nova/instances/_base/b8e86f310a6a80f319510a846016b35643e549ce.converted" returned: 0 in 0.239s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m Dec 15 05:00:51 localhost nova_compute[286344]: 2025-12-15 10:00:51.193 286348 DEBUG oslo_concurrency.processutils [None req-b6417b37-68f2-4c3b-be06-66bb73cd87a0 c79d291546244f6a970ffc157036d797 8d2a9ec16aa942ab8315d4057f639915 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/b8e86f310a6a80f319510a846016b35643e549ce.converted --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m Dec 15 05:00:51 localhost nova_compute[286344]: 2025-12-15 10:00:51.256 286348 DEBUG oslo_concurrency.processutils [None req-b6417b37-68f2-4c3b-be06-66bb73cd87a0 c79d291546244f6a970ffc157036d797 8d2a9ec16aa942ab8315d4057f639915 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/b8e86f310a6a80f319510a846016b35643e549ce.converted --force-share --output=json" returned: 0 in 0.064s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m Dec 15 05:00:51 localhost nova_compute[286344]: 2025-12-15 10:00:51.258 286348 DEBUG oslo_concurrency.lockutils [None req-b6417b37-68f2-4c3b-be06-66bb73cd87a0 c79d291546244f6a970ffc157036d797 8d2a9ec16aa942ab8315d4057f639915 - - default default] Lock "b8e86f310a6a80f319510a846016b35643e549ce" "released" by "nova.virt.libvirt.imagebackend.Image.cache..fetch_func_sync" :: held 1.409s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Dec 15 05:00:51 localhost nova_compute[286344]: 2025-12-15 10:00:51.288 286348 DEBUG nova.storage.rbd_utils [None req-b6417b37-68f2-4c3b-be06-66bb73cd87a0 c79d291546244f6a970ffc157036d797 
8d2a9ec16aa942ab8315d4057f639915 - - default default] rbd image 51fca94e-ecb9-4350-bafc-765e827a1c7b_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m Dec 15 05:00:51 localhost nova_compute[286344]: 2025-12-15 10:00:51.293 286348 DEBUG oslo_concurrency.processutils [None req-b6417b37-68f2-4c3b-be06-66bb73cd87a0 c79d291546244f6a970ffc157036d797 8d2a9ec16aa942ab8315d4057f639915 - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/_base/b8e86f310a6a80f319510a846016b35643e549ce 51fca94e-ecb9-4350-bafc-765e827a1c7b_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m Dec 15 05:00:51 localhost ovn_metadata_agent[160585]: 2025-12-15 10:00:51.478 160590 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Dec 15 05:00:51 localhost ovn_metadata_agent[160585]: 2025-12-15 10:00:51.478 160590 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Dec 15 05:00:51 localhost ovn_metadata_agent[160585]: 2025-12-15 10:00:51.479 160590 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Dec 15 05:00:51 localhost nova_compute[286344]: 2025-12-15 10:00:51.504 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:00:51 localhost 
nova_compute[286344]: 2025-12-15 10:00:51.648 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:00:52 localhost nova_compute[286344]: 2025-12-15 10:00:52.132 286348 DEBUG oslo_concurrency.processutils [None req-b6417b37-68f2-4c3b-be06-66bb73cd87a0 c79d291546244f6a970ffc157036d797 8d2a9ec16aa942ab8315d4057f639915 - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/_base/b8e86f310a6a80f319510a846016b35643e549ce 51fca94e-ecb9-4350-bafc-765e827a1c7b_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.839s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m Dec 15 05:00:52 localhost nova_compute[286344]: 2025-12-15 10:00:52.172 286348 DEBUG nova.network.neutron [None req-b6417b37-68f2-4c3b-be06-66bb73cd87a0 c79d291546244f6a970ffc157036d797 8d2a9ec16aa942ab8315d4057f639915 - - default default] [instance: 51fca94e-ecb9-4350-bafc-765e827a1c7b] Successfully updated port: d6b04aa0-3423-4a78-adfc-4bf3151f80ed _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m Dec 15 05:00:52 localhost nova_compute[286344]: 2025-12-15 10:00:52.225 286348 DEBUG oslo_concurrency.lockutils [None req-b6417b37-68f2-4c3b-be06-66bb73cd87a0 c79d291546244f6a970ffc157036d797 8d2a9ec16aa942ab8315d4057f639915 - - default default] Acquiring lock "refresh_cache-51fca94e-ecb9-4350-bafc-765e827a1c7b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m Dec 15 05:00:52 localhost nova_compute[286344]: 2025-12-15 10:00:52.226 286348 DEBUG oslo_concurrency.lockutils [None req-b6417b37-68f2-4c3b-be06-66bb73cd87a0 c79d291546244f6a970ffc157036d797 8d2a9ec16aa942ab8315d4057f639915 - - default default] Acquired lock "refresh_cache-51fca94e-ecb9-4350-bafc-765e827a1c7b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m Dec 15 05:00:52 localhost 
nova_compute[286344]: 2025-12-15 10:00:52.226 286348 DEBUG nova.network.neutron [None req-b6417b37-68f2-4c3b-be06-66bb73cd87a0 c79d291546244f6a970ffc157036d797 8d2a9ec16aa942ab8315d4057f639915 - - default default] [instance: 51fca94e-ecb9-4350-bafc-765e827a1c7b] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m Dec 15 05:00:52 localhost nova_compute[286344]: 2025-12-15 10:00:52.232 286348 DEBUG nova.storage.rbd_utils [None req-b6417b37-68f2-4c3b-be06-66bb73cd87a0 c79d291546244f6a970ffc157036d797 8d2a9ec16aa942ab8315d4057f639915 - - default default] resizing rbd image 51fca94e-ecb9-4350-bafc-765e827a1c7b_disk to 1073741824 resize /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:288#033[00m Dec 15 05:00:52 localhost nova_compute[286344]: 2025-12-15 10:00:52.362 286348 DEBUG nova.objects.instance [None req-b6417b37-68f2-4c3b-be06-66bb73cd87a0 c79d291546244f6a970ffc157036d797 8d2a9ec16aa942ab8315d4057f639915 - - default default] Lazy-loading 'migration_context' on Instance uuid 51fca94e-ecb9-4350-bafc-765e827a1c7b obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m Dec 15 05:00:52 localhost nova_compute[286344]: 2025-12-15 10:00:52.384 286348 DEBUG nova.virt.libvirt.driver [None req-b6417b37-68f2-4c3b-be06-66bb73cd87a0 c79d291546244f6a970ffc157036d797 8d2a9ec16aa942ab8315d4057f639915 - - default default] [instance: 51fca94e-ecb9-4350-bafc-765e827a1c7b] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m Dec 15 05:00:52 localhost nova_compute[286344]: 2025-12-15 10:00:52.384 286348 DEBUG nova.virt.libvirt.driver [None req-b6417b37-68f2-4c3b-be06-66bb73cd87a0 c79d291546244f6a970ffc157036d797 8d2a9ec16aa942ab8315d4057f639915 - - default default] [instance: 51fca94e-ecb9-4350-bafc-765e827a1c7b] Ensure instance console log exists: 
/var/lib/nova/instances/51fca94e-ecb9-4350-bafc-765e827a1c7b/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m Dec 15 05:00:52 localhost nova_compute[286344]: 2025-12-15 10:00:52.384 286348 DEBUG oslo_concurrency.lockutils [None req-b6417b37-68f2-4c3b-be06-66bb73cd87a0 c79d291546244f6a970ffc157036d797 8d2a9ec16aa942ab8315d4057f639915 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Dec 15 05:00:52 localhost nova_compute[286344]: 2025-12-15 10:00:52.385 286348 DEBUG oslo_concurrency.lockutils [None req-b6417b37-68f2-4c3b-be06-66bb73cd87a0 c79d291546244f6a970ffc157036d797 8d2a9ec16aa942ab8315d4057f639915 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Dec 15 05:00:52 localhost nova_compute[286344]: 2025-12-15 10:00:52.385 286348 DEBUG oslo_concurrency.lockutils [None req-b6417b37-68f2-4c3b-be06-66bb73cd87a0 c79d291546244f6a970ffc157036d797 8d2a9ec16aa942ab8315d4057f639915 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Dec 15 05:00:52 localhost nova_compute[286344]: 2025-12-15 10:00:52.390 286348 DEBUG nova.network.neutron [None req-b6417b37-68f2-4c3b-be06-66bb73cd87a0 c79d291546244f6a970ffc157036d797 8d2a9ec16aa942ab8315d4057f639915 - - default default] [instance: 51fca94e-ecb9-4350-bafc-765e827a1c7b] Instance cache missing network info. 
_get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m Dec 15 05:00:52 localhost nova_compute[286344]: 2025-12-15 10:00:52.604 286348 DEBUG nova.compute.manager [req-3b7a533a-9ff5-423e-8a46-293fe97783d6 req-0ca64284-ed17-46a4-b116-b78826ab5a4e 01c6e0cd81404938a10f65d81ef6d178 2f96e5f995e6442f94f881dabbe4dbf7 - - default default] [instance: 51fca94e-ecb9-4350-bafc-765e827a1c7b] Received event network-changed-d6b04aa0-3423-4a78-adfc-4bf3151f80ed external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m Dec 15 05:00:52 localhost nova_compute[286344]: 2025-12-15 10:00:52.605 286348 DEBUG nova.compute.manager [req-3b7a533a-9ff5-423e-8a46-293fe97783d6 req-0ca64284-ed17-46a4-b116-b78826ab5a4e 01c6e0cd81404938a10f65d81ef6d178 2f96e5f995e6442f94f881dabbe4dbf7 - - default default] [instance: 51fca94e-ecb9-4350-bafc-765e827a1c7b] Refreshing instance network info cache due to event network-changed-d6b04aa0-3423-4a78-adfc-4bf3151f80ed. 
external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m Dec 15 05:00:52 localhost nova_compute[286344]: 2025-12-15 10:00:52.605 286348 DEBUG oslo_concurrency.lockutils [req-3b7a533a-9ff5-423e-8a46-293fe97783d6 req-0ca64284-ed17-46a4-b116-b78826ab5a4e 01c6e0cd81404938a10f65d81ef6d178 2f96e5f995e6442f94f881dabbe4dbf7 - - default default] Acquiring lock "refresh_cache-51fca94e-ecb9-4350-bafc-765e827a1c7b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m Dec 15 05:00:52 localhost nova_compute[286344]: 2025-12-15 10:00:52.713 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:00:52 localhost ceph-mon[298913]: mon.np0005559462@0(leader).osd e97 do_prune osdmap full prune enabled Dec 15 05:00:52 localhost ceph-mon[298913]: mon.np0005559462@0(leader).osd e98 e98: 6 total, 6 up, 6 in Dec 15 05:00:52 localhost ceph-mon[298913]: log_channel(cluster) log [DBG] : osdmap e98: 6 total, 6 up, 6 in Dec 15 05:00:52 localhost neutron_sriov_agent[260044]: 2025-12-15 10:00:52.908 2 INFO neutron.agent.securitygroups_rpc [req-cd75394a-1a77-4cee-a5fa-e6f3f0e40b61 req-7eee3642-bdbc-47d2-bc98-c5cc41c63f48 35309870d5bd457f95d32ae22fa4f72e b24a65ffabd5489d869d7df73e041227 - - default default] Security group rule updated ['45b66115-fad6-46d3-9cf3-ec3447d35cdd']#033[00m Dec 15 05:00:53 localhost nova_compute[286344]: 2025-12-15 10:00:53.046 286348 DEBUG nova.network.neutron [None req-b6417b37-68f2-4c3b-be06-66bb73cd87a0 c79d291546244f6a970ffc157036d797 8d2a9ec16aa942ab8315d4057f639915 - - default default] [instance: 51fca94e-ecb9-4350-bafc-765e827a1c7b] Updating instance_info_cache with network_info: [{"id": "d6b04aa0-3423-4a78-adfc-4bf3151f80ed", "address": "fa:16:3e:98:70:a4", "network": {"id": "576e7e59-adb4-4dcb-9b64-cc166b1e1e5f", "bridge": "br-int", "label": "tempest-LiveAutoBlockMigrationV225Test-1988721240-network", 
"subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "10.100.0.2"}}], "meta": {"injected": false, "tenant_id": "8d2a9ec16aa942ab8315d4057f639915", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd6b04aa0-34", "ovs_interfaceid": "d6b04aa0-3423-4a78-adfc-4bf3151f80ed", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m Dec 15 05:00:53 localhost nova_compute[286344]: 2025-12-15 10:00:53.108 286348 DEBUG oslo_concurrency.lockutils [None req-b6417b37-68f2-4c3b-be06-66bb73cd87a0 c79d291546244f6a970ffc157036d797 8d2a9ec16aa942ab8315d4057f639915 - - default default] Releasing lock "refresh_cache-51fca94e-ecb9-4350-bafc-765e827a1c7b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m Dec 15 05:00:53 localhost nova_compute[286344]: 2025-12-15 10:00:53.109 286348 DEBUG nova.compute.manager [None req-b6417b37-68f2-4c3b-be06-66bb73cd87a0 c79d291546244f6a970ffc157036d797 8d2a9ec16aa942ab8315d4057f639915 - - default default] [instance: 51fca94e-ecb9-4350-bafc-765e827a1c7b] Instance network_info: |[{"id": "d6b04aa0-3423-4a78-adfc-4bf3151f80ed", "address": "fa:16:3e:98:70:a4", "network": {"id": "576e7e59-adb4-4dcb-9b64-cc166b1e1e5f", "bridge": "br-int", "label": "tempest-LiveAutoBlockMigrationV225Test-1988721240-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": 
"gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "10.100.0.2"}}], "meta": {"injected": false, "tenant_id": "8d2a9ec16aa942ab8315d4057f639915", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd6b04aa0-34", "ovs_interfaceid": "d6b04aa0-3423-4a78-adfc-4bf3151f80ed", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m Dec 15 05:00:53 localhost nova_compute[286344]: 2025-12-15 10:00:53.109 286348 DEBUG oslo_concurrency.lockutils [req-3b7a533a-9ff5-423e-8a46-293fe97783d6 req-0ca64284-ed17-46a4-b116-b78826ab5a4e 01c6e0cd81404938a10f65d81ef6d178 2f96e5f995e6442f94f881dabbe4dbf7 - - default default] Acquired lock "refresh_cache-51fca94e-ecb9-4350-bafc-765e827a1c7b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m Dec 15 05:00:53 localhost nova_compute[286344]: 2025-12-15 10:00:53.110 286348 DEBUG nova.network.neutron [req-3b7a533a-9ff5-423e-8a46-293fe97783d6 req-0ca64284-ed17-46a4-b116-b78826ab5a4e 01c6e0cd81404938a10f65d81ef6d178 2f96e5f995e6442f94f881dabbe4dbf7 - - default default] [instance: 51fca94e-ecb9-4350-bafc-765e827a1c7b] Refreshing network info cache for port d6b04aa0-3423-4a78-adfc-4bf3151f80ed _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m Dec 15 05:00:53 localhost nova_compute[286344]: 2025-12-15 10:00:53.114 286348 DEBUG nova.virt.libvirt.driver [None req-b6417b37-68f2-4c3b-be06-66bb73cd87a0 c79d291546244f6a970ffc157036d797 
8d2a9ec16aa942ab8315d4057f639915 - - default default] [instance: 51fca94e-ecb9-4350-bafc-765e827a1c7b] Start _get_guest_xml network_info=[{"id": "d6b04aa0-3423-4a78-adfc-4bf3151f80ed", "address": "fa:16:3e:98:70:a4", "network": {"id": "576e7e59-adb4-4dcb-9b64-cc166b1e1e5f", "bridge": "br-int", "label": "tempest-LiveAutoBlockMigrationV225Test-1988721240-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "10.100.0.2"}}], "meta": {"injected": false, "tenant_id": "8d2a9ec16aa942ab8315d4057f639915", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd6b04aa0-34", "ovs_interfaceid": "d6b04aa0-3423-4a78-adfc-4bf3151f80ed", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-12-15T09:59:08Z,direct_url=,disk_format='qcow2',id=b48177c8-9d95-4864-913a-a010f9defaa6,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='c785bf23f53946bc99867d8832a50266',properties=ImageMetaProps,protected=,size=21430272,status='active',tags=,updated_at=2025-12-15T09:59:09Z,virtual_size=,visibility=) rescue=None block_device_info={'root_device_name': '/dev/vda', 
'image': [{'encryption_secret_uuid': None, 'size': 0, 'device_name': '/dev/vda', 'encrypted': False, 'device_type': 'disk', 'encryption_format': None, 'boot_index': 0, 'guest_format': None, 'encryption_options': None, 'disk_bus': 'virtio', 'image_id': 'b48177c8-9d95-4864-913a-a010f9defaa6'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m Dec 15 05:00:53 localhost nova_compute[286344]: 2025-12-15 10:00:53.122 286348 WARNING nova.virt.libvirt.driver [None req-b6417b37-68f2-4c3b-be06-66bb73cd87a0 c79d291546244f6a970ffc157036d797 8d2a9ec16aa942ab8315d4057f639915 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m Dec 15 05:00:53 localhost nova_compute[286344]: 2025-12-15 10:00:53.132 286348 DEBUG nova.virt.libvirt.host [None req-b6417b37-68f2-4c3b-be06-66bb73cd87a0 c79d291546244f6a970ffc157036d797 8d2a9ec16aa942ab8315d4057f639915 - - default default] Searching host: 'np0005559462.localdomain' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m Dec 15 05:00:53 localhost nova_compute[286344]: 2025-12-15 10:00:53.132 286348 DEBUG nova.virt.libvirt.host [None req-b6417b37-68f2-4c3b-be06-66bb73cd87a0 c79d291546244f6a970ffc157036d797 8d2a9ec16aa942ab8315d4057f639915 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m Dec 15 05:00:53 localhost nova_compute[286344]: 2025-12-15 10:00:53.134 286348 DEBUG nova.virt.libvirt.host [None req-b6417b37-68f2-4c3b-be06-66bb73cd87a0 c79d291546244f6a970ffc157036d797 8d2a9ec16aa942ab8315d4057f639915 - - default default] Searching host: 'np0005559462.localdomain' for CPU controller through CGroups V2... 
_has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m Dec 15 05:00:53 localhost nova_compute[286344]: 2025-12-15 10:00:53.135 286348 DEBUG nova.virt.libvirt.host [None req-b6417b37-68f2-4c3b-be06-66bb73cd87a0 c79d291546244f6a970ffc157036d797 8d2a9ec16aa942ab8315d4057f639915 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m Dec 15 05:00:53 localhost nova_compute[286344]: 2025-12-15 10:00:53.136 286348 DEBUG nova.virt.libvirt.driver [None req-b6417b37-68f2-4c3b-be06-66bb73cd87a0 c79d291546244f6a970ffc157036d797 8d2a9ec16aa942ab8315d4057f639915 - - default default] CPU mode 'host-model' models '' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m Dec 15 05:00:53 localhost nova_compute[286344]: 2025-12-15 10:00:53.136 286348 DEBUG nova.virt.hardware [None req-b6417b37-68f2-4c3b-be06-66bb73cd87a0 c79d291546244f6a970ffc157036d797 8d2a9ec16aa942ab8315d4057f639915 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-12-15T09:59:07Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='845be42c-ebea-4eba-9a91-8af570062efd',id=5,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-12-15T09:59:08Z,direct_url=,disk_format='qcow2',id=b48177c8-9d95-4864-913a-a010f9defaa6,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='c785bf23f53946bc99867d8832a50266',properties=ImageMetaProps,protected=,size=21430272,status='active',tags=,updated_at=2025-12-15T09:59:09Z,virtual_size=,visibility=), allow threads: True _get_desirable_cpu_topologies 
/usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m Dec 15 05:00:53 localhost nova_compute[286344]: 2025-12-15 10:00:53.137 286348 DEBUG nova.virt.hardware [None req-b6417b37-68f2-4c3b-be06-66bb73cd87a0 c79d291546244f6a970ffc157036d797 8d2a9ec16aa942ab8315d4057f639915 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m Dec 15 05:00:53 localhost nova_compute[286344]: 2025-12-15 10:00:53.138 286348 DEBUG nova.virt.hardware [None req-b6417b37-68f2-4c3b-be06-66bb73cd87a0 c79d291546244f6a970ffc157036d797 8d2a9ec16aa942ab8315d4057f639915 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m Dec 15 05:00:53 localhost nova_compute[286344]: 2025-12-15 10:00:53.138 286348 DEBUG nova.virt.hardware [None req-b6417b37-68f2-4c3b-be06-66bb73cd87a0 c79d291546244f6a970ffc157036d797 8d2a9ec16aa942ab8315d4057f639915 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m Dec 15 05:00:53 localhost nova_compute[286344]: 2025-12-15 10:00:53.139 286348 DEBUG nova.virt.hardware [None req-b6417b37-68f2-4c3b-be06-66bb73cd87a0 c79d291546244f6a970ffc157036d797 8d2a9ec16aa942ab8315d4057f639915 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m Dec 15 05:00:53 localhost nova_compute[286344]: 2025-12-15 10:00:53.139 286348 DEBUG nova.virt.hardware [None req-b6417b37-68f2-4c3b-be06-66bb73cd87a0 c79d291546244f6a970ffc157036d797 8d2a9ec16aa942ab8315d4057f639915 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m Dec 15 05:00:53 localhost nova_compute[286344]: 2025-12-15 10:00:53.139 286348 DEBUG 
nova.virt.hardware [None req-b6417b37-68f2-4c3b-be06-66bb73cd87a0 c79d291546244f6a970ffc157036d797 8d2a9ec16aa942ab8315d4057f639915 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m Dec 15 05:00:53 localhost nova_compute[286344]: 2025-12-15 10:00:53.140 286348 DEBUG nova.virt.hardware [None req-b6417b37-68f2-4c3b-be06-66bb73cd87a0 c79d291546244f6a970ffc157036d797 8d2a9ec16aa942ab8315d4057f639915 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m Dec 15 05:00:53 localhost nova_compute[286344]: 2025-12-15 10:00:53.140 286348 DEBUG nova.virt.hardware [None req-b6417b37-68f2-4c3b-be06-66bb73cd87a0 c79d291546244f6a970ffc157036d797 8d2a9ec16aa942ab8315d4057f639915 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m Dec 15 05:00:53 localhost nova_compute[286344]: 2025-12-15 10:00:53.141 286348 DEBUG nova.virt.hardware [None req-b6417b37-68f2-4c3b-be06-66bb73cd87a0 c79d291546244f6a970ffc157036d797 8d2a9ec16aa942ab8315d4057f639915 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m Dec 15 05:00:53 localhost nova_compute[286344]: 2025-12-15 10:00:53.141 286348 DEBUG nova.virt.hardware [None req-b6417b37-68f2-4c3b-be06-66bb73cd87a0 c79d291546244f6a970ffc157036d797 8d2a9ec16aa942ab8315d4057f639915 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m Dec 15 05:00:53 localhost nova_compute[286344]: 2025-12-15 10:00:53.147 286348 
DEBUG oslo_concurrency.processutils [None req-b6417b37-68f2-4c3b-be06-66bb73cd87a0 c79d291546244f6a970ffc157036d797 8d2a9ec16aa942ab8315d4057f639915 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m Dec 15 05:00:53 localhost nova_compute[286344]: 2025-12-15 10:00:53.248 286348 DEBUG nova.network.neutron [None req-93ddfe1e-4219-4b1f-8860-eb27ca1215ed 58b2de48454141d48ee4d84c6ec84836 7526febbe14640b09a9f0897a2f4af8c - - default default] [instance: 8f5a73ce-a171-4a42-9f60-63c0db9e0a32] Port 19409185-ff01-4d58-bd6e-5d66014dd5c9 updated with migration profile {'migrating_to': 'np0005559462.localdomain'} successfully _setup_migration_port_profile /usr/lib/python3.9/site-packages/nova/network/neutron.py:354#033[00m Dec 15 05:00:53 localhost nova_compute[286344]: 2025-12-15 10:00:53.251 286348 DEBUG nova.compute.manager [None req-93ddfe1e-4219-4b1f-8860-eb27ca1215ed 58b2de48454141d48ee4d84c6ec84836 7526febbe14640b09a9f0897a2f4af8c - - default default] pre_live_migration result data is LibvirtLiveMigrateData(bdms=[],block_migration=False,disk_available_mb=13312,disk_over_commit=False,dst_numa_info=,dst_supports_numa_live_migration=,dst_wants_file_backed_memory=False,file_backed_memory_discard=,filename='tmpy9pz01bc',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='rbd',instance_relative_path='8f5a73ce-a171-4a42-9f60-63c0db9e0a32',is_shared_block_storage=True,is_shared_instance_path=False,is_volume_backed=False,migration=,old_vol_attachment_ids={},serial_listen_addr=None,serial_listen_ports=[],src_supports_native_luks=,src_supports_numa_live_migration=,supported_perf_events=[],target_connect_addr=None,vifs=[VIFMigrateData],wait_for_vif_plugged=True) pre_live_migration /usr/lib/python3.9/site-packages/nova/compute/manager.py:8723#033[00m Dec 15 05:00:53 localhost sshd[314711]: main: sshd: ssh-rsa 
algorithm is disabled Dec 15 05:00:53 localhost systemd-logind[763]: New session 73 of user nova. Dec 15 05:00:53 localhost systemd[1]: Created slice User Slice of UID 42436. Dec 15 05:00:53 localhost systemd[1]: Starting User Runtime Directory /run/user/42436... Dec 15 05:00:53 localhost systemd[1]: Finished User Runtime Directory /run/user/42436. Dec 15 05:00:53 localhost systemd[1]: Starting User Manager for UID 42436... Dec 15 05:00:53 localhost ceph-mon[298913]: mon.np0005559462@0(leader) e17 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) Dec 15 05:00:53 localhost ceph-mon[298913]: log_channel(audit) log [DBG] : from='client.? 172.18.0.106:0/2121100873' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch Dec 15 05:00:53 localhost nova_compute[286344]: 2025-12-15 10:00:53.596 286348 DEBUG oslo_concurrency.processutils [None req-b6417b37-68f2-4c3b-be06-66bb73cd87a0 c79d291546244f6a970ffc157036d797 8d2a9ec16aa942ab8315d4057f639915 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.449s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m Dec 15 05:00:53 localhost nova_compute[286344]: 2025-12-15 10:00:53.633 286348 DEBUG nova.storage.rbd_utils [None req-b6417b37-68f2-4c3b-be06-66bb73cd87a0 c79d291546244f6a970ffc157036d797 8d2a9ec16aa942ab8315d4057f639915 - - default default] rbd image 51fca94e-ecb9-4350-bafc-765e827a1c7b_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m Dec 15 05:00:53 localhost nova_compute[286344]: 2025-12-15 10:00:53.638 286348 DEBUG oslo_concurrency.processutils [None req-b6417b37-68f2-4c3b-be06-66bb73cd87a0 c79d291546244f6a970ffc157036d797 8d2a9ec16aa942ab8315d4057f639915 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute 
/usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m Dec 15 05:00:53 localhost systemd[314715]: Queued start job for default target Main User Target. Dec 15 05:00:53 localhost systemd[314715]: Created slice User Application Slice. Dec 15 05:00:53 localhost systemd[314715]: Started Mark boot as successful after the user session has run 2 minutes. Dec 15 05:00:53 localhost systemd[314715]: Started Daily Cleanup of User's Temporary Directories. Dec 15 05:00:53 localhost systemd[314715]: Reached target Paths. Dec 15 05:00:53 localhost systemd[314715]: Reached target Timers. Dec 15 05:00:53 localhost systemd[314715]: Starting D-Bus User Message Bus Socket... Dec 15 05:00:53 localhost systemd[314715]: Starting Create User's Volatile Files and Directories... Dec 15 05:00:53 localhost systemd[314715]: Listening on D-Bus User Message Bus Socket. Dec 15 05:00:53 localhost systemd[314715]: Reached target Sockets. Dec 15 05:00:53 localhost systemd[314715]: Finished Create User's Volatile Files and Directories. Dec 15 05:00:53 localhost systemd[314715]: Reached target Basic System. Dec 15 05:00:53 localhost systemd[314715]: Reached target Main User Target. Dec 15 05:00:53 localhost systemd[314715]: Startup finished in 165ms. Dec 15 05:00:53 localhost systemd[1]: Started User Manager for UID 42436. Dec 15 05:00:53 localhost systemd[1]: Started Session 73 of User nova. Dec 15 05:00:53 localhost ceph-mon[298913]: mon.np0005559462@0(leader).osd e98 do_prune osdmap full prune enabled Dec 15 05:00:53 localhost ceph-mon[298913]: mon.np0005559462@0(leader).osd e99 e99: 6 total, 6 up, 6 in Dec 15 05:00:53 localhost ceph-mon[298913]: log_channel(cluster) log [DBG] : osdmap e99: 6 total, 6 up, 6 in Dec 15 05:00:53 localhost systemd[1]: Started libvirt secret daemon. Dec 15 05:00:53 localhost kernel: device tap19409185-ff entered promiscuous mode Dec 15 05:00:53 localhost systemd-udevd[314803]: Network interface NamePolicy= disabled on kernel command line. 
Dec 15 05:00:53 localhost NetworkManager[5963]: [1765792853.9510] manager: (tap19409185-ff): new Tun device (/org/freedesktop/NetworkManager/Devices/21) Dec 15 05:00:53 localhost NetworkManager[5963]: [1765792853.9817] device (tap19409185-ff): state change: unmanaged -> unavailable (reason 'connection-assumed', sys-iface-state: 'external') Dec 15 05:00:53 localhost NetworkManager[5963]: [1765792853.9824] device (tap19409185-ff): state change: unavailable -> disconnected (reason 'none', sys-iface-state: 'external') Dec 15 05:00:53 localhost nova_compute[286344]: 2025-12-15 10:00:53.987 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:00:53 localhost ovn_controller[154603]: 2025-12-15T10:00:53Z|00086|binding|INFO|Claiming lport 19409185-ff01-4d58-bd6e-5d66014dd5c9 for this additional chassis. Dec 15 05:00:53 localhost ovn_controller[154603]: 2025-12-15T10:00:53Z|00087|binding|INFO|19409185-ff01-4d58-bd6e-5d66014dd5c9: Claiming fa:16:3e:df:04:be 10.100.0.13 Dec 15 05:00:53 localhost ovn_controller[154603]: 2025-12-15T10:00:53Z|00088|binding|INFO|Claiming lport fd72ab01-c63c-4f51-ae94-ca02f6f50dfa for this additional chassis. Dec 15 05:00:53 localhost ovn_controller[154603]: 2025-12-15T10:00:53Z|00089|binding|INFO|fd72ab01-c63c-4f51-ae94-ca02f6f50dfa: Claiming fa:16:3e:79:ba:17 19.80.0.198 Dec 15 05:00:54 localhost systemd-machined[84011]: New machine qemu-3-instance-00000007. Dec 15 05:00:54 localhost ovn_controller[154603]: 2025-12-15T10:00:54Z|00090|binding|INFO|Setting lport 19409185-ff01-4d58-bd6e-5d66014dd5c9 ovn-installed in OVS Dec 15 05:00:54 localhost systemd[1]: Started Virtual Machine qemu-3-instance-00000007. 
Dec 15 05:00:54 localhost nova_compute[286344]: 2025-12-15 10:00:54.017 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:00:54 localhost ceph-mon[298913]: mon.np0005559462@0(leader) e17 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) Dec 15 05:00:54 localhost ceph-mon[298913]: log_channel(audit) log [DBG] : from='client.? 172.18.0.106:0/3303723553' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch Dec 15 05:00:54 localhost nova_compute[286344]: 2025-12-15 10:00:54.110 286348 DEBUG oslo_concurrency.processutils [None req-b6417b37-68f2-4c3b-be06-66bb73cd87a0 c79d291546244f6a970ffc157036d797 8d2a9ec16aa942ab8315d4057f639915 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.472s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m Dec 15 05:00:54 localhost nova_compute[286344]: 2025-12-15 10:00:54.111 286348 DEBUG nova.virt.libvirt.vif [None req-b6417b37-68f2-4c3b-be06-66bb73cd87a0 c79d291546244f6a970ffc157036d797 8d2a9ec16aa942ab8315d4057f639915 - - default default] vif_type=ovs 
instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-12-15T10:00:46Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-LiveAutoBlockMigrationV225Test-server-612817668',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=,flavor=Flavor(5),hidden=False,host='np0005559462.localdomain',hostname='tempest-liveautoblockmigrationv225test-server-612817668',id=8,image_ref='b48177c8-9d95-4864-913a-a010f9defaa6',info_cache=InstanceInfoCache,instance_type_id=5,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='np0005559462.localdomain',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='np0005559462.localdomain',numa_topology=None,old_flavor=None,os_type=None,pci_devices=,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='8d2a9ec16aa942ab8315d4057f639915',ramdisk_id='',reservation_id='r-446f3i0r',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='b48177c8-9d95-4864-913a-a010f9defaa6',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-LiveAutoBlockMigrationV225Test-1078555498',owner_user_name='tempest-LiveAutoBlockMigrationV225Test-1078555498-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-12-15T10:00:49Z,user_data=None,user_id='c79d291546244f6a970ffc157036d797',uuid=51fca94e-ecb9-4350-bafc-765e827a1c7b,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='build
ing') vif={"id": "d6b04aa0-3423-4a78-adfc-4bf3151f80ed", "address": "fa:16:3e:98:70:a4", "network": {"id": "576e7e59-adb4-4dcb-9b64-cc166b1e1e5f", "bridge": "br-int", "label": "tempest-LiveAutoBlockMigrationV225Test-1988721240-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "10.100.0.2"}}], "meta": {"injected": false, "tenant_id": "8d2a9ec16aa942ab8315d4057f639915", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd6b04aa0-34", "ovs_interfaceid": "d6b04aa0-3423-4a78-adfc-4bf3151f80ed", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m Dec 15 05:00:54 localhost nova_compute[286344]: 2025-12-15 10:00:54.112 286348 DEBUG nova.network.os_vif_util [None req-b6417b37-68f2-4c3b-be06-66bb73cd87a0 c79d291546244f6a970ffc157036d797 8d2a9ec16aa942ab8315d4057f639915 - - default default] Converting VIF {"id": "d6b04aa0-3423-4a78-adfc-4bf3151f80ed", "address": "fa:16:3e:98:70:a4", "network": {"id": "576e7e59-adb4-4dcb-9b64-cc166b1e1e5f", "bridge": "br-int", "label": "tempest-LiveAutoBlockMigrationV225Test-1988721240-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "10.100.0.2"}}], 
"meta": {"injected": false, "tenant_id": "8d2a9ec16aa942ab8315d4057f639915", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd6b04aa0-34", "ovs_interfaceid": "d6b04aa0-3423-4a78-adfc-4bf3151f80ed", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m Dec 15 05:00:54 localhost nova_compute[286344]: 2025-12-15 10:00:54.113 286348 DEBUG nova.network.os_vif_util [None req-b6417b37-68f2-4c3b-be06-66bb73cd87a0 c79d291546244f6a970ffc157036d797 8d2a9ec16aa942ab8315d4057f639915 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:98:70:a4,bridge_name='br-int',has_traffic_filtering=True,id=d6b04aa0-3423-4a78-adfc-4bf3151f80ed,network=Network(576e7e59-adb4-4dcb-9b64-cc166b1e1e5f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tapd6b04aa0-34') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m Dec 15 05:00:54 localhost nova_compute[286344]: 2025-12-15 10:00:54.116 286348 DEBUG nova.objects.instance [None req-b6417b37-68f2-4c3b-be06-66bb73cd87a0 c79d291546244f6a970ffc157036d797 8d2a9ec16aa942ab8315d4057f639915 - - default default] Lazy-loading 'pci_devices' on Instance uuid 51fca94e-ecb9-4350-bafc-765e827a1c7b obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m Dec 15 05:00:54 localhost nova_compute[286344]: 2025-12-15 10:00:54.133 286348 DEBUG nova.virt.libvirt.driver [None req-b6417b37-68f2-4c3b-be06-66bb73cd87a0 c79d291546244f6a970ffc157036d797 8d2a9ec16aa942ab8315d4057f639915 - - default default] [instance: 51fca94e-ecb9-4350-bafc-765e827a1c7b] End _get_guest_xml 
xml= Dec 15 05:00:54 localhost nova_compute[286344]: [libvirt guest domain XML omitted: the definition was logged as multi-line continuations, but the XML markup was stripped in capture, leaving only syslog prefixes; surviving fragments include uuid 51fca94e-ecb9-4350-bafc-765e827a1c7b, name instance-00000008, memory 131072, 1 vCPU, display name tempest-LiveAutoBlockMigrationV225Test-server-612817668, creation time 2025-12-15 10:00:53, os type hvm, rng backend /dev/urandom, and sysinfo RDO / OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9] _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Dec 15 05:00:54 localhost nova_compute[286344]: 2025-12-15 10:00:54.134 286348 DEBUG nova.compute.manager [None req-b6417b37-68f2-4c3b-be06-66bb73cd87a0 c79d291546244f6a970ffc157036d797 8d2a9ec16aa942ab8315d4057f639915 - - default default] [instance: 51fca94e-ecb9-4350-bafc-765e827a1c7b] Preparing to wait for external event network-vif-plugged-d6b04aa0-3423-4a78-adfc-4bf3151f80ed prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Dec 15 05:00:54 localhost nova_compute[286344]: 2025-12-15 10:00:54.135 286348 DEBUG oslo_concurrency.lockutils [None req-b6417b37-68f2-4c3b-be06-66bb73cd87a0 c79d291546244f6a970ffc157036d797 8d2a9ec16aa942ab8315d4057f639915 - - default default] Acquiring lock "51fca94e-ecb9-4350-bafc-765e827a1c7b-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 15 05:00:54 localhost nova_compute[286344]: 2025-12-15 10:00:54.136 286348 DEBUG oslo_concurrency.lockutils [None req-b6417b37-68f2-4c3b-be06-66bb73cd87a0 c79d291546244f6a970ffc157036d797 8d2a9ec16aa942ab8315d4057f639915 - - default default] Lock "51fca94e-ecb9-4350-bafc-765e827a1c7b-events" acquired
by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.._create_or_get_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Dec 15 05:00:54 localhost nova_compute[286344]: 2025-12-15 10:00:54.136 286348 DEBUG oslo_concurrency.lockutils [None req-b6417b37-68f2-4c3b-be06-66bb73cd87a0 c79d291546244f6a970ffc157036d797 8d2a9ec16aa942ab8315d4057f639915 - - default default] Lock "51fca94e-ecb9-4350-bafc-765e827a1c7b-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.._create_or_get_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Dec 15 05:00:54 localhost nova_compute[286344]: 2025-12-15 10:00:54.138 286348 DEBUG nova.virt.libvirt.vif [None req-b6417b37-68f2-4c3b-be06-66bb73cd87a0 c79d291546244f6a970ffc157036d797 8d2a9ec16aa942ab8315d4057f639915 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-12-15T10:00:46Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-LiveAutoBlockMigrationV225Test-server-612817668',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=,flavor=Flavor(5),hidden=False,host='np0005559462.localdomain',hostname='tempest-liveautoblockmigrationv225test-server-612817668',id=8,image_ref='b48177c8-9d95-4864-913a-a010f9defaa6',info_cache=InstanceInfoCache,instance_type_id=5,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='np0005559462.localdomain',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='np0005559462.localdomain',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=Inst
ancePCIRequests,power_state=0,progress=0,project_id='8d2a9ec16aa942ab8315d4057f639915',ramdisk_id='',reservation_id='r-446f3i0r',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='b48177c8-9d95-4864-913a-a010f9defaa6',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-LiveAutoBlockMigrationV225Test-1078555498',owner_user_name='tempest-LiveAutoBlockMigrationV225Test-1078555498-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-12-15T10:00:49Z,user_data=None,user_id='c79d291546244f6a970ffc157036d797',uuid=51fca94e-ecb9-4350-bafc-765e827a1c7b,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "d6b04aa0-3423-4a78-adfc-4bf3151f80ed", "address": "fa:16:3e:98:70:a4", "network": {"id": "576e7e59-adb4-4dcb-9b64-cc166b1e1e5f", "bridge": "br-int", "label": "tempest-LiveAutoBlockMigrationV225Test-1988721240-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "10.100.0.2"}}], "meta": {"injected": false, "tenant_id": "8d2a9ec16aa942ab8315d4057f639915", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd6b04aa0-34", "ovs_interfaceid": "d6b04aa0-3423-4a78-adfc-4bf3151f80ed", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, 
"preserve_on_delete": true, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m Dec 15 05:00:54 localhost nova_compute[286344]: 2025-12-15 10:00:54.138 286348 DEBUG nova.network.os_vif_util [None req-b6417b37-68f2-4c3b-be06-66bb73cd87a0 c79d291546244f6a970ffc157036d797 8d2a9ec16aa942ab8315d4057f639915 - - default default] Converting VIF {"id": "d6b04aa0-3423-4a78-adfc-4bf3151f80ed", "address": "fa:16:3e:98:70:a4", "network": {"id": "576e7e59-adb4-4dcb-9b64-cc166b1e1e5f", "bridge": "br-int", "label": "tempest-LiveAutoBlockMigrationV225Test-1988721240-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "10.100.0.2"}}], "meta": {"injected": false, "tenant_id": "8d2a9ec16aa942ab8315d4057f639915", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd6b04aa0-34", "ovs_interfaceid": "d6b04aa0-3423-4a78-adfc-4bf3151f80ed", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m Dec 15 05:00:54 localhost nova_compute[286344]: 2025-12-15 10:00:54.140 286348 DEBUG nova.network.os_vif_util [None req-b6417b37-68f2-4c3b-be06-66bb73cd87a0 c79d291546244f6a970ffc157036d797 8d2a9ec16aa942ab8315d4057f639915 - - default default] Converted object 
VIFOpenVSwitch(active=False,address=fa:16:3e:98:70:a4,bridge_name='br-int',has_traffic_filtering=True,id=d6b04aa0-3423-4a78-adfc-4bf3151f80ed,network=Network(576e7e59-adb4-4dcb-9b64-cc166b1e1e5f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tapd6b04aa0-34') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m Dec 15 05:00:54 localhost nova_compute[286344]: 2025-12-15 10:00:54.141 286348 DEBUG os_vif [None req-b6417b37-68f2-4c3b-be06-66bb73cd87a0 c79d291546244f6a970ffc157036d797 8d2a9ec16aa942ab8315d4057f639915 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:98:70:a4,bridge_name='br-int',has_traffic_filtering=True,id=d6b04aa0-3423-4a78-adfc-4bf3151f80ed,network=Network(576e7e59-adb4-4dcb-9b64-cc166b1e1e5f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tapd6b04aa0-34') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m Dec 15 05:00:54 localhost nova_compute[286344]: 2025-12-15 10:00:54.142 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:00:54 localhost nova_compute[286344]: 2025-12-15 10:00:54.143 286348 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m Dec 15 05:00:54 localhost nova_compute[286344]: 2025-12-15 10:00:54.143 286348 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m Dec 15 05:00:54 localhost nova_compute[286344]: 2025-12-15 10:00:54.148 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 __log_wakeup 
/usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:00:54 localhost nova_compute[286344]: 2025-12-15 10:00:54.148 286348 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapd6b04aa0-34, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m Dec 15 05:00:54 localhost nova_compute[286344]: 2025-12-15 10:00:54.149 286348 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapd6b04aa0-34, col_values=(('external_ids', {'iface-id': 'd6b04aa0-3423-4a78-adfc-4bf3151f80ed', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:98:70:a4', 'vm-uuid': '51fca94e-ecb9-4350-bafc-765e827a1c7b'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m Dec 15 05:00:54 localhost nova_compute[286344]: 2025-12-15 10:00:54.152 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:00:54 localhost nova_compute[286344]: 2025-12-15 10:00:54.161 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Dec 15 05:00:54 localhost nova_compute[286344]: 2025-12-15 10:00:54.162 286348 INFO os_vif [None req-b6417b37-68f2-4c3b-be06-66bb73cd87a0 c79d291546244f6a970ffc157036d797 8d2a9ec16aa942ab8315d4057f639915 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:98:70:a4,bridge_name='br-int',has_traffic_filtering=True,id=d6b04aa0-3423-4a78-adfc-4bf3151f80ed,network=Network(576e7e59-adb4-4dcb-9b64-cc166b1e1e5f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tapd6b04aa0-34')#033[00m Dec 15 05:00:54 localhost nova_compute[286344]: 2025-12-15 10:00:54.212 286348 DEBUG nova.virt.libvirt.driver 
[None req-b6417b37-68f2-4c3b-be06-66bb73cd87a0 c79d291546244f6a970ffc157036d797 8d2a9ec16aa942ab8315d4057f639915 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m Dec 15 05:00:54 localhost nova_compute[286344]: 2025-12-15 10:00:54.213 286348 DEBUG nova.virt.libvirt.driver [None req-b6417b37-68f2-4c3b-be06-66bb73cd87a0 c79d291546244f6a970ffc157036d797 8d2a9ec16aa942ab8315d4057f639915 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m Dec 15 05:00:54 localhost nova_compute[286344]: 2025-12-15 10:00:54.213 286348 DEBUG nova.virt.libvirt.driver [None req-b6417b37-68f2-4c3b-be06-66bb73cd87a0 c79d291546244f6a970ffc157036d797 8d2a9ec16aa942ab8315d4057f639915 - - default default] No VIF found with MAC fa:16:3e:98:70:a4, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m Dec 15 05:00:54 localhost nova_compute[286344]: 2025-12-15 10:00:54.214 286348 INFO nova.virt.libvirt.driver [None req-b6417b37-68f2-4c3b-be06-66bb73cd87a0 c79d291546244f6a970ffc157036d797 8d2a9ec16aa942ab8315d4057f639915 - - default default] [instance: 51fca94e-ecb9-4350-bafc-765e827a1c7b] Using config drive#033[00m Dec 15 05:00:54 localhost nova_compute[286344]: 2025-12-15 10:00:54.253 286348 DEBUG nova.storage.rbd_utils [None req-b6417b37-68f2-4c3b-be06-66bb73cd87a0 c79d291546244f6a970ffc157036d797 8d2a9ec16aa942ab8315d4057f639915 - - default default] rbd image 51fca94e-ecb9-4350-bafc-765e827a1c7b_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m Dec 15 05:00:54 localhost nova_compute[286344]: 2025-12-15 10:00:54.273 286348 DEBUG nova.network.neutron [req-3b7a533a-9ff5-423e-8a46-293fe97783d6 req-0ca64284-ed17-46a4-b116-b78826ab5a4e 
01c6e0cd81404938a10f65d81ef6d178 2f96e5f995e6442f94f881dabbe4dbf7 - - default default] [instance: 51fca94e-ecb9-4350-bafc-765e827a1c7b] Updated VIF entry in instance network info cache for port d6b04aa0-3423-4a78-adfc-4bf3151f80ed. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m Dec 15 05:00:54 localhost nova_compute[286344]: 2025-12-15 10:00:54.274 286348 DEBUG nova.network.neutron [req-3b7a533a-9ff5-423e-8a46-293fe97783d6 req-0ca64284-ed17-46a4-b116-b78826ab5a4e 01c6e0cd81404938a10f65d81ef6d178 2f96e5f995e6442f94f881dabbe4dbf7 - - default default] [instance: 51fca94e-ecb9-4350-bafc-765e827a1c7b] Updating instance_info_cache with network_info: [{"id": "d6b04aa0-3423-4a78-adfc-4bf3151f80ed", "address": "fa:16:3e:98:70:a4", "network": {"id": "576e7e59-adb4-4dcb-9b64-cc166b1e1e5f", "bridge": "br-int", "label": "tempest-LiveAutoBlockMigrationV225Test-1988721240-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "10.100.0.2"}}], "meta": {"injected": false, "tenant_id": "8d2a9ec16aa942ab8315d4057f639915", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd6b04aa0-34", "ovs_interfaceid": "d6b04aa0-3423-4a78-adfc-4bf3151f80ed", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m Dec 15 05:00:54 localhost nova_compute[286344]: 2025-12-15 10:00:54.293 286348 DEBUG 
oslo_concurrency.lockutils [req-3b7a533a-9ff5-423e-8a46-293fe97783d6 req-0ca64284-ed17-46a4-b116-b78826ab5a4e 01c6e0cd81404938a10f65d81ef6d178 2f96e5f995e6442f94f881dabbe4dbf7 - - default default] Releasing lock "refresh_cache-51fca94e-ecb9-4350-bafc-765e827a1c7b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m Dec 15 05:00:54 localhost ceph-mon[298913]: mon.np0005559462@0(leader) e17 handle_command mon_command([{prefix=config-key set, key=mgr/progress/completed}] v 0) Dec 15 05:00:54 localhost nova_compute[286344]: 2025-12-15 10:00:54.400 286348 DEBUG nova.virt.driver [None req-ed090d16-496a-473a-be36-97cf3cf6502a - - - - - -] Emitting event Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m Dec 15 05:00:54 localhost nova_compute[286344]: 2025-12-15 10:00:54.401 286348 INFO nova.compute.manager [None req-ed090d16-496a-473a-be36-97cf3cf6502a - - - - - -] [instance: 8f5a73ce-a171-4a42-9f60-63c0db9e0a32] VM Started (Lifecycle Event)#033[00m Dec 15 05:00:54 localhost ceph-mon[298913]: log_channel(audit) log [INF] : from='mgr.34411 ' entity='mgr.np0005559464.aomnqe' Dec 15 05:00:54 localhost nova_compute[286344]: 2025-12-15 10:00:54.418 286348 DEBUG nova.compute.manager [None req-ed090d16-496a-473a-be36-97cf3cf6502a - - - - - -] [instance: 8f5a73ce-a171-4a42-9f60-63c0db9e0a32] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m Dec 15 05:00:54 localhost nova_compute[286344]: 2025-12-15 10:00:54.469 286348 INFO nova.virt.libvirt.driver [None req-b6417b37-68f2-4c3b-be06-66bb73cd87a0 c79d291546244f6a970ffc157036d797 8d2a9ec16aa942ab8315d4057f639915 - - default default] [instance: 51fca94e-ecb9-4350-bafc-765e827a1c7b] Creating config drive at /var/lib/nova/instances/51fca94e-ecb9-4350-bafc-765e827a1c7b/disk.config#033[00m Dec 15 05:00:54 localhost nova_compute[286344]: 2025-12-15 10:00:54.477 286348 DEBUG oslo_concurrency.processutils [None 
req-b6417b37-68f2-4c3b-be06-66bb73cd87a0 c79d291546244f6a970ffc157036d797 8d2a9ec16aa942ab8315d4057f639915 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/51fca94e-ecb9-4350-bafc-765e827a1c7b/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp0e9w41wf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m Dec 15 05:00:54 localhost nova_compute[286344]: 2025-12-15 10:00:54.607 286348 DEBUG oslo_concurrency.processutils [None req-b6417b37-68f2-4c3b-be06-66bb73cd87a0 c79d291546244f6a970ffc157036d797 8d2a9ec16aa942ab8315d4057f639915 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/51fca94e-ecb9-4350-bafc-765e827a1c7b/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp0e9w41wf" returned: 0 in 0.130s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m Dec 15 05:00:54 localhost nova_compute[286344]: 2025-12-15 10:00:54.658 286348 DEBUG nova.storage.rbd_utils [None req-b6417b37-68f2-4c3b-be06-66bb73cd87a0 c79d291546244f6a970ffc157036d797 8d2a9ec16aa942ab8315d4057f639915 - - default default] rbd image 51fca94e-ecb9-4350-bafc-765e827a1c7b_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m Dec 15 05:00:54 localhost nova_compute[286344]: 2025-12-15 10:00:54.661 286348 DEBUG oslo_concurrency.processutils [None req-b6417b37-68f2-4c3b-be06-66bb73cd87a0 c79d291546244f6a970ffc157036d797 8d2a9ec16aa942ab8315d4057f639915 - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/51fca94e-ecb9-4350-bafc-765e827a1c7b/disk.config 51fca94e-ecb9-4350-bafc-765e827a1c7b_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute 
/usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m Dec 15 05:00:54 localhost ceph-mon[298913]: from='mgr.34411 ' entity='mgr.np0005559464.aomnqe' Dec 15 05:00:54 localhost neutron_sriov_agent[260044]: 2025-12-15 10:00:54.856 2 INFO neutron.agent.securitygroups_rpc [req-1f074480-540c-45f7-b3a3-fc8b13606a24 req-0ac2850d-8318-4d75-b236-4aac93b6bd8b 35309870d5bd457f95d32ae22fa4f72e b24a65ffabd5489d869d7df73e041227 - - default default] Security group rule updated ['3ac826ad-5329-453e-8715-4e938cae8b82']#033[00m Dec 15 05:00:55 localhost nova_compute[286344]: 2025-12-15 10:00:55.097 286348 DEBUG oslo_concurrency.processutils [None req-b6417b37-68f2-4c3b-be06-66bb73cd87a0 c79d291546244f6a970ffc157036d797 8d2a9ec16aa942ab8315d4057f639915 - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/51fca94e-ecb9-4350-bafc-765e827a1c7b/disk.config 51fca94e-ecb9-4350-bafc-765e827a1c7b_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.436s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m Dec 15 05:00:55 localhost nova_compute[286344]: 2025-12-15 10:00:55.098 286348 INFO nova.virt.libvirt.driver [None req-b6417b37-68f2-4c3b-be06-66bb73cd87a0 c79d291546244f6a970ffc157036d797 8d2a9ec16aa942ab8315d4057f639915 - - default default] [instance: 51fca94e-ecb9-4350-bafc-765e827a1c7b] Deleting local config drive /var/lib/nova/instances/51fca94e-ecb9-4350-bafc-765e827a1c7b/disk.config because it was imported into RBD.#033[00m Dec 15 05:00:55 localhost kernel: device tapd6b04aa0-34 entered promiscuous mode Dec 15 05:00:55 localhost NetworkManager[5963]: [1765792855.1534] manager: (tapd6b04aa0-34): new Tun device (/org/freedesktop/NetworkManager/Devices/22) Dec 15 05:00:55 localhost ovn_controller[154603]: 2025-12-15T10:00:55Z|00091|binding|INFO|Claiming lport d6b04aa0-3423-4a78-adfc-4bf3151f80ed for this chassis. 
Dec 15 05:00:55 localhost ovn_controller[154603]: 2025-12-15T10:00:55Z|00092|binding|INFO|d6b04aa0-3423-4a78-adfc-4bf3151f80ed: Claiming fa:16:3e:98:70:a4 10.100.0.13 Dec 15 05:00:55 localhost ovn_controller[154603]: 2025-12-15T10:00:55Z|00093|binding|INFO|Claiming lport d92e338f-e5ff-4170-8303-27f41ee35ef3 for this chassis. Dec 15 05:00:55 localhost ovn_controller[154603]: 2025-12-15T10:00:55Z|00094|binding|INFO|d92e338f-e5ff-4170-8303-27f41ee35ef3: Claiming fa:16:3e:75:03:44 19.80.0.5 Dec 15 05:00:55 localhost nova_compute[286344]: 2025-12-15 10:00:55.164 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:00:55 localhost NetworkManager[5963]: [1765792855.1705] device (tapd6b04aa0-34): state change: unmanaged -> unavailable (reason 'connection-assumed', sys-iface-state: 'external') Dec 15 05:00:55 localhost NetworkManager[5963]: [1765792855.1710] device (tapd6b04aa0-34): state change: unavailable -> disconnected (reason 'none', sys-iface-state: 'external') Dec 15 05:00:55 localhost ovn_metadata_agent[160585]: 2025-12-15 10:00:55.176 160590 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:98:70:a4 10.100.0.13'], port_security=['fa:16:3e:98:70:a4 10.100.0.13'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005559462.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'name': 'tempest-parent-580979727', 'neutron:cidrs': '10.100.0.13/28', 'neutron:device_id': '51fca94e-ecb9-4350-bafc-765e827a1c7b', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-576e7e59-adb4-4dcb-9b64-cc166b1e1e5f', 'neutron:port_capabilities': '', 'neutron:port_name': 'tempest-parent-580979727', 
'neutron:project_id': '8d2a9ec16aa942ab8315d4057f639915', 'neutron:revision_number': '2', 'neutron:security_group_ids': '48dda613-ea3d-4053-a6ae-35f8ce5dfd37', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=22fe063e-3457-4082-83f2-544c43df7165, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[], logical_port=d6b04aa0-3423-4a78-adfc-4bf3151f80ed) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Dec 15 05:00:55 localhost ovn_metadata_agent[160585]: 2025-12-15 10:00:55.180 160590 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:75:03:44 19.80.0.5'], port_security=['fa:16:3e:75:03:44 19.80.0.5'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': ''}, parent_port=['d6b04aa0-3423-4a78-adfc-4bf3151f80ed'], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'name': 'tempest-subport-1273927', 'neutron:cidrs': '19.80.0.5/24', 'neutron:device_id': '', 'neutron:device_owner': 'trunk:subport', 'neutron:mtu': '', 'neutron:network_name': 'neutron-594a8673-651b-4566-92ec-8dbe6ca00b60', 'neutron:port_capabilities': '', 'neutron:port_name': 'tempest-subport-1273927', 'neutron:project_id': '8d2a9ec16aa942ab8315d4057f639915', 'neutron:revision_number': '2', 'neutron:security_group_ids': '48dda613-ea3d-4053-a6ae-35f8ce5dfd37', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[42], additional_encap=[], encap=[], mirror_rules=[], datapath=8bd87d03-8621-4b45-a769-2f1ac086eff3, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[], 
logical_port=d92e338f-e5ff-4170-8303-27f41ee35ef3) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Dec 15 05:00:55 localhost ovn_metadata_agent[160585]: 2025-12-15 10:00:55.182 160590 INFO neutron.agent.ovn.metadata.agent [-] Port d6b04aa0-3423-4a78-adfc-4bf3151f80ed in datapath 576e7e59-adb4-4dcb-9b64-cc166b1e1e5f bound to our chassis#033[00m Dec 15 05:00:55 localhost nova_compute[286344]: 2025-12-15 10:00:55.184 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:00:55 localhost ovn_controller[154603]: 2025-12-15T10:00:55Z|00095|binding|INFO|Setting lport d6b04aa0-3423-4a78-adfc-4bf3151f80ed ovn-installed in OVS Dec 15 05:00:55 localhost ovn_controller[154603]: 2025-12-15T10:00:55Z|00096|binding|INFO|Setting lport d6b04aa0-3423-4a78-adfc-4bf3151f80ed up in Southbound Dec 15 05:00:55 localhost ovn_controller[154603]: 2025-12-15T10:00:55Z|00097|binding|INFO|Setting lport d92e338f-e5ff-4170-8303-27f41ee35ef3 up in Southbound Dec 15 05:00:55 localhost ovn_metadata_agent[160585]: 2025-12-15 10:00:55.185 160590 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 576e7e59-adb4-4dcb-9b64-cc166b1e1e5f#033[00m Dec 15 05:00:55 localhost nova_compute[286344]: 2025-12-15 10:00:55.186 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:00:55 localhost nova_compute[286344]: 2025-12-15 10:00:55.187 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:00:55 localhost systemd-machined[84011]: New machine qemu-4-instance-00000008. 
Dec 15 05:00:55 localhost ovn_metadata_agent[160585]: 2025-12-15 10:00:55.194 160858 DEBUG oslo.privsep.daemon [-] privsep: reply[57f5db44-cf62-4fce-98dd-1676a1c46459]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Dec 15 05:00:55 localhost ovn_metadata_agent[160585]: 2025-12-15 10:00:55.195 160590 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap576e7e59-a1 in ovnmeta-576e7e59-adb4-4dcb-9b64-cc166b1e1e5f namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m Dec 15 05:00:55 localhost ovn_metadata_agent[160585]: 2025-12-15 10:00:55.196 160858 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap576e7e59-a0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m Dec 15 05:00:55 localhost ovn_metadata_agent[160585]: 2025-12-15 10:00:55.196 160858 DEBUG oslo.privsep.daemon [-] privsep: reply[07558617-81d3-43f0-8de4-aab393351870]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Dec 15 05:00:55 localhost systemd[1]: Started Virtual Machine qemu-4-instance-00000008. 
Dec 15 05:00:55 localhost ovn_metadata_agent[160585]: 2025-12-15 10:00:55.197 160858 DEBUG oslo.privsep.daemon [-] privsep: reply[7ac8614b-8c49-4299-90ed-7ec92881b76f]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Dec 15 05:00:55 localhost ovn_metadata_agent[160585]: 2025-12-15 10:00:55.211 160979 DEBUG oslo.privsep.daemon [-] privsep: reply[c9c445e3-5907-4b21-83f2-61bd3834ee6f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Dec 15 05:00:55 localhost ovn_metadata_agent[160585]: 2025-12-15 10:00:55.225 160858 DEBUG oslo.privsep.daemon [-] privsep: reply[1382c2a0-a6ad-4624-9bad-05768cdfa033]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Dec 15 05:00:55 localhost ovn_metadata_agent[160585]: 2025-12-15 10:00:55.257 160959 DEBUG oslo.privsep.daemon [-] privsep: reply[7d3aa129-573c-47cf-97ab-1968dc39d428]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Dec 15 05:00:55 localhost NetworkManager[5963]: [1765792855.2655] manager: (tap576e7e59-a0): new Veth device (/org/freedesktop/NetworkManager/Devices/23) Dec 15 05:00:55 localhost ovn_metadata_agent[160585]: 2025-12-15 10:00:55.264 160858 DEBUG oslo.privsep.daemon [-] privsep: reply[68f96b20-32fc-489e-83a0-d8a2cf16d9ed]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Dec 15 05:00:55 localhost nova_compute[286344]: 2025-12-15 10:00:55.271 286348 DEBUG nova.virt.driver [None req-ed090d16-496a-473a-be36-97cf3cf6502a - - - - - -] Emitting event Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m Dec 15 05:00:55 localhost nova_compute[286344]: 2025-12-15 10:00:55.271 286348 INFO nova.compute.manager [None req-ed090d16-496a-473a-be36-97cf3cf6502a - - - - - -] [instance: 8f5a73ce-a171-4a42-9f60-63c0db9e0a32] VM Resumed (Lifecycle 
Event)#033[00m Dec 15 05:00:55 localhost nova_compute[286344]: 2025-12-15 10:00:55.292 286348 DEBUG nova.compute.manager [None req-ed090d16-496a-473a-be36-97cf3cf6502a - - - - - -] [instance: 8f5a73ce-a171-4a42-9f60-63c0db9e0a32] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m Dec 15 05:00:55 localhost nova_compute[286344]: 2025-12-15 10:00:55.306 286348 DEBUG nova.compute.manager [None req-ed090d16-496a-473a-be36-97cf3cf6502a - - - - - -] [instance: 8f5a73ce-a171-4a42-9f60-63c0db9e0a32] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: active, current task_state: migrating, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m Dec 15 05:00:55 localhost ovn_metadata_agent[160585]: 2025-12-15 10:00:55.310 160959 DEBUG oslo.privsep.daemon [-] privsep: reply[b90fbe85-a33c-4b9f-9fc2-68c9eb513ba6]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Dec 15 05:00:55 localhost ovn_metadata_agent[160585]: 2025-12-15 10:00:55.313 160959 DEBUG oslo.privsep.daemon [-] privsep: reply[b96c489a-bd14-4a3f-b7af-5527a120a855]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Dec 15 05:00:55 localhost NetworkManager[5963]: [1765792855.3377] device (tap576e7e59-a0): carrier: link connected Dec 15 05:00:55 localhost kernel: IPv6: ADDRCONF(NETDEV_CHANGE): tap576e7e59-a1: link becomes ready Dec 15 05:00:55 localhost kernel: IPv6: ADDRCONF(NETDEV_CHANGE): tap576e7e59-a0: link becomes ready Dec 15 05:00:55 localhost ovn_metadata_agent[160585]: 2025-12-15 10:00:55.341 160959 DEBUG oslo.privsep.daemon [-] privsep: reply[224bc1b5-e890-427c-9bb7-78116379662a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Dec 15 05:00:55 localhost nova_compute[286344]: 2025-12-15 10:00:55.354 286348 INFO 
nova.compute.manager [None req-ed090d16-496a-473a-be36-97cf3cf6502a - - - - - -] [instance: 8f5a73ce-a171-4a42-9f60-63c0db9e0a32] During the sync_power process the instance has moved from host np0005559463.localdomain to host np0005559462.localdomain#033[00m Dec 15 05:00:55 localhost ovn_metadata_agent[160585]: 2025-12-15 10:00:55.360 160858 DEBUG oslo.privsep.daemon [-] privsep: reply[6bf0335b-4ea0-4245-b860-20c42955c58b]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap576e7e59-a1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_QDISC', 'noqueue'], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['IFLA_ADDRESS', 'fa:16:3e:57:cc:9e'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 
'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 24], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 1, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 1192145, 'reachable_time': 37502, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 37, 'inpkts': 1, 'inoctets': 76, 
'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}]], 'header': {'length': 1400, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 314983, 'error': None, 'target': 'ovnmeta-576e7e59-adb4-4dcb-9b64-cc166b1e1e5f', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Dec 15 05:00:55 localhost ovn_metadata_agent[160585]: 2025-12-15 10:00:55.376 160858 DEBUG oslo.privsep.daemon [-] privsep: reply[3f411710-ae85-45dd-a8e1-d9d2cb5b0cd5]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe57:cc9e'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 1192145, 'tstamp': 1192145}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 314987, 'error': None, 'target': 'ovnmeta-576e7e59-adb4-4dcb-9b64-cc166b1e1e5f', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Dec 15 05:00:55 localhost ovn_metadata_agent[160585]: 2025-12-15 10:00:55.394 160858 DEBUG oslo.privsep.daemon [-] privsep: reply[d7ded6f2-726a-47bc-bde4-fcda4069e935]: (4, [{'family': 0, 
'__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap576e7e59-a1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_QDISC', 'noqueue'], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['IFLA_ADDRESS', 'fa:16:3e:57:cc:9e'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 24], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 
65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 1, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 1192145, 'reachable_time': 37502, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 37, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 
'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}]], 'header': {'length': 1400, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 314988, 'error': None, 'target': 'ovnmeta-576e7e59-adb4-4dcb-9b64-cc166b1e1e5f', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Dec 15 05:00:55 localhost ovn_metadata_agent[160585]: 2025-12-15 10:00:55.420 160858 DEBUG oslo.privsep.daemon [-] privsep: reply[9dcdfd9a-ef05-4f87-bbe3-099239335a5b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Dec 15 05:00:55 localhost ceph-mon[298913]: mon.np0005559462@0(leader).osd e99 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Dec 15 05:00:55 localhost ovn_metadata_agent[160585]: 2025-12-15 10:00:55.477 160858 DEBUG oslo.privsep.daemon [-] privsep: reply[73de0a47-3efe-4394-b76c-7cbc31fdb248]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Dec 15 05:00:55 localhost ovn_metadata_agent[160585]: 2025-12-15 10:00:55.480 160590 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap576e7e59-a0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m Dec 15 05:00:55 localhost ovn_metadata_agent[160585]: 2025-12-15 10:00:55.480 160590 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m Dec 15 05:00:55 localhost ovn_metadata_agent[160585]: 2025-12-15 10:00:55.481 
160590 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap576e7e59-a0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m Dec 15 05:00:55 localhost nova_compute[286344]: 2025-12-15 10:00:55.486 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:00:55 localhost kernel: device tap576e7e59-a0 entered promiscuous mode Dec 15 05:00:55 localhost nova_compute[286344]: 2025-12-15 10:00:55.491 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:00:55 localhost ovn_metadata_agent[160585]: 2025-12-15 10:00:55.492 160590 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap576e7e59-a0, col_values=(('external_ids', {'iface-id': '1a4411b8-2368-4b17-9d10-08f9c3480350'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m Dec 15 05:00:55 localhost nova_compute[286344]: 2025-12-15 10:00:55.494 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:00:55 localhost ovn_controller[154603]: 2025-12-15T10:00:55Z|00098|binding|INFO|Releasing lport 1a4411b8-2368-4b17-9d10-08f9c3480350 from this chassis (sb_readonly=0) Dec 15 05:00:55 localhost nova_compute[286344]: 2025-12-15 10:00:55.506 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:00:55 localhost ovn_metadata_agent[160585]: 2025-12-15 10:00:55.507 160590 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/576e7e59-adb4-4dcb-9b64-cc166b1e1e5f.pid.haproxy; Error: 
[Errno 2] No such file or directory: '/var/lib/neutron/external/pids/576e7e59-adb4-4dcb-9b64-cc166b1e1e5f.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m Dec 15 05:00:55 localhost ovn_metadata_agent[160585]: 2025-12-15 10:00:55.508 160858 DEBUG oslo.privsep.daemon [-] privsep: reply[9ed2a40b-4e06-4fab-a417-383df02e541e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Dec 15 05:00:55 localhost ovn_metadata_agent[160585]: 2025-12-15 10:00:55.509 160590 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = Dec 15 05:00:55 localhost ovn_metadata_agent[160585]: global Dec 15 05:00:55 localhost ovn_metadata_agent[160585]: log /dev/log local0 debug Dec 15 05:00:55 localhost ovn_metadata_agent[160585]: log-tag haproxy-metadata-proxy-576e7e59-adb4-4dcb-9b64-cc166b1e1e5f Dec 15 05:00:55 localhost ovn_metadata_agent[160585]: user root Dec 15 05:00:55 localhost ovn_metadata_agent[160585]: group root Dec 15 05:00:55 localhost ovn_metadata_agent[160585]: maxconn 1024 Dec 15 05:00:55 localhost ovn_metadata_agent[160585]: pidfile /var/lib/neutron/external/pids/576e7e59-adb4-4dcb-9b64-cc166b1e1e5f.pid.haproxy Dec 15 05:00:55 localhost ovn_metadata_agent[160585]: daemon Dec 15 05:00:55 localhost ovn_metadata_agent[160585]: Dec 15 05:00:55 localhost ovn_metadata_agent[160585]: defaults Dec 15 05:00:55 localhost ovn_metadata_agent[160585]: log global Dec 15 05:00:55 localhost ovn_metadata_agent[160585]: mode http Dec 15 05:00:55 localhost ovn_metadata_agent[160585]: option httplog Dec 15 05:00:55 localhost ovn_metadata_agent[160585]: option dontlognull Dec 15 05:00:55 localhost ovn_metadata_agent[160585]: option http-server-close Dec 15 05:00:55 localhost ovn_metadata_agent[160585]: option forwardfor Dec 15 05:00:55 localhost ovn_metadata_agent[160585]: retries 3 Dec 15 05:00:55 localhost ovn_metadata_agent[160585]: timeout http-request 30s Dec 15 05:00:55 localhost 
ovn_metadata_agent[160585]: timeout connect 30s Dec 15 05:00:55 localhost ovn_metadata_agent[160585]: timeout client 32s Dec 15 05:00:55 localhost ovn_metadata_agent[160585]: timeout server 32s Dec 15 05:00:55 localhost ovn_metadata_agent[160585]: timeout http-keep-alive 30s Dec 15 05:00:55 localhost ovn_metadata_agent[160585]: Dec 15 05:00:55 localhost ovn_metadata_agent[160585]: Dec 15 05:00:55 localhost ovn_metadata_agent[160585]: listen listener Dec 15 05:00:55 localhost ovn_metadata_agent[160585]: bind 169.254.169.254:80 Dec 15 05:00:55 localhost ovn_metadata_agent[160585]: server metadata /var/lib/neutron/metadata_proxy Dec 15 05:00:55 localhost ovn_metadata_agent[160585]: http-request add-header X-OVN-Network-ID 576e7e59-adb4-4dcb-9b64-cc166b1e1e5f Dec 15 05:00:55 localhost ovn_metadata_agent[160585]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m Dec 15 05:00:55 localhost ovn_metadata_agent[160585]: 2025-12-15 10:00:55.510 160590 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-576e7e59-adb4-4dcb-9b64-cc166b1e1e5f', 'env', 'PROCESS_TAG=haproxy-576e7e59-adb4-4dcb-9b64-cc166b1e1e5f', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/576e7e59-adb4-4dcb-9b64-cc166b1e1e5f.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m Dec 15 05:00:55 localhost systemd[1]: session-73.scope: Deactivated successfully. Dec 15 05:00:55 localhost systemd-logind[763]: Session 73 logged out. Waiting for processes to exit. Dec 15 05:00:55 localhost systemd-logind[763]: Removed session 73. 
Dec 15 05:00:55 localhost nova_compute[286344]: 2025-12-15 10:00:55.552 286348 DEBUG nova.virt.driver [None req-ed090d16-496a-473a-be36-97cf3cf6502a - - - - - -] Emitting event Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m Dec 15 05:00:55 localhost nova_compute[286344]: 2025-12-15 10:00:55.553 286348 INFO nova.compute.manager [None req-ed090d16-496a-473a-be36-97cf3cf6502a - - - - - -] [instance: 51fca94e-ecb9-4350-bafc-765e827a1c7b] VM Started (Lifecycle Event)#033[00m Dec 15 05:00:55 localhost nova_compute[286344]: 2025-12-15 10:00:55.577 286348 DEBUG nova.compute.manager [None req-ed090d16-496a-473a-be36-97cf3cf6502a - - - - - -] [instance: 51fca94e-ecb9-4350-bafc-765e827a1c7b] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m Dec 15 05:00:55 localhost nova_compute[286344]: 2025-12-15 10:00:55.581 286348 DEBUG nova.virt.driver [None req-ed090d16-496a-473a-be36-97cf3cf6502a - - - - - -] Emitting event Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m Dec 15 05:00:55 localhost nova_compute[286344]: 2025-12-15 10:00:55.582 286348 INFO nova.compute.manager [None req-ed090d16-496a-473a-be36-97cf3cf6502a - - - - - -] [instance: 51fca94e-ecb9-4350-bafc-765e827a1c7b] VM Paused (Lifecycle Event)#033[00m Dec 15 05:00:55 localhost nova_compute[286344]: 2025-12-15 10:00:55.613 286348 DEBUG nova.compute.manager [None req-ed090d16-496a-473a-be36-97cf3cf6502a - - - - - -] [instance: 51fca94e-ecb9-4350-bafc-765e827a1c7b] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m Dec 15 05:00:55 localhost nova_compute[286344]: 2025-12-15 10:00:55.617 286348 DEBUG nova.compute.manager [None req-ed090d16-496a-473a-be36-97cf3cf6502a - - - - - -] [instance: 51fca94e-ecb9-4350-bafc-765e827a1c7b] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, 
current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m Dec 15 05:00:55 localhost nova_compute[286344]: 2025-12-15 10:00:55.649 286348 INFO nova.compute.manager [None req-ed090d16-496a-473a-be36-97cf3cf6502a - - - - - -] [instance: 51fca94e-ecb9-4350-bafc-765e827a1c7b] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m Dec 15 05:00:55 localhost podman[315045]: Dec 15 05:00:55 localhost podman[315045]: 2025-12-15 10:00:55.975664984 +0000 UTC m=+0.103843842 container create 334c8a6eb0abcc8decf6d5ba63ae90dfeaef71d3014d8a42d417306392b53a6f (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-576e7e59-adb4-4dcb-9b64-cc166b1e1e5f, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image) Dec 15 05:00:56 localhost podman[315045]: 2025-12-15 10:00:55.922671307 +0000 UTC m=+0.050850225 image pull quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified Dec 15 05:00:56 localhost systemd[1]: Started libpod-conmon-334c8a6eb0abcc8decf6d5ba63ae90dfeaef71d3014d8a42d417306392b53a6f.scope. Dec 15 05:00:56 localhost systemd[1]: Started libcrun container. 
Dec 15 05:00:56 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/25934885af206e52e90093a8e194d448a632cfa70e1d6f763662fc57e634a973/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff) Dec 15 05:00:56 localhost podman[315045]: 2025-12-15 10:00:56.074918882 +0000 UTC m=+0.203097740 container init 334c8a6eb0abcc8decf6d5ba63ae90dfeaef71d3014d8a42d417306392b53a6f (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-576e7e59-adb4-4dcb-9b64-cc166b1e1e5f, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, tcib_managed=true) Dec 15 05:00:56 localhost podman[315045]: 2025-12-15 10:00:56.084566259 +0000 UTC m=+0.212745117 container start 334c8a6eb0abcc8decf6d5ba63ae90dfeaef71d3014d8a42d417306392b53a6f (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-576e7e59-adb4-4dcb-9b64-cc166b1e1e5f, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team) Dec 15 05:00:56 localhost neutron-haproxy-ovnmeta-576e7e59-adb4-4dcb-9b64-cc166b1e1e5f[315059]: [NOTICE] (315063) : New worker (315065) forked Dec 15 05:00:56 localhost neutron-haproxy-ovnmeta-576e7e59-adb4-4dcb-9b64-cc166b1e1e5f[315059]: [NOTICE] (315063) : Loading success. 
Dec 15 05:00:56 localhost ovn_metadata_agent[160585]: 2025-12-15 10:00:56.143 160590 INFO neutron.agent.ovn.metadata.agent [-] Port d92e338f-e5ff-4170-8303-27f41ee35ef3 in datapath 594a8673-651b-4566-92ec-8dbe6ca00b60 unbound from our chassis#033[00m Dec 15 05:00:56 localhost ovn_metadata_agent[160585]: 2025-12-15 10:00:56.149 160590 DEBUG neutron.agent.ovn.metadata.agent [-] Port defc7b23-8924-4b69-a932-1bf1428cc324 IP addresses were not retrieved from the Port_Binding MAC column ['unknown'] _get_port_ips /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:536#033[00m Dec 15 05:00:56 localhost ovn_metadata_agent[160585]: 2025-12-15 10:00:56.149 160590 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 594a8673-651b-4566-92ec-8dbe6ca00b60#033[00m Dec 15 05:00:56 localhost ovn_metadata_agent[160585]: 2025-12-15 10:00:56.155 160858 DEBUG oslo.privsep.daemon [-] privsep: reply[4ccbd31d-831f-43c4-89da-4abf9833f8e4]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Dec 15 05:00:56 localhost ovn_metadata_agent[160585]: 2025-12-15 10:00:56.156 160590 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap594a8673-61 in ovnmeta-594a8673-651b-4566-92ec-8dbe6ca00b60 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m Dec 15 05:00:56 localhost ovn_metadata_agent[160585]: 2025-12-15 10:00:56.157 160858 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap594a8673-60 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m Dec 15 05:00:56 localhost ovn_metadata_agent[160585]: 2025-12-15 10:00:56.158 160858 DEBUG oslo.privsep.daemon [-] privsep: reply[9f9b07fd-91a3-4d87-9876-8254c0d36bdc]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Dec 15 05:00:56 localhost ovn_metadata_agent[160585]: 2025-12-15 10:00:56.159 
160858 DEBUG oslo.privsep.daemon [-] privsep: reply[58b50610-a36f-41e4-a3d5-134b9536747a]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Dec 15 05:00:56 localhost ovn_metadata_agent[160585]: 2025-12-15 10:00:56.165 160979 DEBUG oslo.privsep.daemon [-] privsep: reply[f9718b3d-622b-4ad8-b206-963d4e8484ac]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Dec 15 05:00:56 localhost ovn_metadata_agent[160585]: 2025-12-15 10:00:56.185 160858 DEBUG oslo.privsep.daemon [-] privsep: reply[08e46a92-324d-4114-910b-5f760deb3bc6]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Dec 15 05:00:56 localhost ovn_metadata_agent[160585]: 2025-12-15 10:00:56.204 160959 DEBUG oslo.privsep.daemon [-] privsep: reply[bcc981fc-9caa-4726-9ae3-00f91ca98405]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Dec 15 05:00:56 localhost ovn_metadata_agent[160585]: 2025-12-15 10:00:56.210 160858 DEBUG oslo.privsep.daemon [-] privsep: reply[0ef269c8-6a8f-4184-8b22-1ebf424ccb13]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Dec 15 05:00:56 localhost NetworkManager[5963]: [1765792856.2123] manager: (tap594a8673-60): new Veth device (/org/freedesktop/NetworkManager/Devices/24) Dec 15 05:00:56 localhost systemd-udevd[314955]: Network interface NamePolicy= disabled on kernel command line. 
Dec 15 05:00:56 localhost ovn_metadata_agent[160585]: 2025-12-15 10:00:56.236 160959 DEBUG oslo.privsep.daemon [-] privsep: reply[9e08f1cc-e839-4142-8070-f574cbaca111]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Dec 15 05:00:56 localhost ovn_metadata_agent[160585]: 2025-12-15 10:00:56.239 160959 DEBUG oslo.privsep.daemon [-] privsep: reply[57bb66d8-3895-411b-9372-ba71e24e77aa]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Dec 15 05:00:56 localhost kernel: IPv6: ADDRCONF(NETDEV_CHANGE): tap594a8673-60: link becomes ready Dec 15 05:00:56 localhost NetworkManager[5963]: [1765792856.2622] device (tap594a8673-60): carrier: link connected Dec 15 05:00:56 localhost ovn_metadata_agent[160585]: 2025-12-15 10:00:56.264 160959 DEBUG oslo.privsep.daemon [-] privsep: reply[74da7496-aaa3-49c3-89d7-69b4d4f67406]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Dec 15 05:00:56 localhost ovn_metadata_agent[160585]: 2025-12-15 10:00:56.278 160858 DEBUG oslo.privsep.daemon [-] privsep: reply[9e050eba-5eb9-4088-aef3-40dad56edf81]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap594a8673-61'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_QDISC', 'noqueue'], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['IFLA_ADDRESS', 'fa:16:3e:de:17:b5'], ['IFLA_BROADCAST', 
'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 25], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 1, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 1192237, 'reachable_time': 22200, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 
1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 37, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}]], 'header': {'length': 1400, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 315084, 'error': None, 'target': 'ovnmeta-594a8673-651b-4566-92ec-8dbe6ca00b60', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Dec 15 05:00:56 localhost ovn_metadata_agent[160585]: 2025-12-15 10:00:56.290 160858 DEBUG oslo.privsep.daemon [-] privsep: reply[90530bdd-346a-425e-a211-d5174f422863]: (4, ({'family': 10, 'prefixlen': 64, 
'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fede:17b5'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 1192237, 'tstamp': 1192237}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 315085, 'error': None, 'target': 'ovnmeta-594a8673-651b-4566-92ec-8dbe6ca00b60', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Dec 15 05:00:56 localhost ovn_metadata_agent[160585]: 2025-12-15 10:00:56.302 160858 DEBUG oslo.privsep.daemon [-] privsep: reply[38ab5053-0601-4c7f-8ee6-9ee59cf54bc3]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap594a8673-61'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_QDISC', 'noqueue'], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['IFLA_ADDRESS', 'fa:16:3e:de:17:b5'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 
'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 25], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 1, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 1192237, 'reachable_time': 22200, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 
'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 37, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}]], 'header': {'length': 1400, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 315086, 'error': None, 'target': 'ovnmeta-594a8673-651b-4566-92ec-8dbe6ca00b60', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Dec 15 05:00:56 localhost ovn_metadata_agent[160585]: 2025-12-15 10:00:56.325 160858 DEBUG oslo.privsep.daemon [-] privsep: reply[b6480b23-2f97-42f3-aa96-e05711de8045]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Dec 15 05:00:56 localhost ovn_metadata_agent[160585]: 2025-12-15 10:00:56.367 160858 DEBUG oslo.privsep.daemon [-] privsep: reply[42c96776-4432-44c1-9f12-d8b21ca4d3e3]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Dec 15 05:00:56 localhost ovn_metadata_agent[160585]: 2025-12-15 10:00:56.368 160590 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 
command(idx=0): DelPortCommand(_result=None, port=tap594a8673-60, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m Dec 15 05:00:56 localhost ovn_metadata_agent[160585]: 2025-12-15 10:00:56.368 160590 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m Dec 15 05:00:56 localhost ovn_metadata_agent[160585]: 2025-12-15 10:00:56.369 160590 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap594a8673-60, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m Dec 15 05:00:56 localhost nova_compute[286344]: 2025-12-15 10:00:56.409 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:00:56 localhost kernel: device tap594a8673-60 entered promiscuous mode Dec 15 05:00:56 localhost nova_compute[286344]: 2025-12-15 10:00:56.413 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:00:56 localhost ovn_metadata_agent[160585]: 2025-12-15 10:00:56.414 160590 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap594a8673-60, col_values=(('external_ids', {'iface-id': 'eb478ea1-29fd-4e9a-85ee-b8b88d82f051'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m Dec 15 05:00:56 localhost nova_compute[286344]: 2025-12-15 10:00:56.415 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:00:56 localhost ovn_controller[154603]: 
2025-12-15T10:00:56Z|00099|binding|INFO|Releasing lport eb478ea1-29fd-4e9a-85ee-b8b88d82f051 from this chassis (sb_readonly=0) Dec 15 05:00:56 localhost nova_compute[286344]: 2025-12-15 10:00:56.429 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:00:56 localhost ovn_metadata_agent[160585]: 2025-12-15 10:00:56.429 160590 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/594a8673-651b-4566-92ec-8dbe6ca00b60.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/594a8673-651b-4566-92ec-8dbe6ca00b60.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m Dec 15 05:00:56 localhost ovn_metadata_agent[160585]: 2025-12-15 10:00:56.430 160858 DEBUG oslo.privsep.daemon [-] privsep: reply[78f0aad6-295b-4745-a226-cbe6c3c79526]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Dec 15 05:00:56 localhost ovn_metadata_agent[160585]: 2025-12-15 10:00:56.431 160590 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = Dec 15 05:00:56 localhost ovn_metadata_agent[160585]: global Dec 15 05:00:56 localhost ovn_metadata_agent[160585]: log /dev/log local0 debug Dec 15 05:00:56 localhost ovn_metadata_agent[160585]: log-tag haproxy-metadata-proxy-594a8673-651b-4566-92ec-8dbe6ca00b60 Dec 15 05:00:56 localhost ovn_metadata_agent[160585]: user root Dec 15 05:00:56 localhost ovn_metadata_agent[160585]: group root Dec 15 05:00:56 localhost ovn_metadata_agent[160585]: maxconn 1024 Dec 15 05:00:56 localhost ovn_metadata_agent[160585]: pidfile /var/lib/neutron/external/pids/594a8673-651b-4566-92ec-8dbe6ca00b60.pid.haproxy Dec 15 05:00:56 localhost ovn_metadata_agent[160585]: daemon Dec 15 05:00:56 localhost ovn_metadata_agent[160585]: Dec 15 05:00:56 localhost ovn_metadata_agent[160585]: defaults Dec 15 05:00:56 localhost 
ovn_metadata_agent[160585]: log global Dec 15 05:00:56 localhost ovn_metadata_agent[160585]: mode http Dec 15 05:00:56 localhost ovn_metadata_agent[160585]: option httplog Dec 15 05:00:56 localhost ovn_metadata_agent[160585]: option dontlognull Dec 15 05:00:56 localhost ovn_metadata_agent[160585]: option http-server-close Dec 15 05:00:56 localhost ovn_metadata_agent[160585]: option forwardfor Dec 15 05:00:56 localhost ovn_metadata_agent[160585]: retries 3 Dec 15 05:00:56 localhost ovn_metadata_agent[160585]: timeout http-request 30s Dec 15 05:00:56 localhost ovn_metadata_agent[160585]: timeout connect 30s Dec 15 05:00:56 localhost ovn_metadata_agent[160585]: timeout client 32s Dec 15 05:00:56 localhost ovn_metadata_agent[160585]: timeout server 32s Dec 15 05:00:56 localhost ovn_metadata_agent[160585]: timeout http-keep-alive 30s Dec 15 05:00:56 localhost ovn_metadata_agent[160585]: Dec 15 05:00:56 localhost ovn_metadata_agent[160585]: Dec 15 05:00:56 localhost ovn_metadata_agent[160585]: listen listener Dec 15 05:00:56 localhost ovn_metadata_agent[160585]: bind 169.254.169.254:80 Dec 15 05:00:56 localhost ovn_metadata_agent[160585]: server metadata /var/lib/neutron/metadata_proxy Dec 15 05:00:56 localhost ovn_metadata_agent[160585]: http-request add-header X-OVN-Network-ID 594a8673-651b-4566-92ec-8dbe6ca00b60 Dec 15 05:00:56 localhost ovn_metadata_agent[160585]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m Dec 15 05:00:56 localhost ovn_metadata_agent[160585]: 2025-12-15 10:00:56.431 160590 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-594a8673-651b-4566-92ec-8dbe6ca00b60', 'env', 'PROCESS_TAG=haproxy-594a8673-651b-4566-92ec-8dbe6ca00b60', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/594a8673-651b-4566-92ec-8dbe6ca00b60.conf'] create_process 
/usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m Dec 15 05:00:56 localhost nova_compute[286344]: 2025-12-15 10:00:56.509 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:00:56 localhost podman[315118]: Dec 15 05:00:56 localhost nova_compute[286344]: 2025-12-15 10:00:56.926 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:00:56 localhost podman[315118]: 2025-12-15 10:00:56.932151958 +0000 UTC m=+0.102667863 container create 98fad658a9ebb45138a90f607082ed4e2385050ae13da1b945cf17b9b016e32b (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-594a8673-651b-4566-92ec-8dbe6ca00b60, tcib_managed=true, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team) Dec 15 05:00:56 localhost podman[315118]: 2025-12-15 10:00:56.872344792 +0000 UTC m=+0.042860767 image pull quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified Dec 15 05:00:56 localhost systemd[1]: Started libpod-conmon-98fad658a9ebb45138a90f607082ed4e2385050ae13da1b945cf17b9b016e32b.scope. Dec 15 05:00:56 localhost systemd[1]: Started libcrun container. 
Dec 15 05:00:56 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/ee0c9f336082978f565ae45816420858db6936fbc14607ad49c7da4487c6fa27/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff) Dec 15 05:00:57 localhost podman[315118]: 2025-12-15 10:00:57.004240214 +0000 UTC m=+0.174756119 container init 98fad658a9ebb45138a90f607082ed4e2385050ae13da1b945cf17b9b016e32b (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-594a8673-651b-4566-92ec-8dbe6ca00b60, tcib_managed=true, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb) Dec 15 05:00:57 localhost podman[315118]: 2025-12-15 10:00:57.019306836 +0000 UTC m=+0.189822741 container start 98fad658a9ebb45138a90f607082ed4e2385050ae13da1b945cf17b9b016e32b (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-594a8673-651b-4566-92ec-8dbe6ca00b60, tcib_managed=true, io.buildah.version=1.41.3, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team) Dec 15 05:00:57 localhost neutron-haproxy-ovnmeta-594a8673-651b-4566-92ec-8dbe6ca00b60[315133]: [NOTICE] (315137) : New worker (315139) forked Dec 15 05:00:57 localhost neutron-haproxy-ovnmeta-594a8673-651b-4566-92ec-8dbe6ca00b60[315133]: [NOTICE] (315137) : Loading success. 
Dec 15 05:00:57 localhost neutron_sriov_agent[260044]: 2025-12-15 10:00:57.644 2 INFO neutron.agent.securitygroups_rpc [req-e7521624-5024-4459-907a-702dcb2e768b req-8b0c42d8-ba44-48ae-854c-1e47d19cbb47 35309870d5bd457f95d32ae22fa4f72e b24a65ffabd5489d869d7df73e041227 - - default default] Security group rule updated ['69353b8f-deec-4c84-bc97-101538ef1eed']#033[00m Dec 15 05:00:58 localhost ovn_controller[154603]: 2025-12-15T10:00:58Z|00100|binding|INFO|Claiming lport 19409185-ff01-4d58-bd6e-5d66014dd5c9 for this chassis. Dec 15 05:00:58 localhost ovn_controller[154603]: 2025-12-15T10:00:58Z|00101|binding|INFO|19409185-ff01-4d58-bd6e-5d66014dd5c9: Claiming fa:16:3e:df:04:be 10.100.0.13 Dec 15 05:00:58 localhost ovn_controller[154603]: 2025-12-15T10:00:58Z|00102|binding|INFO|Claiming lport fd72ab01-c63c-4f51-ae94-ca02f6f50dfa for this chassis. Dec 15 05:00:58 localhost ovn_controller[154603]: 2025-12-15T10:00:58Z|00103|binding|INFO|fd72ab01-c63c-4f51-ae94-ca02f6f50dfa: Claiming fa:16:3e:79:ba:17 19.80.0.198 Dec 15 05:00:58 localhost ovn_controller[154603]: 2025-12-15T10:00:58Z|00104|binding|INFO|Setting lport 19409185-ff01-4d58-bd6e-5d66014dd5c9 up in Southbound Dec 15 05:00:58 localhost ovn_controller[154603]: 2025-12-15T10:00:58Z|00105|binding|INFO|Setting lport fd72ab01-c63c-4f51-ae94-ca02f6f50dfa up in Southbound Dec 15 05:00:58 localhost ovn_metadata_agent[160585]: 2025-12-15 10:00:58.204 160590 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:df:04:be 10.100.0.13'], port_security=['fa:16:3e:df:04:be 10.100.0.13'], type=, nat_addresses=[], virtual_parent=[], up=[True], options={'requested-chassis': 'np0005559462.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'name': 'tempest-parent-2031244324', 'neutron:cidrs': '10.100.0.13/28', 'neutron:device_id': 
'8f5a73ce-a171-4a42-9f60-63c0db9e0a32', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-ff0cb653-0dc9-48ed-8a20-700650a10509', 'neutron:port_capabilities': '', 'neutron:port_name': 'tempest-parent-2031244324', 'neutron:project_id': '4e26599ab4374eb6b7e9d73e7dfebd03', 'neutron:revision_number': '9', 'neutron:security_group_ids': '489315f5-1e14-49f6-8195-de837d23b935', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'np0005559463.localdomain'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=0504d409-4e28-4301-a421-b053ce853c40, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[], logical_port=19409185-ff01-4d58-bd6e-5d66014dd5c9) old=Port_Binding(up=[False], additional_chassis=[], chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Dec 15 05:00:58 localhost ovn_metadata_agent[160585]: 2025-12-15 10:00:58.208 160590 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:79:ba:17 19.80.0.198'], port_security=['fa:16:3e:79:ba:17 19.80.0.198'], type=, nat_addresses=[], virtual_parent=[], up=[True], options={'requested-chassis': ''}, parent_port=['19409185-ff01-4d58-bd6e-5d66014dd5c9'], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'name': 'tempest-subport-145676246', 'neutron:cidrs': '19.80.0.198/24', 'neutron:device_id': '', 'neutron:device_owner': 'trunk:subport', 'neutron:mtu': '', 'neutron:network_name': 'neutron-0874a63c-034d-48db-8f84-a51c1fe90687', 'neutron:port_capabilities': '', 'neutron:port_name': 'tempest-subport-145676246', 'neutron:project_id': '4e26599ab4374eb6b7e9d73e7dfebd03', 'neutron:revision_number': '4', 'neutron:security_group_ids': 
'489315f5-1e14-49f6-8195-de837d23b935', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[42], additional_encap=[], encap=[], mirror_rules=[], datapath=d3aba774-abb1-4ddc-beaa-cd88849a49e2, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[], logical_port=fd72ab01-c63c-4f51-ae94-ca02f6f50dfa) old=Port_Binding(up=[False], additional_chassis=[], chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Dec 15 05:00:58 localhost ovn_metadata_agent[160585]: 2025-12-15 10:00:58.210 160590 INFO neutron.agent.ovn.metadata.agent [-] Port 19409185-ff01-4d58-bd6e-5d66014dd5c9 in datapath ff0cb653-0dc9-48ed-8a20-700650a10509 bound to our chassis#033[00m Dec 15 05:00:58 localhost ovn_metadata_agent[160585]: 2025-12-15 10:00:58.214 160590 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network ff0cb653-0dc9-48ed-8a20-700650a10509#033[00m Dec 15 05:00:58 localhost ovn_metadata_agent[160585]: 2025-12-15 10:00:58.224 160858 DEBUG oslo.privsep.daemon [-] privsep: reply[7194b10a-dad8-423b-8b04-71d56c75aff5]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Dec 15 05:00:58 localhost ovn_metadata_agent[160585]: 2025-12-15 10:00:58.225 160590 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapff0cb653-01 in ovnmeta-ff0cb653-0dc9-48ed-8a20-700650a10509 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m Dec 15 05:00:58 localhost ovn_metadata_agent[160585]: 2025-12-15 10:00:58.227 160858 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapff0cb653-00 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m Dec 15 05:00:58 localhost ovn_metadata_agent[160585]: 2025-12-15 10:00:58.227 160858 DEBUG oslo.privsep.daemon [-] privsep: 
reply[5c12547b-396f-4f25-bf14-f2701a2f340e]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Dec 15 05:00:58 localhost ovn_metadata_agent[160585]: 2025-12-15 10:00:58.228 160858 DEBUG oslo.privsep.daemon [-] privsep: reply[55292888-a6ba-40af-8f59-ee5f10282578]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Dec 15 05:00:58 localhost ovn_metadata_agent[160585]: 2025-12-15 10:00:58.243 160979 DEBUG oslo.privsep.daemon [-] privsep: reply[a49c8f4a-8ade-408d-9029-bfe1cd168ecb]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Dec 15 05:00:58 localhost ovn_metadata_agent[160585]: 2025-12-15 10:00:58.256 160858 DEBUG oslo.privsep.daemon [-] privsep: reply[45003450-28aa-4f61-b90b-a15e9b03e273]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Dec 15 05:00:58 localhost ovn_metadata_agent[160585]: 2025-12-15 10:00:58.281 160959 DEBUG oslo.privsep.daemon [-] privsep: reply[4b1a6574-6861-48bd-b0db-5d1fb932e1a9]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Dec 15 05:00:58 localhost ovn_metadata_agent[160585]: 2025-12-15 10:00:58.287 160858 DEBUG oslo.privsep.daemon [-] privsep: reply[5177d205-63a5-4be5-a372-ea178c4eaf9c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Dec 15 05:00:58 localhost NetworkManager[5963]: [1765792858.2881] manager: (tapff0cb653-00): new Veth device (/org/freedesktop/NetworkManager/Devices/25) Dec 15 05:00:58 localhost ovn_metadata_agent[160585]: 2025-12-15 10:00:58.317 160959 DEBUG oslo.privsep.daemon [-] privsep: reply[132d2a5e-c68d-4c0b-b9f0-ae27d0e61ddf]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Dec 15 05:00:58 localhost ovn_metadata_agent[160585]: 2025-12-15 10:00:58.321 160959 DEBUG oslo.privsep.daemon 
[-] privsep: reply[96b65cdf-56d4-49ea-84d6-d698f4acee66]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Dec 15 05:00:58 localhost NetworkManager[5963]: [1765792858.3426] device (tapff0cb653-00): carrier: link connected Dec 15 05:00:58 localhost kernel: IPv6: ADDRCONF(NETDEV_CHANGE): tapff0cb653-01: link becomes ready Dec 15 05:00:58 localhost kernel: IPv6: ADDRCONF(NETDEV_CHANGE): tapff0cb653-00: link becomes ready Dec 15 05:00:58 localhost ovn_metadata_agent[160585]: 2025-12-15 10:00:58.349 160959 DEBUG oslo.privsep.daemon [-] privsep: reply[2b775014-361b-4576-bbde-8b497a811eb2]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Dec 15 05:00:58 localhost ovn_metadata_agent[160585]: 2025-12-15 10:00:58.365 160858 DEBUG oslo.privsep.daemon [-] privsep: reply[1e90ed84-4912-4896-8c37-71dfc3b8c8a2]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapff0cb653-01'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_QDISC', 'noqueue'], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['IFLA_ADDRESS', 'fa:16:3e:12:41:52'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 
'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 26], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 1, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 1192446, 'reachable_time': 44883, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 
'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 37, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}]], 'header': {'length': 1400, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 315159, 'error': None, 'target': 'ovnmeta-ff0cb653-0dc9-48ed-8a20-700650a10509', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Dec 15 05:00:58 localhost ovn_metadata_agent[160585]: 2025-12-15 10:00:58.385 160858 DEBUG oslo.privsep.daemon [-] privsep: reply[4883a678-1e17-4278-9163-4de4d9b6ba02]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe12:4152'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 1192446, 'tstamp': 1192446}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 
20, 'flags': 2, 'sequence_number': 255, 'pid': 315160, 'error': None, 'target': 'ovnmeta-ff0cb653-0dc9-48ed-8a20-700650a10509', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Dec 15 05:00:58 localhost ovn_metadata_agent[160585]: 2025-12-15 10:00:58.401 160858 DEBUG oslo.privsep.daemon [-] privsep: reply[fa0fa3f2-db1b-40c8-843e-4d6d8cb2c8a3]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapff0cb653-01'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_QDISC', 'noqueue'], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['IFLA_ADDRESS', 'fa:16:3e:12:41:52'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 
'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 26], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 1, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 1192446, 'reachable_time': 44883, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 37, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 
'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}]], 'header': {'length': 1400, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 315161, 'error': None, 'target': 'ovnmeta-ff0cb653-0dc9-48ed-8a20-700650a10509', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Dec 15 05:00:58 localhost ovn_metadata_agent[160585]: 2025-12-15 10:00:58.429 160858 DEBUG oslo.privsep.daemon [-] privsep: reply[447477a8-7db9-470f-b162-f7002eb79f80]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Dec 15 05:00:58 localhost neutron_sriov_agent[260044]: 2025-12-15 10:00:58.481 2 WARNING neutron.plugins.ml2.drivers.mech_sriov.agent.sriov_nic_agent [req-93ddfe1e-4219-4b1f-8860-eb27ca1215ed req-603e765c-44b4-4aef-a2b9-3ed6bd7b3f5c 99f2c8ae65a24145818a9cc2c7407cf0 2f96e5f995e6442f94f881dabbe4dbf7 - - default default] This port is not SRIOV, skip binding for port 19409185-ff01-4d58-bd6e-5d66014dd5c9.#033[00m Dec 15 05:00:58 localhost ovn_metadata_agent[160585]: 2025-12-15 10:00:58.484 160858 DEBUG oslo.privsep.daemon [-] privsep: reply[9469a231-e710-4818-ba9c-9f929cbb9d4d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Dec 15 
05:00:58 localhost ovn_metadata_agent[160585]: 2025-12-15 10:00:58.486 160590 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapff0cb653-00, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m Dec 15 05:00:58 localhost ovn_metadata_agent[160585]: 2025-12-15 10:00:58.487 160590 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m Dec 15 05:00:58 localhost ovn_metadata_agent[160585]: 2025-12-15 10:00:58.487 160590 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapff0cb653-00, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m Dec 15 05:00:58 localhost kernel: device tapff0cb653-00 entered promiscuous mode Dec 15 05:00:58 localhost nova_compute[286344]: 2025-12-15 10:00:58.524 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:00:58 localhost ovn_metadata_agent[160585]: 2025-12-15 10:00:58.529 160590 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapff0cb653-00, col_values=(('external_ids', {'iface-id': '3c672ee2-c7c4-4620-810e-6910f497dd1e'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m Dec 15 05:00:58 localhost nova_compute[286344]: 2025-12-15 10:00:58.531 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:00:58 localhost ovn_controller[154603]: 2025-12-15T10:00:58Z|00106|binding|INFO|Releasing lport 3c672ee2-c7c4-4620-810e-6910f497dd1e from this 
chassis (sb_readonly=0) Dec 15 05:00:58 localhost nova_compute[286344]: 2025-12-15 10:00:58.543 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:00:58 localhost ovn_metadata_agent[160585]: 2025-12-15 10:00:58.544 160590 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/ff0cb653-0dc9-48ed-8a20-700650a10509.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/ff0cb653-0dc9-48ed-8a20-700650a10509.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m Dec 15 05:00:58 localhost ovn_metadata_agent[160585]: 2025-12-15 10:00:58.545 160858 DEBUG oslo.privsep.daemon [-] privsep: reply[eb2a3f63-0334-4bd2-b808-e0045480debd]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Dec 15 05:00:58 localhost ovn_metadata_agent[160585]: 2025-12-15 10:00:58.546 160590 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = Dec 15 05:00:58 localhost ovn_metadata_agent[160585]: global Dec 15 05:00:58 localhost ovn_metadata_agent[160585]: log /dev/log local0 debug Dec 15 05:00:58 localhost ovn_metadata_agent[160585]: log-tag haproxy-metadata-proxy-ff0cb653-0dc9-48ed-8a20-700650a10509 Dec 15 05:00:58 localhost ovn_metadata_agent[160585]: user root Dec 15 05:00:58 localhost ovn_metadata_agent[160585]: group root Dec 15 05:00:58 localhost ovn_metadata_agent[160585]: maxconn 1024 Dec 15 05:00:58 localhost ovn_metadata_agent[160585]: pidfile /var/lib/neutron/external/pids/ff0cb653-0dc9-48ed-8a20-700650a10509.pid.haproxy Dec 15 05:00:58 localhost ovn_metadata_agent[160585]: daemon Dec 15 05:00:58 localhost ovn_metadata_agent[160585]: Dec 15 05:00:58 localhost ovn_metadata_agent[160585]: defaults Dec 15 05:00:58 localhost ovn_metadata_agent[160585]: log global Dec 15 05:00:58 localhost ovn_metadata_agent[160585]: mode http Dec 15 05:00:58 
localhost ovn_metadata_agent[160585]: option httplog Dec 15 05:00:58 localhost ovn_metadata_agent[160585]: option dontlognull Dec 15 05:00:58 localhost ovn_metadata_agent[160585]: option http-server-close Dec 15 05:00:58 localhost ovn_metadata_agent[160585]: option forwardfor Dec 15 05:00:58 localhost ovn_metadata_agent[160585]: retries 3 Dec 15 05:00:58 localhost ovn_metadata_agent[160585]: timeout http-request 30s Dec 15 05:00:58 localhost ovn_metadata_agent[160585]: timeout connect 30s Dec 15 05:00:58 localhost ovn_metadata_agent[160585]: timeout client 32s Dec 15 05:00:58 localhost ovn_metadata_agent[160585]: timeout server 32s Dec 15 05:00:58 localhost ovn_metadata_agent[160585]: timeout http-keep-alive 30s Dec 15 05:00:58 localhost ovn_metadata_agent[160585]: Dec 15 05:00:58 localhost ovn_metadata_agent[160585]: Dec 15 05:00:58 localhost ovn_metadata_agent[160585]: listen listener Dec 15 05:00:58 localhost ovn_metadata_agent[160585]: bind 169.254.169.254:80 Dec 15 05:00:58 localhost ovn_metadata_agent[160585]: server metadata /var/lib/neutron/metadata_proxy Dec 15 05:00:58 localhost ovn_metadata_agent[160585]: http-request add-header X-OVN-Network-ID ff0cb653-0dc9-48ed-8a20-700650a10509 Dec 15 05:00:58 localhost ovn_metadata_agent[160585]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m Dec 15 05:00:58 localhost ovn_metadata_agent[160585]: 2025-12-15 10:00:58.546 160590 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-ff0cb653-0dc9-48ed-8a20-700650a10509', 'env', 'PROCESS_TAG=haproxy-ff0cb653-0dc9-48ed-8a20-700650a10509', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/ff0cb653-0dc9-48ed-8a20-700650a10509.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m Dec 15 05:00:58 localhost nova_compute[286344]: 2025-12-15 10:00:58.644 286348 INFO nova.compute.manager [None 
req-93ddfe1e-4219-4b1f-8860-eb27ca1215ed 58b2de48454141d48ee4d84c6ec84836 7526febbe14640b09a9f0897a2f4af8c - - default default] [instance: 8f5a73ce-a171-4a42-9f60-63c0db9e0a32] Post operation of migration started#033[00m Dec 15 05:00:58 localhost nova_compute[286344]: 2025-12-15 10:00:58.775 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:00:58 localhost nova_compute[286344]: 2025-12-15 10:00:58.810 286348 DEBUG oslo_concurrency.lockutils [None req-93ddfe1e-4219-4b1f-8860-eb27ca1215ed 58b2de48454141d48ee4d84c6ec84836 7526febbe14640b09a9f0897a2f4af8c - - default default] Acquiring lock "refresh_cache-8f5a73ce-a171-4a42-9f60-63c0db9e0a32" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m Dec 15 05:00:58 localhost nova_compute[286344]: 2025-12-15 10:00:58.811 286348 DEBUG oslo_concurrency.lockutils [None req-93ddfe1e-4219-4b1f-8860-eb27ca1215ed 58b2de48454141d48ee4d84c6ec84836 7526febbe14640b09a9f0897a2f4af8c - - default default] Acquired lock "refresh_cache-8f5a73ce-a171-4a42-9f60-63c0db9e0a32" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m Dec 15 05:00:58 localhost nova_compute[286344]: 2025-12-15 10:00:58.812 286348 DEBUG nova.network.neutron [None req-93ddfe1e-4219-4b1f-8860-eb27ca1215ed 58b2de48454141d48ee4d84c6ec84836 7526febbe14640b09a9f0897a2f4af8c - - default default] [instance: 8f5a73ce-a171-4a42-9f60-63c0db9e0a32] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m Dec 15 05:00:58 localhost podman[315194]: Dec 15 05:00:58 localhost podman[315194]: 2025-12-15 10:00:58.989119114 +0000 UTC m=+0.069186657 container create fbf5e5fffb8688761fea1889fc3c1bf4b00ba85f56a27be317821bcad0f556a3 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, 
name=neutron-haproxy-ovnmeta-ff0cb653-0dc9-48ed-8a20-700650a10509, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb) Dec 15 05:00:59 localhost systemd[1]: Started libpod-conmon-fbf5e5fffb8688761fea1889fc3c1bf4b00ba85f56a27be317821bcad0f556a3.scope. Dec 15 05:00:59 localhost systemd[1]: Started libcrun container. Dec 15 05:00:59 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/52e193136e567be6221a0f3be354d690b359143abb37b7522fe279a9cd2c04a9/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff) Dec 15 05:00:59 localhost podman[315194]: 2025-12-15 10:00:59.048477207 +0000 UTC m=+0.128544740 container init fbf5e5fffb8688761fea1889fc3c1bf4b00ba85f56a27be317821bcad0f556a3 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-ff0cb653-0dc9-48ed-8a20-700650a10509, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_managed=true, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2) Dec 15 05:00:59 localhost podman[315194]: 2025-12-15 10:00:59.057432108 +0000 UTC m=+0.137499651 container start fbf5e5fffb8688761fea1889fc3c1bf4b00ba85f56a27be317821bcad0f556a3 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-ff0cb653-0dc9-48ed-8a20-700650a10509, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251202, maintainer=OpenStack Kubernetes 
Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_managed=true) Dec 15 05:00:59 localhost podman[315194]: 2025-12-15 10:00:58.960496728 +0000 UTC m=+0.040564321 image pull quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified Dec 15 05:00:59 localhost neutron-haproxy-ovnmeta-ff0cb653-0dc9-48ed-8a20-700650a10509[315208]: [NOTICE] (315212) : New worker (315214) forked Dec 15 05:00:59 localhost neutron-haproxy-ovnmeta-ff0cb653-0dc9-48ed-8a20-700650a10509[315208]: [NOTICE] (315212) : Loading success. Dec 15 05:00:59 localhost ovn_metadata_agent[160585]: 2025-12-15 10:00:59.114 160590 INFO neutron.agent.ovn.metadata.agent [-] Port fd72ab01-c63c-4f51-ae94-ca02f6f50dfa in datapath 0874a63c-034d-48db-8f84-a51c1fe90687 unbound from our chassis#033[00m Dec 15 05:00:59 localhost ovn_metadata_agent[160585]: 2025-12-15 10:00:59.119 160590 DEBUG neutron.agent.ovn.metadata.agent [-] Port 88d39070-3e5a-4b84-a35d-cecd6f04634a IP addresses were not retrieved from the Port_Binding MAC column ['unknown'] _get_port_ips /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:536#033[00m Dec 15 05:00:59 localhost ovn_metadata_agent[160585]: 2025-12-15 10:00:59.120 160590 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 0874a63c-034d-48db-8f84-a51c1fe90687#033[00m Dec 15 05:00:59 localhost ovn_metadata_agent[160585]: 2025-12-15 10:00:59.127 160858 DEBUG oslo.privsep.daemon [-] privsep: reply[6869fd42-e584-48f7-a373-14c613c82ea5]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Dec 15 05:00:59 localhost ovn_metadata_agent[160585]: 2025-12-15 10:00:59.128 160590 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap0874a63c-01 in ovnmeta-0874a63c-034d-48db-8f84-a51c1fe90687 namespace provision_datapath 
/usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m Dec 15 05:00:59 localhost ovn_metadata_agent[160585]: 2025-12-15 10:00:59.130 160858 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap0874a63c-00 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m Dec 15 05:00:59 localhost ovn_metadata_agent[160585]: 2025-12-15 10:00:59.130 160858 DEBUG oslo.privsep.daemon [-] privsep: reply[882b7811-ca6f-471c-ab4d-91a342db3f3f]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Dec 15 05:00:59 localhost ovn_metadata_agent[160585]: 2025-12-15 10:00:59.131 160858 DEBUG oslo.privsep.daemon [-] privsep: reply[46bd5eb2-dc05-43a0-88a6-d5421b8f4722]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Dec 15 05:00:59 localhost ovn_metadata_agent[160585]: 2025-12-15 10:00:59.140 160979 DEBUG oslo.privsep.daemon [-] privsep: reply[18bb5558-cf23-4da1-88b3-005fc018032b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Dec 15 05:00:59 localhost ovn_metadata_agent[160585]: 2025-12-15 10:00:59.150 160858 DEBUG oslo.privsep.daemon [-] privsep: reply[e880d292-0fb4-4b79-bcf1-cd855f10c389]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Dec 15 05:00:59 localhost nova_compute[286344]: 2025-12-15 10:00:59.152 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:00:59 localhost ovn_metadata_agent[160585]: 2025-12-15 10:00:59.174 160959 DEBUG oslo.privsep.daemon [-] privsep: reply[33ab59dc-4a3e-4143-bc3a-bfe445b5eee7]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Dec 15 05:00:59 localhost NetworkManager[5963]: [1765792859.1866] manager: 
(tap0874a63c-00): new Veth device (/org/freedesktop/NetworkManager/Devices/26) Dec 15 05:00:59 localhost ovn_metadata_agent[160585]: 2025-12-15 10:00:59.185 160858 DEBUG oslo.privsep.daemon [-] privsep: reply[8d52368f-bab0-4254-9e93-7413d7defe86]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Dec 15 05:00:59 localhost ovn_metadata_agent[160585]: 2025-12-15 10:00:59.218 160959 DEBUG oslo.privsep.daemon [-] privsep: reply[b2768587-1354-4a8c-9ffa-929a65eac912]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Dec 15 05:00:59 localhost ovn_metadata_agent[160585]: 2025-12-15 10:00:59.221 160959 DEBUG oslo.privsep.daemon [-] privsep: reply[a98cb3c4-4809-458f-97e5-c22443590274]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Dec 15 05:00:59 localhost NetworkManager[5963]: [1765792859.2437] device (tap0874a63c-00): carrier: link connected Dec 15 05:00:59 localhost kernel: IPv6: ADDRCONF(NETDEV_CHANGE): tap0874a63c-00: link becomes ready Dec 15 05:00:59 localhost ovn_metadata_agent[160585]: 2025-12-15 10:00:59.249 160959 DEBUG oslo.privsep.daemon [-] privsep: reply[cc3a99c6-646e-4bf4-999b-a88e0870f17c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Dec 15 05:00:59 localhost ovn_metadata_agent[160585]: 2025-12-15 10:00:59.266 160858 DEBUG oslo.privsep.daemon [-] privsep: reply[4f29efc5-9c59-4cea-a9e4-f525b5523041]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap0874a63c-01'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], 
['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_QDISC', 'noqueue'], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['IFLA_ADDRESS', 'fa:16:3e:b5:21:ca'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 27], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 1, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 
'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 1192536, 'reachable_time': 33010, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 37, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}]], 'header': {'length': 1400, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 315233, 'error': None, 'target': 
'ovnmeta-0874a63c-034d-48db-8f84-a51c1fe90687', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Dec 15 05:00:59 localhost ovn_metadata_agent[160585]: 2025-12-15 10:00:59.284 160858 DEBUG oslo.privsep.daemon [-] privsep: reply[cb2c5126-8fa1-4a93-82b7-e3f789bc5007]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:feb5:21ca'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 1192536, 'tstamp': 1192536}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 315234, 'error': None, 'target': 'ovnmeta-0874a63c-034d-48db-8f84-a51c1fe90687', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Dec 15 05:00:59 localhost ovn_metadata_agent[160585]: 2025-12-15 10:00:59.301 160858 DEBUG oslo.privsep.daemon [-] privsep: reply[8476dc86-aa10-4b72-81c9-1a8776546096]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap0874a63c-01'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_QDISC', 'noqueue'], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['IFLA_ADDRESS', 'fa:16:3e:b5:21:ca'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 
'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 27], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 1, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 1192536, 'reachable_time': 33010, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 
'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 37, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}]], 'header': {'length': 1400, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 315235, 'error': None, 'target': 'ovnmeta-0874a63c-034d-48db-8f84-a51c1fe90687', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Dec 15 05:00:59 localhost ovn_metadata_agent[160585]: 2025-12-15 10:00:59.331 160858 DEBUG oslo.privsep.daemon [-] privsep: reply[3cf7e1be-a00c-4eb1-b730-5b5232491cd1]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Dec 15 
05:00:59 localhost ovn_metadata_agent[160585]: 2025-12-15 10:00:59.385 160858 DEBUG oslo.privsep.daemon [-] privsep: reply[fb798d72-7595-4fa7-a31e-a80d009bda6a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Dec 15 05:00:59 localhost ovn_metadata_agent[160585]: 2025-12-15 10:00:59.386 160590 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap0874a63c-00, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m Dec 15 05:00:59 localhost ovn_metadata_agent[160585]: 2025-12-15 10:00:59.387 160590 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m Dec 15 05:00:59 localhost ovn_metadata_agent[160585]: 2025-12-15 10:00:59.388 160590 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap0874a63c-00, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m Dec 15 05:00:59 localhost nova_compute[286344]: 2025-12-15 10:00:59.390 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:00:59 localhost kernel: device tap0874a63c-00 entered promiscuous mode Dec 15 05:00:59 localhost ovn_metadata_agent[160585]: 2025-12-15 10:00:59.396 160590 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap0874a63c-00, col_values=(('external_ids', {'iface-id': 'c3be4d70-09ff-4304-ad4e-0464f0628ae9'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m Dec 15 05:00:59 localhost ovn_controller[154603]: 2025-12-15T10:00:59Z|00107|binding|INFO|Releasing lport 
c3be4d70-09ff-4304-ad4e-0464f0628ae9 from this chassis (sb_readonly=0) Dec 15 05:00:59 localhost nova_compute[286344]: 2025-12-15 10:00:59.399 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:00:59 localhost nova_compute[286344]: 2025-12-15 10:00:59.409 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:00:59 localhost ovn_metadata_agent[160585]: 2025-12-15 10:00:59.410 160590 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/0874a63c-034d-48db-8f84-a51c1fe90687.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/0874a63c-034d-48db-8f84-a51c1fe90687.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m Dec 15 05:00:59 localhost ovn_metadata_agent[160585]: 2025-12-15 10:00:59.411 160858 DEBUG oslo.privsep.daemon [-] privsep: reply[f77c192b-2685-4007-a352-05a712baa27e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Dec 15 05:00:59 localhost ovn_metadata_agent[160585]: 2025-12-15 10:00:59.412 160590 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = Dec 15 05:00:59 localhost ovn_metadata_agent[160585]: global Dec 15 05:00:59 localhost ovn_metadata_agent[160585]: log /dev/log local0 debug Dec 15 05:00:59 localhost ovn_metadata_agent[160585]: log-tag haproxy-metadata-proxy-0874a63c-034d-48db-8f84-a51c1fe90687 Dec 15 05:00:59 localhost ovn_metadata_agent[160585]: user root Dec 15 05:00:59 localhost ovn_metadata_agent[160585]: group root Dec 15 05:00:59 localhost ovn_metadata_agent[160585]: maxconn 1024 Dec 15 05:00:59 localhost ovn_metadata_agent[160585]: pidfile /var/lib/neutron/external/pids/0874a63c-034d-48db-8f84-a51c1fe90687.pid.haproxy Dec 15 05:00:59 localhost ovn_metadata_agent[160585]: daemon Dec 
15 05:00:59 localhost ovn_metadata_agent[160585]: Dec 15 05:00:59 localhost ovn_metadata_agent[160585]: defaults Dec 15 05:00:59 localhost ovn_metadata_agent[160585]: log global Dec 15 05:00:59 localhost ovn_metadata_agent[160585]: mode http Dec 15 05:00:59 localhost ovn_metadata_agent[160585]: option httplog Dec 15 05:00:59 localhost ovn_metadata_agent[160585]: option dontlognull Dec 15 05:00:59 localhost ovn_metadata_agent[160585]: option http-server-close Dec 15 05:00:59 localhost ovn_metadata_agent[160585]: option forwardfor Dec 15 05:00:59 localhost ovn_metadata_agent[160585]: retries 3 Dec 15 05:00:59 localhost ovn_metadata_agent[160585]: timeout http-request 30s Dec 15 05:00:59 localhost ovn_metadata_agent[160585]: timeout connect 30s Dec 15 05:00:59 localhost ovn_metadata_agent[160585]: timeout client 32s Dec 15 05:00:59 localhost ovn_metadata_agent[160585]: timeout server 32s Dec 15 05:00:59 localhost ovn_metadata_agent[160585]: timeout http-keep-alive 30s Dec 15 05:00:59 localhost ovn_metadata_agent[160585]: Dec 15 05:00:59 localhost ovn_metadata_agent[160585]: Dec 15 05:00:59 localhost ovn_metadata_agent[160585]: listen listener Dec 15 05:00:59 localhost ovn_metadata_agent[160585]: bind 169.254.169.254:80 Dec 15 05:00:59 localhost ovn_metadata_agent[160585]: server metadata /var/lib/neutron/metadata_proxy Dec 15 05:00:59 localhost ovn_metadata_agent[160585]: http-request add-header X-OVN-Network-ID 0874a63c-034d-48db-8f84-a51c1fe90687 Dec 15 05:00:59 localhost ovn_metadata_agent[160585]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m Dec 15 05:00:59 localhost ovn_metadata_agent[160585]: 2025-12-15 10:00:59.413 160590 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-0874a63c-034d-48db-8f84-a51c1fe90687', 'env', 'PROCESS_TAG=haproxy-0874a63c-034d-48db-8f84-a51c1fe90687', 'haproxy', '-f', 
'/var/lib/neutron/ovn-metadata-proxy/0874a63c-034d-48db-8f84-a51c1fe90687.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m Dec 15 05:00:59 localhost nova_compute[286344]: 2025-12-15 10:00:59.559 286348 DEBUG nova.network.neutron [None req-93ddfe1e-4219-4b1f-8860-eb27ca1215ed 58b2de48454141d48ee4d84c6ec84836 7526febbe14640b09a9f0897a2f4af8c - - default default] [instance: 8f5a73ce-a171-4a42-9f60-63c0db9e0a32] Updating instance_info_cache with network_info: [{"id": "19409185-ff01-4d58-bd6e-5d66014dd5c9", "address": "fa:16:3e:df:04:be", "network": {"id": "ff0cb653-0dc9-48ed-8a20-700650a10509", "bridge": "br-int", "label": "tempest-LiveMigrationTest-2038605812-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "10.100.0.3"}}], "meta": {"injected": false, "tenant_id": "4e26599ab4374eb6b7e9d73e7dfebd03", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap19409185-ff", "ovs_interfaceid": "19409185-ff01-4d58-bd6e-5d66014dd5c9", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m Dec 15 05:00:59 localhost nova_compute[286344]: 2025-12-15 10:00:59.671 286348 DEBUG oslo_concurrency.lockutils [None req-93ddfe1e-4219-4b1f-8860-eb27ca1215ed 58b2de48454141d48ee4d84c6ec84836 7526febbe14640b09a9f0897a2f4af8c - - default default] Releasing lock 
"refresh_cache-8f5a73ce-a171-4a42-9f60-63c0db9e0a32" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m Dec 15 05:00:59 localhost nova_compute[286344]: 2025-12-15 10:00:59.693 286348 DEBUG oslo_concurrency.lockutils [None req-93ddfe1e-4219-4b1f-8860-eb27ca1215ed 58b2de48454141d48ee4d84c6ec84836 7526febbe14640b09a9f0897a2f4af8c - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.allocate_pci_devices_for_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Dec 15 05:00:59 localhost nova_compute[286344]: 2025-12-15 10:00:59.694 286348 DEBUG oslo_concurrency.lockutils [None req-93ddfe1e-4219-4b1f-8860-eb27ca1215ed 58b2de48454141d48ee4d84c6ec84836 7526febbe14640b09a9f0897a2f4af8c - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.allocate_pci_devices_for_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Dec 15 05:00:59 localhost nova_compute[286344]: 2025-12-15 10:00:59.694 286348 DEBUG oslo_concurrency.lockutils [None req-93ddfe1e-4219-4b1f-8860-eb27ca1215ed 58b2de48454141d48ee4d84c6ec84836 7526febbe14640b09a9f0897a2f4af8c - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.allocate_pci_devices_for_instance" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Dec 15 05:00:59 localhost nova_compute[286344]: 2025-12-15 10:00:59.702 286348 INFO nova.virt.libvirt.driver [None req-93ddfe1e-4219-4b1f-8860-eb27ca1215ed 58b2de48454141d48ee4d84c6ec84836 7526febbe14640b09a9f0897a2f4af8c - - default default] [instance: 8f5a73ce-a171-4a42-9f60-63c0db9e0a32] Sending announce-self command to QEMU monitor. 
Attempt 1 of 3#033[00m Dec 15 05:00:59 localhost journal[204381]: Domain id=3 name='instance-00000007' uuid=8f5a73ce-a171-4a42-9f60-63c0db9e0a32 is tainted: custom-monitor Dec 15 05:00:59 localhost podman[315267]: 2025-12-15 10:00:59.813377427 +0000 UTC m=+0.048897557 image pull quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified Dec 15 05:01:00 localhost podman[315267]: Dec 15 05:01:00 localhost podman[315267]: 2025-12-15 10:01:00.07063366 +0000 UTC m=+0.306153780 container create a2b32f10869a3d9bcf3105745305f4a2d865b6a282e96d17c0d9489add2e8c1e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-0874a63c-034d-48db-8f84-a51c1fe90687, io.buildah.version=1.41.3, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team) Dec 15 05:01:00 localhost systemd[1]: Started libpod-conmon-a2b32f10869a3d9bcf3105745305f4a2d865b6a282e96d17c0d9489add2e8c1e.scope. Dec 15 05:01:00 localhost systemd[1]: Started libcrun container. 
Dec 15 05:01:00 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/7e2d8c207c36940f6908fdaa1501d9253b7e7db0b57e87bdb9d8301453e4b4f2/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff) Dec 15 05:01:00 localhost podman[315267]: 2025-12-15 10:01:00.177243018 +0000 UTC m=+0.412763108 container init a2b32f10869a3d9bcf3105745305f4a2d865b6a282e96d17c0d9489add2e8c1e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-0874a63c-034d-48db-8f84-a51c1fe90687, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image) Dec 15 05:01:00 localhost podman[315267]: 2025-12-15 10:01:00.186212899 +0000 UTC m=+0.421732989 container start a2b32f10869a3d9bcf3105745305f4a2d865b6a282e96d17c0d9489add2e8c1e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-0874a63c-034d-48db-8f84-a51c1fe90687, io.buildah.version=1.41.3, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0) Dec 15 05:01:00 localhost neutron-haproxy-ovnmeta-0874a63c-034d-48db-8f84-a51c1fe90687[315281]: [NOTICE] (315285) : New worker (315287) forked Dec 15 05:01:00 localhost neutron-haproxy-ovnmeta-0874a63c-034d-48db-8f84-a51c1fe90687[315281]: [NOTICE] (315285) : Loading success. Dec 15 05:01:00 localhost systemd[1]: tmp-crun.L5oRgo.mount: Deactivated successfully. 
Dec 15 05:01:00 localhost neutron_sriov_agent[260044]: 2025-12-15 10:01:00.389 2 INFO neutron.agent.securitygroups_rpc [req-07e2af73-e010-4697-8460-145321f4a9c2 req-e8381f7b-b5b3-4bc2-9d39-b7525867df29 35309870d5bd457f95d32ae22fa4f72e b24a65ffabd5489d869d7df73e041227 - - default default] Security group rule updated ['7912360f-9a53-46d4-815b-0955fe884a4e']#033[00m Dec 15 05:01:00 localhost ceph-mon[298913]: mon.np0005559462@0(leader).osd e99 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Dec 15 05:01:00 localhost ceph-mon[298913]: mon.np0005559462@0(leader).osd e99 do_prune osdmap full prune enabled Dec 15 05:01:00 localhost ceph-mon[298913]: mon.np0005559462@0(leader).osd e100 e100: 6 total, 6 up, 6 in Dec 15 05:01:00 localhost ceph-mon[298913]: log_channel(cluster) log [DBG] : osdmap e100: 6 total, 6 up, 6 in Dec 15 05:01:00 localhost nova_compute[286344]: 2025-12-15 10:01:00.711 286348 INFO nova.virt.libvirt.driver [None req-93ddfe1e-4219-4b1f-8860-eb27ca1215ed 58b2de48454141d48ee4d84c6ec84836 7526febbe14640b09a9f0897a2f4af8c - - default default] [instance: 8f5a73ce-a171-4a42-9f60-63c0db9e0a32] Sending announce-self command to QEMU monitor. 
Attempt 2 of 3#033[00m Dec 15 05:01:00 localhost neutron_sriov_agent[260044]: 2025-12-15 10:01:00.957 2 INFO neutron.agent.securitygroups_rpc [req-0950b9df-8649-4cf5-b443-20c5f550dca6 req-dff89bf6-faa5-440d-88f8-5af485d3088f 35309870d5bd457f95d32ae22fa4f72e b24a65ffabd5489d869d7df73e041227 - - default default] Security group rule updated ['7912360f-9a53-46d4-815b-0955fe884a4e']#033[00m Dec 15 05:01:01 localhost nova_compute[286344]: 2025-12-15 10:01:01.555 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:01:01 localhost neutron_sriov_agent[260044]: 2025-12-15 10:01:01.605 2 INFO neutron.agent.securitygroups_rpc [req-86620696-3364-4de1-af56-28ab197b2393 req-062f30b0-d586-4710-b896-94e268605f32 35309870d5bd457f95d32ae22fa4f72e b24a65ffabd5489d869d7df73e041227 - - default default] Security group rule updated ['7912360f-9a53-46d4-815b-0955fe884a4e']#033[00m Dec 15 05:01:01 localhost neutron_dhcp_agent[267542]: 2025-12-15 10:01:01.647 267546 INFO neutron.agent.linux.ip_lib [None req-b0b6e92a-edda-44a8-a803-c7c2249384a1 - - - - - -] Device tape816ee9d-84 cannot be used as it has no MAC address#033[00m Dec 15 05:01:01 localhost nova_compute[286344]: 2025-12-15 10:01:01.668 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:01:01 localhost kernel: device tape816ee9d-84 entered promiscuous mode Dec 15 05:01:01 localhost NetworkManager[5963]: [1765792861.6775] manager: (tape816ee9d-84): new Generic device (/org/freedesktop/NetworkManager/Devices/27) Dec 15 05:01:01 localhost ovn_controller[154603]: 2025-12-15T10:01:01Z|00108|binding|INFO|Claiming lport e816ee9d-843f-4652-b732-310b6e7d6146 for this chassis. 
Dec 15 05:01:01 localhost ovn_controller[154603]: 2025-12-15T10:01:01Z|00109|binding|INFO|e816ee9d-843f-4652-b732-310b6e7d6146: Claiming unknown Dec 15 05:01:01 localhost nova_compute[286344]: 2025-12-15 10:01:01.679 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:01:01 localhost ovn_metadata_agent[160585]: 2025-12-15 10:01:01.690 160590 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005559462.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.3/28', 'neutron:device_id': 'dhcpce287b4b-a043-5ed9-8e08-4fd555d768a2-3ac513c6-d80a-4d03-a550-1b73e6929696', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-3ac513c6-d80a-4d03-a550-1b73e6929696', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '054b5bdd4ed44009a8e1940489c96b34', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=d0a9f8e8-aca4-481b-b6ba-86c8894e2b13, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=e816ee9d-843f-4652-b732-310b6e7d6146) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Dec 15 05:01:01 localhost ovn_metadata_agent[160585]: 2025-12-15 10:01:01.692 160590 INFO neutron.agent.ovn.metadata.agent [-] Port e816ee9d-843f-4652-b732-310b6e7d6146 in datapath 
3ac513c6-d80a-4d03-a550-1b73e6929696 bound to our chassis#033[00m Dec 15 05:01:01 localhost ovn_metadata_agent[160585]: 2025-12-15 10:01:01.697 160590 DEBUG neutron.agent.ovn.metadata.agent [-] Port 3853e42d-16b8-4e07-816d-1f42c6b667e5 IP addresses were not retrieved from the Port_Binding MAC column ['unknown'] _get_port_ips /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:536#033[00m Dec 15 05:01:01 localhost ovn_metadata_agent[160585]: 2025-12-15 10:01:01.698 160590 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 3ac513c6-d80a-4d03-a550-1b73e6929696, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m Dec 15 05:01:01 localhost ovn_metadata_agent[160585]: 2025-12-15 10:01:01.698 160858 DEBUG oslo.privsep.daemon [-] privsep: reply[ab0901f5-2ae1-45d0-8fec-ec95e7b9facf]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Dec 15 05:01:01 localhost ovn_controller[154603]: 2025-12-15T10:01:01Z|00110|binding|INFO|Setting lport e816ee9d-843f-4652-b732-310b6e7d6146 ovn-installed in OVS Dec 15 05:01:01 localhost ovn_controller[154603]: 2025-12-15T10:01:01Z|00111|binding|INFO|Setting lport e816ee9d-843f-4652-b732-310b6e7d6146 up in Southbound Dec 15 05:01:01 localhost nova_compute[286344]: 2025-12-15 10:01:01.715 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:01:01 localhost nova_compute[286344]: 2025-12-15 10:01:01.718 286348 INFO nova.virt.libvirt.driver [None req-93ddfe1e-4219-4b1f-8860-eb27ca1215ed 58b2de48454141d48ee4d84c6ec84836 7526febbe14640b09a9f0897a2f4af8c - - default default] [instance: 8f5a73ce-a171-4a42-9f60-63c0db9e0a32] Sending announce-self command to QEMU monitor. 
Attempt 3 of 3#033[00m Dec 15 05:01:01 localhost nova_compute[286344]: 2025-12-15 10:01:01.724 286348 DEBUG nova.compute.manager [None req-93ddfe1e-4219-4b1f-8860-eb27ca1215ed 58b2de48454141d48ee4d84c6ec84836 7526febbe14640b09a9f0897a2f4af8c - - default default] [instance: 8f5a73ce-a171-4a42-9f60-63c0db9e0a32] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m Dec 15 05:01:01 localhost nova_compute[286344]: 2025-12-15 10:01:01.749 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:01:01 localhost nova_compute[286344]: 2025-12-15 10:01:01.755 286348 DEBUG nova.objects.instance [None req-93ddfe1e-4219-4b1f-8860-eb27ca1215ed 58b2de48454141d48ee4d84c6ec84836 7526febbe14640b09a9f0897a2f4af8c - - default default] [instance: 8f5a73ce-a171-4a42-9f60-63c0db9e0a32] Trying to apply a migration context that does not seem to be set for this instance apply_migration_context /usr/lib/python3.9/site-packages/nova/objects/instance.py:1032#033[00m Dec 15 05:01:01 localhost nova_compute[286344]: 2025-12-15 10:01:01.770 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:01:01 localhost podman[243449]: time="2025-12-15T10:01:01Z" level=info msg="List containers: received `last` parameter - overwriting `limit`" Dec 15 05:01:01 localhost podman[243449]: @ - - [15/Dec/2025:10:01:01 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 166853 "" "Go-http-client/1.1" Dec 15 05:01:01 localhost podman[243449]: @ - - [15/Dec/2025:10:01:01 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 22572 "" "Go-http-client/1.1" Dec 15 05:01:02 localhost podman[315371]: Dec 15 05:01:02 localhost podman[315371]: 2025-12-15 10:01:02.629320847 +0000 UTC 
m=+0.077203434 container create 8d0a249025cd30d3f6ced66bb2a1ab69200de31f87cfd16c2c213b3b64a9055f (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-3ac513c6-d80a-4d03-a550-1b73e6929696, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202) Dec 15 05:01:02 localhost systemd[1]: Started libpod-conmon-8d0a249025cd30d3f6ced66bb2a1ab69200de31f87cfd16c2c213b3b64a9055f.scope. Dec 15 05:01:02 localhost podman[315371]: 2025-12-15 10:01:02.58115793 +0000 UTC m=+0.029040537 image pull quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified Dec 15 05:01:02 localhost systemd[1]: Started libcrun container. Dec 15 05:01:02 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/027445aa133a13ea97c41b5f10b814a36033ae1a01b8cba7ad962d6983bbd593/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff) Dec 15 05:01:02 localhost podman[315371]: 2025-12-15 10:01:02.724114254 +0000 UTC m=+0.171996821 container init 8d0a249025cd30d3f6ced66bb2a1ab69200de31f87cfd16c2c213b3b64a9055f (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-3ac513c6-d80a-4d03-a550-1b73e6929696, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.build-date=20251202) Dec 15 05:01:02 localhost podman[315371]: 2025-12-15 10:01:02.734831159 +0000 UTC m=+0.182713726 container start 
8d0a249025cd30d3f6ced66bb2a1ab69200de31f87cfd16c2c213b3b64a9055f (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-3ac513c6-d80a-4d03-a550-1b73e6929696, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.build-date=20251202) Dec 15 05:01:02 localhost dnsmasq[315389]: started, version 2.85 cachesize 150 Dec 15 05:01:02 localhost dnsmasq[315389]: DNS service limited to local subnets Dec 15 05:01:02 localhost dnsmasq[315389]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile Dec 15 05:01:02 localhost dnsmasq[315389]: warning: no upstream servers configured Dec 15 05:01:02 localhost dnsmasq-dhcp[315389]: DHCP, static leases only on 10.100.0.0, lease time 1d Dec 15 05:01:02 localhost dnsmasq[315389]: read /var/lib/neutron/dhcp/3ac513c6-d80a-4d03-a550-1b73e6929696/addn_hosts - 0 addresses Dec 15 05:01:02 localhost dnsmasq-dhcp[315389]: read /var/lib/neutron/dhcp/3ac513c6-d80a-4d03-a550-1b73e6929696/host Dec 15 05:01:02 localhost dnsmasq-dhcp[315389]: read /var/lib/neutron/dhcp/3ac513c6-d80a-4d03-a550-1b73e6929696/opts Dec 15 05:01:02 localhost neutron_dhcp_agent[267542]: 2025-12-15 10:01:02.976 267546 INFO neutron.agent.dhcp.agent [None req-19d9822b-49e8-43e5-aa82-901c16797d33 - - - - - -] DHCP configuration for ports {'6e39b16b-65ab-4d65-8ced-b2e105f2b8c3'} is completed#033[00m Dec 15 05:01:03 localhost ovn_metadata_agent[160585]: 2025-12-15 10:01:03.042 160590 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to 
row=SB_Global(external_ids={}, nb_cfg=9, ssl=[], options={'arp_ns_explicit_output': 'true', 'fdb_removal_limit': '0', 'ignore_lsp_down': 'false', 'mac_binding_removal_limit': '0', 'mac_prefix': 'fe:17:e3', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': 'fe:55:2b:86:15:b5'}, ipsec=False) old=SB_Global(nb_cfg=8) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Dec 15 05:01:03 localhost ovn_metadata_agent[160585]: 2025-12-15 10:01:03.043 160590 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 10 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m Dec 15 05:01:03 localhost nova_compute[286344]: 2025-12-15 10:01:03.078 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:01:03 localhost ceph-mon[298913]: mon.np0005559462@0(leader).osd e100 do_prune osdmap full prune enabled Dec 15 05:01:03 localhost systemd[1]: tmp-crun.MEpK3l.mount: Deactivated successfully. Dec 15 05:01:03 localhost ceph-mon[298913]: mon.np0005559462@0(leader).osd e101 e101: 6 total, 6 up, 6 in Dec 15 05:01:03 localhost ceph-mon[298913]: log_channel(cluster) log [DBG] : osdmap e101: 6 total, 6 up, 6 in Dec 15 05:01:04 localhost nova_compute[286344]: 2025-12-15 10:01:04.023 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:01:04 localhost nova_compute[286344]: 2025-12-15 10:01:04.196 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:01:04 localhost systemd[1]: Started /usr/bin/podman healthcheck run 67ee72fcd99c896f79e8807764b1d0f04e7d45927d06e306c0986394e8ed42a0. 
Dec 15 05:01:04 localhost systemd[1]: Started /usr/bin/podman healthcheck run 9f0f481c937606298203797a71924b7614a5d9e3f4a8afd837cfa5b9602aedeb. Dec 15 05:01:04 localhost systemd[1]: Started /usr/bin/podman healthcheck run b3d8483ade539500e228c2af9c0c3e647febb5ec7903610a83aea32631007d4a. Dec 15 05:01:04 localhost systemd[1]: tmp-crun.CuftZp.mount: Deactivated successfully. Dec 15 05:01:04 localhost podman[315392]: 2025-12-15 10:01:04.780287291 +0000 UTC m=+0.102397605 container health_status b3d8483ade539500e228c2af9c0c3e647febb5ec7903610a83aea32631007d4a (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, tcib_managed=true, config_id=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.license=GPLv2, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '07f3af15d79276418f54a8dde2fc251064527c3571430e14af77aaa2c01f1193-cb1e9478634a00c8fb5ad9cd7fcd38c7b2a1a85187dfaa618bc715c24c2b15fb'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202) Dec 15 05:01:04 localhost podman[315392]: 2025-12-15 10:01:04.823457515 +0000 UTC m=+0.145567829 container exec_died b3d8483ade539500e228c2af9c0c3e647febb5ec7903610a83aea32631007d4a (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ceilometer_agent_compute, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_id=ceilometer_agent_compute, managed_by=edpm_ansible, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '07f3af15d79276418f54a8dde2fc251064527c3571430e14af77aaa2c01f1193-cb1e9478634a00c8fb5ad9cd7fcd38c7b2a1a85187dfaa618bc715c24c2b15fb'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}) Dec 15 05:01:04 localhost systemd[1]: b3d8483ade539500e228c2af9c0c3e647febb5ec7903610a83aea32631007d4a.service: Deactivated successfully. Dec 15 05:01:04 localhost openstack_network_exporter[246484]: ERROR 10:01:04 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server Dec 15 05:01:04 localhost openstack_network_exporter[246484]: ERROR 10:01:04 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd Dec 15 05:01:04 localhost openstack_network_exporter[246484]: ERROR 10:01:04 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd Dec 15 05:01:04 localhost openstack_network_exporter[246484]: ERROR 10:01:04 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath Dec 15 05:01:04 localhost openstack_network_exporter[246484]: Dec 15 05:01:04 localhost openstack_network_exporter[246484]: ERROR 10:01:04 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath Dec 15 05:01:04 localhost openstack_network_exporter[246484]: Dec 15 05:01:04 localhost podman[315390]: 2025-12-15 10:01:04.824085092 +0000 UTC m=+0.151999140 container health_status 67ee72fcd99c896f79e8807764b1d0f04e7d45927d06e306c0986394e8ed42a0 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', 
'--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible) Dec 15 05:01:04 localhost podman[315390]: 2025-12-15 10:01:04.911358213 +0000 UTC m=+0.239272261 container exec_died 67ee72fcd99c896f79e8807764b1d0f04e7d45927d06e306c0986394e8ed42a0 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', 
'--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}) Dec 15 05:01:04 localhost systemd[1]: 67ee72fcd99c896f79e8807764b1d0f04e7d45927d06e306c0986394e8ed42a0.service: Deactivated successfully. Dec 15 05:01:04 localhost podman[315391]: 2025-12-15 10:01:04.877504819 +0000 UTC m=+0.202243408 container health_status 9f0f481c937606298203797a71924b7614a5d9e3f4a8afd837cfa5b9602aedeb (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.build-date=20251202, container_name=multipathd, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', 
'/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb) Dec 15 05:01:04 localhost podman[315391]: 2025-12-15 10:01:04.96150273 +0000 UTC m=+0.286241299 container exec_died 9f0f481c937606298203797a71924b7614a5d9e3f4a8afd837cfa5b9602aedeb (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', 
'/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=multipathd, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, container_name=multipathd, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2) Dec 15 05:01:04 localhost systemd[1]: 9f0f481c937606298203797a71924b7614a5d9e3f4a8afd837cfa5b9602aedeb.service: Deactivated successfully. Dec 15 05:01:05 localhost ceph-mon[298913]: mon.np0005559462@0(leader) e17 handle_command mon_command({"prefix":"df", "format":"json"} v 0) Dec 15 05:01:05 localhost ceph-mon[298913]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/245526017' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch Dec 15 05:01:05 localhost ceph-mon[298913]: mon.np0005559462@0(leader) e17 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0) Dec 15 05:01:05 localhost ceph-mon[298913]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/245526017' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch Dec 15 05:01:05 localhost ceph-mon[298913]: mon.np0005559462@0(leader).osd e101 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Dec 15 05:01:05 localhost systemd[1]: Stopping User Manager for UID 42436... Dec 15 05:01:05 localhost systemd[314715]: Activating special unit Exit the Session... Dec 15 05:01:05 localhost systemd[314715]: Stopped target Main User Target. Dec 15 05:01:05 localhost systemd[314715]: Stopped target Basic System. Dec 15 05:01:05 localhost systemd[314715]: Stopped target Paths. Dec 15 05:01:05 localhost systemd[314715]: Stopped target Sockets. 
Dec 15 05:01:05 localhost systemd[314715]: Stopped target Timers. Dec 15 05:01:05 localhost systemd[314715]: Stopped Mark boot as successful after the user session has run 2 minutes. Dec 15 05:01:05 localhost systemd[314715]: Stopped Daily Cleanup of User's Temporary Directories. Dec 15 05:01:05 localhost systemd[314715]: Closed D-Bus User Message Bus Socket. Dec 15 05:01:05 localhost systemd[314715]: Stopped Create User's Volatile Files and Directories. Dec 15 05:01:05 localhost systemd[314715]: Removed slice User Application Slice. Dec 15 05:01:05 localhost systemd[314715]: Reached target Shutdown. Dec 15 05:01:05 localhost systemd[314715]: Finished Exit the Session. Dec 15 05:01:05 localhost systemd[314715]: Reached target Exit the Session. Dec 15 05:01:05 localhost systemd[1]: user@42436.service: Deactivated successfully. Dec 15 05:01:05 localhost systemd[1]: Stopped User Manager for UID 42436. Dec 15 05:01:05 localhost systemd[1]: Stopping User Runtime Directory /run/user/42436... Dec 15 05:01:05 localhost systemd[1]: user-runtime-dir@42436.service: Deactivated successfully. Dec 15 05:01:05 localhost systemd[1]: Stopped User Runtime Directory /run/user/42436. Dec 15 05:01:05 localhost systemd[1]: Removed slice User Slice of UID 42436. Dec 15 05:01:05 localhost systemd[1]: run-user-42436.mount: Deactivated successfully. 
Dec 15 05:01:06 localhost neutron_dhcp_agent[267542]: 2025-12-15 10:01:06.124 267546 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-12-15T10:01:05Z, description=, device_id=73f5a759-7688-41b9-ae2b-fbcd9ec62dc9, device_owner=network:router_interface, dns_assignment=[], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[], id=f4687fb1-144e-4c75-9be4-6b8fc067b9d1, ip_allocation=immediate, mac_address=fa:16:3e:5d:e5:eb, name=, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-12-15T10:00:57Z, description=, dns_domain=, id=3ac513c6-d80a-4d03-a550-1b73e6929696, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-ServersV294TestFqdnHostnames-1189201574-network, port_security_enabled=True, project_id=054b5bdd4ed44009a8e1940489c96b34, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=38218, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=585, status=ACTIVE, subnets=['225c87a4-c79c-4b41-a792-50b7fbf53b84'], tags=[], tenant_id=054b5bdd4ed44009a8e1940489c96b34, updated_at=2025-12-15T10:00:59Z, vlan_transparent=None, network_id=3ac513c6-d80a-4d03-a550-1b73e6929696, port_security_enabled=False, project_id=054b5bdd4ed44009a8e1940489c96b34, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=[], standard_attr_id=622, status=DOWN, tags=[], tenant_id=054b5bdd4ed44009a8e1940489c96b34, updated_at=2025-12-15T10:01:05Z on network 3ac513c6-d80a-4d03-a550-1b73e6929696#033[00m Dec 15 05:01:06 localhost dnsmasq[315389]: read /var/lib/neutron/dhcp/3ac513c6-d80a-4d03-a550-1b73e6929696/addn_hosts - 1 addresses Dec 15 05:01:06 localhost dnsmasq-dhcp[315389]: read 
/var/lib/neutron/dhcp/3ac513c6-d80a-4d03-a550-1b73e6929696/host Dec 15 05:01:06 localhost dnsmasq-dhcp[315389]: read /var/lib/neutron/dhcp/3ac513c6-d80a-4d03-a550-1b73e6929696/opts Dec 15 05:01:06 localhost podman[315468]: 2025-12-15 10:01:06.35990215 +0000 UTC m=+0.060694608 container kill 8d0a249025cd30d3f6ced66bb2a1ab69200de31f87cfd16c2c213b3b64a9055f (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-3ac513c6-d80a-4d03-a550-1b73e6929696, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2) Dec 15 05:01:06 localhost systemd[1]: Started /usr/bin/podman healthcheck run 730c50020a7d9e2da21d2462d40af20ac303e43f3491f692cbbd87f432ba3e09. Dec 15 05:01:06 localhost systemd[1]: Started /usr/bin/podman healthcheck run ac2eae30a2a7f971398af42ea67b3ed799d4b3341f7688ebcd73f7ca7f66c2a8. Dec 15 05:01:06 localhost podman[315481]: 2025-12-15 10:01:06.444241529 +0000 UTC m=+0.063047605 container health_status 730c50020a7d9e2da21d2462d40af20ac303e43f3491f692cbbd87f432ba3e09 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, maintainer=Red Hat, Inc., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.tags=minimal rhel9, vcs-type=git, distribution-scope=public, vendor=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=openstack_network_exporter, io.buildah.version=1.33.7, io.openshift.expose-services=, managed_by=edpm_ansible, name=ubi9-minimal, release=1755695350, architecture=x86_64, com.redhat.component=ubi9-minimal-container, build-date=2025-08-20T13:12:41, config_id=openstack_network_exporter, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, version=9.6, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'cb1e9478634a00c8fb5ad9cd7fcd38c7b2a1a85187dfaa618bc715c24c2b15fb'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', 
'/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}) Dec 15 05:01:06 localhost podman[315481]: 2025-12-15 10:01:06.452195205 +0000 UTC m=+0.071001281 container exec_died 730c50020a7d9e2da21d2462d40af20ac303e43f3491f692cbbd87f432ba3e09 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, vcs-type=git, io.buildah.version=1.33.7, io.openshift.expose-services=, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., managed_by=edpm_ansible, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'cb1e9478634a00c8fb5ad9cd7fcd38c7b2a1a85187dfaa618bc715c24c2b15fb'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, distribution-scope=public, com.redhat.component=ubi9-minimal-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, release=1755695350, config_id=openstack_network_exporter, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, maintainer=Red Hat, Inc., 
url=https://catalog.redhat.com/en/search?searchType=containers, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., build-date=2025-08-20T13:12:41, io.openshift.tags=minimal rhel9, vendor=Red Hat, Inc., version=9.6, architecture=x86_64, container_name=openstack_network_exporter, name=ubi9-minimal) Dec 15 05:01:06 localhost systemd[1]: 730c50020a7d9e2da21d2462d40af20ac303e43f3491f692cbbd87f432ba3e09.service: Deactivated successfully. 
Dec 15 05:01:06 localhost podman[315482]: 2025-12-15 10:01:06.512602574 +0000 UTC m=+0.128682873 container health_status ac2eae30a2a7f971398af42ea67b3ed799d4b3341f7688ebcd73f7ca7f66c2a8 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, config_id=ovn_controller, container_name=ovn_controller, io.buildah.version=1.41.3, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '07f3af15d79276418f54a8dde2fc251064527c3571430e14af77aaa2c01f1193'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251202) Dec 15 05:01:06 localhost podman[315482]: 2025-12-15 10:01:06.555954383 +0000 UTC m=+0.172034732 container exec_died ac2eae30a2a7f971398af42ea67b3ed799d4b3341f7688ebcd73f7ca7f66c2a8 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, maintainer=OpenStack 
Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '07f3af15d79276418f54a8dde2fc251064527c3571430e14af77aaa2c01f1193'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20251202, config_id=ovn_controller, managed_by=edpm_ansible) Dec 15 05:01:06 localhost nova_compute[286344]: 2025-12-15 10:01:06.559 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:01:06 localhost systemd[1]: ac2eae30a2a7f971398af42ea67b3ed799d4b3341f7688ebcd73f7ca7f66c2a8.service: Deactivated successfully. 
Dec 15 05:01:06 localhost neutron_dhcp_agent[267542]: 2025-12-15 10:01:06.672 267546 INFO neutron.agent.dhcp.agent [None req-28014bcf-6ecd-4139-8d92-d7f3eaefda6d - - - - - -] DHCP configuration for ports {'f4687fb1-144e-4c75-9be4-6b8fc067b9d1'} is completed
Dec 15 05:01:07 localhost nova_compute[286344]: 2025-12-15 10:01:07.289 286348 DEBUG oslo_service.periodic_task [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 15 05:01:08 localhost nova_compute[286344]: 2025-12-15 10:01:08.270 286348 DEBUG oslo_service.periodic_task [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 15 05:01:08 localhost nova_compute[286344]: 2025-12-15 10:01:08.722 286348 DEBUG nova.compute.manager [req-7211d9ca-b254-441b-b1fe-fb5674b111aa req-7bc88645-c826-44e5-8bcf-3be22f536a9c 01c6e0cd81404938a10f65d81ef6d178 2f96e5f995e6442f94f881dabbe4dbf7 - - default default] [instance: 51fca94e-ecb9-4350-bafc-765e827a1c7b] Received event network-vif-plugged-d6b04aa0-3423-4a78-adfc-4bf3151f80ed external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 15 05:01:08 localhost nova_compute[286344]: 2025-12-15 10:01:08.723 286348 DEBUG oslo_concurrency.lockutils [req-7211d9ca-b254-441b-b1fe-fb5674b111aa req-7bc88645-c826-44e5-8bcf-3be22f536a9c 01c6e0cd81404938a10f65d81ef6d178 2f96e5f995e6442f94f881dabbe4dbf7 - - default default] Acquiring lock "51fca94e-ecb9-4350-bafc-765e827a1c7b-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 15 05:01:08 localhost nova_compute[286344]: 2025-12-15 10:01:08.723 286348 DEBUG oslo_concurrency.lockutils [req-7211d9ca-b254-441b-b1fe-fb5674b111aa req-7bc88645-c826-44e5-8bcf-3be22f536a9c 01c6e0cd81404938a10f65d81ef6d178 2f96e5f995e6442f94f881dabbe4dbf7 - - default default] Lock "51fca94e-ecb9-4350-bafc-765e827a1c7b-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 15 05:01:08 localhost nova_compute[286344]: 2025-12-15 10:01:08.724 286348 DEBUG oslo_concurrency.lockutils [req-7211d9ca-b254-441b-b1fe-fb5674b111aa req-7bc88645-c826-44e5-8bcf-3be22f536a9c 01c6e0cd81404938a10f65d81ef6d178 2f96e5f995e6442f94f881dabbe4dbf7 - - default default] Lock "51fca94e-ecb9-4350-bafc-765e827a1c7b-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 15 05:01:08 localhost nova_compute[286344]: 2025-12-15 10:01:08.724 286348 DEBUG nova.compute.manager [req-7211d9ca-b254-441b-b1fe-fb5674b111aa req-7bc88645-c826-44e5-8bcf-3be22f536a9c 01c6e0cd81404938a10f65d81ef6d178 2f96e5f995e6442f94f881dabbe4dbf7 - - default default] [instance: 51fca94e-ecb9-4350-bafc-765e827a1c7b] Processing event network-vif-plugged-d6b04aa0-3423-4a78-adfc-4bf3151f80ed _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Dec 15 05:01:08 localhost nova_compute[286344]: 2025-12-15 10:01:08.725 286348 DEBUG nova.compute.manager [None req-b6417b37-68f2-4c3b-be06-66bb73cd87a0 c79d291546244f6a970ffc157036d797 8d2a9ec16aa942ab8315d4057f639915 - - default default] [instance: 51fca94e-ecb9-4350-bafc-765e827a1c7b] Instance event wait completed in 13 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Dec 15 05:01:08 localhost nova_compute[286344]: 2025-12-15 10:01:08.730 286348 DEBUG nova.virt.driver [None req-ed090d16-496a-473a-be36-97cf3cf6502a - - - - - -] Emitting event Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Dec 15 05:01:08 localhost nova_compute[286344]: 2025-12-15 10:01:08.731 286348 INFO nova.compute.manager [None req-ed090d16-496a-473a-be36-97cf3cf6502a - - - - - -] [instance: 51fca94e-ecb9-4350-bafc-765e827a1c7b] VM Resumed (Lifecycle Event)
Dec 15 05:01:08 localhost nova_compute[286344]: 2025-12-15 10:01:08.734 286348 DEBUG nova.virt.libvirt.driver [None req-b6417b37-68f2-4c3b-be06-66bb73cd87a0 c79d291546244f6a970ffc157036d797 8d2a9ec16aa942ab8315d4057f639915 - - default default] [instance: 51fca94e-ecb9-4350-bafc-765e827a1c7b] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Dec 15 05:01:08 localhost nova_compute[286344]: 2025-12-15 10:01:08.737 286348 INFO nova.virt.libvirt.driver [-] [instance: 51fca94e-ecb9-4350-bafc-765e827a1c7b] Instance spawned successfully.
Dec 15 05:01:08 localhost nova_compute[286344]: 2025-12-15 10:01:08.738 286348 DEBUG nova.virt.libvirt.driver [None req-b6417b37-68f2-4c3b-be06-66bb73cd87a0 c79d291546244f6a970ffc157036d797 8d2a9ec16aa942ab8315d4057f639915 - - default default] [instance: 51fca94e-ecb9-4350-bafc-765e827a1c7b] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Dec 15 05:01:08 localhost nova_compute[286344]: 2025-12-15 10:01:08.757 286348 DEBUG nova.compute.manager [None req-ed090d16-496a-473a-be36-97cf3cf6502a - - - - - -] [instance: 51fca94e-ecb9-4350-bafc-765e827a1c7b] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 15 05:01:08 localhost nova_compute[286344]: 2025-12-15 10:01:08.762 286348 DEBUG nova.virt.libvirt.driver [None req-b6417b37-68f2-4c3b-be06-66bb73cd87a0 c79d291546244f6a970ffc157036d797 8d2a9ec16aa942ab8315d4057f639915 - - default default] [instance: 51fca94e-ecb9-4350-bafc-765e827a1c7b] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 15 05:01:08 localhost nova_compute[286344]: 2025-12-15 10:01:08.763 286348 DEBUG nova.virt.libvirt.driver [None req-b6417b37-68f2-4c3b-be06-66bb73cd87a0 c79d291546244f6a970ffc157036d797 8d2a9ec16aa942ab8315d4057f639915 - - default default] [instance: 51fca94e-ecb9-4350-bafc-765e827a1c7b] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 15 05:01:08 localhost nova_compute[286344]: 2025-12-15 10:01:08.764 286348 DEBUG nova.virt.libvirt.driver [None req-b6417b37-68f2-4c3b-be06-66bb73cd87a0 c79d291546244f6a970ffc157036d797 8d2a9ec16aa942ab8315d4057f639915 - - default default] [instance: 51fca94e-ecb9-4350-bafc-765e827a1c7b] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 15 05:01:08 localhost nova_compute[286344]: 2025-12-15 10:01:08.764 286348 DEBUG nova.virt.libvirt.driver [None req-b6417b37-68f2-4c3b-be06-66bb73cd87a0 c79d291546244f6a970ffc157036d797 8d2a9ec16aa942ab8315d4057f639915 - - default default] [instance: 51fca94e-ecb9-4350-bafc-765e827a1c7b] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 15 05:01:08 localhost nova_compute[286344]: 2025-12-15 10:01:08.765 286348 DEBUG nova.virt.libvirt.driver [None req-b6417b37-68f2-4c3b-be06-66bb73cd87a0 c79d291546244f6a970ffc157036d797 8d2a9ec16aa942ab8315d4057f639915 - - default default] [instance: 51fca94e-ecb9-4350-bafc-765e827a1c7b] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 15 05:01:08 localhost nova_compute[286344]: 2025-12-15 10:01:08.766 286348 DEBUG nova.virt.libvirt.driver [None req-b6417b37-68f2-4c3b-be06-66bb73cd87a0 c79d291546244f6a970ffc157036d797 8d2a9ec16aa942ab8315d4057f639915 - - default default] [instance: 51fca94e-ecb9-4350-bafc-765e827a1c7b] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 15 05:01:08 localhost nova_compute[286344]: 2025-12-15 10:01:08.773 286348 DEBUG nova.compute.manager [None req-ed090d16-496a-473a-be36-97cf3cf6502a - - - - - -] [instance: 51fca94e-ecb9-4350-bafc-765e827a1c7b] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Dec 15 05:01:08 localhost nova_compute[286344]: 2025-12-15 10:01:08.798 286348 INFO nova.compute.manager [None req-ed090d16-496a-473a-be36-97cf3cf6502a - - - - - -] [instance: 51fca94e-ecb9-4350-bafc-765e827a1c7b] During sync_power_state the instance has a pending task (spawning). Skip.
Dec 15 05:01:08 localhost nova_compute[286344]: 2025-12-15 10:01:08.822 286348 INFO nova.compute.manager [None req-b6417b37-68f2-4c3b-be06-66bb73cd87a0 c79d291546244f6a970ffc157036d797 8d2a9ec16aa942ab8315d4057f639915 - - default default] [instance: 51fca94e-ecb9-4350-bafc-765e827a1c7b] Took 19.07 seconds to spawn the instance on the hypervisor.
Dec 15 05:01:08 localhost nova_compute[286344]: 2025-12-15 10:01:08.823 286348 DEBUG nova.compute.manager [None req-b6417b37-68f2-4c3b-be06-66bb73cd87a0 c79d291546244f6a970ffc157036d797 8d2a9ec16aa942ab8315d4057f639915 - - default default] [instance: 51fca94e-ecb9-4350-bafc-765e827a1c7b] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 15 05:01:08 localhost nova_compute[286344]: 2025-12-15 10:01:08.886 286348 INFO nova.compute.manager [None req-b6417b37-68f2-4c3b-be06-66bb73cd87a0 c79d291546244f6a970ffc157036d797 8d2a9ec16aa942ab8315d4057f639915 - - default default] [instance: 51fca94e-ecb9-4350-bafc-765e827a1c7b] Took 20.21 seconds to build instance.
Dec 15 05:01:08 localhost nova_compute[286344]: 2025-12-15 10:01:08.902 286348 DEBUG oslo_concurrency.lockutils [None req-b6417b37-68f2-4c3b-be06-66bb73cd87a0 c79d291546244f6a970ffc157036d797 8d2a9ec16aa942ab8315d4057f639915 - - default default] Lock "51fca94e-ecb9-4350-bafc-765e827a1c7b" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 20.382s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 15 05:01:09 localhost ovn_controller[154603]: 2025-12-15T10:01:09Z|00112|binding|INFO|Releasing lport b35254ad-12eb-47bb-92be-44fefe0694f0 from this chassis (sb_readonly=0)
Dec 15 05:01:09 localhost ovn_controller[154603]: 2025-12-15T10:01:09Z|00113|binding|INFO|Releasing lport 1a4411b8-2368-4b17-9d10-08f9c3480350 from this chassis (sb_readonly=0)
Dec 15 05:01:09 localhost ovn_controller[154603]: 2025-12-15T10:01:09Z|00114|binding|INFO|Releasing lport 3c672ee2-c7c4-4620-810e-6910f497dd1e from this chassis (sb_readonly=0)
Dec 15 05:01:09 localhost ovn_controller[154603]: 2025-12-15T10:01:09Z|00115|binding|INFO|Releasing lport eb478ea1-29fd-4e9a-85ee-b8b88d82f051 from this chassis (sb_readonly=0)
Dec 15 05:01:09 localhost ovn_controller[154603]: 2025-12-15T10:01:09Z|00116|binding|INFO|Releasing lport c3be4d70-09ff-4304-ad4e-0464f0628ae9 from this chassis (sb_readonly=0)
Dec 15 05:01:09 localhost ovn_controller[154603]: 2025-12-15T10:01:09Z|00117|binding|INFO|Releasing lport b35254ad-12eb-47bb-92be-44fefe0694f0 from this chassis (sb_readonly=0)
Dec 15 05:01:09 localhost nova_compute[286344]: 2025-12-15 10:01:09.200 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 15 05:01:09 localhost ovn_controller[154603]: 2025-12-15T10:01:09Z|00118|binding|INFO|Releasing lport 1a4411b8-2368-4b17-9d10-08f9c3480350 from this chassis (sb_readonly=0)
Dec 15 05:01:09 localhost ovn_controller[154603]: 2025-12-15T10:01:09Z|00119|binding|INFO|Releasing lport 3c672ee2-c7c4-4620-810e-6910f497dd1e from this chassis (sb_readonly=0)
Dec 15 05:01:09 localhost ovn_controller[154603]: 2025-12-15T10:01:09Z|00120|binding|INFO|Releasing lport eb478ea1-29fd-4e9a-85ee-b8b88d82f051 from this chassis (sb_readonly=0)
Dec 15 05:01:09 localhost ovn_controller[154603]: 2025-12-15T10:01:09Z|00121|binding|INFO|Releasing lport c3be4d70-09ff-4304-ad4e-0464f0628ae9 from this chassis (sb_readonly=0)
Dec 15 05:01:09 localhost nova_compute[286344]: 2025-12-15 10:01:09.218 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 15 05:01:09 localhost nova_compute[286344]: 2025-12-15 10:01:09.271 286348 DEBUG oslo_service.periodic_task [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 15 05:01:09 localhost nova_compute[286344]: 2025-12-15 10:01:09.271 286348 DEBUG oslo_service.periodic_task [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 15 05:01:10 localhost neutron_dhcp_agent[267542]: 2025-12-15 10:01:10.187 267546 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-12-15T10:01:05Z, description=, device_id=73f5a759-7688-41b9-ae2b-fbcd9ec62dc9, device_owner=network:router_interface, dns_assignment=[], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[], id=f4687fb1-144e-4c75-9be4-6b8fc067b9d1, ip_allocation=immediate, mac_address=fa:16:3e:5d:e5:eb, name=, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-12-15T10:00:57Z, description=, dns_domain=, id=3ac513c6-d80a-4d03-a550-1b73e6929696, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-ServersV294TestFqdnHostnames-1189201574-network, port_security_enabled=True, project_id=054b5bdd4ed44009a8e1940489c96b34, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=38218, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=585, status=ACTIVE, subnets=['225c87a4-c79c-4b41-a792-50b7fbf53b84'], tags=[], tenant_id=054b5bdd4ed44009a8e1940489c96b34, updated_at=2025-12-15T10:00:59Z, vlan_transparent=None, network_id=3ac513c6-d80a-4d03-a550-1b73e6929696, port_security_enabled=False, project_id=054b5bdd4ed44009a8e1940489c96b34, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=[], standard_attr_id=622, status=DOWN, tags=[], tenant_id=054b5bdd4ed44009a8e1940489c96b34, updated_at=2025-12-15T10:01:05Z on network 3ac513c6-d80a-4d03-a550-1b73e6929696
Dec 15 05:01:10 localhost nova_compute[286344]: 2025-12-15 10:01:10.266 286348 DEBUG oslo_service.periodic_task [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 15 05:01:10 localhost nova_compute[286344]: 2025-12-15 10:01:10.293 286348 DEBUG oslo_service.periodic_task [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 15 05:01:10 localhost nova_compute[286344]: 2025-12-15 10:01:10.293 286348 DEBUG nova.compute.manager [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Dec 15 05:01:10 localhost nova_compute[286344]: 2025-12-15 10:01:10.293 286348 DEBUG nova.compute.manager [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Dec 15 05:01:10 localhost nova_compute[286344]: 2025-12-15 10:01:10.393 286348 DEBUG oslo_concurrency.lockutils [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Acquiring lock "refresh_cache-39ff1bd9-6f6b-44c8-bbec-a1fd9d196359" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Dec 15 05:01:10 localhost nova_compute[286344]: 2025-12-15 10:01:10.393 286348 DEBUG oslo_concurrency.lockutils [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Acquired lock "refresh_cache-39ff1bd9-6f6b-44c8-bbec-a1fd9d196359" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Dec 15 05:01:10 localhost nova_compute[286344]: 2025-12-15 10:01:10.394 286348 DEBUG nova.network.neutron [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] [instance: 39ff1bd9-6f6b-44c8-bbec-a1fd9d196359] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004
Dec 15 05:01:10 localhost nova_compute[286344]: 2025-12-15 10:01:10.394 286348 DEBUG nova.objects.instance [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Lazy-loading 'info_cache' on Instance uuid 39ff1bd9-6f6b-44c8-bbec-a1fd9d196359 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec 15 05:01:10 localhost ceph-mon[298913]: mon.np0005559462@0(leader).osd e101 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 15 05:01:10 localhost ceph-mon[298913]: mon.np0005559462@0(leader).osd e101 do_prune osdmap full prune enabled
Dec 15 05:01:10 localhost podman[315547]: 2025-12-15 10:01:10.450947209 +0000 UTC m=+0.073073763 container kill 8d0a249025cd30d3f6ced66bb2a1ab69200de31f87cfd16c2c213b3b64a9055f (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-3ac513c6-d80a-4d03-a550-1b73e6929696, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, tcib_managed=true)
Dec 15 05:01:10 localhost dnsmasq[315389]: read /var/lib/neutron/dhcp/3ac513c6-d80a-4d03-a550-1b73e6929696/addn_hosts - 1 addresses
Dec 15 05:01:10 localhost dnsmasq-dhcp[315389]: read /var/lib/neutron/dhcp/3ac513c6-d80a-4d03-a550-1b73e6929696/host
Dec 15 05:01:10 localhost dnsmasq-dhcp[315389]: read /var/lib/neutron/dhcp/3ac513c6-d80a-4d03-a550-1b73e6929696/opts
Dec 15 05:01:10 localhost ceph-mon[298913]: mon.np0005559462@0(leader).osd e102 e102: 6 total, 6 up, 6 in
Dec 15 05:01:10 localhost ceph-mon[298913]: log_channel(cluster) log [DBG] : osdmap e102: 6 total, 6 up, 6 in
Dec 15 05:01:10 localhost nova_compute[286344]: 2025-12-15 10:01:10.634 286348 DEBUG oslo_concurrency.lockutils [None req-f105e559-0d6d-47e1-8926-de823ff3b94e 7eba5d40415d4fd7ab565eed1e07b799 4e26599ab4374eb6b7e9d73e7dfebd03 - - default default] Acquiring lock "8f5a73ce-a171-4a42-9f60-63c0db9e0a32" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 15 05:01:10 localhost nova_compute[286344]: 2025-12-15 10:01:10.636 286348 DEBUG oslo_concurrency.lockutils [None req-f105e559-0d6d-47e1-8926-de823ff3b94e 7eba5d40415d4fd7ab565eed1e07b799 4e26599ab4374eb6b7e9d73e7dfebd03 - - default default] Lock "8f5a73ce-a171-4a42-9f60-63c0db9e0a32" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 15 05:01:10 localhost nova_compute[286344]: 2025-12-15 10:01:10.636 286348 DEBUG oslo_concurrency.lockutils [None req-f105e559-0d6d-47e1-8926-de823ff3b94e 7eba5d40415d4fd7ab565eed1e07b799 4e26599ab4374eb6b7e9d73e7dfebd03 - - default default] Acquiring lock "8f5a73ce-a171-4a42-9f60-63c0db9e0a32-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 15 05:01:10 localhost nova_compute[286344]: 2025-12-15 10:01:10.637 286348 DEBUG oslo_concurrency.lockutils [None req-f105e559-0d6d-47e1-8926-de823ff3b94e 7eba5d40415d4fd7ab565eed1e07b799 4e26599ab4374eb6b7e9d73e7dfebd03 - - default default] Lock "8f5a73ce-a171-4a42-9f60-63c0db9e0a32-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 15 05:01:10 localhost nova_compute[286344]: 2025-12-15 10:01:10.637 286348 DEBUG oslo_concurrency.lockutils [None req-f105e559-0d6d-47e1-8926-de823ff3b94e 7eba5d40415d4fd7ab565eed1e07b799 4e26599ab4374eb6b7e9d73e7dfebd03 - - default default] Lock "8f5a73ce-a171-4a42-9f60-63c0db9e0a32-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 15 05:01:10 localhost nova_compute[286344]: 2025-12-15 10:01:10.639 286348 INFO nova.compute.manager [None req-f105e559-0d6d-47e1-8926-de823ff3b94e 7eba5d40415d4fd7ab565eed1e07b799 4e26599ab4374eb6b7e9d73e7dfebd03 - - default default] [instance: 8f5a73ce-a171-4a42-9f60-63c0db9e0a32] Terminating instance
Dec 15 05:01:10 localhost nova_compute[286344]: 2025-12-15 10:01:10.642 286348 DEBUG nova.compute.manager [None req-f105e559-0d6d-47e1-8926-de823ff3b94e 7eba5d40415d4fd7ab565eed1e07b799 4e26599ab4374eb6b7e9d73e7dfebd03 - - default default] [instance: 8f5a73ce-a171-4a42-9f60-63c0db9e0a32] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Dec 15 05:01:10 localhost kernel: device tap19409185-ff left promiscuous mode
Dec 15 05:01:10 localhost NetworkManager[5963]: [1765792870.7787] device (tap19409185-ff): state change: disconnected -> unmanaged (reason 'unmanaged', sys-iface-state: 'removed')
Dec 15 05:01:10 localhost ovn_controller[154603]: 2025-12-15T10:01:10Z|00122|binding|INFO|Releasing lport 19409185-ff01-4d58-bd6e-5d66014dd5c9 from this chassis (sb_readonly=0)
Dec 15 05:01:10 localhost ovn_controller[154603]: 2025-12-15T10:01:10Z|00123|binding|INFO|Setting lport 19409185-ff01-4d58-bd6e-5d66014dd5c9 down in Southbound
Dec 15 05:01:10 localhost ovn_controller[154603]: 2025-12-15T10:01:10Z|00124|binding|INFO|Releasing lport fd72ab01-c63c-4f51-ae94-ca02f6f50dfa from this chassis (sb_readonly=0)
Dec 15 05:01:10 localhost ovn_controller[154603]: 2025-12-15T10:01:10Z|00125|binding|INFO|Setting lport fd72ab01-c63c-4f51-ae94-ca02f6f50dfa down in Southbound
Dec 15 05:01:10 localhost ovn_controller[154603]: 2025-12-15T10:01:10Z|00126|binding|INFO|Removing iface tap19409185-ff ovn-installed in OVS
Dec 15 05:01:10 localhost nova_compute[286344]: 2025-12-15 10:01:10.788 286348 DEBUG nova.compute.manager [req-bc592847-6689-430e-91ed-0f66feffec83 req-9d5ff6c8-5798-4613-974e-a5b14c2f6a2a 01c6e0cd81404938a10f65d81ef6d178 2f96e5f995e6442f94f881dabbe4dbf7 - - default default] [instance: 51fca94e-ecb9-4350-bafc-765e827a1c7b] Received event network-vif-plugged-d6b04aa0-3423-4a78-adfc-4bf3151f80ed external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 15 05:01:10 localhost nova_compute[286344]: 2025-12-15 10:01:10.789 286348 DEBUG oslo_concurrency.lockutils [req-bc592847-6689-430e-91ed-0f66feffec83 req-9d5ff6c8-5798-4613-974e-a5b14c2f6a2a 01c6e0cd81404938a10f65d81ef6d178 2f96e5f995e6442f94f881dabbe4dbf7 - - default default] Acquiring lock "51fca94e-ecb9-4350-bafc-765e827a1c7b-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 15 05:01:10 localhost nova_compute[286344]: 2025-12-15 10:01:10.790 286348 DEBUG oslo_concurrency.lockutils [req-bc592847-6689-430e-91ed-0f66feffec83 req-9d5ff6c8-5798-4613-974e-a5b14c2f6a2a 01c6e0cd81404938a10f65d81ef6d178 2f96e5f995e6442f94f881dabbe4dbf7 - - default default] Lock "51fca94e-ecb9-4350-bafc-765e827a1c7b-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 15 05:01:10 localhost nova_compute[286344]: 2025-12-15 10:01:10.790 286348 DEBUG oslo_concurrency.lockutils [req-bc592847-6689-430e-91ed-0f66feffec83 req-9d5ff6c8-5798-4613-974e-a5b14c2f6a2a 01c6e0cd81404938a10f65d81ef6d178 2f96e5f995e6442f94f881dabbe4dbf7 - - default default] Lock "51fca94e-ecb9-4350-bafc-765e827a1c7b-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 15 05:01:10 localhost nova_compute[286344]: 2025-12-15 10:01:10.791 286348 DEBUG nova.compute.manager [req-bc592847-6689-430e-91ed-0f66feffec83 req-9d5ff6c8-5798-4613-974e-a5b14c2f6a2a 01c6e0cd81404938a10f65d81ef6d178 2f96e5f995e6442f94f881dabbe4dbf7 - - default default] [instance: 51fca94e-ecb9-4350-bafc-765e827a1c7b] No waiting events found dispatching network-vif-plugged-d6b04aa0-3423-4a78-adfc-4bf3151f80ed pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Dec 15 05:01:10 localhost ovn_metadata_agent[160585]: 2025-12-15 10:01:10.795 160590 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:df:04:be 10.100.0.13'], port_security=['fa:16:3e:df:04:be 10.100.0.13'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005559462.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'name': 'tempest-parent-2031244324', 'neutron:cidrs': '10.100.0.13/28', 'neutron:device_id': '8f5a73ce-a171-4a42-9f60-63c0db9e0a32', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-ff0cb653-0dc9-48ed-8a20-700650a10509', 'neutron:port_capabilities': '', 'neutron:port_name': 'tempest-parent-2031244324', 'neutron:project_id': '4e26599ab4374eb6b7e9d73e7dfebd03', 'neutron:revision_number': '12', 'neutron:security_group_ids': '489315f5-1e14-49f6-8195-de837d23b935', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'np0005559462.localdomain'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=0504d409-4e28-4301-a421-b053ce853c40, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[], logical_port=19409185-ff01-4d58-bd6e-5d66014dd5c9) old=Port_Binding(up=[True], chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 15 05:01:10 localhost nova_compute[286344]: 2025-12-15 10:01:10.791 286348 WARNING nova.compute.manager [req-bc592847-6689-430e-91ed-0f66feffec83 req-9d5ff6c8-5798-4613-974e-a5b14c2f6a2a 01c6e0cd81404938a10f65d81ef6d178 2f96e5f995e6442f94f881dabbe4dbf7 - - default default] [instance: 51fca94e-ecb9-4350-bafc-765e827a1c7b] Received unexpected event network-vif-plugged-d6b04aa0-3423-4a78-adfc-4bf3151f80ed for instance with vm_state active and task_state None.
Dec 15 05:01:10 localhost nova_compute[286344]: 2025-12-15 10:01:10.796 286348 DEBUG nova.compute.manager [req-bc592847-6689-430e-91ed-0f66feffec83 req-9d5ff6c8-5798-4613-974e-a5b14c2f6a2a 01c6e0cd81404938a10f65d81ef6d178 2f96e5f995e6442f94f881dabbe4dbf7 - - default default] [instance: 8f5a73ce-a171-4a42-9f60-63c0db9e0a32] Received event network-vif-plugged-19409185-ff01-4d58-bd6e-5d66014dd5c9 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 15 05:01:10 localhost nova_compute[286344]: 2025-12-15 10:01:10.796 286348 DEBUG oslo_concurrency.lockutils [req-bc592847-6689-430e-91ed-0f66feffec83 req-9d5ff6c8-5798-4613-974e-a5b14c2f6a2a 01c6e0cd81404938a10f65d81ef6d178 2f96e5f995e6442f94f881dabbe4dbf7 - - default default] Acquiring lock "8f5a73ce-a171-4a42-9f60-63c0db9e0a32-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 15 05:01:10 localhost nova_compute[286344]: 2025-12-15 10:01:10.796 286348 DEBUG oslo_concurrency.lockutils [req-bc592847-6689-430e-91ed-0f66feffec83 req-9d5ff6c8-5798-4613-974e-a5b14c2f6a2a 01c6e0cd81404938a10f65d81ef6d178 2f96e5f995e6442f94f881dabbe4dbf7 - - default default] Lock "8f5a73ce-a171-4a42-9f60-63c0db9e0a32-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 15 05:01:10 localhost nova_compute[286344]: 2025-12-15 10:01:10.797 286348 DEBUG oslo_concurrency.lockutils [req-bc592847-6689-430e-91ed-0f66feffec83 req-9d5ff6c8-5798-4613-974e-a5b14c2f6a2a 01c6e0cd81404938a10f65d81ef6d178 2f96e5f995e6442f94f881dabbe4dbf7 - - default default] Lock "8f5a73ce-a171-4a42-9f60-63c0db9e0a32-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 15 05:01:10 localhost nova_compute[286344]: 2025-12-15 10:01:10.797 286348 DEBUG nova.compute.manager [req-bc592847-6689-430e-91ed-0f66feffec83 req-9d5ff6c8-5798-4613-974e-a5b14c2f6a2a 01c6e0cd81404938a10f65d81ef6d178 2f96e5f995e6442f94f881dabbe4dbf7 - - default default] [instance: 8f5a73ce-a171-4a42-9f60-63c0db9e0a32] No waiting events found dispatching network-vif-plugged-19409185-ff01-4d58-bd6e-5d66014dd5c9 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Dec 15 05:01:10 localhost nova_compute[286344]: 2025-12-15 10:01:10.797 286348 WARNING nova.compute.manager [req-bc592847-6689-430e-91ed-0f66feffec83 req-9d5ff6c8-5798-4613-974e-a5b14c2f6a2a 01c6e0cd81404938a10f65d81ef6d178 2f96e5f995e6442f94f881dabbe4dbf7 - - default default] [instance: 8f5a73ce-a171-4a42-9f60-63c0db9e0a32] Received unexpected event network-vif-plugged-19409185-ff01-4d58-bd6e-5d66014dd5c9 for instance with vm_state active and task_state deleting.
Dec 15 05:01:10 localhost ovn_metadata_agent[160585]: 2025-12-15 10:01:10.798 160590 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:79:ba:17 19.80.0.198'], port_security=['fa:16:3e:79:ba:17 19.80.0.198'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': ''}, parent_port=['19409185-ff01-4d58-bd6e-5d66014dd5c9'], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'name': 'tempest-subport-145676246', 'neutron:cidrs': '19.80.0.198/24', 'neutron:device_id': '', 'neutron:device_owner': 'trunk:subport', 'neutron:mtu': '', 'neutron:network_name': 'neutron-0874a63c-034d-48db-8f84-a51c1fe90687', 'neutron:port_capabilities': '', 'neutron:port_name': 'tempest-subport-145676246', 'neutron:project_id': '4e26599ab4374eb6b7e9d73e7dfebd03', 'neutron:revision_number': '5', 'neutron:security_group_ids': '489315f5-1e14-49f6-8195-de837d23b935', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[42], additional_encap=[], encap=[], mirror_rules=[], datapath=d3aba774-abb1-4ddc-beaa-cd88849a49e2, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[], logical_port=fd72ab01-c63c-4f51-ae94-ca02f6f50dfa) old=Port_Binding(up=[True], chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 15 05:01:10 localhost ovn_metadata_agent[160585]: 2025-12-15 10:01:10.799 160590 INFO neutron.agent.ovn.metadata.agent [-] Port 19409185-ff01-4d58-bd6e-5d66014dd5c9 in datapath ff0cb653-0dc9-48ed-8a20-700650a10509 unbound from our chassis
Dec 15 05:01:10 localhost neutron_dhcp_agent[267542]: 2025-12-15 10:01:10.799 267546 INFO neutron.agent.dhcp.agent [None req-52ffa594-72c3-4e36-8470-aa4d3c6fd327 - - - - - -] DHCP configuration for ports {'f4687fb1-144e-4c75-9be4-6b8fc067b9d1'} is completed
Dec 15 05:01:10 localhost nova_compute[286344]: 2025-12-15 10:01:10.797 286348 DEBUG nova.compute.manager [req-bc592847-6689-430e-91ed-0f66feffec83 req-9d5ff6c8-5798-4613-974e-a5b14c2f6a2a 01c6e0cd81404938a10f65d81ef6d178 2f96e5f995e6442f94f881dabbe4dbf7 - - default default] [instance: 8f5a73ce-a171-4a42-9f60-63c0db9e0a32] Received event network-vif-plugged-19409185-ff01-4d58-bd6e-5d66014dd5c9 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 15 05:01:10 localhost nova_compute[286344]: 2025-12-15 10:01:10.798 286348 DEBUG oslo_concurrency.lockutils [req-bc592847-6689-430e-91ed-0f66feffec83 req-9d5ff6c8-5798-4613-974e-a5b14c2f6a2a 01c6e0cd81404938a10f65d81ef6d178 2f96e5f995e6442f94f881dabbe4dbf7 - - default default] Acquiring lock "8f5a73ce-a171-4a42-9f60-63c0db9e0a32-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 15 05:01:10 localhost nova_compute[286344]: 2025-12-15 10:01:10.798 286348 DEBUG oslo_concurrency.lockutils [req-bc592847-6689-430e-91ed-0f66feffec83 req-9d5ff6c8-5798-4613-974e-a5b14c2f6a2a 01c6e0cd81404938a10f65d81ef6d178 2f96e5f995e6442f94f881dabbe4dbf7 - - default default] Lock "8f5a73ce-a171-4a42-9f60-63c0db9e0a32-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 15 05:01:10 localhost nova_compute[286344]: 2025-12-15 10:01:10.799 286348 DEBUG oslo_concurrency.lockutils [req-bc592847-6689-430e-91ed-0f66feffec83 req-9d5ff6c8-5798-4613-974e-a5b14c2f6a2a 01c6e0cd81404938a10f65d81ef6d178 2f96e5f995e6442f94f881dabbe4dbf7 - - default default] Lock "8f5a73ce-a171-4a42-9f60-63c0db9e0a32-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 15 05:01:10 localhost nova_compute[286344]: 2025-12-15 10:01:10.799 286348 DEBUG nova.compute.manager [req-bc592847-6689-430e-91ed-0f66feffec83 req-9d5ff6c8-5798-4613-974e-a5b14c2f6a2a 01c6e0cd81404938a10f65d81ef6d178 2f96e5f995e6442f94f881dabbe4dbf7 - - default default] [instance: 8f5a73ce-a171-4a42-9f60-63c0db9e0a32] No waiting events found dispatching network-vif-plugged-19409185-ff01-4d58-bd6e-5d66014dd5c9 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Dec 15 05:01:10 localhost nova_compute[286344]: 2025-12-15 10:01:10.799 286348 WARNING nova.compute.manager [req-bc592847-6689-430e-91ed-0f66feffec83 req-9d5ff6c8-5798-4613-974e-a5b14c2f6a2a 01c6e0cd81404938a10f65d81ef6d178 2f96e5f995e6442f94f881dabbe4dbf7 - - default default] [instance: 8f5a73ce-a171-4a42-9f60-63c0db9e0a32] Received unexpected event network-vif-plugged-19409185-ff01-4d58-bd6e-5d66014dd5c9 for instance with vm_state active and task_state deleting.
Dec 15 05:01:10 localhost nova_compute[286344]: 2025-12-15 10:01:10.800 286348 DEBUG nova.compute.manager [req-bc592847-6689-430e-91ed-0f66feffec83 req-9d5ff6c8-5798-4613-974e-a5b14c2f6a2a 01c6e0cd81404938a10f65d81ef6d178 2f96e5f995e6442f94f881dabbe4dbf7 - - default default] [instance: 8f5a73ce-a171-4a42-9f60-63c0db9e0a32] Received event network-vif-plugged-19409185-ff01-4d58-bd6e-5d66014dd5c9 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 15 05:01:10 localhost nova_compute[286344]: 2025-12-15 10:01:10.800 286348 DEBUG oslo_concurrency.lockutils [req-bc592847-6689-430e-91ed-0f66feffec83 req-9d5ff6c8-5798-4613-974e-a5b14c2f6a2a 01c6e0cd81404938a10f65d81ef6d178 2f96e5f995e6442f94f881dabbe4dbf7 - - default default] Acquiring lock "8f5a73ce-a171-4a42-9f60-63c0db9e0a32-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 15 05:01:10 localhost nova_compute[286344]: 2025-12-15 10:01:10.800 286348 DEBUG oslo_concurrency.lockutils [req-bc592847-6689-430e-91ed-0f66feffec83 req-9d5ff6c8-5798-4613-974e-a5b14c2f6a2a 01c6e0cd81404938a10f65d81ef6d178 2f96e5f995e6442f94f881dabbe4dbf7 - - default default] Lock "8f5a73ce-a171-4a42-9f60-63c0db9e0a32-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 15 05:01:10 localhost nova_compute[286344]: 2025-12-15 10:01:10.801 286348 DEBUG oslo_concurrency.lockutils [req-bc592847-6689-430e-91ed-0f66feffec83 req-9d5ff6c8-5798-4613-974e-a5b14c2f6a2a 01c6e0cd81404938a10f65d81ef6d178 2f96e5f995e6442f94f881dabbe4dbf7 - - default default] Lock "8f5a73ce-a171-4a42-9f60-63c0db9e0a32-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 15 05:01:10 localhost ovn_controller[154603]: 2025-12-15T10:01:10Z|00127|binding|INFO|Releasing lport b35254ad-12eb-47bb-92be-44fefe0694f0 from this chassis (sb_readonly=0)
Dec 15 05:01:10 localhost ovn_controller[154603]: 2025-12-15T10:01:10Z|00128|binding|INFO|Releasing lport 1a4411b8-2368-4b17-9d10-08f9c3480350 from this chassis (sb_readonly=0)
Dec 15 05:01:10 localhost ovn_controller[154603]: 2025-12-15T10:01:10Z|00129|binding|INFO|Releasing lport 3c672ee2-c7c4-4620-810e-6910f497dd1e from this chassis (sb_readonly=0)
Dec 15 05:01:10 localhost ovn_controller[154603]: 2025-12-15T10:01:10Z|00130|binding|INFO|Releasing lport eb478ea1-29fd-4e9a-85ee-b8b88d82f051 from this chassis (sb_readonly=0)
Dec 15 05:01:10 localhost ovn_controller[154603]: 2025-12-15T10:01:10Z|00131|binding|INFO|Releasing lport c3be4d70-09ff-4304-ad4e-0464f0628ae9 from this chassis (sb_readonly=0)
Dec 15 05:01:10 localhost nova_compute[286344]: 2025-12-15 10:01:10.803 286348 DEBUG nova.compute.manager [req-bc592847-6689-430e-91ed-0f66feffec83 req-9d5ff6c8-5798-4613-974e-a5b14c2f6a2a 01c6e0cd81404938a10f65d81ef6d178 2f96e5f995e6442f94f881dabbe4dbf7 - - default default] [instance: 8f5a73ce-a171-4a42-9f60-63c0db9e0a32] No waiting events found dispatching network-vif-plugged-19409185-ff01-4d58-bd6e-5d66014dd5c9 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Dec 15 05:01:10 localhost nova_compute[286344]: 2025-12-15 10:01:10.803 286348 WARNING nova.compute.manager [req-bc592847-6689-430e-91ed-0f66feffec83 req-9d5ff6c8-5798-4613-974e-a5b14c2f6a2a 01c6e0cd81404938a10f65d81ef6d178 2f96e5f995e6442f94f881dabbe4dbf7 - - default default] [instance: 8f5a73ce-a171-4a42-9f60-63c0db9e0a32] Received unexpected event network-vif-plugged-19409185-ff01-4d58-bd6e-5d66014dd5c9 for instance with vm_state active and task_state deleting.
Dec 15 05:01:10 localhost nova_compute[286344]: 2025-12-15 10:01:10.804 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup
/usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:01:10 localhost ovn_metadata_agent[160585]: 2025-12-15 10:01:10.804 160590 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network ff0cb653-0dc9-48ed-8a20-700650a10509, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m Dec 15 05:01:10 localhost ovn_metadata_agent[160585]: 2025-12-15 10:01:10.805 160858 DEBUG oslo.privsep.daemon [-] privsep: reply[e064afd4-99db-4799-94d4-a0252aa463f0]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Dec 15 05:01:10 localhost ovn_metadata_agent[160585]: 2025-12-15 10:01:10.806 160590 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-ff0cb653-0dc9-48ed-8a20-700650a10509 namespace which is not needed anymore#033[00m Dec 15 05:01:10 localhost nova_compute[286344]: 2025-12-15 10:01:10.828 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:01:10 localhost systemd[1]: machine-qemu\x2d3\x2dinstance\x2d00000007.scope: Deactivated successfully. Dec 15 05:01:10 localhost systemd[1]: machine-qemu\x2d3\x2dinstance\x2d00000007.scope: Consumed 1.409s CPU time. Dec 15 05:01:10 localhost systemd-machined[84011]: Machine qemu-3-instance-00000007 terminated. 
Dec 15 05:01:10 localhost nova_compute[286344]: 2025-12-15 10:01:10.865 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:01:10 localhost nova_compute[286344]: 2025-12-15 10:01:10.870 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:01:10 localhost nova_compute[286344]: 2025-12-15 10:01:10.882 286348 INFO nova.virt.libvirt.driver [-] [instance: 8f5a73ce-a171-4a42-9f60-63c0db9e0a32] Instance destroyed successfully.#033[00m Dec 15 05:01:10 localhost nova_compute[286344]: 2025-12-15 10:01:10.882 286348 DEBUG nova.objects.instance [None req-f105e559-0d6d-47e1-8926-de823ff3b94e 7eba5d40415d4fd7ab565eed1e07b799 4e26599ab4374eb6b7e9d73e7dfebd03 - - default default] Lazy-loading 'resources' on Instance uuid 8f5a73ce-a171-4a42-9f60-63c0db9e0a32 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m Dec 15 05:01:10 localhost nova_compute[286344]: 2025-12-15 10:01:10.898 286348 DEBUG nova.virt.libvirt.vif [None req-f105e559-0d6d-47e1-8926-de823ff3b94e 7eba5d40415d4fd7ab565eed1e07b799 4e26599ab4374eb6b7e9d73e7dfebd03 - - default default] vif_type=ovs 
instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=True,config_drive='True',created_at=2025-12-15T10:00:28Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=,disable_terminate=False,display_description='tempest-LiveMigrationTest-server-1394638268',display_name='tempest-LiveMigrationTest-server-1394638268',ec2_ids=,ephemeral_gb=0,ephemeral_key_uuid=None,fault=,flavor=Flavor(5),hidden=False,host='np0005559462.localdomain',hostname='tempest-livemigrationtest-server-1394638268',id=7,image_ref='b48177c8-9d95-4864-913a-a010f9defaa6',info_cache=InstanceInfoCache,instance_type_id=5,kernel_id='',key_data=None,key_name=None,keypairs=,launch_index=0,launched_at=2025-12-15T10:00:39Z,launched_on='np0005559463.localdomain',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=,new_flavor=None,node='np0005559462.localdomain',numa_topology=None,old_flavor=None,os_type=None,pci_devices=,pci_requests=,power_state=1,progress=0,project_id='4e26599ab4374eb6b7e9d73e7dfebd03',ramdisk_id='',reservation_id='r-g02irkkv',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=,shutdown_terminate=False,system_metadata={boot_roles='member,reader',clean_attempts='1',image_base_image_ref='b48177c8-9d95-4864-913a-a010f9defaa6',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-LiveMigrationTest-134706359',owner_user_name='tempest-LiveMigrationTest-134706359-project-member'},tags=,task_state='deleting',terminated_at=None,trusted_certs=,updated_at=2025-12-15T10:01:01Z,user_data=None,user_id='7eba5d40415d4fd7ab
565eed1e07b799',uuid=8f5a73ce-a171-4a42-9f60-63c0db9e0a32,vcpu_model=,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "19409185-ff01-4d58-bd6e-5d66014dd5c9", "address": "fa:16:3e:df:04:be", "network": {"id": "ff0cb653-0dc9-48ed-8a20-700650a10509", "bridge": "br-int", "label": "tempest-LiveMigrationTest-2038605812-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "10.100.0.3"}}], "meta": {"injected": false, "tenant_id": "4e26599ab4374eb6b7e9d73e7dfebd03", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap19409185-ff", "ovs_interfaceid": "19409185-ff01-4d58-bd6e-5d66014dd5c9", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m Dec 15 05:01:10 localhost nova_compute[286344]: 2025-12-15 10:01:10.899 286348 DEBUG nova.network.os_vif_util [None req-f105e559-0d6d-47e1-8926-de823ff3b94e 7eba5d40415d4fd7ab565eed1e07b799 4e26599ab4374eb6b7e9d73e7dfebd03 - - default default] Converting VIF {"id": "19409185-ff01-4d58-bd6e-5d66014dd5c9", "address": "fa:16:3e:df:04:be", "network": {"id": "ff0cb653-0dc9-48ed-8a20-700650a10509", "bridge": "br-int", "label": "tempest-LiveMigrationTest-2038605812-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], 
"routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "10.100.0.3"}}], "meta": {"injected": false, "tenant_id": "4e26599ab4374eb6b7e9d73e7dfebd03", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap19409185-ff", "ovs_interfaceid": "19409185-ff01-4d58-bd6e-5d66014dd5c9", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m Dec 15 05:01:10 localhost nova_compute[286344]: 2025-12-15 10:01:10.900 286348 DEBUG nova.network.os_vif_util [None req-f105e559-0d6d-47e1-8926-de823ff3b94e 7eba5d40415d4fd7ab565eed1e07b799 4e26599ab4374eb6b7e9d73e7dfebd03 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:df:04:be,bridge_name='br-int',has_traffic_filtering=True,id=19409185-ff01-4d58-bd6e-5d66014dd5c9,network=Network(ff0cb653-0dc9-48ed-8a20-700650a10509),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap19409185-ff') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m Dec 15 05:01:10 localhost nova_compute[286344]: 2025-12-15 10:01:10.901 286348 DEBUG os_vif [None req-f105e559-0d6d-47e1-8926-de823ff3b94e 7eba5d40415d4fd7ab565eed1e07b799 4e26599ab4374eb6b7e9d73e7dfebd03 - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:df:04:be,bridge_name='br-int',has_traffic_filtering=True,id=19409185-ff01-4d58-bd6e-5d66014dd5c9,network=Network(ff0cb653-0dc9-48ed-8a20-700650a10509),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap19409185-ff') unplug 
/usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m Dec 15 05:01:10 localhost nova_compute[286344]: 2025-12-15 10:01:10.910 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:01:10 localhost nova_compute[286344]: 2025-12-15 10:01:10.910 286348 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap19409185-ff, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m Dec 15 05:01:10 localhost nova_compute[286344]: 2025-12-15 10:01:10.917 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:01:10 localhost nova_compute[286344]: 2025-12-15 10:01:10.919 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Dec 15 05:01:10 localhost nova_compute[286344]: 2025-12-15 10:01:10.923 286348 INFO os_vif [None req-f105e559-0d6d-47e1-8926-de823ff3b94e 7eba5d40415d4fd7ab565eed1e07b799 4e26599ab4374eb6b7e9d73e7dfebd03 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:df:04:be,bridge_name='br-int',has_traffic_filtering=True,id=19409185-ff01-4d58-bd6e-5d66014dd5c9,network=Network(ff0cb653-0dc9-48ed-8a20-700650a10509),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap19409185-ff')#033[00m Dec 15 05:01:10 localhost snmpd[69387]: empty variable list in _query Dec 15 05:01:10 localhost snmpd[69387]: empty variable list in _query Dec 15 05:01:10 localhost snmpd[69387]: empty variable list in _query Dec 15 05:01:10 localhost snmpd[69387]: empty variable list in _query Dec 15 05:01:10 localhost snmpd[69387]: empty variable list in _query Dec 15 05:01:10 localhost snmpd[69387]: empty variable list in 
_query Dec 15 05:01:11 localhost systemd[1]: tmp-crun.XnqkUy.mount: Deactivated successfully. Dec 15 05:01:11 localhost neutron-haproxy-ovnmeta-ff0cb653-0dc9-48ed-8a20-700650a10509[315208]: [NOTICE] (315212) : haproxy version is 2.8.14-c23fe91 Dec 15 05:01:11 localhost neutron-haproxy-ovnmeta-ff0cb653-0dc9-48ed-8a20-700650a10509[315208]: [NOTICE] (315212) : path to executable is /usr/sbin/haproxy Dec 15 05:01:11 localhost neutron-haproxy-ovnmeta-ff0cb653-0dc9-48ed-8a20-700650a10509[315208]: [WARNING] (315212) : Exiting Master process... Dec 15 05:01:11 localhost neutron-haproxy-ovnmeta-ff0cb653-0dc9-48ed-8a20-700650a10509[315208]: [ALERT] (315212) : Current worker (315214) exited with code 143 (Terminated) Dec 15 05:01:11 localhost neutron-haproxy-ovnmeta-ff0cb653-0dc9-48ed-8a20-700650a10509[315208]: [WARNING] (315212) : All workers exited. Exiting... (0) Dec 15 05:01:11 localhost systemd[1]: libpod-fbf5e5fffb8688761fea1889fc3c1bf4b00ba85f56a27be317821bcad0f556a3.scope: Deactivated successfully. 
Dec 15 05:01:11 localhost podman[315597]: 2025-12-15 10:01:11.061294847 +0000 UTC m=+0.106931067 container died fbf5e5fffb8688761fea1889fc3c1bf4b00ba85f56a27be317821bcad0f556a3 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-ff0cb653-0dc9-48ed-8a20-700650a10509, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true) Dec 15 05:01:11 localhost podman[315597]: 2025-12-15 10:01:11.111603798 +0000 UTC m=+0.157239968 container cleanup fbf5e5fffb8688761fea1889fc3c1bf4b00ba85f56a27be317821bcad0f556a3 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-ff0cb653-0dc9-48ed-8a20-700650a10509, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251202, tcib_managed=true, io.buildah.version=1.41.3, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team) Dec 15 05:01:11 localhost podman[315628]: 2025-12-15 10:01:11.157373336 +0000 UTC m=+0.089956249 container cleanup fbf5e5fffb8688761fea1889fc3c1bf4b00ba85f56a27be317821bcad0f556a3 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-ff0cb653-0dc9-48ed-8a20-700650a10509, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, tcib_managed=true, 
org.label-schema.vendor=CentOS) Dec 15 05:01:11 localhost systemd[1]: libpod-conmon-fbf5e5fffb8688761fea1889fc3c1bf4b00ba85f56a27be317821bcad0f556a3.scope: Deactivated successfully. Dec 15 05:01:11 localhost podman[315643]: 2025-12-15 10:01:11.226964742 +0000 UTC m=+0.090499882 container remove fbf5e5fffb8688761fea1889fc3c1bf4b00ba85f56a27be317821bcad0f556a3 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-ff0cb653-0dc9-48ed-8a20-700650a10509, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS) Dec 15 05:01:11 localhost ovn_metadata_agent[160585]: 2025-12-15 10:01:11.232 160858 DEBUG oslo.privsep.daemon [-] privsep: reply[1c7312dc-cc12-443e-be97-82499116f3d5]: (4, ('Mon Dec 15 10:01:10 AM UTC 2025 Stopping container neutron-haproxy-ovnmeta-ff0cb653-0dc9-48ed-8a20-700650a10509 (fbf5e5fffb8688761fea1889fc3c1bf4b00ba85f56a27be317821bcad0f556a3)\nfbf5e5fffb8688761fea1889fc3c1bf4b00ba85f56a27be317821bcad0f556a3\nMon Dec 15 10:01:11 AM UTC 2025 Deleting container neutron-haproxy-ovnmeta-ff0cb653-0dc9-48ed-8a20-700650a10509 (fbf5e5fffb8688761fea1889fc3c1bf4b00ba85f56a27be317821bcad0f556a3)\nfbf5e5fffb8688761fea1889fc3c1bf4b00ba85f56a27be317821bcad0f556a3\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Dec 15 05:01:11 localhost ovn_metadata_agent[160585]: 2025-12-15 10:01:11.234 160858 DEBUG oslo.privsep.daemon [-] privsep: reply[9eab12a8-6a62-4f3c-af7b-476088f332a3]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Dec 15 05:01:11 localhost ovn_metadata_agent[160585]: 2025-12-15 10:01:11.235 160590 DEBUG ovsdbapp.backend.ovs_idl.transaction 
[-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapff0cb653-00, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m Dec 15 05:01:11 localhost nova_compute[286344]: 2025-12-15 10:01:11.237 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:01:11 localhost kernel: device tapff0cb653-00 left promiscuous mode Dec 15 05:01:11 localhost nova_compute[286344]: 2025-12-15 10:01:11.239 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:01:11 localhost nova_compute[286344]: 2025-12-15 10:01:11.245 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:01:11 localhost ovn_metadata_agent[160585]: 2025-12-15 10:01:11.251 160858 DEBUG oslo.privsep.daemon [-] privsep: reply[6855092a-fc0c-4706-a9c2-df07793502d6]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Dec 15 05:01:11 localhost ovn_metadata_agent[160585]: 2025-12-15 10:01:11.266 160858 DEBUG oslo.privsep.daemon [-] privsep: reply[28439cef-d5b9-4682-882c-cdcdadf1af56]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Dec 15 05:01:11 localhost ovn_metadata_agent[160585]: 2025-12-15 10:01:11.267 160858 DEBUG oslo.privsep.daemon [-] privsep: reply[bf790c25-4556-4fe1-8325-f39f20307d65]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Dec 15 05:01:11 localhost ovn_metadata_agent[160585]: 2025-12-15 10:01:11.286 160858 DEBUG oslo.privsep.daemon [-] privsep: reply[18b4da5c-e853-4921-b343-dd278f03cf95]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], 
['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_QDISC', 'noqueue'], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 1, 
'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 1192439, 'reachable_time': 34968, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 37, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 
'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}]], 'header': {'length': 1356, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 315661, 'error': None, 'target': 'ovnmeta-ff0cb653-0dc9-48ed-8a20-700650a10509', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Dec 15 05:01:11 localhost ovn_metadata_agent[160585]: 2025-12-15 10:01:11.291 160979 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-ff0cb653-0dc9-48ed-8a20-700650a10509 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m Dec 15 05:01:11 localhost ovn_metadata_agent[160585]: 2025-12-15 10:01:11.291 160979 DEBUG oslo.privsep.daemon [-] privsep: reply[44c5b6ef-cb51-460d-9967-97150173bfd8]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Dec 15 05:01:11 localhost ovn_metadata_agent[160585]: 2025-12-15 10:01:11.292 160590 INFO neutron.agent.ovn.metadata.agent [-] Port fd72ab01-c63c-4f51-ae94-ca02f6f50dfa in datapath 0874a63c-034d-48db-8f84-a51c1fe90687 unbound from our chassis#033[00m Dec 15 05:01:11 localhost ovn_metadata_agent[160585]: 2025-12-15 10:01:11.295 160590 DEBUG neutron.agent.ovn.metadata.agent [-] Port 88d39070-3e5a-4b84-a35d-cecd6f04634a IP addresses were not retrieved from the Port_Binding MAC column ['unknown'] _get_port_ips /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:536#033[00m Dec 15 05:01:11 localhost ovn_metadata_agent[160585]: 2025-12-15 10:01:11.295 160590 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 0874a63c-034d-48db-8f84-a51c1fe90687, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m Dec 15 05:01:11 localhost ovn_metadata_agent[160585]: 2025-12-15 10:01:11.296 160858 DEBUG 
oslo.privsep.daemon [-] privsep: reply[5cbf4b75-cb4c-4051-addb-cebdce0175e9]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Dec 15 05:01:11 localhost ovn_metadata_agent[160585]: 2025-12-15 10:01:11.297 160590 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-0874a63c-034d-48db-8f84-a51c1fe90687 namespace which is not needed anymore#033[00m Dec 15 05:01:11 localhost systemd[1]: var-lib-containers-storage-overlay-52e193136e567be6221a0f3be354d690b359143abb37b7522fe279a9cd2c04a9-merged.mount: Deactivated successfully. Dec 15 05:01:11 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-fbf5e5fffb8688761fea1889fc3c1bf4b00ba85f56a27be317821bcad0f556a3-userdata-shm.mount: Deactivated successfully. Dec 15 05:01:11 localhost systemd[1]: run-netns-ovnmeta\x2dff0cb653\x2d0dc9\x2d48ed\x2d8a20\x2d700650a10509.mount: Deactivated successfully. Dec 15 05:01:11 localhost neutron-haproxy-ovnmeta-0874a63c-034d-48db-8f84-a51c1fe90687[315281]: [NOTICE] (315285) : haproxy version is 2.8.14-c23fe91 Dec 15 05:01:11 localhost neutron-haproxy-ovnmeta-0874a63c-034d-48db-8f84-a51c1fe90687[315281]: [NOTICE] (315285) : path to executable is /usr/sbin/haproxy Dec 15 05:01:11 localhost neutron-haproxy-ovnmeta-0874a63c-034d-48db-8f84-a51c1fe90687[315281]: [WARNING] (315285) : Exiting Master process... Dec 15 05:01:11 localhost neutron-haproxy-ovnmeta-0874a63c-034d-48db-8f84-a51c1fe90687[315281]: [ALERT] (315285) : Current worker (315287) exited with code 143 (Terminated) Dec 15 05:01:11 localhost neutron-haproxy-ovnmeta-0874a63c-034d-48db-8f84-a51c1fe90687[315281]: [WARNING] (315285) : All workers exited. Exiting... (0) Dec 15 05:01:11 localhost systemd[1]: libpod-a2b32f10869a3d9bcf3105745305f4a2d865b6a282e96d17c0d9489add2e8c1e.scope: Deactivated successfully. 
Dec 15 05:01:11 localhost podman[315677]: 2025-12-15 10:01:11.503582122 +0000 UTC m=+0.083853318 container died a2b32f10869a3d9bcf3105745305f4a2d865b6a282e96d17c0d9489add2e8c1e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-0874a63c-034d-48db-8f84-a51c1fe90687, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3)
Dec 15 05:01:11 localhost podman[315677]: 2025-12-15 10:01:11.540048781 +0000 UTC m=+0.120319487 container cleanup a2b32f10869a3d9bcf3105745305f4a2d865b6a282e96d17c0d9489add2e8c1e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-0874a63c-034d-48db-8f84-a51c1fe90687, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Dec 15 05:01:11 localhost nova_compute[286344]: 2025-12-15 10:01:11.596 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 15 05:01:11 localhost podman[315690]: 2025-12-15 10:01:11.600115153 +0000 UTC m=+0.100486419 container cleanup a2b32f10869a3d9bcf3105745305f4a2d865b6a282e96d17c0d9489add2e8c1e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-0874a63c-034d-48db-8f84-a51c1fe90687, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team)
Dec 15 05:01:11 localhost systemd[1]: libpod-conmon-a2b32f10869a3d9bcf3105745305f4a2d865b6a282e96d17c0d9489add2e8c1e.scope: Deactivated successfully.
Dec 15 05:01:11 localhost nova_compute[286344]: 2025-12-15 10:01:11.613 286348 INFO nova.virt.libvirt.driver [None req-f105e559-0d6d-47e1-8926-de823ff3b94e 7eba5d40415d4fd7ab565eed1e07b799 4e26599ab4374eb6b7e9d73e7dfebd03 - - default default] [instance: 8f5a73ce-a171-4a42-9f60-63c0db9e0a32] Deleting instance files /var/lib/nova/instances/8f5a73ce-a171-4a42-9f60-63c0db9e0a32_del
Dec 15 05:01:11 localhost nova_compute[286344]: 2025-12-15 10:01:11.613 286348 INFO nova.virt.libvirt.driver [None req-f105e559-0d6d-47e1-8926-de823ff3b94e 7eba5d40415d4fd7ab565eed1e07b799 4e26599ab4374eb6b7e9d73e7dfebd03 - - default default] [instance: 8f5a73ce-a171-4a42-9f60-63c0db9e0a32] Deletion of /var/lib/nova/instances/8f5a73ce-a171-4a42-9f60-63c0db9e0a32_del complete
Dec 15 05:01:11 localhost ovn_controller[154603]: 2025-12-15T10:01:11Z|00132|binding|INFO|Releasing lport b35254ad-12eb-47bb-92be-44fefe0694f0 from this chassis (sb_readonly=0)
Dec 15 05:01:11 localhost ovn_controller[154603]: 2025-12-15T10:01:11Z|00133|binding|INFO|Releasing lport 1a4411b8-2368-4b17-9d10-08f9c3480350 from this chassis (sb_readonly=0)
Dec 15 05:01:11 localhost ovn_controller[154603]: 2025-12-15T10:01:11Z|00134|binding|INFO|Releasing lport eb478ea1-29fd-4e9a-85ee-b8b88d82f051 from this chassis (sb_readonly=0)
Dec 15 05:01:11 localhost ovn_controller[154603]: 2025-12-15T10:01:11Z|00135|binding|INFO|Releasing lport c3be4d70-09ff-4304-ad4e-0464f0628ae9 from this chassis (sb_readonly=0)
Dec 15 05:01:11 localhost podman[315703]: 2025-12-15 10:01:11.662319717 +0000 UTC m=+0.096713737 container remove a2b32f10869a3d9bcf3105745305f4a2d865b6a282e96d17c0d9489add2e8c1e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-0874a63c-034d-48db-8f84-a51c1fe90687, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS)
Dec 15 05:01:11 localhost nova_compute[286344]: 2025-12-15 10:01:11.664 286348 INFO nova.compute.manager [None req-f105e559-0d6d-47e1-8926-de823ff3b94e 7eba5d40415d4fd7ab565eed1e07b799 4e26599ab4374eb6b7e9d73e7dfebd03 - - default default] [instance: 8f5a73ce-a171-4a42-9f60-63c0db9e0a32] Took 1.02 seconds to destroy the instance on the hypervisor.
Dec 15 05:01:11 localhost nova_compute[286344]: 2025-12-15 10:01:11.665 286348 DEBUG oslo.service.loopingcall [None req-f105e559-0d6d-47e1-8926-de823ff3b94e 7eba5d40415d4fd7ab565eed1e07b799 4e26599ab4374eb6b7e9d73e7dfebd03 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Dec 15 05:01:11 localhost nova_compute[286344]: 2025-12-15 10:01:11.665 286348 DEBUG nova.compute.manager [-] [instance: 8f5a73ce-a171-4a42-9f60-63c0db9e0a32] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Dec 15 05:01:11 localhost nova_compute[286344]: 2025-12-15 10:01:11.665 286348 DEBUG nova.network.neutron [-] [instance: 8f5a73ce-a171-4a42-9f60-63c0db9e0a32] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Dec 15 05:01:11 localhost ovn_metadata_agent[160585]: 2025-12-15 10:01:11.671 160858 DEBUG oslo.privsep.daemon [-] privsep: reply[fac39de1-f949-474f-b8ba-c9288fb4161b]: (4, ('Mon Dec 15 10:01:11 AM UTC 2025 Stopping container neutron-haproxy-ovnmeta-0874a63c-034d-48db-8f84-a51c1fe90687 (a2b32f10869a3d9bcf3105745305f4a2d865b6a282e96d17c0d9489add2e8c1e)\na2b32f10869a3d9bcf3105745305f4a2d865b6a282e96d17c0d9489add2e8c1e\nMon Dec 15 10:01:11 AM UTC 2025 Deleting container neutron-haproxy-ovnmeta-0874a63c-034d-48db-8f84-a51c1fe90687 (a2b32f10869a3d9bcf3105745305f4a2d865b6a282e96d17c0d9489add2e8c1e)\na2b32f10869a3d9bcf3105745305f4a2d865b6a282e96d17c0d9489add2e8c1e\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 15 05:01:11 localhost ovn_metadata_agent[160585]: 2025-12-15 10:01:11.673 160858 DEBUG oslo.privsep.daemon [-] privsep: reply[7cb3c522-f961-40fd-8281-51df811e05e1]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 15 05:01:11 localhost ovn_metadata_agent[160585]: 2025-12-15 10:01:11.674 160590 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap0874a63c-00, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 15 05:01:11 localhost nova_compute[286344]: 2025-12-15 10:01:11.685 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 15 05:01:11 localhost kernel: device tap0874a63c-00 left promiscuous mode
Dec 15 05:01:11 localhost nova_compute[286344]: 2025-12-15 10:01:11.692 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 15 05:01:11 localhost ovn_metadata_agent[160585]: 2025-12-15 10:01:11.695 160858 DEBUG oslo.privsep.daemon [-] privsep: reply[fa01b0ab-c1a7-4a72-a0a3-27b01e47f988]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 15 05:01:11 localhost ovn_metadata_agent[160585]: 2025-12-15 10:01:11.714 160858 DEBUG oslo.privsep.daemon [-] privsep: reply[1582ee0e-8dfd-4880-8f25-681e2a44b6d7]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 15 05:01:11 localhost ovn_metadata_agent[160585]: 2025-12-15 10:01:11.715 160858 DEBUG oslo.privsep.daemon [-] privsep: reply[e26fe3ef-625c-4e69-a1e9-55d295a2b9fc]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 15 05:01:11 localhost ovn_metadata_agent[160585]: 2025-12-15 10:01:11.731 160858 DEBUG oslo.privsep.daemon [-] privsep: reply[2b6c8ed4-298d-43f8-bb87-118cb263e7f1]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_QDISC', 'noqueue'], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 1, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 1192528, 'reachable_time': 36298, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 37, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}]], 'header': {'length': 1356, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 315723, 'error': None, 'target': 'ovnmeta-0874a63c-034d-48db-8f84-a51c1fe90687', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 15 05:01:11 localhost ovn_metadata_agent[160585]: 2025-12-15 10:01:11.733 160979 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-0874a63c-034d-48db-8f84-a51c1fe90687 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Dec 15 05:01:11 localhost ovn_metadata_agent[160585]: 2025-12-15 10:01:11.733 160979 DEBUG oslo.privsep.daemon [-] privsep: reply[2ea66f19-0505-4f27-92de-04b5d2f97867]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 15 05:01:11 localhost nova_compute[286344]: 2025-12-15 10:01:11.790 286348 DEBUG nova.network.neutron [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] [instance: 39ff1bd9-6f6b-44c8-bbec-a1fd9d196359] Updating instance_info_cache with network_info: [{"id": "03ef8889-3216-43fb-8a52-4be17a956ce1", "address": "fa:16:3e:74:df:7c", "network": {"id": "befb7a72-17a9-4bcb-b561-84b8f626685a", "bridge": "br-int", "label": "private", "subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.201", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.20", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.0.3"}}], "meta": {"injected": false, "tenant_id": "c785bf23f53946bc99867d8832a50266", "mtu": 1292, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap03ef8889-32", "ovs_interfaceid": "03ef8889-3216-43fb-8a52-4be17a956ce1", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Dec 15 05:01:11 localhost nova_compute[286344]: 2025-12-15 10:01:11.812 286348 DEBUG oslo_concurrency.lockutils [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Releasing lock "refresh_cache-39ff1bd9-6f6b-44c8-bbec-a1fd9d196359" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Dec 15 05:01:11 localhost nova_compute[286344]: 2025-12-15 10:01:11.813 286348 DEBUG nova.compute.manager [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] [instance: 39ff1bd9-6f6b-44c8-bbec-a1fd9d196359] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929
Dec 15 05:01:11 localhost nova_compute[286344]: 2025-12-15 10:01:11.813 286348 DEBUG oslo_service.periodic_task [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 15 05:01:12 localhost systemd[1]: var-lib-containers-storage-overlay-7e2d8c207c36940f6908fdaa1501d9253b7e7db0b57e87bdb9d8301453e4b4f2-merged.mount: Deactivated successfully.
Dec 15 05:01:12 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-a2b32f10869a3d9bcf3105745305f4a2d865b6a282e96d17c0d9489add2e8c1e-userdata-shm.mount: Deactivated successfully.
Dec 15 05:01:12 localhost systemd[1]: run-netns-ovnmeta\x2d0874a63c\x2d034d\x2d48db\x2d8f84\x2da51c1fe90687.mount: Deactivated successfully.
Dec 15 05:01:12 localhost nova_compute[286344]: 2025-12-15 10:01:12.992 286348 DEBUG nova.virt.libvirt.driver [None req-3896e173-607d-4857-b33f-eee46c72bcf0 b27b4cc96007415aab89230177c048ab 466123b442f148a6ab70ab607ead6747 - - default default] [instance: 51fca94e-ecb9-4350-bafc-765e827a1c7b] Check if temp file /var/lib/nova/instances/tmpc3uv3_wg exists to indicate shared storage is being used for migration. Exists? False _check_shared_storage_test_file /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10065
Dec 15 05:01:12 localhost nova_compute[286344]: 2025-12-15 10:01:12.993 286348 DEBUG nova.compute.manager [None req-3896e173-607d-4857-b33f-eee46c72bcf0 b27b4cc96007415aab89230177c048ab 466123b442f148a6ab70ab607ead6747 - - default default] source check data is LibvirtLiveMigrateData(bdms=<?>,block_migration=False,disk_available_mb=12288,disk_over_commit=<?>,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpc3uv3_wg',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='rbd',instance_relative_path='51fca94e-ecb9-4350-bafc-765e827a1c7b',is_shared_block_storage=True,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids=<?>,serial_listen_addr=None,serial_listen_ports=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) check_can_live_migrate_source /usr/lib/python3.9/site-packages/nova/compute/manager.py:8587
Dec 15 05:01:13 localhost ovn_metadata_agent[160585]: 2025-12-15 10:01:13.045 160590 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=12d96d64-e862-4f68-81e5-8d9ec5d3a5e2, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '9'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 15 05:01:13 localhost nova_compute[286344]: 2025-12-15 10:01:13.270 286348 DEBUG oslo_service.periodic_task [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 15 05:01:13 localhost nova_compute[286344]: 2025-12-15 10:01:13.271 286348 DEBUG nova.compute.manager [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Dec 15 05:01:13 localhost nova_compute[286344]: 2025-12-15 10:01:13.272 286348 DEBUG oslo_service.periodic_task [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 15 05:01:13 localhost nova_compute[286344]: 2025-12-15 10:01:13.296 286348 DEBUG oslo_concurrency.lockutils [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 15 05:01:13 localhost nova_compute[286344]: 2025-12-15 10:01:13.297 286348 DEBUG oslo_concurrency.lockutils [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 15 05:01:13 localhost nova_compute[286344]: 2025-12-15 10:01:13.297 286348 DEBUG oslo_concurrency.lockutils [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 15 05:01:13 localhost nova_compute[286344]: 2025-12-15 10:01:13.298 286348 DEBUG nova.compute.resource_tracker [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Auditing locally available compute resources for np0005559462.localdomain (node: np0005559462.localdomain) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Dec 15 05:01:13 localhost nova_compute[286344]: 2025-12-15 10:01:13.298 286348 DEBUG oslo_concurrency.processutils [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 15 05:01:13 localhost nova_compute[286344]: 2025-12-15 10:01:13.736 286348 DEBUG oslo_concurrency.processutils [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.438s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 15 05:01:13 localhost nova_compute[286344]: 2025-12-15 10:01:13.936 286348 DEBUG nova.virt.libvirt.driver [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] skipping disk for instance-00000008 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231
Dec 15 05:01:13 localhost nova_compute[286344]: 2025-12-15 10:01:13.936 286348 DEBUG nova.virt.libvirt.driver [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] skipping disk for instance-00000008 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231
Dec 15 05:01:13 localhost nova_compute[286344]: 2025-12-15 10:01:13.948 286348 DEBUG nova.virt.libvirt.driver [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231
Dec 15 05:01:13 localhost nova_compute[286344]: 2025-12-15 10:01:13.948 286348 DEBUG nova.virt.libvirt.driver [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231
Dec 15 05:01:14 localhost nova_compute[286344]: 2025-12-15 10:01:14.218 286348 WARNING nova.virt.libvirt.driver [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Dec 15 05:01:14 localhost nova_compute[286344]: 2025-12-15 10:01:14.220 286348 DEBUG nova.compute.resource_tracker [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Hypervisor/Node resource view: name=np0005559462.localdomain free_ram=11204MB free_disk=41.50135803222656GB free_vcpus=6 pci_devices=[{"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Dec 15 05:01:14 localhost nova_compute[286344]: 2025-12-15 10:01:14.220 286348 DEBUG oslo_concurrency.lockutils [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 15 05:01:14 localhost nova_compute[286344]: 2025-12-15 10:01:14.221 286348 DEBUG oslo_concurrency.lockutils [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 15 05:01:14 localhost nova_compute[286344]: 2025-12-15 10:01:14.278 286348 INFO nova.compute.resource_tracker [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] [instance: 51fca94e-ecb9-4350-bafc-765e827a1c7b] Updating resource usage from migration c3174169-6033-440f-9a8d-32658f6e6711
Dec 15 05:01:14 localhost nova_compute[286344]: 2025-12-15 10:01:14.309 286348 DEBUG nova.compute.resource_tracker [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Instance 39ff1bd9-6f6b-44c8-bbec-a1fd9d196359 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 2, 'MEMORY_MB': 512, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Dec 15 05:01:14 localhost nova_compute[286344]: 2025-12-15 10:01:14.310 286348 DEBUG nova.compute.resource_tracker [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Instance 8f5a73ce-a171-4a42-9f60-63c0db9e0a32 actively managed on this compute host and has allocations in placement: {'resources': {'VCPU': 1, 'MEMORY_MB': 128, 'DISK_GB': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Dec 15 05:01:14 localhost nova_compute[286344]: 2025-12-15 10:01:14.310 286348 DEBUG nova.compute.resource_tracker [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Migration c3174169-6033-440f-9a8d-32658f6e6711 is active on this compute host and has allocations in placement: {'resources': {'VCPU': 1, 'MEMORY_MB': 128, 'DISK_GB': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1640
Dec 15 05:01:14 localhost nova_compute[286344]: 2025-12-15 10:01:14.311 286348 DEBUG nova.compute.resource_tracker [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 3 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Dec 15 05:01:14 localhost nova_compute[286344]: 2025-12-15 10:01:14.311 286348 DEBUG nova.compute.resource_tracker [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Final resource view: name=np0005559462.localdomain phys_ram=15738MB used_ram=1280MB phys_disk=41GB used_disk=4GB total_vcpus=8 used_vcpus=3 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Dec 15 05:01:14 localhost nova_compute[286344]: 2025-12-15 10:01:14.340 286348 DEBUG nova.scheduler.client.report [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Refreshing inventories for resource provider 26c8956b-6742-4951-b566-971b9bbe323b _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804
Dec 15 05:01:14 localhost nova_compute[286344]: 2025-12-15 10:01:14.360 286348 DEBUG nova.scheduler.client.report [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Updating ProviderTree inventory for provider 26c8956b-6742-4951-b566-971b9bbe323b from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768
Dec 15 05:01:14 localhost nova_compute[286344]: 2025-12-15 10:01:14.361 286348 DEBUG nova.compute.provider_tree [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Updating inventory in ProviderTree for provider 26c8956b-6742-4951-b566-971b9bbe323b with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176
Dec 15 05:01:14 localhost nova_compute[286344]: 2025-12-15 10:01:14.380 286348 DEBUG nova.scheduler.client.report [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Refreshing aggregate associations for resource provider 26c8956b-6742-4951-b566-971b9bbe323b, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813
Dec 15 05:01:14 localhost nova_compute[286344]: 2025-12-15 10:01:14.405 286348 DEBUG nova.scheduler.client.report [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Refreshing trait associations for resource provider 26c8956b-6742-4951-b566-971b9bbe323b, traits: COMPUTE_NET_VIF_MODEL_RTL8139,COMPUTE_STORAGE_BUS_IDE,COMPUTE_DEVICE_TAGGING,COMPUTE_NET_VIF_MODEL_PCNET,COMPUTE_NODE,COMPUTE_STORAGE_BUS_USB,COMPUTE_IMAGE_TYPE_AKI,COMPUTE_NET_ATTACH_INTERFACE,HW_CPU_X86_CLMUL,HW_CPU_X86_ABM,HW_CPU_X86_SSE2,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_NET_VIF_MODEL_E1000,HW_CPU_X86_SSE41,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_NET_VIF_MODEL_VIRTIO,HW_CPU_X86_BMI2,HW_CPU_X86_SSE42,HW_CPU_X86_FMA3,COMPUTE_SECURITY_UEFI_SECURE_BOOT,COMPUTE_IMAGE_TYPE_RAW,HW_CPU_X86_BMI,COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_ACCELERATORS,COMPUTE_IMAGE_TYPE_ARI,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_STORAGE_BUS_FDC,COMPUTE_VIOMMU_MODEL_INTEL,COMPUTE_GRAPHICS_MODEL_NONE,HW_CPU_X86_AMD_SVM,COMPUTE_NET_VIF_MODEL_E1000E,HW_CPU_X86_SSE,HW_CPU_X86_AVX,HW_CPU_X86_SVM,COMPUTE_SECURITY_TPM_2_0,COMPUTE_TRUSTED_CERTS,COMPUTE_RESCUE_BFV,HW_CPU_X86_AVX2,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_VIOMMU_MODEL_AUTO,COMPUTE_VIOMMU_MODEL_VIRTIO,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_NET_VIF_MODEL_NE2K_PCI,HW_CPU_X86_SSSE3,COMPUTE_IMAGE_TYPE_AMI,HW_CPU_X86_AESNI,HW_CPU_X86_SSE4A,HW_CPU_X86_MMX,COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_VOLUME_EXTEND,HW_CPU_X86_F16C,HW_CPU_X86_SHA,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_STORAGE_BUS_VIRTIO,COMPUTE_SECURITY_TPM_1_2,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,COMPUTE_STORAGE_BUS_SATA _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825
Dec 15 05:01:14 localhost nova_compute[286344]: 2025-12-15 10:01:14.464 286348 DEBUG nova.network.neutron [-] [instance: 8f5a73ce-a171-4a42-9f60-63c0db9e0a32] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Dec 15 05:01:14 localhost nova_compute[286344]: 2025-12-15 10:01:14.473 286348 DEBUG oslo_concurrency.processutils [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 15 05:01:14 localhost nova_compute[286344]: 2025-12-15 10:01:14.490 286348 INFO nova.compute.manager [-] [instance: 8f5a73ce-a171-4a42-9f60-63c0db9e0a32] Took 2.82 seconds to deallocate network for instance.
Dec 15 05:01:14 localhost nova_compute[286344]: 2025-12-15 10:01:14.538 286348 DEBUG oslo_concurrency.lockutils [None req-f105e559-0d6d-47e1-8926-de823ff3b94e 7eba5d40415d4fd7ab565eed1e07b799 4e26599ab4374eb6b7e9d73e7dfebd03 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 15 05:01:14 localhost ovn_controller[154603]: 2025-12-15T10:01:14Z|00136|binding|INFO|Releasing lport b35254ad-12eb-47bb-92be-44fefe0694f0 from this chassis (sb_readonly=0)
Dec 15 05:01:14 localhost ovn_controller[154603]: 2025-12-15T10:01:14Z|00137|binding|INFO|Releasing lport 1a4411b8-2368-4b17-9d10-08f9c3480350 from this chassis (sb_readonly=0)
Dec 15 05:01:14 localhost ovn_controller[154603]: 2025-12-15T10:01:14Z|00138|binding|INFO|Releasing lport eb478ea1-29fd-4e9a-85ee-b8b88d82f051 from this chassis (sb_readonly=0)
Dec 15 05:01:14 localhost nova_compute[286344]: 2025-12-15 10:01:14.603 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 15 05:01:14 localhost nova_compute[286344]: 2025-12-15 10:01:14.715 286348 DEBUG nova.compute.manager [req-e72a08d4-beb2-4b90-896a-6fd0991d9ee7 req-3b887818-4f6e-45b9-9f7e-08b4899857c4 01c6e0cd81404938a10f65d81ef6d178 2f96e5f995e6442f94f881dabbe4dbf7 - - default default] [instance: 8f5a73ce-a171-4a42-9f60-63c0db9e0a32] Received event network-vif-plugged-19409185-ff01-4d58-bd6e-5d66014dd5c9 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 15 05:01:14 localhost nova_compute[286344]: 2025-12-15 10:01:14.716 286348 DEBUG oslo_concurrency.lockutils [req-e72a08d4-beb2-4b90-896a-6fd0991d9ee7 req-3b887818-4f6e-45b9-9f7e-08b4899857c4 01c6e0cd81404938a10f65d81ef6d178 2f96e5f995e6442f94f881dabbe4dbf7 - - default default] Acquiring lock "8f5a73ce-a171-4a42-9f60-63c0db9e0a32-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 15 05:01:14 localhost nova_compute[286344]: 2025-12-15 10:01:14.717 286348 DEBUG oslo_concurrency.lockutils [req-e72a08d4-beb2-4b90-896a-6fd0991d9ee7 req-3b887818-4f6e-45b9-9f7e-08b4899857c4 01c6e0cd81404938a10f65d81ef6d178 2f96e5f995e6442f94f881dabbe4dbf7 - - default default] Lock "8f5a73ce-a171-4a42-9f60-63c0db9e0a32-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 15 05:01:14 localhost nova_compute[286344]: 2025-12-15 10:01:14.717 286348 DEBUG oslo_concurrency.lockutils [req-e72a08d4-beb2-4b90-896a-6fd0991d9ee7 req-3b887818-4f6e-45b9-9f7e-08b4899857c4 01c6e0cd81404938a10f65d81ef6d178 2f96e5f995e6442f94f881dabbe4dbf7 - - default default] Lock "8f5a73ce-a171-4a42-9f60-63c0db9e0a32-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 15 05:01:14 localhost nova_compute[286344]: 2025-12-15 10:01:14.718 286348 DEBUG nova.compute.manager [req-e72a08d4-beb2-4b90-896a-6fd0991d9ee7 req-3b887818-4f6e-45b9-9f7e-08b4899857c4 01c6e0cd81404938a10f65d81ef6d178 2f96e5f995e6442f94f881dabbe4dbf7 - - default default] [instance: 8f5a73ce-a171-4a42-9f60-63c0db9e0a32] No waiting events found dispatching network-vif-plugged-19409185-ff01-4d58-bd6e-5d66014dd5c9 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Dec 15 05:01:14 localhost nova_compute[286344]: 2025-12-15 10:01:14.718 286348 WARNING nova.compute.manager [req-e72a08d4-beb2-4b90-896a-6fd0991d9ee7 req-3b887818-4f6e-45b9-9f7e-08b4899857c4 01c6e0cd81404938a10f65d81ef6d178 2f96e5f995e6442f94f881dabbe4dbf7 - - default default] [instance: 8f5a73ce-a171-4a42-9f60-63c0db9e0a32] Received unexpected event network-vif-plugged-19409185-ff01-4d58-bd6e-5d66014dd5c9 for instance with vm_state deleted and task_state None.
Dec 15 05:01:14 localhost ceph-mon[298913]: mon.np0005559462@0(leader) e17 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Dec 15 05:01:14 localhost ceph-mon[298913]: log_channel(audit) log [DBG] : from='client.? 172.18.0.106:0/4287963856' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Dec 15 05:01:14 localhost nova_compute[286344]: 2025-12-15 10:01:14.903 286348 DEBUG oslo_concurrency.processutils [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.430s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 15 05:01:14 localhost nova_compute[286344]: 2025-12-15 10:01:14.909 286348 DEBUG nova.compute.provider_tree [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Inventory has not changed in ProviderTree for provider: 26c8956b-6742-4951-b566-971b9bbe323b update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Dec 15 05:01:14 localhost nova_compute[286344]: 2025-12-15 10:01:14.928 286348 DEBUG nova.scheduler.client.report [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Inventory has not changed for provider 26c8956b-6742-4951-b566-971b9bbe323b based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Dec 15 05:01:14 localhost nova_compute[286344]: 2025-12-15 10:01:14.954 286348 DEBUG nova.compute.resource_tracker [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Compute_service record updated for np0005559462.localdomain:np0005559462.localdomain _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Dec 15 05:01:14 localhost nova_compute[286344]: 2025-12-15 10:01:14.955 286348 DEBUG oslo_concurrency.lockutils [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.734s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 15 05:01:14 localhost nova_compute[286344]: 2025-12-15 10:01:14.956 286348 DEBUG oslo_concurrency.lockutils [None req-f105e559-0d6d-47e1-8926-de823ff3b94e 7eba5d40415d4fd7ab565eed1e07b799 4e26599ab4374eb6b7e9d73e7dfebd03 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.418s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 15 05:01:15 localhost nova_compute[286344]: 2025-12-15 10:01:15.046 286348 DEBUG oslo_concurrency.processutils [None req-f105e559-0d6d-47e1-8926-de823ff3b94e 7eba5d40415d4fd7ab565eed1e07b799 4e26599ab4374eb6b7e9d73e7dfebd03 - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute
/usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m Dec 15 05:01:15 localhost ceph-mon[298913]: mon.np0005559462@0(leader).osd e102 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Dec 15 05:01:15 localhost ceph-mon[298913]: mon.np0005559462@0(leader) e17 handle_command mon_command({"prefix": "df", "format": "json"} v 0) Dec 15 05:01:15 localhost ceph-mon[298913]: log_channel(audit) log [DBG] : from='client.? 172.18.0.106:0/329717185' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch Dec 15 05:01:15 localhost nova_compute[286344]: 2025-12-15 10:01:15.513 286348 DEBUG oslo_concurrency.processutils [None req-f105e559-0d6d-47e1-8926-de823ff3b94e 7eba5d40415d4fd7ab565eed1e07b799 4e26599ab4374eb6b7e9d73e7dfebd03 - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.467s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m Dec 15 05:01:15 localhost nova_compute[286344]: 2025-12-15 10:01:15.519 286348 DEBUG nova.compute.provider_tree [None req-f105e559-0d6d-47e1-8926-de823ff3b94e 7eba5d40415d4fd7ab565eed1e07b799 4e26599ab4374eb6b7e9d73e7dfebd03 - - default default] Inventory has not changed in ProviderTree for provider: 26c8956b-6742-4951-b566-971b9bbe323b update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m Dec 15 05:01:15 localhost nova_compute[286344]: 2025-12-15 10:01:15.539 286348 DEBUG nova.scheduler.client.report [None req-f105e559-0d6d-47e1-8926-de823ff3b94e 7eba5d40415d4fd7ab565eed1e07b799 4e26599ab4374eb6b7e9d73e7dfebd03 - - default default] Inventory has not changed for provider 26c8956b-6742-4951-b566-971b9bbe323b based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 
'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m Dec 15 05:01:15 localhost nova_compute[286344]: 2025-12-15 10:01:15.583 286348 DEBUG oslo_concurrency.lockutils [None req-f105e559-0d6d-47e1-8926-de823ff3b94e 7eba5d40415d4fd7ab565eed1e07b799 4e26599ab4374eb6b7e9d73e7dfebd03 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.627s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Dec 15 05:01:15 localhost nova_compute[286344]: 2025-12-15 10:01:15.623 286348 INFO nova.scheduler.client.report [None req-f105e559-0d6d-47e1-8926-de823ff3b94e 7eba5d40415d4fd7ab565eed1e07b799 4e26599ab4374eb6b7e9d73e7dfebd03 - - default default] Deleted allocations for instance 8f5a73ce-a171-4a42-9f60-63c0db9e0a32#033[00m Dec 15 05:01:15 localhost systemd[1]: Started /usr/bin/podman healthcheck run 4d46c406a3855fd1c322e5d86c048188ea5865daa07ba23aaebacc7879668923. Dec 15 05:01:15 localhost systemd[1]: tmp-crun.dtSUhb.mount: Deactivated successfully. 
Dec 15 05:01:15 localhost nova_compute[286344]: 2025-12-15 10:01:15.758 286348 DEBUG oslo_concurrency.lockutils [None req-f105e559-0d6d-47e1-8926-de823ff3b94e 7eba5d40415d4fd7ab565eed1e07b799 4e26599ab4374eb6b7e9d73e7dfebd03 - - default default] Lock "8f5a73ce-a171-4a42-9f60-63c0db9e0a32" "released" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: held 5.122s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Dec 15 05:01:15 localhost podman[315791]: 2025-12-15 10:01:15.765277079 +0000 UTC m=+0.082803082 container health_status 4d46c406a3855fd1c322e5d86c048188ea5865daa07ba23aaebacc7879668923 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '07f3af15d79276418f54a8dde2fc251064527c3571430e14af77aaa2c01f1193-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, container_name=ovn_metadata_agent, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, managed_by=edpm_ansible, org.label-schema.build-date=20251202, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_managed=true) Dec 15 05:01:15 localhost podman[315791]: 2025-12-15 10:01:15.794410548 +0000 UTC m=+0.111936551 container exec_died 4d46c406a3855fd1c322e5d86c048188ea5865daa07ba23aaebacc7879668923 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '07f3af15d79276418f54a8dde2fc251064527c3571430e14af77aaa2c01f1193-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, 
org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image) Dec 15 05:01:15 localhost systemd[1]: 4d46c406a3855fd1c322e5d86c048188ea5865daa07ba23aaebacc7879668923.service: Deactivated successfully. Dec 15 05:01:15 localhost neutron_sriov_agent[260044]: 2025-12-15 10:01:15.864 2 INFO neutron.agent.securitygroups_rpc [None req-32473ef8-eb31-4291-ade2-f31ea1574d32 3d412e2d4a3a4f54b39c55f7a39b768f 054b5bdd4ed44009a8e1940489c96b34 - - default default] Security group rule updated ['1f23d768-bded-4ba1-9b80-f5121af3632b']#033[00m Dec 15 05:01:15 localhost nova_compute[286344]: 2025-12-15 10:01:15.919 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:01:15 localhost nova_compute[286344]: 2025-12-15 10:01:15.957 286348 DEBUG oslo_service.periodic_task [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Dec 15 05:01:16 localhost neutron_sriov_agent[260044]: 2025-12-15 10:01:16.157 2 INFO neutron.agent.securitygroups_rpc [None req-67928b1b-3ed2-4ffb-8d01-d3c224816b57 3d412e2d4a3a4f54b39c55f7a39b768f 054b5bdd4ed44009a8e1940489c96b34 - - default default] Security group rule updated ['1f23d768-bded-4ba1-9b80-f5121af3632b']#033[00m Dec 15 05:01:16 localhost nova_compute[286344]: 2025-12-15 10:01:16.635 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:01:16 localhost nova_compute[286344]: 
2025-12-15 10:01:16.690 286348 INFO nova.compute.manager [None req-3896e173-607d-4857-b33f-eee46c72bcf0 b27b4cc96007415aab89230177c048ab 466123b442f148a6ab70ab607ead6747 - - default default] [instance: 51fca94e-ecb9-4350-bafc-765e827a1c7b] Took 2.66 seconds for pre_live_migration on destination host np0005559463.localdomain.#033[00m Dec 15 05:01:16 localhost nova_compute[286344]: 2025-12-15 10:01:16.690 286348 DEBUG nova.compute.manager [None req-3896e173-607d-4857-b33f-eee46c72bcf0 b27b4cc96007415aab89230177c048ab 466123b442f148a6ab70ab607ead6747 - - default default] [instance: 51fca94e-ecb9-4350-bafc-765e827a1c7b] Instance event wait completed in 0 seconds for wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m Dec 15 05:01:16 localhost nova_compute[286344]: 2025-12-15 10:01:16.713 286348 DEBUG nova.compute.manager [None req-3896e173-607d-4857-b33f-eee46c72bcf0 b27b4cc96007415aab89230177c048ab 466123b442f148a6ab70ab607ead6747 - - default default] live_migration data is LibvirtLiveMigrateData(bdms=[],block_migration=False,disk_available_mb=12288,disk_over_commit=,dst_numa_info=,dst_supports_numa_live_migration=,dst_wants_file_backed_memory=False,file_backed_memory_discard=,filename='tmpc3uv3_wg',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='rbd',instance_relative_path='51fca94e-ecb9-4350-bafc-765e827a1c7b',is_shared_block_storage=True,is_shared_instance_path=False,is_volume_backed=False,migration=Migration(c3174169-6033-440f-9a8d-32658f6e6711),old_vol_attachment_ids={},serial_listen_addr=None,serial_listen_ports=[],src_supports_native_luks=,src_supports_numa_live_migration=,supported_perf_events=[],target_connect_addr=None,vifs=[VIFMigrateData],wait_for_vif_plugged=True) _do_live_migration /usr/lib/python3.9/site-packages/nova/compute/manager.py:8939#033[00m Dec 15 05:01:16 localhost nova_compute[286344]: 2025-12-15 10:01:16.717 286348 DEBUG nova.objects.instance [None 
req-3896e173-607d-4857-b33f-eee46c72bcf0 b27b4cc96007415aab89230177c048ab 466123b442f148a6ab70ab607ead6747 - - default default] Lazy-loading 'migration_context' on Instance uuid 51fca94e-ecb9-4350-bafc-765e827a1c7b obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m Dec 15 05:01:16 localhost nova_compute[286344]: 2025-12-15 10:01:16.720 286348 DEBUG nova.virt.libvirt.driver [None req-3896e173-607d-4857-b33f-eee46c72bcf0 b27b4cc96007415aab89230177c048ab 466123b442f148a6ab70ab607ead6747 - - default default] [instance: 51fca94e-ecb9-4350-bafc-765e827a1c7b] Starting monitoring of live migration _live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10639#033[00m Dec 15 05:01:16 localhost nova_compute[286344]: 2025-12-15 10:01:16.722 286348 DEBUG nova.virt.libvirt.driver [None req-3896e173-607d-4857-b33f-eee46c72bcf0 b27b4cc96007415aab89230177c048ab 466123b442f148a6ab70ab607ead6747 - - default default] [instance: 51fca94e-ecb9-4350-bafc-765e827a1c7b] Operation thread is still running _live_migration_monitor /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10440#033[00m Dec 15 05:01:16 localhost nova_compute[286344]: 2025-12-15 10:01:16.722 286348 DEBUG nova.virt.libvirt.driver [None req-3896e173-607d-4857-b33f-eee46c72bcf0 b27b4cc96007415aab89230177c048ab 466123b442f148a6ab70ab607ead6747 - - default default] [instance: 51fca94e-ecb9-4350-bafc-765e827a1c7b] Migration not running yet _live_migration_monitor /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10449#033[00m Dec 15 05:01:16 localhost nova_compute[286344]: 2025-12-15 10:01:16.739 286348 DEBUG nova.virt.libvirt.vif [None req-3896e173-607d-4857-b33f-eee46c72bcf0 b27b4cc96007415aab89230177c048ab 466123b442f148a6ab70ab607ead6747 - - default default] vif_type=ovs 
instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-12-15T10:00:46Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=,disable_terminate=False,display_description=None,display_name='tempest-LiveAutoBlockMigrationV225Test-server-612817668',ec2_ids=,ephemeral_gb=0,ephemeral_key_uuid=None,fault=,flavor=Flavor(5),hidden=False,host='np0005559462.localdomain',hostname='tempest-liveautoblockmigrationv225test-server-612817668',id=8,image_ref='b48177c8-9d95-4864-913a-a010f9defaa6',info_cache=InstanceInfoCache,instance_type_id=5,kernel_id='',key_data=None,key_name=None,keypairs=,launch_index=0,launched_at=2025-12-15T10:01:08Z,launched_on='np0005559462.localdomain',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='np0005559462.localdomain',numa_topology=None,old_flavor=None,os_type=None,pci_devices=,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='8d2a9ec16aa942ab8315d4057f639915',ramdisk_id='',reservation_id='r-446f3i0r',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='b48177c8-9d95-4864-913a-a010f9defaa6',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-LiveAutoBlockMigrationV225Test-1078555498',owner_user_name='tempest-LiveAutoBlockMigrationV225Test-1078555498-project-member'},tags=,task_state='migrating',terminated_at=None,trusted_certs=,updated_at=2025-12-15T10:01:08Z,user_data=None,user_id='c
79d291546244f6a970ffc157036d797',uuid=51fca94e-ecb9-4350-bafc-765e827a1c7b,vcpu_model=,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "d6b04aa0-3423-4a78-adfc-4bf3151f80ed", "address": "fa:16:3e:98:70:a4", "network": {"id": "576e7e59-adb4-4dcb-9b64-cc166b1e1e5f", "bridge": "br-int", "label": "tempest-LiveAutoBlockMigrationV225Test-1988721240-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "10.100.0.2"}}], "meta": {"injected": false, "tenant_id": "8d2a9ec16aa942ab8315d4057f639915", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tapd6b04aa0-34", "ovs_interfaceid": "d6b04aa0-3423-4a78-adfc-4bf3151f80ed", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m Dec 15 05:01:16 localhost nova_compute[286344]: 2025-12-15 10:01:16.740 286348 DEBUG nova.network.os_vif_util [None req-3896e173-607d-4857-b33f-eee46c72bcf0 b27b4cc96007415aab89230177c048ab 466123b442f148a6ab70ab607ead6747 - - default default] Converting VIF {"id": "d6b04aa0-3423-4a78-adfc-4bf3151f80ed", "address": "fa:16:3e:98:70:a4", "network": {"id": "576e7e59-adb4-4dcb-9b64-cc166b1e1e5f", "bridge": "br-int", "label": "tempest-LiveAutoBlockMigrationV225Test-1988721240-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": 
{}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "10.100.0.2"}}], "meta": {"injected": false, "tenant_id": "8d2a9ec16aa942ab8315d4057f639915", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tapd6b04aa0-34", "ovs_interfaceid": "d6b04aa0-3423-4a78-adfc-4bf3151f80ed", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m Dec 15 05:01:16 localhost nova_compute[286344]: 2025-12-15 10:01:16.741 286348 DEBUG nova.network.os_vif_util [None req-3896e173-607d-4857-b33f-eee46c72bcf0 b27b4cc96007415aab89230177c048ab 466123b442f148a6ab70ab607ead6747 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:98:70:a4,bridge_name='br-int',has_traffic_filtering=True,id=d6b04aa0-3423-4a78-adfc-4bf3151f80ed,network=Network(576e7e59-adb4-4dcb-9b64-cc166b1e1e5f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tapd6b04aa0-34') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m Dec 15 05:01:16 localhost nova_compute[286344]: 2025-12-15 10:01:16.742 286348 DEBUG nova.virt.libvirt.migration [None req-3896e173-607d-4857-b33f-eee46c72bcf0 b27b4cc96007415aab89230177c048ab 466123b442f148a6ab70ab607ead6747 - - default default] [instance: 51fca94e-ecb9-4350-bafc-765e827a1c7b] Updating guest XML with vif config: Dec 15 05:01:16 localhost nova_compute[286344]: Dec 15 05:01:16 localhost nova_compute[286344]: Dec 15 05:01:16 localhost nova_compute[286344]: Dec 15 05:01:16 localhost nova_compute[286344]: Dec 15 05:01:16 localhost nova_compute[286344]: Dec 15 05:01:16 localhost 
nova_compute[286344]: Dec 15 05:01:16 localhost nova_compute[286344]: _update_vif_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:388#033[00m Dec 15 05:01:16 localhost nova_compute[286344]: 2025-12-15 10:01:16.744 286348 DEBUG nova.virt.libvirt.driver [None req-3896e173-607d-4857-b33f-eee46c72bcf0 b27b4cc96007415aab89230177c048ab 466123b442f148a6ab70ab607ead6747 - - default default] [instance: 51fca94e-ecb9-4350-bafc-765e827a1c7b] About to invoke the migrate API _live_migration_operation /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10272#033[00m Dec 15 05:01:16 localhost nova_compute[286344]: 2025-12-15 10:01:16.796 286348 DEBUG nova.compute.manager [req-62c6b339-9d58-473c-afad-5e0ec24cc47b req-ec86cb7c-db0d-4567-b903-fdde0f7e4ef3 01c6e0cd81404938a10f65d81ef6d178 2f96e5f995e6442f94f881dabbe4dbf7 - - default default] [instance: 51fca94e-ecb9-4350-bafc-765e827a1c7b] Received event network-vif-unplugged-d6b04aa0-3423-4a78-adfc-4bf3151f80ed external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m Dec 15 05:01:16 localhost nova_compute[286344]: 2025-12-15 10:01:16.797 286348 DEBUG oslo_concurrency.lockutils [req-62c6b339-9d58-473c-afad-5e0ec24cc47b req-ec86cb7c-db0d-4567-b903-fdde0f7e4ef3 01c6e0cd81404938a10f65d81ef6d178 2f96e5f995e6442f94f881dabbe4dbf7 - - default default] Acquiring lock "51fca94e-ecb9-4350-bafc-765e827a1c7b-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Dec 15 05:01:16 localhost nova_compute[286344]: 2025-12-15 10:01:16.797 286348 DEBUG oslo_concurrency.lockutils [req-62c6b339-9d58-473c-afad-5e0ec24cc47b req-ec86cb7c-db0d-4567-b903-fdde0f7e4ef3 01c6e0cd81404938a10f65d81ef6d178 2f96e5f995e6442f94f881dabbe4dbf7 - - default default] Lock "51fca94e-ecb9-4350-bafc-765e827a1c7b-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: 
waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Dec 15 05:01:16 localhost nova_compute[286344]: 2025-12-15 10:01:16.798 286348 DEBUG oslo_concurrency.lockutils [req-62c6b339-9d58-473c-afad-5e0ec24cc47b req-ec86cb7c-db0d-4567-b903-fdde0f7e4ef3 01c6e0cd81404938a10f65d81ef6d178 2f96e5f995e6442f94f881dabbe4dbf7 - - default default] Lock "51fca94e-ecb9-4350-bafc-765e827a1c7b-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Dec 15 05:01:16 localhost nova_compute[286344]: 2025-12-15 10:01:16.799 286348 DEBUG nova.compute.manager [req-62c6b339-9d58-473c-afad-5e0ec24cc47b req-ec86cb7c-db0d-4567-b903-fdde0f7e4ef3 01c6e0cd81404938a10f65d81ef6d178 2f96e5f995e6442f94f881dabbe4dbf7 - - default default] [instance: 51fca94e-ecb9-4350-bafc-765e827a1c7b] No waiting events found dispatching network-vif-unplugged-d6b04aa0-3423-4a78-adfc-4bf3151f80ed pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m Dec 15 05:01:16 localhost nova_compute[286344]: 2025-12-15 10:01:16.799 286348 DEBUG nova.compute.manager [req-62c6b339-9d58-473c-afad-5e0ec24cc47b req-ec86cb7c-db0d-4567-b903-fdde0f7e4ef3 01c6e0cd81404938a10f65d81ef6d178 2f96e5f995e6442f94f881dabbe4dbf7 - - default default] [instance: 51fca94e-ecb9-4350-bafc-765e827a1c7b] Received event network-vif-unplugged-d6b04aa0-3423-4a78-adfc-4bf3151f80ed for instance with task_state migrating. 
_process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m Dec 15 05:01:16 localhost nova_compute[286344]: 2025-12-15 10:01:16.800 286348 DEBUG nova.compute.manager [req-62c6b339-9d58-473c-afad-5e0ec24cc47b req-ec86cb7c-db0d-4567-b903-fdde0f7e4ef3 01c6e0cd81404938a10f65d81ef6d178 2f96e5f995e6442f94f881dabbe4dbf7 - - default default] [instance: 51fca94e-ecb9-4350-bafc-765e827a1c7b] Received event network-vif-plugged-d6b04aa0-3423-4a78-adfc-4bf3151f80ed external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m Dec 15 05:01:16 localhost nova_compute[286344]: 2025-12-15 10:01:16.800 286348 DEBUG oslo_concurrency.lockutils [req-62c6b339-9d58-473c-afad-5e0ec24cc47b req-ec86cb7c-db0d-4567-b903-fdde0f7e4ef3 01c6e0cd81404938a10f65d81ef6d178 2f96e5f995e6442f94f881dabbe4dbf7 - - default default] Acquiring lock "51fca94e-ecb9-4350-bafc-765e827a1c7b-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Dec 15 05:01:16 localhost nova_compute[286344]: 2025-12-15 10:01:16.801 286348 DEBUG oslo_concurrency.lockutils [req-62c6b339-9d58-473c-afad-5e0ec24cc47b req-ec86cb7c-db0d-4567-b903-fdde0f7e4ef3 01c6e0cd81404938a10f65d81ef6d178 2f96e5f995e6442f94f881dabbe4dbf7 - - default default] Lock "51fca94e-ecb9-4350-bafc-765e827a1c7b-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Dec 15 05:01:16 localhost nova_compute[286344]: 2025-12-15 10:01:16.801 286348 DEBUG oslo_concurrency.lockutils [req-62c6b339-9d58-473c-afad-5e0ec24cc47b req-ec86cb7c-db0d-4567-b903-fdde0f7e4ef3 01c6e0cd81404938a10f65d81ef6d178 2f96e5f995e6442f94f881dabbe4dbf7 - - default default] Lock "51fca94e-ecb9-4350-bafc-765e827a1c7b-events" "released" by 
"nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Dec 15 05:01:16 localhost nova_compute[286344]: 2025-12-15 10:01:16.802 286348 DEBUG nova.compute.manager [req-62c6b339-9d58-473c-afad-5e0ec24cc47b req-ec86cb7c-db0d-4567-b903-fdde0f7e4ef3 01c6e0cd81404938a10f65d81ef6d178 2f96e5f995e6442f94f881dabbe4dbf7 - - default default] [instance: 51fca94e-ecb9-4350-bafc-765e827a1c7b] No waiting events found dispatching network-vif-plugged-d6b04aa0-3423-4a78-adfc-4bf3151f80ed pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m Dec 15 05:01:16 localhost nova_compute[286344]: 2025-12-15 10:01:16.803 286348 WARNING nova.compute.manager [req-62c6b339-9d58-473c-afad-5e0ec24cc47b req-ec86cb7c-db0d-4567-b903-fdde0f7e4ef3 01c6e0cd81404938a10f65d81ef6d178 2f96e5f995e6442f94f881dabbe4dbf7 - - default default] [instance: 51fca94e-ecb9-4350-bafc-765e827a1c7b] Received unexpected event network-vif-plugged-d6b04aa0-3423-4a78-adfc-4bf3151f80ed for instance with vm_state active and task_state migrating.#033[00m Dec 15 05:01:16 localhost nova_compute[286344]: 2025-12-15 10:01:16.804 286348 DEBUG nova.compute.manager [req-62c6b339-9d58-473c-afad-5e0ec24cc47b req-ec86cb7c-db0d-4567-b903-fdde0f7e4ef3 01c6e0cd81404938a10f65d81ef6d178 2f96e5f995e6442f94f881dabbe4dbf7 - - default default] [instance: 51fca94e-ecb9-4350-bafc-765e827a1c7b] Received event network-changed-d6b04aa0-3423-4a78-adfc-4bf3151f80ed external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m Dec 15 05:01:16 localhost nova_compute[286344]: 2025-12-15 10:01:16.804 286348 DEBUG nova.compute.manager [req-62c6b339-9d58-473c-afad-5e0ec24cc47b req-ec86cb7c-db0d-4567-b903-fdde0f7e4ef3 01c6e0cd81404938a10f65d81ef6d178 2f96e5f995e6442f94f881dabbe4dbf7 - - default default] [instance: 51fca94e-ecb9-4350-bafc-765e827a1c7b] Refreshing instance network info 
cache due to event network-changed-d6b04aa0-3423-4a78-adfc-4bf3151f80ed. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m Dec 15 05:01:16 localhost nova_compute[286344]: 2025-12-15 10:01:16.805 286348 DEBUG oslo_concurrency.lockutils [req-62c6b339-9d58-473c-afad-5e0ec24cc47b req-ec86cb7c-db0d-4567-b903-fdde0f7e4ef3 01c6e0cd81404938a10f65d81ef6d178 2f96e5f995e6442f94f881dabbe4dbf7 - - default default] Acquiring lock "refresh_cache-51fca94e-ecb9-4350-bafc-765e827a1c7b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m Dec 15 05:01:16 localhost nova_compute[286344]: 2025-12-15 10:01:16.806 286348 DEBUG oslo_concurrency.lockutils [req-62c6b339-9d58-473c-afad-5e0ec24cc47b req-ec86cb7c-db0d-4567-b903-fdde0f7e4ef3 01c6e0cd81404938a10f65d81ef6d178 2f96e5f995e6442f94f881dabbe4dbf7 - - default default] Acquired lock "refresh_cache-51fca94e-ecb9-4350-bafc-765e827a1c7b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m Dec 15 05:01:16 localhost nova_compute[286344]: 2025-12-15 10:01:16.807 286348 DEBUG nova.network.neutron [req-62c6b339-9d58-473c-afad-5e0ec24cc47b req-ec86cb7c-db0d-4567-b903-fdde0f7e4ef3 01c6e0cd81404938a10f65d81ef6d178 2f96e5f995e6442f94f881dabbe4dbf7 - - default default] [instance: 51fca94e-ecb9-4350-bafc-765e827a1c7b] Refreshing network info cache for port d6b04aa0-3423-4a78-adfc-4bf3151f80ed _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m Dec 15 05:01:17 localhost neutron_sriov_agent[260044]: 2025-12-15 10:01:17.175 2 INFO neutron.agent.securitygroups_rpc [None req-e7882f94-34ce-4687-86ba-a1c9e6dc1bf6 7eba5d40415d4fd7ab565eed1e07b799 4e26599ab4374eb6b7e9d73e7dfebd03 - - default default] Security group member updated ['489315f5-1e14-49f6-8195-de837d23b935']#033[00m Dec 15 05:01:17 localhost nova_compute[286344]: 2025-12-15 10:01:17.226 286348 DEBUG nova.virt.libvirt.migration [None 
req-3896e173-607d-4857-b33f-eee46c72bcf0 b27b4cc96007415aab89230177c048ab 466123b442f148a6ab70ab607ead6747 - - default default] [instance: 51fca94e-ecb9-4350-bafc-765e827a1c7b] Current None elapsed 0 steps [(0, 50), (150, 95), (300, 140), (450, 185), (600, 230), (750, 275), (900, 320), (1050, 365), (1200, 410), (1350, 455), (1500, 500)] update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:512#033[00m Dec 15 05:01:17 localhost nova_compute[286344]: 2025-12-15 10:01:17.227 286348 INFO nova.virt.libvirt.migration [None req-3896e173-607d-4857-b33f-eee46c72bcf0 b27b4cc96007415aab89230177c048ab 466123b442f148a6ab70ab607ead6747 - - default default] [instance: 51fca94e-ecb9-4350-bafc-765e827a1c7b] Increasing downtime to 50 ms after 0 sec elapsed time#033[00m Dec 15 05:01:17 localhost nova_compute[286344]: 2025-12-15 10:01:17.321 286348 INFO nova.virt.libvirt.driver [None req-3896e173-607d-4857-b33f-eee46c72bcf0 b27b4cc96007415aab89230177c048ab 466123b442f148a6ab70ab607ead6747 - - default default] [instance: 51fca94e-ecb9-4350-bafc-765e827a1c7b] Migration running for 0 secs, memory 100% remaining (bytes processed=0, remaining=0, total=0); disk 100% remaining (bytes processed=0, remaining=0, total=0).#033[00m Dec 15 05:01:17 localhost dnsmasq[314076]: read /var/lib/neutron/dhcp/0874a63c-034d-48db-8f84-a51c1fe90687/addn_hosts - 0 addresses Dec 15 05:01:17 localhost dnsmasq-dhcp[314076]: read /var/lib/neutron/dhcp/0874a63c-034d-48db-8f84-a51c1fe90687/host Dec 15 05:01:17 localhost podman[315824]: 2025-12-15 10:01:17.455592496 +0000 UTC m=+0.074165210 container kill df71c348ebc76afefa8737120d7d97d99f59bee032910fea1a873136e0df546b (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-0874a63c-034d-48db-8f84-a51c1fe90687, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.build-date=20251202, 
tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS) Dec 15 05:01:17 localhost dnsmasq-dhcp[314076]: read /var/lib/neutron/dhcp/0874a63c-034d-48db-8f84-a51c1fe90687/opts Dec 15 05:01:17 localhost nova_compute[286344]: 2025-12-15 10:01:17.667 286348 DEBUG nova.network.neutron [req-62c6b339-9d58-473c-afad-5e0ec24cc47b req-ec86cb7c-db0d-4567-b903-fdde0f7e4ef3 01c6e0cd81404938a10f65d81ef6d178 2f96e5f995e6442f94f881dabbe4dbf7 - - default default] [instance: 51fca94e-ecb9-4350-bafc-765e827a1c7b] Updated VIF entry in instance network info cache for port d6b04aa0-3423-4a78-adfc-4bf3151f80ed. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m Dec 15 05:01:17 localhost nova_compute[286344]: 2025-12-15 10:01:17.668 286348 DEBUG nova.network.neutron [req-62c6b339-9d58-473c-afad-5e0ec24cc47b req-ec86cb7c-db0d-4567-b903-fdde0f7e4ef3 01c6e0cd81404938a10f65d81ef6d178 2f96e5f995e6442f94f881dabbe4dbf7 - - default default] [instance: 51fca94e-ecb9-4350-bafc-765e827a1c7b] Updating instance_info_cache with network_info: [{"id": "d6b04aa0-3423-4a78-adfc-4bf3151f80ed", "address": "fa:16:3e:98:70:a4", "network": {"id": "576e7e59-adb4-4dcb-9b64-cc166b1e1e5f", "bridge": "br-int", "label": "tempest-LiveAutoBlockMigrationV225Test-1988721240-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "10.100.0.2"}}], "meta": {"injected": false, "tenant_id": "8d2a9ec16aa942ab8315d4057f639915", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", 
"bound_drivers": {"0": "ovn"}}, "devname": "tapd6b04aa0-34", "ovs_interfaceid": "d6b04aa0-3423-4a78-adfc-4bf3151f80ed", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"migrating_to": "np0005559463.localdomain"}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m Dec 15 05:01:17 localhost nova_compute[286344]: 2025-12-15 10:01:17.840 286348 DEBUG nova.virt.libvirt.migration [None req-3896e173-607d-4857-b33f-eee46c72bcf0 b27b4cc96007415aab89230177c048ab 466123b442f148a6ab70ab607ead6747 - - default default] [instance: 51fca94e-ecb9-4350-bafc-765e827a1c7b] Current 50 elapsed 1 steps [(0, 50), (150, 95), (300, 140), (450, 185), (600, 230), (750, 275), (900, 320), (1050, 365), (1200, 410), (1350, 455), (1500, 500)] update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:512#033[00m Dec 15 05:01:17 localhost nova_compute[286344]: 2025-12-15 10:01:17.842 286348 DEBUG nova.virt.libvirt.migration [None req-3896e173-607d-4857-b33f-eee46c72bcf0 b27b4cc96007415aab89230177c048ab 466123b442f148a6ab70ab607ead6747 - - default default] [instance: 51fca94e-ecb9-4350-bafc-765e827a1c7b] Downtime does not need to change update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:525#033[00m Dec 15 05:01:17 localhost ceph-mon[298913]: mon.np0005559462@0(leader).osd e102 do_prune osdmap full prune enabled Dec 15 05:01:17 localhost ceph-mon[298913]: mon.np0005559462@0(leader).osd e103 e103: 6 total, 6 up, 6 in Dec 15 05:01:17 localhost ceph-mon[298913]: log_channel(cluster) log [DBG] : osdmap e103: 6 total, 6 up, 6 in Dec 15 05:01:17 localhost dnsmasq[314076]: exiting on receipt of SIGTERM Dec 15 05:01:17 localhost podman[315863]: 2025-12-15 10:01:17.900413544 +0000 UTC m=+0.077556043 container kill df71c348ebc76afefa8737120d7d97d99f59bee032910fea1a873136e0df546b 
(image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-0874a63c-034d-48db-8f84-a51c1fe90687, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0) Dec 15 05:01:17 localhost systemd[1]: libpod-df71c348ebc76afefa8737120d7d97d99f59bee032910fea1a873136e0df546b.scope: Deactivated successfully. Dec 15 05:01:17 localhost podman[315879]: 2025-12-15 10:01:17.98338789 +0000 UTC m=+0.068620103 container died df71c348ebc76afefa8737120d7d97d99f59bee032910fea1a873136e0df546b (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-0874a63c-034d-48db-8f84-a51c1fe90687, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2) Dec 15 05:01:18 localhost podman[315879]: 2025-12-15 10:01:18.014579378 +0000 UTC m=+0.099811541 container cleanup df71c348ebc76afefa8737120d7d97d99f59bee032910fea1a873136e0df546b (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-0874a63c-034d-48db-8f84-a51c1fe90687, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2) Dec 15 05:01:18 localhost systemd[1]: 
libpod-conmon-df71c348ebc76afefa8737120d7d97d99f59bee032910fea1a873136e0df546b.scope: Deactivated successfully. Dec 15 05:01:18 localhost nova_compute[286344]: 2025-12-15 10:01:18.027 286348 DEBUG oslo_concurrency.lockutils [req-62c6b339-9d58-473c-afad-5e0ec24cc47b req-ec86cb7c-db0d-4567-b903-fdde0f7e4ef3 01c6e0cd81404938a10f65d81ef6d178 2f96e5f995e6442f94f881dabbe4dbf7 - - default default] Releasing lock "refresh_cache-51fca94e-ecb9-4350-bafc-765e827a1c7b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m Dec 15 05:01:18 localhost podman[315881]: 2025-12-15 10:01:18.081799336 +0000 UTC m=+0.154688425 container remove df71c348ebc76afefa8737120d7d97d99f59bee032910fea1a873136e0df546b (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-0874a63c-034d-48db-8f84-a51c1fe90687, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.license=GPLv2) Dec 15 05:01:18 localhost ovn_controller[154603]: 2025-12-15T10:01:18Z|00139|binding|INFO|Releasing lport 41381e1a-bb74-4360-9cd9-17d27f7371d9 from this chassis (sb_readonly=0) Dec 15 05:01:18 localhost kernel: device tap41381e1a-bb left promiscuous mode Dec 15 05:01:18 localhost nova_compute[286344]: 2025-12-15 10:01:18.096 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:01:18 localhost ovn_controller[154603]: 2025-12-15T10:01:18Z|00140|binding|INFO|Setting lport 41381e1a-bb74-4360-9cd9-17d27f7371d9 down in Southbound Dec 15 05:01:18 localhost ovn_metadata_agent[160585]: 2025-12-15 10:01:18.109 160590 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: 
PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005559462.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '19.80.0.2/24', 'neutron:device_id': 'dhcpce287b4b-a043-5ed9-8e08-4fd555d768a2-0874a63c-034d-48db-8f84-a51c1fe90687', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-0874a63c-034d-48db-8f84-a51c1fe90687', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '4e26599ab4374eb6b7e9d73e7dfebd03', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'np0005559462.localdomain'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=d3aba774-abb1-4ddc-beaa-cd88849a49e2, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=41381e1a-bb74-4360-9cd9-17d27f7371d9) old=Port_Binding(up=[True], chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Dec 15 05:01:18 localhost ovn_metadata_agent[160585]: 2025-12-15 10:01:18.112 160590 INFO neutron.agent.ovn.metadata.agent [-] Port 41381e1a-bb74-4360-9cd9-17d27f7371d9 in datapath 0874a63c-034d-48db-8f84-a51c1fe90687 unbound from our chassis#033[00m Dec 15 05:01:18 localhost ovn_metadata_agent[160585]: 2025-12-15 10:01:18.116 160590 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 0874a63c-034d-48db-8f84-a51c1fe90687, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m Dec 15 05:01:18 localhost ovn_metadata_agent[160585]: 
2025-12-15 10:01:18.117 160858 DEBUG oslo.privsep.daemon [-] privsep: reply[4c0cfb06-5771-4dea-bb2f-1f63fdde9b82]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Dec 15 05:01:18 localhost nova_compute[286344]: 2025-12-15 10:01:18.123 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:01:18 localhost neutron_dhcp_agent[267542]: 2025-12-15 10:01:18.153 267546 INFO neutron.agent.dhcp.agent [None req-9a5e6c03-18e3-41a0-ad66-5b1d8e10cbd5 - - - - - -] Network not present, action: clean_devices, action_kwargs: {}#033[00m Dec 15 05:01:18 localhost nova_compute[286344]: 2025-12-15 10:01:18.346 286348 DEBUG nova.virt.libvirt.migration [None req-3896e173-607d-4857-b33f-eee46c72bcf0 b27b4cc96007415aab89230177c048ab 466123b442f148a6ab70ab607ead6747 - - default default] [instance: 51fca94e-ecb9-4350-bafc-765e827a1c7b] Current 50 elapsed 1 steps [(0, 50), (150, 95), (300, 140), (450, 185), (600, 230), (750, 275), (900, 320), (1050, 365), (1200, 410), (1350, 455), (1500, 500)] update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:512#033[00m Dec 15 05:01:18 localhost nova_compute[286344]: 2025-12-15 10:01:18.347 286348 DEBUG nova.virt.libvirt.migration [None req-3896e173-607d-4857-b33f-eee46c72bcf0 b27b4cc96007415aab89230177c048ab 466123b442f148a6ab70ab607ead6747 - - default default] [instance: 51fca94e-ecb9-4350-bafc-765e827a1c7b] Downtime does not need to change update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:525#033[00m Dec 15 05:01:18 localhost systemd[1]: var-lib-containers-storage-overlay-ec10130acb2b32acd3d8f8f14a6a8a9c76c0a42c2f2b6b52646e3692b0ed4c53-merged.mount: Deactivated successfully. 
Dec 15 05:01:18 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-df71c348ebc76afefa8737120d7d97d99f59bee032910fea1a873136e0df546b-userdata-shm.mount: Deactivated successfully. Dec 15 05:01:18 localhost systemd[1]: run-netns-qdhcp\x2d0874a63c\x2d034d\x2d48db\x2d8f84\x2da51c1fe90687.mount: Deactivated successfully. Dec 15 05:01:18 localhost nova_compute[286344]: 2025-12-15 10:01:18.498 286348 DEBUG nova.virt.driver [None req-ed090d16-496a-473a-be36-97cf3cf6502a - - - - - -] Emitting event Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m Dec 15 05:01:18 localhost nova_compute[286344]: 2025-12-15 10:01:18.499 286348 INFO nova.compute.manager [None req-ed090d16-496a-473a-be36-97cf3cf6502a - - - - - -] [instance: 51fca94e-ecb9-4350-bafc-765e827a1c7b] VM Paused (Lifecycle Event)#033[00m Dec 15 05:01:18 localhost nova_compute[286344]: 2025-12-15 10:01:18.520 286348 DEBUG nova.compute.manager [None req-ed090d16-496a-473a-be36-97cf3cf6502a - - - - - -] [instance: 51fca94e-ecb9-4350-bafc-765e827a1c7b] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m Dec 15 05:01:18 localhost nova_compute[286344]: 2025-12-15 10:01:18.524 286348 DEBUG nova.compute.manager [None req-ed090d16-496a-473a-be36-97cf3cf6502a - - - - - -] [instance: 51fca94e-ecb9-4350-bafc-765e827a1c7b] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: active, current task_state: migrating, current DB power_state: 1, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m Dec 15 05:01:18 localhost nova_compute[286344]: 2025-12-15 10:01:18.556 286348 INFO nova.compute.manager [None req-ed090d16-496a-473a-be36-97cf3cf6502a - - - - - -] [instance: 51fca94e-ecb9-4350-bafc-765e827a1c7b] During sync_power_state the instance has a pending task (migrating). 
Skip.#033[00m Dec 15 05:01:18 localhost kernel: device tapd6b04aa0-34 left promiscuous mode Dec 15 05:01:18 localhost NetworkManager[5963]: [1765792878.6428] device (tapd6b04aa0-34): state change: disconnected -> unmanaged (reason 'unmanaged', sys-iface-state: 'removed') Dec 15 05:01:18 localhost nova_compute[286344]: 2025-12-15 10:01:18.651 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:01:18 localhost ovn_controller[154603]: 2025-12-15T10:01:18Z|00141|binding|INFO|Releasing lport d6b04aa0-3423-4a78-adfc-4bf3151f80ed from this chassis (sb_readonly=0) Dec 15 05:01:18 localhost ovn_controller[154603]: 2025-12-15T10:01:18Z|00142|binding|INFO|Setting lport d6b04aa0-3423-4a78-adfc-4bf3151f80ed down in Southbound Dec 15 05:01:18 localhost ovn_controller[154603]: 2025-12-15T10:01:18Z|00143|binding|INFO|Releasing lport d92e338f-e5ff-4170-8303-27f41ee35ef3 from this chassis (sb_readonly=0) Dec 15 05:01:18 localhost ovn_controller[154603]: 2025-12-15T10:01:18Z|00144|binding|INFO|Setting lport d92e338f-e5ff-4170-8303-27f41ee35ef3 down in Southbound Dec 15 05:01:18 localhost ovn_controller[154603]: 2025-12-15T10:01:18Z|00145|binding|INFO|Removing iface tapd6b04aa0-34 ovn-installed in OVS Dec 15 05:01:18 localhost nova_compute[286344]: 2025-12-15 10:01:18.654 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:01:18 localhost neutron_dhcp_agent[267542]: 2025-12-15 10:01:18.667 267546 INFO neutron.agent.dhcp.agent [-] Network not present, action: clean_devices, action_kwargs: {}#033[00m Dec 15 05:01:18 localhost ovn_controller[154603]: 2025-12-15T10:01:18Z|00146|binding|INFO|Releasing lport b35254ad-12eb-47bb-92be-44fefe0694f0 from this chassis (sb_readonly=0) Dec 15 05:01:18 localhost ovn_controller[154603]: 2025-12-15T10:01:18Z|00147|binding|INFO|Releasing lport 
1a4411b8-2368-4b17-9d10-08f9c3480350 from this chassis (sb_readonly=0) Dec 15 05:01:18 localhost ovn_controller[154603]: 2025-12-15T10:01:18Z|00148|binding|INFO|Releasing lport eb478ea1-29fd-4e9a-85ee-b8b88d82f051 from this chassis (sb_readonly=0) Dec 15 05:01:18 localhost ovn_metadata_agent[160585]: 2025-12-15 10:01:18.667 160590 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:98:70:a4 10.100.0.13'], port_security=['fa:16:3e:98:70:a4 10.100.0.13'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005559462.localdomain,np0005559463.localdomain', 'activation-strategy': 'rarp', 'additional-chassis-activated': '76226714-9b28-461f-943c-7bdb4833176e'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'name': 'tempest-parent-580979727', 'neutron:cidrs': '10.100.0.13/28', 'neutron:device_id': '51fca94e-ecb9-4350-bafc-765e827a1c7b', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-576e7e59-adb4-4dcb-9b64-cc166b1e1e5f', 'neutron:port_capabilities': '', 'neutron:port_name': 'tempest-parent-580979727', 'neutron:project_id': '8d2a9ec16aa942ab8315d4057f639915', 'neutron:revision_number': '8', 'neutron:security_group_ids': '48dda613-ea3d-4053-a6ae-35f8ce5dfd37', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'np0005559462.localdomain'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=22fe063e-3457-4082-83f2-544c43df7165, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[], logical_port=d6b04aa0-3423-4a78-adfc-4bf3151f80ed) old=Port_Binding(up=[True], chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Dec 15 05:01:18 
localhost ovn_metadata_agent[160585]: 2025-12-15 10:01:18.670 160590 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:75:03:44 19.80.0.5'], port_security=['fa:16:3e:75:03:44 19.80.0.5'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': ''}, parent_port=['d6b04aa0-3423-4a78-adfc-4bf3151f80ed'], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'name': 'tempest-subport-1273927', 'neutron:cidrs': '19.80.0.5/24', 'neutron:device_id': '', 'neutron:device_owner': 'trunk:subport', 'neutron:mtu': '', 'neutron:network_name': 'neutron-594a8673-651b-4566-92ec-8dbe6ca00b60', 'neutron:port_capabilities': '', 'neutron:port_name': 'tempest-subport-1273927', 'neutron:project_id': '8d2a9ec16aa942ab8315d4057f639915', 'neutron:revision_number': '3', 'neutron:security_group_ids': '48dda613-ea3d-4053-a6ae-35f8ce5dfd37', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[42], additional_encap=[], encap=[], mirror_rules=[], datapath=8bd87d03-8621-4b45-a769-2f1ac086eff3, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[], logical_port=d92e338f-e5ff-4170-8303-27f41ee35ef3) old=Port_Binding(up=[True], chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Dec 15 05:01:18 localhost ovn_metadata_agent[160585]: 2025-12-15 10:01:18.672 160590 INFO neutron.agent.ovn.metadata.agent [-] Port d6b04aa0-3423-4a78-adfc-4bf3151f80ed in datapath 576e7e59-adb4-4dcb-9b64-cc166b1e1e5f unbound from our chassis#033[00m Dec 15 05:01:18 localhost ovn_metadata_agent[160585]: 2025-12-15 10:01:18.676 160590 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 576e7e59-adb4-4dcb-9b64-cc166b1e1e5f, tearing the namespace down if 
needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m Dec 15 05:01:18 localhost ovn_metadata_agent[160585]: 2025-12-15 10:01:18.677 160858 DEBUG oslo.privsep.daemon [-] privsep: reply[49526a54-eb10-4f52-bfcb-e777bc810234]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Dec 15 05:01:18 localhost ovn_metadata_agent[160585]: 2025-12-15 10:01:18.678 160590 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-576e7e59-adb4-4dcb-9b64-cc166b1e1e5f namespace which is not needed anymore#033[00m Dec 15 05:01:18 localhost systemd[1]: machine-qemu\x2d4\x2dinstance\x2d00000008.scope: Deactivated successfully. Dec 15 05:01:18 localhost systemd[1]: machine-qemu\x2d4\x2dinstance\x2d00000008.scope: Consumed 10.092s CPU time. Dec 15 05:01:18 localhost systemd-machined[84011]: Machine qemu-4-instance-00000008 terminated. Dec 15 05:01:18 localhost nova_compute[286344]: 2025-12-15 10:01:18.693 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:01:18 localhost nova_compute[286344]: 2025-12-15 10:01:18.707 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:01:18 localhost journal[204381]: Unable to get XATTR trusted.libvirt.security.ref_selinux on vms/51fca94e-ecb9-4350-bafc-765e827a1c7b_disk: No such file or directory Dec 15 05:01:18 localhost journal[204381]: Unable to get XATTR trusted.libvirt.security.ref_dac on vms/51fca94e-ecb9-4350-bafc-765e827a1c7b_disk: No such file or directory Dec 15 05:01:18 localhost nova_compute[286344]: 2025-12-15 10:01:18.844 286348 DEBUG nova.virt.libvirt.driver [None req-3896e173-607d-4857-b33f-eee46c72bcf0 b27b4cc96007415aab89230177c048ab 466123b442f148a6ab70ab607ead6747 - - default default] [instance: 51fca94e-ecb9-4350-bafc-765e827a1c7b] Migrate API has 
completed _live_migration_operation /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10279#033[00m Dec 15 05:01:18 localhost nova_compute[286344]: 2025-12-15 10:01:18.845 286348 DEBUG nova.virt.libvirt.driver [None req-3896e173-607d-4857-b33f-eee46c72bcf0 b27b4cc96007415aab89230177c048ab 466123b442f148a6ab70ab607ead6747 - - default default] [instance: 51fca94e-ecb9-4350-bafc-765e827a1c7b] Migration operation thread has finished _live_migration_operation /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10327#033[00m Dec 15 05:01:18 localhost nova_compute[286344]: 2025-12-15 10:01:18.845 286348 DEBUG nova.virt.libvirt.driver [None req-3896e173-607d-4857-b33f-eee46c72bcf0 b27b4cc96007415aab89230177c048ab 466123b442f148a6ab70ab607ead6747 - - default default] [instance: 51fca94e-ecb9-4350-bafc-765e827a1c7b] Migration operation thread notification thread_finished /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10630#033[00m Dec 15 05:01:18 localhost nova_compute[286344]: 2025-12-15 10:01:18.852 286348 DEBUG nova.virt.libvirt.guest [None req-3896e173-607d-4857-b33f-eee46c72bcf0 b27b4cc96007415aab89230177c048ab 466123b442f148a6ab70ab607ead6747 - - default default] Domain has shutdown/gone away: Domain not found: no domain with matching uuid '51fca94e-ecb9-4350-bafc-765e827a1c7b' (instance-00000008) get_job_info /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:688#033[00m Dec 15 05:01:18 localhost nova_compute[286344]: 2025-12-15 10:01:18.853 286348 INFO nova.virt.libvirt.driver [None req-3896e173-607d-4857-b33f-eee46c72bcf0 b27b4cc96007415aab89230177c048ab 466123b442f148a6ab70ab607ead6747 - - default default] [instance: 51fca94e-ecb9-4350-bafc-765e827a1c7b] Migration operation has completed#033[00m Dec 15 05:01:18 localhost nova_compute[286344]: 2025-12-15 10:01:18.854 286348 INFO nova.compute.manager [None req-3896e173-607d-4857-b33f-eee46c72bcf0 b27b4cc96007415aab89230177c048ab 466123b442f148a6ab70ab607ead6747 - - default 
default] [instance: 51fca94e-ecb9-4350-bafc-765e827a1c7b] _post_live_migration() is started..#033[00m Dec 15 05:01:18 localhost systemd[1]: Started /usr/bin/podman healthcheck run a4e94bd7aaee6c174c601060fb5f34559019ccea93fc397263ee3735731cfc1e. Dec 15 05:01:18 localhost neutron-haproxy-ovnmeta-576e7e59-adb4-4dcb-9b64-cc166b1e1e5f[315059]: [NOTICE] (315063) : haproxy version is 2.8.14-c23fe91 Dec 15 05:01:18 localhost neutron-haproxy-ovnmeta-576e7e59-adb4-4dcb-9b64-cc166b1e1e5f[315059]: [NOTICE] (315063) : path to executable is /usr/sbin/haproxy Dec 15 05:01:18 localhost neutron-haproxy-ovnmeta-576e7e59-adb4-4dcb-9b64-cc166b1e1e5f[315059]: [WARNING] (315063) : Exiting Master process... Dec 15 05:01:18 localhost neutron-haproxy-ovnmeta-576e7e59-adb4-4dcb-9b64-cc166b1e1e5f[315059]: [ALERT] (315063) : Current worker (315065) exited with code 143 (Terminated) Dec 15 05:01:18 localhost neutron-haproxy-ovnmeta-576e7e59-adb4-4dcb-9b64-cc166b1e1e5f[315059]: [WARNING] (315063) : All workers exited. Exiting... (0) Dec 15 05:01:18 localhost systemd[1]: libpod-334c8a6eb0abcc8decf6d5ba63ae90dfeaef71d3014d8a42d417306392b53a6f.scope: Deactivated successfully. 
Dec 15 05:01:18 localhost podman[315928]: 2025-12-15 10:01:18.882834966 +0000 UTC m=+0.083772866 container died 334c8a6eb0abcc8decf6d5ba63ae90dfeaef71d3014d8a42d417306392b53a6f (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-576e7e59-adb4-4dcb-9b64-cc166b1e1e5f, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb) Dec 15 05:01:18 localhost podman[315928]: 2025-12-15 10:01:18.915568844 +0000 UTC m=+0.116506714 container cleanup 334c8a6eb0abcc8decf6d5ba63ae90dfeaef71d3014d8a42d417306392b53a6f (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-576e7e59-adb4-4dcb-9b64-cc166b1e1e5f, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, io.buildah.version=1.41.3, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.license=GPLv2) Dec 15 05:01:18 localhost podman[315957]: 2025-12-15 10:01:18.968694254 +0000 UTC m=+0.065583229 container cleanup 334c8a6eb0abcc8decf6d5ba63ae90dfeaef71d3014d8a42d417306392b53a6f (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-576e7e59-adb4-4dcb-9b64-cc166b1e1e5f, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, 
org.label-schema.build-date=20251202) Dec 15 05:01:18 localhost systemd[1]: libpod-conmon-334c8a6eb0abcc8decf6d5ba63ae90dfeaef71d3014d8a42d417306392b53a6f.scope: Deactivated successfully. Dec 15 05:01:19 localhost podman[315975]: 2025-12-15 10:01:19.020199733 +0000 UTC m=+0.079537261 container remove 334c8a6eb0abcc8decf6d5ba63ae90dfeaef71d3014d8a42d417306392b53a6f (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-576e7e59-adb4-4dcb-9b64-cc166b1e1e5f, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.build-date=20251202, io.buildah.version=1.41.3) Dec 15 05:01:19 localhost ovn_metadata_agent[160585]: 2025-12-15 10:01:19.030 160858 DEBUG oslo.privsep.daemon [-] privsep: reply[65d00bc5-8dcc-4b03-836f-121193bd2531]: (4, ('Mon Dec 15 10:01:18 AM UTC 2025 Stopping container neutron-haproxy-ovnmeta-576e7e59-adb4-4dcb-9b64-cc166b1e1e5f (334c8a6eb0abcc8decf6d5ba63ae90dfeaef71d3014d8a42d417306392b53a6f)\n334c8a6eb0abcc8decf6d5ba63ae90dfeaef71d3014d8a42d417306392b53a6f\nMon Dec 15 10:01:18 AM UTC 2025 Deleting container neutron-haproxy-ovnmeta-576e7e59-adb4-4dcb-9b64-cc166b1e1e5f (334c8a6eb0abcc8decf6d5ba63ae90dfeaef71d3014d8a42d417306392b53a6f)\n334c8a6eb0abcc8decf6d5ba63ae90dfeaef71d3014d8a42d417306392b53a6f\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Dec 15 05:01:19 localhost ovn_metadata_agent[160585]: 2025-12-15 10:01:19.033 160858 DEBUG oslo.privsep.daemon [-] privsep: reply[0534309d-91d3-47d4-9f20-0559d8226a65]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Dec 15 05:01:19 localhost podman[315950]: 2025-12-15 10:01:19.034293361 +0000 UTC m=+0.155011893 container health_status 
a4e94bd7aaee6c174c601060fb5f34559019ccea93fc397263ee3735731cfc1e (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, container_name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter) Dec 15 05:01:19 localhost ovn_metadata_agent[160585]: 2025-12-15 10:01:19.035 160590 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap576e7e59-a0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m Dec 15 05:01:19 localhost nova_compute[286344]: 2025-12-15 10:01:19.038 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:01:19 localhost kernel: device tap576e7e59-a0 left promiscuous mode Dec 15 05:01:19 localhost nova_compute[286344]: 2025-12-15 10:01:19.054 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:01:19 localhost ovn_metadata_agent[160585]: 2025-12-15 10:01:19.059 160858 DEBUG oslo.privsep.daemon [-] privsep: reply[9434a5a1-a1b2-44f4-9e54-0da4e7b1d10f]: (4, True) _call_back 
/usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Dec 15 05:01:19 localhost podman[315950]: 2025-12-15 10:01:19.077505037 +0000 UTC m=+0.198223569 container exec_died a4e94bd7aaee6c174c601060fb5f34559019ccea93fc397263ee3735731cfc1e (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}) Dec 15 05:01:19 localhost ovn_metadata_agent[160585]: 2025-12-15 10:01:19.079 160858 DEBUG oslo.privsep.daemon [-] privsep: reply[7e4085ff-b839-41b3-985d-a14615e7b22e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Dec 15 05:01:19 localhost ovn_metadata_agent[160585]: 2025-12-15 10:01:19.081 160858 DEBUG oslo.privsep.daemon [-] privsep: reply[037d837c-f534-412e-9b54-8606ace0849f]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Dec 15 05:01:19 localhost systemd[1]: a4e94bd7aaee6c174c601060fb5f34559019ccea93fc397263ee3735731cfc1e.service: Deactivated successfully. 
Dec 15 05:01:19 localhost ovn_metadata_agent[160585]: 2025-12-15 10:01:19.098 160858 DEBUG oslo.privsep.daemon [-] privsep: reply[590fb02e-ccc5-4e2a-81a1-fcb9666693fb]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_QDISC', 'noqueue'], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': 
[['IFLA_XDP_ATTACHED', None]]}], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 1, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 1192136, 'reachable_time': 29707, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 37, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 
'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}]], 'header': {'length': 1356, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 316013, 'error': None, 'target': 'ovnmeta-576e7e59-adb4-4dcb-9b64-cc166b1e1e5f', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Dec 15 05:01:19 localhost ovn_metadata_agent[160585]: 2025-12-15 10:01:19.100 160979 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-576e7e59-adb4-4dcb-9b64-cc166b1e1e5f deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m Dec 15 05:01:19 localhost ovn_metadata_agent[160585]: 2025-12-15 10:01:19.100 160979 DEBUG oslo.privsep.daemon [-] privsep: reply[f718c53b-a964-4147-b3d4-98c15cb44e1a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Dec 15 05:01:19 localhost ovn_metadata_agent[160585]: 2025-12-15 10:01:19.101 160590 INFO neutron.agent.ovn.metadata.agent [-] Port d92e338f-e5ff-4170-8303-27f41ee35ef3 in datapath 594a8673-651b-4566-92ec-8dbe6ca00b60 unbound from our chassis#033[00m Dec 15 05:01:19 localhost ovn_metadata_agent[160585]: 2025-12-15 10:01:19.105 160590 DEBUG neutron.agent.ovn.metadata.agent [-] Port defc7b23-8924-4b69-a932-1bf1428cc324 IP addresses were not retrieved from the Port_Binding MAC column ['unknown'] _get_port_ips /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:536#033[00m Dec 15 05:01:19 localhost ovn_metadata_agent[160585]: 2025-12-15 10:01:19.105 160590 DEBUG neutron.agent.ovn.metadata.agent [-] No 
valid VIF ports were found for network 594a8673-651b-4566-92ec-8dbe6ca00b60, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m Dec 15 05:01:19 localhost ovn_metadata_agent[160585]: 2025-12-15 10:01:19.106 160858 DEBUG oslo.privsep.daemon [-] privsep: reply[a3f50318-5ee0-43e2-bdc0-6e31e6805b75]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Dec 15 05:01:19 localhost ovn_metadata_agent[160585]: 2025-12-15 10:01:19.107 160590 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-594a8673-651b-4566-92ec-8dbe6ca00b60 namespace which is not needed anymore#033[00m Dec 15 05:01:19 localhost neutron-haproxy-ovnmeta-594a8673-651b-4566-92ec-8dbe6ca00b60[315133]: [NOTICE] (315137) : haproxy version is 2.8.14-c23fe91 Dec 15 05:01:19 localhost neutron-haproxy-ovnmeta-594a8673-651b-4566-92ec-8dbe6ca00b60[315133]: [NOTICE] (315137) : path to executable is /usr/sbin/haproxy Dec 15 05:01:19 localhost neutron-haproxy-ovnmeta-594a8673-651b-4566-92ec-8dbe6ca00b60[315133]: [WARNING] (315137) : Exiting Master process... Dec 15 05:01:19 localhost neutron-haproxy-ovnmeta-594a8673-651b-4566-92ec-8dbe6ca00b60[315133]: [WARNING] (315137) : Exiting Master process... Dec 15 05:01:19 localhost neutron-haproxy-ovnmeta-594a8673-651b-4566-92ec-8dbe6ca00b60[315133]: [ALERT] (315137) : Current worker (315139) exited with code 143 (Terminated) Dec 15 05:01:19 localhost neutron-haproxy-ovnmeta-594a8673-651b-4566-92ec-8dbe6ca00b60[315133]: [WARNING] (315137) : All workers exited. Exiting... (0) Dec 15 05:01:19 localhost systemd[1]: libpod-98fad658a9ebb45138a90f607082ed4e2385050ae13da1b945cf17b9b016e32b.scope: Deactivated successfully. 
Dec 15 05:01:19 localhost podman[316030]: 2025-12-15 10:01:19.282431269 +0000 UTC m=+0.063953358 container died 98fad658a9ebb45138a90f607082ed4e2385050ae13da1b945cf17b9b016e32b (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-594a8673-651b-4566-92ec-8dbe6ca00b60, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb) Dec 15 05:01:19 localhost podman[316030]: 2025-12-15 10:01:19.332235927 +0000 UTC m=+0.113757986 container cleanup 98fad658a9ebb45138a90f607082ed4e2385050ae13da1b945cf17b9b016e32b (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-594a8673-651b-4566-92ec-8dbe6ca00b60, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0) Dec 15 05:01:19 localhost podman[316043]: 2025-12-15 10:01:19.405399891 +0000 UTC m=+0.118561875 container cleanup 98fad658a9ebb45138a90f607082ed4e2385050ae13da1b945cf17b9b016e32b (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-594a8673-651b-4566-92ec-8dbe6ca00b60, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, 
io.buildah.version=1.41.3) Dec 15 05:01:19 localhost systemd[1]: libpod-conmon-98fad658a9ebb45138a90f607082ed4e2385050ae13da1b945cf17b9b016e32b.scope: Deactivated successfully. Dec 15 05:01:19 localhost systemd[1]: var-lib-containers-storage-overlay-ee0c9f336082978f565ae45816420858db6936fbc14607ad49c7da4487c6fa27-merged.mount: Deactivated successfully. Dec 15 05:01:19 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-98fad658a9ebb45138a90f607082ed4e2385050ae13da1b945cf17b9b016e32b-userdata-shm.mount: Deactivated successfully. Dec 15 05:01:19 localhost systemd[1]: var-lib-containers-storage-overlay-25934885af206e52e90093a8e194d448a632cfa70e1d6f763662fc57e634a973-merged.mount: Deactivated successfully. Dec 15 05:01:19 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-334c8a6eb0abcc8decf6d5ba63ae90dfeaef71d3014d8a42d417306392b53a6f-userdata-shm.mount: Deactivated successfully. Dec 15 05:01:19 localhost systemd[1]: run-netns-ovnmeta\x2d576e7e59\x2dadb4\x2d4dcb\x2d9b64\x2dcc166b1e1e5f.mount: Deactivated successfully. 
Dec 15 05:01:19 localhost podman[316058]: 2025-12-15 10:01:19.472441004 +0000 UTC m=+0.123201288 container remove 98fad658a9ebb45138a90f607082ed4e2385050ae13da1b945cf17b9b016e32b (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-594a8673-651b-4566-92ec-8dbe6ca00b60, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.build-date=20251202) Dec 15 05:01:19 localhost ovn_metadata_agent[160585]: 2025-12-15 10:01:19.477 160858 DEBUG oslo.privsep.daemon [-] privsep: reply[f73ce60e-e326-4113-8874-81ba215e9ec5]: (4, ('Mon Dec 15 10:01:19 AM UTC 2025 Stopping container neutron-haproxy-ovnmeta-594a8673-651b-4566-92ec-8dbe6ca00b60 (98fad658a9ebb45138a90f607082ed4e2385050ae13da1b945cf17b9b016e32b)\n98fad658a9ebb45138a90f607082ed4e2385050ae13da1b945cf17b9b016e32b\nMon Dec 15 10:01:19 AM UTC 2025 Deleting container neutron-haproxy-ovnmeta-594a8673-651b-4566-92ec-8dbe6ca00b60 (98fad658a9ebb45138a90f607082ed4e2385050ae13da1b945cf17b9b016e32b)\n98fad658a9ebb45138a90f607082ed4e2385050ae13da1b945cf17b9b016e32b\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Dec 15 05:01:19 localhost ovn_metadata_agent[160585]: 2025-12-15 10:01:19.478 160858 DEBUG oslo.privsep.daemon [-] privsep: reply[6d607214-0f2d-4222-a7f4-34eed808feb1]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Dec 15 05:01:19 localhost ovn_metadata_agent[160585]: 2025-12-15 10:01:19.480 160590 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap594a8673-60, bridge=None, if_exists=True) do_commit 
/usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m Dec 15 05:01:19 localhost nova_compute[286344]: 2025-12-15 10:01:19.482 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:01:19 localhost kernel: device tap594a8673-60 left promiscuous mode Dec 15 05:01:19 localhost nova_compute[286344]: 2025-12-15 10:01:19.495 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:01:19 localhost ovn_metadata_agent[160585]: 2025-12-15 10:01:19.503 160858 DEBUG oslo.privsep.daemon [-] privsep: reply[43d1a92b-db31-49da-b7cf-db763a048a46]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Dec 15 05:01:19 localhost ovn_metadata_agent[160585]: 2025-12-15 10:01:19.525 160858 DEBUG oslo.privsep.daemon [-] privsep: reply[80b83d58-6075-4418-9141-85cbb23da1f9]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Dec 15 05:01:19 localhost ovn_metadata_agent[160585]: 2025-12-15 10:01:19.527 160858 DEBUG oslo.privsep.daemon [-] privsep: reply[263b2789-5af4-47ba-b1a6-9f35cf374a93]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Dec 15 05:01:19 localhost ovn_metadata_agent[160585]: 2025-12-15 10:01:19.543 160858 DEBUG oslo.privsep.daemon [-] privsep: reply[8def1d0a-a46a-4beb-b116-f1af245be126]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['IFLA_TSO_MAX_SIZE', 524280], 
['IFLA_TSO_MAX_SEGS', 65535], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_QDISC', 'noqueue'], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 1, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 
'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 1192231, 'reachable_time': 21076, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 37, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}]], 'header': {'length': 1356, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 316081, 'error': None, 'target': 'ovnmeta-594a8673-651b-4566-92ec-8dbe6ca00b60', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 
'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Dec 15 05:01:19 localhost ovn_metadata_agent[160585]: 2025-12-15 10:01:19.545 160979 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-594a8673-651b-4566-92ec-8dbe6ca00b60 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m Dec 15 05:01:19 localhost ovn_metadata_agent[160585]: 2025-12-15 10:01:19.546 160979 DEBUG oslo.privsep.daemon [-] privsep: reply[37fae21d-fd85-4f50-92d5-9a893228c135]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Dec 15 05:01:19 localhost systemd[1]: run-netns-ovnmeta\x2d594a8673\x2d651b\x2d4566\x2d92ec\x2d8dbe6ca00b60.mount: Deactivated successfully. Dec 15 05:01:19 localhost nova_compute[286344]: 2025-12-15 10:01:19.565 286348 DEBUG nova.compute.manager [req-a9566103-9053-4234-8f31-d1a685d69d69 req-3bb32ab0-b2ed-43a0-9992-6f3f36b6ef1b 01c6e0cd81404938a10f65d81ef6d178 2f96e5f995e6442f94f881dabbe4dbf7 - - default default] [instance: 51fca94e-ecb9-4350-bafc-765e827a1c7b] Received event network-vif-unplugged-d6b04aa0-3423-4a78-adfc-4bf3151f80ed external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m Dec 15 05:01:19 localhost nova_compute[286344]: 2025-12-15 10:01:19.566 286348 DEBUG oslo_concurrency.lockutils [req-a9566103-9053-4234-8f31-d1a685d69d69 req-3bb32ab0-b2ed-43a0-9992-6f3f36b6ef1b 01c6e0cd81404938a10f65d81ef6d178 2f96e5f995e6442f94f881dabbe4dbf7 - - default default] Acquiring lock "51fca94e-ecb9-4350-bafc-765e827a1c7b-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Dec 15 05:01:19 localhost nova_compute[286344]: 2025-12-15 10:01:19.566 286348 DEBUG oslo_concurrency.lockutils [req-a9566103-9053-4234-8f31-d1a685d69d69 req-3bb32ab0-b2ed-43a0-9992-6f3f36b6ef1b 
01c6e0cd81404938a10f65d81ef6d178 2f96e5f995e6442f94f881dabbe4dbf7 - - default default] Lock "51fca94e-ecb9-4350-bafc-765e827a1c7b-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Dec 15 05:01:19 localhost nova_compute[286344]: 2025-12-15 10:01:19.567 286348 DEBUG oslo_concurrency.lockutils [req-a9566103-9053-4234-8f31-d1a685d69d69 req-3bb32ab0-b2ed-43a0-9992-6f3f36b6ef1b 01c6e0cd81404938a10f65d81ef6d178 2f96e5f995e6442f94f881dabbe4dbf7 - - default default] Lock "51fca94e-ecb9-4350-bafc-765e827a1c7b-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Dec 15 05:01:19 localhost nova_compute[286344]: 2025-12-15 10:01:19.567 286348 DEBUG nova.compute.manager [req-a9566103-9053-4234-8f31-d1a685d69d69 req-3bb32ab0-b2ed-43a0-9992-6f3f36b6ef1b 01c6e0cd81404938a10f65d81ef6d178 2f96e5f995e6442f94f881dabbe4dbf7 - - default default] [instance: 51fca94e-ecb9-4350-bafc-765e827a1c7b] No waiting events found dispatching network-vif-unplugged-d6b04aa0-3423-4a78-adfc-4bf3151f80ed pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m Dec 15 05:01:19 localhost nova_compute[286344]: 2025-12-15 10:01:19.567 286348 DEBUG nova.compute.manager [req-a9566103-9053-4234-8f31-d1a685d69d69 req-3bb32ab0-b2ed-43a0-9992-6f3f36b6ef1b 01c6e0cd81404938a10f65d81ef6d178 2f96e5f995e6442f94f881dabbe4dbf7 - - default default] [instance: 51fca94e-ecb9-4350-bafc-765e827a1c7b] Received event network-vif-unplugged-d6b04aa0-3423-4a78-adfc-4bf3151f80ed for instance with task_state migrating. 
_process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m Dec 15 05:01:19 localhost nova_compute[286344]: 2025-12-15 10:01:19.816 286348 DEBUG nova.network.neutron [None req-3896e173-607d-4857-b33f-eee46c72bcf0 b27b4cc96007415aab89230177c048ab 466123b442f148a6ab70ab607ead6747 - - default default] Activated binding for port d6b04aa0-3423-4a78-adfc-4bf3151f80ed and host np0005559463.localdomain migrate_instance_start /usr/lib/python3.9/site-packages/nova/network/neutron.py:3181#033[00m Dec 15 05:01:19 localhost nova_compute[286344]: 2025-12-15 10:01:19.817 286348 DEBUG nova.compute.manager [None req-3896e173-607d-4857-b33f-eee46c72bcf0 b27b4cc96007415aab89230177c048ab 466123b442f148a6ab70ab607ead6747 - - default default] [instance: 51fca94e-ecb9-4350-bafc-765e827a1c7b] Calling driver.post_live_migration_at_source with original source VIFs from migrate_data: [{"id": "d6b04aa0-3423-4a78-adfc-4bf3151f80ed", "address": "fa:16:3e:98:70:a4", "network": {"id": "576e7e59-adb4-4dcb-9b64-cc166b1e1e5f", "bridge": "br-int", "label": "tempest-LiveAutoBlockMigrationV225Test-1988721240-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "10.100.0.2"}}], "meta": {"injected": false, "tenant_id": "8d2a9ec16aa942ab8315d4057f639915", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd6b04aa0-34", "ovs_interfaceid": "d6b04aa0-3423-4a78-adfc-4bf3151f80ed", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}] 
_post_live_migration /usr/lib/python3.9/site-packages/nova/compute/manager.py:9326#033[00m Dec 15 05:01:19 localhost nova_compute[286344]: 2025-12-15 10:01:19.822 286348 DEBUG nova.virt.libvirt.vif [None req-3896e173-607d-4857-b33f-eee46c72bcf0 b27b4cc96007415aab89230177c048ab 466123b442f148a6ab70ab607ead6747 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-12-15T10:00:46Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=,disable_terminate=False,display_description=None,display_name='tempest-LiveAutoBlockMigrationV225Test-server-612817668',ec2_ids=,ephemeral_gb=0,ephemeral_key_uuid=None,fault=,flavor=Flavor(5),hidden=False,host='np0005559462.localdomain',hostname='tempest-liveautoblockmigrationv225test-server-612817668',id=8,image_ref='b48177c8-9d95-4864-913a-a010f9defaa6',info_cache=InstanceInfoCache,instance_type_id=5,kernel_id='',key_data=None,key_name=None,keypairs=,launch_index=0,launched_at=2025-12-15T10:01:08Z,launched_on='np0005559462.localdomain',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='np0005559462.localdomain',numa_topology=None,old_flavor=None,os_type=None,pci_devices=,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='8d2a9ec16aa942ab8315d4057f639915',ramdisk_id='',reservation_id='r-446f3i0r',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='b48177c8-9d95-4864-913a-a010f9defaa6',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='vir
tio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-LiveAutoBlockMigrationV225Test-1078555498',owner_user_name='tempest-LiveAutoBlockMigrationV225Test-1078555498-project-member'},tags=,task_state='migrating',terminated_at=None,trusted_certs=,updated_at=2025-12-15T10:01:12Z,user_data=None,user_id='c79d291546244f6a970ffc157036d797',uuid=51fca94e-ecb9-4350-bafc-765e827a1c7b,vcpu_model=,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "d6b04aa0-3423-4a78-adfc-4bf3151f80ed", "address": "fa:16:3e:98:70:a4", "network": {"id": "576e7e59-adb4-4dcb-9b64-cc166b1e1e5f", "bridge": "br-int", "label": "tempest-LiveAutoBlockMigrationV225Test-1988721240-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "10.100.0.2"}}], "meta": {"injected": false, "tenant_id": "8d2a9ec16aa942ab8315d4057f639915", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd6b04aa0-34", "ovs_interfaceid": "d6b04aa0-3423-4a78-adfc-4bf3151f80ed", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m Dec 15 05:01:19 localhost nova_compute[286344]: 2025-12-15 10:01:19.823 286348 DEBUG nova.network.os_vif_util [None req-3896e173-607d-4857-b33f-eee46c72bcf0 b27b4cc96007415aab89230177c048ab 466123b442f148a6ab70ab607ead6747 - - default default] Converting VIF {"id": "d6b04aa0-3423-4a78-adfc-4bf3151f80ed", "address": "fa:16:3e:98:70:a4", "network": {"id": 
"576e7e59-adb4-4dcb-9b64-cc166b1e1e5f", "bridge": "br-int", "label": "tempest-LiveAutoBlockMigrationV225Test-1988721240-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "10.100.0.2"}}], "meta": {"injected": false, "tenant_id": "8d2a9ec16aa942ab8315d4057f639915", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd6b04aa0-34", "ovs_interfaceid": "d6b04aa0-3423-4a78-adfc-4bf3151f80ed", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m Dec 15 05:01:19 localhost nova_compute[286344]: 2025-12-15 10:01:19.825 286348 DEBUG nova.network.os_vif_util [None req-3896e173-607d-4857-b33f-eee46c72bcf0 b27b4cc96007415aab89230177c048ab 466123b442f148a6ab70ab607ead6747 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:98:70:a4,bridge_name='br-int',has_traffic_filtering=True,id=d6b04aa0-3423-4a78-adfc-4bf3151f80ed,network=Network(576e7e59-adb4-4dcb-9b64-cc166b1e1e5f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tapd6b04aa0-34') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m Dec 15 05:01:19 localhost nova_compute[286344]: 2025-12-15 10:01:19.826 286348 DEBUG os_vif [None req-3896e173-607d-4857-b33f-eee46c72bcf0 b27b4cc96007415aab89230177c048ab 466123b442f148a6ab70ab607ead6747 - - default default] Unplugging vif 
VIFOpenVSwitch(active=False,address=fa:16:3e:98:70:a4,bridge_name='br-int',has_traffic_filtering=True,id=d6b04aa0-3423-4a78-adfc-4bf3151f80ed,network=Network(576e7e59-adb4-4dcb-9b64-cc166b1e1e5f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tapd6b04aa0-34') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m Dec 15 05:01:19 localhost nova_compute[286344]: 2025-12-15 10:01:19.831 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:01:19 localhost nova_compute[286344]: 2025-12-15 10:01:19.832 286348 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapd6b04aa0-34, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m Dec 15 05:01:19 localhost nova_compute[286344]: 2025-12-15 10:01:19.843 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m Dec 15 05:01:19 localhost nova_compute[286344]: 2025-12-15 10:01:19.847 286348 INFO os_vif [None req-3896e173-607d-4857-b33f-eee46c72bcf0 b27b4cc96007415aab89230177c048ab 466123b442f148a6ab70ab607ead6747 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:98:70:a4,bridge_name='br-int',has_traffic_filtering=True,id=d6b04aa0-3423-4a78-adfc-4bf3151f80ed,network=Network(576e7e59-adb4-4dcb-9b64-cc166b1e1e5f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tapd6b04aa0-34')#033[00m Dec 15 05:01:19 localhost nova_compute[286344]: 2025-12-15 10:01:19.848 286348 DEBUG oslo_concurrency.lockutils [None req-3896e173-607d-4857-b33f-eee46c72bcf0 b27b4cc96007415aab89230177c048ab 466123b442f148a6ab70ab607ead6747 - - default default] Acquiring lock "compute_resources" by 
"nova.compute.resource_tracker.ResourceTracker.free_pci_device_allocations_for_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Dec 15 05:01:19 localhost nova_compute[286344]: 2025-12-15 10:01:19.849 286348 DEBUG oslo_concurrency.lockutils [None req-3896e173-607d-4857-b33f-eee46c72bcf0 b27b4cc96007415aab89230177c048ab 466123b442f148a6ab70ab607ead6747 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.free_pci_device_allocations_for_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Dec 15 05:01:19 localhost nova_compute[286344]: 2025-12-15 10:01:19.849 286348 DEBUG oslo_concurrency.lockutils [None req-3896e173-607d-4857-b33f-eee46c72bcf0 b27b4cc96007415aab89230177c048ab 466123b442f148a6ab70ab607ead6747 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.free_pci_device_allocations_for_instance" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Dec 15 05:01:19 localhost nova_compute[286344]: 2025-12-15 10:01:19.849 286348 DEBUG nova.compute.manager [None req-3896e173-607d-4857-b33f-eee46c72bcf0 b27b4cc96007415aab89230177c048ab 466123b442f148a6ab70ab607ead6747 - - default default] [instance: 51fca94e-ecb9-4350-bafc-765e827a1c7b] Calling driver.cleanup from _post_live_migration _post_live_migration /usr/lib/python3.9/site-packages/nova/compute/manager.py:9349#033[00m Dec 15 05:01:19 localhost nova_compute[286344]: 2025-12-15 10:01:19.850 286348 INFO nova.virt.libvirt.driver [None req-3896e173-607d-4857-b33f-eee46c72bcf0 b27b4cc96007415aab89230177c048ab 466123b442f148a6ab70ab607ead6747 - - default default] [instance: 51fca94e-ecb9-4350-bafc-765e827a1c7b] Deleting instance files /var/lib/nova/instances/51fca94e-ecb9-4350-bafc-765e827a1c7b_del#033[00m Dec 15 05:01:19 localhost nova_compute[286344]: 2025-12-15 10:01:19.851 
286348 INFO nova.virt.libvirt.driver [None req-3896e173-607d-4857-b33f-eee46c72bcf0 b27b4cc96007415aab89230177c048ab 466123b442f148a6ab70ab607ead6747 - - default default] [instance: 51fca94e-ecb9-4350-bafc-765e827a1c7b] Deletion of /var/lib/nova/instances/51fca94e-ecb9-4350-bafc-765e827a1c7b_del complete#033[00m Dec 15 05:01:19 localhost ceph-mon[298913]: mon.np0005559462@0(leader).osd e103 do_prune osdmap full prune enabled Dec 15 05:01:19 localhost ceph-mon[298913]: mon.np0005559462@0(leader).osd e104 e104: 6 total, 6 up, 6 in Dec 15 05:01:19 localhost ceph-mon[298913]: log_channel(cluster) log [DBG] : osdmap e104: 6 total, 6 up, 6 in Dec 15 05:01:19 localhost neutron_sriov_agent[260044]: 2025-12-15 10:01:19.952 2 INFO neutron.agent.securitygroups_rpc [None req-2b27788c-cec3-4dfd-b873-1b22752a80d2 7eba5d40415d4fd7ab565eed1e07b799 4e26599ab4374eb6b7e9d73e7dfebd03 - - default default] Security group member updated ['489315f5-1e14-49f6-8195-de837d23b935']#033[00m Dec 15 05:01:20 localhost ceph-mon[298913]: mon.np0005559462@0(leader).osd e104 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Dec 15 05:01:20 localhost nova_compute[286344]: 2025-12-15 10:01:20.634 286348 DEBUG nova.compute.manager [req-852de7c5-cc62-446f-8a45-28d3dfe5234b req-ff0f6167-bdb9-4209-8789-e6801442c7d3 01c6e0cd81404938a10f65d81ef6d178 2f96e5f995e6442f94f881dabbe4dbf7 - - default default] [instance: 51fca94e-ecb9-4350-bafc-765e827a1c7b] Received event network-vif-plugged-d6b04aa0-3423-4a78-adfc-4bf3151f80ed external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m Dec 15 05:01:20 localhost nova_compute[286344]: 2025-12-15 10:01:20.635 286348 DEBUG oslo_concurrency.lockutils [req-852de7c5-cc62-446f-8a45-28d3dfe5234b req-ff0f6167-bdb9-4209-8789-e6801442c7d3 01c6e0cd81404938a10f65d81ef6d178 2f96e5f995e6442f94f881dabbe4dbf7 - - default default] Acquiring lock "51fca94e-ecb9-4350-bafc-765e827a1c7b-events" 
by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Dec 15 05:01:20 localhost nova_compute[286344]: 2025-12-15 10:01:20.636 286348 DEBUG oslo_concurrency.lockutils [req-852de7c5-cc62-446f-8a45-28d3dfe5234b req-ff0f6167-bdb9-4209-8789-e6801442c7d3 01c6e0cd81404938a10f65d81ef6d178 2f96e5f995e6442f94f881dabbe4dbf7 - - default default] Lock "51fca94e-ecb9-4350-bafc-765e827a1c7b-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Dec 15 05:01:20 localhost nova_compute[286344]: 2025-12-15 10:01:20.636 286348 DEBUG oslo_concurrency.lockutils [req-852de7c5-cc62-446f-8a45-28d3dfe5234b req-ff0f6167-bdb9-4209-8789-e6801442c7d3 01c6e0cd81404938a10f65d81ef6d178 2f96e5f995e6442f94f881dabbe4dbf7 - - default default] Lock "51fca94e-ecb9-4350-bafc-765e827a1c7b-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Dec 15 05:01:20 localhost nova_compute[286344]: 2025-12-15 10:01:20.636 286348 DEBUG nova.compute.manager [req-852de7c5-cc62-446f-8a45-28d3dfe5234b req-ff0f6167-bdb9-4209-8789-e6801442c7d3 01c6e0cd81404938a10f65d81ef6d178 2f96e5f995e6442f94f881dabbe4dbf7 - - default default] [instance: 51fca94e-ecb9-4350-bafc-765e827a1c7b] No waiting events found dispatching network-vif-plugged-d6b04aa0-3423-4a78-adfc-4bf3151f80ed pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m Dec 15 05:01:20 localhost nova_compute[286344]: 2025-12-15 10:01:20.637 286348 WARNING nova.compute.manager [req-852de7c5-cc62-446f-8a45-28d3dfe5234b req-ff0f6167-bdb9-4209-8789-e6801442c7d3 01c6e0cd81404938a10f65d81ef6d178 2f96e5f995e6442f94f881dabbe4dbf7 - - default default] [instance: 
51fca94e-ecb9-4350-bafc-765e827a1c7b] Received unexpected event network-vif-plugged-d6b04aa0-3423-4a78-adfc-4bf3151f80ed for instance with vm_state active and task_state migrating.#033[00m Dec 15 05:01:20 localhost ceph-mon[298913]: mon.np0005559462@0(leader).osd e104 do_prune osdmap full prune enabled Dec 15 05:01:20 localhost ceph-mon[298913]: mon.np0005559462@0(leader).osd e105 e105: 6 total, 6 up, 6 in Dec 15 05:01:20 localhost ceph-mon[298913]: log_channel(cluster) log [DBG] : osdmap e105: 6 total, 6 up, 6 in Dec 15 05:01:21 localhost nova_compute[286344]: 2025-12-15 10:01:21.673 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:01:22 localhost ceph-mon[298913]: mon.np0005559462@0(leader).osd e105 do_prune osdmap full prune enabled Dec 15 05:01:22 localhost ceph-mon[298913]: mon.np0005559462@0(leader).osd e106 e106: 6 total, 6 up, 6 in Dec 15 05:01:22 localhost ceph-mon[298913]: log_channel(cluster) log [DBG] : osdmap e106: 6 total, 6 up, 6 in Dec 15 05:01:23 localhost neutron_sriov_agent[260044]: 2025-12-15 10:01:23.632 2 INFO neutron.agent.securitygroups_rpc [req-07a6fb28-4e79-426f-96c8-9c4ff09be2d3 req-fd822fcc-c626-49df-8458-848e6a54f7fb 3d412e2d4a3a4f54b39c55f7a39b768f 054b5bdd4ed44009a8e1940489c96b34 - - default default] Security group member updated ['1f23d768-bded-4ba1-9b80-f5121af3632b']#033[00m Dec 15 05:01:23 localhost nova_compute[286344]: 2025-12-15 10:01:23.636 286348 DEBUG nova.compute.manager [req-da6f0c0c-2a89-469e-94f3-454a2d54392a req-78afa1ee-653b-4a43-bb45-a47191c3803d 01c6e0cd81404938a10f65d81ef6d178 2f96e5f995e6442f94f881dabbe4dbf7 - - default default] [instance: 51fca94e-ecb9-4350-bafc-765e827a1c7b] Received event network-vif-plugged-d6b04aa0-3423-4a78-adfc-4bf3151f80ed external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m Dec 15 05:01:23 localhost nova_compute[286344]: 2025-12-15 10:01:23.636 
286348 DEBUG oslo_concurrency.lockutils [req-da6f0c0c-2a89-469e-94f3-454a2d54392a req-78afa1ee-653b-4a43-bb45-a47191c3803d 01c6e0cd81404938a10f65d81ef6d178 2f96e5f995e6442f94f881dabbe4dbf7 - - default default] Acquiring lock "51fca94e-ecb9-4350-bafc-765e827a1c7b-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Dec 15 05:01:23 localhost nova_compute[286344]: 2025-12-15 10:01:23.637 286348 DEBUG oslo_concurrency.lockutils [req-da6f0c0c-2a89-469e-94f3-454a2d54392a req-78afa1ee-653b-4a43-bb45-a47191c3803d 01c6e0cd81404938a10f65d81ef6d178 2f96e5f995e6442f94f881dabbe4dbf7 - - default default] Lock "51fca94e-ecb9-4350-bafc-765e827a1c7b-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Dec 15 05:01:23 localhost nova_compute[286344]: 2025-12-15 10:01:23.637 286348 DEBUG oslo_concurrency.lockutils [req-da6f0c0c-2a89-469e-94f3-454a2d54392a req-78afa1ee-653b-4a43-bb45-a47191c3803d 01c6e0cd81404938a10f65d81ef6d178 2f96e5f995e6442f94f881dabbe4dbf7 - - default default] Lock "51fca94e-ecb9-4350-bafc-765e827a1c7b-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Dec 15 05:01:23 localhost nova_compute[286344]: 2025-12-15 10:01:23.637 286348 DEBUG nova.compute.manager [req-da6f0c0c-2a89-469e-94f3-454a2d54392a req-78afa1ee-653b-4a43-bb45-a47191c3803d 01c6e0cd81404938a10f65d81ef6d178 2f96e5f995e6442f94f881dabbe4dbf7 - - default default] [instance: 51fca94e-ecb9-4350-bafc-765e827a1c7b] No waiting events found dispatching network-vif-plugged-d6b04aa0-3423-4a78-adfc-4bf3151f80ed pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m Dec 15 05:01:23 localhost 
nova_compute[286344]: 2025-12-15 10:01:23.638 286348 WARNING nova.compute.manager [req-da6f0c0c-2a89-469e-94f3-454a2d54392a req-78afa1ee-653b-4a43-bb45-a47191c3803d 01c6e0cd81404938a10f65d81ef6d178 2f96e5f995e6442f94f881dabbe4dbf7 - - default default] [instance: 51fca94e-ecb9-4350-bafc-765e827a1c7b] Received unexpected event network-vif-plugged-d6b04aa0-3423-4a78-adfc-4bf3151f80ed for instance with vm_state active and task_state migrating.#033[00m Dec 15 05:01:23 localhost neutron_dhcp_agent[267542]: 2025-12-15 10:01:23.801 267546 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-12-15T10:01:22Z, description=, device_id=039e588b-4069-4619-869a-ca49c45dd58b, device_owner=, dns_assignment=[], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[], id=bcd3121a-82b6-4ebd-b4f4-772b6501cdcc, ip_allocation=immediate, mac_address=fa:16:3e:94:6e:10, name=, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-12-15T10:00:57Z, description=, dns_domain=, id=3ac513c6-d80a-4d03-a550-1b73e6929696, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-ServersV294TestFqdnHostnames-1189201574-network, port_security_enabled=True, project_id=054b5bdd4ed44009a8e1940489c96b34, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=38218, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=585, status=ACTIVE, subnets=['225c87a4-c79c-4b41-a792-50b7fbf53b84'], tags=[], tenant_id=054b5bdd4ed44009a8e1940489c96b34, updated_at=2025-12-15T10:00:59Z, vlan_transparent=None, network_id=3ac513c6-d80a-4d03-a550-1b73e6929696, port_security_enabled=True, project_id=054b5bdd4ed44009a8e1940489c96b34, qos_network_policy_id=None, qos_policy_id=None, 
resource_request=None, revision_number=1, security_groups=['1f23d768-bded-4ba1-9b80-f5121af3632b'], standard_attr_id=676, status=DOWN, tags=[], tenant_id=054b5bdd4ed44009a8e1940489c96b34, updated_at=2025-12-15T10:01:23Z on network 3ac513c6-d80a-4d03-a550-1b73e6929696#033[00m Dec 15 05:01:23 localhost ceph-mon[298913]: mon.np0005559462@0(leader).osd e106 do_prune osdmap full prune enabled Dec 15 05:01:24 localhost ceph-mon[298913]: mon.np0005559462@0(leader).osd e107 e107: 6 total, 6 up, 6 in Dec 15 05:01:24 localhost ceph-mon[298913]: log_channel(cluster) log [DBG] : osdmap e107: 6 total, 6 up, 6 in Dec 15 05:01:24 localhost dnsmasq[315389]: read /var/lib/neutron/dhcp/3ac513c6-d80a-4d03-a550-1b73e6929696/addn_hosts - 2 addresses Dec 15 05:01:24 localhost dnsmasq-dhcp[315389]: read /var/lib/neutron/dhcp/3ac513c6-d80a-4d03-a550-1b73e6929696/host Dec 15 05:01:24 localhost podman[316099]: 2025-12-15 10:01:24.024430668 +0000 UTC m=+0.066475811 container kill 8d0a249025cd30d3f6ced66bb2a1ab69200de31f87cfd16c2c213b3b64a9055f (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-3ac513c6-d80a-4d03-a550-1b73e6929696, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, tcib_managed=true) Dec 15 05:01:24 localhost dnsmasq-dhcp[315389]: read /var/lib/neutron/dhcp/3ac513c6-d80a-4d03-a550-1b73e6929696/opts Dec 15 05:01:24 localhost neutron_dhcp_agent[267542]: 2025-12-15 10:01:24.217 267546 INFO neutron.agent.dhcp.agent [None req-3743e546-08cd-4c41-8a8b-ada2280f2bcd - - - - - -] DHCP configuration for ports {'bcd3121a-82b6-4ebd-b4f4-772b6501cdcc'} is completed#033[00m Dec 15 05:01:24 localhost nova_compute[286344]: 2025-12-15 10:01:24.835 286348 DEBUG 
ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:01:25 localhost ceph-mon[298913]: mon.np0005559462@0(leader).osd e107 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Dec 15 05:01:25 localhost ceph-mon[298913]: mon.np0005559462@0(leader).osd e107 do_prune osdmap full prune enabled Dec 15 05:01:25 localhost ceph-mon[298913]: mon.np0005559462@0(leader).osd e108 e108: 6 total, 6 up, 6 in Dec 15 05:01:25 localhost ceph-mon[298913]: log_channel(cluster) log [DBG] : osdmap e108: 6 total, 6 up, 6 in Dec 15 05:01:25 localhost nova_compute[286344]: 2025-12-15 10:01:25.880 286348 DEBUG nova.virt.driver [-] Emitting event Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m Dec 15 05:01:25 localhost nova_compute[286344]: 2025-12-15 10:01:25.880 286348 INFO nova.compute.manager [-] [instance: 8f5a73ce-a171-4a42-9f60-63c0db9e0a32] VM Stopped (Lifecycle Event)#033[00m Dec 15 05:01:26 localhost nova_compute[286344]: 2025-12-15 10:01:26.695 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:01:26 localhost neutron_dhcp_agent[267542]: 2025-12-15 10:01:26.943 267546 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=np0005559463.localdomain, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-12-15T10:01:22Z, description=, device_id=039e588b-4069-4619-869a-ca49c45dd58b, device_owner=compute:nova, dns_assignment=[], dns_domain=, dns_name=xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx-guest-test.domaintest.com, extra_dhcp_opts=[], fixed_ips=[], id=bcd3121a-82b6-4ebd-b4f4-772b6501cdcc, ip_allocation=immediate, mac_address=fa:16:3e:94:6e:10, name=, 
network_id=3ac513c6-d80a-4d03-a550-1b73e6929696, port_security_enabled=True, project_id=054b5bdd4ed44009a8e1940489c96b34, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=2, security_groups=['1f23d768-bded-4ba1-9b80-f5121af3632b'], standard_attr_id=676, status=DOWN, tags=[], tenant_id=054b5bdd4ed44009a8e1940489c96b34, updated_at=2025-12-15T10:01:24Z on network 3ac513c6-d80a-4d03-a550-1b73e6929696#033[00m Dec 15 05:01:27 localhost nova_compute[286344]: 2025-12-15 10:01:27.078 286348 DEBUG nova.compute.manager [None req-85c30a07-2fc7-436b-bdc8-0085110a7ab8 - - - - - -] [instance: 8f5a73ce-a171-4a42-9f60-63c0db9e0a32] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m Dec 15 05:01:27 localhost dnsmasq[315389]: read /var/lib/neutron/dhcp/3ac513c6-d80a-4d03-a550-1b73e6929696/addn_hosts - 2 addresses Dec 15 05:01:27 localhost dnsmasq-dhcp[315389]: read /var/lib/neutron/dhcp/3ac513c6-d80a-4d03-a550-1b73e6929696/host Dec 15 05:01:27 localhost dnsmasq-dhcp[315389]: read /var/lib/neutron/dhcp/3ac513c6-d80a-4d03-a550-1b73e6929696/opts Dec 15 05:01:27 localhost systemd[1]: tmp-crun.gtK3vM.mount: Deactivated successfully. 
Dec 15 05:01:27 localhost podman[316136]: 2025-12-15 10:01:27.208420782 +0000 UTC m=+0.062891081 container kill 8d0a249025cd30d3f6ced66bb2a1ab69200de31f87cfd16c2c213b3b64a9055f (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-3ac513c6-d80a-4d03-a550-1b73e6929696, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251202) Dec 15 05:01:27 localhost nova_compute[286344]: 2025-12-15 10:01:27.238 286348 DEBUG oslo_concurrency.lockutils [None req-3896e173-607d-4857-b33f-eee46c72bcf0 b27b4cc96007415aab89230177c048ab 466123b442f148a6ab70ab607ead6747 - - default default] Acquiring lock "51fca94e-ecb9-4350-bafc-765e827a1c7b-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Dec 15 05:01:27 localhost nova_compute[286344]: 2025-12-15 10:01:27.239 286348 DEBUG oslo_concurrency.lockutils [None req-3896e173-607d-4857-b33f-eee46c72bcf0 b27b4cc96007415aab89230177c048ab 466123b442f148a6ab70ab607ead6747 - - default default] Lock "51fca94e-ecb9-4350-bafc-765e827a1c7b-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Dec 15 05:01:27 localhost nova_compute[286344]: 2025-12-15 10:01:27.240 286348 DEBUG oslo_concurrency.lockutils [None req-3896e173-607d-4857-b33f-eee46c72bcf0 b27b4cc96007415aab89230177c048ab 466123b442f148a6ab70ab607ead6747 - - default default] Lock "51fca94e-ecb9-4350-bafc-765e827a1c7b-events" "released" by 
"nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Dec 15 05:01:27 localhost nova_compute[286344]: 2025-12-15 10:01:27.278 286348 DEBUG oslo_concurrency.lockutils [None req-3896e173-607d-4857-b33f-eee46c72bcf0 b27b4cc96007415aab89230177c048ab 466123b442f148a6ab70ab607ead6747 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Dec 15 05:01:27 localhost nova_compute[286344]: 2025-12-15 10:01:27.278 286348 DEBUG oslo_concurrency.lockutils [None req-3896e173-607d-4857-b33f-eee46c72bcf0 b27b4cc96007415aab89230177c048ab 466123b442f148a6ab70ab607ead6747 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Dec 15 05:01:27 localhost nova_compute[286344]: 2025-12-15 10:01:27.279 286348 DEBUG oslo_concurrency.lockutils [None req-3896e173-607d-4857-b33f-eee46c72bcf0 b27b4cc96007415aab89230177c048ab 466123b442f148a6ab70ab607ead6747 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Dec 15 05:01:27 localhost nova_compute[286344]: 2025-12-15 10:01:27.279 286348 DEBUG nova.compute.resource_tracker [None req-3896e173-607d-4857-b33f-eee46c72bcf0 b27b4cc96007415aab89230177c048ab 466123b442f148a6ab70ab607ead6747 - - default default] Auditing locally available compute resources for np0005559462.localdomain (node: np0005559462.localdomain) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m Dec 15 05:01:27 
localhost nova_compute[286344]: 2025-12-15 10:01:27.279 286348 DEBUG oslo_concurrency.processutils [None req-3896e173-607d-4857-b33f-eee46c72bcf0 b27b4cc96007415aab89230177c048ab 466123b442f148a6ab70ab607ead6747 - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m Dec 15 05:01:27 localhost neutron_dhcp_agent[267542]: 2025-12-15 10:01:27.476 267546 INFO neutron.agent.dhcp.agent [None req-c37e1624-8bb8-4ab8-bbe9-cbd5ba175124 - - - - - -] DHCP configuration for ports {'bcd3121a-82b6-4ebd-b4f4-772b6501cdcc'} is completed#033[00m Dec 15 05:01:27 localhost ceph-mon[298913]: mon.np0005559462@0(leader) e17 handle_command mon_command({"prefix": "df", "format": "json"} v 0) Dec 15 05:01:27 localhost ceph-mon[298913]: log_channel(audit) log [DBG] : from='client.? 172.18.0.106:0/3731973343' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch Dec 15 05:01:27 localhost nova_compute[286344]: 2025-12-15 10:01:27.767 286348 DEBUG oslo_concurrency.processutils [None req-3896e173-607d-4857-b33f-eee46c72bcf0 b27b4cc96007415aab89230177c048ab 466123b442f148a6ab70ab607ead6747 - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.487s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m Dec 15 05:01:27 localhost nova_compute[286344]: 2025-12-15 10:01:27.881 286348 DEBUG nova.virt.libvirt.driver [None req-3896e173-607d-4857-b33f-eee46c72bcf0 b27b4cc96007415aab89230177c048ab 466123b442f148a6ab70ab607ead6747 - - default default] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m Dec 15 05:01:27 localhost nova_compute[286344]: 2025-12-15 10:01:27.882 286348 DEBUG nova.virt.libvirt.driver [None 
req-3896e173-607d-4857-b33f-eee46c72bcf0 b27b4cc96007415aab89230177c048ab 466123b442f148a6ab70ab607ead6747 - - default default] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m Dec 15 05:01:28 localhost nova_compute[286344]: 2025-12-15 10:01:28.062 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:01:28 localhost nova_compute[286344]: 2025-12-15 10:01:28.142 286348 WARNING nova.virt.libvirt.driver [None req-3896e173-607d-4857-b33f-eee46c72bcf0 b27b4cc96007415aab89230177c048ab 466123b442f148a6ab70ab607ead6747 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m Dec 15 05:01:28 localhost nova_compute[286344]: 2025-12-15 10:01:28.144 286348 DEBUG nova.compute.resource_tracker [None req-3896e173-607d-4857-b33f-eee46c72bcf0 b27b4cc96007415aab89230177c048ab 466123b442f148a6ab70ab607ead6747 - - default default] Hypervisor/Node resource view: name=np0005559462.localdomain free_ram=11339MB free_disk=41.50214767456055GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": 
"0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m Dec 15 05:01:28 localhost nova_compute[286344]: 2025-12-15 10:01:28.145 286348 DEBUG oslo_concurrency.lockutils [None req-3896e173-607d-4857-b33f-eee46c72bcf0 b27b4cc96007415aab89230177c048ab 466123b442f148a6ab70ab607ead6747 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Dec 15 05:01:28 localhost nova_compute[286344]: 2025-12-15 10:01:28.145 286348 DEBUG oslo_concurrency.lockutils [None req-3896e173-607d-4857-b33f-eee46c72bcf0 b27b4cc96007415aab89230177c048ab 466123b442f148a6ab70ab607ead6747 - - default default] Lock "compute_resources" 
acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Dec 15 05:01:28 localhost nova_compute[286344]: 2025-12-15 10:01:28.358 286348 DEBUG nova.compute.resource_tracker [None req-3896e173-607d-4857-b33f-eee46c72bcf0 b27b4cc96007415aab89230177c048ab 466123b442f148a6ab70ab607ead6747 - - default default] Migration for instance 51fca94e-ecb9-4350-bafc-765e827a1c7b refers to another host's instance! _pair_instances_to_migrations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:903#033[00m Dec 15 05:01:28 localhost nova_compute[286344]: 2025-12-15 10:01:28.425 286348 DEBUG nova.compute.resource_tracker [None req-3896e173-607d-4857-b33f-eee46c72bcf0 b27b4cc96007415aab89230177c048ab 466123b442f148a6ab70ab607ead6747 - - default default] [instance: 51fca94e-ecb9-4350-bafc-765e827a1c7b] Skipping migration as instance is neither resizing nor live-migrating. _update_usage_from_migrations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1491#033[00m Dec 15 05:01:28 localhost nova_compute[286344]: 2025-12-15 10:01:28.460 286348 DEBUG nova.compute.resource_tracker [None req-3896e173-607d-4857-b33f-eee46c72bcf0 b27b4cc96007415aab89230177c048ab 466123b442f148a6ab70ab607ead6747 - - default default] Instance 39ff1bd9-6f6b-44c8-bbec-a1fd9d196359 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 2, 'MEMORY_MB': 512, 'VCPU': 1}}. 
_remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m Dec 15 05:01:28 localhost nova_compute[286344]: 2025-12-15 10:01:28.460 286348 DEBUG nova.compute.resource_tracker [None req-3896e173-607d-4857-b33f-eee46c72bcf0 b27b4cc96007415aab89230177c048ab 466123b442f148a6ab70ab607ead6747 - - default default] Migration c3174169-6033-440f-9a8d-32658f6e6711 is active on this compute host and has allocations in placement: {'resources': {'VCPU': 1, 'MEMORY_MB': 128, 'DISK_GB': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1640#033[00m Dec 15 05:01:28 localhost nova_compute[286344]: 2025-12-15 10:01:28.461 286348 DEBUG nova.compute.resource_tracker [None req-3896e173-607d-4857-b33f-eee46c72bcf0 b27b4cc96007415aab89230177c048ab 466123b442f148a6ab70ab607ead6747 - - default default] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m Dec 15 05:01:28 localhost nova_compute[286344]: 2025-12-15 10:01:28.461 286348 DEBUG nova.compute.resource_tracker [None req-3896e173-607d-4857-b33f-eee46c72bcf0 b27b4cc96007415aab89230177c048ab 466123b442f148a6ab70ab607ead6747 - - default default] Final resource view: name=np0005559462.localdomain phys_ram=15738MB used_ram=1024MB phys_disk=41GB used_disk=2GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m Dec 15 05:01:28 localhost nova_compute[286344]: 2025-12-15 10:01:28.520 286348 DEBUG oslo_concurrency.processutils [None req-3896e173-607d-4857-b33f-eee46c72bcf0 b27b4cc96007415aab89230177c048ab 466123b442f148a6ab70ab607ead6747 - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m Dec 15 
05:01:28 localhost ceph-mon[298913]: mon.np0005559462@0(leader) e17 handle_command mon_command({"prefix": "df", "format": "json"} v 0) Dec 15 05:01:28 localhost ceph-mon[298913]: log_channel(audit) log [DBG] : from='client.? 172.18.0.106:0/1601525876' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch Dec 15 05:01:28 localhost nova_compute[286344]: 2025-12-15 10:01:28.981 286348 DEBUG oslo_concurrency.processutils [None req-3896e173-607d-4857-b33f-eee46c72bcf0 b27b4cc96007415aab89230177c048ab 466123b442f148a6ab70ab607ead6747 - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.461s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m Dec 15 05:01:28 localhost nova_compute[286344]: 2025-12-15 10:01:28.986 286348 DEBUG nova.compute.provider_tree [None req-3896e173-607d-4857-b33f-eee46c72bcf0 b27b4cc96007415aab89230177c048ab 466123b442f148a6ab70ab607ead6747 - - default default] Inventory has not changed in ProviderTree for provider: 26c8956b-6742-4951-b566-971b9bbe323b update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m Dec 15 05:01:29 localhost nova_compute[286344]: 2025-12-15 10:01:29.006 286348 DEBUG nova.scheduler.client.report [None req-3896e173-607d-4857-b33f-eee46c72bcf0 b27b4cc96007415aab89230177c048ab 466123b442f148a6ab70ab607ead6747 - - default default] Inventory has not changed for provider 26c8956b-6742-4951-b566-971b9bbe323b based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m Dec 15 
05:01:29 localhost nova_compute[286344]: 2025-12-15 10:01:29.029 286348 DEBUG nova.compute.resource_tracker [None req-3896e173-607d-4857-b33f-eee46c72bcf0 b27b4cc96007415aab89230177c048ab 466123b442f148a6ab70ab607ead6747 - - default default] Compute_service record updated for np0005559462.localdomain:np0005559462.localdomain _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m Dec 15 05:01:29 localhost nova_compute[286344]: 2025-12-15 10:01:29.029 286348 DEBUG oslo_concurrency.lockutils [None req-3896e173-607d-4857-b33f-eee46c72bcf0 b27b4cc96007415aab89230177c048ab 466123b442f148a6ab70ab607ead6747 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.884s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Dec 15 05:01:29 localhost nova_compute[286344]: 2025-12-15 10:01:29.037 286348 INFO nova.compute.manager [None req-3896e173-607d-4857-b33f-eee46c72bcf0 b27b4cc96007415aab89230177c048ab 466123b442f148a6ab70ab607ead6747 - - default default] [instance: 51fca94e-ecb9-4350-bafc-765e827a1c7b] Migrating instance to np0005559463.localdomain finished successfully.#033[00m Dec 15 05:01:29 localhost nova_compute[286344]: 2025-12-15 10:01:29.323 286348 INFO nova.scheduler.client.report [None req-3896e173-607d-4857-b33f-eee46c72bcf0 b27b4cc96007415aab89230177c048ab 466123b442f148a6ab70ab607ead6747 - - default default] Deleted allocation for migration c3174169-6033-440f-9a8d-32658f6e6711#033[00m Dec 15 05:01:29 localhost nova_compute[286344]: 2025-12-15 10:01:29.324 286348 DEBUG nova.virt.libvirt.driver [None req-3896e173-607d-4857-b33f-eee46c72bcf0 b27b4cc96007415aab89230177c048ab 466123b442f148a6ab70ab607ead6747 - - default default] [instance: 51fca94e-ecb9-4350-bafc-765e827a1c7b] Live migration monitoring is all done _live_migration 
/usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10662#033[00m Dec 15 05:01:29 localhost nova_compute[286344]: 2025-12-15 10:01:29.838 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:01:30 localhost nova_compute[286344]: 2025-12-15 10:01:30.279 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:01:30 localhost ceph-mon[298913]: mon.np0005559462@0(leader).osd e108 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Dec 15 05:01:30 localhost ceph-mon[298913]: mon.np0005559462@0(leader).osd e108 do_prune osdmap full prune enabled Dec 15 05:01:30 localhost ceph-mon[298913]: mon.np0005559462@0(leader).osd e109 e109: 6 total, 6 up, 6 in Dec 15 05:01:30 localhost ceph-mon[298913]: log_channel(cluster) log [DBG] : osdmap e109: 6 total, 6 up, 6 in Dec 15 05:01:30 localhost ovn_controller[154603]: 2025-12-15T10:01:30Z|00149|binding|INFO|Releasing lport b35254ad-12eb-47bb-92be-44fefe0694f0 from this chassis (sb_readonly=0) Dec 15 05:01:30 localhost nova_compute[286344]: 2025-12-15 10:01:30.830 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:01:31 localhost nova_compute[286344]: 2025-12-15 10:01:31.716 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:01:31 localhost podman[243449]: time="2025-12-15T10:01:31Z" level=info msg="List containers: received `last` parameter - overwriting `limit`" Dec 15 05:01:31 localhost podman[243449]: @ - - [15/Dec/2025:10:01:31 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 162109 "" "Go-http-client/1.1" Dec 15 
05:01:31 localhost podman[243449]: @ - - [15/Dec/2025:10:01:31 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 20687 "" "Go-http-client/1.1" Dec 15 05:01:33 localhost nova_compute[286344]: 2025-12-15 10:01:33.831 286348 DEBUG nova.virt.driver [-] Emitting event Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m Dec 15 05:01:33 localhost nova_compute[286344]: 2025-12-15 10:01:33.832 286348 INFO nova.compute.manager [-] [instance: 51fca94e-ecb9-4350-bafc-765e827a1c7b] VM Stopped (Lifecycle Event)#033[00m Dec 15 05:01:33 localhost nova_compute[286344]: 2025-12-15 10:01:33.850 286348 DEBUG nova.compute.manager [None req-95f0e4b9-6bc9-4392-870b-b72602be8342 - - - - - -] [instance: 51fca94e-ecb9-4350-bafc-765e827a1c7b] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m Dec 15 05:01:34 localhost nova_compute[286344]: 2025-12-15 10:01:34.841 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:01:34 localhost openstack_network_exporter[246484]: ERROR 10:01:34 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd Dec 15 05:01:34 localhost openstack_network_exporter[246484]: ERROR 10:01:34 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server Dec 15 05:01:34 localhost openstack_network_exporter[246484]: ERROR 10:01:34 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd Dec 15 05:01:34 localhost openstack_network_exporter[246484]: ERROR 10:01:34 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath Dec 15 05:01:34 localhost openstack_network_exporter[246484]: Dec 15 05:01:34 localhost openstack_network_exporter[246484]: ERROR 10:01:34 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please 
specify an existing datapath Dec 15 05:01:34 localhost openstack_network_exporter[246484]: Dec 15 05:01:34 localhost systemd[1]: tmp-crun.Nfyz4r.mount: Deactivated successfully. Dec 15 05:01:34 localhost dnsmasq[313833]: read /var/lib/neutron/dhcp/0a27b0d3-52a4-4f8c-9083-769f5c00765d/addn_hosts - 0 addresses Dec 15 05:01:34 localhost dnsmasq-dhcp[313833]: read /var/lib/neutron/dhcp/0a27b0d3-52a4-4f8c-9083-769f5c00765d/host Dec 15 05:01:34 localhost podman[316218]: 2025-12-15 10:01:34.962814584 +0000 UTC m=+0.074726652 container kill 7b146303a978bcb3df5ff2887887992e6db4ec060d166943872f70bbc5777a10 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-0a27b0d3-52a4-4f8c-9083-769f5c00765d, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2) Dec 15 05:01:34 localhost dnsmasq-dhcp[313833]: read /var/lib/neutron/dhcp/0a27b0d3-52a4-4f8c-9083-769f5c00765d/opts Dec 15 05:01:34 localhost systemd[1]: Started /usr/bin/podman healthcheck run 67ee72fcd99c896f79e8807764b1d0f04e7d45927d06e306c0986394e8ed42a0. Dec 15 05:01:34 localhost systemd[1]: Started /usr/bin/podman healthcheck run b3d8483ade539500e228c2af9c0c3e647febb5ec7903610a83aea32631007d4a. Dec 15 05:01:34 localhost systemd[1]: Started /usr/bin/podman healthcheck run 9f0f481c937606298203797a71924b7614a5d9e3f4a8afd837cfa5b9602aedeb. 
Dec 15 05:01:35 localhost podman[316235]: 2025-12-15 10:01:35.082959137 +0000 UTC m=+0.086310170 container health_status b3d8483ade539500e228c2af9c0c3e647febb5ec7903610a83aea32631007d4a (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '07f3af15d79276418f54a8dde2fc251064527c3571430e14af77aaa2c01f1193-cb1e9478634a00c8fb5ad9cd7fcd38c7b2a1a85187dfaa618bc715c24c2b15fb'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, tcib_managed=true, container_name=ceilometer_agent_compute, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, 
org.label-schema.vendor=CentOS) Dec 15 05:01:35 localhost podman[316235]: 2025-12-15 10:01:35.08916382 +0000 UTC m=+0.092514853 container exec_died b3d8483ade539500e228c2af9c0c3e647febb5ec7903610a83aea32631007d4a (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '07f3af15d79276418f54a8dde2fc251064527c3571430e14af77aaa2c01f1193-cb1e9478634a00c8fb5ad9cd7fcd38c7b2a1a85187dfaa618bc715c24c2b15fb'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, config_id=ceilometer_agent_compute, container_name=ceilometer_agent_compute, managed_by=edpm_ansible, 
org.label-schema.schema-version=1.0) Dec 15 05:01:35 localhost systemd[1]: b3d8483ade539500e228c2af9c0c3e647febb5ec7903610a83aea32631007d4a.service: Deactivated successfully. Dec 15 05:01:35 localhost kernel: device tapf642a259-16 left promiscuous mode Dec 15 05:01:35 localhost ovn_controller[154603]: 2025-12-15T10:01:35Z|00150|binding|INFO|Releasing lport f642a259-1650-438c-bdfd-d9145a0c7b84 from this chassis (sb_readonly=0) Dec 15 05:01:35 localhost ovn_controller[154603]: 2025-12-15T10:01:35Z|00151|binding|INFO|Setting lport f642a259-1650-438c-bdfd-d9145a0c7b84 down in Southbound Dec 15 05:01:35 localhost nova_compute[286344]: 2025-12-15 10:01:35.113 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:01:35 localhost ovn_metadata_agent[160585]: 2025-12-15 10:01:35.122 160590 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005559462.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.3/28', 'neutron:device_id': 'dhcpce287b4b-a043-5ed9-8e08-4fd555d768a2-0a27b0d3-52a4-4f8c-9083-769f5c00765d', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-0a27b0d3-52a4-4f8c-9083-769f5c00765d', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '7526febbe14640b09a9f0897a2f4af8c', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'np0005559462.localdomain'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], 
datapath=fb6271da-cb66-4d8c-bc07-07fd6816385a, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=f642a259-1650-438c-bdfd-d9145a0c7b84) old=Port_Binding(up=[True], chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Dec 15 05:01:35 localhost ovn_metadata_agent[160585]: 2025-12-15 10:01:35.124 160590 INFO neutron.agent.ovn.metadata.agent [-] Port f642a259-1650-438c-bdfd-d9145a0c7b84 in datapath 0a27b0d3-52a4-4f8c-9083-769f5c00765d unbound from our chassis#033[00m Dec 15 05:01:35 localhost ovn_metadata_agent[160585]: 2025-12-15 10:01:35.128 160590 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 0a27b0d3-52a4-4f8c-9083-769f5c00765d, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m Dec 15 05:01:35 localhost ovn_metadata_agent[160585]: 2025-12-15 10:01:35.130 160858 DEBUG oslo.privsep.daemon [-] privsep: reply[fced58e0-ed51-4d21-94b5-f1de14e56950]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Dec 15 05:01:35 localhost nova_compute[286344]: 2025-12-15 10:01:35.136 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:01:35 localhost podman[316237]: 2025-12-15 10:01:35.143438168 +0000 UTC m=+0.139968232 container health_status 9f0f481c937606298203797a71924b7614a5d9e3f4a8afd837cfa5b9602aedeb (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, config_id=multipathd, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.build-date=20251202, managed_by=edpm_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 
'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=multipathd, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS) Dec 15 05:01:35 localhost podman[316237]: 2025-12-15 10:01:35.182343208 +0000 UTC m=+0.178873232 container exec_died 9f0f481c937606298203797a71924b7614a5d9e3f4a8afd837cfa5b9602aedeb (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, io.buildah.version=1.41.3, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_id=multipathd, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': 
'/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251202) Dec 15 05:01:35 localhost systemd[1]: 9f0f481c937606298203797a71924b7614a5d9e3f4a8afd837cfa5b9602aedeb.service: Deactivated successfully. 
Dec 15 05:01:35 localhost podman[316234]: 2025-12-15 10:01:35.184347486 +0000 UTC m=+0.191444221 container health_status 67ee72fcd99c896f79e8807764b1d0f04e7d45927d06e306c0986394e8ed42a0 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible) Dec 15 05:01:35 localhost podman[316234]: 2025-12-15 10:01:35.264495353 +0000 UTC m=+0.271592028 container exec_died 67ee72fcd99c896f79e8807764b1d0f04e7d45927d06e306c0986394e8ed42a0 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', 
'--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible) Dec 15 05:01:35 localhost systemd[1]: 67ee72fcd99c896f79e8807764b1d0f04e7d45927d06e306c0986394e8ed42a0.service: Deactivated successfully. 
Dec 15 05:01:35 localhost neutron_sriov_agent[260044]: 2025-12-15 10:01:35.306 2 INFO neutron.agent.securitygroups_rpc [None req-a22b59f9-7391-4435-b01a-98a84d0c5ff2 c79d291546244f6a970ffc157036d797 8d2a9ec16aa942ab8315d4057f639915 - - default default] Security group member updated ['48dda613-ea3d-4053-a6ae-35f8ce5dfd37']#033[00m Dec 15 05:01:35 localhost ceph-mon[298913]: mon.np0005559462@0(leader).osd e109 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Dec 15 05:01:35 localhost dnsmasq[314325]: read /var/lib/neutron/dhcp/594a8673-651b-4566-92ec-8dbe6ca00b60/addn_hosts - 0 addresses Dec 15 05:01:35 localhost dnsmasq-dhcp[314325]: read /var/lib/neutron/dhcp/594a8673-651b-4566-92ec-8dbe6ca00b60/host Dec 15 05:01:35 localhost dnsmasq-dhcp[314325]: read /var/lib/neutron/dhcp/594a8673-651b-4566-92ec-8dbe6ca00b60/opts Dec 15 05:01:35 localhost podman[316320]: 2025-12-15 10:01:35.58882441 +0000 UTC m=+0.057596742 container kill e8bff1a5ec450e2a9f85a19f864e875baf89c06f50c11ecc847096467012caed (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-594a8673-651b-4566-92ec-8dbe6ca00b60, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image) Dec 15 05:01:35 localhost nova_compute[286344]: 2025-12-15 10:01:35.891 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:01:35 localhost ceph-mon[298913]: mon.np0005559462@0(leader).osd e109 do_prune osdmap full prune enabled Dec 15 05:01:35 localhost ceph-mon[298913]: mon.np0005559462@0(leader).osd e110 e110: 6 total, 6 up, 6 in Dec 15 05:01:35 
localhost ceph-mon[298913]: log_channel(cluster) log [DBG] : osdmap e110: 6 total, 6 up, 6 in Dec 15 05:01:36 localhost dnsmasq[314325]: exiting on receipt of SIGTERM Dec 15 05:01:36 localhost podman[316359]: 2025-12-15 10:01:36.297354419 +0000 UTC m=+0.060381270 container kill e8bff1a5ec450e2a9f85a19f864e875baf89c06f50c11ecc847096467012caed (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-594a8673-651b-4566-92ec-8dbe6ca00b60, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team) Dec 15 05:01:36 localhost systemd[1]: libpod-e8bff1a5ec450e2a9f85a19f864e875baf89c06f50c11ecc847096467012caed.scope: Deactivated successfully. Dec 15 05:01:36 localhost podman[316373]: 2025-12-15 10:01:36.346983992 +0000 UTC m=+0.039300780 container died e8bff1a5ec450e2a9f85a19f864e875baf89c06f50c11ecc847096467012caed (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-594a8673-651b-4566-92ec-8dbe6ca00b60, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image) Dec 15 05:01:36 localhost podman[316373]: 2025-12-15 10:01:36.378806548 +0000 UTC m=+0.071123286 container cleanup e8bff1a5ec450e2a9f85a19f864e875baf89c06f50c11ecc847096467012caed (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-594a8673-651b-4566-92ec-8dbe6ca00b60, org.label-schema.vendor=CentOS, 
tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb) Dec 15 05:01:36 localhost systemd[1]: libpod-conmon-e8bff1a5ec450e2a9f85a19f864e875baf89c06f50c11ecc847096467012caed.scope: Deactivated successfully. Dec 15 05:01:36 localhost nova_compute[286344]: 2025-12-15 10:01:36.461 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:01:36 localhost ovn_controller[154603]: 2025-12-15T10:01:36Z|00152|binding|INFO|Removing iface tap410191a3-ad ovn-installed in OVS Dec 15 05:01:36 localhost ovn_metadata_agent[160585]: 2025-12-15 10:01:36.470 160590 WARNING neutron.agent.ovn.metadata.agent [-] Removing non-external type port defc7b23-8924-4b69-a932-1bf1428cc324 with type ""#033[00m Dec 15 05:01:36 localhost ovn_controller[154603]: 2025-12-15T10:01:36Z|00153|binding|INFO|Removing lport 410191a3-ad8c-47df-98ef-4c3095e45cae ovn-installed in OVS Dec 15 05:01:36 localhost ovn_metadata_agent[160585]: 2025-12-15 10:01:36.471 160590 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched DELETE: PortBindingDeletedEvent(events=('delete',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[True], options={'requested-chassis': 'np0005559462.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '19.80.0.2/24', 'neutron:device_id': 'dhcpce287b4b-a043-5ed9-8e08-4fd555d768a2-594a8673-651b-4566-92ec-8dbe6ca00b60', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-594a8673-651b-4566-92ec-8dbe6ca00b60', 'neutron:port_capabilities': '', 
'neutron:port_name': '', 'neutron:project_id': '8d2a9ec16aa942ab8315d4057f639915', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'np0005559462.localdomain'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=8bd87d03-8621-4b45-a769-2f1ac086eff3, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=410191a3-ad8c-47df-98ef-4c3095e45cae) old= matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Dec 15 05:01:36 localhost ovn_metadata_agent[160585]: 2025-12-15 10:01:36.472 160590 INFO neutron.agent.ovn.metadata.agent [-] Port 410191a3-ad8c-47df-98ef-4c3095e45cae in datapath 594a8673-651b-4566-92ec-8dbe6ca00b60 unbound from our chassis#033[00m Dec 15 05:01:36 localhost ovn_metadata_agent[160585]: 2025-12-15 10:01:36.474 160590 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 594a8673-651b-4566-92ec-8dbe6ca00b60, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m Dec 15 05:01:36 localhost ovn_metadata_agent[160585]: 2025-12-15 10:01:36.475 160858 DEBUG oslo.privsep.daemon [-] privsep: reply[d19da4d6-b023-4eb3-935a-0d67182aefcb]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Dec 15 05:01:36 localhost nova_compute[286344]: 2025-12-15 10:01:36.482 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:01:36 localhost podman[316380]: 2025-12-15 10:01:36.491211859 +0000 UTC m=+0.170504165 container remove e8bff1a5ec450e2a9f85a19f864e875baf89c06f50c11ecc847096467012caed (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, 
name=neutron-dnsmasq-qdhcp-594a8673-651b-4566-92ec-8dbe6ca00b60, tcib_managed=true, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS) Dec 15 05:01:36 localhost nova_compute[286344]: 2025-12-15 10:01:36.502 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:01:36 localhost kernel: device tap410191a3-ad left promiscuous mode Dec 15 05:01:36 localhost nova_compute[286344]: 2025-12-15 10:01:36.515 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:01:36 localhost neutron_dhcp_agent[267542]: 2025-12-15 10:01:36.536 267546 INFO neutron.agent.dhcp.agent [None req-4c957767-7a77-4132-b90c-83483f9b5216 - - - - - -] Network not present, action: clean_devices, action_kwargs: {}#033[00m Dec 15 05:01:36 localhost neutron_dhcp_agent[267542]: 2025-12-15 10:01:36.646 267546 INFO neutron.agent.dhcp.agent [-] Network not present, action: clean_devices, action_kwargs: {}#033[00m Dec 15 05:01:36 localhost systemd[1]: Started /usr/bin/podman healthcheck run 730c50020a7d9e2da21d2462d40af20ac303e43f3491f692cbbd87f432ba3e09. Dec 15 05:01:36 localhost systemd[1]: Started /usr/bin/podman healthcheck run ac2eae30a2a7f971398af42ea67b3ed799d4b3341f7688ebcd73f7ca7f66c2a8. 
Dec 15 05:01:36 localhost nova_compute[286344]: 2025-12-15 10:01:36.722 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:01:36 localhost podman[316402]: 2025-12-15 10:01:36.771210213 +0000 UTC m=+0.093424225 container health_status 730c50020a7d9e2da21d2462d40af20ac303e43f3491f692cbbd87f432ba3e09 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, config_id=openstack_network_exporter, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=9.6, managed_by=edpm_ansible, name=ubi9-minimal, com.redhat.component=ubi9-minimal-container, release=1755695350, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, maintainer=Red Hat, Inc., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, io.openshift.tags=minimal rhel9, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'cb1e9478634a00c8fb5ad9cd7fcd38c7b2a1a85187dfaa618bc715c24c2b15fb'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, container_name=openstack_network_exporter, url=https://catalog.redhat.com/en/search?searchType=containers, build-date=2025-08-20T13:12:41, distribution-scope=public, architecture=x86_64, vcs-type=git, io.buildah.version=1.33.7, vendor=Red Hat, Inc., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly.) 
Dec 15 05:01:36 localhost podman[316402]: 2025-12-15 10:01:36.809492356 +0000 UTC m=+0.131706328 container exec_died 730c50020a7d9e2da21d2462d40af20ac303e43f3491f692cbbd87f432ba3e09 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, config_id=openstack_network_exporter, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vcs-type=git, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, architecture=x86_64, io.openshift.tags=minimal rhel9, managed_by=edpm_ansible, distribution-scope=public, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vendor=Red Hat, Inc., container_name=openstack_network_exporter, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, maintainer=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, version=9.6, build-date=2025-08-20T13:12:41, com.redhat.component=ubi9-minimal-container, io.buildah.version=1.33.7, release=1755695350, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'cb1e9478634a00c8fb5ad9cd7fcd38c7b2a1a85187dfaa618bc715c24c2b15fb'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, name=ubi9-minimal) Dec 15 05:01:36 localhost systemd[1]: 730c50020a7d9e2da21d2462d40af20ac303e43f3491f692cbbd87f432ba3e09.service: Deactivated successfully. 
Dec 15 05:01:36 localhost podman[316403]: 2025-12-15 10:01:36.831339565 +0000 UTC m=+0.150743378 container health_status ac2eae30a2a7f971398af42ea67b3ed799d4b3341f7688ebcd73f7ca7f66c2a8 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '07f3af15d79276418f54a8dde2fc251064527c3571430e14af77aaa2c01f1193'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, managed_by=edpm_ansible, org.label-schema.license=GPLv2, container_name=ovn_controller) Dec 15 05:01:36 localhost ovn_controller[154603]: 2025-12-15T10:01:36Z|00154|binding|INFO|Releasing lport b35254ad-12eb-47bb-92be-44fefe0694f0 from this chassis (sb_readonly=0) Dec 15 05:01:36 localhost podman[316403]: 2025-12-15 10:01:36.88061667 +0000 UTC m=+0.200020473 container exec_died ac2eae30a2a7f971398af42ea67b3ed799d4b3341f7688ebcd73f7ca7f66c2a8 
(image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, config_id=ovn_controller, io.buildah.version=1.41.3, tcib_managed=true, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '07f3af15d79276418f54a8dde2fc251064527c3571430e14af77aaa2c01f1193'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0) Dec 15 05:01:36 localhost systemd[1]: ac2eae30a2a7f971398af42ea67b3ed799d4b3341f7688ebcd73f7ca7f66c2a8.service: Deactivated successfully. Dec 15 05:01:36 localhost nova_compute[286344]: 2025-12-15 10:01:36.914 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:01:36 localhost systemd[1]: var-lib-containers-storage-overlay-2b2a87798477edb1c4ed240e31aa7d3ba3b4d16e88de519df3a73a3e16e5c420-merged.mount: Deactivated successfully. 
Dec 15 05:01:36 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-e8bff1a5ec450e2a9f85a19f864e875baf89c06f50c11ecc847096467012caed-userdata-shm.mount: Deactivated successfully. Dec 15 05:01:36 localhost systemd[1]: run-netns-qdhcp\x2d594a8673\x2d651b\x2d4566\x2d92ec\x2d8dbe6ca00b60.mount: Deactivated successfully. Dec 15 05:01:39 localhost neutron_sriov_agent[260044]: 2025-12-15 10:01:39.031 2 INFO neutron.agent.securitygroups_rpc [None req-656aec54-86f1-42f6-897e-e203095fe84d c79d291546244f6a970ffc157036d797 8d2a9ec16aa942ab8315d4057f639915 - - default default] Security group member updated ['48dda613-ea3d-4053-a6ae-35f8ce5dfd37']#033[00m Dec 15 05:01:39 localhost nova_compute[286344]: 2025-12-15 10:01:39.877 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:01:40 localhost ovn_controller[154603]: 2025-12-15T10:01:40Z|00155|binding|INFO|Releasing lport b35254ad-12eb-47bb-92be-44fefe0694f0 from this chassis (sb_readonly=0) Dec 15 05:01:40 localhost nova_compute[286344]: 2025-12-15 10:01:40.214 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:01:40 localhost ceph-mon[298913]: mon.np0005559462@0(leader).osd e110 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Dec 15 05:01:41 localhost dnsmasq[313833]: exiting on receipt of SIGTERM Dec 15 05:01:41 localhost podman[316466]: 2025-12-15 10:01:41.600103574 +0000 UTC m=+0.052428544 container kill 7b146303a978bcb3df5ff2887887992e6db4ec060d166943872f70bbc5777a10 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-0a27b0d3-52a4-4f8c-9083-769f5c00765d, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, 
tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true) Dec 15 05:01:41 localhost systemd[1]: libpod-7b146303a978bcb3df5ff2887887992e6db4ec060d166943872f70bbc5777a10.scope: Deactivated successfully. Dec 15 05:01:41 localhost podman[316479]: 2025-12-15 10:01:41.677553893 +0000 UTC m=+0.060434140 container died 7b146303a978bcb3df5ff2887887992e6db4ec060d166943872f70bbc5777a10 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-0a27b0d3-52a4-4f8c-9083-769f5c00765d, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0) Dec 15 05:01:41 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-7b146303a978bcb3df5ff2887887992e6db4ec060d166943872f70bbc5777a10-userdata-shm.mount: Deactivated successfully. 
Dec 15 05:01:41 localhost podman[316479]: 2025-12-15 10:01:41.711123701 +0000 UTC m=+0.094003888 container cleanup 7b146303a978bcb3df5ff2887887992e6db4ec060d166943872f70bbc5777a10 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-0a27b0d3-52a4-4f8c-9083-769f5c00765d, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS) Dec 15 05:01:41 localhost systemd[1]: libpod-conmon-7b146303a978bcb3df5ff2887887992e6db4ec060d166943872f70bbc5777a10.scope: Deactivated successfully. Dec 15 05:01:41 localhost nova_compute[286344]: 2025-12-15 10:01:41.722 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:01:41 localhost podman[316480]: 2025-12-15 10:01:41.761636406 +0000 UTC m=+0.139433159 container remove 7b146303a978bcb3df5ff2887887992e6db4ec060d166943872f70bbc5777a10 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-0a27b0d3-52a4-4f8c-9083-769f5c00765d, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2) Dec 15 05:01:42 localhost neutron_dhcp_agent[267542]: 2025-12-15 10:01:42.177 267546 INFO neutron.agent.dhcp.agent [None req-8e934fe2-f470-46b0-b43c-734b360eb46a - - - - - -] Network not present, action: clean_devices, action_kwargs: {}#033[00m Dec 15 05:01:42 localhost neutron_dhcp_agent[267542]: 2025-12-15 
10:01:42.423 267546 INFO neutron.agent.dhcp.agent [-] Network not present, action: clean_devices, action_kwargs: {}#033[00m Dec 15 05:01:42 localhost systemd[1]: var-lib-containers-storage-overlay-593702a771b26b2b4a70e8a165408efa4c4335b8fc45cb653feb8bed49b6b747-merged.mount: Deactivated successfully. Dec 15 05:01:42 localhost systemd[1]: run-netns-qdhcp\x2d0a27b0d3\x2d52a4\x2d4f8c\x2d9083\x2d769f5c00765d.mount: Deactivated successfully. Dec 15 05:01:43 localhost neutron_dhcp_agent[267542]: 2025-12-15 10:01:43.199 267546 INFO neutron.agent.dhcp.agent [-] Network not present, action: clean_devices, action_kwargs: {}#033[00m Dec 15 05:01:44 localhost nova_compute[286344]: 2025-12-15 10:01:44.880 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:01:45 localhost ceph-mon[298913]: mon.np0005559462@0(leader).osd e110 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Dec 15 05:01:45 localhost ceph-mon[298913]: mon.np0005559462@0(leader).osd e110 do_prune osdmap full prune enabled Dec 15 05:01:45 localhost ceph-mon[298913]: mon.np0005559462@0(leader).osd e111 e111: 6 total, 6 up, 6 in Dec 15 05:01:45 localhost ceph-mon[298913]: log_channel(cluster) log [DBG] : osdmap e111: 6 total, 6 up, 6 in Dec 15 05:01:46 localhost systemd[1]: Started /usr/bin/podman healthcheck run 4d46c406a3855fd1c322e5d86c048188ea5865daa07ba23aaebacc7879668923. 
Dec 15 05:01:46 localhost nova_compute[286344]: 2025-12-15 10:01:46.726 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:01:46 localhost podman[316507]: 2025-12-15 10:01:46.748501314 +0000 UTC m=+0.078814504 container health_status 4d46c406a3855fd1c322e5d86c048188ea5865daa07ba23aaebacc7879668923 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '07f3af15d79276418f54a8dde2fc251064527c3571430e14af77aaa2c01f1193-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 
Base Image, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb) Dec 15 05:01:46 localhost podman[316507]: 2025-12-15 10:01:46.787520096 +0000 UTC m=+0.117833336 container exec_died 4d46c406a3855fd1c322e5d86c048188ea5865daa07ba23aaebacc7879668923 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '07f3af15d79276418f54a8dde2fc251064527c3571430e14af77aaa2c01f1193-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, config_id=ovn_metadata_agent, org.label-schema.schema-version=1.0, 
tcib_managed=true, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb) Dec 15 05:01:46 localhost systemd[1]: 4d46c406a3855fd1c322e5d86c048188ea5865daa07ba23aaebacc7879668923.service: Deactivated successfully. Dec 15 05:01:46 localhost ovn_controller[154603]: 2025-12-15T10:01:46Z|00156|binding|INFO|Releasing lport b35254ad-12eb-47bb-92be-44fefe0694f0 from this chassis (sb_readonly=0) Dec 15 05:01:46 localhost nova_compute[286344]: 2025-12-15 10:01:46.906 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.123 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359', 'name': 'test', 'flavor': {'id': '2da0e147-aaa7-4bb9-a176-5fe1b15a32a0', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'image': {'id': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-00000002', 'OS-EXT-SRV-ATTR:host': 'np0005559462.localdomain', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': 'c785bf23f53946bc99867d8832a50266', 'user_id': '1ba5fce347b64bfebf995f187193f205', 'hostId': 'bf4d507aa00b3fcf643a7e0b883420632ac7fc96e8c7a756982e121d', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228 Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.124 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.latency in the context of pollsters Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.151 12 DEBUG ceilometer.compute.pollsters [-] 39ff1bd9-6f6b-44c8-bbec-a1fd9d196359/disk.device.write.latency volume: 1243487016 _stats_to_sample 
/usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.152 12 DEBUG ceilometer.compute.pollsters [-] 39ff1bd9-6f6b-44c8-bbec-a1fd9d196359/disk.device.write.latency volume: 24779175 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.154 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '360f59f8-a32a-4f34-af48-d37b7103279e', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 1243487016, 'user_id': '1ba5fce347b64bfebf995f187193f205', 'user_name': None, 'project_id': 'c785bf23f53946bc99867d8832a50266', 'project_name': None, 'resource_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359-vda', 'timestamp': '2025-12-15T10:01:48.124350', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359', 'instance_type': 'm1.small', 'host': 'bf4d507aa00b3fcf643a7e0b883420632ac7fc96e8c7a756982e121d', 'instance_host': 'np0005559462.localdomain', 'flavor': {'id': '2da0e147-aaa7-4bb9-a176-5fe1b15a32a0', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e'}, 'image_ref': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '11e0d376-d99d-11f0-817e-fa163ebaca0f', 'monotonic_time': 11974.31702687, 'message_signature': 
'5c2b5ece028081739e01b20da708e12fdcbbd70582bb2712d1195f4de6205c0a'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 24779175, 'user_id': '1ba5fce347b64bfebf995f187193f205', 'user_name': None, 'project_id': 'c785bf23f53946bc99867d8832a50266', 'project_name': None, 'resource_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359-vdb', 'timestamp': '2025-12-15T10:01:48.124350', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359', 'instance_type': 'm1.small', 'host': 'bf4d507aa00b3fcf643a7e0b883420632ac7fc96e8c7a756982e121d', 'instance_host': 'np0005559462.localdomain', 'flavor': {'id': '2da0e147-aaa7-4bb9-a176-5fe1b15a32a0', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e'}, 'image_ref': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '11e0e8b6-d99d-11f0-817e-fa163ebaca0f', 'monotonic_time': 11974.31702687, 'message_signature': 'b638de4780d129fe0dd43031219a73339e97e105ca5fa43e50fb4a94988b3675'}]}, 'timestamp': '2025-12-15 10:01:48.153281', '_unique_id': 'c094bb40c83b452087237e0726a03d04'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.154 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.154 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 
10:01:48.154 12 ERROR oslo_messaging.notify.messaging yield Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.154 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.154 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.154 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.154 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.154 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.154 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.154 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.154 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.154 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.154 12 ERROR oslo_messaging.notify.messaging conn.connect() Dec 15 05:01:48 localhost 
ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.154 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.154 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.154 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.154 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.154 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.154 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.154 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.154 12 ERROR oslo_messaging.notify.messaging
Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.154 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.154 12 ERROR oslo_messaging.notify.messaging
Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.154 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.154 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.154 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.154 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.154 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.154 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.154 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.154 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.154 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.154 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.154 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.154 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.154 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.154 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.154 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.154 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.154 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.154 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.154 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.154 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.154 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.154 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.154 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.154 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.154 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.154 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.154 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.154 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.154 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.154 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.154 12 ERROR oslo_messaging.notify.messaging
Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.156 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.error in the context of pollsters
Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.159 12 DEBUG ceilometer.compute.pollsters [-] 39ff1bd9-6f6b-44c8-bbec-a1fd9d196359/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.161 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '6c864fb6-a9ae-406c-99d4-3ea878159289', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '1ba5fce347b64bfebf995f187193f205', 'user_name': None, 'project_id': 'c785bf23f53946bc99867d8832a50266', 'project_name': None, 'resource_id': 'instance-00000002-39ff1bd9-6f6b-44c8-bbec-a1fd9d196359-tap03ef8889-32', 'timestamp': '2025-12-15T10:01:48.156349', 'resource_metadata': {'display_name': 'test', 'name': 'tap03ef8889-32', 'instance_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359', 'instance_type': 'm1.small', 'host': 'bf4d507aa00b3fcf643a7e0b883420632ac7fc96e8c7a756982e121d', 'instance_host': 'np0005559462.localdomain', 'flavor': {'id': '2da0e147-aaa7-4bb9-a176-5fe1b15a32a0', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e'}, 'image_ref': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:74:df:7c', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap03ef8889-32'}, 'message_id': '11e1effe-d99d-11f0-817e-fa163ebaca0f', 'monotonic_time': 11974.349046819, 'message_signature': '8216e02333a088fffe101c357ca7664defac5563f94c9a4562d9f8d842c14ee8'}]}, 'timestamp': '2025-12-15 10:01:48.160091', '_unique_id': '132c5e20b6094d0ba1caafbe4ee8933f'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.161 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.161 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.161 12 ERROR oslo_messaging.notify.messaging yield
Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.161 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.161 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.161 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.161 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.161 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.161 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.161 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.161 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.161 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.161 12 ERROR oslo_messaging.notify.messaging conn.connect()
Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.161 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.161 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.161 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.161 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.161 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.161 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.161 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.161 12 ERROR oslo_messaging.notify.messaging
Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.161 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.161 12 ERROR oslo_messaging.notify.messaging
Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.161 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.161 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.161 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.161 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.161 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.161 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.161 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.161 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.161 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.161 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.161 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.161 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.161 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.161 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.161 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.161 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.161 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.161 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.161 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.161 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.161 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.161 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.161 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.161 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.161 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.161 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.161 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.161 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.161 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.161 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.161 12 ERROR oslo_messaging.notify.messaging
Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.162 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no new resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.162 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.error in the context of pollsters
Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.162 12 DEBUG ceilometer.compute.pollsters [-] 39ff1bd9-6f6b-44c8-bbec-a1fd9d196359/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.164 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '5196c298-64c5-410b-82e3-bc2ce49d3758', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '1ba5fce347b64bfebf995f187193f205', 'user_name': None, 'project_id': 'c785bf23f53946bc99867d8832a50266', 'project_name': None, 'resource_id': 'instance-00000002-39ff1bd9-6f6b-44c8-bbec-a1fd9d196359-tap03ef8889-32', 'timestamp': '2025-12-15T10:01:48.162790', 'resource_metadata': {'display_name': 'test', 'name': 'tap03ef8889-32', 'instance_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359', 'instance_type': 'm1.small', 'host': 'bf4d507aa00b3fcf643a7e0b883420632ac7fc96e8c7a756982e121d', 'instance_host': 'np0005559462.localdomain', 'flavor': {'id': '2da0e147-aaa7-4bb9-a176-5fe1b15a32a0', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e'}, 'image_ref': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:74:df:7c', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap03ef8889-32'}, 'message_id': '11e26fb0-d99d-11f0-817e-fa163ebaca0f', 'monotonic_time': 11974.349046819, 'message_signature': '0ba13bdc97f66c6b9e22a18ca9550bcf40d71bb4520a7416bacbeef25d85e10a'}]}, 'timestamp': '2025-12-15 10:01:48.163486', '_unique_id': '6abab135cd194b5d889d03729d8a7a70'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.164 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.164 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.164 12 ERROR oslo_messaging.notify.messaging yield
Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.164 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.164 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.164 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.164 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.164 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.164 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.164 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.164 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.164 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.164 12 ERROR oslo_messaging.notify.messaging conn.connect()
Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.164 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.164 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.164 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.164 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.164 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.164 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.164 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.164 12 ERROR oslo_messaging.notify.messaging
Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.164 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.164 12 ERROR oslo_messaging.notify.messaging
Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.164 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.164 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.164 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.164 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.164 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.164 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.164 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.164 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.164 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.164 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.164 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.164 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.164 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.164 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.164 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.164 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.164 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.164 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.164 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.164 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.164 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.164 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.164 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.164 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.164 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.164 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.164 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.164 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.164 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.164 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.164 12 ERROR oslo_messaging.notify.messaging
Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.165 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes in the context of pollsters
Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.166 12 DEBUG ceilometer.compute.pollsters [-] 39ff1bd9-6f6b-44c8-bbec-a1fd9d196359/network.outgoing.bytes volume: 9770 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.167 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'e591db2f-0ced-400e-8456-70ca6ca5b207', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 9770, 'user_id': '1ba5fce347b64bfebf995f187193f205', 'user_name': None, 'project_id': 'c785bf23f53946bc99867d8832a50266', 'project_name': None, 'resource_id': 'instance-00000002-39ff1bd9-6f6b-44c8-bbec-a1fd9d196359-tap03ef8889-32', 'timestamp': '2025-12-15T10:01:48.165956', 'resource_metadata': {'display_name': 'test', 'name': 'tap03ef8889-32', 'instance_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359', 'instance_type': 'm1.small', 'host': 'bf4d507aa00b3fcf643a7e0b883420632ac7fc96e8c7a756982e121d', 'instance_host': 'np0005559462.localdomain', 'flavor': {'id': '2da0e147-aaa7-4bb9-a176-5fe1b15a32a0', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e'}, 'image_ref': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:74:df:7c', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap03ef8889-32'}, 'message_id': '11e2eb0c-d99d-11f0-817e-fa163ebaca0f', 'monotonic_time': 11974.349046819, 'message_signature': 'a01fde95c94187b69273afd05c9689cfabb527acd3cfc040ab18f14b6b710097'}]}, 'timestamp': '2025-12-15 10:01:48.166493', '_unique_id': '9b2e1f08b7d5451fb416dd0227bb5f56'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.167 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.167 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.167 12 ERROR oslo_messaging.notify.messaging yield
Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.167 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.167 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.167 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.167 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.167 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.167 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.167 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.167 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.167 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.167 12 ERROR oslo_messaging.notify.messaging conn.connect()
Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.167 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.167 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.167 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.167 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.167 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.167 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.167 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.167 12 ERROR oslo_messaging.notify.messaging
Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.167 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.167 12 ERROR oslo_messaging.notify.messaging
Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.167 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.167 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.167 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.167 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.167 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.167 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.167 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Dec 15 05:01:48 localhost
ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.167 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.167 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.167 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.167 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.167 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.167 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.167 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.167 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.167 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.167 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Dec 15 
05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.167 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.167 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.167 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.167 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.167 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.167 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.167 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.167 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.167 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.167 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.167 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.167 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.167 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.167 12 ERROR oslo_messaging.notify.messaging Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.168 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.usage in the context of pollsters Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.179 12 DEBUG ceilometer.compute.pollsters [-] 39ff1bd9-6f6b-44c8-bbec-a1fd9d196359/disk.device.usage volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.180 12 DEBUG ceilometer.compute.pollsters [-] 39ff1bd9-6f6b-44c8-bbec-a1fd9d196359/disk.device.usage volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.181 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': 'adbb8c27-7562-4edb-aa5d-abfa2d791d1c', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '1ba5fce347b64bfebf995f187193f205', 'user_name': None, 'project_id': 'c785bf23f53946bc99867d8832a50266', 'project_name': None, 'resource_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359-vda', 'timestamp': '2025-12-15T10:01:48.168632', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359', 'instance_type': 'm1.small', 'host': 'bf4d507aa00b3fcf643a7e0b883420632ac7fc96e8c7a756982e121d', 'instance_host': 'np0005559462.localdomain', 'flavor': {'id': '2da0e147-aaa7-4bb9-a176-5fe1b15a32a0', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e'}, 'image_ref': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '11e5077a-d99d-11f0-817e-fa163ebaca0f', 'monotonic_time': 11974.361327302, 'message_signature': '94986aebe6a58b4806d369f98f16ea97b7e944f0b6ab0e7951e22b8d6b05d734'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '1ba5fce347b64bfebf995f187193f205', 'user_name': None, 'project_id': 'c785bf23f53946bc99867d8832a50266', 'project_name': None, 'resource_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359-vdb', 'timestamp': '2025-12-15T10:01:48.168632', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359', 
'instance_type': 'm1.small', 'host': 'bf4d507aa00b3fcf643a7e0b883420632ac7fc96e8c7a756982e121d', 'instance_host': 'np0005559462.localdomain', 'flavor': {'id': '2da0e147-aaa7-4bb9-a176-5fe1b15a32a0', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e'}, 'image_ref': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '11e51a30-d99d-11f0-817e-fa163ebaca0f', 'monotonic_time': 11974.361327302, 'message_signature': 'b1bf2caa52bb2b3984b6061b810ae69fe7d9e3bcbc2bb8668d820a7aab1d4339'}]}, 'timestamp': '2025-12-15 10:01:48.180761', '_unique_id': 'dd45801779304582a0013b6543e7e245'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.181 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.181 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.181 12 ERROR oslo_messaging.notify.messaging yield Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.181 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.181 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.181 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.181 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.181 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.181 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.181 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.181 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.181 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.181 12 ERROR oslo_messaging.notify.messaging conn.connect() Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.181 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.181 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.181 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Dec 15 05:01:48 localhost 
ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.181 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.181 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.181 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.181 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.181 12 ERROR oslo_messaging.notify.messaging Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.181 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.181 12 ERROR oslo_messaging.notify.messaging Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.181 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.181 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.181 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.181 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 
10:01:48.181 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.181 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.181 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.181 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.181 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.181 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.181 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.181 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.181 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.181 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Dec 15 05:01:48 localhost 
ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.181 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.181 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.181 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.181 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.181 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.181 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.181 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.181 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.181 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.181 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 
10:01:48.181 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.181 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.181 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.181 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.181 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.181 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.181 12 ERROR oslo_messaging.notify.messaging Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.183 12 INFO ceilometer.polling.manager [-] Polling pollster memory.usage in the context of pollsters Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.199 12 DEBUG ceilometer.compute.pollsters [-] 39ff1bd9-6f6b-44c8-bbec-a1fd9d196359/memory.usage volume: 51.73828125 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.201 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': 'e44de724-d3c0-420d-b2e9-07bfcb5a090e', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'memory.usage', 'counter_type': 'gauge', 'counter_unit': 'MB', 'counter_volume': 51.73828125, 'user_id': '1ba5fce347b64bfebf995f187193f205', 'user_name': None, 'project_id': 'c785bf23f53946bc99867d8832a50266', 'project_name': None, 'resource_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359', 'timestamp': '2025-12-15T10:01:48.183326', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359', 'instance_type': 'm1.small', 'host': 'bf4d507aa00b3fcf643a7e0b883420632ac7fc96e8c7a756982e121d', 'instance_host': 'np0005559462.localdomain', 'flavor': {'id': '2da0e147-aaa7-4bb9-a176-5fe1b15a32a0', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e'}, 'image_ref': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0}, 'message_id': '11e7fbe2-d99d-11f0-817e-fa163ebaca0f', 'monotonic_time': 11974.391544617, 'message_signature': 'a63f3c5b3d70cb9ef931f77b9496bb81bab3efe692dfb6fe455ca2277067884e'}]}, 'timestamp': '2025-12-15 10:01:48.199719', '_unique_id': '860396f73c984024b3e7d635d2553656'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.201 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.201 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in 
_reraise_as_library_errors Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.201 12 ERROR oslo_messaging.notify.messaging yield Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.201 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.201 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.201 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.201 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.201 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.201 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.201 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.201 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.201 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.201 12 
ERROR oslo_messaging.notify.messaging conn.connect() Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.201 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.201 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.201 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.201 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.201 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.201 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.201 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.201 12 ERROR oslo_messaging.notify.messaging Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.201 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.201 12 ERROR oslo_messaging.notify.messaging Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.201 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 
2025-12-15 10:01:48.201 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.201 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.201 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.201 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.201 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.201 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.201 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.201 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.201 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.201 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Dec 15 
05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.201 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.201 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.201 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.201 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.201 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.201 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.201 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.201 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.201 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.201 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Dec 15 05:01:48 localhost 
ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.201 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.201 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.201 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.201 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.201 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.201 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.201 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.201 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.201 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.201 12 ERROR oslo_messaging.notify.messaging Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.202 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no new resources found this cycle poll_and_notify 
/usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193 Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.202 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.drop in the context of pollsters Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.202 12 DEBUG ceilometer.compute.pollsters [-] 39ff1bd9-6f6b-44c8-bbec-a1fd9d196359/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.204 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '60e21ea0-15ff-45c2-973a-1ed2b41496e5', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '1ba5fce347b64bfebf995f187193f205', 'user_name': None, 'project_id': 'c785bf23f53946bc99867d8832a50266', 'project_name': None, 'resource_id': 'instance-00000002-39ff1bd9-6f6b-44c8-bbec-a1fd9d196359-tap03ef8889-32', 'timestamp': '2025-12-15T10:01:48.202664', 'resource_metadata': {'display_name': 'test', 'name': 'tap03ef8889-32', 'instance_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359', 'instance_type': 'm1.small', 'host': 'bf4d507aa00b3fcf643a7e0b883420632ac7fc96e8c7a756982e121d', 'instance_host': 'np0005559462.localdomain', 'flavor': {'id': '2da0e147-aaa7-4bb9-a176-5fe1b15a32a0', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e'}, 'image_ref': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 
'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:74:df:7c', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap03ef8889-32'}, 'message_id': '11e885a8-d99d-11f0-817e-fa163ebaca0f', 'monotonic_time': 11974.349046819, 'message_signature': '1d848590c1a1a0a2041043fa5999f9c58ad89b08fac4717f73dfdf9c27082ddb'}]}, 'timestamp': '2025-12-15 10:01:48.203238', '_unique_id': 'f70b1e1eb89441fcbcd07944c0ef9d0d'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.204 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.204 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.204 12 ERROR oslo_messaging.notify.messaging yield Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.204 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.204 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.204 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.204 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.204 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in 
_connection_factory Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.204 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.204 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.204 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.204 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.204 12 ERROR oslo_messaging.notify.messaging conn.connect() Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.204 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.204 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.204 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.204 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.204 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.204 12 ERROR 
oslo_messaging.notify.messaging self.sock.connect(sa) Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.204 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.204 12 ERROR oslo_messaging.notify.messaging Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.204 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.204 12 ERROR oslo_messaging.notify.messaging Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.204 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.204 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.204 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.204 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.204 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.204 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.204 12 ERROR 
oslo_messaging.notify.messaging return self._send(target, ctxt, message, Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.204 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.204 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.204 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.204 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.204 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.204 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.204 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.204 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.204 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.204 12 ERROR 
oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.204 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.204 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.204 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.204 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.204 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.204 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.204 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.204 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.204 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.204 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Dec 15 05:01:48 
localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.204 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.204 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.204 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.204 12 ERROR oslo_messaging.notify.messaging Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.205 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.requests in the context of pollsters Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.205 12 DEBUG ceilometer.compute.pollsters [-] 39ff1bd9-6f6b-44c8-bbec-a1fd9d196359/disk.device.read.requests volume: 1272 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.205 12 DEBUG ceilometer.compute.pollsters [-] 39ff1bd9-6f6b-44c8-bbec-a1fd9d196359/disk.device.read.requests volume: 124 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.207 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': '0e600132-817a-468a-95bb-ed314814b7cb', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1272, 'user_id': '1ba5fce347b64bfebf995f187193f205', 'user_name': None, 'project_id': 'c785bf23f53946bc99867d8832a50266', 'project_name': None, 'resource_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359-vda', 'timestamp': '2025-12-15T10:01:48.205505', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359', 'instance_type': 'm1.small', 'host': 'bf4d507aa00b3fcf643a7e0b883420632ac7fc96e8c7a756982e121d', 'instance_host': 'np0005559462.localdomain', 'flavor': {'id': '2da0e147-aaa7-4bb9-a176-5fe1b15a32a0', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e'}, 'image_ref': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '11e8f290-d99d-11f0-817e-fa163ebaca0f', 'monotonic_time': 11974.31702687, 'message_signature': '917864c1af180f5e1f4d0d22e4c29eaf78799ef614cc1022e69d40336dfb6781'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 124, 'user_id': '1ba5fce347b64bfebf995f187193f205', 'user_name': None, 'project_id': 'c785bf23f53946bc99867d8832a50266', 'project_name': None, 'resource_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359-vdb', 'timestamp': '2025-12-15T10:01:48.205505', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 
'39ff1bd9-6f6b-44c8-bbec-a1fd9d196359', 'instance_type': 'm1.small', 'host': 'bf4d507aa00b3fcf643a7e0b883420632ac7fc96e8c7a756982e121d', 'instance_host': 'np0005559462.localdomain', 'flavor': {'id': '2da0e147-aaa7-4bb9-a176-5fe1b15a32a0', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e'}, 'image_ref': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '11e905dc-d99d-11f0-817e-fa163ebaca0f', 'monotonic_time': 11974.31702687, 'message_signature': '9aa41d5170a1d69c2141405a3f9fbe60218bfd4ca85bde08f373002a23121684'}]}, 'timestamp': '2025-12-15 10:01:48.206453', '_unique_id': '5762baa0ff0341868501278eac917825'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.207 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.207 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.207 12 ERROR oslo_messaging.notify.messaging yield Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.207 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.207 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.207 12 ERROR 
oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.207 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.207 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.207 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.207 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.207 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.207 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.207 12 ERROR oslo_messaging.notify.messaging conn.connect() Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.207 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.207 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.207 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Dec 15 
05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.207 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.207 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.207 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.207 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.207 12 ERROR oslo_messaging.notify.messaging Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.207 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.207 12 ERROR oslo_messaging.notify.messaging Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.207 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.207 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.207 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.207 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Dec 15 05:01:48 localhost 
ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.207 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.207 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.207 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.207 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.207 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.207 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.207 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.207 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.207 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.207 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, 
in get Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.207 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.207 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.207 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.207 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.207 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.207 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.207 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.207 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.207 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.207 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 15 05:01:48 localhost 
ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.207 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.207 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.207 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.207 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.207 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.207 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.207 12 ERROR oslo_messaging.notify.messaging Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.208 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.latency in the context of pollsters Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.208 12 DEBUG ceilometer.compute.pollsters [-] 39ff1bd9-6f6b-44c8-bbec-a1fd9d196359/disk.device.read.latency volume: 1342134926 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.209 12 DEBUG ceilometer.compute.pollsters [-] 39ff1bd9-6f6b-44c8-bbec-a1fd9d196359/disk.device.read.latency volume: 123356132 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 15 05:01:48 
localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.210 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'd32f28e0-0638-4186-9d65-74347d8f9f51', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 1342134926, 'user_id': '1ba5fce347b64bfebf995f187193f205', 'user_name': None, 'project_id': 'c785bf23f53946bc99867d8832a50266', 'project_name': None, 'resource_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359-vda', 'timestamp': '2025-12-15T10:01:48.208778', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359', 'instance_type': 'm1.small', 'host': 'bf4d507aa00b3fcf643a7e0b883420632ac7fc96e8c7a756982e121d', 'instance_host': 'np0005559462.localdomain', 'flavor': {'id': '2da0e147-aaa7-4bb9-a176-5fe1b15a32a0', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e'}, 'image_ref': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '11e973d2-d99d-11f0-817e-fa163ebaca0f', 'monotonic_time': 11974.31702687, 'message_signature': '4ec68ea05a74168b7fa69ebf1b8dec9666e5c005866f9e019a231b698d92f3db'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 123356132, 'user_id': '1ba5fce347b64bfebf995f187193f205', 'user_name': None, 'project_id': 'c785bf23f53946bc99867d8832a50266', 'project_name': None, 'resource_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359-vdb', 
'timestamp': '2025-12-15T10:01:48.208778', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359', 'instance_type': 'm1.small', 'host': 'bf4d507aa00b3fcf643a7e0b883420632ac7fc96e8c7a756982e121d', 'instance_host': 'np0005559462.localdomain', 'flavor': {'id': '2da0e147-aaa7-4bb9-a176-5fe1b15a32a0', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e'}, 'image_ref': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '11e98750-d99d-11f0-817e-fa163ebaca0f', 'monotonic_time': 11974.31702687, 'message_signature': 'fc2d924035f633165e4f843e890db2426d0cf743c19c56e3c6124f787bca2df4'}]}, 'timestamp': '2025-12-15 10:01:48.209809', '_unique_id': '37a68793c3af4c1da37948e19803fc5b'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.210 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.210 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.210 12 ERROR oslo_messaging.notify.messaging yield Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.210 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.210 12 ERROR oslo_messaging.notify.messaging return 
retry_over_time( Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.210 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.210 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.210 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.210 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.210 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.210 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.210 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.210 12 ERROR oslo_messaging.notify.messaging conn.connect() Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.210 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.210 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.210 12 ERROR 
oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.210 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.210 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.210 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.210 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.210 12 ERROR oslo_messaging.notify.messaging Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.210 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.210 12 ERROR oslo_messaging.notify.messaging Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.210 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.210 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.210 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.210 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.210 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.210 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.210 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.210 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.210 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.210 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.210 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.210 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.210 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.210 
12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.210 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.210 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.210 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.210 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.210 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.210 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.210 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.210 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.210 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.210 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.210 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.210 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.210 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.210 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.210 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.210 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.210 12 ERROR oslo_messaging.notify.messaging Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.211 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets in the context of pollsters Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.212 12 DEBUG ceilometer.compute.pollsters [-] 39ff1bd9-6f6b-44c8-bbec-a1fd9d196359/network.outgoing.packets volume: 114 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.213 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': 'd8caf631-1644-41a8-968d-c53e267dc663', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 114, 'user_id': '1ba5fce347b64bfebf995f187193f205', 'user_name': None, 'project_id': 'c785bf23f53946bc99867d8832a50266', 'project_name': None, 'resource_id': 'instance-00000002-39ff1bd9-6f6b-44c8-bbec-a1fd9d196359-tap03ef8889-32', 'timestamp': '2025-12-15T10:01:48.212054', 'resource_metadata': {'display_name': 'test', 'name': 'tap03ef8889-32', 'instance_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359', 'instance_type': 'm1.small', 'host': 'bf4d507aa00b3fcf643a7e0b883420632ac7fc96e8c7a756982e121d', 'instance_host': 'np0005559462.localdomain', 'flavor': {'id': '2da0e147-aaa7-4bb9-a176-5fe1b15a32a0', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e'}, 'image_ref': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:74:df:7c', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap03ef8889-32'}, 'message_id': '11e9f24e-d99d-11f0-817e-fa163ebaca0f', 'monotonic_time': 11974.349046819, 'message_signature': 'c5d740884fba1dcaa5a0f80e29595ae729a3d0eeebbf880ac6f9fdec03838ef3'}]}, 'timestamp': '2025-12-15 10:01:48.212515', '_unique_id': '22b1633dce334ebc83672cac4ccb191d'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.213 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 15 05:01:48 localhost 
ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.213 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.213 12 ERROR oslo_messaging.notify.messaging yield Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.213 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.213 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.213 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.213 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.213 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.213 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.213 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.213 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.213 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.213 12 ERROR oslo_messaging.notify.messaging conn.connect() Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.213 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.213 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.213 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.213 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.213 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.213 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.213 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.213 12 ERROR oslo_messaging.notify.messaging Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.213 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.213 12 ERROR oslo_messaging.notify.messaging Dec 15 05:01:48 localhost 
ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.213 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.213 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.213 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.213 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.213 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.213 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.213 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.213 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.213 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.213 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection 
Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.213 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.213 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.213 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.213 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.213 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.213 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.213 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.213 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.213 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.213 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Dec 15 05:01:48 
localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.213 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.213 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.213 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.213 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.213 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.213 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.213 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.213 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.213 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.213 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.213 12 ERROR oslo_messaging.notify.messaging Dec 15 05:01:48 localhost 
ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.214 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no new resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193 Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.214 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes in the context of pollsters Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.214 12 DEBUG ceilometer.compute.pollsters [-] 39ff1bd9-6f6b-44c8-bbec-a1fd9d196359/network.incoming.bytes volume: 6809 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.216 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'a487e3a3-f062-4697-9b15-5748812d77b6', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 6809, 'user_id': '1ba5fce347b64bfebf995f187193f205', 'user_name': None, 'project_id': 'c785bf23f53946bc99867d8832a50266', 'project_name': None, 'resource_id': 'instance-00000002-39ff1bd9-6f6b-44c8-bbec-a1fd9d196359-tap03ef8889-32', 'timestamp': '2025-12-15T10:01:48.214740', 'resource_metadata': {'display_name': 'test', 'name': 'tap03ef8889-32', 'instance_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359', 'instance_type': 'm1.small', 'host': 'bf4d507aa00b3fcf643a7e0b883420632ac7fc96e8c7a756982e121d', 'instance_host': 'np0005559462.localdomain', 'flavor': {'id': '2da0e147-aaa7-4bb9-a176-5fe1b15a32a0', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 
'7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e'}, 'image_ref': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:74:df:7c', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap03ef8889-32'}, 'message_id': '11ea5b4e-d99d-11f0-817e-fa163ebaca0f', 'monotonic_time': 11974.349046819, 'message_signature': '7f2c716cb5f3bcd9f6d547096258d303e4efdf9fa73a43bbe714d22718e5f1d9'}]}, 'timestamp': '2025-12-15 10:01:48.215232', '_unique_id': '5b85aa96d0eb4d7da3663426246e1ec1'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.216 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.216 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.216 12 ERROR oslo_messaging.notify.messaging yield Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.216 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.216 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.216 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.216 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Dec 15 05:01:48 localhost 
ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.216 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.216 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.216 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.216 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.216 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.216 12 ERROR oslo_messaging.notify.messaging conn.connect() Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.216 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.216 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.216 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.216 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.216 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.216 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.216 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.216 12 ERROR oslo_messaging.notify.messaging Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.216 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.216 12 ERROR oslo_messaging.notify.messaging Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.216 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.216 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.216 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.216 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.216 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.216 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.216 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.216 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.216 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.216 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.216 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.216 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.216 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.216 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.216 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.216 12 ERROR oslo_messaging.notify.messaging 
File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.216 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.216 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.216 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.216 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.216 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.216 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.216 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.216 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.216 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.216 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in 
__exit__ Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.216 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.216 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.216 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.216 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.216 12 ERROR oslo_messaging.notify.messaging Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.217 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.capacity in the context of pollsters Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.217 12 DEBUG ceilometer.compute.pollsters [-] 39ff1bd9-6f6b-44c8-bbec-a1fd9d196359/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.217 12 DEBUG ceilometer.compute.pollsters [-] 39ff1bd9-6f6b-44c8-bbec-a1fd9d196359/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.219 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': '1a47b17e-826c-46f3-89cd-789fb64d407d', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '1ba5fce347b64bfebf995f187193f205', 'user_name': None, 'project_id': 'c785bf23f53946bc99867d8832a50266', 'project_name': None, 'resource_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359-vda', 'timestamp': '2025-12-15T10:01:48.217281', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359', 'instance_type': 'm1.small', 'host': 'bf4d507aa00b3fcf643a7e0b883420632ac7fc96e8c7a756982e121d', 'instance_host': 'np0005559462.localdomain', 'flavor': {'id': '2da0e147-aaa7-4bb9-a176-5fe1b15a32a0', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e'}, 'image_ref': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '11eabe0e-d99d-11f0-817e-fa163ebaca0f', 'monotonic_time': 11974.361327302, 'message_signature': '8df8f5588edd48db7e47ebaba656a87bfbd6ff390071bd43a23c610ef4a8461d'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '1ba5fce347b64bfebf995f187193f205', 'user_name': None, 'project_id': 'c785bf23f53946bc99867d8832a50266', 'project_name': None, 'resource_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359-vdb', 'timestamp': '2025-12-15T10:01:48.217281', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359', 
'instance_type': 'm1.small', 'host': 'bf4d507aa00b3fcf643a7e0b883420632ac7fc96e8c7a756982e121d', 'instance_host': 'np0005559462.localdomain', 'flavor': {'id': '2da0e147-aaa7-4bb9-a176-5fe1b15a32a0', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e'}, 'image_ref': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '11eacde0-d99d-11f0-817e-fa163ebaca0f', 'monotonic_time': 11974.361327302, 'message_signature': 'b169e6ce76ed2a95f76d30fe857c9b969f099bb5c43c33ff64c5a54a64724243'}]}, 'timestamp': '2025-12-15 10:01:48.218138', '_unique_id': 'b5c0dbf564894b31b01b19d618cf1e48'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.219 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.219 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.219 12 ERROR oslo_messaging.notify.messaging yield Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.219 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.219 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.219 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.219 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.219 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.219 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.219 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.219 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.219 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.219 12 ERROR oslo_messaging.notify.messaging conn.connect() Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.219 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.219 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.219 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Dec 15 05:01:48 localhost 
ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.219 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.219 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.219 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.219 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.219 12 ERROR oslo_messaging.notify.messaging Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.219 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.219 12 ERROR oslo_messaging.notify.messaging Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.219 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.219 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.219 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.219 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 
10:01:48.219 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.219 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.219 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.219 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.219 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.219 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.219 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.219 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.219 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.219 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Dec 15 05:01:48 localhost 
ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.219 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.219 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.219 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.219 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.219 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.219 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.219 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.219 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.219 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.219 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 
10:01:48.219 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.219 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.219 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.219 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.219 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.219 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.219 12 ERROR oslo_messaging.notify.messaging Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.220 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.requests in the context of pollsters Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.220 12 DEBUG ceilometer.compute.pollsters [-] 39ff1bd9-6f6b-44c8-bbec-a1fd9d196359/disk.device.write.requests volume: 47 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.220 12 DEBUG ceilometer.compute.pollsters [-] 39ff1bd9-6f6b-44c8-bbec-a1fd9d196359/disk.device.write.requests volume: 1 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 
10:01:48.222 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '52afc3d7-23a8-4c91-8888-9f61fe8b67c7', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 47, 'user_id': '1ba5fce347b64bfebf995f187193f205', 'user_name': None, 'project_id': 'c785bf23f53946bc99867d8832a50266', 'project_name': None, 'resource_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359-vda', 'timestamp': '2025-12-15T10:01:48.220251', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359', 'instance_type': 'm1.small', 'host': 'bf4d507aa00b3fcf643a7e0b883420632ac7fc96e8c7a756982e121d', 'instance_host': 'np0005559462.localdomain', 'flavor': {'id': '2da0e147-aaa7-4bb9-a176-5fe1b15a32a0', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e'}, 'image_ref': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '11eb31ea-d99d-11f0-817e-fa163ebaca0f', 'monotonic_time': 11974.31702687, 'message_signature': 'af0ae746528770d532068fd864edde7b4308b84428afd20f119f8d081048a610'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1, 'user_id': '1ba5fce347b64bfebf995f187193f205', 'user_name': None, 'project_id': 'c785bf23f53946bc99867d8832a50266', 'project_name': None, 'resource_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359-vdb', 'timestamp': '2025-12-15T10:01:48.220251', 
'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359', 'instance_type': 'm1.small', 'host': 'bf4d507aa00b3fcf643a7e0b883420632ac7fc96e8c7a756982e121d', 'instance_host': 'np0005559462.localdomain', 'flavor': {'id': '2da0e147-aaa7-4bb9-a176-5fe1b15a32a0', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e'}, 'image_ref': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '11eb41c6-d99d-11f0-817e-fa163ebaca0f', 'monotonic_time': 11974.31702687, 'message_signature': 'a5a70a9c65b15c309a3b9dc4d9346d905885d30baa0ccd0ceac8d3096db6958f'}]}, 'timestamp': '2025-12-15 10:01:48.221103', '_unique_id': '667afccb9a46448cb6ce7c6b21fb871f'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.222 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.222 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.222 12 ERROR oslo_messaging.notify.messaging yield Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.222 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.222 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 15 05:01:48 localhost 
ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.222 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.222 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.222 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.222 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.222 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.222 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.222 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.222 12 ERROR oslo_messaging.notify.messaging conn.connect() Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.222 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.222 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.222 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.222 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.222 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.222 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.222 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.222 12 ERROR oslo_messaging.notify.messaging Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.222 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.222 12 ERROR oslo_messaging.notify.messaging Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.222 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.222 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.222 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.222 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 
134, in _send_notification Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.222 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.222 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.222 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.222 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.222 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.222 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.222 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.222 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.222 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.222 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.222 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.222 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.222 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.222 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.222 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.222 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.222 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.222 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.222 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.222 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", 
line 433, in _ensure_connection Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.222 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.222 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.222 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.222 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.222 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.222 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.222 12 ERROR oslo_messaging.notify.messaging Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.223 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.delta in the context of pollsters Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.223 12 DEBUG ceilometer.compute.pollsters [-] 39ff1bd9-6f6b-44c8-bbec-a1fd9d196359/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.224 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': '4f8ae1f7-8167-4f27-aa7d-e630fae953b9', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '1ba5fce347b64bfebf995f187193f205', 'user_name': None, 'project_id': 'c785bf23f53946bc99867d8832a50266', 'project_name': None, 'resource_id': 'instance-00000002-39ff1bd9-6f6b-44c8-bbec-a1fd9d196359-tap03ef8889-32', 'timestamp': '2025-12-15T10:01:48.223207', 'resource_metadata': {'display_name': 'test', 'name': 'tap03ef8889-32', 'instance_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359', 'instance_type': 'm1.small', 'host': 'bf4d507aa00b3fcf643a7e0b883420632ac7fc96e8c7a756982e121d', 'instance_host': 'np0005559462.localdomain', 'flavor': {'id': '2da0e147-aaa7-4bb9-a176-5fe1b15a32a0', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e'}, 'image_ref': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:74:df:7c', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap03ef8889-32'}, 'message_id': '11eba5a8-d99d-11f0-817e-fa163ebaca0f', 'monotonic_time': 11974.349046819, 'message_signature': '4b4629a49337cce3c673b5b4d5ddd202f38b22274cf28b33aeb39fdf140e5ce3'}]}, 'timestamp': '2025-12-15 10:01:48.223660', '_unique_id': '8a6a3f04b6214c41939baa0d7ef8d723'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.224 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.224 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.224 12 ERROR oslo_messaging.notify.messaging yield
Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.224 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.224 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.224 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.224 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.224 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.224 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.224 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.224 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.224 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.224 12 ERROR oslo_messaging.notify.messaging conn.connect()
Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.224 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.224 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.224 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.224 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.224 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.224 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.224 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.224 12 ERROR oslo_messaging.notify.messaging
Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.224 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.224 12 ERROR oslo_messaging.notify.messaging
Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.224 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.224 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.224 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.224 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.224 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.224 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.224 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.224 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.224 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.224 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.224 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.224 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.224 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.224 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.224 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.224 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.224 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.224 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.224 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.224 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.224 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.224 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.224 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.224 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.224 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.224 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.224 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.224 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.224 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.224 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.224 12 ERROR oslo_messaging.notify.messaging
Dec 15 05:01:48 localhost
ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.225 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no new resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.225 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.bytes in the context of pollsters
Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.225 12 DEBUG ceilometer.compute.pollsters [-] 39ff1bd9-6f6b-44c8-bbec-a1fd9d196359/disk.device.write.bytes volume: 389120 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.226 12 DEBUG ceilometer.compute.pollsters [-] 39ff1bd9-6f6b-44c8-bbec-a1fd9d196359/disk.device.write.bytes volume: 512 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.227 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications.
Payload={'message_id': '556e47b6-3d76-40f3-89f3-4efc14d973f4', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 389120, 'user_id': '1ba5fce347b64bfebf995f187193f205', 'user_name': None, 'project_id': 'c785bf23f53946bc99867d8832a50266', 'project_name': None, 'resource_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359-vda', 'timestamp': '2025-12-15T10:01:48.225836', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359', 'instance_type': 'm1.small', 'host': 'bf4d507aa00b3fcf643a7e0b883420632ac7fc96e8c7a756982e121d', 'instance_host': 'np0005559462.localdomain', 'flavor': {'id': '2da0e147-aaa7-4bb9-a176-5fe1b15a32a0', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e'}, 'image_ref': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '11ec0d9a-d99d-11f0-817e-fa163ebaca0f', 'monotonic_time': 11974.31702687, 'message_signature': '8ed22ce6aacecc350bfb24fcc7e988722ad8fee1829dc1eb5c16fdf309e21728'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 512, 'user_id': '1ba5fce347b64bfebf995f187193f205', 'user_name': None, 'project_id': 'c785bf23f53946bc99867d8832a50266', 'project_name': None, 'resource_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359-vdb', 'timestamp': '2025-12-15T10:01:48.225836', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359', 'instance_type': 'm1.small', 'host': 'bf4d507aa00b3fcf643a7e0b883420632ac7fc96e8c7a756982e121d', 'instance_host': 'np0005559462.localdomain', 'flavor': {'id': '2da0e147-aaa7-4bb9-a176-5fe1b15a32a0', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e'}, 'image_ref': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '11ec1d9e-d99d-11f0-817e-fa163ebaca0f', 'monotonic_time': 11974.31702687, 'message_signature': '7ae9867dc871b72ae1ec3a48e5feced4223ae15c218300ce03f7db1dd74513f8'}]}, 'timestamp': '2025-12-15 10:01:48.226723', '_unique_id': 'e47c5d9d9e0a4f499ea6e9047edafb93'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.227 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.227 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.227 12 ERROR oslo_messaging.notify.messaging yield
Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.227 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.227 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.227 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.227 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.227 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.227 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.227 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.227 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.227 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.227 12 ERROR oslo_messaging.notify.messaging conn.connect()
Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.227 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.227 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.227 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.227 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.227 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.227 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.227 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.227 12 ERROR oslo_messaging.notify.messaging
Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.227 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.227 12 ERROR oslo_messaging.notify.messaging
Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.227 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.227 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.227 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.227 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.227 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.227 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.227 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.227 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.227 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.227 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.227 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.227 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.227 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.227 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.227 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.227 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.227 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.227 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.227 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.227 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.227 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.227 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.227 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.227 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.227 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.227 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.227 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.227 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.227 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.227 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.227 12 ERROR oslo_messaging.notify.messaging
Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.228 12 INFO ceilometer.polling.manager [-] Polling pollster cpu in the context of pollsters
Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.228 12 DEBUG ceilometer.compute.pollsters [-] 39ff1bd9-6f6b-44c8-bbec-a1fd9d196359/cpu volume: 14020000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.230 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications.
Payload={'message_id': '078fd988-bd51-4f90-b489-a20eed300722', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 14020000000, 'user_id': '1ba5fce347b64bfebf995f187193f205', 'user_name': None, 'project_id': 'c785bf23f53946bc99867d8832a50266', 'project_name': None, 'resource_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359', 'timestamp': '2025-12-15T10:01:48.228807', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359', 'instance_type': 'm1.small', 'host': 'bf4d507aa00b3fcf643a7e0b883420632ac7fc96e8c7a756982e121d', 'instance_host': 'np0005559462.localdomain', 'flavor': {'id': '2da0e147-aaa7-4bb9-a176-5fe1b15a32a0', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e'}, 'image_ref': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'cpu_number': 1}, 'message_id': '11ec816c-d99d-11f0-817e-fa163ebaca0f', 'monotonic_time': 11974.391544617, 'message_signature': '8a7cb1c5e286a45a3185ec6803a6c9285e53c5961e0052367207019a66a53d3a'}]}, 'timestamp': '2025-12-15 10:01:48.229270', '_unique_id': '72e488608dd240ba98c319226f739add'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.230 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.230 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.230 12 ERROR oslo_messaging.notify.messaging yield
Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.230 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.230 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.230 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.230 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.230 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.230 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.230 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.230 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.230 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.230 12 ERROR oslo_messaging.notify.messaging conn.connect()
Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.230 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.230 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.230 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.230 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.230 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.230 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.230 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.230 12 ERROR oslo_messaging.notify.messaging
Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.230 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.230 12 ERROR oslo_messaging.notify.messaging
Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.230 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.230 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.230 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.230 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.230 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.230 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.230 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.230 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.230 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.230 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.230 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.230 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.230 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.230 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.230 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.230 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.230 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.230 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.230 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.230 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.230 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.230 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.230 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.230 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.230 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.230 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.230 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.230 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.230 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.230 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.230 12 ERROR oslo_messaging.notify.messaging
Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.231 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.allocation in the context of pollsters
Dec 15 05:01:48
localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.231 12 DEBUG ceilometer.compute.pollsters [-] 39ff1bd9-6f6b-44c8-bbec-a1fd9d196359/disk.device.allocation volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.231 12 DEBUG ceilometer.compute.pollsters [-] 39ff1bd9-6f6b-44c8-bbec-a1fd9d196359/disk.device.allocation volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.233 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'e1b4d11d-613d-4e3a-a8fd-850751e8c89c', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '1ba5fce347b64bfebf995f187193f205', 'user_name': None, 'project_id': 'c785bf23f53946bc99867d8832a50266', 'project_name': None, 'resource_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359-vda', 'timestamp': '2025-12-15T10:01:48.231319', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359', 'instance_type': 'm1.small', 'host': 'bf4d507aa00b3fcf643a7e0b883420632ac7fc96e8c7a756982e121d', 'instance_host': 'np0005559462.localdomain', 'flavor': {'id': '2da0e147-aaa7-4bb9-a176-5fe1b15a32a0', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e'}, 'image_ref': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 
'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '11ece256-d99d-11f0-817e-fa163ebaca0f', 'monotonic_time': 11974.361327302, 'message_signature': '62ba5e375d81d363230c0bc11c13bc0751ca31f85165deb3adfcf8be69ae48ac'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '1ba5fce347b64bfebf995f187193f205', 'user_name': None, 'project_id': 'c785bf23f53946bc99867d8832a50266', 'project_name': None, 'resource_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359-vdb', 'timestamp': '2025-12-15T10:01:48.231319', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359', 'instance_type': 'm1.small', 'host': 'bf4d507aa00b3fcf643a7e0b883420632ac7fc96e8c7a756982e121d', 'instance_host': 'np0005559462.localdomain', 'flavor': {'id': '2da0e147-aaa7-4bb9-a176-5fe1b15a32a0', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e'}, 'image_ref': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '11ecf214-d99d-11f0-817e-fa163ebaca0f', 'monotonic_time': 11974.361327302, 'message_signature': '3b6f2db4d70f6122972f1a1f4a54e8bcd9d83fc413e1a913d954f36192dcbeac'}]}, 'timestamp': '2025-12-15 10:01:48.232172', '_unique_id': '577732c0c54f41f6822a414c609a6a72'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.233 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.233 12 ERROR oslo_messaging.notify.messaging 
File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.233 12 ERROR oslo_messaging.notify.messaging yield Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.233 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.233 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.233 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.233 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.233 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.233 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.233 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.233 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.233 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Dec 15 05:01:48 
localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.233 12 ERROR oslo_messaging.notify.messaging conn.connect() Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.233 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.233 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.233 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.233 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.233 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.233 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.233 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.233 12 ERROR oslo_messaging.notify.messaging Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.233 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.233 12 ERROR oslo_messaging.notify.messaging Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.233 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call 
last): Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.233 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.233 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.233 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.233 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.233 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.233 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.233 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.233 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.233 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.233 12 ERROR oslo_messaging.notify.messaging 
return rpc_common.ConnectionContext(self._connection_pool, Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.233 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.233 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.233 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.233 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.233 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.233 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.233 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.233 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.233 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.233 12 ERROR oslo_messaging.notify.messaging 
self.connection.ensure_connection( Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.233 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.233 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.233 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.233 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.233 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.233 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.233 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.233 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.233 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.233 12 ERROR oslo_messaging.notify.messaging Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.234 12 INFO ceilometer.polling.manager [-] Polling pollster 
network.incoming.bytes.delta in the context of pollsters Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.234 12 DEBUG ceilometer.compute.pollsters [-] 39ff1bd9-6f6b-44c8-bbec-a1fd9d196359/network.incoming.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.235 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'ed525430-8a6a-401a-816e-0940efe7f016', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '1ba5fce347b64bfebf995f187193f205', 'user_name': None, 'project_id': 'c785bf23f53946bc99867d8832a50266', 'project_name': None, 'resource_id': 'instance-00000002-39ff1bd9-6f6b-44c8-bbec-a1fd9d196359-tap03ef8889-32', 'timestamp': '2025-12-15T10:01:48.234263', 'resource_metadata': {'display_name': 'test', 'name': 'tap03ef8889-32', 'instance_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359', 'instance_type': 'm1.small', 'host': 'bf4d507aa00b3fcf643a7e0b883420632ac7fc96e8c7a756982e121d', 'instance_host': 'np0005559462.localdomain', 'flavor': {'id': '2da0e147-aaa7-4bb9-a176-5fe1b15a32a0', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e'}, 'image_ref': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:74:df:7c', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap03ef8889-32'}, 'message_id': 
'11ed557e-d99d-11f0-817e-fa163ebaca0f', 'monotonic_time': 11974.349046819, 'message_signature': '052db9fb4b02094e643d5d0e166d08a97dab832472d0b63de91d0083adf46456'}]}, 'timestamp': '2025-12-15 10:01:48.234764', '_unique_id': '8696e4a6117d465ea961debc0bbc4863'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.235 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.235 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.235 12 ERROR oslo_messaging.notify.messaging yield Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.235 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.235 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.235 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.235 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.235 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.235 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Dec 15 05:01:48 
localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.235 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.235 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.235 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.235 12 ERROR oslo_messaging.notify.messaging conn.connect() Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.235 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.235 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.235 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.235 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.235 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.235 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.235 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] 
Connection refused Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.235 12 ERROR oslo_messaging.notify.messaging Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.235 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.235 12 ERROR oslo_messaging.notify.messaging Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.235 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.235 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.235 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.235 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.235 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.235 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.235 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.235 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.235 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.235 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.235 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.235 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.235 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.235 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.235 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.235 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.235 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.235 12 ERROR 
oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.235 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.235 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.235 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.235 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.235 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.235 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.235 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.235 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.235 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.235 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in 
_reraise_as_library_errors Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.235 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.235 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.235 12 ERROR oslo_messaging.notify.messaging Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.235 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.drop in the context of pollsters Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.236 12 DEBUG ceilometer.compute.pollsters [-] 39ff1bd9-6f6b-44c8-bbec-a1fd9d196359/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.236 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': 'fda2ca06-ab11-41dc-b693-eff6ddce5270', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '1ba5fce347b64bfebf995f187193f205', 'user_name': None, 'project_id': 'c785bf23f53946bc99867d8832a50266', 'project_name': None, 'resource_id': 'instance-00000002-39ff1bd9-6f6b-44c8-bbec-a1fd9d196359-tap03ef8889-32', 'timestamp': '2025-12-15T10:01:48.236082', 'resource_metadata': {'display_name': 'test', 'name': 'tap03ef8889-32', 'instance_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359', 'instance_type': 'm1.small', 'host': 'bf4d507aa00b3fcf643a7e0b883420632ac7fc96e8c7a756982e121d', 'instance_host': 'np0005559462.localdomain', 'flavor': {'id': '2da0e147-aaa7-4bb9-a176-5fe1b15a32a0', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e'}, 'image_ref': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:74:df:7c', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap03ef8889-32'}, 'message_id': '11ed9962-d99d-11f0-817e-fa163ebaca0f', 'monotonic_time': 11974.349046819, 'message_signature': 'f78fed44c364a9fea1ffdae655653792bbc021fe27f7fb86e1935783123a89fb'}]}, 'timestamp': '2025-12-15 10:01:48.236365', '_unique_id': '8deb3fa2805646bbb1aececbe95a5eb6'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.236 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 15 05:01:48 localhost 
ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.236 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.236 12 ERROR oslo_messaging.notify.messaging yield Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.236 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.236 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.236 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.236 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.236 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.236 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.236 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.236 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.236 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.236 12 ERROR oslo_messaging.notify.messaging conn.connect() Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.236 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.236 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.236 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.236 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.236 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.236 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.236 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.236 12 ERROR oslo_messaging.notify.messaging Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.236 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.236 12 ERROR oslo_messaging.notify.messaging Dec 15 05:01:48 localhost 
ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.236 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.236 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.236 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.236 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.236 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.236 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.236 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.236 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.236 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.236 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection 
Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.236 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.236 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.236 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.236 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.236 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.236 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.236 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.236 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.236 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.236 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Dec 15 05:01:48 
localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.236 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.236 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.236 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.236 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.236 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.236 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.236 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.236 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.236 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.236 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.236 12 ERROR oslo_messaging.notify.messaging Dec 15 05:01:48 localhost 
ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.237 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets in the context of pollsters Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.237 12 DEBUG ceilometer.compute.pollsters [-] 39ff1bd9-6f6b-44c8-bbec-a1fd9d196359/network.incoming.packets volume: 60 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.238 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '8ba9845f-99ae-42ab-9c65-9365485cf9a2', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 60, 'user_id': '1ba5fce347b64bfebf995f187193f205', 'user_name': None, 'project_id': 'c785bf23f53946bc99867d8832a50266', 'project_name': None, 'resource_id': 'instance-00000002-39ff1bd9-6f6b-44c8-bbec-a1fd9d196359-tap03ef8889-32', 'timestamp': '2025-12-15T10:01:48.237618', 'resource_metadata': {'display_name': 'test', 'name': 'tap03ef8889-32', 'instance_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359', 'instance_type': 'm1.small', 'host': 'bf4d507aa00b3fcf643a7e0b883420632ac7fc96e8c7a756982e121d', 'instance_host': 'np0005559462.localdomain', 'flavor': {'id': '2da0e147-aaa7-4bb9-a176-5fe1b15a32a0', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e'}, 'image_ref': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:74:df:7c', 'fref': None, 
'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap03ef8889-32'}, 'message_id': '11edd54e-d99d-11f0-817e-fa163ebaca0f', 'monotonic_time': 11974.349046819, 'message_signature': '8c6090cbc96b85dd1c6cb3dca2e1524b86c63855c42f5250b3ff0ee00b8ce521'}]}, 'timestamp': '2025-12-15 10:01:48.237901', '_unique_id': 'f3d77dcb13554bf9905ac6fd9c736d50'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.238 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.238 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.238 12 ERROR oslo_messaging.notify.messaging yield Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.238 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.238 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.238 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.238 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.238 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.238 12 ERROR 
oslo_messaging.notify.messaging self._connection = self._establish_connection() Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.238 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.238 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.238 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.238 12 ERROR oslo_messaging.notify.messaging conn.connect() Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.238 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.238 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.238 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.238 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.238 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.238 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 
2025-12-15 10:01:48.238 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.238 12 ERROR oslo_messaging.notify.messaging Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.238 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.238 12 ERROR oslo_messaging.notify.messaging Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.238 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.238 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.238 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.238 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.238 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.238 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.238 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Dec 15 05:01:48 localhost 
ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.238 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.238 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.238 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.238 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.238 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.238 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.238 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.238 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.238 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.238 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Dec 15 
05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.238 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.238 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.238 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.238 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.238 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.238 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.238 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.238 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.238 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.238 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.238 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.238 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.238 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.238 12 ERROR oslo_messaging.notify.messaging Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.239 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.bytes in the context of pollsters Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.239 12 DEBUG ceilometer.compute.pollsters [-] 39ff1bd9-6f6b-44c8-bbec-a1fd9d196359/disk.device.read.bytes volume: 35560448 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.239 12 DEBUG ceilometer.compute.pollsters [-] 39ff1bd9-6f6b-44c8-bbec-a1fd9d196359/disk.device.read.bytes volume: 2154496 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.240 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': 'bc80dbb4-3a44-48fc-b897-5e4743289f63', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 35560448, 'user_id': '1ba5fce347b64bfebf995f187193f205', 'user_name': None, 'project_id': 'c785bf23f53946bc99867d8832a50266', 'project_name': None, 'resource_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359-vda', 'timestamp': '2025-12-15T10:01:48.239198', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359', 'instance_type': 'm1.small', 'host': 'bf4d507aa00b3fcf643a7e0b883420632ac7fc96e8c7a756982e121d', 'instance_host': 'np0005559462.localdomain', 'flavor': {'id': '2da0e147-aaa7-4bb9-a176-5fe1b15a32a0', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e'}, 'image_ref': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '11ee14a0-d99d-11f0-817e-fa163ebaca0f', 'monotonic_time': 11974.31702687, 'message_signature': '318d8c2a5a622fb0f516a032bc8e39c6dccd60d9a9d94cf9ece400656404a652'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 2154496, 'user_id': '1ba5fce347b64bfebf995f187193f205', 'user_name': None, 'project_id': 'c785bf23f53946bc99867d8832a50266', 'project_name': None, 'resource_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359-vdb', 'timestamp': '2025-12-15T10:01:48.239198', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 
'39ff1bd9-6f6b-44c8-bbec-a1fd9d196359', 'instance_type': 'm1.small', 'host': 'bf4d507aa00b3fcf643a7e0b883420632ac7fc96e8c7a756982e121d', 'instance_host': 'np0005559462.localdomain', 'flavor': {'id': '2da0e147-aaa7-4bb9-a176-5fe1b15a32a0', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e'}, 'image_ref': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '11ee1ec8-d99d-11f0-817e-fa163ebaca0f', 'monotonic_time': 11974.31702687, 'message_signature': '58b10ec00bbd85116fec62b853952da666d7915b3e84759c974c066381c13573'}]}, 'timestamp': '2025-12-15 10:01:48.239763', '_unique_id': 'd974633ddf3a4511b084f01d76984a5f'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.240 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.240 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.240 12 ERROR oslo_messaging.notify.messaging yield Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.240 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.240 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.240 12 ERROR 
oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.240 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.240 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.240 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.240 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.240 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.240 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.240 12 ERROR oslo_messaging.notify.messaging conn.connect() Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.240 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.240 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.240 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Dec 15 
05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.240 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.240 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.240 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.240 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.240 12 ERROR oslo_messaging.notify.messaging Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.240 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.240 12 ERROR oslo_messaging.notify.messaging Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.240 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.240 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.240 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.240 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Dec 15 05:01:48 localhost 
ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.240 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.240 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.240 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.240 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.240 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.240 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.240 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.240 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.240 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.240 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, 
in get Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.240 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.240 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.240 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.240 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.240 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.240 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.240 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.240 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.240 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.240 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 15 05:01:48 localhost 
ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.240 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.240 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.240 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.240 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.240 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.240 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 15 05:01:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:01:48.240 12 ERROR oslo_messaging.notify.messaging Dec 15 05:01:49 localhost systemd[1]: Started /usr/bin/podman healthcheck run a4e94bd7aaee6c174c601060fb5f34559019ccea93fc397263ee3735731cfc1e. Dec 15 05:01:49 localhost systemd[1]: tmp-crun.7lAubi.mount: Deactivated successfully. 
Dec 15 05:01:49 localhost podman[316525]: 2025-12-15 10:01:49.856211788 +0000 UTC m=+0.185484145 container health_status a4e94bd7aaee6c174c601060fb5f34559019ccea93fc397263ee3735731cfc1e (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible) Dec 15 05:01:49 localhost nova_compute[286344]: 2025-12-15 10:01:49.937 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:01:49 localhost podman[316525]: 2025-12-15 10:01:49.96253976 +0000 UTC m=+0.291812147 container exec_died a4e94bd7aaee6c174c601060fb5f34559019ccea93fc397263ee3735731cfc1e (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 
'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}) Dec 15 05:01:50 localhost systemd[1]: a4e94bd7aaee6c174c601060fb5f34559019ccea93fc397263ee3735731cfc1e.service: Deactivated successfully. Dec 15 05:01:50 localhost neutron_sriov_agent[260044]: 2025-12-15 10:01:50.354 2 INFO neutron.agent.securitygroups_rpc [None req-d20a3d84-599b-4ad0-9bd0-84a025315a2f 81c354e8024a4f49a3f913eba91f220d 22113fb22b5d4b6487fec57cee18ff23 - - default default] Security group member updated ['c7f6d550-5740-4ef8-8ff3-564deea47f02']#033[00m Dec 15 05:01:50 localhost ceph-mon[298913]: mon.np0005559462@0(leader).osd e111 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Dec 15 05:01:51 localhost ovn_metadata_agent[160585]: 2025-12-15 10:01:51.479 160590 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Dec 15 05:01:51 localhost ovn_metadata_agent[160585]: 2025-12-15 10:01:51.479 160590 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Dec 15 05:01:51 localhost ovn_metadata_agent[160585]: 2025-12-15 10:01:51.480 160590 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner 
/usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Dec 15 05:01:51 localhost nova_compute[286344]: 2025-12-15 10:01:51.730 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:01:52 localhost ovn_controller[154603]: 2025-12-15T10:01:52Z|00157|binding|INFO|Releasing lport b35254ad-12eb-47bb-92be-44fefe0694f0 from this chassis (sb_readonly=0) Dec 15 05:01:52 localhost nova_compute[286344]: 2025-12-15 10:01:52.725 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:01:52 localhost neutron_dhcp_agent[267542]: 2025-12-15 10:01:52.777 267546 INFO neutron.agent.linux.ip_lib [None req-411d2e84-97ce-447d-ad99-68143474b0cf - - - - - -] Device tap3dd1228a-86 cannot be used as it has no MAC address#033[00m Dec 15 05:01:52 localhost nova_compute[286344]: 2025-12-15 10:01:52.800 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:01:52 localhost kernel: device tap3dd1228a-86 entered promiscuous mode Dec 15 05:01:52 localhost nova_compute[286344]: 2025-12-15 10:01:52.810 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:01:52 localhost NetworkManager[5963]: [1765792912.8142] manager: (tap3dd1228a-86): new Generic device (/org/freedesktop/NetworkManager/Devices/28) Dec 15 05:01:52 localhost ovn_controller[154603]: 2025-12-15T10:01:52Z|00158|binding|INFO|Claiming lport 3dd1228a-865d-4119-ba96-8c2b1a62bcd9 for this chassis. Dec 15 05:01:52 localhost ovn_controller[154603]: 2025-12-15T10:01:52Z|00159|binding|INFO|3dd1228a-865d-4119-ba96-8c2b1a62bcd9: Claiming unknown Dec 15 05:01:52 localhost systemd-udevd[316626]: Network interface NamePolicy= disabled on kernel command line. 
Dec 15 05:01:52 localhost ovn_metadata_agent[160585]: 2025-12-15 10:01:52.833 160590 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005559462.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 'dhcpce287b4b-a043-5ed9-8e08-4fd555d768a2-3c7fb015-354b-4d02-b527-15571516e679', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-3c7fb015-354b-4d02-b527-15571516e679', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'e293c1347d4b492e9405d19a0168b9af', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=f63083fa-ddf2-4e6e-8d5f-204f8d11fdb7, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=3dd1228a-865d-4119-ba96-8c2b1a62bcd9) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Dec 15 05:01:52 localhost ovn_metadata_agent[160585]: 2025-12-15 10:01:52.835 160590 INFO neutron.agent.ovn.metadata.agent [-] Port 3dd1228a-865d-4119-ba96-8c2b1a62bcd9 in datapath 3c7fb015-354b-4d02-b527-15571516e679 bound to our chassis#033[00m Dec 15 05:01:52 localhost ovn_metadata_agent[160585]: 2025-12-15 10:01:52.837 160590 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network 3c7fb015-354b-4d02-b527-15571516e679 or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params 
/usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599#033[00m Dec 15 05:01:52 localhost ovn_metadata_agent[160585]: 2025-12-15 10:01:52.840 160858 DEBUG oslo.privsep.daemon [-] privsep: reply[da977147-32b9-410c-b025-0698c75c674c]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Dec 15 05:01:52 localhost journal[231322]: ethtool ioctl error on tap3dd1228a-86: No such device Dec 15 05:01:52 localhost nova_compute[286344]: 2025-12-15 10:01:52.848 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:01:52 localhost journal[231322]: ethtool ioctl error on tap3dd1228a-86: No such device Dec 15 05:01:52 localhost journal[231322]: ethtool ioctl error on tap3dd1228a-86: No such device Dec 15 05:01:52 localhost ovn_controller[154603]: 2025-12-15T10:01:52Z|00160|binding|INFO|Setting lport 3dd1228a-865d-4119-ba96-8c2b1a62bcd9 ovn-installed in OVS Dec 15 05:01:52 localhost ovn_controller[154603]: 2025-12-15T10:01:52Z|00161|binding|INFO|Setting lport 3dd1228a-865d-4119-ba96-8c2b1a62bcd9 up in Southbound Dec 15 05:01:52 localhost nova_compute[286344]: 2025-12-15 10:01:52.856 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:01:52 localhost nova_compute[286344]: 2025-12-15 10:01:52.859 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:01:52 localhost journal[231322]: ethtool ioctl error on tap3dd1228a-86: No such device Dec 15 05:01:52 localhost journal[231322]: ethtool ioctl error on tap3dd1228a-86: No such device Dec 15 05:01:52 localhost journal[231322]: ethtool ioctl error on tap3dd1228a-86: No such device Dec 15 05:01:52 localhost journal[231322]: ethtool ioctl error on tap3dd1228a-86: No such device Dec 15 05:01:52 localhost 
journal[231322]: ethtool ioctl error on tap3dd1228a-86: No such device Dec 15 05:01:52 localhost nova_compute[286344]: 2025-12-15 10:01:52.889 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:01:52 localhost nova_compute[286344]: 2025-12-15 10:01:52.919 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:01:53 localhost podman[316697]: 2025-12-15 10:01:53.851247329 +0000 UTC m=+0.049884421 image pull quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified Dec 15 05:01:54 localhost neutron_sriov_agent[260044]: 2025-12-15 10:01:54.052 2 INFO neutron.agent.securitygroups_rpc [None req-3c3c557a-8b24-415e-9e0d-52e6548a0a52 81c354e8024a4f49a3f913eba91f220d 22113fb22b5d4b6487fec57cee18ff23 - - default default] Security group member updated ['c7f6d550-5740-4ef8-8ff3-564deea47f02']#033[00m Dec 15 05:01:54 localhost podman[316697]: Dec 15 05:01:54 localhost podman[316697]: 2025-12-15 10:01:54.381259627 +0000 UTC m=+0.579896679 container create 02a45d648a12222f39884d105514a5d30f227026fd770e53e0ed238140000f1a (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-3c7fb015-354b-4d02-b527-15571516e679, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3) Dec 15 05:01:54 localhost systemd[1]: Started libpod-conmon-02a45d648a12222f39884d105514a5d30f227026fd770e53e0ed238140000f1a.scope. Dec 15 05:01:54 localhost systemd[1]: Started libcrun container. 
Dec 15 05:01:54 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/f5a29ed0703f676e54f8808c6ffb69dd94a0385cdded86a14167a64fcc965bd2/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff) Dec 15 05:01:54 localhost podman[316697]: 2025-12-15 10:01:54.545868676 +0000 UTC m=+0.744505728 container init 02a45d648a12222f39884d105514a5d30f227026fd770e53e0ed238140000f1a (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-3c7fb015-354b-4d02-b527-15571516e679, org.label-schema.build-date=20251202, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0) Dec 15 05:01:54 localhost dnsmasq[316716]: started, version 2.85 cachesize 150 Dec 15 05:01:54 localhost dnsmasq[316716]: DNS service limited to local subnets Dec 15 05:01:54 localhost dnsmasq[316716]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile Dec 15 05:01:54 localhost dnsmasq[316716]: warning: no upstream servers configured Dec 15 05:01:54 localhost dnsmasq-dhcp[316716]: DHCP, static leases only on 10.100.0.0, lease time 1d Dec 15 05:01:54 localhost dnsmasq[316716]: read /var/lib/neutron/dhcp/3c7fb015-354b-4d02-b527-15571516e679/addn_hosts - 0 addresses Dec 15 05:01:54 localhost dnsmasq-dhcp[316716]: read /var/lib/neutron/dhcp/3c7fb015-354b-4d02-b527-15571516e679/host Dec 15 05:01:54 localhost dnsmasq-dhcp[316716]: read /var/lib/neutron/dhcp/3c7fb015-354b-4d02-b527-15571516e679/opts Dec 15 05:01:54 localhost podman[316697]: 2025-12-15 10:01:54.569693713 +0000 UTC m=+0.768330765 container start 02a45d648a12222f39884d105514a5d30f227026fd770e53e0ed238140000f1a 
(image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-3c7fb015-354b-4d02-b527-15571516e679, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.license=GPLv2) Dec 15 05:01:54 localhost neutron_dhcp_agent[267542]: 2025-12-15 10:01:54.732 267546 INFO neutron.agent.dhcp.agent [None req-2752a7ce-dc1f-43ed-b7b6-738a26ff843b - - - - - -] DHCP configuration for ports {'00c4fa05-07ac-4c7c-8712-80d9a60c812b'} is completed#033[00m Dec 15 05:01:54 localhost nova_compute[286344]: 2025-12-15 10:01:54.941 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:01:55 localhost ceph-mon[298913]: mon.np0005559462@0(leader) e17 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/osd_remove_queue}] v 0) Dec 15 05:01:55 localhost ceph-mon[298913]: log_channel(audit) log [INF] : from='mgr.34411 ' entity='mgr.np0005559464.aomnqe' Dec 15 05:01:55 localhost ceph-mon[298913]: mon.np0005559462@0(leader).osd e111 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Dec 15 05:01:55 localhost ceph-mon[298913]: from='mgr.34411 172.18.0.108:0/929118679' entity='mgr.np0005559464.aomnqe' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch Dec 15 05:01:55 localhost ceph-mon[298913]: from='mgr.34411 ' entity='mgr.np0005559464.aomnqe' Dec 15 05:01:56 localhost nova_compute[286344]: 2025-12-15 10:01:56.751 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:01:58 localhost nova_compute[286344]: 2025-12-15 
10:01:58.602 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:01:59 localhost ceph-mon[298913]: mon.np0005559462@0(leader) e17 handle_command mon_command([{prefix=config-key set, key=mgr/progress/completed}] v 0) Dec 15 05:01:59 localhost ceph-mon[298913]: log_channel(audit) log [INF] : from='mgr.34411 ' entity='mgr.np0005559464.aomnqe' Dec 15 05:01:59 localhost nova_compute[286344]: 2025-12-15 10:01:59.943 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:02:00 localhost ceph-mon[298913]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS ------- Dec 15 05:02:00 localhost ceph-mon[298913]: rocksdb: [db/db_impl/db_impl.cc:1111] #012** DB Stats **#012Uptime(secs): 600.0 total, 600.0 interval#012Cumulative writes: 3069 writes, 26K keys, 3069 commit groups, 1.0 writes per commit group, ingest: 0.05 GB, 0.08 MB/s#012Cumulative WAL: 3069 writes, 3069 syncs, 1.00 writes per sync, written: 0.05 GB, 0.08 MB/s#012Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent#012Interval writes: 3069 writes, 26K keys, 3069 commit groups, 1.0 writes per commit group, ingest: 48.83 MB, 0.08 MB/s#012Interval WAL: 3069 writes, 3069 syncs, 1.00 writes per sync, written: 0.05 GB, 0.08 MB/s#012Interval stall: 00:00:0.000 H:M:S, 0.0 percent#012#012** Compaction Stats [default] **#012Level Files Size Score Read(GB) Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012 L0 0/0 0.00 KB 0.0 0.0 0.0 0.0 0.0 0.0 0.0 1.0 0.0 132.9 0.27 0.09 12 0.022 0 0 0.0 0.0#012 L6 1/0 17.03 MB 0.0 0.2 0.0 0.2 0.2 0.0 0.0 
5.2 161.4 146.6 1.27 0.48 11 0.115 129K 5668 0.0 0.0#012 Sum 1/0 17.03 MB 0.0 0.2 0.0 0.2 0.2 0.1 0.0 6.2 133.1 144.2 1.54 0.57 23 0.067 129K 5668 0.0 0.0#012 Int 0/0 0.00 KB 0.0 0.2 0.0 0.2 0.2 0.1 0.0 6.2 133.5 144.6 1.53 0.57 22 0.070 129K 5668 0.0 0.0#012#012** Compaction Stats [default] **#012Priority Files Size Score Read(GB) Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012 Low 0/0 0.00 KB 0.0 0.2 0.0 0.2 0.2 0.0 0.0 0.0 161.4 146.6 1.27 0.48 11 0.115 129K 5668 0.0 0.0#012High 0/0 0.00 KB 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 135.0 0.26 0.09 11 0.024 0 0 0.0 0.0#012User 0/0 0.00 KB 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.5 0.00 0.00 1 0.004 0 0 0.0 0.0#012#012Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0#012#012Uptime(secs): 600.0 total, 600.0 interval#012Flush(GB): cumulative 0.035, interval 0.035#012AddFile(GB): cumulative 0.000, interval 0.000#012AddFile(Total Files): cumulative 0, interval 0#012AddFile(L0 Files): cumulative 0, interval 0#012AddFile(Keys): cumulative 0, interval 0#012Cumulative compaction: 0.22 GB write, 0.37 MB/s write, 0.20 GB read, 0.34 MB/s read, 1.5 seconds#012Interval compaction: 0.22 GB write, 0.37 MB/s write, 0.20 GB read, 0.34 MB/s read, 1.5 seconds#012Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count#012Block cache BinnedLRUCache@0x55e4c4afd350#2 capacity: 308.00 MB usage: 20.52 MB table_size: 0 occupancy: 18446744073709551615 collections: 2 last_copies: 0 last_secs: 0.000134 secs_since: 
0#012Block cache entry stats(count,size,portion): DataBlock(960,19.66 MB,6.3825%) FilterBlock(23,375.92 KB,0.119192%) IndexBlock(23,502.08 KB,0.159192%) Misc(1,0.00 KB,0%)#012#012** File Read Latency Histogram By Level [default] ** Dec 15 05:02:00 localhost neutron_dhcp_agent[267542]: 2025-12-15 10:02:00.232 267546 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-12-15T10:01:59Z, description=, device_id=62b970f7-b6f6-451f-abdf-5855d4766c8c, device_owner=network:router_interface, dns_assignment=[], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[], id=7dbf793f-36b3-47a1-845a-50c202a0153a, ip_allocation=immediate, mac_address=fa:16:3e:eb:e2:93, name=, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-12-15T10:01:50Z, description=, dns_domain=, id=3c7fb015-354b-4d02-b527-15571516e679, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-ServerGroupTestJSON-2128860125-network, port_security_enabled=True, project_id=e293c1347d4b492e9405d19a0168b9af, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=12472, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=745, status=ACTIVE, subnets=['be678adc-48b8-4f32-adfa-9f19d45280e1'], tags=[], tenant_id=e293c1347d4b492e9405d19a0168b9af, updated_at=2025-12-15T10:01:51Z, vlan_transparent=None, network_id=3c7fb015-354b-4d02-b527-15571516e679, port_security_enabled=False, project_id=e293c1347d4b492e9405d19a0168b9af, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=[], standard_attr_id=785, status=DOWN, tags=[], tenant_id=e293c1347d4b492e9405d19a0168b9af, updated_at=2025-12-15T10:01:59Z on network 
3c7fb015-354b-4d02-b527-15571516e679#033[00m Dec 15 05:02:00 localhost dnsmasq[316716]: read /var/lib/neutron/dhcp/3c7fb015-354b-4d02-b527-15571516e679/addn_hosts - 1 addresses Dec 15 05:02:00 localhost podman[316750]: 2025-12-15 10:02:00.438743639 +0000 UTC m=+0.059512458 container kill 02a45d648a12222f39884d105514a5d30f227026fd770e53e0ed238140000f1a (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-3c7fb015-354b-4d02-b527-15571516e679, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.license=GPLv2) Dec 15 05:02:00 localhost dnsmasq-dhcp[316716]: read /var/lib/neutron/dhcp/3c7fb015-354b-4d02-b527-15571516e679/host Dec 15 05:02:00 localhost dnsmasq-dhcp[316716]: read /var/lib/neutron/dhcp/3c7fb015-354b-4d02-b527-15571516e679/opts Dec 15 05:02:00 localhost ceph-mon[298913]: from='mgr.34411 ' entity='mgr.np0005559464.aomnqe' Dec 15 05:02:00 localhost ceph-mon[298913]: mon.np0005559462@0(leader).osd e111 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Dec 15 05:02:00 localhost neutron_dhcp_agent[267542]: 2025-12-15 10:02:00.779 267546 INFO neutron.agent.dhcp.agent [None req-6e74851e-0847-426c-ba9c-65063ab8c02d - - - - - -] DHCP configuration for ports {'7dbf793f-36b3-47a1-845a-50c202a0153a'} is completed#033[00m Dec 15 05:02:01 localhost neutron_dhcp_agent[267542]: 2025-12-15 10:02:01.667 267546 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-12-15T10:01:59Z, description=, 
device_id=62b970f7-b6f6-451f-abdf-5855d4766c8c, device_owner=network:router_interface, dns_assignment=[], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[], id=7dbf793f-36b3-47a1-845a-50c202a0153a, ip_allocation=immediate, mac_address=fa:16:3e:eb:e2:93, name=, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-12-15T10:01:50Z, description=, dns_domain=, id=3c7fb015-354b-4d02-b527-15571516e679, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-ServerGroupTestJSON-2128860125-network, port_security_enabled=True, project_id=e293c1347d4b492e9405d19a0168b9af, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=12472, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=745, status=ACTIVE, subnets=['be678adc-48b8-4f32-adfa-9f19d45280e1'], tags=[], tenant_id=e293c1347d4b492e9405d19a0168b9af, updated_at=2025-12-15T10:01:51Z, vlan_transparent=None, network_id=3c7fb015-354b-4d02-b527-15571516e679, port_security_enabled=False, project_id=e293c1347d4b492e9405d19a0168b9af, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=[], standard_attr_id=785, status=DOWN, tags=[], tenant_id=e293c1347d4b492e9405d19a0168b9af, updated_at=2025-12-15T10:01:59Z on network 3c7fb015-354b-4d02-b527-15571516e679#033[00m Dec 15 05:02:01 localhost nova_compute[286344]: 2025-12-15 10:02:01.780 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:02:01 localhost podman[243449]: time="2025-12-15T10:02:01Z" level=info msg="List containers: received `last` parameter - overwriting `limit`" Dec 15 05:02:01 localhost podman[243449]: @ - - [15/Dec/2025:10:02:01 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 160288 "" 
"Go-http-client/1.1" Dec 15 05:02:01 localhost podman[243449]: @ - - [15/Dec/2025:10:02:01 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 20194 "" "Go-http-client/1.1" Dec 15 05:02:01 localhost dnsmasq[316716]: read /var/lib/neutron/dhcp/3c7fb015-354b-4d02-b527-15571516e679/addn_hosts - 1 addresses Dec 15 05:02:01 localhost podman[316786]: 2025-12-15 10:02:01.983374134 +0000 UTC m=+0.138318322 container kill 02a45d648a12222f39884d105514a5d30f227026fd770e53e0ed238140000f1a (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-3c7fb015-354b-4d02-b527-15571516e679, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image) Dec 15 05:02:01 localhost dnsmasq-dhcp[316716]: read /var/lib/neutron/dhcp/3c7fb015-354b-4d02-b527-15571516e679/host Dec 15 05:02:01 localhost dnsmasq-dhcp[316716]: read /var/lib/neutron/dhcp/3c7fb015-354b-4d02-b527-15571516e679/opts Dec 15 05:02:02 localhost neutron_dhcp_agent[267542]: 2025-12-15 10:02:02.225 267546 INFO neutron.agent.dhcp.agent [None req-8d2eeacf-17db-4134-b5ce-27d2ec8545d9 - - - - - -] DHCP configuration for ports {'7dbf793f-36b3-47a1-845a-50c202a0153a'} is completed#033[00m Dec 15 05:02:03 localhost neutron_dhcp_agent[267542]: 2025-12-15 10:02:03.669 267546 INFO neutron.agent.linux.ip_lib [None req-d80257ea-68e9-4e68-8bfe-520d9611533d - - - - - -] Device tap896d8fa2-0a cannot be used as it has no MAC address#033[00m Dec 15 05:02:03 localhost nova_compute[286344]: 2025-12-15 10:02:03.684 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:02:03 localhost kernel: 
device tap896d8fa2-0a entered promiscuous mode Dec 15 05:02:03 localhost NetworkManager[5963]: [1765792923.6919] manager: (tap896d8fa2-0a): new Generic device (/org/freedesktop/NetworkManager/Devices/29) Dec 15 05:02:03 localhost nova_compute[286344]: 2025-12-15 10:02:03.692 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:02:03 localhost systemd-udevd[316818]: Network interface NamePolicy= disabled on kernel command line. Dec 15 05:02:03 localhost ovn_controller[154603]: 2025-12-15T10:02:03Z|00162|binding|INFO|Claiming lport 896d8fa2-0a54-4158-8c07-ad4aca07b7da for this chassis. Dec 15 05:02:03 localhost ovn_controller[154603]: 2025-12-15T10:02:03Z|00163|binding|INFO|896d8fa2-0a54-4158-8c07-ad4aca07b7da: Claiming unknown Dec 15 05:02:03 localhost ovn_metadata_agent[160585]: 2025-12-15 10:02:03.703 160590 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005559462.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 'dhcpce287b4b-a043-5ed9-8e08-4fd555d768a2-2a2f0563-8e9e-4627-9f77-482d6d72668e', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-2a2f0563-8e9e-4627-9f77-482d6d72668e', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'e5ab79dc17834225a9022f4e643bb74d', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], 
datapath=2e43ead4-eb43-48b6-8f41-90a04b16885a, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=896d8fa2-0a54-4158-8c07-ad4aca07b7da) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Dec 15 05:02:03 localhost ovn_metadata_agent[160585]: 2025-12-15 10:02:03.704 160590 INFO neutron.agent.ovn.metadata.agent [-] Port 896d8fa2-0a54-4158-8c07-ad4aca07b7da in datapath 2a2f0563-8e9e-4627-9f77-482d6d72668e bound to our chassis#033[00m Dec 15 05:02:03 localhost ovn_metadata_agent[160585]: 2025-12-15 10:02:03.705 160590 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network 2a2f0563-8e9e-4627-9f77-482d6d72668e or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599#033[00m Dec 15 05:02:03 localhost ovn_metadata_agent[160585]: 2025-12-15 10:02:03.706 160858 DEBUG oslo.privsep.daemon [-] privsep: reply[4dc3c9b4-55c2-48b4-b955-891e80795f6c]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Dec 15 05:02:03 localhost journal[231322]: ethtool ioctl error on tap896d8fa2-0a: No such device Dec 15 05:02:03 localhost journal[231322]: ethtool ioctl error on tap896d8fa2-0a: No such device Dec 15 05:02:03 localhost journal[231322]: ethtool ioctl error on tap896d8fa2-0a: No such device Dec 15 05:02:03 localhost ovn_controller[154603]: 2025-12-15T10:02:03Z|00164|binding|INFO|Setting lport 896d8fa2-0a54-4158-8c07-ad4aca07b7da ovn-installed in OVS Dec 15 05:02:03 localhost ovn_controller[154603]: 2025-12-15T10:02:03Z|00165|binding|INFO|Setting lport 896d8fa2-0a54-4158-8c07-ad4aca07b7da up in Southbound Dec 15 05:02:03 localhost nova_compute[286344]: 2025-12-15 10:02:03.726 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m 
Dec 15 05:02:03 localhost nova_compute[286344]: 2025-12-15 10:02:03.728 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 15 05:02:03 localhost journal[231322]: ethtool ioctl error on tap896d8fa2-0a: No such device
Dec 15 05:02:03 localhost journal[231322]: ethtool ioctl error on tap896d8fa2-0a: No such device
Dec 15 05:02:03 localhost journal[231322]: ethtool ioctl error on tap896d8fa2-0a: No such device
Dec 15 05:02:03 localhost journal[231322]: ethtool ioctl error on tap896d8fa2-0a: No such device
Dec 15 05:02:03 localhost journal[231322]: ethtool ioctl error on tap896d8fa2-0a: No such device
Dec 15 05:02:03 localhost nova_compute[286344]: 2025-12-15 10:02:03.754 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 15 05:02:03 localhost nova_compute[286344]: 2025-12-15 10:02:03.787 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 15 05:02:03 localhost ovn_metadata_agent[160585]: 2025-12-15 10:02:03.838 160590 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=10, ssl=[], options={'arp_ns_explicit_output': 'true', 'fdb_removal_limit': '0', 'ignore_lsp_down': 'false', 'mac_binding_removal_limit': '0', 'mac_prefix': 'fe:17:e3', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': 'fe:55:2b:86:15:b5'}, ipsec=False) old=SB_Global(nb_cfg=9) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 15 05:02:03 localhost ovn_metadata_agent[160585]: 2025-12-15 10:02:03.839 160590 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 1 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Dec 15 05:02:03 localhost nova_compute[286344]: 2025-12-15 10:02:03.839 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 15 05:02:04 localhost podman[316889]:
Dec 15 05:02:04 localhost podman[316889]: 2025-12-15 10:02:04.67656517 +0000 UTC m=+0.074170781 container create ace877ce93906ba24cccdf6a91d26953369d6e29dd65cb74f34858055b290c3d (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-2a2f0563-8e9e-4627-9f77-482d6d72668e, io.buildah.version=1.41.3, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Dec 15 05:02:04 localhost systemd[1]: Started libpod-conmon-ace877ce93906ba24cccdf6a91d26953369d6e29dd65cb74f34858055b290c3d.scope.
Dec 15 05:02:04 localhost podman[316889]: 2025-12-15 10:02:04.634165488 +0000 UTC m=+0.031771109 image pull quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified
Dec 15 05:02:04 localhost systemd[1]: Started libcrun container.
Dec 15 05:02:04 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/ea54e032abe4aceec8fd959facdcc2a0f734064463615687ee0df074aeaa3285/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Dec 15 05:02:04 localhost podman[316889]: 2025-12-15 10:02:04.754363562 +0000 UTC m=+0.151969163 container init ace877ce93906ba24cccdf6a91d26953369d6e29dd65cb74f34858055b290c3d (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-2a2f0563-8e9e-4627-9f77-482d6d72668e, tcib_managed=true, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, io.buildah.version=1.41.3)
Dec 15 05:02:04 localhost podman[316889]: 2025-12-15 10:02:04.763120304 +0000 UTC m=+0.160725905 container start ace877ce93906ba24cccdf6a91d26953369d6e29dd65cb74f34858055b290c3d (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-2a2f0563-8e9e-4627-9f77-482d6d72668e, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0)
Dec 15 05:02:04 localhost dnsmasq[316907]: started, version 2.85 cachesize 150
Dec 15 05:02:04 localhost dnsmasq[316907]: DNS service limited to local subnets
Dec 15 05:02:04 localhost dnsmasq[316907]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile
Dec 15 05:02:04 localhost dnsmasq[316907]: warning: no upstream servers configured
Dec 15 05:02:04 localhost dnsmasq-dhcp[316907]: DHCP, static leases only on 10.100.0.0, lease time 1d
Dec 15 05:02:04 localhost dnsmasq[316907]: read /var/lib/neutron/dhcp/2a2f0563-8e9e-4627-9f77-482d6d72668e/addn_hosts - 0 addresses
Dec 15 05:02:04 localhost dnsmasq-dhcp[316907]: read /var/lib/neutron/dhcp/2a2f0563-8e9e-4627-9f77-482d6d72668e/host
Dec 15 05:02:04 localhost dnsmasq-dhcp[316907]: read /var/lib/neutron/dhcp/2a2f0563-8e9e-4627-9f77-482d6d72668e/opts
Dec 15 05:02:04 localhost ovn_metadata_agent[160585]: 2025-12-15 10:02:04.841 160590 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=12d96d64-e862-4f68-81e5-8d9ec5d3a5e2, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '10'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 15 05:02:04 localhost openstack_network_exporter[246484]: ERROR 10:02:04 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Dec 15 05:02:04 localhost openstack_network_exporter[246484]: ERROR 10:02:04 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 15 05:02:04 localhost openstack_network_exporter[246484]: ERROR 10:02:04 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 15 05:02:04 localhost openstack_network_exporter[246484]: ERROR 10:02:04 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Dec 15 05:02:04 localhost openstack_network_exporter[246484]:
Dec 15 05:02:04 localhost openstack_network_exporter[246484]: ERROR 10:02:04 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Dec 15 05:02:04 localhost openstack_network_exporter[246484]:
Dec 15 05:02:04 localhost nova_compute[286344]: 2025-12-15 10:02:04.945 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 15 05:02:04 localhost neutron_dhcp_agent[267542]: 2025-12-15 10:02:04.973 267546 INFO neutron.agent.dhcp.agent [None req-b7360097-4552-4dcc-be24-14564f631a53 - - - - - -] DHCP configuration for ports {'2b03cfef-6984-400b-bba8-42bcac721064'} is completed
Dec 15 05:02:05 localhost ceph-mon[298913]: mon.np0005559462@0(leader).osd e111 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 15 05:02:05 localhost systemd[1]: Started /usr/bin/podman healthcheck run 67ee72fcd99c896f79e8807764b1d0f04e7d45927d06e306c0986394e8ed42a0.
Dec 15 05:02:05 localhost systemd[1]: Started /usr/bin/podman healthcheck run 9f0f481c937606298203797a71924b7614a5d9e3f4a8afd837cfa5b9602aedeb.
Dec 15 05:02:05 localhost systemd[1]: Started /usr/bin/podman healthcheck run b3d8483ade539500e228c2af9c0c3e647febb5ec7903610a83aea32631007d4a.
Dec 15 05:02:05 localhost podman[316909]: 2025-12-15 10:02:05.744698108 +0000 UTC m=+0.072827315 container health_status 9f0f481c937606298203797a71924b7614a5d9e3f4a8afd837cfa5b9602aedeb (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=multipathd, org.label-schema.license=GPLv2, tcib_managed=true)
Dec 15 05:02:05 localhost podman[316908]: 2025-12-15 10:02:05.80659739 +0000 UTC m=+0.134344906 container health_status 67ee72fcd99c896f79e8807764b1d0f04e7d45927d06e306c0986394e8ed42a0 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Dec 15 05:02:05 localhost podman[316908]: 2025-12-15 10:02:05.816640908 +0000 UTC m=+0.144388524 container exec_died 67ee72fcd99c896f79e8807764b1d0f04e7d45927d06e306c0986394e8ed42a0 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible)
Dec 15 05:02:05 localhost systemd[1]: 67ee72fcd99c896f79e8807764b1d0f04e7d45927d06e306c0986394e8ed42a0.service: Deactivated successfully.
Dec 15 05:02:05 localhost systemd[1]: tmp-crun.P1SvCn.mount: Deactivated successfully.
Dec 15 05:02:05 localhost podman[316910]: 2025-12-15 10:02:05.87638195 +0000 UTC m=+0.198545282 container health_status b3d8483ade539500e228c2af9c0c3e647febb5ec7903610a83aea32631007d4a (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '07f3af15d79276418f54a8dde2fc251064527c3571430e14af77aaa2c01f1193-cb1e9478634a00c8fb5ad9cd7fcd38c7b2a1a85187dfaa618bc715c24c2b15fb'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.build-date=20251202, config_id=ceilometer_agent_compute, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, container_name=ceilometer_agent_compute, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 15 05:02:05 localhost podman[316909]: 2025-12-15 10:02:05.884934456 +0000 UTC m=+0.213063673 container exec_died 9f0f481c937606298203797a71924b7614a5d9e3f4a8afd837cfa5b9602aedeb (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, container_name=multipathd, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, tcib_managed=true, config_id=multipathd, org.label-schema.schema-version=1.0)
Dec 15 05:02:05 localhost systemd[1]: 9f0f481c937606298203797a71924b7614a5d9e3f4a8afd837cfa5b9602aedeb.service: Deactivated successfully.
Dec 15 05:02:05 localhost podman[316910]: 2025-12-15 10:02:05.938872118 +0000 UTC m=+0.261035490 container exec_died b3d8483ade539500e228c2af9c0c3e647febb5ec7903610a83aea32631007d4a (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, tcib_managed=true, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '07f3af15d79276418f54a8dde2fc251064527c3571430e14af77aaa2c01f1193-cb1e9478634a00c8fb5ad9cd7fcd38c7b2a1a85187dfaa618bc715c24c2b15fb'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, config_id=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, org.label-schema.schema-version=1.0)
Dec 15 05:02:05 localhost systemd[1]: b3d8483ade539500e228c2af9c0c3e647febb5ec7903610a83aea32631007d4a.service: Deactivated successfully.
Dec 15 05:02:06 localhost nova_compute[286344]: 2025-12-15 10:02:06.125 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 15 05:02:06 localhost dnsmasq[316716]: read /var/lib/neutron/dhcp/3c7fb015-354b-4d02-b527-15571516e679/addn_hosts - 0 addresses
Dec 15 05:02:06 localhost dnsmasq-dhcp[316716]: read /var/lib/neutron/dhcp/3c7fb015-354b-4d02-b527-15571516e679/host
Dec 15 05:02:06 localhost podman[316980]: 2025-12-15 10:02:06.309685022 +0000 UTC m=+0.060010410 container kill 02a45d648a12222f39884d105514a5d30f227026fd770e53e0ed238140000f1a (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-3c7fb015-354b-4d02-b527-15571516e679, org.label-schema.license=GPLv2, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Dec 15 05:02:06 localhost dnsmasq-dhcp[316716]: read /var/lib/neutron/dhcp/3c7fb015-354b-4d02-b527-15571516e679/opts
Dec 15 05:02:06 localhost ovn_controller[154603]: 2025-12-15T10:02:06Z|00166|binding|INFO|Releasing lport 3dd1228a-865d-4119-ba96-8c2b1a62bcd9 from this chassis (sb_readonly=0)
Dec 15 05:02:06 localhost nova_compute[286344]: 2025-12-15 10:02:06.599 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 15 05:02:06 localhost ovn_controller[154603]: 2025-12-15T10:02:06Z|00167|binding|INFO|Setting lport 3dd1228a-865d-4119-ba96-8c2b1a62bcd9 down in Southbound
Dec 15 05:02:06 localhost kernel: device tap3dd1228a-86 left promiscuous mode
Dec 15 05:02:06 localhost ovn_metadata_agent[160585]: 2025-12-15 10:02:06.610 160590 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005559462.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 'dhcpce287b4b-a043-5ed9-8e08-4fd555d768a2-3c7fb015-354b-4d02-b527-15571516e679', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-3c7fb015-354b-4d02-b527-15571516e679', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'e293c1347d4b492e9405d19a0168b9af', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'np0005559462.localdomain'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=f63083fa-ddf2-4e6e-8d5f-204f8d11fdb7, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=3dd1228a-865d-4119-ba96-8c2b1a62bcd9) old=Port_Binding(up=[True], chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 15 05:02:06 localhost ovn_metadata_agent[160585]: 2025-12-15 10:02:06.612 160590 INFO neutron.agent.ovn.metadata.agent [-] Port 3dd1228a-865d-4119-ba96-8c2b1a62bcd9 in datapath 3c7fb015-354b-4d02-b527-15571516e679 unbound from our chassis
Dec 15 05:02:06 localhost ovn_metadata_agent[160585]: 2025-12-15 10:02:06.615 160590 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 3c7fb015-354b-4d02-b527-15571516e679, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Dec 15 05:02:06 localhost ovn_metadata_agent[160585]: 2025-12-15 10:02:06.616 160858 DEBUG oslo.privsep.daemon [-] privsep: reply[c73dfd7a-9fef-45ad-9b85-52d289e0782a]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 15 05:02:06 localhost nova_compute[286344]: 2025-12-15 10:02:06.651 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 15 05:02:06 localhost nova_compute[286344]: 2025-12-15 10:02:06.652 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 15 05:02:06 localhost nova_compute[286344]: 2025-12-15 10:02:06.785 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 15 05:02:07 localhost ceph-mon[298913]: mon.np0005559462@0(leader).osd e111 do_prune osdmap full prune enabled
Dec 15 05:02:07 localhost ceph-mon[298913]: mon.np0005559462@0(leader).osd e112 e112: 6 total, 6 up, 6 in
Dec 15 05:02:07 localhost ceph-mon[298913]: log_channel(cluster) log [DBG] : osdmap e112: 6 total, 6 up, 6 in
Dec 15 05:02:07 localhost systemd[1]: Started /usr/bin/podman healthcheck run 730c50020a7d9e2da21d2462d40af20ac303e43f3491f692cbbd87f432ba3e09.
Dec 15 05:02:07 localhost systemd[1]: Started /usr/bin/podman healthcheck run ac2eae30a2a7f971398af42ea67b3ed799d4b3341f7688ebcd73f7ca7f66c2a8.
Dec 15 05:02:07 localhost podman[317004]: 2025-12-15 10:02:07.740887461 +0000 UTC m=+0.070910782 container health_status 730c50020a7d9e2da21d2462d40af20ac303e43f3491f692cbbd87f432ba3e09 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, maintainer=Red Hat, Inc., config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'cb1e9478634a00c8fb5ad9cd7fcd38c7b2a1a85187dfaa618bc715c24c2b15fb'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, container_name=openstack_network_exporter, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, managed_by=edpm_ansible, name=ubi9-minimal, io.openshift.expose-services=, io.openshift.tags=minimal rhel9, io.buildah.version=1.33.7, vendor=Red Hat, Inc., vcs-type=git, distribution-scope=public, build-date=2025-08-20T13:12:41, version=9.6, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., release=1755695350, url=https://catalog.redhat.com/en/search?searchType=containers, architecture=x86_64, com.redhat.component=ubi9-minimal-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=openstack_network_exporter)
Dec 15 05:02:07 localhost podman[317004]: 2025-12-15 10:02:07.758495228 +0000 UTC m=+0.088518539 container exec_died 730c50020a7d9e2da21d2462d40af20ac303e43f3491f692cbbd87f432ba3e09 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'cb1e9478634a00c8fb5ad9cd7fcd38c7b2a1a85187dfaa618bc715c24c2b15fb'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, architecture=x86_64, release=1755695350, vcs-type=git, name=ubi9-minimal, build-date=2025-08-20T13:12:41, maintainer=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=openstack_network_exporter, io.buildah.version=1.33.7, url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.tags=minimal rhel9, vendor=Red Hat, Inc., distribution-scope=public, managed_by=edpm_ansible, container_name=openstack_network_exporter, com.redhat.component=ubi9-minimal-container, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., version=9.6)
Dec 15 05:02:07 localhost systemd[1]: 730c50020a7d9e2da21d2462d40af20ac303e43f3491f692cbbd87f432ba3e09.service: Deactivated successfully.
Dec 15 05:02:07 localhost podman[317005]: 2025-12-15 10:02:07.795449199 +0000 UTC m=+0.121382987 container health_status ac2eae30a2a7f971398af42ea67b3ed799d4b3341f7688ebcd73f7ca7f66c2a8 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '07f3af15d79276418f54a8dde2fc251064527c3571430e14af77aaa2c01f1193'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_id=ovn_controller, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, managed_by=edpm_ansible)
Dec 15 05:02:07 localhost podman[317005]: 2025-12-15 10:02:07.905414241 +0000 UTC m=+0.231348049 container exec_died ac2eae30a2a7f971398af42ea67b3ed799d4b3341f7688ebcd73f7ca7f66c2a8 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, org.label-schema.vendor=CentOS, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '07f3af15d79276418f54a8dde2fc251064527c3571430e14af77aaa2c01f1193'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3)
Dec 15 05:02:07 localhost systemd[1]: ac2eae30a2a7f971398af42ea67b3ed799d4b3341f7688ebcd73f7ca7f66c2a8.service: Deactivated successfully.
Dec 15 05:02:08 localhost nova_compute[286344]: 2025-12-15 10:02:08.266 286348 DEBUG oslo_service.periodic_task [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 15 05:02:08 localhost nova_compute[286344]: 2025-12-15 10:02:08.269 286348 DEBUG oslo_service.periodic_task [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 15 05:02:08 localhost neutron_dhcp_agent[267542]: 2025-12-15 10:02:08.678 267546 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-12-15T10:02:08Z, description=, device_id=f0b9ad60-4e5c-45e0-bde4-d98067188cc9, device_owner=network:router_interface, dns_assignment=[], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[], id=afa32bcf-05da-4c53-a0aa-dd9809e79503, ip_allocation=immediate, mac_address=fa:16:3e:a5:c2:2a, name=, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-12-15T10:02:00Z, description=, dns_domain=, id=2a2f0563-8e9e-4627-9f77-482d6d72668e, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-FloatingIPsNegativeTestJSON-5430338-network, port_security_enabled=True, project_id=e5ab79dc17834225a9022f4e643bb74d, provider:network_type=geneve,
provider:physical_network=None, provider:segmentation_id=10363, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=800, status=ACTIVE, subnets=['c5d09857-a4c3-4ce6-a184-9f44830d97e5'], tags=[], tenant_id=e5ab79dc17834225a9022f4e643bb74d, updated_at=2025-12-15T10:02:02Z, vlan_transparent=None, network_id=2a2f0563-8e9e-4627-9f77-482d6d72668e, port_security_enabled=False, project_id=e5ab79dc17834225a9022f4e643bb74d, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=[], standard_attr_id=834, status=DOWN, tags=[], tenant_id=e5ab79dc17834225a9022f4e643bb74d, updated_at=2025-12-15T10:02:08Z on network 2a2f0563-8e9e-4627-9f77-482d6d72668e#033[00m Dec 15 05:02:08 localhost dnsmasq[316907]: read /var/lib/neutron/dhcp/2a2f0563-8e9e-4627-9f77-482d6d72668e/addn_hosts - 1 addresses Dec 15 05:02:08 localhost dnsmasq-dhcp[316907]: read /var/lib/neutron/dhcp/2a2f0563-8e9e-4627-9f77-482d6d72668e/host Dec 15 05:02:08 localhost podman[317066]: 2025-12-15 10:02:08.955441248 +0000 UTC m=+0.059729903 container kill ace877ce93906ba24cccdf6a91d26953369d6e29dd65cb74f34858055b290c3d (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-2a2f0563-8e9e-4627-9f77-482d6d72668e, io.buildah.version=1.41.3, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251202) Dec 15 05:02:08 localhost dnsmasq-dhcp[316907]: read /var/lib/neutron/dhcp/2a2f0563-8e9e-4627-9f77-482d6d72668e/opts Dec 15 05:02:09 localhost neutron_dhcp_agent[267542]: 2025-12-15 10:02:09.179 267546 INFO neutron.agent.dhcp.agent [None req-adfa5ec4-295a-4450-8133-0367ba054357 - - - - - -] DHCP configuration for ports 
{'afa32bcf-05da-4c53-a0aa-dd9809e79503'} is completed#033[00m Dec 15 05:02:09 localhost nova_compute[286344]: 2025-12-15 10:02:09.271 286348 DEBUG oslo_service.periodic_task [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Dec 15 05:02:09 localhost ceph-mon[298913]: mon.np0005559462@0(leader).osd e112 do_prune osdmap full prune enabled Dec 15 05:02:09 localhost ceph-mon[298913]: mon.np0005559462@0(leader).osd e113 e113: 6 total, 6 up, 6 in Dec 15 05:02:09 localhost ceph-mon[298913]: log_channel(cluster) log [DBG] : osdmap e113: 6 total, 6 up, 6 in Dec 15 05:02:09 localhost nova_compute[286344]: 2025-12-15 10:02:09.948 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:02:10 localhost nova_compute[286344]: 2025-12-15 10:02:10.269 286348 DEBUG oslo_service.periodic_task [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Dec 15 05:02:10 localhost nova_compute[286344]: 2025-12-15 10:02:10.270 286348 DEBUG oslo_service.periodic_task [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Dec 15 05:02:10 localhost ceph-mon[298913]: mon.np0005559462@0(leader).osd e113 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Dec 15 05:02:10 localhost snmpd[69387]: empty variable list in _query Dec 15 05:02:10 localhost snmpd[69387]: empty variable list in _query Dec 15 05:02:10 localhost snmpd[69387]: empty variable list in _query Dec 15 05:02:10 
localhost snmpd[69387]: empty variable list in _query Dec 15 05:02:10 localhost snmpd[69387]: empty variable list in _query Dec 15 05:02:10 localhost snmpd[69387]: empty variable list in _query Dec 15 05:02:11 localhost nova_compute[286344]: 2025-12-15 10:02:11.270 286348 DEBUG oslo_service.periodic_task [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Dec 15 05:02:11 localhost nova_compute[286344]: 2025-12-15 10:02:11.271 286348 DEBUG nova.compute.manager [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m Dec 15 05:02:11 localhost nova_compute[286344]: 2025-12-15 10:02:11.357 286348 DEBUG nova.compute.manager [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Didn't find any instances for network info cache update. 
_heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m Dec 15 05:02:11 localhost nova_compute[286344]: 2025-12-15 10:02:11.467 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:02:11 localhost ceph-mon[298913]: mon.np0005559462@0(leader).osd e113 do_prune osdmap full prune enabled Dec 15 05:02:11 localhost ceph-mon[298913]: mon.np0005559462@0(leader).osd e114 e114: 6 total, 6 up, 6 in Dec 15 05:02:11 localhost ceph-mon[298913]: log_channel(cluster) log [DBG] : osdmap e114: 6 total, 6 up, 6 in Dec 15 05:02:11 localhost nova_compute[286344]: 2025-12-15 10:02:11.823 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:02:12 localhost neutron_dhcp_agent[267542]: 2025-12-15 10:02:12.055 267546 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-12-15T10:02:08Z, description=, device_id=f0b9ad60-4e5c-45e0-bde4-d98067188cc9, device_owner=network:router_interface, dns_assignment=[], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[], id=afa32bcf-05da-4c53-a0aa-dd9809e79503, ip_allocation=immediate, mac_address=fa:16:3e:a5:c2:2a, name=, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-12-15T10:02:00Z, description=, dns_domain=, id=2a2f0563-8e9e-4627-9f77-482d6d72668e, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-FloatingIPsNegativeTestJSON-5430338-network, port_security_enabled=True, project_id=e5ab79dc17834225a9022f4e643bb74d, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=10363, qos_policy_id=None, revision_number=2, 
router:external=False, shared=False, standard_attr_id=800, status=ACTIVE, subnets=['c5d09857-a4c3-4ce6-a184-9f44830d97e5'], tags=[], tenant_id=e5ab79dc17834225a9022f4e643bb74d, updated_at=2025-12-15T10:02:02Z, vlan_transparent=None, network_id=2a2f0563-8e9e-4627-9f77-482d6d72668e, port_security_enabled=False, project_id=e5ab79dc17834225a9022f4e643bb74d, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=[], standard_attr_id=834, status=DOWN, tags=[], tenant_id=e5ab79dc17834225a9022f4e643bb74d, updated_at=2025-12-15T10:02:08Z on network 2a2f0563-8e9e-4627-9f77-482d6d72668e#033[00m Dec 15 05:02:12 localhost dnsmasq[316907]: read /var/lib/neutron/dhcp/2a2f0563-8e9e-4627-9f77-482d6d72668e/addn_hosts - 1 addresses Dec 15 05:02:12 localhost dnsmasq-dhcp[316907]: read /var/lib/neutron/dhcp/2a2f0563-8e9e-4627-9f77-482d6d72668e/host Dec 15 05:02:12 localhost dnsmasq-dhcp[316907]: read /var/lib/neutron/dhcp/2a2f0563-8e9e-4627-9f77-482d6d72668e/opts Dec 15 05:02:12 localhost podman[317104]: 2025-12-15 10:02:12.267662044 +0000 UTC m=+0.064011931 container kill ace877ce93906ba24cccdf6a91d26953369d6e29dd65cb74f34858055b290c3d (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-2a2f0563-8e9e-4627-9f77-482d6d72668e, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb) Dec 15 05:02:12 localhost neutron_dhcp_agent[267542]: 2025-12-15 10:02:12.541 267546 INFO neutron.agent.dhcp.agent [None req-a9258f37-36d6-473f-9159-988611901fbe - - - - - -] DHCP configuration for ports {'afa32bcf-05da-4c53-a0aa-dd9809e79503'} is completed#033[00m Dec 15 05:02:12 localhost ceph-mon[298913]: 
mon.np0005559462@0(leader).osd e114 do_prune osdmap full prune enabled Dec 15 05:02:12 localhost ceph-mon[298913]: mon.np0005559462@0(leader).osd e115 e115: 6 total, 6 up, 6 in Dec 15 05:02:12 localhost ceph-mon[298913]: log_channel(cluster) log [DBG] : osdmap e115: 6 total, 6 up, 6 in Dec 15 05:02:13 localhost ovn_controller[154603]: 2025-12-15T10:02:13Z|00168|binding|INFO|Releasing lport b35254ad-12eb-47bb-92be-44fefe0694f0 from this chassis (sb_readonly=0) Dec 15 05:02:13 localhost nova_compute[286344]: 2025-12-15 10:02:13.123 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:02:13 localhost nova_compute[286344]: 2025-12-15 10:02:13.270 286348 DEBUG oslo_service.periodic_task [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Dec 15 05:02:13 localhost nova_compute[286344]: 2025-12-15 10:02:13.289 286348 DEBUG oslo_concurrency.lockutils [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Dec 15 05:02:13 localhost nova_compute[286344]: 2025-12-15 10:02:13.290 286348 DEBUG oslo_concurrency.lockutils [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Dec 15 05:02:13 localhost nova_compute[286344]: 2025-12-15 10:02:13.290 286348 DEBUG oslo_concurrency.lockutils [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Lock "compute_resources" "released" by 
"nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Dec 15 05:02:13 localhost nova_compute[286344]: 2025-12-15 10:02:13.290 286348 DEBUG nova.compute.resource_tracker [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Auditing locally available compute resources for np0005559462.localdomain (node: np0005559462.localdomain) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m Dec 15 05:02:13 localhost nova_compute[286344]: 2025-12-15 10:02:13.291 286348 DEBUG oslo_concurrency.processutils [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m Dec 15 05:02:13 localhost dnsmasq[316716]: exiting on receipt of SIGTERM Dec 15 05:02:13 localhost podman[317162]: 2025-12-15 10:02:13.70278085 +0000 UTC m=+0.059500166 container kill 02a45d648a12222f39884d105514a5d30f227026fd770e53e0ed238140000f1a (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-3c7fb015-354b-4d02-b527-15571516e679, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image) Dec 15 05:02:13 localhost systemd[1]: libpod-02a45d648a12222f39884d105514a5d30f227026fd770e53e0ed238140000f1a.scope: Deactivated successfully. 
Dec 15 05:02:13 localhost ceph-mon[298913]: mon.np0005559462@0(leader) e17 handle_command mon_command({"prefix": "df", "format": "json"} v 0) Dec 15 05:02:13 localhost ceph-mon[298913]: log_channel(audit) log [DBG] : from='client.? 172.18.0.106:0/3421813453' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch Dec 15 05:02:13 localhost nova_compute[286344]: 2025-12-15 10:02:13.762 286348 DEBUG oslo_concurrency.processutils [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.471s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m Dec 15 05:02:13 localhost podman[317175]: 2025-12-15 10:02:13.788428909 +0000 UTC m=+0.073564346 container died 02a45d648a12222f39884d105514a5d30f227026fd770e53e0ed238140000f1a (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-3c7fb015-354b-4d02-b527-15571516e679, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, tcib_managed=true) Dec 15 05:02:13 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-02a45d648a12222f39884d105514a5d30f227026fd770e53e0ed238140000f1a-userdata-shm.mount: Deactivated successfully. Dec 15 05:02:13 localhost systemd[1]: var-lib-containers-storage-overlay-f5a29ed0703f676e54f8808c6ffb69dd94a0385cdded86a14167a64fcc965bd2-merged.mount: Deactivated successfully. 
Dec 15 05:02:13 localhost podman[317175]: 2025-12-15 10:02:13.829573296 +0000 UTC m=+0.114708683 container cleanup 02a45d648a12222f39884d105514a5d30f227026fd770e53e0ed238140000f1a (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-3c7fb015-354b-4d02-b527-15571516e679, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.license=GPLv2, org.label-schema.build-date=20251202, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3) Dec 15 05:02:13 localhost systemd[1]: libpod-conmon-02a45d648a12222f39884d105514a5d30f227026fd770e53e0ed238140000f1a.scope: Deactivated successfully. Dec 15 05:02:13 localhost nova_compute[286344]: 2025-12-15 10:02:13.834 286348 DEBUG nova.virt.libvirt.driver [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m Dec 15 05:02:13 localhost nova_compute[286344]: 2025-12-15 10:02:13.834 286348 DEBUG nova.virt.libvirt.driver [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m Dec 15 05:02:13 localhost podman[317182]: 2025-12-15 10:02:13.88792652 +0000 UTC m=+0.158052441 container remove 02a45d648a12222f39884d105514a5d30f227026fd770e53e0ed238140000f1a (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-3c7fb015-354b-4d02-b527-15571516e679, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, 
maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0) Dec 15 05:02:13 localhost neutron_dhcp_agent[267542]: 2025-12-15 10:02:13.923 267546 INFO neutron.agent.dhcp.agent [None req-3d594767-fac5-46eb-847d-bca88b525c8f - - - - - -] Network not present, action: clean_devices, action_kwargs: {}#033[00m Dec 15 05:02:13 localhost neutron_dhcp_agent[267542]: 2025-12-15 10:02:13.954 267546 INFO neutron.agent.dhcp.agent [-] Network not present, action: clean_devices, action_kwargs: {}#033[00m Dec 15 05:02:14 localhost nova_compute[286344]: 2025-12-15 10:02:14.014 286348 WARNING nova.virt.libvirt.driver [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m Dec 15 05:02:14 localhost nova_compute[286344]: 2025-12-15 10:02:14.015 286348 DEBUG nova.compute.resource_tracker [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Hypervisor/Node resource view: name=np0005559462.localdomain free_ram=11298MB free_disk=41.70030212402344GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": 
"8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m Dec 15 05:02:14 localhost nova_compute[286344]: 2025-12-15 10:02:14.015 286348 DEBUG oslo_concurrency.lockutils [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Dec 15 05:02:14 localhost nova_compute[286344]: 2025-12-15 10:02:14.016 286348 DEBUG oslo_concurrency.lockutils [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Dec 15 05:02:14 
localhost nova_compute[286344]: 2025-12-15 10:02:14.322 286348 DEBUG nova.compute.resource_tracker [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Instance 39ff1bd9-6f6b-44c8-bbec-a1fd9d196359 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 2, 'MEMORY_MB': 512, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m Dec 15 05:02:14 localhost nova_compute[286344]: 2025-12-15 10:02:14.323 286348 DEBUG nova.compute.resource_tracker [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m Dec 15 05:02:14 localhost nova_compute[286344]: 2025-12-15 10:02:14.323 286348 DEBUG nova.compute.resource_tracker [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Final resource view: name=np0005559462.localdomain phys_ram=15738MB used_ram=1024MB phys_disk=41GB used_disk=2GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m Dec 15 05:02:14 localhost nova_compute[286344]: 2025-12-15 10:02:14.375 286348 DEBUG oslo_concurrency.processutils [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m Dec 15 05:02:14 localhost systemd[1]: run-netns-qdhcp\x2d3c7fb015\x2d354b\x2d4d02\x2db527\x2d15571516e679.mount: Deactivated successfully. Dec 15 05:02:14 localhost ceph-mon[298913]: mon.np0005559462@0(leader) e17 handle_command mon_command({"prefix": "df", "format": "json"} v 0) Dec 15 05:02:14 localhost ceph-mon[298913]: log_channel(audit) log [DBG] : from='client.? 
172.18.0.106:0/2889985413' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch Dec 15 05:02:14 localhost nova_compute[286344]: 2025-12-15 10:02:14.932 286348 DEBUG oslo_concurrency.processutils [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.557s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m Dec 15 05:02:14 localhost nova_compute[286344]: 2025-12-15 10:02:14.938 286348 DEBUG nova.compute.provider_tree [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Inventory has not changed in ProviderTree for provider: 26c8956b-6742-4951-b566-971b9bbe323b update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m Dec 15 05:02:14 localhost nova_compute[286344]: 2025-12-15 10:02:14.950 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:02:15 localhost nova_compute[286344]: 2025-12-15 10:02:15.177 286348 DEBUG nova.scheduler.client.report [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Inventory has not changed for provider 26c8956b-6742-4951-b566-971b9bbe323b based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m Dec 15 05:02:15 localhost nova_compute[286344]: 2025-12-15 10:02:15.179 286348 DEBUG nova.compute.resource_tracker [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Compute_service record updated for 
np0005559462.localdomain:np0005559462.localdomain _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m Dec 15 05:02:15 localhost nova_compute[286344]: 2025-12-15 10:02:15.180 286348 DEBUG oslo_concurrency.lockutils [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 1.164s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Dec 15 05:02:15 localhost ceph-mon[298913]: mon.np0005559462@0(leader).osd e115 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Dec 15 05:02:16 localhost neutron_dhcp_agent[267542]: 2025-12-15 10:02:16.618 267546 INFO neutron.agent.linux.ip_lib [None req-ab83494a-eb65-4c3c-9e9f-7c02703b7019 - - - - - -] Device tap6aa5107a-36 cannot be used as it has no MAC address#033[00m Dec 15 05:02:16 localhost neutron_sriov_agent[260044]: 2025-12-15 10:02:16.623 2 INFO neutron.agent.securitygroups_rpc [None req-b3d2b9e7-82c6-4c1c-806f-ba22a775a18b ce46e81b73964575a8193484874630c3 14c31ef3385b4922aa039962d102e162 - - default default] Security group member updated ['6bf686bf-7721-4b52-a122-903123b828a6']#033[00m Dec 15 05:02:16 localhost nova_compute[286344]: 2025-12-15 10:02:16.637 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:02:16 localhost kernel: device tap6aa5107a-36 entered promiscuous mode Dec 15 05:02:16 localhost NetworkManager[5963]: [1765792936.6444] manager: (tap6aa5107a-36): new Generic device (/org/freedesktop/NetworkManager/Devices/30) Dec 15 05:02:16 localhost ovn_controller[154603]: 2025-12-15T10:02:16Z|00169|binding|INFO|Claiming lport 6aa5107a-36bf-4bfe-b132-66e561ba23ee for this chassis. 
Dec 15 05:02:16 localhost ovn_controller[154603]: 2025-12-15T10:02:16Z|00170|binding|INFO|6aa5107a-36bf-4bfe-b132-66e561ba23ee: Claiming unknown Dec 15 05:02:16 localhost nova_compute[286344]: 2025-12-15 10:02:16.647 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:02:16 localhost systemd-udevd[317236]: Network interface NamePolicy= disabled on kernel command line. Dec 15 05:02:16 localhost ovn_metadata_agent[160585]: 2025-12-15 10:02:16.667 160590 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005559462.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8::f816:3eff:fed9:b903/64', 'neutron:device_id': 'dhcpce287b4b-a043-5ed9-8e08-4fd555d768a2-5af231a5-45a3-4f18-bd7e-2eb305581c05', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-5af231a5-45a3-4f18-bd7e-2eb305581c05', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '14c31ef3385b4922aa039962d102e162', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=b1bcb2c8-35bd-449e-9e06-d43ccd95dadd, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=6aa5107a-36bf-4bfe-b132-66e561ba23ee) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Dec 15 05:02:16 localhost ovn_metadata_agent[160585]: 2025-12-15 10:02:16.669 
160590 INFO neutron.agent.ovn.metadata.agent [-] Port 6aa5107a-36bf-4bfe-b132-66e561ba23ee in datapath 5af231a5-45a3-4f18-bd7e-2eb305581c05 bound to our chassis#033[00m Dec 15 05:02:16 localhost ovn_metadata_agent[160585]: 2025-12-15 10:02:16.671 160590 DEBUG neutron.agent.ovn.metadata.agent [-] Port b4be71dc-f202-42d1-9198-c08696b56333 IP addresses were not retrieved from the Port_Binding MAC column ['unknown'] _get_port_ips /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:536#033[00m Dec 15 05:02:16 localhost ovn_metadata_agent[160585]: 2025-12-15 10:02:16.672 160590 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 5af231a5-45a3-4f18-bd7e-2eb305581c05, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m Dec 15 05:02:16 localhost ovn_metadata_agent[160585]: 2025-12-15 10:02:16.672 160858 DEBUG oslo.privsep.daemon [-] privsep: reply[e6bbc273-1eac-4a47-a722-ce254c304310]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Dec 15 05:02:16 localhost ovn_controller[154603]: 2025-12-15T10:02:16Z|00171|binding|INFO|Setting lport 6aa5107a-36bf-4bfe-b132-66e561ba23ee ovn-installed in OVS Dec 15 05:02:16 localhost ovn_controller[154603]: 2025-12-15T10:02:16Z|00172|binding|INFO|Setting lport 6aa5107a-36bf-4bfe-b132-66e561ba23ee up in Southbound Dec 15 05:02:16 localhost nova_compute[286344]: 2025-12-15 10:02:16.689 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:02:16 localhost nova_compute[286344]: 2025-12-15 10:02:16.760 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:02:16 localhost nova_compute[286344]: 2025-12-15 10:02:16.784 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 
__log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:02:16 localhost nova_compute[286344]: 2025-12-15 10:02:16.826 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:02:17 localhost nova_compute[286344]: 2025-12-15 10:02:17.180 286348 DEBUG oslo_service.periodic_task [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Dec 15 05:02:17 localhost nova_compute[286344]: 2025-12-15 10:02:17.181 286348 DEBUG oslo_service.periodic_task [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Dec 15 05:02:17 localhost nova_compute[286344]: 2025-12-15 10:02:17.182 286348 DEBUG nova.compute.manager [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... 
_reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m Dec 15 05:02:17 localhost podman[317291]: Dec 15 05:02:17 localhost podman[317291]: 2025-12-15 10:02:17.51380223 +0000 UTC m=+0.064907156 container create 4301cffa7e8e0db2867dbbe6728c2c8ea5c4018f11874044bccefbda12f8356f (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-5af231a5-45a3-4f18-bd7e-2eb305581c05, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.license=GPLv2, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3) Dec 15 05:02:17 localhost systemd[1]: Started /usr/bin/podman healthcheck run 4d46c406a3855fd1c322e5d86c048188ea5865daa07ba23aaebacc7879668923. Dec 15 05:02:17 localhost systemd[1]: Started libpod-conmon-4301cffa7e8e0db2867dbbe6728c2c8ea5c4018f11874044bccefbda12f8356f.scope. Dec 15 05:02:17 localhost systemd[1]: Started libcrun container. 
Dec 15 05:02:17 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/8847140ceaacf49b6654585981e7c12959b7a1c50842abd7c03062dc9b1751eb/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff) Dec 15 05:02:17 localhost podman[317291]: 2025-12-15 10:02:17.47834751 +0000 UTC m=+0.029452416 image pull quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified Dec 15 05:02:17 localhost podman[317291]: 2025-12-15 10:02:17.580510264 +0000 UTC m=+0.131615210 container init 4301cffa7e8e0db2867dbbe6728c2c8ea5c4018f11874044bccefbda12f8356f (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-5af231a5-45a3-4f18-bd7e-2eb305581c05, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0) Dec 15 05:02:17 localhost systemd[1]: tmp-crun.v4VxV8.mount: Deactivated successfully. 
Dec 15 05:02:17 localhost dnsmasq[317320]: started, version 2.85 cachesize 150 Dec 15 05:02:17 localhost dnsmasq[317320]: DNS service limited to local subnets Dec 15 05:02:17 localhost dnsmasq[317320]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile Dec 15 05:02:17 localhost dnsmasq[317320]: warning: no upstream servers configured Dec 15 05:02:17 localhost dnsmasq[317320]: read /var/lib/neutron/dhcp/5af231a5-45a3-4f18-bd7e-2eb305581c05/addn_hosts - 0 addresses Dec 15 05:02:17 localhost podman[317304]: 2025-12-15 10:02:17.632094561 +0000 UTC m=+0.091783309 container health_status 4d46c406a3855fd1c322e5d86c048188ea5865daa07ba23aaebacc7879668923 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, config_id=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '07f3af15d79276418f54a8dde2fc251064527c3571430e14af77aaa2c01f1193-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', 
'/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, managed_by=edpm_ansible, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS) Dec 15 05:02:17 localhost podman[317291]: 2025-12-15 10:02:17.649906554 +0000 UTC m=+0.201011510 container start 4301cffa7e8e0db2867dbbe6728c2c8ea5c4018f11874044bccefbda12f8356f (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-5af231a5-45a3-4f18-bd7e-2eb305581c05, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, tcib_managed=true) Dec 15 05:02:17 localhost podman[317304]: 2025-12-15 10:02:17.668518609 +0000 UTC m=+0.128207307 container exec_died 4d46c406a3855fd1c322e5d86c048188ea5865daa07ba23aaebacc7879668923 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ovn_metadata_agent, org.label-schema.license=GPLv2, container_name=ovn_metadata_agent, managed_by=edpm_ansible, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': 
['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '07f3af15d79276418f54a8dde2fc251064527c3571430e14af77aaa2c01f1193-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}) Dec 15 05:02:17 localhost systemd[1]: 4d46c406a3855fd1c322e5d86c048188ea5865daa07ba23aaebacc7879668923.service: Deactivated successfully. 
Dec 15 05:02:17 localhost neutron_dhcp_agent[267542]: 2025-12-15 10:02:17.862 267546 INFO neutron.agent.dhcp.agent [None req-84e6f879-1cae-4336-bef0-b934e432aab0 - - - - - -] DHCP configuration for ports {'cf350dfa-c111-45ce-825d-d6dd52b068d5'} is completed#033[00m Dec 15 05:02:18 localhost podman[317343]: 2025-12-15 10:02:18.037166543 +0000 UTC m=+0.058459438 container kill 4301cffa7e8e0db2867dbbe6728c2c8ea5c4018f11874044bccefbda12f8356f (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-5af231a5-45a3-4f18-bd7e-2eb305581c05, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb) Dec 15 05:02:18 localhost dnsmasq[317320]: exiting on receipt of SIGTERM Dec 15 05:02:18 localhost systemd[1]: libpod-4301cffa7e8e0db2867dbbe6728c2c8ea5c4018f11874044bccefbda12f8356f.scope: Deactivated successfully. 
Dec 15 05:02:18 localhost podman[317356]: 2025-12-15 10:02:18.103705213 +0000 UTC m=+0.050793445 container died 4301cffa7e8e0db2867dbbe6728c2c8ea5c4018f11874044bccefbda12f8356f (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-5af231a5-45a3-4f18-bd7e-2eb305581c05, org.label-schema.vendor=CentOS, tcib_managed=true, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0) Dec 15 05:02:18 localhost podman[317356]: 2025-12-15 10:02:18.137251031 +0000 UTC m=+0.084339213 container cleanup 4301cffa7e8e0db2867dbbe6728c2c8ea5c4018f11874044bccefbda12f8356f (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-5af231a5-45a3-4f18-bd7e-2eb305581c05, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.schema-version=1.0) Dec 15 05:02:18 localhost systemd[1]: libpod-conmon-4301cffa7e8e0db2867dbbe6728c2c8ea5c4018f11874044bccefbda12f8356f.scope: Deactivated successfully. 
Dec 15 05:02:18 localhost podman[317358]: 2025-12-15 10:02:18.186206875 +0000 UTC m=+0.124764782 container remove 4301cffa7e8e0db2867dbbe6728c2c8ea5c4018f11874044bccefbda12f8356f (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-5af231a5-45a3-4f18-bd7e-2eb305581c05, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251202) Dec 15 05:02:18 localhost nova_compute[286344]: 2025-12-15 10:02:18.231 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:02:18 localhost kernel: device tap6aa5107a-36 left promiscuous mode Dec 15 05:02:18 localhost ovn_controller[154603]: 2025-12-15T10:02:18Z|00173|binding|INFO|Releasing lport 6aa5107a-36bf-4bfe-b132-66e561ba23ee from this chassis (sb_readonly=0) Dec 15 05:02:18 localhost ovn_controller[154603]: 2025-12-15T10:02:18Z|00174|binding|INFO|Setting lport 6aa5107a-36bf-4bfe-b132-66e561ba23ee down in Southbound Dec 15 05:02:18 localhost ovn_metadata_agent[160585]: 2025-12-15 10:02:18.242 160590 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005559462.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8::f816:3eff:fed9:b903/64', 'neutron:device_id': 'dhcpce287b4b-a043-5ed9-8e08-4fd555d768a2-5af231a5-45a3-4f18-bd7e-2eb305581c05', 'neutron:device_owner': 'network:dhcp', 
'neutron:mtu': '', 'neutron:network_name': 'neutron-5af231a5-45a3-4f18-bd7e-2eb305581c05', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '14c31ef3385b4922aa039962d102e162', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=b1bcb2c8-35bd-449e-9e06-d43ccd95dadd, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=6aa5107a-36bf-4bfe-b132-66e561ba23ee) old=Port_Binding(up=[True], chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Dec 15 05:02:18 localhost ovn_metadata_agent[160585]: 2025-12-15 10:02:18.244 160590 INFO neutron.agent.ovn.metadata.agent [-] Port 6aa5107a-36bf-4bfe-b132-66e561ba23ee in datapath 5af231a5-45a3-4f18-bd7e-2eb305581c05 unbound from our chassis#033[00m Dec 15 05:02:18 localhost ovn_metadata_agent[160585]: 2025-12-15 10:02:18.247 160590 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 5af231a5-45a3-4f18-bd7e-2eb305581c05, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m Dec 15 05:02:18 localhost ovn_metadata_agent[160585]: 2025-12-15 10:02:18.249 160858 DEBUG oslo.privsep.daemon [-] privsep: reply[a856ca65-ac04-45ab-826b-dc1d4dc2b700]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Dec 15 05:02:18 localhost nova_compute[286344]: 2025-12-15 10:02:18.261 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:02:18 localhost neutron_dhcp_agent[267542]: 2025-12-15 10:02:18.494 267546 INFO neutron.agent.dhcp.agent [None req-9d9ce27b-a3c0-43df-b02f-dbde2d123e8d - - - - - -] 
Network not present, action: clean_devices, action_kwargs: {}#033[00m Dec 15 05:02:18 localhost systemd[1]: var-lib-containers-storage-overlay-8847140ceaacf49b6654585981e7c12959b7a1c50842abd7c03062dc9b1751eb-merged.mount: Deactivated successfully. Dec 15 05:02:18 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-4301cffa7e8e0db2867dbbe6728c2c8ea5c4018f11874044bccefbda12f8356f-userdata-shm.mount: Deactivated successfully. Dec 15 05:02:18 localhost systemd[1]: run-netns-qdhcp\x2d5af231a5\x2d45a3\x2d4f18\x2dbd7e\x2d2eb305581c05.mount: Deactivated successfully. Dec 15 05:02:19 localhost dnsmasq[316907]: read /var/lib/neutron/dhcp/2a2f0563-8e9e-4627-9f77-482d6d72668e/addn_hosts - 0 addresses Dec 15 05:02:19 localhost dnsmasq-dhcp[316907]: read /var/lib/neutron/dhcp/2a2f0563-8e9e-4627-9f77-482d6d72668e/host Dec 15 05:02:19 localhost dnsmasq-dhcp[316907]: read /var/lib/neutron/dhcp/2a2f0563-8e9e-4627-9f77-482d6d72668e/opts Dec 15 05:02:19 localhost podman[317400]: 2025-12-15 10:02:19.531110297 +0000 UTC m=+0.059661522 container kill ace877ce93906ba24cccdf6a91d26953369d6e29dd65cb74f34858055b290c3d (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-2a2f0563-8e9e-4627-9f77-482d6d72668e, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb) Dec 15 05:02:19 localhost ovn_controller[154603]: 2025-12-15T10:02:19Z|00175|binding|INFO|Releasing lport b35254ad-12eb-47bb-92be-44fefe0694f0 from this chassis (sb_readonly=0) Dec 15 05:02:19 localhost nova_compute[286344]: 2025-12-15 10:02:19.683 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m 
Dec 15 05:02:19 localhost nova_compute[286344]: 2025-12-15 10:02:19.952 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:02:20 localhost ovn_controller[154603]: 2025-12-15T10:02:20Z|00176|binding|INFO|Releasing lport 896d8fa2-0a54-4158-8c07-ad4aca07b7da from this chassis (sb_readonly=0) Dec 15 05:02:20 localhost ovn_controller[154603]: 2025-12-15T10:02:20Z|00177|binding|INFO|Setting lport 896d8fa2-0a54-4158-8c07-ad4aca07b7da down in Southbound Dec 15 05:02:20 localhost nova_compute[286344]: 2025-12-15 10:02:20.091 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:02:20 localhost kernel: device tap896d8fa2-0a left promiscuous mode Dec 15 05:02:20 localhost nova_compute[286344]: 2025-12-15 10:02:20.121 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:02:20 localhost ovn_metadata_agent[160585]: 2025-12-15 10:02:20.192 160590 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005559462.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 'dhcpce287b4b-a043-5ed9-8e08-4fd555d768a2-2a2f0563-8e9e-4627-9f77-482d6d72668e', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-2a2f0563-8e9e-4627-9f77-482d6d72668e', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'e5ab79dc17834225a9022f4e643bb74d', 'neutron:revision_number': '3', 
'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'np0005559462.localdomain'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=2e43ead4-eb43-48b6-8f41-90a04b16885a, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=896d8fa2-0a54-4158-8c07-ad4aca07b7da) old=Port_Binding(up=[True], chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Dec 15 05:02:20 localhost ovn_metadata_agent[160585]: 2025-12-15 10:02:20.194 160590 INFO neutron.agent.ovn.metadata.agent [-] Port 896d8fa2-0a54-4158-8c07-ad4aca07b7da in datapath 2a2f0563-8e9e-4627-9f77-482d6d72668e unbound from our chassis#033[00m Dec 15 05:02:20 localhost ovn_metadata_agent[160585]: 2025-12-15 10:02:20.197 160590 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 2a2f0563-8e9e-4627-9f77-482d6d72668e, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m Dec 15 05:02:20 localhost ovn_metadata_agent[160585]: 2025-12-15 10:02:20.198 160858 DEBUG oslo.privsep.daemon [-] privsep: reply[45569ef6-dfd3-41c0-97e6-48d1ffa1cead]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Dec 15 05:02:20 localhost neutron_sriov_agent[260044]: 2025-12-15 10:02:20.309 2 INFO neutron.agent.securitygroups_rpc [None req-40aeeab5-4d3c-48e9-a017-ae416b726e0b ce46e81b73964575a8193484874630c3 14c31ef3385b4922aa039962d102e162 - - default default] Security group member updated ['6bf686bf-7721-4b52-a122-903123b828a6']#033[00m Dec 15 05:02:20 localhost ceph-mon[298913]: mon.np0005559462@0(leader).osd e115 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Dec 15 05:02:20 localhost ceph-mon[298913]: 
mon.np0005559462@0(leader).osd e115 do_prune osdmap full prune enabled Dec 15 05:02:20 localhost ceph-mon[298913]: mon.np0005559462@0(leader).osd e116 e116: 6 total, 6 up, 6 in Dec 15 05:02:20 localhost ceph-mon[298913]: log_channel(cluster) log [DBG] : osdmap e116: 6 total, 6 up, 6 in Dec 15 05:02:20 localhost systemd[1]: Started /usr/bin/podman healthcheck run a4e94bd7aaee6c174c601060fb5f34559019ccea93fc397263ee3735731cfc1e. Dec 15 05:02:20 localhost podman[317422]: 2025-12-15 10:02:20.750483037 +0000 UTC m=+0.079988634 container health_status a4e94bd7aaee6c174c601060fb5f34559019ccea93fc397263ee3735731cfc1e (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}) Dec 15 05:02:20 localhost podman[317422]: 2025-12-15 10:02:20.759096905 +0000 UTC m=+0.088602552 container exec_died a4e94bd7aaee6c174c601060fb5f34559019ccea93fc397263ee3735731cfc1e (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 
'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter) Dec 15 05:02:20 localhost systemd[1]: a4e94bd7aaee6c174c601060fb5f34559019ccea93fc397263ee3735731cfc1e.service: Deactivated successfully. Dec 15 05:02:20 localhost neutron_sriov_agent[260044]: 2025-12-15 10:02:20.824 2 INFO neutron.agent.securitygroups_rpc [req-ea61ded7-f414-4d84-91f7-be39071acf6c req-ac20536d-0c71-449c-a814-5a0bb062c984 3d412e2d4a3a4f54b39c55f7a39b768f 054b5bdd4ed44009a8e1940489c96b34 - - default default] Security group member updated ['1f23d768-bded-4ba1-9b80-f5121af3632b']#033[00m Dec 15 05:02:21 localhost dnsmasq[315389]: read /var/lib/neutron/dhcp/3ac513c6-d80a-4d03-a550-1b73e6929696/addn_hosts - 1 addresses Dec 15 05:02:21 localhost dnsmasq-dhcp[315389]: read /var/lib/neutron/dhcp/3ac513c6-d80a-4d03-a550-1b73e6929696/host Dec 15 05:02:21 localhost dnsmasq-dhcp[315389]: read /var/lib/neutron/dhcp/3ac513c6-d80a-4d03-a550-1b73e6929696/opts Dec 15 05:02:21 localhost podman[317461]: 2025-12-15 10:02:21.082375614 +0000 UTC m=+0.062494048 container kill 8d0a249025cd30d3f6ced66bb2a1ab69200de31f87cfd16c2c213b3b64a9055f (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-3ac513c6-d80a-4d03-a550-1b73e6929696, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202, 
io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image) Dec 15 05:02:21 localhost nova_compute[286344]: 2025-12-15 10:02:21.867 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:02:22 localhost ovn_controller[154603]: 2025-12-15T10:02:22Z|00178|binding|INFO|Releasing lport b35254ad-12eb-47bb-92be-44fefe0694f0 from this chassis (sb_readonly=0) Dec 15 05:02:22 localhost nova_compute[286344]: 2025-12-15 10:02:22.674 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:02:23 localhost dnsmasq[316907]: exiting on receipt of SIGTERM Dec 15 05:02:23 localhost podman[317499]: 2025-12-15 10:02:23.705077193 +0000 UTC m=+0.065242106 container kill ace877ce93906ba24cccdf6a91d26953369d6e29dd65cb74f34858055b290c3d (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-2a2f0563-8e9e-4627-9f77-482d6d72668e, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.build-date=20251202) Dec 15 05:02:23 localhost systemd[1]: libpod-ace877ce93906ba24cccdf6a91d26953369d6e29dd65cb74f34858055b290c3d.scope: Deactivated successfully. 
Dec 15 05:02:23 localhost podman[317512]: 2025-12-15 10:02:23.773339411 +0000 UTC m=+0.055579339 container died ace877ce93906ba24cccdf6a91d26953369d6e29dd65cb74f34858055b290c3d (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-2a2f0563-8e9e-4627-9f77-482d6d72668e, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2) Dec 15 05:02:23 localhost systemd[1]: tmp-crun.nHHYyT.mount: Deactivated successfully. Dec 15 05:02:23 localhost podman[317512]: 2025-12-15 10:02:23.822387906 +0000 UTC m=+0.104627754 container cleanup ace877ce93906ba24cccdf6a91d26953369d6e29dd65cb74f34858055b290c3d (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-2a2f0563-8e9e-4627-9f77-482d6d72668e, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image) Dec 15 05:02:23 localhost systemd[1]: libpod-conmon-ace877ce93906ba24cccdf6a91d26953369d6e29dd65cb74f34858055b290c3d.scope: Deactivated successfully. 
Dec 15 05:02:23 localhost podman[317519]: 2025-12-15 10:02:23.905368581 +0000 UTC m=+0.173634502 container remove ace877ce93906ba24cccdf6a91d26953369d6e29dd65cb74f34858055b290c3d (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-2a2f0563-8e9e-4627-9f77-482d6d72668e, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb) Dec 15 05:02:24 localhost neutron_dhcp_agent[267542]: 2025-12-15 10:02:24.243 267546 INFO neutron.agent.dhcp.agent [None req-dd4f7afc-a2b6-42ca-adeb-60ad13aa109b - - - - - -] Network not present, action: clean_devices, action_kwargs: {}#033[00m Dec 15 05:02:24 localhost dnsmasq[315389]: read /var/lib/neutron/dhcp/3ac513c6-d80a-4d03-a550-1b73e6929696/addn_hosts - 0 addresses Dec 15 05:02:24 localhost dnsmasq-dhcp[315389]: read /var/lib/neutron/dhcp/3ac513c6-d80a-4d03-a550-1b73e6929696/host Dec 15 05:02:24 localhost dnsmasq-dhcp[315389]: read /var/lib/neutron/dhcp/3ac513c6-d80a-4d03-a550-1b73e6929696/opts Dec 15 05:02:24 localhost podman[317559]: 2025-12-15 10:02:24.311856062 +0000 UTC m=+0.055730072 container kill 8d0a249025cd30d3f6ced66bb2a1ab69200de31f87cfd16c2c213b3b64a9055f (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-3ac513c6-d80a-4d03-a550-1b73e6929696, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team) Dec 15 05:02:24 localhost neutron_dhcp_agent[267542]: 2025-12-15 
10:02:24.417 267546 INFO neutron.agent.dhcp.agent [-] Network not present, action: clean_devices, action_kwargs: {}#033[00m Dec 15 05:02:24 localhost ovn_controller[154603]: 2025-12-15T10:02:24Z|00179|binding|INFO|Releasing lport e816ee9d-843f-4652-b732-310b6e7d6146 from this chassis (sb_readonly=0) Dec 15 05:02:24 localhost ovn_controller[154603]: 2025-12-15T10:02:24Z|00180|binding|INFO|Setting lport e816ee9d-843f-4652-b732-310b6e7d6146 down in Southbound Dec 15 05:02:24 localhost nova_compute[286344]: 2025-12-15 10:02:24.528 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:02:24 localhost kernel: device tape816ee9d-84 left promiscuous mode Dec 15 05:02:24 localhost ovn_metadata_agent[160585]: 2025-12-15 10:02:24.546 160590 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005559462.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.3/28', 'neutron:device_id': 'dhcpce287b4b-a043-5ed9-8e08-4fd555d768a2-3ac513c6-d80a-4d03-a550-1b73e6929696', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-3ac513c6-d80a-4d03-a550-1b73e6929696', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '054b5bdd4ed44009a8e1940489c96b34', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'np0005559462.localdomain'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=d0a9f8e8-aca4-481b-b6ba-86c8894e2b13, 
chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=e816ee9d-843f-4652-b732-310b6e7d6146) old=Port_Binding(up=[True], chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Dec 15 05:02:24 localhost ovn_metadata_agent[160585]: 2025-12-15 10:02:24.548 160590 INFO neutron.agent.ovn.metadata.agent [-] Port e816ee9d-843f-4652-b732-310b6e7d6146 in datapath 3ac513c6-d80a-4d03-a550-1b73e6929696 unbound from our chassis#033[00m Dec 15 05:02:24 localhost ovn_metadata_agent[160585]: 2025-12-15 10:02:24.551 160590 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 3ac513c6-d80a-4d03-a550-1b73e6929696, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m Dec 15 05:02:24 localhost ovn_metadata_agent[160585]: 2025-12-15 10:02:24.552 160858 DEBUG oslo.privsep.daemon [-] privsep: reply[a8332dcb-29c6-4220-91a8-00a307b141f1]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Dec 15 05:02:24 localhost nova_compute[286344]: 2025-12-15 10:02:24.561 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:02:24 localhost systemd[1]: var-lib-containers-storage-overlay-ea54e032abe4aceec8fd959facdcc2a0f734064463615687ee0df074aeaa3285-merged.mount: Deactivated successfully. Dec 15 05:02:24 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-ace877ce93906ba24cccdf6a91d26953369d6e29dd65cb74f34858055b290c3d-userdata-shm.mount: Deactivated successfully. Dec 15 05:02:24 localhost systemd[1]: run-netns-qdhcp\x2d2a2f0563\x2d8e9e\x2d4627\x2d9f77\x2d482d6d72668e.mount: Deactivated successfully. 
Dec 15 05:02:24 localhost nova_compute[286344]: 2025-12-15 10:02:24.954 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:02:25 localhost neutron_dhcp_agent[267542]: 2025-12-15 10:02:25.053 267546 INFO neutron.agent.dhcp.agent [-] Network not present, action: clean_devices, action_kwargs: {}#033[00m Dec 15 05:02:25 localhost ceph-mon[298913]: mon.np0005559462@0(leader).osd e116 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Dec 15 05:02:26 localhost ovn_controller[154603]: 2025-12-15T10:02:26Z|00181|binding|INFO|Releasing lport b35254ad-12eb-47bb-92be-44fefe0694f0 from this chassis (sb_readonly=0) Dec 15 05:02:26 localhost nova_compute[286344]: 2025-12-15 10:02:26.822 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:02:26 localhost nova_compute[286344]: 2025-12-15 10:02:26.870 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:02:27 localhost systemd[1]: tmp-crun.nDZOUw.mount: Deactivated successfully. 
Dec 15 05:02:27 localhost dnsmasq[315389]: exiting on receipt of SIGTERM Dec 15 05:02:27 localhost podman[317599]: 2025-12-15 10:02:27.37064594 +0000 UTC m=+0.062612382 container kill 8d0a249025cd30d3f6ced66bb2a1ab69200de31f87cfd16c2c213b3b64a9055f (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-3ac513c6-d80a-4d03-a550-1b73e6929696, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team) Dec 15 05:02:27 localhost systemd[1]: libpod-8d0a249025cd30d3f6ced66bb2a1ab69200de31f87cfd16c2c213b3b64a9055f.scope: Deactivated successfully. Dec 15 05:02:27 localhost podman[317613]: 2025-12-15 10:02:27.437267872 +0000 UTC m=+0.047866404 container died 8d0a249025cd30d3f6ced66bb2a1ab69200de31f87cfd16c2c213b3b64a9055f (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-3ac513c6-d80a-4d03-a550-1b73e6929696, org.label-schema.build-date=20251202, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0) Dec 15 05:02:27 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-8d0a249025cd30d3f6ced66bb2a1ab69200de31f87cfd16c2c213b3b64a9055f-userdata-shm.mount: Deactivated successfully. 
Dec 15 05:02:27 localhost podman[317613]: 2025-12-15 10:02:27.482906674 +0000 UTC m=+0.093505146 container remove 8d0a249025cd30d3f6ced66bb2a1ab69200de31f87cfd16c2c213b3b64a9055f (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-3ac513c6-d80a-4d03-a550-1b73e6929696, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image) Dec 15 05:02:27 localhost neutron_dhcp_agent[267542]: 2025-12-15 10:02:27.507 267546 INFO neutron.agent.dhcp.agent [None req-27d6633c-2fd9-40ce-8690-f6b341049aa8 - - - - - -] Network not present, action: clean_devices, action_kwargs: {}#033[00m Dec 15 05:02:27 localhost systemd[1]: libpod-conmon-8d0a249025cd30d3f6ced66bb2a1ab69200de31f87cfd16c2c213b3b64a9055f.scope: Deactivated successfully. Dec 15 05:02:27 localhost neutron_dhcp_agent[267542]: 2025-12-15 10:02:27.549 267546 INFO neutron.agent.dhcp.agent [-] Network not present, action: clean_devices, action_kwargs: {}#033[00m Dec 15 05:02:28 localhost systemd[1]: var-lib-containers-storage-overlay-027445aa133a13ea97c41b5f10b814a36033ae1a01b8cba7ad962d6983bbd593-merged.mount: Deactivated successfully. Dec 15 05:02:28 localhost systemd[1]: run-netns-qdhcp\x2d3ac513c6\x2dd80a\x2d4d03\x2da550\x2d1b73e6929696.mount: Deactivated successfully. 
Dec 15 05:02:28 localhost nova_compute[286344]: 2025-12-15 10:02:28.494 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:02:29 localhost nova_compute[286344]: 2025-12-15 10:02:29.956 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:02:30 localhost ceph-mon[298913]: mon.np0005559462@0(leader).osd e116 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Dec 15 05:02:31 localhost neutron_dhcp_agent[267542]: 2025-12-15 10:02:31.528 267546 INFO neutron.agent.linux.ip_lib [None req-d2d19399-c369-454f-ad71-b295c8027b64 - - - - - -] Device tap2becb88a-55 cannot be used as it has no MAC address#033[00m Dec 15 05:02:31 localhost nova_compute[286344]: 2025-12-15 10:02:31.550 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:02:31 localhost kernel: device tap2becb88a-55 entered promiscuous mode Dec 15 05:02:31 localhost ovn_controller[154603]: 2025-12-15T10:02:31Z|00182|binding|INFO|Claiming lport 2becb88a-5592-4ce2-8f39-91733006352a for this chassis. Dec 15 05:02:31 localhost ovn_controller[154603]: 2025-12-15T10:02:31Z|00183|binding|INFO|2becb88a-5592-4ce2-8f39-91733006352a: Claiming unknown Dec 15 05:02:31 localhost NetworkManager[5963]: [1765792951.5586] manager: (tap2becb88a-55): new Generic device (/org/freedesktop/NetworkManager/Devices/31) Dec 15 05:02:31 localhost nova_compute[286344]: 2025-12-15 10:02:31.561 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:02:31 localhost systemd-udevd[317648]: Network interface NamePolicy= disabled on kernel command line. 
Dec 15 05:02:31 localhost ovn_metadata_agent[160585]: 2025-12-15 10:02:31.571 160590 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005559462.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8::2/64', 'neutron:device_id': 'dhcpce287b4b-a043-5ed9-8e08-4fd555d768a2-79c68e1b-56f8-4ad5-8e20-701273ec3796', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-79c68e1b-56f8-4ad5-8e20-701273ec3796', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '14c31ef3385b4922aa039962d102e162', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=9250aaaf-9950-40e2-9996-bca4b7ad0d5d, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=2becb88a-5592-4ce2-8f39-91733006352a) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Dec 15 05:02:31 localhost ovn_metadata_agent[160585]: 2025-12-15 10:02:31.572 160590 INFO neutron.agent.ovn.metadata.agent [-] Port 2becb88a-5592-4ce2-8f39-91733006352a in datapath 79c68e1b-56f8-4ad5-8e20-701273ec3796 bound to our chassis#033[00m Dec 15 05:02:31 localhost ovn_metadata_agent[160585]: 2025-12-15 10:02:31.574 160590 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network 79c68e1b-56f8-4ad5-8e20-701273ec3796 or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params 
/usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599#033[00m Dec 15 05:02:31 localhost ovn_metadata_agent[160585]: 2025-12-15 10:02:31.575 160858 DEBUG oslo.privsep.daemon [-] privsep: reply[93863830-b44f-4a54-a18e-1d0179121f46]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Dec 15 05:02:31 localhost journal[231322]: ethtool ioctl error on tap2becb88a-55: No such device Dec 15 05:02:31 localhost journal[231322]: ethtool ioctl error on tap2becb88a-55: No such device Dec 15 05:02:31 localhost ovn_controller[154603]: 2025-12-15T10:02:31Z|00184|binding|INFO|Setting lport 2becb88a-5592-4ce2-8f39-91733006352a up in Southbound Dec 15 05:02:31 localhost ovn_controller[154603]: 2025-12-15T10:02:31Z|00185|binding|INFO|Setting lport 2becb88a-5592-4ce2-8f39-91733006352a ovn-installed in OVS Dec 15 05:02:31 localhost nova_compute[286344]: 2025-12-15 10:02:31.599 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:02:31 localhost nova_compute[286344]: 2025-12-15 10:02:31.601 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:02:31 localhost journal[231322]: ethtool ioctl error on tap2becb88a-55: No such device Dec 15 05:02:31 localhost journal[231322]: ethtool ioctl error on tap2becb88a-55: No such device Dec 15 05:02:31 localhost journal[231322]: ethtool ioctl error on tap2becb88a-55: No such device Dec 15 05:02:31 localhost journal[231322]: ethtool ioctl error on tap2becb88a-55: No such device Dec 15 05:02:31 localhost journal[231322]: ethtool ioctl error on tap2becb88a-55: No such device Dec 15 05:02:31 localhost journal[231322]: ethtool ioctl error on tap2becb88a-55: No such device Dec 15 05:02:31 localhost nova_compute[286344]: 2025-12-15 10:02:31.635 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 
__log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:02:31 localhost nova_compute[286344]: 2025-12-15 10:02:31.661 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:02:31 localhost podman[243449]: time="2025-12-15T10:02:31Z" level=info msg="List containers: received `last` parameter - overwriting `limit`" Dec 15 05:02:31 localhost nova_compute[286344]: 2025-12-15 10:02:31.912 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:02:31 localhost podman[243449]: @ - - [15/Dec/2025:10:02:31 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 156640 "" "Go-http-client/1.1" Dec 15 05:02:31 localhost podman[243449]: @ - - [15/Dec/2025:10:02:31 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 19243 "" "Go-http-client/1.1" Dec 15 05:02:32 localhost podman[317719]: Dec 15 05:02:32 localhost podman[317719]: 2025-12-15 10:02:32.492858249 +0000 UTC m=+0.090595767 container create 565ec04adec5277043a6f6673b75cdbc8c33a1bf662c1c403dbc0e3dab9f6e0c (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-79c68e1b-56f8-4ad5-8e20-701273ec3796, tcib_managed=true, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team) Dec 15 05:02:32 localhost systemd[1]: Started libpod-conmon-565ec04adec5277043a6f6673b75cdbc8c33a1bf662c1c403dbc0e3dab9f6e0c.scope. Dec 15 05:02:32 localhost systemd[1]: Started libcrun container. 
Dec 15 05:02:32 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/508f32826c9e4d5162c6dab18cf3f915b0657618ca607194bdbc814a0daf7b86/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff) Dec 15 05:02:32 localhost podman[317719]: 2025-12-15 10:02:32.449810248 +0000 UTC m=+0.047547776 image pull quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified Dec 15 05:02:32 localhost podman[317719]: 2025-12-15 10:02:32.561420485 +0000 UTC m=+0.159157993 container init 565ec04adec5277043a6f6673b75cdbc8c33a1bf662c1c403dbc0e3dab9f6e0c (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-79c68e1b-56f8-4ad5-8e20-701273ec3796, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true) Dec 15 05:02:32 localhost podman[317719]: 2025-12-15 10:02:32.571615997 +0000 UTC m=+0.169353515 container start 565ec04adec5277043a6f6673b75cdbc8c33a1bf662c1c403dbc0e3dab9f6e0c (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-79c68e1b-56f8-4ad5-8e20-701273ec3796, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3) Dec 15 05:02:32 localhost dnsmasq[317737]: started, version 2.85 cachesize 150 Dec 15 05:02:32 localhost dnsmasq[317737]: DNS service limited to local subnets Dec 15 05:02:32 localhost dnsmasq[317737]: compile time options: IPv6 GNU-getopt DBus no-UBus 
no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile Dec 15 05:02:32 localhost dnsmasq[317737]: warning: no upstream servers configured Dec 15 05:02:32 localhost dnsmasq-dhcp[317737]: DHCPv6, static leases only on 2001:db8::, lease time 1d Dec 15 05:02:32 localhost dnsmasq[317737]: read /var/lib/neutron/dhcp/79c68e1b-56f8-4ad5-8e20-701273ec3796/addn_hosts - 0 addresses Dec 15 05:02:32 localhost dnsmasq-dhcp[317737]: read /var/lib/neutron/dhcp/79c68e1b-56f8-4ad5-8e20-701273ec3796/host Dec 15 05:02:32 localhost dnsmasq-dhcp[317737]: read /var/lib/neutron/dhcp/79c68e1b-56f8-4ad5-8e20-701273ec3796/opts Dec 15 05:02:32 localhost neutron_dhcp_agent[267542]: 2025-12-15 10:02:32.700 267546 INFO neutron.agent.dhcp.agent [None req-f727769d-28fa-4f03-9b98-9001c1363f27 - - - - - -] DHCP configuration for ports {'c4cd8cf0-ccbf-43fe-944f-0f31971b0dbb'} is completed#033[00m Dec 15 05:02:32 localhost dnsmasq[317737]: exiting on receipt of SIGTERM Dec 15 05:02:32 localhost podman[317755]: 2025-12-15 10:02:32.935022746 +0000 UTC m=+0.061477201 container kill 565ec04adec5277043a6f6673b75cdbc8c33a1bf662c1c403dbc0e3dab9f6e0c (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-79c68e1b-56f8-4ad5-8e20-701273ec3796, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.vendor=CentOS) Dec 15 05:02:32 localhost systemd[1]: libpod-565ec04adec5277043a6f6673b75cdbc8c33a1bf662c1c403dbc0e3dab9f6e0c.scope: Deactivated successfully. 
Dec 15 05:02:33 localhost podman[317768]: 2025-12-15 10:02:33.000677412 +0000 UTC m=+0.052871243 container died 565ec04adec5277043a6f6673b75cdbc8c33a1bf662c1c403dbc0e3dab9f6e0c (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-79c68e1b-56f8-4ad5-8e20-701273ec3796, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true) Dec 15 05:02:33 localhost ovn_controller[154603]: 2025-12-15T10:02:33Z|00186|binding|INFO|Removing iface tap2becb88a-55 ovn-installed in OVS Dec 15 05:02:33 localhost ovn_metadata_agent[160585]: 2025-12-15 10:02:33.055 160590 WARNING neutron.agent.ovn.metadata.agent [-] Removing non-external type port 7ae1a528-a4da-4f49-b0c5-3b608374ce62 with type ""#033[00m Dec 15 05:02:33 localhost ovn_metadata_agent[160585]: 2025-12-15 10:02:33.057 160590 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched DELETE: PortBindingDeletedEvent(events=('delete',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[True], options={'requested-chassis': 'np0005559462.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8::2/64', 'neutron:device_id': 'dhcpce287b4b-a043-5ed9-8e08-4fd555d768a2-79c68e1b-56f8-4ad5-8e20-701273ec3796', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-79c68e1b-56f8-4ad5-8e20-701273ec3796', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '14c31ef3385b4922aa039962d102e162', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 
'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=9250aaaf-9950-40e2-9996-bca4b7ad0d5d, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=2becb88a-5592-4ce2-8f39-91733006352a) old= matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Dec 15 05:02:33 localhost ovn_metadata_agent[160585]: 2025-12-15 10:02:33.059 160590 INFO neutron.agent.ovn.metadata.agent [-] Port 2becb88a-5592-4ce2-8f39-91733006352a in datapath 79c68e1b-56f8-4ad5-8e20-701273ec3796 unbound from our chassis#033[00m Dec 15 05:02:33 localhost ovn_metadata_agent[160585]: 2025-12-15 10:02:33.059 160590 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network 79c68e1b-56f8-4ad5-8e20-701273ec3796 or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599#033[00m Dec 15 05:02:33 localhost ovn_metadata_agent[160585]: 2025-12-15 10:02:33.060 160858 DEBUG oslo.privsep.daemon [-] privsep: reply[b586ee6f-d682-4d51-ba78-194c400564a6]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Dec 15 05:02:33 localhost ovn_controller[154603]: 2025-12-15T10:02:33Z|00187|binding|INFO|Removing lport 2becb88a-5592-4ce2-8f39-91733006352a ovn-installed in OVS Dec 15 05:02:33 localhost nova_compute[286344]: 2025-12-15 10:02:33.098 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:02:33 localhost podman[317768]: 2025-12-15 10:02:33.10727026 +0000 UTC m=+0.159464071 container cleanup 565ec04adec5277043a6f6673b75cdbc8c33a1bf662c1c403dbc0e3dab9f6e0c (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, 
name=neutron-dnsmasq-qdhcp-79c68e1b-56f8-4ad5-8e20-701273ec3796, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team) Dec 15 05:02:33 localhost systemd[1]: libpod-conmon-565ec04adec5277043a6f6673b75cdbc8c33a1bf662c1c403dbc0e3dab9f6e0c.scope: Deactivated successfully. Dec 15 05:02:33 localhost podman[317775]: 2025-12-15 10:02:33.130971195 +0000 UTC m=+0.167510983 container remove 565ec04adec5277043a6f6673b75cdbc8c33a1bf662c1c403dbc0e3dab9f6e0c (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-79c68e1b-56f8-4ad5-8e20-701273ec3796, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb) Dec 15 05:02:33 localhost nova_compute[286344]: 2025-12-15 10:02:33.144 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:02:33 localhost kernel: device tap2becb88a-55 left promiscuous mode Dec 15 05:02:33 localhost nova_compute[286344]: 2025-12-15 10:02:33.159 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:02:33 localhost neutron_dhcp_agent[267542]: 2025-12-15 10:02:33.174 267546 INFO neutron.agent.dhcp.agent [None req-d9c74ebf-89a4-4472-86fc-b883194cd446 - - - - - -] Network not present, action: clean_devices, action_kwargs: {}#033[00m Dec 15 05:02:33 localhost 
neutron_dhcp_agent[267542]: 2025-12-15 10:02:33.175 267546 INFO neutron.agent.dhcp.agent [None req-d9c74ebf-89a4-4472-86fc-b883194cd446 - - - - - -] Network not present, action: clean_devices, action_kwargs: {}#033[00m Dec 15 05:02:33 localhost systemd[1]: var-lib-containers-storage-overlay-508f32826c9e4d5162c6dab18cf3f915b0657618ca607194bdbc814a0daf7b86-merged.mount: Deactivated successfully. Dec 15 05:02:33 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-565ec04adec5277043a6f6673b75cdbc8c33a1bf662c1c403dbc0e3dab9f6e0c-userdata-shm.mount: Deactivated successfully. Dec 15 05:02:33 localhost systemd[1]: run-netns-qdhcp\x2d79c68e1b\x2d56f8\x2d4ad5\x2d8e20\x2d701273ec3796.mount: Deactivated successfully. Dec 15 05:02:33 localhost ovn_controller[154603]: 2025-12-15T10:02:33Z|00188|binding|INFO|Releasing lport b35254ad-12eb-47bb-92be-44fefe0694f0 from this chassis (sb_readonly=0) Dec 15 05:02:33 localhost nova_compute[286344]: 2025-12-15 10:02:33.668 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:02:33 localhost nova_compute[286344]: 2025-12-15 10:02:33.879 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:02:34 localhost openstack_network_exporter[246484]: ERROR 10:02:34 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server Dec 15 05:02:34 localhost openstack_network_exporter[246484]: ERROR 10:02:34 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd Dec 15 05:02:34 localhost openstack_network_exporter[246484]: ERROR 10:02:34 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath Dec 15 05:02:34 localhost openstack_network_exporter[246484]: Dec 15 05:02:34 localhost openstack_network_exporter[246484]: ERROR 10:02:34 
appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd Dec 15 05:02:34 localhost openstack_network_exporter[246484]: ERROR 10:02:34 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath Dec 15 05:02:34 localhost openstack_network_exporter[246484]: Dec 15 05:02:34 localhost nova_compute[286344]: 2025-12-15 10:02:34.958 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:02:35 localhost ceph-mon[298913]: mon.np0005559462@0(leader).osd e116 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Dec 15 05:02:36 localhost systemd[1]: Started /usr/bin/podman healthcheck run 67ee72fcd99c896f79e8807764b1d0f04e7d45927d06e306c0986394e8ed42a0. Dec 15 05:02:36 localhost systemd[1]: Started /usr/bin/podman healthcheck run 9f0f481c937606298203797a71924b7614a5d9e3f4a8afd837cfa5b9602aedeb. Dec 15 05:02:36 localhost systemd[1]: Started /usr/bin/podman healthcheck run b3d8483ade539500e228c2af9c0c3e647febb5ec7903610a83aea32631007d4a. Dec 15 05:02:36 localhost systemd[1]: tmp-crun.ppTmk3.mount: Deactivated successfully. 
Dec 15 05:02:36 localhost podman[317800]: 2025-12-15 10:02:36.773176306 +0000 UTC m=+0.098486444 container health_status 9f0f481c937606298203797a71924b7614a5d9e3f4a8afd837cfa5b9602aedeb (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, container_name=multipathd, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, config_id=multipathd) Dec 15 05:02:36 localhost podman[317801]: 2025-12-15 10:02:36.820714921 +0000 UTC m=+0.141810062 container health_status 
b3d8483ade539500e228c2af9c0c3e647febb5ec7903610a83aea32631007d4a (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '07f3af15d79276418f54a8dde2fc251064527c3571430e14af77aaa2c01f1193-cb1e9478634a00c8fb5ad9cd7fcd38c7b2a1a85187dfaa618bc715c24c2b15fb'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, managed_by=edpm_ansible, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.build-date=20251202) Dec 15 05:02:36 localhost podman[317801]: 2025-12-15 10:02:36.835292584 +0000 UTC m=+0.156387675 
container exec_died b3d8483ade539500e228c2af9c0c3e647febb5ec7903610a83aea32631007d4a (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ceilometer_agent_compute, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, tcib_managed=true, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '07f3af15d79276418f54a8dde2fc251064527c3571430e14af77aaa2c01f1193-cb1e9478634a00c8fb5ad9cd7fcd38c7b2a1a85187dfaa618bc715c24c2b15fb'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, managed_by=edpm_ansible) Dec 15 05:02:36 localhost podman[317800]: 2025-12-15 10:02:36.835762897 +0000 UTC m=+0.161073015 
container exec_died 9f0f481c937606298203797a71924b7614a5d9e3f4a8afd837cfa5b9602aedeb (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, org.label-schema.vendor=CentOS, config_id=multipathd, container_name=multipathd, managed_by=edpm_ansible, org.label-schema.build-date=20251202, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.license=GPLv2, tcib_managed=true) Dec 15 05:02:36 localhost systemd[1]: 9f0f481c937606298203797a71924b7614a5d9e3f4a8afd837cfa5b9602aedeb.service: Deactivated successfully. 
Dec 15 05:02:36 localhost nova_compute[286344]: 2025-12-15 10:02:36.941 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:02:36 localhost systemd[1]: b3d8483ade539500e228c2af9c0c3e647febb5ec7903610a83aea32631007d4a.service: Deactivated successfully. Dec 15 05:02:36 localhost podman[317799]: 2025-12-15 10:02:36.945771989 +0000 UTC m=+0.276256171 container health_status 67ee72fcd99c896f79e8807764b1d0f04e7d45927d06e306c0986394e8ed42a0 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, container_name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter) Dec 15 05:02:37 localhost podman[317799]: 2025-12-15 10:02:37.028294812 +0000 UTC m=+0.358778984 container 
exec_died 67ee72fcd99c896f79e8807764b1d0f04e7d45927d06e306c0986394e8ed42a0 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter) Dec 15 05:02:37 localhost systemd[1]: 67ee72fcd99c896f79e8807764b1d0f04e7d45927d06e306c0986394e8ed42a0.service: Deactivated successfully. 
Dec 15 05:02:37 localhost neutron_sriov_agent[260044]: 2025-12-15 10:02:37.377 2 INFO neutron.agent.securitygroups_rpc [None req-09d63f63-22cd-4485-9005-66e1526817d4 ce46e81b73964575a8193484874630c3 14c31ef3385b4922aa039962d102e162 - - default default] Security group member updated ['6bf686bf-7721-4b52-a122-903123b828a6']#033[00m Dec 15 05:02:38 localhost systemd[1]: Started /usr/bin/podman healthcheck run 730c50020a7d9e2da21d2462d40af20ac303e43f3491f692cbbd87f432ba3e09. Dec 15 05:02:38 localhost systemd[1]: Started /usr/bin/podman healthcheck run ac2eae30a2a7f971398af42ea67b3ed799d4b3341f7688ebcd73f7ca7f66c2a8. Dec 15 05:02:38 localhost systemd[1]: tmp-crun.vxhXZW.mount: Deactivated successfully. Dec 15 05:02:38 localhost podman[317859]: 2025-12-15 10:02:38.753249803 +0000 UTC m=+0.083892161 container health_status 730c50020a7d9e2da21d2462d40af20ac303e43f3491f692cbbd87f432ba3e09 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, config_id=openstack_network_exporter, maintainer=Red Hat, Inc., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, managed_by=edpm_ansible, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'cb1e9478634a00c8fb5ad9cd7fcd38c7b2a1a85187dfaa618bc715c24c2b15fb'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': 
['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, distribution-scope=public, version=9.6, architecture=x86_64, io.buildah.version=1.33.7, release=1755695350, build-date=2025-08-20T13:12:41, io.openshift.tags=minimal rhel9, vendor=Red Hat, Inc., name=ubi9-minimal, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.openshift.expose-services=, vcs-type=git, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., container_name=openstack_network_exporter, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., com.redhat.component=ubi9-minimal-container) Dec 15 05:02:38 localhost podman[317859]: 2025-12-15 10:02:38.790856423 +0000 UTC m=+0.121498771 container exec_died 730c50020a7d9e2da21d2462d40af20ac303e43f3491f692cbbd87f432ba3e09 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, url=https://catalog.redhat.com/en/search?searchType=containers, vendor=Red Hat, Inc., architecture=x86_64, io.buildah.version=1.33.7, version=9.6, io.openshift.tags=minimal rhel9, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, build-date=2025-08-20T13:12:41, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'cb1e9478634a00c8fb5ad9cd7fcd38c7b2a1a85187dfaa618bc715c24c2b15fb'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, container_name=openstack_network_exporter, distribution-scope=public, 
vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, name=ubi9-minimal, release=1755695350, vcs-type=git, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, com.redhat.component=ubi9-minimal-container, config_id=openstack_network_exporter, maintainer=Red Hat, Inc.) Dec 15 05:02:38 localhost systemd[1]: 730c50020a7d9e2da21d2462d40af20ac303e43f3491f692cbbd87f432ba3e09.service: Deactivated successfully. 
Dec 15 05:02:38 localhost podman[317860]: 2025-12-15 10:02:38.808248204 +0000 UTC m=+0.134773178 container health_status ac2eae30a2a7f971398af42ea67b3ed799d4b3341f7688ebcd73f7ca7f66c2a8 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '07f3af15d79276418f54a8dde2fc251064527c3571430e14af77aaa2c01f1193'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_controller, io.buildah.version=1.41.3, managed_by=edpm_ansible, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team) Dec 15 05:02:38 localhost podman[317860]: 2025-12-15 10:02:38.878455965 +0000 UTC m=+0.204980899 container exec_died ac2eae30a2a7f971398af42ea67b3ed799d4b3341f7688ebcd73f7ca7f66c2a8 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, 
org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=ovn_controller, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '07f3af15d79276418f54a8dde2fc251064527c3571430e14af77aaa2c01f1193'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}) Dec 15 05:02:38 localhost systemd[1]: ac2eae30a2a7f971398af42ea67b3ed799d4b3341f7688ebcd73f7ca7f66c2a8.service: Deactivated successfully. 
Dec 15 05:02:38 localhost neutron_sriov_agent[260044]: 2025-12-15 10:02:38.898 2 INFO neutron.agent.securitygroups_rpc [None req-a21bd7ed-bc2f-456f-bc45-07fb028cd272 ce46e81b73964575a8193484874630c3 14c31ef3385b4922aa039962d102e162 - - default default] Security group member updated ['6bf686bf-7721-4b52-a122-903123b828a6']#033[00m Dec 15 05:02:38 localhost neutron_dhcp_agent[267542]: 2025-12-15 10:02:38.927 267546 INFO neutron.agent.dhcp.agent [-] Network not present, action: clean_devices, action_kwargs: {}#033[00m Dec 15 05:02:39 localhost neutron_sriov_agent[260044]: 2025-12-15 10:02:39.070 2 INFO neutron.agent.securitygroups_rpc [None req-3eb601fa-880b-4b1a-a2c7-75e484e9b677 c989a3ea16ee452cbc34536515aa30cc b5363bf3c2794ed781c829ed2039e110 - - default default] Security group rule updated ['b823eae1-1ebf-40f6-9db6-46dcf45e5390']#033[00m Dec 15 05:02:39 localhost neutron_dhcp_agent[267542]: 2025-12-15 10:02:39.413 267546 INFO neutron.agent.dhcp.agent [-] Network not present, action: clean_devices, action_kwargs: {}#033[00m Dec 15 05:02:39 localhost nova_compute[286344]: 2025-12-15 10:02:39.959 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:02:40 localhost ceph-mon[298913]: mon.np0005559462@0(leader).osd e116 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Dec 15 05:02:41 localhost nova_compute[286344]: 2025-12-15 10:02:41.983 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:02:42 localhost nova_compute[286344]: 2025-12-15 10:02:42.075 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:02:44 localhost nova_compute[286344]: 2025-12-15 10:02:44.961 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 
__log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:02:44 localhost sshd[317902]: main: sshd: ssh-rsa algorithm is disabled Dec 15 05:02:45 localhost ovn_controller[154603]: 2025-12-15T10:02:45Z|00189|binding|INFO|Releasing lport b35254ad-12eb-47bb-92be-44fefe0694f0 from this chassis (sb_readonly=0) Dec 15 05:02:45 localhost nova_compute[286344]: 2025-12-15 10:02:45.227 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:02:45 localhost ceph-mon[298913]: mon.np0005559462@0(leader).osd e116 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Dec 15 05:02:45 localhost neutron_sriov_agent[260044]: 2025-12-15 10:02:45.549 2 INFO neutron.agent.securitygroups_rpc [None req-fafce219-b8f6-49a6-a5f1-9157ae297182 d5377d4013ec43b98bbf91c063971dac 1808d9ac20f44272b13e28ceb7a5e4c6 - - default default] Security group member updated ['f398dd81-32a0-4005-81c7-146c01e93e74']#033[00m Dec 15 05:02:47 localhost nova_compute[286344]: 2025-12-15 10:02:47.024 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:02:48 localhost systemd[1]: Started /usr/bin/podman healthcheck run 4d46c406a3855fd1c322e5d86c048188ea5865daa07ba23aaebacc7879668923. Dec 15 05:02:48 localhost systemd[1]: tmp-crun.Md9gfH.mount: Deactivated successfully. 
Dec 15 05:02:48 localhost podman[317904]: 2025-12-15 10:02:48.75933165 +0000 UTC m=+0.093414744 container health_status 4d46c406a3855fd1c322e5d86c048188ea5865daa07ba23aaebacc7879668923 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, org.label-schema.license=GPLv2, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '07f3af15d79276418f54a8dde2fc251064527c3571430e14af77aaa2c01f1193-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, managed_by=edpm_ansible) Dec 15 05:02:48 localhost 
podman[317904]: 2025-12-15 10:02:48.768448071 +0000 UTC m=+0.102531115 container exec_died 4d46c406a3855fd1c322e5d86c048188ea5865daa07ba23aaebacc7879668923 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, tcib_managed=true, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '07f3af15d79276418f54a8dde2fc251064527c3571430e14af77aaa2c01f1193-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb) Dec 15 05:02:48 localhost systemd[1]: 
4d46c406a3855fd1c322e5d86c048188ea5865daa07ba23aaebacc7879668923.service: Deactivated successfully. Dec 15 05:02:49 localhost neutron_sriov_agent[260044]: 2025-12-15 10:02:49.259 2 INFO neutron.agent.securitygroups_rpc [None req-6bdea9ef-11d5-4bce-ba4d-7bd04764ad7e d5377d4013ec43b98bbf91c063971dac 1808d9ac20f44272b13e28ceb7a5e4c6 - - default default] Security group member updated ['f398dd81-32a0-4005-81c7-146c01e93e74']#033[00m Dec 15 05:02:49 localhost neutron_sriov_agent[260044]: 2025-12-15 10:02:49.336 2 INFO neutron.agent.securitygroups_rpc [None req-6bdea9ef-11d5-4bce-ba4d-7bd04764ad7e d5377d4013ec43b98bbf91c063971dac 1808d9ac20f44272b13e28ceb7a5e4c6 - - default default] Security group member updated ['f398dd81-32a0-4005-81c7-146c01e93e74']#033[00m Dec 15 05:02:49 localhost neutron_sriov_agent[260044]: 2025-12-15 10:02:49.942 2 INFO neutron.agent.securitygroups_rpc [None req-12ec496f-57cd-48ea-9d8c-8f61b6db6076 d5377d4013ec43b98bbf91c063971dac 1808d9ac20f44272b13e28ceb7a5e4c6 - - default default] Security group member updated ['f398dd81-32a0-4005-81c7-146c01e93e74']#033[00m Dec 15 05:02:49 localhost nova_compute[286344]: 2025-12-15 10:02:49.964 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:02:50 localhost ceph-mon[298913]: mon.np0005559462@0(leader).osd e116 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Dec 15 05:02:50 localhost nova_compute[286344]: 2025-12-15 10:02:50.566 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:02:50 localhost neutron_sriov_agent[260044]: 2025-12-15 10:02:50.827 2 INFO neutron.agent.securitygroups_rpc [None req-b0fc8483-0336-4abe-9e2e-5f19ec545ad0 d5377d4013ec43b98bbf91c063971dac 1808d9ac20f44272b13e28ceb7a5e4c6 - - default default] Security group member updated 
['f398dd81-32a0-4005-81c7-146c01e93e74']#033[00m Dec 15 05:02:50 localhost neutron_dhcp_agent[267542]: 2025-12-15 10:02:50.861 267546 INFO neutron.agent.dhcp.agent [-] Network not present, action: clean_devices, action_kwargs: {}#033[00m Dec 15 05:02:51 localhost nova_compute[286344]: 2025-12-15 10:02:51.414 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:02:51 localhost ovn_metadata_agent[160585]: 2025-12-15 10:02:51.479 160590 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Dec 15 05:02:51 localhost ovn_metadata_agent[160585]: 2025-12-15 10:02:51.480 160590 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Dec 15 05:02:51 localhost ovn_metadata_agent[160585]: 2025-12-15 10:02:51.482 160590 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Dec 15 05:02:51 localhost systemd[1]: Started /usr/bin/podman healthcheck run a4e94bd7aaee6c174c601060fb5f34559019ccea93fc397263ee3735731cfc1e. 
Dec 15 05:02:51 localhost podman[317923]: 2025-12-15 10:02:51.743434282 +0000 UTC m=+0.077457683 container health_status a4e94bd7aaee6c174c601060fb5f34559019ccea93fc397263ee3735731cfc1e (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible) Dec 15 05:02:51 localhost podman[317923]: 2025-12-15 10:02:51.756674288 +0000 UTC m=+0.090697659 container exec_died a4e94bd7aaee6c174c601060fb5f34559019ccea93fc397263ee3735731cfc1e (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': 
['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter) Dec 15 05:02:51 localhost systemd[1]: a4e94bd7aaee6c174c601060fb5f34559019ccea93fc397263ee3735731cfc1e.service: Deactivated successfully. Dec 15 05:02:52 localhost neutron_sriov_agent[260044]: 2025-12-15 10:02:52.078 2 INFO neutron.agent.securitygroups_rpc [req-d7d4d39e-b173-44d8-a353-45b9b8b2238a req-750325c1-6641-4ec6-8406-6244031fd442 34b5d2b3fefc46c08c7db0954001edbc 9c1f18e297264783b4b7757dc5db064e - - default default] Security group rule updated ['66e59027-ea8e-4c02-8937-d9f719566bec']#033[00m Dec 15 05:02:52 localhost nova_compute[286344]: 2025-12-15 10:02:52.080 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:02:52 localhost neutron_sriov_agent[260044]: 2025-12-15 10:02:52.573 2 INFO neutron.agent.securitygroups_rpc [req-0742be84-2908-4753-adf2-1320baab5426 req-d9be0209-136c-4549-a3d5-3f7b551e90a7 34b5d2b3fefc46c08c7db0954001edbc 9c1f18e297264783b4b7757dc5db064e - - default default] Security group rule updated ['66e59027-ea8e-4c02-8937-d9f719566bec']#033[00m Dec 15 05:02:54 localhost neutron_dhcp_agent[267542]: 2025-12-15 10:02:54.516 267546 INFO neutron.agent.linux.ip_lib [None req-aa62153b-9929-4534-848e-4dcc3bec3d46 - - - - - -] Device tap358ecb33-b6 cannot be used as it has no MAC address#033[00m Dec 15 05:02:54 localhost nova_compute[286344]: 2025-12-15 10:02:54.577 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:02:54 localhost kernel: device tap358ecb33-b6 entered promiscuous mode Dec 15 05:02:54 localhost NetworkManager[5963]: [1765792974.5842] manager: (tap358ecb33-b6): new Generic device (/org/freedesktop/NetworkManager/Devices/32) Dec 15 05:02:54 localhost ovn_controller[154603]: 
2025-12-15T10:02:54Z|00190|binding|INFO|Claiming lport 358ecb33-b639-4049-814f-eadc6ff33eff for this chassis. Dec 15 05:02:54 localhost ovn_controller[154603]: 2025-12-15T10:02:54Z|00191|binding|INFO|358ecb33-b639-4049-814f-eadc6ff33eff: Claiming unknown Dec 15 05:02:54 localhost nova_compute[286344]: 2025-12-15 10:02:54.589 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:02:54 localhost systemd-udevd[317955]: Network interface NamePolicy= disabled on kernel command line. Dec 15 05:02:54 localhost ovn_metadata_agent[160585]: 2025-12-15 10:02:54.596 160590 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005559462.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 'dhcpce287b4b-a043-5ed9-8e08-4fd555d768a2-ffecc2b3-7684-4365-b349-feb33b6a786e', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-ffecc2b3-7684-4365-b349-feb33b6a786e', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '1808d9ac20f44272b13e28ceb7a5e4c6', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=606fc3ee-b802-4fc6-928d-12af2a6d9240, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=358ecb33-b639-4049-814f-eadc6ff33eff) old=Port_Binding(chassis=[]) matches 
/usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Dec 15 05:02:54 localhost ovn_metadata_agent[160585]: 2025-12-15 10:02:54.598 160590 INFO neutron.agent.ovn.metadata.agent [-] Port 358ecb33-b639-4049-814f-eadc6ff33eff in datapath ffecc2b3-7684-4365-b349-feb33b6a786e bound to our chassis#033[00m Dec 15 05:02:54 localhost ovn_metadata_agent[160585]: 2025-12-15 10:02:54.599 160590 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network ffecc2b3-7684-4365-b349-feb33b6a786e or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599#033[00m Dec 15 05:02:54 localhost ovn_metadata_agent[160585]: 2025-12-15 10:02:54.600 160858 DEBUG oslo.privsep.daemon [-] privsep: reply[fd1b12be-462e-45cf-9ccb-7994648cc055]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Dec 15 05:02:54 localhost journal[231322]: ethtool ioctl error on tap358ecb33-b6: No such device Dec 15 05:02:54 localhost nova_compute[286344]: 2025-12-15 10:02:54.615 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:02:54 localhost ovn_controller[154603]: 2025-12-15T10:02:54Z|00192|binding|INFO|Setting lport 358ecb33-b639-4049-814f-eadc6ff33eff ovn-installed in OVS Dec 15 05:02:54 localhost ovn_controller[154603]: 2025-12-15T10:02:54Z|00193|binding|INFO|Setting lport 358ecb33-b639-4049-814f-eadc6ff33eff up in Southbound Dec 15 05:02:54 localhost journal[231322]: ethtool ioctl error on tap358ecb33-b6: No such device Dec 15 05:02:54 localhost journal[231322]: ethtool ioctl error on tap358ecb33-b6: No such device Dec 15 05:02:54 localhost journal[231322]: ethtool ioctl error on tap358ecb33-b6: No such device Dec 15 05:02:54 localhost journal[231322]: ethtool ioctl error on tap358ecb33-b6: No such device Dec 15 
05:02:54 localhost journal[231322]: ethtool ioctl error on tap358ecb33-b6: No such device Dec 15 05:02:54 localhost journal[231322]: ethtool ioctl error on tap358ecb33-b6: No such device Dec 15 05:02:54 localhost journal[231322]: ethtool ioctl error on tap358ecb33-b6: No such device Dec 15 05:02:54 localhost nova_compute[286344]: 2025-12-15 10:02:54.662 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:02:54 localhost nova_compute[286344]: 2025-12-15 10:02:54.692 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:02:54 localhost nova_compute[286344]: 2025-12-15 10:02:54.966 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:02:55 localhost systemd[1]: virtsecretd.service: Deactivated successfully. 
Dec 15 05:02:55 localhost ceph-mon[298913]: mon.np0005559462@0(leader).osd e116 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Dec 15 05:02:55 localhost podman[318061]: Dec 15 05:02:55 localhost podman[318061]: 2025-12-15 10:02:55.623263304 +0000 UTC m=+0.092212262 container create 421e75515481fc0cc25361ccade48ebcf461c8abd41f3d2df1b7646fa767b3d8 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-ffecc2b3-7684-4365-b349-feb33b6a786e, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image) Dec 15 05:02:55 localhost systemd[1]: Started libpod-conmon-421e75515481fc0cc25361ccade48ebcf461c8abd41f3d2df1b7646fa767b3d8.scope. Dec 15 05:02:55 localhost podman[318061]: 2025-12-15 10:02:55.583094853 +0000 UTC m=+0.052043841 image pull quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified Dec 15 05:02:55 localhost systemd[1]: Started libcrun container. 
Dec 15 05:02:55 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/047c03711c99433568d57a76e7f60e9000ad82a1f88b7a1f821f80177a503765/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff) Dec 15 05:02:55 localhost podman[318061]: 2025-12-15 10:02:55.71462827 +0000 UTC m=+0.183577238 container init 421e75515481fc0cc25361ccade48ebcf461c8abd41f3d2df1b7646fa767b3d8 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-ffecc2b3-7684-4365-b349-feb33b6a786e, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3) Dec 15 05:02:55 localhost podman[318061]: 2025-12-15 10:02:55.723935008 +0000 UTC m=+0.192883966 container start 421e75515481fc0cc25361ccade48ebcf461c8abd41f3d2df1b7646fa767b3d8 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-ffecc2b3-7684-4365-b349-feb33b6a786e, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202) Dec 15 05:02:55 localhost dnsmasq[318082]: started, version 2.85 cachesize 150 Dec 15 05:02:55 localhost dnsmasq[318082]: DNS service limited to local subnets Dec 15 05:02:55 localhost dnsmasq[318082]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile Dec 15 05:02:55 localhost dnsmasq[318082]: warning: no upstream servers 
configured Dec 15 05:02:55 localhost dnsmasq-dhcp[318082]: DHCP, static leases only on 10.100.0.0, lease time 1d Dec 15 05:02:55 localhost dnsmasq[318082]: read /var/lib/neutron/dhcp/ffecc2b3-7684-4365-b349-feb33b6a786e/addn_hosts - 0 addresses Dec 15 05:02:55 localhost dnsmasq-dhcp[318082]: read /var/lib/neutron/dhcp/ffecc2b3-7684-4365-b349-feb33b6a786e/host Dec 15 05:02:55 localhost dnsmasq-dhcp[318082]: read /var/lib/neutron/dhcp/ffecc2b3-7684-4365-b349-feb33b6a786e/opts Dec 15 05:02:55 localhost neutron_dhcp_agent[267542]: 2025-12-15 10:02:55.902 267546 INFO neutron.agent.dhcp.agent [None req-a7b4c802-b487-4775-a7eb-5efb57b7b1e1 - - - - - -] DHCP configuration for ports {'0dda2dbe-564e-46ee-8274-5fa125647654'} is completed#033[00m Dec 15 05:02:56 localhost ceph-mon[298913]: mon.np0005559462@0(leader) e17 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/osd_remove_queue}] v 0) Dec 15 05:02:56 localhost ceph-mon[298913]: log_channel(audit) log [INF] : from='mgr.34411 ' entity='mgr.np0005559464.aomnqe' Dec 15 05:02:56 localhost ceph-mon[298913]: from='mgr.34411 172.18.0.108:0/929118679' entity='mgr.np0005559464.aomnqe' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch Dec 15 05:02:56 localhost ceph-mon[298913]: from='mgr.34411 ' entity='mgr.np0005559464.aomnqe' Dec 15 05:02:56 localhost neutron_sriov_agent[260044]: 2025-12-15 10:02:56.792 2 INFO neutron.agent.securitygroups_rpc [None req-97feab61-8c89-405e-a5c2-e3c04c4b1097 d5377d4013ec43b98bbf91c063971dac 1808d9ac20f44272b13e28ceb7a5e4c6 - - default default] Security group member updated ['f398dd81-32a0-4005-81c7-146c01e93e74']#033[00m Dec 15 05:02:56 localhost nova_compute[286344]: 2025-12-15 10:02:56.793 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:02:56 localhost neutron_dhcp_agent[267542]: 2025-12-15 10:02:56.799 267546 INFO neutron.agent.dhcp.agent [-] Trigger 
reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-12-15T10:02:56Z, description=, device_id=, device_owner=, dns_assignment=[], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[], id=2fde306f-249b-41f1-8c99-196a682bcc72, ip_allocation=immediate, mac_address=fa:16:3e:55:28:c0, name=tempest-PortsTestJSON-1705118698, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-12-15T10:02:52Z, description=, dns_domain=, id=ffecc2b3-7684-4365-b349-feb33b6a786e, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-PortsTestJSON-1391057240, port_security_enabled=True, project_id=1808d9ac20f44272b13e28ceb7a5e4c6, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=17562, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=1098, status=ACTIVE, subnets=['8576d792-47bb-48bf-b33c-fb07fa91f4aa'], tags=[], tenant_id=1808d9ac20f44272b13e28ceb7a5e4c6, updated_at=2025-12-15T10:02:53Z, vlan_transparent=None, network_id=ffecc2b3-7684-4365-b349-feb33b6a786e, port_security_enabled=True, project_id=1808d9ac20f44272b13e28ceb7a5e4c6, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=['f398dd81-32a0-4005-81c7-146c01e93e74'], standard_attr_id=1120, status=DOWN, tags=[], tenant_id=1808d9ac20f44272b13e28ceb7a5e4c6, updated_at=2025-12-15T10:02:56Z on network ffecc2b3-7684-4365-b349-feb33b6a786e#033[00m Dec 15 05:02:56 localhost dnsmasq[318082]: read /var/lib/neutron/dhcp/ffecc2b3-7684-4365-b349-feb33b6a786e/addn_hosts - 1 addresses Dec 15 05:02:56 localhost dnsmasq-dhcp[318082]: read /var/lib/neutron/dhcp/ffecc2b3-7684-4365-b349-feb33b6a786e/host Dec 15 05:02:56 localhost dnsmasq-dhcp[318082]: read 
/var/lib/neutron/dhcp/ffecc2b3-7684-4365-b349-feb33b6a786e/opts Dec 15 05:02:56 localhost podman[318151]: 2025-12-15 10:02:56.994141594 +0000 UTC m=+0.049122890 container kill 421e75515481fc0cc25361ccade48ebcf461c8abd41f3d2df1b7646fa767b3d8 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-ffecc2b3-7684-4365-b349-feb33b6a786e, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, maintainer=OpenStack Kubernetes Operator team) Dec 15 05:02:57 localhost nova_compute[286344]: 2025-12-15 10:02:57.082 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:02:57 localhost neutron_sriov_agent[260044]: 2025-12-15 10:02:57.127 2 INFO neutron.agent.securitygroups_rpc [None req-cc73a3b4-85be-4199-8018-e4097236be1a d5377d4013ec43b98bbf91c063971dac 1808d9ac20f44272b13e28ceb7a5e4c6 - - default default] Security group member updated ['f398dd81-32a0-4005-81c7-146c01e93e74']#033[00m Dec 15 05:02:57 localhost neutron_dhcp_agent[267542]: 2025-12-15 10:02:57.231 267546 INFO neutron.agent.dhcp.agent [None req-10d0b15b-24c3-4f7a-ac31-3c56313f303a - - - - - -] DHCP configuration for ports {'2fde306f-249b-41f1-8c99-196a682bcc72'} is completed#033[00m Dec 15 05:02:57 localhost dnsmasq[318082]: read /var/lib/neutron/dhcp/ffecc2b3-7684-4365-b349-feb33b6a786e/addn_hosts - 0 addresses Dec 15 05:02:57 localhost dnsmasq-dhcp[318082]: read /var/lib/neutron/dhcp/ffecc2b3-7684-4365-b349-feb33b6a786e/host Dec 15 05:02:57 localhost dnsmasq-dhcp[318082]: read /var/lib/neutron/dhcp/ffecc2b3-7684-4365-b349-feb33b6a786e/opts Dec 15 05:02:57 localhost podman[318191]: 2025-12-15 10:02:57.408100541 +0000 UTC m=+0.058781786 
container kill 421e75515481fc0cc25361ccade48ebcf461c8abd41f3d2df1b7646fa767b3d8 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-ffecc2b3-7684-4365-b349-feb33b6a786e, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, tcib_managed=true, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.vendor=CentOS) Dec 15 05:02:57 localhost dnsmasq[318082]: exiting on receipt of SIGTERM Dec 15 05:02:57 localhost podman[318231]: 2025-12-15 10:02:57.838093413 +0000 UTC m=+0.068638040 container kill 421e75515481fc0cc25361ccade48ebcf461c8abd41f3d2df1b7646fa767b3d8 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-ffecc2b3-7684-4365-b349-feb33b6a786e, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, org.label-schema.vendor=CentOS) Dec 15 05:02:57 localhost systemd[1]: libpod-421e75515481fc0cc25361ccade48ebcf461c8abd41f3d2df1b7646fa767b3d8.scope: Deactivated successfully. 
Dec 15 05:02:57 localhost ovn_controller[154603]: 2025-12-15T10:02:57Z|00194|binding|INFO|Removing iface tap358ecb33-b6 ovn-installed in OVS Dec 15 05:02:57 localhost ovn_metadata_agent[160585]: 2025-12-15 10:02:57.842 160590 WARNING neutron.agent.ovn.metadata.agent [-] Removing non-external type port ee10367d-578b-467e-a84b-b7474636fe73 with type ""#033[00m Dec 15 05:02:57 localhost ovn_controller[154603]: 2025-12-15T10:02:57Z|00195|binding|INFO|Removing lport 358ecb33-b639-4049-814f-eadc6ff33eff ovn-installed in OVS Dec 15 05:02:57 localhost ovn_metadata_agent[160585]: 2025-12-15 10:02:57.843 160590 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched DELETE: PortBindingDeletedEvent(events=('delete',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[True], options={'requested-chassis': 'np0005559462.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 'dhcpce287b4b-a043-5ed9-8e08-4fd555d768a2-ffecc2b3-7684-4365-b349-feb33b6a786e', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-ffecc2b3-7684-4365-b349-feb33b6a786e', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '1808d9ac20f44272b13e28ceb7a5e4c6', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'np0005559462.localdomain'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=606fc3ee-b802-4fc6-928d-12af2a6d9240, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=358ecb33-b639-4049-814f-eadc6ff33eff) old= matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Dec 15 05:02:57 
localhost ovn_metadata_agent[160585]: 2025-12-15 10:02:57.844 160590 INFO neutron.agent.ovn.metadata.agent [-] Port 358ecb33-b639-4049-814f-eadc6ff33eff in datapath ffecc2b3-7684-4365-b349-feb33b6a786e unbound from our chassis#033[00m Dec 15 05:02:57 localhost ovn_metadata_agent[160585]: 2025-12-15 10:02:57.846 160590 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network ffecc2b3-7684-4365-b349-feb33b6a786e, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m Dec 15 05:02:57 localhost ovn_metadata_agent[160585]: 2025-12-15 10:02:57.846 160858 DEBUG oslo.privsep.daemon [-] privsep: reply[1405405b-d58f-4b58-95f1-76687da85510]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Dec 15 05:02:57 localhost nova_compute[286344]: 2025-12-15 10:02:57.874 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:02:57 localhost podman[318247]: 2025-12-15 10:02:57.921565481 +0000 UTC m=+0.062133009 container died 421e75515481fc0cc25361ccade48ebcf461c8abd41f3d2df1b7646fa767b3d8 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-ffecc2b3-7684-4365-b349-feb33b6a786e, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS) Dec 15 05:02:57 localhost podman[318247]: 2025-12-15 10:02:57.949220426 +0000 UTC m=+0.089787904 container cleanup 421e75515481fc0cc25361ccade48ebcf461c8abd41f3d2df1b7646fa767b3d8 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, 
name=neutron-dnsmasq-qdhcp-ffecc2b3-7684-4365-b349-feb33b6a786e, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team) Dec 15 05:02:57 localhost systemd[1]: libpod-conmon-421e75515481fc0cc25361ccade48ebcf461c8abd41f3d2df1b7646fa767b3d8.scope: Deactivated successfully. Dec 15 05:02:57 localhost systemd[1]: var-lib-containers-storage-overlay-047c03711c99433568d57a76e7f60e9000ad82a1f88b7a1f821f80177a503765-merged.mount: Deactivated successfully. Dec 15 05:02:57 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-421e75515481fc0cc25361ccade48ebcf461c8abd41f3d2df1b7646fa767b3d8-userdata-shm.mount: Deactivated successfully. Dec 15 05:02:58 localhost podman[318246]: 2025-12-15 10:02:58.011741285 +0000 UTC m=+0.137854423 container remove 421e75515481fc0cc25361ccade48ebcf461c8abd41f3d2df1b7646fa767b3d8 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-ffecc2b3-7684-4365-b349-feb33b6a786e, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202) Dec 15 05:02:58 localhost nova_compute[286344]: 2025-12-15 10:02:58.022 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:02:58 localhost kernel: device tap358ecb33-b6 left promiscuous mode Dec 15 05:02:58 localhost nova_compute[286344]: 2025-12-15 10:02:58.034 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 
[POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:02:58 localhost neutron_dhcp_agent[267542]: 2025-12-15 10:02:58.072 267546 INFO neutron.agent.dhcp.agent [None req-cd3b8211-1a27-45a1-b6e7-1040e8fb4003 - - - - - -] Network not present, action: clean_devices, action_kwargs: {}#033[00m Dec 15 05:02:58 localhost systemd[1]: run-netns-qdhcp\x2dffecc2b3\x2d7684\x2d4365\x2db349\x2dfeb33b6a786e.mount: Deactivated successfully. Dec 15 05:02:58 localhost neutron_dhcp_agent[267542]: 2025-12-15 10:02:58.090 267546 INFO neutron.agent.dhcp.agent [-] Network not present, action: clean_devices, action_kwargs: {}#033[00m Dec 15 05:02:58 localhost ovn_controller[154603]: 2025-12-15T10:02:58Z|00196|binding|INFO|Releasing lport b35254ad-12eb-47bb-92be-44fefe0694f0 from this chassis (sb_readonly=0) Dec 15 05:02:58 localhost nova_compute[286344]: 2025-12-15 10:02:58.366 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:02:58 localhost ovn_controller[154603]: 2025-12-15T10:02:58Z|00197|binding|INFO|Releasing lport b35254ad-12eb-47bb-92be-44fefe0694f0 from this chassis (sb_readonly=0) Dec 15 05:02:58 localhost nova_compute[286344]: 2025-12-15 10:02:58.577 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:02:59 localhost ceph-mon[298913]: mon.np0005559462@0(leader) e17 handle_command mon_command([{prefix=config-key set, key=mgr/progress/completed}] v 0) Dec 15 05:02:59 localhost ceph-mon[298913]: log_channel(audit) log [INF] : from='mgr.34411 ' entity='mgr.np0005559464.aomnqe' Dec 15 05:02:59 localhost nova_compute[286344]: 2025-12-15 10:02:59.968 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:03:00 localhost ceph-mon[298913]: from='mgr.34411 
' entity='mgr.np0005559464.aomnqe' Dec 15 05:03:00 localhost ceph-mon[298913]: mon.np0005559462@0(leader).osd e116 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Dec 15 05:03:01 localhost ovn_controller[154603]: 2025-12-15T10:03:01Z|00198|binding|INFO|Releasing lport b35254ad-12eb-47bb-92be-44fefe0694f0 from this chassis (sb_readonly=0) Dec 15 05:03:01 localhost nova_compute[286344]: 2025-12-15 10:03:01.861 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:03:01 localhost podman[243449]: time="2025-12-15T10:03:01Z" level=info msg="List containers: received `last` parameter - overwriting `limit`" Dec 15 05:03:01 localhost podman[243449]: @ - - [15/Dec/2025:10:03:01 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 156640 "" "Go-http-client/1.1" Dec 15 05:03:01 localhost podman[243449]: @ - - [15/Dec/2025:10:03:01 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 19251 "" "Go-http-client/1.1" Dec 15 05:03:02 localhost nova_compute[286344]: 2025-12-15 10:03:02.086 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:03:02 localhost ceph-mon[298913]: mon.np0005559462@0(leader).osd e116 do_prune osdmap full prune enabled Dec 15 05:03:02 localhost ceph-mon[298913]: mon.np0005559462@0(leader).osd e117 e117: 6 total, 6 up, 6 in Dec 15 05:03:02 localhost ceph-mon[298913]: log_channel(cluster) log [DBG] : osdmap e117: 6 total, 6 up, 6 in Dec 15 05:03:03 localhost ceph-mon[298913]: mon.np0005559462@0(leader).osd e117 do_prune osdmap full prune enabled Dec 15 05:03:03 localhost ceph-mon[298913]: mon.np0005559462@0(leader).osd e118 e118: 6 total, 6 up, 6 in Dec 15 05:03:03 localhost ceph-mon[298913]: 
log_channel(cluster) log [DBG] : osdmap e118: 6 total, 6 up, 6 in Dec 15 05:03:04 localhost ovn_metadata_agent[160585]: 2025-12-15 10:03:04.034 160590 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=11, ssl=[], options={'arp_ns_explicit_output': 'true', 'fdb_removal_limit': '0', 'ignore_lsp_down': 'false', 'mac_binding_removal_limit': '0', 'mac_prefix': 'fe:17:e3', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': 'fe:55:2b:86:15:b5'}, ipsec=False) old=SB_Global(nb_cfg=10) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Dec 15 05:03:04 localhost nova_compute[286344]: 2025-12-15 10:03:04.034 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:03:04 localhost ovn_metadata_agent[160585]: 2025-12-15 10:03:04.035 160590 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 6 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m Dec 15 05:03:04 localhost openstack_network_exporter[246484]: ERROR 10:03:04 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server Dec 15 05:03:04 localhost openstack_network_exporter[246484]: ERROR 10:03:04 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd Dec 15 05:03:04 localhost openstack_network_exporter[246484]: ERROR 10:03:04 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd Dec 15 05:03:04 localhost openstack_network_exporter[246484]: ERROR 10:03:04 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath Dec 15 05:03:04 localhost openstack_network_exporter[246484]: Dec 15 
05:03:04 localhost openstack_network_exporter[246484]: ERROR 10:03:04 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath Dec 15 05:03:04 localhost openstack_network_exporter[246484]: Dec 15 05:03:04 localhost nova_compute[286344]: 2025-12-15 10:03:04.970 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:03:05 localhost ceph-mon[298913]: mon.np0005559462@0(leader).osd e118 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Dec 15 05:03:05 localhost ceph-mon[298913]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #46. Immutable memtables: 0. Dec 15 05:03:05 localhost ceph-mon[298913]: rocksdb: (Original Log Time 2025/12/15-10:03:05.956178) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0 Dec 15 05:03:05 localhost ceph-mon[298913]: rocksdb: [db/flush_job.cc:856] [default] [JOB 25] Flushing memtable with next log file: 46 Dec 15 05:03:05 localhost ceph-mon[298913]: rocksdb: EVENT_LOG_v1 {"time_micros": 1765792985956260, "job": 25, "event": "flush_started", "num_memtables": 1, "num_entries": 2306, "num_deletes": 259, "total_data_size": 2284013, "memory_usage": 2333872, "flush_reason": "Manual Compaction"} Dec 15 05:03:05 localhost ceph-mon[298913]: rocksdb: [db/flush_job.cc:885] [default] [JOB 25] Level-0 flush table #47: started Dec 15 05:03:05 localhost neutron_sriov_agent[260044]: 2025-12-15 10:03:05.958 2 INFO neutron.agent.securitygroups_rpc [None req-0a61d590-6611-4636-89cd-83d84ef55c11 d5377d4013ec43b98bbf91c063971dac 1808d9ac20f44272b13e28ceb7a5e4c6 - - default default] Security group member updated ['f398dd81-32a0-4005-81c7-146c01e93e74']#033[00m Dec 15 05:03:05 
localhost ceph-mon[298913]: rocksdb: EVENT_LOG_v1 {"time_micros": 1765792985974746, "cf_name": "default", "job": 25, "event": "table_file_creation", "file_number": 47, "file_size": 2209728, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 25191, "largest_seqno": 27496, "table_properties": {"data_size": 2200555, "index_size": 5678, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 2437, "raw_key_size": 20459, "raw_average_key_size": 21, "raw_value_size": 2181534, "raw_average_value_size": 2263, "num_data_blocks": 248, "num_entries": 964, "num_filter_entries": 964, "num_deletions": 259, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1765792816, "oldest_key_time": 1765792816, "file_creation_time": 1765792985, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "603b24af-e2be-4214-bc56-9e652eb4af3d", "db_session_id": "0OJRM9SCUA16EXV0VQZ2", "orig_file_number": 47, "seqno_to_time_mapping": "N/A"}} Dec 15 05:03:05 localhost ceph-mon[298913]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 25] Flush lasted 18599 microseconds, and 4740 cpu microseconds. Dec 15 05:03:05 localhost ceph-mon[298913]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed. 
Dec 15 05:03:05 localhost ceph-mon[298913]: rocksdb: (Original Log Time 2025/12/15-10:03:05.974786) [db/flush_job.cc:967] [default] [JOB 25] Level-0 flush table #47: 2209728 bytes OK Dec 15 05:03:05 localhost ceph-mon[298913]: rocksdb: (Original Log Time 2025/12/15-10:03:05.974806) [db/memtable_list.cc:519] [default] Level-0 commit table #47 started Dec 15 05:03:05 localhost ceph-mon[298913]: rocksdb: (Original Log Time 2025/12/15-10:03:05.976666) [db/memtable_list.cc:722] [default] Level-0 commit table #47: memtable #1 done Dec 15 05:03:05 localhost ceph-mon[298913]: rocksdb: (Original Log Time 2025/12/15-10:03:05.976689) EVENT_LOG_v1 {"time_micros": 1765792985976682, "job": 25, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0} Dec 15 05:03:05 localhost ceph-mon[298913]: rocksdb: (Original Log Time 2025/12/15-10:03:05.976711) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25 Dec 15 05:03:05 localhost ceph-mon[298913]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 25] Try to delete WAL files size 2274356, prev total WAL file size 2274356, number of live WAL files 2. Dec 15 05:03:05 localhost ceph-mon[298913]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005559462/store.db/000043.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000 Dec 15 05:03:05 localhost ceph-mon[298913]: rocksdb: (Original Log Time 2025/12/15-10:03:05.978008) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '7061786F73003131373937' seq:72057594037927935, type:22 .. 
'7061786F73003132303439' seq:0, type:0; will stop at (end) Dec 15 05:03:05 localhost ceph-mon[298913]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 26] Compacting 1@0 + 1@6 files to L6, score -1.00 Dec 15 05:03:05 localhost ceph-mon[298913]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 25 Base level 0, inputs: [47(2157KB)], [45(17MB)] Dec 15 05:03:05 localhost ceph-mon[298913]: rocksdb: EVENT_LOG_v1 {"time_micros": 1765792985978064, "job": 26, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [47], "files_L6": [45], "score": -1, "input_data_size": 20066058, "oldest_snapshot_seqno": -1} Dec 15 05:03:06 localhost ceph-mon[298913]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 26] Generated table #48: 12678 keys, 17537247 bytes, temperature: kUnknown Dec 15 05:03:06 localhost ceph-mon[298913]: rocksdb: EVENT_LOG_v1 {"time_micros": 1765792986110111, "cf_name": "default", "job": 26, "event": "table_file_creation", "file_number": 48, "file_size": 17537247, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 17464718, "index_size": 39724, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 31749, "raw_key_size": 339739, "raw_average_key_size": 26, "raw_value_size": 17248351, "raw_average_value_size": 1360, "num_data_blocks": 1514, "num_entries": 12678, "num_filter_entries": 12678, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; 
strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1765792320, "oldest_key_time": 0, "file_creation_time": 1765792985, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "603b24af-e2be-4214-bc56-9e652eb4af3d", "db_session_id": "0OJRM9SCUA16EXV0VQZ2", "orig_file_number": 48, "seqno_to_time_mapping": "N/A"}} Dec 15 05:03:06 localhost ceph-mon[298913]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed. Dec 15 05:03:06 localhost ceph-mon[298913]: rocksdb: (Original Log Time 2025/12/15-10:03:06.110407) [db/compaction/compaction_job.cc:1663] [default] [JOB 26] Compacted 1@0 + 1@6 files to L6 => 17537247 bytes Dec 15 05:03:06 localhost ceph-mon[298913]: rocksdb: (Original Log Time 2025/12/15-10:03:06.112055) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 151.9 rd, 132.7 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(2.1, 17.0 +0.0 blob) out(16.7 +0.0 blob), read-write-amplify(17.0) write-amplify(7.9) OK, records in: 13210, records dropped: 532 output_compression: NoCompression Dec 15 05:03:06 localhost ceph-mon[298913]: rocksdb: (Original Log Time 2025/12/15-10:03:06.112087) EVENT_LOG_v1 {"time_micros": 1765792986112074, "job": 26, "event": "compaction_finished", "compaction_time_micros": 132121, "compaction_time_cpu_micros": 51027, "output_level": 6, "num_output_files": 1, "total_output_size": 17537247, "num_input_records": 13210, "num_output_records": 12678, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]} Dec 15 05:03:06 localhost ceph-mon[298913]: rocksdb: [file/delete_scheduler.cc:74] Deleted file 
/var/lib/ceph/mon/ceph-np0005559462/store.db/000047.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000 Dec 15 05:03:06 localhost ceph-mon[298913]: rocksdb: EVENT_LOG_v1 {"time_micros": 1765792986112563, "job": 26, "event": "table_file_deletion", "file_number": 47} Dec 15 05:03:06 localhost ceph-mon[298913]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005559462/store.db/000045.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000 Dec 15 05:03:06 localhost ceph-mon[298913]: rocksdb: EVENT_LOG_v1 {"time_micros": 1765792986114977, "job": 26, "event": "table_file_deletion", "file_number": 45} Dec 15 05:03:06 localhost ceph-mon[298913]: rocksdb: (Original Log Time 2025/12/15-10:03:05.977920) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting Dec 15 05:03:06 localhost ceph-mon[298913]: rocksdb: (Original Log Time 2025/12/15-10:03:06.115145) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting Dec 15 05:03:06 localhost ceph-mon[298913]: rocksdb: (Original Log Time 2025/12/15-10:03:06.115153) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting Dec 15 05:03:06 localhost ceph-mon[298913]: rocksdb: (Original Log Time 2025/12/15-10:03:06.115157) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting Dec 15 05:03:06 localhost ceph-mon[298913]: rocksdb: (Original Log Time 2025/12/15-10:03:06.115160) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting Dec 15 05:03:06 localhost ceph-mon[298913]: rocksdb: (Original Log Time 2025/12/15-10:03:06.115163) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting Dec 15 05:03:06 localhost neutron_dhcp_agent[267542]: 2025-12-15 10:03:06.372 267546 INFO neutron.agent.linux.ip_lib [None req-b6138e84-27d9-402a-b756-232728fe5ea4 - - - - - -] Device tapd10e503e-ef cannot 
be used as it has no MAC address#033[00m Dec 15 05:03:06 localhost nova_compute[286344]: 2025-12-15 10:03:06.390 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:03:06 localhost kernel: device tapd10e503e-ef entered promiscuous mode Dec 15 05:03:06 localhost NetworkManager[5963]: [1765792986.3964] manager: (tapd10e503e-ef): new Generic device (/org/freedesktop/NetworkManager/Devices/33) Dec 15 05:03:06 localhost ovn_controller[154603]: 2025-12-15T10:03:06Z|00199|binding|INFO|Claiming lport d10e503e-ef84-4538-aed9-daa76a68eb88 for this chassis. Dec 15 05:03:06 localhost ovn_controller[154603]: 2025-12-15T10:03:06Z|00200|binding|INFO|d10e503e-ef84-4538-aed9-daa76a68eb88: Claiming unknown Dec 15 05:03:06 localhost nova_compute[286344]: 2025-12-15 10:03:06.397 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:03:06 localhost systemd-udevd[318284]: Network interface NamePolicy= disabled on kernel command line. 
Dec 15 05:03:06 localhost ovn_metadata_agent[160585]: 2025-12-15 10:03:06.410 160590 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005559462.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.3/28', 'neutron:device_id': 'dhcpce287b4b-a043-5ed9-8e08-4fd555d768a2-7045181e-34f8-4eb6-9637-1ed97ee9f99e', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-7045181e-34f8-4eb6-9637-1ed97ee9f99e', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '9f4dc08c7491451692c298986081227e', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=22537a10-1b9d-495d-9379-0c843feed9f6, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=d10e503e-ef84-4538-aed9-daa76a68eb88) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Dec 15 05:03:06 localhost ovn_metadata_agent[160585]: 2025-12-15 10:03:06.412 160590 INFO neutron.agent.ovn.metadata.agent [-] Port d10e503e-ef84-4538-aed9-daa76a68eb88 in datapath 7045181e-34f8-4eb6-9637-1ed97ee9f99e bound to our chassis#033[00m Dec 15 05:03:06 localhost ovn_metadata_agent[160585]: 2025-12-15 10:03:06.414 160590 DEBUG neutron.agent.ovn.metadata.agent [-] Port 82c74775-4d6d-4e15-bccd-874c714e37c3 IP addresses were not retrieved from the Port_Binding MAC column ['unknown'] _get_port_ips 
/usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:536#033[00m Dec 15 05:03:06 localhost ovn_metadata_agent[160585]: 2025-12-15 10:03:06.414 160590 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 7045181e-34f8-4eb6-9637-1ed97ee9f99e, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m Dec 15 05:03:06 localhost ovn_metadata_agent[160585]: 2025-12-15 10:03:06.415 160858 DEBUG oslo.privsep.daemon [-] privsep: reply[06fd02ab-0079-4413-ae9d-63ef682e4764]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Dec 15 05:03:06 localhost journal[231322]: ethtool ioctl error on tapd10e503e-ef: No such device Dec 15 05:03:06 localhost journal[231322]: ethtool ioctl error on tapd10e503e-ef: No such device Dec 15 05:03:06 localhost ovn_controller[154603]: 2025-12-15T10:03:06Z|00201|binding|INFO|Setting lport d10e503e-ef84-4538-aed9-daa76a68eb88 ovn-installed in OVS Dec 15 05:03:06 localhost ovn_controller[154603]: 2025-12-15T10:03:06Z|00202|binding|INFO|Setting lport d10e503e-ef84-4538-aed9-daa76a68eb88 up in Southbound Dec 15 05:03:06 localhost journal[231322]: ethtool ioctl error on tapd10e503e-ef: No such device Dec 15 05:03:06 localhost nova_compute[286344]: 2025-12-15 10:03:06.428 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:03:06 localhost journal[231322]: ethtool ioctl error on tapd10e503e-ef: No such device Dec 15 05:03:06 localhost journal[231322]: ethtool ioctl error on tapd10e503e-ef: No such device Dec 15 05:03:06 localhost journal[231322]: ethtool ioctl error on tapd10e503e-ef: No such device Dec 15 05:03:06 localhost journal[231322]: ethtool ioctl error on tapd10e503e-ef: No such device Dec 15 05:03:06 localhost journal[231322]: ethtool ioctl error on tapd10e503e-ef: No such device Dec 
15 05:03:06 localhost nova_compute[286344]: 2025-12-15 10:03:06.462 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:03:06 localhost nova_compute[286344]: 2025-12-15 10:03:06.488 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:03:06 localhost neutron_sriov_agent[260044]: 2025-12-15 10:03:06.506 2 INFO neutron.agent.securitygroups_rpc [None req-1f2433bb-1d81-409f-ad51-1c0228363b74 d5377d4013ec43b98bbf91c063971dac 1808d9ac20f44272b13e28ceb7a5e4c6 - - default default] Security group member updated ['f398dd81-32a0-4005-81c7-146c01e93e74']#033[00m Dec 15 05:03:06 localhost neutron_dhcp_agent[267542]: 2025-12-15 10:03:06.565 267546 INFO neutron.agent.dhcp.agent [-] Network not present, action: clean_devices, action_kwargs: {}#033[00m Dec 15 05:03:06 localhost ceph-mon[298913]: mon.np0005559462@0(leader) e17 handle_command mon_command({"prefix":"df", "format":"json"} v 0) Dec 15 05:03:06 localhost ceph-mon[298913]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/1692383407' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch Dec 15 05:03:06 localhost ceph-mon[298913]: mon.np0005559462@0(leader) e17 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0) Dec 15 05:03:06 localhost ceph-mon[298913]: log_channel(audit) log [DBG] : from='client.? 
172.18.0.32:0/1692383407' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch Dec 15 05:03:07 localhost neutron_sriov_agent[260044]: 2025-12-15 10:03:07.052 2 INFO neutron.agent.securitygroups_rpc [None req-82d24d53-a632-4a13-b33f-5803561d3021 d5377d4013ec43b98bbf91c063971dac 1808d9ac20f44272b13e28ceb7a5e4c6 - - default default] Security group member updated ['f398dd81-32a0-4005-81c7-146c01e93e74']#033[00m Dec 15 05:03:07 localhost nova_compute[286344]: 2025-12-15 10:03:07.089 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:03:07 localhost podman[318355]: Dec 15 05:03:07 localhost podman[318355]: 2025-12-15 10:03:07.312594536 +0000 UTC m=+0.092959587 container create bad17501b69f40ab6397cef7afcf60e36f304342d154893c4765c4c59d281e20 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-7045181e-34f8-4eb6-9637-1ed97ee9f99e, io.buildah.version=1.41.3, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0) Dec 15 05:03:07 localhost systemd[1]: Started /usr/bin/podman healthcheck run 67ee72fcd99c896f79e8807764b1d0f04e7d45927d06e306c0986394e8ed42a0. Dec 15 05:03:07 localhost systemd[1]: Started /usr/bin/podman healthcheck run 9f0f481c937606298203797a71924b7614a5d9e3f4a8afd837cfa5b9602aedeb. Dec 15 05:03:07 localhost systemd[1]: Started /usr/bin/podman healthcheck run b3d8483ade539500e228c2af9c0c3e647febb5ec7903610a83aea32631007d4a. Dec 15 05:03:07 localhost systemd[1]: Started libpod-conmon-bad17501b69f40ab6397cef7afcf60e36f304342d154893c4765c4c59d281e20.scope. 
Dec 15 05:03:07 localhost podman[318355]: 2025-12-15 10:03:07.267688423 +0000 UTC m=+0.048053504 image pull quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified Dec 15 05:03:07 localhost systemd[1]: Started libcrun container. Dec 15 05:03:07 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/d444bf2f714d48e60d2f0a0983095b45c03ccaecdbbadc879bc4926da7ea564e/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff) Dec 15 05:03:07 localhost podman[318355]: 2025-12-15 10:03:07.409392929 +0000 UTC m=+0.189757990 container init bad17501b69f40ab6397cef7afcf60e36f304342d154893c4765c4c59d281e20 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-7045181e-34f8-4eb6-9637-1ed97ee9f99e, tcib_managed=true, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.license=GPLv2, org.label-schema.build-date=20251202, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team) Dec 15 05:03:07 localhost podman[318355]: 2025-12-15 10:03:07.418684301 +0000 UTC m=+0.199049352 container start bad17501b69f40ab6397cef7afcf60e36f304342d154893c4765c4c59d281e20 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-7045181e-34f8-4eb6-9637-1ed97ee9f99e, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251202, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2) Dec 15 05:03:07 localhost dnsmasq[318406]: started, version 2.85 cachesize 150 Dec 15 05:03:07 localhost dnsmasq[318406]: DNS service limited to local subnets Dec 15 05:03:07 localhost 
dnsmasq[318406]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile Dec 15 05:03:07 localhost dnsmasq[318406]: warning: no upstream servers configured Dec 15 05:03:07 localhost dnsmasq-dhcp[318406]: DHCP, static leases only on 10.100.0.0, lease time 1d Dec 15 05:03:07 localhost dnsmasq[318406]: read /var/lib/neutron/dhcp/7045181e-34f8-4eb6-9637-1ed97ee9f99e/addn_hosts - 0 addresses Dec 15 05:03:07 localhost dnsmasq-dhcp[318406]: read /var/lib/neutron/dhcp/7045181e-34f8-4eb6-9637-1ed97ee9f99e/host Dec 15 05:03:07 localhost dnsmasq-dhcp[318406]: read /var/lib/neutron/dhcp/7045181e-34f8-4eb6-9637-1ed97ee9f99e/opts Dec 15 05:03:07 localhost podman[318372]: 2025-12-15 10:03:07.506341194 +0000 UTC m=+0.142758673 container health_status b3d8483ade539500e228c2af9c0c3e647febb5ec7903610a83aea32631007d4a (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '07f3af15d79276418f54a8dde2fc251064527c3571430e14af77aaa2c01f1193-cb1e9478634a00c8fb5ad9cd7fcd38c7b2a1a85187dfaa618bc715c24c2b15fb'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, config_id=ceilometer_agent_compute, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb) Dec 15 05:03:07 localhost podman[318371]: 2025-12-15 10:03:07.466081497 +0000 UTC m=+0.101607053 container health_status 9f0f481c937606298203797a71924b7614a5d9e3f4a8afd837cfa5b9602aedeb (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_id=multipathd, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team) Dec 15 05:03:07 localhost podman[318372]: 2025-12-15 10:03:07.520389886 +0000 UTC m=+0.156807335 container exec_died b3d8483ade539500e228c2af9c0c3e647febb5ec7903610a83aea32631007d4a (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ceilometer_agent_compute, org.label-schema.license=GPLv2, config_id=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '07f3af15d79276418f54a8dde2fc251064527c3571430e14af77aaa2c01f1193-cb1e9478634a00c8fb5ad9cd7fcd38c7b2a1a85187dfaa618bc715c24c2b15fb'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', 
'/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3, tcib_managed=true, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb) Dec 15 05:03:07 localhost systemd[1]: b3d8483ade539500e228c2af9c0c3e647febb5ec7903610a83aea32631007d4a.service: Deactivated successfully. Dec 15 05:03:07 localhost podman[318371]: 2025-12-15 10:03:07.545801122 +0000 UTC m=+0.181326648 container exec_died 9f0f481c937606298203797a71924b7614a5d9e3f4a8afd837cfa5b9602aedeb (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, container_name=multipathd, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true) Dec 15 05:03:07 localhost neutron_dhcp_agent[267542]: 2025-12-15 10:03:07.548 267546 INFO neutron.agent.dhcp.agent [None req-6b6455ca-65b9-42db-97ce-f8aba8d5145b - - - - - -] DHCP configuration for ports {'2c64b36a-3ea5-4e21-9463-144abcca07f6'} is completed#033[00m Dec 15 05:03:07 localhost systemd[1]: 9f0f481c937606298203797a71924b7614a5d9e3f4a8afd837cfa5b9602aedeb.service: Deactivated successfully. 
Dec 15 05:03:07 localhost podman[318370]: 2025-12-15 10:03:07.606730467 +0000 UTC m=+0.247227308 container health_status 67ee72fcd99c896f79e8807764b1d0f04e7d45927d06e306c0986394e8ed42a0 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, maintainer=The Prometheus Authors , managed_by=edpm_ansible, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter) Dec 15 05:03:07 localhost podman[318370]: 2025-12-15 10:03:07.61843859 +0000 UTC m=+0.258935471 container exec_died 67ee72fcd99c896f79e8807764b1d0f04e7d45927d06e306c0986394e8ed42a0 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', 
'--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible) Dec 15 05:03:07 localhost systemd[1]: 67ee72fcd99c896f79e8807764b1d0f04e7d45927d06e306c0986394e8ed42a0.service: Deactivated successfully. 
Dec 15 05:03:07 localhost neutron_sriov_agent[260044]: 2025-12-15 10:03:07.964 2 INFO neutron.agent.securitygroups_rpc [None req-69dc86e0-4413-4597-a81c-9918ae11e3c6 d5377d4013ec43b98bbf91c063971dac 1808d9ac20f44272b13e28ceb7a5e4c6 - - default default] Security group member updated ['f398dd81-32a0-4005-81c7-146c01e93e74']#033[00m Dec 15 05:03:07 localhost neutron_dhcp_agent[267542]: 2025-12-15 10:03:07.992 267546 INFO neutron.agent.dhcp.agent [-] Network not present, action: clean_devices, action_kwargs: {}#033[00m Dec 15 05:03:08 localhost nova_compute[286344]: 2025-12-15 10:03:08.268 286348 DEBUG oslo_service.periodic_task [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Dec 15 05:03:08 localhost nova_compute[286344]: 2025-12-15 10:03:08.270 286348 DEBUG oslo_service.periodic_task [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Dec 15 05:03:08 localhost nova_compute[286344]: 2025-12-15 10:03:08.596 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:03:08 localhost neutron_sriov_agent[260044]: 2025-12-15 10:03:08.809 2 INFO neutron.agent.securitygroups_rpc [None req-8b43d7ba-32bc-44f0-a50e-41d4b3ecb3be d5377d4013ec43b98bbf91c063971dac 1808d9ac20f44272b13e28ceb7a5e4c6 - - default default] Security group member updated ['f398dd81-32a0-4005-81c7-146c01e93e74']#033[00m Dec 15 05:03:09 localhost nova_compute[286344]: 2025-12-15 10:03:09.270 286348 DEBUG oslo_service.periodic_task [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks 
/usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Dec 15 05:03:09 localhost systemd[1]: Started /usr/bin/podman healthcheck run 730c50020a7d9e2da21d2462d40af20ac303e43f3491f692cbbd87f432ba3e09. Dec 15 05:03:09 localhost systemd[1]: Started /usr/bin/podman healthcheck run ac2eae30a2a7f971398af42ea67b3ed799d4b3341f7688ebcd73f7ca7f66c2a8. Dec 15 05:03:09 localhost podman[318434]: 2025-12-15 10:03:09.772274057 +0000 UTC m=+0.102556448 container health_status 730c50020a7d9e2da21d2462d40af20ac303e43f3491f692cbbd87f432ba3e09 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, vcs-type=git, vendor=Red Hat, Inc., managed_by=edpm_ansible, release=1755695350, com.redhat.component=ubi9-minimal-container, io.openshift.expose-services=, version=9.6, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, architecture=x86_64, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., name=ubi9-minimal, url=https://catalog.redhat.com/en/search?searchType=containers, container_name=openstack_network_exporter, maintainer=Red Hat, Inc., config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'cb1e9478634a00c8fb5ad9cd7fcd38c7b2a1a85187dfaa618bc715c24c2b15fb'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.tags=minimal rhel9, io.buildah.version=1.33.7, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=openstack_network_exporter, distribution-scope=public, build-date=2025-08-20T13:12:41, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly.) 
Dec 15 05:03:09 localhost podman[318434]: 2025-12-15 10:03:09.789400935 +0000 UTC m=+0.119683376 container exec_died 730c50020a7d9e2da21d2462d40af20ac303e43f3491f692cbbd87f432ba3e09 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, vendor=Red Hat, Inc., io.buildah.version=1.33.7, version=9.6, io.openshift.tags=minimal rhel9, config_id=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., release=1755695350, io.openshift.expose-services=, url=https://catalog.redhat.com/en/search?searchType=containers, maintainer=Red Hat, Inc., config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'cb1e9478634a00c8fb5ad9cd7fcd38c7b2a1a85187dfaa618bc715c24c2b15fb'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, name=ubi9-minimal, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, 
summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=edpm_ansible, build-date=2025-08-20T13:12:41, container_name=openstack_network_exporter, com.redhat.component=ubi9-minimal-container, vcs-type=git, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, distribution-scope=public) Dec 15 05:03:09 localhost systemd[1]: 730c50020a7d9e2da21d2462d40af20ac303e43f3491f692cbbd87f432ba3e09.service: Deactivated successfully. Dec 15 05:03:09 localhost podman[318435]: 2025-12-15 10:03:09.860771502 +0000 UTC m=+0.184230182 container health_status ac2eae30a2a7f971398af42ea67b3ed799d4b3341f7688ebcd73f7ca7f66c2a8 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, config_id=ovn_controller, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '07f3af15d79276418f54a8dde2fc251064527c3571430e14af77aaa2c01f1193'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', 
'/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3) Dec 15 05:03:09 localhost podman[318435]: 2025-12-15 10:03:09.927954073 +0000 UTC m=+0.251412783 container exec_died ac2eae30a2a7f971398af42ea67b3ed799d4b3341f7688ebcd73f7ca7f66c2a8 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '07f3af15d79276418f54a8dde2fc251064527c3571430e14af77aaa2c01f1193'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ovn_controller, org.label-schema.vendor=CentOS, maintainer=OpenStack 
Kubernetes Operator team) Dec 15 05:03:09 localhost systemd[1]: ac2eae30a2a7f971398af42ea67b3ed799d4b3341f7688ebcd73f7ca7f66c2a8.service: Deactivated successfully. Dec 15 05:03:09 localhost nova_compute[286344]: 2025-12-15 10:03:09.972 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:03:10 localhost ovn_metadata_agent[160585]: 2025-12-15 10:03:10.036 160590 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=12d96d64-e862-4f68-81e5-8d9ec5d3a5e2, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '11'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m Dec 15 05:03:10 localhost neutron_dhcp_agent[267542]: 2025-12-15 10:03:10.239 267546 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-12-15T10:03:09Z, description=, device_id=040ad888-10ff-422f-9f59-4bf55ac4388f, device_owner=network:router_interface, dns_assignment=[], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[], id=2d171257-6f7e-4c1f-94d7-35501c5b9b5d, ip_allocation=immediate, mac_address=fa:16:3e:4a:24:0d, name=, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-12-15T10:03:04Z, description=, dns_domain=, id=7045181e-34f8-4eb6-9637-1ed97ee9f99e, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-DeleteServersTestJSON-1627578888-network, port_security_enabled=True, project_id=9f4dc08c7491451692c298986081227e, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=3890, qos_policy_id=None, revision_number=2, router:external=False, shared=False, 
standard_attr_id=1167, status=ACTIVE, subnets=['f30ce761-ee5d-4f4e-9d08-636b746e13f5'], tags=[], tenant_id=9f4dc08c7491451692c298986081227e, updated_at=2025-12-15T10:03:04Z, vlan_transparent=None, network_id=7045181e-34f8-4eb6-9637-1ed97ee9f99e, port_security_enabled=False, project_id=9f4dc08c7491451692c298986081227e, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=[], standard_attr_id=1206, status=DOWN, tags=[], tenant_id=9f4dc08c7491451692c298986081227e, updated_at=2025-12-15T10:03:10Z on network 7045181e-34f8-4eb6-9637-1ed97ee9f99e#033[00m Dec 15 05:03:10 localhost neutron_sriov_agent[260044]: 2025-12-15 10:03:10.323 2 INFO neutron.agent.securitygroups_rpc [None req-9a0fa253-1864-4f82-accb-ded9bf8004d1 d5377d4013ec43b98bbf91c063971dac 1808d9ac20f44272b13e28ceb7a5e4c6 - - default default] Security group member updated ['f398dd81-32a0-4005-81c7-146c01e93e74']#033[00m Dec 15 05:03:10 localhost neutron_dhcp_agent[267542]: 2025-12-15 10:03:10.342 267546 INFO neutron.agent.dhcp.agent [-] Network not present, action: clean_devices, action_kwargs: {}#033[00m Dec 15 05:03:10 localhost dnsmasq[318406]: read /var/lib/neutron/dhcp/7045181e-34f8-4eb6-9637-1ed97ee9f99e/addn_hosts - 1 addresses Dec 15 05:03:10 localhost dnsmasq-dhcp[318406]: read /var/lib/neutron/dhcp/7045181e-34f8-4eb6-9637-1ed97ee9f99e/host Dec 15 05:03:10 localhost podman[318496]: 2025-12-15 10:03:10.451938035 +0000 UTC m=+0.059282915 container kill bad17501b69f40ab6397cef7afcf60e36f304342d154893c4765c4c59d281e20 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-7045181e-34f8-4eb6-9637-1ed97ee9f99e, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251202, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, 
tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true) Dec 15 05:03:10 localhost dnsmasq-dhcp[318406]: read /var/lib/neutron/dhcp/7045181e-34f8-4eb6-9637-1ed97ee9f99e/opts Dec 15 05:03:10 localhost systemd-journald[47230]: Data hash table of /run/log/journal/738a39f68bc78fb81032e509449fb759/system.journal has a fill level at 75.0 (53723 of 71630 items, 25165824 file size, 468 bytes per hash table item), suggesting rotation. Dec 15 05:03:10 localhost systemd-journald[47230]: /run/log/journal/738a39f68bc78fb81032e509449fb759/system.journal: Journal header limits reached or header out-of-date, rotating. Dec 15 05:03:10 localhost rsyslogd[759]: imjournal: journal files changed, reloading... [v8.2102.0-111.el9 try https://www.rsyslog.com/e/0 ] Dec 15 05:03:10 localhost ceph-mon[298913]: mon.np0005559462@0(leader).osd e118 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Dec 15 05:03:10 localhost ceph-mon[298913]: mon.np0005559462@0(leader).osd e118 do_prune osdmap full prune enabled Dec 15 05:03:10 localhost rsyslogd[759]: imjournal: journal files changed, reloading... 
[v8.2102.0-111.el9 try https://www.rsyslog.com/e/0 ] Dec 15 05:03:10 localhost ceph-mon[298913]: mon.np0005559462@0(leader).osd e119 e119: 6 total, 6 up, 6 in Dec 15 05:03:10 localhost ceph-mon[298913]: log_channel(cluster) log [DBG] : osdmap e119: 6 total, 6 up, 6 in Dec 15 05:03:10 localhost neutron_dhcp_agent[267542]: 2025-12-15 10:03:10.657 267546 INFO neutron.agent.dhcp.agent [None req-687be7d0-d208-4595-a2cf-500228e2fbdf - - - - - -] DHCP configuration for ports {'2d171257-6f7e-4c1f-94d7-35501c5b9b5d'} is completed#033[00m Dec 15 05:03:11 localhost neutron_dhcp_agent[267542]: 2025-12-15 10:03:11.228 267546 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-12-15T10:03:09Z, description=, device_id=040ad888-10ff-422f-9f59-4bf55ac4388f, device_owner=network:router_interface, dns_assignment=[], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[], id=2d171257-6f7e-4c1f-94d7-35501c5b9b5d, ip_allocation=immediate, mac_address=fa:16:3e:4a:24:0d, name=, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-12-15T10:03:04Z, description=, dns_domain=, id=7045181e-34f8-4eb6-9637-1ed97ee9f99e, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-DeleteServersTestJSON-1627578888-network, port_security_enabled=True, project_id=9f4dc08c7491451692c298986081227e, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=3890, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=1167, status=ACTIVE, subnets=['f30ce761-ee5d-4f4e-9d08-636b746e13f5'], tags=[], tenant_id=9f4dc08c7491451692c298986081227e, updated_at=2025-12-15T10:03:04Z, vlan_transparent=None, network_id=7045181e-34f8-4eb6-9637-1ed97ee9f99e, port_security_enabled=False, 
project_id=9f4dc08c7491451692c298986081227e, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=[], standard_attr_id=1206, status=DOWN, tags=[], tenant_id=9f4dc08c7491451692c298986081227e, updated_at=2025-12-15T10:03:10Z on network 7045181e-34f8-4eb6-9637-1ed97ee9f99e#033[00m Dec 15 05:03:11 localhost nova_compute[286344]: 2025-12-15 10:03:11.270 286348 DEBUG oslo_service.periodic_task [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Dec 15 05:03:11 localhost dnsmasq[318406]: read /var/lib/neutron/dhcp/7045181e-34f8-4eb6-9637-1ed97ee9f99e/addn_hosts - 1 addresses Dec 15 05:03:11 localhost dnsmasq-dhcp[318406]: read /var/lib/neutron/dhcp/7045181e-34f8-4eb6-9637-1ed97ee9f99e/host Dec 15 05:03:11 localhost dnsmasq-dhcp[318406]: read /var/lib/neutron/dhcp/7045181e-34f8-4eb6-9637-1ed97ee9f99e/opts Dec 15 05:03:11 localhost podman[318534]: 2025-12-15 10:03:11.456173584 +0000 UTC m=+0.061730726 container kill bad17501b69f40ab6397cef7afcf60e36f304342d154893c4765c4c59d281e20 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-7045181e-34f8-4eb6-9637-1ed97ee9f99e, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0) Dec 15 05:03:11 localhost neutron_dhcp_agent[267542]: 2025-12-15 10:03:11.706 267546 INFO neutron.agent.dhcp.agent [None req-d28a9363-9b34-49e1-8daf-14482193089b - - - - - -] DHCP configuration for ports {'2d171257-6f7e-4c1f-94d7-35501c5b9b5d'} is completed#033[00m Dec 15 05:03:12 localhost nova_compute[286344]: 
2025-12-15 10:03:12.131 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:03:12 localhost nova_compute[286344]: 2025-12-15 10:03:12.270 286348 DEBUG oslo_service.periodic_task [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Dec 15 05:03:12 localhost nova_compute[286344]: 2025-12-15 10:03:12.270 286348 DEBUG nova.compute.manager [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m Dec 15 05:03:12 localhost nova_compute[286344]: 2025-12-15 10:03:12.271 286348 DEBUG nova.compute.manager [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m Dec 15 05:03:12 localhost nova_compute[286344]: 2025-12-15 10:03:12.351 286348 DEBUG oslo_concurrency.lockutils [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Acquiring lock "refresh_cache-39ff1bd9-6f6b-44c8-bbec-a1fd9d196359" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m Dec 15 05:03:12 localhost nova_compute[286344]: 2025-12-15 10:03:12.352 286348 DEBUG oslo_concurrency.lockutils [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Acquired lock "refresh_cache-39ff1bd9-6f6b-44c8-bbec-a1fd9d196359" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m Dec 15 05:03:12 localhost nova_compute[286344]: 2025-12-15 10:03:12.352 286348 DEBUG nova.network.neutron [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] [instance: 39ff1bd9-6f6b-44c8-bbec-a1fd9d196359] Forcefully refreshing network info cache for instance 
_get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m Dec 15 05:03:12 localhost nova_compute[286344]: 2025-12-15 10:03:12.352 286348 DEBUG nova.objects.instance [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Lazy-loading 'info_cache' on Instance uuid 39ff1bd9-6f6b-44c8-bbec-a1fd9d196359 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m Dec 15 05:03:12 localhost nova_compute[286344]: 2025-12-15 10:03:12.815 286348 DEBUG nova.network.neutron [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] [instance: 39ff1bd9-6f6b-44c8-bbec-a1fd9d196359] Updating instance_info_cache with network_info: [{"id": "03ef8889-3216-43fb-8a52-4be17a956ce1", "address": "fa:16:3e:74:df:7c", "network": {"id": "befb7a72-17a9-4bcb-b561-84b8f626685a", "bridge": "br-int", "label": "private", "subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.201", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.20", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.0.3"}}], "meta": {"injected": false, "tenant_id": "c785bf23f53946bc99867d8832a50266", "mtu": 1292, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap03ef8889-32", "ovs_interfaceid": "03ef8889-3216-43fb-8a52-4be17a956ce1", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m Dec 15 05:03:12 localhost nova_compute[286344]: 2025-12-15 10:03:12.832 286348 DEBUG oslo_concurrency.lockutils 
[None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Releasing lock "refresh_cache-39ff1bd9-6f6b-44c8-bbec-a1fd9d196359" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m Dec 15 05:03:12 localhost nova_compute[286344]: 2025-12-15 10:03:12.832 286348 DEBUG nova.compute.manager [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] [instance: 39ff1bd9-6f6b-44c8-bbec-a1fd9d196359] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m Dec 15 05:03:12 localhost nova_compute[286344]: 2025-12-15 10:03:12.833 286348 DEBUG oslo_service.periodic_task [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Dec 15 05:03:13 localhost nova_compute[286344]: 2025-12-15 10:03:13.829 286348 DEBUG oslo_service.periodic_task [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Dec 15 05:03:14 localhost ovn_metadata_agent[160585]: 2025-12-15 10:03:14.541 160590 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:1b:4f:e4 10.100.0.18 10.100.0.3'], port_security=[], type=localport, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': ''}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.18/28 10.100.0.3/28', 'neutron:device_id': 'ovnmeta-c01bb2d6-6a01-4a10-805d-eefdd790a78b', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 
'neutron-c01bb2d6-6a01-4a10-805d-eefdd790a78b', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '1808d9ac20f44272b13e28ceb7a5e4c6', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=39533dc5-dd13-4b28-902b-8aea0380ab87, chassis=[], tunnel_key=1, gateway_chassis=[], requested_chassis=[], logical_port=2230332c-50bf-433a-a9eb-b44dc66c15a9) old=Port_Binding(mac=['fa:16:3e:1b:4f:e4 10.100.0.3'], external_ids={'neutron:cidrs': '10.100.0.3/28', 'neutron:device_id': 'ovnmeta-c01bb2d6-6a01-4a10-805d-eefdd790a78b', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-c01bb2d6-6a01-4a10-805d-eefdd790a78b', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '1808d9ac20f44272b13e28ceb7a5e4c6', 'neutron:revision_number': '2', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Dec 15 05:03:14 localhost ovn_metadata_agent[160585]: 2025-12-15 10:03:14.543 160590 INFO neutron.agent.ovn.metadata.agent [-] Metadata Port 2230332c-50bf-433a-a9eb-b44dc66c15a9 in datapath c01bb2d6-6a01-4a10-805d-eefdd790a78b updated#033[00m Dec 15 05:03:14 localhost ovn_metadata_agent[160585]: 2025-12-15 10:03:14.545 160590 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network c01bb2d6-6a01-4a10-805d-eefdd790a78b, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m Dec 15 05:03:14 localhost ovn_metadata_agent[160585]: 2025-12-15 10:03:14.546 160858 DEBUG oslo.privsep.daemon [-] privsep: 
reply[f25925e7-657a-4d2e-9d79-0e8a39a6897b]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Dec 15 05:03:14 localhost nova_compute[286344]: 2025-12-15 10:03:14.974 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:03:15 localhost neutron_sriov_agent[260044]: 2025-12-15 10:03:15.116 2 INFO neutron.agent.securitygroups_rpc [None req-722d5312-dcd4-43e1-91f9-1f013bcaa7ca d5377d4013ec43b98bbf91c063971dac 1808d9ac20f44272b13e28ceb7a5e4c6 - - default default] Security group member updated ['f398dd81-32a0-4005-81c7-146c01e93e74']#033[00m Dec 15 05:03:15 localhost nova_compute[286344]: 2025-12-15 10:03:15.269 286348 DEBUG oslo_service.periodic_task [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Dec 15 05:03:15 localhost nova_compute[286344]: 2025-12-15 10:03:15.288 286348 DEBUG oslo_concurrency.lockutils [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Dec 15 05:03:15 localhost nova_compute[286344]: 2025-12-15 10:03:15.288 286348 DEBUG oslo_concurrency.lockutils [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Dec 15 05:03:15 localhost nova_compute[286344]: 2025-12-15 10:03:15.288 286348 DEBUG oslo_concurrency.lockutils [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Lock "compute_resources" "released" by 
"nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Dec 15 05:03:15 localhost nova_compute[286344]: 2025-12-15 10:03:15.289 286348 DEBUG nova.compute.resource_tracker [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Auditing locally available compute resources for np0005559462.localdomain (node: np0005559462.localdomain) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m Dec 15 05:03:15 localhost nova_compute[286344]: 2025-12-15 10:03:15.289 286348 DEBUG oslo_concurrency.processutils [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m Dec 15 05:03:15 localhost ceph-mon[298913]: mon.np0005559462@0(leader).osd e119 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Dec 15 05:03:15 localhost ceph-mon[298913]: mon.np0005559462@0(leader) e17 handle_command mon_command({"prefix": "df", "format": "json"} v 0) Dec 15 05:03:15 localhost ceph-mon[298913]: log_channel(audit) log [DBG] : from='client.? 
172.18.0.106:0/841436904' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch Dec 15 05:03:15 localhost nova_compute[286344]: 2025-12-15 10:03:15.728 286348 DEBUG oslo_concurrency.processutils [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.439s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m Dec 15 05:03:15 localhost dnsmasq[318406]: read /var/lib/neutron/dhcp/7045181e-34f8-4eb6-9637-1ed97ee9f99e/addn_hosts - 0 addresses Dec 15 05:03:15 localhost podman[318594]: 2025-12-15 10:03:15.78389695 +0000 UTC m=+0.037782456 container kill bad17501b69f40ab6397cef7afcf60e36f304342d154893c4765c4c59d281e20 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-7045181e-34f8-4eb6-9637-1ed97ee9f99e, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb) Dec 15 05:03:15 localhost dnsmasq-dhcp[318406]: read /var/lib/neutron/dhcp/7045181e-34f8-4eb6-9637-1ed97ee9f99e/host Dec 15 05:03:15 localhost dnsmasq-dhcp[318406]: read /var/lib/neutron/dhcp/7045181e-34f8-4eb6-9637-1ed97ee9f99e/opts Dec 15 05:03:15 localhost nova_compute[286344]: 2025-12-15 10:03:15.802 286348 DEBUG nova.virt.libvirt.driver [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m Dec 15 05:03:15 localhost nova_compute[286344]: 2025-12-15 10:03:15.803 286348 DEBUG nova.virt.libvirt.driver [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] 
skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m Dec 15 05:03:16 localhost ovn_controller[154603]: 2025-12-15T10:03:16Z|00203|binding|INFO|Releasing lport d10e503e-ef84-4538-aed9-daa76a68eb88 from this chassis (sb_readonly=0) Dec 15 05:03:16 localhost kernel: device tapd10e503e-ef left promiscuous mode Dec 15 05:03:16 localhost ovn_controller[154603]: 2025-12-15T10:03:16Z|00204|binding|INFO|Setting lport d10e503e-ef84-4538-aed9-daa76a68eb88 down in Southbound Dec 15 05:03:16 localhost nova_compute[286344]: 2025-12-15 10:03:16.005 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:03:16 localhost ovn_metadata_agent[160585]: 2025-12-15 10:03:16.012 160590 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005559462.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.3/28', 'neutron:device_id': 'dhcpce287b4b-a043-5ed9-8e08-4fd555d768a2-7045181e-34f8-4eb6-9637-1ed97ee9f99e', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-7045181e-34f8-4eb6-9637-1ed97ee9f99e', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '9f4dc08c7491451692c298986081227e', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'np0005559462.localdomain'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], 
datapath=22537a10-1b9d-495d-9379-0c843feed9f6, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=d10e503e-ef84-4538-aed9-daa76a68eb88) old=Port_Binding(up=[True], chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Dec 15 05:03:16 localhost ovn_metadata_agent[160585]: 2025-12-15 10:03:16.014 160590 INFO neutron.agent.ovn.metadata.agent [-] Port d10e503e-ef84-4538-aed9-daa76a68eb88 in datapath 7045181e-34f8-4eb6-9637-1ed97ee9f99e unbound from our chassis#033[00m Dec 15 05:03:16 localhost ovn_metadata_agent[160585]: 2025-12-15 10:03:16.016 160590 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 7045181e-34f8-4eb6-9637-1ed97ee9f99e, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m Dec 15 05:03:16 localhost ovn_metadata_agent[160585]: 2025-12-15 10:03:16.017 160858 DEBUG oslo.privsep.daemon [-] privsep: reply[565fa7eb-d89f-46b2-bdfb-25f19d7ba7ce]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Dec 15 05:03:16 localhost nova_compute[286344]: 2025-12-15 10:03:16.026 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:03:16 localhost neutron_sriov_agent[260044]: 2025-12-15 10:03:16.051 2 INFO neutron.agent.securitygroups_rpc [None req-1c11c032-a23e-478a-b222-193334c0c2c3 d5377d4013ec43b98bbf91c063971dac 1808d9ac20f44272b13e28ceb7a5e4c6 - - default default] Security group member updated ['f398dd81-32a0-4005-81c7-146c01e93e74']#033[00m Dec 15 05:03:16 localhost nova_compute[286344]: 2025-12-15 10:03:16.056 286348 WARNING nova.virt.libvirt.driver [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] This host appears to have multiple sockets per NUMA node. 
The `socket` PCI NUMA affinity will not be supported.#033[00m Dec 15 05:03:16 localhost nova_compute[286344]: 2025-12-15 10:03:16.057 286348 DEBUG nova.compute.resource_tracker [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Hypervisor/Node resource view: name=np0005559462.localdomain free_ram=11351MB free_disk=41.836978912353516GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", 
"product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m Dec 15 05:03:16 localhost nova_compute[286344]: 2025-12-15 10:03:16.058 286348 DEBUG oslo_concurrency.lockutils [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Dec 15 05:03:16 localhost nova_compute[286344]: 2025-12-15 10:03:16.058 286348 DEBUG oslo_concurrency.lockutils [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Dec 15 05:03:16 localhost nova_compute[286344]: 2025-12-15 10:03:16.130 286348 DEBUG nova.compute.resource_tracker [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Instance 39ff1bd9-6f6b-44c8-bbec-a1fd9d196359 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 2, 'MEMORY_MB': 512, 'VCPU': 1}}. 
_remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m Dec 15 05:03:16 localhost nova_compute[286344]: 2025-12-15 10:03:16.131 286348 DEBUG nova.compute.resource_tracker [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m Dec 15 05:03:16 localhost nova_compute[286344]: 2025-12-15 10:03:16.132 286348 DEBUG nova.compute.resource_tracker [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Final resource view: name=np0005559462.localdomain phys_ram=15738MB used_ram=1024MB phys_disk=41GB used_disk=2GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m Dec 15 05:03:16 localhost nova_compute[286344]: 2025-12-15 10:03:16.177 286348 DEBUG oslo_concurrency.processutils [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m Dec 15 05:03:16 localhost ceph-mon[298913]: mon.np0005559462@0(leader) e17 handle_command mon_command({"prefix": "df", "format": "json"} v 0) Dec 15 05:03:16 localhost ceph-mon[298913]: log_channel(audit) log [DBG] : from='client.? 
172.18.0.106:0/1352784428' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch Dec 15 05:03:16 localhost nova_compute[286344]: 2025-12-15 10:03:16.635 286348 DEBUG oslo_concurrency.processutils [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.458s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m Dec 15 05:03:16 localhost nova_compute[286344]: 2025-12-15 10:03:16.641 286348 DEBUG nova.compute.provider_tree [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Inventory has not changed in ProviderTree for provider: 26c8956b-6742-4951-b566-971b9bbe323b update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m Dec 15 05:03:16 localhost nova_compute[286344]: 2025-12-15 10:03:16.665 286348 DEBUG nova.scheduler.client.report [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Inventory has not changed for provider 26c8956b-6742-4951-b566-971b9bbe323b based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m Dec 15 05:03:16 localhost nova_compute[286344]: 2025-12-15 10:03:16.695 286348 DEBUG nova.compute.resource_tracker [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Compute_service record updated for np0005559462.localdomain:np0005559462.localdomain _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m Dec 15 05:03:16 localhost nova_compute[286344]: 2025-12-15 10:03:16.696 286348 DEBUG 
oslo_concurrency.lockutils [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.638s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Dec 15 05:03:16 localhost sshd[318640]: main: sshd: ssh-rsa algorithm is disabled Dec 15 05:03:17 localhost nova_compute[286344]: 2025-12-15 10:03:17.163 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:03:17 localhost neutron_sriov_agent[260044]: 2025-12-15 10:03:17.216 2 INFO neutron.agent.securitygroups_rpc [None req-189e9d0b-d6e0-40b4-97f1-7c337acc8d80 d5377d4013ec43b98bbf91c063971dac 1808d9ac20f44272b13e28ceb7a5e4c6 - - default default] Security group member updated ['f398dd81-32a0-4005-81c7-146c01e93e74']#033[00m Dec 15 05:03:17 localhost neutron_sriov_agent[260044]: 2025-12-15 10:03:17.996 2 INFO neutron.agent.securitygroups_rpc [None req-3fb00dc0-bb29-401b-83a8-c3d8f7174378 d5377d4013ec43b98bbf91c063971dac 1808d9ac20f44272b13e28ceb7a5e4c6 - - default default] Security group member updated ['f398dd81-32a0-4005-81c7-146c01e93e74']#033[00m Dec 15 05:03:18 localhost ovn_controller[154603]: 2025-12-15T10:03:18Z|00205|binding|INFO|Releasing lport b35254ad-12eb-47bb-92be-44fefe0694f0 from this chassis (sb_readonly=0) Dec 15 05:03:18 localhost nova_compute[286344]: 2025-12-15 10:03:18.373 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:03:18 localhost nova_compute[286344]: 2025-12-15 10:03:18.697 286348 DEBUG oslo_service.periodic_task [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Dec 15 05:03:18 localhost 
nova_compute[286344]: 2025-12-15 10:03:18.697 286348 DEBUG oslo_service.periodic_task [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Dec 15 05:03:18 localhost nova_compute[286344]: 2025-12-15 10:03:18.698 286348 DEBUG nova.compute.manager [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m Dec 15 05:03:18 localhost dnsmasq[318406]: exiting on receipt of SIGTERM Dec 15 05:03:18 localhost systemd[1]: libpod-bad17501b69f40ab6397cef7afcf60e36f304342d154893c4765c4c59d281e20.scope: Deactivated successfully. Dec 15 05:03:18 localhost podman[318659]: 2025-12-15 10:03:18.958262525 +0000 UTC m=+0.045711814 container kill bad17501b69f40ab6397cef7afcf60e36f304342d154893c4765c4c59d281e20 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-7045181e-34f8-4eb6-9637-1ed97ee9f99e, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.build-date=20251202) Dec 15 05:03:18 localhost systemd[1]: Started /usr/bin/podman healthcheck run 4d46c406a3855fd1c322e5d86c048188ea5865daa07ba23aaebacc7879668923. 
Dec 15 05:03:19 localhost podman[318672]: 2025-12-15 10:03:19.022170695 +0000 UTC m=+0.052877815 container died bad17501b69f40ab6397cef7afcf60e36f304342d154893c4765c4c59d281e20 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-7045181e-34f8-4eb6-9637-1ed97ee9f99e, tcib_managed=true, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb) Dec 15 05:03:19 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-bad17501b69f40ab6397cef7afcf60e36f304342d154893c4765c4c59d281e20-userdata-shm.mount: Deactivated successfully. Dec 15 05:03:19 localhost podman[318672]: 2025-12-15 10:03:19.051328174 +0000 UTC m=+0.082035274 container cleanup bad17501b69f40ab6397cef7afcf60e36f304342d154893c4765c4c59d281e20 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-7045181e-34f8-4eb6-9637-1ed97ee9f99e, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202) Dec 15 05:03:19 localhost systemd[1]: libpod-conmon-bad17501b69f40ab6397cef7afcf60e36f304342d154893c4765c4c59d281e20.scope: Deactivated successfully. 
Dec 15 05:03:19 localhost podman[318679]: 2025-12-15 10:03:19.082025272 +0000 UTC m=+0.096255619 container remove bad17501b69f40ab6397cef7afcf60e36f304342d154893c4765c4c59d281e20 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-7045181e-34f8-4eb6-9637-1ed97ee9f99e, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true) Dec 15 05:03:19 localhost neutron_dhcp_agent[267542]: 2025-12-15 10:03:19.108 267546 INFO neutron.agent.dhcp.agent [None req-c5da21df-3adf-4011-a4a4-0f1039d07db6 - - - - - -] Network not present, action: clean_devices, action_kwargs: {}#033[00m Dec 15 05:03:19 localhost podman[318680]: 2025-12-15 10:03:19.134226739 +0000 UTC m=+0.145576024 container health_status 4d46c406a3855fd1c322e5d86c048188ea5865daa07ba23aaebacc7879668923 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '07f3af15d79276418f54a8dde2fc251064527c3571430e14af77aaa2c01f1193-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', 
'/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_id=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent) Dec 15 05:03:19 localhost podman[318680]: 2025-12-15 10:03:19.139292446 +0000 UTC m=+0.150641701 container exec_died 4d46c406a3855fd1c322e5d86c048188ea5865daa07ba23aaebacc7879668923 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, config_id=ovn_metadata_agent, org.label-schema.build-date=20251202, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_managed=true, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '07f3af15d79276418f54a8dde2fc251064527c3571430e14af77aaa2c01f1193-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 
'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb) Dec 15 05:03:19 localhost systemd[1]: 4d46c406a3855fd1c322e5d86c048188ea5865daa07ba23aaebacc7879668923.service: Deactivated successfully. Dec 15 05:03:19 localhost neutron_dhcp_agent[267542]: 2025-12-15 10:03:19.162 267546 INFO neutron.agent.dhcp.agent [-] Network not present, action: clean_devices, action_kwargs: {}#033[00m Dec 15 05:03:19 localhost sshd[318721]: main: sshd: ssh-rsa algorithm is disabled Dec 15 05:03:19 localhost systemd[1]: var-lib-containers-storage-overlay-d444bf2f714d48e60d2f0a0983095b45c03ccaecdbbadc879bc4926da7ea564e-merged.mount: Deactivated successfully. Dec 15 05:03:19 localhost systemd[1]: run-netns-qdhcp\x2d7045181e\x2d34f8\x2d4eb6\x2d9637\x2d1ed97ee9f99e.mount: Deactivated successfully. 
Dec 15 05:03:19 localhost nova_compute[286344]: 2025-12-15 10:03:19.975 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:03:20 localhost ceph-mon[298913]: mon.np0005559462@0(leader).osd e119 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Dec 15 05:03:21 localhost sshd[318723]: main: sshd: ssh-rsa algorithm is disabled Dec 15 05:03:22 localhost nova_compute[286344]: 2025-12-15 10:03:22.199 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:03:22 localhost neutron_sriov_agent[260044]: 2025-12-15 10:03:22.413 2 INFO neutron.agent.securitygroups_rpc [None req-c743fbab-6e59-4242-9121-15fc9ee68a36 d5377d4013ec43b98bbf91c063971dac 1808d9ac20f44272b13e28ceb7a5e4c6 - - default default] Security group member updated ['f398dd81-32a0-4005-81c7-146c01e93e74']#033[00m Dec 15 05:03:22 localhost systemd[1]: Started /usr/bin/podman healthcheck run a4e94bd7aaee6c174c601060fb5f34559019ccea93fc397263ee3735731cfc1e. 
Dec 15 05:03:22 localhost podman[318726]: 2025-12-15 10:03:22.728315487 +0000 UTC m=+0.061624263 container health_status a4e94bd7aaee6c174c601060fb5f34559019ccea93fc397263ee3735731cfc1e (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, maintainer=Navid Yaghoobi , managed_by=edpm_ansible, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter) Dec 15 05:03:22 localhost podman[318726]: 2025-12-15 10:03:22.738576844 +0000 UTC m=+0.071885640 container exec_died a4e94bd7aaee6c174c601060fb5f34559019ccea93fc397263ee3735731cfc1e (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', 
'/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible) Dec 15 05:03:22 localhost systemd[1]: a4e94bd7aaee6c174c601060fb5f34559019ccea93fc397263ee3735731cfc1e.service: Deactivated successfully. Dec 15 05:03:22 localhost neutron_sriov_agent[260044]: 2025-12-15 10:03:22.907 2 INFO neutron.agent.securitygroups_rpc [None req-42b014e0-1211-432f-acf4-e2e492aaa857 d5377d4013ec43b98bbf91c063971dac 1808d9ac20f44272b13e28ceb7a5e4c6 - - default default] Security group member updated ['f398dd81-32a0-4005-81c7-146c01e93e74']#033[00m Dec 15 05:03:23 localhost neutron_sriov_agent[260044]: 2025-12-15 10:03:23.416 2 INFO neutron.agent.securitygroups_rpc [None req-430f3123-b79d-4ea5-8ee0-ebae697ac401 d5377d4013ec43b98bbf91c063971dac 1808d9ac20f44272b13e28ceb7a5e4c6 - - default default] Security group member updated ['f398dd81-32a0-4005-81c7-146c01e93e74']#033[00m Dec 15 05:03:23 localhost neutron_sriov_agent[260044]: 2025-12-15 10:03:23.804 2 INFO neutron.agent.securitygroups_rpc [None req-a0640d3f-a617-4c57-b990-421784b6811c d5377d4013ec43b98bbf91c063971dac 1808d9ac20f44272b13e28ceb7a5e4c6 - - default default] Security group member updated ['f398dd81-32a0-4005-81c7-146c01e93e74']#033[00m Dec 15 05:03:23 localhost sshd[318750]: main: sshd: ssh-rsa algorithm is disabled Dec 15 05:03:24 localhost nova_compute[286344]: 2025-12-15 10:03:24.977 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:03:25 localhost ceph-mon[298913]: mon.np0005559462@0(leader).osd e119 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Dec 15 05:03:27 localhost nova_compute[286344]: 2025-12-15 10:03:27.202 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup 
/usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:03:27 localhost neutron_sriov_agent[260044]: 2025-12-15 10:03:27.737 2 INFO neutron.agent.securitygroups_rpc [None req-f536c91d-71af-4b48-b0d4-57c1d2356ad4 d5377d4013ec43b98bbf91c063971dac 1808d9ac20f44272b13e28ceb7a5e4c6 - - default default] Security group member updated ['f398dd81-32a0-4005-81c7-146c01e93e74']#033[00m Dec 15 05:03:28 localhost neutron_dhcp_agent[267542]: 2025-12-15 10:03:28.031 267546 INFO neutron.agent.linux.ip_lib [None req-50432737-599a-4295-8af1-0e4514243a89 - - - - - -] Device tap3096c7c2-40 cannot be used as it has no MAC address#033[00m Dec 15 05:03:28 localhost nova_compute[286344]: 2025-12-15 10:03:28.053 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:03:28 localhost kernel: device tap3096c7c2-40 entered promiscuous mode Dec 15 05:03:28 localhost nova_compute[286344]: 2025-12-15 10:03:28.063 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:03:28 localhost ovn_controller[154603]: 2025-12-15T10:03:28Z|00206|binding|INFO|Claiming lport 3096c7c2-4035-4b3c-a767-fdcaeb7e984e for this chassis. Dec 15 05:03:28 localhost ovn_controller[154603]: 2025-12-15T10:03:28Z|00207|binding|INFO|3096c7c2-4035-4b3c-a767-fdcaeb7e984e: Claiming unknown Dec 15 05:03:28 localhost NetworkManager[5963]: [1765793008.0659] manager: (tap3096c7c2-40): new Generic device (/org/freedesktop/NetworkManager/Devices/34) Dec 15 05:03:28 localhost systemd-udevd[318762]: Network interface NamePolicy= disabled on kernel command line. 
Dec 15 05:03:28 localhost ovn_metadata_agent[160585]: 2025-12-15 10:03:28.075 160590 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005559462.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.3/28', 'neutron:device_id': 'dhcpce287b4b-a043-5ed9-8e08-4fd555d768a2-86f59812-6a66-4f5b-98b9-04cfc7d48936', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-86f59812-6a66-4f5b-98b9-04cfc7d48936', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '1808d9ac20f44272b13e28ceb7a5e4c6', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=7c2c22fb-7dde-4770-b316-be1848768db7, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=3096c7c2-4035-4b3c-a767-fdcaeb7e984e) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Dec 15 05:03:28 localhost nova_compute[286344]: 2025-12-15 10:03:28.078 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:03:28 localhost ovn_metadata_agent[160585]: 2025-12-15 10:03:28.077 160590 INFO neutron.agent.ovn.metadata.agent [-] Port 3096c7c2-4035-4b3c-a767-fdcaeb7e984e in datapath 86f59812-6a66-4f5b-98b9-04cfc7d48936 bound to our chassis#033[00m Dec 15 05:03:28 localhost ovn_metadata_agent[160585]: 2025-12-15 10:03:28.079 160590 DEBUG 
neutron.agent.ovn.metadata.agent [-] Port d47bfd65-83dc-4cae-a05f-4a712f6160d0 IP addresses were not retrieved from the Port_Binding MAC column ['unknown'] _get_port_ips /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:536#033[00m Dec 15 05:03:28 localhost ovn_metadata_agent[160585]: 2025-12-15 10:03:28.079 160590 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 86f59812-6a66-4f5b-98b9-04cfc7d48936, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m Dec 15 05:03:28 localhost ovn_controller[154603]: 2025-12-15T10:03:28Z|00208|binding|INFO|Setting lport 3096c7c2-4035-4b3c-a767-fdcaeb7e984e ovn-installed in OVS Dec 15 05:03:28 localhost ovn_controller[154603]: 2025-12-15T10:03:28Z|00209|binding|INFO|Setting lport 3096c7c2-4035-4b3c-a767-fdcaeb7e984e up in Southbound Dec 15 05:03:28 localhost ovn_metadata_agent[160585]: 2025-12-15 10:03:28.080 160858 DEBUG oslo.privsep.daemon [-] privsep: reply[d01621f9-eff6-415c-93a9-19fd0d6372ac]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Dec 15 05:03:28 localhost nova_compute[286344]: 2025-12-15 10:03:28.081 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:03:28 localhost journal[231322]: ethtool ioctl error on tap3096c7c2-40: No such device Dec 15 05:03:28 localhost journal[231322]: ethtool ioctl error on tap3096c7c2-40: No such device Dec 15 05:03:28 localhost journal[231322]: ethtool ioctl error on tap3096c7c2-40: No such device Dec 15 05:03:28 localhost journal[231322]: ethtool ioctl error on tap3096c7c2-40: No such device Dec 15 05:03:28 localhost journal[231322]: ethtool ioctl error on tap3096c7c2-40: No such device Dec 15 05:03:28 localhost journal[231322]: ethtool ioctl error on tap3096c7c2-40: No such device Dec 15 05:03:28 localhost 
journal[231322]: ethtool ioctl error on tap3096c7c2-40: No such device Dec 15 05:03:28 localhost journal[231322]: ethtool ioctl error on tap3096c7c2-40: No such device Dec 15 05:03:28 localhost nova_compute[286344]: 2025-12-15 10:03:28.161 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:03:28 localhost nova_compute[286344]: 2025-12-15 10:03:28.187 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:03:28 localhost nova_compute[286344]: 2025-12-15 10:03:28.214 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:03:29 localhost podman[318833]: Dec 15 05:03:29 localhost podman[318833]: 2025-12-15 10:03:29.248390225 +0000 UTC m=+0.090101846 container create 01a65ae63b5af870d0034915099b8dc295ce4bcc2e8583dbe53b24437b0c7e22 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-86f59812-6a66-4f5b-98b9-04cfc7d48936, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_managed=true) Dec 15 05:03:29 localhost systemd[1]: Started libpod-conmon-01a65ae63b5af870d0034915099b8dc295ce4bcc2e8583dbe53b24437b0c7e22.scope. Dec 15 05:03:29 localhost systemd[1]: Started libcrun container. 
Dec 15 05:03:29 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/2fe7bfeee5eaf9e2bbec8e8de7563411e63931afc9a65e5e552495f7940a54b2/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff) Dec 15 05:03:29 localhost podman[318833]: 2025-12-15 10:03:29.209906993 +0000 UTC m=+0.051618654 image pull quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified Dec 15 05:03:29 localhost podman[318833]: 2025-12-15 10:03:29.318291344 +0000 UTC m=+0.160002995 container init 01a65ae63b5af870d0034915099b8dc295ce4bcc2e8583dbe53b24437b0c7e22 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-86f59812-6a66-4f5b-98b9-04cfc7d48936, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team) Dec 15 05:03:29 localhost podman[318833]: 2025-12-15 10:03:29.326637893 +0000 UTC m=+0.168349534 container start 01a65ae63b5af870d0034915099b8dc295ce4bcc2e8583dbe53b24437b0c7e22 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-86f59812-6a66-4f5b-98b9-04cfc7d48936, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0) Dec 15 05:03:29 localhost dnsmasq[318851]: started, version 2.85 cachesize 150 Dec 15 05:03:29 localhost dnsmasq[318851]: DNS service limited to local subnets Dec 15 05:03:29 localhost dnsmasq[318851]: compile time options: IPv6 GNU-getopt DBus no-UBus 
no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile Dec 15 05:03:29 localhost dnsmasq[318851]: warning: no upstream servers configured Dec 15 05:03:29 localhost dnsmasq-dhcp[318851]: DHCP, static leases only on 10.100.0.0, lease time 1d Dec 15 05:03:29 localhost dnsmasq[318851]: read /var/lib/neutron/dhcp/86f59812-6a66-4f5b-98b9-04cfc7d48936/addn_hosts - 0 addresses Dec 15 05:03:29 localhost dnsmasq-dhcp[318851]: read /var/lib/neutron/dhcp/86f59812-6a66-4f5b-98b9-04cfc7d48936/host Dec 15 05:03:29 localhost dnsmasq-dhcp[318851]: read /var/lib/neutron/dhcp/86f59812-6a66-4f5b-98b9-04cfc7d48936/opts Dec 15 05:03:29 localhost neutron_dhcp_agent[267542]: 2025-12-15 10:03:29.396 267546 INFO neutron.agent.dhcp.agent [None req-f32df057-b513-44c9-8ce2-01cfef16a764 - - - - - -] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-12-15T10:03:27Z, description=, device_id=, device_owner=, dns_assignment=[], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[], id=e06d9fce-b396-4cf1-b521-cb23ab5d6531, ip_allocation=immediate, mac_address=fa:16:3e:d7:b2:39, name=tempest-PortsTestJSON-2002547358, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-12-15T10:03:25Z, description=, dns_domain=, id=86f59812-6a66-4f5b-98b9-04cfc7d48936, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-PortsTestJSON-48778850, port_security_enabled=True, project_id=1808d9ac20f44272b13e28ceb7a5e4c6, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=25047, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=1276, status=ACTIVE, subnets=['b16b7652-16e2-4d26-a3e6-1ff176d940b2'], tags=[], tenant_id=1808d9ac20f44272b13e28ceb7a5e4c6, 
updated_at=2025-12-15T10:03:26Z, vlan_transparent=None, network_id=86f59812-6a66-4f5b-98b9-04cfc7d48936, port_security_enabled=True, project_id=1808d9ac20f44272b13e28ceb7a5e4c6, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=['f398dd81-32a0-4005-81c7-146c01e93e74'], standard_attr_id=1293, status=DOWN, tags=[], tenant_id=1808d9ac20f44272b13e28ceb7a5e4c6, updated_at=2025-12-15T10:03:27Z on network 86f59812-6a66-4f5b-98b9-04cfc7d48936#033[00m Dec 15 05:03:29 localhost dnsmasq[318851]: read /var/lib/neutron/dhcp/86f59812-6a66-4f5b-98b9-04cfc7d48936/addn_hosts - 1 addresses Dec 15 05:03:29 localhost dnsmasq-dhcp[318851]: read /var/lib/neutron/dhcp/86f59812-6a66-4f5b-98b9-04cfc7d48936/host Dec 15 05:03:29 localhost podman[318870]: 2025-12-15 10:03:29.604540538 +0000 UTC m=+0.057189753 container kill 01a65ae63b5af870d0034915099b8dc295ce4bcc2e8583dbe53b24437b0c7e22 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-86f59812-6a66-4f5b-98b9-04cfc7d48936, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb) Dec 15 05:03:29 localhost dnsmasq-dhcp[318851]: read /var/lib/neutron/dhcp/86f59812-6a66-4f5b-98b9-04cfc7d48936/opts Dec 15 05:03:29 localhost neutron_dhcp_agent[267542]: 2025-12-15 10:03:29.666 267546 INFO neutron.agent.dhcp.agent [None req-589e18eb-73f3-4f10-bd26-f3f1387e525f - - - - - -] DHCP configuration for ports {'52b23b0f-efce-4f47-a671-cf2744f147ac'} is completed#033[00m Dec 15 05:03:29 localhost neutron_dhcp_agent[267542]: 2025-12-15 10:03:29.840 267546 INFO neutron.agent.dhcp.agent [None req-1309af4e-1107-4e64-b143-7980742be823 - - - - - -] DHCP configuration for ports 
{'e06d9fce-b396-4cf1-b521-cb23ab5d6531'} is completed#033[00m Dec 15 05:03:29 localhost neutron_dhcp_agent[267542]: 2025-12-15 10:03:29.925 267546 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-12-15T10:03:27Z, description=, device_id=5a73012b-8bfb-4542-97de-bc744361b340, device_owner=network:router_interface, dns_assignment=[], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[], id=e06d9fce-b396-4cf1-b521-cb23ab5d6531, ip_allocation=immediate, mac_address=fa:16:3e:d7:b2:39, name=tempest-PortsTestJSON-2002547358, network_id=86f59812-6a66-4f5b-98b9-04cfc7d48936, port_security_enabled=True, project_id=1808d9ac20f44272b13e28ceb7a5e4c6, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=2, security_groups=['f398dd81-32a0-4005-81c7-146c01e93e74'], standard_attr_id=1293, status=DOWN, tags=[], tenant_id=1808d9ac20f44272b13e28ceb7a5e4c6, updated_at=2025-12-15T10:03:28Z on network 86f59812-6a66-4f5b-98b9-04cfc7d48936#033[00m Dec 15 05:03:29 localhost nova_compute[286344]: 2025-12-15 10:03:29.978 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:03:30 localhost dnsmasq[318851]: read /var/lib/neutron/dhcp/86f59812-6a66-4f5b-98b9-04cfc7d48936/addn_hosts - 1 addresses Dec 15 05:03:30 localhost dnsmasq-dhcp[318851]: read /var/lib/neutron/dhcp/86f59812-6a66-4f5b-98b9-04cfc7d48936/host Dec 15 05:03:30 localhost podman[318907]: 2025-12-15 10:03:30.117969066 +0000 UTC m=+0.055649624 container kill 01a65ae63b5af870d0034915099b8dc295ce4bcc2e8583dbe53b24437b0c7e22 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-86f59812-6a66-4f5b-98b9-04cfc7d48936, org.label-schema.name=CentOS Stream 9 
Base Image, io.buildah.version=1.41.3, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.license=GPLv2, org.label-schema.build-date=20251202, tcib_managed=true, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0) Dec 15 05:03:30 localhost dnsmasq-dhcp[318851]: read /var/lib/neutron/dhcp/86f59812-6a66-4f5b-98b9-04cfc7d48936/opts Dec 15 05:03:30 localhost sshd[318929]: main: sshd: ssh-rsa algorithm is disabled Dec 15 05:03:30 localhost neutron_dhcp_agent[267542]: 2025-12-15 10:03:30.389 267546 INFO neutron.agent.dhcp.agent [None req-7f77b173-c8cc-4acd-b634-2013e1efacc8 - - - - - -] DHCP configuration for ports {'e06d9fce-b396-4cf1-b521-cb23ab5d6531'} is completed#033[00m Dec 15 05:03:30 localhost ceph-mon[298913]: mon.np0005559462@0(leader).osd e119 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Dec 15 05:03:31 localhost neutron_sriov_agent[260044]: 2025-12-15 10:03:31.065 2 INFO neutron.agent.securitygroups_rpc [None req-bf9d6462-b5fe-4329-afab-2764805c92c2 d5377d4013ec43b98bbf91c063971dac 1808d9ac20f44272b13e28ceb7a5e4c6 - - default default] Security group member updated ['f398dd81-32a0-4005-81c7-146c01e93e74']#033[00m Dec 15 05:03:31 localhost dnsmasq[318851]: read /var/lib/neutron/dhcp/86f59812-6a66-4f5b-98b9-04cfc7d48936/addn_hosts - 0 addresses Dec 15 05:03:31 localhost dnsmasq-dhcp[318851]: read /var/lib/neutron/dhcp/86f59812-6a66-4f5b-98b9-04cfc7d48936/host Dec 15 05:03:31 localhost dnsmasq-dhcp[318851]: read /var/lib/neutron/dhcp/86f59812-6a66-4f5b-98b9-04cfc7d48936/opts Dec 15 05:03:31 localhost podman[318948]: 2025-12-15 10:03:31.295124413 +0000 UTC m=+0.061960492 container kill 01a65ae63b5af870d0034915099b8dc295ce4bcc2e8583dbe53b24437b0c7e22 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-86f59812-6a66-4f5b-98b9-04cfc7d48936, 
tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS) Dec 15 05:03:31 localhost ovn_controller[154603]: 2025-12-15T10:03:31Z|00210|binding|INFO|Releasing lport 3096c7c2-4035-4b3c-a767-fdcaeb7e984e from this chassis (sb_readonly=0) Dec 15 05:03:31 localhost kernel: device tap3096c7c2-40 left promiscuous mode Dec 15 05:03:31 localhost ovn_controller[154603]: 2025-12-15T10:03:31Z|00211|binding|INFO|Setting lport 3096c7c2-4035-4b3c-a767-fdcaeb7e984e down in Southbound Dec 15 05:03:31 localhost nova_compute[286344]: 2025-12-15 10:03:31.454 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:03:31 localhost ovn_metadata_agent[160585]: 2025-12-15 10:03:31.462 160590 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005559462.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.3/28', 'neutron:device_id': 'dhcpce287b4b-a043-5ed9-8e08-4fd555d768a2-86f59812-6a66-4f5b-98b9-04cfc7d48936', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-86f59812-6a66-4f5b-98b9-04cfc7d48936', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '1808d9ac20f44272b13e28ceb7a5e4c6', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 
'neutron:vnic_type': 'normal', 'neutron:host_id': 'np0005559462.localdomain'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=7c2c22fb-7dde-4770-b316-be1848768db7, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=3096c7c2-4035-4b3c-a767-fdcaeb7e984e) old=Port_Binding(up=[True], chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Dec 15 05:03:31 localhost ovn_metadata_agent[160585]: 2025-12-15 10:03:31.463 160590 INFO neutron.agent.ovn.metadata.agent [-] Port 3096c7c2-4035-4b3c-a767-fdcaeb7e984e in datapath 86f59812-6a66-4f5b-98b9-04cfc7d48936 unbound from our chassis#033[00m Dec 15 05:03:31 localhost ovn_metadata_agent[160585]: 2025-12-15 10:03:31.464 160590 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 86f59812-6a66-4f5b-98b9-04cfc7d48936, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m Dec 15 05:03:31 localhost ovn_metadata_agent[160585]: 2025-12-15 10:03:31.465 160858 DEBUG oslo.privsep.daemon [-] privsep: reply[f51625c6-7e18-4c13-b116-302792424b0e]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Dec 15 05:03:31 localhost nova_compute[286344]: 2025-12-15 10:03:31.473 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:03:31 localhost ceph-mon[298913]: mon.np0005559462@0(leader).osd e119 do_prune osdmap full prune enabled Dec 15 05:03:31 localhost ceph-mon[298913]: mon.np0005559462@0(leader).osd e120 e120: 6 total, 6 up, 6 in Dec 15 05:03:31 localhost ceph-mon[298913]: log_channel(cluster) log [DBG] : osdmap e120: 6 total, 6 up, 6 in Dec 15 05:03:31 localhost podman[243449]: time="2025-12-15T10:03:31Z" level=info msg="List containers: received `last` parameter - overwriting 
`limit`" Dec 15 05:03:31 localhost podman[243449]: @ - - [15/Dec/2025:10:03:31 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 158464 "" "Go-http-client/1.1" Dec 15 05:03:31 localhost podman[243449]: @ - - [15/Dec/2025:10:03:31 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 19713 "" "Go-http-client/1.1" Dec 15 05:03:32 localhost nova_compute[286344]: 2025-12-15 10:03:32.208 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:03:32 localhost dnsmasq[318851]: exiting on receipt of SIGTERM Dec 15 05:03:32 localhost podman[318987]: 2025-12-15 10:03:32.318837429 +0000 UTC m=+0.050632288 container kill 01a65ae63b5af870d0034915099b8dc295ce4bcc2e8583dbe53b24437b0c7e22 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-86f59812-6a66-4f5b-98b9-04cfc7d48936, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2) Dec 15 05:03:32 localhost systemd[1]: libpod-01a65ae63b5af870d0034915099b8dc295ce4bcc2e8583dbe53b24437b0c7e22.scope: Deactivated successfully. 
Dec 15 05:03:32 localhost podman[318999]: 2025-12-15 10:03:32.365131508 +0000 UTC m=+0.036202497 container died 01a65ae63b5af870d0034915099b8dc295ce4bcc2e8583dbe53b24437b0c7e22 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-86f59812-6a66-4f5b-98b9-04cfc7d48936, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb) Dec 15 05:03:32 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-01a65ae63b5af870d0034915099b8dc295ce4bcc2e8583dbe53b24437b0c7e22-userdata-shm.mount: Deactivated successfully. Dec 15 05:03:32 localhost podman[318999]: 2025-12-15 10:03:32.399849347 +0000 UTC m=+0.070920296 container cleanup 01a65ae63b5af870d0034915099b8dc295ce4bcc2e8583dbe53b24437b0c7e22 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-86f59812-6a66-4f5b-98b9-04cfc7d48936, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202) Dec 15 05:03:32 localhost systemd[1]: libpod-conmon-01a65ae63b5af870d0034915099b8dc295ce4bcc2e8583dbe53b24437b0c7e22.scope: Deactivated successfully. 
Dec 15 05:03:32 localhost podman[319006]: 2025-12-15 10:03:32.47828655 +0000 UTC m=+0.136060306 container remove 01a65ae63b5af870d0034915099b8dc295ce4bcc2e8583dbe53b24437b0c7e22 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-86f59812-6a66-4f5b-98b9-04cfc7d48936, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true) Dec 15 05:03:32 localhost neutron_dhcp_agent[267542]: 2025-12-15 10:03:32.505 267546 INFO neutron.agent.dhcp.agent [None req-c0aacc95-4d6a-4a8b-8185-fa0c92462681 - - - - - -] Network not present, action: clean_devices, action_kwargs: {}#033[00m Dec 15 05:03:32 localhost ceph-mon[298913]: mon.np0005559462@0(leader).osd e120 do_prune osdmap full prune enabled Dec 15 05:03:32 localhost ceph-mon[298913]: mon.np0005559462@0(leader).osd e121 e121: 6 total, 6 up, 6 in Dec 15 05:03:32 localhost ceph-mon[298913]: log_channel(cluster) log [DBG] : osdmap e121: 6 total, 6 up, 6 in Dec 15 05:03:32 localhost neutron_dhcp_agent[267542]: 2025-12-15 10:03:32.641 267546 INFO neutron.agent.dhcp.agent [-] Network not present, action: clean_devices, action_kwargs: {}#033[00m Dec 15 05:03:32 localhost ovn_controller[154603]: 2025-12-15T10:03:32Z|00212|binding|INFO|Releasing lport b35254ad-12eb-47bb-92be-44fefe0694f0 from this chassis (sb_readonly=0) Dec 15 05:03:32 localhost nova_compute[286344]: 2025-12-15 10:03:32.906 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:03:33 localhost systemd[1]: var-lib-containers-storage-overlay-2fe7bfeee5eaf9e2bbec8e8de7563411e63931afc9a65e5e552495f7940a54b2-merged.mount: Deactivated successfully. 
Dec 15 05:03:33 localhost systemd[1]: run-netns-qdhcp\x2d86f59812\x2d6a66\x2d4f5b\x2d98b9\x2d04cfc7d48936.mount: Deactivated successfully. Dec 15 05:03:33 localhost ceph-mon[298913]: mon.np0005559462@0(leader).osd e121 do_prune osdmap full prune enabled Dec 15 05:03:33 localhost ceph-mon[298913]: mon.np0005559462@0(leader).osd e122 e122: 6 total, 6 up, 6 in Dec 15 05:03:33 localhost ceph-mon[298913]: log_channel(cluster) log [DBG] : osdmap e122: 6 total, 6 up, 6 in Dec 15 05:03:33 localhost neutron_dhcp_agent[267542]: 2025-12-15 10:03:33.886 267546 INFO neutron.agent.linux.ip_lib [None req-a6413006-b19d-47d6-813e-01b36016f412 - - - - - -] Device tap10ce6638-0c cannot be used as it has no MAC address#033[00m Dec 15 05:03:33 localhost nova_compute[286344]: 2025-12-15 10:03:33.908 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:03:33 localhost kernel: device tap10ce6638-0c entered promiscuous mode Dec 15 05:03:33 localhost NetworkManager[5963]: [1765793013.9172] manager: (tap10ce6638-0c): new Generic device (/org/freedesktop/NetworkManager/Devices/35) Dec 15 05:03:33 localhost systemd-udevd[319041]: Network interface NamePolicy= disabled on kernel command line. Dec 15 05:03:33 localhost ovn_controller[154603]: 2025-12-15T10:03:33Z|00213|binding|INFO|Claiming lport 10ce6638-0ca4-4e8d-aec9-4163a87c1a20 for this chassis. 
Dec 15 05:03:33 localhost ovn_controller[154603]: 2025-12-15T10:03:33Z|00214|binding|INFO|10ce6638-0ca4-4e8d-aec9-4163a87c1a20: Claiming unknown Dec 15 05:03:33 localhost nova_compute[286344]: 2025-12-15 10:03:33.956 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:03:33 localhost ovn_metadata_agent[160585]: 2025-12-15 10:03:33.968 160590 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005559462.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 'dhcpce287b4b-a043-5ed9-8e08-4fd555d768a2-250d134d-a49b-43f4-b0e9-01d6d0fc04e1', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-250d134d-a49b-43f4-b0e9-01d6d0fc04e1', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '1808d9ac20f44272b13e28ceb7a5e4c6', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=1eb1729d-99dd-4cb3-9468-5d1f6c4147b9, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[], logical_port=10ce6638-0ca4-4e8d-aec9-4163a87c1a20) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Dec 15 05:03:33 localhost ovn_metadata_agent[160585]: 2025-12-15 10:03:33.970 160590 INFO neutron.agent.ovn.metadata.agent [-] Port 10ce6638-0ca4-4e8d-aec9-4163a87c1a20 in datapath 
250d134d-a49b-43f4-b0e9-01d6d0fc04e1 bound to our chassis#033[00m Dec 15 05:03:33 localhost ovn_metadata_agent[160585]: 2025-12-15 10:03:33.971 160590 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network 250d134d-a49b-43f4-b0e9-01d6d0fc04e1 or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599#033[00m Dec 15 05:03:33 localhost ovn_metadata_agent[160585]: 2025-12-15 10:03:33.972 160858 DEBUG oslo.privsep.daemon [-] privsep: reply[3b841bfd-137d-49a2-a910-71a168ef9152]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Dec 15 05:03:33 localhost nova_compute[286344]: 2025-12-15 10:03:33.992 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:03:33 localhost ovn_controller[154603]: 2025-12-15T10:03:33Z|00215|binding|INFO|Setting lport 10ce6638-0ca4-4e8d-aec9-4163a87c1a20 ovn-installed in OVS Dec 15 05:03:33 localhost ovn_controller[154603]: 2025-12-15T10:03:33Z|00216|binding|INFO|Setting lport 10ce6638-0ca4-4e8d-aec9-4163a87c1a20 up in Southbound Dec 15 05:03:33 localhost nova_compute[286344]: 2025-12-15 10:03:33.996 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:03:34 localhost nova_compute[286344]: 2025-12-15 10:03:34.029 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:03:34 localhost nova_compute[286344]: 2025-12-15 10:03:34.055 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:03:34 localhost ceph-mon[298913]: mon.np0005559462@0(leader).osd e122 do_prune osdmap full prune enabled Dec 15 
05:03:34 localhost ceph-mon[298913]: mon.np0005559462@0(leader).osd e123 e123: 6 total, 6 up, 6 in Dec 15 05:03:34 localhost ceph-mon[298913]: log_channel(cluster) log [DBG] : osdmap e123: 6 total, 6 up, 6 in Dec 15 05:03:34 localhost podman[319096]: Dec 15 05:03:34 localhost openstack_network_exporter[246484]: ERROR 10:03:34 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server Dec 15 05:03:34 localhost openstack_network_exporter[246484]: ERROR 10:03:34 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd Dec 15 05:03:34 localhost openstack_network_exporter[246484]: ERROR 10:03:34 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd Dec 15 05:03:34 localhost openstack_network_exporter[246484]: ERROR 10:03:34 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath Dec 15 05:03:34 localhost openstack_network_exporter[246484]: Dec 15 05:03:34 localhost openstack_network_exporter[246484]: ERROR 10:03:34 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath Dec 15 05:03:34 localhost openstack_network_exporter[246484]: Dec 15 05:03:34 localhost podman[319096]: 2025-12-15 10:03:34.859290741 +0000 UTC m=+0.082064074 container create dc47333f8744d2dcc90b51ae4b10d7d68d4aaa0675b0952faff8c8ad17f89c7d (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-250d134d-a49b-43f4-b0e9-01d6d0fc04e1, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb) Dec 15 05:03:34 localhost systemd[1]: Started 
libpod-conmon-dc47333f8744d2dcc90b51ae4b10d7d68d4aaa0675b0952faff8c8ad17f89c7d.scope. Dec 15 05:03:34 localhost podman[319096]: 2025-12-15 10:03:34.81129132 +0000 UTC m=+0.034064613 image pull quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified Dec 15 05:03:34 localhost systemd[1]: Started libcrun container. Dec 15 05:03:34 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/6121f72ea72f60dfea0e390c2b22d62814a6969c5596b142caf99bc659155395/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff) Dec 15 05:03:34 localhost podman[319096]: 2025-12-15 10:03:34.932517063 +0000 UTC m=+0.155290306 container init dc47333f8744d2dcc90b51ae4b10d7d68d4aaa0675b0952faff8c8ad17f89c7d (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-250d134d-a49b-43f4-b0e9-01d6d0fc04e1, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.build-date=20251202, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb) Dec 15 05:03:34 localhost podman[319096]: 2025-12-15 10:03:34.941317045 +0000 UTC m=+0.164090288 container start dc47333f8744d2dcc90b51ae4b10d7d68d4aaa0675b0952faff8c8ad17f89c7d (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-250d134d-a49b-43f4-b0e9-01d6d0fc04e1, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3) Dec 15 05:03:34 localhost dnsmasq[319115]: started, version 2.85 cachesize 150 Dec 15 05:03:34 
localhost dnsmasq[319115]: DNS service limited to local subnets Dec 15 05:03:34 localhost dnsmasq[319115]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile Dec 15 05:03:34 localhost dnsmasq[319115]: warning: no upstream servers configured Dec 15 05:03:34 localhost dnsmasq-dhcp[319115]: DHCP, static leases only on 10.100.0.0, lease time 1d Dec 15 05:03:34 localhost dnsmasq[319115]: read /var/lib/neutron/dhcp/250d134d-a49b-43f4-b0e9-01d6d0fc04e1/addn_hosts - 0 addresses Dec 15 05:03:34 localhost dnsmasq-dhcp[319115]: read /var/lib/neutron/dhcp/250d134d-a49b-43f4-b0e9-01d6d0fc04e1/host Dec 15 05:03:34 localhost dnsmasq-dhcp[319115]: read /var/lib/neutron/dhcp/250d134d-a49b-43f4-b0e9-01d6d0fc04e1/opts Dec 15 05:03:35 localhost nova_compute[286344]: 2025-12-15 10:03:35.018 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:03:35 localhost neutron_dhcp_agent[267542]: 2025-12-15 10:03:35.091 267546 INFO neutron.agent.dhcp.agent [None req-8ddae0f8-4ddc-4de8-a176-ef109f8a2359 - - - - - -] DHCP configuration for ports {'e33b3ea8-c618-43da-a6e7-8ca32ec03a7a', '2a0d26c7-6d34-4d9e-b074-2ed6c50ddadd'} is completed#033[00m Dec 15 05:03:35 localhost ceph-mon[298913]: mon.np0005559462@0(leader).osd e123 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Dec 15 05:03:35 localhost ceph-mon[298913]: mon.np0005559462@0(leader).osd e123 do_prune osdmap full prune enabled Dec 15 05:03:35 localhost ceph-mon[298913]: mon.np0005559462@0(leader).osd e124 e124: 6 total, 6 up, 6 in Dec 15 05:03:35 localhost ceph-mon[298913]: log_channel(cluster) log [DBG] : osdmap e124: 6 total, 6 up, 6 in Dec 15 05:03:35 localhost neutron_sriov_agent[260044]: 2025-12-15 10:03:35.735 2 INFO neutron.agent.securitygroups_rpc [None 
req-8dd4269d-8113-453d-b7d3-288f94816260 d5377d4013ec43b98bbf91c063971dac 1808d9ac20f44272b13e28ceb7a5e4c6 - - default default] Security group member updated ['12be56c7-ee23-45e4-9687-84e749ab49b1']#033[00m Dec 15 05:03:35 localhost neutron_dhcp_agent[267542]: 2025-12-15 10:03:35.776 267546 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-12-15T10:03:35Z, description=, device_id=, device_owner=, dns_assignment=[], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[], id=bb6335e7-659a-413c-91bb-e070a2fdf092, ip_allocation=immediate, mac_address=fa:16:3e:29:2d:c1, name=tempest-PortsTestJSON-2090024269, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-12-15T10:02:44Z, description=, dns_domain=, id=250d134d-a49b-43f4-b0e9-01d6d0fc04e1, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-PortsTestJSON-test-network-383586829, port_security_enabled=True, project_id=1808d9ac20f44272b13e28ceb7a5e4c6, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=65280, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=1044, status=ACTIVE, subnets=['fbb47525-ef3f-434f-ac80-3c0506eca498'], tags=[], tenant_id=1808d9ac20f44272b13e28ceb7a5e4c6, updated_at=2025-12-15T10:03:33Z, vlan_transparent=None, network_id=250d134d-a49b-43f4-b0e9-01d6d0fc04e1, port_security_enabled=True, project_id=1808d9ac20f44272b13e28ceb7a5e4c6, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=['12be56c7-ee23-45e4-9687-84e749ab49b1'], standard_attr_id=1342, status=DOWN, tags=[], tenant_id=1808d9ac20f44272b13e28ceb7a5e4c6, updated_at=2025-12-15T10:03:35Z on network 
250d134d-a49b-43f4-b0e9-01d6d0fc04e1#033[00m Dec 15 05:03:35 localhost dnsmasq[319115]: read /var/lib/neutron/dhcp/250d134d-a49b-43f4-b0e9-01d6d0fc04e1/addn_hosts - 1 addresses Dec 15 05:03:35 localhost podman[319133]: 2025-12-15 10:03:35.99614648 +0000 UTC m=+0.062643418 container kill dc47333f8744d2dcc90b51ae4b10d7d68d4aaa0675b0952faff8c8ad17f89c7d (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-250d134d-a49b-43f4-b0e9-01d6d0fc04e1, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0) Dec 15 05:03:35 localhost dnsmasq-dhcp[319115]: read /var/lib/neutron/dhcp/250d134d-a49b-43f4-b0e9-01d6d0fc04e1/host Dec 15 05:03:36 localhost dnsmasq-dhcp[319115]: read /var/lib/neutron/dhcp/250d134d-a49b-43f4-b0e9-01d6d0fc04e1/opts Dec 15 05:03:36 localhost neutron_dhcp_agent[267542]: 2025-12-15 10:03:36.248 267546 INFO neutron.agent.dhcp.agent [None req-bf786bd3-b54f-4627-b8d5-c08869e4489e - - - - - -] DHCP configuration for ports {'bb6335e7-659a-413c-91bb-e070a2fdf092'} is completed#033[00m Dec 15 05:03:36 localhost dnsmasq[319115]: exiting on receipt of SIGTERM Dec 15 05:03:36 localhost podman[319170]: 2025-12-15 10:03:36.685314226 +0000 UTC m=+0.055927541 container kill dc47333f8744d2dcc90b51ae4b10d7d68d4aaa0675b0952faff8c8ad17f89c7d (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-250d134d-a49b-43f4-b0e9-01d6d0fc04e1, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, 
org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0) Dec 15 05:03:36 localhost systemd[1]: libpod-dc47333f8744d2dcc90b51ae4b10d7d68d4aaa0675b0952faff8c8ad17f89c7d.scope: Deactivated successfully. Dec 15 05:03:36 localhost ceph-mon[298913]: mon.np0005559462@0(leader).osd e124 do_prune osdmap full prune enabled Dec 15 05:03:36 localhost ceph-mon[298913]: mon.np0005559462@0(leader).osd e125 e125: 6 total, 6 up, 6 in Dec 15 05:03:36 localhost ceph-mon[298913]: log_channel(cluster) log [DBG] : osdmap e125: 6 total, 6 up, 6 in Dec 15 05:03:36 localhost podman[319183]: 2025-12-15 10:03:36.761157164 +0000 UTC m=+0.062833313 container died dc47333f8744d2dcc90b51ae4b10d7d68d4aaa0675b0952faff8c8ad17f89c7d (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-250d134d-a49b-43f4-b0e9-01d6d0fc04e1, org.label-schema.license=GPLv2, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0) Dec 15 05:03:36 localhost podman[319183]: 2025-12-15 10:03:36.795134834 +0000 UTC m=+0.096810923 container cleanup dc47333f8744d2dcc90b51ae4b10d7d68d4aaa0675b0952faff8c8ad17f89c7d (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-250d134d-a49b-43f4-b0e9-01d6d0fc04e1, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3) Dec 15 05:03:36 localhost systemd[1]: 
libpod-conmon-dc47333f8744d2dcc90b51ae4b10d7d68d4aaa0675b0952faff8c8ad17f89c7d.scope: Deactivated successfully. Dec 15 05:03:36 localhost podman[319185]: 2025-12-15 10:03:36.84212899 +0000 UTC m=+0.134239870 container remove dc47333f8744d2dcc90b51ae4b10d7d68d4aaa0675b0952faff8c8ad17f89c7d (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-250d134d-a49b-43f4-b0e9-01d6d0fc04e1, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2) Dec 15 05:03:36 localhost systemd[1]: var-lib-containers-storage-overlay-6121f72ea72f60dfea0e390c2b22d62814a6969c5596b142caf99bc659155395-merged.mount: Deactivated successfully. Dec 15 05:03:36 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-dc47333f8744d2dcc90b51ae4b10d7d68d4aaa0675b0952faff8c8ad17f89c7d-userdata-shm.mount: Deactivated successfully. 
Dec 15 05:03:37 localhost ovn_metadata_agent[160585]: 2025-12-15 10:03:37.067 160590 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:2b:21:22 10.100.0.18 10.100.0.3'], port_security=[], type=localport, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': ''}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.18/28 10.100.0.3/28', 'neutron:device_id': 'ovnmeta-250d134d-a49b-43f4-b0e9-01d6d0fc04e1', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-250d134d-a49b-43f4-b0e9-01d6d0fc04e1', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '1808d9ac20f44272b13e28ceb7a5e4c6', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=1eb1729d-99dd-4cb3-9468-5d1f6c4147b9, chassis=[], tunnel_key=1, gateway_chassis=[], requested_chassis=[], logical_port=2a0d26c7-6d34-4d9e-b074-2ed6c50ddadd) old=Port_Binding(mac=['fa:16:3e:2b:21:22 10.100.0.3'], external_ids={'neutron:cidrs': '10.100.0.3/28', 'neutron:device_id': 'ovnmeta-250d134d-a49b-43f4-b0e9-01d6d0fc04e1', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-250d134d-a49b-43f4-b0e9-01d6d0fc04e1', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '1808d9ac20f44272b13e28ceb7a5e4c6', 'neutron:revision_number': '2', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}) matches 
/usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Dec 15 05:03:37 localhost ovn_metadata_agent[160585]: 2025-12-15 10:03:37.070 160590 INFO neutron.agent.ovn.metadata.agent [-] Metadata Port 2a0d26c7-6d34-4d9e-b074-2ed6c50ddadd in datapath 250d134d-a49b-43f4-b0e9-01d6d0fc04e1 updated#033[00m Dec 15 05:03:37 localhost ovn_metadata_agent[160585]: 2025-12-15 10:03:37.072 160590 DEBUG neutron.agent.ovn.metadata.agent [-] Port b4039022-19c7-4992-bfc8-07d8cc3e1470 IP addresses were not retrieved from the Port_Binding MAC column ['unknown'] _get_port_ips /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:536#033[00m Dec 15 05:03:37 localhost ovn_metadata_agent[160585]: 2025-12-15 10:03:37.073 160590 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 250d134d-a49b-43f4-b0e9-01d6d0fc04e1, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m Dec 15 05:03:37 localhost ovn_metadata_agent[160585]: 2025-12-15 10:03:37.073 160858 DEBUG oslo.privsep.daemon [-] privsep: reply[28c33ef4-fc2c-452c-8842-b0e797a11262]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Dec 15 05:03:37 localhost nova_compute[286344]: 2025-12-15 10:03:37.248 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:03:37 localhost systemd[1]: Started /usr/bin/podman healthcheck run 67ee72fcd99c896f79e8807764b1d0f04e7d45927d06e306c0986394e8ed42a0. Dec 15 05:03:37 localhost systemd[1]: Started /usr/bin/podman healthcheck run 9f0f481c937606298203797a71924b7614a5d9e3f4a8afd837cfa5b9602aedeb. Dec 15 05:03:37 localhost systemd[1]: Started /usr/bin/podman healthcheck run b3d8483ade539500e228c2af9c0c3e647febb5ec7903610a83aea32631007d4a. 
Dec 15 05:03:37 localhost ceph-mon[298913]: mon.np0005559462@0(leader).osd e125 do_prune osdmap full prune enabled Dec 15 05:03:37 localhost ceph-mon[298913]: mon.np0005559462@0(leader).osd e126 e126: 6 total, 6 up, 6 in Dec 15 05:03:37 localhost neutron_sriov_agent[260044]: 2025-12-15 10:03:37.754 2 INFO neutron.agent.securitygroups_rpc [None req-52d52bdb-7409-4bdc-b7bf-4526b5480b77 d5377d4013ec43b98bbf91c063971dac 1808d9ac20f44272b13e28ceb7a5e4c6 - - default default] Security group member updated ['7be9db4e-b288-4501-8eee-78a7440e5fd3', '12be56c7-ee23-45e4-9687-84e749ab49b1']#033[00m Dec 15 05:03:37 localhost ceph-mon[298913]: log_channel(cluster) log [DBG] : osdmap e126: 6 total, 6 up, 6 in Dec 15 05:03:37 localhost systemd[1]: tmp-crun.4NyQgj.mount: Deactivated successfully. Dec 15 05:03:37 localhost podman[319236]: 2025-12-15 10:03:37.778808809 +0000 UTC m=+0.101377837 container health_status 67ee72fcd99c896f79e8807764b1d0f04e7d45927d06e306c0986394e8ed42a0 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 
'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible) Dec 15 05:03:37 localhost podman[319238]: 2025-12-15 10:03:37.747106136 +0000 UTC m=+0.068957686 container health_status b3d8483ade539500e228c2af9c0c3e647febb5ec7903610a83aea32631007d4a (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, managed_by=edpm_ansible, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '07f3af15d79276418f54a8dde2fc251064527c3571430e14af77aaa2c01f1193-cb1e9478634a00c8fb5ad9cd7fcd38c7b2a1a85187dfaa618bc715c24c2b15fb'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=ceilometer_agent_compute, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team) Dec 15 05:03:37 localhost podman[319236]: 2025-12-15 10:03:37.817391625 +0000 UTC m=+0.139960653 container exec_died 67ee72fcd99c896f79e8807764b1d0f04e7d45927d06e306c0986394e8ed42a0 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}) Dec 15 05:03:37 localhost systemd[1]: 
67ee72fcd99c896f79e8807764b1d0f04e7d45927d06e306c0986394e8ed42a0.service: Deactivated successfully. Dec 15 05:03:37 localhost podman[319238]: 2025-12-15 10:03:37.82997733 +0000 UTC m=+0.151828850 container exec_died b3d8483ade539500e228c2af9c0c3e647febb5ec7903610a83aea32631007d4a (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, config_id=ceilometer_agent_compute, container_name=ceilometer_agent_compute, org.label-schema.vendor=CentOS, tcib_managed=true, managed_by=edpm_ansible, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '07f3af15d79276418f54a8dde2fc251064527c3571430e14af77aaa2c01f1193-cb1e9478634a00c8fb5ad9cd7fcd38c7b2a1a85187dfaa618bc715c24c2b15fb'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', 
'/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb) Dec 15 05:03:37 localhost systemd[1]: b3d8483ade539500e228c2af9c0c3e647febb5ec7903610a83aea32631007d4a.service: Deactivated successfully. Dec 15 05:03:37 localhost podman[319237]: 2025-12-15 10:03:37.821023736 +0000 UTC m=+0.142086636 container health_status 9f0f481c937606298203797a71924b7614a5d9e3f4a8afd837cfa5b9602aedeb (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, maintainer=OpenStack Kubernetes Operator team, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.schema-version=1.0, container_name=multipathd, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, managed_by=edpm_ansible, 
org.label-schema.build-date=20251202, config_id=multipathd, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true) Dec 15 05:03:37 localhost podman[319237]: 2025-12-15 10:03:37.903438458 +0000 UTC m=+0.224501348 container exec_died 9f0f481c937606298203797a71924b7614a5d9e3f4a8afd837cfa5b9602aedeb (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=multipathd, org.label-schema.build-date=20251202, maintainer=OpenStack Kubernetes Operator team, container_name=multipathd, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, tcib_managed=true, managed_by=edpm_ansible, io.buildah.version=1.41.3) Dec 15 05:03:37 localhost systemd[1]: 
9f0f481c937606298203797a71924b7614a5d9e3f4a8afd837cfa5b9602aedeb.service: Deactivated successfully. Dec 15 05:03:38 localhost podman[319318]: Dec 15 05:03:38 localhost podman[319318]: 2025-12-15 10:03:38.179718832 +0000 UTC m=+0.072605118 container create 8ef5afcb32334e330a8f973483a59535a18dc448340bd1cfc00ca2db49dfede9 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-250d134d-a49b-43f4-b0e9-01d6d0fc04e1, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, io.buildah.version=1.41.3) Dec 15 05:03:38 localhost neutron_sriov_agent[260044]: 2025-12-15 10:03:38.221 2 INFO neutron.agent.securitygroups_rpc [None req-037131f4-aca9-4172-9580-636f8bf0bf05 d5377d4013ec43b98bbf91c063971dac 1808d9ac20f44272b13e28ceb7a5e4c6 - - default default] Security group member updated ['7be9db4e-b288-4501-8eee-78a7440e5fd3']#033[00m Dec 15 05:03:38 localhost systemd[1]: Started libpod-conmon-8ef5afcb32334e330a8f973483a59535a18dc448340bd1cfc00ca2db49dfede9.scope. Dec 15 05:03:38 localhost podman[319318]: 2025-12-15 10:03:38.140324376 +0000 UTC m=+0.033210682 image pull quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified Dec 15 05:03:38 localhost systemd[1]: Started libcrun container. 
Dec 15 05:03:38 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/681d94e39dba3dbf830f45c7e7f97fc47cbb67049a5a7478261f8c97e3f8e1ea/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff) Dec 15 05:03:38 localhost podman[319318]: 2025-12-15 10:03:38.276224017 +0000 UTC m=+0.169110303 container init 8ef5afcb32334e330a8f973483a59535a18dc448340bd1cfc00ca2db49dfede9 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-250d134d-a49b-43f4-b0e9-01d6d0fc04e1, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.vendor=CentOS) Dec 15 05:03:38 localhost podman[319318]: 2025-12-15 10:03:38.281676803 +0000 UTC m=+0.174563079 container start 8ef5afcb32334e330a8f973483a59535a18dc448340bd1cfc00ca2db49dfede9 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-250d134d-a49b-43f4-b0e9-01d6d0fc04e1, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0) Dec 15 05:03:38 localhost dnsmasq[319336]: started, version 2.85 cachesize 150 Dec 15 05:03:38 localhost dnsmasq[319336]: DNS service limited to local subnets Dec 15 05:03:38 localhost dnsmasq[319336]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile Dec 15 05:03:38 localhost dnsmasq[319336]: warning: no upstream servers 
configured Dec 15 05:03:38 localhost dnsmasq-dhcp[319336]: DHCP, static leases only on 10.100.0.0, lease time 1d Dec 15 05:03:38 localhost dnsmasq-dhcp[319336]: DHCP, static leases only on 10.100.0.16, lease time 1d Dec 15 05:03:38 localhost dnsmasq[319336]: read /var/lib/neutron/dhcp/250d134d-a49b-43f4-b0e9-01d6d0fc04e1/addn_hosts - 1 addresses Dec 15 05:03:38 localhost dnsmasq-dhcp[319336]: read /var/lib/neutron/dhcp/250d134d-a49b-43f4-b0e9-01d6d0fc04e1/host Dec 15 05:03:38 localhost dnsmasq-dhcp[319336]: read /var/lib/neutron/dhcp/250d134d-a49b-43f4-b0e9-01d6d0fc04e1/opts Dec 15 05:03:38 localhost neutron_dhcp_agent[267542]: 2025-12-15 10:03:38.324 267546 INFO neutron.agent.dhcp.agent [None req-d9f04660-658d-414a-b953-ab3b4caee08b - - - - - -] Trigger reload_allocations for port admin_state_up=False, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-12-15T10:03:35Z, description=, device_id=, device_owner=, dns_assignment=[], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[], id=bb6335e7-659a-413c-91bb-e070a2fdf092, ip_allocation=immediate, mac_address=fa:16:3e:29:2d:c1, name=tempest-PortsTestJSON-1877797334, network_id=250d134d-a49b-43f4-b0e9-01d6d0fc04e1, port_security_enabled=True, project_id=1808d9ac20f44272b13e28ceb7a5e4c6, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=2, security_groups=['7be9db4e-b288-4501-8eee-78a7440e5fd3'], standard_attr_id=1342, status=DOWN, tags=[], tenant_id=1808d9ac20f44272b13e28ceb7a5e4c6, updated_at=2025-12-15T10:03:37Z on network 250d134d-a49b-43f4-b0e9-01d6d0fc04e1#033[00m Dec 15 05:03:38 localhost neutron_dhcp_agent[267542]: 2025-12-15 10:03:38.326 267546 INFO oslo.privsep.daemon [None req-d9f04660-658d-414a-b953-ab3b4caee08b - - - - - -] Running privsep helper: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'privsep-helper', '--config-file', 
'/etc/neutron/neutron.conf', '--config-dir', '/etc/neutron.conf.d', '--privsep_context', 'neutron.privileged.dhcp_release_cmd', '--privsep_sock_path', '/tmp/tmp36b67tnm/privsep.sock']#033[00m Dec 15 05:03:38 localhost neutron_dhcp_agent[267542]: 2025-12-15 10:03:38.490 267546 INFO neutron.agent.dhcp.agent [None req-7c42b57f-18e4-4bc3-9cac-046573498127 - - - - - -] DHCP configuration for ports {'e33b3ea8-c618-43da-a6e7-8ca32ec03a7a', '2a0d26c7-6d34-4d9e-b074-2ed6c50ddadd', 'bb6335e7-659a-413c-91bb-e070a2fdf092', '10ce6638-0ca4-4e8d-aec9-4163a87c1a20'} is completed#033[00m Dec 15 05:03:38 localhost ceph-mon[298913]: mon.np0005559462@0(leader).osd e126 do_prune osdmap full prune enabled Dec 15 05:03:38 localhost ceph-mon[298913]: mon.np0005559462@0(leader).osd e127 e127: 6 total, 6 up, 6 in Dec 15 05:03:38 localhost ceph-mon[298913]: log_channel(cluster) log [DBG] : osdmap e127: 6 total, 6 up, 6 in Dec 15 05:03:38 localhost systemd[1]: tmp-crun.2bihAo.mount: Deactivated successfully. Dec 15 05:03:38 localhost neutron_dhcp_agent[267542]: 2025-12-15 10:03:38.956 267546 INFO oslo.privsep.daemon [None req-d9f04660-658d-414a-b953-ab3b4caee08b - - - - - -] Spawned new privsep daemon via rootwrap#033[00m Dec 15 05:03:38 localhost neutron_dhcp_agent[267542]: 2025-12-15 10:03:38.835 319341 INFO oslo.privsep.daemon [-] privsep daemon starting#033[00m Dec 15 05:03:38 localhost neutron_dhcp_agent[267542]: 2025-12-15 10:03:38.839 319341 INFO oslo.privsep.daemon [-] privsep process running with uid/gid: 0/0#033[00m Dec 15 05:03:38 localhost neutron_dhcp_agent[267542]: 2025-12-15 10:03:38.842 319341 INFO oslo.privsep.daemon [-] privsep process running with capabilities (eff/prm/inh): CAP_NET_ADMIN|CAP_SYS_ADMIN/CAP_NET_ADMIN|CAP_SYS_ADMIN/none#033[00m Dec 15 05:03:38 localhost neutron_dhcp_agent[267542]: 2025-12-15 10:03:38.843 319341 INFO oslo.privsep.daemon [-] privsep daemon running as pid 319341#033[00m Dec 15 05:03:39 localhost dnsmasq-dhcp[319336]: DHCPRELEASE(tap10ce6638-0c) 
10.100.0.12 fa:16:3e:29:2d:c1 Dec 15 05:03:39 localhost podman[319364]: 2025-12-15 10:03:39.730580341 +0000 UTC m=+0.054502915 container kill 8ef5afcb32334e330a8f973483a59535a18dc448340bd1cfc00ca2db49dfede9 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-250d134d-a49b-43f4-b0e9-01d6d0fc04e1, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202) Dec 15 05:03:39 localhost dnsmasq[319336]: read /var/lib/neutron/dhcp/250d134d-a49b-43f4-b0e9-01d6d0fc04e1/addn_hosts - 1 addresses Dec 15 05:03:39 localhost dnsmasq-dhcp[319336]: read /var/lib/neutron/dhcp/250d134d-a49b-43f4-b0e9-01d6d0fc04e1/host Dec 15 05:03:39 localhost dnsmasq-dhcp[319336]: read /var/lib/neutron/dhcp/250d134d-a49b-43f4-b0e9-01d6d0fc04e1/opts Dec 15 05:03:39 localhost ceph-mon[298913]: mon.np0005559462@0(leader).osd e127 do_prune osdmap full prune enabled Dec 15 05:03:39 localhost ceph-mon[298913]: mon.np0005559462@0(leader).osd e128 e128: 6 total, 6 up, 6 in Dec 15 05:03:39 localhost ceph-mon[298913]: log_channel(cluster) log [DBG] : osdmap e128: 6 total, 6 up, 6 in Dec 15 05:03:39 localhost neutron_dhcp_agent[267542]: 2025-12-15 10:03:39.936 267546 INFO neutron.agent.dhcp.agent [None req-fcf2d325-4f06-4e09-a9a8-9b377c4a63ad - - - - - -] DHCP configuration for ports {'bb6335e7-659a-413c-91bb-e070a2fdf092'} is completed#033[00m Dec 15 05:03:40 localhost nova_compute[286344]: 2025-12-15 10:03:40.020 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:03:40 localhost systemd[1]: tmp-crun.uOPBgx.mount: Deactivated successfully. 
Dec 15 05:03:40 localhost dnsmasq[319336]: read /var/lib/neutron/dhcp/250d134d-a49b-43f4-b0e9-01d6d0fc04e1/addn_hosts - 0 addresses Dec 15 05:03:40 localhost dnsmasq-dhcp[319336]: read /var/lib/neutron/dhcp/250d134d-a49b-43f4-b0e9-01d6d0fc04e1/host Dec 15 05:03:40 localhost dnsmasq-dhcp[319336]: read /var/lib/neutron/dhcp/250d134d-a49b-43f4-b0e9-01d6d0fc04e1/opts Dec 15 05:03:40 localhost podman[319403]: 2025-12-15 10:03:40.203488725 +0000 UTC m=+0.056555836 container kill 8ef5afcb32334e330a8f973483a59535a18dc448340bd1cfc00ca2db49dfede9 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-250d134d-a49b-43f4-b0e9-01d6d0fc04e1, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true) Dec 15 05:03:40 localhost systemd[1]: Started /usr/bin/podman healthcheck run 730c50020a7d9e2da21d2462d40af20ac303e43f3491f692cbbd87f432ba3e09. Dec 15 05:03:40 localhost systemd[1]: Started /usr/bin/podman healthcheck run ac2eae30a2a7f971398af42ea67b3ed799d4b3341f7688ebcd73f7ca7f66c2a8. Dec 15 05:03:40 localhost systemd[1]: tmp-crun.XaaMuo.mount: Deactivated successfully. 
Dec 15 05:03:40 localhost podman[319419]: 2025-12-15 10:03:40.308879303 +0000 UTC m=+0.071500981 container health_status ac2eae30a2a7f971398af42ea67b3ed799d4b3341f7688ebcd73f7ca7f66c2a8 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, container_name=ovn_controller, io.buildah.version=1.41.3, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '07f3af15d79276418f54a8dde2fc251064527c3571430e14af77aaa2c01f1193'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}) Dec 15 05:03:40 localhost podman[319418]: 2025-12-15 10:03:40.362879884 +0000 UTC m=+0.134768524 container health_status 730c50020a7d9e2da21d2462d40af20ac303e43f3491f692cbbd87f432ba3e09 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, distribution-scope=public, 
container_name=openstack_network_exporter, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.expose-services=, build-date=2025-08-20T13:12:41, architecture=x86_64, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'cb1e9478634a00c8fb5ad9cd7fcd38c7b2a1a85187dfaa618bc715c24c2b15fb'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, name=ubi9-minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-type=git, url=https://catalog.redhat.com/en/search?searchType=containers, config_id=openstack_network_exporter, maintainer=Red Hat, Inc., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.33.7, managed_by=edpm_ansible, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, com.redhat.component=ubi9-minimal-container, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., release=1755695350, version=9.6) Dec 15 05:03:40 localhost podman[319419]: 2025-12-15 10:03:40.370240208 +0000 UTC m=+0.132861896 container exec_died ac2eae30a2a7f971398af42ea67b3ed799d4b3341f7688ebcd73f7ca7f66c2a8 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '07f3af15d79276418f54a8dde2fc251064527c3571430e14af77aaa2c01f1193'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.build-date=20251202, 
org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=ovn_controller, managed_by=edpm_ansible, container_name=ovn_controller) Dec 15 05:03:40 localhost podman[319418]: 2025-12-15 10:03:40.380392822 +0000 UTC m=+0.152281452 container exec_died 730c50020a7d9e2da21d2462d40af20ac303e43f3491f692cbbd87f432ba3e09 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, io.openshift.expose-services=, managed_by=edpm_ansible, com.redhat.component=ubi9-minimal-container, maintainer=Red Hat, Inc., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-type=git, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, url=https://catalog.redhat.com/en/search?searchType=containers, architecture=x86_64, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, name=ubi9-minimal, version=9.6, container_name=openstack_network_exporter, vendor=Red Hat, Inc., config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'cb1e9478634a00c8fb5ad9cd7fcd38c7b2a1a85187dfaa618bc715c24c2b15fb'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, config_id=openstack_network_exporter, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, release=1755695350, build-date=2025-08-20T13:12:41) Dec 15 05:03:40 localhost systemd[1]: ac2eae30a2a7f971398af42ea67b3ed799d4b3341f7688ebcd73f7ca7f66c2a8.service: Deactivated successfully. 
Dec 15 05:03:40 localhost systemd[1]: 730c50020a7d9e2da21d2462d40af20ac303e43f3491f692cbbd87f432ba3e09.service: Deactivated successfully. Dec 15 05:03:40 localhost ovn_metadata_agent[160585]: 2025-12-15 10:03:40.414 160590 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:2b:21:22 10.100.0.18 10.100.0.2'], port_security=[], type=localport, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': ''}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.18/28 10.100.0.2/28', 'neutron:device_id': 'ovnmeta-250d134d-a49b-43f4-b0e9-01d6d0fc04e1', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-250d134d-a49b-43f4-b0e9-01d6d0fc04e1', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '1808d9ac20f44272b13e28ceb7a5e4c6', 'neutron:revision_number': '5', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=1eb1729d-99dd-4cb3-9468-5d1f6c4147b9, chassis=[], tunnel_key=1, gateway_chassis=[], requested_chassis=[], logical_port=2a0d26c7-6d34-4d9e-b074-2ed6c50ddadd) old=Port_Binding(mac=['fa:16:3e:2b:21:22 10.100.0.18 10.100.0.3'], external_ids={'neutron:cidrs': '10.100.0.18/28 10.100.0.3/28', 'neutron:device_id': 'ovnmeta-250d134d-a49b-43f4-b0e9-01d6d0fc04e1', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-250d134d-a49b-43f4-b0e9-01d6d0fc04e1', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '1808d9ac20f44272b13e28ceb7a5e4c6', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 
'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Dec 15 05:03:40 localhost ovn_metadata_agent[160585]: 2025-12-15 10:03:40.415 160590 INFO neutron.agent.ovn.metadata.agent [-] Metadata Port 2a0d26c7-6d34-4d9e-b074-2ed6c50ddadd in datapath 250d134d-a49b-43f4-b0e9-01d6d0fc04e1 updated#033[00m Dec 15 05:03:40 localhost ovn_metadata_agent[160585]: 2025-12-15 10:03:40.418 160590 DEBUG neutron.agent.ovn.metadata.agent [-] Port b4039022-19c7-4992-bfc8-07d8cc3e1470 IP addresses were not retrieved from the Port_Binding MAC column ['unknown'] _get_port_ips /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:536#033[00m Dec 15 05:03:40 localhost ovn_metadata_agent[160585]: 2025-12-15 10:03:40.418 160590 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 250d134d-a49b-43f4-b0e9-01d6d0fc04e1, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m Dec 15 05:03:40 localhost ovn_metadata_agent[160585]: 2025-12-15 10:03:40.419 160858 DEBUG oslo.privsep.daemon [-] privsep: reply[fa732525-07bd-4f40-8751-49f501b99871]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Dec 15 05:03:40 localhost neutron_dhcp_agent[267542]: 2025-12-15 10:03:40.552 267546 INFO neutron.agent.dhcp.agent [None req-e5f64073-f407-4c73-8179-8de712cf0ea0 - - - - - -] DHCP configuration for ports {'e33b3ea8-c618-43da-a6e7-8ca32ec03a7a', '2a0d26c7-6d34-4d9e-b074-2ed6c50ddadd', '10ce6638-0ca4-4e8d-aec9-4163a87c1a20'} is completed#033[00m Dec 15 05:03:40 localhost ceph-mon[298913]: mon.np0005559462@0(leader).osd e128 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Dec 15 05:03:40 localhost ceph-mon[298913]: mon.np0005559462@0(leader).osd e128 
do_prune osdmap full prune enabled Dec 15 05:03:40 localhost ceph-mon[298913]: mon.np0005559462@0(leader).osd e129 e129: 6 total, 6 up, 6 in Dec 15 05:03:40 localhost ceph-mon[298913]: log_channel(cluster) log [DBG] : osdmap e129: 6 total, 6 up, 6 in Dec 15 05:03:40 localhost ceph-mon[298913]: mon.np0005559462@0(leader) e17 handle_command mon_command({"prefix":"df", "format":"json"} v 0) Dec 15 05:03:40 localhost ceph-mon[298913]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/539179827' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch Dec 15 05:03:40 localhost ceph-mon[298913]: mon.np0005559462@0(leader) e17 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0) Dec 15 05:03:40 localhost ceph-mon[298913]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/539179827' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch Dec 15 05:03:41 localhost neutron_sriov_agent[260044]: 2025-12-15 10:03:41.414 2 INFO neutron.agent.securitygroups_rpc [None req-90082c2d-bad6-4a48-926b-7bdc189a1b0c d5377d4013ec43b98bbf91c063971dac 1808d9ac20f44272b13e28ceb7a5e4c6 - - default default] Security group member updated ['f8bc8059-257b-424e-99c5-0df3eaaf9b27']#033[00m Dec 15 05:03:41 localhost neutron_dhcp_agent[267542]: 2025-12-15 10:03:41.458 267546 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-12-15T10:03:41Z, description=, device_id=, device_owner=, dns_assignment=[], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[], id=a8611e3e-b504-4b9b-a3e6-a797f9aa1242, ip_allocation=immediate, mac_address=fa:16:3e:d8:6f:c0, name=tempest-PortsTestJSON-389546704, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], 
created_at=2025-12-15T10:02:44Z, description=, dns_domain=, id=250d134d-a49b-43f4-b0e9-01d6d0fc04e1, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-PortsTestJSON-test-network-383586829, port_security_enabled=True, project_id=1808d9ac20f44272b13e28ceb7a5e4c6, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=65280, qos_policy_id=None, revision_number=5, router:external=False, shared=False, standard_attr_id=1044, status=ACTIVE, subnets=['3a4d73a2-bb85-4e83-a13f-7e92cd29f3a4', '7e0ec288-fc3d-4df0-91b8-25a91960bfaa'], tags=[], tenant_id=1808d9ac20f44272b13e28ceb7a5e4c6, updated_at=2025-12-15T10:03:39Z, vlan_transparent=None, network_id=250d134d-a49b-43f4-b0e9-01d6d0fc04e1, port_security_enabled=True, project_id=1808d9ac20f44272b13e28ceb7a5e4c6, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=['f8bc8059-257b-424e-99c5-0df3eaaf9b27'], standard_attr_id=1399, status=DOWN, tags=[], tenant_id=1808d9ac20f44272b13e28ceb7a5e4c6, updated_at=2025-12-15T10:03:41Z on network 250d134d-a49b-43f4-b0e9-01d6d0fc04e1#033[00m Dec 15 05:03:41 localhost dnsmasq[319336]: read /var/lib/neutron/dhcp/250d134d-a49b-43f4-b0e9-01d6d0fc04e1/addn_hosts - 1 addresses Dec 15 05:03:41 localhost podman[319489]: 2025-12-15 10:03:41.703280185 +0000 UTC m=+0.058121205 container kill 8ef5afcb32334e330a8f973483a59535a18dc448340bd1cfc00ca2db49dfede9 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-250d134d-a49b-43f4-b0e9-01d6d0fc04e1, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team) Dec 15 05:03:41 localhost dnsmasq-dhcp[319336]: 
read /var/lib/neutron/dhcp/250d134d-a49b-43f4-b0e9-01d6d0fc04e1/host Dec 15 05:03:41 localhost dnsmasq-dhcp[319336]: read /var/lib/neutron/dhcp/250d134d-a49b-43f4-b0e9-01d6d0fc04e1/opts Dec 15 05:03:41 localhost ceph-mon[298913]: mon.np0005559462@0(leader).osd e129 do_prune osdmap full prune enabled Dec 15 05:03:41 localhost ceph-mon[298913]: mon.np0005559462@0(leader).osd e130 e130: 6 total, 6 up, 6 in Dec 15 05:03:41 localhost ceph-mon[298913]: log_channel(cluster) log [DBG] : osdmap e130: 6 total, 6 up, 6 in Dec 15 05:03:42 localhost neutron_dhcp_agent[267542]: 2025-12-15 10:03:42.015 267546 INFO neutron.agent.dhcp.agent [None req-91a687d5-2274-4250-bdab-77706dfa25bf - - - - - -] DHCP configuration for ports {'a8611e3e-b504-4b9b-a3e6-a797f9aa1242'} is completed#033[00m Dec 15 05:03:42 localhost nova_compute[286344]: 2025-12-15 10:03:42.285 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:03:42 localhost dnsmasq[319336]: exiting on receipt of SIGTERM Dec 15 05:03:42 localhost podman[319528]: 2025-12-15 10:03:42.805952759 +0000 UTC m=+0.052829413 container kill 8ef5afcb32334e330a8f973483a59535a18dc448340bd1cfc00ca2db49dfede9 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-250d134d-a49b-43f4-b0e9-01d6d0fc04e1, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.build-date=20251202) Dec 15 05:03:42 localhost systemd[1]: libpod-8ef5afcb32334e330a8f973483a59535a18dc448340bd1cfc00ca2db49dfede9.scope: Deactivated successfully. 
Dec 15 05:03:42 localhost ceph-mon[298913]: mon.np0005559462@0(leader).osd e130 do_prune osdmap full prune enabled Dec 15 05:03:42 localhost ceph-mon[298913]: mon.np0005559462@0(leader).osd e131 e131: 6 total, 6 up, 6 in Dec 15 05:03:42 localhost ceph-mon[298913]: log_channel(cluster) log [DBG] : osdmap e131: 6 total, 6 up, 6 in Dec 15 05:03:42 localhost podman[319543]: 2025-12-15 10:03:42.881593262 +0000 UTC m=+0.053098800 container died 8ef5afcb32334e330a8f973483a59535a18dc448340bd1cfc00ca2db49dfede9 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-250d134d-a49b-43f4-b0e9-01d6d0fc04e1, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2) Dec 15 05:03:42 localhost systemd[1]: tmp-crun.CLAw8k.mount: Deactivated successfully. Dec 15 05:03:42 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-8ef5afcb32334e330a8f973483a59535a18dc448340bd1cfc00ca2db49dfede9-userdata-shm.mount: Deactivated successfully. 
Dec 15 05:03:42 localhost podman[319543]: 2025-12-15 10:03:42.934951297 +0000 UTC m=+0.106456805 container remove 8ef5afcb32334e330a8f973483a59535a18dc448340bd1cfc00ca2db49dfede9 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-250d134d-a49b-43f4-b0e9-01d6d0fc04e1, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS) Dec 15 05:03:42 localhost systemd[1]: libpod-conmon-8ef5afcb32334e330a8f973483a59535a18dc448340bd1cfc00ca2db49dfede9.scope: Deactivated successfully. Dec 15 05:03:43 localhost ovn_metadata_agent[160585]: 2025-12-15 10:03:43.503 160590 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:2b:21:22 10.100.0.18 10.100.0.2 10.100.0.34'], port_security=[], type=localport, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': ''}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.18/28 10.100.0.2/28 10.100.0.34/28', 'neutron:device_id': 'ovnmeta-250d134d-a49b-43f4-b0e9-01d6d0fc04e1', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-250d134d-a49b-43f4-b0e9-01d6d0fc04e1', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '1808d9ac20f44272b13e28ceb7a5e4c6', 'neutron:revision_number': '6', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], 
datapath=1eb1729d-99dd-4cb3-9468-5d1f6c4147b9, chassis=[], tunnel_key=1, gateway_chassis=[], requested_chassis=[], logical_port=2a0d26c7-6d34-4d9e-b074-2ed6c50ddadd) old=Port_Binding(mac=['fa:16:3e:2b:21:22 10.100.0.18 10.100.0.2'], external_ids={'neutron:cidrs': '10.100.0.18/28 10.100.0.2/28', 'neutron:device_id': 'ovnmeta-250d134d-a49b-43f4-b0e9-01d6d0fc04e1', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-250d134d-a49b-43f4-b0e9-01d6d0fc04e1', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '1808d9ac20f44272b13e28ceb7a5e4c6', 'neutron:revision_number': '5', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Dec 15 05:03:43 localhost ovn_metadata_agent[160585]: 2025-12-15 10:03:43.505 160590 INFO neutron.agent.ovn.metadata.agent [-] Metadata Port 2a0d26c7-6d34-4d9e-b074-2ed6c50ddadd in datapath 250d134d-a49b-43f4-b0e9-01d6d0fc04e1 updated#033[00m Dec 15 05:03:43 localhost ovn_metadata_agent[160585]: 2025-12-15 10:03:43.507 160590 DEBUG neutron.agent.ovn.metadata.agent [-] Port b4039022-19c7-4992-bfc8-07d8cc3e1470 IP addresses were not retrieved from the Port_Binding MAC column ['unknown'] _get_port_ips /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:536#033[00m Dec 15 05:03:43 localhost ovn_metadata_agent[160585]: 2025-12-15 10:03:43.508 160590 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 250d134d-a49b-43f4-b0e9-01d6d0fc04e1, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m Dec 15 05:03:43 localhost ovn_metadata_agent[160585]: 2025-12-15 10:03:43.508 160858 DEBUG oslo.privsep.daemon [-] privsep: reply[758b8ece-8c65-4dbd-bbe2-60fef7690956]: (4, False) 
_call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Dec 15 05:03:43 localhost systemd[1]: var-lib-containers-storage-overlay-681d94e39dba3dbf830f45c7e7f97fc47cbb67049a5a7478261f8c97e3f8e1ea-merged.mount: Deactivated successfully. Dec 15 05:03:43 localhost ceph-mon[298913]: mon.np0005559462@0(leader).osd e131 do_prune osdmap full prune enabled Dec 15 05:03:43 localhost ceph-mon[298913]: mon.np0005559462@0(leader).osd e132 e132: 6 total, 6 up, 6 in Dec 15 05:03:43 localhost ceph-mon[298913]: log_channel(cluster) log [DBG] : osdmap e132: 6 total, 6 up, 6 in Dec 15 05:03:44 localhost neutron_sriov_agent[260044]: 2025-12-15 10:03:44.215 2 INFO neutron.agent.securitygroups_rpc [None req-4fbb28a1-d3a0-4c3c-8457-054b24999e02 d5377d4013ec43b98bbf91c063971dac 1808d9ac20f44272b13e28ceb7a5e4c6 - - default default] Security group member updated ['2d16e43b-069c-4513-8db0-4d6b4f60237d', 'fb8ac40a-bf20-49a5-84be-e2fcedacb8f5', 'f8bc8059-257b-424e-99c5-0df3eaaf9b27']#033[00m Dec 15 05:03:45 localhost nova_compute[286344]: 2025-12-15 10:03:45.058 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:03:45 localhost neutron_sriov_agent[260044]: 2025-12-15 10:03:45.076 2 INFO neutron.agent.securitygroups_rpc [None req-9b7b2cb6-10f6-4c29-b9a5-a37c25ade3d0 d5377d4013ec43b98bbf91c063971dac 1808d9ac20f44272b13e28ceb7a5e4c6 - - default default] Security group member updated ['2d16e43b-069c-4513-8db0-4d6b4f60237d', 'fb8ac40a-bf20-49a5-84be-e2fcedacb8f5']#033[00m Dec 15 05:03:45 localhost podman[319621]: Dec 15 05:03:45 localhost podman[319621]: 2025-12-15 10:03:45.164019787 +0000 UTC m=+0.087186383 container create c82311c20cf01a78120b91984f89c608fb809edb2784dd0c4e949f62d3f937b7 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-250d134d-a49b-43f4-b0e9-01d6d0fc04e1, org.label-schema.vendor=CentOS, 
maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, tcib_managed=true, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2) Dec 15 05:03:45 localhost systemd[1]: Started libpod-conmon-c82311c20cf01a78120b91984f89c608fb809edb2784dd0c4e949f62d3f937b7.scope. Dec 15 05:03:45 localhost podman[319621]: 2025-12-15 10:03:45.123536133 +0000 UTC m=+0.046702779 image pull quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified Dec 15 05:03:45 localhost systemd[1]: Started libcrun container. Dec 15 05:03:45 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/7e69da53e39c86edc0d98389d2dab9b15a42b34708a1a13b06a00c77e2e5770c/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff) Dec 15 05:03:45 localhost podman[319621]: 2025-12-15 10:03:45.245798663 +0000 UTC m=+0.168965279 container init c82311c20cf01a78120b91984f89c608fb809edb2784dd0c4e949f62d3f937b7 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-250d134d-a49b-43f4-b0e9-01d6d0fc04e1, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.license=GPLv2, org.label-schema.build-date=20251202, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0) Dec 15 05:03:45 localhost podman[319621]: 2025-12-15 10:03:45.255424063 +0000 UTC m=+0.178590649 container start c82311c20cf01a78120b91984f89c608fb809edb2784dd0c4e949f62d3f937b7 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-250d134d-a49b-43f4-b0e9-01d6d0fc04e1, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, 
org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image) Dec 15 05:03:45 localhost dnsmasq[319639]: started, version 2.85 cachesize 150 Dec 15 05:03:45 localhost dnsmasq[319639]: DNS service limited to local subnets Dec 15 05:03:45 localhost dnsmasq[319639]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile Dec 15 05:03:45 localhost dnsmasq[319639]: warning: no upstream servers configured Dec 15 05:03:45 localhost dnsmasq-dhcp[319639]: DHCP, static leases only on 10.100.0.0, lease time 1d Dec 15 05:03:45 localhost dnsmasq-dhcp[319639]: DHCP, static leases only on 10.100.0.32, lease time 1d Dec 15 05:03:45 localhost dnsmasq-dhcp[319639]: DHCP, static leases only on 10.100.0.16, lease time 1d Dec 15 05:03:45 localhost dnsmasq[319639]: read /var/lib/neutron/dhcp/250d134d-a49b-43f4-b0e9-01d6d0fc04e1/addn_hosts - 1 addresses Dec 15 05:03:45 localhost dnsmasq-dhcp[319639]: read /var/lib/neutron/dhcp/250d134d-a49b-43f4-b0e9-01d6d0fc04e1/host Dec 15 05:03:45 localhost dnsmasq-dhcp[319639]: read /var/lib/neutron/dhcp/250d134d-a49b-43f4-b0e9-01d6d0fc04e1/opts Dec 15 05:03:45 localhost neutron_dhcp_agent[267542]: 2025-12-15 10:03:45.313 267546 INFO neutron.agent.dhcp.agent [None req-7d10f87e-591a-49f1-bd5e-13d3e68618b7 - - - - - -] Trigger reload_allocations for port admin_state_up=False, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-12-15T10:03:41Z, description=, device_id=, device_owner=, dns_assignment=[], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[], id=a8611e3e-b504-4b9b-a3e6-a797f9aa1242, ip_allocation=immediate, mac_address=fa:16:3e:d8:6f:c0, 
name=tempest-PortsTestJSON-1536709937, network_id=250d134d-a49b-43f4-b0e9-01d6d0fc04e1, port_security_enabled=True, project_id=1808d9ac20f44272b13e28ceb7a5e4c6, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=2, security_groups=['2d16e43b-069c-4513-8db0-4d6b4f60237d', 'fb8ac40a-bf20-49a5-84be-e2fcedacb8f5'], standard_attr_id=1399, status=DOWN, tags=[], tenant_id=1808d9ac20f44272b13e28ceb7a5e4c6, updated_at=2025-12-15T10:03:43Z on network 250d134d-a49b-43f4-b0e9-01d6d0fc04e1#033[00m Dec 15 05:03:45 localhost dnsmasq-dhcp[319639]: DHCPRELEASE(tap10ce6638-0c) 10.100.0.6 fa:16:3e:d8:6f:c0 Dec 15 05:03:45 localhost ceph-mon[298913]: mon.np0005559462@0(leader).osd e132 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Dec 15 05:03:45 localhost ceph-mon[298913]: mon.np0005559462@0(leader).osd e132 do_prune osdmap full prune enabled Dec 15 05:03:45 localhost neutron_dhcp_agent[267542]: 2025-12-15 10:03:45.578 267546 INFO neutron.agent.dhcp.agent [None req-c84e52fa-5aa8-4e11-be43-7f945589b877 - - - - - -] DHCP configuration for ports {'e33b3ea8-c618-43da-a6e7-8ca32ec03a7a', '2a0d26c7-6d34-4d9e-b074-2ed6c50ddadd', 'a8611e3e-b504-4b9b-a3e6-a797f9aa1242', '10ce6638-0ca4-4e8d-aec9-4163a87c1a20'} is completed#033[00m Dec 15 05:03:45 localhost ceph-mon[298913]: mon.np0005559462@0(leader).osd e133 e133: 6 total, 6 up, 6 in Dec 15 05:03:45 localhost ceph-mon[298913]: log_channel(cluster) log [DBG] : osdmap e133: 6 total, 6 up, 6 in Dec 15 05:03:45 localhost dnsmasq[319639]: read /var/lib/neutron/dhcp/250d134d-a49b-43f4-b0e9-01d6d0fc04e1/addn_hosts - 1 addresses Dec 15 05:03:45 localhost podman[319658]: 2025-12-15 10:03:45.901976433 +0000 UTC m=+0.066801093 container kill c82311c20cf01a78120b91984f89c608fb809edb2784dd0c4e949f62d3f937b7 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-250d134d-a49b-43f4-b0e9-01d6d0fc04e1, 
tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2) Dec 15 05:03:45 localhost dnsmasq-dhcp[319639]: read /var/lib/neutron/dhcp/250d134d-a49b-43f4-b0e9-01d6d0fc04e1/host Dec 15 05:03:45 localhost dnsmasq-dhcp[319639]: read /var/lib/neutron/dhcp/250d134d-a49b-43f4-b0e9-01d6d0fc04e1/opts Dec 15 05:03:46 localhost neutron_dhcp_agent[267542]: 2025-12-15 10:03:46.095 267546 INFO neutron.agent.dhcp.agent [None req-65d13f80-565e-478b-9d3a-887b20b158ca - - - - - -] DHCP configuration for ports {'a8611e3e-b504-4b9b-a3e6-a797f9aa1242'} is completed#033[00m Dec 15 05:03:46 localhost podman[319695]: 2025-12-15 10:03:46.304060675 +0000 UTC m=+0.043730165 container kill c82311c20cf01a78120b91984f89c608fb809edb2784dd0c4e949f62d3f937b7 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-250d134d-a49b-43f4-b0e9-01d6d0fc04e1, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true) Dec 15 05:03:46 localhost dnsmasq[319639]: read /var/lib/neutron/dhcp/250d134d-a49b-43f4-b0e9-01d6d0fc04e1/addn_hosts - 0 addresses Dec 15 05:03:46 localhost dnsmasq-dhcp[319639]: read /var/lib/neutron/dhcp/250d134d-a49b-43f4-b0e9-01d6d0fc04e1/host Dec 15 05:03:46 localhost dnsmasq-dhcp[319639]: read /var/lib/neutron/dhcp/250d134d-a49b-43f4-b0e9-01d6d0fc04e1/opts Dec 15 05:03:46 localhost podman[319731]: 2025-12-15 10:03:46.926329597 +0000 UTC m=+0.057880380 container kill 
c82311c20cf01a78120b91984f89c608fb809edb2784dd0c4e949f62d3f937b7 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-250d134d-a49b-43f4-b0e9-01d6d0fc04e1, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team) Dec 15 05:03:46 localhost dnsmasq[319639]: exiting on receipt of SIGTERM Dec 15 05:03:46 localhost systemd[1]: libpod-c82311c20cf01a78120b91984f89c608fb809edb2784dd0c4e949f62d3f937b7.scope: Deactivated successfully. Dec 15 05:03:46 localhost podman[319744]: 2025-12-15 10:03:46.987363874 +0000 UTC m=+0.045059528 container died c82311c20cf01a78120b91984f89c608fb809edb2784dd0c4e949f62d3f937b7 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-250d134d-a49b-43f4-b0e9-01d6d0fc04e1, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2) Dec 15 05:03:47 localhost podman[319744]: 2025-12-15 10:03:47.013598781 +0000 UTC m=+0.071294415 container cleanup c82311c20cf01a78120b91984f89c608fb809edb2784dd0c4e949f62d3f937b7 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-250d134d-a49b-43f4-b0e9-01d6d0fc04e1, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, 
org.label-schema.schema-version=1.0, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb) Dec 15 05:03:47 localhost systemd[1]: libpod-conmon-c82311c20cf01a78120b91984f89c608fb809edb2784dd0c4e949f62d3f937b7.scope: Deactivated successfully. Dec 15 05:03:47 localhost podman[319745]: 2025-12-15 10:03:47.076733951 +0000 UTC m=+0.128018625 container remove c82311c20cf01a78120b91984f89c608fb809edb2784dd0c4e949f62d3f937b7 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-250d134d-a49b-43f4-b0e9-01d6d0fc04e1, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image) Dec 15 05:03:47 localhost systemd[1]: var-lib-containers-storage-overlay-7e69da53e39c86edc0d98389d2dab9b15a42b34708a1a13b06a00c77e2e5770c-merged.mount: Deactivated successfully. Dec 15 05:03:47 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-c82311c20cf01a78120b91984f89c608fb809edb2784dd0c4e949f62d3f937b7-userdata-shm.mount: Deactivated successfully. 
Dec 15 05:03:47 localhost nova_compute[286344]: 2025-12-15 10:03:47.327 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:03:47 localhost podman[319818]: Dec 15 05:03:47 localhost podman[319818]: 2025-12-15 10:03:47.95547077 +0000 UTC m=+0.069929771 container create d0b9d2615b7a5f676b89f5902169608fb9f665cd890553ec1318a8292fb801c0 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-250d134d-a49b-43f4-b0e9-01d6d0fc04e1, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_managed=true, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb) Dec 15 05:03:48 localhost systemd[1]: Started libpod-conmon-d0b9d2615b7a5f676b89f5902169608fb9f665cd890553ec1318a8292fb801c0.scope. Dec 15 05:03:48 localhost systemd[1]: tmp-crun.0oVFJM.mount: Deactivated successfully. Dec 15 05:03:48 localhost podman[319818]: 2025-12-15 10:03:47.921198922 +0000 UTC m=+0.035657943 image pull quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified Dec 15 05:03:48 localhost systemd[1]: Started libcrun container. 
Dec 15 05:03:48 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/81aa91866c543edebd306eaf9d851077db84fe05a7055487a726e31545859ff3/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff) Dec 15 05:03:48 localhost podman[319818]: 2025-12-15 10:03:48.035293678 +0000 UTC m=+0.149752669 container init d0b9d2615b7a5f676b89f5902169608fb9f665cd890553ec1318a8292fb801c0 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-250d134d-a49b-43f4-b0e9-01d6d0fc04e1, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3) Dec 15 05:03:48 localhost dnsmasq[319837]: started, version 2.85 cachesize 150 Dec 15 05:03:48 localhost dnsmasq[319837]: DNS service limited to local subnets Dec 15 05:03:48 localhost dnsmasq[319837]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile Dec 15 05:03:48 localhost dnsmasq[319837]: warning: no upstream servers configured Dec 15 05:03:48 localhost dnsmasq-dhcp[319837]: DHCP, static leases only on 10.100.0.32, lease time 1d Dec 15 05:03:48 localhost dnsmasq-dhcp[319837]: DHCP, static leases only on 10.100.0.16, lease time 1d Dec 15 05:03:48 localhost dnsmasq[319837]: read /var/lib/neutron/dhcp/250d134d-a49b-43f4-b0e9-01d6d0fc04e1/addn_hosts - 0 addresses Dec 15 05:03:48 localhost dnsmasq-dhcp[319837]: read /var/lib/neutron/dhcp/250d134d-a49b-43f4-b0e9-01d6d0fc04e1/host Dec 15 05:03:48 localhost dnsmasq-dhcp[319837]: read /var/lib/neutron/dhcp/250d134d-a49b-43f4-b0e9-01d6d0fc04e1/opts Dec 15 05:03:48 localhost podman[319818]: 2025-12-15 10:03:48.049133374 +0000 UTC 
m=+0.163592355 container start d0b9d2615b7a5f676b89f5902169608fb9f665cd890553ec1318a8292fb801c0 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-250d134d-a49b-43f4-b0e9-01d6d0fc04e1, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, tcib_managed=true, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.vendor=CentOS) Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.124 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359', 'name': 'test', 'flavor': {'id': '2da0e147-aaa7-4bb9-a176-5fe1b15a32a0', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'image': {'id': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-00000002', 'OS-EXT-SRV-ATTR:host': 'np0005559462.localdomain', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': 'c785bf23f53946bc99867d8832a50266', 'user_id': '1ba5fce347b64bfebf995f187193f205', 'hostId': 'bf4d507aa00b3fcf643a7e0b883420632ac7fc96e8c7a756982e121d', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228 Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.125 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.capacity in the context of pollsters Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.138 12 DEBUG ceilometer.compute.pollsters [-] 39ff1bd9-6f6b-44c8-bbec-a1fd9d196359/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 15 05:03:48 localhost 
ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.138 12 DEBUG ceilometer.compute.pollsters [-] 39ff1bd9-6f6b-44c8-bbec-a1fd9d196359/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.140 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '3a0ce260-a409-4436-ab31-76f048676599', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '1ba5fce347b64bfebf995f187193f205', 'user_name': None, 'project_id': 'c785bf23f53946bc99867d8832a50266', 'project_name': None, 'resource_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359-vda', 'timestamp': '2025-12-15T10:03:48.125302', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359', 'instance_type': 'm1.small', 'host': 'bf4d507aa00b3fcf643a7e0b883420632ac7fc96e8c7a756982e121d', 'instance_host': 'np0005559462.localdomain', 'flavor': {'id': '2da0e147-aaa7-4bb9-a176-5fe1b15a32a0', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e'}, 'image_ref': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '5965361a-d99d-11f0-817e-fa163ebaca0f', 'monotonic_time': 12094.317974433, 'message_signature': '1714e4555561d9454e1aef8328639feb8477d6c6b561d6e3fa7973ee3cddb81a'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 
'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '1ba5fce347b64bfebf995f187193f205', 'user_name': None, 'project_id': 'c785bf23f53946bc99867d8832a50266', 'project_name': None, 'resource_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359-vdb', 'timestamp': '2025-12-15T10:03:48.125302', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359', 'instance_type': 'm1.small', 'host': 'bf4d507aa00b3fcf643a7e0b883420632ac7fc96e8c7a756982e121d', 'instance_host': 'np0005559462.localdomain', 'flavor': {'id': '2da0e147-aaa7-4bb9-a176-5fe1b15a32a0', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e'}, 'image_ref': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '5965484e-d99d-11f0-817e-fa163ebaca0f', 'monotonic_time': 12094.317974433, 'message_signature': '0f40c798d001046d3151ffe1d014efb60bf70c74483aaf09e1e7fa74475b14bb'}]}, 'timestamp': '2025-12-15 10:03:48.139105', '_unique_id': '3130fd44bb884643a8949d9b7e8c807d'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.140 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.140 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.140 12 ERROR oslo_messaging.notify.messaging yield Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 
10:03:48.140 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.140 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.140 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.140 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.140 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.140 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.140 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.140 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.140 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.140 12 ERROR oslo_messaging.notify.messaging conn.connect() Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.140 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.140 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.140 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.140 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.140 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.140 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.140 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.140 12 ERROR oslo_messaging.notify.messaging Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.140 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.140 12 ERROR oslo_messaging.notify.messaging Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.140 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.140 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Dec 15 05:03:48 localhost 
ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.140 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.140 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.140 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.140 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.140 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.140 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.140 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.140 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.140 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.140 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.140 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.140 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.140 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.140 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.140 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.140 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.140 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.140 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.140 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.140 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.140 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.140 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.140 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.140 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.140 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.140 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.140 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.140 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.140 12 ERROR oslo_messaging.notify.messaging Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.141 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.allocation in the context of pollsters Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.142 12 DEBUG ceilometer.compute.pollsters [-] 
39ff1bd9-6f6b-44c8-bbec-a1fd9d196359/disk.device.allocation volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.142 12 DEBUG ceilometer.compute.pollsters [-] 39ff1bd9-6f6b-44c8-bbec-a1fd9d196359/disk.device.allocation volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.143 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'fbaf9a13-d720-4fd8-93cc-07aa8fa18477', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '1ba5fce347b64bfebf995f187193f205', 'user_name': None, 'project_id': 'c785bf23f53946bc99867d8832a50266', 'project_name': None, 'resource_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359-vda', 'timestamp': '2025-12-15T10:03:48.142107', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359', 'instance_type': 'm1.small', 'host': 'bf4d507aa00b3fcf643a7e0b883420632ac7fc96e8c7a756982e121d', 'instance_host': 'np0005559462.localdomain', 'flavor': {'id': '2da0e147-aaa7-4bb9-a176-5fe1b15a32a0', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e'}, 'image_ref': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 
'5965d1ce-d99d-11f0-817e-fa163ebaca0f', 'monotonic_time': 12094.317974433, 'message_signature': '3024afee73b4aa90ad2eb5d646b28b9fc0bc722723185a045b473269c69e077e'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '1ba5fce347b64bfebf995f187193f205', 'user_name': None, 'project_id': 'c785bf23f53946bc99867d8832a50266', 'project_name': None, 'resource_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359-vdb', 'timestamp': '2025-12-15T10:03:48.142107', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359', 'instance_type': 'm1.small', 'host': 'bf4d507aa00b3fcf643a7e0b883420632ac7fc96e8c7a756982e121d', 'instance_host': 'np0005559462.localdomain', 'flavor': {'id': '2da0e147-aaa7-4bb9-a176-5fe1b15a32a0', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e'}, 'image_ref': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '5965e1d2-d99d-11f0-817e-fa163ebaca0f', 'monotonic_time': 12094.317974433, 'message_signature': 'a02d156f13a7047a9b22cbd5afccb2502ef9891938be059a5b0ba2d151791911'}]}, 'timestamp': '2025-12-15 10:03:48.142969', '_unique_id': '32b8743d8c944c75bcd047a732b47ee7'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.143 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.143 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in 
_reraise_as_library_errors Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.143 12 ERROR oslo_messaging.notify.messaging yield Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.143 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.143 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.143 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.143 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.143 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.143 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.143 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.143 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.143 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.143 12 
ERROR oslo_messaging.notify.messaging conn.connect() Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.143 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.143 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.143 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.143 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.143 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.143 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.143 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.143 12 ERROR oslo_messaging.notify.messaging Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.143 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.143 12 ERROR oslo_messaging.notify.messaging Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.143 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 
2025-12-15 10:03:48.143 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.143 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.143 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.143 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.143 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.143 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.143 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.143 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.143 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.143 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Dec 15 
05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.143 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.143 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.143 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.143 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.143 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.143 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.143 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.143 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.143 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.143 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Dec 15 05:03:48 localhost 
ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.143 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.143 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.143 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.143 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.143 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.143 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.143 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.143 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.143 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.143 12 ERROR oslo_messaging.notify.messaging Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.145 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets in the context of pollsters Dec 15 05:03:48 
localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.148 12 DEBUG ceilometer.compute.pollsters [-] 39ff1bd9-6f6b-44c8-bbec-a1fd9d196359/network.outgoing.packets volume: 114 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.150 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'c7b5c9e3-7ed9-463b-aa79-756dd904b43d', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 114, 'user_id': '1ba5fce347b64bfebf995f187193f205', 'user_name': None, 'project_id': 'c785bf23f53946bc99867d8832a50266', 'project_name': None, 'resource_id': 'instance-00000002-39ff1bd9-6f6b-44c8-bbec-a1fd9d196359-tap03ef8889-32', 'timestamp': '2025-12-15T10:03:48.145276', 'resource_metadata': {'display_name': 'test', 'name': 'tap03ef8889-32', 'instance_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359', 'instance_type': 'm1.small', 'host': 'bf4d507aa00b3fcf643a7e0b883420632ac7fc96e8c7a756982e121d', 'instance_host': 'np0005559462.localdomain', 'flavor': {'id': '2da0e147-aaa7-4bb9-a176-5fe1b15a32a0', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e'}, 'image_ref': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:74:df:7c', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap03ef8889-32'}, 'message_id': '5966e168-d99d-11f0-817e-fa163ebaca0f', 'monotonic_time': 12094.337948762, 
'message_signature': '88e1b1be18b0bab14dd7b4342d9ceb258ef366d07a90f97d0bd38371f8d36b3d'}]}, 'timestamp': '2025-12-15 10:03:48.149521', '_unique_id': '16bc0ad65ed64b2b85fe23768646a8a0'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.150 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.150 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.150 12 ERROR oslo_messaging.notify.messaging yield Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.150 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.150 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.150 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.150 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.150 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.150 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.150 12 ERROR 
oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.150 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.150 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.150 12 ERROR oslo_messaging.notify.messaging conn.connect() Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.150 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.150 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.150 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.150 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.150 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.150 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.150 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Dec 15 05:03:48 localhost 
ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.150 12 ERROR oslo_messaging.notify.messaging Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.150 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.150 12 ERROR oslo_messaging.notify.messaging Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.150 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.150 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.150 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.150 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.150 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.150 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.150 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.150 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", 
line 653, in _send Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.150 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.150 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.150 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.150 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.150 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.150 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.150 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.150 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.150 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.150 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.150 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.150 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.150 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.150 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.150 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.150 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.150 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.150 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.150 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.150 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Dec 15 05:03:48 localhost 
ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.150 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.150 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.150 12 ERROR oslo_messaging.notify.messaging Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.151 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no new resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193 Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.152 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.delta in the context of pollsters Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.152 12 DEBUG ceilometer.compute.pollsters [-] 39ff1bd9-6f6b-44c8-bbec-a1fd9d196359/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.153 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': '4b7b693f-3bad-4154-94e1-58fd84352ab8', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '1ba5fce347b64bfebf995f187193f205', 'user_name': None, 'project_id': 'c785bf23f53946bc99867d8832a50266', 'project_name': None, 'resource_id': 'instance-00000002-39ff1bd9-6f6b-44c8-bbec-a1fd9d196359-tap03ef8889-32', 'timestamp': '2025-12-15T10:03:48.152212', 'resource_metadata': {'display_name': 'test', 'name': 'tap03ef8889-32', 'instance_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359', 'instance_type': 'm1.small', 'host': 'bf4d507aa00b3fcf643a7e0b883420632ac7fc96e8c7a756982e121d', 'instance_host': 'np0005559462.localdomain', 'flavor': {'id': '2da0e147-aaa7-4bb9-a176-5fe1b15a32a0', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e'}, 'image_ref': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:74:df:7c', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap03ef8889-32'}, 'message_id': '59675c60-d99d-11f0-817e-fa163ebaca0f', 'monotonic_time': 12094.337948762, 'message_signature': 'd082f04785541393c4eda3c7aa87cb471895845eb080eac27d4bcbc91e5477be'}]}, 'timestamp': '2025-12-15 10:03:48.152696', '_unique_id': '87d9cdeb712a47f387d51c61c50fed5a'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.153 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 
2025-12-15 10:03:48.153 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.153 12 ERROR oslo_messaging.notify.messaging yield Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.153 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.153 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.153 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.153 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.153 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.153 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.153 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.153 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.153 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.153 12 ERROR oslo_messaging.notify.messaging conn.connect() Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.153 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.153 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.153 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.153 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.153 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.153 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.153 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.153 12 ERROR oslo_messaging.notify.messaging Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.153 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.153 12 ERROR oslo_messaging.notify.messaging Dec 15 05:03:48 localhost 
ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.153 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.153 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.153 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.153 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.153 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.153 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.153 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.153 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.153 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.153 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection 
Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.153 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.153 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.153 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.153 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.153 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.153 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.153 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.153 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.153 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.153 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Dec 15 05:03:48 
localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.153 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.153 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.153 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.153 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.153 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.153 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.153 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.153 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.153 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.153 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.153 12 ERROR oslo_messaging.notify.messaging Dec 15 05:03:48 localhost 
ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.154 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.bytes in the context of pollsters Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.185 12 DEBUG ceilometer.compute.pollsters [-] 39ff1bd9-6f6b-44c8-bbec-a1fd9d196359/disk.device.write.bytes volume: 389120 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.186 12 DEBUG ceilometer.compute.pollsters [-] 39ff1bd9-6f6b-44c8-bbec-a1fd9d196359/disk.device.write.bytes volume: 512 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.187 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '5a4650c0-2baa-442f-af4f-a298746abcf5', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 389120, 'user_id': '1ba5fce347b64bfebf995f187193f205', 'user_name': None, 'project_id': 'c785bf23f53946bc99867d8832a50266', 'project_name': None, 'resource_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359-vda', 'timestamp': '2025-12-15T10:03:48.154835', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359', 'instance_type': 'm1.small', 'host': 'bf4d507aa00b3fcf643a7e0b883420632ac7fc96e8c7a756982e121d', 'instance_host': 'np0005559462.localdomain', 'flavor': {'id': '2da0e147-aaa7-4bb9-a176-5fe1b15a32a0', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 
'7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e'}, 'image_ref': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '596c7d26-d99d-11f0-817e-fa163ebaca0f', 'monotonic_time': 12094.347526722, 'message_signature': 'f4aea692451824164406bf92962d2208bd03b543c5e289b0c26e65fb25c092b0'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 512, 'user_id': '1ba5fce347b64bfebf995f187193f205', 'user_name': None, 'project_id': 'c785bf23f53946bc99867d8832a50266', 'project_name': None, 'resource_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359-vdb', 'timestamp': '2025-12-15T10:03:48.154835', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359', 'instance_type': 'm1.small', 'host': 'bf4d507aa00b3fcf643a7e0b883420632ac7fc96e8c7a756982e121d', 'instance_host': 'np0005559462.localdomain', 'flavor': {'id': '2da0e147-aaa7-4bb9-a176-5fe1b15a32a0', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e'}, 'image_ref': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '596c8e1a-d99d-11f0-817e-fa163ebaca0f', 'monotonic_time': 12094.347526722, 'message_signature': 'eaf66162ea9ab59e731b4bf17e414b7ea028f635bc07c1b0d125d2ae11e9069c'}]}, 'timestamp': '2025-12-15 10:03:48.186681', '_unique_id': 'ea4f41e3b3da47cab091b45d177065a5'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.187 12 ERROR 
oslo_messaging.notify.messaging Traceback (most recent call last): Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.187 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.187 12 ERROR oslo_messaging.notify.messaging yield Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.187 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.187 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.187 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.187 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.187 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.187 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.187 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.187 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Dec 15 05:03:48 localhost 
ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.187 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.187 12 ERROR oslo_messaging.notify.messaging conn.connect() Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.187 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.187 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.187 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.187 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.187 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.187 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.187 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.187 12 ERROR oslo_messaging.notify.messaging Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.187 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 
2025-12-15 10:03:48.187 12 ERROR oslo_messaging.notify.messaging Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.187 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.187 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.187 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.187 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.187 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.187 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.187 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.187 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.187 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.187 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.187 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.187 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.187 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.187 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.187 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.187 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.187 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.187 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.187 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.187 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.187 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.187 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.187 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.187 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.187 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.187 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.187 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.187 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.187 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.187 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 15 05:03:48 localhost 
ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.187 12 ERROR oslo_messaging.notify.messaging Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.188 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.delta in the context of pollsters Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.188 12 DEBUG ceilometer.compute.pollsters [-] 39ff1bd9-6f6b-44c8-bbec-a1fd9d196359/network.incoming.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.190 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '86ef5c0a-1042-4686-bd7f-449c6c938645', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '1ba5fce347b64bfebf995f187193f205', 'user_name': None, 'project_id': 'c785bf23f53946bc99867d8832a50266', 'project_name': None, 'resource_id': 'instance-00000002-39ff1bd9-6f6b-44c8-bbec-a1fd9d196359-tap03ef8889-32', 'timestamp': '2025-12-15T10:03:48.188888', 'resource_metadata': {'display_name': 'test', 'name': 'tap03ef8889-32', 'instance_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359', 'instance_type': 'm1.small', 'host': 'bf4d507aa00b3fcf643a7e0b883420632ac7fc96e8c7a756982e121d', 'instance_host': 'np0005559462.localdomain', 'flavor': {'id': '2da0e147-aaa7-4bb9-a176-5fe1b15a32a0', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e'}, 'image_ref': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 
'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:74:df:7c', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap03ef8889-32'}, 'message_id': '596cf670-d99d-11f0-817e-fa163ebaca0f', 'monotonic_time': 12094.337948762, 'message_signature': '39ab66a216635848cfa41a2955161183f19ac9353e18393910a509220e233613'}]}, 'timestamp': '2025-12-15 10:03:48.189375', '_unique_id': '4f77652a6ca04c0eb8bc7cf756e978f5'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.190 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.190 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.190 12 ERROR oslo_messaging.notify.messaging yield Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.190 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.190 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.190 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.190 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.190 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", 
line 877, in _connection_factory Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.190 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.190 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.190 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.190 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.190 12 ERROR oslo_messaging.notify.messaging conn.connect() Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.190 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.190 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.190 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.190 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.190 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.190 12 
ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.190 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.190 12 ERROR oslo_messaging.notify.messaging Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.190 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.190 12 ERROR oslo_messaging.notify.messaging Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.190 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.190 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.190 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.190 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.190 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.190 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.190 12 ERROR 
oslo_messaging.notify.messaging return self._send(target, ctxt, message, Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.190 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.190 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.190 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.190 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.190 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.190 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.190 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.190 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.190 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.190 12 ERROR 
oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.190 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.190 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.190 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.190 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.190 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.190 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.190 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.190 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.190 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.190 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Dec 15 05:03:48 
localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.190 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.190 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.190 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.190 12 ERROR oslo_messaging.notify.messaging Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.191 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.bytes in the context of pollsters Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.191 12 DEBUG ceilometer.compute.pollsters [-] 39ff1bd9-6f6b-44c8-bbec-a1fd9d196359/disk.device.read.bytes volume: 35560448 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.191 12 DEBUG ceilometer.compute.pollsters [-] 39ff1bd9-6f6b-44c8-bbec-a1fd9d196359/disk.device.read.bytes volume: 2154496 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.193 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': 'f288d648-2218-45f3-898b-00d202715381', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 35560448, 'user_id': '1ba5fce347b64bfebf995f187193f205', 'user_name': None, 'project_id': 'c785bf23f53946bc99867d8832a50266', 'project_name': None, 'resource_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359-vda', 'timestamp': '2025-12-15T10:03:48.191467', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359', 'instance_type': 'm1.small', 'host': 'bf4d507aa00b3fcf643a7e0b883420632ac7fc96e8c7a756982e121d', 'instance_host': 'np0005559462.localdomain', 'flavor': {'id': '2da0e147-aaa7-4bb9-a176-5fe1b15a32a0', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e'}, 'image_ref': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '596d598a-d99d-11f0-817e-fa163ebaca0f', 'monotonic_time': 12094.347526722, 'message_signature': '7dae519f8ab2e19b7d73659a1c4a1f6c051356e51db2ae6df96b00586ddadd17'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 2154496, 'user_id': '1ba5fce347b64bfebf995f187193f205', 'user_name': None, 'project_id': 'c785bf23f53946bc99867d8832a50266', 'project_name': None, 'resource_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359-vdb', 'timestamp': '2025-12-15T10:03:48.191467', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 
'39ff1bd9-6f6b-44c8-bbec-a1fd9d196359', 'instance_type': 'm1.small', 'host': 'bf4d507aa00b3fcf643a7e0b883420632ac7fc96e8c7a756982e121d', 'instance_host': 'np0005559462.localdomain', 'flavor': {'id': '2da0e147-aaa7-4bb9-a176-5fe1b15a32a0', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e'}, 'image_ref': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '596d6aec-d99d-11f0-817e-fa163ebaca0f', 'monotonic_time': 12094.347526722, 'message_signature': '3300d9cdd41b6af2d2dfa56c729f80a7054797d954fccca5c1778d1bb6efe326'}]}, 'timestamp': '2025-12-15 10:03:48.192329', '_unique_id': '2b427e2a764a4e888770f5933ec13df1'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.193 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.193 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.193 12 ERROR oslo_messaging.notify.messaging yield Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.193 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.193 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.193 12 ERROR 
oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.193 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.193 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.193 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.193 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.193 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.193 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.193 12 ERROR oslo_messaging.notify.messaging conn.connect() Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.193 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.193 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.193 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Dec 15 
05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.193 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.193 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.193 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.193 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.193 12 ERROR oslo_messaging.notify.messaging Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.193 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.193 12 ERROR oslo_messaging.notify.messaging Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.193 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.193 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.193 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.193 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Dec 15 05:03:48 localhost 
ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.193 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.193 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.193 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.193 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.193 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.193 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.193 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.193 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.193 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.193 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, 
in get Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.193 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.193 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.193 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.193 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.193 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.193 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.193 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.193 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.193 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.193 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 15 05:03:48 localhost 
ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.193 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.193 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.193 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.193 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.193 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.193 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.193 12 ERROR oslo_messaging.notify.messaging Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.194 12 INFO ceilometer.polling.manager [-] Polling pollster memory.usage in the context of pollsters Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.217 12 DEBUG ceilometer.compute.pollsters [-] 39ff1bd9-6f6b-44c8-bbec-a1fd9d196359/memory.usage volume: 51.73828125 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.219 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': '433a6a29-2f82-4875-831a-46d571bb135b', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'memory.usage', 'counter_type': 'gauge', 'counter_unit': 'MB', 'counter_volume': 51.73828125, 'user_id': '1ba5fce347b64bfebf995f187193f205', 'user_name': None, 'project_id': 'c785bf23f53946bc99867d8832a50266', 'project_name': None, 'resource_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359', 'timestamp': '2025-12-15T10:03:48.194431', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359', 'instance_type': 'm1.small', 'host': 'bf4d507aa00b3fcf643a7e0b883420632ac7fc96e8c7a756982e121d', 'instance_host': 'np0005559462.localdomain', 'flavor': {'id': '2da0e147-aaa7-4bb9-a176-5fe1b15a32a0', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e'}, 'image_ref': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0}, 'message_id': '5971635e-d99d-11f0-817e-fa163ebaca0f', 'monotonic_time': 12094.410359055, 'message_signature': '27e59365c8d6552356bbbb188c87f5d3f850e325f44fa62f1b3d5b3679ded1a3'}]}, 'timestamp': '2025-12-15 10:03:48.218419', '_unique_id': '8f109c1b5e974d83bc5c8e40e607eba3'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.219 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.219 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in 
_reraise_as_library_errors Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.219 12 ERROR oslo_messaging.notify.messaging yield Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.219 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.219 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.219 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.219 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.219 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.219 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.219 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.219 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.219 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.219 12 
ERROR oslo_messaging.notify.messaging conn.connect()
Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.219 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.219 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.219 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.219 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.219 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.219 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.219 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.219 12 ERROR oslo_messaging.notify.messaging
Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.219 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.219 12 ERROR oslo_messaging.notify.messaging
Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.219 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.219 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.219 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.219 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.219 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.219 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.219 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.219 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.219 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.219 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.219 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.219 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.219 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.219 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.219 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.219 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.219 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.219 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.219 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.219 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.219 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.219 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.219 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.219 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.219 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.219 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.219 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.219 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.219 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.219 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.219 12 ERROR oslo_messaging.notify.messaging
Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.220 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.drop in the context of pollsters
Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.221 12 DEBUG ceilometer.compute.pollsters [-] 39ff1bd9-6f6b-44c8-bbec-a1fd9d196359/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.222 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '68fed64c-1b9e-4963-8d05-92955162908e', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '1ba5fce347b64bfebf995f187193f205', 'user_name': None, 'project_id': 'c785bf23f53946bc99867d8832a50266', 'project_name': None, 'resource_id': 'instance-00000002-39ff1bd9-6f6b-44c8-bbec-a1fd9d196359-tap03ef8889-32', 'timestamp': '2025-12-15T10:03:48.220974', 'resource_metadata': {'display_name': 'test', 'name': 'tap03ef8889-32', 'instance_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359', 'instance_type': 'm1.small', 'host': 'bf4d507aa00b3fcf643a7e0b883420632ac7fc96e8c7a756982e121d', 'instance_host': 'np0005559462.localdomain', 'flavor': {'id': '2da0e147-aaa7-4bb9-a176-5fe1b15a32a0', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e'}, 'image_ref': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:74:df:7c', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap03ef8889-32'}, 'message_id': '5971dd66-d99d-11f0-817e-fa163ebaca0f', 'monotonic_time': 12094.337948762, 'message_signature': 'a8c7377637fd9cb347c37f93b9e95af43d38049ddd0b1125aee829b07c8fd6ea'}]}, 'timestamp': '2025-12-15 10:03:48.221511', '_unique_id': '5d50ac2b1ac649b4a3e590a78c2c6535'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.222 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.222 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.222 12 ERROR oslo_messaging.notify.messaging yield
Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.222 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.222 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.222 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.222 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.222 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.222 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.222 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.222 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.222 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.222 12 ERROR oslo_messaging.notify.messaging conn.connect()
Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.222 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.222 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.222 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.222 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.222 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.222 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.222 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.222 12 ERROR oslo_messaging.notify.messaging
Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.222 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.222 12 ERROR oslo_messaging.notify.messaging
Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.222 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.222 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.222 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.222 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.222 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.222 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.222 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.222 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.222 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.222 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.222 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.222 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.222 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.222 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.222 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.222 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.222 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.222 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.222 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.222 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.222 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.222 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.222 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.222 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.222 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.222 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.222 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.222 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.222 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.222 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.222 12 ERROR oslo_messaging.notify.messaging
Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.223 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.error in the context of pollsters
Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.223 12 DEBUG ceilometer.compute.pollsters [-] 39ff1bd9-6f6b-44c8-bbec-a1fd9d196359/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.225 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '739acda4-663f-4d17-a43a-e94629d2e887', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '1ba5fce347b64bfebf995f187193f205', 'user_name': None, 'project_id': 'c785bf23f53946bc99867d8832a50266', 'project_name': None, 'resource_id': 'instance-00000002-39ff1bd9-6f6b-44c8-bbec-a1fd9d196359-tap03ef8889-32', 'timestamp': '2025-12-15T10:03:48.223782', 'resource_metadata': {'display_name': 'test', 'name': 'tap03ef8889-32', 'instance_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359', 'instance_type': 'm1.small', 'host': 'bf4d507aa00b3fcf643a7e0b883420632ac7fc96e8c7a756982e121d', 'instance_host': 'np0005559462.localdomain', 'flavor': {'id': '2da0e147-aaa7-4bb9-a176-5fe1b15a32a0', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e'}, 'image_ref': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:74:df:7c', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap03ef8889-32'}, 'message_id': '59724b20-d99d-11f0-817e-fa163ebaca0f', 'monotonic_time': 12094.337948762, 'message_signature': '2c52b05e88f072cc5904cd2b1b54a9869b09651e6becef4d229bc43051fb1d42'}]}, 'timestamp': '2025-12-15 10:03:48.224319', '_unique_id': '9858ab56af3a4d1cba7536c0fbec5cd9'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.225 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.225 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.225 12 ERROR oslo_messaging.notify.messaging yield
Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.225 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.225 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.225 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.225 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.225 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.225 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.225 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.225 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.225 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.225 12 ERROR oslo_messaging.notify.messaging conn.connect()
Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.225 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.225 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.225 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.225 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.225 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.225 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.225 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.225 12 ERROR oslo_messaging.notify.messaging
Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.225 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.225 12 ERROR oslo_messaging.notify.messaging
Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.225 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.225 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.225 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.225 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.225 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.225 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.225 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.225 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.225 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.225 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.225 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.225 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.225 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.225 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.225 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.225 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.225 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.225 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.225 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.225 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.225 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.225 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.225 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.225 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.225 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.225 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.225 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.225 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.225 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.225 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.225 12 ERROR oslo_messaging.notify.messaging
Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.226 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes in the context of pollsters
Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.226 12 DEBUG ceilometer.compute.pollsters [-] 39ff1bd9-6f6b-44c8-bbec-a1fd9d196359/network.outgoing.bytes volume: 9770 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.228 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '4fc6db16-0fac-4323-8a64-f2261b2ab233', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 9770, 'user_id': '1ba5fce347b64bfebf995f187193f205', 'user_name': None, 'project_id': 'c785bf23f53946bc99867d8832a50266', 'project_name': None, 'resource_id': 'instance-00000002-39ff1bd9-6f6b-44c8-bbec-a1fd9d196359-tap03ef8889-32', 'timestamp': '2025-12-15T10:03:48.226523', 'resource_metadata': {'display_name': 'test', 'name': 'tap03ef8889-32', 'instance_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359', 'instance_type': 'm1.small', 'host': 'bf4d507aa00b3fcf643a7e0b883420632ac7fc96e8c7a756982e121d', 'instance_host': 'np0005559462.localdomain', 'flavor': {'id': '2da0e147-aaa7-4bb9-a176-5fe1b15a32a0', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e'}, 'image_ref': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:74:df:7c', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap03ef8889-32'}, 'message_id': '5972b376-d99d-11f0-817e-fa163ebaca0f', 'monotonic_time': 12094.337948762, 'message_signature': 'af29d925188b3165130dc5d190d133b29c477e6c4f0acbea196872573127701c'}]}, 'timestamp': '2025-12-15 10:03:48.227047', '_unique_id': '9ad01e4711174c62a3f5ed7fe541e35b'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.228 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.228 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.228 12 ERROR oslo_messaging.notify.messaging yield
Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.228 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.228 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.228 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.228 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.228 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.228 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.228 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.228 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.228 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.228 12 ERROR oslo_messaging.notify.messaging conn.connect()
Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.228 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.228 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.228 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.228 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.228 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.228 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.228 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.228 12 ERROR oslo_messaging.notify.messaging
Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.228 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.228 12 ERROR oslo_messaging.notify.messaging
Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.228 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.228 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.228 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.228 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.228 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.228 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.228 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Dec 15 05:03:48 localhost
ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.228 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.228 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.228 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.228 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.228 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.228 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.228 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.228 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.228 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.228 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Dec 15 
05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.228 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.228 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.228 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.228 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.228 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.228 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.228 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.228 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.228 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.228 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.228 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.228 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.228 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.228 12 ERROR oslo_messaging.notify.messaging Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.229 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no new resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193 Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.229 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets in the context of pollsters Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.229 12 DEBUG ceilometer.compute.pollsters [-] 39ff1bd9-6f6b-44c8-bbec-a1fd9d196359/network.incoming.packets volume: 60 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.231 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': '20dfe31d-686d-4713-a32a-ef414ee1f0ed', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 60, 'user_id': '1ba5fce347b64bfebf995f187193f205', 'user_name': None, 'project_id': 'c785bf23f53946bc99867d8832a50266', 'project_name': None, 'resource_id': 'instance-00000002-39ff1bd9-6f6b-44c8-bbec-a1fd9d196359-tap03ef8889-32', 'timestamp': '2025-12-15T10:03:48.229528', 'resource_metadata': {'display_name': 'test', 'name': 'tap03ef8889-32', 'instance_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359', 'instance_type': 'm1.small', 'host': 'bf4d507aa00b3fcf643a7e0b883420632ac7fc96e8c7a756982e121d', 'instance_host': 'np0005559462.localdomain', 'flavor': {'id': '2da0e147-aaa7-4bb9-a176-5fe1b15a32a0', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e'}, 'image_ref': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:74:df:7c', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap03ef8889-32'}, 'message_id': '597328ba-d99d-11f0-817e-fa163ebaca0f', 'monotonic_time': 12094.337948762, 'message_signature': '7b7e4275146c21a16cb9a412d3090c6b2129ac1907d799f2e3b845d63a20d6e2'}]}, 'timestamp': '2025-12-15 10:03:48.230032', '_unique_id': '6e288c83d98b43ca8a5ecc339b27c8bf'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.231 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 15 05:03:48 localhost 
ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.231 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.231 12 ERROR oslo_messaging.notify.messaging yield Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.231 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.231 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.231 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.231 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.231 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.231 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.231 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.231 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.231 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.231 12 ERROR oslo_messaging.notify.messaging conn.connect() Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.231 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.231 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.231 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.231 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.231 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.231 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.231 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.231 12 ERROR oslo_messaging.notify.messaging Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.231 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.231 12 ERROR oslo_messaging.notify.messaging Dec 15 05:03:48 localhost 
ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.231 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.231 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.231 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.231 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.231 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.231 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.231 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.231 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.231 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.231 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection 
Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.231 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.231 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.231 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.231 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.231 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.231 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.231 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.231 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.231 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.231 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Dec 15 05:03:48 
localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.231 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.231 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.231 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.231 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.231 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.231 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.231 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.231 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.231 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.231 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.231 12 ERROR oslo_messaging.notify.messaging Dec 15 05:03:48 localhost 
ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.231 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.error in the context of pollsters Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.231 12 DEBUG ceilometer.compute.pollsters [-] 39ff1bd9-6f6b-44c8-bbec-a1fd9d196359/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.232 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '44f20d4a-cd30-4369-9801-6d1797953e8f', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '1ba5fce347b64bfebf995f187193f205', 'user_name': None, 'project_id': 'c785bf23f53946bc99867d8832a50266', 'project_name': None, 'resource_id': 'instance-00000002-39ff1bd9-6f6b-44c8-bbec-a1fd9d196359-tap03ef8889-32', 'timestamp': '2025-12-15T10:03:48.231898', 'resource_metadata': {'display_name': 'test', 'name': 'tap03ef8889-32', 'instance_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359', 'instance_type': 'm1.small', 'host': 'bf4d507aa00b3fcf643a7e0b883420632ac7fc96e8c7a756982e121d', 'instance_host': 'np0005559462.localdomain', 'flavor': {'id': '2da0e147-aaa7-4bb9-a176-5fe1b15a32a0', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e'}, 'image_ref': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:74:df:7c', 
'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap03ef8889-32'}, 'message_id': '597383f0-d99d-11f0-817e-fa163ebaca0f', 'monotonic_time': 12094.337948762, 'message_signature': 'e480d7f0c5a0c0b07744cf42dc9baf0c76ac67b4a19b1652b51060e3e41d5655'}]}, 'timestamp': '2025-12-15 10:03:48.232245', '_unique_id': '513570c2609c4552bd8e3e46f5ff635c'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.232 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.232 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.232 12 ERROR oslo_messaging.notify.messaging yield Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.232 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.232 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.232 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.232 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.232 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 
10:03:48.232 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.232 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.232 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.232 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.232 12 ERROR oslo_messaging.notify.messaging conn.connect() Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.232 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.232 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.232 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.232 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.232 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.232 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Dec 15 05:03:48 localhost 
ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.232 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.232 12 ERROR oslo_messaging.notify.messaging Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.232 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.232 12 ERROR oslo_messaging.notify.messaging Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.232 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.232 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.232 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.232 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.232 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.232 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.232 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Dec 15 05:03:48 
Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.232 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.232 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.232 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.232 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.232 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.232 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.232 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.232 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.232 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.232 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.232 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.232 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.232 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.232 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.232 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.232 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.232 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.232 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.232 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.232 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.232 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.232 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.232 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.232 12 ERROR oslo_messaging.notify.messaging
Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.233 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.latency in the context of pollsters
Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.233 12 DEBUG ceilometer.compute.pollsters [-] 39ff1bd9-6f6b-44c8-bbec-a1fd9d196359/disk.device.write.latency volume: 1243487016 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.234 12 DEBUG ceilometer.compute.pollsters [-] 39ff1bd9-6f6b-44c8-bbec-a1fd9d196359/disk.device.write.latency volume: 24779175 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.234 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '390f87b5-1b5f-4a48-b6de-216ef61b6360', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 1243487016, 'user_id': '1ba5fce347b64bfebf995f187193f205', 'user_name': None, 'project_id': 'c785bf23f53946bc99867d8832a50266', 'project_name': None, 'resource_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359-vda', 'timestamp': '2025-12-15T10:03:48.233722', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359', 'instance_type': 'm1.small', 'host': 'bf4d507aa00b3fcf643a7e0b883420632ac7fc96e8c7a756982e121d', 'instance_host': 'np0005559462.localdomain', 'flavor': {'id': '2da0e147-aaa7-4bb9-a176-5fe1b15a32a0', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e'}, 'image_ref': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '5973c9d2-d99d-11f0-817e-fa163ebaca0f', 'monotonic_time': 12094.347526722, 'message_signature': 'f1667e0d22b490525ac6a4a61d5a537d3f12ef6fa7eb7c8fc73e89372b8dac6b'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 24779175, 'user_id': '1ba5fce347b64bfebf995f187193f205', 'user_name': None, 'project_id': 'c785bf23f53946bc99867d8832a50266', 'project_name': None, 'resource_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359-vdb', 'timestamp': '2025-12-15T10:03:48.233722', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359', 'instance_type': 'm1.small', 'host': 'bf4d507aa00b3fcf643a7e0b883420632ac7fc96e8c7a756982e121d', 'instance_host': 'np0005559462.localdomain', 'flavor': {'id': '2da0e147-aaa7-4bb9-a176-5fe1b15a32a0', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e'}, 'image_ref': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '5973d544-d99d-11f0-817e-fa163ebaca0f', 'monotonic_time': 12094.347526722, 'message_signature': 'c22473459d43b8775c07188565687172a8007e52a57a778f28dd805d94b35466'}]}, 'timestamp': '2025-12-15 10:03:48.234308', '_unique_id': '2048cffb345b43d191cd71f7c11db9f6'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.234 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.234 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.234 12 ERROR oslo_messaging.notify.messaging yield
Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.234 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.234 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.234 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.234 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.234 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.234 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.234 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.234 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.234 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.234 12 ERROR oslo_messaging.notify.messaging conn.connect()
Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.234 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.234 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.234 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.234 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.234 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.234 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.234 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.234 12 ERROR oslo_messaging.notify.messaging
Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.234 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.234 12 ERROR oslo_messaging.notify.messaging
Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.234 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.234 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.234 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.234 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.234 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.234 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.234 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.234 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.234 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.234 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.234 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.234 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.234 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.234 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.234 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.234 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.234 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.234 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.234 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.234 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.234 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.234 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.234 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.234 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.234 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.234 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.234 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.234 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.234 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.234 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.234 12 ERROR oslo_messaging.notify.messaging
Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.235 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.requests in the context of pollsters
Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.235 12 DEBUG ceilometer.compute.pollsters [-] 39ff1bd9-6f6b-44c8-bbec-a1fd9d196359/disk.device.read.requests volume: 1272 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.236 12 DEBUG ceilometer.compute.pollsters [-] 39ff1bd9-6f6b-44c8-bbec-a1fd9d196359/disk.device.read.requests volume: 124 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.237 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'a563c177-6564-4c58-9e9a-864acb3e2830', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1272, 'user_id': '1ba5fce347b64bfebf995f187193f205', 'user_name': None, 'project_id': 'c785bf23f53946bc99867d8832a50266', 'project_name': None, 'resource_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359-vda', 'timestamp': '2025-12-15T10:03:48.235793', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359', 'instance_type': 'm1.small', 'host': 'bf4d507aa00b3fcf643a7e0b883420632ac7fc96e8c7a756982e121d', 'instance_host': 'np0005559462.localdomain', 'flavor': {'id': '2da0e147-aaa7-4bb9-a176-5fe1b15a32a0', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e'}, 'image_ref': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '59741b30-d99d-11f0-817e-fa163ebaca0f', 'monotonic_time': 12094.347526722, 'message_signature': '563f889ee5b2dc4ba8c0e4ff97cfc779320d6ab5a5bb28ec4f351def6c5b31dc'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 124, 'user_id': '1ba5fce347b64bfebf995f187193f205', 'user_name': None, 'project_id': 'c785bf23f53946bc99867d8832a50266', 'project_name': None, 'resource_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359-vdb', 'timestamp': '2025-12-15T10:03:48.235793', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359', 'instance_type': 'm1.small', 'host': 'bf4d507aa00b3fcf643a7e0b883420632ac7fc96e8c7a756982e121d', 'instance_host': 'np0005559462.localdomain', 'flavor': {'id': '2da0e147-aaa7-4bb9-a176-5fe1b15a32a0', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e'}, 'image_ref': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '597426a2-d99d-11f0-817e-fa163ebaca0f', 'monotonic_time': 12094.347526722, 'message_signature': '01dce2fbad00cf2ecdec52fb44c575b88b398419c93d7ed14a51722aea6460b7'}]}, 'timestamp': '2025-12-15 10:03:48.236387', '_unique_id': '7c1ea3fa8fea4b58a71b554f75d056bc'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.237 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.237 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.237 12 ERROR oslo_messaging.notify.messaging yield
Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.237 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.237 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.237 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.237 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.237 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.237 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.237 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.237 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.237 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.237 12 ERROR oslo_messaging.notify.messaging conn.connect()
Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.237 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.237 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.237 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.237 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.237 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.237 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.237 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.237 12 ERROR oslo_messaging.notify.messaging
Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.237 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.237 12 ERROR oslo_messaging.notify.messaging
Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.237 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.237 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.237 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.237 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.237 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.237 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.237 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.237 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.237 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.237 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.237 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.237 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.237 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.237 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.237 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.237 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.237 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.237 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.237 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.237 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.237 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.237 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.237 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.237 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.237 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.237 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.237 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.237 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.237 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.237 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.237 12 ERROR oslo_messaging.notify.messaging
Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.238 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.drop in the context of pollsters
Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.238 12 DEBUG ceilometer.compute.pollsters [-] 39ff1bd9-6f6b-44c8-bbec-a1fd9d196359/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.239 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '289622f5-a664-4477-9dda-c61244bd93c3', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '1ba5fce347b64bfebf995f187193f205', 'user_name': None, 'project_id': 'c785bf23f53946bc99867d8832a50266', 'project_name': None, 'resource_id': 'instance-00000002-39ff1bd9-6f6b-44c8-bbec-a1fd9d196359-tap03ef8889-32', 'timestamp': '2025-12-15T10:03:48.238375', 'resource_metadata': {'display_name': 'test', 'name': 'tap03ef8889-32', 'instance_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359', 'instance_type': 'm1.small', 'host': 'bf4d507aa00b3fcf643a7e0b883420632ac7fc96e8c7a756982e121d', 'instance_host': 'np0005559462.localdomain', 'flavor': {'id': '2da0e147-aaa7-4bb9-a176-5fe1b15a32a0', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e'}, 'image_ref': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:74:df:7c', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap03ef8889-32'}, 'message_id': '59747f8a-d99d-11f0-817e-fa163ebaca0f', 'monotonic_time': 12094.337948762, 'message_signature': '09bf159c7ca19a28863b5d76dd88eaff39df823ae906e1f5258af701b68726e8'}]}, 'timestamp': '2025-12-15 10:03:48.238686', '_unique_id': '8e7dc384ae12474398c039abd0fdd72e'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.239 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.239 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.239 12 ERROR oslo_messaging.notify.messaging yield
Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.239 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.239 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.239 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.239 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.239 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.239 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.239 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.239 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.239 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.239 12 ERROR oslo_messaging.notify.messaging conn.connect()
Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.239 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.239 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.239 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.239 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.239 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.239 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.239 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.239 12 ERROR oslo_messaging.notify.messaging
Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.239 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.239 12 ERROR oslo_messaging.notify.messaging
Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.239 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.239 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.239 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.239 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.239 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.239 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.239 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.239 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.239 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.239 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.239 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.239 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.239 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.239 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.239 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.239 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.239 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.239 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.239 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.239 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Dec 15 05:03:48 
localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.239 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.239 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.239 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.239 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.239 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.239 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.239 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.239 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.239 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.239 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.239 12 ERROR oslo_messaging.notify.messaging Dec 15 05:03:48 localhost 
ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.240 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.requests in the context of pollsters Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.240 12 DEBUG ceilometer.compute.pollsters [-] 39ff1bd9-6f6b-44c8-bbec-a1fd9d196359/disk.device.write.requests volume: 47 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.240 12 DEBUG ceilometer.compute.pollsters [-] 39ff1bd9-6f6b-44c8-bbec-a1fd9d196359/disk.device.write.requests volume: 1 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.241 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '36db20aa-3ecb-4e45-8472-56a74a8eb315', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 47, 'user_id': '1ba5fce347b64bfebf995f187193f205', 'user_name': None, 'project_id': 'c785bf23f53946bc99867d8832a50266', 'project_name': None, 'resource_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359-vda', 'timestamp': '2025-12-15T10:03:48.240333', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359', 'instance_type': 'm1.small', 'host': 'bf4d507aa00b3fcf643a7e0b883420632ac7fc96e8c7a756982e121d', 'instance_host': 'np0005559462.localdomain', 'flavor': {'id': '2da0e147-aaa7-4bb9-a176-5fe1b15a32a0', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 
'7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e'}, 'image_ref': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '5974cc38-d99d-11f0-817e-fa163ebaca0f', 'monotonic_time': 12094.347526722, 'message_signature': '6eaf520dc8ce840a263927cd6e20b77cf66b5b3da750a7bd5b4894d2bb3b5d14'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1, 'user_id': '1ba5fce347b64bfebf995f187193f205', 'user_name': None, 'project_id': 'c785bf23f53946bc99867d8832a50266', 'project_name': None, 'resource_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359-vdb', 'timestamp': '2025-12-15T10:03:48.240333', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359', 'instance_type': 'm1.small', 'host': 'bf4d507aa00b3fcf643a7e0b883420632ac7fc96e8c7a756982e121d', 'instance_host': 'np0005559462.localdomain', 'flavor': {'id': '2da0e147-aaa7-4bb9-a176-5fe1b15a32a0', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e'}, 'image_ref': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '5974d70a-d99d-11f0-817e-fa163ebaca0f', 'monotonic_time': 12094.347526722, 'message_signature': '3f164bf354b5c1516c8d416e0d790c27c87a8ad16c0761ddcb0d349600c99991'}]}, 'timestamp': '2025-12-15 10:03:48.240906', '_unique_id': '956d2d9b67dc4aeb847a2468a47196c5'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.241 12 
ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.241 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.241 12 ERROR oslo_messaging.notify.messaging yield Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.241 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.241 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.241 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.241 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.241 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.241 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.241 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.241 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Dec 15 05:03:48 localhost 
ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.241 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.241 12 ERROR oslo_messaging.notify.messaging conn.connect() Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.241 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.241 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.241 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.241 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.241 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.241 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.241 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.241 12 ERROR oslo_messaging.notify.messaging Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.241 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 
2025-12-15 10:03:48.241 12 ERROR oslo_messaging.notify.messaging Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.241 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.241 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.241 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.241 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.241 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.241 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.241 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.241 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.241 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.241 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.241 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.241 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.241 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.241 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.241 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.241 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.241 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.241 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.241 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.241 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.241 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.241 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.241 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.241 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.241 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.241 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.241 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.241 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.241 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.241 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 15 05:03:48 localhost 
ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.241 12 ERROR oslo_messaging.notify.messaging Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.242 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.latency in the context of pollsters Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.242 12 DEBUG ceilometer.compute.pollsters [-] 39ff1bd9-6f6b-44c8-bbec-a1fd9d196359/disk.device.read.latency volume: 1342134926 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.242 12 DEBUG ceilometer.compute.pollsters [-] 39ff1bd9-6f6b-44c8-bbec-a1fd9d196359/disk.device.read.latency volume: 123356132 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.243 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': 'ce6a9ebc-e3ac-432e-8be7-1e0b95a2f293', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 1342134926, 'user_id': '1ba5fce347b64bfebf995f187193f205', 'user_name': None, 'project_id': 'c785bf23f53946bc99867d8832a50266', 'project_name': None, 'resource_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359-vda', 'timestamp': '2025-12-15T10:03:48.242449', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359', 'instance_type': 'm1.small', 'host': 'bf4d507aa00b3fcf643a7e0b883420632ac7fc96e8c7a756982e121d', 'instance_host': 'np0005559462.localdomain', 'flavor': {'id': '2da0e147-aaa7-4bb9-a176-5fe1b15a32a0', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e'}, 'image_ref': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '59751e68-d99d-11f0-817e-fa163ebaca0f', 'monotonic_time': 12094.347526722, 'message_signature': '1aedb0dde46e3d5e6d826422acaa8f0e69b7f0c2e692f7c3dcbc2610581f2c88'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 123356132, 'user_id': '1ba5fce347b64bfebf995f187193f205', 'user_name': None, 'project_id': 'c785bf23f53946bc99867d8832a50266', 'project_name': None, 'resource_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359-vdb', 'timestamp': '2025-12-15T10:03:48.242449', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 
'39ff1bd9-6f6b-44c8-bbec-a1fd9d196359', 'instance_type': 'm1.small', 'host': 'bf4d507aa00b3fcf643a7e0b883420632ac7fc96e8c7a756982e121d', 'instance_host': 'np0005559462.localdomain', 'flavor': {'id': '2da0e147-aaa7-4bb9-a176-5fe1b15a32a0', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e'}, 'image_ref': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '597528d6-d99d-11f0-817e-fa163ebaca0f', 'monotonic_time': 12094.347526722, 'message_signature': '7ece8c5440970b13b90e3e0e8ab35886683c906035a04486b8f520fbb9a077ab'}]}, 'timestamp': '2025-12-15 10:03:48.243049', '_unique_id': '4e7b12147ef5489997b9d60f620fce53'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.243 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.243 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.243 12 ERROR oslo_messaging.notify.messaging yield Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.243 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.243 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.243 12 ERROR 
oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.243 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.243 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.243 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.243 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.243 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.243 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.243 12 ERROR oslo_messaging.notify.messaging conn.connect() Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.243 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.243 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.243 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Dec 15 
05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.243 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.243 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.243 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.243 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.243 12 ERROR oslo_messaging.notify.messaging Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.243 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.243 12 ERROR oslo_messaging.notify.messaging Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.243 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.243 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.243 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.243 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Dec 15 05:03:48 localhost 
ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.243 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.243 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.243 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.243 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.243 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.243 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.243 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.243 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.243 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.243 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.243 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.243 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.243 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.243 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.243 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.243 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.243 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.243 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.243 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.243 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.243 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.243 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.243 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.243 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.243 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.243 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.243 12 ERROR oslo_messaging.notify.messaging
Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.244 12 INFO ceilometer.polling.manager [-] Polling pollster cpu in the context of pollsters
Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.244 12 DEBUG ceilometer.compute.pollsters [-] 39ff1bd9-6f6b-44c8-bbec-a1fd9d196359/cpu volume: 14630000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.245 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications.
Payload={'message_id': '18b63176-6dac-460e-b3c7-64a777a33537', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 14630000000, 'user_id': '1ba5fce347b64bfebf995f187193f205', 'user_name': None, 'project_id': 'c785bf23f53946bc99867d8832a50266', 'project_name': None, 'resource_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359', 'timestamp': '2025-12-15T10:03:48.244545', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359', 'instance_type': 'm1.small', 'host': 'bf4d507aa00b3fcf643a7e0b883420632ac7fc96e8c7a756982e121d', 'instance_host': 'np0005559462.localdomain', 'flavor': {'id': '2da0e147-aaa7-4bb9-a176-5fe1b15a32a0', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e'}, 'image_ref': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'cpu_number': 1}, 'message_id': '5975703e-d99d-11f0-817e-fa163ebaca0f', 'monotonic_time': 12094.410359055, 'message_signature': 'cf093d0e2247901837b2284a789844b412c7039654791a46c4a81462a2f9a61c'}]}, 'timestamp': '2025-12-15 10:03:48.244834', '_unique_id': 'a3dc2917d2eb4a89aefd7cf549877709'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.245 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.245 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.245 12 ERROR oslo_messaging.notify.messaging     yield
Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.245 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.245 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.245 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.245 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.245 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.245 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.245 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.245 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.245 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.245 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.245 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.245 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.245 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.245 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.245 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.245 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.245 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.245 12 ERROR oslo_messaging.notify.messaging
Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.245 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.245 12 ERROR oslo_messaging.notify.messaging
Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.245 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.245 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.245 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.245 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.245 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.245 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.245 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.245 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.245 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.245 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.245 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.245 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.245 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.245 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.245 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.245 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.245 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.245 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.245 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.245 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.245 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.245 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.245 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.245 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.245 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.245 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.245 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.245 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.245 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.245 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.245 12 ERROR oslo_messaging.notify.messaging
Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.246 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no new resources found this cycle poll_and_notify
/usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.246 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes in the context of pollsters
Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.246 12 DEBUG ceilometer.compute.pollsters [-] 39ff1bd9-6f6b-44c8-bbec-a1fd9d196359/network.incoming.bytes volume: 6809 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.247 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '7099b890-e643-4b69-b310-355c8508c1a6', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 6809, 'user_id': '1ba5fce347b64bfebf995f187193f205', 'user_name': None, 'project_id': 'c785bf23f53946bc99867d8832a50266', 'project_name': None, 'resource_id': 'instance-00000002-39ff1bd9-6f6b-44c8-bbec-a1fd9d196359-tap03ef8889-32', 'timestamp': '2025-12-15T10:03:48.246364', 'resource_metadata': {'display_name': 'test', 'name': 'tap03ef8889-32', 'instance_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359', 'instance_type': 'm1.small', 'host': 'bf4d507aa00b3fcf643a7e0b883420632ac7fc96e8c7a756982e121d', 'instance_host': 'np0005559462.localdomain', 'flavor': {'id': '2da0e147-aaa7-4bb9-a176-5fe1b15a32a0', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e'}, 'image_ref': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:74:df:7c', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap03ef8889-32'}, 'message_id': '5975b8b4-d99d-11f0-817e-fa163ebaca0f', 'monotonic_time': 12094.337948762, 'message_signature': 'bd1741280be072c82a0d082c853d5bdb870431e38b58337bff2b22dec5f17223'}]}, 'timestamp': '2025-12-15 10:03:48.246702', '_unique_id': 'd467a63aaa734b18bd5510b5d689fec8'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.247 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.247 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.247 12 ERROR oslo_messaging.notify.messaging     yield
Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.247 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.247 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.247 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.247 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.247 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.247 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.247 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.247 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.247 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.247 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.247 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.247 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.247 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.247 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.247 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.247 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.247 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.247 12 ERROR oslo_messaging.notify.messaging
Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.247 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.247 12 ERROR oslo_messaging.notify.messaging
Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.247 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.247 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.247 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.247 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.247 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.247 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.247 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.247 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.247 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.247 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.247 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.247 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.247 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.247 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.247 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.247 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.247 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.247 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.247 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.247 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.247 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.247 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.247 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.247 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.247 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.247 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.247 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.247 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.247 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.247 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.247 12 ERROR oslo_messaging.notify.messaging
Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.248 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no new resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.248 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.usage in the context of pollsters
Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.248 12 DEBUG ceilometer.compute.pollsters [-] 39ff1bd9-6f6b-44c8-bbec-a1fd9d196359/disk.device.usage volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.248 12 DEBUG ceilometer.compute.pollsters [-] 39ff1bd9-6f6b-44c8-bbec-a1fd9d196359/disk.device.usage volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.249 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications.
Payload={'message_id': 'd95b8cde-5336-43ef-ab92-51123ee47e2d', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '1ba5fce347b64bfebf995f187193f205', 'user_name': None, 'project_id': 'c785bf23f53946bc99867d8832a50266', 'project_name': None, 'resource_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359-vda', 'timestamp': '2025-12-15T10:03:48.248252', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359', 'instance_type': 'm1.small', 'host': 'bf4d507aa00b3fcf643a7e0b883420632ac7fc96e8c7a756982e121d', 'instance_host': 'np0005559462.localdomain', 'flavor': {'id': '2da0e147-aaa7-4bb9-a176-5fe1b15a32a0', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e'}, 'image_ref': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '597600e4-d99d-11f0-817e-fa163ebaca0f', 'monotonic_time': 12094.317974433, 'message_signature': '56d3121affbf6a1e23f8c8abea8c6de691e90aaf41b0c3ffaf90b23936194ddf'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '1ba5fce347b64bfebf995f187193f205', 'user_name': None, 'project_id': 'c785bf23f53946bc99867d8832a50266', 'project_name': None, 'resource_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359-vdb', 'timestamp': '2025-12-15T10:03:48.248252', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359', 'instance_type': 'm1.small', 'host': 'bf4d507aa00b3fcf643a7e0b883420632ac7fc96e8c7a756982e121d', 'instance_host': 'np0005559462.localdomain', 'flavor': {'id': '2da0e147-aaa7-4bb9-a176-5fe1b15a32a0', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e'}, 'image_ref': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '59760d00-d99d-11f0-817e-fa163ebaca0f', 'monotonic_time': 12094.317974433, 'message_signature': 'a1bc20dc97073d7e45e53665c04b2d78a6998613d2ad64381d2f9afffdb27f99'}]}, 'timestamp': '2025-12-15 10:03:48.248841', '_unique_id': 'd627bb6e1ddb4fa790e88113675565f8'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.249 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.249 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.249 12 ERROR oslo_messaging.notify.messaging     yield
Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.249 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.249 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.249 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.249 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.249 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.249 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.249 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.249 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.249 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.249 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.249 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.249 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.249 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.249 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.249 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.249 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.249 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.249 12 ERROR oslo_messaging.notify.messaging
Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.249 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.249 12 ERROR oslo_messaging.notify.messaging
Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.249 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.249 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.249 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.249 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15
10:03:48.249 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.249 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.249 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.249 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.249 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.249 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.249 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.249 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.249 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.249 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Dec 15 05:03:48 localhost 
ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.249 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.249 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.249 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.249 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.249 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.249 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.249 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.249 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.249 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.249 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 
10:03:48.249 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.249 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.249 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.249 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.249 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.249 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 15 05:03:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:03:48.249 12 ERROR oslo_messaging.notify.messaging Dec 15 05:03:48 localhost neutron_dhcp_agent[267542]: 2025-12-15 10:03:48.256 267546 INFO neutron.agent.dhcp.agent [None req-74477874-1ec9-40c3-8981-14e770f6bfec - - - - - -] DHCP configuration for ports {'e33b3ea8-c618-43da-a6e7-8ca32ec03a7a', '2a0d26c7-6d34-4d9e-b074-2ed6c50ddadd', '10ce6638-0ca4-4e8d-aec9-4163a87c1a20'} is completed#033[00m Dec 15 05:03:48 localhost dnsmasq[319837]: exiting on receipt of SIGTERM Dec 15 05:03:48 localhost systemd[1]: libpod-d0b9d2615b7a5f676b89f5902169608fb9f665cd890553ec1318a8292fb801c0.scope: Deactivated successfully. 
Dec 15 05:03:48 localhost podman[319854]: 2025-12-15 10:03:48.3662712 +0000 UTC m=+0.050854733 container kill d0b9d2615b7a5f676b89f5902169608fb9f665cd890553ec1318a8292fb801c0 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-250d134d-a49b-43f4-b0e9-01d6d0fc04e1, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image) Dec 15 05:03:48 localhost podman[319867]: 2025-12-15 10:03:48.41742674 +0000 UTC m=+0.041124671 container died d0b9d2615b7a5f676b89f5902169608fb9f665cd890553ec1318a8292fb801c0 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-250d134d-a49b-43f4-b0e9-01d6d0fc04e1, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.schema-version=1.0, tcib_managed=true) Dec 15 05:03:48 localhost systemd[1]: tmp-crun.sIXDLz.mount: Deactivated successfully. 
Dec 15 05:03:48 localhost podman[319867]: 2025-12-15 10:03:48.462070767 +0000 UTC m=+0.085768638 container cleanup d0b9d2615b7a5f676b89f5902169608fb9f665cd890553ec1318a8292fb801c0 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-250d134d-a49b-43f4-b0e9-01d6d0fc04e1, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_managed=true, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.build-date=20251202) Dec 15 05:03:48 localhost systemd[1]: libpod-conmon-d0b9d2615b7a5f676b89f5902169608fb9f665cd890553ec1318a8292fb801c0.scope: Deactivated successfully. Dec 15 05:03:48 localhost podman[319874]: 2025-12-15 10:03:48.479727149 +0000 UTC m=+0.089863550 container remove d0b9d2615b7a5f676b89f5902169608fb9f665cd890553ec1318a8292fb801c0 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-250d134d-a49b-43f4-b0e9-01d6d0fc04e1, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS) Dec 15 05:03:48 localhost nova_compute[286344]: 2025-12-15 10:03:48.528 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:03:48 localhost ovn_controller[154603]: 2025-12-15T10:03:48Z|00217|binding|INFO|Releasing lport 10ce6638-0ca4-4e8d-aec9-4163a87c1a20 from this chassis (sb_readonly=0) Dec 15 05:03:48 localhost ovn_controller[154603]: 2025-12-15T10:03:48Z|00218|binding|INFO|Setting lport 
10ce6638-0ca4-4e8d-aec9-4163a87c1a20 down in Southbound Dec 15 05:03:48 localhost kernel: device tap10ce6638-0c left promiscuous mode Dec 15 05:03:48 localhost ovn_metadata_agent[160585]: 2025-12-15 10:03:48.537 160590 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005559462.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.19/28 10.100.0.3/28 10.100.0.35/28', 'neutron:device_id': 'dhcpce287b4b-a043-5ed9-8e08-4fd555d768a2-250d134d-a49b-43f4-b0e9-01d6d0fc04e1', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-250d134d-a49b-43f4-b0e9-01d6d0fc04e1', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '1808d9ac20f44272b13e28ceb7a5e4c6', 'neutron:revision_number': '6', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'np0005559462.localdomain'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=1eb1729d-99dd-4cb3-9468-5d1f6c4147b9, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[], logical_port=10ce6638-0ca4-4e8d-aec9-4163a87c1a20) old=Port_Binding(up=[True], chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Dec 15 05:03:48 localhost ovn_metadata_agent[160585]: 2025-12-15 10:03:48.539 160590 INFO neutron.agent.ovn.metadata.agent [-] Port 10ce6638-0ca4-4e8d-aec9-4163a87c1a20 in datapath 250d134d-a49b-43f4-b0e9-01d6d0fc04e1 unbound from our chassis#033[00m Dec 15 05:03:48 localhost ovn_metadata_agent[160585]: 2025-12-15 10:03:48.541 160590 DEBUG 
neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 250d134d-a49b-43f4-b0e9-01d6d0fc04e1, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m Dec 15 05:03:48 localhost ovn_metadata_agent[160585]: 2025-12-15 10:03:48.542 160858 DEBUG oslo.privsep.daemon [-] privsep: reply[eaa8badd-7d82-49a0-af80-9a5749b21db7]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Dec 15 05:03:48 localhost nova_compute[286344]: 2025-12-15 10:03:48.543 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:03:48 localhost neutron_sriov_agent[260044]: 2025-12-15 10:03:48.645 2 INFO neutron.agent.securitygroups_rpc [None req-453d9a40-671c-4de4-adfa-08560f6bb6f3 d5377d4013ec43b98bbf91c063971dac 1808d9ac20f44272b13e28ceb7a5e4c6 - - default default] Security group member updated ['f398dd81-32a0-4005-81c7-146c01e93e74']#033[00m Dec 15 05:03:48 localhost neutron_dhcp_agent[267542]: 2025-12-15 10:03:48.802 267546 INFO neutron.agent.dhcp.agent [None req-bd991695-df36-4e59-adf1-6e7c3af27ead - - - - - -] Network not present, action: clean_devices, action_kwargs: {}#033[00m Dec 15 05:03:48 localhost neutron_dhcp_agent[267542]: 2025-12-15 10:03:48.803 267546 INFO neutron.agent.dhcp.agent [None req-bd991695-df36-4e59-adf1-6e7c3af27ead - - - - - -] Network not present, action: clean_devices, action_kwargs: {}#033[00m Dec 15 05:03:49 localhost systemd[1]: var-lib-containers-storage-overlay-81aa91866c543edebd306eaf9d851077db84fe05a7055487a726e31545859ff3-merged.mount: Deactivated successfully. Dec 15 05:03:49 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-d0b9d2615b7a5f676b89f5902169608fb9f665cd890553ec1318a8292fb801c0-userdata-shm.mount: Deactivated successfully. 
Dec 15 05:03:49 localhost systemd[1]: run-netns-qdhcp\x2d250d134d\x2da49b\x2d43f4\x2db0e9\x2d01d6d0fc04e1.mount: Deactivated successfully. Dec 15 05:03:49 localhost systemd[1]: Started /usr/bin/podman healthcheck run 4d46c406a3855fd1c322e5d86c048188ea5865daa07ba23aaebacc7879668923. Dec 15 05:03:49 localhost podman[319893]: 2025-12-15 10:03:49.252778683 +0000 UTC m=+0.061892959 container health_status 4d46c406a3855fd1c322e5d86c048188ea5865daa07ba23aaebacc7879668923 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '07f3af15d79276418f54a8dde2fc251064527c3571430e14af77aaa2c01f1193-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, 
org.label-schema.license=GPLv2, managed_by=edpm_ansible, config_id=ovn_metadata_agent, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team) Dec 15 05:03:49 localhost podman[319893]: 2025-12-15 10:03:49.282064417 +0000 UTC m=+0.091178673 container exec_died 4d46c406a3855fd1c322e5d86c048188ea5865daa07ba23aaebacc7879668923 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '07f3af15d79276418f54a8dde2fc251064527c3571430e14af77aaa2c01f1193-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, container_name=ovn_metadata_agent) Dec 15 05:03:49 localhost systemd[1]: 4d46c406a3855fd1c322e5d86c048188ea5865daa07ba23aaebacc7879668923.service: Deactivated successfully. Dec 15 05:03:49 localhost neutron_dhcp_agent[267542]: 2025-12-15 10:03:49.297 267546 INFO neutron.agent.dhcp.agent [-] Network not present, action: clean_devices, action_kwargs: {}#033[00m Dec 15 05:03:49 localhost ovn_controller[154603]: 2025-12-15T10:03:49Z|00219|binding|INFO|Releasing lport b35254ad-12eb-47bb-92be-44fefe0694f0 from this chassis (sb_readonly=0) Dec 15 05:03:49 localhost nova_compute[286344]: 2025-12-15 10:03:49.621 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:03:49 localhost neutron_dhcp_agent[267542]: 2025-12-15 10:03:49.887 267546 INFO neutron.agent.dhcp.agent [-] Network not present, action: clean_devices, action_kwargs: {}#033[00m Dec 15 05:03:50 localhost nova_compute[286344]: 2025-12-15 10:03:50.061 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:03:50 localhost neutron_dhcp_agent[267542]: 2025-12-15 10:03:50.175 267546 INFO neutron.agent.dhcp.agent [-] Network not present, action: clean_devices, action_kwargs: {}#033[00m Dec 15 05:03:50 localhost ceph-mon[298913]: mon.np0005559462@0(leader).osd e133 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Dec 15 05:03:50 localhost ceph-mon[298913]: mon.np0005559462@0(leader).osd e133 do_prune osdmap full prune enabled Dec 15 05:03:50 localhost ceph-mon[298913]: mon.np0005559462@0(leader).osd e134 e134: 6 total, 6 up, 6 in Dec 15 05:03:50 localhost ceph-mon[298913]: log_channel(cluster) log [DBG] : osdmap e134: 6 total, 6 up, 6 in 
Dec 15 05:03:51 localhost ovn_metadata_agent[160585]: 2025-12-15 10:03:51.480 160590 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Dec 15 05:03:51 localhost ovn_metadata_agent[160585]: 2025-12-15 10:03:51.480 160590 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Dec 15 05:03:51 localhost ovn_metadata_agent[160585]: 2025-12-15 10:03:51.481 160590 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Dec 15 05:03:52 localhost nova_compute[286344]: 2025-12-15 10:03:52.331 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:03:53 localhost systemd[1]: Started /usr/bin/podman healthcheck run a4e94bd7aaee6c174c601060fb5f34559019ccea93fc397263ee3735731cfc1e. 
Dec 15 05:03:53 localhost podman[319911]: 2025-12-15 10:03:53.385170182 +0000 UTC m=+0.068656799 container health_status a4e94bd7aaee6c174c601060fb5f34559019ccea93fc397263ee3735731cfc1e (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, managed_by=edpm_ansible, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi ) Dec 15 05:03:53 localhost podman[319911]: 2025-12-15 10:03:53.390106316 +0000 UTC m=+0.073592983 container exec_died a4e94bd7aaee6c174c601060fb5f34559019ccea93fc397263ee3735731cfc1e (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, managed_by=edpm_ansible, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', 
'/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi ) Dec 15 05:03:53 localhost systemd[1]: a4e94bd7aaee6c174c601060fb5f34559019ccea93fc397263ee3735731cfc1e.service: Deactivated successfully. Dec 15 05:03:55 localhost nova_compute[286344]: 2025-12-15 10:03:55.062 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:03:55 localhost neutron_dhcp_agent[267542]: 2025-12-15 10:03:55.112 267546 INFO neutron.agent.dhcp.agent [-] Network not present, action: clean_devices, action_kwargs: {}#033[00m Dec 15 05:03:55 localhost sshd[319934]: main: sshd: ssh-rsa algorithm is disabled Dec 15 05:03:55 localhost ceph-mon[298913]: mon.np0005559462@0(leader).osd e134 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Dec 15 05:03:57 localhost nova_compute[286344]: 2025-12-15 10:03:57.334 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:03:57 localhost ceph-mon[298913]: mon.np0005559462@0(leader) e17 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/osd_remove_queue}] v 0) Dec 15 05:03:57 localhost ceph-mon[298913]: log_channel(audit) log [INF] : from='mgr.34411 ' entity='mgr.np0005559464.aomnqe' Dec 15 05:03:57 localhost ceph-mon[298913]: from='mgr.34411 172.18.0.108:0/929118679' entity='mgr.np0005559464.aomnqe' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch Dec 15 05:03:57 localhost ceph-mon[298913]: from='mgr.34411 ' entity='mgr.np0005559464.aomnqe' Dec 15 05:03:59 localhost ceph-mon[298913]: mon.np0005559462@0(leader) e17 handle_command mon_command([{prefix=config-key set, key=mgr/progress/completed}] v 0) Dec 15 05:03:59 localhost ceph-mon[298913]: log_channel(audit) log [INF] : from='mgr.34411 ' 
entity='mgr.np0005559464.aomnqe' Dec 15 05:03:59 localhost neutron_sriov_agent[260044]: 2025-12-15 10:03:59.919 2 INFO neutron.agent.securitygroups_rpc [None req-a53f8e5b-1d0b-43fa-9ef6-3a1e47ae8800 19323f4a08564f78b2f57705b1ed2d40 d40af34d2f0a4a3a970832bd2c20f504 - - default default] Security group member updated ['8b68ca3e-3ca9-4dbf-9281-e34fc61f9100']#033[00m Dec 15 05:04:00 localhost nova_compute[286344]: 2025-12-15 10:04:00.064 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:04:00 localhost neutron_sriov_agent[260044]: 2025-12-15 10:04:00.381 2 INFO neutron.agent.securitygroups_rpc [None req-d3ff38f9-e38d-41ea-8654-01e834bbfd04 19323f4a08564f78b2f57705b1ed2d40 d40af34d2f0a4a3a970832bd2c20f504 - - default default] Security group member updated ['8b68ca3e-3ca9-4dbf-9281-e34fc61f9100']#033[00m Dec 15 05:04:00 localhost ceph-mon[298913]: from='mgr.34411 ' entity='mgr.np0005559464.aomnqe' Dec 15 05:04:00 localhost ceph-mon[298913]: mon.np0005559462@0(leader).osd e134 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Dec 15 05:04:01 localhost neutron_sriov_agent[260044]: 2025-12-15 10:04:01.116 2 INFO neutron.agent.securitygroups_rpc [None req-fe329efd-a44a-413f-8a83-051f27c0156c 19323f4a08564f78b2f57705b1ed2d40 d40af34d2f0a4a3a970832bd2c20f504 - - default default] Security group member updated ['8b68ca3e-3ca9-4dbf-9281-e34fc61f9100']#033[00m Dec 15 05:04:01 localhost podman[243449]: time="2025-12-15T10:04:01Z" level=info msg="List containers: received `last` parameter - overwriting `limit`" Dec 15 05:04:01 localhost podman[243449]: @ - - [15/Dec/2025:10:04:01 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 156640 "" "Go-http-client/1.1" Dec 15 05:04:01 localhost podman[243449]: @ - - [15/Dec/2025:10:04:01 +0000] "GET 
/v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 19249 "" "Go-http-client/1.1" Dec 15 05:04:02 localhost neutron_sriov_agent[260044]: 2025-12-15 10:04:02.046 2 INFO neutron.agent.securitygroups_rpc [None req-55f46d37-c6c9-4f6a-b4b7-324ec2baae51 19323f4a08564f78b2f57705b1ed2d40 d40af34d2f0a4a3a970832bd2c20f504 - - default default] Security group member updated ['8b68ca3e-3ca9-4dbf-9281-e34fc61f9100']#033[00m Dec 15 05:04:02 localhost nova_compute[286344]: 2025-12-15 10:04:02.369 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:04:03 localhost neutron_sriov_agent[260044]: 2025-12-15 10:04:03.240 2 INFO neutron.agent.securitygroups_rpc [None req-f60848d5-7c39-4016-b802-af0e8cc46504 33617797f45c40508aea1ace4b3675ed 85fbe34cb68645c0865fa1cf53f700b3 - - default default] Security group member updated ['6927a59a-c295-4f55-be96-3781855a630b']#033[00m Dec 15 05:04:03 localhost neutron_sriov_agent[260044]: 2025-12-15 10:04:03.549 2 INFO neutron.agent.securitygroups_rpc [None req-d77de2f3-882f-4d8e-8eee-6507a75810ec 19323f4a08564f78b2f57705b1ed2d40 d40af34d2f0a4a3a970832bd2c20f504 - - default default] Security group member updated ['8b68ca3e-3ca9-4dbf-9281-e34fc61f9100']#033[00m Dec 15 05:04:04 localhost ovn_metadata_agent[160585]: 2025-12-15 10:04:04.161 160590 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=12, ssl=[], options={'arp_ns_explicit_output': 'true', 'fdb_removal_limit': '0', 'ignore_lsp_down': 'false', 'mac_binding_removal_limit': '0', 'mac_prefix': 'fe:17:e3', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': 'fe:55:2b:86:15:b5'}, ipsec=False) old=SB_Global(nb_cfg=11) matches 
/usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Dec 15 05:04:04 localhost ovn_metadata_agent[160585]: 2025-12-15 10:04:04.162 160590 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 9 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m Dec 15 05:04:04 localhost nova_compute[286344]: 2025-12-15 10:04:04.163 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:04:04 localhost neutron_dhcp_agent[267542]: 2025-12-15 10:04:04.283 267546 INFO neutron.agent.linux.ip_lib [None req-ad134b81-2476-4012-93ab-68bfdd6f7569 - - - - - -] Device tap57619a5a-b1 cannot be used as it has no MAC address#033[00m Dec 15 05:04:04 localhost nova_compute[286344]: 2025-12-15 10:04:04.303 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:04:04 localhost kernel: device tap57619a5a-b1 entered promiscuous mode Dec 15 05:04:04 localhost ovn_controller[154603]: 2025-12-15T10:04:04Z|00220|binding|INFO|Claiming lport 57619a5a-b115-4772-abf3-f3d2494fe5b1 for this chassis. Dec 15 05:04:04 localhost ovn_controller[154603]: 2025-12-15T10:04:04Z|00221|binding|INFO|57619a5a-b115-4772-abf3-f3d2494fe5b1: Claiming unknown Dec 15 05:04:04 localhost nova_compute[286344]: 2025-12-15 10:04:04.313 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:04:04 localhost NetworkManager[5963]: [1765793044.3161] manager: (tap57619a5a-b1): new Generic device (/org/freedesktop/NetworkManager/Devices/36) Dec 15 05:04:04 localhost systemd-udevd[320031]: Network interface NamePolicy= disabled on kernel command line. 
Dec 15 05:04:04 localhost ovn_metadata_agent[160585]: 2025-12-15 10:04:04.328 160590 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005559462.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.3/28', 'neutron:device_id': 'dhcpce287b4b-a043-5ed9-8e08-4fd555d768a2-7842d897-78f9-4f71-9bb1-8eff4b50f3d6', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-7842d897-78f9-4f71-9bb1-8eff4b50f3d6', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '5619c5c73e504af8ba8040027fc52cd0', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=cc061c61-218c-4770-a641-a91c5cc330cb, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=57619a5a-b115-4772-abf3-f3d2494fe5b1) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Dec 15 05:04:04 localhost ovn_metadata_agent[160585]: 2025-12-15 10:04:04.330 160590 INFO neutron.agent.ovn.metadata.agent [-] Port 57619a5a-b115-4772-abf3-f3d2494fe5b1 in datapath 7842d897-78f9-4f71-9bb1-8eff4b50f3d6 bound to our chassis#033[00m Dec 15 05:04:04 localhost ovn_metadata_agent[160585]: 2025-12-15 10:04:04.332 160590 DEBUG neutron.agent.ovn.metadata.agent [-] Port 8b33e378-b295-4026-b9ab-923bd80b2c63 IP addresses were not retrieved from the Port_Binding MAC column ['unknown'] _get_port_ips 
/usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:536#033[00m Dec 15 05:04:04 localhost ovn_metadata_agent[160585]: 2025-12-15 10:04:04.332 160590 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 7842d897-78f9-4f71-9bb1-8eff4b50f3d6, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m Dec 15 05:04:04 localhost ovn_metadata_agent[160585]: 2025-12-15 10:04:04.333 160858 DEBUG oslo.privsep.daemon [-] privsep: reply[f1c1046c-96cb-44db-8f66-d90f7f828d0d]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Dec 15 05:04:04 localhost journal[231322]: ethtool ioctl error on tap57619a5a-b1: No such device Dec 15 05:04:04 localhost ovn_controller[154603]: 2025-12-15T10:04:04Z|00222|binding|INFO|Setting lport 57619a5a-b115-4772-abf3-f3d2494fe5b1 ovn-installed in OVS Dec 15 05:04:04 localhost ovn_controller[154603]: 2025-12-15T10:04:04Z|00223|binding|INFO|Setting lport 57619a5a-b115-4772-abf3-f3d2494fe5b1 up in Southbound Dec 15 05:04:04 localhost nova_compute[286344]: 2025-12-15 10:04:04.345 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:04:04 localhost journal[231322]: ethtool ioctl error on tap57619a5a-b1: No such device Dec 15 05:04:04 localhost journal[231322]: ethtool ioctl error on tap57619a5a-b1: No such device Dec 15 05:04:04 localhost journal[231322]: ethtool ioctl error on tap57619a5a-b1: No such device Dec 15 05:04:04 localhost journal[231322]: ethtool ioctl error on tap57619a5a-b1: No such device Dec 15 05:04:04 localhost journal[231322]: ethtool ioctl error on tap57619a5a-b1: No such device Dec 15 05:04:04 localhost journal[231322]: ethtool ioctl error on tap57619a5a-b1: No such device Dec 15 05:04:04 localhost journal[231322]: ethtool ioctl error on tap57619a5a-b1: No such device Dec 
15 05:04:04 localhost nova_compute[286344]: 2025-12-15 10:04:04.379 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:04:04 localhost nova_compute[286344]: 2025-12-15 10:04:04.405 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:04:04 localhost neutron_sriov_agent[260044]: 2025-12-15 10:04:04.664 2 INFO neutron.agent.securitygroups_rpc [None req-2146b295-3185-4e0b-bc70-403869aea930 19323f4a08564f78b2f57705b1ed2d40 d40af34d2f0a4a3a970832bd2c20f504 - - default default] Security group member updated ['8b68ca3e-3ca9-4dbf-9281-e34fc61f9100']#033[00m Dec 15 05:04:04 localhost openstack_network_exporter[246484]: ERROR 10:04:04 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd Dec 15 05:04:04 localhost openstack_network_exporter[246484]: ERROR 10:04:04 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server Dec 15 05:04:04 localhost openstack_network_exporter[246484]: ERROR 10:04:04 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd Dec 15 05:04:04 localhost openstack_network_exporter[246484]: ERROR 10:04:04 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath Dec 15 05:04:04 localhost openstack_network_exporter[246484]: Dec 15 05:04:04 localhost openstack_network_exporter[246484]: ERROR 10:04:04 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath Dec 15 05:04:04 localhost openstack_network_exporter[246484]: Dec 15 05:04:05 localhost nova_compute[286344]: 2025-12-15 10:04:05.066 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:04:05 localhost podman[320102]: Dec 15 05:04:05 localhost 
podman[320102]: 2025-12-15 10:04:05.20141727 +0000 UTC m=+0.068992637 container create 5564a2a5e3ddfe6d028d5382e8e67a11cd46b468ca09bbacade8b986d417afd5 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-7842d897-78f9-4f71-9bb1-8eff4b50f3d6, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_managed=true) Dec 15 05:04:05 localhost systemd[1]: Started libpod-conmon-5564a2a5e3ddfe6d028d5382e8e67a11cd46b468ca09bbacade8b986d417afd5.scope. Dec 15 05:04:05 localhost systemd[1]: tmp-crun.wj5AX4.mount: Deactivated successfully. Dec 15 05:04:05 localhost systemd[1]: Started libcrun container. Dec 15 05:04:05 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/578d21ad95e08a97157b0194224e45641f055c167755f77abc77f6b371aa0038/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff) Dec 15 05:04:05 localhost podman[320102]: 2025-12-15 10:04:05.161237004 +0000 UTC m=+0.028812401 image pull quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified Dec 15 05:04:05 localhost podman[320102]: 2025-12-15 10:04:05.270677993 +0000 UTC m=+0.138253360 container init 5564a2a5e3ddfe6d028d5382e8e67a11cd46b468ca09bbacade8b986d417afd5 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-7842d897-78f9-4f71-9bb1-8eff4b50f3d6, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, 
org.label-schema.build-date=20251202) Dec 15 05:04:05 localhost podman[320102]: 2025-12-15 10:04:05.278019037 +0000 UTC m=+0.145594404 container start 5564a2a5e3ddfe6d028d5382e8e67a11cd46b468ca09bbacade8b986d417afd5 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-7842d897-78f9-4f71-9bb1-8eff4b50f3d6, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, tcib_managed=true, io.buildah.version=1.41.3) Dec 15 05:04:05 localhost dnsmasq[320120]: started, version 2.85 cachesize 150 Dec 15 05:04:05 localhost dnsmasq[320120]: DNS service limited to local subnets Dec 15 05:04:05 localhost dnsmasq[320120]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile Dec 15 05:04:05 localhost dnsmasq[320120]: warning: no upstream servers configured Dec 15 05:04:05 localhost dnsmasq-dhcp[320120]: DHCP, static leases only on 10.100.0.0, lease time 1d Dec 15 05:04:05 localhost dnsmasq[320120]: read /var/lib/neutron/dhcp/7842d897-78f9-4f71-9bb1-8eff4b50f3d6/addn_hosts - 0 addresses Dec 15 05:04:05 localhost dnsmasq-dhcp[320120]: read /var/lib/neutron/dhcp/7842d897-78f9-4f71-9bb1-8eff4b50f3d6/host Dec 15 05:04:05 localhost dnsmasq-dhcp[320120]: read /var/lib/neutron/dhcp/7842d897-78f9-4f71-9bb1-8eff4b50f3d6/opts Dec 15 05:04:05 localhost neutron_dhcp_agent[267542]: 2025-12-15 10:04:05.505 267546 INFO neutron.agent.dhcp.agent [None req-102e9e0c-dad0-4994-b375-1a4f4903c3b2 - - - - - -] DHCP configuration for ports {'3a753b15-6687-4189-8524-f20be2b8c344'} is completed#033[00m Dec 15 05:04:05 localhost ceph-mon[298913]: mon.np0005559462@0(leader).osd e134 _set_new_cache_sizes 
cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Dec 15 05:04:05 localhost neutron_sriov_agent[260044]: 2025-12-15 10:04:05.900 2 INFO neutron.agent.securitygroups_rpc [None req-6b77dfcb-5c8c-4144-8396-695689ab8897 7817bde0aff34302827daf7f0baa5519 e6d5640db50b4446b57609dc8cf24987 - - default default] Security group member updated ['d7aa069c-b15f-4f9f-a305-5527dbbe375e']#033[00m Dec 15 05:04:06 localhost neutron_sriov_agent[260044]: 2025-12-15 10:04:06.213 2 INFO neutron.agent.securitygroups_rpc [None req-6db50289-aa84-4284-9d12-bf1289e85b60 c9894569c5e847e7843f28567440493a adbbfc5d99eb40139108520a7e1a1cb9 - - default default] Security group member updated ['91c293a0-f485-4e51-a2d0-9c7b5d2254f6']#033[00m Dec 15 05:04:06 localhost neutron_sriov_agent[260044]: 2025-12-15 10:04:06.615 2 INFO neutron.agent.securitygroups_rpc [None req-cd72a83a-298d-4639-a687-ef6425e578e9 7817bde0aff34302827daf7f0baa5519 e6d5640db50b4446b57609dc8cf24987 - - default default] Security group member updated ['d7aa069c-b15f-4f9f-a305-5527dbbe375e']#033[00m Dec 15 05:04:06 localhost neutron_sriov_agent[260044]: 2025-12-15 10:04:06.998 2 INFO neutron.agent.securitygroups_rpc [None req-afc3b04f-5682-4e2e-bdff-547a81d4419d 19323f4a08564f78b2f57705b1ed2d40 d40af34d2f0a4a3a970832bd2c20f504 - - default default] Security group member updated ['8b68ca3e-3ca9-4dbf-9281-e34fc61f9100']#033[00m Dec 15 05:04:07 localhost neutron_sriov_agent[260044]: 2025-12-15 10:04:07.303 2 INFO neutron.agent.securitygroups_rpc [None req-be359948-54c4-45f4-841b-4b66a8fad899 c9894569c5e847e7843f28567440493a adbbfc5d99eb40139108520a7e1a1cb9 - - default default] Security group member updated ['91c293a0-f485-4e51-a2d0-9c7b5d2254f6']#033[00m Dec 15 05:04:07 localhost nova_compute[286344]: 2025-12-15 10:04:07.410 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:04:07 localhost 
neutron_dhcp_agent[267542]: 2025-12-15 10:04:07.562 267546 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-12-15T10:04:07Z, description=, device_id=628bbf52-70bd-4147-91ba-40ad4c3874e3, device_owner=network:router_interface, dns_assignment=[], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[], id=1ec0a9de-bf37-464a-834b-9c4aba762e24, ip_allocation=immediate, mac_address=fa:16:3e:2f:8d:df, name=, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-12-15T10:04:01Z, description=, dns_domain=, id=7842d897-78f9-4f71-9bb1-8eff4b50f3d6, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-FloatingIPAdminTestJSON-test-network-1083697997, port_security_enabled=True, project_id=5619c5c73e504af8ba8040027fc52cd0, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=52572, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=1558, status=ACTIVE, subnets=['98f039e8-309b-4979-bc6c-8e9f04031d97'], tags=[], tenant_id=5619c5c73e504af8ba8040027fc52cd0, updated_at=2025-12-15T10:04:02Z, vlan_transparent=None, network_id=7842d897-78f9-4f71-9bb1-8eff4b50f3d6, port_security_enabled=False, project_id=5619c5c73e504af8ba8040027fc52cd0, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=[], standard_attr_id=1598, status=DOWN, tags=[], tenant_id=5619c5c73e504af8ba8040027fc52cd0, updated_at=2025-12-15T10:04:07Z on network 7842d897-78f9-4f71-9bb1-8eff4b50f3d6#033[00m Dec 15 05:04:07 localhost dnsmasq[320120]: read /var/lib/neutron/dhcp/7842d897-78f9-4f71-9bb1-8eff4b50f3d6/addn_hosts - 1 addresses Dec 15 05:04:07 localhost dnsmasq-dhcp[320120]: read 
/var/lib/neutron/dhcp/7842d897-78f9-4f71-9bb1-8eff4b50f3d6/host Dec 15 05:04:07 localhost dnsmasq-dhcp[320120]: read /var/lib/neutron/dhcp/7842d897-78f9-4f71-9bb1-8eff4b50f3d6/opts Dec 15 05:04:07 localhost podman[320138]: 2025-12-15 10:04:07.776060848 +0000 UTC m=+0.059932841 container kill 5564a2a5e3ddfe6d028d5382e8e67a11cd46b468ca09bbacade8b986d417afd5 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-7842d897-78f9-4f71-9bb1-8eff4b50f3d6, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image) Dec 15 05:04:07 localhost neutron_sriov_agent[260044]: 2025-12-15 10:04:07.864 2 INFO neutron.agent.securitygroups_rpc [None req-b191df09-b159-4b7e-b622-68f1c9988c7c 19323f4a08564f78b2f57705b1ed2d40 d40af34d2f0a4a3a970832bd2c20f504 - - default default] Security group member updated ['8b68ca3e-3ca9-4dbf-9281-e34fc61f9100']#033[00m Dec 15 05:04:07 localhost neutron_dhcp_agent[267542]: 2025-12-15 10:04:07.964 267546 INFO neutron.agent.dhcp.agent [None req-60194a6c-3613-4e17-98db-179714fd0531 - - - - - -] DHCP configuration for ports {'1ec0a9de-bf37-464a-834b-9c4aba762e24'} is completed#033[00m Dec 15 05:04:08 localhost nova_compute[286344]: 2025-12-15 10:04:08.270 286348 DEBUG oslo_service.periodic_task [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Dec 15 05:04:08 localhost systemd[1]: Started /usr/bin/podman healthcheck run 67ee72fcd99c896f79e8807764b1d0f04e7d45927d06e306c0986394e8ed42a0. 
Dec 15 05:04:08 localhost systemd[1]: Started /usr/bin/podman healthcheck run 9f0f481c937606298203797a71924b7614a5d9e3f4a8afd837cfa5b9602aedeb. Dec 15 05:04:08 localhost systemd[1]: Started /usr/bin/podman healthcheck run b3d8483ade539500e228c2af9c0c3e647febb5ec7903610a83aea32631007d4a. Dec 15 05:04:08 localhost podman[320160]: 2025-12-15 10:04:08.775621151 +0000 UTC m=+0.093396738 container health_status 9f0f481c937606298203797a71924b7614a5d9e3f4a8afd837cfa5b9602aedeb (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, container_name=multipathd, 
io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team) Dec 15 05:04:08 localhost podman[320160]: 2025-12-15 10:04:08.781910169 +0000 UTC m=+0.099685756 container exec_died 9f0f481c937606298203797a71924b7614a5d9e3f4a8afd837cfa5b9602aedeb (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, org.label-schema.vendor=CentOS, container_name=multipathd, io.buildah.version=1.41.3, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0) Dec 15 05:04:08 localhost systemd[1]: 
9f0f481c937606298203797a71924b7614a5d9e3f4a8afd837cfa5b9602aedeb.service: Deactivated successfully. Dec 15 05:04:08 localhost podman[320159]: 2025-12-15 10:04:08.819591711 +0000 UTC m=+0.140314332 container health_status 67ee72fcd99c896f79e8807764b1d0f04e7d45927d06e306c0986394e8ed42a0 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, managed_by=edpm_ansible, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors ) Dec 15 05:04:08 localhost podman[320159]: 2025-12-15 10:04:08.827460268 +0000 UTC m=+0.148182899 container exec_died 67ee72fcd99c896f79e8807764b1d0f04e7d45927d06e306c0986394e8ed42a0 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, config_data={'command': 
['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible) Dec 15 05:04:08 localhost systemd[1]: 67ee72fcd99c896f79e8807764b1d0f04e7d45927d06e306c0986394e8ed42a0.service: Deactivated successfully. Dec 15 05:04:08 localhost systemd[1]: tmp-crun.xZGTFd.mount: Deactivated successfully. 
Dec 15 05:04:08 localhost podman[320161]: 2025-12-15 10:04:08.890853864 +0000 UTC m=+0.206470887 container health_status b3d8483ade539500e228c2af9c0c3e647febb5ec7903610a83aea32631007d4a (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '07f3af15d79276418f54a8dde2fc251064527c3571430e14af77aaa2c01f1193-cb1e9478634a00c8fb5ad9cd7fcd38c7b2a1a85187dfaa618bc715c24c2b15fb'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, 
config_id=ceilometer_agent_compute) Dec 15 05:04:08 localhost podman[320161]: 2025-12-15 10:04:08.926688351 +0000 UTC m=+0.242305384 container exec_died b3d8483ade539500e228c2af9c0c3e647febb5ec7903610a83aea32631007d4a (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '07f3af15d79276418f54a8dde2fc251064527c3571430e14af77aaa2c01f1193-cb1e9478634a00c8fb5ad9cd7fcd38c7b2a1a85187dfaa618bc715c24c2b15fb'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, config_id=ceilometer_agent_compute, 
container_name=ceilometer_agent_compute) Dec 15 05:04:08 localhost systemd[1]: b3d8483ade539500e228c2af9c0c3e647febb5ec7903610a83aea32631007d4a.service: Deactivated successfully. Dec 15 05:04:09 localhost neutron_sriov_agent[260044]: 2025-12-15 10:04:09.224 2 INFO neutron.agent.securitygroups_rpc [None req-95485b93-0550-4f8d-b420-cf584c53332e 19323f4a08564f78b2f57705b1ed2d40 d40af34d2f0a4a3a970832bd2c20f504 - - default default] Security group member updated ['8b68ca3e-3ca9-4dbf-9281-e34fc61f9100']#033[00m Dec 15 05:04:09 localhost nova_compute[286344]: 2025-12-15 10:04:09.270 286348 DEBUG oslo_service.periodic_task [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Dec 15 05:04:09 localhost neutron_dhcp_agent[267542]: 2025-12-15 10:04:09.557 267546 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-12-15T10:04:07Z, description=, device_id=628bbf52-70bd-4147-91ba-40ad4c3874e3, device_owner=network:router_interface, dns_assignment=[], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[], id=1ec0a9de-bf37-464a-834b-9c4aba762e24, ip_allocation=immediate, mac_address=fa:16:3e:2f:8d:df, name=, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-12-15T10:04:01Z, description=, dns_domain=, id=7842d897-78f9-4f71-9bb1-8eff4b50f3d6, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-FloatingIPAdminTestJSON-test-network-1083697997, port_security_enabled=True, project_id=5619c5c73e504af8ba8040027fc52cd0, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=52572, qos_policy_id=None, revision_number=2, 
router:external=False, shared=False, standard_attr_id=1558, status=ACTIVE, subnets=['98f039e8-309b-4979-bc6c-8e9f04031d97'], tags=[], tenant_id=5619c5c73e504af8ba8040027fc52cd0, updated_at=2025-12-15T10:04:02Z, vlan_transparent=None, network_id=7842d897-78f9-4f71-9bb1-8eff4b50f3d6, port_security_enabled=False, project_id=5619c5c73e504af8ba8040027fc52cd0, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=[], standard_attr_id=1598, status=DOWN, tags=[], tenant_id=5619c5c73e504af8ba8040027fc52cd0, updated_at=2025-12-15T10:04:07Z on network 7842d897-78f9-4f71-9bb1-8eff4b50f3d6#033[00m Dec 15 05:04:09 localhost dnsmasq[320120]: read /var/lib/neutron/dhcp/7842d897-78f9-4f71-9bb1-8eff4b50f3d6/addn_hosts - 1 addresses Dec 15 05:04:09 localhost podman[320237]: 2025-12-15 10:04:09.770850666 +0000 UTC m=+0.061646844 container kill 5564a2a5e3ddfe6d028d5382e8e67a11cd46b468ca09bbacade8b986d417afd5 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-7842d897-78f9-4f71-9bb1-8eff4b50f3d6, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS) Dec 15 05:04:09 localhost dnsmasq-dhcp[320120]: read /var/lib/neutron/dhcp/7842d897-78f9-4f71-9bb1-8eff4b50f3d6/host Dec 15 05:04:09 localhost dnsmasq-dhcp[320120]: read /var/lib/neutron/dhcp/7842d897-78f9-4f71-9bb1-8eff4b50f3d6/opts Dec 15 05:04:10 localhost nova_compute[286344]: 2025-12-15 10:04:10.068 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:04:10 localhost neutron_dhcp_agent[267542]: 2025-12-15 10:04:10.077 267546 INFO neutron.agent.dhcp.agent 
[None req-d6620926-c4b4-44ab-b465-795351cd3eaa - - - - - -] DHCP configuration for ports {'1ec0a9de-bf37-464a-834b-9c4aba762e24'} is completed#033[00m Dec 15 05:04:10 localhost nova_compute[286344]: 2025-12-15 10:04:10.265 286348 DEBUG oslo_service.periodic_task [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Dec 15 05:04:10 localhost neutron_sriov_agent[260044]: 2025-12-15 10:04:10.548 2 INFO neutron.agent.securitygroups_rpc [None req-aa01cb2a-8ec2-48f5-91b9-a1f939dff14b 33617797f45c40508aea1ace4b3675ed 85fbe34cb68645c0865fa1cf53f700b3 - - default default] Security group member updated ['6927a59a-c295-4f55-be96-3781855a630b']#033[00m Dec 15 05:04:10 localhost ceph-mon[298913]: mon.np0005559462@0(leader).osd e134 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Dec 15 05:04:10 localhost systemd[1]: Started /usr/bin/podman healthcheck run 730c50020a7d9e2da21d2462d40af20ac303e43f3491f692cbbd87f432ba3e09. Dec 15 05:04:10 localhost systemd[1]: Started /usr/bin/podman healthcheck run ac2eae30a2a7f971398af42ea67b3ed799d4b3341f7688ebcd73f7ca7f66c2a8. 
Dec 15 05:04:10 localhost podman[320257]: 2025-12-15 10:04:10.752290765 +0000 UTC m=+0.085640355 container health_status 730c50020a7d9e2da21d2462d40af20ac303e43f3491f692cbbd87f432ba3e09 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, vcs-type=git, managed_by=edpm_ansible, io.buildah.version=1.33.7, maintainer=Red Hat, Inc., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, config_id=openstack_network_exporter, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vendor=Red Hat, Inc., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., release=1755695350, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'cb1e9478634a00c8fb5ad9cd7fcd38c7b2a1a85187dfaa618bc715c24c2b15fb'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, 
io.openshift.tags=minimal rhel9, container_name=openstack_network_exporter, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://catalog.redhat.com/en/search?searchType=containers, version=9.6, architecture=x86_64, com.redhat.component=ubi9-minimal-container, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.expose-services=, distribution-scope=public, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=ubi9-minimal, build-date=2025-08-20T13:12:41) Dec 15 05:04:10 localhost podman[320257]: 2025-12-15 10:04:10.763810253 +0000 UTC m=+0.097159873 container exec_died 730c50020a7d9e2da21d2462d40af20ac303e43f3491f692cbbd87f432ba3e09 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, config_id=openstack_network_exporter, url=https://catalog.redhat.com/en/search?searchType=containers, version=9.6, container_name=openstack_network_exporter, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., release=1755695350, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'cb1e9478634a00c8fb5ad9cd7fcd38c7b2a1a85187dfaa618bc715c24c2b15fb'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.buildah.version=1.33.7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=ubi9-minimal-container, maintainer=Red Hat, Inc., distribution-scope=public, managed_by=edpm_ansible, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., vcs-type=git, io.openshift.expose-services=, io.openshift.tags=minimal rhel9, vendor=Red Hat, Inc., architecture=x86_64, name=ubi9-minimal, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, build-date=2025-08-20T13:12:41) Dec 15 05:04:10 localhost systemd[1]: 730c50020a7d9e2da21d2462d40af20ac303e43f3491f692cbbd87f432ba3e09.service: Deactivated successfully. Dec 15 05:04:10 localhost podman[320258]: 2025-12-15 10:04:10.858222216 +0000 UTC m=+0.188314584 container health_status ac2eae30a2a7f971398af42ea67b3ed799d4b3341f7688ebcd73f7ca7f66c2a8 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '07f3af15d79276418f54a8dde2fc251064527c3571430e14af77aaa2c01f1193'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, maintainer=OpenStack Kubernetes Operator team, config_id=ovn_controller, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.vendor=CentOS, container_name=ovn_controller) Dec 15 
05:04:10 localhost podman[320258]: 2025-12-15 10:04:10.935465338 +0000 UTC m=+0.265557726 container exec_died ac2eae30a2a7f971398af42ea67b3ed799d4b3341f7688ebcd73f7ca7f66c2a8 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '07f3af15d79276418f54a8dde2fc251064527c3571430e14af77aaa2c01f1193'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true) Dec 15 05:04:10 localhost systemd[1]: ac2eae30a2a7f971398af42ea67b3ed799d4b3341f7688ebcd73f7ca7f66c2a8.service: Deactivated successfully. 
Dec 15 05:04:12 localhost neutron_sriov_agent[260044]: 2025-12-15 10:04:12.150 2 INFO neutron.agent.securitygroups_rpc [None req-fb6ae39a-f963-4072-afc1-c07deb714848 64529faa0f8048b29a83c4d324dc7fd1 5619c5c73e504af8ba8040027fc52cd0 - - default default] Security group member updated ['acdc5a94-0e43-47e4-abae-9834cf1f69f0']#033[00m Dec 15 05:04:12 localhost nova_compute[286344]: 2025-12-15 10:04:12.270 286348 DEBUG oslo_service.periodic_task [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Dec 15 05:04:12 localhost nova_compute[286344]: 2025-12-15 10:04:12.270 286348 DEBUG nova.compute.manager [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m Dec 15 05:04:12 localhost nova_compute[286344]: 2025-12-15 10:04:12.271 286348 DEBUG nova.compute.manager [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m Dec 15 05:04:12 localhost neutron_dhcp_agent[267542]: 2025-12-15 10:04:12.281 267546 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-12-15T10:04:11Z, description=, device_id=, device_owner=, dns_assignment=[], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[], id=91956511-c51f-4b76-bd15-7f1397a09724, ip_allocation=immediate, mac_address=fa:16:3e:02:b4:5c, name=tempest-FloatingIPAdminTestJSON-720220382, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-12-15T10:04:01Z, description=, 
dns_domain=, id=7842d897-78f9-4f71-9bb1-8eff4b50f3d6, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-FloatingIPAdminTestJSON-test-network-1083697997, port_security_enabled=True, project_id=5619c5c73e504af8ba8040027fc52cd0, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=52572, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=1558, status=ACTIVE, subnets=['98f039e8-309b-4979-bc6c-8e9f04031d97'], tags=[], tenant_id=5619c5c73e504af8ba8040027fc52cd0, updated_at=2025-12-15T10:04:02Z, vlan_transparent=None, network_id=7842d897-78f9-4f71-9bb1-8eff4b50f3d6, port_security_enabled=True, project_id=5619c5c73e504af8ba8040027fc52cd0, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=['acdc5a94-0e43-47e4-abae-9834cf1f69f0'], standard_attr_id=1618, status=DOWN, tags=[], tenant_id=5619c5c73e504af8ba8040027fc52cd0, updated_at=2025-12-15T10:04:11Z on network 7842d897-78f9-4f71-9bb1-8eff4b50f3d6#033[00m Dec 15 05:04:12 localhost nova_compute[286344]: 2025-12-15 10:04:12.398 286348 DEBUG oslo_concurrency.lockutils [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Acquiring lock "refresh_cache-39ff1bd9-6f6b-44c8-bbec-a1fd9d196359" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m Dec 15 05:04:12 localhost nova_compute[286344]: 2025-12-15 10:04:12.399 286348 DEBUG oslo_concurrency.lockutils [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Acquired lock "refresh_cache-39ff1bd9-6f6b-44c8-bbec-a1fd9d196359" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m Dec 15 05:04:12 localhost nova_compute[286344]: 2025-12-15 10:04:12.399 286348 DEBUG nova.network.neutron [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] [instance: 39ff1bd9-6f6b-44c8-bbec-a1fd9d196359] Forcefully refreshing network info cache for instance 
_get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m Dec 15 05:04:12 localhost nova_compute[286344]: 2025-12-15 10:04:12.400 286348 DEBUG nova.objects.instance [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Lazy-loading 'info_cache' on Instance uuid 39ff1bd9-6f6b-44c8-bbec-a1fd9d196359 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m Dec 15 05:04:12 localhost nova_compute[286344]: 2025-12-15 10:04:12.412 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:04:12 localhost systemd[1]: tmp-crun.MYABCF.mount: Deactivated successfully. Dec 15 05:04:12 localhost dnsmasq[320120]: read /var/lib/neutron/dhcp/7842d897-78f9-4f71-9bb1-8eff4b50f3d6/addn_hosts - 2 addresses Dec 15 05:04:12 localhost dnsmasq-dhcp[320120]: read /var/lib/neutron/dhcp/7842d897-78f9-4f71-9bb1-8eff4b50f3d6/host Dec 15 05:04:12 localhost podman[320318]: 2025-12-15 10:04:12.449897716 +0000 UTC m=+0.052499625 container kill 5564a2a5e3ddfe6d028d5382e8e67a11cd46b468ca09bbacade8b986d417afd5 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-7842d897-78f9-4f71-9bb1-8eff4b50f3d6, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team) Dec 15 05:04:12 localhost dnsmasq-dhcp[320120]: read /var/lib/neutron/dhcp/7842d897-78f9-4f71-9bb1-8eff4b50f3d6/opts Dec 15 05:04:12 localhost neutron_dhcp_agent[267542]: 2025-12-15 10:04:12.628 267546 INFO neutron.agent.dhcp.agent [None req-5a1bdd12-981c-4b47-b497-044929bf4c0a - - - - - -] DHCP configuration for ports {'91956511-c51f-4b76-bd15-7f1397a09724'} is 
completed#033[00m Dec 15 05:04:12 localhost nova_compute[286344]: 2025-12-15 10:04:12.902 286348 DEBUG nova.network.neutron [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] [instance: 39ff1bd9-6f6b-44c8-bbec-a1fd9d196359] Updating instance_info_cache with network_info: [{"id": "03ef8889-3216-43fb-8a52-4be17a956ce1", "address": "fa:16:3e:74:df:7c", "network": {"id": "befb7a72-17a9-4bcb-b561-84b8f626685a", "bridge": "br-int", "label": "private", "subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.201", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.20", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.0.3"}}], "meta": {"injected": false, "tenant_id": "c785bf23f53946bc99867d8832a50266", "mtu": 1292, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap03ef8889-32", "ovs_interfaceid": "03ef8889-3216-43fb-8a52-4be17a956ce1", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m Dec 15 05:04:12 localhost nova_compute[286344]: 2025-12-15 10:04:12.918 286348 DEBUG oslo_concurrency.lockutils [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Releasing lock "refresh_cache-39ff1bd9-6f6b-44c8-bbec-a1fd9d196359" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m Dec 15 05:04:12 localhost nova_compute[286344]: 2025-12-15 10:04:12.919 286348 DEBUG nova.compute.manager [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] [instance: 
39ff1bd9-6f6b-44c8-bbec-a1fd9d196359] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m Dec 15 05:04:12 localhost nova_compute[286344]: 2025-12-15 10:04:12.919 286348 DEBUG oslo_service.periodic_task [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Dec 15 05:04:12 localhost nova_compute[286344]: 2025-12-15 10:04:12.919 286348 DEBUG oslo_service.periodic_task [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Dec 15 05:04:13 localhost neutron_sriov_agent[260044]: 2025-12-15 10:04:13.013 2 INFO neutron.agent.securitygroups_rpc [None req-d181c413-c9af-4a6a-938f-6eeabc9300da 19323f4a08564f78b2f57705b1ed2d40 d40af34d2f0a4a3a970832bd2c20f504 - - default default] Security group member updated ['8b68ca3e-3ca9-4dbf-9281-e34fc61f9100']#033[00m Dec 15 05:04:13 localhost ovn_metadata_agent[160585]: 2025-12-15 10:04:13.164 160590 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=12d96d64-e862-4f68-81e5-8d9ec5d3a5e2, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '12'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m Dec 15 05:04:13 localhost neutron_sriov_agent[260044]: 2025-12-15 10:04:13.786 2 INFO neutron.agent.securitygroups_rpc [None req-904339f4-19c1-4a43-9afc-5d486028bb74 19323f4a08564f78b2f57705b1ed2d40 d40af34d2f0a4a3a970832bd2c20f504 - - default default] Security group member updated ['8b68ca3e-3ca9-4dbf-9281-e34fc61f9100']#033[00m Dec 15 05:04:14 localhost neutron_sriov_agent[260044]: 2025-12-15 
10:04:14.329 2 INFO neutron.agent.securitygroups_rpc [None req-668bd9c3-c66d-4c30-9be0-38c1e9426087 19323f4a08564f78b2f57705b1ed2d40 d40af34d2f0a4a3a970832bd2c20f504 - - default default] Security group member updated ['8b68ca3e-3ca9-4dbf-9281-e34fc61f9100']#033[00m Dec 15 05:04:15 localhost nova_compute[286344]: 2025-12-15 10:04:15.069 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:04:15 localhost ceph-mon[298913]: mon.np0005559462@0(leader).osd e134 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Dec 15 05:04:17 localhost neutron_sriov_agent[260044]: 2025-12-15 10:04:17.185 2 INFO neutron.agent.securitygroups_rpc [None req-d60005af-06b1-4591-895c-1f74204b68dc 64529faa0f8048b29a83c4d324dc7fd1 5619c5c73e504af8ba8040027fc52cd0 - - default default] Security group member updated ['acdc5a94-0e43-47e4-abae-9834cf1f69f0']#033[00m Dec 15 05:04:17 localhost nova_compute[286344]: 2025-12-15 10:04:17.271 286348 DEBUG oslo_service.periodic_task [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Dec 15 05:04:17 localhost nova_compute[286344]: 2025-12-15 10:04:17.289 286348 DEBUG oslo_concurrency.lockutils [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Dec 15 05:04:17 localhost nova_compute[286344]: 2025-12-15 10:04:17.290 286348 DEBUG oslo_concurrency.lockutils [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner 
/usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Dec 15 05:04:17 localhost nova_compute[286344]: 2025-12-15 10:04:17.290 286348 DEBUG oslo_concurrency.lockutils [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Dec 15 05:04:17 localhost nova_compute[286344]: 2025-12-15 10:04:17.290 286348 DEBUG nova.compute.resource_tracker [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Auditing locally available compute resources for np0005559462.localdomain (node: np0005559462.localdomain) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m Dec 15 05:04:17 localhost nova_compute[286344]: 2025-12-15 10:04:17.290 286348 DEBUG oslo_concurrency.processutils [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m Dec 15 05:04:17 localhost nova_compute[286344]: 2025-12-15 10:04:17.414 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:04:17 localhost dnsmasq[320120]: read /var/lib/neutron/dhcp/7842d897-78f9-4f71-9bb1-8eff4b50f3d6/addn_hosts - 1 addresses Dec 15 05:04:17 localhost dnsmasq-dhcp[320120]: read /var/lib/neutron/dhcp/7842d897-78f9-4f71-9bb1-8eff4b50f3d6/host Dec 15 05:04:17 localhost podman[320358]: 2025-12-15 10:04:17.443080624 +0000 UTC m=+0.068738380 container kill 5564a2a5e3ddfe6d028d5382e8e67a11cd46b468ca09bbacade8b986d417afd5 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-7842d897-78f9-4f71-9bb1-8eff4b50f3d6, 
org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.license=GPLv2) Dec 15 05:04:17 localhost dnsmasq-dhcp[320120]: read /var/lib/neutron/dhcp/7842d897-78f9-4f71-9bb1-8eff4b50f3d6/opts Dec 15 05:04:17 localhost ceph-mon[298913]: mon.np0005559462@0(leader) e17 handle_command mon_command({"prefix": "df", "format": "json"} v 0) Dec 15 05:04:17 localhost ceph-mon[298913]: log_channel(audit) log [DBG] : from='client.? 172.18.0.106:0/3984791221' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch Dec 15 05:04:17 localhost nova_compute[286344]: 2025-12-15 10:04:17.752 286348 DEBUG oslo_concurrency.processutils [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.462s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m Dec 15 05:04:17 localhost nova_compute[286344]: 2025-12-15 10:04:17.824 286348 DEBUG nova.virt.libvirt.driver [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m Dec 15 05:04:17 localhost nova_compute[286344]: 2025-12-15 10:04:17.825 286348 DEBUG nova.virt.libvirt.driver [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m Dec 15 05:04:18 localhost nova_compute[286344]: 2025-12-15 10:04:18.043 286348 WARNING nova.virt.libvirt.driver [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] 
This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m Dec 15 05:04:18 localhost nova_compute[286344]: 2025-12-15 10:04:18.045 286348 DEBUG nova.compute.resource_tracker [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Hypervisor/Node resource view: name=np0005559462.localdomain free_ram=11294MB free_disk=41.836978912353516GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": 
"pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m Dec 15 05:04:18 localhost nova_compute[286344]: 2025-12-15 10:04:18.046 286348 DEBUG oslo_concurrency.lockutils [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Dec 15 05:04:18 localhost nova_compute[286344]: 2025-12-15 10:04:18.046 286348 DEBUG oslo_concurrency.lockutils [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Dec 15 05:04:18 localhost nova_compute[286344]: 2025-12-15 10:04:18.121 286348 DEBUG nova.compute.resource_tracker [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Instance 39ff1bd9-6f6b-44c8-bbec-a1fd9d196359 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 2, 'MEMORY_MB': 512, 'VCPU': 1}}. 
_remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m Dec 15 05:04:18 localhost nova_compute[286344]: 2025-12-15 10:04:18.122 286348 DEBUG nova.compute.resource_tracker [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m Dec 15 05:04:18 localhost nova_compute[286344]: 2025-12-15 10:04:18.122 286348 DEBUG nova.compute.resource_tracker [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Final resource view: name=np0005559462.localdomain phys_ram=15738MB used_ram=1024MB phys_disk=41GB used_disk=2GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m Dec 15 05:04:18 localhost nova_compute[286344]: 2025-12-15 10:04:18.159 286348 DEBUG oslo_concurrency.processutils [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m Dec 15 05:04:18 localhost ceph-mon[298913]: mon.np0005559462@0(leader) e17 handle_command mon_command({"prefix": "df", "format": "json"} v 0) Dec 15 05:04:18 localhost ceph-mon[298913]: log_channel(audit) log [DBG] : from='client.? 
172.18.0.106:0/1462585402' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch Dec 15 05:04:18 localhost nova_compute[286344]: 2025-12-15 10:04:18.618 286348 DEBUG oslo_concurrency.processutils [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.459s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m Dec 15 05:04:18 localhost nova_compute[286344]: 2025-12-15 10:04:18.625 286348 DEBUG nova.compute.provider_tree [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Inventory has not changed in ProviderTree for provider: 26c8956b-6742-4951-b566-971b9bbe323b update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m Dec 15 05:04:18 localhost nova_compute[286344]: 2025-12-15 10:04:18.641 286348 DEBUG nova.scheduler.client.report [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Inventory has not changed for provider 26c8956b-6742-4951-b566-971b9bbe323b based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m Dec 15 05:04:18 localhost nova_compute[286344]: 2025-12-15 10:04:18.643 286348 DEBUG nova.compute.resource_tracker [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Compute_service record updated for np0005559462.localdomain:np0005559462.localdomain _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m Dec 15 05:04:18 localhost nova_compute[286344]: 2025-12-15 10:04:18.644 286348 DEBUG 
oslo_concurrency.lockutils [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.598s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Dec 15 05:04:19 localhost systemd[1]: Started /usr/bin/podman healthcheck run 4d46c406a3855fd1c322e5d86c048188ea5865daa07ba23aaebacc7879668923. Dec 15 05:04:19 localhost podman[320435]: 2025-12-15 10:04:19.757011298 +0000 UTC m=+0.077481059 container health_status 4d46c406a3855fd1c322e5d86c048188ea5865daa07ba23aaebacc7879668923 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '07f3af15d79276418f54a8dde2fc251064527c3571430e14af77aaa2c01f1193-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', 
'/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team) Dec 15 05:04:19 localhost podman[320435]: 2025-12-15 10:04:19.789507951 +0000 UTC m=+0.109977732 container exec_died 4d46c406a3855fd1c322e5d86c048188ea5865daa07ba23aaebacc7879668923 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, tcib_managed=true, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.license=GPLv2, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '07f3af15d79276418f54a8dde2fc251064527c3571430e14af77aaa2c01f1193-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', 
'/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0) Dec 15 05:04:19 localhost dnsmasq[320120]: read /var/lib/neutron/dhcp/7842d897-78f9-4f71-9bb1-8eff4b50f3d6/addn_hosts - 0 addresses Dec 15 05:04:19 localhost dnsmasq-dhcp[320120]: read /var/lib/neutron/dhcp/7842d897-78f9-4f71-9bb1-8eff4b50f3d6/host Dec 15 05:04:19 localhost dnsmasq-dhcp[320120]: read /var/lib/neutron/dhcp/7842d897-78f9-4f71-9bb1-8eff4b50f3d6/opts Dec 15 05:04:19 localhost podman[320451]: 2025-12-15 10:04:19.790933377 +0000 UTC m=+0.066312560 container kill 5564a2a5e3ddfe6d028d5382e8e67a11cd46b468ca09bbacade8b986d417afd5 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-7842d897-78f9-4f71-9bb1-8eff4b50f3d6, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.name=CentOS Stream 9 Base Image) Dec 15 05:04:19 localhost systemd[1]: 4d46c406a3855fd1c322e5d86c048188ea5865daa07ba23aaebacc7879668923.service: Deactivated successfully. 
Dec 15 05:04:19 localhost nova_compute[286344]: 2025-12-15 10:04:19.985 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:04:19 localhost kernel: device tap57619a5a-b1 left promiscuous mode Dec 15 05:04:19 localhost ovn_controller[154603]: 2025-12-15T10:04:19Z|00224|binding|INFO|Releasing lport 57619a5a-b115-4772-abf3-f3d2494fe5b1 from this chassis (sb_readonly=0) Dec 15 05:04:19 localhost ovn_controller[154603]: 2025-12-15T10:04:19Z|00225|binding|INFO|Setting lport 57619a5a-b115-4772-abf3-f3d2494fe5b1 down in Southbound Dec 15 05:04:20 localhost ovn_metadata_agent[160585]: 2025-12-15 10:04:20.000 160590 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005559462.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.3/28', 'neutron:device_id': 'dhcpce287b4b-a043-5ed9-8e08-4fd555d768a2-7842d897-78f9-4f71-9bb1-8eff4b50f3d6', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-7842d897-78f9-4f71-9bb1-8eff4b50f3d6', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '5619c5c73e504af8ba8040027fc52cd0', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'np0005559462.localdomain'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=cc061c61-218c-4770-a641-a91c5cc330cb, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=57619a5a-b115-4772-abf3-f3d2494fe5b1) 
old=Port_Binding(up=[True], chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Dec 15 05:04:20 localhost ovn_metadata_agent[160585]: 2025-12-15 10:04:20.002 160590 INFO neutron.agent.ovn.metadata.agent [-] Port 57619a5a-b115-4772-abf3-f3d2494fe5b1 in datapath 7842d897-78f9-4f71-9bb1-8eff4b50f3d6 unbound from our chassis#033[00m Dec 15 05:04:20 localhost ovn_metadata_agent[160585]: 2025-12-15 10:04:20.004 160590 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 7842d897-78f9-4f71-9bb1-8eff4b50f3d6, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m Dec 15 05:04:20 localhost ovn_metadata_agent[160585]: 2025-12-15 10:04:20.005 160858 DEBUG oslo.privsep.daemon [-] privsep: reply[ba2cdf3e-ac94-49cd-b0a4-d913c5ea78a5]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Dec 15 05:04:20 localhost nova_compute[286344]: 2025-12-15 10:04:20.014 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:04:20 localhost nova_compute[286344]: 2025-12-15 10:04:20.015 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:04:20 localhost nova_compute[286344]: 2025-12-15 10:04:20.071 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:04:20 localhost nova_compute[286344]: 2025-12-15 10:04:20.325 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:04:20 localhost ceph-mon[298913]: mon.np0005559462@0(leader).osd e134 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 
kv_alloc: 322961408 Dec 15 05:04:20 localhost nova_compute[286344]: 2025-12-15 10:04:20.644 286348 DEBUG oslo_service.periodic_task [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Dec 15 05:04:20 localhost nova_compute[286344]: 2025-12-15 10:04:20.645 286348 DEBUG oslo_service.periodic_task [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Dec 15 05:04:20 localhost nova_compute[286344]: 2025-12-15 10:04:20.645 286348 DEBUG nova.compute.manager [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m Dec 15 05:04:22 localhost nova_compute[286344]: 2025-12-15 10:04:22.455 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:04:23 localhost nova_compute[286344]: 2025-12-15 10:04:23.160 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:04:23 localhost dnsmasq[320120]: exiting on receipt of SIGTERM Dec 15 05:04:23 localhost podman[320497]: 2025-12-15 10:04:23.4531281 +0000 UTC m=+0.059724047 container kill 5564a2a5e3ddfe6d028d5382e8e67a11cd46b468ca09bbacade8b986d417afd5 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-7842d897-78f9-4f71-9bb1-8eff4b50f3d6, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, 
org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, io.buildah.version=1.41.3) Dec 15 05:04:23 localhost systemd[1]: Started /usr/bin/podman healthcheck run a4e94bd7aaee6c174c601060fb5f34559019ccea93fc397263ee3735731cfc1e. Dec 15 05:04:23 localhost systemd[1]: libpod-5564a2a5e3ddfe6d028d5382e8e67a11cd46b468ca09bbacade8b986d417afd5.scope: Deactivated successfully. Dec 15 05:04:23 localhost podman[320511]: 2025-12-15 10:04:23.525278544 +0000 UTC m=+0.054822632 container died 5564a2a5e3ddfe6d028d5382e8e67a11cd46b468ca09bbacade8b986d417afd5 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-7842d897-78f9-4f71-9bb1-8eff4b50f3d6, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3) Dec 15 05:04:23 localhost podman[320511]: 2025-12-15 10:04:23.561324157 +0000 UTC m=+0.090868195 container cleanup 5564a2a5e3ddfe6d028d5382e8e67a11cd46b468ca09bbacade8b986d417afd5 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-7842d897-78f9-4f71-9bb1-8eff4b50f3d6, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS) Dec 15 05:04:23 localhost systemd[1]: libpod-conmon-5564a2a5e3ddfe6d028d5382e8e67a11cd46b468ca09bbacade8b986d417afd5.scope: Deactivated successfully. 
Dec 15 05:04:23 localhost podman[320512]: 2025-12-15 10:04:23.611906172 +0000 UTC m=+0.135171333 container health_status a4e94bd7aaee6c174c601060fb5f34559019ccea93fc397263ee3735731cfc1e (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, container_name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter) Dec 15 05:04:23 localhost podman[320512]: 2025-12-15 10:04:23.622632451 +0000 UTC m=+0.145897642 container exec_died a4e94bd7aaee6c174c601060fb5f34559019ccea93fc397263ee3735731cfc1e (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', 
'/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible) Dec 15 05:04:23 localhost systemd[1]: a4e94bd7aaee6c174c601060fb5f34559019ccea93fc397263ee3735731cfc1e.service: Deactivated successfully. Dec 15 05:04:23 localhost podman[320513]: 2025-12-15 10:04:23.66776884 +0000 UTC m=+0.187163625 container remove 5564a2a5e3ddfe6d028d5382e8e67a11cd46b468ca09bbacade8b986d417afd5 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-7842d897-78f9-4f71-9bb1-8eff4b50f3d6, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_managed=true) Dec 15 05:04:23 localhost neutron_sriov_agent[260044]: 2025-12-15 10:04:23.808 2 INFO neutron.agent.securitygroups_rpc [None req-4f912a92-0a5e-4ebb-9605-82ccd65a0c13 3520f12d603643359512247956f536da 00da3dc49efb473f8dd4d26c5e7cce87 - - default default] Security group member updated ['f74b7233-4b50-43b4-b439-47f3091629fe']#033[00m Dec 15 05:04:23 localhost neutron_dhcp_agent[267542]: 2025-12-15 10:04:23.881 267546 INFO neutron.agent.dhcp.agent [None req-c2a7da4a-9516-4458-879c-91f9b9ec0dfa - - - - - -] Network not present, action: clean_devices, action_kwargs: {}#033[00m Dec 15 05:04:23 localhost neutron_dhcp_agent[267542]: 2025-12-15 10:04:23.950 267546 INFO neutron.agent.dhcp.agent [-] Network not present, action: clean_devices, action_kwargs: {}#033[00m Dec 15 05:04:24 localhost nova_compute[286344]: 2025-12-15 10:04:24.088 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:04:24 localhost 
neutron_dhcp_agent[267542]: 2025-12-15 10:04:24.386 267546 INFO neutron.agent.dhcp.agent [-] Network not present, action: clean_devices, action_kwargs: {}#033[00m Dec 15 05:04:24 localhost systemd[1]: var-lib-containers-storage-overlay-578d21ad95e08a97157b0194224e45641f055c167755f77abc77f6b371aa0038-merged.mount: Deactivated successfully. Dec 15 05:04:24 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-5564a2a5e3ddfe6d028d5382e8e67a11cd46b468ca09bbacade8b986d417afd5-userdata-shm.mount: Deactivated successfully. Dec 15 05:04:24 localhost systemd[1]: run-netns-qdhcp\x2d7842d897\x2d78f9\x2d4f71\x2d9bb1\x2d8eff4b50f3d6.mount: Deactivated successfully. Dec 15 05:04:24 localhost ovn_controller[154603]: 2025-12-15T10:04:24Z|00226|binding|INFO|Releasing lport b35254ad-12eb-47bb-92be-44fefe0694f0 from this chassis (sb_readonly=0) Dec 15 05:04:24 localhost nova_compute[286344]: 2025-12-15 10:04:24.935 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:04:25 localhost nova_compute[286344]: 2025-12-15 10:04:25.072 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:04:25 localhost ceph-mon[298913]: mon.np0005559462@0(leader).osd e134 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Dec 15 05:04:25 localhost neutron_sriov_agent[260044]: 2025-12-15 10:04:25.666 2 INFO neutron.agent.securitygroups_rpc [None req-35304b05-115d-420a-abbb-c108c8a8d850 3520f12d603643359512247956f536da 00da3dc49efb473f8dd4d26c5e7cce87 - - default default] Security group member updated ['f74b7233-4b50-43b4-b439-47f3091629fe']#033[00m Dec 15 05:04:26 localhost ovn_controller[154603]: 2025-12-15T10:04:26Z|00227|binding|INFO|Releasing lport b35254ad-12eb-47bb-92be-44fefe0694f0 from this chassis (sb_readonly=0) Dec 15 05:04:26 localhost 
nova_compute[286344]: 2025-12-15 10:04:26.332 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:04:26 localhost neutron_sriov_agent[260044]: 2025-12-15 10:04:26.888 2 INFO neutron.agent.securitygroups_rpc [None req-99b5ed1a-c43c-4df0-9473-9379f681d15c 3520f12d603643359512247956f536da 00da3dc49efb473f8dd4d26c5e7cce87 - - default default] Security group member updated ['f74b7233-4b50-43b4-b439-47f3091629fe']#033[00m Dec 15 05:04:26 localhost ovn_controller[154603]: 2025-12-15T10:04:26Z|00228|binding|INFO|Releasing lport b35254ad-12eb-47bb-92be-44fefe0694f0 from this chassis (sb_readonly=0) Dec 15 05:04:26 localhost nova_compute[286344]: 2025-12-15 10:04:26.944 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:04:27 localhost nova_compute[286344]: 2025-12-15 10:04:27.485 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:04:27 localhost nova_compute[286344]: 2025-12-15 10:04:27.937 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:04:28 localhost neutron_sriov_agent[260044]: 2025-12-15 10:04:28.726 2 INFO neutron.agent.securitygroups_rpc [None req-27eda8d2-7e84-408a-ba75-6a16ce7356f3 3520f12d603643359512247956f536da 00da3dc49efb473f8dd4d26c5e7cce87 - - default default] Security group member updated ['f74b7233-4b50-43b4-b439-47f3091629fe']#033[00m Dec 15 05:04:29 localhost ovn_controller[154603]: 2025-12-15T10:04:29Z|00229|binding|INFO|Releasing lport b35254ad-12eb-47bb-92be-44fefe0694f0 from this chassis (sb_readonly=0) Dec 15 05:04:29 localhost nova_compute[286344]: 2025-12-15 10:04:29.598 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup 
/usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:04:30 localhost nova_compute[286344]: 2025-12-15 10:04:30.074 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:04:30 localhost ceph-mon[298913]: mon.np0005559462@0(leader).osd e134 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Dec 15 05:04:31 localhost ovn_controller[154603]: 2025-12-15T10:04:31Z|00230|binding|INFO|Releasing lport b35254ad-12eb-47bb-92be-44fefe0694f0 from this chassis (sb_readonly=0) Dec 15 05:04:31 localhost nova_compute[286344]: 2025-12-15 10:04:31.204 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:04:31 localhost ceph-mon[298913]: mon.np0005559462@0(leader).osd e134 do_prune osdmap full prune enabled Dec 15 05:04:31 localhost ceph-mon[298913]: mon.np0005559462@0(leader).osd e135 e135: 6 total, 6 up, 6 in Dec 15 05:04:31 localhost ceph-mon[298913]: log_channel(cluster) log [DBG] : osdmap e135: 6 total, 6 up, 6 in Dec 15 05:04:31 localhost podman[243449]: time="2025-12-15T10:04:31Z" level=info msg="List containers: received `last` parameter - overwriting `limit`" Dec 15 05:04:31 localhost podman[243449]: @ - - [15/Dec/2025:10:04:31 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 156640 "" "Go-http-client/1.1" Dec 15 05:04:31 localhost podman[243449]: @ - - [15/Dec/2025:10:04:31 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 19254 "" "Go-http-client/1.1" Dec 15 05:04:32 localhost nova_compute[286344]: 2025-12-15 10:04:32.529 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:04:34 localhost 
openstack_network_exporter[246484]: ERROR 10:04:34 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd Dec 15 05:04:34 localhost openstack_network_exporter[246484]: ERROR 10:04:34 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server Dec 15 05:04:34 localhost openstack_network_exporter[246484]: ERROR 10:04:34 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd Dec 15 05:04:34 localhost openstack_network_exporter[246484]: ERROR 10:04:34 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath Dec 15 05:04:34 localhost openstack_network_exporter[246484]: Dec 15 05:04:34 localhost openstack_network_exporter[246484]: ERROR 10:04:34 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath Dec 15 05:04:34 localhost openstack_network_exporter[246484]: Dec 15 05:04:34 localhost neutron_sriov_agent[260044]: 2025-12-15 10:04:34.961 2 INFO neutron.agent.securitygroups_rpc [None req-cfc6a4f8-11be-4680-bfb8-be3da9c1800c 7732c1fde68d40b3beb5c5f9345af293 038fdb9daf9d43f5be9457a11b03ec86 - - default default] Security group member updated ['d942e487-4c1f-4bd1-a77b-444ce3009093']#033[00m Dec 15 05:04:35 localhost nova_compute[286344]: 2025-12-15 10:04:35.076 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:04:35 localhost ceph-mon[298913]: mon.np0005559462@0(leader).osd e135 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Dec 15 05:04:36 localhost neutron_sriov_agent[260044]: 2025-12-15 10:04:36.374 2 INFO neutron.agent.securitygroups_rpc [None req-d57d27b1-04e3-4daf-8dbb-e4fa891233b6 7732c1fde68d40b3beb5c5f9345af293 038fdb9daf9d43f5be9457a11b03ec86 - - default default] Security group member updated 
['d942e487-4c1f-4bd1-a77b-444ce3009093']#033[00m Dec 15 05:04:37 localhost nova_compute[286344]: 2025-12-15 10:04:37.556 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:04:37 localhost ceph-mon[298913]: mon.np0005559462@0(leader).osd e135 do_prune osdmap full prune enabled Dec 15 05:04:37 localhost ceph-mon[298913]: mon.np0005559462@0(leader).osd e136 e136: 6 total, 6 up, 6 in Dec 15 05:04:37 localhost ceph-mon[298913]: log_channel(cluster) log [DBG] : osdmap e136: 6 total, 6 up, 6 in Dec 15 05:04:38 localhost ceph-mon[298913]: mon.np0005559462@0(leader).osd e136 do_prune osdmap full prune enabled Dec 15 05:04:38 localhost ceph-mon[298913]: mon.np0005559462@0(leader).osd e137 e137: 6 total, 6 up, 6 in Dec 15 05:04:38 localhost ceph-mon[298913]: log_channel(cluster) log [DBG] : osdmap e137: 6 total, 6 up, 6 in Dec 15 05:04:39 localhost systemd[1]: Started /usr/bin/podman healthcheck run 67ee72fcd99c896f79e8807764b1d0f04e7d45927d06e306c0986394e8ed42a0. Dec 15 05:04:39 localhost systemd[1]: Started /usr/bin/podman healthcheck run 9f0f481c937606298203797a71924b7614a5d9e3f4a8afd837cfa5b9602aedeb. Dec 15 05:04:39 localhost systemd[1]: Started /usr/bin/podman healthcheck run b3d8483ade539500e228c2af9c0c3e647febb5ec7903610a83aea32631007d4a. 
Dec 15 05:04:39 localhost ceph-mon[298913]: mon.np0005559462@0(leader).osd e137 do_prune osdmap full prune enabled Dec 15 05:04:39 localhost podman[320558]: 2025-12-15 10:04:39.751700703 +0000 UTC m=+0.080057395 container health_status 67ee72fcd99c896f79e8807764b1d0f04e7d45927d06e306c0986394e8ed42a0 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}) Dec 15 05:04:39 localhost podman[320558]: 2025-12-15 10:04:39.760325938 +0000 UTC m=+0.088682650 container exec_died 67ee72fcd99c896f79e8807764b1d0f04e7d45927d06e306c0986394e8ed42a0 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, config_data={'command': 
['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible) Dec 15 05:04:39 localhost systemd[1]: 67ee72fcd99c896f79e8807764b1d0f04e7d45927d06e306c0986394e8ed42a0.service: Deactivated successfully. 
Dec 15 05:04:39 localhost ceph-mon[298913]: mon.np0005559462@0(leader).osd e138 e138: 6 total, 6 up, 6 in Dec 15 05:04:39 localhost ceph-mon[298913]: log_channel(cluster) log [DBG] : osdmap e138: 6 total, 6 up, 6 in Dec 15 05:04:39 localhost podman[320560]: 2025-12-15 10:04:39.81433171 +0000 UTC m=+0.137501592 container health_status b3d8483ade539500e228c2af9c0c3e647febb5ec7903610a83aea32631007d4a (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ceilometer_agent_compute, container_name=ceilometer_agent_compute, managed_by=edpm_ansible, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '07f3af15d79276418f54a8dde2fc251064527c3571430e14af77aaa2c01f1193-cb1e9478634a00c8fb5ad9cd7fcd38c7b2a1a85187dfaa618bc715c24c2b15fb'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image) Dec 15 05:04:39 localhost podman[320560]: 2025-12-15 10:04:39.822634667 +0000 UTC m=+0.145804599 container exec_died b3d8483ade539500e228c2af9c0c3e647febb5ec7903610a83aea32631007d4a (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_id=ceilometer_agent_compute, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '07f3af15d79276418f54a8dde2fc251064527c3571430e14af77aaa2c01f1193-cb1e9478634a00c8fb5ad9cd7fcd38c7b2a1a85187dfaa618bc715c24c2b15fb'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}) Dec 15 05:04:39 localhost systemd[1]: b3d8483ade539500e228c2af9c0c3e647febb5ec7903610a83aea32631007d4a.service: Deactivated successfully. Dec 15 05:04:39 localhost podman[320559]: 2025-12-15 10:04:39.871350106 +0000 UTC m=+0.196403786 container health_status 9f0f481c937606298203797a71924b7614a5d9e3f4a8afd837cfa5b9602aedeb (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, org.label-schema.vendor=CentOS, config_id=multipathd, managed_by=edpm_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3, 
maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, container_name=multipathd, org.label-schema.license=GPLv2) Dec 15 05:04:39 localhost podman[320559]: 2025-12-15 10:04:39.885397647 +0000 UTC m=+0.210451377 container exec_died 9f0f481c937606298203797a71924b7614a5d9e3f4a8afd837cfa5b9602aedeb (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, config_id=multipathd, container_name=multipathd, managed_by=edpm_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', 
'/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3) Dec 15 05:04:39 localhost systemd[1]: 9f0f481c937606298203797a71924b7614a5d9e3f4a8afd837cfa5b9602aedeb.service: Deactivated successfully. Dec 15 05:04:40 localhost nova_compute[286344]: 2025-12-15 10:04:40.078 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:04:40 localhost ceph-mon[298913]: mon.np0005559462@0(leader).osd e138 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Dec 15 05:04:40 localhost ceph-mon[298913]: mon.np0005559462@0(leader).osd e138 do_prune osdmap full prune enabled Dec 15 05:04:40 localhost ceph-mon[298913]: mon.np0005559462@0(leader).osd e139 e139: 6 total, 6 up, 6 in Dec 15 05:04:40 localhost ceph-mon[298913]: log_channel(cluster) log [DBG] : osdmap e139: 6 total, 6 up, 6 in Dec 15 05:04:41 localhost systemd[1]: Started /usr/bin/podman healthcheck run 730c50020a7d9e2da21d2462d40af20ac303e43f3491f692cbbd87f432ba3e09. Dec 15 05:04:41 localhost systemd[1]: Started /usr/bin/podman healthcheck run ac2eae30a2a7f971398af42ea67b3ed799d4b3341f7688ebcd73f7ca7f66c2a8. 
Dec 15 05:04:41 localhost podman[320616]: 2025-12-15 10:04:41.758307545 +0000 UTC m=+0.085343186 container health_status 730c50020a7d9e2da21d2462d40af20ac303e43f3491f692cbbd87f432ba3e09 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=minimal rhel9, distribution-scope=public, io.openshift.expose-services=, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'cb1e9478634a00c8fb5ad9cd7fcd38c7b2a1a85187dfaa618bc715c24c2b15fb'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., container_name=openstack_network_exporter, release=1755695350, managed_by=edpm_ansible, name=ubi9-minimal, build-date=2025-08-20T13:12:41, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.component=ubi9-minimal-container, maintainer=Red Hat, Inc., io.buildah.version=1.33.7, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vendor=Red Hat, Inc., architecture=x86_64, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, config_id=openstack_network_exporter, vcs-type=git, version=9.6) Dec 15 05:04:41 localhost podman[320616]: 2025-12-15 10:04:41.769856954 +0000 UTC m=+0.096892575 container exec_died 730c50020a7d9e2da21d2462d40af20ac303e43f3491f692cbbd87f432ba3e09 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'cb1e9478634a00c8fb5ad9cd7fcd38c7b2a1a85187dfaa618bc715c24c2b15fb'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': 
['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, build-date=2025-08-20T13:12:41, distribution-scope=public, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, maintainer=Red Hat, Inc., container_name=openstack_network_exporter, io.openshift.tags=minimal rhel9, vendor=Red Hat, Inc., managed_by=edpm_ansible, io.openshift.expose-services=, name=ubi9-minimal, config_id=openstack_network_exporter, release=1755695350, version=9.6, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=ubi9-minimal-container, architecture=x86_64, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-type=git) Dec 15 05:04:41 localhost systemd[1]: 730c50020a7d9e2da21d2462d40af20ac303e43f3491f692cbbd87f432ba3e09.service: Deactivated successfully. 
Dec 15 05:04:41 localhost podman[320617]: 2025-12-15 10:04:41.863866107 +0000 UTC m=+0.188056797 container health_status ac2eae30a2a7f971398af42ea67b3ed799d4b3341f7688ebcd73f7ca7f66c2a8 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_controller, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '07f3af15d79276418f54a8dde2fc251064527c3571430e14af77aaa2c01f1193'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true) Dec 15 05:04:41 localhost podman[320617]: 2025-12-15 10:04:41.906720659 +0000 UTC m=+0.230911329 container exec_died ac2eae30a2a7f971398af42ea67b3ed799d4b3341f7688ebcd73f7ca7f66c2a8 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, org.label-schema.vendor=CentOS, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, 
config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '07f3af15d79276418f54a8dde2fc251064527c3571430e14af77aaa2c01f1193'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, tcib_managed=true, container_name=ovn_controller) Dec 15 05:04:41 localhost systemd[1]: ac2eae30a2a7f971398af42ea67b3ed799d4b3341f7688ebcd73f7ca7f66c2a8.service: Deactivated successfully. 
Dec 15 05:04:42 localhost nova_compute[286344]: 2025-12-15 10:04:42.592 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:04:43 localhost neutron_dhcp_agent[267542]: 2025-12-15 10:04:43.409 267546 INFO neutron.agent.dhcp.agent [-] Network not present, action: clean_devices, action_kwargs: {}#033[00m Dec 15 05:04:45 localhost nova_compute[286344]: 2025-12-15 10:04:45.080 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:04:45 localhost ceph-mon[298913]: mon.np0005559462@0(leader).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Dec 15 05:04:45 localhost ceph-mon[298913]: mon.np0005559462@0(leader).osd e139 do_prune osdmap full prune enabled Dec 15 05:04:45 localhost ceph-mon[298913]: mon.np0005559462@0(leader).osd e140 e140: 6 total, 6 up, 6 in Dec 15 05:04:45 localhost ceph-mon[298913]: log_channel(cluster) log [DBG] : osdmap e140: 6 total, 6 up, 6 in Dec 15 05:04:46 localhost neutron_sriov_agent[260044]: 2025-12-15 10:04:46.544 2 INFO neutron.agent.securitygroups_rpc [None req-ecd743bf-1e79-4d27-8b42-6d9ed3a5ffd5 25d1636c66804417a83ea281cb8c9d46 563dc37280b84f6fb12f8a3e7cef24a1 - - default default] Security group member updated ['eb2e19fb-6707-4d05-b98d-77f65ef15125']#033[00m Dec 15 05:04:46 localhost neutron_sriov_agent[260044]: 2025-12-15 10:04:46.681 2 INFO neutron.agent.securitygroups_rpc [None req-ecd743bf-1e79-4d27-8b42-6d9ed3a5ffd5 25d1636c66804417a83ea281cb8c9d46 563dc37280b84f6fb12f8a3e7cef24a1 - - default default] Security group member updated ['eb2e19fb-6707-4d05-b98d-77f65ef15125']#033[00m Dec 15 05:04:47 localhost neutron_sriov_agent[260044]: 2025-12-15 10:04:47.577 2 INFO neutron.agent.securitygroups_rpc [None req-35b9ecf0-6d0a-458d-b8a4-e1303ad17b38 25d1636c66804417a83ea281cb8c9d46 
563dc37280b84f6fb12f8a3e7cef24a1 - - default default] Security group member updated ['eb2e19fb-6707-4d05-b98d-77f65ef15125']#033[00m Dec 15 05:04:47 localhost nova_compute[286344]: 2025-12-15 10:04:47.632 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:04:47 localhost neutron_dhcp_agent[267542]: 2025-12-15 10:04:47.635 267546 INFO neutron.agent.dhcp.agent [-] Network not present, action: clean_devices, action_kwargs: {}#033[00m Dec 15 05:04:48 localhost neutron_sriov_agent[260044]: 2025-12-15 10:04:48.146 2 INFO neutron.agent.securitygroups_rpc [None req-15189b10-3067-4132-b00c-112b866ab37b 25d1636c66804417a83ea281cb8c9d46 563dc37280b84f6fb12f8a3e7cef24a1 - - default default] Security group member updated ['eb2e19fb-6707-4d05-b98d-77f65ef15125']#033[00m Dec 15 05:04:50 localhost nova_compute[286344]: 2025-12-15 10:04:50.082 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:04:50 localhost ceph-mon[298913]: mon.np0005559462@0(leader).osd e140 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Dec 15 05:04:50 localhost ceph-mon[298913]: mon.np0005559462@0(leader).osd e140 do_prune osdmap full prune enabled Dec 15 05:04:50 localhost ceph-mon[298913]: mon.np0005559462@0(leader).osd e141 e141: 6 total, 6 up, 6 in Dec 15 05:04:50 localhost ceph-mon[298913]: log_channel(cluster) log [DBG] : osdmap e141: 6 total, 6 up, 6 in Dec 15 05:04:50 localhost systemd[1]: Started /usr/bin/podman healthcheck run 4d46c406a3855fd1c322e5d86c048188ea5865daa07ba23aaebacc7879668923. 
Dec 15 05:04:50 localhost podman[320661]: 2025-12-15 10:04:50.74191021 +0000 UTC m=+0.070124226 container health_status 4d46c406a3855fd1c322e5d86c048188ea5865daa07ba23aaebacc7879668923 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, maintainer=OpenStack Kubernetes Operator team, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '07f3af15d79276418f54a8dde2fc251064527c3571430e14af77aaa2c01f1193-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.vendor=CentOS) Dec 15 05:04:50 localhost 
podman[320661]: 2025-12-15 10:04:50.772416084 +0000 UTC m=+0.100630030 container exec_died 4d46c406a3855fd1c322e5d86c048188ea5865daa07ba23aaebacc7879668923 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.license=GPLv2, config_id=ovn_metadata_agent, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '07f3af15d79276418f54a8dde2fc251064527c3571430e14af77aaa2c01f1193-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb) Dec 15 05:04:50 localhost systemd[1]: 
4d46c406a3855fd1c322e5d86c048188ea5865daa07ba23aaebacc7879668923.service: Deactivated successfully. Dec 15 05:04:51 localhost neutron_sriov_agent[260044]: 2025-12-15 10:04:51.181 2 INFO neutron.agent.securitygroups_rpc [None req-66a851d2-93fc-48cd-aacb-0761998ba8ff 2af48a84171e48bebc064b89f6439b19 e13843b63193470fb73acd260577a584 - - default default] Security group member updated ['4cd94ed3-9cfc-4ea0-a946-5c12e5f4a827']#033[00m Dec 15 05:04:51 localhost ovn_metadata_agent[160585]: 2025-12-15 10:04:51.481 160590 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Dec 15 05:04:51 localhost ovn_metadata_agent[160585]: 2025-12-15 10:04:51.482 160590 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Dec 15 05:04:51 localhost ovn_metadata_agent[160585]: 2025-12-15 10:04:51.483 160590 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Dec 15 05:04:51 localhost neutron_sriov_agent[260044]: 2025-12-15 10:04:51.801 2 INFO neutron.agent.securitygroups_rpc [None req-66a851d2-93fc-48cd-aacb-0761998ba8ff 2af48a84171e48bebc064b89f6439b19 e13843b63193470fb73acd260577a584 - - default default] Security group member updated ['4cd94ed3-9cfc-4ea0-a946-5c12e5f4a827']#033[00m Dec 15 05:04:52 localhost nova_compute[286344]: 2025-12-15 10:04:52.668 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:04:52 
localhost neutron_sriov_agent[260044]: 2025-12-15 10:04:52.936 2 INFO neutron.agent.securitygroups_rpc [None req-6b893e9b-aa66-4a57-87e4-0741157b6fd5 2af48a84171e48bebc064b89f6439b19 e13843b63193470fb73acd260577a584 - - default default] Security group member updated ['4cd94ed3-9cfc-4ea0-a946-5c12e5f4a827']#033[00m Dec 15 05:04:53 localhost neutron_sriov_agent[260044]: 2025-12-15 10:04:53.512 2 INFO neutron.agent.securitygroups_rpc [None req-e1687962-fbe8-4549-93b3-11c84a0c79cd 2af48a84171e48bebc064b89f6439b19 e13843b63193470fb73acd260577a584 - - default default] Security group member updated ['4cd94ed3-9cfc-4ea0-a946-5c12e5f4a827']#033[00m Dec 15 05:04:53 localhost systemd[1]: Started /usr/bin/podman healthcheck run a4e94bd7aaee6c174c601060fb5f34559019ccea93fc397263ee3735731cfc1e. Dec 15 05:04:53 localhost podman[320678]: 2025-12-15 10:04:53.762370903 +0000 UTC m=+0.081193633 container health_status a4e94bd7aaee6c174c601060fb5f34559019ccea93fc397263ee3735731cfc1e (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible) Dec 15 05:04:53 localhost podman[320678]: 2025-12-15 10:04:53.800499887 +0000 UTC m=+0.119322617 container exec_died 
a4e94bd7aaee6c174c601060fb5f34559019ccea93fc397263ee3735731cfc1e (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter) Dec 15 05:04:53 localhost systemd[1]: a4e94bd7aaee6c174c601060fb5f34559019ccea93fc397263ee3735731cfc1e.service: Deactivated successfully. Dec 15 05:04:53 localhost neutron_dhcp_agent[267542]: 2025-12-15 10:04:53.885 267546 INFO neutron.agent.linux.ip_lib [None req-c07d57b6-4500-41e8-a833-eb43dfa1a51b - - - - - -] Device tap0a9f7a0a-96 cannot be used as it has no MAC address#033[00m Dec 15 05:04:53 localhost nova_compute[286344]: 2025-12-15 10:04:53.912 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:04:53 localhost kernel: device tap0a9f7a0a-96 entered promiscuous mode Dec 15 05:04:53 localhost ovn_controller[154603]: 2025-12-15T10:04:53Z|00231|binding|INFO|Claiming lport 0a9f7a0a-96b1-4b9a-bc1c-6545a366856b for this chassis. 
Dec 15 05:04:53 localhost ovn_controller[154603]: 2025-12-15T10:04:53Z|00232|binding|INFO|0a9f7a0a-96b1-4b9a-bc1c-6545a366856b: Claiming unknown Dec 15 05:04:53 localhost NetworkManager[5963]: [1765793093.9236] manager: (tap0a9f7a0a-96): new Generic device (/org/freedesktop/NetworkManager/Devices/37) Dec 15 05:04:53 localhost nova_compute[286344]: 2025-12-15 10:04:53.922 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:04:53 localhost systemd-udevd[320712]: Network interface NamePolicy= disabled on kernel command line. Dec 15 05:04:53 localhost ovn_metadata_agent[160585]: 2025-12-15 10:04:53.931 160590 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005559462.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 'dhcpce287b4b-a043-5ed9-8e08-4fd555d768a2-c65af4c5-ccf6-443c-a9e0-706f066852e4', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-c65af4c5-ccf6-443c-a9e0-706f066852e4', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '563dc37280b84f6fb12f8a3e7cef24a1', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=b0519035-c3ed-434b-9353-cedfebc7fa6d, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=0a9f7a0a-96b1-4b9a-bc1c-6545a366856b) old=Port_Binding(chassis=[]) matches 
/usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Dec 15 05:04:53 localhost ovn_metadata_agent[160585]: 2025-12-15 10:04:53.934 160590 INFO neutron.agent.ovn.metadata.agent [-] Port 0a9f7a0a-96b1-4b9a-bc1c-6545a366856b in datapath c65af4c5-ccf6-443c-a9e0-706f066852e4 bound to our chassis#033[00m Dec 15 05:04:53 localhost ovn_metadata_agent[160585]: 2025-12-15 10:04:53.935 160590 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network c65af4c5-ccf6-443c-a9e0-706f066852e4 or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599#033[00m Dec 15 05:04:53 localhost ovn_metadata_agent[160585]: 2025-12-15 10:04:53.936 160858 DEBUG oslo.privsep.daemon [-] privsep: reply[3d03ffde-1769-417b-bd6c-1d5781373d10]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Dec 15 05:04:53 localhost journal[231322]: ethtool ioctl error on tap0a9f7a0a-96: No such device Dec 15 05:04:53 localhost ovn_controller[154603]: 2025-12-15T10:04:53Z|00233|binding|INFO|Setting lport 0a9f7a0a-96b1-4b9a-bc1c-6545a366856b ovn-installed in OVS Dec 15 05:04:53 localhost ovn_controller[154603]: 2025-12-15T10:04:53Z|00234|binding|INFO|Setting lport 0a9f7a0a-96b1-4b9a-bc1c-6545a366856b up in Southbound Dec 15 05:04:53 localhost journal[231322]: ethtool ioctl error on tap0a9f7a0a-96: No such device Dec 15 05:04:53 localhost nova_compute[286344]: 2025-12-15 10:04:53.963 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:04:53 localhost journal[231322]: ethtool ioctl error on tap0a9f7a0a-96: No such device Dec 15 05:04:53 localhost journal[231322]: ethtool ioctl error on tap0a9f7a0a-96: No such device Dec 15 05:04:53 localhost journal[231322]: ethtool ioctl error on tap0a9f7a0a-96: No such device Dec 15 
05:04:53 localhost journal[231322]: ethtool ioctl error on tap0a9f7a0a-96: No such device Dec 15 05:04:53 localhost journal[231322]: ethtool ioctl error on tap0a9f7a0a-96: No such device Dec 15 05:04:53 localhost journal[231322]: ethtool ioctl error on tap0a9f7a0a-96: No such device Dec 15 05:04:54 localhost nova_compute[286344]: 2025-12-15 10:04:54.007 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:04:54 localhost nova_compute[286344]: 2025-12-15 10:04:54.044 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:04:54 localhost podman[320783]: Dec 15 05:04:54 localhost podman[320783]: 2025-12-15 10:04:54.977029129 +0000 UTC m=+0.085922782 container create a3a5f00715764a5b922c912c52ca54ee51d6711d6891e7fe4f167c056f11e44e (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-c65af4c5-ccf6-443c-a9e0-706f066852e4, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251202, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team) Dec 15 05:04:55 localhost systemd[1]: Started libpod-conmon-a3a5f00715764a5b922c912c52ca54ee51d6711d6891e7fe4f167c056f11e44e.scope. Dec 15 05:04:55 localhost systemd[1]: tmp-crun.jMVVjE.mount: Deactivated successfully. Dec 15 05:04:55 localhost podman[320783]: 2025-12-15 10:04:54.940938606 +0000 UTC m=+0.049832279 image pull quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified Dec 15 05:04:55 localhost systemd[1]: Started libcrun container. 
Dec 15 05:04:55 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/af1e3419cb91aa152408d3957397082580c6897aee2afbe88e53f7503cd9adc8/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff) Dec 15 05:04:55 localhost podman[320783]: 2025-12-15 10:04:55.083680168 +0000 UTC m=+0.192573811 container init a3a5f00715764a5b922c912c52ca54ee51d6711d6891e7fe4f167c056f11e44e (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-c65af4c5-ccf6-443c-a9e0-706f066852e4, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202) Dec 15 05:04:55 localhost nova_compute[286344]: 2025-12-15 10:04:55.085 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:04:55 localhost podman[320783]: 2025-12-15 10:04:55.090053287 +0000 UTC m=+0.198946930 container start a3a5f00715764a5b922c912c52ca54ee51d6711d6891e7fe4f167c056f11e44e (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-c65af4c5-ccf6-443c-a9e0-706f066852e4, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2) Dec 15 05:04:55 localhost dnsmasq[320801]: started, version 2.85 cachesize 150 Dec 15 05:04:55 localhost dnsmasq[320801]: DNS service limited to local subnets Dec 15 05:04:55 localhost dnsmasq[320801]: compile time options: IPv6 
GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile Dec 15 05:04:55 localhost dnsmasq[320801]: warning: no upstream servers configured Dec 15 05:04:55 localhost dnsmasq-dhcp[320801]: DHCP, static leases only on 10.100.0.0, lease time 1d Dec 15 05:04:55 localhost dnsmasq[320801]: read /var/lib/neutron/dhcp/c65af4c5-ccf6-443c-a9e0-706f066852e4/addn_hosts - 0 addresses Dec 15 05:04:55 localhost dnsmasq-dhcp[320801]: read /var/lib/neutron/dhcp/c65af4c5-ccf6-443c-a9e0-706f066852e4/host Dec 15 05:04:55 localhost dnsmasq-dhcp[320801]: read /var/lib/neutron/dhcp/c65af4c5-ccf6-443c-a9e0-706f066852e4/opts Dec 15 05:04:55 localhost neutron_dhcp_agent[267542]: 2025-12-15 10:04:55.234 267546 INFO neutron.agent.dhcp.agent [None req-2c313e4c-106e-45c6-af32-fd197c1d27cb - - - - - -] DHCP configuration for ports {'08314c99-1b63-4e5f-99da-a3649f9a9ac8'} is completed#033[00m Dec 15 05:04:55 localhost ceph-mon[298913]: mon.np0005559462@0(leader).osd e141 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Dec 15 05:04:56 localhost neutron_sriov_agent[260044]: 2025-12-15 10:04:56.967 2 INFO neutron.agent.securitygroups_rpc [None req-e426e794-554f-40af-90a5-90f2064220fa 24bea5dd532742feaa6e0ad6cacbb6fd beeb00f2f9c9442f94b7368b5c3f4007 - - default default] Security group rule updated ['9c74cb5a-c586-466e-8a06-28318d260ea1']#033[00m Dec 15 05:04:57 localhost nova_compute[286344]: 2025-12-15 10:04:57.719 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:04:58 localhost ceph-mon[298913]: mon.np0005559462@0(leader) e17 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/osd_remove_queue}] v 0) Dec 15 05:04:58 localhost ceph-mon[298913]: log_channel(audit) log [INF] : from='mgr.34411 ' entity='mgr.np0005559464.aomnqe' Dec 15 05:04:58 localhost 
dnsmasq[320801]: exiting on receipt of SIGTERM Dec 15 05:04:58 localhost podman[320885]: 2025-12-15 10:04:58.835230817 +0000 UTC m=+0.060403983 container kill a3a5f00715764a5b922c912c52ca54ee51d6711d6891e7fe4f167c056f11e44e (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-c65af4c5-ccf6-443c-a9e0-706f066852e4, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0) Dec 15 05:04:58 localhost systemd[1]: libpod-a3a5f00715764a5b922c912c52ca54ee51d6711d6891e7fe4f167c056f11e44e.scope: Deactivated successfully. Dec 15 05:04:58 localhost podman[320912]: 2025-12-15 10:04:58.911626648 +0000 UTC m=+0.059955671 container died a3a5f00715764a5b922c912c52ca54ee51d6711d6891e7fe4f167c056f11e44e (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-c65af4c5-ccf6-443c-a9e0-706f066852e4, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_managed=true) Dec 15 05:04:58 localhost ceph-mon[298913]: from='mgr.34411 172.18.0.108:0/929118679' entity='mgr.np0005559464.aomnqe' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch Dec 15 05:04:58 localhost ceph-mon[298913]: from='mgr.34411 ' entity='mgr.np0005559464.aomnqe' Dec 15 05:04:58 localhost systemd[1]: tmp-crun.IUpyCb.mount: Deactivated successfully. 
Dec 15 05:04:58 localhost podman[320912]: 2025-12-15 10:04:58.950744317 +0000 UTC m=+0.099073290 container cleanup a3a5f00715764a5b922c912c52ca54ee51d6711d6891e7fe4f167c056f11e44e (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-c65af4c5-ccf6-443c-a9e0-706f066852e4, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0) Dec 15 05:04:58 localhost systemd[1]: libpod-conmon-a3a5f00715764a5b922c912c52ca54ee51d6711d6891e7fe4f167c056f11e44e.scope: Deactivated successfully. Dec 15 05:04:58 localhost podman[320914]: 2025-12-15 10:04:58.989194389 +0000 UTC m=+0.129632074 container remove a3a5f00715764a5b922c912c52ca54ee51d6711d6891e7fe4f167c056f11e44e (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-c65af4c5-ccf6-443c-a9e0-706f066852e4, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0) Dec 15 05:04:59 localhost nova_compute[286344]: 2025-12-15 10:04:58.999 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:04:59 localhost kernel: device tap0a9f7a0a-96 left promiscuous mode Dec 15 05:04:59 localhost ovn_controller[154603]: 2025-12-15T10:04:59Z|00235|binding|INFO|Releasing lport 0a9f7a0a-96b1-4b9a-bc1c-6545a366856b from this chassis (sb_readonly=0) Dec 15 05:04:59 localhost ovn_controller[154603]: 
2025-12-15T10:04:59Z|00236|binding|INFO|Setting lport 0a9f7a0a-96b1-4b9a-bc1c-6545a366856b down in Southbound Dec 15 05:04:59 localhost nova_compute[286344]: 2025-12-15 10:04:59.027 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:04:59 localhost ovn_metadata_agent[160585]: 2025-12-15 10:04:59.025 160590 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005559462.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 'dhcpce287b4b-a043-5ed9-8e08-4fd555d768a2-c65af4c5-ccf6-443c-a9e0-706f066852e4', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-c65af4c5-ccf6-443c-a9e0-706f066852e4', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '563dc37280b84f6fb12f8a3e7cef24a1', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'np0005559462.localdomain'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=b0519035-c3ed-434b-9353-cedfebc7fa6d, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=0a9f7a0a-96b1-4b9a-bc1c-6545a366856b) old=Port_Binding(up=[True], chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Dec 15 05:04:59 localhost ovn_metadata_agent[160585]: 2025-12-15 10:04:59.027 160590 INFO neutron.agent.ovn.metadata.agent [-] Port 0a9f7a0a-96b1-4b9a-bc1c-6545a366856b in datapath 
c65af4c5-ccf6-443c-a9e0-706f066852e4 unbound from our chassis#033[00m Dec 15 05:04:59 localhost ovn_metadata_agent[160585]: 2025-12-15 10:04:59.029 160590 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network c65af4c5-ccf6-443c-a9e0-706f066852e4, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m Dec 15 05:04:59 localhost ovn_metadata_agent[160585]: 2025-12-15 10:04:59.030 160858 DEBUG oslo.privsep.daemon [-] privsep: reply[d7ab1471-7c89-4e41-a5d1-f590e2272ed3]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Dec 15 05:04:59 localhost neutron_dhcp_agent[267542]: 2025-12-15 10:04:59.500 267546 INFO neutron.agent.dhcp.agent [None req-3e936cf0-342f-494a-a800-07f2c23160c0 - - - - - -] Network not present, action: clean_devices, action_kwargs: {}#033[00m Dec 15 05:04:59 localhost ceph-mon[298913]: mon.np0005559462@0(leader) e17 handle_command mon_command([{prefix=config-key set, key=mgr/progress/completed}] v 0) Dec 15 05:04:59 localhost ceph-mon[298913]: log_channel(audit) log [INF] : from='mgr.34411 ' entity='mgr.np0005559464.aomnqe' Dec 15 05:04:59 localhost neutron_dhcp_agent[267542]: 2025-12-15 10:04:59.650 267546 INFO neutron.agent.linux.ip_lib [None req-04bc7122-a8ac-47f8-b096-3fca903e3a73 - - - - - -] Device tap3c362108-9e cannot be used as it has no MAC address#033[00m Dec 15 05:04:59 localhost nova_compute[286344]: 2025-12-15 10:04:59.669 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:04:59 localhost kernel: device tap3c362108-9e entered promiscuous mode Dec 15 05:04:59 localhost ovn_controller[154603]: 2025-12-15T10:04:59Z|00237|binding|INFO|Claiming lport 3c362108-9e60-4c22-9f0a-818e7a646527 for this chassis. 
Dec 15 05:04:59 localhost nova_compute[286344]: 2025-12-15 10:04:59.676 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:04:59 localhost ovn_controller[154603]: 2025-12-15T10:04:59Z|00238|binding|INFO|3c362108-9e60-4c22-9f0a-818e7a646527: Claiming unknown Dec 15 05:04:59 localhost NetworkManager[5963]: [1765793099.6768] manager: (tap3c362108-9e): new Generic device (/org/freedesktop/NetworkManager/Devices/38) Dec 15 05:04:59 localhost systemd-udevd[320951]: Network interface NamePolicy= disabled on kernel command line. Dec 15 05:04:59 localhost ovn_metadata_agent[160585]: 2025-12-15 10:04:59.686 160590 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005559462.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8::1/64', 'neutron:device_id': 'dhcpce287b4b-a043-5ed9-8e08-4fd555d768a2-d6e75a2b-2604-402d-b4f1-86d545a4d7e3', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-d6e75a2b-2604-402d-b4f1-86d545a4d7e3', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'e13843b63193470fb73acd260577a584', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=e397cd2d-9696-473f-afe3-1425aae73e97, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=3c362108-9e60-4c22-9f0a-818e7a646527) old=Port_Binding(chassis=[]) matches 
/usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Dec 15 05:04:59 localhost ovn_metadata_agent[160585]: 2025-12-15 10:04:59.688 160590 INFO neutron.agent.ovn.metadata.agent [-] Port 3c362108-9e60-4c22-9f0a-818e7a646527 in datapath d6e75a2b-2604-402d-b4f1-86d545a4d7e3 bound to our chassis#033[00m Dec 15 05:04:59 localhost ovn_metadata_agent[160585]: 2025-12-15 10:04:59.689 160590 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network d6e75a2b-2604-402d-b4f1-86d545a4d7e3 or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599#033[00m Dec 15 05:04:59 localhost ovn_metadata_agent[160585]: 2025-12-15 10:04:59.690 160858 DEBUG oslo.privsep.daemon [-] privsep: reply[8a117976-0a37-4055-916f-008b7b226eb0]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Dec 15 05:04:59 localhost ovn_controller[154603]: 2025-12-15T10:04:59Z|00239|binding|INFO|Setting lport 3c362108-9e60-4c22-9f0a-818e7a646527 ovn-installed in OVS Dec 15 05:04:59 localhost ovn_controller[154603]: 2025-12-15T10:04:59Z|00240|binding|INFO|Setting lport 3c362108-9e60-4c22-9f0a-818e7a646527 up in Southbound Dec 15 05:04:59 localhost nova_compute[286344]: 2025-12-15 10:04:59.716 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:04:59 localhost nova_compute[286344]: 2025-12-15 10:04:59.753 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:04:59 localhost nova_compute[286344]: 2025-12-15 10:04:59.778 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:04:59 localhost systemd[1]: 
var-lib-containers-storage-overlay-af1e3419cb91aa152408d3957397082580c6897aee2afbe88e53f7503cd9adc8-merged.mount: Deactivated successfully. Dec 15 05:04:59 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-a3a5f00715764a5b922c912c52ca54ee51d6711d6891e7fe4f167c056f11e44e-userdata-shm.mount: Deactivated successfully. Dec 15 05:04:59 localhost systemd[1]: run-netns-qdhcp\x2dc65af4c5\x2dccf6\x2d443c\x2da9e0\x2d706f066852e4.mount: Deactivated successfully. Dec 15 05:04:59 localhost ceph-mon[298913]: mon.np0005559462@0(leader).osd e141 do_prune osdmap full prune enabled Dec 15 05:04:59 localhost ceph-mon[298913]: mon.np0005559462@0(leader).osd e142 e142: 6 total, 6 up, 6 in Dec 15 05:04:59 localhost ceph-mon[298913]: log_channel(cluster) log [DBG] : osdmap e142: 6 total, 6 up, 6 in Dec 15 05:05:00 localhost nova_compute[286344]: 2025-12-15 10:05:00.127 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:05:00 localhost ceph-mon[298913]: from='mgr.34411 ' entity='mgr.np0005559464.aomnqe' Dec 15 05:05:00 localhost podman[321006]: Dec 15 05:05:00 localhost podman[321006]: 2025-12-15 10:05:00.587755922 +0000 UTC m=+0.090522877 container create b60518e39f380de13646d6c409acd39b876092e3b26961688ba7542147e4bffa (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-d6e75a2b-2604-402d-b4f1-86d545a4d7e3, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, tcib_managed=true, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image) Dec 15 05:05:00 localhost systemd[1]: Started libpod-conmon-b60518e39f380de13646d6c409acd39b876092e3b26961688ba7542147e4bffa.scope. 
Dec 15 05:05:00 localhost podman[321006]: 2025-12-15 10:05:00.544410867 +0000 UTC m=+0.047177832 image pull quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified Dec 15 05:05:00 localhost systemd[1]: Started libcrun container. Dec 15 05:05:00 localhost ceph-mon[298913]: mon.np0005559462@0(leader).osd e142 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Dec 15 05:05:00 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/09d04e8bf67d9115b3e068c092d3aae527741f6b267309ac232cd3f1348032a5/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff) Dec 15 05:05:00 localhost podman[321006]: 2025-12-15 10:05:00.663194429 +0000 UTC m=+0.165961364 container init b60518e39f380de13646d6c409acd39b876092e3b26961688ba7542147e4bffa (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-d6e75a2b-2604-402d-b4f1-86d545a4d7e3, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_managed=true, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS) Dec 15 05:05:00 localhost podman[321006]: 2025-12-15 10:05:00.672859561 +0000 UTC m=+0.175626556 container start b60518e39f380de13646d6c409acd39b876092e3b26961688ba7542147e4bffa (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-d6e75a2b-2604-402d-b4f1-86d545a4d7e3, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team) 
Dec 15 05:05:00 localhost dnsmasq[321024]: started, version 2.85 cachesize 150 Dec 15 05:05:00 localhost dnsmasq[321024]: DNS service limited to local subnets Dec 15 05:05:00 localhost dnsmasq[321024]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile Dec 15 05:05:00 localhost dnsmasq[321024]: warning: no upstream servers configured Dec 15 05:05:00 localhost dnsmasq-dhcp[321024]: DHCPv6, static leases only on 2001:db8::, lease time 1d Dec 15 05:05:00 localhost dnsmasq[321024]: read /var/lib/neutron/dhcp/d6e75a2b-2604-402d-b4f1-86d545a4d7e3/addn_hosts - 0 addresses Dec 15 05:05:00 localhost dnsmasq-dhcp[321024]: read /var/lib/neutron/dhcp/d6e75a2b-2604-402d-b4f1-86d545a4d7e3/host Dec 15 05:05:00 localhost dnsmasq-dhcp[321024]: read /var/lib/neutron/dhcp/d6e75a2b-2604-402d-b4f1-86d545a4d7e3/opts Dec 15 05:05:00 localhost neutron_dhcp_agent[267542]: 2025-12-15 10:05:00.873 267546 INFO neutron.agent.dhcp.agent [None req-9191c86a-0e0b-4c35-91fb-a895ee9e44df - - - - - -] DHCP configuration for ports {'9fdf48f5-9c35-4966-b83c-e8b59cc3502f'} is completed#033[00m Dec 15 05:05:01 localhost podman[321042]: 2025-12-15 10:05:01.066405449 +0000 UTC m=+0.051036088 container kill b60518e39f380de13646d6c409acd39b876092e3b26961688ba7542147e4bffa (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-d6e75a2b-2604-402d-b4f1-86d545a4d7e3, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team) Dec 15 05:05:01 localhost dnsmasq[321024]: exiting on receipt of SIGTERM Dec 15 05:05:01 localhost systemd[1]: 
libpod-b60518e39f380de13646d6c409acd39b876092e3b26961688ba7542147e4bffa.scope: Deactivated successfully. Dec 15 05:05:01 localhost podman[321055]: 2025-12-15 10:05:01.138495143 +0000 UTC m=+0.059224753 container died b60518e39f380de13646d6c409acd39b876092e3b26961688ba7542147e4bffa (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-d6e75a2b-2604-402d-b4f1-86d545a4d7e3, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3) Dec 15 05:05:01 localhost systemd[1]: tmp-crun.gu48k6.mount: Deactivated successfully. Dec 15 05:05:01 localhost podman[321055]: 2025-12-15 10:05:01.457403044 +0000 UTC m=+0.378132624 container cleanup b60518e39f380de13646d6c409acd39b876092e3b26961688ba7542147e4bffa (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-d6e75a2b-2604-402d-b4f1-86d545a4d7e3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb) Dec 15 05:05:01 localhost systemd[1]: libpod-conmon-b60518e39f380de13646d6c409acd39b876092e3b26961688ba7542147e4bffa.scope: Deactivated successfully. 
Dec 15 05:05:01 localhost podman[321062]: 2025-12-15 10:05:01.522874262 +0000 UTC m=+0.429784666 container remove b60518e39f380de13646d6c409acd39b876092e3b26961688ba7542147e4bffa (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-d6e75a2b-2604-402d-b4f1-86d545a4d7e3, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image) Dec 15 05:05:01 localhost kernel: device tap3c362108-9e left promiscuous mode Dec 15 05:05:01 localhost ovn_controller[154603]: 2025-12-15T10:05:01Z|00241|binding|INFO|Releasing lport 3c362108-9e60-4c22-9f0a-818e7a646527 from this chassis (sb_readonly=0) Dec 15 05:05:01 localhost ovn_controller[154603]: 2025-12-15T10:05:01Z|00242|binding|INFO|Setting lport 3c362108-9e60-4c22-9f0a-818e7a646527 down in Southbound Dec 15 05:05:01 localhost nova_compute[286344]: 2025-12-15 10:05:01.540 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:05:01 localhost ovn_metadata_agent[160585]: 2025-12-15 10:05:01.549 160590 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005559462.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '', 'neutron:device_id': 'dhcpce287b4b-a043-5ed9-8e08-4fd555d768a2-d6e75a2b-2604-402d-b4f1-86d545a4d7e3', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 
'neutron:network_name': 'neutron-d6e75a2b-2604-402d-b4f1-86d545a4d7e3', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'e13843b63193470fb73acd260577a584', 'neutron:revision_number': '4', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=e397cd2d-9696-473f-afe3-1425aae73e97, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=3c362108-9e60-4c22-9f0a-818e7a646527) old=Port_Binding(up=[True], chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Dec 15 05:05:01 localhost ovn_metadata_agent[160585]: 2025-12-15 10:05:01.551 160590 INFO neutron.agent.ovn.metadata.agent [-] Port 3c362108-9e60-4c22-9f0a-818e7a646527 in datapath d6e75a2b-2604-402d-b4f1-86d545a4d7e3 unbound from our chassis#033[00m Dec 15 05:05:01 localhost ovn_metadata_agent[160585]: 2025-12-15 10:05:01.552 160590 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network d6e75a2b-2604-402d-b4f1-86d545a4d7e3 or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599#033[00m Dec 15 05:05:01 localhost ovn_metadata_agent[160585]: 2025-12-15 10:05:01.553 160858 DEBUG oslo.privsep.daemon [-] privsep: reply[566aa124-ac1f-4736-ac09-082a5c1dfa85]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Dec 15 05:05:01 localhost nova_compute[286344]: 2025-12-15 10:05:01.559 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:05:01 localhost ceph-mon[298913]: mon.np0005559462@0(leader).osd e142 do_prune osdmap full prune enabled Dec 15 05:05:01 localhost 
ceph-mon[298913]: mon.np0005559462@0(leader).osd e143 e143: 6 total, 6 up, 6 in Dec 15 05:05:01 localhost ceph-mon[298913]: log_channel(cluster) log [DBG] : osdmap e143: 6 total, 6 up, 6 in Dec 15 05:05:01 localhost ceph-mon[298913]: mon.np0005559462@0(leader) e17 handle_command mon_command({"prefix":"df", "format":"json"} v 0) Dec 15 05:05:01 localhost ceph-mon[298913]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/308347239' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch Dec 15 05:05:01 localhost ceph-mon[298913]: mon.np0005559462@0(leader) e17 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0) Dec 15 05:05:01 localhost ceph-mon[298913]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/308347239' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch Dec 15 05:05:01 localhost neutron_dhcp_agent[267542]: 2025-12-15 10:05:01.805 267546 INFO neutron.agent.dhcp.agent [None req-15b3adee-f2b4-48c2-84ea-93a8c78fff81 - - - - - -] Network not present, action: clean_devices, action_kwargs: {}#033[00m Dec 15 05:05:01 localhost systemd[1]: var-lib-containers-storage-overlay-09d04e8bf67d9115b3e068c092d3aae527741f6b267309ac232cd3f1348032a5-merged.mount: Deactivated successfully. Dec 15 05:05:01 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-b60518e39f380de13646d6c409acd39b876092e3b26961688ba7542147e4bffa-userdata-shm.mount: Deactivated successfully. Dec 15 05:05:01 localhost systemd[1]: run-netns-qdhcp\x2dd6e75a2b\x2d2604\x2d402d\x2db4f1\x2d86d545a4d7e3.mount: Deactivated successfully. 
Dec 15 05:05:01 localhost podman[243449]: time="2025-12-15T10:05:01Z" level=info msg="List containers: received `last` parameter - overwriting `limit`" Dec 15 05:05:01 localhost podman[243449]: @ - - [15/Dec/2025:10:05:01 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 156640 "" "Go-http-client/1.1" Dec 15 05:05:01 localhost podman[243449]: @ - - [15/Dec/2025:10:05:01 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 19249 "" "Go-http-client/1.1" Dec 15 05:05:02 localhost neutron_dhcp_agent[267542]: 2025-12-15 10:05:02.690 267546 INFO neutron.agent.dhcp.agent [-] Network not present, action: clean_devices, action_kwargs: {}#033[00m Dec 15 05:05:02 localhost nova_compute[286344]: 2025-12-15 10:05:02.761 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:05:02 localhost neutron_dhcp_agent[267542]: 2025-12-15 10:05:02.868 267546 INFO neutron.agent.dhcp.agent [-] Network not present, action: clean_devices, action_kwargs: {}#033[00m Dec 15 05:05:03 localhost ovn_controller[154603]: 2025-12-15T10:05:03Z|00243|binding|INFO|Releasing lport b35254ad-12eb-47bb-92be-44fefe0694f0 from this chassis (sb_readonly=0) Dec 15 05:05:03 localhost nova_compute[286344]: 2025-12-15 10:05:03.112 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:05:04 localhost neutron_dhcp_agent[267542]: 2025-12-15 10:05:04.059 267546 INFO neutron.agent.dhcp.agent [-] Network not present, action: clean_devices, action_kwargs: {}#033[00m Dec 15 05:05:04 localhost ovn_metadata_agent[160585]: 2025-12-15 10:05:04.294 160590 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to 
row=SB_Global(external_ids={}, nb_cfg=13, ssl=[], options={'arp_ns_explicit_output': 'true', 'fdb_removal_limit': '0', 'ignore_lsp_down': 'false', 'mac_binding_removal_limit': '0', 'mac_prefix': 'fe:17:e3', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': 'fe:55:2b:86:15:b5'}, ipsec=False) old=SB_Global(nb_cfg=12) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Dec 15 05:05:04 localhost ovn_metadata_agent[160585]: 2025-12-15 10:05:04.296 160590 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 6 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m Dec 15 05:05:04 localhost nova_compute[286344]: 2025-12-15 10:05:04.329 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:05:04 localhost openstack_network_exporter[246484]: ERROR 10:05:04 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd Dec 15 05:05:04 localhost openstack_network_exporter[246484]: ERROR 10:05:04 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd Dec 15 05:05:04 localhost openstack_network_exporter[246484]: ERROR 10:05:04 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server Dec 15 05:05:04 localhost openstack_network_exporter[246484]: ERROR 10:05:04 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath Dec 15 05:05:04 localhost openstack_network_exporter[246484]: Dec 15 05:05:04 localhost openstack_network_exporter[246484]: ERROR 10:05:04 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath Dec 15 05:05:04 localhost openstack_network_exporter[246484]: Dec 15 05:05:05 localhost nova_compute[286344]: 2025-12-15 10:05:05.129 286348 DEBUG 
ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:05:05 localhost ceph-mon[298913]: mon.np0005559462@0(leader).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Dec 15 05:05:05 localhost ceph-osd[31375]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS ------- Dec 15 05:05:05 localhost ceph-osd[31375]: rocksdb: [db/db_impl/db_impl.cc:1111] #012** DB Stats **#012Uptime(secs): 8400.1 total, 600.0 interval#012Cumulative writes: 9221 writes, 36K keys, 9221 commit groups, 1.0 writes per commit group, ingest: 0.03 GB, 0.00 MB/s#012Cumulative WAL: 9221 writes, 2545 syncs, 3.62 writes per sync, written: 0.03 GB, 0.00 MB/s#012Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent#012Interval writes: 4172 writes, 13K keys, 4172 commit groups, 1.0 writes per commit group, ingest: 12.17 MB, 0.02 MB/s#012Interval WAL: 4172 writes, 1815 syncs, 2.30 writes per sync, written: 0.01 GB, 0.02 MB/s#012Interval stall: 00:00:0.000 H:M:S, 0.0 percent Dec 15 05:05:06 localhost neutron_sriov_agent[260044]: 2025-12-15 10:05:06.033 2 INFO neutron.agent.securitygroups_rpc [None req-214ea5a4-115b-4a51-9308-ddfcb2b1e2e2 a22542ef31414501844801d3b102584b 54df976b8a364f93b0b8b0128def8f10 - - default default] Security group member updated ['3b08e208-1f02-4d1e-84d0-d090034d8a7c']#033[00m Dec 15 05:05:06 localhost neutron_sriov_agent[260044]: 2025-12-15 10:05:06.198 2 INFO neutron.agent.securitygroups_rpc [None req-1c32b1e4-15ba-4bea-b53a-691395afe83c 5c5e00fffe0e4c47b6ec1c0d83d4c5d2 fbb7b7a53ab3404f84f8b0b8e1aa8c80 - - default default] Security group member updated ['3dc31410-cb9e-4df0-aa7c-9a38b5dd45ea']#033[00m Dec 15 05:05:06 localhost neutron_sriov_agent[260044]: 2025-12-15 10:05:06.882 2 INFO neutron.agent.securitygroups_rpc [None req-0cfb772c-5772-4a41-994f-a843f57c1036 a22542ef31414501844801d3b102584b 54df976b8a364f93b0b8b0128def8f10 - - 
default default] Security group member updated ['3b08e208-1f02-4d1e-84d0-d090034d8a7c']#033[00m Dec 15 05:05:07 localhost nova_compute[286344]: 2025-12-15 10:05:07.764 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:05:08 localhost neutron_sriov_agent[260044]: 2025-12-15 10:05:08.170 2 INFO neutron.agent.securitygroups_rpc [None req-c1c1d2b9-43be-4228-8534-2d09efa0ff6d 5c5e00fffe0e4c47b6ec1c0d83d4c5d2 fbb7b7a53ab3404f84f8b0b8e1aa8c80 - - default default] Security group member updated ['3dc31410-cb9e-4df0-aa7c-9a38b5dd45ea']#033[00m Dec 15 05:05:09 localhost sshd[321083]: main: sshd: ssh-rsa algorithm is disabled Dec 15 05:05:10 localhost nova_compute[286344]: 2025-12-15 10:05:10.133 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:05:10 localhost ceph-osd[32311]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS ------- Dec 15 05:05:10 localhost ceph-osd[32311]: rocksdb: [db/db_impl/db_impl.cc:1111] #012** DB Stats **#012Uptime(secs): 8400.2 total, 600.0 interval#012Cumulative writes: 11K writes, 43K keys, 11K commit groups, 1.0 writes per commit group, ingest: 0.04 GB, 0.00 MB/s#012Cumulative WAL: 11K writes, 3027 syncs, 3.67 writes per sync, written: 0.04 GB, 0.00 MB/s#012Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent#012Interval writes: 5287 writes, 18K keys, 5287 commit groups, 1.0 writes per commit group, ingest: 16.80 MB, 0.03 MB/s#012Interval WAL: 5287 writes, 2222 syncs, 2.38 writes per sync, written: 0.02 GB, 0.03 MB/s#012Interval stall: 00:00:0.000 H:M:S, 0.0 percent Dec 15 05:05:10 localhost nova_compute[286344]: 2025-12-15 10:05:10.271 286348 DEBUG oslo_service.periodic_task [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks 
/usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Dec 15 05:05:10 localhost ovn_metadata_agent[160585]: 2025-12-15 10:05:10.298 160590 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=12d96d64-e862-4f68-81e5-8d9ec5d3a5e2, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '13'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m Dec 15 05:05:10 localhost ceph-mon[298913]: mon.np0005559462@0(leader).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Dec 15 05:05:10 localhost ceph-mon[298913]: mon.np0005559462@0(leader).osd e143 do_prune osdmap full prune enabled Dec 15 05:05:10 localhost systemd[1]: Started /usr/bin/podman healthcheck run 67ee72fcd99c896f79e8807764b1d0f04e7d45927d06e306c0986394e8ed42a0. Dec 15 05:05:10 localhost systemd[1]: Started /usr/bin/podman healthcheck run 9f0f481c937606298203797a71924b7614a5d9e3f4a8afd837cfa5b9602aedeb. Dec 15 05:05:10 localhost ceph-mon[298913]: mon.np0005559462@0(leader).osd e144 e144: 6 total, 6 up, 6 in Dec 15 05:05:10 localhost systemd[1]: Started /usr/bin/podman healthcheck run b3d8483ade539500e228c2af9c0c3e647febb5ec7903610a83aea32631007d4a. Dec 15 05:05:10 localhost ceph-mon[298913]: log_channel(cluster) log [DBG] : osdmap e144: 6 total, 6 up, 6 in Dec 15 05:05:10 localhost systemd[1]: tmp-crun.vY9uqV.mount: Deactivated successfully. 
Dec 15 05:05:10 localhost podman[321087]: 2025-12-15 10:05:10.771640992 +0000 UTC m=+0.089283275 container health_status b3d8483ade539500e228c2af9c0c3e647febb5ec7903610a83aea32631007d4a (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '07f3af15d79276418f54a8dde2fc251064527c3571430e14af77aaa2c01f1193-cb1e9478634a00c8fb5ad9cd7fcd38c7b2a1a85187dfaa618bc715c24c2b15fb'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, container_name=ceilometer_agent_compute, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_id=ceilometer_agent_compute, managed_by=edpm_ansible, maintainer=OpenStack 
Kubernetes Operator team) Dec 15 05:05:10 localhost podman[321085]: 2025-12-15 10:05:10.739186129 +0000 UTC m=+0.066338850 container health_status 67ee72fcd99c896f79e8807764b1d0f04e7d45927d06e306c0986394e8ed42a0 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible) Dec 15 05:05:10 localhost podman[321087]: 2025-12-15 10:05:10.806251688 +0000 UTC m=+0.123893911 container exec_died b3d8483ade539500e228c2af9c0c3e647febb5ec7903610a83aea32631007d4a (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, managed_by=edpm_ansible, 
org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, container_name=ceilometer_agent_compute, org.label-schema.schema-version=1.0, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '07f3af15d79276418f54a8dde2fc251064527c3571430e14af77aaa2c01f1193-cb1e9478634a00c8fb5ad9cd7fcd38c7b2a1a85187dfaa618bc715c24c2b15fb'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=ceilometer_agent_compute, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true) Dec 15 05:05:10 localhost systemd[1]: b3d8483ade539500e228c2af9c0c3e647febb5ec7903610a83aea32631007d4a.service: Deactivated successfully. 
Dec 15 05:05:10 localhost podman[321085]: 2025-12-15 10:05:10.821575392 +0000 UTC m=+0.148728103 container exec_died 67ee72fcd99c896f79e8807764b1d0f04e7d45927d06e306c0986394e8ed42a0 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter) Dec 15 05:05:10 localhost podman[321086]: 2025-12-15 10:05:10.856850694 +0000 UTC m=+0.178751734 container health_status 9f0f481c937606298203797a71924b7614a5d9e3f4a8afd837cfa5b9602aedeb (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, tcib_managed=true, config_id=multipathd, container_name=multipathd, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 
Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, managed_by=edpm_ansible) Dec 15 05:05:10 localhost podman[321086]: 2025-12-15 10:05:10.867546892 +0000 UTC m=+0.189447912 container exec_died 9f0f481c937606298203797a71924b7614a5d9e3f4a8afd837cfa5b9602aedeb (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, config_id=multipathd, container_name=multipathd, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 
'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true) Dec 15 05:05:10 localhost systemd[1]: 9f0f481c937606298203797a71924b7614a5d9e3f4a8afd837cfa5b9602aedeb.service: Deactivated successfully. Dec 15 05:05:10 localhost systemd[1]: 67ee72fcd99c896f79e8807764b1d0f04e7d45927d06e306c0986394e8ed42a0.service: Deactivated successfully. 
Dec 15 05:05:11 localhost neutron_sriov_agent[260044]: 2025-12-15 10:05:11.130 2 INFO neutron.agent.securitygroups_rpc [None req-cd2e50e4-2022-4b64-bc97-cf9701287a1b b927ea383a0640b0839846e8a3038e87 2863215517a546fa9ec3fbc1298732b9 - - default default] Security group rule updated ['774aa455-e717-4fca-8dd8-670318036f8e']#033[00m Dec 15 05:05:11 localhost nova_compute[286344]: 2025-12-15 10:05:11.266 286348 DEBUG oslo_service.periodic_task [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Dec 15 05:05:11 localhost nova_compute[286344]: 2025-12-15 10:05:11.269 286348 DEBUG oslo_service.periodic_task [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Dec 15 05:05:11 localhost neutron_sriov_agent[260044]: 2025-12-15 10:05:11.471 2 INFO neutron.agent.securitygroups_rpc [None req-431699f0-30a9-4213-8818-1d504821d8ba b927ea383a0640b0839846e8a3038e87 2863215517a546fa9ec3fbc1298732b9 - - default default] Security group rule updated ['774aa455-e717-4fca-8dd8-670318036f8e']#033[00m Dec 15 05:05:12 localhost systemd[1]: Started /usr/bin/podman healthcheck run 730c50020a7d9e2da21d2462d40af20ac303e43f3491f692cbbd87f432ba3e09. Dec 15 05:05:12 localhost systemd[1]: Started /usr/bin/podman healthcheck run ac2eae30a2a7f971398af42ea67b3ed799d4b3341f7688ebcd73f7ca7f66c2a8. 
Dec 15 05:05:12 localhost podman[321143]: 2025-12-15 10:05:12.729540427 +0000 UTC m=+0.061745076 container health_status 730c50020a7d9e2da21d2462d40af20ac303e43f3491f692cbbd87f432ba3e09 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, build-date=2025-08-20T13:12:41, vendor=Red Hat, Inc., config_id=openstack_network_exporter, release=1755695350, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., architecture=x86_64, io.openshift.tags=minimal rhel9, managed_by=edpm_ansible, com.redhat.component=ubi9-minimal-container, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, maintainer=Red Hat, Inc., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., url=https://catalog.redhat.com/en/search?searchType=containers, version=9.6, io.buildah.version=1.33.7, io.openshift.expose-services=, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'cb1e9478634a00c8fb5ad9cd7fcd38c7b2a1a85187dfaa618bc715c24c2b15fb'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=openstack_network_exporter, name=ubi9-minimal, vcs-type=git, distribution-scope=public, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b) Dec 15 05:05:12 localhost podman[321143]: 2025-12-15 10:05:12.746355047 +0000 UTC m=+0.078559686 container exec_died 730c50020a7d9e2da21d2462d40af20ac303e43f3491f692cbbd87f432ba3e09 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'cb1e9478634a00c8fb5ad9cd7fcd38c7b2a1a85187dfaa618bc715c24c2b15fb'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, url=https://catalog.redhat.com/en/search?searchType=containers, config_id=openstack_network_exporter, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, architecture=x86_64, io.openshift.tags=minimal rhel9, name=ubi9-minimal, managed_by=edpm_ansible, maintainer=Red Hat, Inc., container_name=openstack_network_exporter, distribution-scope=public, version=9.6, io.buildah.version=1.33.7, vendor=Red Hat, Inc., build-date=2025-08-20T13:12:41, com.redhat.component=ubi9-minimal-container, release=1755695350, io.openshift.expose-services=, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly.) Dec 15 05:05:12 localhost systemd[1]: 730c50020a7d9e2da21d2462d40af20ac303e43f3491f692cbbd87f432ba3e09.service: Deactivated successfully. 
Dec 15 05:05:12 localhost nova_compute[286344]: 2025-12-15 10:05:12.790 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:05:12 localhost podman[321144]: 2025-12-15 10:05:12.813027556 +0000 UTC m=+0.139790359 container health_status ac2eae30a2a7f971398af42ea67b3ed799d4b3341f7688ebcd73f7ca7f66c2a8 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '07f3af15d79276418f54a8dde2fc251064527c3571430e14af77aaa2c01f1193'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, managed_by=edpm_ansible, tcib_managed=true, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ovn_controller, container_name=ovn_controller) Dec 15 05:05:12 localhost podman[321144]: 2025-12-15 10:05:12.87434657 +0000 UTC m=+0.201109413 container exec_died ac2eae30a2a7f971398af42ea67b3ed799d4b3341f7688ebcd73f7ca7f66c2a8 
(image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_controller, config_id=ovn_controller, tcib_managed=true, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '07f3af15d79276418f54a8dde2fc251064527c3571430e14af77aaa2c01f1193'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3) Dec 15 05:05:12 localhost systemd[1]: ac2eae30a2a7f971398af42ea67b3ed799d4b3341f7688ebcd73f7ca7f66c2a8.service: Deactivated successfully. 
Dec 15 05:05:13 localhost nova_compute[286344]: 2025-12-15 10:05:13.270 286348 DEBUG oslo_service.periodic_task [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Dec 15 05:05:13 localhost nova_compute[286344]: 2025-12-15 10:05:13.270 286348 DEBUG oslo_service.periodic_task [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Dec 15 05:05:13 localhost neutron_sriov_agent[260044]: 2025-12-15 10:05:13.496 2 INFO neutron.agent.securitygroups_rpc [None req-16b18c6a-8e2d-47a3-bcde-5c2f7989d11f b927ea383a0640b0839846e8a3038e87 2863215517a546fa9ec3fbc1298732b9 - - default default] Security group rule updated ['054d91fb-563b-4ea9-adf7-f95f6287a119']#033[00m Dec 15 05:05:13 localhost neutron_dhcp_agent[267542]: 2025-12-15 10:05:13.549 267546 INFO neutron.agent.linux.ip_lib [None req-750f6e38-9068-4e71-9c70-961dd1a73ca3 - - - - - -] Device tap8e9b47ba-93 cannot be used as it has no MAC address#033[00m Dec 15 05:05:13 localhost nova_compute[286344]: 2025-12-15 10:05:13.571 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:05:13 localhost kernel: device tap8e9b47ba-93 entered promiscuous mode Dec 15 05:05:13 localhost ovn_controller[154603]: 2025-12-15T10:05:13Z|00244|binding|INFO|Claiming lport 8e9b47ba-93aa-4955-91d2-958702cde86f for this chassis. 
Dec 15 05:05:13 localhost nova_compute[286344]: 2025-12-15 10:05:13.579 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:05:13 localhost ovn_controller[154603]: 2025-12-15T10:05:13Z|00245|binding|INFO|8e9b47ba-93aa-4955-91d2-958702cde86f: Claiming unknown Dec 15 05:05:13 localhost NetworkManager[5963]: [1765793113.5802] manager: (tap8e9b47ba-93): new Generic device (/org/freedesktop/NetworkManager/Devices/39) Dec 15 05:05:13 localhost systemd-udevd[321199]: Network interface NamePolicy= disabled on kernel command line. Dec 15 05:05:13 localhost ovn_controller[154603]: 2025-12-15T10:05:13Z|00246|binding|INFO|Releasing lport b35254ad-12eb-47bb-92be-44fefe0694f0 from this chassis (sb_readonly=0) Dec 15 05:05:13 localhost ovn_metadata_agent[160585]: 2025-12-15 10:05:13.601 160590 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005559462.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8:0:ffff::2/64', 'neutron:device_id': 'dhcpce287b4b-a043-5ed9-8e08-4fd555d768a2-307e1014-96f2-485d-9b39-dd1a53bbce39', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-307e1014-96f2-485d-9b39-dd1a53bbce39', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '58db13bf0d834ccda318448020067936', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], 
datapath=bb700d79-8a1a-4b17-ad19-514767bed4fc, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=8e9b47ba-93aa-4955-91d2-958702cde86f) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Dec 15 05:05:13 localhost ovn_metadata_agent[160585]: 2025-12-15 10:05:13.603 160590 INFO neutron.agent.ovn.metadata.agent [-] Port 8e9b47ba-93aa-4955-91d2-958702cde86f in datapath 307e1014-96f2-485d-9b39-dd1a53bbce39 bound to our chassis#033[00m Dec 15 05:05:13 localhost ovn_metadata_agent[160585]: 2025-12-15 10:05:13.605 160590 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network 307e1014-96f2-485d-9b39-dd1a53bbce39 or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599#033[00m Dec 15 05:05:13 localhost ovn_metadata_agent[160585]: 2025-12-15 10:05:13.607 160858 DEBUG oslo.privsep.daemon [-] privsep: reply[f586b8e4-47be-4a87-b477-fe7682b177a2]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Dec 15 05:05:13 localhost journal[231322]: ethtool ioctl error on tap8e9b47ba-93: No such device Dec 15 05:05:13 localhost ovn_controller[154603]: 2025-12-15T10:05:13Z|00247|binding|INFO|Setting lport 8e9b47ba-93aa-4955-91d2-958702cde86f ovn-installed in OVS Dec 15 05:05:13 localhost ovn_controller[154603]: 2025-12-15T10:05:13Z|00248|binding|INFO|Setting lport 8e9b47ba-93aa-4955-91d2-958702cde86f up in Southbound Dec 15 05:05:13 localhost nova_compute[286344]: 2025-12-15 10:05:13.616 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:05:13 localhost journal[231322]: ethtool ioctl error on tap8e9b47ba-93: No such device Dec 15 05:05:13 localhost journal[231322]: ethtool ioctl error on tap8e9b47ba-93: No such device 
Dec 15 05:05:13 localhost journal[231322]: ethtool ioctl error on tap8e9b47ba-93: No such device Dec 15 05:05:13 localhost journal[231322]: ethtool ioctl error on tap8e9b47ba-93: No such device Dec 15 05:05:13 localhost journal[231322]: ethtool ioctl error on tap8e9b47ba-93: No such device Dec 15 05:05:13 localhost journal[231322]: ethtool ioctl error on tap8e9b47ba-93: No such device Dec 15 05:05:13 localhost journal[231322]: ethtool ioctl error on tap8e9b47ba-93: No such device Dec 15 05:05:13 localhost nova_compute[286344]: 2025-12-15 10:05:13.665 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:05:13 localhost nova_compute[286344]: 2025-12-15 10:05:13.699 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:05:13 localhost neutron_sriov_agent[260044]: 2025-12-15 10:05:13.850 2 INFO neutron.agent.securitygroups_rpc [None req-f66068f4-e4bd-4f6e-8097-94b247583c94 b927ea383a0640b0839846e8a3038e87 2863215517a546fa9ec3fbc1298732b9 - - default default] Security group rule updated ['054d91fb-563b-4ea9-adf7-f95f6287a119']#033[00m Dec 15 05:05:14 localhost nova_compute[286344]: 2025-12-15 10:05:14.270 286348 DEBUG oslo_service.periodic_task [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Dec 15 05:05:14 localhost nova_compute[286344]: 2025-12-15 10:05:14.272 286348 DEBUG nova.compute.manager [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m Dec 15 05:05:14 localhost nova_compute[286344]: 2025-12-15 10:05:14.273 286348 DEBUG nova.compute.manager [None 
req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m Dec 15 05:05:14 localhost ceph-mon[298913]: mon.np0005559462@0(leader) e17 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) Dec 15 05:05:14 localhost ceph-mon[298913]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/3281193376' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch Dec 15 05:05:14 localhost nova_compute[286344]: 2025-12-15 10:05:14.399 286348 DEBUG oslo_concurrency.lockutils [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Acquiring lock "refresh_cache-39ff1bd9-6f6b-44c8-bbec-a1fd9d196359" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m Dec 15 05:05:14 localhost nova_compute[286344]: 2025-12-15 10:05:14.400 286348 DEBUG oslo_concurrency.lockutils [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Acquired lock "refresh_cache-39ff1bd9-6f6b-44c8-bbec-a1fd9d196359" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m Dec 15 05:05:14 localhost nova_compute[286344]: 2025-12-15 10:05:14.401 286348 DEBUG nova.network.neutron [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] [instance: 39ff1bd9-6f6b-44c8-bbec-a1fd9d196359] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m Dec 15 05:05:14 localhost nova_compute[286344]: 2025-12-15 10:05:14.402 286348 DEBUG nova.objects.instance [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Lazy-loading 'info_cache' on Instance uuid 39ff1bd9-6f6b-44c8-bbec-a1fd9d196359 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m Dec 15 05:05:14 localhost neutron_sriov_agent[260044]: 2025-12-15 10:05:14.460 2 INFO neutron.agent.securitygroups_rpc [None 
req-7566ee9f-e052-4bc8-b176-343de3e2293c b927ea383a0640b0839846e8a3038e87 2863215517a546fa9ec3fbc1298732b9 - - default default] Security group rule updated ['054d91fb-563b-4ea9-adf7-f95f6287a119']#033[00m Dec 15 05:05:14 localhost podman[321270]: Dec 15 05:05:14 localhost podman[321270]: 2025-12-15 10:05:14.513770015 +0000 UTC m=+0.080903135 container create de37692395c3281252a36a476ac3795cfecdc2602a63c414d8accd8a1744ef0a (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-307e1014-96f2-485d-9b39-dd1a53bbce39, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb) Dec 15 05:05:14 localhost systemd[1]: Started libpod-conmon-de37692395c3281252a36a476ac3795cfecdc2602a63c414d8accd8a1744ef0a.scope. Dec 15 05:05:14 localhost systemd[1]: Started libcrun container. 
Dec 15 05:05:14 localhost podman[321270]: 2025-12-15 10:05:14.483518188 +0000 UTC m=+0.050651288 image pull quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified Dec 15 05:05:14 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/291069eecaeaf2840287e2aff8611ace2f4cc02008b5e608012fb764bff07026/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff) Dec 15 05:05:14 localhost podman[321270]: 2025-12-15 10:05:14.60230004 +0000 UTC m=+0.169433140 container init de37692395c3281252a36a476ac3795cfecdc2602a63c414d8accd8a1744ef0a (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-307e1014-96f2-485d-9b39-dd1a53bbce39, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.build-date=20251202) Dec 15 05:05:14 localhost podman[321270]: 2025-12-15 10:05:14.616307931 +0000 UTC m=+0.183441061 container start de37692395c3281252a36a476ac3795cfecdc2602a63c414d8accd8a1744ef0a (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-307e1014-96f2-485d-9b39-dd1a53bbce39, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2) Dec 15 05:05:14 localhost dnsmasq[321289]: started, version 2.85 cachesize 150 Dec 15 05:05:14 localhost dnsmasq[321289]: DNS service limited to local subnets Dec 15 05:05:14 localhost dnsmasq[321289]: compile time options: IPv6 GNU-getopt DBus no-UBus 
no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile Dec 15 05:05:14 localhost dnsmasq[321289]: warning: no upstream servers configured Dec 15 05:05:14 localhost dnsmasq-dhcp[321289]: DHCPv6, static leases only on 2001:db8:0:ffff::, lease time 1d Dec 15 05:05:14 localhost dnsmasq[321289]: read /var/lib/neutron/dhcp/307e1014-96f2-485d-9b39-dd1a53bbce39/addn_hosts - 0 addresses Dec 15 05:05:14 localhost dnsmasq-dhcp[321289]: read /var/lib/neutron/dhcp/307e1014-96f2-485d-9b39-dd1a53bbce39/host Dec 15 05:05:14 localhost dnsmasq-dhcp[321289]: read /var/lib/neutron/dhcp/307e1014-96f2-485d-9b39-dd1a53bbce39/opts Dec 15 05:05:14 localhost neutron_dhcp_agent[267542]: 2025-12-15 10:05:14.853 267546 INFO neutron.agent.dhcp.agent [None req-2248fcb0-c7c5-4b20-93d5-5f541d97c756 - - - - - -] DHCP configuration for ports {'f62bf546-3dfc-4412-bd60-ca61e95a21a6'} is completed#033[00m Dec 15 05:05:14 localhost ceph-mon[298913]: mon.np0005559462@0(leader).osd e144 do_prune osdmap full prune enabled Dec 15 05:05:14 localhost ceph-mon[298913]: mon.np0005559462@0(leader).osd e145 e145: 6 total, 6 up, 6 in Dec 15 05:05:14 localhost ceph-mon[298913]: log_channel(cluster) log [DBG] : osdmap e145: 6 total, 6 up, 6 in Dec 15 05:05:15 localhost neutron_sriov_agent[260044]: 2025-12-15 10:05:15.064 2 INFO neutron.agent.securitygroups_rpc [None req-b1c33550-69f3-4af8-b4f9-7ad276f4a7f8 b927ea383a0640b0839846e8a3038e87 2863215517a546fa9ec3fbc1298732b9 - - default default] Security group rule updated ['054d91fb-563b-4ea9-adf7-f95f6287a119']#033[00m Dec 15 05:05:15 localhost nova_compute[286344]: 2025-12-15 10:05:15.175 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:05:15 localhost systemd[1]: tmp-crun.uiIGrf.mount: Deactivated successfully. 
Dec 15 05:05:15 localhost ceph-mon[298913]: mon.np0005559462@0(leader).osd e145 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Dec 15 05:05:15 localhost nova_compute[286344]: 2025-12-15 10:05:15.716 286348 DEBUG nova.network.neutron [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] [instance: 39ff1bd9-6f6b-44c8-bbec-a1fd9d196359] Updating instance_info_cache with network_info: [{"id": "03ef8889-3216-43fb-8a52-4be17a956ce1", "address": "fa:16:3e:74:df:7c", "network": {"id": "befb7a72-17a9-4bcb-b561-84b8f626685a", "bridge": "br-int", "label": "private", "subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.201", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.20", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.0.3"}}], "meta": {"injected": false, "tenant_id": "c785bf23f53946bc99867d8832a50266", "mtu": 1292, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap03ef8889-32", "ovs_interfaceid": "03ef8889-3216-43fb-8a52-4be17a956ce1", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m Dec 15 05:05:15 localhost nova_compute[286344]: 2025-12-15 10:05:15.735 286348 DEBUG oslo_concurrency.lockutils [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Releasing lock "refresh_cache-39ff1bd9-6f6b-44c8-bbec-a1fd9d196359" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m Dec 15 05:05:15 localhost 
nova_compute[286344]: 2025-12-15 10:05:15.735 286348 DEBUG nova.compute.manager [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] [instance: 39ff1bd9-6f6b-44c8-bbec-a1fd9d196359] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m Dec 15 05:05:15 localhost neutron_sriov_agent[260044]: 2025-12-15 10:05:15.832 2 INFO neutron.agent.securitygroups_rpc [None req-1b948e01-a058-40cf-85f9-6473e10662dd b927ea383a0640b0839846e8a3038e87 2863215517a546fa9ec3fbc1298732b9 - - default default] Security group rule updated ['054d91fb-563b-4ea9-adf7-f95f6287a119']#033[00m Dec 15 05:05:15 localhost ceph-mon[298913]: mon.np0005559462@0(leader).osd e145 do_prune osdmap full prune enabled Dec 15 05:05:15 localhost ceph-mon[298913]: mon.np0005559462@0(leader).osd e146 e146: 6 total, 6 up, 6 in Dec 15 05:05:16 localhost ceph-mon[298913]: log_channel(cluster) log [DBG] : osdmap e146: 6 total, 6 up, 6 in Dec 15 05:05:16 localhost neutron_sriov_agent[260044]: 2025-12-15 10:05:16.132 2 INFO neutron.agent.securitygroups_rpc [None req-ae2c927c-fa90-4e49-a4cc-0b72ee187b0e b927ea383a0640b0839846e8a3038e87 2863215517a546fa9ec3fbc1298732b9 - - default default] Security group rule updated ['054d91fb-563b-4ea9-adf7-f95f6287a119']#033[00m Dec 15 05:05:16 localhost neutron_sriov_agent[260044]: 2025-12-15 10:05:16.476 2 INFO neutron.agent.securitygroups_rpc [None req-6a4310c9-def7-4aff-aa4f-2464ff286613 b927ea383a0640b0839846e8a3038e87 2863215517a546fa9ec3fbc1298732b9 - - default default] Security group rule updated ['054d91fb-563b-4ea9-adf7-f95f6287a119']#033[00m Dec 15 05:05:16 localhost neutron_sriov_agent[260044]: 2025-12-15 10:05:16.601 2 INFO neutron.agent.securitygroups_rpc [None req-d0ba7ffc-a73a-448e-8d8f-8c2cd4ad99c2 a22542ef31414501844801d3b102584b 54df976b8a364f93b0b8b0128def8f10 - - default default] Security group member updated ['3b08e208-1f02-4d1e-84d0-d090034d8a7c']#033[00m Dec 15 05:05:16 
localhost nova_compute[286344]: 2025-12-15 10:05:16.731 286348 DEBUG oslo_service.periodic_task [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Dec 15 05:05:17 localhost neutron_sriov_agent[260044]: 2025-12-15 10:05:17.015 2 INFO neutron.agent.securitygroups_rpc [None req-3c314ef4-25ca-49c7-8850-96026762e38e b927ea383a0640b0839846e8a3038e87 2863215517a546fa9ec3fbc1298732b9 - - default default] Security group rule updated ['054d91fb-563b-4ea9-adf7-f95f6287a119']#033[00m Dec 15 05:05:17 localhost ceph-mon[298913]: mon.np0005559462@0(leader).osd e146 do_prune osdmap full prune enabled Dec 15 05:05:17 localhost ceph-mon[298913]: mon.np0005559462@0(leader).osd e147 e147: 6 total, 6 up, 6 in Dec 15 05:05:17 localhost ceph-mon[298913]: log_channel(cluster) log [DBG] : osdmap e147: 6 total, 6 up, 6 in Dec 15 05:05:17 localhost neutron_sriov_agent[260044]: 2025-12-15 10:05:17.622 2 INFO neutron.agent.securitygroups_rpc [None req-2eeb83ea-a3c4-43d3-b23a-98f17937e24f d03c2d1ea31b421aabc2ccaccb47df84 08fb1034ee1a4def8bd8d9aecf9dfe86 - - default default] Security group member updated ['5de66dcc-b971-420f-89b1-9a69fc21c19e']#033[00m Dec 15 05:05:17 localhost neutron_sriov_agent[260044]: 2025-12-15 10:05:17.680 2 INFO neutron.agent.securitygroups_rpc [None req-345d51e0-2b37-4ba7-9af0-218115276ae6 b927ea383a0640b0839846e8a3038e87 2863215517a546fa9ec3fbc1298732b9 - - default default] Security group rule updated ['054d91fb-563b-4ea9-adf7-f95f6287a119']#033[00m Dec 15 05:05:17 localhost nova_compute[286344]: 2025-12-15 10:05:17.829 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:05:18 localhost ceph-mon[298913]: mon.np0005559462@0(leader).osd e147 do_prune osdmap full prune enabled Dec 15 05:05:18 localhost 
ceph-mon[298913]: mon.np0005559462@0(leader).osd e148 e148: 6 total, 6 up, 6 in Dec 15 05:05:18 localhost ceph-mon[298913]: log_channel(cluster) log [DBG] : osdmap e148: 6 total, 6 up, 6 in Dec 15 05:05:18 localhost neutron_sriov_agent[260044]: 2025-12-15 10:05:18.124 2 INFO neutron.agent.securitygroups_rpc [None req-216e2b02-6d39-4bd0-8f41-00a7845ba76f b927ea383a0640b0839846e8a3038e87 2863215517a546fa9ec3fbc1298732b9 - - default default] Security group rule updated ['054d91fb-563b-4ea9-adf7-f95f6287a119']#033[00m Dec 15 05:05:18 localhost neutron_dhcp_agent[267542]: 2025-12-15 10:05:18.305 267546 INFO neutron.agent.linux.ip_lib [None req-25ca1238-59a9-4586-b4f2-018f7e40fd22 - - - - - -] Device tapb37f7934-d7 cannot be used as it has no MAC address#033[00m Dec 15 05:05:18 localhost nova_compute[286344]: 2025-12-15 10:05:18.329 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:05:18 localhost kernel: device tapb37f7934-d7 entered promiscuous mode Dec 15 05:05:18 localhost NetworkManager[5963]: [1765793118.3363] manager: (tapb37f7934-d7): new Generic device (/org/freedesktop/NetworkManager/Devices/40) Dec 15 05:05:18 localhost nova_compute[286344]: 2025-12-15 10:05:18.336 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:05:18 localhost ovn_controller[154603]: 2025-12-15T10:05:18Z|00249|binding|INFO|Claiming lport b37f7934-d790-4e14-92c9-321fdf7e6bfd for this chassis. Dec 15 05:05:18 localhost ovn_controller[154603]: 2025-12-15T10:05:18Z|00250|binding|INFO|b37f7934-d790-4e14-92c9-321fdf7e6bfd: Claiming unknown Dec 15 05:05:18 localhost systemd-udevd[321300]: Network interface NamePolicy= disabled on kernel command line. 
Dec 15 05:05:18 localhost journal[231322]: ethtool ioctl error on tapb37f7934-d7: No such device Dec 15 05:05:18 localhost journal[231322]: ethtool ioctl error on tapb37f7934-d7: No such device Dec 15 05:05:18 localhost ovn_controller[154603]: 2025-12-15T10:05:18Z|00251|binding|INFO|Setting lport b37f7934-d790-4e14-92c9-321fdf7e6bfd ovn-installed in OVS Dec 15 05:05:18 localhost ovn_controller[154603]: 2025-12-15T10:05:18Z|00252|binding|INFO|Setting lport b37f7934-d790-4e14-92c9-321fdf7e6bfd up in Southbound Dec 15 05:05:18 localhost ovn_metadata_agent[160585]: 2025-12-15 10:05:18.376 160590 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005559462.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8::3/64', 'neutron:device_id': 'dhcpce287b4b-a043-5ed9-8e08-4fd555d768a2-910645bf-c485-48fb-b20c-a5b1b9aff07a', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-910645bf-c485-48fb-b20c-a5b1b9aff07a', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '58db13bf0d834ccda318448020067936', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=31b248e1-bd95-4c35-a311-ee84f197fcc4, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=b37f7934-d790-4e14-92c9-321fdf7e6bfd) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Dec 15 05:05:18 localhost nova_compute[286344]: 
2025-12-15 10:05:18.377 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:05:18 localhost journal[231322]: ethtool ioctl error on tapb37f7934-d7: No such device Dec 15 05:05:18 localhost ovn_metadata_agent[160585]: 2025-12-15 10:05:18.380 160590 INFO neutron.agent.ovn.metadata.agent [-] Port b37f7934-d790-4e14-92c9-321fdf7e6bfd in datapath 910645bf-c485-48fb-b20c-a5b1b9aff07a bound to our chassis#033[00m Dec 15 05:05:18 localhost ovn_metadata_agent[160585]: 2025-12-15 10:05:18.381 160590 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network 910645bf-c485-48fb-b20c-a5b1b9aff07a or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599#033[00m Dec 15 05:05:18 localhost ovn_metadata_agent[160585]: 2025-12-15 10:05:18.382 160858 DEBUG oslo.privsep.daemon [-] privsep: reply[8564247a-6a26-477a-aead-7d441ed93c47]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Dec 15 05:05:18 localhost journal[231322]: ethtool ioctl error on tapb37f7934-d7: No such device Dec 15 05:05:18 localhost journal[231322]: ethtool ioctl error on tapb37f7934-d7: No such device Dec 15 05:05:18 localhost journal[231322]: ethtool ioctl error on tapb37f7934-d7: No such device Dec 15 05:05:18 localhost journal[231322]: ethtool ioctl error on tapb37f7934-d7: No such device Dec 15 05:05:18 localhost journal[231322]: ethtool ioctl error on tapb37f7934-d7: No such device Dec 15 05:05:18 localhost nova_compute[286344]: 2025-12-15 10:05:18.419 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:05:18 localhost nova_compute[286344]: 2025-12-15 10:05:18.448 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup 
/usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:05:18 localhost neutron_sriov_agent[260044]: 2025-12-15 10:05:18.691 2 INFO neutron.agent.securitygroups_rpc [None req-ee7b32eb-9e7d-41f8-8870-7c75ff798665 d03c2d1ea31b421aabc2ccaccb47df84 08fb1034ee1a4def8bd8d9aecf9dfe86 - - default default] Security group member updated ['5de66dcc-b971-420f-89b1-9a69fc21c19e']#033[00m Dec 15 05:05:19 localhost podman[321369]: Dec 15 05:05:19 localhost podman[321369]: 2025-12-15 10:05:19.254961828 +0000 UTC m=+0.088747982 container create baa8fedf27155d0c04f600491b11cdab3da32226a8d3d2716c611e07a9c7308a (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-910645bf-c485-48fb-b20c-a5b1b9aff07a, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true) Dec 15 05:05:19 localhost nova_compute[286344]: 2025-12-15 10:05:19.271 286348 DEBUG oslo_service.periodic_task [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Dec 15 05:05:19 localhost nova_compute[286344]: 2025-12-15 10:05:19.271 286348 DEBUG nova.compute.manager [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... 
_reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m Dec 15 05:05:19 localhost nova_compute[286344]: 2025-12-15 10:05:19.272 286348 DEBUG oslo_service.periodic_task [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Dec 15 05:05:19 localhost systemd[1]: Started libpod-conmon-baa8fedf27155d0c04f600491b11cdab3da32226a8d3d2716c611e07a9c7308a.scope. Dec 15 05:05:19 localhost systemd[1]: Started libcrun container. Dec 15 05:05:19 localhost podman[321369]: 2025-12-15 10:05:19.214958867 +0000 UTC m=+0.048745061 image pull quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified Dec 15 05:05:19 localhost nova_compute[286344]: 2025-12-15 10:05:19.317 286348 DEBUG oslo_concurrency.lockutils [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Dec 15 05:05:19 localhost nova_compute[286344]: 2025-12-15 10:05:19.318 286348 DEBUG oslo_concurrency.lockutils [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Dec 15 05:05:19 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/d26a353df1666edc345a50323abf0fb31972cb2d78e711e20d841c6d1d3a1225/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff) Dec 15 05:05:19 localhost nova_compute[286344]: 2025-12-15 10:05:19.319 286348 DEBUG oslo_concurrency.lockutils [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Lock "compute_resources" "released" by 
"nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Dec 15 05:05:19 localhost nova_compute[286344]: 2025-12-15 10:05:19.319 286348 DEBUG nova.compute.resource_tracker [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Auditing locally available compute resources for np0005559462.localdomain (node: np0005559462.localdomain) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m Dec 15 05:05:19 localhost nova_compute[286344]: 2025-12-15 10:05:19.319 286348 DEBUG oslo_concurrency.processutils [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m Dec 15 05:05:19 localhost podman[321369]: 2025-12-15 10:05:19.328683902 +0000 UTC m=+0.162470076 container init baa8fedf27155d0c04f600491b11cdab3da32226a8d3d2716c611e07a9c7308a (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-910645bf-c485-48fb-b20c-a5b1b9aff07a, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS) Dec 15 05:05:19 localhost podman[321369]: 2025-12-15 10:05:19.336833216 +0000 UTC m=+0.170619380 container start baa8fedf27155d0c04f600491b11cdab3da32226a8d3d2716c611e07a9c7308a (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-910645bf-c485-48fb-b20c-a5b1b9aff07a, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, 
org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, io.buildah.version=1.41.3) Dec 15 05:05:19 localhost dnsmasq[321388]: started, version 2.85 cachesize 150 Dec 15 05:05:19 localhost dnsmasq[321388]: DNS service limited to local subnets Dec 15 05:05:19 localhost dnsmasq[321388]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile Dec 15 05:05:19 localhost dnsmasq[321388]: warning: no upstream servers configured Dec 15 05:05:19 localhost dnsmasq-dhcp[321388]: DHCPv6, static leases only on 2001:db8::, lease time 1d Dec 15 05:05:19 localhost dnsmasq[321388]: read /var/lib/neutron/dhcp/910645bf-c485-48fb-b20c-a5b1b9aff07a/addn_hosts - 0 addresses Dec 15 05:05:19 localhost dnsmasq-dhcp[321388]: read /var/lib/neutron/dhcp/910645bf-c485-48fb-b20c-a5b1b9aff07a/host Dec 15 05:05:19 localhost dnsmasq-dhcp[321388]: read /var/lib/neutron/dhcp/910645bf-c485-48fb-b20c-a5b1b9aff07a/opts Dec 15 05:05:19 localhost neutron_dhcp_agent[267542]: 2025-12-15 10:05:19.592 267546 INFO neutron.agent.dhcp.agent [None req-a214e430-99b5-486f-8373-4c61cd11df46 - - - - - -] DHCP configuration for ports {'b63cbb04-9882-4b1f-85a1-e1236a1c4bc1'} is completed#033[00m Dec 15 05:05:19 localhost dnsmasq[321388]: exiting on receipt of SIGTERM Dec 15 05:05:19 localhost podman[321423]: 2025-12-15 10:05:19.659519452 +0000 UTC m=+0.062623479 container kill baa8fedf27155d0c04f600491b11cdab3da32226a8d3d2716c611e07a9c7308a (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-910645bf-c485-48fb-b20c-a5b1b9aff07a, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, 
maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251202) Dec 15 05:05:19 localhost systemd[1]: libpod-baa8fedf27155d0c04f600491b11cdab3da32226a8d3d2716c611e07a9c7308a.scope: Deactivated successfully. Dec 15 05:05:19 localhost podman[321437]: 2025-12-15 10:05:19.738382756 +0000 UTC m=+0.063481560 container died baa8fedf27155d0c04f600491b11cdab3da32226a8d3d2716c611e07a9c7308a (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-910645bf-c485-48fb-b20c-a5b1b9aff07a, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0) Dec 15 05:05:19 localhost ceph-mon[298913]: mon.np0005559462@0(leader) e17 handle_command mon_command({"prefix": "df", "format": "json"} v 0) Dec 15 05:05:19 localhost ceph-mon[298913]: log_channel(audit) log [DBG] : from='client.? 
172.18.0.106:0/4204711138' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch Dec 15 05:05:19 localhost podman[321437]: 2025-12-15 10:05:19.771491633 +0000 UTC m=+0.096590387 container cleanup baa8fedf27155d0c04f600491b11cdab3da32226a8d3d2716c611e07a9c7308a (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-910645bf-c485-48fb-b20c-a5b1b9aff07a, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251202, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3) Dec 15 05:05:19 localhost systemd[1]: libpod-conmon-baa8fedf27155d0c04f600491b11cdab3da32226a8d3d2716c611e07a9c7308a.scope: Deactivated successfully. Dec 15 05:05:19 localhost nova_compute[286344]: 2025-12-15 10:05:19.781 286348 DEBUG oslo_concurrency.processutils [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.461s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m Dec 15 05:05:19 localhost podman[321443]: 2025-12-15 10:05:19.819971777 +0000 UTC m=+0.130334463 container remove baa8fedf27155d0c04f600491b11cdab3da32226a8d3d2716c611e07a9c7308a (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-910645bf-c485-48fb-b20c-a5b1b9aff07a, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS) Dec 15 05:05:19 localhost ovn_controller[154603]: 
2025-12-15T10:05:19Z|00253|binding|INFO|Releasing lport b37f7934-d790-4e14-92c9-321fdf7e6bfd from this chassis (sb_readonly=0) Dec 15 05:05:19 localhost ovn_controller[154603]: 2025-12-15T10:05:19Z|00254|binding|INFO|Setting lport b37f7934-d790-4e14-92c9-321fdf7e6bfd down in Southbound Dec 15 05:05:19 localhost nova_compute[286344]: 2025-12-15 10:05:19.832 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:05:19 localhost kernel: device tapb37f7934-d7 left promiscuous mode Dec 15 05:05:19 localhost neutron_sriov_agent[260044]: 2025-12-15 10:05:19.839 2 INFO neutron.agent.securitygroups_rpc [None req-ef9c94f5-c9f7-47c5-a165-f65f20c2ca5a d03c2d1ea31b421aabc2ccaccb47df84 08fb1034ee1a4def8bd8d9aecf9dfe86 - - default default] Security group member updated ['5de66dcc-b971-420f-89b1-9a69fc21c19e']#033[00m Dec 15 05:05:19 localhost ovn_metadata_agent[160585]: 2025-12-15 10:05:19.840 160590 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005559462.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8::3/64', 'neutron:device_id': 'dhcpce287b4b-a043-5ed9-8e08-4fd555d768a2-910645bf-c485-48fb-b20c-a5b1b9aff07a', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-910645bf-c485-48fb-b20c-a5b1b9aff07a', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '58db13bf0d834ccda318448020067936', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, 
additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=31b248e1-bd95-4c35-a311-ee84f197fcc4, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=b37f7934-d790-4e14-92c9-321fdf7e6bfd) old=Port_Binding(up=[True], chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Dec 15 05:05:19 localhost ovn_metadata_agent[160585]: 2025-12-15 10:05:19.843 160590 INFO neutron.agent.ovn.metadata.agent [-] Port b37f7934-d790-4e14-92c9-321fdf7e6bfd in datapath 910645bf-c485-48fb-b20c-a5b1b9aff07a unbound from our chassis#033[00m Dec 15 05:05:19 localhost ovn_metadata_agent[160585]: 2025-12-15 10:05:19.844 160590 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network 910645bf-c485-48fb-b20c-a5b1b9aff07a or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599#033[00m Dec 15 05:05:19 localhost ovn_metadata_agent[160585]: 2025-12-15 10:05:19.846 160858 DEBUG oslo.privsep.daemon [-] privsep: reply[15bdf372-5f93-4c18-af94-fb90f766670b]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Dec 15 05:05:19 localhost nova_compute[286344]: 2025-12-15 10:05:19.856 286348 DEBUG nova.virt.libvirt.driver [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m Dec 15 05:05:19 localhost nova_compute[286344]: 2025-12-15 10:05:19.857 286348 DEBUG nova.virt.libvirt.driver [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m Dec 15 05:05:19 localhost 
nova_compute[286344]: 2025-12-15 10:05:19.858 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:05:20 localhost nova_compute[286344]: 2025-12-15 10:05:20.059 286348 WARNING nova.virt.libvirt.driver [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m Dec 15 05:05:20 localhost nova_compute[286344]: 2025-12-15 10:05:20.060 286348 DEBUG nova.compute.resource_tracker [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Hypervisor/Node resource view: name=np0005559462.localdomain free_ram=11270MB free_disk=41.836978912353516GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, 
{"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m Dec 15 05:05:20 localhost nova_compute[286344]: 2025-12-15 10:05:20.061 286348 DEBUG oslo_concurrency.lockutils [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Dec 15 05:05:20 localhost nova_compute[286344]: 2025-12-15 10:05:20.061 286348 DEBUG oslo_concurrency.lockutils [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Dec 15 05:05:20 localhost nova_compute[286344]: 2025-12-15 10:05:20.214 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:05:20 localhost systemd[1]: var-lib-containers-storage-overlay-d26a353df1666edc345a50323abf0fb31972cb2d78e711e20d841c6d1d3a1225-merged.mount: Deactivated successfully. 
Dec 15 05:05:20 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-baa8fedf27155d0c04f600491b11cdab3da32226a8d3d2716c611e07a9c7308a-userdata-shm.mount: Deactivated successfully. Dec 15 05:05:20 localhost ceph-mon[298913]: mon.np0005559462@0(leader).osd e148 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Dec 15 05:05:20 localhost systemd[1]: run-netns-qdhcp\x2d910645bf\x2dc485\x2d48fb\x2db20c\x2da5b1b9aff07a.mount: Deactivated successfully. Dec 15 05:05:20 localhost neutron_dhcp_agent[267542]: 2025-12-15 10:05:20.800 267546 INFO neutron.agent.dhcp.agent [None req-555c61ad-16c9-49f9-aa43-d4f88b7ec7d0 - - - - - -] Network not present, action: clean_devices, action_kwargs: {}#033[00m Dec 15 05:05:20 localhost neutron_dhcp_agent[267542]: 2025-12-15 10:05:20.800 267546 INFO neutron.agent.dhcp.agent [None req-555c61ad-16c9-49f9-aa43-d4f88b7ec7d0 - - - - - -] Network not present, action: clean_devices, action_kwargs: {}#033[00m Dec 15 05:05:20 localhost neutron_dhcp_agent[267542]: 2025-12-15 10:05:20.801 267546 INFO neutron.agent.dhcp.agent [None req-555c61ad-16c9-49f9-aa43-d4f88b7ec7d0 - - - - - -] Network not present, action: clean_devices, action_kwargs: {}#033[00m Dec 15 05:05:20 localhost systemd[1]: Started /usr/bin/podman healthcheck run 4d46c406a3855fd1c322e5d86c048188ea5865daa07ba23aaebacc7879668923. 
Dec 15 05:05:20 localhost podman[321465]: 2025-12-15 10:05:20.904805824 +0000 UTC m=+0.082089635 container health_status 4d46c406a3855fd1c322e5d86c048188ea5865daa07ba23aaebacc7879668923 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '07f3af15d79276418f54a8dde2fc251064527c3571430e14af77aaa2c01f1193-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team) Dec 15 05:05:20 localhost 
podman[321465]: 2025-12-15 10:05:20.914448765 +0000 UTC m=+0.091732576 container exec_died 4d46c406a3855fd1c322e5d86c048188ea5865daa07ba23aaebacc7879668923 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_managed=true, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.license=GPLv2, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '07f3af15d79276418f54a8dde2fc251064527c3571430e14af77aaa2c01f1193-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}) Dec 15 05:05:20 localhost systemd[1]: 
4d46c406a3855fd1c322e5d86c048188ea5865daa07ba23aaebacc7879668923.service: Deactivated successfully. Dec 15 05:05:21 localhost neutron_sriov_agent[260044]: 2025-12-15 10:05:21.134 2 INFO neutron.agent.securitygroups_rpc [None req-b9f022fe-c21c-4560-ad15-6111386ebbf4 d03c2d1ea31b421aabc2ccaccb47df84 08fb1034ee1a4def8bd8d9aecf9dfe86 - - default default] Security group member updated ['5de66dcc-b971-420f-89b1-9a69fc21c19e']#033[00m Dec 15 05:05:21 localhost nova_compute[286344]: 2025-12-15 10:05:21.202 286348 DEBUG nova.compute.resource_tracker [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Instance 39ff1bd9-6f6b-44c8-bbec-a1fd9d196359 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 2, 'MEMORY_MB': 512, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m Dec 15 05:05:21 localhost nova_compute[286344]: 2025-12-15 10:05:21.203 286348 DEBUG nova.compute.resource_tracker [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m Dec 15 05:05:21 localhost nova_compute[286344]: 2025-12-15 10:05:21.203 286348 DEBUG nova.compute.resource_tracker [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Final resource view: name=np0005559462.localdomain phys_ram=15738MB used_ram=1024MB phys_disk=41GB used_disk=2GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m Dec 15 05:05:21 localhost nova_compute[286344]: 2025-12-15 10:05:21.400 286348 DEBUG oslo_concurrency.processutils [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute 
/usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m Dec 15 05:05:21 localhost ovn_controller[154603]: 2025-12-15T10:05:21Z|00255|binding|INFO|Releasing lport b35254ad-12eb-47bb-92be-44fefe0694f0 from this chassis (sb_readonly=0) Dec 15 05:05:21 localhost nova_compute[286344]: 2025-12-15 10:05:21.486 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:05:21 localhost neutron_sriov_agent[260044]: 2025-12-15 10:05:21.578 2 INFO neutron.agent.securitygroups_rpc [None req-dcffe485-3736-415c-ad40-ce53458a0ef9 b927ea383a0640b0839846e8a3038e87 2863215517a546fa9ec3fbc1298732b9 - - default default] Security group rule updated ['b61f64a6-4cdd-4e28-a23f-eb6f0e46cf34']#033[00m Dec 15 05:05:21 localhost ceph-mon[298913]: mon.np0005559462@0(leader) e17 handle_command mon_command({"prefix": "df", "format": "json"} v 0) Dec 15 05:05:21 localhost ceph-mon[298913]: log_channel(audit) log [DBG] : from='client.? 
172.18.0.106:0/523974834' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch Dec 15 05:05:21 localhost nova_compute[286344]: 2025-12-15 10:05:21.856 286348 DEBUG oslo_concurrency.processutils [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.456s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m Dec 15 05:05:21 localhost nova_compute[286344]: 2025-12-15 10:05:21.864 286348 DEBUG nova.compute.provider_tree [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Inventory has not changed in ProviderTree for provider: 26c8956b-6742-4951-b566-971b9bbe323b update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m Dec 15 05:05:21 localhost nova_compute[286344]: 2025-12-15 10:05:21.882 286348 DEBUG nova.scheduler.client.report [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Inventory has not changed for provider 26c8956b-6742-4951-b566-971b9bbe323b based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m Dec 15 05:05:21 localhost nova_compute[286344]: 2025-12-15 10:05:21.909 286348 DEBUG nova.compute.resource_tracker [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Compute_service record updated for np0005559462.localdomain:np0005559462.localdomain _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m Dec 15 05:05:21 localhost nova_compute[286344]: 2025-12-15 10:05:21.910 286348 DEBUG 
oslo_concurrency.lockutils [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 1.848s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Dec 15 05:05:21 localhost nova_compute[286344]: 2025-12-15 10:05:21.911 286348 DEBUG oslo_service.periodic_task [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Dec 15 05:05:21 localhost nova_compute[286344]: 2025-12-15 10:05:21.911 286348 DEBUG nova.compute.manager [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Cleaning up deleted instances with incomplete migration _cleanup_incomplete_migrations /usr/lib/python3.9/site-packages/nova/compute/manager.py:11183#033[00m Dec 15 05:05:21 localhost neutron_sriov_agent[260044]: 2025-12-15 10:05:21.965 2 INFO neutron.agent.securitygroups_rpc [None req-4fd9d644-28dc-4282-92f4-8171e95a95cd d03c2d1ea31b421aabc2ccaccb47df84 08fb1034ee1a4def8bd8d9aecf9dfe86 - - default default] Security group member updated ['5de66dcc-b971-420f-89b1-9a69fc21c19e']#033[00m Dec 15 05:05:22 localhost nova_compute[286344]: 2025-12-15 10:05:22.852 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:05:22 localhost neutron_sriov_agent[260044]: 2025-12-15 10:05:22.879 2 INFO neutron.agent.securitygroups_rpc [None req-06ac1901-38b7-4e0c-91bb-48725281c230 d03c2d1ea31b421aabc2ccaccb47df84 08fb1034ee1a4def8bd8d9aecf9dfe86 - - default default] Security group member updated ['5de66dcc-b971-420f-89b1-9a69fc21c19e']#033[00m Dec 15 05:05:22 localhost ceph-mon[298913]: mon.np0005559462@0(leader) e17 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) Dec 15 05:05:22 
localhost ceph-mon[298913]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/1388017251' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch Dec 15 05:05:23 localhost neutron_sriov_agent[260044]: 2025-12-15 10:05:23.098 2 INFO neutron.agent.securitygroups_rpc [None req-9fe9ed8e-e15b-4a65-91fb-28b72139e823 a22542ef31414501844801d3b102584b 54df976b8a364f93b0b8b0128def8f10 - - default default] Security group member updated ['3b08e208-1f02-4d1e-84d0-d090034d8a7c']#033[00m Dec 15 05:05:23 localhost ceph-mon[298913]: mon.np0005559462@0(leader).osd e148 do_prune osdmap full prune enabled Dec 15 05:05:23 localhost ceph-mon[298913]: mon.np0005559462@0(leader).osd e149 e149: 6 total, 6 up, 6 in Dec 15 05:05:23 localhost ceph-mon[298913]: log_channel(cluster) log [DBG] : osdmap e149: 6 total, 6 up, 6 in Dec 15 05:05:23 localhost nova_compute[286344]: 2025-12-15 10:05:23.926 286348 DEBUG oslo_service.periodic_task [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Dec 15 05:05:24 localhost ceph-mon[298913]: mon.np0005559462@0(leader).osd e149 do_prune osdmap full prune enabled Dec 15 05:05:24 localhost ceph-mon[298913]: mon.np0005559462@0(leader).osd e150 e150: 6 total, 6 up, 6 in Dec 15 05:05:24 localhost ceph-mon[298913]: log_channel(cluster) log [DBG] : osdmap e150: 6 total, 6 up, 6 in Dec 15 05:05:24 localhost nova_compute[286344]: 2025-12-15 10:05:24.271 286348 DEBUG oslo_service.periodic_task [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Dec 15 05:05:24 localhost nova_compute[286344]: 2025-12-15 10:05:24.271 286348 DEBUG nova.compute.manager [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] 
Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11145#033[00m Dec 15 05:05:24 localhost nova_compute[286344]: 2025-12-15 10:05:24.288 286348 DEBUG nova.compute.manager [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11154#033[00m Dec 15 05:05:24 localhost systemd[1]: Started /usr/bin/podman healthcheck run a4e94bd7aaee6c174c601060fb5f34559019ccea93fc397263ee3735731cfc1e. Dec 15 05:05:24 localhost podman[321507]: 2025-12-15 10:05:24.74136077 +0000 UTC m=+0.069880040 container health_status a4e94bd7aaee6c174c601060fb5f34559019ccea93fc397263ee3735731cfc1e (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}) Dec 15 05:05:24 localhost podman[321507]: 2025-12-15 10:05:24.74937623 +0000 UTC m=+0.077895510 container exec_died a4e94bd7aaee6c174c601060fb5f34559019ccea93fc397263ee3735731cfc1e (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, config_id=podman_exporter, 
container_name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}) Dec 15 05:05:24 localhost systemd[1]: a4e94bd7aaee6c174c601060fb5f34559019ccea93fc397263ee3735731cfc1e.service: Deactivated successfully. Dec 15 05:05:25 localhost neutron_sriov_agent[260044]: 2025-12-15 10:05:25.167 2 INFO neutron.agent.securitygroups_rpc [None req-43fd4716-f9b2-4f28-a748-95f2c94ebfdd b927ea383a0640b0839846e8a3038e87 2863215517a546fa9ec3fbc1298732b9 - - default default] Security group rule updated ['a92bfd93-273b-41a5-a030-d3fcc8fda37f']#033[00m Dec 15 05:05:25 localhost ceph-mon[298913]: mon.np0005559462@0(leader).osd e150 do_prune osdmap full prune enabled Dec 15 05:05:25 localhost ceph-mon[298913]: mon.np0005559462@0(leader).osd e151 e151: 6 total, 6 up, 6 in Dec 15 05:05:25 localhost ceph-mon[298913]: log_channel(cluster) log [DBG] : osdmap e151: 6 total, 6 up, 6 in Dec 15 05:05:25 localhost nova_compute[286344]: 2025-12-15 10:05:25.216 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:05:25 localhost ceph-mon[298913]: mon.np0005559462@0(leader).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Dec 15 05:05:25 localhost ceph-mon[298913]: mon.np0005559462@0(leader).osd e151 do_prune osdmap 
full prune enabled Dec 15 05:05:25 localhost ceph-mon[298913]: mon.np0005559462@0(leader).osd e152 e152: 6 total, 6 up, 6 in Dec 15 05:05:25 localhost ceph-mon[298913]: log_channel(cluster) log [DBG] : osdmap e152: 6 total, 6 up, 6 in Dec 15 05:05:25 localhost neutron_sriov_agent[260044]: 2025-12-15 10:05:25.695 2 INFO neutron.agent.securitygroups_rpc [None req-238cbbcd-77ce-406c-b3c3-b8747cef9734 b927ea383a0640b0839846e8a3038e87 2863215517a546fa9ec3fbc1298732b9 - - default default] Security group rule updated ['a92bfd93-273b-41a5-a030-d3fcc8fda37f']#033[00m Dec 15 05:05:25 localhost neutron_sriov_agent[260044]: 2025-12-15 10:05:25.811 2 INFO neutron.agent.securitygroups_rpc [None req-9ed743c2-584e-4a66-97b0-d543bf4e183c d03c2d1ea31b421aabc2ccaccb47df84 08fb1034ee1a4def8bd8d9aecf9dfe86 - - default default] Security group member updated ['5de66dcc-b971-420f-89b1-9a69fc21c19e']#033[00m Dec 15 05:05:26 localhost neutron_sriov_agent[260044]: 2025-12-15 10:05:26.824 2 INFO neutron.agent.securitygroups_rpc [None req-d42f763f-93ec-44ad-ad94-052834f9fac3 a22542ef31414501844801d3b102584b 54df976b8a364f93b0b8b0128def8f10 - - default default] Security group member updated ['3b08e208-1f02-4d1e-84d0-d090034d8a7c']#033[00m Dec 15 05:05:27 localhost ceph-mon[298913]: mon.np0005559462@0(leader).osd e152 do_prune osdmap full prune enabled Dec 15 05:05:27 localhost neutron_sriov_agent[260044]: 2025-12-15 10:05:27.204 2 INFO neutron.agent.securitygroups_rpc [None req-f7e295b5-66f4-494f-908b-051cab0d8e91 d03c2d1ea31b421aabc2ccaccb47df84 08fb1034ee1a4def8bd8d9aecf9dfe86 - - default default] Security group member updated ['5de66dcc-b971-420f-89b1-9a69fc21c19e']#033[00m Dec 15 05:05:27 localhost ceph-mon[298913]: mon.np0005559462@0(leader).osd e153 e153: 6 total, 6 up, 6 in Dec 15 05:05:27 localhost ceph-mon[298913]: log_channel(cluster) log [DBG] : osdmap e153: 6 total, 6 up, 6 in Dec 15 05:05:27 localhost nova_compute[286344]: 2025-12-15 10:05:27.891 286348 DEBUG 
ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:05:28 localhost ceph-mon[298913]: mon.np0005559462@0(leader).osd e153 do_prune osdmap full prune enabled Dec 15 05:05:28 localhost ceph-mon[298913]: mon.np0005559462@0(leader).osd e154 e154: 6 total, 6 up, 6 in Dec 15 05:05:28 localhost ceph-mon[298913]: log_channel(cluster) log [DBG] : osdmap e154: 6 total, 6 up, 6 in Dec 15 05:05:28 localhost neutron_sriov_agent[260044]: 2025-12-15 10:05:28.310 2 INFO neutron.agent.securitygroups_rpc [None req-5799802f-0d49-409e-8d91-bee13f94dce2 d03c2d1ea31b421aabc2ccaccb47df84 08fb1034ee1a4def8bd8d9aecf9dfe86 - - default default] Security group member updated ['5de66dcc-b971-420f-89b1-9a69fc21c19e']#033[00m Dec 15 05:05:29 localhost ceph-mon[298913]: mon.np0005559462@0(leader).osd e154 do_prune osdmap full prune enabled Dec 15 05:05:29 localhost ceph-mon[298913]: mon.np0005559462@0(leader).osd e155 e155: 6 total, 6 up, 6 in Dec 15 05:05:29 localhost ceph-mon[298913]: log_channel(cluster) log [DBG] : osdmap e155: 6 total, 6 up, 6 in Dec 15 05:05:29 localhost neutron_sriov_agent[260044]: 2025-12-15 10:05:29.279 2 INFO neutron.agent.securitygroups_rpc [None req-18944d12-a91a-46e6-b708-7022514dca94 b927ea383a0640b0839846e8a3038e87 2863215517a546fa9ec3fbc1298732b9 - - default default] Security group rule updated ['e0141c34-f5e1-49b9-9090-d5ca6106e7cf']#033[00m Dec 15 05:05:29 localhost neutron_sriov_agent[260044]: 2025-12-15 10:05:29.736 2 INFO neutron.agent.securitygroups_rpc [None req-9a99cf32-04ed-41e6-ac40-a340ad53ca5e d03c2d1ea31b421aabc2ccaccb47df84 08fb1034ee1a4def8bd8d9aecf9dfe86 - - default default] Security group member updated ['5de66dcc-b971-420f-89b1-9a69fc21c19e']#033[00m Dec 15 05:05:30 localhost neutron_sriov_agent[260044]: 2025-12-15 10:05:30.111 2 INFO neutron.agent.securitygroups_rpc [None req-e8ee1339-91c3-48a4-89f7-96e87b53a4d3 d03c2d1ea31b421aabc2ccaccb47df84 
08fb1034ee1a4def8bd8d9aecf9dfe86 - - default default] Security group member updated ['5de66dcc-b971-420f-89b1-9a69fc21c19e']#033[00m Dec 15 05:05:30 localhost nova_compute[286344]: 2025-12-15 10:05:30.263 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:05:30 localhost neutron_sriov_agent[260044]: 2025-12-15 10:05:30.348 2 INFO neutron.agent.securitygroups_rpc [None req-b5d5ccff-b210-46f8-aacc-4ce081aacb2b b927ea383a0640b0839846e8a3038e87 2863215517a546fa9ec3fbc1298732b9 - - default default] Security group rule updated ['e0141c34-f5e1-49b9-9090-d5ca6106e7cf']#033[00m Dec 15 05:05:30 localhost neutron_sriov_agent[260044]: 2025-12-15 10:05:30.491 2 INFO neutron.agent.securitygroups_rpc [None req-ce9a94ce-dc0e-49cf-9a33-28633bfd8bd5 d03c2d1ea31b421aabc2ccaccb47df84 08fb1034ee1a4def8bd8d9aecf9dfe86 - - default default] Security group member updated ['5de66dcc-b971-420f-89b1-9a69fc21c19e']#033[00m Dec 15 05:05:30 localhost ceph-mon[298913]: mon.np0005559462@0(leader).osd e155 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Dec 15 05:05:30 localhost ceph-mon[298913]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #49. Immutable memtables: 0. 
Dec 15 05:05:30 localhost ceph-mon[298913]: rocksdb: (Original Log Time 2025/12/15-10:05:30.690530) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0 Dec 15 05:05:30 localhost ceph-mon[298913]: rocksdb: [db/flush_job.cc:856] [default] [JOB 27] Flushing memtable with next log file: 49 Dec 15 05:05:30 localhost ceph-mon[298913]: rocksdb: EVENT_LOG_v1 {"time_micros": 1765793130690704, "job": 27, "event": "flush_started", "num_memtables": 1, "num_entries": 2244, "num_deletes": 268, "total_data_size": 2308914, "memory_usage": 2350904, "flush_reason": "Manual Compaction"} Dec 15 05:05:30 localhost ceph-mon[298913]: rocksdb: [db/flush_job.cc:885] [default] [JOB 27] Level-0 flush table #50: started Dec 15 05:05:30 localhost ceph-mon[298913]: rocksdb: EVENT_LOG_v1 {"time_micros": 1765793130710475, "cf_name": "default", "job": 27, "event": "table_file_creation", "file_number": 50, "file_size": 2248732, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 27497, "largest_seqno": 29740, "table_properties": {"data_size": 2239302, "index_size": 5871, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 2501, "raw_key_size": 20564, "raw_average_key_size": 21, "raw_value_size": 2220000, "raw_average_value_size": 2276, "num_data_blocks": 254, "num_entries": 975, "num_filter_entries": 975, "num_deletions": 268, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; 
max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1765792986, "oldest_key_time": 1765792986, "file_creation_time": 1765793130, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "603b24af-e2be-4214-bc56-9e652eb4af3d", "db_session_id": "0OJRM9SCUA16EXV0VQZ2", "orig_file_number": 50, "seqno_to_time_mapping": "N/A"}} Dec 15 05:05:30 localhost ceph-mon[298913]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 27] Flush lasted 19995 microseconds, and 7435 cpu microseconds. Dec 15 05:05:30 localhost ceph-mon[298913]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed. Dec 15 05:05:30 localhost ceph-mon[298913]: rocksdb: (Original Log Time 2025/12/15-10:05:30.710541) [db/flush_job.cc:967] [default] [JOB 27] Level-0 flush table #50: 2248732 bytes OK Dec 15 05:05:30 localhost ceph-mon[298913]: rocksdb: (Original Log Time 2025/12/15-10:05:30.710573) [db/memtable_list.cc:519] [default] Level-0 commit table #50 started Dec 15 05:05:30 localhost ceph-mon[298913]: rocksdb: (Original Log Time 2025/12/15-10:05:30.713608) [db/memtable_list.cc:722] [default] Level-0 commit table #50: memtable #1 done Dec 15 05:05:30 localhost ceph-mon[298913]: rocksdb: (Original Log Time 2025/12/15-10:05:30.713630) EVENT_LOG_v1 {"time_micros": 1765793130713624, "job": 27, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0} Dec 15 05:05:30 localhost ceph-mon[298913]: rocksdb: (Original Log Time 2025/12/15-10:05:30.713657) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25 Dec 15 05:05:30 localhost ceph-mon[298913]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 27] Try to delete WAL files size 2299298, prev total WAL file 
size 2299788, number of live WAL files 2. Dec 15 05:05:30 localhost ceph-mon[298913]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005559462/store.db/000046.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000 Dec 15 05:05:30 localhost ceph-mon[298913]: rocksdb: (Original Log Time 2025/12/15-10:05:30.714594) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '6C6F676D0034303230' seq:72057594037927935, type:22 .. '6C6F676D0034323731' seq:0, type:0; will stop at (end) Dec 15 05:05:30 localhost ceph-mon[298913]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 28] Compacting 1@0 + 1@6 files to L6, score -1.00 Dec 15 05:05:30 localhost ceph-mon[298913]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 27 Base level 0, inputs: [50(2196KB)], [48(16MB)] Dec 15 05:05:30 localhost ceph-mon[298913]: rocksdb: EVENT_LOG_v1 {"time_micros": 1765793130714698, "job": 28, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [50], "files_L6": [48], "score": -1, "input_data_size": 19785979, "oldest_snapshot_seqno": -1} Dec 15 05:05:30 localhost neutron_sriov_agent[260044]: 2025-12-15 10:05:30.746 2 INFO neutron.agent.securitygroups_rpc [None req-2c7347b6-80c0-4d24-8f46-ba20103f56dc a22542ef31414501844801d3b102584b 54df976b8a364f93b0b8b0128def8f10 - - default default] Security group member updated ['3b08e208-1f02-4d1e-84d0-d090034d8a7c']#033[00m Dec 15 05:05:30 localhost ceph-mon[298913]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 28] Generated table #51: 13107 keys, 19332658 bytes, temperature: kUnknown Dec 15 05:05:30 localhost ceph-mon[298913]: rocksdb: EVENT_LOG_v1 {"time_micros": 1765793130873711, "cf_name": "default", "job": 28, "event": "table_file_creation", "file_number": 51, "file_size": 19332658, "file_checksum": "", "file_checksum_func_name": 
"Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 19254809, "index_size": 44020, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 32773, "raw_key_size": 350296, "raw_average_key_size": 26, "raw_value_size": 19028429, "raw_average_value_size": 1451, "num_data_blocks": 1684, "num_entries": 13107, "num_filter_entries": 13107, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1765792320, "oldest_key_time": 0, "file_creation_time": 1765793130, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "603b24af-e2be-4214-bc56-9e652eb4af3d", "db_session_id": "0OJRM9SCUA16EXV0VQZ2", "orig_file_number": 51, "seqno_to_time_mapping": "N/A"}} Dec 15 05:05:30 localhost ceph-mon[298913]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed. 
Dec 15 05:05:30 localhost ceph-mon[298913]: rocksdb: (Original Log Time 2025/12/15-10:05:30.874128) [db/compaction/compaction_job.cc:1663] [default] [JOB 28] Compacted 1@0 + 1@6 files to L6 => 19332658 bytes Dec 15 05:05:30 localhost ceph-mon[298913]: rocksdb: (Original Log Time 2025/12/15-10:05:30.875753) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 124.4 rd, 121.5 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(2.1, 16.7 +0.0 blob) out(18.4 +0.0 blob), read-write-amplify(17.4) write-amplify(8.6) OK, records in: 13653, records dropped: 546 output_compression: NoCompression Dec 15 05:05:30 localhost ceph-mon[298913]: rocksdb: (Original Log Time 2025/12/15-10:05:30.875782) EVENT_LOG_v1 {"time_micros": 1765793130875768, "job": 28, "event": "compaction_finished", "compaction_time_micros": 159107, "compaction_time_cpu_micros": 54305, "output_level": 6, "num_output_files": 1, "total_output_size": 19332658, "num_input_records": 13653, "num_output_records": 13107, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]} Dec 15 05:05:30 localhost ceph-mon[298913]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005559462/store.db/000050.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000 Dec 15 05:05:30 localhost ceph-mon[298913]: rocksdb: EVENT_LOG_v1 {"time_micros": 1765793130876287, "job": 28, "event": "table_file_deletion", "file_number": 50} Dec 15 05:05:30 localhost ceph-mon[298913]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005559462/store.db/000048.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000 Dec 15 05:05:30 localhost ceph-mon[298913]: rocksdb: EVENT_LOG_v1 {"time_micros": 1765793130879084, 
"job": 28, "event": "table_file_deletion", "file_number": 48} Dec 15 05:05:30 localhost ceph-mon[298913]: rocksdb: (Original Log Time 2025/12/15-10:05:30.714450) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting Dec 15 05:05:30 localhost ceph-mon[298913]: rocksdb: (Original Log Time 2025/12/15-10:05:30.879206) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting Dec 15 05:05:30 localhost ceph-mon[298913]: rocksdb: (Original Log Time 2025/12/15-10:05:30.879214) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting Dec 15 05:05:30 localhost ceph-mon[298913]: rocksdb: (Original Log Time 2025/12/15-10:05:30.879217) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting Dec 15 05:05:30 localhost ceph-mon[298913]: rocksdb: (Original Log Time 2025/12/15-10:05:30.879219) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting Dec 15 05:05:30 localhost ceph-mon[298913]: rocksdb: (Original Log Time 2025/12/15-10:05:30.879221) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting Dec 15 05:05:31 localhost podman[243449]: time="2025-12-15T10:05:31Z" level=info msg="List containers: received `last` parameter - overwriting `limit`" Dec 15 05:05:31 localhost podman[243449]: @ - - [15/Dec/2025:10:05:31 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 158464 "" "Go-http-client/1.1" Dec 15 05:05:31 localhost podman[243449]: @ - - [15/Dec/2025:10:05:31 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 19720 "" "Go-http-client/1.1" Dec 15 05:05:32 localhost neutron_sriov_agent[260044]: 2025-12-15 10:05:32.103 2 INFO neutron.agent.securitygroups_rpc [None req-eceb20cf-aa33-471c-9304-8cc71fb5c22e b927ea383a0640b0839846e8a3038e87 2863215517a546fa9ec3fbc1298732b9 - - default default] Security group 
rule updated ['70c5b8cf-d5dd-479a-a035-dc263d0c5f51']#033[00m Dec 15 05:05:32 localhost neutron_sriov_agent[260044]: 2025-12-15 10:05:32.772 2 INFO neutron.agent.securitygroups_rpc [None req-5080008a-6ba2-47ab-aac1-ca444f91ec66 b927ea383a0640b0839846e8a3038e87 2863215517a546fa9ec3fbc1298732b9 - - default default] Security group rule updated ['70c5b8cf-d5dd-479a-a035-dc263d0c5f51']#033[00m Dec 15 05:05:32 localhost nova_compute[286344]: 2025-12-15 10:05:32.933 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:05:33 localhost ceph-mon[298913]: mon.np0005559462@0(leader) e17 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) Dec 15 05:05:33 localhost ceph-mon[298913]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/3862409060' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch Dec 15 05:05:33 localhost neutron_sriov_agent[260044]: 2025-12-15 10:05:33.675 2 INFO neutron.agent.securitygroups_rpc [None req-3bb254d9-38ea-40b5-8c3d-b226ffd85099 b927ea383a0640b0839846e8a3038e87 2863215517a546fa9ec3fbc1298732b9 - - default default] Security group rule updated ['70c5b8cf-d5dd-479a-a035-dc263d0c5f51']#033[00m Dec 15 05:05:34 localhost nova_compute[286344]: 2025-12-15 10:05:34.271 286348 DEBUG oslo_service.periodic_task [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Dec 15 05:05:34 localhost ceph-mon[298913]: mon.np0005559462@0(leader).osd e155 do_prune osdmap full prune enabled Dec 15 05:05:34 localhost ceph-mon[298913]: mon.np0005559462@0(leader).osd e156 e156: 6 total, 6 up, 6 in Dec 15 05:05:34 localhost ceph-mon[298913]: log_channel(cluster) log [DBG] : osdmap e156: 6 total, 6 up, 6 in Dec 15 05:05:34 localhost 
openstack_network_exporter[246484]: ERROR 10:05:34 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server Dec 15 05:05:34 localhost openstack_network_exporter[246484]: ERROR 10:05:34 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd Dec 15 05:05:34 localhost openstack_network_exporter[246484]: ERROR 10:05:34 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd Dec 15 05:05:34 localhost openstack_network_exporter[246484]: ERROR 10:05:34 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath Dec 15 05:05:34 localhost openstack_network_exporter[246484]: Dec 15 05:05:34 localhost openstack_network_exporter[246484]: ERROR 10:05:34 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath Dec 15 05:05:34 localhost openstack_network_exporter[246484]: Dec 15 05:05:35 localhost neutron_dhcp_agent[267542]: 2025-12-15 10:05:35.076 267546 INFO neutron.agent.linux.ip_lib [None req-adb3ac88-9598-49a1-9346-a0b4fbc08ef4 - - - - - -] Device tap159bb878-40 cannot be used as it has no MAC address#033[00m Dec 15 05:05:35 localhost nova_compute[286344]: 2025-12-15 10:05:35.098 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:05:35 localhost kernel: device tap159bb878-40 entered promiscuous mode Dec 15 05:05:35 localhost NetworkManager[5963]: [1765793135.1062] manager: (tap159bb878-40): new Generic device (/org/freedesktop/NetworkManager/Devices/41) Dec 15 05:05:35 localhost ovn_controller[154603]: 2025-12-15T10:05:35Z|00256|binding|INFO|Claiming lport 159bb878-40ba-4340-a0c2-64606be092f4 for this chassis. 
Dec 15 05:05:35 localhost nova_compute[286344]: 2025-12-15 10:05:35.107 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:05:35 localhost ovn_controller[154603]: 2025-12-15T10:05:35Z|00257|binding|INFO|159bb878-40ba-4340-a0c2-64606be092f4: Claiming unknown Dec 15 05:05:35 localhost systemd-udevd[321542]: Network interface NamePolicy= disabled on kernel command line. Dec 15 05:05:35 localhost ovn_metadata_agent[160585]: 2025-12-15 10:05:35.117 160590 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005559462.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8::1/64', 'neutron:device_id': 'dhcpce287b4b-a043-5ed9-8e08-4fd555d768a2-f92fa7f0-d2bf-47f3-bb61-da7318ce7879', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-f92fa7f0-d2bf-47f3-bb61-da7318ce7879', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '58db13bf0d834ccda318448020067936', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=add79b81-b2fb-4bfb-9f25-d60503ab36ae, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=159bb878-40ba-4340-a0c2-64606be092f4) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Dec 15 05:05:35 localhost ovn_metadata_agent[160585]: 2025-12-15 10:05:35.120 160590 INFO 
neutron.agent.ovn.metadata.agent [-] Port 159bb878-40ba-4340-a0c2-64606be092f4 in datapath f92fa7f0-d2bf-47f3-bb61-da7318ce7879 bound to our chassis#033[00m Dec 15 05:05:35 localhost ovn_metadata_agent[160585]: 2025-12-15 10:05:35.121 160590 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network f92fa7f0-d2bf-47f3-bb61-da7318ce7879 or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599#033[00m Dec 15 05:05:35 localhost ovn_metadata_agent[160585]: 2025-12-15 10:05:35.123 160858 DEBUG oslo.privsep.daemon [-] privsep: reply[3880bde4-f28b-4ccc-9c0b-94d7e9c30a77]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Dec 15 05:05:35 localhost ovn_controller[154603]: 2025-12-15T10:05:35Z|00258|binding|INFO|Setting lport 159bb878-40ba-4340-a0c2-64606be092f4 ovn-installed in OVS Dec 15 05:05:35 localhost ovn_controller[154603]: 2025-12-15T10:05:35Z|00259|binding|INFO|Setting lport 159bb878-40ba-4340-a0c2-64606be092f4 up in Southbound Dec 15 05:05:35 localhost nova_compute[286344]: 2025-12-15 10:05:35.149 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:05:35 localhost neutron_sriov_agent[260044]: 2025-12-15 10:05:35.149 2 INFO neutron.agent.securitygroups_rpc [None req-f389b846-3e6f-4f86-9394-10b0b9a5046f b927ea383a0640b0839846e8a3038e87 2863215517a546fa9ec3fbc1298732b9 - - default default] Security group rule updated ['70c5b8cf-d5dd-479a-a035-dc263d0c5f51']#033[00m Dec 15 05:05:35 localhost nova_compute[286344]: 2025-12-15 10:05:35.150 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:05:35 localhost nova_compute[286344]: 2025-12-15 10:05:35.182 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 
[POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:05:35 localhost nova_compute[286344]: 2025-12-15 10:05:35.211 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:05:35 localhost nova_compute[286344]: 2025-12-15 10:05:35.265 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:05:35 localhost ceph-mon[298913]: mon.np0005559462@0(leader).osd e156 do_prune osdmap full prune enabled Dec 15 05:05:35 localhost ceph-mon[298913]: mon.np0005559462@0(leader).osd e157 e157: 6 total, 6 up, 6 in Dec 15 05:05:35 localhost ceph-mon[298913]: log_channel(cluster) log [DBG] : osdmap e157: 6 total, 6 up, 6 in Dec 15 05:05:35 localhost ceph-mon[298913]: mon.np0005559462@0(leader).osd e157 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Dec 15 05:05:35 localhost ceph-mon[298913]: mon.np0005559462@0(leader).osd e157 do_prune osdmap full prune enabled Dec 15 05:05:35 localhost ceph-mon[298913]: mon.np0005559462@0(leader).osd e158 e158: 6 total, 6 up, 6 in Dec 15 05:05:35 localhost ceph-mon[298913]: log_channel(cluster) log [DBG] : osdmap e158: 6 total, 6 up, 6 in Dec 15 05:05:36 localhost podman[321598]: Dec 15 05:05:36 localhost podman[321598]: 2025-12-15 10:05:36.124459759 +0000 UTC m=+0.089263835 container create 3adc8eeda4c8329152ad78848902708dec1236fd2c4350e8e6383067837b1f3c (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-f92fa7f0-d2bf-47f3-bb61-da7318ce7879, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, 
org.label-schema.build-date=20251202, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team) Dec 15 05:05:36 localhost systemd[1]: Started libpod-conmon-3adc8eeda4c8329152ad78848902708dec1236fd2c4350e8e6383067837b1f3c.scope. Dec 15 05:05:36 localhost systemd[1]: Started libcrun container. Dec 15 05:05:36 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/ad0505c1aa899b9e666a1145921c85111503f1ab236a8fc5775a8b0f9b589090/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff) Dec 15 05:05:36 localhost podman[321598]: 2025-12-15 10:05:36.083494534 +0000 UTC m=+0.048298640 image pull quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified Dec 15 05:05:36 localhost podman[321598]: 2025-12-15 10:05:36.195555128 +0000 UTC m=+0.160359204 container init 3adc8eeda4c8329152ad78848902708dec1236fd2c4350e8e6383067837b1f3c (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-f92fa7f0-d2bf-47f3-bb61-da7318ce7879, org.label-schema.build-date=20251202, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team) Dec 15 05:05:36 localhost podman[321598]: 2025-12-15 10:05:36.212080451 +0000 UTC m=+0.176884537 container start 3adc8eeda4c8329152ad78848902708dec1236fd2c4350e8e6383067837b1f3c (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-f92fa7f0-d2bf-47f3-bb61-da7318ce7879, tcib_managed=true, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202, maintainer=OpenStack Kubernetes 
Operator team, org.label-schema.schema-version=1.0) Dec 15 05:05:36 localhost dnsmasq[321616]: started, version 2.85 cachesize 150 Dec 15 05:05:36 localhost dnsmasq[321616]: DNS service limited to local subnets Dec 15 05:05:36 localhost dnsmasq[321616]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile Dec 15 05:05:36 localhost dnsmasq[321616]: warning: no upstream servers configured Dec 15 05:05:36 localhost dnsmasq-dhcp[321616]: DHCPv6, static leases only on 2001:db8::, lease time 1d Dec 15 05:05:36 localhost dnsmasq[321616]: read /var/lib/neutron/dhcp/f92fa7f0-d2bf-47f3-bb61-da7318ce7879/addn_hosts - 0 addresses Dec 15 05:05:36 localhost dnsmasq-dhcp[321616]: read /var/lib/neutron/dhcp/f92fa7f0-d2bf-47f3-bb61-da7318ce7879/host Dec 15 05:05:36 localhost dnsmasq-dhcp[321616]: read /var/lib/neutron/dhcp/f92fa7f0-d2bf-47f3-bb61-da7318ce7879/opts Dec 15 05:05:36 localhost neutron_dhcp_agent[267542]: 2025-12-15 10:05:36.514 267546 INFO neutron.agent.dhcp.agent [None req-17b09948-086b-4066-8c0d-c33f1f8678a9 - - - - - -] DHCP configuration for ports {'f9bd5d25-3665-494c-992d-392d88db5a17'} is completed#033[00m Dec 15 05:05:36 localhost neutron_sriov_agent[260044]: 2025-12-15 10:05:36.882 2 INFO neutron.agent.securitygroups_rpc [None req-d75b0f8a-81f2-4a74-ab29-0d346ae2a689 b927ea383a0640b0839846e8a3038e87 2863215517a546fa9ec3fbc1298732b9 - - default default] Security group rule updated ['70c5b8cf-d5dd-479a-a035-dc263d0c5f51']#033[00m Dec 15 05:05:37 localhost neutron_sriov_agent[260044]: 2025-12-15 10:05:37.324 2 INFO neutron.agent.securitygroups_rpc [None req-62247f9f-3856-4ccf-865e-3f9bc29c081c b927ea383a0640b0839846e8a3038e87 2863215517a546fa9ec3fbc1298732b9 - - default default] Security group rule updated ['70c5b8cf-d5dd-479a-a035-dc263d0c5f51']#033[00m Dec 15 05:05:37 localhost ceph-mon[298913]: mon.np0005559462@0(leader).osd e158 do_prune osdmap full prune 
enabled Dec 15 05:05:37 localhost ceph-mon[298913]: mon.np0005559462@0(leader).osd e159 e159: 6 total, 6 up, 6 in Dec 15 05:05:37 localhost ceph-mon[298913]: log_channel(cluster) log [DBG] : osdmap e159: 6 total, 6 up, 6 in Dec 15 05:05:37 localhost ceph-mon[298913]: mon.np0005559462@0(leader) e17 handle_command mon_command({"prefix":"df", "format":"json"} v 0) Dec 15 05:05:37 localhost ceph-mon[298913]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/438137847' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch Dec 15 05:05:37 localhost ceph-mon[298913]: mon.np0005559462@0(leader) e17 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0) Dec 15 05:05:37 localhost ceph-mon[298913]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/438137847' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch Dec 15 05:05:37 localhost nova_compute[286344]: 2025-12-15 10:05:37.961 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:05:38 localhost neutron_sriov_agent[260044]: 2025-12-15 10:05:38.596 2 INFO neutron.agent.securitygroups_rpc [None req-a38467fa-739e-4edb-a215-2279a10ec787 b927ea383a0640b0839846e8a3038e87 2863215517a546fa9ec3fbc1298732b9 - - default default] Security group rule updated ['bcd3af27-5007-4a87-be91-1e37e5913c42']#033[00m Dec 15 05:05:38 localhost ceph-mon[298913]: mon.np0005559462@0(leader) e17 handle_command mon_command({"prefix":"df", "format":"json"} v 0) Dec 15 05:05:38 localhost ceph-mon[298913]: log_channel(audit) log [DBG] : from='client.? 
172.18.0.32:0/1625523691' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch Dec 15 05:05:38 localhost ceph-mon[298913]: mon.np0005559462@0(leader) e17 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0) Dec 15 05:05:38 localhost ceph-mon[298913]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/1625523691' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch Dec 15 05:05:39 localhost ceph-mon[298913]: mon.np0005559462@0(leader).osd e159 do_prune osdmap full prune enabled Dec 15 05:05:39 localhost ceph-mon[298913]: mon.np0005559462@0(leader).osd e160 e160: 6 total, 6 up, 6 in Dec 15 05:05:39 localhost ceph-mon[298913]: log_channel(cluster) log [DBG] : osdmap e160: 6 total, 6 up, 6 in Dec 15 05:05:39 localhost ceph-mon[298913]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #52. Immutable memtables: 0. Dec 15 05:05:39 localhost ceph-mon[298913]: rocksdb: (Original Log Time 2025/12/15-10:05:39.796293) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0 Dec 15 05:05:39 localhost ceph-mon[298913]: rocksdb: [db/flush_job.cc:856] [default] [JOB 29] Flushing memtable with next log file: 52 Dec 15 05:05:39 localhost ceph-mon[298913]: rocksdb: EVENT_LOG_v1 {"time_micros": 1765793139796343, "job": 29, "event": "flush_started", "num_memtables": 1, "num_entries": 421, "num_deletes": 253, "total_data_size": 197341, "memory_usage": 205224, "flush_reason": "Manual Compaction"} Dec 15 05:05:39 localhost ceph-mon[298913]: rocksdb: [db/flush_job.cc:885] [default] [JOB 29] Level-0 flush table #53: started Dec 15 05:05:39 localhost ceph-mon[298913]: rocksdb: EVENT_LOG_v1 {"time_micros": 1765793139801068, "cf_name": "default", "job": 29, 
"event": "table_file_creation", "file_number": 53, "file_size": 193662, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 29741, "largest_seqno": 30161, "table_properties": {"data_size": 191260, "index_size": 513, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 837, "raw_key_size": 6799, "raw_average_key_size": 20, "raw_value_size": 186137, "raw_average_value_size": 560, "num_data_blocks": 22, "num_entries": 332, "num_filter_entries": 332, "num_deletions": 253, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1765793130, "oldest_key_time": 1765793130, "file_creation_time": 1765793139, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "603b24af-e2be-4214-bc56-9e652eb4af3d", "db_session_id": "0OJRM9SCUA16EXV0VQZ2", "orig_file_number": 53, "seqno_to_time_mapping": "N/A"}} Dec 15 05:05:39 localhost ceph-mon[298913]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 29] Flush lasted 4813 microseconds, and 1345 cpu microseconds. Dec 15 05:05:39 localhost ceph-mon[298913]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed. 
Dec 15 05:05:39 localhost ceph-mon[298913]: rocksdb: (Original Log Time 2025/12/15-10:05:39.801109) [db/flush_job.cc:967] [default] [JOB 29] Level-0 flush table #53: 193662 bytes OK Dec 15 05:05:39 localhost ceph-mon[298913]: rocksdb: (Original Log Time 2025/12/15-10:05:39.801128) [db/memtable_list.cc:519] [default] Level-0 commit table #53 started Dec 15 05:05:39 localhost ceph-mon[298913]: rocksdb: (Original Log Time 2025/12/15-10:05:39.803900) [db/memtable_list.cc:722] [default] Level-0 commit table #53: memtable #1 done Dec 15 05:05:39 localhost ceph-mon[298913]: rocksdb: (Original Log Time 2025/12/15-10:05:39.803936) EVENT_LOG_v1 {"time_micros": 1765793139803926, "job": 29, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0} Dec 15 05:05:39 localhost ceph-mon[298913]: rocksdb: (Original Log Time 2025/12/15-10:05:39.803958) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25 Dec 15 05:05:39 localhost ceph-mon[298913]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 29] Try to delete WAL files size 194681, prev total WAL file size 194681, number of live WAL files 2. Dec 15 05:05:39 localhost ceph-mon[298913]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005559462/store.db/000049.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000 Dec 15 05:05:39 localhost ceph-mon[298913]: rocksdb: (Original Log Time 2025/12/15-10:05:39.804657) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '7061786F73003132303438' seq:72057594037927935, type:22 .. 
'7061786F73003132333030' seq:0, type:0; will stop at (end) Dec 15 05:05:39 localhost ceph-mon[298913]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 30] Compacting 1@0 + 1@6 files to L6, score -1.00 Dec 15 05:05:39 localhost ceph-mon[298913]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 29 Base level 0, inputs: [53(189KB)], [51(18MB)] Dec 15 05:05:39 localhost ceph-mon[298913]: rocksdb: EVENT_LOG_v1 {"time_micros": 1765793139804728, "job": 30, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [53], "files_L6": [51], "score": -1, "input_data_size": 19526320, "oldest_snapshot_seqno": -1} Dec 15 05:05:39 localhost ceph-mon[298913]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 30] Generated table #54: 12914 keys, 17671473 bytes, temperature: kUnknown Dec 15 05:05:39 localhost ceph-mon[298913]: rocksdb: EVENT_LOG_v1 {"time_micros": 1765793139916097, "cf_name": "default", "job": 30, "event": "table_file_creation", "file_number": 54, "file_size": 17671473, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 17596176, "index_size": 41916, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 32325, "raw_key_size": 346837, "raw_average_key_size": 26, "raw_value_size": 17374452, "raw_average_value_size": 1345, "num_data_blocks": 1592, "num_entries": 12914, "num_filter_entries": 12914, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; 
strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1765792320, "oldest_key_time": 0, "file_creation_time": 1765793139, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "603b24af-e2be-4214-bc56-9e652eb4af3d", "db_session_id": "0OJRM9SCUA16EXV0VQZ2", "orig_file_number": 54, "seqno_to_time_mapping": "N/A"}} Dec 15 05:05:39 localhost ceph-mon[298913]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed. Dec 15 05:05:39 localhost ceph-mon[298913]: rocksdb: (Original Log Time 2025/12/15-10:05:39.916401) [db/compaction/compaction_job.cc:1663] [default] [JOB 30] Compacted 1@0 + 1@6 files to L6 => 17671473 bytes Dec 15 05:05:39 localhost ceph-mon[298913]: rocksdb: (Original Log Time 2025/12/15-10:05:39.918766) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 175.2 rd, 158.5 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(0.2, 18.4 +0.0 blob) out(16.9 +0.0 blob), read-write-amplify(192.1) write-amplify(91.2) OK, records in: 13439, records dropped: 525 output_compression: NoCompression Dec 15 05:05:39 localhost ceph-mon[298913]: rocksdb: (Original Log Time 2025/12/15-10:05:39.918805) EVENT_LOG_v1 {"time_micros": 1765793139918789, "job": 30, "event": "compaction_finished", "compaction_time_micros": 111459, "compaction_time_cpu_micros": 56156, "output_level": 6, "num_output_files": 1, "total_output_size": 17671473, "num_input_records": 13439, "num_output_records": 12914, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]} Dec 15 05:05:39 localhost ceph-mon[298913]: rocksdb: [file/delete_scheduler.cc:74] Deleted file 
/var/lib/ceph/mon/ceph-np0005559462/store.db/000053.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000 Dec 15 05:05:39 localhost ceph-mon[298913]: rocksdb: EVENT_LOG_v1 {"time_micros": 1765793139919029, "job": 30, "event": "table_file_deletion", "file_number": 53} Dec 15 05:05:39 localhost ceph-mon[298913]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005559462/store.db/000051.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000 Dec 15 05:05:39 localhost ceph-mon[298913]: rocksdb: EVENT_LOG_v1 {"time_micros": 1765793139922609, "job": 30, "event": "table_file_deletion", "file_number": 51} Dec 15 05:05:39 localhost ceph-mon[298913]: rocksdb: (Original Log Time 2025/12/15-10:05:39.804561) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting Dec 15 05:05:39 localhost ceph-mon[298913]: rocksdb: (Original Log Time 2025/12/15-10:05:39.922733) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting Dec 15 05:05:39 localhost ceph-mon[298913]: rocksdb: (Original Log Time 2025/12/15-10:05:39.922748) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting Dec 15 05:05:39 localhost ceph-mon[298913]: rocksdb: (Original Log Time 2025/12/15-10:05:39.922752) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting Dec 15 05:05:39 localhost ceph-mon[298913]: rocksdb: (Original Log Time 2025/12/15-10:05:39.922755) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting Dec 15 05:05:39 localhost ceph-mon[298913]: rocksdb: (Original Log Time 2025/12/15-10:05:39.922759) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting Dec 15 05:05:40 localhost nova_compute[286344]: 2025-12-15 10:05:40.308 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup 
/usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:05:40 localhost ceph-mon[298913]: mon.np0005559462@0(leader).osd e160 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Dec 15 05:05:40 localhost ceph-mon[298913]: mon.np0005559462@0(leader).osd e160 do_prune osdmap full prune enabled Dec 15 05:05:40 localhost ceph-mon[298913]: mon.np0005559462@0(leader).osd e161 e161: 6 total, 6 up, 6 in Dec 15 05:05:40 localhost ceph-mon[298913]: log_channel(cluster) log [DBG] : osdmap e161: 6 total, 6 up, 6 in Dec 15 05:05:41 localhost systemd[1]: Started /usr/bin/podman healthcheck run 67ee72fcd99c896f79e8807764b1d0f04e7d45927d06e306c0986394e8ed42a0. Dec 15 05:05:41 localhost systemd[1]: Started /usr/bin/podman healthcheck run 9f0f481c937606298203797a71924b7614a5d9e3f4a8afd837cfa5b9602aedeb. Dec 15 05:05:41 localhost systemd[1]: Started /usr/bin/podman healthcheck run b3d8483ade539500e228c2af9c0c3e647febb5ec7903610a83aea32631007d4a. 
Dec 15 05:05:41 localhost podman[321618]: 2025-12-15 10:05:41.744823233 +0000 UTC m=+0.070326401 container health_status 9f0f481c937606298203797a71924b7614a5d9e3f4a8afd837cfa5b9602aedeb (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, org.label-schema.vendor=CentOS, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, tcib_managed=true) Dec 15 05:05:41 localhost podman[321618]: 2025-12-15 10:05:41.753819148 +0000 UTC m=+0.079322336 container exec_died 
9f0f481c937606298203797a71924b7614a5d9e3f4a8afd837cfa5b9602aedeb (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, container_name=multipathd, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_id=multipathd, io.buildah.version=1.41.3, managed_by=edpm_ansible, tcib_managed=true) Dec 15 05:05:41 localhost systemd[1]: 9f0f481c937606298203797a71924b7614a5d9e3f4a8afd837cfa5b9602aedeb.service: Deactivated successfully. Dec 15 05:05:41 localhost systemd[1]: tmp-crun.s92yRY.mount: Deactivated successfully. 
Dec 15 05:05:41 localhost podman[321617]: 2025-12-15 10:05:41.806564598 +0000 UTC m=+0.133945473 container health_status 67ee72fcd99c896f79e8807764b1d0f04e7d45927d06e306c0986394e8ed42a0 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, container_name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter) Dec 15 05:05:41 localhost ceph-mon[298913]: mon.np0005559462@0(leader).osd e161 do_prune osdmap full prune enabled Dec 15 05:05:41 localhost ceph-mon[298913]: mon.np0005559462@0(leader).osd e162 e162: 6 total, 6 up, 6 in Dec 15 05:05:41 localhost ceph-mon[298913]: log_channel(cluster) log [DBG] : osdmap e162: 6 total, 6 up, 6 in Dec 15 05:05:41 localhost podman[321619]: 2025-12-15 10:05:41.870688242 +0000 UTC m=+0.191174724 container health_status 
b3d8483ade539500e228c2af9c0c3e647febb5ec7903610a83aea32631007d4a (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '07f3af15d79276418f54a8dde2fc251064527c3571430e14af77aaa2c01f1193-cb1e9478634a00c8fb5ad9cd7fcd38c7b2a1a85187dfaa618bc715c24c2b15fb'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, config_id=ceilometer_agent_compute, container_name=ceilometer_agent_compute, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb) Dec 15 05:05:41 localhost podman[321619]: 2025-12-15 10:05:41.88778761 +0000 UTC m=+0.208274112 
container exec_died b3d8483ade539500e228c2af9c0c3e647febb5ec7903610a83aea32631007d4a (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, container_name=ceilometer_agent_compute, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_managed=true, config_id=ceilometer_agent_compute, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.license=GPLv2, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '07f3af15d79276418f54a8dde2fc251064527c3571430e14af77aaa2c01f1193-cb1e9478634a00c8fb5ad9cd7fcd38c7b2a1a85187dfaa618bc715c24c2b15fb'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.build-date=20251202) Dec 15 05:05:41 localhost systemd[1]: 
b3d8483ade539500e228c2af9c0c3e647febb5ec7903610a83aea32631007d4a.service: Deactivated successfully. Dec 15 05:05:41 localhost podman[321617]: 2025-12-15 10:05:41.946304195 +0000 UTC m=+0.273685070 container exec_died 67ee72fcd99c896f79e8807764b1d0f04e7d45927d06e306c0986394e8ed42a0 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, managed_by=edpm_ansible, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors ) Dec 15 05:05:41 localhost systemd[1]: 67ee72fcd99c896f79e8807764b1d0f04e7d45927d06e306c0986394e8ed42a0.service: Deactivated successfully. 
Dec 15 05:05:42 localhost neutron_sriov_agent[260044]: 2025-12-15 10:05:42.579 2 INFO neutron.agent.securitygroups_rpc [None req-3bae8990-da46-428c-a770-03d8405ec5e5 6b5da6f221214afe93e1fa66574f238b 89e710ef9f4f48d48a369002db572947 - - default default] Security group member updated ['a6c5f808-dddc-4f17-acbf-63b1b6e6f4d6']#033[00m Dec 15 05:05:42 localhost ceph-mon[298913]: mon.np0005559462@0(leader).osd e162 do_prune osdmap full prune enabled Dec 15 05:05:42 localhost ceph-mon[298913]: mon.np0005559462@0(leader).osd e163 e163: 6 total, 6 up, 6 in Dec 15 05:05:42 localhost ceph-mon[298913]: log_channel(cluster) log [DBG] : osdmap e163: 6 total, 6 up, 6 in Dec 15 05:05:42 localhost nova_compute[286344]: 2025-12-15 10:05:42.967 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:05:43 localhost systemd[1]: Started /usr/bin/podman healthcheck run 730c50020a7d9e2da21d2462d40af20ac303e43f3491f692cbbd87f432ba3e09. Dec 15 05:05:43 localhost systemd[1]: Started /usr/bin/podman healthcheck run ac2eae30a2a7f971398af42ea67b3ed799d4b3341f7688ebcd73f7ca7f66c2a8. 
Dec 15 05:05:43 localhost neutron_dhcp_agent[267542]: 2025-12-15 10:05:43.155 267546 INFO neutron.agent.linux.ip_lib [None req-3102b364-e6f3-4f04-b771-5de03f3712d7 - - - - - -] Device tap6547980b-48 cannot be used as it has no MAC address#033[00m Dec 15 05:05:43 localhost nova_compute[286344]: 2025-12-15 10:05:43.180 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:05:43 localhost podman[321680]: 2025-12-15 10:05:43.180182731 +0000 UTC m=+0.082642559 container health_status ac2eae30a2a7f971398af42ea67b3ed799d4b3341f7688ebcd73f7ca7f66c2a8 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251202, tcib_managed=true, config_id=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '07f3af15d79276418f54a8dde2fc251064527c3571430e14af77aaa2c01f1193'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=ovn_controller, org.label-schema.license=GPLv2, 
io.buildah.version=1.41.3) Dec 15 05:05:43 localhost kernel: device tap6547980b-48 entered promiscuous mode Dec 15 05:05:43 localhost systemd-udevd[321714]: Network interface NamePolicy= disabled on kernel command line. Dec 15 05:05:43 localhost NetworkManager[5963]: [1765793143.1918] manager: (tap6547980b-48): new Generic device (/org/freedesktop/NetworkManager/Devices/42) Dec 15 05:05:43 localhost ovn_controller[154603]: 2025-12-15T10:05:43Z|00260|binding|INFO|Claiming lport 6547980b-4840-441f-9103-a1bc4b8a8420 for this chassis. Dec 15 05:05:43 localhost ovn_controller[154603]: 2025-12-15T10:05:43Z|00261|binding|INFO|6547980b-4840-441f-9103-a1bc4b8a8420: Claiming unknown Dec 15 05:05:43 localhost nova_compute[286344]: 2025-12-15 10:05:43.195 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:05:43 localhost ovn_metadata_agent[160585]: 2025-12-15 10:05:43.215 160590 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005559462.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8::2/64', 'neutron:device_id': 'dhcpce287b4b-a043-5ed9-8e08-4fd555d768a2-c0669abd-aef1-4b0d-9f97-a6adeeac3211', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-c0669abd-aef1-4b0d-9f97-a6adeeac3211', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '89e710ef9f4f48d48a369002db572947', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], 
additional_encap=[], encap=[], mirror_rules=[], datapath=e47821dc-5f5d-44dc-8a16-54817df4049d, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=6547980b-4840-441f-9103-a1bc4b8a8420) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Dec 15 05:05:43 localhost ovn_metadata_agent[160585]: 2025-12-15 10:05:43.217 160590 INFO neutron.agent.ovn.metadata.agent [-] Port 6547980b-4840-441f-9103-a1bc4b8a8420 in datapath c0669abd-aef1-4b0d-9f97-a6adeeac3211 bound to our chassis#033[00m Dec 15 05:05:43 localhost ovn_metadata_agent[160585]: 2025-12-15 10:05:43.219 160590 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network c0669abd-aef1-4b0d-9f97-a6adeeac3211 or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599#033[00m Dec 15 05:05:43 localhost ovn_metadata_agent[160585]: 2025-12-15 10:05:43.220 160858 DEBUG oslo.privsep.daemon [-] privsep: reply[a35247ad-4459-4414-94c6-1936bedc4697]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Dec 15 05:05:43 localhost nova_compute[286344]: 2025-12-15 10:05:43.236 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:05:43 localhost ovn_controller[154603]: 2025-12-15T10:05:43Z|00262|binding|INFO|Setting lport 6547980b-4840-441f-9103-a1bc4b8a8420 ovn-installed in OVS Dec 15 05:05:43 localhost ovn_controller[154603]: 2025-12-15T10:05:43Z|00263|binding|INFO|Setting lport 6547980b-4840-441f-9103-a1bc4b8a8420 up in Southbound Dec 15 05:05:43 localhost nova_compute[286344]: 2025-12-15 10:05:43.244 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:05:43 localhost 
podman[321679]: 2025-12-15 10:05:43.245026044 +0000 UTC m=+0.152264842 container health_status 730c50020a7d9e2da21d2462d40af20ac303e43f3491f692cbbd87f432ba3e09 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, vcs-type=git, name=ubi9-minimal, url=https://catalog.redhat.com/en/search?searchType=containers, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vendor=Red Hat, Inc., io.openshift.expose-services=, config_id=openstack_network_exporter, version=9.6, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'cb1e9478634a00c8fb5ad9cd7fcd38c7b2a1a85187dfaa618bc715c24c2b15fb'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, distribution-scope=public, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, release=1755695350, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=Red Hat, Inc., architecture=x86_64, build-date=2025-08-20T13:12:41, container_name=openstack_network_exporter, managed_by=edpm_ansible, com.redhat.component=ubi9-minimal-container, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.tags=minimal rhel9) Dec 15 05:05:43 localhost podman[321680]: 2025-12-15 10:05:43.260017149 +0000 UTC m=+0.162476967 container exec_died ac2eae30a2a7f971398af42ea67b3ed799d4b3341f7688ebcd73f7ca7f66c2a8 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '07f3af15d79276418f54a8dde2fc251064527c3571430e14af77aaa2c01f1193'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', 
'/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true) Dec 15 05:05:43 localhost nova_compute[286344]: 2025-12-15 10:05:43.272 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:05:43 localhost systemd[1]: ac2eae30a2a7f971398af42ea67b3ed799d4b3341f7688ebcd73f7ca7f66c2a8.service: Deactivated successfully. Dec 15 05:05:43 localhost podman[321679]: 2025-12-15 10:05:43.28442348 +0000 UTC m=+0.191662308 container exec_died 730c50020a7d9e2da21d2462d40af20ac303e43f3491f692cbbd87f432ba3e09 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=openstack_network_exporter, release=1755695350, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, name=ubi9-minimal, com.redhat.component=ubi9-minimal-container, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. 
This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., managed_by=edpm_ansible, distribution-scope=public, architecture=x86_64, version=9.6, url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.tags=minimal rhel9, vcs-type=git, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.expose-services=, build-date=2025-08-20T13:12:41, config_id=openstack_network_exporter, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'cb1e9478634a00c8fb5ad9cd7fcd38c7b2a1a85187dfaa618bc715c24c2b15fb'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}) Dec 15 05:05:43 localhost systemd[1]: 730c50020a7d9e2da21d2462d40af20ac303e43f3491f692cbbd87f432ba3e09.service: Deactivated successfully. 
Dec 15 05:05:43 localhost nova_compute[286344]: 2025-12-15 10:05:43.307 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:05:43 localhost neutron_sriov_agent[260044]: 2025-12-15 10:05:43.532 2 INFO neutron.agent.securitygroups_rpc [None req-3ac4e6c6-78fe-4875-8963-f44f718b42d5 6b5da6f221214afe93e1fa66574f238b 89e710ef9f4f48d48a369002db572947 - - default default] Security group member updated ['a6c5f808-dddc-4f17-acbf-63b1b6e6f4d6']#033[00m Dec 15 05:05:43 localhost ceph-mon[298913]: mon.np0005559462@0(leader).osd e163 do_prune osdmap full prune enabled Dec 15 05:05:43 localhost ceph-mon[298913]: mon.np0005559462@0(leader).osd e164 e164: 6 total, 6 up, 6 in Dec 15 05:05:43 localhost ceph-mon[298913]: log_channel(cluster) log [DBG] : osdmap e164: 6 total, 6 up, 6 in Dec 15 05:05:44 localhost podman[321784]: Dec 15 05:05:44 localhost podman[321784]: 2025-12-15 10:05:44.092954792 +0000 UTC m=+0.088532226 container create 954b6de5ed4e5e1a969ec9a8a59336ca106798f2174b74b22099089db8f8318e (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-c0669abd-aef1-4b0d-9f97-a6adeeac3211, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.schema-version=1.0) Dec 15 05:05:44 localhost systemd[1]: Started libpod-conmon-954b6de5ed4e5e1a969ec9a8a59336ca106798f2174b74b22099089db8f8318e.scope. Dec 15 05:05:44 localhost systemd[1]: Started libcrun container. 
Dec 15 05:05:44 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/8394380d994f89ec2f1fded3ba5094cdf2b9c57487bf959febfdff7675ece089/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Dec 15 05:05:44 localhost podman[321784]: 2025-12-15 10:05:44.052255004 +0000 UTC m=+0.047832498 image pull quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified
Dec 15 05:05:44 localhost podman[321784]: 2025-12-15 10:05:44.157183979 +0000 UTC m=+0.152761423 container init 954b6de5ed4e5e1a969ec9a8a59336ca106798f2174b74b22099089db8f8318e (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-c0669abd-aef1-4b0d-9f97-a6adeeac3211, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_managed=true, io.buildah.version=1.41.3, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.vendor=CentOS)
Dec 15 05:05:44 localhost podman[321784]: 2025-12-15 10:05:44.164685337 +0000 UTC m=+0.160262781 container start 954b6de5ed4e5e1a969ec9a8a59336ca106798f2174b74b22099089db8f8318e (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-c0669abd-aef1-4b0d-9f97-a6adeeac3211, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, tcib_managed=true, io.buildah.version=1.41.3, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team)
Dec 15 05:05:44 localhost dnsmasq[321802]: started, version 2.85 cachesize 150
Dec 15 05:05:44 localhost dnsmasq[321802]: DNS service limited to local subnets
Dec 15 05:05:44 localhost dnsmasq[321802]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile
Dec 15 05:05:44 localhost dnsmasq[321802]: warning: no upstream servers configured
Dec 15 05:05:44 localhost dnsmasq-dhcp[321802]: DHCPv6, static leases only on 2001:db8::, lease time 1d
Dec 15 05:05:44 localhost dnsmasq[321802]: read /var/lib/neutron/dhcp/c0669abd-aef1-4b0d-9f97-a6adeeac3211/addn_hosts - 0 addresses
Dec 15 05:05:44 localhost dnsmasq-dhcp[321802]: read /var/lib/neutron/dhcp/c0669abd-aef1-4b0d-9f97-a6adeeac3211/host
Dec 15 05:05:44 localhost dnsmasq-dhcp[321802]: read /var/lib/neutron/dhcp/c0669abd-aef1-4b0d-9f97-a6adeeac3211/opts
Dec 15 05:05:44 localhost neutron_dhcp_agent[267542]: 2025-12-15 10:05:44.227 267546 INFO neutron.agent.dhcp.agent [None req-3102b364-e6f3-4f04-b771-5de03f3712d7 - - - - - -] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-12-15T10:05:42Z, description=, device_id=, device_owner=, dns_assignment=[], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[], id=7ffbdf39-9f72-4919-a69d-31df566779e9, ip_allocation=immediate, mac_address=fa:16:3e:aa:3f:4e, name=tempest-NetworksTestDHCPv6-1099678426, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-12-15T10:05:39Z, description=, dns_domain=, id=c0669abd-aef1-4b0d-9f97-a6adeeac3211, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-NetworksTestDHCPv6-test-network-1861207414, port_security_enabled=True, project_id=89e710ef9f4f48d48a369002db572947, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=20673, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=2164, status=ACTIVE, subnets=['947396ef-41ff-47dc-b258-3df58005f947'], tags=[], tenant_id=89e710ef9f4f48d48a369002db572947, updated_at=2025-12-15T10:05:41Z, vlan_transparent=None, network_id=c0669abd-aef1-4b0d-9f97-a6adeeac3211, port_security_enabled=True, project_id=89e710ef9f4f48d48a369002db572947, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=['a6c5f808-dddc-4f17-acbf-63b1b6e6f4d6'], standard_attr_id=2179, status=DOWN, tags=[], tenant_id=89e710ef9f4f48d48a369002db572947, updated_at=2025-12-15T10:05:42Z on network c0669abd-aef1-4b0d-9f97-a6adeeac3211
Dec 15 05:05:44 localhost neutron_dhcp_agent[267542]: 2025-12-15 10:05:44.310 267546 INFO neutron.agent.dhcp.agent [None req-4694a4fa-d72b-4c06-bd5d-4c6dd0916685 - - - - - -] DHCP configuration for ports {'79503367-f53f-4b35-8760-76fcaa4d8407'} is completed
Dec 15 05:05:44 localhost dnsmasq[321802]: read /var/lib/neutron/dhcp/c0669abd-aef1-4b0d-9f97-a6adeeac3211/addn_hosts - 1 addresses
Dec 15 05:05:44 localhost dnsmasq-dhcp[321802]: read /var/lib/neutron/dhcp/c0669abd-aef1-4b0d-9f97-a6adeeac3211/host
Dec 15 05:05:44 localhost podman[321821]: 2025-12-15 10:05:44.410917179 +0000 UTC m=+0.056175107 container kill 954b6de5ed4e5e1a969ec9a8a59336ca106798f2174b74b22099089db8f8318e (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-c0669abd-aef1-4b0d-9f97-a6adeeac3211, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Dec 15 05:05:44 localhost dnsmasq-dhcp[321802]: read /var/lib/neutron/dhcp/c0669abd-aef1-4b0d-9f97-a6adeeac3211/opts
Dec 15 05:05:44 localhost neutron_dhcp_agent[267542]: 2025-12-15 10:05:44.626 267546 INFO neutron.agent.dhcp.agent [None req-19d0ef13-2428-4460-a41c-7e4032f97df7 - - - - - -] DHCP configuration for ports {'7ffbdf39-9f72-4919-a69d-31df566779e9'} is completed
Dec 15 05:05:44 localhost dnsmasq[321802]: read /var/lib/neutron/dhcp/c0669abd-aef1-4b0d-9f97-a6adeeac3211/addn_hosts - 0 addresses
Dec 15 05:05:44 localhost dnsmasq-dhcp[321802]: read /var/lib/neutron/dhcp/c0669abd-aef1-4b0d-9f97-a6adeeac3211/host
Dec 15 05:05:44 localhost podman[321859]: 2025-12-15 10:05:44.725165883 +0000 UTC m=+0.054443003 container kill 954b6de5ed4e5e1a969ec9a8a59336ca106798f2174b74b22099089db8f8318e (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-c0669abd-aef1-4b0d-9f97-a6adeeac3211, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3)
Dec 15 05:05:44 localhost dnsmasq-dhcp[321802]: read /var/lib/neutron/dhcp/c0669abd-aef1-4b0d-9f97-a6adeeac3211/opts
Dec 15 05:05:45 localhost dnsmasq[321802]: exiting on receipt of SIGTERM
Dec 15 05:05:45 localhost podman[321896]: 2025-12-15 10:05:45.15502256 +0000 UTC m=+0.062394693 container kill 954b6de5ed4e5e1a969ec9a8a59336ca106798f2174b74b22099089db8f8318e (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-c0669abd-aef1-4b0d-9f97-a6adeeac3211, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202)
Dec 15 05:05:45 localhost systemd[1]: libpod-954b6de5ed4e5e1a969ec9a8a59336ca106798f2174b74b22099089db8f8318e.scope: Deactivated successfully.
Dec 15 05:05:45 localhost podman[321909]: 2025-12-15 10:05:45.228498428 +0000 UTC m=+0.061318165 container died 954b6de5ed4e5e1a969ec9a8a59336ca106798f2174b74b22099089db8f8318e (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-c0669abd-aef1-4b0d-9f97-a6adeeac3211, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true)
Dec 15 05:05:45 localhost podman[321909]: 2025-12-15 10:05:45.261456563 +0000 UTC m=+0.094276270 container cleanup 954b6de5ed4e5e1a969ec9a8a59336ca106798f2174b74b22099089db8f8318e (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-c0669abd-aef1-4b0d-9f97-a6adeeac3211, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Dec 15 05:05:45 localhost systemd[1]: libpod-conmon-954b6de5ed4e5e1a969ec9a8a59336ca106798f2174b74b22099089db8f8318e.scope: Deactivated successfully.
Dec 15 05:05:45 localhost podman[321912]: 2025-12-15 10:05:45.307023403 +0000 UTC m=+0.127795309 container remove 954b6de5ed4e5e1a969ec9a8a59336ca106798f2174b74b22099089db8f8318e (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-c0669abd-aef1-4b0d-9f97-a6adeeac3211, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251202, tcib_managed=true, io.buildah.version=1.41.3, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team)
Dec 15 05:05:45 localhost ovn_controller[154603]: 2025-12-15T10:05:45Z|00264|binding|INFO|Releasing lport 6547980b-4840-441f-9103-a1bc4b8a8420 from this chassis (sb_readonly=0)
Dec 15 05:05:45 localhost ovn_controller[154603]: 2025-12-15T10:05:45Z|00265|binding|INFO|Setting lport 6547980b-4840-441f-9103-a1bc4b8a8420 down in Southbound
Dec 15 05:05:45 localhost nova_compute[286344]: 2025-12-15 10:05:45.352 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 15 05:05:45 localhost kernel: device tap6547980b-48 left promiscuous mode
Dec 15 05:05:45 localhost nova_compute[286344]: 2025-12-15 10:05:45.354 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Dec 15 05:05:45 localhost ovn_metadata_agent[160585]: 2025-12-15 10:05:45.358 160590 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005559462.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8::2/64', 'neutron:device_id': 'dhcpce287b4b-a043-5ed9-8e08-4fd555d768a2-c0669abd-aef1-4b0d-9f97-a6adeeac3211', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-c0669abd-aef1-4b0d-9f97-a6adeeac3211', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '89e710ef9f4f48d48a369002db572947', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'np0005559462.localdomain'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=e47821dc-5f5d-44dc-8a16-54817df4049d, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=6547980b-4840-441f-9103-a1bc4b8a8420) old=Port_Binding(up=[True], chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 15 05:05:45 localhost ovn_metadata_agent[160585]: 2025-12-15 10:05:45.360 160590 INFO neutron.agent.ovn.metadata.agent [-] Port 6547980b-4840-441f-9103-a1bc4b8a8420 in datapath c0669abd-aef1-4b0d-9f97-a6adeeac3211 unbound from our chassis
Dec 15 05:05:45 localhost ovn_metadata_agent[160585]: 2025-12-15 10:05:45.361 160590 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network c0669abd-aef1-4b0d-9f97-a6adeeac3211 or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599
Dec 15 05:05:45 localhost ovn_metadata_agent[160585]: 2025-12-15 10:05:45.362 160858 DEBUG oslo.privsep.daemon [-] privsep: reply[afde818e-4160-4f38-98d0-56652d05f0b7]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 15 05:05:45 localhost nova_compute[286344]: 2025-12-15 10:05:45.372 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 15 05:05:45 localhost nova_compute[286344]: 2025-12-15 10:05:45.373 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 15 05:05:45 localhost ceph-mon[298913]: mon.np0005559462@0(leader).osd e164 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 15 05:05:45 localhost ceph-mon[298913]: mon.np0005559462@0(leader).osd e164 do_prune osdmap full prune enabled
Dec 15 05:05:45 localhost ceph-mon[298913]: mon.np0005559462@0(leader).osd e165 e165: 6 total, 6 up, 6 in
Dec 15 05:05:45 localhost ceph-mon[298913]: log_channel(cluster) log [DBG] : osdmap e165: 6 total, 6 up, 6 in
Dec 15 05:05:46 localhost systemd[1]: var-lib-containers-storage-overlay-8394380d994f89ec2f1fded3ba5094cdf2b9c57487bf959febfdff7675ece089-merged.mount: Deactivated successfully.
Dec 15 05:05:46 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-954b6de5ed4e5e1a969ec9a8a59336ca106798f2174b74b22099089db8f8318e-userdata-shm.mount: Deactivated successfully.
Dec 15 05:05:46 localhost neutron_sriov_agent[260044]: 2025-12-15 10:05:46.464 2 INFO neutron.agent.securitygroups_rpc [None req-f2b71bb6-f8c5-4790-bc61-5da005c4f841 6b5da6f221214afe93e1fa66574f238b 89e710ef9f4f48d48a369002db572947 - - default default] Security group member updated ['a6c5f808-dddc-4f17-acbf-63b1b6e6f4d6']
Dec 15 05:05:46 localhost systemd[1]: run-netns-qdhcp\x2dc0669abd\x2daef1\x2d4b0d\x2d9f97\x2da6adeeac3211.mount: Deactivated successfully.
Dec 15 05:05:46 localhost ceph-mon[298913]: mon.np0005559462@0(leader) e17 handle_command mon_command({"prefix":"df", "format":"json"} v 0)
Dec 15 05:05:46 localhost ceph-mon[298913]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/610462930' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Dec 15 05:05:46 localhost ceph-mon[298913]: mon.np0005559462@0(leader) e17 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0)
Dec 15 05:05:46 localhost ceph-mon[298913]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/610462930' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Dec 15 05:05:47 localhost nova_compute[286344]: 2025-12-15 10:05:47.333 286348 DEBUG oslo_service.periodic_task [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Running periodic task ComputeManager._sync_power_states run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 15 05:05:47 localhost nova_compute[286344]: 2025-12-15 10:05:47.357 286348 DEBUG nova.compute.manager [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Triggering sync for uuid 39ff1bd9-6f6b-44c8-bbec-a1fd9d196359 _sync_power_states /usr/lib/python3.9/site-packages/nova/compute/manager.py:10268
Dec 15 05:05:47 localhost nova_compute[286344]: 2025-12-15 10:05:47.357 286348 DEBUG oslo_concurrency.lockutils [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Acquiring lock "39ff1bd9-6f6b-44c8-bbec-a1fd9d196359" by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 15 05:05:47 localhost nova_compute[286344]: 2025-12-15 10:05:47.358 286348 DEBUG oslo_concurrency.lockutils [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Lock "39ff1bd9-6f6b-44c8-bbec-a1fd9d196359" acquired by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 15 05:05:47 localhost neutron_dhcp_agent[267542]: 2025-12-15 10:05:47.379 267546 INFO neutron.agent.linux.ip_lib [None req-508b206a-20e6-491e-8a61-aebbc2f9ee0a - - - - - -] Device tap1657c15f-3f cannot be used as it has no MAC address
Dec 15 05:05:47 localhost nova_compute[286344]: 2025-12-15 10:05:47.384 286348 DEBUG oslo_concurrency.lockutils [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Lock "39ff1bd9-6f6b-44c8-bbec-a1fd9d196359" "released" by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" :: held 0.027s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 15 05:05:47 localhost nova_compute[286344]: 2025-12-15 10:05:47.400 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 15 05:05:47 localhost kernel: device tap1657c15f-3f entered promiscuous mode
Dec 15 05:05:47 localhost nova_compute[286344]: 2025-12-15 10:05:47.407 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 15 05:05:47 localhost NetworkManager[5963]: [1765793147.4079] manager: (tap1657c15f-3f): new Generic device (/org/freedesktop/NetworkManager/Devices/43)
Dec 15 05:05:47 localhost ovn_controller[154603]: 2025-12-15T10:05:47Z|00266|binding|INFO|Claiming lport 1657c15f-3ffc-4b16-9c97-99677e468aee for this chassis.
Dec 15 05:05:47 localhost ovn_controller[154603]: 2025-12-15T10:05:47Z|00267|binding|INFO|1657c15f-3ffc-4b16-9c97-99677e468aee: Claiming unknown
Dec 15 05:05:47 localhost systemd-udevd[321949]: Network interface NamePolicy= disabled on kernel command line.
Dec 15 05:05:47 localhost ovn_controller[154603]: 2025-12-15T10:05:47Z|00268|binding|INFO|Setting lport 1657c15f-3ffc-4b16-9c97-99677e468aee ovn-installed in OVS
Dec 15 05:05:47 localhost nova_compute[286344]: 2025-12-15 10:05:47.418 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 15 05:05:47 localhost ovn_controller[154603]: 2025-12-15T10:05:47Z|00269|binding|INFO|Setting lport 1657c15f-3ffc-4b16-9c97-99677e468aee up in Southbound
Dec 15 05:05:47 localhost ovn_metadata_agent[160585]: 2025-12-15 10:05:47.422 160590 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005559462.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8::2/64', 'neutron:device_id': 'dhcpce287b4b-a043-5ed9-8e08-4fd555d768a2-c0669abd-aef1-4b0d-9f97-a6adeeac3211', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-c0669abd-aef1-4b0d-9f97-a6adeeac3211', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '89e710ef9f4f48d48a369002db572947', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=e47821dc-5f5d-44dc-8a16-54817df4049d, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[], logical_port=1657c15f-3ffc-4b16-9c97-99677e468aee) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 15 05:05:47 localhost ovn_metadata_agent[160585]: 2025-12-15 10:05:47.426 160590 INFO neutron.agent.ovn.metadata.agent [-] Port 1657c15f-3ffc-4b16-9c97-99677e468aee in datapath c0669abd-aef1-4b0d-9f97-a6adeeac3211 bound to our chassis
Dec 15 05:05:47 localhost nova_compute[286344]: 2025-12-15 10:05:47.427 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 15 05:05:47 localhost ovn_metadata_agent[160585]: 2025-12-15 10:05:47.429 160590 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network c0669abd-aef1-4b0d-9f97-a6adeeac3211 or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599
Dec 15 05:05:47 localhost ovn_metadata_agent[160585]: 2025-12-15 10:05:47.430 160858 DEBUG oslo.privsep.daemon [-] privsep: reply[c86bb3b4-98b5-4c50-a870-7ab60b45a75c]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 15 05:05:47 localhost nova_compute[286344]: 2025-12-15 10:05:47.448 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 15 05:05:47 localhost neutron_sriov_agent[260044]: 2025-12-15 10:05:47.480 2 INFO neutron.agent.securitygroups_rpc [None req-8a3cb91f-c2f1-4bcf-85e3-917ee0f7df56 6b5da6f221214afe93e1fa66574f238b 89e710ef9f4f48d48a369002db572947 - - default default] Security group member updated ['a6c5f808-dddc-4f17-acbf-63b1b6e6f4d6']
Dec 15 05:05:47 localhost nova_compute[286344]: 2025-12-15 10:05:47.524 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 15 05:05:47 localhost nova_compute[286344]: 2025-12-15 10:05:47.525 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 15 05:05:47 localhost neutron_sriov_agent[260044]: 2025-12-15 10:05:47.607 2 INFO neutron.agent.securitygroups_rpc [None req-42045b0b-5b6e-42ce-9b33-7566e35391a3 52899874f6ba4276802009964d723e15 dcbc6ff58fef4432a793909041cfb08b - - default default] Security group rule updated ['407f10ca-d755-4259-a56f-131d5cdc9e80']
Dec 15 05:05:47 localhost ceph-mon[298913]: mon.np0005559462@0(leader) e17 handle_command mon_command({"prefix":"df", "format":"json"} v 0)
Dec 15 05:05:47 localhost ceph-mon[298913]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/1406175512' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Dec 15 05:05:47 localhost ceph-mon[298913]: mon.np0005559462@0(leader) e17 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0)
Dec 15 05:05:47 localhost ceph-mon[298913]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/1406175512' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Dec 15 05:05:47 localhost ceph-mon[298913]: mon.np0005559462@0(leader) e17 handle_command mon_command({"prefix":"df", "format":"json"} v 0)
Dec 15 05:05:47 localhost ceph-mon[298913]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/3487660145' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Dec 15 05:05:47 localhost ceph-mon[298913]: mon.np0005559462@0(leader) e17 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0)
Dec 15 05:05:47 localhost ceph-mon[298913]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/3487660145' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Dec 15 05:05:47 localhost neutron_sriov_agent[260044]: 2025-12-15 10:05:47.944 2 INFO neutron.agent.securitygroups_rpc [None req-6fa1be55-10df-4528-b95e-d2aa7736afcb 52899874f6ba4276802009964d723e15 dcbc6ff58fef4432a793909041cfb08b - - default default] Security group rule updated ['407f10ca-d755-4259-a56f-131d5cdc9e80']
Dec 15 05:05:47 localhost nova_compute[286344]: 2025-12-15 10:05:47.969 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.122 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359', 'name': 'test', 'flavor': {'id': '2da0e147-aaa7-4bb9-a176-5fe1b15a32a0', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'image': {'id': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-00000002', 'OS-EXT-SRV-ATTR:host': 'np0005559462.localdomain', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': 'c785bf23f53946bc99867d8832a50266', 'user_id': '1ba5fce347b64bfebf995f187193f205', 'hostId': 'bf4d507aa00b3fcf643a7e0b883420632ac7fc96e8c7a756982e121d', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228
Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.123 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.error in the context of pollsters
Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.125 12 DEBUG ceilometer.compute.pollsters [-] 39ff1bd9-6f6b-44c8-bbec-a1fd9d196359/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.126 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'de129288-abdd-4829-903b-41973d1517e6', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '1ba5fce347b64bfebf995f187193f205', 'user_name': None, 'project_id': 'c785bf23f53946bc99867d8832a50266', 'project_name': None, 'resource_id': 'instance-00000002-39ff1bd9-6f6b-44c8-bbec-a1fd9d196359-tap03ef8889-32', 'timestamp': '2025-12-15T10:05:48.123649', 'resource_metadata': {'display_name': 'test', 'name': 'tap03ef8889-32', 'instance_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359', 'instance_type': 'm1.small', 'host': 'bf4d507aa00b3fcf643a7e0b883420632ac7fc96e8c7a756982e121d', 'instance_host': 'np0005559462.localdomain', 'flavor': {'id': '2da0e147-aaa7-4bb9-a176-5fe1b15a32a0', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e'}, 'image_ref': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:74:df:7c', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap03ef8889-32'}, 'message_id': 'a0e9d644-d99d-11f0-817e-fa163ebaca0f', 'monotonic_time': 12214.316293577, 'message_signature': '6b71b8ffe63c9dd38e09ed7ce85ec5ecda0ba2b05d008a7e8a738e802913396a'}]}, 'timestamp': '2025-12-15 10:05:48.125945', '_unique_id': 'd0ebef97ac4f445baf7a29b754c09ed3'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.126 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.126 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.126 12 ERROR oslo_messaging.notify.messaging     yield
Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.126 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.126 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.126 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.126 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.126 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.126 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.126 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.126 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.126 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.126 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.126 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.126 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.126 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.126 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.126 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.126 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.126 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.126 12 ERROR oslo_messaging.notify.messaging
Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.126 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.126 12 ERROR oslo_messaging.notify.messaging
Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.126 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.126 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.126 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.126 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.126 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.126 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.126 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.126 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.126 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.126 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.126 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.126 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.126 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.126 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.126 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.126 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.126 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.126 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.126 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.126 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.126 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.126 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.126 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.126 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.126 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.126 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.126 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.126 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.126 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 15 05:05:48 localhost
ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.126 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.126 12 ERROR oslo_messaging.notify.messaging Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.127 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no new resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193 Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.127 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets in the context of pollsters Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.127 12 DEBUG ceilometer.compute.pollsters [-] 39ff1bd9-6f6b-44c8-bbec-a1fd9d196359/network.incoming.packets volume: 60 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.128 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': 'eef80161-b33b-44ee-b556-3ae7163715ba', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 60, 'user_id': '1ba5fce347b64bfebf995f187193f205', 'user_name': None, 'project_id': 'c785bf23f53946bc99867d8832a50266', 'project_name': None, 'resource_id': 'instance-00000002-39ff1bd9-6f6b-44c8-bbec-a1fd9d196359-tap03ef8889-32', 'timestamp': '2025-12-15T10:05:48.127948', 'resource_metadata': {'display_name': 'test', 'name': 'tap03ef8889-32', 'instance_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359', 'instance_type': 'm1.small', 'host': 'bf4d507aa00b3fcf643a7e0b883420632ac7fc96e8c7a756982e121d', 'instance_host': 'np0005559462.localdomain', 'flavor': {'id': '2da0e147-aaa7-4bb9-a176-5fe1b15a32a0', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e'}, 'image_ref': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:74:df:7c', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap03ef8889-32'}, 'message_id': 'a0ea32ba-d99d-11f0-817e-fa163ebaca0f', 'monotonic_time': 12214.316293577, 'message_signature': 'e4008ea0c4823b0828eb39c9ee1cef19fe6eb2f9bc51c7b691712ee0c0b8391d'}]}, 'timestamp': '2025-12-15 10:05:48.128269', '_unique_id': '1af08c4297534bd5ac3e66e61f47cd3f'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.128 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 15 05:05:48 localhost 
ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.128 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.128 12 ERROR oslo_messaging.notify.messaging yield Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.128 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.128 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.128 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.128 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.128 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.128 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.128 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.128 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.128 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.128 12 ERROR oslo_messaging.notify.messaging conn.connect() Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.128 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.128 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.128 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.128 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.128 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.128 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.128 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.128 12 ERROR oslo_messaging.notify.messaging Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.128 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.128 12 ERROR oslo_messaging.notify.messaging Dec 15 05:05:48 localhost 
ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.128 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.128 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.128 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.128 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.128 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.128 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.128 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.128 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.128 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.128 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection 
Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.128 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.128 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.128 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.128 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.128 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.128 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.128 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.128 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.128 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.128 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Dec 15 05:05:48 
localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.128 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.128 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.128 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.128 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.128 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.128 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.128 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.128 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.128 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.128 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.128 12 ERROR oslo_messaging.notify.messaging Dec 15 05:05:48 localhost 
ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.129 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.latency in the context of pollsters Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.154 12 DEBUG ceilometer.compute.pollsters [-] 39ff1bd9-6f6b-44c8-bbec-a1fd9d196359/disk.device.write.latency volume: 1243487016 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.155 12 DEBUG ceilometer.compute.pollsters [-] 39ff1bd9-6f6b-44c8-bbec-a1fd9d196359/disk.device.write.latency volume: 24779175 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.157 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '24006c10-b358-4c8b-98da-41a8f4f13159', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 1243487016, 'user_id': '1ba5fce347b64bfebf995f187193f205', 'user_name': None, 'project_id': 'c785bf23f53946bc99867d8832a50266', 'project_name': None, 'resource_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359-vda', 'timestamp': '2025-12-15T10:05:48.129585', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359', 'instance_type': 'm1.small', 'host': 'bf4d507aa00b3fcf643a7e0b883420632ac7fc96e8c7a756982e121d', 'instance_host': 'np0005559462.localdomain', 'flavor': {'id': '2da0e147-aaa7-4bb9-a176-5fe1b15a32a0', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': 
{'id': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e'}, 'image_ref': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'a0ee52a0-d99d-11f0-817e-fa163ebaca0f', 'monotonic_time': 12214.322229526, 'message_signature': '1e844b4497bfd40731212b57cc84f8d25f38e954c0aaa6ff3671784803261f61'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 24779175, 'user_id': '1ba5fce347b64bfebf995f187193f205', 'user_name': None, 'project_id': 'c785bf23f53946bc99867d8832a50266', 'project_name': None, 'resource_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359-vdb', 'timestamp': '2025-12-15T10:05:48.129585', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359', 'instance_type': 'm1.small', 'host': 'bf4d507aa00b3fcf643a7e0b883420632ac7fc96e8c7a756982e121d', 'instance_host': 'np0005559462.localdomain', 'flavor': {'id': '2da0e147-aaa7-4bb9-a176-5fe1b15a32a0', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e'}, 'image_ref': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'a0ee65a6-d99d-11f0-817e-fa163ebaca0f', 'monotonic_time': 12214.322229526, 'message_signature': 'd1003de98978806a35425f57967c577725c2c92e0defb072361685e2a6b33f88'}]}, 'timestamp': '2025-12-15 10:05:48.155898', '_unique_id': 'd1a5812b409945829844a181696d690f'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 
10:05:48.157 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.157 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.157 12 ERROR oslo_messaging.notify.messaging yield Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.157 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.157 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.157 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.157 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.157 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.157 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.157 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.157 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Dec 15 05:05:48 localhost 
ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.157 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.157 12 ERROR oslo_messaging.notify.messaging conn.connect() Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.157 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.157 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.157 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.157 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.157 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.157 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.157 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.157 12 ERROR oslo_messaging.notify.messaging Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.157 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 
2025-12-15 10:05:48.157 12 ERROR oslo_messaging.notify.messaging Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.157 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.157 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.157 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.157 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.157 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.157 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.157 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.157 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.157 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.157 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.157 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.157 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.157 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.157 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.157 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.157 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.157 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.157 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.157 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.157 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.157 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.157 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.157 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.157 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.157 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.157 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.157 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.157 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.157 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.157 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.157 12 ERROR oslo_messaging.notify.messaging
Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.158 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.capacity in the context of pollsters
Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.167 12 DEBUG ceilometer.compute.pollsters [-] 39ff1bd9-6f6b-44c8-bbec-a1fd9d196359/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.168 12 DEBUG ceilometer.compute.pollsters [-] 39ff1bd9-6f6b-44c8-bbec-a1fd9d196359/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.170 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '5b924309-13b6-47e3-9efa-b3bd681bfefc', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '1ba5fce347b64bfebf995f187193f205', 'user_name': None, 'project_id': 'c785bf23f53946bc99867d8832a50266', 'project_name': None, 'resource_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359-vda', 'timestamp': '2025-12-15T10:05:48.158729', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359', 'instance_type': 'm1.small', 'host': 'bf4d507aa00b3fcf643a7e0b883420632ac7fc96e8c7a756982e121d', 'instance_host': 'np0005559462.localdomain', 'flavor': {'id': '2da0e147-aaa7-4bb9-a176-5fe1b15a32a0', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e'}, 'image_ref': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'a0f051fe-d99d-11f0-817e-fa163ebaca0f', 'monotonic_time': 12214.351409616, 'message_signature': 'c12a6d2f5e5e9219ff66c988cf054b32e070de370086b89e878400787f8b2de1'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '1ba5fce347b64bfebf995f187193f205', 'user_name': None, 'project_id': 'c785bf23f53946bc99867d8832a50266', 'project_name': None, 'resource_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359-vdb', 'timestamp': '2025-12-15T10:05:48.158729', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359', 'instance_type': 'm1.small', 'host': 'bf4d507aa00b3fcf643a7e0b883420632ac7fc96e8c7a756982e121d', 'instance_host': 'np0005559462.localdomain', 'flavor': {'id': '2da0e147-aaa7-4bb9-a176-5fe1b15a32a0', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e'}, 'image_ref': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'a0f06446-d99d-11f0-817e-fa163ebaca0f', 'monotonic_time': 12214.351409616, 'message_signature': '3806d181ce202e08b5e1df6d2bca74cf0ac105ecccf1a9f37c3cbcb486a54183'}]}, 'timestamp': '2025-12-15 10:05:48.168939', '_unique_id': '053086121f584509b112c165b6411ae7'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.170 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.170 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.170 12 ERROR oslo_messaging.notify.messaging yield
Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.170 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.170 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.170 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.170 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.170 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.170 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.170 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.170 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.170 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.170 12 ERROR oslo_messaging.notify.messaging conn.connect()
Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.170 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.170 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.170 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.170 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.170 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.170 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.170 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.170 12 ERROR oslo_messaging.notify.messaging
Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.170 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.170 12 ERROR oslo_messaging.notify.messaging
Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.170 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.170 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.170 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.170 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.170 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.170 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.170 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.170 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.170 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.170 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.170 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.170 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.170 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.170 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.170 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.170 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.170 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.170 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.170 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.170 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.170 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.170 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.170 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.170 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.170 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.170 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.170 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.170 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.170 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.170 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.170 12 ERROR oslo_messaging.notify.messaging
Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.171 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.requests in the context of pollsters
Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.172 12 DEBUG ceilometer.compute.pollsters [-] 39ff1bd9-6f6b-44c8-bbec-a1fd9d196359/disk.device.write.requests volume: 47 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.172 12 DEBUG ceilometer.compute.pollsters [-] 39ff1bd9-6f6b-44c8-bbec-a1fd9d196359/disk.device.write.requests volume: 1 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.174 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '5858a7a6-4fff-48b8-8bf8-58306864775c', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 47, 'user_id': '1ba5fce347b64bfebf995f187193f205', 'user_name': None, 'project_id': 'c785bf23f53946bc99867d8832a50266', 'project_name': None, 'resource_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359-vda', 'timestamp': '2025-12-15T10:05:48.172127', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359', 'instance_type': 'm1.small', 'host': 'bf4d507aa00b3fcf643a7e0b883420632ac7fc96e8c7a756982e121d', 'instance_host': 'np0005559462.localdomain', 'flavor': {'id': '2da0e147-aaa7-4bb9-a176-5fe1b15a32a0', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e'}, 'image_ref': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'a0f0f636-d99d-11f0-817e-fa163ebaca0f', 'monotonic_time': 12214.322229526, 'message_signature': '97b1b6c65693cea34a3bc3057363ffd70b171e38d2fea7090d5247ef1b275e2e'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1, 'user_id': '1ba5fce347b64bfebf995f187193f205', 'user_name': None, 'project_id': 'c785bf23f53946bc99867d8832a50266', 'project_name': None, 'resource_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359-vdb', 'timestamp': '2025-12-15T10:05:48.172127', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359', 'instance_type': 'm1.small', 'host': 'bf4d507aa00b3fcf643a7e0b883420632ac7fc96e8c7a756982e121d', 'instance_host': 'np0005559462.localdomain', 'flavor': {'id': '2da0e147-aaa7-4bb9-a176-5fe1b15a32a0', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e'}, 'image_ref': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'a0f10ea0-d99d-11f0-817e-fa163ebaca0f', 'monotonic_time': 12214.322229526, 'message_signature': '45dab014f83988a8973dcb71ec2b69395ea9f0dc925c5dd0c802835ff52ebf87'}]}, 'timestamp': '2025-12-15 10:05:48.173320', '_unique_id': '8817cb44b73848f19657d7a859a34e61'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.174 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.174 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.174 12 ERROR oslo_messaging.notify.messaging yield
Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.174 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.174 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.174 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.174 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.174 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.174 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.174 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.174 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.174 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.174 12 ERROR oslo_messaging.notify.messaging conn.connect()
Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.174 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.174 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.174 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.174 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.174 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.174 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.174 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.174 12 ERROR oslo_messaging.notify.messaging
Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.174 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.174 12 ERROR oslo_messaging.notify.messaging
Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.174 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.174 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.174 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.174 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.174 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.174 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.174 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.174 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.174 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.174 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.174 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.174 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.174 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.174 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.174 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.174 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.174 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.174 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.174 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.174 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.174 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.174 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.174 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.174 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.174 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.174 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.174 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.174 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.174 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.174 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.174 12 ERROR oslo_messaging.notify.messaging
Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.175 12 INFO ceilometer.polling.manager [-] Polling pollster cpu in the context of pollsters
Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.189 12 DEBUG ceilometer.compute.pollsters [-] 39ff1bd9-6f6b-44c8-bbec-a1fd9d196359/cpu volume: 15250000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.191 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'ffc37582-7414-4b0e-8d63-1ad97abed2d5', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 15250000000, 'user_id': '1ba5fce347b64bfebf995f187193f205', 'user_name': None, 'project_id': 'c785bf23f53946bc99867d8832a50266', 'project_name': None, 'resource_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359', 'timestamp': '2025-12-15T10:05:48.175723', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359', 'instance_type': 'm1.small', 'host': 'bf4d507aa00b3fcf643a7e0b883420632ac7fc96e8c7a756982e121d', 'instance_host': 'np0005559462.localdomain', 'flavor': {'id': '2da0e147-aaa7-4bb9-a176-5fe1b15a32a0', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e'}, 'image_ref': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'cpu_number': 1}, 'message_id': 'a0f3a2fa-d99d-11f0-817e-fa163ebaca0f', 'monotonic_time': 12214.382252668, 'message_signature': 'be116caf0eab648149efa3117357c957818357f7c0333e6c69eacbd9e70027c8'}]}, 'timestamp': '2025-12-15 10:05:48.190268', '_unique_id': 'a9fb4e93b9f44b2d91d6cd8b2f6d3da0'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.191 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.191 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.191 12 ERROR oslo_messaging.notify.messaging yield
Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.191 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.191 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.191 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.191 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.191 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.191 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.191 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.191 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.191 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.191 12 ERROR oslo_messaging.notify.messaging conn.connect()
Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.191 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.191 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.191 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.191 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.191 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.191 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.191 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.191 12 ERROR oslo_messaging.notify.messaging
Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.191 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.191 12 ERROR oslo_messaging.notify.messaging
Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.191 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.191 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.191 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.191 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.191 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.191 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.191 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.191 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.191 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.191 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.191 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.191 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.191 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.191 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.191 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.191 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.191 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.191 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.191 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.191 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.191 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Dec 15 05:05:48 localhost
ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.191 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.191 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.191 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.191 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.191 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.191 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.191 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.191 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.191 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.191 12 ERROR oslo_messaging.notify.messaging Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.192 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.delta in the context of pollsters Dec 15 05:05:48 
localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.193 12 DEBUG ceilometer.compute.pollsters [-] 39ff1bd9-6f6b-44c8-bbec-a1fd9d196359/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.195 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'fdc70bfd-947a-4613-b8e3-fbbc4acf764a', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '1ba5fce347b64bfebf995f187193f205', 'user_name': None, 'project_id': 'c785bf23f53946bc99867d8832a50266', 'project_name': None, 'resource_id': 'instance-00000002-39ff1bd9-6f6b-44c8-bbec-a1fd9d196359-tap03ef8889-32', 'timestamp': '2025-12-15T10:05:48.193066', 'resource_metadata': {'display_name': 'test', 'name': 'tap03ef8889-32', 'instance_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359', 'instance_type': 'm1.small', 'host': 'bf4d507aa00b3fcf643a7e0b883420632ac7fc96e8c7a756982e121d', 'instance_host': 'np0005559462.localdomain', 'flavor': {'id': '2da0e147-aaa7-4bb9-a176-5fe1b15a32a0', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e'}, 'image_ref': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:74:df:7c', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap03ef8889-32'}, 'message_id': 'a0f42a7c-d99d-11f0-817e-fa163ebaca0f', 'monotonic_time': 12214.316293577, 
'message_signature': '0ead4324968e852bc3afb03c1fd0a2d63a732adda9c527eea20437adb71c5379'}]}, 'timestamp': '2025-12-15 10:05:48.193836', '_unique_id': 'a0b6e2cf812540238e818a5eb7422f03'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.195 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.195 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.195 12 ERROR oslo_messaging.notify.messaging yield Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.195 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.195 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.195 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.195 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.195 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.195 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.195 12 ERROR 
oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.195 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.195 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.195 12 ERROR oslo_messaging.notify.messaging conn.connect() Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.195 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.195 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.195 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.195 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.195 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.195 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.195 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Dec 15 05:05:48 localhost 
ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.195 12 ERROR oslo_messaging.notify.messaging Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.195 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.195 12 ERROR oslo_messaging.notify.messaging Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.195 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.195 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.195 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.195 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.195 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.195 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.195 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.195 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", 
line 653, in _send Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.195 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.195 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.195 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.195 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.195 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.195 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.195 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.195 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.195 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.195 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.195 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.195 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.195 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.195 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.195 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.195 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.195 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.195 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.195 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.195 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Dec 15 05:05:48 localhost 
ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.195 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.195 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.195 12 ERROR oslo_messaging.notify.messaging Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.197 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.latency in the context of pollsters Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.197 12 DEBUG ceilometer.compute.pollsters [-] 39ff1bd9-6f6b-44c8-bbec-a1fd9d196359/disk.device.read.latency volume: 1342134926 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.198 12 DEBUG ceilometer.compute.pollsters [-] 39ff1bd9-6f6b-44c8-bbec-a1fd9d196359/disk.device.read.latency volume: 123356132 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.200 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': 'fcb4bd3e-c7fb-4f8c-945f-6407c9d852cd', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 1342134926, 'user_id': '1ba5fce347b64bfebf995f187193f205', 'user_name': None, 'project_id': 'c785bf23f53946bc99867d8832a50266', 'project_name': None, 'resource_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359-vda', 'timestamp': '2025-12-15T10:05:48.197549', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359', 'instance_type': 'm1.small', 'host': 'bf4d507aa00b3fcf643a7e0b883420632ac7fc96e8c7a756982e121d', 'instance_host': 'np0005559462.localdomain', 'flavor': {'id': '2da0e147-aaa7-4bb9-a176-5fe1b15a32a0', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e'}, 'image_ref': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'a0f4d7c4-d99d-11f0-817e-fa163ebaca0f', 'monotonic_time': 12214.322229526, 'message_signature': 'b132dc89c63a21c2155774670587c7a597e3245104e4b4229eb5fa7ab113e4f5'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 123356132, 'user_id': '1ba5fce347b64bfebf995f187193f205', 'user_name': None, 'project_id': 'c785bf23f53946bc99867d8832a50266', 'project_name': None, 'resource_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359-vdb', 'timestamp': '2025-12-15T10:05:48.197549', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 
'39ff1bd9-6f6b-44c8-bbec-a1fd9d196359', 'instance_type': 'm1.small', 'host': 'bf4d507aa00b3fcf643a7e0b883420632ac7fc96e8c7a756982e121d', 'instance_host': 'np0005559462.localdomain', 'flavor': {'id': '2da0e147-aaa7-4bb9-a176-5fe1b15a32a0', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e'}, 'image_ref': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'a0f4f272-d99d-11f0-817e-fa163ebaca0f', 'monotonic_time': 12214.322229526, 'message_signature': '6b021f0d54c20b91b8ca49436dde1c5cf7eba303670ac026e8e0855ce54e768c'}]}, 'timestamp': '2025-12-15 10:05:48.198894', '_unique_id': 'eb6b8d2c1c1a462486af102f58dbb327'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.200 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.200 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.200 12 ERROR oslo_messaging.notify.messaging yield Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.200 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.200 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.200 12 ERROR 
oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.200 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.200 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.200 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.200 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.200 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.200 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.200 12 ERROR oslo_messaging.notify.messaging conn.connect() Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.200 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.200 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.200 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Dec 15 
05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.200 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.200 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.200 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.200 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.200 12 ERROR oslo_messaging.notify.messaging Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.200 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.200 12 ERROR oslo_messaging.notify.messaging Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.200 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.200 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.200 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.200 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Dec 15 05:05:48 localhost 
ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.200 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.200 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.200 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.200 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.200 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.200 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.200 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.200 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.200 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.200 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, 
in get Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.200 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.200 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.200 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.200 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.200 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.200 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.200 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.200 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.200 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.200 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 15 05:05:48 localhost 
ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.200 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.200 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.200 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.200 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.200 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.200 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.200 12 ERROR oslo_messaging.notify.messaging Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.202 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.allocation in the context of pollsters Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.202 12 DEBUG ceilometer.compute.pollsters [-] 39ff1bd9-6f6b-44c8-bbec-a1fd9d196359/disk.device.allocation volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.203 12 DEBUG ceilometer.compute.pollsters [-] 39ff1bd9-6f6b-44c8-bbec-a1fd9d196359/disk.device.allocation volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 15 05:05:48 
localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.205 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '1dc4a4bf-80c7-4f80-9066-2ce54b26c5fd', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '1ba5fce347b64bfebf995f187193f205', 'user_name': None, 'project_id': 'c785bf23f53946bc99867d8832a50266', 'project_name': None, 'resource_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359-vda', 'timestamp': '2025-12-15T10:05:48.202462', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359', 'instance_type': 'm1.small', 'host': 'bf4d507aa00b3fcf643a7e0b883420632ac7fc96e8c7a756982e121d', 'instance_host': 'np0005559462.localdomain', 'flavor': {'id': '2da0e147-aaa7-4bb9-a176-5fe1b15a32a0', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e'}, 'image_ref': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'a0f597a4-d99d-11f0-817e-fa163ebaca0f', 'monotonic_time': 12214.351409616, 'message_signature': '80e285568f430505d1b5c2f03a2eb6548e2f88e1515e14d53da66a4bf1b25798'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '1ba5fce347b64bfebf995f187193f205', 'user_name': None, 'project_id': 'c785bf23f53946bc99867d8832a50266', 'project_name': None, 'resource_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359-vdb', 'timestamp': 
'2025-12-15T10:05:48.202462', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359', 'instance_type': 'm1.small', 'host': 'bf4d507aa00b3fcf643a7e0b883420632ac7fc96e8c7a756982e121d', 'instance_host': 'np0005559462.localdomain', 'flavor': {'id': '2da0e147-aaa7-4bb9-a176-5fe1b15a32a0', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e'}, 'image_ref': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'a0f5b20c-d99d-11f0-817e-fa163ebaca0f', 'monotonic_time': 12214.351409616, 'message_signature': '5e8f37e435d8123d5c6b1b8fffa582ee8c4cc28a3d6016090654f379bacda4fa'}]}, 'timestamp': '2025-12-15 10:05:48.203825', '_unique_id': 'f282420a4e1743fcbe96b39c9f68a287'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.205 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.205 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.205 12 ERROR oslo_messaging.notify.messaging yield Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.205 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.205 12 ERROR oslo_messaging.notify.messaging return retry_over_time( 
Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.205 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.205 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.205 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.205 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.205 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.205 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.205 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.205 12 ERROR oslo_messaging.notify.messaging conn.connect() Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.205 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.205 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.205 12 ERROR 
oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.205 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.205 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.205 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.205 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.205 12 ERROR oslo_messaging.notify.messaging Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.205 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.205 12 ERROR oslo_messaging.notify.messaging Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.205 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.205 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.205 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.205 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.205 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.205 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.205 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.205 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.205 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.205 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.205 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.205 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.205 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.205 
12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.205 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.205 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.205 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.205 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.205 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.205 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.205 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.205 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.205 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.205 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.205 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.205 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.205 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.205 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.205 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.205 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.205 12 ERROR oslo_messaging.notify.messaging Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.206 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.bytes in the context of pollsters Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.207 12 DEBUG ceilometer.compute.pollsters [-] 39ff1bd9-6f6b-44c8-bbec-a1fd9d196359/disk.device.write.bytes volume: 389120 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.207 12 DEBUG ceilometer.compute.pollsters [-] 39ff1bd9-6f6b-44c8-bbec-a1fd9d196359/disk.device.write.bytes volume: 512 
_stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.209 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '96f91727-1e24-4e4a-8970-30c0c397ae7a', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 389120, 'user_id': '1ba5fce347b64bfebf995f187193f205', 'user_name': None, 'project_id': 'c785bf23f53946bc99867d8832a50266', 'project_name': None, 'resource_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359-vda', 'timestamp': '2025-12-15T10:05:48.207177', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359', 'instance_type': 'm1.small', 'host': 'bf4d507aa00b3fcf643a7e0b883420632ac7fc96e8c7a756982e121d', 'instance_host': 'np0005559462.localdomain', 'flavor': {'id': '2da0e147-aaa7-4bb9-a176-5fe1b15a32a0', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e'}, 'image_ref': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'a0f6507c-d99d-11f0-817e-fa163ebaca0f', 'monotonic_time': 12214.322229526, 'message_signature': '33b7f0a7feffca0633247f489f171b0f919ead25718f78421ebc448d96cedd5d'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 512, 'user_id': '1ba5fce347b64bfebf995f187193f205', 'user_name': None, 'project_id': 
'c785bf23f53946bc99867d8832a50266', 'project_name': None, 'resource_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359-vdb', 'timestamp': '2025-12-15T10:05:48.207177', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359', 'instance_type': 'm1.small', 'host': 'bf4d507aa00b3fcf643a7e0b883420632ac7fc96e8c7a756982e121d', 'instance_host': 'np0005559462.localdomain', 'flavor': {'id': '2da0e147-aaa7-4bb9-a176-5fe1b15a32a0', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e'}, 'image_ref': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'a0f66c4c-d99d-11f0-817e-fa163ebaca0f', 'monotonic_time': 12214.322229526, 'message_signature': '2b7fce3655e10106c9aae48f6d53eeab1ee479ee00483a94afdb82c442df3a8d'}]}, 'timestamp': '2025-12-15 10:05:48.208566', '_unique_id': 'ca67a650c5674c35a9ef904642daf1c3'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.209 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.209 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.209 12 ERROR oslo_messaging.notify.messaging yield Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.209 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 15 05:05:48 
localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.209 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.209 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.209 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.209 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.209 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.209 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.209 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.209 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.209 12 ERROR oslo_messaging.notify.messaging conn.connect() Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.209 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.209 12 ERROR oslo_messaging.notify.messaging 
self.transport.connect() Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.209 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.209 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.209 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.209 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.209 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.209 12 ERROR oslo_messaging.notify.messaging Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.209 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.209 12 ERROR oslo_messaging.notify.messaging Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.209 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.209 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.209 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Dec 15 05:05:48 localhost 
ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.209 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.209 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.209 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.209 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.209 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.209 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.209 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.209 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.209 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.209 12 ERROR oslo_messaging.notify.messaging self.connection = 
connection_pool.get(retry=retry) Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.209 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.209 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.209 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.209 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.209 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.209 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.209 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.209 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.209 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.209 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Dec 15 05:05:48 localhost 
ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.209 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.209 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.209 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.209 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.209 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.209 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.209 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.209 12 ERROR oslo_messaging.notify.messaging Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.211 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.drop in the context of pollsters Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.212 12 DEBUG ceilometer.compute.pollsters [-] 39ff1bd9-6f6b-44c8-bbec-a1fd9d196359/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.214 12 ERROR 
oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'a66c1d80-4585-4385-be2f-6664000d9ee6', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '1ba5fce347b64bfebf995f187193f205', 'user_name': None, 'project_id': 'c785bf23f53946bc99867d8832a50266', 'project_name': None, 'resource_id': 'instance-00000002-39ff1bd9-6f6b-44c8-bbec-a1fd9d196359-tap03ef8889-32', 'timestamp': '2025-12-15T10:05:48.212031', 'resource_metadata': {'display_name': 'test', 'name': 'tap03ef8889-32', 'instance_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359', 'instance_type': 'm1.small', 'host': 'bf4d507aa00b3fcf643a7e0b883420632ac7fc96e8c7a756982e121d', 'instance_host': 'np0005559462.localdomain', 'flavor': {'id': '2da0e147-aaa7-4bb9-a176-5fe1b15a32a0', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e'}, 'image_ref': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:74:df:7c', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap03ef8889-32'}, 'message_id': 'a0f70ea4-d99d-11f0-817e-fa163ebaca0f', 'monotonic_time': 12214.316293577, 'message_signature': '5f2ece0063ffb36d939702b838232a0732eb5c0761ca7b9723f32b9ce4b79967'}]}, 'timestamp': '2025-12-15 10:05:48.212776', '_unique_id': '69bbe98cfaf64a2497419a7b97ef8977'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.214 12 ERROR oslo_messaging.notify.messaging 
Traceback (most recent call last):
Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.214 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.214 12 ERROR oslo_messaging.notify.messaging yield
Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.214 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.214 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.214 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.214 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.214 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.214 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.214 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.214 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.214 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.214 12 ERROR oslo_messaging.notify.messaging conn.connect()
Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.214 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.214 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.214 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.214 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.214 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.214 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.214 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.214 12 ERROR oslo_messaging.notify.messaging
Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.214 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.214 12 ERROR oslo_messaging.notify.messaging
Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.214 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.214 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.214 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.214 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.214 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.214 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.214 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.214 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.214 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.214 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.214 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.214 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.214 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.214 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.214 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.214 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.214 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.214 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.214 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.214 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.214 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.214 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.214 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.214 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.214 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.214 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.214 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.214 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.214 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.214 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.214 12 ERROR oslo_messaging.notify.messaging
Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.215 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.error in the context of pollsters
Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.216 12 DEBUG ceilometer.compute.pollsters [-] 39ff1bd9-6f6b-44c8-bbec-a1fd9d196359/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.217 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '8cefd44e-8415-42d0-a16b-b82fe971eeb5', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '1ba5fce347b64bfebf995f187193f205', 'user_name': None, 'project_id': 'c785bf23f53946bc99867d8832a50266', 'project_name': None, 'resource_id': 'instance-00000002-39ff1bd9-6f6b-44c8-bbec-a1fd9d196359-tap03ef8889-32', 'timestamp': '2025-12-15T10:05:48.215943', 'resource_metadata': {'display_name': 'test', 'name': 'tap03ef8889-32', 'instance_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359', 'instance_type': 'm1.small', 'host': 'bf4d507aa00b3fcf643a7e0b883420632ac7fc96e8c7a756982e121d', 'instance_host': 'np0005559462.localdomain', 'flavor': {'id': '2da0e147-aaa7-4bb9-a176-5fe1b15a32a0', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e'}, 'image_ref': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:74:df:7c', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap03ef8889-32'}, 'message_id': 'a0f7a88c-d99d-11f0-817e-fa163ebaca0f', 'monotonic_time': 12214.316293577, 'message_signature': '2b180be4e4bb454b2372ec452736a071938350d358c2f62e48ebbc8d4d5a9527'}]}, 'timestamp': '2025-12-15 10:05:48.216662', '_unique_id': '2a611a2d9e3b453ba60e69d29590f7c9'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.217 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.217 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.217 12 ERROR oslo_messaging.notify.messaging yield
Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.217 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.217 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.217 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.217 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.217 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.217 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.217 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.217 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.217 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.217 12 ERROR oslo_messaging.notify.messaging conn.connect()
Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.217 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.217 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.217 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.217 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.217 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.217 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.217 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.217 12 ERROR oslo_messaging.notify.messaging
Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.217 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.217 12 ERROR oslo_messaging.notify.messaging
Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.217 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.217 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.217 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.217 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.217 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.217 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.217 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.217 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.217 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.217 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.217 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.217 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.217 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.217 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.217 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.217 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.217 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.217 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.217 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.217 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.217 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.217 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.217 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.217 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.217 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.217 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.217 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.217 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.217 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.217 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.217 12 ERROR oslo_messaging.notify.messaging
Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.218 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes in the context of pollsters
Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.218 12 DEBUG ceilometer.compute.pollsters [-] 39ff1bd9-6f6b-44c8-bbec-a1fd9d196359/network.incoming.bytes volume: 6809 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.219 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'fe06a428-e81d-4a73-a879-e912691b82cb', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 6809, 'user_id': '1ba5fce347b64bfebf995f187193f205', 'user_name': None, 'project_id': 'c785bf23f53946bc99867d8832a50266', 'project_name': None, 'resource_id': 'instance-00000002-39ff1bd9-6f6b-44c8-bbec-a1fd9d196359-tap03ef8889-32', 'timestamp': '2025-12-15T10:05:48.218462', 'resource_metadata': {'display_name': 'test', 'name': 'tap03ef8889-32', 'instance_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359', 'instance_type': 'm1.small', 'host': 'bf4d507aa00b3fcf643a7e0b883420632ac7fc96e8c7a756982e121d', 'instance_host': 'np0005559462.localdomain', 'flavor': {'id': '2da0e147-aaa7-4bb9-a176-5fe1b15a32a0', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e'}, 'image_ref': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:74:df:7c', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap03ef8889-32'}, 'message_id': 'a0f8032c-d99d-11f0-817e-fa163ebaca0f', 'monotonic_time': 12214.316293577, 'message_signature': '328ed69e652cdf4f5cf3f55ec9447813b6d471d1f8c3977d6530870a23d824d5'}]}, 'timestamp': '2025-12-15 10:05:48.218858', '_unique_id': 'c70ceb22a96348ba956014d87ef028a4'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.219 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.219 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.219 12 ERROR oslo_messaging.notify.messaging yield
Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.219 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.219 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.219 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.219 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.219 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.219 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.219 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.219 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.219 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.219 12 ERROR oslo_messaging.notify.messaging conn.connect()
Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.219 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.219 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.219 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.219 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.219 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.219 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.219 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.219 12 ERROR oslo_messaging.notify.messaging
Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.219 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.219 12 ERROR oslo_messaging.notify.messaging
Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.219 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.219 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.219 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.219 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.219 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.219 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.219 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.219 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.219 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.219 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.219 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.219 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.219 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.219 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.219 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.219 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.219 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.219 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.219 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.219 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.219 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.219 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.219 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.219 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.219 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.219 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.219 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.219 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.219 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.219 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.219 12 ERROR oslo_messaging.notify.messaging
Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.220 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no new resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.220 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes in the context of pollsters
Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.220 12 DEBUG ceilometer.compute.pollsters [-] 39ff1bd9-6f6b-44c8-bbec-a1fd9d196359/network.outgoing.bytes volume: 9770 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.223 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'aead8fc4-2bec-4847-b1f6-df7b34d96358', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 9770, 'user_id': '1ba5fce347b64bfebf995f187193f205', 'user_name': None, 'project_id': 'c785bf23f53946bc99867d8832a50266', 'project_name': None, 'resource_id': 'instance-00000002-39ff1bd9-6f6b-44c8-bbec-a1fd9d196359-tap03ef8889-32', 'timestamp': '2025-12-15T10:05:48.220822', 'resource_metadata': {'display_name': 'test', 'name': 'tap03ef8889-32', 'instance_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359', 'instance_type': 'm1.small', 'host': 'bf4d507aa00b3fcf643a7e0b883420632ac7fc96e8c7a756982e121d', 'instance_host': 'np0005559462.localdomain', 'flavor': {'id': '2da0e147-aaa7-4bb9-a176-5fe1b15a32a0', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e'}, 'image_ref': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:74:df:7c', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap03ef8889-32'}, 'message_id': 'a0f893c8-d99d-11f0-817e-fa163ebaca0f', 'monotonic_time': 12214.316293577, 'message_signature': '87cec5098a9cf94f39132bcd6572bba34c758b5f5fca0a8d0a76c05a1578f3be'}]}, 'timestamp': '2025-12-15 10:05:48.222541', '_unique_id': 'a18b798e1a72487cb6610c380d5102c9'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.223 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.223 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.223 12 ERROR oslo_messaging.notify.messaging yield
Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.223 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.223 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.223 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.223 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.223 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.223 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.223 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.223 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.223 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.223 12 ERROR oslo_messaging.notify.messaging conn.connect()
Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.223 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.223 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.223 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.223 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.223 12 ERROR oslo_messaging.notify.messaging File
"/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.223 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.223 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.223 12 ERROR oslo_messaging.notify.messaging Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.223 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.223 12 ERROR oslo_messaging.notify.messaging Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.223 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.223 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.223 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.223 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.223 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.223 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.223 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.223 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.223 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.223 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.223 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.223 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.223 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.223 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.223 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.223 12 ERROR oslo_messaging.notify.messaging 
File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.223 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.223 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.223 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.223 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.223 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.223 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.223 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.223 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.223 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.223 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in 
__exit__ Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.223 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.223 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.223 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.223 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.223 12 ERROR oslo_messaging.notify.messaging Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.224 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets in the context of pollsters Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.224 12 DEBUG ceilometer.compute.pollsters [-] 39ff1bd9-6f6b-44c8-bbec-a1fd9d196359/network.outgoing.packets volume: 114 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.225 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': 'a7481930-c71d-42e4-a69c-48d60d3187d7', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 114, 'user_id': '1ba5fce347b64bfebf995f187193f205', 'user_name': None, 'project_id': 'c785bf23f53946bc99867d8832a50266', 'project_name': None, 'resource_id': 'instance-00000002-39ff1bd9-6f6b-44c8-bbec-a1fd9d196359-tap03ef8889-32', 'timestamp': '2025-12-15T10:05:48.224362', 'resource_metadata': {'display_name': 'test', 'name': 'tap03ef8889-32', 'instance_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359', 'instance_type': 'm1.small', 'host': 'bf4d507aa00b3fcf643a7e0b883420632ac7fc96e8c7a756982e121d', 'instance_host': 'np0005559462.localdomain', 'flavor': {'id': '2da0e147-aaa7-4bb9-a176-5fe1b15a32a0', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e'}, 'image_ref': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:74:df:7c', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap03ef8889-32'}, 'message_id': 'a0f8e97c-d99d-11f0-817e-fa163ebaca0f', 'monotonic_time': 12214.316293577, 'message_signature': '6c4dad784dc54f8a7ac5a278230b5a512d7f2a66a58a0fe3952beee796dedcb1'}]}, 'timestamp': '2025-12-15 10:05:48.224753', '_unique_id': '350e9f4a99b341aab77be7590d791f69'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.225 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 15 05:05:48 localhost 
ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.225 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.225 12 ERROR oslo_messaging.notify.messaging yield Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.225 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.225 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.225 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.225 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.225 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.225 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.225 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.225 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.225 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.225 12 ERROR oslo_messaging.notify.messaging conn.connect() Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.225 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.225 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.225 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.225 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.225 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.225 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.225 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.225 12 ERROR oslo_messaging.notify.messaging Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.225 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.225 12 ERROR oslo_messaging.notify.messaging Dec 15 05:05:48 localhost 
ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.225 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.225 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.225 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.225 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.225 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.225 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.225 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.225 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.225 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.225 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection 
Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.225 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.225 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.225 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.225 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.225 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.225 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.225 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.225 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.225 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.225 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Dec 15 05:05:48 
localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.225 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.225 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.225 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.225 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.225 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.225 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.225 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.225 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.225 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.225 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.225 12 ERROR oslo_messaging.notify.messaging Dec 15 05:05:48 localhost 
ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.226 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.delta in the context of pollsters Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.226 12 DEBUG ceilometer.compute.pollsters [-] 39ff1bd9-6f6b-44c8-bbec-a1fd9d196359/network.incoming.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.227 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'cbc415e0-04e0-4fce-9a27-731dd6138de6', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '1ba5fce347b64bfebf995f187193f205', 'user_name': None, 'project_id': 'c785bf23f53946bc99867d8832a50266', 'project_name': None, 'resource_id': 'instance-00000002-39ff1bd9-6f6b-44c8-bbec-a1fd9d196359-tap03ef8889-32', 'timestamp': '2025-12-15T10:05:48.226524', 'resource_metadata': {'display_name': 'test', 'name': 'tap03ef8889-32', 'instance_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359', 'instance_type': 'm1.small', 'host': 'bf4d507aa00b3fcf643a7e0b883420632ac7fc96e8c7a756982e121d', 'instance_host': 'np0005559462.localdomain', 'flavor': {'id': '2da0e147-aaa7-4bb9-a176-5fe1b15a32a0', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e'}, 'image_ref': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:74:df:7c', 'fref': None, 
'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap03ef8889-32'}, 'message_id': 'a0f93c42-d99d-11f0-817e-fa163ebaca0f', 'monotonic_time': 12214.316293577, 'message_signature': '0a704bdbf9440bac64c5408cf596c498bae8da39a8d51a877dcbc8a900b3b147'}]}, 'timestamp': '2025-12-15 10:05:48.226815', '_unique_id': '2c893e0d800d4f3d927158f8906fd8cf'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.227 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.227 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.227 12 ERROR oslo_messaging.notify.messaging yield Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.227 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.227 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.227 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.227 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.227 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.227 12 ERROR 
oslo_messaging.notify.messaging self._connection = self._establish_connection() Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.227 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.227 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.227 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.227 12 ERROR oslo_messaging.notify.messaging conn.connect() Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.227 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.227 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.227 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.227 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.227 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.227 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 
2025-12-15 10:05:48.227 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.227 12 ERROR oslo_messaging.notify.messaging
Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.227 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.227 12 ERROR oslo_messaging.notify.messaging
Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.227 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.227 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.227 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.227 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.227 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.227 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.227 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.227 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.227 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.227 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.227 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.227 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.227 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.227 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.227 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.227 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.227 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.227 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.227 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.227 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.227 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.227 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.227 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.227 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.227 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.227 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.227 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.227 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.227 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.227 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.227 12 ERROR oslo_messaging.notify.messaging
Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.228 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no new resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.228 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.requests in the context of pollsters
Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.228 12 DEBUG ceilometer.compute.pollsters [-] 39ff1bd9-6f6b-44c8-bbec-a1fd9d196359/disk.device.read.requests volume: 1272 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.228 12 DEBUG ceilometer.compute.pollsters [-] 39ff1bd9-6f6b-44c8-bbec-a1fd9d196359/disk.device.read.requests volume: 124 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.229 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'a3334edd-eecf-4180-b621-c9ec83a65d7b', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1272, 'user_id': '1ba5fce347b64bfebf995f187193f205', 'user_name': None, 'project_id': 'c785bf23f53946bc99867d8832a50266', 'project_name': None, 'resource_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359-vda', 'timestamp': '2025-12-15T10:05:48.228300', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359', 'instance_type': 'm1.small', 'host': 'bf4d507aa00b3fcf643a7e0b883420632ac7fc96e8c7a756982e121d', 'instance_host': 'np0005559462.localdomain', 'flavor': {'id': '2da0e147-aaa7-4bb9-a176-5fe1b15a32a0', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e'}, 'image_ref': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'a0f9817a-d99d-11f0-817e-fa163ebaca0f', 'monotonic_time': 12214.322229526, 'message_signature': 'a4a382ecdc76c448b52dbed641c85f423fe072212d0d9c05c09ce2d6e4a5a91c'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 124, 'user_id': '1ba5fce347b64bfebf995f187193f205', 'user_name': None, 'project_id': 'c785bf23f53946bc99867d8832a50266', 'project_name': None, 'resource_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359-vdb', 'timestamp': '2025-12-15T10:05:48.228300', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359', 'instance_type': 'm1.small', 'host': 'bf4d507aa00b3fcf643a7e0b883420632ac7fc96e8c7a756982e121d', 'instance_host': 'np0005559462.localdomain', 'flavor': {'id': '2da0e147-aaa7-4bb9-a176-5fe1b15a32a0', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e'}, 'image_ref': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'a0f98bde-d99d-11f0-817e-fa163ebaca0f', 'monotonic_time': 12214.322229526, 'message_signature': 'b0e4eb5cd94bf27af0516189725ff04cc8678ff30293b3bb3c1e267b63b76b6f'}]}, 'timestamp': '2025-12-15 10:05:48.228835', '_unique_id': '9c3e9c6951ae4c79b6100bd0a0a8d309'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.229 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.229 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.229 12 ERROR oslo_messaging.notify.messaging yield
Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.229 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.229 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.229 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.229 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.229 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.229 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.229 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.229 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.229 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.229 12 ERROR oslo_messaging.notify.messaging conn.connect()
Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.229 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.229 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.229 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.229 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.229 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.229 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.229 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.229 12 ERROR oslo_messaging.notify.messaging
Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.229 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.229 12 ERROR oslo_messaging.notify.messaging
Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.229 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.229 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.229 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.229 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.229 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.229 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.229 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.229 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.229 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.229 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.229 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.229 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.229 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.229 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.229 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.229 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.229 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.229 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.229 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.229 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.229 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.229 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.229 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.229 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.229 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.229 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.229 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.229 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.229 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.229 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.229 12 ERROR oslo_messaging.notify.messaging
Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.230 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no new resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.230 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.bytes in the context of pollsters
Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.230 12 DEBUG ceilometer.compute.pollsters [-] 39ff1bd9-6f6b-44c8-bbec-a1fd9d196359/disk.device.read.bytes volume: 35560448 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.230 12 DEBUG ceilometer.compute.pollsters [-] 39ff1bd9-6f6b-44c8-bbec-a1fd9d196359/disk.device.read.bytes volume: 2154496 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.231 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '0953adc5-767d-4f58-b490-46a2315c0bd9', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 35560448, 'user_id': '1ba5fce347b64bfebf995f187193f205', 'user_name': None, 'project_id': 'c785bf23f53946bc99867d8832a50266', 'project_name': None, 'resource_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359-vda', 'timestamp': '2025-12-15T10:05:48.230325', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359', 'instance_type': 'm1.small', 'host': 'bf4d507aa00b3fcf643a7e0b883420632ac7fc96e8c7a756982e121d', 'instance_host': 'np0005559462.localdomain', 'flavor': {'id': '2da0e147-aaa7-4bb9-a176-5fe1b15a32a0', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e'}, 'image_ref': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'a0f9d076-d99d-11f0-817e-fa163ebaca0f', 'monotonic_time': 12214.322229526, 'message_signature': '4a12e7acd8bb64bf902df8232db2db60dfaabe8c5fb5c3990005fbc73e768307'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 2154496, 'user_id': '1ba5fce347b64bfebf995f187193f205', 'user_name': None, 'project_id': 'c785bf23f53946bc99867d8832a50266', 'project_name': None, 'resource_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359-vdb', 'timestamp': '2025-12-15T10:05:48.230325', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359', 'instance_type': 'm1.small', 'host': 'bf4d507aa00b3fcf643a7e0b883420632ac7fc96e8c7a756982e121d', 'instance_host': 'np0005559462.localdomain', 'flavor': {'id': '2da0e147-aaa7-4bb9-a176-5fe1b15a32a0', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e'}, 'image_ref': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'a0f9dabc-d99d-11f0-817e-fa163ebaca0f', 'monotonic_time': 12214.322229526, 'message_signature': '03ec741b52aa07e388de8e393ce139e9f89f2469ccd8b8fe5075a4abe51158e1'}]}, 'timestamp': '2025-12-15 10:05:48.230854', '_unique_id': 'e2de51a54f6d40efb176d25136ce3c43'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.231 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.231 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.231 12 ERROR oslo_messaging.notify.messaging yield
Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.231 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.231 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.231 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.231 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.231 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.231 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.231 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.231 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.231 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.231 12 ERROR oslo_messaging.notify.messaging conn.connect()
Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.231 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.231 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.231 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.231 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.231 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.231 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.231 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.231 12 ERROR oslo_messaging.notify.messaging
Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.231 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.231 12 ERROR oslo_messaging.notify.messaging
Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.231 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.231 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.231 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.231 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.231 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.231 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.231 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.231 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.231 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.231 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.231 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.231 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.231 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.231 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.231 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.231 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.231 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.231 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.231 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.231 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.231 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.231 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.231 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.231 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.231 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.231 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.231 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.231 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.231 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.231 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.231 12 ERROR oslo_messaging.notify.messaging
Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.232 12 INFO ceilometer.polling.manager [-] Polling pollster memory.usage in the context of pollsters
Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.232 12 DEBUG ceilometer.compute.pollsters [-] 39ff1bd9-6f6b-44c8-bbec-a1fd9d196359/memory.usage volume: 51.73828125 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.233 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '387a9733-20da-47a3-9582-78bc3663cf2e', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'memory.usage', 'counter_type': 'gauge', 'counter_unit': 'MB', 'counter_volume': 51.73828125, 'user_id': '1ba5fce347b64bfebf995f187193f205', 'user_name': None, 'project_id': 'c785bf23f53946bc99867d8832a50266', 'project_name': None, 'resource_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359', 'timestamp': '2025-12-15T10:05:48.232277', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359', 'instance_type': 'm1.small', 'host': 'bf4d507aa00b3fcf643a7e0b883420632ac7fc96e8c7a756982e121d', 'instance_host': 'np0005559462.localdomain', 'flavor': {'id': '2da0e147-aaa7-4bb9-a176-5fe1b15a32a0', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e'}, 'image_ref': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0}, 'message_id': 'a0fa1cfc-d99d-11f0-817e-fa163ebaca0f', 'monotonic_time': 12214.382252668, 'message_signature': '6808a1957f9c30c30e6a3fbb9ae082d430442a16e0f3371d63c7e0fb0ca0f68e'}]}, 'timestamp': '2025-12-15 10:05:48.232558', '_unique_id': '95afa6c86356439e9f4b71559ee4e850'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.233 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.233 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.233 12 ERROR oslo_messaging.notify.messaging yield
Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.233 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.233 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.233 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.233 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.233 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.233 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.233 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.233 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.233 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.233 12 ERROR oslo_messaging.notify.messaging conn.connect()
Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.233 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.233 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.233 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.233 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.233 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.233 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.233 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.233 12 ERROR oslo_messaging.notify.messaging
Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.233 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.233 12 ERROR oslo_messaging.notify.messaging Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.233 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.233 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.233 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.233 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.233 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.233 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.233 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.233 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.233 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 
10:05:48.233 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.233 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.233 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.233 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.233 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.233 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.233 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.233 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.233 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.233 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.233 12 ERROR 
oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.233 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.233 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.233 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.233 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.233 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.233 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.233 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.233 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.233 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.233 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 15 
05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.233 12 ERROR oslo_messaging.notify.messaging Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.233 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.usage in the context of pollsters Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.234 12 DEBUG ceilometer.compute.pollsters [-] 39ff1bd9-6f6b-44c8-bbec-a1fd9d196359/disk.device.usage volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.234 12 DEBUG ceilometer.compute.pollsters [-] 39ff1bd9-6f6b-44c8-bbec-a1fd9d196359/disk.device.usage volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.235 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': '279ea87a-8897-42b2-b382-528dd6515d19', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '1ba5fce347b64bfebf995f187193f205', 'user_name': None, 'project_id': 'c785bf23f53946bc99867d8832a50266', 'project_name': None, 'resource_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359-vda', 'timestamp': '2025-12-15T10:05:48.234065', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359', 'instance_type': 'm1.small', 'host': 'bf4d507aa00b3fcf643a7e0b883420632ac7fc96e8c7a756982e121d', 'instance_host': 'np0005559462.localdomain', 'flavor': {'id': '2da0e147-aaa7-4bb9-a176-5fe1b15a32a0', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e'}, 'image_ref': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'a0fa63e2-d99d-11f0-817e-fa163ebaca0f', 'monotonic_time': 12214.351409616, 'message_signature': 'ccf2425130c86b3729a6827f6b9ca235306bcb881d7ccbe560986c769f273cf4'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '1ba5fce347b64bfebf995f187193f205', 'user_name': None, 'project_id': 'c785bf23f53946bc99867d8832a50266', 'project_name': None, 'resource_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359-vdb', 'timestamp': '2025-12-15T10:05:48.234065', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359', 
'instance_type': 'm1.small', 'host': 'bf4d507aa00b3fcf643a7e0b883420632ac7fc96e8c7a756982e121d', 'instance_host': 'np0005559462.localdomain', 'flavor': {'id': '2da0e147-aaa7-4bb9-a176-5fe1b15a32a0', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e'}, 'image_ref': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'a0fa6df6-d99d-11f0-817e-fa163ebaca0f', 'monotonic_time': 12214.351409616, 'message_signature': 'c4e4c5ad26f58b930310ae5f62414794e77123964a9f0d221ff39a3d0bc8ffda'}]}, 'timestamp': '2025-12-15 10:05:48.234620', '_unique_id': '721e2209e49a415681adcf42e6ecfab5'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.235 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.235 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.235 12 ERROR oslo_messaging.notify.messaging yield Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.235 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.235 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.235 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.235 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.235 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.235 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.235 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.235 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.235 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.235 12 ERROR oslo_messaging.notify.messaging conn.connect() Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.235 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.235 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.235 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Dec 15 05:05:48 localhost 
ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.235 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.235 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.235 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.235 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.235 12 ERROR oslo_messaging.notify.messaging Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.235 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.235 12 ERROR oslo_messaging.notify.messaging Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.235 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.235 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.235 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.235 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 
10:05:48.235 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.235 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.235 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.235 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.235 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.235 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.235 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.235 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.235 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.235 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Dec 15 05:05:48 localhost 
ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.235 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.235 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.235 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.235 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.235 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.235 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.235 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.235 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.235 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.235 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 
10:05:48.235 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.235 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.235 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.235 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.235 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.235 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.235 12 ERROR oslo_messaging.notify.messaging Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.236 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.drop in the context of pollsters Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.236 12 DEBUG ceilometer.compute.pollsters [-] 39ff1bd9-6f6b-44c8-bbec-a1fd9d196359/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.237 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': '82a1a096-be9f-4861-baeb-9064336c40f5', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '1ba5fce347b64bfebf995f187193f205', 'user_name': None, 'project_id': 'c785bf23f53946bc99867d8832a50266', 'project_name': None, 'resource_id': 'instance-00000002-39ff1bd9-6f6b-44c8-bbec-a1fd9d196359-tap03ef8889-32', 'timestamp': '2025-12-15T10:05:48.236244', 'resource_metadata': {'display_name': 'test', 'name': 'tap03ef8889-32', 'instance_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359', 'instance_type': 'm1.small', 'host': 'bf4d507aa00b3fcf643a7e0b883420632ac7fc96e8c7a756982e121d', 'instance_host': 'np0005559462.localdomain', 'flavor': {'id': '2da0e147-aaa7-4bb9-a176-5fe1b15a32a0', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e'}, 'image_ref': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:74:df:7c', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap03ef8889-32'}, 'message_id': 'a0fab838-d99d-11f0-817e-fa163ebaca0f', 'monotonic_time': 12214.316293577, 'message_signature': '62d376af6467482f8f5a04222010a0ecdc22220a234007afaffe7ea09c0ab836'}]}, 'timestamp': '2025-12-15 10:05:48.236544', '_unique_id': '1d0ee7ac2b41427bb23f8417e22e3707'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.237 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 15 05:05:48 localhost 
ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.237 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.237 12 ERROR oslo_messaging.notify.messaging yield Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.237 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.237 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.237 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.237 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.237 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.237 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.237 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.237 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.237 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.237 12 ERROR oslo_messaging.notify.messaging conn.connect() Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.237 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.237 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.237 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.237 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.237 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.237 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.237 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.237 12 ERROR oslo_messaging.notify.messaging Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.237 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.237 12 ERROR oslo_messaging.notify.messaging Dec 15 05:05:48 localhost 
ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.237 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.237 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.237 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.237 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.237 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.237 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.237 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.237 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.237 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.237 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection 
Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.237 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.237 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.237 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.237 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.237 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.237 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.237 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.237 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.237 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.237 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.237 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.237 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.237 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.237 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.237 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.237 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.237 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.237 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.237 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.237 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 15 05:05:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:05:48.237 12 ERROR oslo_messaging.notify.messaging
Dec 15 05:05:48 localhost podman[322004]:
Dec 15 05:05:48 localhost podman[322004]: 2025-12-15 10:05:48.330202524 +0000 UTC m=+0.079425058 container create 14e480052e521f3f47ca69618d29b84505fd6dc436df01e577739eeb12cf31e4 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-c0669abd-aef1-4b0d-9f97-a6adeeac3211, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.build-date=20251202, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 15 05:05:48 localhost systemd[1]: Started libpod-conmon-14e480052e521f3f47ca69618d29b84505fd6dc436df01e577739eeb12cf31e4.scope.
Dec 15 05:05:48 localhost systemd[1]: tmp-crun.tg28mc.mount: Deactivated successfully.
Dec 15 05:05:48 localhost systemd[1]: Started libcrun container.
Dec 15 05:05:48 localhost podman[322004]: 2025-12-15 10:05:48.295106915 +0000 UTC m=+0.044329499 image pull quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified
Dec 15 05:05:48 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/ad74d0b87d707ced8b98f79274199da2c3857ac3fd686af734e9f40c7c339bda/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Dec 15 05:05:48 localhost podman[322004]: 2025-12-15 10:05:48.405014026 +0000 UTC m=+0.154236560 container init 14e480052e521f3f47ca69618d29b84505fd6dc436df01e577739eeb12cf31e4 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-c0669abd-aef1-4b0d-9f97-a6adeeac3211, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.build-date=20251202, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3)
Dec 15 05:05:48 localhost podman[322004]: 2025-12-15 10:05:48.410629947 +0000 UTC m=+0.159852481 container start 14e480052e521f3f47ca69618d29b84505fd6dc436df01e577739eeb12cf31e4 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-c0669abd-aef1-4b0d-9f97-a6adeeac3211, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, tcib_managed=true, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.schema-version=1.0)
Dec 15 05:05:48 localhost dnsmasq[322023]: started, version 2.85 cachesize 150
Dec 15 05:05:48 localhost dnsmasq[322023]: DNS service limited to local subnets
Dec 15 05:05:48 localhost dnsmasq[322023]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile
Dec 15 05:05:48 localhost dnsmasq[322023]: warning: no upstream servers configured
Dec 15 05:05:48 localhost dnsmasq[322023]: read /var/lib/neutron/dhcp/c0669abd-aef1-4b0d-9f97-a6adeeac3211/addn_hosts - 1 addresses
Dec 15 05:05:48 localhost neutron_sriov_agent[260044]: 2025-12-15 10:05:48.616 2 INFO neutron.agent.securitygroups_rpc [None req-00a6acf7-87a4-47e2-afaa-10055df94143 a22542ef31414501844801d3b102584b 54df976b8a364f93b0b8b0128def8f10 - - default default] Security group member updated ['3b08e208-1f02-4d1e-84d0-d090034d8a7c']#033[00m
Dec 15 05:05:48 localhost neutron_dhcp_agent[267542]: 2025-12-15 10:05:48.634 267546 INFO neutron.agent.dhcp.agent [None req-038e5841-e4cb-43fc-92a8-8da9727c4612 - - - - - -] DHCP configuration for ports {'79503367-f53f-4b35-8760-76fcaa4d8407', '47e4e566-8be0-4eb1-a213-8eb668906d35'} is completed#033[00m
Dec 15 05:05:48 localhost dnsmasq[322023]: exiting on receipt of SIGTERM
Dec 15 05:05:48 localhost podman[322041]: 2025-12-15 10:05:48.71487164 +0000 UTC m=+0.052336120 container kill 14e480052e521f3f47ca69618d29b84505fd6dc436df01e577739eeb12cf31e4 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-c0669abd-aef1-4b0d-9f97-a6adeeac3211, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.vendor=CentOS)
Dec 15 05:05:48 localhost systemd[1]: libpod-14e480052e521f3f47ca69618d29b84505fd6dc436df01e577739eeb12cf31e4.scope: Deactivated successfully.
Dec 15 05:05:48 localhost podman[322057]: 2025-12-15 10:05:48.768178894 +0000 UTC m=+0.033462809 container died 14e480052e521f3f47ca69618d29b84505fd6dc436df01e577739eeb12cf31e4 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-c0669abd-aef1-4b0d-9f97-a6adeeac3211, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Dec 15 05:05:48 localhost podman[322057]: 2025-12-15 10:05:48.80795906 +0000 UTC m=+0.073242975 container remove 14e480052e521f3f47ca69618d29b84505fd6dc436df01e577739eeb12cf31e4 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-c0669abd-aef1-4b0d-9f97-a6adeeac3211, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Dec 15 05:05:48 localhost ovn_controller[154603]: 2025-12-15T10:05:48Z|00270|binding|INFO|Releasing lport 1657c15f-3ffc-4b16-9c97-99677e468aee from this chassis (sb_readonly=0)
Dec 15 05:05:48 localhost kernel: device tap1657c15f-3f left promiscuous mode
Dec 15 05:05:48 localhost ovn_controller[154603]: 2025-12-15T10:05:48Z|00271|binding|INFO|Setting lport 1657c15f-3ffc-4b16-9c97-99677e468aee down in Southbound
Dec 15 05:05:48 localhost nova_compute[286344]: 2025-12-15 10:05:48.819 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 15 05:05:48 localhost ovn_metadata_agent[160585]: 2025-12-15 10:05:48.829 160590 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005559462.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8::2/64', 'neutron:device_id': 'dhcpce287b4b-a043-5ed9-8e08-4fd555d768a2-c0669abd-aef1-4b0d-9f97-a6adeeac3211', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-c0669abd-aef1-4b0d-9f97-a6adeeac3211', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '89e710ef9f4f48d48a369002db572947', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'np0005559462.localdomain'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=e47821dc-5f5d-44dc-8a16-54817df4049d, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[], logical_port=1657c15f-3ffc-4b16-9c97-99677e468aee) old=Port_Binding(up=[True], chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Dec 15 05:05:48 localhost systemd[1]: libpod-conmon-14e480052e521f3f47ca69618d29b84505fd6dc436df01e577739eeb12cf31e4.scope: Deactivated successfully.
Dec 15 05:05:48 localhost ovn_metadata_agent[160585]: 2025-12-15 10:05:48.831 160590 INFO neutron.agent.ovn.metadata.agent [-] Port 1657c15f-3ffc-4b16-9c97-99677e468aee in datapath c0669abd-aef1-4b0d-9f97-a6adeeac3211 unbound from our chassis#033[00m
Dec 15 05:05:48 localhost nova_compute[286344]: 2025-12-15 10:05:48.833 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 15 05:05:48 localhost ovn_metadata_agent[160585]: 2025-12-15 10:05:48.834 160590 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network c0669abd-aef1-4b0d-9f97-a6adeeac3211 or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599#033[00m
Dec 15 05:05:48 localhost ovn_metadata_agent[160585]: 2025-12-15 10:05:48.835 160858 DEBUG oslo.privsep.daemon [-] privsep: reply[5064c5be-ef9e-42ac-879f-708c2a37d5fe]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 15 05:05:49 localhost neutron_sriov_agent[260044]: 2025-12-15 10:05:49.170 2 INFO neutron.agent.securitygroups_rpc [None req-7ae6b9fe-5a9a-4767-9908-2856749c5610 52899874f6ba4276802009964d723e15 dcbc6ff58fef4432a793909041cfb08b - - default default] Security group rule updated ['ecbd2763-fa0e-490b-8e4b-4d967816d8de']#033[00m
Dec 15 05:05:49 localhost systemd[1]: var-lib-containers-storage-overlay-ad74d0b87d707ced8b98f79274199da2c3857ac3fd686af734e9f40c7c339bda-merged.mount: Deactivated successfully.
Dec 15 05:05:49 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-14e480052e521f3f47ca69618d29b84505fd6dc436df01e577739eeb12cf31e4-userdata-shm.mount: Deactivated successfully.
Dec 15 05:05:49 localhost systemd[1]: run-netns-qdhcp\x2dc0669abd\x2daef1\x2d4b0d\x2d9f97\x2da6adeeac3211.mount: Deactivated successfully.
Dec 15 05:05:49 localhost neutron_sriov_agent[260044]: 2025-12-15 10:05:49.538 2 INFO neutron.agent.securitygroups_rpc [None req-4606af61-7ec9-4df8-9128-e0c2fde1ca16 52899874f6ba4276802009964d723e15 dcbc6ff58fef4432a793909041cfb08b - - default default] Security group rule updated ['ecbd2763-fa0e-490b-8e4b-4d967816d8de']#033[00m
Dec 15 05:05:49 localhost ceph-mon[298913]: mon.np0005559462@0(leader).osd e165 do_prune osdmap full prune enabled
Dec 15 05:05:49 localhost ceph-mon[298913]: mon.np0005559462@0(leader).osd e166 e166: 6 total, 6 up, 6 in
Dec 15 05:05:49 localhost ceph-mon[298913]: log_channel(cluster) log [DBG] : osdmap e166: 6 total, 6 up, 6 in
Dec 15 05:05:49 localhost neutron_sriov_agent[260044]: 2025-12-15 10:05:49.843 2 INFO neutron.agent.securitygroups_rpc [None req-4a671417-fbf4-432a-8bd0-107957b0a2e1 6b5da6f221214afe93e1fa66574f238b 89e710ef9f4f48d48a369002db572947 - - default default] Security group member updated ['a6c5f808-dddc-4f17-acbf-63b1b6e6f4d6']#033[00m
Dec 15 05:05:50 localhost neutron_sriov_agent[260044]: 2025-12-15 10:05:50.058 2 INFO neutron.agent.securitygroups_rpc [None req-ad58e8ed-a1d8-43b4-88e7-4124096e326b 52899874f6ba4276802009964d723e15 dcbc6ff58fef4432a793909041cfb08b - - default default] Security group rule updated ['ecbd2763-fa0e-490b-8e4b-4d967816d8de']#033[00m
Dec 15 05:05:50 localhost neutron_dhcp_agent[267542]: 2025-12-15 10:05:50.209 267546 INFO neutron.agent.linux.ip_lib [None req-0b4849aa-976f-4f63-ab54-2b8e43de9d09 - - - - - -] Device tapa5b67d2e-7f cannot be used as it has no MAC address#033[00m
Dec 15 05:05:50 localhost nova_compute[286344]: 2025-12-15 10:05:50.226 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 15 05:05:50 localhost kernel: device tapa5b67d2e-7f entered promiscuous mode
Dec 15 05:05:50 localhost systemd-udevd[321951]: Network interface NamePolicy= disabled on kernel command line.
Dec 15 05:05:50 localhost NetworkManager[5963]: [1765793150.2309] manager: (tapa5b67d2e-7f): new Generic device (/org/freedesktop/NetworkManager/Devices/44)
Dec 15 05:05:50 localhost ovn_controller[154603]: 2025-12-15T10:05:50Z|00272|binding|INFO|Claiming lport a5b67d2e-7f9e-460e-99d3-a5b85e403b89 for this chassis.
Dec 15 05:05:50 localhost ovn_controller[154603]: 2025-12-15T10:05:50Z|00273|binding|INFO|a5b67d2e-7f9e-460e-99d3-a5b85e403b89: Claiming unknown
Dec 15 05:05:50 localhost nova_compute[286344]: 2025-12-15 10:05:50.233 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 15 05:05:50 localhost ovn_controller[154603]: 2025-12-15T10:05:50Z|00274|binding|INFO|Setting lport a5b67d2e-7f9e-460e-99d3-a5b85e403b89 ovn-installed in OVS
Dec 15 05:05:50 localhost nova_compute[286344]: 2025-12-15 10:05:50.240 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 15 05:05:50 localhost nova_compute[286344]: 2025-12-15 10:05:50.245 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 15 05:05:50 localhost nova_compute[286344]: 2025-12-15 10:05:50.262 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 15 05:05:50 localhost nova_compute[286344]: 2025-12-15 10:05:50.295 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 15 05:05:50 localhost nova_compute[286344]: 2025-12-15 10:05:50.319 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 15 05:05:50 localhost ovn_controller[154603]: 2025-12-15T10:05:50Z|00275|binding|INFO|Setting lport a5b67d2e-7f9e-460e-99d3-a5b85e403b89 up in Southbound
Dec 15 05:05:50 localhost ovn_metadata_agent[160585]: 2025-12-15 10:05:50.338 160590 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005559462.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8::2/64', 'neutron:device_id': 'dhcpce287b4b-a043-5ed9-8e08-4fd555d768a2-c0669abd-aef1-4b0d-9f97-a6adeeac3211', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-c0669abd-aef1-4b0d-9f97-a6adeeac3211', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '89e710ef9f4f48d48a369002db572947', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=e47821dc-5f5d-44dc-8a16-54817df4049d, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=a5b67d2e-7f9e-460e-99d3-a5b85e403b89) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Dec 15 05:05:50 localhost ovn_metadata_agent[160585]: 2025-12-15 10:05:50.340 160590 INFO neutron.agent.ovn.metadata.agent [-] Port a5b67d2e-7f9e-460e-99d3-a5b85e403b89 in datapath c0669abd-aef1-4b0d-9f97-a6adeeac3211 bound to our chassis#033[00m
Dec 15 05:05:50 localhost ovn_metadata_agent[160585]: 2025-12-15 10:05:50.341 160590 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network c0669abd-aef1-4b0d-9f97-a6adeeac3211 or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599#033[00m
Dec 15 05:05:50 localhost ovn_metadata_agent[160585]: 2025-12-15 10:05:50.342 160858 DEBUG oslo.privsep.daemon [-] privsep: reply[1b9d28de-0c2a-4e8c-bd11-fe77dcb4902c]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 15 05:05:50 localhost nova_compute[286344]: 2025-12-15 10:05:50.353 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 15 05:05:50 localhost ceph-mon[298913]: mon.np0005559462@0(leader).osd e166 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 15 05:05:50 localhost neutron_sriov_agent[260044]: 2025-12-15 10:05:50.892 2 INFO neutron.agent.securitygroups_rpc [None req-5f04f265-5a28-443a-bc16-683caeb1c06b 52899874f6ba4276802009964d723e15 dcbc6ff58fef4432a793909041cfb08b - - default default] Security group rule updated ['ecbd2763-fa0e-490b-8e4b-4d967816d8de']#033[00m
Dec 15 05:05:51 localhost podman[322146]:
Dec 15 05:05:51 localhost podman[322146]: 2025-12-15 10:05:51.088425796 +0000 UTC m=+0.076239570 container create fd469b4f85312810bd0f8c1ba973052bdc3aea904282c0bdfc8dfa5295d8cbde (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-c0669abd-aef1-4b0d-9f97-a6adeeac3211, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Dec 15 05:05:51 localhost systemd[1]: Started /usr/bin/podman healthcheck run 4d46c406a3855fd1c322e5d86c048188ea5865daa07ba23aaebacc7879668923.
Dec 15 05:05:51 localhost systemd[1]: Started libpod-conmon-fd469b4f85312810bd0f8c1ba973052bdc3aea904282c0bdfc8dfa5295d8cbde.scope.
Dec 15 05:05:51 localhost systemd[1]: Started libcrun container.
Dec 15 05:05:51 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/1e71919cdf55381b13e87bf7083d54136a4cda52cc144f1dc4319010643b0236/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Dec 15 05:05:51 localhost podman[322146]: 2025-12-15 10:05:51.151235497 +0000 UTC m=+0.139049301 container init fd469b4f85312810bd0f8c1ba973052bdc3aea904282c0bdfc8dfa5295d8cbde (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-c0669abd-aef1-4b0d-9f97-a6adeeac3211, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202)
Dec 15 05:05:51 localhost podman[322146]: 2025-12-15 10:05:51.055764868 +0000 UTC m=+0.043578702 image pull quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified
Dec 15 05:05:51 localhost systemd[1]: tmp-crun.qfY89P.mount: Deactivated successfully.
Dec 15 05:05:51 localhost podman[322146]: 2025-12-15 10:05:51.164275484 +0000 UTC m=+0.152089298 container start fd469b4f85312810bd0f8c1ba973052bdc3aea904282c0bdfc8dfa5295d8cbde (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-c0669abd-aef1-4b0d-9f97-a6adeeac3211, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Dec 15 05:05:51 localhost dnsmasq[322174]: started, version 2.85 cachesize 150
Dec 15 05:05:51 localhost dnsmasq[322174]: DNS service limited to local subnets
Dec 15 05:05:51 localhost dnsmasq[322174]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile
Dec 15 05:05:51 localhost dnsmasq[322174]: warning: no upstream servers configured
Dec 15 05:05:51 localhost dnsmasq-dhcp[322174]: DHCPv6, static leases only on 2001:db8::, lease time 1d
Dec 15 05:05:51 localhost dnsmasq[322174]: read /var/lib/neutron/dhcp/c0669abd-aef1-4b0d-9f97-a6adeeac3211/addn_hosts - 0 addresses
Dec 15 05:05:51 localhost dnsmasq-dhcp[322174]: read /var/lib/neutron/dhcp/c0669abd-aef1-4b0d-9f97-a6adeeac3211/host
Dec 15 05:05:51 localhost dnsmasq-dhcp[322174]: read /var/lib/neutron/dhcp/c0669abd-aef1-4b0d-9f97-a6adeeac3211/opts
Dec 15 05:05:51 localhost neutron_sriov_agent[260044]: 2025-12-15 10:05:51.193 2 INFO neutron.agent.securitygroups_rpc [None req-79a9f6b0-9fd6-4601-a26e-672e8c600275 52899874f6ba4276802009964d723e15 dcbc6ff58fef4432a793909041cfb08b - - default default] Security group rule updated ['ecbd2763-fa0e-490b-8e4b-4d967816d8de']#033[00m
Dec 15 05:05:51 localhost podman[322160]: 2025-12-15 10:05:51.209309261 +0000 UTC m=+0.080175628 container health_status 4d46c406a3855fd1c322e5d86c048188ea5865daa07ba23aaebacc7879668923 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, container_name=ovn_metadata_agent, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '07f3af15d79276418f54a8dde2fc251064527c3571430e14af77aaa2c01f1193-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.build-date=20251202)
Dec 15 05:05:51 localhost podman[322160]: 2025-12-15 10:05:51.217291301 +0000 UTC m=+0.088157708 container exec_died 4d46c406a3855fd1c322e5d86c048188ea5865daa07ba23aaebacc7879668923 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '07f3af15d79276418f54a8dde2fc251064527c3571430e14af77aaa2c01f1193-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Dec 15 05:05:51 localhost neutron_dhcp_agent[267542]: 2025-12-15 10:05:51.224 267546 INFO neutron.agent.dhcp.agent [None req-0b4849aa-976f-4f63-ab54-2b8e43de9d09 - - - - - -] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-12-15T10:05:46Z, description=, device_id=, device_owner=, dns_assignment=[], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[], id=47e4e566-8be0-4eb1-a213-8eb668906d35, ip_allocation=immediate, mac_address=fa:16:3e:8c:49:17, name=tempest-NetworksTestDHCPv6-1202193498, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-12-15T10:05:39Z, description=, dns_domain=, id=c0669abd-aef1-4b0d-9f97-a6adeeac3211, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-NetworksTestDHCPv6-test-network-1861207414, port_security_enabled=True, project_id=89e710ef9f4f48d48a369002db572947, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=20673, qos_policy_id=None, revision_number=4, router:external=False, shared=False, standard_attr_id=2164, status=ACTIVE, subnets=['b522fd59-3a0c-4ab7-80b4-b3f4867deba9'], tags=[], tenant_id=89e710ef9f4f48d48a369002db572947, updated_at=2025-12-15T10:05:45Z, vlan_transparent=None, network_id=c0669abd-aef1-4b0d-9f97-a6adeeac3211, port_security_enabled=True, project_id=89e710ef9f4f48d48a369002db572947, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=['a6c5f808-dddc-4f17-acbf-63b1b6e6f4d6'], standard_attr_id=2214, status=DOWN, tags=[], tenant_id=89e710ef9f4f48d48a369002db572947, updated_at=2025-12-15T10:05:46Z on network c0669abd-aef1-4b0d-9f97-a6adeeac3211#033[00m
Dec 15 05:05:51 localhost systemd[1]: 4d46c406a3855fd1c322e5d86c048188ea5865daa07ba23aaebacc7879668923.service: Deactivated successfully.
Dec 15 05:05:51 localhost neutron_dhcp_agent[267542]: 2025-12-15 10:05:51.292 267546 INFO neutron.agent.dhcp.agent [None req-15af173a-4166-4b27-8cb6-e1f1b54225cf - - - - - -] DHCP configuration for ports {'79503367-f53f-4b35-8760-76fcaa4d8407'} is completed#033[00m
Dec 15 05:05:51 localhost dnsmasq[322174]: read /var/lib/neutron/dhcp/c0669abd-aef1-4b0d-9f97-a6adeeac3211/addn_hosts - 1 addresses
Dec 15 05:05:51 localhost podman[322198]: 2025-12-15 10:05:51.378624407 +0000 UTC m=+0.049653424 container kill fd469b4f85312810bd0f8c1ba973052bdc3aea904282c0bdfc8dfa5295d8cbde (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-c0669abd-aef1-4b0d-9f97-a6adeeac3211, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 15 05:05:51 localhost dnsmasq-dhcp[322174]: read /var/lib/neutron/dhcp/c0669abd-aef1-4b0d-9f97-a6adeeac3211/host
Dec 15 05:05:51 localhost dnsmasq-dhcp[322174]: read /var/lib/neutron/dhcp/c0669abd-aef1-4b0d-9f97-a6adeeac3211/opts
Dec 15 05:05:51 localhost ovn_metadata_agent[160585]: 2025-12-15 10:05:51.481 160590 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec 15 05:05:51 localhost ovn_metadata_agent[160585]: 2025-12-15 10:05:51.482 160590 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec 15 05:05:51 localhost ovn_metadata_agent[160585]: 2025-12-15 10:05:51.483 160590 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec 15 05:05:51 localhost neutron_dhcp_agent[267542]: 2025-12-15 10:05:51.526 267546 INFO neutron.agent.dhcp.agent [None req-0b4849aa-976f-4f63-ab54-2b8e43de9d09 - - - - - -] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-12-15T10:05:49Z, description=, device_id=, device_owner=, dns_assignment=[], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[], id=48ff6594-e07b-4dbb-965d-733bcafe4623, ip_allocation=immediate, mac_address=fa:16:3e:2a:b7:c9, name=tempest-NetworksTestDHCPv6-657894070, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-12-15T10:05:39Z, description=, dns_domain=, id=c0669abd-aef1-4b0d-9f97-a6adeeac3211, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-NetworksTestDHCPv6-test-network-1861207414, port_security_enabled=True, project_id=89e710ef9f4f48d48a369002db572947, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=20673, qos_policy_id=None, revision_number=6, router:external=False, shared=False, standard_attr_id=2164, status=ACTIVE, subnets=['5c146605-4188-4173-a7e3-1d80fdca5d07'], tags=[], tenant_id=89e710ef9f4f48d48a369002db572947, updated_at=2025-12-15T10:05:48Z, vlan_transparent=None, network_id=c0669abd-aef1-4b0d-9f97-a6adeeac3211, port_security_enabled=True, project_id=89e710ef9f4f48d48a369002db572947, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=['a6c5f808-dddc-4f17-acbf-63b1b6e6f4d6'],
standard_attr_id=2231, status=DOWN, tags=[], tenant_id=89e710ef9f4f48d48a369002db572947, updated_at=2025-12-15T10:05:49Z on network c0669abd-aef1-4b0d-9f97-a6adeeac3211#033[00m Dec 15 05:05:51 localhost neutron_sriov_agent[260044]: 2025-12-15 10:05:51.558 2 INFO neutron.agent.securitygroups_rpc [None req-0acfdc7a-43d0-4f9f-972a-9bfa74b7ab73 6b5da6f221214afe93e1fa66574f238b 89e710ef9f4f48d48a369002db572947 - - default default] Security group member updated ['a6c5f808-dddc-4f17-acbf-63b1b6e6f4d6']#033[00m Dec 15 05:05:51 localhost neutron_dhcp_agent[267542]: 2025-12-15 10:05:51.615 267546 INFO neutron.agent.dhcp.agent [None req-1d0070f3-bdb8-40ee-a959-b412663eafe2 - - - - - -] DHCP configuration for ports {'47e4e566-8be0-4eb1-a213-8eb668906d35'} is completed#033[00m Dec 15 05:05:51 localhost ceph-mon[298913]: mon.np0005559462@0(leader).osd e166 do_prune osdmap full prune enabled Dec 15 05:05:51 localhost dnsmasq[322174]: read /var/lib/neutron/dhcp/c0669abd-aef1-4b0d-9f97-a6adeeac3211/addn_hosts - 2 addresses Dec 15 05:05:51 localhost dnsmasq-dhcp[322174]: read /var/lib/neutron/dhcp/c0669abd-aef1-4b0d-9f97-a6adeeac3211/host Dec 15 05:05:51 localhost podman[322236]: 2025-12-15 10:05:51.716424131 +0000 UTC m=+0.061985292 container kill fd469b4f85312810bd0f8c1ba973052bdc3aea904282c0bdfc8dfa5295d8cbde (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-c0669abd-aef1-4b0d-9f97-a6adeeac3211, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, org.label-schema.build-date=20251202) Dec 15 05:05:51 localhost dnsmasq-dhcp[322174]: read /var/lib/neutron/dhcp/c0669abd-aef1-4b0d-9f97-a6adeeac3211/opts Dec 15 05:05:51 localhost ceph-mon[298913]: mon.np0005559462@0(leader).osd e167 
e167: 6 total, 6 up, 6 in Dec 15 05:05:51 localhost ceph-mon[298913]: log_channel(cluster) log [DBG] : osdmap e167: 6 total, 6 up, 6 in Dec 15 05:05:51 localhost ovn_controller[154603]: 2025-12-15T10:05:51Z|00276|binding|INFO|Releasing lport a5b67d2e-7f9e-460e-99d3-a5b85e403b89 from this chassis (sb_readonly=0) Dec 15 05:05:51 localhost nova_compute[286344]: 2025-12-15 10:05:51.890 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:05:51 localhost kernel: device tapa5b67d2e-7f left promiscuous mode Dec 15 05:05:51 localhost ovn_controller[154603]: 2025-12-15T10:05:51Z|00277|binding|INFO|Setting lport a5b67d2e-7f9e-460e-99d3-a5b85e403b89 down in Southbound Dec 15 05:05:51 localhost ovn_metadata_agent[160585]: 2025-12-15 10:05:51.900 160590 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005559462.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8::2/64', 'neutron:device_id': 'dhcpce287b4b-a043-5ed9-8e08-4fd555d768a2-c0669abd-aef1-4b0d-9f97-a6adeeac3211', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-c0669abd-aef1-4b0d-9f97-a6adeeac3211', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '89e710ef9f4f48d48a369002db572947', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'np0005559462.localdomain'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], 
datapath=e47821dc-5f5d-44dc-8a16-54817df4049d, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=a5b67d2e-7f9e-460e-99d3-a5b85e403b89) old=Port_Binding(up=[True], chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Dec 15 05:05:51 localhost ovn_metadata_agent[160585]: 2025-12-15 10:05:51.902 160590 INFO neutron.agent.ovn.metadata.agent [-] Port a5b67d2e-7f9e-460e-99d3-a5b85e403b89 in datapath c0669abd-aef1-4b0d-9f97-a6adeeac3211 unbound from our chassis#033[00m Dec 15 05:05:51 localhost ovn_metadata_agent[160585]: 2025-12-15 10:05:51.904 160590 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network c0669abd-aef1-4b0d-9f97-a6adeeac3211 or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599#033[00m Dec 15 05:05:51 localhost ovn_metadata_agent[160585]: 2025-12-15 10:05:51.905 160858 DEBUG oslo.privsep.daemon [-] privsep: reply[9a010f52-bec0-4e57-a631-b894434e76a7]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Dec 15 05:05:51 localhost nova_compute[286344]: 2025-12-15 10:05:51.911 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:05:51 localhost neutron_sriov_agent[260044]: 2025-12-15 10:05:51.946 2 INFO neutron.agent.securitygroups_rpc [None req-78605fd3-3e51-4e95-924a-4d77fe9c417f 52899874f6ba4276802009964d723e15 dcbc6ff58fef4432a793909041cfb08b - - default default] Security group rule updated ['ecbd2763-fa0e-490b-8e4b-4d967816d8de']#033[00m Dec 15 05:05:51 localhost neutron_dhcp_agent[267542]: 2025-12-15 10:05:51.982 267546 INFO neutron.agent.dhcp.agent [None req-1193c1bc-4ba4-470c-9e8d-6948786d9a31 - - - - - -] DHCP configuration for ports {'48ff6594-e07b-4dbb-965d-733bcafe4623'} is completed#033[00m 
Dec 15 05:05:51 localhost neutron_sriov_agent[260044]: 2025-12-15 10:05:51.991 2 INFO neutron.agent.securitygroups_rpc [None req-c48f4d0e-0bae-4357-864c-65b491f9e7ce a22542ef31414501844801d3b102584b 54df976b8a364f93b0b8b0128def8f10 - - default default] Security group member updated ['3b08e208-1f02-4d1e-84d0-d090034d8a7c']#033[00m
Dec 15 05:05:52 localhost dnsmasq[322174]: read /var/lib/neutron/dhcp/c0669abd-aef1-4b0d-9f97-a6adeeac3211/addn_hosts - 1 addresses
Dec 15 05:05:52 localhost podman[322278]: 2025-12-15 10:05:52.101129647 +0000 UTC m=+0.062162436 container kill fd469b4f85312810bd0f8c1ba973052bdc3aea904282c0bdfc8dfa5295d8cbde (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-c0669abd-aef1-4b0d-9f97-a6adeeac3211, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Dec 15 05:05:52 localhost dnsmasq-dhcp[322174]: read /var/lib/neutron/dhcp/c0669abd-aef1-4b0d-9f97-a6adeeac3211/host
Dec 15 05:05:52 localhost dnsmasq-dhcp[322174]: read /var/lib/neutron/dhcp/c0669abd-aef1-4b0d-9f97-a6adeeac3211/opts
Dec 15 05:05:52 localhost neutron_dhcp_agent[267542]: 2025-12-15 10:05:52.128 267546 ERROR neutron.agent.dhcp.agent [None req-0b4849aa-976f-4f63-ab54-2b8e43de9d09 - - - - - -] Unable to reload_allocations dhcp for c0669abd-aef1-4b0d-9f97-a6adeeac3211.: neutron.privileged.agent.linux.ip_lib.NetworkInterfaceNotFound: Network interface tapa5b67d2e-7f not found in namespace qdhcp-c0669abd-aef1-4b0d-9f97-a6adeeac3211.
Dec 15 05:05:52 localhost neutron_dhcp_agent[267542]: 2025-12-15 10:05:52.128 267546 ERROR neutron.agent.dhcp.agent Traceback (most recent call last):
Dec 15 05:05:52 localhost neutron_dhcp_agent[267542]: 2025-12-15 10:05:52.128 267546 ERROR neutron.agent.dhcp.agent File "/usr/lib/python3.9/site-packages/neutron/agent/dhcp/agent.py", line 264, in _call_driver
Dec 15 05:05:52 localhost neutron_dhcp_agent[267542]: 2025-12-15 10:05:52.128 267546 ERROR neutron.agent.dhcp.agent rv = getattr(driver, action)(**action_kwargs)
Dec 15 05:05:52 localhost neutron_dhcp_agent[267542]: 2025-12-15 10:05:52.128 267546 ERROR neutron.agent.dhcp.agent File "/usr/lib/python3.9/site-packages/neutron/agent/linux/dhcp.py", line 673, in reload_allocations
Dec 15 05:05:52 localhost neutron_dhcp_agent[267542]: 2025-12-15 10:05:52.128 267546 ERROR neutron.agent.dhcp.agent self.device_manager.update(self.network, self.interface_name)
Dec 15 05:05:52 localhost neutron_dhcp_agent[267542]: 2025-12-15 10:05:52.128 267546 ERROR neutron.agent.dhcp.agent File "/usr/lib/python3.9/site-packages/neutron/agent/linux/dhcp.py", line 1899, in update
Dec 15 05:05:52 localhost neutron_dhcp_agent[267542]: 2025-12-15 10:05:52.128 267546 ERROR neutron.agent.dhcp.agent self._set_default_route(network, device_name)
Dec 15 05:05:52 localhost neutron_dhcp_agent[267542]: 2025-12-15 10:05:52.128 267546 ERROR neutron.agent.dhcp.agent File "/usr/lib/python3.9/site-packages/neutron/agent/linux/dhcp.py", line 1610, in _set_default_route
Dec 15 05:05:52 localhost neutron_dhcp_agent[267542]: 2025-12-15 10:05:52.128 267546 ERROR neutron.agent.dhcp.agent self._set_default_route_ip_version(network, device_name,
Dec 15 05:05:52 localhost neutron_dhcp_agent[267542]: 2025-12-15 10:05:52.128 267546 ERROR neutron.agent.dhcp.agent File "/usr/lib/python3.9/site-packages/neutron/agent/linux/dhcp.py", line 1539, in _set_default_route_ip_version
Dec 15 05:05:52 localhost neutron_dhcp_agent[267542]: 2025-12-15 10:05:52.128 267546 ERROR neutron.agent.dhcp.agent gateway = device.route.get_gateway(ip_version=ip_version)
Dec 15 05:05:52 localhost neutron_dhcp_agent[267542]: 2025-12-15 10:05:52.128 267546 ERROR neutron.agent.dhcp.agent File "/usr/lib/python3.9/site-packages/neutron/agent/linux/ip_lib.py", line 671, in get_gateway
Dec 15 05:05:52 localhost neutron_dhcp_agent[267542]: 2025-12-15 10:05:52.128 267546 ERROR neutron.agent.dhcp.agent routes = self.list_routes(ip_version, scope=scope, table=table)
Dec 15 05:05:52 localhost neutron_dhcp_agent[267542]: 2025-12-15 10:05:52.128 267546 ERROR neutron.agent.dhcp.agent File "/usr/lib/python3.9/site-packages/neutron/agent/linux/ip_lib.py", line 656, in list_routes
Dec 15 05:05:52 localhost neutron_dhcp_agent[267542]: 2025-12-15 10:05:52.128 267546 ERROR neutron.agent.dhcp.agent return list_ip_routes(self._parent.namespace, ip_version, scope=scope,
Dec 15 05:05:52 localhost neutron_dhcp_agent[267542]: 2025-12-15 10:05:52.128 267546 ERROR neutron.agent.dhcp.agent File "/usr/lib/python3.9/site-packages/neutron/agent/linux/ip_lib.py", line 1611, in list_ip_routes
Dec 15 05:05:52 localhost neutron_dhcp_agent[267542]: 2025-12-15 10:05:52.128 267546 ERROR neutron.agent.dhcp.agent routes = privileged.list_ip_routes(namespace, ip_version, device=device,
Dec 15 05:05:52 localhost neutron_dhcp_agent[267542]: 2025-12-15 10:05:52.128 267546 ERROR neutron.agent.dhcp.agent File "/usr/lib/python3.9/site-packages/tenacity/__init__.py", line 333, in wrapped_f
Dec 15 05:05:52 localhost neutron_dhcp_agent[267542]: 2025-12-15 10:05:52.128 267546 ERROR neutron.agent.dhcp.agent return self(f, *args, **kw)
Dec 15 05:05:52 localhost neutron_dhcp_agent[267542]: 2025-12-15 10:05:52.128 267546 ERROR neutron.agent.dhcp.agent File "/usr/lib/python3.9/site-packages/tenacity/__init__.py", line 423, in __call__
Dec 15 05:05:52 localhost neutron_dhcp_agent[267542]: 2025-12-15 10:05:52.128 267546 ERROR neutron.agent.dhcp.agent do = self.iter(retry_state=retry_state)
Dec 15 05:05:52 localhost neutron_dhcp_agent[267542]: 2025-12-15 10:05:52.128 267546 ERROR neutron.agent.dhcp.agent File "/usr/lib/python3.9/site-packages/tenacity/__init__.py", line 360, in iter
Dec 15 05:05:52 localhost neutron_dhcp_agent[267542]: 2025-12-15 10:05:52.128 267546 ERROR neutron.agent.dhcp.agent return fut.result()
Dec 15 05:05:52 localhost neutron_dhcp_agent[267542]: 2025-12-15 10:05:52.128 267546 ERROR neutron.agent.dhcp.agent File "/usr/lib64/python3.9/concurrent/futures/_base.py", line 439, in result
Dec 15 05:05:52 localhost neutron_dhcp_agent[267542]: 2025-12-15 10:05:52.128 267546 ERROR neutron.agent.dhcp.agent return self.__get_result()
Dec 15 05:05:52 localhost neutron_dhcp_agent[267542]: 2025-12-15 10:05:52.128 267546 ERROR neutron.agent.dhcp.agent File "/usr/lib64/python3.9/concurrent/futures/_base.py", line 391, in __get_result
Dec 15 05:05:52 localhost neutron_dhcp_agent[267542]: 2025-12-15 10:05:52.128 267546 ERROR neutron.agent.dhcp.agent raise self._exception
Dec 15 05:05:52 localhost neutron_dhcp_agent[267542]: 2025-12-15 10:05:52.128 267546 ERROR neutron.agent.dhcp.agent File "/usr/lib/python3.9/site-packages/tenacity/__init__.py", line 426, in __call__
Dec 15 05:05:52 localhost neutron_dhcp_agent[267542]: 2025-12-15 10:05:52.128 267546 ERROR neutron.agent.dhcp.agent result = fn(*args, **kwargs)
Dec 15 05:05:52 localhost neutron_dhcp_agent[267542]: 2025-12-15 10:05:52.128 267546 ERROR neutron.agent.dhcp.agent File "/usr/lib/python3.9/site-packages/oslo_privsep/priv_context.py", line 271, in _wrap
Dec 15 05:05:52 localhost neutron_dhcp_agent[267542]: 2025-12-15 10:05:52.128 267546 ERROR neutron.agent.dhcp.agent return self.channel.remote_call(name, args, kwargs,
Dec 15 05:05:52 localhost neutron_dhcp_agent[267542]: 2025-12-15 10:05:52.128 267546 ERROR neutron.agent.dhcp.agent File "/usr/lib/python3.9/site-packages/oslo_privsep/daemon.py", line 215, in remote_call
Dec 15 05:05:52 localhost neutron_dhcp_agent[267542]: 2025-12-15 10:05:52.128 267546 ERROR neutron.agent.dhcp.agent raise exc_type(*result[2])
Dec 15 05:05:52 localhost neutron_dhcp_agent[267542]: 2025-12-15 10:05:52.128 267546 ERROR neutron.agent.dhcp.agent neutron.privileged.agent.linux.ip_lib.NetworkInterfaceNotFound: Network interface tapa5b67d2e-7f not found in namespace qdhcp-c0669abd-aef1-4b0d-9f97-a6adeeac3211.
Dec 15 05:05:52 localhost neutron_dhcp_agent[267542]: 2025-12-15 10:05:52.128 267546 ERROR neutron.agent.dhcp.agent #033[00m
Dec 15 05:05:52 localhost neutron_dhcp_agent[267542]: 2025-12-15 10:05:52.135 267546 INFO neutron.agent.dhcp.agent [-] Synchronizing state#033[00m
Dec 15 05:05:52 localhost neutron_sriov_agent[260044]: 2025-12-15 10:05:52.542 2 INFO neutron.agent.securitygroups_rpc [None req-ea77140a-b14a-497c-ab1f-f0fa2e60c12f 52899874f6ba4276802009964d723e15 dcbc6ff58fef4432a793909041cfb08b - - default default] Security group rule updated ['ecbd2763-fa0e-490b-8e4b-4d967816d8de']#033[00m
Dec 15 05:05:52 localhost neutron_dhcp_agent[267542]: 2025-12-15 10:05:52.694 267546 INFO neutron.agent.dhcp.agent [None req-10f9e7a7-1f4e-41b3-bb30-b54303ec51bb - - - - - -] All active networks have been fetched through RPC.#033[00m
Dec 15 05:05:52 localhost neutron_dhcp_agent[267542]: 2025-12-15 10:05:52.696 267546 INFO neutron.agent.dhcp.agent [-] Starting network c0669abd-aef1-4b0d-9f97-a6adeeac3211 dhcp configuration#033[00m
Dec 15 05:05:52 localhost neutron_dhcp_agent[267542]: 2025-12-15 10:05:52.696 267546 INFO neutron.agent.dhcp.agent [-] Finished network c0669abd-aef1-4b0d-9f97-a6adeeac3211 dhcp configuration#033[00m
Dec 15 05:05:52 localhost neutron_dhcp_agent[267542]: 2025-12-15 10:05:52.697 267546 INFO neutron.agent.dhcp.agent [None req-10f9e7a7-1f4e-41b3-bb30-b54303ec51bb - - - - - -] Synchronizing state complete#033[00m
Dec 15 05:05:52 localhost neutron_dhcp_agent[267542]: 2025-12-15 10:05:52.875 267546 INFO neutron.agent.dhcp.agent [None req-850f5cec-715a-4b8c-abf5-2547656340d9 - - - - - -] DHCP configuration for ports
{'79503367-f53f-4b35-8760-76fcaa4d8407', '48ff6594-e07b-4dbb-965d-733bcafe4623'} is completed#033[00m Dec 15 05:05:53 localhost nova_compute[286344]: 2025-12-15 10:05:53.016 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:05:53 localhost dnsmasq[322174]: exiting on receipt of SIGTERM Dec 15 05:05:53 localhost podman[322308]: 2025-12-15 10:05:53.024631507 +0000 UTC m=+0.109932491 container kill fd469b4f85312810bd0f8c1ba973052bdc3aea904282c0bdfc8dfa5295d8cbde (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-c0669abd-aef1-4b0d-9f97-a6adeeac3211, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team) Dec 15 05:05:53 localhost systemd[1]: tmp-crun.mUg4L7.mount: Deactivated successfully. Dec 15 05:05:53 localhost systemd[1]: libpod-fd469b4f85312810bd0f8c1ba973052bdc3aea904282c0bdfc8dfa5295d8cbde.scope: Deactivated successfully. 
Dec 15 05:05:53 localhost podman[322321]: 2025-12-15 10:05:53.092454094 +0000 UTC m=+0.056224258 container died fd469b4f85312810bd0f8c1ba973052bdc3aea904282c0bdfc8dfa5295d8cbde (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-c0669abd-aef1-4b0d-9f97-a6adeeac3211, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251202, io.buildah.version=1.41.3)
Dec 15 05:05:53 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-fd469b4f85312810bd0f8c1ba973052bdc3aea904282c0bdfc8dfa5295d8cbde-userdata-shm.mount: Deactivated successfully.
Dec 15 05:05:53 localhost systemd[1]: var-lib-containers-storage-overlay-1e71919cdf55381b13e87bf7083d54136a4cda52cc144f1dc4319010643b0236-merged.mount: Deactivated successfully.
Dec 15 05:05:53 localhost podman[322321]: 2025-12-15 10:05:53.126435144 +0000 UTC m=+0.090205218 container cleanup fd469b4f85312810bd0f8c1ba973052bdc3aea904282c0bdfc8dfa5295d8cbde (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-c0669abd-aef1-4b0d-9f97-a6adeeac3211, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0)
Dec 15 05:05:53 localhost systemd[1]: libpod-conmon-fd469b4f85312810bd0f8c1ba973052bdc3aea904282c0bdfc8dfa5295d8cbde.scope: Deactivated successfully.
Dec 15 05:05:53 localhost podman[322328]: 2025-12-15 10:05:53.165901422 +0000 UTC m=+0.115453000 container remove fd469b4f85312810bd0f8c1ba973052bdc3aea904282c0bdfc8dfa5295d8cbde (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-c0669abd-aef1-4b0d-9f97-a6adeeac3211, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.vendor=CentOS)
Dec 15 05:05:53 localhost neutron_sriov_agent[260044]: 2025-12-15 10:05:53.371 2 INFO neutron.agent.securitygroups_rpc [None req-8d247871-8164-466c-be6d-9da4c8669a8a 52899874f6ba4276802009964d723e15 dcbc6ff58fef4432a793909041cfb08b - - default default] Security group rule updated ['ecbd2763-fa0e-490b-8e4b-4d967816d8de']#033[00m
Dec 15 05:05:53 localhost systemd[1]: run-netns-qdhcp\x2dc0669abd\x2daef1\x2d4b0d\x2d9f97\x2da6adeeac3211.mount: Deactivated successfully.
Dec 15 05:05:53 localhost neutron_sriov_agent[260044]: 2025-12-15 10:05:53.726 2 INFO neutron.agent.securitygroups_rpc [None req-2dcee6a5-9c1e-4393-a604-9b6cb00e1625 52899874f6ba4276802009964d723e15 dcbc6ff58fef4432a793909041cfb08b - - default default] Security group rule updated ['ecbd2763-fa0e-490b-8e4b-4d967816d8de']#033[00m
Dec 15 05:05:53 localhost ceph-mon[298913]: mon.np0005559462@0(leader).osd e167 do_prune osdmap full prune enabled
Dec 15 05:05:53 localhost ceph-mon[298913]: mon.np0005559462@0(leader).osd e168 e168: 6 total, 6 up, 6 in
Dec 15 05:05:53 localhost ceph-mon[298913]: log_channel(cluster) log [DBG] : osdmap e168: 6 total, 6 up, 6 in
Dec 15 05:05:54 localhost neutron_sriov_agent[260044]: 2025-12-15 10:05:54.056 2 INFO neutron.agent.securitygroups_rpc [None req-a43d70f6-69d9-467f-8768-432fe6e67325 52899874f6ba4276802009964d723e15 dcbc6ff58fef4432a793909041cfb08b - - default default] Security group rule updated ['ecbd2763-fa0e-490b-8e4b-4d967816d8de']#033[00m
Dec 15 05:05:54 localhost neutron_sriov_agent[260044]: 2025-12-15 10:05:54.677 2 INFO neutron.agent.securitygroups_rpc [None req-f0592682-9e08-43d4-b4c5-da0ddd568c3d 6b5da6f221214afe93e1fa66574f238b 89e710ef9f4f48d48a369002db572947 - - default default] Security group member updated ['a6c5f808-dddc-4f17-acbf-63b1b6e6f4d6']#033[00m
Dec 15 05:05:54 localhost ceph-mon[298913]: mon.np0005559462@0(leader).osd e168 do_prune osdmap full prune enabled
Dec 15 05:05:54 localhost ceph-mon[298913]: mon.np0005559462@0(leader).osd e169 e169: 6 total, 6 up, 6 in
Dec 15 05:05:54 localhost ceph-mon[298913]: log_channel(cluster) log [DBG] : osdmap e169: 6 total, 6 up, 6 in
Dec 15 05:05:54 localhost systemd[1]: Started /usr/bin/podman healthcheck run a4e94bd7aaee6c174c601060fb5f34559019ccea93fc397263ee3735731cfc1e.
Dec 15 05:05:55 localhost neutron_dhcp_agent[267542]: 2025-12-15 10:05:55.036 267546 INFO neutron.agent.linux.ip_lib [None req-fb531df4-cf29-4a8f-88fe-22a15c0a17fd - - - - - -] Device tap0f45f416-52 cannot be used as it has no MAC address#033[00m Dec 15 05:05:55 localhost podman[322354]: 2025-12-15 10:05:55.045297412 +0000 UTC m=+0.086496006 container health_status a4e94bd7aaee6c174c601060fb5f34559019ccea93fc397263ee3735731cfc1e (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, container_name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter) Dec 15 05:05:55 localhost podman[322354]: 2025-12-15 10:05:55.054533712 +0000 UTC m=+0.095732286 container exec_died a4e94bd7aaee6c174c601060fb5f34559019ccea93fc397263ee3735731cfc1e (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 
'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter) Dec 15 05:05:55 localhost nova_compute[286344]: 2025-12-15 10:05:55.058 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:05:55 localhost kernel: device tap0f45f416-52 entered promiscuous mode Dec 15 05:05:55 localhost NetworkManager[5963]: [1765793155.0638] manager: (tap0f45f416-52): new Generic device (/org/freedesktop/NetworkManager/Devices/45) Dec 15 05:05:55 localhost systemd-udevd[322385]: Network interface NamePolicy= disabled on kernel command line. Dec 15 05:05:55 localhost nova_compute[286344]: 2025-12-15 10:05:55.068 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:05:55 localhost ovn_controller[154603]: 2025-12-15T10:05:55Z|00278|binding|INFO|Claiming lport 0f45f416-52c0-4fd8-bccb-81241b5696b6 for this chassis. Dec 15 05:05:55 localhost ovn_controller[154603]: 2025-12-15T10:05:55Z|00279|binding|INFO|0f45f416-52c0-4fd8-bccb-81241b5696b6: Claiming unknown Dec 15 05:05:55 localhost systemd[1]: a4e94bd7aaee6c174c601060fb5f34559019ccea93fc397263ee3735731cfc1e.service: Deactivated successfully. 
Dec 15 05:05:55 localhost ovn_metadata_agent[160585]: 2025-12-15 10:05:55.077 160590 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005559462.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8::2/64', 'neutron:device_id': 'dhcpce287b4b-a043-5ed9-8e08-4fd555d768a2-c0669abd-aef1-4b0d-9f97-a6adeeac3211', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-c0669abd-aef1-4b0d-9f97-a6adeeac3211', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '89e710ef9f4f48d48a369002db572947', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=e47821dc-5f5d-44dc-8a16-54817df4049d, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=0f45f416-52c0-4fd8-bccb-81241b5696b6) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Dec 15 05:05:55 localhost ovn_controller[154603]: 2025-12-15T10:05:55Z|00280|binding|INFO|Setting lport 0f45f416-52c0-4fd8-bccb-81241b5696b6 ovn-installed in OVS Dec 15 05:05:55 localhost ovn_controller[154603]: 2025-12-15T10:05:55Z|00281|binding|INFO|Setting lport 0f45f416-52c0-4fd8-bccb-81241b5696b6 up in Southbound Dec 15 05:05:55 localhost nova_compute[286344]: 2025-12-15 10:05:55.079 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:05:55 localhost 
ovn_metadata_agent[160585]: 2025-12-15 10:05:55.080 160590 INFO neutron.agent.ovn.metadata.agent [-] Port 0f45f416-52c0-4fd8-bccb-81241b5696b6 in datapath c0669abd-aef1-4b0d-9f97-a6adeeac3211 bound to our chassis#033[00m Dec 15 05:05:55 localhost ovn_metadata_agent[160585]: 2025-12-15 10:05:55.082 160590 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network c0669abd-aef1-4b0d-9f97-a6adeeac3211 or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599#033[00m Dec 15 05:05:55 localhost ovn_metadata_agent[160585]: 2025-12-15 10:05:55.083 160858 DEBUG oslo.privsep.daemon [-] privsep: reply[3e63b896-eb92-4984-a81d-0cadf290e168]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Dec 15 05:05:55 localhost journal[231322]: ethtool ioctl error on tap0f45f416-52: No such device Dec 15 05:05:55 localhost journal[231322]: ethtool ioctl error on tap0f45f416-52: No such device Dec 15 05:05:55 localhost nova_compute[286344]: 2025-12-15 10:05:55.105 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:05:55 localhost journal[231322]: ethtool ioctl error on tap0f45f416-52: No such device Dec 15 05:05:55 localhost journal[231322]: ethtool ioctl error on tap0f45f416-52: No such device Dec 15 05:05:55 localhost journal[231322]: ethtool ioctl error on tap0f45f416-52: No such device Dec 15 05:05:55 localhost journal[231322]: ethtool ioctl error on tap0f45f416-52: No such device Dec 15 05:05:55 localhost journal[231322]: ethtool ioctl error on tap0f45f416-52: No such device Dec 15 05:05:55 localhost journal[231322]: ethtool ioctl error on tap0f45f416-52: No such device Dec 15 05:05:55 localhost nova_compute[286344]: 2025-12-15 10:05:55.147 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 
__log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:05:55 localhost nova_compute[286344]: 2025-12-15 10:05:55.179 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:05:55 localhost nova_compute[286344]: 2025-12-15 10:05:55.392 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:05:55 localhost neutron_sriov_agent[260044]: 2025-12-15 10:05:55.530 2 INFO neutron.agent.securitygroups_rpc [None req-731aa473-c639-44c9-a380-1858037ad957 6b5da6f221214afe93e1fa66574f238b 89e710ef9f4f48d48a369002db572947 - - default default] Security group member updated ['a6c5f808-dddc-4f17-acbf-63b1b6e6f4d6']#033[00m Dec 15 05:05:55 localhost ceph-mon[298913]: mon.np0005559462@0(leader) e17 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) Dec 15 05:05:55 localhost ceph-mon[298913]: log_channel(audit) log [DBG] : from='client.25625 172.18.0.34:0/382777224' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch Dec 15 05:05:55 localhost neutron_sriov_agent[260044]: 2025-12-15 10:05:55.656 2 INFO neutron.agent.securitygroups_rpc [None req-02aa9f8f-25b5-4f55-bfe8-a7b040e52a3d 52899874f6ba4276802009964d723e15 dcbc6ff58fef4432a793909041cfb08b - - default default] Security group rule updated ['7e9480e1-cc7c-42fd-9d23-573e0c5c416b']#033[00m Dec 15 05:05:55 localhost ceph-mon[298913]: mon.np0005559462@0(leader).osd e169 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Dec 15 05:05:55 localhost ceph-mon[298913]: mon.np0005559462@0(leader).osd e169 do_prune osdmap full prune enabled Dec 15 05:05:55 localhost ceph-mon[298913]: mon.np0005559462@0(leader).osd e170 e170: 6 total, 6 up, 6 in Dec 15 05:05:55 localhost ceph-mon[298913]: log_channel(cluster) log [DBG] : osdmap 
e170: 6 total, 6 up, 6 in Dec 15 05:05:55 localhost ceph-mon[298913]: log_channel(cluster) log [DBG] : mgrmap e48: np0005559464.aomnqe(active, since 8m), standbys: np0005559462.fudvyx, np0005559463.daptkf Dec 15 05:05:55 localhost podman[322457]: Dec 15 05:05:55 localhost podman[322457]: 2025-12-15 10:05:55.983712984 +0000 UTC m=+0.065316405 container create 9cf45cd141d1d7461c8f30a43a38da6a96161bd4993a40f38cbda77de2611cbb (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-c0669abd-aef1-4b0d-9f97-a6adeeac3211, org.label-schema.build-date=20251202, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.schema-version=1.0) Dec 15 05:05:56 localhost systemd[1]: Started libpod-conmon-9cf45cd141d1d7461c8f30a43a38da6a96161bd4993a40f38cbda77de2611cbb.scope. Dec 15 05:05:56 localhost podman[322457]: 2025-12-15 10:05:55.948484582 +0000 UTC m=+0.030087983 image pull quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified Dec 15 05:05:56 localhost systemd[1]: Started libcrun container. 
Dec 15 05:05:56 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/35f1a6296e617f55e968dc52f2d60e1216e22c684916f1b633e273ca5afcba97/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff) Dec 15 05:05:56 localhost podman[322457]: 2025-12-15 10:05:56.070503736 +0000 UTC m=+0.152107167 container init 9cf45cd141d1d7461c8f30a43a38da6a96161bd4993a40f38cbda77de2611cbb (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-c0669abd-aef1-4b0d-9f97-a6adeeac3211, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true) Dec 15 05:05:56 localhost sshd[322474]: main: sshd: ssh-rsa algorithm is disabled Dec 15 05:05:56 localhost podman[322457]: 2025-12-15 10:05:56.08182606 +0000 UTC m=+0.163429491 container start 9cf45cd141d1d7461c8f30a43a38da6a96161bd4993a40f38cbda77de2611cbb (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-c0669abd-aef1-4b0d-9f97-a6adeeac3211, tcib_managed=true, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251202, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.license=GPLv2, io.buildah.version=1.41.3) Dec 15 05:05:56 localhost dnsmasq[322476]: started, version 2.85 cachesize 150 Dec 15 05:05:56 localhost dnsmasq[322476]: DNS service limited to local subnets Dec 15 05:05:56 localhost dnsmasq[322476]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify 
dumpfile Dec 15 05:05:56 localhost dnsmasq[322476]: warning: no upstream servers configured Dec 15 05:05:56 localhost dnsmasq-dhcp[322476]: DHCPv6, static leases only on 2001:db8::, lease time 1d Dec 15 05:05:56 localhost dnsmasq[322476]: read /var/lib/neutron/dhcp/c0669abd-aef1-4b0d-9f97-a6adeeac3211/addn_hosts - 0 addresses Dec 15 05:05:56 localhost dnsmasq-dhcp[322476]: read /var/lib/neutron/dhcp/c0669abd-aef1-4b0d-9f97-a6adeeac3211/host Dec 15 05:05:56 localhost dnsmasq-dhcp[322476]: read /var/lib/neutron/dhcp/c0669abd-aef1-4b0d-9f97-a6adeeac3211/opts Dec 15 05:05:56 localhost ceph-mon[298913]: mon.np0005559462@0(leader) e17 handle_command mon_command({"prefix":"df", "format":"json"} v 0) Dec 15 05:05:56 localhost ceph-mon[298913]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/2790153673' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch Dec 15 05:05:56 localhost ceph-mon[298913]: mon.np0005559462@0(leader) e17 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0) Dec 15 05:05:56 localhost ceph-mon[298913]: log_channel(audit) log [DBG] : from='client.? 
172.18.0.32:0/2790153673' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch Dec 15 05:05:56 localhost neutron_dhcp_agent[267542]: 2025-12-15 10:05:56.145 267546 INFO neutron.agent.dhcp.agent [None req-fb531df4-cf29-4a8f-88fe-22a15c0a17fd - - - - - -] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-12-15T10:05:53Z, description=, device_id=, device_owner=, dns_assignment=[], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[], id=3cf7e3c8-e64a-4ee8-8464-04754ea37d89, ip_allocation=immediate, mac_address=fa:16:3e:28:17:e9, name=tempest-NetworksTestDHCPv6-1675245306, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-12-15T10:05:39Z, description=, dns_domain=, id=c0669abd-aef1-4b0d-9f97-a6adeeac3211, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-NetworksTestDHCPv6-test-network-1861207414, port_security_enabled=True, project_id=89e710ef9f4f48d48a369002db572947, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=20673, qos_policy_id=None, revision_number=8, router:external=False, shared=False, standard_attr_id=2164, status=ACTIVE, subnets=['e4759729-0e3e-4eaf-8737-a4967248f2dd'], tags=[], tenant_id=89e710ef9f4f48d48a369002db572947, updated_at=2025-12-15T10:05:53Z, vlan_transparent=None, network_id=c0669abd-aef1-4b0d-9f97-a6adeeac3211, port_security_enabled=True, project_id=89e710ef9f4f48d48a369002db572947, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=['a6c5f808-dddc-4f17-acbf-63b1b6e6f4d6'], standard_attr_id=2242, status=DOWN, tags=[], tenant_id=89e710ef9f4f48d48a369002db572947, updated_at=2025-12-15T10:05:54Z on network c0669abd-aef1-4b0d-9f97-a6adeeac3211#033[00m 
Dec 15 05:05:56 localhost neutron_dhcp_agent[267542]: 2025-12-15 10:05:56.239 267546 INFO neutron.agent.dhcp.agent [None req-c8808a22-3e39-482f-a8d7-be05f2fdcd78 - - - - - -] DHCP configuration for ports {'79503367-f53f-4b35-8760-76fcaa4d8407'} is completed#033[00m Dec 15 05:05:56 localhost dnsmasq[322476]: read /var/lib/neutron/dhcp/c0669abd-aef1-4b0d-9f97-a6adeeac3211/addn_hosts - 1 addresses Dec 15 05:05:56 localhost dnsmasq-dhcp[322476]: read /var/lib/neutron/dhcp/c0669abd-aef1-4b0d-9f97-a6adeeac3211/host Dec 15 05:05:56 localhost dnsmasq-dhcp[322476]: read /var/lib/neutron/dhcp/c0669abd-aef1-4b0d-9f97-a6adeeac3211/opts Dec 15 05:05:56 localhost podman[322495]: 2025-12-15 10:05:56.334138953 +0000 UTC m=+0.055933270 container kill 9cf45cd141d1d7461c8f30a43a38da6a96161bd4993a40f38cbda77de2611cbb (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-c0669abd-aef1-4b0d-9f97-a6adeeac3211, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_managed=true, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb) Dec 15 05:05:56 localhost neutron_dhcp_agent[267542]: 2025-12-15 10:05:56.604 267546 INFO neutron.agent.dhcp.agent [None req-06782fc5-5048-4179-b83f-87dcddedb7be - - - - - -] DHCP configuration for ports {'3cf7e3c8-e64a-4ee8-8464-04754ea37d89'} is completed#033[00m Dec 15 05:05:56 localhost dnsmasq[322476]: exiting on receipt of SIGTERM Dec 15 05:05:56 localhost podman[322533]: 2025-12-15 10:05:56.833922591 +0000 UTC m=+0.064025794 container kill 9cf45cd141d1d7461c8f30a43a38da6a96161bd4993a40f38cbda77de2611cbb (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-c0669abd-aef1-4b0d-9f97-a6adeeac3211, 
tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.license=GPLv2) Dec 15 05:05:56 localhost systemd[1]: libpod-9cf45cd141d1d7461c8f30a43a38da6a96161bd4993a40f38cbda77de2611cbb.scope: Deactivated successfully. Dec 15 05:05:56 localhost podman[322547]: 2025-12-15 10:05:56.903424639 +0000 UTC m=+0.054951026 container died 9cf45cd141d1d7461c8f30a43a38da6a96161bd4993a40f38cbda77de2611cbb (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-c0669abd-aef1-4b0d-9f97-a6adeeac3211, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0) Dec 15 05:05:56 localhost podman[322547]: 2025-12-15 10:05:56.938284802 +0000 UTC m=+0.089811169 container cleanup 9cf45cd141d1d7461c8f30a43a38da6a96161bd4993a40f38cbda77de2611cbb (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-c0669abd-aef1-4b0d-9f97-a6adeeac3211, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202, io.buildah.version=1.41.3) Dec 15 05:05:56 localhost systemd[1]: libpod-conmon-9cf45cd141d1d7461c8f30a43a38da6a96161bd4993a40f38cbda77de2611cbb.scope: Deactivated successfully. 
Dec 15 05:05:56 localhost podman[322549]: 2025-12-15 10:05:56.982449847 +0000 UTC m=+0.127157383 container remove 9cf45cd141d1d7461c8f30a43a38da6a96161bd4993a40f38cbda77de2611cbb (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-c0669abd-aef1-4b0d-9f97-a6adeeac3211, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image) Dec 15 05:05:56 localhost systemd[1]: var-lib-containers-storage-overlay-35f1a6296e617f55e968dc52f2d60e1216e22c684916f1b633e273ca5afcba97-merged.mount: Deactivated successfully. Dec 15 05:05:56 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-9cf45cd141d1d7461c8f30a43a38da6a96161bd4993a40f38cbda77de2611cbb-userdata-shm.mount: Deactivated successfully. 
Dec 15 05:05:57 localhost ovn_controller[154603]: 2025-12-15T10:05:57Z|00282|binding|INFO|Releasing lport 0f45f416-52c0-4fd8-bccb-81241b5696b6 from this chassis (sb_readonly=0) Dec 15 05:05:57 localhost kernel: device tap0f45f416-52 left promiscuous mode Dec 15 05:05:57 localhost nova_compute[286344]: 2025-12-15 10:05:57.001 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:05:57 localhost ovn_controller[154603]: 2025-12-15T10:05:57Z|00283|binding|INFO|Setting lport 0f45f416-52c0-4fd8-bccb-81241b5696b6 down in Southbound Dec 15 05:05:57 localhost ovn_metadata_agent[160585]: 2025-12-15 10:05:57.012 160590 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005559462.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8::2/64', 'neutron:device_id': 'dhcpce287b4b-a043-5ed9-8e08-4fd555d768a2-c0669abd-aef1-4b0d-9f97-a6adeeac3211', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-c0669abd-aef1-4b0d-9f97-a6adeeac3211', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '89e710ef9f4f48d48a369002db572947', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'np0005559462.localdomain'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=e47821dc-5f5d-44dc-8a16-54817df4049d, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=0f45f416-52c0-4fd8-bccb-81241b5696b6) 
old=Port_Binding(up=[True], chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Dec 15 05:05:57 localhost ovn_metadata_agent[160585]: 2025-12-15 10:05:57.014 160590 INFO neutron.agent.ovn.metadata.agent [-] Port 0f45f416-52c0-4fd8-bccb-81241b5696b6 in datapath c0669abd-aef1-4b0d-9f97-a6adeeac3211 unbound from our chassis#033[00m Dec 15 05:05:57 localhost ovn_metadata_agent[160585]: 2025-12-15 10:05:57.016 160590 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network c0669abd-aef1-4b0d-9f97-a6adeeac3211 or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599#033[00m Dec 15 05:05:57 localhost ovn_metadata_agent[160585]: 2025-12-15 10:05:57.017 160858 DEBUG oslo.privsep.daemon [-] privsep: reply[53681c43-46b1-4db4-bf45-05de0b3db2e6]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Dec 15 05:05:57 localhost nova_compute[286344]: 2025-12-15 10:05:57.027 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:05:57 localhost systemd[1]: run-netns-qdhcp\x2dc0669abd\x2daef1\x2d4b0d\x2d9f97\x2da6adeeac3211.mount: Deactivated successfully. 
Dec 15 05:05:57 localhost neutron_sriov_agent[260044]: 2025-12-15 10:05:57.762 2 INFO neutron.agent.securitygroups_rpc [None req-a9c5ee54-dee3-4826-bad5-cc573f072d29 6b5da6f221214afe93e1fa66574f238b 89e710ef9f4f48d48a369002db572947 - - default default] Security group member updated ['a6c5f808-dddc-4f17-acbf-63b1b6e6f4d6']#033[00m Dec 15 05:05:57 localhost ceph-mon[298913]: mon.np0005559462@0(leader).osd e170 do_prune osdmap full prune enabled Dec 15 05:05:57 localhost ceph-mon[298913]: mon.np0005559462@0(leader).osd e171 e171: 6 total, 6 up, 6 in Dec 15 05:05:57 localhost ceph-mon[298913]: log_channel(cluster) log [DBG] : osdmap e171: 6 total, 6 up, 6 in Dec 15 05:05:57 localhost neutron_sriov_agent[260044]: 2025-12-15 10:05:57.975 2 INFO neutron.agent.securitygroups_rpc [None req-48387118-2cf9-47fb-bc48-c27a472cf33e 52899874f6ba4276802009964d723e15 dcbc6ff58fef4432a793909041cfb08b - - default default] Security group rule updated ['e5d711a1-c86e-4980-a5f7-2827066ee781']#033[00m Dec 15 05:05:58 localhost nova_compute[286344]: 2025-12-15 10:05:58.019 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:05:58 localhost neutron_sriov_agent[260044]: 2025-12-15 10:05:58.344 2 INFO neutron.agent.securitygroups_rpc [None req-0597e323-71f4-4bb5-bf0a-905086a0b1ea 52899874f6ba4276802009964d723e15 dcbc6ff58fef4432a793909041cfb08b - - default default] Security group rule updated ['e5d711a1-c86e-4980-a5f7-2827066ee781']#033[00m Dec 15 05:05:58 localhost neutron_sriov_agent[260044]: 2025-12-15 10:05:58.372 2 INFO neutron.agent.securitygroups_rpc [None req-11dfe134-55ec-4066-b67d-17cca227715e 6b5da6f221214afe93e1fa66574f238b 89e710ef9f4f48d48a369002db572947 - - default default] Security group member updated ['a6c5f808-dddc-4f17-acbf-63b1b6e6f4d6']#033[00m Dec 15 05:05:58 localhost neutron_dhcp_agent[267542]: 2025-12-15 10:05:58.562 267546 INFO neutron.agent.linux.ip_lib [None 
req-6ea588fc-8a96-4e17-88c9-6f286eef3a95 - - - - - -] Device tap06658722-30 cannot be used as it has no MAC address#033[00m Dec 15 05:05:58 localhost nova_compute[286344]: 2025-12-15 10:05:58.582 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:05:58 localhost kernel: device tap06658722-30 entered promiscuous mode Dec 15 05:05:58 localhost NetworkManager[5963]: [1765793158.5885] manager: (tap06658722-30): new Generic device (/org/freedesktop/NetworkManager/Devices/46) Dec 15 05:05:58 localhost ovn_controller[154603]: 2025-12-15T10:05:58Z|00284|binding|INFO|Claiming lport 06658722-3023-4c39-8175-8e1709db67ca for this chassis. Dec 15 05:05:58 localhost ovn_controller[154603]: 2025-12-15T10:05:58Z|00285|binding|INFO|06658722-3023-4c39-8175-8e1709db67ca: Claiming unknown Dec 15 05:05:58 localhost nova_compute[286344]: 2025-12-15 10:05:58.590 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:05:58 localhost systemd-udevd[322587]: Network interface NamePolicy= disabled on kernel command line. 
Dec 15 05:05:58 localhost ovn_metadata_agent[160585]: 2025-12-15 10:05:58.596 160590 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005559462.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8::2/64', 'neutron:device_id': 'dhcpce287b4b-a043-5ed9-8e08-4fd555d768a2-c0669abd-aef1-4b0d-9f97-a6adeeac3211', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-c0669abd-aef1-4b0d-9f97-a6adeeac3211', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '89e710ef9f4f48d48a369002db572947', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=e47821dc-5f5d-44dc-8a16-54817df4049d, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[], logical_port=06658722-3023-4c39-8175-8e1709db67ca) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Dec 15 05:05:58 localhost ovn_controller[154603]: 2025-12-15T10:05:58Z|00286|binding|INFO|Setting lport 06658722-3023-4c39-8175-8e1709db67ca up in Southbound Dec 15 05:05:58 localhost ovn_controller[154603]: 2025-12-15T10:05:58Z|00287|binding|INFO|Setting lport 06658722-3023-4c39-8175-8e1709db67ca ovn-installed in OVS Dec 15 05:05:58 localhost nova_compute[286344]: 2025-12-15 10:05:58.599 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:05:58 localhost 
ovn_metadata_agent[160585]: 2025-12-15 10:05:58.599 160590 INFO neutron.agent.ovn.metadata.agent [-] Port 06658722-3023-4c39-8175-8e1709db67ca in datapath c0669abd-aef1-4b0d-9f97-a6adeeac3211 bound to our chassis#033[00m Dec 15 05:05:58 localhost ovn_metadata_agent[160585]: 2025-12-15 10:05:58.600 160590 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network c0669abd-aef1-4b0d-9f97-a6adeeac3211 or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599#033[00m Dec 15 05:05:58 localhost nova_compute[286344]: 2025-12-15 10:05:58.600 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:05:58 localhost ovn_metadata_agent[160585]: 2025-12-15 10:05:58.601 160858 DEBUG oslo.privsep.daemon [-] privsep: reply[529aadcf-de4c-48d8-bef9-a196566b9336]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Dec 15 05:05:58 localhost nova_compute[286344]: 2025-12-15 10:05:58.605 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:05:58 localhost journal[231322]: ethtool ioctl error on tap06658722-30: No such device Dec 15 05:05:58 localhost journal[231322]: ethtool ioctl error on tap06658722-30: No such device Dec 15 05:05:58 localhost nova_compute[286344]: 2025-12-15 10:05:58.632 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:05:58 localhost journal[231322]: ethtool ioctl error on tap06658722-30: No such device Dec 15 05:05:58 localhost journal[231322]: ethtool ioctl error on tap06658722-30: No such device Dec 15 05:05:58 localhost journal[231322]: ethtool ioctl error on tap06658722-30: No such device Dec 15 
05:05:58 localhost journal[231322]: ethtool ioctl error on tap06658722-30: No such device Dec 15 05:05:58 localhost journal[231322]: ethtool ioctl error on tap06658722-30: No such device Dec 15 05:05:58 localhost journal[231322]: ethtool ioctl error on tap06658722-30: No such device Dec 15 05:05:58 localhost nova_compute[286344]: 2025-12-15 10:05:58.667 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:05:58 localhost ceph-mon[298913]: mon.np0005559462@0(leader) e17 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) Dec 15 05:05:58 localhost ceph-mon[298913]: log_channel(audit) log [DBG] : from='client.25625 172.18.0.34:0/382777224' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch Dec 15 05:05:58 localhost nova_compute[286344]: 2025-12-15 10:05:58.740 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:05:59 localhost podman[322695]: Dec 15 05:05:59 localhost podman[322695]: 2025-12-15 10:05:59.539085894 +0000 UTC m=+0.091490010 container create 32f40a01614559348db9f64730401e397a8cd6901aed793f78242fe34333040d (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-c0669abd-aef1-4b0d-9f97-a6adeeac3211, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.build-date=20251202, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0) Dec 15 05:05:59 localhost systemd[1]: Started libpod-conmon-32f40a01614559348db9f64730401e397a8cd6901aed793f78242fe34333040d.scope. 
Dec 15 05:05:59 localhost podman[322695]: 2025-12-15 10:05:59.492275172 +0000 UTC m=+0.044679318 image pull quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified Dec 15 05:05:59 localhost systemd[1]: Started libcrun container. Dec 15 05:05:59 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/4e47724196728a60a7beaa88d18258fff4d635000308187431c2e67fd4d1d3dd/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff) Dec 15 05:05:59 localhost podman[322695]: 2025-12-15 10:05:59.610223475 +0000 UTC m=+0.162627581 container init 32f40a01614559348db9f64730401e397a8cd6901aed793f78242fe34333040d (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-c0669abd-aef1-4b0d-9f97-a6adeeac3211, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team) Dec 15 05:05:59 localhost podman[322695]: 2025-12-15 10:05:59.618963863 +0000 UTC m=+0.171367979 container start 32f40a01614559348db9f64730401e397a8cd6901aed793f78242fe34333040d (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-c0669abd-aef1-4b0d-9f97-a6adeeac3211, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0) Dec 15 05:05:59 localhost dnsmasq[322727]: started, version 2.85 cachesize 150 Dec 15 05:05:59 localhost dnsmasq[322727]: DNS service limited to local subnets Dec 15 05:05:59 localhost 
dnsmasq[322727]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile Dec 15 05:05:59 localhost dnsmasq[322727]: warning: no upstream servers configured Dec 15 05:05:59 localhost dnsmasq[322727]: read /var/lib/neutron/dhcp/c0669abd-aef1-4b0d-9f97-a6adeeac3211/addn_hosts - 1 addresses Dec 15 05:05:59 localhost neutron_sriov_agent[260044]: 2025-12-15 10:05:59.740 2 INFO neutron.agent.securitygroups_rpc [None req-6178634c-0410-4a84-8415-fc64b8a02cd3 a22542ef31414501844801d3b102584b 54df976b8a364f93b0b8b0128def8f10 - - default default] Security group member updated ['3b08e208-1f02-4d1e-84d0-d090034d8a7c']#033[00m Dec 15 05:05:59 localhost neutron_dhcp_agent[267542]: 2025-12-15 10:05:59.780 267546 INFO neutron.agent.dhcp.agent [None req-61fe1e4c-166b-452d-81eb-57029f6ab15b - - - - - -] DHCP configuration for ports {'79503367-f53f-4b35-8760-76fcaa4d8407', 'a7397fae-7351-41cc-9cd1-c5bdaeca1025'} is completed#033[00m Dec 15 05:05:59 localhost dnsmasq[322727]: exiting on receipt of SIGTERM Dec 15 05:05:59 localhost podman[322760]: 2025-12-15 10:05:59.954696795 +0000 UTC m=+0.060027924 container kill 32f40a01614559348db9f64730401e397a8cd6901aed793f78242fe34333040d (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-c0669abd-aef1-4b0d-9f97-a6adeeac3211, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team) Dec 15 05:05:59 localhost systemd[1]: libpod-32f40a01614559348db9f64730401e397a8cd6901aed793f78242fe34333040d.scope: Deactivated successfully. 
Dec 15 05:06:00 localhost podman[322774]: 2025-12-15 10:06:00.031781283 +0000 UTC m=+0.059805757 container died 32f40a01614559348db9f64730401e397a8cd6901aed793f78242fe34333040d (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-c0669abd-aef1-4b0d-9f97-a6adeeac3211, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team) Dec 15 05:06:00 localhost podman[322774]: 2025-12-15 10:06:00.055311132 +0000 UTC m=+0.083335606 container cleanup 32f40a01614559348db9f64730401e397a8cd6901aed793f78242fe34333040d (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-c0669abd-aef1-4b0d-9f97-a6adeeac3211, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb) Dec 15 05:06:00 localhost systemd[1]: libpod-conmon-32f40a01614559348db9f64730401e397a8cd6901aed793f78242fe34333040d.scope: Deactivated successfully. 
Dec 15 05:06:00 localhost ceph-mon[298913]: mon.np0005559462@0(leader) e17 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/osd_remove_queue}] v 0) Dec 15 05:06:00 localhost podman[322776]: 2025-12-15 10:06:00.113629731 +0000 UTC m=+0.130584518 container remove 32f40a01614559348db9f64730401e397a8cd6901aed793f78242fe34333040d (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-c0669abd-aef1-4b0d-9f97-a6adeeac3211, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3) Dec 15 05:06:00 localhost ceph-mon[298913]: log_channel(audit) log [INF] : from='mgr.34411 ' entity='mgr.np0005559464.aomnqe' Dec 15 05:06:00 localhost nova_compute[286344]: 2025-12-15 10:06:00.127 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:06:00 localhost kernel: device tap06658722-30 left promiscuous mode Dec 15 05:06:00 localhost ovn_controller[154603]: 2025-12-15T10:06:00Z|00288|binding|INFO|Releasing lport 06658722-3023-4c39-8175-8e1709db67ca from this chassis (sb_readonly=0) Dec 15 05:06:00 localhost ovn_controller[154603]: 2025-12-15T10:06:00Z|00289|binding|INFO|Setting lport 06658722-3023-4c39-8175-8e1709db67ca down in Southbound Dec 15 05:06:00 localhost ovn_metadata_agent[160585]: 2025-12-15 10:06:00.135 160590 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 
'np0005559462.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8::2/64', 'neutron:device_id': 'dhcpce287b4b-a043-5ed9-8e08-4fd555d768a2-c0669abd-aef1-4b0d-9f97-a6adeeac3211', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-c0669abd-aef1-4b0d-9f97-a6adeeac3211', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '89e710ef9f4f48d48a369002db572947', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'np0005559462.localdomain'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=e47821dc-5f5d-44dc-8a16-54817df4049d, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[], logical_port=06658722-3023-4c39-8175-8e1709db67ca) old=Port_Binding(up=[True], chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Dec 15 05:06:00 localhost ovn_metadata_agent[160585]: 2025-12-15 10:06:00.138 160590 INFO neutron.agent.ovn.metadata.agent [-] Port 06658722-3023-4c39-8175-8e1709db67ca in datapath c0669abd-aef1-4b0d-9f97-a6adeeac3211 unbound from our chassis#033[00m Dec 15 05:06:00 localhost ovn_metadata_agent[160585]: 2025-12-15 10:06:00.139 160590 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network c0669abd-aef1-4b0d-9f97-a6adeeac3211 or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599#033[00m Dec 15 05:06:00 localhost ovn_metadata_agent[160585]: 2025-12-15 10:06:00.140 160858 DEBUG oslo.privsep.daemon [-] privsep: reply[1c93bded-d298-4b91-9ec8-d576305992f1]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Dec 15 
05:06:00 localhost nova_compute[286344]: 2025-12-15 10:06:00.150 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:06:00 localhost nova_compute[286344]: 2025-12-15 10:06:00.395 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:06:00 localhost systemd[1]: var-lib-containers-storage-overlay-4e47724196728a60a7beaa88d18258fff4d635000308187431c2e67fd4d1d3dd-merged.mount: Deactivated successfully. Dec 15 05:06:00 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-32f40a01614559348db9f64730401e397a8cd6901aed793f78242fe34333040d-userdata-shm.mount: Deactivated successfully. Dec 15 05:06:00 localhost systemd[1]: run-netns-qdhcp\x2dc0669abd\x2daef1\x2d4b0d\x2d9f97\x2da6adeeac3211.mount: Deactivated successfully. Dec 15 05:06:00 localhost ceph-mon[298913]: mon.np0005559462@0(leader).osd e171 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Dec 15 05:06:00 localhost ceph-mon[298913]: from='mgr.34411 172.18.0.108:0/929118679' entity='mgr.np0005559464.aomnqe' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch Dec 15 05:06:00 localhost ceph-mon[298913]: from='mgr.34411 ' entity='mgr.np0005559464.aomnqe' Dec 15 05:06:01 localhost neutron_sriov_agent[260044]: 2025-12-15 10:06:01.064 2 INFO neutron.agent.securitygroups_rpc [None req-ff7c28a8-7b3a-4da1-975d-72a77e1c4bc9 a22542ef31414501844801d3b102584b 54df976b8a364f93b0b8b0128def8f10 - - default default] Security group member updated ['3b08e208-1f02-4d1e-84d0-d090034d8a7c']#033[00m Dec 15 05:06:02 localhost neutron_sriov_agent[260044]: 2025-12-15 10:06:01.526 2 INFO neutron.agent.securitygroups_rpc [None req-5f01d720-7f32-4255-bc20-fa222a2fa4b3 6b5da6f221214afe93e1fa66574f238b 89e710ef9f4f48d48a369002db572947 - - default default] Security group 
member updated ['a6c5f808-dddc-4f17-acbf-63b1b6e6f4d6']#033[00m Dec 15 05:06:02 localhost neutron_sriov_agent[260044]: 2025-12-15 10:06:01.903 2 INFO neutron.agent.securitygroups_rpc [None req-fcc31301-1b8b-42ae-8d39-09f8caa9be03 52899874f6ba4276802009964d723e15 dcbc6ff58fef4432a793909041cfb08b - - default default] Security group rule updated ['1413df89-05bd-4882-808b-a65c71141b66']#033[00m Dec 15 05:06:02 localhost podman[243449]: time="2025-12-15T10:06:02Z" level=info msg="List containers: received `last` parameter - overwriting `limit`" Dec 15 05:06:02 localhost podman[243449]: @ - - [15/Dec/2025:10:06:02 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 160281 "" "Go-http-client/1.1" Dec 15 05:06:02 localhost podman[243449]: @ - - [15/Dec/2025:10:06:02 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 20204 "" "Go-http-client/1.1" Dec 15 05:06:02 localhost neutron_dhcp_agent[267542]: 2025-12-15 10:06:02.107 267546 INFO neutron.agent.linux.ip_lib [None req-6332ed50-373b-4506-9c88-c0dfcb82a5a8 - - - - - -] Device tap64b9206f-45 cannot be used as it has no MAC address#033[00m Dec 15 05:06:02 localhost nova_compute[286344]: 2025-12-15 10:06:02.129 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:06:02 localhost kernel: device tap64b9206f-45 entered promiscuous mode Dec 15 05:06:02 localhost nova_compute[286344]: 2025-12-15 10:06:02.137 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:06:02 localhost NetworkManager[5963]: [1765793162.1389] manager: (tap64b9206f-45): new Generic device (/org/freedesktop/NetworkManager/Devices/47) Dec 15 05:06:02 localhost ovn_controller[154603]: 2025-12-15T10:06:02Z|00290|binding|INFO|Claiming lport 
64b9206f-4506-44ca-b7d8-6dc9733bf8e7 for this chassis. Dec 15 05:06:02 localhost ovn_controller[154603]: 2025-12-15T10:06:02Z|00291|binding|INFO|64b9206f-4506-44ca-b7d8-6dc9733bf8e7: Claiming unknown Dec 15 05:06:02 localhost systemd-udevd[322832]: Network interface NamePolicy= disabled on kernel command line. Dec 15 05:06:02 localhost ovn_metadata_agent[160585]: 2025-12-15 10:06:02.148 160590 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005559462.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8::2/64', 'neutron:device_id': 'dhcpce287b4b-a043-5ed9-8e08-4fd555d768a2-c0669abd-aef1-4b0d-9f97-a6adeeac3211', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-c0669abd-aef1-4b0d-9f97-a6adeeac3211', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '89e710ef9f4f48d48a369002db572947', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=e47821dc-5f5d-44dc-8a16-54817df4049d, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=64b9206f-4506-44ca-b7d8-6dc9733bf8e7) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Dec 15 05:06:02 localhost ovn_controller[154603]: 2025-12-15T10:06:02Z|00292|binding|INFO|Setting lport 64b9206f-4506-44ca-b7d8-6dc9733bf8e7 ovn-installed in OVS Dec 15 05:06:02 localhost ovn_controller[154603]: 
2025-12-15T10:06:02Z|00293|binding|INFO|Setting lport 64b9206f-4506-44ca-b7d8-6dc9733bf8e7 up in Southbound Dec 15 05:06:02 localhost ovn_metadata_agent[160585]: 2025-12-15 10:06:02.149 160590 INFO neutron.agent.ovn.metadata.agent [-] Port 64b9206f-4506-44ca-b7d8-6dc9733bf8e7 in datapath c0669abd-aef1-4b0d-9f97-a6adeeac3211 bound to our chassis#033[00m Dec 15 05:06:02 localhost nova_compute[286344]: 2025-12-15 10:06:02.152 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:06:02 localhost ovn_metadata_agent[160585]: 2025-12-15 10:06:02.151 160590 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network c0669abd-aef1-4b0d-9f97-a6adeeac3211 or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599#033[00m Dec 15 05:06:02 localhost ovn_metadata_agent[160585]: 2025-12-15 10:06:02.153 160858 DEBUG oslo.privsep.daemon [-] privsep: reply[01f97484-d9d9-4c02-b91c-52d922052f63]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Dec 15 05:06:02 localhost journal[231322]: ethtool ioctl error on tap64b9206f-45: No such device Dec 15 05:06:02 localhost nova_compute[286344]: 2025-12-15 10:06:02.164 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:06:02 localhost journal[231322]: ethtool ioctl error on tap64b9206f-45: No such device Dec 15 05:06:02 localhost journal[231322]: ethtool ioctl error on tap64b9206f-45: No such device Dec 15 05:06:02 localhost journal[231322]: ethtool ioctl error on tap64b9206f-45: No such device Dec 15 05:06:02 localhost journal[231322]: ethtool ioctl error on tap64b9206f-45: No such device Dec 15 05:06:02 localhost journal[231322]: ethtool ioctl error on tap64b9206f-45: No such 
device Dec 15 05:06:02 localhost journal[231322]: ethtool ioctl error on tap64b9206f-45: No such device Dec 15 05:06:02 localhost journal[231322]: ethtool ioctl error on tap64b9206f-45: No such device Dec 15 05:06:02 localhost nova_compute[286344]: 2025-12-15 10:06:02.202 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:06:02 localhost nova_compute[286344]: 2025-12-15 10:06:02.232 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:06:02 localhost neutron_sriov_agent[260044]: 2025-12-15 10:06:02.416 2 INFO neutron.agent.securitygroups_rpc [None req-ff0d5e3a-8129-4358-8099-71e807aad3b3 52899874f6ba4276802009964d723e15 dcbc6ff58fef4432a793909041cfb08b - - default default] Security group rule updated ['1413df89-05bd-4882-808b-a65c71141b66']#033[00m Dec 15 05:06:03 localhost podman[322903]: Dec 15 05:06:03 localhost podman[322903]: 2025-12-15 10:06:03.009525318 +0000 UTC m=+0.085043869 container create 40296364aa698d012c1146811728523fe5bb115e9485dcfd05f6e9aeea04b82f (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-c0669abd-aef1-4b0d-9f97-a6adeeac3211, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS) Dec 15 05:06:03 localhost nova_compute[286344]: 2025-12-15 10:06:03.065 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:06:03 localhost podman[322903]: 2025-12-15 10:06:02.973465686 +0000 UTC m=+0.048984307 image pull 
quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified Dec 15 05:06:03 localhost systemd[1]: Started libpod-conmon-40296364aa698d012c1146811728523fe5bb115e9485dcfd05f6e9aeea04b82f.scope. Dec 15 05:06:03 localhost systemd[1]: Started libcrun container. Dec 15 05:06:03 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/9050452fa5d904ecb2cdef6847e706e0b9c87c15b3c8183dc4667542577be5e7/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff) Dec 15 05:06:03 localhost podman[322903]: 2025-12-15 10:06:03.137147182 +0000 UTC m=+0.212665763 container init 40296364aa698d012c1146811728523fe5bb115e9485dcfd05f6e9aeea04b82f (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-c0669abd-aef1-4b0d-9f97-a6adeeac3211, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251202, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true) Dec 15 05:06:03 localhost podman[322903]: 2025-12-15 10:06:03.146549767 +0000 UTC m=+0.222068348 container start 40296364aa698d012c1146811728523fe5bb115e9485dcfd05f6e9aeea04b82f (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-c0669abd-aef1-4b0d-9f97-a6adeeac3211, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team) Dec 15 05:06:03 localhost dnsmasq[322921]: started, version 2.85 cachesize 150 Dec 15 05:06:03 localhost dnsmasq[322921]: DNS service limited to local subnets Dec 
15 05:06:03 localhost dnsmasq[322921]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile Dec 15 05:06:03 localhost dnsmasq[322921]: warning: no upstream servers configured Dec 15 05:06:03 localhost dnsmasq-dhcp[322921]: DHCPv6, static leases only on 2001:db8::, lease time 1d Dec 15 05:06:03 localhost dnsmasq[322921]: read /var/lib/neutron/dhcp/c0669abd-aef1-4b0d-9f97-a6adeeac3211/addn_hosts - 0 addresses Dec 15 05:06:03 localhost dnsmasq-dhcp[322921]: read /var/lib/neutron/dhcp/c0669abd-aef1-4b0d-9f97-a6adeeac3211/host Dec 15 05:06:03 localhost dnsmasq-dhcp[322921]: read /var/lib/neutron/dhcp/c0669abd-aef1-4b0d-9f97-a6adeeac3211/opts Dec 15 05:06:03 localhost neutron_dhcp_agent[267542]: 2025-12-15 10:06:03.215 267546 INFO neutron.agent.dhcp.agent [None req-6332ed50-373b-4506-9c88-c0dfcb82a5a8 - - - - - -] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-12-15T10:05:57Z, description=, device_id=, device_owner=, dns_assignment=[], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[], id=a7397fae-7351-41cc-9cd1-c5bdaeca1025, ip_allocation=immediate, mac_address=fa:16:3e:86:f5:da, name=tempest-NetworksTestDHCPv6-757344980, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-12-15T10:05:39Z, description=, dns_domain=, id=c0669abd-aef1-4b0d-9f97-a6adeeac3211, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-NetworksTestDHCPv6-test-network-1861207414, port_security_enabled=True, project_id=89e710ef9f4f48d48a369002db572947, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=20673, qos_policy_id=None, revision_number=10, router:external=False, shared=False, standard_attr_id=2164, 
status=ACTIVE, subnets=['76afe3c5-6770-47b2-ba48-721f6f8e3ff7'], tags=[], tenant_id=89e710ef9f4f48d48a369002db572947, updated_at=2025-12-15T10:05:56Z, vlan_transparent=None, network_id=c0669abd-aef1-4b0d-9f97-a6adeeac3211, port_security_enabled=True, project_id=89e710ef9f4f48d48a369002db572947, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=['a6c5f808-dddc-4f17-acbf-63b1b6e6f4d6'], standard_attr_id=2263, status=DOWN, tags=[], tenant_id=89e710ef9f4f48d48a369002db572947, updated_at=2025-12-15T10:05:57Z on network c0669abd-aef1-4b0d-9f97-a6adeeac3211#033[00m Dec 15 05:06:03 localhost neutron_sriov_agent[260044]: 2025-12-15 10:06:03.264 2 INFO neutron.agent.securitygroups_rpc [None req-0a4a74ac-f806-4d19-9990-cf569060eafb 6b5da6f221214afe93e1fa66574f238b 89e710ef9f4f48d48a369002db572947 - - default default] Security group member updated ['a6c5f808-dddc-4f17-acbf-63b1b6e6f4d6']#033[00m Dec 15 05:06:03 localhost dnsmasq[322921]: read /var/lib/neutron/dhcp/c0669abd-aef1-4b0d-9f97-a6adeeac3211/addn_hosts - 1 addresses Dec 15 05:06:03 localhost dnsmasq-dhcp[322921]: read /var/lib/neutron/dhcp/c0669abd-aef1-4b0d-9f97-a6adeeac3211/host Dec 15 05:06:03 localhost dnsmasq-dhcp[322921]: read /var/lib/neutron/dhcp/c0669abd-aef1-4b0d-9f97-a6adeeac3211/opts Dec 15 05:06:03 localhost podman[322939]: 2025-12-15 10:06:03.383060295 +0000 UTC m=+0.056252399 container kill 40296364aa698d012c1146811728523fe5bb115e9485dcfd05f6e9aeea04b82f (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-c0669abd-aef1-4b0d-9f97-a6adeeac3211, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team) Dec 15 05:06:03 
localhost neutron_dhcp_agent[267542]: 2025-12-15 10:06:03.440 267546 INFO neutron.agent.dhcp.agent [None req-477e2b66-92d1-4f24-b710-ec88263d1884 - - - - - -] DHCP configuration for ports {'79503367-f53f-4b35-8760-76fcaa4d8407'} is completed#033[00m Dec 15 05:06:03 localhost neutron_dhcp_agent[267542]: 2025-12-15 10:06:03.530 267546 INFO neutron.agent.dhcp.agent [None req-6332ed50-373b-4506-9c88-c0dfcb82a5a8 - - - - - -] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-12-15T10:06:00Z, description=, device_id=, device_owner=, dns_assignment=[], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[], id=1b5c35da-1732-42b8-bbf2-17ad31cbab26, ip_allocation=immediate, mac_address=fa:16:3e:fe:a7:1d, name=tempest-NetworksTestDHCPv6-1760377211, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-12-15T10:05:39Z, description=, dns_domain=, id=c0669abd-aef1-4b0d-9f97-a6adeeac3211, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-NetworksTestDHCPv6-test-network-1861207414, port_security_enabled=True, project_id=89e710ef9f4f48d48a369002db572947, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=20673, qos_policy_id=None, revision_number=12, router:external=False, shared=False, standard_attr_id=2164, status=ACTIVE, subnets=['c495bfdb-bcba-4806-8da0-5f4ab68ef242'], tags=[], tenant_id=89e710ef9f4f48d48a369002db572947, updated_at=2025-12-15T10:05:59Z, vlan_transparent=None, network_id=c0669abd-aef1-4b0d-9f97-a6adeeac3211, port_security_enabled=True, project_id=89e710ef9f4f48d48a369002db572947, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=['a6c5f808-dddc-4f17-acbf-63b1b6e6f4d6'], standard_attr_id=2279, status=DOWN, tags=[], 
tenant_id=89e710ef9f4f48d48a369002db572947, updated_at=2025-12-15T10:06:01Z on network c0669abd-aef1-4b0d-9f97-a6adeeac3211#033[00m Dec 15 05:06:03 localhost dnsmasq[322921]: read /var/lib/neutron/dhcp/c0669abd-aef1-4b0d-9f97-a6adeeac3211/addn_hosts - 2 addresses Dec 15 05:06:03 localhost dnsmasq-dhcp[322921]: read /var/lib/neutron/dhcp/c0669abd-aef1-4b0d-9f97-a6adeeac3211/host Dec 15 05:06:03 localhost dnsmasq-dhcp[322921]: read /var/lib/neutron/dhcp/c0669abd-aef1-4b0d-9f97-a6adeeac3211/opts Dec 15 05:06:03 localhost podman[322979]: 2025-12-15 10:06:03.72372374 +0000 UTC m=+0.057906500 container kill 40296364aa698d012c1146811728523fe5bb115e9485dcfd05f6e9aeea04b82f (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-c0669abd-aef1-4b0d-9f97-a6adeeac3211, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image) Dec 15 05:06:03 localhost neutron_dhcp_agent[267542]: 2025-12-15 10:06:03.728 267546 INFO neutron.agent.dhcp.agent [None req-ae990eac-675e-4187-93cb-135308effc69 - - - - - -] DHCP configuration for ports {'a7397fae-7351-41cc-9cd1-c5bdaeca1025'} is completed#033[00m Dec 15 05:06:03 localhost ovn_controller[154603]: 2025-12-15T10:06:03Z|00294|binding|INFO|Releasing lport 64b9206f-4506-44ca-b7d8-6dc9733bf8e7 from this chassis (sb_readonly=0) Dec 15 05:06:03 localhost ovn_controller[154603]: 2025-12-15T10:06:03Z|00295|binding|INFO|Setting lport 64b9206f-4506-44ca-b7d8-6dc9733bf8e7 down in Southbound Dec 15 05:06:03 localhost nova_compute[286344]: 2025-12-15 10:06:03.897 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:06:03 localhost 
kernel: device tap64b9206f-45 left promiscuous mode Dec 15 05:06:03 localhost nova_compute[286344]: 2025-12-15 10:06:03.922 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:06:03 localhost ovn_metadata_agent[160585]: 2025-12-15 10:06:03.942 160590 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005559462.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8::2/64', 'neutron:device_id': 'dhcpce287b4b-a043-5ed9-8e08-4fd555d768a2-c0669abd-aef1-4b0d-9f97-a6adeeac3211', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-c0669abd-aef1-4b0d-9f97-a6adeeac3211', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '89e710ef9f4f48d48a369002db572947', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'np0005559462.localdomain'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=e47821dc-5f5d-44dc-8a16-54817df4049d, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=64b9206f-4506-44ca-b7d8-6dc9733bf8e7) old=Port_Binding(up=[True], chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Dec 15 05:06:03 localhost ovn_metadata_agent[160585]: 2025-12-15 10:06:03.944 160590 INFO neutron.agent.ovn.metadata.agent [-] Port 64b9206f-4506-44ca-b7d8-6dc9733bf8e7 in datapath c0669abd-aef1-4b0d-9f97-a6adeeac3211 unbound from our 
chassis#033[00m Dec 15 05:06:03 localhost ovn_metadata_agent[160585]: 2025-12-15 10:06:03.945 160590 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network c0669abd-aef1-4b0d-9f97-a6adeeac3211 or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599#033[00m Dec 15 05:06:03 localhost ovn_metadata_agent[160585]: 2025-12-15 10:06:03.946 160858 DEBUG oslo.privsep.daemon [-] privsep: reply[8976094f-6108-4b5b-8703-e2a9f9ad6055]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Dec 15 05:06:03 localhost neutron_dhcp_agent[267542]: 2025-12-15 10:06:03.979 267546 INFO neutron.agent.dhcp.agent [None req-eef1a52c-0a21-4433-b37d-506cf5e8f934 - - - - - -] DHCP configuration for ports {'1b5c35da-1732-42b8-bbf2-17ad31cbab26'} is completed#033[00m Dec 15 05:06:04 localhost dnsmasq[322921]: read /var/lib/neutron/dhcp/c0669abd-aef1-4b0d-9f97-a6adeeac3211/addn_hosts - 1 addresses Dec 15 05:06:04 localhost dnsmasq-dhcp[322921]: read /var/lib/neutron/dhcp/c0669abd-aef1-4b0d-9f97-a6adeeac3211/host Dec 15 05:06:04 localhost dnsmasq-dhcp[322921]: read /var/lib/neutron/dhcp/c0669abd-aef1-4b0d-9f97-a6adeeac3211/opts Dec 15 05:06:04 localhost podman[323023]: 2025-12-15 10:06:04.175736211 +0000 UTC m=+0.062042993 container kill 40296364aa698d012c1146811728523fe5bb115e9485dcfd05f6e9aeea04b82f (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-c0669abd-aef1-4b0d-9f97-a6adeeac3211, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true) Dec 15 05:06:04 localhost 
neutron_dhcp_agent[267542]: 2025-12-15 10:06:04.201 267546 ERROR neutron.agent.dhcp.agent [None req-6332ed50-373b-4506-9c88-c0dfcb82a5a8 - - - - - -] Unable to reload_allocations dhcp for c0669abd-aef1-4b0d-9f97-a6adeeac3211.: neutron.privileged.agent.linux.ip_lib.NetworkInterfaceNotFound: Network interface tap64b9206f-45 not found in namespace qdhcp-c0669abd-aef1-4b0d-9f97-a6adeeac3211. Dec 15 05:06:04 localhost neutron_dhcp_agent[267542]: 2025-12-15 10:06:04.201 267546 ERROR neutron.agent.dhcp.agent Traceback (most recent call last): Dec 15 05:06:04 localhost neutron_dhcp_agent[267542]: 2025-12-15 10:06:04.201 267546 ERROR neutron.agent.dhcp.agent File "/usr/lib/python3.9/site-packages/neutron/agent/dhcp/agent.py", line 264, in _call_driver Dec 15 05:06:04 localhost neutron_dhcp_agent[267542]: 2025-12-15 10:06:04.201 267546 ERROR neutron.agent.dhcp.agent rv = getattr(driver, action)(**action_kwargs) Dec 15 05:06:04 localhost neutron_dhcp_agent[267542]: 2025-12-15 10:06:04.201 267546 ERROR neutron.agent.dhcp.agent File "/usr/lib/python3.9/site-packages/neutron/agent/linux/dhcp.py", line 673, in reload_allocations Dec 15 05:06:04 localhost neutron_dhcp_agent[267542]: 2025-12-15 10:06:04.201 267546 ERROR neutron.agent.dhcp.agent self.device_manager.update(self.network, self.interface_name) Dec 15 05:06:04 localhost neutron_dhcp_agent[267542]: 2025-12-15 10:06:04.201 267546 ERROR neutron.agent.dhcp.agent File "/usr/lib/python3.9/site-packages/neutron/agent/linux/dhcp.py", line 1899, in update Dec 15 05:06:04 localhost neutron_dhcp_agent[267542]: 2025-12-15 10:06:04.201 267546 ERROR neutron.agent.dhcp.agent self._set_default_route(network, device_name) Dec 15 05:06:04 localhost neutron_dhcp_agent[267542]: 2025-12-15 10:06:04.201 267546 ERROR neutron.agent.dhcp.agent File "/usr/lib/python3.9/site-packages/neutron/agent/linux/dhcp.py", line 1610, in _set_default_route Dec 15 05:06:04 localhost neutron_dhcp_agent[267542]: 2025-12-15 10:06:04.201 267546 ERROR 
neutron.agent.dhcp.agent self._set_default_route_ip_version(network, device_name, Dec 15 05:06:04 localhost neutron_dhcp_agent[267542]: 2025-12-15 10:06:04.201 267546 ERROR neutron.agent.dhcp.agent File "/usr/lib/python3.9/site-packages/neutron/agent/linux/dhcp.py", line 1539, in _set_default_route_ip_version Dec 15 05:06:04 localhost neutron_dhcp_agent[267542]: 2025-12-15 10:06:04.201 267546 ERROR neutron.agent.dhcp.agent gateway = device.route.get_gateway(ip_version=ip_version) Dec 15 05:06:04 localhost neutron_dhcp_agent[267542]: 2025-12-15 10:06:04.201 267546 ERROR neutron.agent.dhcp.agent File "/usr/lib/python3.9/site-packages/neutron/agent/linux/ip_lib.py", line 671, in get_gateway Dec 15 05:06:04 localhost neutron_dhcp_agent[267542]: 2025-12-15 10:06:04.201 267546 ERROR neutron.agent.dhcp.agent routes = self.list_routes(ip_version, scope=scope, table=table) Dec 15 05:06:04 localhost neutron_dhcp_agent[267542]: 2025-12-15 10:06:04.201 267546 ERROR neutron.agent.dhcp.agent File "/usr/lib/python3.9/site-packages/neutron/agent/linux/ip_lib.py", line 656, in list_routes Dec 15 05:06:04 localhost neutron_dhcp_agent[267542]: 2025-12-15 10:06:04.201 267546 ERROR neutron.agent.dhcp.agent return list_ip_routes(self._parent.namespace, ip_version, scope=scope, Dec 15 05:06:04 localhost neutron_dhcp_agent[267542]: 2025-12-15 10:06:04.201 267546 ERROR neutron.agent.dhcp.agent File "/usr/lib/python3.9/site-packages/neutron/agent/linux/ip_lib.py", line 1611, in list_ip_routes Dec 15 05:06:04 localhost neutron_dhcp_agent[267542]: 2025-12-15 10:06:04.201 267546 ERROR neutron.agent.dhcp.agent routes = privileged.list_ip_routes(namespace, ip_version, device=device, Dec 15 05:06:04 localhost neutron_dhcp_agent[267542]: 2025-12-15 10:06:04.201 267546 ERROR neutron.agent.dhcp.agent File "/usr/lib/python3.9/site-packages/tenacity/__init__.py", line 333, in wrapped_f Dec 15 05:06:04 localhost neutron_dhcp_agent[267542]: 2025-12-15 10:06:04.201 267546 ERROR neutron.agent.dhcp.agent 
return self(f, *args, **kw) Dec 15 05:06:04 localhost neutron_dhcp_agent[267542]: 2025-12-15 10:06:04.201 267546 ERROR neutron.agent.dhcp.agent File "/usr/lib/python3.9/site-packages/tenacity/__init__.py", line 423, in __call__ Dec 15 05:06:04 localhost neutron_dhcp_agent[267542]: 2025-12-15 10:06:04.201 267546 ERROR neutron.agent.dhcp.agent do = self.iter(retry_state=retry_state) Dec 15 05:06:04 localhost neutron_dhcp_agent[267542]: 2025-12-15 10:06:04.201 267546 ERROR neutron.agent.dhcp.agent File "/usr/lib/python3.9/site-packages/tenacity/__init__.py", line 360, in iter Dec 15 05:06:04 localhost neutron_dhcp_agent[267542]: 2025-12-15 10:06:04.201 267546 ERROR neutron.agent.dhcp.agent return fut.result() Dec 15 05:06:04 localhost neutron_dhcp_agent[267542]: 2025-12-15 10:06:04.201 267546 ERROR neutron.agent.dhcp.agent File "/usr/lib64/python3.9/concurrent/futures/_base.py", line 439, in result Dec 15 05:06:04 localhost neutron_dhcp_agent[267542]: 2025-12-15 10:06:04.201 267546 ERROR neutron.agent.dhcp.agent return self.__get_result() Dec 15 05:06:04 localhost neutron_dhcp_agent[267542]: 2025-12-15 10:06:04.201 267546 ERROR neutron.agent.dhcp.agent File "/usr/lib64/python3.9/concurrent/futures/_base.py", line 391, in __get_result Dec 15 05:06:04 localhost neutron_dhcp_agent[267542]: 2025-12-15 10:06:04.201 267546 ERROR neutron.agent.dhcp.agent raise self._exception Dec 15 05:06:04 localhost neutron_dhcp_agent[267542]: 2025-12-15 10:06:04.201 267546 ERROR neutron.agent.dhcp.agent File "/usr/lib/python3.9/site-packages/tenacity/__init__.py", line 426, in __call__ Dec 15 05:06:04 localhost neutron_dhcp_agent[267542]: 2025-12-15 10:06:04.201 267546 ERROR neutron.agent.dhcp.agent result = fn(*args, **kwargs) Dec 15 05:06:04 localhost neutron_dhcp_agent[267542]: 2025-12-15 10:06:04.201 267546 ERROR neutron.agent.dhcp.agent File "/usr/lib/python3.9/site-packages/oslo_privsep/priv_context.py", line 271, in _wrap Dec 15 05:06:04 localhost neutron_dhcp_agent[267542]: 
2025-12-15 10:06:04.201 267546 ERROR neutron.agent.dhcp.agent return self.channel.remote_call(name, args, kwargs,
Dec 15 05:06:04 localhost neutron_dhcp_agent[267542]: 2025-12-15 10:06:04.201 267546 ERROR neutron.agent.dhcp.agent File "/usr/lib/python3.9/site-packages/oslo_privsep/daemon.py", line 215, in remote_call
Dec 15 05:06:04 localhost neutron_dhcp_agent[267542]: 2025-12-15 10:06:04.201 267546 ERROR neutron.agent.dhcp.agent raise exc_type(*result[2])
Dec 15 05:06:04 localhost neutron_dhcp_agent[267542]: 2025-12-15 10:06:04.201 267546 ERROR neutron.agent.dhcp.agent neutron.privileged.agent.linux.ip_lib.NetworkInterfaceNotFound: Network interface tap64b9206f-45 not found in namespace qdhcp-c0669abd-aef1-4b0d-9f97-a6adeeac3211.
Dec 15 05:06:04 localhost neutron_dhcp_agent[267542]: 2025-12-15 10:06:04.201 267546 ERROR neutron.agent.dhcp.agent
Dec 15 05:06:04 localhost neutron_dhcp_agent[267542]: 2025-12-15 10:06:04.205 267546 INFO neutron.agent.dhcp.agent [None req-10f9e7a7-1f4e-41b3-bb30-b54303ec51bb - - - - - -] Synchronizing state
Dec 15 05:06:04 localhost ceph-mon[298913]: log_channel(cluster) log [DBG] : mgrmap e49: np0005559464.aomnqe(active, since 8m), standbys: np0005559462.fudvyx, np0005559463.daptkf
Dec 15 05:06:04 localhost ceph-mon[298913]: mon.np0005559462@0(leader) e17 handle_command mon_command([{prefix=config-key set, key=mgr/progress/completed}] v 0)
Dec 15 05:06:04 localhost ceph-mon[298913]: log_channel(audit) log [INF] : from='mgr.34411 ' entity='mgr.np0005559464.aomnqe'
Dec 15 05:06:04 localhost ovn_metadata_agent[160585]: 2025-12-15 10:06:04.677 160590 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=14, ssl=[], options={'arp_ns_explicit_output': 'true', 'fdb_removal_limit': '0', 'ignore_lsp_down': 'false', 'mac_binding_removal_limit': '0', 'mac_prefix': 'fe:17:e3', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': 'fe:55:2b:86:15:b5'}, ipsec=False) old=SB_Global(nb_cfg=13) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 15 05:06:04 localhost nova_compute[286344]: 2025-12-15 10:06:04.677 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 15 05:06:04 localhost ovn_metadata_agent[160585]: 2025-12-15 10:06:04.678 160590 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 3 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Dec 15 05:06:04 localhost neutron_sriov_agent[260044]: 2025-12-15 10:06:04.838 2 INFO neutron.agent.securitygroups_rpc [None req-64fa8a05-cec1-45e8-95d9-04cdd0f1f1da 52899874f6ba4276802009964d723e15 dcbc6ff58fef4432a793909041cfb08b - - default default] Security group rule updated ['0b8be23d-9729-4205-97f5-2b655855dee5']
Dec 15 05:06:04 localhost openstack_network_exporter[246484]: ERROR 10:06:04 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Dec 15 05:06:04 localhost openstack_network_exporter[246484]: ERROR 10:06:04 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 15 05:06:04 localhost openstack_network_exporter[246484]: ERROR 10:06:04 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 15 05:06:04 localhost openstack_network_exporter[246484]: ERROR 10:06:04 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Dec 15 05:06:04 localhost openstack_network_exporter[246484]:
Dec 15 05:06:04 localhost openstack_network_exporter[246484]: ERROR 10:06:04 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Dec 15 05:06:04 localhost openstack_network_exporter[246484]:
Dec 15 05:06:04 localhost neutron_dhcp_agent[267542]: 2025-12-15 10:06:04.977 267546 INFO neutron.agent.dhcp.agent [None req-fd06c9ab-9e0a-43d3-bb87-35d334482a83 - - - - - -] All active networks have been fetched through RPC.
Dec 15 05:06:04 localhost neutron_dhcp_agent[267542]: 2025-12-15 10:06:04.979 267546 INFO neutron.agent.dhcp.agent [-] Starting network c0669abd-aef1-4b0d-9f97-a6adeeac3211 dhcp configuration
Dec 15 05:06:04 localhost neutron_dhcp_agent[267542]: 2025-12-15 10:06:04.980 267546 INFO neutron.agent.dhcp.agent [-] Finished network c0669abd-aef1-4b0d-9f97-a6adeeac3211 dhcp configuration
Dec 15 05:06:04 localhost neutron_dhcp_agent[267542]: 2025-12-15 10:06:04.980 267546 INFO neutron.agent.dhcp.agent [None req-fd06c9ab-9e0a-43d3-bb87-35d334482a83 - - - - - -] Synchronizing state complete
Dec 15 05:06:05 localhost neutron_dhcp_agent[267542]: 2025-12-15 10:06:05.084 267546 INFO neutron.agent.dhcp.agent [None req-afe37dc2-a767-4428-a8ac-350446bac239 - - - - - -] DHCP configuration for ports {'79503367-f53f-4b35-8760-76fcaa4d8407', '1b5c35da-1732-42b8-bbf2-17ad31cbab26'} is completed
Dec 15 05:06:05 localhost dnsmasq[322921]: exiting on receipt of SIGTERM
Dec 15 05:06:05 localhost podman[323055]: 2025-12-15 10:06:05.261887121 +0000 UTC m=+0.056007612 container kill 40296364aa698d012c1146811728523fe5bb115e9485dcfd05f6e9aeea04b82f (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-c0669abd-aef1-4b0d-9f97-a6adeeac3211, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251202)
Dec 15 05:06:05 localhost systemd[1]: libpod-40296364aa698d012c1146811728523fe5bb115e9485dcfd05f6e9aeea04b82f.scope: Deactivated successfully.
Dec 15 05:06:05 localhost podman[323069]: 2025-12-15 10:06:05.330294442 +0000 UTC m=+0.051893429 container died 40296364aa698d012c1146811728523fe5bb115e9485dcfd05f6e9aeea04b82f (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-c0669abd-aef1-4b0d-9f97-a6adeeac3211, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Dec 15 05:06:05 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-40296364aa698d012c1146811728523fe5bb115e9485dcfd05f6e9aeea04b82f-userdata-shm.mount: Deactivated successfully.
Dec 15 05:06:05 localhost podman[323069]: 2025-12-15 10:06:05.368503089 +0000 UTC m=+0.090102036 container cleanup 40296364aa698d012c1146811728523fe5bb115e9485dcfd05f6e9aeea04b82f (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-c0669abd-aef1-4b0d-9f97-a6adeeac3211, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team)
Dec 15 05:06:05 localhost systemd[1]: libpod-conmon-40296364aa698d012c1146811728523fe5bb115e9485dcfd05f6e9aeea04b82f.scope: Deactivated successfully.
Dec 15 05:06:05 localhost podman[323070]: 2025-12-15 10:06:05.40689074 +0000 UTC m=+0.123372678 container remove 40296364aa698d012c1146811728523fe5bb115e9485dcfd05f6e9aeea04b82f (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-c0669abd-aef1-4b0d-9f97-a6adeeac3211, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true)
Dec 15 05:06:05 localhost nova_compute[286344]: 2025-12-15 10:06:05.439 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 15 05:06:05 localhost ceph-mon[298913]: from='mgr.34411 ' entity='mgr.np0005559464.aomnqe'
Dec 15 05:06:05 localhost neutron_sriov_agent[260044]: 2025-12-15 10:06:05.691 2 INFO neutron.agent.securitygroups_rpc [None req-404fe7fc-dabe-431b-8e9b-d67a6e771f3a 52899874f6ba4276802009964d723e15 dcbc6ff58fef4432a793909041cfb08b - - default default] Security group rule updated ['0b8be23d-9729-4205-97f5-2b655855dee5']
Dec 15 05:06:05 localhost ceph-mon[298913]: mon.np0005559462@0(leader).osd e171 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 15 05:06:05 localhost ceph-mon[298913]: mon.np0005559462@0(leader).osd e171 do_prune osdmap full prune enabled
Dec 15 05:06:05 localhost neutron_dhcp_agent[267542]: 2025-12-15 10:06:05.715 267546 INFO neutron.agent.dhcp.agent [None req-a86e3a1c-429d-4ad4-af70-4e663826c68f - - - - - -] Network not present, action: clean_devices, action_kwargs: {}
Dec 15 05:06:05 localhost neutron_dhcp_agent[267542]: 2025-12-15 10:06:05.717 267546 INFO neutron.agent.dhcp.agent [None req-a86e3a1c-429d-4ad4-af70-4e663826c68f - - - - - -] Network not present, action: clean_devices, action_kwargs: {}
Dec 15 05:06:05 localhost ceph-mon[298913]: mon.np0005559462@0(leader).osd e172 e172: 6 total, 6 up, 6 in
Dec 15 05:06:05 localhost ceph-mon[298913]: log_channel(cluster) log [DBG] : osdmap e172: 6 total, 6 up, 6 in
Dec 15 05:06:06 localhost neutron_sriov_agent[260044]: 2025-12-15 10:06:06.211 2 INFO neutron.agent.securitygroups_rpc [None req-0018b661-146a-4897-a122-a5c9d1a05a72 52899874f6ba4276802009964d723e15 dcbc6ff58fef4432a793909041cfb08b - - default default] Security group rule updated ['0b8be23d-9729-4205-97f5-2b655855dee5']
Dec 15 05:06:06 localhost systemd[1]: var-lib-containers-storage-overlay-9050452fa5d904ecb2cdef6847e706e0b9c87c15b3c8183dc4667542577be5e7-merged.mount: Deactivated successfully.
Dec 15 05:06:06 localhost systemd[1]: run-netns-qdhcp\x2dc0669abd\x2daef1\x2d4b0d\x2d9f97\x2da6adeeac3211.mount: Deactivated successfully.
Dec 15 05:06:06 localhost neutron_sriov_agent[260044]: 2025-12-15 10:06:06.582 2 INFO neutron.agent.securitygroups_rpc [None req-f5a66aa3-580c-4579-b246-9b30d2c85a31 6b5da6f221214afe93e1fa66574f238b 89e710ef9f4f48d48a369002db572947 - - default default] Security group member updated ['a6c5f808-dddc-4f17-acbf-63b1b6e6f4d6']
Dec 15 05:06:06 localhost neutron_sriov_agent[260044]: 2025-12-15 10:06:06.612 2 INFO neutron.agent.securitygroups_rpc [None req-d3c12c68-c9d8-47cb-a0c4-f792cd4d9eb5 52899874f6ba4276802009964d723e15 dcbc6ff58fef4432a793909041cfb08b - - default default] Security group rule updated ['0b8be23d-9729-4205-97f5-2b655855dee5']
Dec 15 05:06:06 localhost neutron_dhcp_agent[267542]: 2025-12-15 10:06:06.731 267546 INFO neutron.agent.linux.ip_lib [None req-8e484ed1-a320-4b18-ae5e-a825ba80e49e - - - - - -] Device tap5eab3344-75 cannot be used as it has no MAC address
Dec 15 05:06:06 localhost ceph-mon[298913]: mon.np0005559462@0(leader).osd e172 do_prune osdmap full prune enabled
Dec 15 05:06:06 localhost nova_compute[286344]: 2025-12-15 10:06:06.754 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 15 05:06:06 localhost ceph-mon[298913]: mon.np0005559462@0(leader).osd e173 e173: 6 total, 6 up, 6 in
Dec 15 05:06:06 localhost kernel: device tap5eab3344-75 entered promiscuous mode
Dec 15 05:06:06 localhost nova_compute[286344]: 2025-12-15 10:06:06.763 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 15 05:06:06 localhost ovn_controller[154603]: 2025-12-15T10:06:06Z|00296|binding|INFO|Claiming lport 5eab3344-7516-49c6-b83e-7add74748a28 for this chassis.
Dec 15 05:06:06 localhost ovn_controller[154603]: 2025-12-15T10:06:06Z|00297|binding|INFO|5eab3344-7516-49c6-b83e-7add74748a28: Claiming unknown
Dec 15 05:06:06 localhost systemd-udevd[323106]: Network interface NamePolicy= disabled on kernel command line.
Dec 15 05:06:06 localhost NetworkManager[5963]: [1765793166.7666] manager: (tap5eab3344-75): new Generic device (/org/freedesktop/NetworkManager/Devices/48)
Dec 15 05:06:06 localhost ceph-mon[298913]: log_channel(cluster) log [DBG] : osdmap e173: 6 total, 6 up, 6 in
Dec 15 05:06:06 localhost ovn_metadata_agent[160585]: 2025-12-15 10:06:06.774 160590 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005559462.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8::2/64', 'neutron:device_id': 'dhcpce287b4b-a043-5ed9-8e08-4fd555d768a2-c0669abd-aef1-4b0d-9f97-a6adeeac3211', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-c0669abd-aef1-4b0d-9f97-a6adeeac3211', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '89e710ef9f4f48d48a369002db572947', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=e47821dc-5f5d-44dc-8a16-54817df4049d, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=5eab3344-7516-49c6-b83e-7add74748a28) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 15 05:06:06 localhost ovn_metadata_agent[160585]: 2025-12-15 10:06:06.775 160590 INFO neutron.agent.ovn.metadata.agent [-] Port 5eab3344-7516-49c6-b83e-7add74748a28 in datapath c0669abd-aef1-4b0d-9f97-a6adeeac3211 bound to our chassis
Dec 15 05:06:06 localhost ovn_metadata_agent[160585]: 2025-12-15 10:06:06.777 160590 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network c0669abd-aef1-4b0d-9f97-a6adeeac3211 or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599
Dec 15 05:06:06 localhost ovn_metadata_agent[160585]: 2025-12-15 10:06:06.778 160858 DEBUG oslo.privsep.daemon [-] privsep: reply[3554dd99-e13b-4550-bfdf-e0f53497f802]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 15 05:06:06 localhost ovn_controller[154603]: 2025-12-15T10:06:06Z|00298|binding|INFO|Setting lport 5eab3344-7516-49c6-b83e-7add74748a28 ovn-installed in OVS
Dec 15 05:06:06 localhost ovn_controller[154603]: 2025-12-15T10:06:06Z|00299|binding|INFO|Setting lport 5eab3344-7516-49c6-b83e-7add74748a28 up in Southbound
Dec 15 05:06:06 localhost nova_compute[286344]: 2025-12-15 10:06:06.783 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 15 05:06:06 localhost journal[231322]: ethtool ioctl error on tap5eab3344-75: No such device
Dec 15 05:06:06 localhost journal[231322]: ethtool ioctl error on tap5eab3344-75: No such device
Dec 15 05:06:06 localhost nova_compute[286344]: 2025-12-15 10:06:06.802 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 15 05:06:06 localhost journal[231322]: ethtool ioctl error on tap5eab3344-75: No such device
Dec 15 05:06:06 localhost journal[231322]: ethtool ioctl error on tap5eab3344-75: No such device
Dec 15 05:06:06 localhost journal[231322]: ethtool ioctl error on tap5eab3344-75: No such device
Dec 15 05:06:06 localhost journal[231322]: ethtool ioctl error on tap5eab3344-75: No such device
Dec 15 05:06:06 localhost journal[231322]: ethtool ioctl error on tap5eab3344-75: No such device
Dec 15 05:06:06 localhost journal[231322]: ethtool ioctl error on tap5eab3344-75: No such device
Dec 15 05:06:06 localhost nova_compute[286344]: 2025-12-15 10:06:06.844 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 15 05:06:06 localhost nova_compute[286344]: 2025-12-15 10:06:06.874 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 15 05:06:06 localhost neutron_sriov_agent[260044]: 2025-12-15 10:06:06.944 2 INFO neutron.agent.securitygroups_rpc [None req-010a434c-dce9-4f48-8181-2e8f6ec62914 52899874f6ba4276802009964d723e15 dcbc6ff58fef4432a793909041cfb08b - - default default] Security group rule updated ['0b8be23d-9729-4205-97f5-2b655855dee5']
Dec 15 05:06:07 localhost ovn_metadata_agent[160585]: 2025-12-15 10:06:07.679 160590 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=12d96d64-e862-4f68-81e5-8d9ec5d3a5e2, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '14'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 15 05:06:07 localhost podman[323177]:
Dec 15 05:06:07 localhost podman[323177]: 2025-12-15 10:06:07.74785977 +0000 UTC m=+0.094646389 container create dafe96ccae697401d708c2394beec82d1d8c99a7dd67eace79f6a56c7999eea1 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-c0669abd-aef1-4b0d-9f97-a6adeeac3211, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2)
Dec 15 05:06:07 localhost systemd[1]: Started libpod-conmon-dafe96ccae697401d708c2394beec82d1d8c99a7dd67eace79f6a56c7999eea1.scope.
Dec 15 05:06:07 localhost podman[323177]: 2025-12-15 10:06:07.700781142 +0000 UTC m=+0.047567771 image pull quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified
Dec 15 05:06:07 localhost systemd[1]: Started libcrun container.
Dec 15 05:06:07 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/78cd3493f29c2e4428a663e942fd51a6e5df7e4952f7ba51c9786381369ae569/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Dec 15 05:06:07 localhost podman[323177]: 2025-12-15 10:06:07.828666152 +0000 UTC m=+0.175452771 container init dafe96ccae697401d708c2394beec82d1d8c99a7dd67eace79f6a56c7999eea1 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-c0669abd-aef1-4b0d-9f97-a6adeeac3211, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true)
Dec 15 05:06:07 localhost podman[323177]: 2025-12-15 10:06:07.844058278 +0000 UTC m=+0.190844897 container start dafe96ccae697401d708c2394beec82d1d8c99a7dd67eace79f6a56c7999eea1 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-c0669abd-aef1-4b0d-9f97-a6adeeac3211, tcib_managed=true, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 15 05:06:07 localhost dnsmasq[323195]: started, version 2.85 cachesize 150
Dec 15 05:06:07 localhost dnsmasq[323195]: DNS service limited to local subnets
Dec 15 05:06:07 localhost dnsmasq[323195]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile
Dec 15 05:06:07 localhost dnsmasq[323195]: warning: no upstream servers configured
Dec 15 05:06:07 localhost dnsmasq-dhcp[323195]: DHCPv6, static leases only on 2001:db8::, lease time 1d
Dec 15 05:06:07 localhost dnsmasq[323195]: read /var/lib/neutron/dhcp/c0669abd-aef1-4b0d-9f97-a6adeeac3211/addn_hosts - 0 addresses
Dec 15 05:06:07 localhost dnsmasq-dhcp[323195]: read /var/lib/neutron/dhcp/c0669abd-aef1-4b0d-9f97-a6adeeac3211/host
Dec 15 05:06:07 localhost dnsmasq-dhcp[323195]: read /var/lib/neutron/dhcp/c0669abd-aef1-4b0d-9f97-a6adeeac3211/opts
Dec 15 05:06:07 localhost neutron_dhcp_agent[267542]: 2025-12-15 10:06:07.908 267546 INFO neutron.agent.dhcp.agent [None req-8e484ed1-a320-4b18-ae5e-a825ba80e49e - - - - - -] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-12-15T10:06:06Z, description=, device_id=, device_owner=, dns_assignment=[], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[], id=c74980b4-c8a7-4a3e-b2fc-b5f2356112d8, ip_allocation=immediate, mac_address=fa:16:3e:94:e0:59, name=tempest-NetworksTestDHCPv6-825164852, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-12-15T10:05:39Z, description=, dns_domain=, id=c0669abd-aef1-4b0d-9f97-a6adeeac3211, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-NetworksTestDHCPv6-test-network-1861207414, port_security_enabled=True, project_id=89e710ef9f4f48d48a369002db572947, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=20673, qos_policy_id=None, revision_number=14, router:external=False, shared=False, standard_attr_id=2164, status=ACTIVE, subnets=['b6bbff17-d9d9-402c-861c-489221e1401d'], tags=[], tenant_id=89e710ef9f4f48d48a369002db572947, updated_at=2025-12-15T10:06:05Z, vlan_transparent=None, network_id=c0669abd-aef1-4b0d-9f97-a6adeeac3211, port_security_enabled=True, project_id=89e710ef9f4f48d48a369002db572947, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=['a6c5f808-dddc-4f17-acbf-63b1b6e6f4d6'], standard_attr_id=2295, status=DOWN, tags=[], tenant_id=89e710ef9f4f48d48a369002db572947, updated_at=2025-12-15T10:06:06Z on network c0669abd-aef1-4b0d-9f97-a6adeeac3211
Dec 15 05:06:08 localhost nova_compute[286344]: 2025-12-15 10:06:08.104 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 15 05:06:08 localhost dnsmasq[323195]: read /var/lib/neutron/dhcp/c0669abd-aef1-4b0d-9f97-a6adeeac3211/addn_hosts - 1 addresses
Dec 15 05:06:08 localhost dnsmasq-dhcp[323195]: read /var/lib/neutron/dhcp/c0669abd-aef1-4b0d-9f97-a6adeeac3211/host
Dec 15 05:06:08 localhost dnsmasq-dhcp[323195]: read /var/lib/neutron/dhcp/c0669abd-aef1-4b0d-9f97-a6adeeac3211/opts
Dec 15 05:06:08 localhost podman[323214]: 2025-12-15 10:06:08.139632464 +0000 UTC m=+0.091972113 container kill dafe96ccae697401d708c2394beec82d1d8c99a7dd67eace79f6a56c7999eea1 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-c0669abd-aef1-4b0d-9f97-a6adeeac3211, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true)
Dec 15 05:06:08 localhost neutron_dhcp_agent[267542]: 2025-12-15 10:06:08.395 267546 INFO neutron.agent.dhcp.agent [None req-709be871-bf28-4f0b-a3f9-d99ef8c344c8 - - - - - -] DHCP configuration for ports {'79503367-f53f-4b35-8760-76fcaa4d8407'} is completed
Dec 15 05:06:08 localhost neutron_dhcp_agent[267542]: 2025-12-15 10:06:08.540 267546 INFO neutron.agent.dhcp.agent [None req-1c0eec1a-b4d3-42af-9c82-f92877706716 - - - - - -] DHCP configuration for ports {'c74980b4-c8a7-4a3e-b2fc-b5f2356112d8'} is completed
Dec 15 05:06:08 localhost systemd[1]: tmp-crun.OUgoMC.mount: Deactivated successfully.
Dec 15 05:06:08 localhost neutron_sriov_agent[260044]: 2025-12-15 10:06:08.977 2 INFO neutron.agent.securitygroups_rpc [None req-690077e9-4852-4d80-a3f1-5f457e6c757f 6b5da6f221214afe93e1fa66574f238b 89e710ef9f4f48d48a369002db572947 - - default default] Security group member updated ['a6c5f808-dddc-4f17-acbf-63b1b6e6f4d6']
Dec 15 05:06:09 localhost dnsmasq[323195]: read /var/lib/neutron/dhcp/c0669abd-aef1-4b0d-9f97-a6adeeac3211/addn_hosts - 0 addresses
Dec 15 05:06:09 localhost podman[323251]: 2025-12-15 10:06:09.178136781 +0000 UTC m=+0.065032748 container kill dafe96ccae697401d708c2394beec82d1d8c99a7dd67eace79f6a56c7999eea1 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-c0669abd-aef1-4b0d-9f97-a6adeeac3211, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 15 05:06:09 localhost dnsmasq-dhcp[323195]: read /var/lib/neutron/dhcp/c0669abd-aef1-4b0d-9f97-a6adeeac3211/host
Dec 15 05:06:09 localhost dnsmasq-dhcp[323195]: read /var/lib/neutron/dhcp/c0669abd-aef1-4b0d-9f97-a6adeeac3211/opts
Dec 15 05:06:09 localhost neutron_sriov_agent[260044]: 2025-12-15 10:06:09.281 2 INFO neutron.agent.securitygroups_rpc [None req-83c7c815-ca7e-4d29-87f9-0bbd6f3ff4a4 52899874f6ba4276802009964d723e15 dcbc6ff58fef4432a793909041cfb08b - - default default] Security group rule updated ['0b8be23d-9729-4205-97f5-2b655855dee5']
Dec 15 05:06:10 localhost dnsmasq[323195]: exiting on receipt of SIGTERM
Dec 15 05:06:10 localhost podman[323287]: 2025-12-15 10:06:10.007049224 +0000 UTC m=+0.062887455 container kill dafe96ccae697401d708c2394beec82d1d8c99a7dd67eace79f6a56c7999eea1 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-c0669abd-aef1-4b0d-9f97-a6adeeac3211, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.build-date=20251202)
Dec 15 05:06:10 localhost systemd[1]: libpod-dafe96ccae697401d708c2394beec82d1d8c99a7dd67eace79f6a56c7999eea1.scope: Deactivated successfully.
Dec 15 05:06:10 localhost podman[323301]: 2025-12-15 10:06:10.082594125 +0000 UTC m=+0.058286270 container died dafe96ccae697401d708c2394beec82d1d8c99a7dd67eace79f6a56c7999eea1 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-c0669abd-aef1-4b0d-9f97-a6adeeac3211, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, tcib_managed=true, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Dec 15 05:06:10 localhost podman[323301]: 2025-12-15 10:06:10.128629516 +0000 UTC m=+0.104321601 container cleanup dafe96ccae697401d708c2394beec82d1d8c99a7dd67eace79f6a56c7999eea1 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-c0669abd-aef1-4b0d-9f97-a6adeeac3211, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team)
Dec 15 05:06:10 localhost systemd[1]: libpod-conmon-dafe96ccae697401d708c2394beec82d1d8c99a7dd67eace79f6a56c7999eea1.scope: Deactivated successfully.
Dec 15 05:06:10 localhost podman[323303]: 2025-12-15 10:06:10.168024622 +0000 UTC m=+0.137106392 container remove dafe96ccae697401d708c2394beec82d1d8c99a7dd67eace79f6a56c7999eea1 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-c0669abd-aef1-4b0d-9f97-a6adeeac3211, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.schema-version=1.0)
Dec 15 05:06:10 localhost nova_compute[286344]: 2025-12-15 10:06:10.186 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 15 05:06:10 localhost ovn_controller[154603]: 2025-12-15T10:06:10Z|00300|binding|INFO|Releasing lport 5eab3344-7516-49c6-b83e-7add74748a28 from this chassis (sb_readonly=0)
Dec 15 05:06:10 localhost ovn_controller[154603]: 2025-12-15T10:06:10Z|00301|binding|INFO|Setting lport 5eab3344-7516-49c6-b83e-7add74748a28 down in Southbound
Dec 15 05:06:10 localhost kernel: device tap5eab3344-75 left promiscuous mode
Dec 15 05:06:10 localhost nova_compute[286344]: 2025-12-15 10:06:10.215 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 15 05:06:10 localhost nova_compute[286344]: 2025-12-15 10:06:10.216 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 15 05:06:10 localhost nova_compute[286344]: 2025-12-15 10:06:10.440 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 15 05:06:10 localhost ovn_metadata_agent[160585]: 2025-12-15 10:06:10.624 160590 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005559462.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8::2/64', 'neutron:device_id': 'dhcpce287b4b-a043-5ed9-8e08-4fd555d768a2-c0669abd-aef1-4b0d-9f97-a6adeeac3211', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-c0669abd-aef1-4b0d-9f97-a6adeeac3211', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '89e710ef9f4f48d48a369002db572947', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'np0005559462.localdomain'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=e47821dc-5f5d-44dc-8a16-54817df4049d, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=5eab3344-7516-49c6-b83e-7add74748a28) old=Port_Binding(up=[True], chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 15 05:06:10 localhost ovn_metadata_agent[160585]: 2025-12-15 10:06:10.626 160590 INFO neutron.agent.ovn.metadata.agent [-] Port 5eab3344-7516-49c6-b83e-7add74748a28 in datapath c0669abd-aef1-4b0d-9f97-a6adeeac3211 unbound from our chassis
Dec 15 05:06:10 localhost ovn_metadata_agent[160585]: 2025-12-15 10:06:10.627 160590 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network c0669abd-aef1-4b0d-9f97-a6adeeac3211 or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599
Dec 15 05:06:10 localhost ovn_metadata_agent[160585]: 2025-12-15 10:06:10.628 160858 DEBUG oslo.privsep.daemon [-] privsep: reply[d6f6a4fb-1bc4-4c73-b75e-60aa195ca671]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 15 05:06:10 localhost ceph-mon[298913]: mon.np0005559462@0(leader).osd e173 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Dec 15 05:06:11 localhost systemd[1]: tmp-crun.7ING6r.mount: Deactivated successfully.
Dec 15 05:06:11 localhost systemd[1]: var-lib-containers-storage-overlay-78cd3493f29c2e4428a663e942fd51a6e5df7e4952f7ba51c9786381369ae569-merged.mount: Deactivated successfully.
Dec 15 05:06:11 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-dafe96ccae697401d708c2394beec82d1d8c99a7dd67eace79f6a56c7999eea1-userdata-shm.mount: Deactivated successfully.
Dec 15 05:06:11 localhost systemd[1]: run-netns-qdhcp\x2dc0669abd\x2daef1\x2d4b0d\x2d9f97\x2da6adeeac3211.mount: Deactivated successfully.
Dec 15 05:06:11 localhost nova_compute[286344]: 2025-12-15 10:06:11.295 286348 DEBUG oslo_service.periodic_task [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 15 05:06:11 localhost ceph-mon[298913]: mon.np0005559462@0(leader) e17 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0)
Dec 15 05:06:11 localhost ceph-mon[298913]: log_channel(audit) log [DBG] : from='client.25625 172.18.0.34:0/382777224' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Dec 15 05:06:11 localhost neutron_dhcp_agent[267542]: 2025-12-15 10:06:11.806 267546 INFO neutron.agent.dhcp.agent [None req-da63b96e-df8b-47a8-82c6-33ba4fc84d5d - - - - - -] Network not present, action: clean_devices, action_kwargs: {}
Dec 15 05:06:12 localhost neutron_sriov_agent[260044]: 2025-12-15 10:06:12.108 2 INFO neutron.agent.securitygroups_rpc [None req-18130326-71d6-41b7-bd49-d51ab9309bf7 52899874f6ba4276802009964d723e15 dcbc6ff58fef4432a793909041cfb08b - - default default] Security group rule updated ['ca0a36ac-5c24-462d-9e9b-07f79b1c910a']
Dec 15 05:06:12 localhost systemd[1]: Started /usr/bin/podman healthcheck run 67ee72fcd99c896f79e8807764b1d0f04e7d45927d06e306c0986394e8ed42a0.
Dec 15 05:06:12 localhost systemd[1]: Started /usr/bin/podman healthcheck run 9f0f481c937606298203797a71924b7614a5d9e3f4a8afd837cfa5b9602aedeb.
Dec 15 05:06:12 localhost systemd[1]: Started /usr/bin/podman healthcheck run b3d8483ade539500e228c2af9c0c3e647febb5ec7903610a83aea32631007d4a.
Dec 15 05:06:12 localhost podman[323332]: 2025-12-15 10:06:12.769963843 +0000 UTC m=+0.097023698 container health_status 67ee72fcd99c896f79e8807764b1d0f04e7d45927d06e306c0986394e8ed42a0 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}) Dec 15 05:06:12 localhost podman[323332]: 2025-12-15 10:06:12.779007909 +0000 UTC m=+0.106067754 container exec_died 67ee72fcd99c896f79e8807764b1d0f04e7d45927d06e306c0986394e8ed42a0 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', 
'--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible) Dec 15 05:06:12 localhost systemd[1]: 67ee72fcd99c896f79e8807764b1d0f04e7d45927d06e306c0986394e8ed42a0.service: Deactivated successfully. 
Dec 15 05:06:12 localhost podman[323334]: 2025-12-15 10:06:12.877544116 +0000 UTC m=+0.197937055 container health_status b3d8483ade539500e228c2af9c0c3e647febb5ec7903610a83aea32631007d4a (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ceilometer_agent_compute, container_name=ceilometer_agent_compute, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_managed=true, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '07f3af15d79276418f54a8dde2fc251064527c3571430e14af77aaa2c01f1193-cb1e9478634a00c8fb5ad9cd7fcd38c7b2a1a85187dfaa618bc715c24c2b15fb'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', 
'/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}) Dec 15 05:06:12 localhost podman[323334]: 2025-12-15 10:06:12.888630683 +0000 UTC m=+0.209023602 container exec_died b3d8483ade539500e228c2af9c0c3e647febb5ec7903610a83aea32631007d4a (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '07f3af15d79276418f54a8dde2fc251064527c3571430e14af77aaa2c01f1193-cb1e9478634a00c8fb5ad9cd7fcd38c7b2a1a85187dfaa618bc715c24c2b15fb'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ceilometer_agent_compute, io.buildah.version=1.41.3, 
managed_by=edpm_ansible, tcib_managed=true, org.label-schema.build-date=20251202) Dec 15 05:06:12 localhost systemd[1]: b3d8483ade539500e228c2af9c0c3e647febb5ec7903610a83aea32631007d4a.service: Deactivated successfully. Dec 15 05:06:12 localhost podman[323333]: 2025-12-15 10:06:12.980968333 +0000 UTC m=+0.304476280 container health_status 9f0f481c937606298203797a71924b7614a5d9e3f4a8afd837cfa5b9602aedeb (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, org.label-schema.build-date=20251202, config_id=multipathd, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, container_name=multipathd, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, 
io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team) Dec 15 05:06:12 localhost podman[323333]: 2025-12-15 10:06:12.993372724 +0000 UTC m=+0.316880681 container exec_died 9f0f481c937606298203797a71924b7614a5d9e3f4a8afd837cfa5b9602aedeb (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=multipathd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0) Dec 15 05:06:13 localhost systemd[1]: 9f0f481c937606298203797a71924b7614a5d9e3f4a8afd837cfa5b9602aedeb.service: 
Deactivated successfully. Dec 15 05:06:13 localhost nova_compute[286344]: 2025-12-15 10:06:13.139 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:06:13 localhost nova_compute[286344]: 2025-12-15 10:06:13.268 286348 DEBUG oslo_service.periodic_task [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Dec 15 05:06:13 localhost nova_compute[286344]: 2025-12-15 10:06:13.270 286348 DEBUG oslo_service.periodic_task [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Dec 15 05:06:13 localhost nova_compute[286344]: 2025-12-15 10:06:13.270 286348 DEBUG oslo_service.periodic_task [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Dec 15 05:06:13 localhost systemd[1]: Started /usr/bin/podman healthcheck run 730c50020a7d9e2da21d2462d40af20ac303e43f3491f692cbbd87f432ba3e09. Dec 15 05:06:13 localhost systemd[1]: Started /usr/bin/podman healthcheck run ac2eae30a2a7f971398af42ea67b3ed799d4b3341f7688ebcd73f7ca7f66c2a8. 
Dec 15 05:06:13 localhost neutron_dhcp_agent[267542]: 2025-12-15 10:06:13.625 267546 INFO neutron.agent.linux.ip_lib [None req-357e2c08-fee3-4f85-a8f5-22286acb5bf4 - - - - - -] Device tap152baff9-97 cannot be used as it has no MAC address#033[00m Dec 15 05:06:13 localhost podman[323395]: 2025-12-15 10:06:13.642514438 +0000 UTC m=+0.107311727 container health_status ac2eae30a2a7f971398af42ea67b3ed799d4b3341f7688ebcd73f7ca7f66c2a8 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '07f3af15d79276418f54a8dde2fc251064527c3571430e14af77aaa2c01f1193'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251202) Dec 15 05:06:13 localhost nova_compute[286344]: 2025-12-15 10:06:13.648 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup 
/usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:06:13 localhost kernel: device tap152baff9-97 entered promiscuous mode Dec 15 05:06:13 localhost NetworkManager[5963]: [1765793173.6602] manager: (tap152baff9-97): new Generic device (/org/freedesktop/NetworkManager/Devices/49) Dec 15 05:06:13 localhost nova_compute[286344]: 2025-12-15 10:06:13.663 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:06:13 localhost ovn_controller[154603]: 2025-12-15T10:06:13Z|00302|binding|INFO|Claiming lport 152baff9-972b-403a-9d91-62370ca50cad for this chassis. Dec 15 05:06:13 localhost ovn_controller[154603]: 2025-12-15T10:06:13Z|00303|binding|INFO|152baff9-972b-403a-9d91-62370ca50cad: Claiming unknown Dec 15 05:06:13 localhost systemd-udevd[323436]: Network interface NamePolicy= disabled on kernel command line. Dec 15 05:06:13 localhost ovn_metadata_agent[160585]: 2025-12-15 10:06:13.673 160590 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005559462.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8::2/64', 'neutron:device_id': 'dhcpce287b4b-a043-5ed9-8e08-4fd555d768a2-c0669abd-aef1-4b0d-9f97-a6adeeac3211', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-c0669abd-aef1-4b0d-9f97-a6adeeac3211', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '89e710ef9f4f48d48a369002db572947', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, 
additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=e47821dc-5f5d-44dc-8a16-54817df4049d, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=152baff9-972b-403a-9d91-62370ca50cad) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Dec 15 05:06:13 localhost ovn_controller[154603]: 2025-12-15T10:06:13Z|00304|binding|INFO|Setting lport 152baff9-972b-403a-9d91-62370ca50cad ovn-installed in OVS Dec 15 05:06:13 localhost ovn_metadata_agent[160585]: 2025-12-15 10:06:13.675 160590 INFO neutron.agent.ovn.metadata.agent [-] Port 152baff9-972b-403a-9d91-62370ca50cad in datapath c0669abd-aef1-4b0d-9f97-a6adeeac3211 bound to our chassis#033[00m Dec 15 05:06:13 localhost ovn_metadata_agent[160585]: 2025-12-15 10:06:13.676 160590 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network c0669abd-aef1-4b0d-9f97-a6adeeac3211 or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599#033[00m Dec 15 05:06:13 localhost ovn_controller[154603]: 2025-12-15T10:06:13Z|00305|binding|INFO|Setting lport 152baff9-972b-403a-9d91-62370ca50cad up in Southbound Dec 15 05:06:13 localhost ovn_metadata_agent[160585]: 2025-12-15 10:06:13.677 160858 DEBUG oslo.privsep.daemon [-] privsep: reply[3496f8eb-1c6d-498c-a353-5098666804a0]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Dec 15 05:06:13 localhost nova_compute[286344]: 2025-12-15 10:06:13.681 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:06:13 localhost nova_compute[286344]: 2025-12-15 10:06:13.685 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 
15 05:06:13 localhost podman[323394]: 2025-12-15 10:06:13.692723094 +0000 UTC m=+0.162577559 container health_status 730c50020a7d9e2da21d2462d40af20ac303e43f3491f692cbbd87f432ba3e09 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, io.openshift.expose-services=, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-type=git, name=ubi9-minimal, architecture=x86_64, com.redhat.component=ubi9-minimal-container, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, version=9.6, release=1755695350, maintainer=Red Hat, Inc., vendor=Red Hat, Inc., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, build-date=2025-08-20T13:12:41, distribution-scope=public, url=https://catalog.redhat.com/en/search?searchType=containers, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, io.openshift.tags=minimal rhel9, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., config_id=openstack_network_exporter, managed_by=edpm_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=openstack_network_exporter, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'cb1e9478634a00c8fb5ad9cd7fcd38c7b2a1a85187dfaa618bc715c24c2b15fb'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}) Dec 15 05:06:13 localhost nova_compute[286344]: 2025-12-15 10:06:13.714 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:06:13 localhost podman[323395]: 2025-12-15 10:06:13.726697074 +0000 UTC m=+0.191494333 container exec_died ac2eae30a2a7f971398af42ea67b3ed799d4b3341f7688ebcd73f7ca7f66c2a8 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_controller, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, 
org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '07f3af15d79276418f54a8dde2fc251064527c3571430e14af77aaa2c01f1193'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.license=GPLv2, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible) Dec 15 05:06:13 localhost systemd[1]: ac2eae30a2a7f971398af42ea67b3ed799d4b3341f7688ebcd73f7ca7f66c2a8.service: Deactivated successfully. 
Dec 15 05:06:13 localhost podman[323394]: 2025-12-15 10:06:13.744322875 +0000 UTC m=+0.214177340 container exec_died 730c50020a7d9e2da21d2462d40af20ac303e43f3491f692cbbd87f432ba3e09 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, managed_by=edpm_ansible, url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.expose-services=, io.openshift.tags=minimal rhel9, distribution-scope=public, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'cb1e9478634a00c8fb5ad9cd7fcd38c7b2a1a85187dfaa618bc715c24c2b15fb'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, build-date=2025-08-20T13:12:41, maintainer=Red Hat, Inc., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., version=9.6, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., release=1755695350, vcs-type=git, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, config_id=openstack_network_exporter, container_name=openstack_network_exporter, io.buildah.version=1.33.7, architecture=x86_64, com.redhat.component=ubi9-minimal-container, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vendor=Red Hat, Inc., name=ubi9-minimal, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b) Dec 15 05:06:13 localhost systemd[1]: 730c50020a7d9e2da21d2462d40af20ac303e43f3491f692cbbd87f432ba3e09.service: Deactivated successfully. 
Dec 15 05:06:13 localhost nova_compute[286344]: 2025-12-15 10:06:13.767 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:06:13 localhost nova_compute[286344]: 2025-12-15 10:06:13.799 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:06:14 localhost nova_compute[286344]: 2025-12-15 10:06:14.270 286348 DEBUG oslo_service.periodic_task [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Dec 15 05:06:14 localhost nova_compute[286344]: 2025-12-15 10:06:14.272 286348 DEBUG nova.compute.manager [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m Dec 15 05:06:14 localhost nova_compute[286344]: 2025-12-15 10:06:14.273 286348 DEBUG nova.compute.manager [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m Dec 15 05:06:14 localhost podman[323502]: Dec 15 05:06:14 localhost podman[323502]: 2025-12-15 10:06:14.644935122 +0000 UTC m=+0.091375187 container create 6dd79eb6e79b49448fcbdfe350444f62a9db0d01eb86a2eb29995989354b3109 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-c0669abd-aef1-4b0d-9f97-a6adeeac3211, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, 
org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb) Dec 15 05:06:14 localhost systemd[1]: Started libpod-conmon-6dd79eb6e79b49448fcbdfe350444f62a9db0d01eb86a2eb29995989354b3109.scope. Dec 15 05:06:14 localhost systemd[1]: tmp-crun.YSenlQ.mount: Deactivated successfully. Dec 15 05:06:14 localhost podman[323502]: 2025-12-15 10:06:14.602050239 +0000 UTC m=+0.048490354 image pull quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified Dec 15 05:06:14 localhost systemd[1]: Started libcrun container. Dec 15 05:06:14 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/ed31fe2d083b0c6ba7c433586dc37875d9428d2d15c0f414d71d4eef4e129c28/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff) Dec 15 05:06:14 localhost podman[323502]: 2025-12-15 10:06:14.733329554 +0000 UTC m=+0.179769619 container init 6dd79eb6e79b49448fcbdfe350444f62a9db0d01eb86a2eb29995989354b3109 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-c0669abd-aef1-4b0d-9f97-a6adeeac3211, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image) Dec 15 05:06:14 localhost podman[323502]: 2025-12-15 10:06:14.742146295 +0000 UTC m=+0.188586360 container start 6dd79eb6e79b49448fcbdfe350444f62a9db0d01eb86a2eb29995989354b3109 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-c0669abd-aef1-4b0d-9f97-a6adeeac3211, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.build-date=20251202, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, 
org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.license=GPLv2) Dec 15 05:06:14 localhost dnsmasq[323521]: started, version 2.85 cachesize 150 Dec 15 05:06:14 localhost dnsmasq[323521]: DNS service limited to local subnets Dec 15 05:06:14 localhost dnsmasq[323521]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile Dec 15 05:06:14 localhost dnsmasq[323521]: warning: no upstream servers configured Dec 15 05:06:14 localhost dnsmasq-dhcp[323521]: DHCPv6, static leases only on 2001:db8::, lease time 1d Dec 15 05:06:14 localhost dnsmasq[323521]: read /var/lib/neutron/dhcp/c0669abd-aef1-4b0d-9f97-a6adeeac3211/addn_hosts - 0 addresses Dec 15 05:06:14 localhost dnsmasq-dhcp[323521]: read /var/lib/neutron/dhcp/c0669abd-aef1-4b0d-9f97-a6adeeac3211/host Dec 15 05:06:14 localhost dnsmasq-dhcp[323521]: read /var/lib/neutron/dhcp/c0669abd-aef1-4b0d-9f97-a6adeeac3211/opts Dec 15 05:06:14 localhost nova_compute[286344]: 2025-12-15 10:06:14.803 286348 DEBUG oslo_concurrency.lockutils [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Acquiring lock "refresh_cache-39ff1bd9-6f6b-44c8-bbec-a1fd9d196359" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m Dec 15 05:06:14 localhost nova_compute[286344]: 2025-12-15 10:06:14.803 286348 DEBUG oslo_concurrency.lockutils [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Acquired lock "refresh_cache-39ff1bd9-6f6b-44c8-bbec-a1fd9d196359" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m Dec 15 05:06:14 localhost nova_compute[286344]: 2025-12-15 10:06:14.804 286348 DEBUG nova.network.neutron [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] [instance: 39ff1bd9-6f6b-44c8-bbec-a1fd9d196359] Forcefully refreshing network info cache for instance _get_instance_nw_info 
/usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m Dec 15 05:06:14 localhost nova_compute[286344]: 2025-12-15 10:06:14.804 286348 DEBUG nova.objects.instance [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Lazy-loading 'info_cache' on Instance uuid 39ff1bd9-6f6b-44c8-bbec-a1fd9d196359 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m Dec 15 05:06:14 localhost neutron_dhcp_agent[267542]: 2025-12-15 10:06:14.805 267546 INFO neutron.agent.dhcp.agent [None req-357e2c08-fee3-4f85-a8f5-22286acb5bf4 - - - - - -] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-12-15T10:06:13Z, description=, device_id=e5b34fba-4329-4364-b85d-5ffd591a97df, device_owner=network:router_interface, dns_assignment=[], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[], id=03841f32-5c25-497e-a484-097e2d0cdb22, ip_allocation=immediate, mac_address=fa:16:3e:48:e8:91, name=, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-12-15T10:05:39Z, description=, dns_domain=, id=c0669abd-aef1-4b0d-9f97-a6adeeac3211, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-NetworksTestDHCPv6-test-network-1861207414, port_security_enabled=True, project_id=89e710ef9f4f48d48a369002db572947, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=20673, qos_policy_id=None, revision_number=18, router:external=False, shared=False, standard_attr_id=2164, status=ACTIVE, subnets=['65fb6967-f602-498a-bf72-b12237d2983f'], tags=[], tenant_id=89e710ef9f4f48d48a369002db572947, updated_at=2025-12-15T10:06:12Z, vlan_transparent=None, network_id=c0669abd-aef1-4b0d-9f97-a6adeeac3211, port_security_enabled=False, project_id=89e710ef9f4f48d48a369002db572947, qos_network_policy_id=None, 
qos_policy_id=None, resource_request=None, revision_number=1, security_groups=[], standard_attr_id=2327, status=DOWN, tags=[], tenant_id=89e710ef9f4f48d48a369002db572947, updated_at=2025-12-15T10:06:13Z on network c0669abd-aef1-4b0d-9f97-a6adeeac3211#033[00m Dec 15 05:06:14 localhost neutron_dhcp_agent[267542]: 2025-12-15 10:06:14.994 267546 INFO neutron.agent.dhcp.agent [None req-ee06dadc-f324-4df2-ae23-6e2365a76b1f - - - - - -] DHCP configuration for ports {'79503367-f53f-4b35-8760-76fcaa4d8407'} is completed#033[00m Dec 15 05:06:15 localhost dnsmasq[323521]: read /var/lib/neutron/dhcp/c0669abd-aef1-4b0d-9f97-a6adeeac3211/addn_hosts - 1 addresses Dec 15 05:06:15 localhost dnsmasq-dhcp[323521]: read /var/lib/neutron/dhcp/c0669abd-aef1-4b0d-9f97-a6adeeac3211/host Dec 15 05:06:15 localhost dnsmasq-dhcp[323521]: read /var/lib/neutron/dhcp/c0669abd-aef1-4b0d-9f97-a6adeeac3211/opts Dec 15 05:06:15 localhost podman[323540]: 2025-12-15 10:06:15.009061424 +0000 UTC m=+0.063629283 container kill 6dd79eb6e79b49448fcbdfe350444f62a9db0d01eb86a2eb29995989354b3109 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-c0669abd-aef1-4b0d-9f97-a6adeeac3211, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251202, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.license=GPLv2) Dec 15 05:06:15 localhost neutron_dhcp_agent[267542]: 2025-12-15 10:06:15.341 267546 INFO neutron.agent.dhcp.agent [None req-e1e9e251-c58e-495c-a9cb-602488045644 - - - - - -] DHCP configuration for ports {'03841f32-5c25-497e-a484-097e2d0cdb22'} is completed#033[00m Dec 15 05:06:15 localhost nova_compute[286344]: 2025-12-15 10:06:15.478 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup 
/usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:06:15 localhost ceph-mon[298913]: mon.np0005559462@0(leader).osd e173 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Dec 15 05:06:15 localhost ceph-mon[298913]: mon.np0005559462@0(leader).osd e173 do_prune osdmap full prune enabled Dec 15 05:06:15 localhost ceph-mon[298913]: mon.np0005559462@0(leader).osd e174 e174: 6 total, 6 up, 6 in Dec 15 05:06:15 localhost ceph-mon[298913]: log_channel(cluster) log [DBG] : osdmap e174: 6 total, 6 up, 6 in Dec 15 05:06:15 localhost nova_compute[286344]: 2025-12-15 10:06:15.950 286348 DEBUG nova.network.neutron [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] [instance: 39ff1bd9-6f6b-44c8-bbec-a1fd9d196359] Updating instance_info_cache with network_info: [{"id": "03ef8889-3216-43fb-8a52-4be17a956ce1", "address": "fa:16:3e:74:df:7c", "network": {"id": "befb7a72-17a9-4bcb-b561-84b8f626685a", "bridge": "br-int", "label": "private", "subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.201", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.20", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.0.3"}}], "meta": {"injected": false, "tenant_id": "c785bf23f53946bc99867d8832a50266", "mtu": 1292, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap03ef8889-32", "ovs_interfaceid": "03ef8889-3216-43fb-8a52-4be17a956ce1", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info 
/usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m Dec 15 05:06:15 localhost nova_compute[286344]: 2025-12-15 10:06:15.986 286348 DEBUG oslo_concurrency.lockutils [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Releasing lock "refresh_cache-39ff1bd9-6f6b-44c8-bbec-a1fd9d196359" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m Dec 15 05:06:15 localhost nova_compute[286344]: 2025-12-15 10:06:15.986 286348 DEBUG nova.compute.manager [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] [instance: 39ff1bd9-6f6b-44c8-bbec-a1fd9d196359] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m Dec 15 05:06:15 localhost nova_compute[286344]: 2025-12-15 10:06:15.987 286348 DEBUG oslo_service.periodic_task [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Dec 15 05:06:16 localhost neutron_dhcp_agent[267542]: 2025-12-15 10:06:16.152 267546 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-12-15T10:06:13Z, description=, device_id=e5b34fba-4329-4364-b85d-5ffd591a97df, device_owner=network:router_interface, dns_assignment=[], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[], id=03841f32-5c25-497e-a484-097e2d0cdb22, ip_allocation=immediate, mac_address=fa:16:3e:48:e8:91, name=, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-12-15T10:05:39Z, description=, dns_domain=, id=c0669abd-aef1-4b0d-9f97-a6adeeac3211, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, 
name=tempest-NetworksTestDHCPv6-test-network-1861207414, port_security_enabled=True, project_id=89e710ef9f4f48d48a369002db572947, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=20673, qos_policy_id=None, revision_number=18, router:external=False, shared=False, standard_attr_id=2164, status=ACTIVE, subnets=['65fb6967-f602-498a-bf72-b12237d2983f'], tags=[], tenant_id=89e710ef9f4f48d48a369002db572947, updated_at=2025-12-15T10:06:12Z, vlan_transparent=None, network_id=c0669abd-aef1-4b0d-9f97-a6adeeac3211, port_security_enabled=False, project_id=89e710ef9f4f48d48a369002db572947, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=[], standard_attr_id=2327, status=DOWN, tags=[], tenant_id=89e710ef9f4f48d48a369002db572947, updated_at=2025-12-15T10:06:13Z on network c0669abd-aef1-4b0d-9f97-a6adeeac3211#033[00m Dec 15 05:06:16 localhost dnsmasq[323521]: read /var/lib/neutron/dhcp/c0669abd-aef1-4b0d-9f97-a6adeeac3211/addn_hosts - 1 addresses Dec 15 05:06:16 localhost dnsmasq-dhcp[323521]: read /var/lib/neutron/dhcp/c0669abd-aef1-4b0d-9f97-a6adeeac3211/host Dec 15 05:06:16 localhost podman[323578]: 2025-12-15 10:06:16.342510152 +0000 UTC m=+0.059046489 container kill 6dd79eb6e79b49448fcbdfe350444f62a9db0d01eb86a2eb29995989354b3109 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-c0669abd-aef1-4b0d-9f97-a6adeeac3211, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team) Dec 15 05:06:16 localhost dnsmasq-dhcp[323521]: read /var/lib/neutron/dhcp/c0669abd-aef1-4b0d-9f97-a6adeeac3211/opts Dec 15 05:06:16 localhost nova_compute[286344]: 2025-12-15 
10:06:16.547 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:06:16 localhost neutron_dhcp_agent[267542]: 2025-12-15 10:06:16.619 267546 INFO neutron.agent.dhcp.agent [None req-2bc3561e-2014-4137-8400-b4b8b4f34a62 - - - - - -] DHCP configuration for ports {'03841f32-5c25-497e-a484-097e2d0cdb22'} is completed#033[00m Dec 15 05:06:18 localhost neutron_dhcp_agent[267542]: 2025-12-15 10:06:18.116 267546 INFO neutron.agent.linux.ip_lib [None req-45f6a720-886e-4823-a159-043e1e1f6efc - - - - - -] Device tapa75f0a50-91 cannot be used as it has no MAC address#033[00m Dec 15 05:06:18 localhost nova_compute[286344]: 2025-12-15 10:06:18.174 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:06:18 localhost kernel: device tapa75f0a50-91 entered promiscuous mode Dec 15 05:06:18 localhost NetworkManager[5963]: [1765793178.1834] manager: (tapa75f0a50-91): new Generic device (/org/freedesktop/NetworkManager/Devices/50) Dec 15 05:06:18 localhost systemd-udevd[323608]: Network interface NamePolicy= disabled on kernel command line. Dec 15 05:06:18 localhost ovn_controller[154603]: 2025-12-15T10:06:18Z|00306|binding|INFO|Claiming lport a75f0a50-91fb-45a9-989c-e3dbcd7547b2 for this chassis. 
Dec 15 05:06:18 localhost ovn_controller[154603]: 2025-12-15T10:06:18Z|00307|binding|INFO|a75f0a50-91fb-45a9-989c-e3dbcd7547b2: Claiming unknown Dec 15 05:06:18 localhost nova_compute[286344]: 2025-12-15 10:06:18.190 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:06:18 localhost journal[231322]: ethtool ioctl error on tapa75f0a50-91: No such device Dec 15 05:06:18 localhost journal[231322]: ethtool ioctl error on tapa75f0a50-91: No such device Dec 15 05:06:18 localhost ovn_controller[154603]: 2025-12-15T10:06:18Z|00308|binding|INFO|Setting lport a75f0a50-91fb-45a9-989c-e3dbcd7547b2 ovn-installed in OVS Dec 15 05:06:18 localhost journal[231322]: ethtool ioctl error on tapa75f0a50-91: No such device Dec 15 05:06:18 localhost nova_compute[286344]: 2025-12-15 10:06:18.228 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:06:18 localhost journal[231322]: ethtool ioctl error on tapa75f0a50-91: No such device Dec 15 05:06:18 localhost journal[231322]: ethtool ioctl error on tapa75f0a50-91: No such device Dec 15 05:06:18 localhost journal[231322]: ethtool ioctl error on tapa75f0a50-91: No such device Dec 15 05:06:18 localhost journal[231322]: ethtool ioctl error on tapa75f0a50-91: No such device Dec 15 05:06:18 localhost journal[231322]: ethtool ioctl error on tapa75f0a50-91: No such device Dec 15 05:06:18 localhost ovn_controller[154603]: 2025-12-15T10:06:18Z|00309|binding|INFO|Setting lport a75f0a50-91fb-45a9-989c-e3dbcd7547b2 up in Southbound Dec 15 05:06:18 localhost ovn_metadata_agent[160585]: 2025-12-15 10:06:18.262 160590 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, 
nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005559462.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8::3/64', 'neutron:device_id': 'dhcpce287b4b-a043-5ed9-8e08-4fd555d768a2-0cde33d5-98fb-45b7-b0ca-efc60772f6b8', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-0cde33d5-98fb-45b7-b0ca-efc60772f6b8', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '58db13bf0d834ccda318448020067936', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=d36e857a-32c4-489d-af59-957ec099a1b0, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=a75f0a50-91fb-45a9-989c-e3dbcd7547b2) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Dec 15 05:06:18 localhost ovn_metadata_agent[160585]: 2025-12-15 10:06:18.264 160590 INFO neutron.agent.ovn.metadata.agent [-] Port a75f0a50-91fb-45a9-989c-e3dbcd7547b2 in datapath 0cde33d5-98fb-45b7-b0ca-efc60772f6b8 bound to our chassis#033[00m Dec 15 05:06:18 localhost ovn_metadata_agent[160585]: 2025-12-15 10:06:18.267 160590 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network 0cde33d5-98fb-45b7-b0ca-efc60772f6b8 or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599#033[00m Dec 15 05:06:18 localhost ovn_metadata_agent[160585]: 2025-12-15 10:06:18.268 160858 DEBUG oslo.privsep.daemon [-] privsep: reply[8e13ee92-ab76-477d-941b-df404b029683]: (4, False) _call_back 
/usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Dec 15 05:06:18 localhost nova_compute[286344]: 2025-12-15 10:06:18.270 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:06:18 localhost nova_compute[286344]: 2025-12-15 10:06:18.302 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:06:18 localhost systemd[1]: tmp-crun.o0wleN.mount: Deactivated successfully. Dec 15 05:06:18 localhost dnsmasq[323521]: read /var/lib/neutron/dhcp/c0669abd-aef1-4b0d-9f97-a6adeeac3211/addn_hosts - 0 addresses Dec 15 05:06:18 localhost dnsmasq-dhcp[323521]: read /var/lib/neutron/dhcp/c0669abd-aef1-4b0d-9f97-a6adeeac3211/host Dec 15 05:06:18 localhost podman[323671]: 2025-12-15 10:06:18.767847904 +0000 UTC m=+0.070619458 container kill 6dd79eb6e79b49448fcbdfe350444f62a9db0d01eb86a2eb29995989354b3109 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-c0669abd-aef1-4b0d-9f97-a6adeeac3211, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, io.buildah.version=1.41.3) Dec 15 05:06:18 localhost dnsmasq-dhcp[323521]: read /var/lib/neutron/dhcp/c0669abd-aef1-4b0d-9f97-a6adeeac3211/opts Dec 15 05:06:19 localhost podman[323719]: Dec 15 05:06:19 localhost podman[323719]: 2025-12-15 10:06:19.091432041 +0000 UTC m=+0.088545967 container create ddd13815141668579541faacb26dfb57d4f0cb41c2119911b83a65fc6653afe0 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-0cde33d5-98fb-45b7-b0ca-efc60772f6b8, 
org.label-schema.schema-version=1.0, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true) Dec 15 05:06:19 localhost systemd[1]: Started libpod-conmon-ddd13815141668579541faacb26dfb57d4f0cb41c2119911b83a65fc6653afe0.scope. Dec 15 05:06:19 localhost podman[323719]: 2025-12-15 10:06:19.049271546 +0000 UTC m=+0.046385512 image pull quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified Dec 15 05:06:19 localhost systemd[1]: tmp-crun.fcKwgA.mount: Deactivated successfully. Dec 15 05:06:19 localhost systemd[1]: Started libcrun container. Dec 15 05:06:19 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/72662ed49334642861ef2e6a93d61bd834d1ca7d053b9c7c062903ce6f2fc17f/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff) Dec 15 05:06:19 localhost podman[323719]: 2025-12-15 10:06:19.189223059 +0000 UTC m=+0.186336995 container init ddd13815141668579541faacb26dfb57d4f0cb41c2119911b83a65fc6653afe0 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-0cde33d5-98fb-45b7-b0ca-efc60772f6b8, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0) Dec 15 05:06:19 localhost podman[323719]: 2025-12-15 10:06:19.197726141 +0000 UTC m=+0.194840067 container start ddd13815141668579541faacb26dfb57d4f0cb41c2119911b83a65fc6653afe0 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, 
name=neutron-dnsmasq-qdhcp-0cde33d5-98fb-45b7-b0ca-efc60772f6b8, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image) Dec 15 05:06:19 localhost dnsmasq[323738]: started, version 2.85 cachesize 150 Dec 15 05:06:19 localhost dnsmasq[323738]: DNS service limited to local subnets Dec 15 05:06:19 localhost dnsmasq[323738]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile Dec 15 05:06:19 localhost dnsmasq[323738]: warning: no upstream servers configured Dec 15 05:06:19 localhost dnsmasq-dhcp[323738]: DHCPv6, static leases only on 2001:db8::, lease time 1d Dec 15 05:06:19 localhost ceph-mon[298913]: mon.np0005559462@0(leader) e17 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) Dec 15 05:06:19 localhost ceph-mon[298913]: log_channel(audit) log [DBG] : from='client.25625 172.18.0.34:0/382777224' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch Dec 15 05:06:19 localhost dnsmasq[323738]: read /var/lib/neutron/dhcp/0cde33d5-98fb-45b7-b0ca-efc60772f6b8/addn_hosts - 0 addresses Dec 15 05:06:19 localhost dnsmasq-dhcp[323738]: read /var/lib/neutron/dhcp/0cde33d5-98fb-45b7-b0ca-efc60772f6b8/host Dec 15 05:06:19 localhost dnsmasq-dhcp[323738]: read /var/lib/neutron/dhcp/0cde33d5-98fb-45b7-b0ca-efc60772f6b8/opts Dec 15 05:06:19 localhost nova_compute[286344]: 2025-12-15 10:06:19.270 286348 DEBUG oslo_service.periodic_task [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Dec 15 
05:06:19 localhost nova_compute[286344]: 2025-12-15 10:06:19.270 286348 DEBUG nova.compute.manager [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m Dec 15 05:06:19 localhost nova_compute[286344]: 2025-12-15 10:06:19.271 286348 DEBUG oslo_service.periodic_task [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Dec 15 05:06:19 localhost nova_compute[286344]: 2025-12-15 10:06:19.305 286348 DEBUG oslo_concurrency.lockutils [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Dec 15 05:06:19 localhost nova_compute[286344]: 2025-12-15 10:06:19.305 286348 DEBUG oslo_concurrency.lockutils [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Dec 15 05:06:19 localhost nova_compute[286344]: 2025-12-15 10:06:19.305 286348 DEBUG oslo_concurrency.lockutils [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Dec 15 05:06:19 localhost nova_compute[286344]: 2025-12-15 10:06:19.306 286348 DEBUG nova.compute.resource_tracker [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Auditing locally available compute resources for 
np0005559462.localdomain (node: np0005559462.localdomain) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m Dec 15 05:06:19 localhost nova_compute[286344]: 2025-12-15 10:06:19.306 286348 DEBUG oslo_concurrency.processutils [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m Dec 15 05:06:19 localhost ovn_controller[154603]: 2025-12-15T10:06:19Z|00310|binding|INFO|Releasing lport 152baff9-972b-403a-9d91-62370ca50cad from this chassis (sb_readonly=0) Dec 15 05:06:19 localhost kernel: device tap152baff9-97 left promiscuous mode Dec 15 05:06:19 localhost ovn_controller[154603]: 2025-12-15T10:06:19Z|00311|binding|INFO|Setting lport 152baff9-972b-403a-9d91-62370ca50cad down in Southbound Dec 15 05:06:19 localhost nova_compute[286344]: 2025-12-15 10:06:19.323 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:06:19 localhost nova_compute[286344]: 2025-12-15 10:06:19.341 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:06:19 localhost ovn_metadata_agent[160585]: 2025-12-15 10:06:19.446 160590 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005559462.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8::2/64', 'neutron:device_id': 'dhcpce287b4b-a043-5ed9-8e08-4fd555d768a2-c0669abd-aef1-4b0d-9f97-a6adeeac3211', 
'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-c0669abd-aef1-4b0d-9f97-a6adeeac3211', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '89e710ef9f4f48d48a369002db572947', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'np0005559462.localdomain'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=e47821dc-5f5d-44dc-8a16-54817df4049d, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=152baff9-972b-403a-9d91-62370ca50cad) old=Port_Binding(up=[True], chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Dec 15 05:06:19 localhost ovn_metadata_agent[160585]: 2025-12-15 10:06:19.448 160590 INFO neutron.agent.ovn.metadata.agent [-] Port 152baff9-972b-403a-9d91-62370ca50cad in datapath c0669abd-aef1-4b0d-9f97-a6adeeac3211 unbound from our chassis#033[00m Dec 15 05:06:19 localhost ovn_metadata_agent[160585]: 2025-12-15 10:06:19.450 160590 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network c0669abd-aef1-4b0d-9f97-a6adeeac3211 or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599#033[00m Dec 15 05:06:19 localhost ovn_metadata_agent[160585]: 2025-12-15 10:06:19.451 160858 DEBUG oslo.privsep.daemon [-] privsep: reply[f816351d-ef56-46d9-9466-bdc61efd2fe9]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Dec 15 05:06:19 localhost dnsmasq[323738]: read /var/lib/neutron/dhcp/0cde33d5-98fb-45b7-b0ca-efc60772f6b8/addn_hosts - 0 addresses Dec 15 05:06:19 localhost dnsmasq-dhcp[323738]: read /var/lib/neutron/dhcp/0cde33d5-98fb-45b7-b0ca-efc60772f6b8/host Dec 15 
05:06:19 localhost podman[323778]: 2025-12-15 10:06:19.65164731 +0000 UTC m=+0.062695090 container kill ddd13815141668579541faacb26dfb57d4f0cb41c2119911b83a65fc6653afe0 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-0cde33d5-98fb-45b7-b0ca-efc60772f6b8, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true) Dec 15 05:06:19 localhost dnsmasq-dhcp[323738]: read /var/lib/neutron/dhcp/0cde33d5-98fb-45b7-b0ca-efc60772f6b8/opts Dec 15 05:06:19 localhost neutron_dhcp_agent[267542]: 2025-12-15 10:06:19.720 267546 INFO neutron.agent.dhcp.agent [None req-dde1379a-5d07-4c29-ac0d-917d6359c959 - - - - - -] DHCP configuration for ports {'5ff0e563-a801-4b03-a2ec-8d7a3fcdf08b'} is completed#033[00m Dec 15 05:06:19 localhost ceph-mon[298913]: mon.np0005559462@0(leader) e17 handle_command mon_command({"prefix": "df", "format": "json"} v 0) Dec 15 05:06:19 localhost ceph-mon[298913]: log_channel(audit) log [DBG] : from='client.? 
172.18.0.106:0/1091456141' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch Dec 15 05:06:19 localhost nova_compute[286344]: 2025-12-15 10:06:19.757 286348 DEBUG oslo_concurrency.processutils [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.451s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m Dec 15 05:06:19 localhost nova_compute[286344]: 2025-12-15 10:06:19.978 286348 DEBUG nova.virt.libvirt.driver [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m Dec 15 05:06:19 localhost nova_compute[286344]: 2025-12-15 10:06:19.979 286348 DEBUG nova.virt.libvirt.driver [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m Dec 15 05:06:20 localhost neutron_dhcp_agent[267542]: 2025-12-15 10:06:20.179 267546 INFO neutron.agent.dhcp.agent [None req-06ac4267-49db-48e8-be71-1c4a10290e1f - - - - - -] DHCP configuration for ports {'5ff0e563-a801-4b03-a2ec-8d7a3fcdf08b', 'a75f0a50-91fb-45a9-989c-e3dbcd7547b2'} is completed#033[00m Dec 15 05:06:20 localhost nova_compute[286344]: 2025-12-15 10:06:20.196 286348 WARNING nova.virt.libvirt.driver [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] This host appears to have multiple sockets per NUMA node. 
The `socket` PCI NUMA affinity will not be supported.#033[00m Dec 15 05:06:20 localhost nova_compute[286344]: 2025-12-15 10:06:20.198 286348 DEBUG nova.compute.resource_tracker [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Hypervisor/Node resource view: name=np0005559462.localdomain free_ram=11240MB free_disk=41.836978912353516GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", 
"product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m Dec 15 05:06:20 localhost nova_compute[286344]: 2025-12-15 10:06:20.198 286348 DEBUG oslo_concurrency.lockutils [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Dec 15 05:06:20 localhost nova_compute[286344]: 2025-12-15 10:06:20.199 286348 DEBUG oslo_concurrency.lockutils [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Dec 15 05:06:20 localhost nova_compute[286344]: 2025-12-15 10:06:20.320 286348 DEBUG nova.compute.resource_tracker [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Instance 39ff1bd9-6f6b-44c8-bbec-a1fd9d196359 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 2, 'MEMORY_MB': 512, 'VCPU': 1}}. 
_remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m Dec 15 05:06:20 localhost nova_compute[286344]: 2025-12-15 10:06:20.321 286348 DEBUG nova.compute.resource_tracker [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m Dec 15 05:06:20 localhost nova_compute[286344]: 2025-12-15 10:06:20.322 286348 DEBUG nova.compute.resource_tracker [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Final resource view: name=np0005559462.localdomain phys_ram=15738MB used_ram=1024MB phys_disk=41GB used_disk=2GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m Dec 15 05:06:20 localhost nova_compute[286344]: 2025-12-15 10:06:20.362 286348 DEBUG nova.scheduler.client.report [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Refreshing inventories for resource provider 26c8956b-6742-4951-b566-971b9bbe323b _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804#033[00m Dec 15 05:06:20 localhost nova_compute[286344]: 2025-12-15 10:06:20.506 286348 DEBUG nova.scheduler.client.report [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Updating ProviderTree inventory for provider 26c8956b-6742-4951-b566-971b9bbe323b from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768#033[00m Dec 15 
05:06:20 localhost nova_compute[286344]: 2025-12-15 10:06:20.507 286348 DEBUG nova.compute.provider_tree [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Updating inventory in ProviderTree for provider 26c8956b-6742-4951-b566-971b9bbe323b with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176#033[00m Dec 15 05:06:20 localhost nova_compute[286344]: 2025-12-15 10:06:20.513 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:06:20 localhost nova_compute[286344]: 2025-12-15 10:06:20.541 286348 DEBUG nova.scheduler.client.report [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Refreshing aggregate associations for resource provider 26c8956b-6742-4951-b566-971b9bbe323b, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813#033[00m Dec 15 05:06:20 localhost dnsmasq[323521]: exiting on receipt of SIGTERM Dec 15 05:06:20 localhost podman[323816]: 2025-12-15 10:06:20.547064997 +0000 UTC m=+0.059834478 container kill 6dd79eb6e79b49448fcbdfe350444f62a9db0d01eb86a2eb29995989354b3109 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-c0669abd-aef1-4b0d-9f97-a6adeeac3211, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, 
org.label-schema.build-date=20251202, org.label-schema.license=GPLv2) Dec 15 05:06:20 localhost systemd[1]: libpod-6dd79eb6e79b49448fcbdfe350444f62a9db0d01eb86a2eb29995989354b3109.scope: Deactivated successfully. Dec 15 05:06:20 localhost podman[323828]: 2025-12-15 10:06:20.612148736 +0000 UTC m=+0.054454464 container died 6dd79eb6e79b49448fcbdfe350444f62a9db0d01eb86a2eb29995989354b3109 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-c0669abd-aef1-4b0d-9f97-a6adeeac3211, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team) Dec 15 05:06:20 localhost nova_compute[286344]: 2025-12-15 10:06:20.636 286348 DEBUG nova.scheduler.client.report [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Refreshing trait associations for resource provider 26c8956b-6742-4951-b566-971b9bbe323b, traits: 
COMPUTE_NET_VIF_MODEL_RTL8139,COMPUTE_STORAGE_BUS_IDE,COMPUTE_DEVICE_TAGGING,COMPUTE_NET_VIF_MODEL_PCNET,COMPUTE_NODE,COMPUTE_STORAGE_BUS_USB,COMPUTE_IMAGE_TYPE_AKI,COMPUTE_NET_ATTACH_INTERFACE,HW_CPU_X86_CLMUL,HW_CPU_X86_ABM,HW_CPU_X86_SSE2,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_NET_VIF_MODEL_E1000,HW_CPU_X86_SSE41,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_NET_VIF_MODEL_VIRTIO,HW_CPU_X86_BMI2,HW_CPU_X86_SSE42,HW_CPU_X86_FMA3,COMPUTE_SECURITY_UEFI_SECURE_BOOT,COMPUTE_IMAGE_TYPE_RAW,HW_CPU_X86_BMI,COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_ACCELERATORS,COMPUTE_IMAGE_TYPE_ARI,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_STORAGE_BUS_FDC,COMPUTE_VIOMMU_MODEL_INTEL,COMPUTE_GRAPHICS_MODEL_NONE,HW_CPU_X86_AMD_SVM,COMPUTE_NET_VIF_MODEL_E1000E,HW_CPU_X86_SSE,HW_CPU_X86_AVX,HW_CPU_X86_SVM,COMPUTE_SECURITY_TPM_2_0,COMPUTE_TRUSTED_CERTS,COMPUTE_RESCUE_BFV,HW_CPU_X86_AVX2,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_VIOMMU_MODEL_AUTO,COMPUTE_VIOMMU_MODEL_VIRTIO,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_NET_VIF_MODEL_NE2K_PCI,HW_CPU_X86_SSSE3,COMPUTE_IMAGE_TYPE_AMI,HW_CPU_X86_AESNI,HW_CPU_X86_SSE4A,HW_CPU_X86_MMX,COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_VOLUME_EXTEND,HW_CPU_X86_F16C,HW_CPU_X86_SHA,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_STORAGE_BUS_VIRTIO,COMPUTE_SECURITY_TPM_1_2,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,COMPUTE_STORAGE_BUS_SATA _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825#033[00m Dec 15 05:06:20 localhost systemd[1]: tmp-crun.IuJNz1.mount: Deactivated successfully. 
Dec 15 05:06:20 localhost podman[323828]: 2025-12-15 10:06:20.665344337 +0000 UTC m=+0.107650025 container cleanup 6dd79eb6e79b49448fcbdfe350444f62a9db0d01eb86a2eb29995989354b3109 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-c0669abd-aef1-4b0d-9f97-a6adeeac3211, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251202, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb) Dec 15 05:06:20 localhost systemd[1]: libpod-conmon-6dd79eb6e79b49448fcbdfe350444f62a9db0d01eb86a2eb29995989354b3109.scope: Deactivated successfully. Dec 15 05:06:20 localhost nova_compute[286344]: 2025-12-15 10:06:20.683 286348 DEBUG oslo_concurrency.processutils [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m Dec 15 05:06:20 localhost ceph-mon[298913]: mon.np0005559462@0(leader).osd e174 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Dec 15 05:06:20 localhost podman[323835]: 2025-12-15 10:06:20.744516888 +0000 UTC m=+0.175153304 container remove 6dd79eb6e79b49448fcbdfe350444f62a9db0d01eb86a2eb29995989354b3109 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-c0669abd-aef1-4b0d-9f97-a6adeeac3211, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.vendor=CentOS, 
tcib_build_tag=c3923531bcda0b0811b2d5053f189beb) Dec 15 05:06:20 localhost ovn_metadata_agent[160585]: 2025-12-15 10:06:20.815 160590 WARNING neutron.agent.ovn.metadata.agent [-] Removing non-external type port df3aa4c3-35a5-4ad9-97de-3d4b0cfb8864 with type ""#033[00m Dec 15 05:06:20 localhost ovn_controller[154603]: 2025-12-15T10:06:20Z|00312|binding|INFO|Removing iface tapa75f0a50-91 ovn-installed in OVS Dec 15 05:06:20 localhost ovn_controller[154603]: 2025-12-15T10:06:20Z|00313|binding|INFO|Removing lport a75f0a50-91fb-45a9-989c-e3dbcd7547b2 ovn-installed in OVS Dec 15 05:06:20 localhost ovn_metadata_agent[160585]: 2025-12-15 10:06:20.818 160590 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched DELETE: PortBindingDeletedEvent(events=('delete',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[True], options={'requested-chassis': 'np0005559462.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8::3/64', 'neutron:device_id': 'dhcpce287b4b-a043-5ed9-8e08-4fd555d768a2-0cde33d5-98fb-45b7-b0ca-efc60772f6b8', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-0cde33d5-98fb-45b7-b0ca-efc60772f6b8', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '58db13bf0d834ccda318448020067936', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'np0005559462.localdomain'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=d36e857a-32c4-489d-af59-957ec099a1b0, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=a75f0a50-91fb-45a9-989c-e3dbcd7547b2) old= matches 
/usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Dec 15 05:06:20 localhost nova_compute[286344]: 2025-12-15 10:06:20.819 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:06:20 localhost ovn_metadata_agent[160585]: 2025-12-15 10:06:20.822 160590 INFO neutron.agent.ovn.metadata.agent [-] Port a75f0a50-91fb-45a9-989c-e3dbcd7547b2 in datapath 0cde33d5-98fb-45b7-b0ca-efc60772f6b8 unbound from our chassis#033[00m Dec 15 05:06:20 localhost ovn_metadata_agent[160585]: 2025-12-15 10:06:20.827 160590 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 0cde33d5-98fb-45b7-b0ca-efc60772f6b8, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m Dec 15 05:06:20 localhost ovn_metadata_agent[160585]: 2025-12-15 10:06:20.828 160858 DEBUG oslo.privsep.daemon [-] privsep: reply[175fde0c-7884-4836-b774-a0649cf06954]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Dec 15 05:06:20 localhost nova_compute[286344]: 2025-12-15 10:06:20.830 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:06:20 localhost dnsmasq[323738]: exiting on receipt of SIGTERM Dec 15 05:06:20 localhost podman[323875]: 2025-12-15 10:06:20.859012053 +0000 UTC m=+0.066745421 container kill ddd13815141668579541faacb26dfb57d4f0cb41c2119911b83a65fc6653afe0 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-0cde33d5-98fb-45b7-b0ca-efc60772f6b8, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, 
tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS) Dec 15 05:06:20 localhost systemd[1]: libpod-ddd13815141668579541faacb26dfb57d4f0cb41c2119911b83a65fc6653afe0.scope: Deactivated successfully. Dec 15 05:06:20 localhost podman[323906]: 2025-12-15 10:06:20.919921797 +0000 UTC m=+0.050520595 container died ddd13815141668579541faacb26dfb57d4f0cb41c2119911b83a65fc6653afe0 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-0cde33d5-98fb-45b7-b0ca-efc60772f6b8, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team) Dec 15 05:06:20 localhost podman[323906]: 2025-12-15 10:06:20.944569034 +0000 UTC m=+0.075167802 container cleanup ddd13815141668579541faacb26dfb57d4f0cb41c2119911b83a65fc6653afe0 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-0cde33d5-98fb-45b7-b0ca-efc60772f6b8, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, io.buildah.version=1.41.3) Dec 15 05:06:20 localhost systemd[1]: libpod-conmon-ddd13815141668579541faacb26dfb57d4f0cb41c2119911b83a65fc6653afe0.scope: Deactivated successfully. 
Dec 15 05:06:20 localhost podman[323913]: 2025-12-15 10:06:20.994233687 +0000 UTC m=+0.110365653 container remove ddd13815141668579541faacb26dfb57d4f0cb41c2119911b83a65fc6653afe0 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-0cde33d5-98fb-45b7-b0ca-efc60772f6b8, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2) Dec 15 05:06:21 localhost nova_compute[286344]: 2025-12-15 10:06:21.004 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:06:21 localhost kernel: device tapa75f0a50-91 left promiscuous mode Dec 15 05:06:21 localhost nova_compute[286344]: 2025-12-15 10:06:21.022 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:06:21 localhost systemd[1]: var-lib-containers-storage-overlay-72662ed49334642861ef2e6a93d61bd834d1ca7d053b9c7c062903ce6f2fc17f-merged.mount: Deactivated successfully. Dec 15 05:06:21 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-ddd13815141668579541faacb26dfb57d4f0cb41c2119911b83a65fc6653afe0-userdata-shm.mount: Deactivated successfully. Dec 15 05:06:21 localhost systemd[1]: run-netns-qdhcp\x2d0cde33d5\x2d98fb\x2d45b7\x2db0ca\x2defc60772f6b8.mount: Deactivated successfully. Dec 15 05:06:21 localhost systemd[1]: var-lib-containers-storage-overlay-ed31fe2d083b0c6ba7c433586dc37875d9428d2d15c0f414d71d4eef4e129c28-merged.mount: Deactivated successfully. 
Dec 15 05:06:21 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-6dd79eb6e79b49448fcbdfe350444f62a9db0d01eb86a2eb29995989354b3109-userdata-shm.mount: Deactivated successfully. Dec 15 05:06:21 localhost ceph-mon[298913]: mon.np0005559462@0(leader) e17 handle_command mon_command({"prefix": "df", "format": "json"} v 0) Dec 15 05:06:21 localhost ceph-mon[298913]: log_channel(audit) log [DBG] : from='client.? 172.18.0.106:0/3083283982' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch Dec 15 05:06:21 localhost nova_compute[286344]: 2025-12-15 10:06:21.132 286348 DEBUG oslo_concurrency.processutils [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.450s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m Dec 15 05:06:21 localhost nova_compute[286344]: 2025-12-15 10:06:21.138 286348 DEBUG nova.compute.provider_tree [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Inventory has not changed in ProviderTree for provider: 26c8956b-6742-4951-b566-971b9bbe323b update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m Dec 15 05:06:21 localhost neutron_dhcp_agent[267542]: 2025-12-15 10:06:21.364 267546 INFO neutron.agent.dhcp.agent [None req-fd06c9ab-9e0a-43d3-bb87-35d334482a83 - - - - - -] Synchronizing state#033[00m Dec 15 05:06:21 localhost systemd[1]: run-netns-qdhcp\x2dc0669abd\x2daef1\x2d4b0d\x2d9f97\x2da6adeeac3211.mount: Deactivated successfully. 
Dec 15 05:06:21 localhost nova_compute[286344]: 2025-12-15 10:06:21.368 286348 DEBUG nova.scheduler.client.report [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Inventory has not changed for provider 26c8956b-6742-4951-b566-971b9bbe323b based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m Dec 15 05:06:21 localhost nova_compute[286344]: 2025-12-15 10:06:21.370 286348 DEBUG nova.compute.resource_tracker [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Compute_service record updated for np0005559462.localdomain:np0005559462.localdomain _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m Dec 15 05:06:21 localhost nova_compute[286344]: 2025-12-15 10:06:21.371 286348 DEBUG oslo_concurrency.lockutils [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 1.172s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Dec 15 05:06:21 localhost systemd[1]: Started /usr/bin/podman healthcheck run 4d46c406a3855fd1c322e5d86c048188ea5865daa07ba23aaebacc7879668923. 
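The `Acquiring lock "compute_resources" ... acquired ... waited 0.001s ... "released" ... held 1.172s` sequence above is oslo.concurrency's named-lock decorator serializing the resource tracker's update. A minimal sketch of the same pattern in plain `threading` (this `synchronized` is illustrative, not the oslo_concurrency.lockutils implementation, which also supports interprocess file locks):

```python
import threading
import time

_locks = {}
_registry = threading.Lock()

def synchronized(name):
    """Serialize calls under a process-wide named lock, reporting wait
    time the way the oslo log lines do."""
    def wrap(fn):
        def inner(*args, **kwargs):
            with _registry:  # lazily create one lock per name
                lock = _locks.setdefault(name, threading.Lock())
            start = time.monotonic()
            with lock:
                waited = time.monotonic() - start
                print(f'Lock "{name}" acquired :: waited {waited:.3f}s')
                try:
                    return fn(*args, **kwargs)
                finally:
                    print(f'Lock "{name}" released')
        return inner
    return wrap

@synchronized("compute_resources")
def update_available_resource():
    # Placeholder for the resource tracker's critical section.
    return "updated"
```

The 1.172s hold time in the log is the full critical section, including the two `ceph df` round-trips and the Placement inventory refresh performed while the lock was held.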
Dec 15 05:06:21 localhost ovn_controller[154603]: 2025-12-15T10:06:21Z|00314|binding|INFO|Releasing lport b35254ad-12eb-47bb-92be-44fefe0694f0 from this chassis (sb_readonly=0) Dec 15 05:06:21 localhost podman[323936]: 2025-12-15 10:06:21.485153151 +0000 UTC m=+0.084341621 container health_status 4d46c406a3855fd1c322e5d86c048188ea5865daa07ba23aaebacc7879668923 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '07f3af15d79276418f54a8dde2fc251064527c3571430e14af77aaa2c01f1193-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, 
org.label-schema.build-date=20251202, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true) Dec 15 05:06:21 localhost podman[323936]: 2025-12-15 10:06:21.495368747 +0000 UTC m=+0.094557227 container exec_died 4d46c406a3855fd1c322e5d86c048188ea5865daa07ba23aaebacc7879668923 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '07f3af15d79276418f54a8dde2fc251064527c3571430e14af77aaa2c01f1193-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, 
io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2) Dec 15 05:06:21 localhost systemd[1]: 4d46c406a3855fd1c322e5d86c048188ea5865daa07ba23aaebacc7879668923.service: Deactivated successfully. Dec 15 05:06:21 localhost nova_compute[286344]: 2025-12-15 10:06:21.524 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:06:21 localhost neutron_dhcp_agent[267542]: 2025-12-15 10:06:21.785 267546 INFO neutron.agent.dhcp.agent [None req-4aa54636-cc33-4055-9493-70e26ed1186d - - - - - -] All active networks have been fetched through RPC.#033[00m Dec 15 05:06:21 localhost neutron_dhcp_agent[267542]: 2025-12-15 10:06:21.786 267546 INFO neutron.agent.dhcp.agent [-] Starting network 0cde33d5-98fb-45b7-b0ca-efc60772f6b8 dhcp configuration#033[00m Dec 15 05:06:21 localhost neutron_dhcp_agent[267542]: 2025-12-15 10:06:21.789 267546 INFO neutron.agent.dhcp.agent [-] Starting network c0669abd-aef1-4b0d-9f97-a6adeeac3211 dhcp configuration#033[00m Dec 15 05:06:21 localhost neutron_dhcp_agent[267542]: 2025-12-15 10:06:21.790 267546 INFO neutron.agent.dhcp.agent [-] Finished network c0669abd-aef1-4b0d-9f97-a6adeeac3211 dhcp configuration#033[00m Dec 15 05:06:23 localhost neutron_dhcp_agent[267542]: 2025-12-15 10:06:23.110 267546 INFO neutron.agent.linux.ip_lib [None req-412493f8-97b5-4819-a851-f9e0e3a1ae3a - - - - - -] Device tapb5ab771c-fa cannot be used as it has no MAC address#033[00m Dec 15 05:06:23 localhost nova_compute[286344]: 2025-12-15 10:06:23.184 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:06:23 localhost kernel: device tapb5ab771c-fa entered promiscuous mode Dec 15 05:06:23 localhost NetworkManager[5963]: [1765793183.1931] manager: (tapb5ab771c-fa): new Generic device (/org/freedesktop/NetworkManager/Devices/51) Dec 15 05:06:23 localhost 
ovn_controller[154603]: 2025-12-15T10:06:23Z|00315|binding|INFO|Claiming lport b5ab771c-faf2-4f8e-b5a8-5eb7d4781cc6 for this chassis. Dec 15 05:06:23 localhost ovn_controller[154603]: 2025-12-15T10:06:23Z|00316|binding|INFO|b5ab771c-faf2-4f8e-b5a8-5eb7d4781cc6: Claiming unknown Dec 15 05:06:23 localhost nova_compute[286344]: 2025-12-15 10:06:23.192 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:06:23 localhost systemd-udevd[323964]: Network interface NamePolicy= disabled on kernel command line. Dec 15 05:06:23 localhost ovn_metadata_agent[160585]: 2025-12-15 10:06:23.204 160590 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005559462.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8::3/64', 'neutron:device_id': 'dhcpce287b4b-a043-5ed9-8e08-4fd555d768a2-0cde33d5-98fb-45b7-b0ca-efc60772f6b8', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-0cde33d5-98fb-45b7-b0ca-efc60772f6b8', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '58db13bf0d834ccda318448020067936', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=d36e857a-32c4-489d-af59-957ec099a1b0, chassis=[], tunnel_key=1, gateway_chassis=[], requested_chassis=[], logical_port=b5ab771c-faf2-4f8e-b5a8-5eb7d4781cc6) old=Port_Binding(chassis=[]) matches 
/usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Dec 15 05:06:23 localhost ovn_metadata_agent[160585]: 2025-12-15 10:06:23.206 160590 INFO neutron.agent.ovn.metadata.agent [-] Port b5ab771c-faf2-4f8e-b5a8-5eb7d4781cc6 in datapath 0cde33d5-98fb-45b7-b0ca-efc60772f6b8 bound to our chassis#033[00m Dec 15 05:06:23 localhost ovn_metadata_agent[160585]: 2025-12-15 10:06:23.207 160590 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network 0cde33d5-98fb-45b7-b0ca-efc60772f6b8 or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599#033[00m Dec 15 05:06:23 localhost ovn_metadata_agent[160585]: 2025-12-15 10:06:23.207 160858 DEBUG oslo.privsep.daemon [-] privsep: reply[5aa6cf40-f36e-43c2-b2f4-75c1a417ce56]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Dec 15 05:06:23 localhost journal[231322]: ethtool ioctl error on tapb5ab771c-fa: No such device Dec 15 05:06:23 localhost journal[231322]: ethtool ioctl error on tapb5ab771c-fa: No such device Dec 15 05:06:23 localhost ovn_controller[154603]: 2025-12-15T10:06:23Z|00317|binding|INFO|Setting lport b5ab771c-faf2-4f8e-b5a8-5eb7d4781cc6 ovn-installed in OVS Dec 15 05:06:23 localhost nova_compute[286344]: 2025-12-15 10:06:23.226 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:06:23 localhost ovn_controller[154603]: 2025-12-15T10:06:23Z|00318|binding|INFO|Setting lport b5ab771c-faf2-4f8e-b5a8-5eb7d4781cc6 up in Southbound Dec 15 05:06:23 localhost journal[231322]: ethtool ioctl error on tapb5ab771c-fa: No such device Dec 15 05:06:23 localhost journal[231322]: ethtool ioctl error on tapb5ab771c-fa: No such device Dec 15 05:06:23 localhost journal[231322]: ethtool ioctl error on tapb5ab771c-fa: No such device Dec 15 
05:06:23 localhost journal[231322]: ethtool ioctl error on tapb5ab771c-fa: No such device Dec 15 05:06:23 localhost journal[231322]: ethtool ioctl error on tapb5ab771c-fa: No such device Dec 15 05:06:23 localhost journal[231322]: ethtool ioctl error on tapb5ab771c-fa: No such device Dec 15 05:06:23 localhost nova_compute[286344]: 2025-12-15 10:06:23.270 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:06:23 localhost nova_compute[286344]: 2025-12-15 10:06:23.300 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:06:23 localhost nova_compute[286344]: 2025-12-15 10:06:23.371 286348 DEBUG oslo_service.periodic_task [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Dec 15 05:06:24 localhost podman[324036]: Dec 15 05:06:24 localhost podman[324036]: 2025-12-15 10:06:24.104303103 +0000 UTC m=+0.087914101 container create 1cb94e13831e06d71960c721bf24871677d3b0d396d5a57cff4f93c724aaaa3e (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-0cde33d5-98fb-45b7-b0ca-efc60772f6b8, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, io.buildah.version=1.41.3) Dec 15 05:06:24 localhost systemd[1]: Started libpod-conmon-1cb94e13831e06d71960c721bf24871677d3b0d396d5a57cff4f93c724aaaa3e.scope. 
Dec 15 05:06:24 localhost podman[324036]: 2025-12-15 10:06:24.059497372 +0000 UTC m=+0.043108380 image pull quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified Dec 15 05:06:24 localhost systemd[1]: Started libcrun container. Dec 15 05:06:24 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/6d9ca7dbc9c54ef0acd81baa7793ce4d5ab1d2abd1004bc994dd4967ed227a17/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff) Dec 15 05:06:24 localhost podman[324036]: 2025-12-15 10:06:24.187792803 +0000 UTC m=+0.171403791 container init 1cb94e13831e06d71960c721bf24871677d3b0d396d5a57cff4f93c724aaaa3e (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-0cde33d5-98fb-45b7-b0ca-efc60772f6b8, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team) Dec 15 05:06:24 localhost podman[324036]: 2025-12-15 10:06:24.199584657 +0000 UTC m=+0.183195645 container start 1cb94e13831e06d71960c721bf24871677d3b0d396d5a57cff4f93c724aaaa3e (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-0cde33d5-98fb-45b7-b0ca-efc60772f6b8, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3) Dec 15 05:06:24 localhost dnsmasq[324054]: started, version 2.85 cachesize 150 Dec 15 05:06:24 localhost dnsmasq[324054]: DNS service limited to local subnets Dec 15 05:06:24 localhost 
dnsmasq[324054]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile Dec 15 05:06:24 localhost dnsmasq[324054]: warning: no upstream servers configured Dec 15 05:06:24 localhost dnsmasq-dhcp[324054]: DHCPv6, static leases only on 2001:db8::, lease time 1d Dec 15 05:06:24 localhost dnsmasq[324054]: read /var/lib/neutron/dhcp/0cde33d5-98fb-45b7-b0ca-efc60772f6b8/addn_hosts - 0 addresses Dec 15 05:06:24 localhost dnsmasq-dhcp[324054]: read /var/lib/neutron/dhcp/0cde33d5-98fb-45b7-b0ca-efc60772f6b8/host Dec 15 05:06:24 localhost dnsmasq-dhcp[324054]: read /var/lib/neutron/dhcp/0cde33d5-98fb-45b7-b0ca-efc60772f6b8/opts Dec 15 05:06:24 localhost neutron_dhcp_agent[267542]: 2025-12-15 10:06:24.271 267546 INFO neutron.agent.dhcp.agent [None req-412493f8-97b5-4819-a851-f9e0e3a1ae3a - - - - - -] Finished network 0cde33d5-98fb-45b7-b0ca-efc60772f6b8 dhcp configuration#033[00m Dec 15 05:06:24 localhost neutron_dhcp_agent[267542]: 2025-12-15 10:06:24.272 267546 INFO neutron.agent.dhcp.agent [None req-4aa54636-cc33-4055-9493-70e26ed1186d - - - - - -] Synchronizing state complete#033[00m Dec 15 05:06:24 localhost neutron_dhcp_agent[267542]: 2025-12-15 10:06:24.274 267546 INFO neutron.agent.dhcp.agent [None req-5265beb6-2fc8-4686-8b81-065143b62b41 - - - - - -] Network not present, action: clean_devices, action_kwargs: {}#033[00m Dec 15 05:06:24 localhost neutron_dhcp_agent[267542]: 2025-12-15 10:06:24.343 267546 INFO neutron.agent.dhcp.agent [None req-336f5185-ac61-461b-8544-12e71cc0fb1e - - - - - -] DHCP configuration for ports {'5ff0e563-a801-4b03-a2ec-8d7a3fcdf08b'} is completed#033[00m Dec 15 05:06:24 localhost dnsmasq[324054]: read /var/lib/neutron/dhcp/0cde33d5-98fb-45b7-b0ca-efc60772f6b8/addn_hosts - 0 addresses Dec 15 05:06:24 localhost dnsmasq-dhcp[324054]: read /var/lib/neutron/dhcp/0cde33d5-98fb-45b7-b0ca-efc60772f6b8/host Dec 15 05:06:24 localhost 
dnsmasq-dhcp[324054]: read /var/lib/neutron/dhcp/0cde33d5-98fb-45b7-b0ca-efc60772f6b8/opts Dec 15 05:06:24 localhost podman[324072]: 2025-12-15 10:06:24.472409655 +0000 UTC m=+0.059371177 container kill 1cb94e13831e06d71960c721bf24871677d3b0d396d5a57cff4f93c724aaaa3e (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-0cde33d5-98fb-45b7-b0ca-efc60772f6b8, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, tcib_managed=true) Dec 15 05:06:24 localhost ceph-mon[298913]: mon.np0005559462@0(leader) e17 handle_command mon_command({"prefix":"df", "format":"json"} v 0) Dec 15 05:06:24 localhost ceph-mon[298913]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/3688665608' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch Dec 15 05:06:24 localhost ceph-mon[298913]: mon.np0005559462@0(leader) e17 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0) Dec 15 05:06:24 localhost ceph-mon[298913]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/3688665608' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch Dec 15 05:06:25 localhost systemd[1]: Started /usr/bin/podman healthcheck run a4e94bd7aaee6c174c601060fb5f34559019ccea93fc397263ee3735731cfc1e. Dec 15 05:06:25 localhost dnsmasq[324054]: exiting on receipt of SIGTERM Dec 15 05:06:25 localhost systemd[1]: libpod-1cb94e13831e06d71960c721bf24871677d3b0d396d5a57cff4f93c724aaaa3e.scope: Deactivated successfully. 
Dec 15 05:06:25 localhost podman[324109]: 2025-12-15 10:06:25.146706428 +0000 UTC m=+0.063228723 container kill 1cb94e13831e06d71960c721bf24871677d3b0d396d5a57cff4f93c724aaaa3e (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-0cde33d5-98fb-45b7-b0ca-efc60772f6b8, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251202) Dec 15 05:06:25 localhost podman[324127]: 2025-12-15 10:06:25.222152926 +0000 UTC m=+0.064936476 container died 1cb94e13831e06d71960c721bf24871677d3b0d396d5a57cff4f93c724aaaa3e (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-0cde33d5-98fb-45b7-b0ca-efc60772f6b8, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS) Dec 15 05:06:25 localhost systemd[1]: tmp-crun.Z4XTPf.mount: Deactivated successfully. 
Dec 15 05:06:25 localhost podman[324127]: 2025-12-15 10:06:25.26267635 +0000 UTC m=+0.105459860 container cleanup 1cb94e13831e06d71960c721bf24871677d3b0d396d5a57cff4f93c724aaaa3e (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-0cde33d5-98fb-45b7-b0ca-efc60772f6b8, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true) Dec 15 05:06:25 localhost systemd[1]: libpod-conmon-1cb94e13831e06d71960c721bf24871677d3b0d396d5a57cff4f93c724aaaa3e.scope: Deactivated successfully. Dec 15 05:06:25 localhost podman[324136]: 2025-12-15 10:06:25.301126042 +0000 UTC m=+0.128387843 container remove 1cb94e13831e06d71960c721bf24871677d3b0d396d5a57cff4f93c724aaaa3e (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-0cde33d5-98fb-45b7-b0ca-efc60772f6b8, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, org.label-schema.build-date=20251202) Dec 15 05:06:25 localhost ovn_controller[154603]: 2025-12-15T10:06:25Z|00319|binding|INFO|Releasing lport b5ab771c-faf2-4f8e-b5a8-5eb7d4781cc6 from this chassis (sb_readonly=0) Dec 15 05:06:25 localhost kernel: device tapb5ab771c-fa left promiscuous mode Dec 15 05:06:25 localhost ovn_controller[154603]: 2025-12-15T10:06:25Z|00320|binding|INFO|Setting lport b5ab771c-faf2-4f8e-b5a8-5eb7d4781cc6 down in Southbound Dec 15 05:06:25 localhost nova_compute[286344]: 2025-12-15 10:06:25.314 286348 DEBUG 
ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:06:25 localhost ovn_metadata_agent[160585]: 2025-12-15 10:06:25.320 160590 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005559462.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '', 'neutron:device_id': 'dhcpce287b4b-a043-5ed9-8e08-4fd555d768a2-0cde33d5-98fb-45b7-b0ca-efc60772f6b8', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-0cde33d5-98fb-45b7-b0ca-efc60772f6b8', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '58db13bf0d834ccda318448020067936', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=d36e857a-32c4-489d-af59-957ec099a1b0, chassis=[], tunnel_key=1, gateway_chassis=[], requested_chassis=[], logical_port=b5ab771c-faf2-4f8e-b5a8-5eb7d4781cc6) old=Port_Binding(up=[True], chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Dec 15 05:06:25 localhost ovn_metadata_agent[160585]: 2025-12-15 10:06:25.321 160590 INFO neutron.agent.ovn.metadata.agent [-] Port b5ab771c-faf2-4f8e-b5a8-5eb7d4781cc6 in datapath 0cde33d5-98fb-45b7-b0ca-efc60772f6b8 unbound from our chassis#033[00m Dec 15 05:06:25 localhost ovn_metadata_agent[160585]: 2025-12-15 10:06:25.322 160590 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network 
0cde33d5-98fb-45b7-b0ca-efc60772f6b8 or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599#033[00m Dec 15 05:06:25 localhost ovn_metadata_agent[160585]: 2025-12-15 10:06:25.322 160858 DEBUG oslo.privsep.daemon [-] privsep: reply[be05b19a-315b-4285-992b-f1759ac8fa0e]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Dec 15 05:06:25 localhost podman[324120]: 2025-12-15 10:06:25.226480685 +0000 UTC m=+0.090344673 container health_status a4e94bd7aaee6c174c601060fb5f34559019ccea93fc397263ee3735731cfc1e (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, container_name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter) Dec 15 05:06:25 localhost nova_compute[286344]: 2025-12-15 10:06:25.334 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:06:25 localhost podman[324120]: 2025-12-15 10:06:25.359892302 +0000 UTC m=+0.223756240 container exec_died a4e94bd7aaee6c174c601060fb5f34559019ccea93fc397263ee3735731cfc1e 
(image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible) Dec 15 05:06:25 localhost systemd[1]: a4e94bd7aaee6c174c601060fb5f34559019ccea93fc397263ee3735731cfc1e.service: Deactivated successfully. Dec 15 05:06:25 localhost nova_compute[286344]: 2025-12-15 10:06:25.515 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:06:25 localhost ceph-mon[298913]: mon.np0005559462@0(leader).osd e174 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Dec 15 05:06:26 localhost systemd[1]: tmp-crun.xQoZ4O.mount: Deactivated successfully. Dec 15 05:06:26 localhost systemd[1]: var-lib-containers-storage-overlay-6d9ca7dbc9c54ef0acd81baa7793ce4d5ab1d2abd1004bc994dd4967ed227a17-merged.mount: Deactivated successfully. Dec 15 05:06:26 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-1cb94e13831e06d71960c721bf24871677d3b0d396d5a57cff4f93c724aaaa3e-userdata-shm.mount: Deactivated successfully. 
Dec 15 05:06:26 localhost ovn_controller[154603]: 2025-12-15T10:06:26Z|00321|binding|INFO|Releasing lport b35254ad-12eb-47bb-92be-44fefe0694f0 from this chassis (sb_readonly=0) Dec 15 05:06:26 localhost nova_compute[286344]: 2025-12-15 10:06:26.328 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:06:26 localhost systemd[1]: run-netns-qdhcp\x2d0cde33d5\x2d98fb\x2d45b7\x2db0ca\x2defc60772f6b8.mount: Deactivated successfully. Dec 15 05:06:26 localhost neutron_dhcp_agent[267542]: 2025-12-15 10:06:26.388 267546 INFO neutron.agent.dhcp.agent [None req-acb313cd-3e97-4a0d-b969-a032d0540a87 - - - - - -] Network not present, action: clean_devices, action_kwargs: {}#033[00m Dec 15 05:06:26 localhost neutron_dhcp_agent[267542]: 2025-12-15 10:06:26.389 267546 INFO neutron.agent.dhcp.agent [None req-acb313cd-3e97-4a0d-b969-a032d0540a87 - - - - - -] Network not present, action: clean_devices, action_kwargs: {}#033[00m Dec 15 05:06:26 localhost neutron_dhcp_agent[267542]: 2025-12-15 10:06:26.473 267546 INFO neutron.agent.linux.ip_lib [None req-c61b8fc2-ba73-4713-841c-3d6e1549e92c - - - - - -] Device tap74d7ed3e-15 cannot be used as it has no MAC address#033[00m Dec 15 05:06:26 localhost nova_compute[286344]: 2025-12-15 10:06:26.493 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:06:26 localhost kernel: device tap74d7ed3e-15 entered promiscuous mode Dec 15 05:06:26 localhost ovn_controller[154603]: 2025-12-15T10:06:26Z|00322|binding|INFO|Claiming lport 74d7ed3e-1592-4aef-bbc4-9182d34ab7fc for this chassis. 
Dec 15 05:06:26 localhost ovn_controller[154603]: 2025-12-15T10:06:26Z|00323|binding|INFO|74d7ed3e-1592-4aef-bbc4-9182d34ab7fc: Claiming unknown Dec 15 05:06:26 localhost NetworkManager[5963]: [1765793186.5001] manager: (tap74d7ed3e-15): new Generic device (/org/freedesktop/NetworkManager/Devices/52) Dec 15 05:06:26 localhost nova_compute[286344]: 2025-12-15 10:06:26.502 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:06:26 localhost systemd-udevd[324182]: Network interface NamePolicy= disabled on kernel command line. Dec 15 05:06:26 localhost journal[231322]: ethtool ioctl error on tap74d7ed3e-15: No such device Dec 15 05:06:26 localhost ovn_controller[154603]: 2025-12-15T10:06:26Z|00324|binding|INFO|Setting lport 74d7ed3e-1592-4aef-bbc4-9182d34ab7fc ovn-installed in OVS Dec 15 05:06:26 localhost nova_compute[286344]: 2025-12-15 10:06:26.529 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:06:26 localhost journal[231322]: ethtool ioctl error on tap74d7ed3e-15: No such device Dec 15 05:06:26 localhost ovn_controller[154603]: 2025-12-15T10:06:26Z|00325|binding|INFO|Setting lport 74d7ed3e-1592-4aef-bbc4-9182d34ab7fc up in Southbound Dec 15 05:06:26 localhost ovn_metadata_agent[160585]: 2025-12-15 10:06:26.535 160590 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005559462.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8::2/64', 'neutron:device_id': 'dhcpce287b4b-a043-5ed9-8e08-4fd555d768a2-c0669abd-aef1-4b0d-9f97-a6adeeac3211', 
'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-c0669abd-aef1-4b0d-9f97-a6adeeac3211', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '89e710ef9f4f48d48a369002db572947', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=e47821dc-5f5d-44dc-8a16-54817df4049d, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=74d7ed3e-1592-4aef-bbc4-9182d34ab7fc) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Dec 15 05:06:26 localhost ovn_metadata_agent[160585]: 2025-12-15 10:06:26.536 160590 INFO neutron.agent.ovn.metadata.agent [-] Port 74d7ed3e-1592-4aef-bbc4-9182d34ab7fc in datapath c0669abd-aef1-4b0d-9f97-a6adeeac3211 bound to our chassis#033[00m Dec 15 05:06:26 localhost ovn_metadata_agent[160585]: 2025-12-15 10:06:26.537 160590 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network c0669abd-aef1-4b0d-9f97-a6adeeac3211 or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599#033[00m Dec 15 05:06:26 localhost journal[231322]: ethtool ioctl error on tap74d7ed3e-15: No such device Dec 15 05:06:26 localhost ovn_metadata_agent[160585]: 2025-12-15 10:06:26.538 160858 DEBUG oslo.privsep.daemon [-] privsep: reply[51083bbd-7c22-4c18-8115-7b5f11f044d1]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Dec 15 05:06:26 localhost journal[231322]: ethtool ioctl error on tap74d7ed3e-15: No such device Dec 15 05:06:26 localhost journal[231322]: ethtool ioctl error on tap74d7ed3e-15: No such device Dec 15 05:06:26 localhost 
journal[231322]: ethtool ioctl error on tap74d7ed3e-15: No such device Dec 15 05:06:26 localhost journal[231322]: ethtool ioctl error on tap74d7ed3e-15: No such device Dec 15 05:06:26 localhost journal[231322]: ethtool ioctl error on tap74d7ed3e-15: No such device Dec 15 05:06:26 localhost nova_compute[286344]: 2025-12-15 10:06:26.570 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:06:26 localhost ceph-mon[298913]: mon.np0005559462@0(leader) e17 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) Dec 15 05:06:26 localhost ceph-mon[298913]: log_channel(audit) log [DBG] : from='client.25625 172.18.0.34:0/382777224' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch Dec 15 05:06:26 localhost nova_compute[286344]: 2025-12-15 10:06:26.610 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:06:27 localhost podman[324253]: Dec 15 05:06:27 localhost podman[324253]: 2025-12-15 10:06:27.485781531 +0000 UTC m=+0.090007563 container create 957a0014c4d388494413c001beb7a9e69d3c93f4e9eb490381c2024864b71603 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-c0669abd-aef1-4b0d-9f97-a6adeeac3211, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, tcib_managed=true, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb) Dec 15 05:06:27 localhost systemd[1]: Started libpod-conmon-957a0014c4d388494413c001beb7a9e69d3c93f4e9eb490381c2024864b71603.scope. 
Dec 15 05:06:27 localhost podman[324253]: 2025-12-15 10:06:27.444603421 +0000 UTC m=+0.048829473 image pull quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified Dec 15 05:06:27 localhost systemd[1]: Started libcrun container. Dec 15 05:06:27 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/1d36e8f4244237199baed09623472b04aa341c90fa15e9ad9279a37dfd34c8f5/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff) Dec 15 05:06:27 localhost podman[324253]: 2025-12-15 10:06:27.559629779 +0000 UTC m=+0.163855821 container init 957a0014c4d388494413c001beb7a9e69d3c93f4e9eb490381c2024864b71603 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-c0669abd-aef1-4b0d-9f97-a6adeeac3211, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image) Dec 15 05:06:27 localhost podman[324253]: 2025-12-15 10:06:27.569451235 +0000 UTC m=+0.173677267 container start 957a0014c4d388494413c001beb7a9e69d3c93f4e9eb490381c2024864b71603 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-c0669abd-aef1-4b0d-9f97-a6adeeac3211, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, tcib_managed=true) Dec 15 05:06:27 localhost dnsmasq[324271]: started, version 2.85 cachesize 150 Dec 15 05:06:27 localhost dnsmasq[324271]: DNS service limited to local subnets Dec 15 05:06:27 localhost 
dnsmasq[324271]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile Dec 15 05:06:27 localhost dnsmasq[324271]: warning: no upstream servers configured Dec 15 05:06:27 localhost dnsmasq[324271]: read /var/lib/neutron/dhcp/c0669abd-aef1-4b0d-9f97-a6adeeac3211/addn_hosts - 0 addresses Dec 15 05:06:27 localhost neutron_dhcp_agent[267542]: 2025-12-15 10:06:27.627 267546 INFO neutron.agent.dhcp.agent [None req-c61b8fc2-ba73-4713-841c-3d6e1549e92c - - - - - -] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-12-15T10:06:24Z, description=, device_id=bba41ba2-4e47-485f-85db-683f504c6c57, device_owner=network:router_interface, dns_assignment=[], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[], id=9abefd75-7498-4555-a71a-4f912eff3a71, ip_allocation=immediate, mac_address=fa:16:3e:f7:25:d3, name=, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-12-15T10:05:39Z, description=, dns_domain=, id=c0669abd-aef1-4b0d-9f97-a6adeeac3211, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-NetworksTestDHCPv6-test-network-1861207414, port_security_enabled=True, project_id=89e710ef9f4f48d48a369002db572947, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=20673, qos_policy_id=None, revision_number=20, router:external=False, shared=False, standard_attr_id=2164, status=ACTIVE, subnets=['ffd5430b-b09a-4ff2-ab59-84b5e6d765fd'], tags=[], tenant_id=89e710ef9f4f48d48a369002db572947, updated_at=2025-12-15T10:06:22Z, vlan_transparent=None, network_id=c0669abd-aef1-4b0d-9f97-a6adeeac3211, port_security_enabled=False, project_id=89e710ef9f4f48d48a369002db572947, qos_network_policy_id=None, 
qos_policy_id=None, resource_request=None, revision_number=1, security_groups=[], standard_attr_id=2388, status=DOWN, tags=[], tenant_id=89e710ef9f4f48d48a369002db572947, updated_at=2025-12-15T10:06:25Z on network c0669abd-aef1-4b0d-9f97-a6adeeac3211#033[00m Dec 15 05:06:27 localhost podman[324291]: 2025-12-15 10:06:27.781079961 +0000 UTC m=+0.037670924 container kill 957a0014c4d388494413c001beb7a9e69d3c93f4e9eb490381c2024864b71603 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-c0669abd-aef1-4b0d-9f97-a6adeeac3211, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202, tcib_managed=true) Dec 15 05:06:27 localhost dnsmasq[324271]: read /var/lib/neutron/dhcp/c0669abd-aef1-4b0d-9f97-a6adeeac3211/addn_hosts - 1 addresses Dec 15 05:06:27 localhost neutron_dhcp_agent[267542]: 2025-12-15 10:06:27.881 267546 INFO neutron.agent.dhcp.agent [None req-4a65c5a2-d027-4bde-aca1-a709edc514d0 - - - - - -] DHCP configuration for ports {'79503367-f53f-4b35-8760-76fcaa4d8407'} is completed#033[00m Dec 15 05:06:27 localhost neutron_dhcp_agent[267542]: 2025-12-15 10:06:27.934 267546 INFO neutron.agent.dhcp.agent [None req-c61b8fc2-ba73-4713-841c-3d6e1549e92c - - - - - -] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-12-15T10:06:24Z, description=, device_id=bba41ba2-4e47-485f-85db-683f504c6c57, device_owner=network:router_interface, dns_assignment=[], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[], id=9abefd75-7498-4555-a71a-4f912eff3a71, ip_allocation=immediate, mac_address=fa:16:3e:f7:25:d3, 
name=, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-12-15T10:05:39Z, description=, dns_domain=, id=c0669abd-aef1-4b0d-9f97-a6adeeac3211, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-NetworksTestDHCPv6-test-network-1861207414, port_security_enabled=True, project_id=89e710ef9f4f48d48a369002db572947, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=20673, qos_policy_id=None, revision_number=20, router:external=False, shared=False, standard_attr_id=2164, status=ACTIVE, subnets=['ffd5430b-b09a-4ff2-ab59-84b5e6d765fd'], tags=[], tenant_id=89e710ef9f4f48d48a369002db572947, updated_at=2025-12-15T10:06:22Z, vlan_transparent=None, network_id=c0669abd-aef1-4b0d-9f97-a6adeeac3211, port_security_enabled=False, project_id=89e710ef9f4f48d48a369002db572947, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=[], standard_attr_id=2388, status=DOWN, tags=[], tenant_id=89e710ef9f4f48d48a369002db572947, updated_at=2025-12-15T10:06:25Z on network c0669abd-aef1-4b0d-9f97-a6adeeac3211#033[00m Dec 15 05:06:28 localhost dnsmasq[324271]: read /var/lib/neutron/dhcp/c0669abd-aef1-4b0d-9f97-a6adeeac3211/addn_hosts - 1 addresses Dec 15 05:06:28 localhost podman[324331]: 2025-12-15 10:06:28.117723195 +0000 UTC m=+0.059011537 container kill 957a0014c4d388494413c001beb7a9e69d3c93f4e9eb490381c2024864b71603 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-c0669abd-aef1-4b0d-9f97-a6adeeac3211, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team) Dec 15 05:06:28 localhost 
neutron_dhcp_agent[267542]: 2025-12-15 10:06:28.163 267546 INFO neutron.agent.dhcp.agent [None req-6568ed6d-1ce5-4612-8576-69f8e3b40db5 - - - - - -] DHCP configuration for ports {'9abefd75-7498-4555-a71a-4f912eff3a71'} is completed#033[00m Dec 15 05:06:28 localhost nova_compute[286344]: 2025-12-15 10:06:28.223 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:06:28 localhost nova_compute[286344]: 2025-12-15 10:06:28.272 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:06:28 localhost neutron_dhcp_agent[267542]: 2025-12-15 10:06:28.461 267546 INFO neutron.agent.dhcp.agent [None req-b6036113-5d58-4543-af9c-95176a71f7a4 - - - - - -] DHCP configuration for ports {'9abefd75-7498-4555-a71a-4f912eff3a71'} is completed#033[00m Dec 15 05:06:30 localhost nova_compute[286344]: 2025-12-15 10:06:30.518 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:06:30 localhost dnsmasq[324271]: read /var/lib/neutron/dhcp/c0669abd-aef1-4b0d-9f97-a6adeeac3211/addn_hosts - 0 addresses Dec 15 05:06:30 localhost podman[324367]: 2025-12-15 10:06:30.527941577 +0000 UTC m=+0.064841523 container kill 957a0014c4d388494413c001beb7a9e69d3c93f4e9eb490381c2024864b71603 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-c0669abd-aef1-4b0d-9f97-a6adeeac3211, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202) Dec 15 05:06:30 localhost systemd[1]: tmp-crun.EHvz8V.mount: 
Deactivated successfully. Dec 15 05:06:30 localhost ceph-mon[298913]: mon.np0005559462@0(leader).osd e174 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Dec 15 05:06:31 localhost ovn_controller[154603]: 2025-12-15T10:06:31Z|00326|binding|INFO|Releasing lport 74d7ed3e-1592-4aef-bbc4-9182d34ab7fc from this chassis (sb_readonly=0) Dec 15 05:06:31 localhost kernel: device tap74d7ed3e-15 left promiscuous mode Dec 15 05:06:31 localhost ovn_controller[154603]: 2025-12-15T10:06:31Z|00327|binding|INFO|Setting lport 74d7ed3e-1592-4aef-bbc4-9182d34ab7fc down in Southbound Dec 15 05:06:31 localhost nova_compute[286344]: 2025-12-15 10:06:31.482 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:06:31 localhost ovn_metadata_agent[160585]: 2025-12-15 10:06:31.494 160590 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005559462.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8::2/64', 'neutron:device_id': 'dhcpce287b4b-a043-5ed9-8e08-4fd555d768a2-c0669abd-aef1-4b0d-9f97-a6adeeac3211', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-c0669abd-aef1-4b0d-9f97-a6adeeac3211', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '89e710ef9f4f48d48a369002db572947', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'np0005559462.localdomain'}, additional_chassis=[], tag=[], 
additional_encap=[], encap=[], mirror_rules=[], datapath=e47821dc-5f5d-44dc-8a16-54817df4049d, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=74d7ed3e-1592-4aef-bbc4-9182d34ab7fc) old=Port_Binding(up=[True], chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Dec 15 05:06:31 localhost ovn_metadata_agent[160585]: 2025-12-15 10:06:31.496 160590 INFO neutron.agent.ovn.metadata.agent [-] Port 74d7ed3e-1592-4aef-bbc4-9182d34ab7fc in datapath c0669abd-aef1-4b0d-9f97-a6adeeac3211 unbound from our chassis#033[00m Dec 15 05:06:31 localhost ovn_metadata_agent[160585]: 2025-12-15 10:06:31.497 160590 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network c0669abd-aef1-4b0d-9f97-a6adeeac3211 or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599#033[00m Dec 15 05:06:31 localhost ovn_metadata_agent[160585]: 2025-12-15 10:06:31.498 160858 DEBUG oslo.privsep.daemon [-] privsep: reply[91a65880-f2a8-492e-8cf4-d4471e62add4]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Dec 15 05:06:31 localhost nova_compute[286344]: 2025-12-15 10:06:31.502 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:06:31 localhost podman[243449]: time="2025-12-15T10:06:31Z" level=info msg="List containers: received `last` parameter - overwriting `limit`" Dec 15 05:06:31 localhost podman[243449]: @ - - [15/Dec/2025:10:06:31 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 162000 "" "Go-http-client/1.1" Dec 15 05:06:31 localhost podman[243449]: @ - - [15/Dec/2025:10:06:31 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 20667 "" 
"Go-http-client/1.1" Dec 15 05:06:32 localhost dnsmasq[324271]: exiting on receipt of SIGTERM Dec 15 05:06:32 localhost podman[324406]: 2025-12-15 10:06:32.602084241 +0000 UTC m=+0.060472904 container kill 957a0014c4d388494413c001beb7a9e69d3c93f4e9eb490381c2024864b71603 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-c0669abd-aef1-4b0d-9f97-a6adeeac3211, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2) Dec 15 05:06:32 localhost systemd[1]: tmp-crun.xVBo0u.mount: Deactivated successfully. Dec 15 05:06:32 localhost systemd[1]: libpod-957a0014c4d388494413c001beb7a9e69d3c93f4e9eb490381c2024864b71603.scope: Deactivated successfully. Dec 15 05:06:32 localhost podman[324420]: 2025-12-15 10:06:32.673685882 +0000 UTC m=+0.052779152 container died 957a0014c4d388494413c001beb7a9e69d3c93f4e9eb490381c2024864b71603 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-c0669abd-aef1-4b0d-9f97-a6adeeac3211, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team) Dec 15 05:06:32 localhost systemd[1]: tmp-crun.1TVBKk.mount: Deactivated successfully. 
Dec 15 05:06:32 localhost podman[324420]: 2025-12-15 10:06:32.711162631 +0000 UTC m=+0.090255861 container cleanup 957a0014c4d388494413c001beb7a9e69d3c93f4e9eb490381c2024864b71603 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-c0669abd-aef1-4b0d-9f97-a6adeeac3211, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team) Dec 15 05:06:32 localhost systemd[1]: libpod-conmon-957a0014c4d388494413c001beb7a9e69d3c93f4e9eb490381c2024864b71603.scope: Deactivated successfully. Dec 15 05:06:32 localhost podman[324421]: 2025-12-15 10:06:32.750805063 +0000 UTC m=+0.126537538 container remove 957a0014c4d388494413c001beb7a9e69d3c93f4e9eb490381c2024864b71603 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-c0669abd-aef1-4b0d-9f97-a6adeeac3211, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, org.label-schema.build-date=20251202, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team) Dec 15 05:06:32 localhost ovn_controller[154603]: 2025-12-15T10:06:32Z|00328|binding|INFO|Releasing lport b35254ad-12eb-47bb-92be-44fefe0694f0 from this chassis (sb_readonly=0) Dec 15 05:06:32 localhost nova_compute[286344]: 2025-12-15 10:06:32.880 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:06:33 localhost neutron_dhcp_agent[267542]: 2025-12-15 10:06:33.103 267546 INFO neutron.agent.dhcp.agent [None 
req-fdca4ffe-4f31-4937-bf27-5d8dd739fd7b - - - - - -] Network not present, action: clean_devices, action_kwargs: {}#033[00m Dec 15 05:06:33 localhost nova_compute[286344]: 2025-12-15 10:06:33.225 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:06:33 localhost dnsmasq[321616]: exiting on receipt of SIGTERM Dec 15 05:06:33 localhost podman[324463]: 2025-12-15 10:06:33.264552098 +0000 UTC m=+0.059421837 container kill 3adc8eeda4c8329152ad78848902708dec1236fd2c4350e8e6383067837b1f3c (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-f92fa7f0-d2bf-47f3-bb61-da7318ce7879, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.license=GPLv2) Dec 15 05:06:33 localhost systemd[1]: libpod-3adc8eeda4c8329152ad78848902708dec1236fd2c4350e8e6383067837b1f3c.scope: Deactivated successfully. 
Dec 15 05:06:33 localhost podman[324475]: 2025-12-15 10:06:33.334753135 +0000 UTC m=+0.058036164 container died 3adc8eeda4c8329152ad78848902708dec1236fd2c4350e8e6383067837b1f3c (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-f92fa7f0-d2bf-47f3-bb61-da7318ce7879, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_managed=true) Dec 15 05:06:33 localhost podman[324475]: 2025-12-15 10:06:33.364788807 +0000 UTC m=+0.088071806 container cleanup 3adc8eeda4c8329152ad78848902708dec1236fd2c4350e8e6383067837b1f3c (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-f92fa7f0-d2bf-47f3-bb61-da7318ce7879, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251202) Dec 15 05:06:33 localhost systemd[1]: libpod-conmon-3adc8eeda4c8329152ad78848902708dec1236fd2c4350e8e6383067837b1f3c.scope: Deactivated successfully. 
Dec 15 05:06:33 localhost podman[324477]: 2025-12-15 10:06:33.415043665 +0000 UTC m=+0.128604810 container remove 3adc8eeda4c8329152ad78848902708dec1236fd2c4350e8e6383067837b1f3c (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-f92fa7f0-d2bf-47f3-bb61-da7318ce7879, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team) Dec 15 05:06:33 localhost ovn_controller[154603]: 2025-12-15T10:06:33Z|00329|binding|INFO|Releasing lport 159bb878-40ba-4340-a0c2-64606be092f4 from this chassis (sb_readonly=0) Dec 15 05:06:33 localhost ovn_controller[154603]: 2025-12-15T10:06:33Z|00330|binding|INFO|Setting lport 159bb878-40ba-4340-a0c2-64606be092f4 down in Southbound Dec 15 05:06:33 localhost kernel: device tap159bb878-40 left promiscuous mode Dec 15 05:06:33 localhost nova_compute[286344]: 2025-12-15 10:06:33.427 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:06:33 localhost ovn_metadata_agent[160585]: 2025-12-15 10:06:33.440 160590 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005559462.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8::1/64', 'neutron:device_id': 'dhcpce287b4b-a043-5ed9-8e08-4fd555d768a2-f92fa7f0-d2bf-47f3-bb61-da7318ce7879', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 
'neutron:network_name': 'neutron-f92fa7f0-d2bf-47f3-bb61-da7318ce7879', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '58db13bf0d834ccda318448020067936', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'np0005559462.localdomain'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=add79b81-b2fb-4bfb-9f25-d60503ab36ae, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=159bb878-40ba-4340-a0c2-64606be092f4) old=Port_Binding(up=[True], chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Dec 15 05:06:33 localhost ovn_metadata_agent[160585]: 2025-12-15 10:06:33.442 160590 INFO neutron.agent.ovn.metadata.agent [-] Port 159bb878-40ba-4340-a0c2-64606be092f4 in datapath f92fa7f0-d2bf-47f3-bb61-da7318ce7879 unbound from our chassis#033[00m Dec 15 05:06:33 localhost ovn_metadata_agent[160585]: 2025-12-15 10:06:33.444 160590 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network f92fa7f0-d2bf-47f3-bb61-da7318ce7879 or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599#033[00m Dec 15 05:06:33 localhost ovn_metadata_agent[160585]: 2025-12-15 10:06:33.445 160858 DEBUG oslo.privsep.daemon [-] privsep: reply[876f60dd-86b0-4d0a-b5eb-dda10ffb2696]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Dec 15 05:06:33 localhost nova_compute[286344]: 2025-12-15 10:06:33.449 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:06:33 localhost systemd[1]: 
var-lib-containers-storage-overlay-1d36e8f4244237199baed09623472b04aa341c90fa15e9ad9279a37dfd34c8f5-merged.mount: Deactivated successfully. Dec 15 05:06:33 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-957a0014c4d388494413c001beb7a9e69d3c93f4e9eb490381c2024864b71603-userdata-shm.mount: Deactivated successfully. Dec 15 05:06:33 localhost systemd[1]: run-netns-qdhcp\x2dc0669abd\x2daef1\x2d4b0d\x2d9f97\x2da6adeeac3211.mount: Deactivated successfully. Dec 15 05:06:33 localhost systemd[1]: var-lib-containers-storage-overlay-ad0505c1aa899b9e666a1145921c85111503f1ab236a8fc5775a8b0f9b589090-merged.mount: Deactivated successfully. Dec 15 05:06:33 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-3adc8eeda4c8329152ad78848902708dec1236fd2c4350e8e6383067837b1f3c-userdata-shm.mount: Deactivated successfully. Dec 15 05:06:33 localhost systemd[1]: run-netns-qdhcp\x2df92fa7f0\x2dd2bf\x2d47f3\x2dbb61\x2dda7318ce7879.mount: Deactivated successfully. Dec 15 05:06:33 localhost neutron_dhcp_agent[267542]: 2025-12-15 10:06:33.665 267546 INFO neutron.agent.dhcp.agent [None req-ab420496-a2cc-45ab-8506-e129e9d28b17 - - - - - -] Network not present, action: clean_devices, action_kwargs: {}#033[00m Dec 15 05:06:33 localhost neutron_dhcp_agent[267542]: 2025-12-15 10:06:33.666 267546 INFO neutron.agent.dhcp.agent [None req-ab420496-a2cc-45ab-8506-e129e9d28b17 - - - - - -] Network not present, action: clean_devices, action_kwargs: {}#033[00m Dec 15 05:06:34 localhost neutron_dhcp_agent[267542]: 2025-12-15 10:06:34.025 267546 INFO neutron.agent.dhcp.agent [-] Network not present, action: clean_devices, action_kwargs: {}#033[00m Dec 15 05:06:34 localhost ovn_controller[154603]: 2025-12-15T10:06:34Z|00331|binding|INFO|Releasing lport b35254ad-12eb-47bb-92be-44fefe0694f0 from this chassis (sb_readonly=0) Dec 15 05:06:34 localhost nova_compute[286344]: 2025-12-15 10:06:34.321 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup 
/usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:06:34 localhost openstack_network_exporter[246484]: ERROR 10:06:34 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd Dec 15 05:06:34 localhost openstack_network_exporter[246484]: ERROR 10:06:34 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server Dec 15 05:06:34 localhost openstack_network_exporter[246484]: ERROR 10:06:34 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd Dec 15 05:06:34 localhost openstack_network_exporter[246484]: ERROR 10:06:34 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath Dec 15 05:06:34 localhost openstack_network_exporter[246484]: Dec 15 05:06:34 localhost openstack_network_exporter[246484]: ERROR 10:06:34 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath Dec 15 05:06:34 localhost openstack_network_exporter[246484]: Dec 15 05:06:35 localhost nova_compute[286344]: 2025-12-15 10:06:35.553 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:06:35 localhost neutron_dhcp_agent[267542]: 2025-12-15 10:06:35.653 267546 INFO neutron.agent.linux.ip_lib [None req-e51afd07-b3d5-4e67-8908-953ab4a29ad1 - - - - - -] Device tap5a4cf37c-37 cannot be used as it has no MAC address#033[00m Dec 15 05:06:35 localhost nova_compute[286344]: 2025-12-15 10:06:35.676 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:06:35 localhost kernel: device tap5a4cf37c-37 entered promiscuous mode Dec 15 05:06:35 localhost NetworkManager[5963]: [1765793195.6841] manager: (tap5a4cf37c-37): new Generic device (/org/freedesktop/NetworkManager/Devices/53) Dec 15 05:06:35 localhost nova_compute[286344]: 2025-12-15 
10:06:35.684 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:06:35 localhost ovn_controller[154603]: 2025-12-15T10:06:35Z|00332|binding|INFO|Claiming lport 5a4cf37c-37ea-429b-9caa-6e3e638f1ab2 for this chassis. Dec 15 05:06:35 localhost ovn_controller[154603]: 2025-12-15T10:06:35Z|00333|binding|INFO|5a4cf37c-37ea-429b-9caa-6e3e638f1ab2: Claiming unknown Dec 15 05:06:35 localhost systemd-udevd[324518]: Network interface NamePolicy= disabled on kernel command line. Dec 15 05:06:35 localhost journal[231322]: ethtool ioctl error on tap5a4cf37c-37: No such device Dec 15 05:06:35 localhost ovn_metadata_agent[160585]: 2025-12-15 10:06:35.723 160590 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005559462.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8::f816:3eff:fe43:4a2/64', 'neutron:device_id': 'dhcpce287b4b-a043-5ed9-8e08-4fd555d768a2-c0669abd-aef1-4b0d-9f97-a6adeeac3211', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-c0669abd-aef1-4b0d-9f97-a6adeeac3211', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '89e710ef9f4f48d48a369002db572947', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=e47821dc-5f5d-44dc-8a16-54817df4049d, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], 
logical_port=5a4cf37c-37ea-429b-9caa-6e3e638f1ab2) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Dec 15 05:06:35 localhost ovn_controller[154603]: 2025-12-15T10:06:35Z|00334|binding|INFO|Setting lport 5a4cf37c-37ea-429b-9caa-6e3e638f1ab2 ovn-installed in OVS Dec 15 05:06:35 localhost ovn_controller[154603]: 2025-12-15T10:06:35Z|00335|binding|INFO|Setting lport 5a4cf37c-37ea-429b-9caa-6e3e638f1ab2 up in Southbound Dec 15 05:06:35 localhost ovn_metadata_agent[160585]: 2025-12-15 10:06:35.727 160590 INFO neutron.agent.ovn.metadata.agent [-] Port 5a4cf37c-37ea-429b-9caa-6e3e638f1ab2 in datapath c0669abd-aef1-4b0d-9f97-a6adeeac3211 bound to our chassis#033[00m Dec 15 05:06:35 localhost journal[231322]: ethtool ioctl error on tap5a4cf37c-37: No such device Dec 15 05:06:35 localhost nova_compute[286344]: 2025-12-15 10:06:35.728 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:06:35 localhost ovn_metadata_agent[160585]: 2025-12-15 10:06:35.730 160590 DEBUG neutron.agent.ovn.metadata.agent [-] Port c78598bf-e416-478a-bccf-cf1e43fa87b5 IP addresses were not retrieved from the Port_Binding MAC column ['unknown'] _get_port_ips /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:536#033[00m Dec 15 05:06:35 localhost ovn_metadata_agent[160585]: 2025-12-15 10:06:35.731 160590 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network c0669abd-aef1-4b0d-9f97-a6adeeac3211, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m Dec 15 05:06:35 localhost ovn_metadata_agent[160585]: 2025-12-15 10:06:35.732 160858 DEBUG oslo.privsep.daemon [-] privsep: reply[82000e73-c877-48c2-b976-34c354ec5d17]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Dec 15 
05:06:35 localhost journal[231322]: ethtool ioctl error on tap5a4cf37c-37: No such device Dec 15 05:06:35 localhost journal[231322]: ethtool ioctl error on tap5a4cf37c-37: No such device Dec 15 05:06:35 localhost ceph-mon[298913]: mon.np0005559462@0(leader).osd e174 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Dec 15 05:06:35 localhost journal[231322]: ethtool ioctl error on tap5a4cf37c-37: No such device Dec 15 05:06:35 localhost journal[231322]: ethtool ioctl error on tap5a4cf37c-37: No such device Dec 15 05:06:35 localhost journal[231322]: ethtool ioctl error on tap5a4cf37c-37: No such device Dec 15 05:06:35 localhost journal[231322]: ethtool ioctl error on tap5a4cf37c-37: No such device Dec 15 05:06:35 localhost nova_compute[286344]: 2025-12-15 10:06:35.768 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:06:35 localhost nova_compute[286344]: 2025-12-15 10:06:35.797 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:06:36 localhost ceph-mon[298913]: mon.np0005559462@0(leader) e17 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) Dec 15 05:06:36 localhost ceph-mon[298913]: log_channel(audit) log [DBG] : from='client.25625 172.18.0.34:0/382777224' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch Dec 15 05:06:36 localhost podman[324589]: Dec 15 05:06:36 localhost podman[324589]: 2025-12-15 10:06:36.580768713 +0000 UTC m=+0.086714871 container create 3785b198c1a52f311840af02cf8ddac39a4e13f54a43e25fb0d4081e675018f1 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-c0669abd-aef1-4b0d-9f97-a6adeeac3211, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, 
org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS) Dec 15 05:06:36 localhost systemd[1]: Started libpod-conmon-3785b198c1a52f311840af02cf8ddac39a4e13f54a43e25fb0d4081e675018f1.scope. Dec 15 05:06:36 localhost podman[324589]: 2025-12-15 10:06:36.539406158 +0000 UTC m=+0.045352386 image pull quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified Dec 15 05:06:36 localhost systemd[1]: tmp-crun.3tEh6E.mount: Deactivated successfully. Dec 15 05:06:36 localhost systemd[1]: Started libcrun container. Dec 15 05:06:36 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/c22eca8db6d8fa77bbe31b396c3e1171624cfbe972adff78ed92989400a40468/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff) Dec 15 05:06:36 localhost podman[324589]: 2025-12-15 10:06:36.669913054 +0000 UTC m=+0.175859212 container init 3785b198c1a52f311840af02cf8ddac39a4e13f54a43e25fb0d4081e675018f1 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-c0669abd-aef1-4b0d-9f97-a6adeeac3211, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS) Dec 15 05:06:36 localhost podman[324589]: 2025-12-15 10:06:36.681459113 +0000 UTC m=+0.187405281 container start 3785b198c1a52f311840af02cf8ddac39a4e13f54a43e25fb0d4081e675018f1 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-c0669abd-aef1-4b0d-9f97-a6adeeac3211, org.label-schema.schema-version=1.0, 
tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image) Dec 15 05:06:36 localhost dnsmasq[324607]: started, version 2.85 cachesize 150 Dec 15 05:06:36 localhost dnsmasq[324607]: DNS service limited to local subnets Dec 15 05:06:36 localhost dnsmasq[324607]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile Dec 15 05:06:36 localhost dnsmasq[324607]: warning: no upstream servers configured Dec 15 05:06:36 localhost dnsmasq[324607]: read /var/lib/neutron/dhcp/c0669abd-aef1-4b0d-9f97-a6adeeac3211/addn_hosts - 0 addresses Dec 15 05:06:36 localhost neutron_dhcp_agent[267542]: 2025-12-15 10:06:36.889 267546 INFO neutron.agent.dhcp.agent [None req-a729be93-6a8c-4284-9d33-e68a0bd3b92c - - - - - -] DHCP configuration for ports {'79503367-f53f-4b35-8760-76fcaa4d8407'} is completed#033[00m Dec 15 05:06:37 localhost ovn_metadata_agent[160585]: 2025-12-15 10:06:37.065 160590 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:04:2e:5c 10.100.0.2 2001:db8::f816:3eff:fe04:2e5c'], port_security=[], type=localport, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': ''}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28 2001:db8::f816:3eff:fe04:2e5c/64', 'neutron:device_id': 'ovnmeta-c0669abd-aef1-4b0d-9f97-a6adeeac3211', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-c0669abd-aef1-4b0d-9f97-a6adeeac3211', 'neutron:port_capabilities': '', 
'neutron:port_name': '', 'neutron:project_id': '89e710ef9f4f48d48a369002db572947', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=e47821dc-5f5d-44dc-8a16-54817df4049d, chassis=[], tunnel_key=1, gateway_chassis=[], requested_chassis=[], logical_port=79503367-f53f-4b35-8760-76fcaa4d8407) old=Port_Binding(mac=['fa:16:3e:04:2e:5c 2001:db8::f816:3eff:fe04:2e5c'], external_ids={'neutron:cidrs': '2001:db8::f816:3eff:fe04:2e5c/64', 'neutron:device_id': 'ovnmeta-c0669abd-aef1-4b0d-9f97-a6adeeac3211', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-c0669abd-aef1-4b0d-9f97-a6adeeac3211', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '89e710ef9f4f48d48a369002db572947', 'neutron:revision_number': '2', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Dec 15 05:06:37 localhost ovn_metadata_agent[160585]: 2025-12-15 10:06:37.067 160590 INFO neutron.agent.ovn.metadata.agent [-] Metadata Port 79503367-f53f-4b35-8760-76fcaa4d8407 in datapath c0669abd-aef1-4b0d-9f97-a6adeeac3211 updated#033[00m Dec 15 05:06:37 localhost ovn_metadata_agent[160585]: 2025-12-15 10:06:37.070 160590 DEBUG neutron.agent.ovn.metadata.agent [-] Port c78598bf-e416-478a-bccf-cf1e43fa87b5 IP addresses were not retrieved from the Port_Binding MAC column ['unknown'] _get_port_ips /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:536#033[00m Dec 15 05:06:37 localhost ovn_metadata_agent[160585]: 2025-12-15 10:06:37.070 160590 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 
c0669abd-aef1-4b0d-9f97-a6adeeac3211, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m Dec 15 05:06:37 localhost ovn_metadata_agent[160585]: 2025-12-15 10:06:37.071 160858 DEBUG oslo.privsep.daemon [-] privsep: reply[34927ef2-e4b7-4ecf-9187-b753abcadaa7]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Dec 15 05:06:37 localhost dnsmasq[324607]: exiting on receipt of SIGTERM Dec 15 05:06:37 localhost podman[324623]: 2025-12-15 10:06:37.073669338 +0000 UTC m=+0.062566638 container kill 3785b198c1a52f311840af02cf8ddac39a4e13f54a43e25fb0d4081e675018f1 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-c0669abd-aef1-4b0d-9f97-a6adeeac3211, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.license=GPLv2) Dec 15 05:06:37 localhost systemd[1]: libpod-3785b198c1a52f311840af02cf8ddac39a4e13f54a43e25fb0d4081e675018f1.scope: Deactivated successfully. 
Dec 15 05:06:37 localhost podman[324637]: 2025-12-15 10:06:37.140190432 +0000 UTC m=+0.052550746 container died 3785b198c1a52f311840af02cf8ddac39a4e13f54a43e25fb0d4081e675018f1 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-c0669abd-aef1-4b0d-9f97-a6adeeac3211, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0) Dec 15 05:06:37 localhost podman[324637]: 2025-12-15 10:06:37.170719796 +0000 UTC m=+0.083080120 container cleanup 3785b198c1a52f311840af02cf8ddac39a4e13f54a43e25fb0d4081e675018f1 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-c0669abd-aef1-4b0d-9f97-a6adeeac3211, tcib_managed=true, io.buildah.version=1.41.3, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0) Dec 15 05:06:37 localhost systemd[1]: libpod-conmon-3785b198c1a52f311840af02cf8ddac39a4e13f54a43e25fb0d4081e675018f1.scope: Deactivated successfully. 
Dec 15 05:06:37 localhost podman[324639]: 2025-12-15 10:06:37.223500696 +0000 UTC m=+0.128393134 container remove 3785b198c1a52f311840af02cf8ddac39a4e13f54a43e25fb0d4081e675018f1 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-c0669abd-aef1-4b0d-9f97-a6adeeac3211, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS) Dec 15 05:06:37 localhost systemd[1]: var-lib-containers-storage-overlay-c22eca8db6d8fa77bbe31b396c3e1171624cfbe972adff78ed92989400a40468-merged.mount: Deactivated successfully. Dec 15 05:06:37 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-3785b198c1a52f311840af02cf8ddac39a4e13f54a43e25fb0d4081e675018f1-userdata-shm.mount: Deactivated successfully. 
Dec 15 05:06:38 localhost nova_compute[286344]: 2025-12-15 10:06:38.264 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:06:39 localhost podman[324683]: 2025-12-15 10:06:39.219176617 +0000 UTC m=+0.069231364 container kill de37692395c3281252a36a476ac3795cfecdc2602a63c414d8accd8a1744ef0a (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-307e1014-96f2-485d-9b39-dd1a53bbce39, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251202) Dec 15 05:06:39 localhost dnsmasq[321289]: exiting on receipt of SIGTERM Dec 15 05:06:39 localhost systemd[1]: libpod-de37692395c3281252a36a476ac3795cfecdc2602a63c414d8accd8a1744ef0a.scope: Deactivated successfully. 
Dec 15 05:06:39 localhost podman[324702]: 2025-12-15 10:06:39.287906766 +0000 UTC m=+0.051444378 container died de37692395c3281252a36a476ac3795cfecdc2602a63c414d8accd8a1744ef0a (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-307e1014-96f2-485d-9b39-dd1a53bbce39, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true) Dec 15 05:06:39 localhost neutron_sriov_agent[260044]: 2025-12-15 10:06:39.288 2 INFO neutron.agent.securitygroups_rpc [None req-8e9020c1-b60c-486d-8d6f-f9661d9e8db0 6b5da6f221214afe93e1fa66574f238b 89e710ef9f4f48d48a369002db572947 - - default default] Security group member updated ['a6c5f808-dddc-4f17-acbf-63b1b6e6f4d6']#033[00m Dec 15 05:06:39 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-de37692395c3281252a36a476ac3795cfecdc2602a63c414d8accd8a1744ef0a-userdata-shm.mount: Deactivated successfully. 
Dec 15 05:06:39 localhost podman[324702]: 2025-12-15 10:06:39.319793795 +0000 UTC m=+0.083331377 container cleanup de37692395c3281252a36a476ac3795cfecdc2602a63c414d8accd8a1744ef0a (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-307e1014-96f2-485d-9b39-dd1a53bbce39, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, io.buildah.version=1.41.3) Dec 15 05:06:39 localhost systemd[1]: libpod-conmon-de37692395c3281252a36a476ac3795cfecdc2602a63c414d8accd8a1744ef0a.scope: Deactivated successfully. Dec 15 05:06:39 localhost podman[324703]: 2025-12-15 10:06:39.374313618 +0000 UTC m=+0.132972678 container remove de37692395c3281252a36a476ac3795cfecdc2602a63c414d8accd8a1744ef0a (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-307e1014-96f2-485d-9b39-dd1a53bbce39, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team) Dec 15 05:06:39 localhost ovn_controller[154603]: 2025-12-15T10:06:39Z|00336|binding|INFO|Releasing lport 8e9b47ba-93aa-4955-91d2-958702cde86f from this chassis (sb_readonly=0) Dec 15 05:06:39 localhost nova_compute[286344]: 2025-12-15 10:06:39.423 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:06:39 localhost kernel: device tap8e9b47ba-93 left promiscuous mode Dec 15 05:06:39 localhost ovn_controller[154603]: 
2025-12-15T10:06:39Z|00337|binding|INFO|Setting lport 8e9b47ba-93aa-4955-91d2-958702cde86f down in Southbound Dec 15 05:06:39 localhost ovn_metadata_agent[160585]: 2025-12-15 10:06:39.437 160590 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005559462.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8:0:ffff::2/64', 'neutron:device_id': 'dhcpce287b4b-a043-5ed9-8e08-4fd555d768a2-307e1014-96f2-485d-9b39-dd1a53bbce39', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-307e1014-96f2-485d-9b39-dd1a53bbce39', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '58db13bf0d834ccda318448020067936', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'np0005559462.localdomain'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=bb700d79-8a1a-4b17-ad19-514767bed4fc, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=8e9b47ba-93aa-4955-91d2-958702cde86f) old=Port_Binding(up=[True], chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Dec 15 05:06:39 localhost ovn_metadata_agent[160585]: 2025-12-15 10:06:39.439 160590 INFO neutron.agent.ovn.metadata.agent [-] Port 8e9b47ba-93aa-4955-91d2-958702cde86f in datapath 307e1014-96f2-485d-9b39-dd1a53bbce39 unbound from our chassis#033[00m Dec 15 05:06:39 localhost ovn_metadata_agent[160585]: 2025-12-15 10:06:39.440 160590 DEBUG neutron.agent.ovn.metadata.agent [-] There is no 
metadata port for network 307e1014-96f2-485d-9b39-dd1a53bbce39 or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599#033[00m Dec 15 05:06:39 localhost ovn_metadata_agent[160585]: 2025-12-15 10:06:39.441 160858 DEBUG oslo.privsep.daemon [-] privsep: reply[ff79fd11-5a57-4a6d-8ad4-bef2a6fa0301]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Dec 15 05:06:39 localhost nova_compute[286344]: 2025-12-15 10:06:39.452 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:06:39 localhost nova_compute[286344]: 2025-12-15 10:06:39.455 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:06:39 localhost neutron_dhcp_agent[267542]: 2025-12-15 10:06:39.468 267546 INFO neutron.agent.dhcp.agent [-] Network not present, action: clean_devices, action_kwargs: {}#033[00m Dec 15 05:06:40 localhost neutron_dhcp_agent[267542]: 2025-12-15 10:06:40.020 267546 INFO neutron.agent.dhcp.agent [-] Network not present, action: clean_devices, action_kwargs: {}#033[00m Dec 15 05:06:40 localhost podman[324775]: Dec 15 05:06:40 localhost podman[324775]: 2025-12-15 10:06:40.127754922 +0000 UTC m=+0.088475485 container create fc5e9bce44df8105b02182ef00d00351c80ec051cabb4ea787d16be8a408d5ac (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-c0669abd-aef1-4b0d-9f97-a6adeeac3211, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, 
tcib_build_tag=c3923531bcda0b0811b2d5053f189beb) Dec 15 05:06:40 localhost systemd[1]: Started libpod-conmon-fc5e9bce44df8105b02182ef00d00351c80ec051cabb4ea787d16be8a408d5ac.scope. Dec 15 05:06:40 localhost systemd[1]: Started libcrun container. Dec 15 05:06:40 localhost podman[324775]: 2025-12-15 10:06:40.084945101 +0000 UTC m=+0.045665704 image pull quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified Dec 15 05:06:40 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/2c5baab4b5e4c8ba2c8bea3080e88e9a77ac3f36ae0aedecf530caf25c11cfb5/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff) Dec 15 05:06:40 localhost podman[324775]: 2025-12-15 10:06:40.199277053 +0000 UTC m=+0.159997616 container init fc5e9bce44df8105b02182ef00d00351c80ec051cabb4ea787d16be8a408d5ac (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-c0669abd-aef1-4b0d-9f97-a6adeeac3211, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true) Dec 15 05:06:40 localhost podman[324775]: 2025-12-15 10:06:40.208585925 +0000 UTC m=+0.169306478 container start fc5e9bce44df8105b02182ef00d00351c80ec051cabb4ea787d16be8a408d5ac (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-c0669abd-aef1-4b0d-9f97-a6adeeac3211, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, tcib_managed=true) Dec 15 
05:06:40 localhost dnsmasq[324794]: started, version 2.85 cachesize 150 Dec 15 05:06:40 localhost dnsmasq[324794]: DNS service limited to local subnets Dec 15 05:06:40 localhost dnsmasq[324794]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile Dec 15 05:06:40 localhost dnsmasq[324794]: warning: no upstream servers configured Dec 15 05:06:40 localhost dnsmasq-dhcp[324794]: DHCP, static leases only on 10.100.0.0, lease time 1d Dec 15 05:06:40 localhost dnsmasq[324794]: read /var/lib/neutron/dhcp/c0669abd-aef1-4b0d-9f97-a6adeeac3211/addn_hosts - 0 addresses Dec 15 05:06:40 localhost dnsmasq-dhcp[324794]: read /var/lib/neutron/dhcp/c0669abd-aef1-4b0d-9f97-a6adeeac3211/host Dec 15 05:06:40 localhost dnsmasq-dhcp[324794]: read /var/lib/neutron/dhcp/c0669abd-aef1-4b0d-9f97-a6adeeac3211/opts Dec 15 05:06:40 localhost systemd[1]: var-lib-containers-storage-overlay-291069eecaeaf2840287e2aff8611ace2f4cc02008b5e608012fb764bff07026-merged.mount: Deactivated successfully. Dec 15 05:06:40 localhost systemd[1]: run-netns-qdhcp\x2d307e1014\x2d96f2\x2d485d\x2d9b39\x2ddd1a53bbce39.mount: Deactivated successfully. 
Dec 15 05:06:40 localhost neutron_dhcp_agent[267542]: 2025-12-15 10:06:40.273 267546 INFO neutron.agent.dhcp.agent [None req-d427305b-b20c-4029-8566-a002e2ebaa00 - - - - - -] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-12-15T10:06:38Z, description=, device_id=, device_owner=, dns_assignment=[, ], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[, ], id=2297803a-67ce-44d6-90d4-c5e719aa8e1c, ip_allocation=immediate, mac_address=fa:16:3e:93:94:fb, name=tempest-NetworksTestDHCPv6-1701368857, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-12-15T10:05:39Z, description=, dns_domain=, id=c0669abd-aef1-4b0d-9f97-a6adeeac3211, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-NetworksTestDHCPv6-test-network-1861207414, port_security_enabled=True, project_id=89e710ef9f4f48d48a369002db572947, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=20673, qos_policy_id=None, revision_number=23, router:external=False, shared=False, standard_attr_id=2164, status=ACTIVE, subnets=['0808c6aa-5e90-4f90-a242-fcadae4063ac', '100102c9-1d1d-4a61-9b76-061beb82eb87'], tags=[], tenant_id=89e710ef9f4f48d48a369002db572947, updated_at=2025-12-15T10:06:35Z, vlan_transparent=None, network_id=c0669abd-aef1-4b0d-9f97-a6adeeac3211, port_security_enabled=True, project_id=89e710ef9f4f48d48a369002db572947, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=['a6c5f808-dddc-4f17-acbf-63b1b6e6f4d6'], standard_attr_id=2456, status=DOWN, tags=[], tenant_id=89e710ef9f4f48d48a369002db572947, updated_at=2025-12-15T10:06:39Z on network c0669abd-aef1-4b0d-9f97-a6adeeac3211#033[00m Dec 15 05:06:40 localhost ovn_controller[154603]: 
2025-12-15T10:06:40Z|00338|binding|INFO|Releasing lport b35254ad-12eb-47bb-92be-44fefe0694f0 from this chassis (sb_readonly=0) Dec 15 05:06:40 localhost nova_compute[286344]: 2025-12-15 10:06:40.314 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:06:40 localhost neutron_dhcp_agent[267542]: 2025-12-15 10:06:40.442 267546 INFO neutron.agent.dhcp.agent [None req-bdf74775-0696-42fe-965c-a1df6b48eb73 - - - - - -] DHCP configuration for ports {'79503367-f53f-4b35-8760-76fcaa4d8407', '5a4cf37c-37ea-429b-9caa-6e3e638f1ab2'} is completed#033[00m Dec 15 05:06:40 localhost dnsmasq[324794]: read /var/lib/neutron/dhcp/c0669abd-aef1-4b0d-9f97-a6adeeac3211/addn_hosts - 2 addresses Dec 15 05:06:40 localhost podman[324811]: 2025-12-15 10:06:40.524809939 +0000 UTC m=+0.060423714 container kill fc5e9bce44df8105b02182ef00d00351c80ec051cabb4ea787d16be8a408d5ac (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-c0669abd-aef1-4b0d-9f97-a6adeeac3211, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb) Dec 15 05:06:40 localhost dnsmasq-dhcp[324794]: read /var/lib/neutron/dhcp/c0669abd-aef1-4b0d-9f97-a6adeeac3211/host Dec 15 05:06:40 localhost dnsmasq-dhcp[324794]: read /var/lib/neutron/dhcp/c0669abd-aef1-4b0d-9f97-a6adeeac3211/opts Dec 15 05:06:40 localhost nova_compute[286344]: 2025-12-15 10:06:40.585 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:06:40 localhost ceph-mon[298913]: mon.np0005559462@0(leader).osd e174 _set_new_cache_sizes 
cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Dec 15 05:06:40 localhost ceph-mon[298913]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #55. Immutable memtables: 0. Dec 15 05:06:40 localhost ceph-mon[298913]: rocksdb: (Original Log Time 2025/12/15-10:06:40.764145) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0 Dec 15 05:06:40 localhost ceph-mon[298913]: rocksdb: [db/flush_job.cc:856] [default] [JOB 31] Flushing memtable with next log file: 55 Dec 15 05:06:40 localhost ceph-mon[298913]: rocksdb: EVENT_LOG_v1 {"time_micros": 1765793200764248, "job": 31, "event": "flush_started", "num_memtables": 1, "num_entries": 1166, "num_deletes": 260, "total_data_size": 1579510, "memory_usage": 1720704, "flush_reason": "Manual Compaction"} Dec 15 05:06:40 localhost ceph-mon[298913]: rocksdb: [db/flush_job.cc:885] [default] [JOB 31] Level-0 flush table #56: started Dec 15 05:06:40 localhost ceph-mon[298913]: rocksdb: EVENT_LOG_v1 {"time_micros": 1765793200775796, "cf_name": "default", "job": 31, "event": "table_file_creation", "file_number": 56, "file_size": 1347262, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 30162, "largest_seqno": 31327, "table_properties": {"data_size": 1342493, "index_size": 2176, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 1541, "raw_key_size": 13211, "raw_average_key_size": 22, "raw_value_size": 1332002, "raw_average_value_size": 2227, "num_data_blocks": 95, "num_entries": 598, "num_filter_entries": 598, "num_deletions": 260, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", 
"column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1765793140, "oldest_key_time": 1765793140, "file_creation_time": 1765793200, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "603b24af-e2be-4214-bc56-9e652eb4af3d", "db_session_id": "0OJRM9SCUA16EXV0VQZ2", "orig_file_number": 56, "seqno_to_time_mapping": "N/A"}} Dec 15 05:06:40 localhost ceph-mon[298913]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 31] Flush lasted 11696 microseconds, and 4387 cpu microseconds. Dec 15 05:06:40 localhost ceph-mon[298913]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed. 
Dec 15 05:06:40 localhost ceph-mon[298913]: rocksdb: (Original Log Time 2025/12/15-10:06:40.775842) [db/flush_job.cc:967] [default] [JOB 31] Level-0 flush table #56: 1347262 bytes OK Dec 15 05:06:40 localhost ceph-mon[298913]: rocksdb: (Original Log Time 2025/12/15-10:06:40.775865) [db/memtable_list.cc:519] [default] Level-0 commit table #56 started Dec 15 05:06:40 localhost ceph-mon[298913]: rocksdb: (Original Log Time 2025/12/15-10:06:40.777809) [db/memtable_list.cc:722] [default] Level-0 commit table #56: memtable #1 done Dec 15 05:06:40 localhost ceph-mon[298913]: rocksdb: (Original Log Time 2025/12/15-10:06:40.777831) EVENT_LOG_v1 {"time_micros": 1765793200777824, "job": 31, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0} Dec 15 05:06:40 localhost ceph-mon[298913]: rocksdb: (Original Log Time 2025/12/15-10:06:40.777852) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25 Dec 15 05:06:40 localhost ceph-mon[298913]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 31] Try to delete WAL files size 1573896, prev total WAL file size 1574220, number of live WAL files 2. Dec 15 05:06:40 localhost ceph-mon[298913]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005559462/store.db/000052.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000 Dec 15 05:06:40 localhost ceph-mon[298913]: rocksdb: (Original Log Time 2025/12/15-10:06:40.778555) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '6D6772737461740034303037' seq:72057594037927935, type:22 .. 
'6D6772737461740034323630' seq:0, type:0; will stop at (end) Dec 15 05:06:40 localhost ceph-mon[298913]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 32] Compacting 1@0 + 1@6 files to L6, score -1.00 Dec 15 05:06:40 localhost ceph-mon[298913]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 31 Base level 0, inputs: [56(1315KB)], [54(16MB)] Dec 15 05:06:40 localhost ceph-mon[298913]: rocksdb: EVENT_LOG_v1 {"time_micros": 1765793200778653, "job": 32, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [56], "files_L6": [54], "score": -1, "input_data_size": 19018735, "oldest_snapshot_seqno": -1} Dec 15 05:06:40 localhost ceph-mon[298913]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 32] Generated table #57: 12997 keys, 17094976 bytes, temperature: kUnknown Dec 15 05:06:40 localhost ceph-mon[298913]: rocksdb: EVENT_LOG_v1 {"time_micros": 1765793200905495, "cf_name": "default", "job": 32, "event": "table_file_creation", "file_number": 57, "file_size": 17094976, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 17022711, "index_size": 38692, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 32517, "raw_key_size": 349244, "raw_average_key_size": 26, "raw_value_size": 16803027, "raw_average_value_size": 1292, "num_data_blocks": 1456, "num_entries": 12997, "num_filter_entries": 12997, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; 
strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1765792320, "oldest_key_time": 0, "file_creation_time": 1765793200, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "603b24af-e2be-4214-bc56-9e652eb4af3d", "db_session_id": "0OJRM9SCUA16EXV0VQZ2", "orig_file_number": 57, "seqno_to_time_mapping": "N/A"}} Dec 15 05:06:40 localhost ceph-mon[298913]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed. Dec 15 05:06:40 localhost ceph-mon[298913]: rocksdb: (Original Log Time 2025/12/15-10:06:40.905762) [db/compaction/compaction_job.cc:1663] [default] [JOB 32] Compacted 1@0 + 1@6 files to L6 => 17094976 bytes Dec 15 05:06:40 localhost ceph-mon[298913]: rocksdb: (Original Log Time 2025/12/15-10:06:40.907401) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 149.8 rd, 134.7 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(1.3, 16.9 +0.0 blob) out(16.3 +0.0 blob), read-write-amplify(26.8) write-amplify(12.7) OK, records in: 13512, records dropped: 515 output_compression: NoCompression Dec 15 05:06:40 localhost ceph-mon[298913]: rocksdb: (Original Log Time 2025/12/15-10:06:40.907428) EVENT_LOG_v1 {"time_micros": 1765793200907416, "job": 32, "event": "compaction_finished", "compaction_time_micros": 126923, "compaction_time_cpu_micros": 48912, "output_level": 6, "num_output_files": 1, "total_output_size": 17094976, "num_input_records": 13512, "num_output_records": 12997, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]} Dec 15 05:06:40 localhost ceph-mon[298913]: rocksdb: [file/delete_scheduler.cc:74] Deleted file 
/var/lib/ceph/mon/ceph-np0005559462/store.db/000056.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000 Dec 15 05:06:40 localhost ceph-mon[298913]: rocksdb: EVENT_LOG_v1 {"time_micros": 1765793200907751, "job": 32, "event": "table_file_deletion", "file_number": 56} Dec 15 05:06:40 localhost ceph-mon[298913]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005559462/store.db/000054.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000 Dec 15 05:06:40 localhost ceph-mon[298913]: rocksdb: EVENT_LOG_v1 {"time_micros": 1765793200910293, "job": 32, "event": "table_file_deletion", "file_number": 54} Dec 15 05:06:40 localhost ceph-mon[298913]: rocksdb: (Original Log Time 2025/12/15-10:06:40.778482) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting Dec 15 05:06:40 localhost ceph-mon[298913]: rocksdb: (Original Log Time 2025/12/15-10:06:40.910397) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting Dec 15 05:06:40 localhost ceph-mon[298913]: rocksdb: (Original Log Time 2025/12/15-10:06:40.910403) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting Dec 15 05:06:40 localhost ceph-mon[298913]: rocksdb: (Original Log Time 2025/12/15-10:06:40.910406) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting Dec 15 05:06:40 localhost ceph-mon[298913]: rocksdb: (Original Log Time 2025/12/15-10:06:40.910410) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting Dec 15 05:06:40 localhost ceph-mon[298913]: rocksdb: (Original Log Time 2025/12/15-10:06:40.910413) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting Dec 15 05:06:40 localhost neutron_dhcp_agent[267542]: 2025-12-15 10:06:40.994 267546 INFO neutron.agent.dhcp.agent [None req-93b883b7-daff-4fd2-9339-869bfae73543 - - - - - -] DHCP configuration for ports 
{'2297803a-67ce-44d6-90d4-c5e719aa8e1c'} is completed#033[00m Dec 15 05:06:41 localhost neutron_sriov_agent[260044]: 2025-12-15 10:06:41.456 2 INFO neutron.agent.securitygroups_rpc [None req-813e1a4f-dcc5-4af4-918a-a93008022a16 6b5da6f221214afe93e1fa66574f238b 89e710ef9f4f48d48a369002db572947 - - default default] Security group member updated ['a6c5f808-dddc-4f17-acbf-63b1b6e6f4d6']#033[00m Dec 15 05:06:41 localhost dnsmasq[324794]: read /var/lib/neutron/dhcp/c0669abd-aef1-4b0d-9f97-a6adeeac3211/addn_hosts - 0 addresses Dec 15 05:06:41 localhost dnsmasq-dhcp[324794]: read /var/lib/neutron/dhcp/c0669abd-aef1-4b0d-9f97-a6adeeac3211/host Dec 15 05:06:41 localhost dnsmasq-dhcp[324794]: read /var/lib/neutron/dhcp/c0669abd-aef1-4b0d-9f97-a6adeeac3211/opts Dec 15 05:06:41 localhost podman[324851]: 2025-12-15 10:06:41.785177378 +0000 UTC m=+0.059264475 container kill fc5e9bce44df8105b02182ef00d00351c80ec051cabb4ea787d16be8a408d5ac (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-c0669abd-aef1-4b0d-9f97-a6adeeac3211, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb) Dec 15 05:06:43 localhost nova_compute[286344]: 2025-12-15 10:06:43.268 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:06:43 localhost ceph-mon[298913]: mon.np0005559462@0(leader) e17 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) Dec 15 05:06:43 localhost ceph-mon[298913]: log_channel(audit) log [DBG] : from='client.25625 172.18.0.34:0/382777224' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch Dec 15 05:06:43 
localhost dnsmasq[324794]: exiting on receipt of SIGTERM Dec 15 05:06:43 localhost podman[324890]: 2025-12-15 10:06:43.447307601 +0000 UTC m=+0.062869704 container kill fc5e9bce44df8105b02182ef00d00351c80ec051cabb4ea787d16be8a408d5ac (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-c0669abd-aef1-4b0d-9f97-a6adeeac3211, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, tcib_managed=true) Dec 15 05:06:43 localhost systemd[1]: libpod-fc5e9bce44df8105b02182ef00d00351c80ec051cabb4ea787d16be8a408d5ac.scope: Deactivated successfully. Dec 15 05:06:43 localhost systemd[1]: Started /usr/bin/podman healthcheck run 67ee72fcd99c896f79e8807764b1d0f04e7d45927d06e306c0986394e8ed42a0. Dec 15 05:06:43 localhost systemd[1]: Started /usr/bin/podman healthcheck run 9f0f481c937606298203797a71924b7614a5d9e3f4a8afd837cfa5b9602aedeb. Dec 15 05:06:43 localhost systemd[1]: Started /usr/bin/podman healthcheck run b3d8483ade539500e228c2af9c0c3e647febb5ec7903610a83aea32631007d4a. 
Dec 15 05:06:43 localhost podman[324902]: 2025-12-15 10:06:43.532074252 +0000 UTC m=+0.070734801 container died fc5e9bce44df8105b02182ef00d00351c80ec051cabb4ea787d16be8a408d5ac (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-c0669abd-aef1-4b0d-9f97-a6adeeac3211, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team) Dec 15 05:06:43 localhost podman[324902]: 2025-12-15 10:06:43.668318622 +0000 UTC m=+0.206979151 container cleanup fc5e9bce44df8105b02182ef00d00351c80ec051cabb4ea787d16be8a408d5ac (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-c0669abd-aef1-4b0d-9f97-a6adeeac3211, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb) Dec 15 05:06:43 localhost podman[324911]: 2025-12-15 10:06:43.618238868 +0000 UTC m=+0.137973193 container health_status 9f0f481c937606298203797a71924b7614a5d9e3f4a8afd837cfa5b9602aedeb (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, tcib_managed=true, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 
'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, container_name=multipathd, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb) Dec 15 05:06:43 localhost systemd[1]: libpod-conmon-fc5e9bce44df8105b02182ef00d00351c80ec051cabb4ea787d16be8a408d5ac.scope: Deactivated successfully. 
Dec 15 05:06:43 localhost podman[324910]: 2025-12-15 10:06:43.583099499 +0000 UTC m=+0.099720996 container health_status 67ee72fcd99c896f79e8807764b1d0f04e7d45927d06e306c0986394e8ed42a0 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, maintainer=The Prometheus Authors , managed_by=edpm_ansible, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter) Dec 15 05:06:43 localhost podman[324913]: 2025-12-15 10:06:43.638107655 +0000 UTC m=+0.154711562 container health_status b3d8483ade539500e228c2af9c0c3e647febb5ec7903610a83aea32631007d4a (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, 
container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_managed=true, config_id=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '07f3af15d79276418f54a8dde2fc251064527c3571430e14af77aaa2c01f1193-cb1e9478634a00c8fb5ad9cd7fcd38c7b2a1a85187dfaa618bc715c24c2b15fb'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, managed_by=edpm_ansible) Dec 15 05:06:43 localhost podman[324910]: 2025-12-15 10:06:43.717494622 +0000 UTC m=+0.234116139 container exec_died 67ee72fcd99c896f79e8807764b1d0f04e7d45927d06e306c0986394e8ed42a0 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible, config_data={'command': 
['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter) Dec 15 05:06:43 localhost systemd[1]: 67ee72fcd99c896f79e8807764b1d0f04e7d45927d06e306c0986394e8ed42a0.service: Deactivated successfully. 
Dec 15 05:06:43 localhost podman[324904]: 2025-12-15 10:06:43.73980634 +0000 UTC m=+0.265628647 container remove fc5e9bce44df8105b02182ef00d00351c80ec051cabb4ea787d16be8a408d5ac (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-c0669abd-aef1-4b0d-9f97-a6adeeac3211, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0) Dec 15 05:06:43 localhost podman[324913]: 2025-12-15 10:06:43.768847587 +0000 UTC m=+0.285451474 container exec_died b3d8483ade539500e228c2af9c0c3e647febb5ec7903610a83aea32631007d4a (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '07f3af15d79276418f54a8dde2fc251064527c3571430e14af77aaa2c01f1193-cb1e9478634a00c8fb5ad9cd7fcd38c7b2a1a85187dfaa618bc715c24c2b15fb'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.license=GPLv2, tcib_managed=true, config_id=ceilometer_agent_compute, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=ceilometer_agent_compute, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251202, maintainer=OpenStack Kubernetes Operator team) Dec 15 05:06:43 localhost systemd[1]: b3d8483ade539500e228c2af9c0c3e647febb5ec7903610a83aea32631007d4a.service: Deactivated successfully. Dec 15 05:06:43 localhost systemd[1]: Started /usr/bin/podman healthcheck run 730c50020a7d9e2da21d2462d40af20ac303e43f3491f692cbbd87f432ba3e09. Dec 15 05:06:43 localhost systemd[1]: Started /usr/bin/podman healthcheck run ac2eae30a2a7f971398af42ea67b3ed799d4b3341f7688ebcd73f7ca7f66c2a8. 
Dec 15 05:06:43 localhost podman[324911]: 2025-12-15 10:06:43.853712701 +0000 UTC m=+0.373447026 container exec_died 9f0f481c937606298203797a71924b7614a5d9e3f4a8afd837cfa5b9602aedeb (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, org.label-schema.license=GPLv2, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, config_id=multipathd) Dec 15 05:06:43 localhost systemd[1]: 9f0f481c937606298203797a71924b7614a5d9e3f4a8afd837cfa5b9602aedeb.service: Deactivated successfully. 
Dec 15 05:06:43 localhost podman[324994]: 2025-12-15 10:06:43.925082477 +0000 UTC m=+0.106397863 container health_status ac2eae30a2a7f971398af42ea67b3ed799d4b3341f7688ebcd73f7ca7f66c2a8 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '07f3af15d79276418f54a8dde2fc251064527c3571430e14af77aaa2c01f1193'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ovn_controller, managed_by=edpm_ansible) Dec 15 05:06:43 localhost podman[324993]: 2025-12-15 10:06:43.939713453 +0000 UTC m=+0.126248440 container health_status 730c50020a7d9e2da21d2462d40af20ac303e43f3491f692cbbd87f432ba3e09 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, io.openshift.tags=minimal rhel9, maintainer=Red Hat, 
Inc., name=ubi9-minimal, url=https://catalog.redhat.com/en/search?searchType=containers, build-date=2025-08-20T13:12:41, release=1755695350, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., version=9.6, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'cb1e9478634a00c8fb5ad9cd7fcd38c7b2a1a85187dfaa618bc715c24c2b15fb'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, managed_by=edpm_ansible, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, config_id=openstack_network_exporter, container_name=openstack_network_exporter, vendor=Red Hat, Inc., architecture=x86_64, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.expose-services=, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-type=git, 
com.redhat.component=ubi9-minimal-container, io.buildah.version=1.33.7, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly.) Dec 15 05:06:43 localhost podman[324993]: 2025-12-15 10:06:43.958800201 +0000 UTC m=+0.145335218 container exec_died 730c50020a7d9e2da21d2462d40af20ac303e43f3491f692cbbd87f432ba3e09 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, container_name=openstack_network_exporter, io.openshift.expose-services=, release=1755695350, io.buildah.version=1.33.7, distribution-scope=public, managed_by=edpm_ansible, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=openstack_network_exporter, name=ubi9-minimal, io.openshift.tags=minimal rhel9, version=9.6, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, url=https://catalog.redhat.com/en/search?searchType=containers, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'cb1e9478634a00c8fb5ad9cd7fcd38c7b2a1a85187dfaa618bc715c24c2b15fb'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': 
['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-type=git, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vendor=Red Hat, Inc., build-date=2025-08-20T13:12:41, maintainer=Red Hat, Inc., architecture=x86_64, com.redhat.component=ubi9-minimal-container) Dec 15 05:06:43 localhost systemd[1]: 730c50020a7d9e2da21d2462d40af20ac303e43f3491f692cbbd87f432ba3e09.service: Deactivated successfully. 
Dec 15 05:06:43 localhost podman[324994]: 2025-12-15 10:06:43.995462588 +0000 UTC m=+0.176777964 container exec_died ac2eae30a2a7f971398af42ea67b3ed799d4b3341f7688ebcd73f7ca7f66c2a8 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '07f3af15d79276418f54a8dde2fc251064527c3571430e14af77aaa2c01f1193'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, tcib_managed=true, container_name=ovn_controller, org.label-schema.vendor=CentOS) Dec 15 05:06:44 localhost systemd[1]: ac2eae30a2a7f971398af42ea67b3ed799d4b3341f7688ebcd73f7ca7f66c2a8.service: Deactivated successfully. Dec 15 05:06:44 localhost systemd[1]: tmp-crun.YsYOSB.mount: Deactivated successfully. Dec 15 05:06:44 localhost systemd[1]: var-lib-containers-storage-overlay-2c5baab4b5e4c8ba2c8bea3080e88e9a77ac3f36ae0aedecf530caf25c11cfb5-merged.mount: Deactivated successfully. 
Dec 15 05:06:44 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-fc5e9bce44df8105b02182ef00d00351c80ec051cabb4ea787d16be8a408d5ac-userdata-shm.mount: Deactivated successfully. Dec 15 05:06:44 localhost podman[325082]: Dec 15 05:06:44 localhost podman[325082]: 2025-12-15 10:06:44.643022153 +0000 UTC m=+0.094347723 container create 558baec87ea075947cc1a0f1b234ce69fb881efc0737f5eb89f209a10eeb80f5 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-c0669abd-aef1-4b0d-9f97-a6adeeac3211, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0) Dec 15 05:06:44 localhost systemd[1]: Started libpod-conmon-558baec87ea075947cc1a0f1b234ce69fb881efc0737f5eb89f209a10eeb80f5.scope. Dec 15 05:06:44 localhost systemd[1]: tmp-crun.rGTm04.mount: Deactivated successfully. Dec 15 05:06:44 localhost systemd[1]: Started libcrun container. 
Dec 15 05:06:44 localhost podman[325082]: 2025-12-15 10:06:44.599088023 +0000 UTC m=+0.050413623 image pull quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified Dec 15 05:06:44 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/f9b970c1f41873132a107a7c314af2449ba30abeb30aac5c9970db7b972ee7d9/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff) Dec 15 05:06:44 localhost podman[325082]: 2025-12-15 10:06:44.708685855 +0000 UTC m=+0.160011415 container init 558baec87ea075947cc1a0f1b234ce69fb881efc0737f5eb89f209a10eeb80f5 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-c0669abd-aef1-4b0d-9f97-a6adeeac3211, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0) Dec 15 05:06:44 localhost podman[325082]: 2025-12-15 10:06:44.718164933 +0000 UTC m=+0.169490493 container start 558baec87ea075947cc1a0f1b234ce69fb881efc0737f5eb89f209a10eeb80f5 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-c0669abd-aef1-4b0d-9f97-a6adeeac3211, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, tcib_managed=true, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3) Dec 15 05:06:44 localhost dnsmasq[325100]: started, version 2.85 cachesize 150 Dec 15 05:06:44 localhost dnsmasq[325100]: DNS service limited to local subnets Dec 15 05:06:44 localhost dnsmasq[325100]: compile time options: IPv6 GNU-getopt DBus no-UBus 
no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile Dec 15 05:06:44 localhost dnsmasq[325100]: warning: no upstream servers configured Dec 15 05:06:44 localhost dnsmasq[325100]: read /var/lib/neutron/dhcp/c0669abd-aef1-4b0d-9f97-a6adeeac3211/addn_hosts - 0 addresses Dec 15 05:06:45 localhost dnsmasq[325100]: read /var/lib/neutron/dhcp/c0669abd-aef1-4b0d-9f97-a6adeeac3211/addn_hosts - 0 addresses Dec 15 05:06:45 localhost podman[325118]: 2025-12-15 10:06:45.128498751 +0000 UTC m=+0.066100845 container kill 558baec87ea075947cc1a0f1b234ce69fb881efc0737f5eb89f209a10eeb80f5 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-c0669abd-aef1-4b0d-9f97-a6adeeac3211, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS) Dec 15 05:06:45 localhost neutron_dhcp_agent[267542]: 2025-12-15 10:06:45.147 267546 INFO neutron.agent.dhcp.agent [None req-2da9c781-d63f-4966-a06a-6ded5a9ae16e - - - - - -] DHCP configuration for ports {'79503367-f53f-4b35-8760-76fcaa4d8407', '5a4cf37c-37ea-429b-9caa-6e3e638f1ab2'} is completed#033[00m Dec 15 05:06:45 localhost neutron_dhcp_agent[267542]: 2025-12-15 10:06:45.514 267546 INFO neutron.agent.dhcp.agent [None req-bf0b8e78-a0d6-4a7e-a9c5-e0a9e3e47aee - - - - - -] DHCP configuration for ports {'79503367-f53f-4b35-8760-76fcaa4d8407', '5a4cf37c-37ea-429b-9caa-6e3e638f1ab2'} is completed#033[00m Dec 15 05:06:45 localhost nova_compute[286344]: 2025-12-15 10:06:45.635 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:06:45 localhost ceph-mon[298913]: 
mon.np0005559462@0(leader).osd e174 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Dec 15 05:06:46 localhost ovn_metadata_agent[160585]: 2025-12-15 10:06:46.901 160590 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:04:2e:5c 2001:db8::f816:3eff:fe04:2e5c'], port_security=[], type=localport, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': ''}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8::f816:3eff:fe04:2e5c/64', 'neutron:device_id': 'ovnmeta-c0669abd-aef1-4b0d-9f97-a6adeeac3211', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-c0669abd-aef1-4b0d-9f97-a6adeeac3211', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '89e710ef9f4f48d48a369002db572947', 'neutron:revision_number': '6', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=e47821dc-5f5d-44dc-8a16-54817df4049d, chassis=[], tunnel_key=1, gateway_chassis=[], requested_chassis=[], logical_port=79503367-f53f-4b35-8760-76fcaa4d8407) old=Port_Binding(mac=['fa:16:3e:04:2e:5c 10.100.0.2 2001:db8::f816:3eff:fe04:2e5c'], external_ids={'neutron:cidrs': '10.100.0.2/28 2001:db8::f816:3eff:fe04:2e5c/64', 'neutron:device_id': 'ovnmeta-c0669abd-aef1-4b0d-9f97-a6adeeac3211', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-c0669abd-aef1-4b0d-9f97-a6adeeac3211', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '89e710ef9f4f48d48a369002db572947', 'neutron:revision_number': '3', 
'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Dec 15 05:06:46 localhost ovn_metadata_agent[160585]: 2025-12-15 10:06:46.904 160590 INFO neutron.agent.ovn.metadata.agent [-] Metadata Port 79503367-f53f-4b35-8760-76fcaa4d8407 in datapath c0669abd-aef1-4b0d-9f97-a6adeeac3211 updated#033[00m Dec 15 05:06:46 localhost ovn_metadata_agent[160585]: 2025-12-15 10:06:46.906 160590 DEBUG neutron.agent.ovn.metadata.agent [-] Port c78598bf-e416-478a-bccf-cf1e43fa87b5 IP addresses were not retrieved from the Port_Binding MAC column ['unknown'] _get_port_ips /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:536#033[00m Dec 15 05:06:46 localhost ovn_metadata_agent[160585]: 2025-12-15 10:06:46.907 160590 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network c0669abd-aef1-4b0d-9f97-a6adeeac3211, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m Dec 15 05:06:46 localhost ovn_metadata_agent[160585]: 2025-12-15 10:06:46.908 160858 DEBUG oslo.privsep.daemon [-] privsep: reply[a80cb1ce-9ed5-4552-a6e1-a3c21a961eb0]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Dec 15 05:06:48 localhost dnsmasq[325100]: read /var/lib/neutron/dhcp/c0669abd-aef1-4b0d-9f97-a6adeeac3211/addn_hosts - 0 addresses Dec 15 05:06:48 localhost systemd[1]: tmp-crun.LggYmK.mount: Deactivated successfully. 
Dec 15 05:06:48 localhost podman[325156]: 2025-12-15 10:06:48.236949502 +0000 UTC m=+0.065571188 container kill 558baec87ea075947cc1a0f1b234ce69fb881efc0737f5eb89f209a10eeb80f5 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-c0669abd-aef1-4b0d-9f97-a6adeeac3211, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true) Dec 15 05:06:48 localhost nova_compute[286344]: 2025-12-15 10:06:48.301 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:06:49 localhost neutron_dhcp_agent[267542]: 2025-12-15 10:06:49.120 267546 INFO neutron.agent.dhcp.agent [None req-93b2c469-76bf-4595-aec7-c4e19d8627b4 - - - - - -] DHCP configuration for ports {'79503367-f53f-4b35-8760-76fcaa4d8407', '5a4cf37c-37ea-429b-9caa-6e3e638f1ab2'} is completed#033[00m Dec 15 05:06:49 localhost dnsmasq[325100]: exiting on receipt of SIGTERM Dec 15 05:06:49 localhost podman[325192]: 2025-12-15 10:06:49.872732327 +0000 UTC m=+0.059932635 container kill 558baec87ea075947cc1a0f1b234ce69fb881efc0737f5eb89f209a10eeb80f5 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-c0669abd-aef1-4b0d-9f97-a6adeeac3211, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0) Dec 15 05:06:49 localhost systemd[1]: 
libpod-558baec87ea075947cc1a0f1b234ce69fb881efc0737f5eb89f209a10eeb80f5.scope: Deactivated successfully. Dec 15 05:06:49 localhost podman[325206]: 2025-12-15 10:06:49.940307718 +0000 UTC m=+0.056863222 container died 558baec87ea075947cc1a0f1b234ce69fb881efc0737f5eb89f209a10eeb80f5 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-c0669abd-aef1-4b0d-9f97-a6adeeac3211, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true) Dec 15 05:06:49 localhost systemd[1]: tmp-crun.fYl0mN.mount: Deactivated successfully. Dec 15 05:06:49 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-558baec87ea075947cc1a0f1b234ce69fb881efc0737f5eb89f209a10eeb80f5-userdata-shm.mount: Deactivated successfully. Dec 15 05:06:49 localhost podman[325206]: 2025-12-15 10:06:49.975421915 +0000 UTC m=+0.091977378 container cleanup 558baec87ea075947cc1a0f1b234ce69fb881efc0737f5eb89f209a10eeb80f5 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-c0669abd-aef1-4b0d-9f97-a6adeeac3211, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.license=GPLv2) Dec 15 05:06:49 localhost systemd[1]: libpod-conmon-558baec87ea075947cc1a0f1b234ce69fb881efc0737f5eb89f209a10eeb80f5.scope: Deactivated successfully. 
Dec 15 05:06:50 localhost podman[325213]: 2025-12-15 10:06:50.010412279 +0000 UTC m=+0.113706920 container remove 558baec87ea075947cc1a0f1b234ce69fb881efc0737f5eb89f209a10eeb80f5 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-c0669abd-aef1-4b0d-9f97-a6adeeac3211, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb) Dec 15 05:06:50 localhost ovn_metadata_agent[160585]: 2025-12-15 10:06:50.635 160590 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:04:2e:5c 10.100.0.2 2001:db8::f816:3eff:fe04:2e5c'], port_security=[], type=localport, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': ''}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28 2001:db8::f816:3eff:fe04:2e5c/64', 'neutron:device_id': 'ovnmeta-c0669abd-aef1-4b0d-9f97-a6adeeac3211', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-c0669abd-aef1-4b0d-9f97-a6adeeac3211', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '89e710ef9f4f48d48a369002db572947', 'neutron:revision_number': '7', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=e47821dc-5f5d-44dc-8a16-54817df4049d, chassis=[], tunnel_key=1, gateway_chassis=[], requested_chassis=[], 
logical_port=79503367-f53f-4b35-8760-76fcaa4d8407) old=Port_Binding(mac=['fa:16:3e:04:2e:5c 2001:db8::f816:3eff:fe04:2e5c'], external_ids={'neutron:cidrs': '2001:db8::f816:3eff:fe04:2e5c/64', 'neutron:device_id': 'ovnmeta-c0669abd-aef1-4b0d-9f97-a6adeeac3211', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-c0669abd-aef1-4b0d-9f97-a6adeeac3211', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '89e710ef9f4f48d48a369002db572947', 'neutron:revision_number': '6', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Dec 15 05:06:50 localhost ovn_metadata_agent[160585]: 2025-12-15 10:06:50.638 160590 INFO neutron.agent.ovn.metadata.agent [-] Metadata Port 79503367-f53f-4b35-8760-76fcaa4d8407 in datapath c0669abd-aef1-4b0d-9f97-a6adeeac3211 updated#033[00m Dec 15 05:06:50 localhost ovn_metadata_agent[160585]: 2025-12-15 10:06:50.640 160590 DEBUG neutron.agent.ovn.metadata.agent [-] Port c78598bf-e416-478a-bccf-cf1e43fa87b5 IP addresses were not retrieved from the Port_Binding MAC column ['unknown'] _get_port_ips /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:536#033[00m Dec 15 05:06:50 localhost ovn_metadata_agent[160585]: 2025-12-15 10:06:50.640 160590 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network c0669abd-aef1-4b0d-9f97-a6adeeac3211, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m Dec 15 05:06:50 localhost ovn_metadata_agent[160585]: 2025-12-15 10:06:50.641 160858 DEBUG oslo.privsep.daemon [-] privsep: reply[b6e98741-10b5-439d-9f1e-acce975e5bf8]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Dec 15 05:06:50 localhost 
nova_compute[286344]: 2025-12-15 10:06:50.665 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:06:50 localhost ceph-mon[298913]: mon.np0005559462@0(leader).osd e174 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408 Dec 15 05:06:50 localhost systemd[1]: var-lib-containers-storage-overlay-f9b970c1f41873132a107a7c314af2449ba30abeb30aac5c9970db7b972ee7d9-merged.mount: Deactivated successfully. Dec 15 05:06:51 localhost ovn_metadata_agent[160585]: 2025-12-15 10:06:51.482 160590 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Dec 15 05:06:51 localhost ovn_metadata_agent[160585]: 2025-12-15 10:06:51.483 160590 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Dec 15 05:06:51 localhost ovn_metadata_agent[160585]: 2025-12-15 10:06:51.483 160590 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Dec 15 05:06:51 localhost systemd[1]: Started /usr/bin/podman healthcheck run 4d46c406a3855fd1c322e5d86c048188ea5865daa07ba23aaebacc7879668923. 
Dec 15 05:06:51 localhost podman[325247]: 2025-12-15 10:06:51.764020534 +0000 UTC m=+0.091968447 container health_status 4d46c406a3855fd1c322e5d86c048188ea5865daa07ba23aaebacc7879668923 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '07f3af15d79276418f54a8dde2fc251064527c3571430e14af77aaa2c01f1193-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, container_name=ovn_metadata_agent, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.build-date=20251202) Dec 15 05:06:51 localhost 
podman[325247]: 2025-12-15 10:06:51.79763389 +0000 UTC m=+0.125581753 container exec_died 4d46c406a3855fd1c322e5d86c048188ea5865daa07ba23aaebacc7879668923 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '07f3af15d79276418f54a8dde2fc251064527c3571430e14af77aaa2c01f1193-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, org.label-schema.schema-version=1.0) Dec 15 05:06:51 localhost systemd[1]: 
4d46c406a3855fd1c322e5d86c048188ea5865daa07ba23aaebacc7879668923.service: Deactivated successfully. Dec 15 05:06:52 localhost ceph-mon[298913]: mon.np0005559462@0(leader).osd e174 do_prune osdmap full prune enabled Dec 15 05:06:52 localhost ceph-mon[298913]: mon.np0005559462@0(leader).osd e175 e175: 6 total, 6 up, 6 in Dec 15 05:06:52 localhost neutron_dhcp_agent[267542]: 2025-12-15 10:06:52.362 267546 INFO neutron.agent.linux.ip_lib [None req-8705ef72-86bc-4e4c-b778-80e6944feac1 - - - - - -] Device tapfad1135b-b7 cannot be used as it has no MAC address#033[00m Dec 15 05:06:52 localhost ceph-mon[298913]: log_channel(cluster) log [DBG] : osdmap e175: 6 total, 6 up, 6 in Dec 15 05:06:52 localhost nova_compute[286344]: 2025-12-15 10:06:52.393 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:06:52 localhost kernel: device tapfad1135b-b7 entered promiscuous mode Dec 15 05:06:52 localhost NetworkManager[5963]: [1765793212.4046] manager: (tapfad1135b-b7): new Generic device (/org/freedesktop/NetworkManager/Devices/54) Dec 15 05:06:52 localhost ovn_controller[154603]: 2025-12-15T10:06:52Z|00339|binding|INFO|Claiming lport fad1135b-b7b1-48ac-8865-1d59a512e8ca for this chassis. Dec 15 05:06:52 localhost ovn_controller[154603]: 2025-12-15T10:06:52Z|00340|binding|INFO|fad1135b-b7b1-48ac-8865-1d59a512e8ca: Claiming unknown Dec 15 05:06:52 localhost nova_compute[286344]: 2025-12-15 10:06:52.406 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:06:52 localhost systemd-udevd[325315]: Network interface NamePolicy= disabled on kernel command line. 
Dec 15 05:06:52 localhost ovn_metadata_agent[160585]: 2025-12-15 10:06:52.417 160590 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005559462.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.255.242/28', 'neutron:device_id': 'dhcpce287b4b-a043-5ed9-8e08-4fd555d768a2-d4d65c77-4d50-4ac8-bdf3-cfe58cbd19a4', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-d4d65c77-4d50-4ac8-bdf3-cfe58cbd19a4', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '5a302f70917d47c392ee5c9b50e38a7e', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=856a2b9e-2f34-49e1-a56c-c5f5d8ff3083, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=fad1135b-b7b1-48ac-8865-1d59a512e8ca) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Dec 15 05:06:52 localhost ovn_metadata_agent[160585]: 2025-12-15 10:06:52.419 160590 INFO neutron.agent.ovn.metadata.agent [-] Port fad1135b-b7b1-48ac-8865-1d59a512e8ca in datapath d4d65c77-4d50-4ac8-bdf3-cfe58cbd19a4 bound to our chassis#033[00m Dec 15 05:06:52 localhost ovn_metadata_agent[160585]: 2025-12-15 10:06:52.420 160590 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network d4d65c77-4d50-4ac8-bdf3-cfe58cbd19a4 or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params 
/usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599#033[00m Dec 15 05:06:52 localhost ovn_metadata_agent[160585]: 2025-12-15 10:06:52.421 160858 DEBUG oslo.privsep.daemon [-] privsep: reply[69c746d8-f4fd-4cfb-94a5-9fbd162aa236]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Dec 15 05:06:52 localhost journal[231322]: ethtool ioctl error on tapfad1135b-b7: No such device Dec 15 05:06:52 localhost ovn_controller[154603]: 2025-12-15T10:06:52Z|00341|binding|INFO|Setting lport fad1135b-b7b1-48ac-8865-1d59a512e8ca ovn-installed in OVS Dec 15 05:06:52 localhost ovn_controller[154603]: 2025-12-15T10:06:52Z|00342|binding|INFO|Setting lport fad1135b-b7b1-48ac-8865-1d59a512e8ca up in Southbound Dec 15 05:06:52 localhost journal[231322]: ethtool ioctl error on tapfad1135b-b7: No such device Dec 15 05:06:52 localhost nova_compute[286344]: 2025-12-15 10:06:52.446 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:06:52 localhost nova_compute[286344]: 2025-12-15 10:06:52.448 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:06:52 localhost journal[231322]: ethtool ioctl error on tapfad1135b-b7: No such device Dec 15 05:06:52 localhost journal[231322]: ethtool ioctl error on tapfad1135b-b7: No such device Dec 15 05:06:52 localhost journal[231322]: ethtool ioctl error on tapfad1135b-b7: No such device Dec 15 05:06:52 localhost journal[231322]: ethtool ioctl error on tapfad1135b-b7: No such device Dec 15 05:06:52 localhost journal[231322]: ethtool ioctl error on tapfad1135b-b7: No such device Dec 15 05:06:52 localhost journal[231322]: ethtool ioctl error on tapfad1135b-b7: No such device Dec 15 05:06:52 localhost nova_compute[286344]: 2025-12-15 10:06:52.490 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 
__log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:06:52 localhost nova_compute[286344]: 2025-12-15 10:06:52.520 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:06:52 localhost podman[325313]: Dec 15 05:06:52 localhost podman[325313]: 2025-12-15 10:06:52.544679446 +0000 UTC m=+0.129885990 container create 1766c3f2c0572371e5e6912e882322615dc66e840787d1507ecc9b141d8b4bdc (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-c0669abd-aef1-4b0d-9f97-a6adeeac3211, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, tcib_managed=true, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb) Dec 15 05:06:52 localhost podman[325313]: 2025-12-15 10:06:52.483578062 +0000 UTC m=+0.068784666 image pull quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified Dec 15 05:06:52 localhost systemd[1]: Started libpod-conmon-1766c3f2c0572371e5e6912e882322615dc66e840787d1507ecc9b141d8b4bdc.scope. Dec 15 05:06:52 localhost systemd[1]: Started libcrun container. 
Dec 15 05:06:52 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/430edaeaa2fd4ca6404096d30c43d2ad9f7edc42f1d0fd48d5175f7c754be6bf/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff) Dec 15 05:06:52 localhost podman[325313]: 2025-12-15 10:06:52.617998245 +0000 UTC m=+0.203204769 container init 1766c3f2c0572371e5e6912e882322615dc66e840787d1507ecc9b141d8b4bdc (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-c0669abd-aef1-4b0d-9f97-a6adeeac3211, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team) Dec 15 05:06:52 localhost podman[325313]: 2025-12-15 10:06:52.626754363 +0000 UTC m=+0.211960887 container start 1766c3f2c0572371e5e6912e882322615dc66e840787d1507ecc9b141d8b4bdc (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-c0669abd-aef1-4b0d-9f97-a6adeeac3211, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, tcib_managed=true) Dec 15 05:06:52 localhost dnsmasq[325361]: started, version 2.85 cachesize 150 Dec 15 05:06:52 localhost dnsmasq[325361]: DNS service limited to local subnets Dec 15 05:06:52 localhost dnsmasq[325361]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile Dec 15 05:06:52 localhost dnsmasq[325361]: warning: no upstream servers 
configured Dec 15 05:06:52 localhost dnsmasq-dhcp[325361]: DHCP, static leases only on 10.100.0.0, lease time 1d Dec 15 05:06:52 localhost dnsmasq-dhcp[325361]: DHCPv6, static leases only on 2001:db8::, lease time 1d Dec 15 05:06:52 localhost dnsmasq[325361]: read /var/lib/neutron/dhcp/c0669abd-aef1-4b0d-9f97-a6adeeac3211/addn_hosts - 0 addresses Dec 15 05:06:52 localhost dnsmasq-dhcp[325361]: read /var/lib/neutron/dhcp/c0669abd-aef1-4b0d-9f97-a6adeeac3211/host Dec 15 05:06:52 localhost dnsmasq-dhcp[325361]: read /var/lib/neutron/dhcp/c0669abd-aef1-4b0d-9f97-a6adeeac3211/opts Dec 15 05:06:52 localhost neutron_dhcp_agent[267542]: 2025-12-15 10:06:52.878 267546 INFO neutron.agent.dhcp.agent [None req-c633af13-cd81-4221-8445-86220d50ece8 - - - - - -] DHCP configuration for ports {'79503367-f53f-4b35-8760-76fcaa4d8407', '5a4cf37c-37ea-429b-9caa-6e3e638f1ab2'} is completed#033[00m Dec 15 05:06:53 localhost neutron_sriov_agent[260044]: 2025-12-15 10:06:53.061 2 INFO neutron.agent.securitygroups_rpc [None req-77515c9a-f5d2-4a01-b9c4-951520042827 6b5da6f221214afe93e1fa66574f238b 89e710ef9f4f48d48a369002db572947 - - default default] Security group member updated ['a6c5f808-dddc-4f17-acbf-63b1b6e6f4d6']#033[00m Dec 15 05:06:53 localhost neutron_dhcp_agent[267542]: 2025-12-15 10:06:53.205 267546 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-12-15T10:06:52Z, description=, device_id=, device_owner=, dns_assignment=[, ], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[, ], id=58df24eb-539e-462a-8609-deb0790aed3d, ip_allocation=immediate, mac_address=fa:16:3e:92:f5:50, name=tempest-NetworksTestDHCPv6-936805709, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-12-15T10:05:39Z, description=, dns_domain=, 
id=c0669abd-aef1-4b0d-9f97-a6adeeac3211, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-NetworksTestDHCPv6-test-network-1861207414, port_security_enabled=True, project_id=89e710ef9f4f48d48a369002db572947, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=20673, qos_policy_id=None, revision_number=27, router:external=False, shared=False, standard_attr_id=2164, status=ACTIVE, subnets=['419ede2b-0185-45c9-8e44-93848f76a31f', 'dbae195c-1fbc-4426-ad6e-714894c13409'], tags=[], tenant_id=89e710ef9f4f48d48a369002db572947, updated_at=2025-12-15T10:06:49Z, vlan_transparent=None, network_id=c0669abd-aef1-4b0d-9f97-a6adeeac3211, port_security_enabled=True, project_id=89e710ef9f4f48d48a369002db572947, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=['a6c5f808-dddc-4f17-acbf-63b1b6e6f4d6'], standard_attr_id=2514, status=DOWN, tags=[], tenant_id=89e710ef9f4f48d48a369002db572947, updated_at=2025-12-15T10:06:52Z on network c0669abd-aef1-4b0d-9f97-a6adeeac3211#033[00m Dec 15 05:06:53 localhost nova_compute[286344]: 2025-12-15 10:06:53.305 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:06:53 localhost podman[325405]: Dec 15 05:06:53 localhost podman[325405]: 2025-12-15 10:06:53.362311907 +0000 UTC m=+0.094379452 container create 5a6468fef7e6af9cf1304e74e1115619a3eb0865b18496c87c34924ae65820eb (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-d4d65c77-4d50-4ac8-bdf3-cfe58cbd19a4, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.build-date=20251202, 
org.label-schema.schema-version=1.0) Dec 15 05:06:53 localhost ceph-mon[298913]: mon.np0005559462@0(leader).osd e175 do_prune osdmap full prune enabled Dec 15 05:06:53 localhost ceph-mon[298913]: mon.np0005559462@0(leader).osd e176 e176: 6 total, 6 up, 6 in Dec 15 05:06:53 localhost ceph-mon[298913]: log_channel(cluster) log [DBG] : osdmap e176: 6 total, 6 up, 6 in Dec 15 05:06:53 localhost systemd[1]: Started libpod-conmon-5a6468fef7e6af9cf1304e74e1115619a3eb0865b18496c87c34924ae65820eb.scope. Dec 15 05:06:53 localhost systemd[1]: Started libcrun container. Dec 15 05:06:53 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/c3f62309f462240b01951714780fa5be0069023265097e303c1b65def8ed0543/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff) Dec 15 05:06:53 localhost podman[325405]: 2025-12-15 10:06:53.321631509 +0000 UTC m=+0.053699124 image pull quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified Dec 15 05:06:53 localhost podman[325405]: 2025-12-15 10:06:53.427558165 +0000 UTC m=+0.159625750 container init 5a6468fef7e6af9cf1304e74e1115619a3eb0865b18496c87c34924ae65820eb (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-d4d65c77-4d50-4ac8-bdf3-cfe58cbd19a4, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team) Dec 15 05:06:53 localhost podman[325405]: 2025-12-15 10:06:53.440312652 +0000 UTC m=+0.172380227 container start 5a6468fef7e6af9cf1304e74e1115619a3eb0865b18496c87c34924ae65820eb (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-d4d65c77-4d50-4ac8-bdf3-cfe58cbd19a4, 
io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202) Dec 15 05:06:53 localhost dnsmasq[325450]: started, version 2.85 cachesize 150 Dec 15 05:06:53 localhost dnsmasq[325450]: DNS service limited to local subnets Dec 15 05:06:53 localhost dnsmasq[325450]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile Dec 15 05:06:53 localhost dnsmasq[325450]: warning: no upstream servers configured Dec 15 05:06:53 localhost dnsmasq-dhcp[325450]: DHCP, static leases only on 10.100.255.240, lease time 1d Dec 15 05:06:53 localhost dnsmasq[325450]: read /var/lib/neutron/dhcp/d4d65c77-4d50-4ac8-bdf3-cfe58cbd19a4/addn_hosts - 0 addresses Dec 15 05:06:53 localhost dnsmasq-dhcp[325450]: read /var/lib/neutron/dhcp/d4d65c77-4d50-4ac8-bdf3-cfe58cbd19a4/host Dec 15 05:06:53 localhost dnsmasq-dhcp[325450]: read /var/lib/neutron/dhcp/d4d65c77-4d50-4ac8-bdf3-cfe58cbd19a4/opts Dec 15 05:06:53 localhost dnsmasq[325361]: read /var/lib/neutron/dhcp/c0669abd-aef1-4b0d-9f97-a6adeeac3211/addn_hosts - 2 addresses Dec 15 05:06:53 localhost dnsmasq-dhcp[325361]: read /var/lib/neutron/dhcp/c0669abd-aef1-4b0d-9f97-a6adeeac3211/host Dec 15 05:06:53 localhost dnsmasq-dhcp[325361]: read /var/lib/neutron/dhcp/c0669abd-aef1-4b0d-9f97-a6adeeac3211/opts Dec 15 05:06:53 localhost podman[325435]: 2025-12-15 10:06:53.468968853 +0000 UTC m=+0.076149486 container kill 1766c3f2c0572371e5e6912e882322615dc66e840787d1507ecc9b141d8b4bdc (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-c0669abd-aef1-4b0d-9f97-a6adeeac3211, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, 
tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS) Dec 15 05:06:53 localhost neutron_dhcp_agent[267542]: 2025-12-15 10:06:53.975 267546 INFO neutron.agent.dhcp.agent [None req-ec14a0be-7acf-4635-9ff9-84add8965cb4 - - - - - -] DHCP configuration for ports {'809c2b21-605d-4b68-96b5-465ad48d6211'} is completed#033[00m Dec 15 05:06:54 localhost neutron_dhcp_agent[267542]: 2025-12-15 10:06:54.122 267546 INFO neutron.agent.dhcp.agent [None req-0c3b6c92-7990-49eb-9b25-474267be10bd - - - - - -] DHCP configuration for ports {'58df24eb-539e-462a-8609-deb0790aed3d'} is completed#033[00m Dec 15 05:06:55 localhost neutron_sriov_agent[260044]: 2025-12-15 10:06:55.003 2 INFO neutron.agent.securitygroups_rpc [None req-6127c903-3523-4b92-aaae-da9ee5e41d85 6b5da6f221214afe93e1fa66574f238b 89e710ef9f4f48d48a369002db572947 - - default default] Security group member updated ['a6c5f808-dddc-4f17-acbf-63b1b6e6f4d6']#033[00m Dec 15 05:06:55 localhost dnsmasq[325361]: read /var/lib/neutron/dhcp/c0669abd-aef1-4b0d-9f97-a6adeeac3211/addn_hosts - 0 addresses Dec 15 05:06:55 localhost dnsmasq-dhcp[325361]: read /var/lib/neutron/dhcp/c0669abd-aef1-4b0d-9f97-a6adeeac3211/host Dec 15 05:06:55 localhost dnsmasq-dhcp[325361]: read /var/lib/neutron/dhcp/c0669abd-aef1-4b0d-9f97-a6adeeac3211/opts Dec 15 05:06:55 localhost podman[325477]: 2025-12-15 10:06:55.240514918 +0000 UTC m=+0.061252140 container kill 1766c3f2c0572371e5e6912e882322615dc66e840787d1507ecc9b141d8b4bdc (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-c0669abd-aef1-4b0d-9f97-a6adeeac3211, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, 
org.label-schema.vendor=CentOS, tcib_managed=true, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team) Dec 15 05:06:55 localhost nova_compute[286344]: 2025-12-15 10:06:55.361 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:06:55 localhost systemd[1]: Started /usr/bin/podman healthcheck run a4e94bd7aaee6c174c601060fb5f34559019ccea93fc397263ee3735731cfc1e. Dec 15 05:06:55 localhost nova_compute[286344]: 2025-12-15 10:06:55.668 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:06:55 localhost podman[325501]: 2025-12-15 10:06:55.745267152 +0000 UTC m=+0.068736654 container health_status a4e94bd7aaee6c174c601060fb5f34559019ccea93fc397263ee3735731cfc1e (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible) Dec 15 05:06:55 localhost podman[325501]: 2025-12-15 10:06:55.756359845 +0000 UTC m=+0.079829387 container exec_died 
a4e94bd7aaee6c174c601060fb5f34559019ccea93fc397263ee3735731cfc1e (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}) Dec 15 05:06:55 localhost ceph-mon[298913]: mon.np0005559462@0(leader).osd e176 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104 Dec 15 05:06:55 localhost systemd[1]: a4e94bd7aaee6c174c601060fb5f34559019ccea93fc397263ee3735731cfc1e.service: Deactivated successfully. 
Dec 15 05:06:56 localhost dnsmasq[325361]: exiting on receipt of SIGTERM Dec 15 05:06:56 localhost podman[325538]: 2025-12-15 10:06:56.084939678 +0000 UTC m=+0.058029162 container kill 1766c3f2c0572371e5e6912e882322615dc66e840787d1507ecc9b141d8b4bdc (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-c0669abd-aef1-4b0d-9f97-a6adeeac3211, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_managed=true) Dec 15 05:06:56 localhost systemd[1]: libpod-1766c3f2c0572371e5e6912e882322615dc66e840787d1507ecc9b141d8b4bdc.scope: Deactivated successfully. Dec 15 05:06:56 localhost podman[325553]: 2025-12-15 10:06:56.164131506 +0000 UTC m=+0.058082143 container died 1766c3f2c0572371e5e6912e882322615dc66e840787d1507ecc9b141d8b4bdc (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-c0669abd-aef1-4b0d-9f97-a6adeeac3211, tcib_managed=true, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3) Dec 15 05:06:56 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-1766c3f2c0572371e5e6912e882322615dc66e840787d1507ecc9b141d8b4bdc-userdata-shm.mount: Deactivated successfully. 
Dec 15 05:06:56 localhost podman[325553]: 2025-12-15 10:06:56.262095825 +0000 UTC m=+0.156046422 container remove 1766c3f2c0572371e5e6912e882322615dc66e840787d1507ecc9b141d8b4bdc (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-c0669abd-aef1-4b0d-9f97-a6adeeac3211, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.vendor=CentOS, tcib_managed=true) Dec 15 05:06:56 localhost systemd[1]: libpod-conmon-1766c3f2c0572371e5e6912e882322615dc66e840787d1507ecc9b141d8b4bdc.scope: Deactivated successfully. Dec 15 05:06:56 localhost systemd[1]: var-lib-containers-storage-overlay-430edaeaa2fd4ca6404096d30c43d2ad9f7edc42f1d0fd48d5175f7c754be6bf-merged.mount: Deactivated successfully. Dec 15 05:06:57 localhost podman[325629]: Dec 15 05:06:57 localhost podman[325629]: 2025-12-15 10:06:57.119455929 +0000 UTC m=+0.085131711 container create 940bd7aa79d3e137c10930a19ecc1ed6127167c4efdb2aba24286dc3c1f13f6f (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-c0669abd-aef1-4b0d-9f97-a6adeeac3211, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team) Dec 15 05:06:57 localhost systemd[1]: Started libpod-conmon-940bd7aa79d3e137c10930a19ecc1ed6127167c4efdb2aba24286dc3c1f13f6f.scope. Dec 15 05:06:57 localhost systemd[1]: tmp-crun.q0g1Qs.mount: Deactivated successfully. 
Dec 15 05:06:57 localhost podman[325629]: 2025-12-15 10:06:57.0791441 +0000 UTC m=+0.044819892 image pull quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified Dec 15 05:06:57 localhost systemd[1]: Started libcrun container. Dec 15 05:06:57 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/6d493d4211b05fb23aa42b95ba89e2628e295e06eb863fb2b8f3164638845e49/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff) Dec 15 05:06:57 localhost podman[325629]: 2025-12-15 10:06:57.203035266 +0000 UTC m=+0.168711038 container init 940bd7aa79d3e137c10930a19ecc1ed6127167c4efdb2aba24286dc3c1f13f6f (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-c0669abd-aef1-4b0d-9f97-a6adeeac3211, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image) Dec 15 05:06:57 localhost podman[325629]: 2025-12-15 10:06:57.212095214 +0000 UTC m=+0.177770986 container start 940bd7aa79d3e137c10930a19ecc1ed6127167c4efdb2aba24286dc3c1f13f6f (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-c0669abd-aef1-4b0d-9f97-a6adeeac3211, org.label-schema.build-date=20251202, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team) Dec 15 05:06:57 localhost dnsmasq[325647]: started, version 2.85 cachesize 150 Dec 15 05:06:57 localhost dnsmasq[325647]: DNS service limited to local subnets Dec 15 05:06:57 localhost 
dnsmasq[325647]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile Dec 15 05:06:57 localhost dnsmasq[325647]: warning: no upstream servers configured Dec 15 05:06:57 localhost dnsmasq-dhcp[325647]: DHCPv6, static leases only on 2001:db8::, lease time 1d Dec 15 05:06:57 localhost dnsmasq[325647]: read /var/lib/neutron/dhcp/c0669abd-aef1-4b0d-9f97-a6adeeac3211/addn_hosts - 0 addresses Dec 15 05:06:57 localhost dnsmasq-dhcp[325647]: read /var/lib/neutron/dhcp/c0669abd-aef1-4b0d-9f97-a6adeeac3211/host Dec 15 05:06:57 localhost dnsmasq-dhcp[325647]: read /var/lib/neutron/dhcp/c0669abd-aef1-4b0d-9f97-a6adeeac3211/opts Dec 15 05:06:57 localhost neutron_dhcp_agent[267542]: 2025-12-15 10:06:57.516 267546 INFO neutron.agent.dhcp.agent [None req-6e877438-6a6a-4027-9e42-a5a6629b0f0e - - - - - -] DHCP configuration for ports {'79503367-f53f-4b35-8760-76fcaa4d8407', '5a4cf37c-37ea-429b-9caa-6e3e638f1ab2'} is completed#033[00m Dec 15 05:06:57 localhost dnsmasq[325647]: exiting on receipt of SIGTERM Dec 15 05:06:57 localhost podman[325666]: 2025-12-15 10:06:57.6365697 +0000 UTC m=+0.060104779 container kill 940bd7aa79d3e137c10930a19ecc1ed6127167c4efdb2aba24286dc3c1f13f6f (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-c0669abd-aef1-4b0d-9f97-a6adeeac3211, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, tcib_managed=true) Dec 15 05:06:57 localhost systemd[1]: libpod-940bd7aa79d3e137c10930a19ecc1ed6127167c4efdb2aba24286dc3c1f13f6f.scope: Deactivated successfully. 
Dec 15 05:06:57 localhost podman[325681]: 2025-12-15 10:06:57.717517046 +0000 UTC m=+0.063764559 container died 940bd7aa79d3e137c10930a19ecc1ed6127167c4efdb2aba24286dc3c1f13f6f (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-c0669abd-aef1-4b0d-9f97-a6adeeac3211, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team) Dec 15 05:06:57 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-940bd7aa79d3e137c10930a19ecc1ed6127167c4efdb2aba24286dc3c1f13f6f-userdata-shm.mount: Deactivated successfully. Dec 15 05:06:57 localhost systemd[1]: var-lib-containers-storage-overlay-6d493d4211b05fb23aa42b95ba89e2628e295e06eb863fb2b8f3164638845e49-merged.mount: Deactivated successfully. Dec 15 05:06:57 localhost podman[325681]: 2025-12-15 10:06:57.751100641 +0000 UTC m=+0.097348104 container cleanup 940bd7aa79d3e137c10930a19ecc1ed6127167c4efdb2aba24286dc3c1f13f6f (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-c0669abd-aef1-4b0d-9f97-a6adeeac3211, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS) Dec 15 05:06:57 localhost systemd[1]: libpod-conmon-940bd7aa79d3e137c10930a19ecc1ed6127167c4efdb2aba24286dc3c1f13f6f.scope: Deactivated successfully. 
Dec 15 05:06:57 localhost podman[325682]: 2025-12-15 10:06:57.789715633 +0000 UTC m=+0.131033022 container remove 940bd7aa79d3e137c10930a19ecc1ed6127167c4efdb2aba24286dc3c1f13f6f (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-c0669abd-aef1-4b0d-9f97-a6adeeac3211, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true) Dec 15 05:06:58 localhost nova_compute[286344]: 2025-12-15 10:06:58.309 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:06:58 localhost ovn_metadata_agent[160585]: 2025-12-15 10:06:58.424 160590 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:04:2e:5c 10.100.0.2'], port_security=[], type=localport, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': ''}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 'ovnmeta-c0669abd-aef1-4b0d-9f97-a6adeeac3211', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-c0669abd-aef1-4b0d-9f97-a6adeeac3211', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '89e710ef9f4f48d48a369002db572947', 'neutron:revision_number': '10', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], 
mirror_rules=[], datapath=e47821dc-5f5d-44dc-8a16-54817df4049d, chassis=[], tunnel_key=1, gateway_chassis=[], requested_chassis=[], logical_port=79503367-f53f-4b35-8760-76fcaa4d8407) old=Port_Binding(mac=['fa:16:3e:04:2e:5c 10.100.0.2 2001:db8::f816:3eff:fe04:2e5c'], external_ids={'neutron:cidrs': '10.100.0.2/28 2001:db8::f816:3eff:fe04:2e5c/64', 'neutron:device_id': 'ovnmeta-c0669abd-aef1-4b0d-9f97-a6adeeac3211', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-c0669abd-aef1-4b0d-9f97-a6adeeac3211', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '89e710ef9f4f48d48a369002db572947', 'neutron:revision_number': '7', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Dec 15 05:06:58 localhost ovn_metadata_agent[160585]: 2025-12-15 10:06:58.426 160590 INFO neutron.agent.ovn.metadata.agent [-] Metadata Port 79503367-f53f-4b35-8760-76fcaa4d8407 in datapath c0669abd-aef1-4b0d-9f97-a6adeeac3211 updated#033[00m Dec 15 05:06:58 localhost ovn_metadata_agent[160585]: 2025-12-15 10:06:58.429 160590 DEBUG neutron.agent.ovn.metadata.agent [-] Port c78598bf-e416-478a-bccf-cf1e43fa87b5 IP addresses were not retrieved from the Port_Binding MAC column ['unknown'] _get_port_ips /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:536#033[00m Dec 15 05:06:58 localhost ovn_metadata_agent[160585]: 2025-12-15 10:06:58.429 160590 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network c0669abd-aef1-4b0d-9f97-a6adeeac3211, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m Dec 15 05:06:58 localhost ovn_metadata_agent[160585]: 2025-12-15 10:06:58.430 160858 DEBUG oslo.privsep.daemon [-] privsep: 
reply[94503dc8-4409-4d69-906f-ed730a53206a]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Dec 15 05:06:59 localhost neutron_dhcp_agent[267542]: 2025-12-15 10:06:59.481 267546 INFO neutron.agent.linux.ip_lib [None req-08c6ffb4-8b01-4b46-9672-cc7462b563a6 - - - - - -] Device tap9ed4e45d-85 cannot be used as it has no MAC address#033[00m Dec 15 05:06:59 localhost nova_compute[286344]: 2025-12-15 10:06:59.506 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:06:59 localhost kernel: device tap9ed4e45d-85 entered promiscuous mode Dec 15 05:06:59 localhost NetworkManager[5963]: [1765793219.5135] manager: (tap9ed4e45d-85): new Generic device (/org/freedesktop/NetworkManager/Devices/55) Dec 15 05:06:59 localhost ovn_controller[154603]: 2025-12-15T10:06:59Z|00343|binding|INFO|Claiming lport 9ed4e45d-85f0-450a-a36f-0d214fbddc6c for this chassis. Dec 15 05:06:59 localhost ovn_controller[154603]: 2025-12-15T10:06:59Z|00344|binding|INFO|9ed4e45d-85f0-450a-a36f-0d214fbddc6c: Claiming unknown Dec 15 05:06:59 localhost nova_compute[286344]: 2025-12-15 10:06:59.515 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:06:59 localhost systemd-udevd[325781]: Network interface NamePolicy= disabled on kernel command line. 
Dec 15 05:06:59 localhost ovn_metadata_agent[160585]: 2025-12-15 10:06:59.528 160590 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005559462.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 'dhcpce287b4b-a043-5ed9-8e08-4fd555d768a2-6211c52b-4c8a-4698-aaba-53022274894d', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-6211c52b-4c8a-4698-aaba-53022274894d', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '5ccee293c21a4d25b8692241c1f8fb63', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=19cc8789-4d1f-411f-8f3c-326f524c8ece, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=9ed4e45d-85f0-450a-a36f-0d214fbddc6c) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Dec 15 05:06:59 localhost ovn_metadata_agent[160585]: 2025-12-15 10:06:59.529 160590 INFO neutron.agent.ovn.metadata.agent [-] Port 9ed4e45d-85f0-450a-a36f-0d214fbddc6c in datapath 6211c52b-4c8a-4698-aaba-53022274894d bound to our chassis#033[00m Dec 15 05:06:59 localhost ovn_metadata_agent[160585]: 2025-12-15 10:06:59.531 160590 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network 6211c52b-4c8a-4698-aaba-53022274894d or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params 
/usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599#033[00m Dec 15 05:06:59 localhost ovn_metadata_agent[160585]: 2025-12-15 10:06:59.532 160858 DEBUG oslo.privsep.daemon [-] privsep: reply[9a766748-8a02-40cd-b92a-4296d79a3f94]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Dec 15 05:06:59 localhost podman[325764]: Dec 15 05:06:59 localhost podman[325764]: 2025-12-15 10:06:59.551837441 +0000 UTC m=+0.092557663 container create a05c9bda888177f0343ff9390b358df9ec9e3db09344875e84a540fb63a83c1e (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-c0669abd-aef1-4b0d-9f97-a6adeeac3211, org.label-schema.license=GPLv2, tcib_managed=true, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0) Dec 15 05:06:59 localhost ovn_controller[154603]: 2025-12-15T10:06:59Z|00345|binding|INFO|Setting lport 9ed4e45d-85f0-450a-a36f-0d214fbddc6c ovn-installed in OVS Dec 15 05:06:59 localhost ovn_controller[154603]: 2025-12-15T10:06:59Z|00346|binding|INFO|Setting lport 9ed4e45d-85f0-450a-a36f-0d214fbddc6c up in Southbound Dec 15 05:06:59 localhost nova_compute[286344]: 2025-12-15 10:06:59.573 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:06:59 localhost systemd[1]: Started libpod-conmon-a05c9bda888177f0343ff9390b358df9ec9e3db09344875e84a540fb63a83c1e.scope. 
Dec 15 05:06:59 localhost podman[325764]: 2025-12-15 10:06:59.506571937 +0000 UTC m=+0.047292219 image pull quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified Dec 15 05:06:59 localhost nova_compute[286344]: 2025-12-15 10:06:59.614 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:06:59 localhost systemd[1]: Started libcrun container. Dec 15 05:06:59 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/69d36dd0eea241fcedfd64e792543f446a3bd867043baa0850fbb3e4c6395c29/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff) Dec 15 05:06:59 localhost nova_compute[286344]: 2025-12-15 10:06:59.641 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:06:59 localhost podman[325764]: 2025-12-15 10:06:59.648201416 +0000 UTC m=+0.188921638 container init a05c9bda888177f0343ff9390b358df9ec9e3db09344875e84a540fb63a83c1e (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-c0669abd-aef1-4b0d-9f97-a6adeeac3211, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team) Dec 15 05:06:59 localhost podman[325764]: 2025-12-15 10:06:59.65784638 +0000 UTC m=+0.198566592 container start a05c9bda888177f0343ff9390b358df9ec9e3db09344875e84a540fb63a83c1e (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-c0669abd-aef1-4b0d-9f97-a6adeeac3211, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, maintainer=OpenStack 
Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, io.buildah.version=1.41.3) Dec 15 05:06:59 localhost dnsmasq[325797]: started, version 2.85 cachesize 150 Dec 15 05:06:59 localhost dnsmasq[325797]: DNS service limited to local subnets Dec 15 05:06:59 localhost dnsmasq[325797]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile Dec 15 05:06:59 localhost dnsmasq[325797]: warning: no upstream servers configured Dec 15 05:06:59 localhost dnsmasq-dhcp[325797]: DHCP, static leases only on 10.100.0.0, lease time 1d Dec 15 05:06:59 localhost dnsmasq[325797]: read /var/lib/neutron/dhcp/c0669abd-aef1-4b0d-9f97-a6adeeac3211/addn_hosts - 0 addresses Dec 15 05:06:59 localhost dnsmasq-dhcp[325797]: read /var/lib/neutron/dhcp/c0669abd-aef1-4b0d-9f97-a6adeeac3211/host Dec 15 05:06:59 localhost dnsmasq-dhcp[325797]: read /var/lib/neutron/dhcp/c0669abd-aef1-4b0d-9f97-a6adeeac3211/opts Dec 15 05:06:59 localhost neutron_dhcp_agent[267542]: 2025-12-15 10:06:59.838 267546 INFO neutron.agent.dhcp.agent [None req-69b25ee2-dade-4003-9467-592192c99a1f - - - - - -] DHCP configuration for ports {'79503367-f53f-4b35-8760-76fcaa4d8407', '5a4cf37c-37ea-429b-9caa-6e3e638f1ab2'} is completed#033[00m Dec 15 05:07:00 localhost systemd[1]: tmp-crun.IlwOLA.mount: Deactivated successfully. 
Dec 15 05:07:00 localhost podman[325843]: Dec 15 05:07:00 localhost podman[325843]: 2025-12-15 10:07:00.465152569 +0000 UTC m=+0.081344648 container create 7b4d183efe37b51e41316f0b47169ea601879afda0d8986486eddea57e3b7d7c (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-6211c52b-4c8a-4698-aaba-53022274894d, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.build-date=20251202, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb) Dec 15 05:07:00 localhost systemd[1]: Started libpod-conmon-7b4d183efe37b51e41316f0b47169ea601879afda0d8986486eddea57e3b7d7c.scope. Dec 15 05:07:00 localhost systemd[1]: Started libcrun container. Dec 15 05:07:00 localhost podman[325843]: 2025-12-15 10:07:00.429757664 +0000 UTC m=+0.045949723 image pull quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified Dec 15 05:07:00 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/d2775776f0e273fb3d05b272d95f1ffd06426af493f94ead53af40fff91d2ecd/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff) Dec 15 05:07:00 localhost podman[325843]: 2025-12-15 10:07:00.542272629 +0000 UTC m=+0.158464698 container init 7b4d183efe37b51e41316f0b47169ea601879afda0d8986486eddea57e3b7d7c (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-6211c52b-4c8a-4698-aaba-53022274894d, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb) Dec 15 
05:07:00 localhost podman[325843]: 2025-12-15 10:07:00.55182036 +0000 UTC m=+0.168012419 container start 7b4d183efe37b51e41316f0b47169ea601879afda0d8986486eddea57e3b7d7c (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-6211c52b-4c8a-4698-aaba-53022274894d, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS) Dec 15 05:07:00 localhost dnsmasq[325890]: started, version 2.85 cachesize 150 Dec 15 05:07:00 localhost dnsmasq[325890]: DNS service limited to local subnets Dec 15 05:07:00 localhost dnsmasq[325890]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile Dec 15 05:07:00 localhost dnsmasq[325890]: warning: no upstream servers configured Dec 15 05:07:00 localhost dnsmasq-dhcp[325890]: DHCP, static leases only on 10.100.0.0, lease time 1d Dec 15 05:07:00 localhost dnsmasq[325890]: read /var/lib/neutron/dhcp/6211c52b-4c8a-4698-aaba-53022274894d/addn_hosts - 0 addresses Dec 15 05:07:00 localhost dnsmasq-dhcp[325890]: read /var/lib/neutron/dhcp/6211c52b-4c8a-4698-aaba-53022274894d/host Dec 15 05:07:00 localhost dnsmasq-dhcp[325890]: read /var/lib/neutron/dhcp/6211c52b-4c8a-4698-aaba-53022274894d/opts Dec 15 05:07:00 localhost nova_compute[286344]: 2025-12-15 10:07:00.639 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:07:00 localhost nova_compute[286344]: 2025-12-15 10:07:00.670 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 
05:07:00 localhost ceph-mon[298913]: mon.np0005559462@0(leader).osd e176 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104 Dec 15 05:07:00 localhost ceph-mon[298913]: mon.np0005559462@0(leader).osd e176 do_prune osdmap full prune enabled Dec 15 05:07:00 localhost neutron_dhcp_agent[267542]: 2025-12-15 10:07:00.803 267546 INFO neutron.agent.dhcp.agent [None req-e8ce4db5-d185-417d-82b7-941adfa57383 - - - - - -] DHCP configuration for ports {'07a159b2-e3ae-41f1-89e9-c9feb42e38bc'} is completed#033[00m Dec 15 05:07:01 localhost ceph-mon[298913]: mon.np0005559462@0(leader).osd e177 e177: 6 total, 6 up, 6 in Dec 15 05:07:01 localhost ceph-mon[298913]: log_channel(cluster) log [DBG] : osdmap e177: 6 total, 6 up, 6 in Dec 15 05:07:01 localhost ceph-mon[298913]: mon.np0005559462@0(leader) e17 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/osd_remove_queue}] v 0) Dec 15 05:07:01 localhost ceph-mon[298913]: log_channel(audit) log [INF] : from='mgr.34411 ' entity='mgr.np0005559464.aomnqe' Dec 15 05:07:01 localhost podman[243449]: time="2025-12-15T10:07:01Z" level=info msg="List containers: received `last` parameter - overwriting `limit`" Dec 15 05:07:01 localhost podman[243449]: @ - - [15/Dec/2025:10:07:01 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 162116 "" "Go-http-client/1.1" Dec 15 05:07:01 localhost podman[243449]: @ - - [15/Dec/2025:10:07:01 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 20668 "" "Go-http-client/1.1" Dec 15 05:07:02 localhost ceph-mon[298913]: from='mgr.34411 172.18.0.108:0/929118679' entity='mgr.np0005559464.aomnqe' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch Dec 15 05:07:02 localhost ceph-mon[298913]: from='mgr.34411 ' entity='mgr.np0005559464.aomnqe' Dec 15 05:07:02 localhost ovn_metadata_agent[160585]: 2025-12-15 10:07:02.518 
160590 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:04:2e:5c 10.100.0.2 2001:db8::f816:3eff:fe04:2e5c'], port_security=[], type=localport, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': ''}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28 2001:db8::f816:3eff:fe04:2e5c/64', 'neutron:device_id': 'ovnmeta-c0669abd-aef1-4b0d-9f97-a6adeeac3211', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-c0669abd-aef1-4b0d-9f97-a6adeeac3211', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '89e710ef9f4f48d48a369002db572947', 'neutron:revision_number': '11', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=e47821dc-5f5d-44dc-8a16-54817df4049d, chassis=[], tunnel_key=1, gateway_chassis=[], requested_chassis=[], logical_port=79503367-f53f-4b35-8760-76fcaa4d8407) old=Port_Binding(mac=['fa:16:3e:04:2e:5c 10.100.0.2'], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 'ovnmeta-c0669abd-aef1-4b0d-9f97-a6adeeac3211', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-c0669abd-aef1-4b0d-9f97-a6adeeac3211', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '89e710ef9f4f48d48a369002db572947', 'neutron:revision_number': '10', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Dec 15 05:07:02 localhost 
ovn_metadata_agent[160585]: 2025-12-15 10:07:02.520 160590 INFO neutron.agent.ovn.metadata.agent [-] Metadata Port 79503367-f53f-4b35-8760-76fcaa4d8407 in datapath c0669abd-aef1-4b0d-9f97-a6adeeac3211 updated#033[00m Dec 15 05:07:02 localhost ovn_metadata_agent[160585]: 2025-12-15 10:07:02.522 160590 DEBUG neutron.agent.ovn.metadata.agent [-] Port c78598bf-e416-478a-bccf-cf1e43fa87b5 IP addresses were not retrieved from the Port_Binding MAC column ['unknown'] _get_port_ips /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:536#033[00m Dec 15 05:07:02 localhost ovn_metadata_agent[160585]: 2025-12-15 10:07:02.523 160590 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network c0669abd-aef1-4b0d-9f97-a6adeeac3211, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m Dec 15 05:07:02 localhost ovn_metadata_agent[160585]: 2025-12-15 10:07:02.523 160858 DEBUG oslo.privsep.daemon [-] privsep: reply[49fbfb5c-ce17-4096-97f1-6f6182f919a6]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Dec 15 05:07:03 localhost podman[325964]: 2025-12-15 10:07:03.021095788 +0000 UTC m=+0.061577349 container kill a05c9bda888177f0343ff9390b358df9ec9e3db09344875e84a540fb63a83c1e (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-c0669abd-aef1-4b0d-9f97-a6adeeac3211, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS) Dec 15 05:07:03 localhost dnsmasq[325797]: exiting on receipt of SIGTERM Dec 15 05:07:03 localhost systemd[1]: 
libpod-a05c9bda888177f0343ff9390b358df9ec9e3db09344875e84a540fb63a83c1e.scope: Deactivated successfully. Dec 15 05:07:03 localhost podman[325979]: 2025-12-15 10:07:03.090841739 +0000 UTC m=+0.052425310 container died a05c9bda888177f0343ff9390b358df9ec9e3db09344875e84a540fb63a83c1e (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-c0669abd-aef1-4b0d-9f97-a6adeeac3211, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image) Dec 15 05:07:03 localhost podman[325979]: 2025-12-15 10:07:03.135158967 +0000 UTC m=+0.096742488 container cleanup a05c9bda888177f0343ff9390b358df9ec9e3db09344875e84a540fb63a83c1e (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-c0669abd-aef1-4b0d-9f97-a6adeeac3211, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS) Dec 15 05:07:03 localhost systemd[1]: libpod-conmon-a05c9bda888177f0343ff9390b358df9ec9e3db09344875e84a540fb63a83c1e.scope: Deactivated successfully. 
Dec 15 05:07:03 localhost podman[325980]: 2025-12-15 10:07:03.173038858 +0000 UTC m=+0.130555438 container remove a05c9bda888177f0343ff9390b358df9ec9e3db09344875e84a540fb63a83c1e (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-c0669abd-aef1-4b0d-9f97-a6adeeac3211, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team) Dec 15 05:07:03 localhost nova_compute[286344]: 2025-12-15 10:07:03.313 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:07:03 localhost neutron_sriov_agent[260044]: 2025-12-15 10:07:03.853 2 INFO neutron.agent.securitygroups_rpc [None req-0f32bd5c-60e4-48e1-9b51-87a3617deb30 6b5da6f221214afe93e1fa66574f238b 89e710ef9f4f48d48a369002db572947 - - default default] Security group member updated ['a6c5f808-dddc-4f17-acbf-63b1b6e6f4d6']#033[00m Dec 15 05:07:04 localhost systemd[1]: var-lib-containers-storage-overlay-69d36dd0eea241fcedfd64e792543f446a3bd867043baa0850fbb3e4c6395c29-merged.mount: Deactivated successfully. Dec 15 05:07:04 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-a05c9bda888177f0343ff9390b358df9ec9e3db09344875e84a540fb63a83c1e-userdata-shm.mount: Deactivated successfully. 
Dec 15 05:07:04 localhost podman[326060]: Dec 15 05:07:04 localhost podman[326060]: 2025-12-15 10:07:04.038629136 +0000 UTC m=+0.093710535 container create 47c9545b814d0e13a7278d08b35f44a8a679c4a9b29593501eec1eac29a0b222 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-c0669abd-aef1-4b0d-9f97-a6adeeac3211, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, tcib_managed=true) Dec 15 05:07:04 localhost systemd[1]: Started libpod-conmon-47c9545b814d0e13a7278d08b35f44a8a679c4a9b29593501eec1eac29a0b222.scope. Dec 15 05:07:04 localhost systemd[1]: tmp-crun.i8QI91.mount: Deactivated successfully. Dec 15 05:07:04 localhost podman[326060]: 2025-12-15 10:07:03.992673354 +0000 UTC m=+0.047754773 image pull quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified Dec 15 05:07:04 localhost systemd[1]: Started libcrun container. 
Dec 15 05:07:04 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/c3e50734e468984bc5e1cee399cd3615b59dc782fc17e71a3d8df29c2d53b4fb/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff) Dec 15 05:07:04 localhost podman[326060]: 2025-12-15 10:07:04.126162062 +0000 UTC m=+0.181243461 container init 47c9545b814d0e13a7278d08b35f44a8a679c4a9b29593501eec1eac29a0b222 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-c0669abd-aef1-4b0d-9f97-a6adeeac3211, org.label-schema.license=GPLv2, org.label-schema.build-date=20251202, tcib_managed=true, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team) Dec 15 05:07:04 localhost podman[326060]: 2025-12-15 10:07:04.136182964 +0000 UTC m=+0.191264363 container start 47c9545b814d0e13a7278d08b35f44a8a679c4a9b29593501eec1eac29a0b222 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-c0669abd-aef1-4b0d-9f97-a6adeeac3211, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb) Dec 15 05:07:04 localhost dnsmasq[326078]: started, version 2.85 cachesize 150 Dec 15 05:07:04 localhost dnsmasq[326078]: DNS service limited to local subnets Dec 15 05:07:04 localhost dnsmasq[326078]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile Dec 15 05:07:04 localhost dnsmasq[326078]: warning: no upstream servers 
configured Dec 15 05:07:04 localhost dnsmasq-dhcp[326078]: DHCP, static leases only on 10.100.0.0, lease time 1d Dec 15 05:07:04 localhost dnsmasq[326078]: read /var/lib/neutron/dhcp/c0669abd-aef1-4b0d-9f97-a6adeeac3211/addn_hosts - 0 addresses Dec 15 05:07:04 localhost dnsmasq-dhcp[326078]: read /var/lib/neutron/dhcp/c0669abd-aef1-4b0d-9f97-a6adeeac3211/host Dec 15 05:07:04 localhost dnsmasq-dhcp[326078]: read /var/lib/neutron/dhcp/c0669abd-aef1-4b0d-9f97-a6adeeac3211/opts Dec 15 05:07:04 localhost neutron_dhcp_agent[267542]: 2025-12-15 10:07:04.201 267546 INFO neutron.agent.dhcp.agent [None req-a1980a10-f2f4-4ac0-88cc-bf2f7349ff22 - - - - - -] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-12-15T10:07:03Z, description=, device_id=, device_owner=, dns_assignment=[, ], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[, ], id=a524e10a-08ca-44f6-a8bd-0045c8ba4114, ip_allocation=immediate, mac_address=fa:16:3e:61:5c:ad, name=tempest-NetworksTestDHCPv6-1716667796, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-12-15T10:05:39Z, description=, dns_domain=, id=c0669abd-aef1-4b0d-9f97-a6adeeac3211, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-NetworksTestDHCPv6-test-network-1861207414, port_security_enabled=True, project_id=89e710ef9f4f48d48a369002db572947, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=20673, qos_policy_id=None, revision_number=31, router:external=False, shared=False, standard_attr_id=2164, status=ACTIVE, subnets=['027833e8-d67d-428e-a6b5-ec4551c434b5', 'bee495dc-dab8-418d-ba8b-a53a1ebb8975'], tags=[], tenant_id=89e710ef9f4f48d48a369002db572947, updated_at=2025-12-15T10:06:59Z, vlan_transparent=None, network_id=c0669abd-aef1-4b0d-9f97-a6adeeac3211, 
port_security_enabled=True, project_id=89e710ef9f4f48d48a369002db572947, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=['a6c5f808-dddc-4f17-acbf-63b1b6e6f4d6'], standard_attr_id=2602, status=DOWN, tags=[], tenant_id=89e710ef9f4f48d48a369002db572947, updated_at=2025-12-15T10:07:03Z on network c0669abd-aef1-4b0d-9f97-a6adeeac3211#033[00m Dec 15 05:07:04 localhost podman[326097]: 2025-12-15 10:07:04.475388087 +0000 UTC m=+0.058741981 container kill 47c9545b814d0e13a7278d08b35f44a8a679c4a9b29593501eec1eac29a0b222 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-c0669abd-aef1-4b0d-9f97-a6adeeac3211, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2) Dec 15 05:07:04 localhost dnsmasq[326078]: read /var/lib/neutron/dhcp/c0669abd-aef1-4b0d-9f97-a6adeeac3211/addn_hosts - 2 addresses Dec 15 05:07:04 localhost dnsmasq-dhcp[326078]: read /var/lib/neutron/dhcp/c0669abd-aef1-4b0d-9f97-a6adeeac3211/host Dec 15 05:07:04 localhost dnsmasq-dhcp[326078]: read /var/lib/neutron/dhcp/c0669abd-aef1-4b0d-9f97-a6adeeac3211/opts Dec 15 05:07:04 localhost neutron_dhcp_agent[267542]: 2025-12-15 10:07:04.478 267546 INFO neutron.agent.dhcp.agent [None req-81889308-960b-4203-ad01-70e5d0b0387a - - - - - -] DHCP configuration for ports {'79503367-f53f-4b35-8760-76fcaa4d8407', '5a4cf37c-37ea-429b-9caa-6e3e638f1ab2'} is completed#033[00m Dec 15 05:07:04 localhost ceph-mon[298913]: mon.np0005559462@0(leader) e17 handle_command mon_command([{prefix=config-key set, key=mgr/progress/completed}] v 0) Dec 15 05:07:04 localhost ceph-mon[298913]: log_channel(audit) log [INF] : from='mgr.34411 ' 
entity='mgr.np0005559464.aomnqe' Dec 15 05:07:04 localhost neutron_sriov_agent[260044]: 2025-12-15 10:07:04.830 2 INFO neutron.agent.securitygroups_rpc [None req-18efe7cb-803c-4f30-9bff-0e6cc2fcd641 6b5da6f221214afe93e1fa66574f238b 89e710ef9f4f48d48a369002db572947 - - default default] Security group member updated ['a6c5f808-dddc-4f17-acbf-63b1b6e6f4d6']#033[00m Dec 15 05:07:04 localhost openstack_network_exporter[246484]: ERROR 10:07:04 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd Dec 15 05:07:04 localhost openstack_network_exporter[246484]: ERROR 10:07:04 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd Dec 15 05:07:04 localhost openstack_network_exporter[246484]: ERROR 10:07:04 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server Dec 15 05:07:04 localhost openstack_network_exporter[246484]: ERROR 10:07:04 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath Dec 15 05:07:04 localhost openstack_network_exporter[246484]: Dec 15 05:07:04 localhost openstack_network_exporter[246484]: ERROR 10:07:04 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath Dec 15 05:07:04 localhost openstack_network_exporter[246484]: Dec 15 05:07:04 localhost neutron_dhcp_agent[267542]: 2025-12-15 10:07:04.862 267546 INFO neutron.agent.dhcp.agent [None req-3578f8ad-e3cc-457d-a3f8-1f7412523a47 - - - - - -] DHCP configuration for ports {'a524e10a-08ca-44f6-a8bd-0045c8ba4114'} is completed#033[00m Dec 15 05:07:04 localhost nova_compute[286344]: 2025-12-15 10:07:04.918 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:07:05 localhost dnsmasq[326078]: read /var/lib/neutron/dhcp/c0669abd-aef1-4b0d-9f97-a6adeeac3211/addn_hosts - 0 addresses Dec 15 05:07:05 localhost dnsmasq-dhcp[326078]: read 
/var/lib/neutron/dhcp/c0669abd-aef1-4b0d-9f97-a6adeeac3211/host Dec 15 05:07:05 localhost podman[326138]: 2025-12-15 10:07:05.199761957 +0000 UTC m=+0.057298643 container kill 47c9545b814d0e13a7278d08b35f44a8a679c4a9b29593501eec1eac29a0b222 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-c0669abd-aef1-4b0d-9f97-a6adeeac3211, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true) Dec 15 05:07:05 localhost dnsmasq-dhcp[326078]: read /var/lib/neutron/dhcp/c0669abd-aef1-4b0d-9f97-a6adeeac3211/opts Dec 15 05:07:05 localhost ceph-mon[298913]: from='mgr.34411 ' entity='mgr.np0005559464.aomnqe' Dec 15 05:07:05 localhost nova_compute[286344]: 2025-12-15 10:07:05.320 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:07:05 localhost ovn_metadata_agent[160585]: 2025-12-15 10:07:05.320 160590 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=15, ssl=[], options={'arp_ns_explicit_output': 'true', 'fdb_removal_limit': '0', 'ignore_lsp_down': 'false', 'mac_binding_removal_limit': '0', 'mac_prefix': 'fe:17:e3', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': 'fe:55:2b:86:15:b5'}, ipsec=False) old=SB_Global(nb_cfg=14) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Dec 15 05:07:05 localhost ovn_metadata_agent[160585]: 2025-12-15 10:07:05.322 160590 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating 
chassis table for 0 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m Dec 15 05:07:05 localhost ovn_metadata_agent[160585]: 2025-12-15 10:07:05.323 160590 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=12d96d64-e862-4f68-81e5-8d9ec5d3a5e2, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '15'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m Dec 15 05:07:05 localhost nova_compute[286344]: 2025-12-15 10:07:05.716 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:07:05 localhost dnsmasq[326078]: exiting on receipt of SIGTERM Dec 15 05:07:05 localhost podman[326177]: 2025-12-15 10:07:05.751771819 +0000 UTC m=+0.109897706 container kill 47c9545b814d0e13a7278d08b35f44a8a679c4a9b29593501eec1eac29a0b222 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-c0669abd-aef1-4b0d-9f97-a6adeeac3211, io.buildah.version=1.41.3, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0) Dec 15 05:07:05 localhost systemd[1]: libpod-47c9545b814d0e13a7278d08b35f44a8a679c4a9b29593501eec1eac29a0b222.scope: Deactivated successfully. 
Dec 15 05:07:05 localhost podman[326190]: 2025-12-15 10:07:05.825562909 +0000 UTC m=+0.057920808 container died 47c9545b814d0e13a7278d08b35f44a8a679c4a9b29593501eec1eac29a0b222 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-c0669abd-aef1-4b0d-9f97-a6adeeac3211, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0) Dec 15 05:07:05 localhost podman[326190]: 2025-12-15 10:07:05.859520325 +0000 UTC m=+0.091878154 container cleanup 47c9545b814d0e13a7278d08b35f44a8a679c4a9b29593501eec1eac29a0b222 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-c0669abd-aef1-4b0d-9f97-a6adeeac3211, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251202) Dec 15 05:07:05 localhost systemd[1]: libpod-conmon-47c9545b814d0e13a7278d08b35f44a8a679c4a9b29593501eec1eac29a0b222.scope: Deactivated successfully. 
Dec 15 05:07:05 localhost podman[326192]: 2025-12-15 10:07:05.903141104 +0000 UTC m=+0.128430991 container remove 47c9545b814d0e13a7278d08b35f44a8a679c4a9b29593501eec1eac29a0b222 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-c0669abd-aef1-4b0d-9f97-a6adeeac3211, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, io.buildah.version=1.41.3, org.label-schema.license=GPLv2) Dec 15 05:07:06 localhost systemd[1]: tmp-crun.Ox559e.mount: Deactivated successfully. Dec 15 05:07:06 localhost systemd[1]: var-lib-containers-storage-overlay-c3e50734e468984bc5e1cee399cd3615b59dc782fc17e71a3d8df29c2d53b4fb-merged.mount: Deactivated successfully. Dec 15 05:07:06 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-47c9545b814d0e13a7278d08b35f44a8a679c4a9b29593501eec1eac29a0b222-userdata-shm.mount: Deactivated successfully. 
Dec 15 05:07:06 localhost neutron_dhcp_agent[267542]: 2025-12-15 10:07:06.137 267546 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-12-15T10:07:05Z, description=, device_id=1d0b57fc-e595-48e1-939f-22f3cbec445f, device_owner=network:router_interface, dns_assignment=[], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[], id=1e1329c4-d16f-43c8-b0a0-d6bdb82efcc0, ip_allocation=immediate, mac_address=fa:16:3e:51:d9:b8, name=, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-12-15T10:06:56Z, description=, dns_domain=, id=6211c52b-4c8a-4698-aaba-53022274894d, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-VolumesActionsTest-1821391794-network, port_security_enabled=True, project_id=5ccee293c21a4d25b8692241c1f8fb63, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=22789, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=2540, status=ACTIVE, subnets=['177683d4-ddda-40ec-a329-7b0c7357dbde'], tags=[], tenant_id=5ccee293c21a4d25b8692241c1f8fb63, updated_at=2025-12-15T10:06:58Z, vlan_transparent=None, network_id=6211c52b-4c8a-4698-aaba-53022274894d, port_security_enabled=False, project_id=5ccee293c21a4d25b8692241c1f8fb63, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=[], standard_attr_id=2619, status=DOWN, tags=[], tenant_id=5ccee293c21a4d25b8692241c1f8fb63, updated_at=2025-12-15T10:07:05Z on network 6211c52b-4c8a-4698-aaba-53022274894d#033[00m Dec 15 05:07:06 localhost ceph-mon[298913]: mon.np0005559462@0(leader).osd e177 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104 Dec 15 05:07:06 
localhost dnsmasq[325890]: read /var/lib/neutron/dhcp/6211c52b-4c8a-4698-aaba-53022274894d/addn_hosts - 1 addresses Dec 15 05:07:06 localhost dnsmasq-dhcp[325890]: read /var/lib/neutron/dhcp/6211c52b-4c8a-4698-aaba-53022274894d/host Dec 15 05:07:06 localhost dnsmasq-dhcp[325890]: read /var/lib/neutron/dhcp/6211c52b-4c8a-4698-aaba-53022274894d/opts Dec 15 05:07:06 localhost podman[326257]: 2025-12-15 10:07:06.388182021 +0000 UTC m=+0.064920870 container kill 7b4d183efe37b51e41316f0b47169ea601879afda0d8986486eddea57e3b7d7c (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-6211c52b-4c8a-4698-aaba-53022274894d, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202) Dec 15 05:07:06 localhost neutron_dhcp_agent[267542]: 2025-12-15 10:07:06.649 267546 INFO neutron.agent.dhcp.agent [None req-8edf48a2-a9b2-4419-b617-5c8b2fdb2e5d - - - - - -] DHCP configuration for ports {'1e1329c4-d16f-43c8-b0a0-d6bdb82efcc0'} is completed#033[00m Dec 15 05:07:06 localhost podman[326305]: Dec 15 05:07:06 localhost podman[326305]: 2025-12-15 10:07:06.887230529 +0000 UTC m=+0.089310785 container create 18306c9426ac3357b91dac457b841a8bb062c1c76b51fad1288275f2aded097e (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-c0669abd-aef1-4b0d-9f97-a6adeeac3211, tcib_managed=true, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team) Dec 15 05:07:06 
localhost systemd[1]: Started libpod-conmon-18306c9426ac3357b91dac457b841a8bb062c1c76b51fad1288275f2aded097e.scope. Dec 15 05:07:06 localhost podman[326305]: 2025-12-15 10:07:06.844519225 +0000 UTC m=+0.046599511 image pull quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified Dec 15 05:07:06 localhost systemd[1]: Started libcrun container. Dec 15 05:07:06 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/6cb5cfeaea14ad169b200764504d10df43d93fbb61d97644fb5565d45be406f5/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff) Dec 15 05:07:06 localhost podman[326305]: 2025-12-15 10:07:06.960056724 +0000 UTC m=+0.162136990 container init 18306c9426ac3357b91dac457b841a8bb062c1c76b51fad1288275f2aded097e (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-c0669abd-aef1-4b0d-9f97-a6adeeac3211, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, tcib_managed=true, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb) Dec 15 05:07:06 localhost podman[326305]: 2025-12-15 10:07:06.970144689 +0000 UTC m=+0.172224945 container start 18306c9426ac3357b91dac457b841a8bb062c1c76b51fad1288275f2aded097e (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-c0669abd-aef1-4b0d-9f97-a6adeeac3211, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0) Dec 15 05:07:06 localhost dnsmasq[326324]: started, version 2.85 
cachesize 150 Dec 15 05:07:06 localhost dnsmasq[326324]: DNS service limited to local subnets Dec 15 05:07:06 localhost dnsmasq[326324]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile Dec 15 05:07:06 localhost dnsmasq[326324]: warning: no upstream servers configured Dec 15 05:07:06 localhost dnsmasq-dhcp[326324]: DHCP, static leases only on 10.100.0.0, lease time 1d Dec 15 05:07:06 localhost dnsmasq[326324]: read /var/lib/neutron/dhcp/c0669abd-aef1-4b0d-9f97-a6adeeac3211/addn_hosts - 0 addresses Dec 15 05:07:06 localhost dnsmasq-dhcp[326324]: read /var/lib/neutron/dhcp/c0669abd-aef1-4b0d-9f97-a6adeeac3211/host Dec 15 05:07:06 localhost dnsmasq-dhcp[326324]: read /var/lib/neutron/dhcp/c0669abd-aef1-4b0d-9f97-a6adeeac3211/opts Dec 15 05:07:07 localhost neutron_dhcp_agent[267542]: 2025-12-15 10:07:07.221 267546 INFO neutron.agent.dhcp.agent [None req-bd51cfef-2a52-414e-927f-5199298e5963 - - - - - -] DHCP configuration for ports {'79503367-f53f-4b35-8760-76fcaa4d8407', '5a4cf37c-37ea-429b-9caa-6e3e638f1ab2'} is completed#033[00m Dec 15 05:07:07 localhost nova_compute[286344]: 2025-12-15 10:07:07.392 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:07:07 localhost podman[326341]: 2025-12-15 10:07:07.404393642 +0000 UTC m=+0.061767425 container kill 18306c9426ac3357b91dac457b841a8bb062c1c76b51fad1288275f2aded097e (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-c0669abd-aef1-4b0d-9f97-a6adeeac3211, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, 
org.label-schema.name=CentOS Stream 9 Base Image) Dec 15 05:07:07 localhost dnsmasq[326324]: read /var/lib/neutron/dhcp/c0669abd-aef1-4b0d-9f97-a6adeeac3211/addn_hosts - 0 addresses Dec 15 05:07:07 localhost systemd[1]: tmp-crun.OCpsS0.mount: Deactivated successfully. Dec 15 05:07:07 localhost dnsmasq-dhcp[326324]: read /var/lib/neutron/dhcp/c0669abd-aef1-4b0d-9f97-a6adeeac3211/host Dec 15 05:07:07 localhost dnsmasq-dhcp[326324]: read /var/lib/neutron/dhcp/c0669abd-aef1-4b0d-9f97-a6adeeac3211/opts Dec 15 05:07:07 localhost ovn_metadata_agent[160585]: 2025-12-15 10:07:07.566 160590 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:04:2e:5c 10.100.0.2'], port_security=[], type=localport, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': ''}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 'ovnmeta-c0669abd-aef1-4b0d-9f97-a6adeeac3211', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-c0669abd-aef1-4b0d-9f97-a6adeeac3211', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '89e710ef9f4f48d48a369002db572947', 'neutron:revision_number': '14', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=e47821dc-5f5d-44dc-8a16-54817df4049d, chassis=[], tunnel_key=1, gateway_chassis=[], requested_chassis=[], logical_port=79503367-f53f-4b35-8760-76fcaa4d8407) old=Port_Binding(mac=['fa:16:3e:04:2e:5c 10.100.0.2 2001:db8::f816:3eff:fe04:2e5c'], external_ids={'neutron:cidrs': '10.100.0.2/28 2001:db8::f816:3eff:fe04:2e5c/64', 'neutron:device_id': 
'ovnmeta-c0669abd-aef1-4b0d-9f97-a6adeeac3211', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-c0669abd-aef1-4b0d-9f97-a6adeeac3211', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '89e710ef9f4f48d48a369002db572947', 'neutron:revision_number': '11', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Dec 15 05:07:07 localhost ovn_metadata_agent[160585]: 2025-12-15 10:07:07.568 160590 INFO neutron.agent.ovn.metadata.agent [-] Metadata Port 79503367-f53f-4b35-8760-76fcaa4d8407 in datapath c0669abd-aef1-4b0d-9f97-a6adeeac3211 updated#033[00m Dec 15 05:07:07 localhost ovn_metadata_agent[160585]: 2025-12-15 10:07:07.571 160590 DEBUG neutron.agent.ovn.metadata.agent [-] Port c78598bf-e416-478a-bccf-cf1e43fa87b5 IP addresses were not retrieved from the Port_Binding MAC column ['unknown'] _get_port_ips /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:536#033[00m Dec 15 05:07:07 localhost ovn_metadata_agent[160585]: 2025-12-15 10:07:07.572 160590 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network c0669abd-aef1-4b0d-9f97-a6adeeac3211, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m Dec 15 05:07:07 localhost ovn_metadata_agent[160585]: 2025-12-15 10:07:07.573 160858 DEBUG oslo.privsep.daemon [-] privsep: reply[51687d4a-29c1-43c3-9745-9a4d33dd88e5]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Dec 15 05:07:07 localhost neutron_dhcp_agent[267542]: 2025-12-15 10:07:07.765 267546 INFO neutron.agent.dhcp.agent [None req-4c05049e-d67f-48ae-b20b-95b986d870ae - - - - - -] DHCP configuration for ports 
{'79503367-f53f-4b35-8760-76fcaa4d8407', '5a4cf37c-37ea-429b-9caa-6e3e638f1ab2'} is completed#033[00m Dec 15 05:07:08 localhost neutron_dhcp_agent[267542]: 2025-12-15 10:07:08.109 267546 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-12-15T10:07:05Z, description=, device_id=1d0b57fc-e595-48e1-939f-22f3cbec445f, device_owner=network:router_interface, dns_assignment=[], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[], id=1e1329c4-d16f-43c8-b0a0-d6bdb82efcc0, ip_allocation=immediate, mac_address=fa:16:3e:51:d9:b8, name=, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-12-15T10:06:56Z, description=, dns_domain=, id=6211c52b-4c8a-4698-aaba-53022274894d, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-VolumesActionsTest-1821391794-network, port_security_enabled=True, project_id=5ccee293c21a4d25b8692241c1f8fb63, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=22789, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=2540, status=ACTIVE, subnets=['177683d4-ddda-40ec-a329-7b0c7357dbde'], tags=[], tenant_id=5ccee293c21a4d25b8692241c1f8fb63, updated_at=2025-12-15T10:06:58Z, vlan_transparent=None, network_id=6211c52b-4c8a-4698-aaba-53022274894d, port_security_enabled=False, project_id=5ccee293c21a4d25b8692241c1f8fb63, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=[], standard_attr_id=2619, status=DOWN, tags=[], tenant_id=5ccee293c21a4d25b8692241c1f8fb63, updated_at=2025-12-15T10:07:05Z on network 6211c52b-4c8a-4698-aaba-53022274894d#033[00m Dec 15 05:07:08 localhost nova_compute[286344]: 2025-12-15 10:07:08.316 286348 DEBUG 
ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:07:08 localhost dnsmasq[325890]: read /var/lib/neutron/dhcp/6211c52b-4c8a-4698-aaba-53022274894d/addn_hosts - 1 addresses Dec 15 05:07:08 localhost dnsmasq-dhcp[325890]: read /var/lib/neutron/dhcp/6211c52b-4c8a-4698-aaba-53022274894d/host Dec 15 05:07:08 localhost podman[326377]: 2025-12-15 10:07:08.356346633 +0000 UTC m=+0.074760329 container kill 7b4d183efe37b51e41316f0b47169ea601879afda0d8986486eddea57e3b7d7c (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-6211c52b-4c8a-4698-aaba-53022274894d, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3) Dec 15 05:07:08 localhost dnsmasq-dhcp[325890]: read /var/lib/neutron/dhcp/6211c52b-4c8a-4698-aaba-53022274894d/opts Dec 15 05:07:08 localhost neutron_dhcp_agent[267542]: 2025-12-15 10:07:08.652 267546 INFO neutron.agent.dhcp.agent [None req-0d9c40a9-1be9-4ece-9c4a-c4b4e54cd46a - - - - - -] DHCP configuration for ports {'1e1329c4-d16f-43c8-b0a0-d6bdb82efcc0'} is completed#033[00m Dec 15 05:07:10 localhost ovn_metadata_agent[160585]: 2025-12-15 10:07:10.421 160590 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:04:2e:5c 10.100.0.2 2001:db8::f816:3eff:fe04:2e5c'], port_security=[], type=localport, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': ''}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28 
2001:db8::f816:3eff:fe04:2e5c/64', 'neutron:device_id': 'ovnmeta-c0669abd-aef1-4b0d-9f97-a6adeeac3211', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-c0669abd-aef1-4b0d-9f97-a6adeeac3211', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '89e710ef9f4f48d48a369002db572947', 'neutron:revision_number': '15', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=e47821dc-5f5d-44dc-8a16-54817df4049d, chassis=[], tunnel_key=1, gateway_chassis=[], requested_chassis=[], logical_port=79503367-f53f-4b35-8760-76fcaa4d8407) old=Port_Binding(mac=['fa:16:3e:04:2e:5c 10.100.0.2'], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 'ovnmeta-c0669abd-aef1-4b0d-9f97-a6adeeac3211', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-c0669abd-aef1-4b0d-9f97-a6adeeac3211', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '89e710ef9f4f48d48a369002db572947', 'neutron:revision_number': '14', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Dec 15 05:07:10 localhost ovn_metadata_agent[160585]: 2025-12-15 10:07:10.423 160590 INFO neutron.agent.ovn.metadata.agent [-] Metadata Port 79503367-f53f-4b35-8760-76fcaa4d8407 in datapath c0669abd-aef1-4b0d-9f97-a6adeeac3211 updated#033[00m Dec 15 05:07:10 localhost ovn_metadata_agent[160585]: 2025-12-15 10:07:10.426 160590 DEBUG neutron.agent.ovn.metadata.agent [-] Port c78598bf-e416-478a-bccf-cf1e43fa87b5 IP addresses were not retrieved from the Port_Binding MAC column ['unknown'] _get_port_ips 
/usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:536#033[00m Dec 15 05:07:10 localhost ovn_metadata_agent[160585]: 2025-12-15 10:07:10.427 160590 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network c0669abd-aef1-4b0d-9f97-a6adeeac3211, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m Dec 15 05:07:10 localhost ovn_metadata_agent[160585]: 2025-12-15 10:07:10.428 160858 DEBUG oslo.privsep.daemon [-] privsep: reply[732d9e0d-103e-45d3-86fd-400a8d3ac14c]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Dec 15 05:07:10 localhost nova_compute[286344]: 2025-12-15 10:07:10.718 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:07:10 localhost dnsmasq[326324]: exiting on receipt of SIGTERM Dec 15 05:07:10 localhost podman[326414]: 2025-12-15 10:07:10.828375024 +0000 UTC m=+0.061712863 container kill 18306c9426ac3357b91dac457b841a8bb062c1c76b51fad1288275f2aded097e (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-c0669abd-aef1-4b0d-9f97-a6adeeac3211, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2) Dec 15 05:07:10 localhost systemd[1]: libpod-18306c9426ac3357b91dac457b841a8bb062c1c76b51fad1288275f2aded097e.scope: Deactivated successfully. 
Dec 15 05:07:10 localhost podman[326428]: 2025-12-15 10:07:10.898180506 +0000 UTC m=+0.057637602 container died 18306c9426ac3357b91dac457b841a8bb062c1c76b51fad1288275f2aded097e (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-c0669abd-aef1-4b0d-9f97-a6adeeac3211, tcib_managed=true, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team) Dec 15 05:07:10 localhost podman[326428]: 2025-12-15 10:07:10.933288934 +0000 UTC m=+0.092745960 container cleanup 18306c9426ac3357b91dac457b841a8bb062c1c76b51fad1288275f2aded097e (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-c0669abd-aef1-4b0d-9f97-a6adeeac3211, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team) Dec 15 05:07:10 localhost systemd[1]: libpod-conmon-18306c9426ac3357b91dac457b841a8bb062c1c76b51fad1288275f2aded097e.scope: Deactivated successfully. 
Dec 15 05:07:10 localhost neutron_sriov_agent[260044]: 2025-12-15 10:07:10.951 2 INFO neutron.agent.securitygroups_rpc [None req-78553433-0e3f-4898-b165-6f1948c260dc 6b5da6f221214afe93e1fa66574f238b 89e710ef9f4f48d48a369002db572947 - - default default] Security group member updated ['a6c5f808-dddc-4f17-acbf-63b1b6e6f4d6']#033[00m Dec 15 05:07:10 localhost podman[326430]: 2025-12-15 10:07:10.986911195 +0000 UTC m=+0.135186946 container remove 18306c9426ac3357b91dac457b841a8bb062c1c76b51fad1288275f2aded097e (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-c0669abd-aef1-4b0d-9f97-a6adeeac3211, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.license=GPLv2) Dec 15 05:07:11 localhost ceph-mon[298913]: mon.np0005559462@0(leader).osd e177 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104 Dec 15 05:07:11 localhost systemd[1]: var-lib-containers-storage-overlay-6cb5cfeaea14ad169b200764504d10df43d93fbb61d97644fb5565d45be406f5-merged.mount: Deactivated successfully. Dec 15 05:07:11 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-18306c9426ac3357b91dac457b841a8bb062c1c76b51fad1288275f2aded097e-userdata-shm.mount: Deactivated successfully. 
Dec 15 05:07:12 localhost nova_compute[286344]: 2025-12-15 10:07:12.270 286348 DEBUG oslo_service.periodic_task [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Dec 15 05:07:12 localhost podman[326507]: Dec 15 05:07:12 localhost podman[326507]: 2025-12-15 10:07:12.419263516 +0000 UTC m=+0.084454262 container create 8099584c15df0f6bc14074b008e9818a516c0760a5084fa015616f773ef470e6 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-c0669abd-aef1-4b0d-9f97-a6adeeac3211, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team) Dec 15 05:07:12 localhost systemd[1]: Started libpod-conmon-8099584c15df0f6bc14074b008e9818a516c0760a5084fa015616f773ef470e6.scope. Dec 15 05:07:12 localhost systemd[1]: tmp-crun.b24KPA.mount: Deactivated successfully. Dec 15 05:07:12 localhost systemd[1]: Started libcrun container. 
Dec 15 05:07:12 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/0b0864144d551fe4f734fdbe8b8402a642084d386e35dcd5933ad56f4dbf0bf1/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff) Dec 15 05:07:12 localhost podman[326507]: 2025-12-15 10:07:12.386691168 +0000 UTC m=+0.051881964 image pull quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified Dec 15 05:07:12 localhost podman[326507]: 2025-12-15 10:07:12.487965219 +0000 UTC m=+0.153155995 container init 8099584c15df0f6bc14074b008e9818a516c0760a5084fa015616f773ef470e6 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-c0669abd-aef1-4b0d-9f97-a6adeeac3211, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.build-date=20251202, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image) Dec 15 05:07:12 localhost podman[326507]: 2025-12-15 10:07:12.498585407 +0000 UTC m=+0.163776173 container start 8099584c15df0f6bc14074b008e9818a516c0760a5084fa015616f773ef470e6 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-c0669abd-aef1-4b0d-9f97-a6adeeac3211, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, tcib_managed=true) Dec 15 05:07:12 localhost dnsmasq[326526]: started, version 2.85 cachesize 150 Dec 15 05:07:12 localhost dnsmasq[326526]: DNS service limited to local subnets Dec 15 05:07:12 localhost dnsmasq[326526]: compile time options: IPv6 GNU-getopt DBus no-UBus 
no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile Dec 15 05:07:12 localhost dnsmasq[326526]: warning: no upstream servers configured Dec 15 05:07:12 localhost dnsmasq-dhcp[326526]: DHCP, static leases only on 10.100.0.0, lease time 1d Dec 15 05:07:12 localhost dnsmasq-dhcp[326526]: DHCPv6, static leases only on 2001:db8::, lease time 1d Dec 15 05:07:12 localhost dnsmasq[326526]: read /var/lib/neutron/dhcp/c0669abd-aef1-4b0d-9f97-a6adeeac3211/addn_hosts - 0 addresses Dec 15 05:07:12 localhost dnsmasq-dhcp[326526]: read /var/lib/neutron/dhcp/c0669abd-aef1-4b0d-9f97-a6adeeac3211/host Dec 15 05:07:12 localhost dnsmasq-dhcp[326526]: read /var/lib/neutron/dhcp/c0669abd-aef1-4b0d-9f97-a6adeeac3211/opts Dec 15 05:07:12 localhost neutron_dhcp_agent[267542]: 2025-12-15 10:07:12.563 267546 INFO neutron.agent.dhcp.agent [None req-d13e1e15-8350-4b4e-8ab8-c78fd5e8bdd3 - - - - - -] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-12-15T10:07:10Z, description=, device_id=, device_owner=, dns_assignment=[, ], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[, ], id=a89a489f-7d2b-40bb-b296-b69ac95a1cf1, ip_allocation=immediate, mac_address=fa:16:3e:39:84:80, name=tempest-NetworksTestDHCPv6-1846309243, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-12-15T10:05:39Z, description=, dns_domain=, id=c0669abd-aef1-4b0d-9f97-a6adeeac3211, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-NetworksTestDHCPv6-test-network-1861207414, port_security_enabled=True, project_id=89e710ef9f4f48d48a369002db572947, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=20673, qos_policy_id=None, revision_number=35, router:external=False, shared=False, 
standard_attr_id=2164, status=ACTIVE, subnets=['23767fe4-48fe-4aac-8b4b-c1fb4d5d1473', 'd8176823-4e8f-42b5-b266-0400abf626e9'], tags=[], tenant_id=89e710ef9f4f48d48a369002db572947, updated_at=2025-12-15T10:07:08Z, vlan_transparent=None, network_id=c0669abd-aef1-4b0d-9f97-a6adeeac3211, port_security_enabled=True, project_id=89e710ef9f4f48d48a369002db572947, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=['a6c5f808-dddc-4f17-acbf-63b1b6e6f4d6'], standard_attr_id=2641, status=DOWN, tags=[], tenant_id=89e710ef9f4f48d48a369002db572947, updated_at=2025-12-15T10:07:10Z on network c0669abd-aef1-4b0d-9f97-a6adeeac3211#033[00m Dec 15 05:07:12 localhost neutron_sriov_agent[260044]: 2025-12-15 10:07:12.644 2 INFO neutron.agent.securitygroups_rpc [None req-6ab90569-33d8-4620-ad2c-93be2bff55d3 6b5da6f221214afe93e1fa66574f238b 89e710ef9f4f48d48a369002db572947 - - default default] Security group member updated ['a6c5f808-dddc-4f17-acbf-63b1b6e6f4d6']#033[00m Dec 15 05:07:12 localhost dnsmasq[326526]: read /var/lib/neutron/dhcp/c0669abd-aef1-4b0d-9f97-a6adeeac3211/addn_hosts - 2 addresses Dec 15 05:07:12 localhost dnsmasq-dhcp[326526]: read /var/lib/neutron/dhcp/c0669abd-aef1-4b0d-9f97-a6adeeac3211/host Dec 15 05:07:12 localhost dnsmasq-dhcp[326526]: read /var/lib/neutron/dhcp/c0669abd-aef1-4b0d-9f97-a6adeeac3211/opts Dec 15 05:07:12 localhost podman[326544]: 2025-12-15 10:07:12.797604146 +0000 UTC m=+0.045344137 container kill 8099584c15df0f6bc14074b008e9818a516c0760a5084fa015616f773ef470e6 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-c0669abd-aef1-4b0d-9f97-a6adeeac3211, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, 
org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0) Dec 15 05:07:12 localhost neutron_dhcp_agent[267542]: 2025-12-15 10:07:12.800 267546 INFO neutron.agent.dhcp.agent [None req-ba5beb9c-cd34-4eb7-be60-b2170893ee40 - - - - - -] DHCP configuration for ports {'79503367-f53f-4b35-8760-76fcaa4d8407', '5a4cf37c-37ea-429b-9caa-6e3e638f1ab2'} is completed#033[00m Dec 15 05:07:13 localhost neutron_dhcp_agent[267542]: 2025-12-15 10:07:13.047 267546 INFO neutron.agent.dhcp.agent [None req-fd54ab00-844d-4b09-ad1d-d1fc6df1caed - - - - - -] DHCP configuration for ports {'a89a489f-7d2b-40bb-b296-b69ac95a1cf1'} is completed#033[00m Dec 15 05:07:13 localhost dnsmasq[326526]: read /var/lib/neutron/dhcp/c0669abd-aef1-4b0d-9f97-a6adeeac3211/addn_hosts - 0 addresses Dec 15 05:07:13 localhost dnsmasq-dhcp[326526]: read /var/lib/neutron/dhcp/c0669abd-aef1-4b0d-9f97-a6adeeac3211/host Dec 15 05:07:13 localhost dnsmasq-dhcp[326526]: read /var/lib/neutron/dhcp/c0669abd-aef1-4b0d-9f97-a6adeeac3211/opts Dec 15 05:07:13 localhost podman[326581]: 2025-12-15 10:07:13.23006843 +0000 UTC m=+0.060130409 container kill 8099584c15df0f6bc14074b008e9818a516c0760a5084fa015616f773ef470e6 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-c0669abd-aef1-4b0d-9f97-a6adeeac3211, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2) Dec 15 05:07:13 localhost nova_compute[286344]: 2025-12-15 10:07:13.319 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:07:14 localhost systemd[1]: Started /usr/bin/podman healthcheck run 
67ee72fcd99c896f79e8807764b1d0f04e7d45927d06e306c0986394e8ed42a0. Dec 15 05:07:14 localhost systemd[1]: Started /usr/bin/podman healthcheck run 730c50020a7d9e2da21d2462d40af20ac303e43f3491f692cbbd87f432ba3e09. Dec 15 05:07:14 localhost systemd[1]: Started /usr/bin/podman healthcheck run 9f0f481c937606298203797a71924b7614a5d9e3f4a8afd837cfa5b9602aedeb. Dec 15 05:07:14 localhost systemd[1]: Started /usr/bin/podman healthcheck run b3d8483ade539500e228c2af9c0c3e647febb5ec7903610a83aea32631007d4a. Dec 15 05:07:14 localhost systemd[1]: Started /usr/bin/podman healthcheck run ac2eae30a2a7f971398af42ea67b3ed799d4b3341f7688ebcd73f7ca7f66c2a8. Dec 15 05:07:14 localhost neutron_dhcp_agent[267542]: 2025-12-15 10:07:14.105 267546 INFO neutron.agent.linux.ip_lib [None req-93c34c52-ee43-4e5c-afd6-b693eb6d0ec2 - - - - - -] Device tap34c2fd24-fb cannot be used as it has no MAC address#033[00m Dec 15 05:07:14 localhost podman[326607]: 2025-12-15 10:07:14.133487629 +0000 UTC m=+0.103228445 container health_status 730c50020a7d9e2da21d2462d40af20ac303e43f3491f692cbbd87f432ba3e09 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, build-date=2025-08-20T13:12:41, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, managed_by=edpm_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, url=https://catalog.redhat.com/en/search?searchType=containers, io.buildah.version=1.33.7, version=9.6, maintainer=Red Hat, Inc., config_id=openstack_network_exporter, architecture=x86_64, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., distribution-scope=public, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'cb1e9478634a00c8fb5ad9cd7fcd38c7b2a1a85187dfaa618bc715c24c2b15fb'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, com.redhat.component=ubi9-minimal-container, name=ubi9-minimal, container_name=openstack_network_exporter, vendor=Red Hat, Inc., io.openshift.expose-services=, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, 
release=1755695350, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-type=git) Dec 15 05:07:14 localhost nova_compute[286344]: 2025-12-15 10:07:14.180 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:07:14 localhost kernel: device tap34c2fd24-fb entered promiscuous mode Dec 15 05:07:14 localhost nova_compute[286344]: 2025-12-15 10:07:14.196 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:07:14 localhost ovn_controller[154603]: 2025-12-15T10:07:14Z|00347|binding|INFO|Claiming lport 34c2fd24-fb92-4d4c-b9cd-c2d7d32770ff for this chassis. Dec 15 05:07:14 localhost ovn_controller[154603]: 2025-12-15T10:07:14Z|00348|binding|INFO|34c2fd24-fb92-4d4c-b9cd-c2d7d32770ff: Claiming unknown Dec 15 05:07:14 localhost NetworkManager[5963]: [1765793234.2035] manager: (tap34c2fd24-fb): new Generic device (/org/freedesktop/NetworkManager/Devices/56) Dec 15 05:07:14 localhost ovn_metadata_agent[160585]: 2025-12-15 10:07:14.206 160590 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005559462.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.1/28', 'neutron:device_id': 'dhcpce287b4b-a043-5ed9-8e08-4fd555d768a2-89ceb4c2-ef6d-4611-b0ca-45fd8611536f', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-89ceb4c2-ef6d-4611-b0ca-45fd8611536f', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '5a302f70917d47c392ee5c9b50e38a7e', 
'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=bf0debb4-a014-4d60-a959-df5e2f98d74c, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=34c2fd24-fb92-4d4c-b9cd-c2d7d32770ff) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Dec 15 05:07:14 localhost ovn_metadata_agent[160585]: 2025-12-15 10:07:14.208 160590 INFO neutron.agent.ovn.metadata.agent [-] Port 34c2fd24-fb92-4d4c-b9cd-c2d7d32770ff in datapath 89ceb4c2-ef6d-4611-b0ca-45fd8611536f bound to our chassis#033[00m Dec 15 05:07:14 localhost ovn_metadata_agent[160585]: 2025-12-15 10:07:14.209 160590 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network 89ceb4c2-ef6d-4611-b0ca-45fd8611536f or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599#033[00m Dec 15 05:07:14 localhost ovn_metadata_agent[160585]: 2025-12-15 10:07:14.210 160858 DEBUG oslo.privsep.daemon [-] privsep: reply[9cdeb834-1fe4-476f-8f9c-c7ce96089d89]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Dec 15 05:07:14 localhost nova_compute[286344]: 2025-12-15 10:07:14.228 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:07:14 localhost ovn_controller[154603]: 2025-12-15T10:07:14Z|00349|binding|INFO|Setting lport 34c2fd24-fb92-4d4c-b9cd-c2d7d32770ff ovn-installed in OVS Dec 15 05:07:14 localhost ovn_controller[154603]: 2025-12-15T10:07:14Z|00350|binding|INFO|Setting lport 34c2fd24-fb92-4d4c-b9cd-c2d7d32770ff up in Southbound Dec 15 05:07:14 localhost 
nova_compute[286344]: 2025-12-15 10:07:14.231 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:07:14 localhost podman[326615]: 2025-12-15 10:07:14.192741163 +0000 UTC m=+0.152250549 container health_status b3d8483ade539500e228c2af9c0c3e647febb5ec7903610a83aea32631007d4a (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, tcib_managed=true, org.label-schema.vendor=CentOS, config_id=ceilometer_agent_compute, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, container_name=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '07f3af15d79276418f54a8dde2fc251064527c3571430e14af77aaa2c01f1193-cb1e9478634a00c8fb5ad9cd7fcd38c7b2a1a85187dfaa618bc715c24c2b15fb'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}) Dec 15 05:07:14 localhost nova_compute[286344]: 2025-12-15 10:07:14.269 286348 DEBUG oslo_service.periodic_task [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Dec 15 05:07:14 localhost nova_compute[286344]: 2025-12-15 10:07:14.270 286348 DEBUG oslo_service.periodic_task [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Dec 15 05:07:14 localhost podman[326622]: 2025-12-15 10:07:14.223596744 +0000 UTC m=+0.180575551 container health_status ac2eae30a2a7f971398af42ea67b3ed799d4b3341f7688ebcd73f7ca7f66c2a8 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=ovn_controller, org.label-schema.license=GPLv2, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '07f3af15d79276418f54a8dde2fc251064527c3571430e14af77aaa2c01f1193'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ovn_controller, managed_by=edpm_ansible, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team) Dec 15 05:07:14 localhost nova_compute[286344]: 2025-12-15 10:07:14.280 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:07:14 localhost podman[326607]: 2025-12-15 10:07:14.28327387 +0000 UTC m=+0.253014716 container exec_died 730c50020a7d9e2da21d2462d40af20ac303e43f3491f692cbbd87f432ba3e09 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, io.openshift.tags=minimal rhel9, managed_by=edpm_ansible, vendor=Red Hat, Inc., config_id=openstack_network_exporter, container_name=openstack_network_exporter, com.redhat.component=ubi9-minimal-container, maintainer=Red Hat, Inc., vcs-type=git, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., architecture=x86_64, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., release=1755695350, io.buildah.version=1.33.7, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., distribution-scope=public, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'cb1e9478634a00c8fb5ad9cd7fcd38c7b2a1a85187dfaa618bc715c24c2b15fb'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, name=ubi9-minimal, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, version=9.6, url=https://catalog.redhat.com/en/search?searchType=containers, build-date=2025-08-20T13:12:41) Dec 15 05:07:14 localhost systemd[1]: 730c50020a7d9e2da21d2462d40af20ac303e43f3491f692cbbd87f432ba3e09.service: Deactivated successfully. 
Dec 15 05:07:14 localhost podman[326609]: 2025-12-15 10:07:14.316092344 +0000 UTC m=+0.279603590 container health_status 9f0f481c937606298203797a71924b7614a5d9e3f4a8afd837cfa5b9602aedeb (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, tcib_managed=true, container_name=multipathd, org.label-schema.vendor=CentOS) Dec 15 05:07:14 localhost nova_compute[286344]: 2025-12-15 10:07:14.324 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup 
/usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:07:14 localhost podman[326609]: 2025-12-15 10:07:14.335438162 +0000 UTC m=+0.298949458 container exec_died 9f0f481c937606298203797a71924b7614a5d9e3f4a8afd837cfa5b9602aedeb (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, org.label-schema.build-date=20251202, tcib_managed=true, container_name=multipathd, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2) Dec 15 05:07:14 localhost systemd[1]: 9f0f481c937606298203797a71924b7614a5d9e3f4a8afd837cfa5b9602aedeb.service: Deactivated 
successfully. Dec 15 05:07:14 localhost podman[326622]: 2025-12-15 10:07:14.356688381 +0000 UTC m=+0.313667168 container exec_died ac2eae30a2a7f971398af42ea67b3ed799d4b3341f7688ebcd73f7ca7f66c2a8 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=ovn_controller, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '07f3af15d79276418f54a8dde2fc251064527c3571430e14af77aaa2c01f1193'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, config_id=ovn_controller) Dec 15 05:07:14 localhost podman[326606]: 2025-12-15 10:07:14.266191864 +0000 UTC m=+0.235485558 container health_status 67ee72fcd99c896f79e8807764b1d0f04e7d45927d06e306c0986394e8ed42a0 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, container_name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible, 
config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter) Dec 15 05:07:14 localhost systemd[1]: ac2eae30a2a7f971398af42ea67b3ed799d4b3341f7688ebcd73f7ca7f66c2a8.service: Deactivated successfully. 
Dec 15 05:07:14 localhost podman[326615]: 2025-12-15 10:07:14.389817073 +0000 UTC m=+0.349326419 container exec_died b3d8483ade539500e228c2af9c0c3e647febb5ec7903610a83aea32631007d4a (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, org.label-schema.license=GPLv2, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '07f3af15d79276418f54a8dde2fc251064527c3571430e14af77aaa2c01f1193-cb1e9478634a00c8fb5ad9cd7fcd38c7b2a1a85187dfaa618bc715c24c2b15fb'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251202, config_id=ceilometer_agent_compute, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=ceilometer_agent_compute, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb) Dec 
15 05:07:14 localhost podman[326606]: 2025-12-15 10:07:14.402755416 +0000 UTC m=+0.372049070 container exec_died 67ee72fcd99c896f79e8807764b1d0f04e7d45927d06e306c0986394e8ed42a0 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, managed_by=edpm_ansible, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors ) Dec 15 05:07:14 localhost systemd[1]: b3d8483ade539500e228c2af9c0c3e647febb5ec7903610a83aea32631007d4a.service: Deactivated successfully. Dec 15 05:07:14 localhost systemd[1]: 67ee72fcd99c896f79e8807764b1d0f04e7d45927d06e306c0986394e8ed42a0.service: Deactivated successfully. 
Dec 15 05:07:14 localhost nova_compute[286344]: 2025-12-15 10:07:14.479 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:07:14 localhost dnsmasq[326526]: exiting on receipt of SIGTERM Dec 15 05:07:14 localhost podman[326763]: 2025-12-15 10:07:14.844390611 +0000 UTC m=+0.070095061 container kill 8099584c15df0f6bc14074b008e9818a516c0760a5084fa015616f773ef470e6 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-c0669abd-aef1-4b0d-9f97-a6adeeac3211, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image) Dec 15 05:07:14 localhost systemd[1]: libpod-8099584c15df0f6bc14074b008e9818a516c0760a5084fa015616f773ef470e6.scope: Deactivated successfully. Dec 15 05:07:14 localhost ceph-mon[298913]: mon.np0005559462@0(leader) e17 handle_command mon_command({"prefix":"df", "format":"json"} v 0) Dec 15 05:07:14 localhost ceph-mon[298913]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/382325651' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch Dec 15 05:07:14 localhost ceph-mon[298913]: mon.np0005559462@0(leader) e17 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0) Dec 15 05:07:14 localhost ceph-mon[298913]: log_channel(audit) log [DBG] : from='client.? 
172.18.0.32:0/382325651' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch Dec 15 05:07:14 localhost podman[326780]: 2025-12-15 10:07:14.917301998 +0000 UTC m=+0.060525621 container died 8099584c15df0f6bc14074b008e9818a516c0760a5084fa015616f773ef470e6 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-c0669abd-aef1-4b0d-9f97-a6adeeac3211, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2) Dec 15 05:07:14 localhost podman[326780]: 2025-12-15 10:07:14.951946991 +0000 UTC m=+0.095170584 container cleanup 8099584c15df0f6bc14074b008e9818a516c0760a5084fa015616f773ef470e6 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-c0669abd-aef1-4b0d-9f97-a6adeeac3211, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true) Dec 15 05:07:14 localhost systemd[1]: libpod-conmon-8099584c15df0f6bc14074b008e9818a516c0760a5084fa015616f773ef470e6.scope: Deactivated successfully. Dec 15 05:07:14 localhost systemd[1]: var-lib-containers-storage-overlay-0b0864144d551fe4f734fdbe8b8402a642084d386e35dcd5933ad56f4dbf0bf1-merged.mount: Deactivated successfully. Dec 15 05:07:14 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-8099584c15df0f6bc14074b008e9818a516c0760a5084fa015616f773ef470e6-userdata-shm.mount: Deactivated successfully. 
Dec 15 05:07:14 localhost podman[326782]: 2025-12-15 10:07:14.995945671 +0000 UTC m=+0.130771415 container remove 8099584c15df0f6bc14074b008e9818a516c0760a5084fa015616f773ef470e6 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-c0669abd-aef1-4b0d-9f97-a6adeeac3211, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.vendor=CentOS) Dec 15 05:07:15 localhost podman[326842]: Dec 15 05:07:15 localhost podman[326842]: 2025-12-15 10:07:15.264974952 +0000 UTC m=+0.097578571 container create 49422fefdef265b829bed4dee2315786e43e4f42468c6656fad3e1d7b33bdc79 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-89ceb4c2-ef6d-4611-b0ca-45fd8611536f, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0) Dec 15 05:07:15 localhost nova_compute[286344]: 2025-12-15 10:07:15.266 286348 DEBUG oslo_service.periodic_task [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Dec 15 05:07:15 localhost nova_compute[286344]: 2025-12-15 10:07:15.270 286348 DEBUG oslo_service.periodic_task [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks 
/usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Dec 15 05:07:15 localhost nova_compute[286344]: 2025-12-15 10:07:15.271 286348 DEBUG nova.compute.manager [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m Dec 15 05:07:15 localhost nova_compute[286344]: 2025-12-15 10:07:15.271 286348 DEBUG nova.compute.manager [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m Dec 15 05:07:15 localhost systemd[1]: Started libpod-conmon-49422fefdef265b829bed4dee2315786e43e4f42468c6656fad3e1d7b33bdc79.scope. Dec 15 05:07:15 localhost systemd[1]: tmp-crun.gw6zWT.mount: Deactivated successfully. Dec 15 05:07:15 localhost systemd[1]: Started libcrun container. Dec 15 05:07:15 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/c8dd47ed73c6994023fe69830b445629caff824e12e109beef57adad9e6c8660/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff) Dec 15 05:07:15 localhost podman[326842]: 2025-12-15 10:07:15.222751291 +0000 UTC m=+0.055354990 image pull quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified Dec 15 05:07:15 localhost podman[326842]: 2025-12-15 10:07:15.329040387 +0000 UTC m=+0.161644006 container init 49422fefdef265b829bed4dee2315786e43e4f42468c6656fad3e1d7b33bdc79 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-89ceb4c2-ef6d-4611-b0ca-45fd8611536f, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3, 
org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS) Dec 15 05:07:15 localhost podman[326842]: 2025-12-15 10:07:15.339160913 +0000 UTC m=+0.171764542 container start 49422fefdef265b829bed4dee2315786e43e4f42468c6656fad3e1d7b33bdc79 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-89ceb4c2-ef6d-4611-b0ca-45fd8611536f, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, tcib_managed=true, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb) Dec 15 05:07:15 localhost dnsmasq[326869]: started, version 2.85 cachesize 150 Dec 15 05:07:15 localhost dnsmasq[326869]: DNS service limited to local subnets Dec 15 05:07:15 localhost dnsmasq[326869]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile Dec 15 05:07:15 localhost dnsmasq[326869]: warning: no upstream servers configured Dec 15 05:07:15 localhost dnsmasq-dhcp[326869]: DHCP, static leases only on 10.100.0.0, lease time 1d Dec 15 05:07:15 localhost dnsmasq[326869]: read /var/lib/neutron/dhcp/89ceb4c2-ef6d-4611-b0ca-45fd8611536f/addn_hosts - 0 addresses Dec 15 05:07:15 localhost dnsmasq-dhcp[326869]: read /var/lib/neutron/dhcp/89ceb4c2-ef6d-4611-b0ca-45fd8611536f/host Dec 15 05:07:15 localhost dnsmasq-dhcp[326869]: read /var/lib/neutron/dhcp/89ceb4c2-ef6d-4611-b0ca-45fd8611536f/opts Dec 15 05:07:15 localhost neutron_dhcp_agent[267542]: 2025-12-15 10:07:15.530 267546 INFO neutron.agent.dhcp.agent [None req-42ac2b0b-0e53-4071-a0ee-7f2233cf9ad9 - - - - - -] DHCP configuration for ports {'93fe5946-3d57-4af8-af14-5c7446d6f78b'} is completed#033[00m Dec 15 05:07:15 localhost nova_compute[286344]: 2025-12-15 10:07:15.721 286348 DEBUG 
ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:07:15 localhost nova_compute[286344]: 2025-12-15 10:07:15.756 286348 DEBUG oslo_concurrency.lockutils [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Acquiring lock "refresh_cache-39ff1bd9-6f6b-44c8-bbec-a1fd9d196359" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m Dec 15 05:07:15 localhost nova_compute[286344]: 2025-12-15 10:07:15.756 286348 DEBUG oslo_concurrency.lockutils [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Acquired lock "refresh_cache-39ff1bd9-6f6b-44c8-bbec-a1fd9d196359" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m Dec 15 05:07:15 localhost nova_compute[286344]: 2025-12-15 10:07:15.757 286348 DEBUG nova.network.neutron [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] [instance: 39ff1bd9-6f6b-44c8-bbec-a1fd9d196359] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m Dec 15 05:07:15 localhost nova_compute[286344]: 2025-12-15 10:07:15.757 286348 DEBUG nova.objects.instance [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Lazy-loading 'info_cache' on Instance uuid 39ff1bd9-6f6b-44c8-bbec-a1fd9d196359 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m Dec 15 05:07:15 localhost ovn_controller[154603]: 2025-12-15T10:07:15Z|00351|binding|INFO|Releasing lport 34c2fd24-fb92-4d4c-b9cd-c2d7d32770ff from this chassis (sb_readonly=0) Dec 15 05:07:15 localhost ovn_controller[154603]: 2025-12-15T10:07:15Z|00352|binding|INFO|Setting lport 34c2fd24-fb92-4d4c-b9cd-c2d7d32770ff down in Southbound Dec 15 05:07:15 localhost nova_compute[286344]: 2025-12-15 10:07:15.935 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup 
/usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:07:15 localhost kernel: device tap34c2fd24-fb left promiscuous mode Dec 15 05:07:15 localhost ovn_metadata_agent[160585]: 2025-12-15 10:07:15.942 160590 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005559462.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.1/28', 'neutron:device_id': 'dhcpce287b4b-a043-5ed9-8e08-4fd555d768a2-89ceb4c2-ef6d-4611-b0ca-45fd8611536f', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-89ceb4c2-ef6d-4611-b0ca-45fd8611536f', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '5a302f70917d47c392ee5c9b50e38a7e', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=bf0debb4-a014-4d60-a959-df5e2f98d74c, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=34c2fd24-fb92-4d4c-b9cd-c2d7d32770ff) old=Port_Binding(up=[True], chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Dec 15 05:07:15 localhost ovn_metadata_agent[160585]: 2025-12-15 10:07:15.944 160590 INFO neutron.agent.ovn.metadata.agent [-] Port 34c2fd24-fb92-4d4c-b9cd-c2d7d32770ff in datapath 89ceb4c2-ef6d-4611-b0ca-45fd8611536f unbound from our chassis#033[00m Dec 15 05:07:15 localhost ovn_metadata_agent[160585]: 2025-12-15 10:07:15.948 160590 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for 
network 89ceb4c2-ef6d-4611-b0ca-45fd8611536f, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m Dec 15 05:07:15 localhost ovn_metadata_agent[160585]: 2025-12-15 10:07:15.949 160858 DEBUG oslo.privsep.daemon [-] privsep: reply[47b2afbb-04f2-41dc-abdb-30e5a6120556]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Dec 15 05:07:15 localhost nova_compute[286344]: 2025-12-15 10:07:15.960 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:07:16 localhost podman[326900]: Dec 15 05:07:16 localhost podman[326900]: 2025-12-15 10:07:16.029139935 +0000 UTC m=+0.106911105 container create 91891ac80aeb33d0680de6a31f435600bad33403ad42f1e66a51de8b929b5091 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-c0669abd-aef1-4b0d-9f97-a6adeeac3211, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team) Dec 15 05:07:16 localhost systemd[1]: Started libpod-conmon-91891ac80aeb33d0680de6a31f435600bad33403ad42f1e66a51de8b929b5091.scope. Dec 15 05:07:16 localhost podman[326900]: 2025-12-15 10:07:15.980595602 +0000 UTC m=+0.058366812 image pull quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified Dec 15 05:07:16 localhost systemd[1]: tmp-crun.ZNAm6q.mount: Deactivated successfully. Dec 15 05:07:16 localhost systemd[1]: Started libcrun container. 
Dec 15 05:07:16 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/5a5746c3dfce3eae41116dc0cd76b95e3f017ab939047a9c3652414a1b5c7d55/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff) Dec 15 05:07:16 localhost podman[326900]: 2025-12-15 10:07:16.126379684 +0000 UTC m=+0.204150854 container init 91891ac80aeb33d0680de6a31f435600bad33403ad42f1e66a51de8b929b5091 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-c0669abd-aef1-4b0d-9f97-a6adeeac3211, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, org.label-schema.license=GPLv2) Dec 15 05:07:16 localhost podman[326900]: 2025-12-15 10:07:16.135128842 +0000 UTC m=+0.212900012 container start 91891ac80aeb33d0680de6a31f435600bad33403ad42f1e66a51de8b929b5091 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-c0669abd-aef1-4b0d-9f97-a6adeeac3211, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team) Dec 15 05:07:16 localhost dnsmasq[326920]: started, version 2.85 cachesize 150 Dec 15 05:07:16 localhost dnsmasq[326920]: DNS service limited to local subnets Dec 15 05:07:16 localhost dnsmasq[326920]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile Dec 15 05:07:16 localhost dnsmasq[326920]: warning: no upstream servers 
configured Dec 15 05:07:16 localhost dnsmasq-dhcp[326920]: DHCP, static leases only on 10.100.0.0, lease time 1d Dec 15 05:07:16 localhost dnsmasq[326920]: read /var/lib/neutron/dhcp/c0669abd-aef1-4b0d-9f97-a6adeeac3211/addn_hosts - 0 addresses Dec 15 05:07:16 localhost dnsmasq-dhcp[326920]: read /var/lib/neutron/dhcp/c0669abd-aef1-4b0d-9f97-a6adeeac3211/host Dec 15 05:07:16 localhost dnsmasq-dhcp[326920]: read /var/lib/neutron/dhcp/c0669abd-aef1-4b0d-9f97-a6adeeac3211/opts Dec 15 05:07:16 localhost ceph-mon[298913]: mon.np0005559462@0(leader).osd e177 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104 Dec 15 05:07:16 localhost neutron_dhcp_agent[267542]: 2025-12-15 10:07:16.354 267546 INFO neutron.agent.dhcp.agent [None req-fa1a29f4-a9fb-4738-b1ea-7b1706df32a6 - - - - - -] DHCP configuration for ports {'79503367-f53f-4b35-8760-76fcaa4d8407', '5a4cf37c-37ea-429b-9caa-6e3e638f1ab2'} is completed#033[00m Dec 15 05:07:16 localhost dnsmasq[326920]: exiting on receipt of SIGTERM Dec 15 05:07:16 localhost podman[326939]: 2025-12-15 10:07:16.468494747 +0000 UTC m=+0.052620625 container kill 91891ac80aeb33d0680de6a31f435600bad33403ad42f1e66a51de8b929b5091 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-c0669abd-aef1-4b0d-9f97-a6adeeac3211, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202) Dec 15 05:07:16 localhost systemd[1]: libpod-91891ac80aeb33d0680de6a31f435600bad33403ad42f1e66a51de8b929b5091.scope: Deactivated successfully. 
Dec 15 05:07:16 localhost podman[326952]: 2025-12-15 10:07:16.535579205 +0000 UTC m=+0.056058519 container died 91891ac80aeb33d0680de6a31f435600bad33403ad42f1e66a51de8b929b5091 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-c0669abd-aef1-4b0d-9f97-a6adeeac3211, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team) Dec 15 05:07:16 localhost podman[326952]: 2025-12-15 10:07:16.568563034 +0000 UTC m=+0.089042318 container cleanup 91891ac80aeb33d0680de6a31f435600bad33403ad42f1e66a51de8b929b5091 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-c0669abd-aef1-4b0d-9f97-a6adeeac3211, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3) Dec 15 05:07:16 localhost systemd[1]: libpod-conmon-91891ac80aeb33d0680de6a31f435600bad33403ad42f1e66a51de8b929b5091.scope: Deactivated successfully. 
Dec 15 05:07:16 localhost podman[326954]: 2025-12-15 10:07:16.614777904 +0000 UTC m=+0.126432227 container remove 91891ac80aeb33d0680de6a31f435600bad33403ad42f1e66a51de8b929b5091 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-c0669abd-aef1-4b0d-9f97-a6adeeac3211, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team) Dec 15 05:07:16 localhost ovn_controller[154603]: 2025-12-15T10:07:16Z|00353|binding|INFO|Releasing lport 5a4cf37c-37ea-429b-9caa-6e3e638f1ab2 from this chassis (sb_readonly=0) Dec 15 05:07:16 localhost ovn_controller[154603]: 2025-12-15T10:07:16Z|00354|binding|INFO|Setting lport 5a4cf37c-37ea-429b-9caa-6e3e638f1ab2 down in Southbound Dec 15 05:07:16 localhost nova_compute[286344]: 2025-12-15 10:07:16.627 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:07:16 localhost kernel: device tap5a4cf37c-37 left promiscuous mode Dec 15 05:07:16 localhost nova_compute[286344]: 2025-12-15 10:07:16.647 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:07:16 localhost ovn_metadata_agent[160585]: 2025-12-15 10:07:16.670 160590 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005559462.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], 
external_ids={'neutron:cidrs': '10.100.0.3/28 2001:db8::f816:3eff:fe43:4a2/64', 'neutron:device_id': 'dhcpce287b4b-a043-5ed9-8e08-4fd555d768a2-c0669abd-aef1-4b0d-9f97-a6adeeac3211', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-c0669abd-aef1-4b0d-9f97-a6adeeac3211', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '89e710ef9f4f48d48a369002db572947', 'neutron:revision_number': '16', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'np0005559462.localdomain'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=e47821dc-5f5d-44dc-8a16-54817df4049d, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=5a4cf37c-37ea-429b-9caa-6e3e638f1ab2) old=Port_Binding(up=[True], chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Dec 15 05:07:16 localhost ovn_metadata_agent[160585]: 2025-12-15 10:07:16.672 160590 INFO neutron.agent.ovn.metadata.agent [-] Port 5a4cf37c-37ea-429b-9caa-6e3e638f1ab2 in datapath c0669abd-aef1-4b0d-9f97-a6adeeac3211 unbound from our chassis#033[00m Dec 15 05:07:16 localhost ovn_metadata_agent[160585]: 2025-12-15 10:07:16.676 160590 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network c0669abd-aef1-4b0d-9f97-a6adeeac3211, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m Dec 15 05:07:16 localhost ovn_metadata_agent[160585]: 2025-12-15 10:07:16.677 160858 DEBUG oslo.privsep.daemon [-] privsep: reply[2a90ba88-0f41-421b-8a01-8baef0d67a72]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Dec 15 05:07:16 localhost neutron_dhcp_agent[267542]: 2025-12-15 10:07:16.919 267546 INFO neutron.agent.dhcp.agent 
[None req-217e38da-d4f2-43ae-a2c6-200b502eb821 - - - - - -] Network not present, action: clean_devices, action_kwargs: {}#033[00m Dec 15 05:07:16 localhost dnsmasq[326869]: read /var/lib/neutron/dhcp/89ceb4c2-ef6d-4611-b0ca-45fd8611536f/addn_hosts - 0 addresses Dec 15 05:07:16 localhost dnsmasq-dhcp[326869]: read /var/lib/neutron/dhcp/89ceb4c2-ef6d-4611-b0ca-45fd8611536f/host Dec 15 05:07:16 localhost dnsmasq-dhcp[326869]: read /var/lib/neutron/dhcp/89ceb4c2-ef6d-4611-b0ca-45fd8611536f/opts Dec 15 05:07:16 localhost podman[326996]: 2025-12-15 10:07:16.967777393 +0000 UTC m=+0.069211148 container kill 49422fefdef265b829bed4dee2315786e43e4f42468c6656fad3e1d7b33bdc79 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-89ceb4c2-ef6d-4611-b0ca-45fd8611536f, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image) Dec 15 05:07:16 localhost systemd[1]: var-lib-containers-storage-overlay-5a5746c3dfce3eae41116dc0cd76b95e3f017ab939047a9c3652414a1b5c7d55-merged.mount: Deactivated successfully. Dec 15 05:07:16 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-91891ac80aeb33d0680de6a31f435600bad33403ad42f1e66a51de8b929b5091-userdata-shm.mount: Deactivated successfully. Dec 15 05:07:16 localhost systemd[1]: run-netns-qdhcp\x2dc0669abd\x2daef1\x2d4b0d\x2d9f97\x2da6adeeac3211.mount: Deactivated successfully. 
Dec 15 05:07:16 localhost neutron_dhcp_agent[267542]: 2025-12-15 10:07:16.998 267546 ERROR neutron.agent.dhcp.agent [None req-2a995ef5-897e-4f45-8ca3-246075af8034 - - - - - -] Unable to reload_allocations dhcp for 89ceb4c2-ef6d-4611-b0ca-45fd8611536f.: neutron.privileged.agent.linux.ip_lib.NetworkInterfaceNotFound: Network interface tap34c2fd24-fb not found in namespace qdhcp-89ceb4c2-ef6d-4611-b0ca-45fd8611536f. Dec 15 05:07:17 localhost neutron_dhcp_agent[267542]: 2025-12-15 10:07:16.998 267546 ERROR neutron.agent.dhcp.agent Traceback (most recent call last): Dec 15 05:07:17 localhost neutron_dhcp_agent[267542]: 2025-12-15 10:07:16.998 267546 ERROR neutron.agent.dhcp.agent File "/usr/lib/python3.9/site-packages/neutron/agent/dhcp/agent.py", line 264, in _call_driver Dec 15 05:07:17 localhost neutron_dhcp_agent[267542]: 2025-12-15 10:07:16.998 267546 ERROR neutron.agent.dhcp.agent rv = getattr(driver, action)(**action_kwargs) Dec 15 05:07:17 localhost neutron_dhcp_agent[267542]: 2025-12-15 10:07:16.998 267546 ERROR neutron.agent.dhcp.agent File "/usr/lib/python3.9/site-packages/neutron/agent/linux/dhcp.py", line 673, in reload_allocations Dec 15 05:07:17 localhost neutron_dhcp_agent[267542]: 2025-12-15 10:07:16.998 267546 ERROR neutron.agent.dhcp.agent self.device_manager.update(self.network, self.interface_name) Dec 15 05:07:17 localhost neutron_dhcp_agent[267542]: 2025-12-15 10:07:16.998 267546 ERROR neutron.agent.dhcp.agent File "/usr/lib/python3.9/site-packages/neutron/agent/linux/dhcp.py", line 1899, in update Dec 15 05:07:17 localhost neutron_dhcp_agent[267542]: 2025-12-15 10:07:16.998 267546 ERROR neutron.agent.dhcp.agent self._set_default_route(network, device_name) Dec 15 05:07:17 localhost neutron_dhcp_agent[267542]: 2025-12-15 10:07:16.998 267546 ERROR neutron.agent.dhcp.agent File "/usr/lib/python3.9/site-packages/neutron/agent/linux/dhcp.py", line 1610, in _set_default_route Dec 15 05:07:17 localhost neutron_dhcp_agent[267542]: 2025-12-15 10:07:16.998 
267546 ERROR neutron.agent.dhcp.agent self._set_default_route_ip_version(network, device_name, Dec 15 05:07:17 localhost neutron_dhcp_agent[267542]: 2025-12-15 10:07:16.998 267546 ERROR neutron.agent.dhcp.agent File "/usr/lib/python3.9/site-packages/neutron/agent/linux/dhcp.py", line 1539, in _set_default_route_ip_version Dec 15 05:07:17 localhost neutron_dhcp_agent[267542]: 2025-12-15 10:07:16.998 267546 ERROR neutron.agent.dhcp.agent gateway = device.route.get_gateway(ip_version=ip_version) Dec 15 05:07:17 localhost neutron_dhcp_agent[267542]: 2025-12-15 10:07:16.998 267546 ERROR neutron.agent.dhcp.agent File "/usr/lib/python3.9/site-packages/neutron/agent/linux/ip_lib.py", line 671, in get_gateway Dec 15 05:07:17 localhost neutron_dhcp_agent[267542]: 2025-12-15 10:07:16.998 267546 ERROR neutron.agent.dhcp.agent routes = self.list_routes(ip_version, scope=scope, table=table) Dec 15 05:07:17 localhost neutron_dhcp_agent[267542]: 2025-12-15 10:07:16.998 267546 ERROR neutron.agent.dhcp.agent File "/usr/lib/python3.9/site-packages/neutron/agent/linux/ip_lib.py", line 656, in list_routes Dec 15 05:07:17 localhost neutron_dhcp_agent[267542]: 2025-12-15 10:07:16.998 267546 ERROR neutron.agent.dhcp.agent return list_ip_routes(self._parent.namespace, ip_version, scope=scope, Dec 15 05:07:17 localhost neutron_dhcp_agent[267542]: 2025-12-15 10:07:16.998 267546 ERROR neutron.agent.dhcp.agent File "/usr/lib/python3.9/site-packages/neutron/agent/linux/ip_lib.py", line 1611, in list_ip_routes Dec 15 05:07:17 localhost neutron_dhcp_agent[267542]: 2025-12-15 10:07:16.998 267546 ERROR neutron.agent.dhcp.agent routes = privileged.list_ip_routes(namespace, ip_version, device=device, Dec 15 05:07:17 localhost neutron_dhcp_agent[267542]: 2025-12-15 10:07:16.998 267546 ERROR neutron.agent.dhcp.agent File "/usr/lib/python3.9/site-packages/tenacity/__init__.py", line 333, in wrapped_f Dec 15 05:07:17 localhost neutron_dhcp_agent[267542]: 2025-12-15 10:07:16.998 267546 ERROR 
neutron.agent.dhcp.agent return self(f, *args, **kw) Dec 15 05:07:17 localhost neutron_dhcp_agent[267542]: 2025-12-15 10:07:16.998 267546 ERROR neutron.agent.dhcp.agent File "/usr/lib/python3.9/site-packages/tenacity/__init__.py", line 423, in __call__ Dec 15 05:07:17 localhost neutron_dhcp_agent[267542]: 2025-12-15 10:07:16.998 267546 ERROR neutron.agent.dhcp.agent do = self.iter(retry_state=retry_state) Dec 15 05:07:17 localhost neutron_dhcp_agent[267542]: 2025-12-15 10:07:16.998 267546 ERROR neutron.agent.dhcp.agent File "/usr/lib/python3.9/site-packages/tenacity/__init__.py", line 360, in iter Dec 15 05:07:17 localhost neutron_dhcp_agent[267542]: 2025-12-15 10:07:16.998 267546 ERROR neutron.agent.dhcp.agent return fut.result() Dec 15 05:07:17 localhost neutron_dhcp_agent[267542]: 2025-12-15 10:07:16.998 267546 ERROR neutron.agent.dhcp.agent File "/usr/lib64/python3.9/concurrent/futures/_base.py", line 439, in result Dec 15 05:07:17 localhost neutron_dhcp_agent[267542]: 2025-12-15 10:07:16.998 267546 ERROR neutron.agent.dhcp.agent return self.__get_result() Dec 15 05:07:17 localhost neutron_dhcp_agent[267542]: 2025-12-15 10:07:16.998 267546 ERROR neutron.agent.dhcp.agent File "/usr/lib64/python3.9/concurrent/futures/_base.py", line 391, in __get_result Dec 15 05:07:17 localhost neutron_dhcp_agent[267542]: 2025-12-15 10:07:16.998 267546 ERROR neutron.agent.dhcp.agent raise self._exception Dec 15 05:07:17 localhost neutron_dhcp_agent[267542]: 2025-12-15 10:07:16.998 267546 ERROR neutron.agent.dhcp.agent File "/usr/lib/python3.9/site-packages/tenacity/__init__.py", line 426, in __call__ Dec 15 05:07:17 localhost neutron_dhcp_agent[267542]: 2025-12-15 10:07:16.998 267546 ERROR neutron.agent.dhcp.agent result = fn(*args, **kwargs) Dec 15 05:07:17 localhost neutron_dhcp_agent[267542]: 2025-12-15 10:07:16.998 267546 ERROR neutron.agent.dhcp.agent File "/usr/lib/python3.9/site-packages/oslo_privsep/priv_context.py", line 271, in _wrap Dec 15 05:07:17 localhost 
neutron_dhcp_agent[267542]: 2025-12-15 10:07:16.998 267546 ERROR neutron.agent.dhcp.agent return self.channel.remote_call(name, args, kwargs, Dec 15 05:07:17 localhost neutron_dhcp_agent[267542]: 2025-12-15 10:07:16.998 267546 ERROR neutron.agent.dhcp.agent File "/usr/lib/python3.9/site-packages/oslo_privsep/daemon.py", line 215, in remote_call Dec 15 05:07:17 localhost neutron_dhcp_agent[267542]: 2025-12-15 10:07:16.998 267546 ERROR neutron.agent.dhcp.agent raise exc_type(*result[2]) Dec 15 05:07:17 localhost neutron_dhcp_agent[267542]: 2025-12-15 10:07:16.998 267546 ERROR neutron.agent.dhcp.agent neutron.privileged.agent.linux.ip_lib.NetworkInterfaceNotFound: Network interface tap34c2fd24-fb not found in namespace qdhcp-89ceb4c2-ef6d-4611-b0ca-45fd8611536f. Dec 15 05:07:17 localhost neutron_dhcp_agent[267542]: 2025-12-15 10:07:16.998 267546 ERROR neutron.agent.dhcp.agent #033[00m Dec 15 05:07:17 localhost neutron_dhcp_agent[267542]: 2025-12-15 10:07:17.001 267546 INFO neutron.agent.dhcp.agent [None req-4aa54636-cc33-4055-9493-70e26ed1186d - - - - - -] Synchronizing state#033[00m Dec 15 05:07:17 localhost nova_compute[286344]: 2025-12-15 10:07:17.036 286348 DEBUG nova.network.neutron [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] [instance: 39ff1bd9-6f6b-44c8-bbec-a1fd9d196359] Updating instance_info_cache with network_info: [{"id": "03ef8889-3216-43fb-8a52-4be17a956ce1", "address": "fa:16:3e:74:df:7c", "network": {"id": "befb7a72-17a9-4bcb-b561-84b8f626685a", "bridge": "br-int", "label": "private", "subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.201", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.20", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.0.3"}}], "meta": {"injected": false, "tenant_id": 
"c785bf23f53946bc99867d8832a50266", "mtu": 1292, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap03ef8889-32", "ovs_interfaceid": "03ef8889-3216-43fb-8a52-4be17a956ce1", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m Dec 15 05:07:17 localhost nova_compute[286344]: 2025-12-15 10:07:17.052 286348 DEBUG oslo_concurrency.lockutils [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Releasing lock "refresh_cache-39ff1bd9-6f6b-44c8-bbec-a1fd9d196359" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m Dec 15 05:07:17 localhost nova_compute[286344]: 2025-12-15 10:07:17.052 286348 DEBUG nova.compute.manager [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] [instance: 39ff1bd9-6f6b-44c8-bbec-a1fd9d196359] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m Dec 15 05:07:17 localhost nova_compute[286344]: 2025-12-15 10:07:17.053 286348 DEBUG oslo_service.periodic_task [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Dec 15 05:07:17 localhost neutron_dhcp_agent[267542]: 2025-12-15 10:07:17.213 267546 INFO neutron.agent.dhcp.agent [None req-30075567-0dd8-41c3-98b9-712c932ad94e - - - - - -] All active networks have been fetched through RPC.#033[00m Dec 15 05:07:17 localhost neutron_dhcp_agent[267542]: 2025-12-15 10:07:17.214 267546 INFO neutron.agent.dhcp.agent [-] Starting network 89ceb4c2-ef6d-4611-b0ca-45fd8611536f dhcp 
configuration#033[00m Dec 15 05:07:17 localhost neutron_dhcp_agent[267542]: 2025-12-15 10:07:17.214 267546 INFO neutron.agent.dhcp.agent [-] Finished network 89ceb4c2-ef6d-4611-b0ca-45fd8611536f dhcp configuration#033[00m Dec 15 05:07:17 localhost neutron_dhcp_agent[267542]: 2025-12-15 10:07:17.215 267546 INFO neutron.agent.dhcp.agent [-] Starting network c0669abd-aef1-4b0d-9f97-a6adeeac3211 dhcp configuration#033[00m Dec 15 05:07:17 localhost neutron_dhcp_agent[267542]: 2025-12-15 10:07:17.215 267546 INFO neutron.agent.dhcp.agent [-] Finished network c0669abd-aef1-4b0d-9f97-a6adeeac3211 dhcp configuration#033[00m Dec 15 05:07:17 localhost neutron_dhcp_agent[267542]: 2025-12-15 10:07:17.216 267546 INFO neutron.agent.dhcp.agent [None req-30075567-0dd8-41c3-98b9-712c932ad94e - - - - - -] Synchronizing state complete#033[00m Dec 15 05:07:17 localhost ceph-mon[298913]: mon.np0005559462@0(leader).osd e177 do_prune osdmap full prune enabled Dec 15 05:07:17 localhost ceph-mon[298913]: mon.np0005559462@0(leader).osd e178 e178: 6 total, 6 up, 6 in Dec 15 05:07:17 localhost ceph-mon[298913]: log_channel(cluster) log [DBG] : osdmap e178: 6 total, 6 up, 6 in Dec 15 05:07:17 localhost podman[327026]: 2025-12-15 10:07:17.660029536 +0000 UTC m=+0.065854796 container kill 49422fefdef265b829bed4dee2315786e43e4f42468c6656fad3e1d7b33bdc79 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-89ceb4c2-ef6d-4611-b0ca-45fd8611536f, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team) Dec 15 05:07:17 localhost dnsmasq[326869]: exiting on receipt of SIGTERM Dec 15 05:07:17 localhost systemd[1]: 
libpod-49422fefdef265b829bed4dee2315786e43e4f42468c6656fad3e1d7b33bdc79.scope: Deactivated successfully. Dec 15 05:07:17 localhost podman[327040]: 2025-12-15 10:07:17.737551948 +0000 UTC m=+0.061191947 container died 49422fefdef265b829bed4dee2315786e43e4f42468c6656fad3e1d7b33bdc79 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-89ceb4c2-ef6d-4611-b0ca-45fd8611536f, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3) Dec 15 05:07:17 localhost podman[327040]: 2025-12-15 10:07:17.770194668 +0000 UTC m=+0.093834627 container cleanup 49422fefdef265b829bed4dee2315786e43e4f42468c6656fad3e1d7b33bdc79 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-89ceb4c2-ef6d-4611-b0ca-45fd8611536f, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, tcib_managed=true) Dec 15 05:07:17 localhost ovn_controller[154603]: 2025-12-15T10:07:17Z|00355|binding|INFO|Releasing lport b35254ad-12eb-47bb-92be-44fefe0694f0 from this chassis (sb_readonly=0) Dec 15 05:07:17 localhost systemd[1]: libpod-conmon-49422fefdef265b829bed4dee2315786e43e4f42468c6656fad3e1d7b33bdc79.scope: Deactivated successfully. 
Dec 15 05:07:17 localhost nova_compute[286344]: 2025-12-15 10:07:17.822 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:07:17 localhost podman[327042]: 2025-12-15 10:07:17.824843538 +0000 UTC m=+0.141723724 container remove 49422fefdef265b829bed4dee2315786e43e4f42468c6656fad3e1d7b33bdc79 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-89ceb4c2-ef6d-4611-b0ca-45fd8611536f, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb) Dec 15 05:07:17 localhost systemd[1]: var-lib-containers-storage-overlay-c8dd47ed73c6994023fe69830b445629caff824e12e109beef57adad9e6c8660-merged.mount: Deactivated successfully. Dec 15 05:07:17 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-49422fefdef265b829bed4dee2315786e43e4f42468c6656fad3e1d7b33bdc79-userdata-shm.mount: Deactivated successfully. Dec 15 05:07:17 localhost systemd[1]: run-netns-qdhcp\x2d89ceb4c2\x2def6d\x2d4611\x2db0ca\x2d45fd8611536f.mount: Deactivated successfully. 
Dec 15 05:07:18 localhost nova_compute[286344]: 2025-12-15 10:07:18.048 286348 DEBUG oslo_service.periodic_task [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Dec 15 05:07:18 localhost nova_compute[286344]: 2025-12-15 10:07:18.322 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:07:18 localhost ceph-mon[298913]: mon.np0005559462@0(leader).osd e178 do_prune osdmap full prune enabled Dec 15 05:07:18 localhost ceph-mon[298913]: mon.np0005559462@0(leader).osd e179 e179: 6 total, 6 up, 6 in Dec 15 05:07:18 localhost ceph-mon[298913]: log_channel(cluster) log [DBG] : osdmap e179: 6 total, 6 up, 6 in Dec 15 05:07:18 localhost ovn_metadata_agent[160585]: 2025-12-15 10:07:18.885 160590 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:04:2e:5c 2001:db8::f816:3eff:fe04:2e5c'], port_security=[], type=localport, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': ''}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8::f816:3eff:fe04:2e5c/64', 'neutron:device_id': 'ovnmeta-c0669abd-aef1-4b0d-9f97-a6adeeac3211', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-c0669abd-aef1-4b0d-9f97-a6adeeac3211', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '89e710ef9f4f48d48a369002db572947', 'neutron:revision_number': '18', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], 
additional_encap=[], encap=[], mirror_rules=[], datapath=e47821dc-5f5d-44dc-8a16-54817df4049d, chassis=[], tunnel_key=1, gateway_chassis=[], requested_chassis=[], logical_port=79503367-f53f-4b35-8760-76fcaa4d8407) old=Port_Binding(mac=['fa:16:3e:04:2e:5c 10.100.0.2 2001:db8::f816:3eff:fe04:2e5c'], external_ids={'neutron:cidrs': '10.100.0.2/28 2001:db8::f816:3eff:fe04:2e5c/64', 'neutron:device_id': 'ovnmeta-c0669abd-aef1-4b0d-9f97-a6adeeac3211', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-c0669abd-aef1-4b0d-9f97-a6adeeac3211', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '89e710ef9f4f48d48a369002db572947', 'neutron:revision_number': '15', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Dec 15 05:07:18 localhost ovn_metadata_agent[160585]: 2025-12-15 10:07:18.887 160590 INFO neutron.agent.ovn.metadata.agent [-] Metadata Port 79503367-f53f-4b35-8760-76fcaa4d8407 in datapath c0669abd-aef1-4b0d-9f97-a6adeeac3211 updated#033[00m Dec 15 05:07:18 localhost ovn_metadata_agent[160585]: 2025-12-15 10:07:18.890 160590 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network c0669abd-aef1-4b0d-9f97-a6adeeac3211, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m Dec 15 05:07:18 localhost ovn_metadata_agent[160585]: 2025-12-15 10:07:18.891 160858 DEBUG oslo.privsep.daemon [-] privsep: reply[629bb311-a2f0-48c7-ac00-92724fcca66a]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Dec 15 05:07:19 localhost nova_compute[286344]: 2025-12-15 10:07:19.270 286348 DEBUG oslo_service.periodic_task [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Running 
periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Dec 15 05:07:19 localhost nova_compute[286344]: 2025-12-15 10:07:19.270 286348 DEBUG nova.compute.manager [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m Dec 15 05:07:19 localhost nova_compute[286344]: 2025-12-15 10:07:19.746 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:07:20 localhost neutron_sriov_agent[260044]: 2025-12-15 10:07:20.211 2 INFO neutron.agent.securitygroups_rpc [None req-411b4924-5993-489c-a5b8-4e867c1749db 6b5da6f221214afe93e1fa66574f238b 89e710ef9f4f48d48a369002db572947 - - default default] Security group member updated ['a6c5f808-dddc-4f17-acbf-63b1b6e6f4d6']#033[00m Dec 15 05:07:20 localhost nova_compute[286344]: 2025-12-15 10:07:20.270 286348 DEBUG oslo_service.periodic_task [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Dec 15 05:07:20 localhost nova_compute[286344]: 2025-12-15 10:07:20.291 286348 DEBUG oslo_concurrency.lockutils [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Dec 15 05:07:20 localhost nova_compute[286344]: 2025-12-15 10:07:20.291 286348 DEBUG oslo_concurrency.lockutils [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner 
/usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Dec 15 05:07:20 localhost nova_compute[286344]: 2025-12-15 10:07:20.292 286348 DEBUG oslo_concurrency.lockutils [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Dec 15 05:07:20 localhost nova_compute[286344]: 2025-12-15 10:07:20.292 286348 DEBUG nova.compute.resource_tracker [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Auditing locally available compute resources for np0005559462.localdomain (node: np0005559462.localdomain) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m Dec 15 05:07:20 localhost nova_compute[286344]: 2025-12-15 10:07:20.292 286348 DEBUG oslo_concurrency.processutils [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m Dec 15 05:07:20 localhost ceph-mon[298913]: mon.np0005559462@0(leader).osd e179 do_prune osdmap full prune enabled Dec 15 05:07:20 localhost ceph-mon[298913]: mon.np0005559462@0(leader).osd e180 e180: 6 total, 6 up, 6 in Dec 15 05:07:20 localhost ceph-mon[298913]: log_channel(cluster) log [DBG] : osdmap e180: 6 total, 6 up, 6 in Dec 15 05:07:20 localhost neutron_dhcp_agent[267542]: 2025-12-15 10:07:20.599 267546 INFO neutron.agent.linux.ip_lib [None req-c0d13db7-da5f-4abf-9dfa-827600c51a36 - - - - - -] Device tapcf0610ac-db cannot be used as it has no MAC address#033[00m Dec 15 05:07:20 localhost nova_compute[286344]: 2025-12-15 10:07:20.630 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:07:20 
localhost kernel: device tapcf0610ac-db entered promiscuous mode Dec 15 05:07:20 localhost ovn_controller[154603]: 2025-12-15T10:07:20Z|00356|binding|INFO|Claiming lport cf0610ac-dbe5-4403-b0be-c514631a654a for this chassis. Dec 15 05:07:20 localhost ovn_controller[154603]: 2025-12-15T10:07:20Z|00357|binding|INFO|cf0610ac-dbe5-4403-b0be-c514631a654a: Claiming unknown Dec 15 05:07:20 localhost NetworkManager[5963]: [1765793240.6373] manager: (tapcf0610ac-db): new Generic device (/org/freedesktop/NetworkManager/Devices/57) Dec 15 05:07:20 localhost systemd-udevd[327099]: Network interface NamePolicy= disabled on kernel command line. Dec 15 05:07:20 localhost nova_compute[286344]: 2025-12-15 10:07:20.639 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:07:20 localhost ovn_metadata_agent[160585]: 2025-12-15 10:07:20.651 160590 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005559462.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8::f816:3eff:fe64:a40e/64', 'neutron:device_id': 'dhcpce287b4b-a043-5ed9-8e08-4fd555d768a2-c0669abd-aef1-4b0d-9f97-a6adeeac3211', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-c0669abd-aef1-4b0d-9f97-a6adeeac3211', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '89e710ef9f4f48d48a369002db572947', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], 
encap=[], mirror_rules=[], datapath=e47821dc-5f5d-44dc-8a16-54817df4049d, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=cf0610ac-dbe5-4403-b0be-c514631a654a) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Dec 15 05:07:20 localhost ovn_metadata_agent[160585]: 2025-12-15 10:07:20.653 160590 INFO neutron.agent.ovn.metadata.agent [-] Port cf0610ac-dbe5-4403-b0be-c514631a654a in datapath c0669abd-aef1-4b0d-9f97-a6adeeac3211 bound to our chassis#033[00m Dec 15 05:07:20 localhost ovn_metadata_agent[160585]: 2025-12-15 10:07:20.658 160590 DEBUG neutron.agent.ovn.metadata.agent [-] Port c5768ef4-d76a-4308-8c2a-4bfdb87a4398 IP addresses were not retrieved from the Port_Binding MAC column ['unknown'] _get_port_ips /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:536#033[00m Dec 15 05:07:20 localhost ovn_metadata_agent[160585]: 2025-12-15 10:07:20.658 160590 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network c0669abd-aef1-4b0d-9f97-a6adeeac3211, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m Dec 15 05:07:20 localhost ovn_metadata_agent[160585]: 2025-12-15 10:07:20.659 160858 DEBUG oslo.privsep.daemon [-] privsep: reply[663d2190-847c-4191-9760-6b9c9f4da3e3]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Dec 15 05:07:20 localhost nova_compute[286344]: 2025-12-15 10:07:20.679 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:07:20 localhost ovn_controller[154603]: 2025-12-15T10:07:20Z|00358|binding|INFO|Setting lport cf0610ac-dbe5-4403-b0be-c514631a654a ovn-installed in OVS Dec 15 05:07:20 localhost ovn_controller[154603]: 2025-12-15T10:07:20Z|00359|binding|INFO|Setting lport 
cf0610ac-dbe5-4403-b0be-c514631a654a up in Southbound Dec 15 05:07:20 localhost nova_compute[286344]: 2025-12-15 10:07:20.691 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:07:20 localhost nova_compute[286344]: 2025-12-15 10:07:20.722 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:07:20 localhost nova_compute[286344]: 2025-12-15 10:07:20.728 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:07:20 localhost ceph-mon[298913]: mon.np0005559462@0(leader) e17 handle_command mon_command({"prefix": "df", "format": "json"} v 0) Dec 15 05:07:20 localhost ceph-mon[298913]: log_channel(audit) log [DBG] : from='client.? 172.18.0.106:0/2826833802' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch Dec 15 05:07:20 localhost nova_compute[286344]: 2025-12-15 10:07:20.753 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:07:20 localhost nova_compute[286344]: 2025-12-15 10:07:20.764 286348 DEBUG oslo_concurrency.processutils [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.472s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m Dec 15 05:07:20 localhost neutron_dhcp_agent[267542]: 2025-12-15 10:07:20.780 267546 INFO neutron.agent.linux.ip_lib [None req-514c1da5-b711-4677-a97a-46d9cbe8da58 - - - - - -] Device tap50e91b62-3c cannot be used as it has no MAC address#033[00m Dec 15 05:07:20 localhost nova_compute[286344]: 2025-12-15 10:07:20.832 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup 
/usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:07:20 localhost kernel: device tap50e91b62-3c entered promiscuous mode Dec 15 05:07:20 localhost NetworkManager[5963]: [1765793240.8395] manager: (tap50e91b62-3c): new Generic device (/org/freedesktop/NetworkManager/Devices/58) Dec 15 05:07:20 localhost ovn_controller[154603]: 2025-12-15T10:07:20Z|00360|binding|INFO|Claiming lport 50e91b62-3c39-4cdf-af88-b63051c0d313 for this chassis. Dec 15 05:07:20 localhost ovn_controller[154603]: 2025-12-15T10:07:20Z|00361|binding|INFO|50e91b62-3c39-4cdf-af88-b63051c0d313: Claiming unknown Dec 15 05:07:20 localhost nova_compute[286344]: 2025-12-15 10:07:20.841 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:07:20 localhost ovn_metadata_agent[160585]: 2025-12-15 10:07:20.872 160590 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005559462.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 'dhcpce287b4b-a043-5ed9-8e08-4fd555d768a2-2297edbd-f908-4486-afde-70d6e7369121', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-2297edbd-f908-4486-afde-70d6e7369121', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '5a302f70917d47c392ee5c9b50e38a7e', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], 
datapath=4d07c10c-00eb-46c3-8c5b-11a0cb0572db, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=50e91b62-3c39-4cdf-af88-b63051c0d313) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Dec 15 05:07:20 localhost ovn_controller[154603]: 2025-12-15T10:07:20Z|00362|binding|INFO|Setting lport 50e91b62-3c39-4cdf-af88-b63051c0d313 ovn-installed in OVS Dec 15 05:07:20 localhost ovn_controller[154603]: 2025-12-15T10:07:20Z|00363|binding|INFO|Setting lport 50e91b62-3c39-4cdf-af88-b63051c0d313 up in Southbound Dec 15 05:07:20 localhost ovn_metadata_agent[160585]: 2025-12-15 10:07:20.875 160590 INFO neutron.agent.ovn.metadata.agent [-] Port 50e91b62-3c39-4cdf-af88-b63051c0d313 in datapath 2297edbd-f908-4486-afde-70d6e7369121 bound to our chassis#033[00m Dec 15 05:07:20 localhost nova_compute[286344]: 2025-12-15 10:07:20.875 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:07:20 localhost ovn_metadata_agent[160585]: 2025-12-15 10:07:20.877 160590 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network 2297edbd-f908-4486-afde-70d6e7369121 or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599#033[00m Dec 15 05:07:20 localhost nova_compute[286344]: 2025-12-15 10:07:20.876 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:07:20 localhost ovn_metadata_agent[160585]: 2025-12-15 10:07:20.878 160858 DEBUG oslo.privsep.daemon [-] privsep: reply[dde85ffc-b32e-4362-b221-ea1f522d6c5b]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Dec 15 05:07:20 localhost nova_compute[286344]: 2025-12-15 10:07:20.906 286348 
DEBUG nova.virt.libvirt.driver [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m Dec 15 05:07:20 localhost nova_compute[286344]: 2025-12-15 10:07:20.906 286348 DEBUG nova.virt.libvirt.driver [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m Dec 15 05:07:20 localhost nova_compute[286344]: 2025-12-15 10:07:20.936 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:07:20 localhost nova_compute[286344]: 2025-12-15 10:07:20.972 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:07:21 localhost nova_compute[286344]: 2025-12-15 10:07:21.130 286348 WARNING nova.virt.libvirt.driver [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] This host appears to have multiple sockets per NUMA node. 
The `socket` PCI NUMA affinity will not be supported.#033[00m Dec 15 05:07:21 localhost nova_compute[286344]: 2025-12-15 10:07:21.134 286348 DEBUG nova.compute.resource_tracker [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Hypervisor/Node resource view: name=np0005559462.localdomain free_ram=11222MB free_disk=41.836978912353516GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", 
"product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m Dec 15 05:07:21 localhost nova_compute[286344]: 2025-12-15 10:07:21.134 286348 DEBUG oslo_concurrency.lockutils [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Dec 15 05:07:21 localhost nova_compute[286344]: 2025-12-15 10:07:21.135 286348 DEBUG oslo_concurrency.lockutils [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Dec 15 05:07:21 localhost ceph-mon[298913]: mon.np0005559462@0(leader).osd e180 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104 Dec 15 05:07:21 localhost nova_compute[286344]: 2025-12-15 10:07:21.235 286348 DEBUG nova.compute.resource_tracker [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Instance 39ff1bd9-6f6b-44c8-bbec-a1fd9d196359 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 2, 'MEMORY_MB': 512, 'VCPU': 1}}. 
_remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m Dec 15 05:07:21 localhost nova_compute[286344]: 2025-12-15 10:07:21.236 286348 DEBUG nova.compute.resource_tracker [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m Dec 15 05:07:21 localhost nova_compute[286344]: 2025-12-15 10:07:21.236 286348 DEBUG nova.compute.resource_tracker [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Final resource view: name=np0005559462.localdomain phys_ram=15738MB used_ram=1024MB phys_disk=41GB used_disk=2GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m Dec 15 05:07:21 localhost neutron_sriov_agent[260044]: 2025-12-15 10:07:21.292 2 INFO neutron.agent.securitygroups_rpc [None req-9c241515-ff94-47b7-8d5a-116430e945ae 6b5da6f221214afe93e1fa66574f238b 89e710ef9f4f48d48a369002db572947 - - default default] Security group member updated ['a6c5f808-dddc-4f17-acbf-63b1b6e6f4d6']#033[00m Dec 15 05:07:21 localhost nova_compute[286344]: 2025-12-15 10:07:21.292 286348 DEBUG oslo_concurrency.processutils [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m Dec 15 05:07:21 localhost sshd[327207]: main: sshd: ssh-rsa algorithm is disabled Dec 15 05:07:21 localhost podman[327218]: Dec 15 05:07:21 localhost ceph-mon[298913]: mon.np0005559462@0(leader) e17 handle_command mon_command({"prefix": "df", "format": "json"} v 0) Dec 15 05:07:21 localhost ceph-mon[298913]: log_channel(audit) log [DBG] : from='client.? 
172.18.0.106:0/1569328174' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch Dec 15 05:07:21 localhost podman[327218]: 2025-12-15 10:07:21.746387179 +0000 UTC m=+0.095094642 container create aeb8b5508497e46e02b4c127f72b147151b11df4b66d1f0d1d08453bf5c8d281 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-c0669abd-aef1-4b0d-9f97-a6adeeac3211, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.license=GPLv2) Dec 15 05:07:21 localhost nova_compute[286344]: 2025-12-15 10:07:21.776 286348 DEBUG oslo_concurrency.processutils [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.484s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m Dec 15 05:07:21 localhost nova_compute[286344]: 2025-12-15 10:07:21.787 286348 DEBUG nova.compute.provider_tree [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Inventory has not changed in ProviderTree for provider: 26c8956b-6742-4951-b566-971b9bbe323b update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m Dec 15 05:07:21 localhost systemd[1]: Started libpod-conmon-aeb8b5508497e46e02b4c127f72b147151b11df4b66d1f0d1d08453bf5c8d281.scope. Dec 15 05:07:21 localhost podman[327218]: 2025-12-15 10:07:21.706749949 +0000 UTC m=+0.055457412 image pull quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified Dec 15 05:07:21 localhost systemd[1]: Started libcrun container. 
Dec 15 05:07:21 localhost systemd[1]: Started /usr/bin/podman healthcheck run 4d46c406a3855fd1c322e5d86c048188ea5865daa07ba23aaebacc7879668923. Dec 15 05:07:21 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/669577eec4ebe88da1c42c5edfe23a6f872e87fd0b715a721a953586f871ec61/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff) Dec 15 05:07:21 localhost podman[327218]: 2025-12-15 10:07:21.834228443 +0000 UTC m=+0.182935896 container init aeb8b5508497e46e02b4c127f72b147151b11df4b66d1f0d1d08453bf5c8d281 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-c0669abd-aef1-4b0d-9f97-a6adeeac3211, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3) Dec 15 05:07:21 localhost nova_compute[286344]: 2025-12-15 10:07:21.842 286348 DEBUG nova.scheduler.client.report [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Inventory has not changed for provider 26c8956b-6742-4951-b566-971b9bbe323b based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m Dec 15 05:07:21 localhost nova_compute[286344]: 2025-12-15 10:07:21.845 286348 DEBUG nova.compute.resource_tracker [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Compute_service record updated for 
np0005559462.localdomain:np0005559462.localdomain _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m Dec 15 05:07:21 localhost nova_compute[286344]: 2025-12-15 10:07:21.845 286348 DEBUG oslo_concurrency.lockutils [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.710s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Dec 15 05:07:21 localhost podman[327218]: 2025-12-15 10:07:21.8462324 +0000 UTC m=+0.194939883 container start aeb8b5508497e46e02b4c127f72b147151b11df4b66d1f0d1d08453bf5c8d281 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-c0669abd-aef1-4b0d-9f97-a6adeeac3211, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251202) Dec 15 05:07:21 localhost dnsmasq[327258]: started, version 2.85 cachesize 150 Dec 15 05:07:21 localhost dnsmasq[327258]: DNS service limited to local subnets Dec 15 05:07:21 localhost dnsmasq[327258]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile Dec 15 05:07:21 localhost dnsmasq[327258]: warning: no upstream servers configured Dec 15 05:07:21 localhost dnsmasq[327258]: read /var/lib/neutron/dhcp/c0669abd-aef1-4b0d-9f97-a6adeeac3211/addn_hosts - 0 addresses Dec 15 05:07:21 localhost podman[327240]: 2025-12-15 10:07:21.909559796 +0000 UTC m=+0.080261518 container health_status 4d46c406a3855fd1c322e5d86c048188ea5865daa07ba23aaebacc7879668923 
(image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '07f3af15d79276418f54a8dde2fc251064527c3571430e14af77aaa2c01f1193-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, managed_by=edpm_ansible) Dec 15 05:07:21 localhost podman[327240]: 2025-12-15 10:07:21.993468503 +0000 UTC m=+0.164170235 container exec_died 4d46c406a3855fd1c322e5d86c048188ea5865daa07ba23aaebacc7879668923 
(image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '07f3af15d79276418f54a8dde2fc251064527c3571430e14af77aaa2c01f1193-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}) Dec 15 05:07:22 localhost neutron_dhcp_agent[267542]: 2025-12-15 10:07:22.005 267546 INFO neutron.agent.dhcp.agent [None req-9d084096-b610-4e90-ad00-ef229ce5e82d - - - - - -] DHCP configuration for ports {'79503367-f53f-4b35-8760-76fcaa4d8407'} is 
completed#033[00m Dec 15 05:07:22 localhost podman[327277]: Dec 15 05:07:22 localhost systemd[1]: 4d46c406a3855fd1c322e5d86c048188ea5865daa07ba23aaebacc7879668923.service: Deactivated successfully. Dec 15 05:07:22 localhost podman[327277]: 2025-12-15 10:07:22.020739325 +0000 UTC m=+0.080625677 container create 9f582b4998af378a7182e23186c368e1d2c51df6bff8de0df16ed8a494757aab (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-2297edbd-f908-4486-afde-70d6e7369121, org.label-schema.license=GPLv2, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3) Dec 15 05:07:22 localhost systemd[1]: Started libpod-conmon-9f582b4998af378a7182e23186c368e1d2c51df6bff8de0df16ed8a494757aab.scope. Dec 15 05:07:22 localhost systemd[1]: Started libcrun container. 
Dec 15 05:07:22 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/593486039bd527a71c88ee288aaf006bf83d7d4a41f4fae83efeecbe775b502b/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff) Dec 15 05:07:22 localhost podman[327277]: 2025-12-15 10:07:22.06603107 +0000 UTC m=+0.125917422 container init 9f582b4998af378a7182e23186c368e1d2c51df6bff8de0df16ed8a494757aab (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-2297edbd-f908-4486-afde-70d6e7369121, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb) Dec 15 05:07:22 localhost podman[327277]: 2025-12-15 10:07:22.073000179 +0000 UTC m=+0.132886541 container start 9f582b4998af378a7182e23186c368e1d2c51df6bff8de0df16ed8a494757aab (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-2297edbd-f908-4486-afde-70d6e7369121, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.license=GPLv2, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image) Dec 15 05:07:22 localhost podman[327277]: 2025-12-15 10:07:21.975689837 +0000 UTC m=+0.035576279 image pull quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified Dec 15 05:07:22 localhost dnsmasq[327309]: started, version 2.85 cachesize 150 Dec 15 05:07:22 localhost dnsmasq[327309]: DNS service limited to local subnets Dec 15 05:07:22 localhost dnsmasq[327309]: compile time options: IPv6 GNU-getopt DBus no-UBus 
no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile Dec 15 05:07:22 localhost dnsmasq[327309]: warning: no upstream servers configured Dec 15 05:07:22 localhost dnsmasq-dhcp[327309]: DHCP, static leases only on 10.100.0.0, lease time 1d Dec 15 05:07:22 localhost dnsmasq[327309]: read /var/lib/neutron/dhcp/2297edbd-f908-4486-afde-70d6e7369121/addn_hosts - 0 addresses Dec 15 05:07:22 localhost dnsmasq-dhcp[327309]: read /var/lib/neutron/dhcp/2297edbd-f908-4486-afde-70d6e7369121/host Dec 15 05:07:22 localhost dnsmasq-dhcp[327309]: read /var/lib/neutron/dhcp/2297edbd-f908-4486-afde-70d6e7369121/opts Dec 15 05:07:22 localhost dnsmasq[327258]: exiting on receipt of SIGTERM Dec 15 05:07:22 localhost podman[327313]: 2025-12-15 10:07:22.202863268 +0000 UTC m=+0.060928251 container kill aeb8b5508497e46e02b4c127f72b147151b11df4b66d1f0d1d08453bf5c8d281 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-c0669abd-aef1-4b0d-9f97-a6adeeac3211, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true) Dec 15 05:07:22 localhost systemd[1]: libpod-aeb8b5508497e46e02b4c127f72b147151b11df4b66d1f0d1d08453bf5c8d281.scope: Deactivated successfully. 
Dec 15 05:07:22 localhost ceph-mon[298913]: mon.np0005559462@0(leader).osd e180 do_prune osdmap full prune enabled Dec 15 05:07:22 localhost ceph-mon[298913]: mon.np0005559462@0(leader).osd e181 e181: 6 total, 6 up, 6 in Dec 15 05:07:22 localhost ceph-mon[298913]: log_channel(cluster) log [DBG] : osdmap e181: 6 total, 6 up, 6 in Dec 15 05:07:22 localhost podman[327327]: 2025-12-15 10:07:22.284805872 +0000 UTC m=+0.063314497 container died aeb8b5508497e46e02b4c127f72b147151b11df4b66d1f0d1d08453bf5c8d281 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-c0669abd-aef1-4b0d-9f97-a6adeeac3211, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.build-date=20251202) Dec 15 05:07:22 localhost podman[327327]: 2025-12-15 10:07:22.32035848 +0000 UTC m=+0.098867055 container cleanup aeb8b5508497e46e02b4c127f72b147151b11df4b66d1f0d1d08453bf5c8d281 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-c0669abd-aef1-4b0d-9f97-a6adeeac3211, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2) Dec 15 05:07:22 localhost systemd[1]: libpod-conmon-aeb8b5508497e46e02b4c127f72b147151b11df4b66d1f0d1d08453bf5c8d281.scope: Deactivated successfully. 
Dec 15 05:07:22 localhost podman[327328]: 2025-12-15 10:07:22.356374831 +0000 UTC m=+0.129077438 container remove aeb8b5508497e46e02b4c127f72b147151b11df4b66d1f0d1d08453bf5c8d281 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-c0669abd-aef1-4b0d-9f97-a6adeeac3211, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image) Dec 15 05:07:22 localhost ovn_controller[154603]: 2025-12-15T10:07:22Z|00364|binding|INFO|Releasing lport cf0610ac-dbe5-4403-b0be-c514631a654a from this chassis (sb_readonly=0) Dec 15 05:07:22 localhost ovn_controller[154603]: 2025-12-15T10:07:22Z|00365|binding|INFO|Setting lport cf0610ac-dbe5-4403-b0be-c514631a654a down in Southbound Dec 15 05:07:22 localhost kernel: device tapcf0610ac-db left promiscuous mode Dec 15 05:07:22 localhost nova_compute[286344]: 2025-12-15 10:07:22.370 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:07:22 localhost nova_compute[286344]: 2025-12-15 10:07:22.395 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:07:22 localhost ovn_metadata_agent[160585]: 2025-12-15 10:07:22.452 160590 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005559462.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], 
external_ids={'neutron:cidrs': '2001:db8::f816:3eff:fe64:a40e/64', 'neutron:device_id': 'dhcpce287b4b-a043-5ed9-8e08-4fd555d768a2-c0669abd-aef1-4b0d-9f97-a6adeeac3211', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-c0669abd-aef1-4b0d-9f97-a6adeeac3211', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '89e710ef9f4f48d48a369002db572947', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'np0005559462.localdomain'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=e47821dc-5f5d-44dc-8a16-54817df4049d, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=cf0610ac-dbe5-4403-b0be-c514631a654a) old=Port_Binding(up=[True], chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Dec 15 05:07:22 localhost ovn_metadata_agent[160585]: 2025-12-15 10:07:22.454 160590 INFO neutron.agent.ovn.metadata.agent [-] Port cf0610ac-dbe5-4403-b0be-c514631a654a in datapath c0669abd-aef1-4b0d-9f97-a6adeeac3211 unbound from our chassis#033[00m Dec 15 05:07:22 localhost ovn_metadata_agent[160585]: 2025-12-15 10:07:22.457 160590 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network c0669abd-aef1-4b0d-9f97-a6adeeac3211, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m Dec 15 05:07:22 localhost ovn_metadata_agent[160585]: 2025-12-15 10:07:22.458 160858 DEBUG oslo.privsep.daemon [-] privsep: reply[653116f8-3466-43c9-8332-960a3b2dd591]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Dec 15 05:07:22 localhost neutron_dhcp_agent[267542]: 2025-12-15 10:07:22.476 267546 INFO neutron.agent.dhcp.agent [None 
req-81ae3cc0-69f5-4c4b-a823-fd047b37a9fb - - - - - -] DHCP configuration for ports {'16d4d21c-3a73-46e4-801f-6e63cba6ae1f'} is completed#033[00m Dec 15 05:07:22 localhost ovn_controller[154603]: 2025-12-15T10:07:22Z|00366|binding|INFO|Releasing lport 50e91b62-3c39-4cdf-af88-b63051c0d313 from this chassis (sb_readonly=0) Dec 15 05:07:22 localhost ovn_controller[154603]: 2025-12-15T10:07:22Z|00367|binding|INFO|Setting lport 50e91b62-3c39-4cdf-af88-b63051c0d313 down in Southbound Dec 15 05:07:22 localhost nova_compute[286344]: 2025-12-15 10:07:22.531 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:07:22 localhost kernel: device tap50e91b62-3c left promiscuous mode Dec 15 05:07:22 localhost ovn_metadata_agent[160585]: 2025-12-15 10:07:22.540 160590 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005559462.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 'dhcpce287b4b-a043-5ed9-8e08-4fd555d768a2-2297edbd-f908-4486-afde-70d6e7369121', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-2297edbd-f908-4486-afde-70d6e7369121', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '5a302f70917d47c392ee5c9b50e38a7e', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=4d07c10c-00eb-46c3-8c5b-11a0cb0572db, chassis=[], tunnel_key=2, 
gateway_chassis=[], requested_chassis=[], logical_port=50e91b62-3c39-4cdf-af88-b63051c0d313) old=Port_Binding(up=[True], chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Dec 15 05:07:22 localhost ovn_metadata_agent[160585]: 2025-12-15 10:07:22.544 160590 INFO neutron.agent.ovn.metadata.agent [-] Port 50e91b62-3c39-4cdf-af88-b63051c0d313 in datapath 2297edbd-f908-4486-afde-70d6e7369121 unbound from our chassis#033[00m Dec 15 05:07:22 localhost ovn_metadata_agent[160585]: 2025-12-15 10:07:22.548 160590 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 2297edbd-f908-4486-afde-70d6e7369121, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m Dec 15 05:07:22 localhost ovn_metadata_agent[160585]: 2025-12-15 10:07:22.549 160858 DEBUG oslo.privsep.daemon [-] privsep: reply[15d718a0-99a2-45f6-a8f7-0ef7c9b98d34]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Dec 15 05:07:22 localhost nova_compute[286344]: 2025-12-15 10:07:22.552 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:07:22 localhost systemd[1]: var-lib-containers-storage-overlay-669577eec4ebe88da1c42c5edfe23a6f872e87fd0b715a721a953586f871ec61-merged.mount: Deactivated successfully. Dec 15 05:07:22 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-aeb8b5508497e46e02b4c127f72b147151b11df4b66d1f0d1d08453bf5c8d281-userdata-shm.mount: Deactivated successfully. Dec 15 05:07:22 localhost systemd[1]: run-netns-qdhcp\x2dc0669abd\x2daef1\x2d4b0d\x2d9f97\x2da6adeeac3211.mount: Deactivated successfully. 
Dec 15 05:07:22 localhost neutron_dhcp_agent[267542]: 2025-12-15 10:07:22.996 267546 INFO neutron.agent.dhcp.agent [None req-00fef111-14d6-4fa8-a5de-cde76da0db7d - - - - - -] Network not present, action: clean_devices, action_kwargs: {}#033[00m Dec 15 05:07:22 localhost neutron_dhcp_agent[267542]: 2025-12-15 10:07:22.997 267546 INFO neutron.agent.dhcp.agent [None req-00fef111-14d6-4fa8-a5de-cde76da0db7d - - - - - -] Network not present, action: clean_devices, action_kwargs: {}#033[00m Dec 15 05:07:23 localhost nova_compute[286344]: 2025-12-15 10:07:23.326 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:07:23 localhost ceph-mon[298913]: mon.np0005559462@0(leader).osd e181 do_prune osdmap full prune enabled Dec 15 05:07:23 localhost ceph-mon[298913]: mon.np0005559462@0(leader).osd e182 e182: 6 total, 6 up, 6 in Dec 15 05:07:23 localhost ceph-mon[298913]: log_channel(cluster) log [DBG] : osdmap e182: 6 total, 6 up, 6 in Dec 15 05:07:23 localhost dnsmasq[327309]: read /var/lib/neutron/dhcp/2297edbd-f908-4486-afde-70d6e7369121/addn_hosts - 0 addresses Dec 15 05:07:23 localhost dnsmasq-dhcp[327309]: read /var/lib/neutron/dhcp/2297edbd-f908-4486-afde-70d6e7369121/host Dec 15 05:07:23 localhost dnsmasq-dhcp[327309]: read /var/lib/neutron/dhcp/2297edbd-f908-4486-afde-70d6e7369121/opts Dec 15 05:07:23 localhost podman[327374]: 2025-12-15 10:07:23.677909673 +0000 UTC m=+0.068164968 container kill 9f582b4998af378a7182e23186c368e1d2c51df6bff8de0df16ed8a494757aab (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-2297edbd-f908-4486-afde-70d6e7369121, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, 
tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3) Dec 15 05:07:23 localhost systemd[1]: tmp-crun.a2bfLX.mount: Deactivated successfully. Dec 15 05:07:23 localhost neutron_dhcp_agent[267542]: 2025-12-15 10:07:23.711 267546 ERROR neutron.agent.dhcp.agent [None req-b4ead53a-c062-4951-bdbf-412379cb069d - - - - - -] Unable to reload_allocations dhcp for 2297edbd-f908-4486-afde-70d6e7369121.: neutron.privileged.agent.linux.ip_lib.NetworkInterfaceNotFound: Network interface tap50e91b62-3c not found in namespace qdhcp-2297edbd-f908-4486-afde-70d6e7369121. Dec 15 05:07:23 localhost neutron_dhcp_agent[267542]: 2025-12-15 10:07:23.711 267546 ERROR neutron.agent.dhcp.agent Traceback (most recent call last): Dec 15 05:07:23 localhost neutron_dhcp_agent[267542]: 2025-12-15 10:07:23.711 267546 ERROR neutron.agent.dhcp.agent File "/usr/lib/python3.9/site-packages/neutron/agent/dhcp/agent.py", line 264, in _call_driver Dec 15 05:07:23 localhost neutron_dhcp_agent[267542]: 2025-12-15 10:07:23.711 267546 ERROR neutron.agent.dhcp.agent rv = getattr(driver, action)(**action_kwargs) Dec 15 05:07:23 localhost neutron_dhcp_agent[267542]: 2025-12-15 10:07:23.711 267546 ERROR neutron.agent.dhcp.agent File "/usr/lib/python3.9/site-packages/neutron/agent/linux/dhcp.py", line 673, in reload_allocations Dec 15 05:07:23 localhost neutron_dhcp_agent[267542]: 2025-12-15 10:07:23.711 267546 ERROR neutron.agent.dhcp.agent self.device_manager.update(self.network, self.interface_name) Dec 15 05:07:23 localhost neutron_dhcp_agent[267542]: 2025-12-15 10:07:23.711 267546 ERROR neutron.agent.dhcp.agent File "/usr/lib/python3.9/site-packages/neutron/agent/linux/dhcp.py", line 1899, in update Dec 15 05:07:23 localhost neutron_dhcp_agent[267542]: 2025-12-15 10:07:23.711 267546 ERROR neutron.agent.dhcp.agent self._set_default_route(network, device_name) Dec 15 05:07:23 localhost neutron_dhcp_agent[267542]: 2025-12-15 10:07:23.711 267546 ERROR neutron.agent.dhcp.agent File 
"/usr/lib/python3.9/site-packages/neutron/agent/linux/dhcp.py", line 1610, in _set_default_route Dec 15 05:07:23 localhost neutron_dhcp_agent[267542]: 2025-12-15 10:07:23.711 267546 ERROR neutron.agent.dhcp.agent self._set_default_route_ip_version(network, device_name, Dec 15 05:07:23 localhost neutron_dhcp_agent[267542]: 2025-12-15 10:07:23.711 267546 ERROR neutron.agent.dhcp.agent File "/usr/lib/python3.9/site-packages/neutron/agent/linux/dhcp.py", line 1539, in _set_default_route_ip_version Dec 15 05:07:23 localhost neutron_dhcp_agent[267542]: 2025-12-15 10:07:23.711 267546 ERROR neutron.agent.dhcp.agent gateway = device.route.get_gateway(ip_version=ip_version) Dec 15 05:07:23 localhost neutron_dhcp_agent[267542]: 2025-12-15 10:07:23.711 267546 ERROR neutron.agent.dhcp.agent File "/usr/lib/python3.9/site-packages/neutron/agent/linux/ip_lib.py", line 671, in get_gateway Dec 15 05:07:23 localhost neutron_dhcp_agent[267542]: 2025-12-15 10:07:23.711 267546 ERROR neutron.agent.dhcp.agent routes = self.list_routes(ip_version, scope=scope, table=table) Dec 15 05:07:23 localhost neutron_dhcp_agent[267542]: 2025-12-15 10:07:23.711 267546 ERROR neutron.agent.dhcp.agent File "/usr/lib/python3.9/site-packages/neutron/agent/linux/ip_lib.py", line 656, in list_routes Dec 15 05:07:23 localhost neutron_dhcp_agent[267542]: 2025-12-15 10:07:23.711 267546 ERROR neutron.agent.dhcp.agent return list_ip_routes(self._parent.namespace, ip_version, scope=scope, Dec 15 05:07:23 localhost neutron_dhcp_agent[267542]: 2025-12-15 10:07:23.711 267546 ERROR neutron.agent.dhcp.agent File "/usr/lib/python3.9/site-packages/neutron/agent/linux/ip_lib.py", line 1611, in list_ip_routes Dec 15 05:07:23 localhost neutron_dhcp_agent[267542]: 2025-12-15 10:07:23.711 267546 ERROR neutron.agent.dhcp.agent routes = privileged.list_ip_routes(namespace, ip_version, device=device, Dec 15 05:07:23 localhost neutron_dhcp_agent[267542]: 2025-12-15 10:07:23.711 267546 ERROR neutron.agent.dhcp.agent File 
"/usr/lib/python3.9/site-packages/tenacity/__init__.py", line 333, in wrapped_f Dec 15 05:07:23 localhost neutron_dhcp_agent[267542]: 2025-12-15 10:07:23.711 267546 ERROR neutron.agent.dhcp.agent return self(f, *args, **kw) Dec 15 05:07:23 localhost neutron_dhcp_agent[267542]: 2025-12-15 10:07:23.711 267546 ERROR neutron.agent.dhcp.agent File "/usr/lib/python3.9/site-packages/tenacity/__init__.py", line 423, in __call__ Dec 15 05:07:23 localhost neutron_dhcp_agent[267542]: 2025-12-15 10:07:23.711 267546 ERROR neutron.agent.dhcp.agent do = self.iter(retry_state=retry_state) Dec 15 05:07:23 localhost neutron_dhcp_agent[267542]: 2025-12-15 10:07:23.711 267546 ERROR neutron.agent.dhcp.agent File "/usr/lib/python3.9/site-packages/tenacity/__init__.py", line 360, in iter Dec 15 05:07:23 localhost neutron_dhcp_agent[267542]: 2025-12-15 10:07:23.711 267546 ERROR neutron.agent.dhcp.agent return fut.result() Dec 15 05:07:23 localhost neutron_dhcp_agent[267542]: 2025-12-15 10:07:23.711 267546 ERROR neutron.agent.dhcp.agent File "/usr/lib64/python3.9/concurrent/futures/_base.py", line 439, in result Dec 15 05:07:23 localhost neutron_dhcp_agent[267542]: 2025-12-15 10:07:23.711 267546 ERROR neutron.agent.dhcp.agent return self.__get_result() Dec 15 05:07:23 localhost neutron_dhcp_agent[267542]: 2025-12-15 10:07:23.711 267546 ERROR neutron.agent.dhcp.agent File "/usr/lib64/python3.9/concurrent/futures/_base.py", line 391, in __get_result Dec 15 05:07:23 localhost neutron_dhcp_agent[267542]: 2025-12-15 10:07:23.711 267546 ERROR neutron.agent.dhcp.agent raise self._exception Dec 15 05:07:23 localhost neutron_dhcp_agent[267542]: 2025-12-15 10:07:23.711 267546 ERROR neutron.agent.dhcp.agent File "/usr/lib/python3.9/site-packages/tenacity/__init__.py", line 426, in __call__ Dec 15 05:07:23 localhost neutron_dhcp_agent[267542]: 2025-12-15 10:07:23.711 267546 ERROR neutron.agent.dhcp.agent result = fn(*args, **kwargs) Dec 15 05:07:23 localhost neutron_dhcp_agent[267542]: 2025-12-15 
10:07:23.711 267546 ERROR neutron.agent.dhcp.agent File "/usr/lib/python3.9/site-packages/oslo_privsep/priv_context.py", line 271, in _wrap Dec 15 05:07:23 localhost neutron_dhcp_agent[267542]: 2025-12-15 10:07:23.711 267546 ERROR neutron.agent.dhcp.agent return self.channel.remote_call(name, args, kwargs, Dec 15 05:07:23 localhost neutron_dhcp_agent[267542]: 2025-12-15 10:07:23.711 267546 ERROR neutron.agent.dhcp.agent File "/usr/lib/python3.9/site-packages/oslo_privsep/daemon.py", line 215, in remote_call Dec 15 05:07:23 localhost neutron_dhcp_agent[267542]: 2025-12-15 10:07:23.711 267546 ERROR neutron.agent.dhcp.agent raise exc_type(*result[2]) Dec 15 05:07:23 localhost neutron_dhcp_agent[267542]: 2025-12-15 10:07:23.711 267546 ERROR neutron.agent.dhcp.agent neutron.privileged.agent.linux.ip_lib.NetworkInterfaceNotFound: Network interface tap50e91b62-3c not found in namespace qdhcp-2297edbd-f908-4486-afde-70d6e7369121. Dec 15 05:07:23 localhost neutron_dhcp_agent[267542]: 2025-12-15 10:07:23.711 267546 ERROR neutron.agent.dhcp.agent #033[00m Dec 15 05:07:23 localhost neutron_dhcp_agent[267542]: 2025-12-15 10:07:23.714 267546 INFO neutron.agent.dhcp.agent [None req-30075567-0dd8-41c3-98b9-712c932ad94e - - - - - -] Synchronizing state#033[00m Dec 15 05:07:23 localhost nova_compute[286344]: 2025-12-15 10:07:23.847 286348 DEBUG oslo_service.periodic_task [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Dec 15 05:07:24 localhost neutron_dhcp_agent[267542]: 2025-12-15 10:07:24.168 267546 INFO neutron.agent.dhcp.agent [None req-c70badbc-b682-4004-aeac-5836b78e058c - - - - - -] All active networks have been fetched through RPC.#033[00m Dec 15 05:07:24 localhost neutron_dhcp_agent[267542]: 2025-12-15 10:07:24.169 267546 INFO neutron.agent.dhcp.agent [-] Starting network 2297edbd-f908-4486-afde-70d6e7369121 
dhcp configuration#033[00m Dec 15 05:07:24 localhost neutron_dhcp_agent[267542]: 2025-12-15 10:07:24.173 267546 INFO neutron.agent.dhcp.agent [-] Starting network c0669abd-aef1-4b0d-9f97-a6adeeac3211 dhcp configuration#033[00m Dec 15 05:07:24 localhost dnsmasq[327309]: exiting on receipt of SIGTERM Dec 15 05:07:24 localhost podman[327405]: 2025-12-15 10:07:24.360058041 +0000 UTC m=+0.065892966 container kill 9f582b4998af378a7182e23186c368e1d2c51df6bff8de0df16ed8a494757aab (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-2297edbd-f908-4486-afde-70d6e7369121, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image) Dec 15 05:07:24 localhost systemd[1]: libpod-9f582b4998af378a7182e23186c368e1d2c51df6bff8de0df16ed8a494757aab.scope: Deactivated successfully. Dec 15 05:07:24 localhost podman[327418]: 2025-12-15 10:07:24.435397924 +0000 UTC m=+0.061214449 container died 9f582b4998af378a7182e23186c368e1d2c51df6bff8de0df16ed8a494757aab (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-2297edbd-f908-4486-afde-70d6e7369121, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team) Dec 15 05:07:24 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-9f582b4998af378a7182e23186c368e1d2c51df6bff8de0df16ed8a494757aab-userdata-shm.mount: Deactivated successfully. 
Dec 15 05:07:24 localhost podman[327418]: 2025-12-15 10:07:24.4734059 +0000 UTC m=+0.099222375 container cleanup 9f582b4998af378a7182e23186c368e1d2c51df6bff8de0df16ed8a494757aab (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-2297edbd-f908-4486-afde-70d6e7369121, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0) Dec 15 05:07:24 localhost systemd[1]: libpod-conmon-9f582b4998af378a7182e23186c368e1d2c51df6bff8de0df16ed8a494757aab.scope: Deactivated successfully. Dec 15 05:07:24 localhost podman[327420]: 2025-12-15 10:07:24.511771976 +0000 UTC m=+0.127311331 container remove 9f582b4998af378a7182e23186c368e1d2c51df6bff8de0df16ed8a494757aab (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-2297edbd-f908-4486-afde-70d6e7369121, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251202, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb) Dec 15 05:07:24 localhost ovn_controller[154603]: 2025-12-15T10:07:24Z|00368|binding|INFO|Releasing lport b35254ad-12eb-47bb-92be-44fefe0694f0 from this chassis (sb_readonly=0) Dec 15 05:07:24 localhost nova_compute[286344]: 2025-12-15 10:07:24.833 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:07:24 localhost neutron_dhcp_agent[267542]: 2025-12-15 10:07:24.857 267546 INFO neutron.agent.dhcp.agent [None 
req-98845297-a720-4327-9363-911a0c08b562 - - - - - -] Finished network 2297edbd-f908-4486-afde-70d6e7369121 dhcp configuration#033[00m Dec 15 05:07:25 localhost nova_compute[286344]: 2025-12-15 10:07:25.309 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:07:25 localhost systemd[1]: var-lib-containers-storage-overlay-593486039bd527a71c88ee288aaf006bf83d7d4a41f4fae83efeecbe775b502b-merged.mount: Deactivated successfully. Dec 15 05:07:25 localhost neutron_dhcp_agent[267542]: 2025-12-15 10:07:25.409 267546 INFO neutron.agent.linux.ip_lib [None req-87859ec8-e234-4630-b45a-ee830977619a - - - - - -] Device tapbca2222a-94 cannot be used as it has no MAC address#033[00m Dec 15 05:07:25 localhost nova_compute[286344]: 2025-12-15 10:07:25.434 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:07:25 localhost kernel: device tapbca2222a-94 entered promiscuous mode Dec 15 05:07:25 localhost NetworkManager[5963]: [1765793245.4405] manager: (tapbca2222a-94): new Generic device (/org/freedesktop/NetworkManager/Devices/59) Dec 15 05:07:25 localhost ovn_controller[154603]: 2025-12-15T10:07:25Z|00369|binding|INFO|Claiming lport bca2222a-9426-4c05-bde2-e2763807a6cd for this chassis. Dec 15 05:07:25 localhost ovn_controller[154603]: 2025-12-15T10:07:25Z|00370|binding|INFO|bca2222a-9426-4c05-bde2-e2763807a6cd: Claiming unknown Dec 15 05:07:25 localhost nova_compute[286344]: 2025-12-15 10:07:25.445 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:07:25 localhost systemd-udevd[327457]: Network interface NamePolicy= disabled on kernel command line. 
Dec 15 05:07:25 localhost journal[231322]: ethtool ioctl error on tapbca2222a-94: No such device Dec 15 05:07:25 localhost journal[231322]: ethtool ioctl error on tapbca2222a-94: No such device Dec 15 05:07:25 localhost journal[231322]: ethtool ioctl error on tapbca2222a-94: No such device Dec 15 05:07:25 localhost journal[231322]: ethtool ioctl error on tapbca2222a-94: No such device Dec 15 05:07:25 localhost ovn_controller[154603]: 2025-12-15T10:07:25Z|00371|binding|INFO|Setting lport bca2222a-9426-4c05-bde2-e2763807a6cd ovn-installed in OVS Dec 15 05:07:25 localhost nova_compute[286344]: 2025-12-15 10:07:25.477 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:07:25 localhost journal[231322]: ethtool ioctl error on tapbca2222a-94: No such device Dec 15 05:07:25 localhost journal[231322]: ethtool ioctl error on tapbca2222a-94: No such device Dec 15 05:07:25 localhost journal[231322]: ethtool ioctl error on tapbca2222a-94: No such device Dec 15 05:07:25 localhost ovn_controller[154603]: 2025-12-15T10:07:25Z|00372|binding|INFO|Setting lport bca2222a-9426-4c05-bde2-e2763807a6cd up in Southbound Dec 15 05:07:25 localhost journal[231322]: ethtool ioctl error on tapbca2222a-94: No such device Dec 15 05:07:25 localhost ovn_metadata_agent[160585]: 2025-12-15 10:07:25.487 160590 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005559462.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8::f816:3eff:fe77:45f3/64', 'neutron:device_id': 'dhcpce287b4b-a043-5ed9-8e08-4fd555d768a2-c0669abd-aef1-4b0d-9f97-a6adeeac3211', 'neutron:device_owner': 
'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-c0669abd-aef1-4b0d-9f97-a6adeeac3211', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '89e710ef9f4f48d48a369002db572947', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=e47821dc-5f5d-44dc-8a16-54817df4049d, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=bca2222a-9426-4c05-bde2-e2763807a6cd) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Dec 15 05:07:25 localhost ovn_metadata_agent[160585]: 2025-12-15 10:07:25.489 160590 INFO neutron.agent.ovn.metadata.agent [-] Port bca2222a-9426-4c05-bde2-e2763807a6cd in datapath c0669abd-aef1-4b0d-9f97-a6adeeac3211 bound to our chassis#033[00m Dec 15 05:07:25 localhost ovn_metadata_agent[160585]: 2025-12-15 10:07:25.491 160590 DEBUG neutron.agent.ovn.metadata.agent [-] Port fb4bbdbd-ced6-4a59-b63a-25b3f45dbb62 IP addresses were not retrieved from the Port_Binding MAC column ['unknown'] _get_port_ips /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:536#033[00m Dec 15 05:07:25 localhost ovn_metadata_agent[160585]: 2025-12-15 10:07:25.491 160590 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network c0669abd-aef1-4b0d-9f97-a6adeeac3211, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m Dec 15 05:07:25 localhost ovn_metadata_agent[160585]: 2025-12-15 10:07:25.492 160858 DEBUG oslo.privsep.daemon [-] privsep: reply[df556f0b-dfc9-4cdd-8d72-a110a1f9c794]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Dec 15 05:07:25 localhost 
nova_compute[286344]: 2025-12-15 10:07:25.522 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:07:25 localhost ceph-mon[298913]: mon.np0005559462@0(leader).osd e182 do_prune osdmap full prune enabled Dec 15 05:07:25 localhost ceph-mon[298913]: mon.np0005559462@0(leader).osd e183 e183: 6 total, 6 up, 6 in Dec 15 05:07:25 localhost ceph-mon[298913]: log_channel(cluster) log [DBG] : osdmap e183: 6 total, 6 up, 6 in Dec 15 05:07:25 localhost nova_compute[286344]: 2025-12-15 10:07:25.557 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:07:25 localhost nova_compute[286344]: 2025-12-15 10:07:25.724 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:07:25 localhost neutron_sriov_agent[260044]: 2025-12-15 10:07:25.996 2 INFO neutron.agent.securitygroups_rpc [None req-8e318e6c-e7de-49e1-9d1e-da37538632e2 6b5da6f221214afe93e1fa66574f238b 89e710ef9f4f48d48a369002db572947 - - default default] Security group member updated ['a6c5f808-dddc-4f17-acbf-63b1b6e6f4d6']#033[00m Dec 15 05:07:26 localhost ceph-mon[298913]: mon.np0005559462@0(leader).osd e183 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104 Dec 15 05:07:26 localhost podman[327529]: Dec 15 05:07:26 localhost podman[327529]: 2025-12-15 10:07:26.394505096 +0000 UTC m=+0.092259440 container create a196dfbb9f14f33110a56c86fd2b9a51df944690350d27eaae4460e5ca5303cf (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-c0669abd-aef1-4b0d-9f97-a6adeeac3211, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.license=GPLv2, 
org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team) Dec 15 05:07:26 localhost systemd[1]: Started /usr/bin/podman healthcheck run a4e94bd7aaee6c174c601060fb5f34559019ccea93fc397263ee3735731cfc1e. Dec 15 05:07:26 localhost systemd[1]: Started libpod-conmon-a196dfbb9f14f33110a56c86fd2b9a51df944690350d27eaae4460e5ca5303cf.scope. Dec 15 05:07:26 localhost systemd[1]: Started libcrun container. Dec 15 05:07:26 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/107016a6374646ca7b7b7372fc462b048508d1e9cad1593c2ae89010ba636709/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff) Dec 15 05:07:26 localhost podman[327529]: 2025-12-15 10:07:26.351627059 +0000 UTC m=+0.049381423 image pull quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified Dec 15 05:07:26 localhost podman[327529]: 2025-12-15 10:07:26.460780857 +0000 UTC m=+0.158535211 container init a196dfbb9f14f33110a56c86fd2b9a51df944690350d27eaae4460e5ca5303cf (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-c0669abd-aef1-4b0d-9f97-a6adeeac3211, org.label-schema.license=GPLv2, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.schema-version=1.0) Dec 15 05:07:26 localhost podman[327529]: 2025-12-15 10:07:26.478268203 +0000 UTC m=+0.176022547 container start a196dfbb9f14f33110a56c86fd2b9a51df944690350d27eaae4460e5ca5303cf (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-c0669abd-aef1-4b0d-9f97-a6adeeac3211, tcib_managed=true, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, 
io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS) Dec 15 05:07:26 localhost dnsmasq[327559]: started, version 2.85 cachesize 150 Dec 15 05:07:26 localhost dnsmasq[327559]: DNS service limited to local subnets Dec 15 05:07:26 localhost dnsmasq[327559]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile Dec 15 05:07:26 localhost dnsmasq[327559]: warning: no upstream servers configured Dec 15 05:07:26 localhost dnsmasq-dhcp[327559]: DHCPv6, static leases only on 2001:db8::, lease time 1d Dec 15 05:07:26 localhost dnsmasq[327559]: read /var/lib/neutron/dhcp/c0669abd-aef1-4b0d-9f97-a6adeeac3211/addn_hosts - 0 addresses Dec 15 05:07:26 localhost dnsmasq-dhcp[327559]: read /var/lib/neutron/dhcp/c0669abd-aef1-4b0d-9f97-a6adeeac3211/host Dec 15 05:07:26 localhost dnsmasq-dhcp[327559]: read /var/lib/neutron/dhcp/c0669abd-aef1-4b0d-9f97-a6adeeac3211/opts Dec 15 05:07:26 localhost podman[327543]: 2025-12-15 10:07:26.524965312 +0000 UTC m=+0.085688990 container health_status a4e94bd7aaee6c174c601060fb5f34559019ccea93fc397263ee3735731cfc1e (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, container_name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 
'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter) Dec 15 05:07:26 localhost neutron_dhcp_agent[267542]: 2025-12-15 10:07:26.536 267546 INFO neutron.agent.dhcp.agent [None req-87859ec8-e234-4630-b45a-ee830977619a - - - - - -] Finished network c0669abd-aef1-4b0d-9f97-a6adeeac3211 dhcp configuration#033[00m Dec 15 05:07:26 localhost neutron_dhcp_agent[267542]: 2025-12-15 10:07:26.537 267546 INFO neutron.agent.dhcp.agent [None req-c70badbc-b682-4004-aeac-5836b78e058c - - - - - -] Synchronizing state complete#033[00m Dec 15 05:07:26 localhost podman[327543]: 2025-12-15 10:07:26.562582966 +0000 UTC m=+0.123306654 container exec_died a4e94bd7aaee6c174c601060fb5f34559019ccea93fc397263ee3735731cfc1e (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible) Dec 15 05:07:26 localhost systemd[1]: a4e94bd7aaee6c174c601060fb5f34559019ccea93fc397263ee3735731cfc1e.service: Deactivated successfully. 
Dec 15 05:07:26 localhost neutron_dhcp_agent[267542]: 2025-12-15 10:07:26.835 267546 INFO neutron.agent.dhcp.agent [None req-c053e5d8-250f-4abe-a04b-93467c6030f6 - - - - - -] DHCP configuration for ports {'79503367-f53f-4b35-8760-76fcaa4d8407'} is completed#033[00m Dec 15 05:07:26 localhost dnsmasq[327559]: read /var/lib/neutron/dhcp/c0669abd-aef1-4b0d-9f97-a6adeeac3211/addn_hosts - 1 addresses Dec 15 05:07:26 localhost dnsmasq-dhcp[327559]: read /var/lib/neutron/dhcp/c0669abd-aef1-4b0d-9f97-a6adeeac3211/host Dec 15 05:07:26 localhost dnsmasq-dhcp[327559]: read /var/lib/neutron/dhcp/c0669abd-aef1-4b0d-9f97-a6adeeac3211/opts Dec 15 05:07:26 localhost podman[327588]: 2025-12-15 10:07:26.966504228 +0000 UTC m=+0.057475764 container kill a196dfbb9f14f33110a56c86fd2b9a51df944690350d27eaae4460e5ca5303cf (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-c0669abd-aef1-4b0d-9f97-a6adeeac3211, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0) Dec 15 05:07:27 localhost neutron_dhcp_agent[267542]: 2025-12-15 10:07:27.107 267546 INFO neutron.agent.dhcp.agent [None req-b88da428-f6eb-479b-88cd-3d991d3333cb - - - - - -] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-12-15T10:07:25Z, description=, device_id=, device_owner=, dns_assignment=[], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[], id=52607aa1-5ce3-4dec-b01a-afe70835007d, ip_allocation=immediate, mac_address=fa:16:3e:1d:ec:66, name=tempest-NetworksTestDHCPv6-1265757386, network=admin_state_up=True, 
availability_zone_hints=[], availability_zones=[], created_at=2025-12-15T10:05:39Z, description=, dns_domain=, id=c0669abd-aef1-4b0d-9f97-a6adeeac3211, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-NetworksTestDHCPv6-test-network-1861207414, port_security_enabled=True, project_id=89e710ef9f4f48d48a369002db572947, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=20673, qos_policy_id=None, revision_number=40, router:external=False, shared=False, standard_attr_id=2164, status=ACTIVE, subnets=['a5f49b23-efb6-4c4b-b4ca-faa19243314c'], tags=[], tenant_id=89e710ef9f4f48d48a369002db572947, updated_at=2025-12-15T10:07:22Z, vlan_transparent=None, network_id=c0669abd-aef1-4b0d-9f97-a6adeeac3211, port_security_enabled=True, project_id=89e710ef9f4f48d48a369002db572947, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=['a6c5f808-dddc-4f17-acbf-63b1b6e6f4d6'], standard_attr_id=2730, status=DOWN, tags=[], tenant_id=89e710ef9f4f48d48a369002db572947, updated_at=2025-12-15T10:07:25Z on network c0669abd-aef1-4b0d-9f97-a6adeeac3211#033[00m Dec 15 05:07:27 localhost neutron_sriov_agent[260044]: 2025-12-15 10:07:27.180 2 INFO neutron.agent.securitygroups_rpc [None req-6db41e03-07be-46b5-b736-e5c6615f8268 6b5da6f221214afe93e1fa66574f238b 89e710ef9f4f48d48a369002db572947 - - default default] Security group member updated ['a6c5f808-dddc-4f17-acbf-63b1b6e6f4d6']#033[00m Dec 15 05:07:27 localhost dnsmasq[327559]: read /var/lib/neutron/dhcp/c0669abd-aef1-4b0d-9f97-a6adeeac3211/addn_hosts - 1 addresses Dec 15 05:07:27 localhost dnsmasq-dhcp[327559]: read /var/lib/neutron/dhcp/c0669abd-aef1-4b0d-9f97-a6adeeac3211/host Dec 15 05:07:27 localhost dnsmasq-dhcp[327559]: read /var/lib/neutron/dhcp/c0669abd-aef1-4b0d-9f97-a6adeeac3211/opts Dec 15 05:07:27 localhost podman[327625]: 2025-12-15 10:07:27.287606059 +0000 UTC m=+0.058789600 container kill 
a196dfbb9f14f33110a56c86fd2b9a51df944690350d27eaae4460e5ca5303cf (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-c0669abd-aef1-4b0d-9f97-a6adeeac3211, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202) Dec 15 05:07:27 localhost systemd[1]: tmp-crun.SEtAFs.mount: Deactivated successfully. Dec 15 05:07:27 localhost systemd[1]: run-netns-qdhcp\x2d2297edbd\x2df908\x2d4486\x2dafde\x2d70d6e7369121.mount: Deactivated successfully. Dec 15 05:07:27 localhost neutron_dhcp_agent[267542]: 2025-12-15 10:07:27.448 267546 INFO neutron.agent.dhcp.agent [None req-35b76dbb-6989-463d-ad9e-50f8ba403029 - - - - - -] DHCP configuration for ports {'bca2222a-9426-4c05-bde2-e2763807a6cd', '79503367-f53f-4b35-8760-76fcaa4d8407', '52607aa1-5ce3-4dec-b01a-afe70835007d'} is completed#033[00m Dec 15 05:07:27 localhost ceph-mon[298913]: mon.np0005559462@0(leader).osd e183 do_prune osdmap full prune enabled Dec 15 05:07:27 localhost ceph-mon[298913]: mon.np0005559462@0(leader).osd e184 e184: 6 total, 6 up, 6 in Dec 15 05:07:27 localhost ceph-mon[298913]: log_channel(cluster) log [DBG] : osdmap e184: 6 total, 6 up, 6 in Dec 15 05:07:27 localhost neutron_dhcp_agent[267542]: 2025-12-15 10:07:27.584 267546 INFO neutron.agent.dhcp.agent [None req-950044ef-0938-497b-82e8-4c028ef5eca7 - - - - - -] DHCP configuration for ports {'52607aa1-5ce3-4dec-b01a-afe70835007d'} is completed#033[00m Dec 15 05:07:27 localhost dnsmasq[327559]: read /var/lib/neutron/dhcp/c0669abd-aef1-4b0d-9f97-a6adeeac3211/addn_hosts - 0 addresses Dec 15 05:07:27 localhost podman[327663]: 2025-12-15 10:07:27.599656164 +0000 UTC m=+0.059401666 container kill 
a196dfbb9f14f33110a56c86fd2b9a51df944690350d27eaae4460e5ca5303cf (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-c0669abd-aef1-4b0d-9f97-a6adeeac3211, org.label-schema.build-date=20251202, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb) Dec 15 05:07:27 localhost dnsmasq-dhcp[327559]: read /var/lib/neutron/dhcp/c0669abd-aef1-4b0d-9f97-a6adeeac3211/host Dec 15 05:07:27 localhost dnsmasq-dhcp[327559]: read /var/lib/neutron/dhcp/c0669abd-aef1-4b0d-9f97-a6adeeac3211/opts Dec 15 05:07:28 localhost dnsmasq[327559]: exiting on receipt of SIGTERM Dec 15 05:07:28 localhost podman[327700]: 2025-12-15 10:07:28.2758532 +0000 UTC m=+0.063372554 container kill a196dfbb9f14f33110a56c86fd2b9a51df944690350d27eaae4460e5ca5303cf (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-c0669abd-aef1-4b0d-9f97-a6adeeac3211, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true) Dec 15 05:07:28 localhost systemd[1]: libpod-a196dfbb9f14f33110a56c86fd2b9a51df944690350d27eaae4460e5ca5303cf.scope: Deactivated successfully. 
Dec 15 05:07:28 localhost nova_compute[286344]: 2025-12-15 10:07:28.345 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:07:28 localhost podman[327714]: 2025-12-15 10:07:28.347122408 +0000 UTC m=+0.056446705 container died a196dfbb9f14f33110a56c86fd2b9a51df944690350d27eaae4460e5ca5303cf (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-c0669abd-aef1-4b0d-9f97-a6adeeac3211, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb) Dec 15 05:07:28 localhost podman[327714]: 2025-12-15 10:07:28.376025444 +0000 UTC m=+0.085349691 container cleanup a196dfbb9f14f33110a56c86fd2b9a51df944690350d27eaae4460e5ca5303cf (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-c0669abd-aef1-4b0d-9f97-a6adeeac3211, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3) Dec 15 05:07:28 localhost systemd[1]: libpod-conmon-a196dfbb9f14f33110a56c86fd2b9a51df944690350d27eaae4460e5ca5303cf.scope: Deactivated successfully. Dec 15 05:07:28 localhost systemd[1]: var-lib-containers-storage-overlay-107016a6374646ca7b7b7372fc462b048508d1e9cad1593c2ae89010ba636709-merged.mount: Deactivated successfully. 
Dec 15 05:07:28 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-a196dfbb9f14f33110a56c86fd2b9a51df944690350d27eaae4460e5ca5303cf-userdata-shm.mount: Deactivated successfully. Dec 15 05:07:28 localhost podman[327716]: 2025-12-15 10:07:28.445931445 +0000 UTC m=+0.147852812 container remove a196dfbb9f14f33110a56c86fd2b9a51df944690350d27eaae4460e5ca5303cf (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-c0669abd-aef1-4b0d-9f97-a6adeeac3211, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true) Dec 15 05:07:28 localhost ovn_controller[154603]: 2025-12-15T10:07:28Z|00373|binding|INFO|Releasing lport bca2222a-9426-4c05-bde2-e2763807a6cd from this chassis (sb_readonly=0) Dec 15 05:07:28 localhost kernel: device tapbca2222a-94 left promiscuous mode Dec 15 05:07:28 localhost ovn_controller[154603]: 2025-12-15T10:07:28Z|00374|binding|INFO|Setting lport bca2222a-9426-4c05-bde2-e2763807a6cd down in Southbound Dec 15 05:07:28 localhost nova_compute[286344]: 2025-12-15 10:07:28.458 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:07:28 localhost ovn_metadata_agent[160585]: 2025-12-15 10:07:28.470 160590 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005559462.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], 
external_ids={'neutron:cidrs': '2001:db8::f816:3eff:fe77:45f3/64', 'neutron:device_id': 'dhcpce287b4b-a043-5ed9-8e08-4fd555d768a2-c0669abd-aef1-4b0d-9f97-a6adeeac3211', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-c0669abd-aef1-4b0d-9f97-a6adeeac3211', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '89e710ef9f4f48d48a369002db572947', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'np0005559462.localdomain'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=e47821dc-5f5d-44dc-8a16-54817df4049d, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=bca2222a-9426-4c05-bde2-e2763807a6cd) old=Port_Binding(up=[True], chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Dec 15 05:07:28 localhost ovn_metadata_agent[160585]: 2025-12-15 10:07:28.472 160590 INFO neutron.agent.ovn.metadata.agent [-] Port bca2222a-9426-4c05-bde2-e2763807a6cd in datapath c0669abd-aef1-4b0d-9f97-a6adeeac3211 unbound from our chassis#033[00m Dec 15 05:07:28 localhost ovn_metadata_agent[160585]: 2025-12-15 10:07:28.474 160590 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network c0669abd-aef1-4b0d-9f97-a6adeeac3211, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m Dec 15 05:07:28 localhost nova_compute[286344]: 2025-12-15 10:07:28.478 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:07:28 localhost ovn_metadata_agent[160585]: 2025-12-15 10:07:28.478 160858 DEBUG oslo.privsep.daemon [-] privsep: reply[93f12262-1a5f-450f-923f-de58105bce69]: (4, False) 
_call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Dec 15 05:07:28 localhost neutron_dhcp_agent[267542]: 2025-12-15 10:07:28.682 267546 INFO neutron.agent.dhcp.agent [None req-bdd28d72-6e59-43a5-8d3a-eb1332704372 - - - - - -] Network not present, action: clean_devices, action_kwargs: {}#033[00m Dec 15 05:07:28 localhost systemd[1]: run-netns-qdhcp\x2dc0669abd\x2daef1\x2d4b0d\x2d9f97\x2da6adeeac3211.mount: Deactivated successfully. Dec 15 05:07:29 localhost ceph-mon[298913]: mon.np0005559462@0(leader).osd e184 do_prune osdmap full prune enabled Dec 15 05:07:29 localhost ceph-mon[298913]: mon.np0005559462@0(leader).osd e185 e185: 6 total, 6 up, 6 in Dec 15 05:07:29 localhost ceph-mon[298913]: log_channel(cluster) log [DBG] : osdmap e185: 6 total, 6 up, 6 in Dec 15 05:07:30 localhost nova_compute[286344]: 2025-12-15 10:07:30.760 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:07:31 localhost ceph-mon[298913]: mon.np0005559462@0(leader).osd e185 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104 Dec 15 05:07:31 localhost ceph-mon[298913]: mon.np0005559462@0(leader).osd e185 do_prune osdmap full prune enabled Dec 15 05:07:31 localhost ceph-mon[298913]: mon.np0005559462@0(leader).osd e186 e186: 6 total, 6 up, 6 in Dec 15 05:07:31 localhost ceph-mon[298913]: log_channel(cluster) log [DBG] : osdmap e186: 6 total, 6 up, 6 in Dec 15 05:07:31 localhost podman[243449]: time="2025-12-15T10:07:31Z" level=info msg="List containers: received `last` parameter - overwriting `limit`" Dec 15 05:07:31 localhost podman[243449]: @ - - [15/Dec/2025:10:07:31 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 160292 "" "Go-http-client/1.1" Dec 15 05:07:31 localhost podman[243449]: @ - - [15/Dec/2025:10:07:31 +0000] "GET 
/v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 20207 "" "Go-http-client/1.1" Dec 15 05:07:32 localhost neutron_sriov_agent[260044]: 2025-12-15 10:07:32.031 2 INFO neutron.agent.securitygroups_rpc [None req-e4582458-8f73-42f4-b22c-258e89833a6d 6b5da6f221214afe93e1fa66574f238b 89e710ef9f4f48d48a369002db572947 - - default default] Security group member updated ['a6c5f808-dddc-4f17-acbf-63b1b6e6f4d6']#033[00m Dec 15 05:07:32 localhost neutron_dhcp_agent[267542]: 2025-12-15 10:07:32.201 267546 INFO neutron.agent.linux.ip_lib [None req-57e4ceab-9f99-4ee0-87e6-670c6bd7327f - - - - - -] Device tap933e6e08-11 cannot be used as it has no MAC address#033[00m Dec 15 05:07:32 localhost nova_compute[286344]: 2025-12-15 10:07:32.225 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:07:32 localhost kernel: device tap933e6e08-11 entered promiscuous mode Dec 15 05:07:32 localhost NetworkManager[5963]: [1765793252.2336] manager: (tap933e6e08-11): new Generic device (/org/freedesktop/NetworkManager/Devices/60) Dec 15 05:07:32 localhost ovn_controller[154603]: 2025-12-15T10:07:32Z|00375|binding|INFO|Claiming lport 933e6e08-113b-4e2c-84bf-f77172150bd4 for this chassis. Dec 15 05:07:32 localhost ovn_controller[154603]: 2025-12-15T10:07:32Z|00376|binding|INFO|933e6e08-113b-4e2c-84bf-f77172150bd4: Claiming unknown Dec 15 05:07:32 localhost nova_compute[286344]: 2025-12-15 10:07:32.237 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:07:32 localhost systemd-udevd[327755]: Network interface NamePolicy= disabled on kernel command line. 
Dec 15 05:07:32 localhost ceph-mon[298913]: mon.np0005559462@0(leader).osd e186 do_prune osdmap full prune enabled Dec 15 05:07:32 localhost ovn_controller[154603]: 2025-12-15T10:07:32Z|00377|binding|INFO|Setting lport 933e6e08-113b-4e2c-84bf-f77172150bd4 ovn-installed in OVS Dec 15 05:07:32 localhost ovn_controller[154603]: 2025-12-15T10:07:32Z|00378|binding|INFO|Setting lport 933e6e08-113b-4e2c-84bf-f77172150bd4 up in Southbound Dec 15 05:07:32 localhost nova_compute[286344]: 2025-12-15 10:07:32.248 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:07:32 localhost ovn_metadata_agent[160585]: 2025-12-15 10:07:32.248 160590 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005559462.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8::f816:3eff:fe09:8d0/64', 'neutron:device_id': 'dhcpce287b4b-a043-5ed9-8e08-4fd555d768a2-c0669abd-aef1-4b0d-9f97-a6adeeac3211', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-c0669abd-aef1-4b0d-9f97-a6adeeac3211', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '89e710ef9f4f48d48a369002db572947', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=e47821dc-5f5d-44dc-8a16-54817df4049d, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=933e6e08-113b-4e2c-84bf-f77172150bd4) 
old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Dec 15 05:07:32 localhost ovn_metadata_agent[160585]: 2025-12-15 10:07:32.250 160590 INFO neutron.agent.ovn.metadata.agent [-] Port 933e6e08-113b-4e2c-84bf-f77172150bd4 in datapath c0669abd-aef1-4b0d-9f97-a6adeeac3211 bound to our chassis#033[00m Dec 15 05:07:32 localhost ovn_metadata_agent[160585]: 2025-12-15 10:07:32.253 160590 DEBUG neutron.agent.ovn.metadata.agent [-] Port 662c3e92-5ead-414b-8aaf-0a576e26e6fe IP addresses were not retrieved from the Port_Binding MAC column ['unknown'] _get_port_ips /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:536#033[00m Dec 15 05:07:32 localhost ovn_metadata_agent[160585]: 2025-12-15 10:07:32.253 160590 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network c0669abd-aef1-4b0d-9f97-a6adeeac3211, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m Dec 15 05:07:32 localhost ovn_metadata_agent[160585]: 2025-12-15 10:07:32.255 160858 DEBUG oslo.privsep.daemon [-] privsep: reply[c2b3fcc5-35d5-4708-a6fc-4eee936ed1ca]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Dec 15 05:07:32 localhost journal[231322]: ethtool ioctl error on tap933e6e08-11: No such device Dec 15 05:07:32 localhost ceph-mon[298913]: mon.np0005559462@0(leader).osd e187 e187: 6 total, 6 up, 6 in Dec 15 05:07:32 localhost journal[231322]: ethtool ioctl error on tap933e6e08-11: No such device Dec 15 05:07:32 localhost nova_compute[286344]: 2025-12-15 10:07:32.266 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:07:32 localhost ceph-mon[298913]: log_channel(cluster) log [DBG] : osdmap e187: 6 total, 6 up, 6 in Dec 15 05:07:32 localhost journal[231322]: ethtool ioctl error on 
tap933e6e08-11: No such device Dec 15 05:07:32 localhost journal[231322]: ethtool ioctl error on tap933e6e08-11: No such device Dec 15 05:07:32 localhost journal[231322]: ethtool ioctl error on tap933e6e08-11: No such device Dec 15 05:07:32 localhost journal[231322]: ethtool ioctl error on tap933e6e08-11: No such device Dec 15 05:07:32 localhost journal[231322]: ethtool ioctl error on tap933e6e08-11: No such device Dec 15 05:07:32 localhost journal[231322]: ethtool ioctl error on tap933e6e08-11: No such device Dec 15 05:07:32 localhost nova_compute[286344]: 2025-12-15 10:07:32.301 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:07:32 localhost nova_compute[286344]: 2025-12-15 10:07:32.329 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:07:33 localhost neutron_sriov_agent[260044]: 2025-12-15 10:07:33.095 2 INFO neutron.agent.securitygroups_rpc [None req-6fee1a50-5a3a-4617-9d32-b5d83a961d23 6b5da6f221214afe93e1fa66574f238b 89e710ef9f4f48d48a369002db572947 - - default default] Security group member updated ['a6c5f808-dddc-4f17-acbf-63b1b6e6f4d6']#033[00m Dec 15 05:07:33 localhost podman[327826]: Dec 15 05:07:33 localhost podman[327826]: 2025-12-15 10:07:33.154754781 +0000 UTC m=+0.073372016 container create f9b93180c1713c68e3de397597644dca496c84652efe3a260cec2fdfd5441141 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-c0669abd-aef1-4b0d-9f97-a6adeeac3211, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team) Dec 15 05:07:33 
localhost systemd[1]: Started libpod-conmon-f9b93180c1713c68e3de397597644dca496c84652efe3a260cec2fdfd5441141.scope. Dec 15 05:07:33 localhost systemd[1]: Started libcrun container. Dec 15 05:07:33 localhost podman[327826]: 2025-12-15 10:07:33.110129557 +0000 UTC m=+0.028746832 image pull quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified Dec 15 05:07:33 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/31973a60198e9a07d953ab1ea3dab5f0ba072611fe831cb0516a9cf5bd9ca872/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff) Dec 15 05:07:33 localhost podman[327826]: 2025-12-15 10:07:33.222627596 +0000 UTC m=+0.141244831 container init f9b93180c1713c68e3de397597644dca496c84652efe3a260cec2fdfd5441141 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-c0669abd-aef1-4b0d-9f97-a6adeeac3211, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, tcib_managed=true, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2) Dec 15 05:07:33 localhost podman[327826]: 2025-12-15 10:07:33.232837174 +0000 UTC m=+0.151454419 container start f9b93180c1713c68e3de397597644dca496c84652efe3a260cec2fdfd5441141 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-c0669abd-aef1-4b0d-9f97-a6adeeac3211, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.build-date=20251202) Dec 15 05:07:33 localhost dnsmasq[327845]: started, version 2.85 
cachesize 150 Dec 15 05:07:33 localhost dnsmasq[327845]: DNS service limited to local subnets Dec 15 05:07:33 localhost dnsmasq[327845]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile Dec 15 05:07:33 localhost dnsmasq[327845]: warning: no upstream servers configured Dec 15 05:07:33 localhost dnsmasq[327845]: read /var/lib/neutron/dhcp/c0669abd-aef1-4b0d-9f97-a6adeeac3211/addn_hosts - 0 addresses Dec 15 05:07:33 localhost neutron_dhcp_agent[267542]: 2025-12-15 10:07:33.288 267546 INFO neutron.agent.dhcp.agent [None req-57e4ceab-9f99-4ee0-87e6-670c6bd7327f - - - - - -] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-12-15T10:07:31Z, description=, device_id=, device_owner=, dns_assignment=[], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[], id=2bbe1233-9bf5-448e-9651-50a0e9603a8a, ip_allocation=immediate, mac_address=fa:16:3e:b8:45:ac, name=tempest-NetworksTestDHCPv6-154457985, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-12-15T10:05:39Z, description=, dns_domain=, id=c0669abd-aef1-4b0d-9f97-a6adeeac3211, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-NetworksTestDHCPv6-test-network-1861207414, port_security_enabled=True, project_id=89e710ef9f4f48d48a369002db572947, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=20673, qos_policy_id=None, revision_number=42, router:external=False, shared=False, standard_attr_id=2164, status=ACTIVE, subnets=['ea3727a1-9bb6-46f4-bf18-007d48a4bbad'], tags=[], tenant_id=89e710ef9f4f48d48a369002db572947, updated_at=2025-12-15T10:07:28Z, vlan_transparent=None, network_id=c0669abd-aef1-4b0d-9f97-a6adeeac3211, 
port_security_enabled=True, project_id=89e710ef9f4f48d48a369002db572947, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=['a6c5f808-dddc-4f17-acbf-63b1b6e6f4d6'], standard_attr_id=2751, status=DOWN, tags=[], tenant_id=89e710ef9f4f48d48a369002db572947, updated_at=2025-12-15T10:07:31Z on network c0669abd-aef1-4b0d-9f97-a6adeeac3211#033[00m Dec 15 05:07:33 localhost nova_compute[286344]: 2025-12-15 10:07:33.347 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:07:33 localhost neutron_dhcp_agent[267542]: 2025-12-15 10:07:33.361 267546 INFO neutron.agent.dhcp.agent [None req-8066383b-393c-4891-9ad8-7239f5b8aeff - - - - - -] DHCP configuration for ports {'79503367-f53f-4b35-8760-76fcaa4d8407'} is completed#033[00m Dec 15 05:07:33 localhost dnsmasq[327845]: read /var/lib/neutron/dhcp/c0669abd-aef1-4b0d-9f97-a6adeeac3211/addn_hosts - 1 addresses Dec 15 05:07:33 localhost podman[327864]: 2025-12-15 10:07:33.468356578 +0000 UTC m=+0.056139198 container kill f9b93180c1713c68e3de397597644dca496c84652efe3a260cec2fdfd5441141 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-c0669abd-aef1-4b0d-9f97-a6adeeac3211, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team) Dec 15 05:07:33 localhost ovn_controller[154603]: 2025-12-15T10:07:33Z|00379|binding|INFO|Releasing lport b35254ad-12eb-47bb-92be-44fefe0694f0 from this chassis (sb_readonly=0) Dec 15 05:07:33 localhost nova_compute[286344]: 2025-12-15 10:07:33.599 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup 
/usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:07:33 localhost neutron_dhcp_agent[267542]: 2025-12-15 10:07:33.681 267546 INFO neutron.agent.dhcp.agent [None req-55a4cbd0-9e28-4da6-8d3d-c74269879fa1 - - - - - -] DHCP configuration for ports {'2bbe1233-9bf5-448e-9651-50a0e9603a8a'} is completed#033[00m Dec 15 05:07:33 localhost dnsmasq[327845]: read /var/lib/neutron/dhcp/c0669abd-aef1-4b0d-9f97-a6adeeac3211/addn_hosts - 0 addresses Dec 15 05:07:33 localhost podman[327903]: 2025-12-15 10:07:33.795256976 +0000 UTC m=+0.059957101 container kill f9b93180c1713c68e3de397597644dca496c84652efe3a260cec2fdfd5441141 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-c0669abd-aef1-4b0d-9f97-a6adeeac3211, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2) Dec 15 05:07:34 localhost dnsmasq[327845]: exiting on receipt of SIGTERM Dec 15 05:07:34 localhost podman[327943]: 2025-12-15 10:07:34.276913162 +0000 UTC m=+0.067489836 container kill f9b93180c1713c68e3de397597644dca496c84652efe3a260cec2fdfd5441141 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-c0669abd-aef1-4b0d-9f97-a6adeeac3211, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.build-date=20251202, tcib_managed=true) Dec 15 05:07:34 localhost ceph-mon[298913]: mon.np0005559462@0(leader).osd e187 do_prune osdmap full prune enabled Dec 15 05:07:34 localhost 
systemd[1]: libpod-f9b93180c1713c68e3de397597644dca496c84652efe3a260cec2fdfd5441141.scope: Deactivated successfully. Dec 15 05:07:34 localhost ceph-mon[298913]: mon.np0005559462@0(leader).osd e188 e188: 6 total, 6 up, 6 in Dec 15 05:07:34 localhost ceph-mon[298913]: log_channel(cluster) log [DBG] : osdmap e188: 6 total, 6 up, 6 in Dec 15 05:07:34 localhost podman[327956]: 2025-12-15 10:07:34.341318164 +0000 UTC m=+0.049467026 container died f9b93180c1713c68e3de397597644dca496c84652efe3a260cec2fdfd5441141 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-c0669abd-aef1-4b0d-9f97-a6adeeac3211, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_managed=true) Dec 15 05:07:34 localhost systemd[1]: tmp-crun.rwHXT9.mount: Deactivated successfully. Dec 15 05:07:34 localhost podman[327956]: 2025-12-15 10:07:34.383774529 +0000 UTC m=+0.091923331 container cleanup f9b93180c1713c68e3de397597644dca496c84652efe3a260cec2fdfd5441141 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-c0669abd-aef1-4b0d-9f97-a6adeeac3211, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb) Dec 15 05:07:34 localhost systemd[1]: libpod-conmon-f9b93180c1713c68e3de397597644dca496c84652efe3a260cec2fdfd5441141.scope: Deactivated successfully. 
Dec 15 05:07:34 localhost podman[327957]: 2025-12-15 10:07:34.406676821 +0000 UTC m=+0.106830915 container remove f9b93180c1713c68e3de397597644dca496c84652efe3a260cec2fdfd5441141 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-c0669abd-aef1-4b0d-9f97-a6adeeac3211, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.vendor=CentOS, tcib_managed=true) Dec 15 05:07:34 localhost kernel: device tap933e6e08-11 left promiscuous mode Dec 15 05:07:34 localhost ovn_controller[154603]: 2025-12-15T10:07:34Z|00380|binding|INFO|Releasing lport 933e6e08-113b-4e2c-84bf-f77172150bd4 from this chassis (sb_readonly=0) Dec 15 05:07:34 localhost ovn_controller[154603]: 2025-12-15T10:07:34Z|00381|binding|INFO|Setting lport 933e6e08-113b-4e2c-84bf-f77172150bd4 down in Southbound Dec 15 05:07:34 localhost nova_compute[286344]: 2025-12-15 10:07:34.467 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:07:34 localhost ovn_metadata_agent[160585]: 2025-12-15 10:07:34.475 160590 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005559462.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8::f816:3eff:fe09:8d0/64', 'neutron:device_id': 'dhcpce287b4b-a043-5ed9-8e08-4fd555d768a2-c0669abd-aef1-4b0d-9f97-a6adeeac3211', 'neutron:device_owner': 'network:dhcp', 
'neutron:mtu': '', 'neutron:network_name': 'neutron-c0669abd-aef1-4b0d-9f97-a6adeeac3211', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '89e710ef9f4f48d48a369002db572947', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'np0005559462.localdomain'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=e47821dc-5f5d-44dc-8a16-54817df4049d, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=933e6e08-113b-4e2c-84bf-f77172150bd4) old=Port_Binding(up=[True], chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Dec 15 05:07:34 localhost ovn_metadata_agent[160585]: 2025-12-15 10:07:34.476 160590 INFO neutron.agent.ovn.metadata.agent [-] Port 933e6e08-113b-4e2c-84bf-f77172150bd4 in datapath c0669abd-aef1-4b0d-9f97-a6adeeac3211 unbound from our chassis#033[00m Dec 15 05:07:34 localhost ovn_metadata_agent[160585]: 2025-12-15 10:07:34.477 160590 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network c0669abd-aef1-4b0d-9f97-a6adeeac3211, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m Dec 15 05:07:34 localhost ovn_metadata_agent[160585]: 2025-12-15 10:07:34.478 160858 DEBUG oslo.privsep.daemon [-] privsep: reply[d90ac4e5-61f2-444b-998d-b3b84478d1e2]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Dec 15 05:07:34 localhost nova_compute[286344]: 2025-12-15 10:07:34.489 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:07:34 localhost ceph-mon[298913]: mon.np0005559462@0(leader) e17 handle_command mon_command({"prefix": "mon dump", 
"format": "json"} v 0) Dec 15 05:07:34 localhost ceph-mon[298913]: log_channel(audit) log [DBG] : from='client.25625 172.18.0.34:0/382777224' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch Dec 15 05:07:34 localhost neutron_dhcp_agent[267542]: 2025-12-15 10:07:34.769 267546 INFO neutron.agent.dhcp.agent [None req-2e676b3d-af17-4590-8f7a-684b55b3a538 - - - - - -] Network not present, action: clean_devices, action_kwargs: {}#033[00m Dec 15 05:07:34 localhost openstack_network_exporter[246484]: ERROR 10:07:34 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd Dec 15 05:07:34 localhost openstack_network_exporter[246484]: ERROR 10:07:34 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server Dec 15 05:07:34 localhost openstack_network_exporter[246484]: ERROR 10:07:34 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd Dec 15 05:07:34 localhost openstack_network_exporter[246484]: ERROR 10:07:34 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath Dec 15 05:07:34 localhost openstack_network_exporter[246484]: Dec 15 05:07:34 localhost openstack_network_exporter[246484]: ERROR 10:07:34 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath Dec 15 05:07:34 localhost openstack_network_exporter[246484]: Dec 15 05:07:35 localhost systemd[1]: var-lib-containers-storage-overlay-31973a60198e9a07d953ab1ea3dab5f0ba072611fe831cb0516a9cf5bd9ca872-merged.mount: Deactivated successfully. Dec 15 05:07:35 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-f9b93180c1713c68e3de397597644dca496c84652efe3a260cec2fdfd5441141-userdata-shm.mount: Deactivated successfully. Dec 15 05:07:35 localhost systemd[1]: run-netns-qdhcp\x2dc0669abd\x2daef1\x2d4b0d\x2d9f97\x2da6adeeac3211.mount: Deactivated successfully. 
Dec 15 05:07:35 localhost ceph-mon[298913]: mon.np0005559462@0(leader) e17 handle_command mon_command({"prefix":"df", "format":"json"} v 0) Dec 15 05:07:35 localhost ceph-mon[298913]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/3176214807' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch Dec 15 05:07:35 localhost ceph-mon[298913]: mon.np0005559462@0(leader) e17 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0) Dec 15 05:07:35 localhost ceph-mon[298913]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/3176214807' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch Dec 15 05:07:35 localhost nova_compute[286344]: 2025-12-15 10:07:35.762 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:07:36 localhost ceph-mon[298913]: mon.np0005559462@0(leader).osd e188 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104 Dec 15 05:07:36 localhost neutron_dhcp_agent[267542]: 2025-12-15 10:07:36.233 267546 INFO neutron.agent.linux.ip_lib [None req-0bf57e73-aaf0-4a23-b9a8-a4b5b02b53d4 - - - - - -] Device tapa2fb0857-25 cannot be used as it has no MAC address#033[00m Dec 15 05:07:36 localhost nova_compute[286344]: 2025-12-15 10:07:36.256 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:07:36 localhost kernel: device tapa2fb0857-25 entered promiscuous mode Dec 15 05:07:36 localhost NetworkManager[5963]: [1765793256.2635] manager: (tapa2fb0857-25): new Generic device (/org/freedesktop/NetworkManager/Devices/61) Dec 15 05:07:36 localhost nova_compute[286344]: 2025-12-15 10:07:36.265 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup 
/usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:07:36 localhost ovn_controller[154603]: 2025-12-15T10:07:36Z|00382|binding|INFO|Claiming lport a2fb0857-2598-4fe2-9f7a-0be5adf79f04 for this chassis. Dec 15 05:07:36 localhost ovn_controller[154603]: 2025-12-15T10:07:36Z|00383|binding|INFO|a2fb0857-2598-4fe2-9f7a-0be5adf79f04: Claiming unknown Dec 15 05:07:36 localhost systemd-udevd[327996]: Network interface NamePolicy= disabled on kernel command line. Dec 15 05:07:36 localhost ovn_metadata_agent[160585]: 2025-12-15 10:07:36.277 160590 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005559462.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8::f816:3eff:fe9a:3f4a/64', 'neutron:device_id': 'dhcpce287b4b-a043-5ed9-8e08-4fd555d768a2-c0669abd-aef1-4b0d-9f97-a6adeeac3211', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-c0669abd-aef1-4b0d-9f97-a6adeeac3211', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '89e710ef9f4f48d48a369002db572947', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=e47821dc-5f5d-44dc-8a16-54817df4049d, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=a2fb0857-2598-4fe2-9f7a-0be5adf79f04) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Dec 15 05:07:36 localhost ovn_metadata_agent[160585]: 2025-12-15 
10:07:36.279 160590 INFO neutron.agent.ovn.metadata.agent [-] Port a2fb0857-2598-4fe2-9f7a-0be5adf79f04 in datapath c0669abd-aef1-4b0d-9f97-a6adeeac3211 bound to our chassis#033[00m Dec 15 05:07:36 localhost ovn_metadata_agent[160585]: 2025-12-15 10:07:36.282 160590 DEBUG neutron.agent.ovn.metadata.agent [-] Port c263072e-5fd3-4668-b7e8-d65ea50d28dc IP addresses were not retrieved from the Port_Binding MAC column ['unknown'] _get_port_ips /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:536#033[00m Dec 15 05:07:36 localhost ovn_metadata_agent[160585]: 2025-12-15 10:07:36.283 160590 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network c0669abd-aef1-4b0d-9f97-a6adeeac3211, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m Dec 15 05:07:36 localhost ovn_metadata_agent[160585]: 2025-12-15 10:07:36.284 160858 DEBUG oslo.privsep.daemon [-] privsep: reply[af245d39-8944-4c7a-b5dd-52ad2c03672f]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Dec 15 05:07:36 localhost nova_compute[286344]: 2025-12-15 10:07:36.286 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:07:36 localhost ovn_controller[154603]: 2025-12-15T10:07:36Z|00384|binding|INFO|Setting lport a2fb0857-2598-4fe2-9f7a-0be5adf79f04 ovn-installed in OVS Dec 15 05:07:36 localhost ovn_controller[154603]: 2025-12-15T10:07:36Z|00385|binding|INFO|Setting lport a2fb0857-2598-4fe2-9f7a-0be5adf79f04 up in Southbound Dec 15 05:07:36 localhost nova_compute[286344]: 2025-12-15 10:07:36.288 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:07:36 localhost nova_compute[286344]: 2025-12-15 10:07:36.305 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 
[POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:07:36 localhost nova_compute[286344]: 2025-12-15 10:07:36.312 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:07:36 localhost nova_compute[286344]: 2025-12-15 10:07:36.354 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:07:36 localhost nova_compute[286344]: 2025-12-15 10:07:36.390 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:07:36 localhost neutron_sriov_agent[260044]: 2025-12-15 10:07:36.825 2 INFO neutron.agent.securitygroups_rpc [None req-c66e5f11-81d3-4787-a261-34ad36a2bb71 6b5da6f221214afe93e1fa66574f238b 89e710ef9f4f48d48a369002db572947 - - default default] Security group member updated ['a6c5f808-dddc-4f17-acbf-63b1b6e6f4d6']#033[00m Dec 15 05:07:37 localhost podman[328051]: Dec 15 05:07:37 localhost podman[328051]: 2025-12-15 10:07:37.300037204 +0000 UTC m=+0.091357486 container create 81703f10f72a1f3e9fd6eabdfdbb9f6088f22400b162646d4c8888949334d570 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-c0669abd-aef1-4b0d-9f97-a6adeeac3211, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2) Dec 15 05:07:37 localhost systemd[1]: Started libpod-conmon-81703f10f72a1f3e9fd6eabdfdbb9f6088f22400b162646d4c8888949334d570.scope. 
Dec 15 05:07:37 localhost podman[328051]: 2025-12-15 10:07:37.256620253 +0000 UTC m=+0.047940555 image pull quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified Dec 15 05:07:37 localhost systemd[1]: tmp-crun.qWVLEP.mount: Deactivated successfully. Dec 15 05:07:37 localhost systemd[1]: Started libcrun container. Dec 15 05:07:37 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/e34110eed3198c597215aa418d1498221edd214641dba7a55811595c99cb5cea/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff) Dec 15 05:07:37 localhost podman[328051]: 2025-12-15 10:07:37.381779696 +0000 UTC m=+0.173099968 container init 81703f10f72a1f3e9fd6eabdfdbb9f6088f22400b162646d4c8888949334d570 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-c0669abd-aef1-4b0d-9f97-a6adeeac3211, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.schema-version=1.0) Dec 15 05:07:37 localhost podman[328051]: 2025-12-15 10:07:37.39186737 +0000 UTC m=+0.183187642 container start 81703f10f72a1f3e9fd6eabdfdbb9f6088f22400b162646d4c8888949334d570 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-c0669abd-aef1-4b0d-9f97-a6adeeac3211, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb) Dec 15 05:07:37 localhost dnsmasq[328070]: started, version 2.85 cachesize 150 Dec 15 05:07:37 
localhost dnsmasq[328070]: DNS service limited to local subnets Dec 15 05:07:37 localhost dnsmasq[328070]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile Dec 15 05:07:37 localhost dnsmasq[328070]: warning: no upstream servers configured Dec 15 05:07:37 localhost dnsmasq-dhcp[328070]: DHCPv6, static leases only on 2001:db8::, lease time 1d Dec 15 05:07:37 localhost dnsmasq[328070]: read /var/lib/neutron/dhcp/c0669abd-aef1-4b0d-9f97-a6adeeac3211/addn_hosts - 0 addresses Dec 15 05:07:37 localhost dnsmasq-dhcp[328070]: read /var/lib/neutron/dhcp/c0669abd-aef1-4b0d-9f97-a6adeeac3211/host Dec 15 05:07:37 localhost dnsmasq-dhcp[328070]: read /var/lib/neutron/dhcp/c0669abd-aef1-4b0d-9f97-a6adeeac3211/opts Dec 15 05:07:37 localhost neutron_dhcp_agent[267542]: 2025-12-15 10:07:37.453 267546 INFO neutron.agent.dhcp.agent [None req-0bf57e73-aaf0-4a23-b9a8-a4b5b02b53d4 - - - - - -] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-12-15T10:07:35Z, description=, device_id=, device_owner=, dns_assignment=[], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[], id=a197e222-fe68-4cb4-bd5e-3feac6f7a1f4, ip_allocation=immediate, mac_address=fa:16:3e:6e:10:3f, name=tempest-NetworksTestDHCPv6-536107815, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-12-15T10:05:39Z, description=, dns_domain=, id=c0669abd-aef1-4b0d-9f97-a6adeeac3211, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-NetworksTestDHCPv6-test-network-1861207414, port_security_enabled=True, project_id=89e710ef9f4f48d48a369002db572947, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=20673, qos_policy_id=None, 
revision_number=44, router:external=False, shared=False, standard_attr_id=2164, status=ACTIVE, subnets=['f39e6418-8e9f-4f2c-b6e5-b1ad0a730933'], tags=[], tenant_id=89e710ef9f4f48d48a369002db572947, updated_at=2025-12-15T10:07:34Z, vlan_transparent=None, network_id=c0669abd-aef1-4b0d-9f97-a6adeeac3211, port_security_enabled=True, project_id=89e710ef9f4f48d48a369002db572947, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=['a6c5f808-dddc-4f17-acbf-63b1b6e6f4d6'], standard_attr_id=2774, status=DOWN, tags=[], tenant_id=89e710ef9f4f48d48a369002db572947, updated_at=2025-12-15T10:07:36Z on network c0669abd-aef1-4b0d-9f97-a6adeeac3211#033[00m Dec 15 05:07:37 localhost neutron_dhcp_agent[267542]: 2025-12-15 10:07:37.554 267546 INFO neutron.agent.dhcp.agent [None req-c1184c28-fba8-48b2-bf50-e1defa13542a - - - - - -] DHCP configuration for ports {'79503367-f53f-4b35-8760-76fcaa4d8407'} is completed#033[00m Dec 15 05:07:37 localhost dnsmasq[328070]: read /var/lib/neutron/dhcp/c0669abd-aef1-4b0d-9f97-a6adeeac3211/addn_hosts - 1 addresses Dec 15 05:07:37 localhost dnsmasq-dhcp[328070]: read /var/lib/neutron/dhcp/c0669abd-aef1-4b0d-9f97-a6adeeac3211/host Dec 15 05:07:37 localhost dnsmasq-dhcp[328070]: read /var/lib/neutron/dhcp/c0669abd-aef1-4b0d-9f97-a6adeeac3211/opts Dec 15 05:07:37 localhost podman[328090]: 2025-12-15 10:07:37.645872507 +0000 UTC m=+0.060270459 container kill 81703f10f72a1f3e9fd6eabdfdbb9f6088f22400b162646d4c8888949334d570 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-c0669abd-aef1-4b0d-9f97-a6adeeac3211, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb) Dec 
15 05:07:37 localhost neutron_sriov_agent[260044]: 2025-12-15 10:07:37.844 2 INFO neutron.agent.securitygroups_rpc [None req-2192368b-90ec-4aaf-9083-0fccf1383351 6b5da6f221214afe93e1fa66574f238b 89e710ef9f4f48d48a369002db572947 - - default default] Security group member updated ['a6c5f808-dddc-4f17-acbf-63b1b6e6f4d6']#033[00m Dec 15 05:07:37 localhost neutron_dhcp_agent[267542]: 2025-12-15 10:07:37.873 267546 INFO neutron.agent.dhcp.agent [None req-c232c2a4-efb6-4d7e-93f0-68562d17f902 - - - - - -] DHCP configuration for ports {'a197e222-fe68-4cb4-bd5e-3feac6f7a1f4'} is completed#033[00m Dec 15 05:07:38 localhost dnsmasq[328070]: read /var/lib/neutron/dhcp/c0669abd-aef1-4b0d-9f97-a6adeeac3211/addn_hosts - 0 addresses Dec 15 05:07:38 localhost dnsmasq-dhcp[328070]: read /var/lib/neutron/dhcp/c0669abd-aef1-4b0d-9f97-a6adeeac3211/host Dec 15 05:07:38 localhost dnsmasq-dhcp[328070]: read /var/lib/neutron/dhcp/c0669abd-aef1-4b0d-9f97-a6adeeac3211/opts Dec 15 05:07:38 localhost podman[328128]: 2025-12-15 10:07:38.095139992 +0000 UTC m=+0.062626773 container kill 81703f10f72a1f3e9fd6eabdfdbb9f6088f22400b162646d4c8888949334d570 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-c0669abd-aef1-4b0d-9f97-a6adeeac3211, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.license=GPLv2) Dec 15 05:07:38 localhost ceph-mon[298913]: mon.np0005559462@0(leader) e17 handle_command mon_command({"prefix": "auth get-or-create", "entity": "client.tempest-cephx-id-1695627887", "caps": ["mds", "allow rw path=/volumes/_nogroup/9c0624f9-769c-4121-86d5-16884e5a85ac/74d28453-dfb5-4d70-8223-e8abd5716c4b", "osd", "allow rw pool=manila_data 
namespace=fsvolumens_9c0624f9-769c-4121-86d5-16884e5a85ac", "mon", "allow r"], "format": "json"} v 0) Dec 15 05:07:38 localhost ceph-mon[298913]: log_channel(audit) log [INF] : from='mgr.34411 ' entity='mgr.np0005559464.aomnqe' cmd={"prefix": "auth get-or-create", "entity": "client.tempest-cephx-id-1695627887", "caps": ["mds", "allow rw path=/volumes/_nogroup/9c0624f9-769c-4121-86d5-16884e5a85ac/74d28453-dfb5-4d70-8223-e8abd5716c4b", "osd", "allow rw pool=manila_data namespace=fsvolumens_9c0624f9-769c-4121-86d5-16884e5a85ac", "mon", "allow r"], "format": "json"} : dispatch Dec 15 05:07:38 localhost ceph-mon[298913]: log_channel(audit) log [INF] : from='mgr.34411 ' entity='mgr.np0005559464.aomnqe' cmd='[{"prefix": "auth get-or-create", "entity": "client.tempest-cephx-id-1695627887", "caps": ["mds", "allow rw path=/volumes/_nogroup/9c0624f9-769c-4121-86d5-16884e5a85ac/74d28453-dfb5-4d70-8223-e8abd5716c4b", "osd", "allow rw pool=manila_data namespace=fsvolumens_9c0624f9-769c-4121-86d5-16884e5a85ac", "mon", "allow r"], "format": "json"}]': finished Dec 15 05:07:38 localhost ceph-mon[298913]: from='mgr.34411 172.18.0.108:0/929118679' entity='mgr.np0005559464.aomnqe' cmd={"prefix": "auth get", "entity": "client.tempest-cephx-id-1695627887", "format": "json"} : dispatch Dec 15 05:07:38 localhost ceph-mon[298913]: from='mgr.34411 172.18.0.108:0/929118679' entity='mgr.np0005559464.aomnqe' cmd={"prefix": "auth get-or-create", "entity": "client.tempest-cephx-id-1695627887", "caps": ["mds", "allow rw path=/volumes/_nogroup/9c0624f9-769c-4121-86d5-16884e5a85ac/74d28453-dfb5-4d70-8223-e8abd5716c4b", "osd", "allow rw pool=manila_data namespace=fsvolumens_9c0624f9-769c-4121-86d5-16884e5a85ac", "mon", "allow r"], "format": "json"} : dispatch Dec 15 05:07:38 localhost ceph-mon[298913]: from='mgr.34411 ' entity='mgr.np0005559464.aomnqe' cmd={"prefix": "auth get-or-create", "entity": "client.tempest-cephx-id-1695627887", "caps": ["mds", "allow rw 
path=/volumes/_nogroup/9c0624f9-769c-4121-86d5-16884e5a85ac/74d28453-dfb5-4d70-8223-e8abd5716c4b", "osd", "allow rw pool=manila_data namespace=fsvolumens_9c0624f9-769c-4121-86d5-16884e5a85ac", "mon", "allow r"], "format": "json"} : dispatch Dec 15 05:07:38 localhost ceph-mon[298913]: from='mgr.34411 ' entity='mgr.np0005559464.aomnqe' cmd='[{"prefix": "auth get-or-create", "entity": "client.tempest-cephx-id-1695627887", "caps": ["mds", "allow rw path=/volumes/_nogroup/9c0624f9-769c-4121-86d5-16884e5a85ac/74d28453-dfb5-4d70-8223-e8abd5716c4b", "osd", "allow rw pool=manila_data namespace=fsvolumens_9c0624f9-769c-4121-86d5-16884e5a85ac", "mon", "allow r"], "format": "json"}]': finished Dec 15 05:07:38 localhost nova_compute[286344]: 2025-12-15 10:07:38.351 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:07:38 localhost dnsmasq[328070]: exiting on receipt of SIGTERM Dec 15 05:07:38 localhost podman[328167]: 2025-12-15 10:07:38.767166825 +0000 UTC m=+0.055701295 container kill 81703f10f72a1f3e9fd6eabdfdbb9f6088f22400b162646d4c8888949334d570 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-c0669abd-aef1-4b0d-9f97-a6adeeac3211, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.license=GPLv2, io.buildah.version=1.41.3) Dec 15 05:07:38 localhost systemd[1]: libpod-81703f10f72a1f3e9fd6eabdfdbb9f6088f22400b162646d4c8888949334d570.scope: Deactivated successfully. 
Dec 15 05:07:38 localhost podman[328182]: 2025-12-15 10:07:38.84087781 +0000 UTC m=+0.056674583 container died 81703f10f72a1f3e9fd6eabdfdbb9f6088f22400b162646d4c8888949334d570 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-c0669abd-aef1-4b0d-9f97-a6adeeac3211, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0) Dec 15 05:07:38 localhost podman[328182]: 2025-12-15 10:07:38.879198821 +0000 UTC m=+0.094995554 container cleanup 81703f10f72a1f3e9fd6eabdfdbb9f6088f22400b162646d4c8888949334d570 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-c0669abd-aef1-4b0d-9f97-a6adeeac3211, tcib_managed=true, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team) Dec 15 05:07:38 localhost systemd[1]: libpod-conmon-81703f10f72a1f3e9fd6eabdfdbb9f6088f22400b162646d4c8888949334d570.scope: Deactivated successfully. 
Dec 15 05:07:38 localhost ovn_controller[154603]: 2025-12-15T10:07:38Z|00386|binding|INFO|Releasing lport b35254ad-12eb-47bb-92be-44fefe0694f0 from this chassis (sb_readonly=0) Dec 15 05:07:38 localhost podman[328183]: 2025-12-15 10:07:38.923234409 +0000 UTC m=+0.131630200 container remove 81703f10f72a1f3e9fd6eabdfdbb9f6088f22400b162646d4c8888949334d570 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-c0669abd-aef1-4b0d-9f97-a6adeeac3211, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS) Dec 15 05:07:38 localhost kernel: device tapa2fb0857-25 left promiscuous mode Dec 15 05:07:38 localhost nova_compute[286344]: 2025-12-15 10:07:38.936 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:07:38 localhost ovn_controller[154603]: 2025-12-15T10:07:38Z|00387|binding|INFO|Releasing lport a2fb0857-2598-4fe2-9f7a-0be5adf79f04 from this chassis (sb_readonly=0) Dec 15 05:07:38 localhost ovn_controller[154603]: 2025-12-15T10:07:38Z|00388|binding|INFO|Setting lport a2fb0857-2598-4fe2-9f7a-0be5adf79f04 down in Southbound Dec 15 05:07:38 localhost ovn_metadata_agent[160585]: 2025-12-15 10:07:38.948 160590 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005559462.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': 
'2001:db8::f816:3eff:fe9a:3f4a/64', 'neutron:device_id': 'dhcpce287b4b-a043-5ed9-8e08-4fd555d768a2-c0669abd-aef1-4b0d-9f97-a6adeeac3211', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-c0669abd-aef1-4b0d-9f97-a6adeeac3211', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '89e710ef9f4f48d48a369002db572947', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'np0005559462.localdomain'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=e47821dc-5f5d-44dc-8a16-54817df4049d, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=a2fb0857-2598-4fe2-9f7a-0be5adf79f04) old=Port_Binding(up=[True], chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Dec 15 05:07:38 localhost ovn_metadata_agent[160585]: 2025-12-15 10:07:38.950 160590 INFO neutron.agent.ovn.metadata.agent [-] Port a2fb0857-2598-4fe2-9f7a-0be5adf79f04 in datapath c0669abd-aef1-4b0d-9f97-a6adeeac3211 unbound from our chassis#033[00m Dec 15 05:07:38 localhost ovn_metadata_agent[160585]: 2025-12-15 10:07:38.953 160590 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network c0669abd-aef1-4b0d-9f97-a6adeeac3211, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m Dec 15 05:07:38 localhost ovn_metadata_agent[160585]: 2025-12-15 10:07:38.955 160858 DEBUG oslo.privsep.daemon [-] privsep: reply[412cafad-a7a0-4a00-a49f-af6e82eb647d]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Dec 15 05:07:38 localhost nova_compute[286344]: 2025-12-15 10:07:38.974 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup 
/usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:07:38 localhost nova_compute[286344]: 2025-12-15 10:07:38.978 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:07:38 localhost nova_compute[286344]: 2025-12-15 10:07:38.984 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:07:39 localhost ceph-mon[298913]: mon.np0005559462@0(leader) e17 handle_command mon_command({"prefix": "auth rm", "entity": "client.tempest-cephx-id-1695627887"} v 0) Dec 15 05:07:39 localhost ceph-mon[298913]: log_channel(audit) log [INF] : from='mgr.34411 ' entity='mgr.np0005559464.aomnqe' cmd={"prefix": "auth rm", "entity": "client.tempest-cephx-id-1695627887"} : dispatch Dec 15 05:07:39 localhost ceph-mon[298913]: log_channel(audit) log [INF] : from='mgr.34411 ' entity='mgr.np0005559464.aomnqe' cmd='[{"prefix": "auth rm", "entity": "client.tempest-cephx-id-1695627887"}]': finished Dec 15 05:07:39 localhost systemd[1]: var-lib-containers-storage-overlay-e34110eed3198c597215aa418d1498221edd214641dba7a55811595c99cb5cea-merged.mount: Deactivated successfully. Dec 15 05:07:39 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-81703f10f72a1f3e9fd6eabdfdbb9f6088f22400b162646d4c8888949334d570-userdata-shm.mount: Deactivated successfully. Dec 15 05:07:39 localhost systemd[1]: run-netns-qdhcp\x2dc0669abd\x2daef1\x2d4b0d\x2d9f97\x2da6adeeac3211.mount: Deactivated successfully. 
Dec 15 05:07:39 localhost ceph-mon[298913]: mon.np0005559462@0(leader).osd e188 do_prune osdmap full prune enabled Dec 15 05:07:39 localhost ceph-mon[298913]: from='mgr.34411 172.18.0.108:0/929118679' entity='mgr.np0005559464.aomnqe' cmd={"prefix": "auth get", "entity": "client.tempest-cephx-id-1695627887", "format": "json"} : dispatch Dec 15 05:07:39 localhost ceph-mon[298913]: from='mgr.34411 172.18.0.108:0/929118679' entity='mgr.np0005559464.aomnqe' cmd={"prefix": "auth rm", "entity": "client.tempest-cephx-id-1695627887"} : dispatch Dec 15 05:07:39 localhost ceph-mon[298913]: from='mgr.34411 ' entity='mgr.np0005559464.aomnqe' cmd={"prefix": "auth rm", "entity": "client.tempest-cephx-id-1695627887"} : dispatch Dec 15 05:07:39 localhost ceph-mon[298913]: from='mgr.34411 ' entity='mgr.np0005559464.aomnqe' cmd='[{"prefix": "auth rm", "entity": "client.tempest-cephx-id-1695627887"}]': finished Dec 15 05:07:39 localhost ceph-mon[298913]: mon.np0005559462@0(leader).osd e189 e189: 6 total, 6 up, 6 in Dec 15 05:07:39 localhost ceph-mon[298913]: log_channel(cluster) log [DBG] : osdmap e189: 6 total, 6 up, 6 in Dec 15 05:07:39 localhost neutron_sriov_agent[260044]: 2025-12-15 10:07:39.688 2 INFO neutron.agent.securitygroups_rpc [None req-8a334cc9-2d07-4711-aa34-29086b4f2379 6b5da6f221214afe93e1fa66574f238b 89e710ef9f4f48d48a369002db572947 - - default default] Security group member updated ['a6c5f808-dddc-4f17-acbf-63b1b6e6f4d6']#033[00m Dec 15 05:07:40 localhost ceph-mon[298913]: mon.np0005559462@0(leader) e17 handle_command mon_command({"prefix":"df", "format":"json"} v 0) Dec 15 05:07:40 localhost ceph-mon[298913]: log_channel(audit) log [DBG] : from='client.? 
172.18.0.32:0/709212148' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch Dec 15 05:07:40 localhost ceph-mon[298913]: mon.np0005559462@0(leader) e17 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0) Dec 15 05:07:40 localhost ceph-mon[298913]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/709212148' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch Dec 15 05:07:40 localhost neutron_dhcp_agent[267542]: 2025-12-15 10:07:40.173 267546 INFO neutron.agent.linux.ip_lib [None req-6d4e1b1f-cef3-48a5-ad87-b9c7499b10c6 - - - - - -] Device tapcd7958e6-24 cannot be used as it has no MAC address#033[00m Dec 15 05:07:40 localhost nova_compute[286344]: 2025-12-15 10:07:40.227 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:07:40 localhost kernel: device tapcd7958e6-24 entered promiscuous mode Dec 15 05:07:40 localhost nova_compute[286344]: 2025-12-15 10:07:40.234 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:07:40 localhost NetworkManager[5963]: [1765793260.2350] manager: (tapcd7958e6-24): new Generic device (/org/freedesktop/NetworkManager/Devices/62) Dec 15 05:07:40 localhost ovn_controller[154603]: 2025-12-15T10:07:40Z|00389|binding|INFO|Claiming lport cd7958e6-2496-4ef1-9da2-8e3a6a9b2c7a for this chassis. Dec 15 05:07:40 localhost ovn_controller[154603]: 2025-12-15T10:07:40Z|00390|binding|INFO|cd7958e6-2496-4ef1-9da2-8e3a6a9b2c7a: Claiming unknown Dec 15 05:07:40 localhost systemd-udevd[328221]: Network interface NamePolicy= disabled on kernel command line. 
Dec 15 05:07:40 localhost ovn_controller[154603]: 2025-12-15T10:07:40Z|00391|binding|INFO|Setting lport cd7958e6-2496-4ef1-9da2-8e3a6a9b2c7a ovn-installed in OVS
Dec 15 05:07:40 localhost ovn_controller[154603]: 2025-12-15T10:07:40Z|00392|binding|INFO|Setting lport cd7958e6-2496-4ef1-9da2-8e3a6a9b2c7a up in Southbound
Dec 15 05:07:40 localhost nova_compute[286344]: 2025-12-15 10:07:40.246 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 15 05:07:40 localhost ovn_metadata_agent[160585]: 2025-12-15 10:07:40.246 160590 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005559462.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8::2/64', 'neutron:device_id': 'dhcpce287b4b-a043-5ed9-8e08-4fd555d768a2-c0669abd-aef1-4b0d-9f97-a6adeeac3211', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-c0669abd-aef1-4b0d-9f97-a6adeeac3211', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '89e710ef9f4f48d48a369002db572947', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=e47821dc-5f5d-44dc-8a16-54817df4049d, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=cd7958e6-2496-4ef1-9da2-8e3a6a9b2c7a) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 15 05:07:40 localhost ovn_metadata_agent[160585]: 2025-12-15 10:07:40.249 160590 INFO neutron.agent.ovn.metadata.agent [-] Port cd7958e6-2496-4ef1-9da2-8e3a6a9b2c7a in datapath c0669abd-aef1-4b0d-9f97-a6adeeac3211 bound to our chassis
Dec 15 05:07:40 localhost ovn_metadata_agent[160585]: 2025-12-15 10:07:40.252 160590 DEBUG neutron.agent.ovn.metadata.agent [-] Port ba867ea8-98dd-461d-b89c-b05b0058edfb IP addresses were not retrieved from the Port_Binding MAC column ['unknown'] _get_port_ips /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:536
Dec 15 05:07:40 localhost ovn_metadata_agent[160585]: 2025-12-15 10:07:40.252 160590 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network c0669abd-aef1-4b0d-9f97-a6adeeac3211, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Dec 15 05:07:40 localhost ovn_metadata_agent[160585]: 2025-12-15 10:07:40.253 160858 DEBUG oslo.privsep.daemon [-] privsep: reply[887d0359-02c4-4bdc-9b2e-3bda2d6ded1a]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 15 05:07:40 localhost nova_compute[286344]: 2025-12-15 10:07:40.264 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 15 05:07:40 localhost nova_compute[286344]: 2025-12-15 10:07:40.311 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 15 05:07:40 localhost nova_compute[286344]: 2025-12-15 10:07:40.348 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 15 05:07:40 localhost ceph-mon[298913]: mon.np0005559462@0(leader).osd e189 do_prune osdmap full prune enabled
Dec 15 05:07:40 localhost ceph-mon[298913]: mon.np0005559462@0(leader).osd e190 e190: 6 total, 6 up, 6 in
Dec 15 05:07:40 localhost ceph-mon[298913]: log_channel(cluster) log [DBG] : osdmap e190: 6 total, 6 up, 6 in
Dec 15 05:07:40 localhost neutron_sriov_agent[260044]: 2025-12-15 10:07:40.465 2 INFO neutron.agent.securitygroups_rpc [None req-bffcd14c-07db-4b82-a59d-c52b38bc765b 6b5da6f221214afe93e1fa66574f238b 89e710ef9f4f48d48a369002db572947 - - default default] Security group member updated ['a6c5f808-dddc-4f17-acbf-63b1b6e6f4d6']
Dec 15 05:07:40 localhost nova_compute[286344]: 2025-12-15 10:07:40.763 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 15 05:07:41 localhost neutron_dhcp_agent[267542]: 2025-12-15 10:07:41.156 267546 INFO neutron.agent.linux.ip_lib [None req-92d509c5-4f7c-4480-9d8f-e5a2c31a4b8d - - - - - -] Device tap66b56f34-78 cannot be used as it has no MAC address
Dec 15 05:07:41 localhost nova_compute[286344]: 2025-12-15 10:07:41.185 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 15 05:07:41 localhost kernel: device tap66b56f34-78 entered promiscuous mode
Dec 15 05:07:41 localhost NetworkManager[5963]: [1765793261.1907] manager: (tap66b56f34-78): new Generic device (/org/freedesktop/NetworkManager/Devices/63)
Dec 15 05:07:41 localhost nova_compute[286344]: 2025-12-15 10:07:41.190 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 15 05:07:41 localhost ovn_controller[154603]: 2025-12-15T10:07:41Z|00393|binding|INFO|Claiming lport 66b56f34-783c-4d71-80f4-e3fa75cf8cca for this chassis.
Dec 15 05:07:41 localhost ovn_controller[154603]: 2025-12-15T10:07:41Z|00394|binding|INFO|66b56f34-783c-4d71-80f4-e3fa75cf8cca: Claiming unknown
Dec 15 05:07:41 localhost ovn_metadata_agent[160585]: 2025-12-15 10:07:41.201 160590 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005559462.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 'dhcpce287b4b-a043-5ed9-8e08-4fd555d768a2-42183817-84a7-4017-8689-6b11f9300b4a', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-42183817-84a7-4017-8689-6b11f9300b4a', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '5a302f70917d47c392ee5c9b50e38a7e', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=244e37fd-d2a5-4937-9c41-9dcd4626ee07, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=66b56f34-783c-4d71-80f4-e3fa75cf8cca) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 15 05:07:41 localhost ovn_metadata_agent[160585]: 2025-12-15 10:07:41.205 160590 INFO neutron.agent.ovn.metadata.agent [-] Port 66b56f34-783c-4d71-80f4-e3fa75cf8cca in datapath 42183817-84a7-4017-8689-6b11f9300b4a bound to our chassis
Dec 15 05:07:41 localhost ovn_metadata_agent[160585]: 2025-12-15 10:07:41.208 160590 DEBUG neutron.agent.ovn.metadata.agent [-] Port 609e337b-6f3b-4a43-ad5e-009d1ce2a3b6 IP addresses were not retrieved from the Port_Binding MAC column ['unknown'] _get_port_ips /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:536
Dec 15 05:07:41 localhost ovn_metadata_agent[160585]: 2025-12-15 10:07:41.208 160590 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 42183817-84a7-4017-8689-6b11f9300b4a, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Dec 15 05:07:41 localhost ovn_metadata_agent[160585]: 2025-12-15 10:07:41.210 160858 DEBUG oslo.privsep.daemon [-] privsep: reply[80b2addb-4524-4438-b2aa-991cdb98152f]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 15 05:07:41 localhost podman[328278]:
Dec 15 05:07:41 localhost ceph-mon[298913]: mon.np0005559462@0(leader).osd e190 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Dec 15 05:07:41 localhost ceph-mon[298913]: mon.np0005559462@0(leader).osd e190 do_prune osdmap full prune enabled
Dec 15 05:07:41 localhost ovn_controller[154603]: 2025-12-15T10:07:41Z|00395|binding|INFO|Setting lport 66b56f34-783c-4d71-80f4-e3fa75cf8cca ovn-installed in OVS
Dec 15 05:07:41 localhost ovn_controller[154603]: 2025-12-15T10:07:41Z|00396|binding|INFO|Setting lport 66b56f34-783c-4d71-80f4-e3fa75cf8cca up in Southbound
Dec 15 05:07:41 localhost podman[328278]: 2025-12-15 10:07:41.235782629 +0000 UTC m=+0.104336859 container create 704c5d9953f13a5989fa946c09f7c43d8842a92413d03d9c09cd281bf3dd747e (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-c0669abd-aef1-4b0d-9f97-a6adeeac3211, io.buildah.version=1.41.3, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202)
Dec 15 05:07:41 localhost nova_compute[286344]: 2025-12-15 10:07:41.234 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 15 05:07:41 localhost ceph-mon[298913]: mon.np0005559462@0(leader).osd e191 e191: 6 total, 6 up, 6 in
Dec 15 05:07:41 localhost ceph-mon[298913]: log_channel(cluster) log [DBG] : osdmap e191: 6 total, 6 up, 6 in
Dec 15 05:07:41 localhost podman[328278]: 2025-12-15 10:07:41.17806984 +0000 UTC m=+0.046624110 image pull quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified
Dec 15 05:07:41 localhost systemd[1]: Started libpod-conmon-704c5d9953f13a5989fa946c09f7c43d8842a92413d03d9c09cd281bf3dd747e.scope.
Dec 15 05:07:41 localhost nova_compute[286344]: 2025-12-15 10:07:41.288 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 15 05:07:41 localhost systemd[1]: Started libcrun container.
Dec 15 05:07:41 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/0ec729e7395f61dbdedc4ce6299e1d2485c77a524cc1c05f5a4bf84d1f8ca2af/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Dec 15 05:07:41 localhost podman[328278]: 2025-12-15 10:07:41.31744954 +0000 UTC m=+0.186003790 container init 704c5d9953f13a5989fa946c09f7c43d8842a92413d03d9c09cd281bf3dd747e (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-c0669abd-aef1-4b0d-9f97-a6adeeac3211, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true)
Dec 15 05:07:41 localhost nova_compute[286344]: 2025-12-15 10:07:41.328 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 15 05:07:41 localhost podman[328278]: 2025-12-15 10:07:41.329883927 +0000 UTC m=+0.198438167 container start 704c5d9953f13a5989fa946c09f7c43d8842a92413d03d9c09cd281bf3dd747e (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-c0669abd-aef1-4b0d-9f97-a6adeeac3211, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0)
Dec 15 05:07:41 localhost dnsmasq[328309]: started, version 2.85 cachesize 150
Dec 15 05:07:41 localhost dnsmasq[328309]: DNS service limited to local subnets
Dec 15 05:07:41 localhost dnsmasq[328309]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile
Dec 15 05:07:41 localhost dnsmasq[328309]: warning: no upstream servers configured
Dec 15 05:07:41 localhost dnsmasq-dhcp[328309]: DHCPv6, static leases only on 2001:db8::, lease time 1d
Dec 15 05:07:41 localhost dnsmasq[328309]: read /var/lib/neutron/dhcp/c0669abd-aef1-4b0d-9f97-a6adeeac3211/addn_hosts - 0 addresses
Dec 15 05:07:41 localhost dnsmasq-dhcp[328309]: read /var/lib/neutron/dhcp/c0669abd-aef1-4b0d-9f97-a6adeeac3211/host
Dec 15 05:07:41 localhost dnsmasq-dhcp[328309]: read /var/lib/neutron/dhcp/c0669abd-aef1-4b0d-9f97-a6adeeac3211/opts
Dec 15 05:07:41 localhost neutron_dhcp_agent[267542]: 2025-12-15 10:07:41.384 267546 INFO neutron.agent.dhcp.agent [None req-6d4e1b1f-cef3-48a5-ad87-b9c7499b10c6 - - - - - -] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-12-15T10:07:39Z, description=, device_id=, device_owner=, dns_assignment=[], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[], id=348bd76c-3873-48d6-8300-6dc564b86a57, ip_allocation=immediate, mac_address=fa:16:3e:70:6a:6d, name=tempest-NetworksTestDHCPv6-1901095525, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-12-15T10:05:39Z, description=, dns_domain=, id=c0669abd-aef1-4b0d-9f97-a6adeeac3211, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-NetworksTestDHCPv6-test-network-1861207414, port_security_enabled=True, project_id=89e710ef9f4f48d48a369002db572947, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=20673, qos_policy_id=None, revision_number=46, router:external=False, shared=False, standard_attr_id=2164, status=ACTIVE, subnets=['c58cdc36-644e-49eb-bfdd-8d5633b14a3c'], tags=[], tenant_id=89e710ef9f4f48d48a369002db572947, updated_at=2025-12-15T10:07:38Z, vlan_transparent=None, network_id=c0669abd-aef1-4b0d-9f97-a6adeeac3211, port_security_enabled=True, project_id=89e710ef9f4f48d48a369002db572947, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=['a6c5f808-dddc-4f17-acbf-63b1b6e6f4d6'], standard_attr_id=2790, status=DOWN, tags=[], tenant_id=89e710ef9f4f48d48a369002db572947, updated_at=2025-12-15T10:07:39Z on network c0669abd-aef1-4b0d-9f97-a6adeeac3211
Dec 15 05:07:41 localhost neutron_dhcp_agent[267542]: 2025-12-15 10:07:41.441 267546 INFO neutron.agent.dhcp.agent [None req-eb907ac9-1eba-46c5-a95e-62c099844cbc - - - - - -] DHCP configuration for ports {'79503367-f53f-4b35-8760-76fcaa4d8407'} is completed
Dec 15 05:07:41 localhost dnsmasq[328309]: read /var/lib/neutron/dhcp/c0669abd-aef1-4b0d-9f97-a6adeeac3211/addn_hosts - 1 addresses
Dec 15 05:07:41 localhost dnsmasq-dhcp[328309]: read /var/lib/neutron/dhcp/c0669abd-aef1-4b0d-9f97-a6adeeac3211/host
Dec 15 05:07:41 localhost podman[328332]: 2025-12-15 10:07:41.600896866 +0000 UTC m=+0.062468749 container kill 704c5d9953f13a5989fa946c09f7c43d8842a92413d03d9c09cd281bf3dd747e (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-c0669abd-aef1-4b0d-9f97-a6adeeac3211, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, tcib_managed=true, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team)
Dec 15 05:07:41 localhost dnsmasq-dhcp[328309]: read /var/lib/neutron/dhcp/c0669abd-aef1-4b0d-9f97-a6adeeac3211/opts
Dec 15 05:07:41 localhost neutron_dhcp_agent[267542]: 2025-12-15 10:07:41.914 267546 INFO neutron.agent.dhcp.agent [None req-abe3fd2d-acfb-4f42-9aa4-0d56c5040a69 - - - - - -] DHCP configuration for ports {'348bd76c-3873-48d6-8300-6dc564b86a57'} is completed
Dec 15 05:07:42 localhost dnsmasq[328309]: exiting on receipt of SIGTERM
Dec 15 05:07:42 localhost podman[328389]: 2025-12-15 10:07:42.193667844 +0000 UTC m=+0.120083686 container kill 704c5d9953f13a5989fa946c09f7c43d8842a92413d03d9c09cd281bf3dd747e (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-c0669abd-aef1-4b0d-9f97-a6adeeac3211, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Dec 15 05:07:42 localhost systemd[1]: libpod-704c5d9953f13a5989fa946c09f7c43d8842a92413d03d9c09cd281bf3dd747e.scope: Deactivated successfully.
Dec 15 05:07:42 localhost dnsmasq[325890]: read /var/lib/neutron/dhcp/6211c52b-4c8a-4698-aaba-53022274894d/addn_hosts - 0 addresses
Dec 15 05:07:42 localhost dnsmasq-dhcp[325890]: read /var/lib/neutron/dhcp/6211c52b-4c8a-4698-aaba-53022274894d/host
Dec 15 05:07:42 localhost dnsmasq-dhcp[325890]: read /var/lib/neutron/dhcp/6211c52b-4c8a-4698-aaba-53022274894d/opts
Dec 15 05:07:42 localhost podman[328418]: 2025-12-15 10:07:42.218303154 +0000 UTC m=+0.064542106 container kill 7b4d183efe37b51e41316f0b47169ea601879afda0d8986486eddea57e3b7d7c (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-6211c52b-4c8a-4698-aaba-53022274894d, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Dec 15 05:07:42 localhost podman[328447]: 2025-12-15 10:07:42.27515644 +0000 UTC m=+0.055022587 container died 704c5d9953f13a5989fa946c09f7c43d8842a92413d03d9c09cd281bf3dd747e (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-c0669abd-aef1-4b0d-9f97-a6adeeac3211, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2)
Dec 15 05:07:42 localhost podman[328447]: 2025-12-15 10:07:42.317222814 +0000 UTC m=+0.097088941 container remove 704c5d9953f13a5989fa946c09f7c43d8842a92413d03d9c09cd281bf3dd747e (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-c0669abd-aef1-4b0d-9f97-a6adeeac3211, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Dec 15 05:07:42 localhost systemd[1]: libpod-conmon-704c5d9953f13a5989fa946c09f7c43d8842a92413d03d9c09cd281bf3dd747e.scope: Deactivated successfully.
Dec 15 05:07:42 localhost podman[328474]:
Dec 15 05:07:42 localhost podman[328474]: 2025-12-15 10:07:42.373920776 +0000 UTC m=+0.095832107 container create 734cacfd8eee1f89a540c0e1e23b93ca09747f8f6282f1adb6e778f0ed04a4a9 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-42183817-84a7-4017-8689-6b11f9300b4a, io.buildah.version=1.41.3, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 15 05:07:42 localhost ovn_controller[154603]: 2025-12-15T10:07:42Z|00397|binding|INFO|Releasing lport cd7958e6-2496-4ef1-9da2-8e3a6a9b2c7a from this chassis (sb_readonly=0)
Dec 15 05:07:42 localhost ovn_controller[154603]: 2025-12-15T10:07:42Z|00398|binding|INFO|Setting lport cd7958e6-2496-4ef1-9da2-8e3a6a9b2c7a down in Southbound
Dec 15 05:07:42 localhost nova_compute[286344]: 2025-12-15 10:07:42.384 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 15 05:07:42 localhost kernel: device tapcd7958e6-24 left promiscuous mode
Dec 15 05:07:42 localhost ovn_metadata_agent[160585]: 2025-12-15 10:07:42.391 160590 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005559462.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8::2/64', 'neutron:device_id': 'dhcpce287b4b-a043-5ed9-8e08-4fd555d768a2-c0669abd-aef1-4b0d-9f97-a6adeeac3211', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-c0669abd-aef1-4b0d-9f97-a6adeeac3211', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '89e710ef9f4f48d48a369002db572947', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'np0005559462.localdomain'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=e47821dc-5f5d-44dc-8a16-54817df4049d, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=cd7958e6-2496-4ef1-9da2-8e3a6a9b2c7a) old=Port_Binding(up=[True], chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 15 05:07:42 localhost ovn_metadata_agent[160585]: 2025-12-15 10:07:42.392 160590 INFO neutron.agent.ovn.metadata.agent [-] Port cd7958e6-2496-4ef1-9da2-8e3a6a9b2c7a in datapath c0669abd-aef1-4b0d-9f97-a6adeeac3211 unbound from our chassis
Dec 15 05:07:42 localhost ovn_metadata_agent[160585]: 2025-12-15 10:07:42.394 160590 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network c0669abd-aef1-4b0d-9f97-a6adeeac3211, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Dec 15 05:07:42 localhost ovn_metadata_agent[160585]: 2025-12-15 10:07:42.395 160858 DEBUG oslo.privsep.daemon [-] privsep: reply[979e86db-ea13-4218-ab8b-4395fb95a05f]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 15 05:07:42 localhost nova_compute[286344]: 2025-12-15 10:07:42.408 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 15 05:07:42 localhost systemd[1]: Started libpod-conmon-734cacfd8eee1f89a540c0e1e23b93ca09747f8f6282f1adb6e778f0ed04a4a9.scope.
Dec 15 05:07:42 localhost systemd[1]: Started libcrun container.
Dec 15 05:07:42 localhost ceph-mon[298913]: mon.np0005559462@0(leader).osd e191 do_prune osdmap full prune enabled
Dec 15 05:07:42 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/24069bfeeaca67e7042cb156882b19b67548d6e0749d71a3590f98c2b4d556a7/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Dec 15 05:07:42 localhost podman[328474]: 2025-12-15 10:07:42.332230412 +0000 UTC m=+0.054141793 image pull quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified
Dec 15 05:07:42 localhost podman[328474]: 2025-12-15 10:07:42.43882973 +0000 UTC m=+0.160741101 container init 734cacfd8eee1f89a540c0e1e23b93ca09747f8f6282f1adb6e778f0ed04a4a9 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-42183817-84a7-4017-8689-6b11f9300b4a, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202, io.buildah.version=1.41.3, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.license=GPLv2, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0)
Dec 15 05:07:42 localhost podman[328474]: 2025-12-15 10:07:42.449410878 +0000 UTC m=+0.171322259 container start 734cacfd8eee1f89a540c0e1e23b93ca09747f8f6282f1adb6e778f0ed04a4a9 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-42183817-84a7-4017-8689-6b11f9300b4a, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, tcib_managed=true)
Dec 15 05:07:42 localhost ceph-mon[298913]: mon.np0005559462@0(leader).osd e192 e192: 6 total, 6 up, 6 in
Dec 15 05:07:42 localhost dnsmasq[328502]: started, version 2.85 cachesize 150
Dec 15 05:07:42 localhost dnsmasq[328502]: DNS service limited to local subnets
Dec 15 05:07:42 localhost dnsmasq[328502]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile
Dec 15 05:07:42 localhost dnsmasq[328502]: warning: no upstream servers configured
Dec 15 05:07:42 localhost dnsmasq-dhcp[328502]: DHCP, static leases only on 10.100.0.0, lease time 1d
Dec 15 05:07:42 localhost dnsmasq[328502]: read /var/lib/neutron/dhcp/42183817-84a7-4017-8689-6b11f9300b4a/addn_hosts - 0 addresses
Dec 15 05:07:42 localhost dnsmasq-dhcp[328502]: read /var/lib/neutron/dhcp/42183817-84a7-4017-8689-6b11f9300b4a/host
Dec 15 05:07:42 localhost dnsmasq-dhcp[328502]: read /var/lib/neutron/dhcp/42183817-84a7-4017-8689-6b11f9300b4a/opts
Dec 15 05:07:42 localhost ceph-mon[298913]: log_channel(cluster) log [DBG] : osdmap e192: 6 total, 6 up, 6 in
Dec 15 05:07:42 localhost ovn_controller[154603]: 2025-12-15T10:07:42Z|00399|binding|INFO|Releasing lport 9ed4e45d-85f0-450a-a36f-0d214fbddc6c from this chassis (sb_readonly=0)
Dec 15 05:07:42 localhost ovn_controller[154603]: 2025-12-15T10:07:42Z|00400|binding|INFO|Setting lport 9ed4e45d-85f0-450a-a36f-0d214fbddc6c down in Southbound
Dec 15 05:07:42 localhost kernel: device tap9ed4e45d-85 left promiscuous mode
Dec 15 05:07:42 localhost nova_compute[286344]: 2025-12-15 10:07:42.530 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 15 05:07:42 localhost ovn_metadata_agent[160585]: 2025-12-15 10:07:42.537 160590 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005559462.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 'dhcpce287b4b-a043-5ed9-8e08-4fd555d768a2-6211c52b-4c8a-4698-aaba-53022274894d', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-6211c52b-4c8a-4698-aaba-53022274894d', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '5ccee293c21a4d25b8692241c1f8fb63', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'np0005559462.localdomain'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=19cc8789-4d1f-411f-8f3c-326f524c8ece, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=9ed4e45d-85f0-450a-a36f-0d214fbddc6c) old=Port_Binding(up=[True], chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 15 05:07:42 localhost ovn_metadata_agent[160585]: 2025-12-15 10:07:42.539 160590 INFO neutron.agent.ovn.metadata.agent [-] Port 9ed4e45d-85f0-450a-a36f-0d214fbddc6c in datapath 6211c52b-4c8a-4698-aaba-53022274894d unbound from our chassis
Dec 15 05:07:42 localhost ovn_metadata_agent[160585]: 2025-12-15 10:07:42.542 160590 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 6211c52b-4c8a-4698-aaba-53022274894d, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Dec 15 05:07:42 localhost ovn_metadata_agent[160585]: 2025-12-15 10:07:42.543 160858 DEBUG oslo.privsep.daemon [-] privsep: reply[3b9b796e-bca6-4118-b292-5e2ca3dc139b]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 15 05:07:42 localhost neutron_dhcp_agent[267542]: 2025-12-15 10:07:42.549 267546 INFO neutron.agent.dhcp.agent [None req-842ae29f-52a6-413d-973b-2c51de09261c - - - - - -] DHCP configuration for ports {'cf7c0a45-e71c-49c5-b07c-2d5756194686'} is completed
Dec 15 05:07:42 localhost nova_compute[286344]: 2025-12-15 10:07:42.556 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 15 05:07:42 localhost ceph-mon[298913]: mon.np0005559462@0(leader) e17 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0)
Dec 15 05:07:42 localhost ceph-mon[298913]: log_channel(audit) log [DBG] : from='client.25625 172.18.0.34:0/382777224' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Dec 15 05:07:42 localhost neutron_dhcp_agent[267542]: 2025-12-15 10:07:42.687 267546 INFO neutron.agent.dhcp.agent [None req-3b7f39af-822d-479d-8fe4-afd36ea7eead - - - - - -] Network not present, action: clean_devices, action_kwargs: {}
Dec 15 05:07:42 localhost neutron_dhcp_agent[267542]: 2025-12-15 10:07:42.687 267546 INFO neutron.agent.dhcp.agent [None req-3b7f39af-822d-479d-8fe4-afd36ea7eead - - - - - -] Network not present, action: clean_devices, action_kwargs: {}
Dec 15 05:07:42 localhost neutron_dhcp_agent[267542]: 2025-12-15 10:07:42.688 267546 INFO neutron.agent.dhcp.agent [None req-3b7f39af-822d-479d-8fe4-afd36ea7eead - - - - - -] Network not present, action: clean_devices, action_kwargs: {}
Dec 15 05:07:42 localhost dnsmasq[328502]: exiting on receipt of SIGTERM
Dec 15 05:07:42 localhost podman[328521]: 2025-12-15 10:07:42.820085607 +0000 UTC m=+0.070399485 container kill 734cacfd8eee1f89a540c0e1e23b93ca09747f8f6282f1adb6e778f0ed04a4a9 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-42183817-84a7-4017-8689-6b11f9300b4a, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_managed=true)
Dec 15 05:07:42 localhost systemd[1]: libpod-734cacfd8eee1f89a540c0e1e23b93ca09747f8f6282f1adb6e778f0ed04a4a9.scope: Deactivated successfully.
Dec 15 05:07:42 localhost podman[328533]: 2025-12-15 10:07:42.900057082 +0000 UTC m=+0.063254932 container died 734cacfd8eee1f89a540c0e1e23b93ca09747f8f6282f1adb6e778f0ed04a4a9 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-42183817-84a7-4017-8689-6b11f9300b4a, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0)
Dec 15 05:07:42 localhost podman[328533]: 2025-12-15 10:07:42.942484685 +0000 UTC m=+0.105682505 container cleanup 734cacfd8eee1f89a540c0e1e23b93ca09747f8f6282f1adb6e778f0ed04a4a9 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-42183817-84a7-4017-8689-6b11f9300b4a, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true)
Dec 15 05:07:42 localhost systemd[1]: libpod-conmon-734cacfd8eee1f89a540c0e1e23b93ca09747f8f6282f1adb6e778f0ed04a4a9.scope: Deactivated successfully.
Dec 15 05:07:42 localhost podman[328535]: 2025-12-15 10:07:42.981901747 +0000 UTC m=+0.137659044 container remove 734cacfd8eee1f89a540c0e1e23b93ca09747f8f6282f1adb6e778f0ed04a4a9 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-42183817-84a7-4017-8689-6b11f9300b4a, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Dec 15 05:07:42 localhost ovn_controller[154603]: 2025-12-15T10:07:42Z|00401|binding|INFO|Releasing lport 66b56f34-783c-4d71-80f4-e3fa75cf8cca from this chassis (sb_readonly=0)
Dec 15 05:07:42 localhost ovn_controller[154603]: 2025-12-15T10:07:42Z|00402|binding|INFO|Setting lport 66b56f34-783c-4d71-80f4-e3fa75cf8cca down in Southbound
Dec 15 05:07:42 localhost kernel: device tap66b56f34-78 left promiscuous mode
Dec 15 05:07:42 localhost nova_compute[286344]: 2025-12-15 10:07:42.995 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 15 05:07:43 localhost ovn_metadata_agent[160585]: 2025-12-15 10:07:43.002 160590 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005559462.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 'dhcpce287b4b-a043-5ed9-8e08-4fd555d768a2-42183817-84a7-4017-8689-6b11f9300b4a', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-42183817-84a7-4017-8689-6b11f9300b4a', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '5a302f70917d47c392ee5c9b50e38a7e', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'np0005559462.localdomain'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=244e37fd-d2a5-4937-9c41-9dcd4626ee07, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=66b56f34-783c-4d71-80f4-e3fa75cf8cca) old=Port_Binding(up=[True], chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 15 05:07:43 localhost ovn_metadata_agent[160585]: 2025-12-15 10:07:43.004 160590 INFO neutron.agent.ovn.metadata.agent [-] Port 66b56f34-783c-4d71-80f4-e3fa75cf8cca in datapath 42183817-84a7-4017-8689-6b11f9300b4a unbound from our chassis
Dec 15 05:07:43 localhost ovn_metadata_agent[160585]: 2025-12-15 10:07:43.005 160590 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network 42183817-84a7-4017-8689-6b11f9300b4a or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599
Dec 15 05:07:43 localhost ovn_metadata_agent[160585]: 2025-12-15 10:07:43.006 160858 DEBUG oslo.privsep.daemon [-] privsep: reply[570cd596-f753-4cfa-9487-03d4a1d6eeb3]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 15 05:07:43 localhost nova_compute[286344]: 2025-12-15 10:07:43.016 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 15 05:07:43 localhost systemd[1]: var-lib-containers-storage-overlay-0ec729e7395f61dbdedc4ce6299e1d2485c77a524cc1c05f5a4bf84d1f8ca2af-merged.mount: Deactivated successfully.
Dec 15 05:07:43 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-704c5d9953f13a5989fa946c09f7c43d8842a92413d03d9c09cd281bf3dd747e-userdata-shm.mount: Deactivated successfully.
Dec 15 05:07:43 localhost systemd[1]: run-netns-qdhcp\x2dc0669abd\x2daef1\x2d4b0d\x2d9f97\x2da6adeeac3211.mount: Deactivated successfully.
Dec 15 05:07:43 localhost systemd[1]: run-netns-qdhcp\x2d42183817\x2d84a7\x2d4017\x2d8689\x2d6b11f9300b4a.mount: Deactivated successfully.
Dec 15 05:07:43 localhost neutron_dhcp_agent[267542]: 2025-12-15 10:07:43.205 267546 INFO neutron.agent.dhcp.agent [None req-22325795-1295-42d0-90d9-4cdbfd316690 - - - - - -] Network not present, action: clean_devices, action_kwargs: {}
Dec 15 05:07:43 localhost neutron_dhcp_agent[267542]: 2025-12-15 10:07:43.206 267546 INFO neutron.agent.dhcp.agent [None req-22325795-1295-42d0-90d9-4cdbfd316690 - - - - - -] Network not present, action: clean_devices, action_kwargs: {}
Dec 15 05:07:43 localhost nova_compute[286344]: 2025-12-15 10:07:43.354 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 15 05:07:43 localhost ovn_controller[154603]: 2025-12-15T10:07:43Z|00403|binding|INFO|Releasing lport b35254ad-12eb-47bb-92be-44fefe0694f0 from this chassis (sb_readonly=0)
Dec 15 05:07:43 localhost nova_compute[286344]: 2025-12-15 10:07:43.791 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 15 05:07:44 localhost ceph-mon[298913]: mon.np0005559462@0(leader).osd e192 do_prune osdmap full prune enabled
Dec 15 05:07:44 localhost ceph-mon[298913]: mon.np0005559462@0(leader).osd e193 e193: 6 total, 6 up, 6 in
Dec 15 05:07:44 localhost ceph-mon[298913]: log_channel(cluster) log
[DBG] : osdmap e193: 6 total, 6 up, 6 in Dec 15 05:07:44 localhost systemd[1]: Started /usr/bin/podman healthcheck run 67ee72fcd99c896f79e8807764b1d0f04e7d45927d06e306c0986394e8ed42a0. Dec 15 05:07:44 localhost systemd[1]: Started /usr/bin/podman healthcheck run 730c50020a7d9e2da21d2462d40af20ac303e43f3491f692cbbd87f432ba3e09. Dec 15 05:07:44 localhost systemd[1]: Started /usr/bin/podman healthcheck run 9f0f481c937606298203797a71924b7614a5d9e3f4a8afd837cfa5b9602aedeb. Dec 15 05:07:44 localhost systemd[1]: Started /usr/bin/podman healthcheck run ac2eae30a2a7f971398af42ea67b3ed799d4b3341f7688ebcd73f7ca7f66c2a8. Dec 15 05:07:44 localhost systemd[1]: Started /usr/bin/podman healthcheck run b3d8483ade539500e228c2af9c0c3e647febb5ec7903610a83aea32631007d4a. Dec 15 05:07:44 localhost neutron_dhcp_agent[267542]: 2025-12-15 10:07:44.672 267546 INFO neutron.agent.linux.ip_lib [None req-cc2a6848-5958-4601-b901-38ab7cd243cf - - - - - -] Device tapc645485a-08 cannot be used as it has no MAC address#033[00m Dec 15 05:07:44 localhost podman[328578]: 2025-12-15 10:07:44.691836282 +0000 UTC m=+0.081873257 container health_status b3d8483ade539500e228c2af9c0c3e647febb5ec7903610a83aea32631007d4a (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '07f3af15d79276418f54a8dde2fc251064527c3571430e14af77aaa2c01f1193-cb1e9478634a00c8fb5ad9cd7fcd38c7b2a1a85187dfaa618bc715c24c2b15fb'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 
'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_id=ceilometer_agent_compute, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2) Dec 15 05:07:44 localhost podman[328578]: 2025-12-15 10:07:44.731680645 +0000 UTC m=+0.121717630 container exec_died b3d8483ade539500e228c2af9c0c3e647febb5ec7903610a83aea32631007d4a (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, container_name=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ceilometer_agent_compute, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251202, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '07f3af15d79276418f54a8dde2fc251064527c3571430e14af77aaa2c01f1193-cb1e9478634a00c8fb5ad9cd7fcd38c7b2a1a85187dfaa618bc715c24c2b15fb'}, 'healthcheck': {'mount': 
'/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team) Dec 15 05:07:44 localhost kernel: device tapc645485a-08 entered promiscuous mode Dec 15 05:07:44 localhost nova_compute[286344]: 2025-12-15 10:07:44.733 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:07:44 localhost NetworkManager[5963]: [1765793264.7393] manager: (tapc645485a-08): new Generic device (/org/freedesktop/NetworkManager/Devices/64) Dec 15 05:07:44 localhost systemd-udevd[328631]: Network interface NamePolicy= disabled on kernel command line. Dec 15 05:07:44 localhost nova_compute[286344]: 2025-12-15 10:07:44.742 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:07:44 localhost ovn_controller[154603]: 2025-12-15T10:07:44Z|00404|binding|INFO|Claiming lport c645485a-086d-48ea-a19e-1922e8d2dbd9 for this chassis. 
Dec 15 05:07:44 localhost ovn_controller[154603]: 2025-12-15T10:07:44Z|00405|binding|INFO|c645485a-086d-48ea-a19e-1922e8d2dbd9: Claiming unknown Dec 15 05:07:44 localhost systemd[1]: b3d8483ade539500e228c2af9c0c3e647febb5ec7903610a83aea32631007d4a.service: Deactivated successfully. Dec 15 05:07:44 localhost ovn_metadata_agent[160585]: 2025-12-15 10:07:44.752 160590 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005559462.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8::f816:3eff:fe31:8e8e/64', 'neutron:device_id': 'dhcpce287b4b-a043-5ed9-8e08-4fd555d768a2-c0669abd-aef1-4b0d-9f97-a6adeeac3211', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-c0669abd-aef1-4b0d-9f97-a6adeeac3211', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '89e710ef9f4f48d48a369002db572947', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=e47821dc-5f5d-44dc-8a16-54817df4049d, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=c645485a-086d-48ea-a19e-1922e8d2dbd9) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Dec 15 05:07:44 localhost ovn_metadata_agent[160585]: 2025-12-15 10:07:44.753 160590 INFO neutron.agent.ovn.metadata.agent [-] Port c645485a-086d-48ea-a19e-1922e8d2dbd9 in datapath c0669abd-aef1-4b0d-9f97-a6adeeac3211 bound to our chassis#033[00m Dec 15 05:07:44 
localhost ovn_metadata_agent[160585]: 2025-12-15 10:07:44.755 160590 DEBUG neutron.agent.ovn.metadata.agent [-] Port ad00b7a9-0b2b-4651-a26a-374b2bd1f287 IP addresses were not retrieved from the Port_Binding MAC column ['unknown'] _get_port_ips /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:536#033[00m Dec 15 05:07:44 localhost ovn_metadata_agent[160585]: 2025-12-15 10:07:44.755 160590 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network c0669abd-aef1-4b0d-9f97-a6adeeac3211, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m Dec 15 05:07:44 localhost ovn_metadata_agent[160585]: 2025-12-15 10:07:44.755 160858 DEBUG oslo.privsep.daemon [-] privsep: reply[14c04d35-4256-49f3-83a8-ff19b580df53]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Dec 15 05:07:44 localhost podman[328563]: 2025-12-15 10:07:44.758756621 +0000 UTC m=+0.165449909 container health_status 67ee72fcd99c896f79e8807764b1d0f04e7d45927d06e306c0986394e8ed42a0 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, container_name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': 
'/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter) Dec 15 05:07:44 localhost journal[231322]: ethtool ioctl error on tapc645485a-08: No such device Dec 15 05:07:44 localhost nova_compute[286344]: 2025-12-15 10:07:44.768 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:07:44 localhost journal[231322]: ethtool ioctl error on tapc645485a-08: No such device Dec 15 05:07:44 localhost ovn_controller[154603]: 2025-12-15T10:07:44Z|00406|binding|INFO|Setting lport c645485a-086d-48ea-a19e-1922e8d2dbd9 ovn-installed in OVS Dec 15 05:07:44 localhost ovn_controller[154603]: 2025-12-15T10:07:44Z|00407|binding|INFO|Setting lport c645485a-086d-48ea-a19e-1922e8d2dbd9 up in Southbound Dec 15 05:07:44 localhost nova_compute[286344]: 2025-12-15 10:07:44.770 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:07:44 localhost nova_compute[286344]: 2025-12-15 10:07:44.772 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:07:44 localhost journal[231322]: ethtool ioctl error on tapc645485a-08: No such device Dec 15 05:07:44 localhost journal[231322]: ethtool ioctl error on tapc645485a-08: No such device Dec 15 05:07:44 localhost journal[231322]: ethtool ioctl error on tapc645485a-08: No such device Dec 15 05:07:44 localhost journal[231322]: ethtool ioctl 
error on tapc645485a-08: No such device Dec 15 05:07:44 localhost journal[231322]: ethtool ioctl error on tapc645485a-08: No such device Dec 15 05:07:44 localhost journal[231322]: ethtool ioctl error on tapc645485a-08: No such device Dec 15 05:07:44 localhost podman[328563]: 2025-12-15 10:07:44.796419986 +0000 UTC m=+0.203113334 container exec_died 67ee72fcd99c896f79e8807764b1d0f04e7d45927d06e306c0986394e8ed42a0 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible) Dec 15 05:07:44 localhost nova_compute[286344]: 2025-12-15 10:07:44.803 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:07:44 localhost systemd[1]: 
tmp-crun.5DRQIN.mount: Deactivated successfully. Dec 15 05:07:44 localhost systemd[1]: 67ee72fcd99c896f79e8807764b1d0f04e7d45927d06e306c0986394e8ed42a0.service: Deactivated successfully. Dec 15 05:07:44 localhost podman[328567]: 2025-12-15 10:07:44.817569521 +0000 UTC m=+0.210669830 container health_status ac2eae30a2a7f971398af42ea67b3ed799d4b3341f7688ebcd73f7ca7f66c2a8 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '07f3af15d79276418f54a8dde2fc251064527c3571430e14af77aaa2c01f1193'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251202, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, container_name=ovn_controller) Dec 15 05:07:44 localhost nova_compute[286344]: 2025-12-15 10:07:44.846 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:07:44 localhost 
podman[328567]: 2025-12-15 10:07:44.855233785 +0000 UTC m=+0.248334124 container exec_died ac2eae30a2a7f971398af42ea67b3ed799d4b3341f7688ebcd73f7ca7f66c2a8 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, container_name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '07f3af15d79276418f54a8dde2fc251064527c3571430e14af77aaa2c01f1193'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true) Dec 15 05:07:44 localhost podman[328566]: 2025-12-15 10:07:44.870040218 +0000 UTC m=+0.268588825 container health_status 9f0f481c937606298203797a71924b7614a5d9e3f4a8afd837cfa5b9602aedeb (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, managed_by=edpm_ansible, container_name=multipathd, org.label-schema.build-date=20251202, 
org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=multipathd) Dec 15 05:07:44 localhost systemd[1]: ac2eae30a2a7f971398af42ea67b3ed799d4b3341f7688ebcd73f7ca7f66c2a8.service: Deactivated successfully. 
Dec 15 05:07:44 localhost podman[328564]: 2025-12-15 10:07:44.927381136 +0000 UTC m=+0.329606492 container health_status 730c50020a7d9e2da21d2462d40af20ac303e43f3491f692cbbd87f432ba3e09 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.component=ubi9-minimal-container, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=openstack_network_exporter, release=1755695350, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.expose-services=, version=9.6, io.openshift.tags=minimal rhel9, managed_by=edpm_ansible, name=ubi9-minimal, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vendor=Red Hat, Inc., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., vcs-type=git, container_name=openstack_network_exporter, io.buildah.version=1.33.7, maintainer=Red Hat, Inc., architecture=x86_64, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., url=https://catalog.redhat.com/en/search?searchType=containers, build-date=2025-08-20T13:12:41, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'cb1e9478634a00c8fb5ad9cd7fcd38c7b2a1a85187dfaa618bc715c24c2b15fb'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}) Dec 15 05:07:44 localhost podman[328564]: 2025-12-15 10:07:44.940485043 +0000 UTC m=+0.342710369 container exec_died 730c50020a7d9e2da21d2462d40af20ac303e43f3491f692cbbd87f432ba3e09 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, container_name=openstack_network_exporter, io.buildah.version=1.33.7, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. 
This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., architecture=x86_64, name=ubi9-minimal, maintainer=Red Hat, Inc., build-date=2025-08-20T13:12:41, version=9.6, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.tags=minimal rhel9, managed_by=edpm_ansible, com.redhat.component=ubi9-minimal-container, url=https://catalog.redhat.com/en/search?searchType=containers, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'cb1e9478634a00c8fb5ad9cd7fcd38c7b2a1a85187dfaa618bc715c24c2b15fb'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, distribution-scope=public, release=1755695350, vendor=Red Hat, Inc., 
vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vcs-type=git, config_id=openstack_network_exporter, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=) Dec 15 05:07:44 localhost podman[328566]: 2025-12-15 10:07:44.952016806 +0000 UTC m=+0.350565493 container exec_died 9f0f481c937606298203797a71924b7614a5d9e3f4a8afd837cfa5b9602aedeb (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=multipathd, org.label-schema.build-date=20251202, tcib_managed=true, config_id=multipathd, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, 
maintainer=OpenStack Kubernetes Operator team) Dec 15 05:07:44 localhost systemd[1]: 730c50020a7d9e2da21d2462d40af20ac303e43f3491f692cbbd87f432ba3e09.service: Deactivated successfully. Dec 15 05:07:44 localhost systemd[1]: 9f0f481c937606298203797a71924b7614a5d9e3f4a8afd837cfa5b9602aedeb.service: Deactivated successfully. Dec 15 05:07:45 localhost podman[328747]: Dec 15 05:07:45 localhost podman[328747]: 2025-12-15 10:07:45.651768664 +0000 UTC m=+0.081526008 container create 36277dfb1ff105856487ca0d1f39b07afdc61f9ea47071204a368fff1f8cdc4f (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-c0669abd-aef1-4b0d-9f97-a6adeeac3211, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image) Dec 15 05:07:45 localhost systemd[1]: Started libpod-conmon-36277dfb1ff105856487ca0d1f39b07afdc61f9ea47071204a368fff1f8cdc4f.scope. Dec 15 05:07:45 localhost podman[328747]: 2025-12-15 10:07:45.605852394 +0000 UTC m=+0.035609738 image pull quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified Dec 15 05:07:45 localhost systemd[1]: Started libcrun container. 
Dec 15 05:07:45 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/f7da3aae8d93248ad9ccb4a28cff87c3367a996301c7b29896e7d2c568edbe34/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff) Dec 15 05:07:45 localhost podman[328747]: 2025-12-15 10:07:45.719101714 +0000 UTC m=+0.148859068 container init 36277dfb1ff105856487ca0d1f39b07afdc61f9ea47071204a368fff1f8cdc4f (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-c0669abd-aef1-4b0d-9f97-a6adeeac3211, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, tcib_managed=true) Dec 15 05:07:45 localhost podman[328747]: 2025-12-15 10:07:45.72852954 +0000 UTC m=+0.158286884 container start 36277dfb1ff105856487ca0d1f39b07afdc61f9ea47071204a368fff1f8cdc4f (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-c0669abd-aef1-4b0d-9f97-a6adeeac3211, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251202, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb) Dec 15 05:07:45 localhost dnsmasq[328765]: started, version 2.85 cachesize 150 Dec 15 05:07:45 localhost dnsmasq[328765]: DNS service limited to local subnets Dec 15 05:07:45 localhost dnsmasq[328765]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile Dec 15 05:07:45 localhost dnsmasq[328765]: warning: no upstream servers 
configured Dec 15 05:07:45 localhost dnsmasq[328765]: read /var/lib/neutron/dhcp/c0669abd-aef1-4b0d-9f97-a6adeeac3211/addn_hosts - 0 addresses Dec 15 05:07:45 localhost nova_compute[286344]: 2025-12-15 10:07:45.816 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:07:45 localhost neutron_dhcp_agent[267542]: 2025-12-15 10:07:45.957 267546 INFO neutron.agent.dhcp.agent [None req-8f51b10c-c5f8-4173-afc1-3fd0e5844d8d - - - - - -] DHCP configuration for ports {'79503367-f53f-4b35-8760-76fcaa4d8407'} is completed#033[00m Dec 15 05:07:46 localhost dnsmasq[328765]: exiting on receipt of SIGTERM Dec 15 05:07:46 localhost podman[328782]: 2025-12-15 10:07:46.108938874 +0000 UTC m=+0.062396417 container kill 36277dfb1ff105856487ca0d1f39b07afdc61f9ea47071204a368fff1f8cdc4f (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-c0669abd-aef1-4b0d-9f97-a6adeeac3211, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb) Dec 15 05:07:46 localhost systemd[1]: libpod-36277dfb1ff105856487ca0d1f39b07afdc61f9ea47071204a368fff1f8cdc4f.scope: Deactivated successfully. 
Dec 15 05:07:46 localhost podman[328795]: 2025-12-15 10:07:46.180756347 +0000 UTC m=+0.059029946 container died 36277dfb1ff105856487ca0d1f39b07afdc61f9ea47071204a368fff1f8cdc4f (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-c0669abd-aef1-4b0d-9f97-a6adeeac3211, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image) Dec 15 05:07:46 localhost ovn_controller[154603]: 2025-12-15T10:07:46Z|00408|binding|INFO|Releasing lport b35254ad-12eb-47bb-92be-44fefe0694f0 from this chassis (sb_readonly=0) Dec 15 05:07:46 localhost nova_compute[286344]: 2025-12-15 10:07:46.217 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:07:46 localhost podman[328795]: 2025-12-15 10:07:46.220178069 +0000 UTC m=+0.098451628 container cleanup 36277dfb1ff105856487ca0d1f39b07afdc61f9ea47071204a368fff1f8cdc4f (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-c0669abd-aef1-4b0d-9f97-a6adeeac3211, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS) Dec 15 05:07:46 localhost systemd[1]: libpod-conmon-36277dfb1ff105856487ca0d1f39b07afdc61f9ea47071204a368fff1f8cdc4f.scope: Deactivated successfully. 
Dec 15 05:07:46 localhost ceph-mon[298913]: mon.np0005559462@0(leader).osd e193 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104 Dec 15 05:07:46 localhost podman[328797]: 2025-12-15 10:07:46.262899261 +0000 UTC m=+0.133078219 container remove 36277dfb1ff105856487ca0d1f39b07afdc61f9ea47071204a368fff1f8cdc4f (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-c0669abd-aef1-4b0d-9f97-a6adeeac3211, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS) Dec 15 05:07:46 localhost ceph-mon[298913]: mon.np0005559462@0(leader).osd e193 do_prune osdmap full prune enabled Dec 15 05:07:46 localhost ceph-mon[298913]: mon.np0005559462@0(leader).osd e194 e194: 6 total, 6 up, 6 in Dec 15 05:07:46 localhost ceph-mon[298913]: log_channel(cluster) log [DBG] : osdmap e194: 6 total, 6 up, 6 in Dec 15 05:07:46 localhost neutron_sriov_agent[260044]: 2025-12-15 10:07:46.547 2 INFO neutron.agent.securitygroups_rpc [None req-0df40bf9-cf29-458c-9404-719b72f3c4de 6b5da6f221214afe93e1fa66574f238b 89e710ef9f4f48d48a369002db572947 - - default default] Security group member updated ['a6c5f808-dddc-4f17-acbf-63b1b6e6f4d6']#033[00m Dec 15 05:07:46 localhost systemd[1]: var-lib-containers-storage-overlay-f7da3aae8d93248ad9ccb4a28cff87c3367a996301c7b29896e7d2c568edbe34-merged.mount: Deactivated successfully. Dec 15 05:07:46 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-36277dfb1ff105856487ca0d1f39b07afdc61f9ea47071204a368fff1f8cdc4f-userdata-shm.mount: Deactivated successfully. 
Dec 15 05:07:47 localhost neutron_sriov_agent[260044]: 2025-12-15 10:07:47.160 2 INFO neutron.agent.securitygroups_rpc [None req-e11de003-9ace-4793-9035-608803851e57 6b5da6f221214afe93e1fa66574f238b 89e710ef9f4f48d48a369002db572947 - - default default] Security group member updated ['a6c5f808-dddc-4f17-acbf-63b1b6e6f4d6']#033[00m Dec 15 05:07:47 localhost dnsmasq[325890]: exiting on receipt of SIGTERM Dec 15 05:07:47 localhost podman[328857]: 2025-12-15 10:07:47.176340218 +0000 UTC m=+0.063858218 container kill 7b4d183efe37b51e41316f0b47169ea601879afda0d8986486eddea57e3b7d7c (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-6211c52b-4c8a-4698-aaba-53022274894d, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2) Dec 15 05:07:47 localhost systemd[1]: libpod-7b4d183efe37b51e41316f0b47169ea601879afda0d8986486eddea57e3b7d7c.scope: Deactivated successfully. Dec 15 05:07:47 localhost podman[328875]: 2025-12-15 10:07:47.26397303 +0000 UTC m=+0.062399807 container died 7b4d183efe37b51e41316f0b47169ea601879afda0d8986486eddea57e3b7d7c (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-6211c52b-4c8a-4698-aaba-53022274894d, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS) Dec 15 05:07:47 localhost systemd[1]: tmp-crun.bDfIIN.mount: Deactivated successfully. 
Dec 15 05:07:47 localhost podman[328875]: 2025-12-15 10:07:47.3172987 +0000 UTC m=+0.115725447 container remove 7b4d183efe37b51e41316f0b47169ea601879afda0d8986486eddea57e3b7d7c (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-6211c52b-4c8a-4698-aaba-53022274894d, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.license=GPLv2, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS) Dec 15 05:07:47 localhost systemd[1]: libpod-conmon-7b4d183efe37b51e41316f0b47169ea601879afda0d8986486eddea57e3b7d7c.scope: Deactivated successfully. Dec 15 05:07:47 localhost neutron_dhcp_agent[267542]: 2025-12-15 10:07:47.365 267546 INFO neutron.agent.dhcp.agent [None req-f15025bd-403f-47bd-92a8-b2ba0e48949a - - - - - -] Network not present, action: clean_devices, action_kwargs: {}#033[00m Dec 15 05:07:47 localhost systemd[1]: var-lib-containers-storage-overlay-d2775776f0e273fb3d05b272d95f1ffd06426af493f94ead53af40fff91d2ecd-merged.mount: Deactivated successfully. Dec 15 05:07:47 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-7b4d183efe37b51e41316f0b47169ea601879afda0d8986486eddea57e3b7d7c-userdata-shm.mount: Deactivated successfully. Dec 15 05:07:47 localhost systemd[1]: run-netns-qdhcp\x2d6211c52b\x2d4c8a\x2d4698\x2daaba\x2d53022274894d.mount: Deactivated successfully. 
Dec 15 05:07:47 localhost neutron_dhcp_agent[267542]: 2025-12-15 10:07:47.686 267546 INFO neutron.agent.dhcp.agent [-] Network not present, action: clean_devices, action_kwargs: {}#033[00m Dec 15 05:07:47 localhost podman[328929]: Dec 15 05:07:47 localhost podman[328929]: 2025-12-15 10:07:47.825600352 +0000 UTC m=+0.087218783 container create 062a4e9ed068663fd31358325b21585fa0c4282ee890718359dfadba7774126d (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-c0669abd-aef1-4b0d-9f97-a6adeeac3211, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3) Dec 15 05:07:47 localhost systemd[1]: Started libpod-conmon-062a4e9ed068663fd31358325b21585fa0c4282ee890718359dfadba7774126d.scope. Dec 15 05:07:47 localhost systemd[1]: Started libcrun container. 
Dec 15 05:07:47 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/559de983d183607019ef7582e8a5bf3ce9281b22abc68315c8bb21074740eed4/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff) Dec 15 05:07:47 localhost podman[328929]: 2025-12-15 10:07:47.879203459 +0000 UTC m=+0.140821900 container init 062a4e9ed068663fd31358325b21585fa0c4282ee890718359dfadba7774126d (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-c0669abd-aef1-4b0d-9f97-a6adeeac3211, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3) Dec 15 05:07:47 localhost podman[328929]: 2025-12-15 10:07:47.786138469 +0000 UTC m=+0.047756920 image pull quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified Dec 15 05:07:47 localhost podman[328929]: 2025-12-15 10:07:47.888301846 +0000 UTC m=+0.149920277 container start 062a4e9ed068663fd31358325b21585fa0c4282ee890718359dfadba7774126d (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-c0669abd-aef1-4b0d-9f97-a6adeeac3211, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251202) Dec 15 05:07:47 localhost dnsmasq[328948]: started, version 2.85 cachesize 150 Dec 15 05:07:47 localhost dnsmasq[328948]: DNS service limited to local subnets Dec 15 05:07:47 localhost dnsmasq[328948]: compile time options: IPv6 GNU-getopt DBus no-UBus 
no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile Dec 15 05:07:47 localhost dnsmasq[328948]: warning: no upstream servers configured Dec 15 05:07:47 localhost dnsmasq-dhcp[328948]: DHCPv6, static leases only on 2001:db8:0:1::, lease time 1d Dec 15 05:07:47 localhost dnsmasq[328948]: read /var/lib/neutron/dhcp/c0669abd-aef1-4b0d-9f97-a6adeeac3211/addn_hosts - 0 addresses Dec 15 05:07:47 localhost dnsmasq-dhcp[328948]: read /var/lib/neutron/dhcp/c0669abd-aef1-4b0d-9f97-a6adeeac3211/host Dec 15 05:07:47 localhost dnsmasq-dhcp[328948]: read /var/lib/neutron/dhcp/c0669abd-aef1-4b0d-9f97-a6adeeac3211/opts Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.123 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359', 'name': 'test', 'flavor': {'id': '2da0e147-aaa7-4bb9-a176-5fe1b15a32a0', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'image': {'id': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-00000002', 'OS-EXT-SRV-ATTR:host': 'np0005559462.localdomain', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': 'c785bf23f53946bc99867d8832a50266', 'user_id': '1ba5fce347b64bfebf995f187193f205', 'hostId': 'bf4d507aa00b3fcf643a7e0b883420632ac7fc96e8c7a756982e121d', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228 Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.123 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.requests in the context of pollsters Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.149 12 DEBUG ceilometer.compute.pollsters [-] 39ff1bd9-6f6b-44c8-bbec-a1fd9d196359/disk.device.write.requests volume: 47 _stats_to_sample 
/usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.149 12 DEBUG ceilometer.compute.pollsters [-] 39ff1bd9-6f6b-44c8-bbec-a1fd9d196359/disk.device.write.requests volume: 1 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.151 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'a3d665e4-4440-4eb3-ab57-4598685d93d1', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 47, 'user_id': '1ba5fce347b64bfebf995f187193f205', 'user_name': None, 'project_id': 'c785bf23f53946bc99867d8832a50266', 'project_name': None, 'resource_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359-vda', 'timestamp': '2025-12-15T10:07:48.123909', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359', 'instance_type': 'm1.small', 'host': 'bf4d507aa00b3fcf643a7e0b883420632ac7fc96e8c7a756982e121d', 'instance_host': 'np0005559462.localdomain', 'flavor': {'id': '2da0e147-aaa7-4bb9-a176-5fe1b15a32a0', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e'}, 'image_ref': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'e8740e94-d99d-11f0-817e-fa163ebaca0f', 'monotonic_time': 12334.316543824, 'message_signature': 
'5926d4e17542ae9f55fd5c3dda884808403c2e698682f389283add4560c2a2d6'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1, 'user_id': '1ba5fce347b64bfebf995f187193f205', 'user_name': None, 'project_id': 'c785bf23f53946bc99867d8832a50266', 'project_name': None, 'resource_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359-vdb', 'timestamp': '2025-12-15T10:07:48.123909', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359', 'instance_type': 'm1.small', 'host': 'bf4d507aa00b3fcf643a7e0b883420632ac7fc96e8c7a756982e121d', 'instance_host': 'np0005559462.localdomain', 'flavor': {'id': '2da0e147-aaa7-4bb9-a176-5fe1b15a32a0', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e'}, 'image_ref': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'e8741862-d99d-11f0-817e-fa163ebaca0f', 'monotonic_time': 12334.316543824, 'message_signature': '3a57ed2739c835e2fe466791b18398121415dc5b5b1ee42fbaf2e66511792a29'}]}, 'timestamp': '2025-12-15 10:07:48.150184', '_unique_id': '85a07eda06d34861bd16d87fdc462574'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.151 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.151 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 
10:07:48.151 12 ERROR oslo_messaging.notify.messaging yield Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.151 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.151 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.151 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.151 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.151 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.151 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.151 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.151 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.151 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.151 12 ERROR oslo_messaging.notify.messaging conn.connect() Dec 15 05:07:48 localhost 
ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.151 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.151 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.151 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.151 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.151 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.151 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.151 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.151 12 ERROR oslo_messaging.notify.messaging Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.151 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.151 12 ERROR oslo_messaging.notify.messaging Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.151 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.151 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.151 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.151 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.151 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.151 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.151 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.151 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.151 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.151 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.151 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 
10:07:48.151 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.151 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.151 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.151 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.151 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.151 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.151 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.151 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.151 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.151 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.151 12 ERROR 
oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.151 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.151 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.151 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.151 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.151 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.151 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.151 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.151 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.151 12 ERROR oslo_messaging.notify.messaging Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.151 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets in the context of pollsters Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.156 12 
DEBUG ceilometer.compute.pollsters [-] 39ff1bd9-6f6b-44c8-bbec-a1fd9d196359/network.outgoing.packets volume: 114 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.157 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '6c012856-e2ea-4198-a16e-355b83e89564', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 114, 'user_id': '1ba5fce347b64bfebf995f187193f205', 'user_name': None, 'project_id': 'c785bf23f53946bc99867d8832a50266', 'project_name': None, 'resource_id': 'instance-00000002-39ff1bd9-6f6b-44c8-bbec-a1fd9d196359-tap03ef8889-32', 'timestamp': '2025-12-15T10:07:48.151685', 'resource_metadata': {'display_name': 'test', 'name': 'tap03ef8889-32', 'instance_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359', 'instance_type': 'm1.small', 'host': 'bf4d507aa00b3fcf643a7e0b883420632ac7fc96e8c7a756982e121d', 'instance_host': 'np0005559462.localdomain', 'flavor': {'id': '2da0e147-aaa7-4bb9-a176-5fe1b15a32a0', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e'}, 'image_ref': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:74:df:7c', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap03ef8889-32'}, 'message_id': 'e87527fc-d99d-11f0-817e-fa163ebaca0f', 'monotonic_time': 12334.34432073, 'message_signature': 
'c3385f260fb6f66aac07be1d4849aaa461903431d5e43f0a2ef508ffff7df4dc'}]}, 'timestamp': '2025-12-15 10:07:48.157154', '_unique_id': '8cbfeacf27174fcc9c5d196f3370114c'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.157 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.157 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.157 12 ERROR oslo_messaging.notify.messaging yield Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.157 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.157 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.157 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.157 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.157 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.157 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.157 12 ERROR 
oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.157 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.157 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.157 12 ERROR oslo_messaging.notify.messaging conn.connect() Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.157 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.157 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.157 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.157 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.157 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.157 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.157 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Dec 15 05:07:48 localhost 
ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.157 12 ERROR oslo_messaging.notify.messaging Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.157 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.157 12 ERROR oslo_messaging.notify.messaging Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.157 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.157 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.157 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.157 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.157 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.157 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.157 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.157 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", 
line 653, in _send Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.157 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.157 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.157 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.157 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.157 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.157 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.157 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.157 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.157 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.157 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.157 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.157 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.157 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.157 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.157 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.157 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.157 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.157 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.157 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.157 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Dec 15 05:07:48 localhost 
ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.157 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.157 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.157 12 ERROR oslo_messaging.notify.messaging Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.158 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets in the context of pollsters Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.158 12 DEBUG ceilometer.compute.pollsters [-] 39ff1bd9-6f6b-44c8-bbec-a1fd9d196359/network.incoming.packets volume: 60 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.158 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': 'd40461ab-2a8f-4459-8844-9f8599b5115e', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 60, 'user_id': '1ba5fce347b64bfebf995f187193f205', 'user_name': None, 'project_id': 'c785bf23f53946bc99867d8832a50266', 'project_name': None, 'resource_id': 'instance-00000002-39ff1bd9-6f6b-44c8-bbec-a1fd9d196359-tap03ef8889-32', 'timestamp': '2025-12-15T10:07:48.158150', 'resource_metadata': {'display_name': 'test', 'name': 'tap03ef8889-32', 'instance_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359', 'instance_type': 'm1.small', 'host': 'bf4d507aa00b3fcf643a7e0b883420632ac7fc96e8c7a756982e121d', 'instance_host': 'np0005559462.localdomain', 'flavor': {'id': '2da0e147-aaa7-4bb9-a176-5fe1b15a32a0', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e'}, 'image_ref': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:74:df:7c', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap03ef8889-32'}, 'message_id': 'e87557ea-d99d-11f0-817e-fa163ebaca0f', 'monotonic_time': 12334.34432073, 'message_signature': 'b1750c5cca1a470ed49d7e9b4771dbc14ee0b8b32ceb451f4294a0f3d6a88614'}]}, 'timestamp': '2025-12-15 10:07:48.158365', '_unique_id': 'd3a81b8fb736448da953797e895600e8'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.158 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 15 05:07:48 localhost 
ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.158 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.158 12 ERROR oslo_messaging.notify.messaging yield Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.158 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.158 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.158 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.158 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.158 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.158 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.158 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.158 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.158 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.158 12 ERROR oslo_messaging.notify.messaging conn.connect() Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.158 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.158 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.158 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.158 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.158 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.158 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.158 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.158 12 ERROR oslo_messaging.notify.messaging Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.158 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.158 12 ERROR oslo_messaging.notify.messaging Dec 15 05:07:48 localhost 
ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.158 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.158 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.158 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.158 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.158 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.158 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.158 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.158 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.158 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.158 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection 
Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.158 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.158 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.158 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.158 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.158 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.158 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.158 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.158 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.158 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.158 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Dec 15 05:07:48 
localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.158 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.158 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.158 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.158 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.158 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.158 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.158 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.158 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.158 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.158 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.158 12 ERROR oslo_messaging.notify.messaging Dec 15 05:07:48 localhost 
ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.159 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.requests in the context of pollsters Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.159 12 DEBUG ceilometer.compute.pollsters [-] 39ff1bd9-6f6b-44c8-bbec-a1fd9d196359/disk.device.read.requests volume: 1272 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.159 12 DEBUG ceilometer.compute.pollsters [-] 39ff1bd9-6f6b-44c8-bbec-a1fd9d196359/disk.device.read.requests volume: 124 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.160 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'efda6094-adb2-4ccb-8b34-a55f29413ee6', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1272, 'user_id': '1ba5fce347b64bfebf995f187193f205', 'user_name': None, 'project_id': 'c785bf23f53946bc99867d8832a50266', 'project_name': None, 'resource_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359-vda', 'timestamp': '2025-12-15T10:07:48.159457', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359', 'instance_type': 'm1.small', 'host': 'bf4d507aa00b3fcf643a7e0b883420632ac7fc96e8c7a756982e121d', 'instance_host': 'np0005559462.localdomain', 'flavor': {'id': '2da0e147-aaa7-4bb9-a176-5fe1b15a32a0', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 
'7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e'}, 'image_ref': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'e8758ac6-d99d-11f0-817e-fa163ebaca0f', 'monotonic_time': 12334.316543824, 'message_signature': '24f94e5e8d340061bf15b658fd59c72d9ae3557060852379c82631876501a09a'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 124, 'user_id': '1ba5fce347b64bfebf995f187193f205', 'user_name': None, 'project_id': 'c785bf23f53946bc99867d8832a50266', 'project_name': None, 'resource_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359-vdb', 'timestamp': '2025-12-15T10:07:48.159457', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359', 'instance_type': 'm1.small', 'host': 'bf4d507aa00b3fcf643a7e0b883420632ac7fc96e8c7a756982e121d', 'instance_host': 'np0005559462.localdomain', 'flavor': {'id': '2da0e147-aaa7-4bb9-a176-5fe1b15a32a0', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e'}, 'image_ref': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'e87591f6-d99d-11f0-817e-fa163ebaca0f', 'monotonic_time': 12334.316543824, 'message_signature': 'e9638e720a616040a28f900daff36c6ace5d19528f6077f1b2e311766d0ae4ea'}]}, 'timestamp': '2025-12-15 10:07:48.159836', '_unique_id': '960b912a498d43519291fc990f27fda6'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.160 12 
ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.160 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.160 12 ERROR oslo_messaging.notify.messaging yield Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.160 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.160 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.160 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.160 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.160 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.160 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.160 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.160 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Dec 15 05:07:48 localhost 
ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.160 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.160 12 ERROR oslo_messaging.notify.messaging conn.connect() Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.160 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.160 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.160 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.160 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.160 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.160 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.160 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.160 12 ERROR oslo_messaging.notify.messaging Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.160 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 
2025-12-15 10:07:48.160 12 ERROR oslo_messaging.notify.messaging Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.160 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.160 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.160 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.160 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.160 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.160 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.160 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.160 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.160 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.160 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.160 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.160 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.160 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.160 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.160 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.160 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.160 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.160 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.160 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.160 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.160 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.160 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.160 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.160 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.160 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.160 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.160 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.160 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.160 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.160 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 15 05:07:48 localhost 
ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.160 12 ERROR oslo_messaging.notify.messaging Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.160 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.delta in the context of pollsters Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.160 12 DEBUG ceilometer.compute.pollsters [-] 39ff1bd9-6f6b-44c8-bbec-a1fd9d196359/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.161 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'd97def88-7190-4297-b7a3-034b2fdb286f', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '1ba5fce347b64bfebf995f187193f205', 'user_name': None, 'project_id': 'c785bf23f53946bc99867d8832a50266', 'project_name': None, 'resource_id': 'instance-00000002-39ff1bd9-6f6b-44c8-bbec-a1fd9d196359-tap03ef8889-32', 'timestamp': '2025-12-15T10:07:48.160804', 'resource_metadata': {'display_name': 'test', 'name': 'tap03ef8889-32', 'instance_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359', 'instance_type': 'm1.small', 'host': 'bf4d507aa00b3fcf643a7e0b883420632ac7fc96e8c7a756982e121d', 'instance_host': 'np0005559462.localdomain', 'flavor': {'id': '2da0e147-aaa7-4bb9-a176-5fe1b15a32a0', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e'}, 'image_ref': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 
'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:74:df:7c', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap03ef8889-32'}, 'message_id': 'e875bf6e-d99d-11f0-817e-fa163ebaca0f', 'monotonic_time': 12334.34432073, 'message_signature': 'bbe4fe3b9708428139e209fc71425a1e00a78f7556d5bf2137d8e29ea2c959e8'}]}, 'timestamp': '2025-12-15 10:07:48.161047', '_unique_id': 'e4b542bd4dce400c826e5867ff1ae843'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.161 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.161 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.161 12 ERROR oslo_messaging.notify.messaging yield Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.161 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.161 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.161 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.161 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.161 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 
877, in _connection_factory Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.161 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.161 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.161 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.161 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.161 12 ERROR oslo_messaging.notify.messaging conn.connect() Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.161 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.161 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.161 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.161 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.161 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.161 12 
ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.161 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.161 12 ERROR oslo_messaging.notify.messaging Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.161 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.161 12 ERROR oslo_messaging.notify.messaging Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.161 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.161 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.161 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.161 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.161 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.161 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.161 12 ERROR 
oslo_messaging.notify.messaging return self._send(target, ctxt, message, Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.161 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.161 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.161 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.161 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.161 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.161 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.161 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.161 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.161 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.161 12 ERROR 
oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.161 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.161 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.161 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.161 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.161 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.161 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.161 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.161 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.161 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.161 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Dec 15 05:07:48 
localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.161 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.161 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.161 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.161 12 ERROR oslo_messaging.notify.messaging Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.161 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no new resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193 Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.162 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.drop in the context of pollsters Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.162 12 DEBUG ceilometer.compute.pollsters [-] 39ff1bd9-6f6b-44c8-bbec-a1fd9d196359/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.162 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': 'bc189271-60d8-4b6e-96ab-cd310c842088', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '1ba5fce347b64bfebf995f187193f205', 'user_name': None, 'project_id': 'c785bf23f53946bc99867d8832a50266', 'project_name': None, 'resource_id': 'instance-00000002-39ff1bd9-6f6b-44c8-bbec-a1fd9d196359-tap03ef8889-32', 'timestamp': '2025-12-15T10:07:48.162067', 'resource_metadata': {'display_name': 'test', 'name': 'tap03ef8889-32', 'instance_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359', 'instance_type': 'm1.small', 'host': 'bf4d507aa00b3fcf643a7e0b883420632ac7fc96e8c7a756982e121d', 'instance_host': 'np0005559462.localdomain', 'flavor': {'id': '2da0e147-aaa7-4bb9-a176-5fe1b15a32a0', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e'}, 'image_ref': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:74:df:7c', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap03ef8889-32'}, 'message_id': 'e875f0ba-d99d-11f0-817e-fa163ebaca0f', 'monotonic_time': 12334.34432073, 'message_signature': '675daafe962ccd5776cfa67de7f4d4cd5f1e26719d5fd89627ea4d6277009df9'}]}, 'timestamp': '2025-12-15 10:07:48.162276', '_unique_id': '2c05834b92a6431eb29562f7c8c70a97'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.162 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 15 05:07:48 localhost 
ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.162 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.162 12 ERROR oslo_messaging.notify.messaging yield Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.162 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.162 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.162 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.162 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.162 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.162 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.162 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.162 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.162 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.162 12 ERROR oslo_messaging.notify.messaging conn.connect() Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.162 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.162 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.162 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.162 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.162 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.162 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.162 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.162 12 ERROR oslo_messaging.notify.messaging Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.162 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.162 12 ERROR oslo_messaging.notify.messaging Dec 15 05:07:48 localhost 
ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.162 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.162 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.162 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.162 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.162 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.162 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.162 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.162 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.162 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.162 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection 
Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.162 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.162 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.162 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.162 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.162 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.162 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.162 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.162 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.162 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.162 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Dec 15 05:07:48 
localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.162 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.162 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.162 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.162 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.162 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.162 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.162 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.162 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.162 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.162 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.162 12 ERROR oslo_messaging.notify.messaging Dec 15 05:07:48 localhost 
ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.163 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.allocation in the context of pollsters Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.171 12 DEBUG ceilometer.compute.pollsters [-] 39ff1bd9-6f6b-44c8-bbec-a1fd9d196359/disk.device.allocation volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.171 12 DEBUG ceilometer.compute.pollsters [-] 39ff1bd9-6f6b-44c8-bbec-a1fd9d196359/disk.device.allocation volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.172 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '3b029d66-493a-45b3-83a5-a32d54b07bab', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '1ba5fce347b64bfebf995f187193f205', 'user_name': None, 'project_id': 'c785bf23f53946bc99867d8832a50266', 'project_name': None, 'resource_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359-vda', 'timestamp': '2025-12-15T10:07:48.163222', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359', 'instance_type': 'm1.small', 'host': 'bf4d507aa00b3fcf643a7e0b883420632ac7fc96e8c7a756982e121d', 'instance_host': 'np0005559462.localdomain', 'flavor': {'id': '2da0e147-aaa7-4bb9-a176-5fe1b15a32a0', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 
'7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e'}, 'image_ref': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'e8775874-d99d-11f0-817e-fa163ebaca0f', 'monotonic_time': 12334.355854043, 'message_signature': 'ba1fa9d1b75745a8599a1c7551223e04459d722bc40ec869328953c8b00ba2b2'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '1ba5fce347b64bfebf995f187193f205', 'user_name': None, 'project_id': 'c785bf23f53946bc99867d8832a50266', 'project_name': None, 'resource_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359-vdb', 'timestamp': '2025-12-15T10:07:48.163222', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359', 'instance_type': 'm1.small', 'host': 'bf4d507aa00b3fcf643a7e0b883420632ac7fc96e8c7a756982e121d', 'instance_host': 'np0005559462.localdomain', 'flavor': {'id': '2da0e147-aaa7-4bb9-a176-5fe1b15a32a0', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e'}, 'image_ref': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'e8775ffe-d99d-11f0-817e-fa163ebaca0f', 'monotonic_time': 12334.355854043, 'message_signature': '821e22149dac2c58d071ef4a10d5afa872b78329e631982a3dc30a5da6a4482e'}]}, 'timestamp': '2025-12-15 10:07:48.171663', '_unique_id': '7a435243150345fa950278fc308fcd7f'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.172 12 ERROR 
oslo_messaging.notify.messaging Traceback (most recent call last): Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.172 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.172 12 ERROR oslo_messaging.notify.messaging yield Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.172 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.172 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.172 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.172 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.172 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.172 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.172 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.172 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Dec 15 05:07:48 localhost 
ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.172 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.172 12 ERROR oslo_messaging.notify.messaging conn.connect() Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.172 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.172 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.172 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.172 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.172 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.172 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.172 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.172 12 ERROR oslo_messaging.notify.messaging Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.172 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 
2025-12-15 10:07:48.172 12 ERROR oslo_messaging.notify.messaging Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.172 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.172 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.172 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.172 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.172 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.172 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.172 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.172 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.172 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.172 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.172 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.172 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.172 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.172 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.172 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.172 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.172 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.172 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.172 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.172 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.172 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.172 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.172 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.172 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.172 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.172 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.172 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.172 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.172 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.172 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 15 05:07:48 localhost 
ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.172 12 ERROR oslo_messaging.notify.messaging Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.172 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.bytes in the context of pollsters Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.172 12 DEBUG ceilometer.compute.pollsters [-] 39ff1bd9-6f6b-44c8-bbec-a1fd9d196359/disk.device.write.bytes volume: 389120 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.172 12 DEBUG ceilometer.compute.pollsters [-] 39ff1bd9-6f6b-44c8-bbec-a1fd9d196359/disk.device.write.bytes volume: 512 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.173 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': 'a5b4f0c1-a5eb-4103-85d8-32c424c8a2e7', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 389120, 'user_id': '1ba5fce347b64bfebf995f187193f205', 'user_name': None, 'project_id': 'c785bf23f53946bc99867d8832a50266', 'project_name': None, 'resource_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359-vda', 'timestamp': '2025-12-15T10:07:48.172667', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359', 'instance_type': 'm1.small', 'host': 'bf4d507aa00b3fcf643a7e0b883420632ac7fc96e8c7a756982e121d', 'instance_host': 'np0005559462.localdomain', 'flavor': {'id': '2da0e147-aaa7-4bb9-a176-5fe1b15a32a0', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e'}, 'image_ref': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'e8778ed4-d99d-11f0-817e-fa163ebaca0f', 'monotonic_time': 12334.316543824, 'message_signature': '0c9ef9a9130b2e1750ddf09550a187f6e11eec60ab750321b98324ce7d52b839'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 512, 'user_id': '1ba5fce347b64bfebf995f187193f205', 'user_name': None, 'project_id': 'c785bf23f53946bc99867d8832a50266', 'project_name': None, 'resource_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359-vdb', 'timestamp': '2025-12-15T10:07:48.172667', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 
'39ff1bd9-6f6b-44c8-bbec-a1fd9d196359', 'instance_type': 'm1.small', 'host': 'bf4d507aa00b3fcf643a7e0b883420632ac7fc96e8c7a756982e121d', 'instance_host': 'np0005559462.localdomain', 'flavor': {'id': '2da0e147-aaa7-4bb9-a176-5fe1b15a32a0', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e'}, 'image_ref': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'e87795f0-d99d-11f0-817e-fa163ebaca0f', 'monotonic_time': 12334.316543824, 'message_signature': 'f916756ae58d82ce641e4e6560699963a7d97ab7a288a76713d3b180c77da2b4'}]}, 'timestamp': '2025-12-15 10:07:48.173075', '_unique_id': '26e0179c69124fae858019bac98fe80b'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.173 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.173 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.173 12 ERROR oslo_messaging.notify.messaging yield Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.173 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.173 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.173 12 ERROR 
oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.173 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.173 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.173 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.173 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.173 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.173 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.173 12 ERROR oslo_messaging.notify.messaging conn.connect() Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.173 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.173 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.173 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Dec 15 
05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.173 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.173 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.173 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.173 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.173 12 ERROR oslo_messaging.notify.messaging Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.173 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.173 12 ERROR oslo_messaging.notify.messaging Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.173 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.173 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.173 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.173 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Dec 15 05:07:48 localhost 
ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.173 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.173 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.173 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.173 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.173 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.173 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.173 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.173 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.173 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.173 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, 
in get Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.173 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.173 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.173 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.173 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.173 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.173 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.173 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.173 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.173 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.173 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 15 05:07:48 localhost 
ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.173 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.173 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.173 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.173 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.173 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.173 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.173 12 ERROR oslo_messaging.notify.messaging Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.173 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.error in the context of pollsters Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.174 12 DEBUG ceilometer.compute.pollsters [-] 39ff1bd9-6f6b-44c8-bbec-a1fd9d196359/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.174 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': '449fcaf2-5327-4d41-ab0e-4aa68df982f2', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '1ba5fce347b64bfebf995f187193f205', 'user_name': None, 'project_id': 'c785bf23f53946bc99867d8832a50266', 'project_name': None, 'resource_id': 'instance-00000002-39ff1bd9-6f6b-44c8-bbec-a1fd9d196359-tap03ef8889-32', 'timestamp': '2025-12-15T10:07:48.174036', 'resource_metadata': {'display_name': 'test', 'name': 'tap03ef8889-32', 'instance_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359', 'instance_type': 'm1.small', 'host': 'bf4d507aa00b3fcf643a7e0b883420632ac7fc96e8c7a756982e121d', 'instance_host': 'np0005559462.localdomain', 'flavor': {'id': '2da0e147-aaa7-4bb9-a176-5fe1b15a32a0', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e'}, 'image_ref': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:74:df:7c', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap03ef8889-32'}, 'message_id': 'e877c4d0-d99d-11f0-817e-fa163ebaca0f', 'monotonic_time': 12334.34432073, 'message_signature': '630e144c72940bb8f92ca6104c75354288cbe1b114a711a84d7a941991fbfdbb'}]}, 'timestamp': '2025-12-15 10:07:48.174256', '_unique_id': '1f5a46a4fa3943858e37b854c3981a15'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.174 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 15 05:07:48 localhost 
ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.174 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.174 12 ERROR oslo_messaging.notify.messaging yield Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.174 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.174 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.174 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.174 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.174 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.174 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.174 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.174 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.174 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.174 12 ERROR oslo_messaging.notify.messaging conn.connect() Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.174 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.174 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.174 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.174 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.174 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.174 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.174 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.174 12 ERROR oslo_messaging.notify.messaging Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.174 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.174 12 ERROR oslo_messaging.notify.messaging Dec 15 05:07:48 localhost 
ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.174 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.174 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.174 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.174 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.174 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.174 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.174 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.174 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.174 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.174 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.174 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.174 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.174 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.174 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.174 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.174 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.174 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.174 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.174 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.174 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.174 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.174 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.174 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.174 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.174 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.174 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.174 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.174 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.174 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.174 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.174 12 ERROR oslo_messaging.notify.messaging
Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.175 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.usage in the context of pollsters
Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.175 12 DEBUG ceilometer.compute.pollsters [-] 39ff1bd9-6f6b-44c8-bbec-a1fd9d196359/disk.device.usage volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.175 12 DEBUG ceilometer.compute.pollsters [-] 39ff1bd9-6f6b-44c8-bbec-a1fd9d196359/disk.device.usage volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.175 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '2da1ed6d-47fc-4d05-a3fa-92ed0446b7ce', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '1ba5fce347b64bfebf995f187193f205', 'user_name': None, 'project_id': 'c785bf23f53946bc99867d8832a50266', 'project_name': None, 'resource_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359-vda', 'timestamp': '2025-12-15T10:07:48.175199', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359', 'instance_type': 'm1.small', 'host': 'bf4d507aa00b3fcf643a7e0b883420632ac7fc96e8c7a756982e121d', 'instance_host': 'np0005559462.localdomain', 'flavor': {'id': '2da0e147-aaa7-4bb9-a176-5fe1b15a32a0', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e'}, 'image_ref': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'e877f19e-d99d-11f0-817e-fa163ebaca0f', 'monotonic_time': 12334.355854043, 'message_signature': '00538c9243dc517c9b92be7ae1b3efb2a32572b82c0accc685f6d6932aa4fd6d'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '1ba5fce347b64bfebf995f187193f205', 'user_name': None, 'project_id': 'c785bf23f53946bc99867d8832a50266', 'project_name': None, 'resource_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359-vdb', 'timestamp': '2025-12-15T10:07:48.175199', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359', 'instance_type': 'm1.small', 'host': 'bf4d507aa00b3fcf643a7e0b883420632ac7fc96e8c7a756982e121d', 'instance_host': 'np0005559462.localdomain', 'flavor': {'id': '2da0e147-aaa7-4bb9-a176-5fe1b15a32a0', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e'}, 'image_ref': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'e877f89c-d99d-11f0-817e-fa163ebaca0f', 'monotonic_time': 12334.355854043, 'message_signature': '006d245d5b55c1c2d256fdd3501129c57234f10230b37d767baf9ec3d53691de'}]}, 'timestamp': '2025-12-15 10:07:48.175568', '_unique_id': 'e01710fb7a1543ea8562c72112c623b1'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.175 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.175 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.175 12 ERROR oslo_messaging.notify.messaging yield
Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.175 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.175 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.175 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.175 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.175 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.175 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.175 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.175 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.175 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.175 12 ERROR oslo_messaging.notify.messaging conn.connect()
Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.175 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.175 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.175 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.175 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.175 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.175 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.175 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.175 12 ERROR oslo_messaging.notify.messaging
Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.175 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.175 12 ERROR oslo_messaging.notify.messaging
Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.175 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.175 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.175 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.175 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.175 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.175 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.175 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.175 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.175 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.175 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.175 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.175 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.175 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.175 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.175 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.175 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.175 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.175 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.175 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.175 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.175 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.175 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.175 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.175 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.175 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.175 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.175 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.175 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.175 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.175 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.175 12 ERROR oslo_messaging.notify.messaging
Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.176 12 INFO ceilometer.polling.manager [-] Polling pollster cpu in the context of pollsters
Dec 15 05:07:48 localhost neutron_dhcp_agent[267542]: 2025-12-15 10:07:48.185 267546 INFO neutron.agent.dhcp.agent [None req-93c5b90a-cbcd-4528-b780-38b9a91e1b5b - - - - - -] DHCP configuration for ports {'79503367-f53f-4b35-8760-76fcaa4d8407', 'c645485a-086d-48ea-a19e-1922e8d2dbd9'} is completed#033[00m
Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.191 12 DEBUG ceilometer.compute.pollsters [-] 39ff1bd9-6f6b-44c8-bbec-a1fd9d196359/cpu volume: 15860000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.193 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '29483d19-7a18-4f44-9009-1e42595d51c0', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 15860000000, 'user_id': '1ba5fce347b64bfebf995f187193f205', 'user_name': None, 'project_id': 'c785bf23f53946bc99867d8832a50266', 'project_name': None, 'resource_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359', 'timestamp': '2025-12-15T10:07:48.176664', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359', 'instance_type': 'm1.small', 'host': 'bf4d507aa00b3fcf643a7e0b883420632ac7fc96e8c7a756982e121d', 'instance_host': 'np0005559462.localdomain', 'flavor': {'id': '2da0e147-aaa7-4bb9-a176-5fe1b15a32a0', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e'}, 'image_ref': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'cpu_number': 1}, 'message_id': 'e87a7edc-d99d-11f0-817e-fa163ebaca0f', 'monotonic_time': 12334.384308888, 'message_signature': 'ad927d9f314f5af25806ac97103759b20ee63e5daca60f246268cc5c0db2ed26'}]}, 'timestamp': '2025-12-15 10:07:48.192246', '_unique_id': 'ca554387e17643ccb483130046b65cde'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.193 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.193 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.193 12 ERROR oslo_messaging.notify.messaging yield
Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.193 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.193 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.193 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.193 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.193 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.193 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.193 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.193 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.193 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.193 12 ERROR oslo_messaging.notify.messaging conn.connect()
Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.193 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.193 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.193 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.193 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.193 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.193 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.193 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.193 12 ERROR oslo_messaging.notify.messaging
Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.193 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.193 12 ERROR oslo_messaging.notify.messaging
Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.193 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.193 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.193 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.193 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.193 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.193 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.193 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.193 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.193 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.193 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.193 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.193 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.193 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.193 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.193 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.193 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.193 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.193 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.193 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.193 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.193 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.193 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.193 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.193 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.193 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.193 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.193 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.193 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.193 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.193 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.193 12 ERROR oslo_messaging.notify.messaging
Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.194 12 INFO ceilometer.polling.manager [-] Polling pollster memory.usage in the context of pollsters
Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.194 12 DEBUG ceilometer.compute.pollsters [-] 39ff1bd9-6f6b-44c8-bbec-a1fd9d196359/memory.usage volume: 51.73828125 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.195 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '56e60229-81de-4e23-a0a8-b28fd3f7fef7', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'memory.usage', 'counter_type': 'gauge', 'counter_unit': 'MB', 'counter_volume': 51.73828125, 'user_id': '1ba5fce347b64bfebf995f187193f205', 'user_name': None, 'project_id': 'c785bf23f53946bc99867d8832a50266', 'project_name': None, 'resource_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359', 'timestamp': '2025-12-15T10:07:48.194406', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359', 'instance_type': 'm1.small', 'host': 'bf4d507aa00b3fcf643a7e0b883420632ac7fc96e8c7a756982e121d', 'instance_host': 'np0005559462.localdomain', 'flavor': {'id': '2da0e147-aaa7-4bb9-a176-5fe1b15a32a0', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e'}, 'image_ref': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0}, 'message_id': 'e87ae9d0-d99d-11f0-817e-fa163ebaca0f', 'monotonic_time': 12334.384308888, 'message_signature': 'a853899cddafc6bc3a525d3fcef3b4ec531e27529d4cc3f44b7a506ac468583b'}]}, 'timestamp': '2025-12-15 10:07:48.195008', '_unique_id': '1b1b8a75c6d24f058871ca81109c83c1'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.195 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.195 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.195 12 ERROR oslo_messaging.notify.messaging yield
Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.195 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.195 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.195 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.195 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.195 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.195 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.195 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.195 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.195 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.195 12 ERROR oslo_messaging.notify.messaging conn.connect()
Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.195 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.195 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.195 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.195 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.195 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.195 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.195 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.195 12 ERROR oslo_messaging.notify.messaging
Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.195 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.195 12 ERROR oslo_messaging.notify.messaging
Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.195 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.195 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.195 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.195 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.195 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.195 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.195 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.195 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.195 12 ERROR oslo_messaging.notify.messaging with
self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.195 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.195 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.195 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.195 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.195 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.195 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.195 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.195 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.195 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.195 12 ERROR 
oslo_messaging.notify.messaging self.ensure_connection() Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.195 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.195 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.195 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.195 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.195 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.195 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.195 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.195 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.195 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.195 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Dec 15 05:07:48 localhost 
ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.195 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.195 12 ERROR oslo_messaging.notify.messaging Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.197 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.latency in the context of pollsters Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.197 12 DEBUG ceilometer.compute.pollsters [-] 39ff1bd9-6f6b-44c8-bbec-a1fd9d196359/disk.device.write.latency volume: 1243487016 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.197 12 DEBUG ceilometer.compute.pollsters [-] 39ff1bd9-6f6b-44c8-bbec-a1fd9d196359/disk.device.write.latency volume: 24779175 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.198 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': '44eb1b2e-7782-4c65-a563-86be50b8885e', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 1243487016, 'user_id': '1ba5fce347b64bfebf995f187193f205', 'user_name': None, 'project_id': 'c785bf23f53946bc99867d8832a50266', 'project_name': None, 'resource_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359-vda', 'timestamp': '2025-12-15T10:07:48.197173', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359', 'instance_type': 'm1.small', 'host': 'bf4d507aa00b3fcf643a7e0b883420632ac7fc96e8c7a756982e121d', 'instance_host': 'np0005559462.localdomain', 'flavor': {'id': '2da0e147-aaa7-4bb9-a176-5fe1b15a32a0', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e'}, 'image_ref': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'e87b50c8-d99d-11f0-817e-fa163ebaca0f', 'monotonic_time': 12334.316543824, 'message_signature': 'fab5fd612cea625697d8c08a45872b5ea4399dbdfc8287f8f2ae81e92c84236e'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 24779175, 'user_id': '1ba5fce347b64bfebf995f187193f205', 'user_name': None, 'project_id': 'c785bf23f53946bc99867d8832a50266', 'project_name': None, 'resource_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359-vdb', 'timestamp': '2025-12-15T10:07:48.197173', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 
'39ff1bd9-6f6b-44c8-bbec-a1fd9d196359', 'instance_type': 'm1.small', 'host': 'bf4d507aa00b3fcf643a7e0b883420632ac7fc96e8c7a756982e121d', 'instance_host': 'np0005559462.localdomain', 'flavor': {'id': '2da0e147-aaa7-4bb9-a176-5fe1b15a32a0', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e'}, 'image_ref': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'e87b60f4-d99d-11f0-817e-fa163ebaca0f', 'monotonic_time': 12334.316543824, 'message_signature': 'da2f0645944090f3fd622250be79ff0daea84f7974c338c1861112f4bfa04c20'}]}, 'timestamp': '2025-12-15 10:07:48.198044', '_unique_id': '3268bf0351224afd932bb827919293ef'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.198 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.198 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.198 12 ERROR oslo_messaging.notify.messaging yield Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.198 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.198 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.198 12 ERROR 
oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.198 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.198 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.198 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.198 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.198 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.198 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.198 12 ERROR oslo_messaging.notify.messaging conn.connect() Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.198 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.198 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.198 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Dec 15 
05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.198 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.198 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.198 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.198 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.198 12 ERROR oslo_messaging.notify.messaging Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.198 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.198 12 ERROR oslo_messaging.notify.messaging Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.198 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.198 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.198 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.198 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Dec 15 05:07:48 localhost 
ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.198 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.198 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.198 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.198 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.198 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.198 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.198 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.198 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.198 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.198 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, 
in get Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.198 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.198 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.198 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.198 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.198 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.198 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.198 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.198 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.198 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.198 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 15 05:07:48 localhost 
ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.198 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.198 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.198 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.198 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.198 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.198 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.198 12 ERROR oslo_messaging.notify.messaging Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.200 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.delta in the context of pollsters Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.200 12 DEBUG ceilometer.compute.pollsters [-] 39ff1bd9-6f6b-44c8-bbec-a1fd9d196359/network.incoming.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.201 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': '47d6b273-37eb-4557-8402-fc1ef699ea19', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '1ba5fce347b64bfebf995f187193f205', 'user_name': None, 'project_id': 'c785bf23f53946bc99867d8832a50266', 'project_name': None, 'resource_id': 'instance-00000002-39ff1bd9-6f6b-44c8-bbec-a1fd9d196359-tap03ef8889-32', 'timestamp': '2025-12-15T10:07:48.200196', 'resource_metadata': {'display_name': 'test', 'name': 'tap03ef8889-32', 'instance_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359', 'instance_type': 'm1.small', 'host': 'bf4d507aa00b3fcf643a7e0b883420632ac7fc96e8c7a756982e121d', 'instance_host': 'np0005559462.localdomain', 'flavor': {'id': '2da0e147-aaa7-4bb9-a176-5fe1b15a32a0', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e'}, 'image_ref': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:74:df:7c', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap03ef8889-32'}, 'message_id': 'e87bc738-d99d-11f0-817e-fa163ebaca0f', 'monotonic_time': 12334.34432073, 'message_signature': '30aa3be4ec9255b4be7913123d03fadeb9dcafc1887ea953296012378f1789c8'}]}, 'timestamp': '2025-12-15 10:07:48.200661', '_unique_id': 'c420197850f6445e9278bf02ac20c1e6'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.201 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 
2025-12-15 10:07:48.201 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.201 12 ERROR oslo_messaging.notify.messaging yield Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.201 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.201 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.201 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.201 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.201 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.201 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.201 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.201 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.201 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.201 12 ERROR oslo_messaging.notify.messaging conn.connect() Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.201 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.201 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.201 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.201 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.201 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.201 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.201 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.201 12 ERROR oslo_messaging.notify.messaging Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.201 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.201 12 ERROR oslo_messaging.notify.messaging Dec 15 05:07:48 localhost 
ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.201 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.201 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.201 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.201 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.201 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.201 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.201 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.201 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.201 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.201 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.201 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.201 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.201 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.201 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.201 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.201 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.201 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.201 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.201 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.201 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.201 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.201 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.201 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.201 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.201 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.201 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.201 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.201 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.201 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.201 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.201 12 ERROR oslo_messaging.notify.messaging
Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.202 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.latency in the context of pollsters
Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.202 12 DEBUG ceilometer.compute.pollsters [-] 39ff1bd9-6f6b-44c8-bbec-a1fd9d196359/disk.device.read.latency volume: 1342134926 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.202 12 DEBUG ceilometer.compute.pollsters [-] 39ff1bd9-6f6b-44c8-bbec-a1fd9d196359/disk.device.read.latency volume: 123356132 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.203 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '29ad27fa-04f5-491c-9775-5c3988c8b766', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 1342134926, 'user_id': '1ba5fce347b64bfebf995f187193f205', 'user_name': None, 'project_id': 'c785bf23f53946bc99867d8832a50266', 'project_name': None, 'resource_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359-vda', 'timestamp': '2025-12-15T10:07:48.202406', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359', 'instance_type': 'm1.small', 'host': 'bf4d507aa00b3fcf643a7e0b883420632ac7fc96e8c7a756982e121d', 'instance_host': 'np0005559462.localdomain', 'flavor': {'id': '2da0e147-aaa7-4bb9-a176-5fe1b15a32a0', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e'}, 'image_ref': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'e87c19cc-d99d-11f0-817e-fa163ebaca0f', 'monotonic_time': 12334.316543824, 'message_signature': 'ed45702ee0c2a25791fcc3208198d1717f268f1048d50bb41e75a561b56d32e4'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 123356132, 'user_id': '1ba5fce347b64bfebf995f187193f205', 'user_name': None, 'project_id': 'c785bf23f53946bc99867d8832a50266', 'project_name': None, 'resource_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359-vdb', 'timestamp': '2025-12-15T10:07:48.202406', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359', 'instance_type': 'm1.small', 'host': 'bf4d507aa00b3fcf643a7e0b883420632ac7fc96e8c7a756982e121d', 'instance_host': 'np0005559462.localdomain', 'flavor': {'id': '2da0e147-aaa7-4bb9-a176-5fe1b15a32a0', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e'}, 'image_ref': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'e87c237c-d99d-11f0-817e-fa163ebaca0f', 'monotonic_time': 12334.316543824, 'message_signature': '875c9a26f2ee5663a5600bc431cac443a432ce5e7441986d1a26c11eb558740e'}]}, 'timestamp': '2025-12-15 10:07:48.202916', '_unique_id': 'e4d39b9a2566497aa5ffeacfef026bfe'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.203 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.203 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.203 12 ERROR oslo_messaging.notify.messaging yield
Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.203 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.203 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.203 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.203 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.203 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.203 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.203 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.203 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.203 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.203 12 ERROR oslo_messaging.notify.messaging conn.connect()
Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.203 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.203 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.203 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.203 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.203 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.203 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.203 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.203 12 ERROR oslo_messaging.notify.messaging
Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.203 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.203 12 ERROR oslo_messaging.notify.messaging
Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.203 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.203 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.203 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.203 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.203 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.203 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.203 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.203 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.203 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.203 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.203 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.203 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.203 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.203 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.203 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.203 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.203 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.203 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.203 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.203 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.203 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.203 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.203 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.203 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.203 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.203 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.203 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.203 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.203 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.203 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.203 12 ERROR oslo_messaging.notify.messaging
Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.204 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no new resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.204 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.bytes in the context of pollsters
Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.204 12 DEBUG ceilometer.compute.pollsters [-] 39ff1bd9-6f6b-44c8-bbec-a1fd9d196359/disk.device.read.bytes volume: 35560448 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.204 12 DEBUG ceilometer.compute.pollsters [-] 39ff1bd9-6f6b-44c8-bbec-a1fd9d196359/disk.device.read.bytes volume: 2154496 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.205 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '8d22378d-234d-4fa0-b27d-7781717672f6', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 35560448, 'user_id': '1ba5fce347b64bfebf995f187193f205', 'user_name': None, 'project_id': 'c785bf23f53946bc99867d8832a50266', 'project_name': None, 'resource_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359-vda', 'timestamp': '2025-12-15T10:07:48.204352', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359', 'instance_type': 'm1.small', 'host': 'bf4d507aa00b3fcf643a7e0b883420632ac7fc96e8c7a756982e121d', 'instance_host': 'np0005559462.localdomain', 'flavor': {'id': '2da0e147-aaa7-4bb9-a176-5fe1b15a32a0', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e'}, 'image_ref': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'e87c65bc-d99d-11f0-817e-fa163ebaca0f', 'monotonic_time': 12334.316543824, 'message_signature': '5bfa4ee8f6083715449939f2ab2a9600e2f7a1e6578717c48a0dc215eac96a05'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 2154496, 'user_id': '1ba5fce347b64bfebf995f187193f205', 'user_name': None, 'project_id': 'c785bf23f53946bc99867d8832a50266', 'project_name': None, 'resource_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359-vdb', 'timestamp': '2025-12-15T10:07:48.204352', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359', 'instance_type': 'm1.small', 'host': 'bf4d507aa00b3fcf643a7e0b883420632ac7fc96e8c7a756982e121d', 'instance_host': 'np0005559462.localdomain', 'flavor': {'id': '2da0e147-aaa7-4bb9-a176-5fe1b15a32a0', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e'}, 'image_ref': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'e87c6fc6-d99d-11f0-817e-fa163ebaca0f', 'monotonic_time': 12334.316543824, 'message_signature': 'd4063da0a17e9ddba460096bd8d5431746f85e5ab7caba895d713d0b24cf2b34'}]}, 'timestamp': '2025-12-15 10:07:48.204869', '_unique_id': 'cb497a2b478741568c17be0f5d285369'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.205 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.205 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.205 12 ERROR oslo_messaging.notify.messaging yield
Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.205 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.205 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.205 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.205 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.205 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.205 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.205 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.205 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.205 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.205 12 ERROR oslo_messaging.notify.messaging conn.connect()
Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.205 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.205 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.205 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.205 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.205 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.205 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.205 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.205 12 ERROR oslo_messaging.notify.messaging
Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.205 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.205 12 ERROR oslo_messaging.notify.messaging
Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.205 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.205 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.205 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.205 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.205 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.205 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.205 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.205 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.205 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.205 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.205 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.205 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.205 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.205 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.205 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.205 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.205 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.205 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.205 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.205 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.205 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.205 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.205 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.205 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.205 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.205 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.205 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.205 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.205 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.205 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.205 12 ERROR oslo_messaging.notify.messaging
Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.206 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.drop in the context of pollsters
Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.206 12 DEBUG ceilometer.compute.pollsters [-] 39ff1bd9-6f6b-44c8-bbec-a1fd9d196359/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.207 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '207c3b76-8bbf-4702-bffe-cfff368e7d91', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '1ba5fce347b64bfebf995f187193f205', 'user_name': None, 'project_id': 'c785bf23f53946bc99867d8832a50266', 'project_name': None, 'resource_id': 'instance-00000002-39ff1bd9-6f6b-44c8-bbec-a1fd9d196359-tap03ef8889-32', 'timestamp': '2025-12-15T10:07:48.206338', 'resource_metadata': {'display_name': 'test', 'name': 'tap03ef8889-32', 'instance_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359', 'instance_type': 'm1.small', 'host': 'bf4d507aa00b3fcf643a7e0b883420632ac7fc96e8c7a756982e121d', 'instance_host': 'np0005559462.localdomain', 'flavor': {'id': '2da0e147-aaa7-4bb9-a176-5fe1b15a32a0', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e'}, 'image_ref': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:74:df:7c', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap03ef8889-32'}, 'message_id': 'e87cb3a0-d99d-11f0-817e-fa163ebaca0f', 'monotonic_time': 12334.34432073, 'message_signature': 'd2e2ecbfb1701cdc2b7327067f481c35b3e67c20d09ea5b51c9c82648c44d8b0'}]}, 'timestamp': '2025-12-15 10:07:48.206627', '_unique_id': 'a5e5ad65e4264d54a61c8eb623713f9b'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.207 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.207 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.207 12 ERROR oslo_messaging.notify.messaging yield
Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.207 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.207 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.207 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.207 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.207 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.207 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.207 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.207 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.207 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.207 12 ERROR oslo_messaging.notify.messaging conn.connect()
Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.207 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.207 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.207 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.207 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.207 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.207 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.207 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.207 12 ERROR oslo_messaging.notify.messaging
Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.207 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.207 12 ERROR oslo_messaging.notify.messaging
Dec 15 05:07:48 localhost
ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.207 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.207 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.207 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.207 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.207 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.207 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.207 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.207 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.207 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.207 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection 
Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.207 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.207 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.207 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.207 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.207 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.207 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.207 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.207 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.207 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.207 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Dec 15 05:07:48 
localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.207 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.207 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.207 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.207 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.207 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.207 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.207 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.207 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.207 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.207 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.207 12 ERROR oslo_messaging.notify.messaging Dec 15 05:07:48 localhost 
ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.207 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no new resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193 Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.207 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no new resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193 Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.208 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes in the context of pollsters Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.208 12 DEBUG ceilometer.compute.pollsters [-] 39ff1bd9-6f6b-44c8-bbec-a1fd9d196359/network.incoming.bytes volume: 6809 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.208 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': '3da00f4a-f7aa-4c10-b0a8-03a9e2fe553a', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 6809, 'user_id': '1ba5fce347b64bfebf995f187193f205', 'user_name': None, 'project_id': 'c785bf23f53946bc99867d8832a50266', 'project_name': None, 'resource_id': 'instance-00000002-39ff1bd9-6f6b-44c8-bbec-a1fd9d196359-tap03ef8889-32', 'timestamp': '2025-12-15T10:07:48.208117', 'resource_metadata': {'display_name': 'test', 'name': 'tap03ef8889-32', 'instance_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359', 'instance_type': 'm1.small', 'host': 'bf4d507aa00b3fcf643a7e0b883420632ac7fc96e8c7a756982e121d', 'instance_host': 'np0005559462.localdomain', 'flavor': {'id': '2da0e147-aaa7-4bb9-a176-5fe1b15a32a0', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e'}, 'image_ref': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:74:df:7c', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap03ef8889-32'}, 'message_id': 'e87cf8f6-d99d-11f0-817e-fa163ebaca0f', 'monotonic_time': 12334.34432073, 'message_signature': 'b83403ca0b1856f4c75ef020afd03077ffc22109f951e471e00d605bf32c2221'}]}, 'timestamp': '2025-12-15 10:07:48.208402', '_unique_id': 'c387d940c4f14d1898b7bbadf6ca9414'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.208 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 
2025-12-15 10:07:48.208 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.208 12 ERROR oslo_messaging.notify.messaging yield Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.208 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.208 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.208 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.208 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.208 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.208 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.208 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.208 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.208 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.208 12 ERROR oslo_messaging.notify.messaging conn.connect() Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.208 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.208 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.208 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.208 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.208 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.208 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.208 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.208 12 ERROR oslo_messaging.notify.messaging Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.208 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.208 12 ERROR oslo_messaging.notify.messaging Dec 15 05:07:48 localhost 
ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.208 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.208 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.208 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.208 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.208 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.208 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.208 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.208 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.208 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.208 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection 
Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.208 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.208 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.208 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.208 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.208 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.208 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.208 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.208 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.208 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.208 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Dec 15 05:07:48 
localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.208 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.208 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.208 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.208 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.208 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.208 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.208 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.208 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.208 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.208 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.208 12 ERROR oslo_messaging.notify.messaging Dec 15 05:07:48 localhost 
ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.209 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes in the context of pollsters Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.209 12 DEBUG ceilometer.compute.pollsters [-] 39ff1bd9-6f6b-44c8-bbec-a1fd9d196359/network.outgoing.bytes volume: 9770 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.210 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '7393a09b-a2d0-4c0f-ad5c-f9ed4f427cc1', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 9770, 'user_id': '1ba5fce347b64bfebf995f187193f205', 'user_name': None, 'project_id': 'c785bf23f53946bc99867d8832a50266', 'project_name': None, 'resource_id': 'instance-00000002-39ff1bd9-6f6b-44c8-bbec-a1fd9d196359-tap03ef8889-32', 'timestamp': '2025-12-15T10:07:48.209715', 'resource_metadata': {'display_name': 'test', 'name': 'tap03ef8889-32', 'instance_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359', 'instance_type': 'm1.small', 'host': 'bf4d507aa00b3fcf643a7e0b883420632ac7fc96e8c7a756982e121d', 'instance_host': 'np0005559462.localdomain', 'flavor': {'id': '2da0e147-aaa7-4bb9-a176-5fe1b15a32a0', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e'}, 'image_ref': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:74:df:7c', 'fref': None, 
'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap03ef8889-32'}, 'message_id': 'e87d3780-d99d-11f0-817e-fa163ebaca0f', 'monotonic_time': 12334.34432073, 'message_signature': 'd0a2600c01fe74e5ddca966ab4a213c012ceefacc645b1e4a59d04e9c63c1c86'}]}, 'timestamp': '2025-12-15 10:07:48.210026', '_unique_id': '9730a0808241499daeb892c0e9eade11'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.210 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.210 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.210 12 ERROR oslo_messaging.notify.messaging yield Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.210 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.210 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.210 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.210 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.210 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.210 12 ERROR 
oslo_messaging.notify.messaging self._connection = self._establish_connection() Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.210 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.210 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.210 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.210 12 ERROR oslo_messaging.notify.messaging conn.connect() Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.210 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.210 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.210 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.210 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.210 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.210 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 
2025-12-15 10:07:48.210 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.210 12 ERROR oslo_messaging.notify.messaging Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.210 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.210 12 ERROR oslo_messaging.notify.messaging Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.210 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.210 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.210 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.210 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.210 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.210 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.210 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Dec 15 05:07:48 localhost 
ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.210 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.210 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.210 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.210 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.210 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.210 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.210 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.210 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.210 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.210 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Dec 15 
05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.210 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.210 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.210 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.210 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.210 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.210 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.210 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.210 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.210 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.210 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.210 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.210 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.210 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.210 12 ERROR oslo_messaging.notify.messaging Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.211 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.capacity in the context of pollsters Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.211 12 DEBUG ceilometer.compute.pollsters [-] 39ff1bd9-6f6b-44c8-bbec-a1fd9d196359/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.211 12 DEBUG ceilometer.compute.pollsters [-] 39ff1bd9-6f6b-44c8-bbec-a1fd9d196359/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.212 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': 'b6ca2f4d-358b-4fb6-9de3-6ccbb19ec5c6', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '1ba5fce347b64bfebf995f187193f205', 'user_name': None, 'project_id': 'c785bf23f53946bc99867d8832a50266', 'project_name': None, 'resource_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359-vda', 'timestamp': '2025-12-15T10:07:48.211378', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359', 'instance_type': 'm1.small', 'host': 'bf4d507aa00b3fcf643a7e0b883420632ac7fc96e8c7a756982e121d', 'instance_host': 'np0005559462.localdomain', 'flavor': {'id': '2da0e147-aaa7-4bb9-a176-5fe1b15a32a0', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e'}, 'image_ref': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'e87d788a-d99d-11f0-817e-fa163ebaca0f', 'monotonic_time': 12334.355854043, 'message_signature': '9d45552ea4d02e31a0bf4aedd21235d5be7748d6f816d7f688b22c8bfe56a4fa'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '1ba5fce347b64bfebf995f187193f205', 'user_name': None, 'project_id': 'c785bf23f53946bc99867d8832a50266', 'project_name': None, 'resource_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359-vdb', 'timestamp': '2025-12-15T10:07:48.211378', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359', 
'instance_type': 'm1.small', 'host': 'bf4d507aa00b3fcf643a7e0b883420632ac7fc96e8c7a756982e121d', 'instance_host': 'np0005559462.localdomain', 'flavor': {'id': '2da0e147-aaa7-4bb9-a176-5fe1b15a32a0', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e'}, 'image_ref': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'e87d82a8-d99d-11f0-817e-fa163ebaca0f', 'monotonic_time': 12334.355854043, 'message_signature': 'f6e006ce8850c497f81cae084bc2487f4bd76ce7c926bf8cc3dc7b36b00936a2'}]}, 'timestamp': '2025-12-15 10:07:48.211906', '_unique_id': '6fab88960e0546348ef33ce8438c8b8f'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.212 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.212 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.212 12 ERROR oslo_messaging.notify.messaging yield Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.212 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.212 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.212 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.212 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.212 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.212 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.212 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.212 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.212 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.212 12 ERROR oslo_messaging.notify.messaging conn.connect() Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.212 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.212 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.212 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Dec 15 05:07:48 localhost 
ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.212 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.212 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.212 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.212 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.212 12 ERROR oslo_messaging.notify.messaging Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.212 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.212 12 ERROR oslo_messaging.notify.messaging Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.212 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.212 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.212 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.212 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 
10:07:48.212 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.212 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.212 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.212 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.212 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.212 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.212 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.212 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.212 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.212 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Dec 15 05:07:48 localhost 
ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.212 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.212 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.212 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.212 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.212 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.212 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.212 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.212 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.212 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.212 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 
10:07:48.212 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.212 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.212 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.212 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.212 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.212 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.212 12 ERROR oslo_messaging.notify.messaging Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.213 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.error in the context of pollsters Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.213 12 DEBUG ceilometer.compute.pollsters [-] 39ff1bd9-6f6b-44c8-bbec-a1fd9d196359/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.214 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': 'd5b4144e-1572-4599-a926-fabf66feb8c7', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '1ba5fce347b64bfebf995f187193f205', 'user_name': None, 'project_id': 'c785bf23f53946bc99867d8832a50266', 'project_name': None, 'resource_id': 'instance-00000002-39ff1bd9-6f6b-44c8-bbec-a1fd9d196359-tap03ef8889-32', 'timestamp': '2025-12-15T10:07:48.213270', 'resource_metadata': {'display_name': 'test', 'name': 'tap03ef8889-32', 'instance_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359', 'instance_type': 'm1.small', 'host': 'bf4d507aa00b3fcf643a7e0b883420632ac7fc96e8c7a756982e121d', 'instance_host': 'np0005559462.localdomain', 'flavor': {'id': '2da0e147-aaa7-4bb9-a176-5fe1b15a32a0', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e'}, 'image_ref': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:74:df:7c', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap03ef8889-32'}, 'message_id': 'e87dc268-d99d-11f0-817e-fa163ebaca0f', 'monotonic_time': 12334.34432073, 'message_signature': '44f3b44cb459a800c767925772456e48df7f79332f0f6d5384daf1905e19f0d0'}]}, 'timestamp': '2025-12-15 10:07:48.213558', '_unique_id': 'b1c21337874d4119b691d981bb8a5c57'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.214 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 15 05:07:48 localhost 
ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.214 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.214 12 ERROR oslo_messaging.notify.messaging yield Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.214 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.214 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.214 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.214 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.214 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.214 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.214 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.214 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.214 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.214 12 ERROR oslo_messaging.notify.messaging conn.connect() Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.214 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.214 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.214 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.214 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.214 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.214 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.214 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.214 12 ERROR oslo_messaging.notify.messaging Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.214 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.214 12 ERROR oslo_messaging.notify.messaging Dec 15 05:07:48 localhost 
ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.214 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.214 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.214 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.214 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.214 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.214 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.214 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.214 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.214 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.214 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection 
Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.214 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.214 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.214 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.214 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.214 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.214 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.214 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.214 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.214 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.214 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Dec 15 05:07:48 
localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.214 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.214 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.214 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.214 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.214 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.214 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.214 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.214 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.214 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.214 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 15 05:07:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:07:48.214 12 ERROR oslo_messaging.notify.messaging Dec 15 05:07:48 localhost ceph-mon[298913]: 
mon.np0005559462@0(leader).osd e194 do_prune osdmap full prune enabled Dec 15 05:07:48 localhost ceph-mon[298913]: mon.np0005559462@0(leader).osd e195 e195: 6 total, 6 up, 6 in Dec 15 05:07:48 localhost ceph-mon[298913]: log_channel(cluster) log [DBG] : osdmap e195: 6 total, 6 up, 6 in Dec 15 05:07:48 localhost dnsmasq[328948]: exiting on receipt of SIGTERM Dec 15 05:07:48 localhost podman[328964]: 2025-12-15 10:07:48.290611315 +0000 UTC m=+0.059532939 container kill 062a4e9ed068663fd31358325b21585fa0c4282ee890718359dfadba7774126d (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-c0669abd-aef1-4b0d-9f97-a6adeeac3211, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true) Dec 15 05:07:48 localhost systemd[1]: libpod-062a4e9ed068663fd31358325b21585fa0c4282ee890718359dfadba7774126d.scope: Deactivated successfully. 
Dec 15 05:07:48 localhost podman[328978]: 2025-12-15 10:07:48.363882038 +0000 UTC m=+0.061586806 container died 062a4e9ed068663fd31358325b21585fa0c4282ee890718359dfadba7774126d (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-c0669abd-aef1-4b0d-9f97-a6adeeac3211, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb) Dec 15 05:07:48 localhost nova_compute[286344]: 2025-12-15 10:07:48.387 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:07:48 localhost podman[328978]: 2025-12-15 10:07:48.418877273 +0000 UTC m=+0.116582011 container cleanup 062a4e9ed068663fd31358325b21585fa0c4282ee890718359dfadba7774126d (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-c0669abd-aef1-4b0d-9f97-a6adeeac3211, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0) Dec 15 05:07:48 localhost systemd[1]: libpod-conmon-062a4e9ed068663fd31358325b21585fa0c4282ee890718359dfadba7774126d.scope: Deactivated successfully. 
Dec 15 05:07:48 localhost podman[328985]: 2025-12-15 10:07:48.494775777 +0000 UTC m=+0.172328517 container remove 062a4e9ed068663fd31358325b21585fa0c4282ee890718359dfadba7774126d (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-c0669abd-aef1-4b0d-9f97-a6adeeac3211, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2) Dec 15 05:07:48 localhost systemd[1]: var-lib-containers-storage-overlay-559de983d183607019ef7582e8a5bf3ce9281b22abc68315c8bb21074740eed4-merged.mount: Deactivated successfully. Dec 15 05:07:48 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-062a4e9ed068663fd31358325b21585fa0c4282ee890718359dfadba7774126d-userdata-shm.mount: Deactivated successfully. Dec 15 05:07:49 localhost podman[329056]: Dec 15 05:07:49 localhost podman[329056]: 2025-12-15 10:07:49.473061777 +0000 UTC m=+0.085745303 container create 22a54391df476f3575cf093f948ee174ef592dfc56b0d052315e4b0c762fd783 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-c0669abd-aef1-4b0d-9f97-a6adeeac3211, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2) Dec 15 05:07:49 localhost systemd[1]: Started libpod-conmon-22a54391df476f3575cf093f948ee174ef592dfc56b0d052315e4b0c762fd783.scope. Dec 15 05:07:49 localhost systemd[1]: tmp-crun.HjP7aB.mount: Deactivated successfully. 
Dec 15 05:07:49 localhost podman[329056]: 2025-12-15 10:07:49.426356257 +0000 UTC m=+0.039039843 image pull quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified Dec 15 05:07:49 localhost systemd[1]: Started libcrun container. Dec 15 05:07:49 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/8747dec28a6fb70e32b1792e013ad2ab795352689e6e125952bcd4fa3d23a46c/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff) Dec 15 05:07:49 localhost podman[329056]: 2025-12-15 10:07:49.552363133 +0000 UTC m=+0.165046679 container init 22a54391df476f3575cf093f948ee174ef592dfc56b0d052315e4b0c762fd783 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-c0669abd-aef1-4b0d-9f97-a6adeeac3211, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251202, io.buildah.version=1.41.3) Dec 15 05:07:49 localhost podman[329056]: 2025-12-15 10:07:49.562507579 +0000 UTC m=+0.175191125 container start 22a54391df476f3575cf093f948ee174ef592dfc56b0d052315e4b0c762fd783 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-c0669abd-aef1-4b0d-9f97-a6adeeac3211, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team) Dec 15 05:07:49 localhost dnsmasq[329074]: started, version 2.85 cachesize 150 Dec 15 05:07:49 localhost dnsmasq[329074]: DNS service limited to local subnets Dec 15 05:07:49 localhost 
dnsmasq[329074]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile Dec 15 05:07:49 localhost dnsmasq[329074]: warning: no upstream servers configured Dec 15 05:07:49 localhost dnsmasq-dhcp[329074]: DHCPv6, static leases only on 2001:db8:0:1::, lease time 1d Dec 15 05:07:49 localhost dnsmasq[329074]: read /var/lib/neutron/dhcp/c0669abd-aef1-4b0d-9f97-a6adeeac3211/addn_hosts - 0 addresses Dec 15 05:07:49 localhost dnsmasq-dhcp[329074]: read /var/lib/neutron/dhcp/c0669abd-aef1-4b0d-9f97-a6adeeac3211/host Dec 15 05:07:49 localhost dnsmasq-dhcp[329074]: read /var/lib/neutron/dhcp/c0669abd-aef1-4b0d-9f97-a6adeeac3211/opts Dec 15 05:07:49 localhost neutron_dhcp_agent[267542]: 2025-12-15 10:07:49.863 267546 INFO neutron.agent.dhcp.agent [None req-a252f359-b427-43ca-9350-77199433a265 - - - - - -] DHCP configuration for ports {'79503367-f53f-4b35-8760-76fcaa4d8407', 'c645485a-086d-48ea-a19e-1922e8d2dbd9'} is completed#033[00m Dec 15 05:07:49 localhost dnsmasq[329074]: exiting on receipt of SIGTERM Dec 15 05:07:49 localhost podman[329090]: 2025-12-15 10:07:49.961105127 +0000 UTC m=+0.059049977 container kill 22a54391df476f3575cf093f948ee174ef592dfc56b0d052315e4b0c762fd783 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-c0669abd-aef1-4b0d-9f97-a6adeeac3211, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team) Dec 15 05:07:49 localhost systemd[1]: libpod-22a54391df476f3575cf093f948ee174ef592dfc56b0d052315e4b0c762fd783.scope: Deactivated successfully. 
Dec 15 05:07:50 localhost podman[329103]: 2025-12-15 10:07:50.032413556 +0000 UTC m=+0.058778979 container died 22a54391df476f3575cf093f948ee174ef592dfc56b0d052315e4b0c762fd783 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-c0669abd-aef1-4b0d-9f97-a6adeeac3211, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, org.label-schema.build-date=20251202, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0) Dec 15 05:07:50 localhost systemd[1]: tmp-crun.jTQyrX.mount: Deactivated successfully. Dec 15 05:07:50 localhost podman[329103]: 2025-12-15 10:07:50.077598035 +0000 UTC m=+0.103963418 container cleanup 22a54391df476f3575cf093f948ee174ef592dfc56b0d052315e4b0c762fd783 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-c0669abd-aef1-4b0d-9f97-a6adeeac3211, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_managed=true) Dec 15 05:07:50 localhost systemd[1]: libpod-conmon-22a54391df476f3575cf093f948ee174ef592dfc56b0d052315e4b0c762fd783.scope: Deactivated successfully. 
Dec 15 05:07:50 localhost podman[329111]: 2025-12-15 10:07:50.161457365 +0000 UTC m=+0.174358323 container remove 22a54391df476f3575cf093f948ee174ef592dfc56b0d052315e4b0c762fd783 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-c0669abd-aef1-4b0d-9f97-a6adeeac3211, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image) Dec 15 05:07:50 localhost systemd[1]: var-lib-containers-storage-overlay-8747dec28a6fb70e32b1792e013ad2ab795352689e6e125952bcd4fa3d23a46c-merged.mount: Deactivated successfully. Dec 15 05:07:50 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-22a54391df476f3575cf093f948ee174ef592dfc56b0d052315e4b0c762fd783-userdata-shm.mount: Deactivated successfully. 
Dec 15 05:07:50 localhost nova_compute[286344]: 2025-12-15 10:07:50.859 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:07:51 localhost podman[329187]: Dec 15 05:07:51 localhost podman[329187]: 2025-12-15 10:07:51.056137482 +0000 UTC m=+0.080635684 container create 9dd7c9866033c86a82e003c1d890f248f09395258683f7a527a2ea3672598b8a (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-c0669abd-aef1-4b0d-9f97-a6adeeac3211, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.build-date=20251202, io.buildah.version=1.41.3, tcib_managed=true) Dec 15 05:07:51 localhost systemd[1]: Started libpod-conmon-9dd7c9866033c86a82e003c1d890f248f09395258683f7a527a2ea3672598b8a.scope. Dec 15 05:07:51 localhost systemd[1]: tmp-crun.bsvXgs.mount: Deactivated successfully. Dec 15 05:07:51 localhost podman[329187]: 2025-12-15 10:07:51.020330088 +0000 UTC m=+0.044828330 image pull quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified Dec 15 05:07:51 localhost systemd[1]: Started libcrun container. 
Dec 15 05:07:51 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/af10404c8082b047a4ece97be9ebad242e59329e0bf5eb507fdcdc69946c2e62/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff) Dec 15 05:07:51 localhost podman[329187]: 2025-12-15 10:07:51.139758775 +0000 UTC m=+0.164256977 container init 9dd7c9866033c86a82e003c1d890f248f09395258683f7a527a2ea3672598b8a (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-c0669abd-aef1-4b0d-9f97-a6adeeac3211, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0) Dec 15 05:07:51 localhost podman[329187]: 2025-12-15 10:07:51.148535614 +0000 UTC m=+0.173033806 container start 9dd7c9866033c86a82e003c1d890f248f09395258683f7a527a2ea3672598b8a (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-c0669abd-aef1-4b0d-9f97-a6adeeac3211, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3) Dec 15 05:07:51 localhost dnsmasq[329206]: started, version 2.85 cachesize 150 Dec 15 05:07:51 localhost dnsmasq[329206]: DNS service limited to local subnets Dec 15 05:07:51 localhost dnsmasq[329206]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile Dec 15 05:07:51 localhost dnsmasq[329206]: warning: no upstream servers 
configured Dec 15 05:07:51 localhost dnsmasq-dhcp[329206]: DHCPv6, static leases only on 2001:db8::, lease time 1d Dec 15 05:07:51 localhost dnsmasq[329206]: read /var/lib/neutron/dhcp/c0669abd-aef1-4b0d-9f97-a6adeeac3211/addn_hosts - 0 addresses Dec 15 05:07:51 localhost dnsmasq-dhcp[329206]: read /var/lib/neutron/dhcp/c0669abd-aef1-4b0d-9f97-a6adeeac3211/host Dec 15 05:07:51 localhost dnsmasq-dhcp[329206]: read /var/lib/neutron/dhcp/c0669abd-aef1-4b0d-9f97-a6adeeac3211/opts Dec 15 05:07:51 localhost ceph-mon[298913]: mon.np0005559462@0(leader).osd e195 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104 Dec 15 05:07:51 localhost ceph-mon[298913]: mon.np0005559462@0(leader).osd e195 do_prune osdmap full prune enabled Dec 15 05:07:51 localhost ceph-mon[298913]: mon.np0005559462@0(leader).osd e196 e196: 6 total, 6 up, 6 in Dec 15 05:07:51 localhost ceph-mon[298913]: log_channel(cluster) log [DBG] : osdmap e196: 6 total, 6 up, 6 in Dec 15 05:07:51 localhost ovn_metadata_agent[160585]: 2025-12-15 10:07:51.483 160590 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Dec 15 05:07:51 localhost ovn_metadata_agent[160585]: 2025-12-15 10:07:51.484 160590 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Dec 15 05:07:51 localhost ovn_metadata_agent[160585]: 2025-12-15 10:07:51.484 160590 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m 
Dec 15 05:07:52 localhost dnsmasq[329206]: read /var/lib/neutron/dhcp/c0669abd-aef1-4b0d-9f97-a6adeeac3211/addn_hosts - 0 addresses Dec 15 05:07:52 localhost dnsmasq-dhcp[329206]: read /var/lib/neutron/dhcp/c0669abd-aef1-4b0d-9f97-a6adeeac3211/host Dec 15 05:07:52 localhost dnsmasq-dhcp[329206]: read /var/lib/neutron/dhcp/c0669abd-aef1-4b0d-9f97-a6adeeac3211/opts Dec 15 05:07:52 localhost podman[329224]: 2025-12-15 10:07:52.056660636 +0000 UTC m=+0.059823317 container kill 9dd7c9866033c86a82e003c1d890f248f09395258683f7a527a2ea3672598b8a (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-c0669abd-aef1-4b0d-9f97-a6adeeac3211, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb) Dec 15 05:07:52 localhost systemd[1]: Started /usr/bin/podman healthcheck run 4d46c406a3855fd1c322e5d86c048188ea5865daa07ba23aaebacc7879668923. 
Dec 15 05:07:52 localhost podman[329238]: 2025-12-15 10:07:52.168534288 +0000 UTC m=+0.083946383 container health_status 4d46c406a3855fd1c322e5d86c048188ea5865daa07ba23aaebacc7879668923 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '07f3af15d79276418f54a8dde2fc251064527c3571430e14af77aaa2c01f1193-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.build-date=20251202) Dec 15 05:07:52 localhost 
podman[329238]: 2025-12-15 10:07:52.179393214 +0000 UTC m=+0.094805309 container exec_died 4d46c406a3855fd1c322e5d86c048188ea5865daa07ba23aaebacc7879668923 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=ovn_metadata_agent, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '07f3af15d79276418f54a8dde2fc251064527c3571430e14af77aaa2c01f1193-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}) Dec 15 05:07:52 localhost systemd[1]: 
4d46c406a3855fd1c322e5d86c048188ea5865daa07ba23aaebacc7879668923.service: Deactivated successfully. Dec 15 05:07:53 localhost neutron_dhcp_agent[267542]: 2025-12-15 10:07:53.164 267546 INFO neutron.agent.dhcp.agent [None req-68eaed9d-5769-44b5-b1b8-5b5733d018d3 - - - - - -] DHCP configuration for ports {'79503367-f53f-4b35-8760-76fcaa4d8407', 'c645485a-086d-48ea-a19e-1922e8d2dbd9'} is completed#033[00m Dec 15 05:07:53 localhost dnsmasq[329206]: exiting on receipt of SIGTERM Dec 15 05:07:53 localhost podman[329280]: 2025-12-15 10:07:53.182079768 +0000 UTC m=+0.062807659 container kill 9dd7c9866033c86a82e003c1d890f248f09395258683f7a527a2ea3672598b8a (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-c0669abd-aef1-4b0d-9f97-a6adeeac3211, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_managed=true) Dec 15 05:07:53 localhost systemd[1]: libpod-9dd7c9866033c86a82e003c1d890f248f09395258683f7a527a2ea3672598b8a.scope: Deactivated successfully. 
Dec 15 05:07:53 localhost podman[329293]: 2025-12-15 10:07:53.248470603 +0000 UTC m=+0.055401457 container died 9dd7c9866033c86a82e003c1d890f248f09395258683f7a527a2ea3672598b8a (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-c0669abd-aef1-4b0d-9f97-a6adeeac3211, tcib_managed=true, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team) Dec 15 05:07:53 localhost systemd[1]: tmp-crun.7I7uVX.mount: Deactivated successfully. Dec 15 05:07:53 localhost podman[329293]: 2025-12-15 10:07:53.338241484 +0000 UTC m=+0.145172298 container cleanup 9dd7c9866033c86a82e003c1d890f248f09395258683f7a527a2ea3672598b8a (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-c0669abd-aef1-4b0d-9f97-a6adeeac3211, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2) Dec 15 05:07:53 localhost systemd[1]: libpod-conmon-9dd7c9866033c86a82e003c1d890f248f09395258683f7a527a2ea3672598b8a.scope: Deactivated successfully. 
Dec 15 05:07:53 localhost podman[329295]: 2025-12-15 10:07:53.362491373 +0000 UTC m=+0.158731267 container remove 9dd7c9866033c86a82e003c1d890f248f09395258683f7a527a2ea3672598b8a (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-c0669abd-aef1-4b0d-9f97-a6adeeac3211, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb) Dec 15 05:07:53 localhost nova_compute[286344]: 2025-12-15 10:07:53.391 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:07:53 localhost neutron_dhcp_agent[267542]: 2025-12-15 10:07:53.675 267546 INFO neutron.agent.dhcp.agent [None req-bfc27b21-56ef-4a27-9a78-cfb7a6018b02 - - - - - -] DHCP configuration for ports {'79503367-f53f-4b35-8760-76fcaa4d8407', 'c645485a-086d-48ea-a19e-1922e8d2dbd9'} is completed#033[00m Dec 15 05:07:53 localhost neutron_sriov_agent[260044]: 2025-12-15 10:07:53.997 2 INFO neutron.agent.securitygroups_rpc [None req-6e1915da-973e-4475-9a95-26fdf57603bc 6b5da6f221214afe93e1fa66574f238b 89e710ef9f4f48d48a369002db572947 - - default default] Security group member updated ['a6c5f808-dddc-4f17-acbf-63b1b6e6f4d6']#033[00m Dec 15 05:07:54 localhost systemd[1]: var-lib-containers-storage-overlay-af10404c8082b047a4ece97be9ebad242e59329e0bf5eb507fdcdc69946c2e62-merged.mount: Deactivated successfully. Dec 15 05:07:54 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-9dd7c9866033c86a82e003c1d890f248f09395258683f7a527a2ea3672598b8a-userdata-shm.mount: Deactivated successfully. 
Dec 15 05:07:54 localhost ovn_controller[154603]: 2025-12-15T10:07:54Z|00409|binding|INFO|Releasing lport b35254ad-12eb-47bb-92be-44fefe0694f0 from this chassis (sb_readonly=0) Dec 15 05:07:54 localhost nova_compute[286344]: 2025-12-15 10:07:54.713 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:07:55 localhost neutron_sriov_agent[260044]: 2025-12-15 10:07:55.166 2 INFO neutron.agent.securitygroups_rpc [None req-51b6fad8-ab09-45a6-8494-55b3ab36f9d4 6b5da6f221214afe93e1fa66574f238b 89e710ef9f4f48d48a369002db572947 - - default default] Security group member updated ['a6c5f808-dddc-4f17-acbf-63b1b6e6f4d6']#033[00m Dec 15 05:07:55 localhost podman[329373]: Dec 15 05:07:55 localhost podman[329373]: 2025-12-15 10:07:55.320791849 +0000 UTC m=+0.089345410 container create 2faedaeec19a149046e7243bbd727fb227482d627ff75c71d7f1350c184f53e5 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-c0669abd-aef1-4b0d-9f97-a6adeeac3211, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2) Dec 15 05:07:55 localhost podman[329373]: 2025-12-15 10:07:55.277059411 +0000 UTC m=+0.045613022 image pull quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified Dec 15 05:07:55 localhost systemd[1]: Started libpod-conmon-2faedaeec19a149046e7243bbd727fb227482d627ff75c71d7f1350c184f53e5.scope. Dec 15 05:07:55 localhost systemd[1]: Started libcrun container. 
Dec 15 05:07:55 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/991d0175b795e5b544b14e19080682f2362d065cb09718dd91cdd74b675751d6/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff) Dec 15 05:07:55 localhost podman[329373]: 2025-12-15 10:07:55.409197894 +0000 UTC m=+0.177751465 container init 2faedaeec19a149046e7243bbd727fb227482d627ff75c71d7f1350c184f53e5 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-c0669abd-aef1-4b0d-9f97-a6adeeac3211, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202) Dec 15 05:07:55 localhost podman[329373]: 2025-12-15 10:07:55.419141243 +0000 UTC m=+0.187694804 container start 2faedaeec19a149046e7243bbd727fb227482d627ff75c71d7f1350c184f53e5 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-c0669abd-aef1-4b0d-9f97-a6adeeac3211, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team) Dec 15 05:07:55 localhost dnsmasq[329392]: started, version 2.85 cachesize 150 Dec 15 05:07:55 localhost dnsmasq[329392]: DNS service limited to local subnets Dec 15 05:07:55 localhost dnsmasq[329392]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile Dec 15 05:07:55 localhost dnsmasq[329392]: warning: no upstream servers 
configured Dec 15 05:07:55 localhost dnsmasq-dhcp[329392]: DHCPv6, static leases only on 2001:db8::, lease time 1d Dec 15 05:07:55 localhost dnsmasq-dhcp[329392]: DHCPv6, static leases only on 2001:db8:0:1::, lease time 1d Dec 15 05:07:55 localhost dnsmasq[329392]: read /var/lib/neutron/dhcp/c0669abd-aef1-4b0d-9f97-a6adeeac3211/addn_hosts - 0 addresses Dec 15 05:07:55 localhost dnsmasq-dhcp[329392]: read /var/lib/neutron/dhcp/c0669abd-aef1-4b0d-9f97-a6adeeac3211/host Dec 15 05:07:55 localhost dnsmasq-dhcp[329392]: read /var/lib/neutron/dhcp/c0669abd-aef1-4b0d-9f97-a6adeeac3211/opts Dec 15 05:07:55 localhost neutron_dhcp_agent[267542]: 2025-12-15 10:07:55.484 267546 INFO neutron.agent.dhcp.agent [None req-92b72e73-1364-4aa4-badd-bf7da49d3f37 - - - - - -] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-12-15T10:07:45Z, description=, device_id=, device_owner=, dns_assignment=[, ], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[, ], id=fbe41b43-2c1d-44c0-89ff-7c1bc53fe35d, ip_allocation=immediate, mac_address=fa:16:3e:de:a1:e1, name=tempest-NetworksTestDHCPv6-1492849438, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-12-15T10:05:39Z, description=, dns_domain=, id=c0669abd-aef1-4b0d-9f97-a6adeeac3211, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-NetworksTestDHCPv6-test-network-1861207414, port_security_enabled=True, project_id=89e710ef9f4f48d48a369002db572947, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=20673, qos_policy_id=None, revision_number=49, router:external=False, shared=False, standard_attr_id=2164, status=ACTIVE, subnets=['2672789a-2085-4cc2-8531-d7b3612d4e5b', '3a1ec57d-84ad-4063-ab8a-3bcdfd15932c'], tags=[], tenant_id=89e710ef9f4f48d48a369002db572947, 
updated_at=2025-12-15T10:07:45Z, vlan_transparent=None, network_id=c0669abd-aef1-4b0d-9f97-a6adeeac3211, port_security_enabled=True, project_id=89e710ef9f4f48d48a369002db572947, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=['a6c5f808-dddc-4f17-acbf-63b1b6e6f4d6'], standard_attr_id=2815, status=DOWN, tags=[], tenant_id=89e710ef9f4f48d48a369002db572947, updated_at=2025-12-15T10:07:46Z on network c0669abd-aef1-4b0d-9f97-a6adeeac3211#033[00m Dec 15 05:07:55 localhost dnsmasq[329392]: read /var/lib/neutron/dhcp/c0669abd-aef1-4b0d-9f97-a6adeeac3211/addn_hosts - 2 addresses Dec 15 05:07:55 localhost dnsmasq-dhcp[329392]: read /var/lib/neutron/dhcp/c0669abd-aef1-4b0d-9f97-a6adeeac3211/host Dec 15 05:07:55 localhost dnsmasq-dhcp[329392]: read /var/lib/neutron/dhcp/c0669abd-aef1-4b0d-9f97-a6adeeac3211/opts Dec 15 05:07:55 localhost podman[329409]: 2025-12-15 10:07:55.655703336 +0000 UTC m=+0.054804811 container kill 2faedaeec19a149046e7243bbd727fb227482d627ff75c71d7f1350c184f53e5 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-c0669abd-aef1-4b0d-9f97-a6adeeac3211, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb) Dec 15 05:07:55 localhost neutron_dhcp_agent[267542]: 2025-12-15 10:07:55.662 267546 INFO neutron.agent.dhcp.agent [None req-ffb52a40-639f-463e-a7d7-3b27a4a45b75 - - - - - -] DHCP configuration for ports {'79503367-f53f-4b35-8760-76fcaa4d8407', 'c645485a-086d-48ea-a19e-1922e8d2dbd9'} is completed#033[00m Dec 15 05:07:55 localhost neutron_dhcp_agent[267542]: 2025-12-15 10:07:55.815 267546 INFO neutron.agent.dhcp.agent [None req-92b72e73-1364-4aa4-badd-bf7da49d3f37 - 
- - - - -] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-12-15T10:07:53Z, description=, device_id=, device_owner=, dns_assignment=[, ], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[, ], id=4d5d3c0e-5d3b-45c5-8b70-f2621ce5b7fc, ip_allocation=immediate, mac_address=fa:16:3e:d4:73:c9, name=tempest-NetworksTestDHCPv6-664859919, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-12-15T10:05:39Z, description=, dns_domain=, id=c0669abd-aef1-4b0d-9f97-a6adeeac3211, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-NetworksTestDHCPv6-test-network-1861207414, port_security_enabled=True, project_id=89e710ef9f4f48d48a369002db572947, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=20673, qos_policy_id=None, revision_number=53, router:external=False, shared=False, standard_attr_id=2164, status=ACTIVE, subnets=['188ac5e8-b169-4284-b987-19d152c943ad', '4fd4113e-5947-4070-baae-fdc8b31c9be6'], tags=[], tenant_id=89e710ef9f4f48d48a369002db572947, updated_at=2025-12-15T10:07:51Z, vlan_transparent=None, network_id=c0669abd-aef1-4b0d-9f97-a6adeeac3211, port_security_enabled=True, project_id=89e710ef9f4f48d48a369002db572947, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=['a6c5f808-dddc-4f17-acbf-63b1b6e6f4d6'], standard_attr_id=2839, status=DOWN, tags=[], tenant_id=89e710ef9f4f48d48a369002db572947, updated_at=2025-12-15T10:07:53Z on network c0669abd-aef1-4b0d-9f97-a6adeeac3211#033[00m Dec 15 05:07:55 localhost neutron_dhcp_agent[267542]: 2025-12-15 10:07:55.891 267546 INFO neutron.agent.dhcp.agent [None req-b388a527-503c-4cd0-a353-d6f01f2e5bf5 - - - - - -] DHCP configuration for ports {'fbe41b43-2c1d-44c0-89ff-7c1bc53fe35d'} is 
completed#033[00m Dec 15 05:07:55 localhost nova_compute[286344]: 2025-12-15 10:07:55.898 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:07:56 localhost dnsmasq[329392]: read /var/lib/neutron/dhcp/c0669abd-aef1-4b0d-9f97-a6adeeac3211/addn_hosts - 4 addresses Dec 15 05:07:56 localhost dnsmasq-dhcp[329392]: read /var/lib/neutron/dhcp/c0669abd-aef1-4b0d-9f97-a6adeeac3211/host Dec 15 05:07:56 localhost podman[329449]: 2025-12-15 10:07:56.004214252 +0000 UTC m=+0.058802750 container kill 2faedaeec19a149046e7243bbd727fb227482d627ff75c71d7f1350c184f53e5 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-c0669abd-aef1-4b0d-9f97-a6adeeac3211, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image) Dec 15 05:07:56 localhost dnsmasq-dhcp[329392]: read /var/lib/neutron/dhcp/c0669abd-aef1-4b0d-9f97-a6adeeac3211/opts Dec 15 05:07:56 localhost ceph-mon[298913]: mon.np0005559462@0(leader).osd e196 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104 Dec 15 05:07:56 localhost ceph-mon[298913]: mon.np0005559462@0(leader).osd e196 do_prune osdmap full prune enabled Dec 15 05:07:56 localhost ceph-mon[298913]: mon.np0005559462@0(leader).osd e197 e197: 6 total, 6 up, 6 in Dec 15 05:07:56 localhost ceph-mon[298913]: log_channel(cluster) log [DBG] : osdmap e197: 6 total, 6 up, 6 in Dec 15 05:07:56 localhost neutron_dhcp_agent[267542]: 2025-12-15 10:07:56.261 267546 INFO neutron.agent.dhcp.agent [None req-cc7ce6a4-3d38-4cb9-af33-84f16ce18721 - - - - - -] DHCP configuration for ports 
{'4d5d3c0e-5d3b-45c5-8b70-f2621ce5b7fc'} is completed#033[00m Dec 15 05:07:56 localhost dnsmasq[329392]: exiting on receipt of SIGTERM Dec 15 05:07:56 localhost podman[329488]: 2025-12-15 10:07:56.529127146 +0000 UTC m=+0.058562764 container kill 2faedaeec19a149046e7243bbd727fb227482d627ff75c71d7f1350c184f53e5 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-c0669abd-aef1-4b0d-9f97-a6adeeac3211, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.license=GPLv2) Dec 15 05:07:56 localhost systemd[1]: libpod-2faedaeec19a149046e7243bbd727fb227482d627ff75c71d7f1350c184f53e5.scope: Deactivated successfully. Dec 15 05:07:56 localhost podman[329500]: 2025-12-15 10:07:56.596077746 +0000 UTC m=+0.056363014 container died 2faedaeec19a149046e7243bbd727fb227482d627ff75c71d7f1350c184f53e5 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-c0669abd-aef1-4b0d-9f97-a6adeeac3211, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team) Dec 15 05:07:56 localhost systemd[1]: tmp-crun.4Iq4Q9.mount: Deactivated successfully. Dec 15 05:07:56 localhost systemd[1]: Started /usr/bin/podman healthcheck run a4e94bd7aaee6c174c601060fb5f34559019ccea93fc397263ee3735731cfc1e. 
Dec 15 05:07:56 localhost podman[329500]: 2025-12-15 10:07:56.646119176 +0000 UTC m=+0.106404414 container cleanup 2faedaeec19a149046e7243bbd727fb227482d627ff75c71d7f1350c184f53e5 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-c0669abd-aef1-4b0d-9f97-a6adeeac3211, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.schema-version=1.0) Dec 15 05:07:56 localhost systemd[1]: libpod-conmon-2faedaeec19a149046e7243bbd727fb227482d627ff75c71d7f1350c184f53e5.scope: Deactivated successfully. Dec 15 05:07:56 localhost podman[329507]: 2025-12-15 10:07:56.669581514 +0000 UTC m=+0.113666732 container remove 2faedaeec19a149046e7243bbd727fb227482d627ff75c71d7f1350c184f53e5 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-c0669abd-aef1-4b0d-9f97-a6adeeac3211, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.build-date=20251202) Dec 15 05:07:56 localhost podman[329528]: 2025-12-15 10:07:56.730958213 +0000 UTC m=+0.087118379 container health_status a4e94bd7aaee6c174c601060fb5f34559019ccea93fc397263ee3735731cfc1e (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, container_name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 
'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter) Dec 15 05:07:56 localhost podman[329528]: 2025-12-15 10:07:56.763635521 +0000 UTC m=+0.119795717 container exec_died a4e94bd7aaee6c174c601060fb5f34559019ccea93fc397263ee3735731cfc1e (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, managed_by=edpm_ansible, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi ) Dec 15 05:07:56 localhost systemd[1]: a4e94bd7aaee6c174c601060fb5f34559019ccea93fc397263ee3735731cfc1e.service: Deactivated successfully. 
Dec 15 05:07:57 localhost ceph-mon[298913]: mon.np0005559462@0(leader).osd e197 do_prune osdmap full prune enabled Dec 15 05:07:57 localhost ceph-mon[298913]: mon.np0005559462@0(leader).osd e198 e198: 6 total, 6 up, 6 in Dec 15 05:07:57 localhost ceph-mon[298913]: log_channel(cluster) log [DBG] : osdmap e198: 6 total, 6 up, 6 in Dec 15 05:07:57 localhost systemd[1]: var-lib-containers-storage-overlay-991d0175b795e5b544b14e19080682f2362d065cb09718dd91cdd74b675751d6-merged.mount: Deactivated successfully. Dec 15 05:07:57 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-2faedaeec19a149046e7243bbd727fb227482d627ff75c71d7f1350c184f53e5-userdata-shm.mount: Deactivated successfully. Dec 15 05:07:57 localhost podman[329600]: Dec 15 05:07:57 localhost podman[329600]: 2025-12-15 10:07:57.546237201 +0000 UTC m=+0.098554280 container create 4b74b89b0a3f5a17e7d28c4a90393b1cd5b924778e77ea72e403e1f0dd8d4690 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-c0669abd-aef1-4b0d-9f97-a6adeeac3211, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, tcib_managed=true, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0) Dec 15 05:07:57 localhost systemd[1]: Started libpod-conmon-4b74b89b0a3f5a17e7d28c4a90393b1cd5b924778e77ea72e403e1f0dd8d4690.scope. Dec 15 05:07:57 localhost systemd[1]: tmp-crun.7qYo3G.mount: Deactivated successfully. Dec 15 05:07:57 localhost podman[329600]: 2025-12-15 10:07:57.498948365 +0000 UTC m=+0.051265494 image pull quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified Dec 15 05:07:57 localhost systemd[1]: Started libcrun container. 
Dec 15 05:07:57 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/50918d884cadfae09bbf2d4a96970c654920cba50c8559ae0972684b91551e92/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff) Dec 15 05:07:57 localhost podman[329600]: 2025-12-15 10:07:57.628787655 +0000 UTC m=+0.181104745 container init 4b74b89b0a3f5a17e7d28c4a90393b1cd5b924778e77ea72e403e1f0dd8d4690 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-c0669abd-aef1-4b0d-9f97-a6adeeac3211, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0) Dec 15 05:07:57 localhost podman[329600]: 2025-12-15 10:07:57.63810811 +0000 UTC m=+0.190425199 container start 4b74b89b0a3f5a17e7d28c4a90393b1cd5b924778e77ea72e403e1f0dd8d4690 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-c0669abd-aef1-4b0d-9f97-a6adeeac3211, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image) Dec 15 05:07:57 localhost dnsmasq[329619]: started, version 2.85 cachesize 150 Dec 15 05:07:57 localhost dnsmasq[329619]: DNS service limited to local subnets Dec 15 05:07:57 localhost dnsmasq[329619]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile Dec 15 05:07:57 localhost dnsmasq[329619]: warning: no upstream servers 
configured Dec 15 05:07:57 localhost dnsmasq-dhcp[329619]: DHCPv6, static leases only on 2001:db8::, lease time 1d Dec 15 05:07:57 localhost dnsmasq[329619]: read /var/lib/neutron/dhcp/c0669abd-aef1-4b0d-9f97-a6adeeac3211/addn_hosts - 0 addresses Dec 15 05:07:57 localhost dnsmasq-dhcp[329619]: read /var/lib/neutron/dhcp/c0669abd-aef1-4b0d-9f97-a6adeeac3211/host Dec 15 05:07:57 localhost dnsmasq-dhcp[329619]: read /var/lib/neutron/dhcp/c0669abd-aef1-4b0d-9f97-a6adeeac3211/opts Dec 15 05:07:58 localhost dnsmasq[329619]: read /var/lib/neutron/dhcp/c0669abd-aef1-4b0d-9f97-a6adeeac3211/addn_hosts - 0 addresses Dec 15 05:07:58 localhost dnsmasq-dhcp[329619]: read /var/lib/neutron/dhcp/c0669abd-aef1-4b0d-9f97-a6adeeac3211/host Dec 15 05:07:58 localhost dnsmasq-dhcp[329619]: read /var/lib/neutron/dhcp/c0669abd-aef1-4b0d-9f97-a6adeeac3211/opts Dec 15 05:07:58 localhost podman[329637]: 2025-12-15 10:07:58.341247508 +0000 UTC m=+0.059234592 container kill 4b74b89b0a3f5a17e7d28c4a90393b1cd5b924778e77ea72e403e1f0dd8d4690 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-c0669abd-aef1-4b0d-9f97-a6adeeac3211, org.label-schema.license=GPLv2, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3) Dec 15 05:07:58 localhost systemd[1]: tmp-crun.n3ckbk.mount: Deactivated successfully. 
Dec 15 05:07:58 localhost neutron_dhcp_agent[267542]: 2025-12-15 10:07:58.365 267546 INFO neutron.agent.dhcp.agent [None req-7e1514da-c2db-4f2f-a1e8-512a99912246 - - - - - -] DHCP configuration for ports {'79503367-f53f-4b35-8760-76fcaa4d8407', 'c645485a-086d-48ea-a19e-1922e8d2dbd9'} is completed#033[00m Dec 15 05:07:58 localhost nova_compute[286344]: 2025-12-15 10:07:58.435 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:07:59 localhost dnsmasq[329619]: read /var/lib/neutron/dhcp/c0669abd-aef1-4b0d-9f97-a6adeeac3211/addn_hosts - 0 addresses Dec 15 05:07:59 localhost dnsmasq-dhcp[329619]: read /var/lib/neutron/dhcp/c0669abd-aef1-4b0d-9f97-a6adeeac3211/host Dec 15 05:07:59 localhost dnsmasq-dhcp[329619]: read /var/lib/neutron/dhcp/c0669abd-aef1-4b0d-9f97-a6adeeac3211/opts Dec 15 05:07:59 localhost podman[329675]: 2025-12-15 10:07:59.095522258 +0000 UTC m=+0.059390836 container kill 4b74b89b0a3f5a17e7d28c4a90393b1cd5b924778e77ea72e403e1f0dd8d4690 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-c0669abd-aef1-4b0d-9f97-a6adeeac3211, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team) Dec 15 05:07:59 localhost neutron_dhcp_agent[267542]: 2025-12-15 10:07:59.111 267546 INFO neutron.agent.dhcp.agent [None req-d2de7ff8-7bf7-47bb-8080-b6f707d95d2c - - - - - -] DHCP configuration for ports {'79503367-f53f-4b35-8760-76fcaa4d8407', 'c645485a-086d-48ea-a19e-1922e8d2dbd9'} is completed#033[00m Dec 15 05:08:00 localhost neutron_dhcp_agent[267542]: 2025-12-15 10:08:00.083 267546 INFO neutron.agent.dhcp.agent [None 
req-b9cbecaa-c7ed-489c-b4de-c676c424d2ab - - - - - -] DHCP configuration for ports {'79503367-f53f-4b35-8760-76fcaa4d8407', 'c645485a-086d-48ea-a19e-1922e8d2dbd9'} is completed#033[00m Dec 15 05:08:00 localhost nova_compute[286344]: 2025-12-15 10:08:00.947 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:08:01 localhost ceph-mon[298913]: mon.np0005559462@0(leader).osd e198 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104 Dec 15 05:08:01 localhost ceph-mon[298913]: mon.np0005559462@0(leader) e17 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) Dec 15 05:08:01 localhost ceph-mon[298913]: log_channel(audit) log [DBG] : from='client.25625 172.18.0.34:0/382777224' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch Dec 15 05:08:01 localhost podman[243449]: time="2025-12-15T10:08:01Z" level=info msg="List containers: received `last` parameter - overwriting `limit`" Dec 15 05:08:01 localhost podman[243449]: @ - - [15/Dec/2025:10:08:01 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 160284 "" "Go-http-client/1.1" Dec 15 05:08:01 localhost podman[243449]: @ - - [15/Dec/2025:10:08:01 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 20197 "" "Go-http-client/1.1" Dec 15 05:08:02 localhost podman[329804]: 2025-12-15 10:08:02.824933762 +0000 UTC m=+0.106430315 container exec 8dcda56b365b42dc8758aab77a9ec80db304780e449052738f7e4e648ae1ecaf (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-bce17446-41b5-5408-a23e-0b011906b44a-crash-np0005559462, GIT_REPO=https://github.com/ceph/ceph-container.git, com.redhat.component=rhceph-container, url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.tags=rhceph ceph, 
summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., build-date=2025-11-26T19:44:28Z, version=7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhceph, io.openshift.expose-services=, maintainer=Guillaume Abrioux , org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, ceph=True, io.buildah.version=1.41.4, vendor=Red Hat, Inc., GIT_CLEAN=True, vcs-type=git, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, release=1763362218, RELEASE=main, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, description=Red Hat Ceph Storage 7, distribution-scope=public, architecture=x86_64, GIT_BRANCH=main, io.k8s.description=Red Hat Ceph Storage 7, CEPH_POINT_RELEASE=) Dec 15 05:08:02 localhost podman[329804]: 2025-12-15 10:08:02.97122946 +0000 UTC m=+0.252725983 container exec_died 8dcda56b365b42dc8758aab77a9ec80db304780e449052738f7e4e648ae1ecaf (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-bce17446-41b5-5408-a23e-0b011906b44a-crash-np0005559462, GIT_BRANCH=main, vendor=Red Hat, Inc., build-date=2025-11-26T19:44:28Z, io.k8s.description=Red Hat Ceph Storage 7, io.openshift.tags=rhceph ceph, name=rhceph, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, version=7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.buildah.version=1.41.4, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, GIT_CLEAN=True, maintainer=Guillaume Abrioux , ceph=True, url=https://catalog.redhat.com/en/search?searchType=containers, GIT_REPO=https://github.com/ceph/ceph-container.git, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, com.redhat.component=rhceph-container, vcs-type=git, CEPH_POINT_RELEASE=, 
RELEASE=main, io.openshift.expose-services=, distribution-scope=public, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., release=1763362218, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, description=Red Hat Ceph Storage 7) Dec 15 05:08:03 localhost ovn_metadata_agent[160585]: 2025-12-15 10:08:03.269 160590 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:04:2e:5c 2001:db8:0:1:f816:3eff:fe04:2e5c'], port_security=[], type=localport, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': ''}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8:0:1:f816:3eff:fe04:2e5c/64', 'neutron:device_id': 'ovnmeta-c0669abd-aef1-4b0d-9f97-a6adeeac3211', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-c0669abd-aef1-4b0d-9f97-a6adeeac3211', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '89e710ef9f4f48d48a369002db572947', 'neutron:revision_number': '30', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=e47821dc-5f5d-44dc-8a16-54817df4049d, chassis=[], tunnel_key=1, gateway_chassis=[], requested_chassis=[], logical_port=79503367-f53f-4b35-8760-76fcaa4d8407) old=Port_Binding(mac=['fa:16:3e:04:2e:5c 2001:db8::f816:3eff:fe04:2e5c'], external_ids={'neutron:cidrs': '2001:db8::f816:3eff:fe04:2e5c/64', 'neutron:device_id': 'ovnmeta-c0669abd-aef1-4b0d-9f97-a6adeeac3211', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-c0669abd-aef1-4b0d-9f97-a6adeeac3211', 
'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '89e710ef9f4f48d48a369002db572947', 'neutron:revision_number': '28', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Dec 15 05:08:03 localhost ovn_metadata_agent[160585]: 2025-12-15 10:08:03.273 160590 INFO neutron.agent.ovn.metadata.agent [-] Metadata Port 79503367-f53f-4b35-8760-76fcaa4d8407 in datapath c0669abd-aef1-4b0d-9f97-a6adeeac3211 updated#033[00m Dec 15 05:08:03 localhost ovn_metadata_agent[160585]: 2025-12-15 10:08:03.276 160590 DEBUG neutron.agent.ovn.metadata.agent [-] Port ad00b7a9-0b2b-4651-a26a-374b2bd1f287 IP addresses were not retrieved from the Port_Binding MAC column ['unknown'] _get_port_ips /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:536#033[00m Dec 15 05:08:03 localhost ovn_metadata_agent[160585]: 2025-12-15 10:08:03.276 160590 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network c0669abd-aef1-4b0d-9f97-a6adeeac3211, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m Dec 15 05:08:03 localhost ovn_metadata_agent[160585]: 2025-12-15 10:08:03.278 160858 DEBUG oslo.privsep.daemon [-] privsep: reply[2d4aaf20-6e1b-4f3a-8e2d-c1aa2dbeb1ec]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Dec 15 05:08:03 localhost nova_compute[286344]: 2025-12-15 10:08:03.439 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:08:03 localhost ceph-mon[298913]: mon.np0005559462@0(leader) e17 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005559463.localdomain.devices.0}] v 0) Dec 15 05:08:03 localhost 
ceph-mon[298913]: log_channel(audit) log [INF] : from='mgr.34411 ' entity='mgr.np0005559464.aomnqe' Dec 15 05:08:03 localhost ceph-mon[298913]: mon.np0005559462@0(leader) e17 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005559463.localdomain}] v 0) Dec 15 05:08:03 localhost ceph-mon[298913]: mon.np0005559462@0(leader) e17 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005559462.localdomain.devices.0}] v 0) Dec 15 05:08:03 localhost ceph-mon[298913]: log_channel(audit) log [INF] : from='mgr.34411 ' entity='mgr.np0005559464.aomnqe' Dec 15 05:08:03 localhost ceph-mon[298913]: log_channel(audit) log [INF] : from='mgr.34411 ' entity='mgr.np0005559464.aomnqe' Dec 15 05:08:03 localhost ceph-mon[298913]: mon.np0005559462@0(leader) e17 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005559462.localdomain}] v 0) Dec 15 05:08:03 localhost ceph-mon[298913]: log_channel(audit) log [INF] : from='mgr.34411 ' entity='mgr.np0005559464.aomnqe' Dec 15 05:08:03 localhost ceph-mon[298913]: mon.np0005559462@0(leader) e17 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005559464.localdomain.devices.0}] v 0) Dec 15 05:08:03 localhost ceph-mon[298913]: log_channel(audit) log [INF] : from='mgr.34411 ' entity='mgr.np0005559464.aomnqe' Dec 15 05:08:03 localhost ceph-mon[298913]: mon.np0005559462@0(leader) e17 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005559464.localdomain}] v 0) Dec 15 05:08:03 localhost ceph-mon[298913]: log_channel(audit) log [INF] : from='mgr.34411 ' entity='mgr.np0005559464.aomnqe' Dec 15 05:08:03 localhost ceph-mon[298913]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #58. Immutable memtables: 0. 
Dec 15 05:08:03 localhost ceph-mon[298913]: rocksdb: (Original Log Time 2025/12/15-10:08:03.802814) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0 Dec 15 05:08:03 localhost ceph-mon[298913]: rocksdb: [db/flush_job.cc:856] [default] [JOB 33] Flushing memtable with next log file: 58 Dec 15 05:08:03 localhost ceph-mon[298913]: rocksdb: EVENT_LOG_v1 {"time_micros": 1765793283803089, "job": 33, "event": "flush_started", "num_memtables": 1, "num_entries": 1542, "num_deletes": 260, "total_data_size": 1529557, "memory_usage": 1565648, "flush_reason": "Manual Compaction"} Dec 15 05:08:03 localhost ceph-mon[298913]: rocksdb: [db/flush_job.cc:885] [default] [JOB 33] Level-0 flush table #59: started Dec 15 05:08:03 localhost ceph-mon[298913]: rocksdb: EVENT_LOG_v1 {"time_micros": 1765793283817350, "cf_name": "default", "job": 33, "event": "table_file_creation", "file_number": 59, "file_size": 1498946, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 31328, "largest_seqno": 32869, "table_properties": {"data_size": 1491895, "index_size": 4072, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 1925, "raw_key_size": 16493, "raw_average_key_size": 21, "raw_value_size": 1477266, "raw_average_value_size": 1964, "num_data_blocks": 171, "num_entries": 752, "num_filter_entries": 752, "num_deletions": 260, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; 
max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1765793200, "oldest_key_time": 1765793200, "file_creation_time": 1765793283, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "603b24af-e2be-4214-bc56-9e652eb4af3d", "db_session_id": "0OJRM9SCUA16EXV0VQZ2", "orig_file_number": 59, "seqno_to_time_mapping": "N/A"}} Dec 15 05:08:03 localhost ceph-mon[298913]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 33] Flush lasted 14536 microseconds, and 5169 cpu microseconds. Dec 15 05:08:03 localhost ceph-mon[298913]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed. Dec 15 05:08:03 localhost ceph-mon[298913]: rocksdb: (Original Log Time 2025/12/15-10:08:03.817405) [db/flush_job.cc:967] [default] [JOB 33] Level-0 flush table #59: 1498946 bytes OK Dec 15 05:08:03 localhost ceph-mon[298913]: rocksdb: (Original Log Time 2025/12/15-10:08:03.817430) [db/memtable_list.cc:519] [default] Level-0 commit table #59 started Dec 15 05:08:03 localhost ceph-mon[298913]: rocksdb: (Original Log Time 2025/12/15-10:08:03.819573) [db/memtable_list.cc:722] [default] Level-0 commit table #59: memtable #1 done Dec 15 05:08:03 localhost ceph-mon[298913]: rocksdb: (Original Log Time 2025/12/15-10:08:03.819593) EVENT_LOG_v1 {"time_micros": 1765793283819587, "job": 33, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0} Dec 15 05:08:03 localhost ceph-mon[298913]: rocksdb: (Original Log Time 2025/12/15-10:08:03.819614) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25 Dec 15 05:08:03 localhost ceph-mon[298913]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 33] Try to delete WAL files size 1522523, prev total WAL file 
size 1522523, number of live WAL files 2. Dec 15 05:08:03 localhost ceph-mon[298913]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005559462/store.db/000055.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000 Dec 15 05:08:03 localhost ceph-mon[298913]: rocksdb: (Original Log Time 2025/12/15-10:08:03.820207) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '7061786F73003132323939' seq:72057594037927935, type:22 .. '7061786F73003132353531' seq:0, type:0; will stop at (end) Dec 15 05:08:03 localhost ceph-mon[298913]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 34] Compacting 1@0 + 1@6 files to L6, score -1.00 Dec 15 05:08:03 localhost ceph-mon[298913]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 33 Base level 0, inputs: [59(1463KB)], [57(16MB)] Dec 15 05:08:03 localhost ceph-mon[298913]: rocksdb: EVENT_LOG_v1 {"time_micros": 1765793283820262, "job": 34, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [59], "files_L6": [57], "score": -1, "input_data_size": 18593922, "oldest_snapshot_seqno": -1} Dec 15 05:08:03 localhost ceph-mon[298913]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 34] Generated table #60: 13206 keys, 17409562 bytes, temperature: kUnknown Dec 15 05:08:03 localhost ceph-mon[298913]: rocksdb: EVENT_LOG_v1 {"time_micros": 1765793283929162, "cf_name": "default", "job": 34, "event": "table_file_creation", "file_number": 60, "file_size": 17409562, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 17335279, "index_size": 40198, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 33029, "raw_key_size": 354612, "raw_average_key_size": 26, "raw_value_size": 
17111235, "raw_average_value_size": 1295, "num_data_blocks": 1512, "num_entries": 13206, "num_filter_entries": 13206, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1765792320, "oldest_key_time": 0, "file_creation_time": 1765793283, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "603b24af-e2be-4214-bc56-9e652eb4af3d", "db_session_id": "0OJRM9SCUA16EXV0VQZ2", "orig_file_number": 60, "seqno_to_time_mapping": "N/A"}} Dec 15 05:08:03 localhost ceph-mon[298913]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed. 
Dec 15 05:08:03 localhost ceph-mon[298913]: rocksdb: (Original Log Time 2025/12/15-10:08:03.929476) [db/compaction/compaction_job.cc:1663] [default] [JOB 34] Compacted 1@0 + 1@6 files to L6 => 17409562 bytes Dec 15 05:08:03 localhost ceph-mon[298913]: rocksdb: (Original Log Time 2025/12/15-10:08:03.955898) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 170.6 rd, 159.7 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(1.4, 16.3 +0.0 blob) out(16.6 +0.0 blob), read-write-amplify(24.0) write-amplify(11.6) OK, records in: 13749, records dropped: 543 output_compression: NoCompression Dec 15 05:08:03 localhost ceph-mon[298913]: rocksdb: (Original Log Time 2025/12/15-10:08:03.955925) EVENT_LOG_v1 {"time_micros": 1765793283955913, "job": 34, "event": "compaction_finished", "compaction_time_micros": 109005, "compaction_time_cpu_micros": 51233, "output_level": 6, "num_output_files": 1, "total_output_size": 17409562, "num_input_records": 13749, "num_output_records": 13206, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]} Dec 15 05:08:03 localhost ceph-mon[298913]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005559462/store.db/000059.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000 Dec 15 05:08:03 localhost ceph-mon[298913]: rocksdb: EVENT_LOG_v1 {"time_micros": 1765793283956255, "job": 34, "event": "table_file_deletion", "file_number": 59} Dec 15 05:08:03 localhost ceph-mon[298913]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005559462/store.db/000057.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000 Dec 15 05:08:03 localhost ceph-mon[298913]: rocksdb: EVENT_LOG_v1 {"time_micros": 1765793283958575, 
"job": 34, "event": "table_file_deletion", "file_number": 57} Dec 15 05:08:03 localhost ceph-mon[298913]: rocksdb: (Original Log Time 2025/12/15-10:08:03.820144) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting Dec 15 05:08:03 localhost ceph-mon[298913]: rocksdb: (Original Log Time 2025/12/15-10:08:03.958673) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting Dec 15 05:08:03 localhost ceph-mon[298913]: rocksdb: (Original Log Time 2025/12/15-10:08:03.958681) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting Dec 15 05:08:03 localhost ceph-mon[298913]: rocksdb: (Original Log Time 2025/12/15-10:08:03.958684) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting Dec 15 05:08:03 localhost ceph-mon[298913]: rocksdb: (Original Log Time 2025/12/15-10:08:03.958687) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting Dec 15 05:08:03 localhost ceph-mon[298913]: rocksdb: (Original Log Time 2025/12/15-10:08:03.958690) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting Dec 15 05:08:04 localhost ceph-mon[298913]: from='mgr.34411 ' entity='mgr.np0005559464.aomnqe' Dec 15 05:08:04 localhost ceph-mon[298913]: from='mgr.34411 ' entity='mgr.np0005559464.aomnqe' Dec 15 05:08:04 localhost ceph-mon[298913]: from='mgr.34411 ' entity='mgr.np0005559464.aomnqe' Dec 15 05:08:04 localhost ceph-mon[298913]: from='mgr.34411 ' entity='mgr.np0005559464.aomnqe' Dec 15 05:08:04 localhost ceph-mon[298913]: from='mgr.34411 ' entity='mgr.np0005559464.aomnqe' Dec 15 05:08:04 localhost ceph-mon[298913]: from='mgr.34411 ' entity='mgr.np0005559464.aomnqe' Dec 15 05:08:04 localhost ceph-mon[298913]: mon.np0005559462@0(leader) e17 handle_command mon_command({"prefix": "config rm", "who": "osd.0", "name": "osd_memory_target"} v 0) Dec 15 05:08:04 localhost ceph-mon[298913]: log_channel(audit) log [INF] : from='mgr.34411 ' 
entity='mgr.np0005559464.aomnqe' cmd={"prefix": "config rm", "who": "osd.0", "name": "osd_memory_target"} : dispatch Dec 15 05:08:04 localhost ceph-mon[298913]: mon.np0005559462@0(leader) e17 handle_command mon_command({"prefix": "config rm", "who": "osd.2", "name": "osd_memory_target"} v 0) Dec 15 05:08:04 localhost ceph-mon[298913]: log_channel(audit) log [INF] : from='mgr.34411 ' entity='mgr.np0005559464.aomnqe' cmd={"prefix": "config rm", "who": "osd.2", "name": "osd_memory_target"} : dispatch Dec 15 05:08:04 localhost ceph-mon[298913]: mon.np0005559462@0(leader) e17 handle_command mon_command({"prefix": "config rm", "who": "osd.1", "name": "osd_memory_target"} v 0) Dec 15 05:08:04 localhost ceph-mon[298913]: log_channel(audit) log [INF] : from='mgr.34411 ' entity='mgr.np0005559464.aomnqe' cmd={"prefix": "config rm", "who": "osd.1", "name": "osd_memory_target"} : dispatch Dec 15 05:08:04 localhost ceph-mon[298913]: mon.np0005559462@0(leader) e17 handle_command mon_command({"prefix": "config rm", "who": "osd.5", "name": "osd_memory_target"} v 0) Dec 15 05:08:04 localhost ceph-mon[298913]: log_channel(audit) log [INF] : from='mgr.34411 ' entity='mgr.np0005559464.aomnqe' cmd={"prefix": "config rm", "who": "osd.5", "name": "osd_memory_target"} : dispatch Dec 15 05:08:04 localhost ceph-mon[298913]: mon.np0005559462@0(leader) e17 handle_command mon_command({"prefix": "config rm", "who": "osd.3", "name": "osd_memory_target"} v 0) Dec 15 05:08:04 localhost ceph-mon[298913]: log_channel(audit) log [INF] : from='mgr.34411 ' entity='mgr.np0005559464.aomnqe' cmd={"prefix": "config rm", "who": "osd.3", "name": "osd_memory_target"} : dispatch Dec 15 05:08:04 localhost ceph-mon[298913]: mon.np0005559462@0(leader) e17 handle_command mon_command({"prefix": "config rm", "who": "osd.4", "name": "osd_memory_target"} v 0) Dec 15 05:08:04 localhost ceph-mon[298913]: log_channel(audit) log [INF] : from='mgr.34411 ' entity='mgr.np0005559464.aomnqe' cmd={"prefix": "config rm", "who": 
"osd.4", "name": "osd_memory_target"} : dispatch Dec 15 05:08:04 localhost ceph-mon[298913]: mon.np0005559462@0(leader) e17 handle_command mon_command([{prefix=config set, name=osd_memory_target}] v 0) Dec 15 05:08:04 localhost ceph-mon[298913]: mon.np0005559462@0(leader) e17 handle_command mon_command([{prefix=config set, name=osd_memory_target}] v 0) Dec 15 05:08:04 localhost ceph-mon[298913]: mon.np0005559462@0(leader) e17 handle_command mon_command([{prefix=config set, name=osd_memory_target}] v 0) Dec 15 05:08:04 localhost ceph-mon[298913]: mon.np0005559462@0(leader) e17 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/osd_remove_queue}] v 0) Dec 15 05:08:04 localhost ceph-mon[298913]: log_channel(audit) log [INF] : from='mgr.34411 ' entity='mgr.np0005559464.aomnqe' Dec 15 05:08:04 localhost openstack_network_exporter[246484]: ERROR 10:08:04 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd Dec 15 05:08:04 localhost openstack_network_exporter[246484]: ERROR 10:08:04 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server Dec 15 05:08:04 localhost openstack_network_exporter[246484]: ERROR 10:08:04 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd Dec 15 05:08:04 localhost openstack_network_exporter[246484]: ERROR 10:08:04 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath Dec 15 05:08:04 localhost openstack_network_exporter[246484]: Dec 15 05:08:04 localhost openstack_network_exporter[246484]: ERROR 10:08:04 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath Dec 15 05:08:04 localhost openstack_network_exporter[246484]: Dec 15 05:08:05 localhost ceph-mon[298913]: from='mgr.34411 172.18.0.108:0/929118679' entity='mgr.np0005559464.aomnqe' cmd={"prefix": "config rm", "who": "osd.0", "name": "osd_memory_target"} : dispatch Dec 15 05:08:05 
localhost ceph-mon[298913]: from='mgr.34411 172.18.0.108:0/929118679' entity='mgr.np0005559464.aomnqe' cmd={"prefix": "config rm", "who": "osd.2", "name": "osd_memory_target"} : dispatch Dec 15 05:08:05 localhost ceph-mon[298913]: from='mgr.34411 172.18.0.108:0/929118679' entity='mgr.np0005559464.aomnqe' cmd={"prefix": "config rm", "who": "osd.1", "name": "osd_memory_target"} : dispatch Dec 15 05:08:05 localhost ceph-mon[298913]: from='mgr.34411 172.18.0.108:0/929118679' entity='mgr.np0005559464.aomnqe' cmd={"prefix": "config rm", "who": "osd.5", "name": "osd_memory_target"} : dispatch Dec 15 05:08:05 localhost ceph-mon[298913]: from='mgr.34411 172.18.0.108:0/929118679' entity='mgr.np0005559464.aomnqe' cmd={"prefix": "config rm", "who": "osd.3", "name": "osd_memory_target"} : dispatch Dec 15 05:08:05 localhost ceph-mon[298913]: from='mgr.34411 172.18.0.108:0/929118679' entity='mgr.np0005559464.aomnqe' cmd={"prefix": "config rm", "who": "osd.4", "name": "osd_memory_target"} : dispatch Dec 15 05:08:05 localhost ceph-mon[298913]: from='mgr.34411 ' entity='mgr.np0005559464.aomnqe' cmd={"prefix": "config rm", "who": "osd.0", "name": "osd_memory_target"} : dispatch Dec 15 05:08:05 localhost ceph-mon[298913]: Adjusting osd_memory_target on np0005559463.localdomain to 836.6M Dec 15 05:08:05 localhost ceph-mon[298913]: from='mgr.34411 ' entity='mgr.np0005559464.aomnqe' cmd={"prefix": "config rm", "who": "osd.2", "name": "osd_memory_target"} : dispatch Dec 15 05:08:05 localhost ceph-mon[298913]: from='mgr.34411 ' entity='mgr.np0005559464.aomnqe' cmd={"prefix": "config rm", "who": "osd.1", "name": "osd_memory_target"} : dispatch Dec 15 05:08:05 localhost ceph-mon[298913]: Adjusting osd_memory_target on np0005559462.localdomain to 836.6M Dec 15 05:08:05 localhost ceph-mon[298913]: Adjusting osd_memory_target on np0005559464.localdomain to 836.6M Dec 15 05:08:05 localhost ceph-mon[298913]: Unable to set osd_memory_target on np0005559463.localdomain to 877246668: error parsing 
value: Value '877246668' is below minimum 939524096 Dec 15 05:08:05 localhost ceph-mon[298913]: from='mgr.34411 ' entity='mgr.np0005559464.aomnqe' cmd={"prefix": "config rm", "who": "osd.5", "name": "osd_memory_target"} : dispatch Dec 15 05:08:05 localhost ceph-mon[298913]: from='mgr.34411 ' entity='mgr.np0005559464.aomnqe' cmd={"prefix": "config rm", "who": "osd.3", "name": "osd_memory_target"} : dispatch Dec 15 05:08:05 localhost ceph-mon[298913]: Unable to set osd_memory_target on np0005559462.localdomain to 877243801: error parsing value: Value '877243801' is below minimum 939524096 Dec 15 05:08:05 localhost ceph-mon[298913]: from='mgr.34411 ' entity='mgr.np0005559464.aomnqe' cmd={"prefix": "config rm", "who": "osd.4", "name": "osd_memory_target"} : dispatch Dec 15 05:08:05 localhost ceph-mon[298913]: Unable to set osd_memory_target on np0005559464.localdomain to 877246668: error parsing value: Value '877246668' is below minimum 939524096 Dec 15 05:08:05 localhost ceph-mon[298913]: from='mgr.34411 172.18.0.108:0/929118679' entity='mgr.np0005559464.aomnqe' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch Dec 15 05:08:05 localhost ceph-mon[298913]: from='mgr.34411 ' entity='mgr.np0005559464.aomnqe' Dec 15 05:08:05 localhost nova_compute[286344]: 2025-12-15 10:08:05.975 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:08:06 localhost ceph-mon[298913]: mon.np0005559462@0(leader).osd e198 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104 Dec 15 05:08:06 localhost ceph-mon[298913]: mon.np0005559462@0(leader).osd e198 do_prune osdmap full prune enabled Dec 15 05:08:06 localhost ceph-mon[298913]: mon.np0005559462@0(leader).osd e199 e199: 6 total, 6 up, 6 in Dec 15 05:08:06 localhost ceph-mon[298913]: log_channel(cluster) log [DBG] : osdmap e199: 6 total, 6 up, 6 in Dec 15 05:08:06 localhost 
nova_compute[286344]: 2025-12-15 10:08:06.929 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:08:06 localhost ovn_metadata_agent[160585]: 2025-12-15 10:08:06.929 160590 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=16, ssl=[], options={'arp_ns_explicit_output': 'true', 'fdb_removal_limit': '0', 'ignore_lsp_down': 'false', 'mac_binding_removal_limit': '0', 'mac_prefix': 'fe:17:e3', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': 'fe:55:2b:86:15:b5'}, ipsec=False) old=SB_Global(nb_cfg=15) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Dec 15 05:08:06 localhost ovn_metadata_agent[160585]: 2025-12-15 10:08:06.930 160590 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 7 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m Dec 15 05:08:08 localhost nova_compute[286344]: 2025-12-15 10:08:08.484 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:08:08 localhost podman[330027]: 2025-12-15 10:08:08.777514266 +0000 UTC m=+0.064701350 container kill 4b74b89b0a3f5a17e7d28c4a90393b1cd5b924778e77ea72e403e1f0dd8d4690 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-c0669abd-aef1-4b0d-9f97-a6adeeac3211, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.license=GPLv2, org.label-schema.build-date=20251202, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, 
io.buildah.version=1.41.3) Dec 15 05:08:08 localhost dnsmasq[329619]: exiting on receipt of SIGTERM Dec 15 05:08:08 localhost systemd[1]: libpod-4b74b89b0a3f5a17e7d28c4a90393b1cd5b924778e77ea72e403e1f0dd8d4690.scope: Deactivated successfully. Dec 15 05:08:08 localhost podman[330040]: 2025-12-15 10:08:08.845457813 +0000 UTC m=+0.052564900 container died 4b74b89b0a3f5a17e7d28c4a90393b1cd5b924778e77ea72e403e1f0dd8d4690 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-c0669abd-aef1-4b0d-9f97-a6adeeac3211, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3) Dec 15 05:08:08 localhost podman[330040]: 2025-12-15 10:08:08.891867746 +0000 UTC m=+0.098974793 container cleanup 4b74b89b0a3f5a17e7d28c4a90393b1cd5b924778e77ea72e403e1f0dd8d4690 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-c0669abd-aef1-4b0d-9f97-a6adeeac3211, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image) Dec 15 05:08:08 localhost systemd[1]: libpod-conmon-4b74b89b0a3f5a17e7d28c4a90393b1cd5b924778e77ea72e403e1f0dd8d4690.scope: Deactivated successfully. 
Dec 15 05:08:08 localhost podman[330042]: 2025-12-15 10:08:08.920775101 +0000 UTC m=+0.116387885 container remove 4b74b89b0a3f5a17e7d28c4a90393b1cd5b924778e77ea72e403e1f0dd8d4690 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-c0669abd-aef1-4b0d-9f97-a6adeeac3211, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS) Dec 15 05:08:09 localhost neutron_sriov_agent[260044]: 2025-12-15 10:08:09.581 2 INFO neutron.agent.securitygroups_rpc [None req-7ab995ad-2660-4600-af8b-1ca4ed2689cb 6b5da6f221214afe93e1fa66574f238b 89e710ef9f4f48d48a369002db572947 - - default default] Security group member updated ['a6c5f808-dddc-4f17-acbf-63b1b6e6f4d6']#033[00m Dec 15 05:08:09 localhost ceph-mon[298913]: mon.np0005559462@0(leader) e17 handle_command mon_command([{prefix=config-key set, key=mgr/progress/completed}] v 0) Dec 15 05:08:09 localhost ceph-mon[298913]: log_channel(audit) log [INF] : from='mgr.34411 ' entity='mgr.np0005559464.aomnqe' Dec 15 05:08:09 localhost systemd[1]: tmp-crun.6bcWOq.mount: Deactivated successfully. Dec 15 05:08:09 localhost systemd[1]: var-lib-containers-storage-overlay-50918d884cadfae09bbf2d4a96970c654920cba50c8559ae0972684b91551e92-merged.mount: Deactivated successfully. Dec 15 05:08:09 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-4b74b89b0a3f5a17e7d28c4a90393b1cd5b924778e77ea72e403e1f0dd8d4690-userdata-shm.mount: Deactivated successfully. 
Dec 15 05:08:09 localhost ceph-mon[298913]: mon.np0005559462@0(leader) e17 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) Dec 15 05:08:09 localhost ceph-mon[298913]: log_channel(audit) log [DBG] : from='client.25625 172.18.0.34:0/382777224' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch Dec 15 05:08:10 localhost ceph-mon[298913]: from='mgr.34411 ' entity='mgr.np0005559464.aomnqe' Dec 15 05:08:11 localhost nova_compute[286344]: 2025-12-15 10:08:11.028 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:08:11 localhost ceph-mon[298913]: mon.np0005559462@0(leader).osd e199 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104 Dec 15 05:08:11 localhost neutron_sriov_agent[260044]: 2025-12-15 10:08:11.307 2 INFO neutron.agent.securitygroups_rpc [None req-c4dc5dd8-fd27-412f-a1fd-ea5834057ffb 6b5da6f221214afe93e1fa66574f238b 89e710ef9f4f48d48a369002db572947 - - default default] Security group member updated ['a6c5f808-dddc-4f17-acbf-63b1b6e6f4d6']#033[00m Dec 15 05:08:11 localhost podman[330125]: Dec 15 05:08:11 localhost podman[330125]: 2025-12-15 10:08:11.706676832 +0000 UTC m=+0.091589011 container create 2d752ffaf27bf107210fa83d016fa3920b600049e1c6150381650109687cc419 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-c0669abd-aef1-4b0d-9f97-a6adeeac3211, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb) Dec 15 05:08:11 localhost systemd[1]: Started 
libpod-conmon-2d752ffaf27bf107210fa83d016fa3920b600049e1c6150381650109687cc419.scope. Dec 15 05:08:11 localhost systemd[1]: tmp-crun.02SKjt.mount: Deactivated successfully. Dec 15 05:08:11 localhost podman[330125]: 2025-12-15 10:08:11.662241434 +0000 UTC m=+0.047153633 image pull quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified Dec 15 05:08:11 localhost systemd[1]: Started libcrun container. Dec 15 05:08:11 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/1718cb4a742b7cdd7a6257c178830df9895eef30571e925d77517bf77e3887ea/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff) Dec 15 05:08:11 localhost podman[330125]: 2025-12-15 10:08:11.787224742 +0000 UTC m=+0.172136931 container init 2d752ffaf27bf107210fa83d016fa3920b600049e1c6150381650109687cc419 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-c0669abd-aef1-4b0d-9f97-a6adeeac3211, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb) Dec 15 05:08:11 localhost podman[330125]: 2025-12-15 10:08:11.795787464 +0000 UTC m=+0.180699643 container start 2d752ffaf27bf107210fa83d016fa3920b600049e1c6150381650109687cc419 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-c0669abd-aef1-4b0d-9f97-a6adeeac3211, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, tcib_managed=true, io.buildah.version=1.41.3) Dec 15 
05:08:11 localhost dnsmasq[330143]: started, version 2.85 cachesize 150 Dec 15 05:08:11 localhost dnsmasq[330143]: DNS service limited to local subnets Dec 15 05:08:11 localhost dnsmasq[330143]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile Dec 15 05:08:11 localhost dnsmasq[330143]: warning: no upstream servers configured Dec 15 05:08:11 localhost dnsmasq-dhcp[330143]: DHCPv6, static leases only on 2001:db8::, lease time 1d Dec 15 05:08:11 localhost dnsmasq[330143]: read /var/lib/neutron/dhcp/c0669abd-aef1-4b0d-9f97-a6adeeac3211/addn_hosts - 0 addresses Dec 15 05:08:11 localhost dnsmasq-dhcp[330143]: read /var/lib/neutron/dhcp/c0669abd-aef1-4b0d-9f97-a6adeeac3211/host Dec 15 05:08:11 localhost dnsmasq-dhcp[330143]: read /var/lib/neutron/dhcp/c0669abd-aef1-4b0d-9f97-a6adeeac3211/opts Dec 15 05:08:11 localhost neutron_dhcp_agent[267542]: 2025-12-15 10:08:11.862 267546 INFO neutron.agent.dhcp.agent [None req-f7d0f932-667a-4da0-93c8-94a794d7c036 - - - - - -] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-12-15T10:08:09Z, description=, device_id=, device_owner=, dns_assignment=[, ], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[, ], id=065fb945-f19c-43f2-a1f8-498bd36233b0, ip_allocation=immediate, mac_address=fa:16:3e:73:3f:f3, name=tempest-NetworksTestDHCPv6-1667359563, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-12-15T10:05:39Z, description=, dns_domain=, id=c0669abd-aef1-4b0d-9f97-a6adeeac3211, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-NetworksTestDHCPv6-test-network-1861207414, port_security_enabled=True, project_id=89e710ef9f4f48d48a369002db572947, provider:network_type=geneve, 
provider:physical_network=None, provider:segmentation_id=20673, qos_policy_id=None, revision_number=57, router:external=False, shared=False, standard_attr_id=2164, status=ACTIVE, subnets=['9d24b8df-61a1-43f3-a249-cd5a59bed150', 'e0add063-050c-4bdf-b44c-55e8cfc89399'], tags=[], tenant_id=89e710ef9f4f48d48a369002db572947, updated_at=2025-12-15T10:07:59Z, vlan_transparent=None, network_id=c0669abd-aef1-4b0d-9f97-a6adeeac3211, port_security_enabled=True, project_id=89e710ef9f4f48d48a369002db572947, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=['a6c5f808-dddc-4f17-acbf-63b1b6e6f4d6'], standard_attr_id=2872, status=DOWN, tags=[], tenant_id=89e710ef9f4f48d48a369002db572947, updated_at=2025-12-15T10:08:09Z on network c0669abd-aef1-4b0d-9f97-a6adeeac3211#033[00m Dec 15 05:08:12 localhost dnsmasq[330143]: read /var/lib/neutron/dhcp/c0669abd-aef1-4b0d-9f97-a6adeeac3211/addn_hosts - 2 addresses Dec 15 05:08:12 localhost podman[330160]: 2025-12-15 10:08:12.038836563 +0000 UTC m=+0.049050924 container kill 2d752ffaf27bf107210fa83d016fa3920b600049e1c6150381650109687cc419 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-c0669abd-aef1-4b0d-9f97-a6adeeac3211, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0) Dec 15 05:08:12 localhost dnsmasq-dhcp[330143]: read /var/lib/neutron/dhcp/c0669abd-aef1-4b0d-9f97-a6adeeac3211/host Dec 15 05:08:12 localhost dnsmasq-dhcp[330143]: read /var/lib/neutron/dhcp/c0669abd-aef1-4b0d-9f97-a6adeeac3211/opts Dec 15 05:08:12 localhost neutron_dhcp_agent[267542]: 2025-12-15 10:08:12.376 267546 INFO neutron.agent.dhcp.agent [None 
req-e21900c5-453d-48b1-9f56-b000e445af52 - - - - - -] DHCP configuration for ports {'79503367-f53f-4b35-8760-76fcaa4d8407', 'c645485a-086d-48ea-a19e-1922e8d2dbd9'} is completed#033[00m Dec 15 05:08:12 localhost dnsmasq[330143]: read /var/lib/neutron/dhcp/c0669abd-aef1-4b0d-9f97-a6adeeac3211/addn_hosts - 0 addresses Dec 15 05:08:12 localhost dnsmasq-dhcp[330143]: read /var/lib/neutron/dhcp/c0669abd-aef1-4b0d-9f97-a6adeeac3211/host Dec 15 05:08:12 localhost dnsmasq-dhcp[330143]: read /var/lib/neutron/dhcp/c0669abd-aef1-4b0d-9f97-a6adeeac3211/opts Dec 15 05:08:12 localhost podman[330198]: 2025-12-15 10:08:12.388007058 +0000 UTC m=+0.063736294 container kill 2d752ffaf27bf107210fa83d016fa3920b600049e1c6150381650109687cc419 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-c0669abd-aef1-4b0d-9f97-a6adeeac3211, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0) Dec 15 05:08:12 localhost neutron_dhcp_agent[267542]: 2025-12-15 10:08:12.605 267546 INFO neutron.agent.dhcp.agent [None req-7e5445cc-8c87-43e1-b5f7-b892640649dd - - - - - -] DHCP configuration for ports {'065fb945-f19c-43f2-a1f8-498bd36233b0'} is completed#033[00m Dec 15 05:08:13 localhost dnsmasq[330143]: exiting on receipt of SIGTERM Dec 15 05:08:13 localhost podman[330237]: 2025-12-15 10:08:13.039927355 +0000 UTC m=+0.058180554 container kill 2d752ffaf27bf107210fa83d016fa3920b600049e1c6150381650109687cc419 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-c0669abd-aef1-4b0d-9f97-a6adeeac3211, tcib_managed=true, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, 
org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0) Dec 15 05:08:13 localhost systemd[1]: libpod-2d752ffaf27bf107210fa83d016fa3920b600049e1c6150381650109687cc419.scope: Deactivated successfully. Dec 15 05:08:13 localhost podman[330251]: 2025-12-15 10:08:13.119110537 +0000 UTC m=+0.063694053 container died 2d752ffaf27bf107210fa83d016fa3920b600049e1c6150381650109687cc419 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-c0669abd-aef1-4b0d-9f97-a6adeeac3211, org.label-schema.license=GPLv2, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.name=CentOS Stream 9 Base Image) Dec 15 05:08:13 localhost systemd[1]: tmp-crun.DZ4xw1.mount: Deactivated successfully. Dec 15 05:08:13 localhost podman[330251]: 2025-12-15 10:08:13.158163769 +0000 UTC m=+0.102747235 container cleanup 2d752ffaf27bf107210fa83d016fa3920b600049e1c6150381650109687cc419 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-c0669abd-aef1-4b0d-9f97-a6adeeac3211, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true) Dec 15 05:08:13 localhost systemd[1]: libpod-conmon-2d752ffaf27bf107210fa83d016fa3920b600049e1c6150381650109687cc419.scope: Deactivated successfully. 
Dec 15 05:08:13 localhost podman[330252]: 2025-12-15 10:08:13.193779057 +0000 UTC m=+0.133990813 container remove 2d752ffaf27bf107210fa83d016fa3920b600049e1c6150381650109687cc419 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-c0669abd-aef1-4b0d-9f97-a6adeeac3211, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true) Dec 15 05:08:13 localhost nova_compute[286344]: 2025-12-15 10:08:13.528 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:08:13 localhost systemd[1]: var-lib-containers-storage-overlay-1718cb4a742b7cdd7a6257c178830df9895eef30571e925d77517bf77e3887ea-merged.mount: Deactivated successfully. Dec 15 05:08:13 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-2d752ffaf27bf107210fa83d016fa3920b600049e1c6150381650109687cc419-userdata-shm.mount: Deactivated successfully. 
Dec 15 05:08:13 localhost ovn_metadata_agent[160585]: 2025-12-15 10:08:13.932 160590 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=12d96d64-e862-4f68-81e5-8d9ec5d3a5e2, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '16'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m Dec 15 05:08:14 localhost podman[330329]: Dec 15 05:08:14 localhost podman[330329]: 2025-12-15 10:08:14.144756965 +0000 UTC m=+0.105608253 container create eb0edf72e8cb50165ee77d52f0a438feba72b0cf071ad96eaf92f97a7cc82e90 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-c0669abd-aef1-4b0d-9f97-a6adeeac3211, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, org.label-schema.vendor=CentOS) Dec 15 05:08:14 localhost systemd[1]: Started libpod-conmon-eb0edf72e8cb50165ee77d52f0a438feba72b0cf071ad96eaf92f97a7cc82e90.scope. Dec 15 05:08:14 localhost podman[330329]: 2025-12-15 10:08:14.083293224 +0000 UTC m=+0.044144542 image pull quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified Dec 15 05:08:14 localhost systemd[1]: tmp-crun.4s7Ftn.mount: Deactivated successfully. Dec 15 05:08:14 localhost systemd[1]: Started libcrun container. 
Dec 15 05:08:14 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/8bc61f5fb67bd8a7bb91dcb067157be68a9c5f6d1fd540f1ac7b1cca57a9bb37/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff) Dec 15 05:08:14 localhost podman[330329]: 2025-12-15 10:08:14.216191547 +0000 UTC m=+0.177042835 container init eb0edf72e8cb50165ee77d52f0a438feba72b0cf071ad96eaf92f97a7cc82e90 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-c0669abd-aef1-4b0d-9f97-a6adeeac3211, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.license=GPLv2) Dec 15 05:08:14 localhost podman[330329]: 2025-12-15 10:08:14.224728069 +0000 UTC m=+0.185579367 container start eb0edf72e8cb50165ee77d52f0a438feba72b0cf071ad96eaf92f97a7cc82e90 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-c0669abd-aef1-4b0d-9f97-a6adeeac3211, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS) Dec 15 05:08:14 localhost dnsmasq[330348]: started, version 2.85 cachesize 150 Dec 15 05:08:14 localhost dnsmasq[330348]: DNS service limited to local subnets Dec 15 05:08:14 localhost dnsmasq[330348]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile Dec 15 05:08:14 localhost dnsmasq[330348]: warning: no upstream servers 
configured Dec 15 05:08:14 localhost dnsmasq[330348]: read /var/lib/neutron/dhcp/c0669abd-aef1-4b0d-9f97-a6adeeac3211/addn_hosts - 0 addresses Dec 15 05:08:14 localhost nova_compute[286344]: 2025-12-15 10:08:14.270 286348 DEBUG oslo_service.periodic_task [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Dec 15 05:08:14 localhost nova_compute[286344]: 2025-12-15 10:08:14.270 286348 DEBUG oslo_service.periodic_task [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Dec 15 05:08:14 localhost dnsmasq[330348]: exiting on receipt of SIGTERM Dec 15 05:08:14 localhost podman[330367]: 2025-12-15 10:08:14.946643499 +0000 UTC m=+0.095096947 container kill eb0edf72e8cb50165ee77d52f0a438feba72b0cf071ad96eaf92f97a7cc82e90 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-c0669abd-aef1-4b0d-9f97-a6adeeac3211, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2) Dec 15 05:08:14 localhost systemd[1]: libpod-eb0edf72e8cb50165ee77d52f0a438feba72b0cf071ad96eaf92f97a7cc82e90.scope: Deactivated successfully. Dec 15 05:08:14 localhost systemd[1]: Started /usr/bin/podman healthcheck run 67ee72fcd99c896f79e8807764b1d0f04e7d45927d06e306c0986394e8ed42a0. Dec 15 05:08:14 localhost systemd[1]: Started /usr/bin/podman healthcheck run ac2eae30a2a7f971398af42ea67b3ed799d4b3341f7688ebcd73f7ca7f66c2a8. 
Dec 15 05:08:14 localhost systemd[1]: Started /usr/bin/podman healthcheck run b3d8483ade539500e228c2af9c0c3e647febb5ec7903610a83aea32631007d4a. Dec 15 05:08:14 localhost systemd[1]: Started /usr/bin/podman healthcheck run 730c50020a7d9e2da21d2462d40af20ac303e43f3491f692cbbd87f432ba3e09. Dec 15 05:08:15 localhost systemd[1]: Started /usr/bin/podman healthcheck run 9f0f481c937606298203797a71924b7614a5d9e3f4a8afd837cfa5b9602aedeb. Dec 15 05:08:15 localhost neutron_dhcp_agent[267542]: 2025-12-15 10:08:15.075 267546 INFO neutron.agent.dhcp.agent [None req-c2159282-71bf-44fb-9499-1b46ba34e250 - - - - - -] DHCP configuration for ports {'79503367-f53f-4b35-8760-76fcaa4d8407', 'c645485a-086d-48ea-a19e-1922e8d2dbd9'} is completed#033[00m Dec 15 05:08:15 localhost podman[330397]: 2025-12-15 10:08:15.103093813 +0000 UTC m=+0.126670755 container health_status 730c50020a7d9e2da21d2462d40af20ac303e43f3491f692cbbd87f432ba3e09 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'cb1e9478634a00c8fb5ad9cd7fcd38c7b2a1a85187dfaa618bc715c24c2b15fb'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', 
'/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., version=9.6, maintainer=Red Hat, Inc., distribution-scope=public, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, com.redhat.component=ubi9-minimal-container, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., name=ubi9-minimal, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, architecture=x86_64, build-date=2025-08-20T13:12:41, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, release=1755695350, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, config_id=openstack_network_exporter, container_name=openstack_network_exporter, vendor=Red Hat, Inc., io.openshift.tags=minimal rhel9, io.buildah.version=1.33.7) Dec 15 05:08:15 localhost podman[330397]: 2025-12-15 10:08:15.116433936 +0000 UTC m=+0.140010868 container exec_died 730c50020a7d9e2da21d2462d40af20ac303e43f3491f692cbbd87f432ba3e09 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, url=https://catalog.redhat.com/en/search?searchType=containers, description=The Universal Base Image Minimal is a stripped down image 
that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-type=git, version=9.6, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=openstack_network_exporter, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.buildah.version=1.33.7, build-date=2025-08-20T13:12:41, config_id=openstack_network_exporter, io.openshift.expose-services=, maintainer=Red Hat, Inc., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., managed_by=edpm_ansible, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'cb1e9478634a00c8fb5ad9cd7fcd38c7b2a1a85187dfaa618bc715c24c2b15fb'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', 
'/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vendor=Red Hat, Inc., release=1755695350, distribution-scope=public, name=ubi9-minimal, com.redhat.component=ubi9-minimal-container) Dec 15 05:08:15 localhost systemd[1]: 730c50020a7d9e2da21d2462d40af20ac303e43f3491f692cbbd87f432ba3e09.service: Deactivated successfully. Dec 15 05:08:15 localhost podman[330391]: 2025-12-15 10:08:15.16107865 +0000 UTC m=+0.190407838 container health_status b3d8483ade539500e228c2af9c0c3e647febb5ec7903610a83aea32631007d4a (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '07f3af15d79276418f54a8dde2fc251064527c3571430e14af77aaa2c01f1193-cb1e9478634a00c8fb5ad9cd7fcd38c7b2a1a85187dfaa618bc715c24c2b15fb'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', 
'/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, tcib_managed=true, config_id=ceilometer_agent_compute, container_name=ceilometer_agent_compute, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS) Dec 15 05:08:15 localhost podman[330381]: 2025-12-15 10:08:15.174887015 +0000 UTC m=+0.217153356 container died eb0edf72e8cb50165ee77d52f0a438feba72b0cf071ad96eaf92f97a7cc82e90 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-c0669abd-aef1-4b0d-9f97-a6adeeac3211, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_managed=true, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202) Dec 15 05:08:15 localhost podman[330381]: 2025-12-15 10:08:15.200108051 +0000 UTC m=+0.242374392 container cleanup eb0edf72e8cb50165ee77d52f0a438feba72b0cf071ad96eaf92f97a7cc82e90 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-c0669abd-aef1-4b0d-9f97-a6adeeac3211, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true) Dec 15 05:08:15 localhost systemd[1]: libpod-conmon-eb0edf72e8cb50165ee77d52f0a438feba72b0cf071ad96eaf92f97a7cc82e90.scope: Deactivated successfully. 
Dec 15 05:08:15 localhost podman[330390]: 2025-12-15 10:08:15.073845678 +0000 UTC m=+0.101189053 container health_status ac2eae30a2a7f971398af42ea67b3ed799d4b3341f7688ebcd73f7ca7f66c2a8 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '07f3af15d79276418f54a8dde2fc251064527c3571430e14af77aaa2c01f1193'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ovn_controller) Dec 15 05:08:15 localhost podman[330391]: 2025-12-15 10:08:15.253332478 +0000 UTC m=+0.282661866 container exec_died b3d8483ade539500e228c2af9c0c3e647febb5ec7903610a83aea32631007d4a (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.name=CentOS Stream 9 Base Image, 
container_name=ceilometer_agent_compute, tcib_managed=true, org.label-schema.schema-version=1.0, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '07f3af15d79276418f54a8dde2fc251064527c3571430e14af77aaa2c01f1193-cb1e9478634a00c8fb5ad9cd7fcd38c7b2a1a85187dfaa618bc715c24c2b15fb'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_id=ceilometer_agent_compute, managed_by=edpm_ansible) Dec 15 05:08:15 localhost systemd[1]: b3d8483ade539500e228c2af9c0c3e647febb5ec7903610a83aea32631007d4a.service: Deactivated successfully. 
Dec 15 05:08:15 localhost nova_compute[286344]: 2025-12-15 10:08:15.270 286348 DEBUG oslo_service.periodic_task [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Dec 15 05:08:15 localhost podman[330383]: 2025-12-15 10:08:15.275530402 +0000 UTC m=+0.307175894 container remove eb0edf72e8cb50165ee77d52f0a438feba72b0cf071ad96eaf92f97a7cc82e90 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-c0669abd-aef1-4b0d-9f97-a6adeeac3211, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.license=GPLv2) Dec 15 05:08:15 localhost ovn_controller[154603]: 2025-12-15T10:08:15Z|00410|binding|INFO|Releasing lport c645485a-086d-48ea-a19e-1922e8d2dbd9 from this chassis (sb_readonly=0) Dec 15 05:08:15 localhost kernel: device tapc645485a-08 left promiscuous mode Dec 15 05:08:15 localhost ovn_controller[154603]: 2025-12-15T10:08:15Z|00411|binding|INFO|Setting lport c645485a-086d-48ea-a19e-1922e8d2dbd9 down in Southbound Dec 15 05:08:15 localhost nova_compute[286344]: 2025-12-15 10:08:15.289 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:08:15 localhost podman[330389]: 2025-12-15 10:08:15.257336687 +0000 UTC m=+0.283239462 container health_status 67ee72fcd99c896f79e8807764b1d0f04e7d45927d06e306c0986394e8ed42a0 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, maintainer=The Prometheus Authors , 
managed_by=edpm_ansible, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter) Dec 15 05:08:15 localhost nova_compute[286344]: 2025-12-15 10:08:15.313 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:08:15 localhost ovn_metadata_agent[160585]: 2025-12-15 10:08:15.321 160590 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005559462.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8:0:1:f816:3eff:fe31:8e8e/64 2001:db8::2/64', 'neutron:device_id': 
'dhcpce287b4b-a043-5ed9-8e08-4fd555d768a2-c0669abd-aef1-4b0d-9f97-a6adeeac3211', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-c0669abd-aef1-4b0d-9f97-a6adeeac3211', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '89e710ef9f4f48d48a369002db572947', 'neutron:revision_number': '12', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'np0005559462.localdomain'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=e47821dc-5f5d-44dc-8a16-54817df4049d, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=c645485a-086d-48ea-a19e-1922e8d2dbd9) old=Port_Binding(up=[True], chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Dec 15 05:08:15 localhost podman[330454]: 2025-12-15 10:08:15.323403594 +0000 UTC m=+0.240933622 container health_status 9f0f481c937606298203797a71924b7614a5d9e3f4a8afd837cfa5b9602aedeb (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, config_id=multipathd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', 
'/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, tcib_managed=true) Dec 15 05:08:15 localhost ovn_metadata_agent[160585]: 2025-12-15 10:08:15.324 160590 INFO neutron.agent.ovn.metadata.agent [-] Port c645485a-086d-48ea-a19e-1922e8d2dbd9 in datapath c0669abd-aef1-4b0d-9f97-a6adeeac3211 unbound from our chassis#033[00m Dec 15 05:08:15 localhost ovn_metadata_agent[160585]: 2025-12-15 10:08:15.326 160590 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network c0669abd-aef1-4b0d-9f97-a6adeeac3211, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m Dec 15 05:08:15 localhost ovn_metadata_agent[160585]: 2025-12-15 10:08:15.327 160858 DEBUG oslo.privsep.daemon [-] privsep: reply[3acd226b-3486-41fc-a2be-2135dec34b11]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Dec 15 05:08:15 localhost podman[330389]: 2025-12-15 10:08:15.344895707 +0000 UTC m=+0.370798552 container exec_died 67ee72fcd99c896f79e8807764b1d0f04e7d45927d06e306c0986394e8ed42a0 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, container_name=node_exporter, maintainer=The 
Prometheus Authors , managed_by=edpm_ansible, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter) Dec 15 05:08:15 localhost systemd[1]: 67ee72fcd99c896f79e8807764b1d0f04e7d45927d06e306c0986394e8ed42a0.service: Deactivated successfully. 
Dec 15 05:08:15 localhost podman[330390]: 2025-12-15 10:08:15.359647989 +0000 UTC m=+0.386991354 container exec_died ac2eae30a2a7f971398af42ea67b3ed799d4b3341f7688ebcd73f7ca7f66c2a8 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, io.buildah.version=1.41.3, config_id=ovn_controller, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '07f3af15d79276418f54a8dde2fc251064527c3571430e14af77aaa2c01f1193'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS) Dec 15 05:08:15 localhost systemd[1]: ac2eae30a2a7f971398af42ea67b3ed799d4b3341f7688ebcd73f7ca7f66c2a8.service: Deactivated successfully. 
Dec 15 05:08:15 localhost podman[330454]: 2025-12-15 10:08:15.412386362 +0000 UTC m=+0.329916340 container exec_died 9f0f481c937606298203797a71924b7614a5d9e3f4a8afd837cfa5b9602aedeb (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=multipathd, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, container_name=multipathd, org.label-schema.license=GPLv2, org.label-schema.build-date=20251202) Dec 15 05:08:15 localhost systemd[1]: 9f0f481c937606298203797a71924b7614a5d9e3f4a8afd837cfa5b9602aedeb.service: Deactivated successfully. 
Dec 15 05:08:15 localhost ceph-mon[298913]: mon.np0005559462@0(leader) e17 handle_command mon_command({"prefix":"df", "format":"json"} v 0) Dec 15 05:08:15 localhost ceph-mon[298913]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/2180970392' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch Dec 15 05:08:15 localhost ceph-mon[298913]: mon.np0005559462@0(leader) e17 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0) Dec 15 05:08:15 localhost ceph-mon[298913]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/2180970392' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch Dec 15 05:08:15 localhost systemd[1]: var-lib-containers-storage-overlay-8bc61f5fb67bd8a7bb91dcb067157be68a9c5f6d1fd540f1ac7b1cca57a9bb37-merged.mount: Deactivated successfully. Dec 15 05:08:15 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-eb0edf72e8cb50165ee77d52f0a438feba72b0cf071ad96eaf92f97a7cc82e90-userdata-shm.mount: Deactivated successfully. Dec 15 05:08:15 localhost ceph-mon[298913]: mon.np0005559462@0(leader) e17 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) Dec 15 05:08:15 localhost ceph-mon[298913]: log_channel(audit) log [DBG] : from='client.25625 172.18.0.34:0/382777224' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch Dec 15 05:08:16 localhost nova_compute[286344]: 2025-12-15 10:08:16.064 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:08:16 localhost systemd[1]: run-netns-qdhcp\x2dc0669abd\x2daef1\x2d4b0d\x2d9f97\x2da6adeeac3211.mount: Deactivated successfully. 
Dec 15 05:08:16 localhost ceph-mon[298913]: mon.np0005559462@0(leader).osd e199 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104 Dec 15 05:08:16 localhost nova_compute[286344]: 2025-12-15 10:08:16.266 286348 DEBUG oslo_service.periodic_task [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Dec 15 05:08:16 localhost ceph-mon[298913]: mon.np0005559462@0(leader) e17 handle_command mon_command({"prefix":"df", "format":"json"} v 0) Dec 15 05:08:16 localhost ceph-mon[298913]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/2910075711' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch Dec 15 05:08:16 localhost ceph-mon[298913]: mon.np0005559462@0(leader) e17 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0) Dec 15 05:08:16 localhost ceph-mon[298913]: log_channel(audit) log [DBG] : from='client.? 
172.18.0.32:0/2910075711' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch Dec 15 05:08:17 localhost nova_compute[286344]: 2025-12-15 10:08:17.270 286348 DEBUG oslo_service.periodic_task [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Dec 15 05:08:17 localhost nova_compute[286344]: 2025-12-15 10:08:17.271 286348 DEBUG nova.compute.manager [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m Dec 15 05:08:17 localhost nova_compute[286344]: 2025-12-15 10:08:17.271 286348 DEBUG nova.compute.manager [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m Dec 15 05:08:18 localhost nova_compute[286344]: 2025-12-15 10:08:18.573 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:08:18 localhost nova_compute[286344]: 2025-12-15 10:08:18.771 286348 DEBUG oslo_concurrency.lockutils [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Acquiring lock "refresh_cache-39ff1bd9-6f6b-44c8-bbec-a1fd9d196359" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m Dec 15 05:08:18 localhost nova_compute[286344]: 2025-12-15 10:08:18.772 286348 DEBUG oslo_concurrency.lockutils [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Acquired lock "refresh_cache-39ff1bd9-6f6b-44c8-bbec-a1fd9d196359" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m Dec 15 05:08:18 localhost nova_compute[286344]: 2025-12-15 10:08:18.772 286348 DEBUG 
nova.network.neutron [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] [instance: 39ff1bd9-6f6b-44c8-bbec-a1fd9d196359] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m Dec 15 05:08:18 localhost nova_compute[286344]: 2025-12-15 10:08:18.772 286348 DEBUG nova.objects.instance [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Lazy-loading 'info_cache' on Instance uuid 39ff1bd9-6f6b-44c8-bbec-a1fd9d196359 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m Dec 15 05:08:19 localhost ceph-mon[298913]: mon.np0005559462@0(leader) e17 handle_command mon_command({"prefix": "auth get-or-create", "entity": "client.tempest-cephx-id-537145655", "caps": ["mds", "allow rw path=/volumes/_nogroup/6a6f47e1-f860-424b-a5a3-e3f1e5d0f96f/0f173a26-3b98-47e3-b9c9-a22fafea1cc2", "osd", "allow rw pool=manila_data namespace=fsvolumens_6a6f47e1-f860-424b-a5a3-e3f1e5d0f96f", "mon", "allow r"], "format": "json"} v 0) Dec 15 05:08:19 localhost ceph-mon[298913]: log_channel(audit) log [INF] : from='mgr.34411 ' entity='mgr.np0005559464.aomnqe' cmd={"prefix": "auth get-or-create", "entity": "client.tempest-cephx-id-537145655", "caps": ["mds", "allow rw path=/volumes/_nogroup/6a6f47e1-f860-424b-a5a3-e3f1e5d0f96f/0f173a26-3b98-47e3-b9c9-a22fafea1cc2", "osd", "allow rw pool=manila_data namespace=fsvolumens_6a6f47e1-f860-424b-a5a3-e3f1e5d0f96f", "mon", "allow r"], "format": "json"} : dispatch Dec 15 05:08:19 localhost ceph-mon[298913]: log_channel(audit) log [INF] : from='mgr.34411 ' entity='mgr.np0005559464.aomnqe' cmd='[{"prefix": "auth get-or-create", "entity": "client.tempest-cephx-id-537145655", "caps": ["mds", "allow rw path=/volumes/_nogroup/6a6f47e1-f860-424b-a5a3-e3f1e5d0f96f/0f173a26-3b98-47e3-b9c9-a22fafea1cc2", "osd", "allow rw pool=manila_data namespace=fsvolumens_6a6f47e1-f860-424b-a5a3-e3f1e5d0f96f", "mon", "allow r"], "format": "json"}]': 
finished Dec 15 05:08:20 localhost ceph-mon[298913]: from='mgr.34411 172.18.0.108:0/929118679' entity='mgr.np0005559464.aomnqe' cmd={"prefix": "auth get", "entity": "client.tempest-cephx-id-537145655", "format": "json"} : dispatch Dec 15 05:08:20 localhost ceph-mon[298913]: from='mgr.34411 172.18.0.108:0/929118679' entity='mgr.np0005559464.aomnqe' cmd={"prefix": "auth get-or-create", "entity": "client.tempest-cephx-id-537145655", "caps": ["mds", "allow rw path=/volumes/_nogroup/6a6f47e1-f860-424b-a5a3-e3f1e5d0f96f/0f173a26-3b98-47e3-b9c9-a22fafea1cc2", "osd", "allow rw pool=manila_data namespace=fsvolumens_6a6f47e1-f860-424b-a5a3-e3f1e5d0f96f", "mon", "allow r"], "format": "json"} : dispatch Dec 15 05:08:20 localhost ceph-mon[298913]: from='mgr.34411 ' entity='mgr.np0005559464.aomnqe' cmd={"prefix": "auth get-or-create", "entity": "client.tempest-cephx-id-537145655", "caps": ["mds", "allow rw path=/volumes/_nogroup/6a6f47e1-f860-424b-a5a3-e3f1e5d0f96f/0f173a26-3b98-47e3-b9c9-a22fafea1cc2", "osd", "allow rw pool=manila_data namespace=fsvolumens_6a6f47e1-f860-424b-a5a3-e3f1e5d0f96f", "mon", "allow r"], "format": "json"} : dispatch Dec 15 05:08:20 localhost ceph-mon[298913]: from='mgr.34411 ' entity='mgr.np0005559464.aomnqe' cmd='[{"prefix": "auth get-or-create", "entity": "client.tempest-cephx-id-537145655", "caps": ["mds", "allow rw path=/volumes/_nogroup/6a6f47e1-f860-424b-a5a3-e3f1e5d0f96f/0f173a26-3b98-47e3-b9c9-a22fafea1cc2", "osd", "allow rw pool=manila_data namespace=fsvolumens_6a6f47e1-f860-424b-a5a3-e3f1e5d0f96f", "mon", "allow r"], "format": "json"}]': finished Dec 15 05:08:20 localhost neutron_dhcp_agent[267542]: 2025-12-15 10:08:20.431 267546 INFO neutron.agent.linux.ip_lib [None req-4d00132b-63fe-46c6-8f4c-d7a77fdde7a8 - - - - - -] Device tap015037e0-a9 cannot be used as it has no MAC address#033[00m Dec 15 05:08:20 localhost nova_compute[286344]: 2025-12-15 10:08:20.459 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup 
/usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:08:20 localhost kernel: device tap015037e0-a9 entered promiscuous mode Dec 15 05:08:20 localhost ovn_controller[154603]: 2025-12-15T10:08:20Z|00412|binding|INFO|Claiming lport 015037e0-a999-4d04-a64f-708652eacc63 for this chassis. Dec 15 05:08:20 localhost ovn_controller[154603]: 2025-12-15T10:08:20Z|00413|binding|INFO|015037e0-a999-4d04-a64f-708652eacc63: Claiming unknown Dec 15 05:08:20 localhost NetworkManager[5963]: [1765793300.4657] manager: (tap015037e0-a9): new Generic device (/org/freedesktop/NetworkManager/Devices/65) Dec 15 05:08:20 localhost systemd-udevd[330528]: Network interface NamePolicy= disabled on kernel command line. Dec 15 05:08:20 localhost ovn_controller[154603]: 2025-12-15T10:08:20Z|00414|binding|INFO|Setting lport 015037e0-a999-4d04-a64f-708652eacc63 ovn-installed in OVS Dec 15 05:08:20 localhost nova_compute[286344]: 2025-12-15 10:08:20.475 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:08:20 localhost journal[231322]: ethtool ioctl error on tap015037e0-a9: No such device Dec 15 05:08:20 localhost nova_compute[286344]: 2025-12-15 10:08:20.495 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:08:20 localhost journal[231322]: ethtool ioctl error on tap015037e0-a9: No such device Dec 15 05:08:20 localhost journal[231322]: ethtool ioctl error on tap015037e0-a9: No such device Dec 15 05:08:20 localhost journal[231322]: ethtool ioctl error on tap015037e0-a9: No such device Dec 15 05:08:20 localhost journal[231322]: ethtool ioctl error on tap015037e0-a9: No such device Dec 15 05:08:20 localhost journal[231322]: ethtool ioctl error on tap015037e0-a9: No such device Dec 15 05:08:20 localhost journal[231322]: ethtool ioctl error on tap015037e0-a9: No such device Dec 15 05:08:20 
localhost journal[231322]: ethtool ioctl error on tap015037e0-a9: No such device Dec 15 05:08:20 localhost ovn_controller[154603]: 2025-12-15T10:08:20Z|00415|binding|INFO|Setting lport 015037e0-a999-4d04-a64f-708652eacc63 up in Southbound Dec 15 05:08:20 localhost ovn_metadata_agent[160585]: 2025-12-15 10:08:20.521 160590 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005559462.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8::2/64', 'neutron:device_id': 'dhcpce287b4b-a043-5ed9-8e08-4fd555d768a2-c0669abd-aef1-4b0d-9f97-a6adeeac3211', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-c0669abd-aef1-4b0d-9f97-a6adeeac3211', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '89e710ef9f4f48d48a369002db572947', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=e47821dc-5f5d-44dc-8a16-54817df4049d, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=015037e0-a999-4d04-a64f-708652eacc63) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Dec 15 05:08:20 localhost ovn_metadata_agent[160585]: 2025-12-15 10:08:20.523 160590 INFO neutron.agent.ovn.metadata.agent [-] Port 015037e0-a999-4d04-a64f-708652eacc63 in datapath c0669abd-aef1-4b0d-9f97-a6adeeac3211 bound to our chassis#033[00m Dec 15 05:08:20 localhost ovn_metadata_agent[160585]: 2025-12-15 10:08:20.525 
160590 DEBUG neutron.agent.ovn.metadata.agent [-] Port 9bf6cc89-06d5-4ff1-899a-8505b3cbf38c IP addresses were not retrieved from the Port_Binding MAC column ['unknown'] _get_port_ips /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:536#033[00m Dec 15 05:08:20 localhost ovn_metadata_agent[160585]: 2025-12-15 10:08:20.526 160590 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network c0669abd-aef1-4b0d-9f97-a6adeeac3211, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m Dec 15 05:08:20 localhost ovn_metadata_agent[160585]: 2025-12-15 10:08:20.527 160858 DEBUG oslo.privsep.daemon [-] privsep: reply[cb792977-c77f-41a8-87e9-2dccb15d7554]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Dec 15 05:08:20 localhost nova_compute[286344]: 2025-12-15 10:08:20.534 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:08:20 localhost nova_compute[286344]: 2025-12-15 10:08:20.562 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:08:20 localhost nova_compute[286344]: 2025-12-15 10:08:20.883 286348 DEBUG nova.network.neutron [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] [instance: 39ff1bd9-6f6b-44c8-bbec-a1fd9d196359] Updating instance_info_cache with network_info: [{"id": "03ef8889-3216-43fb-8a52-4be17a956ce1", "address": "fa:16:3e:74:df:7c", "network": {"id": "befb7a72-17a9-4bcb-b561-84b8f626685a", "bridge": "br-int", "label": "private", "subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.201", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.20", 
"type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.0.3"}}], "meta": {"injected": false, "tenant_id": "c785bf23f53946bc99867d8832a50266", "mtu": 1292, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap03ef8889-32", "ovs_interfaceid": "03ef8889-3216-43fb-8a52-4be17a956ce1", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m Dec 15 05:08:21 localhost nova_compute[286344]: 2025-12-15 10:08:21.066 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:08:21 localhost nova_compute[286344]: 2025-12-15 10:08:21.110 286348 DEBUG oslo_concurrency.lockutils [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Releasing lock "refresh_cache-39ff1bd9-6f6b-44c8-bbec-a1fd9d196359" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m Dec 15 05:08:21 localhost nova_compute[286344]: 2025-12-15 10:08:21.111 286348 DEBUG nova.compute.manager [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] [instance: 39ff1bd9-6f6b-44c8-bbec-a1fd9d196359] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m Dec 15 05:08:21 localhost nova_compute[286344]: 2025-12-15 10:08:21.112 286348 DEBUG oslo_service.periodic_task [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Dec 15 05:08:21 localhost 
nova_compute[286344]: 2025-12-15 10:08:21.112 286348 DEBUG oslo_service.periodic_task [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Dec 15 05:08:21 localhost nova_compute[286344]: 2025-12-15 10:08:21.113 286348 DEBUG nova.compute.manager [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m Dec 15 05:08:21 localhost ceph-mon[298913]: mon.np0005559462@0(leader).osd e199 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104 Dec 15 05:08:21 localhost podman[330600]: Dec 15 05:08:21 localhost podman[330600]: 2025-12-15 10:08:21.399962599 +0000 UTC m=+0.087018417 container create 21423a3d2b9d6253a3b1a2f6f2aea3f3f1f6a228cbcbeb771dffc6dabb27cf45 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-c0669abd-aef1-4b0d-9f97-a6adeeac3211, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_managed=true) Dec 15 05:08:21 localhost systemd[1]: Started libpod-conmon-21423a3d2b9d6253a3b1a2f6f2aea3f3f1f6a228cbcbeb771dffc6dabb27cf45.scope. Dec 15 05:08:21 localhost systemd[1]: Started libcrun container. 
Dec 15 05:08:21 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/6081e5659779b4b74e6692f6a4dde27934e3e4ee362c6561f6c9b299ddcaeb67/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff) Dec 15 05:08:21 localhost podman[330600]: 2025-12-15 10:08:21.357235808 +0000 UTC m=+0.044291636 image pull quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified Dec 15 05:08:21 localhost podman[330600]: 2025-12-15 10:08:21.465423569 +0000 UTC m=+0.152479387 container init 21423a3d2b9d6253a3b1a2f6f2aea3f3f1f6a228cbcbeb771dffc6dabb27cf45 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-c0669abd-aef1-4b0d-9f97-a6adeeac3211, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, io.buildah.version=1.41.3) Dec 15 05:08:21 localhost podman[330600]: 2025-12-15 10:08:21.477819356 +0000 UTC m=+0.164875184 container start 21423a3d2b9d6253a3b1a2f6f2aea3f3f1f6a228cbcbeb771dffc6dabb27cf45 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-c0669abd-aef1-4b0d-9f97-a6adeeac3211, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251202, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true) Dec 15 05:08:21 localhost dnsmasq[330619]: started, version 2.85 cachesize 150 Dec 15 05:08:21 localhost dnsmasq[330619]: DNS service limited to local subnets Dec 15 05:08:21 localhost dnsmasq[330619]: compile time options: IPv6 GNU-getopt DBus no-UBus 
no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile Dec 15 05:08:21 localhost dnsmasq[330619]: warning: no upstream servers configured Dec 15 05:08:21 localhost dnsmasq-dhcp[330619]: DHCPv6, static leases only on 2001:db8::, lease time 1d Dec 15 05:08:21 localhost dnsmasq[330619]: read /var/lib/neutron/dhcp/c0669abd-aef1-4b0d-9f97-a6adeeac3211/addn_hosts - 0 addresses Dec 15 05:08:21 localhost dnsmasq-dhcp[330619]: read /var/lib/neutron/dhcp/c0669abd-aef1-4b0d-9f97-a6adeeac3211/host Dec 15 05:08:21 localhost dnsmasq-dhcp[330619]: read /var/lib/neutron/dhcp/c0669abd-aef1-4b0d-9f97-a6adeeac3211/opts Dec 15 05:08:21 localhost ovn_controller[154603]: 2025-12-15T10:08:21Z|00416|binding|INFO|Releasing lport 015037e0-a999-4d04-a64f-708652eacc63 from this chassis (sb_readonly=0) Dec 15 05:08:21 localhost kernel: device tap015037e0-a9 left promiscuous mode Dec 15 05:08:21 localhost ovn_controller[154603]: 2025-12-15T10:08:21Z|00417|binding|INFO|Setting lport 015037e0-a999-4d04-a64f-708652eacc63 down in Southbound Dec 15 05:08:21 localhost nova_compute[286344]: 2025-12-15 10:08:21.591 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:08:21 localhost nova_compute[286344]: 2025-12-15 10:08:21.610 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:08:21 localhost ovn_metadata_agent[160585]: 2025-12-15 10:08:21.681 160590 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005559462.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], 
external_ids={'neutron:cidrs': '2001:db8:0:1:f816:3eff:feea:4f86/64 2001:db8::2/64', 'neutron:device_id': 'dhcpce287b4b-a043-5ed9-8e08-4fd555d768a2-c0669abd-aef1-4b0d-9f97-a6adeeac3211', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-c0669abd-aef1-4b0d-9f97-a6adeeac3211', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '89e710ef9f4f48d48a369002db572947', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=e47821dc-5f5d-44dc-8a16-54817df4049d, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=015037e0-a999-4d04-a64f-708652eacc63) old=Port_Binding(up=[True], chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Dec 15 05:08:21 localhost ovn_metadata_agent[160585]: 2025-12-15 10:08:21.683 160590 INFO neutron.agent.ovn.metadata.agent [-] Port 015037e0-a999-4d04-a64f-708652eacc63 in datapath c0669abd-aef1-4b0d-9f97-a6adeeac3211 unbound from our chassis#033[00m Dec 15 05:08:21 localhost ovn_metadata_agent[160585]: 2025-12-15 10:08:21.685 160590 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network c0669abd-aef1-4b0d-9f97-a6adeeac3211, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m Dec 15 05:08:21 localhost ovn_metadata_agent[160585]: 2025-12-15 10:08:21.685 160858 DEBUG oslo.privsep.daemon [-] privsep: reply[927228f6-187b-45cc-b816-b0dfe8cfe7d8]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Dec 15 05:08:21 localhost neutron_dhcp_agent[267542]: 2025-12-15 10:08:21.715 267546 INFO neutron.agent.dhcp.agent [None 
req-cecea230-9842-468f-b44a-b91b5e48061e - - - - - -] DHCP configuration for ports {'79503367-f53f-4b35-8760-76fcaa4d8407'} is completed#033[00m Dec 15 05:08:22 localhost nova_compute[286344]: 2025-12-15 10:08:22.270 286348 DEBUG oslo_service.periodic_task [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Dec 15 05:08:22 localhost systemd[1]: Started /usr/bin/podman healthcheck run 4d46c406a3855fd1c322e5d86c048188ea5865daa07ba23aaebacc7879668923. Dec 15 05:08:22 localhost systemd[1]: tmp-crun.iUOmYt.mount: Deactivated successfully. Dec 15 05:08:22 localhost podman[330622]: 2025-12-15 10:08:22.759931498 +0000 UTC m=+0.087862440 container health_status 4d46c406a3855fd1c322e5d86c048188ea5865daa07ba23aaebacc7879668923 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=ovn_metadata_agent, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '07f3af15d79276418f54a8dde2fc251064527c3571430e14af77aaa2c01f1193-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', 
'/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true) Dec 15 05:08:22 localhost podman[330622]: 2025-12-15 10:08:22.78942277 +0000 UTC m=+0.117353712 container exec_died 4d46c406a3855fd1c322e5d86c048188ea5865daa07ba23aaebacc7879668923 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, managed_by=edpm_ansible, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '07f3af15d79276418f54a8dde2fc251064527c3571430e14af77aaa2c01f1193-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', 
'/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, config_id=ovn_metadata_agent) Dec 15 05:08:22 localhost systemd[1]: 4d46c406a3855fd1c322e5d86c048188ea5865daa07ba23aaebacc7879668923.service: Deactivated successfully. Dec 15 05:08:22 localhost nova_compute[286344]: 2025-12-15 10:08:22.940 286348 DEBUG oslo_concurrency.lockutils [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Dec 15 05:08:22 localhost nova_compute[286344]: 2025-12-15 10:08:22.940 286348 DEBUG oslo_concurrency.lockutils [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Dec 15 05:08:22 localhost nova_compute[286344]: 2025-12-15 10:08:22.940 286348 DEBUG oslo_concurrency.lockutils [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Dec 15 05:08:22 
localhost nova_compute[286344]: 2025-12-15 10:08:22.941 286348 DEBUG nova.compute.resource_tracker [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Auditing locally available compute resources for np0005559462.localdomain (node: np0005559462.localdomain) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m Dec 15 05:08:22 localhost nova_compute[286344]: 2025-12-15 10:08:22.941 286348 DEBUG oslo_concurrency.processutils [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m Dec 15 05:08:23 localhost ceph-mon[298913]: mon.np0005559462@0(leader) e17 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) Dec 15 05:08:23 localhost ceph-mon[298913]: log_channel(audit) log [DBG] : from='client.25625 172.18.0.34:0/382777224' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch Dec 15 05:08:23 localhost ceph-mon[298913]: mon.np0005559462@0(leader) e17 handle_command mon_command({"prefix": "df", "format": "json"} v 0) Dec 15 05:08:23 localhost ceph-mon[298913]: log_channel(audit) log [DBG] : from='client.? 
172.18.0.106:0/1160033306' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch Dec 15 05:08:23 localhost nova_compute[286344]: 2025-12-15 10:08:23.401 286348 DEBUG oslo_concurrency.processutils [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.460s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m Dec 15 05:08:23 localhost nova_compute[286344]: 2025-12-15 10:08:23.576 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:08:23 localhost nova_compute[286344]: 2025-12-15 10:08:23.635 286348 DEBUG nova.virt.libvirt.driver [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m Dec 15 05:08:23 localhost nova_compute[286344]: 2025-12-15 10:08:23.636 286348 DEBUG nova.virt.libvirt.driver [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m Dec 15 05:08:23 localhost ceph-mon[298913]: mon.np0005559462@0(leader) e17 handle_command mon_command({"prefix": "auth rm", "entity": "client.tempest-cephx-id-537145655"} v 0) Dec 15 05:08:23 localhost ceph-mon[298913]: log_channel(audit) log [INF] : from='mgr.34411 ' entity='mgr.np0005559464.aomnqe' cmd={"prefix": "auth rm", "entity": "client.tempest-cephx-id-537145655"} : dispatch Dec 15 05:08:23 localhost nova_compute[286344]: 2025-12-15 10:08:23.884 286348 WARNING nova.virt.libvirt.driver [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] This host appears to have multiple sockets per NUMA node. 
The `socket` PCI NUMA affinity will not be supported.#033[00m Dec 15 05:08:23 localhost nova_compute[286344]: 2025-12-15 10:08:23.886 286348 DEBUG nova.compute.resource_tracker [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Hypervisor/Node resource view: name=np0005559462.localdomain free_ram=11230MB free_disk=41.836978912353516GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", 
"product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m Dec 15 05:08:23 localhost nova_compute[286344]: 2025-12-15 10:08:23.886 286348 DEBUG oslo_concurrency.lockutils [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Dec 15 05:08:23 localhost nova_compute[286344]: 2025-12-15 10:08:23.887 286348 DEBUG oslo_concurrency.lockutils [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Dec 15 05:08:23 localhost ceph-mon[298913]: log_channel(audit) log [INF] : from='mgr.34411 ' entity='mgr.np0005559464.aomnqe' cmd='[{"prefix": "auth rm", "entity": "client.tempest-cephx-id-537145655"}]': finished Dec 15 05:08:24 localhost systemd[1]: tmp-crun.CaV1ju.mount: Deactivated successfully. 
Dec 15 05:08:24 localhost dnsmasq[330619]: exiting on receipt of SIGTERM Dec 15 05:08:24 localhost podman[330679]: 2025-12-15 10:08:24.136049265 +0000 UTC m=+0.062287484 container kill 21423a3d2b9d6253a3b1a2f6f2aea3f3f1f6a228cbcbeb771dffc6dabb27cf45 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-c0669abd-aef1-4b0d-9f97-a6adeeac3211, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.license=GPLv2, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image) Dec 15 05:08:24 localhost systemd[1]: libpod-21423a3d2b9d6253a3b1a2f6f2aea3f3f1f6a228cbcbeb771dffc6dabb27cf45.scope: Deactivated successfully. Dec 15 05:08:24 localhost podman[330692]: 2025-12-15 10:08:24.205956696 +0000 UTC m=+0.056492967 container died 21423a3d2b9d6253a3b1a2f6f2aea3f3f1f6a228cbcbeb771dffc6dabb27cf45 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-c0669abd-aef1-4b0d-9f97-a6adeeac3211, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb) Dec 15 05:08:24 localhost podman[330692]: 2025-12-15 10:08:24.237639688 +0000 UTC m=+0.088175919 container cleanup 21423a3d2b9d6253a3b1a2f6f2aea3f3f1f6a228cbcbeb771dffc6dabb27cf45 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-c0669abd-aef1-4b0d-9f97-a6adeeac3211, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, 
org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb) Dec 15 05:08:24 localhost systemd[1]: libpod-conmon-21423a3d2b9d6253a3b1a2f6f2aea3f3f1f6a228cbcbeb771dffc6dabb27cf45.scope: Deactivated successfully. Dec 15 05:08:24 localhost podman[330694]: 2025-12-15 10:08:24.281587243 +0000 UTC m=+0.126329736 container remove 21423a3d2b9d6253a3b1a2f6f2aea3f3f1f6a228cbcbeb771dffc6dabb27cf45 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-c0669abd-aef1-4b0d-9f97-a6adeeac3211, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0) Dec 15 05:08:24 localhost nova_compute[286344]: 2025-12-15 10:08:24.281 286348 DEBUG nova.compute.resource_tracker [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Instance 39ff1bd9-6f6b-44c8-bbec-a1fd9d196359 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 2, 'MEMORY_MB': 512, 'VCPU': 1}}. 
_remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m Dec 15 05:08:24 localhost nova_compute[286344]: 2025-12-15 10:08:24.282 286348 DEBUG nova.compute.resource_tracker [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m Dec 15 05:08:24 localhost nova_compute[286344]: 2025-12-15 10:08:24.283 286348 DEBUG nova.compute.resource_tracker [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Final resource view: name=np0005559462.localdomain phys_ram=15738MB used_ram=1024MB phys_disk=41GB used_disk=2GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m Dec 15 05:08:24 localhost nova_compute[286344]: 2025-12-15 10:08:24.329 286348 DEBUG oslo_concurrency.processutils [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m Dec 15 05:08:24 localhost neutron_dhcp_agent[267542]: 2025-12-15 10:08:24.333 267546 INFO neutron.agent.linux.ip_lib [None req-9a659aba-9f3f-4819-8681-dff4a3067700 - - - - - -] Device tap015037e0-a9 cannot be used as it has no MAC address#033[00m Dec 15 05:08:24 localhost nova_compute[286344]: 2025-12-15 10:08:24.409 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:08:24 localhost ceph-mon[298913]: from='mgr.34411 172.18.0.108:0/929118679' entity='mgr.np0005559464.aomnqe' cmd={"prefix": "auth get", "entity": "client.tempest-cephx-id-537145655", "format": "json"} : dispatch Dec 15 05:08:24 localhost ceph-mon[298913]: from='mgr.34411 172.18.0.108:0/929118679' 
entity='mgr.np0005559464.aomnqe' cmd={"prefix": "auth rm", "entity": "client.tempest-cephx-id-537145655"} : dispatch Dec 15 05:08:24 localhost ceph-mon[298913]: from='mgr.34411 ' entity='mgr.np0005559464.aomnqe' cmd={"prefix": "auth rm", "entity": "client.tempest-cephx-id-537145655"} : dispatch Dec 15 05:08:24 localhost ceph-mon[298913]: from='mgr.34411 ' entity='mgr.np0005559464.aomnqe' cmd='[{"prefix": "auth rm", "entity": "client.tempest-cephx-id-537145655"}]': finished Dec 15 05:08:24 localhost kernel: device tap015037e0-a9 entered promiscuous mode Dec 15 05:08:24 localhost NetworkManager[5963]: [1765793304.4178] manager: (tap015037e0-a9): new Generic device (/org/freedesktop/NetworkManager/Devices/66) Dec 15 05:08:24 localhost nova_compute[286344]: 2025-12-15 10:08:24.419 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:08:24 localhost ovn_controller[154603]: 2025-12-15T10:08:24Z|00418|binding|INFO|Claiming lport 015037e0-a999-4d04-a64f-708652eacc63 for this chassis. Dec 15 05:08:24 localhost ovn_controller[154603]: 2025-12-15T10:08:24Z|00419|binding|INFO|015037e0-a999-4d04-a64f-708652eacc63: Claiming unknown Dec 15 05:08:24 localhost systemd-udevd[330728]: Network interface NamePolicy= disabled on kernel command line. 
Dec 15 05:08:24 localhost ovn_controller[154603]: 2025-12-15T10:08:24Z|00420|binding|INFO|Setting lport 015037e0-a999-4d04-a64f-708652eacc63 ovn-installed in OVS Dec 15 05:08:24 localhost nova_compute[286344]: 2025-12-15 10:08:24.434 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:08:24 localhost ovn_controller[154603]: 2025-12-15T10:08:24Z|00421|binding|INFO|Setting lport 015037e0-a999-4d04-a64f-708652eacc63 up in Southbound Dec 15 05:08:24 localhost ovn_metadata_agent[160585]: 2025-12-15 10:08:24.438 160590 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005559462.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8:0:1:f816:3eff:feea:4f86/64 2001:db8::2/64', 'neutron:device_id': 'dhcpce287b4b-a043-5ed9-8e08-4fd555d768a2-c0669abd-aef1-4b0d-9f97-a6adeeac3211', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-c0669abd-aef1-4b0d-9f97-a6adeeac3211', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '89e710ef9f4f48d48a369002db572947', 'neutron:revision_number': '4', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=e47821dc-5f5d-44dc-8a16-54817df4049d, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=015037e0-a999-4d04-a64f-708652eacc63) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m 
Dec 15 05:08:24 localhost ovn_metadata_agent[160585]: 2025-12-15 10:08:24.440 160590 INFO neutron.agent.ovn.metadata.agent [-] Port 015037e0-a999-4d04-a64f-708652eacc63 in datapath c0669abd-aef1-4b0d-9f97-a6adeeac3211 bound to our chassis#033[00m Dec 15 05:08:24 localhost ovn_metadata_agent[160585]: 2025-12-15 10:08:24.443 160590 DEBUG neutron.agent.ovn.metadata.agent [-] Port 9bf6cc89-06d5-4ff1-899a-8505b3cbf38c IP addresses were not retrieved from the Port_Binding MAC column ['unknown'] _get_port_ips /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:536#033[00m Dec 15 05:08:24 localhost ovn_metadata_agent[160585]: 2025-12-15 10:08:24.443 160590 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network c0669abd-aef1-4b0d-9f97-a6adeeac3211, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m Dec 15 05:08:24 localhost ovn_metadata_agent[160585]: 2025-12-15 10:08:24.444 160858 DEBUG oslo.privsep.daemon [-] privsep: reply[0ab5c2f2-79b6-43d2-a3c0-bff18dc307f7]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Dec 15 05:08:24 localhost journal[231322]: ethtool ioctl error on tap015037e0-a9: No such device Dec 15 05:08:24 localhost journal[231322]: ethtool ioctl error on tap015037e0-a9: No such device Dec 15 05:08:24 localhost journal[231322]: ethtool ioctl error on tap015037e0-a9: No such device Dec 15 05:08:24 localhost nova_compute[286344]: 2025-12-15 10:08:24.458 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:08:24 localhost journal[231322]: ethtool ioctl error on tap015037e0-a9: No such device Dec 15 05:08:24 localhost journal[231322]: ethtool ioctl error on tap015037e0-a9: No such device Dec 15 05:08:24 localhost journal[231322]: ethtool ioctl error on tap015037e0-a9: No such device Dec 15 
05:08:24 localhost journal[231322]: ethtool ioctl error on tap015037e0-a9: No such device Dec 15 05:08:24 localhost journal[231322]: ethtool ioctl error on tap015037e0-a9: No such device Dec 15 05:08:24 localhost nova_compute[286344]: 2025-12-15 10:08:24.498 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:08:24 localhost nova_compute[286344]: 2025-12-15 10:08:24.526 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:08:24 localhost ceph-mon[298913]: mon.np0005559462@0(leader) e17 handle_command mon_command({"prefix": "df", "format": "json"} v 0) Dec 15 05:08:24 localhost ceph-mon[298913]: log_channel(audit) log [DBG] : from='client.? 172.18.0.106:0/1362160112' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch Dec 15 05:08:24 localhost nova_compute[286344]: 2025-12-15 10:08:24.777 286348 DEBUG oslo_concurrency.processutils [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.449s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m Dec 15 05:08:24 localhost nova_compute[286344]: 2025-12-15 10:08:24.785 286348 DEBUG nova.compute.provider_tree [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Inventory has not changed in ProviderTree for provider: 26c8956b-6742-4951-b566-971b9bbe323b update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m Dec 15 05:08:25 localhost systemd[1]: var-lib-containers-storage-overlay-6081e5659779b4b74e6692f6a4dde27934e3e4ee362c6561f6c9b299ddcaeb67-merged.mount: Deactivated successfully. 
Dec 15 05:08:25 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-21423a3d2b9d6253a3b1a2f6f2aea3f3f1f6a228cbcbeb771dffc6dabb27cf45-userdata-shm.mount: Deactivated successfully. Dec 15 05:08:25 localhost podman[330818]: Dec 15 05:08:25 localhost podman[330818]: 2025-12-15 10:08:25.295018429 +0000 UTC m=+0.087331566 container create 7c490fec55d76843c6a4b322f520facd5f2038faa9a5099320268b4b130808cf (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-c0669abd-aef1-4b0d-9f97-a6adeeac3211, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team) Dec 15 05:08:25 localhost systemd[1]: Started libpod-conmon-7c490fec55d76843c6a4b322f520facd5f2038faa9a5099320268b4b130808cf.scope. Dec 15 05:08:25 localhost podman[330818]: 2025-12-15 10:08:25.254110037 +0000 UTC m=+0.046423194 image pull quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified Dec 15 05:08:25 localhost systemd[1]: Started libcrun container. 
Dec 15 05:08:25 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/46470493796bed8cc7ca0ef47a1ecf5767e73a6f4543ae34d31809b92f6ad524/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff) Dec 15 05:08:25 localhost podman[330818]: 2025-12-15 10:08:25.374903641 +0000 UTC m=+0.167216748 container init 7c490fec55d76843c6a4b322f520facd5f2038faa9a5099320268b4b130808cf (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-c0669abd-aef1-4b0d-9f97-a6adeeac3211, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2) Dec 15 05:08:25 localhost podman[330818]: 2025-12-15 10:08:25.384200054 +0000 UTC m=+0.176513171 container start 7c490fec55d76843c6a4b322f520facd5f2038faa9a5099320268b4b130808cf (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-c0669abd-aef1-4b0d-9f97-a6adeeac3211, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3) Dec 15 05:08:25 localhost dnsmasq[330836]: started, version 2.85 cachesize 150 Dec 15 05:08:25 localhost dnsmasq[330836]: DNS service limited to local subnets Dec 15 05:08:25 localhost dnsmasq[330836]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile Dec 15 05:08:25 localhost dnsmasq[330836]: warning: no upstream servers 
configured Dec 15 05:08:25 localhost dnsmasq-dhcp[330836]: DHCPv6, static leases only on 2001:db8:0:1::, lease time 1d Dec 15 05:08:25 localhost dnsmasq-dhcp[330836]: DHCPv6, static leases only on 2001:db8::, lease time 1d Dec 15 05:08:25 localhost dnsmasq[330836]: read /var/lib/neutron/dhcp/c0669abd-aef1-4b0d-9f97-a6adeeac3211/addn_hosts - 0 addresses Dec 15 05:08:25 localhost dnsmasq-dhcp[330836]: read /var/lib/neutron/dhcp/c0669abd-aef1-4b0d-9f97-a6adeeac3211/host Dec 15 05:08:25 localhost dnsmasq-dhcp[330836]: read /var/lib/neutron/dhcp/c0669abd-aef1-4b0d-9f97-a6adeeac3211/opts Dec 15 05:08:25 localhost nova_compute[286344]: 2025-12-15 10:08:25.514 286348 DEBUG nova.scheduler.client.report [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Inventory has not changed for provider 26c8956b-6742-4951-b566-971b9bbe323b based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m Dec 15 05:08:25 localhost nova_compute[286344]: 2025-12-15 10:08:25.516 286348 DEBUG nova.compute.resource_tracker [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Compute_service record updated for np0005559462.localdomain:np0005559462.localdomain _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m Dec 15 05:08:25 localhost nova_compute[286344]: 2025-12-15 10:08:25.516 286348 DEBUG oslo_concurrency.lockutils [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 1.629s inner 
/usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Dec 15 05:08:25 localhost neutron_dhcp_agent[267542]: 2025-12-15 10:08:25.608 267546 INFO neutron.agent.dhcp.agent [None req-afa3af13-6193-4eb2-95cf-0d0d1409b93f - - - - - -] DHCP configuration for ports {'79503367-f53f-4b35-8760-76fcaa4d8407', '015037e0-a999-4d04-a64f-708652eacc63'} is completed#033[00m Dec 15 05:08:25 localhost neutron_sriov_agent[260044]: 2025-12-15 10:08:25.883 2 INFO neutron.agent.securitygroups_rpc [None req-9b5159b6-f281-4ff8-8185-b605e431f172 6b5da6f221214afe93e1fa66574f238b 89e710ef9f4f48d48a369002db572947 - - default default] Security group member updated ['a6c5f808-dddc-4f17-acbf-63b1b6e6f4d6']#033[00m Dec 15 05:08:26 localhost nova_compute[286344]: 2025-12-15 10:08:26.068 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:08:26 localhost neutron_dhcp_agent[267542]: 2025-12-15 10:08:26.109 267546 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-12-15T10:08:25Z, description=, device_id=, device_owner=, dns_assignment=[, ], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[, ], id=52dc5570-e53a-4683-9494-bcf8f1436cd6, ip_allocation=immediate, mac_address=fa:16:3e:8d:3f:d8, name=tempest-NetworksTestDHCPv6-555825491, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-12-15T10:05:39Z, description=, dns_domain=, id=c0669abd-aef1-4b0d-9f97-a6adeeac3211, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-NetworksTestDHCPv6-test-network-1861207414, port_security_enabled=True, project_id=89e710ef9f4f48d48a369002db572947, provider:network_type=geneve, provider:physical_network=None, 
provider:segmentation_id=20673, qos_policy_id=None, revision_number=61, router:external=False, shared=False, standard_attr_id=2164, status=ACTIVE, subnets=['645df342-a91e-4676-9acc-612194d89847', 'ae0e001b-7506-4150-a85c-bc57cd0ad5cb'], tags=[], tenant_id=89e710ef9f4f48d48a369002db572947, updated_at=2025-12-15T10:08:17Z, vlan_transparent=None, network_id=c0669abd-aef1-4b0d-9f97-a6adeeac3211, port_security_enabled=True, project_id=89e710ef9f4f48d48a369002db572947, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=['a6c5f808-dddc-4f17-acbf-63b1b6e6f4d6'], standard_attr_id=2901, status=DOWN, tags=[], tenant_id=89e710ef9f4f48d48a369002db572947, updated_at=2025-12-15T10:08:25Z on network c0669abd-aef1-4b0d-9f97-a6adeeac3211#033[00m Dec 15 05:08:26 localhost ceph-mon[298913]: mon.np0005559462@0(leader).osd e199 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104 Dec 15 05:08:26 localhost dnsmasq[330836]: read /var/lib/neutron/dhcp/c0669abd-aef1-4b0d-9f97-a6adeeac3211/addn_hosts - 2 addresses Dec 15 05:08:26 localhost dnsmasq-dhcp[330836]: read /var/lib/neutron/dhcp/c0669abd-aef1-4b0d-9f97-a6adeeac3211/host Dec 15 05:08:26 localhost dnsmasq-dhcp[330836]: read /var/lib/neutron/dhcp/c0669abd-aef1-4b0d-9f97-a6adeeac3211/opts Dec 15 05:08:26 localhost podman[330855]: 2025-12-15 10:08:26.30275053 +0000 UTC m=+0.061536494 container kill 7c490fec55d76843c6a4b322f520facd5f2038faa9a5099320268b4b130808cf (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-c0669abd-aef1-4b0d-9f97-a6adeeac3211, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, 
org.label-schema.license=GPLv2) Dec 15 05:08:26 localhost ceph-mon[298913]: mon.np0005559462@0(leader) e17 handle_command mon_command({"prefix":"df", "format":"json"} v 0) Dec 15 05:08:26 localhost ceph-mon[298913]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/1148133529' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch Dec 15 05:08:26 localhost ceph-mon[298913]: mon.np0005559462@0(leader) e17 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0) Dec 15 05:08:26 localhost ceph-mon[298913]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/1148133529' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch Dec 15 05:08:27 localhost neutron_dhcp_agent[267542]: 2025-12-15 10:08:27.432 267546 INFO neutron.agent.dhcp.agent [None req-3a25cbdf-ec1e-46d2-a4aa-5d453870fa67 - - - - - -] DHCP configuration for ports {'52dc5570-e53a-4683-9494-bcf8f1436cd6'} is completed#033[00m Dec 15 05:08:27 localhost nova_compute[286344]: 2025-12-15 10:08:27.516 286348 DEBUG oslo_service.periodic_task [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Dec 15 05:08:27 localhost systemd[1]: Started /usr/bin/podman healthcheck run a4e94bd7aaee6c174c601060fb5f34559019ccea93fc397263ee3735731cfc1e. 
Dec 15 05:08:27 localhost podman[330875]: 2025-12-15 10:08:27.75654515 +0000 UTC m=+0.085144877 container health_status a4e94bd7aaee6c174c601060fb5f34559019ccea93fc397263ee3735731cfc1e (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}) Dec 15 05:08:27 localhost podman[330875]: 2025-12-15 10:08:27.789772363 +0000 UTC m=+0.118372080 container exec_died a4e94bd7aaee6c174c601060fb5f34559019ccea93fc397263ee3735731cfc1e (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', 
'/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible) Dec 15 05:08:27 localhost systemd[1]: a4e94bd7aaee6c174c601060fb5f34559019ccea93fc397263ee3735731cfc1e.service: Deactivated successfully. Dec 15 05:08:28 localhost nova_compute[286344]: 2025-12-15 10:08:28.607 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:08:29 localhost ceph-mon[298913]: mon.np0005559462@0(leader) e17 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) Dec 15 05:08:29 localhost ceph-mon[298913]: log_channel(audit) log [DBG] : from='client.25625 172.18.0.34:0/382777224' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch Dec 15 05:08:29 localhost neutron_sriov_agent[260044]: 2025-12-15 10:08:29.425 2 INFO neutron.agent.securitygroups_rpc [None req-c6fb2d8b-7567-4c44-890c-46a882ecc94e 6b5da6f221214afe93e1fa66574f238b 89e710ef9f4f48d48a369002db572947 - - default default] Security group member updated ['a6c5f808-dddc-4f17-acbf-63b1b6e6f4d6']#033[00m Dec 15 05:08:30 localhost dnsmasq[330836]: read /var/lib/neutron/dhcp/c0669abd-aef1-4b0d-9f97-a6adeeac3211/addn_hosts - 0 addresses Dec 15 05:08:30 localhost dnsmasq-dhcp[330836]: read /var/lib/neutron/dhcp/c0669abd-aef1-4b0d-9f97-a6adeeac3211/host Dec 15 05:08:30 localhost dnsmasq-dhcp[330836]: read /var/lib/neutron/dhcp/c0669abd-aef1-4b0d-9f97-a6adeeac3211/opts Dec 15 05:08:30 localhost podman[330915]: 2025-12-15 10:08:30.223180239 +0000 UTC m=+0.068642467 container kill 7c490fec55d76843c6a4b322f520facd5f2038faa9a5099320268b4b130808cf (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-c0669abd-aef1-4b0d-9f97-a6adeeac3211, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, 
org.label-schema.build-date=20251202, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.schema-version=1.0) Dec 15 05:08:30 localhost systemd[1]: tmp-crun.cst8Up.mount: Deactivated successfully. Dec 15 05:08:31 localhost nova_compute[286344]: 2025-12-15 10:08:31.071 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:08:31 localhost ceph-mon[298913]: mon.np0005559462@0(leader).osd e199 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104 Dec 15 05:08:31 localhost podman[243449]: time="2025-12-15T10:08:31Z" level=info msg="List containers: received `last` parameter - overwriting `limit`" Dec 15 05:08:31 localhost podman[243449]: @ - - [15/Dec/2025:10:08:31 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 160380 "" "Go-http-client/1.1" Dec 15 05:08:31 localhost podman[243449]: @ - - [15/Dec/2025:10:08:31 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 20202 "" "Go-http-client/1.1" Dec 15 05:08:32 localhost ceph-mon[298913]: mon.np0005559462@0(leader) e17 handle_command mon_command({"prefix": "auth get-or-create", "entity": "client.tempest-cephx-id-537145655", "caps": ["mds", "allow rw path=/volumes/_nogroup/568da928-180d-446f-b45b-9f50182ea67e/79fc8097-e1ed-40a4-9149-7f3067893ab1", "osd", "allow rw pool=manila_data namespace=fsvolumens_568da928-180d-446f-b45b-9f50182ea67e", "mon", "allow r"], "format": "json"} v 0) Dec 15 05:08:32 localhost ceph-mon[298913]: log_channel(audit) log [INF] : from='mgr.34411 ' entity='mgr.np0005559464.aomnqe' cmd={"prefix": "auth get-or-create", "entity": "client.tempest-cephx-id-537145655", "caps": ["mds", "allow rw 
path=/volumes/_nogroup/568da928-180d-446f-b45b-9f50182ea67e/79fc8097-e1ed-40a4-9149-7f3067893ab1", "osd", "allow rw pool=manila_data namespace=fsvolumens_568da928-180d-446f-b45b-9f50182ea67e", "mon", "allow r"], "format": "json"} : dispatch Dec 15 05:08:32 localhost ceph-mon[298913]: log_channel(audit) log [INF] : from='mgr.34411 ' entity='mgr.np0005559464.aomnqe' cmd='[{"prefix": "auth get-or-create", "entity": "client.tempest-cephx-id-537145655", "caps": ["mds", "allow rw path=/volumes/_nogroup/568da928-180d-446f-b45b-9f50182ea67e/79fc8097-e1ed-40a4-9149-7f3067893ab1", "osd", "allow rw pool=manila_data namespace=fsvolumens_568da928-180d-446f-b45b-9f50182ea67e", "mon", "allow r"], "format": "json"}]': finished Dec 15 05:08:33 localhost dnsmasq[330836]: exiting on receipt of SIGTERM Dec 15 05:08:33 localhost podman[330956]: 2025-12-15 10:08:33.295920079 +0000 UTC m=+0.058722328 container kill 7c490fec55d76843c6a4b322f520facd5f2038faa9a5099320268b4b130808cf (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-c0669abd-aef1-4b0d-9f97-a6adeeac3211, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true) Dec 15 05:08:33 localhost systemd[1]: libpod-7c490fec55d76843c6a4b322f520facd5f2038faa9a5099320268b4b130808cf.scope: Deactivated successfully. 
Dec 15 05:08:33 localhost ceph-mon[298913]: from='mgr.34411 172.18.0.108:0/929118679' entity='mgr.np0005559464.aomnqe' cmd={"prefix": "auth get", "entity": "client.tempest-cephx-id-537145655", "format": "json"} : dispatch Dec 15 05:08:33 localhost ceph-mon[298913]: from='mgr.34411 172.18.0.108:0/929118679' entity='mgr.np0005559464.aomnqe' cmd={"prefix": "auth get-or-create", "entity": "client.tempest-cephx-id-537145655", "caps": ["mds", "allow rw path=/volumes/_nogroup/568da928-180d-446f-b45b-9f50182ea67e/79fc8097-e1ed-40a4-9149-7f3067893ab1", "osd", "allow rw pool=manila_data namespace=fsvolumens_568da928-180d-446f-b45b-9f50182ea67e", "mon", "allow r"], "format": "json"} : dispatch Dec 15 05:08:33 localhost ceph-mon[298913]: from='mgr.34411 ' entity='mgr.np0005559464.aomnqe' cmd={"prefix": "auth get-or-create", "entity": "client.tempest-cephx-id-537145655", "caps": ["mds", "allow rw path=/volumes/_nogroup/568da928-180d-446f-b45b-9f50182ea67e/79fc8097-e1ed-40a4-9149-7f3067893ab1", "osd", "allow rw pool=manila_data namespace=fsvolumens_568da928-180d-446f-b45b-9f50182ea67e", "mon", "allow r"], "format": "json"} : dispatch Dec 15 05:08:33 localhost ceph-mon[298913]: from='mgr.34411 ' entity='mgr.np0005559464.aomnqe' cmd='[{"prefix": "auth get-or-create", "entity": "client.tempest-cephx-id-537145655", "caps": ["mds", "allow rw path=/volumes/_nogroup/568da928-180d-446f-b45b-9f50182ea67e/79fc8097-e1ed-40a4-9149-7f3067893ab1", "osd", "allow rw pool=manila_data namespace=fsvolumens_568da928-180d-446f-b45b-9f50182ea67e", "mon", "allow r"], "format": "json"}]': finished Dec 15 05:08:33 localhost podman[330970]: 2025-12-15 10:08:33.367718722 +0000 UTC m=+0.059075048 container died 7c490fec55d76843c6a4b322f520facd5f2038faa9a5099320268b4b130808cf (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-c0669abd-aef1-4b0d-9f97-a6adeeac3211, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, 
io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true) Dec 15 05:08:33 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-7c490fec55d76843c6a4b322f520facd5f2038faa9a5099320268b4b130808cf-userdata-shm.mount: Deactivated successfully. Dec 15 05:08:33 localhost podman[330970]: 2025-12-15 10:08:33.404167973 +0000 UTC m=+0.095524279 container cleanup 7c490fec55d76843c6a4b322f520facd5f2038faa9a5099320268b4b130808cf (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-c0669abd-aef1-4b0d-9f97-a6adeeac3211, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, org.label-schema.build-date=20251202) Dec 15 05:08:33 localhost systemd[1]: libpod-conmon-7c490fec55d76843c6a4b322f520facd5f2038faa9a5099320268b4b130808cf.scope: Deactivated successfully. 
Dec 15 05:08:33 localhost podman[330977]: 2025-12-15 10:08:33.456608189 +0000 UTC m=+0.136537504 container remove 7c490fec55d76843c6a4b322f520facd5f2038faa9a5099320268b4b130808cf (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-c0669abd-aef1-4b0d-9f97-a6adeeac3211, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.schema-version=1.0) Dec 15 05:08:33 localhost nova_compute[286344]: 2025-12-15 10:08:33.611 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:08:34 localhost ovn_controller[154603]: 2025-12-15T10:08:34Z|00422|binding|INFO|Releasing lport 015037e0-a999-4d04-a64f-708652eacc63 from this chassis (sb_readonly=0) Dec 15 05:08:34 localhost nova_compute[286344]: 2025-12-15 10:08:34.051 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:08:34 localhost kernel: device tap015037e0-a9 left promiscuous mode Dec 15 05:08:34 localhost ovn_controller[154603]: 2025-12-15T10:08:34Z|00423|binding|INFO|Setting lport 015037e0-a999-4d04-a64f-708652eacc63 down in Southbound Dec 15 05:08:34 localhost nova_compute[286344]: 2025-12-15 10:08:34.073 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:08:34 localhost neutron_dhcp_agent[267542]: 2025-12-15 10:08:34.125 267546 INFO neutron.agent.dhcp.agent [None req-9f286121-401f-4e07-8da1-b37a5a0deae0 - - - - - -] DHCP configuration for ports {'79503367-f53f-4b35-8760-76fcaa4d8407', '015037e0-a999-4d04-a64f-708652eacc63'} is 
completed#033[00m Dec 15 05:08:34 localhost ovn_metadata_agent[160585]: 2025-12-15 10:08:34.224 160590 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005559462.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8:0:1:f816:3eff:feea:4f86/64 2001:db8::2/64', 'neutron:device_id': 'dhcpce287b4b-a043-5ed9-8e08-4fd555d768a2-c0669abd-aef1-4b0d-9f97-a6adeeac3211', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-c0669abd-aef1-4b0d-9f97-a6adeeac3211', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '89e710ef9f4f48d48a369002db572947', 'neutron:revision_number': '6', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'np0005559462.localdomain'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=e47821dc-5f5d-44dc-8a16-54817df4049d, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=015037e0-a999-4d04-a64f-708652eacc63) old=Port_Binding(up=[True], chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Dec 15 05:08:34 localhost ovn_metadata_agent[160585]: 2025-12-15 10:08:34.226 160590 INFO neutron.agent.ovn.metadata.agent [-] Port 015037e0-a999-4d04-a64f-708652eacc63 in datapath c0669abd-aef1-4b0d-9f97-a6adeeac3211 unbound from our chassis#033[00m Dec 15 05:08:34 localhost ovn_metadata_agent[160585]: 2025-12-15 10:08:34.228 160590 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 
c0669abd-aef1-4b0d-9f97-a6adeeac3211, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m Dec 15 05:08:34 localhost ovn_metadata_agent[160585]: 2025-12-15 10:08:34.229 160858 DEBUG oslo.privsep.daemon [-] privsep: reply[bf9aa21a-ea65-41da-869a-8eb95efc417f]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Dec 15 05:08:34 localhost systemd[1]: var-lib-containers-storage-overlay-46470493796bed8cc7ca0ef47a1ecf5767e73a6f4543ae34d31809b92f6ad524-merged.mount: Deactivated successfully. Dec 15 05:08:34 localhost systemd[1]: run-netns-qdhcp\x2dc0669abd\x2daef1\x2d4b0d\x2d9f97\x2da6adeeac3211.mount: Deactivated successfully. Dec 15 05:08:34 localhost neutron_dhcp_agent[267542]: 2025-12-15 10:08:34.798 267546 INFO neutron.agent.dhcp.agent [None req-b2979ec8-91ae-46ae-a47e-377bb7057d11 - - - - - -] Network not present, action: clean_devices, action_kwargs: {}#033[00m Dec 15 05:08:34 localhost openstack_network_exporter[246484]: ERROR 10:08:34 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd Dec 15 05:08:34 localhost openstack_network_exporter[246484]: ERROR 10:08:34 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd Dec 15 05:08:34 localhost openstack_network_exporter[246484]: ERROR 10:08:34 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server Dec 15 05:08:34 localhost openstack_network_exporter[246484]: ERROR 10:08:34 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath Dec 15 05:08:34 localhost openstack_network_exporter[246484]: Dec 15 05:08:34 localhost openstack_network_exporter[246484]: ERROR 10:08:34 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath Dec 15 05:08:34 localhost openstack_network_exporter[246484]: Dec 15 05:08:36 localhost 
nova_compute[286344]: 2025-12-15 10:08:36.073 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:08:36 localhost ceph-mon[298913]: mon.np0005559462@0(leader).osd e199 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104 Dec 15 05:08:36 localhost ceph-mon[298913]: mon.np0005559462@0(leader) e17 handle_command mon_command({"prefix": "auth rm", "entity": "client.tempest-cephx-id-537145655"} v 0) Dec 15 05:08:36 localhost ceph-mon[298913]: log_channel(audit) log [INF] : from='mgr.34411 ' entity='mgr.np0005559464.aomnqe' cmd={"prefix": "auth rm", "entity": "client.tempest-cephx-id-537145655"} : dispatch Dec 15 05:08:36 localhost ceph-mon[298913]: log_channel(audit) log [INF] : from='mgr.34411 ' entity='mgr.np0005559464.aomnqe' cmd='[{"prefix": "auth rm", "entity": "client.tempest-cephx-id-537145655"}]': finished Dec 15 05:08:37 localhost ceph-mon[298913]: from='mgr.34411 172.18.0.108:0/929118679' entity='mgr.np0005559464.aomnqe' cmd={"prefix": "auth get", "entity": "client.tempest-cephx-id-537145655", "format": "json"} : dispatch Dec 15 05:08:37 localhost ceph-mon[298913]: from='mgr.34411 172.18.0.108:0/929118679' entity='mgr.np0005559464.aomnqe' cmd={"prefix": "auth rm", "entity": "client.tempest-cephx-id-537145655"} : dispatch Dec 15 05:08:37 localhost ceph-mon[298913]: from='mgr.34411 ' entity='mgr.np0005559464.aomnqe' cmd={"prefix": "auth rm", "entity": "client.tempest-cephx-id-537145655"} : dispatch Dec 15 05:08:37 localhost ceph-mon[298913]: from='mgr.34411 ' entity='mgr.np0005559464.aomnqe' cmd='[{"prefix": "auth rm", "entity": "client.tempest-cephx-id-537145655"}]': finished Dec 15 05:08:38 localhost podman[331017]: 2025-12-15 10:08:38.07328709 +0000 UTC m=+0.060896247 container kill 5a6468fef7e6af9cf1304e74e1115619a3eb0865b18496c87c34924ae65820eb 
(image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-d4d65c77-4d50-4ac8-bdf3-cfe58cbd19a4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image) Dec 15 05:08:38 localhost dnsmasq[325450]: exiting on receipt of SIGTERM Dec 15 05:08:38 localhost systemd[1]: tmp-crun.I4ETdd.mount: Deactivated successfully. Dec 15 05:08:38 localhost systemd[1]: libpod-5a6468fef7e6af9cf1304e74e1115619a3eb0865b18496c87c34924ae65820eb.scope: Deactivated successfully. Dec 15 05:08:38 localhost podman[331030]: 2025-12-15 10:08:38.144421304 +0000 UTC m=+0.054561126 container died 5a6468fef7e6af9cf1304e74e1115619a3eb0865b18496c87c34924ae65820eb (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-d4d65c77-4d50-4ac8-bdf3-cfe58cbd19a4, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3) Dec 15 05:08:38 localhost podman[331030]: 2025-12-15 10:08:38.179109606 +0000 UTC m=+0.089249408 container cleanup 5a6468fef7e6af9cf1304e74e1115619a3eb0865b18496c87c34924ae65820eb (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-d4d65c77-4d50-4ac8-bdf3-cfe58cbd19a4, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.schema-version=1.0, 
org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.license=GPLv2) Dec 15 05:08:38 localhost systemd[1]: libpod-conmon-5a6468fef7e6af9cf1304e74e1115619a3eb0865b18496c87c34924ae65820eb.scope: Deactivated successfully. Dec 15 05:08:38 localhost podman[331031]: 2025-12-15 10:08:38.220898393 +0000 UTC m=+0.127639291 container remove 5a6468fef7e6af9cf1304e74e1115619a3eb0865b18496c87c34924ae65820eb (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-d4d65c77-4d50-4ac8-bdf3-cfe58cbd19a4, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS) Dec 15 05:08:38 localhost ovn_controller[154603]: 2025-12-15T10:08:38Z|00424|binding|INFO|Removing iface tapfad1135b-b7 ovn-installed in OVS Dec 15 05:08:38 localhost ovn_metadata_agent[160585]: 2025-12-15 10:08:38.221 160590 WARNING neutron.agent.ovn.metadata.agent [-] Removing non-external type port 0759561a-9c32-4277-8ea3-1357c5758408 with type ""#033[00m Dec 15 05:08:38 localhost ovn_controller[154603]: 2025-12-15T10:08:38Z|00425|binding|INFO|Removing lport fad1135b-b7b1-48ac-8865-1d59a512e8ca ovn-installed in OVS Dec 15 05:08:38 localhost ovn_metadata_agent[160585]: 2025-12-15 10:08:38.223 160590 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched DELETE: PortBindingDeletedEvent(events=('delete',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[True], options={'requested-chassis': 'np0005559462.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.255.242/28', 
'neutron:device_id': 'dhcpce287b4b-a043-5ed9-8e08-4fd555d768a2-d4d65c77-4d50-4ac8-bdf3-cfe58cbd19a4', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-d4d65c77-4d50-4ac8-bdf3-cfe58cbd19a4', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '5a302f70917d47c392ee5c9b50e38a7e', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'np0005559462.localdomain'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=856a2b9e-2f34-49e1-a56c-c5f5d8ff3083, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=fad1135b-b7b1-48ac-8865-1d59a512e8ca) old= matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Dec 15 05:08:38 localhost ovn_metadata_agent[160585]: 2025-12-15 10:08:38.225 160590 INFO neutron.agent.ovn.metadata.agent [-] Port fad1135b-b7b1-48ac-8865-1d59a512e8ca in datapath d4d65c77-4d50-4ac8-bdf3-cfe58cbd19a4 unbound from our chassis#033[00m Dec 15 05:08:38 localhost nova_compute[286344]: 2025-12-15 10:08:38.225 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:08:38 localhost ovn_metadata_agent[160585]: 2025-12-15 10:08:38.228 160590 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network d4d65c77-4d50-4ac8-bdf3-cfe58cbd19a4, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m Dec 15 05:08:38 localhost ovn_metadata_agent[160585]: 2025-12-15 10:08:38.229 160858 DEBUG oslo.privsep.daemon [-] privsep: reply[be174aee-7eb8-48c5-ab43-56b7f7b90da9]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Dec 15 05:08:38 
localhost nova_compute[286344]: 2025-12-15 10:08:38.231 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:08:38 localhost nova_compute[286344]: 2025-12-15 10:08:38.240 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:08:38 localhost kernel: device tapfad1135b-b7 left promiscuous mode Dec 15 05:08:38 localhost nova_compute[286344]: 2025-12-15 10:08:38.254 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:08:38 localhost neutron_dhcp_agent[267542]: 2025-12-15 10:08:38.305 267546 INFO neutron.agent.dhcp.agent [None req-74d2eb32-e0e2-4a63-8395-1da4d258c344 - - - - - -] Network not present, action: clean_devices, action_kwargs: {}#033[00m Dec 15 05:08:38 localhost neutron_dhcp_agent[267542]: 2025-12-15 10:08:38.548 267546 INFO neutron.agent.dhcp.agent [-] Network not present, action: clean_devices, action_kwargs: {}#033[00m Dec 15 05:08:38 localhost nova_compute[286344]: 2025-12-15 10:08:38.634 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:08:38 localhost neutron_dhcp_agent[267542]: 2025-12-15 10:08:38.715 267546 INFO neutron.agent.dhcp.agent [-] Network not present, action: clean_devices, action_kwargs: {}#033[00m Dec 15 05:08:39 localhost ovn_controller[154603]: 2025-12-15T10:08:39Z|00426|binding|INFO|Releasing lport b35254ad-12eb-47bb-92be-44fefe0694f0 from this chassis (sb_readonly=0) Dec 15 05:08:39 localhost nova_compute[286344]: 2025-12-15 10:08:39.042 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:08:39 localhost systemd[1]: 
var-lib-containers-storage-overlay-c3f62309f462240b01951714780fa5be0069023265097e303c1b65def8ed0543-merged.mount: Deactivated successfully. Dec 15 05:08:39 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-5a6468fef7e6af9cf1304e74e1115619a3eb0865b18496c87c34924ae65820eb-userdata-shm.mount: Deactivated successfully. Dec 15 05:08:39 localhost systemd[1]: run-netns-qdhcp\x2dd4d65c77\x2d4d50\x2d4ac8\x2dbdf3\x2dcfe58cbd19a4.mount: Deactivated successfully. Dec 15 05:08:39 localhost ceph-mon[298913]: mon.np0005559462@0(leader) e17 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) Dec 15 05:08:39 localhost ceph-mon[298913]: log_channel(audit) log [DBG] : from='client.25625 172.18.0.34:0/382777224' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch Dec 15 05:08:39 localhost nova_compute[286344]: 2025-12-15 10:08:39.789 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:08:41 localhost nova_compute[286344]: 2025-12-15 10:08:41.111 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:08:41 localhost ceph-mon[298913]: mon.np0005559462@0(leader).osd e199 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104 Dec 15 05:08:42 localhost ceph-mon[298913]: mon.np0005559462@0(leader).osd e199 do_prune osdmap full prune enabled Dec 15 05:08:42 localhost ceph-mon[298913]: mon.np0005559462@0(leader).osd e200 e200: 6 total, 6 up, 6 in Dec 15 05:08:42 localhost ceph-mon[298913]: log_channel(cluster) log [DBG] : osdmap e200: 6 total, 6 up, 6 in Dec 15 05:08:42 localhost ceph-mon[298913]: mon.np0005559462@0(leader) e17 handle_command mon_command({"prefix": "auth get-or-create", "entity": "client.tempest-cephx-id-537145655", "caps": ["mds", "allow rw 
path=/volumes/_nogroup/4a1828c0-b006-4b50-a1c3-15b3370c3c1b/0bf39908-6427-495f-9aea-131f2bee7631", "osd", "allow rw pool=manila_data namespace=fsvolumens_4a1828c0-b006-4b50-a1c3-15b3370c3c1b", "mon", "allow r"], "format": "json"} v 0) Dec 15 05:08:42 localhost ceph-mon[298913]: log_channel(audit) log [INF] : from='mgr.34411 ' entity='mgr.np0005559464.aomnqe' cmd={"prefix": "auth get-or-create", "entity": "client.tempest-cephx-id-537145655", "caps": ["mds", "allow rw path=/volumes/_nogroup/4a1828c0-b006-4b50-a1c3-15b3370c3c1b/0bf39908-6427-495f-9aea-131f2bee7631", "osd", "allow rw pool=manila_data namespace=fsvolumens_4a1828c0-b006-4b50-a1c3-15b3370c3c1b", "mon", "allow r"], "format": "json"} : dispatch Dec 15 05:08:42 localhost ceph-mon[298913]: log_channel(audit) log [INF] : from='mgr.34411 ' entity='mgr.np0005559464.aomnqe' cmd='[{"prefix": "auth get-or-create", "entity": "client.tempest-cephx-id-537145655", "caps": ["mds", "allow rw path=/volumes/_nogroup/4a1828c0-b006-4b50-a1c3-15b3370c3c1b/0bf39908-6427-495f-9aea-131f2bee7631", "osd", "allow rw pool=manila_data namespace=fsvolumens_4a1828c0-b006-4b50-a1c3-15b3370c3c1b", "mon", "allow r"], "format": "json"}]': finished Dec 15 05:08:43 localhost ceph-mon[298913]: from='mgr.34411 172.18.0.108:0/929118679' entity='mgr.np0005559464.aomnqe' cmd={"prefix": "auth get", "entity": "client.tempest-cephx-id-537145655", "format": "json"} : dispatch Dec 15 05:08:43 localhost ceph-mon[298913]: from='mgr.34411 172.18.0.108:0/929118679' entity='mgr.np0005559464.aomnqe' cmd={"prefix": "auth get-or-create", "entity": "client.tempest-cephx-id-537145655", "caps": ["mds", "allow rw path=/volumes/_nogroup/4a1828c0-b006-4b50-a1c3-15b3370c3c1b/0bf39908-6427-495f-9aea-131f2bee7631", "osd", "allow rw pool=manila_data namespace=fsvolumens_4a1828c0-b006-4b50-a1c3-15b3370c3c1b", "mon", "allow r"], "format": "json"} : dispatch Dec 15 05:08:43 localhost ceph-mon[298913]: from='mgr.34411 ' entity='mgr.np0005559464.aomnqe' cmd={"prefix": "auth 
get-or-create", "entity": "client.tempest-cephx-id-537145655", "caps": ["mds", "allow rw path=/volumes/_nogroup/4a1828c0-b006-4b50-a1c3-15b3370c3c1b/0bf39908-6427-495f-9aea-131f2bee7631", "osd", "allow rw pool=manila_data namespace=fsvolumens_4a1828c0-b006-4b50-a1c3-15b3370c3c1b", "mon", "allow r"], "format": "json"} : dispatch Dec 15 05:08:43 localhost ceph-mon[298913]: from='mgr.34411 ' entity='mgr.np0005559464.aomnqe' cmd='[{"prefix": "auth get-or-create", "entity": "client.tempest-cephx-id-537145655", "caps": ["mds", "allow rw path=/volumes/_nogroup/4a1828c0-b006-4b50-a1c3-15b3370c3c1b/0bf39908-6427-495f-9aea-131f2bee7631", "osd", "allow rw pool=manila_data namespace=fsvolumens_4a1828c0-b006-4b50-a1c3-15b3370c3c1b", "mon", "allow r"], "format": "json"}]': finished Dec 15 05:08:43 localhost nova_compute[286344]: 2025-12-15 10:08:43.685 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:08:44 localhost nova_compute[286344]: 2025-12-15 10:08:44.433 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:08:45 localhost neutron_sriov_agent[260044]: 2025-12-15 10:08:45.182 2 INFO neutron.agent.securitygroups_rpc [None req-053eb48c-d851-4c95-9939-1d1b5bba9a91 af329bd85e524c1ea41f4255aa5eb9d8 1d15e551a54e4d0ba499cdbee6530731 - - default default] Security group member updated ['5c37dc84-f80b-4e12-8d9d-49a8b22167f0']#033[00m Dec 15 05:08:45 localhost systemd[1]: Started /usr/bin/podman healthcheck run 67ee72fcd99c896f79e8807764b1d0f04e7d45927d06e306c0986394e8ed42a0. Dec 15 05:08:45 localhost systemd[1]: Started /usr/bin/podman healthcheck run 730c50020a7d9e2da21d2462d40af20ac303e43f3491f692cbbd87f432ba3e09. Dec 15 05:08:45 localhost systemd[1]: Started /usr/bin/podman healthcheck run 9f0f481c937606298203797a71924b7614a5d9e3f4a8afd837cfa5b9602aedeb. 
Dec 15 05:08:45 localhost systemd[1]: Started /usr/bin/podman healthcheck run ac2eae30a2a7f971398af42ea67b3ed799d4b3341f7688ebcd73f7ca7f66c2a8. Dec 15 05:08:45 localhost systemd[1]: Started /usr/bin/podman healthcheck run b3d8483ade539500e228c2af9c0c3e647febb5ec7903610a83aea32631007d4a. Dec 15 05:08:45 localhost podman[331061]: 2025-12-15 10:08:45.778722205 +0000 UTC m=+0.100785391 container health_status 67ee72fcd99c896f79e8807764b1d0f04e7d45927d06e306c0986394e8ed42a0 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible) Dec 15 05:08:45 localhost podman[331061]: 2025-12-15 10:08:45.79030348 +0000 UTC m=+0.112366666 container exec_died 
67ee72fcd99c896f79e8807764b1d0f04e7d45927d06e306c0986394e8ed42a0 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, managed_by=edpm_ansible, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors ) Dec 15 05:08:45 localhost systemd[1]: 67ee72fcd99c896f79e8807764b1d0f04e7d45927d06e306c0986394e8ed42a0.service: Deactivated successfully. Dec 15 05:08:45 localhost systemd[1]: tmp-crun.ZqtMG8.mount: Deactivated successfully. 
Dec 15 05:08:45 localhost podman[331075]: 2025-12-15 10:08:45.852641696 +0000 UTC m=+0.160141056 container health_status b3d8483ade539500e228c2af9c0c3e647febb5ec7903610a83aea32631007d4a (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, container_name=ceilometer_agent_compute, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, config_id=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '07f3af15d79276418f54a8dde2fc251064527c3571430e14af77aaa2c01f1193-cb1e9478634a00c8fb5ad9cd7fcd38c7b2a1a85187dfaa618bc715c24c2b15fb'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, managed_by=edpm_ansible, 
org.label-schema.license=GPLv2) Dec 15 05:08:45 localhost podman[331063]: 2025-12-15 10:08:45.893672471 +0000 UTC m=+0.209117827 container health_status 9f0f481c937606298203797a71924b7614a5d9e3f4a8afd837cfa5b9602aedeb (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=multipathd, container_name=multipathd, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS) Dec 15 05:08:45 localhost podman[331062]: 2025-12-15 10:08:45.822724682 +0000 UTC m=+0.143103122 container health_status 
730c50020a7d9e2da21d2462d40af20ac303e43f3491f692cbbd87f432ba3e09 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, architecture=x86_64, url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-type=git, version=9.6, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.expose-services=, name=ubi9-minimal, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=openstack_network_exporter, build-date=2025-08-20T13:12:41, config_id=openstack_network_exporter, io.buildah.version=1.33.7, distribution-scope=public, managed_by=edpm_ansible, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'cb1e9478634a00c8fb5ad9cd7fcd38c7b2a1a85187dfaa618bc715c24c2b15fb'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': 
['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.tags=minimal rhel9, release=1755695350, maintainer=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly.) Dec 15 05:08:45 localhost podman[331063]: 2025-12-15 10:08:45.930936184 +0000 UTC m=+0.246381540 container exec_died 9f0f481c937606298203797a71924b7614a5d9e3f4a8afd837cfa5b9602aedeb (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, org.label-schema.name=CentOS Stream 9 Base Image, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3) Dec 15 05:08:45 localhost podman[331068]: 2025-12-15 10:08:45.946332933 +0000 UTC m=+0.255977211 container health_status ac2eae30a2a7f971398af42ea67b3ed799d4b3341f7688ebcd73f7ca7f66c2a8 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, tcib_managed=true, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251202, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '07f3af15d79276418f54a8dde2fc251064527c3571430e14af77aaa2c01f1193'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, config_id=ovn_controller, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb) Dec 15 05:08:45 localhost systemd[1]: 9f0f481c937606298203797a71924b7614a5d9e3f4a8afd837cfa5b9602aedeb.service: Deactivated successfully. Dec 15 05:08:45 localhost podman[331062]: 2025-12-15 10:08:45.958470573 +0000 UTC m=+0.278849063 container exec_died 730c50020a7d9e2da21d2462d40af20ac303e43f3491f692cbbd87f432ba3e09 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, name=ubi9-minimal, url=https://catalog.redhat.com/en/search?searchType=containers, vendor=Red Hat, Inc., io.openshift.tags=minimal rhel9, release=1755695350, com.redhat.component=ubi9-minimal-container, maintainer=Red Hat, Inc., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, managed_by=edpm_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, version=9.6, architecture=x86_64, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'cb1e9478634a00c8fb5ad9cd7fcd38c7b2a1a85187dfaa618bc715c24c2b15fb'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', 
'/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, config_id=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., build-date=2025-08-20T13:12:41, container_name=openstack_network_exporter, io.openshift.expose-services=, distribution-scope=public, io.buildah.version=1.33.7) Dec 15 05:08:45 localhost systemd[1]: 730c50020a7d9e2da21d2462d40af20ac303e43f3491f692cbbd87f432ba3e09.service: Deactivated successfully. 
Dec 15 05:08:45 localhost podman[331068]: 2025-12-15 10:08:45.995401358 +0000 UTC m=+0.305045626 container exec_died ac2eae30a2a7f971398af42ea67b3ed799d4b3341f7688ebcd73f7ca7f66c2a8 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '07f3af15d79276418f54a8dde2fc251064527c3571430e14af77aaa2c01f1193'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_controller, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=ovn_controller) Dec 15 05:08:46 localhost ceph-mon[298913]: mon.np0005559462@0(leader) e17 handle_command mon_command({"prefix": "auth rm", "entity": "client.tempest-cephx-id-537145655"} v 0) Dec 15 05:08:46 localhost ceph-mon[298913]: log_channel(audit) log [INF] : from='mgr.34411 ' entity='mgr.np0005559464.aomnqe' cmd={"prefix": "auth rm", "entity": "client.tempest-cephx-id-537145655"} : dispatch Dec 15 05:08:46 localhost systemd[1]: 
ac2eae30a2a7f971398af42ea67b3ed799d4b3341f7688ebcd73f7ca7f66c2a8.service: Deactivated successfully. Dec 15 05:08:46 localhost podman[331075]: 2025-12-15 10:08:46.016890761 +0000 UTC m=+0.324390171 container exec_died b3d8483ade539500e228c2af9c0c3e647febb5ec7903610a83aea32631007d4a (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, org.label-schema.vendor=CentOS, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '07f3af15d79276418f54a8dde2fc251064527c3571430e14af77aaa2c01f1193-cb1e9478634a00c8fb5ad9cd7fcd38c7b2a1a85187dfaa618bc715c24c2b15fb'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_managed=true, config_id=ceilometer_agent_compute, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=ceilometer_agent_compute, org.label-schema.build-date=20251202, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes 
Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0) Dec 15 05:08:46 localhost ceph-mon[298913]: log_channel(audit) log [INF] : from='mgr.34411 ' entity='mgr.np0005559464.aomnqe' cmd='[{"prefix": "auth rm", "entity": "client.tempest-cephx-id-537145655"}]': finished Dec 15 05:08:46 localhost systemd[1]: b3d8483ade539500e228c2af9c0c3e647febb5ec7903610a83aea32631007d4a.service: Deactivated successfully. Dec 15 05:08:46 localhost nova_compute[286344]: 2025-12-15 10:08:46.112 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:08:46 localhost ceph-mon[298913]: mon.np0005559462@0(leader).osd e200 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104 Dec 15 05:08:46 localhost ceph-mon[298913]: from='mgr.34411 172.18.0.108:0/929118679' entity='mgr.np0005559464.aomnqe' cmd={"prefix": "auth get", "entity": "client.tempest-cephx-id-537145655", "format": "json"} : dispatch Dec 15 05:08:46 localhost ceph-mon[298913]: from='mgr.34411 172.18.0.108:0/929118679' entity='mgr.np0005559464.aomnqe' cmd={"prefix": "auth rm", "entity": "client.tempest-cephx-id-537145655"} : dispatch Dec 15 05:08:46 localhost ceph-mon[298913]: from='mgr.34411 ' entity='mgr.np0005559464.aomnqe' cmd={"prefix": "auth rm", "entity": "client.tempest-cephx-id-537145655"} : dispatch Dec 15 05:08:46 localhost ceph-mon[298913]: from='mgr.34411 ' entity='mgr.np0005559464.aomnqe' cmd='[{"prefix": "auth rm", "entity": "client.tempest-cephx-id-537145655"}]': finished Dec 15 05:08:46 localhost neutron_dhcp_agent[267542]: 2025-12-15 10:08:46.814 267546 INFO neutron.agent.linux.ip_lib [None req-ad872bef-6dcf-4009-ad14-84b052a65dad - - - - - -] Device tapbae8f778-7c cannot be used as it has no MAC address#033[00m Dec 15 05:08:46 localhost nova_compute[286344]: 2025-12-15 10:08:46.839 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog 
[-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:08:46 localhost kernel: device tapbae8f778-7c entered promiscuous mode Dec 15 05:08:46 localhost NetworkManager[5963]: [1765793326.8464] manager: (tapbae8f778-7c): new Generic device (/org/freedesktop/NetworkManager/Devices/67) Dec 15 05:08:46 localhost ovn_controller[154603]: 2025-12-15T10:08:46Z|00427|binding|INFO|Claiming lport bae8f778-7c77-4071-9f9a-c4137b3157cd for this chassis. Dec 15 05:08:46 localhost ovn_controller[154603]: 2025-12-15T10:08:46Z|00428|binding|INFO|bae8f778-7c77-4071-9f9a-c4137b3157cd: Claiming unknown Dec 15 05:08:46 localhost nova_compute[286344]: 2025-12-15 10:08:46.847 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:08:46 localhost systemd-udevd[331174]: Network interface NamePolicy= disabled on kernel command line. Dec 15 05:08:46 localhost ovn_metadata_agent[160585]: 2025-12-15 10:08:46.857 160590 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005559462.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.3/28', 'neutron:device_id': 'dhcpce287b4b-a043-5ed9-8e08-4fd555d768a2-a41a152b-f4be-46d2-978a-2511131ed3ae', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-a41a152b-f4be-46d2-978a-2511131ed3ae', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'fe9f9243b8b846fea6a4b2d81c50a247', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': 
'', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=9e1a5105-1c60-4aa8-a286-144f714eda94, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=bae8f778-7c77-4071-9f9a-c4137b3157cd) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Dec 15 05:08:46 localhost ovn_metadata_agent[160585]: 2025-12-15 10:08:46.860 160590 INFO neutron.agent.ovn.metadata.agent [-] Port bae8f778-7c77-4071-9f9a-c4137b3157cd in datapath a41a152b-f4be-46d2-978a-2511131ed3ae bound to our chassis#033[00m Dec 15 05:08:46 localhost ovn_metadata_agent[160585]: 2025-12-15 10:08:46.863 160590 DEBUG neutron.agent.ovn.metadata.agent [-] Port f348c013-3832-4d80-888d-89fbd46825c3 IP addresses were not retrieved from the Port_Binding MAC column ['unknown'] _get_port_ips /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:536#033[00m Dec 15 05:08:46 localhost ovn_metadata_agent[160585]: 2025-12-15 10:08:46.863 160590 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network a41a152b-f4be-46d2-978a-2511131ed3ae, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m Dec 15 05:08:46 localhost ovn_metadata_agent[160585]: 2025-12-15 10:08:46.867 160858 DEBUG oslo.privsep.daemon [-] privsep: reply[42e6fd17-5cba-4543-9cf6-ccc338d82018]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Dec 15 05:08:46 localhost journal[231322]: ethtool ioctl error on tapbae8f778-7c: No such device Dec 15 05:08:46 localhost journal[231322]: ethtool ioctl error on tapbae8f778-7c: No such device Dec 15 05:08:46 localhost ovn_controller[154603]: 2025-12-15T10:08:46Z|00429|binding|INFO|Setting lport bae8f778-7c77-4071-9f9a-c4137b3157cd ovn-installed in OVS Dec 15 05:08:46 localhost ovn_controller[154603]: 
2025-12-15T10:08:46Z|00430|binding|INFO|Setting lport bae8f778-7c77-4071-9f9a-c4137b3157cd up in Southbound Dec 15 05:08:46 localhost nova_compute[286344]: 2025-12-15 10:08:46.888 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:08:46 localhost journal[231322]: ethtool ioctl error on tapbae8f778-7c: No such device Dec 15 05:08:46 localhost journal[231322]: ethtool ioctl error on tapbae8f778-7c: No such device Dec 15 05:08:46 localhost journal[231322]: ethtool ioctl error on tapbae8f778-7c: No such device Dec 15 05:08:46 localhost journal[231322]: ethtool ioctl error on tapbae8f778-7c: No such device Dec 15 05:08:46 localhost journal[231322]: ethtool ioctl error on tapbae8f778-7c: No such device Dec 15 05:08:46 localhost journal[231322]: ethtool ioctl error on tapbae8f778-7c: No such device Dec 15 05:08:46 localhost nova_compute[286344]: 2025-12-15 10:08:46.923 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:08:46 localhost nova_compute[286344]: 2025-12-15 10:08:46.954 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:08:47 localhost neutron_sriov_agent[260044]: 2025-12-15 10:08:47.087 2 INFO neutron.agent.securitygroups_rpc [None req-216d72ae-1e26-4b80-9d02-e23a9ff33b7b 2a4c858f01e7402bb63c15d45b9aef6a fe9f9243b8b846fea6a4b2d81c50a247 - - default default] Security group member updated ['35926619-5c17-4d8f-973a-f16e1b692d86']#033[00m Dec 15 05:08:47 localhost ceph-mon[298913]: mon.np0005559462@0(leader).osd e200 do_prune osdmap full prune enabled Dec 15 05:08:47 localhost ceph-mon[298913]: mon.np0005559462@0(leader).osd e201 e201: 6 total, 6 up, 6 in Dec 15 05:08:47 localhost ceph-mon[298913]: log_channel(cluster) log [DBG] : osdmap e201: 6 total, 6 up, 6 in Dec 15 
05:08:47 localhost neutron_sriov_agent[260044]: 2025-12-15 10:08:47.610 2 INFO neutron.agent.securitygroups_rpc [None req-bd5d554a-bdfd-4f5d-b793-91c884c65ee0 af329bd85e524c1ea41f4255aa5eb9d8 1d15e551a54e4d0ba499cdbee6530731 - - default default] Security group member updated ['5c37dc84-f80b-4e12-8d9d-49a8b22167f0']#033[00m Dec 15 05:08:47 localhost podman[331245]: Dec 15 05:08:47 localhost podman[331245]: 2025-12-15 10:08:47.844225467 +0000 UTC m=+0.093433141 container create e4a3d9d8b47e1c6a046ece3d2f29b3b6d779d04bb53ebc58d57146e500e77115 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-a41a152b-f4be-46d2-978a-2511131ed3ae, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2) Dec 15 05:08:47 localhost neutron_sriov_agent[260044]: 2025-12-15 10:08:47.868 2 INFO neutron.agent.securitygroups_rpc [None req-bd5d554a-bdfd-4f5d-b793-91c884c65ee0 af329bd85e524c1ea41f4255aa5eb9d8 1d15e551a54e4d0ba499cdbee6530731 - - default default] Security group member updated ['5c37dc84-f80b-4e12-8d9d-49a8b22167f0']#033[00m Dec 15 05:08:47 localhost systemd[1]: Started libpod-conmon-e4a3d9d8b47e1c6a046ece3d2f29b3b6d779d04bb53ebc58d57146e500e77115.scope. Dec 15 05:08:47 localhost podman[331245]: 2025-12-15 10:08:47.800951161 +0000 UTC m=+0.050158845 image pull quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified Dec 15 05:08:47 localhost systemd[1]: tmp-crun.0wd5gB.mount: Deactivated successfully. Dec 15 05:08:47 localhost systemd[1]: Started libcrun container. 
Dec 15 05:08:47 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/5fdf99fb567f8a2309159ee28f527c989ccc1da8637db79cb532747f60904a66/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff) Dec 15 05:08:47 localhost podman[331245]: 2025-12-15 10:08:47.939672413 +0000 UTC m=+0.188880097 container init e4a3d9d8b47e1c6a046ece3d2f29b3b6d779d04bb53ebc58d57146e500e77115 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-a41a152b-f4be-46d2-978a-2511131ed3ae, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, tcib_managed=true) Dec 15 05:08:47 localhost podman[331245]: 2025-12-15 10:08:47.949022217 +0000 UTC m=+0.198229901 container start e4a3d9d8b47e1c6a046ece3d2f29b3b6d779d04bb53ebc58d57146e500e77115 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-a41a152b-f4be-46d2-978a-2511131ed3ae, org.label-schema.build-date=20251202, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3) Dec 15 05:08:47 localhost dnsmasq[331263]: started, version 2.85 cachesize 150 Dec 15 05:08:47 localhost dnsmasq[331263]: DNS service limited to local subnets Dec 15 05:08:47 localhost dnsmasq[331263]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile Dec 15 05:08:47 localhost dnsmasq[331263]: warning: no upstream servers 
configured Dec 15 05:08:47 localhost dnsmasq-dhcp[331263]: DHCP, static leases only on 10.100.0.0, lease time 1d Dec 15 05:08:47 localhost dnsmasq[331263]: read /var/lib/neutron/dhcp/a41a152b-f4be-46d2-978a-2511131ed3ae/addn_hosts - 0 addresses Dec 15 05:08:47 localhost dnsmasq-dhcp[331263]: read /var/lib/neutron/dhcp/a41a152b-f4be-46d2-978a-2511131ed3ae/host Dec 15 05:08:47 localhost dnsmasq-dhcp[331263]: read /var/lib/neutron/dhcp/a41a152b-f4be-46d2-978a-2511131ed3ae/opts Dec 15 05:08:48 localhost neutron_dhcp_agent[267542]: 2025-12-15 10:08:48.010 267546 INFO neutron.agent.dhcp.agent [None req-3e00125c-90a7-4efb-a5cc-523cdc541ef6 - - - - - -] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-12-15T10:08:46Z, description=, device_id=, device_owner=, dns_assignment=[], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[], id=84333ca0-de3e-48d1-9fd2-de9237ba8d6b, ip_allocation=immediate, mac_address=fa:16:3e:19:52:82, name=tempest-ExtraDHCPOptionsTestJSON-421717953, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-12-15T10:08:43Z, description=, dns_domain=, id=a41a152b-f4be-46d2-978a-2511131ed3ae, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-ExtraDHCPOptionsTestJSON-test-network-358644074, port_security_enabled=True, project_id=fe9f9243b8b846fea6a4b2d81c50a247, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=21889, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=2969, status=ACTIVE, subnets=['5b380339-20e2-4516-889c-ae47d7998470'], tags=[], tenant_id=fe9f9243b8b846fea6a4b2d81c50a247, updated_at=2025-12-15T10:08:45Z, vlan_transparent=None, network_id=a41a152b-f4be-46d2-978a-2511131ed3ae, port_security_enabled=True, 
project_id=fe9f9243b8b846fea6a4b2d81c50a247, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=['35926619-5c17-4d8f-973a-f16e1b692d86'], standard_attr_id=2984, status=DOWN, tags=[], tenant_id=fe9f9243b8b846fea6a4b2d81c50a247, updated_at=2025-12-15T10:08:46Z on network a41a152b-f4be-46d2-978a-2511131ed3ae#033[00m Dec 15 05:08:48 localhost neutron_dhcp_agent[267542]: 2025-12-15 10:08:48.137 267546 INFO neutron.agent.dhcp.agent [None req-cd5151fc-ef4b-4a2d-b86d-877c35d31ca9 - - - - - -] DHCP configuration for ports {'03fcbc0c-dc3a-409d-9e57-07933d043d55'} is completed#033[00m Dec 15 05:08:48 localhost dnsmasq[331263]: read /var/lib/neutron/dhcp/a41a152b-f4be-46d2-978a-2511131ed3ae/addn_hosts - 1 addresses Dec 15 05:08:48 localhost podman[331281]: 2025-12-15 10:08:48.229935645 +0000 UTC m=+0.060247158 container kill e4a3d9d8b47e1c6a046ece3d2f29b3b6d779d04bb53ebc58d57146e500e77115 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-a41a152b-f4be-46d2-978a-2511131ed3ae, org.label-schema.license=GPLv2, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb) Dec 15 05:08:48 localhost dnsmasq-dhcp[331263]: read /var/lib/neutron/dhcp/a41a152b-f4be-46d2-978a-2511131ed3ae/host Dec 15 05:08:48 localhost dnsmasq-dhcp[331263]: read /var/lib/neutron/dhcp/a41a152b-f4be-46d2-978a-2511131ed3ae/opts Dec 15 05:08:48 localhost neutron_sriov_agent[260044]: 2025-12-15 10:08:48.300 2 INFO neutron.agent.securitygroups_rpc [None req-f2bd50e6-aacf-4ebf-b89b-4c7ac15b8281 af329bd85e524c1ea41f4255aa5eb9d8 1d15e551a54e4d0ba499cdbee6530731 - - default default] Security group member updated ['5c37dc84-f80b-4e12-8d9d-49a8b22167f0']#033[00m Dec 
15 05:08:48 localhost neutron_dhcp_agent[267542]: 2025-12-15 10:08:48.470 267546 INFO neutron.agent.dhcp.agent [None req-9b79197e-1bb4-4c8b-ac51-0b3fc105405c - - - - - -] DHCP configuration for ports {'84333ca0-de3e-48d1-9fd2-de9237ba8d6b'} is completed#033[00m Dec 15 05:08:48 localhost neutron_sriov_agent[260044]: 2025-12-15 10:08:48.496 2 INFO neutron.agent.securitygroups_rpc [None req-2fa267c2-790c-4323-a16c-0260bc5f1da5 2a4c858f01e7402bb63c15d45b9aef6a fe9f9243b8b846fea6a4b2d81c50a247 - - default default] Security group member updated ['35926619-5c17-4d8f-973a-f16e1b692d86']#033[00m Dec 15 05:08:48 localhost neutron_dhcp_agent[267542]: 2025-12-15 10:08:48.540 267546 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-12-15T10:08:47Z, description=, device_id=, device_owner=, dns_assignment=[], dns_domain=, dns_name=, extra_dhcp_opts=[, , ], fixed_ips=[], id=dc76546b-97de-4511-a416-b44991e4a487, ip_allocation=immediate, mac_address=fa:16:3e:e9:54:26, name=tempest-ExtraDHCPOptionsTestJSON-1343883061, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-12-15T10:08:43Z, description=, dns_domain=, id=a41a152b-f4be-46d2-978a-2511131ed3ae, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-ExtraDHCPOptionsTestJSON-test-network-358644074, port_security_enabled=True, project_id=fe9f9243b8b846fea6a4b2d81c50a247, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=21889, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=2969, status=ACTIVE, subnets=['5b380339-20e2-4516-889c-ae47d7998470'], tags=[], tenant_id=fe9f9243b8b846fea6a4b2d81c50a247, updated_at=2025-12-15T10:08:45Z, vlan_transparent=None, 
network_id=a41a152b-f4be-46d2-978a-2511131ed3ae, port_security_enabled=True, project_id=fe9f9243b8b846fea6a4b2d81c50a247, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=['35926619-5c17-4d8f-973a-f16e1b692d86'], standard_attr_id=2994, status=DOWN, tags=[], tenant_id=fe9f9243b8b846fea6a4b2d81c50a247, updated_at=2025-12-15T10:08:47Z on network a41a152b-f4be-46d2-978a-2511131ed3ae#033[00m Dec 15 05:08:48 localhost neutron_sriov_agent[260044]: 2025-12-15 10:08:48.646 2 INFO neutron.agent.securitygroups_rpc [None req-98872a2f-1698-4e60-b1ba-74729586e7ba af329bd85e524c1ea41f4255aa5eb9d8 1d15e551a54e4d0ba499cdbee6530731 - - default default] Security group member updated ['5c37dc84-f80b-4e12-8d9d-49a8b22167f0']#033[00m Dec 15 05:08:48 localhost nova_compute[286344]: 2025-12-15 10:08:48.732 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:08:48 localhost dnsmasq[331263]: read /var/lib/neutron/dhcp/a41a152b-f4be-46d2-978a-2511131ed3ae/addn_hosts - 2 addresses Dec 15 05:08:48 localhost dnsmasq-dhcp[331263]: read /var/lib/neutron/dhcp/a41a152b-f4be-46d2-978a-2511131ed3ae/host Dec 15 05:08:48 localhost podman[331319]: 2025-12-15 10:08:48.756153663 +0000 UTC m=+0.062568931 container kill e4a3d9d8b47e1c6a046ece3d2f29b3b6d779d04bb53ebc58d57146e500e77115 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-a41a152b-f4be-46d2-978a-2511131ed3ae, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2) Dec 15 05:08:48 localhost dnsmasq-dhcp[331263]: read 
/var/lib/neutron/dhcp/a41a152b-f4be-46d2-978a-2511131ed3ae/opts Dec 15 05:08:49 localhost neutron_dhcp_agent[267542]: 2025-12-15 10:08:49.013 267546 INFO neutron.agent.dhcp.agent [None req-c119e077-9b8a-4885-b605-a812351e726d - - - - - -] DHCP configuration for ports {'dc76546b-97de-4511-a416-b44991e4a487'} is completed#033[00m Dec 15 05:08:49 localhost ceph-mon[298913]: mon.np0005559462@0(leader) e17 handle_command mon_command({"prefix": "auth get-or-create", "entity": "client.tempest-cephx-id-537145655", "caps": ["mds", "allow rw path=/volumes/_nogroup/d45816ae-1301-449f-a02a-41f15745dceb/4f63897f-7a83-4942-b630-cfc3d31a8b78", "osd", "allow rw pool=manila_data namespace=fsvolumens_d45816ae-1301-449f-a02a-41f15745dceb", "mon", "allow r"], "format": "json"} v 0) Dec 15 05:08:49 localhost ceph-mon[298913]: log_channel(audit) log [INF] : from='mgr.34411 ' entity='mgr.np0005559464.aomnqe' cmd={"prefix": "auth get-or-create", "entity": "client.tempest-cephx-id-537145655", "caps": ["mds", "allow rw path=/volumes/_nogroup/d45816ae-1301-449f-a02a-41f15745dceb/4f63897f-7a83-4942-b630-cfc3d31a8b78", "osd", "allow rw pool=manila_data namespace=fsvolumens_d45816ae-1301-449f-a02a-41f15745dceb", "mon", "allow r"], "format": "json"} : dispatch Dec 15 05:08:49 localhost ceph-mon[298913]: log_channel(audit) log [INF] : from='mgr.34411 ' entity='mgr.np0005559464.aomnqe' cmd='[{"prefix": "auth get-or-create", "entity": "client.tempest-cephx-id-537145655", "caps": ["mds", "allow rw path=/volumes/_nogroup/d45816ae-1301-449f-a02a-41f15745dceb/4f63897f-7a83-4942-b630-cfc3d31a8b78", "osd", "allow rw pool=manila_data namespace=fsvolumens_d45816ae-1301-449f-a02a-41f15745dceb", "mon", "allow r"], "format": "json"}]': finished Dec 15 05:08:49 localhost ceph-mon[298913]: from='mgr.34411 172.18.0.108:0/929118679' entity='mgr.np0005559464.aomnqe' cmd={"prefix": "auth get", "entity": "client.tempest-cephx-id-537145655", "format": "json"} : dispatch Dec 15 05:08:49 localhost ceph-mon[298913]: 
from='mgr.34411 172.18.0.108:0/929118679' entity='mgr.np0005559464.aomnqe' cmd={"prefix": "auth get-or-create", "entity": "client.tempest-cephx-id-537145655", "caps": ["mds", "allow rw path=/volumes/_nogroup/d45816ae-1301-449f-a02a-41f15745dceb/4f63897f-7a83-4942-b630-cfc3d31a8b78", "osd", "allow rw pool=manila_data namespace=fsvolumens_d45816ae-1301-449f-a02a-41f15745dceb", "mon", "allow r"], "format": "json"} : dispatch Dec 15 05:08:49 localhost ceph-mon[298913]: from='mgr.34411 ' entity='mgr.np0005559464.aomnqe' cmd={"prefix": "auth get-or-create", "entity": "client.tempest-cephx-id-537145655", "caps": ["mds", "allow rw path=/volumes/_nogroup/d45816ae-1301-449f-a02a-41f15745dceb/4f63897f-7a83-4942-b630-cfc3d31a8b78", "osd", "allow rw pool=manila_data namespace=fsvolumens_d45816ae-1301-449f-a02a-41f15745dceb", "mon", "allow r"], "format": "json"} : dispatch Dec 15 05:08:49 localhost ceph-mon[298913]: from='mgr.34411 ' entity='mgr.np0005559464.aomnqe' cmd='[{"prefix": "auth get-or-create", "entity": "client.tempest-cephx-id-537145655", "caps": ["mds", "allow rw path=/volumes/_nogroup/d45816ae-1301-449f-a02a-41f15745dceb/4f63897f-7a83-4942-b630-cfc3d31a8b78", "osd", "allow rw pool=manila_data namespace=fsvolumens_d45816ae-1301-449f-a02a-41f15745dceb", "mon", "allow r"], "format": "json"}]': finished Dec 15 05:08:49 localhost neutron_sriov_agent[260044]: 2025-12-15 10:08:49.629 2 INFO neutron.agent.securitygroups_rpc [None req-7d6ad835-d8ee-4fca-9ef9-cd6338ffa426 2a4c858f01e7402bb63c15d45b9aef6a fe9f9243b8b846fea6a4b2d81c50a247 - - default default] Security group member updated ['35926619-5c17-4d8f-973a-f16e1b692d86']#033[00m Dec 15 05:08:49 localhost ceph-mon[298913]: mon.np0005559462@0(leader) e17 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) Dec 15 05:08:49 localhost ceph-mon[298913]: log_channel(audit) log [DBG] : from='client.? 
172.18.0.32:0/551637216' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch Dec 15 05:08:49 localhost nova_compute[286344]: 2025-12-15 10:08:49.679 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:08:49 localhost dnsmasq[331263]: read /var/lib/neutron/dhcp/a41a152b-f4be-46d2-978a-2511131ed3ae/addn_hosts - 1 addresses Dec 15 05:08:49 localhost podman[331356]: 2025-12-15 10:08:49.934365781 +0000 UTC m=+0.087453890 container kill e4a3d9d8b47e1c6a046ece3d2f29b3b6d779d04bb53ebc58d57146e500e77115 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-a41a152b-f4be-46d2-978a-2511131ed3ae, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team) Dec 15 05:08:49 localhost dnsmasq-dhcp[331263]: read /var/lib/neutron/dhcp/a41a152b-f4be-46d2-978a-2511131ed3ae/host Dec 15 05:08:49 localhost dnsmasq-dhcp[331263]: read /var/lib/neutron/dhcp/a41a152b-f4be-46d2-978a-2511131ed3ae/opts Dec 15 05:08:50 localhost neutron_dhcp_agent[267542]: 2025-12-15 10:08:50.181 267546 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-12-15T10:08:46Z, description=, device_id=, device_owner=, dns_assignment=[], dns_domain=, dns_name=, extra_dhcp_opts=[, , ], fixed_ips=[], id=84333ca0-de3e-48d1-9fd2-de9237ba8d6b, ip_allocation=immediate, mac_address=fa:16:3e:19:52:82, name=tempest-new-port-name-1912071320, network_id=a41a152b-f4be-46d2-978a-2511131ed3ae, 
port_security_enabled=True, project_id=fe9f9243b8b846fea6a4b2d81c50a247, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=2, security_groups=['35926619-5c17-4d8f-973a-f16e1b692d86'], standard_attr_id=2984, status=DOWN, tags=[], tenant_id=fe9f9243b8b846fea6a4b2d81c50a247, updated_at=2025-12-15T10:08:50Z on network a41a152b-f4be-46d2-978a-2511131ed3ae#033[00m Dec 15 05:08:50 localhost dnsmasq[331263]: read /var/lib/neutron/dhcp/a41a152b-f4be-46d2-978a-2511131ed3ae/addn_hosts - 1 addresses Dec 15 05:08:50 localhost dnsmasq-dhcp[331263]: read /var/lib/neutron/dhcp/a41a152b-f4be-46d2-978a-2511131ed3ae/host Dec 15 05:08:50 localhost dnsmasq-dhcp[331263]: read /var/lib/neutron/dhcp/a41a152b-f4be-46d2-978a-2511131ed3ae/opts Dec 15 05:08:50 localhost podman[331395]: 2025-12-15 10:08:50.405129471 +0000 UTC m=+0.059983223 container kill e4a3d9d8b47e1c6a046ece3d2f29b3b6d779d04bb53ebc58d57146e500e77115 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-a41a152b-f4be-46d2-978a-2511131ed3ae, tcib_managed=true, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3) Dec 15 05:08:50 localhost neutron_dhcp_agent[267542]: 2025-12-15 10:08:50.662 267546 INFO neutron.agent.dhcp.agent [None req-f11c32b4-9a6b-415f-803e-5eaf383faaa1 - - - - - -] DHCP configuration for ports {'84333ca0-de3e-48d1-9fd2-de9237ba8d6b'} is completed#033[00m Dec 15 05:08:51 localhost nova_compute[286344]: 2025-12-15 10:08:51.113 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:08:51 localhost ceph-mon[298913]: mon.np0005559462@0(leader).osd e201 
_set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104 Dec 15 05:08:51 localhost ceph-mon[298913]: mon.np0005559462@0(leader).osd e201 do_prune osdmap full prune enabled Dec 15 05:08:51 localhost ceph-mon[298913]: mon.np0005559462@0(leader).osd e202 e202: 6 total, 6 up, 6 in Dec 15 05:08:51 localhost ceph-mon[298913]: log_channel(cluster) log [DBG] : osdmap e202: 6 total, 6 up, 6 in Dec 15 05:08:51 localhost ovn_metadata_agent[160585]: 2025-12-15 10:08:51.484 160590 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Dec 15 05:08:51 localhost ovn_metadata_agent[160585]: 2025-12-15 10:08:51.484 160590 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Dec 15 05:08:51 localhost ovn_metadata_agent[160585]: 2025-12-15 10:08:51.485 160590 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Dec 15 05:08:51 localhost neutron_sriov_agent[260044]: 2025-12-15 10:08:51.546 2 INFO neutron.agent.securitygroups_rpc [None req-84849762-0354-49f2-98cc-352c0ecfc104 2a4c858f01e7402bb63c15d45b9aef6a fe9f9243b8b846fea6a4b2d81c50a247 - - default default] Security group member updated ['35926619-5c17-4d8f-973a-f16e1b692d86']#033[00m Dec 15 05:08:51 localhost dnsmasq[331263]: read /var/lib/neutron/dhcp/a41a152b-f4be-46d2-978a-2511131ed3ae/addn_hosts - 0 addresses Dec 15 05:08:51 localhost dnsmasq-dhcp[331263]: read 
/var/lib/neutron/dhcp/a41a152b-f4be-46d2-978a-2511131ed3ae/host Dec 15 05:08:51 localhost dnsmasq-dhcp[331263]: read /var/lib/neutron/dhcp/a41a152b-f4be-46d2-978a-2511131ed3ae/opts Dec 15 05:08:51 localhost podman[331433]: 2025-12-15 10:08:51.925740597 +0000 UTC m=+0.118311568 container kill e4a3d9d8b47e1c6a046ece3d2f29b3b6d779d04bb53ebc58d57146e500e77115 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-a41a152b-f4be-46d2-978a-2511131ed3ae, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image) Dec 15 05:08:52 localhost dnsmasq[331263]: exiting on receipt of SIGTERM Dec 15 05:08:52 localhost podman[331472]: 2025-12-15 10:08:52.522865504 +0000 UTC m=+0.062797809 container kill e4a3d9d8b47e1c6a046ece3d2f29b3b6d779d04bb53ebc58d57146e500e77115 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-a41a152b-f4be-46d2-978a-2511131ed3ae, tcib_managed=true, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.license=GPLv2) Dec 15 05:08:52 localhost systemd[1]: libpod-e4a3d9d8b47e1c6a046ece3d2f29b3b6d779d04bb53ebc58d57146e500e77115.scope: Deactivated successfully. 
Dec 15 05:08:52 localhost ovn_controller[154603]: 2025-12-15T10:08:52Z|00431|binding|INFO|Removing iface tapbae8f778-7c ovn-installed in OVS Dec 15 05:08:52 localhost ovn_controller[154603]: 2025-12-15T10:08:52Z|00432|binding|INFO|Removing lport bae8f778-7c77-4071-9f9a-c4137b3157cd ovn-installed in OVS Dec 15 05:08:52 localhost ovn_metadata_agent[160585]: 2025-12-15 10:08:52.545 160590 WARNING neutron.agent.ovn.metadata.agent [-] Removing non-external type port f348c013-3832-4d80-888d-89fbd46825c3 with type ""#033[00m Dec 15 05:08:52 localhost nova_compute[286344]: 2025-12-15 10:08:52.547 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:08:52 localhost ovn_metadata_agent[160585]: 2025-12-15 10:08:52.546 160590 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched DELETE: PortBindingDeletedEvent(events=('delete',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[True], options={'requested-chassis': 'np0005559462.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.3/28', 'neutron:device_id': 'dhcpce287b4b-a043-5ed9-8e08-4fd555d768a2-a41a152b-f4be-46d2-978a-2511131ed3ae', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-a41a152b-f4be-46d2-978a-2511131ed3ae', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'fe9f9243b8b846fea6a4b2d81c50a247', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'np0005559462.localdomain'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=9e1a5105-1c60-4aa8-a286-144f714eda94, chassis=[], 
tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=bae8f778-7c77-4071-9f9a-c4137b3157cd) old= matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Dec 15 05:08:52 localhost ovn_metadata_agent[160585]: 2025-12-15 10:08:52.551 160590 INFO neutron.agent.ovn.metadata.agent [-] Port bae8f778-7c77-4071-9f9a-c4137b3157cd in datapath a41a152b-f4be-46d2-978a-2511131ed3ae unbound from our chassis#033[00m Dec 15 05:08:52 localhost nova_compute[286344]: 2025-12-15 10:08:52.554 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:08:52 localhost ovn_metadata_agent[160585]: 2025-12-15 10:08:52.556 160590 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network a41a152b-f4be-46d2-978a-2511131ed3ae, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m Dec 15 05:08:52 localhost ovn_metadata_agent[160585]: 2025-12-15 10:08:52.557 160858 DEBUG oslo.privsep.daemon [-] privsep: reply[820b1628-11a9-4735-aa17-4114635d70dd]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Dec 15 05:08:52 localhost podman[331487]: 2025-12-15 10:08:52.613079727 +0000 UTC m=+0.071247839 container died e4a3d9d8b47e1c6a046ece3d2f29b3b6d779d04bb53ebc58d57146e500e77115 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-a41a152b-f4be-46d2-978a-2511131ed3ae, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.build-date=20251202) Dec 15 05:08:52 localhost podman[331487]: 2025-12-15 10:08:52.648667384 
+0000 UTC m=+0.106835436 container cleanup e4a3d9d8b47e1c6a046ece3d2f29b3b6d779d04bb53ebc58d57146e500e77115 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-a41a152b-f4be-46d2-978a-2511131ed3ae, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true) Dec 15 05:08:52 localhost systemd[1]: libpod-conmon-e4a3d9d8b47e1c6a046ece3d2f29b3b6d779d04bb53ebc58d57146e500e77115.scope: Deactivated successfully. Dec 15 05:08:52 localhost podman[331488]: 2025-12-15 10:08:52.686299937 +0000 UTC m=+0.138618770 container remove e4a3d9d8b47e1c6a046ece3d2f29b3b6d779d04bb53ebc58d57146e500e77115 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-a41a152b-f4be-46d2-978a-2511131ed3ae, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3) Dec 15 05:08:52 localhost nova_compute[286344]: 2025-12-15 10:08:52.699 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:08:52 localhost kernel: device tapbae8f778-7c left promiscuous mode Dec 15 05:08:52 localhost nova_compute[286344]: 2025-12-15 10:08:52.710 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:08:52 localhost neutron_dhcp_agent[267542]: 2025-12-15 10:08:52.730 267546 INFO 
neutron.agent.dhcp.agent [None req-21863670-6ce3-42e4-b616-bf88e20d26e5 - - - - - -] Network not present, action: clean_devices, action_kwargs: {}#033[00m Dec 15 05:08:52 localhost neutron_dhcp_agent[267542]: 2025-12-15 10:08:52.817 267546 INFO neutron.agent.dhcp.agent [-] Network not present, action: clean_devices, action_kwargs: {}#033[00m Dec 15 05:08:52 localhost ceph-mon[298913]: mon.np0005559462@0(leader) e17 handle_command mon_command({"prefix": "auth rm", "entity": "client.tempest-cephx-id-537145655"} v 0) Dec 15 05:08:52 localhost ceph-mon[298913]: log_channel(audit) log [INF] : from='mgr.34411 ' entity='mgr.np0005559464.aomnqe' cmd={"prefix": "auth rm", "entity": "client.tempest-cephx-id-537145655"} : dispatch Dec 15 05:08:52 localhost systemd[1]: Started /usr/bin/podman healthcheck run 4d46c406a3855fd1c322e5d86c048188ea5865daa07ba23aaebacc7879668923. Dec 15 05:08:52 localhost ceph-mon[298913]: log_channel(audit) log [INF] : from='mgr.34411 ' entity='mgr.np0005559464.aomnqe' cmd='[{"prefix": "auth rm", "entity": "client.tempest-cephx-id-537145655"}]': finished Dec 15 05:08:52 localhost systemd[1]: var-lib-containers-storage-overlay-5fdf99fb567f8a2309159ee28f527c989ccc1da8637db79cb532747f60904a66-merged.mount: Deactivated successfully. Dec 15 05:08:52 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-e4a3d9d8b47e1c6a046ece3d2f29b3b6d779d04bb53ebc58d57146e500e77115-userdata-shm.mount: Deactivated successfully. Dec 15 05:08:52 localhost systemd[1]: run-netns-qdhcp\x2da41a152b\x2df4be\x2d46d2\x2d978a\x2d2511131ed3ae.mount: Deactivated successfully. 
Dec 15 05:08:53 localhost podman[331517]: 2025-12-15 10:08:53.006324739 +0000 UTC m=+0.082190366 container health_status 4d46c406a3855fd1c322e5d86c048188ea5865daa07ba23aaebacc7879668923 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '07f3af15d79276418f54a8dde2fc251064527c3571430e14af77aaa2c01f1193-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.build-date=20251202, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2) Dec 15 05:08:53 localhost 
ovn_controller[154603]: 2025-12-15T10:08:53Z|00433|binding|INFO|Releasing lport b35254ad-12eb-47bb-92be-44fefe0694f0 from this chassis (sb_readonly=0) Dec 15 05:08:53 localhost podman[331517]: 2025-12-15 10:08:53.040471587 +0000 UTC m=+0.116337184 container exec_died 4d46c406a3855fd1c322e5d86c048188ea5865daa07ba23aaebacc7879668923 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '07f3af15d79276418f54a8dde2fc251064527c3571430e14af77aaa2c01f1193-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, 
tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ovn_metadata_agent, tcib_managed=true) Dec 15 05:08:53 localhost systemd[1]: 4d46c406a3855fd1c322e5d86c048188ea5865daa07ba23aaebacc7879668923.service: Deactivated successfully. Dec 15 05:08:53 localhost nova_compute[286344]: 2025-12-15 10:08:53.098 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:08:53 localhost neutron_dhcp_agent[267542]: 2025-12-15 10:08:53.202 267546 INFO neutron.agent.linux.ip_lib [None req-ba89fcc1-4db8-4a75-b3f5-9c5c87a9c6fe - - - - - -] Device tap5db8e4c9-03 cannot be used as it has no MAC address#033[00m Dec 15 05:08:53 localhost nova_compute[286344]: 2025-12-15 10:08:53.228 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:08:53 localhost kernel: device tap5db8e4c9-03 entered promiscuous mode Dec 15 05:08:53 localhost nova_compute[286344]: 2025-12-15 10:08:53.235 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:08:53 localhost ovn_controller[154603]: 2025-12-15T10:08:53Z|00434|binding|INFO|Claiming lport 5db8e4c9-03a7-4e72-a832-ae57ae17e27e for this chassis. Dec 15 05:08:53 localhost ovn_controller[154603]: 2025-12-15T10:08:53Z|00435|binding|INFO|5db8e4c9-03a7-4e72-a832-ae57ae17e27e: Claiming unknown Dec 15 05:08:53 localhost NetworkManager[5963]: [1765793333.2378] manager: (tap5db8e4c9-03): new Generic device (/org/freedesktop/NetworkManager/Devices/68) Dec 15 05:08:53 localhost systemd-udevd[331543]: Network interface NamePolicy= disabled on kernel command line. 
Dec 15 05:08:53 localhost ovn_metadata_agent[160585]: 2025-12-15 10:08:53.258 160590 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005559462.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8::2/64', 'neutron:device_id': 'dhcpce287b4b-a043-5ed9-8e08-4fd555d768a2-70fa6579-9f68-46bc-93c5-9467cf1f3dce', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-70fa6579-9f68-46bc-93c5-9467cf1f3dce', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '1d15e551a54e4d0ba499cdbee6530731', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=6af7c49d-2dd8-448f-a638-35318f812f6d, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=5db8e4c9-03a7-4e72-a832-ae57ae17e27e) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Dec 15 05:08:53 localhost ovn_metadata_agent[160585]: 2025-12-15 10:08:53.260 160590 INFO neutron.agent.ovn.metadata.agent [-] Port 5db8e4c9-03a7-4e72-a832-ae57ae17e27e in datapath 70fa6579-9f68-46bc-93c5-9467cf1f3dce bound to our chassis#033[00m Dec 15 05:08:53 localhost ovn_metadata_agent[160585]: 2025-12-15 10:08:53.261 160590 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network 70fa6579-9f68-46bc-93c5-9467cf1f3dce or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params 
/usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599#033[00m Dec 15 05:08:53 localhost ovn_metadata_agent[160585]: 2025-12-15 10:08:53.262 160858 DEBUG oslo.privsep.daemon [-] privsep: reply[443223d2-c6b2-43be-8d2c-603386ce6f32]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Dec 15 05:08:53 localhost journal[231322]: ethtool ioctl error on tap5db8e4c9-03: No such device Dec 15 05:08:53 localhost nova_compute[286344]: 2025-12-15 10:08:53.272 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:08:53 localhost ovn_controller[154603]: 2025-12-15T10:08:53Z|00436|binding|INFO|Setting lport 5db8e4c9-03a7-4e72-a832-ae57ae17e27e ovn-installed in OVS Dec 15 05:08:53 localhost ovn_controller[154603]: 2025-12-15T10:08:53Z|00437|binding|INFO|Setting lport 5db8e4c9-03a7-4e72-a832-ae57ae17e27e up in Southbound Dec 15 05:08:53 localhost journal[231322]: ethtool ioctl error on tap5db8e4c9-03: No such device Dec 15 05:08:53 localhost nova_compute[286344]: 2025-12-15 10:08:53.277 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:08:53 localhost journal[231322]: ethtool ioctl error on tap5db8e4c9-03: No such device Dec 15 05:08:53 localhost journal[231322]: ethtool ioctl error on tap5db8e4c9-03: No such device Dec 15 05:08:53 localhost journal[231322]: ethtool ioctl error on tap5db8e4c9-03: No such device Dec 15 05:08:53 localhost journal[231322]: ethtool ioctl error on tap5db8e4c9-03: No such device Dec 15 05:08:53 localhost journal[231322]: ethtool ioctl error on tap5db8e4c9-03: No such device Dec 15 05:08:53 localhost journal[231322]: ethtool ioctl error on tap5db8e4c9-03: No such device Dec 15 05:08:53 localhost nova_compute[286344]: 2025-12-15 10:08:53.322 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 
__log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:08:53 localhost nova_compute[286344]: 2025-12-15 10:08:53.352 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:08:53 localhost ceph-mon[298913]: from='mgr.34411 172.18.0.108:0/929118679' entity='mgr.np0005559464.aomnqe' cmd={"prefix": "auth get", "entity": "client.tempest-cephx-id-537145655", "format": "json"} : dispatch Dec 15 05:08:53 localhost ceph-mon[298913]: from='mgr.34411 172.18.0.108:0/929118679' entity='mgr.np0005559464.aomnqe' cmd={"prefix": "auth rm", "entity": "client.tempest-cephx-id-537145655"} : dispatch Dec 15 05:08:53 localhost ceph-mon[298913]: from='mgr.34411 ' entity='mgr.np0005559464.aomnqe' cmd={"prefix": "auth rm", "entity": "client.tempest-cephx-id-537145655"} : dispatch Dec 15 05:08:53 localhost ceph-mon[298913]: from='mgr.34411 ' entity='mgr.np0005559464.aomnqe' cmd='[{"prefix": "auth rm", "entity": "client.tempest-cephx-id-537145655"}]': finished Dec 15 05:08:53 localhost nova_compute[286344]: 2025-12-15 10:08:53.733 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:08:54 localhost neutron_sriov_agent[260044]: 2025-12-15 10:08:54.118 2 INFO neutron.agent.securitygroups_rpc [None req-fc95b02e-3f3a-44e4-9d8b-0c5818084aa4 af329bd85e524c1ea41f4255aa5eb9d8 1d15e551a54e4d0ba499cdbee6530731 - - default default] Security group member updated ['5c37dc84-f80b-4e12-8d9d-49a8b22167f0']#033[00m Dec 15 05:08:54 localhost podman[331614]: Dec 15 05:08:54 localhost podman[331614]: 2025-12-15 10:08:54.318787596 +0000 UTC m=+0.108972394 container create 37d3f4237373b471dbd44dc197f2c89b3e3c0a54b31d8d5dbf3202e19b62df8e (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-70fa6579-9f68-46bc-93c5-9467cf1f3dce, 
tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3) Dec 15 05:08:54 localhost ceph-mon[298913]: mon.np0005559462@0(leader) e17 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) Dec 15 05:08:54 localhost ceph-mon[298913]: log_channel(audit) log [DBG] : from='client.25625 172.18.0.34:0/382777224' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch Dec 15 05:08:54 localhost podman[331614]: 2025-12-15 10:08:54.261269141 +0000 UTC m=+0.051453999 image pull quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified Dec 15 05:08:54 localhost systemd[1]: Started libpod-conmon-37d3f4237373b471dbd44dc197f2c89b3e3c0a54b31d8d5dbf3202e19b62df8e.scope. Dec 15 05:08:54 localhost systemd[1]: Started libcrun container. 
Dec 15 05:08:54 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/5eaa9c390b51dfdc12db9e6a24f11074a6445ebed0e601c261b01fc78b33fce9/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff) Dec 15 05:08:54 localhost podman[331614]: 2025-12-15 10:08:54.394915416 +0000 UTC m=+0.185100174 container init 37d3f4237373b471dbd44dc197f2c89b3e3c0a54b31d8d5dbf3202e19b62df8e (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-70fa6579-9f68-46bc-93c5-9467cf1f3dce, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251202, tcib_managed=true, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3) Dec 15 05:08:54 localhost systemd[1]: tmp-crun.X1SXUc.mount: Deactivated successfully. Dec 15 05:08:54 localhost podman[331614]: 2025-12-15 10:08:54.409208094 +0000 UTC m=+0.199392862 container start 37d3f4237373b471dbd44dc197f2c89b3e3c0a54b31d8d5dbf3202e19b62df8e (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-70fa6579-9f68-46bc-93c5-9467cf1f3dce, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251202, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2) Dec 15 05:08:54 localhost dnsmasq[331631]: started, version 2.85 cachesize 150 Dec 15 05:08:54 localhost dnsmasq[331631]: DNS service limited to local subnets Dec 15 05:08:54 localhost dnsmasq[331631]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify 
dumpfile Dec 15 05:08:54 localhost dnsmasq[331631]: warning: no upstream servers configured Dec 15 05:08:54 localhost dnsmasq-dhcp[331631]: DHCPv6, static leases only on 2001:db8::, lease time 1d Dec 15 05:08:54 localhost dnsmasq[331631]: read /var/lib/neutron/dhcp/70fa6579-9f68-46bc-93c5-9467cf1f3dce/addn_hosts - 0 addresses Dec 15 05:08:54 localhost dnsmasq-dhcp[331631]: read /var/lib/neutron/dhcp/70fa6579-9f68-46bc-93c5-9467cf1f3dce/host Dec 15 05:08:54 localhost dnsmasq-dhcp[331631]: read /var/lib/neutron/dhcp/70fa6579-9f68-46bc-93c5-9467cf1f3dce/opts Dec 15 05:08:54 localhost neutron_dhcp_agent[267542]: 2025-12-15 10:08:54.465 267546 INFO neutron.agent.dhcp.agent [None req-ba89fcc1-4db8-4a75-b3f5-9c5c87a9c6fe - - - - - -] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-12-15T10:08:52Z, description=, device_id=, device_owner=, dns_assignment=[], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[], id=b544488a-9c22-413e-bd43-831657f60961, ip_allocation=immediate, mac_address=fa:16:3e:3d:59:81, name=tempest-PortsIpV6TestJSON-888039072, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-12-15T10:08:49Z, description=, dns_domain=, id=70fa6579-9f68-46bc-93c5-9467cf1f3dce, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-PortsIpV6TestJSON-1215393156, port_security_enabled=True, project_id=1d15e551a54e4d0ba499cdbee6530731, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=37684, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=2995, status=ACTIVE, subnets=['d6e40ba7-113d-4621-9986-0b3b57cf87b2'], tags=[], tenant_id=1d15e551a54e4d0ba499cdbee6530731, updated_at=2025-12-15T10:08:52Z, vlan_transparent=None, 
network_id=70fa6579-9f68-46bc-93c5-9467cf1f3dce, port_security_enabled=True, project_id=1d15e551a54e4d0ba499cdbee6530731, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=['5c37dc84-f80b-4e12-8d9d-49a8b22167f0'], standard_attr_id=3001, status=DOWN, tags=[], tenant_id=1d15e551a54e4d0ba499cdbee6530731, updated_at=2025-12-15T10:08:53Z on network 70fa6579-9f68-46bc-93c5-9467cf1f3dce#033[00m Dec 15 05:08:54 localhost neutron_dhcp_agent[267542]: 2025-12-15 10:08:54.572 267546 INFO neutron.agent.dhcp.agent [None req-cc39dfa2-9ba0-4805-a6a4-531e7483969c - - - - - -] DHCP configuration for ports {'c13de052-b465-4bf6-809e-c87d3db32b95'} is completed#033[00m Dec 15 05:08:54 localhost dnsmasq[331631]: read /var/lib/neutron/dhcp/70fa6579-9f68-46bc-93c5-9467cf1f3dce/addn_hosts - 1 addresses Dec 15 05:08:54 localhost dnsmasq-dhcp[331631]: read /var/lib/neutron/dhcp/70fa6579-9f68-46bc-93c5-9467cf1f3dce/host Dec 15 05:08:54 localhost dnsmasq-dhcp[331631]: read /var/lib/neutron/dhcp/70fa6579-9f68-46bc-93c5-9467cf1f3dce/opts Dec 15 05:08:54 localhost podman[331650]: 2025-12-15 10:08:54.65436221 +0000 UTC m=+0.069520011 container kill 37d3f4237373b471dbd44dc197f2c89b3e3c0a54b31d8d5dbf3202e19b62df8e (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-70fa6579-9f68-46bc-93c5-9467cf1f3dce, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image) Dec 15 05:08:54 localhost neutron_sriov_agent[260044]: 2025-12-15 10:08:54.893 2 INFO neutron.agent.securitygroups_rpc [None req-45628020-f5ab-45a7-9ce2-b46f6aab88b1 af329bd85e524c1ea41f4255aa5eb9d8 1d15e551a54e4d0ba499cdbee6530731 - - default default] 
Security group member updated ['5c37dc84-f80b-4e12-8d9d-49a8b22167f0']#033[00m Dec 15 05:08:54 localhost neutron_dhcp_agent[267542]: 2025-12-15 10:08:54.948 267546 INFO neutron.agent.dhcp.agent [None req-0d8ebd4f-2363-47df-b7a1-7a95f2c797bc - - - - - -] DHCP configuration for ports {'b544488a-9c22-413e-bd43-831657f60961'} is completed#033[00m Dec 15 05:08:55 localhost podman[331688]: 2025-12-15 10:08:55.149479542 +0000 UTC m=+0.063740674 container kill 37d3f4237373b471dbd44dc197f2c89b3e3c0a54b31d8d5dbf3202e19b62df8e (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-70fa6579-9f68-46bc-93c5-9467cf1f3dce, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true) Dec 15 05:08:55 localhost dnsmasq[331631]: read /var/lib/neutron/dhcp/70fa6579-9f68-46bc-93c5-9467cf1f3dce/addn_hosts - 0 addresses Dec 15 05:08:55 localhost dnsmasq-dhcp[331631]: read /var/lib/neutron/dhcp/70fa6579-9f68-46bc-93c5-9467cf1f3dce/host Dec 15 05:08:55 localhost dnsmasq-dhcp[331631]: read /var/lib/neutron/dhcp/70fa6579-9f68-46bc-93c5-9467cf1f3dce/opts Dec 15 05:08:55 localhost ceph-osd[31375]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [L] New memtable created with log file: #44. Immutable memtables: 1. 
Dec 15 05:08:55 localhost dnsmasq[331631]: exiting on receipt of SIGTERM Dec 15 05:08:55 localhost podman[331725]: 2025-12-15 10:08:55.950838462 +0000 UTC m=+0.075778402 container kill 37d3f4237373b471dbd44dc197f2c89b3e3c0a54b31d8d5dbf3202e19b62df8e (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-70fa6579-9f68-46bc-93c5-9467cf1f3dce, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team) Dec 15 05:08:55 localhost systemd[1]: libpod-37d3f4237373b471dbd44dc197f2c89b3e3c0a54b31d8d5dbf3202e19b62df8e.scope: Deactivated successfully. Dec 15 05:08:56 localhost podman[331738]: 2025-12-15 10:08:56.0283585 +0000 UTC m=+0.057819294 container died 37d3f4237373b471dbd44dc197f2c89b3e3c0a54b31d8d5dbf3202e19b62df8e (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-70fa6579-9f68-46bc-93c5-9467cf1f3dce, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202) Dec 15 05:08:56 localhost ceph-mon[298913]: mon.np0005559462@0(leader) e17 handle_command mon_command({"prefix": "auth get-or-create", "entity": "client.tempest-cephx-id-537145655", "caps": ["mds", "allow rw path=/volumes/_nogroup/d45816ae-1301-449f-a02a-41f15745dceb/4f63897f-7a83-4942-b630-cfc3d31a8b78", "osd", "allow rw pool=manila_data namespace=fsvolumens_d45816ae-1301-449f-a02a-41f15745dceb", "mon", "allow r"], "format": "json"} v 0) Dec 15 05:08:56 
localhost ceph-mon[298913]: log_channel(audit) log [INF] : from='mgr.34411 ' entity='mgr.np0005559464.aomnqe' cmd={"prefix": "auth get-or-create", "entity": "client.tempest-cephx-id-537145655", "caps": ["mds", "allow rw path=/volumes/_nogroup/d45816ae-1301-449f-a02a-41f15745dceb/4f63897f-7a83-4942-b630-cfc3d31a8b78", "osd", "allow rw pool=manila_data namespace=fsvolumens_d45816ae-1301-449f-a02a-41f15745dceb", "mon", "allow r"], "format": "json"} : dispatch Dec 15 05:08:56 localhost ceph-mon[298913]: log_channel(audit) log [INF] : from='mgr.34411 ' entity='mgr.np0005559464.aomnqe' cmd='[{"prefix": "auth get-or-create", "entity": "client.tempest-cephx-id-537145655", "caps": ["mds", "allow rw path=/volumes/_nogroup/d45816ae-1301-449f-a02a-41f15745dceb/4f63897f-7a83-4942-b630-cfc3d31a8b78", "osd", "allow rw pool=manila_data namespace=fsvolumens_d45816ae-1301-449f-a02a-41f15745dceb", "mon", "allow r"], "format": "json"}]': finished Dec 15 05:08:56 localhost ovn_controller[154603]: 2025-12-15T10:08:56Z|00438|binding|INFO|Removing iface tap5db8e4c9-03 ovn-installed in OVS Dec 15 05:08:56 localhost ovn_controller[154603]: 2025-12-15T10:08:56Z|00439|binding|INFO|Removing lport 5db8e4c9-03a7-4e72-a832-ae57ae17e27e ovn-installed in OVS Dec 15 05:08:56 localhost ovn_metadata_agent[160585]: 2025-12-15 10:08:56.094 160590 WARNING neutron.agent.ovn.metadata.agent [-] Removing non-external type port bfc3f2ce-b8d7-4b30-b355-624a39c112c1 with type ""#033[00m Dec 15 05:08:56 localhost ovn_metadata_agent[160585]: 2025-12-15 10:08:56.095 160590 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched DELETE: PortBindingDeletedEvent(events=('delete',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[True], options={'requested-chassis': 'np0005559462.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': 
'2001:db8::2/64', 'neutron:device_id': 'dhcpce287b4b-a043-5ed9-8e08-4fd555d768a2-70fa6579-9f68-46bc-93c5-9467cf1f3dce', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-70fa6579-9f68-46bc-93c5-9467cf1f3dce', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '1d15e551a54e4d0ba499cdbee6530731', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'np0005559462.localdomain'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=6af7c49d-2dd8-448f-a638-35318f812f6d, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=5db8e4c9-03a7-4e72-a832-ae57ae17e27e) old= matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Dec 15 05:08:56 localhost ovn_metadata_agent[160585]: 2025-12-15 10:08:56.096 160590 INFO neutron.agent.ovn.metadata.agent [-] Port 5db8e4c9-03a7-4e72-a832-ae57ae17e27e in datapath 70fa6579-9f68-46bc-93c5-9467cf1f3dce unbound from our chassis#033[00m Dec 15 05:08:56 localhost ovn_metadata_agent[160585]: 2025-12-15 10:08:56.098 160590 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network 70fa6579-9f68-46bc-93c5-9467cf1f3dce or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599#033[00m Dec 15 05:08:56 localhost nova_compute[286344]: 2025-12-15 10:08:56.099 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:08:56 localhost ovn_metadata_agent[160585]: 2025-12-15 10:08:56.100 160858 DEBUG oslo.privsep.daemon [-] privsep: reply[0243a0dd-21f7-4269-80ab-f1abbc4c311f]: (4, False) _call_back 
/usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Dec 15 05:08:56 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-37d3f4237373b471dbd44dc197f2c89b3e3c0a54b31d8d5dbf3202e19b62df8e-userdata-shm.mount: Deactivated successfully. Dec 15 05:08:56 localhost podman[331738]: 2025-12-15 10:08:56.110310118 +0000 UTC m=+0.139770922 container cleanup 37d3f4237373b471dbd44dc197f2c89b3e3c0a54b31d8d5dbf3202e19b62df8e (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-70fa6579-9f68-46bc-93c5-9467cf1f3dce, org.label-schema.build-date=20251202, tcib_managed=true, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb) Dec 15 05:08:56 localhost systemd[1]: libpod-conmon-37d3f4237373b471dbd44dc197f2c89b3e3c0a54b31d8d5dbf3202e19b62df8e.scope: Deactivated successfully. 
Dec 15 05:08:56 localhost nova_compute[286344]: 2025-12-15 10:08:56.115 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:08:56 localhost podman[331740]: 2025-12-15 10:08:56.129029647 +0000 UTC m=+0.155337445 container remove 37d3f4237373b471dbd44dc197f2c89b3e3c0a54b31d8d5dbf3202e19b62df8e (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-70fa6579-9f68-46bc-93c5-9467cf1f3dce, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image) Dec 15 05:08:56 localhost nova_compute[286344]: 2025-12-15 10:08:56.139 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:08:56 localhost kernel: device tap5db8e4c9-03 left promiscuous mode Dec 15 05:08:56 localhost nova_compute[286344]: 2025-12-15 10:08:56.154 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:08:56 localhost neutron_dhcp_agent[267542]: 2025-12-15 10:08:56.254 267546 INFO neutron.agent.dhcp.agent [None req-0f4b6168-e851-4ae4-aa88-691713450ca6 - - - - - -] Network not present, action: clean_devices, action_kwargs: {}#033[00m Dec 15 05:08:56 localhost ceph-mon[298913]: mon.np0005559462@0(leader).osd e202 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104 Dec 15 05:08:56 localhost systemd[1]: var-lib-containers-storage-overlay-5eaa9c390b51dfdc12db9e6a24f11074a6445ebed0e601c261b01fc78b33fce9-merged.mount: Deactivated successfully. 
Dec 15 05:08:56 localhost systemd[1]: run-netns-qdhcp\x2d70fa6579\x2d9f68\x2d46bc\x2d93c5\x2d9467cf1f3dce.mount: Deactivated successfully. Dec 15 05:08:56 localhost neutron_dhcp_agent[267542]: 2025-12-15 10:08:56.452 267546 INFO neutron.agent.dhcp.agent [-] Network not present, action: clean_devices, action_kwargs: {}#033[00m Dec 15 05:08:56 localhost ceph-mon[298913]: from='mgr.34411 172.18.0.108:0/929118679' entity='mgr.np0005559464.aomnqe' cmd={"prefix": "auth get", "entity": "client.tempest-cephx-id-537145655", "format": "json"} : dispatch Dec 15 05:08:56 localhost ceph-mon[298913]: from='mgr.34411 172.18.0.108:0/929118679' entity='mgr.np0005559464.aomnqe' cmd={"prefix": "auth get-or-create", "entity": "client.tempest-cephx-id-537145655", "caps": ["mds", "allow rw path=/volumes/_nogroup/d45816ae-1301-449f-a02a-41f15745dceb/4f63897f-7a83-4942-b630-cfc3d31a8b78", "osd", "allow rw pool=manila_data namespace=fsvolumens_d45816ae-1301-449f-a02a-41f15745dceb", "mon", "allow r"], "format": "json"} : dispatch Dec 15 05:08:56 localhost ceph-mon[298913]: from='mgr.34411 ' entity='mgr.np0005559464.aomnqe' cmd={"prefix": "auth get-or-create", "entity": "client.tempest-cephx-id-537145655", "caps": ["mds", "allow rw path=/volumes/_nogroup/d45816ae-1301-449f-a02a-41f15745dceb/4f63897f-7a83-4942-b630-cfc3d31a8b78", "osd", "allow rw pool=manila_data namespace=fsvolumens_d45816ae-1301-449f-a02a-41f15745dceb", "mon", "allow r"], "format": "json"} : dispatch Dec 15 05:08:56 localhost ceph-mon[298913]: from='mgr.34411 ' entity='mgr.np0005559464.aomnqe' cmd='[{"prefix": "auth get-or-create", "entity": "client.tempest-cephx-id-537145655", "caps": ["mds", "allow rw path=/volumes/_nogroup/d45816ae-1301-449f-a02a-41f15745dceb/4f63897f-7a83-4942-b630-cfc3d31a8b78", "osd", "allow rw pool=manila_data namespace=fsvolumens_d45816ae-1301-449f-a02a-41f15745dceb", "mon", "allow r"], "format": "json"}]': finished Dec 15 05:08:56 localhost nova_compute[286344]: 2025-12-15 10:08:56.490 286348 DEBUG 
ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:08:56 localhost ovn_controller[154603]: 2025-12-15T10:08:56Z|00440|binding|INFO|Releasing lport b35254ad-12eb-47bb-92be-44fefe0694f0 from this chassis (sb_readonly=0) Dec 15 05:08:56 localhost nova_compute[286344]: 2025-12-15 10:08:56.685 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:08:57 localhost ceph-mon[298913]: mon.np0005559462@0(leader).osd e202 do_prune osdmap full prune enabled Dec 15 05:08:57 localhost ceph-mon[298913]: mon.np0005559462@0(leader).osd e203 e203: 6 total, 6 up, 6 in Dec 15 05:08:57 localhost ceph-mon[298913]: log_channel(cluster) log [DBG] : osdmap e203: 6 total, 6 up, 6 in Dec 15 05:08:57 localhost ceph-osd[32311]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [L] New memtable created with log file: #44. Immutable memtables: 1. Dec 15 05:08:57 localhost ceph-mon[298913]: mon.np0005559462@0(leader) e17 handle_command mon_command({"prefix": "auth get-or-create", "entity": "client.eve49", "caps": ["mds", "allow rw path=/volumes/_nogroup/71f8a216-0ace-48cb-9799-bcf4f361aa30/cce16599-7ae9-4668-a71c-5b857423d725", "osd", "allow rw pool=manila_data namespace=fsvolumens_71f8a216-0ace-48cb-9799-bcf4f361aa30", "mon", "allow r"], "format": "json"} v 0) Dec 15 05:08:57 localhost ceph-mon[298913]: log_channel(audit) log [INF] : from='mgr.34411 ' entity='mgr.np0005559464.aomnqe' cmd={"prefix": "auth get-or-create", "entity": "client.eve49", "caps": ["mds", "allow rw path=/volumes/_nogroup/71f8a216-0ace-48cb-9799-bcf4f361aa30/cce16599-7ae9-4668-a71c-5b857423d725", "osd", "allow rw pool=manila_data namespace=fsvolumens_71f8a216-0ace-48cb-9799-bcf4f361aa30", "mon", "allow r"], "format": "json"} : dispatch Dec 15 05:08:57 localhost ceph-mon[298913]: log_channel(audit) log [INF] : from='mgr.34411 ' 
entity='mgr.np0005559464.aomnqe' cmd='[{"prefix": "auth get-or-create", "entity": "client.eve49", "caps": ["mds", "allow rw path=/volumes/_nogroup/71f8a216-0ace-48cb-9799-bcf4f361aa30/cce16599-7ae9-4668-a71c-5b857423d725", "osd", "allow rw pool=manila_data namespace=fsvolumens_71f8a216-0ace-48cb-9799-bcf4f361aa30", "mon", "allow r"], "format": "json"}]': finished Dec 15 05:08:58 localhost ceph-mon[298913]: from='mgr.34411 172.18.0.108:0/929118679' entity='mgr.np0005559464.aomnqe' cmd={"prefix": "auth get", "entity": "client.eve49", "format": "json"} : dispatch Dec 15 05:08:58 localhost ceph-mon[298913]: from='mgr.34411 172.18.0.108:0/929118679' entity='mgr.np0005559464.aomnqe' cmd={"prefix": "auth get-or-create", "entity": "client.eve49", "caps": ["mds", "allow rw path=/volumes/_nogroup/71f8a216-0ace-48cb-9799-bcf4f361aa30/cce16599-7ae9-4668-a71c-5b857423d725", "osd", "allow rw pool=manila_data namespace=fsvolumens_71f8a216-0ace-48cb-9799-bcf4f361aa30", "mon", "allow r"], "format": "json"} : dispatch Dec 15 05:08:58 localhost ceph-mon[298913]: from='mgr.34411 ' entity='mgr.np0005559464.aomnqe' cmd={"prefix": "auth get-or-create", "entity": "client.eve49", "caps": ["mds", "allow rw path=/volumes/_nogroup/71f8a216-0ace-48cb-9799-bcf4f361aa30/cce16599-7ae9-4668-a71c-5b857423d725", "osd", "allow rw pool=manila_data namespace=fsvolumens_71f8a216-0ace-48cb-9799-bcf4f361aa30", "mon", "allow r"], "format": "json"} : dispatch Dec 15 05:08:58 localhost ceph-mon[298913]: from='mgr.34411 ' entity='mgr.np0005559464.aomnqe' cmd='[{"prefix": "auth get-or-create", "entity": "client.eve49", "caps": ["mds", "allow rw path=/volumes/_nogroup/71f8a216-0ace-48cb-9799-bcf4f361aa30/cce16599-7ae9-4668-a71c-5b857423d725", "osd", "allow rw pool=manila_data namespace=fsvolumens_71f8a216-0ace-48cb-9799-bcf4f361aa30", "mon", "allow r"], "format": "json"}]': finished Dec 15 05:08:58 localhost systemd[1]: Started /usr/bin/podman healthcheck run 
a4e94bd7aaee6c174c601060fb5f34559019ccea93fc397263ee3735731cfc1e. Dec 15 05:08:58 localhost nova_compute[286344]: 2025-12-15 10:08:58.737 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:08:58 localhost systemd[1]: tmp-crun.s1UPvw.mount: Deactivated successfully. Dec 15 05:08:58 localhost podman[331769]: 2025-12-15 10:08:58.756732166 +0000 UTC m=+0.091686224 container health_status a4e94bd7aaee6c174c601060fb5f34559019ccea93fc397263ee3735731cfc1e (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, maintainer=Navid Yaghoobi , managed_by=edpm_ansible, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter) Dec 15 05:08:58 localhost podman[331769]: 2025-12-15 10:08:58.768574058 +0000 UTC m=+0.103528116 container exec_died a4e94bd7aaee6c174c601060fb5f34559019ccea93fc397263ee3735731cfc1e (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck 
podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible) Dec 15 05:08:58 localhost systemd[1]: a4e94bd7aaee6c174c601060fb5f34559019ccea93fc397263ee3735731cfc1e.service: Deactivated successfully. Dec 15 05:08:59 localhost ceph-mon[298913]: mon.np0005559462@0(leader) e17 handle_command mon_command({"prefix": "auth rm", "entity": "client.tempest-cephx-id-537145655"} v 0) Dec 15 05:08:59 localhost ceph-mon[298913]: log_channel(audit) log [INF] : from='mgr.34411 ' entity='mgr.np0005559464.aomnqe' cmd={"prefix": "auth rm", "entity": "client.tempest-cephx-id-537145655"} : dispatch Dec 15 05:08:59 localhost ceph-mon[298913]: log_channel(audit) log [INF] : from='mgr.34411 ' entity='mgr.np0005559464.aomnqe' cmd='[{"prefix": "auth rm", "entity": "client.tempest-cephx-id-537145655"}]': finished Dec 15 05:09:00 localhost ceph-mon[298913]: from='mgr.34411 172.18.0.108:0/929118679' entity='mgr.np0005559464.aomnqe' cmd={"prefix": "auth get", "entity": "client.tempest-cephx-id-537145655", "format": "json"} : dispatch Dec 15 05:09:00 localhost ceph-mon[298913]: from='mgr.34411 172.18.0.108:0/929118679' entity='mgr.np0005559464.aomnqe' cmd={"prefix": "auth rm", "entity": "client.tempest-cephx-id-537145655"} : dispatch Dec 15 05:09:00 localhost ceph-mon[298913]: from='mgr.34411 ' entity='mgr.np0005559464.aomnqe' cmd={"prefix": "auth rm", "entity": "client.tempest-cephx-id-537145655"} : dispatch Dec 15 05:09:00 localhost ceph-mon[298913]: from='mgr.34411 ' entity='mgr.np0005559464.aomnqe' cmd='[{"prefix": "auth rm", "entity": 
"client.tempest-cephx-id-537145655"}]': finished Dec 15 05:09:00 localhost ceph-mon[298913]: mon.np0005559462@0(leader) e17 handle_command mon_command({"prefix": "auth get-or-create", "entity": "client.eve48", "caps": ["mds", "allow rw path=/volumes/_nogroup/71f8a216-0ace-48cb-9799-bcf4f361aa30/cce16599-7ae9-4668-a71c-5b857423d725", "osd", "allow rw pool=manila_data namespace=fsvolumens_71f8a216-0ace-48cb-9799-bcf4f361aa30", "mon", "allow r"], "format": "json"} v 0) Dec 15 05:09:00 localhost ceph-mon[298913]: log_channel(audit) log [INF] : from='mgr.34411 ' entity='mgr.np0005559464.aomnqe' cmd={"prefix": "auth get-or-create", "entity": "client.eve48", "caps": ["mds", "allow rw path=/volumes/_nogroup/71f8a216-0ace-48cb-9799-bcf4f361aa30/cce16599-7ae9-4668-a71c-5b857423d725", "osd", "allow rw pool=manila_data namespace=fsvolumens_71f8a216-0ace-48cb-9799-bcf4f361aa30", "mon", "allow r"], "format": "json"} : dispatch Dec 15 05:09:00 localhost ceph-mon[298913]: log_channel(audit) log [INF] : from='mgr.34411 ' entity='mgr.np0005559464.aomnqe' cmd='[{"prefix": "auth get-or-create", "entity": "client.eve48", "caps": ["mds", "allow rw path=/volumes/_nogroup/71f8a216-0ace-48cb-9799-bcf4f361aa30/cce16599-7ae9-4668-a71c-5b857423d725", "osd", "allow rw pool=manila_data namespace=fsvolumens_71f8a216-0ace-48cb-9799-bcf4f361aa30", "mon", "allow r"], "format": "json"}]': finished Dec 15 05:09:01 localhost neutron_sriov_agent[260044]: 2025-12-15 10:09:01.109 2 INFO neutron.agent.securitygroups_rpc [None req-bac5a227-3298-47cc-a85f-58789375ed07 af329bd85e524c1ea41f4255aa5eb9d8 1d15e551a54e4d0ba499cdbee6530731 - - default default] Security group member updated ['5c37dc84-f80b-4e12-8d9d-49a8b22167f0']#033[00m Dec 15 05:09:01 localhost nova_compute[286344]: 2025-12-15 10:09:01.118 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:09:01 localhost ceph-mon[298913]: 
mon.np0005559462@0(leader).osd e203 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104 Dec 15 05:09:01 localhost ceph-mon[298913]: mon.np0005559462@0(leader).osd e203 do_prune osdmap full prune enabled Dec 15 05:09:01 localhost ceph-mon[298913]: mon.np0005559462@0(leader).osd e204 e204: 6 total, 6 up, 6 in Dec 15 05:09:01 localhost ceph-mon[298913]: log_channel(cluster) log [DBG] : osdmap e204: 6 total, 6 up, 6 in Dec 15 05:09:01 localhost ceph-mon[298913]: from='mgr.34411 172.18.0.108:0/929118679' entity='mgr.np0005559464.aomnqe' cmd={"prefix": "auth get", "entity": "client.eve48", "format": "json"} : dispatch Dec 15 05:09:01 localhost ceph-mon[298913]: from='mgr.34411 172.18.0.108:0/929118679' entity='mgr.np0005559464.aomnqe' cmd={"prefix": "auth get-or-create", "entity": "client.eve48", "caps": ["mds", "allow rw path=/volumes/_nogroup/71f8a216-0ace-48cb-9799-bcf4f361aa30/cce16599-7ae9-4668-a71c-5b857423d725", "osd", "allow rw pool=manila_data namespace=fsvolumens_71f8a216-0ace-48cb-9799-bcf4f361aa30", "mon", "allow r"], "format": "json"} : dispatch Dec 15 05:09:01 localhost ceph-mon[298913]: from='mgr.34411 ' entity='mgr.np0005559464.aomnqe' cmd={"prefix": "auth get-or-create", "entity": "client.eve48", "caps": ["mds", "allow rw path=/volumes/_nogroup/71f8a216-0ace-48cb-9799-bcf4f361aa30/cce16599-7ae9-4668-a71c-5b857423d725", "osd", "allow rw pool=manila_data namespace=fsvolumens_71f8a216-0ace-48cb-9799-bcf4f361aa30", "mon", "allow r"], "format": "json"} : dispatch Dec 15 05:09:01 localhost ceph-mon[298913]: from='mgr.34411 ' entity='mgr.np0005559464.aomnqe' cmd='[{"prefix": "auth get-or-create", "entity": "client.eve48", "caps": ["mds", "allow rw path=/volumes/_nogroup/71f8a216-0ace-48cb-9799-bcf4f361aa30/cce16599-7ae9-4668-a71c-5b857423d725", "osd", "allow rw pool=manila_data namespace=fsvolumens_71f8a216-0ace-48cb-9799-bcf4f361aa30", "mon", "allow r"], "format": "json"}]': finished Dec 15 05:09:01 
localhost neutron_sriov_agent[260044]: 2025-12-15 10:09:01.611 2 INFO neutron.agent.securitygroups_rpc [None req-ffbebe9a-91d7-48b1-aa43-1aa3d76d9752 af329bd85e524c1ea41f4255aa5eb9d8 1d15e551a54e4d0ba499cdbee6530731 - - default default] Security group member updated ['5c37dc84-f80b-4e12-8d9d-49a8b22167f0']#033[00m Dec 15 05:09:01 localhost podman[243449]: time="2025-12-15T10:09:01Z" level=info msg="List containers: received `last` parameter - overwriting `limit`" Dec 15 05:09:01 localhost podman[243449]: @ - - [15/Dec/2025:10:09:01 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 156640 "" "Go-http-client/1.1" Dec 15 05:09:01 localhost podman[243449]: @ - - [15/Dec/2025:10:09:01 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 19260 "" "Go-http-client/1.1" Dec 15 05:09:02 localhost neutron_sriov_agent[260044]: 2025-12-15 10:09:02.076 2 INFO neutron.agent.securitygroups_rpc [None req-007e29be-de0a-41e0-9ebb-d3e9536e3b6d af329bd85e524c1ea41f4255aa5eb9d8 1d15e551a54e4d0ba499cdbee6530731 - - default default] Security group member updated ['5c37dc84-f80b-4e12-8d9d-49a8b22167f0']#033[00m Dec 15 05:09:02 localhost ceph-mon[298913]: mon.np0005559462@0(leader).osd e204 do_prune osdmap full prune enabled Dec 15 05:09:02 localhost ceph-mon[298913]: mon.np0005559462@0(leader).osd e205 e205: 6 total, 6 up, 6 in Dec 15 05:09:02 localhost ceph-mon[298913]: log_channel(cluster) log [DBG] : osdmap e205: 6 total, 6 up, 6 in Dec 15 05:09:02 localhost neutron_sriov_agent[260044]: 2025-12-15 10:09:02.915 2 INFO neutron.agent.securitygroups_rpc [None req-a7c32cbb-e335-4189-bd83-4753463953f0 af329bd85e524c1ea41f4255aa5eb9d8 1d15e551a54e4d0ba499cdbee6530731 - - default default] Security group member updated ['5c37dc84-f80b-4e12-8d9d-49a8b22167f0']#033[00m Dec 15 05:09:03 localhost neutron_sriov_agent[260044]: 2025-12-15 10:09:03.346 2 INFO 
neutron.agent.securitygroups_rpc [None req-7ae8d04f-bc10-4bde-8db0-da5668a27a53 af329bd85e524c1ea41f4255aa5eb9d8 1d15e551a54e4d0ba499cdbee6530731 - - default default] Security group member updated ['5c37dc84-f80b-4e12-8d9d-49a8b22167f0']#033[00m Dec 15 05:09:03 localhost neutron_dhcp_agent[267542]: 2025-12-15 10:09:03.392 267546 INFO neutron.agent.linux.ip_lib [None req-8ad1ac27-a6f5-4288-827c-3141551f4a85 - - - - - -] Device tapb2445dd8-46 cannot be used as it has no MAC address#033[00m Dec 15 05:09:03 localhost nova_compute[286344]: 2025-12-15 10:09:03.419 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:09:03 localhost kernel: device tapb2445dd8-46 entered promiscuous mode Dec 15 05:09:03 localhost NetworkManager[5963]: [1765793343.4280] manager: (tapb2445dd8-46): new Generic device (/org/freedesktop/NetworkManager/Devices/69) Dec 15 05:09:03 localhost nova_compute[286344]: 2025-12-15 10:09:03.430 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:09:03 localhost systemd-udevd[331802]: Network interface NamePolicy= disabled on kernel command line. Dec 15 05:09:03 localhost ovn_controller[154603]: 2025-12-15T10:09:03Z|00441|binding|INFO|Claiming lport b2445dd8-4619-4744-b6e8-830d7be7c5d8 for this chassis. 
Dec 15 05:09:03 localhost ovn_controller[154603]: 2025-12-15T10:09:03Z|00442|binding|INFO|b2445dd8-4619-4744-b6e8-830d7be7c5d8: Claiming unknown Dec 15 05:09:03 localhost ceph-mon[298913]: mon.np0005559462@0(leader) e17 handle_command mon_command({"prefix": "auth get-or-create", "entity": "client.tempest-cephx-id-537145655", "caps": ["mds", "allow rw path=/volumes/_nogroup/d45816ae-1301-449f-a02a-41f15745dceb/4f63897f-7a83-4942-b630-cfc3d31a8b78", "osd", "allow rw pool=manila_data namespace=fsvolumens_d45816ae-1301-449f-a02a-41f15745dceb", "mon", "allow r"], "format": "json"} v 0) Dec 15 05:09:03 localhost ceph-mon[298913]: log_channel(audit) log [INF] : from='mgr.34411 ' entity='mgr.np0005559464.aomnqe' cmd={"prefix": "auth get-or-create", "entity": "client.tempest-cephx-id-537145655", "caps": ["mds", "allow rw path=/volumes/_nogroup/d45816ae-1301-449f-a02a-41f15745dceb/4f63897f-7a83-4942-b630-cfc3d31a8b78", "osd", "allow rw pool=manila_data namespace=fsvolumens_d45816ae-1301-449f-a02a-41f15745dceb", "mon", "allow r"], "format": "json"} : dispatch Dec 15 05:09:03 localhost ovn_metadata_agent[160585]: 2025-12-15 10:09:03.450 160590 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005559462.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.101.0.2/28', 'neutron:device_id': 'dhcpce287b4b-a043-5ed9-8e08-4fd555d768a2-c64cba8d-8f89-44f9-b11a-021414a39002', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-c64cba8d-8f89-44f9-b11a-021414a39002', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '37134118fff54995bf9c18df154a3cf8', 'neutron:revision_number': '1', 
'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=b2b36aa5-7f9d-47b4-997a-af0aa4a54015, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=b2445dd8-4619-4744-b6e8-830d7be7c5d8) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Dec 15 05:09:03 localhost ovn_metadata_agent[160585]: 2025-12-15 10:09:03.456 160590 INFO neutron.agent.ovn.metadata.agent [-] Port b2445dd8-4619-4744-b6e8-830d7be7c5d8 in datapath c64cba8d-8f89-44f9-b11a-021414a39002 bound to our chassis#033[00m Dec 15 05:09:03 localhost ovn_metadata_agent[160585]: 2025-12-15 10:09:03.458 160590 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network c64cba8d-8f89-44f9-b11a-021414a39002 or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599#033[00m Dec 15 05:09:03 localhost ovn_metadata_agent[160585]: 2025-12-15 10:09:03.459 160858 DEBUG oslo.privsep.daemon [-] privsep: reply[79295568-e473-40e8-ab15-33406ede3f8e]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Dec 15 05:09:03 localhost journal[231322]: ethtool ioctl error on tapb2445dd8-46: No such device Dec 15 05:09:03 localhost ovn_controller[154603]: 2025-12-15T10:09:03Z|00443|binding|INFO|Setting lport b2445dd8-4619-4744-b6e8-830d7be7c5d8 ovn-installed in OVS Dec 15 05:09:03 localhost ovn_controller[154603]: 2025-12-15T10:09:03Z|00444|binding|INFO|Setting lport b2445dd8-4619-4744-b6e8-830d7be7c5d8 up in Southbound Dec 15 05:09:03 localhost journal[231322]: ethtool ioctl error on tapb2445dd8-46: No such device Dec 15 05:09:03 localhost nova_compute[286344]: 2025-12-15 10:09:03.469 286348 DEBUG 
ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:09:03 localhost journal[231322]: ethtool ioctl error on tapb2445dd8-46: No such device Dec 15 05:09:03 localhost journal[231322]: ethtool ioctl error on tapb2445dd8-46: No such device Dec 15 05:09:03 localhost journal[231322]: ethtool ioctl error on tapb2445dd8-46: No such device Dec 15 05:09:03 localhost journal[231322]: ethtool ioctl error on tapb2445dd8-46: No such device Dec 15 05:09:03 localhost journal[231322]: ethtool ioctl error on tapb2445dd8-46: No such device Dec 15 05:09:03 localhost journal[231322]: ethtool ioctl error on tapb2445dd8-46: No such device Dec 15 05:09:03 localhost ceph-mon[298913]: from='mgr.34411 172.18.0.108:0/929118679' entity='mgr.np0005559464.aomnqe' cmd={"prefix": "auth get", "entity": "client.tempest-cephx-id-537145655", "format": "json"} : dispatch Dec 15 05:09:03 localhost nova_compute[286344]: 2025-12-15 10:09:03.515 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:09:03 localhost nova_compute[286344]: 2025-12-15 10:09:03.545 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:09:03 localhost ceph-mon[298913]: log_channel(audit) log [INF] : from='mgr.34411 ' entity='mgr.np0005559464.aomnqe' cmd='[{"prefix": "auth get-or-create", "entity": "client.tempest-cephx-id-537145655", "caps": ["mds", "allow rw path=/volumes/_nogroup/d45816ae-1301-449f-a02a-41f15745dceb/4f63897f-7a83-4942-b630-cfc3d31a8b78", "osd", "allow rw pool=manila_data namespace=fsvolumens_d45816ae-1301-449f-a02a-41f15745dceb", "mon", "allow r"], "format": "json"}]': finished Dec 15 05:09:03 localhost nova_compute[286344]: 2025-12-15 10:09:03.741 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup 
/usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:09:04 localhost podman[331873]: Dec 15 05:09:04 localhost podman[331873]: 2025-12-15 10:09:04.46760839 +0000 UTC m=+0.098547290 container create 695a5ad1bff6bf1e4833eecda29638a20465df0962e9963b9dc6e3b4baacd00c (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-c64cba8d-8f89-44f9-b11a-021414a39002, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, tcib_managed=true, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image) Dec 15 05:09:04 localhost podman[331873]: 2025-12-15 10:09:04.403391645 +0000 UTC m=+0.034330575 image pull quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified Dec 15 05:09:04 localhost systemd[1]: Started libpod-conmon-695a5ad1bff6bf1e4833eecda29638a20465df0962e9963b9dc6e3b4baacd00c.scope. Dec 15 05:09:04 localhost systemd[1]: tmp-crun.wxjzyU.mount: Deactivated successfully. Dec 15 05:09:04 localhost systemd[1]: Started libcrun container. 
Dec 15 05:09:04 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/1594dbbd3d870bf76756642c3a8b9c2843e1fc740dcc6181100456977df633d0/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff) Dec 15 05:09:04 localhost podman[331873]: 2025-12-15 10:09:04.552018516 +0000 UTC m=+0.182957436 container init 695a5ad1bff6bf1e4833eecda29638a20465df0962e9963b9dc6e3b4baacd00c (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-c64cba8d-8f89-44f9-b11a-021414a39002, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb) Dec 15 05:09:04 localhost podman[331873]: 2025-12-15 10:09:04.562059279 +0000 UTC m=+0.192998229 container start 695a5ad1bff6bf1e4833eecda29638a20465df0962e9963b9dc6e3b4baacd00c (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-c64cba8d-8f89-44f9-b11a-021414a39002, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS) Dec 15 05:09:04 localhost ceph-mon[298913]: from='mgr.34411 172.18.0.108:0/929118679' entity='mgr.np0005559464.aomnqe' cmd={"prefix": "auth get-or-create", "entity": "client.tempest-cephx-id-537145655", "caps": ["mds", "allow rw path=/volumes/_nogroup/d45816ae-1301-449f-a02a-41f15745dceb/4f63897f-7a83-4942-b630-cfc3d31a8b78", "osd", "allow rw pool=manila_data namespace=fsvolumens_d45816ae-1301-449f-a02a-41f15745dceb", "mon", "allow r"], 
"format": "json"} : dispatch Dec 15 05:09:04 localhost dnsmasq[331892]: started, version 2.85 cachesize 150 Dec 15 05:09:04 localhost ceph-mon[298913]: from='mgr.34411 ' entity='mgr.np0005559464.aomnqe' cmd={"prefix": "auth get-or-create", "entity": "client.tempest-cephx-id-537145655", "caps": ["mds", "allow rw path=/volumes/_nogroup/d45816ae-1301-449f-a02a-41f15745dceb/4f63897f-7a83-4942-b630-cfc3d31a8b78", "osd", "allow rw pool=manila_data namespace=fsvolumens_d45816ae-1301-449f-a02a-41f15745dceb", "mon", "allow r"], "format": "json"} : dispatch Dec 15 05:09:04 localhost dnsmasq[331892]: DNS service limited to local subnets Dec 15 05:09:04 localhost ceph-mon[298913]: from='mgr.34411 ' entity='mgr.np0005559464.aomnqe' cmd='[{"prefix": "auth get-or-create", "entity": "client.tempest-cephx-id-537145655", "caps": ["mds", "allow rw path=/volumes/_nogroup/d45816ae-1301-449f-a02a-41f15745dceb/4f63897f-7a83-4942-b630-cfc3d31a8b78", "osd", "allow rw pool=manila_data namespace=fsvolumens_d45816ae-1301-449f-a02a-41f15745dceb", "mon", "allow r"], "format": "json"}]': finished Dec 15 05:09:04 localhost dnsmasq[331892]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile Dec 15 05:09:04 localhost dnsmasq[331892]: warning: no upstream servers configured Dec 15 05:09:04 localhost dnsmasq-dhcp[331892]: DHCP, static leases only on 10.101.0.0, lease time 1d Dec 15 05:09:04 localhost dnsmasq[331892]: read /var/lib/neutron/dhcp/c64cba8d-8f89-44f9-b11a-021414a39002/addn_hosts - 0 addresses Dec 15 05:09:04 localhost dnsmasq-dhcp[331892]: read /var/lib/neutron/dhcp/c64cba8d-8f89-44f9-b11a-021414a39002/host Dec 15 05:09:04 localhost dnsmasq-dhcp[331892]: read /var/lib/neutron/dhcp/c64cba8d-8f89-44f9-b11a-021414a39002/opts Dec 15 05:09:04 localhost neutron_sriov_agent[260044]: 2025-12-15 10:09:04.683 2 INFO neutron.agent.securitygroups_rpc [None 
req-1b4b4866-746c-48d5-9ceb-53b4fabb2913 af329bd85e524c1ea41f4255aa5eb9d8 1d15e551a54e4d0ba499cdbee6530731 - - default default] Security group member updated ['5c37dc84-f80b-4e12-8d9d-49a8b22167f0']#033[00m Dec 15 05:09:04 localhost neutron_dhcp_agent[267542]: 2025-12-15 10:09:04.767 267546 INFO neutron.agent.dhcp.agent [None req-9c148be0-bb30-48d9-8c5c-4dd7b464b965 - - - - - -] DHCP configuration for ports {'38719f85-0333-4dbb-9ded-91696d357d67'} is completed#033[00m Dec 15 05:09:04 localhost openstack_network_exporter[246484]: ERROR 10:09:04 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd Dec 15 05:09:04 localhost openstack_network_exporter[246484]: ERROR 10:09:04 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd Dec 15 05:09:04 localhost openstack_network_exporter[246484]: ERROR 10:09:04 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server Dec 15 05:09:04 localhost openstack_network_exporter[246484]: ERROR 10:09:04 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath Dec 15 05:09:04 localhost openstack_network_exporter[246484]: Dec 15 05:09:04 localhost openstack_network_exporter[246484]: ERROR 10:09:04 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath Dec 15 05:09:04 localhost openstack_network_exporter[246484]: Dec 15 05:09:04 localhost ceph-mon[298913]: mon.np0005559462@0(leader) e17 handle_command mon_command({"prefix": "auth rm", "entity": "client.eve48"} v 0) Dec 15 05:09:04 localhost ceph-mon[298913]: log_channel(audit) log [INF] : from='mgr.34411 ' entity='mgr.np0005559464.aomnqe' cmd={"prefix": "auth rm", "entity": "client.eve48"} : dispatch Dec 15 05:09:04 localhost ceph-mon[298913]: log_channel(audit) log [INF] : from='mgr.34411 ' entity='mgr.np0005559464.aomnqe' cmd='[{"prefix": "auth rm", "entity": "client.eve48"}]': finished Dec 15 05:09:05 
localhost ceph-mon[298913]: from='mgr.34411 172.18.0.108:0/929118679' entity='mgr.np0005559464.aomnqe' cmd={"prefix": "auth get", "entity": "client.eve48", "format": "json"} : dispatch Dec 15 05:09:05 localhost ceph-mon[298913]: from='mgr.34411 172.18.0.108:0/929118679' entity='mgr.np0005559464.aomnqe' cmd={"prefix": "auth rm", "entity": "client.eve48"} : dispatch Dec 15 05:09:05 localhost ceph-mon[298913]: from='mgr.34411 ' entity='mgr.np0005559464.aomnqe' cmd={"prefix": "auth rm", "entity": "client.eve48"} : dispatch Dec 15 05:09:05 localhost ceph-mon[298913]: from='mgr.34411 ' entity='mgr.np0005559464.aomnqe' cmd='[{"prefix": "auth rm", "entity": "client.eve48"}]': finished Dec 15 05:09:06 localhost nova_compute[286344]: 2025-12-15 10:09:06.123 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:09:06 localhost ceph-mon[298913]: mon.np0005559462@0(leader) e17 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/osd_remove_queue}] v 0) Dec 15 05:09:06 localhost ceph-mon[298913]: log_channel(audit) log [INF] : from='mgr.34411 ' entity='mgr.np0005559464.aomnqe' Dec 15 05:09:06 localhost ceph-mon[298913]: mon.np0005559462@0(leader).osd e205 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104 Dec 15 05:09:06 localhost ceph-mon[298913]: mon.np0005559462@0(leader) e17 handle_command mon_command({"prefix": "auth rm", "entity": "client.tempest-cephx-id-537145655"} v 0) Dec 15 05:09:06 localhost ceph-mon[298913]: log_channel(audit) log [INF] : from='mgr.34411 ' entity='mgr.np0005559464.aomnqe' cmd={"prefix": "auth rm", "entity": "client.tempest-cephx-id-537145655"} : dispatch Dec 15 05:09:06 localhost ceph-mon[298913]: log_channel(audit) log [INF] : from='mgr.34411 ' entity='mgr.np0005559464.aomnqe' cmd='[{"prefix": "auth rm", "entity": "client.tempest-cephx-id-537145655"}]': finished Dec 15 05:09:06 
localhost ceph-mon[298913]: from='mgr.34411 172.18.0.108:0/929118679' entity='mgr.np0005559464.aomnqe' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch Dec 15 05:09:06 localhost ceph-mon[298913]: from='mgr.34411 ' entity='mgr.np0005559464.aomnqe' Dec 15 05:09:06 localhost ceph-mon[298913]: from='mgr.34411 172.18.0.108:0/929118679' entity='mgr.np0005559464.aomnqe' cmd={"prefix": "auth get", "entity": "client.tempest-cephx-id-537145655", "format": "json"} : dispatch Dec 15 05:09:06 localhost ceph-mon[298913]: from='mgr.34411 172.18.0.108:0/929118679' entity='mgr.np0005559464.aomnqe' cmd={"prefix": "auth rm", "entity": "client.tempest-cephx-id-537145655"} : dispatch Dec 15 05:09:06 localhost ceph-mon[298913]: from='mgr.34411 ' entity='mgr.np0005559464.aomnqe' cmd={"prefix": "auth rm", "entity": "client.tempest-cephx-id-537145655"} : dispatch Dec 15 05:09:06 localhost ceph-mon[298913]: from='mgr.34411 ' entity='mgr.np0005559464.aomnqe' cmd='[{"prefix": "auth rm", "entity": "client.tempest-cephx-id-537145655"}]': finished Dec 15 05:09:07 localhost ovn_metadata_agent[160585]: 2025-12-15 10:09:07.619 160590 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=17, ssl=[], options={'arp_ns_explicit_output': 'true', 'fdb_removal_limit': '0', 'ignore_lsp_down': 'false', 'mac_binding_removal_limit': '0', 'mac_prefix': 'fe:17:e3', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': 'fe:55:2b:86:15:b5'}, ipsec=False) old=SB_Global(nb_cfg=16) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Dec 15 05:09:07 localhost ovn_metadata_agent[160585]: 2025-12-15 10:09:07.620 160590 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 10 seconds run 
/usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m Dec 15 05:09:07 localhost nova_compute[286344]: 2025-12-15 10:09:07.621 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:09:07 localhost ceph-mon[298913]: mon.np0005559462@0(leader) e17 handle_command mon_command({"prefix": "auth get-or-create", "entity": "client.eve47", "caps": ["mds", "allow rw path=/volumes/_nogroup/71f8a216-0ace-48cb-9799-bcf4f361aa30/cce16599-7ae9-4668-a71c-5b857423d725", "osd", "allow rw pool=manila_data namespace=fsvolumens_71f8a216-0ace-48cb-9799-bcf4f361aa30", "mon", "allow r"], "format": "json"} v 0) Dec 15 05:09:07 localhost ceph-mon[298913]: log_channel(audit) log [INF] : from='mgr.34411 ' entity='mgr.np0005559464.aomnqe' cmd={"prefix": "auth get-or-create", "entity": "client.eve47", "caps": ["mds", "allow rw path=/volumes/_nogroup/71f8a216-0ace-48cb-9799-bcf4f361aa30/cce16599-7ae9-4668-a71c-5b857423d725", "osd", "allow rw pool=manila_data namespace=fsvolumens_71f8a216-0ace-48cb-9799-bcf4f361aa30", "mon", "allow r"], "format": "json"} : dispatch Dec 15 05:09:07 localhost ceph-mon[298913]: log_channel(audit) log [INF] : from='mgr.34411 ' entity='mgr.np0005559464.aomnqe' cmd='[{"prefix": "auth get-or-create", "entity": "client.eve47", "caps": ["mds", "allow rw path=/volumes/_nogroup/71f8a216-0ace-48cb-9799-bcf4f361aa30/cce16599-7ae9-4668-a71c-5b857423d725", "osd", "allow rw pool=manila_data namespace=fsvolumens_71f8a216-0ace-48cb-9799-bcf4f361aa30", "mon", "allow r"], "format": "json"}]': finished Dec 15 05:09:07 localhost ceph-mon[298913]: from='mgr.34411 172.18.0.108:0/929118679' entity='mgr.np0005559464.aomnqe' cmd={"prefix": "auth get", "entity": "client.eve47", "format": "json"} : dispatch Dec 15 05:09:07 localhost ceph-mon[298913]: from='mgr.34411 172.18.0.108:0/929118679' entity='mgr.np0005559464.aomnqe' cmd={"prefix": "auth get-or-create", "entity": 
"client.eve47", "caps": ["mds", "allow rw path=/volumes/_nogroup/71f8a216-0ace-48cb-9799-bcf4f361aa30/cce16599-7ae9-4668-a71c-5b857423d725", "osd", "allow rw pool=manila_data namespace=fsvolumens_71f8a216-0ace-48cb-9799-bcf4f361aa30", "mon", "allow r"], "format": "json"} : dispatch Dec 15 05:09:07 localhost ceph-mon[298913]: from='mgr.34411 ' entity='mgr.np0005559464.aomnqe' cmd={"prefix": "auth get-or-create", "entity": "client.eve47", "caps": ["mds", "allow rw path=/volumes/_nogroup/71f8a216-0ace-48cb-9799-bcf4f361aa30/cce16599-7ae9-4668-a71c-5b857423d725", "osd", "allow rw pool=manila_data namespace=fsvolumens_71f8a216-0ace-48cb-9799-bcf4f361aa30", "mon", "allow r"], "format": "json"} : dispatch Dec 15 05:09:07 localhost ceph-mon[298913]: from='mgr.34411 ' entity='mgr.np0005559464.aomnqe' cmd='[{"prefix": "auth get-or-create", "entity": "client.eve47", "caps": ["mds", "allow rw path=/volumes/_nogroup/71f8a216-0ace-48cb-9799-bcf4f361aa30/cce16599-7ae9-4668-a71c-5b857423d725", "osd", "allow rw pool=manila_data namespace=fsvolumens_71f8a216-0ace-48cb-9799-bcf4f361aa30", "mon", "allow r"], "format": "json"}]': finished Dec 15 05:09:08 localhost neutron_sriov_agent[260044]: 2025-12-15 10:09:08.529 2 INFO neutron.agent.securitygroups_rpc [None req-f169930f-648e-47be-91aa-472534fce2a9 af329bd85e524c1ea41f4255aa5eb9d8 1d15e551a54e4d0ba499cdbee6530731 - - default default] Security group member updated ['5c37dc84-f80b-4e12-8d9d-49a8b22167f0']#033[00m Dec 15 05:09:08 localhost nova_compute[286344]: 2025-12-15 10:09:08.782 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:09:08 localhost ceph-mon[298913]: mon.np0005559462@0(leader).osd e205 do_prune osdmap full prune enabled Dec 15 05:09:08 localhost ceph-mon[298913]: mon.np0005559462@0(leader).osd e206 e206: 6 total, 6 up, 6 in Dec 15 05:09:08 localhost ceph-mon[298913]: log_channel(cluster) log [DBG] : osdmap e206: 6 total, 
6 up, 6 in Dec 15 05:09:09 localhost neutron_dhcp_agent[267542]: 2025-12-15 10:09:09.003 267546 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-12-15T10:09:08Z, description=, device_id=8bd8a4ee-433e-4995-81ca-080a5f5208c4, device_owner=network:router_interface, dns_assignment=[], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[], id=237d2bea-2fee-4dbf-a05c-7fed21be7194, ip_allocation=immediate, mac_address=fa:16:3e:87:f3:a9, name=, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-12-15T10:08:59Z, description=, dns_domain=, id=c64cba8d-8f89-44f9-b11a-021414a39002, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-RoutersTest-615426864, port_security_enabled=True, project_id=37134118fff54995bf9c18df154a3cf8, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=49648, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=3016, status=ACTIVE, subnets=['373fc99b-71f2-459c-9b48-fecf83476cda'], tags=[], tenant_id=37134118fff54995bf9c18df154a3cf8, updated_at=2025-12-15T10:09:02Z, vlan_transparent=None, network_id=c64cba8d-8f89-44f9-b11a-021414a39002, port_security_enabled=False, project_id=37134118fff54995bf9c18df154a3cf8, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=[], standard_attr_id=3035, status=DOWN, tags=[], tenant_id=37134118fff54995bf9c18df154a3cf8, updated_at=2025-12-15T10:09:08Z on network c64cba8d-8f89-44f9-b11a-021414a39002#033[00m Dec 15 05:09:09 localhost neutron_sriov_agent[260044]: 2025-12-15 10:09:09.076 2 INFO neutron.agent.securitygroups_rpc [None req-3bebc9bc-9bf6-4593-b0ee-85cade7e8424 af329bd85e524c1ea41f4255aa5eb9d8 
1d15e551a54e4d0ba499cdbee6530731 - - default default] Security group member updated ['5c37dc84-f80b-4e12-8d9d-49a8b22167f0']#033[00m Dec 15 05:09:09 localhost systemd[1]: tmp-crun.PvrDar.mount: Deactivated successfully. Dec 15 05:09:09 localhost dnsmasq[331892]: read /var/lib/neutron/dhcp/c64cba8d-8f89-44f9-b11a-021414a39002/addn_hosts - 1 addresses Dec 15 05:09:09 localhost dnsmasq-dhcp[331892]: read /var/lib/neutron/dhcp/c64cba8d-8f89-44f9-b11a-021414a39002/host Dec 15 05:09:09 localhost podman[331995]: 2025-12-15 10:09:09.214283026 +0000 UTC m=+0.065078770 container kill 695a5ad1bff6bf1e4833eecda29638a20465df0962e9963b9dc6e3b4baacd00c (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-c64cba8d-8f89-44f9-b11a-021414a39002, tcib_managed=true, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS) Dec 15 05:09:09 localhost dnsmasq-dhcp[331892]: read /var/lib/neutron/dhcp/c64cba8d-8f89-44f9-b11a-021414a39002/opts Dec 15 05:09:09 localhost ceph-mon[298913]: mon.np0005559462@0(leader) e17 handle_command mon_command([{prefix=config-key set, key=mgr/progress/completed}] v 0) Dec 15 05:09:09 localhost neutron_dhcp_agent[267542]: 2025-12-15 10:09:09.643 267546 INFO neutron.agent.dhcp.agent [None req-43b49bc7-fed8-4d40-8a80-8e9d3c7ae565 - - - - - -] DHCP configuration for ports {'237d2bea-2fee-4dbf-a05c-7fed21be7194'} is completed#033[00m Dec 15 05:09:09 localhost ceph-mon[298913]: log_channel(audit) log [INF] : from='mgr.34411 ' entity='mgr.np0005559464.aomnqe' Dec 15 05:09:09 localhost ceph-mon[298913]: mon.np0005559462@0(leader) e17 handle_command mon_command({"prefix": "auth get-or-create", "entity": "client.tempest-cephx-id-537145655", "caps": ["mds", 
"allow rw path=/volumes/_nogroup/d45816ae-1301-449f-a02a-41f15745dceb/4f63897f-7a83-4942-b630-cfc3d31a8b78", "osd", "allow rw pool=manila_data namespace=fsvolumens_d45816ae-1301-449f-a02a-41f15745dceb", "mon", "allow r"], "format": "json"} v 0) Dec 15 05:09:09 localhost ceph-mon[298913]: log_channel(audit) log [INF] : from='mgr.34411 ' entity='mgr.np0005559464.aomnqe' cmd={"prefix": "auth get-or-create", "entity": "client.tempest-cephx-id-537145655", "caps": ["mds", "allow rw path=/volumes/_nogroup/d45816ae-1301-449f-a02a-41f15745dceb/4f63897f-7a83-4942-b630-cfc3d31a8b78", "osd", "allow rw pool=manila_data namespace=fsvolumens_d45816ae-1301-449f-a02a-41f15745dceb", "mon", "allow r"], "format": "json"} : dispatch Dec 15 05:09:09 localhost ceph-mon[298913]: log_channel(audit) log [INF] : from='mgr.34411 ' entity='mgr.np0005559464.aomnqe' cmd='[{"prefix": "auth get-or-create", "entity": "client.tempest-cephx-id-537145655", "caps": ["mds", "allow rw path=/volumes/_nogroup/d45816ae-1301-449f-a02a-41f15745dceb/4f63897f-7a83-4942-b630-cfc3d31a8b78", "osd", "allow rw pool=manila_data namespace=fsvolumens_d45816ae-1301-449f-a02a-41f15745dceb", "mon", "allow r"], "format": "json"}]': finished Dec 15 05:09:09 localhost ceph-mon[298913]: from='mgr.34411 172.18.0.108:0/929118679' entity='mgr.np0005559464.aomnqe' cmd={"prefix": "auth get", "entity": "client.tempest-cephx-id-537145655", "format": "json"} : dispatch Dec 15 05:09:09 localhost ceph-mon[298913]: from='mgr.34411 ' entity='mgr.np0005559464.aomnqe' Dec 15 05:09:09 localhost ceph-mon[298913]: from='mgr.34411 172.18.0.108:0/929118679' entity='mgr.np0005559464.aomnqe' cmd={"prefix": "auth get-or-create", "entity": "client.tempest-cephx-id-537145655", "caps": ["mds", "allow rw path=/volumes/_nogroup/d45816ae-1301-449f-a02a-41f15745dceb/4f63897f-7a83-4942-b630-cfc3d31a8b78", "osd", "allow rw pool=manila_data namespace=fsvolumens_d45816ae-1301-449f-a02a-41f15745dceb", "mon", "allow r"], "format": "json"} : dispatch Dec 15 
05:09:09 localhost ceph-mon[298913]: from='mgr.34411 ' entity='mgr.np0005559464.aomnqe' cmd={"prefix": "auth get-or-create", "entity": "client.tempest-cephx-id-537145655", "caps": ["mds", "allow rw path=/volumes/_nogroup/d45816ae-1301-449f-a02a-41f15745dceb/4f63897f-7a83-4942-b630-cfc3d31a8b78", "osd", "allow rw pool=manila_data namespace=fsvolumens_d45816ae-1301-449f-a02a-41f15745dceb", "mon", "allow r"], "format": "json"} : dispatch Dec 15 05:09:09 localhost ceph-mon[298913]: from='mgr.34411 ' entity='mgr.np0005559464.aomnqe' cmd='[{"prefix": "auth get-or-create", "entity": "client.tempest-cephx-id-537145655", "caps": ["mds", "allow rw path=/volumes/_nogroup/d45816ae-1301-449f-a02a-41f15745dceb/4f63897f-7a83-4942-b630-cfc3d31a8b78", "osd", "allow rw pool=manila_data namespace=fsvolumens_d45816ae-1301-449f-a02a-41f15745dceb", "mon", "allow r"], "format": "json"}]': finished Dec 15 05:09:09 localhost neutron_dhcp_agent[267542]: 2025-12-15 10:09:09.959 267546 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-12-15T10:09:08Z, description=, device_id=8bd8a4ee-433e-4995-81ca-080a5f5208c4, device_owner=network:router_interface, dns_assignment=[], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[], id=237d2bea-2fee-4dbf-a05c-7fed21be7194, ip_allocation=immediate, mac_address=fa:16:3e:87:f3:a9, name=, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-12-15T10:08:59Z, description=, dns_domain=, id=c64cba8d-8f89-44f9-b11a-021414a39002, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-RoutersTest-615426864, port_security_enabled=True, project_id=37134118fff54995bf9c18df154a3cf8, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=49648, qos_policy_id=None, 
revision_number=2, router:external=False, shared=False, standard_attr_id=3016, status=ACTIVE, subnets=['373fc99b-71f2-459c-9b48-fecf83476cda'], tags=[], tenant_id=37134118fff54995bf9c18df154a3cf8, updated_at=2025-12-15T10:09:02Z, vlan_transparent=None, network_id=c64cba8d-8f89-44f9-b11a-021414a39002, port_security_enabled=False, project_id=37134118fff54995bf9c18df154a3cf8, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=[], standard_attr_id=3035, status=DOWN, tags=[], tenant_id=37134118fff54995bf9c18df154a3cf8, updated_at=2025-12-15T10:09:08Z on network c64cba8d-8f89-44f9-b11a-021414a39002#033[00m Dec 15 05:09:10 localhost neutron_sriov_agent[260044]: 2025-12-15 10:09:10.055 2 INFO neutron.agent.securitygroups_rpc [None req-bc91dd48-284b-4d4c-ab50-9f0e6d9e1ce5 af329bd85e524c1ea41f4255aa5eb9d8 1d15e551a54e4d0ba499cdbee6530731 - - default default] Security group member updated ['5c37dc84-f80b-4e12-8d9d-49a8b22167f0']#033[00m Dec 15 05:09:10 localhost dnsmasq[331892]: read /var/lib/neutron/dhcp/c64cba8d-8f89-44f9-b11a-021414a39002/addn_hosts - 1 addresses Dec 15 05:09:10 localhost dnsmasq-dhcp[331892]: read /var/lib/neutron/dhcp/c64cba8d-8f89-44f9-b11a-021414a39002/host Dec 15 05:09:10 localhost dnsmasq-dhcp[331892]: read /var/lib/neutron/dhcp/c64cba8d-8f89-44f9-b11a-021414a39002/opts Dec 15 05:09:10 localhost podman[332032]: 2025-12-15 10:09:10.185083792 +0000 UTC m=+0.065516332 container kill 695a5ad1bff6bf1e4833eecda29638a20465df0962e9963b9dc6e3b4baacd00c (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-c64cba8d-8f89-44f9-b11a-021414a39002, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251202, 
org.label-schema.schema-version=1.0) Dec 15 05:09:10 localhost neutron_dhcp_agent[267542]: 2025-12-15 10:09:10.399 267546 INFO neutron.agent.dhcp.agent [None req-786526bd-7892-47b1-a747-d449b93c3cde - - - - - -] DHCP configuration for ports {'237d2bea-2fee-4dbf-a05c-7fed21be7194'} is completed#033[00m Dec 15 05:09:10 localhost neutron_sriov_agent[260044]: 2025-12-15 10:09:10.537 2 INFO neutron.agent.securitygroups_rpc [None req-a379d1e3-dfab-4108-8d80-54501dd8cd2f af329bd85e524c1ea41f4255aa5eb9d8 1d15e551a54e4d0ba499cdbee6530731 - - default default] Security group member updated ['5c37dc84-f80b-4e12-8d9d-49a8b22167f0']#033[00m Dec 15 05:09:10 localhost ceph-mon[298913]: mon.np0005559462@0(leader).osd e206 do_prune osdmap full prune enabled Dec 15 05:09:10 localhost ceph-mon[298913]: mon.np0005559462@0(leader).osd e207 e207: 6 total, 6 up, 6 in Dec 15 05:09:10 localhost ceph-mon[298913]: log_channel(cluster) log [DBG] : osdmap e207: 6 total, 6 up, 6 in Dec 15 05:09:11 localhost ceph-mon[298913]: mon.np0005559462@0(leader) e17 handle_command mon_command({"prefix": "auth rm", "entity": "client.eve47"} v 0) Dec 15 05:09:11 localhost ceph-mon[298913]: log_channel(audit) log [INF] : from='mgr.34411 ' entity='mgr.np0005559464.aomnqe' cmd={"prefix": "auth rm", "entity": "client.eve47"} : dispatch Dec 15 05:09:11 localhost nova_compute[286344]: 2025-12-15 10:09:11.127 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:09:11 localhost ceph-mon[298913]: log_channel(audit) log [INF] : from='mgr.34411 ' entity='mgr.np0005559464.aomnqe' cmd='[{"prefix": "auth rm", "entity": "client.eve47"}]': finished Dec 15 05:09:11 localhost podman[332072]: 2025-12-15 10:09:11.26329498 +0000 UTC m=+0.060073864 container kill 695a5ad1bff6bf1e4833eecda29638a20465df0962e9963b9dc6e3b4baacd00c (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, 
name=neutron-dnsmasq-qdhcp-c64cba8d-8f89-44f9-b11a-021414a39002, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.build-date=20251202, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image) Dec 15 05:09:11 localhost dnsmasq[331892]: read /var/lib/neutron/dhcp/c64cba8d-8f89-44f9-b11a-021414a39002/addn_hosts - 0 addresses Dec 15 05:09:11 localhost dnsmasq-dhcp[331892]: read /var/lib/neutron/dhcp/c64cba8d-8f89-44f9-b11a-021414a39002/host Dec 15 05:09:11 localhost dnsmasq-dhcp[331892]: read /var/lib/neutron/dhcp/c64cba8d-8f89-44f9-b11a-021414a39002/opts Dec 15 05:09:11 localhost ceph-mon[298913]: mon.np0005559462@0(leader).osd e207 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104 Dec 15 05:09:11 localhost ceph-mon[298913]: mon.np0005559462@0(leader).osd e207 do_prune osdmap full prune enabled Dec 15 05:09:11 localhost ceph-mon[298913]: mon.np0005559462@0(leader).osd e208 e208: 6 total, 6 up, 6 in Dec 15 05:09:11 localhost ceph-mon[298913]: log_channel(cluster) log [DBG] : osdmap e208: 6 total, 6 up, 6 in Dec 15 05:09:11 localhost ovn_controller[154603]: 2025-12-15T10:09:11Z|00445|binding|INFO|Releasing lport b2445dd8-4619-4744-b6e8-830d7be7c5d8 from this chassis (sb_readonly=0) Dec 15 05:09:11 localhost ovn_controller[154603]: 2025-12-15T10:09:11Z|00446|binding|INFO|Setting lport b2445dd8-4619-4744-b6e8-830d7be7c5d8 down in Southbound Dec 15 05:09:11 localhost kernel: device tapb2445dd8-46 left promiscuous mode Dec 15 05:09:11 localhost nova_compute[286344]: 2025-12-15 10:09:11.452 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:09:11 localhost ovn_metadata_agent[160585]: 2025-12-15 10:09:11.463 160590 DEBUG 
ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005559462.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.101.0.2/28', 'neutron:device_id': 'dhcpce287b4b-a043-5ed9-8e08-4fd555d768a2-c64cba8d-8f89-44f9-b11a-021414a39002', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-c64cba8d-8f89-44f9-b11a-021414a39002', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '37134118fff54995bf9c18df154a3cf8', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'np0005559462.localdomain'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=b2b36aa5-7f9d-47b4-997a-af0aa4a54015, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=b2445dd8-4619-4744-b6e8-830d7be7c5d8) old=Port_Binding(up=[True], chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Dec 15 05:09:11 localhost ovn_metadata_agent[160585]: 2025-12-15 10:09:11.465 160590 INFO neutron.agent.ovn.metadata.agent [-] Port b2445dd8-4619-4744-b6e8-830d7be7c5d8 in datapath c64cba8d-8f89-44f9-b11a-021414a39002 unbound from our chassis#033[00m Dec 15 05:09:11 localhost ovn_metadata_agent[160585]: 2025-12-15 10:09:11.467 160590 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network c64cba8d-8f89-44f9-b11a-021414a39002, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m 
Dec 15 05:09:11 localhost ovn_metadata_agent[160585]: 2025-12-15 10:09:11.468 160858 DEBUG oslo.privsep.daemon [-] privsep: reply[64a204fc-ef0f-4e1d-a576-ea14d2ccd8dd]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Dec 15 05:09:11 localhost nova_compute[286344]: 2025-12-15 10:09:11.474 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:09:11 localhost ceph-mon[298913]: from='mgr.34411 172.18.0.108:0/929118679' entity='mgr.np0005559464.aomnqe' cmd={"prefix": "auth get", "entity": "client.eve47", "format": "json"} : dispatch Dec 15 05:09:11 localhost ceph-mon[298913]: from='mgr.34411 172.18.0.108:0/929118679' entity='mgr.np0005559464.aomnqe' cmd={"prefix": "auth rm", "entity": "client.eve47"} : dispatch Dec 15 05:09:11 localhost ceph-mon[298913]: from='mgr.34411 ' entity='mgr.np0005559464.aomnqe' cmd={"prefix": "auth rm", "entity": "client.eve47"} : dispatch Dec 15 05:09:11 localhost ceph-mon[298913]: from='mgr.34411 ' entity='mgr.np0005559464.aomnqe' cmd='[{"prefix": "auth rm", "entity": "client.eve47"}]': finished Dec 15 05:09:12 localhost ceph-mon[298913]: mon.np0005559462@0(leader).osd e208 do_prune osdmap full prune enabled Dec 15 05:09:12 localhost ceph-mon[298913]: mon.np0005559462@0(leader).osd e209 e209: 6 total, 6 up, 6 in Dec 15 05:09:12 localhost ceph-mon[298913]: log_channel(cluster) log [DBG] : osdmap e209: 6 total, 6 up, 6 in Dec 15 05:09:13 localhost ceph-mon[298913]: mon.np0005559462@0(leader) e17 handle_command mon_command({"prefix": "auth rm", "entity": "client.tempest-cephx-id-537145655"} v 0) Dec 15 05:09:13 localhost ceph-mon[298913]: log_channel(audit) log [INF] : from='mgr.34411 ' entity='mgr.np0005559464.aomnqe' cmd={"prefix": "auth rm", "entity": "client.tempest-cephx-id-537145655"} : dispatch Dec 15 05:09:13 localhost ceph-mon[298913]: log_channel(audit) log [INF] : from='mgr.34411 ' 
entity='mgr.np0005559464.aomnqe' cmd='[{"prefix": "auth rm", "entity": "client.tempest-cephx-id-537145655"}]': finished Dec 15 05:09:13 localhost dnsmasq[331892]: exiting on receipt of SIGTERM Dec 15 05:09:13 localhost podman[332112]: 2025-12-15 10:09:13.592253727 +0000 UTC m=+0.058653076 container kill 695a5ad1bff6bf1e4833eecda29638a20465df0962e9963b9dc6e3b4baacd00c (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-c64cba8d-8f89-44f9-b11a-021414a39002, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2) Dec 15 05:09:13 localhost systemd[1]: libpod-695a5ad1bff6bf1e4833eecda29638a20465df0962e9963b9dc6e3b4baacd00c.scope: Deactivated successfully. Dec 15 05:09:13 localhost podman[332128]: 2025-12-15 10:09:13.663860423 +0000 UTC m=+0.052625842 container died 695a5ad1bff6bf1e4833eecda29638a20465df0962e9963b9dc6e3b4baacd00c (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-c64cba8d-8f89-44f9-b11a-021414a39002, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb) Dec 15 05:09:13 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-695a5ad1bff6bf1e4833eecda29638a20465df0962e9963b9dc6e3b4baacd00c-userdata-shm.mount: Deactivated successfully. 
Dec 15 05:09:13 localhost systemd[1]: var-lib-containers-storage-overlay-1594dbbd3d870bf76756642c3a8b9c2843e1fc740dcc6181100456977df633d0-merged.mount: Deactivated successfully. Dec 15 05:09:13 localhost podman[332128]: 2025-12-15 10:09:13.757772307 +0000 UTC m=+0.146537686 container remove 695a5ad1bff6bf1e4833eecda29638a20465df0962e9963b9dc6e3b4baacd00c (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-c64cba8d-8f89-44f9-b11a-021414a39002, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true) Dec 15 05:09:13 localhost systemd[1]: libpod-conmon-695a5ad1bff6bf1e4833eecda29638a20465df0962e9963b9dc6e3b4baacd00c.scope: Deactivated successfully. Dec 15 05:09:13 localhost nova_compute[286344]: 2025-12-15 10:09:13.823 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:09:13 localhost ceph-mon[298913]: from='mgr.34411 172.18.0.108:0/929118679' entity='mgr.np0005559464.aomnqe' cmd={"prefix": "auth get", "entity": "client.tempest-cephx-id-537145655", "format": "json"} : dispatch Dec 15 05:09:13 localhost ceph-mon[298913]: from='mgr.34411 172.18.0.108:0/929118679' entity='mgr.np0005559464.aomnqe' cmd={"prefix": "auth rm", "entity": "client.tempest-cephx-id-537145655"} : dispatch Dec 15 05:09:13 localhost ceph-mon[298913]: from='mgr.34411 ' entity='mgr.np0005559464.aomnqe' cmd={"prefix": "auth rm", "entity": "client.tempest-cephx-id-537145655"} : dispatch Dec 15 05:09:13 localhost ceph-mon[298913]: from='mgr.34411 ' entity='mgr.np0005559464.aomnqe' cmd='[{"prefix": "auth rm", "entity": "client.tempest-cephx-id-537145655"}]': finished Dec 15 
05:09:13 localhost neutron_dhcp_agent[267542]: 2025-12-15 10:09:13.979 267546 INFO neutron.agent.dhcp.agent [None req-233c2280-a232-4582-af8b-4fb18e96a08d - - - - - -] Network not present, action: clean_devices, action_kwargs: {}#033[00m Dec 15 05:09:14 localhost neutron_dhcp_agent[267542]: 2025-12-15 10:09:14.433 267546 INFO neutron.agent.dhcp.agent [-] Network not present, action: clean_devices, action_kwargs: {}#033[00m Dec 15 05:09:14 localhost systemd[1]: run-netns-qdhcp\x2dc64cba8d\x2d8f89\x2d44f9\x2db11a\x2d021414a39002.mount: Deactivated successfully. Dec 15 05:09:14 localhost ovn_controller[154603]: 2025-12-15T10:09:14Z|00447|binding|INFO|Releasing lport b35254ad-12eb-47bb-92be-44fefe0694f0 from this chassis (sb_readonly=0) Dec 15 05:09:14 localhost nova_compute[286344]: 2025-12-15 10:09:14.712 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:09:14 localhost neutron_sriov_agent[260044]: 2025-12-15 10:09:14.756 2 INFO neutron.agent.securitygroups_rpc [None req-a95ea43b-28fd-4c44-bfb2-392061fae7fb af329bd85e524c1ea41f4255aa5eb9d8 1d15e551a54e4d0ba499cdbee6530731 - - default default] Security group member updated ['5c37dc84-f80b-4e12-8d9d-49a8b22167f0']#033[00m Dec 15 05:09:15 localhost ceph-mon[298913]: mon.np0005559462@0(leader) e17 handle_command mon_command({"prefix": "auth rm", "entity": "client.eve49"} v 0) Dec 15 05:09:15 localhost ceph-mon[298913]: log_channel(audit) log [INF] : from='mgr.34411 ' entity='mgr.np0005559464.aomnqe' cmd={"prefix": "auth rm", "entity": "client.eve49"} : dispatch Dec 15 05:09:15 localhost ceph-mon[298913]: log_channel(audit) log [INF] : from='mgr.34411 ' entity='mgr.np0005559464.aomnqe' cmd='[{"prefix": "auth rm", "entity": "client.eve49"}]': finished Dec 15 05:09:15 localhost nova_compute[286344]: 2025-12-15 10:09:15.270 286348 DEBUG oslo_service.periodic_task [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] 
Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Dec 15 05:09:15 localhost nova_compute[286344]: 2025-12-15 10:09:15.271 286348 DEBUG oslo_service.periodic_task [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Dec 15 05:09:15 localhost neutron_sriov_agent[260044]: 2025-12-15 10:09:15.691 2 INFO neutron.agent.securitygroups_rpc [None req-9c985468-6f81-41e5-bf46-6e0ee9c1a978 af329bd85e524c1ea41f4255aa5eb9d8 1d15e551a54e4d0ba499cdbee6530731 - - default default] Security group member updated ['5c37dc84-f80b-4e12-8d9d-49a8b22167f0']#033[00m Dec 15 05:09:16 localhost ceph-mon[298913]: mon.np0005559462@0(leader).osd e209 do_prune osdmap full prune enabled Dec 15 05:09:16 localhost ceph-mon[298913]: from='mgr.34411 172.18.0.108:0/929118679' entity='mgr.np0005559464.aomnqe' cmd={"prefix": "auth get", "entity": "client.eve49", "format": "json"} : dispatch Dec 15 05:09:16 localhost ceph-mon[298913]: from='mgr.34411 172.18.0.108:0/929118679' entity='mgr.np0005559464.aomnqe' cmd={"prefix": "auth rm", "entity": "client.eve49"} : dispatch Dec 15 05:09:16 localhost ceph-mon[298913]: from='mgr.34411 ' entity='mgr.np0005559464.aomnqe' cmd={"prefix": "auth rm", "entity": "client.eve49"} : dispatch Dec 15 05:09:16 localhost ceph-mon[298913]: from='mgr.34411 ' entity='mgr.np0005559464.aomnqe' cmd='[{"prefix": "auth rm", "entity": "client.eve49"}]': finished Dec 15 05:09:16 localhost ceph-mon[298913]: mon.np0005559462@0(leader).osd e210 e210: 6 total, 6 up, 6 in Dec 15 05:09:16 localhost ceph-mon[298913]: log_channel(cluster) log [DBG] : osdmap e210: 6 total, 6 up, 6 in Dec 15 05:09:16 localhost nova_compute[286344]: 2025-12-15 10:09:16.172 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup 
/usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:09:16 localhost nova_compute[286344]: 2025-12-15 10:09:16.266 286348 DEBUG oslo_service.periodic_task [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Dec 15 05:09:16 localhost nova_compute[286344]: 2025-12-15 10:09:16.269 286348 DEBUG oslo_service.periodic_task [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Dec 15 05:09:16 localhost ceph-mon[298913]: mon.np0005559462@0(leader).osd e210 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104 Dec 15 05:09:16 localhost ceph-mon[298913]: mon.np0005559462@0(leader).osd e210 do_prune osdmap full prune enabled Dec 15 05:09:16 localhost ceph-mon[298913]: mon.np0005559462@0(leader).osd e211 e211: 6 total, 6 up, 6 in Dec 15 05:09:16 localhost ceph-mon[298913]: log_channel(cluster) log [DBG] : osdmap e211: 6 total, 6 up, 6 in Dec 15 05:09:16 localhost neutron_sriov_agent[260044]: 2025-12-15 10:09:16.401 2 INFO neutron.agent.securitygroups_rpc [None req-be12a3ed-ad42-4ca9-a8bc-5157410b6093 af329bd85e524c1ea41f4255aa5eb9d8 1d15e551a54e4d0ba499cdbee6530731 - - default default] Security group member updated ['5c37dc84-f80b-4e12-8d9d-49a8b22167f0']#033[00m Dec 15 05:09:16 localhost systemd[1]: Started /usr/bin/podman healthcheck run 67ee72fcd99c896f79e8807764b1d0f04e7d45927d06e306c0986394e8ed42a0. Dec 15 05:09:16 localhost systemd[1]: Started /usr/bin/podman healthcheck run 730c50020a7d9e2da21d2462d40af20ac303e43f3491f692cbbd87f432ba3e09. Dec 15 05:09:16 localhost systemd[1]: Started /usr/bin/podman healthcheck run 9f0f481c937606298203797a71924b7614a5d9e3f4a8afd837cfa5b9602aedeb. 
Dec 15 05:09:16 localhost systemd[1]: Started /usr/bin/podman healthcheck run ac2eae30a2a7f971398af42ea67b3ed799d4b3341f7688ebcd73f7ca7f66c2a8. Dec 15 05:09:16 localhost systemd[1]: Started /usr/bin/podman healthcheck run b3d8483ade539500e228c2af9c0c3e647febb5ec7903610a83aea32631007d4a. Dec 15 05:09:16 localhost neutron_sriov_agent[260044]: 2025-12-15 10:09:16.769 2 INFO neutron.agent.securitygroups_rpc [None req-03bfd829-27fd-457d-8aef-e80c85c40a7c af329bd85e524c1ea41f4255aa5eb9d8 1d15e551a54e4d0ba499cdbee6530731 - - default default] Security group member updated ['5c37dc84-f80b-4e12-8d9d-49a8b22167f0']#033[00m Dec 15 05:09:16 localhost podman[332154]: 2025-12-15 10:09:16.765209351 +0000 UTC m=+0.083762059 container health_status ac2eae30a2a7f971398af42ea67b3ed799d4b3341f7688ebcd73f7ca7f66c2a8 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '07f3af15d79276418f54a8dde2fc251064527c3571430e14af77aaa2c01f1193'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.build-date=20251202) Dec 15 05:09:16 localhost systemd[1]: tmp-crun.FyAQZv.mount: Deactivated successfully. Dec 15 05:09:16 localhost podman[332152]: 2025-12-15 10:09:16.836541761 +0000 UTC m=+0.159806916 container health_status 730c50020a7d9e2da21d2462d40af20ac303e43f3491f692cbbd87f432ba3e09 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, architecture=x86_64, managed_by=edpm_ansible, io.openshift.expose-services=, vendor=Red Hat, Inc., io.buildah.version=1.33.7, maintainer=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, container_name=openstack_network_exporter, url=https://catalog.redhat.com/en/search?searchType=containers, release=1755695350, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, io.openshift.tags=minimal rhel9, version=9.6, config_id=openstack_network_exporter, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'cb1e9478634a00c8fb5ad9cd7fcd38c7b2a1a85187dfaa618bc715c24c2b15fb'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, name=ubi9-minimal, build-date=2025-08-20T13:12:41, distribution-scope=public, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9.) 
Dec 15 05:09:16 localhost podman[332165]: 2025-12-15 10:09:16.788382922 +0000 UTC m=+0.099875317 container health_status b3d8483ade539500e228c2af9c0c3e647febb5ec7903610a83aea32631007d4a (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, config_id=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '07f3af15d79276418f54a8dde2fc251064527c3571430e14af77aaa2c01f1193-cb1e9478634a00c8fb5ad9cd7fcd38c7b2a1a85187dfaa618bc715c24c2b15fb'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.vendor=CentOS, container_name=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator 
team, tcib_managed=true) Dec 15 05:09:16 localhost podman[332154]: 2025-12-15 10:09:16.849502233 +0000 UTC m=+0.168055021 container exec_died ac2eae30a2a7f971398af42ea67b3ed799d4b3341f7688ebcd73f7ca7f66c2a8 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, org.label-schema.vendor=CentOS, config_id=ovn_controller, org.label-schema.license=GPLv2, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '07f3af15d79276418f54a8dde2fc251064527c3571430e14af77aaa2c01f1193'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller) Dec 15 05:09:16 localhost systemd[1]: ac2eae30a2a7f971398af42ea67b3ed799d4b3341f7688ebcd73f7ca7f66c2a8.service: Deactivated successfully. 
Dec 15 05:09:16 localhost podman[332165]: 2025-12-15 10:09:16.873400433 +0000 UTC m=+0.184892818 container exec_died b3d8483ade539500e228c2af9c0c3e647febb5ec7903610a83aea32631007d4a (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_id=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, managed_by=edpm_ansible, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '07f3af15d79276418f54a8dde2fc251064527c3571430e14af77aaa2c01f1193-cb1e9478634a00c8fb5ad9cd7fcd38c7b2a1a85187dfaa618bc715c24c2b15fb'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute) Dec 
15 05:09:16 localhost systemd[1]: b3d8483ade539500e228c2af9c0c3e647febb5ec7903610a83aea32631007d4a.service: Deactivated successfully. Dec 15 05:09:16 localhost podman[332152]: 2025-12-15 10:09:16.936418976 +0000 UTC m=+0.259684151 container exec_died 730c50020a7d9e2da21d2462d40af20ac303e43f3491f692cbbd87f432ba3e09 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, io.buildah.version=1.33.7, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., architecture=x86_64, managed_by=edpm_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, container_name=openstack_network_exporter, maintainer=Red Hat, Inc., name=ubi9-minimal, release=1755695350, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'cb1e9478634a00c8fb5ad9cd7fcd38c7b2a1a85187dfaa618bc715c24c2b15fb'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', 
'/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.expose-services=, io.openshift.tags=minimal rhel9, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., build-date=2025-08-20T13:12:41, distribution-scope=public, com.redhat.component=ubi9-minimal-container, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vendor=Red Hat, Inc., version=9.6, config_id=openstack_network_exporter) Dec 15 05:09:16 localhost systemd[1]: 730c50020a7d9e2da21d2462d40af20ac303e43f3491f692cbbd87f432ba3e09.service: Deactivated successfully. 
Dec 15 05:09:16 localhost podman[332153]: 2025-12-15 10:09:16.988119762 +0000 UTC m=+0.305535628 container health_status 9f0f481c937606298203797a71924b7614a5d9e3f4a8afd837cfa5b9602aedeb (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, container_name=multipathd, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_managed=true, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251202) Dec 15 05:09:17 localhost podman[332153]: 2025-12-15 10:09:17.002824272 +0000 UTC m=+0.320240138 container exec_died 
9f0f481c937606298203797a71924b7614a5d9e3f4a8afd837cfa5b9602aedeb (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, tcib_managed=true, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, managed_by=edpm_ansible, container_name=multipathd) Dec 15 05:09:17 localhost systemd[1]: 9f0f481c937606298203797a71924b7614a5d9e3f4a8afd837cfa5b9602aedeb.service: Deactivated successfully. 
Dec 15 05:09:17 localhost podman[332151]: 2025-12-15 10:09:17.098258187 +0000 UTC m=+0.422844909 container health_status 67ee72fcd99c896f79e8807764b1d0f04e7d45927d06e306c0986394e8ed42a0 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, maintainer=The Prometheus Authors , managed_by=edpm_ansible, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter) Dec 15 05:09:17 localhost podman[332151]: 2025-12-15 10:09:17.113499602 +0000 UTC m=+0.438086324 container exec_died 67ee72fcd99c896f79e8807764b1d0f04e7d45927d06e306c0986394e8ed42a0 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible, config_data={'command': 
['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter) Dec 15 05:09:17 localhost systemd[1]: 67ee72fcd99c896f79e8807764b1d0f04e7d45927d06e306c0986394e8ed42a0.service: Deactivated successfully. 
Dec 15 05:09:17 localhost nova_compute[286344]: 2025-12-15 10:09:17.269 286348 DEBUG oslo_service.periodic_task [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Dec 15 05:09:17 localhost nova_compute[286344]: 2025-12-15 10:09:17.270 286348 DEBUG nova.compute.manager [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m Dec 15 05:09:17 localhost nova_compute[286344]: 2025-12-15 10:09:17.270 286348 DEBUG nova.compute.manager [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m Dec 15 05:09:17 localhost nova_compute[286344]: 2025-12-15 10:09:17.348 286348 DEBUG oslo_concurrency.lockutils [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Acquiring lock "refresh_cache-39ff1bd9-6f6b-44c8-bbec-a1fd9d196359" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m Dec 15 05:09:17 localhost nova_compute[286344]: 2025-12-15 10:09:17.349 286348 DEBUG oslo_concurrency.lockutils [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Acquired lock "refresh_cache-39ff1bd9-6f6b-44c8-bbec-a1fd9d196359" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m Dec 15 05:09:17 localhost nova_compute[286344]: 2025-12-15 10:09:17.349 286348 DEBUG nova.network.neutron [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] [instance: 39ff1bd9-6f6b-44c8-bbec-a1fd9d196359] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m Dec 15 05:09:17 localhost nova_compute[286344]: 2025-12-15 10:09:17.349 
286348 DEBUG nova.objects.instance [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Lazy-loading 'info_cache' on Instance uuid 39ff1bd9-6f6b-44c8-bbec-a1fd9d196359 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m Dec 15 05:09:17 localhost ovn_metadata_agent[160585]: 2025-12-15 10:09:17.622 160590 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=12d96d64-e862-4f68-81e5-8d9ec5d3a5e2, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '17'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m Dec 15 05:09:17 localhost nova_compute[286344]: 2025-12-15 10:09:17.946 286348 DEBUG nova.network.neutron [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] [instance: 39ff1bd9-6f6b-44c8-bbec-a1fd9d196359] Updating instance_info_cache with network_info: [{"id": "03ef8889-3216-43fb-8a52-4be17a956ce1", "address": "fa:16:3e:74:df:7c", "network": {"id": "befb7a72-17a9-4bcb-b561-84b8f626685a", "bridge": "br-int", "label": "private", "subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.201", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.20", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.0.3"}}], "meta": {"injected": false, "tenant_id": "c785bf23f53946bc99867d8832a50266", "mtu": 1292, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap03ef8889-32", "ovs_interfaceid": "03ef8889-3216-43fb-8a52-4be17a956ce1", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, 
"preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m Dec 15 05:09:17 localhost nova_compute[286344]: 2025-12-15 10:09:17.966 286348 DEBUG oslo_concurrency.lockutils [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Releasing lock "refresh_cache-39ff1bd9-6f6b-44c8-bbec-a1fd9d196359" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m Dec 15 05:09:17 localhost nova_compute[286344]: 2025-12-15 10:09:17.967 286348 DEBUG nova.compute.manager [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] [instance: 39ff1bd9-6f6b-44c8-bbec-a1fd9d196359] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m Dec 15 05:09:18 localhost neutron_dhcp_agent[267542]: 2025-12-15 10:09:18.287 267546 INFO neutron.agent.linux.ip_lib [None req-3ef31b3a-26d7-493e-8dcd-a6391d55ddfc - - - - - -] Device tapa53b633d-e1 cannot be used as it has no MAC address#033[00m Dec 15 05:09:18 localhost nova_compute[286344]: 2025-12-15 10:09:18.313 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:09:18 localhost kernel: device tapa53b633d-e1 entered promiscuous mode Dec 15 05:09:18 localhost nova_compute[286344]: 2025-12-15 10:09:18.324 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:09:18 localhost NetworkManager[5963]: [1765793358.3248] manager: (tapa53b633d-e1): new Generic device (/org/freedesktop/NetworkManager/Devices/70) Dec 15 05:09:18 localhost ovn_controller[154603]: 2025-12-15T10:09:18Z|00448|binding|INFO|Claiming lport a53b633d-e179-45d6-9571-f3615ed5347f for this chassis. 
Dec 15 05:09:18 localhost ovn_controller[154603]: 2025-12-15T10:09:18Z|00449|binding|INFO|a53b633d-e179-45d6-9571-f3615ed5347f: Claiming unknown Dec 15 05:09:18 localhost systemd-udevd[332267]: Network interface NamePolicy= disabled on kernel command line. Dec 15 05:09:18 localhost ovn_metadata_agent[160585]: 2025-12-15 10:09:18.338 160590 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005559462.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.3/28', 'neutron:device_id': 'dhcpce287b4b-a043-5ed9-8e08-4fd555d768a2-3c24c7aa-9727-4a7c-a93f-1107646953b1', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-3c24c7aa-9727-4a7c-a93f-1107646953b1', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '37134118fff54995bf9c18df154a3cf8', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=0c3d9771-dd19-4b51-8a3d-d2c0a052af0f, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=a53b633d-e179-45d6-9571-f3615ed5347f) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Dec 15 05:09:18 localhost ovn_metadata_agent[160585]: 2025-12-15 10:09:18.340 160590 INFO neutron.agent.ovn.metadata.agent [-] Port a53b633d-e179-45d6-9571-f3615ed5347f in datapath 3c24c7aa-9727-4a7c-a93f-1107646953b1 bound to our chassis#033[00m Dec 15 05:09:18 localhost ovn_metadata_agent[160585]: 
2025-12-15 10:09:18.344 160590 DEBUG neutron.agent.ovn.metadata.agent [-] Port 239810fd-ef2e-476a-986c-c04269450783 IP addresses were not retrieved from the Port_Binding MAC column ['unknown'] _get_port_ips /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:536#033[00m Dec 15 05:09:18 localhost ovn_metadata_agent[160585]: 2025-12-15 10:09:18.344 160590 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 3c24c7aa-9727-4a7c-a93f-1107646953b1, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m Dec 15 05:09:18 localhost ovn_metadata_agent[160585]: 2025-12-15 10:09:18.345 160858 DEBUG oslo.privsep.daemon [-] privsep: reply[fb46fd5b-ed4a-4ae7-af65-0a848474096a]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Dec 15 05:09:18 localhost journal[231322]: ethtool ioctl error on tapa53b633d-e1: No such device Dec 15 05:09:18 localhost nova_compute[286344]: 2025-12-15 10:09:18.362 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:09:18 localhost ovn_controller[154603]: 2025-12-15T10:09:18Z|00450|binding|INFO|Setting lport a53b633d-e179-45d6-9571-f3615ed5347f ovn-installed in OVS Dec 15 05:09:18 localhost ovn_controller[154603]: 2025-12-15T10:09:18Z|00451|binding|INFO|Setting lport a53b633d-e179-45d6-9571-f3615ed5347f up in Southbound Dec 15 05:09:18 localhost nova_compute[286344]: 2025-12-15 10:09:18.369 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:09:18 localhost nova_compute[286344]: 2025-12-15 10:09:18.370 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:09:18 localhost journal[231322]: ethtool 
ioctl error on tapa53b633d-e1: No such device Dec 15 05:09:18 localhost journal[231322]: ethtool ioctl error on tapa53b633d-e1: No such device Dec 15 05:09:18 localhost journal[231322]: ethtool ioctl error on tapa53b633d-e1: No such device Dec 15 05:09:18 localhost journal[231322]: ethtool ioctl error on tapa53b633d-e1: No such device Dec 15 05:09:18 localhost journal[231322]: ethtool ioctl error on tapa53b633d-e1: No such device Dec 15 05:09:18 localhost neutron_sriov_agent[260044]: 2025-12-15 10:09:18.404 2 INFO neutron.agent.securitygroups_rpc [None req-58d259c3-322a-4fb4-9bfb-df1bbcc658f6 ce7b69d2eeaf4277a0135eadc2f767d3 37134118fff54995bf9c18df154a3cf8 - - default default] Security group member updated ['0ef0bd73-d34f-45ef-bf2a-2ccb46289a13']#033[00m Dec 15 05:09:18 localhost journal[231322]: ethtool ioctl error on tapa53b633d-e1: No such device Dec 15 05:09:18 localhost journal[231322]: ethtool ioctl error on tapa53b633d-e1: No such device Dec 15 05:09:18 localhost nova_compute[286344]: 2025-12-15 10:09:18.424 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:09:18 localhost nova_compute[286344]: 2025-12-15 10:09:18.466 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:09:18 localhost nova_compute[286344]: 2025-12-15 10:09:18.870 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:09:18 localhost nova_compute[286344]: 2025-12-15 10:09:18.963 286348 DEBUG oslo_service.periodic_task [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Dec 15 05:09:19 localhost nova_compute[286344]: 2025-12-15 10:09:19.269 
286348 DEBUG oslo_service.periodic_task [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Dec 15 05:09:19 localhost podman[332338]: Dec 15 05:09:19 localhost podman[332338]: 2025-12-15 10:09:19.367862549 +0000 UTC m=+0.098961002 container create dc7156822d75b972805ee20242909cd562d0e9043045aece8339eaff987a154b (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-3c24c7aa-9727-4a7c-a93f-1107646953b1, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3) Dec 15 05:09:19 localhost systemd[1]: Started libpod-conmon-dc7156822d75b972805ee20242909cd562d0e9043045aece8339eaff987a154b.scope. Dec 15 05:09:19 localhost podman[332338]: 2025-12-15 10:09:19.322853355 +0000 UTC m=+0.053951838 image pull quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified Dec 15 05:09:19 localhost systemd[1]: Started libcrun container. 
Dec 15 05:09:19 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/932bdab5716148067981ee9890707ea2d93565e780ebb108c1d094089e393bba/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff) Dec 15 05:09:19 localhost podman[332338]: 2025-12-15 10:09:19.461507406 +0000 UTC m=+0.192605859 container init dc7156822d75b972805ee20242909cd562d0e9043045aece8339eaff987a154b (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-3c24c7aa-9727-4a7c-a93f-1107646953b1, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202, tcib_managed=true) Dec 15 05:09:19 localhost podman[332338]: 2025-12-15 10:09:19.471526167 +0000 UTC m=+0.202624620 container start dc7156822d75b972805ee20242909cd562d0e9043045aece8339eaff987a154b (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-3c24c7aa-9727-4a7c-a93f-1107646953b1, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.license=GPLv2, org.label-schema.build-date=20251202) Dec 15 05:09:19 localhost dnsmasq[332356]: started, version 2.85 cachesize 150 Dec 15 05:09:19 localhost dnsmasq[332356]: DNS service limited to local subnets Dec 15 05:09:19 localhost dnsmasq[332356]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile Dec 15 05:09:19 localhost dnsmasq[332356]: warning: no upstream servers 
configured Dec 15 05:09:19 localhost dnsmasq-dhcp[332356]: DHCP, static leases only on 10.100.0.0, lease time 1d Dec 15 05:09:19 localhost dnsmasq[332356]: read /var/lib/neutron/dhcp/3c24c7aa-9727-4a7c-a93f-1107646953b1/addn_hosts - 0 addresses Dec 15 05:09:19 localhost dnsmasq-dhcp[332356]: read /var/lib/neutron/dhcp/3c24c7aa-9727-4a7c-a93f-1107646953b1/host Dec 15 05:09:19 localhost dnsmasq-dhcp[332356]: read /var/lib/neutron/dhcp/3c24c7aa-9727-4a7c-a93f-1107646953b1/opts Dec 15 05:09:19 localhost neutron_dhcp_agent[267542]: 2025-12-15 10:09:19.518 267546 INFO neutron.agent.dhcp.agent [None req-c6a086cb-2191-414e-abb5-2952cf157984 - - - - - -] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-12-15T10:09:18Z, description=, device_id=, device_owner=, dns_assignment=[], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[], id=4c3aaf9e-eb6e-46dd-b91a-10646e40ef02, ip_allocation=immediate, mac_address=fa:16:3e:1e:88:dd, name=tempest-RoutersTest-1305806659, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-12-15T10:09:15Z, description=, dns_domain=, id=3c24c7aa-9727-4a7c-a93f-1107646953b1, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-RoutersTest-752515366, port_security_enabled=True, project_id=37134118fff54995bf9c18df154a3cf8, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=56559, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=3043, status=ACTIVE, subnets=['26e1ae82-1265-4173-8e85-f7fd5e8d71c2'], tags=[], tenant_id=37134118fff54995bf9c18df154a3cf8, updated_at=2025-12-15T10:09:16Z, vlan_transparent=None, network_id=3c24c7aa-9727-4a7c-a93f-1107646953b1, port_security_enabled=True, project_id=37134118fff54995bf9c18df154a3cf8, 
qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=['0ef0bd73-d34f-45ef-bf2a-2ccb46289a13'], standard_attr_id=3050, status=DOWN, tags=[], tenant_id=37134118fff54995bf9c18df154a3cf8, updated_at=2025-12-15T10:09:18Z on network 3c24c7aa-9727-4a7c-a93f-1107646953b1#033[00m Dec 15 05:09:19 localhost ceph-mon[298913]: mon.np0005559462@0(leader) e17 handle_command mon_command({"prefix":"df", "format":"json"} v 0) Dec 15 05:09:19 localhost ceph-mon[298913]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/1389758561' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch Dec 15 05:09:19 localhost ceph-mon[298913]: mon.np0005559462@0(leader) e17 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0) Dec 15 05:09:19 localhost ceph-mon[298913]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/1389758561' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch Dec 15 05:09:19 localhost neutron_dhcp_agent[267542]: 2025-12-15 10:09:19.620 267546 INFO neutron.agent.dhcp.agent [None req-4b166563-10f7-46b4-a3d2-3743181f1cce - - - - - -] DHCP configuration for ports {'7be76b5c-15fb-4ced-9d6f-eec5dc8c574f'} is completed#033[00m Dec 15 05:09:19 localhost dnsmasq[332356]: read /var/lib/neutron/dhcp/3c24c7aa-9727-4a7c-a93f-1107646953b1/addn_hosts - 1 addresses Dec 15 05:09:19 localhost dnsmasq-dhcp[332356]: read /var/lib/neutron/dhcp/3c24c7aa-9727-4a7c-a93f-1107646953b1/host Dec 15 05:09:19 localhost dnsmasq-dhcp[332356]: read /var/lib/neutron/dhcp/3c24c7aa-9727-4a7c-a93f-1107646953b1/opts Dec 15 05:09:19 localhost podman[332374]: 2025-12-15 10:09:19.73822634 +0000 UTC m=+0.066087258 container kill dc7156822d75b972805ee20242909cd562d0e9043045aece8339eaff987a154b (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, 
name=neutron-dnsmasq-qdhcp-3c24c7aa-9727-4a7c-a93f-1107646953b1, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202) Dec 15 05:09:19 localhost neutron_dhcp_agent[267542]: 2025-12-15 10:09:19.969 267546 INFO neutron.agent.dhcp.agent [None req-8dca2f80-306f-4017-aecc-a6df30cc4668 - - - - - -] DHCP configuration for ports {'4c3aaf9e-eb6e-46dd-b91a-10646e40ef02'} is completed#033[00m Dec 15 05:09:20 localhost nova_compute[286344]: 2025-12-15 10:09:20.270 286348 DEBUG oslo_service.periodic_task [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Dec 15 05:09:20 localhost nova_compute[286344]: 2025-12-15 10:09:20.271 286348 DEBUG nova.compute.manager [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... 
_reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m Dec 15 05:09:20 localhost neutron_dhcp_agent[267542]: 2025-12-15 10:09:20.878 267546 INFO neutron.agent.linux.ip_lib [None req-cbc7a27c-e145-4529-b8a6-d0f7f67106de - - - - - -] Device tapd581117d-e8 cannot be used as it has no MAC address#033[00m Dec 15 05:09:20 localhost nova_compute[286344]: 2025-12-15 10:09:20.904 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:09:20 localhost kernel: device tapd581117d-e8 entered promiscuous mode Dec 15 05:09:20 localhost NetworkManager[5963]: [1765793360.9128] manager: (tapd581117d-e8): new Generic device (/org/freedesktop/NetworkManager/Devices/71) Dec 15 05:09:20 localhost ovn_controller[154603]: 2025-12-15T10:09:20Z|00452|binding|INFO|Claiming lport d581117d-e8be-4a95-9d02-22e704d6746c for this chassis. Dec 15 05:09:20 localhost ovn_controller[154603]: 2025-12-15T10:09:20Z|00453|binding|INFO|d581117d-e8be-4a95-9d02-22e704d6746c: Claiming unknown Dec 15 05:09:20 localhost nova_compute[286344]: 2025-12-15 10:09:20.912 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:09:20 localhost systemd-udevd[332269]: Network interface NamePolicy= disabled on kernel command line. 
Dec 15 05:09:20 localhost ovn_metadata_agent[160585]: 2025-12-15 10:09:20.922 160590 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005559462.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8::2/64', 'neutron:device_id': 'dhcpce287b4b-a043-5ed9-8e08-4fd555d768a2-5b141ff3-0264-4c09-b1ae-3e5c10ec9cfe', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-5b141ff3-0264-4c09-b1ae-3e5c10ec9cfe', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '1d15e551a54e4d0ba499cdbee6530731', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=6dda4428-c0e2-4e1a-b493-e2c0a73bb32a, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=d581117d-e8be-4a95-9d02-22e704d6746c) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Dec 15 05:09:20 localhost ovn_metadata_agent[160585]: 2025-12-15 10:09:20.924 160590 INFO neutron.agent.ovn.metadata.agent [-] Port d581117d-e8be-4a95-9d02-22e704d6746c in datapath 5b141ff3-0264-4c09-b1ae-3e5c10ec9cfe bound to our chassis#033[00m Dec 15 05:09:20 localhost ovn_metadata_agent[160585]: 2025-12-15 10:09:20.926 160590 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network 5b141ff3-0264-4c09-b1ae-3e5c10ec9cfe or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params 
/usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599#033[00m Dec 15 05:09:20 localhost ovn_metadata_agent[160585]: 2025-12-15 10:09:20.927 160858 DEBUG oslo.privsep.daemon [-] privsep: reply[5ac26059-2d08-4a66-a1d3-7aae4fdc80ea]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Dec 15 05:09:20 localhost ovn_controller[154603]: 2025-12-15T10:09:20Z|00454|binding|INFO|Setting lport d581117d-e8be-4a95-9d02-22e704d6746c ovn-installed in OVS Dec 15 05:09:20 localhost ovn_controller[154603]: 2025-12-15T10:09:20Z|00455|binding|INFO|Setting lport d581117d-e8be-4a95-9d02-22e704d6746c up in Southbound Dec 15 05:09:20 localhost nova_compute[286344]: 2025-12-15 10:09:20.966 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:09:21 localhost nova_compute[286344]: 2025-12-15 10:09:21.005 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:09:21 localhost nova_compute[286344]: 2025-12-15 10:09:21.040 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:09:21 localhost ceph-mon[298913]: mon.np0005559462@0(leader).osd e211 do_prune osdmap full prune enabled Dec 15 05:09:21 localhost ceph-mon[298913]: mon.np0005559462@0(leader).osd e212 e212: 6 total, 6 up, 6 in Dec 15 05:09:21 localhost nova_compute[286344]: 2025-12-15 10:09:21.173 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:09:21 localhost ceph-mon[298913]: log_channel(cluster) log [DBG] : osdmap e212: 6 total, 6 up, 6 in Dec 15 05:09:21 localhost ceph-mon[298913]: mon.np0005559462@0(leader).osd e212 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 
348127232 kv_alloc: 318767104 Dec 15 05:09:21 localhost neutron_sriov_agent[260044]: 2025-12-15 10:09:21.616 2 INFO neutron.agent.securitygroups_rpc [None req-7b4d0cc1-d5b1-4380-8253-b44b7e58f172 af329bd85e524c1ea41f4255aa5eb9d8 1d15e551a54e4d0ba499cdbee6530731 - - default default] Security group member updated ['5c37dc84-f80b-4e12-8d9d-49a8b22167f0']#033[00m Dec 15 05:09:21 localhost neutron_dhcp_agent[267542]: 2025-12-15 10:09:21.884 267546 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-12-15T10:09:18Z, description=, device_id=47276ef3-c70e-423e-9bd5-27fc702b7139, device_owner=network:router_interface, dns_assignment=[], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[], id=4c3aaf9e-eb6e-46dd-b91a-10646e40ef02, ip_allocation=immediate, mac_address=fa:16:3e:1e:88:dd, name=tempest-RoutersTest-1305806659, network_id=3c24c7aa-9727-4a7c-a93f-1107646953b1, port_security_enabled=True, project_id=37134118fff54995bf9c18df154a3cf8, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=3, security_groups=['0ef0bd73-d34f-45ef-bf2a-2ccb46289a13'], standard_attr_id=3050, status=ACTIVE, tags=[], tenant_id=37134118fff54995bf9c18df154a3cf8, updated_at=2025-12-15T10:09:19Z on network 3c24c7aa-9727-4a7c-a93f-1107646953b1#033[00m Dec 15 05:09:21 localhost podman[332459]: Dec 15 05:09:21 localhost podman[332459]: 2025-12-15 10:09:21.917605549 +0000 UTC m=+0.089586767 container create 5bdc3fa5f47a44e97ad277b1f8f9ba7778f747a25f8c8c513d001cfa7c4b3b7f (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-5b141ff3-0264-4c09-b1ae-3e5c10ec9cfe, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS 
Stream 9 Base Image, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb) Dec 15 05:09:21 localhost podman[332459]: 2025-12-15 10:09:21.865162952 +0000 UTC m=+0.037144250 image pull quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified Dec 15 05:09:21 localhost systemd[1]: Started libpod-conmon-5bdc3fa5f47a44e97ad277b1f8f9ba7778f747a25f8c8c513d001cfa7c4b3b7f.scope. Dec 15 05:09:22 localhost systemd[1]: Started libcrun container. Dec 15 05:09:22 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/eb988999a3df65dfebb6456e42cd97db8e19e2efe306ed75020f24176521fa67/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff) Dec 15 05:09:22 localhost podman[332459]: 2025-12-15 10:09:22.017845453 +0000 UTC m=+0.189826671 container init 5bdc3fa5f47a44e97ad277b1f8f9ba7778f747a25f8c8c513d001cfa7c4b3b7f (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-5b141ff3-0264-4c09-b1ae-3e5c10ec9cfe, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb) Dec 15 05:09:22 localhost podman[332459]: 2025-12-15 10:09:22.029091979 +0000 UTC m=+0.201073197 container start 5bdc3fa5f47a44e97ad277b1f8f9ba7778f747a25f8c8c513d001cfa7c4b3b7f (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-5b141ff3-0264-4c09-b1ae-3e5c10ec9cfe, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.vendor=CentOS, tcib_managed=true, 
org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2) Dec 15 05:09:22 localhost dnsmasq[332491]: started, version 2.85 cachesize 150 Dec 15 05:09:22 localhost dnsmasq[332491]: DNS service limited to local subnets Dec 15 05:09:22 localhost dnsmasq[332491]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile Dec 15 05:09:22 localhost dnsmasq[332491]: warning: no upstream servers configured Dec 15 05:09:22 localhost dnsmasq-dhcp[332491]: DHCPv6, static leases only on 2001:db8::, lease time 1d Dec 15 05:09:22 localhost dnsmasq[332491]: read /var/lib/neutron/dhcp/5b141ff3-0264-4c09-b1ae-3e5c10ec9cfe/addn_hosts - 0 addresses Dec 15 05:09:22 localhost dnsmasq-dhcp[332491]: read /var/lib/neutron/dhcp/5b141ff3-0264-4c09-b1ae-3e5c10ec9cfe/host Dec 15 05:09:22 localhost dnsmasq-dhcp[332491]: read /var/lib/neutron/dhcp/5b141ff3-0264-4c09-b1ae-3e5c10ec9cfe/opts Dec 15 05:09:22 localhost neutron_dhcp_agent[267542]: 2025-12-15 10:09:22.093 267546 INFO neutron.agent.dhcp.agent [None req-cbc7a27c-e145-4529-b8a6-d0f7f67106de - - - - - -] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-12-15T10:09:21Z, description=, device_id=, device_owner=, dns_assignment=[], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[], id=db5c0db9-92be-4115-b881-b6839925cec6, ip_allocation=immediate, mac_address=fa:16:3e:0d:6c:e5, name=tempest-PortsIpV6TestJSON-1653914342, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-12-15T10:09:18Z, description=, dns_domain=, id=5b141ff3-0264-4c09-b1ae-3e5c10ec9cfe, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, 
name=tempest-PortsIpV6TestJSON-582786432, port_security_enabled=True, project_id=1d15e551a54e4d0ba499cdbee6530731, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=46136, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=3051, status=ACTIVE, subnets=['6ec9c089-155a-4766-97c5-d580de8c731d'], tags=[], tenant_id=1d15e551a54e4d0ba499cdbee6530731, updated_at=2025-12-15T10:09:19Z, vlan_transparent=None, network_id=5b141ff3-0264-4c09-b1ae-3e5c10ec9cfe, port_security_enabled=True, project_id=1d15e551a54e4d0ba499cdbee6530731, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=['5c37dc84-f80b-4e12-8d9d-49a8b22167f0'], standard_attr_id=3082, status=DOWN, tags=[], tenant_id=1d15e551a54e4d0ba499cdbee6530731, updated_at=2025-12-15T10:09:21Z on network 5b141ff3-0264-4c09-b1ae-3e5c10ec9cfe#033[00m Dec 15 05:09:22 localhost dnsmasq[332356]: read /var/lib/neutron/dhcp/3c24c7aa-9727-4a7c-a93f-1107646953b1/addn_hosts - 1 addresses Dec 15 05:09:22 localhost dnsmasq-dhcp[332356]: read /var/lib/neutron/dhcp/3c24c7aa-9727-4a7c-a93f-1107646953b1/host Dec 15 05:09:22 localhost dnsmasq-dhcp[332356]: read /var/lib/neutron/dhcp/3c24c7aa-9727-4a7c-a93f-1107646953b1/opts Dec 15 05:09:22 localhost podman[332495]: 2025-12-15 10:09:22.145102984 +0000 UTC m=+0.060686571 container kill dc7156822d75b972805ee20242909cd562d0e9043045aece8339eaff987a154b (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-3c24c7aa-9727-4a7c-a93f-1107646953b1, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image) Dec 15 05:09:22 localhost dnsmasq[332491]: 
read /var/lib/neutron/dhcp/5b141ff3-0264-4c09-b1ae-3e5c10ec9cfe/addn_hosts - 1 addresses Dec 15 05:09:22 localhost dnsmasq-dhcp[332491]: read /var/lib/neutron/dhcp/5b141ff3-0264-4c09-b1ae-3e5c10ec9cfe/host Dec 15 05:09:22 localhost podman[332529]: 2025-12-15 10:09:22.311780746 +0000 UTC m=+0.064363441 container kill 5bdc3fa5f47a44e97ad277b1f8f9ba7778f747a25f8c8c513d001cfa7c4b3b7f (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-5b141ff3-0264-4c09-b1ae-3e5c10ec9cfe, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.license=GPLv2, tcib_managed=true) Dec 15 05:09:22 localhost dnsmasq-dhcp[332491]: read /var/lib/neutron/dhcp/5b141ff3-0264-4c09-b1ae-3e5c10ec9cfe/opts Dec 15 05:09:23 localhost neutron_dhcp_agent[267542]: 2025-12-15 10:09:23.159 267546 INFO neutron.agent.dhcp.agent [None req-d9794897-d618-496e-b64e-39fb306b7eda - - - - - -] DHCP configuration for ports {'e1c6fbea-acc9-4ebe-b72d-8ffb77391217'} is completed#033[00m Dec 15 05:09:23 localhost systemd[1]: Started /usr/bin/podman healthcheck run 4d46c406a3855fd1c322e5d86c048188ea5865daa07ba23aaebacc7879668923. 
Dec 15 05:09:23 localhost podman[332554]: 2025-12-15 10:09:23.406113211 +0000 UTC m=+0.084976491 container health_status 4d46c406a3855fd1c322e5d86c048188ea5865daa07ba23aaebacc7879668923 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '07f3af15d79276418f54a8dde2fc251064527c3571430e14af77aaa2c01f1193-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3) Dec 15 05:09:23 localhost 
podman[332554]: 2025-12-15 10:09:23.440411684 +0000 UTC m=+0.119274944 container exec_died 4d46c406a3855fd1c322e5d86c048188ea5865daa07ba23aaebacc7879668923 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '07f3af15d79276418f54a8dde2fc251064527c3571430e14af77aaa2c01f1193-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2) Dec 15 05:09:23 localhost systemd[1]: 
4d46c406a3855fd1c322e5d86c048188ea5865daa07ba23aaebacc7879668923.service: Deactivated successfully. Dec 15 05:09:23 localhost neutron_dhcp_agent[267542]: 2025-12-15 10:09:23.617 267546 INFO neutron.agent.dhcp.agent [None req-5ecd9966-b9cc-4d37-a7c1-69cde46363a1 - - - - - -] DHCP configuration for ports {'4c3aaf9e-eb6e-46dd-b91a-10646e40ef02', 'db5c0db9-92be-4115-b881-b6839925cec6'} is completed#033[00m Dec 15 05:09:23 localhost nova_compute[286344]: 2025-12-15 10:09:23.873 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:09:24 localhost nova_compute[286344]: 2025-12-15 10:09:24.270 286348 DEBUG oslo_service.periodic_task [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Dec 15 05:09:24 localhost nova_compute[286344]: 2025-12-15 10:09:24.270 286348 DEBUG oslo_service.periodic_task [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Dec 15 05:09:24 localhost nova_compute[286344]: 2025-12-15 10:09:24.302 286348 DEBUG oslo_concurrency.lockutils [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Dec 15 05:09:24 localhost nova_compute[286344]: 2025-12-15 10:09:24.303 286348 DEBUG oslo_concurrency.lockutils [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner 
/usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Dec 15 05:09:24 localhost nova_compute[286344]: 2025-12-15 10:09:24.303 286348 DEBUG oslo_concurrency.lockutils [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Dec 15 05:09:24 localhost nova_compute[286344]: 2025-12-15 10:09:24.304 286348 DEBUG nova.compute.resource_tracker [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Auditing locally available compute resources for np0005559462.localdomain (node: np0005559462.localdomain) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m Dec 15 05:09:24 localhost nova_compute[286344]: 2025-12-15 10:09:24.304 286348 DEBUG oslo_concurrency.processutils [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m Dec 15 05:09:24 localhost neutron_sriov_agent[260044]: 2025-12-15 10:09:24.664 2 INFO neutron.agent.securitygroups_rpc [None req-d4c03117-815b-4947-86b0-2f1ab5557ce0 ce7b69d2eeaf4277a0135eadc2f767d3 37134118fff54995bf9c18df154a3cf8 - - default default] Security group member updated ['0ef0bd73-d34f-45ef-bf2a-2ccb46289a13']#033[00m Dec 15 05:09:24 localhost neutron_dhcp_agent[267542]: 2025-12-15 10:09:24.813 267546 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-12-15T10:09:21Z, description=, device_id=e30c508d-eb2b-42f7-967f-7f3922ddb6dc, device_owner=network:router_interface, dns_assignment=[], 
dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[], id=db5c0db9-92be-4115-b881-b6839925cec6, ip_allocation=immediate, mac_address=fa:16:3e:0d:6c:e5, name=tempest-PortsIpV6TestJSON-1653914342, network_id=5b141ff3-0264-4c09-b1ae-3e5c10ec9cfe, port_security_enabled=True, project_id=1d15e551a54e4d0ba499cdbee6530731, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=2, security_groups=['5c37dc84-f80b-4e12-8d9d-49a8b22167f0'], standard_attr_id=3082, status=DOWN, tags=[], tenant_id=1d15e551a54e4d0ba499cdbee6530731, updated_at=2025-12-15T10:09:23Z on network 5b141ff3-0264-4c09-b1ae-3e5c10ec9cfe#033[00m Dec 15 05:09:24 localhost ceph-mon[298913]: mon.np0005559462@0(leader) e17 handle_command mon_command({"prefix": "df", "format": "json"} v 0) Dec 15 05:09:24 localhost ceph-mon[298913]: log_channel(audit) log [DBG] : from='client.? 172.18.0.106:0/31950114' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch Dec 15 05:09:24 localhost nova_compute[286344]: 2025-12-15 10:09:24.889 286348 DEBUG oslo_concurrency.processutils [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.585s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m Dec 15 05:09:24 localhost dnsmasq[332356]: read /var/lib/neutron/dhcp/3c24c7aa-9727-4a7c-a93f-1107646953b1/addn_hosts - 0 addresses Dec 15 05:09:24 localhost dnsmasq-dhcp[332356]: read /var/lib/neutron/dhcp/3c24c7aa-9727-4a7c-a93f-1107646953b1/host Dec 15 05:09:24 localhost dnsmasq-dhcp[332356]: read /var/lib/neutron/dhcp/3c24c7aa-9727-4a7c-a93f-1107646953b1/opts Dec 15 05:09:24 localhost podman[332615]: 2025-12-15 10:09:24.927479618 +0000 UTC m=+0.060696922 container kill dc7156822d75b972805ee20242909cd562d0e9043045aece8339eaff987a154b (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, 
name=neutron-dnsmasq-qdhcp-3c24c7aa-9727-4a7c-a93f-1107646953b1, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS) Dec 15 05:09:24 localhost nova_compute[286344]: 2025-12-15 10:09:24.955 286348 DEBUG nova.virt.libvirt.driver [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m Dec 15 05:09:24 localhost nova_compute[286344]: 2025-12-15 10:09:24.955 286348 DEBUG nova.virt.libvirt.driver [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m Dec 15 05:09:25 localhost dnsmasq[332491]: read /var/lib/neutron/dhcp/5b141ff3-0264-4c09-b1ae-3e5c10ec9cfe/addn_hosts - 1 addresses Dec 15 05:09:25 localhost dnsmasq-dhcp[332491]: read /var/lib/neutron/dhcp/5b141ff3-0264-4c09-b1ae-3e5c10ec9cfe/host Dec 15 05:09:25 localhost podman[332644]: 2025-12-15 10:09:25.096573725 +0000 UTC m=+0.077231810 container kill 5bdc3fa5f47a44e97ad277b1f8f9ba7778f747a25f8c8c513d001cfa7c4b3b7f (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-5b141ff3-0264-4c09-b1ae-3e5c10ec9cfe, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.build-date=20251202) Dec 
15 05:09:25 localhost dnsmasq-dhcp[332491]: read /var/lib/neutron/dhcp/5b141ff3-0264-4c09-b1ae-3e5c10ec9cfe/opts Dec 15 05:09:25 localhost nova_compute[286344]: 2025-12-15 10:09:25.159 286348 WARNING nova.virt.libvirt.driver [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m Dec 15 05:09:25 localhost nova_compute[286344]: 2025-12-15 10:09:25.161 286348 DEBUG nova.compute.resource_tracker [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Hypervisor/Node resource view: name=np0005559462.localdomain free_ram=11147MB free_disk=41.836978912353516GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": 
"1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m Dec 15 05:09:25 localhost nova_compute[286344]: 2025-12-15 10:09:25.162 286348 DEBUG oslo_concurrency.lockutils [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Dec 15 05:09:25 localhost nova_compute[286344]: 2025-12-15 10:09:25.163 286348 DEBUG oslo_concurrency.lockutils [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Dec 15 05:09:25 localhost ovn_controller[154603]: 2025-12-15T10:09:25Z|00456|binding|INFO|Releasing lport a53b633d-e179-45d6-9571-f3615ed5347f from this chassis (sb_readonly=0) Dec 15 05:09:25 localhost ovn_controller[154603]: 2025-12-15T10:09:25Z|00457|binding|INFO|Setting lport a53b633d-e179-45d6-9571-f3615ed5347f down in Southbound Dec 15 05:09:25 localhost kernel: device tapa53b633d-e1 left promiscuous mode Dec 15 05:09:25 localhost nova_compute[286344]: 2025-12-15 10:09:25.212 286348 DEBUG 
ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:09:25 localhost ovn_metadata_agent[160585]: 2025-12-15 10:09:25.218 160590 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005559462.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.3/28', 'neutron:device_id': 'dhcpce287b4b-a043-5ed9-8e08-4fd555d768a2-3c24c7aa-9727-4a7c-a93f-1107646953b1', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-3c24c7aa-9727-4a7c-a93f-1107646953b1', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '37134118fff54995bf9c18df154a3cf8', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'np0005559462.localdomain'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=0c3d9771-dd19-4b51-8a3d-d2c0a052af0f, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=a53b633d-e179-45d6-9571-f3615ed5347f) old=Port_Binding(up=[True], chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Dec 15 05:09:25 localhost ovn_metadata_agent[160585]: 2025-12-15 10:09:25.220 160590 INFO neutron.agent.ovn.metadata.agent [-] Port a53b633d-e179-45d6-9571-f3615ed5347f in datapath 3c24c7aa-9727-4a7c-a93f-1107646953b1 unbound from our chassis#033[00m Dec 15 05:09:25 localhost ovn_metadata_agent[160585]: 2025-12-15 10:09:25.223 160590 DEBUG neutron.agent.ovn.metadata.agent [-] No 
valid VIF ports were found for network 3c24c7aa-9727-4a7c-a93f-1107646953b1, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m Dec 15 05:09:25 localhost ovn_metadata_agent[160585]: 2025-12-15 10:09:25.224 160858 DEBUG oslo.privsep.daemon [-] privsep: reply[9054732e-f823-4fb5-aaeb-62ef29ae54ee]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Dec 15 05:09:25 localhost nova_compute[286344]: 2025-12-15 10:09:25.236 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:09:25 localhost nova_compute[286344]: 2025-12-15 10:09:25.266 286348 DEBUG nova.compute.resource_tracker [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Instance 39ff1bd9-6f6b-44c8-bbec-a1fd9d196359 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 2, 'MEMORY_MB': 512, 'VCPU': 1}}. 
_remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m Dec 15 05:09:25 localhost nova_compute[286344]: 2025-12-15 10:09:25.266 286348 DEBUG nova.compute.resource_tracker [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m Dec 15 05:09:25 localhost nova_compute[286344]: 2025-12-15 10:09:25.267 286348 DEBUG nova.compute.resource_tracker [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Final resource view: name=np0005559462.localdomain phys_ram=15738MB used_ram=1024MB phys_disk=41GB used_disk=2GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m Dec 15 05:09:25 localhost nova_compute[286344]: 2025-12-15 10:09:25.315 286348 DEBUG oslo_concurrency.processutils [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m Dec 15 05:09:25 localhost neutron_dhcp_agent[267542]: 2025-12-15 10:09:25.372 267546 INFO neutron.agent.dhcp.agent [None req-a0d93ba9-e672-45eb-9507-66e4fdd993ce - - - - - -] DHCP configuration for ports {'db5c0db9-92be-4115-b881-b6839925cec6'} is completed#033[00m Dec 15 05:09:25 localhost ceph-mon[298913]: mon.np0005559462@0(leader) e17 handle_command mon_command({"prefix": "df", "format": "json"} v 0) Dec 15 05:09:25 localhost ceph-mon[298913]: log_channel(audit) log [DBG] : from='client.? 
172.18.0.106:0/1593231066' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch Dec 15 05:09:25 localhost nova_compute[286344]: 2025-12-15 10:09:25.805 286348 DEBUG oslo_concurrency.processutils [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.491s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m Dec 15 05:09:25 localhost nova_compute[286344]: 2025-12-15 10:09:25.812 286348 DEBUG nova.compute.provider_tree [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Inventory has not changed in ProviderTree for provider: 26c8956b-6742-4951-b566-971b9bbe323b update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m Dec 15 05:09:25 localhost nova_compute[286344]: 2025-12-15 10:09:25.829 286348 DEBUG nova.scheduler.client.report [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Inventory has not changed for provider 26c8956b-6742-4951-b566-971b9bbe323b based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m Dec 15 05:09:25 localhost nova_compute[286344]: 2025-12-15 10:09:25.832 286348 DEBUG nova.compute.resource_tracker [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Compute_service record updated for np0005559462.localdomain:np0005559462.localdomain _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m Dec 15 05:09:25 localhost nova_compute[286344]: 2025-12-15 10:09:25.832 286348 DEBUG 
oslo_concurrency.lockutils [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.670s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Dec 15 05:09:26 localhost neutron_sriov_agent[260044]: 2025-12-15 10:09:26.052 2 INFO neutron.agent.securitygroups_rpc [None req-3a18aab9-2028-4a14-adb1-a784644f602a af329bd85e524c1ea41f4255aa5eb9d8 1d15e551a54e4d0ba499cdbee6530731 - - default default] Security group member updated ['5c37dc84-f80b-4e12-8d9d-49a8b22167f0']#033[00m Dec 15 05:09:26 localhost nova_compute[286344]: 2025-12-15 10:09:26.175 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:09:26 localhost dnsmasq[332356]: exiting on receipt of SIGTERM Dec 15 05:09:26 localhost podman[332723]: 2025-12-15 10:09:26.225512992 +0000 UTC m=+0.069295664 container kill dc7156822d75b972805ee20242909cd562d0e9043045aece8339eaff987a154b (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-3c24c7aa-9727-4a7c-a93f-1107646953b1, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251202, maintainer=OpenStack Kubernetes Operator team) Dec 15 05:09:26 localhost systemd[1]: tmp-crun.0aQwhU.mount: Deactivated successfully. Dec 15 05:09:26 localhost systemd[1]: libpod-dc7156822d75b972805ee20242909cd562d0e9043045aece8339eaff987a154b.scope: Deactivated successfully. Dec 15 05:09:26 localhost systemd[1]: tmp-crun.QkmoxN.mount: Deactivated successfully. 
Dec 15 05:09:26 localhost dnsmasq[332491]: read /var/lib/neutron/dhcp/5b141ff3-0264-4c09-b1ae-3e5c10ec9cfe/addn_hosts - 0 addresses Dec 15 05:09:26 localhost dnsmasq-dhcp[332491]: read /var/lib/neutron/dhcp/5b141ff3-0264-4c09-b1ae-3e5c10ec9cfe/host Dec 15 05:09:26 localhost podman[332738]: 2025-12-15 10:09:26.293974114 +0000 UTC m=+0.083295496 container kill 5bdc3fa5f47a44e97ad277b1f8f9ba7778f747a25f8c8c513d001cfa7c4b3b7f (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-5b141ff3-0264-4c09-b1ae-3e5c10ec9cfe, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251202) Dec 15 05:09:26 localhost dnsmasq-dhcp[332491]: read /var/lib/neutron/dhcp/5b141ff3-0264-4c09-b1ae-3e5c10ec9cfe/opts Dec 15 05:09:26 localhost podman[332745]: 2025-12-15 10:09:26.303104812 +0000 UTC m=+0.063695503 container died dc7156822d75b972805ee20242909cd562d0e9043045aece8339eaff987a154b (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-3c24c7aa-9727-4a7c-a93f-1107646953b1, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, io.buildah.version=1.41.3) Dec 15 05:09:26 localhost podman[332745]: 2025-12-15 10:09:26.335953236 +0000 UTC m=+0.096543887 container cleanup dc7156822d75b972805ee20242909cd562d0e9043045aece8339eaff987a154b (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, 
name=neutron-dnsmasq-qdhcp-3c24c7aa-9727-4a7c-a93f-1107646953b1, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.build-date=20251202) Dec 15 05:09:26 localhost systemd[1]: libpod-conmon-dc7156822d75b972805ee20242909cd562d0e9043045aece8339eaff987a154b.scope: Deactivated successfully. Dec 15 05:09:26 localhost ceph-mon[298913]: mon.np0005559462@0(leader).osd e212 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104 Dec 15 05:09:26 localhost ceph-mon[298913]: mon.np0005559462@0(leader).osd e212 do_prune osdmap full prune enabled Dec 15 05:09:26 localhost ceph-mon[298913]: mon.np0005559462@0(leader).osd e213 e213: 6 total, 6 up, 6 in Dec 15 05:09:26 localhost podman[332747]: 2025-12-15 10:09:26.376703773 +0000 UTC m=+0.132258547 container remove dc7156822d75b972805ee20242909cd562d0e9043045aece8339eaff987a154b (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-3c24c7aa-9727-4a7c-a93f-1107646953b1, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image) Dec 15 05:09:26 localhost ceph-mon[298913]: log_channel(cluster) log [DBG] : osdmap e213: 6 total, 6 up, 6 in Dec 15 05:09:26 localhost neutron_dhcp_agent[267542]: 2025-12-15 10:09:26.420 267546 INFO neutron.agent.dhcp.agent [None req-12af5d4e-7517-4f52-b2cc-3f3bb96b7a01 - - - - - -] Network not present, action: clean_devices, action_kwargs: {}#033[00m Dec 15 05:09:26 localhost 
ovn_controller[154603]: 2025-12-15T10:09:26Z|00458|binding|INFO|Releasing lport d581117d-e8be-4a95-9d02-22e704d6746c from this chassis (sb_readonly=0) Dec 15 05:09:26 localhost kernel: device tapd581117d-e8 left promiscuous mode Dec 15 05:09:26 localhost ovn_controller[154603]: 2025-12-15T10:09:26Z|00459|binding|INFO|Setting lport d581117d-e8be-4a95-9d02-22e704d6746c down in Southbound Dec 15 05:09:26 localhost nova_compute[286344]: 2025-12-15 10:09:26.508 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:09:26 localhost ovn_metadata_agent[160585]: 2025-12-15 10:09:26.518 160590 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005559462.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8::2/64', 'neutron:device_id': 'dhcpce287b4b-a043-5ed9-8e08-4fd555d768a2-5b141ff3-0264-4c09-b1ae-3e5c10ec9cfe', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-5b141ff3-0264-4c09-b1ae-3e5c10ec9cfe', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '1d15e551a54e4d0ba499cdbee6530731', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'np0005559462.localdomain'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=6dda4428-c0e2-4e1a-b493-e2c0a73bb32a, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=d581117d-e8be-4a95-9d02-22e704d6746c) old=Port_Binding(up=[True], 
chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Dec 15 05:09:26 localhost ovn_metadata_agent[160585]: 2025-12-15 10:09:26.518 160590 INFO neutron.agent.ovn.metadata.agent [-] Port d581117d-e8be-4a95-9d02-22e704d6746c in datapath 5b141ff3-0264-4c09-b1ae-3e5c10ec9cfe unbound from our chassis#033[00m Dec 15 05:09:26 localhost ovn_metadata_agent[160585]: 2025-12-15 10:09:26.519 160590 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network 5b141ff3-0264-4c09-b1ae-3e5c10ec9cfe or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599#033[00m Dec 15 05:09:26 localhost ovn_metadata_agent[160585]: 2025-12-15 10:09:26.520 160858 DEBUG oslo.privsep.daemon [-] privsep: reply[4da83bd7-d213-43db-a4ba-1110f64c7186]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Dec 15 05:09:26 localhost nova_compute[286344]: 2025-12-15 10:09:26.524 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:09:26 localhost neutron_dhcp_agent[267542]: 2025-12-15 10:09:26.597 267546 INFO neutron.agent.dhcp.agent [-] Network not present, action: clean_devices, action_kwargs: {}#033[00m Dec 15 05:09:26 localhost ovn_controller[154603]: 2025-12-15T10:09:26Z|00460|binding|INFO|Releasing lport b35254ad-12eb-47bb-92be-44fefe0694f0 from this chassis (sb_readonly=0) Dec 15 05:09:26 localhost nova_compute[286344]: 2025-12-15 10:09:26.951 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:09:27 localhost dnsmasq[332491]: exiting on receipt of SIGTERM Dec 15 05:09:27 localhost podman[332809]: 2025-12-15 10:09:27.038629092 +0000 UTC m=+0.051582584 container kill 
5bdc3fa5f47a44e97ad277b1f8f9ba7778f747a25f8c8c513d001cfa7c4b3b7f (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-5b141ff3-0264-4c09-b1ae-3e5c10ec9cfe, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202) Dec 15 05:09:27 localhost systemd[1]: libpod-5bdc3fa5f47a44e97ad277b1f8f9ba7778f747a25f8c8c513d001cfa7c4b3b7f.scope: Deactivated successfully. Dec 15 05:09:27 localhost podman[332829]: 2025-12-15 10:09:27.12206503 +0000 UTC m=+0.055257903 container died 5bdc3fa5f47a44e97ad277b1f8f9ba7778f747a25f8c8c513d001cfa7c4b3b7f (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-5b141ff3-0264-4c09-b1ae-3e5c10ec9cfe, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_managed=true, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202) Dec 15 05:09:27 localhost podman[332829]: 2025-12-15 10:09:27.217533536 +0000 UTC m=+0.150726359 container remove 5bdc3fa5f47a44e97ad277b1f8f9ba7778f747a25f8c8c513d001cfa7c4b3b7f (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-5b141ff3-0264-4c09-b1ae-3e5c10ec9cfe, io.buildah.version=1.41.3, tcib_managed=true, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, 
org.label-schema.license=GPLv2) Dec 15 05:09:27 localhost systemd[1]: var-lib-containers-storage-overlay-eb988999a3df65dfebb6456e42cd97db8e19e2efe306ed75020f24176521fa67-merged.mount: Deactivated successfully. Dec 15 05:09:27 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-5bdc3fa5f47a44e97ad277b1f8f9ba7778f747a25f8c8c513d001cfa7c4b3b7f-userdata-shm.mount: Deactivated successfully. Dec 15 05:09:27 localhost systemd[1]: var-lib-containers-storage-overlay-932bdab5716148067981ee9890707ea2d93565e780ebb108c1d094089e393bba-merged.mount: Deactivated successfully. Dec 15 05:09:27 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-dc7156822d75b972805ee20242909cd562d0e9043045aece8339eaff987a154b-userdata-shm.mount: Deactivated successfully. Dec 15 05:09:27 localhost systemd[1]: run-netns-qdhcp\x2d3c24c7aa\x2d9727\x2d4a7c\x2da93f\x2d1107646953b1.mount: Deactivated successfully. Dec 15 05:09:27 localhost systemd[1]: libpod-conmon-5bdc3fa5f47a44e97ad277b1f8f9ba7778f747a25f8c8c513d001cfa7c4b3b7f.scope: Deactivated successfully. Dec 15 05:09:27 localhost systemd[1]: run-netns-qdhcp\x2d5b141ff3\x2d0264\x2d4c09\x2db1ae\x2d3e5c10ec9cfe.mount: Deactivated successfully. 
Dec 15 05:09:27 localhost neutron_dhcp_agent[267542]: 2025-12-15 10:09:27.268 267546 INFO neutron.agent.dhcp.agent [None req-ec3441c9-7e39-4c27-94f7-40ddf37697df - - - - - -] Network not present, action: clean_devices, action_kwargs: {}#033[00m Dec 15 05:09:27 localhost neutron_dhcp_agent[267542]: 2025-12-15 10:09:27.308 267546 INFO neutron.agent.dhcp.agent [-] Network not present, action: clean_devices, action_kwargs: {}#033[00m Dec 15 05:09:27 localhost neutron_dhcp_agent[267542]: 2025-12-15 10:09:27.883 267546 INFO neutron.agent.linux.ip_lib [None req-e41f702b-d56f-4db4-8972-1bb9233f19f3 - - - - - -] Device tap10979e70-1b cannot be used as it has no MAC address#033[00m Dec 15 05:09:27 localhost nova_compute[286344]: 2025-12-15 10:09:27.907 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:09:27 localhost kernel: device tap10979e70-1b entered promiscuous mode Dec 15 05:09:27 localhost NetworkManager[5963]: [1765793367.9159] manager: (tap10979e70-1b): new Generic device (/org/freedesktop/NetworkManager/Devices/72) Dec 15 05:09:27 localhost ovn_controller[154603]: 2025-12-15T10:09:27Z|00461|binding|INFO|Claiming lport 10979e70-1b81-427f-b3c0-41c05affdb66 for this chassis. Dec 15 05:09:27 localhost ovn_controller[154603]: 2025-12-15T10:09:27Z|00462|binding|INFO|10979e70-1b81-427f-b3c0-41c05affdb66: Claiming unknown Dec 15 05:09:27 localhost nova_compute[286344]: 2025-12-15 10:09:27.917 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:09:27 localhost systemd-udevd[332858]: Network interface NamePolicy= disabled on kernel command line. 
Dec 15 05:09:27 localhost ovn_metadata_agent[160585]: 2025-12-15 10:09:27.928 160590 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005559462.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 'dhcpce287b4b-a043-5ed9-8e08-4fd555d768a2-91df3eeb-df70-4c24-88ec-6d2e82e64e9a', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-91df3eeb-df70-4c24-88ec-6d2e82e64e9a', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '736a858f8d6a4ae6a58d901e2d9d6e8d', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=fdeb10fb-bebf-4bd3-ad29-00f7a4b43609, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=10979e70-1b81-427f-b3c0-41c05affdb66) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Dec 15 05:09:27 localhost ovn_metadata_agent[160585]: 2025-12-15 10:09:27.931 160590 INFO neutron.agent.ovn.metadata.agent [-] Port 10979e70-1b81-427f-b3c0-41c05affdb66 in datapath 91df3eeb-df70-4c24-88ec-6d2e82e64e9a bound to our chassis#033[00m Dec 15 05:09:27 localhost ovn_metadata_agent[160585]: 2025-12-15 10:09:27.935 160590 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network 91df3eeb-df70-4c24-88ec-6d2e82e64e9a or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params 
/usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599#033[00m Dec 15 05:09:27 localhost ovn_metadata_agent[160585]: 2025-12-15 10:09:27.936 160858 DEBUG oslo.privsep.daemon [-] privsep: reply[6f9b676e-0485-4cb5-b27e-2f5d980a523c]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Dec 15 05:09:27 localhost journal[231322]: ethtool ioctl error on tap10979e70-1b: No such device Dec 15 05:09:27 localhost ovn_controller[154603]: 2025-12-15T10:09:27Z|00463|binding|INFO|Setting lport 10979e70-1b81-427f-b3c0-41c05affdb66 ovn-installed in OVS Dec 15 05:09:27 localhost ovn_controller[154603]: 2025-12-15T10:09:27Z|00464|binding|INFO|Setting lport 10979e70-1b81-427f-b3c0-41c05affdb66 up in Southbound Dec 15 05:09:27 localhost nova_compute[286344]: 2025-12-15 10:09:27.953 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:09:27 localhost journal[231322]: ethtool ioctl error on tap10979e70-1b: No such device Dec 15 05:09:27 localhost journal[231322]: ethtool ioctl error on tap10979e70-1b: No such device Dec 15 05:09:27 localhost journal[231322]: ethtool ioctl error on tap10979e70-1b: No such device Dec 15 05:09:27 localhost journal[231322]: ethtool ioctl error on tap10979e70-1b: No such device Dec 15 05:09:27 localhost journal[231322]: ethtool ioctl error on tap10979e70-1b: No such device Dec 15 05:09:27 localhost journal[231322]: ethtool ioctl error on tap10979e70-1b: No such device Dec 15 05:09:27 localhost journal[231322]: ethtool ioctl error on tap10979e70-1b: No such device Dec 15 05:09:27 localhost nova_compute[286344]: 2025-12-15 10:09:27.990 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:09:28 localhost nova_compute[286344]: 2025-12-15 10:09:28.017 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 
__log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:09:28 localhost nova_compute[286344]: 2025-12-15 10:09:28.877 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:09:28 localhost podman[332927]: Dec 15 05:09:28 localhost podman[332927]: 2025-12-15 10:09:28.972373791 +0000 UTC m=+0.096440444 container create 3ce513ec863173da1d0dc5fb86570d52d02616d4e1c034bfd6fe5ee1770bd8f4 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-91df3eeb-df70-4c24-88ec-6d2e82e64e9a, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251202, io.buildah.version=1.41.3) Dec 15 05:09:29 localhost systemd[1]: Started /usr/bin/podman healthcheck run a4e94bd7aaee6c174c601060fb5f34559019ccea93fc397263ee3735731cfc1e. Dec 15 05:09:29 localhost systemd[1]: Started libpod-conmon-3ce513ec863173da1d0dc5fb86570d52d02616d4e1c034bfd6fe5ee1770bd8f4.scope. Dec 15 05:09:29 localhost podman[332927]: 2025-12-15 10:09:28.922415423 +0000 UTC m=+0.046482126 image pull quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified Dec 15 05:09:29 localhost systemd[1]: tmp-crun.pQa3Kb.mount: Deactivated successfully. Dec 15 05:09:29 localhost systemd[1]: Started libcrun container. 
Dec 15 05:09:29 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/db4d82f8e576a716edaf03d3a2ef2b18357635d8498f4316d21c995619652ef4/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff) Dec 15 05:09:29 localhost podman[332942]: 2025-12-15 10:09:29.097973057 +0000 UTC m=+0.081256151 container health_status a4e94bd7aaee6c174c601060fb5f34559019ccea93fc397263ee3735731cfc1e (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible) Dec 15 05:09:29 localhost podman[332942]: 2025-12-15 10:09:29.106124628 +0000 UTC m=+0.089407722 container exec_died a4e94bd7aaee6c174c601060fb5f34559019ccea93fc397263ee3735731cfc1e (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck 
podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}) Dec 15 05:09:29 localhost podman[332927]: 2025-12-15 10:09:29.119523682 +0000 UTC m=+0.243590335 container init 3ce513ec863173da1d0dc5fb86570d52d02616d4e1c034bfd6fe5ee1770bd8f4 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-91df3eeb-df70-4c24-88ec-6d2e82e64e9a, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251202) Dec 15 05:09:29 localhost systemd[1]: a4e94bd7aaee6c174c601060fb5f34559019ccea93fc397263ee3735731cfc1e.service: Deactivated successfully. 
Dec 15 05:09:29 localhost podman[332927]: 2025-12-15 10:09:29.128302831 +0000 UTC m=+0.252369484 container start 3ce513ec863173da1d0dc5fb86570d52d02616d4e1c034bfd6fe5ee1770bd8f4 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-91df3eeb-df70-4c24-88ec-6d2e82e64e9a, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb) Dec 15 05:09:29 localhost dnsmasq[332970]: started, version 2.85 cachesize 150 Dec 15 05:09:29 localhost dnsmasq[332970]: DNS service limited to local subnets Dec 15 05:09:29 localhost dnsmasq[332970]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile Dec 15 05:09:29 localhost dnsmasq[332970]: warning: no upstream servers configured Dec 15 05:09:29 localhost dnsmasq-dhcp[332970]: DHCP, static leases only on 10.100.0.0, lease time 1d Dec 15 05:09:29 localhost dnsmasq[332970]: read /var/lib/neutron/dhcp/91df3eeb-df70-4c24-88ec-6d2e82e64e9a/addn_hosts - 0 addresses Dec 15 05:09:29 localhost dnsmasq-dhcp[332970]: read /var/lib/neutron/dhcp/91df3eeb-df70-4c24-88ec-6d2e82e64e9a/host Dec 15 05:09:29 localhost dnsmasq-dhcp[332970]: read /var/lib/neutron/dhcp/91df3eeb-df70-4c24-88ec-6d2e82e64e9a/opts Dec 15 05:09:29 localhost neutron_dhcp_agent[267542]: 2025-12-15 10:09:29.331 267546 INFO neutron.agent.dhcp.agent [None req-c860b17b-cd36-4f3a-89fc-836a95d7c336 - - - - - -] DHCP configuration for ports {'e76bbfad-db6c-4f87-85f4-60d80c08821a'} is completed#033[00m Dec 15 05:09:29 localhost neutron_sriov_agent[260044]: 2025-12-15 10:09:29.844 2 INFO neutron.agent.securitygroups_rpc [None 
req-569961f4-7a10-4311-a49a-ab19e365f0a0 af329bd85e524c1ea41f4255aa5eb9d8 1d15e551a54e4d0ba499cdbee6530731 - - default default] Security group member updated ['301e6c89-75e9-48b2-aea3-93ff00995261']#033[00m Dec 15 05:09:30 localhost ceph-mon[298913]: mon.np0005559462@0(leader) e17 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) Dec 15 05:09:30 localhost ceph-mon[298913]: log_channel(audit) log [DBG] : from='client.25625 172.18.0.34:0/382777224' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch Dec 15 05:09:31 localhost ceph-mon[298913]: mon.np0005559462@0(leader).osd e213 do_prune osdmap full prune enabled Dec 15 05:09:31 localhost ceph-mon[298913]: mon.np0005559462@0(leader).osd e214 e214: 6 total, 6 up, 6 in Dec 15 05:09:31 localhost ceph-mon[298913]: log_channel(cluster) log [DBG] : osdmap e214: 6 total, 6 up, 6 in Dec 15 05:09:31 localhost nova_compute[286344]: 2025-12-15 10:09:31.178 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:09:31 localhost ceph-mon[298913]: mon.np0005559462@0(leader).osd e214 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104 Dec 15 05:09:31 localhost ceph-mon[298913]: mon.np0005559462@0(leader).osd e214 do_prune osdmap full prune enabled Dec 15 05:09:31 localhost ceph-mon[298913]: mon.np0005559462@0(leader).osd e215 e215: 6 total, 6 up, 6 in Dec 15 05:09:31 localhost ceph-mon[298913]: log_channel(cluster) log [DBG] : osdmap e215: 6 total, 6 up, 6 in Dec 15 05:09:31 localhost nova_compute[286344]: 2025-12-15 10:09:31.832 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:09:31 localhost podman[243449]: time="2025-12-15T10:09:31Z" level=info msg="List containers: received `last` parameter - overwriting `limit`" Dec 15 05:09:31 
localhost podman[243449]: @ - - [15/Dec/2025:10:09:31 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 158464 "" "Go-http-client/1.1" Dec 15 05:09:31 localhost podman[243449]: @ - - [15/Dec/2025:10:09:31 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 19726 "" "Go-http-client/1.1" Dec 15 05:09:32 localhost neutron_sriov_agent[260044]: 2025-12-15 10:09:32.466 2 INFO neutron.agent.securitygroups_rpc [None req-7d0715bf-7ab1-4a62-8bea-6d63945cf746 af329bd85e524c1ea41f4255aa5eb9d8 1d15e551a54e4d0ba499cdbee6530731 - - default default] Security group member updated ['c040b36e-2f52-48d9-af71-187f0aec7a56', '301e6c89-75e9-48b2-aea3-93ff00995261']#033[00m Dec 15 05:09:32 localhost neutron_dhcp_agent[267542]: 2025-12-15 10:09:32.955 267546 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-12-15T10:09:32Z, description=, device_id=dc6c7afa-c39e-452f-9f01-245c39cdedc9, device_owner=network:router_interface, dns_assignment=[], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[], id=5e5eb46f-4b10-4448-92e0-81ec771eeb16, ip_allocation=immediate, mac_address=fa:16:3e:f6:3f:95, name=, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-12-15T10:09:26Z, description=, dns_domain=, id=91df3eeb-df70-4c24-88ec-6d2e82e64e9a, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-TelemetryAlarmingNegativeTest-2001012576-network, port_security_enabled=True, project_id=736a858f8d6a4ae6a58d901e2d9d6e8d, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=30242, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=3108, 
status=ACTIVE, subnets=['cd058419-3055-4872-8058-fa842ea536cb'], tags=[], tenant_id=736a858f8d6a4ae6a58d901e2d9d6e8d, updated_at=2025-12-15T10:09:26Z, vlan_transparent=None, network_id=91df3eeb-df70-4c24-88ec-6d2e82e64e9a, port_security_enabled=False, project_id=736a858f8d6a4ae6a58d901e2d9d6e8d, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=[], standard_attr_id=3141, status=DOWN, tags=[], tenant_id=736a858f8d6a4ae6a58d901e2d9d6e8d, updated_at=2025-12-15T10:09:32Z on network 91df3eeb-df70-4c24-88ec-6d2e82e64e9a#033[00m Dec 15 05:09:33 localhost systemd[1]: tmp-crun.DjcJTJ.mount: Deactivated successfully. Dec 15 05:09:33 localhost dnsmasq[332970]: read /var/lib/neutron/dhcp/91df3eeb-df70-4c24-88ec-6d2e82e64e9a/addn_hosts - 1 addresses Dec 15 05:09:33 localhost dnsmasq-dhcp[332970]: read /var/lib/neutron/dhcp/91df3eeb-df70-4c24-88ec-6d2e82e64e9a/host Dec 15 05:09:33 localhost podman[332987]: 2025-12-15 10:09:33.196709963 +0000 UTC m=+0.075635768 container kill 3ce513ec863173da1d0dc5fb86570d52d02616d4e1c034bfd6fe5ee1770bd8f4 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-91df3eeb-df70-4c24-88ec-6d2e82e64e9a, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb) Dec 15 05:09:33 localhost dnsmasq-dhcp[332970]: read /var/lib/neutron/dhcp/91df3eeb-df70-4c24-88ec-6d2e82e64e9a/opts Dec 15 05:09:33 localhost neutron_sriov_agent[260044]: 2025-12-15 10:09:33.253 2 INFO neutron.agent.securitygroups_rpc [None req-644d77e0-1896-4f8f-a473-026d28354635 af329bd85e524c1ea41f4255aa5eb9d8 1d15e551a54e4d0ba499cdbee6530731 - - default default] Security group member updated 
['c040b36e-2f52-48d9-af71-187f0aec7a56']#033[00m Dec 15 05:09:33 localhost neutron_dhcp_agent[267542]: 2025-12-15 10:09:33.500 267546 INFO neutron.agent.dhcp.agent [None req-7225fd13-0007-46be-939d-ff231654d145 - - - - - -] DHCP configuration for ports {'5e5eb46f-4b10-4448-92e0-81ec771eeb16'} is completed#033[00m Dec 15 05:09:33 localhost ceph-mon[298913]: mon.np0005559462@0(leader) e17 handle_command mon_command({"prefix":"df", "format":"json"} v 0) Dec 15 05:09:33 localhost ceph-mon[298913]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/987932222' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch Dec 15 05:09:33 localhost ceph-mon[298913]: mon.np0005559462@0(leader) e17 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0) Dec 15 05:09:33 localhost ceph-mon[298913]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/987932222' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch Dec 15 05:09:33 localhost nova_compute[286344]: 2025-12-15 10:09:33.880 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:09:34 localhost neutron_dhcp_agent[267542]: 2025-12-15 10:09:34.241 267546 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-12-15T10:09:32Z, description=, device_id=dc6c7afa-c39e-452f-9f01-245c39cdedc9, device_owner=network:router_interface, dns_assignment=[], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[], id=5e5eb46f-4b10-4448-92e0-81ec771eeb16, ip_allocation=immediate, mac_address=fa:16:3e:f6:3f:95, name=, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-12-15T10:09:26Z, 
description=, dns_domain=, id=91df3eeb-df70-4c24-88ec-6d2e82e64e9a, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-TelemetryAlarmingNegativeTest-2001012576-network, port_security_enabled=True, project_id=736a858f8d6a4ae6a58d901e2d9d6e8d, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=30242, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=3108, status=ACTIVE, subnets=['cd058419-3055-4872-8058-fa842ea536cb'], tags=[], tenant_id=736a858f8d6a4ae6a58d901e2d9d6e8d, updated_at=2025-12-15T10:09:26Z, vlan_transparent=None, network_id=91df3eeb-df70-4c24-88ec-6d2e82e64e9a, port_security_enabled=False, project_id=736a858f8d6a4ae6a58d901e2d9d6e8d, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=[], standard_attr_id=3141, status=DOWN, tags=[], tenant_id=736a858f8d6a4ae6a58d901e2d9d6e8d, updated_at=2025-12-15T10:09:32Z on network 91df3eeb-df70-4c24-88ec-6d2e82e64e9a#033[00m Dec 15 05:09:34 localhost podman[333022]: 2025-12-15 10:09:34.44591215 +0000 UTC m=+0.053769253 container kill 3ce513ec863173da1d0dc5fb86570d52d02616d4e1c034bfd6fe5ee1770bd8f4 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-91df3eeb-df70-4c24-88ec-6d2e82e64e9a, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team) Dec 15 05:09:34 localhost dnsmasq[332970]: read /var/lib/neutron/dhcp/91df3eeb-df70-4c24-88ec-6d2e82e64e9a/addn_hosts - 1 addresses Dec 15 05:09:34 localhost dnsmasq-dhcp[332970]: read /var/lib/neutron/dhcp/91df3eeb-df70-4c24-88ec-6d2e82e64e9a/host Dec 15 05:09:34 localhost 
dnsmasq-dhcp[332970]: read /var/lib/neutron/dhcp/91df3eeb-df70-4c24-88ec-6d2e82e64e9a/opts Dec 15 05:09:34 localhost neutron_dhcp_agent[267542]: 2025-12-15 10:09:34.648 267546 INFO neutron.agent.dhcp.agent [None req-398c868a-e79c-46fa-95d3-d757c20f09c3 - - - - - -] DHCP configuration for ports {'5e5eb46f-4b10-4448-92e0-81ec771eeb16'} is completed#033[00m Dec 15 05:09:34 localhost openstack_network_exporter[246484]: ERROR 10:09:34 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd Dec 15 05:09:34 localhost openstack_network_exporter[246484]: ERROR 10:09:34 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server Dec 15 05:09:34 localhost openstack_network_exporter[246484]: ERROR 10:09:34 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd Dec 15 05:09:34 localhost openstack_network_exporter[246484]: ERROR 10:09:34 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath Dec 15 05:09:34 localhost openstack_network_exporter[246484]: Dec 15 05:09:34 localhost openstack_network_exporter[246484]: ERROR 10:09:34 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath Dec 15 05:09:34 localhost openstack_network_exporter[246484]: Dec 15 05:09:35 localhost ceph-mon[298913]: mon.np0005559462@0(leader) e17 handle_command mon_command({"prefix":"df", "format":"json"} v 0) Dec 15 05:09:35 localhost ceph-mon[298913]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/3105025530' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch Dec 15 05:09:35 localhost ceph-mon[298913]: mon.np0005559462@0(leader) e17 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0) Dec 15 05:09:35 localhost ceph-mon[298913]: log_channel(audit) log [DBG] : from='client.? 
172.18.0.32:0/3105025530' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch Dec 15 05:09:36 localhost nova_compute[286344]: 2025-12-15 10:09:36.217 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:09:36 localhost ceph-mon[298913]: mon.np0005559462@0(leader).osd e215 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104 Dec 15 05:09:36 localhost ceph-mon[298913]: mon.np0005559462@0(leader).osd e215 do_prune osdmap full prune enabled Dec 15 05:09:36 localhost ceph-mon[298913]: mon.np0005559462@0(leader).osd e216 e216: 6 total, 6 up, 6 in Dec 15 05:09:36 localhost ceph-mon[298913]: log_channel(cluster) log [DBG] : osdmap e216: 6 total, 6 up, 6 in Dec 15 05:09:36 localhost neutron_sriov_agent[260044]: 2025-12-15 10:09:36.664 2 INFO neutron.agent.securitygroups_rpc [None req-70a49ec1-625f-4bb1-82c3-5d7b9a8c060d af329bd85e524c1ea41f4255aa5eb9d8 1d15e551a54e4d0ba499cdbee6530731 - - default default] Security group member updated ['915bc5fa-725a-4c44-a445-c3659b1e3b59']#033[00m Dec 15 05:09:37 localhost sshd[333043]: main: sshd: ssh-rsa algorithm is disabled Dec 15 05:09:38 localhost ceph-mon[298913]: mon.np0005559462@0(leader) e17 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) Dec 15 05:09:38 localhost ceph-mon[298913]: log_channel(audit) log [DBG] : from='client.25625 172.18.0.34:0/382777224' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch Dec 15 05:09:38 localhost neutron_sriov_agent[260044]: 2025-12-15 10:09:38.709 2 INFO neutron.agent.securitygroups_rpc [None req-2c036ea3-888e-4e64-b308-ff07692c8ba7 af329bd85e524c1ea41f4255aa5eb9d8 1d15e551a54e4d0ba499cdbee6530731 - - default default] Security group member updated ['d89de261-5db1-4fe4-85ca-28721032b15b', '915bc5fa-725a-4c44-a445-c3659b1e3b59', 
'ff3f94de-de1c-4455-905e-fd9860dc0b85']#033[00m Dec 15 05:09:38 localhost nova_compute[286344]: 2025-12-15 10:09:38.910 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:09:39 localhost neutron_sriov_agent[260044]: 2025-12-15 10:09:39.159 2 INFO neutron.agent.securitygroups_rpc [None req-f97e54e3-8784-4199-91fd-1941b3aa447c af329bd85e524c1ea41f4255aa5eb9d8 1d15e551a54e4d0ba499cdbee6530731 - - default default] Security group member updated ['d89de261-5db1-4fe4-85ca-28721032b15b', 'ff3f94de-de1c-4455-905e-fd9860dc0b85']#033[00m Dec 15 05:09:39 localhost ovn_controller[154603]: 2025-12-15T10:09:39Z|00465|binding|INFO|Releasing lport b35254ad-12eb-47bb-92be-44fefe0694f0 from this chassis (sb_readonly=0) Dec 15 05:09:39 localhost nova_compute[286344]: 2025-12-15 10:09:39.402 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:09:39 localhost ceph-mon[298913]: mon.np0005559462@0(leader).osd e216 do_prune osdmap full prune enabled Dec 15 05:09:39 localhost ceph-mon[298913]: mon.np0005559462@0(leader).osd e217 e217: 6 total, 6 up, 6 in Dec 15 05:09:39 localhost ceph-mon[298913]: log_channel(cluster) log [DBG] : osdmap e217: 6 total, 6 up, 6 in Dec 15 05:09:41 localhost nova_compute[286344]: 2025-12-15 10:09:41.220 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:09:41 localhost ceph-mon[298913]: mon.np0005559462@0(leader) e17 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) Dec 15 05:09:41 localhost ceph-mon[298913]: log_channel(audit) log [DBG] : from='client.25625 172.18.0.34:0/382777224' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch Dec 15 05:09:41 localhost ceph-mon[298913]: mon.np0005559462@0(leader).osd e217 
_set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104 Dec 15 05:09:41 localhost ceph-mon[298913]: mon.np0005559462@0(leader).osd e217 do_prune osdmap full prune enabled Dec 15 05:09:41 localhost ceph-mon[298913]: mon.np0005559462@0(leader).osd e218 e218: 6 total, 6 up, 6 in Dec 15 05:09:41 localhost ceph-mon[298913]: log_channel(cluster) log [DBG] : osdmap e218: 6 total, 6 up, 6 in Dec 15 05:09:42 localhost neutron_sriov_agent[260044]: 2025-12-15 10:09:42.150 2 INFO neutron.agent.securitygroups_rpc [None req-1dc6b30b-443e-4b60-984c-be6111aedcc0 af329bd85e524c1ea41f4255aa5eb9d8 1d15e551a54e4d0ba499cdbee6530731 - - default default] Security group member updated ['5c37dc84-f80b-4e12-8d9d-49a8b22167f0']#033[00m Dec 15 05:09:43 localhost nova_compute[286344]: 2025-12-15 10:09:43.913 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:09:45 localhost dnsmasq[332970]: read /var/lib/neutron/dhcp/91df3eeb-df70-4c24-88ec-6d2e82e64e9a/addn_hosts - 0 addresses Dec 15 05:09:45 localhost dnsmasq-dhcp[332970]: read /var/lib/neutron/dhcp/91df3eeb-df70-4c24-88ec-6d2e82e64e9a/host Dec 15 05:09:45 localhost dnsmasq-dhcp[332970]: read /var/lib/neutron/dhcp/91df3eeb-df70-4c24-88ec-6d2e82e64e9a/opts Dec 15 05:09:45 localhost podman[333060]: 2025-12-15 10:09:45.403290279 +0000 UTC m=+0.061041271 container kill 3ce513ec863173da1d0dc5fb86570d52d02616d4e1c034bfd6fe5ee1770bd8f4 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-91df3eeb-df70-4c24-88ec-6d2e82e64e9a, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, 
org.label-schema.license=GPLv2) Dec 15 05:09:45 localhost kernel: device tap10979e70-1b left promiscuous mode Dec 15 05:09:45 localhost ovn_controller[154603]: 2025-12-15T10:09:45Z|00466|binding|INFO|Releasing lport 10979e70-1b81-427f-b3c0-41c05affdb66 from this chassis (sb_readonly=0) Dec 15 05:09:45 localhost ovn_controller[154603]: 2025-12-15T10:09:45Z|00467|binding|INFO|Setting lport 10979e70-1b81-427f-b3c0-41c05affdb66 down in Southbound Dec 15 05:09:45 localhost nova_compute[286344]: 2025-12-15 10:09:45.655 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:09:45 localhost ovn_metadata_agent[160585]: 2025-12-15 10:09:45.663 160590 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005559462.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 'dhcpce287b4b-a043-5ed9-8e08-4fd555d768a2-91df3eeb-df70-4c24-88ec-6d2e82e64e9a', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-91df3eeb-df70-4c24-88ec-6d2e82e64e9a', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '736a858f8d6a4ae6a58d901e2d9d6e8d', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'np0005559462.localdomain'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=fdeb10fb-bebf-4bd3-ad29-00f7a4b43609, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], 
logical_port=10979e70-1b81-427f-b3c0-41c05affdb66) old=Port_Binding(up=[True], chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Dec 15 05:09:45 localhost ovn_metadata_agent[160585]: 2025-12-15 10:09:45.665 160590 INFO neutron.agent.ovn.metadata.agent [-] Port 10979e70-1b81-427f-b3c0-41c05affdb66 in datapath 91df3eeb-df70-4c24-88ec-6d2e82e64e9a unbound from our chassis#033[00m Dec 15 05:09:45 localhost ovn_metadata_agent[160585]: 2025-12-15 10:09:45.667 160590 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 91df3eeb-df70-4c24-88ec-6d2e82e64e9a, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m Dec 15 05:09:45 localhost ovn_metadata_agent[160585]: 2025-12-15 10:09:45.668 160858 DEBUG oslo.privsep.daemon [-] privsep: reply[d12b2e15-965e-4cd3-8a8f-0b0c0c347820]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Dec 15 05:09:45 localhost nova_compute[286344]: 2025-12-15 10:09:45.671 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:09:46 localhost nova_compute[286344]: 2025-12-15 10:09:46.221 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:09:46 localhost ceph-mon[298913]: mon.np0005559462@0(leader).osd e218 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104 Dec 15 05:09:46 localhost ceph-mon[298913]: mon.np0005559462@0(leader).osd e218 do_prune osdmap full prune enabled Dec 15 05:09:46 localhost ceph-mon[298913]: mon.np0005559462@0(leader).osd e219 e219: 6 total, 6 up, 6 in Dec 15 05:09:46 localhost ceph-mon[298913]: log_channel(cluster) log [DBG] : osdmap e219: 6 total, 6 up, 6 in Dec 15 05:09:47 localhost 
ovn_controller[154603]: 2025-12-15T10:09:47Z|00468|binding|INFO|Releasing lport b35254ad-12eb-47bb-92be-44fefe0694f0 from this chassis (sb_readonly=0) Dec 15 05:09:47 localhost nova_compute[286344]: 2025-12-15 10:09:47.194 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:09:47 localhost systemd[1]: Started /usr/bin/podman healthcheck run 67ee72fcd99c896f79e8807764b1d0f04e7d45927d06e306c0986394e8ed42a0. Dec 15 05:09:47 localhost systemd[1]: Started /usr/bin/podman healthcheck run 730c50020a7d9e2da21d2462d40af20ac303e43f3491f692cbbd87f432ba3e09. Dec 15 05:09:47 localhost systemd[1]: Started /usr/bin/podman healthcheck run 9f0f481c937606298203797a71924b7614a5d9e3f4a8afd837cfa5b9602aedeb. Dec 15 05:09:47 localhost systemd[1]: Started /usr/bin/podman healthcheck run ac2eae30a2a7f971398af42ea67b3ed799d4b3341f7688ebcd73f7ca7f66c2a8. Dec 15 05:09:47 localhost systemd[1]: Started /usr/bin/podman healthcheck run b3d8483ade539500e228c2af9c0c3e647febb5ec7903610a83aea32631007d4a. Dec 15 05:09:47 localhost systemd[1]: tmp-crun.2Z4mi5.mount: Deactivated successfully. 
Dec 15 05:09:47 localhost ovn_controller[154603]: 2025-12-15T10:09:47Z|00469|binding|INFO|Releasing lport b35254ad-12eb-47bb-92be-44fefe0694f0 from this chassis (sb_readonly=0) Dec 15 05:09:47 localhost podman[333082]: 2025-12-15 10:09:47.787139498 +0000 UTC m=+0.115837771 container health_status 67ee72fcd99c896f79e8807764b1d0f04e7d45927d06e306c0986394e8ed42a0 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible) Dec 15 05:09:47 localhost nova_compute[286344]: 2025-12-15 10:09:47.802 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:09:47 localhost podman[333082]: 2025-12-15 10:09:47.825481711 
+0000 UTC m=+0.154179974 container exec_died 67ee72fcd99c896f79e8807764b1d0f04e7d45927d06e306c0986394e8ed42a0 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}) Dec 15 05:09:47 localhost systemd[1]: 67ee72fcd99c896f79e8807764b1d0f04e7d45927d06e306c0986394e8ed42a0.service: Deactivated successfully. 
Dec 15 05:09:47 localhost podman[333084]: 2025-12-15 10:09:47.879068898 +0000 UTC m=+0.201959403 container health_status 9f0f481c937606298203797a71924b7614a5d9e3f4a8afd837cfa5b9602aedeb (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=multipathd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.build-date=20251202, config_id=multipathd, tcib_managed=true) Dec 15 05:09:47 localhost podman[333096]: 2025-12-15 10:09:47.831973947 +0000 UTC m=+0.147371569 container health_status 
b3d8483ade539500e228c2af9c0c3e647febb5ec7903610a83aea32631007d4a (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, container_name=ceilometer_agent_compute, config_id=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '07f3af15d79276418f54a8dde2fc251064527c3571430e14af77aaa2c01f1193-cb1e9478634a00c8fb5ad9cd7fcd38c7b2a1a85187dfaa618bc715c24c2b15fb'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}) Dec 15 05:09:47 localhost podman[333084]: 2025-12-15 10:09:47.904427657 +0000 UTC m=+0.227318152 
container exec_died 9f0f481c937606298203797a71924b7614a5d9e3f4a8afd837cfa5b9602aedeb (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, io.buildah.version=1.41.3, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.license=GPLv2, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, config_id=multipathd, container_name=multipathd, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb) Dec 15 05:09:47 localhost podman[333096]: 2025-12-15 10:09:47.91887338 +0000 UTC m=+0.234270952 container exec_died b3d8483ade539500e228c2af9c0c3e647febb5ec7903610a83aea32631007d4a (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, 
name=ceilometer_agent_compute, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=ceilometer_agent_compute, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '07f3af15d79276418f54a8dde2fc251064527c3571430e14af77aaa2c01f1193-cb1e9478634a00c8fb5ad9cd7fcd38c7b2a1a85187dfaa618bc715c24c2b15fb'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, config_id=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible) Dec 15 05:09:47 localhost systemd[1]: 9f0f481c937606298203797a71924b7614a5d9e3f4a8afd837cfa5b9602aedeb.service: Deactivated successfully. 
Dec 15 05:09:47 localhost systemd[1]: b3d8483ade539500e228c2af9c0c3e647febb5ec7903610a83aea32631007d4a.service: Deactivated successfully. Dec 15 05:09:47 localhost podman[333083]: 2025-12-15 10:09:47.982165071 +0000 UTC m=+0.309103336 container health_status 730c50020a7d9e2da21d2462d40af20ac303e43f3491f692cbbd87f432ba3e09 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, release=1755695350, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, architecture=x86_64, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, vcs-type=git, maintainer=Red Hat, Inc., vendor=Red Hat, Inc., config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'cb1e9478634a00c8fb5ad9cd7fcd38c7b2a1a85187dfaa618bc715c24c2b15fb'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', 
'/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, distribution-scope=public, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, build-date=2025-08-20T13:12:41, io.buildah.version=1.33.7, managed_by=edpm_ansible, version=9.6, name=ubi9-minimal, com.redhat.component=ubi9-minimal-container, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=openstack_network_exporter, config_id=openstack_network_exporter, url=https://catalog.redhat.com/en/search?searchType=containers) Dec 15 05:09:47 localhost podman[333083]: 2025-12-15 10:09:47.992560563 +0000 UTC m=+0.319498868 container exec_died 730c50020a7d9e2da21d2462d40af20ac303e43f3491f692cbbd87f432ba3e09 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, io.buildah.version=1.33.7, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'cb1e9478634a00c8fb5ad9cd7fcd38c7b2a1a85187dfaa618bc715c24c2b15fb'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': 
['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vendor=Red Hat, Inc., container_name=openstack_network_exporter, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1755695350, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, maintainer=Red Hat, Inc., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.component=ubi9-minimal-container, build-date=2025-08-20T13:12:41, io.openshift.tags=minimal rhel9, distribution-scope=public, vcs-type=git, architecture=x86_64, version=9.6, managed_by=edpm_ansible, name=ubi9-minimal, config_id=openstack_network_exporter, io.openshift.expose-services=) Dec 15 05:09:48 localhost systemd[1]: 730c50020a7d9e2da21d2462d40af20ac303e43f3491f692cbbd87f432ba3e09.service: Deactivated successfully. 
Dec 15 05:09:48 localhost podman[333085]: 2025-12-15 10:09:48.041517985 +0000 UTC m=+0.359390493 container health_status ac2eae30a2a7f971398af42ea67b3ed799d4b3341f7688ebcd73f7ca7f66c2a8 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '07f3af15d79276418f54a8dde2fc251064527c3571430e14af77aaa2c01f1193'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, container_name=ovn_controller, io.buildah.version=1.41.3, managed_by=edpm_ansible, config_id=ovn_controller) Dec 15 05:09:48 localhost podman[333085]: 2025-12-15 10:09:48.081961034 +0000 UTC m=+0.399833572 container exec_died ac2eae30a2a7f971398af42ea67b3ed799d4b3341f7688ebcd73f7ca7f66c2a8 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, config_id=ovn_controller, tcib_managed=true, 
org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '07f3af15d79276418f54a8dde2fc251064527c3571430e14af77aaa2c01f1193'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.build-date=20251202, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb) Dec 15 05:09:48 localhost systemd[1]: ac2eae30a2a7f971398af42ea67b3ed799d4b3341f7688ebcd73f7ca7f66c2a8.service: Deactivated successfully. 
Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.124 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359', 'name': 'test', 'flavor': {'id': '2da0e147-aaa7-4bb9-a176-5fe1b15a32a0', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'image': {'id': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-00000002', 'OS-EXT-SRV-ATTR:host': 'np0005559462.localdomain', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': 'c785bf23f53946bc99867d8832a50266', 'user_id': '1ba5fce347b64bfebf995f187193f205', 'hostId': 'bf4d507aa00b3fcf643a7e0b883420632ac7fc96e8c7a756982e121d', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228 Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.125 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.error in the context of pollsters Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.130 12 DEBUG ceilometer.compute.pollsters [-] 39ff1bd9-6f6b-44c8-bbec-a1fd9d196359/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.132 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': '6f6bc9ba-9d57-4b2b-b007-d989e9e168be', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '1ba5fce347b64bfebf995f187193f205', 'user_name': None, 'project_id': 'c785bf23f53946bc99867d8832a50266', 'project_name': None, 'resource_id': 'instance-00000002-39ff1bd9-6f6b-44c8-bbec-a1fd9d196359-tap03ef8889-32', 'timestamp': '2025-12-15T10:09:48.125545', 'resource_metadata': {'display_name': 'test', 'name': 'tap03ef8889-32', 'instance_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359', 'instance_type': 'm1.small', 'host': 'bf4d507aa00b3fcf643a7e0b883420632ac7fc96e8c7a756982e121d', 'instance_host': 'np0005559462.localdomain', 'flavor': {'id': '2da0e147-aaa7-4bb9-a176-5fe1b15a32a0', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e'}, 'image_ref': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:74:df:7c', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap03ef8889-32'}, 'message_id': '2ff7b608-d99e-11f0-817e-fa163ebaca0f', 'monotonic_time': 12454.318212952, 'message_signature': '91f7a05cc8b5162a160d0079ad7c3517e892345fe0613d7461606ce7fd018166'}]}, 'timestamp': '2025-12-15 10:09:48.131202', '_unique_id': '6b69ba38201440e1947acbcb65e9eea3'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.132 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 15 05:09:48 localhost 
ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.132 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.132 12 ERROR oslo_messaging.notify.messaging yield
Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.132 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.132 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.132 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.132 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.132 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.132 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.132 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.132 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.132 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.132 12 ERROR oslo_messaging.notify.messaging conn.connect()
Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.132 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.132 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.132 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.132 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.132 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.132 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.132 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.132 12 ERROR oslo_messaging.notify.messaging
Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.132 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.132 12 ERROR oslo_messaging.notify.messaging
Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.132 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.132 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.132 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.132 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.132 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.132 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.132 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.132 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.132 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.132 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.132 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.132 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.132 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.132 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.132 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.132 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.132 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.132 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.132 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.132 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.132 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.132 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.132 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.132 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.132 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.132 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.132 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.132 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.132 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.132 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.132 12 ERROR oslo_messaging.notify.messaging
Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.133 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.drop in the context of pollsters
Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.134 12 DEBUG ceilometer.compute.pollsters [-] 39ff1bd9-6f6b-44c8-bbec-a1fd9d196359/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.135 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'b9c461e3-0e2b-4bd0-b47c-d07f737395d1', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '1ba5fce347b64bfebf995f187193f205', 'user_name': None, 'project_id': 'c785bf23f53946bc99867d8832a50266', 'project_name': None, 'resource_id': 'instance-00000002-39ff1bd9-6f6b-44c8-bbec-a1fd9d196359-tap03ef8889-32', 'timestamp': '2025-12-15T10:09:48.134162', 'resource_metadata': {'display_name': 'test', 'name': 'tap03ef8889-32', 'instance_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359', 'instance_type': 'm1.small', 'host': 'bf4d507aa00b3fcf643a7e0b883420632ac7fc96e8c7a756982e121d', 'instance_host': 'np0005559462.localdomain', 'flavor': {'id': '2da0e147-aaa7-4bb9-a176-5fe1b15a32a0', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e'}, 'image_ref': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:74:df:7c', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap03ef8889-32'}, 'message_id': '2ff84032-d99e-11f0-817e-fa163ebaca0f', 'monotonic_time': 12454.318212952, 'message_signature': '8a561bf9bcb23e7a7e63ae722ff17912d0fc933787017567e6c7f3b6352a192d'}]}, 'timestamp': '2025-12-15 10:09:48.134652', '_unique_id': 'd2a2c29501904690976b8984ef63b576'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.135 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.135 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.135 12 ERROR oslo_messaging.notify.messaging yield
Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.135 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.135 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.135 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.135 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.135 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.135 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.135 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.135 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.135 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.135 12 ERROR oslo_messaging.notify.messaging conn.connect()
Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.135 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.135 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.135 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.135 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.135 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.135 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.135 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.135 12 ERROR oslo_messaging.notify.messaging
Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.135 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.135 12 ERROR oslo_messaging.notify.messaging
Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.135 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.135 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.135 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.135 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.135 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.135 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.135 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.135 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.135 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.135 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.135 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.135 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.135 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.135 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.135 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.135 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.135 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.135 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.135 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.135 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.135 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.135 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.135 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.135 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.135 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.135 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.135 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.135 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.135 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.135 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.135 12 ERROR oslo_messaging.notify.messaging
Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.136 12 INFO ceilometer.polling.manager [-] Polling pollster memory.usage in the context of pollsters
Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.154 12 DEBUG ceilometer.compute.pollsters [-] 39ff1bd9-6f6b-44c8-bbec-a1fd9d196359/memory.usage volume: 51.73828125 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.156 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'b7371676-3797-4316-b73c-ce9cd49213f9', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'memory.usage', 'counter_type': 'gauge', 'counter_unit': 'MB', 'counter_volume': 51.73828125, 'user_id': '1ba5fce347b64bfebf995f187193f205', 'user_name': None, 'project_id': 'c785bf23f53946bc99867d8832a50266', 'project_name': None, 'resource_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359', 'timestamp': '2025-12-15T10:09:48.136959', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359', 'instance_type': 'm1.small', 'host': 'bf4d507aa00b3fcf643a7e0b883420632ac7fc96e8c7a756982e121d', 'instance_host': 'np0005559462.localdomain', 'flavor': {'id': '2da0e147-aaa7-4bb9-a176-5fe1b15a32a0', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e'}, 'image_ref': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0}, 'message_id': '2ffb688e-d99e-11f0-817e-fa163ebaca0f', 'monotonic_time': 12454.347357694, 'message_signature': 'b16500e3c1598944361f828f5e12c1b30c80e8d268866527d3e715387cd2e4f4'}]}, 'timestamp': '2025-12-15 10:09:48.155331', '_unique_id': '19a96b9610f04590a27e27db3fa3bb19'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.156 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.156 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.156 12 ERROR oslo_messaging.notify.messaging yield
Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.156 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.156 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.156 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.156 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.156 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.156 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.156 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.156 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.156 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.156 12 ERROR oslo_messaging.notify.messaging conn.connect()
Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.156 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.156 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.156 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.156 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.156 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.156 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.156 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.156 12 ERROR oslo_messaging.notify.messaging
Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.156 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.156 12 ERROR oslo_messaging.notify.messaging
Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.156 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.156 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.156 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.156 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.156 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.156 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.156 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.156 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.156 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.156 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.156 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.156 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.156 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.156 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.156 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.156 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.156 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.156 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.156 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.156 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.156 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.156 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.156 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.156 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.156 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.156 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.156 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.156 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.156 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.156 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.156 12 ERROR oslo_messaging.notify.messaging
Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.158 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no new resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.158 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.delta in the context of pollsters
Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.158 12 DEBUG ceilometer.compute.pollsters [-] 39ff1bd9-6f6b-44c8-bbec-a1fd9d196359/network.incoming.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.159 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '17cd7897-e4f3-4aa4-86f8-ea8a26d63457', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '1ba5fce347b64bfebf995f187193f205', 'user_name': None, 'project_id': 'c785bf23f53946bc99867d8832a50266', 'project_name': None, 'resource_id': 'instance-00000002-39ff1bd9-6f6b-44c8-bbec-a1fd9d196359-tap03ef8889-32', 'timestamp': '2025-12-15T10:09:48.158482', 'resource_metadata': {'display_name': 'test', 'name': 'tap03ef8889-32', 'instance_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359', 'instance_type': 'm1.small', 'host': 'bf4d507aa00b3fcf643a7e0b883420632ac7fc96e8c7a756982e121d', 'instance_host': 'np0005559462.localdomain', 'flavor': {'id': '2da0e147-aaa7-4bb9-a176-5fe1b15a32a0', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e'}, 'image_ref': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:74:df:7c', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap03ef8889-32'}, 'message_id': '2ffbf65a-d99e-11f0-817e-fa163ebaca0f', 'monotonic_time': 12454.318212952, 'message_signature': '905b06811c6fe0fecf67e89af637d81eab9cead994ec8b98abffb11b171b912e'}]}, 'timestamp': '2025-12-15 10:09:48.158973', '_unique_id': '1abb5ab61a5f49de8078ba29702e9df5'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.159 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.159 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.159 12 ERROR oslo_messaging.notify.messaging yield
Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.159 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.159 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.159 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.159 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.159 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Dec 15
05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.159 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.159 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.159 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.159 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.159 12 ERROR oslo_messaging.notify.messaging conn.connect() Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.159 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.159 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.159 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.159 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.159 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.159 12 ERROR oslo_messaging.notify.messaging 
self.sock.connect(sa) Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.159 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.159 12 ERROR oslo_messaging.notify.messaging Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.159 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.159 12 ERROR oslo_messaging.notify.messaging Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.159 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.159 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.159 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.159 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.159 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.159 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.159 12 ERROR oslo_messaging.notify.messaging return 
self._send(target, ctxt, message, Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.159 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.159 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.159 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.159 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.159 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.159 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.159 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.159 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.159 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.159 12 ERROR oslo_messaging.notify.messaging return 
self.connection_cls(self.conf, self.url, purpose, retry=retry) Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.159 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.159 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.159 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.159 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.159 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.159 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.159 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.159 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.159 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.159 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 
2025-12-15 10:09:48.159 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.159 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.159 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.159 12 ERROR oslo_messaging.notify.messaging Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.161 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.error in the context of pollsters Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.161 12 DEBUG ceilometer.compute.pollsters [-] 39ff1bd9-6f6b-44c8-bbec-a1fd9d196359/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.162 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': '54c20120-799c-4d1b-8eb2-a6221252af30', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '1ba5fce347b64bfebf995f187193f205', 'user_name': None, 'project_id': 'c785bf23f53946bc99867d8832a50266', 'project_name': None, 'resource_id': 'instance-00000002-39ff1bd9-6f6b-44c8-bbec-a1fd9d196359-tap03ef8889-32', 'timestamp': '2025-12-15T10:09:48.161209', 'resource_metadata': {'display_name': 'test', 'name': 'tap03ef8889-32', 'instance_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359', 'instance_type': 'm1.small', 'host': 'bf4d507aa00b3fcf643a7e0b883420632ac7fc96e8c7a756982e121d', 'instance_host': 'np0005559462.localdomain', 'flavor': {'id': '2da0e147-aaa7-4bb9-a176-5fe1b15a32a0', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e'}, 'image_ref': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:74:df:7c', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap03ef8889-32'}, 'message_id': '2ffc67de-d99e-11f0-817e-fa163ebaca0f', 'monotonic_time': 12454.318212952, 'message_signature': '2f14595a318a0f5f69ce2fa6b0d57c9a4a246d77f27af2b8e316de9975d6fb86'}]}, 'timestamp': '2025-12-15 10:09:48.161878', '_unique_id': '5fa8ab7661984b0ba7b42cbe8ef532b3'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.162 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 15 05:09:48 localhost 
ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.162 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.162 12 ERROR oslo_messaging.notify.messaging yield Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.162 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.162 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.162 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.162 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.162 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.162 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.162 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.162 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.162 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.162 12 ERROR oslo_messaging.notify.messaging conn.connect() Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.162 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.162 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.162 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.162 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.162 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.162 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.162 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.162 12 ERROR oslo_messaging.notify.messaging Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.162 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.162 12 ERROR oslo_messaging.notify.messaging Dec 15 05:09:48 localhost 
ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.162 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.162 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.162 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.162 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.162 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.162 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.162 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.162 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.162 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.162 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection 
Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.162 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.162 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.162 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.162 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.162 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.162 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.162 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.162 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.162 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.162 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Dec 15 05:09:48 
localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.162 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.162 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.162 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.162 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.162 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.162 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.162 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.162 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.162 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.162 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.162 12 ERROR oslo_messaging.notify.messaging Dec 15 05:09:48 localhost 
ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.164 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.latency in the context of pollsters Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.190 12 DEBUG ceilometer.compute.pollsters [-] 39ff1bd9-6f6b-44c8-bbec-a1fd9d196359/disk.device.write.latency volume: 1243487016 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.191 12 DEBUG ceilometer.compute.pollsters [-] 39ff1bd9-6f6b-44c8-bbec-a1fd9d196359/disk.device.write.latency volume: 24779175 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.192 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'c2131597-4d43-40dd-9496-e94a6427d258', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 1243487016, 'user_id': '1ba5fce347b64bfebf995f187193f205', 'user_name': None, 'project_id': 'c785bf23f53946bc99867d8832a50266', 'project_name': None, 'resource_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359-vda', 'timestamp': '2025-12-15T10:09:48.164176', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359', 'instance_type': 'm1.small', 'host': 'bf4d507aa00b3fcf643a7e0b883420632ac7fc96e8c7a756982e121d', 'instance_host': 'np0005559462.localdomain', 'flavor': {'id': '2da0e147-aaa7-4bb9-a176-5fe1b15a32a0', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': 
{'id': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e'}, 'image_ref': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '3000e4c6-d99e-11f0-817e-fa163ebaca0f', 'monotonic_time': 12454.356871664, 'message_signature': '14719e0449bfbcfc7d37f62f3b75d31684dca57bfea0e11a27f41cca03f3cf2c'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 24779175, 'user_id': '1ba5fce347b64bfebf995f187193f205', 'user_name': None, 'project_id': 'c785bf23f53946bc99867d8832a50266', 'project_name': None, 'resource_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359-vdb', 'timestamp': '2025-12-15T10:09:48.164176', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359', 'instance_type': 'm1.small', 'host': 'bf4d507aa00b3fcf643a7e0b883420632ac7fc96e8c7a756982e121d', 'instance_host': 'np0005559462.localdomain', 'flavor': {'id': '2da0e147-aaa7-4bb9-a176-5fe1b15a32a0', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e'}, 'image_ref': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '3000f646-d99e-11f0-817e-fa163ebaca0f', 'monotonic_time': 12454.356871664, 'message_signature': '37cf2b42ae2c1d7d321d7882cebd37c6fb258440b4efe124d0b00047a7c31a09'}]}, 'timestamp': '2025-12-15 10:09:48.191742', '_unique_id': '13514d86c59245e7a11e5892a4b3d31d'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 
10:09:48.192 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.192 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.192 12 ERROR oslo_messaging.notify.messaging yield Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.192 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.192 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.192 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.192 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.192 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.192 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.192 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.192 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Dec 15 05:09:48 localhost 
ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.192 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.192 12 ERROR oslo_messaging.notify.messaging conn.connect() Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.192 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.192 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.192 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.192 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.192 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.192 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.192 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.192 12 ERROR oslo_messaging.notify.messaging Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.192 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 
2025-12-15 10:09:48.192 12 ERROR oslo_messaging.notify.messaging Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.192 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.192 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.192 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.192 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.192 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.192 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.192 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.192 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.192 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.192 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.192 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.192 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.192 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.192 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.192 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.192 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.192 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.192 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.192 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.192 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.192 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.192 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.192 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.192 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.192 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.192 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.192 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.192 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.192 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.192 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 15 05:09:48 localhost 
ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.192 12 ERROR oslo_messaging.notify.messaging Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.193 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets in the context of pollsters Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.194 12 DEBUG ceilometer.compute.pollsters [-] 39ff1bd9-6f6b-44c8-bbec-a1fd9d196359/network.incoming.packets volume: 60 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.195 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'faa83534-0eb5-41d2-84d9-5ebed15645ee', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 60, 'user_id': '1ba5fce347b64bfebf995f187193f205', 'user_name': None, 'project_id': 'c785bf23f53946bc99867d8832a50266', 'project_name': None, 'resource_id': 'instance-00000002-39ff1bd9-6f6b-44c8-bbec-a1fd9d196359-tap03ef8889-32', 'timestamp': '2025-12-15T10:09:48.194079', 'resource_metadata': {'display_name': 'test', 'name': 'tap03ef8889-32', 'instance_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359', 'instance_type': 'm1.small', 'host': 'bf4d507aa00b3fcf643a7e0b883420632ac7fc96e8c7a756982e121d', 'instance_host': 'np0005559462.localdomain', 'flavor': {'id': '2da0e147-aaa7-4bb9-a176-5fe1b15a32a0', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e'}, 'image_ref': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 
'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:74:df:7c', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap03ef8889-32'}, 'message_id': '300164b4-d99e-11f0-817e-fa163ebaca0f', 'monotonic_time': 12454.318212952, 'message_signature': 'cec132098b64285ace8225c23211ddf1b4418fc7ac489cd7b885b357446a4508'}]}, 'timestamp': '2025-12-15 10:09:48.194561', '_unique_id': '43f06637ef7e4e4c888ae7a5b3382f2d'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.195 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.195 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.195 12 ERROR oslo_messaging.notify.messaging yield Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.195 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.195 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.195 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.195 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.195 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", 
line 877, in _connection_factory Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.195 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.195 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.195 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.195 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.195 12 ERROR oslo_messaging.notify.messaging conn.connect() Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.195 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.195 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.195 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.195 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.195 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.195 12 
ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.195 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.195 12 ERROR oslo_messaging.notify.messaging Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.195 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.195 12 ERROR oslo_messaging.notify.messaging Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.195 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.195 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.195 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.195 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.195 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.195 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.195 12 ERROR 
oslo_messaging.notify.messaging return self._send(target, ctxt, message, Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.195 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.195 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.195 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.195 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.195 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.195 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.195 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.195 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.195 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.195 12 ERROR 
oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.195 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.195 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.195 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.195 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.195 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.195 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.195 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.195 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.195 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.195 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Dec 15 05:09:48 
localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.195 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.195 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.195 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.195 12 ERROR oslo_messaging.notify.messaging Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.196 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.bytes in the context of pollsters Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.196 12 DEBUG ceilometer.compute.pollsters [-] 39ff1bd9-6f6b-44c8-bbec-a1fd9d196359/disk.device.write.bytes volume: 389120 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.197 12 DEBUG ceilometer.compute.pollsters [-] 39ff1bd9-6f6b-44c8-bbec-a1fd9d196359/disk.device.write.bytes volume: 512 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.199 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': 'd3508c44-92a4-4ad6-9d86-0de26f07fea1', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 389120, 'user_id': '1ba5fce347b64bfebf995f187193f205', 'user_name': None, 'project_id': 'c785bf23f53946bc99867d8832a50266', 'project_name': None, 'resource_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359-vda', 'timestamp': '2025-12-15T10:09:48.196822', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359', 'instance_type': 'm1.small', 'host': 'bf4d507aa00b3fcf643a7e0b883420632ac7fc96e8c7a756982e121d', 'instance_host': 'np0005559462.localdomain', 'flavor': {'id': '2da0e147-aaa7-4bb9-a176-5fe1b15a32a0', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e'}, 'image_ref': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '3001d516-d99e-11f0-817e-fa163ebaca0f', 'monotonic_time': 12454.356871664, 'message_signature': '682373af7c12507c555c3a0a232c7921d048077fc3d53ce8d3379ee7478b82dd'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 512, 'user_id': '1ba5fce347b64bfebf995f187193f205', 'user_name': None, 'project_id': 'c785bf23f53946bc99867d8832a50266', 'project_name': None, 'resource_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359-vdb', 'timestamp': '2025-12-15T10:09:48.196822', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 
'39ff1bd9-6f6b-44c8-bbec-a1fd9d196359', 'instance_type': 'm1.small', 'host': 'bf4d507aa00b3fcf643a7e0b883420632ac7fc96e8c7a756982e121d', 'instance_host': 'np0005559462.localdomain', 'flavor': {'id': '2da0e147-aaa7-4bb9-a176-5fe1b15a32a0', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e'}, 'image_ref': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '3001ec9a-d99e-11f0-817e-fa163ebaca0f', 'monotonic_time': 12454.356871664, 'message_signature': 'fe53f7c1244d289279c541222ae9df7581a7eb637fa0102d250d7c43e2a6ea98'}]}, 'timestamp': '2025-12-15 10:09:48.198140', '_unique_id': 'b06061f427d0444490b54ca87ce4f875'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.199 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.199 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.199 12 ERROR oslo_messaging.notify.messaging yield Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.199 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.199 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.199 12 ERROR 
oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.199 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.199 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.199 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.199 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.199 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.199 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.199 12 ERROR oslo_messaging.notify.messaging conn.connect() Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.199 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.199 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.199 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Dec 15 
05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.199 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.199 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.199 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.199 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.199 12 ERROR oslo_messaging.notify.messaging Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.199 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.199 12 ERROR oslo_messaging.notify.messaging Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.199 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.199 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.199 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.199 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Dec 15 05:09:48 localhost 
ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.199 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.199 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.199 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.199 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.199 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.199 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.199 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.199 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.199 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.199 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, 
in get Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.199 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.199 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.199 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.199 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.199 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.199 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.199 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.199 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.199 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.199 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 15 05:09:48 localhost 
ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.199 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.199 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.199 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.199 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.199 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.199 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.199 12 ERROR oslo_messaging.notify.messaging Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.200 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.requests in the context of pollsters Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.200 12 DEBUG ceilometer.compute.pollsters [-] 39ff1bd9-6f6b-44c8-bbec-a1fd9d196359/disk.device.read.requests volume: 1272 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.201 12 DEBUG ceilometer.compute.pollsters [-] 39ff1bd9-6f6b-44c8-bbec-a1fd9d196359/disk.device.read.requests volume: 124 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 15 05:09:48 localhost 
ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.202 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '6ce3d930-216c-4557-acb7-f87f228b5311', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1272, 'user_id': '1ba5fce347b64bfebf995f187193f205', 'user_name': None, 'project_id': 'c785bf23f53946bc99867d8832a50266', 'project_name': None, 'resource_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359-vda', 'timestamp': '2025-12-15T10:09:48.200861', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359', 'instance_type': 'm1.small', 'host': 'bf4d507aa00b3fcf643a7e0b883420632ac7fc96e8c7a756982e121d', 'instance_host': 'np0005559462.localdomain', 'flavor': {'id': '2da0e147-aaa7-4bb9-a176-5fe1b15a32a0', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e'}, 'image_ref': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '30026eb8-d99e-11f0-817e-fa163ebaca0f', 'monotonic_time': 12454.356871664, 'message_signature': '776b876dfa3600a9606a0046d57ae22880ce2591b4d5418997a2d855474880a5'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 124, 'user_id': '1ba5fce347b64bfebf995f187193f205', 'user_name': None, 'project_id': 'c785bf23f53946bc99867d8832a50266', 'project_name': None, 'resource_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359-vdb', 
'timestamp': '2025-12-15T10:09:48.200861', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359', 'instance_type': 'm1.small', 'host': 'bf4d507aa00b3fcf643a7e0b883420632ac7fc96e8c7a756982e121d', 'instance_host': 'np0005559462.localdomain', 'flavor': {'id': '2da0e147-aaa7-4bb9-a176-5fe1b15a32a0', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e'}, 'image_ref': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '30027f48-d99e-11f0-817e-fa163ebaca0f', 'monotonic_time': 12454.356871664, 'message_signature': '9351d74b55feee2f548e15b08e6309cb819091d7b2f38ac79422c95163c5036b'}]}, 'timestamp': '2025-12-15 10:09:48.201759', '_unique_id': '996b0e1869e240a59dfdd99442bb5681'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.202 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.202 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.202 12 ERROR oslo_messaging.notify.messaging yield Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.202 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.202 12 ERROR oslo_messaging.notify.messaging return 
retry_over_time( Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.202 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.202 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.202 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.202 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.202 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.202 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.202 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.202 12 ERROR oslo_messaging.notify.messaging conn.connect() Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.202 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.202 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.202 12 ERROR 
oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.202 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.202 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.202 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.202 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.202 12 ERROR oslo_messaging.notify.messaging Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.202 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.202 12 ERROR oslo_messaging.notify.messaging Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.202 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.202 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.202 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.202 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.202 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.202 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.202 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.202 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.202 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.202 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.202 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.202 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.202 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.202 
12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.202 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.202 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.202 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.202 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.202 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.202 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.202 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.202 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.202 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.202 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.202 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.202 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.202 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.202 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.202 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.202 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.202 12 ERROR oslo_messaging.notify.messaging Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.203 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets in the context of pollsters Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.204 12 DEBUG ceilometer.compute.pollsters [-] 39ff1bd9-6f6b-44c8-bbec-a1fd9d196359/network.outgoing.packets volume: 114 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.205 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': 'e8f1a407-017a-4f3b-9aab-c335cf222de9', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 114, 'user_id': '1ba5fce347b64bfebf995f187193f205', 'user_name': None, 'project_id': 'c785bf23f53946bc99867d8832a50266', 'project_name': None, 'resource_id': 'instance-00000002-39ff1bd9-6f6b-44c8-bbec-a1fd9d196359-tap03ef8889-32', 'timestamp': '2025-12-15T10:09:48.204031', 'resource_metadata': {'display_name': 'test', 'name': 'tap03ef8889-32', 'instance_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359', 'instance_type': 'm1.small', 'host': 'bf4d507aa00b3fcf643a7e0b883420632ac7fc96e8c7a756982e121d', 'instance_host': 'np0005559462.localdomain', 'flavor': {'id': '2da0e147-aaa7-4bb9-a176-5fe1b15a32a0', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e'}, 'image_ref': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:74:df:7c', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap03ef8889-32'}, 'message_id': '3002e94c-d99e-11f0-817e-fa163ebaca0f', 'monotonic_time': 12454.318212952, 'message_signature': '52322fd20d11035c2b35f6752d0dcd780a78f6a5d0bcccd1b84e67276d1abbd0'}]}, 'timestamp': '2025-12-15 10:09:48.204509', '_unique_id': '9a5aa9e505854f859da06a26703854bc'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.205 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 15 05:09:48 localhost 
ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.205 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.205 12 ERROR oslo_messaging.notify.messaging yield Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.205 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.205 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.205 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.205 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.205 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.205 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.205 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.205 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.205 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.205 12 ERROR oslo_messaging.notify.messaging conn.connect() Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.205 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.205 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.205 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.205 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.205 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.205 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.205 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.205 12 ERROR oslo_messaging.notify.messaging Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.205 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.205 12 ERROR oslo_messaging.notify.messaging Dec 15 05:09:48 localhost 
ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.205 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.205 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.205 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.205 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.205 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.205 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.205 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.205 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.205 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.205 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection 
Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.205 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.205 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.205 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.205 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.205 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.205 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.205 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.205 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.205 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.205 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Dec 15 05:09:48 
localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.205 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.205 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.205 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.205 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.205 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.205 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.205 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.205 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.205 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.205 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.205 12 ERROR oslo_messaging.notify.messaging Dec 15 05:09:48 localhost 
ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.206 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.usage in the context of pollsters Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.217 12 DEBUG ceilometer.compute.pollsters [-] 39ff1bd9-6f6b-44c8-bbec-a1fd9d196359/disk.device.usage volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.217 12 DEBUG ceilometer.compute.pollsters [-] 39ff1bd9-6f6b-44c8-bbec-a1fd9d196359/disk.device.usage volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.219 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '0ea88ee5-3e21-46b9-ac42-2f613c05404d', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '1ba5fce347b64bfebf995f187193f205', 'user_name': None, 'project_id': 'c785bf23f53946bc99867d8832a50266', 'project_name': None, 'resource_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359-vda', 'timestamp': '2025-12-15T10:09:48.206637', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359', 'instance_type': 'm1.small', 'host': 'bf4d507aa00b3fcf643a7e0b883420632ac7fc96e8c7a756982e121d', 'instance_host': 'np0005559462.localdomain', 'flavor': {'id': '2da0e147-aaa7-4bb9-a176-5fe1b15a32a0', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 
'7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e'}, 'image_ref': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '3004efe4-d99e-11f0-817e-fa163ebaca0f', 'monotonic_time': 12454.399316318, 'message_signature': 'faeccc96dabb24d6a0b1ed862a031c8d8eed79f5641f658a457b7b24d907e5e7'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '1ba5fce347b64bfebf995f187193f205', 'user_name': None, 'project_id': 'c785bf23f53946bc99867d8832a50266', 'project_name': None, 'resource_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359-vdb', 'timestamp': '2025-12-15T10:09:48.206637', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359', 'instance_type': 'm1.small', 'host': 'bf4d507aa00b3fcf643a7e0b883420632ac7fc96e8c7a756982e121d', 'instance_host': 'np0005559462.localdomain', 'flavor': {'id': '2da0e147-aaa7-4bb9-a176-5fe1b15a32a0', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e'}, 'image_ref': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '30050092-d99e-11f0-817e-fa163ebaca0f', 'monotonic_time': 12454.399316318, 'message_signature': 'fe4cd174e5431c49eb0286873951bacdd5ec8fa70566f135fad3f1f10fd3a540'}]}, 'timestamp': '2025-12-15 10:09:48.218213', '_unique_id': 'eb2c7389853540a382dcb543dd73e9ad'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.219 12 ERROR 
oslo_messaging.notify.messaging Traceback (most recent call last): Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.219 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.219 12 ERROR oslo_messaging.notify.messaging yield Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.219 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.219 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.219 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.219 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.219 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.219 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.219 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.219 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Dec 15 05:09:48 localhost 
ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.219 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.219 12 ERROR oslo_messaging.notify.messaging conn.connect() Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.219 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.219 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.219 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.219 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.219 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.219 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.219 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.219 12 ERROR oslo_messaging.notify.messaging Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.219 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 
2025-12-15 10:09:48.219 12 ERROR oslo_messaging.notify.messaging Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.219 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.219 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.219 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.219 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.219 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.219 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.219 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.219 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.219 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.219 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.219 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.219 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.219 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.219 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.219 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.219 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.219 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.219 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.219 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.219 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.219 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.219 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.219 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.219 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.219 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.219 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.219 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.219 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.219 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.219 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 15 05:09:48 localhost 
ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.219 12 ERROR oslo_messaging.notify.messaging Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.220 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.allocation in the context of pollsters Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.220 12 DEBUG ceilometer.compute.pollsters [-] 39ff1bd9-6f6b-44c8-bbec-a1fd9d196359/disk.device.allocation volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.220 12 DEBUG ceilometer.compute.pollsters [-] 39ff1bd9-6f6b-44c8-bbec-a1fd9d196359/disk.device.allocation volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.222 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': '57be8bfc-2bbd-4e8a-9d46-cb49b4a3f3b2', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '1ba5fce347b64bfebf995f187193f205', 'user_name': None, 'project_id': 'c785bf23f53946bc99867d8832a50266', 'project_name': None, 'resource_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359-vda', 'timestamp': '2025-12-15T10:09:48.220474', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359', 'instance_type': 'm1.small', 'host': 'bf4d507aa00b3fcf643a7e0b883420632ac7fc96e8c7a756982e121d', 'instance_host': 'np0005559462.localdomain', 'flavor': {'id': '2da0e147-aaa7-4bb9-a176-5fe1b15a32a0', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e'}, 'image_ref': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '30056b18-d99e-11f0-817e-fa163ebaca0f', 'monotonic_time': 12454.399316318, 'message_signature': '9a6cc30387e71e230cae4d184b678f6502a519d72d0b8395c6213e43cf4fd428'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '1ba5fce347b64bfebf995f187193f205', 'user_name': None, 'project_id': 'c785bf23f53946bc99867d8832a50266', 'project_name': None, 'resource_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359-vdb', 'timestamp': '2025-12-15T10:09:48.220474', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359', 
'instance_type': 'm1.small', 'host': 'bf4d507aa00b3fcf643a7e0b883420632ac7fc96e8c7a756982e121d', 'instance_host': 'np0005559462.localdomain', 'flavor': {'id': '2da0e147-aaa7-4bb9-a176-5fe1b15a32a0', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e'}, 'image_ref': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '30057ebe-d99e-11f0-817e-fa163ebaca0f', 'monotonic_time': 12454.399316318, 'message_signature': 'c06cf424277efc4291c8c0bb3fb6a989bf283127ba514ac2af8e183a20faa3ab'}]}, 'timestamp': '2025-12-15 10:09:48.221489', '_unique_id': '5c00ec2b3e434c80b3f5500a262fb97c'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.222 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.222 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.222 12 ERROR oslo_messaging.notify.messaging yield Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.222 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.222 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.222 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.222 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.222 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.222 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.222 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.222 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.222 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.222 12 ERROR oslo_messaging.notify.messaging conn.connect() Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.222 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.222 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.222 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Dec 15 05:09:48 localhost 
ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.222 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.222 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.222 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.222 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.222 12 ERROR oslo_messaging.notify.messaging Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.222 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.222 12 ERROR oslo_messaging.notify.messaging Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.222 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.222 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.222 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.222 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 
10:09:48.222 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.222 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.222 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.222 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.222 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.222 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.222 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.222 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.222 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.222 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Dec 15 05:09:48 localhost 
ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.222 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.222 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.222 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.222 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.222 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.222 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.222 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.222 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.222 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.222 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 
10:09:48.222 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.222 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.222 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.222 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.222 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.222 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.222 12 ERROR oslo_messaging.notify.messaging Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.224 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.capacity in the context of pollsters Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.224 12 DEBUG ceilometer.compute.pollsters [-] 39ff1bd9-6f6b-44c8-bbec-a1fd9d196359/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.225 12 DEBUG ceilometer.compute.pollsters [-] 39ff1bd9-6f6b-44c8-bbec-a1fd9d196359/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 
10:09:48.226 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '4a189e2f-9ea5-4c3c-bdd1-741b18bbff47', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '1ba5fce347b64bfebf995f187193f205', 'user_name': None, 'project_id': 'c785bf23f53946bc99867d8832a50266', 'project_name': None, 'resource_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359-vda', 'timestamp': '2025-12-15T10:09:48.224622', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359', 'instance_type': 'm1.small', 'host': 'bf4d507aa00b3fcf643a7e0b883420632ac7fc96e8c7a756982e121d', 'instance_host': 'np0005559462.localdomain', 'flavor': {'id': '2da0e147-aaa7-4bb9-a176-5fe1b15a32a0', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e'}, 'image_ref': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '30060e24-d99e-11f0-817e-fa163ebaca0f', 'monotonic_time': 12454.399316318, 'message_signature': 'a58e761775a6eb50d03e17dc99ec90ec8b86de3e8401a0ceaf62dbd1e00d9f92'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '1ba5fce347b64bfebf995f187193f205', 'user_name': None, 'project_id': 'c785bf23f53946bc99867d8832a50266', 'project_name': None, 'resource_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359-vdb', 'timestamp': '2025-12-15T10:09:48.224622', 'resource_metadata': 
{'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359', 'instance_type': 'm1.small', 'host': 'bf4d507aa00b3fcf643a7e0b883420632ac7fc96e8c7a756982e121d', 'instance_host': 'np0005559462.localdomain', 'flavor': {'id': '2da0e147-aaa7-4bb9-a176-5fe1b15a32a0', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e'}, 'image_ref': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '300620a8-d99e-11f0-817e-fa163ebaca0f', 'monotonic_time': 12454.399316318, 'message_signature': 'c057c575e33a5e886e2ddfe4d7d61f3322ecf123aaa9fe39271c7f54fd2fcb5f'}]}, 'timestamp': '2025-12-15 10:09:48.225553', '_unique_id': '947224d63fa94a7ba62b5f8f8b4d663d'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.226 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.226 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.226 12 ERROR oslo_messaging.notify.messaging yield Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.226 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.226 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 15 05:09:48 localhost 
ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.226 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.226 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.226 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.226 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.226 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.226 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.226 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.226 12 ERROR oslo_messaging.notify.messaging conn.connect() Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.226 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.226 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.226 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.226 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.226 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.226 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.226 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.226 12 ERROR oslo_messaging.notify.messaging Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.226 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.226 12 ERROR oslo_messaging.notify.messaging Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.226 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.226 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.226 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.226 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 
134, in _send_notification Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.226 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.226 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.226 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.226 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.226 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.226 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.226 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.226 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.226 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.226 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.226 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.226 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.226 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.226 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.226 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.226 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.226 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.226 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.226 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.226 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", 
line 433, in _ensure_connection Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.226 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.226 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.226 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.226 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.226 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.226 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.226 12 ERROR oslo_messaging.notify.messaging Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.227 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no new resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193 Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.227 12 INFO ceilometer.polling.manager [-] Polling pollster cpu in the context of pollsters Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.227 12 DEBUG ceilometer.compute.pollsters [-] 39ff1bd9-6f6b-44c8-bbec-a1fd9d196359/cpu volume: 16480000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 15 05:09:48 localhost 
ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.229 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '550c9cd4-5a84-4a25-80c7-22ffa87df2f0', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 16480000000, 'user_id': '1ba5fce347b64bfebf995f187193f205', 'user_name': None, 'project_id': 'c785bf23f53946bc99867d8832a50266', 'project_name': None, 'resource_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359', 'timestamp': '2025-12-15T10:09:48.227886', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359', 'instance_type': 'm1.small', 'host': 'bf4d507aa00b3fcf643a7e0b883420632ac7fc96e8c7a756982e121d', 'instance_host': 'np0005559462.localdomain', 'flavor': {'id': '2da0e147-aaa7-4bb9-a176-5fe1b15a32a0', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e'}, 'image_ref': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'cpu_number': 1}, 'message_id': '30068fca-d99e-11f0-817e-fa163ebaca0f', 'monotonic_time': 12454.347357694, 'message_signature': '765752334a7083462474a15fe89765d13823b27b79237649bc7ac90d1b1f295a'}]}, 'timestamp': '2025-12-15 10:09:48.228415', '_unique_id': 'd487a6100e7e49038ad454fb9883eafe'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.229 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 
2025-12-15 10:09:48.229 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.229 12 ERROR oslo_messaging.notify.messaging yield Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.229 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.229 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.229 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.229 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.229 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.229 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.229 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.229 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.229 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.229 12 ERROR oslo_messaging.notify.messaging conn.connect() Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.229 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.229 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.229 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.229 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.229 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.229 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.229 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.229 12 ERROR oslo_messaging.notify.messaging Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.229 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.229 12 ERROR oslo_messaging.notify.messaging Dec 15 05:09:48 localhost 
ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.229 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.229 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.229 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.229 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.229 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.229 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.229 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.229 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.229 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.229 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection 
Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.229 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.229 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.229 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.229 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.229 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.229 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.229 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.229 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.229 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.229 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Dec 15 05:09:48 
localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.229 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.229 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.229 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.229 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.229 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.229 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.229 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.229 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.229 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.229 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.229 12 ERROR oslo_messaging.notify.messaging Dec 15 05:09:48 localhost 
ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.230 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no new resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193 Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.230 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.latency in the context of pollsters Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.230 12 DEBUG ceilometer.compute.pollsters [-] 39ff1bd9-6f6b-44c8-bbec-a1fd9d196359/disk.device.read.latency volume: 1342134926 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.231 12 DEBUG ceilometer.compute.pollsters [-] 39ff1bd9-6f6b-44c8-bbec-a1fd9d196359/disk.device.read.latency volume: 123356132 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.232 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': '4b4a5313-b51c-4670-9678-5105bb708a19', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 1342134926, 'user_id': '1ba5fce347b64bfebf995f187193f205', 'user_name': None, 'project_id': 'c785bf23f53946bc99867d8832a50266', 'project_name': None, 'resource_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359-vda', 'timestamp': '2025-12-15T10:09:48.230754', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359', 'instance_type': 'm1.small', 'host': 'bf4d507aa00b3fcf643a7e0b883420632ac7fc96e8c7a756982e121d', 'instance_host': 'np0005559462.localdomain', 'flavor': {'id': '2da0e147-aaa7-4bb9-a176-5fe1b15a32a0', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e'}, 'image_ref': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '3006fdfc-d99e-11f0-817e-fa163ebaca0f', 'monotonic_time': 12454.356871664, 'message_signature': '1ceca5577183e6c00bafe259b753222708a30e6c3fb07cb15ff62b16f76d77f2'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 123356132, 'user_id': '1ba5fce347b64bfebf995f187193f205', 'user_name': None, 'project_id': 'c785bf23f53946bc99867d8832a50266', 'project_name': None, 'resource_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359-vdb', 'timestamp': '2025-12-15T10:09:48.230754', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 
'39ff1bd9-6f6b-44c8-bbec-a1fd9d196359', 'instance_type': 'm1.small', 'host': 'bf4d507aa00b3fcf643a7e0b883420632ac7fc96e8c7a756982e121d', 'instance_host': 'np0005559462.localdomain', 'flavor': {'id': '2da0e147-aaa7-4bb9-a176-5fe1b15a32a0', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e'}, 'image_ref': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '30070e6e-d99e-11f0-817e-fa163ebaca0f', 'monotonic_time': 12454.356871664, 'message_signature': '121dd48812b99bd0e1b9d6dca1d50e270612e5bfae21cb66ef28f9362ccd3b3a'}]}, 'timestamp': '2025-12-15 10:09:48.231639', '_unique_id': 'a91a7a193adf4d66854e0de96c5a1554'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.232 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.232 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.232 12 ERROR oslo_messaging.notify.messaging yield Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.232 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.232 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.232 12 ERROR 
oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.232 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.232 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.232 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.232 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.232 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.232 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.232 12 ERROR oslo_messaging.notify.messaging conn.connect() Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.232 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.232 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.232 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Dec 15 
05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.232 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.232 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.232 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.232 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.232 12 ERROR oslo_messaging.notify.messaging Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.232 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.232 12 ERROR oslo_messaging.notify.messaging Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.232 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.232 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.232 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.232 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Dec 15 05:09:48 localhost 
ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.232 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.232 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.232 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.232 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.232 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.232 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.232 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.232 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.232 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.232 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, 
in get Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.232 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.232 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.232 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.232 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.232 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.232 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.232 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.232 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.232 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.232 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 15 05:09:48 localhost 
ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.232 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.232 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.232 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.232 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.232 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.232 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.232 12 ERROR oslo_messaging.notify.messaging Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.233 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes in the context of pollsters Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.233 12 DEBUG ceilometer.compute.pollsters [-] 39ff1bd9-6f6b-44c8-bbec-a1fd9d196359/network.incoming.bytes volume: 6809 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.235 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': '8e1ef867-bd3d-4e94-a450-eb2bfbaee060', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 6809, 'user_id': '1ba5fce347b64bfebf995f187193f205', 'user_name': None, 'project_id': 'c785bf23f53946bc99867d8832a50266', 'project_name': None, 'resource_id': 'instance-00000002-39ff1bd9-6f6b-44c8-bbec-a1fd9d196359-tap03ef8889-32', 'timestamp': '2025-12-15T10:09:48.233845', 'resource_metadata': {'display_name': 'test', 'name': 'tap03ef8889-32', 'instance_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359', 'instance_type': 'm1.small', 'host': 'bf4d507aa00b3fcf643a7e0b883420632ac7fc96e8c7a756982e121d', 'instance_host': 'np0005559462.localdomain', 'flavor': {'id': '2da0e147-aaa7-4bb9-a176-5fe1b15a32a0', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e'}, 'image_ref': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:74:df:7c', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap03ef8889-32'}, 'message_id': '300776e2-d99e-11f0-817e-fa163ebaca0f', 'monotonic_time': 12454.318212952, 'message_signature': 'd8689d290d4dc37711802e0d86cf7ef51a5ad0e7eab152150bb6338c33d6c9c3'}]}, 'timestamp': '2025-12-15 10:09:48.234344', '_unique_id': 'ac7a495235ae4d3ebda22cea1cc3ff93'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.235 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 15 05:09:48 localhost 
ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.235 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.235 12 ERROR oslo_messaging.notify.messaging yield Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.235 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.235 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.235 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.235 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.235 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.235 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.235 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.235 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.235 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.235 12 ERROR oslo_messaging.notify.messaging conn.connect() Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.235 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.235 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.235 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.235 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.235 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.235 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.235 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.235 12 ERROR oslo_messaging.notify.messaging Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.235 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.235 12 ERROR oslo_messaging.notify.messaging Dec 15 05:09:48 localhost 
ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.235 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.235 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.235 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.235 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.235 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.235 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.235 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.235 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.235 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.235 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection 
Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.235 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.235 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.235 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.235 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.235 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.235 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.235 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.235 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.235 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.235 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Dec 15 05:09:48 
localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.235 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.235 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.235 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.235 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.235 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.235 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.235 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.235 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.235 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.235 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.235 12 ERROR oslo_messaging.notify.messaging Dec 15 05:09:48 localhost 
ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.236 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.requests in the context of pollsters Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.236 12 DEBUG ceilometer.compute.pollsters [-] 39ff1bd9-6f6b-44c8-bbec-a1fd9d196359/disk.device.write.requests volume: 47 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.236 12 DEBUG ceilometer.compute.pollsters [-] 39ff1bd9-6f6b-44c8-bbec-a1fd9d196359/disk.device.write.requests volume: 1 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.237 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '89c9399d-3449-4977-8144-37b311700fe3', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 47, 'user_id': '1ba5fce347b64bfebf995f187193f205', 'user_name': None, 'project_id': 'c785bf23f53946bc99867d8832a50266', 'project_name': None, 'resource_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359-vda', 'timestamp': '2025-12-15T10:09:48.236379', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359', 'instance_type': 'm1.small', 'host': 'bf4d507aa00b3fcf643a7e0b883420632ac7fc96e8c7a756982e121d', 'instance_host': 'np0005559462.localdomain', 'flavor': {'id': '2da0e147-aaa7-4bb9-a176-5fe1b15a32a0', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 
'7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e'}, 'image_ref': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '3007d4f2-d99e-11f0-817e-fa163ebaca0f', 'monotonic_time': 12454.356871664, 'message_signature': 'ec2b157c07947b4480bfbb5ea57c193a4f710ef1ca04df7300dbfe2a65f733e9'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1, 'user_id': '1ba5fce347b64bfebf995f187193f205', 'user_name': None, 'project_id': 'c785bf23f53946bc99867d8832a50266', 'project_name': None, 'resource_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359-vdb', 'timestamp': '2025-12-15T10:09:48.236379', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359', 'instance_type': 'm1.small', 'host': 'bf4d507aa00b3fcf643a7e0b883420632ac7fc96e8c7a756982e121d', 'instance_host': 'np0005559462.localdomain', 'flavor': {'id': '2da0e147-aaa7-4bb9-a176-5fe1b15a32a0', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e'}, 'image_ref': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '3007defc-d99e-11f0-817e-fa163ebaca0f', 'monotonic_time': 12454.356871664, 'message_signature': 'c769c5893029dc82b758a3e8c8e74cad88b6987e281b2a7adbd672ff6612ba10'}]}, 'timestamp': '2025-12-15 10:09:48.236902', '_unique_id': '91f3da6a06f64b8588b4cd56255674e9'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.237 12 
ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.237 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.237 12 ERROR oslo_messaging.notify.messaging yield Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.237 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.237 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.237 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.237 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.237 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.237 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.237 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.237 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Dec 15 05:09:48 localhost 
ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.237 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.237 12 ERROR oslo_messaging.notify.messaging conn.connect() Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.237 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.237 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.237 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.237 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.237 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.237 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.237 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.237 12 ERROR oslo_messaging.notify.messaging Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.237 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 
2025-12-15 10:09:48.237 12 ERROR oslo_messaging.notify.messaging Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.237 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.237 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.237 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.237 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.237 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.237 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.237 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.237 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.237 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.237 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.237 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.237 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.237 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.237 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.237 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.237 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.237 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.237 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.237 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.237 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.237 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.237 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.237 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.237 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.237 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.237 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.237 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.237 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.237 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.237 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 15 05:09:48 localhost 
ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.237 12 ERROR oslo_messaging.notify.messaging Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.238 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes in the context of pollsters Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.238 12 DEBUG ceilometer.compute.pollsters [-] 39ff1bd9-6f6b-44c8-bbec-a1fd9d196359/network.outgoing.bytes volume: 9770 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.239 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'cbafbbe6-4c9b-4c73-8111-4a9fd19ef79d', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 9770, 'user_id': '1ba5fce347b64bfebf995f187193f205', 'user_name': None, 'project_id': 'c785bf23f53946bc99867d8832a50266', 'project_name': None, 'resource_id': 'instance-00000002-39ff1bd9-6f6b-44c8-bbec-a1fd9d196359-tap03ef8889-32', 'timestamp': '2025-12-15T10:09:48.238478', 'resource_metadata': {'display_name': 'test', 'name': 'tap03ef8889-32', 'instance_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359', 'instance_type': 'm1.small', 'host': 'bf4d507aa00b3fcf643a7e0b883420632ac7fc96e8c7a756982e121d', 'instance_host': 'np0005559462.localdomain', 'flavor': {'id': '2da0e147-aaa7-4bb9-a176-5fe1b15a32a0', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e'}, 'image_ref': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 
'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:74:df:7c', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap03ef8889-32'}, 'message_id': '3008293e-d99e-11f0-817e-fa163ebaca0f', 'monotonic_time': 12454.318212952, 'message_signature': 'db1e7c63cf11d090999e2a9dd2b0672cf78b6c5a754a4aa698175103106c2f87'}]}, 'timestamp': '2025-12-15 10:09:48.238889', '_unique_id': 'db7f0d1f9b0446ed9615ff032c2a6cdf'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.239 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.239 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.239 12 ERROR oslo_messaging.notify.messaging yield Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.239 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.239 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.239 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.239 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.239 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, 
in _connection_factory Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.239 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.239 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.239 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.239 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.239 12 ERROR oslo_messaging.notify.messaging conn.connect() Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.239 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.239 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.239 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.239 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.239 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.239 12 ERROR 
oslo_messaging.notify.messaging self.sock.connect(sa) Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.239 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.239 12 ERROR oslo_messaging.notify.messaging Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.239 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.239 12 ERROR oslo_messaging.notify.messaging Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.239 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.239 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.239 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.239 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.239 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.239 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.239 12 ERROR 
oslo_messaging.notify.messaging return self._send(target, ctxt, message, Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.239 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.239 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.239 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.239 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.239 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.239 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.239 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.239 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.239 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.239 12 ERROR 
oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.239 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.239 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.239 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.239 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.239 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.239 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.239 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.239 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.239 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.239 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.239 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.239 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.239 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.239 12 ERROR oslo_messaging.notify.messaging
Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.240 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.bytes in the context of pollsters
Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.240 12 DEBUG ceilometer.compute.pollsters [-] 39ff1bd9-6f6b-44c8-bbec-a1fd9d196359/disk.device.read.bytes volume: 35560448 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.241 12 DEBUG ceilometer.compute.pollsters [-] 39ff1bd9-6f6b-44c8-bbec-a1fd9d196359/disk.device.read.bytes volume: 2154496 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.242 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'f8cb9665-7b35-4a09-83b8-8c7eaaed2a10', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 35560448, 'user_id': '1ba5fce347b64bfebf995f187193f205', 'user_name': None, 'project_id': 'c785bf23f53946bc99867d8832a50266', 'project_name': None, 'resource_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359-vda', 'timestamp': '2025-12-15T10:09:48.240785', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359', 'instance_type': 'm1.small', 'host': 'bf4d507aa00b3fcf643a7e0b883420632ac7fc96e8c7a756982e121d', 'instance_host': 'np0005559462.localdomain', 'flavor': {'id': '2da0e147-aaa7-4bb9-a176-5fe1b15a32a0', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e'}, 'image_ref': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '30088492-d99e-11f0-817e-fa163ebaca0f', 'monotonic_time': 12454.356871664, 'message_signature': 'dee04f489cc6da904d83e6ce6cbe4e8237c5ad76f7c9db86111d1130f29134cc'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 2154496, 'user_id': '1ba5fce347b64bfebf995f187193f205', 'user_name': None, 'project_id': 'c785bf23f53946bc99867d8832a50266', 'project_name': None, 'resource_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359-vdb', 'timestamp': '2025-12-15T10:09:48.240785', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359', 'instance_type': 'm1.small', 'host': 'bf4d507aa00b3fcf643a7e0b883420632ac7fc96e8c7a756982e121d', 'instance_host': 'np0005559462.localdomain', 'flavor': {'id': '2da0e147-aaa7-4bb9-a176-5fe1b15a32a0', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e'}, 'image_ref': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '3008937e-d99e-11f0-817e-fa163ebaca0f', 'monotonic_time': 12454.356871664, 'message_signature': '6f0f8e13a20464ed5cf297f66cc9e6a45ec70951f3fc1b129349504375053a2a'}]}, 'timestamp': '2025-12-15 10:09:48.241587', '_unique_id': 'f274eb9c2e7342fa94226b3ec208f694'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.242 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.242 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.242 12 ERROR oslo_messaging.notify.messaging yield
Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.242 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.242 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.242 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.242 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.242 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.242 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.242 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.242 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.242 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.242 12 ERROR oslo_messaging.notify.messaging conn.connect()
Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.242 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.242 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.242 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.242 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.242 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.242 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.242 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.242 12 ERROR oslo_messaging.notify.messaging
Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.242 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.242 12 ERROR oslo_messaging.notify.messaging
Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.242 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.242 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.242 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.242 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.242 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.242 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.242 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.242 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.242 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.242 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.242 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.242 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.242 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.242 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.242 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.242 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.242 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.242 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.242 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.242 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.242 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.242 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.242 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.242 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.242 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.242 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.242 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.242 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.242 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.242 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.242 12 ERROR oslo_messaging.notify.messaging
Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.242 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no new resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.243 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.delta in the context of pollsters
Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.243 12 DEBUG ceilometer.compute.pollsters [-] 39ff1bd9-6f6b-44c8-bbec-a1fd9d196359/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.243 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '2b1e8304-1a18-4a4e-b576-62ae6f4a1013', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '1ba5fce347b64bfebf995f187193f205', 'user_name': None, 'project_id': 'c785bf23f53946bc99867d8832a50266', 'project_name': None, 'resource_id': 'instance-00000002-39ff1bd9-6f6b-44c8-bbec-a1fd9d196359-tap03ef8889-32', 'timestamp': '2025-12-15T10:09:48.243087', 'resource_metadata': {'display_name': 'test', 'name': 'tap03ef8889-32', 'instance_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359', 'instance_type': 'm1.small', 'host': 'bf4d507aa00b3fcf643a7e0b883420632ac7fc96e8c7a756982e121d', 'instance_host': 'np0005559462.localdomain', 'flavor': {'id': '2da0e147-aaa7-4bb9-a176-5fe1b15a32a0', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e'}, 'image_ref': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:74:df:7c', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap03ef8889-32'}, 'message_id': '3008db18-d99e-11f0-817e-fa163ebaca0f', 'monotonic_time': 12454.318212952, 'message_signature': 'fbc82c279ebcf0d95c6e8fbf798034053faab95f53ac57245a5857ab3ad8fb70'}]}, 'timestamp': '2025-12-15 10:09:48.243374', '_unique_id': '588320ce29c54feaae0cdf94d59be4fd'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.243 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.243 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.243 12 ERROR oslo_messaging.notify.messaging yield
Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.243 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.243 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.243 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.243 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.243 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.243 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.243 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.243 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.243 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.243 12 ERROR oslo_messaging.notify.messaging conn.connect()
Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.243 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.243 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.243 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.243 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.243 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.243 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.243 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.243 12 ERROR oslo_messaging.notify.messaging
Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.243 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.243 12 ERROR oslo_messaging.notify.messaging
Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.243 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.243 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.243 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.243 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.243 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.243 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.243 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.243 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.243 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.243 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.243 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.243 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.243 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.243 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.243 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.243 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.243 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.243 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.243 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.243 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.243 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.243 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.243 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.243 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.243 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.243 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.243 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.243 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.243 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.243 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.243 12 ERROR oslo_messaging.notify.messaging
Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.244 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.drop in the context of pollsters
Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.244 12 DEBUG ceilometer.compute.pollsters [-] 39ff1bd9-6f6b-44c8-bbec-a1fd9d196359/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.245 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '9c063416-3ea5-4a23-b353-063dc9c49bfc', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '1ba5fce347b64bfebf995f187193f205', 'user_name': None, 'project_id': 'c785bf23f53946bc99867d8832a50266', 'project_name': None, 'resource_id': 'instance-00000002-39ff1bd9-6f6b-44c8-bbec-a1fd9d196359-tap03ef8889-32', 'timestamp': '2025-12-15T10:09:48.244709', 'resource_metadata': {'display_name': 'test', 'name': 'tap03ef8889-32', 'instance_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359', 'instance_type': 'm1.small', 'host': 'bf4d507aa00b3fcf643a7e0b883420632ac7fc96e8c7a756982e121d', 'instance_host': 'np0005559462.localdomain', 'flavor': {'id': '2da0e147-aaa7-4bb9-a176-5fe1b15a32a0', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e'}, 'image_ref': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:74:df:7c', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap03ef8889-32'}, 'message_id': '30091a74-d99e-11f0-817e-fa163ebaca0f', 'monotonic_time': 12454.318212952, 'message_signature': '0a3f9665276233e36ec77b13c2f4e5ade537d7f5aa339a5791b42eb2d879bcd2'}]}, 'timestamp': '2025-12-15 10:09:48.245019', '_unique_id': '5119a136250e4d839d6f6add68a27343'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.245 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.245 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.245 12 ERROR oslo_messaging.notify.messaging yield
Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.245 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.245 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.245 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.245 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.245 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.245 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.245 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.245 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.245 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.245 12 ERROR oslo_messaging.notify.messaging conn.connect()
Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.245 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.245 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.245 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.245 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.245 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.245 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.245 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.245 12 ERROR oslo_messaging.notify.messaging
Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.245 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.245 12 ERROR oslo_messaging.notify.messaging
Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.245 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.245 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.245 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.245 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.245 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.245 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.245 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.245 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.245 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.245 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.245 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.245 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.245 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.245 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.245 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.245 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.245 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.245 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.245 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.245 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.245 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.245 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.245 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.245 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.245 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.245 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.245 12 ERROR
oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.245 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.245 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.245 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 15 05:09:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:09:48.245 12 ERROR oslo_messaging.notify.messaging Dec 15 05:09:48 localhost ceph-mon[298913]: mon.np0005559462@0(leader).osd e219 do_prune osdmap full prune enabled Dec 15 05:09:48 localhost ceph-mon[298913]: mon.np0005559462@0(leader).osd e220 e220: 6 total, 6 up, 6 in Dec 15 05:09:48 localhost ceph-mon[298913]: log_channel(cluster) log [DBG] : osdmap e220: 6 total, 6 up, 6 in Dec 15 05:09:48 localhost dnsmasq[332970]: exiting on receipt of SIGTERM Dec 15 05:09:48 localhost podman[333204]: 2025-12-15 10:09:48.55880747 +0000 UTC m=+0.061965496 container kill 3ce513ec863173da1d0dc5fb86570d52d02616d4e1c034bfd6fe5ee1770bd8f4 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-91df3eeb-df70-4c24-88ec-6d2e82e64e9a, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2) Dec 15 05:09:48 localhost systemd[1]: libpod-3ce513ec863173da1d0dc5fb86570d52d02616d4e1c034bfd6fe5ee1770bd8f4.scope: Deactivated successfully. 
Dec 15 05:09:48 localhost podman[333217]: 2025-12-15 10:09:48.634609201 +0000 UTC m=+0.060536747 container died 3ce513ec863173da1d0dc5fb86570d52d02616d4e1c034bfd6fe5ee1770bd8f4 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-91df3eeb-df70-4c24-88ec-6d2e82e64e9a, tcib_managed=true, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image) Dec 15 05:09:48 localhost podman[333217]: 2025-12-15 10:09:48.664171205 +0000 UTC m=+0.090098721 container cleanup 3ce513ec863173da1d0dc5fb86570d52d02616d4e1c034bfd6fe5ee1770bd8f4 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-91df3eeb-df70-4c24-88ec-6d2e82e64e9a, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb) Dec 15 05:09:48 localhost systemd[1]: libpod-conmon-3ce513ec863173da1d0dc5fb86570d52d02616d4e1c034bfd6fe5ee1770bd8f4.scope: Deactivated successfully. 
Dec 15 05:09:48 localhost podman[333219]: 2025-12-15 10:09:48.715254413 +0000 UTC m=+0.134524118 container remove 3ce513ec863173da1d0dc5fb86570d52d02616d4e1c034bfd6fe5ee1770bd8f4 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-91df3eeb-df70-4c24-88ec-6d2e82e64e9a, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, org.label-schema.build-date=20251202) Dec 15 05:09:48 localhost systemd[1]: var-lib-containers-storage-overlay-db4d82f8e576a716edaf03d3a2ef2b18357635d8498f4316d21c995619652ef4-merged.mount: Deactivated successfully. Dec 15 05:09:48 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-3ce513ec863173da1d0dc5fb86570d52d02616d4e1c034bfd6fe5ee1770bd8f4-userdata-shm.mount: Deactivated successfully. Dec 15 05:09:48 localhost nova_compute[286344]: 2025-12-15 10:09:48.915 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:09:48 localhost systemd[1]: run-netns-qdhcp\x2d91df3eeb\x2ddf70\x2d4c24\x2d88ec\x2d6d2e82e64e9a.mount: Deactivated successfully. 
Dec 15 05:09:48 localhost neutron_dhcp_agent[267542]: 2025-12-15 10:09:48.934 267546 INFO neutron.agent.dhcp.agent [None req-414354e7-b9d0-416b-b723-2e52cb1b7a24 - - - - - -] Network not present, action: clean_devices, action_kwargs: {}#033[00m Dec 15 05:09:49 localhost neutron_dhcp_agent[267542]: 2025-12-15 10:09:49.045 267546 INFO neutron.agent.dhcp.agent [-] Network not present, action: clean_devices, action_kwargs: {}#033[00m Dec 15 05:09:50 localhost neutron_dhcp_agent[267542]: 2025-12-15 10:09:50.360 267546 INFO neutron.agent.dhcp.agent [-] Network not present, action: clean_devices, action_kwargs: {}#033[00m Dec 15 05:09:50 localhost neutron_dhcp_agent[267542]: 2025-12-15 10:09:50.381 267546 INFO neutron.agent.linux.ip_lib [None req-319bac06-2f9a-453e-a165-012f71704f84 - - - - - -] Device tapf3ac7d66-83 cannot be used as it has no MAC address#033[00m Dec 15 05:09:50 localhost nova_compute[286344]: 2025-12-15 10:09:50.457 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:09:50 localhost kernel: device tapf3ac7d66-83 entered promiscuous mode Dec 15 05:09:50 localhost NetworkManager[5963]: [1765793390.4666] manager: (tapf3ac7d66-83): new Generic device (/org/freedesktop/NetworkManager/Devices/73) Dec 15 05:09:50 localhost nova_compute[286344]: 2025-12-15 10:09:50.468 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:09:50 localhost ovn_controller[154603]: 2025-12-15T10:09:50Z|00470|binding|INFO|Claiming lport f3ac7d66-832e-4ea6-ba95-02be907cd4ba for this chassis. Dec 15 05:09:50 localhost ovn_controller[154603]: 2025-12-15T10:09:50Z|00471|binding|INFO|f3ac7d66-832e-4ea6-ba95-02be907cd4ba: Claiming unknown Dec 15 05:09:50 localhost systemd-udevd[333257]: Network interface NamePolicy= disabled on kernel command line. 
Dec 15 05:09:50 localhost journal[231322]: ethtool ioctl error on tapf3ac7d66-83: No such device Dec 15 05:09:50 localhost journal[231322]: ethtool ioctl error on tapf3ac7d66-83: No such device Dec 15 05:09:50 localhost ovn_controller[154603]: 2025-12-15T10:09:50Z|00472|binding|INFO|Setting lport f3ac7d66-832e-4ea6-ba95-02be907cd4ba ovn-installed in OVS Dec 15 05:09:50 localhost nova_compute[286344]: 2025-12-15 10:09:50.507 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:09:50 localhost journal[231322]: ethtool ioctl error on tapf3ac7d66-83: No such device Dec 15 05:09:50 localhost journal[231322]: ethtool ioctl error on tapf3ac7d66-83: No such device Dec 15 05:09:50 localhost journal[231322]: ethtool ioctl error on tapf3ac7d66-83: No such device Dec 15 05:09:50 localhost journal[231322]: ethtool ioctl error on tapf3ac7d66-83: No such device Dec 15 05:09:50 localhost journal[231322]: ethtool ioctl error on tapf3ac7d66-83: No such device Dec 15 05:09:50 localhost journal[231322]: ethtool ioctl error on tapf3ac7d66-83: No such device Dec 15 05:09:50 localhost nova_compute[286344]: 2025-12-15 10:09:50.553 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:09:50 localhost nova_compute[286344]: 2025-12-15 10:09:50.587 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:09:50 localhost ovn_controller[154603]: 2025-12-15T10:09:50Z|00473|binding|INFO|Setting lport f3ac7d66-832e-4ea6-ba95-02be907cd4ba up in Southbound Dec 15 05:09:50 localhost ovn_metadata_agent[160585]: 2025-12-15 10:09:50.765 160590 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to 
row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005559462.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8::2/64', 'neutron:device_id': 'dhcpce287b4b-a043-5ed9-8e08-4fd555d768a2-542d072b-cbf8-4dc6-8dee-96fae1ee7f4a', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-542d072b-cbf8-4dc6-8dee-96fae1ee7f4a', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '68b3221c6efd4aac8016cd49e82c26c8', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=cb4fc024-9871-4c39-ae3b-4f805eccd069, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=f3ac7d66-832e-4ea6-ba95-02be907cd4ba) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Dec 15 05:09:50 localhost ovn_metadata_agent[160585]: 2025-12-15 10:09:50.768 160590 INFO neutron.agent.ovn.metadata.agent [-] Port f3ac7d66-832e-4ea6-ba95-02be907cd4ba in datapath 542d072b-cbf8-4dc6-8dee-96fae1ee7f4a bound to our chassis#033[00m Dec 15 05:09:50 localhost ovn_metadata_agent[160585]: 2025-12-15 10:09:50.769 160590 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network 542d072b-cbf8-4dc6-8dee-96fae1ee7f4a or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599#033[00m Dec 15 05:09:50 localhost ovn_metadata_agent[160585]: 2025-12-15 10:09:50.770 160858 DEBUG oslo.privsep.daemon [-] privsep: reply[c8450a80-863d-4171-8a60-f1fdd990c96c]: (4, False) _call_back 
/usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Dec 15 05:09:51 localhost nova_compute[286344]: 2025-12-15 10:09:51.223 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:09:51 localhost ceph-mon[298913]: mon.np0005559462@0(leader).osd e220 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104 Dec 15 05:09:51 localhost ceph-mon[298913]: mon.np0005559462@0(leader).osd e220 do_prune osdmap full prune enabled Dec 15 05:09:51 localhost ceph-mon[298913]: mon.np0005559462@0(leader).osd e221 e221: 6 total, 6 up, 6 in Dec 15 05:09:51 localhost ceph-mon[298913]: log_channel(cluster) log [DBG] : osdmap e221: 6 total, 6 up, 6 in Dec 15 05:09:51 localhost ovn_metadata_agent[160585]: 2025-12-15 10:09:51.485 160590 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Dec 15 05:09:51 localhost ovn_metadata_agent[160585]: 2025-12-15 10:09:51.487 160590 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.003s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Dec 15 05:09:51 localhost podman[333328]: Dec 15 05:09:51 localhost ovn_metadata_agent[160585]: 2025-12-15 10:09:51.488 160590 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Dec 15 05:09:51 localhost podman[333328]: 2025-12-15 10:09:51.501551325 +0000 UTC m=+0.108461800 container create 
e44a658bb3d3b219e2a4367afab90c98f666ed71dfcbbd64fa6512b8782bca2d (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-542d072b-cbf8-4dc6-8dee-96fae1ee7f4a, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS) Dec 15 05:09:51 localhost systemd[1]: Started libpod-conmon-e44a658bb3d3b219e2a4367afab90c98f666ed71dfcbbd64fa6512b8782bca2d.scope. Dec 15 05:09:51 localhost podman[333328]: 2025-12-15 10:09:51.445662606 +0000 UTC m=+0.052573131 image pull quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified Dec 15 05:09:51 localhost systemd[1]: tmp-crun.nQJWh5.mount: Deactivated successfully. Dec 15 05:09:51 localhost systemd[1]: Started libcrun container. 
Dec 15 05:09:51 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/e8666602780204ce69d0a39d497dbd12fd9eb8f7be515d65915269ab156098d8/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff) Dec 15 05:09:51 localhost podman[333328]: 2025-12-15 10:09:51.591694236 +0000 UTC m=+0.198604711 container init e44a658bb3d3b219e2a4367afab90c98f666ed71dfcbbd64fa6512b8782bca2d (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-542d072b-cbf8-4dc6-8dee-96fae1ee7f4a, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202) Dec 15 05:09:51 localhost podman[333328]: 2025-12-15 10:09:51.601964826 +0000 UTC m=+0.208875311 container start e44a658bb3d3b219e2a4367afab90c98f666ed71dfcbbd64fa6512b8782bca2d (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-542d072b-cbf8-4dc6-8dee-96fae1ee7f4a, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3) Dec 15 05:09:51 localhost dnsmasq[333346]: started, version 2.85 cachesize 150 Dec 15 05:09:51 localhost dnsmasq[333346]: DNS service limited to local subnets Dec 15 05:09:51 localhost dnsmasq[333346]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile Dec 15 05:09:51 localhost dnsmasq[333346]: warning: no upstream servers 
configured Dec 15 05:09:51 localhost dnsmasq-dhcp[333346]: DHCPv6, static leases only on 2001:db8::, lease time 1d Dec 15 05:09:51 localhost dnsmasq[333346]: read /var/lib/neutron/dhcp/542d072b-cbf8-4dc6-8dee-96fae1ee7f4a/addn_hosts - 0 addresses Dec 15 05:09:51 localhost dnsmasq-dhcp[333346]: read /var/lib/neutron/dhcp/542d072b-cbf8-4dc6-8dee-96fae1ee7f4a/host Dec 15 05:09:51 localhost dnsmasq-dhcp[333346]: read /var/lib/neutron/dhcp/542d072b-cbf8-4dc6-8dee-96fae1ee7f4a/opts Dec 15 05:09:51 localhost neutron_dhcp_agent[267542]: 2025-12-15 10:09:51.733 267546 INFO neutron.agent.dhcp.agent [None req-8e01893b-d151-4e46-ac75-2c322237a590 - - - - - -] DHCP configuration for ports {'20280407-37fe-4b14-ac1a-a6711fdfc3fe'} is completed#033[00m Dec 15 05:09:51 localhost ceph-mon[298913]: mon.np0005559462@0(leader) e17 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) Dec 15 05:09:51 localhost ceph-mon[298913]: log_channel(audit) log [DBG] : from='client.25625 172.18.0.34:0/382777224' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch Dec 15 05:09:52 localhost neutron_dhcp_agent[267542]: 2025-12-15 10:09:52.789 267546 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-12-15T10:09:52Z, description=, device_id=cf2ed6eb-84ee-4b6b-acbe-6301f277c61c, device_owner=network:router_interface, dns_assignment=[], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[], id=ee90037f-c1fa-43e9-89b8-79d4373c8202, ip_allocation=immediate, mac_address=fa:16:3e:96:29:8c, name=, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-12-15T10:09:46Z, description=, dns_domain=, id=542d072b-cbf8-4dc6-8dee-96fae1ee7f4a, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, 
name=tempest-RoutersIpV6Test-615202783, port_security_enabled=True, project_id=68b3221c6efd4aac8016cd49e82c26c8, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=32741, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=3172, status=ACTIVE, subnets=['eb22db21-c47a-4fd2-b62d-823fa5b5c5ab'], tags=[], tenant_id=68b3221c6efd4aac8016cd49e82c26c8, updated_at=2025-12-15T10:09:49Z, vlan_transparent=None, network_id=542d072b-cbf8-4dc6-8dee-96fae1ee7f4a, port_security_enabled=False, project_id=68b3221c6efd4aac8016cd49e82c26c8, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=[], standard_attr_id=3200, status=DOWN, tags=[], tenant_id=68b3221c6efd4aac8016cd49e82c26c8, updated_at=2025-12-15T10:09:52Z on network 542d072b-cbf8-4dc6-8dee-96fae1ee7f4a#033[00m Dec 15 05:09:52 localhost dnsmasq[333346]: read /var/lib/neutron/dhcp/542d072b-cbf8-4dc6-8dee-96fae1ee7f4a/addn_hosts - 1 addresses Dec 15 05:09:52 localhost systemd[1]: tmp-crun.9MT74t.mount: Deactivated successfully. 
Dec 15 05:09:52 localhost dnsmasq-dhcp[333346]: read /var/lib/neutron/dhcp/542d072b-cbf8-4dc6-8dee-96fae1ee7f4a/host Dec 15 05:09:52 localhost dnsmasq-dhcp[333346]: read /var/lib/neutron/dhcp/542d072b-cbf8-4dc6-8dee-96fae1ee7f4a/opts Dec 15 05:09:52 localhost podman[333366]: 2025-12-15 10:09:52.991271212 +0000 UTC m=+0.070617951 container kill e44a658bb3d3b219e2a4367afab90c98f666ed71dfcbbd64fa6512b8782bca2d (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-542d072b-cbf8-4dc6-8dee-96fae1ee7f4a, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3) Dec 15 05:09:53 localhost neutron_dhcp_agent[267542]: 2025-12-15 10:09:53.291 267546 INFO neutron.agent.dhcp.agent [None req-ba0555d4-2969-4917-aaa7-e10c12a55b95 - - - - - -] DHCP configuration for ports {'ee90037f-c1fa-43e9-89b8-79d4373c8202'} is completed#033[00m Dec 15 05:09:53 localhost neutron_dhcp_agent[267542]: 2025-12-15 10:09:53.467 267546 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-12-15T10:09:52Z, description=, device_id=cf2ed6eb-84ee-4b6b-acbe-6301f277c61c, device_owner=network:router_interface, dns_assignment=[], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[], id=ee90037f-c1fa-43e9-89b8-79d4373c8202, ip_allocation=immediate, mac_address=fa:16:3e:96:29:8c, name=, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-12-15T10:09:46Z, description=, dns_domain=, id=542d072b-cbf8-4dc6-8dee-96fae1ee7f4a, ipv4_address_scope=None, 
ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-RoutersIpV6Test-615202783, port_security_enabled=True, project_id=68b3221c6efd4aac8016cd49e82c26c8, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=32741, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=3172, status=ACTIVE, subnets=['eb22db21-c47a-4fd2-b62d-823fa5b5c5ab'], tags=[], tenant_id=68b3221c6efd4aac8016cd49e82c26c8, updated_at=2025-12-15T10:09:49Z, vlan_transparent=None, network_id=542d072b-cbf8-4dc6-8dee-96fae1ee7f4a, port_security_enabled=False, project_id=68b3221c6efd4aac8016cd49e82c26c8, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=[], standard_attr_id=3200, status=DOWN, tags=[], tenant_id=68b3221c6efd4aac8016cd49e82c26c8, updated_at=2025-12-15T10:09:52Z on network 542d072b-cbf8-4dc6-8dee-96fae1ee7f4a#033[00m Dec 15 05:09:53 localhost systemd[1]: Started /usr/bin/podman healthcheck run 4d46c406a3855fd1c322e5d86c048188ea5865daa07ba23aaebacc7879668923. 
Dec 15 05:09:53 localhost dnsmasq[333346]: read /var/lib/neutron/dhcp/542d072b-cbf8-4dc6-8dee-96fae1ee7f4a/addn_hosts - 1 addresses Dec 15 05:09:53 localhost dnsmasq-dhcp[333346]: read /var/lib/neutron/dhcp/542d072b-cbf8-4dc6-8dee-96fae1ee7f4a/host Dec 15 05:09:53 localhost dnsmasq-dhcp[333346]: read /var/lib/neutron/dhcp/542d072b-cbf8-4dc6-8dee-96fae1ee7f4a/opts Dec 15 05:09:53 localhost podman[333406]: 2025-12-15 10:09:53.660432037 +0000 UTC m=+0.060347942 container kill e44a658bb3d3b219e2a4367afab90c98f666ed71dfcbbd64fa6512b8782bca2d (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-542d072b-cbf8-4dc6-8dee-96fae1ee7f4a, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true) Dec 15 05:09:53 localhost podman[333420]: 2025-12-15 10:09:53.746844016 +0000 UTC m=+0.075601947 container health_status 4d46c406a3855fd1c322e5d86c048188ea5865daa07ba23aaebacc7879668923 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '07f3af15d79276418f54a8dde2fc251064527c3571430e14af77aaa2c01f1193-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 
'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image) Dec 15 05:09:53 localhost podman[333420]: 2025-12-15 10:09:53.751907644 +0000 UTC m=+0.080665575 container exec_died 4d46c406a3855fd1c322e5d86c048188ea5865daa07ba23aaebacc7879668923 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '07f3af15d79276418f54a8dde2fc251064527c3571430e14af77aaa2c01f1193-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 
'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, config_id=ovn_metadata_agent) Dec 15 05:09:53 localhost systemd[1]: 4d46c406a3855fd1c322e5d86c048188ea5865daa07ba23aaebacc7879668923.service: Deactivated successfully. Dec 15 05:09:53 localhost nova_compute[286344]: 2025-12-15 10:09:53.918 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:09:53 localhost neutron_dhcp_agent[267542]: 2025-12-15 10:09:53.936 267546 INFO neutron.agent.dhcp.agent [None req-02d4db2e-6aeb-4159-94a8-63d5b0b08ea4 - - - - - -] DHCP configuration for ports {'ee90037f-c1fa-43e9-89b8-79d4373c8202'} is completed#033[00m Dec 15 05:09:55 localhost ceph-mon[298913]: mon.np0005559462@0(leader) e17 handle_command mon_command({"prefix":"df", "format":"json"} v 0) Dec 15 05:09:55 localhost ceph-mon[298913]: log_channel(audit) log [DBG] : from='client.? 
172.18.0.32:0/2699307333' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch Dec 15 05:09:55 localhost ceph-mon[298913]: mon.np0005559462@0(leader) e17 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0) Dec 15 05:09:55 localhost ceph-mon[298913]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/2699307333' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch Dec 15 05:09:56 localhost nova_compute[286344]: 2025-12-15 10:09:56.225 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:09:56 localhost ceph-mon[298913]: mon.np0005559462@0(leader).osd e221 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104 Dec 15 05:09:56 localhost ceph-mon[298913]: mon.np0005559462@0(leader).osd e221 do_prune osdmap full prune enabled Dec 15 05:09:56 localhost ceph-mon[298913]: mon.np0005559462@0(leader).osd e222 e222: 6 total, 6 up, 6 in Dec 15 05:09:56 localhost ceph-mon[298913]: log_channel(cluster) log [DBG] : osdmap e222: 6 total, 6 up, 6 in Dec 15 05:09:58 localhost dnsmasq[333346]: read /var/lib/neutron/dhcp/542d072b-cbf8-4dc6-8dee-96fae1ee7f4a/addn_hosts - 0 addresses Dec 15 05:09:58 localhost podman[333463]: 2025-12-15 10:09:58.878574161 +0000 UTC m=+0.059076317 container kill e44a658bb3d3b219e2a4367afab90c98f666ed71dfcbbd64fa6512b8782bca2d (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-542d072b-cbf8-4dc6-8dee-96fae1ee7f4a, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, 
io.buildah.version=1.41.3) Dec 15 05:09:58 localhost dnsmasq-dhcp[333346]: read /var/lib/neutron/dhcp/542d072b-cbf8-4dc6-8dee-96fae1ee7f4a/host Dec 15 05:09:58 localhost dnsmasq-dhcp[333346]: read /var/lib/neutron/dhcp/542d072b-cbf8-4dc6-8dee-96fae1ee7f4a/opts Dec 15 05:09:58 localhost nova_compute[286344]: 2025-12-15 10:09:58.922 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:09:59 localhost ovn_controller[154603]: 2025-12-15T10:09:59Z|00474|binding|INFO|Releasing lport f3ac7d66-832e-4ea6-ba95-02be907cd4ba from this chassis (sb_readonly=0) Dec 15 05:09:59 localhost kernel: device tapf3ac7d66-83 left promiscuous mode Dec 15 05:09:59 localhost nova_compute[286344]: 2025-12-15 10:09:59.077 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:09:59 localhost ovn_controller[154603]: 2025-12-15T10:09:59Z|00475|binding|INFO|Setting lport f3ac7d66-832e-4ea6-ba95-02be907cd4ba down in Southbound Dec 15 05:09:59 localhost ovn_metadata_agent[160585]: 2025-12-15 10:09:59.085 160590 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005559462.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8::2/64', 'neutron:device_id': 'dhcpce287b4b-a043-5ed9-8e08-4fd555d768a2-542d072b-cbf8-4dc6-8dee-96fae1ee7f4a', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-542d072b-cbf8-4dc6-8dee-96fae1ee7f4a', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '68b3221c6efd4aac8016cd49e82c26c8', 
'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'np0005559462.localdomain'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=cb4fc024-9871-4c39-ae3b-4f805eccd069, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=f3ac7d66-832e-4ea6-ba95-02be907cd4ba) old=Port_Binding(up=[True], chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Dec 15 05:09:59 localhost ovn_metadata_agent[160585]: 2025-12-15 10:09:59.087 160590 INFO neutron.agent.ovn.metadata.agent [-] Port f3ac7d66-832e-4ea6-ba95-02be907cd4ba in datapath 542d072b-cbf8-4dc6-8dee-96fae1ee7f4a unbound from our chassis#033[00m Dec 15 05:09:59 localhost ovn_metadata_agent[160585]: 2025-12-15 10:09:59.089 160590 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network 542d072b-cbf8-4dc6-8dee-96fae1ee7f4a or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599#033[00m Dec 15 05:09:59 localhost ovn_metadata_agent[160585]: 2025-12-15 10:09:59.090 160858 DEBUG oslo.privsep.daemon [-] privsep: reply[936f8ce3-3fe5-40cb-a93e-d9b6da3cf436]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Dec 15 05:09:59 localhost nova_compute[286344]: 2025-12-15 10:09:59.096 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:09:59 localhost nova_compute[286344]: 2025-12-15 10:09:59.098 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:09:59 localhost systemd[1]: Started /usr/bin/podman healthcheck run 
a4e94bd7aaee6c174c601060fb5f34559019ccea93fc397263ee3735731cfc1e. Dec 15 05:09:59 localhost podman[333485]: 2025-12-15 10:09:59.755785783 +0000 UTC m=+0.081498117 container health_status a4e94bd7aaee6c174c601060fb5f34559019ccea93fc397263ee3735731cfc1e (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible) Dec 15 05:09:59 localhost podman[333485]: 2025-12-15 10:09:59.76739945 +0000 UTC m=+0.093111834 container exec_died a4e94bd7aaee6c174c601060fb5f34559019ccea93fc397263ee3735731cfc1e (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 
'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter) Dec 15 05:09:59 localhost systemd[1]: a4e94bd7aaee6c174c601060fb5f34559019ccea93fc397263ee3735731cfc1e.service: Deactivated successfully. Dec 15 05:10:00 localhost ceph-mon[298913]: log_channel(cluster) log [INF] : overall HEALTH_OK Dec 15 05:10:00 localhost ceph-mon[298913]: overall HEALTH_OK Dec 15 05:10:01 localhost nova_compute[286344]: 2025-12-15 10:10:01.228 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:10:01 localhost dnsmasq[333346]: exiting on receipt of SIGTERM Dec 15 05:10:01 localhost podman[333525]: 2025-12-15 10:10:01.388049246 +0000 UTC m=+0.056984490 container kill e44a658bb3d3b219e2a4367afab90c98f666ed71dfcbbd64fa6512b8782bca2d (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-542d072b-cbf8-4dc6-8dee-96fae1ee7f4a, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb) Dec 15 05:10:01 localhost systemd[1]: libpod-e44a658bb3d3b219e2a4367afab90c98f666ed71dfcbbd64fa6512b8782bca2d.scope: Deactivated successfully. 
Dec 15 05:10:01 localhost ceph-mon[298913]: mon.np0005559462@0(leader).osd e222 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104 Dec 15 05:10:01 localhost podman[333539]: 2025-12-15 10:10:01.464078523 +0000 UTC m=+0.057652128 container died e44a658bb3d3b219e2a4367afab90c98f666ed71dfcbbd64fa6512b8782bca2d (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-542d072b-cbf8-4dc6-8dee-96fae1ee7f4a, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS) Dec 15 05:10:01 localhost podman[333539]: 2025-12-15 10:10:01.497006539 +0000 UTC m=+0.090580104 container cleanup e44a658bb3d3b219e2a4367afab90c98f666ed71dfcbbd64fa6512b8782bca2d (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-542d072b-cbf8-4dc6-8dee-96fae1ee7f4a, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3) Dec 15 05:10:01 localhost systemd[1]: libpod-conmon-e44a658bb3d3b219e2a4367afab90c98f666ed71dfcbbd64fa6512b8782bca2d.scope: Deactivated successfully. 
Dec 15 05:10:01 localhost podman[333540]: 2025-12-15 10:10:01.541795936 +0000 UTC m=+0.130607213 container remove e44a658bb3d3b219e2a4367afab90c98f666ed71dfcbbd64fa6512b8782bca2d (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-542d072b-cbf8-4dc6-8dee-96fae1ee7f4a, tcib_managed=true, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb) Dec 15 05:10:01 localhost podman[243449]: time="2025-12-15T10:10:01Z" level=info msg="List containers: received `last` parameter - overwriting `limit`" Dec 15 05:10:01 localhost podman[243449]: @ - - [15/Dec/2025:10:10:01 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 156640 "" "Go-http-client/1.1" Dec 15 05:10:01 localhost podman[243449]: @ - - [15/Dec/2025:10:10:01 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 19258 "" "Go-http-client/1.1" Dec 15 05:10:02 localhost neutron_dhcp_agent[267542]: 2025-12-15 10:10:02.184 267546 INFO neutron.agent.dhcp.agent [None req-ba5d5269-419b-48e9-ad54-cd5fa982a1db - - - - - -] Network not present, action: clean_devices, action_kwargs: {}#033[00m Dec 15 05:10:02 localhost systemd[1]: var-lib-containers-storage-overlay-e8666602780204ce69d0a39d497dbd12fd9eb8f7be515d65915269ab156098d8-merged.mount: Deactivated successfully. Dec 15 05:10:02 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-e44a658bb3d3b219e2a4367afab90c98f666ed71dfcbbd64fa6512b8782bca2d-userdata-shm.mount: Deactivated successfully. Dec 15 05:10:02 localhost systemd[1]: run-netns-qdhcp\x2d542d072b\x2dcbf8\x2d4dc6\x2d8dee\x2d96fae1ee7f4a.mount: Deactivated successfully. 
Dec 15 05:10:02 localhost neutron_dhcp_agent[267542]: 2025-12-15 10:10:02.477 267546 INFO neutron.agent.dhcp.agent [-] Network not present, action: clean_devices, action_kwargs: {}#033[00m Dec 15 05:10:02 localhost ovn_controller[154603]: 2025-12-15T10:10:02Z|00476|binding|INFO|Releasing lport b35254ad-12eb-47bb-92be-44fefe0694f0 from this chassis (sb_readonly=0) Dec 15 05:10:02 localhost nova_compute[286344]: 2025-12-15 10:10:02.735 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:10:03 localhost nova_compute[286344]: 2025-12-15 10:10:03.924 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:10:04 localhost neutron_sriov_agent[260044]: 2025-12-15 10:10:04.608 2 INFO neutron.agent.securitygroups_rpc [None req-edd0a1b2-2d18-4cc3-a39c-4b9553f8254b 62016038ba7f4f3887d3aca00bf73ffb 68b3221c6efd4aac8016cd49e82c26c8 - - default default] Security group member updated ['5e55752b-a766-48fa-89d6-8fff269ecb70']#033[00m Dec 15 05:10:04 localhost openstack_network_exporter[246484]: ERROR 10:10:04 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server Dec 15 05:10:04 localhost openstack_network_exporter[246484]: ERROR 10:10:04 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath Dec 15 05:10:04 localhost openstack_network_exporter[246484]: Dec 15 05:10:04 localhost openstack_network_exporter[246484]: ERROR 10:10:04 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd Dec 15 05:10:04 localhost openstack_network_exporter[246484]: ERROR 10:10:04 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd Dec 15 05:10:04 localhost openstack_network_exporter[246484]: ERROR 10:10:04 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please 
specify an existing datapath Dec 15 05:10:04 localhost openstack_network_exporter[246484]: Dec 15 05:10:04 localhost neutron_dhcp_agent[267542]: 2025-12-15 10:10:04.998 267546 INFO neutron.agent.linux.ip_lib [None req-6a74f05d-5838-4aee-8031-7167d59fd5a8 - - - - - -] Device tap7aea0509-43 cannot be used as it has no MAC address#033[00m Dec 15 05:10:05 localhost nova_compute[286344]: 2025-12-15 10:10:05.019 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:10:05 localhost kernel: device tap7aea0509-43 entered promiscuous mode Dec 15 05:10:05 localhost NetworkManager[5963]: [1765793405.0256] manager: (tap7aea0509-43): new Generic device (/org/freedesktop/NetworkManager/Devices/74) Dec 15 05:10:05 localhost nova_compute[286344]: 2025-12-15 10:10:05.027 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:10:05 localhost ovn_controller[154603]: 2025-12-15T10:10:05Z|00477|binding|INFO|Claiming lport 7aea0509-43fe-4325-badf-31f59c07ea33 for this chassis. Dec 15 05:10:05 localhost ovn_controller[154603]: 2025-12-15T10:10:05Z|00478|binding|INFO|7aea0509-43fe-4325-badf-31f59c07ea33: Claiming unknown Dec 15 05:10:05 localhost systemd-udevd[333577]: Network interface NamePolicy= disabled on kernel command line. 
Dec 15 05:10:05 localhost ovn_metadata_agent[160585]: 2025-12-15 10:10:05.041 160590 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005559462.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8::2/64', 'neutron:device_id': 'dhcpce287b4b-a043-5ed9-8e08-4fd555d768a2-c4bf02e5-cf10-439d-ab95-325cf65ed771', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-c4bf02e5-cf10-439d-ab95-325cf65ed771', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '68b3221c6efd4aac8016cd49e82c26c8', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=515171f3-37cc-4566-814e-6c8ba1e57b63, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=7aea0509-43fe-4325-badf-31f59c07ea33) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Dec 15 05:10:05 localhost ovn_metadata_agent[160585]: 2025-12-15 10:10:05.043 160590 INFO neutron.agent.ovn.metadata.agent [-] Port 7aea0509-43fe-4325-badf-31f59c07ea33 in datapath c4bf02e5-cf10-439d-ab95-325cf65ed771 bound to our chassis#033[00m Dec 15 05:10:05 localhost ovn_metadata_agent[160585]: 2025-12-15 10:10:05.046 160590 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network c4bf02e5-cf10-439d-ab95-325cf65ed771 or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params 
/usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599#033[00m Dec 15 05:10:05 localhost ovn_metadata_agent[160585]: 2025-12-15 10:10:05.047 160858 DEBUG oslo.privsep.daemon [-] privsep: reply[0353c6b9-ac3a-44e9-92a0-0a268721f95f]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Dec 15 05:10:05 localhost journal[231322]: ethtool ioctl error on tap7aea0509-43: No such device Dec 15 05:10:05 localhost journal[231322]: ethtool ioctl error on tap7aea0509-43: No such device Dec 15 05:10:05 localhost nova_compute[286344]: 2025-12-15 10:10:05.060 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:10:05 localhost journal[231322]: ethtool ioctl error on tap7aea0509-43: No such device Dec 15 05:10:05 localhost ovn_controller[154603]: 2025-12-15T10:10:05Z|00479|binding|INFO|Setting lport 7aea0509-43fe-4325-badf-31f59c07ea33 ovn-installed in OVS Dec 15 05:10:05 localhost ovn_controller[154603]: 2025-12-15T10:10:05Z|00480|binding|INFO|Setting lport 7aea0509-43fe-4325-badf-31f59c07ea33 up in Southbound Dec 15 05:10:05 localhost nova_compute[286344]: 2025-12-15 10:10:05.065 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:10:05 localhost journal[231322]: ethtool ioctl error on tap7aea0509-43: No such device Dec 15 05:10:05 localhost journal[231322]: ethtool ioctl error on tap7aea0509-43: No such device Dec 15 05:10:05 localhost journal[231322]: ethtool ioctl error on tap7aea0509-43: No such device Dec 15 05:10:05 localhost journal[231322]: ethtool ioctl error on tap7aea0509-43: No such device Dec 15 05:10:05 localhost journal[231322]: ethtool ioctl error on tap7aea0509-43: No such device Dec 15 05:10:05 localhost nova_compute[286344]: 2025-12-15 10:10:05.092 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 
__log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:10:05 localhost nova_compute[286344]: 2025-12-15 10:10:05.114 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:10:05 localhost ceph-mon[298913]: mon.np0005559462@0(leader) e17 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) Dec 15 05:10:05 localhost ceph-mon[298913]: log_channel(audit) log [DBG] : from='client.25625 172.18.0.34:0/382777224' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch Dec 15 05:10:05 localhost podman[333648]: Dec 15 05:10:05 localhost podman[333648]: 2025-12-15 10:10:05.889765041 +0000 UTC m=+0.087195322 container create 19710a50b1edef829a7e2759d8a79badc0febb081f00697957146e8787e6ca49 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-c4bf02e5-cf10-439d-ab95-325cf65ed771, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.name=CentOS Stream 9 Base Image) Dec 15 05:10:05 localhost systemd[1]: Started libpod-conmon-19710a50b1edef829a7e2759d8a79badc0febb081f00697957146e8787e6ca49.scope. Dec 15 05:10:05 localhost podman[333648]: 2025-12-15 10:10:05.846256708 +0000 UTC m=+0.043686819 image pull quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified Dec 15 05:10:05 localhost systemd[1]: Started libcrun container. 
Dec 15 05:10:05 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/f82205c180bd6ce2fcd513a3171975c89b99a8c370313e0a023297c0922e6f94/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff) Dec 15 05:10:05 localhost podman[333648]: 2025-12-15 10:10:05.963033163 +0000 UTC m=+0.160463304 container init 19710a50b1edef829a7e2759d8a79badc0febb081f00697957146e8787e6ca49 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-c4bf02e5-cf10-439d-ab95-325cf65ed771, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0) Dec 15 05:10:05 localhost podman[333648]: 2025-12-15 10:10:05.972229703 +0000 UTC m=+0.169659814 container start 19710a50b1edef829a7e2759d8a79badc0febb081f00697957146e8787e6ca49 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-c4bf02e5-cf10-439d-ab95-325cf65ed771, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202) Dec 15 05:10:05 localhost dnsmasq[333667]: started, version 2.85 cachesize 150 Dec 15 05:10:05 localhost dnsmasq[333667]: DNS service limited to local subnets Dec 15 05:10:05 localhost dnsmasq[333667]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile Dec 15 05:10:05 localhost dnsmasq[333667]: warning: no upstream servers 
configured Dec 15 05:10:05 localhost dnsmasq-dhcp[333667]: DHCPv6, static leases only on 2001:db8::, lease time 1d Dec 15 05:10:05 localhost dnsmasq[333667]: read /var/lib/neutron/dhcp/c4bf02e5-cf10-439d-ab95-325cf65ed771/addn_hosts - 0 addresses Dec 15 05:10:05 localhost dnsmasq-dhcp[333667]: read /var/lib/neutron/dhcp/c4bf02e5-cf10-439d-ab95-325cf65ed771/host Dec 15 05:10:05 localhost dnsmasq-dhcp[333667]: read /var/lib/neutron/dhcp/c4bf02e5-cf10-439d-ab95-325cf65ed771/opts Dec 15 05:10:06 localhost neutron_dhcp_agent[267542]: 2025-12-15 10:10:06.035 267546 INFO neutron.agent.dhcp.agent [None req-6a74f05d-5838-4aee-8031-7167d59fd5a8 - - - - - -] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-12-15T10:10:04Z, description=, device_id=, device_owner=, dns_assignment=[], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[], id=2872d1c2-3f14-455a-a446-0760229a722c, ip_allocation=immediate, mac_address=fa:16:3e:d3:54:3a, name=tempest-RoutersIpV6Test-1673003888, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-12-15T10:10:02Z, description=, dns_domain=, id=c4bf02e5-cf10-439d-ab95-325cf65ed771, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-RoutersIpV6Test-812394422, port_security_enabled=True, project_id=68b3221c6efd4aac8016cd49e82c26c8, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=50159, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=3238, status=ACTIVE, subnets=['6e5f65f7-640e-40db-a015-4f3796fadcff'], tags=[], tenant_id=68b3221c6efd4aac8016cd49e82c26c8, updated_at=2025-12-15T10:10:03Z, vlan_transparent=None, network_id=c4bf02e5-cf10-439d-ab95-325cf65ed771, port_security_enabled=True, project_id=68b3221c6efd4aac8016cd49e82c26c8, 
qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=['5e55752b-a766-48fa-89d6-8fff269ecb70'], standard_attr_id=3255, status=DOWN, tags=[], tenant_id=68b3221c6efd4aac8016cd49e82c26c8, updated_at=2025-12-15T10:10:04Z on network c4bf02e5-cf10-439d-ab95-325cf65ed771#033[00m Dec 15 05:10:06 localhost neutron_dhcp_agent[267542]: 2025-12-15 10:10:06.164 267546 INFO neutron.agent.dhcp.agent [None req-9fea42de-8ea7-4421-b8f2-b14f32f6d5b8 - - - - - -] DHCP configuration for ports {'942b817b-4a2e-4717-a77a-3ae0e6bb4eda'} is completed#033[00m Dec 15 05:10:06 localhost nova_compute[286344]: 2025-12-15 10:10:06.230 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:10:06 localhost dnsmasq[333667]: read /var/lib/neutron/dhcp/c4bf02e5-cf10-439d-ab95-325cf65ed771/addn_hosts - 1 addresses Dec 15 05:10:06 localhost dnsmasq-dhcp[333667]: read /var/lib/neutron/dhcp/c4bf02e5-cf10-439d-ab95-325cf65ed771/host Dec 15 05:10:06 localhost dnsmasq-dhcp[333667]: read /var/lib/neutron/dhcp/c4bf02e5-cf10-439d-ab95-325cf65ed771/opts Dec 15 05:10:06 localhost podman[333685]: 2025-12-15 10:10:06.234114594 +0000 UTC m=+0.066440868 container kill 19710a50b1edef829a7e2759d8a79badc0febb081f00697957146e8787e6ca49 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-c4bf02e5-cf10-439d-ab95-325cf65ed771, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0) Dec 15 05:10:06 localhost ceph-mon[298913]: mon.np0005559462@0(leader).osd e222 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 
kv_alloc: 318767104 Dec 15 05:10:06 localhost neutron_dhcp_agent[267542]: 2025-12-15 10:10:06.700 267546 INFO neutron.agent.dhcp.agent [None req-842e830b-de0c-4d95-bb63-17f78d349f3c - - - - - -] DHCP configuration for ports {'2872d1c2-3f14-455a-a446-0760229a722c'} is completed#033[00m Dec 15 05:10:07 localhost ceph-mon[298913]: mon.np0005559462@0(leader) e17 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005559463.localdomain.devices.0}] v 0) Dec 15 05:10:07 localhost ceph-mon[298913]: log_channel(audit) log [INF] : from='mgr.34411 ' entity='mgr.np0005559464.aomnqe' Dec 15 05:10:07 localhost ceph-mon[298913]: mon.np0005559462@0(leader) e17 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005559462.localdomain.devices.0}] v 0) Dec 15 05:10:07 localhost ceph-mon[298913]: mon.np0005559462@0(leader) e17 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005559463.localdomain}] v 0) Dec 15 05:10:07 localhost ceph-mon[298913]: log_channel(audit) log [INF] : from='mgr.34411 ' entity='mgr.np0005559464.aomnqe' Dec 15 05:10:07 localhost ceph-mon[298913]: mon.np0005559462@0(leader) e17 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005559464.localdomain.devices.0}] v 0) Dec 15 05:10:07 localhost ceph-mon[298913]: log_channel(audit) log [INF] : from='mgr.34411 ' entity='mgr.np0005559464.aomnqe' Dec 15 05:10:07 localhost ceph-mon[298913]: mon.np0005559462@0(leader) e17 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005559462.localdomain}] v 0) Dec 15 05:10:07 localhost ceph-mon[298913]: log_channel(audit) log [INF] : from='mgr.34411 ' entity='mgr.np0005559464.aomnqe' Dec 15 05:10:07 localhost ceph-mon[298913]: log_channel(audit) log [INF] : from='mgr.34411 ' entity='mgr.np0005559464.aomnqe' Dec 15 05:10:07 localhost ceph-mon[298913]: mon.np0005559462@0(leader) e17 handle_command mon_command([{prefix=config-key set, 
key=mgr/cephadm/host.np0005559464.localdomain}] v 0) Dec 15 05:10:07 localhost ceph-mon[298913]: log_channel(audit) log [INF] : from='mgr.34411 ' entity='mgr.np0005559464.aomnqe' Dec 15 05:10:07 localhost ceph-mon[298913]: from='mgr.34411 ' entity='mgr.np0005559464.aomnqe' Dec 15 05:10:07 localhost ceph-mon[298913]: from='mgr.34411 ' entity='mgr.np0005559464.aomnqe' Dec 15 05:10:07 localhost ceph-mon[298913]: from='mgr.34411 ' entity='mgr.np0005559464.aomnqe' Dec 15 05:10:07 localhost ceph-mon[298913]: from='mgr.34411 ' entity='mgr.np0005559464.aomnqe' Dec 15 05:10:07 localhost ceph-mon[298913]: from='mgr.34411 ' entity='mgr.np0005559464.aomnqe' Dec 15 05:10:07 localhost ceph-mon[298913]: from='mgr.34411 ' entity='mgr.np0005559464.aomnqe' Dec 15 05:10:07 localhost neutron_dhcp_agent[267542]: 2025-12-15 10:10:07.481 267546 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-12-15T10:10:04Z, description=, device_id=65673dbe-f07b-432e-99b8-ab1fdc0babed, device_owner=network:router_interface, dns_assignment=[], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[], id=2872d1c2-3f14-455a-a446-0760229a722c, ip_allocation=immediate, mac_address=fa:16:3e:d3:54:3a, name=tempest-RoutersIpV6Test-1673003888, network_id=c4bf02e5-cf10-439d-ab95-325cf65ed771, port_security_enabled=True, project_id=68b3221c6efd4aac8016cd49e82c26c8, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=2, security_groups=['5e55752b-a766-48fa-89d6-8fff269ecb70'], standard_attr_id=3255, status=DOWN, tags=[], tenant_id=68b3221c6efd4aac8016cd49e82c26c8, updated_at=2025-12-15T10:10:05Z on network c4bf02e5-cf10-439d-ab95-325cf65ed771#033[00m Dec 15 05:10:07 localhost dnsmasq[333667]: read /var/lib/neutron/dhcp/c4bf02e5-cf10-439d-ab95-325cf65ed771/addn_hosts - 1 addresses Dec 15 
05:10:07 localhost dnsmasq-dhcp[333667]: read /var/lib/neutron/dhcp/c4bf02e5-cf10-439d-ab95-325cf65ed771/host Dec 15 05:10:07 localhost podman[333816]: 2025-12-15 10:10:07.625115086 +0000 UTC m=+0.046042653 container kill 19710a50b1edef829a7e2759d8a79badc0febb081f00697957146e8787e6ca49 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-c4bf02e5-cf10-439d-ab95-325cf65ed771, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_managed=true) Dec 15 05:10:07 localhost dnsmasq-dhcp[333667]: read /var/lib/neutron/dhcp/c4bf02e5-cf10-439d-ab95-325cf65ed771/opts Dec 15 05:10:07 localhost neutron_dhcp_agent[267542]: 2025-12-15 10:10:07.893 267546 INFO neutron.agent.dhcp.agent [None req-79f061ec-18ff-45cf-aef5-9ed16ab52188 - - - - - -] DHCP configuration for ports {'2872d1c2-3f14-455a-a446-0760229a722c'} is completed#033[00m Dec 15 05:10:08 localhost ceph-mon[298913]: mon.np0005559462@0(leader) e17 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/osd_remove_queue}] v 0) Dec 15 05:10:08 localhost ceph-mon[298913]: log_channel(audit) log [INF] : from='mgr.34411 ' entity='mgr.np0005559464.aomnqe' Dec 15 05:10:08 localhost neutron_sriov_agent[260044]: 2025-12-15 10:10:08.191 2 INFO neutron.agent.securitygroups_rpc [None req-cc37dabe-92ff-494e-9b75-23c67ba34bd5 62016038ba7f4f3887d3aca00bf73ffb 68b3221c6efd4aac8016cd49e82c26c8 - - default default] Security group member updated ['5e55752b-a766-48fa-89d6-8fff269ecb70']#033[00m Dec 15 05:10:08 localhost nova_compute[286344]: 2025-12-15 10:10:08.235 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 
05:10:08 localhost ovn_metadata_agent[160585]: 2025-12-15 10:10:08.237 160590 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=18, ssl=[], options={'arp_ns_explicit_output': 'true', 'fdb_removal_limit': '0', 'ignore_lsp_down': 'false', 'mac_binding_removal_limit': '0', 'mac_prefix': 'fe:17:e3', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': 'fe:55:2b:86:15:b5'}, ipsec=False) old=SB_Global(nb_cfg=17) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Dec 15 05:10:08 localhost ovn_metadata_agent[160585]: 2025-12-15 10:10:08.239 160590 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 7 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m Dec 15 05:10:08 localhost dnsmasq[333667]: read /var/lib/neutron/dhcp/c4bf02e5-cf10-439d-ab95-325cf65ed771/addn_hosts - 0 addresses Dec 15 05:10:08 localhost dnsmasq-dhcp[333667]: read /var/lib/neutron/dhcp/c4bf02e5-cf10-439d-ab95-325cf65ed771/host Dec 15 05:10:08 localhost dnsmasq-dhcp[333667]: read /var/lib/neutron/dhcp/c4bf02e5-cf10-439d-ab95-325cf65ed771/opts Dec 15 05:10:08 localhost podman[333906]: 2025-12-15 10:10:08.371863321 +0000 UTC m=+0.059282803 container kill 19710a50b1edef829a7e2759d8a79badc0febb081f00697957146e8787e6ca49 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-c4bf02e5-cf10-439d-ab95-325cf65ed771, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_managed=true) Dec 15 05:10:08 
localhost ceph-mon[298913]: from='mgr.34411 172.18.0.108:0/929118679' entity='mgr.np0005559464.aomnqe' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch Dec 15 05:10:08 localhost ceph-mon[298913]: from='mgr.34411 ' entity='mgr.np0005559464.aomnqe' Dec 15 05:10:08 localhost ovn_controller[154603]: 2025-12-15T10:10:08Z|00481|binding|INFO|Releasing lport 7aea0509-43fe-4325-badf-31f59c07ea33 from this chassis (sb_readonly=0) Dec 15 05:10:08 localhost kernel: device tap7aea0509-43 left promiscuous mode Dec 15 05:10:08 localhost ovn_controller[154603]: 2025-12-15T10:10:08Z|00482|binding|INFO|Setting lport 7aea0509-43fe-4325-badf-31f59c07ea33 down in Southbound Dec 15 05:10:08 localhost nova_compute[286344]: 2025-12-15 10:10:08.553 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:10:08 localhost ovn_metadata_agent[160585]: 2025-12-15 10:10:08.563 160590 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005559462.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8::2/64', 'neutron:device_id': 'dhcpce287b4b-a043-5ed9-8e08-4fd555d768a2-c4bf02e5-cf10-439d-ab95-325cf65ed771', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-c4bf02e5-cf10-439d-ab95-325cf65ed771', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '68b3221c6efd4aac8016cd49e82c26c8', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 
'np0005559462.localdomain'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=515171f3-37cc-4566-814e-6c8ba1e57b63, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=7aea0509-43fe-4325-badf-31f59c07ea33) old=Port_Binding(up=[True], chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Dec 15 05:10:08 localhost ovn_metadata_agent[160585]: 2025-12-15 10:10:08.566 160590 INFO neutron.agent.ovn.metadata.agent [-] Port 7aea0509-43fe-4325-badf-31f59c07ea33 in datapath c4bf02e5-cf10-439d-ab95-325cf65ed771 unbound from our chassis#033[00m Dec 15 05:10:08 localhost ovn_metadata_agent[160585]: 2025-12-15 10:10:08.567 160590 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network c4bf02e5-cf10-439d-ab95-325cf65ed771 or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599#033[00m Dec 15 05:10:08 localhost ovn_metadata_agent[160585]: 2025-12-15 10:10:08.568 160858 DEBUG oslo.privsep.daemon [-] privsep: reply[58a9ec8d-3ded-4ffc-8fbe-e4be0d87f7ab]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Dec 15 05:10:08 localhost nova_compute[286344]: 2025-12-15 10:10:08.572 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:10:08 localhost ceph-mon[298913]: mon.np0005559462@0(leader) e17 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) Dec 15 05:10:08 localhost ceph-mon[298913]: log_channel(audit) log [DBG] : from='client.25625 172.18.0.34:0/382777224' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch Dec 15 05:10:08 localhost nova_compute[286344]: 2025-12-15 10:10:08.927 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 
__log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:10:09 localhost dnsmasq[333667]: exiting on receipt of SIGTERM Dec 15 05:10:09 localhost podman[333945]: 2025-12-15 10:10:09.402918445 +0000 UTC m=+0.061127093 container kill 19710a50b1edef829a7e2759d8a79badc0febb081f00697957146e8787e6ca49 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-c4bf02e5-cf10-439d-ab95-325cf65ed771, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, io.buildah.version=1.41.3) Dec 15 05:10:09 localhost systemd[1]: libpod-19710a50b1edef829a7e2759d8a79badc0febb081f00697957146e8787e6ca49.scope: Deactivated successfully. Dec 15 05:10:09 localhost podman[333961]: 2025-12-15 10:10:09.475086698 +0000 UTC m=+0.047848212 container died 19710a50b1edef829a7e2759d8a79badc0febb081f00697957146e8787e6ca49 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-c4bf02e5-cf10-439d-ab95-325cf65ed771, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team) Dec 15 05:10:09 localhost systemd[1]: tmp-crun.ireuhX.mount: Deactivated successfully. Dec 15 05:10:09 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-19710a50b1edef829a7e2759d8a79badc0febb081f00697957146e8787e6ca49-userdata-shm.mount: Deactivated successfully. 
Dec 15 05:10:09 localhost podman[333961]: 2025-12-15 10:10:09.576078784 +0000 UTC m=+0.148840298 container remove 19710a50b1edef829a7e2759d8a79badc0febb081f00697957146e8787e6ca49 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-c4bf02e5-cf10-439d-ab95-325cf65ed771, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.license=GPLv2) Dec 15 05:10:09 localhost systemd[1]: libpod-conmon-19710a50b1edef829a7e2759d8a79badc0febb081f00697957146e8787e6ca49.scope: Deactivated successfully. Dec 15 05:10:09 localhost ceph-mon[298913]: mon.np0005559462@0(leader) e17 handle_command mon_command([{prefix=config-key set, key=mgr/progress/completed}] v 0) Dec 15 05:10:09 localhost ceph-mon[298913]: log_channel(audit) log [INF] : from='mgr.34411 ' entity='mgr.np0005559464.aomnqe' Dec 15 05:10:09 localhost neutron_dhcp_agent[267542]: 2025-12-15 10:10:09.887 267546 INFO neutron.agent.dhcp.agent [None req-c6013a44-297f-4bf0-8399-a6d8858cc261 - - - - - -] Network not present, action: clean_devices, action_kwargs: {}#033[00m Dec 15 05:10:09 localhost neutron_dhcp_agent[267542]: 2025-12-15 10:10:09.904 267546 INFO neutron.agent.dhcp.agent [-] Network not present, action: clean_devices, action_kwargs: {}#033[00m Dec 15 05:10:10 localhost neutron_dhcp_agent[267542]: 2025-12-15 10:10:10.160 267546 INFO neutron.agent.dhcp.agent [-] Network not present, action: clean_devices, action_kwargs: {}#033[00m Dec 15 05:10:10 localhost systemd[1]: var-lib-containers-storage-overlay-f82205c180bd6ce2fcd513a3171975c89b99a8c370313e0a023297c0922e6f94-merged.mount: Deactivated successfully. 
Dec 15 05:10:10 localhost systemd[1]: run-netns-qdhcp\x2dc4bf02e5\x2dcf10\x2d439d\x2dab95\x2d325cf65ed771.mount: Deactivated successfully. Dec 15 05:10:10 localhost ovn_controller[154603]: 2025-12-15T10:10:10Z|00483|binding|INFO|Releasing lport b35254ad-12eb-47bb-92be-44fefe0694f0 from this chassis (sb_readonly=0) Dec 15 05:10:10 localhost nova_compute[286344]: 2025-12-15 10:10:10.475 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:10:10 localhost ceph-mon[298913]: from='mgr.34411 ' entity='mgr.np0005559464.aomnqe' Dec 15 05:10:10 localhost ceph-mon[298913]: mon.np0005559462@0(leader) e17 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) Dec 15 05:10:10 localhost ceph-mon[298913]: log_channel(audit) log [DBG] : from='client.25625 172.18.0.34:0/382777224' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch Dec 15 05:10:11 localhost ceph-mon[298913]: mon.np0005559462@0(leader) e17 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) Dec 15 05:10:11 localhost ceph-mon[298913]: log_channel(audit) log [DBG] : from='client.25625 172.18.0.34:0/382777224' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch Dec 15 05:10:11 localhost nova_compute[286344]: 2025-12-15 10:10:11.232 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:10:11 localhost ceph-mon[298913]: mon.np0005559462@0(leader).osd e222 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104 Dec 15 05:10:11 localhost ceph-mon[298913]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #61. Immutable memtables: 0. 
Dec 15 05:10:11 localhost ceph-mon[298913]: rocksdb: (Original Log Time 2025/12/15-10:10:11.458572) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0 Dec 15 05:10:11 localhost ceph-mon[298913]: rocksdb: [db/flush_job.cc:856] [default] [JOB 35] Flushing memtable with next log file: 61 Dec 15 05:10:11 localhost ceph-mon[298913]: rocksdb: EVENT_LOG_v1 {"time_micros": 1765793411458634, "job": 35, "event": "flush_started", "num_memtables": 1, "num_entries": 2448, "num_deletes": 268, "total_data_size": 2365644, "memory_usage": 2417112, "flush_reason": "Manual Compaction"} Dec 15 05:10:11 localhost ceph-mon[298913]: rocksdb: [db/flush_job.cc:885] [default] [JOB 35] Level-0 flush table #62: started Dec 15 05:10:11 localhost ceph-mon[298913]: rocksdb: EVENT_LOG_v1 {"time_micros": 1765793411479417, "cf_name": "default", "job": 35, "event": "table_file_creation", "file_number": 62, "file_size": 2295756, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 32870, "largest_seqno": 35317, "table_properties": {"data_size": 2285084, "index_size": 6665, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 2949, "raw_key_size": 25265, "raw_average_key_size": 21, "raw_value_size": 2262349, "raw_average_value_size": 1936, "num_data_blocks": 283, "num_entries": 1168, "num_filter_entries": 1168, "num_deletions": 268, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; 
max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1765793284, "oldest_key_time": 1765793284, "file_creation_time": 1765793411, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "603b24af-e2be-4214-bc56-9e652eb4af3d", "db_session_id": "0OJRM9SCUA16EXV0VQZ2", "orig_file_number": 62, "seqno_to_time_mapping": "N/A"}} Dec 15 05:10:11 localhost ceph-mon[298913]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 35] Flush lasted 21007 microseconds, and 7033 cpu microseconds. Dec 15 05:10:11 localhost ceph-mon[298913]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed. Dec 15 05:10:11 localhost ceph-mon[298913]: rocksdb: (Original Log Time 2025/12/15-10:10:11.479575) [db/flush_job.cc:967] [default] [JOB 35] Level-0 flush table #62: 2295756 bytes OK Dec 15 05:10:11 localhost ceph-mon[298913]: rocksdb: (Original Log Time 2025/12/15-10:10:11.479607) [db/memtable_list.cc:519] [default] Level-0 commit table #62 started Dec 15 05:10:11 localhost ceph-mon[298913]: rocksdb: (Original Log Time 2025/12/15-10:10:11.482162) [db/memtable_list.cc:722] [default] Level-0 commit table #62: memtable #1 done Dec 15 05:10:11 localhost ceph-mon[298913]: rocksdb: (Original Log Time 2025/12/15-10:10:11.482186) EVENT_LOG_v1 {"time_micros": 1765793411482179, "job": 35, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0} Dec 15 05:10:11 localhost ceph-mon[298913]: rocksdb: (Original Log Time 2025/12/15-10:10:11.482209) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25 Dec 15 05:10:11 localhost ceph-mon[298913]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 35] Try to delete WAL files size 2354660, prev total WAL file 
size 2354660, number of live WAL files 2. Dec 15 05:10:11 localhost ceph-mon[298913]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005559462/store.db/000058.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000 Dec 15 05:10:11 localhost ceph-mon[298913]: rocksdb: (Original Log Time 2025/12/15-10:10:11.483048) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '6B760031353338' seq:72057594037927935, type:22 .. '6B760031373934' seq:0, type:0; will stop at (end) Dec 15 05:10:11 localhost ceph-mon[298913]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 36] Compacting 1@0 + 1@6 files to L6, score -1.00 Dec 15 05:10:11 localhost ceph-mon[298913]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 35 Base level 0, inputs: [62(2241KB)], [60(16MB)] Dec 15 05:10:11 localhost ceph-mon[298913]: rocksdb: EVENT_LOG_v1 {"time_micros": 1765793411483116, "job": 36, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [62], "files_L6": [60], "score": -1, "input_data_size": 19705318, "oldest_snapshot_seqno": -1} Dec 15 05:10:11 localhost ceph-mon[298913]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 36] Generated table #63: 13817 keys, 18631611 bytes, temperature: kUnknown Dec 15 05:10:11 localhost ceph-mon[298913]: rocksdb: EVENT_LOG_v1 {"time_micros": 1765793411615078, "cf_name": "default", "job": 36, "event": "table_file_creation", "file_number": 63, "file_size": 18631611, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 18551374, "index_size": 44616, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 34565, "raw_key_size": 370384, "raw_average_key_size": 26, "raw_value_size": 18314673, 
"raw_average_value_size": 1325, "num_data_blocks": 1675, "num_entries": 13817, "num_filter_entries": 13817, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1765792320, "oldest_key_time": 0, "file_creation_time": 1765793411, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "603b24af-e2be-4214-bc56-9e652eb4af3d", "db_session_id": "0OJRM9SCUA16EXV0VQZ2", "orig_file_number": 63, "seqno_to_time_mapping": "N/A"}} Dec 15 05:10:11 localhost ceph-mon[298913]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed. 
Dec 15 05:10:11 localhost ceph-mon[298913]: rocksdb: (Original Log Time 2025/12/15-10:10:11.615303) [db/compaction/compaction_job.cc:1663] [default] [JOB 36] Compacted 1@0 + 1@6 files to L6 => 18631611 bytes Dec 15 05:10:11 localhost ceph-mon[298913]: rocksdb: (Original Log Time 2025/12/15-10:10:11.616728) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 149.3 rd, 141.1 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(2.2, 16.6 +0.0 blob) out(17.8 +0.0 blob), read-write-amplify(16.7) write-amplify(8.1) OK, records in: 14374, records dropped: 557 output_compression: NoCompression Dec 15 05:10:11 localhost ceph-mon[298913]: rocksdb: (Original Log Time 2025/12/15-10:10:11.616743) EVENT_LOG_v1 {"time_micros": 1765793411616736, "job": 36, "event": "compaction_finished", "compaction_time_micros": 132027, "compaction_time_cpu_micros": 54955, "output_level": 6, "num_output_files": 1, "total_output_size": 18631611, "num_input_records": 14374, "num_output_records": 13817, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]} Dec 15 05:10:11 localhost ceph-mon[298913]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005559462/store.db/000062.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000 Dec 15 05:10:11 localhost ceph-mon[298913]: rocksdb: EVENT_LOG_v1 {"time_micros": 1765793411617011, "job": 36, "event": "table_file_deletion", "file_number": 62} Dec 15 05:10:11 localhost ceph-mon[298913]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005559462/store.db/000060.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000 Dec 15 05:10:11 localhost ceph-mon[298913]: rocksdb: EVENT_LOG_v1 {"time_micros": 1765793411619491, 
"job": 36, "event": "table_file_deletion", "file_number": 60} Dec 15 05:10:11 localhost ceph-mon[298913]: rocksdb: (Original Log Time 2025/12/15-10:10:11.482916) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting Dec 15 05:10:11 localhost ceph-mon[298913]: rocksdb: (Original Log Time 2025/12/15-10:10:11.619569) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting Dec 15 05:10:11 localhost ceph-mon[298913]: rocksdb: (Original Log Time 2025/12/15-10:10:11.619578) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting Dec 15 05:10:11 localhost ceph-mon[298913]: rocksdb: (Original Log Time 2025/12/15-10:10:11.619581) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting Dec 15 05:10:11 localhost ceph-mon[298913]: rocksdb: (Original Log Time 2025/12/15-10:10:11.619583) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting Dec 15 05:10:11 localhost ceph-mon[298913]: rocksdb: (Original Log Time 2025/12/15-10:10:11.619585) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting Dec 15 05:10:12 localhost ceph-mon[298913]: mon.np0005559462@0(leader).osd e222 do_prune osdmap full prune enabled Dec 15 05:10:12 localhost ceph-mon[298913]: mon.np0005559462@0(leader).osd e223 e223: 6 total, 6 up, 6 in Dec 15 05:10:12 localhost ceph-mon[298913]: log_channel(cluster) log [DBG] : osdmap e223: 6 total, 6 up, 6 in Dec 15 05:10:13 localhost neutron_dhcp_agent[267542]: 2025-12-15 10:10:13.104 267546 INFO neutron.agent.linux.ip_lib [None req-0a9bcde2-d104-455e-90d4-2466bd9b548a - - - - - -] Device tap8d188a62-16 cannot be used as it has no MAC address#033[00m Dec 15 05:10:13 localhost nova_compute[286344]: 2025-12-15 10:10:13.123 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:10:13 localhost kernel: device 
tap8d188a62-16 entered promiscuous mode Dec 15 05:10:13 localhost NetworkManager[5963]: [1765793413.1330] manager: (tap8d188a62-16): new Generic device (/org/freedesktop/NetworkManager/Devices/75) Dec 15 05:10:13 localhost nova_compute[286344]: 2025-12-15 10:10:13.133 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:10:13 localhost ovn_controller[154603]: 2025-12-15T10:10:13Z|00484|binding|INFO|Claiming lport 8d188a62-1614-434c-9ce5-49667185857d for this chassis. Dec 15 05:10:13 localhost ovn_controller[154603]: 2025-12-15T10:10:13Z|00485|binding|INFO|8d188a62-1614-434c-9ce5-49667185857d: Claiming unknown Dec 15 05:10:13 localhost systemd-udevd[333998]: Network interface NamePolicy= disabled on kernel command line. Dec 15 05:10:13 localhost ovn_metadata_agent[160585]: 2025-12-15 10:10:13.159 160590 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005559462.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.102.0.2/28', 'neutron:device_id': 'dhcpce287b4b-a043-5ed9-8e08-4fd555d768a2-4ec39a2f-452e-4f94-b8fd-ad944930604a', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-4ec39a2f-452e-4f94-b8fd-ad944930604a', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '37134118fff54995bf9c18df154a3cf8', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], 
datapath=154f67e3-e095-4616-9161-d4025ab1bffe, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=8d188a62-1614-434c-9ce5-49667185857d) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Dec 15 05:10:13 localhost ovn_metadata_agent[160585]: 2025-12-15 10:10:13.160 160590 INFO neutron.agent.ovn.metadata.agent [-] Port 8d188a62-1614-434c-9ce5-49667185857d in datapath 4ec39a2f-452e-4f94-b8fd-ad944930604a bound to our chassis#033[00m Dec 15 05:10:13 localhost ovn_metadata_agent[160585]: 2025-12-15 10:10:13.160 160590 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network 4ec39a2f-452e-4f94-b8fd-ad944930604a or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599#033[00m Dec 15 05:10:13 localhost ovn_metadata_agent[160585]: 2025-12-15 10:10:13.161 160858 DEBUG oslo.privsep.daemon [-] privsep: reply[d62f33a7-cadb-4a22-bac0-ebee0a423166]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Dec 15 05:10:13 localhost journal[231322]: ethtool ioctl error on tap8d188a62-16: No such device Dec 15 05:10:13 localhost journal[231322]: ethtool ioctl error on tap8d188a62-16: No such device Dec 15 05:10:13 localhost ovn_controller[154603]: 2025-12-15T10:10:13Z|00486|binding|INFO|Setting lport 8d188a62-1614-434c-9ce5-49667185857d ovn-installed in OVS Dec 15 05:10:13 localhost ovn_controller[154603]: 2025-12-15T10:10:13Z|00487|binding|INFO|Setting lport 8d188a62-1614-434c-9ce5-49667185857d up in Southbound Dec 15 05:10:13 localhost journal[231322]: ethtool ioctl error on tap8d188a62-16: No such device Dec 15 05:10:13 localhost nova_compute[286344]: 2025-12-15 10:10:13.176 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m 
Dec 15 05:10:13 localhost journal[231322]: ethtool ioctl error on tap8d188a62-16: No such device Dec 15 05:10:13 localhost journal[231322]: ethtool ioctl error on tap8d188a62-16: No such device Dec 15 05:10:13 localhost journal[231322]: ethtool ioctl error on tap8d188a62-16: No such device Dec 15 05:10:13 localhost journal[231322]: ethtool ioctl error on tap8d188a62-16: No such device Dec 15 05:10:13 localhost journal[231322]: ethtool ioctl error on tap8d188a62-16: No such device Dec 15 05:10:13 localhost nova_compute[286344]: 2025-12-15 10:10:13.210 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:10:13 localhost nova_compute[286344]: 2025-12-15 10:10:13.240 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:10:13 localhost nova_compute[286344]: 2025-12-15 10:10:13.928 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:10:14 localhost podman[334069]: Dec 15 05:10:14 localhost podman[334069]: 2025-12-15 10:10:14.190017271 +0000 UTC m=+0.087502890 container create 805ae500e536e52dbe2df070ab460abb67eb99dd45560710136ae839334b8030 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-4ec39a2f-452e-4f94-b8fd-ad944930604a, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team) Dec 15 05:10:14 localhost systemd[1]: Started libpod-conmon-805ae500e536e52dbe2df070ab460abb67eb99dd45560710136ae839334b8030.scope. 
Dec 15 05:10:14 localhost podman[334069]: 2025-12-15 10:10:14.146141187 +0000 UTC m=+0.043626856 image pull quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified Dec 15 05:10:14 localhost systemd[1]: tmp-crun.l2mazh.mount: Deactivated successfully. Dec 15 05:10:14 localhost systemd[1]: Started libcrun container. Dec 15 05:10:14 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/40ecc108e45c7a929ccc94b795068d8c66848872f36c114e33258147034c3517/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff) Dec 15 05:10:14 localhost podman[334069]: 2025-12-15 10:10:14.266713456 +0000 UTC m=+0.164199075 container init 805ae500e536e52dbe2df070ab460abb67eb99dd45560710136ae839334b8030 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-4ec39a2f-452e-4f94-b8fd-ad944930604a, io.buildah.version=1.41.3, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team) Dec 15 05:10:14 localhost podman[334069]: 2025-12-15 10:10:14.276336418 +0000 UTC m=+0.173822037 container start 805ae500e536e52dbe2df070ab460abb67eb99dd45560710136ae839334b8030 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-4ec39a2f-452e-4f94-b8fd-ad944930604a, tcib_managed=true, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3) Dec 15 05:10:14 localhost dnsmasq[334087]: started, version 2.85 cachesize 150 Dec 15 05:10:14 
localhost dnsmasq[334087]: DNS service limited to local subnets Dec 15 05:10:14 localhost dnsmasq[334087]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile Dec 15 05:10:14 localhost dnsmasq[334087]: warning: no upstream servers configured Dec 15 05:10:14 localhost dnsmasq-dhcp[334087]: DHCP, static leases only on 10.102.0.0, lease time 1d Dec 15 05:10:14 localhost dnsmasq[334087]: read /var/lib/neutron/dhcp/4ec39a2f-452e-4f94-b8fd-ad944930604a/addn_hosts - 0 addresses Dec 15 05:10:14 localhost dnsmasq-dhcp[334087]: read /var/lib/neutron/dhcp/4ec39a2f-452e-4f94-b8fd-ad944930604a/host Dec 15 05:10:14 localhost dnsmasq-dhcp[334087]: read /var/lib/neutron/dhcp/4ec39a2f-452e-4f94-b8fd-ad944930604a/opts Dec 15 05:10:14 localhost neutron_dhcp_agent[267542]: 2025-12-15 10:10:14.372 267546 INFO neutron.agent.dhcp.agent [None req-c0a82e0c-edcd-45bb-96e4-81a00223e131 - - - - - -] DHCP configuration for ports {'d26e9f52-b629-49d2-85c0-01a65fbfec55'} is completed#033[00m Dec 15 05:10:15 localhost ovn_metadata_agent[160585]: 2025-12-15 10:10:15.240 160590 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=12d96d64-e862-4f68-81e5-8d9ec5d3a5e2, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '18'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m Dec 15 05:10:15 localhost neutron_dhcp_agent[267542]: 2025-12-15 10:10:15.254 267546 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-12-15T10:10:14Z, description=, device_id=37b02db1-841a-4b2f-9419-9b1716b5c675, device_owner=network:router_interface, dns_assignment=[], 
dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[], id=21ba44da-d216-4616-be20-0777a4119d07, ip_allocation=immediate, mac_address=fa:16:3e:84:14:fc, name=, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-12-15T10:10:10Z, description=, dns_domain=, id=4ec39a2f-452e-4f94-b8fd-ad944930604a, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-RoutersTest-1851369825, port_security_enabled=True, project_id=37134118fff54995bf9c18df154a3cf8, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=40013, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=3271, status=ACTIVE, subnets=['f65704e9-e5f0-4fc2-b2b8-733035613f5b'], tags=[], tenant_id=37134118fff54995bf9c18df154a3cf8, updated_at=2025-12-15T10:10:11Z, vlan_transparent=None, network_id=4ec39a2f-452e-4f94-b8fd-ad944930604a, port_security_enabled=False, project_id=37134118fff54995bf9c18df154a3cf8, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=[], standard_attr_id=3280, status=DOWN, tags=[], tenant_id=37134118fff54995bf9c18df154a3cf8, updated_at=2025-12-15T10:10:14Z on network 4ec39a2f-452e-4f94-b8fd-ad944930604a#033[00m Dec 15 05:10:15 localhost dnsmasq[334087]: read /var/lib/neutron/dhcp/4ec39a2f-452e-4f94-b8fd-ad944930604a/addn_hosts - 1 addresses Dec 15 05:10:15 localhost dnsmasq-dhcp[334087]: read /var/lib/neutron/dhcp/4ec39a2f-452e-4f94-b8fd-ad944930604a/host Dec 15 05:10:15 localhost dnsmasq-dhcp[334087]: read /var/lib/neutron/dhcp/4ec39a2f-452e-4f94-b8fd-ad944930604a/opts Dec 15 05:10:15 localhost podman[334105]: 2025-12-15 10:10:15.483882762 +0000 UTC m=+0.062822030 container kill 805ae500e536e52dbe2df070ab460abb67eb99dd45560710136ae839334b8030 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, 
name=neutron-dnsmasq-qdhcp-4ec39a2f-452e-4f94-b8fd-ad944930604a, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0) Dec 15 05:10:15 localhost neutron_dhcp_agent[267542]: 2025-12-15 10:10:15.732 267546 INFO neutron.agent.dhcp.agent [None req-38083364-5163-4c59-93f6-1bfd47614969 - - - - - -] DHCP configuration for ports {'21ba44da-d216-4616-be20-0777a4119d07'} is completed#033[00m Dec 15 05:10:15 localhost ceph-mon[298913]: mon.np0005559462@0(leader).osd e223 do_prune osdmap full prune enabled Dec 15 05:10:15 localhost ceph-mon[298913]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #64. Immutable memtables: 0. Dec 15 05:10:15 localhost ceph-mon[298913]: rocksdb: (Original Log Time 2025/12/15-10:10:15.801024) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0 Dec 15 05:10:15 localhost ceph-mon[298913]: rocksdb: [db/flush_job.cc:856] [default] [JOB 37] Flushing memtable with next log file: 64 Dec 15 05:10:15 localhost ceph-mon[298913]: rocksdb: EVENT_LOG_v1 {"time_micros": 1765793415801078, "job": 37, "event": "flush_started", "num_memtables": 1, "num_entries": 328, "num_deletes": 251, "total_data_size": 106260, "memory_usage": 113432, "flush_reason": "Manual Compaction"} Dec 15 05:10:15 localhost ceph-mon[298913]: rocksdb: [db/flush_job.cc:885] [default] [JOB 37] Level-0 flush table #65: started Dec 15 05:10:15 localhost ceph-mon[298913]: mon.np0005559462@0(leader).osd e224 e224: 6 total, 6 up, 6 in Dec 15 05:10:15 localhost ceph-mon[298913]: rocksdb: EVENT_LOG_v1 {"time_micros": 
1765793415804278, "cf_name": "default", "job": 37, "event": "table_file_creation", "file_number": 65, "file_size": 105028, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 35318, "largest_seqno": 35645, "table_properties": {"data_size": 102937, "index_size": 266, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 773, "raw_key_size": 5912, "raw_average_key_size": 19, "raw_value_size": 98574, "raw_average_value_size": 330, "num_data_blocks": 12, "num_entries": 298, "num_filter_entries": 298, "num_deletions": 251, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1765793411, "oldest_key_time": 1765793411, "file_creation_time": 1765793415, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "603b24af-e2be-4214-bc56-9e652eb4af3d", "db_session_id": "0OJRM9SCUA16EXV0VQZ2", "orig_file_number": 65, "seqno_to_time_mapping": "N/A"}} Dec 15 05:10:15 localhost ceph-mon[298913]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 37] Flush lasted 3300 microseconds, and 1069 cpu microseconds. Dec 15 05:10:15 localhost ceph-mon[298913]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed. 
Dec 15 05:10:15 localhost ceph-mon[298913]: rocksdb: (Original Log Time 2025/12/15-10:10:15.804322) [db/flush_job.cc:967] [default] [JOB 37] Level-0 flush table #65: 105028 bytes OK Dec 15 05:10:15 localhost ceph-mon[298913]: rocksdb: (Original Log Time 2025/12/15-10:10:15.804344) [db/memtable_list.cc:519] [default] Level-0 commit table #65 started Dec 15 05:10:15 localhost ceph-mon[298913]: rocksdb: (Original Log Time 2025/12/15-10:10:15.806509) [db/memtable_list.cc:722] [default] Level-0 commit table #65: memtable #1 done Dec 15 05:10:15 localhost ceph-mon[298913]: rocksdb: (Original Log Time 2025/12/15-10:10:15.806536) EVENT_LOG_v1 {"time_micros": 1765793415806528, "job": 37, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0} Dec 15 05:10:15 localhost ceph-mon[298913]: rocksdb: (Original Log Time 2025/12/15-10:10:15.806556) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25 Dec 15 05:10:15 localhost ceph-mon[298913]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 37] Try to delete WAL files size 103929, prev total WAL file size 103970, number of live WAL files 2. Dec 15 05:10:15 localhost ceph-mon[298913]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005559462/store.db/000061.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000 Dec 15 05:10:15 localhost ceph-mon[298913]: rocksdb: (Original Log Time 2025/12/15-10:10:15.807060) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '7061786F73003132353530' seq:72057594037927935, type:22 .. 
'7061786F73003132383032' seq:0, type:0; will stop at (end) Dec 15 05:10:15 localhost ceph-mon[298913]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 38] Compacting 1@0 + 1@6 files to L6, score -1.00 Dec 15 05:10:15 localhost ceph-mon[298913]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 37 Base level 0, inputs: [65(102KB)], [63(17MB)] Dec 15 05:10:15 localhost ceph-mon[298913]: rocksdb: EVENT_LOG_v1 {"time_micros": 1765793415807099, "job": 38, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [65], "files_L6": [63], "score": -1, "input_data_size": 18736639, "oldest_snapshot_seqno": -1} Dec 15 05:10:15 localhost ceph-mon[298913]: log_channel(cluster) log [DBG] : osdmap e224: 6 total, 6 up, 6 in Dec 15 05:10:15 localhost ceph-mon[298913]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 38] Generated table #66: 13601 keys, 17476765 bytes, temperature: kUnknown Dec 15 05:10:15 localhost ceph-mon[298913]: rocksdb: EVENT_LOG_v1 {"time_micros": 1765793415930755, "cf_name": "default", "job": 38, "event": "table_file_creation", "file_number": 66, "file_size": 17476765, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 17399505, "index_size": 42151, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 34053, "raw_key_size": 366432, "raw_average_key_size": 26, "raw_value_size": 17168085, "raw_average_value_size": 1262, "num_data_blocks": 1567, "num_entries": 13601, "num_filter_entries": 13601, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", 
"property_collectors": "[]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1765792320, "oldest_key_time": 0, "file_creation_time": 1765793415, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "603b24af-e2be-4214-bc56-9e652eb4af3d", "db_session_id": "0OJRM9SCUA16EXV0VQZ2", "orig_file_number": 66, "seqno_to_time_mapping": "N/A"}} Dec 15 05:10:15 localhost ceph-mon[298913]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed. Dec 15 05:10:15 localhost ceph-mon[298913]: rocksdb: (Original Log Time 2025/12/15-10:10:15.931189) [db/compaction/compaction_job.cc:1663] [default] [JOB 38] Compacted 1@0 + 1@6 files to L6 => 17476765 bytes Dec 15 05:10:15 localhost ceph-mon[298913]: rocksdb: (Original Log Time 2025/12/15-10:10:15.933574) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 151.4 rd, 141.2 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(0.1, 17.8 +0.0 blob) out(16.7 +0.0 blob), read-write-amplify(344.8) write-amplify(166.4) OK, records in: 14115, records dropped: 514 output_compression: NoCompression Dec 15 05:10:15 localhost ceph-mon[298913]: rocksdb: (Original Log Time 2025/12/15-10:10:15.933603) EVENT_LOG_v1 {"time_micros": 1765793415933591, "job": 38, "event": "compaction_finished", "compaction_time_micros": 123793, "compaction_time_cpu_micros": 49148, "output_level": 6, "num_output_files": 1, "total_output_size": 17476765, "num_input_records": 14115, "num_output_records": 13601, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]} Dec 15 
05:10:15 localhost ceph-mon[298913]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005559462/store.db/000065.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000 Dec 15 05:10:15 localhost ceph-mon[298913]: rocksdb: EVENT_LOG_v1 {"time_micros": 1765793415933792, "job": 38, "event": "table_file_deletion", "file_number": 65} Dec 15 05:10:15 localhost ceph-mon[298913]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005559462/store.db/000063.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000 Dec 15 05:10:15 localhost ceph-mon[298913]: rocksdb: EVENT_LOG_v1 {"time_micros": 1765793415936790, "job": 38, "event": "table_file_deletion", "file_number": 63} Dec 15 05:10:15 localhost ceph-mon[298913]: rocksdb: (Original Log Time 2025/12/15-10:10:15.806950) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting Dec 15 05:10:15 localhost ceph-mon[298913]: rocksdb: (Original Log Time 2025/12/15-10:10:15.936930) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting Dec 15 05:10:15 localhost ceph-mon[298913]: rocksdb: (Original Log Time 2025/12/15-10:10:15.936939) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting Dec 15 05:10:15 localhost ceph-mon[298913]: rocksdb: (Original Log Time 2025/12/15-10:10:15.936942) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting Dec 15 05:10:15 localhost ceph-mon[298913]: rocksdb: (Original Log Time 2025/12/15-10:10:15.936945) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting Dec 15 05:10:15 localhost ceph-mon[298913]: rocksdb: (Original Log Time 2025/12/15-10:10:15.936948) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting Dec 15 05:10:16 localhost nova_compute[286344]: 2025-12-15 10:10:16.235 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog 
[-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:10:16 localhost ceph-mon[298913]: mon.np0005559462@0(leader).osd e224 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104 Dec 15 05:10:16 localhost neutron_dhcp_agent[267542]: 2025-12-15 10:10:16.949 267546 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-12-15T10:10:14Z, description=, device_id=37b02db1-841a-4b2f-9419-9b1716b5c675, device_owner=network:router_interface, dns_assignment=[], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[], id=21ba44da-d216-4616-be20-0777a4119d07, ip_allocation=immediate, mac_address=fa:16:3e:84:14:fc, name=, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-12-15T10:10:10Z, description=, dns_domain=, id=4ec39a2f-452e-4f94-b8fd-ad944930604a, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-RoutersTest-1851369825, port_security_enabled=True, project_id=37134118fff54995bf9c18df154a3cf8, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=40013, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=3271, status=ACTIVE, subnets=['f65704e9-e5f0-4fc2-b2b8-733035613f5b'], tags=[], tenant_id=37134118fff54995bf9c18df154a3cf8, updated_at=2025-12-15T10:10:11Z, vlan_transparent=None, network_id=4ec39a2f-452e-4f94-b8fd-ad944930604a, port_security_enabled=False, project_id=37134118fff54995bf9c18df154a3cf8, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=[], standard_attr_id=3280, status=DOWN, tags=[], tenant_id=37134118fff54995bf9c18df154a3cf8, updated_at=2025-12-15T10:10:14Z 
on network 4ec39a2f-452e-4f94-b8fd-ad944930604a#033[00m Dec 15 05:10:17 localhost dnsmasq[334087]: read /var/lib/neutron/dhcp/4ec39a2f-452e-4f94-b8fd-ad944930604a/addn_hosts - 1 addresses Dec 15 05:10:17 localhost podman[334144]: 2025-12-15 10:10:17.173097352 +0000 UTC m=+0.060618929 container kill 805ae500e536e52dbe2df070ab460abb67eb99dd45560710136ae839334b8030 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-4ec39a2f-452e-4f94-b8fd-ad944930604a, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image) Dec 15 05:10:17 localhost dnsmasq-dhcp[334087]: read /var/lib/neutron/dhcp/4ec39a2f-452e-4f94-b8fd-ad944930604a/host Dec 15 05:10:17 localhost dnsmasq-dhcp[334087]: read /var/lib/neutron/dhcp/4ec39a2f-452e-4f94-b8fd-ad944930604a/opts Dec 15 05:10:17 localhost neutron_dhcp_agent[267542]: 2025-12-15 10:10:17.448 267546 INFO neutron.agent.dhcp.agent [None req-db92209f-7802-42a4-872f-1a6615859ddb - - - - - -] DHCP configuration for ports {'21ba44da-d216-4616-be20-0777a4119d07'} is completed#033[00m Dec 15 05:10:17 localhost nova_compute[286344]: 2025-12-15 10:10:17.637 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:10:17 localhost nova_compute[286344]: 2025-12-15 10:10:17.833 286348 DEBUG oslo_service.periodic_task [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Dec 15 05:10:17 localhost nova_compute[286344]: 2025-12-15 10:10:17.834 286348 DEBUG nova.compute.manager 
[None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m Dec 15 05:10:17 localhost nova_compute[286344]: 2025-12-15 10:10:17.834 286348 DEBUG nova.compute.manager [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m Dec 15 05:10:18 localhost nova_compute[286344]: 2025-12-15 10:10:18.489 286348 DEBUG oslo_concurrency.lockutils [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Acquiring lock "refresh_cache-39ff1bd9-6f6b-44c8-bbec-a1fd9d196359" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m Dec 15 05:10:18 localhost nova_compute[286344]: 2025-12-15 10:10:18.490 286348 DEBUG oslo_concurrency.lockutils [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Acquired lock "refresh_cache-39ff1bd9-6f6b-44c8-bbec-a1fd9d196359" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m Dec 15 05:10:18 localhost nova_compute[286344]: 2025-12-15 10:10:18.490 286348 DEBUG nova.network.neutron [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] [instance: 39ff1bd9-6f6b-44c8-bbec-a1fd9d196359] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m Dec 15 05:10:18 localhost nova_compute[286344]: 2025-12-15 10:10:18.491 286348 DEBUG nova.objects.instance [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Lazy-loading 'info_cache' on Instance uuid 39ff1bd9-6f6b-44c8-bbec-a1fd9d196359 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m Dec 15 05:10:18 localhost systemd[1]: Started /usr/bin/podman healthcheck run 67ee72fcd99c896f79e8807764b1d0f04e7d45927d06e306c0986394e8ed42a0. 
Dec 15 05:10:18 localhost systemd[1]: Started /usr/bin/podman healthcheck run 730c50020a7d9e2da21d2462d40af20ac303e43f3491f692cbbd87f432ba3e09. Dec 15 05:10:18 localhost systemd[1]: Started /usr/bin/podman healthcheck run 9f0f481c937606298203797a71924b7614a5d9e3f4a8afd837cfa5b9602aedeb. Dec 15 05:10:18 localhost systemd[1]: Started /usr/bin/podman healthcheck run ac2eae30a2a7f971398af42ea67b3ed799d4b3341f7688ebcd73f7ca7f66c2a8. Dec 15 05:10:18 localhost systemd[1]: Started /usr/bin/podman healthcheck run b3d8483ade539500e228c2af9c0c3e647febb5ec7903610a83aea32631007d4a. Dec 15 05:10:18 localhost podman[334165]: 2025-12-15 10:10:18.754728658 +0000 UTC m=+0.077429336 container health_status 67ee72fcd99c896f79e8807764b1d0f04e7d45927d06e306c0986394e8ed42a0 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, 
config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible) Dec 15 05:10:18 localhost podman[334166]: 2025-12-15 10:10:18.81510892 +0000 UTC m=+0.136159754 container health_status 730c50020a7d9e2da21d2462d40af20ac303e43f3491f692cbbd87f432ba3e09 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, url=https://catalog.redhat.com/en/search?searchType=containers, vendor=Red Hat, Inc., architecture=x86_64, distribution-scope=public, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'cb1e9478634a00c8fb5ad9cd7fcd38c7b2a1a85187dfaa618bc715c24c2b15fb'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, name=ubi9-minimal, build-date=2025-08-20T13:12:41, com.redhat.component=ubi9-minimal-container, release=1755695350, io.openshift.tags=minimal rhel9, io.openshift.expose-services=, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, maintainer=Red Hat, Inc., description=The Universal Base Image Minimal is a stripped down image that 
uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=9.6, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., config_id=openstack_network_exporter, container_name=openstack_network_exporter, managed_by=edpm_ansible, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly.) Dec 15 05:10:18 localhost podman[334166]: 2025-12-15 10:10:18.828223457 +0000 UTC m=+0.149274311 container exec_died 730c50020a7d9e2da21d2462d40af20ac303e43f3491f692cbbd87f432ba3e09 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., name=ubi9-minimal, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=openstack_network_exporter, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vendor=Red Hat, Inc., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, build-date=2025-08-20T13:12:41, version=9.6, io.openshift.tags=minimal rhel9, io.buildah.version=1.33.7, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, managed_by=edpm_ansible, vcs-type=git, maintainer=Red Hat, Inc., release=1755695350, architecture=x86_64, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'cb1e9478634a00c8fb5ad9cd7fcd38c7b2a1a85187dfaa618bc715c24c2b15fb'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, com.redhat.component=ubi9-minimal-container, 
url=https://catalog.redhat.com/en/search?searchType=containers, distribution-scope=public, container_name=openstack_network_exporter, io.openshift.expose-services=) Dec 15 05:10:18 localhost systemd[1]: 730c50020a7d9e2da21d2462d40af20ac303e43f3491f692cbbd87f432ba3e09.service: Deactivated successfully. Dec 15 05:10:18 localhost podman[334165]: 2025-12-15 10:10:18.890384687 +0000 UTC m=+0.213085455 container exec_died 67ee72fcd99c896f79e8807764b1d0f04e7d45927d06e306c0986394e8ed42a0 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter) Dec 15 05:10:18 localhost systemd[1]: 67ee72fcd99c896f79e8807764b1d0f04e7d45927d06e306c0986394e8ed42a0.service: Deactivated successfully. 
Dec 15 05:10:18 localhost podman[334170]: 2025-12-15 10:10:18.867060063 +0000 UTC m=+0.178340131 container health_status ac2eae30a2a7f971398af42ea67b3ed799d4b3341f7688ebcd73f7ca7f66c2a8 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '07f3af15d79276418f54a8dde2fc251064527c3571430e14af77aaa2c01f1193'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_controller, container_name=ovn_controller, managed_by=edpm_ansible) Dec 15 05:10:18 localhost nova_compute[286344]: 2025-12-15 10:10:18.931 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:10:18 localhost ceph-mon[298913]: mon.np0005559462@0(leader) e17 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) Dec 15 05:10:18 localhost ceph-mon[298913]: log_channel(audit) log [DBG] : 
from='client.25625 172.18.0.34:0/382777224' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch Dec 15 05:10:18 localhost podman[334167]: 2025-12-15 10:10:18.981920486 +0000 UTC m=+0.296113103 container health_status 9f0f481c937606298203797a71924b7614a5d9e3f4a8afd837cfa5b9602aedeb (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, container_name=multipathd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb) Dec 15 05:10:18 localhost 
podman[334167]: 2025-12-15 10:10:18.998424494 +0000 UTC m=+0.312617131 container exec_died 9f0f481c937606298203797a71924b7614a5d9e3f4a8afd837cfa5b9602aedeb (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, container_name=multipathd, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, managed_by=edpm_ansible, org.label-schema.build-date=20251202, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.license=GPLv2, tcib_managed=true) Dec 15 05:10:19 localhost podman[334179]: 2025-12-15 10:10:18.899375141 +0000 UTC m=+0.203800213 container health_status b3d8483ade539500e228c2af9c0c3e647febb5ec7903610a83aea32631007d4a 
(image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_id=ceilometer_agent_compute, io.buildah.version=1.41.3, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '07f3af15d79276418f54a8dde2fc251064527c3571430e14af77aaa2c01f1193-cb1e9478634a00c8fb5ad9cd7fcd38c7b2a1a85187dfaa618bc715c24c2b15fb'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.license=GPLv2, container_name=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.vendor=CentOS) Dec 15 05:10:19 localhost podman[334179]: 2025-12-15 10:10:19.033532729 +0000 UTC m=+0.337957841 container exec_died 
b3d8483ade539500e228c2af9c0c3e647febb5ec7903610a83aea32631007d4a (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, tcib_managed=true, config_id=ceilometer_agent_compute, managed_by=edpm_ansible, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '07f3af15d79276418f54a8dde2fc251064527c3571430e14af77aaa2c01f1193-cb1e9478634a00c8fb5ad9cd7fcd38c7b2a1a85187dfaa618bc715c24c2b15fb'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image) Dec 15 05:10:19 localhost systemd[1]: b3d8483ade539500e228c2af9c0c3e647febb5ec7903610a83aea32631007d4a.service: 
Deactivated successfully. Dec 15 05:10:19 localhost systemd[1]: 9f0f481c937606298203797a71924b7614a5d9e3f4a8afd837cfa5b9602aedeb.service: Deactivated successfully. Dec 15 05:10:19 localhost podman[334170]: 2025-12-15 10:10:19.102823963 +0000 UTC m=+0.414104021 container exec_died ac2eae30a2a7f971398af42ea67b3ed799d4b3341f7688ebcd73f7ca7f66c2a8 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '07f3af15d79276418f54a8dde2fc251064527c3571430e14af77aaa2c01f1193'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, org.label-schema.build-date=20251202, managed_by=edpm_ansible, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image) Dec 15 05:10:19 localhost systemd[1]: ac2eae30a2a7f971398af42ea67b3ed799d4b3341f7688ebcd73f7ca7f66c2a8.service: Deactivated successfully. Dec 15 05:10:19 localhost systemd[1]: tmp-crun.BDhkkt.mount: Deactivated successfully. 
Dec 15 05:10:20 localhost nova_compute[286344]: 2025-12-15 10:10:20.000 286348 DEBUG nova.network.neutron [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] [instance: 39ff1bd9-6f6b-44c8-bbec-a1fd9d196359] Updating instance_info_cache with network_info: [{"id": "03ef8889-3216-43fb-8a52-4be17a956ce1", "address": "fa:16:3e:74:df:7c", "network": {"id": "befb7a72-17a9-4bcb-b561-84b8f626685a", "bridge": "br-int", "label": "private", "subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.201", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.20", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.0.3"}}], "meta": {"injected": false, "tenant_id": "c785bf23f53946bc99867d8832a50266", "mtu": 1292, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap03ef8889-32", "ovs_interfaceid": "03ef8889-3216-43fb-8a52-4be17a956ce1", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m Dec 15 05:10:20 localhost nova_compute[286344]: 2025-12-15 10:10:20.019 286348 DEBUG oslo_concurrency.lockutils [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Releasing lock "refresh_cache-39ff1bd9-6f6b-44c8-bbec-a1fd9d196359" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m Dec 15 05:10:20 localhost nova_compute[286344]: 2025-12-15 10:10:20.019 286348 DEBUG nova.compute.manager [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] [instance: 39ff1bd9-6f6b-44c8-bbec-a1fd9d196359] 
Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m Dec 15 05:10:20 localhost nova_compute[286344]: 2025-12-15 10:10:20.019 286348 DEBUG oslo_service.periodic_task [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Dec 15 05:10:20 localhost nova_compute[286344]: 2025-12-15 10:10:20.020 286348 DEBUG oslo_service.periodic_task [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Dec 15 05:10:20 localhost nova_compute[286344]: 2025-12-15 10:10:20.020 286348 DEBUG oslo_service.periodic_task [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Dec 15 05:10:20 localhost nova_compute[286344]: 2025-12-15 10:10:20.271 286348 DEBUG oslo_service.periodic_task [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Dec 15 05:10:20 localhost nova_compute[286344]: 2025-12-15 10:10:20.272 286348 DEBUG oslo_service.periodic_task [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Dec 15 05:10:21 localhost nova_compute[286344]: 2025-12-15 10:10:21.272 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:10:21 
localhost ceph-mon[298913]: mon.np0005559462@0(leader).osd e224 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104 Dec 15 05:10:22 localhost neutron_dhcp_agent[267542]: 2025-12-15 10:10:22.022 267546 INFO neutron.agent.linux.ip_lib [None req-b12eae96-5d96-43f8-8830-b6ac85ba3381 - - - - - -] Device tap04e07e7a-a6 cannot be used as it has no MAC address#033[00m Dec 15 05:10:22 localhost nova_compute[286344]: 2025-12-15 10:10:22.045 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:10:22 localhost kernel: device tap04e07e7a-a6 entered promiscuous mode Dec 15 05:10:22 localhost NetworkManager[5963]: [1765793422.0537] manager: (tap04e07e7a-a6): new Generic device (/org/freedesktop/NetworkManager/Devices/76) Dec 15 05:10:22 localhost ovn_controller[154603]: 2025-12-15T10:10:22Z|00488|binding|INFO|Claiming lport 04e07e7a-a66c-4cf6-8f56-f9256f972214 for this chassis. Dec 15 05:10:22 localhost ovn_controller[154603]: 2025-12-15T10:10:22Z|00489|binding|INFO|04e07e7a-a66c-4cf6-8f56-f9256f972214: Claiming unknown Dec 15 05:10:22 localhost nova_compute[286344]: 2025-12-15 10:10:22.055 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:10:22 localhost systemd-udevd[334279]: Network interface NamePolicy= disabled on kernel command line. 
Dec 15 05:10:22 localhost ovn_metadata_agent[160585]: 2025-12-15 10:10:22.066 160590 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005559462.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.103.0.2/28', 'neutron:device_id': 'dhcpce287b4b-a043-5ed9-8e08-4fd555d768a2-64be92a7-1510-4089-81a2-9c244c43a61c', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-64be92a7-1510-4089-81a2-9c244c43a61c', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '37134118fff54995bf9c18df154a3cf8', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=2fffc778-9871-4ecf-88cf-046a04d0a8a3, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=04e07e7a-a66c-4cf6-8f56-f9256f972214) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Dec 15 05:10:22 localhost ovn_metadata_agent[160585]: 2025-12-15 10:10:22.069 160590 INFO neutron.agent.ovn.metadata.agent [-] Port 04e07e7a-a66c-4cf6-8f56-f9256f972214 in datapath 64be92a7-1510-4089-81a2-9c244c43a61c bound to our chassis#033[00m Dec 15 05:10:22 localhost ovn_metadata_agent[160585]: 2025-12-15 10:10:22.070 160590 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network 64be92a7-1510-4089-81a2-9c244c43a61c or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params 
/usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599#033[00m Dec 15 05:10:22 localhost ovn_metadata_agent[160585]: 2025-12-15 10:10:22.071 160858 DEBUG oslo.privsep.daemon [-] privsep: reply[f66d69fb-e183-4973-98e9-cbd83d698bba]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Dec 15 05:10:22 localhost journal[231322]: ethtool ioctl error on tap04e07e7a-a6: No such device Dec 15 05:10:22 localhost journal[231322]: ethtool ioctl error on tap04e07e7a-a6: No such device Dec 15 05:10:22 localhost ovn_controller[154603]: 2025-12-15T10:10:22Z|00490|binding|INFO|Setting lport 04e07e7a-a66c-4cf6-8f56-f9256f972214 ovn-installed in OVS Dec 15 05:10:22 localhost ovn_controller[154603]: 2025-12-15T10:10:22Z|00491|binding|INFO|Setting lport 04e07e7a-a66c-4cf6-8f56-f9256f972214 up in Southbound Dec 15 05:10:22 localhost nova_compute[286344]: 2025-12-15 10:10:22.099 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:10:22 localhost journal[231322]: ethtool ioctl error on tap04e07e7a-a6: No such device Dec 15 05:10:22 localhost nova_compute[286344]: 2025-12-15 10:10:22.101 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:10:22 localhost journal[231322]: ethtool ioctl error on tap04e07e7a-a6: No such device Dec 15 05:10:22 localhost journal[231322]: ethtool ioctl error on tap04e07e7a-a6: No such device Dec 15 05:10:22 localhost journal[231322]: ethtool ioctl error on tap04e07e7a-a6: No such device Dec 15 05:10:22 localhost journal[231322]: ethtool ioctl error on tap04e07e7a-a6: No such device Dec 15 05:10:22 localhost journal[231322]: ethtool ioctl error on tap04e07e7a-a6: No such device Dec 15 05:10:22 localhost nova_compute[286344]: 2025-12-15 10:10:22.138 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 
__log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:10:22 localhost nova_compute[286344]: 2025-12-15 10:10:22.165 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:10:22 localhost nova_compute[286344]: 2025-12-15 10:10:22.269 286348 DEBUG oslo_service.periodic_task [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Dec 15 05:10:22 localhost nova_compute[286344]: 2025-12-15 10:10:22.270 286348 DEBUG nova.compute.manager [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m Dec 15 05:10:23 localhost podman[334350]: Dec 15 05:10:23 localhost podman[334350]: 2025-12-15 10:10:23.070295621 +0000 UTC m=+0.093959106 container create 652f232af5784614df2f6c89f868238ad09ee4cebcad47bd93f03e26aaca4cb3 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-64be92a7-1510-4089-81a2-9c244c43a61c, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS) Dec 15 05:10:23 localhost systemd[1]: Started libpod-conmon-652f232af5784614df2f6c89f868238ad09ee4cebcad47bd93f03e26aaca4cb3.scope. 
Dec 15 05:10:23 localhost podman[334350]: 2025-12-15 10:10:23.021036911 +0000 UTC m=+0.044700436 image pull quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified Dec 15 05:10:23 localhost systemd[1]: tmp-crun.hRHnYf.mount: Deactivated successfully. Dec 15 05:10:23 localhost systemd[1]: Started libcrun container. Dec 15 05:10:23 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/abfe53ebb3cf4c63a07fe583f8a916c68515105bbca029e3896f120f4697ffc9/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff) Dec 15 05:10:23 localhost podman[334350]: 2025-12-15 10:10:23.169625822 +0000 UTC m=+0.193289307 container init 652f232af5784614df2f6c89f868238ad09ee4cebcad47bd93f03e26aaca4cb3 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-64be92a7-1510-4089-81a2-9c244c43a61c, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, org.label-schema.build-date=20251202) Dec 15 05:10:23 localhost podman[334350]: 2025-12-15 10:10:23.17837942 +0000 UTC m=+0.202042905 container start 652f232af5784614df2f6c89f868238ad09ee4cebcad47bd93f03e26aaca4cb3 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-64be92a7-1510-4089-81a2-9c244c43a61c, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202, tcib_managed=true, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team) Dec 15 05:10:23 localhost dnsmasq[334368]: started, version 2.85 cachesize 150 Dec 15 05:10:23 
localhost dnsmasq[334368]: DNS service limited to local subnets Dec 15 05:10:23 localhost dnsmasq[334368]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile Dec 15 05:10:23 localhost dnsmasq[334368]: warning: no upstream servers configured Dec 15 05:10:23 localhost dnsmasq-dhcp[334368]: DHCP, static leases only on 10.103.0.0, lease time 1d Dec 15 05:10:23 localhost dnsmasq[334368]: read /var/lib/neutron/dhcp/64be92a7-1510-4089-81a2-9c244c43a61c/addn_hosts - 0 addresses Dec 15 05:10:23 localhost dnsmasq-dhcp[334368]: read /var/lib/neutron/dhcp/64be92a7-1510-4089-81a2-9c244c43a61c/host Dec 15 05:10:23 localhost dnsmasq-dhcp[334368]: read /var/lib/neutron/dhcp/64be92a7-1510-4089-81a2-9c244c43a61c/opts Dec 15 05:10:23 localhost neutron_dhcp_agent[267542]: 2025-12-15 10:10:23.596 267546 INFO neutron.agent.dhcp.agent [None req-4f907232-2105-4800-bd4a-a74e8539f89e - - - - - -] DHCP configuration for ports {'934dc0ef-45db-45d1-a9d5-34fa2c03469e'} is completed#033[00m Dec 15 05:10:23 localhost systemd[1]: Started /usr/bin/podman healthcheck run 4d46c406a3855fd1c322e5d86c048188ea5865daa07ba23aaebacc7879668923. 
Dec 15 05:10:23 localhost nova_compute[286344]: 2025-12-15 10:10:23.961 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:10:24 localhost podman[334369]: 2025-12-15 10:10:24.026666455 +0000 UTC m=+0.107278208 container health_status 4d46c406a3855fd1c322e5d86c048188ea5865daa07ba23aaebacc7879668923 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '07f3af15d79276418f54a8dde2fc251064527c3571430e14af77aaa2c01f1193-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, 
tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3) Dec 15 05:10:24 localhost podman[334369]: 2025-12-15 10:10:24.056751313 +0000 UTC m=+0.137363006 container exec_died 4d46c406a3855fd1c322e5d86c048188ea5865daa07ba23aaebacc7879668923 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '07f3af15d79276418f54a8dde2fc251064527c3571430e14af77aaa2c01f1193-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent, 
org.label-schema.build-date=20251202, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, config_id=ovn_metadata_agent) Dec 15 05:10:24 localhost systemd[1]: 4d46c406a3855fd1c322e5d86c048188ea5865daa07ba23aaebacc7879668923.service: Deactivated successfully. Dec 15 05:10:24 localhost ceph-mon[298913]: mon.np0005559462@0(leader) e17 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) Dec 15 05:10:24 localhost ceph-mon[298913]: log_channel(audit) log [DBG] : from='client.25625 172.18.0.34:0/382777224' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch Dec 15 05:10:24 localhost nova_compute[286344]: 2025-12-15 10:10:24.270 286348 DEBUG oslo_service.periodic_task [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Dec 15 05:10:24 localhost nova_compute[286344]: 2025-12-15 10:10:24.295 286348 DEBUG oslo_concurrency.lockutils [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Dec 15 05:10:24 localhost nova_compute[286344]: 2025-12-15 10:10:24.295 286348 DEBUG oslo_concurrency.lockutils [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Dec 15 05:10:24 localhost nova_compute[286344]: 2025-12-15 10:10:24.295 286348 DEBUG oslo_concurrency.lockutils [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s 
inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Dec 15 05:10:24 localhost nova_compute[286344]: 2025-12-15 10:10:24.296 286348 DEBUG nova.compute.resource_tracker [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Auditing locally available compute resources for np0005559462.localdomain (node: np0005559462.localdomain) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m Dec 15 05:10:24 localhost nova_compute[286344]: 2025-12-15 10:10:24.296 286348 DEBUG oslo_concurrency.processutils [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m Dec 15 05:10:24 localhost neutron_dhcp_agent[267542]: 2025-12-15 10:10:24.469 267546 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-12-15T10:10:24Z, description=, device_id=37b02db1-841a-4b2f-9419-9b1716b5c675, device_owner=network:router_interface, dns_assignment=[], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[], id=7b71da57-6881-4812-ba93-d6f49cf4f7a8, ip_allocation=immediate, mac_address=fa:16:3e:2b:4c:8d, name=, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-12-15T10:10:17Z, description=, dns_domain=, id=64be92a7-1510-4089-81a2-9c244c43a61c, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-RoutersTest-969491392, port_security_enabled=True, project_id=37134118fff54995bf9c18df154a3cf8, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=2857, qos_policy_id=None, revision_number=2, router:external=False, shared=False, 
standard_attr_id=3281, status=ACTIVE, subnets=['445c99ee-033b-4608-81e8-aceb70ea3f6a'], tags=[], tenant_id=37134118fff54995bf9c18df154a3cf8, updated_at=2025-12-15T10:10:20Z, vlan_transparent=None, network_id=64be92a7-1510-4089-81a2-9c244c43a61c, port_security_enabled=False, project_id=37134118fff54995bf9c18df154a3cf8, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=[], standard_attr_id=3308, status=DOWN, tags=[], tenant_id=37134118fff54995bf9c18df154a3cf8, updated_at=2025-12-15T10:10:24Z on network 64be92a7-1510-4089-81a2-9c244c43a61c#033[00m Dec 15 05:10:24 localhost dnsmasq[334368]: read /var/lib/neutron/dhcp/64be92a7-1510-4089-81a2-9c244c43a61c/addn_hosts - 1 addresses Dec 15 05:10:24 localhost dnsmasq-dhcp[334368]: read /var/lib/neutron/dhcp/64be92a7-1510-4089-81a2-9c244c43a61c/host Dec 15 05:10:24 localhost podman[334425]: 2025-12-15 10:10:24.669607887 +0000 UTC m=+0.053679350 container kill 652f232af5784614df2f6c89f868238ad09ee4cebcad47bd93f03e26aaca4cb3 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-64be92a7-1510-4089-81a2-9c244c43a61c, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS) Dec 15 05:10:24 localhost dnsmasq-dhcp[334368]: read /var/lib/neutron/dhcp/64be92a7-1510-4089-81a2-9c244c43a61c/opts Dec 15 05:10:24 localhost ceph-mon[298913]: mon.np0005559462@0(leader) e17 handle_command mon_command({"prefix": "df", "format": "json"} v 0) Dec 15 05:10:24 localhost ceph-mon[298913]: log_channel(audit) log [DBG] : from='client.? 
172.18.0.106:0/136186641' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch Dec 15 05:10:24 localhost nova_compute[286344]: 2025-12-15 10:10:24.751 286348 DEBUG oslo_concurrency.processutils [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.455s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m Dec 15 05:10:24 localhost nova_compute[286344]: 2025-12-15 10:10:24.817 286348 DEBUG nova.virt.libvirt.driver [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m Dec 15 05:10:24 localhost nova_compute[286344]: 2025-12-15 10:10:24.818 286348 DEBUG nova.virt.libvirt.driver [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m Dec 15 05:10:24 localhost neutron_dhcp_agent[267542]: 2025-12-15 10:10:24.962 267546 INFO neutron.agent.dhcp.agent [None req-c2299e61-7e71-4cf6-82b4-e1a6684f9d8b - - - - - -] DHCP configuration for ports {'7b71da57-6881-4812-ba93-d6f49cf4f7a8'} is completed#033[00m Dec 15 05:10:25 localhost nova_compute[286344]: 2025-12-15 10:10:25.017 286348 WARNING nova.virt.libvirt.driver [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] This host appears to have multiple sockets per NUMA node. 
The `socket` PCI NUMA affinity will not be supported.#033[00m Dec 15 05:10:25 localhost nova_compute[286344]: 2025-12-15 10:10:25.018 286348 DEBUG nova.compute.resource_tracker [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Hypervisor/Node resource view: name=np0005559462.localdomain free_ram=11202MB free_disk=41.836978912353516GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", 
"product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m Dec 15 05:10:25 localhost nova_compute[286344]: 2025-12-15 10:10:25.018 286348 DEBUG oslo_concurrency.lockutils [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Dec 15 05:10:25 localhost nova_compute[286344]: 2025-12-15 10:10:25.019 286348 DEBUG oslo_concurrency.lockutils [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Dec 15 05:10:25 localhost nova_compute[286344]: 2025-12-15 10:10:25.333 286348 DEBUG nova.compute.resource_tracker [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Instance 39ff1bd9-6f6b-44c8-bbec-a1fd9d196359 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 2, 'MEMORY_MB': 512, 'VCPU': 1}}. 
_remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m Dec 15 05:10:25 localhost nova_compute[286344]: 2025-12-15 10:10:25.334 286348 DEBUG nova.compute.resource_tracker [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m Dec 15 05:10:25 localhost nova_compute[286344]: 2025-12-15 10:10:25.334 286348 DEBUG nova.compute.resource_tracker [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Final resource view: name=np0005559462.localdomain phys_ram=15738MB used_ram=1024MB phys_disk=41GB used_disk=2GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m Dec 15 05:10:25 localhost nova_compute[286344]: 2025-12-15 10:10:25.495 286348 DEBUG oslo_concurrency.processutils [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m Dec 15 05:10:25 localhost ceph-mon[298913]: mon.np0005559462@0(leader) e17 handle_command mon_command({"prefix": "df", "format": "json"} v 0) Dec 15 05:10:25 localhost ceph-mon[298913]: log_channel(audit) log [DBG] : from='client.? 
172.18.0.106:0/3099330790' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch Dec 15 05:10:25 localhost nova_compute[286344]: 2025-12-15 10:10:25.947 286348 DEBUG oslo_concurrency.processutils [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.453s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m Dec 15 05:10:25 localhost nova_compute[286344]: 2025-12-15 10:10:25.955 286348 DEBUG nova.compute.provider_tree [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Inventory has not changed in ProviderTree for provider: 26c8956b-6742-4951-b566-971b9bbe323b update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m Dec 15 05:10:26 localhost nova_compute[286344]: 2025-12-15 10:10:26.177 286348 DEBUG nova.scheduler.client.report [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Inventory has not changed for provider 26c8956b-6742-4951-b566-971b9bbe323b based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m Dec 15 05:10:26 localhost nova_compute[286344]: 2025-12-15 10:10:26.179 286348 DEBUG nova.compute.resource_tracker [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Compute_service record updated for np0005559462.localdomain:np0005559462.localdomain _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m Dec 15 05:10:26 localhost nova_compute[286344]: 2025-12-15 10:10:26.180 286348 DEBUG 
oslo_concurrency.lockutils [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 1.161s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Dec 15 05:10:26 localhost nova_compute[286344]: 2025-12-15 10:10:26.274 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:10:26 localhost ceph-mon[298913]: mon.np0005559462@0(leader).osd e224 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104 Dec 15 05:10:26 localhost ceph-mon[298913]: mon.np0005559462@0(leader).osd e224 do_prune osdmap full prune enabled Dec 15 05:10:26 localhost ceph-mon[298913]: mon.np0005559462@0(leader).osd e225 e225: 6 total, 6 up, 6 in Dec 15 05:10:26 localhost ceph-mon[298913]: log_channel(cluster) log [DBG] : osdmap e225: 6 total, 6 up, 6 in Dec 15 05:10:26 localhost neutron_dhcp_agent[267542]: 2025-12-15 10:10:26.642 267546 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-12-15T10:10:24Z, description=, device_id=37b02db1-841a-4b2f-9419-9b1716b5c675, device_owner=network:router_interface, dns_assignment=[], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[], id=7b71da57-6881-4812-ba93-d6f49cf4f7a8, ip_allocation=immediate, mac_address=fa:16:3e:2b:4c:8d, name=, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-12-15T10:10:17Z, description=, dns_domain=, id=64be92a7-1510-4089-81a2-9c244c43a61c, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-RoutersTest-969491392, port_security_enabled=True, 
project_id=37134118fff54995bf9c18df154a3cf8, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=2857, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=3281, status=ACTIVE, subnets=['445c99ee-033b-4608-81e8-aceb70ea3f6a'], tags=[], tenant_id=37134118fff54995bf9c18df154a3cf8, updated_at=2025-12-15T10:10:20Z, vlan_transparent=None, network_id=64be92a7-1510-4089-81a2-9c244c43a61c, port_security_enabled=False, project_id=37134118fff54995bf9c18df154a3cf8, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=[], standard_attr_id=3308, status=DOWN, tags=[], tenant_id=37134118fff54995bf9c18df154a3cf8, updated_at=2025-12-15T10:10:24Z on network 64be92a7-1510-4089-81a2-9c244c43a61c#033[00m Dec 15 05:10:26 localhost podman[334488]: 2025-12-15 10:10:26.880260336 +0000 UTC m=+0.057825473 container kill 652f232af5784614df2f6c89f868238ad09ee4cebcad47bd93f03e26aaca4cb3 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-64be92a7-1510-4089-81a2-9c244c43a61c, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.build-date=20251202, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image) Dec 15 05:10:26 localhost dnsmasq[334368]: read /var/lib/neutron/dhcp/64be92a7-1510-4089-81a2-9c244c43a61c/addn_hosts - 1 addresses Dec 15 05:10:26 localhost dnsmasq-dhcp[334368]: read /var/lib/neutron/dhcp/64be92a7-1510-4089-81a2-9c244c43a61c/host Dec 15 05:10:26 localhost dnsmasq-dhcp[334368]: read /var/lib/neutron/dhcp/64be92a7-1510-4089-81a2-9c244c43a61c/opts Dec 15 05:10:27 localhost neutron_dhcp_agent[267542]: 2025-12-15 10:10:27.131 267546 INFO neutron.agent.dhcp.agent [None 
req-1063e541-fd6b-4dd6-8162-b49b863d90bd - - - - - -] DHCP configuration for ports {'7b71da57-6881-4812-ba93-d6f49cf4f7a8'} is completed#033[00m Dec 15 05:10:27 localhost nova_compute[286344]: 2025-12-15 10:10:27.270 286348 DEBUG oslo_service.periodic_task [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Dec 15 05:10:27 localhost nova_compute[286344]: 2025-12-15 10:10:27.270 286348 DEBUG oslo_service.periodic_task [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Dec 15 05:10:27 localhost nova_compute[286344]: 2025-12-15 10:10:27.271 286348 DEBUG nova.compute.manager [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Cleaning up deleted instances with incomplete migration _cleanup_incomplete_migrations /usr/lib/python3.9/site-packages/nova/compute/manager.py:11183#033[00m Dec 15 05:10:27 localhost ceph-mon[298913]: mon.np0005559462@0(leader) e17 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) Dec 15 05:10:27 localhost ceph-mon[298913]: log_channel(audit) log [DBG] : from='client.25625 172.18.0.34:0/382777224' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch Dec 15 05:10:28 localhost nova_compute[286344]: 2025-12-15 10:10:28.290 286348 DEBUG oslo_service.periodic_task [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Dec 15 05:10:28 localhost nova_compute[286344]: 2025-12-15 10:10:28.291 286348 DEBUG nova.compute.manager [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Cleaning up deleted 
instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11145#033[00m Dec 15 05:10:28 localhost nova_compute[286344]: 2025-12-15 10:10:28.321 286348 DEBUG nova.compute.manager [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11154#033[00m Dec 15 05:10:28 localhost nova_compute[286344]: 2025-12-15 10:10:28.964 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:10:30 localhost ceph-mon[298913]: mon.np0005559462@0(leader) e17 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) Dec 15 05:10:30 localhost ceph-mon[298913]: log_channel(audit) log [DBG] : from='client.25625 172.18.0.34:0/382777224' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch Dec 15 05:10:30 localhost systemd[1]: Started /usr/bin/podman healthcheck run a4e94bd7aaee6c174c601060fb5f34559019ccea93fc397263ee3735731cfc1e. Dec 15 05:10:30 localhost systemd[1]: tmp-crun.eA32Cb.mount: Deactivated successfully. 
Dec 15 05:10:30 localhost podman[334508]: 2025-12-15 10:10:30.748798684 +0000 UTC m=+0.082970638 container health_status a4e94bd7aaee6c174c601060fb5f34559019ccea93fc397263ee3735731cfc1e (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}) Dec 15 05:10:30 localhost podman[334508]: 2025-12-15 10:10:30.760237365 +0000 UTC m=+0.094409309 container exec_died a4e94bd7aaee6c174c601060fb5f34559019ccea93fc397263ee3735731cfc1e (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': 
['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter) Dec 15 05:10:30 localhost systemd[1]: a4e94bd7aaee6c174c601060fb5f34559019ccea93fc397263ee3735731cfc1e.service: Deactivated successfully. Dec 15 05:10:31 localhost nova_compute[286344]: 2025-12-15 10:10:31.314 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:10:31 localhost ceph-mon[298913]: mon.np0005559462@0(leader).osd e225 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104 Dec 15 05:10:31 localhost podman[243449]: time="2025-12-15T10:10:31Z" level=info msg="List containers: received `last` parameter - overwriting `limit`" Dec 15 05:10:31 localhost podman[243449]: @ - - [15/Dec/2025:10:10:31 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 160288 "" "Go-http-client/1.1" Dec 15 05:10:31 localhost podman[243449]: @ - - [15/Dec/2025:10:10:31 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 20208 "" "Go-http-client/1.1" Dec 15 05:10:32 localhost dnsmasq[334368]: read /var/lib/neutron/dhcp/64be92a7-1510-4089-81a2-9c244c43a61c/addn_hosts - 0 addresses Dec 15 05:10:32 localhost dnsmasq-dhcp[334368]: read /var/lib/neutron/dhcp/64be92a7-1510-4089-81a2-9c244c43a61c/host Dec 15 05:10:32 localhost dnsmasq-dhcp[334368]: read /var/lib/neutron/dhcp/64be92a7-1510-4089-81a2-9c244c43a61c/opts Dec 15 05:10:32 localhost podman[334547]: 2025-12-15 10:10:32.563095305 +0000 UTC m=+0.072175663 container kill 652f232af5784614df2f6c89f868238ad09ee4cebcad47bd93f03e26aaca4cb3 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-64be92a7-1510-4089-81a2-9c244c43a61c, org.label-schema.name=CentOS Stream 9 Base 
Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.license=GPLv2) Dec 15 05:10:32 localhost systemd[1]: tmp-crun.E8yIAq.mount: Deactivated successfully. Dec 15 05:10:32 localhost kernel: device tap04e07e7a-a6 left promiscuous mode Dec 15 05:10:32 localhost ovn_controller[154603]: 2025-12-15T10:10:32Z|00492|binding|INFO|Releasing lport 04e07e7a-a66c-4cf6-8f56-f9256f972214 from this chassis (sb_readonly=0) Dec 15 05:10:32 localhost ovn_controller[154603]: 2025-12-15T10:10:32Z|00493|binding|INFO|Setting lport 04e07e7a-a66c-4cf6-8f56-f9256f972214 down in Southbound Dec 15 05:10:32 localhost nova_compute[286344]: 2025-12-15 10:10:32.790 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:10:32 localhost ovn_metadata_agent[160585]: 2025-12-15 10:10:32.807 160590 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005559462.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.103.0.2/28', 'neutron:device_id': 'dhcpce287b4b-a043-5ed9-8e08-4fd555d768a2-64be92a7-1510-4089-81a2-9c244c43a61c', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-64be92a7-1510-4089-81a2-9c244c43a61c', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '37134118fff54995bf9c18df154a3cf8', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': 
'', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'np0005559462.localdomain'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=2fffc778-9871-4ecf-88cf-046a04d0a8a3, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=04e07e7a-a66c-4cf6-8f56-f9256f972214) old=Port_Binding(up=[True], chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Dec 15 05:10:32 localhost ovn_metadata_agent[160585]: 2025-12-15 10:10:32.809 160590 INFO neutron.agent.ovn.metadata.agent [-] Port 04e07e7a-a66c-4cf6-8f56-f9256f972214 in datapath 64be92a7-1510-4089-81a2-9c244c43a61c unbound from our chassis#033[00m Dec 15 05:10:32 localhost ovn_metadata_agent[160585]: 2025-12-15 10:10:32.812 160590 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 64be92a7-1510-4089-81a2-9c244c43a61c, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m Dec 15 05:10:32 localhost ovn_metadata_agent[160585]: 2025-12-15 10:10:32.813 160858 DEBUG oslo.privsep.daemon [-] privsep: reply[73202414-810e-44d3-8f9b-ba07c4a1b64d]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Dec 15 05:10:32 localhost nova_compute[286344]: 2025-12-15 10:10:32.815 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:10:33 localhost dnsmasq[334368]: exiting on receipt of SIGTERM Dec 15 05:10:33 localhost podman[334584]: 2025-12-15 10:10:33.347696769 +0000 UTC m=+0.061863943 container kill 652f232af5784614df2f6c89f868238ad09ee4cebcad47bd93f03e26aaca4cb3 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-64be92a7-1510-4089-81a2-9c244c43a61c, org.label-schema.license=GPLv2, 
org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.build-date=20251202, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team) Dec 15 05:10:33 localhost systemd[1]: libpod-652f232af5784614df2f6c89f868238ad09ee4cebcad47bd93f03e26aaca4cb3.scope: Deactivated successfully. Dec 15 05:10:33 localhost podman[334598]: 2025-12-15 10:10:33.413550369 +0000 UTC m=+0.055248533 container died 652f232af5784614df2f6c89f868238ad09ee4cebcad47bd93f03e26aaca4cb3 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-64be92a7-1510-4089-81a2-9c244c43a61c, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2) Dec 15 05:10:33 localhost podman[334598]: 2025-12-15 10:10:33.448911621 +0000 UTC m=+0.090609755 container cleanup 652f232af5784614df2f6c89f868238ad09ee4cebcad47bd93f03e26aaca4cb3 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-64be92a7-1510-4089-81a2-9c244c43a61c, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb) Dec 15 05:10:33 localhost systemd[1]: libpod-conmon-652f232af5784614df2f6c89f868238ad09ee4cebcad47bd93f03e26aaca4cb3.scope: Deactivated successfully. 
Dec 15 05:10:33 localhost podman[334600]: 2025-12-15 10:10:33.492898236 +0000 UTC m=+0.125396379 container remove 652f232af5784614df2f6c89f868238ad09ee4cebcad47bd93f03e26aaca4cb3 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-64be92a7-1510-4089-81a2-9c244c43a61c, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0) Dec 15 05:10:33 localhost neutron_dhcp_agent[267542]: 2025-12-15 10:10:33.553 267546 INFO neutron.agent.dhcp.agent [-] Network not present, action: clean_devices, action_kwargs: {}#033[00m Dec 15 05:10:33 localhost systemd[1]: var-lib-containers-storage-overlay-abfe53ebb3cf4c63a07fe583f8a916c68515105bbca029e3896f120f4697ffc9-merged.mount: Deactivated successfully. Dec 15 05:10:33 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-652f232af5784614df2f6c89f868238ad09ee4cebcad47bd93f03e26aaca4cb3-userdata-shm.mount: Deactivated successfully. Dec 15 05:10:33 localhost systemd[1]: run-netns-qdhcp\x2d64be92a7\x2d1510\x2d4089\x2d81a2\x2d9c244c43a61c.mount: Deactivated successfully. 
Dec 15 05:10:33 localhost ceph-mon[298913]: mon.np0005559462@0(leader) e17 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) Dec 15 05:10:33 localhost ceph-mon[298913]: log_channel(audit) log [DBG] : from='client.25625 172.18.0.34:0/382777224' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch Dec 15 05:10:33 localhost neutron_dhcp_agent[267542]: 2025-12-15 10:10:33.889 267546 INFO neutron.agent.dhcp.agent [-] Network not present, action: clean_devices, action_kwargs: {}#033[00m Dec 15 05:10:33 localhost nova_compute[286344]: 2025-12-15 10:10:33.999 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:10:34 localhost ovn_controller[154603]: 2025-12-15T10:10:34Z|00494|binding|INFO|Releasing lport b35254ad-12eb-47bb-92be-44fefe0694f0 from this chassis (sb_readonly=0) Dec 15 05:10:34 localhost nova_compute[286344]: 2025-12-15 10:10:34.456 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:10:34 localhost openstack_network_exporter[246484]: ERROR 10:10:34 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd Dec 15 05:10:34 localhost openstack_network_exporter[246484]: ERROR 10:10:34 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd Dec 15 05:10:34 localhost openstack_network_exporter[246484]: ERROR 10:10:34 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server Dec 15 05:10:34 localhost openstack_network_exporter[246484]: ERROR 10:10:34 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath Dec 15 05:10:34 localhost openstack_network_exporter[246484]: Dec 15 05:10:34 localhost openstack_network_exporter[246484]: ERROR 10:10:34 appctl.go:174: 
call(dpif-netdev/pmd-perf-show): please specify an existing datapath Dec 15 05:10:34 localhost openstack_network_exporter[246484]: Dec 15 05:10:35 localhost systemd[1]: tmp-crun.5iVXBj.mount: Deactivated successfully. Dec 15 05:10:35 localhost dnsmasq[334087]: read /var/lib/neutron/dhcp/4ec39a2f-452e-4f94-b8fd-ad944930604a/addn_hosts - 0 addresses Dec 15 05:10:35 localhost dnsmasq-dhcp[334087]: read /var/lib/neutron/dhcp/4ec39a2f-452e-4f94-b8fd-ad944930604a/host Dec 15 05:10:35 localhost podman[334642]: 2025-12-15 10:10:35.743100042 +0000 UTC m=+0.066700505 container kill 805ae500e536e52dbe2df070ab460abb67eb99dd45560710136ae839334b8030 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-4ec39a2f-452e-4f94-b8fd-ad944930604a, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image) Dec 15 05:10:35 localhost dnsmasq-dhcp[334087]: read /var/lib/neutron/dhcp/4ec39a2f-452e-4f94-b8fd-ad944930604a/opts Dec 15 05:10:35 localhost ovn_controller[154603]: 2025-12-15T10:10:35Z|00495|binding|INFO|Releasing lport 8d188a62-1614-434c-9ce5-49667185857d from this chassis (sb_readonly=0) Dec 15 05:10:35 localhost ovn_controller[154603]: 2025-12-15T10:10:35Z|00496|binding|INFO|Setting lport 8d188a62-1614-434c-9ce5-49667185857d down in Southbound Dec 15 05:10:35 localhost nova_compute[286344]: 2025-12-15 10:10:35.934 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:10:35 localhost kernel: device tap8d188a62-16 left promiscuous mode Dec 15 05:10:35 localhost nova_compute[286344]: 2025-12-15 10:10:35.958 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] 
on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:10:36 localhost ovn_metadata_agent[160585]: 2025-12-15 10:10:36.040 160590 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005559462.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.102.0.2/28', 'neutron:device_id': 'dhcpce287b4b-a043-5ed9-8e08-4fd555d768a2-4ec39a2f-452e-4f94-b8fd-ad944930604a', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-4ec39a2f-452e-4f94-b8fd-ad944930604a', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '37134118fff54995bf9c18df154a3cf8', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'np0005559462.localdomain'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=154f67e3-e095-4616-9161-d4025ab1bffe, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=8d188a62-1614-434c-9ce5-49667185857d) old=Port_Binding(up=[True], chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Dec 15 05:10:36 localhost ovn_metadata_agent[160585]: 2025-12-15 10:10:36.044 160590 INFO neutron.agent.ovn.metadata.agent [-] Port 8d188a62-1614-434c-9ce5-49667185857d in datapath 4ec39a2f-452e-4f94-b8fd-ad944930604a unbound from our chassis#033[00m Dec 15 05:10:36 localhost ovn_metadata_agent[160585]: 2025-12-15 10:10:36.047 160590 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 
4ec39a2f-452e-4f94-b8fd-ad944930604a, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m Dec 15 05:10:36 localhost ovn_metadata_agent[160585]: 2025-12-15 10:10:36.048 160858 DEBUG oslo.privsep.daemon [-] privsep: reply[1c9828c6-cc26-40d7-8e9b-82dcf8514647]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Dec 15 05:10:36 localhost nova_compute[286344]: 2025-12-15 10:10:36.366 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:10:36 localhost ceph-mon[298913]: mon.np0005559462@0(leader).osd e225 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104 Dec 15 05:10:36 localhost ceph-mon[298913]: mon.np0005559462@0(leader).osd e225 do_prune osdmap full prune enabled Dec 15 05:10:36 localhost ceph-mon[298913]: mon.np0005559462@0(leader).osd e226 e226: 6 total, 6 up, 6 in Dec 15 05:10:36 localhost ceph-mon[298913]: log_channel(cluster) log [DBG] : osdmap e226: 6 total, 6 up, 6 in Dec 15 05:10:36 localhost ceph-mon[298913]: mon.np0005559462@0(leader) e17 handle_command mon_command({"prefix": "auth get-or-create", "entity": "client.Joe", "caps": ["mds", "allow rw path=/volumes/_nogroup/c73e998c-987a-45d1-bbc3-da65922e987c/9e8891cd-31dc-44fa-b7fc-384803ade97c", "osd", "allow rw pool=manila_data namespace=fsvolumens_c73e998c-987a-45d1-bbc3-da65922e987c", "mon", "allow r"], "format": "json"} v 0) Dec 15 05:10:36 localhost ceph-mon[298913]: log_channel(audit) log [INF] : from='mgr.34411 ' entity='mgr.np0005559464.aomnqe' cmd={"prefix": "auth get-or-create", "entity": "client.Joe", "caps": ["mds", "allow rw path=/volumes/_nogroup/c73e998c-987a-45d1-bbc3-da65922e987c/9e8891cd-31dc-44fa-b7fc-384803ade97c", "osd", "allow rw pool=manila_data namespace=fsvolumens_c73e998c-987a-45d1-bbc3-da65922e987c", "mon", 
"allow r"], "format": "json"} : dispatch Dec 15 05:10:36 localhost ceph-mon[298913]: log_channel(audit) log [INF] : from='mgr.34411 ' entity='mgr.np0005559464.aomnqe' cmd='[{"prefix": "auth get-or-create", "entity": "client.Joe", "caps": ["mds", "allow rw path=/volumes/_nogroup/c73e998c-987a-45d1-bbc3-da65922e987c/9e8891cd-31dc-44fa-b7fc-384803ade97c", "osd", "allow rw pool=manila_data namespace=fsvolumens_c73e998c-987a-45d1-bbc3-da65922e987c", "mon", "allow r"], "format": "json"}]': finished Dec 15 05:10:37 localhost ceph-mon[298913]: from='mgr.34411 172.18.0.108:0/929118679' entity='mgr.np0005559464.aomnqe' cmd={"prefix": "auth get", "entity": "client.Joe", "format": "json"} : dispatch Dec 15 05:10:37 localhost ceph-mon[298913]: from='mgr.34411 172.18.0.108:0/929118679' entity='mgr.np0005559464.aomnqe' cmd={"prefix": "auth get-or-create", "entity": "client.Joe", "caps": ["mds", "allow rw path=/volumes/_nogroup/c73e998c-987a-45d1-bbc3-da65922e987c/9e8891cd-31dc-44fa-b7fc-384803ade97c", "osd", "allow rw pool=manila_data namespace=fsvolumens_c73e998c-987a-45d1-bbc3-da65922e987c", "mon", "allow r"], "format": "json"} : dispatch Dec 15 05:10:37 localhost ceph-mon[298913]: from='mgr.34411 ' entity='mgr.np0005559464.aomnqe' cmd={"prefix": "auth get-or-create", "entity": "client.Joe", "caps": ["mds", "allow rw path=/volumes/_nogroup/c73e998c-987a-45d1-bbc3-da65922e987c/9e8891cd-31dc-44fa-b7fc-384803ade97c", "osd", "allow rw pool=manila_data namespace=fsvolumens_c73e998c-987a-45d1-bbc3-da65922e987c", "mon", "allow r"], "format": "json"} : dispatch Dec 15 05:10:37 localhost ceph-mon[298913]: from='mgr.34411 ' entity='mgr.np0005559464.aomnqe' cmd='[{"prefix": "auth get-or-create", "entity": "client.Joe", "caps": ["mds", "allow rw path=/volumes/_nogroup/c73e998c-987a-45d1-bbc3-da65922e987c/9e8891cd-31dc-44fa-b7fc-384803ade97c", "osd", "allow rw pool=manila_data namespace=fsvolumens_c73e998c-987a-45d1-bbc3-da65922e987c", "mon", "allow r"], "format": "json"}]': finished Dec 15 
05:10:37 localhost podman[334683]: 2025-12-15 10:10:37.25864756 +0000 UTC m=+0.058881382 container kill 805ae500e536e52dbe2df070ab460abb67eb99dd45560710136ae839334b8030 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-4ec39a2f-452e-4f94-b8fd-ad944930604a, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb) Dec 15 05:10:37 localhost systemd[1]: tmp-crun.8iNQax.mount: Deactivated successfully. Dec 15 05:10:37 localhost dnsmasq[334087]: exiting on receipt of SIGTERM Dec 15 05:10:37 localhost systemd[1]: libpod-805ae500e536e52dbe2df070ab460abb67eb99dd45560710136ae839334b8030.scope: Deactivated successfully. Dec 15 05:10:37 localhost podman[334697]: 2025-12-15 10:10:37.335837789 +0000 UTC m=+0.059648983 container died 805ae500e536e52dbe2df070ab460abb67eb99dd45560710136ae839334b8030 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-4ec39a2f-452e-4f94-b8fd-ad944930604a, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team) Dec 15 05:10:37 localhost podman[334697]: 2025-12-15 10:10:37.372104585 +0000 UTC m=+0.095915769 container cleanup 805ae500e536e52dbe2df070ab460abb67eb99dd45560710136ae839334b8030 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-4ec39a2f-452e-4f94-b8fd-ad944930604a, org.label-schema.build-date=20251202, 
org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb) Dec 15 05:10:37 localhost systemd[1]: libpod-conmon-805ae500e536e52dbe2df070ab460abb67eb99dd45560710136ae839334b8030.scope: Deactivated successfully. Dec 15 05:10:37 localhost podman[334698]: 2025-12-15 10:10:37.413338696 +0000 UTC m=+0.133825470 container remove 805ae500e536e52dbe2df070ab460abb67eb99dd45560710136ae839334b8030 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-4ec39a2f-452e-4f94-b8fd-ad944930604a, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb) Dec 15 05:10:37 localhost neutron_dhcp_agent[267542]: 2025-12-15 10:10:37.445 267546 INFO neutron.agent.dhcp.agent [None req-a03b2638-42cf-400f-9fc3-811764b99f7f - - - - - -] Network not present, action: clean_devices, action_kwargs: {}#033[00m Dec 15 05:10:37 localhost ceph-mon[298913]: mon.np0005559462@0(leader) e17 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) Dec 15 05:10:37 localhost ceph-mon[298913]: log_channel(audit) log [DBG] : from='client.25625 172.18.0.34:0/382777224' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch Dec 15 05:10:37 localhost neutron_dhcp_agent[267542]: 2025-12-15 10:10:37.719 267546 INFO neutron.agent.dhcp.agent [-] Network not present, action: clean_devices, action_kwargs: {}#033[00m Dec 15 05:10:38 localhost systemd[1]: 
var-lib-containers-storage-overlay-40ecc108e45c7a929ccc94b795068d8c66848872f36c114e33258147034c3517-merged.mount: Deactivated successfully. Dec 15 05:10:38 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-805ae500e536e52dbe2df070ab460abb67eb99dd45560710136ae839334b8030-userdata-shm.mount: Deactivated successfully. Dec 15 05:10:38 localhost systemd[1]: run-netns-qdhcp\x2d4ec39a2f\x2d452e\x2d4f94\x2db8fd\x2dad944930604a.mount: Deactivated successfully. Dec 15 05:10:38 localhost ovn_controller[154603]: 2025-12-15T10:10:38Z|00497|binding|INFO|Releasing lport b35254ad-12eb-47bb-92be-44fefe0694f0 from this chassis (sb_readonly=0) Dec 15 05:10:38 localhost nova_compute[286344]: 2025-12-15 10:10:38.318 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:10:39 localhost nova_compute[286344]: 2025-12-15 10:10:39.037 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:10:39 localhost ceph-mon[298913]: mon.np0005559462@0(leader) e17 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) Dec 15 05:10:39 localhost ceph-mon[298913]: log_channel(audit) log [DBG] : from='client.25625 172.18.0.34:0/382777224' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch Dec 15 05:10:40 localhost neutron_dhcp_agent[267542]: 2025-12-15 10:10:40.035 267546 INFO neutron.agent.linux.ip_lib [None req-0f2d7450-c072-4d46-a5e9-0d48761f7976 - - - - - -] Device tap347c9c41-0a cannot be used as it has no MAC address#033[00m Dec 15 05:10:40 localhost nova_compute[286344]: 2025-12-15 10:10:40.064 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:10:40 localhost kernel: device tap347c9c41-0a entered promiscuous mode Dec 15 05:10:40 localhost 
NetworkManager[5963]: [1765793440.0766] manager: (tap347c9c41-0a): new Generic device (/org/freedesktop/NetworkManager/Devices/77) Dec 15 05:10:40 localhost ovn_controller[154603]: 2025-12-15T10:10:40Z|00498|binding|INFO|Claiming lport 347c9c41-0ad7-42df-851b-4296ded79424 for this chassis. Dec 15 05:10:40 localhost ovn_controller[154603]: 2025-12-15T10:10:40Z|00499|binding|INFO|347c9c41-0ad7-42df-851b-4296ded79424: Claiming unknown Dec 15 05:10:40 localhost nova_compute[286344]: 2025-12-15 10:10:40.079 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:10:40 localhost systemd-udevd[334735]: Network interface NamePolicy= disabled on kernel command line. Dec 15 05:10:40 localhost ovn_metadata_agent[160585]: 2025-12-15 10:10:40.090 160590 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005559462.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8::2/64', 'neutron:device_id': 'dhcpce287b4b-a043-5ed9-8e08-4fd555d768a2-e2a1290f-45b4-4372-a047-df2537ce2336', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-e2a1290f-45b4-4372-a047-df2537ce2336', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '68b3221c6efd4aac8016cd49e82c26c8', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=4fdae3b8-0df4-41d4-b2da-97cd632389fc, chassis=[], tunnel_key=2, 
gateway_chassis=[], requested_chassis=[], logical_port=347c9c41-0ad7-42df-851b-4296ded79424) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Dec 15 05:10:40 localhost ovn_metadata_agent[160585]: 2025-12-15 10:10:40.092 160590 INFO neutron.agent.ovn.metadata.agent [-] Port 347c9c41-0ad7-42df-851b-4296ded79424 in datapath e2a1290f-45b4-4372-a047-df2537ce2336 bound to our chassis#033[00m Dec 15 05:10:40 localhost ovn_metadata_agent[160585]: 2025-12-15 10:10:40.093 160590 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network e2a1290f-45b4-4372-a047-df2537ce2336 or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599#033[00m Dec 15 05:10:40 localhost ovn_metadata_agent[160585]: 2025-12-15 10:10:40.094 160858 DEBUG oslo.privsep.daemon [-] privsep: reply[71140125-24e4-4b3a-8355-45a94ac65624]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Dec 15 05:10:40 localhost journal[231322]: ethtool ioctl error on tap347c9c41-0a: No such device Dec 15 05:10:40 localhost journal[231322]: ethtool ioctl error on tap347c9c41-0a: No such device Dec 15 05:10:40 localhost ovn_controller[154603]: 2025-12-15T10:10:40Z|00500|binding|INFO|Setting lport 347c9c41-0ad7-42df-851b-4296ded79424 ovn-installed in OVS Dec 15 05:10:40 localhost ovn_controller[154603]: 2025-12-15T10:10:40Z|00501|binding|INFO|Setting lport 347c9c41-0ad7-42df-851b-4296ded79424 up in Southbound Dec 15 05:10:40 localhost nova_compute[286344]: 2025-12-15 10:10:40.120 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:10:40 localhost journal[231322]: ethtool ioctl error on tap347c9c41-0a: No such device Dec 15 05:10:40 localhost journal[231322]: ethtool ioctl error on 
tap347c9c41-0a: No such device Dec 15 05:10:40 localhost journal[231322]: ethtool ioctl error on tap347c9c41-0a: No such device Dec 15 05:10:40 localhost journal[231322]: ethtool ioctl error on tap347c9c41-0a: No such device Dec 15 05:10:40 localhost journal[231322]: ethtool ioctl error on tap347c9c41-0a: No such device Dec 15 05:10:40 localhost journal[231322]: ethtool ioctl error on tap347c9c41-0a: No such device Dec 15 05:10:40 localhost nova_compute[286344]: 2025-12-15 10:10:40.193 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:10:40 localhost nova_compute[286344]: 2025-12-15 10:10:40.200 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:10:40 localhost ceph-mon[298913]: mon.np0005559462@0(leader) e17 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) Dec 15 05:10:40 localhost ceph-mon[298913]: log_channel(audit) log [DBG] : from='client.25625 172.18.0.34:0/382777224' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch Dec 15 05:10:41 localhost podman[334806]: Dec 15 05:10:41 localhost podman[334806]: 2025-12-15 10:10:41.064907634 +0000 UTC m=+0.093372199 container create 2286740e6174c88daf6e0f11f64bfbbf5430099ed035d5d12271ef5a915478be (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-e2a1290f-45b4-4372-a047-df2537ce2336, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2) Dec 15 05:10:41 localhost systemd[1]: Started 
libpod-conmon-2286740e6174c88daf6e0f11f64bfbbf5430099ed035d5d12271ef5a915478be.scope. Dec 15 05:10:41 localhost podman[334806]: 2025-12-15 10:10:41.01875591 +0000 UTC m=+0.047220505 image pull quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified Dec 15 05:10:41 localhost systemd[1]: tmp-crun.qjijCp.mount: Deactivated successfully. Dec 15 05:10:41 localhost systemd[1]: Started libcrun container. Dec 15 05:10:41 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/0dd356bea474ad5e041e358c013579f37a0929a9b3ce78ef44f566b2d95acdd6/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff) Dec 15 05:10:41 localhost podman[334806]: 2025-12-15 10:10:41.158554051 +0000 UTC m=+0.187018606 container init 2286740e6174c88daf6e0f11f64bfbbf5430099ed035d5d12271ef5a915478be (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-e2a1290f-45b4-4372-a047-df2537ce2336, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202) Dec 15 05:10:41 localhost podman[334806]: 2025-12-15 10:10:41.16772986 +0000 UTC m=+0.196194415 container start 2286740e6174c88daf6e0f11f64bfbbf5430099ed035d5d12271ef5a915478be (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-e2a1290f-45b4-4372-a047-df2537ce2336, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.build-date=20251202, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb) Dec 15 05:10:41 
localhost dnsmasq[334825]: started, version 2.85 cachesize 150 Dec 15 05:10:41 localhost dnsmasq[334825]: DNS service limited to local subnets Dec 15 05:10:41 localhost dnsmasq[334825]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile Dec 15 05:10:41 localhost dnsmasq[334825]: warning: no upstream servers configured Dec 15 05:10:41 localhost dnsmasq-dhcp[334825]: DHCPv6, static leases only on 2001:db8::, lease time 1d Dec 15 05:10:41 localhost dnsmasq[334825]: read /var/lib/neutron/dhcp/e2a1290f-45b4-4372-a047-df2537ce2336/addn_hosts - 0 addresses Dec 15 05:10:41 localhost dnsmasq-dhcp[334825]: read /var/lib/neutron/dhcp/e2a1290f-45b4-4372-a047-df2537ce2336/host Dec 15 05:10:41 localhost dnsmasq-dhcp[334825]: read /var/lib/neutron/dhcp/e2a1290f-45b4-4372-a047-df2537ce2336/opts Dec 15 05:10:41 localhost neutron_dhcp_agent[267542]: 2025-12-15 10:10:41.229 267546 INFO neutron.agent.dhcp.agent [None req-0f2d7450-c072-4d46-a5e9-0d48761f7976 - - - - - -] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-12-15T10:10:39Z, description=, device_id=8ffc82b6-8c38-4efd-881e-6428848f72b1, device_owner=network:router_interface, dns_assignment=[], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[], id=d60b886d-9e9b-4397-8bda-347563800b25, ip_allocation=immediate, mac_address=fa:16:3e:ca:e8:22, name=, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-12-15T10:10:38Z, description=, dns_domain=, id=e2a1290f-45b4-4372-a047-df2537ce2336, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-RoutersIpV6Test-68146735, port_security_enabled=True, project_id=68b3221c6efd4aac8016cd49e82c26c8, provider:network_type=geneve, 
provider:physical_network=None, provider:segmentation_id=23159, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=3342, status=ACTIVE, subnets=['4d63e35e-4361-4379-a96d-2e357f70a222'], tags=[], tenant_id=68b3221c6efd4aac8016cd49e82c26c8, updated_at=2025-12-15T10:10:38Z, vlan_transparent=None, network_id=e2a1290f-45b4-4372-a047-df2537ce2336, port_security_enabled=False, project_id=68b3221c6efd4aac8016cd49e82c26c8, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=[], standard_attr_id=3347, status=DOWN, tags=[], tenant_id=68b3221c6efd4aac8016cd49e82c26c8, updated_at=2025-12-15T10:10:39Z on network e2a1290f-45b4-4372-a047-df2537ce2336#033[00m Dec 15 05:10:41 localhost nova_compute[286344]: 2025-12-15 10:10:41.270 286348 DEBUG oslo_service.periodic_task [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Dec 15 05:10:41 localhost neutron_dhcp_agent[267542]: 2025-12-15 10:10:41.312 267546 INFO neutron.agent.dhcp.agent [None req-6d1e34ce-0207-49cd-a06f-8438e23f5077 - - - - - -] DHCP configuration for ports {'f1d3c7ba-75a1-48a5-ae17-2a285f57f0de'} is completed#033[00m Dec 15 05:10:41 localhost nova_compute[286344]: 2025-12-15 10:10:41.397 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:10:41 localhost dnsmasq[334825]: read /var/lib/neutron/dhcp/e2a1290f-45b4-4372-a047-df2537ce2336/addn_hosts - 1 addresses Dec 15 05:10:41 localhost dnsmasq-dhcp[334825]: read /var/lib/neutron/dhcp/e2a1290f-45b4-4372-a047-df2537ce2336/host Dec 15 05:10:41 localhost dnsmasq-dhcp[334825]: read /var/lib/neutron/dhcp/e2a1290f-45b4-4372-a047-df2537ce2336/opts Dec 15 05:10:41 localhost podman[334844]: 2025-12-15 
10:10:41.45051307 +0000 UTC m=+0.083420010 container kill 2286740e6174c88daf6e0f11f64bfbbf5430099ed035d5d12271ef5a915478be (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-e2a1290f-45b4-4372-a047-df2537ce2336, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2) Dec 15 05:10:41 localhost ceph-mon[298913]: mon.np0005559462@0(leader).osd e226 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104 Dec 15 05:10:41 localhost nova_compute[286344]: 2025-12-15 10:10:41.583 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:10:41 localhost neutron_dhcp_agent[267542]: 2025-12-15 10:10:41.605 267546 INFO neutron.agent.dhcp.agent [None req-0f2d7450-c072-4d46-a5e9-0d48761f7976 - - - - - -] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-12-15T10:10:39Z, description=, device_id=8ffc82b6-8c38-4efd-881e-6428848f72b1, device_owner=network:router_interface, dns_assignment=[], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[], id=d60b886d-9e9b-4397-8bda-347563800b25, ip_allocation=immediate, mac_address=fa:16:3e:ca:e8:22, name=, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-12-15T10:10:38Z, description=, dns_domain=, id=e2a1290f-45b4-4372-a047-df2537ce2336, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-RoutersIpV6Test-68146735, 
port_security_enabled=True, project_id=68b3221c6efd4aac8016cd49e82c26c8, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=23159, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=3342, status=ACTIVE, subnets=['4d63e35e-4361-4379-a96d-2e357f70a222'], tags=[], tenant_id=68b3221c6efd4aac8016cd49e82c26c8, updated_at=2025-12-15T10:10:38Z, vlan_transparent=None, network_id=e2a1290f-45b4-4372-a047-df2537ce2336, port_security_enabled=False, project_id=68b3221c6efd4aac8016cd49e82c26c8, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=[], standard_attr_id=3347, status=DOWN, tags=[], tenant_id=68b3221c6efd4aac8016cd49e82c26c8, updated_at=2025-12-15T10:10:39Z on network e2a1290f-45b4-4372-a047-df2537ce2336#033[00m Dec 15 05:10:41 localhost neutron_dhcp_agent[267542]: 2025-12-15 10:10:41.665 267546 INFO neutron.agent.dhcp.agent [None req-82c7f883-1d6b-4527-954d-bcc8e9fc32c9 - - - - - -] DHCP configuration for ports {'d60b886d-9e9b-4397-8bda-347563800b25'} is completed#033[00m Dec 15 05:10:41 localhost dnsmasq[334825]: read /var/lib/neutron/dhcp/e2a1290f-45b4-4372-a047-df2537ce2336/addn_hosts - 1 addresses Dec 15 05:10:41 localhost dnsmasq-dhcp[334825]: read /var/lib/neutron/dhcp/e2a1290f-45b4-4372-a047-df2537ce2336/host Dec 15 05:10:41 localhost podman[334883]: 2025-12-15 10:10:41.786723791 +0000 UTC m=+0.058039389 container kill 2286740e6174c88daf6e0f11f64bfbbf5430099ed035d5d12271ef5a915478be (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-e2a1290f-45b4-4372-a047-df2537ce2336, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, 
org.label-schema.vendor=CentOS) Dec 15 05:10:41 localhost dnsmasq-dhcp[334825]: read /var/lib/neutron/dhcp/e2a1290f-45b4-4372-a047-df2537ce2336/opts Dec 15 05:10:42 localhost neutron_dhcp_agent[267542]: 2025-12-15 10:10:42.013 267546 INFO neutron.agent.dhcp.agent [None req-f279a2b8-23c3-42d2-bda6-fd3bc081a9b5 - - - - - -] DHCP configuration for ports {'d60b886d-9e9b-4397-8bda-347563800b25'} is completed#033[00m Dec 15 05:10:44 localhost nova_compute[286344]: 2025-12-15 10:10:44.074 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:10:44 localhost ceph-mon[298913]: from='mgr.34411 172.18.0.108:0/929118679' entity='mgr.np0005559464.aomnqe' cmd={"prefix": "auth get", "entity": "client.Joe", "format": "json"} : dispatch Dec 15 05:10:46 localhost nova_compute[286344]: 2025-12-15 10:10:46.399 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:10:46 localhost ceph-mon[298913]: mon.np0005559462@0(leader).osd e226 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104 Dec 15 05:10:46 localhost ceph-mon[298913]: mon.np0005559462@0(leader) e17 handle_command mon_command({"prefix": "auth get-or-create", "entity": "client.tempest-cephx-id-309271475", "caps": ["mds", "allow rw path=/volumes/_nogroup/24b89787-8789-48aa-b143-67a0bb16661b/15ab028f-6929-48dd-88c0-d1c26935f223", "osd", "allow rw pool=manila_data namespace=fsvolumens_24b89787-8789-48aa-b143-67a0bb16661b", "mon", "allow r"], "format": "json"} v 0) Dec 15 05:10:46 localhost ceph-mon[298913]: log_channel(audit) log [INF] : from='mgr.34411 ' entity='mgr.np0005559464.aomnqe' cmd={"prefix": "auth get-or-create", "entity": "client.tempest-cephx-id-309271475", "caps": ["mds", "allow rw path=/volumes/_nogroup/24b89787-8789-48aa-b143-67a0bb16661b/15ab028f-6929-48dd-88c0-d1c26935f223", 
"osd", "allow rw pool=manila_data namespace=fsvolumens_24b89787-8789-48aa-b143-67a0bb16661b", "mon", "allow r"], "format": "json"} : dispatch Dec 15 05:10:46 localhost ceph-mon[298913]: log_channel(audit) log [INF] : from='mgr.34411 ' entity='mgr.np0005559464.aomnqe' cmd='[{"prefix": "auth get-or-create", "entity": "client.tempest-cephx-id-309271475", "caps": ["mds", "allow rw path=/volumes/_nogroup/24b89787-8789-48aa-b143-67a0bb16661b/15ab028f-6929-48dd-88c0-d1c26935f223", "osd", "allow rw pool=manila_data namespace=fsvolumens_24b89787-8789-48aa-b143-67a0bb16661b", "mon", "allow r"], "format": "json"}]': finished Dec 15 05:10:47 localhost nova_compute[286344]: 2025-12-15 10:10:47.121 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:10:47 localhost ceph-mon[298913]: from='mgr.34411 172.18.0.108:0/929118679' entity='mgr.np0005559464.aomnqe' cmd={"prefix": "auth get", "entity": "client.tempest-cephx-id-309271475", "format": "json"} : dispatch Dec 15 05:10:47 localhost ceph-mon[298913]: from='mgr.34411 172.18.0.108:0/929118679' entity='mgr.np0005559464.aomnqe' cmd={"prefix": "auth get-or-create", "entity": "client.tempest-cephx-id-309271475", "caps": ["mds", "allow rw path=/volumes/_nogroup/24b89787-8789-48aa-b143-67a0bb16661b/15ab028f-6929-48dd-88c0-d1c26935f223", "osd", "allow rw pool=manila_data namespace=fsvolumens_24b89787-8789-48aa-b143-67a0bb16661b", "mon", "allow r"], "format": "json"} : dispatch Dec 15 05:10:47 localhost ceph-mon[298913]: from='mgr.34411 ' entity='mgr.np0005559464.aomnqe' cmd={"prefix": "auth get-or-create", "entity": "client.tempest-cephx-id-309271475", "caps": ["mds", "allow rw path=/volumes/_nogroup/24b89787-8789-48aa-b143-67a0bb16661b/15ab028f-6929-48dd-88c0-d1c26935f223", "osd", "allow rw pool=manila_data namespace=fsvolumens_24b89787-8789-48aa-b143-67a0bb16661b", "mon", "allow r"], "format": "json"} : dispatch Dec 15 05:10:47 localhost 
ceph-mon[298913]: from='mgr.34411 ' entity='mgr.np0005559464.aomnqe' cmd='[{"prefix": "auth get-or-create", "entity": "client.tempest-cephx-id-309271475", "caps": ["mds", "allow rw path=/volumes/_nogroup/24b89787-8789-48aa-b143-67a0bb16661b/15ab028f-6929-48dd-88c0-d1c26935f223", "osd", "allow rw pool=manila_data namespace=fsvolumens_24b89787-8789-48aa-b143-67a0bb16661b", "mon", "allow r"], "format": "json"}]': finished Dec 15 05:10:47 localhost ceph-mon[298913]: mon.np0005559462@0(leader) e17 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) Dec 15 05:10:47 localhost ceph-mon[298913]: log_channel(audit) log [DBG] : from='client.25625 172.18.0.34:0/382777224' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch Dec 15 05:10:49 localhost nova_compute[286344]: 2025-12-15 10:10:49.077 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:10:49 localhost systemd[1]: Started /usr/bin/podman healthcheck run 67ee72fcd99c896f79e8807764b1d0f04e7d45927d06e306c0986394e8ed42a0. Dec 15 05:10:49 localhost systemd[1]: Started /usr/bin/podman healthcheck run 730c50020a7d9e2da21d2462d40af20ac303e43f3491f692cbbd87f432ba3e09. Dec 15 05:10:49 localhost systemd[1]: Started /usr/bin/podman healthcheck run 9f0f481c937606298203797a71924b7614a5d9e3f4a8afd837cfa5b9602aedeb. Dec 15 05:10:49 localhost systemd[1]: Started /usr/bin/podman healthcheck run ac2eae30a2a7f971398af42ea67b3ed799d4b3341f7688ebcd73f7ca7f66c2a8. Dec 15 05:10:49 localhost systemd[1]: Started /usr/bin/podman healthcheck run b3d8483ade539500e228c2af9c0c3e647febb5ec7903610a83aea32631007d4a. 
Dec 15 05:10:49 localhost podman[334907]: 2025-12-15 10:10:49.75854179 +0000 UTC m=+0.079488803 container health_status ac2eae30a2a7f971398af42ea67b3ed799d4b3341f7688ebcd73f7ca7f66c2a8 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, org.label-schema.build-date=20251202, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '07f3af15d79276418f54a8dde2fc251064527c3571430e14af77aaa2c01f1193'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, config_id=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, container_name=ovn_controller) Dec 15 05:10:49 localhost systemd[1]: tmp-crun.oTNbnY.mount: Deactivated successfully. 
Dec 15 05:10:49 localhost podman[334904]: 2025-12-15 10:10:49.831802352 +0000 UTC m=+0.156694422 container health_status 67ee72fcd99c896f79e8807764b1d0f04e7d45927d06e306c0986394e8ed42a0 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, maintainer=The Prometheus Authors , managed_by=edpm_ansible, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter) Dec 15 05:10:49 localhost podman[334905]: 2025-12-15 10:10:49.837190299 +0000 UTC m=+0.160048863 container health_status 730c50020a7d9e2da21d2462d40af20ac303e43f3491f692cbbd87f432ba3e09 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, url=https://catalog.redhat.com/en/search?searchType=containers, 
com.redhat.component=ubi9-minimal-container, io.buildah.version=1.33.7, name=ubi9-minimal, maintainer=Red Hat, Inc., io.openshift.expose-services=, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'cb1e9478634a00c8fb5ad9cd7fcd38c7b2a1a85187dfaa618bc715c24c2b15fb'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, build-date=2025-08-20T13:12:41, config_id=openstack_network_exporter, container_name=openstack_network_exporter, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, version=9.6, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., vendor=Red Hat, Inc., vcs-type=git, release=1755695350, io.openshift.tags=minimal rhel9, architecture=x86_64, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible) Dec 15 05:10:49 localhost podman[334907]: 2025-12-15 10:10:49.861364905 +0000 UTC m=+0.182311908 container exec_died ac2eae30a2a7f971398af42ea67b3ed799d4b3341f7688ebcd73f7ca7f66c2a8 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_controller, tcib_managed=true, container_name=ovn_controller, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '07f3af15d79276418f54a8dde2fc251064527c3571430e14af77aaa2c01f1193'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, 
managed_by=edpm_ansible, org.label-schema.build-date=20251202) Dec 15 05:10:49 localhost systemd[1]: ac2eae30a2a7f971398af42ea67b3ed799d4b3341f7688ebcd73f7ca7f66c2a8.service: Deactivated successfully. Dec 15 05:10:49 localhost podman[334906]: 2025-12-15 10:10:49.879237852 +0000 UTC m=+0.197072179 container health_status 9f0f481c937606298203797a71924b7614a5d9e3f4a8afd837cfa5b9602aedeb (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, org.label-schema.build-date=20251202, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, 
container_name=multipathd, io.buildah.version=1.41.3) Dec 15 05:10:49 localhost podman[334905]: 2025-12-15 10:10:49.882547082 +0000 UTC m=+0.205405696 container exec_died 730c50020a7d9e2da21d2462d40af20ac303e43f3491f692cbbd87f432ba3e09 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, com.redhat.component=ubi9-minimal-container, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'cb1e9478634a00c8fb5ad9cd7fcd38c7b2a1a85187dfaa618bc715c24c2b15fb'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, maintainer=Red Hat, Inc., container_name=openstack_network_exporter, distribution-scope=public, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.buildah.version=1.33.7, io.openshift.tags=minimal rhel9, name=ubi9-minimal, version=9.6, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. 
This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, url=https://catalog.redhat.com/en/search?searchType=containers, vendor=Red Hat, Inc., io.openshift.expose-services=, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, release=1755695350, architecture=x86_64, build-date=2025-08-20T13:12:41, config_id=openstack_network_exporter, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI) Dec 15 05:10:49 localhost podman[334906]: 2025-12-15 10:10:49.892519443 +0000 UTC m=+0.210353760 container exec_died 9f0f481c937606298203797a71924b7614a5d9e3f4a8afd837cfa5b9602aedeb (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, tcib_managed=true, config_id=multipathd, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, managed_by=edpm_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, container_name=multipathd) Dec 15 05:10:49 localhost systemd[1]: 730c50020a7d9e2da21d2462d40af20ac303e43f3491f692cbbd87f432ba3e09.service: Deactivated successfully. Dec 15 05:10:49 localhost systemd[1]: 9f0f481c937606298203797a71924b7614a5d9e3f4a8afd837cfa5b9602aedeb.service: Deactivated successfully. 
Dec 15 05:10:49 localhost podman[334913]: 2025-12-15 10:10:49.944025193 +0000 UTC m=+0.255069196 container health_status b3d8483ade539500e228c2af9c0c3e647febb5ec7903610a83aea32631007d4a (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '07f3af15d79276418f54a8dde2fc251064527c3571430e14af77aaa2c01f1193-cb1e9478634a00c8fb5ad9cd7fcd38c7b2a1a85187dfaa618bc715c24c2b15fb'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.build-date=20251202, tcib_managed=true, 
config_id=ceilometer_agent_compute) Dec 15 05:10:49 localhost podman[334913]: 2025-12-15 10:10:49.955494005 +0000 UTC m=+0.266537988 container exec_died b3d8483ade539500e228c2af9c0c3e647febb5ec7903610a83aea32631007d4a (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '07f3af15d79276418f54a8dde2fc251064527c3571430e14af77aaa2c01f1193-cb1e9478634a00c8fb5ad9cd7fcd38c7b2a1a85187dfaa618bc715c24c2b15fb'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=ceilometer_agent_compute, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.vendor=CentOS, container_name=ceilometer_agent_compute, managed_by=edpm_ansible, 
org.label-schema.license=GPLv2, tcib_managed=true) Dec 15 05:10:49 localhost systemd[1]: b3d8483ade539500e228c2af9c0c3e647febb5ec7903610a83aea32631007d4a.service: Deactivated successfully. Dec 15 05:10:49 localhost podman[334904]: 2025-12-15 10:10:49.968344165 +0000 UTC m=+0.293236205 container exec_died 67ee72fcd99c896f79e8807764b1d0f04e7d45927d06e306c0986394e8ed42a0 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, managed_by=edpm_ansible, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors ) Dec 15 05:10:49 localhost systemd[1]: 67ee72fcd99c896f79e8807764b1d0f04e7d45927d06e306c0986394e8ed42a0.service: Deactivated successfully. 
Dec 15 05:10:50 localhost nova_compute[286344]: 2025-12-15 10:10:50.077 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:10:50 localhost systemd[1]: tmp-crun.zElUQC.mount: Deactivated successfully. Dec 15 05:10:51 localhost nova_compute[286344]: 2025-12-15 10:10:51.440 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:10:51 localhost ovn_metadata_agent[160585]: 2025-12-15 10:10:51.485 160590 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Dec 15 05:10:51 localhost ovn_metadata_agent[160585]: 2025-12-15 10:10:51.486 160590 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Dec 15 05:10:51 localhost ovn_metadata_agent[160585]: 2025-12-15 10:10:51.488 160590 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Dec 15 05:10:51 localhost ceph-mon[298913]: mon.np0005559462@0(leader).osd e226 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104 Dec 15 05:10:52 localhost ceph-mon[298913]: mon.np0005559462@0(leader).osd e226 do_prune osdmap full prune enabled Dec 15 05:10:52 localhost ceph-mon[298913]: mon.np0005559462@0(leader).osd e227 e227: 6 total, 6 up, 6 in Dec 15 05:10:52 localhost ceph-mon[298913]: log_channel(cluster) log [DBG] : 
osdmap e227: 6 total, 6 up, 6 in Dec 15 05:10:52 localhost nova_compute[286344]: 2025-12-15 10:10:52.987 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:10:53 localhost ceph-mon[298913]: mon.np0005559462@0(leader) e17 handle_command mon_command({"prefix": "auth rm", "entity": "client.tempest-cephx-id-309271475"} v 0) Dec 15 05:10:53 localhost ceph-mon[298913]: log_channel(audit) log [INF] : from='mgr.34411 ' entity='mgr.np0005559464.aomnqe' cmd={"prefix": "auth rm", "entity": "client.tempest-cephx-id-309271475"} : dispatch Dec 15 05:10:53 localhost ceph-mon[298913]: log_channel(audit) log [INF] : from='mgr.34411 ' entity='mgr.np0005559464.aomnqe' cmd='[{"prefix": "auth rm", "entity": "client.tempest-cephx-id-309271475"}]': finished Dec 15 05:10:54 localhost nova_compute[286344]: 2025-12-15 10:10:54.118 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:10:54 localhost ceph-mon[298913]: mon.np0005559462@0(leader) e17 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) Dec 15 05:10:54 localhost ceph-mon[298913]: log_channel(audit) log [DBG] : from='client.25625 172.18.0.34:0/382777224' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch Dec 15 05:10:54 localhost ceph-mon[298913]: from='mgr.34411 172.18.0.108:0/929118679' entity='mgr.np0005559464.aomnqe' cmd={"prefix": "auth get", "entity": "client.tempest-cephx-id-309271475", "format": "json"} : dispatch Dec 15 05:10:54 localhost ceph-mon[298913]: from='mgr.34411 172.18.0.108:0/929118679' entity='mgr.np0005559464.aomnqe' cmd={"prefix": "auth rm", "entity": "client.tempest-cephx-id-309271475"} : dispatch Dec 15 05:10:54 localhost ceph-mon[298913]: from='mgr.34411 ' entity='mgr.np0005559464.aomnqe' cmd={"prefix": "auth rm", "entity": "client.tempest-cephx-id-309271475"} : 
dispatch Dec 15 05:10:54 localhost ceph-mon[298913]: from='mgr.34411 ' entity='mgr.np0005559464.aomnqe' cmd='[{"prefix": "auth rm", "entity": "client.tempest-cephx-id-309271475"}]': finished Dec 15 05:10:54 localhost systemd[1]: Started /usr/bin/podman healthcheck run 4d46c406a3855fd1c322e5d86c048188ea5865daa07ba23aaebacc7879668923. Dec 15 05:10:54 localhost podman[335009]: 2025-12-15 10:10:54.758565705 +0000 UTC m=+0.080407548 container health_status 4d46c406a3855fd1c322e5d86c048188ea5865daa07ba23aaebacc7879668923 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '07f3af15d79276418f54a8dde2fc251064527c3571430e14af77aaa2c01f1193-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, 
org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ovn_metadata_agent, managed_by=edpm_ansible, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image) Dec 15 05:10:54 localhost podman[335009]: 2025-12-15 10:10:54.768358371 +0000 UTC m=+0.090200204 container exec_died 4d46c406a3855fd1c322e5d86c048188ea5865daa07ba23aaebacc7879668923 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '07f3af15d79276418f54a8dde2fc251064527c3571430e14af77aaa2c01f1193-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', 
'/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS) Dec 15 05:10:54 localhost systemd[1]: 4d46c406a3855fd1c322e5d86c048188ea5865daa07ba23aaebacc7879668923.service: Deactivated successfully. Dec 15 05:10:56 localhost nova_compute[286344]: 2025-12-15 10:10:56.443 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:10:56 localhost ceph-mon[298913]: mon.np0005559462@0(leader).osd e227 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104 Dec 15 05:10:56 localhost ceph-mon[298913]: mon.np0005559462@0(leader).osd e227 do_prune osdmap full prune enabled Dec 15 05:10:56 localhost ovn_controller[154603]: 2025-12-15T10:10:56Z|00502|binding|INFO|Releasing lport b35254ad-12eb-47bb-92be-44fefe0694f0 from this chassis (sb_readonly=0) Dec 15 05:10:56 localhost nova_compute[286344]: 2025-12-15 10:10:56.565 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:10:56 localhost ceph-mon[298913]: mon.np0005559462@0(leader).osd e228 e228: 6 total, 6 up, 6 in Dec 15 05:10:56 localhost ceph-mon[298913]: log_channel(cluster) log [DBG] : osdmap e228: 6 total, 6 up, 6 in Dec 15 05:10:57 localhost ceph-mon[298913]: mon.np0005559462@0(leader) e17 handle_command mon_command({"prefix": "auth rm", "entity": "client.Joe"} v 0) Dec 15 05:10:57 localhost ceph-mon[298913]: log_channel(audit) log [INF] : from='mgr.34411 ' entity='mgr.np0005559464.aomnqe' cmd={"prefix": "auth rm", "entity": "client.Joe"} : dispatch Dec 15 05:10:57 localhost ceph-mon[298913]: log_channel(audit) log 
[INF] : from='mgr.34411 ' entity='mgr.np0005559464.aomnqe' cmd='[{"prefix": "auth rm", "entity": "client.Joe"}]': finished Dec 15 05:10:57 localhost ceph-mon[298913]: from='mgr.34411 172.18.0.108:0/929118679' entity='mgr.np0005559464.aomnqe' cmd={"prefix": "auth get", "entity": "client.Joe", "format": "json"} : dispatch Dec 15 05:10:57 localhost ceph-mon[298913]: from='mgr.34411 172.18.0.108:0/929118679' entity='mgr.np0005559464.aomnqe' cmd={"prefix": "auth rm", "entity": "client.Joe"} : dispatch Dec 15 05:10:57 localhost ceph-mon[298913]: from='mgr.34411 ' entity='mgr.np0005559464.aomnqe' cmd={"prefix": "auth rm", "entity": "client.Joe"} : dispatch Dec 15 05:10:57 localhost ceph-mon[298913]: from='mgr.34411 ' entity='mgr.np0005559464.aomnqe' cmd='[{"prefix": "auth rm", "entity": "client.Joe"}]': finished Dec 15 05:10:58 localhost ovn_controller[154603]: 2025-12-15T10:10:58Z|00503|binding|INFO|Releasing lport b35254ad-12eb-47bb-92be-44fefe0694f0 from this chassis (sb_readonly=0) Dec 15 05:10:58 localhost nova_compute[286344]: 2025-12-15 10:10:58.252 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:10:58 localhost ceph-mon[298913]: mon.np0005559462@0(leader) e17 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) Dec 15 05:10:58 localhost ceph-mon[298913]: log_channel(audit) log [DBG] : from='client.25625 172.18.0.34:0/382777224' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch Dec 15 05:10:59 localhost nova_compute[286344]: 2025-12-15 10:10:59.159 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:11:00 localhost ovn_controller[154603]: 2025-12-15T10:11:00Z|00504|binding|INFO|Releasing lport b35254ad-12eb-47bb-92be-44fefe0694f0 from this chassis (sb_readonly=0) Dec 15 05:11:00 localhost nova_compute[286344]: 
2025-12-15 10:11:00.276 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:11:00 localhost ceph-mon[298913]: from='mgr.34411 172.18.0.108:0/929118679' entity='mgr.np0005559464.aomnqe' cmd={"prefix": "auth get", "entity": "client.admin", "format": "json"} : dispatch Dec 15 05:11:00 localhost dnsmasq[334825]: read /var/lib/neutron/dhcp/e2a1290f-45b4-4372-a047-df2537ce2336/addn_hosts - 0 addresses Dec 15 05:11:00 localhost dnsmasq-dhcp[334825]: read /var/lib/neutron/dhcp/e2a1290f-45b4-4372-a047-df2537ce2336/host Dec 15 05:11:00 localhost dnsmasq-dhcp[334825]: read /var/lib/neutron/dhcp/e2a1290f-45b4-4372-a047-df2537ce2336/opts Dec 15 05:11:00 localhost podman[335046]: 2025-12-15 10:11:00.93334443 +0000 UTC m=+0.060400283 container kill 2286740e6174c88daf6e0f11f64bfbbf5430099ed035d5d12271ef5a915478be (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-e2a1290f-45b4-4372-a047-df2537ce2336, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb) Dec 15 05:11:00 localhost systemd[1]: Started /usr/bin/podman healthcheck run a4e94bd7aaee6c174c601060fb5f34559019ccea93fc397263ee3735731cfc1e. 
Dec 15 05:11:01 localhost podman[335061]: 2025-12-15 10:11:01.048889832 +0000 UTC m=+0.085556677 container health_status a4e94bd7aaee6c174c601060fb5f34559019ccea93fc397263ee3735731cfc1e (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}) Dec 15 05:11:01 localhost podman[335061]: 2025-12-15 10:11:01.055971615 +0000 UTC m=+0.092638510 container exec_died a4e94bd7aaee6c174c601060fb5f34559019ccea93fc397263ee3735731cfc1e (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', 
'/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible) Dec 15 05:11:01 localhost systemd[1]: a4e94bd7aaee6c174c601060fb5f34559019ccea93fc397263ee3735731cfc1e.service: Deactivated successfully. Dec 15 05:11:01 localhost ovn_controller[154603]: 2025-12-15T10:11:01Z|00505|binding|INFO|Releasing lport 347c9c41-0ad7-42df-851b-4296ded79424 from this chassis (sb_readonly=0) Dec 15 05:11:01 localhost ovn_controller[154603]: 2025-12-15T10:11:01Z|00506|binding|INFO|Setting lport 347c9c41-0ad7-42df-851b-4296ded79424 down in Southbound Dec 15 05:11:01 localhost kernel: device tap347c9c41-0a left promiscuous mode Dec 15 05:11:01 localhost nova_compute[286344]: 2025-12-15 10:11:01.124 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:11:01 localhost nova_compute[286344]: 2025-12-15 10:11:01.146 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:11:01 localhost ovn_metadata_agent[160585]: 2025-12-15 10:11:01.193 160590 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005559462.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8::2/64', 'neutron:device_id': 'dhcpce287b4b-a043-5ed9-8e08-4fd555d768a2-e2a1290f-45b4-4372-a047-df2537ce2336', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-e2a1290f-45b4-4372-a047-df2537ce2336', 'neutron:port_capabilities': '', 'neutron:port_name': '', 
'neutron:project_id': '68b3221c6efd4aac8016cd49e82c26c8', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'np0005559462.localdomain'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=4fdae3b8-0df4-41d4-b2da-97cd632389fc, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=347c9c41-0ad7-42df-851b-4296ded79424) old=Port_Binding(up=[True], chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Dec 15 05:11:01 localhost ovn_metadata_agent[160585]: 2025-12-15 10:11:01.196 160590 INFO neutron.agent.ovn.metadata.agent [-] Port 347c9c41-0ad7-42df-851b-4296ded79424 in datapath e2a1290f-45b4-4372-a047-df2537ce2336 unbound from our chassis#033[00m Dec 15 05:11:01 localhost ovn_metadata_agent[160585]: 2025-12-15 10:11:01.197 160590 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network e2a1290f-45b4-4372-a047-df2537ce2336 or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599#033[00m Dec 15 05:11:01 localhost ovn_metadata_agent[160585]: 2025-12-15 10:11:01.198 160858 DEBUG oslo.privsep.daemon [-] privsep: reply[aaf739fe-2655-48a9-bd42-0b9adc326400]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Dec 15 05:11:01 localhost nova_compute[286344]: 2025-12-15 10:11:01.487 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:11:01 localhost ceph-mon[298913]: mon.np0005559462@0(leader).osd e228 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104 Dec 15 05:11:01 localhost ceph-mon[298913]: rocksdb: 
[db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #67. Immutable memtables: 0. Dec 15 05:11:01 localhost ceph-mon[298913]: rocksdb: (Original Log Time 2025/12/15-10:11:01.518158) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0 Dec 15 05:11:01 localhost ceph-mon[298913]: rocksdb: [db/flush_job.cc:856] [default] [JOB 39] Flushing memtable with next log file: 67 Dec 15 05:11:01 localhost ceph-mon[298913]: rocksdb: EVENT_LOG_v1 {"time_micros": 1765793461518245, "job": 39, "event": "flush_started", "num_memtables": 1, "num_entries": 937, "num_deletes": 260, "total_data_size": 667993, "memory_usage": 685304, "flush_reason": "Manual Compaction"} Dec 15 05:11:01 localhost ceph-mon[298913]: rocksdb: [db/flush_job.cc:885] [default] [JOB 39] Level-0 flush table #68: started Dec 15 05:11:01 localhost ceph-mon[298913]: rocksdb: EVENT_LOG_v1 {"time_micros": 1765793461526181, "cf_name": "default", "job": 39, "event": "table_file_creation", "file_number": 68, "file_size": 654074, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 35647, "largest_seqno": 36582, "table_properties": {"data_size": 649511, "index_size": 2098, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 1413, "raw_key_size": 11479, "raw_average_key_size": 20, "raw_value_size": 639736, "raw_average_value_size": 1142, "num_data_blocks": 91, "num_entries": 560, "num_filter_entries": 560, "num_deletions": 260, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[]", 
"compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1765793415, "oldest_key_time": 1765793415, "file_creation_time": 1765793461, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "603b24af-e2be-4214-bc56-9e652eb4af3d", "db_session_id": "0OJRM9SCUA16EXV0VQZ2", "orig_file_number": 68, "seqno_to_time_mapping": "N/A"}} Dec 15 05:11:01 localhost ceph-mon[298913]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 39] Flush lasted 8073 microseconds, and 2952 cpu microseconds. Dec 15 05:11:01 localhost ceph-mon[298913]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed. Dec 15 05:11:01 localhost ceph-mon[298913]: rocksdb: (Original Log Time 2025/12/15-10:11:01.526241) [db/flush_job.cc:967] [default] [JOB 39] Level-0 flush table #68: 654074 bytes OK Dec 15 05:11:01 localhost ceph-mon[298913]: rocksdb: (Original Log Time 2025/12/15-10:11:01.526264) [db/memtable_list.cc:519] [default] Level-0 commit table #68 started Dec 15 05:11:01 localhost ceph-mon[298913]: rocksdb: (Original Log Time 2025/12/15-10:11:01.529599) [db/memtable_list.cc:722] [default] Level-0 commit table #68: memtable #1 done Dec 15 05:11:01 localhost ceph-mon[298913]: rocksdb: (Original Log Time 2025/12/15-10:11:01.529625) EVENT_LOG_v1 {"time_micros": 1765793461529618, "job": 39, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0} Dec 15 05:11:01 localhost ceph-mon[298913]: rocksdb: (Original Log Time 2025/12/15-10:11:01.529648) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25 Dec 15 05:11:01 localhost ceph-mon[298913]: rocksdb: 
[db/db_impl/db_impl_files.cc:463] [JOB 39] Try to delete WAL files size 663190, prev total WAL file size 663514, number of live WAL files 2. Dec 15 05:11:01 localhost ceph-mon[298913]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005559462/store.db/000064.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000 Dec 15 05:11:01 localhost ceph-mon[298913]: rocksdb: (Original Log Time 2025/12/15-10:11:01.530248) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '6C6F676D0034323730' seq:72057594037927935, type:22 .. '6C6F676D0034353234' seq:0, type:0; will stop at (end) Dec 15 05:11:01 localhost ceph-mon[298913]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 40] Compacting 1@0 + 1@6 files to L6, score -1.00 Dec 15 05:11:01 localhost ceph-mon[298913]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 39 Base level 0, inputs: [68(638KB)], [66(16MB)] Dec 15 05:11:01 localhost ceph-mon[298913]: rocksdb: EVENT_LOG_v1 {"time_micros": 1765793461530311, "job": 40, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [68], "files_L6": [66], "score": -1, "input_data_size": 18130839, "oldest_snapshot_seqno": -1} Dec 15 05:11:01 localhost ceph-mon[298913]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 40] Generated table #69: 13619 keys, 17653979 bytes, temperature: kUnknown Dec 15 05:11:01 localhost ceph-mon[298913]: rocksdb: EVENT_LOG_v1 {"time_micros": 1765793461665853, "cf_name": "default", "job": 40, "event": "table_file_creation", "file_number": 69, "file_size": 17653979, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 17576374, "index_size": 42496, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, 
"filter_size": 34053, "raw_key_size": 368320, "raw_average_key_size": 27, "raw_value_size": 17344371, "raw_average_value_size": 1273, "num_data_blocks": 1572, "num_entries": 13619, "num_filter_entries": 13619, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1765792320, "oldest_key_time": 0, "file_creation_time": 1765793461, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "603b24af-e2be-4214-bc56-9e652eb4af3d", "db_session_id": "0OJRM9SCUA16EXV0VQZ2", "orig_file_number": 69, "seqno_to_time_mapping": "N/A"}} Dec 15 05:11:01 localhost ceph-mon[298913]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed. 
Dec 15 05:11:01 localhost ceph-mon[298913]: rocksdb: (Original Log Time 2025/12/15-10:11:01.666140) [db/compaction/compaction_job.cc:1663] [default] [JOB 40] Compacted 1@0 + 1@6 files to L6 => 17653979 bytes Dec 15 05:11:01 localhost ceph-mon[298913]: rocksdb: (Original Log Time 2025/12/15-10:11:01.670116) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 133.7 rd, 130.2 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(0.6, 16.7 +0.0 blob) out(16.8 +0.0 blob), read-write-amplify(54.7) write-amplify(27.0) OK, records in: 14161, records dropped: 542 output_compression: NoCompression Dec 15 05:11:01 localhost ceph-mon[298913]: rocksdb: (Original Log Time 2025/12/15-10:11:01.670136) EVENT_LOG_v1 {"time_micros": 1765793461670127, "job": 40, "event": "compaction_finished", "compaction_time_micros": 135621, "compaction_time_cpu_micros": 52989, "output_level": 6, "num_output_files": 1, "total_output_size": 17653979, "num_input_records": 14161, "num_output_records": 13619, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]} Dec 15 05:11:01 localhost ceph-mon[298913]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005559462/store.db/000068.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000 Dec 15 05:11:01 localhost ceph-mon[298913]: rocksdb: EVENT_LOG_v1 {"time_micros": 1765793461670314, "job": 40, "event": "table_file_deletion", "file_number": 68} Dec 15 05:11:01 localhost ceph-mon[298913]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005559462/store.db/000066.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000 Dec 15 05:11:01 localhost ceph-mon[298913]: rocksdb: EVENT_LOG_v1 {"time_micros": 1765793461672606, 
"job": 40, "event": "table_file_deletion", "file_number": 66} Dec 15 05:11:01 localhost ceph-mon[298913]: rocksdb: (Original Log Time 2025/12/15-10:11:01.530164) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting Dec 15 05:11:01 localhost ceph-mon[298913]: rocksdb: (Original Log Time 2025/12/15-10:11:01.672679) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting Dec 15 05:11:01 localhost ceph-mon[298913]: rocksdb: (Original Log Time 2025/12/15-10:11:01.672687) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting Dec 15 05:11:01 localhost ceph-mon[298913]: rocksdb: (Original Log Time 2025/12/15-10:11:01.672690) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting Dec 15 05:11:01 localhost ceph-mon[298913]: rocksdb: (Original Log Time 2025/12/15-10:11:01.672693) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting Dec 15 05:11:01 localhost ceph-mon[298913]: rocksdb: (Original Log Time 2025/12/15-10:11:01.672697) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting Dec 15 05:11:01 localhost dnsmasq[334825]: exiting on receipt of SIGTERM Dec 15 05:11:01 localhost podman[335108]: 2025-12-15 10:11:01.724616855 +0000 UTC m=+0.060799733 container kill 2286740e6174c88daf6e0f11f64bfbbf5430099ed035d5d12271ef5a915478be (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-e2a1290f-45b4-4372-a047-df2537ce2336, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team) Dec 15 05:11:01 localhost systemd[1]: 
libpod-2286740e6174c88daf6e0f11f64bfbbf5430099ed035d5d12271ef5a915478be.scope: Deactivated successfully. Dec 15 05:11:01 localhost podman[335124]: 2025-12-15 10:11:01.786291432 +0000 UTC m=+0.044477790 container died 2286740e6174c88daf6e0f11f64bfbbf5430099ed035d5d12271ef5a915478be (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-e2a1290f-45b4-4372-a047-df2537ce2336, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2) Dec 15 05:11:01 localhost podman[335124]: 2025-12-15 10:11:01.832369306 +0000 UTC m=+0.090555614 container remove 2286740e6174c88daf6e0f11f64bfbbf5430099ed035d5d12271ef5a915478be (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-e2a1290f-45b4-4372-a047-df2537ce2336, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3) Dec 15 05:11:01 localhost systemd[1]: libpod-conmon-2286740e6174c88daf6e0f11f64bfbbf5430099ed035d5d12271ef5a915478be.scope: Deactivated successfully. 
Dec 15 05:11:01 localhost podman[243449]: time="2025-12-15T10:11:01Z" level=info msg="List containers: received `last` parameter - overwriting `limit`" Dec 15 05:11:01 localhost podman[243449]: @ - - [15/Dec/2025:10:11:01 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 156640 "" "Go-http-client/1.1" Dec 15 05:11:01 localhost podman[243449]: @ - - [15/Dec/2025:10:11:01 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 19273 "" "Go-http-client/1.1" Dec 15 05:11:01 localhost systemd[1]: var-lib-containers-storage-overlay-0dd356bea474ad5e041e358c013579f37a0929a9b3ce78ef44f566b2d95acdd6-merged.mount: Deactivated successfully. Dec 15 05:11:01 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-2286740e6174c88daf6e0f11f64bfbbf5430099ed035d5d12271ef5a915478be-userdata-shm.mount: Deactivated successfully. Dec 15 05:11:02 localhost systemd[1]: run-netns-qdhcp\x2de2a1290f\x2d45b4\x2d4372\x2da047\x2ddf2537ce2336.mount: Deactivated successfully. 
Dec 15 05:11:02 localhost neutron_dhcp_agent[267542]: 2025-12-15 10:11:02.106 267546 INFO neutron.agent.dhcp.agent [-] Network not present, action: clean_devices, action_kwargs: {}#033[00m Dec 15 05:11:02 localhost neutron_dhcp_agent[267542]: 2025-12-15 10:11:02.338 267546 INFO neutron.agent.dhcp.agent [-] Network not present, action: clean_devices, action_kwargs: {}#033[00m Dec 15 05:11:02 localhost ovn_controller[154603]: 2025-12-15T10:11:02Z|00507|binding|INFO|Releasing lport b35254ad-12eb-47bb-92be-44fefe0694f0 from this chassis (sb_readonly=0) Dec 15 05:11:02 localhost nova_compute[286344]: 2025-12-15 10:11:02.675 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:11:03 localhost ceph-mon[298913]: mon.np0005559462@0(leader) e17 handle_command mon_command({"prefix": "auth get-or-create", "entity": "client.david", "caps": ["mds", "allow rw path=/volumes/_nogroup/92daff28-beb3-4518-862f-402cd16df894/708e195f-e0a6-454b-9af4-93f854d665e8", "osd", "allow rw pool=manila_data namespace=fsvolumens_92daff28-beb3-4518-862f-402cd16df894", "mon", "allow r"], "format": "json"} v 0) Dec 15 05:11:03 localhost ceph-mon[298913]: log_channel(audit) log [INF] : from='mgr.34411 ' entity='mgr.np0005559464.aomnqe' cmd={"prefix": "auth get-or-create", "entity": "client.david", "caps": ["mds", "allow rw path=/volumes/_nogroup/92daff28-beb3-4518-862f-402cd16df894/708e195f-e0a6-454b-9af4-93f854d665e8", "osd", "allow rw pool=manila_data namespace=fsvolumens_92daff28-beb3-4518-862f-402cd16df894", "mon", "allow r"], "format": "json"} : dispatch Dec 15 05:11:03 localhost ceph-mon[298913]: log_channel(audit) log [INF] : from='mgr.34411 ' entity='mgr.np0005559464.aomnqe' cmd='[{"prefix": "auth get-or-create", "entity": "client.david", "caps": ["mds", "allow rw path=/volumes/_nogroup/92daff28-beb3-4518-862f-402cd16df894/708e195f-e0a6-454b-9af4-93f854d665e8", "osd", "allow rw pool=manila_data 
namespace=fsvolumens_92daff28-beb3-4518-862f-402cd16df894", "mon", "allow r"], "format": "json"}]': finished Dec 15 05:11:04 localhost nova_compute[286344]: 2025-12-15 10:11:04.203 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:11:04 localhost ceph-mon[298913]: from='mgr.34411 172.18.0.108:0/929118679' entity='mgr.np0005559464.aomnqe' cmd={"prefix": "auth get", "entity": "client.david", "format": "json"} : dispatch Dec 15 05:11:04 localhost ceph-mon[298913]: from='mgr.34411 172.18.0.108:0/929118679' entity='mgr.np0005559464.aomnqe' cmd={"prefix": "auth get-or-create", "entity": "client.david", "caps": ["mds", "allow rw path=/volumes/_nogroup/92daff28-beb3-4518-862f-402cd16df894/708e195f-e0a6-454b-9af4-93f854d665e8", "osd", "allow rw pool=manila_data namespace=fsvolumens_92daff28-beb3-4518-862f-402cd16df894", "mon", "allow r"], "format": "json"} : dispatch Dec 15 05:11:04 localhost ceph-mon[298913]: from='mgr.34411 ' entity='mgr.np0005559464.aomnqe' cmd={"prefix": "auth get-or-create", "entity": "client.david", "caps": ["mds", "allow rw path=/volumes/_nogroup/92daff28-beb3-4518-862f-402cd16df894/708e195f-e0a6-454b-9af4-93f854d665e8", "osd", "allow rw pool=manila_data namespace=fsvolumens_92daff28-beb3-4518-862f-402cd16df894", "mon", "allow r"], "format": "json"} : dispatch Dec 15 05:11:04 localhost ceph-mon[298913]: from='mgr.34411 ' entity='mgr.np0005559464.aomnqe' cmd='[{"prefix": "auth get-or-create", "entity": "client.david", "caps": ["mds", "allow rw path=/volumes/_nogroup/92daff28-beb3-4518-862f-402cd16df894/708e195f-e0a6-454b-9af4-93f854d665e8", "osd", "allow rw pool=manila_data namespace=fsvolumens_92daff28-beb3-4518-862f-402cd16df894", "mon", "allow r"], "format": "json"}]': finished Dec 15 05:11:04 localhost openstack_network_exporter[246484]: ERROR 10:11:04 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd Dec 
15 05:11:04 localhost openstack_network_exporter[246484]: ERROR 10:11:04 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd Dec 15 05:11:04 localhost openstack_network_exporter[246484]: ERROR 10:11:04 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server Dec 15 05:11:04 localhost openstack_network_exporter[246484]: ERROR 10:11:04 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath Dec 15 05:11:04 localhost openstack_network_exporter[246484]: Dec 15 05:11:04 localhost openstack_network_exporter[246484]: ERROR 10:11:04 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath Dec 15 05:11:04 localhost openstack_network_exporter[246484]: Dec 15 05:11:05 localhost ceph-mon[298913]: mon.np0005559462@0(leader) e17 handle_command mon_command({"prefix":"df", "format":"json"} v 0) Dec 15 05:11:05 localhost ceph-mon[298913]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/2637230056' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch Dec 15 05:11:05 localhost ceph-mon[298913]: mon.np0005559462@0(leader) e17 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0) Dec 15 05:11:05 localhost ceph-mon[298913]: log_channel(audit) log [DBG] : from='client.? 
172.18.0.32:0/2637230056' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch Dec 15 05:11:06 localhost ceph-mon[298913]: mon.np0005559462@0(leader).osd e228 do_prune osdmap full prune enabled Dec 15 05:11:06 localhost ceph-mon[298913]: mon.np0005559462@0(leader).osd e229 e229: 6 total, 6 up, 6 in Dec 15 05:11:06 localhost nova_compute[286344]: 2025-12-15 10:11:06.489 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:11:06 localhost ceph-mon[298913]: log_channel(cluster) log [DBG] : osdmap e229: 6 total, 6 up, 6 in Dec 15 05:11:06 localhost ceph-mon[298913]: log_channel(cluster) log [DBG] : mgrmap e50: np0005559464.aomnqe(active, since 13m), standbys: np0005559462.fudvyx, np0005559463.daptkf Dec 15 05:11:06 localhost ceph-mon[298913]: mon.np0005559462@0(leader).osd e229 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104 Dec 15 05:11:06 localhost ceph-mon[298913]: mon.np0005559462@0(leader).osd e229 do_prune osdmap full prune enabled Dec 15 05:11:06 localhost ceph-mon[298913]: mon.np0005559462@0(leader).osd e230 e230: 6 total, 6 up, 6 in Dec 15 05:11:06 localhost ceph-mon[298913]: log_channel(cluster) log [DBG] : osdmap e230: 6 total, 6 up, 6 in Dec 15 05:11:07 localhost ceph-mon[298913]: mon.np0005559462@0(leader) e17 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) Dec 15 05:11:07 localhost ceph-mon[298913]: log_channel(audit) log [DBG] : from='client.25625 172.18.0.34:0/382777224' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch Dec 15 05:11:08 localhost ovn_metadata_agent[160585]: 2025-12-15 10:11:08.767 160590 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to 
row=SB_Global(external_ids={}, nb_cfg=19, ssl=[], options={'arp_ns_explicit_output': 'true', 'fdb_removal_limit': '0', 'ignore_lsp_down': 'false', 'mac_binding_removal_limit': '0', 'mac_prefix': 'fe:17:e3', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': 'fe:55:2b:86:15:b5'}, ipsec=False) old=SB_Global(nb_cfg=18) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Dec 15 05:11:08 localhost ovn_metadata_agent[160585]: 2025-12-15 10:11:08.768 160590 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 9 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m Dec 15 05:11:08 localhost nova_compute[286344]: 2025-12-15 10:11:08.801 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:11:09 localhost nova_compute[286344]: 2025-12-15 10:11:09.205 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:11:09 localhost ceph-mon[298913]: mon.np0005559462@0(leader) e17 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/osd_remove_queue}] v 0) Dec 15 05:11:09 localhost ceph-mon[298913]: log_channel(audit) log [INF] : from='mgr.34411 ' entity='mgr.np0005559464.aomnqe' Dec 15 05:11:09 localhost ceph-mon[298913]: from='mgr.34411 172.18.0.108:0/929118679' entity='mgr.np0005559464.aomnqe' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch Dec 15 05:11:09 localhost ceph-mon[298913]: from='mgr.34411 ' entity='mgr.np0005559464.aomnqe' Dec 15 05:11:09 localhost ceph-mon[298913]: mon.np0005559462@0(leader) e17 handle_command mon_command([{prefix=config-key set, key=mgr/progress/completed}] v 0) Dec 15 05:11:09 localhost ceph-mon[298913]: log_channel(audit) log [INF] : from='mgr.34411 ' entity='mgr.np0005559464.aomnqe' Dec 15 
05:11:10 localhost ceph-mon[298913]: from='mgr.34411 ' entity='mgr.np0005559464.aomnqe' Dec 15 05:11:10 localhost ceph-mon[298913]: from='mgr.34411 172.18.0.108:0/929118679' entity='mgr.np0005559464.aomnqe' cmd={"prefix": "auth get", "entity": "client.david", "format": "json"} : dispatch Dec 15 05:11:11 localhost nova_compute[286344]: 2025-12-15 10:11:11.491 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:11:11 localhost ceph-mon[298913]: mon.np0005559462@0(leader).osd e230 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104 Dec 15 05:11:13 localhost neutron_dhcp_agent[267542]: 2025-12-15 10:11:13.092 267546 INFO neutron.agent.linux.ip_lib [None req-75266d09-7b71-49ee-b27f-718651bed3d2 - - - - - -] Device tapd0a827a2-ba cannot be used as it has no MAC address#033[00m Dec 15 05:11:13 localhost nova_compute[286344]: 2025-12-15 10:11:13.123 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:11:13 localhost kernel: device tapd0a827a2-ba entered promiscuous mode Dec 15 05:11:13 localhost NetworkManager[5963]: [1765793473.1315] manager: (tapd0a827a2-ba): new Generic device (/org/freedesktop/NetworkManager/Devices/78) Dec 15 05:11:13 localhost ovn_controller[154603]: 2025-12-15T10:11:13Z|00508|binding|INFO|Claiming lport d0a827a2-ba8d-4ff8-b4de-16db92caf4c1 for this chassis. 
Dec 15 05:11:13 localhost ovn_controller[154603]: 2025-12-15T10:11:13Z|00509|binding|INFO|d0a827a2-ba8d-4ff8-b4de-16db92caf4c1: Claiming unknown Dec 15 05:11:13 localhost nova_compute[286344]: 2025-12-15 10:11:13.135 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:11:13 localhost systemd-udevd[335246]: Network interface NamePolicy= disabled on kernel command line. Dec 15 05:11:13 localhost ovn_metadata_agent[160585]: 2025-12-15 10:11:13.145 160590 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005559462.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.3/28', 'neutron:device_id': 'dhcpce287b4b-a043-5ed9-8e08-4fd555d768a2-b35c3a3c-c123-4757-af17-fe86d0729b58', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-b35c3a3c-c123-4757-af17-fe86d0729b58', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '229ca1deba244d0780350d1a77507ad1', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=7289809c-b4ba-4daa-ad97-4de4c8e0e4c3, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=d0a827a2-ba8d-4ff8-b4de-16db92caf4c1) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Dec 15 05:11:13 localhost ovn_metadata_agent[160585]: 2025-12-15 10:11:13.147 160590 INFO 
neutron.agent.ovn.metadata.agent [-] Port d0a827a2-ba8d-4ff8-b4de-16db92caf4c1 in datapath b35c3a3c-c123-4757-af17-fe86d0729b58 bound to our chassis#033[00m Dec 15 05:11:13 localhost ovn_metadata_agent[160585]: 2025-12-15 10:11:13.149 160590 DEBUG neutron.agent.ovn.metadata.agent [-] Port 57f5c7c5-1761-46f5-8f70-fdabdd0c159c IP addresses were not retrieved from the Port_Binding MAC column ['unknown'] _get_port_ips /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:536#033[00m Dec 15 05:11:13 localhost ovn_metadata_agent[160585]: 2025-12-15 10:11:13.149 160590 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network b35c3a3c-c123-4757-af17-fe86d0729b58, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m Dec 15 05:11:13 localhost ovn_metadata_agent[160585]: 2025-12-15 10:11:13.150 160858 DEBUG oslo.privsep.daemon [-] privsep: reply[0db4b292-1325-4607-8b65-db4ffc4bcad2]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Dec 15 05:11:13 localhost journal[231322]: ethtool ioctl error on tapd0a827a2-ba: No such device Dec 15 05:11:13 localhost ovn_controller[154603]: 2025-12-15T10:11:13Z|00510|binding|INFO|Setting lport d0a827a2-ba8d-4ff8-b4de-16db92caf4c1 ovn-installed in OVS Dec 15 05:11:13 localhost journal[231322]: ethtool ioctl error on tapd0a827a2-ba: No such device Dec 15 05:11:13 localhost ovn_controller[154603]: 2025-12-15T10:11:13Z|00511|binding|INFO|Setting lport d0a827a2-ba8d-4ff8-b4de-16db92caf4c1 up in Southbound Dec 15 05:11:13 localhost nova_compute[286344]: 2025-12-15 10:11:13.172 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:11:13 localhost journal[231322]: ethtool ioctl error on tapd0a827a2-ba: No such device Dec 15 05:11:13 localhost journal[231322]: ethtool ioctl error on 
tapd0a827a2-ba: No such device Dec 15 05:11:13 localhost journal[231322]: ethtool ioctl error on tapd0a827a2-ba: No such device Dec 15 05:11:13 localhost journal[231322]: ethtool ioctl error on tapd0a827a2-ba: No such device Dec 15 05:11:13 localhost journal[231322]: ethtool ioctl error on tapd0a827a2-ba: No such device Dec 15 05:11:13 localhost journal[231322]: ethtool ioctl error on tapd0a827a2-ba: No such device Dec 15 05:11:13 localhost nova_compute[286344]: 2025-12-15 10:11:13.214 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:11:13 localhost nova_compute[286344]: 2025-12-15 10:11:13.242 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:11:13 localhost neutron_dhcp_agent[267542]: 2025-12-15 10:11:13.810 267546 INFO neutron.agent.linux.ip_lib [None req-e44d0428-de8a-4eef-a443-f3821060d9d5 - - - - - -] Device tap8f119cd4-75 cannot be used as it has no MAC address#033[00m Dec 15 05:11:13 localhost nova_compute[286344]: 2025-12-15 10:11:13.836 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:11:13 localhost kernel: device tap8f119cd4-75 entered promiscuous mode Dec 15 05:11:13 localhost NetworkManager[5963]: [1765793473.8417] manager: (tap8f119cd4-75): new Generic device (/org/freedesktop/NetworkManager/Devices/79) Dec 15 05:11:13 localhost systemd-udevd[335248]: Network interface NamePolicy= disabled on kernel command line. Dec 15 05:11:13 localhost ovn_controller[154603]: 2025-12-15T10:11:13Z|00512|binding|INFO|Claiming lport 8f119cd4-7502-4b8d-844e-501737be70eb for this chassis. 
Dec 15 05:11:13 localhost ovn_controller[154603]: 2025-12-15T10:11:13Z|00513|binding|INFO|8f119cd4-7502-4b8d-844e-501737be70eb: Claiming unknown Dec 15 05:11:13 localhost nova_compute[286344]: 2025-12-15 10:11:13.851 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:11:13 localhost ovn_metadata_agent[160585]: 2025-12-15 10:11:13.864 160590 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005559462.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8::2/64', 'neutron:device_id': 'dhcpce287b4b-a043-5ed9-8e08-4fd555d768a2-d02f1047-156b-4b4a-8a9e-ac162d53b0c5', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-d02f1047-156b-4b4a-8a9e-ac162d53b0c5', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'dc12e715dc244046a7e657a59c77bc04', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=9e4442e5-b140-463f-8da5-8bd120ed1d9b, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=8f119cd4-7502-4b8d-844e-501737be70eb) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Dec 15 05:11:13 localhost ovn_metadata_agent[160585]: 2025-12-15 10:11:13.866 160590 INFO neutron.agent.ovn.metadata.agent [-] Port 8f119cd4-7502-4b8d-844e-501737be70eb in datapath 
d02f1047-156b-4b4a-8a9e-ac162d53b0c5 bound to our chassis#033[00m Dec 15 05:11:13 localhost ovn_metadata_agent[160585]: 2025-12-15 10:11:13.868 160590 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network d02f1047-156b-4b4a-8a9e-ac162d53b0c5 or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599#033[00m Dec 15 05:11:13 localhost ovn_metadata_agent[160585]: 2025-12-15 10:11:13.869 160858 DEBUG oslo.privsep.daemon [-] privsep: reply[995756f6-961d-4bb9-ad21-7f8ce2987fe9]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Dec 15 05:11:13 localhost ovn_controller[154603]: 2025-12-15T10:11:13Z|00514|binding|INFO|Setting lport 8f119cd4-7502-4b8d-844e-501737be70eb ovn-installed in OVS Dec 15 05:11:13 localhost ovn_controller[154603]: 2025-12-15T10:11:13Z|00515|binding|INFO|Setting lport 8f119cd4-7502-4b8d-844e-501737be70eb up in Southbound Dec 15 05:11:13 localhost nova_compute[286344]: 2025-12-15 10:11:13.884 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:11:13 localhost nova_compute[286344]: 2025-12-15 10:11:13.932 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:11:14 localhost nova_compute[286344]: 2025-12-15 10:11:14.009 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:11:14 localhost podman[335335]: Dec 15 05:11:14 localhost podman[335335]: 2025-12-15 10:11:14.162970362 +0000 UTC m=+0.088433555 container create a28b5bd276c88258da127610008fefed786d6a74d721e8f5acfc18f6893594cf (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, 
name=neutron-dnsmasq-qdhcp-b35c3a3c-c123-4757-af17-fe86d0729b58, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS) Dec 15 05:11:14 localhost nova_compute[286344]: 2025-12-15 10:11:14.209 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:11:14 localhost podman[335335]: 2025-12-15 10:11:14.117816304 +0000 UTC m=+0.043279527 image pull quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified Dec 15 05:11:14 localhost systemd[1]: Started libpod-conmon-a28b5bd276c88258da127610008fefed786d6a74d721e8f5acfc18f6893594cf.scope. Dec 15 05:11:14 localhost systemd[1]: Started libcrun container. Dec 15 05:11:14 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/020a0d1b7a1fb883a3a27d891797524553ec3881d89717b22ab1b04e5d7c5df8/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff) Dec 15 05:11:14 localhost podman[335335]: 2025-12-15 10:11:14.255835417 +0000 UTC m=+0.181298610 container init a28b5bd276c88258da127610008fefed786d6a74d721e8f5acfc18f6893594cf (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-b35c3a3c-c123-4757-af17-fe86d0729b58, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251202) Dec 15 05:11:14 localhost podman[335335]: 2025-12-15 10:11:14.270100666 +0000 UTC m=+0.195563839 container 
start a28b5bd276c88258da127610008fefed786d6a74d721e8f5acfc18f6893594cf (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-b35c3a3c-c123-4757-af17-fe86d0729b58, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image) Dec 15 05:11:14 localhost dnsmasq[335364]: started, version 2.85 cachesize 150 Dec 15 05:11:14 localhost dnsmasq[335364]: DNS service limited to local subnets Dec 15 05:11:14 localhost dnsmasq[335364]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile Dec 15 05:11:14 localhost dnsmasq[335364]: warning: no upstream servers configured Dec 15 05:11:14 localhost dnsmasq-dhcp[335364]: DHCP, static leases only on 10.100.0.0, lease time 1d Dec 15 05:11:14 localhost dnsmasq[335364]: read /var/lib/neutron/dhcp/b35c3a3c-c123-4757-af17-fe86d0729b58/addn_hosts - 0 addresses Dec 15 05:11:14 localhost dnsmasq-dhcp[335364]: read /var/lib/neutron/dhcp/b35c3a3c-c123-4757-af17-fe86d0729b58/host Dec 15 05:11:14 localhost dnsmasq-dhcp[335364]: read /var/lib/neutron/dhcp/b35c3a3c-c123-4757-af17-fe86d0729b58/opts Dec 15 05:11:14 localhost neutron_dhcp_agent[267542]: 2025-12-15 10:11:14.451 267546 INFO neutron.agent.dhcp.agent [None req-ee4a7a14-868e-4bca-8385-5f0950532485 - - - - - -] DHCP configuration for ports {'c4a56896-12b4-4dc0-a4db-01613a7f19e8'} is completed#033[00m Dec 15 05:11:14 localhost nova_compute[286344]: 2025-12-15 10:11:14.697 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:11:14 localhost podman[335395]: Dec 15 
05:11:14 localhost podman[335395]: 2025-12-15 10:11:14.84728452 +0000 UTC m=+0.095419426 container create 2dc8e840c5c54c5f155f2909bbddf42501cda27d51bd9ec50074e909e636c8dc (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-d02f1047-156b-4b4a-8a9e-ac162d53b0c5, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team) Dec 15 05:11:14 localhost systemd[1]: Started libpod-conmon-2dc8e840c5c54c5f155f2909bbddf42501cda27d51bd9ec50074e909e636c8dc.scope. Dec 15 05:11:14 localhost systemd[1]: Started libcrun container. Dec 15 05:11:14 localhost podman[335395]: 2025-12-15 10:11:14.795603444 +0000 UTC m=+0.043738420 image pull quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified Dec 15 05:11:14 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/aff0820cf9c1a3439b905ccaace0c6ba1f7e7fcae7da0c2f86e8b74fab193ba6/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff) Dec 15 05:11:14 localhost podman[335395]: 2025-12-15 10:11:14.909180473 +0000 UTC m=+0.157315379 container init 2dc8e840c5c54c5f155f2909bbddf42501cda27d51bd9ec50074e909e636c8dc (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-d02f1047-156b-4b4a-8a9e-ac162d53b0c5, tcib_managed=true, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.license=GPLv2) Dec 15 05:11:14 localhost podman[335395]: 2025-12-15 
10:11:14.917334204 +0000 UTC m=+0.165469120 container start 2dc8e840c5c54c5f155f2909bbddf42501cda27d51bd9ec50074e909e636c8dc (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-d02f1047-156b-4b4a-8a9e-ac162d53b0c5, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0) Dec 15 05:11:14 localhost dnsmasq[335413]: started, version 2.85 cachesize 150 Dec 15 05:11:14 localhost dnsmasq[335413]: DNS service limited to local subnets Dec 15 05:11:14 localhost dnsmasq[335413]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile Dec 15 05:11:14 localhost dnsmasq[335413]: warning: no upstream servers configured Dec 15 05:11:14 localhost dnsmasq-dhcp[335413]: DHCPv6, static leases only on 2001:db8::, lease time 1d Dec 15 05:11:14 localhost dnsmasq[335413]: read /var/lib/neutron/dhcp/d02f1047-156b-4b4a-8a9e-ac162d53b0c5/addn_hosts - 0 addresses Dec 15 05:11:14 localhost dnsmasq-dhcp[335413]: read /var/lib/neutron/dhcp/d02f1047-156b-4b4a-8a9e-ac162d53b0c5/host Dec 15 05:11:14 localhost dnsmasq-dhcp[335413]: read /var/lib/neutron/dhcp/d02f1047-156b-4b4a-8a9e-ac162d53b0c5/opts Dec 15 05:11:15 localhost neutron_dhcp_agent[267542]: 2025-12-15 10:11:15.125 267546 INFO neutron.agent.dhcp.agent [None req-74763bf0-cfcf-4238-9e69-3a5dbceaa955 - - - - - -] DHCP configuration for ports {'00f22398-dedc-402f-aae8-5551a1bd05b9'} is completed#033[00m Dec 15 05:11:15 localhost systemd[1]: tmp-crun.vxBbFX.mount: Deactivated successfully. 
Dec 15 05:11:15 localhost neutron_dhcp_agent[267542]: 2025-12-15 10:11:15.497 267546 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-12-15T10:11:15Z, description=, device_id=145c35a3-f902-4a3f-af68-dd6784659097, device_owner=network:router_interface, dns_assignment=[], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[], id=a6f21308-0063-4847-8516-c1253dab9575, ip_allocation=immediate, mac_address=fa:16:3e:c5:fc:7d, name=, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-12-15T10:11:09Z, description=, dns_domain=, id=b35c3a3c-c123-4757-af17-fe86d0729b58, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-VolumesBackupsTest-595279419-network, port_security_enabled=True, project_id=229ca1deba244d0780350d1a77507ad1, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=28427, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=3447, status=ACTIVE, subnets=['7328b3dd-1cd3-4845-8896-6081e3046fb3'], tags=[], tenant_id=229ca1deba244d0780350d1a77507ad1, updated_at=2025-12-15T10:11:10Z, vlan_transparent=None, network_id=b35c3a3c-c123-4757-af17-fe86d0729b58, port_security_enabled=False, project_id=229ca1deba244d0780350d1a77507ad1, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=[], standard_attr_id=3483, status=DOWN, tags=[], tenant_id=229ca1deba244d0780350d1a77507ad1, updated_at=2025-12-15T10:11:15Z on network b35c3a3c-c123-4757-af17-fe86d0729b58#033[00m Dec 15 05:11:15 localhost podman[335430]: 2025-12-15 10:11:15.73437877 +0000 UTC m=+0.053224148 container kill a28b5bd276c88258da127610008fefed786d6a74d721e8f5acfc18f6893594cf 
(image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-b35c3a3c-c123-4757-af17-fe86d0729b58, tcib_managed=true, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS) Dec 15 05:11:15 localhost dnsmasq[335364]: read /var/lib/neutron/dhcp/b35c3a3c-c123-4757-af17-fe86d0729b58/addn_hosts - 1 addresses Dec 15 05:11:15 localhost dnsmasq-dhcp[335364]: read /var/lib/neutron/dhcp/b35c3a3c-c123-4757-af17-fe86d0729b58/host Dec 15 05:11:15 localhost dnsmasq-dhcp[335364]: read /var/lib/neutron/dhcp/b35c3a3c-c123-4757-af17-fe86d0729b58/opts Dec 15 05:11:16 localhost neutron_dhcp_agent[267542]: 2025-12-15 10:11:16.023 267546 INFO neutron.agent.dhcp.agent [None req-ce02f385-a9f4-4fc6-97ff-4c2182f86a8e - - - - - -] DHCP configuration for ports {'a6f21308-0063-4847-8516-c1253dab9575'} is completed#033[00m Dec 15 05:11:16 localhost nova_compute[286344]: 2025-12-15 10:11:16.292 286348 DEBUG oslo_service.periodic_task [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Dec 15 05:11:16 localhost nova_compute[286344]: 2025-12-15 10:11:16.493 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:11:16 localhost ceph-mon[298913]: mon.np0005559462@0(leader).osd e230 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104 Dec 15 05:11:16 localhost ceph-mon[298913]: mon.np0005559462@0(leader).osd e230 do_prune osdmap full prune enabled Dec 15 05:11:16 localhost ceph-mon[298913]: 
mon.np0005559462@0(leader).osd e231 e231: 6 total, 6 up, 6 in Dec 15 05:11:16 localhost ceph-mon[298913]: log_channel(cluster) log [DBG] : osdmap e231: 6 total, 6 up, 6 in Dec 15 05:11:16 localhost neutron_dhcp_agent[267542]: 2025-12-15 10:11:16.689 267546 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-12-15T10:11:15Z, description=, device_id=145c35a3-f902-4a3f-af68-dd6784659097, device_owner=network:router_interface, dns_assignment=[], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[], id=a6f21308-0063-4847-8516-c1253dab9575, ip_allocation=immediate, mac_address=fa:16:3e:c5:fc:7d, name=, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-12-15T10:11:09Z, description=, dns_domain=, id=b35c3a3c-c123-4757-af17-fe86d0729b58, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-VolumesBackupsTest-595279419-network, port_security_enabled=True, project_id=229ca1deba244d0780350d1a77507ad1, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=28427, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=3447, status=ACTIVE, subnets=['7328b3dd-1cd3-4845-8896-6081e3046fb3'], tags=[], tenant_id=229ca1deba244d0780350d1a77507ad1, updated_at=2025-12-15T10:11:10Z, vlan_transparent=None, network_id=b35c3a3c-c123-4757-af17-fe86d0729b58, port_security_enabled=False, project_id=229ca1deba244d0780350d1a77507ad1, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=[], standard_attr_id=3483, status=DOWN, tags=[], tenant_id=229ca1deba244d0780350d1a77507ad1, updated_at=2025-12-15T10:11:15Z on network b35c3a3c-c123-4757-af17-fe86d0729b58#033[00m Dec 15 05:11:16 localhost 
podman[335469]: 2025-12-15 10:11:16.915810594 +0000 UTC m=+0.063737294 container kill a28b5bd276c88258da127610008fefed786d6a74d721e8f5acfc18f6893594cf (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-b35c3a3c-c123-4757-af17-fe86d0729b58, org.label-schema.license=GPLv2, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3) Dec 15 05:11:16 localhost dnsmasq[335364]: read /var/lib/neutron/dhcp/b35c3a3c-c123-4757-af17-fe86d0729b58/addn_hosts - 1 addresses Dec 15 05:11:16 localhost dnsmasq-dhcp[335364]: read /var/lib/neutron/dhcp/b35c3a3c-c123-4757-af17-fe86d0729b58/host Dec 15 05:11:16 localhost dnsmasq-dhcp[335364]: read /var/lib/neutron/dhcp/b35c3a3c-c123-4757-af17-fe86d0729b58/opts Dec 15 05:11:17 localhost neutron_dhcp_agent[267542]: 2025-12-15 10:11:17.175 267546 INFO neutron.agent.dhcp.agent [None req-c9eacb7f-80e4-472b-9549-dc29986d44b9 - - - - - -] DHCP configuration for ports {'a6f21308-0063-4847-8516-c1253dab9575'} is completed#033[00m Dec 15 05:11:17 localhost ceph-mon[298913]: mon.np0005559462@0(leader) e17 handle_command mon_command({"prefix": "auth rm", "entity": "client.david"} v 0) Dec 15 05:11:17 localhost ceph-mon[298913]: log_channel(audit) log [INF] : from='mgr.34411 ' entity='mgr.np0005559464.aomnqe' cmd={"prefix": "auth rm", "entity": "client.david"} : dispatch Dec 15 05:11:17 localhost ceph-mon[298913]: log_channel(audit) log [INF] : from='mgr.34411 ' entity='mgr.np0005559464.aomnqe' cmd='[{"prefix": "auth rm", "entity": "client.david"}]': finished Dec 15 05:11:17 localhost nova_compute[286344]: 2025-12-15 10:11:17.270 286348 DEBUG oslo_service.periodic_task [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Running periodic task 
ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Dec 15 05:11:17 localhost nova_compute[286344]: 2025-12-15 10:11:17.271 286348 DEBUG nova.compute.manager [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m Dec 15 05:11:17 localhost nova_compute[286344]: 2025-12-15 10:11:17.271 286348 DEBUG nova.compute.manager [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m Dec 15 05:11:17 localhost nova_compute[286344]: 2025-12-15 10:11:17.349 286348 DEBUG oslo_concurrency.lockutils [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Acquiring lock "refresh_cache-39ff1bd9-6f6b-44c8-bbec-a1fd9d196359" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m Dec 15 05:11:17 localhost nova_compute[286344]: 2025-12-15 10:11:17.350 286348 DEBUG oslo_concurrency.lockutils [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Acquired lock "refresh_cache-39ff1bd9-6f6b-44c8-bbec-a1fd9d196359" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m Dec 15 05:11:17 localhost nova_compute[286344]: 2025-12-15 10:11:17.351 286348 DEBUG nova.network.neutron [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] [instance: 39ff1bd9-6f6b-44c8-bbec-a1fd9d196359] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m Dec 15 05:11:17 localhost nova_compute[286344]: 2025-12-15 10:11:17.351 286348 DEBUG nova.objects.instance [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Lazy-loading 'info_cache' on Instance uuid 39ff1bd9-6f6b-44c8-bbec-a1fd9d196359 obj_load_attr 
/usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m Dec 15 05:11:17 localhost ovn_metadata_agent[160585]: 2025-12-15 10:11:17.770 160590 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=12d96d64-e862-4f68-81e5-8d9ec5d3a5e2, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '19'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m Dec 15 05:11:18 localhost nova_compute[286344]: 2025-12-15 10:11:18.030 286348 DEBUG nova.network.neutron [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] [instance: 39ff1bd9-6f6b-44c8-bbec-a1fd9d196359] Updating instance_info_cache with network_info: [{"id": "03ef8889-3216-43fb-8a52-4be17a956ce1", "address": "fa:16:3e:74:df:7c", "network": {"id": "befb7a72-17a9-4bcb-b561-84b8f626685a", "bridge": "br-int", "label": "private", "subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.201", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.20", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.0.3"}}], "meta": {"injected": false, "tenant_id": "c785bf23f53946bc99867d8832a50266", "mtu": 1292, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap03ef8889-32", "ovs_interfaceid": "03ef8889-3216-43fb-8a52-4be17a956ce1", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m Dec 15 05:11:18 localhost 
nova_compute[286344]: 2025-12-15 10:11:18.046 286348 DEBUG oslo_concurrency.lockutils [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Releasing lock "refresh_cache-39ff1bd9-6f6b-44c8-bbec-a1fd9d196359" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m Dec 15 05:11:18 localhost nova_compute[286344]: 2025-12-15 10:11:18.047 286348 DEBUG nova.compute.manager [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] [instance: 39ff1bd9-6f6b-44c8-bbec-a1fd9d196359] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m Dec 15 05:11:18 localhost nova_compute[286344]: 2025-12-15 10:11:18.048 286348 DEBUG oslo_service.periodic_task [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Dec 15 05:11:18 localhost ceph-mon[298913]: from='mgr.34411 172.18.0.108:0/929118679' entity='mgr.np0005559464.aomnqe' cmd={"prefix": "auth get", "entity": "client.david", "format": "json"} : dispatch Dec 15 05:11:18 localhost ceph-mon[298913]: from='mgr.34411 172.18.0.108:0/929118679' entity='mgr.np0005559464.aomnqe' cmd={"prefix": "auth rm", "entity": "client.david"} : dispatch Dec 15 05:11:18 localhost ceph-mon[298913]: from='mgr.34411 ' entity='mgr.np0005559464.aomnqe' cmd={"prefix": "auth rm", "entity": "client.david"} : dispatch Dec 15 05:11:18 localhost ceph-mon[298913]: from='mgr.34411 ' entity='mgr.np0005559464.aomnqe' cmd='[{"prefix": "auth rm", "entity": "client.david"}]': finished Dec 15 05:11:18 localhost nova_compute[286344]: 2025-12-15 10:11:18.270 286348 DEBUG oslo_service.periodic_task [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m 
Dec 15 05:11:19 localhost nova_compute[286344]: 2025-12-15 10:11:19.238 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:11:20 localhost nova_compute[286344]: 2025-12-15 10:11:20.266 286348 DEBUG oslo_service.periodic_task [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Dec 15 05:11:20 localhost systemd[1]: Started /usr/bin/podman healthcheck run 67ee72fcd99c896f79e8807764b1d0f04e7d45927d06e306c0986394e8ed42a0. Dec 15 05:11:20 localhost systemd[1]: Started /usr/bin/podman healthcheck run 730c50020a7d9e2da21d2462d40af20ac303e43f3491f692cbbd87f432ba3e09. Dec 15 05:11:20 localhost systemd[1]: Started /usr/bin/podman healthcheck run 9f0f481c937606298203797a71924b7614a5d9e3f4a8afd837cfa5b9602aedeb. Dec 15 05:11:20 localhost systemd[1]: Started /usr/bin/podman healthcheck run ac2eae30a2a7f971398af42ea67b3ed799d4b3341f7688ebcd73f7ca7f66c2a8. Dec 15 05:11:20 localhost systemd[1]: Started /usr/bin/podman healthcheck run b3d8483ade539500e228c2af9c0c3e647febb5ec7903610a83aea32631007d4a. 
Dec 15 05:11:20 localhost podman[335504]: 2025-12-15 10:11:20.804083499 +0000 UTC m=+0.104352858 container health_status b3d8483ade539500e228c2af9c0c3e647febb5ec7903610a83aea32631007d4a (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_id=ceilometer_agent_compute, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '07f3af15d79276418f54a8dde2fc251064527c3571430e14af77aaa2c01f1193-cb1e9478634a00c8fb5ad9cd7fcd38c7b2a1a85187dfaa618bc715c24c2b15fb'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, 
org.label-schema.license=GPLv2) Dec 15 05:11:20 localhost podman[335504]: 2025-12-15 10:11:20.809941988 +0000 UTC m=+0.110211347 container exec_died b3d8483ade539500e228c2af9c0c3e647febb5ec7903610a83aea32631007d4a (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '07f3af15d79276418f54a8dde2fc251064527c3571430e14af77aaa2c01f1193-cb1e9478634a00c8fb5ad9cd7fcd38c7b2a1a85187dfaa618bc715c24c2b15fb'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=ceilometer_agent_compute, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, container_name=ceilometer_agent_compute, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, 
tcib_managed=true, managed_by=edpm_ansible) Dec 15 05:11:20 localhost systemd[1]: b3d8483ade539500e228c2af9c0c3e647febb5ec7903610a83aea32631007d4a.service: Deactivated successfully. Dec 15 05:11:20 localhost podman[335495]: 2025-12-15 10:11:20.888095753 +0000 UTC m=+0.192707561 container health_status ac2eae30a2a7f971398af42ea67b3ed799d4b3341f7688ebcd73f7ca7f66c2a8 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, container_name=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '07f3af15d79276418f54a8dde2fc251064527c3571430e14af77aaa2c01f1193'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, tcib_managed=true, config_id=ovn_controller) Dec 15 05:11:20 localhost podman[335495]: 2025-12-15 10:11:20.931559574 +0000 UTC m=+0.236171382 container exec_died ac2eae30a2a7f971398af42ea67b3ed799d4b3341f7688ebcd73f7ca7f66c2a8 
(image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, managed_by=edpm_ansible, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, container_name=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_controller, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '07f3af15d79276418f54a8dde2fc251064527c3571430e14af77aaa2c01f1193'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}) Dec 15 05:11:20 localhost systemd[1]: ac2eae30a2a7f971398af42ea67b3ed799d4b3341f7688ebcd73f7ca7f66c2a8.service: Deactivated successfully. 
Dec 15 05:11:20 localhost podman[335490]: 2025-12-15 10:11:20.982772018 +0000 UTC m=+0.298366245 container health_status 67ee72fcd99c896f79e8807764b1d0f04e7d45927d06e306c0986394e8ed42a0 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, maintainer=The Prometheus Authors , managed_by=edpm_ansible, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter) Dec 15 05:11:20 localhost podman[335491]: 2025-12-15 10:11:20.939519421 +0000 UTC m=+0.251606872 container health_status 730c50020a7d9e2da21d2462d40af20ac303e43f3491f692cbbd87f432ba3e09 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, config_id=openstack_network_exporter, managed_by=edpm_ansible, 
architecture=x86_64, vcs-type=git, version=9.6, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'cb1e9478634a00c8fb5ad9cd7fcd38c7b2a1a85187dfaa618bc715c24c2b15fb'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, build-date=2025-08-20T13:12:41, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., url=https://catalog.redhat.com/en/search?searchType=containers, distribution-scope=public, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.tags=minimal rhel9, container_name=openstack_network_exporter, io.buildah.version=1.33.7, com.redhat.component=ubi9-minimal-container, io.openshift.expose-services=, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=ubi9-minimal, release=1755695350, maintainer=Red Hat, Inc.) Dec 15 05:11:21 localhost podman[335491]: 2025-12-15 10:11:21.020470302 +0000 UTC m=+0.332557743 container exec_died 730c50020a7d9e2da21d2462d40af20ac303e43f3491f692cbbd87f432ba3e09 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., build-date=2025-08-20T13:12:41, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, managed_by=edpm_ansible, io.buildah.version=1.33.7, vendor=Red Hat, Inc., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, architecture=x86_64, io.openshift.expose-services=, release=1755695350, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'cb1e9478634a00c8fb5ad9cd7fcd38c7b2a1a85187dfaa618bc715c24c2b15fb'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, config_id=openstack_network_exporter, version=9.6, maintainer=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.tags=minimal rhel9, distribution-scope=public, vcs-type=git, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., name=ubi9-minimal, container_name=openstack_network_exporter, com.redhat.component=ubi9-minimal-container) Dec 15 05:11:21 localhost systemd[1]: 730c50020a7d9e2da21d2462d40af20ac303e43f3491f692cbbd87f432ba3e09.service: Deactivated successfully. Dec 15 05:11:21 localhost podman[335490]: 2025-12-15 10:11:21.065847806 +0000 UTC m=+0.381442003 container exec_died 67ee72fcd99c896f79e8807764b1d0f04e7d45927d06e306c0986394e8ed42a0 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}) Dec 15 05:11:21 localhost systemd[1]: 67ee72fcd99c896f79e8807764b1d0f04e7d45927d06e306c0986394e8ed42a0.service: Deactivated successfully. 
Dec 15 05:11:21 localhost podman[335492]: 2025-12-15 10:11:21.152866512 +0000 UTC m=+0.459726021 container health_status 9f0f481c937606298203797a71924b7614a5d9e3f4a8afd837cfa5b9602aedeb (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, container_name=multipathd, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, tcib_managed=true, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb) Dec 15 05:11:21 localhost podman[335492]: 2025-12-15 10:11:21.16639141 +0000 UTC m=+0.473250909 container exec_died 
9f0f481c937606298203797a71924b7614a5d9e3f4a8afd837cfa5b9602aedeb (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, maintainer=OpenStack Kubernetes Operator team, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_id=multipathd, org.label-schema.license=GPLv2) Dec 15 05:11:21 localhost systemd[1]: 9f0f481c937606298203797a71924b7614a5d9e3f4a8afd837cfa5b9602aedeb.service: Deactivated successfully. 
Dec 15 05:11:21 localhost nova_compute[286344]: 2025-12-15 10:11:21.495 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:11:21 localhost ceph-mon[298913]: mon.np0005559462@0(leader).osd e231 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104 Dec 15 05:11:22 localhost nova_compute[286344]: 2025-12-15 10:11:22.270 286348 DEBUG oslo_service.periodic_task [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Dec 15 05:11:22 localhost nova_compute[286344]: 2025-12-15 10:11:22.271 286348 DEBUG oslo_service.periodic_task [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Dec 15 05:11:22 localhost nova_compute[286344]: 2025-12-15 10:11:22.271 286348 DEBUG nova.compute.manager [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... 
_reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m Dec 15 05:11:23 localhost nova_compute[286344]: 2025-12-15 10:11:23.266 286348 DEBUG oslo_service.periodic_task [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Dec 15 05:11:23 localhost podman[335614]: 2025-12-15 10:11:23.922166611 +0000 UTC m=+0.059512929 container kill 2dc8e840c5c54c5f155f2909bbddf42501cda27d51bd9ec50074e909e636c8dc (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-d02f1047-156b-4b4a-8a9e-ac162d53b0c5, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202) Dec 15 05:11:23 localhost dnsmasq[335413]: exiting on receipt of SIGTERM Dec 15 05:11:23 localhost systemd[1]: libpod-2dc8e840c5c54c5f155f2909bbddf42501cda27d51bd9ec50074e909e636c8dc.scope: Deactivated successfully. 
Dec 15 05:11:23 localhost podman[335629]: 2025-12-15 10:11:23.992927434 +0000 UTC m=+0.056683041 container died 2dc8e840c5c54c5f155f2909bbddf42501cda27d51bd9ec50074e909e636c8dc (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-d02f1047-156b-4b4a-8a9e-ac162d53b0c5, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team) Dec 15 05:11:24 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-2dc8e840c5c54c5f155f2909bbddf42501cda27d51bd9ec50074e909e636c8dc-userdata-shm.mount: Deactivated successfully. Dec 15 05:11:24 localhost systemd[1]: var-lib-containers-storage-overlay-aff0820cf9c1a3439b905ccaace0c6ba1f7e7fcae7da0c2f86e8b74fab193ba6-merged.mount: Deactivated successfully. Dec 15 05:11:24 localhost podman[335629]: 2025-12-15 10:11:24.032457859 +0000 UTC m=+0.096213426 container cleanup 2dc8e840c5c54c5f155f2909bbddf42501cda27d51bd9ec50074e909e636c8dc (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-d02f1047-156b-4b4a-8a9e-ac162d53b0c5, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3) Dec 15 05:11:24 localhost systemd[1]: libpod-conmon-2dc8e840c5c54c5f155f2909bbddf42501cda27d51bd9ec50074e909e636c8dc.scope: Deactivated successfully. 
Dec 15 05:11:24 localhost podman[335631]: 2025-12-15 10:11:24.070394201 +0000 UTC m=+0.125771501 container remove 2dc8e840c5c54c5f155f2909bbddf42501cda27d51bd9ec50074e909e636c8dc (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-d02f1047-156b-4b4a-8a9e-ac162d53b0c5, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3) Dec 15 05:11:24 localhost nova_compute[286344]: 2025-12-15 10:11:24.124 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:11:24 localhost kernel: device tap8f119cd4-75 left promiscuous mode Dec 15 05:11:24 localhost ovn_controller[154603]: 2025-12-15T10:11:24Z|00516|binding|INFO|Releasing lport 8f119cd4-7502-4b8d-844e-501737be70eb from this chassis (sb_readonly=0) Dec 15 05:11:24 localhost ovn_controller[154603]: 2025-12-15T10:11:24Z|00517|binding|INFO|Setting lport 8f119cd4-7502-4b8d-844e-501737be70eb down in Southbound Dec 15 05:11:24 localhost ovn_metadata_agent[160585]: 2025-12-15 10:11:24.135 160590 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005559462.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8::2/64', 'neutron:device_id': 'dhcpce287b4b-a043-5ed9-8e08-4fd555d768a2-d02f1047-156b-4b4a-8a9e-ac162d53b0c5', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 
'neutron:network_name': 'neutron-d02f1047-156b-4b4a-8a9e-ac162d53b0c5', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'dc12e715dc244046a7e657a59c77bc04', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'np0005559462.localdomain'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=9e4442e5-b140-463f-8da5-8bd120ed1d9b, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=8f119cd4-7502-4b8d-844e-501737be70eb) old=Port_Binding(up=[True], chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Dec 15 05:11:24 localhost ovn_metadata_agent[160585]: 2025-12-15 10:11:24.135 160590 INFO neutron.agent.ovn.metadata.agent [-] Port 8f119cd4-7502-4b8d-844e-501737be70eb in datapath d02f1047-156b-4b4a-8a9e-ac162d53b0c5 unbound from our chassis#033[00m Dec 15 05:11:24 localhost ovn_metadata_agent[160585]: 2025-12-15 10:11:24.136 160590 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network d02f1047-156b-4b4a-8a9e-ac162d53b0c5 or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599#033[00m Dec 15 05:11:24 localhost ovn_metadata_agent[160585]: 2025-12-15 10:11:24.137 160858 DEBUG oslo.privsep.daemon [-] privsep: reply[dd1de25e-89c5-49f6-9c46-e6038e927bf2]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Dec 15 05:11:24 localhost nova_compute[286344]: 2025-12-15 10:11:24.146 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:11:24 localhost nova_compute[286344]: 2025-12-15 10:11:24.147 286348 DEBUG 
ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:11:24 localhost nova_compute[286344]: 2025-12-15 10:11:24.240 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:11:24 localhost neutron_dhcp_agent[267542]: 2025-12-15 10:11:24.333 267546 INFO neutron.agent.dhcp.agent [None req-7b55f7bf-5dd5-4973-ae7f-7ee8e1abc182 - - - - - -] Network not present, action: clean_devices, action_kwargs: {}#033[00m Dec 15 05:11:24 localhost neutron_dhcp_agent[267542]: 2025-12-15 10:11:24.390 267546 INFO neutron.agent.dhcp.agent [-] Network not present, action: clean_devices, action_kwargs: {}#033[00m Dec 15 05:11:24 localhost neutron_dhcp_agent[267542]: 2025-12-15 10:11:24.676 267546 INFO neutron.agent.dhcp.agent [-] Network not present, action: clean_devices, action_kwargs: {}#033[00m Dec 15 05:11:24 localhost ovn_controller[154603]: 2025-12-15T10:11:24Z|00518|binding|INFO|Releasing lport b35254ad-12eb-47bb-92be-44fefe0694f0 from this chassis (sb_readonly=0) Dec 15 05:11:24 localhost nova_compute[286344]: 2025-12-15 10:11:24.907 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:11:24 localhost systemd[1]: Started /usr/bin/podman healthcheck run 4d46c406a3855fd1c322e5d86c048188ea5865daa07ba23aaebacc7879668923. Dec 15 05:11:24 localhost systemd[1]: run-netns-qdhcp\x2dd02f1047\x2d156b\x2d4b4a\x2d8a9e\x2dac162d53b0c5.mount: Deactivated successfully. 
Dec 15 05:11:25 localhost podman[335657]: 2025-12-15 10:11:25.003569285 +0000 UTC m=+0.079805141 container health_status 4d46c406a3855fd1c322e5d86c048188ea5865daa07ba23aaebacc7879668923 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '07f3af15d79276418f54a8dde2fc251064527c3571430e14af77aaa2c01f1193-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ovn_metadata_agent) Dec 15 05:11:25 localhost 
podman[335657]: 2025-12-15 10:11:25.012337893 +0000 UTC m=+0.088573729 container exec_died 4d46c406a3855fd1c322e5d86c048188ea5865daa07ba23aaebacc7879668923 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '07f3af15d79276418f54a8dde2fc251064527c3571430e14af77aaa2c01f1193-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ovn_metadata_agent, tcib_managed=true, org.label-schema.license=GPLv2) Dec 15 05:11:25 localhost systemd[1]: 
4d46c406a3855fd1c322e5d86c048188ea5865daa07ba23aaebacc7879668923.service: Deactivated successfully. Dec 15 05:11:26 localhost nova_compute[286344]: 2025-12-15 10:11:26.269 286348 DEBUG oslo_service.periodic_task [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Dec 15 05:11:26 localhost nova_compute[286344]: 2025-12-15 10:11:26.290 286348 DEBUG oslo_concurrency.lockutils [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Dec 15 05:11:26 localhost nova_compute[286344]: 2025-12-15 10:11:26.291 286348 DEBUG oslo_concurrency.lockutils [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Dec 15 05:11:26 localhost nova_compute[286344]: 2025-12-15 10:11:26.291 286348 DEBUG oslo_concurrency.lockutils [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Dec 15 05:11:26 localhost nova_compute[286344]: 2025-12-15 10:11:26.291 286348 DEBUG nova.compute.resource_tracker [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Auditing locally available compute resources for np0005559462.localdomain (node: np0005559462.localdomain) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m Dec 15 05:11:26 localhost nova_compute[286344]: 2025-12-15 
10:11:26.292 286348 DEBUG oslo_concurrency.processutils [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m Dec 15 05:11:26 localhost nova_compute[286344]: 2025-12-15 10:11:26.497 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:11:26 localhost ceph-mon[298913]: mon.np0005559462@0(leader).osd e231 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104 Dec 15 05:11:26 localhost ceph-mon[298913]: mon.np0005559462@0(leader) e17 handle_command mon_command({"prefix": "df", "format": "json"} v 0) Dec 15 05:11:26 localhost ceph-mon[298913]: log_channel(audit) log [DBG] : from='client.? 172.18.0.106:0/3916575713' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch Dec 15 05:11:26 localhost nova_compute[286344]: 2025-12-15 10:11:26.736 286348 DEBUG oslo_concurrency.processutils [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.444s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m Dec 15 05:11:26 localhost nova_compute[286344]: 2025-12-15 10:11:26.820 286348 DEBUG nova.virt.libvirt.driver [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m Dec 15 05:11:26 localhost nova_compute[286344]: 2025-12-15 10:11:26.821 286348 DEBUG nova.virt.libvirt.driver [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config 
/usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m Dec 15 05:11:26 localhost ceph-mon[298913]: mon.np0005559462@0(leader) e17 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) Dec 15 05:11:26 localhost ceph-mon[298913]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/1375357866' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch Dec 15 05:11:27 localhost nova_compute[286344]: 2025-12-15 10:11:27.043 286348 WARNING nova.virt.libvirt.driver [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m Dec 15 05:11:27 localhost nova_compute[286344]: 2025-12-15 10:11:27.045 286348 DEBUG nova.compute.resource_tracker [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Hypervisor/Node resource view: name=np0005559462.localdomain free_ram=11165MB free_disk=41.836978912353516GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", 
"numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m Dec 15 05:11:27 localhost nova_compute[286344]: 2025-12-15 10:11:27.046 286348 DEBUG oslo_concurrency.lockutils [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Dec 15 05:11:27 localhost nova_compute[286344]: 2025-12-15 10:11:27.046 286348 DEBUG oslo_concurrency.lockutils [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Dec 15 05:11:27 localhost nova_compute[286344]: 2025-12-15 10:11:27.124 286348 DEBUG nova.compute.resource_tracker [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Instance 
39ff1bd9-6f6b-44c8-bbec-a1fd9d196359 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 2, 'MEMORY_MB': 512, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m Dec 15 05:11:27 localhost nova_compute[286344]: 2025-12-15 10:11:27.125 286348 DEBUG nova.compute.resource_tracker [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m Dec 15 05:11:27 localhost nova_compute[286344]: 2025-12-15 10:11:27.125 286348 DEBUG nova.compute.resource_tracker [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Final resource view: name=np0005559462.localdomain phys_ram=15738MB used_ram=1024MB phys_disk=41GB used_disk=2GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m Dec 15 05:11:27 localhost nova_compute[286344]: 2025-12-15 10:11:27.141 286348 DEBUG nova.scheduler.client.report [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Refreshing inventories for resource provider 26c8956b-6742-4951-b566-971b9bbe323b _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804#033[00m Dec 15 05:11:27 localhost nova_compute[286344]: 2025-12-15 10:11:27.161 286348 DEBUG nova.scheduler.client.report [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Updating ProviderTree inventory for provider 26c8956b-6742-4951-b566-971b9bbe323b from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 
'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768#033[00m Dec 15 05:11:27 localhost nova_compute[286344]: 2025-12-15 10:11:27.161 286348 DEBUG nova.compute.provider_tree [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Updating inventory in ProviderTree for provider 26c8956b-6742-4951-b566-971b9bbe323b with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176#033[00m Dec 15 05:11:27 localhost nova_compute[286344]: 2025-12-15 10:11:27.175 286348 DEBUG nova.scheduler.client.report [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Refreshing aggregate associations for resource provider 26c8956b-6742-4951-b566-971b9bbe323b, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813#033[00m Dec 15 05:11:27 localhost nova_compute[286344]: 2025-12-15 10:11:27.201 286348 DEBUG nova.scheduler.client.report [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Refreshing trait associations for resource provider 26c8956b-6742-4951-b566-971b9bbe323b, traits: 
COMPUTE_NET_VIF_MODEL_RTL8139,COMPUTE_STORAGE_BUS_IDE,COMPUTE_DEVICE_TAGGING,COMPUTE_NET_VIF_MODEL_PCNET,COMPUTE_NODE,COMPUTE_STORAGE_BUS_USB,COMPUTE_IMAGE_TYPE_AKI,COMPUTE_NET_ATTACH_INTERFACE,HW_CPU_X86_CLMUL,HW_CPU_X86_ABM,HW_CPU_X86_SSE2,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_NET_VIF_MODEL_E1000,HW_CPU_X86_SSE41,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_NET_VIF_MODEL_VIRTIO,HW_CPU_X86_BMI2,HW_CPU_X86_SSE42,HW_CPU_X86_FMA3,COMPUTE_SECURITY_UEFI_SECURE_BOOT,COMPUTE_IMAGE_TYPE_RAW,HW_CPU_X86_BMI,COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_ACCELERATORS,COMPUTE_IMAGE_TYPE_ARI,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_STORAGE_BUS_FDC,COMPUTE_VIOMMU_MODEL_INTEL,COMPUTE_GRAPHICS_MODEL_NONE,HW_CPU_X86_AMD_SVM,COMPUTE_NET_VIF_MODEL_E1000E,HW_CPU_X86_SSE,HW_CPU_X86_AVX,HW_CPU_X86_SVM,COMPUTE_SECURITY_TPM_2_0,COMPUTE_TRUSTED_CERTS,COMPUTE_RESCUE_BFV,HW_CPU_X86_AVX2,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_VIOMMU_MODEL_AUTO,COMPUTE_VIOMMU_MODEL_VIRTIO,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_NET_VIF_MODEL_NE2K_PCI,HW_CPU_X86_SSSE3,COMPUTE_IMAGE_TYPE_AMI,HW_CPU_X86_AESNI,HW_CPU_X86_SSE4A,HW_CPU_X86_MMX,COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_VOLUME_EXTEND,HW_CPU_X86_F16C,HW_CPU_X86_SHA,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_STORAGE_BUS_VIRTIO,COMPUTE_SECURITY_TPM_1_2,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,COMPUTE_STORAGE_BUS_SATA _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825#033[00m Dec 15 05:11:27 localhost nova_compute[286344]: 2025-12-15 10:11:27.242 286348 DEBUG oslo_concurrency.processutils [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m Dec 15 05:11:27 localhost ceph-mon[298913]: mon.np0005559462@0(leader).osd e231 do_prune osdmap full prune enabled Dec 15 
05:11:27 localhost ceph-mon[298913]: mon.np0005559462@0(leader).osd e232 e232: 6 total, 6 up, 6 in Dec 15 05:11:27 localhost ceph-mon[298913]: log_channel(cluster) log [DBG] : osdmap e232: 6 total, 6 up, 6 in Dec 15 05:11:27 localhost ceph-mon[298913]: mon.np0005559462@0(leader) e17 handle_command mon_command({"prefix": "df", "format": "json"} v 0) Dec 15 05:11:27 localhost ceph-mon[298913]: log_channel(audit) log [DBG] : from='client.? 172.18.0.106:0/1605155725' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch Dec 15 05:11:27 localhost nova_compute[286344]: 2025-12-15 10:11:27.723 286348 DEBUG oslo_concurrency.processutils [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.481s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m Dec 15 05:11:27 localhost nova_compute[286344]: 2025-12-15 10:11:27.729 286348 DEBUG nova.compute.provider_tree [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Inventory has not changed in ProviderTree for provider: 26c8956b-6742-4951-b566-971b9bbe323b update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m Dec 15 05:11:27 localhost nova_compute[286344]: 2025-12-15 10:11:27.745 286348 DEBUG nova.scheduler.client.report [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Inventory has not changed for provider 26c8956b-6742-4951-b566-971b9bbe323b based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m Dec 15 05:11:27 
localhost nova_compute[286344]: 2025-12-15 10:11:27.747 286348 DEBUG nova.compute.resource_tracker [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Compute_service record updated for np0005559462.localdomain:np0005559462.localdomain _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m Dec 15 05:11:27 localhost nova_compute[286344]: 2025-12-15 10:11:27.748 286348 DEBUG oslo_concurrency.lockutils [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.701s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Dec 15 05:11:28 localhost ceph-mon[298913]: mon.np0005559462@0(leader) e17 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) Dec 15 05:11:28 localhost ceph-mon[298913]: log_channel(audit) log [DBG] : from='client.25625 172.18.0.34:0/382777224' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch Dec 15 05:11:28 localhost ceph-mon[298913]: mon.np0005559462@0(leader).osd e232 do_prune osdmap full prune enabled Dec 15 05:11:28 localhost ceph-mon[298913]: mon.np0005559462@0(leader).osd e233 e233: 6 total, 6 up, 6 in Dec 15 05:11:28 localhost ceph-mon[298913]: log_channel(cluster) log [DBG] : osdmap e233: 6 total, 6 up, 6 in Dec 15 05:11:28 localhost nova_compute[286344]: 2025-12-15 10:11:28.750 286348 DEBUG oslo_service.periodic_task [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Dec 15 05:11:29 localhost nova_compute[286344]: 2025-12-15 10:11:29.242 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:11:31 localhost ceph-mon[298913]: 
mon.np0005559462@0(leader) e17 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) Dec 15 05:11:31 localhost ceph-mon[298913]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/2420403560' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch Dec 15 05:11:31 localhost nova_compute[286344]: 2025-12-15 10:11:31.500 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:11:31 localhost ceph-mon[298913]: mon.np0005559462@0(leader) e17 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) Dec 15 05:11:31 localhost ceph-mon[298913]: log_channel(audit) log [DBG] : from='client.25625 172.18.0.34:0/382777224' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch Dec 15 05:11:31 localhost ceph-mon[298913]: mon.np0005559462@0(leader).osd e233 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104 Dec 15 05:11:31 localhost systemd[1]: Started /usr/bin/podman healthcheck run a4e94bd7aaee6c174c601060fb5f34559019ccea93fc397263ee3735731cfc1e. 
Dec 15 05:11:31 localhost podman[335719]: 2025-12-15 10:11:31.76486684 +0000 UTC m=+0.092921378 container health_status a4e94bd7aaee6c174c601060fb5f34559019ccea93fc397263ee3735731cfc1e (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, container_name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter) Dec 15 05:11:31 localhost podman[335719]: 2025-12-15 10:11:31.801372743 +0000 UTC m=+0.129427261 container exec_died a4e94bd7aaee6c174c601060fb5f34559019ccea93fc397263ee3735731cfc1e (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', 
'/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible) Dec 15 05:11:31 localhost systemd[1]: a4e94bd7aaee6c174c601060fb5f34559019ccea93fc397263ee3735731cfc1e.service: Deactivated successfully. Dec 15 05:11:31 localhost podman[243449]: time="2025-12-15T10:11:31Z" level=info msg="List containers: received `last` parameter - overwriting `limit`" Dec 15 05:11:31 localhost podman[243449]: @ - - [15/Dec/2025:10:11:31 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 158464 "" "Go-http-client/1.1" Dec 15 05:11:31 localhost podman[243449]: @ - - [15/Dec/2025:10:11:31 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 19749 "" "Go-http-client/1.1" Dec 15 05:11:32 localhost ceph-mon[298913]: mon.np0005559462@0(leader).osd e233 do_prune osdmap full prune enabled Dec 15 05:11:32 localhost ceph-mon[298913]: mon.np0005559462@0(leader).osd e234 e234: 6 total, 6 up, 6 in Dec 15 05:11:32 localhost ceph-mon[298913]: log_channel(cluster) log [DBG] : osdmap e234: 6 total, 6 up, 6 in Dec 15 05:11:33 localhost ceph-mon[298913]: mon.np0005559462@0(leader).osd e234 do_prune osdmap full prune enabled Dec 15 05:11:33 localhost ceph-mon[298913]: mon.np0005559462@0(leader).osd e235 e235: 6 total, 6 up, 6 in Dec 15 05:11:33 localhost ceph-mon[298913]: log_channel(cluster) log [DBG] : osdmap e235: 6 total, 6 up, 6 in Dec 15 05:11:34 localhost nova_compute[286344]: 2025-12-15 10:11:34.246 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:11:34 localhost ceph-mon[298913]: mon.np0005559462@0(leader).osd e235 do_prune osdmap full prune enabled Dec 15 05:11:34 localhost ceph-mon[298913]: mon.np0005559462@0(leader).osd e236 e236: 6 total, 6 up, 6 in Dec 15 05:11:34 
localhost ceph-mon[298913]: log_channel(cluster) log [DBG] : osdmap e236: 6 total, 6 up, 6 in Dec 15 05:11:34 localhost openstack_network_exporter[246484]: ERROR 10:11:34 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd Dec 15 05:11:34 localhost openstack_network_exporter[246484]: ERROR 10:11:34 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd Dec 15 05:11:34 localhost openstack_network_exporter[246484]: ERROR 10:11:34 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server Dec 15 05:11:34 localhost openstack_network_exporter[246484]: ERROR 10:11:34 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath Dec 15 05:11:34 localhost openstack_network_exporter[246484]: Dec 15 05:11:34 localhost openstack_network_exporter[246484]: ERROR 10:11:34 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath Dec 15 05:11:34 localhost openstack_network_exporter[246484]: Dec 15 05:11:35 localhost ceph-mon[298913]: mon.np0005559462@0(leader) e17 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) Dec 15 05:11:35 localhost ceph-mon[298913]: log_channel(audit) log [DBG] : from='client.25625 172.18.0.34:0/382777224' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch Dec 15 05:11:36 localhost nova_compute[286344]: 2025-12-15 10:11:36.502 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:11:36 localhost ceph-mon[298913]: mon.np0005559462@0(leader).osd e236 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104 Dec 15 05:11:36 localhost ceph-mon[298913]: mon.np0005559462@0(leader).osd e236 do_prune osdmap full prune enabled Dec 15 05:11:36 localhost ceph-mon[298913]: 
mon.np0005559462@0(leader).osd e237 e237: 6 total, 6 up, 6 in Dec 15 05:11:36 localhost ceph-mon[298913]: log_channel(cluster) log [DBG] : osdmap e237: 6 total, 6 up, 6 in Dec 15 05:11:38 localhost ceph-mon[298913]: mon.np0005559462@0(leader) e17 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) Dec 15 05:11:38 localhost ceph-mon[298913]: log_channel(audit) log [DBG] : from='client.25625 172.18.0.34:0/382777224' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch Dec 15 05:11:38 localhost ceph-mon[298913]: mon.np0005559462@0(leader).osd e237 do_prune osdmap full prune enabled Dec 15 05:11:38 localhost ceph-mon[298913]: mon.np0005559462@0(leader).osd e238 e238: 6 total, 6 up, 6 in Dec 15 05:11:38 localhost ceph-mon[298913]: log_channel(cluster) log [DBG] : osdmap e238: 6 total, 6 up, 6 in Dec 15 05:11:39 localhost ceph-mon[298913]: mon.np0005559462@0(leader) e17 handle_command mon_command({"prefix":"df", "format":"json"} v 0) Dec 15 05:11:39 localhost ceph-mon[298913]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/1067616804' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch Dec 15 05:11:39 localhost ceph-mon[298913]: mon.np0005559462@0(leader) e17 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0) Dec 15 05:11:39 localhost ceph-mon[298913]: log_channel(audit) log [DBG] : from='client.? 
172.18.0.32:0/1067616804' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch Dec 15 05:11:39 localhost nova_compute[286344]: 2025-12-15 10:11:39.249 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:11:39 localhost ceph-mon[298913]: mon.np0005559462@0(leader).osd e238 do_prune osdmap full prune enabled Dec 15 05:11:39 localhost ceph-mon[298913]: mon.np0005559462@0(leader).osd e239 e239: 6 total, 6 up, 6 in Dec 15 05:11:39 localhost ceph-mon[298913]: log_channel(cluster) log [DBG] : osdmap e239: 6 total, 6 up, 6 in Dec 15 05:11:41 localhost nova_compute[286344]: 2025-12-15 10:11:41.505 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:11:41 localhost ceph-mon[298913]: mon.np0005559462@0(leader).osd e239 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104 Dec 15 05:11:41 localhost ceph-mon[298913]: mon.np0005559462@0(leader).osd e239 do_prune osdmap full prune enabled Dec 15 05:11:41 localhost ceph-mon[298913]: mon.np0005559462@0(leader).osd e240 e240: 6 total, 6 up, 6 in Dec 15 05:11:41 localhost ceph-mon[298913]: log_channel(cluster) log [DBG] : osdmap e240: 6 total, 6 up, 6 in Dec 15 05:11:41 localhost neutron_sriov_agent[260044]: 2025-12-15 10:11:41.769 2 INFO neutron.agent.securitygroups_rpc [None req-9c91270c-b2e0-4d54-8d69-6a3f0acc8985 055a2ead711042929f6186ce0df9286e 229ca1deba244d0780350d1a77507ad1 - - default default] Security group rule updated ['49be7990-8193-41d2-bdb8-cde8aeb193db']#033[00m Dec 15 05:11:41 localhost neutron_sriov_agent[260044]: 2025-12-15 10:11:41.902 2 INFO neutron.agent.securitygroups_rpc [None req-e4001925-e460-4611-b5ed-7dfae2b07127 055a2ead711042929f6186ce0df9286e 229ca1deba244d0780350d1a77507ad1 - - default default] Security 
group rule updated ['49be7990-8193-41d2-bdb8-cde8aeb193db']#033[00m Dec 15 05:11:42 localhost ceph-mon[298913]: mon.np0005559462@0(leader).osd e240 do_prune osdmap full prune enabled Dec 15 05:11:42 localhost ceph-mon[298913]: mon.np0005559462@0(leader).osd e241 e241: 6 total, 6 up, 6 in Dec 15 05:11:42 localhost ceph-mon[298913]: log_channel(cluster) log [DBG] : osdmap e241: 6 total, 6 up, 6 in Dec 15 05:11:42 localhost sshd[335742]: main: sshd: ssh-rsa algorithm is disabled Dec 15 05:11:43 localhost ceph-mon[298913]: mon.np0005559462@0(leader) e17 handle_command mon_command({"prefix":"df", "format":"json"} v 0) Dec 15 05:11:43 localhost ceph-mon[298913]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/1647404681' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch Dec 15 05:11:43 localhost ceph-mon[298913]: mon.np0005559462@0(leader) e17 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0) Dec 15 05:11:43 localhost ceph-mon[298913]: log_channel(audit) log [DBG] : from='client.? 
172.18.0.32:0/1647404681' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch Dec 15 05:11:44 localhost nova_compute[286344]: 2025-12-15 10:11:44.275 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:11:46 localhost neutron_sriov_agent[260044]: 2025-12-15 10:11:46.162 2 INFO neutron.agent.securitygroups_rpc [req-403d1980-60be-475a-b99b-d4810355349a req-12b722a1-3c7b-46b4-8601-0d8439a0b25b 055a2ead711042929f6186ce0df9286e 229ca1deba244d0780350d1a77507ad1 - - default default] Security group member updated ['49be7990-8193-41d2-bdb8-cde8aeb193db']#033[00m Dec 15 05:11:46 localhost neutron_dhcp_agent[267542]: 2025-12-15 10:11:46.192 267546 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-12-15T10:11:45Z, description=, device_id=ae9e9af2-1108-49d3-9595-3991b1935e3c, device_owner=, dns_assignment=[], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[], id=cb745671-0158-495c-b08f-1562fdda921f, ip_allocation=immediate, mac_address=fa:16:3e:43:d0:33, name=, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2025-12-15T10:11:09Z, description=, dns_domain=, id=b35c3a3c-c123-4757-af17-fe86d0729b58, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-VolumesBackupsTest-595279419-network, port_security_enabled=True, project_id=229ca1deba244d0780350d1a77507ad1, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=28427, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=3447, status=ACTIVE, subnets=['7328b3dd-1cd3-4845-8896-6081e3046fb3'], tags=[], 
tenant_id=229ca1deba244d0780350d1a77507ad1, updated_at=2025-12-15T10:11:10Z, vlan_transparent=None, network_id=b35c3a3c-c123-4757-af17-fe86d0729b58, port_security_enabled=True, project_id=229ca1deba244d0780350d1a77507ad1, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=['49be7990-8193-41d2-bdb8-cde8aeb193db'], standard_attr_id=3562, status=DOWN, tags=[], tenant_id=229ca1deba244d0780350d1a77507ad1, updated_at=2025-12-15T10:11:46Z on network b35c3a3c-c123-4757-af17-fe86d0729b58#033[00m Dec 15 05:11:46 localhost podman[335760]: 2025-12-15 10:11:46.409172366 +0000 UTC m=+0.060599365 container kill a28b5bd276c88258da127610008fefed786d6a74d721e8f5acfc18f6893594cf (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-b35c3a3c-c123-4757-af17-fe86d0729b58, tcib_managed=true, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202) Dec 15 05:11:46 localhost dnsmasq[335364]: read /var/lib/neutron/dhcp/b35c3a3c-c123-4757-af17-fe86d0729b58/addn_hosts - 2 addresses Dec 15 05:11:46 localhost dnsmasq-dhcp[335364]: read /var/lib/neutron/dhcp/b35c3a3c-c123-4757-af17-fe86d0729b58/host Dec 15 05:11:46 localhost dnsmasq-dhcp[335364]: read /var/lib/neutron/dhcp/b35c3a3c-c123-4757-af17-fe86d0729b58/opts Dec 15 05:11:46 localhost nova_compute[286344]: 2025-12-15 10:11:46.507 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:11:46 localhost ceph-mon[298913]: mon.np0005559462@0(leader).osd e241 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104 Dec 15 05:11:46 localhost 
ceph-mon[298913]: mon.np0005559462@0(leader).osd e241 do_prune osdmap full prune enabled Dec 15 05:11:46 localhost ceph-mon[298913]: mon.np0005559462@0(leader).osd e242 e242: 6 total, 6 up, 6 in Dec 15 05:11:46 localhost ceph-mon[298913]: log_channel(cluster) log [DBG] : osdmap e242: 6 total, 6 up, 6 in Dec 15 05:11:46 localhost neutron_dhcp_agent[267542]: 2025-12-15 10:11:46.781 267546 INFO neutron.agent.dhcp.agent [None req-d395690b-fa7b-4716-bf06-3bd1191242a1 - - - - - -] DHCP configuration for ports {'cb745671-0158-495c-b08f-1562fdda921f'} is completed#033[00m Dec 15 05:11:47 localhost neutron_dhcp_agent[267542]: 2025-12-15 10:11:47.055 267546 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=np0005559464.localdomain, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2025-12-15T10:11:45Z, description=, device_id=ae9e9af2-1108-49d3-9595-3991b1935e3c, device_owner=compute:nova, dns_assignment=[], dns_domain=, dns_name=tempest-volumesbackupstest-instance-468025304, extra_dhcp_opts=[], fixed_ips=[], id=cb745671-0158-495c-b08f-1562fdda921f, ip_allocation=immediate, mac_address=fa:16:3e:43:d0:33, name=, network_id=b35c3a3c-c123-4757-af17-fe86d0729b58, port_security_enabled=True, project_id=229ca1deba244d0780350d1a77507ad1, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=2, security_groups=['49be7990-8193-41d2-bdb8-cde8aeb193db'], standard_attr_id=3562, status=DOWN, tags=[], tenant_id=229ca1deba244d0780350d1a77507ad1, updated_at=2025-12-15T10:11:46Z on network b35c3a3c-c123-4757-af17-fe86d0729b58#033[00m Dec 15 05:11:47 localhost systemd[1]: tmp-crun.GhhKuG.mount: Deactivated successfully. 
Dec 15 05:11:47 localhost dnsmasq[335364]: read /var/lib/neutron/dhcp/b35c3a3c-c123-4757-af17-fe86d0729b58/addn_hosts - 2 addresses Dec 15 05:11:47 localhost dnsmasq-dhcp[335364]: read /var/lib/neutron/dhcp/b35c3a3c-c123-4757-af17-fe86d0729b58/host Dec 15 05:11:47 localhost dnsmasq-dhcp[335364]: read /var/lib/neutron/dhcp/b35c3a3c-c123-4757-af17-fe86d0729b58/opts Dec 15 05:11:47 localhost podman[335798]: 2025-12-15 10:11:47.279236392 +0000 UTC m=+0.071182592 container kill a28b5bd276c88258da127610008fefed786d6a74d721e8f5acfc18f6893594cf (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-b35c3a3c-c123-4757-af17-fe86d0729b58, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true) Dec 15 05:11:47 localhost neutron_dhcp_agent[267542]: 2025-12-15 10:11:47.550 267546 INFO neutron.agent.dhcp.agent [None req-04e41612-ff7b-40b8-a271-0c50b5583b39 - - - - - -] DHCP configuration for ports {'cb745671-0158-495c-b08f-1562fdda921f'} is completed#033[00m Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.125 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359', 'name': 'test', 'flavor': {'id': '2da0e147-aaa7-4bb9-a176-5fe1b15a32a0', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'image': {'id': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-00000002', 'OS-EXT-SRV-ATTR:host': 'np0005559462.localdomain', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': 'c785bf23f53946bc99867d8832a50266', 'user_id': '1ba5fce347b64bfebf995f187193f205', 'hostId': 
'bf4d507aa00b3fcf643a7e0b883420632ac7fc96e8c7a756982e121d', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228 Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.126 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.error in the context of pollsters Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.131 12 DEBUG ceilometer.compute.pollsters [-] 39ff1bd9-6f6b-44c8-bbec-a1fd9d196359/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.133 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'b4307dd2-8856-4a6f-be1e-913df9516ef1', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '1ba5fce347b64bfebf995f187193f205', 'user_name': None, 'project_id': 'c785bf23f53946bc99867d8832a50266', 'project_name': None, 'resource_id': 'instance-00000002-39ff1bd9-6f6b-44c8-bbec-a1fd9d196359-tap03ef8889-32', 'timestamp': '2025-12-15T10:11:48.126650', 'resource_metadata': {'display_name': 'test', 'name': 'tap03ef8889-32', 'instance_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359', 'instance_type': 'm1.small', 'host': 'bf4d507aa00b3fcf643a7e0b883420632ac7fc96e8c7a756982e121d', 'instance_host': 'np0005559462.localdomain', 'flavor': {'id': '2da0e147-aaa7-4bb9-a176-5fe1b15a32a0', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e'}, 
'image_ref': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:74:df:7c', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap03ef8889-32'}, 'message_id': '777e5b76-d99e-11f0-817e-fa163ebaca0f', 'monotonic_time': 12574.319327168, 'message_signature': '503ff45e2016d00786c932f70f6722434a5f0b5222b20c6c0d31b6f04f50d7a3'}]}, 'timestamp': '2025-12-15 10:11:48.131816', '_unique_id': 'b4449fdbd15943ffae07e0c04bd67318'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.133 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.133 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.133 12 ERROR oslo_messaging.notify.messaging yield Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.133 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.133 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.133 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.133 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 
10:11:48.133 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.133 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.133 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.133 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.133 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.133 12 ERROR oslo_messaging.notify.messaging conn.connect() Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.133 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.133 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.133 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.133 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.133 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.133 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.133 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.133 12 ERROR oslo_messaging.notify.messaging Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.133 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.133 12 ERROR oslo_messaging.notify.messaging Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.133 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.133 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.133 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.133 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.133 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.133 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.133 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.133 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.133 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.133 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.133 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.133 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.133 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.133 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.133 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.133 12 ERROR oslo_messaging.notify.messaging 
File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.133 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.133 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.133 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.133 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.133 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.133 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.133 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.133 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.133 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.133 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in 
__exit__ Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.133 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.133 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.133 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.133 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.133 12 ERROR oslo_messaging.notify.messaging Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.135 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets in the context of pollsters Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.135 12 DEBUG ceilometer.compute.pollsters [-] 39ff1bd9-6f6b-44c8-bbec-a1fd9d196359/network.outgoing.packets volume: 114 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.137 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': '7ec9a492-e915-4c71-ac30-aa42f46d3182', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 114, 'user_id': '1ba5fce347b64bfebf995f187193f205', 'user_name': None, 'project_id': 'c785bf23f53946bc99867d8832a50266', 'project_name': None, 'resource_id': 'instance-00000002-39ff1bd9-6f6b-44c8-bbec-a1fd9d196359-tap03ef8889-32', 'timestamp': '2025-12-15T10:11:48.135427', 'resource_metadata': {'display_name': 'test', 'name': 'tap03ef8889-32', 'instance_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359', 'instance_type': 'm1.small', 'host': 'bf4d507aa00b3fcf643a7e0b883420632ac7fc96e8c7a756982e121d', 'instance_host': 'np0005559462.localdomain', 'flavor': {'id': '2da0e147-aaa7-4bb9-a176-5fe1b15a32a0', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e'}, 'image_ref': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:74:df:7c', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap03ef8889-32'}, 'message_id': '777f009e-d99e-11f0-817e-fa163ebaca0f', 'monotonic_time': 12574.319327168, 'message_signature': 'f8236dc79c564fe1846c33411efe31dd87d86fbb58e837d72b0aee934405d2c2'}]}, 'timestamp': '2025-12-15 10:11:48.136036', '_unique_id': '371bb001dbcb4fcca6a04d0c568c965c'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.137 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 15 05:11:48 localhost 
ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.137 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.137 12 ERROR oslo_messaging.notify.messaging yield Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.137 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.137 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.137 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.137 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.137 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.137 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.137 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.137 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.137 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.137 12 ERROR oslo_messaging.notify.messaging conn.connect() Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.137 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.137 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.137 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.137 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.137 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.137 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.137 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.137 12 ERROR oslo_messaging.notify.messaging Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.137 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.137 12 ERROR oslo_messaging.notify.messaging Dec 15 05:11:48 localhost 
ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.137 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.137 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.137 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.137 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.137 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.137 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.137 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.137 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.137 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.137 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection 
Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.137 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.137 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.137 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.137 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.137 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.137 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.137 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.137 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.137 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.137 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Dec 15 05:11:48 
localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.137 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.137 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.137 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.137 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.137 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.137 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.137 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.137 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.137 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.137 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.137 12 ERROR oslo_messaging.notify.messaging Dec 15 05:11:48 localhost 
ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.138 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.drop in the context of pollsters Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.138 12 DEBUG ceilometer.compute.pollsters [-] 39ff1bd9-6f6b-44c8-bbec-a1fd9d196359/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.140 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'f0101a93-fd23-4de4-9a42-0fe963cf8608', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '1ba5fce347b64bfebf995f187193f205', 'user_name': None, 'project_id': 'c785bf23f53946bc99867d8832a50266', 'project_name': None, 'resource_id': 'instance-00000002-39ff1bd9-6f6b-44c8-bbec-a1fd9d196359-tap03ef8889-32', 'timestamp': '2025-12-15T10:11:48.138799', 'resource_metadata': {'display_name': 'test', 'name': 'tap03ef8889-32', 'instance_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359', 'instance_type': 'm1.small', 'host': 'bf4d507aa00b3fcf643a7e0b883420632ac7fc96e8c7a756982e121d', 'instance_host': 'np0005559462.localdomain', 'flavor': {'id': '2da0e147-aaa7-4bb9-a176-5fe1b15a32a0', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e'}, 'image_ref': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:74:df:7c', 
'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap03ef8889-32'}, 'message_id': '777f8762-d99e-11f0-817e-fa163ebaca0f', 'monotonic_time': 12574.319327168, 'message_signature': 'f9319c339921e0b9d62c8e6d14b4fa05cb6658ebde4b618ea06e775c3bcc83e2'}]}, 'timestamp': '2025-12-15 10:11:48.139446', '_unique_id': '50187d7f8286442eaa1f49648dd789ee'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.140 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.140 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.140 12 ERROR oslo_messaging.notify.messaging yield Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.140 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.140 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.140 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.140 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.140 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 
10:11:48.140 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.140 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.140 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.140 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.140 12 ERROR oslo_messaging.notify.messaging conn.connect() Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.140 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.140 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.140 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.140 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.140 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.140 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Dec 15 05:11:48 localhost 
ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.140 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.140 12 ERROR oslo_messaging.notify.messaging Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.140 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.140 12 ERROR oslo_messaging.notify.messaging Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.140 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.140 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.140 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.140 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.140 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.140 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.140 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Dec 15 05:11:48 
localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.140 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.140 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.140 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.140 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.140 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.140 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.140 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.140 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.140 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.140 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) 
Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.140 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.140 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.140 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.140 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.140 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.140 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.140 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.140 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.140 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.140 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.140 12 ERROR 
oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.140 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.140 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.140 12 ERROR oslo_messaging.notify.messaging Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.142 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.delta in the context of pollsters Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.142 12 DEBUG ceilometer.compute.pollsters [-] 39ff1bd9-6f6b-44c8-bbec-a1fd9d196359/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.144 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': '64bd0ba1-ec86-4fd9-81b8-c6393b8b235f', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '1ba5fce347b64bfebf995f187193f205', 'user_name': None, 'project_id': 'c785bf23f53946bc99867d8832a50266', 'project_name': None, 'resource_id': 'instance-00000002-39ff1bd9-6f6b-44c8-bbec-a1fd9d196359-tap03ef8889-32', 'timestamp': '2025-12-15T10:11:48.142475', 'resource_metadata': {'display_name': 'test', 'name': 'tap03ef8889-32', 'instance_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359', 'instance_type': 'm1.small', 'host': 'bf4d507aa00b3fcf643a7e0b883420632ac7fc96e8c7a756982e121d', 'instance_host': 'np0005559462.localdomain', 'flavor': {'id': '2da0e147-aaa7-4bb9-a176-5fe1b15a32a0', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e'}, 'image_ref': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:74:df:7c', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap03ef8889-32'}, 'message_id': '778013c6-d99e-11f0-817e-fa163ebaca0f', 'monotonic_time': 12574.319327168, 'message_signature': '49f9f695eaa84ec4aa693cb882b19e00eb035559afcf9a309b3caaea773b5459'}]}, 'timestamp': '2025-12-15 10:11:48.143111', '_unique_id': 'e49bac71d53a4dda8f2047255554c317'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.144 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 
2025-12-15 10:11:48.144 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.144 12 ERROR oslo_messaging.notify.messaging yield Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.144 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.144 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.144 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.144 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.144 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.144 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.144 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.144 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.144 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.144 12 ERROR oslo_messaging.notify.messaging conn.connect() Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.144 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.144 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.144 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.144 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.144 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.144 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.144 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.144 12 ERROR oslo_messaging.notify.messaging Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.144 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.144 12 ERROR oslo_messaging.notify.messaging Dec 15 05:11:48 localhost 
ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.144 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.144 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.144 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.144 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.144 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.144 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.144 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.144 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.144 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.144 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection 
Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.144 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.144 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.144 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.144 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.144 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.144 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.144 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.144 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.144 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.144 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Dec 15 05:11:48 
localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.144 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.144 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.144 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.144 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.144 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.144 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.144 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.144 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.144 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.144 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.144 12 ERROR oslo_messaging.notify.messaging Dec 15 05:11:48 localhost 
ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.145 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no new resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193 Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.146 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.error in the context of pollsters Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.146 12 DEBUG ceilometer.compute.pollsters [-] 39ff1bd9-6f6b-44c8-bbec-a1fd9d196359/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.147 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'd963f8ca-36c5-4eae-9976-50b37ce52d15', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '1ba5fce347b64bfebf995f187193f205', 'user_name': None, 'project_id': 'c785bf23f53946bc99867d8832a50266', 'project_name': None, 'resource_id': 'instance-00000002-39ff1bd9-6f6b-44c8-bbec-a1fd9d196359-tap03ef8889-32', 'timestamp': '2025-12-15T10:11:48.146315', 'resource_metadata': {'display_name': 'test', 'name': 'tap03ef8889-32', 'instance_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359', 'instance_type': 'm1.small', 'host': 'bf4d507aa00b3fcf643a7e0b883420632ac7fc96e8c7a756982e121d', 'instance_host': 'np0005559462.localdomain', 'flavor': {'id': '2da0e147-aaa7-4bb9-a176-5fe1b15a32a0', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 
'image': {'id': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e'}, 'image_ref': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:74:df:7c', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap03ef8889-32'}, 'message_id': '7780a980-d99e-11f0-817e-fa163ebaca0f', 'monotonic_time': 12574.319327168, 'message_signature': '4b86136672716e77484ebe41582e9b96a0b184714a1731fc59689b0157bef5c7'}]}, 'timestamp': '2025-12-15 10:11:48.146868', '_unique_id': '6169e543ac494db58885222c3fb8cc0d'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.147 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.147 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.147 12 ERROR oslo_messaging.notify.messaging yield Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.147 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.147 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.147 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.147 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Dec 15 05:11:48 
localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.147 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.147 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.147 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.147 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.147 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.147 12 ERROR oslo_messaging.notify.messaging conn.connect() Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.147 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.147 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.147 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.147 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.147 12 ERROR 
oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.147 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.147 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.147 12 ERROR oslo_messaging.notify.messaging Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.147 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.147 12 ERROR oslo_messaging.notify.messaging Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.147 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.147 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.147 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.147 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.147 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.147 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.147 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.147 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.147 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.147 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.147 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.147 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.147 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.147 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.147 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.147 12 ERROR oslo_messaging.notify.messaging 
File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.147 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.147 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.147 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.147 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.147 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.147 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.147 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.147 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.147 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.147 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in 
__exit__ Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.147 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.147 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.147 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.147 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.147 12 ERROR oslo_messaging.notify.messaging Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.149 12 INFO ceilometer.polling.manager [-] Polling pollster memory.usage in the context of pollsters Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.167 12 DEBUG ceilometer.compute.pollsters [-] 39ff1bd9-6f6b-44c8-bbec-a1fd9d196359/memory.usage volume: 51.73828125 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.169 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': 'df5d4f04-9a36-4111-b088-e9dd33a9ebf1', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'memory.usage', 'counter_type': 'gauge', 'counter_unit': 'MB', 'counter_volume': 51.73828125, 'user_id': '1ba5fce347b64bfebf995f187193f205', 'user_name': None, 'project_id': 'c785bf23f53946bc99867d8832a50266', 'project_name': None, 'resource_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359', 'timestamp': '2025-12-15T10:11:48.149735', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359', 'instance_type': 'm1.small', 'host': 'bf4d507aa00b3fcf643a7e0b883420632ac7fc96e8c7a756982e121d', 'instance_host': 'np0005559462.localdomain', 'flavor': {'id': '2da0e147-aaa7-4bb9-a176-5fe1b15a32a0', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e'}, 'image_ref': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0}, 'message_id': '7783f3f6-d99e-11f0-817e-fa163ebaca0f', 'monotonic_time': 12574.360323334, 'message_signature': '64c37dfdfae888f8d53848b64414b752dfd8868c0b76ef54cff1eded0f4843ab'}]}, 'timestamp': '2025-12-15 10:11:48.168436', '_unique_id': '545c0b3c9619471c88df47d8a2bc17d1'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.169 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.169 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in 
_reraise_as_library_errors Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.169 12 ERROR oslo_messaging.notify.messaging yield Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.169 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.169 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.169 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.169 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.169 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.169 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.169 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.169 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.169 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.169 12 
ERROR oslo_messaging.notify.messaging conn.connect() Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.169 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.169 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.169 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.169 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.169 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.169 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.169 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.169 12 ERROR oslo_messaging.notify.messaging Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.169 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.169 12 ERROR oslo_messaging.notify.messaging Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.169 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 
2025-12-15 10:11:48.169 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.169 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.169 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.169 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.169 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.169 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.169 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.169 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.169 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.169 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Dec 15 
05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.169 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.169 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.169 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.169 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.169 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.169 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.169 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.169 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.169 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.169 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Dec 15 05:11:48 localhost 
ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.169 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.169 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.169 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.169 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.169 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.169 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.169 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.169 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.169 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.169 12 ERROR oslo_messaging.notify.messaging Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.171 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no new resources found this cycle 
poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193 Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.171 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.usage in the context of pollsters Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.182 12 DEBUG ceilometer.compute.pollsters [-] 39ff1bd9-6f6b-44c8-bbec-a1fd9d196359/disk.device.usage volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.183 12 DEBUG ceilometer.compute.pollsters [-] 39ff1bd9-6f6b-44c8-bbec-a1fd9d196359/disk.device.usage volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.184 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': '268fae9b-2f84-41e9-8190-4a58f4865e98', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '1ba5fce347b64bfebf995f187193f205', 'user_name': None, 'project_id': 'c785bf23f53946bc99867d8832a50266', 'project_name': None, 'resource_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359-vda', 'timestamp': '2025-12-15T10:11:48.171709', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359', 'instance_type': 'm1.small', 'host': 'bf4d507aa00b3fcf643a7e0b883420632ac7fc96e8c7a756982e121d', 'instance_host': 'np0005559462.localdomain', 'flavor': {'id': '2da0e147-aaa7-4bb9-a176-5fe1b15a32a0', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e'}, 'image_ref': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '77863cce-d99e-11f0-817e-fa163ebaca0f', 'monotonic_time': 12574.364436705, 'message_signature': 'b1c325426ba06263b6590fc9553bf6aa36b41208560573e54f73b715563bf8d1'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '1ba5fce347b64bfebf995f187193f205', 'user_name': None, 'project_id': 'c785bf23f53946bc99867d8832a50266', 'project_name': None, 'resource_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359-vdb', 'timestamp': '2025-12-15T10:11:48.171709', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359', 
'instance_type': 'm1.small', 'host': 'bf4d507aa00b3fcf643a7e0b883420632ac7fc96e8c7a756982e121d', 'instance_host': 'np0005559462.localdomain', 'flavor': {'id': '2da0e147-aaa7-4bb9-a176-5fe1b15a32a0', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e'}, 'image_ref': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '778650ce-d99e-11f0-817e-fa163ebaca0f', 'monotonic_time': 12574.364436705, 'message_signature': 'b4b8b20d4b3afb6dace8984b86ef186d176a7c39264410a38cb9fde1620f5956'}]}, 'timestamp': '2025-12-15 10:11:48.183888', '_unique_id': '61300761d0144751aac21667436714cf'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.184 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.184 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.184 12 ERROR oslo_messaging.notify.messaging yield Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.184 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.184 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.184 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.184 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.184 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.184 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.184 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.184 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.184 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.184 12 ERROR oslo_messaging.notify.messaging conn.connect() Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.184 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.184 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.184 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Dec 15 05:11:48 localhost 
ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.184 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.184 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.184 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.184 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.184 12 ERROR oslo_messaging.notify.messaging Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.184 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.184 12 ERROR oslo_messaging.notify.messaging Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.184 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.184 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.184 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.184 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 
10:11:48.184 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.184 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.184 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.184 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.184 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.184 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.184 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.184 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.184 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.184 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Dec 15 05:11:48 localhost 
ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.184 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.184 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.184 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.184 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.184 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.184 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.184 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.184 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.184 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.184 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 
10:11:48.184 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.184 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.184 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.184 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.184 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.184 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.184 12 ERROR oslo_messaging.notify.messaging Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.186 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no new resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193 Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.187 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.capacity in the context of pollsters Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.187 12 DEBUG ceilometer.compute.pollsters [-] 39ff1bd9-6f6b-44c8-bbec-a1fd9d196359/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.188 12 DEBUG 
ceilometer.compute.pollsters [-] 39ff1bd9-6f6b-44c8-bbec-a1fd9d196359/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.190 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '27d1eeee-300a-4944-84ce-f552d15b5fd5', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '1ba5fce347b64bfebf995f187193f205', 'user_name': None, 'project_id': 'c785bf23f53946bc99867d8832a50266', 'project_name': None, 'resource_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359-vda', 'timestamp': '2025-12-15T10:11:48.187454', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359', 'instance_type': 'm1.small', 'host': 'bf4d507aa00b3fcf643a7e0b883420632ac7fc96e8c7a756982e121d', 'instance_host': 'np0005559462.localdomain', 'flavor': {'id': '2da0e147-aaa7-4bb9-a176-5fe1b15a32a0', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e'}, 'image_ref': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '7786f0ba-d99e-11f0-817e-fa163ebaca0f', 'monotonic_time': 12574.364436705, 'message_signature': '33f7e43b0e77b7883ecd6c08efdf979219c07493544d7b82480fb41093ac2547'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 
1073741824, 'user_id': '1ba5fce347b64bfebf995f187193f205', 'user_name': None, 'project_id': 'c785bf23f53946bc99867d8832a50266', 'project_name': None, 'resource_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359-vdb', 'timestamp': '2025-12-15T10:11:48.187454', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359', 'instance_type': 'm1.small', 'host': 'bf4d507aa00b3fcf643a7e0b883420632ac7fc96e8c7a756982e121d', 'instance_host': 'np0005559462.localdomain', 'flavor': {'id': '2da0e147-aaa7-4bb9-a176-5fe1b15a32a0', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e'}, 'image_ref': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '77870654-d99e-11f0-817e-fa163ebaca0f', 'monotonic_time': 12574.364436705, 'message_signature': '3d3f13723cb02450c8360b63c12335436621f4fb6c4d631c00e522a165e6f654'}]}, 'timestamp': '2025-12-15 10:11:48.188547', '_unique_id': '014a36b4bfb547fe934630d4b5219054'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.190 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.190 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.190 12 ERROR oslo_messaging.notify.messaging yield Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.190 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.190 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.190 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.190 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.190 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.190 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.190 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.190 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.190 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.190 12 ERROR oslo_messaging.notify.messaging conn.connect() Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.190 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Dec 15 05:11:48 localhost 
ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.190 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.190 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.190 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.190 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.190 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.190 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.190 12 ERROR oslo_messaging.notify.messaging Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.190 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.190 12 ERROR oslo_messaging.notify.messaging Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.190 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.190 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.190 12 ERROR oslo_messaging.notify.messaging 
self.transport._send_notification(target, ctxt, message, Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.190 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.190 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.190 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.190 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.190 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.190 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.190 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.190 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.190 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 
10:11:48.190 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.190 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.190 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.190 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.190 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.190 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.190 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.190 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.190 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.190 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.190 12 ERROR 
oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.190 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.190 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.190 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.190 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.190 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.190 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.190 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.190 12 ERROR oslo_messaging.notify.messaging Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.191 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no new resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193 Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.191 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.drop in the context of pollsters Dec 15 05:11:48 
localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.191 12 DEBUG ceilometer.compute.pollsters [-] 39ff1bd9-6f6b-44c8-bbec-a1fd9d196359/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.193 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '42c21a05-59f9-4eff-b499-a23e0fa4df55', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '1ba5fce347b64bfebf995f187193f205', 'user_name': None, 'project_id': 'c785bf23f53946bc99867d8832a50266', 'project_name': None, 'resource_id': 'instance-00000002-39ff1bd9-6f6b-44c8-bbec-a1fd9d196359-tap03ef8889-32', 'timestamp': '2025-12-15T10:11:48.191770', 'resource_metadata': {'display_name': 'test', 'name': 'tap03ef8889-32', 'instance_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359', 'instance_type': 'm1.small', 'host': 'bf4d507aa00b3fcf643a7e0b883420632ac7fc96e8c7a756982e121d', 'instance_host': 'np0005559462.localdomain', 'flavor': {'id': '2da0e147-aaa7-4bb9-a176-5fe1b15a32a0', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e'}, 'image_ref': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:74:df:7c', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap03ef8889-32'}, 'message_id': '778797ae-d99e-11f0-817e-fa163ebaca0f', 'monotonic_time': 12574.319327168, 
'message_signature': 'f94c62389bd056630eb8ab2a0087600081054f05a38355705a044ceb1b1b093e'}]}, 'timestamp': '2025-12-15 10:11:48.192285', '_unique_id': '6debb78cddc9435d99d0f878b35f26c3'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.193 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.193 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.193 12 ERROR oslo_messaging.notify.messaging yield Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.193 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.193 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.193 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.193 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.193 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.193 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.193 12 ERROR 
oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.193 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.193 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.193 12 ERROR oslo_messaging.notify.messaging conn.connect() Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.193 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.193 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.193 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.193 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.193 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.193 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.193 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Dec 15 05:11:48 localhost 
ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.193 12 ERROR oslo_messaging.notify.messaging Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.193 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.193 12 ERROR oslo_messaging.notify.messaging Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.193 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.193 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.193 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.193 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.193 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.193 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.193 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.193 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", 
line 653, in _send Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.193 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.193 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.193 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.193 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.193 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.193 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.193 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.193 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.193 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.193 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.193 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.193 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.193 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.193 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.193 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.193 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.193 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.193 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.193 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.193 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Dec 15 05:11:48 localhost 
ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.193 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.193 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.193 12 ERROR oslo_messaging.notify.messaging Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.194 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes in the context of pollsters Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.194 12 DEBUG ceilometer.compute.pollsters [-] 39ff1bd9-6f6b-44c8-bbec-a1fd9d196359/network.outgoing.bytes volume: 9770 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.196 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': '83ab7baa-cc67-4168-a9d7-05388de686ca', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 9770, 'user_id': '1ba5fce347b64bfebf995f187193f205', 'user_name': None, 'project_id': 'c785bf23f53946bc99867d8832a50266', 'project_name': None, 'resource_id': 'instance-00000002-39ff1bd9-6f6b-44c8-bbec-a1fd9d196359-tap03ef8889-32', 'timestamp': '2025-12-15T10:11:48.194488', 'resource_metadata': {'display_name': 'test', 'name': 'tap03ef8889-32', 'instance_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359', 'instance_type': 'm1.small', 'host': 'bf4d507aa00b3fcf643a7e0b883420632ac7fc96e8c7a756982e121d', 'instance_host': 'np0005559462.localdomain', 'flavor': {'id': '2da0e147-aaa7-4bb9-a176-5fe1b15a32a0', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e'}, 'image_ref': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:74:df:7c', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap03ef8889-32'}, 'message_id': '7788040a-d99e-11f0-817e-fa163ebaca0f', 'monotonic_time': 12574.319327168, 'message_signature': '5d195d0c1046e2326db8d5bd0c3fa9c75af359d68637988e8d69b90345fb9fce'}]}, 'timestamp': '2025-12-15 10:11:48.195089', '_unique_id': 'a396f12a4c6344d38cad54bafcb71722'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.196 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 15 05:11:48 localhost 
ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.196 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.196 12 ERROR oslo_messaging.notify.messaging yield Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.196 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.196 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.196 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.196 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.196 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.196 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.196 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.196 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.196 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.196 12 ERROR oslo_messaging.notify.messaging conn.connect() Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.196 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.196 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.196 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.196 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.196 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.196 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.196 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.196 12 ERROR oslo_messaging.notify.messaging Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.196 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.196 12 ERROR oslo_messaging.notify.messaging Dec 15 05:11:48 localhost 
ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.196 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.196 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.196 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.196 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.196 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.196 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.196 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.196 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.196 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.196 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection 
Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.196 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.196 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.196 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.196 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.196 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.196 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.196 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.196 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.196 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.196 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Dec 15 05:11:48 
localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.196 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.196 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.196 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.196 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.196 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.196 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.196 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.196 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.196 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.196 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.196 12 ERROR oslo_messaging.notify.messaging Dec 15 05:11:48 localhost 
ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.197 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.latency in the context of pollsters Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.223 12 DEBUG ceilometer.compute.pollsters [-] 39ff1bd9-6f6b-44c8-bbec-a1fd9d196359/disk.device.write.latency volume: 1243487016 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.223 12 DEBUG ceilometer.compute.pollsters [-] 39ff1bd9-6f6b-44c8-bbec-a1fd9d196359/disk.device.write.latency volume: 24779175 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.225 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '04b81654-2660-4b62-8fba-2c57e0e5f174', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 1243487016, 'user_id': '1ba5fce347b64bfebf995f187193f205', 'user_name': None, 'project_id': 'c785bf23f53946bc99867d8832a50266', 'project_name': None, 'resource_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359-vda', 'timestamp': '2025-12-15T10:11:48.197228', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359', 'instance_type': 'm1.small', 'host': 'bf4d507aa00b3fcf643a7e0b883420632ac7fc96e8c7a756982e121d', 'instance_host': 'np0005559462.localdomain', 'flavor': {'id': '2da0e147-aaa7-4bb9-a176-5fe1b15a32a0', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': 
{'id': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e'}, 'image_ref': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '778c600e-d99e-11f0-817e-fa163ebaca0f', 'monotonic_time': 12574.389934023, 'message_signature': '9bf414b5e9cd065dcb2cf08a6518baa377862ec124ed1ee1028707ee3edb82c3'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 24779175, 'user_id': '1ba5fce347b64bfebf995f187193f205', 'user_name': None, 'project_id': 'c785bf23f53946bc99867d8832a50266', 'project_name': None, 'resource_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359-vdb', 'timestamp': '2025-12-15T10:11:48.197228', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359', 'instance_type': 'm1.small', 'host': 'bf4d507aa00b3fcf643a7e0b883420632ac7fc96e8c7a756982e121d', 'instance_host': 'np0005559462.localdomain', 'flavor': {'id': '2da0e147-aaa7-4bb9-a176-5fe1b15a32a0', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e'}, 'image_ref': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '778c71b6-d99e-11f0-817e-fa163ebaca0f', 'monotonic_time': 12574.389934023, 'message_signature': '1f49e8d0bc7ba4017fe8aa8a876e3dc953a8401d7a5fbdeb99c0c899082c4c51'}]}, 'timestamp': '2025-12-15 10:11:48.224089', '_unique_id': '28d15b27b0c94189b71fd4f1e2262311'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 
10:11:48.225 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.225 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.225 12 ERROR oslo_messaging.notify.messaging yield Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.225 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.225 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.225 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.225 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.225 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.225 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.225 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.225 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Dec 15 05:11:48 localhost 
ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.225 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.225 12 ERROR oslo_messaging.notify.messaging conn.connect() Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.225 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.225 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.225 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.225 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.225 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.225 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.225 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.225 12 ERROR oslo_messaging.notify.messaging Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.225 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 
2025-12-15 10:11:48.225 12 ERROR oslo_messaging.notify.messaging Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.225 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.225 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.225 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.225 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.225 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.225 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.225 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.225 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.225 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.225 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.225 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.225 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.225 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.225 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.225 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.225 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.225 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.225 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.225 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.225 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.225 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.225 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.225 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.225 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.225 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.225 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.225 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.225 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.225 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.225 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 15 05:11:48 localhost 
ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.225 12 ERROR oslo_messaging.notify.messaging Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.226 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.delta in the context of pollsters Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.226 12 DEBUG ceilometer.compute.pollsters [-] 39ff1bd9-6f6b-44c8-bbec-a1fd9d196359/network.incoming.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.228 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'ecfa1c90-5099-4481-acbd-4a8d4d1b68c6', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '1ba5fce347b64bfebf995f187193f205', 'user_name': None, 'project_id': 'c785bf23f53946bc99867d8832a50266', 'project_name': None, 'resource_id': 'instance-00000002-39ff1bd9-6f6b-44c8-bbec-a1fd9d196359-tap03ef8889-32', 'timestamp': '2025-12-15T10:11:48.226635', 'resource_metadata': {'display_name': 'test', 'name': 'tap03ef8889-32', 'instance_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359', 'instance_type': 'm1.small', 'host': 'bf4d507aa00b3fcf643a7e0b883420632ac7fc96e8c7a756982e121d', 'instance_host': 'np0005559462.localdomain', 'flavor': {'id': '2da0e147-aaa7-4bb9-a176-5fe1b15a32a0', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e'}, 'image_ref': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 
'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:74:df:7c', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap03ef8889-32'}, 'message_id': '778ceae2-d99e-11f0-817e-fa163ebaca0f', 'monotonic_time': 12574.319327168, 'message_signature': 'c777ca502fc0e305b76bf69ff40ce8de5f99054e348ee108cc371bcc09af64cb'}]}, 'timestamp': '2025-12-15 10:11:48.227267', '_unique_id': '56e0063a7ab548519e1d0e15ee1228b5'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.228 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.228 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.228 12 ERROR oslo_messaging.notify.messaging yield Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.228 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.228 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.228 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.228 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.228 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", 
line 877, in _connection_factory Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.228 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.228 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.228 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.228 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.228 12 ERROR oslo_messaging.notify.messaging conn.connect() Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.228 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.228 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.228 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.228 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.228 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.228 12 
ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.228 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.228 12 ERROR oslo_messaging.notify.messaging Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.228 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.228 12 ERROR oslo_messaging.notify.messaging Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.228 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.228 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.228 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.228 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.228 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.228 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.228 12 ERROR 
oslo_messaging.notify.messaging return self._send(target, ctxt, message, Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.228 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.228 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.228 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.228 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.228 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.228 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.228 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.228 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.228 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.228 12 ERROR 
oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.228 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.228 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.228 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.228 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.228 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.228 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.228 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.228 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.228 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.228 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Dec 15 05:11:48 
localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.228 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.228 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.228 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.228 12 ERROR oslo_messaging.notify.messaging Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.229 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.bytes in the context of pollsters Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.229 12 DEBUG ceilometer.compute.pollsters [-] 39ff1bd9-6f6b-44c8-bbec-a1fd9d196359/disk.device.write.bytes volume: 389120 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.230 12 DEBUG ceilometer.compute.pollsters [-] 39ff1bd9-6f6b-44c8-bbec-a1fd9d196359/disk.device.write.bytes volume: 512 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.231 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': '7fa3fc22-1a68-4283-b396-e0634d728029', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 389120, 'user_id': '1ba5fce347b64bfebf995f187193f205', 'user_name': None, 'project_id': 'c785bf23f53946bc99867d8832a50266', 'project_name': None, 'resource_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359-vda', 'timestamp': '2025-12-15T10:11:48.229727', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359', 'instance_type': 'm1.small', 'host': 'bf4d507aa00b3fcf643a7e0b883420632ac7fc96e8c7a756982e121d', 'instance_host': 'np0005559462.localdomain', 'flavor': {'id': '2da0e147-aaa7-4bb9-a176-5fe1b15a32a0', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e'}, 'image_ref': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '778d6576-d99e-11f0-817e-fa163ebaca0f', 'monotonic_time': 12574.389934023, 'message_signature': '504ea73c9f479925f836ae8d2ba64709139f0ba84bc0859d62f445146bdd5645'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 512, 'user_id': '1ba5fce347b64bfebf995f187193f205', 'user_name': None, 'project_id': 'c785bf23f53946bc99867d8832a50266', 'project_name': None, 'resource_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359-vdb', 'timestamp': '2025-12-15T10:11:48.229727', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 
'39ff1bd9-6f6b-44c8-bbec-a1fd9d196359', 'instance_type': 'm1.small', 'host': 'bf4d507aa00b3fcf643a7e0b883420632ac7fc96e8c7a756982e121d', 'instance_host': 'np0005559462.localdomain', 'flavor': {'id': '2da0e147-aaa7-4bb9-a176-5fe1b15a32a0', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e'}, 'image_ref': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '778d7610-d99e-11f0-817e-fa163ebaca0f', 'monotonic_time': 12574.389934023, 'message_signature': '1c8be74b62053f5848ba709941e922d0e5b2dab2ff90713f4e86ab4636b20596'}]}, 'timestamp': '2025-12-15 10:11:48.230710', '_unique_id': '6c84cfaeb25942a0aef3af1c59d878ce'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.231 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.231 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.231 12 ERROR oslo_messaging.notify.messaging yield Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.231 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.231 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.231 12 ERROR 
oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.231 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.231 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.231 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.231 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.231 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.231 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.231 12 ERROR oslo_messaging.notify.messaging conn.connect() Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.231 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.231 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.231 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Dec 15 
05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.231 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.231 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.231 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.231 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.231 12 ERROR oslo_messaging.notify.messaging Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.231 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.231 12 ERROR oslo_messaging.notify.messaging Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.231 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.231 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.231 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.231 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Dec 15 05:11:48 localhost 
ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.231 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.231 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.231 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.231 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.231 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.231 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.231 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.231 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.231 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.231 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, 
in get Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.231 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.231 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.231 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.231 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.231 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.231 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.231 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.231 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.231 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.231 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 15 05:11:48 localhost 
ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.231 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.231 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.231 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.231 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.231 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.231 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.231 12 ERROR oslo_messaging.notify.messaging Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.232 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.bytes in the context of pollsters Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.232 12 DEBUG ceilometer.compute.pollsters [-] 39ff1bd9-6f6b-44c8-bbec-a1fd9d196359/disk.device.read.bytes volume: 35560448 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.233 12 DEBUG ceilometer.compute.pollsters [-] 39ff1bd9-6f6b-44c8-bbec-a1fd9d196359/disk.device.read.bytes volume: 2154496 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 15 05:11:48 localhost 
ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.234 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '47b62a9a-f485-416d-aba0-3979fb09369a', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 35560448, 'user_id': '1ba5fce347b64bfebf995f187193f205', 'user_name': None, 'project_id': 'c785bf23f53946bc99867d8832a50266', 'project_name': None, 'resource_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359-vda', 'timestamp': '2025-12-15T10:11:48.232900', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359', 'instance_type': 'm1.small', 'host': 'bf4d507aa00b3fcf643a7e0b883420632ac7fc96e8c7a756982e121d', 'instance_host': 'np0005559462.localdomain', 'flavor': {'id': '2da0e147-aaa7-4bb9-a176-5fe1b15a32a0', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e'}, 'image_ref': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '778dde16-d99e-11f0-817e-fa163ebaca0f', 'monotonic_time': 12574.389934023, 'message_signature': '7446db7460036d6a5c17d6dbd627d2c4371851cb761508c64a83aeb7c19026db'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 2154496, 'user_id': '1ba5fce347b64bfebf995f187193f205', 'user_name': None, 'project_id': 'c785bf23f53946bc99867d8832a50266', 'project_name': None, 'resource_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359-vdb', 'timestamp': 
'2025-12-15T10:11:48.232900', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359', 'instance_type': 'm1.small', 'host': 'bf4d507aa00b3fcf643a7e0b883420632ac7fc96e8c7a756982e121d', 'instance_host': 'np0005559462.localdomain', 'flavor': {'id': '2da0e147-aaa7-4bb9-a176-5fe1b15a32a0', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e'}, 'image_ref': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '778dee4c-d99e-11f0-817e-fa163ebaca0f', 'monotonic_time': 12574.389934023, 'message_signature': '1baa5047647a5e1d7775f21bd8914fdf54b9a88a68dc7b08de897c2d604d9aa6'}]}, 'timestamp': '2025-12-15 10:11:48.233809', '_unique_id': 'd74871309acf4d4fa5eeaee7960b34bd'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.234 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.234 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.234 12 ERROR oslo_messaging.notify.messaging yield Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.234 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.234 12 ERROR oslo_messaging.notify.messaging return retry_over_time( 
Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.234 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.234 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.234 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.234 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.234 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.234 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.234 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.234 12 ERROR oslo_messaging.notify.messaging conn.connect() Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.234 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.234 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.234 12 ERROR 
oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.234 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.234 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.234 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.234 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.234 12 ERROR oslo_messaging.notify.messaging Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.234 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.234 12 ERROR oslo_messaging.notify.messaging Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.234 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.234 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.234 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.234 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.234 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.234 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.234 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.234 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.234 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.234 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.234 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.234 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.234 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.234 
12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.234 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.234 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.234 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.234 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.234 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.234 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.234 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.234 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.234 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.234 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.234 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.234 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.234 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.234 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.234 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.234 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.234 12 ERROR oslo_messaging.notify.messaging Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.235 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets in the context of pollsters Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.236 12 DEBUG ceilometer.compute.pollsters [-] 39ff1bd9-6f6b-44c8-bbec-a1fd9d196359/network.incoming.packets volume: 60 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.237 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': '28afcd84-fafa-403e-92df-ecbe536ebc4f', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 60, 'user_id': '1ba5fce347b64bfebf995f187193f205', 'user_name': None, 'project_id': 'c785bf23f53946bc99867d8832a50266', 'project_name': None, 'resource_id': 'instance-00000002-39ff1bd9-6f6b-44c8-bbec-a1fd9d196359-tap03ef8889-32', 'timestamp': '2025-12-15T10:11:48.236023', 'resource_metadata': {'display_name': 'test', 'name': 'tap03ef8889-32', 'instance_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359', 'instance_type': 'm1.small', 'host': 'bf4d507aa00b3fcf643a7e0b883420632ac7fc96e8c7a756982e121d', 'instance_host': 'np0005559462.localdomain', 'flavor': {'id': '2da0e147-aaa7-4bb9-a176-5fe1b15a32a0', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e'}, 'image_ref': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:74:df:7c', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap03ef8889-32'}, 'message_id': '778e574c-d99e-11f0-817e-fa163ebaca0f', 'monotonic_time': 12574.319327168, 'message_signature': 'f9793ae0607ca391276fca1eaa7cf3c6d242c32a5fa9a521c599dd1eb8afb1ea'}]}, 'timestamp': '2025-12-15 10:11:48.236503', '_unique_id': '934961d467b243a29556c415637836cf'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.237 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 15 05:11:48 localhost 
ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.237 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.237 12 ERROR oslo_messaging.notify.messaging yield Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.237 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.237 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.237 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.237 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.237 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.237 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.237 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.237 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.237 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.237 12 ERROR oslo_messaging.notify.messaging conn.connect() Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.237 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.237 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.237 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.237 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.237 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.237 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.237 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.237 12 ERROR oslo_messaging.notify.messaging Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.237 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.237 12 ERROR oslo_messaging.notify.messaging Dec 15 05:11:48 localhost 
ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.237 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.237 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.237 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.237 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.237 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.237 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.237 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.237 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.237 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.237 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection 
Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.237 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.237 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.237 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.237 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.237 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.237 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.237 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.237 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.237 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.237 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Dec 15 05:11:48 
localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.237 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.237 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.237 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.237 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.237 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.237 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.237 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.237 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.237 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.237 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.237 12 ERROR oslo_messaging.notify.messaging Dec 15 05:11:48 localhost 
ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.239 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.requests in the context of pollsters Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.239 12 DEBUG ceilometer.compute.pollsters [-] 39ff1bd9-6f6b-44c8-bbec-a1fd9d196359/disk.device.read.requests volume: 1272 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.239 12 DEBUG ceilometer.compute.pollsters [-] 39ff1bd9-6f6b-44c8-bbec-a1fd9d196359/disk.device.read.requests volume: 124 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.241 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '9da99342-fd89-46dd-9423-4d68ede53a57', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1272, 'user_id': '1ba5fce347b64bfebf995f187193f205', 'user_name': None, 'project_id': 'c785bf23f53946bc99867d8832a50266', 'project_name': None, 'resource_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359-vda', 'timestamp': '2025-12-15T10:11:48.239240', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359', 'instance_type': 'm1.small', 'host': 'bf4d507aa00b3fcf643a7e0b883420632ac7fc96e8c7a756982e121d', 'instance_host': 'np0005559462.localdomain', 'flavor': {'id': '2da0e147-aaa7-4bb9-a176-5fe1b15a32a0', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 
'7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e'}, 'image_ref': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '778ed4ba-d99e-11f0-817e-fa163ebaca0f', 'monotonic_time': 12574.389934023, 'message_signature': '08831b370c215a74229f4ecf90092f2ccad6cc5f518b54d774cd4d35efba29fd'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 124, 'user_id': '1ba5fce347b64bfebf995f187193f205', 'user_name': None, 'project_id': 'c785bf23f53946bc99867d8832a50266', 'project_name': None, 'resource_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359-vdb', 'timestamp': '2025-12-15T10:11:48.239240', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359', 'instance_type': 'm1.small', 'host': 'bf4d507aa00b3fcf643a7e0b883420632ac7fc96e8c7a756982e121d', 'instance_host': 'np0005559462.localdomain', 'flavor': {'id': '2da0e147-aaa7-4bb9-a176-5fe1b15a32a0', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e'}, 'image_ref': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '778ee518-d99e-11f0-817e-fa163ebaca0f', 'monotonic_time': 12574.389934023, 'message_signature': '23adf7833d7833419a8f6527a83daebc892c1c91ce0331200a681fd5ac7ca0ea'}]}, 'timestamp': '2025-12-15 10:11:48.240137', '_unique_id': '37b35958e6e64f59bdc3d48658e9070c'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.241 12 
ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.241 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.241 12 ERROR oslo_messaging.notify.messaging yield Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.241 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.241 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.241 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.241 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.241 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.241 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.241 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.241 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Dec 15 05:11:48 localhost 
ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.241 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.241 12 ERROR oslo_messaging.notify.messaging conn.connect() Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.241 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.241 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.241 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.241 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.241 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.241 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.241 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.241 12 ERROR oslo_messaging.notify.messaging Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.241 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 
2025-12-15 10:11:48.241 12 ERROR oslo_messaging.notify.messaging Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.241 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.241 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.241 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.241 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.241 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.241 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.241 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.241 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.241 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.241 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.241 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.241 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.241 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.241 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.241 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.241 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.241 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.241 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.241 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.241 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.241 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.241 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.241 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.241 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.241 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.241 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.241 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.241 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.241 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.241 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 15 05:11:48 localhost 
ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.241 12 ERROR oslo_messaging.notify.messaging Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.242 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.latency in the context of pollsters Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.242 12 DEBUG ceilometer.compute.pollsters [-] 39ff1bd9-6f6b-44c8-bbec-a1fd9d196359/disk.device.read.latency volume: 1342134926 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.243 12 DEBUG ceilometer.compute.pollsters [-] 39ff1bd9-6f6b-44c8-bbec-a1fd9d196359/disk.device.read.latency volume: 123356132 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.244 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': '72473d5c-ce78-4e15-b298-fee71bda0281', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 1342134926, 'user_id': '1ba5fce347b64bfebf995f187193f205', 'user_name': None, 'project_id': 'c785bf23f53946bc99867d8832a50266', 'project_name': None, 'resource_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359-vda', 'timestamp': '2025-12-15T10:11:48.242755', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359', 'instance_type': 'm1.small', 'host': 'bf4d507aa00b3fcf643a7e0b883420632ac7fc96e8c7a756982e121d', 'instance_host': 'np0005559462.localdomain', 'flavor': {'id': '2da0e147-aaa7-4bb9-a176-5fe1b15a32a0', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e'}, 'image_ref': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '778f6092-d99e-11f0-817e-fa163ebaca0f', 'monotonic_time': 12574.389934023, 'message_signature': '7d8d6692168ea7cd44f207b0c98a7307f996622873ed65afb22d8c4bb79869be'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 123356132, 'user_id': '1ba5fce347b64bfebf995f187193f205', 'user_name': None, 'project_id': 'c785bf23f53946bc99867d8832a50266', 'project_name': None, 'resource_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359-vdb', 'timestamp': '2025-12-15T10:11:48.242755', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 
'39ff1bd9-6f6b-44c8-bbec-a1fd9d196359', 'instance_type': 'm1.small', 'host': 'bf4d507aa00b3fcf643a7e0b883420632ac7fc96e8c7a756982e121d', 'instance_host': 'np0005559462.localdomain', 'flavor': {'id': '2da0e147-aaa7-4bb9-a176-5fe1b15a32a0', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e'}, 'image_ref': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '778f73a2-d99e-11f0-817e-fa163ebaca0f', 'monotonic_time': 12574.389934023, 'message_signature': '0d0f403fac827c1ac587a7159e248a3acb1590d1d13535353d4356810349a687'}]}, 'timestamp': '2025-12-15 10:11:48.243839', '_unique_id': 'f38d77d23e934c4a86754555260d348e'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.244 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.244 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.244 12 ERROR oslo_messaging.notify.messaging yield Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.244 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.244 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.244 12 ERROR 
oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.244 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.244 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.244 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.244 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.244 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.244 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.244 12 ERROR oslo_messaging.notify.messaging conn.connect() Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.244 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.244 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.244 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Dec 15 
05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.244 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.244 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.244 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.244 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.244 12 ERROR oslo_messaging.notify.messaging Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.244 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.244 12 ERROR oslo_messaging.notify.messaging Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.244 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.244 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.244 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.244 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Dec 15 05:11:48 localhost 
ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.244 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.244 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.244 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.244 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.244 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.244 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.244 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.244 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.244 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.244 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, 
in get Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.244 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.244 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.244 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.244 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.244 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.244 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.244 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.244 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.244 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.244 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 15 05:11:48 localhost 
ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.244 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.244 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.244 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.244 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.244 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.244 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.244 12 ERROR oslo_messaging.notify.messaging Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.246 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.allocation in the context of pollsters Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.246 12 DEBUG ceilometer.compute.pollsters [-] 39ff1bd9-6f6b-44c8-bbec-a1fd9d196359/disk.device.allocation volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.246 12 DEBUG ceilometer.compute.pollsters [-] 39ff1bd9-6f6b-44c8-bbec-a1fd9d196359/disk.device.allocation volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 15 05:11:48 
localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.248 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'c8daf3ee-7cd6-447c-a02a-58da1627418b', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '1ba5fce347b64bfebf995f187193f205', 'user_name': None, 'project_id': 'c785bf23f53946bc99867d8832a50266', 'project_name': None, 'resource_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359-vda', 'timestamp': '2025-12-15T10:11:48.246281', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359', 'instance_type': 'm1.small', 'host': 'bf4d507aa00b3fcf643a7e0b883420632ac7fc96e8c7a756982e121d', 'instance_host': 'np0005559462.localdomain', 'flavor': {'id': '2da0e147-aaa7-4bb9-a176-5fe1b15a32a0', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e'}, 'image_ref': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '778fe788-d99e-11f0-817e-fa163ebaca0f', 'monotonic_time': 12574.364436705, 'message_signature': '2faa295b093e15e98fb1a8cf4cd5945ec322275c6ce684ea13d0e8f4b042e5d7'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '1ba5fce347b64bfebf995f187193f205', 'user_name': None, 'project_id': 'c785bf23f53946bc99867d8832a50266', 'project_name': None, 'resource_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359-vdb', 'timestamp': 
'2025-12-15T10:11:48.246281', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359', 'instance_type': 'm1.small', 'host': 'bf4d507aa00b3fcf643a7e0b883420632ac7fc96e8c7a756982e121d', 'instance_host': 'np0005559462.localdomain', 'flavor': {'id': '2da0e147-aaa7-4bb9-a176-5fe1b15a32a0', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e'}, 'image_ref': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '778ff7aa-d99e-11f0-817e-fa163ebaca0f', 'monotonic_time': 12574.364436705, 'message_signature': '70fbe3bd58ffd42dfca27884d31032ef7e8bfa737d9f224e280069aa1657024b'}]}, 'timestamp': '2025-12-15 10:11:48.247164', '_unique_id': '8cfe6cbb9a4a43eb8e1aa39e40262305'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.248 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.248 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.248 12 ERROR oslo_messaging.notify.messaging yield Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.248 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.248 12 ERROR oslo_messaging.notify.messaging return retry_over_time( 
Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.248 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.248 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.248 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.248 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.248 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.248 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.248 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.248 12 ERROR oslo_messaging.notify.messaging conn.connect() Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.248 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.248 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.248 12 ERROR 
oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.248 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.248 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.248 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.248 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.248 12 ERROR oslo_messaging.notify.messaging Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.248 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.248 12 ERROR oslo_messaging.notify.messaging Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.248 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.248 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.248 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.248 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.248 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.248 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.248 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.248 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.248 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.248 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.248 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.248 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.248 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.248 
12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.248 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.248 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.248 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.248 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.248 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.248 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.248 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.248 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.248 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.248 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.248 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.248 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.248 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.248 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.248 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.248 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.248 12 ERROR oslo_messaging.notify.messaging Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.249 12 INFO ceilometer.polling.manager [-] Polling pollster cpu in the context of pollsters Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.249 12 DEBUG ceilometer.compute.pollsters [-] 39ff1bd9-6f6b-44c8-bbec-a1fd9d196359/cpu volume: 17090000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.250 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': '3a7b07c4-9c92-44f1-9655-1bd40939507e', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 17090000000, 'user_id': '1ba5fce347b64bfebf995f187193f205', 'user_name': None, 'project_id': 'c785bf23f53946bc99867d8832a50266', 'project_name': None, 'resource_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359', 'timestamp': '2025-12-15T10:11:48.249312', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359', 'instance_type': 'm1.small', 'host': 'bf4d507aa00b3fcf643a7e0b883420632ac7fc96e8c7a756982e121d', 'instance_host': 'np0005559462.localdomain', 'flavor': {'id': '2da0e147-aaa7-4bb9-a176-5fe1b15a32a0', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e'}, 'image_ref': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'cpu_number': 1}, 'message_id': '77905f42-d99e-11f0-817e-fa163ebaca0f', 'monotonic_time': 12574.360323334, 'message_signature': '5594355e5b31d3f6339851788610274b528e88fa7e36539bd76d8168f5d836c2'}]}, 'timestamp': '2025-12-15 10:11:48.249808', '_unique_id': 'b0867c938e7f497ca97e0c3569747c24'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.250 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.250 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in 
_reraise_as_library_errors Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.250 12 ERROR oslo_messaging.notify.messaging yield Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.250 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.250 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.250 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.250 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.250 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.250 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.250 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.250 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.250 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.250 12 
ERROR oslo_messaging.notify.messaging conn.connect() Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.250 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.250 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.250 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.250 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.250 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.250 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.250 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.250 12 ERROR oslo_messaging.notify.messaging Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.250 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.250 12 ERROR oslo_messaging.notify.messaging Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.250 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 
2025-12-15 10:11:48.250 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.250 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.250 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.250 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.250 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.250 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.250 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.250 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.250 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.250 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Dec 15 
05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.250 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.250 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.250 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.250 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.250 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.250 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.250 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.250 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.250 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.250 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Dec 15 05:11:48 localhost 
ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.250 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.250 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.250 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.250 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.250 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.250 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.250 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.250 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.250 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.250 12 ERROR oslo_messaging.notify.messaging Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.252 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes in the context of pollsters Dec 15 05:11:48 
localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.252 12 DEBUG ceilometer.compute.pollsters [-] 39ff1bd9-6f6b-44c8-bbec-a1fd9d196359/network.incoming.bytes volume: 6809 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.253 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '06ef07e4-e299-49c2-9947-74176eb0461b', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 6809, 'user_id': '1ba5fce347b64bfebf995f187193f205', 'user_name': None, 'project_id': 'c785bf23f53946bc99867d8832a50266', 'project_name': None, 'resource_id': 'instance-00000002-39ff1bd9-6f6b-44c8-bbec-a1fd9d196359-tap03ef8889-32', 'timestamp': '2025-12-15T10:11:48.252180', 'resource_metadata': {'display_name': 'test', 'name': 'tap03ef8889-32', 'instance_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359', 'instance_type': 'm1.small', 'host': 'bf4d507aa00b3fcf643a7e0b883420632ac7fc96e8c7a756982e121d', 'instance_host': 'np0005559462.localdomain', 'flavor': {'id': '2da0e147-aaa7-4bb9-a176-5fe1b15a32a0', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e'}, 'image_ref': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:74:df:7c', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap03ef8889-32'}, 'message_id': '7790ce78-d99e-11f0-817e-fa163ebaca0f', 'monotonic_time': 12574.319327168, 
'message_signature': '4707789c5d8fd58b40d78ebdfec86eb50ae231047776c40d078bee23e4b26aa2'}]}, 'timestamp': '2025-12-15 10:11:48.252667', '_unique_id': '4dd144b3b9de46b28ed5d7b6d63ca468'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.253 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.253 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.253 12 ERROR oslo_messaging.notify.messaging yield Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.253 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.253 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.253 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.253 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.253 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.253 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.253 12 ERROR 
oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.253 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.253 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.253 12 ERROR oslo_messaging.notify.messaging conn.connect() Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.253 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.253 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.253 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.253 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.253 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.253 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.253 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Dec 15 05:11:48 localhost 
ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.253 12 ERROR oslo_messaging.notify.messaging Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.253 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.253 12 ERROR oslo_messaging.notify.messaging Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.253 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.253 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.253 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.253 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.253 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.253 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.253 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.253 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", 
line 653, in _send Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.253 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.253 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.253 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.253 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.253 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.253 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.253 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.253 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.253 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.253 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.253 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.253 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.253 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.253 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.253 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.253 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.253 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.253 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.253 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.253 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Dec 15 05:11:48 localhost 
ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.253 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.253 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.253 12 ERROR oslo_messaging.notify.messaging Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.254 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.requests in the context of pollsters Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.254 12 DEBUG ceilometer.compute.pollsters [-] 39ff1bd9-6f6b-44c8-bbec-a1fd9d196359/disk.device.write.requests volume: 47 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.254 12 DEBUG ceilometer.compute.pollsters [-] 39ff1bd9-6f6b-44c8-bbec-a1fd9d196359/disk.device.write.requests volume: 1 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.255 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': '0ad50e2a-7399-403f-841f-be3a5bb7ae45', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 47, 'user_id': '1ba5fce347b64bfebf995f187193f205', 'user_name': None, 'project_id': 'c785bf23f53946bc99867d8832a50266', 'project_name': None, 'resource_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359-vda', 'timestamp': '2025-12-15T10:11:48.254467', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359', 'instance_type': 'm1.small', 'host': 'bf4d507aa00b3fcf643a7e0b883420632ac7fc96e8c7a756982e121d', 'instance_host': 'np0005559462.localdomain', 'flavor': {'id': '2da0e147-aaa7-4bb9-a176-5fe1b15a32a0', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e'}, 'image_ref': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '779123b4-d99e-11f0-817e-fa163ebaca0f', 'monotonic_time': 12574.389934023, 'message_signature': '4714b204acc3342117d169f44009d83417fa3a1b649641bbfa1629b74150c95a'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1, 'user_id': '1ba5fce347b64bfebf995f187193f205', 'user_name': None, 'project_id': 'c785bf23f53946bc99867d8832a50266', 'project_name': None, 'resource_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359-vdb', 'timestamp': '2025-12-15T10:11:48.254467', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 
'39ff1bd9-6f6b-44c8-bbec-a1fd9d196359', 'instance_type': 'm1.small', 'host': 'bf4d507aa00b3fcf643a7e0b883420632ac7fc96e8c7a756982e121d', 'instance_host': 'np0005559462.localdomain', 'flavor': {'id': '2da0e147-aaa7-4bb9-a176-5fe1b15a32a0', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e'}, 'image_ref': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '77912e2c-d99e-11f0-817e-fa163ebaca0f', 'monotonic_time': 12574.389934023, 'message_signature': '274ae5314ba73945e93e3970215449188f189a39b33c7b1a8c6308a106c6ed2d'}]}, 'timestamp': '2025-12-15 10:11:48.255031', '_unique_id': '73c98305b9ba4a918c7bbdc360383a6a'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.255 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.255 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.255 12 ERROR oslo_messaging.notify.messaging yield Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.255 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.255 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.255 12 ERROR 
oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.255 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.255 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.255 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.255 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.255 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.255 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.255 12 ERROR oslo_messaging.notify.messaging conn.connect() Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.255 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.255 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.255 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Dec 15 
05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.255 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.255 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.255 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.255 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.255 12 ERROR oslo_messaging.notify.messaging Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.255 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.255 12 ERROR oslo_messaging.notify.messaging Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.255 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.255 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.255 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.255 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Dec 15 05:11:48 localhost 
ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.255 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.255 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.255 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.255 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.255 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.255 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.255 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.255 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.255 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.255 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, 
in get Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.255 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.255 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.255 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.255 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.255 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.255 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.255 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.255 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.255 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.255 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 15 05:11:48 localhost 
ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.255 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.255 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.255 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.255 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.255 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.255 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 15 05:11:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:11:48.255 12 ERROR oslo_messaging.notify.messaging Dec 15 05:11:49 localhost nova_compute[286344]: 2025-12-15 10:11:49.278 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:11:50 localhost ceph-mon[298913]: mon.np0005559462@0(leader) e17 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) Dec 15 05:11:50 localhost ceph-mon[298913]: log_channel(audit) log [DBG] : from='client.25625 172.18.0.34:0/382777224' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch Dec 15 05:11:50 localhost ceph-mon[298913]: mon.np0005559462@0(leader).osd e242 do_prune osdmap full prune enabled Dec 15 05:11:50 localhost ceph-mon[298913]: mon.np0005559462@0(leader).osd e243 e243: 6 total, 6 up, 6 in Dec 15 05:11:50 localhost 
ceph-mon[298913]: log_channel(cluster) log [DBG] : osdmap e243: 6 total, 6 up, 6 in Dec 15 05:11:51 localhost ceph-mon[298913]: mon.np0005559462@0(leader).osd e243 do_prune osdmap full prune enabled Dec 15 05:11:51 localhost ceph-mon[298913]: mon.np0005559462@0(leader).osd e244 e244: 6 total, 6 up, 6 in Dec 15 05:11:51 localhost ceph-mon[298913]: log_channel(cluster) log [DBG] : osdmap e244: 6 total, 6 up, 6 in Dec 15 05:11:51 localhost ovn_metadata_agent[160585]: 2025-12-15 10:11:51.486 160590 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Dec 15 05:11:51 localhost ovn_metadata_agent[160585]: 2025-12-15 10:11:51.487 160590 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Dec 15 05:11:51 localhost ovn_metadata_agent[160585]: 2025-12-15 10:11:51.487 160590 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Dec 15 05:11:51 localhost nova_compute[286344]: 2025-12-15 10:11:51.541 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:11:51 localhost ceph-mon[298913]: mon.np0005559462@0(leader).osd e244 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104 Dec 15 05:11:51 localhost ceph-mon[298913]: mon.np0005559462@0(leader).osd e244 do_prune osdmap full prune enabled Dec 15 05:11:51 localhost ceph-mon[298913]: 
mon.np0005559462@0(leader).osd e245 e245: 6 total, 6 up, 6 in Dec 15 05:11:51 localhost ceph-mon[298913]: log_channel(cluster) log [DBG] : osdmap e245: 6 total, 6 up, 6 in Dec 15 05:11:51 localhost systemd[1]: Started /usr/bin/podman healthcheck run 67ee72fcd99c896f79e8807764b1d0f04e7d45927d06e306c0986394e8ed42a0. Dec 15 05:11:51 localhost systemd[1]: Started /usr/bin/podman healthcheck run 730c50020a7d9e2da21d2462d40af20ac303e43f3491f692cbbd87f432ba3e09. Dec 15 05:11:51 localhost systemd[1]: Started /usr/bin/podman healthcheck run 9f0f481c937606298203797a71924b7614a5d9e3f4a8afd837cfa5b9602aedeb. Dec 15 05:11:51 localhost systemd[1]: Started /usr/bin/podman healthcheck run ac2eae30a2a7f971398af42ea67b3ed799d4b3341f7688ebcd73f7ca7f66c2a8. Dec 15 05:11:51 localhost systemd[1]: Started /usr/bin/podman healthcheck run b3d8483ade539500e228c2af9c0c3e647febb5ec7903610a83aea32631007d4a. Dec 15 05:11:51 localhost podman[335826]: 2025-12-15 10:11:51.753812888 +0000 UTC m=+0.075309503 container health_status ac2eae30a2a7f971398af42ea67b3ed799d4b3341f7688ebcd73f7ca7f66c2a8 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_managed=true, config_id=ovn_controller, managed_by=edpm_ansible, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '07f3af15d79276418f54a8dde2fc251064527c3571430e14af77aaa2c01f1193'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', 
'/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=ovn_controller, org.label-schema.schema-version=1.0) Dec 15 05:11:51 localhost podman[335820]: 2025-12-15 10:11:51.806520671 +0000 UTC m=+0.130620416 container health_status 9f0f481c937606298203797a71924b7614a5d9e3f4a8afd837cfa5b9602aedeb (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', 
'/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, container_name=multipathd, org.label-schema.name=CentOS Stream 9 Base Image) Dec 15 05:11:51 localhost podman[335820]: 2025-12-15 10:11:51.819344466 +0000 UTC m=+0.143444221 container exec_died 9f0f481c937606298203797a71924b7614a5d9e3f4a8afd837cfa5b9602aedeb (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, container_name=multipathd, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', 
'/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS) Dec 15 05:11:51 localhost systemd[1]: 9f0f481c937606298203797a71924b7614a5d9e3f4a8afd837cfa5b9602aedeb.service: Deactivated successfully. Dec 15 05:11:51 localhost podman[335828]: 2025-12-15 10:11:51.868720719 +0000 UTC m=+0.186382401 container health_status b3d8483ade539500e228c2af9c0c3e647febb5ec7903610a83aea32631007d4a (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, container_name=ceilometer_agent_compute, org.label-schema.vendor=CentOS, config_id=ceilometer_agent_compute, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '07f3af15d79276418f54a8dde2fc251064527c3571430e14af77aaa2c01f1193-cb1e9478634a00c8fb5ad9cd7fcd38c7b2a1a85187dfaa618bc715c24c2b15fb'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image) Dec 15 05:11:51 localhost podman[335818]: 2025-12-15 10:11:51.911429301 +0000 UTC m=+0.242293578 container health_status 67ee72fcd99c896f79e8807764b1d0f04e7d45927d06e306c0986394e8ed42a0 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible) Dec 15 05:11:51 localhost 
podman[335818]: 2025-12-15 10:11:51.922350755 +0000 UTC m=+0.253215022 container exec_died 67ee72fcd99c896f79e8807764b1d0f04e7d45927d06e306c0986394e8ed42a0 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter) Dec 15 05:11:51 localhost podman[335828]: 2025-12-15 10:11:51.928697416 +0000 UTC m=+0.246359108 container exec_died b3d8483ade539500e228c2af9c0c3e647febb5ec7903610a83aea32631007d4a (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, config_id=ceilometer_agent_compute, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, config_data={'command': 'kolla_start', 'environment': 
{'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '07f3af15d79276418f54a8dde2fc251064527c3571430e14af77aaa2c01f1193-cb1e9478634a00c8fb5ad9cd7fcd38c7b2a1a85187dfaa618bc715c24c2b15fb'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0) Dec 15 05:11:51 localhost systemd[1]: 67ee72fcd99c896f79e8807764b1d0f04e7d45927d06e306c0986394e8ed42a0.service: Deactivated successfully. 
Dec 15 05:11:51 localhost podman[335826]: 2025-12-15 10:11:51.939127238 +0000 UTC m=+0.260623813 container exec_died ac2eae30a2a7f971398af42ea67b3ed799d4b3341f7688ebcd73f7ca7f66c2a8 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, config_id=ovn_controller, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '07f3af15d79276418f54a8dde2fc251064527c3571430e14af77aaa2c01f1193'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS) Dec 15 05:11:51 localhost systemd[1]: b3d8483ade539500e228c2af9c0c3e647febb5ec7903610a83aea32631007d4a.service: Deactivated successfully. Dec 15 05:11:51 localhost systemd[1]: ac2eae30a2a7f971398af42ea67b3ed799d4b3341f7688ebcd73f7ca7f66c2a8.service: Deactivated successfully. 
Dec 15 05:11:52 localhost podman[335819]: 2025-12-15 10:11:52.025619282 +0000 UTC m=+0.351215397 container health_status 730c50020a7d9e2da21d2462d40af20ac303e43f3491f692cbbd87f432ba3e09 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, distribution-scope=public, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., config_id=openstack_network_exporter, managed_by=edpm_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, version=9.6, io.buildah.version=1.33.7, io.openshift.tags=minimal rhel9, maintainer=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, vendor=Red Hat, Inc., config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'cb1e9478634a00c8fb5ad9cd7fcd38c7b2a1a85187dfaa618bc715c24c2b15fb'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', 
'/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, build-date=2025-08-20T13:12:41, io.openshift.expose-services=, release=1755695350, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, name=ubi9-minimal, architecture=x86_64, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, container_name=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.component=ubi9-minimal-container) Dec 15 05:11:52 localhost podman[335819]: 2025-12-15 10:11:52.039753273 +0000 UTC m=+0.365349388 container exec_died 730c50020a7d9e2da21d2462d40af20ac303e43f3491f692cbbd87f432ba3e09 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, architecture=x86_64, container_name=openstack_network_exporter, release=1755695350, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., io.buildah.version=1.33.7, vcs-type=git, name=ubi9-minimal, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, build-date=2025-08-20T13:12:41, maintainer=Red Hat, Inc., config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'cb1e9478634a00c8fb5ad9cd7fcd38c7b2a1a85187dfaa618bc715c24c2b15fb'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, config_id=openstack_network_exporter, com.redhat.component=ubi9-minimal-container, version=9.6, io.openshift.expose-services=, io.openshift.tags=minimal rhel9, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., url=https://catalog.redhat.com/en/search?searchType=containers, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, distribution-scope=public) Dec 15 05:11:52 localhost systemd[1]: 730c50020a7d9e2da21d2462d40af20ac303e43f3491f692cbbd87f432ba3e09.service: Deactivated successfully. 
Dec 15 05:11:52 localhost ceph-mon[298913]: mon.np0005559462@0(leader).osd e245 do_prune osdmap full prune enabled Dec 15 05:11:52 localhost ceph-mon[298913]: mon.np0005559462@0(leader).osd e246 e246: 6 total, 6 up, 6 in Dec 15 05:11:52 localhost ceph-mon[298913]: log_channel(cluster) log [DBG] : osdmap e246: 6 total, 6 up, 6 in Dec 15 05:11:54 localhost nova_compute[286344]: 2025-12-15 10:11:54.301 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:11:55 localhost ceph-mon[298913]: mon.np0005559462@0(leader).osd e246 do_prune osdmap full prune enabled Dec 15 05:11:55 localhost ceph-mon[298913]: mon.np0005559462@0(leader).osd e247 e247: 6 total, 6 up, 6 in Dec 15 05:11:55 localhost ceph-mon[298913]: log_channel(cluster) log [DBG] : osdmap e247: 6 total, 6 up, 6 in Dec 15 05:11:55 localhost systemd[1]: Started /usr/bin/podman healthcheck run 4d46c406a3855fd1c322e5d86c048188ea5865daa07ba23aaebacc7879668923. 
Dec 15 05:11:55 localhost podman[335922]: 2025-12-15 10:11:55.759457334 +0000 UTC m=+0.090767831 container health_status 4d46c406a3855fd1c322e5d86c048188ea5865daa07ba23aaebacc7879668923 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, org.label-schema.build-date=20251202, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '07f3af15d79276418f54a8dde2fc251064527c3571430e14af77aaa2c01f1193-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team) Dec 15 05:11:55 localhost 
podman[335922]: 2025-12-15 10:11:55.768642581 +0000 UTC m=+0.099953118 container exec_died 4d46c406a3855fd1c322e5d86c048188ea5865daa07ba23aaebacc7879668923 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=ovn_metadata_agent, managed_by=edpm_ansible, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '07f3af15d79276418f54a8dde2fc251064527c3571430e14af77aaa2c01f1193-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.license=GPLv2) Dec 15 05:11:55 localhost systemd[1]: 
4d46c406a3855fd1c322e5d86c048188ea5865daa07ba23aaebacc7879668923.service: Deactivated successfully. Dec 15 05:11:56 localhost nova_compute[286344]: 2025-12-15 10:11:56.543 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:11:56 localhost ceph-mon[298913]: mon.np0005559462@0(leader) e17 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) Dec 15 05:11:56 localhost ceph-mon[298913]: log_channel(audit) log [DBG] : from='client.25625 172.18.0.34:0/382777224' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch Dec 15 05:11:56 localhost ceph-mon[298913]: mon.np0005559462@0(leader).osd e247 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104 Dec 15 05:11:56 localhost ceph-mon[298913]: mon.np0005559462@0(leader).osd e247 do_prune osdmap full prune enabled Dec 15 05:11:56 localhost ceph-mon[298913]: mon.np0005559462@0(leader).osd e248 e248: 6 total, 6 up, 6 in Dec 15 05:11:56 localhost ceph-mon[298913]: log_channel(cluster) log [DBG] : osdmap e248: 6 total, 6 up, 6 in Dec 15 05:11:59 localhost nova_compute[286344]: 2025-12-15 10:11:59.305 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:12:00 localhost ceph-mon[298913]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS ------- Dec 15 05:12:00 localhost ceph-mon[298913]: rocksdb: [db/db_impl/db_impl.cc:1111] #012** DB Stats **#012Uptime(secs): 1200.0 total, 600.0 interval#012Cumulative writes: 5398 writes, 37K keys, 5398 commit groups, 1.0 writes per commit group, ingest: 0.06 GB, 0.05 MB/s#012Cumulative WAL: 5398 writes, 5398 syncs, 1.00 writes per sync, written: 0.06 GB, 0.05 MB/s#012Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent#012Interval writes: 2329 writes, 11K keys, 2329 commit groups, 1.0 writes per commit 
group, ingest: 10.46 MB, 0.02 MB/s#012Interval WAL: 2329 writes, 2329 syncs, 1.00 writes per sync, written: 0.01 GB, 0.02 MB/s#012Interval stall: 00:00:0.000 H:M:S, 0.0 percent#012#012** Compaction Stats [default] **#012Level Files Size Score Read(GB) Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012 L0 0/0 0.00 KB 0.0 0.0 0.0 0.0 0.0 0.0 0.0 1.0 0.0 123.5 0.37 0.12 20 0.019 0 0 0.0 0.0#012 L6 1/0 16.84 MB 0.0 0.3 0.0 0.3 0.3 0.0 0.0 7.0 152.8 140.2 2.30 0.90 19 0.121 239K 9942 0.0 0.0#012 Sum 1/0 16.84 MB 0.0 0.3 0.0 0.3 0.4 0.1 0.0 8.0 131.5 137.8 2.67 1.03 39 0.068 239K 9942 0.0 0.0#012 Int 0/0 0.00 KB 0.0 0.1 0.0 0.1 0.1 0.0 0.0 14.5 129.4 129.2 1.13 0.45 16 0.071 110K 4274 0.0 0.0#012#012** Compaction Stats [default] **#012Priority Files Size Score Read(GB) Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012 Low 0/0 0.00 KB 0.0 0.3 0.0 0.3 0.3 0.0 0.0 0.0 152.8 140.2 2.30 0.90 19 0.121 239K 9942 0.0 0.0#012High 0/0 0.00 KB 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 124.9 0.37 0.12 19 0.019 0 0 0.0 0.0#012User 0/0 0.00 KB 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.5 0.00 0.00 1 0.004 0 0 0.0 0.0#012#012Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0#012#012Uptime(secs): 1200.0 total, 600.0 interval#012Flush(GB): cumulative 0.045, interval 0.010#012AddFile(GB): cumulative 0.000, interval 0.000#012AddFile(Total Files): cumulative 0, interval 0#012AddFile(L0 
Files): cumulative 0, interval 0#012AddFile(Keys): cumulative 0, interval 0#012Cumulative compaction: 0.36 GB write, 0.31 MB/s write, 0.34 GB read, 0.29 MB/s read, 2.7 seconds#012Interval compaction: 0.14 GB write, 0.24 MB/s write, 0.14 GB read, 0.24 MB/s read, 1.1 seconds#012Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count#012Block cache BinnedLRUCache@0x55e4c4afd350#2 capacity: 304.00 MB usage: 31.12 MB table_size: 0 occupancy: 18446744073709551615 collections: 3 last_copies: 0 last_secs: 0.000193 secs_since: 0#012Block cache entry stats(count,size,portion): DataBlock(1596,29.58 MB,9.72871%) FilterBlock(39,687.42 KB,0.220826%) IndexBlock(39,892.67 KB,0.28676%) Misc(1,0.00 KB,0%)#012#012** File Read Latency Histogram By Level [default] ** Dec 15 05:12:01 localhost ceph-mon[298913]: mon.np0005559462@0(leader) e17 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) Dec 15 05:12:01 localhost ceph-mon[298913]: log_channel(audit) log [DBG] : from='client.25625 172.18.0.34:0/382777224' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch Dec 15 05:12:01 localhost ceph-mon[298913]: mon.np0005559462@0(leader).osd e248 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104 Dec 15 05:12:01 localhost nova_compute[286344]: 2025-12-15 10:12:01.585 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:12:01 localhost podman[243449]: time="2025-12-15T10:12:01Z" level=info msg="List containers: received `last` parameter - overwriting `limit`" Dec 15 05:12:01 localhost podman[243449]: @ - - [15/Dec/2025:10:12:01 +0000] "GET 
/v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 158464 "" "Go-http-client/1.1" Dec 15 05:12:01 localhost ceph-mon[298913]: mon.np0005559462@0(leader) e17 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) Dec 15 05:12:01 localhost ceph-mon[298913]: log_channel(audit) log [DBG] : from='client.25625 172.18.0.34:0/382777224' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch Dec 15 05:12:01 localhost podman[243449]: @ - - [15/Dec/2025:10:12:01 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 19750 "" "Go-http-client/1.1" Dec 15 05:12:02 localhost ceph-mon[298913]: mon.np0005559462@0(leader).osd e248 do_prune osdmap full prune enabled Dec 15 05:12:02 localhost ceph-mon[298913]: mon.np0005559462@0(leader).osd e249 e249: 6 total, 6 up, 6 in Dec 15 05:12:02 localhost ceph-mon[298913]: log_channel(cluster) log [DBG] : osdmap e249: 6 total, 6 up, 6 in Dec 15 05:12:02 localhost systemd[1]: Started /usr/bin/podman healthcheck run a4e94bd7aaee6c174c601060fb5f34559019ccea93fc397263ee3735731cfc1e. 
Dec 15 05:12:02 localhost podman[335940]: 2025-12-15 10:12:02.760691603 +0000 UTC m=+0.082171067 container health_status a4e94bd7aaee6c174c601060fb5f34559019ccea93fc397263ee3735731cfc1e (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, container_name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter) Dec 15 05:12:02 localhost podman[335940]: 2025-12-15 10:12:02.798389441 +0000 UTC m=+0.119868895 container exec_died a4e94bd7aaee6c174c601060fb5f34559019ccea93fc397263ee3735731cfc1e (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': 
['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter) Dec 15 05:12:02 localhost systemd[1]: a4e94bd7aaee6c174c601060fb5f34559019ccea93fc397263ee3735731cfc1e.service: Deactivated successfully. Dec 15 05:12:03 localhost ceph-mon[298913]: mon.np0005559462@0(leader) e17 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) Dec 15 05:12:03 localhost ceph-mon[298913]: log_channel(audit) log [DBG] : from='client.25625 172.18.0.34:0/382777224' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch Dec 15 05:12:04 localhost nova_compute[286344]: 2025-12-15 10:12:04.341 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:12:04 localhost openstack_network_exporter[246484]: ERROR 10:12:04 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server Dec 15 05:12:04 localhost openstack_network_exporter[246484]: ERROR 10:12:04 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd Dec 15 05:12:04 localhost openstack_network_exporter[246484]: ERROR 10:12:04 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd Dec 15 05:12:04 localhost openstack_network_exporter[246484]: ERROR 10:12:04 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath Dec 15 05:12:04 localhost openstack_network_exporter[246484]: Dec 15 05:12:04 localhost openstack_network_exporter[246484]: ERROR 10:12:04 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath Dec 15 05:12:04 localhost openstack_network_exporter[246484]: Dec 15 05:12:05 localhost ceph-mon[298913]: mon.np0005559462@0(leader) e17 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) Dec 15 05:12:05 localhost 
ceph-mon[298913]: log_channel(audit) log [DBG] : from='client.25625 172.18.0.34:0/382777224' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch Dec 15 05:12:06 localhost nova_compute[286344]: 2025-12-15 10:12:06.587 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:12:06 localhost ceph-mon[298913]: mon.np0005559462@0(leader).osd e249 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104 Dec 15 05:12:06 localhost ceph-mon[298913]: mon.np0005559462@0(leader).osd e249 do_prune osdmap full prune enabled Dec 15 05:12:06 localhost ceph-mon[298913]: mon.np0005559462@0(leader).osd e250 e250: 6 total, 6 up, 6 in Dec 15 05:12:06 localhost ceph-mon[298913]: log_channel(cluster) log [DBG] : osdmap e250: 6 total, 6 up, 6 in Dec 15 05:12:08 localhost ceph-mon[298913]: mon.np0005559462@0(leader) e17 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) Dec 15 05:12:08 localhost ceph-mon[298913]: log_channel(audit) log [DBG] : from='client.25625 172.18.0.34:0/382777224' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch Dec 15 05:12:09 localhost ceph-mon[298913]: mon.np0005559462@0(leader) e17 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) Dec 15 05:12:09 localhost ceph-mon[298913]: log_channel(audit) log [DBG] : from='client.25625 172.18.0.34:0/382777224' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch Dec 15 05:12:09 localhost nova_compute[286344]: 2025-12-15 10:12:09.375 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:12:09 localhost nova_compute[286344]: 2025-12-15 10:12:09.390 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup 
/usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:12:09 localhost ovn_metadata_agent[160585]: 2025-12-15 10:12:09.390 160590 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=20, ssl=[], options={'arp_ns_explicit_output': 'true', 'fdb_removal_limit': '0', 'ignore_lsp_down': 'false', 'mac_binding_removal_limit': '0', 'mac_prefix': 'fe:17:e3', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': 'fe:55:2b:86:15:b5'}, ipsec=False) old=SB_Global(nb_cfg=19) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Dec 15 05:12:09 localhost ovn_metadata_agent[160585]: 2025-12-15 10:12:09.391 160590 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 2 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m Dec 15 05:12:10 localhost ceph-mon[298913]: mon.np0005559462@0(leader) e17 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) Dec 15 05:12:10 localhost ceph-mon[298913]: log_channel(audit) log [DBG] : from='client.25625 172.18.0.34:0/382777224' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch Dec 15 05:12:10 localhost ceph-mon[298913]: mon.np0005559462@0(leader) e17 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/osd_remove_queue}] v 0) Dec 15 05:12:10 localhost ceph-mon[298913]: log_channel(audit) log [INF] : from='mgr.34411 ' entity='mgr.np0005559464.aomnqe' Dec 15 05:12:11 localhost ceph-mon[298913]: mon.np0005559462@0(leader) e17 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) Dec 15 05:12:11 localhost ceph-mon[298913]: log_channel(audit) log [DBG] : from='client.? 
172.18.0.32:0/1920188860' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch Dec 15 05:12:11 localhost ovn_metadata_agent[160585]: 2025-12-15 10:12:11.393 160590 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=12d96d64-e862-4f68-81e5-8d9ec5d3a5e2, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '20'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m Dec 15 05:12:11 localhost ceph-mon[298913]: from='mgr.34411 172.18.0.108:0/929118679' entity='mgr.np0005559464.aomnqe' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch Dec 15 05:12:11 localhost ceph-mon[298913]: from='mgr.34411 ' entity='mgr.np0005559464.aomnqe' Dec 15 05:12:11 localhost nova_compute[286344]: 2025-12-15 10:12:11.591 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:12:11 localhost ceph-mon[298913]: mon.np0005559462@0(leader).osd e250 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104 Dec 15 05:12:12 localhost ceph-mon[298913]: log_channel(cluster) log [DBG] : mgrmap e51: np0005559464.aomnqe(active, since 14m), standbys: np0005559462.fudvyx, np0005559463.daptkf Dec 15 05:12:13 localhost ceph-mon[298913]: mon.np0005559462@0(leader) e17 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) Dec 15 05:12:13 localhost ceph-mon[298913]: log_channel(audit) log [DBG] : from='client.? 
172.18.0.32:0/3849689912' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch Dec 15 05:12:13 localhost nova_compute[286344]: 2025-12-15 10:12:13.542 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:12:13 localhost ceph-mon[298913]: mon.np0005559462@0(leader).osd e250 do_prune osdmap full prune enabled Dec 15 05:12:13 localhost ceph-mon[298913]: mon.np0005559462@0(leader).osd e251 e251: 6 total, 6 up, 6 in Dec 15 05:12:13 localhost ceph-mon[298913]: log_channel(cluster) log [DBG] : osdmap e251: 6 total, 6 up, 6 in Dec 15 05:12:14 localhost nova_compute[286344]: 2025-12-15 10:12:14.418 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:12:14 localhost ceph-mon[298913]: mon.np0005559462@0(leader) e17 handle_command mon_command([{prefix=config-key set, key=mgr/progress/completed}] v 0) Dec 15 05:12:14 localhost ceph-mon[298913]: log_channel(audit) log [INF] : from='mgr.34411 ' entity='mgr.np0005559464.aomnqe' Dec 15 05:12:14 localhost ceph-mon[298913]: mon.np0005559462@0(leader).osd e251 do_prune osdmap full prune enabled Dec 15 05:12:14 localhost ceph-mon[298913]: mon.np0005559462@0(leader).osd e252 e252: 6 total, 6 up, 6 in Dec 15 05:12:14 localhost ceph-mon[298913]: log_channel(cluster) log [DBG] : osdmap e252: 6 total, 6 up, 6 in Dec 15 05:12:14 localhost ceph-mon[298913]: from='mgr.34411 ' entity='mgr.np0005559464.aomnqe' Dec 15 05:12:15 localhost ceph-mon[298913]: mon.np0005559462@0(leader) e17 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) Dec 15 05:12:15 localhost ceph-mon[298913]: log_channel(audit) log [DBG] : from='client.25625 172.18.0.34:0/382777224' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch Dec 15 05:12:15 localhost ceph-mon[298913]: mon.np0005559462@0(leader) e17 
handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) Dec 15 05:12:15 localhost ceph-mon[298913]: log_channel(audit) log [DBG] : from='client.25625 172.18.0.34:0/382777224' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch Dec 15 05:12:16 localhost nova_compute[286344]: 2025-12-15 10:12:16.028 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:12:16 localhost nova_compute[286344]: 2025-12-15 10:12:16.593 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:12:16 localhost ceph-mon[298913]: mon.np0005559462@0(leader).osd e252 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104 Dec 15 05:12:16 localhost ceph-mon[298913]: mon.np0005559462@0(leader).osd e252 do_prune osdmap full prune enabled Dec 15 05:12:16 localhost ceph-mon[298913]: mon.np0005559462@0(leader).osd e253 e253: 6 total, 6 up, 6 in Dec 15 05:12:16 localhost ceph-mon[298913]: log_channel(cluster) log [DBG] : osdmap e253: 6 total, 6 up, 6 in Dec 15 05:12:17 localhost nova_compute[286344]: 2025-12-15 10:12:17.270 286348 DEBUG oslo_service.periodic_task [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Dec 15 05:12:17 localhost nova_compute[286344]: 2025-12-15 10:12:17.270 286348 DEBUG oslo_service.periodic_task [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Dec 15 05:12:17 localhost ceph-mon[298913]: mon.np0005559462@0(leader).osd e253 do_prune osdmap full prune enabled Dec 15 05:12:17 
localhost ceph-mon[298913]: mon.np0005559462@0(leader).osd e254 e254: 6 total, 6 up, 6 in Dec 15 05:12:18 localhost ceph-mon[298913]: log_channel(cluster) log [DBG] : osdmap e254: 6 total, 6 up, 6 in Dec 15 05:12:18 localhost ceph-mon[298913]: log_channel(cluster) log [DBG] : mgrmap e52: np0005559464.aomnqe(active, since 14m), standbys: np0005559462.fudvyx, np0005559463.daptkf Dec 15 05:12:18 localhost nova_compute[286344]: 2025-12-15 10:12:18.270 286348 DEBUG oslo_service.periodic_task [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Dec 15 05:12:19 localhost nova_compute[286344]: 2025-12-15 10:12:19.271 286348 DEBUG oslo_service.periodic_task [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Dec 15 05:12:19 localhost nova_compute[286344]: 2025-12-15 10:12:19.271 286348 DEBUG nova.compute.manager [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m Dec 15 05:12:19 localhost nova_compute[286344]: 2025-12-15 10:12:19.272 286348 DEBUG nova.compute.manager [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m Dec 15 05:12:19 localhost nova_compute[286344]: 2025-12-15 10:12:19.454 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:12:19 localhost nova_compute[286344]: 2025-12-15 10:12:19.606 286348 DEBUG oslo_concurrency.lockutils [None 
req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Acquiring lock "refresh_cache-39ff1bd9-6f6b-44c8-bbec-a1fd9d196359" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m Dec 15 05:12:19 localhost nova_compute[286344]: 2025-12-15 10:12:19.606 286348 DEBUG oslo_concurrency.lockutils [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Acquired lock "refresh_cache-39ff1bd9-6f6b-44c8-bbec-a1fd9d196359" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m Dec 15 05:12:19 localhost nova_compute[286344]: 2025-12-15 10:12:19.607 286348 DEBUG nova.network.neutron [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] [instance: 39ff1bd9-6f6b-44c8-bbec-a1fd9d196359] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m Dec 15 05:12:19 localhost nova_compute[286344]: 2025-12-15 10:12:19.607 286348 DEBUG nova.objects.instance [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Lazy-loading 'info_cache' on Instance uuid 39ff1bd9-6f6b-44c8-bbec-a1fd9d196359 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m Dec 15 05:12:20 localhost ceph-mon[298913]: mon.np0005559462@0(leader).osd e254 do_prune osdmap full prune enabled Dec 15 05:12:20 localhost ceph-mon[298913]: mon.np0005559462@0(leader).osd e255 e255: 6 total, 6 up, 6 in Dec 15 05:12:20 localhost ceph-mon[298913]: log_channel(cluster) log [DBG] : osdmap e255: 6 total, 6 up, 6 in Dec 15 05:12:20 localhost nova_compute[286344]: 2025-12-15 10:12:20.049 286348 DEBUG nova.network.neutron [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] [instance: 39ff1bd9-6f6b-44c8-bbec-a1fd9d196359] Updating instance_info_cache with network_info: [{"id": "03ef8889-3216-43fb-8a52-4be17a956ce1", "address": "fa:16:3e:74:df:7c", "network": {"id": "befb7a72-17a9-4bcb-b561-84b8f626685a", "bridge": "br-int", "label": "private", "subnets": 
[{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.201", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.20", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.0.3"}}], "meta": {"injected": false, "tenant_id": "c785bf23f53946bc99867d8832a50266", "mtu": 1292, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap03ef8889-32", "ovs_interfaceid": "03ef8889-3216-43fb-8a52-4be17a956ce1", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m Dec 15 05:12:20 localhost nova_compute[286344]: 2025-12-15 10:12:20.070 286348 DEBUG oslo_concurrency.lockutils [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Releasing lock "refresh_cache-39ff1bd9-6f6b-44c8-bbec-a1fd9d196359" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m Dec 15 05:12:20 localhost nova_compute[286344]: 2025-12-15 10:12:20.070 286348 DEBUG nova.compute.manager [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] [instance: 39ff1bd9-6f6b-44c8-bbec-a1fd9d196359] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m Dec 15 05:12:21 localhost nova_compute[286344]: 2025-12-15 10:12:21.110 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:12:21 localhost nova_compute[286344]: 2025-12-15 10:12:21.596 286348 DEBUG 
ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:12:21 localhost ceph-mon[298913]: mon.np0005559462@0(leader).osd e255 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104 Dec 15 05:12:22 localhost nova_compute[286344]: 2025-12-15 10:12:22.065 286348 DEBUG oslo_service.periodic_task [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Dec 15 05:12:22 localhost nova_compute[286344]: 2025-12-15 10:12:22.270 286348 DEBUG oslo_service.periodic_task [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Dec 15 05:12:22 localhost nova_compute[286344]: 2025-12-15 10:12:22.270 286348 DEBUG nova.compute.manager [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m Dec 15 05:12:22 localhost systemd[1]: Started /usr/bin/podman healthcheck run 67ee72fcd99c896f79e8807764b1d0f04e7d45927d06e306c0986394e8ed42a0. Dec 15 05:12:22 localhost systemd[1]: Started /usr/bin/podman healthcheck run 730c50020a7d9e2da21d2462d40af20ac303e43f3491f692cbbd87f432ba3e09. Dec 15 05:12:22 localhost systemd[1]: Started /usr/bin/podman healthcheck run 9f0f481c937606298203797a71924b7614a5d9e3f4a8afd837cfa5b9602aedeb. Dec 15 05:12:22 localhost systemd[1]: Started /usr/bin/podman healthcheck run ac2eae30a2a7f971398af42ea67b3ed799d4b3341f7688ebcd73f7ca7f66c2a8. Dec 15 05:12:22 localhost systemd[1]: Started /usr/bin/podman healthcheck run b3d8483ade539500e228c2af9c0c3e647febb5ec7903610a83aea32631007d4a. 
Dec 15 05:12:22 localhost podman[336052]: 2025-12-15 10:12:22.77203759 +0000 UTC m=+0.090355528 container health_status 730c50020a7d9e2da21d2462d40af20ac303e43f3491f692cbbd87f432ba3e09 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, io.openshift.expose-services=, maintainer=Red Hat, Inc., name=ubi9-minimal, container_name=openstack_network_exporter, vendor=Red Hat, Inc., build-date=2025-08-20T13:12:41, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., url=https://catalog.redhat.com/en/search?searchType=containers, version=9.6, release=1755695350, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, config_id=openstack_network_exporter, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, managed_by=edpm_ansible, vcs-type=git, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, architecture=x86_64, com.redhat.component=ubi9-minimal-container, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'cb1e9478634a00c8fb5ad9cd7fcd38c7b2a1a85187dfaa618bc715c24c2b15fb'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.buildah.version=1.33.7) Dec 15 05:12:22 localhost podman[336064]: 2025-12-15 10:12:22.793700345 +0000 UTC m=+0.099974219 container health_status b3d8483ade539500e228c2af9c0c3e647febb5ec7903610a83aea32631007d4a (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, config_id=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 
'07f3af15d79276418f54a8dde2fc251064527c3571430e14af77aaa2c01f1193-cb1e9478634a00c8fb5ad9cd7fcd38c7b2a1a85187dfaa618bc715c24c2b15fb'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.build-date=20251202, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2) Dec 15 05:12:22 localhost podman[336064]: 2025-12-15 10:12:22.82833328 +0000 UTC m=+0.134607134 container exec_died b3d8483ade539500e228c2af9c0c3e647febb5ec7903610a83aea32631007d4a (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=ceilometer_agent_compute, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 
'07f3af15d79276418f54a8dde2fc251064527c3571430e14af77aaa2c01f1193-cb1e9478634a00c8fb5ad9cd7fcd38c7b2a1a85187dfaa618bc715c24c2b15fb'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0) Dec 15 05:12:22 localhost systemd[1]: tmp-crun.r7OGbZ.mount: Deactivated successfully. 
Dec 15 05:12:22 localhost podman[336059]: 2025-12-15 10:12:22.840383165 +0000 UTC m=+0.150807420 container health_status ac2eae30a2a7f971398af42ea67b3ed799d4b3341f7688ebcd73f7ca7f66c2a8 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '07f3af15d79276418f54a8dde2fc251064527c3571430e14af77aaa2c01f1193'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_id=ovn_controller, org.label-schema.vendor=CentOS) Dec 15 05:12:22 localhost systemd[1]: b3d8483ade539500e228c2af9c0c3e647febb5ec7903610a83aea32631007d4a.service: Deactivated successfully. 
Dec 15 05:12:22 localhost ceph-mon[298913]: mon.np0005559462@0(leader).osd e255 do_prune osdmap full prune enabled Dec 15 05:12:22 localhost ceph-mon[298913]: mon.np0005559462@0(leader).osd e256 e256: 6 total, 6 up, 6 in Dec 15 05:12:22 localhost ceph-mon[298913]: log_channel(cluster) log [DBG] : osdmap e256: 6 total, 6 up, 6 in Dec 15 05:12:22 localhost podman[336053]: 2025-12-15 10:12:22.922345466 +0000 UTC m=+0.237196441 container health_status 9f0f481c937606298203797a71924b7614a5d9e3f4a8afd837cfa5b9602aedeb (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=multipathd, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', 
'/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, tcib_managed=true, config_id=multipathd, org.label-schema.license=GPLv2) Dec 15 05:12:22 localhost podman[336053]: 2025-12-15 10:12:22.931369819 +0000 UTC m=+0.246220824 container exec_died 9f0f481c937606298203797a71924b7614a5d9e3f4a8afd837cfa5b9602aedeb (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, config_id=multipathd, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true, container_name=multipathd, managed_by=edpm_ansible, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, 
tcib_build_tag=c3923531bcda0b0811b2d5053f189beb) Dec 15 05:12:22 localhost systemd[1]: 9f0f481c937606298203797a71924b7614a5d9e3f4a8afd837cfa5b9602aedeb.service: Deactivated successfully. Dec 15 05:12:22 localhost podman[336052]: 2025-12-15 10:12:22.955621044 +0000 UTC m=+0.273938962 container exec_died 730c50020a7d9e2da21d2462d40af20ac303e43f3491f692cbbd87f432ba3e09 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, distribution-scope=public, com.redhat.component=ubi9-minimal-container, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vcs-type=git, version=9.6, io.buildah.version=1.33.7, config_id=openstack_network_exporter, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'cb1e9478634a00c8fb5ad9cd7fcd38c7b2a1a85187dfaa618bc715c24c2b15fb'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, container_name=openstack_network_exporter, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vendor=Red Hat, Inc., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. 
This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.expose-services=, io.openshift.tags=minimal rhel9, maintainer=Red Hat, Inc., name=ubi9-minimal, build-date=2025-08-20T13:12:41, managed_by=edpm_ansible, release=1755695350, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI) Dec 15 05:12:22 localhost systemd[1]: 730c50020a7d9e2da21d2462d40af20ac303e43f3491f692cbbd87f432ba3e09.service: Deactivated successfully. 
Dec 15 05:12:22 localhost podman[336059]: 2025-12-15 10:12:22.975373427 +0000 UTC m=+0.285797672 container exec_died ac2eae30a2a7f971398af42ea67b3ed799d4b3341f7688ebcd73f7ca7f66c2a8 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=ovn_controller, tcib_managed=true, org.label-schema.build-date=20251202, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '07f3af15d79276418f54a8dde2fc251064527c3571430e14af77aaa2c01f1193'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}) Dec 15 05:12:22 localhost podman[336051]: 2025-12-15 10:12:22.939261952 +0000 UTC m=+0.260178450 container health_status 67ee72fcd99c896f79e8807764b1d0f04e7d45927d06e306c0986394e8ed42a0 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', 
'--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible) Dec 15 05:12:22 localhost systemd[1]: ac2eae30a2a7f971398af42ea67b3ed799d4b3341f7688ebcd73f7ca7f66c2a8.service: Deactivated successfully. 
Dec 15 05:12:23 localhost podman[336051]: 2025-12-15 10:12:23.022696004 +0000 UTC m=+0.343612512 container exec_died 67ee72fcd99c896f79e8807764b1d0f04e7d45927d06e306c0986394e8ed42a0 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible) Dec 15 05:12:23 localhost systemd[1]: 67ee72fcd99c896f79e8807764b1d0f04e7d45927d06e306c0986394e8ed42a0.service: Deactivated successfully. 
Dec 15 05:12:23 localhost nova_compute[286344]: 2025-12-15 10:12:23.271 286348 DEBUG oslo_service.periodic_task [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Dec 15 05:12:23 localhost nova_compute[286344]: 2025-12-15 10:12:23.844 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:12:23 localhost ceph-mon[298913]: mon.np0005559462@0(leader).osd e256 do_prune osdmap full prune enabled Dec 15 05:12:23 localhost ceph-mon[298913]: mon.np0005559462@0(leader).osd e257 e257: 6 total, 6 up, 6 in Dec 15 05:12:23 localhost ceph-mon[298913]: log_channel(cluster) log [DBG] : osdmap e257: 6 total, 6 up, 6 in Dec 15 05:12:24 localhost nova_compute[286344]: 2025-12-15 10:12:24.485 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:12:24 localhost ceph-mon[298913]: mon.np0005559462@0(leader).osd e257 do_prune osdmap full prune enabled Dec 15 05:12:24 localhost ceph-mon[298913]: mon.np0005559462@0(leader).osd e258 e258: 6 total, 6 up, 6 in Dec 15 05:12:24 localhost ceph-mon[298913]: log_channel(cluster) log [DBG] : osdmap e258: 6 total, 6 up, 6 in Dec 15 05:12:25 localhost ceph-mon[298913]: mon.np0005559462@0(leader) e17 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) Dec 15 05:12:25 localhost ceph-mon[298913]: log_channel(audit) log [DBG] : from='client.25625 172.18.0.34:0/382777224' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch Dec 15 05:12:25 localhost ceph-mon[298913]: mon.np0005559462@0(leader) e17 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) Dec 15 05:12:25 localhost ceph-mon[298913]: log_channel(audit) log [DBG] : 
from='client.25625 172.18.0.34:0/382777224' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch Dec 15 05:12:25 localhost ceph-mon[298913]: mon.np0005559462@0(leader).osd e258 do_prune osdmap full prune enabled Dec 15 05:12:25 localhost ceph-mon[298913]: mon.np0005559462@0(leader).osd e259 e259: 6 total, 6 up, 6 in Dec 15 05:12:25 localhost ceph-mon[298913]: log_channel(cluster) log [DBG] : osdmap e259: 6 total, 6 up, 6 in Dec 15 05:12:26 localhost nova_compute[286344]: 2025-12-15 10:12:26.270 286348 DEBUG oslo_service.periodic_task [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Dec 15 05:12:26 localhost nova_compute[286344]: 2025-12-15 10:12:26.313 286348 DEBUG oslo_concurrency.lockutils [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Dec 15 05:12:26 localhost nova_compute[286344]: 2025-12-15 10:12:26.313 286348 DEBUG oslo_concurrency.lockutils [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Dec 15 05:12:26 localhost nova_compute[286344]: 2025-12-15 10:12:26.314 286348 DEBUG oslo_concurrency.lockutils [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Dec 15 05:12:26 localhost nova_compute[286344]: 2025-12-15 10:12:26.314 286348 DEBUG 
nova.compute.resource_tracker [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Auditing locally available compute resources for np0005559462.localdomain (node: np0005559462.localdomain) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m Dec 15 05:12:26 localhost nova_compute[286344]: 2025-12-15 10:12:26.315 286348 DEBUG oslo_concurrency.processutils [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m Dec 15 05:12:26 localhost nova_compute[286344]: 2025-12-15 10:12:26.600 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:12:26 localhost systemd[1]: Started /usr/bin/podman healthcheck run 4d46c406a3855fd1c322e5d86c048188ea5865daa07ba23aaebacc7879668923. Dec 15 05:12:26 localhost systemd[1]: tmp-crun.PY95sB.mount: Deactivated successfully. 
Dec 15 05:12:26 localhost podman[336176]: 2025-12-15 10:12:26.759189278 +0000 UTC m=+0.089878106 container health_status 4d46c406a3855fd1c322e5d86c048188ea5865daa07ba23aaebacc7879668923 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '07f3af15d79276418f54a8dde2fc251064527c3571430e14af77aaa2c01f1193-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, managed_by=edpm_ansible, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_metadata_agent, io.buildah.version=1.41.3) Dec 15 05:12:26 localhost 
podman[336176]: 2025-12-15 10:12:26.795550189 +0000 UTC m=+0.126239047 container exec_died 4d46c406a3855fd1c322e5d86c048188ea5865daa07ba23aaebacc7879668923 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '07f3af15d79276418f54a8dde2fc251064527c3571430e14af77aaa2c01f1193-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible) Dec 15 05:12:26 localhost systemd[1]: 
4d46c406a3855fd1c322e5d86c048188ea5865daa07ba23aaebacc7879668923.service: Deactivated successfully. Dec 15 05:12:26 localhost ceph-mon[298913]: mon.np0005559462@0(leader) e17 handle_command mon_command({"prefix": "df", "format": "json"} v 0) Dec 15 05:12:26 localhost ceph-mon[298913]: log_channel(audit) log [DBG] : from='client.? 172.18.0.106:0/2677735704' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch Dec 15 05:12:26 localhost nova_compute[286344]: 2025-12-15 10:12:26.832 286348 DEBUG oslo_concurrency.processutils [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.517s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m Dec 15 05:12:26 localhost ceph-mon[298913]: mon.np0005559462@0(leader).osd e259 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104 Dec 15 05:12:26 localhost ceph-mon[298913]: mon.np0005559462@0(leader).osd e259 do_prune osdmap full prune enabled Dec 15 05:12:26 localhost ceph-mon[298913]: mon.np0005559462@0(leader).osd e260 e260: 6 total, 6 up, 6 in Dec 15 05:12:26 localhost ceph-mon[298913]: log_channel(cluster) log [DBG] : osdmap e260: 6 total, 6 up, 6 in Dec 15 05:12:26 localhost nova_compute[286344]: 2025-12-15 10:12:26.908 286348 DEBUG nova.virt.libvirt.driver [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m Dec 15 05:12:26 localhost nova_compute[286344]: 2025-12-15 10:12:26.909 286348 DEBUG nova.virt.libvirt.driver [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m Dec 15 
05:12:27 localhost nova_compute[286344]: 2025-12-15 10:12:27.132 286348 WARNING nova.virt.libvirt.driver [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m Dec 15 05:12:27 localhost nova_compute[286344]: 2025-12-15 10:12:27.134 286348 DEBUG nova.compute.resource_tracker [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Hypervisor/Node resource view: name=np0005559462.localdomain free_ram=11144MB free_disk=41.700279235839844GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": 
"pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m Dec 15 05:12:27 localhost nova_compute[286344]: 2025-12-15 10:12:27.134 286348 DEBUG oslo_concurrency.lockutils [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Dec 15 05:12:27 localhost nova_compute[286344]: 2025-12-15 10:12:27.135 286348 DEBUG oslo_concurrency.lockutils [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Dec 15 05:12:27 localhost nova_compute[286344]: 2025-12-15 10:12:27.193 286348 DEBUG nova.compute.resource_tracker [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Instance 39ff1bd9-6f6b-44c8-bbec-a1fd9d196359 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 2, 'MEMORY_MB': 512, 'VCPU': 1}}. 
_remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m Dec 15 05:12:27 localhost nova_compute[286344]: 2025-12-15 10:12:27.193 286348 DEBUG nova.compute.resource_tracker [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m Dec 15 05:12:27 localhost nova_compute[286344]: 2025-12-15 10:12:27.194 286348 DEBUG nova.compute.resource_tracker [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Final resource view: name=np0005559462.localdomain phys_ram=15738MB used_ram=1024MB phys_disk=41GB used_disk=2GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m Dec 15 05:12:27 localhost nova_compute[286344]: 2025-12-15 10:12:27.243 286348 DEBUG oslo_concurrency.processutils [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m Dec 15 05:12:27 localhost ceph-mon[298913]: mon.np0005559462@0(leader) e17 handle_command mon_command({"prefix": "df", "format": "json"} v 0) Dec 15 05:12:27 localhost ceph-mon[298913]: log_channel(audit) log [DBG] : from='client.? 
172.18.0.106:0/2361374161' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch Dec 15 05:12:27 localhost nova_compute[286344]: 2025-12-15 10:12:27.702 286348 DEBUG oslo_concurrency.processutils [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.460s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m Dec 15 05:12:27 localhost nova_compute[286344]: 2025-12-15 10:12:27.707 286348 DEBUG nova.compute.provider_tree [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Inventory has not changed in ProviderTree for provider: 26c8956b-6742-4951-b566-971b9bbe323b update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m Dec 15 05:12:27 localhost nova_compute[286344]: 2025-12-15 10:12:27.720 286348 DEBUG nova.scheduler.client.report [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Inventory has not changed for provider 26c8956b-6742-4951-b566-971b9bbe323b based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m Dec 15 05:12:27 localhost nova_compute[286344]: 2025-12-15 10:12:27.721 286348 DEBUG nova.compute.resource_tracker [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Compute_service record updated for np0005559462.localdomain:np0005559462.localdomain _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m Dec 15 05:12:27 localhost nova_compute[286344]: 2025-12-15 10:12:27.721 286348 DEBUG 
oslo_concurrency.lockutils [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.587s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Dec 15 05:12:28 localhost ovn_controller[154603]: 2025-12-15T10:12:28Z|00519|binding|INFO|Releasing lport b35254ad-12eb-47bb-92be-44fefe0694f0 from this chassis (sb_readonly=0) Dec 15 05:12:28 localhost nova_compute[286344]: 2025-12-15 10:12:28.573 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:12:29 localhost ceph-mon[298913]: mon.np0005559462@0(leader) e17 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) Dec 15 05:12:29 localhost ceph-mon[298913]: log_channel(audit) log [DBG] : from='client.25625 172.18.0.34:0/382777224' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch Dec 15 05:12:29 localhost nova_compute[286344]: 2025-12-15 10:12:29.517 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:12:29 localhost nova_compute[286344]: 2025-12-15 10:12:29.722 286348 DEBUG oslo_service.periodic_task [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Dec 15 05:12:30 localhost neutron_sriov_agent[260044]: 2025-12-15 10:12:30.623 2 INFO neutron.agent.securitygroups_rpc [req-14c1c54a-7480-4b83-9653-5c4accadadc0 req-905f544d-0006-46ca-b404-074c30c3c392 055a2ead711042929f6186ce0df9286e 229ca1deba244d0780350d1a77507ad1 - - default default] Security group member updated ['49be7990-8193-41d2-bdb8-cde8aeb193db']#033[00m Dec 15 05:12:30 localhost systemd[1]: 
tmp-crun.WAzhex.mount: Deactivated successfully. Dec 15 05:12:30 localhost dnsmasq[335364]: read /var/lib/neutron/dhcp/b35c3a3c-c123-4757-af17-fe86d0729b58/addn_hosts - 1 addresses Dec 15 05:12:30 localhost dnsmasq-dhcp[335364]: read /var/lib/neutron/dhcp/b35c3a3c-c123-4757-af17-fe86d0729b58/host Dec 15 05:12:30 localhost podman[336234]: 2025-12-15 10:12:30.946416232 +0000 UTC m=+0.067761389 container kill a28b5bd276c88258da127610008fefed786d6a74d721e8f5acfc18f6893594cf (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-b35c3a3c-c123-4757-af17-fe86d0729b58, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0) Dec 15 05:12:30 localhost dnsmasq-dhcp[335364]: read /var/lib/neutron/dhcp/b35c3a3c-c123-4757-af17-fe86d0729b58/opts Dec 15 05:12:31 localhost ceph-mon[298913]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #70. Immutable memtables: 0. 
Dec 15 05:12:31 localhost ceph-mon[298913]: rocksdb: (Original Log Time 2025/12/15-10:12:31.064123) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0 Dec 15 05:12:31 localhost ceph-mon[298913]: rocksdb: [db/flush_job.cc:856] [default] [JOB 41] Flushing memtable with next log file: 70 Dec 15 05:12:31 localhost ceph-mon[298913]: rocksdb: EVENT_LOG_v1 {"time_micros": 1765793551064183, "job": 41, "event": "flush_started", "num_memtables": 1, "num_entries": 1891, "num_deletes": 263, "total_data_size": 2702973, "memory_usage": 2742832, "flush_reason": "Manual Compaction"} Dec 15 05:12:31 localhost ceph-mon[298913]: rocksdb: [db/flush_job.cc:885] [default] [JOB 41] Level-0 flush table #71: started Dec 15 05:12:31 localhost ceph-mon[298913]: rocksdb: EVENT_LOG_v1 {"time_micros": 1765793551084417, "cf_name": "default", "job": 41, "event": "table_file_creation", "file_number": 71, "file_size": 2655088, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 36583, "largest_seqno": 38473, "table_properties": {"data_size": 2646460, "index_size": 5133, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 2437, "raw_key_size": 21174, "raw_average_key_size": 22, "raw_value_size": 2628198, "raw_average_value_size": 2772, "num_data_blocks": 220, "num_entries": 948, "num_filter_entries": 948, "num_deletions": 263, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; 
max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1765793461, "oldest_key_time": 1765793461, "file_creation_time": 1765793551, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "603b24af-e2be-4214-bc56-9e652eb4af3d", "db_session_id": "0OJRM9SCUA16EXV0VQZ2", "orig_file_number": 71, "seqno_to_time_mapping": "N/A"}} Dec 15 05:12:31 localhost ceph-mon[298913]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 41] Flush lasted 20360 microseconds, and 5969 cpu microseconds. Dec 15 05:12:31 localhost ceph-mon[298913]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed. Dec 15 05:12:31 localhost ceph-mon[298913]: rocksdb: (Original Log Time 2025/12/15-10:12:31.084477) [db/flush_job.cc:967] [default] [JOB 41] Level-0 flush table #71: 2655088 bytes OK Dec 15 05:12:31 localhost ceph-mon[298913]: rocksdb: (Original Log Time 2025/12/15-10:12:31.084504) [db/memtable_list.cc:519] [default] Level-0 commit table #71 started Dec 15 05:12:31 localhost ceph-mon[298913]: rocksdb: (Original Log Time 2025/12/15-10:12:31.086421) [db/memtable_list.cc:722] [default] Level-0 commit table #71: memtable #1 done Dec 15 05:12:31 localhost ceph-mon[298913]: rocksdb: (Original Log Time 2025/12/15-10:12:31.086441) EVENT_LOG_v1 {"time_micros": 1765793551086435, "job": 41, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0} Dec 15 05:12:31 localhost ceph-mon[298913]: rocksdb: (Original Log Time 2025/12/15-10:12:31.086465) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25 Dec 15 05:12:31 localhost ceph-mon[298913]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 41] Try to delete WAL files size 2694249, prev total WAL file 
size 2694249, number of live WAL files 2. Dec 15 05:12:31 localhost ceph-mon[298913]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005559462/store.db/000067.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000 Dec 15 05:12:31 localhost ceph-mon[298913]: rocksdb: (Original Log Time 2025/12/15-10:12:31.087305) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '7061786F73003132383031' seq:72057594037927935, type:22 .. '7061786F73003133303533' seq:0, type:0; will stop at (end) Dec 15 05:12:31 localhost ceph-mon[298913]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 42] Compacting 1@0 + 1@6 files to L6, score -1.00 Dec 15 05:12:31 localhost ceph-mon[298913]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 41 Base level 0, inputs: [71(2592KB)], [69(16MB)] Dec 15 05:12:31 localhost ceph-mon[298913]: rocksdb: EVENT_LOG_v1 {"time_micros": 1765793551087369, "job": 42, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [71], "files_L6": [69], "score": -1, "input_data_size": 20309067, "oldest_snapshot_seqno": -1} Dec 15 05:12:31 localhost ceph-mon[298913]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 42] Generated table #72: 14022 keys, 18735210 bytes, temperature: kUnknown Dec 15 05:12:31 localhost ceph-mon[298913]: rocksdb: EVENT_LOG_v1 {"time_micros": 1765793551196277, "cf_name": "default", "job": 42, "event": "table_file_creation", "file_number": 72, "file_size": 18735210, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 18654153, "index_size": 44924, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 35077, "raw_key_size": 378473, "raw_average_key_size": 26, "raw_value_size": 
18414342, "raw_average_value_size": 1313, "num_data_blocks": 1666, "num_entries": 14022, "num_filter_entries": 14022, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1765792320, "oldest_key_time": 0, "file_creation_time": 1765793551, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "603b24af-e2be-4214-bc56-9e652eb4af3d", "db_session_id": "0OJRM9SCUA16EXV0VQZ2", "orig_file_number": 72, "seqno_to_time_mapping": "N/A"}} Dec 15 05:12:31 localhost ceph-mon[298913]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed. 
Dec 15 05:12:31 localhost ceph-mon[298913]: rocksdb: (Original Log Time 2025/12/15-10:12:31.196570) [db/compaction/compaction_job.cc:1663] [default] [JOB 42] Compacted 1@0 + 1@6 files to L6 => 18735210 bytes Dec 15 05:12:31 localhost ceph-mon[298913]: rocksdb: (Original Log Time 2025/12/15-10:12:31.198541) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 186.4 rd, 171.9 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(2.5, 16.8 +0.0 blob) out(17.9 +0.0 blob), read-write-amplify(14.7) write-amplify(7.1) OK, records in: 14567, records dropped: 545 output_compression: NoCompression Dec 15 05:12:31 localhost ceph-mon[298913]: rocksdb: (Original Log Time 2025/12/15-10:12:31.198747) EVENT_LOG_v1 {"time_micros": 1765793551198557, "job": 42, "event": "compaction_finished", "compaction_time_micros": 108981, "compaction_time_cpu_micros": 50655, "output_level": 6, "num_output_files": 1, "total_output_size": 18735210, "num_input_records": 14567, "num_output_records": 14022, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]} Dec 15 05:12:31 localhost ceph-mon[298913]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005559462/store.db/000071.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000 Dec 15 05:12:31 localhost ceph-mon[298913]: rocksdb: EVENT_LOG_v1 {"time_micros": 1765793551199237, "job": 42, "event": "table_file_deletion", "file_number": 71} Dec 15 05:12:31 localhost ceph-mon[298913]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005559462/store.db/000069.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000 Dec 15 05:12:31 localhost ceph-mon[298913]: rocksdb: EVENT_LOG_v1 {"time_micros": 1765793551201889, 
"job": 42, "event": "table_file_deletion", "file_number": 69} Dec 15 05:12:31 localhost ceph-mon[298913]: rocksdb: (Original Log Time 2025/12/15-10:12:31.087190) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting Dec 15 05:12:31 localhost ceph-mon[298913]: rocksdb: (Original Log Time 2025/12/15-10:12:31.202048) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting Dec 15 05:12:31 localhost ceph-mon[298913]: rocksdb: (Original Log Time 2025/12/15-10:12:31.202058) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting Dec 15 05:12:31 localhost ceph-mon[298913]: rocksdb: (Original Log Time 2025/12/15-10:12:31.202062) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting Dec 15 05:12:31 localhost ceph-mon[298913]: rocksdb: (Original Log Time 2025/12/15-10:12:31.202066) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting Dec 15 05:12:31 localhost ceph-mon[298913]: rocksdb: (Original Log Time 2025/12/15-10:12:31.202070) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting Dec 15 05:12:31 localhost nova_compute[286344]: 2025-12-15 10:12:31.602 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:12:31 localhost ceph-mon[298913]: mon.np0005559462@0(leader).osd e260 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104 Dec 15 05:12:31 localhost podman[243449]: time="2025-12-15T10:12:31Z" level=info msg="List containers: received `last` parameter - overwriting `limit`" Dec 15 05:12:31 localhost podman[243449]: @ - - [15/Dec/2025:10:12:31 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 158464 "" "Go-http-client/1.1" Dec 15 05:12:31 localhost podman[243449]: @ - - 
[15/Dec/2025:10:12:31 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 19756 "" "Go-http-client/1.1" Dec 15 05:12:32 localhost ceph-mon[298913]: mon.np0005559462@0(leader) e17 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) Dec 15 05:12:32 localhost ceph-mon[298913]: log_channel(audit) log [DBG] : from='client.25625 172.18.0.34:0/382777224' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch Dec 15 05:12:33 localhost ovn_controller[154603]: 2025-12-15T10:12:33Z|00520|binding|INFO|Releasing lport b35254ad-12eb-47bb-92be-44fefe0694f0 from this chassis (sb_readonly=0) Dec 15 05:12:33 localhost nova_compute[286344]: 2025-12-15 10:12:33.492 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:12:33 localhost systemd[1]: Started /usr/bin/podman healthcheck run a4e94bd7aaee6c174c601060fb5f34559019ccea93fc397263ee3735731cfc1e. 
Dec 15 05:12:33 localhost podman[336254]: 2025-12-15 10:12:33.732225615 +0000 UTC m=+0.067867962 container health_status a4e94bd7aaee6c174c601060fb5f34559019ccea93fc397263ee3735731cfc1e (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, managed_by=edpm_ansible, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi ) Dec 15 05:12:33 localhost podman[336254]: 2025-12-15 10:12:33.740202831 +0000 UTC m=+0.075845178 container exec_died a4e94bd7aaee6c174c601060fb5f34559019ccea93fc397263ee3735731cfc1e (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', 
'/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible) Dec 15 05:12:33 localhost systemd[1]: a4e94bd7aaee6c174c601060fb5f34559019ccea93fc397263ee3735731cfc1e.service: Deactivated successfully. Dec 15 05:12:34 localhost ceph-mon[298913]: mon.np0005559462@0(leader).osd e260 do_prune osdmap full prune enabled Dec 15 05:12:34 localhost ceph-mon[298913]: mon.np0005559462@0(leader).osd e261 e261: 6 total, 6 up, 6 in Dec 15 05:12:34 localhost ceph-mon[298913]: log_channel(cluster) log [DBG] : osdmap e261: 6 total, 6 up, 6 in Dec 15 05:12:34 localhost nova_compute[286344]: 2025-12-15 10:12:34.565 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:12:34 localhost openstack_network_exporter[246484]: ERROR 10:12:34 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server Dec 15 05:12:34 localhost openstack_network_exporter[246484]: ERROR 10:12:34 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd Dec 15 05:12:34 localhost openstack_network_exporter[246484]: ERROR 10:12:34 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd Dec 15 05:12:34 localhost openstack_network_exporter[246484]: ERROR 10:12:34 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath Dec 15 05:12:34 localhost openstack_network_exporter[246484]: Dec 15 05:12:34 localhost openstack_network_exporter[246484]: ERROR 10:12:34 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath Dec 15 05:12:34 localhost openstack_network_exporter[246484]: Dec 15 05:12:35 localhost ceph-mon[298913]: mon.np0005559462@0(leader).osd e261 do_prune osdmap full prune enabled Dec 15 05:12:35 localhost ceph-mon[298913]: 
mon.np0005559462@0(leader).osd e262 e262: 6 total, 6 up, 6 in Dec 15 05:12:35 localhost ceph-mon[298913]: log_channel(cluster) log [DBG] : osdmap e262: 6 total, 6 up, 6 in Dec 15 05:12:36 localhost ceph-mon[298913]: mon.np0005559462@0(leader).osd e262 do_prune osdmap full prune enabled Dec 15 05:12:36 localhost ceph-mon[298913]: mon.np0005559462@0(leader).osd e263 e263: 6 total, 6 up, 6 in Dec 15 05:12:36 localhost ceph-mon[298913]: log_channel(cluster) log [DBG] : osdmap e263: 6 total, 6 up, 6 in Dec 15 05:12:36 localhost nova_compute[286344]: 2025-12-15 10:12:36.604 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:12:36 localhost ceph-mon[298913]: mon.np0005559462@0(leader).osd e263 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104 Dec 15 05:12:36 localhost ceph-mon[298913]: mon.np0005559462@0(leader).osd e263 do_prune osdmap full prune enabled Dec 15 05:12:36 localhost ceph-mon[298913]: mon.np0005559462@0(leader).osd e264 e264: 6 total, 6 up, 6 in Dec 15 05:12:36 localhost ceph-mon[298913]: log_channel(cluster) log [DBG] : osdmap e264: 6 total, 6 up, 6 in Dec 15 05:12:37 localhost dnsmasq[335364]: read /var/lib/neutron/dhcp/b35c3a3c-c123-4757-af17-fe86d0729b58/addn_hosts - 0 addresses Dec 15 05:12:37 localhost podman[336294]: 2025-12-15 10:12:37.809485083 +0000 UTC m=+0.062632581 container kill a28b5bd276c88258da127610008fefed786d6a74d721e8f5acfc18f6893594cf (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-b35c3a3c-c123-4757-af17-fe86d0729b58, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, 
tcib_build_tag=c3923531bcda0b0811b2d5053f189beb) Dec 15 05:12:37 localhost dnsmasq-dhcp[335364]: read /var/lib/neutron/dhcp/b35c3a3c-c123-4757-af17-fe86d0729b58/host Dec 15 05:12:37 localhost dnsmasq-dhcp[335364]: read /var/lib/neutron/dhcp/b35c3a3c-c123-4757-af17-fe86d0729b58/opts Dec 15 05:12:37 localhost ceph-mon[298913]: mon.np0005559462@0(leader).osd e264 do_prune osdmap full prune enabled Dec 15 05:12:37 localhost ceph-mon[298913]: mon.np0005559462@0(leader).osd e265 e265: 6 total, 6 up, 6 in Dec 15 05:12:37 localhost ceph-mon[298913]: log_channel(cluster) log [DBG] : osdmap e265: 6 total, 6 up, 6 in Dec 15 05:12:38 localhost ovn_controller[154603]: 2025-12-15T10:12:38Z|00521|binding|INFO|Releasing lport d0a827a2-ba8d-4ff8-b4de-16db92caf4c1 from this chassis (sb_readonly=0) Dec 15 05:12:38 localhost kernel: device tapd0a827a2-ba left promiscuous mode Dec 15 05:12:38 localhost ovn_controller[154603]: 2025-12-15T10:12:38Z|00522|binding|INFO|Setting lport d0a827a2-ba8d-4ff8-b4de-16db92caf4c1 down in Southbound Dec 15 05:12:38 localhost nova_compute[286344]: 2025-12-15 10:12:38.092 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:12:38 localhost nova_compute[286344]: 2025-12-15 10:12:38.113 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:12:38 localhost ovn_metadata_agent[160585]: 2025-12-15 10:12:38.118 160590 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005559462.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.3/28', 
'neutron:device_id': 'dhcpce287b4b-a043-5ed9-8e08-4fd555d768a2-b35c3a3c-c123-4757-af17-fe86d0729b58', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-b35c3a3c-c123-4757-af17-fe86d0729b58', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '229ca1deba244d0780350d1a77507ad1', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'np0005559462.localdomain'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=7289809c-b4ba-4daa-ad97-4de4c8e0e4c3, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=d0a827a2-ba8d-4ff8-b4de-16db92caf4c1) old=Port_Binding(up=[True], chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Dec 15 05:12:38 localhost ovn_metadata_agent[160585]: 2025-12-15 10:12:38.120 160590 INFO neutron.agent.ovn.metadata.agent [-] Port d0a827a2-ba8d-4ff8-b4de-16db92caf4c1 in datapath b35c3a3c-c123-4757-af17-fe86d0729b58 unbound from our chassis#033[00m Dec 15 05:12:38 localhost ovn_metadata_agent[160585]: 2025-12-15 10:12:38.122 160590 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network b35c3a3c-c123-4757-af17-fe86d0729b58, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m Dec 15 05:12:38 localhost ovn_metadata_agent[160585]: 2025-12-15 10:12:38.123 160858 DEBUG oslo.privsep.daemon [-] privsep: reply[c4ed0210-5665-438f-af05-fe5f0014307c]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m Dec 15 05:12:38 localhost ceph-mon[298913]: mon.np0005559462@0(leader) e17 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) Dec 15 05:12:38 localhost ceph-mon[298913]: 
log_channel(audit) log [DBG] : from='client.25625 172.18.0.34:0/382777224' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch Dec 15 05:12:39 localhost nova_compute[286344]: 2025-12-15 10:12:39.569 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:12:39 localhost ovn_controller[154603]: 2025-12-15T10:12:39Z|00523|binding|INFO|Releasing lport b35254ad-12eb-47bb-92be-44fefe0694f0 from this chassis (sb_readonly=0) Dec 15 05:12:39 localhost nova_compute[286344]: 2025-12-15 10:12:39.732 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:12:40 localhost ceph-mgr[292421]: client.0 ms_handle_reset on v2:172.18.0.108:6810/2408732030 Dec 15 05:12:40 localhost dnsmasq[335364]: exiting on receipt of SIGTERM Dec 15 05:12:40 localhost podman[336334]: 2025-12-15 10:12:40.824672054 +0000 UTC m=+0.057116842 container kill a28b5bd276c88258da127610008fefed786d6a74d721e8f5acfc18f6893594cf (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-b35c3a3c-c123-4757-af17-fe86d0729b58, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.schema-version=1.0) Dec 15 05:12:40 localhost systemd[1]: libpod-a28b5bd276c88258da127610008fefed786d6a74d721e8f5acfc18f6893594cf.scope: Deactivated successfully. 
Dec 15 05:12:40 localhost podman[336349]: 2025-12-15 10:12:40.896732338 +0000 UTC m=+0.055755876 container died a28b5bd276c88258da127610008fefed786d6a74d721e8f5acfc18f6893594cf (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-b35c3a3c-c123-4757-af17-fe86d0729b58, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202, tcib_managed=true) Dec 15 05:12:40 localhost podman[336349]: 2025-12-15 10:12:40.932697408 +0000 UTC m=+0.091720906 container cleanup a28b5bd276c88258da127610008fefed786d6a74d721e8f5acfc18f6893594cf (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-b35c3a3c-c123-4757-af17-fe86d0729b58, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202) Dec 15 05:12:40 localhost systemd[1]: libpod-conmon-a28b5bd276c88258da127610008fefed786d6a74d721e8f5acfc18f6893594cf.scope: Deactivated successfully. 
Dec 15 05:12:40 localhost podman[336351]: 2025-12-15 10:12:40.978026451 +0000 UTC m=+0.131580381 container remove a28b5bd276c88258da127610008fefed786d6a74d721e8f5acfc18f6893594cf (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-b35c3a3c-c123-4757-af17-fe86d0729b58, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS) Dec 15 05:12:41 localhost neutron_dhcp_agent[267542]: 2025-12-15 10:12:41.088 267546 INFO neutron.agent.dhcp.agent [None req-b910ec39-97eb-474b-9e46-0d26f45d0a79 - - - - - -] Network not present, action: clean_devices, action_kwargs: {}#033[00m Dec 15 05:12:41 localhost neutron_dhcp_agent[267542]: 2025-12-15 10:12:41.090 267546 INFO neutron.agent.dhcp.agent [None req-b910ec39-97eb-474b-9e46-0d26f45d0a79 - - - - - -] Network not present, action: clean_devices, action_kwargs: {}#033[00m Dec 15 05:12:41 localhost nova_compute[286344]: 2025-12-15 10:12:41.607 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:12:41 localhost systemd[1]: var-lib-containers-storage-overlay-020a0d1b7a1fb883a3a27d891797524553ec3881d89717b22ab1b04e5d7c5df8-merged.mount: Deactivated successfully. Dec 15 05:12:41 localhost systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-a28b5bd276c88258da127610008fefed786d6a74d721e8f5acfc18f6893594cf-userdata-shm.mount: Deactivated successfully. Dec 15 05:12:41 localhost systemd[1]: run-netns-qdhcp\x2db35c3a3c\x2dc123\x2d4757\x2daf17\x2dfe86d0729b58.mount: Deactivated successfully. 
Dec 15 05:12:41 localhost ceph-mon[298913]: mon.np0005559462@0(leader).osd e265 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104 Dec 15 05:12:44 localhost nova_compute[286344]: 2025-12-15 10:12:44.571 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:12:45 localhost ceph-mon[298913]: mon.np0005559462@0(leader).osd e265 do_prune osdmap full prune enabled Dec 15 05:12:45 localhost ceph-mon[298913]: mon.np0005559462@0(leader).osd e266 e266: 6 total, 6 up, 6 in Dec 15 05:12:45 localhost ceph-mon[298913]: log_channel(cluster) log [DBG] : osdmap e266: 6 total, 6 up, 6 in Dec 15 05:12:45 localhost ceph-mon[298913]: mon.np0005559462@0(leader) e17 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) Dec 15 05:12:45 localhost ceph-mon[298913]: log_channel(audit) log [DBG] : from='client.25625 172.18.0.34:0/382777224' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch Dec 15 05:12:46 localhost ceph-mon[298913]: mon.np0005559462@0(leader).osd e266 do_prune osdmap full prune enabled Dec 15 05:12:46 localhost ceph-mon[298913]: mon.np0005559462@0(leader).osd e267 e267: 6 total, 6 up, 6 in Dec 15 05:12:46 localhost ceph-mon[298913]: log_channel(cluster) log [DBG] : osdmap e267: 6 total, 6 up, 6 in Dec 15 05:12:46 localhost nova_compute[286344]: 2025-12-15 10:12:46.609 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:12:46 localhost ceph-mon[298913]: mon.np0005559462@0(leader).osd e267 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104 Dec 15 05:12:46 localhost ceph-mon[298913]: mon.np0005559462@0(leader).osd e267 do_prune osdmap full prune enabled Dec 15 05:12:46 localhost ceph-mon[298913]: mon.np0005559462@0(leader).osd e268 
e268: 6 total, 6 up, 6 in Dec 15 05:12:46 localhost ceph-mon[298913]: log_channel(cluster) log [DBG] : osdmap e268: 6 total, 6 up, 6 in Dec 15 05:12:47 localhost ceph-mon[298913]: mon.np0005559462@0(leader).osd e268 do_prune osdmap full prune enabled Dec 15 05:12:47 localhost ceph-mon[298913]: mon.np0005559462@0(leader).osd e269 e269: 6 total, 6 up, 6 in Dec 15 05:12:47 localhost ceph-mon[298913]: log_channel(cluster) log [DBG] : osdmap e269: 6 total, 6 up, 6 in Dec 15 05:12:47 localhost ceph-osd[31375]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [L] New memtable created with log file: #45. Immutable memtables: 2. Dec 15 05:12:49 localhost ceph-mon[298913]: mon.np0005559462@0(leader) e17 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) Dec 15 05:12:49 localhost ceph-mon[298913]: log_channel(audit) log [DBG] : from='client.25625 172.18.0.34:0/382777224' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch Dec 15 05:12:49 localhost nova_compute[286344]: 2025-12-15 10:12:49.574 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:12:49 localhost ceph-mon[298913]: mon.np0005559462@0(leader).osd e269 do_prune osdmap full prune enabled Dec 15 05:12:49 localhost ceph-mon[298913]: mon.np0005559462@0(leader).osd e270 e270: 6 total, 6 up, 6 in Dec 15 05:12:49 localhost ceph-mon[298913]: log_channel(cluster) log [DBG] : osdmap e270: 6 total, 6 up, 6 in Dec 15 05:12:51 localhost ovn_metadata_agent[160585]: 2025-12-15 10:12:51.487 160590 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Dec 15 05:12:51 localhost ovn_metadata_agent[160585]: 2025-12-15 10:12:51.488 160590 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired 
by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Dec 15 05:12:51 localhost ovn_metadata_agent[160585]: 2025-12-15 10:12:51.489 160590 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Dec 15 05:12:51 localhost nova_compute[286344]: 2025-12-15 10:12:51.611 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:12:51 localhost ceph-mon[298913]: mon.np0005559462@0(leader).osd e270 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104 Dec 15 05:12:53 localhost systemd[1]: Started /usr/bin/podman healthcheck run 67ee72fcd99c896f79e8807764b1d0f04e7d45927d06e306c0986394e8ed42a0. Dec 15 05:12:53 localhost systemd[1]: Started /usr/bin/podman healthcheck run 730c50020a7d9e2da21d2462d40af20ac303e43f3491f692cbbd87f432ba3e09. Dec 15 05:12:53 localhost systemd[1]: Started /usr/bin/podman healthcheck run 9f0f481c937606298203797a71924b7614a5d9e3f4a8afd837cfa5b9602aedeb. Dec 15 05:12:53 localhost systemd[1]: Started /usr/bin/podman healthcheck run ac2eae30a2a7f971398af42ea67b3ed799d4b3341f7688ebcd73f7ca7f66c2a8. Dec 15 05:12:53 localhost systemd[1]: Started /usr/bin/podman healthcheck run b3d8483ade539500e228c2af9c0c3e647febb5ec7903610a83aea32631007d4a. 
Dec 15 05:12:53 localhost podman[336390]: 2025-12-15 10:12:53.446221317 +0000 UTC m=+0.098662043 container health_status b3d8483ade539500e228c2af9c0c3e647febb5ec7903610a83aea32631007d4a (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '07f3af15d79276418f54a8dde2fc251064527c3571430e14af77aaa2c01f1193-cb1e9478634a00c8fb5ad9cd7fcd38c7b2a1a85187dfaa618bc715c24c2b15fb'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.build-date=20251202, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=ceilometer_agent_compute, 
tcib_build_tag=c3923531bcda0b0811b2d5053f189beb) Dec 15 05:12:53 localhost podman[336390]: 2025-12-15 10:12:53.458591031 +0000 UTC m=+0.111031747 container exec_died b3d8483ade539500e228c2af9c0c3e647febb5ec7903610a83aea32631007d4a (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, tcib_managed=true, config_id=ceilometer_agent_compute, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '07f3af15d79276418f54a8dde2fc251064527c3571430e14af77aaa2c01f1193-cb1e9478634a00c8fb5ad9cd7fcd38c7b2a1a85187dfaa618bc715c24c2b15fb'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, 
managed_by=edpm_ansible, io.buildah.version=1.41.3) Dec 15 05:12:53 localhost systemd[1]: b3d8483ade539500e228c2af9c0c3e647febb5ec7903610a83aea32631007d4a.service: Deactivated successfully. Dec 15 05:12:53 localhost podman[336376]: 2025-12-15 10:12:53.421595922 +0000 UTC m=+0.098977101 container health_status 67ee72fcd99c896f79e8807764b1d0f04e7d45927d06e306c0986394e8ed42a0 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible) Dec 15 05:12:53 localhost podman[336384]: 2025-12-15 10:12:53.528434185 +0000 UTC m=+0.193055259 container health_status ac2eae30a2a7f971398af42ea67b3ed799d4b3341f7688ebcd73f7ca7f66c2a8 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, 
name=ovn_controller, health_status=healthy, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '07f3af15d79276418f54a8dde2fc251064527c3571430e14af77aaa2c01f1193'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_id=ovn_controller, container_name=ovn_controller, io.buildah.version=1.41.3, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS) Dec 15 05:12:53 localhost podman[336378]: 2025-12-15 10:12:53.569806881 +0000 UTC m=+0.238722442 container health_status 9f0f481c937606298203797a71924b7614a5d9e3f4a8afd837cfa5b9602aedeb (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, container_name=multipathd, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': 
'/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_id=multipathd) Dec 15 05:12:53 localhost podman[336378]: 2025-12-15 10:12:53.581449515 +0000 UTC m=+0.250365036 container exec_died 9f0f481c937606298203797a71924b7614a5d9e3f4a8afd837cfa5b9602aedeb (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, container_name=multipathd, managed_by=edpm_ansible, config_id=multipathd, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', 
'/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true) Dec 15 05:12:53 localhost podman[336384]: 2025-12-15 10:12:53.591497737 +0000 UTC m=+0.256118871 container exec_died ac2eae30a2a7f971398af42ea67b3ed799d4b3341f7688ebcd73f7ca7f66c2a8 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=ovn_controller, container_name=ovn_controller, managed_by=edpm_ansible, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '07f3af15d79276418f54a8dde2fc251064527c3571430e14af77aaa2c01f1193'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 
'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team) Dec 15 05:12:53 localhost systemd[1]: 9f0f481c937606298203797a71924b7614a5d9e3f4a8afd837cfa5b9602aedeb.service: Deactivated successfully. Dec 15 05:12:53 localhost systemd[1]: ac2eae30a2a7f971398af42ea67b3ed799d4b3341f7688ebcd73f7ca7f66c2a8.service: Deactivated successfully. Dec 15 05:12:53 localhost podman[336377]: 2025-12-15 10:12:53.641146736 +0000 UTC m=+0.313966792 container health_status 730c50020a7d9e2da21d2462d40af20ac303e43f3491f692cbbd87f432ba3e09 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, io.openshift.tags=minimal rhel9, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, release=1755695350, com.redhat.component=ubi9-minimal-container, distribution-scope=public, maintainer=Red Hat, Inc., name=ubi9-minimal, vendor=Red Hat, Inc., build-date=2025-08-20T13:12:41, io.buildah.version=1.33.7, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, managed_by=edpm_ansible, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'cb1e9478634a00c8fb5ad9cd7fcd38c7b2a1a85187dfaa618bc715c24c2b15fb'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vcs-type=git, container_name=openstack_network_exporter, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, version=9.6, config_id=openstack_network_exporter, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9.) 
Dec 15 05:12:53 localhost podman[336376]: 2025-12-15 10:12:53.653945621 +0000 UTC m=+0.331326760 container exec_died 67ee72fcd99c896f79e8807764b1d0f04e7d45927d06e306c0986394e8ed42a0 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible) Dec 15 05:12:53 localhost systemd[1]: 67ee72fcd99c896f79e8807764b1d0f04e7d45927d06e306c0986394e8ed42a0.service: Deactivated successfully. 
Dec 15 05:12:53 localhost podman[336377]: 2025-12-15 10:12:53.680471477 +0000 UTC m=+0.353291583 container exec_died 730c50020a7d9e2da21d2462d40af20ac303e43f3491f692cbbd87f432ba3e09 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=Red Hat, Inc., container_name=openstack_network_exporter, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=ubi9-minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., build-date=2025-08-20T13:12:41, url=https://catalog.redhat.com/en/search?searchType=containers, release=1755695350, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'cb1e9478634a00c8fb5ad9cd7fcd38c7b2a1a85187dfaa618bc715c24c2b15fb'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', 
'/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., version=9.6, config_id=openstack_network_exporter, vendor=Red Hat, Inc., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, com.redhat.component=ubi9-minimal-container, distribution-scope=public, io.buildah.version=1.33.7, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, managed_by=edpm_ansible, io.openshift.tags=minimal rhel9, vcs-type=git, io.openshift.expose-services=) Dec 15 05:12:53 localhost systemd[1]: 730c50020a7d9e2da21d2462d40af20ac303e43f3491f692cbbd87f432ba3e09.service: Deactivated successfully. Dec 15 05:12:54 localhost nova_compute[286344]: 2025-12-15 10:12:54.576 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:12:56 localhost nova_compute[286344]: 2025-12-15 10:12:56.614 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:12:56 localhost ceph-mon[298913]: mon.np0005559462@0(leader) e17 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) Dec 15 05:12:56 localhost ceph-mon[298913]: log_channel(audit) log [DBG] : from='client.25625 172.18.0.34:0/382777224' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch Dec 15 05:12:56 localhost ceph-mon[298913]: mon.np0005559462@0(leader).osd e270 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104 Dec 15 05:12:56 localhost ceph-mon[298913]: mon.np0005559462@0(leader).osd e270 do_prune osdmap full prune enabled Dec 15 05:12:56 
localhost ceph-mon[298913]: mon.np0005559462@0(leader).osd e271 e271: 6 total, 6 up, 6 in Dec 15 05:12:56 localhost ceph-mon[298913]: log_channel(cluster) log [DBG] : osdmap e271: 6 total, 6 up, 6 in Dec 15 05:12:57 localhost systemd[1]: Started /usr/bin/podman healthcheck run 4d46c406a3855fd1c322e5d86c048188ea5865daa07ba23aaebacc7879668923. Dec 15 05:12:57 localhost podman[336480]: 2025-12-15 10:12:57.755702581 +0000 UTC m=+0.084900652 container health_status 4d46c406a3855fd1c322e5d86c048188ea5865daa07ba23aaebacc7879668923 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, managed_by=edpm_ansible, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '07f3af15d79276418f54a8dde2fc251064527c3571430e14af77aaa2c01f1193-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', 
'/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, tcib_managed=true) Dec 15 05:12:57 localhost podman[336480]: 2025-12-15 10:12:57.760538901 +0000 UTC m=+0.089736972 container exec_died 4d46c406a3855fd1c322e5d86c048188ea5865daa07ba23aaebacc7879668923 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '07f3af15d79276418f54a8dde2fc251064527c3571430e14af77aaa2c01f1193-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', 
'/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2) Dec 15 05:12:57 localhost systemd[1]: 4d46c406a3855fd1c322e5d86c048188ea5865daa07ba23aaebacc7879668923.service: Deactivated successfully. Dec 15 05:12:59 localhost nova_compute[286344]: 2025-12-15 10:12:59.578 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:13:01 localhost ceph-mon[298913]: mon.np0005559462@0(leader) e17 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) Dec 15 05:13:01 localhost ceph-mon[298913]: log_channel(audit) log [DBG] : from='client.25625 172.18.0.34:0/382777224' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch Dec 15 05:13:01 localhost nova_compute[286344]: 2025-12-15 10:13:01.616 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:13:01 localhost ceph-mon[298913]: mon.np0005559462@0(leader).osd e271 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104 Dec 15 05:13:01 localhost podman[243449]: time="2025-12-15T10:13:01Z" level=info msg="List containers: received `last` parameter - overwriting `limit`" Dec 15 05:13:01 localhost podman[243449]: @ - - [15/Dec/2025:10:13:01 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 156640 "" "Go-http-client/1.1" Dec 15 05:13:01 localhost podman[243449]: @ - - [15/Dec/2025:10:13:01 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 19268 "" "Go-http-client/1.1" Dec 15 
05:13:03 localhost ceph-mon[298913]: mon.np0005559462@0(leader) e17 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) Dec 15 05:13:03 localhost ceph-mon[298913]: log_channel(audit) log [DBG] : from='client.25625 172.18.0.34:0/382777224' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch Dec 15 05:13:04 localhost nova_compute[286344]: 2025-12-15 10:13:04.582 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:13:04 localhost systemd[1]: Started /usr/bin/podman healthcheck run a4e94bd7aaee6c174c601060fb5f34559019ccea93fc397263ee3735731cfc1e. Dec 15 05:13:04 localhost podman[336498]: 2025-12-15 10:13:04.718410603 +0000 UTC m=+0.053895915 container health_status a4e94bd7aaee6c174c601060fb5f34559019ccea93fc397263ee3735731cfc1e (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, container_name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter) Dec 15 05:13:04 localhost podman[336498]: 2025-12-15 10:13:04.754399914 +0000 UTC m=+0.089885166 container exec_died a4e94bd7aaee6c174c601060fb5f34559019ccea93fc397263ee3735731cfc1e 
(image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, managed_by=edpm_ansible, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi ) Dec 15 05:13:04 localhost systemd[1]: a4e94bd7aaee6c174c601060fb5f34559019ccea93fc397263ee3735731cfc1e.service: Deactivated successfully. 
Dec 15 05:13:04 localhost openstack_network_exporter[246484]: ERROR 10:13:04 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd Dec 15 05:13:04 localhost openstack_network_exporter[246484]: ERROR 10:13:04 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd Dec 15 05:13:04 localhost openstack_network_exporter[246484]: ERROR 10:13:04 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server Dec 15 05:13:04 localhost openstack_network_exporter[246484]: ERROR 10:13:04 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath Dec 15 05:13:04 localhost openstack_network_exporter[246484]: Dec 15 05:13:04 localhost openstack_network_exporter[246484]: ERROR 10:13:04 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath Dec 15 05:13:04 localhost openstack_network_exporter[246484]: Dec 15 05:13:05 localhost ceph-mon[298913]: mon.np0005559462@0(leader) e17 handle_command mon_command({"prefix":"df", "format":"json"} v 0) Dec 15 05:13:05 localhost ceph-mon[298913]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/1823998623' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch Dec 15 05:13:05 localhost ceph-mon[298913]: mon.np0005559462@0(leader) e17 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0) Dec 15 05:13:05 localhost ceph-mon[298913]: log_channel(audit) log [DBG] : from='client.? 
172.18.0.32:0/1823998623' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch Dec 15 05:13:06 localhost nova_compute[286344]: 2025-12-15 10:13:06.619 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:13:06 localhost ceph-mon[298913]: mon.np0005559462@0(leader).osd e271 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104 Dec 15 05:13:09 localhost nova_compute[286344]: 2025-12-15 10:13:09.584 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:13:09 localhost ceph-mon[298913]: mon.np0005559462@0(leader) e17 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) Dec 15 05:13:09 localhost ceph-mon[298913]: log_channel(audit) log [DBG] : from='client.25625 172.18.0.34:0/382777224' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch Dec 15 05:13:10 localhost ceph-mon[298913]: mon.np0005559462@0(leader) e17 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) Dec 15 05:13:10 localhost ceph-mon[298913]: log_channel(audit) log [DBG] : from='client.25625 172.18.0.34:0/382777224' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch Dec 15 05:13:11 localhost ovn_controller[154603]: 2025-12-15T10:13:11Z|00524|memory_trim|INFO|Detected inactivity (last active 30004 ms ago): trimming memory Dec 15 05:13:11 localhost nova_compute[286344]: 2025-12-15 10:13:11.621 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:13:11 localhost ceph-mon[298913]: mon.np0005559462@0(leader).osd e271 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104 Dec 
15 05:13:11 localhost ceph-mon[298913]: mon.np0005559462@0(leader) e17 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/osd_remove_queue}] v 0) Dec 15 05:13:11 localhost ceph-mon[298913]: log_channel(audit) log [INF] : from='mgr.34411 ' entity='mgr.np0005559464.aomnqe' Dec 15 05:13:12 localhost ceph-mon[298913]: mon.np0005559462@0(leader).osd e271 do_prune osdmap full prune enabled Dec 15 05:13:12 localhost ceph-mon[298913]: mon.np0005559462@0(leader).osd e272 e272: 6 total, 6 up, 6 in Dec 15 05:13:12 localhost ceph-mon[298913]: log_channel(cluster) log [DBG] : osdmap e272: 6 total, 6 up, 6 in Dec 15 05:13:12 localhost ceph-mon[298913]: from='mgr.34411 172.18.0.108:0/929118679' entity='mgr.np0005559464.aomnqe' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch Dec 15 05:13:12 localhost ceph-mon[298913]: from='mgr.34411 ' entity='mgr.np0005559464.aomnqe' Dec 15 05:13:14 localhost nova_compute[286344]: 2025-12-15 10:13:14.586 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:13:14 localhost ceph-mon[298913]: mon.np0005559462@0(leader) e17 handle_command mon_command([{prefix=config-key set, key=mgr/progress/completed}] v 0) Dec 15 05:13:14 localhost ceph-mon[298913]: log_channel(audit) log [INF] : from='mgr.34411 ' entity='mgr.np0005559464.aomnqe' Dec 15 05:13:14 localhost ceph-mon[298913]: from='mgr.34411 ' entity='mgr.np0005559464.aomnqe' Dec 15 05:13:16 localhost nova_compute[286344]: 2025-12-15 10:13:16.623 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:13:16 localhost nova_compute[286344]: 2025-12-15 10:13:16.650 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:13:16 localhost ovn_metadata_agent[160585]: 2025-12-15 10:13:16.650 160590 
DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=21, ssl=[], options={'arp_ns_explicit_output': 'true', 'fdb_removal_limit': '0', 'ignore_lsp_down': 'false', 'mac_binding_removal_limit': '0', 'mac_prefix': 'fe:17:e3', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': 'fe:55:2b:86:15:b5'}, ipsec=False) old=SB_Global(nb_cfg=20) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Dec 15 05:13:16 localhost ovn_metadata_agent[160585]: 2025-12-15 10:13:16.652 160590 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 6 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m Dec 15 05:13:16 localhost ceph-mon[298913]: mon.np0005559462@0(leader).osd e272 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104 Dec 15 05:13:17 localhost nova_compute[286344]: 2025-12-15 10:13:17.270 286348 DEBUG oslo_service.periodic_task [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Dec 15 05:13:17 localhost ceph-mon[298913]: mon.np0005559462@0(leader) e17 handle_command mon_command({"prefix": "df", "format": "json"} v 0) Dec 15 05:13:17 localhost ceph-mon[298913]: log_channel(audit) log [DBG] : from='client.? 
172.18.0.108:0/1623220713' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch Dec 15 05:13:18 localhost ceph-mon[298913]: mon.np0005559462@0(leader) e17 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) Dec 15 05:13:18 localhost ceph-mon[298913]: log_channel(audit) log [DBG] : from='client.25625 172.18.0.34:0/382777224' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch Dec 15 05:13:19 localhost nova_compute[286344]: 2025-12-15 10:13:19.270 286348 DEBUG oslo_service.periodic_task [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Dec 15 05:13:19 localhost nova_compute[286344]: 2025-12-15 10:13:19.270 286348 DEBUG nova.compute.manager [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m Dec 15 05:13:19 localhost nova_compute[286344]: 2025-12-15 10:13:19.271 286348 DEBUG nova.compute.manager [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m Dec 15 05:13:19 localhost nova_compute[286344]: 2025-12-15 10:13:19.330 286348 DEBUG oslo_concurrency.lockutils [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Acquiring lock "refresh_cache-39ff1bd9-6f6b-44c8-bbec-a1fd9d196359" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m Dec 15 05:13:19 localhost nova_compute[286344]: 2025-12-15 10:13:19.331 286348 DEBUG oslo_concurrency.lockutils [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Acquired lock "refresh_cache-39ff1bd9-6f6b-44c8-bbec-a1fd9d196359" lock 
/usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m Dec 15 05:13:19 localhost nova_compute[286344]: 2025-12-15 10:13:19.331 286348 DEBUG nova.network.neutron [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] [instance: 39ff1bd9-6f6b-44c8-bbec-a1fd9d196359] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m Dec 15 05:13:19 localhost nova_compute[286344]: 2025-12-15 10:13:19.332 286348 DEBUG nova.objects.instance [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Lazy-loading 'info_cache' on Instance uuid 39ff1bd9-6f6b-44c8-bbec-a1fd9d196359 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m Dec 15 05:13:19 localhost nova_compute[286344]: 2025-12-15 10:13:19.591 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:13:19 localhost nova_compute[286344]: 2025-12-15 10:13:19.737 286348 DEBUG nova.network.neutron [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] [instance: 39ff1bd9-6f6b-44c8-bbec-a1fd9d196359] Updating instance_info_cache with network_info: [{"id": "03ef8889-3216-43fb-8a52-4be17a956ce1", "address": "fa:16:3e:74:df:7c", "network": {"id": "befb7a72-17a9-4bcb-b561-84b8f626685a", "bridge": "br-int", "label": "private", "subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.201", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.20", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.0.3"}}], "meta": {"injected": false, "tenant_id": "c785bf23f53946bc99867d8832a50266", "mtu": 1292, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": 
true, "connectivity": "l2", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap03ef8889-32", "ovs_interfaceid": "03ef8889-3216-43fb-8a52-4be17a956ce1", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m Dec 15 05:13:19 localhost nova_compute[286344]: 2025-12-15 10:13:19.808 286348 DEBUG oslo_concurrency.lockutils [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Releasing lock "refresh_cache-39ff1bd9-6f6b-44c8-bbec-a1fd9d196359" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m Dec 15 05:13:19 localhost nova_compute[286344]: 2025-12-15 10:13:19.809 286348 DEBUG nova.compute.manager [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] [instance: 39ff1bd9-6f6b-44c8-bbec-a1fd9d196359] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m Dec 15 05:13:19 localhost nova_compute[286344]: 2025-12-15 10:13:19.809 286348 DEBUG oslo_service.periodic_task [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Dec 15 05:13:20 localhost nova_compute[286344]: 2025-12-15 10:13:20.269 286348 DEBUG oslo_service.periodic_task [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Dec 15 05:13:21 localhost nova_compute[286344]: 2025-12-15 10:13:21.625 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:13:21 
localhost ceph-mon[298913]: mon.np0005559462@0(leader).osd e272 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104 Dec 15 05:13:21 localhost ceph-mon[298913]: mon.np0005559462@0(leader).osd e272 do_prune osdmap full prune enabled Dec 15 05:13:21 localhost ceph-mon[298913]: mon.np0005559462@0(leader).osd e273 e273: 6 total, 6 up, 6 in Dec 15 05:13:21 localhost ceph-mon[298913]: log_channel(cluster) log [DBG] : osdmap e273: 6 total, 6 up, 6 in Dec 15 05:13:22 localhost nova_compute[286344]: 2025-12-15 10:13:22.266 286348 DEBUG oslo_service.periodic_task [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Dec 15 05:13:22 localhost ovn_metadata_agent[160585]: 2025-12-15 10:13:22.654 160590 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=12d96d64-e862-4f68-81e5-8d9ec5d3a5e2, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '21'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m Dec 15 05:13:23 localhost nova_compute[286344]: 2025-12-15 10:13:23.265 286348 DEBUG oslo_service.periodic_task [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Dec 15 05:13:23 localhost systemd[1]: Started /usr/bin/podman healthcheck run 9f0f481c937606298203797a71924b7614a5d9e3f4a8afd837cfa5b9602aedeb. Dec 15 05:13:23 localhost systemd[1]: Started /usr/bin/podman healthcheck run ac2eae30a2a7f971398af42ea67b3ed799d4b3341f7688ebcd73f7ca7f66c2a8. 
Dec 15 05:13:23 localhost systemd[1]: Started /usr/bin/podman healthcheck run b3d8483ade539500e228c2af9c0c3e647febb5ec7903610a83aea32631007d4a. Dec 15 05:13:23 localhost systemd[1]: Started /usr/bin/podman healthcheck run 67ee72fcd99c896f79e8807764b1d0f04e7d45927d06e306c0986394e8ed42a0. Dec 15 05:13:23 localhost podman[336609]: 2025-12-15 10:13:23.780207094 +0000 UTC m=+0.098462518 container health_status ac2eae30a2a7f971398af42ea67b3ed799d4b3341f7688ebcd73f7ca7f66c2a8 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_id=ovn_controller, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '07f3af15d79276418f54a8dde2fc251064527c3571430e14af77aaa2c01f1193'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller) Dec 15 05:13:23 localhost systemd[1]: Started /usr/bin/podman healthcheck run 730c50020a7d9e2da21d2462d40af20ac303e43f3491f692cbbd87f432ba3e09. 
Dec 15 05:13:23 localhost podman[336609]: 2025-12-15 10:13:23.821809386 +0000 UTC m=+0.140064760 container exec_died ac2eae30a2a7f971398af42ea67b3ed799d4b3341f7688ebcd73f7ca7f66c2a8 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, container_name=ovn_controller, managed_by=edpm_ansible, tcib_managed=true, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ovn_controller, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '07f3af15d79276418f54a8dde2fc251064527c3571430e14af77aaa2c01f1193'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image) Dec 15 05:13:23 localhost systemd[1]: tmp-crun.T4nEsX.mount: Deactivated successfully. Dec 15 05:13:23 localhost systemd[1]: ac2eae30a2a7f971398af42ea67b3ed799d4b3341f7688ebcd73f7ca7f66c2a8.service: Deactivated successfully. 
Dec 15 05:13:23 localhost podman[336610]: 2025-12-15 10:13:23.872110453 +0000 UTC m=+0.185026633 container health_status b3d8483ade539500e228c2af9c0c3e647febb5ec7903610a83aea32631007d4a (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, org.label-schema.license=GPLv2, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '07f3af15d79276418f54a8dde2fc251064527c3571430e14af77aaa2c01f1193-cb1e9478634a00c8fb5ad9cd7fcd38c7b2a1a85187dfaa618bc715c24c2b15fb'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=ceilometer_agent_compute, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, 
org.label-schema.vendor=CentOS) Dec 15 05:13:23 localhost podman[336611]: 2025-12-15 10:13:23.830909281 +0000 UTC m=+0.141551809 container health_status 67ee72fcd99c896f79e8807764b1d0f04e7d45927d06e306c0986394e8ed42a0 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, container_name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter) Dec 15 05:13:23 localhost podman[336610]: 2025-12-15 10:13:23.884271051 +0000 UTC m=+0.197187271 container exec_died b3d8483ade539500e228c2af9c0c3e647febb5ec7903610a83aea32631007d4a (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, managed_by=edpm_ansible, tcib_managed=true, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'command': 
'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '07f3af15d79276418f54a8dde2fc251064527c3571430e14af77aaa2c01f1193-cb1e9478634a00c8fb5ad9cd7fcd38c7b2a1a85187dfaa618bc715c24c2b15fb'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, config_id=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0) Dec 15 05:13:23 localhost systemd[1]: b3d8483ade539500e228c2af9c0c3e647febb5ec7903610a83aea32631007d4a.service: Deactivated successfully. 
Dec 15 05:13:23 localhost podman[336611]: 2025-12-15 10:13:23.915407692 +0000 UTC m=+0.226050210 container exec_died 67ee72fcd99c896f79e8807764b1d0f04e7d45927d06e306c0986394e8ed42a0 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter) Dec 15 05:13:23 localhost systemd[1]: 67ee72fcd99c896f79e8807764b1d0f04e7d45927d06e306c0986394e8ed42a0.service: Deactivated successfully. 
Dec 15 05:13:23 localhost podman[336608]: 2025-12-15 10:13:23.933916091 +0000 UTC m=+0.253833709 container health_status 9f0f481c937606298203797a71924b7614a5d9e3f4a8afd837cfa5b9602aedeb (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, org.label-schema.build-date=20251202, container_name=multipathd, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=multipathd, io.buildah.version=1.41.3) Dec 15 05:13:23 localhost podman[336659]: 2025-12-15 10:13:23.888823044 +0000 UTC m=+0.087940603 container health_status 
730c50020a7d9e2da21d2462d40af20ac303e43f3491f692cbbd87f432ba3e09 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, url=https://catalog.redhat.com/en/search?searchType=containers, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'cb1e9478634a00c8fb5ad9cd7fcd38c7b2a1a85187dfaa618bc715c24c2b15fb'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, container_name=openstack_network_exporter, release=1755695350, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., maintainer=Red Hat, Inc., io.buildah.version=1.33.7, com.redhat.component=ubi9-minimal-container, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vendor=Red Hat, Inc., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., config_id=openstack_network_exporter, io.openshift.tags=minimal rhel9, distribution-scope=public, managed_by=edpm_ansible, vcs-type=git, io.openshift.expose-services=, version=9.6, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=ubi9-minimal, architecture=x86_64, build-date=2025-08-20T13:12:41) Dec 15 05:13:23 localhost podman[336608]: 2025-12-15 10:13:23.946289505 +0000 UTC m=+0.266207123 container exec_died 9f0f481c937606298203797a71924b7614a5d9e3f4a8afd837cfa5b9602aedeb (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, container_name=multipathd, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202) Dec 15 05:13:23 localhost systemd[1]: 9f0f481c937606298203797a71924b7614a5d9e3f4a8afd837cfa5b9602aedeb.service: Deactivated successfully. Dec 15 05:13:23 localhost podman[336659]: 2025-12-15 10:13:23.972554074 +0000 UTC m=+0.171671683 container exec_died 730c50020a7d9e2da21d2462d40af20ac303e43f3491f692cbbd87f432ba3e09 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'cb1e9478634a00c8fb5ad9cd7fcd38c7b2a1a85187dfaa618bc715c24c2b15fb'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': 
['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, container_name=openstack_network_exporter, name=ubi9-minimal, managed_by=edpm_ansible, url=https://catalog.redhat.com/en/search?searchType=containers, release=1755695350, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=9.6, vcs-type=git, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vendor=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, io.openshift.expose-services=, io.openshift.tags=minimal rhel9, architecture=x86_64, distribution-scope=public, maintainer=Red Hat, Inc., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, build-date=2025-08-20T13:12:41, config_id=openstack_network_exporter) Dec 15 05:13:23 localhost systemd[1]: 730c50020a7d9e2da21d2462d40af20ac303e43f3491f692cbbd87f432ba3e09.service: Deactivated successfully. 
Dec 15 05:13:24 localhost nova_compute[286344]: 2025-12-15 10:13:24.270 286348 DEBUG oslo_service.periodic_task [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Dec 15 05:13:24 localhost nova_compute[286344]: 2025-12-15 10:13:24.271 286348 DEBUG oslo_service.periodic_task [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Dec 15 05:13:24 localhost nova_compute[286344]: 2025-12-15 10:13:24.271 286348 DEBUG nova.compute.manager [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m Dec 15 05:13:24 localhost nova_compute[286344]: 2025-12-15 10:13:24.594 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:13:26 localhost nova_compute[286344]: 2025-12-15 10:13:26.628 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:13:26 localhost ceph-mon[298913]: mon.np0005559462@0(leader).osd e273 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104 Dec 15 05:13:27 localhost nova_compute[286344]: 2025-12-15 10:13:27.270 286348 DEBUG oslo_service.periodic_task [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Dec 15 05:13:27 localhost nova_compute[286344]: 2025-12-15 10:13:27.287 286348 DEBUG 
oslo_concurrency.lockutils [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Dec 15 05:13:27 localhost nova_compute[286344]: 2025-12-15 10:13:27.288 286348 DEBUG oslo_concurrency.lockutils [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Dec 15 05:13:27 localhost nova_compute[286344]: 2025-12-15 10:13:27.288 286348 DEBUG oslo_concurrency.lockutils [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Dec 15 05:13:27 localhost nova_compute[286344]: 2025-12-15 10:13:27.288 286348 DEBUG nova.compute.resource_tracker [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Auditing locally available compute resources for np0005559462.localdomain (node: np0005559462.localdomain) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m Dec 15 05:13:27 localhost nova_compute[286344]: 2025-12-15 10:13:27.289 286348 DEBUG oslo_concurrency.processutils [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m Dec 15 05:13:27 localhost ceph-mon[298913]: mon.np0005559462@0(leader) e17 handle_command mon_command({"prefix": "df", "format": "json"} v 0) Dec 15 05:13:27 localhost ceph-mon[298913]: log_channel(audit) log [DBG] : 
from='client.? 172.18.0.106:0/268131552' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch Dec 15 05:13:27 localhost nova_compute[286344]: 2025-12-15 10:13:27.746 286348 DEBUG oslo_concurrency.processutils [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.458s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m Dec 15 05:13:27 localhost nova_compute[286344]: 2025-12-15 10:13:27.818 286348 DEBUG nova.virt.libvirt.driver [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m Dec 15 05:13:27 localhost nova_compute[286344]: 2025-12-15 10:13:27.819 286348 DEBUG nova.virt.libvirt.driver [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m Dec 15 05:13:28 localhost nova_compute[286344]: 2025-12-15 10:13:28.009 286348 WARNING nova.virt.libvirt.driver [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] This host appears to have multiple sockets per NUMA node. 
The `socket` PCI NUMA affinity will not be supported.#033[00m Dec 15 05:13:28 localhost nova_compute[286344]: 2025-12-15 10:13:28.010 286348 DEBUG nova.compute.resource_tracker [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Hypervisor/Node resource view: name=np0005559462.localdomain free_ram=11138MB free_disk=41.836978912353516GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", 
"product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m Dec 15 05:13:28 localhost nova_compute[286344]: 2025-12-15 10:13:28.011 286348 DEBUG oslo_concurrency.lockutils [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Dec 15 05:13:28 localhost nova_compute[286344]: 2025-12-15 10:13:28.011 286348 DEBUG oslo_concurrency.lockutils [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Dec 15 05:13:28 localhost nova_compute[286344]: 2025-12-15 10:13:28.067 286348 DEBUG nova.compute.resource_tracker [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Instance 39ff1bd9-6f6b-44c8-bbec-a1fd9d196359 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 2, 'MEMORY_MB': 512, 'VCPU': 1}}. 
_remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m Dec 15 05:13:28 localhost nova_compute[286344]: 2025-12-15 10:13:28.068 286348 DEBUG nova.compute.resource_tracker [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m Dec 15 05:13:28 localhost nova_compute[286344]: 2025-12-15 10:13:28.068 286348 DEBUG nova.compute.resource_tracker [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Final resource view: name=np0005559462.localdomain phys_ram=15738MB used_ram=1024MB phys_disk=41GB used_disk=2GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m Dec 15 05:13:28 localhost nova_compute[286344]: 2025-12-15 10:13:28.104 286348 DEBUG oslo_concurrency.processutils [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m Dec 15 05:13:28 localhost ceph-mon[298913]: mon.np0005559462@0(leader) e17 handle_command mon_command({"prefix": "df", "format": "json"} v 0) Dec 15 05:13:28 localhost ceph-mon[298913]: log_channel(audit) log [DBG] : from='client.? 
172.18.0.106:0/2411806599' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch Dec 15 05:13:28 localhost nova_compute[286344]: 2025-12-15 10:13:28.559 286348 DEBUG oslo_concurrency.processutils [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.456s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m Dec 15 05:13:28 localhost nova_compute[286344]: 2025-12-15 10:13:28.566 286348 DEBUG nova.compute.provider_tree [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Inventory has not changed in ProviderTree for provider: 26c8956b-6742-4951-b566-971b9bbe323b update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m Dec 15 05:13:28 localhost nova_compute[286344]: 2025-12-15 10:13:28.583 286348 DEBUG nova.scheduler.client.report [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Inventory has not changed for provider 26c8956b-6742-4951-b566-971b9bbe323b based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m Dec 15 05:13:28 localhost nova_compute[286344]: 2025-12-15 10:13:28.585 286348 DEBUG nova.compute.resource_tracker [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Compute_service record updated for np0005559462.localdomain:np0005559462.localdomain _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m Dec 15 05:13:28 localhost nova_compute[286344]: 2025-12-15 10:13:28.585 286348 DEBUG 
oslo_concurrency.lockutils [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.574s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Dec 15 05:13:28 localhost systemd[1]: Started /usr/bin/podman healthcheck run 4d46c406a3855fd1c322e5d86c048188ea5865daa07ba23aaebacc7879668923. Dec 15 05:13:28 localhost podman[336760]: 2025-12-15 10:13:28.745356907 +0000 UTC m=+0.075405075 container health_status 4d46c406a3855fd1c322e5d86c048188ea5865daa07ba23aaebacc7879668923 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '07f3af15d79276418f54a8dde2fc251064527c3571430e14af77aaa2c01f1193-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, 
maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_metadata_agent, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2) Dec 15 05:13:28 localhost podman[336760]: 2025-12-15 10:13:28.753279351 +0000 UTC m=+0.083327499 container exec_died 4d46c406a3855fd1c322e5d86c048188ea5865daa07ba23aaebacc7879668923 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.build-date=20251202, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '07f3af15d79276418f54a8dde2fc251064527c3571430e14af77aaa2c01f1193-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', 
'/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb) Dec 15 05:13:28 localhost systemd[1]: 4d46c406a3855fd1c322e5d86c048188ea5865daa07ba23aaebacc7879668923.service: Deactivated successfully. Dec 15 05:13:29 localhost nova_compute[286344]: 2025-12-15 10:13:29.597 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:13:31 localhost nova_compute[286344]: 2025-12-15 10:13:31.586 286348 DEBUG oslo_service.periodic_task [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Dec 15 05:13:31 localhost nova_compute[286344]: 2025-12-15 10:13:31.631 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:13:31 localhost podman[243449]: time="2025-12-15T10:13:31Z" level=info msg="List containers: received `last` parameter - overwriting `limit`" Dec 15 05:13:31 localhost ceph-mon[298913]: mon.np0005559462@0(leader).osd e273 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104 Dec 15 05:13:31 localhost podman[243449]: @ - - [15/Dec/2025:10:13:31 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 156640 "" "Go-http-client/1.1" Dec 15 05:13:31 localhost podman[243449]: @ - - [15/Dec/2025:10:13:31 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 19264 "" 
"Go-http-client/1.1" Dec 15 05:13:34 localhost nova_compute[286344]: 2025-12-15 10:13:34.599 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:13:34 localhost openstack_network_exporter[246484]: ERROR 10:13:34 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd Dec 15 05:13:34 localhost openstack_network_exporter[246484]: ERROR 10:13:34 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server Dec 15 05:13:34 localhost openstack_network_exporter[246484]: ERROR 10:13:34 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd Dec 15 05:13:34 localhost openstack_network_exporter[246484]: ERROR 10:13:34 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath Dec 15 05:13:34 localhost openstack_network_exporter[246484]: Dec 15 05:13:34 localhost openstack_network_exporter[246484]: ERROR 10:13:34 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath Dec 15 05:13:34 localhost openstack_network_exporter[246484]: Dec 15 05:13:35 localhost systemd[1]: Started /usr/bin/podman healthcheck run a4e94bd7aaee6c174c601060fb5f34559019ccea93fc397263ee3735731cfc1e. 
Dec 15 05:13:35 localhost podman[336778]: 2025-12-15 10:13:35.753295628 +0000 UTC m=+0.079570498 container health_status a4e94bd7aaee6c174c601060fb5f34559019ccea93fc397263ee3735731cfc1e (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}) Dec 15 05:13:35 localhost podman[336778]: 2025-12-15 10:13:35.765444026 +0000 UTC m=+0.091718916 container exec_died a4e94bd7aaee6c174c601060fb5f34559019ccea93fc397263ee3735731cfc1e (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 
'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}) Dec 15 05:13:35 localhost systemd[1]: a4e94bd7aaee6c174c601060fb5f34559019ccea93fc397263ee3735731cfc1e.service: Deactivated successfully. Dec 15 05:13:36 localhost nova_compute[286344]: 2025-12-15 10:13:36.633 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:13:36 localhost ceph-mon[298913]: mon.np0005559462@0(leader).osd e273 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104 Dec 15 05:13:36 localhost nova_compute[286344]: 2025-12-15 10:13:36.937 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:13:39 localhost nova_compute[286344]: 2025-12-15 10:13:39.602 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:13:40 localhost nova_compute[286344]: 2025-12-15 10:13:40.792 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:13:41 localhost nova_compute[286344]: 2025-12-15 10:13:41.634 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:13:41 localhost ceph-mon[298913]: mon.np0005559462@0(leader).osd e273 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104 Dec 15 05:13:42 localhost ceph-mon[298913]: log_channel(cluster) log [DBG] : mgrmap e53: np0005559464.aomnqe(active, since 16m), standbys: np0005559462.fudvyx, np0005559463.daptkf Dec 15 05:13:44 localhost nova_compute[286344]: 2025-12-15 10:13:44.605 286348 DEBUG 
ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:13:45 localhost ovn_controller[154603]: 2025-12-15T10:13:45Z|00525|binding|INFO|Releasing lport b35254ad-12eb-47bb-92be-44fefe0694f0 from this chassis (sb_readonly=0) Dec 15 05:13:45 localhost nova_compute[286344]: 2025-12-15 10:13:45.820 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:13:46 localhost nova_compute[286344]: 2025-12-15 10:13:46.636 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:13:46 localhost ceph-mon[298913]: mon.np0005559462@0(leader).osd e273 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104 Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.127 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359', 'name': 'test', 'flavor': {'id': '2da0e147-aaa7-4bb9-a176-5fe1b15a32a0', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'image': {'id': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-00000002', 'OS-EXT-SRV-ATTR:host': 'np0005559462.localdomain', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': 'c785bf23f53946bc99867d8832a50266', 'user_id': '1ba5fce347b64bfebf995f187193f205', 'hostId': 'bf4d507aa00b3fcf643a7e0b883420632ac7fc96e8c7a756982e121d', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228 Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.129 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.requests in the context of pollsters Dec 15 
05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.155 12 DEBUG ceilometer.compute.pollsters [-] 39ff1bd9-6f6b-44c8-bbec-a1fd9d196359/disk.device.read.requests volume: 1272 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.156 12 DEBUG ceilometer.compute.pollsters [-] 39ff1bd9-6f6b-44c8-bbec-a1fd9d196359/disk.device.read.requests volume: 124 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.159 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '4aeab6d3-e1c3-4036-95cb-e81ef274c36c', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1272, 'user_id': '1ba5fce347b64bfebf995f187193f205', 'user_name': None, 'project_id': 'c785bf23f53946bc99867d8832a50266', 'project_name': None, 'resource_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359-vda', 'timestamp': '2025-12-15T10:13:48.130373', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359', 'instance_type': 'm1.small', 'host': 'bf4d507aa00b3fcf643a7e0b883420632ac7fc96e8c7a756982e121d', 'instance_host': 'np0005559462.localdomain', 'flavor': {'id': '2da0e147-aaa7-4bb9-a176-5fe1b15a32a0', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e'}, 'image_ref': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 
'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'bf08b928-d99e-11f0-817e-fa163ebaca0f', 'monotonic_time': 12694.323162499, 'message_signature': 'e409bf40d35c5fbf9b93eba32aa078ac05abab9423f38322d64834435c59dd07'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 124, 'user_id': '1ba5fce347b64bfebf995f187193f205', 'user_name': None, 'project_id': 'c785bf23f53946bc99867d8832a50266', 'project_name': None, 'resource_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359-vdb', 'timestamp': '2025-12-15T10:13:48.130373', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359', 'instance_type': 'm1.small', 'host': 'bf4d507aa00b3fcf643a7e0b883420632ac7fc96e8c7a756982e121d', 'instance_host': 'np0005559462.localdomain', 'flavor': {'id': '2da0e147-aaa7-4bb9-a176-5fe1b15a32a0', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e'}, 'image_ref': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'bf08d35e-d99e-11f0-817e-fa163ebaca0f', 'monotonic_time': 12694.323162499, 'message_signature': '86eac81f0e5e2d68c7efb7f5bd8d79da6c0037a350e19485ec2359231af03576'}]}, 'timestamp': '2025-12-15 10:13:48.157458', '_unique_id': 'd1bf1d1ae62a42d49cb11ba5bf562b0f'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.159 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.159 12 ERROR 
oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.159 12 ERROR oslo_messaging.notify.messaging yield Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.159 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.159 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.159 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.159 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.159 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.159 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.159 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.159 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.159 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in 
establish_connection Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.159 12 ERROR oslo_messaging.notify.messaging conn.connect() Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.159 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.159 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.159 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.159 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.159 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.159 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.159 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.159 12 ERROR oslo_messaging.notify.messaging Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.159 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.159 12 ERROR oslo_messaging.notify.messaging Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.159 12 ERROR 
oslo_messaging.notify.messaging Traceback (most recent call last): Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.159 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.159 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.159 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.159 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.159 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.159 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.159 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.159 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.159 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 
2025-12-15 10:13:48.159 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.159 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.159 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.159 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.159 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.159 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.159 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.159 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.159 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.159 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 
10:13:48.159 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.159 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.159 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.159 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.159 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.159 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.159 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.159 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.159 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.159 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.159 12 ERROR oslo_messaging.notify.messaging Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.161 12 INFO 
ceilometer.polling.manager [-] Polling pollster disk.device.read.latency in the context of pollsters Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.161 12 DEBUG ceilometer.compute.pollsters [-] 39ff1bd9-6f6b-44c8-bbec-a1fd9d196359/disk.device.read.latency volume: 1342134926 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.161 12 DEBUG ceilometer.compute.pollsters [-] 39ff1bd9-6f6b-44c8-bbec-a1fd9d196359/disk.device.read.latency volume: 123356132 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.163 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'b1d61da3-e34a-4c9a-ad6f-a2834de11f53', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 1342134926, 'user_id': '1ba5fce347b64bfebf995f187193f205', 'user_name': None, 'project_id': 'c785bf23f53946bc99867d8832a50266', 'project_name': None, 'resource_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359-vda', 'timestamp': '2025-12-15T10:13:48.161365', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359', 'instance_type': 'm1.small', 'host': 'bf4d507aa00b3fcf643a7e0b883420632ac7fc96e8c7a756982e121d', 'instance_host': 'np0005559462.localdomain', 'flavor': {'id': '2da0e147-aaa7-4bb9-a176-5fe1b15a32a0', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e'}, 'image_ref': 
'7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'bf09815a-d99e-11f0-817e-fa163ebaca0f', 'monotonic_time': 12694.323162499, 'message_signature': 'a113025a014f4baada0b86cd8014499855b4148d73b81dcbb5f068981edde799'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 123356132, 'user_id': '1ba5fce347b64bfebf995f187193f205', 'user_name': None, 'project_id': 'c785bf23f53946bc99867d8832a50266', 'project_name': None, 'resource_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359-vdb', 'timestamp': '2025-12-15T10:13:48.161365', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359', 'instance_type': 'm1.small', 'host': 'bf4d507aa00b3fcf643a7e0b883420632ac7fc96e8c7a756982e121d', 'instance_host': 'np0005559462.localdomain', 'flavor': {'id': '2da0e147-aaa7-4bb9-a176-5fe1b15a32a0', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e'}, 'image_ref': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'bf099cd0-d99e-11f0-817e-fa163ebaca0f', 'monotonic_time': 12694.323162499, 'message_signature': 'be9ecf6b252d3b46aeb4695eac27bf5a416cbd63d19eac49e15752c84cde1f54'}]}, 'timestamp': '2025-12-15 10:13:48.162593', '_unique_id': 'e0bce61a50584127ba034d5807784688'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.163 12 ERROR oslo_messaging.notify.messaging Traceback (most 
recent call last): Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.163 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.163 12 ERROR oslo_messaging.notify.messaging yield Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.163 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.163 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.163 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.163 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.163 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.163 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.163 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.163 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.163 12 ERROR 
oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.163 12 ERROR oslo_messaging.notify.messaging conn.connect() Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.163 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.163 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.163 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.163 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.163 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.163 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.163 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.163 12 ERROR oslo_messaging.notify.messaging Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.163 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.163 12 ERROR oslo_messaging.notify.messaging Dec 
15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.163 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.163 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.163 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.163 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.163 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.163 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.163 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.163 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.163 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.163 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 
605, in _get_connection Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.163 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.163 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.163 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.163 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.163 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.163 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.163 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.163 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.163 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.163 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in 
ensure_connection Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.163 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.163 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.163 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.163 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.163 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.163 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.163 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.163 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.163 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.163 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.163 12 ERROR oslo_messaging.notify.messaging Dec 15 
05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.165 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.capacity in the context of pollsters Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.176 12 DEBUG ceilometer.compute.pollsters [-] 39ff1bd9-6f6b-44c8-bbec-a1fd9d196359/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.176 12 DEBUG ceilometer.compute.pollsters [-] 39ff1bd9-6f6b-44c8-bbec-a1fd9d196359/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.178 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '02c89511-2f67-47eb-960d-9bacb78f20da', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '1ba5fce347b64bfebf995f187193f205', 'user_name': None, 'project_id': 'c785bf23f53946bc99867d8832a50266', 'project_name': None, 'resource_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359-vda', 'timestamp': '2025-12-15T10:13:48.165443', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359', 'instance_type': 'm1.small', 'host': 'bf4d507aa00b3fcf643a7e0b883420632ac7fc96e8c7a756982e121d', 'instance_host': 'np0005559462.localdomain', 'flavor': {'id': '2da0e147-aaa7-4bb9-a176-5fe1b15a32a0', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': 
{'id': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e'}, 'image_ref': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'bf0bc4f6-d99e-11f0-817e-fa163ebaca0f', 'monotonic_time': 12694.358162123, 'message_signature': 'eb0cdbb92f6d120fefa587f0a4f2c7a68bd815d08a2d090d794a5895c6bdf1c1'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '1ba5fce347b64bfebf995f187193f205', 'user_name': None, 'project_id': 'c785bf23f53946bc99867d8832a50266', 'project_name': None, 'resource_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359-vdb', 'timestamp': '2025-12-15T10:13:48.165443', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359', 'instance_type': 'm1.small', 'host': 'bf4d507aa00b3fcf643a7e0b883420632ac7fc96e8c7a756982e121d', 'instance_host': 'np0005559462.localdomain', 'flavor': {'id': '2da0e147-aaa7-4bb9-a176-5fe1b15a32a0', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e'}, 'image_ref': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'bf0bda90-d99e-11f0-817e-fa163ebaca0f', 'monotonic_time': 12694.358162123, 'message_signature': '0ed281556985e545eb1901430bd426872469d06065cb38d477f888cfd00e2029'}]}, 'timestamp': '2025-12-15 10:13:48.177279', '_unique_id': 'eee28d60303543c4ad4d20a23a578baf'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.178 12 
ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.178 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.178 12 ERROR oslo_messaging.notify.messaging yield Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.178 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.178 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.178 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.178 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.178 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.178 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.178 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.178 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Dec 15 05:13:48 localhost 
ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.178 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.178 12 ERROR oslo_messaging.notify.messaging conn.connect() Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.178 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.178 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.178 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.178 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.178 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.178 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.178 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.178 12 ERROR oslo_messaging.notify.messaging Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.178 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 
2025-12-15 10:13:48.178 12 ERROR oslo_messaging.notify.messaging Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.178 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.178 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.178 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.178 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.178 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.178 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.178 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.178 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.178 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.178 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.178 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.178 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.178 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.178 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.178 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.178 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.178 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.178 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.178 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.178 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.178 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.178 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.178 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.178 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.178 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.178 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.178 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.178 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.178 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.178 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 15 05:13:48 localhost 
ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.178 12 ERROR oslo_messaging.notify.messaging Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.180 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no new resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193 Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.181 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.latency in the context of pollsters Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.181 12 DEBUG ceilometer.compute.pollsters [-] 39ff1bd9-6f6b-44c8-bbec-a1fd9d196359/disk.device.write.latency volume: 1243487016 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.182 12 DEBUG ceilometer.compute.pollsters [-] 39ff1bd9-6f6b-44c8-bbec-a1fd9d196359/disk.device.write.latency volume: 24779175 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.184 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': 'e05a069d-c50c-4755-a098-ca00924caa0f', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 1243487016, 'user_id': '1ba5fce347b64bfebf995f187193f205', 'user_name': None, 'project_id': 'c785bf23f53946bc99867d8832a50266', 'project_name': None, 'resource_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359-vda', 'timestamp': '2025-12-15T10:13:48.181431', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359', 'instance_type': 'm1.small', 'host': 'bf4d507aa00b3fcf643a7e0b883420632ac7fc96e8c7a756982e121d', 'instance_host': 'np0005559462.localdomain', 'flavor': {'id': '2da0e147-aaa7-4bb9-a176-5fe1b15a32a0', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e'}, 'image_ref': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'bf0c96c4-d99e-11f0-817e-fa163ebaca0f', 'monotonic_time': 12694.323162499, 'message_signature': '4530b6c3c02837bb5e9179b7a84094efabf6116d100de5df70c88a84f6d93d60'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 24779175, 'user_id': '1ba5fce347b64bfebf995f187193f205', 'user_name': None, 'project_id': 'c785bf23f53946bc99867d8832a50266', 'project_name': None, 'resource_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359-vdb', 'timestamp': '2025-12-15T10:13:48.181431', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 
'39ff1bd9-6f6b-44c8-bbec-a1fd9d196359', 'instance_type': 'm1.small', 'host': 'bf4d507aa00b3fcf643a7e0b883420632ac7fc96e8c7a756982e121d', 'instance_host': 'np0005559462.localdomain', 'flavor': {'id': '2da0e147-aaa7-4bb9-a176-5fe1b15a32a0', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e'}, 'image_ref': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'bf0cae20-d99e-11f0-817e-fa163ebaca0f', 'monotonic_time': 12694.323162499, 'message_signature': 'c038d9c833ab01a237589d12cbf3f5c4ba4ba8dbc23cb5acd8bfb3cee30b3fd0'}]}, 'timestamp': '2025-12-15 10:13:48.182715', '_unique_id': '94d60ae9418c40f3bd7b6c4cdf686b93'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.184 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.184 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.184 12 ERROR oslo_messaging.notify.messaging yield Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.184 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.184 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.184 12 ERROR 
oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.184 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.184 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.184 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.184 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.184 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.184 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.184 12 ERROR oslo_messaging.notify.messaging conn.connect() Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.184 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.184 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.184 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Dec 15 
05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.184 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.184 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.184 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.184 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.184 12 ERROR oslo_messaging.notify.messaging Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.184 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.184 12 ERROR oslo_messaging.notify.messaging Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.184 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.184 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.184 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.184 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Dec 15 05:13:48 localhost 
ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.184 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.184 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.184 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.184 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.184 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.184 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.184 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.184 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.184 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.184 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, 
in get Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.184 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.184 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.184 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.184 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.184 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.184 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.184 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.184 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.184 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.184 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 15 05:13:48 localhost 
ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.184 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.184 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.184 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.184 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.184 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.184 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.184 12 ERROR oslo_messaging.notify.messaging Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.186 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no new resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193 Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.186 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets in the context of pollsters Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.191 12 DEBUG ceilometer.compute.pollsters [-] 39ff1bd9-6f6b-44c8-bbec-a1fd9d196359/network.incoming.packets volume: 60 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 15 05:13:48 localhost 
ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.192 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'a3aa67bb-f4d4-4f5e-a631-628221bc4003', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 60, 'user_id': '1ba5fce347b64bfebf995f187193f205', 'user_name': None, 'project_id': 'c785bf23f53946bc99867d8832a50266', 'project_name': None, 'resource_id': 'instance-00000002-39ff1bd9-6f6b-44c8-bbec-a1fd9d196359-tap03ef8889-32', 'timestamp': '2025-12-15T10:13:48.186440', 'resource_metadata': {'display_name': 'test', 'name': 'tap03ef8889-32', 'instance_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359', 'instance_type': 'm1.small', 'host': 'bf4d507aa00b3fcf643a7e0b883420632ac7fc96e8c7a756982e121d', 'instance_host': 'np0005559462.localdomain', 'flavor': {'id': '2da0e147-aaa7-4bb9-a176-5fe1b15a32a0', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e'}, 'image_ref': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:74:df:7c', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap03ef8889-32'}, 'message_id': 'bf0e0e8c-d99e-11f0-817e-fa163ebaca0f', 'monotonic_time': 12694.37918735, 'message_signature': '5b9f66b1c5851a9f86f0e2782707c871454c7bf4e634a205cec38227cac3221a'}]}, 'timestamp': '2025-12-15 10:13:48.191763', '_unique_id': 'e25f9406e1eb4bce9d14fb3e65b749bb'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 
2025-12-15 10:13:48.192 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.192 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.192 12 ERROR oslo_messaging.notify.messaging yield Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.192 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.192 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.192 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.192 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.192 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.192 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.192 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.192 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Dec 15 05:13:48 
localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.192 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.192 12 ERROR oslo_messaging.notify.messaging conn.connect() Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.192 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.192 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.192 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.192 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.192 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.192 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.192 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.192 12 ERROR oslo_messaging.notify.messaging Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.192 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Dec 15 05:13:48 localhost 
ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.192 12 ERROR oslo_messaging.notify.messaging Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.192 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.192 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.192 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.192 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.192 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.192 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.192 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.192 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.192 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.192 12 ERROR 
oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.192 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.192 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.192 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.192 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.192 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.192 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.192 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.192 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.192 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.192 12 ERROR 
oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.192 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.192 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.192 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.192 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.192 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.192 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.192 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.192 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.192 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.192 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 15 
05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.192 12 ERROR oslo_messaging.notify.messaging Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.194 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.allocation in the context of pollsters Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.194 12 DEBUG ceilometer.compute.pollsters [-] 39ff1bd9-6f6b-44c8-bbec-a1fd9d196359/disk.device.allocation volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.194 12 DEBUG ceilometer.compute.pollsters [-] 39ff1bd9-6f6b-44c8-bbec-a1fd9d196359/disk.device.allocation volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.196 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': '81d7c79f-4e4f-481e-b2e5-e3633bfdd24c', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '1ba5fce347b64bfebf995f187193f205', 'user_name': None, 'project_id': 'c785bf23f53946bc99867d8832a50266', 'project_name': None, 'resource_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359-vda', 'timestamp': '2025-12-15T10:13:48.194257', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359', 'instance_type': 'm1.small', 'host': 'bf4d507aa00b3fcf643a7e0b883420632ac7fc96e8c7a756982e121d', 'instance_host': 'np0005559462.localdomain', 'flavor': {'id': '2da0e147-aaa7-4bb9-a176-5fe1b15a32a0', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e'}, 'image_ref': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'bf0e82d6-d99e-11f0-817e-fa163ebaca0f', 'monotonic_time': 12694.358162123, 'message_signature': 'e267d28f4bf75b78b4712e97514a11b6d159d859cb751be106563d27388b65e0'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '1ba5fce347b64bfebf995f187193f205', 'user_name': None, 'project_id': 'c785bf23f53946bc99867d8832a50266', 'project_name': None, 'resource_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359-vdb', 'timestamp': '2025-12-15T10:13:48.194257', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359', 
'instance_type': 'm1.small', 'host': 'bf4d507aa00b3fcf643a7e0b883420632ac7fc96e8c7a756982e121d', 'instance_host': 'np0005559462.localdomain', 'flavor': {'id': '2da0e147-aaa7-4bb9-a176-5fe1b15a32a0', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e'}, 'image_ref': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'bf0e92d0-d99e-11f0-817e-fa163ebaca0f', 'monotonic_time': 12694.358162123, 'message_signature': 'bc7f11b848a4655f7561914468e2f612eb3ae1ea103f477cf9bdfb7ad9e9d71d'}]}, 'timestamp': '2025-12-15 10:13:48.195119', '_unique_id': '3eca524afedc43218192f69569eadc7e'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.196 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.196 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.196 12 ERROR oslo_messaging.notify.messaging yield Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.196 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.196 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.196 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.196 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.196 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.196 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.196 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.196 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.196 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.196 12 ERROR oslo_messaging.notify.messaging conn.connect() Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.196 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.196 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.196 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Dec 15 05:13:48 localhost 
ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.196 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.196 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.196 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.196 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.196 12 ERROR oslo_messaging.notify.messaging Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.196 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.196 12 ERROR oslo_messaging.notify.messaging Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.196 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.196 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.196 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.196 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 
10:13:48.196 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.196 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.196 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.196 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.196 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.196 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.196 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.196 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.196 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.196 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Dec 15 05:13:48 localhost 
ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.196 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.196 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.196 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.196 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.196 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.196 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.196 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.196 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.196 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.196 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 
10:13:48.196 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.196 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.196 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.196 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.196 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.196 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.196 12 ERROR oslo_messaging.notify.messaging Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.197 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.usage in the context of pollsters Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.197 12 DEBUG ceilometer.compute.pollsters [-] 39ff1bd9-6f6b-44c8-bbec-a1fd9d196359/disk.device.usage volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.197 12 DEBUG ceilometer.compute.pollsters [-] 39ff1bd9-6f6b-44c8-bbec-a1fd9d196359/disk.device.usage volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.199 
12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'fb21bcac-9359-4214-945f-0d73d6cb15cc', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '1ba5fce347b64bfebf995f187193f205', 'user_name': None, 'project_id': 'c785bf23f53946bc99867d8832a50266', 'project_name': None, 'resource_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359-vda', 'timestamp': '2025-12-15T10:13:48.197274', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359', 'instance_type': 'm1.small', 'host': 'bf4d507aa00b3fcf643a7e0b883420632ac7fc96e8c7a756982e121d', 'instance_host': 'np0005559462.localdomain', 'flavor': {'id': '2da0e147-aaa7-4bb9-a176-5fe1b15a32a0', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e'}, 'image_ref': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'bf0ef8a6-d99e-11f0-817e-fa163ebaca0f', 'monotonic_time': 12694.358162123, 'message_signature': '724d903c4320aa1bd3877d5da09e1c5dc4ddb4a4faa2a8d1556d30eecb518765'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '1ba5fce347b64bfebf995f187193f205', 'user_name': None, 'project_id': 'c785bf23f53946bc99867d8832a50266', 'project_name': None, 'resource_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359-vdb', 'timestamp': '2025-12-15T10:13:48.197274', 'resource_metadata': {'display_name': 'test', 
'name': 'instance-00000002', 'instance_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359', 'instance_type': 'm1.small', 'host': 'bf4d507aa00b3fcf643a7e0b883420632ac7fc96e8c7a756982e121d', 'instance_host': 'np0005559462.localdomain', 'flavor': {'id': '2da0e147-aaa7-4bb9-a176-5fe1b15a32a0', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e'}, 'image_ref': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'bf0f1002-d99e-11f0-817e-fa163ebaca0f', 'monotonic_time': 12694.358162123, 'message_signature': '22923a1504a99a46ac1ccf4d7b7dc60e87a832a28ae3dafcf4f2763ce9838fbd'}]}, 'timestamp': '2025-12-15 10:13:48.198303', '_unique_id': 'b5b7336720354ea992545d5ae01ee71f'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.199 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.199 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.199 12 ERROR oslo_messaging.notify.messaging yield Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.199 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.199 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 
10:13:48.199 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.199 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.199 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.199 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.199 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.199 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.199 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.199 12 ERROR oslo_messaging.notify.messaging conn.connect() Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.199 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.199 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.199 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 
129, in connect Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.199 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.199 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.199 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.199 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.199 12 ERROR oslo_messaging.notify.messaging Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.199 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.199 12 ERROR oslo_messaging.notify.messaging Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.199 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.199 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.199 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.199 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Dec 15 05:13:48 localhost 
ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.199 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.199 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.199 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.199 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.199 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.199 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.199 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.199 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.199 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.199 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.199 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.199 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.199 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.199 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.199 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.199 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.199 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.199 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.199 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.199 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.199 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.199 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.199 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.199 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.199 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.199 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.199 12 ERROR oslo_messaging.notify.messaging
Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.200 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no new resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.200 12 INFO ceilometer.polling.manager [-] Polling pollster memory.usage in the context of pollsters
Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.216 12 DEBUG ceilometer.compute.pollsters [-] 39ff1bd9-6f6b-44c8-bbec-a1fd9d196359/memory.usage volume: 51.73828125 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.218 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '164c1727-e1ab-40f6-8623-54e625f78ce9', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'memory.usage', 'counter_type': 'gauge', 'counter_unit': 'MB', 'counter_volume': 51.73828125, 'user_id': '1ba5fce347b64bfebf995f187193f205', 'user_name': None, 'project_id': 'c785bf23f53946bc99867d8832a50266', 'project_name': None, 'resource_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359', 'timestamp': '2025-12-15T10:13:48.200765', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359', 'instance_type': 'm1.small', 'host': 'bf4d507aa00b3fcf643a7e0b883420632ac7fc96e8c7a756982e121d', 'instance_host': 'np0005559462.localdomain', 'flavor': {'id': '2da0e147-aaa7-4bb9-a176-5fe1b15a32a0', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e'}, 'image_ref': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0}, 'message_id': 'bf11e52a-d99e-11f0-817e-fa163ebaca0f', 'monotonic_time': 12694.408944353, 'message_signature': 'e0d749660a5a0a15af937ebc305f8ab7ba3f1e5f70d012aae93cc28fc6ac8d2f'}]}, 'timestamp': '2025-12-15 10:13:48.217167', '_unique_id': '681669040ab34d70b823704465546235'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.218 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.218 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.218 12 ERROR oslo_messaging.notify.messaging yield
Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.218 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.218 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.218 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.218 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.218 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.218 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.218 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.218 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.218 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.218 12 ERROR oslo_messaging.notify.messaging conn.connect()
Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.218 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.218 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.218 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.218 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.218 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.218 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.218 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.218 12 ERROR oslo_messaging.notify.messaging
Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.218 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.218 12 ERROR oslo_messaging.notify.messaging
Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.218 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.218 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.218 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.218 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.218 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.218 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.218 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.218 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.218 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.218 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.218 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.218 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.218 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.218 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.218 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.218 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.218 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.218 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.218 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.218 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.218 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.218 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.218 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.218 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.218 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.218 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.218 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.218 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.218 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.218 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.218 12 ERROR oslo_messaging.notify.messaging
Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.219 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.drop in the context of pollsters
Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.219 12 DEBUG ceilometer.compute.pollsters [-] 39ff1bd9-6f6b-44c8-bbec-a1fd9d196359/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.220 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'd752bd00-4ce6-442c-80e3-6df442b717fe', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '1ba5fce347b64bfebf995f187193f205', 'user_name': None, 'project_id': 'c785bf23f53946bc99867d8832a50266', 'project_name': None, 'resource_id': 'instance-00000002-39ff1bd9-6f6b-44c8-bbec-a1fd9d196359-tap03ef8889-32', 'timestamp': '2025-12-15T10:13:48.219299', 'resource_metadata': {'display_name': 'test', 'name': 'tap03ef8889-32', 'instance_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359', 'instance_type': 'm1.small', 'host': 'bf4d507aa00b3fcf643a7e0b883420632ac7fc96e8c7a756982e121d', 'instance_host': 'np0005559462.localdomain', 'flavor': {'id': '2da0e147-aaa7-4bb9-a176-5fe1b15a32a0', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e'}, 'image_ref': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:74:df:7c', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap03ef8889-32'}, 'message_id': 'bf1254ec-d99e-11f0-817e-fa163ebaca0f', 'monotonic_time': 12694.37918735, 'message_signature': '656b3484c683e759d0aa5e316ea3a20ac010bf52c06484a47e9614a7f5478c22'}]}, 'timestamp': '2025-12-15 10:13:48.219747', '_unique_id': 'a41cae83e1694c8ebb9ef755f881e3bc'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.220 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.220 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.220 12 ERROR oslo_messaging.notify.messaging yield
Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.220 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.220 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.220 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.220 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.220 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.220 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.220 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.220 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.220 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.220 12 ERROR oslo_messaging.notify.messaging conn.connect()
Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.220 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.220 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.220 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.220 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.220 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.220 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.220 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.220 12 ERROR oslo_messaging.notify.messaging
Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.220 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.220 12 ERROR oslo_messaging.notify.messaging
Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.220 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.220 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.220 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.220 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.220 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.220 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.220 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.220 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.220 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.220 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.220 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.220 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.220 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.220 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.220 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.220 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.220 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.220 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.220 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.220 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.220 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.220 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.220 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.220 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.220 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.220 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.220 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.220 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.220 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.220 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.220 12 ERROR oslo_messaging.notify.messaging
Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.221 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes in the context of pollsters
Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.222 12 DEBUG ceilometer.compute.pollsters [-] 39ff1bd9-6f6b-44c8-bbec-a1fd9d196359/network.outgoing.bytes volume: 9770 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.223 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '46170a02-4154-4367-8141-11860cb582f6', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 9770, 'user_id': '1ba5fce347b64bfebf995f187193f205', 'user_name': None, 'project_id': 'c785bf23f53946bc99867d8832a50266', 'project_name': None, 'resource_id': 'instance-00000002-39ff1bd9-6f6b-44c8-bbec-a1fd9d196359-tap03ef8889-32', 'timestamp': '2025-12-15T10:13:48.222044', 'resource_metadata': {'display_name': 'test', 'name': 'tap03ef8889-32', 'instance_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359', 'instance_type': 'm1.small', 'host': 'bf4d507aa00b3fcf643a7e0b883420632ac7fc96e8c7a756982e121d', 'instance_host': 'np0005559462.localdomain', 'flavor': {'id': '2da0e147-aaa7-4bb9-a176-5fe1b15a32a0', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e'}, 'image_ref': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:74:df:7c', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap03ef8889-32'}, 'message_id': 'bf12c03a-d99e-11f0-817e-fa163ebaca0f', 'monotonic_time': 12694.37918735, 'message_signature': '714163cc6b7c4203398467f1ee9671beb70eeafded54435efa33d16afbffea1e'}]}, 'timestamp': '2025-12-15 10:13:48.222496', '_unique_id': 'af05f8c95765440bb8847f9f17a39b70'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.223 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.223 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.223 12 ERROR oslo_messaging.notify.messaging yield
Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.223 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.223 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.223 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.223 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.223 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.223 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.223 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.223 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.223 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.223 12 ERROR oslo_messaging.notify.messaging conn.connect()
Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.223 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.223 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.223 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.223 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.223 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.223 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.223 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.223 12 ERROR oslo_messaging.notify.messaging
Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.223 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.223 12 ERROR oslo_messaging.notify.messaging
Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.223 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.223 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.223 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.223 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.223 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.223 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.223 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.223 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.223 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.223 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.223 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.223 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.223 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.223 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.223 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.223 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.223 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.223 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.223 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.223 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Dec 15 05:13:48 
localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.223 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.223 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.223 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.223 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.223 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.223 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.223 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.223 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.223 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.223 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.223 12 ERROR oslo_messaging.notify.messaging Dec 15 05:13:48 localhost 
ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.224 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no new resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193 Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.224 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.delta in the context of pollsters Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.224 12 DEBUG ceilometer.compute.pollsters [-] 39ff1bd9-6f6b-44c8-bbec-a1fd9d196359/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.226 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'dfe5a062-9697-423f-98e3-b84e11a48f44', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '1ba5fce347b64bfebf995f187193f205', 'user_name': None, 'project_id': 'c785bf23f53946bc99867d8832a50266', 'project_name': None, 'resource_id': 'instance-00000002-39ff1bd9-6f6b-44c8-bbec-a1fd9d196359-tap03ef8889-32', 'timestamp': '2025-12-15T10:13:48.224690', 'resource_metadata': {'display_name': 'test', 'name': 'tap03ef8889-32', 'instance_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359', 'instance_type': 'm1.small', 'host': 'bf4d507aa00b3fcf643a7e0b883420632ac7fc96e8c7a756982e121d', 'instance_host': 'np0005559462.localdomain', 'flavor': {'id': '2da0e147-aaa7-4bb9-a176-5fe1b15a32a0', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 
'7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e'}, 'image_ref': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:74:df:7c', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap03ef8889-32'}, 'message_id': 'bf132728-d99e-11f0-817e-fa163ebaca0f', 'monotonic_time': 12694.37918735, 'message_signature': 'e7cea3377f6c387d08e9fb8b587077b4959fdd29f572745c8e06193c19069b51'}]}, 'timestamp': '2025-12-15 10:13:48.225165', '_unique_id': 'cc953f552f86463eaf44b426231acb37'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.226 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.226 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.226 12 ERROR oslo_messaging.notify.messaging yield Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.226 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.226 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.226 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.226 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Dec 15 05:13:48 localhost 
ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.226 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.226 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.226 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.226 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.226 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.226 12 ERROR oslo_messaging.notify.messaging conn.connect() Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.226 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.226 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.226 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.226 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.226 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.226 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.226 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.226 12 ERROR oslo_messaging.notify.messaging Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.226 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.226 12 ERROR oslo_messaging.notify.messaging Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.226 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.226 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.226 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.226 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.226 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.226 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.226 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.226 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.226 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.226 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.226 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.226 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.226 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.226 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.226 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.226 12 ERROR oslo_messaging.notify.messaging 
File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.226 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.226 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.226 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.226 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.226 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.226 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.226 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.226 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.226 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.226 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in 
__exit__ Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.226 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.226 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.226 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.226 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.226 12 ERROR oslo_messaging.notify.messaging Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.227 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.drop in the context of pollsters Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.227 12 DEBUG ceilometer.compute.pollsters [-] 39ff1bd9-6f6b-44c8-bbec-a1fd9d196359/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.228 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': '0d88e731-a1d5-4857-9a18-7681dfa67181', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '1ba5fce347b64bfebf995f187193f205', 'user_name': None, 'project_id': 'c785bf23f53946bc99867d8832a50266', 'project_name': None, 'resource_id': 'instance-00000002-39ff1bd9-6f6b-44c8-bbec-a1fd9d196359-tap03ef8889-32', 'timestamp': '2025-12-15T10:13:48.227162', 'resource_metadata': {'display_name': 'test', 'name': 'tap03ef8889-32', 'instance_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359', 'instance_type': 'm1.small', 'host': 'bf4d507aa00b3fcf643a7e0b883420632ac7fc96e8c7a756982e121d', 'instance_host': 'np0005559462.localdomain', 'flavor': {'id': '2da0e147-aaa7-4bb9-a176-5fe1b15a32a0', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e'}, 'image_ref': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:74:df:7c', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap03ef8889-32'}, 'message_id': 'bf1387cc-d99e-11f0-817e-fa163ebaca0f', 'monotonic_time': 12694.37918735, 'message_signature': 'e7566173f169dfdc0e92d4c799f7552b942edd1cdf8b69c689a3c7e260717a1d'}]}, 'timestamp': '2025-12-15 10:13:48.227654', '_unique_id': 'eb9aa65cc6894f549ab1fd27bb27fbb4'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.228 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 15 05:13:48 localhost 
ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.228 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.228 12 ERROR oslo_messaging.notify.messaging yield Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.228 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.228 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.228 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.228 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.228 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.228 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.228 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.228 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.228 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.228 12 ERROR oslo_messaging.notify.messaging conn.connect() Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.228 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.228 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.228 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.228 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.228 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.228 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.228 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.228 12 ERROR oslo_messaging.notify.messaging Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.228 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.228 12 ERROR oslo_messaging.notify.messaging Dec 15 05:13:48 localhost 
ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.228 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.228 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.228 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.228 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.228 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.228 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.228 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.228 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.228 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.228 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection 
Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.228 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.228 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.228 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.228 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.228 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.228 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.228 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.228 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.228 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.228 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.228 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.228 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.228 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.228 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.228 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.228 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.228 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.228 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.228 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.228 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.228 12 ERROR oslo_messaging.notify.messaging
Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.229 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.error in the context of pollsters
Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.229 12 DEBUG ceilometer.compute.pollsters [-] 39ff1bd9-6f6b-44c8-bbec-a1fd9d196359/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.231 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '6fe1a405-dc42-4780-aecf-8cbd6aa64021', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '1ba5fce347b64bfebf995f187193f205', 'user_name': None, 'project_id': 'c785bf23f53946bc99867d8832a50266', 'project_name': None, 'resource_id': 'instance-00000002-39ff1bd9-6f6b-44c8-bbec-a1fd9d196359-tap03ef8889-32', 'timestamp': '2025-12-15T10:13:48.229698', 'resource_metadata': {'display_name': 'test', 'name': 'tap03ef8889-32', 'instance_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359', 'instance_type': 'm1.small', 'host': 'bf4d507aa00b3fcf643a7e0b883420632ac7fc96e8c7a756982e121d', 'instance_host': 'np0005559462.localdomain', 'flavor': {'id': '2da0e147-aaa7-4bb9-a176-5fe1b15a32a0', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e'}, 'image_ref': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:74:df:7c', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap03ef8889-32'}, 'message_id': 'bf13eadc-d99e-11f0-817e-fa163ebaca0f', 'monotonic_time': 12694.37918735, 'message_signature': '284aa1eab06a8de43e8802c4b774b8af58a3c589bd4f347aa77b2414f3217700'}]}, 'timestamp': '2025-12-15 10:13:48.230172', '_unique_id': 'dd2c5c13187643e2a545e2c1f4f4041d'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.231 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.231 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.231 12 ERROR oslo_messaging.notify.messaging     yield
Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.231 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.231 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.231 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.231 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.231 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.231 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.231 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.231 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.231 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.231 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.231 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.231 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.231 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.231 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.231 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.231 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.231 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.231 12 ERROR oslo_messaging.notify.messaging
Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.231 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.231 12 ERROR oslo_messaging.notify.messaging
Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.231 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.231 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.231 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.231 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.231 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.231 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.231 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.231 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.231 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.231 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.231 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.231 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.231 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.231 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.231 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.231 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.231 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.231 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.231 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.231 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.231 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.231 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.231 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.231 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.231 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.231 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.231 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.231 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.231 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.231 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.231 12 ERROR oslo_messaging.notify.messaging
Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.232 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes in the context of pollsters
Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.232 12 DEBUG ceilometer.compute.pollsters [-] 39ff1bd9-6f6b-44c8-bbec-a1fd9d196359/network.incoming.bytes volume: 6809 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.233 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '0a1d8d2f-2fbf-41f1-bcdd-6d28b24e86f3', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 6809, 'user_id': '1ba5fce347b64bfebf995f187193f205', 'user_name': None, 'project_id': 'c785bf23f53946bc99867d8832a50266', 'project_name': None, 'resource_id': 'instance-00000002-39ff1bd9-6f6b-44c8-bbec-a1fd9d196359-tap03ef8889-32', 'timestamp': '2025-12-15T10:13:48.232160', 'resource_metadata': {'display_name': 'test', 'name': 'tap03ef8889-32', 'instance_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359', 'instance_type': 'm1.small', 'host': 'bf4d507aa00b3fcf643a7e0b883420632ac7fc96e8c7a756982e121d', 'instance_host': 'np0005559462.localdomain', 'flavor': {'id': '2da0e147-aaa7-4bb9-a176-5fe1b15a32a0', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e'}, 'image_ref': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:74:df:7c', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap03ef8889-32'}, 'message_id': 'bf144afe-d99e-11f0-817e-fa163ebaca0f', 'monotonic_time': 12694.37918735, 'message_signature': 'd7d947245b7ea32c36d90fc16586f75adb5d059a94ac8b145368333068f2a05a'}]}, 'timestamp': '2025-12-15 10:13:48.232620', '_unique_id': 'ef6661d03ab84736bdd6acb9d388328a'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.233 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.233 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.233 12 ERROR oslo_messaging.notify.messaging     yield
Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.233 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.233 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.233 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.233 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.233 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.233 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.233 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.233 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.233 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.233 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.233 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.233 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.233 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.233 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.233 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.233 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.233 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.233 12 ERROR oslo_messaging.notify.messaging
Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.233 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.233 12 ERROR oslo_messaging.notify.messaging
Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.233 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.233 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.233 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.233 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.233 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.233 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.233 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.233 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.233 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.233 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.233 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.233 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.233 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.233 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.233 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.233 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.233 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.233 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.233 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.233 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.233 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.233 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.233 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.233 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.233 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.233 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.233 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.233 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.233 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.233 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.233 12 ERROR oslo_messaging.notify.messaging
Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.234 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.delta in the context of pollsters
Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.234 12 DEBUG ceilometer.compute.pollsters [-] 39ff1bd9-6f6b-44c8-bbec-a1fd9d196359/network.incoming.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.235 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '18857ca8-2b0b-4f94-9383-3fa4c70f109c', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '1ba5fce347b64bfebf995f187193f205', 'user_name': None, 'project_id': 'c785bf23f53946bc99867d8832a50266', 'project_name': None, 'resource_id': 'instance-00000002-39ff1bd9-6f6b-44c8-bbec-a1fd9d196359-tap03ef8889-32', 'timestamp': '2025-12-15T10:13:48.234617', 'resource_metadata': {'display_name': 'test', 'name': 'tap03ef8889-32', 'instance_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359', 'instance_type': 'm1.small', 'host': 'bf4d507aa00b3fcf643a7e0b883420632ac7fc96e8c7a756982e121d', 'instance_host': 'np0005559462.localdomain', 'flavor': {'id': '2da0e147-aaa7-4bb9-a176-5fe1b15a32a0', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e'}, 'image_ref': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:74:df:7c', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap03ef8889-32'}, 'message_id': 'bf14aae4-d99e-11f0-817e-fa163ebaca0f', 'monotonic_time': 12694.37918735, 'message_signature': 'e4a6d309113d2b0b89ed537a67dc0f5865def721b7b91c4e632492ad826ac61d'}]}, 'timestamp': '2025-12-15 10:13:48.235083', '_unique_id': '921c4ea545a04926a9dd873f582484af'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.235 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.235 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.235 12 ERROR oslo_messaging.notify.messaging     yield
Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.235 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.235 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.235 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.235 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.235 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.235 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.235 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.235 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.235 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.235 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.235 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.235 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.235 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.235 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.235 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.235 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.235 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.235 12 ERROR oslo_messaging.notify.messaging
Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.235 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.235 12 ERROR oslo_messaging.notify.messaging
Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.235 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.235 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.235 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.235 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.235 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.235 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.235 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.235 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.235 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.235 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.235 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.235 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.235 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.235 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.235 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.235 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.235 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.235 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.235 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.235 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.235 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.235 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.235 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.235 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.235 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.235 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.235 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.235 12 ERROR oslo_messaging.notify.messaging   File
"/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.235 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.235 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.235 12 ERROR oslo_messaging.notify.messaging Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.238 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.bytes in the context of pollsters Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.238 12 DEBUG ceilometer.compute.pollsters [-] 39ff1bd9-6f6b-44c8-bbec-a1fd9d196359/disk.device.read.bytes volume: 35560448 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.239 12 DEBUG ceilometer.compute.pollsters [-] 39ff1bd9-6f6b-44c8-bbec-a1fd9d196359/disk.device.read.bytes volume: 2154496 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.241 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': 'b85dd939-bab9-4b63-8beb-55ae6c228827', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 35560448, 'user_id': '1ba5fce347b64bfebf995f187193f205', 'user_name': None, 'project_id': 'c785bf23f53946bc99867d8832a50266', 'project_name': None, 'resource_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359-vda', 'timestamp': '2025-12-15T10:13:48.238493', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359', 'instance_type': 'm1.small', 'host': 'bf4d507aa00b3fcf643a7e0b883420632ac7fc96e8c7a756982e121d', 'instance_host': 'np0005559462.localdomain', 'flavor': {'id': '2da0e147-aaa7-4bb9-a176-5fe1b15a32a0', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e'}, 'image_ref': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'bf1546de-d99e-11f0-817e-fa163ebaca0f', 'monotonic_time': 12694.323162499, 'message_signature': '4caa3fb0666d39ac35d12d2759389095ff43b5bb6db90906d8e35b7fcc177e4a'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 2154496, 'user_id': '1ba5fce347b64bfebf995f187193f205', 'user_name': None, 'project_id': 'c785bf23f53946bc99867d8832a50266', 'project_name': None, 'resource_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359-vdb', 'timestamp': '2025-12-15T10:13:48.238493', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 
'39ff1bd9-6f6b-44c8-bbec-a1fd9d196359', 'instance_type': 'm1.small', 'host': 'bf4d507aa00b3fcf643a7e0b883420632ac7fc96e8c7a756982e121d', 'instance_host': 'np0005559462.localdomain', 'flavor': {'id': '2da0e147-aaa7-4bb9-a176-5fe1b15a32a0', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e'}, 'image_ref': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'bf155bec-d99e-11f0-817e-fa163ebaca0f', 'monotonic_time': 12694.323162499, 'message_signature': 'a135dd2dce8b5d62c840859f07e28374e7d846a71f45373afd841578599524df'}]}, 'timestamp': '2025-12-15 10:13:48.239582', '_unique_id': '3d98593d41174bc0ba6abcb6ae0091da'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.241 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.241 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.241 12 ERROR oslo_messaging.notify.messaging yield Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.241 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.241 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.241 12 ERROR 
oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.241 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.241 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.241 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.241 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.241 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.241 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.241 12 ERROR oslo_messaging.notify.messaging conn.connect() Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.241 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.241 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.241 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Dec 15 
05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.241 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.241 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.241 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.241 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.241 12 ERROR oslo_messaging.notify.messaging Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.241 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.241 12 ERROR oslo_messaging.notify.messaging Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.241 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.241 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.241 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.241 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Dec 15 05:13:48 localhost 
ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.241 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.241 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.241 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.241 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.241 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.241 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.241 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.241 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.241 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.241 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, 
in get Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.241 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.241 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.241 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.241 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.241 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.241 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.241 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.241 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.241 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.241 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 15 05:13:48 localhost 
ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.241 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.241 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.241 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.241 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.241 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.241 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.241 12 ERROR oslo_messaging.notify.messaging Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.242 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.error in the context of pollsters Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.242 12 DEBUG ceilometer.compute.pollsters [-] 39ff1bd9-6f6b-44c8-bbec-a1fd9d196359/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.243 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': 'dc37dd7a-bcd9-4eb8-9fce-c6aaad0c5001', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '1ba5fce347b64bfebf995f187193f205', 'user_name': None, 'project_id': 'c785bf23f53946bc99867d8832a50266', 'project_name': None, 'resource_id': 'instance-00000002-39ff1bd9-6f6b-44c8-bbec-a1fd9d196359-tap03ef8889-32', 'timestamp': '2025-12-15T10:13:48.242490', 'resource_metadata': {'display_name': 'test', 'name': 'tap03ef8889-32', 'instance_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359', 'instance_type': 'm1.small', 'host': 'bf4d507aa00b3fcf643a7e0b883420632ac7fc96e8c7a756982e121d', 'instance_host': 'np0005559462.localdomain', 'flavor': {'id': '2da0e147-aaa7-4bb9-a176-5fe1b15a32a0', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e'}, 'image_ref': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:74:df:7c', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap03ef8889-32'}, 'message_id': 'bf15dffe-d99e-11f0-817e-fa163ebaca0f', 'monotonic_time': 12694.37918735, 'message_signature': '27ee57cd873a64ccaac35afe59f17983f45f13f289c4db39ed61148cfd6a7155'}]}, 'timestamp': '2025-12-15 10:13:48.242973', '_unique_id': 'd7e62a0b48784cd59c2e83e7a3e274ec'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.243 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 15 05:13:48 localhost 
ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.243 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.243 12 ERROR oslo_messaging.notify.messaging yield Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.243 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.243 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.243 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.243 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.243 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.243 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.243 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.243 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.243 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.243 12 ERROR oslo_messaging.notify.messaging conn.connect() Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.243 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.243 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.243 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.243 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.243 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.243 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.243 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.243 12 ERROR oslo_messaging.notify.messaging Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.243 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.243 12 ERROR oslo_messaging.notify.messaging Dec 15 05:13:48 localhost 
ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.243 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.243 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.243 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.243 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.243 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.243 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.243 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.243 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.243 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.243 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection 
Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.243 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.243 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.243 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.243 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.243 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.243 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.243 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.243 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.243 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.243 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Dec 15 05:13:48 
localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.243 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.243 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.243 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.243 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.243 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.243 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.243 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.243 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.243 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.243 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.243 12 ERROR oslo_messaging.notify.messaging Dec 15 05:13:48 localhost 
ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.245 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.bytes in the context of pollsters Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.245 12 DEBUG ceilometer.compute.pollsters [-] 39ff1bd9-6f6b-44c8-bbec-a1fd9d196359/disk.device.write.bytes volume: 389120 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.245 12 DEBUG ceilometer.compute.pollsters [-] 39ff1bd9-6f6b-44c8-bbec-a1fd9d196359/disk.device.write.bytes volume: 512 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.246 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '43eec1ce-0a8f-4013-b34c-b327cc1221ef', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 389120, 'user_id': '1ba5fce347b64bfebf995f187193f205', 'user_name': None, 'project_id': 'c785bf23f53946bc99867d8832a50266', 'project_name': None, 'resource_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359-vda', 'timestamp': '2025-12-15T10:13:48.245152', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359', 'instance_type': 'm1.small', 'host': 'bf4d507aa00b3fcf643a7e0b883420632ac7fc96e8c7a756982e121d', 'instance_host': 'np0005559462.localdomain', 'flavor': {'id': '2da0e147-aaa7-4bb9-a176-5fe1b15a32a0', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 
'7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e'}, 'image_ref': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'bf1646ec-d99e-11f0-817e-fa163ebaca0f', 'monotonic_time': 12694.323162499, 'message_signature': '7f269fa99f03f9ad2de46f7eb75f3db201c3d767a1fe710bb648872d2d98aa0b'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 512, 'user_id': '1ba5fce347b64bfebf995f187193f205', 'user_name': None, 'project_id': 'c785bf23f53946bc99867d8832a50266', 'project_name': None, 'resource_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359-vdb', 'timestamp': '2025-12-15T10:13:48.245152', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359', 'instance_type': 'm1.small', 'host': 'bf4d507aa00b3fcf643a7e0b883420632ac7fc96e8c7a756982e121d', 'instance_host': 'np0005559462.localdomain', 'flavor': {'id': '2da0e147-aaa7-4bb9-a176-5fe1b15a32a0', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e'}, 'image_ref': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'bf1656dc-d99e-11f0-817e-fa163ebaca0f', 'monotonic_time': 12694.323162499, 'message_signature': '73113a8c5fe81a399db8cb4b5f47b237b66dd58f77a72bff589a95214f5b3f67'}]}, 'timestamp': '2025-12-15 10:13:48.246013', '_unique_id': '8e66dd3f5ba0455da9e50595378444d0'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.246 12 ERROR 
oslo_messaging.notify.messaging Traceback (most recent call last): Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.246 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.246 12 ERROR oslo_messaging.notify.messaging yield Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.246 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.246 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.246 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.246 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.246 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.246 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.246 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.246 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Dec 15 05:13:48 localhost 
ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.246 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.246 12 ERROR oslo_messaging.notify.messaging conn.connect() Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.246 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.246 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.246 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.246 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.246 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.246 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.246 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.246 12 ERROR oslo_messaging.notify.messaging Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.246 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 
2025-12-15 10:13:48.246 12 ERROR oslo_messaging.notify.messaging Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.246 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.246 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.246 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.246 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.246 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.246 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.246 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.246 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.246 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.246 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.246 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.246 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.246 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.246 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.246 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.246 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.246 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.246 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.246 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.246 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.246 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.246 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.246 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.246 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.246 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.246 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.246 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.246 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.246 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.246 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 15 05:13:48 localhost 
ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.246 12 ERROR oslo_messaging.notify.messaging Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.248 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.requests in the context of pollsters Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.248 12 DEBUG ceilometer.compute.pollsters [-] 39ff1bd9-6f6b-44c8-bbec-a1fd9d196359/disk.device.write.requests volume: 47 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.248 12 DEBUG ceilometer.compute.pollsters [-] 39ff1bd9-6f6b-44c8-bbec-a1fd9d196359/disk.device.write.requests volume: 1 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.249 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': 'f064a390-eb5d-4ddd-a506-fc423113023b', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 47, 'user_id': '1ba5fce347b64bfebf995f187193f205', 'user_name': None, 'project_id': 'c785bf23f53946bc99867d8832a50266', 'project_name': None, 'resource_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359-vda', 'timestamp': '2025-12-15T10:13:48.248166', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359', 'instance_type': 'm1.small', 'host': 'bf4d507aa00b3fcf643a7e0b883420632ac7fc96e8c7a756982e121d', 'instance_host': 'np0005559462.localdomain', 'flavor': {'id': '2da0e147-aaa7-4bb9-a176-5fe1b15a32a0', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e'}, 'image_ref': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'bf16bdd4-d99e-11f0-817e-fa163ebaca0f', 'monotonic_time': 12694.323162499, 'message_signature': '2f1ab2d11d8b3ac6098139823123789e01a7b707a32d46021fd48158eceb45c0'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1, 'user_id': '1ba5fce347b64bfebf995f187193f205', 'user_name': None, 'project_id': 'c785bf23f53946bc99867d8832a50266', 'project_name': None, 'resource_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359-vdb', 'timestamp': '2025-12-15T10:13:48.248166', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 
'39ff1bd9-6f6b-44c8-bbec-a1fd9d196359', 'instance_type': 'm1.small', 'host': 'bf4d507aa00b3fcf643a7e0b883420632ac7fc96e8c7a756982e121d', 'instance_host': 'np0005559462.localdomain', 'flavor': {'id': '2da0e147-aaa7-4bb9-a176-5fe1b15a32a0', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e'}, 'image_ref': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'bf16ce14-d99e-11f0-817e-fa163ebaca0f', 'monotonic_time': 12694.323162499, 'message_signature': '884be58e7ebe70c32df5f9df64d4f16dd684e8bbdb0f010bc69534469cae7f46'}]}, 'timestamp': '2025-12-15 10:13:48.249072', '_unique_id': 'e4bd1673cff64937beb844771760776a'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.249 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.249 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.249 12 ERROR oslo_messaging.notify.messaging yield Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.249 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.249 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.249 12 ERROR 
oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.249 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.249 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.249 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.249 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.249 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.249 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.249 12 ERROR oslo_messaging.notify.messaging conn.connect() Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.249 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.249 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.249 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Dec 15 
05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.249 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.249 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.249 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.249 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.249 12 ERROR oslo_messaging.notify.messaging Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.249 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.249 12 ERROR oslo_messaging.notify.messaging Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.249 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.249 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.249 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.249 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Dec 15 05:13:48 localhost 
ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.249 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.249 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.249 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.249 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.249 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.249 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.249 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.249 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.249 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.249 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, 
in get Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.249 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.249 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.249 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.249 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.249 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.249 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.249 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.249 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.249 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.249 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 15 05:13:48 localhost 
ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.249 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.249 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.249 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.249 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.249 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.249 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.249 12 ERROR oslo_messaging.notify.messaging Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.251 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets in the context of pollsters Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.251 12 DEBUG ceilometer.compute.pollsters [-] 39ff1bd9-6f6b-44c8-bbec-a1fd9d196359/network.outgoing.packets volume: 114 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.252 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': 'fedf7413-b799-4680-8e16-5aad45f236b2', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 114, 'user_id': '1ba5fce347b64bfebf995f187193f205', 'user_name': None, 'project_id': 'c785bf23f53946bc99867d8832a50266', 'project_name': None, 'resource_id': 'instance-00000002-39ff1bd9-6f6b-44c8-bbec-a1fd9d196359-tap03ef8889-32', 'timestamp': '2025-12-15T10:13:48.251189', 'resource_metadata': {'display_name': 'test', 'name': 'tap03ef8889-32', 'instance_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359', 'instance_type': 'm1.small', 'host': 'bf4d507aa00b3fcf643a7e0b883420632ac7fc96e8c7a756982e121d', 'instance_host': 'np0005559462.localdomain', 'flavor': {'id': '2da0e147-aaa7-4bb9-a176-5fe1b15a32a0', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e'}, 'image_ref': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:74:df:7c', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap03ef8889-32'}, 'message_id': 'bf1732c8-d99e-11f0-817e-fa163ebaca0f', 'monotonic_time': 12694.37918735, 'message_signature': '6648a3e838719bd6885ae28f48dea960c65c153c63f2fc515e65616be974cd03'}]}, 'timestamp': '2025-12-15 10:13:48.251643', '_unique_id': '7efb52b5a8c441589166ed46be6f1b0c'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.252 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 15 05:13:48 localhost 
ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.252 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.252 12 ERROR oslo_messaging.notify.messaging yield Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.252 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.252 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.252 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.252 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.252 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.252 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.252 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.252 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.252 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.252 12 ERROR oslo_messaging.notify.messaging conn.connect() Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.252 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.252 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.252 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.252 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.252 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.252 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.252 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.252 12 ERROR oslo_messaging.notify.messaging Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.252 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.252 12 ERROR oslo_messaging.notify.messaging Dec 15 05:13:48 localhost 
ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.252 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.252 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.252 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.252 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.252 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.252 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.252 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.252 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.252 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.252 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection 
Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.252 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.252 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.252 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.252 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.252 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.252 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.252 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.252 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.252 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.252 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Dec 15 05:13:48 
localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.252 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.252 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.252 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.252 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.252 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.252 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.252 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.252 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.252 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.252 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.252 12 ERROR oslo_messaging.notify.messaging Dec 15 05:13:48 localhost 
ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.253 12 INFO ceilometer.polling.manager [-] Polling pollster cpu in the context of pollsters Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.253 12 DEBUG ceilometer.compute.pollsters [-] 39ff1bd9-6f6b-44c8-bbec-a1fd9d196359/cpu volume: 17690000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.255 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'b177d2b4-1797-4b51-abd7-a75b5f1ac3f0', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 17690000000, 'user_id': '1ba5fce347b64bfebf995f187193f205', 'user_name': None, 'project_id': 'c785bf23f53946bc99867d8832a50266', 'project_name': None, 'resource_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359', 'timestamp': '2025-12-15T10:13:48.253763', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359', 'instance_type': 'm1.small', 'host': 'bf4d507aa00b3fcf643a7e0b883420632ac7fc96e8c7a756982e121d', 'instance_host': 'np0005559462.localdomain', 'flavor': {'id': '2da0e147-aaa7-4bb9-a176-5fe1b15a32a0', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e'}, 'image_ref': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'cpu_number': 1}, 'message_id': 'bf1798f8-d99e-11f0-817e-fa163ebaca0f', 'monotonic_time': 12694.408944353, 
'message_signature': 'e6e3941ccbfe40d26f77a8f917414b27b58f98d0da4bb2db8fd61193fdf7f59d'}]}, 'timestamp': '2025-12-15 10:13:48.254250', '_unique_id': '2367237e511e4ce69a4710701a77627f'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.255 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.255 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.255 12 ERROR oslo_messaging.notify.messaging yield Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.255 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.255 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.255 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.255 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.255 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.255 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.255 12 ERROR 
oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.255 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.255 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.255 12 ERROR oslo_messaging.notify.messaging conn.connect() Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.255 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.255 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.255 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.255 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.255 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.255 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.255 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Dec 15 05:13:48 localhost 
ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.255 12 ERROR oslo_messaging.notify.messaging Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.255 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.255 12 ERROR oslo_messaging.notify.messaging Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.255 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.255 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.255 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.255 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.255 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.255 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.255 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.255 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", 
line 653, in _send Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.255 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.255 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.255 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.255 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.255 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.255 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.255 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.255 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.255 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.255 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.255 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.255 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.255 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.255 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.255 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.255 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.255 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.255 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.255 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.255 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Dec 15 05:13:48 localhost 
ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.255 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.255 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 15 05:13:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:13:48.255 12 ERROR oslo_messaging.notify.messaging Dec 15 05:13:49 localhost nova_compute[286344]: 2025-12-15 10:13:49.609 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:13:51 localhost ovn_metadata_agent[160585]: 2025-12-15 10:13:51.488 160590 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Dec 15 05:13:51 localhost ovn_metadata_agent[160585]: 2025-12-15 10:13:51.488 160590 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Dec 15 05:13:51 localhost ovn_metadata_agent[160585]: 2025-12-15 10:13:51.489 160590 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Dec 15 05:13:51 localhost nova_compute[286344]: 2025-12-15 10:13:51.638 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:13:51 localhost ceph-mon[298913]: mon.np0005559462@0(leader).osd e273 _set_new_cache_sizes 
cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104 Dec 15 05:13:52 localhost sshd[336803]: main: sshd: ssh-rsa algorithm is disabled Dec 15 05:13:54 localhost systemd[1]: Started /usr/bin/podman healthcheck run 67ee72fcd99c896f79e8807764b1d0f04e7d45927d06e306c0986394e8ed42a0. Dec 15 05:13:54 localhost systemd[1]: Started /usr/bin/podman healthcheck run 730c50020a7d9e2da21d2462d40af20ac303e43f3491f692cbbd87f432ba3e09. Dec 15 05:13:54 localhost systemd[1]: Started /usr/bin/podman healthcheck run 9f0f481c937606298203797a71924b7614a5d9e3f4a8afd837cfa5b9602aedeb. Dec 15 05:13:54 localhost systemd[1]: Started /usr/bin/podman healthcheck run ac2eae30a2a7f971398af42ea67b3ed799d4b3341f7688ebcd73f7ca7f66c2a8. Dec 15 05:13:54 localhost systemd[1]: Started /usr/bin/podman healthcheck run b3d8483ade539500e228c2af9c0c3e647febb5ec7903610a83aea32631007d4a. Dec 15 05:13:54 localhost podman[336805]: 2025-12-15 10:13:54.392369035 +0000 UTC m=+0.096396693 container health_status 67ee72fcd99c896f79e8807764b1d0f04e7d45927d06e306c0986394e8ed42a0 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 
'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}) Dec 15 05:13:54 localhost podman[336805]: 2025-12-15 10:13:54.40331531 +0000 UTC m=+0.107343008 container exec_died 67ee72fcd99c896f79e8807764b1d0f04e7d45927d06e306c0986394e8ed42a0 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter) Dec 15 05:13:54 localhost systemd[1]: 
67ee72fcd99c896f79e8807764b1d0f04e7d45927d06e306c0986394e8ed42a0.service: Deactivated successfully. Dec 15 05:13:54 localhost systemd[1]: tmp-crun.QnAZNX.mount: Deactivated successfully. Dec 15 05:13:54 localhost systemd[1]: tmp-crun.ZA08QP.mount: Deactivated successfully. Dec 15 05:13:54 localhost podman[336806]: 2025-12-15 10:13:54.501485379 +0000 UTC m=+0.201197160 container health_status 730c50020a7d9e2da21d2462d40af20ac303e43f3491f692cbbd87f432ba3e09 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-type=git, vendor=Red Hat, Inc., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, container_name=openstack_network_exporter, distribution-scope=public, io.buildah.version=1.33.7, architecture=x86_64, build-date=2025-08-20T13:12:41, io.openshift.tags=minimal rhel9, name=ubi9-minimal, config_id=openstack_network_exporter, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, release=1755695350, version=9.6, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'cb1e9478634a00c8fb5ad9cd7fcd38c7b2a1a85187dfaa618bc715c24c2b15fb'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.expose-services=, managed_by=edpm_ansible, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b) Dec 15 05:13:54 localhost podman[336813]: 2025-12-15 10:13:54.548162488 +0000 UTC m=+0.238618669 container health_status ac2eae30a2a7f971398af42ea67b3ed799d4b3341f7688ebcd73f7ca7f66c2a8 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, 
name=ovn_controller, health_status=healthy, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251202, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '07f3af15d79276418f54a8dde2fc251064527c3571430e14af77aaa2c01f1193'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.schema-version=1.0) Dec 15 05:13:54 localhost podman[336813]: 2025-12-15 10:13:54.591463306 +0000 UTC m=+0.281919537 container exec_died ac2eae30a2a7f971398af42ea67b3ed799d4b3341f7688ebcd73f7ca7f66c2a8 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, container_name=ovn_controller, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, tcib_managed=true, config_id=ovn_controller, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, 
config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '07f3af15d79276418f54a8dde2fc251064527c3571430e14af77aaa2c01f1193'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}) Dec 15 05:13:54 localhost systemd[1]: ac2eae30a2a7f971398af42ea67b3ed799d4b3341f7688ebcd73f7ca7f66c2a8.service: Deactivated successfully. Dec 15 05:13:54 localhost podman[336807]: 2025-12-15 10:13:54.609052241 +0000 UTC m=+0.303858439 container health_status 9f0f481c937606298203797a71924b7614a5d9e3f4a8afd837cfa5b9602aedeb (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, org.label-schema.name=CentOS Stream 9 Base Image, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, container_name=multipathd, org.label-schema.vendor=CentOS) Dec 15 05:13:54 localhost nova_compute[286344]: 2025-12-15 10:13:54.610 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:13:54 localhost podman[336806]: 2025-12-15 10:13:54.620397707 +0000 UTC m=+0.320109468 container exec_died 730c50020a7d9e2da21d2462d40af20ac303e43f3491f692cbbd87f432ba3e09 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, io.openshift.expose-services=, com.redhat.component=ubi9-minimal-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, release=1755695350, version=9.6, build-date=2025-08-20T13:12:41, maintainer=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., container_name=openstack_network_exporter, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'cb1e9478634a00c8fb5ad9cd7fcd38c7b2a1a85187dfaa618bc715c24c2b15fb'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, managed_by=edpm_ansible, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.buildah.version=1.33.7, name=ubi9-minimal, distribution-scope=public, config_id=openstack_network_exporter, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vendor=Red Hat, Inc., vcs-type=git) Dec 15 05:13:54 localhost podman[336807]: 
2025-12-15 10:13:54.624437276 +0000 UTC m=+0.319243514 container exec_died 9f0f481c937606298203797a71924b7614a5d9e3f4a8afd837cfa5b9602aedeb (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, container_name=multipathd, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.license=GPLv2) Dec 15 05:13:54 localhost systemd[1]: 9f0f481c937606298203797a71924b7614a5d9e3f4a8afd837cfa5b9602aedeb.service: Deactivated successfully. 
Dec 15 05:13:54 localhost podman[336819]: 2025-12-15 10:13:54.474945682 +0000 UTC m=+0.160849710 container health_status b3d8483ade539500e228c2af9c0c3e647febb5ec7903610a83aea32631007d4a (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, config_id=ceilometer_agent_compute, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '07f3af15d79276418f54a8dde2fc251064527c3571430e14af77aaa2c01f1193-cb1e9478634a00c8fb5ad9cd7fcd38c7b2a1a85187dfaa618bc715c24c2b15fb'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base 
Image, tcib_managed=true) Dec 15 05:13:54 localhost systemd[1]: 730c50020a7d9e2da21d2462d40af20ac303e43f3491f692cbbd87f432ba3e09.service: Deactivated successfully. Dec 15 05:13:54 localhost podman[336819]: 2025-12-15 10:13:54.709491801 +0000 UTC m=+0.395395769 container exec_died b3d8483ade539500e228c2af9c0c3e647febb5ec7903610a83aea32631007d4a (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_id=ceilometer_agent_compute, tcib_managed=true, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '07f3af15d79276418f54a8dde2fc251064527c3571430e14af77aaa2c01f1193-cb1e9478634a00c8fb5ad9cd7fcd38c7b2a1a85187dfaa618bc715c24c2b15fb'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', 
'/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, managed_by=edpm_ansible) Dec 15 05:13:54 localhost systemd[1]: b3d8483ade539500e228c2af9c0c3e647febb5ec7903610a83aea32631007d4a.service: Deactivated successfully. Dec 15 05:13:56 localhost nova_compute[286344]: 2025-12-15 10:13:56.640 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:13:56 localhost ceph-mon[298913]: mon.np0005559462@0(leader).osd e273 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104 Dec 15 05:13:59 localhost nova_compute[286344]: 2025-12-15 10:13:59.613 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:13:59 localhost systemd[1]: Started /usr/bin/podman healthcheck run 4d46c406a3855fd1c322e5d86c048188ea5865daa07ba23aaebacc7879668923. 
Dec 15 05:13:59 localhost podman[336908]: 2025-12-15 10:13:59.749558776 +0000 UTC m=+0.077869032 container health_status 4d46c406a3855fd1c322e5d86c048188ea5865daa07ba23aaebacc7879668923 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '07f3af15d79276418f54a8dde2fc251064527c3571430e14af77aaa2c01f1193-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent) Dec 15 05:13:59 localhost 
podman[336908]: 2025-12-15 10:13:59.779624848 +0000 UTC m=+0.107935154 container exec_died 4d46c406a3855fd1c322e5d86c048188ea5865daa07ba23aaebacc7879668923 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '07f3af15d79276418f54a8dde2fc251064527c3571430e14af77aaa2c01f1193-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0) Dec 15 05:13:59 localhost systemd[1]: 
4d46c406a3855fd1c322e5d86c048188ea5865daa07ba23aaebacc7879668923.service: Deactivated successfully. Dec 15 05:14:01 localhost nova_compute[286344]: 2025-12-15 10:14:01.644 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:14:01 localhost podman[243449]: time="2025-12-15T10:14:01Z" level=info msg="List containers: received `last` parameter - overwriting `limit`" Dec 15 05:14:01 localhost podman[243449]: @ - - [15/Dec/2025:10:14:01 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 156640 "" "Go-http-client/1.1" Dec 15 05:14:01 localhost ceph-mon[298913]: mon.np0005559462@0(leader).osd e273 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104 Dec 15 05:14:01 localhost podman[243449]: @ - - [15/Dec/2025:10:14:01 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 19265 "" "Go-http-client/1.1" Dec 15 05:14:04 localhost ceph-mon[298913]: mon.np0005559462@0(leader) e17 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) Dec 15 05:14:04 localhost ceph-mon[298913]: log_channel(audit) log [DBG] : from='client.25625 172.18.0.34:0/382777224' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch Dec 15 05:14:04 localhost nova_compute[286344]: 2025-12-15 10:14:04.616 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:14:04 localhost openstack_network_exporter[246484]: ERROR 10:14:04 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server Dec 15 05:14:04 localhost openstack_network_exporter[246484]: ERROR 10:14:04 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd Dec 15 05:14:04 
localhost openstack_network_exporter[246484]: ERROR 10:14:04 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd Dec 15 05:14:04 localhost openstack_network_exporter[246484]: ERROR 10:14:04 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath Dec 15 05:14:04 localhost openstack_network_exporter[246484]: Dec 15 05:14:04 localhost openstack_network_exporter[246484]: ERROR 10:14:04 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath Dec 15 05:14:04 localhost openstack_network_exporter[246484]: Dec 15 05:14:06 localhost nova_compute[286344]: 2025-12-15 10:14:06.647 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:14:06 localhost systemd[1]: Started /usr/bin/podman healthcheck run a4e94bd7aaee6c174c601060fb5f34559019ccea93fc397263ee3735731cfc1e. Dec 15 05:14:06 localhost podman[336926]: 2025-12-15 10:14:06.751850451 +0000 UTC m=+0.080833272 container health_status a4e94bd7aaee6c174c601060fb5f34559019ccea93fc397263ee3735731cfc1e (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, container_name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', 
'/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter) Dec 15 05:14:06 localhost podman[336926]: 2025-12-15 10:14:06.783136205 +0000 UTC m=+0.112119016 container exec_died a4e94bd7aaee6c174c601060fb5f34559019ccea93fc397263ee3735731cfc1e (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter) Dec 15 05:14:06 localhost systemd[1]: a4e94bd7aaee6c174c601060fb5f34559019ccea93fc397263ee3735731cfc1e.service: Deactivated successfully. 
Dec 15 05:14:06 localhost ceph-mon[298913]: mon.np0005559462@0(leader).osd e273 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104 Dec 15 05:14:09 localhost nova_compute[286344]: 2025-12-15 10:14:09.623 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:14:11 localhost nova_compute[286344]: 2025-12-15 10:14:11.688 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:14:11 localhost ceph-mon[298913]: mon.np0005559462@0(leader).osd e273 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104 Dec 15 05:14:13 localhost ceph-mon[298913]: mon.np0005559462@0(leader) e17 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/osd_remove_queue}] v 0) Dec 15 05:14:13 localhost ceph-mon[298913]: log_channel(audit) log [INF] : from='mgr.34411 ' entity='mgr.np0005559464.aomnqe' Dec 15 05:14:13 localhost ceph-mon[298913]: from='mgr.34411 172.18.0.108:0/929118679' entity='mgr.np0005559464.aomnqe' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch Dec 15 05:14:13 localhost ceph-mon[298913]: from='mgr.34411 ' entity='mgr.np0005559464.aomnqe' Dec 15 05:14:14 localhost nova_compute[286344]: 2025-12-15 10:14:14.624 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:14:14 localhost ceph-mon[298913]: mon.np0005559462@0(leader) e17 handle_command mon_command([{prefix=config-key set, key=mgr/progress/completed}] v 0) Dec 15 05:14:14 localhost ceph-mon[298913]: log_channel(audit) log [INF] : from='mgr.34411 ' entity='mgr.np0005559464.aomnqe' Dec 15 05:14:15 localhost ceph-mon[298913]: from='mgr.34411 ' entity='mgr.np0005559464.aomnqe' Dec 15 05:14:16 localhost 
nova_compute[286344]: 2025-12-15 10:14:16.738 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:14:16 localhost ovn_controller[154603]: 2025-12-15T10:14:16Z|00526|memory_trim|INFO|Detected inactivity (last active 30001 ms ago): trimming memory Dec 15 05:14:16 localhost ceph-mon[298913]: mon.np0005559462@0(leader).osd e273 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104 Dec 15 05:14:17 localhost ceph-mon[298913]: mon.np0005559462@0(leader).osd e273 do_prune osdmap full prune enabled Dec 15 05:14:17 localhost ceph-mon[298913]: mon.np0005559462@0(leader).osd e274 e274: 6 total, 6 up, 6 in Dec 15 05:14:17 localhost ceph-mon[298913]: log_channel(cluster) log [DBG] : osdmap e274: 6 total, 6 up, 6 in Dec 15 05:14:17 localhost ceph-mon[298913]: mon.np0005559462@0(leader) e17 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) Dec 15 05:14:17 localhost ceph-mon[298913]: log_channel(audit) log [DBG] : from='client.25625 172.18.0.34:0/382777224' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch Dec 15 05:14:18 localhost ceph-mon[298913]: mon.np0005559462@0(leader) e17 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) Dec 15 05:14:18 localhost ceph-mon[298913]: log_channel(audit) log [DBG] : from='client.25625 172.18.0.34:0/382777224' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch Dec 15 05:14:19 localhost nova_compute[286344]: 2025-12-15 10:14:19.271 286348 DEBUG oslo_service.periodic_task [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Dec 15 05:14:19 localhost nova_compute[286344]: 2025-12-15 10:14:19.654 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 
[POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:14:20 localhost nova_compute[286344]: 2025-12-15 10:14:20.271 286348 DEBUG oslo_service.periodic_task [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Dec 15 05:14:21 localhost nova_compute[286344]: 2025-12-15 10:14:21.270 286348 DEBUG oslo_service.periodic_task [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Dec 15 05:14:21 localhost nova_compute[286344]: 2025-12-15 10:14:21.271 286348 DEBUG nova.compute.manager [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m Dec 15 05:14:21 localhost nova_compute[286344]: 2025-12-15 10:14:21.271 286348 DEBUG nova.compute.manager [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m Dec 15 05:14:21 localhost nova_compute[286344]: 2025-12-15 10:14:21.687 286348 DEBUG oslo_concurrency.lockutils [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Acquiring lock "refresh_cache-39ff1bd9-6f6b-44c8-bbec-a1fd9d196359" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m Dec 15 05:14:21 localhost nova_compute[286344]: 2025-12-15 10:14:21.687 286348 DEBUG oslo_concurrency.lockutils [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Acquired lock "refresh_cache-39ff1bd9-6f6b-44c8-bbec-a1fd9d196359" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m Dec 15 
05:14:21 localhost nova_compute[286344]: 2025-12-15 10:14:21.687 286348 DEBUG nova.network.neutron [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] [instance: 39ff1bd9-6f6b-44c8-bbec-a1fd9d196359] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m Dec 15 05:14:21 localhost nova_compute[286344]: 2025-12-15 10:14:21.687 286348 DEBUG nova.objects.instance [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Lazy-loading 'info_cache' on Instance uuid 39ff1bd9-6f6b-44c8-bbec-a1fd9d196359 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m Dec 15 05:14:21 localhost nova_compute[286344]: 2025-12-15 10:14:21.764 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:14:21 localhost ceph-mon[298913]: mon.np0005559462@0(leader).osd e274 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104 Dec 15 05:14:22 localhost nova_compute[286344]: 2025-12-15 10:14:22.472 286348 DEBUG nova.network.neutron [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] [instance: 39ff1bd9-6f6b-44c8-bbec-a1fd9d196359] Updating instance_info_cache with network_info: [{"id": "03ef8889-3216-43fb-8a52-4be17a956ce1", "address": "fa:16:3e:74:df:7c", "network": {"id": "befb7a72-17a9-4bcb-b561-84b8f626685a", "bridge": "br-int", "label": "private", "subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.201", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.20", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.0.3"}}], "meta": {"injected": false, "tenant_id": 
"c785bf23f53946bc99867d8832a50266", "mtu": 1292, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap03ef8889-32", "ovs_interfaceid": "03ef8889-3216-43fb-8a52-4be17a956ce1", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m Dec 15 05:14:22 localhost nova_compute[286344]: 2025-12-15 10:14:22.487 286348 DEBUG oslo_concurrency.lockutils [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Releasing lock "refresh_cache-39ff1bd9-6f6b-44c8-bbec-a1fd9d196359" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m Dec 15 05:14:22 localhost nova_compute[286344]: 2025-12-15 10:14:22.487 286348 DEBUG nova.compute.manager [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] [instance: 39ff1bd9-6f6b-44c8-bbec-a1fd9d196359] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m Dec 15 05:14:22 localhost nova_compute[286344]: 2025-12-15 10:14:22.487 286348 DEBUG oslo_service.periodic_task [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Dec 15 05:14:23 localhost nova_compute[286344]: 2025-12-15 10:14:23.483 286348 DEBUG oslo_service.periodic_task [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Dec 15 05:14:23 localhost ceph-mon[298913]: mon.np0005559462@0(leader) e17 handle_command 
mon_command({"prefix": "mon dump", "format": "json"} v 0) Dec 15 05:14:23 localhost ceph-mon[298913]: log_channel(audit) log [DBG] : from='client.25625 172.18.0.34:0/382777224' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch Dec 15 05:14:24 localhost nova_compute[286344]: 2025-12-15 10:14:24.657 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:14:24 localhost systemd[1]: Started /usr/bin/podman healthcheck run 67ee72fcd99c896f79e8807764b1d0f04e7d45927d06e306c0986394e8ed42a0. Dec 15 05:14:24 localhost systemd[1]: Started /usr/bin/podman healthcheck run 9f0f481c937606298203797a71924b7614a5d9e3f4a8afd837cfa5b9602aedeb. Dec 15 05:14:24 localhost systemd[1]: Started /usr/bin/podman healthcheck run ac2eae30a2a7f971398af42ea67b3ed799d4b3341f7688ebcd73f7ca7f66c2a8. Dec 15 05:14:24 localhost systemd[1]: Started /usr/bin/podman healthcheck run 730c50020a7d9e2da21d2462d40af20ac303e43f3491f692cbbd87f432ba3e09. 
Dec 15 05:14:24 localhost podman[337037]: 2025-12-15 10:14:24.77314166 +0000 UTC m=+0.095581380 container health_status 9f0f481c937606298203797a71924b7614a5d9e3f4a8afd837cfa5b9602aedeb (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, config_id=multipathd, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team) Dec 15 05:14:24 localhost systemd[1]: Started /usr/bin/podman healthcheck run b3d8483ade539500e228c2af9c0c3e647febb5ec7903610a83aea32631007d4a. 
Dec 15 05:14:24 localhost podman[337036]: 2025-12-15 10:14:24.822288495 +0000 UTC m=+0.147327035 container health_status 67ee72fcd99c896f79e8807764b1d0f04e7d45927d06e306c0986394e8ed42a0 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, managed_by=edpm_ansible, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors ) Dec 15 05:14:24 localhost podman[337036]: 2025-12-15 10:14:24.830486207 +0000 UTC m=+0.155524747 container exec_died 67ee72fcd99c896f79e8807764b1d0f04e7d45927d06e306c0986394e8ed42a0 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, managed_by=edpm_ansible, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', 
'--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors ) Dec 15 05:14:24 localhost podman[337037]: 2025-12-15 10:14:24.844197747 +0000 UTC m=+0.166637487 container exec_died 9f0f481c937606298203797a71924b7614a5d9e3f4a8afd837cfa5b9602aedeb (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, maintainer=OpenStack Kubernetes Operator team, config_id=multipathd, org.label-schema.license=GPLv2, managed_by=edpm_ansible, tcib_managed=true, io.buildah.version=1.41.3, container_name=multipathd, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 
'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}) Dec 15 05:14:24 localhost systemd[1]: 67ee72fcd99c896f79e8807764b1d0f04e7d45927d06e306c0986394e8ed42a0.service: Deactivated successfully. Dec 15 05:14:24 localhost systemd[1]: 9f0f481c937606298203797a71924b7614a5d9e3f4a8afd837cfa5b9602aedeb.service: Deactivated successfully. Dec 15 05:14:24 localhost podman[337075]: 2025-12-15 10:14:24.913276001 +0000 UTC m=+0.130220305 container health_status 730c50020a7d9e2da21d2462d40af20ac303e43f3491f692cbbd87f432ba3e09 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, build-date=2025-08-20T13:12:41, managed_by=edpm_ansible, com.redhat.component=ubi9-minimal-container, vendor=Red Hat, Inc., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., url=https://catalog.redhat.com/en/search?searchType=containers, name=ubi9-minimal, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, version=9.6, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, distribution-scope=public, io.openshift.expose-services=, config_id=openstack_network_exporter, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, architecture=x86_64, io.buildah.version=1.33.7, maintainer=Red Hat, Inc., release=1755695350, vcs-type=git, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'cb1e9478634a00c8fb5ad9cd7fcd38c7b2a1a85187dfaa618bc715c24c2b15fb'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, 
container_name=openstack_network_exporter) Dec 15 05:14:24 localhost podman[337076]: 2025-12-15 10:14:24.924284398 +0000 UTC m=+0.136720900 container health_status b3d8483ade539500e228c2af9c0c3e647febb5ec7903610a83aea32631007d4a (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, tcib_managed=true, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '07f3af15d79276418f54a8dde2fc251064527c3571430e14af77aaa2c01f1193-cb1e9478634a00c8fb5ad9cd7fcd38c7b2a1a85187dfaa618bc715c24c2b15fb'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=ceilometer_agent_compute, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=ceilometer_agent_compute, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, 
maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202) Dec 15 05:14:24 localhost podman[337075]: 2025-12-15 10:14:24.95959134 +0000 UTC m=+0.176535664 container exec_died 730c50020a7d9e2da21d2462d40af20ac303e43f3491f692cbbd87f432ba3e09 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, version=9.6, maintainer=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, url=https://catalog.redhat.com/en/search?searchType=containers, build-date=2025-08-20T13:12:41, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, release=1755695350, vendor=Red Hat, Inc., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., architecture=x86_64, io.buildah.version=1.33.7, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, config_id=openstack_network_exporter, managed_by=edpm_ansible, name=ubi9-minimal, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'cb1e9478634a00c8fb5ad9cd7fcd38c7b2a1a85187dfaa618bc715c24c2b15fb'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, container_name=openstack_network_exporter, io.openshift.tags=minimal rhel9, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., distribution-scope=public, vcs-type=git) Dec 15 05:14:24 localhost podman[337038]: 2025-12-15 10:14:24.973120585 +0000 UTC m=+0.290809237 container health_status ac2eae30a2a7f971398af42ea67b3ed799d4b3341f7688ebcd73f7ca7f66c2a8 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, container_name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251202, config_data={'depends_on': 
['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '07f3af15d79276418f54a8dde2fc251064527c3571430e14af77aaa2c01f1193'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.schema-version=1.0, config_id=ovn_controller, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb) Dec 15 05:14:24 localhost systemd[1]: 730c50020a7d9e2da21d2462d40af20ac303e43f3491f692cbbd87f432ba3e09.service: Deactivated successfully. 
Dec 15 05:14:25 localhost podman[337038]: 2025-12-15 10:14:25.006353182 +0000 UTC m=+0.324041834 container exec_died ac2eae30a2a7f971398af42ea67b3ed799d4b3341f7688ebcd73f7ca7f66c2a8 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, managed_by=edpm_ansible, config_id=ovn_controller, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '07f3af15d79276418f54a8dde2fc251064527c3571430e14af77aaa2c01f1193'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, container_name=ovn_controller, org.label-schema.vendor=CentOS) Dec 15 05:14:25 localhost systemd[1]: ac2eae30a2a7f971398af42ea67b3ed799d4b3341f7688ebcd73f7ca7f66c2a8.service: Deactivated successfully. 
Dec 15 05:14:25 localhost podman[337076]: 2025-12-15 10:14:25.0644481 +0000 UTC m=+0.276884672 container exec_died b3d8483ade539500e228c2af9c0c3e647febb5ec7903610a83aea32631007d4a (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '07f3af15d79276418f54a8dde2fc251064527c3571430e14af77aaa2c01f1193-cb1e9478634a00c8fb5ad9cd7fcd38c7b2a1a85187dfaa618bc715c24c2b15fb'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_id=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb) Dec 
15 05:14:25 localhost systemd[1]: b3d8483ade539500e228c2af9c0c3e647febb5ec7903610a83aea32631007d4a.service: Deactivated successfully. Dec 15 05:14:25 localhost nova_compute[286344]: 2025-12-15 10:14:25.269 286348 DEBUG oslo_service.periodic_task [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Dec 15 05:14:25 localhost nova_compute[286344]: 2025-12-15 10:14:25.270 286348 DEBUG nova.compute.manager [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m Dec 15 05:14:26 localhost nova_compute[286344]: 2025-12-15 10:14:26.278 286348 DEBUG oslo_service.periodic_task [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Dec 15 05:14:26 localhost nova_compute[286344]: 2025-12-15 10:14:26.766 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:14:26 localhost ceph-mon[298913]: mon.np0005559462@0(leader).osd e274 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104 Dec 15 05:14:26 localhost ceph-mon[298913]: mon.np0005559462@0(leader).osd e274 do_prune osdmap full prune enabled Dec 15 05:14:26 localhost ovn_metadata_agent[160585]: 2025-12-15 10:14:26.944 160590 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=22, ssl=[], options={'arp_ns_explicit_output': 'true', 'fdb_removal_limit': '0', 
'ignore_lsp_down': 'false', 'mac_binding_removal_limit': '0', 'mac_prefix': 'fe:17:e3', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': 'fe:55:2b:86:15:b5'}, ipsec=False) old=SB_Global(nb_cfg=21) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Dec 15 05:14:26 localhost nova_compute[286344]: 2025-12-15 10:14:26.944 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:14:26 localhost ovn_metadata_agent[160585]: 2025-12-15 10:14:26.946 160590 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 7 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m Dec 15 05:14:26 localhost ceph-mon[298913]: mon.np0005559462@0(leader).osd e275 e275: 6 total, 6 up, 6 in Dec 15 05:14:26 localhost ceph-mon[298913]: log_channel(cluster) log [DBG] : osdmap e275: 6 total, 6 up, 6 in Dec 15 05:14:28 localhost nova_compute[286344]: 2025-12-15 10:14:28.270 286348 DEBUG oslo_service.periodic_task [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Dec 15 05:14:28 localhost nova_compute[286344]: 2025-12-15 10:14:28.302 286348 DEBUG oslo_concurrency.lockutils [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Dec 15 05:14:28 localhost nova_compute[286344]: 2025-12-15 10:14:28.302 286348 DEBUG oslo_concurrency.lockutils [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 
0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Dec 15 05:14:28 localhost nova_compute[286344]: 2025-12-15 10:14:28.303 286348 DEBUG oslo_concurrency.lockutils [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Dec 15 05:14:28 localhost nova_compute[286344]: 2025-12-15 10:14:28.303 286348 DEBUG nova.compute.resource_tracker [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Auditing locally available compute resources for np0005559462.localdomain (node: np0005559462.localdomain) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m Dec 15 05:14:28 localhost nova_compute[286344]: 2025-12-15 10:14:28.303 286348 DEBUG oslo_concurrency.processutils [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m Dec 15 05:14:28 localhost ceph-mon[298913]: mon.np0005559462@0(leader) e17 handle_command mon_command({"prefix": "df", "format": "json"} v 0) Dec 15 05:14:28 localhost ceph-mon[298913]: log_channel(audit) log [DBG] : from='client.? 
172.18.0.106:0/4038756829' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch Dec 15 05:14:28 localhost nova_compute[286344]: 2025-12-15 10:14:28.746 286348 DEBUG oslo_concurrency.processutils [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.443s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m Dec 15 05:14:28 localhost nova_compute[286344]: 2025-12-15 10:14:28.809 286348 DEBUG nova.virt.libvirt.driver [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m Dec 15 05:14:28 localhost nova_compute[286344]: 2025-12-15 10:14:28.810 286348 DEBUG nova.virt.libvirt.driver [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m Dec 15 05:14:29 localhost nova_compute[286344]: 2025-12-15 10:14:29.011 286348 WARNING nova.virt.libvirt.driver [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] This host appears to have multiple sockets per NUMA node. 
The `socket` PCI NUMA affinity will not be supported.#033[00m Dec 15 05:14:29 localhost nova_compute[286344]: 2025-12-15 10:14:29.014 286348 DEBUG nova.compute.resource_tracker [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Hypervisor/Node resource view: name=np0005559462.localdomain free_ram=11151MB free_disk=41.836978912353516GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", 
"product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m Dec 15 05:14:29 localhost nova_compute[286344]: 2025-12-15 10:14:29.015 286348 DEBUG oslo_concurrency.lockutils [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Dec 15 05:14:29 localhost nova_compute[286344]: 2025-12-15 10:14:29.015 286348 DEBUG oslo_concurrency.lockutils [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Dec 15 05:14:29 localhost nova_compute[286344]: 2025-12-15 10:14:29.077 286348 DEBUG nova.compute.resource_tracker [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Instance 39ff1bd9-6f6b-44c8-bbec-a1fd9d196359 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 2, 'MEMORY_MB': 512, 'VCPU': 1}}. 
_remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m Dec 15 05:14:29 localhost nova_compute[286344]: 2025-12-15 10:14:29.078 286348 DEBUG nova.compute.resource_tracker [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m Dec 15 05:14:29 localhost nova_compute[286344]: 2025-12-15 10:14:29.079 286348 DEBUG nova.compute.resource_tracker [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Final resource view: name=np0005559462.localdomain phys_ram=15738MB used_ram=1024MB phys_disk=41GB used_disk=2GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m Dec 15 05:14:29 localhost nova_compute[286344]: 2025-12-15 10:14:29.118 286348 DEBUG oslo_concurrency.processutils [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m Dec 15 05:14:29 localhost ceph-mon[298913]: mon.np0005559462@0(leader) e17 handle_command mon_command({"prefix": "df", "format": "json"} v 0) Dec 15 05:14:29 localhost ceph-mon[298913]: log_channel(audit) log [DBG] : from='client.? 
172.18.0.106:0/215372177' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch Dec 15 05:14:29 localhost nova_compute[286344]: 2025-12-15 10:14:29.573 286348 DEBUG oslo_concurrency.processutils [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.455s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m Dec 15 05:14:29 localhost nova_compute[286344]: 2025-12-15 10:14:29.580 286348 DEBUG nova.compute.provider_tree [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Inventory has not changed in ProviderTree for provider: 26c8956b-6742-4951-b566-971b9bbe323b update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m Dec 15 05:14:29 localhost nova_compute[286344]: 2025-12-15 10:14:29.597 286348 DEBUG nova.scheduler.client.report [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Inventory has not changed for provider 26c8956b-6742-4951-b566-971b9bbe323b based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m Dec 15 05:14:29 localhost nova_compute[286344]: 2025-12-15 10:14:29.599 286348 DEBUG nova.compute.resource_tracker [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Compute_service record updated for np0005559462.localdomain:np0005559462.localdomain _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m Dec 15 05:14:29 localhost nova_compute[286344]: 2025-12-15 10:14:29.600 286348 DEBUG 
oslo_concurrency.lockutils [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.585s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Dec 15 05:14:29 localhost nova_compute[286344]: 2025-12-15 10:14:29.660 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:14:30 localhost systemd[1]: Started /usr/bin/podman healthcheck run 4d46c406a3855fd1c322e5d86c048188ea5865daa07ba23aaebacc7879668923. Dec 15 05:14:30 localhost systemd[1]: tmp-crun.i9FQHM.mount: Deactivated successfully. Dec 15 05:14:30 localhost podman[337180]: 2025-12-15 10:14:30.750177613 +0000 UTC m=+0.080752489 container health_status 4d46c406a3855fd1c322e5d86c048188ea5865daa07ba23aaebacc7879668923 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, config_id=ovn_metadata_agent, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '07f3af15d79276418f54a8dde2fc251064527c3571430e14af77aaa2c01f1193-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', 
'/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, container_name=ovn_metadata_agent) Dec 15 05:14:30 localhost podman[337180]: 2025-12-15 10:14:30.76042081 +0000 UTC m=+0.090995686 container exec_died 4d46c406a3855fd1c322e5d86c048188ea5865daa07ba23aaebacc7879668923 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '07f3af15d79276418f54a8dde2fc251064527c3571430e14af77aaa2c01f1193-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', 
'/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.schema-version=1.0, tcib_managed=true, container_name=ovn_metadata_agent, config_id=ovn_metadata_agent, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb) Dec 15 05:14:30 localhost systemd[1]: 4d46c406a3855fd1c322e5d86c048188ea5865daa07ba23aaebacc7879668923.service: Deactivated successfully. Dec 15 05:14:31 localhost nova_compute[286344]: 2025-12-15 10:14:31.601 286348 DEBUG oslo_service.periodic_task [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Dec 15 05:14:31 localhost nova_compute[286344]: 2025-12-15 10:14:31.781 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:14:31 localhost podman[243449]: time="2025-12-15T10:14:31Z" level=info msg="List containers: received `last` parameter - overwriting `limit`" Dec 15 05:14:31 localhost podman[243449]: @ - - [15/Dec/2025:10:14:31 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 156640 "" "Go-http-client/1.1" Dec 15 05:14:31 localhost podman[243449]: @ - - [15/Dec/2025:10:14:31 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 19272 "" "Go-http-client/1.1" Dec 15 05:14:31 localhost ceph-mon[298913]: mon.np0005559462@0(leader).osd 
e275 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104 Dec 15 05:14:32 localhost ceph-mon[298913]: log_channel(cluster) log [DBG] : mgrmap e54: np0005559464.aomnqe(active, since 16m), standbys: np0005559462.fudvyx, np0005559463.daptkf Dec 15 05:14:33 localhost ceph-mon[298913]: mon.np0005559462@0(leader) e17 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) Dec 15 05:14:33 localhost ceph-mon[298913]: log_channel(audit) log [DBG] : from='client.25625 172.18.0.34:0/382777224' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch Dec 15 05:14:33 localhost ovn_metadata_agent[160585]: 2025-12-15 10:14:33.948 160590 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=12d96d64-e862-4f68-81e5-8d9ec5d3a5e2, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '22'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m Dec 15 05:14:34 localhost nova_compute[286344]: 2025-12-15 10:14:34.663 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:14:34 localhost openstack_network_exporter[246484]: ERROR 10:14:34 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server Dec 15 05:14:34 localhost openstack_network_exporter[246484]: ERROR 10:14:34 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd Dec 15 05:14:34 localhost openstack_network_exporter[246484]: ERROR 10:14:34 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd Dec 15 05:14:34 localhost openstack_network_exporter[246484]: ERROR 10:14:34 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath Dec 15 05:14:34 
localhost openstack_network_exporter[246484]: Dec 15 05:14:34 localhost openstack_network_exporter[246484]: ERROR 10:14:34 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath Dec 15 05:14:34 localhost openstack_network_exporter[246484]: Dec 15 05:14:36 localhost ceph-mon[298913]: mon.np0005559462@0(leader) e17 handle_command mon_command({"prefix": "auth get-or-create", "entity": "client.alice", "caps": ["mds", "allow rw path=/volumes/_nogroup/1f8ee972-fbad-4b31-8005-874e13dc9275/fbd4d206-5fcf-4579-a13d-8e4adc225411", "osd", "allow rw pool=manila_data namespace=fsvolumens_1f8ee972-fbad-4b31-8005-874e13dc9275", "mon", "allow r"], "format": "json"} v 0) Dec 15 05:14:36 localhost ceph-mon[298913]: log_channel(audit) log [INF] : from='mgr.34411 ' entity='mgr.np0005559464.aomnqe' cmd={"prefix": "auth get-or-create", "entity": "client.alice", "caps": ["mds", "allow rw path=/volumes/_nogroup/1f8ee972-fbad-4b31-8005-874e13dc9275/fbd4d206-5fcf-4579-a13d-8e4adc225411", "osd", "allow rw pool=manila_data namespace=fsvolumens_1f8ee972-fbad-4b31-8005-874e13dc9275", "mon", "allow r"], "format": "json"} : dispatch Dec 15 05:14:36 localhost ceph-mon[298913]: log_channel(audit) log [INF] : from='mgr.34411 ' entity='mgr.np0005559464.aomnqe' cmd='[{"prefix": "auth get-or-create", "entity": "client.alice", "caps": ["mds", "allow rw path=/volumes/_nogroup/1f8ee972-fbad-4b31-8005-874e13dc9275/fbd4d206-5fcf-4579-a13d-8e4adc225411", "osd", "allow rw pool=manila_data namespace=fsvolumens_1f8ee972-fbad-4b31-8005-874e13dc9275", "mon", "allow r"], "format": "json"}]': finished Dec 15 05:14:36 localhost nova_compute[286344]: 2025-12-15 10:14:36.784 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:14:36 localhost ceph-mon[298913]: mon.np0005559462@0(leader).osd e275 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 
318767104 Dec 15 05:14:37 localhost ceph-mon[298913]: from='mgr.34411 172.18.0.108:0/929118679' entity='mgr.np0005559464.aomnqe' cmd={"prefix": "auth get", "entity": "client.alice", "format": "json"} : dispatch Dec 15 05:14:37 localhost ceph-mon[298913]: from='mgr.34411 172.18.0.108:0/929118679' entity='mgr.np0005559464.aomnqe' cmd={"prefix": "auth get-or-create", "entity": "client.alice", "caps": ["mds", "allow rw path=/volumes/_nogroup/1f8ee972-fbad-4b31-8005-874e13dc9275/fbd4d206-5fcf-4579-a13d-8e4adc225411", "osd", "allow rw pool=manila_data namespace=fsvolumens_1f8ee972-fbad-4b31-8005-874e13dc9275", "mon", "allow r"], "format": "json"} : dispatch Dec 15 05:14:37 localhost ceph-mon[298913]: from='mgr.34411 ' entity='mgr.np0005559464.aomnqe' cmd={"prefix": "auth get-or-create", "entity": "client.alice", "caps": ["mds", "allow rw path=/volumes/_nogroup/1f8ee972-fbad-4b31-8005-874e13dc9275/fbd4d206-5fcf-4579-a13d-8e4adc225411", "osd", "allow rw pool=manila_data namespace=fsvolumens_1f8ee972-fbad-4b31-8005-874e13dc9275", "mon", "allow r"], "format": "json"} : dispatch Dec 15 05:14:37 localhost ceph-mon[298913]: from='mgr.34411 ' entity='mgr.np0005559464.aomnqe' cmd='[{"prefix": "auth get-or-create", "entity": "client.alice", "caps": ["mds", "allow rw path=/volumes/_nogroup/1f8ee972-fbad-4b31-8005-874e13dc9275/fbd4d206-5fcf-4579-a13d-8e4adc225411", "osd", "allow rw pool=manila_data namespace=fsvolumens_1f8ee972-fbad-4b31-8005-874e13dc9275", "mon", "allow r"], "format": "json"}]': finished Dec 15 05:14:37 localhost systemd[1]: Started /usr/bin/podman healthcheck run a4e94bd7aaee6c174c601060fb5f34559019ccea93fc397263ee3735731cfc1e. Dec 15 05:14:37 localhost systemd[1]: tmp-crun.dleVl2.mount: Deactivated successfully. 
Dec 15 05:14:37 localhost podman[337199]: 2025-12-15 10:14:37.752221025 +0000 UTC m=+0.083352560 container health_status a4e94bd7aaee6c174c601060fb5f34559019ccea93fc397263ee3735731cfc1e (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, managed_by=edpm_ansible, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi ) Dec 15 05:14:37 localhost podman[337199]: 2025-12-15 10:14:37.789381947 +0000 UTC m=+0.120513402 container exec_died a4e94bd7aaee6c174c601060fb5f34559019ccea93fc397263ee3735731cfc1e (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 
'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}) Dec 15 05:14:37 localhost systemd[1]: a4e94bd7aaee6c174c601060fb5f34559019ccea93fc397263ee3735731cfc1e.service: Deactivated successfully. Dec 15 05:14:39 localhost nova_compute[286344]: 2025-12-15 10:14:39.665 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:14:39 localhost ceph-mon[298913]: mon.np0005559462@0(leader) e17 handle_command mon_command({"prefix": "auth rm", "entity": "client.alice"} v 0) Dec 15 05:14:39 localhost ceph-mon[298913]: log_channel(audit) log [INF] : from='mgr.34411 ' entity='mgr.np0005559464.aomnqe' cmd={"prefix": "auth rm", "entity": "client.alice"} : dispatch Dec 15 05:14:39 localhost ceph-mon[298913]: log_channel(audit) log [INF] : from='mgr.34411 ' entity='mgr.np0005559464.aomnqe' cmd='[{"prefix": "auth rm", "entity": "client.alice"}]': finished Dec 15 05:14:40 localhost ceph-mon[298913]: from='mgr.34411 172.18.0.108:0/929118679' entity='mgr.np0005559464.aomnqe' cmd={"prefix": "auth get", "entity": "client.alice", "format": "json"} : dispatch Dec 15 05:14:40 localhost ceph-mon[298913]: from='mgr.34411 172.18.0.108:0/929118679' entity='mgr.np0005559464.aomnqe' cmd={"prefix": "auth rm", "entity": "client.alice"} : dispatch Dec 15 05:14:40 localhost ceph-mon[298913]: from='mgr.34411 ' entity='mgr.np0005559464.aomnqe' cmd={"prefix": "auth rm", "entity": "client.alice"} : dispatch Dec 15 05:14:40 localhost ceph-mon[298913]: from='mgr.34411 ' entity='mgr.np0005559464.aomnqe' cmd='[{"prefix": "auth rm", "entity": "client.alice"}]': finished Dec 15 05:14:41 localhost nova_compute[286344]: 2025-12-15 10:14:41.829 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:14:41 localhost ceph-mon[298913]: 
mon.np0005559462@0(leader).osd e275 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104 Dec 15 05:14:42 localhost ceph-mon[298913]: mon.np0005559462@0(leader) e17 handle_command mon_command({"prefix": "auth get-or-create", "entity": "client.alice", "caps": ["mds", "allow r path=/volumes/_nogroup/1f8ee972-fbad-4b31-8005-874e13dc9275/fbd4d206-5fcf-4579-a13d-8e4adc225411", "osd", "allow r pool=manila_data namespace=fsvolumens_1f8ee972-fbad-4b31-8005-874e13dc9275", "mon", "allow r"], "format": "json"} v 0) Dec 15 05:14:42 localhost ceph-mon[298913]: log_channel(audit) log [INF] : from='mgr.34411 ' entity='mgr.np0005559464.aomnqe' cmd={"prefix": "auth get-or-create", "entity": "client.alice", "caps": ["mds", "allow r path=/volumes/_nogroup/1f8ee972-fbad-4b31-8005-874e13dc9275/fbd4d206-5fcf-4579-a13d-8e4adc225411", "osd", "allow r pool=manila_data namespace=fsvolumens_1f8ee972-fbad-4b31-8005-874e13dc9275", "mon", "allow r"], "format": "json"} : dispatch Dec 15 05:14:42 localhost ceph-mon[298913]: log_channel(audit) log [INF] : from='mgr.34411 ' entity='mgr.np0005559464.aomnqe' cmd='[{"prefix": "auth get-or-create", "entity": "client.alice", "caps": ["mds", "allow r path=/volumes/_nogroup/1f8ee972-fbad-4b31-8005-874e13dc9275/fbd4d206-5fcf-4579-a13d-8e4adc225411", "osd", "allow r pool=manila_data namespace=fsvolumens_1f8ee972-fbad-4b31-8005-874e13dc9275", "mon", "allow r"], "format": "json"}]': finished Dec 15 05:14:42 localhost ceph-mon[298913]: from='mgr.34411 172.18.0.108:0/929118679' entity='mgr.np0005559464.aomnqe' cmd={"prefix": "auth get", "entity": "client.alice", "format": "json"} : dispatch Dec 15 05:14:42 localhost ceph-mon[298913]: from='mgr.34411 172.18.0.108:0/929118679' entity='mgr.np0005559464.aomnqe' cmd={"prefix": "auth get-or-create", "entity": "client.alice", "caps": ["mds", "allow r path=/volumes/_nogroup/1f8ee972-fbad-4b31-8005-874e13dc9275/fbd4d206-5fcf-4579-a13d-8e4adc225411", "osd", "allow r 
pool=manila_data namespace=fsvolumens_1f8ee972-fbad-4b31-8005-874e13dc9275", "mon", "allow r"], "format": "json"} : dispatch Dec 15 05:14:42 localhost ceph-mon[298913]: from='mgr.34411 ' entity='mgr.np0005559464.aomnqe' cmd={"prefix": "auth get-or-create", "entity": "client.alice", "caps": ["mds", "allow r path=/volumes/_nogroup/1f8ee972-fbad-4b31-8005-874e13dc9275/fbd4d206-5fcf-4579-a13d-8e4adc225411", "osd", "allow r pool=manila_data namespace=fsvolumens_1f8ee972-fbad-4b31-8005-874e13dc9275", "mon", "allow r"], "format": "json"} : dispatch Dec 15 05:14:42 localhost ceph-mon[298913]: from='mgr.34411 ' entity='mgr.np0005559464.aomnqe' cmd='[{"prefix": "auth get-or-create", "entity": "client.alice", "caps": ["mds", "allow r path=/volumes/_nogroup/1f8ee972-fbad-4b31-8005-874e13dc9275/fbd4d206-5fcf-4579-a13d-8e4adc225411", "osd", "allow r pool=manila_data namespace=fsvolumens_1f8ee972-fbad-4b31-8005-874e13dc9275", "mon", "allow r"], "format": "json"}]': finished Dec 15 05:14:44 localhost nova_compute[286344]: 2025-12-15 10:14:44.688 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:14:46 localhost ceph-mon[298913]: mon.np0005559462@0(leader) e17 handle_command mon_command({"prefix": "auth rm", "entity": "client.alice"} v 0) Dec 15 05:14:46 localhost ceph-mon[298913]: log_channel(audit) log [INF] : from='mgr.34411 ' entity='mgr.np0005559464.aomnqe' cmd={"prefix": "auth rm", "entity": "client.alice"} : dispatch Dec 15 05:14:46 localhost ceph-mon[298913]: log_channel(audit) log [INF] : from='mgr.34411 ' entity='mgr.np0005559464.aomnqe' cmd='[{"prefix": "auth rm", "entity": "client.alice"}]': finished Dec 15 05:14:46 localhost nova_compute[286344]: 2025-12-15 10:14:46.832 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:14:46 localhost ceph-mon[298913]: 
mon.np0005559462@0(leader).osd e275 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104 Dec 15 05:14:47 localhost ceph-mon[298913]: from='mgr.34411 172.18.0.108:0/929118679' entity='mgr.np0005559464.aomnqe' cmd={"prefix": "auth get", "entity": "client.alice", "format": "json"} : dispatch Dec 15 05:14:47 localhost ceph-mon[298913]: from='mgr.34411 172.18.0.108:0/929118679' entity='mgr.np0005559464.aomnqe' cmd={"prefix": "auth rm", "entity": "client.alice"} : dispatch Dec 15 05:14:47 localhost ceph-mon[298913]: from='mgr.34411 ' entity='mgr.np0005559464.aomnqe' cmd={"prefix": "auth rm", "entity": "client.alice"} : dispatch Dec 15 05:14:47 localhost ceph-mon[298913]: from='mgr.34411 ' entity='mgr.np0005559464.aomnqe' cmd='[{"prefix": "auth rm", "entity": "client.alice"}]': finished Dec 15 05:14:49 localhost ceph-mon[298913]: mon.np0005559462@0(leader) e17 handle_command mon_command({"prefix": "auth get-or-create", "entity": "client.alice_bob", "caps": ["mds", "allow rw path=/volumes/_nogroup/1f8ee972-fbad-4b31-8005-874e13dc9275/fbd4d206-5fcf-4579-a13d-8e4adc225411", "osd", "allow rw pool=manila_data namespace=fsvolumens_1f8ee972-fbad-4b31-8005-874e13dc9275", "mon", "allow r"], "format": "json"} v 0) Dec 15 05:14:49 localhost ceph-mon[298913]: log_channel(audit) log [INF] : from='mgr.34411 ' entity='mgr.np0005559464.aomnqe' cmd={"prefix": "auth get-or-create", "entity": "client.alice_bob", "caps": ["mds", "allow rw path=/volumes/_nogroup/1f8ee972-fbad-4b31-8005-874e13dc9275/fbd4d206-5fcf-4579-a13d-8e4adc225411", "osd", "allow rw pool=manila_data namespace=fsvolumens_1f8ee972-fbad-4b31-8005-874e13dc9275", "mon", "allow r"], "format": "json"} : dispatch Dec 15 05:14:49 localhost ceph-mon[298913]: log_channel(audit) log [INF] : from='mgr.34411 ' entity='mgr.np0005559464.aomnqe' cmd='[{"prefix": "auth get-or-create", "entity": "client.alice_bob", "caps": ["mds", "allow rw 
path=/volumes/_nogroup/1f8ee972-fbad-4b31-8005-874e13dc9275/fbd4d206-5fcf-4579-a13d-8e4adc225411", "osd", "allow rw pool=manila_data namespace=fsvolumens_1f8ee972-fbad-4b31-8005-874e13dc9275", "mon", "allow r"], "format": "json"}]': finished Dec 15 05:14:49 localhost nova_compute[286344]: 2025-12-15 10:14:49.693 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:14:50 localhost ceph-mon[298913]: from='mgr.34411 172.18.0.108:0/929118679' entity='mgr.np0005559464.aomnqe' cmd={"prefix": "auth get", "entity": "client.alice_bob", "format": "json"} : dispatch Dec 15 05:14:50 localhost ceph-mon[298913]: from='mgr.34411 172.18.0.108:0/929118679' entity='mgr.np0005559464.aomnqe' cmd={"prefix": "auth get-or-create", "entity": "client.alice_bob", "caps": ["mds", "allow rw path=/volumes/_nogroup/1f8ee972-fbad-4b31-8005-874e13dc9275/fbd4d206-5fcf-4579-a13d-8e4adc225411", "osd", "allow rw pool=manila_data namespace=fsvolumens_1f8ee972-fbad-4b31-8005-874e13dc9275", "mon", "allow r"], "format": "json"} : dispatch Dec 15 05:14:50 localhost ceph-mon[298913]: from='mgr.34411 ' entity='mgr.np0005559464.aomnqe' cmd={"prefix": "auth get-or-create", "entity": "client.alice_bob", "caps": ["mds", "allow rw path=/volumes/_nogroup/1f8ee972-fbad-4b31-8005-874e13dc9275/fbd4d206-5fcf-4579-a13d-8e4adc225411", "osd", "allow rw pool=manila_data namespace=fsvolumens_1f8ee972-fbad-4b31-8005-874e13dc9275", "mon", "allow r"], "format": "json"} : dispatch Dec 15 05:14:50 localhost ceph-mon[298913]: from='mgr.34411 ' entity='mgr.np0005559464.aomnqe' cmd='[{"prefix": "auth get-or-create", "entity": "client.alice_bob", "caps": ["mds", "allow rw path=/volumes/_nogroup/1f8ee972-fbad-4b31-8005-874e13dc9275/fbd4d206-5fcf-4579-a13d-8e4adc225411", "osd", "allow rw pool=manila_data namespace=fsvolumens_1f8ee972-fbad-4b31-8005-874e13dc9275", "mon", "allow r"], "format": "json"}]': finished Dec 15 05:14:51 localhost 
ovn_metadata_agent[160585]: 2025-12-15 10:14:51.490 160590 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Dec 15 05:14:51 localhost ovn_metadata_agent[160585]: 2025-12-15 10:14:51.490 160590 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Dec 15 05:14:51 localhost ovn_metadata_agent[160585]: 2025-12-15 10:14:51.491 160590 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Dec 15 05:14:51 localhost nova_compute[286344]: 2025-12-15 10:14:51.871 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:14:51 localhost ceph-mon[298913]: mon.np0005559462@0(leader).osd e275 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104 Dec 15 05:14:52 localhost ceph-mon[298913]: mon.np0005559462@0(leader) e17 handle_command mon_command({"prefix": "auth rm", "entity": "client.alice_bob"} v 0) Dec 15 05:14:52 localhost ceph-mon[298913]: log_channel(audit) log [INF] : from='mgr.34411 ' entity='mgr.np0005559464.aomnqe' cmd={"prefix": "auth rm", "entity": "client.alice_bob"} : dispatch Dec 15 05:14:52 localhost ceph-mon[298913]: log_channel(audit) log [INF] : from='mgr.34411 ' entity='mgr.np0005559464.aomnqe' cmd='[{"prefix": "auth rm", "entity": "client.alice_bob"}]': finished Dec 15 05:14:52 localhost ceph-mon[298913]: from='mgr.34411 
172.18.0.108:0/929118679' entity='mgr.np0005559464.aomnqe' cmd={"prefix": "auth get", "entity": "client.alice_bob", "format": "json"} : dispatch Dec 15 05:14:52 localhost ceph-mon[298913]: from='mgr.34411 172.18.0.108:0/929118679' entity='mgr.np0005559464.aomnqe' cmd={"prefix": "auth rm", "entity": "client.alice_bob"} : dispatch Dec 15 05:14:52 localhost ceph-mon[298913]: from='mgr.34411 ' entity='mgr.np0005559464.aomnqe' cmd={"prefix": "auth rm", "entity": "client.alice_bob"} : dispatch Dec 15 05:14:52 localhost ceph-mon[298913]: from='mgr.34411 ' entity='mgr.np0005559464.aomnqe' cmd='[{"prefix": "auth rm", "entity": "client.alice_bob"}]': finished Dec 15 05:14:54 localhost nova_compute[286344]: 2025-12-15 10:14:54.714 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:14:55 localhost systemd[1]: Started /usr/bin/podman healthcheck run 67ee72fcd99c896f79e8807764b1d0f04e7d45927d06e306c0986394e8ed42a0. Dec 15 05:14:55 localhost systemd[1]: Started /usr/bin/podman healthcheck run 730c50020a7d9e2da21d2462d40af20ac303e43f3491f692cbbd87f432ba3e09. Dec 15 05:14:55 localhost systemd[1]: Started /usr/bin/podman healthcheck run 9f0f481c937606298203797a71924b7614a5d9e3f4a8afd837cfa5b9602aedeb. Dec 15 05:14:55 localhost systemd[1]: Started /usr/bin/podman healthcheck run ac2eae30a2a7f971398af42ea67b3ed799d4b3341f7688ebcd73f7ca7f66c2a8. Dec 15 05:14:55 localhost systemd[1]: Started /usr/bin/podman healthcheck run b3d8483ade539500e228c2af9c0c3e647febb5ec7903610a83aea32631007d4a. 
Dec 15 05:14:55 localhost podman[337230]: 2025-12-15 10:14:55.767265206 +0000 UTC m=+0.081749327 container health_status ac2eae30a2a7f971398af42ea67b3ed799d4b3341f7688ebcd73f7ca7f66c2a8 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=ovn_controller, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '07f3af15d79276418f54a8dde2fc251064527c3571430e14af77aaa2c01f1193'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team) Dec 15 05:14:55 localhost podman[337230]: 2025-12-15 10:14:55.870691177 +0000 UTC m=+0.185175228 container exec_died ac2eae30a2a7f971398af42ea67b3ed799d4b3341f7688ebcd73f7ca7f66c2a8 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, org.label-schema.build-date=20251202, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 
9 Base Image, org.label-schema.vendor=CentOS, config_id=ovn_controller, tcib_managed=true, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '07f3af15d79276418f54a8dde2fc251064527c3571430e14af77aaa2c01f1193'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.license=GPLv2) Dec 15 05:14:55 localhost podman[337222]: 2025-12-15 10:14:55.882499906 +0000 UTC m=+0.211748925 container health_status 67ee72fcd99c896f79e8807764b1d0f04e7d45927d06e306c0986394e8ed42a0 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, managed_by=edpm_ansible, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', 
'--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors ) Dec 15 05:14:55 localhost systemd[1]: ac2eae30a2a7f971398af42ea67b3ed799d4b3341f7688ebcd73f7ca7f66c2a8.service: Deactivated successfully. Dec 15 05:14:55 localhost podman[337222]: 2025-12-15 10:14:55.894017326 +0000 UTC m=+0.223266335 container exec_died 67ee72fcd99c896f79e8807764b1d0f04e7d45927d06e306c0986394e8ed42a0 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 
'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter) Dec 15 05:14:55 localhost podman[337224]: 2025-12-15 10:14:55.851273443 +0000 UTC m=+0.169343330 container health_status 9f0f481c937606298203797a71924b7614a5d9e3f4a8afd837cfa5b9602aedeb (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, 
tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, tcib_managed=true, managed_by=edpm_ansible) Dec 15 05:14:55 localhost systemd[1]: 67ee72fcd99c896f79e8807764b1d0f04e7d45927d06e306c0986394e8ed42a0.service: Deactivated successfully. Dec 15 05:14:55 localhost podman[337224]: 2025-12-15 10:14:55.935463515 +0000 UTC m=+0.253533392 container exec_died 9f0f481c937606298203797a71924b7614a5d9e3f4a8afd837cfa5b9602aedeb (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=multipathd, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, config_id=multipathd, org.label-schema.build-date=20251202, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', 
'/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}) Dec 15 05:14:55 localhost systemd[1]: 9f0f481c937606298203797a71924b7614a5d9e3f4a8afd837cfa5b9602aedeb.service: Deactivated successfully. Dec 15 05:14:55 localhost podman[337237]: 2025-12-15 10:14:55.992886983 +0000 UTC m=+0.300112248 container health_status b3d8483ade539500e228c2af9c0c3e647febb5ec7903610a83aea32631007d4a (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, config_id=ceilometer_agent_compute, managed_by=edpm_ansible, tcib_managed=true, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '07f3af15d79276418f54a8dde2fc251064527c3571430e14af77aaa2c01f1193-cb1e9478634a00c8fb5ad9cd7fcd38c7b2a1a85187dfaa618bc715c24c2b15fb'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202) Dec 15 05:14:56 localhost podman[337237]: 2025-12-15 10:14:56.006282615 +0000 UTC m=+0.313507850 container exec_died b3d8483ade539500e228c2af9c0c3e647febb5ec7903610a83aea32631007d4a (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '07f3af15d79276418f54a8dde2fc251064527c3571430e14af77aaa2c01f1193-cb1e9478634a00c8fb5ad9cd7fcd38c7b2a1a85187dfaa618bc715c24c2b15fb'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, 
org.label-schema.license=GPLv2, config_id=ceilometer_agent_compute, org.label-schema.build-date=20251202, maintainer=OpenStack Kubernetes Operator team, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, managed_by=edpm_ansible, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb) Dec 15 05:14:56 localhost systemd[1]: b3d8483ade539500e228c2af9c0c3e647febb5ec7903610a83aea32631007d4a.service: Deactivated successfully. Dec 15 05:14:56 localhost podman[337223]: 2025-12-15 10:14:56.079319996 +0000 UTC m=+0.402964354 container health_status 730c50020a7d9e2da21d2462d40af20ac303e43f3491f692cbbd87f432ba3e09 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, distribution-scope=public, io.openshift.tags=minimal rhel9, managed_by=edpm_ansible, url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.expose-services=, vcs-type=git, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., config_id=openstack_network_exporter, container_name=openstack_network_exporter, vendor=Red Hat, Inc., version=9.6, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, release=1755695350, build-date=2025-08-20T13:12:41, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. 
This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., io.buildah.version=1.33.7, name=ubi9-minimal, com.redhat.component=ubi9-minimal-container, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'cb1e9478634a00c8fb5ad9cd7fcd38c7b2a1a85187dfaa618bc715c24c2b15fb'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}) Dec 15 05:14:56 localhost podman[337223]: 2025-12-15 10:14:56.093411196 +0000 UTC m=+0.417055594 container exec_died 730c50020a7d9e2da21d2462d40af20ac303e43f3491f692cbbd87f432ba3e09 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, build-date=2025-08-20T13:12:41, com.redhat.component=ubi9-minimal-container, architecture=x86_64, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.tags=minimal rhel9, managed_by=edpm_ansible, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., config_id=openstack_network_exporter, vendor=Red Hat, Inc., vcs-type=git, maintainer=Red Hat, Inc., distribution-scope=public, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=ubi9-minimal, container_name=openstack_network_exporter, release=1755695350, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'cb1e9478634a00c8fb5ad9cd7fcd38c7b2a1a85187dfaa618bc715c24c2b15fb'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, version=9.6, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=) Dec 15 05:14:56 localhost systemd[1]: 730c50020a7d9e2da21d2462d40af20ac303e43f3491f692cbbd87f432ba3e09.service: Deactivated successfully. Dec 15 05:14:56 localhost ceph-mon[298913]: from='mgr.34411 172.18.0.108:0/929118679' entity='mgr.np0005559464.aomnqe' cmd={"prefix": "auth get", "entity": "client.alice_bob", "format": "json"} : dispatch Dec 15 05:14:56 localhost ceph-mon[298913]: mon.np0005559462@0(leader) e17 handle_command mon_command({"prefix": "auth get-or-create", "entity": "client.alice_bob", "caps": ["mds", "allow r path=/volumes/_nogroup/1f8ee972-fbad-4b31-8005-874e13dc9275/fbd4d206-5fcf-4579-a13d-8e4adc225411", "osd", "allow r pool=manila_data namespace=fsvolumens_1f8ee972-fbad-4b31-8005-874e13dc9275", "mon", "allow r"], "format": "json"} v 0) Dec 15 05:14:56 localhost ceph-mon[298913]: log_channel(audit) log [INF] : from='mgr.34411 ' entity='mgr.np0005559464.aomnqe' cmd={"prefix": "auth get-or-create", "entity": "client.alice_bob", "caps": ["mds", "allow r path=/volumes/_nogroup/1f8ee972-fbad-4b31-8005-874e13dc9275/fbd4d206-5fcf-4579-a13d-8e4adc225411", "osd", "allow r pool=manila_data namespace=fsvolumens_1f8ee972-fbad-4b31-8005-874e13dc9275", "mon", "allow r"], "format": "json"} : dispatch Dec 15 05:14:56 localhost ceph-mon[298913]: log_channel(audit) log [INF] : from='mgr.34411 ' entity='mgr.np0005559464.aomnqe' cmd='[{"prefix": "auth get-or-create", "entity": "client.alice_bob", "caps": ["mds", "allow r path=/volumes/_nogroup/1f8ee972-fbad-4b31-8005-874e13dc9275/fbd4d206-5fcf-4579-a13d-8e4adc225411", "osd", "allow r pool=manila_data namespace=fsvolumens_1f8ee972-fbad-4b31-8005-874e13dc9275", "mon", "allow r"], "format": "json"}]': finished Dec 15 05:14:56 localhost nova_compute[286344]: 2025-12-15 10:14:56.900 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup 
/usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:14:56 localhost ceph-mon[298913]: mon.np0005559462@0(leader).osd e275 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104 Dec 15 05:14:57 localhost ceph-mon[298913]: from='mgr.34411 172.18.0.108:0/929118679' entity='mgr.np0005559464.aomnqe' cmd={"prefix": "auth get-or-create", "entity": "client.alice_bob", "caps": ["mds", "allow r path=/volumes/_nogroup/1f8ee972-fbad-4b31-8005-874e13dc9275/fbd4d206-5fcf-4579-a13d-8e4adc225411", "osd", "allow r pool=manila_data namespace=fsvolumens_1f8ee972-fbad-4b31-8005-874e13dc9275", "mon", "allow r"], "format": "json"} : dispatch Dec 15 05:14:57 localhost ceph-mon[298913]: from='mgr.34411 ' entity='mgr.np0005559464.aomnqe' cmd={"prefix": "auth get-or-create", "entity": "client.alice_bob", "caps": ["mds", "allow r path=/volumes/_nogroup/1f8ee972-fbad-4b31-8005-874e13dc9275/fbd4d206-5fcf-4579-a13d-8e4adc225411", "osd", "allow r pool=manila_data namespace=fsvolumens_1f8ee972-fbad-4b31-8005-874e13dc9275", "mon", "allow r"], "format": "json"} : dispatch Dec 15 05:14:57 localhost ceph-mon[298913]: from='mgr.34411 ' entity='mgr.np0005559464.aomnqe' cmd='[{"prefix": "auth get-or-create", "entity": "client.alice_bob", "caps": ["mds", "allow r path=/volumes/_nogroup/1f8ee972-fbad-4b31-8005-874e13dc9275/fbd4d206-5fcf-4579-a13d-8e4adc225411", "osd", "allow r pool=manila_data namespace=fsvolumens_1f8ee972-fbad-4b31-8005-874e13dc9275", "mon", "allow r"], "format": "json"}]': finished Dec 15 05:14:59 localhost ceph-mon[298913]: mon.np0005559462@0(leader) e17 handle_command mon_command({"prefix": "auth rm", "entity": "client.alice_bob"} v 0) Dec 15 05:14:59 localhost ceph-mon[298913]: log_channel(audit) log [INF] : from='mgr.34411 ' entity='mgr.np0005559464.aomnqe' cmd={"prefix": "auth rm", "entity": "client.alice_bob"} : dispatch Dec 15 05:14:59 localhost ceph-mon[298913]: log_channel(audit) log [INF] : from='mgr.34411 ' 
entity='mgr.np0005559464.aomnqe' cmd='[{"prefix": "auth rm", "entity": "client.alice_bob"}]': finished Dec 15 05:14:59 localhost nova_compute[286344]: 2025-12-15 10:14:59.759 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:15:00 localhost ceph-mon[298913]: from='mgr.34411 172.18.0.108:0/929118679' entity='mgr.np0005559464.aomnqe' cmd={"prefix": "auth get", "entity": "client.alice_bob", "format": "json"} : dispatch Dec 15 05:15:00 localhost ceph-mon[298913]: from='mgr.34411 172.18.0.108:0/929118679' entity='mgr.np0005559464.aomnqe' cmd={"prefix": "auth rm", "entity": "client.alice_bob"} : dispatch Dec 15 05:15:00 localhost ceph-mon[298913]: from='mgr.34411 ' entity='mgr.np0005559464.aomnqe' cmd={"prefix": "auth rm", "entity": "client.alice_bob"} : dispatch Dec 15 05:15:00 localhost ceph-mon[298913]: from='mgr.34411 ' entity='mgr.np0005559464.aomnqe' cmd='[{"prefix": "auth rm", "entity": "client.alice_bob"}]': finished Dec 15 05:15:01 localhost ceph-mon[298913]: mon.np0005559462@0(leader) e17 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) Dec 15 05:15:01 localhost ceph-mon[298913]: log_channel(audit) log [DBG] : from='client.25625 172.18.0.34:0/382777224' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch Dec 15 05:15:01 localhost systemd[1]: Started /usr/bin/podman healthcheck run 4d46c406a3855fd1c322e5d86c048188ea5865daa07ba23aaebacc7879668923. 
Dec 15 05:15:01 localhost podman[337329]: 2025-12-15 10:15:01.750246173 +0000 UTC m=+0.077308686 container health_status 4d46c406a3855fd1c322e5d86c048188ea5865daa07ba23aaebacc7879668923 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, org.label-schema.build-date=20251202, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '07f3af15d79276418f54a8dde2fc251064527c3571430e14af77aaa2c01f1193-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb) Dec 15 05:15:01 localhost 
podman[337329]: 2025-12-15 10:15:01.755210848 +0000 UTC m=+0.082273371 container exec_died 4d46c406a3855fd1c322e5d86c048188ea5865daa07ba23aaebacc7879668923 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '07f3af15d79276418f54a8dde2fc251064527c3571430e14af77aaa2c01f1193-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_id=ovn_metadata_agent, org.label-schema.schema-version=1.0) Dec 15 05:15:01 localhost systemd[1]: 
4d46c406a3855fd1c322e5d86c048188ea5865daa07ba23aaebacc7879668923.service: Deactivated successfully. Dec 15 05:15:01 localhost podman[243449]: time="2025-12-15T10:15:01Z" level=info msg="List containers: received `last` parameter - overwriting `limit`" Dec 15 05:15:01 localhost podman[243449]: @ - - [15/Dec/2025:10:15:01 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 156640 "" "Go-http-client/1.1" Dec 15 05:15:01 localhost nova_compute[286344]: 2025-12-15 10:15:01.905 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:15:01 localhost podman[243449]: @ - - [15/Dec/2025:10:15:01 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 19276 "" "Go-http-client/1.1" Dec 15 05:15:01 localhost ceph-mon[298913]: mon.np0005559462@0(leader).osd e275 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104 Dec 15 05:15:01 localhost ceph-mon[298913]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #73. Immutable memtables: 0. 
Dec 15 05:15:01 localhost ceph-mon[298913]: rocksdb: (Original Log Time 2025/12/15-10:15:01.981955) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0 Dec 15 05:15:01 localhost ceph-mon[298913]: rocksdb: [db/flush_job.cc:856] [default] [JOB 43] Flushing memtable with next log file: 73 Dec 15 05:15:01 localhost ceph-mon[298913]: rocksdb: EVENT_LOG_v1 {"time_micros": 1765793701982082, "job": 43, "event": "flush_started", "num_memtables": 1, "num_entries": 2303, "num_deletes": 264, "total_data_size": 2624905, "memory_usage": 2771224, "flush_reason": "Manual Compaction"} Dec 15 05:15:01 localhost ceph-mon[298913]: rocksdb: [db/flush_job.cc:885] [default] [JOB 43] Level-0 flush table #74: started Dec 15 05:15:01 localhost ceph-mon[298913]: rocksdb: EVENT_LOG_v1 {"time_micros": 1765793701998806, "cf_name": "default", "job": 43, "event": "table_file_creation", "file_number": 74, "file_size": 1985379, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 38474, "largest_seqno": 40776, "table_properties": {"data_size": 1977828, "index_size": 4060, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 2437, "raw_key_size": 21562, "raw_average_key_size": 22, "raw_value_size": 1960363, "raw_average_value_size": 2020, "num_data_blocks": 178, "num_entries": 970, "num_filter_entries": 970, "num_deletions": 264, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; 
max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1765793551, "oldest_key_time": 1765793551, "file_creation_time": 1765793701, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "603b24af-e2be-4214-bc56-9e652eb4af3d", "db_session_id": "0OJRM9SCUA16EXV0VQZ2", "orig_file_number": 74, "seqno_to_time_mapping": "N/A"}} Dec 15 05:15:01 localhost ceph-mon[298913]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 43] Flush lasted 16901 microseconds, and 6715 cpu microseconds. Dec 15 05:15:01 localhost ceph-mon[298913]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed. Dec 15 05:15:02 localhost ceph-mon[298913]: rocksdb: (Original Log Time 2025/12/15-10:15:01.998860) [db/flush_job.cc:967] [default] [JOB 43] Level-0 flush table #74: 1985379 bytes OK Dec 15 05:15:02 localhost ceph-mon[298913]: rocksdb: (Original Log Time 2025/12/15-10:15:01.998884) [db/memtable_list.cc:519] [default] Level-0 commit table #74 started Dec 15 05:15:02 localhost ceph-mon[298913]: rocksdb: (Original Log Time 2025/12/15-10:15:02.002503) [db/memtable_list.cc:722] [default] Level-0 commit table #74: memtable #1 done Dec 15 05:15:02 localhost ceph-mon[298913]: rocksdb: (Original Log Time 2025/12/15-10:15:02.002530) EVENT_LOG_v1 {"time_micros": 1765793702002521, "job": 43, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0} Dec 15 05:15:02 localhost ceph-mon[298913]: rocksdb: (Original Log Time 2025/12/15-10:15:02.002550) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25 Dec 15 05:15:02 localhost ceph-mon[298913]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 43] Try to delete WAL files size 2614916, prev total WAL file 
size 2615406, number of live WAL files 2. Dec 15 05:15:02 localhost ceph-mon[298913]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005559462/store.db/000070.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000 Dec 15 05:15:02 localhost ceph-mon[298913]: rocksdb: (Original Log Time 2025/12/15-10:15:02.004823) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '6D6772737461740034323539' seq:72057594037927935, type:22 .. '6D6772737461740034353130' seq:0, type:0; will stop at (end) Dec 15 05:15:02 localhost ceph-mon[298913]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 44] Compacting 1@0 + 1@6 files to L6, score -1.00 Dec 15 05:15:02 localhost ceph-mon[298913]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 43 Base level 0, inputs: [74(1938KB)], [72(17MB)] Dec 15 05:15:02 localhost ceph-mon[298913]: rocksdb: EVENT_LOG_v1 {"time_micros": 1765793702004878, "job": 44, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [74], "files_L6": [72], "score": -1, "input_data_size": 20720589, "oldest_snapshot_seqno": -1} Dec 15 05:15:02 localhost ceph-mon[298913]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 44] Generated table #75: 14513 keys, 19179768 bytes, temperature: kUnknown Dec 15 05:15:02 localhost ceph-mon[298913]: rocksdb: EVENT_LOG_v1 {"time_micros": 1765793702135490, "cf_name": "default", "job": 44, "event": "table_file_creation", "file_number": 75, "file_size": 19179768, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 19097611, "index_size": 44808, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 36293, "raw_key_size": 389695, "raw_average_key_size": 26, "raw_value_size": 
18851401, "raw_average_value_size": 1298, "num_data_blocks": 1660, "num_entries": 14513, "num_filter_entries": 14513, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1765792320, "oldest_key_time": 0, "file_creation_time": 1765793702, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "603b24af-e2be-4214-bc56-9e652eb4af3d", "db_session_id": "0OJRM9SCUA16EXV0VQZ2", "orig_file_number": 75, "seqno_to_time_mapping": "N/A"}} Dec 15 05:15:02 localhost ceph-mon[298913]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed. 
Dec 15 05:15:02 localhost ceph-mon[298913]: rocksdb: (Original Log Time 2025/12/15-10:15:02.135781) [db/compaction/compaction_job.cc:1663] [default] [JOB 44] Compacted 1@0 + 1@6 files to L6 => 19179768 bytes Dec 15 05:15:02 localhost ceph-mon[298913]: rocksdb: (Original Log Time 2025/12/15-10:15:02.138208) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 158.5 rd, 146.8 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(1.9, 17.9 +0.0 blob) out(18.3 +0.0 blob), read-write-amplify(20.1) write-amplify(9.7) OK, records in: 14992, records dropped: 479 output_compression: NoCompression Dec 15 05:15:02 localhost ceph-mon[298913]: rocksdb: (Original Log Time 2025/12/15-10:15:02.138239) EVENT_LOG_v1 {"time_micros": 1765793702138225, "job": 44, "event": "compaction_finished", "compaction_time_micros": 130694, "compaction_time_cpu_micros": 49637, "output_level": 6, "num_output_files": 1, "total_output_size": 19179768, "num_input_records": 14992, "num_output_records": 14513, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]} Dec 15 05:15:02 localhost ceph-mon[298913]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005559462/store.db/000074.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000 Dec 15 05:15:02 localhost ceph-mon[298913]: rocksdb: EVENT_LOG_v1 {"time_micros": 1765793702138752, "job": 44, "event": "table_file_deletion", "file_number": 74} Dec 15 05:15:02 localhost ceph-mon[298913]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005559462/store.db/000072.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000 Dec 15 05:15:02 localhost ceph-mon[298913]: rocksdb: EVENT_LOG_v1 {"time_micros": 1765793702141434, 
"job": 44, "event": "table_file_deletion", "file_number": 72} Dec 15 05:15:02 localhost ceph-mon[298913]: rocksdb: (Original Log Time 2025/12/15-10:15:02.004739) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting Dec 15 05:15:02 localhost ceph-mon[298913]: rocksdb: (Original Log Time 2025/12/15-10:15:02.141536) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting Dec 15 05:15:02 localhost ceph-mon[298913]: rocksdb: (Original Log Time 2025/12/15-10:15:02.141542) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting Dec 15 05:15:02 localhost ceph-mon[298913]: rocksdb: (Original Log Time 2025/12/15-10:15:02.141545) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting Dec 15 05:15:02 localhost ceph-mon[298913]: rocksdb: (Original Log Time 2025/12/15-10:15:02.141548) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting Dec 15 05:15:02 localhost ceph-mon[298913]: rocksdb: (Original Log Time 2025/12/15-10:15:02.141551) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting Dec 15 05:15:02 localhost ceph-mon[298913]: mon.np0005559462@0(leader) e17 handle_command mon_command({"prefix": "auth get-or-create", "entity": "client.alice bob", "caps": ["mds", "allow rw path=/volumes/_nogroup/1f8ee972-fbad-4b31-8005-874e13dc9275/fbd4d206-5fcf-4579-a13d-8e4adc225411", "osd", "allow rw pool=manila_data namespace=fsvolumens_1f8ee972-fbad-4b31-8005-874e13dc9275", "mon", "allow r"], "format": "json"} v 0) Dec 15 05:15:02 localhost ceph-mon[298913]: log_channel(audit) log [INF] : from='mgr.34411 ' entity='mgr.np0005559464.aomnqe' cmd={"prefix": "auth get-or-create", "entity": "client.alice bob", "caps": ["mds", "allow rw path=/volumes/_nogroup/1f8ee972-fbad-4b31-8005-874e13dc9275/fbd4d206-5fcf-4579-a13d-8e4adc225411", "osd", "allow rw pool=manila_data namespace=fsvolumens_1f8ee972-fbad-4b31-8005-874e13dc9275", 
"mon", "allow r"], "format": "json"} : dispatch Dec 15 05:15:02 localhost ceph-mon[298913]: log_channel(audit) log [INF] : from='mgr.34411 ' entity='mgr.np0005559464.aomnqe' cmd='[{"prefix": "auth get-or-create", "entity": "client.alice bob", "caps": ["mds", "allow rw path=/volumes/_nogroup/1f8ee972-fbad-4b31-8005-874e13dc9275/fbd4d206-5fcf-4579-a13d-8e4adc225411", "osd", "allow rw pool=manila_data namespace=fsvolumens_1f8ee972-fbad-4b31-8005-874e13dc9275", "mon", "allow r"], "format": "json"}]': finished Dec 15 05:15:03 localhost ceph-mon[298913]: from='mgr.34411 172.18.0.108:0/929118679' entity='mgr.np0005559464.aomnqe' cmd={"prefix": "auth get", "entity": "client.alice bob", "format": "json"} : dispatch Dec 15 05:15:03 localhost ceph-mon[298913]: from='mgr.34411 172.18.0.108:0/929118679' entity='mgr.np0005559464.aomnqe' cmd={"prefix": "auth get-or-create", "entity": "client.alice bob", "caps": ["mds", "allow rw path=/volumes/_nogroup/1f8ee972-fbad-4b31-8005-874e13dc9275/fbd4d206-5fcf-4579-a13d-8e4adc225411", "osd", "allow rw pool=manila_data namespace=fsvolumens_1f8ee972-fbad-4b31-8005-874e13dc9275", "mon", "allow r"], "format": "json"} : dispatch Dec 15 05:15:03 localhost ceph-mon[298913]: from='mgr.34411 ' entity='mgr.np0005559464.aomnqe' cmd={"prefix": "auth get-or-create", "entity": "client.alice bob", "caps": ["mds", "allow rw path=/volumes/_nogroup/1f8ee972-fbad-4b31-8005-874e13dc9275/fbd4d206-5fcf-4579-a13d-8e4adc225411", "osd", "allow rw pool=manila_data namespace=fsvolumens_1f8ee972-fbad-4b31-8005-874e13dc9275", "mon", "allow r"], "format": "json"} : dispatch Dec 15 05:15:03 localhost ceph-mon[298913]: from='mgr.34411 ' entity='mgr.np0005559464.aomnqe' cmd='[{"prefix": "auth get-or-create", "entity": "client.alice bob", "caps": ["mds", "allow rw path=/volumes/_nogroup/1f8ee972-fbad-4b31-8005-874e13dc9275/fbd4d206-5fcf-4579-a13d-8e4adc225411", "osd", "allow rw pool=manila_data namespace=fsvolumens_1f8ee972-fbad-4b31-8005-874e13dc9275", "mon", "allow r"], 
"format": "json"}]': finished Dec 15 05:15:04 localhost ceph-mon[298913]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #76. Immutable memtables: 0. Dec 15 05:15:04 localhost ceph-mon[298913]: rocksdb: (Original Log Time 2025/12/15-10:15:04.538159) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0 Dec 15 05:15:04 localhost ceph-mon[298913]: rocksdb: [db/flush_job.cc:856] [default] [JOB 45] Flushing memtable with next log file: 76 Dec 15 05:15:04 localhost ceph-mon[298913]: rocksdb: EVENT_LOG_v1 {"time_micros": 1765793704538217, "job": 45, "event": "flush_started", "num_memtables": 1, "num_entries": 311, "num_deletes": 251, "total_data_size": 64941, "memory_usage": 71224, "flush_reason": "Manual Compaction"} Dec 15 05:15:04 localhost ceph-mon[298913]: rocksdb: [db/flush_job.cc:885] [default] [JOB 45] Level-0 flush table #77: started Dec 15 05:15:04 localhost ceph-mon[298913]: rocksdb: EVENT_LOG_v1 {"time_micros": 1765793704541422, "cf_name": "default", "job": 45, "event": "table_file_creation", "file_number": 77, "file_size": 64022, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 40777, "largest_seqno": 41087, "table_properties": {"data_size": 62023, "index_size": 174, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 773, "raw_key_size": 5656, "raw_average_key_size": 19, "raw_value_size": 57905, "raw_average_value_size": 200, "num_data_blocks": 8, "num_entries": 289, "num_filter_entries": 289, "num_deletions": 251, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", 
"merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1765793701, "oldest_key_time": 1765793701, "file_creation_time": 1765793704, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "603b24af-e2be-4214-bc56-9e652eb4af3d", "db_session_id": "0OJRM9SCUA16EXV0VQZ2", "orig_file_number": 77, "seqno_to_time_mapping": "N/A"}} Dec 15 05:15:04 localhost ceph-mon[298913]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 45] Flush lasted 3317 microseconds, and 945 cpu microseconds. Dec 15 05:15:04 localhost ceph-mon[298913]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed. Dec 15 05:15:04 localhost ceph-mon[298913]: rocksdb: (Original Log Time 2025/12/15-10:15:04.541477) [db/flush_job.cc:967] [default] [JOB 45] Level-0 flush table #77: 64022 bytes OK Dec 15 05:15:04 localhost ceph-mon[298913]: rocksdb: (Original Log Time 2025/12/15-10:15:04.541501) [db/memtable_list.cc:519] [default] Level-0 commit table #77 started Dec 15 05:15:04 localhost ceph-mon[298913]: rocksdb: (Original Log Time 2025/12/15-10:15:04.543608) [db/memtable_list.cc:722] [default] Level-0 commit table #77: memtable #1 done Dec 15 05:15:04 localhost ceph-mon[298913]: rocksdb: (Original Log Time 2025/12/15-10:15:04.543719) EVENT_LOG_v1 {"time_micros": 1765793704543712, "job": 45, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0} Dec 15 05:15:04 localhost ceph-mon[298913]: rocksdb: (Original Log Time 2025/12/15-10:15:04.543737) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 
0 0 0 0 1] max score 0.25 Dec 15 05:15:04 localhost ceph-mon[298913]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 45] Try to delete WAL files size 62693, prev total WAL file size 62693, number of live WAL files 2. Dec 15 05:15:04 localhost ceph-mon[298913]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005559462/store.db/000073.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000 Dec 15 05:15:04 localhost ceph-mon[298913]: rocksdb: (Original Log Time 2025/12/15-10:15:04.544323) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '7061786F73003133303532' seq:72057594037927935, type:22 .. '7061786F73003133333034' seq:0, type:0; will stop at (end) Dec 15 05:15:04 localhost ceph-mon[298913]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 46] Compacting 1@0 + 1@6 files to L6, score -1.00 Dec 15 05:15:04 localhost ceph-mon[298913]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 45 Base level 0, inputs: [77(62KB)], [75(18MB)] Dec 15 05:15:04 localhost ceph-mon[298913]: rocksdb: EVENT_LOG_v1 {"time_micros": 1765793704544370, "job": 46, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [77], "files_L6": [75], "score": -1, "input_data_size": 19243790, "oldest_snapshot_seqno": -1} Dec 15 05:15:04 localhost ceph-mon[298913]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 46] Generated table #78: 14284 keys, 17998741 bytes, temperature: kUnknown Dec 15 05:15:04 localhost ceph-mon[298913]: rocksdb: EVENT_LOG_v1 {"time_micros": 1765793704681303, "cf_name": "default", "job": 46, "event": "table_file_creation", "file_number": 78, "file_size": 17998741, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 17919988, "index_size": 41980, "index_partitions": 0, 
"top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 35717, "raw_key_size": 385410, "raw_average_key_size": 26, "raw_value_size": 17679647, "raw_average_value_size": 1237, "num_data_blocks": 1536, "num_entries": 14284, "num_filter_entries": 14284, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1765792320, "oldest_key_time": 0, "file_creation_time": 1765793704, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "603b24af-e2be-4214-bc56-9e652eb4af3d", "db_session_id": "0OJRM9SCUA16EXV0VQZ2", "orig_file_number": 78, "seqno_to_time_mapping": "N/A"}} Dec 15 05:15:04 localhost ceph-mon[298913]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed. 
Dec 15 05:15:04 localhost ceph-mon[298913]: rocksdb: (Original Log Time 2025/12/15-10:15:04.681581) [db/compaction/compaction_job.cc:1663] [default] [JOB 46] Compacted 1@0 + 1@6 files to L6 => 17998741 bytes Dec 15 05:15:04 localhost ceph-mon[298913]: rocksdb: (Original Log Time 2025/12/15-10:15:04.683958) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 140.5 rd, 131.4 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(0.1, 18.3 +0.0 blob) out(17.2 +0.0 blob), read-write-amplify(581.7) write-amplify(281.1) OK, records in: 14802, records dropped: 518 output_compression: NoCompression Dec 15 05:15:04 localhost ceph-mon[298913]: rocksdb: (Original Log Time 2025/12/15-10:15:04.684016) EVENT_LOG_v1 {"time_micros": 1765793704683975, "job": 46, "event": "compaction_finished", "compaction_time_micros": 137013, "compaction_time_cpu_micros": 50178, "output_level": 6, "num_output_files": 1, "total_output_size": 17998741, "num_input_records": 14802, "num_output_records": 14284, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]} Dec 15 05:15:04 localhost ceph-mon[298913]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005559462/store.db/000077.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000 Dec 15 05:15:04 localhost ceph-mon[298913]: rocksdb: EVENT_LOG_v1 {"time_micros": 1765793704684159, "job": 46, "event": "table_file_deletion", "file_number": 77} Dec 15 05:15:04 localhost ceph-mon[298913]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005559462/store.db/000075.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000 Dec 15 05:15:04 localhost ceph-mon[298913]: rocksdb: EVENT_LOG_v1 {"time_micros": 1765793704687241, 
"job": 46, "event": "table_file_deletion", "file_number": 75} Dec 15 05:15:04 localhost ceph-mon[298913]: rocksdb: (Original Log Time 2025/12/15-10:15:04.544213) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting Dec 15 05:15:04 localhost ceph-mon[298913]: rocksdb: (Original Log Time 2025/12/15-10:15:04.687371) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting Dec 15 05:15:04 localhost ceph-mon[298913]: rocksdb: (Original Log Time 2025/12/15-10:15:04.687379) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting Dec 15 05:15:04 localhost ceph-mon[298913]: rocksdb: (Original Log Time 2025/12/15-10:15:04.687383) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting Dec 15 05:15:04 localhost ceph-mon[298913]: rocksdb: (Original Log Time 2025/12/15-10:15:04.687387) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting Dec 15 05:15:04 localhost ceph-mon[298913]: rocksdb: (Original Log Time 2025/12/15-10:15:04.687391) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting Dec 15 05:15:04 localhost nova_compute[286344]: 2025-12-15 10:15:04.795 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:15:04 localhost openstack_network_exporter[246484]: ERROR 10:15:04 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd Dec 15 05:15:04 localhost openstack_network_exporter[246484]: ERROR 10:15:04 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server Dec 15 05:15:04 localhost openstack_network_exporter[246484]: ERROR 10:15:04 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd Dec 15 05:15:04 localhost openstack_network_exporter[246484]: ERROR 10:15:04 appctl.go:174: 
call(dpif-netdev/pmd-perf-show): please specify an existing datapath Dec 15 05:15:04 localhost openstack_network_exporter[246484]: Dec 15 05:15:04 localhost openstack_network_exporter[246484]: ERROR 10:15:04 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath Dec 15 05:15:04 localhost openstack_network_exporter[246484]: Dec 15 05:15:05 localhost ceph-osd[31375]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS ------- Dec 15 05:15:05 localhost ceph-osd[31375]: rocksdb: [db/db_impl/db_impl.cc:1111] #012** DB Stats **#012Uptime(secs): 9000.1 total, 600.0 interval#012Cumulative writes: 19K writes, 75K keys, 19K commit groups, 1.0 writes per commit group, ingest: 0.07 GB, 0.01 MB/s#012Cumulative WAL: 19K writes, 6755 syncs, 2.84 writes per sync, written: 0.07 GB, 0.01 MB/s#012Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent#012Interval writes: 9980 writes, 39K keys, 9980 commit groups, 1.0 writes per commit group, ingest: 35.78 MB, 0.06 MB/s#012Interval WAL: 9980 writes, 4210 syncs, 2.37 writes per sync, written: 0.03 GB, 0.06 MB/s#012Interval stall: 00:00:0.000 H:M:S, 0.0 percent Dec 15 05:15:06 localhost ceph-mon[298913]: mon.np0005559462@0(leader) e17 handle_command mon_command({"prefix": "auth rm", "entity": "client.alice bob"} v 0) Dec 15 05:15:06 localhost ceph-mon[298913]: log_channel(audit) log [INF] : from='mgr.34411 ' entity='mgr.np0005559464.aomnqe' cmd={"prefix": "auth rm", "entity": "client.alice bob"} : dispatch Dec 15 05:15:06 localhost ceph-mon[298913]: log_channel(audit) log [INF] : from='mgr.34411 ' entity='mgr.np0005559464.aomnqe' cmd='[{"prefix": "auth rm", "entity": "client.alice bob"}]': finished Dec 15 05:15:06 localhost ceph-mon[298913]: from='mgr.34411 172.18.0.108:0/929118679' entity='mgr.np0005559464.aomnqe' cmd={"prefix": "auth get", "entity": "client.alice bob", "format": "json"} : dispatch Dec 15 05:15:06 localhost ceph-mon[298913]: from='mgr.34411 172.18.0.108:0/929118679' 
entity='mgr.np0005559464.aomnqe' cmd={"prefix": "auth rm", "entity": "client.alice bob"} : dispatch Dec 15 05:15:06 localhost ceph-mon[298913]: from='mgr.34411 ' entity='mgr.np0005559464.aomnqe' cmd={"prefix": "auth rm", "entity": "client.alice bob"} : dispatch Dec 15 05:15:06 localhost ceph-mon[298913]: from='mgr.34411 ' entity='mgr.np0005559464.aomnqe' cmd='[{"prefix": "auth rm", "entity": "client.alice bob"}]': finished Dec 15 05:15:06 localhost nova_compute[286344]: 2025-12-15 10:15:06.907 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:15:06 localhost ceph-mon[298913]: mon.np0005559462@0(leader).osd e275 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104 Dec 15 05:15:08 localhost systemd[1]: Started /usr/bin/podman healthcheck run a4e94bd7aaee6c174c601060fb5f34559019ccea93fc397263ee3735731cfc1e. Dec 15 05:15:08 localhost podman[337348]: 2025-12-15 10:15:08.759306094 +0000 UTC m=+0.088892309 container health_status a4e94bd7aaee6c174c601060fb5f34559019ccea93fc397263ee3735731cfc1e (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, 
maintainer=Navid Yaghoobi , managed_by=edpm_ansible) Dec 15 05:15:08 localhost podman[337348]: 2025-12-15 10:15:08.772704155 +0000 UTC m=+0.102290410 container exec_died a4e94bd7aaee6c174c601060fb5f34559019ccea93fc397263ee3735731cfc1e (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter) Dec 15 05:15:08 localhost systemd[1]: a4e94bd7aaee6c174c601060fb5f34559019ccea93fc397263ee3735731cfc1e.service: Deactivated successfully. 
Dec 15 05:15:09 localhost ceph-mon[298913]: mon.np0005559462@0(leader) e17 handle_command mon_command({"prefix": "auth get-or-create", "entity": "client.alice bob", "caps": ["mds", "allow r path=/volumes/_nogroup/1f8ee972-fbad-4b31-8005-874e13dc9275/fbd4d206-5fcf-4579-a13d-8e4adc225411", "osd", "allow r pool=manila_data namespace=fsvolumens_1f8ee972-fbad-4b31-8005-874e13dc9275", "mon", "allow r"], "format": "json"} v 0) Dec 15 05:15:09 localhost ceph-mon[298913]: log_channel(audit) log [INF] : from='mgr.34411 ' entity='mgr.np0005559464.aomnqe' cmd={"prefix": "auth get-or-create", "entity": "client.alice bob", "caps": ["mds", "allow r path=/volumes/_nogroup/1f8ee972-fbad-4b31-8005-874e13dc9275/fbd4d206-5fcf-4579-a13d-8e4adc225411", "osd", "allow r pool=manila_data namespace=fsvolumens_1f8ee972-fbad-4b31-8005-874e13dc9275", "mon", "allow r"], "format": "json"} : dispatch Dec 15 05:15:09 localhost ceph-mon[298913]: log_channel(audit) log [INF] : from='mgr.34411 ' entity='mgr.np0005559464.aomnqe' cmd='[{"prefix": "auth get-or-create", "entity": "client.alice bob", "caps": ["mds", "allow r path=/volumes/_nogroup/1f8ee972-fbad-4b31-8005-874e13dc9275/fbd4d206-5fcf-4579-a13d-8e4adc225411", "osd", "allow r pool=manila_data namespace=fsvolumens_1f8ee972-fbad-4b31-8005-874e13dc9275", "mon", "allow r"], "format": "json"}]': finished Dec 15 05:15:09 localhost nova_compute[286344]: 2025-12-15 10:15:09.844 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:15:10 localhost ceph-osd[32311]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS ------- Dec 15 05:15:10 localhost ceph-osd[32311]: rocksdb: [db/db_impl/db_impl.cc:1111] #012** DB Stats **#012Uptime(secs): 9000.2 total, 600.0 interval#012Cumulative writes: 24K writes, 92K keys, 24K commit groups, 1.0 writes per commit group, ingest: 0.06 GB, 0.01 MB/s#012Cumulative WAL: 24K writes, 8482 syncs, 2.94 writes per sync, 
written: 0.06 GB, 0.01 MB/s#012Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent#012Interval writes: 13K writes, 49K keys, 13K commit groups, 1.0 writes per commit group, ingest: 27.10 MB, 0.05 MB/s#012Interval WAL: 13K writes, 5455 syncs, 2.53 writes per sync, written: 0.03 GB, 0.05 MB/s#012Interval stall: 00:00:0.000 H:M:S, 0.0 percent Dec 15 05:15:10 localhost ceph-mon[298913]: from='mgr.34411 172.18.0.108:0/929118679' entity='mgr.np0005559464.aomnqe' cmd={"prefix": "auth get", "entity": "client.alice bob", "format": "json"} : dispatch Dec 15 05:15:10 localhost ceph-mon[298913]: from='mgr.34411 172.18.0.108:0/929118679' entity='mgr.np0005559464.aomnqe' cmd={"prefix": "auth get-or-create", "entity": "client.alice bob", "caps": ["mds", "allow r path=/volumes/_nogroup/1f8ee972-fbad-4b31-8005-874e13dc9275/fbd4d206-5fcf-4579-a13d-8e4adc225411", "osd", "allow r pool=manila_data namespace=fsvolumens_1f8ee972-fbad-4b31-8005-874e13dc9275", "mon", "allow r"], "format": "json"} : dispatch Dec 15 05:15:10 localhost ceph-mon[298913]: from='mgr.34411 ' entity='mgr.np0005559464.aomnqe' cmd={"prefix": "auth get-or-create", "entity": "client.alice bob", "caps": ["mds", "allow r path=/volumes/_nogroup/1f8ee972-fbad-4b31-8005-874e13dc9275/fbd4d206-5fcf-4579-a13d-8e4adc225411", "osd", "allow r pool=manila_data namespace=fsvolumens_1f8ee972-fbad-4b31-8005-874e13dc9275", "mon", "allow r"], "format": "json"} : dispatch Dec 15 05:15:10 localhost ceph-mon[298913]: from='mgr.34411 ' entity='mgr.np0005559464.aomnqe' cmd='[{"prefix": "auth get-or-create", "entity": "client.alice bob", "caps": ["mds", "allow r path=/volumes/_nogroup/1f8ee972-fbad-4b31-8005-874e13dc9275/fbd4d206-5fcf-4579-a13d-8e4adc225411", "osd", "allow r pool=manila_data namespace=fsvolumens_1f8ee972-fbad-4b31-8005-874e13dc9275", "mon", "allow r"], "format": "json"}]': finished Dec 15 05:15:11 localhost nova_compute[286344]: 2025-12-15 10:15:11.935 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 
__log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:15:11 localhost ceph-mon[298913]: mon.np0005559462@0(leader).osd e275 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104 Dec 15 05:15:12 localhost ceph-mon[298913]: mon.np0005559462@0(leader).osd e275 do_prune osdmap full prune enabled Dec 15 05:15:12 localhost ceph-mon[298913]: mon.np0005559462@0(leader).osd e276 e276: 6 total, 6 up, 6 in Dec 15 05:15:12 localhost ceph-mon[298913]: log_channel(cluster) log [DBG] : osdmap e276: 6 total, 6 up, 6 in Dec 15 05:15:12 localhost ceph-mon[298913]: mon.np0005559462@0(leader) e17 handle_command mon_command({"prefix": "auth rm", "entity": "client.alice bob"} v 0) Dec 15 05:15:12 localhost ceph-mon[298913]: log_channel(audit) log [INF] : from='mgr.34411 ' entity='mgr.np0005559464.aomnqe' cmd={"prefix": "auth rm", "entity": "client.alice bob"} : dispatch Dec 15 05:15:12 localhost ceph-mon[298913]: log_channel(audit) log [INF] : from='mgr.34411 ' entity='mgr.np0005559464.aomnqe' cmd='[{"prefix": "auth rm", "entity": "client.alice bob"}]': finished Dec 15 05:15:13 localhost ceph-mon[298913]: from='mgr.34411 172.18.0.108:0/929118679' entity='mgr.np0005559464.aomnqe' cmd={"prefix": "auth get", "entity": "client.alice bob", "format": "json"} : dispatch Dec 15 05:15:13 localhost ceph-mon[298913]: from='mgr.34411 172.18.0.108:0/929118679' entity='mgr.np0005559464.aomnqe' cmd={"prefix": "auth rm", "entity": "client.alice bob"} : dispatch Dec 15 05:15:13 localhost ceph-mon[298913]: from='mgr.34411 ' entity='mgr.np0005559464.aomnqe' cmd={"prefix": "auth rm", "entity": "client.alice bob"} : dispatch Dec 15 05:15:13 localhost ceph-mon[298913]: from='mgr.34411 ' entity='mgr.np0005559464.aomnqe' cmd='[{"prefix": "auth rm", "entity": "client.alice bob"}]': finished Dec 15 05:15:14 localhost ceph-mon[298913]: mon.np0005559462@0(leader) e17 handle_command mon_command([{prefix=config-key set, 
key=mgr/cephadm/osd_remove_queue}] v 0) Dec 15 05:15:14 localhost ceph-mon[298913]: log_channel(audit) log [INF] : from='mgr.34411 ' entity='mgr.np0005559464.aomnqe' Dec 15 05:15:14 localhost ceph-mon[298913]: mon.np0005559462@0(leader) e17 handle_command mon_command([{prefix=config-key set, key=mgr/progress/completed}] v 0) Dec 15 05:15:14 localhost ceph-mon[298913]: log_channel(audit) log [INF] : from='mgr.34411 ' entity='mgr.np0005559464.aomnqe' Dec 15 05:15:14 localhost nova_compute[286344]: 2025-12-15 10:15:14.872 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:15:15 localhost ceph-mon[298913]: from='mgr.34411 172.18.0.108:0/929118679' entity='mgr.np0005559464.aomnqe' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch Dec 15 05:15:15 localhost ceph-mon[298913]: from='mgr.34411 ' entity='mgr.np0005559464.aomnqe' Dec 15 05:15:15 localhost ceph-mon[298913]: from='mgr.34411 ' entity='mgr.np0005559464.aomnqe' Dec 15 05:15:16 localhost ceph-mon[298913]: mon.np0005559462@0(leader) e17 handle_command mon_command({"prefix": "auth get-or-create", "entity": "client.alice", "caps": ["mds", "allow rw path=/volumes/_nogroup/1f8ee972-fbad-4b31-8005-874e13dc9275/fbd4d206-5fcf-4579-a13d-8e4adc225411", "osd", "allow rw pool=manila_data namespace=fsvolumens_1f8ee972-fbad-4b31-8005-874e13dc9275", "mon", "allow r"], "format": "json"} v 0) Dec 15 05:15:16 localhost ceph-mon[298913]: log_channel(audit) log [INF] : from='mgr.34411 ' entity='mgr.np0005559464.aomnqe' cmd={"prefix": "auth get-or-create", "entity": "client.alice", "caps": ["mds", "allow rw path=/volumes/_nogroup/1f8ee972-fbad-4b31-8005-874e13dc9275/fbd4d206-5fcf-4579-a13d-8e4adc225411", "osd", "allow rw pool=manila_data namespace=fsvolumens_1f8ee972-fbad-4b31-8005-874e13dc9275", "mon", "allow r"], "format": "json"} : dispatch Dec 15 05:15:16 localhost ceph-mon[298913]: log_channel(audit) log [INF] : 
from='mgr.34411 ' entity='mgr.np0005559464.aomnqe' cmd='[{"prefix": "auth get-or-create", "entity": "client.alice", "caps": ["mds", "allow rw path=/volumes/_nogroup/1f8ee972-fbad-4b31-8005-874e13dc9275/fbd4d206-5fcf-4579-a13d-8e4adc225411", "osd", "allow rw pool=manila_data namespace=fsvolumens_1f8ee972-fbad-4b31-8005-874e13dc9275", "mon", "allow r"], "format": "json"}]': finished Dec 15 05:15:16 localhost ceph-mon[298913]: from='mgr.34411 172.18.0.108:0/929118679' entity='mgr.np0005559464.aomnqe' cmd={"prefix": "auth get", "entity": "client.alice", "format": "json"} : dispatch Dec 15 05:15:16 localhost ceph-mon[298913]: from='mgr.34411 172.18.0.108:0/929118679' entity='mgr.np0005559464.aomnqe' cmd={"prefix": "auth get-or-create", "entity": "client.alice", "caps": ["mds", "allow rw path=/volumes/_nogroup/1f8ee972-fbad-4b31-8005-874e13dc9275/fbd4d206-5fcf-4579-a13d-8e4adc225411", "osd", "allow rw pool=manila_data namespace=fsvolumens_1f8ee972-fbad-4b31-8005-874e13dc9275", "mon", "allow r"], "format": "json"} : dispatch Dec 15 05:15:16 localhost ceph-mon[298913]: from='mgr.34411 ' entity='mgr.np0005559464.aomnqe' cmd={"prefix": "auth get-or-create", "entity": "client.alice", "caps": ["mds", "allow rw path=/volumes/_nogroup/1f8ee972-fbad-4b31-8005-874e13dc9275/fbd4d206-5fcf-4579-a13d-8e4adc225411", "osd", "allow rw pool=manila_data namespace=fsvolumens_1f8ee972-fbad-4b31-8005-874e13dc9275", "mon", "allow r"], "format": "json"} : dispatch Dec 15 05:15:16 localhost ceph-mon[298913]: from='mgr.34411 ' entity='mgr.np0005559464.aomnqe' cmd='[{"prefix": "auth get-or-create", "entity": "client.alice", "caps": ["mds", "allow rw path=/volumes/_nogroup/1f8ee972-fbad-4b31-8005-874e13dc9275/fbd4d206-5fcf-4579-a13d-8e4adc225411", "osd", "allow rw pool=manila_data namespace=fsvolumens_1f8ee972-fbad-4b31-8005-874e13dc9275", "mon", "allow r"], "format": "json"}]': finished Dec 15 05:15:16 localhost nova_compute[286344]: 2025-12-15 10:15:16.938 286348 DEBUG 
ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:15:16 localhost ceph-mon[298913]: mon.np0005559462@0(leader).osd e276 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104 Dec 15 05:15:19 localhost nova_compute[286344]: 2025-12-15 10:15:19.920 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:15:20 localhost nova_compute[286344]: 2025-12-15 10:15:20.270 286348 DEBUG oslo_service.periodic_task [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Dec 15 05:15:20 localhost nova_compute[286344]: 2025-12-15 10:15:20.270 286348 DEBUG oslo_service.periodic_task [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Dec 15 05:15:20 localhost ceph-mon[298913]: mon.np0005559462@0(leader) e17 handle_command mon_command({"prefix": "auth rm", "entity": "client.alice"} v 0) Dec 15 05:15:20 localhost ceph-mon[298913]: log_channel(audit) log [INF] : from='mgr.34411 ' entity='mgr.np0005559464.aomnqe' cmd={"prefix": "auth rm", "entity": "client.alice"} : dispatch Dec 15 05:15:20 localhost ceph-mon[298913]: log_channel(audit) log [INF] : from='mgr.34411 ' entity='mgr.np0005559464.aomnqe' cmd='[{"prefix": "auth rm", "entity": "client.alice"}]': finished Dec 15 05:15:21 localhost ceph-mon[298913]: from='mgr.34411 172.18.0.108:0/929118679' entity='mgr.np0005559464.aomnqe' cmd={"prefix": "auth get", "entity": "client.alice", "format": "json"} : dispatch Dec 15 05:15:21 localhost ceph-mon[298913]: from='mgr.34411 172.18.0.108:0/929118679' 
entity='mgr.np0005559464.aomnqe' cmd={"prefix": "auth rm", "entity": "client.alice"} : dispatch Dec 15 05:15:21 localhost ceph-mon[298913]: from='mgr.34411 ' entity='mgr.np0005559464.aomnqe' cmd={"prefix": "auth rm", "entity": "client.alice"} : dispatch Dec 15 05:15:21 localhost ceph-mon[298913]: from='mgr.34411 ' entity='mgr.np0005559464.aomnqe' cmd='[{"prefix": "auth rm", "entity": "client.alice"}]': finished Dec 15 05:15:21 localhost nova_compute[286344]: 2025-12-15 10:15:21.965 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:15:21 localhost ceph-mon[298913]: mon.np0005559462@0(leader).osd e276 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104 Dec 15 05:15:21 localhost ceph-mon[298913]: mon.np0005559462@0(leader).osd e276 do_prune osdmap full prune enabled Dec 15 05:15:21 localhost ceph-mon[298913]: mon.np0005559462@0(leader).osd e277 e277: 6 total, 6 up, 6 in Dec 15 05:15:22 localhost ceph-mon[298913]: log_channel(cluster) log [DBG] : osdmap e277: 6 total, 6 up, 6 in Dec 15 05:15:22 localhost nova_compute[286344]: 2025-12-15 10:15:22.267 286348 DEBUG oslo_service.periodic_task [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Dec 15 05:15:22 localhost nova_compute[286344]: 2025-12-15 10:15:22.269 286348 DEBUG oslo_service.periodic_task [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Dec 15 05:15:22 localhost nova_compute[286344]: 2025-12-15 10:15:22.270 286348 DEBUG nova.compute.manager [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Starting heal instance info 
cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m Dec 15 05:15:22 localhost nova_compute[286344]: 2025-12-15 10:15:22.270 286348 DEBUG nova.compute.manager [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m Dec 15 05:15:22 localhost nova_compute[286344]: 2025-12-15 10:15:22.344 286348 DEBUG oslo_concurrency.lockutils [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Acquiring lock "refresh_cache-39ff1bd9-6f6b-44c8-bbec-a1fd9d196359" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m Dec 15 05:15:22 localhost nova_compute[286344]: 2025-12-15 10:15:22.344 286348 DEBUG oslo_concurrency.lockutils [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Acquired lock "refresh_cache-39ff1bd9-6f6b-44c8-bbec-a1fd9d196359" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m Dec 15 05:15:22 localhost nova_compute[286344]: 2025-12-15 10:15:22.345 286348 DEBUG nova.network.neutron [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] [instance: 39ff1bd9-6f6b-44c8-bbec-a1fd9d196359] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m Dec 15 05:15:22 localhost nova_compute[286344]: 2025-12-15 10:15:22.345 286348 DEBUG nova.objects.instance [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Lazy-loading 'info_cache' on Instance uuid 39ff1bd9-6f6b-44c8-bbec-a1fd9d196359 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m Dec 15 05:15:22 localhost ceph-mon[298913]: mon.np0005559462@0(leader) e17 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) Dec 15 05:15:22 localhost ceph-mon[298913]: log_channel(audit) log [DBG] : from='client.25625 
172.18.0.34:0/382777224' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch Dec 15 05:15:22 localhost nova_compute[286344]: 2025-12-15 10:15:22.788 286348 DEBUG nova.network.neutron [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] [instance: 39ff1bd9-6f6b-44c8-bbec-a1fd9d196359] Updating instance_info_cache with network_info: [{"id": "03ef8889-3216-43fb-8a52-4be17a956ce1", "address": "fa:16:3e:74:df:7c", "network": {"id": "befb7a72-17a9-4bcb-b561-84b8f626685a", "bridge": "br-int", "label": "private", "subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.201", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.20", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.0.3"}}], "meta": {"injected": false, "tenant_id": "c785bf23f53946bc99867d8832a50266", "mtu": 1292, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap03ef8889-32", "ovs_interfaceid": "03ef8889-3216-43fb-8a52-4be17a956ce1", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m Dec 15 05:15:22 localhost nova_compute[286344]: 2025-12-15 10:15:22.811 286348 DEBUG oslo_concurrency.lockutils [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Releasing lock "refresh_cache-39ff1bd9-6f6b-44c8-bbec-a1fd9d196359" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m Dec 15 05:15:22 localhost nova_compute[286344]: 2025-12-15 10:15:22.811 286348 DEBUG nova.compute.manager [None 
req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] [instance: 39ff1bd9-6f6b-44c8-bbec-a1fd9d196359] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m Dec 15 05:15:23 localhost nova_compute[286344]: 2025-12-15 10:15:23.270 286348 DEBUG oslo_service.periodic_task [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Dec 15 05:15:23 localhost nova_compute[286344]: 2025-12-15 10:15:23.291 286348 DEBUG oslo_service.periodic_task [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Dec 15 05:15:23 localhost ceph-mon[298913]: mon.np0005559462@0(leader) e17 handle_command mon_command({"prefix": "auth get-or-create", "entity": "client.alice", "caps": ["mds", "allow r path=/volumes/_nogroup/1f8ee972-fbad-4b31-8005-874e13dc9275/fbd4d206-5fcf-4579-a13d-8e4adc225411", "osd", "allow r pool=manila_data namespace=fsvolumens_1f8ee972-fbad-4b31-8005-874e13dc9275", "mon", "allow r"], "format": "json"} v 0) Dec 15 05:15:23 localhost ceph-mon[298913]: log_channel(audit) log [INF] : from='mgr.34411 ' entity='mgr.np0005559464.aomnqe' cmd={"prefix": "auth get-or-create", "entity": "client.alice", "caps": ["mds", "allow r path=/volumes/_nogroup/1f8ee972-fbad-4b31-8005-874e13dc9275/fbd4d206-5fcf-4579-a13d-8e4adc225411", "osd", "allow r pool=manila_data namespace=fsvolumens_1f8ee972-fbad-4b31-8005-874e13dc9275", "mon", "allow r"], "format": "json"} : dispatch Dec 15 05:15:23 localhost ceph-mon[298913]: log_channel(audit) log [INF] : from='mgr.34411 ' entity='mgr.np0005559464.aomnqe' cmd='[{"prefix": "auth get-or-create", "entity": "client.alice", "caps": ["mds", "allow r 
path=/volumes/_nogroup/1f8ee972-fbad-4b31-8005-874e13dc9275/fbd4d206-5fcf-4579-a13d-8e4adc225411", "osd", "allow r pool=manila_data namespace=fsvolumens_1f8ee972-fbad-4b31-8005-874e13dc9275", "mon", "allow r"], "format": "json"}]': finished Dec 15 05:15:24 localhost ceph-mon[298913]: from='mgr.34411 172.18.0.108:0/929118679' entity='mgr.np0005559464.aomnqe' cmd={"prefix": "auth get", "entity": "client.alice", "format": "json"} : dispatch Dec 15 05:15:24 localhost ceph-mon[298913]: from='mgr.34411 172.18.0.108:0/929118679' entity='mgr.np0005559464.aomnqe' cmd={"prefix": "auth get-or-create", "entity": "client.alice", "caps": ["mds", "allow r path=/volumes/_nogroup/1f8ee972-fbad-4b31-8005-874e13dc9275/fbd4d206-5fcf-4579-a13d-8e4adc225411", "osd", "allow r pool=manila_data namespace=fsvolumens_1f8ee972-fbad-4b31-8005-874e13dc9275", "mon", "allow r"], "format": "json"} : dispatch Dec 15 05:15:24 localhost ceph-mon[298913]: from='mgr.34411 ' entity='mgr.np0005559464.aomnqe' cmd={"prefix": "auth get-or-create", "entity": "client.alice", "caps": ["mds", "allow r path=/volumes/_nogroup/1f8ee972-fbad-4b31-8005-874e13dc9275/fbd4d206-5fcf-4579-a13d-8e4adc225411", "osd", "allow r pool=manila_data namespace=fsvolumens_1f8ee972-fbad-4b31-8005-874e13dc9275", "mon", "allow r"], "format": "json"} : dispatch Dec 15 05:15:24 localhost ceph-mon[298913]: from='mgr.34411 ' entity='mgr.np0005559464.aomnqe' cmd='[{"prefix": "auth get-or-create", "entity": "client.alice", "caps": ["mds", "allow r path=/volumes/_nogroup/1f8ee972-fbad-4b31-8005-874e13dc9275/fbd4d206-5fcf-4579-a13d-8e4adc225411", "osd", "allow r pool=manila_data namespace=fsvolumens_1f8ee972-fbad-4b31-8005-874e13dc9275", "mon", "allow r"], "format": "json"}]': finished Dec 15 05:15:24 localhost nova_compute[286344]: 2025-12-15 10:15:24.956 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:15:26 localhost systemd[1]: Started 
/usr/bin/podman healthcheck run 67ee72fcd99c896f79e8807764b1d0f04e7d45927d06e306c0986394e8ed42a0. Dec 15 05:15:26 localhost systemd[1]: Started /usr/bin/podman healthcheck run 730c50020a7d9e2da21d2462d40af20ac303e43f3491f692cbbd87f432ba3e09. Dec 15 05:15:26 localhost systemd[1]: Started /usr/bin/podman healthcheck run 9f0f481c937606298203797a71924b7614a5d9e3f4a8afd837cfa5b9602aedeb. Dec 15 05:15:26 localhost systemd[1]: Started /usr/bin/podman healthcheck run ac2eae30a2a7f971398af42ea67b3ed799d4b3341f7688ebcd73f7ca7f66c2a8. Dec 15 05:15:26 localhost nova_compute[286344]: 2025-12-15 10:15:26.674 286348 DEBUG oslo_service.periodic_task [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Running periodic task ComputeManager._cleanup_running_deleted_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Dec 15 05:15:26 localhost systemd[1]: Started /usr/bin/podman healthcheck run b3d8483ade539500e228c2af9c0c3e647febb5ec7903610a83aea32631007d4a. Dec 15 05:15:26 localhost systemd[1]: tmp-crun.pkx2cb.mount: Deactivated successfully. 
Dec 15 05:15:26 localhost podman[337458]: 2025-12-15 10:15:26.784796731 +0000 UTC m=+0.105004185 container health_status 67ee72fcd99c896f79e8807764b1d0f04e7d45927d06e306c0986394e8ed42a0 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, managed_by=edpm_ansible, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors ) Dec 15 05:15:26 localhost podman[337459]: 2025-12-15 10:15:26.828636563 +0000 UTC m=+0.150432969 container health_status 730c50020a7d9e2da21d2462d40af20ac303e43f3491f692cbbd87f432ba3e09 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, release=1755695350, summary=Provides the latest release of the minimal Red Hat 
Universal Base Image 9., com.redhat.component=ubi9-minimal-container, config_id=openstack_network_exporter, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'cb1e9478634a00c8fb5ad9cd7fcd38c7b2a1a85187dfaa618bc715c24c2b15fb'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, container_name=openstack_network_exporter, io.buildah.version=1.33.7, version=9.6, io.openshift.expose-services=, architecture=x86_64, managed_by=edpm_ansible, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., distribution-scope=public, build-date=2025-08-20T13:12:41, vendor=Red Hat, Inc., name=ubi9-minimal, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. 
This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., vcs-type=git, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.tags=minimal rhel9) Dec 15 05:15:26 localhost podman[337459]: 2025-12-15 10:15:26.843419142 +0000 UTC m=+0.165215588 container exec_died 730c50020a7d9e2da21d2462d40af20ac303e43f3491f692cbbd87f432ba3e09 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'cb1e9478634a00c8fb5ad9cd7fcd38c7b2a1a85187dfaa618bc715c24c2b15fb'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. 
This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, vendor=Red Hat, Inc., architecture=x86_64, build-date=2025-08-20T13:12:41, managed_by=edpm_ansible, release=1755695350, config_id=openstack_network_exporter, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, maintainer=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.buildah.version=1.33.7, com.redhat.component=ubi9-minimal-container, container_name=openstack_network_exporter, vcs-type=git, distribution-scope=public, name=ubi9-minimal, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., version=9.6) Dec 15 05:15:26 localhost podman[337468]: 2025-12-15 10:15:26.879302781 +0000 UTC m=+0.189140065 container health_status b3d8483ade539500e228c2af9c0c3e647febb5ec7903610a83aea32631007d4a (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ceilometer_agent_compute, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '07f3af15d79276418f54a8dde2fc251064527c3571430e14af77aaa2c01f1193-cb1e9478634a00c8fb5ad9cd7fcd38c7b2a1a85187dfaa618bc715c24c2b15fb'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes 
Operator team, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, managed_by=edpm_ansible) Dec 15 05:15:26 localhost podman[337458]: 2025-12-15 10:15:26.894436589 +0000 UTC m=+0.214644083 container exec_died 67ee72fcd99c896f79e8807764b1d0f04e7d45927d06e306c0986394e8ed42a0 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible) Dec 15 05:15:26 localhost systemd[1]: 730c50020a7d9e2da21d2462d40af20ac303e43f3491f692cbbd87f432ba3e09.service: Deactivated successfully. Dec 15 05:15:26 localhost systemd[1]: 67ee72fcd99c896f79e8807764b1d0f04e7d45927d06e306c0986394e8ed42a0.service: Deactivated successfully. 
Dec 15 05:15:26 localhost nova_compute[286344]: 2025-12-15 10:15:26.966 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:15:26 localhost ceph-mon[298913]: mon.np0005559462@0(leader).osd e277 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104 Dec 15 05:15:27 localhost podman[337460]: 2025-12-15 10:15:27.024640761 +0000 UTC m=+0.340012444 container health_status 9f0f481c937606298203797a71924b7614a5d9e3f4a8afd837cfa5b9602aedeb (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, io.buildah.version=1.41.3, config_id=multipathd, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, tcib_managed=true, maintainer=OpenStack Kubernetes Operator 
team, org.label-schema.name=CentOS Stream 9 Base Image, container_name=multipathd, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202) Dec 15 05:15:27 localhost podman[337460]: 2025-12-15 10:15:27.037274253 +0000 UTC m=+0.352645946 container exec_died 9f0f481c937606298203797a71924b7614a5d9e3f4a8afd837cfa5b9602aedeb (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, tcib_managed=true, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.vendor=CentOS, config_id=multipathd, container_name=multipathd, managed_by=edpm_ansible, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, maintainer=OpenStack 
Kubernetes Operator team, org.label-schema.schema-version=1.0) Dec 15 05:15:27 localhost podman[337468]: 2025-12-15 10:15:27.046921942 +0000 UTC m=+0.356759196 container exec_died b3d8483ade539500e228c2af9c0c3e647febb5ec7903610a83aea32631007d4a (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, container_name=ceilometer_agent_compute, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '07f3af15d79276418f54a8dde2fc251064527c3571430e14af77aaa2c01f1193-cb1e9478634a00c8fb5ad9cd7fcd38c7b2a1a85187dfaa618bc715c24c2b15fb'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, config_id=ceilometer_agent_compute, 
org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS) Dec 15 05:15:27 localhost systemd[1]: 9f0f481c937606298203797a71924b7614a5d9e3f4a8afd837cfa5b9602aedeb.service: Deactivated successfully. Dec 15 05:15:27 localhost systemd[1]: b3d8483ade539500e228c2af9c0c3e647febb5ec7903610a83aea32631007d4a.service: Deactivated successfully. Dec 15 05:15:27 localhost podman[337461]: 2025-12-15 10:15:27.145354579 +0000 UTC m=+0.457782073 container health_status ac2eae30a2a7f971398af42ea67b3ed799d4b3341f7688ebcd73f7ca7f66c2a8 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, tcib_managed=true, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '07f3af15d79276418f54a8dde2fc251064527c3571430e14af77aaa2c01f1193'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS) Dec 15 05:15:27 localhost podman[337461]: 2025-12-15 10:15:27.212709636 
+0000 UTC m=+0.525137120 container exec_died ac2eae30a2a7f971398af42ea67b3ed799d4b3341f7688ebcd73f7ca7f66c2a8 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, managed_by=edpm_ansible, tcib_managed=true, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ovn_controller, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '07f3af15d79276418f54a8dde2fc251064527c3571430e14af77aaa2c01f1193'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS) Dec 15 05:15:27 localhost systemd[1]: ac2eae30a2a7f971398af42ea67b3ed799d4b3341f7688ebcd73f7ca7f66c2a8.service: Deactivated successfully. 
Dec 15 05:15:27 localhost nova_compute[286344]: 2025-12-15 10:15:27.301 286348 DEBUG oslo_service.periodic_task [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Dec 15 05:15:27 localhost nova_compute[286344]: 2025-12-15 10:15:27.302 286348 DEBUG nova.compute.manager [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m Dec 15 05:15:27 localhost ceph-mon[298913]: mon.np0005559462@0(leader) e17 handle_command mon_command({"prefix": "auth rm", "entity": "client.alice"} v 0) Dec 15 05:15:27 localhost ceph-mon[298913]: log_channel(audit) log [INF] : from='mgr.34411 ' entity='mgr.np0005559464.aomnqe' cmd={"prefix": "auth rm", "entity": "client.alice"} : dispatch Dec 15 05:15:27 localhost ceph-mon[298913]: log_channel(audit) log [INF] : from='mgr.34411 ' entity='mgr.np0005559464.aomnqe' cmd='[{"prefix": "auth rm", "entity": "client.alice"}]': finished Dec 15 05:15:27 localhost ceph-mon[298913]: from='mgr.34411 172.18.0.108:0/929118679' entity='mgr.np0005559464.aomnqe' cmd={"prefix": "auth get", "entity": "client.alice", "format": "json"} : dispatch Dec 15 05:15:27 localhost ceph-mon[298913]: from='mgr.34411 172.18.0.108:0/929118679' entity='mgr.np0005559464.aomnqe' cmd={"prefix": "auth rm", "entity": "client.alice"} : dispatch Dec 15 05:15:27 localhost ceph-mon[298913]: from='mgr.34411 ' entity='mgr.np0005559464.aomnqe' cmd={"prefix": "auth rm", "entity": "client.alice"} : dispatch Dec 15 05:15:27 localhost ceph-mon[298913]: from='mgr.34411 ' entity='mgr.np0005559464.aomnqe' cmd='[{"prefix": "auth rm", "entity": "client.alice"}]': finished Dec 15 05:15:28 localhost nova_compute[286344]: 2025-12-15 10:15:28.272 286348 DEBUG oslo_service.periodic_task [None 
req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 15 05:15:29 localhost nova_compute[286344]: 2025-12-15 10:15:29.960 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 15 05:15:30 localhost nova_compute[286344]: 2025-12-15 10:15:30.270 286348 DEBUG oslo_service.periodic_task [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 15 05:15:30 localhost nova_compute[286344]: 2025-12-15 10:15:30.294 286348 DEBUG oslo_concurrency.lockutils [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 15 05:15:30 localhost nova_compute[286344]: 2025-12-15 10:15:30.294 286348 DEBUG oslo_concurrency.lockutils [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 15 05:15:30 localhost nova_compute[286344]: 2025-12-15 10:15:30.294 286348 DEBUG oslo_concurrency.lockutils [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 15 05:15:30 localhost nova_compute[286344]: 2025-12-15 10:15:30.295 286348 DEBUG nova.compute.resource_tracker [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Auditing locally available compute resources for np0005559462.localdomain (node: np0005559462.localdomain) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Dec 15 05:15:30 localhost nova_compute[286344]: 2025-12-15 10:15:30.295 286348 DEBUG oslo_concurrency.processutils [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 15 05:15:30 localhost ceph-mon[298913]: from='mgr.34411 172.18.0.108:0/929118679' entity='mgr.np0005559464.aomnqe' cmd={"prefix": "auth get", "entity": "client.alice_bob", "format": "json"} : dispatch
Dec 15 05:15:30 localhost ceph-mon[298913]: mon.np0005559462@0(leader) e17 handle_command mon_command({"prefix": "auth get-or-create", "entity": "client.alice_bob", "caps": ["mds", "allow rw path=/volumes/_nogroup/1f8ee972-fbad-4b31-8005-874e13dc9275/fbd4d206-5fcf-4579-a13d-8e4adc225411", "osd", "allow rw pool=manila_data namespace=fsvolumens_1f8ee972-fbad-4b31-8005-874e13dc9275", "mon", "allow r"], "format": "json"} v 0)
Dec 15 05:15:30 localhost ceph-mon[298913]: log_channel(audit) log [INF] : from='mgr.34411 ' entity='mgr.np0005559464.aomnqe' cmd={"prefix": "auth get-or-create", "entity": "client.alice_bob", "caps": ["mds", "allow rw path=/volumes/_nogroup/1f8ee972-fbad-4b31-8005-874e13dc9275/fbd4d206-5fcf-4579-a13d-8e4adc225411", "osd", "allow rw pool=manila_data namespace=fsvolumens_1f8ee972-fbad-4b31-8005-874e13dc9275", "mon", "allow r"], "format": "json"} : dispatch
Dec 15 05:15:30 localhost ceph-mon[298913]: log_channel(audit) log [INF] : from='mgr.34411 ' entity='mgr.np0005559464.aomnqe' cmd='[{"prefix": "auth get-or-create", "entity": "client.alice_bob", "caps": ["mds", "allow rw path=/volumes/_nogroup/1f8ee972-fbad-4b31-8005-874e13dc9275/fbd4d206-5fcf-4579-a13d-8e4adc225411", "osd", "allow rw pool=manila_data namespace=fsvolumens_1f8ee972-fbad-4b31-8005-874e13dc9275", "mon", "allow r"], "format": "json"}]': finished
Dec 15 05:15:30 localhost ceph-mon[298913]: mon.np0005559462@0(leader) e17 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Dec 15 05:15:30 localhost ceph-mon[298913]: log_channel(audit) log [DBG] : from='client.? 172.18.0.106:0/696117849' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Dec 15 05:15:30 localhost nova_compute[286344]: 2025-12-15 10:15:30.759 286348 DEBUG oslo_concurrency.processutils [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.463s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 15 05:15:30 localhost nova_compute[286344]: 2025-12-15 10:15:30.834 286348 DEBUG nova.virt.libvirt.driver [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231
Dec 15 05:15:30 localhost nova_compute[286344]: 2025-12-15 10:15:30.834 286348 DEBUG nova.virt.libvirt.driver [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231
Dec 15 05:15:31 localhost nova_compute[286344]: 2025-12-15 10:15:31.006 286348 WARNING nova.virt.libvirt.driver [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Dec 15 05:15:31 localhost nova_compute[286344]: 2025-12-15 10:15:31.008 286348 DEBUG nova.compute.resource_tracker [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Hypervisor/Node resource view: name=np0005559462.localdomain free_ram=11125MB free_disk=41.836978912353516GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Dec 15 05:15:31 localhost nova_compute[286344]: 2025-12-15 10:15:31.008 286348 DEBUG oslo_concurrency.lockutils [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 15 05:15:31 localhost nova_compute[286344]: 2025-12-15 10:15:31.008 286348 DEBUG oslo_concurrency.lockutils [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 15 05:15:31 localhost nova_compute[286344]: 2025-12-15 10:15:31.291 286348 DEBUG nova.compute.resource_tracker [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Instance 39ff1bd9-6f6b-44c8-bbec-a1fd9d196359 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 2, 'MEMORY_MB': 512, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Dec 15 05:15:31 localhost nova_compute[286344]: 2025-12-15 10:15:31.291 286348 DEBUG nova.compute.resource_tracker [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Dec 15 05:15:31 localhost nova_compute[286344]: 2025-12-15 10:15:31.292 286348 DEBUG nova.compute.resource_tracker [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Final resource view: name=np0005559462.localdomain phys_ram=15738MB used_ram=1024MB phys_disk=41GB used_disk=2GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Dec 15 05:15:31 localhost ceph-mon[298913]: from='mgr.34411 172.18.0.108:0/929118679' entity='mgr.np0005559464.aomnqe' cmd={"prefix": "auth get-or-create", "entity": "client.alice_bob", "caps": ["mds", "allow rw path=/volumes/_nogroup/1f8ee972-fbad-4b31-8005-874e13dc9275/fbd4d206-5fcf-4579-a13d-8e4adc225411", "osd", "allow rw pool=manila_data namespace=fsvolumens_1f8ee972-fbad-4b31-8005-874e13dc9275", "mon", "allow r"], "format": "json"} : dispatch
Dec 15 05:15:31 localhost ceph-mon[298913]: from='mgr.34411 ' entity='mgr.np0005559464.aomnqe' cmd={"prefix": "auth get-or-create", "entity": "client.alice_bob", "caps": ["mds", "allow rw path=/volumes/_nogroup/1f8ee972-fbad-4b31-8005-874e13dc9275/fbd4d206-5fcf-4579-a13d-8e4adc225411", "osd", "allow rw pool=manila_data namespace=fsvolumens_1f8ee972-fbad-4b31-8005-874e13dc9275", "mon", "allow r"], "format": "json"} : dispatch
Dec 15 05:15:31 localhost ceph-mon[298913]: from='mgr.34411 ' entity='mgr.np0005559464.aomnqe' cmd='[{"prefix": "auth get-or-create", "entity": "client.alice_bob", "caps": ["mds", "allow rw path=/volumes/_nogroup/1f8ee972-fbad-4b31-8005-874e13dc9275/fbd4d206-5fcf-4579-a13d-8e4adc225411", "osd", "allow rw pool=manila_data namespace=fsvolumens_1f8ee972-fbad-4b31-8005-874e13dc9275", "mon", "allow r"], "format": "json"}]': finished
Dec 15 05:15:31 localhost ceph-mon[298913]: mon.np0005559462@0(leader).osd e277 do_prune osdmap full prune enabled
Dec 15 05:15:31 localhost ceph-mon[298913]: mon.np0005559462@0(leader).osd e278 e278: 6 total, 6 up, 6 in
Dec 15 05:15:31 localhost ceph-mon[298913]: log_channel(cluster) log [DBG] : osdmap e278: 6 total, 6 up, 6 in
Dec 15 05:15:31 localhost nova_compute[286344]: 2025-12-15 10:15:31.642 286348 DEBUG oslo_concurrency.processutils [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 15 05:15:31 localhost podman[243449]: time="2025-12-15T10:15:31Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Dec 15 05:15:31 localhost podman[243449]: @ - - [15/Dec/2025:10:15:31 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 156640 "" "Go-http-client/1.1"
Dec 15 05:15:31 localhost podman[243449]: @ - - [15/Dec/2025:10:15:31 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 19271 "" "Go-http-client/1.1"
Dec 15 05:15:31 localhost nova_compute[286344]: 2025-12-15 10:15:31.971 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 15 05:15:31 localhost ceph-mon[298913]: mon.np0005559462@0(leader).osd e278 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Dec 15 05:15:32 localhost ceph-mon[298913]: mon.np0005559462@0(leader) e17 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Dec 15 05:15:32 localhost ceph-mon[298913]: log_channel(audit) log [DBG] : from='client.? 172.18.0.106:0/975745483' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Dec 15 05:15:32 localhost nova_compute[286344]: 2025-12-15 10:15:32.080 286348 DEBUG oslo_concurrency.processutils [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.438s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 15 05:15:32 localhost nova_compute[286344]: 2025-12-15 10:15:32.086 286348 DEBUG nova.compute.provider_tree [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Inventory has not changed in ProviderTree for provider: 26c8956b-6742-4951-b566-971b9bbe323b update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Dec 15 05:15:32 localhost nova_compute[286344]: 2025-12-15 10:15:32.110 286348 DEBUG nova.scheduler.client.report [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Inventory has not changed for provider 26c8956b-6742-4951-b566-971b9bbe323b based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Dec 15 05:15:32 localhost nova_compute[286344]: 2025-12-15 10:15:32.113 286348 DEBUG nova.compute.resource_tracker [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Compute_service record updated for np0005559462.localdomain:np0005559462.localdomain _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Dec 15 05:15:32 localhost nova_compute[286344]: 2025-12-15 10:15:32.113 286348 DEBUG oslo_concurrency.lockutils [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 1.105s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 15 05:15:32 localhost systemd[1]: Started /usr/bin/podman healthcheck run 4d46c406a3855fd1c322e5d86c048188ea5865daa07ba23aaebacc7879668923.
Dec 15 05:15:32 localhost podman[337610]: 2025-12-15 10:15:32.746277756 +0000 UTC m=+0.080966735 container health_status 4d46c406a3855fd1c322e5d86c048188ea5865daa07ba23aaebacc7879668923 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_metadata_agent, config_id=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, managed_by=edpm_ansible, org.label-schema.build-date=20251202, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '07f3af15d79276418f54a8dde2fc251064527c3571430e14af77aaa2c01f1193-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0)
Dec 15 05:15:32 localhost podman[337610]: 2025-12-15 10:15:32.757661373 +0000 UTC m=+0.092350342 container exec_died 4d46c406a3855fd1c322e5d86c048188ea5865daa07ba23aaebacc7879668923 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '07f3af15d79276418f54a8dde2fc251064527c3571430e14af77aaa2c01f1193-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251202)
Dec 15 05:15:32 localhost systemd[1]: 4d46c406a3855fd1c322e5d86c048188ea5865daa07ba23aaebacc7879668923.service: Deactivated successfully.
Dec 15 05:15:33 localhost nova_compute[286344]: 2025-12-15 10:15:33.270 286348 DEBUG oslo_service.periodic_task [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 15 05:15:33 localhost nova_compute[286344]: 2025-12-15 10:15:33.270 286348 DEBUG oslo_service.periodic_task [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 15 05:15:33 localhost nova_compute[286344]: 2025-12-15 10:15:33.271 286348 DEBUG nova.compute.manager [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Cleaning up deleted instances with incomplete migration _cleanup_incomplete_migrations /usr/lib/python3.9/site-packages/nova/compute/manager.py:11183
Dec 15 05:15:33 localhost ceph-mon[298913]: mon.np0005559462@0(leader) e17 handle_command mon_command({"prefix": "auth rm", "entity": "client.alice_bob"} v 0)
Dec 15 05:15:33 localhost ceph-mon[298913]: log_channel(audit) log [INF] : from='mgr.34411 ' entity='mgr.np0005559464.aomnqe' cmd={"prefix": "auth rm", "entity": "client.alice_bob"} : dispatch
Dec 15 05:15:33 localhost ceph-mon[298913]: log_channel(audit) log [INF] : from='mgr.34411 ' entity='mgr.np0005559464.aomnqe' cmd='[{"prefix": "auth rm", "entity": "client.alice_bob"}]': finished
Dec 15 05:15:34 localhost ceph-mon[298913]: from='mgr.34411 172.18.0.108:0/929118679' entity='mgr.np0005559464.aomnqe' cmd={"prefix": "auth get", "entity": "client.alice_bob", "format": "json"} : dispatch
Dec 15 05:15:34 localhost ceph-mon[298913]: from='mgr.34411 172.18.0.108:0/929118679' entity='mgr.np0005559464.aomnqe' cmd={"prefix": "auth rm", "entity": "client.alice_bob"} : dispatch
Dec 15 05:15:34 localhost ceph-mon[298913]: from='mgr.34411 ' entity='mgr.np0005559464.aomnqe' cmd={"prefix": "auth rm", "entity": "client.alice_bob"} : dispatch
Dec 15 05:15:34 localhost ceph-mon[298913]: from='mgr.34411 ' entity='mgr.np0005559464.aomnqe' cmd='[{"prefix": "auth rm", "entity": "client.alice_bob"}]': finished
Dec 15 05:15:34 localhost nova_compute[286344]: 2025-12-15 10:15:34.287 286348 DEBUG oslo_service.periodic_task [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 15 05:15:34 localhost nova_compute[286344]: 2025-12-15 10:15:34.288 286348 DEBUG nova.compute.manager [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11145
Dec 15 05:15:34 localhost nova_compute[286344]: 2025-12-15 10:15:34.303 286348 DEBUG nova.compute.manager [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11154
Dec 15 05:15:34 localhost openstack_network_exporter[246484]: ERROR 10:15:34 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Dec 15 05:15:34 localhost openstack_network_exporter[246484]: ERROR 10:15:34 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Dec 15 05:15:34 localhost openstack_network_exporter[246484]:
Dec 15 05:15:34 localhost openstack_network_exporter[246484]: ERROR 10:15:34 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Dec 15 05:15:34 localhost openstack_network_exporter[246484]:
Dec 15 05:15:34 localhost openstack_network_exporter[246484]: ERROR 10:15:34 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 15 05:15:34 localhost openstack_network_exporter[246484]: ERROR 10:15:34 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 15 05:15:34 localhost nova_compute[286344]: 2025-12-15 10:15:34.963 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 15 05:15:35 localhost ceph-mon[298913]: mon.np0005559462@0(leader) e17 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0)
Dec 15 05:15:35 localhost ceph-mon[298913]: log_channel(audit) log [DBG] : from='client.25625 172.18.0.34:0/382777224' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Dec 15 05:15:36 localhost nova_compute[286344]: 2025-12-15 10:15:36.973 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 15 05:15:37 localhost ceph-mon[298913]: mon.np0005559462@0(leader).osd e278 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Dec 15 05:15:37 localhost ceph-mon[298913]: mon.np0005559462@0(leader) e17 handle_command mon_command({"prefix": "auth get-or-create", "entity": "client.alice_bob", "caps": ["mds", "allow r path=/volumes/_nogroup/1f8ee972-fbad-4b31-8005-874e13dc9275/fbd4d206-5fcf-4579-a13d-8e4adc225411", "osd", "allow r pool=manila_data namespace=fsvolumens_1f8ee972-fbad-4b31-8005-874e13dc9275", "mon", "allow r"], "format": "json"} v 0)
Dec 15 05:15:37 localhost ceph-mon[298913]: log_channel(audit) log [INF] : from='mgr.34411 ' entity='mgr.np0005559464.aomnqe' cmd={"prefix": "auth get-or-create", "entity": "client.alice_bob", "caps": ["mds", "allow r path=/volumes/_nogroup/1f8ee972-fbad-4b31-8005-874e13dc9275/fbd4d206-5fcf-4579-a13d-8e4adc225411", "osd", "allow r pool=manila_data namespace=fsvolumens_1f8ee972-fbad-4b31-8005-874e13dc9275", "mon", "allow r"], "format": "json"} : dispatch
Dec 15 05:15:37 localhost ceph-mon[298913]: log_channel(audit) log [INF] : from='mgr.34411 ' entity='mgr.np0005559464.aomnqe' cmd='[{"prefix": "auth get-or-create", "entity": "client.alice_bob", "caps": ["mds", "allow r path=/volumes/_nogroup/1f8ee972-fbad-4b31-8005-874e13dc9275/fbd4d206-5fcf-4579-a13d-8e4adc225411", "osd", "allow r pool=manila_data namespace=fsvolumens_1f8ee972-fbad-4b31-8005-874e13dc9275", "mon", "allow r"], "format": "json"}]': finished
Dec 15 05:15:38 localhost ceph-mon[298913]: from='mgr.34411 172.18.0.108:0/929118679' entity='mgr.np0005559464.aomnqe' cmd={"prefix": "auth get", "entity": "client.alice_bob", "format": "json"} : dispatch
Dec 15 05:15:38 localhost ceph-mon[298913]: from='mgr.34411 172.18.0.108:0/929118679' entity='mgr.np0005559464.aomnqe' cmd={"prefix": "auth get-or-create", "entity": "client.alice_bob", "caps": ["mds", "allow r path=/volumes/_nogroup/1f8ee972-fbad-4b31-8005-874e13dc9275/fbd4d206-5fcf-4579-a13d-8e4adc225411", "osd", "allow r pool=manila_data namespace=fsvolumens_1f8ee972-fbad-4b31-8005-874e13dc9275", "mon", "allow r"], "format": "json"} : dispatch
Dec 15 05:15:38 localhost ceph-mon[298913]: from='mgr.34411 ' entity='mgr.np0005559464.aomnqe' cmd={"prefix": "auth get-or-create", "entity": "client.alice_bob", "caps": ["mds", "allow r path=/volumes/_nogroup/1f8ee972-fbad-4b31-8005-874e13dc9275/fbd4d206-5fcf-4579-a13d-8e4adc225411", "osd", "allow r pool=manila_data namespace=fsvolumens_1f8ee972-fbad-4b31-8005-874e13dc9275", "mon", "allow r"], "format": "json"} : dispatch
Dec 15 05:15:38 localhost ceph-mon[298913]: from='mgr.34411 ' entity='mgr.np0005559464.aomnqe' cmd='[{"prefix": "auth get-or-create", "entity": "client.alice_bob", "caps": ["mds", "allow r path=/volumes/_nogroup/1f8ee972-fbad-4b31-8005-874e13dc9275/fbd4d206-5fcf-4579-a13d-8e4adc225411", "osd", "allow r pool=manila_data namespace=fsvolumens_1f8ee972-fbad-4b31-8005-874e13dc9275", "mon", "allow r"], "format": "json"}]': finished
Dec 15 05:15:39 localhost systemd[1]: Started /usr/bin/podman healthcheck run a4e94bd7aaee6c174c601060fb5f34559019ccea93fc397263ee3735731cfc1e.
Dec 15 05:15:39 localhost systemd[1]: tmp-crun.fIRhQT.mount: Deactivated successfully.
Dec 15 05:15:39 localhost podman[337628]: 2025-12-15 10:15:39.74511808 +0000 UTC m=+0.079048144 container health_status a4e94bd7aaee6c174c601060fb5f34559019ccea93fc397263ee3735731cfc1e (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, container_name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter)
Dec 15 05:15:39 localhost podman[337628]: 2025-12-15 10:15:39.758407138 +0000 UTC m=+0.092337202 container exec_died a4e94bd7aaee6c174c601060fb5f34559019ccea93fc397263ee3735731cfc1e (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, managed_by=edpm_ansible, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi )
Dec 15 05:15:39 localhost systemd[1]: a4e94bd7aaee6c174c601060fb5f34559019ccea93fc397263ee3735731cfc1e.service: Deactivated successfully.
Dec 15 05:15:39 localhost nova_compute[286344]: 2025-12-15 10:15:39.989 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 15 05:15:40 localhost ceph-mon[298913]: mon.np0005559462@0(leader) e17 handle_command mon_command({"prefix": "auth rm", "entity": "client.alice_bob"} v 0)
Dec 15 05:15:40 localhost ceph-mon[298913]: log_channel(audit) log [INF] : from='mgr.34411 ' entity='mgr.np0005559464.aomnqe' cmd={"prefix": "auth rm", "entity": "client.alice_bob"} : dispatch
Dec 15 05:15:40 localhost ceph-mon[298913]: log_channel(audit) log [INF] : from='mgr.34411 ' entity='mgr.np0005559464.aomnqe' cmd='[{"prefix": "auth rm", "entity": "client.alice_bob"}]': finished
Dec 15 05:15:41 localhost ceph-mon[298913]: from='mgr.34411 172.18.0.108:0/929118679' entity='mgr.np0005559464.aomnqe' cmd={"prefix": "auth get", "entity": "client.alice_bob", "format": "json"} : dispatch
Dec 15 05:15:41 localhost ceph-mon[298913]: from='mgr.34411 172.18.0.108:0/929118679' entity='mgr.np0005559464.aomnqe' cmd={"prefix": "auth rm", "entity": "client.alice_bob"} : dispatch
Dec 15 05:15:41 localhost ceph-mon[298913]: from='mgr.34411 ' entity='mgr.np0005559464.aomnqe' cmd={"prefix": "auth rm", "entity": "client.alice_bob"} : dispatch
Dec 15 05:15:41 localhost ceph-mon[298913]: from='mgr.34411 ' entity='mgr.np0005559464.aomnqe' cmd='[{"prefix": "auth rm", "entity": "client.alice_bob"}]': finished
Dec 15 05:15:42 localhost ceph-mon[298913]: mon.np0005559462@0(leader).osd e278 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Dec 15 05:15:42 localhost ceph-mon[298913]: mon.np0005559462@0(leader).osd e278 do_prune osdmap full prune enabled
Dec 15 05:15:42 localhost nova_compute[286344]: 2025-12-15 10:15:42.012 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 15 05:15:42 localhost ceph-mon[298913]: mon.np0005559462@0(leader).osd e279 e279: 6 total, 6 up, 6 in
Dec 15 05:15:42 localhost ceph-mon[298913]: log_channel(cluster) log [DBG] : osdmap e279: 6 total, 6 up, 6 in
Dec 15 05:15:42 localhost ceph-mon[298913]: log_channel(cluster) log [DBG] : mgrmap e55: np0005559464.aomnqe(active, since 18m), standbys: np0005559462.fudvyx, np0005559463.daptkf
Dec 15 05:15:43 localhost ceph-mon[298913]: mon.np0005559462@0(leader) e17 handle_command mon_command({"prefix": "auth get-or-create", "entity": "client.alice bob", "caps": ["mds", "allow rw path=/volumes/_nogroup/1f8ee972-fbad-4b31-8005-874e13dc9275/fbd4d206-5fcf-4579-a13d-8e4adc225411", "osd", "allow rw pool=manila_data namespace=fsvolumens_1f8ee972-fbad-4b31-8005-874e13dc9275", "mon", "allow r"], "format": "json"} v 0)
Dec 15 05:15:43 localhost ceph-mon[298913]: log_channel(audit) log [INF] : from='mgr.34411 ' entity='mgr.np0005559464.aomnqe' cmd={"prefix": "auth get-or-create", "entity": "client.alice bob", "caps": ["mds", "allow rw path=/volumes/_nogroup/1f8ee972-fbad-4b31-8005-874e13dc9275/fbd4d206-5fcf-4579-a13d-8e4adc225411", "osd", "allow rw pool=manila_data namespace=fsvolumens_1f8ee972-fbad-4b31-8005-874e13dc9275", "mon", "allow r"], "format": "json"} : dispatch
Dec 15 05:15:43 localhost ceph-mon[298913]: log_channel(audit) log [INF] : from='mgr.34411 ' entity='mgr.np0005559464.aomnqe' cmd='[{"prefix": "auth get-or-create", "entity": "client.alice bob", "caps": ["mds", "allow rw path=/volumes/_nogroup/1f8ee972-fbad-4b31-8005-874e13dc9275/fbd4d206-5fcf-4579-a13d-8e4adc225411", "osd", "allow rw pool=manila_data namespace=fsvolumens_1f8ee972-fbad-4b31-8005-874e13dc9275", "mon", "allow r"], "format": "json"}]': finished
Dec 15 05:15:44 localhost ceph-mon[298913]: from='mgr.34411 172.18.0.108:0/929118679' entity='mgr.np0005559464.aomnqe' cmd={"prefix": "auth get", "entity": "client.alice bob", "format": "json"} : dispatch
Dec 15 05:15:44 localhost ceph-mon[298913]: from='mgr.34411 172.18.0.108:0/929118679' entity='mgr.np0005559464.aomnqe' cmd={"prefix": "auth get-or-create", "entity": "client.alice bob", "caps": ["mds", "allow rw path=/volumes/_nogroup/1f8ee972-fbad-4b31-8005-874e13dc9275/fbd4d206-5fcf-4579-a13d-8e4adc225411", "osd", "allow rw pool=manila_data namespace=fsvolumens_1f8ee972-fbad-4b31-8005-874e13dc9275", "mon", "allow r"], "format": "json"} : dispatch
Dec 15 05:15:44 localhost ceph-mon[298913]: from='mgr.34411 ' entity='mgr.np0005559464.aomnqe' cmd={"prefix": "auth get-or-create", "entity": "client.alice bob", "caps": ["mds", "allow rw path=/volumes/_nogroup/1f8ee972-fbad-4b31-8005-874e13dc9275/fbd4d206-5fcf-4579-a13d-8e4adc225411", "osd", "allow rw pool=manila_data namespace=fsvolumens_1f8ee972-fbad-4b31-8005-874e13dc9275", "mon", "allow r"], "format": "json"} : dispatch
Dec 15 05:15:44 localhost ceph-mon[298913]: from='mgr.34411 ' entity='mgr.np0005559464.aomnqe' cmd='[{"prefix": "auth get-or-create", "entity": "client.alice bob", "caps": ["mds", "allow rw path=/volumes/_nogroup/1f8ee972-fbad-4b31-8005-874e13dc9275/fbd4d206-5fcf-4579-a13d-8e4adc225411", "osd", "allow rw pool=manila_data namespace=fsvolumens_1f8ee972-fbad-4b31-8005-874e13dc9275", "mon", "allow r"], "format": "json"}]': finished
Dec 15 05:15:45 localhost nova_compute[286344]: 2025-12-15 10:15:45.035 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 15 05:15:46 localhost nova_compute[286344]: 2025-12-15 10:15:46.271 286348 DEBUG oslo_service.periodic_task [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 15 05:15:46 localhost nova_compute[286344]: 2025-12-15 10:15:46.677 286348 DEBUG oslo_service.periodic_task [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Running periodic task ComputeManager._sync_power_states run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 15 05:15:46 localhost nova_compute[286344]: 2025-12-15 10:15:46.744 286348 DEBUG nova.compute.manager [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Triggering sync for uuid 39ff1bd9-6f6b-44c8-bbec-a1fd9d196359 _sync_power_states /usr/lib/python3.9/site-packages/nova/compute/manager.py:10268
Dec 15 05:15:46 localhost nova_compute[286344]: 2025-12-15 10:15:46.745 286348 DEBUG oslo_concurrency.lockutils [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Acquiring lock "39ff1bd9-6f6b-44c8-bbec-a1fd9d196359" by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 15 05:15:46 localhost nova_compute[286344]: 2025-12-15 10:15:46.746 286348 DEBUG oslo_concurrency.lockutils [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Lock "39ff1bd9-6f6b-44c8-bbec-a1fd9d196359" acquired by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 15 05:15:46 localhost nova_compute[286344]: 2025-12-15 10:15:46.778 286348 DEBUG oslo_concurrency.lockutils [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Lock "39ff1bd9-6f6b-44c8-bbec-a1fd9d196359" "released" by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" :: held 0.032s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 15 05:15:47 localhost nova_compute[286344]: 2025-12-15 10:15:47.015 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 15 05:15:47 localhost ceph-mon[298913]: mon.np0005559462@0(leader).osd e279 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Dec 15 05:15:47 localhost ceph-mon[298913]: mon.np0005559462@0(leader) e17 handle_command mon_command({"prefix": "auth rm", "entity": "client.alice bob"} v 0)
Dec 15 05:15:47 localhost ceph-mon[298913]: log_channel(audit) log [INF] : from='mgr.34411 ' entity='mgr.np0005559464.aomnqe' cmd={"prefix": "auth rm", "entity": "client.alice bob"} : dispatch
Dec 15 05:15:47 localhost ceph-mon[298913]: log_channel(audit) log [INF] : from='mgr.34411 ' entity='mgr.np0005559464.aomnqe' cmd='[{"prefix": "auth rm", "entity": "client.alice bob"}]': finished
Dec 15 05:15:47 localhost ceph-mon[298913]: mon.np0005559462@0(leader).osd e279 do_prune osdmap full prune enabled
Dec 15 05:15:47 localhost ceph-mon[298913]: from='mgr.34411 172.18.0.108:0/929118679' entity='mgr.np0005559464.aomnqe' cmd={"prefix": "auth get", "entity": "client.alice bob", "format": "json"} : dispatch
Dec 15 05:15:47 localhost ceph-mon[298913]: from='mgr.34411 172.18.0.108:0/929118679' entity='mgr.np0005559464.aomnqe' cmd={"prefix": "auth rm", "entity": "client.alice bob"} : dispatch
Dec 15 05:15:47 localhost ceph-mon[298913]: from='mgr.34411 ' entity='mgr.np0005559464.aomnqe' cmd={"prefix": "auth rm", "entity": "client.alice bob"} : dispatch
Dec 15 05:15:47 localhost ceph-mon[298913]: from='mgr.34411 ' entity='mgr.np0005559464.aomnqe' cmd='[{"prefix": "auth rm", "entity": "client.alice bob"}]': finished
Dec 15 05:15:47 localhost ceph-mon[298913]: mon.np0005559462@0(leader).osd e280 e280: 6 total, 6 up, 6 in
Dec 15 05:15:47 localhost ceph-mon[298913]: log_channel(cluster) log [DBG] : osdmap e280: 6 total, 6 up, 6 in
Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.127 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359', 'name': 'test', 'flavor': {'id': '2da0e147-aaa7-4bb9-a176-5fe1b15a32a0',
'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'image': {'id': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-00000002', 'OS-EXT-SRV-ATTR:host': 'np0005559462.localdomain', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': 'c785bf23f53946bc99867d8832a50266', 'user_id': '1ba5fce347b64bfebf995f187193f205', 'hostId': 'bf4d507aa00b3fcf643a7e0b883420632ac7fc96e8c7a756982e121d', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228 Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.129 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.requests in the context of pollsters Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.161 12 DEBUG ceilometer.compute.pollsters [-] 39ff1bd9-6f6b-44c8-bbec-a1fd9d196359/disk.device.read.requests volume: 1272 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.163 12 DEBUG ceilometer.compute.pollsters [-] 39ff1bd9-6f6b-44c8-bbec-a1fd9d196359/disk.device.read.requests volume: 124 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.165 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': '90f57096-9d6c-4766-a1b3-b73b4f64fcd4', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1272, 'user_id': '1ba5fce347b64bfebf995f187193f205', 'user_name': None, 'project_id': 'c785bf23f53946bc99867d8832a50266', 'project_name': None, 'resource_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359-vda', 'timestamp': '2025-12-15T10:15:48.129813', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359', 'instance_type': 'm1.small', 'host': 'bf4d507aa00b3fcf643a7e0b883420632ac7fc96e8c7a756982e121d', 'instance_host': 'np0005559462.localdomain', 'flavor': {'id': '2da0e147-aaa7-4bb9-a176-5fe1b15a32a0', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e'}, 'image_ref': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '06903dd4-d99f-11f0-817e-fa163ebaca0f', 'monotonic_time': 12814.322657674, 'message_signature': 'c390320e18923829d564473b33576d7fe06346a5ada608921d37d6eae442f148'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 124, 'user_id': '1ba5fce347b64bfebf995f187193f205', 'user_name': None, 'project_id': 'c785bf23f53946bc99867d8832a50266', 'project_name': None, 'resource_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359-vdb', 'timestamp': '2025-12-15T10:15:48.129813', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 
'39ff1bd9-6f6b-44c8-bbec-a1fd9d196359', 'instance_type': 'm1.small', 'host': 'bf4d507aa00b3fcf643a7e0b883420632ac7fc96e8c7a756982e121d', 'instance_host': 'np0005559462.localdomain', 'flavor': {'id': '2da0e147-aaa7-4bb9-a176-5fe1b15a32a0', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e'}, 'image_ref': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '06905df0-d99f-11f0-817e-fa163ebaca0f', 'monotonic_time': 12814.322657674, 'message_signature': 'eb1c0033357a3f0d897397e22f81cfecea8893f6c42fe36727029f1ce3d50f5b'}]}, 'timestamp': '2025-12-15 10:15:48.164017', '_unique_id': '7d05fd8108eb4e37a9706b71a83435ac'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.165 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.165 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.165 12 ERROR oslo_messaging.notify.messaging yield Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.165 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.165 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.165 12 ERROR 
oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.165 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.165 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.165 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.165 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.165 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.165 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.165 12 ERROR oslo_messaging.notify.messaging conn.connect() Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.165 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.165 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.165 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Dec 15 
05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.165 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.165 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.165 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.165 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.165 12 ERROR oslo_messaging.notify.messaging Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.165 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.165 12 ERROR oslo_messaging.notify.messaging Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.165 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.165 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.165 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.165 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Dec 15 05:15:48 localhost 
ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.165 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.165 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.165 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.165 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.165 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.165 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.165 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.165 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.165 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.165 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, 
in get Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.165 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.165 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.165 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.165 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.165 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.165 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.165 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.165 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.165 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.165 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 15 05:15:48 localhost 
ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.165 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.165 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.165 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.165 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.165 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.165 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.165 12 ERROR oslo_messaging.notify.messaging Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.169 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no new resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193 Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.169 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.latency in the context of pollsters Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.169 12 DEBUG ceilometer.compute.pollsters [-] 39ff1bd9-6f6b-44c8-bbec-a1fd9d196359/disk.device.write.latency volume: 1243487016 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 15 05:15:48 localhost 
ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.169 12 DEBUG ceilometer.compute.pollsters [-] 39ff1bd9-6f6b-44c8-bbec-a1fd9d196359/disk.device.write.latency volume: 24779175 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.171 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '2839a5be-9bc0-43c6-8f61-52e8281106b1', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 1243487016, 'user_id': '1ba5fce347b64bfebf995f187193f205', 'user_name': None, 'project_id': 'c785bf23f53946bc99867d8832a50266', 'project_name': None, 'resource_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359-vda', 'timestamp': '2025-12-15T10:15:48.169415', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359', 'instance_type': 'm1.small', 'host': 'bf4d507aa00b3fcf643a7e0b883420632ac7fc96e8c7a756982e121d', 'instance_host': 'np0005559462.localdomain', 'flavor': {'id': '2da0e147-aaa7-4bb9-a176-5fe1b15a32a0', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e'}, 'image_ref': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '06914512-d99f-11f0-817e-fa163ebaca0f', 'monotonic_time': 12814.322657674, 'message_signature': '0f3268f159dc323a924e24f918b696c074c9a2e2b083bab7c86f783c5e436659'}, {'source': 'openstack', 'counter_name': 
'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 24779175, 'user_id': '1ba5fce347b64bfebf995f187193f205', 'user_name': None, 'project_id': 'c785bf23f53946bc99867d8832a50266', 'project_name': None, 'resource_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359-vdb', 'timestamp': '2025-12-15T10:15:48.169415', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359', 'instance_type': 'm1.small', 'host': 'bf4d507aa00b3fcf643a7e0b883420632ac7fc96e8c7a756982e121d', 'instance_host': 'np0005559462.localdomain', 'flavor': {'id': '2da0e147-aaa7-4bb9-a176-5fe1b15a32a0', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e'}, 'image_ref': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '0691569c-d99f-11f0-817e-fa163ebaca0f', 'monotonic_time': 12814.322657674, 'message_signature': '936940d3902b2b4778429935fc0703dd84001899a856fb0e090f4c640bd60303'}]}, 'timestamp': '2025-12-15 10:15:48.170309', '_unique_id': 'ecd7f540110d41ae8196beef4431e978'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.171 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.171 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.171 12 ERROR oslo_messaging.notify.messaging yield Dec 15 05:15:48 localhost 
ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.171 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.171 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.171 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.171 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.171 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.171 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.171 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.171 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.171 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.171 12 ERROR oslo_messaging.notify.messaging conn.connect() Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.171 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.171 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.171 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.171 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.171 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.171 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.171 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.171 12 ERROR oslo_messaging.notify.messaging Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.171 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.171 12 ERROR oslo_messaging.notify.messaging Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.171 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.171 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Dec 15 05:15:48 localhost 
ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.171 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.171 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.171 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.171 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.171 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.171 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.171 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.171 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.171 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.171 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.171 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.171 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.171 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.171 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.171 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.171 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.171 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.171 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.171 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.171 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.171 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.171 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.171 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.171 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.171 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.171 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.171 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.171 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.171 12 ERROR oslo_messaging.notify.messaging
Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.172 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.error in the context of pollsters
Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.176 12 DEBUG ceilometer.compute.pollsters [-] 39ff1bd9-6f6b-44c8-bbec-a1fd9d196359/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.177 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '4b38b1cb-0227-4ad1-ae39-67ecd19843d6', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '1ba5fce347b64bfebf995f187193f205', 'user_name': None, 'project_id': 'c785bf23f53946bc99867d8832a50266', 'project_name': None, 'resource_id': 'instance-00000002-39ff1bd9-6f6b-44c8-bbec-a1fd9d196359-tap03ef8889-32', 'timestamp': '2025-12-15T10:15:48.172711', 'resource_metadata': {'display_name': 'test', 'name': 'tap03ef8889-32', 'instance_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359', 'instance_type': 'm1.small', 'host': 'bf4d507aa00b3fcf643a7e0b883420632ac7fc96e8c7a756982e121d', 'instance_host': 'np0005559462.localdomain', 'flavor': {'id': '2da0e147-aaa7-4bb9-a176-5fe1b15a32a0', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e'}, 'image_ref': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:74:df:7c', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap03ef8889-32'}, 'message_id': '06925a38-d99f-11f0-817e-fa163ebaca0f', 'monotonic_time': 12814.365425079, 'message_signature': '81e9387790e7c10de6abf3839882f1e236639b70f18b8fe84c5812f0d4aa9347'}]}, 'timestamp': '2025-12-15 10:15:48.177014', '_unique_id': '7cd3982d2b6746a1b0ae458168ca7cd6'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.177 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.177 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.177 12 ERROR oslo_messaging.notify.messaging yield
Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.177 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.177 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.177 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.177 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.177 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.177 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.177 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.177 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.177 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.177 12 ERROR oslo_messaging.notify.messaging conn.connect()
Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.177 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.177 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.177 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.177 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.177 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.177 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.177 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.177 12 ERROR oslo_messaging.notify.messaging
Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.177 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.177 12 ERROR oslo_messaging.notify.messaging
Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.177 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.177 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.177 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.177 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.177 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.177 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.177 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.177 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.177 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.177 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.177 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.177 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.177 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.177 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.177 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.177 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.177 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.177 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.177 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.177 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.177 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.177 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.177 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.177 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.177 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.177 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.177 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.177 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.177 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.177 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.177 12 ERROR oslo_messaging.notify.messaging
Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.179 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no new resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.179 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.allocation in the context of pollsters
Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.190 12 DEBUG ceilometer.compute.pollsters [-] 39ff1bd9-6f6b-44c8-bbec-a1fd9d196359/disk.device.allocation volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.190 12 DEBUG ceilometer.compute.pollsters [-] 39ff1bd9-6f6b-44c8-bbec-a1fd9d196359/disk.device.allocation volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.192 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '3a8ffd6b-da98-41d3-bfe6-ce64214f5e8a', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '1ba5fce347b64bfebf995f187193f205', 'user_name': None, 'project_id': 'c785bf23f53946bc99867d8832a50266', 'project_name': None, 'resource_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359-vda', 'timestamp': '2025-12-15T10:15:48.179606', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359', 'instance_type': 'm1.small', 'host': 'bf4d507aa00b3fcf643a7e0b883420632ac7fc96e8c7a756982e121d', 'instance_host': 'np0005559462.localdomain', 'flavor': {'id': '2da0e147-aaa7-4bb9-a176-5fe1b15a32a0', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e'}, 'image_ref': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '06946c24-d99f-11f0-817e-fa163ebaca0f', 'monotonic_time': 12814.372276003, 'message_signature': '3bd671b63cda2ced07081303f240e7f07bd1276c01b1231c29c644719b95ff5d'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '1ba5fce347b64bfebf995f187193f205', 'user_name': None, 'project_id': 'c785bf23f53946bc99867d8832a50266', 'project_name': None, 'resource_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359-vdb', 'timestamp': '2025-12-15T10:15:48.179606', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359', 'instance_type': 'm1.small', 'host': 'bf4d507aa00b3fcf643a7e0b883420632ac7fc96e8c7a756982e121d', 'instance_host': 'np0005559462.localdomain', 'flavor': {'id': '2da0e147-aaa7-4bb9-a176-5fe1b15a32a0', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e'}, 'image_ref': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '06947d0e-d99f-11f0-817e-fa163ebaca0f', 'monotonic_time': 12814.372276003, 'message_signature': '32e9780487038db6499e6d8b866358260fa25ed073ac919494ca32037a203cc7'}]}, 'timestamp': '2025-12-15 10:15:48.190957', '_unique_id': 'fa578bdcc3d742f696f8ac4ff08c4fd5'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.192 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.192 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.192 12 ERROR oslo_messaging.notify.messaging yield
Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.192 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.192 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.192 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.192 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.192 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.192 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.192 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.192 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.192 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.192 12 ERROR oslo_messaging.notify.messaging conn.connect()
Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.192 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.192 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.192 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.192 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.192 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.192 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.192 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.192 12 ERROR oslo_messaging.notify.messaging
Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.192 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.192 12 ERROR oslo_messaging.notify.messaging
Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.192 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.192 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.192 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.192 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.192 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.192 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.192 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.192 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.192 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.192 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.192 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.192 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.192 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.192 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.192 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.192 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.192 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.192 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.192 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.192 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.192 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.192 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.192 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.192 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.192 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.192 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.192 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.192 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.192 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.192 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.192 12 ERROR oslo_messaging.notify.messaging
Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.193 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.bytes in the context of pollsters
Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.193 12 DEBUG ceilometer.compute.pollsters [-] 39ff1bd9-6f6b-44c8-bbec-a1fd9d196359/disk.device.read.bytes volume: 35560448 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.193 12 DEBUG ceilometer.compute.pollsters [-] 39ff1bd9-6f6b-44c8-bbec-a1fd9d196359/disk.device.read.bytes volume: 2154496 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.195 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '70dd3420-c642-4d97-9201-78eea9aa4cab', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 35560448, 'user_id': '1ba5fce347b64bfebf995f187193f205', 'user_name': None, 'project_id': 'c785bf23f53946bc99867d8832a50266', 'project_name': None, 'resource_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359-vda', 'timestamp': '2025-12-15T10:15:48.193290', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359', 'instance_type': 'm1.small', 'host': 'bf4d507aa00b3fcf643a7e0b883420632ac7fc96e8c7a756982e121d', 'instance_host': 'np0005559462.localdomain', 'flavor': {'id': '2da0e147-aaa7-4bb9-a176-5fe1b15a32a0', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e'}, 'image_ref': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '0694ebf4-d99f-11f0-817e-fa163ebaca0f', 'monotonic_time': 12814.322657674, 'message_signature': '86e6527d0e1f845b8770d5cca57481fe38bef8644d530d97bf9c12c69b5691fb'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 2154496, 'user_id': '1ba5fce347b64bfebf995f187193f205', 'user_name': None, 'project_id': 'c785bf23f53946bc99867d8832a50266', 'project_name': None, 'resource_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359-vdb', 'timestamp': '2025-12-15T10:15:48.193290', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359', 'instance_type': 'm1.small', 'host': 'bf4d507aa00b3fcf643a7e0b883420632ac7fc96e8c7a756982e121d', 'instance_host': 'np0005559462.localdomain', 'flavor': {'id': '2da0e147-aaa7-4bb9-a176-5fe1b15a32a0', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e'}, 'image_ref': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '0694fd4c-d99f-11f0-817e-fa163ebaca0f', 'monotonic_time': 12814.322657674, 'message_signature': 'db9ea234319cfb1be3534f9f313411b2e6650c20b695130b411fa4e1e556c577'}]}, 'timestamp': '2025-12-15 10:15:48.194238', '_unique_id': '6e53d69929fe4bc0b7a4def86b0f777c'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.195 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.195 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.195 12 ERROR oslo_messaging.notify.messaging yield
Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.195 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.195 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.195 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.195 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.195 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.195 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.195 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.195 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.195 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.195 12 ERROR oslo_messaging.notify.messaging conn.connect()
Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.195 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.195 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.195 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.195 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.195 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.195 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.195 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.195 12 ERROR oslo_messaging.notify.messaging
Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.195 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.195 12 ERROR oslo_messaging.notify.messaging
Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.195 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.195 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.195 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.195 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.195 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.195 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.195 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.195 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.195 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.195 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.195 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.195 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.195 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.195 12 ERROR oslo_messaging.notify.messaging File
"/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.195 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.195 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.195 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.195 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.195 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.195 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.195 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.195 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.195 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.195 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", 
line 433, in _ensure_connection Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.195 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.195 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.195 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.195 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.195 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.195 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.195 12 ERROR oslo_messaging.notify.messaging Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.196 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.capacity in the context of pollsters Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.196 12 DEBUG ceilometer.compute.pollsters [-] 39ff1bd9-6f6b-44c8-bbec-a1fd9d196359/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.196 12 DEBUG ceilometer.compute.pollsters [-] 39ff1bd9-6f6b-44c8-bbec-a1fd9d196359/disk.device.capacity volume: 1073741824 _stats_to_sample 
/usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.198 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '1f20ae5a-ed41-429b-92d9-96e71cbf8726', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '1ba5fce347b64bfebf995f187193f205', 'user_name': None, 'project_id': 'c785bf23f53946bc99867d8832a50266', 'project_name': None, 'resource_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359-vda', 'timestamp': '2025-12-15T10:15:48.196455', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359', 'instance_type': 'm1.small', 'host': 'bf4d507aa00b3fcf643a7e0b883420632ac7fc96e8c7a756982e121d', 'instance_host': 'np0005559462.localdomain', 'flavor': {'id': '2da0e147-aaa7-4bb9-a176-5fe1b15a32a0', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e'}, 'image_ref': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '06956584-d99f-11f0-817e-fa163ebaca0f', 'monotonic_time': 12814.372276003, 'message_signature': '39ddc097a90e7082a623eb6b15fb6be131e944b0724c890340f9ca0aa6f19199'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '1ba5fce347b64bfebf995f187193f205', 'user_name': None, 'project_id': 'c785bf23f53946bc99867d8832a50266', 
'project_name': None, 'resource_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359-vdb', 'timestamp': '2025-12-15T10:15:48.196455', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359', 'instance_type': 'm1.small', 'host': 'bf4d507aa00b3fcf643a7e0b883420632ac7fc96e8c7a756982e121d', 'instance_host': 'np0005559462.localdomain', 'flavor': {'id': '2da0e147-aaa7-4bb9-a176-5fe1b15a32a0', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e'}, 'image_ref': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '0695775e-d99f-11f0-817e-fa163ebaca0f', 'monotonic_time': 12814.372276003, 'message_signature': 'ba0263be09fb70f9aa2147fad00ceb4e78bb10e1416cffc3e549682d2168dcd9'}]}, 'timestamp': '2025-12-15 10:15:48.197363', '_unique_id': '38c4d96e282645ae830c049d69152ed4'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.198 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.198 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.198 12 ERROR oslo_messaging.notify.messaging yield Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.198 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 15 05:15:48 localhost 
ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.198 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.198 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.198 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.198 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.198 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.198 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.198 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.198 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.198 12 ERROR oslo_messaging.notify.messaging conn.connect() Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.198 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.198 12 ERROR oslo_messaging.notify.messaging 
self.transport.connect() Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.198 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.198 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.198 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.198 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.198 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.198 12 ERROR oslo_messaging.notify.messaging Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.198 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.198 12 ERROR oslo_messaging.notify.messaging Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.198 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.198 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.198 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Dec 15 05:15:48 localhost 
ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.198 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.198 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.198 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.198 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.198 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.198 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.198 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.198 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.198 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.198 12 ERROR oslo_messaging.notify.messaging self.connection = 
connection_pool.get(retry=retry) Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.198 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.198 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.198 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.198 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.198 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.198 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.198 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.198 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.198 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.198 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Dec 15 05:15:48 localhost 
ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.198 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.198 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.198 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.198 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.198 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.198 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.198 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.198 12 ERROR oslo_messaging.notify.messaging Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.199 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.drop in the context of pollsters Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.199 12 DEBUG ceilometer.compute.pollsters [-] 39ff1bd9-6f6b-44c8-bbec-a1fd9d196359/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.200 12 ERROR 
oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '09e96a7f-a0c5-4298-bcb2-c7e16acc088f', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '1ba5fce347b64bfebf995f187193f205', 'user_name': None, 'project_id': 'c785bf23f53946bc99867d8832a50266', 'project_name': None, 'resource_id': 'instance-00000002-39ff1bd9-6f6b-44c8-bbec-a1fd9d196359-tap03ef8889-32', 'timestamp': '2025-12-15T10:15:48.199575', 'resource_metadata': {'display_name': 'test', 'name': 'tap03ef8889-32', 'instance_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359', 'instance_type': 'm1.small', 'host': 'bf4d507aa00b3fcf643a7e0b883420632ac7fc96e8c7a756982e121d', 'instance_host': 'np0005559462.localdomain', 'flavor': {'id': '2da0e147-aaa7-4bb9-a176-5fe1b15a32a0', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e'}, 'image_ref': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:74:df:7c', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap03ef8889-32'}, 'message_id': '0695def6-d99f-11f0-817e-fa163ebaca0f', 'monotonic_time': 12814.365425079, 'message_signature': '12336c50555b11e002d02190dcfd7ec1d50f47cb90095edcba2bdac732a78067'}]}, 'timestamp': '2025-12-15 10:15:48.200075', '_unique_id': '48d31e5ad7ba484592836491f774dcf7'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.200 12 ERROR oslo_messaging.notify.messaging 
Traceback (most recent call last): Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.200 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.200 12 ERROR oslo_messaging.notify.messaging yield Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.200 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.200 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.200 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.200 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.200 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.200 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.200 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.200 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.200 
12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.200 12 ERROR oslo_messaging.notify.messaging conn.connect() Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.200 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.200 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.200 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.200 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.200 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.200 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.200 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.200 12 ERROR oslo_messaging.notify.messaging Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.200 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.200 12 ERROR 
oslo_messaging.notify.messaging Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.200 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.200 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.200 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.200 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.200 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.200 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.200 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.200 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.200 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.200 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.200 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.200 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.200 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.200 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.200 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.200 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.200 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.200 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.200 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.200 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.200 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.200 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.200 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.200 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.200 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.200 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.200 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.200 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.200 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.200 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 15 05:15:48 localhost 
ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.200 12 ERROR oslo_messaging.notify.messaging Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.202 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes in the context of pollsters Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.202 12 DEBUG ceilometer.compute.pollsters [-] 39ff1bd9-6f6b-44c8-bbec-a1fd9d196359/network.incoming.bytes volume: 6809 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.204 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '0ea0b43f-69d6-4490-9c09-bb2989085772', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 6809, 'user_id': '1ba5fce347b64bfebf995f187193f205', 'user_name': None, 'project_id': 'c785bf23f53946bc99867d8832a50266', 'project_name': None, 'resource_id': 'instance-00000002-39ff1bd9-6f6b-44c8-bbec-a1fd9d196359-tap03ef8889-32', 'timestamp': '2025-12-15T10:15:48.202590', 'resource_metadata': {'display_name': 'test', 'name': 'tap03ef8889-32', 'instance_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359', 'instance_type': 'm1.small', 'host': 'bf4d507aa00b3fcf643a7e0b883420632ac7fc96e8c7a756982e121d', 'instance_host': 'np0005559462.localdomain', 'flavor': {'id': '2da0e147-aaa7-4bb9-a176-5fe1b15a32a0', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e'}, 'image_ref': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 
'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:74:df:7c', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap03ef8889-32'}, 'message_id': '069654b2-d99f-11f0-817e-fa163ebaca0f', 'monotonic_time': 12814.365425079, 'message_signature': 'a536d4dc6b5359cefceea835b7b1275db077d8204d6f60760e0a5a540ed89ff7'}]}, 'timestamp': '2025-12-15 10:15:48.203169', '_unique_id': '893908c3f4b84b4c86bfe8528d0146e7'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.204 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.204 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.204 12 ERROR oslo_messaging.notify.messaging yield Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.204 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.204 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.204 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.204 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.204 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, 
in _connection_factory Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.204 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.204 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.204 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.204 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.204 12 ERROR oslo_messaging.notify.messaging conn.connect() Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.204 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.204 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.204 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.204 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.204 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.204 12 ERROR 
oslo_messaging.notify.messaging self.sock.connect(sa) Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.204 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.204 12 ERROR oslo_messaging.notify.messaging Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.204 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.204 12 ERROR oslo_messaging.notify.messaging Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.204 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.204 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.204 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.204 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.204 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.204 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.204 12 ERROR 
oslo_messaging.notify.messaging return self._send(target, ctxt, message, Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.204 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.204 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.204 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.204 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.204 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.204 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.204 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.204 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.204 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.204 12 ERROR 
oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.204 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.204 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.204 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.204 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.204 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.204 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.204 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.204 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.204 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.204 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Dec 15 05:15:48 
localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.204 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.204 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.204 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.204 12 ERROR oslo_messaging.notify.messaging Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.205 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no new resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193 Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.205 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.delta in the context of pollsters Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.205 12 DEBUG ceilometer.compute.pollsters [-] 39ff1bd9-6f6b-44c8-bbec-a1fd9d196359/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.206 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': 'e1047aa3-79c3-4346-834e-607a5a0075b5', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '1ba5fce347b64bfebf995f187193f205', 'user_name': None, 'project_id': 'c785bf23f53946bc99867d8832a50266', 'project_name': None, 'resource_id': 'instance-00000002-39ff1bd9-6f6b-44c8-bbec-a1fd9d196359-tap03ef8889-32', 'timestamp': '2025-12-15T10:15:48.205439', 'resource_metadata': {'display_name': 'test', 'name': 'tap03ef8889-32', 'instance_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359', 'instance_type': 'm1.small', 'host': 'bf4d507aa00b3fcf643a7e0b883420632ac7fc96e8c7a756982e121d', 'instance_host': 'np0005559462.localdomain', 'flavor': {'id': '2da0e147-aaa7-4bb9-a176-5fe1b15a32a0', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e'}, 'image_ref': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:74:df:7c', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap03ef8889-32'}, 'message_id': '0696c424-d99f-11f0-817e-fa163ebaca0f', 'monotonic_time': 12814.365425079, 'message_signature': 'e4898a5f9a0f9dc80dfa2414db7d9b67e225646047eeaf97dae55b610da6e05e'}]}, 'timestamp': '2025-12-15 10:15:48.205909', '_unique_id': '850ccc913f144654b0391d71b7f72d8f'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.206 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 
2025-12-15 10:15:48.206 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.206 12 ERROR oslo_messaging.notify.messaging yield Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.206 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.206 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.206 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.206 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.206 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.206 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.206 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.206 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.206 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.206 12 ERROR oslo_messaging.notify.messaging conn.connect() Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.206 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.206 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.206 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.206 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.206 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.206 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.206 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.206 12 ERROR oslo_messaging.notify.messaging Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.206 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.206 12 ERROR oslo_messaging.notify.messaging Dec 15 05:15:48 localhost 
ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.206 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.206 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.206 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.206 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.206 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.206 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.206 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.206 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.206 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.206 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection 
Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.206 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.206 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.206 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.206 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.206 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.206 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.206 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.206 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.206 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.206 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Dec 15 05:15:48 
localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.206 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.206 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.206 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.206 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.206 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.206 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.206 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.206 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.206 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.206 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.206 12 ERROR oslo_messaging.notify.messaging Dec 15 05:15:48 localhost 
ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.207 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets in the context of pollsters Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.208 12 DEBUG ceilometer.compute.pollsters [-] 39ff1bd9-6f6b-44c8-bbec-a1fd9d196359/network.incoming.packets volume: 60 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.209 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '3746a64b-81e6-40a2-b35b-1d5d48f8b697', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 60, 'user_id': '1ba5fce347b64bfebf995f187193f205', 'user_name': None, 'project_id': 'c785bf23f53946bc99867d8832a50266', 'project_name': None, 'resource_id': 'instance-00000002-39ff1bd9-6f6b-44c8-bbec-a1fd9d196359-tap03ef8889-32', 'timestamp': '2025-12-15T10:15:48.208108', 'resource_metadata': {'display_name': 'test', 'name': 'tap03ef8889-32', 'instance_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359', 'instance_type': 'm1.small', 'host': 'bf4d507aa00b3fcf643a7e0b883420632ac7fc96e8c7a756982e121d', 'instance_host': 'np0005559462.localdomain', 'flavor': {'id': '2da0e147-aaa7-4bb9-a176-5fe1b15a32a0', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e'}, 'image_ref': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:74:df:7c', 'fref': None, 
'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap03ef8889-32'}, 'message_id': '06972c20-d99f-11f0-817e-fa163ebaca0f', 'monotonic_time': 12814.365425079, 'message_signature': 'b6bc866d529f6cb08bf1dcb7ec9610b8b2a5aadb25b0e5748034ed8ae43f9401'}]}, 'timestamp': '2025-12-15 10:15:48.208573', '_unique_id': '9ee00a476bf44e6d92f7958d45844e70'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.209 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.209 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.209 12 ERROR oslo_messaging.notify.messaging yield Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.209 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.209 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.209 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.209 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.209 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.209 12 ERROR 
oslo_messaging.notify.messaging self._connection = self._establish_connection()
Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.209 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.209 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.209 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.209 12 ERROR oslo_messaging.notify.messaging conn.connect()
Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.209 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.209 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.209 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.209 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.209 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.209 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.209 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.209 12 ERROR oslo_messaging.notify.messaging
Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.209 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.209 12 ERROR oslo_messaging.notify.messaging
Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.209 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.209 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.209 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.209 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.209 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.209 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.209 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.209 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.209 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.209 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.209 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.209 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.209 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.209 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.209 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.209 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.209 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.209 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.209 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.209 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.209 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.209 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.209 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.209 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.209 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.209 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.209 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.209 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.209 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.209 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.209 12 ERROR oslo_messaging.notify.messaging
Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.210 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.bytes in the context of pollsters
Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.210 12 DEBUG ceilometer.compute.pollsters [-] 39ff1bd9-6f6b-44c8-bbec-a1fd9d196359/disk.device.write.bytes volume: 389120 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.211 12 DEBUG ceilometer.compute.pollsters [-] 39ff1bd9-6f6b-44c8-bbec-a1fd9d196359/disk.device.write.bytes volume: 512 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.212 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'c151dbcc-d545-47b1-825c-3739818bd32b', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 389120, 'user_id': '1ba5fce347b64bfebf995f187193f205', 'user_name': None, 'project_id': 'c785bf23f53946bc99867d8832a50266', 'project_name': None, 'resource_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359-vda', 'timestamp': '2025-12-15T10:15:48.210662', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359', 'instance_type': 'm1.small', 'host': 'bf4d507aa00b3fcf643a7e0b883420632ac7fc96e8c7a756982e121d', 'instance_host': 'np0005559462.localdomain', 'flavor': {'id': '2da0e147-aaa7-4bb9-a176-5fe1b15a32a0', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e'}, 'image_ref': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '06978f80-d99f-11f0-817e-fa163ebaca0f', 'monotonic_time': 12814.322657674, 'message_signature': '76d6034d79383ca62cc4d60db31461af79cfa66ed8786349411dd83833a1bfe3'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 512, 'user_id': '1ba5fce347b64bfebf995f187193f205', 'user_name': None, 'project_id': 'c785bf23f53946bc99867d8832a50266', 'project_name': None, 'resource_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359-vdb', 'timestamp': '2025-12-15T10:15:48.210662', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359', 'instance_type': 'm1.small', 'host': 'bf4d507aa00b3fcf643a7e0b883420632ac7fc96e8c7a756982e121d', 'instance_host': 'np0005559462.localdomain', 'flavor': {'id': '2da0e147-aaa7-4bb9-a176-5fe1b15a32a0', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e'}, 'image_ref': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '0697a3ee-d99f-11f0-817e-fa163ebaca0f', 'monotonic_time': 12814.322657674, 'message_signature': 'dd60a33a39f4ab8035603b2528026ef4ce351eceb4c6ed3d1ed54a676eff1a51'}]}, 'timestamp': '2025-12-15 10:15:48.211612', '_unique_id': '575e3a3619784cb78716249c1730bca2'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.212 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.212 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.212 12 ERROR oslo_messaging.notify.messaging yield
Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.212 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.212 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.212 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.212 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.212 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.212 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.212 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.212 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.212 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.212 12 ERROR oslo_messaging.notify.messaging conn.connect()
Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.212 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.212 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.212 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.212 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.212 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.212 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.212 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.212 12 ERROR oslo_messaging.notify.messaging
Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.212 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.212 12 ERROR oslo_messaging.notify.messaging
Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.212 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.212 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.212 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.212 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.212 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.212 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.212 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.212 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.212 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.212 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.212 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.212 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.212 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.212 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.212 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.212 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.212 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.212 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.212 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.212 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.212 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.212 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.212 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.212 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.212 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.212 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.212 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.212 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.212 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.212 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.212 12 ERROR oslo_messaging.notify.messaging
Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.213 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.drop in the context of pollsters
Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.213 12 DEBUG ceilometer.compute.pollsters [-] 39ff1bd9-6f6b-44c8-bbec-a1fd9d196359/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.215 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '61c926bc-7b03-4c70-9986-0a3086285f1e', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '1ba5fce347b64bfebf995f187193f205', 'user_name': None, 'project_id': 'c785bf23f53946bc99867d8832a50266', 'project_name': None, 'resource_id': 'instance-00000002-39ff1bd9-6f6b-44c8-bbec-a1fd9d196359-tap03ef8889-32', 'timestamp': '2025-12-15T10:15:48.213735', 'resource_metadata': {'display_name': 'test', 'name': 'tap03ef8889-32', 'instance_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359', 'instance_type': 'm1.small', 'host': 'bf4d507aa00b3fcf643a7e0b883420632ac7fc96e8c7a756982e121d', 'instance_host': 'np0005559462.localdomain', 'flavor': {'id': '2da0e147-aaa7-4bb9-a176-5fe1b15a32a0', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e'}, 'image_ref': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:74:df:7c', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap03ef8889-32'}, 'message_id': '069807d0-d99f-11f0-817e-fa163ebaca0f', 'monotonic_time': 12814.365425079, 'message_signature': 'c0118f86814d832a1b399b30d69324fc1dae1ff3d6e6b43dc8cf3f10a7c875d3'}]}, 'timestamp': '2025-12-15 10:15:48.214225', '_unique_id': '5dbe2daa12814223a906514800e6a739'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.215 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.215 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.215 12 ERROR oslo_messaging.notify.messaging yield
Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.215 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.215 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.215 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.215 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.215 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.215 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.215 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.215 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.215 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.215 12 ERROR oslo_messaging.notify.messaging conn.connect()
Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.215 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.215 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.215 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.215 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.215 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.215 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.215 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.215 12 ERROR oslo_messaging.notify.messaging
Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.215 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.215 12 ERROR oslo_messaging.notify.messaging
Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.215 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.215 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.215 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.215 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.215 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.215 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.215 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.215 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.215 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.215 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.215 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.215 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.215 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.215 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.215 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.215 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.215 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.215 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.215 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.215 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.215 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.215 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.215 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.215 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.215 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.215 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.215 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.215 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.215 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.215 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.215 12 ERROR oslo_messaging.notify.messaging
Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.216 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes in the context of pollsters
Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.216 12 DEBUG ceilometer.compute.pollsters [-] 39ff1bd9-6f6b-44c8-bbec-a1fd9d196359/network.outgoing.bytes volume: 9770 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.217 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'f11c0e87-dda1-467b-9075-7a2496d694a0', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 9770, 'user_id': '1ba5fce347b64bfebf995f187193f205', 'user_name': None, 'project_id': 'c785bf23f53946bc99867d8832a50266', 'project_name': None, 'resource_id': 'instance-00000002-39ff1bd9-6f6b-44c8-bbec-a1fd9d196359-tap03ef8889-32', 'timestamp': '2025-12-15T10:15:48.216344', 'resource_metadata': {'display_name': 'test', 'name': 'tap03ef8889-32', 'instance_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359', 'instance_type': 'm1.small', 'host': 'bf4d507aa00b3fcf643a7e0b883420632ac7fc96e8c7a756982e121d', 'instance_host': 'np0005559462.localdomain', 'flavor': {'id': '2da0e147-aaa7-4bb9-a176-5fe1b15a32a0', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e'}, 'image_ref': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:74:df:7c', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap03ef8889-32'}, 'message_id': '06986db0-d99f-11f0-817e-fa163ebaca0f', 'monotonic_time': 12814.365425079, 'message_signature': 'be0ae7c60316ce71ec086056ffa4707f0893d0aea7a58906cb23e27a5cac1ff1'}]}, 'timestamp': '2025-12-15 10:15:48.216803', '_unique_id': '688b66c4673045c7a8f75e371ed543e8'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.217 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.217 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.217 12 ERROR oslo_messaging.notify.messaging yield
Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.217 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.217 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.217 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.217 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.217 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.217 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.217 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.217 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.217 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.217 12 ERROR oslo_messaging.notify.messaging conn.connect()
Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.217 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.217 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.217 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.217 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.217 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.217 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]:
2025-12-15 10:15:48.217 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.217 12 ERROR oslo_messaging.notify.messaging Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.217 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.217 12 ERROR oslo_messaging.notify.messaging Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.217 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.217 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.217 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.217 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.217 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.217 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.217 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Dec 15 05:15:48 localhost 
ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.217 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.217 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.217 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.217 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.217 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.217 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.217 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.217 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.217 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.217 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Dec 15 
05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.217 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.217 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.217 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.217 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.217 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.217 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.217 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.217 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.217 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.217 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.217 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.217 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.217 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.217 12 ERROR oslo_messaging.notify.messaging Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.218 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.delta in the context of pollsters Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.218 12 DEBUG ceilometer.compute.pollsters [-] 39ff1bd9-6f6b-44c8-bbec-a1fd9d196359/network.incoming.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.220 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': '9a12e67c-38f6-4ea7-9fd2-ea4d50d35414', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '1ba5fce347b64bfebf995f187193f205', 'user_name': None, 'project_id': 'c785bf23f53946bc99867d8832a50266', 'project_name': None, 'resource_id': 'instance-00000002-39ff1bd9-6f6b-44c8-bbec-a1fd9d196359-tap03ef8889-32', 'timestamp': '2025-12-15T10:15:48.218865', 'resource_metadata': {'display_name': 'test', 'name': 'tap03ef8889-32', 'instance_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359', 'instance_type': 'm1.small', 'host': 'bf4d507aa00b3fcf643a7e0b883420632ac7fc96e8c7a756982e121d', 'instance_host': 'np0005559462.localdomain', 'flavor': {'id': '2da0e147-aaa7-4bb9-a176-5fe1b15a32a0', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e'}, 'image_ref': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:74:df:7c', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap03ef8889-32'}, 'message_id': '0698d34a-d99f-11f0-817e-fa163ebaca0f', 'monotonic_time': 12814.365425079, 'message_signature': '84ecf82dd591efc2dd8fc22bb335770690bb0861917d709e8cec982fc82fd7b3'}]}, 'timestamp': '2025-12-15 10:15:48.219409', '_unique_id': '7a9fb6a5cc314fe396dd982162d7b60c'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.220 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 
2025-12-15 10:15:48.220 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.220 12 ERROR oslo_messaging.notify.messaging yield Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.220 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.220 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.220 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.220 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.220 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.220 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.220 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.220 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.220 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.220 12 ERROR oslo_messaging.notify.messaging conn.connect() Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.220 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.220 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.220 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.220 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.220 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.220 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.220 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.220 12 ERROR oslo_messaging.notify.messaging Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.220 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.220 12 ERROR oslo_messaging.notify.messaging Dec 15 05:15:48 localhost 
ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.220 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.220 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.220 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.220 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.220 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.220 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.220 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.220 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.220 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.220 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection 
Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.220 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.220 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.220 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.220 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.220 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.220 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.220 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.220 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.220 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.220 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Dec 15 05:15:48 
localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.220 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.220 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.220 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.220 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.220 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.220 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.220 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.220 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.220 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.220 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.220 12 ERROR oslo_messaging.notify.messaging Dec 15 05:15:48 localhost 
ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.221 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.usage in the context of pollsters Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.221 12 DEBUG ceilometer.compute.pollsters [-] 39ff1bd9-6f6b-44c8-bbec-a1fd9d196359/disk.device.usage volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.221 12 DEBUG ceilometer.compute.pollsters [-] 39ff1bd9-6f6b-44c8-bbec-a1fd9d196359/disk.device.usage volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.223 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'ac1df25f-d166-47e7-a66f-c1b89336817a', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '1ba5fce347b64bfebf995f187193f205', 'user_name': None, 'project_id': 'c785bf23f53946bc99867d8832a50266', 'project_name': None, 'resource_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359-vda', 'timestamp': '2025-12-15T10:15:48.221515', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359', 'instance_type': 'm1.small', 'host': 'bf4d507aa00b3fcf643a7e0b883420632ac7fc96e8c7a756982e121d', 'instance_host': 'np0005559462.localdomain', 'flavor': {'id': '2da0e147-aaa7-4bb9-a176-5fe1b15a32a0', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 
'7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e'}, 'image_ref': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '0699374a-d99f-11f0-817e-fa163ebaca0f', 'monotonic_time': 12814.372276003, 'message_signature': 'a2f7ff083ed59893ab1665610bb46c8f9b1455f9739d7760a85c2f791c517ca8'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '1ba5fce347b64bfebf995f187193f205', 'user_name': None, 'project_id': 'c785bf23f53946bc99867d8832a50266', 'project_name': None, 'resource_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359-vdb', 'timestamp': '2025-12-15T10:15:48.221515', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359', 'instance_type': 'm1.small', 'host': 'bf4d507aa00b3fcf643a7e0b883420632ac7fc96e8c7a756982e121d', 'instance_host': 'np0005559462.localdomain', 'flavor': {'id': '2da0e147-aaa7-4bb9-a176-5fe1b15a32a0', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e'}, 'image_ref': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '069948b6-d99f-11f0-817e-fa163ebaca0f', 'monotonic_time': 12814.372276003, 'message_signature': '08fef0ceba2828a52f558239d4750d83231ee65ed269c52c572a15338ca4955b'}]}, 'timestamp': '2025-12-15 10:15:48.222381', '_unique_id': '7f7308cad93f44dba4641d94bb071722'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.223 12 ERROR 
oslo_messaging.notify.messaging Traceback (most recent call last): Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.223 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.223 12 ERROR oslo_messaging.notify.messaging yield Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.223 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.223 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.223 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.223 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.223 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.223 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.223 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.223 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Dec 15 05:15:48 localhost 
ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.223 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.223 12 ERROR oslo_messaging.notify.messaging conn.connect() Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.223 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.223 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.223 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.223 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.223 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.223 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.223 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.223 12 ERROR oslo_messaging.notify.messaging Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.223 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 
2025-12-15 10:15:48.223 12 ERROR oslo_messaging.notify.messaging Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.223 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.223 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.223 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.223 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.223 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.223 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.223 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.223 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.223 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.223 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.223 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.223 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.223 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.223 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.223 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.223 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.223 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.223 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.223 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.223 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.223 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.223 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.223 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.223 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.223 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.223 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.223 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.223 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.223 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.223 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 15 05:15:48 localhost 
ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.223 12 ERROR oslo_messaging.notify.messaging Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.224 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.error in the context of pollsters Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.224 12 DEBUG ceilometer.compute.pollsters [-] 39ff1bd9-6f6b-44c8-bbec-a1fd9d196359/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.225 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '4c8cd753-a9e3-4112-9fef-7b34cf4b8a31', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '1ba5fce347b64bfebf995f187193f205', 'user_name': None, 'project_id': 'c785bf23f53946bc99867d8832a50266', 'project_name': None, 'resource_id': 'instance-00000002-39ff1bd9-6f6b-44c8-bbec-a1fd9d196359-tap03ef8889-32', 'timestamp': '2025-12-15T10:15:48.224568', 'resource_metadata': {'display_name': 'test', 'name': 'tap03ef8889-32', 'instance_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359', 'instance_type': 'm1.small', 'host': 'bf4d507aa00b3fcf643a7e0b883420632ac7fc96e8c7a756982e121d', 'instance_host': 'np0005559462.localdomain', 'flavor': {'id': '2da0e147-aaa7-4bb9-a176-5fe1b15a32a0', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e'}, 'image_ref': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e', 'image_ref_url': None, 'architecture': 'x86_64', 
'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:74:df:7c', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap03ef8889-32'}, 'message_id': '0699aee6-d99f-11f0-817e-fa163ebaca0f', 'monotonic_time': 12814.365425079, 'message_signature': 'c1ed8801473573b654706dcc5084c8eb1a01c312474c4604ae0c1349755fe27a'}]}, 'timestamp': '2025-12-15 10:15:48.225057', '_unique_id': '331856d71d694650963a709f9f761bb8'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.225 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.225 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.225 12 ERROR oslo_messaging.notify.messaging yield Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.225 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.225 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.225 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.225 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.225 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.225 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.225 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.225 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.225 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.225 12 ERROR oslo_messaging.notify.messaging conn.connect() Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.225 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.225 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.225 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.225 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.225 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Dec 15 05:15:48 localhost 
ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.225 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.225 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.225 12 ERROR oslo_messaging.notify.messaging Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.225 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.225 12 ERROR oslo_messaging.notify.messaging Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.225 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.225 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.225 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.225 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.225 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.225 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Dec 15 05:15:48 localhost 
ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.225 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.225 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.225 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.225 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.225 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.225 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.225 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.225 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.225 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.225 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Dec 15 05:15:48 localhost 
ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.225 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.225 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.225 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.225 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.225 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.225 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.225 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.225 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.225 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.225 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.225 12 ERROR 
oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.225 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.225 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.225 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.225 12 ERROR oslo_messaging.notify.messaging Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.226 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no new resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193 Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.227 12 INFO ceilometer.polling.manager [-] Polling pollster cpu in the context of pollsters Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.242 12 DEBUG ceilometer.compute.pollsters [-] 39ff1bd9-6f6b-44c8-bbec-a1fd9d196359/cpu volume: 18280000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.243 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': 'ae367b09-d734-4b91-840f-871296e444ee', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 18280000000, 'user_id': '1ba5fce347b64bfebf995f187193f205', 'user_name': None, 'project_id': 'c785bf23f53946bc99867d8832a50266', 'project_name': None, 'resource_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359', 'timestamp': '2025-12-15T10:15:48.227391', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359', 'instance_type': 'm1.small', 'host': 'bf4d507aa00b3fcf643a7e0b883420632ac7fc96e8c7a756982e121d', 'instance_host': 'np0005559462.localdomain', 'flavor': {'id': '2da0e147-aaa7-4bb9-a176-5fe1b15a32a0', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e'}, 'image_ref': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'cpu_number': 1}, 'message_id': '069c6ac8-d99f-11f0-817e-fa163ebaca0f', 'monotonic_time': 12814.434999715, 'message_signature': '93d7bf7d3408abaf1a27236a221cdabb34256a85a9d738a86ee262f95dd0e59c'}]}, 'timestamp': '2025-12-15 10:15:48.242867', '_unique_id': '6065542f132242cfa076890b7ded5e14'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.243 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.243 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in 
_reraise_as_library_errors Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.243 12 ERROR oslo_messaging.notify.messaging yield Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.243 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.243 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.243 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.243 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.243 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.243 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.243 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.243 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.243 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.243 12 
ERROR oslo_messaging.notify.messaging conn.connect() Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.243 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.243 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.243 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.243 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.243 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.243 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.243 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.243 12 ERROR oslo_messaging.notify.messaging Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.243 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.243 12 ERROR oslo_messaging.notify.messaging Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.243 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 
2025-12-15 10:15:48.243 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.243 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.243 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.243 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.243 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.243 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.243 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.243 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.243 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.243 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Dec 15 
05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.243 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.243 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.243 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.243 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.243 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.243 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.243 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.243 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.243 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.243 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Dec 15 05:15:48 localhost 
ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.243 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.243 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.243 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.243 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.243 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.243 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.243 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.243 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.243 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.243 12 ERROR oslo_messaging.notify.messaging Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.244 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.requests in the context of pollsters Dec 15 05:15:48 
localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.244 12 DEBUG ceilometer.compute.pollsters [-] 39ff1bd9-6f6b-44c8-bbec-a1fd9d196359/disk.device.write.requests volume: 47 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.244 12 DEBUG ceilometer.compute.pollsters [-] 39ff1bd9-6f6b-44c8-bbec-a1fd9d196359/disk.device.write.requests volume: 1 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.245 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'dd5cc803-37c4-4dd7-92fc-2bf5c8679a56', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 47, 'user_id': '1ba5fce347b64bfebf995f187193f205', 'user_name': None, 'project_id': 'c785bf23f53946bc99867d8832a50266', 'project_name': None, 'resource_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359-vda', 'timestamp': '2025-12-15T10:15:48.244398', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359', 'instance_type': 'm1.small', 'host': 'bf4d507aa00b3fcf643a7e0b883420632ac7fc96e8c7a756982e121d', 'instance_host': 'np0005559462.localdomain', 'flavor': {'id': '2da0e147-aaa7-4bb9-a176-5fe1b15a32a0', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e'}, 'image_ref': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 
'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '069cb244-d99f-11f0-817e-fa163ebaca0f', 'monotonic_time': 12814.322657674, 'message_signature': '20947e1e588867ad19f8b1d89614e32b55e8da06ee0d290802be0112f8a05a91'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1, 'user_id': '1ba5fce347b64bfebf995f187193f205', 'user_name': None, 'project_id': 'c785bf23f53946bc99867d8832a50266', 'project_name': None, 'resource_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359-vdb', 'timestamp': '2025-12-15T10:15:48.244398', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359', 'instance_type': 'm1.small', 'host': 'bf4d507aa00b3fcf643a7e0b883420632ac7fc96e8c7a756982e121d', 'instance_host': 'np0005559462.localdomain', 'flavor': {'id': '2da0e147-aaa7-4bb9-a176-5fe1b15a32a0', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e'}, 'image_ref': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '069cbc58-d99f-11f0-817e-fa163ebaca0f', 'monotonic_time': 12814.322657674, 'message_signature': '3f3a8275db71ffa166ee05bc17883b47761e29c5091b9077eec9170b52825c3d'}]}, 'timestamp': '2025-12-15 10:15:48.244925', '_unique_id': 'eaffc348af23429ab587a171d1fba3f3'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.245 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.245 12 ERROR 
oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.245 12 ERROR oslo_messaging.notify.messaging yield Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.245 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.245 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.245 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.245 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.245 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.245 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.245 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.245 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.245 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in 
establish_connection Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.245 12 ERROR oslo_messaging.notify.messaging conn.connect() Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.245 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.245 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.245 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.245 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.245 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.245 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.245 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.245 12 ERROR oslo_messaging.notify.messaging Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.245 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.245 12 ERROR oslo_messaging.notify.messaging Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.245 12 ERROR 
oslo_messaging.notify.messaging Traceback (most recent call last): Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.245 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.245 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.245 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.245 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.245 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.245 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.245 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.245 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.245 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 
2025-12-15 10:15:48.245 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.245 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.245 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.245 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.245 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.245 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.245 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.245 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.245 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.245 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 
10:15:48.245 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.245 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.245 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.245 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.245 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.245 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.245 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.245 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.245 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.245 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.245 12 ERROR oslo_messaging.notify.messaging Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.246 12 INFO 
ceilometer.polling.manager [-] Polling pollster network.outgoing.packets in the context of pollsters Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.246 12 DEBUG ceilometer.compute.pollsters [-] 39ff1bd9-6f6b-44c8-bbec-a1fd9d196359/network.outgoing.packets volume: 114 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.247 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '83e932c4-936a-4440-ab47-1e991966dce0', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 114, 'user_id': '1ba5fce347b64bfebf995f187193f205', 'user_name': None, 'project_id': 'c785bf23f53946bc99867d8832a50266', 'project_name': None, 'resource_id': 'instance-00000002-39ff1bd9-6f6b-44c8-bbec-a1fd9d196359-tap03ef8889-32', 'timestamp': '2025-12-15T10:15:48.246572', 'resource_metadata': {'display_name': 'test', 'name': 'tap03ef8889-32', 'instance_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359', 'instance_type': 'm1.small', 'host': 'bf4d507aa00b3fcf643a7e0b883420632ac7fc96e8c7a756982e121d', 'instance_host': 'np0005559462.localdomain', 'flavor': {'id': '2da0e147-aaa7-4bb9-a176-5fe1b15a32a0', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e'}, 'image_ref': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:74:df:7c', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 
'tap03ef8889-32'}, 'message_id': '069d0adc-d99f-11f0-817e-fa163ebaca0f', 'monotonic_time': 12814.365425079, 'message_signature': '4beab7e30a85e455f9927e8210577e0cf0030fc9e316033de4191354edcfa8ec'}]}, 'timestamp': '2025-12-15 10:15:48.246955', '_unique_id': '7f78345fbbd34e469062168071ab9ee7'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.247 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.247 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.247 12 ERROR oslo_messaging.notify.messaging yield Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.247 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.247 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.247 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.247 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.247 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.247 12 ERROR oslo_messaging.notify.messaging self._connection = 
self._establish_connection() Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.247 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.247 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.247 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.247 12 ERROR oslo_messaging.notify.messaging conn.connect() Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.247 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.247 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.247 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.247 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.247 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.247 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.247 12 ERROR 
oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.247 12 ERROR oslo_messaging.notify.messaging Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.247 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.247 12 ERROR oslo_messaging.notify.messaging Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.247 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.247 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.247 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.247 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.247 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.247 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.247 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.247 12 
ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.247 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.247 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.247 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.247 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.247 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.247 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.247 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.247 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.247 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 
10:15:48.247 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.247 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.247 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.247 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.247 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.247 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.247 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.247 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.247 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.247 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.247 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 
450, in _reraise_as_library_errors Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.247 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.247 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.247 12 ERROR oslo_messaging.notify.messaging Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.248 12 INFO ceilometer.polling.manager [-] Polling pollster memory.usage in the context of pollsters Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.248 12 DEBUG ceilometer.compute.pollsters [-] 39ff1bd9-6f6b-44c8-bbec-a1fd9d196359/memory.usage volume: 51.73828125 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.249 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': '8a8c1a2c-c098-4140-88b4-0bddcc80aefa', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'memory.usage', 'counter_type': 'gauge', 'counter_unit': 'MB', 'counter_volume': 51.73828125, 'user_id': '1ba5fce347b64bfebf995f187193f205', 'user_name': None, 'project_id': 'c785bf23f53946bc99867d8832a50266', 'project_name': None, 'resource_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359', 'timestamp': '2025-12-15T10:15:48.248668', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359', 'instance_type': 'm1.small', 'host': 'bf4d507aa00b3fcf643a7e0b883420632ac7fc96e8c7a756982e121d', 'instance_host': 'np0005559462.localdomain', 'flavor': {'id': '2da0e147-aaa7-4bb9-a176-5fe1b15a32a0', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e'}, 'image_ref': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0}, 'message_id': '069d592e-d99f-11f0-817e-fa163ebaca0f', 'monotonic_time': 12814.434999715, 'message_signature': '26e5817874f26d74a43a91ebaee5b30e93352a8530129d72e0cb619e4ac01de3'}]}, 'timestamp': '2025-12-15 10:15:48.248946', '_unique_id': 'ffd78a57fc4949e98848f706feaa3813'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.249 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.249 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in 
_reraise_as_library_errors Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.249 12 ERROR oslo_messaging.notify.messaging yield Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.249 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.249 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.249 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.249 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.249 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.249 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.249 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.249 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.249 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.249 12 
ERROR oslo_messaging.notify.messaging conn.connect() Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.249 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.249 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.249 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.249 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.249 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.249 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.249 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.249 12 ERROR oslo_messaging.notify.messaging Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.249 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.249 12 ERROR oslo_messaging.notify.messaging Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.249 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 
2025-12-15 10:15:48.249 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.249 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.249 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.249 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.249 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.249 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.249 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.249 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.249 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.249 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Dec 15 
05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.249 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.249 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.249 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.249 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.249 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.249 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.249 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.249 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.249 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.249 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Dec 15 05:15:48 localhost 
ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.249 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.249 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.249 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.249 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.249 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.249 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.249 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.249 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.249 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.249 12 ERROR oslo_messaging.notify.messaging Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.250 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.latency in the context of pollsters Dec 15 05:15:48 
localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.250 12 DEBUG ceilometer.compute.pollsters [-] 39ff1bd9-6f6b-44c8-bbec-a1fd9d196359/disk.device.read.latency volume: 1342134926 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.250 12 DEBUG ceilometer.compute.pollsters [-] 39ff1bd9-6f6b-44c8-bbec-a1fd9d196359/disk.device.read.latency volume: 123356132 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.251 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'e232ed8d-c6aa-4c1d-b147-7ada10fe49e4', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 1342134926, 'user_id': '1ba5fce347b64bfebf995f187193f205', 'user_name': None, 'project_id': 'c785bf23f53946bc99867d8832a50266', 'project_name': None, 'resource_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359-vda', 'timestamp': '2025-12-15T10:15:48.250270', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359', 'instance_type': 'm1.small', 'host': 'bf4d507aa00b3fcf643a7e0b883420632ac7fc96e8c7a756982e121d', 'instance_host': 'np0005559462.localdomain', 'flavor': {'id': '2da0e147-aaa7-4bb9-a176-5fe1b15a32a0', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e'}, 'image_ref': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 
'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '069d97a4-d99f-11f0-817e-fa163ebaca0f', 'monotonic_time': 12814.322657674, 'message_signature': 'd75530553be9765f33d222215d1933c3a37da959729c792afdfb43a10273b3ad'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 123356132, 'user_id': '1ba5fce347b64bfebf995f187193f205', 'user_name': None, 'project_id': 'c785bf23f53946bc99867d8832a50266', 'project_name': None, 'resource_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359-vdb', 'timestamp': '2025-12-15T10:15:48.250270', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359', 'instance_type': 'm1.small', 'host': 'bf4d507aa00b3fcf643a7e0b883420632ac7fc96e8c7a756982e121d', 'instance_host': 'np0005559462.localdomain', 'flavor': {'id': '2da0e147-aaa7-4bb9-a176-5fe1b15a32a0', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e'}, 'image_ref': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '069da1ea-d99f-11f0-817e-fa163ebaca0f', 'monotonic_time': 12814.322657674, 'message_signature': 'fff0d476a80886c31ef685fa4910ab42595b0fa5ac62d4f0d21bf5bf484ab5c2'}]}, 'timestamp': '2025-12-15 10:15:48.250801', '_unique_id': 'f2bffe4c576645acb120f07f8db3f252'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.251 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.251 12 ERROR 
oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.251 12 ERROR oslo_messaging.notify.messaging yield Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.251 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.251 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.251 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.251 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.251 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.251 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.251 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.251 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.251 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in 
establish_connection Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.251 12 ERROR oslo_messaging.notify.messaging conn.connect() Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.251 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.251 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.251 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.251 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.251 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.251 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.251 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.251 12 ERROR oslo_messaging.notify.messaging Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.251 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.251 12 ERROR oslo_messaging.notify.messaging Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.251 12 ERROR 
oslo_messaging.notify.messaging Traceback (most recent call last): Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.251 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.251 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.251 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.251 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.251 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.251 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.251 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.251 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.251 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 
2025-12-15 10:15:48.251 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.251 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.251 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.251 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.251 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.251 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.251 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.251 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.251 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.251 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 
10:15:48.251 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.251 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.251 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.251 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.251 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.251 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.251 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.251 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.251 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.251 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 15 05:15:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:15:48.251 12 ERROR oslo_messaging.notify.messaging Dec 15 05:15:50 localhost nova_compute[286344]: 2025-12-15 10:15:50.062 286348 DEBUG 
ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:15:50 localhost nova_compute[286344]: 2025-12-15 10:15:50.130 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:15:50 localhost ovn_metadata_agent[160585]: 2025-12-15 10:15:50.129 160590 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=23, ssl=[], options={'arp_ns_explicit_output': 'true', 'fdb_removal_limit': '0', 'ignore_lsp_down': 'false', 'mac_binding_removal_limit': '0', 'mac_prefix': 'fe:17:e3', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': 'fe:55:2b:86:15:b5'}, ipsec=False) old=SB_Global(nb_cfg=22) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Dec 15 05:15:50 localhost ovn_metadata_agent[160585]: 2025-12-15 10:15:50.130 160590 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 3 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m Dec 15 05:15:50 localhost ceph-mon[298913]: mon.np0005559462@0(leader) e17 handle_command mon_command({"prefix": "auth get-or-create", "entity": "client.alice bob", "caps": ["mds", "allow r path=/volumes/_nogroup/1f8ee972-fbad-4b31-8005-874e13dc9275/fbd4d206-5fcf-4579-a13d-8e4adc225411", "osd", "allow r pool=manila_data namespace=fsvolumens_1f8ee972-fbad-4b31-8005-874e13dc9275", "mon", "allow r"], "format": "json"} v 0) Dec 15 05:15:50 localhost ceph-mon[298913]: log_channel(audit) log [INF] : from='mgr.34411 ' entity='mgr.np0005559464.aomnqe' cmd={"prefix": "auth get-or-create", "entity": "client.alice bob", "caps": ["mds", "allow r 
path=/volumes/_nogroup/1f8ee972-fbad-4b31-8005-874e13dc9275/fbd4d206-5fcf-4579-a13d-8e4adc225411", "osd", "allow r pool=manila_data namespace=fsvolumens_1f8ee972-fbad-4b31-8005-874e13dc9275", "mon", "allow r"], "format": "json"} : dispatch Dec 15 05:15:50 localhost ceph-mon[298913]: log_channel(audit) log [INF] : from='mgr.34411 ' entity='mgr.np0005559464.aomnqe' cmd='[{"prefix": "auth get-or-create", "entity": "client.alice bob", "caps": ["mds", "allow r path=/volumes/_nogroup/1f8ee972-fbad-4b31-8005-874e13dc9275/fbd4d206-5fcf-4579-a13d-8e4adc225411", "osd", "allow r pool=manila_data namespace=fsvolumens_1f8ee972-fbad-4b31-8005-874e13dc9275", "mon", "allow r"], "format": "json"}]': finished Dec 15 05:15:50 localhost ceph-mon[298913]: from='mgr.34411 172.18.0.108:0/929118679' entity='mgr.np0005559464.aomnqe' cmd={"prefix": "auth get", "entity": "client.alice bob", "format": "json"} : dispatch Dec 15 05:15:50 localhost ceph-mon[298913]: from='mgr.34411 172.18.0.108:0/929118679' entity='mgr.np0005559464.aomnqe' cmd={"prefix": "auth get-or-create", "entity": "client.alice bob", "caps": ["mds", "allow r path=/volumes/_nogroup/1f8ee972-fbad-4b31-8005-874e13dc9275/fbd4d206-5fcf-4579-a13d-8e4adc225411", "osd", "allow r pool=manila_data namespace=fsvolumens_1f8ee972-fbad-4b31-8005-874e13dc9275", "mon", "allow r"], "format": "json"} : dispatch Dec 15 05:15:50 localhost ceph-mon[298913]: from='mgr.34411 ' entity='mgr.np0005559464.aomnqe' cmd={"prefix": "auth get-or-create", "entity": "client.alice bob", "caps": ["mds", "allow r path=/volumes/_nogroup/1f8ee972-fbad-4b31-8005-874e13dc9275/fbd4d206-5fcf-4579-a13d-8e4adc225411", "osd", "allow r pool=manila_data namespace=fsvolumens_1f8ee972-fbad-4b31-8005-874e13dc9275", "mon", "allow r"], "format": "json"} : dispatch Dec 15 05:15:51 localhost ceph-mon[298913]: from='mgr.34411 ' entity='mgr.np0005559464.aomnqe' cmd='[{"prefix": "auth get-or-create", "entity": "client.alice bob", "caps": ["mds", "allow r 
path=/volumes/_nogroup/1f8ee972-fbad-4b31-8005-874e13dc9275/fbd4d206-5fcf-4579-a13d-8e4adc225411", "osd", "allow r pool=manila_data namespace=fsvolumens_1f8ee972-fbad-4b31-8005-874e13dc9275", "mon", "allow r"], "format": "json"}]': finished Dec 15 05:15:51 localhost ovn_metadata_agent[160585]: 2025-12-15 10:15:51.491 160590 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Dec 15 05:15:51 localhost ovn_metadata_agent[160585]: 2025-12-15 10:15:51.491 160590 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Dec 15 05:15:51 localhost ovn_metadata_agent[160585]: 2025-12-15 10:15:51.492 160590 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Dec 15 05:15:52 localhost ceph-mon[298913]: mon.np0005559462@0(leader).osd e280 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104 Dec 15 05:15:52 localhost nova_compute[286344]: 2025-12-15 10:15:52.054 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:15:53 localhost ovn_metadata_agent[160585]: 2025-12-15 10:15:53.133 160590 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=12d96d64-e862-4f68-81e5-8d9ec5d3a5e2, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '23'}),), if_exists=True) do_commit 
/usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m Dec 15 05:15:53 localhost ceph-mon[298913]: mon.np0005559462@0(leader) e17 handle_command mon_command({"prefix": "auth rm", "entity": "client.alice bob"} v 0) Dec 15 05:15:53 localhost ceph-mon[298913]: log_channel(audit) log [INF] : from='mgr.34411 ' entity='mgr.np0005559464.aomnqe' cmd={"prefix": "auth rm", "entity": "client.alice bob"} : dispatch Dec 15 05:15:53 localhost ceph-mon[298913]: log_channel(audit) log [INF] : from='mgr.34411 ' entity='mgr.np0005559464.aomnqe' cmd='[{"prefix": "auth rm", "entity": "client.alice bob"}]': finished Dec 15 05:15:54 localhost ceph-mon[298913]: from='mgr.34411 172.18.0.108:0/929118679' entity='mgr.np0005559464.aomnqe' cmd={"prefix": "auth get", "entity": "client.alice bob", "format": "json"} : dispatch Dec 15 05:15:54 localhost ceph-mon[298913]: from='mgr.34411 172.18.0.108:0/929118679' entity='mgr.np0005559464.aomnqe' cmd={"prefix": "auth rm", "entity": "client.alice bob"} : dispatch Dec 15 05:15:54 localhost ceph-mon[298913]: from='mgr.34411 ' entity='mgr.np0005559464.aomnqe' cmd={"prefix": "auth rm", "entity": "client.alice bob"} : dispatch Dec 15 05:15:54 localhost ceph-mon[298913]: from='mgr.34411 ' entity='mgr.np0005559464.aomnqe' cmd='[{"prefix": "auth rm", "entity": "client.alice bob"}]': finished Dec 15 05:15:55 localhost nova_compute[286344]: 2025-12-15 10:15:55.101 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:15:57 localhost ceph-mon[298913]: mon.np0005559462@0(leader).osd e280 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104 Dec 15 05:15:57 localhost ceph-mon[298913]: mon.np0005559462@0(leader).osd e280 do_prune osdmap full prune enabled Dec 15 05:15:57 localhost nova_compute[286344]: 2025-12-15 10:15:57.057 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 
__log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:15:57 localhost ceph-mon[298913]: mon.np0005559462@0(leader).osd e281 e281: 6 total, 6 up, 6 in Dec 15 05:15:57 localhost ceph-mon[298913]: log_channel(cluster) log [DBG] : osdmap e281: 6 total, 6 up, 6 in Dec 15 05:15:57 localhost ceph-mon[298913]: mon.np0005559462@0(leader) e17 handle_command mon_command({"prefix": "auth get-or-create", "entity": "client.alice", "caps": ["mds", "allow rw path=/volumes/_nogroup/1f8ee972-fbad-4b31-8005-874e13dc9275/fbd4d206-5fcf-4579-a13d-8e4adc225411", "osd", "allow rw pool=manila_data namespace=fsvolumens_1f8ee972-fbad-4b31-8005-874e13dc9275", "mon", "allow r"], "format": "json"} v 0) Dec 15 05:15:57 localhost ceph-mon[298913]: log_channel(audit) log [INF] : from='mgr.34411 ' entity='mgr.np0005559464.aomnqe' cmd={"prefix": "auth get-or-create", "entity": "client.alice", "caps": ["mds", "allow rw path=/volumes/_nogroup/1f8ee972-fbad-4b31-8005-874e13dc9275/fbd4d206-5fcf-4579-a13d-8e4adc225411", "osd", "allow rw pool=manila_data namespace=fsvolumens_1f8ee972-fbad-4b31-8005-874e13dc9275", "mon", "allow r"], "format": "json"} : dispatch Dec 15 05:15:57 localhost ceph-mon[298913]: log_channel(audit) log [INF] : from='mgr.34411 ' entity='mgr.np0005559464.aomnqe' cmd='[{"prefix": "auth get-or-create", "entity": "client.alice", "caps": ["mds", "allow rw path=/volumes/_nogroup/1f8ee972-fbad-4b31-8005-874e13dc9275/fbd4d206-5fcf-4579-a13d-8e4adc225411", "osd", "allow rw pool=manila_data namespace=fsvolumens_1f8ee972-fbad-4b31-8005-874e13dc9275", "mon", "allow r"], "format": "json"}]': finished Dec 15 05:15:57 localhost ceph-mon[298913]: from='mgr.34411 172.18.0.108:0/929118679' entity='mgr.np0005559464.aomnqe' cmd={"prefix": "auth get", "entity": "client.alice", "format": "json"} : dispatch Dec 15 05:15:57 localhost ceph-mon[298913]: from='mgr.34411 172.18.0.108:0/929118679' entity='mgr.np0005559464.aomnqe' cmd={"prefix": "auth get-or-create", "entity": 
"client.alice", "caps": ["mds", "allow rw path=/volumes/_nogroup/1f8ee972-fbad-4b31-8005-874e13dc9275/fbd4d206-5fcf-4579-a13d-8e4adc225411", "osd", "allow rw pool=manila_data namespace=fsvolumens_1f8ee972-fbad-4b31-8005-874e13dc9275", "mon", "allow r"], "format": "json"} : dispatch Dec 15 05:15:57 localhost ceph-mon[298913]: from='mgr.34411 ' entity='mgr.np0005559464.aomnqe' cmd={"prefix": "auth get-or-create", "entity": "client.alice", "caps": ["mds", "allow rw path=/volumes/_nogroup/1f8ee972-fbad-4b31-8005-874e13dc9275/fbd4d206-5fcf-4579-a13d-8e4adc225411", "osd", "allow rw pool=manila_data namespace=fsvolumens_1f8ee972-fbad-4b31-8005-874e13dc9275", "mon", "allow r"], "format": "json"} : dispatch Dec 15 05:15:57 localhost ceph-mon[298913]: from='mgr.34411 ' entity='mgr.np0005559464.aomnqe' cmd='[{"prefix": "auth get-or-create", "entity": "client.alice", "caps": ["mds", "allow rw path=/volumes/_nogroup/1f8ee972-fbad-4b31-8005-874e13dc9275/fbd4d206-5fcf-4579-a13d-8e4adc225411", "osd", "allow rw pool=manila_data namespace=fsvolumens_1f8ee972-fbad-4b31-8005-874e13dc9275", "mon", "allow r"], "format": "json"}]': finished Dec 15 05:15:57 localhost systemd[1]: Started /usr/bin/podman healthcheck run 67ee72fcd99c896f79e8807764b1d0f04e7d45927d06e306c0986394e8ed42a0. Dec 15 05:15:57 localhost systemd[1]: Started /usr/bin/podman healthcheck run 730c50020a7d9e2da21d2462d40af20ac303e43f3491f692cbbd87f432ba3e09. Dec 15 05:15:57 localhost systemd[1]: Started /usr/bin/podman healthcheck run 9f0f481c937606298203797a71924b7614a5d9e3f4a8afd837cfa5b9602aedeb. Dec 15 05:15:57 localhost systemd[1]: Started /usr/bin/podman healthcheck run ac2eae30a2a7f971398af42ea67b3ed799d4b3341f7688ebcd73f7ca7f66c2a8. Dec 15 05:15:57 localhost systemd[1]: Started /usr/bin/podman healthcheck run b3d8483ade539500e228c2af9c0c3e647febb5ec7903610a83aea32631007d4a. 
Dec 15 05:15:57 localhost podman[337654]: 2025-12-15 10:15:57.782049734 +0000 UTC m=+0.098833417 container health_status 9f0f481c937606298203797a71924b7614a5d9e3f4a8afd837cfa5b9602aedeb (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, managed_by=edpm_ansible, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=multipathd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team) Dec 15 05:15:57 localhost podman[337652]: 2025-12-15 10:15:57.842814413 +0000 UTC m=+0.168976000 container health_status 
67ee72fcd99c896f79e8807764b1d0f04e7d45927d06e306c0986394e8ed42a0 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, maintainer=The Prometheus Authors , managed_by=edpm_ansible, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter) Dec 15 05:15:57 localhost podman[337652]: 2025-12-15 10:15:57.855387003 +0000 UTC m=+0.181548620 container exec_died 67ee72fcd99c896f79e8807764b1d0f04e7d45927d06e306c0986394e8ed42a0 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, managed_by=edpm_ansible, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\\\.service', '--no-collector.dmi', '--no-collector.entropy', 
'--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors ) Dec 15 05:15:57 localhost systemd[1]: 67ee72fcd99c896f79e8807764b1d0f04e7d45927d06e306c0986394e8ed42a0.service: Deactivated successfully. Dec 15 05:15:57 localhost podman[337653]: 2025-12-15 10:15:57.76634865 +0000 UTC m=+0.088930111 container health_status 730c50020a7d9e2da21d2462d40af20ac303e43f3491f692cbbd87f432ba3e09 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., vendor=Red Hat, Inc., architecture=x86_64, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, release=1755695350, url=https://catalog.redhat.com/en/search?searchType=containers, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'cb1e9478634a00c8fb5ad9cd7fcd38c7b2a1a85187dfaa618bc715c24c2b15fb'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, distribution-scope=public, container_name=openstack_network_exporter, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=ubi9-minimal, com.redhat.component=ubi9-minimal-container, config_id=openstack_network_exporter, version=9.6, maintainer=Red Hat, Inc., vcs-type=git, io.buildah.version=1.33.7, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., build-date=2025-08-20T13:12:41, io.openshift.expose-services=, io.openshift.tags=minimal rhel9, managed_by=edpm_ansible) Dec 15 05:15:57 localhost podman[337654]: 2025-12-15 10:15:57.896278706 +0000 UTC m=+0.213062379 container exec_died 9f0f481c937606298203797a71924b7614a5d9e3f4a8afd837cfa5b9602aedeb (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible) Dec 
15 05:15:57 localhost systemd[1]: 9f0f481c937606298203797a71924b7614a5d9e3f4a8afd837cfa5b9602aedeb.service: Deactivated successfully. Dec 15 05:15:57 localhost podman[337666]: 2025-12-15 10:15:57.819819093 +0000 UTC m=+0.131044007 container health_status b3d8483ade539500e228c2af9c0c3e647febb5ec7903610a83aea32631007d4a (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '07f3af15d79276418f54a8dde2fc251064527c3571430e14af77aaa2c01f1193-cb1e9478634a00c8fb5ad9cd7fcd38c7b2a1a85187dfaa618bc715c24c2b15fb'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=ceilometer_agent_compute, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, 
container_name=ceilometer_agent_compute, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.vendor=CentOS) Dec 15 05:15:57 localhost podman[337660]: 2025-12-15 10:15:57.989966134 +0000 UTC m=+0.301414744 container health_status ac2eae30a2a7f971398af42ea67b3ed799d4b3341f7688ebcd73f7ca7f66c2a8 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_id=ovn_controller, container_name=ovn_controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '07f3af15d79276418f54a8dde2fc251064527c3571430e14af77aaa2c01f1193'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3) Dec 15 05:15:58 localhost podman[337653]: 2025-12-15 10:15:58.00315309 +0000 UTC m=+0.325734541 container exec_died 730c50020a7d9e2da21d2462d40af20ac303e43f3491f692cbbd87f432ba3e09 
(image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, maintainer=Red Hat, Inc., vcs-type=git, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.tags=minimal rhel9, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'cb1e9478634a00c8fb5ad9cd7fcd38c7b2a1a85187dfaa618bc715c24c2b15fb'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., name=ubi9-minimal, distribution-scope=public, io.openshift.expose-services=, io.buildah.version=1.33.7, build-date=2025-08-20T13:12:41, architecture=x86_64, managed_by=edpm_ansible, url=https://catalog.redhat.com/en/search?searchType=containers, version=9.6, container_name=openstack_network_exporter, vendor=Red Hat, Inc., config_id=openstack_network_exporter, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., release=1755695350, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.component=ubi9-minimal-container) Dec 15 05:15:58 localhost systemd[1]: 730c50020a7d9e2da21d2462d40af20ac303e43f3491f692cbbd87f432ba3e09.service: Deactivated successfully. 
Dec 15 05:15:58 localhost podman[337666]: 2025-12-15 10:15:58.055173743 +0000 UTC m=+0.366398697 container exec_died b3d8483ade539500e228c2af9c0c3e647febb5ec7903610a83aea32631007d4a (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '07f3af15d79276418f54a8dde2fc251064527c3571430e14af77aaa2c01f1193-cb1e9478634a00c8fb5ad9cd7fcd38c7b2a1a85187dfaa618bc715c24c2b15fb'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=ceilometer_agent_compute, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.license=GPLv2) Dec 
15 05:15:58 localhost systemd[1]: b3d8483ade539500e228c2af9c0c3e647febb5ec7903610a83aea32631007d4a.service: Deactivated successfully. Dec 15 05:15:58 localhost podman[337660]: 2025-12-15 10:15:58.107726671 +0000 UTC m=+0.419175241 container exec_died ac2eae30a2a7f971398af42ea67b3ed799d4b3341f7688ebcd73f7ca7f66c2a8 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '07f3af15d79276418f54a8dde2fc251064527c3571430e14af77aaa2c01f1193'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=ovn_controller, io.buildah.version=1.41.3) Dec 15 05:15:58 localhost systemd[1]: ac2eae30a2a7f971398af42ea67b3ed799d4b3341f7688ebcd73f7ca7f66c2a8.service: Deactivated successfully. 
Dec 15 05:16:00 localhost nova_compute[286344]: 2025-12-15 10:16:00.104 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:16:00 localhost ceph-mon[298913]: mon.np0005559462@0(leader) e17 handle_command mon_command({"prefix": "auth rm", "entity": "client.alice"} v 0) Dec 15 05:16:00 localhost ceph-mon[298913]: log_channel(audit) log [INF] : from='mgr.34411 ' entity='mgr.np0005559464.aomnqe' cmd={"prefix": "auth rm", "entity": "client.alice"} : dispatch Dec 15 05:16:00 localhost ceph-mon[298913]: log_channel(audit) log [INF] : from='mgr.34411 ' entity='mgr.np0005559464.aomnqe' cmd='[{"prefix": "auth rm", "entity": "client.alice"}]': finished Dec 15 05:16:01 localhost ceph-mon[298913]: from='mgr.34411 172.18.0.108:0/929118679' entity='mgr.np0005559464.aomnqe' cmd={"prefix": "auth get", "entity": "client.alice", "format": "json"} : dispatch Dec 15 05:16:01 localhost ceph-mon[298913]: from='mgr.34411 172.18.0.108:0/929118679' entity='mgr.np0005559464.aomnqe' cmd={"prefix": "auth rm", "entity": "client.alice"} : dispatch Dec 15 05:16:01 localhost ceph-mon[298913]: from='mgr.34411 ' entity='mgr.np0005559464.aomnqe' cmd={"prefix": "auth rm", "entity": "client.alice"} : dispatch Dec 15 05:16:01 localhost ceph-mon[298913]: from='mgr.34411 ' entity='mgr.np0005559464.aomnqe' cmd='[{"prefix": "auth rm", "entity": "client.alice"}]': finished Dec 15 05:16:01 localhost sshd[337758]: main: sshd: ssh-rsa algorithm is disabled Dec 15 05:16:01 localhost podman[243449]: time="2025-12-15T10:16:01Z" level=info msg="List containers: received `last` parameter - overwriting `limit`" Dec 15 05:16:01 localhost podman[243449]: @ - - [15/Dec/2025:10:16:01 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 156640 "" "Go-http-client/1.1" Dec 15 05:16:01 localhost podman[243449]: @ - - [15/Dec/2025:10:16:01 +0000] "GET 
/v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 19273 "" "Go-http-client/1.1" Dec 15 05:16:02 localhost ceph-mon[298913]: mon.np0005559462@0(leader).osd e281 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104 Dec 15 05:16:02 localhost nova_compute[286344]: 2025-12-15 10:16:02.060 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:16:03 localhost sshd[337760]: main: sshd: ssh-rsa algorithm is disabled Dec 15 05:16:03 localhost ceph-mon[298913]: mon.np0005559462@0(leader) e17 handle_command mon_command({"prefix": "auth get-or-create", "entity": "client.alice", "caps": ["mds", "allow r path=/volumes/_nogroup/1f8ee972-fbad-4b31-8005-874e13dc9275/fbd4d206-5fcf-4579-a13d-8e4adc225411", "osd", "allow r pool=manila_data namespace=fsvolumens_1f8ee972-fbad-4b31-8005-874e13dc9275", "mon", "allow r"], "format": "json"} v 0) Dec 15 05:16:03 localhost ceph-mon[298913]: log_channel(audit) log [INF] : from='mgr.34411 ' entity='mgr.np0005559464.aomnqe' cmd={"prefix": "auth get-or-create", "entity": "client.alice", "caps": ["mds", "allow r path=/volumes/_nogroup/1f8ee972-fbad-4b31-8005-874e13dc9275/fbd4d206-5fcf-4579-a13d-8e4adc225411", "osd", "allow r pool=manila_data namespace=fsvolumens_1f8ee972-fbad-4b31-8005-874e13dc9275", "mon", "allow r"], "format": "json"} : dispatch Dec 15 05:16:03 localhost ceph-mon[298913]: log_channel(audit) log [INF] : from='mgr.34411 ' entity='mgr.np0005559464.aomnqe' cmd='[{"prefix": "auth get-or-create", "entity": "client.alice", "caps": ["mds", "allow r path=/volumes/_nogroup/1f8ee972-fbad-4b31-8005-874e13dc9275/fbd4d206-5fcf-4579-a13d-8e4adc225411", "osd", "allow r pool=manila_data namespace=fsvolumens_1f8ee972-fbad-4b31-8005-874e13dc9275", "mon", "allow r"], "format": "json"}]': finished Dec 15 05:16:03 localhost systemd[1]: Started /usr/bin/podman healthcheck run 
4d46c406a3855fd1c322e5d86c048188ea5865daa07ba23aaebacc7879668923. Dec 15 05:16:03 localhost podman[337762]: 2025-12-15 10:16:03.75543786 +0000 UTC m=+0.078731497 container health_status 4d46c406a3855fd1c322e5d86c048188ea5865daa07ba23aaebacc7879668923 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, tcib_managed=true, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '07f3af15d79276418f54a8dde2fc251064527c3571430e14af77aaa2c01f1193-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible) Dec 15 05:16:03 localhost podman[337762]: 2025-12-15 10:16:03.786760747 +0000 UTC m=+0.110054344 container exec_died 4d46c406a3855fd1c322e5d86c048188ea5865daa07ba23aaebacc7879668923 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '07f3af15d79276418f54a8dde2fc251064527c3571430e14af77aaa2c01f1193-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, 
config_id=ovn_metadata_agent, tcib_managed=true) Dec 15 05:16:03 localhost systemd[1]: 4d46c406a3855fd1c322e5d86c048188ea5865daa07ba23aaebacc7879668923.service: Deactivated successfully. Dec 15 05:16:04 localhost ceph-mon[298913]: from='mgr.34411 172.18.0.108:0/929118679' entity='mgr.np0005559464.aomnqe' cmd={"prefix": "auth get", "entity": "client.alice", "format": "json"} : dispatch Dec 15 05:16:04 localhost ceph-mon[298913]: from='mgr.34411 172.18.0.108:0/929118679' entity='mgr.np0005559464.aomnqe' cmd={"prefix": "auth get-or-create", "entity": "client.alice", "caps": ["mds", "allow r path=/volumes/_nogroup/1f8ee972-fbad-4b31-8005-874e13dc9275/fbd4d206-5fcf-4579-a13d-8e4adc225411", "osd", "allow r pool=manila_data namespace=fsvolumens_1f8ee972-fbad-4b31-8005-874e13dc9275", "mon", "allow r"], "format": "json"} : dispatch Dec 15 05:16:04 localhost ceph-mon[298913]: from='mgr.34411 ' entity='mgr.np0005559464.aomnqe' cmd={"prefix": "auth get-or-create", "entity": "client.alice", "caps": ["mds", "allow r path=/volumes/_nogroup/1f8ee972-fbad-4b31-8005-874e13dc9275/fbd4d206-5fcf-4579-a13d-8e4adc225411", "osd", "allow r pool=manila_data namespace=fsvolumens_1f8ee972-fbad-4b31-8005-874e13dc9275", "mon", "allow r"], "format": "json"} : dispatch Dec 15 05:16:04 localhost ceph-mon[298913]: from='mgr.34411 ' entity='mgr.np0005559464.aomnqe' cmd='[{"prefix": "auth get-or-create", "entity": "client.alice", "caps": ["mds", "allow r path=/volumes/_nogroup/1f8ee972-fbad-4b31-8005-874e13dc9275/fbd4d206-5fcf-4579-a13d-8e4adc225411", "osd", "allow r pool=manila_data namespace=fsvolumens_1f8ee972-fbad-4b31-8005-874e13dc9275", "mon", "allow r"], "format": "json"}]': finished Dec 15 05:16:04 localhost openstack_network_exporter[246484]: ERROR 10:16:04 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server Dec 15 05:16:04 localhost openstack_network_exporter[246484]: ERROR 10:16:04 appctl.go:144: Failed to get PID for ovn-northd: no 
control socket files found for ovn-northd Dec 15 05:16:04 localhost openstack_network_exporter[246484]: ERROR 10:16:04 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd Dec 15 05:16:04 localhost openstack_network_exporter[246484]: ERROR 10:16:04 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath Dec 15 05:16:04 localhost openstack_network_exporter[246484]: Dec 15 05:16:04 localhost openstack_network_exporter[246484]: ERROR 10:16:04 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath Dec 15 05:16:04 localhost openstack_network_exporter[246484]: Dec 15 05:16:05 localhost nova_compute[286344]: 2025-12-15 10:16:05.108 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:16:05 localhost ceph-mon[298913]: mon.np0005559462@0(leader) e17 handle_command mon_command({"prefix":"df", "format":"json"} v 0) Dec 15 05:16:05 localhost ceph-mon[298913]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/1069434045' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch Dec 15 05:16:05 localhost ceph-mon[298913]: mon.np0005559462@0(leader) e17 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0) Dec 15 05:16:05 localhost ceph-mon[298913]: log_channel(audit) log [DBG] : from='client.? 
172.18.0.32:0/1069434045' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch Dec 15 05:16:06 localhost ceph-mon[298913]: mon.np0005559462@0(leader) e17 handle_command mon_command({"prefix": "auth rm", "entity": "client.alice"} v 0) Dec 15 05:16:06 localhost ceph-mon[298913]: log_channel(audit) log [INF] : from='mgr.34411 ' entity='mgr.np0005559464.aomnqe' cmd={"prefix": "auth rm", "entity": "client.alice"} : dispatch Dec 15 05:16:06 localhost ceph-mon[298913]: log_channel(audit) log [INF] : from='mgr.34411 ' entity='mgr.np0005559464.aomnqe' cmd='[{"prefix": "auth rm", "entity": "client.alice"}]': finished Dec 15 05:16:07 localhost ceph-mon[298913]: mon.np0005559462@0(leader).osd e281 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104 Dec 15 05:16:07 localhost nova_compute[286344]: 2025-12-15 10:16:07.063 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:16:07 localhost ceph-mon[298913]: from='mgr.34411 172.18.0.108:0/929118679' entity='mgr.np0005559464.aomnqe' cmd={"prefix": "auth get", "entity": "client.alice", "format": "json"} : dispatch Dec 15 05:16:07 localhost ceph-mon[298913]: from='mgr.34411 172.18.0.108:0/929118679' entity='mgr.np0005559464.aomnqe' cmd={"prefix": "auth rm", "entity": "client.alice"} : dispatch Dec 15 05:16:07 localhost ceph-mon[298913]: from='mgr.34411 ' entity='mgr.np0005559464.aomnqe' cmd={"prefix": "auth rm", "entity": "client.alice"} : dispatch Dec 15 05:16:07 localhost ceph-mon[298913]: from='mgr.34411 ' entity='mgr.np0005559464.aomnqe' cmd='[{"prefix": "auth rm", "entity": "client.alice"}]': finished Dec 15 05:16:10 localhost ceph-mon[298913]: mon.np0005559462@0(leader) e17 handle_command mon_command({"prefix": "auth get-or-create", "entity": "client.alice_bob", "caps": ["mds", "allow rw 
path=/volumes/_nogroup/1f8ee972-fbad-4b31-8005-874e13dc9275/fbd4d206-5fcf-4579-a13d-8e4adc225411", "osd", "allow rw pool=manila_data namespace=fsvolumens_1f8ee972-fbad-4b31-8005-874e13dc9275", "mon", "allow r"], "format": "json"} v 0) Dec 15 05:16:10 localhost ceph-mon[298913]: log_channel(audit) log [INF] : from='mgr.34411 ' entity='mgr.np0005559464.aomnqe' cmd={"prefix": "auth get-or-create", "entity": "client.alice_bob", "caps": ["mds", "allow rw path=/volumes/_nogroup/1f8ee972-fbad-4b31-8005-874e13dc9275/fbd4d206-5fcf-4579-a13d-8e4adc225411", "osd", "allow rw pool=manila_data namespace=fsvolumens_1f8ee972-fbad-4b31-8005-874e13dc9275", "mon", "allow r"], "format": "json"} : dispatch Dec 15 05:16:10 localhost ceph-mon[298913]: log_channel(audit) log [INF] : from='mgr.34411 ' entity='mgr.np0005559464.aomnqe' cmd='[{"prefix": "auth get-or-create", "entity": "client.alice_bob", "caps": ["mds", "allow rw path=/volumes/_nogroup/1f8ee972-fbad-4b31-8005-874e13dc9275/fbd4d206-5fcf-4579-a13d-8e4adc225411", "osd", "allow rw pool=manila_data namespace=fsvolumens_1f8ee972-fbad-4b31-8005-874e13dc9275", "mon", "allow r"], "format": "json"}]': finished Dec 15 05:16:10 localhost nova_compute[286344]: 2025-12-15 10:16:10.161 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:16:10 localhost ceph-mon[298913]: from='mgr.34411 172.18.0.108:0/929118679' entity='mgr.np0005559464.aomnqe' cmd={"prefix": "auth get", "entity": "client.alice_bob", "format": "json"} : dispatch Dec 15 05:16:10 localhost ceph-mon[298913]: from='mgr.34411 172.18.0.108:0/929118679' entity='mgr.np0005559464.aomnqe' cmd={"prefix": "auth get-or-create", "entity": "client.alice_bob", "caps": ["mds", "allow rw path=/volumes/_nogroup/1f8ee972-fbad-4b31-8005-874e13dc9275/fbd4d206-5fcf-4579-a13d-8e4adc225411", "osd", "allow rw pool=manila_data namespace=fsvolumens_1f8ee972-fbad-4b31-8005-874e13dc9275", "mon", "allow r"], 
"format": "json"} : dispatch Dec 15 05:16:10 localhost ceph-mon[298913]: from='mgr.34411 ' entity='mgr.np0005559464.aomnqe' cmd={"prefix": "auth get-or-create", "entity": "client.alice_bob", "caps": ["mds", "allow rw path=/volumes/_nogroup/1f8ee972-fbad-4b31-8005-874e13dc9275/fbd4d206-5fcf-4579-a13d-8e4adc225411", "osd", "allow rw pool=manila_data namespace=fsvolumens_1f8ee972-fbad-4b31-8005-874e13dc9275", "mon", "allow r"], "format": "json"} : dispatch Dec 15 05:16:10 localhost ceph-mon[298913]: from='mgr.34411 ' entity='mgr.np0005559464.aomnqe' cmd='[{"prefix": "auth get-or-create", "entity": "client.alice_bob", "caps": ["mds", "allow rw path=/volumes/_nogroup/1f8ee972-fbad-4b31-8005-874e13dc9275/fbd4d206-5fcf-4579-a13d-8e4adc225411", "osd", "allow rw pool=manila_data namespace=fsvolumens_1f8ee972-fbad-4b31-8005-874e13dc9275", "mon", "allow r"], "format": "json"}]': finished Dec 15 05:16:10 localhost systemd[1]: Started /usr/bin/podman healthcheck run a4e94bd7aaee6c174c601060fb5f34559019ccea93fc397263ee3735731cfc1e. 
Dec 15 05:16:10 localhost podman[337781]: 2025-12-15 10:16:10.746617312 +0000 UTC m=+0.080881920 container health_status a4e94bd7aaee6c174c601060fb5f34559019ccea93fc397263ee3735731cfc1e (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible)
Dec 15 05:16:10 localhost podman[337781]: 2025-12-15 10:16:10.760598772 +0000 UTC m=+0.094863350 container exec_died a4e94bd7aaee6c174c601060fb5f34559019ccea93fc397263ee3735731cfc1e (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter)
Dec 15 05:16:10 localhost systemd[1]: a4e94bd7aaee6c174c601060fb5f34559019ccea93fc397263ee3735731cfc1e.service: Deactivated successfully.
Dec 15 05:16:12 localhost ceph-mon[298913]: mon.np0005559462@0(leader).osd e281 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Dec 15 05:16:12 localhost ceph-mon[298913]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #79. Immutable memtables: 0.
Dec 15 05:16:12 localhost ceph-mon[298913]: rocksdb: (Original Log Time 2025/12/15-10:16:12.048547) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Dec 15 05:16:12 localhost ceph-mon[298913]: rocksdb: [db/flush_job.cc:856] [default] [JOB 47] Flushing memtable with next log file: 79
Dec 15 05:16:12 localhost ceph-mon[298913]: rocksdb: EVENT_LOG_v1 {"time_micros": 1765793772048603, "job": 47, "event": "flush_started", "num_memtables": 1, "num_entries": 1404, "num_deletes": 260, "total_data_size": 1332086, "memory_usage": 1369440, "flush_reason": "Manual Compaction"}
Dec 15 05:16:12 localhost ceph-mon[298913]: rocksdb: [db/flush_job.cc:885] [default] [JOB 47] Level-0 flush table #80: started
Dec 15 05:16:12 localhost ceph-mon[298913]: rocksdb: EVENT_LOG_v1 {"time_micros": 1765793772061480, "cf_name": "default", "job": 47, "event": "table_file_creation", "file_number": 80, "file_size": 1307429, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 41088, "largest_seqno": 42491, "table_properties": {"data_size": 1301399, "index_size": 3116, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 1861, "raw_key_size": 15379, "raw_average_key_size": 20, "raw_value_size": 1288117, "raw_average_value_size": 1743, "num_data_blocks": 137, "num_entries": 739, "num_filter_entries": 739, "num_deletions": 260, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1765793705, "oldest_key_time": 1765793705, "file_creation_time": 1765793772, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "603b24af-e2be-4214-bc56-9e652eb4af3d", "db_session_id": "0OJRM9SCUA16EXV0VQZ2", "orig_file_number": 80, "seqno_to_time_mapping": "N/A"}}
Dec 15 05:16:12 localhost ceph-mon[298913]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 47] Flush lasted 12992 microseconds, and 5376 cpu microseconds.
Dec 15 05:16:12 localhost ceph-mon[298913]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Dec 15 05:16:12 localhost ceph-mon[298913]: rocksdb: (Original Log Time 2025/12/15-10:16:12.061539) [db/flush_job.cc:967] [default] [JOB 47] Level-0 flush table #80: 1307429 bytes OK
Dec 15 05:16:12 localhost ceph-mon[298913]: rocksdb: (Original Log Time 2025/12/15-10:16:12.061565) [db/memtable_list.cc:519] [default] Level-0 commit table #80 started
Dec 15 05:16:12 localhost ceph-mon[298913]: rocksdb: (Original Log Time 2025/12/15-10:16:12.063498) [db/memtable_list.cc:722] [default] Level-0 commit table #80: memtable #1 done
Dec 15 05:16:12 localhost ceph-mon[298913]: rocksdb: (Original Log Time 2025/12/15-10:16:12.063517) EVENT_LOG_v1 {"time_micros": 1765793772063511, "job": 47, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Dec 15 05:16:12 localhost ceph-mon[298913]: rocksdb: (Original Log Time 2025/12/15-10:16:12.063540) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Dec 15 05:16:12 localhost ceph-mon[298913]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 47] Try to delete WAL files size 1325394, prev total WAL file size 1325718, number of live WAL files 2.
Dec 15 05:16:12 localhost ceph-mon[298913]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005559462/store.db/000076.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Dec 15 05:16:12 localhost ceph-mon[298913]: rocksdb: (Original Log Time 2025/12/15-10:16:12.064203) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '6C6F676D0034353233' seq:72057594037927935, type:22 .. '6C6F676D0034373736' seq:0, type:0; will stop at (end)
Dec 15 05:16:12 localhost ceph-mon[298913]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 48] Compacting 1@0 + 1@6 files to L6, score -1.00
Dec 15 05:16:12 localhost ceph-mon[298913]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 47 Base level 0, inputs: [80(1276KB)], [78(17MB)]
Dec 15 05:16:12 localhost ceph-mon[298913]: rocksdb: EVENT_LOG_v1 {"time_micros": 1765793772064348, "job": 48, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [80], "files_L6": [78], "score": -1, "input_data_size": 19306170, "oldest_snapshot_seqno": -1}
Dec 15 05:16:12 localhost nova_compute[286344]: 2025-12-15 10:16:12.099 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 15 05:16:12 localhost ceph-mon[298913]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 48] Generated table #81: 14477 keys, 19176071 bytes, temperature: kUnknown
Dec 15 05:16:12 localhost ceph-mon[298913]: rocksdb: EVENT_LOG_v1 {"time_micros": 1765793772198775, "cf_name": "default", "job": 48, "event": "table_file_creation", "file_number": 81, "file_size": 19176071, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 19094079, "index_size": 44707, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 36229, "raw_key_size": 391059, "raw_average_key_size": 27, "raw_value_size": 18848520, "raw_average_value_size": 1301, "num_data_blocks": 1646, "num_entries": 14477, "num_filter_entries": 14477, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1765792320, "oldest_key_time": 0, "file_creation_time": 1765793772, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "603b24af-e2be-4214-bc56-9e652eb4af3d", "db_session_id": "0OJRM9SCUA16EXV0VQZ2", "orig_file_number": 81, "seqno_to_time_mapping": "N/A"}}
Dec 15 05:16:12 localhost ceph-mon[298913]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Dec 15 05:16:12 localhost ceph-mon[298913]: rocksdb: (Original Log Time 2025/12/15-10:16:12.199182) [db/compaction/compaction_job.cc:1663] [default] [JOB 48] Compacted 1@0 + 1@6 files to L6 => 19176071 bytes
Dec 15 05:16:12 localhost ceph-mon[298913]: rocksdb: (Original Log Time 2025/12/15-10:16:12.201834) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 143.5 rd, 142.5 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(1.2, 17.2 +0.0 blob) out(18.3 +0.0 blob), read-write-amplify(29.4) write-amplify(14.7) OK, records in: 15023, records dropped: 546 output_compression: NoCompression
Dec 15 05:16:12 localhost ceph-mon[298913]: rocksdb: (Original Log Time 2025/12/15-10:16:12.201863) EVENT_LOG_v1 {"time_micros": 1765793772201850, "job": 48, "event": "compaction_finished", "compaction_time_micros": 134565, "compaction_time_cpu_micros": 56459, "output_level": 6, "num_output_files": 1, "total_output_size": 19176071, "num_input_records": 15023, "num_output_records": 14477, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Dec 15 05:16:12 localhost ceph-mon[298913]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005559462/store.db/000080.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Dec 15 05:16:12 localhost ceph-mon[298913]: rocksdb: EVENT_LOG_v1 {"time_micros": 1765793772202200, "job": 48, "event": "table_file_deletion", "file_number": 80}
Dec 15 05:16:12 localhost ceph-mon[298913]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005559462/store.db/000078.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Dec 15 05:16:12 localhost ceph-mon[298913]: rocksdb: EVENT_LOG_v1 {"time_micros": 1765793772204997, "job": 48, "event": "table_file_deletion", "file_number": 78}
Dec 15 05:16:12 localhost ceph-mon[298913]: rocksdb: (Original Log Time 2025/12/15-10:16:12.064048) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec 15 05:16:12 localhost ceph-mon[298913]: rocksdb: (Original Log Time 2025/12/15-10:16:12.205095) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec 15 05:16:12 localhost ceph-mon[298913]: rocksdb: (Original Log Time 2025/12/15-10:16:12.205102) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec 15 05:16:12 localhost ceph-mon[298913]: rocksdb: (Original Log Time 2025/12/15-10:16:12.205105) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec 15 05:16:12 localhost ceph-mon[298913]: rocksdb: (Original Log Time 2025/12/15-10:16:12.205108) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec 15 05:16:12 localhost ceph-mon[298913]: rocksdb: (Original Log Time 2025/12/15-10:16:12.205111) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Dec 15 05:16:13 localhost ceph-mon[298913]: mon.np0005559462@0(leader) e17 handle_command mon_command({"prefix": "auth rm", "entity": "client.alice_bob"} v 0)
Dec 15 05:16:13 localhost ceph-mon[298913]: log_channel(audit) log [INF] : from='mgr.34411 ' entity='mgr.np0005559464.aomnqe' cmd={"prefix": "auth rm", "entity": "client.alice_bob"} : dispatch
Dec 15 05:16:13 localhost ceph-mon[298913]: log_channel(audit) log [INF] : from='mgr.34411 ' entity='mgr.np0005559464.aomnqe' cmd='[{"prefix": "auth rm", "entity": "client.alice_bob"}]': finished
Dec 15 05:16:14 localhost ceph-mon[298913]: from='mgr.34411 172.18.0.108:0/929118679' entity='mgr.np0005559464.aomnqe' cmd={"prefix": "auth get", "entity": "client.alice_bob", "format": "json"} : dispatch
Dec 15 05:16:14 localhost ceph-mon[298913]: from='mgr.34411 172.18.0.108:0/929118679' entity='mgr.np0005559464.aomnqe' cmd={"prefix": "auth rm", "entity": "client.alice_bob"} : dispatch
Dec 15 05:16:14 localhost ceph-mon[298913]: from='mgr.34411 ' entity='mgr.np0005559464.aomnqe' cmd={"prefix": "auth rm", "entity": "client.alice_bob"} : dispatch
Dec 15 05:16:14 localhost ceph-mon[298913]: from='mgr.34411 ' entity='mgr.np0005559464.aomnqe' cmd='[{"prefix": "auth rm", "entity": "client.alice_bob"}]': finished
Dec 15 05:16:15 localhost nova_compute[286344]: 2025-12-15 10:16:15.201 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 15 05:16:15 localhost ceph-mon[298913]: mon.np0005559462@0(leader) e17 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/osd_remove_queue}] v 0)
Dec 15 05:16:15 localhost ceph-mon[298913]: log_channel(audit) log [INF] : from='mgr.34411 ' entity='mgr.np0005559464.aomnqe'
Dec 15 05:16:16 localhost ceph-mon[298913]: from='mgr.34411 172.18.0.108:0/929118679' entity='mgr.np0005559464.aomnqe' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch
Dec 15 05:16:16 localhost ceph-mon[298913]: from='mgr.34411 ' entity='mgr.np0005559464.aomnqe'
Dec 15 05:16:16 localhost ceph-mon[298913]: mon.np0005559462@0(leader) e17 handle_command mon_command({"prefix": "auth get-or-create", "entity": "client.alice_bob", "caps": ["mds", "allow r path=/volumes/_nogroup/1f8ee972-fbad-4b31-8005-874e13dc9275/fbd4d206-5fcf-4579-a13d-8e4adc225411", "osd", "allow r pool=manila_data namespace=fsvolumens_1f8ee972-fbad-4b31-8005-874e13dc9275", "mon", "allow r"], "format": "json"} v 0)
Dec 15 05:16:16 localhost ceph-mon[298913]: log_channel(audit) log [INF] : from='mgr.34411 ' entity='mgr.np0005559464.aomnqe' cmd={"prefix": "auth get-or-create", "entity": "client.alice_bob", "caps": ["mds", "allow r path=/volumes/_nogroup/1f8ee972-fbad-4b31-8005-874e13dc9275/fbd4d206-5fcf-4579-a13d-8e4adc225411", "osd", "allow r pool=manila_data namespace=fsvolumens_1f8ee972-fbad-4b31-8005-874e13dc9275", "mon", "allow r"], "format": "json"} : dispatch
Dec 15 05:16:16 localhost ceph-mon[298913]: log_channel(audit) log [INF] : from='mgr.34411 ' entity='mgr.np0005559464.aomnqe' cmd='[{"prefix": "auth get-or-create", "entity": "client.alice_bob", "caps": ["mds", "allow r path=/volumes/_nogroup/1f8ee972-fbad-4b31-8005-874e13dc9275/fbd4d206-5fcf-4579-a13d-8e4adc225411", "osd", "allow r pool=manila_data namespace=fsvolumens_1f8ee972-fbad-4b31-8005-874e13dc9275", "mon", "allow r"], "format": "json"}]': finished
Dec 15 05:16:17 localhost ceph-mon[298913]: mon.np0005559462@0(leader).osd e281 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Dec 15 05:16:17 localhost nova_compute[286344]: 2025-12-15 10:16:17.102 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 15 05:16:17 localhost ceph-mon[298913]: from='mgr.34411 172.18.0.108:0/929118679' entity='mgr.np0005559464.aomnqe' cmd={"prefix": "auth get", "entity": "client.alice_bob", "format": "json"} : dispatch
Dec 15 05:16:17 localhost ceph-mon[298913]: from='mgr.34411 172.18.0.108:0/929118679' entity='mgr.np0005559464.aomnqe' cmd={"prefix": "auth get-or-create", "entity": "client.alice_bob", "caps": ["mds", "allow r path=/volumes/_nogroup/1f8ee972-fbad-4b31-8005-874e13dc9275/fbd4d206-5fcf-4579-a13d-8e4adc225411", "osd", "allow r pool=manila_data namespace=fsvolumens_1f8ee972-fbad-4b31-8005-874e13dc9275", "mon", "allow r"], "format": "json"} : dispatch
Dec 15 05:16:17 localhost ceph-mon[298913]: from='mgr.34411 ' entity='mgr.np0005559464.aomnqe' cmd={"prefix": "auth get-or-create", "entity": "client.alice_bob", "caps": ["mds", "allow r path=/volumes/_nogroup/1f8ee972-fbad-4b31-8005-874e13dc9275/fbd4d206-5fcf-4579-a13d-8e4adc225411", "osd", "allow r pool=manila_data namespace=fsvolumens_1f8ee972-fbad-4b31-8005-874e13dc9275", "mon", "allow r"], "format": "json"} : dispatch
Dec 15 05:16:17 localhost ceph-mon[298913]: from='mgr.34411 ' entity='mgr.np0005559464.aomnqe' cmd='[{"prefix": "auth get-or-create", "entity": "client.alice_bob", "caps": ["mds", "allow r path=/volumes/_nogroup/1f8ee972-fbad-4b31-8005-874e13dc9275/fbd4d206-5fcf-4579-a13d-8e4adc225411", "osd", "allow r pool=manila_data namespace=fsvolumens_1f8ee972-fbad-4b31-8005-874e13dc9275", "mon", "allow r"], "format": "json"}]': finished
Dec 15 05:16:19 localhost ceph-mon[298913]: mon.np0005559462@0(leader) e17 handle_command mon_command([{prefix=config-key set, key=mgr/progress/completed}] v 0)
Dec 15 05:16:19 localhost ceph-mon[298913]: log_channel(audit) log [INF] : from='mgr.34411 ' entity='mgr.np0005559464.aomnqe'
Dec 15 05:16:20 localhost ceph-mon[298913]: mon.np0005559462@0(leader) e17 handle_command mon_command({"prefix": "auth rm", "entity": "client.alice_bob"} v 0)
Dec 15 05:16:20 localhost ceph-mon[298913]: log_channel(audit) log [INF] : from='mgr.34411 ' entity='mgr.np0005559464.aomnqe' cmd={"prefix": "auth rm", "entity": "client.alice_bob"} : dispatch
Dec 15 05:16:20 localhost ceph-mon[298913]: log_channel(audit) log [INF] : from='mgr.34411 ' entity='mgr.np0005559464.aomnqe' cmd='[{"prefix": "auth rm", "entity": "client.alice_bob"}]': finished
Dec 15 05:16:20 localhost ceph-mon[298913]: from='mgr.34411 ' entity='mgr.np0005559464.aomnqe'
Dec 15 05:16:20 localhost ceph-mon[298913]: from='mgr.34411 172.18.0.108:0/929118679' entity='mgr.np0005559464.aomnqe' cmd={"prefix": "auth get", "entity": "client.alice_bob", "format": "json"} : dispatch
Dec 15 05:16:20 localhost ceph-mon[298913]: from='mgr.34411 172.18.0.108:0/929118679' entity='mgr.np0005559464.aomnqe' cmd={"prefix": "auth rm", "entity": "client.alice_bob"} : dispatch
Dec 15 05:16:20 localhost ceph-mon[298913]: from='mgr.34411 ' entity='mgr.np0005559464.aomnqe' cmd={"prefix": "auth rm", "entity": "client.alice_bob"} : dispatch
Dec 15 05:16:20 localhost ceph-mon[298913]: from='mgr.34411 ' entity='mgr.np0005559464.aomnqe' cmd='[{"prefix": "auth rm", "entity": "client.alice_bob"}]': finished
Dec 15 05:16:20 localhost nova_compute[286344]: 2025-12-15 10:16:20.228 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 15 05:16:21 localhost nova_compute[286344]: 2025-12-15 10:16:21.340 286348 DEBUG oslo_service.periodic_task [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec 15 05:16:22 localhost ceph-mon[298913]: mon.np0005559462@0(leader).osd e281 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Dec 15 05:16:22 localhost nova_compute[286344]: 2025-12-15 10:16:22.142 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 15 05:16:22 localhost nova_compute[286344]: 2025-12-15 10:16:22.271 286348 DEBUG oslo_service.periodic_task [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec 15 05:16:23 localhost nova_compute[286344]: 2025-12-15 10:16:23.266 286348 DEBUG oslo_service.periodic_task [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec 15 05:16:23 localhost ceph-mon[298913]: mon.np0005559462@0(leader) e17 handle_command mon_command({"prefix": "auth get-or-create", "entity": "client.alice bob", "caps": ["mds", "allow rw path=/volumes/_nogroup/1f8ee972-fbad-4b31-8005-874e13dc9275/fbd4d206-5fcf-4579-a13d-8e4adc225411", "osd", "allow rw pool=manila_data namespace=fsvolumens_1f8ee972-fbad-4b31-8005-874e13dc9275", "mon", "allow r"], "format": "json"} v 0)
Dec 15 05:16:23 localhost ceph-mon[298913]: log_channel(audit) log [INF] : from='mgr.34411 ' entity='mgr.np0005559464.aomnqe' cmd={"prefix": "auth get-or-create", "entity": "client.alice bob", "caps": ["mds", "allow rw path=/volumes/_nogroup/1f8ee972-fbad-4b31-8005-874e13dc9275/fbd4d206-5fcf-4579-a13d-8e4adc225411", "osd", "allow rw pool=manila_data namespace=fsvolumens_1f8ee972-fbad-4b31-8005-874e13dc9275", "mon", "allow r"], "format": "json"} : dispatch
Dec 15 05:16:23 localhost ceph-mon[298913]: log_channel(audit) log [INF] : from='mgr.34411 ' entity='mgr.np0005559464.aomnqe' cmd='[{"prefix": "auth get-or-create", "entity": "client.alice bob", "caps": ["mds", "allow rw path=/volumes/_nogroup/1f8ee972-fbad-4b31-8005-874e13dc9275/fbd4d206-5fcf-4579-a13d-8e4adc225411", "osd", "allow rw pool=manila_data namespace=fsvolumens_1f8ee972-fbad-4b31-8005-874e13dc9275", "mon", "allow r"], "format": "json"}]': finished
Dec 15 05:16:24 localhost ceph-mon[298913]: from='mgr.34411 172.18.0.108:0/929118679' entity='mgr.np0005559464.aomnqe' cmd={"prefix": "auth get", "entity": "client.alice bob", "format": "json"} : dispatch
Dec 15 05:16:24 localhost ceph-mon[298913]: from='mgr.34411 172.18.0.108:0/929118679' entity='mgr.np0005559464.aomnqe' cmd={"prefix": "auth get-or-create", "entity": "client.alice bob", "caps": ["mds", "allow rw path=/volumes/_nogroup/1f8ee972-fbad-4b31-8005-874e13dc9275/fbd4d206-5fcf-4579-a13d-8e4adc225411", "osd", "allow rw pool=manila_data namespace=fsvolumens_1f8ee972-fbad-4b31-8005-874e13dc9275", "mon", "allow r"], "format": "json"} : dispatch
Dec 15 05:16:24 localhost ceph-mon[298913]: from='mgr.34411 ' entity='mgr.np0005559464.aomnqe' cmd={"prefix": "auth get-or-create", "entity": "client.alice bob", "caps": ["mds", "allow rw path=/volumes/_nogroup/1f8ee972-fbad-4b31-8005-874e13dc9275/fbd4d206-5fcf-4579-a13d-8e4adc225411", "osd", "allow rw pool=manila_data namespace=fsvolumens_1f8ee972-fbad-4b31-8005-874e13dc9275", "mon", "allow r"], "format": "json"} : dispatch
Dec 15 05:16:24 localhost ceph-mon[298913]: from='mgr.34411 ' entity='mgr.np0005559464.aomnqe' cmd='[{"prefix": "auth get-or-create", "entity": "client.alice bob", "caps": ["mds", "allow rw path=/volumes/_nogroup/1f8ee972-fbad-4b31-8005-874e13dc9275/fbd4d206-5fcf-4579-a13d-8e4adc225411", "osd", "allow rw pool=manila_data namespace=fsvolumens_1f8ee972-fbad-4b31-8005-874e13dc9275", "mon", "allow r"], "format": "json"}]': finished
Dec 15 05:16:24 localhost nova_compute[286344]: 2025-12-15 10:16:24.270 286348 DEBUG oslo_service.periodic_task [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec 15 05:16:24 localhost nova_compute[286344]: 2025-12-15 10:16:24.271 286348 DEBUG nova.compute.manager [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Dec 15 05:16:24 localhost nova_compute[286344]: 2025-12-15 10:16:24.271 286348 DEBUG nova.compute.manager [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Dec 15 05:16:24 localhost nova_compute[286344]: 2025-12-15 10:16:24.763 286348 DEBUG oslo_concurrency.lockutils [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Acquiring lock "refresh_cache-39ff1bd9-6f6b-44c8-bbec-a1fd9d196359" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Dec 15 05:16:24 localhost nova_compute[286344]: 2025-12-15 10:16:24.763 286348 DEBUG oslo_concurrency.lockutils [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Acquired lock "refresh_cache-39ff1bd9-6f6b-44c8-bbec-a1fd9d196359" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Dec 15 05:16:24 localhost nova_compute[286344]: 2025-12-15 10:16:24.764 286348 DEBUG nova.network.neutron [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] [instance: 39ff1bd9-6f6b-44c8-bbec-a1fd9d196359] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m
Dec 15 05:16:24 localhost nova_compute[286344]: 2025-12-15 10:16:24.764 286348 DEBUG nova.objects.instance [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Lazy-loading 'info_cache' on Instance uuid 39ff1bd9-6f6b-44c8-bbec-a1fd9d196359 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Dec 15 05:16:25 localhost nova_compute[286344]: 2025-12-15 10:16:25.270 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 15 05:16:25 localhost nova_compute[286344]: 2025-12-15 10:16:25.348 286348 DEBUG nova.network.neutron [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] [instance: 39ff1bd9-6f6b-44c8-bbec-a1fd9d196359] Updating instance_info_cache with network_info: [{"id": "03ef8889-3216-43fb-8a52-4be17a956ce1", "address": "fa:16:3e:74:df:7c", "network": {"id": "befb7a72-17a9-4bcb-b561-84b8f626685a", "bridge": "br-int", "label": "private", "subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.201", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.20", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.0.3"}}], "meta": {"injected": false, "tenant_id": "c785bf23f53946bc99867d8832a50266", "mtu": 1292, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap03ef8889-32", "ovs_interfaceid": "03ef8889-3216-43fb-8a52-4be17a956ce1", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Dec 15 05:16:25 localhost nova_compute[286344]: 2025-12-15 10:16:25.375 286348 DEBUG oslo_concurrency.lockutils [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Releasing lock "refresh_cache-39ff1bd9-6f6b-44c8-bbec-a1fd9d196359" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Dec 15 05:16:25 localhost nova_compute[286344]: 2025-12-15 10:16:25.376 286348 DEBUG nova.compute.manager [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] [instance: 39ff1bd9-6f6b-44c8-bbec-a1fd9d196359] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m
Dec 15 05:16:25 localhost nova_compute[286344]: 2025-12-15 10:16:25.376 286348 DEBUG oslo_service.periodic_task [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec 15 05:16:27 localhost ceph-mon[298913]: mon.np0005559462@0(leader).osd e281 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Dec 15 05:16:27 localhost nova_compute[286344]: 2025-12-15 10:16:27.145 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 15 05:16:28 localhost ceph-mon[298913]: mon.np0005559462@0(leader) e17 handle_command mon_command({"prefix": "auth rm", "entity": "client.alice bob"} v 0)
Dec 15 05:16:28 localhost ceph-mon[298913]: log_channel(audit) log [INF] : from='mgr.34411 ' entity='mgr.np0005559464.aomnqe' cmd={"prefix": "auth rm", "entity": "client.alice bob"} : dispatch
Dec 15 05:16:28 localhost ceph-mon[298913]: log_channel(audit) log [INF] : from='mgr.34411 ' entity='mgr.np0005559464.aomnqe' cmd='[{"prefix": "auth rm", "entity": "client.alice bob"}]': finished
Dec 15 05:16:28 localhost ceph-mon[298913]: from='mgr.34411 172.18.0.108:0/929118679' entity='mgr.np0005559464.aomnqe' cmd={"prefix": "auth get", "entity": "client.alice bob", "format": "json"} : dispatch
Dec 15 05:16:28 localhost ceph-mon[298913]: from='mgr.34411 172.18.0.108:0/929118679' entity='mgr.np0005559464.aomnqe' cmd={"prefix": "auth rm", "entity": "client.alice bob"} : dispatch
Dec 15 05:16:28 localhost ceph-mon[298913]: from='mgr.34411 ' entity='mgr.np0005559464.aomnqe' cmd={"prefix": "auth rm", "entity": "client.alice bob"} : dispatch
Dec 15 05:16:28 localhost ceph-mon[298913]: from='mgr.34411 ' entity='mgr.np0005559464.aomnqe' cmd='[{"prefix": "auth rm", "entity": "client.alice bob"}]': finished
Dec 15 05:16:28 localhost nova_compute[286344]: 2025-12-15 10:16:28.270 286348 DEBUG oslo_service.periodic_task [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec 15 05:16:28 localhost systemd[1]: Started /usr/bin/podman healthcheck run 67ee72fcd99c896f79e8807764b1d0f04e7d45927d06e306c0986394e8ed42a0.
Dec 15 05:16:28 localhost systemd[1]: Started /usr/bin/podman healthcheck run 730c50020a7d9e2da21d2462d40af20ac303e43f3491f692cbbd87f432ba3e09.
Dec 15 05:16:28 localhost systemd[1]: Started /usr/bin/podman healthcheck run 9f0f481c937606298203797a71924b7614a5d9e3f4a8afd837cfa5b9602aedeb.
Dec 15 05:16:28 localhost systemd[1]: Started /usr/bin/podman healthcheck run ac2eae30a2a7f971398af42ea67b3ed799d4b3341f7688ebcd73f7ca7f66c2a8.
Dec 15 05:16:28 localhost systemd[1]: Started /usr/bin/podman healthcheck run b3d8483ade539500e228c2af9c0c3e647febb5ec7903610a83aea32631007d4a.
Dec 15 05:16:28 localhost podman[337891]: 2025-12-15 10:16:28.750811051 +0000 UTC m=+0.081977592 container health_status 67ee72fcd99c896f79e8807764b1d0f04e7d45927d06e306c0986394e8ed42a0 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, maintainer=The Prometheus Authors , managed_by=edpm_ansible, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter)
Dec 15 05:16:28 localhost podman[337907]: 2025-12-15 10:16:28.820786951 +0000 UTC m=+0.130558465 container health_status b3d8483ade539500e228c2af9c0c3e647febb5ec7903610a83aea32631007d4a (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, io.buildah.version=1.41.3, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '07f3af15d79276418f54a8dde2fc251064527c3571430e14af77aaa2c01f1193-cb1e9478634a00c8fb5ad9cd7fcd38c7b2a1a85187dfaa618bc715c24c2b15fb'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_managed=true, container_name=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=ceilometer_agent_compute, org.label-schema.build-date=20251202)
Dec 15 05:16:28 localhost podman[337904]: 2025-12-15 10:16:28.775496703 +0000 UTC m=+0.086578296 container health_status ac2eae30a2a7f971398af42ea67b3ed799d4b3341f7688ebcd73f7ca7f66c2a8 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ovn_controller, managed_by=edpm_ansible, container_name=ovn_controller, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202, io.buildah.version=1.41.3, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '07f3af15d79276418f54a8dde2fc251064527c3571430e14af77aaa2c01f1193'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.schema-version=1.0)
Dec 15 05:16:28 localhost podman[337907]: 2025-12-15 10:16:28.830161635 +0000 UTC m=+0.139933169 container exec_died b3d8483ade539500e228c2af9c0c3e647febb5ec7903610a83aea32631007d4a (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH':
'07f3af15d79276418f54a8dde2fc251064527c3571430e14af77aaa2c01f1193-cb1e9478634a00c8fb5ad9cd7fcd38c7b2a1a85187dfaa618bc715c24c2b15fb'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=ceilometer_agent_compute) Dec 15 05:16:28 localhost systemd[1]: b3d8483ade539500e228c2af9c0c3e647febb5ec7903610a83aea32631007d4a.service: Deactivated successfully. Dec 15 05:16:28 localhost podman[337892]: 2025-12-15 10:16:28.798073395 +0000 UTC m=+0.127616968 container health_status 730c50020a7d9e2da21d2462d40af20ac303e43f3491f692cbbd87f432ba3e09 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., url=https://catalog.redhat.com/en/search?searchType=containers, io.buildah.version=1.33.7, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'cb1e9478634a00c8fb5ad9cd7fcd38c7b2a1a85187dfaa618bc715c24c2b15fb'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, com.redhat.component=ubi9-minimal-container, config_id=openstack_network_exporter, vendor=Red Hat, Inc., build-date=2025-08-20T13:12:41, release=1755695350, distribution-scope=public, architecture=x86_64, name=ubi9-minimal, version=9.6, managed_by=edpm_ansible, maintainer=Red Hat, Inc., io.openshift.expose-services=, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., 
vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, container_name=openstack_network_exporter, io.openshift.tags=minimal rhel9) Dec 15 05:16:28 localhost podman[337904]: 2025-12-15 10:16:28.856388083 +0000 UTC m=+0.167469746 container exec_died ac2eae30a2a7f971398af42ea67b3ed799d4b3341f7688ebcd73f7ca7f66c2a8 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, tcib_managed=true, config_id=ovn_controller, managed_by=edpm_ansible, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '07f3af15d79276418f54a8dde2fc251064527c3571430e14af77aaa2c01f1193'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=ovn_controller, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team) Dec 15 05:16:28 localhost systemd[1]: ac2eae30a2a7f971398af42ea67b3ed799d4b3341f7688ebcd73f7ca7f66c2a8.service: Deactivated successfully. 
Dec 15 05:16:28 localhost podman[337891]: 2025-12-15 10:16:28.88563076 +0000 UTC m=+0.216797291 container exec_died 67ee72fcd99c896f79e8807764b1d0f04e7d45927d06e306c0986394e8ed42a0 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, managed_by=edpm_ansible, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors ) Dec 15 05:16:28 localhost systemd[1]: 67ee72fcd99c896f79e8807764b1d0f04e7d45927d06e306c0986394e8ed42a0.service: Deactivated successfully. 
Dec 15 05:16:28 localhost podman[337892]: 2025-12-15 10:16:28.929440482 +0000 UTC m=+0.258984105 container exec_died 730c50020a7d9e2da21d2462d40af20ac303e43f3491f692cbbd87f432ba3e09 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, io.openshift.expose-services=, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=ubi9-minimal, io.openshift.tags=minimal rhel9, maintainer=Red Hat, Inc., vendor=Red Hat, Inc., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., config_id=openstack_network_exporter, version=9.6, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.component=ubi9-minimal-container, io.buildah.version=1.33.7, release=1755695350, build-date=2025-08-20T13:12:41, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=edpm_ansible, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vcs-type=git, architecture=x86_64, distribution-scope=public, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'cb1e9478634a00c8fb5ad9cd7fcd38c7b2a1a85187dfaa618bc715c24c2b15fb'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, container_name=openstack_network_exporter, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal) Dec 15 05:16:28 localhost systemd[1]: 730c50020a7d9e2da21d2462d40af20ac303e43f3491f692cbbd87f432ba3e09.service: Deactivated successfully. 
Dec 15 05:16:28 localhost podman[337893]: 2025-12-15 10:16:28.962867081 +0000 UTC m=+0.286435750 container health_status 9f0f481c937606298203797a71924b7614a5d9e3f4a8afd837cfa5b9602aedeb (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, io.buildah.version=1.41.3, managed_by=edpm_ansible, tcib_managed=true, org.label-schema.vendor=CentOS) Dec 15 05:16:29 localhost podman[337893]: 2025-12-15 10:16:29.008350183 +0000 UTC m=+0.331918912 container exec_died 
9f0f481c937606298203797a71924b7614a5d9e3f4a8afd837cfa5b9602aedeb (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, managed_by=edpm_ansible, container_name=multipathd, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb) Dec 15 05:16:29 localhost systemd[1]: 9f0f481c937606298203797a71924b7614a5d9e3f4a8afd837cfa5b9602aedeb.service: Deactivated successfully. 
Dec 15 05:16:29 localhost nova_compute[286344]: 2025-12-15 10:16:29.270 286348 DEBUG oslo_service.periodic_task [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Dec 15 05:16:29 localhost nova_compute[286344]: 2025-12-15 10:16:29.271 286348 DEBUG nova.compute.manager [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m Dec 15 05:16:30 localhost nova_compute[286344]: 2025-12-15 10:16:30.302 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:16:31 localhost ceph-mon[298913]: mon.np0005559462@0(leader) e17 handle_command mon_command({"prefix": "auth get-or-create", "entity": "client.alice bob", "caps": ["mds", "allow r path=/volumes/_nogroup/1f8ee972-fbad-4b31-8005-874e13dc9275/fbd4d206-5fcf-4579-a13d-8e4adc225411", "osd", "allow r pool=manila_data namespace=fsvolumens_1f8ee972-fbad-4b31-8005-874e13dc9275", "mon", "allow r"], "format": "json"} v 0) Dec 15 05:16:31 localhost ceph-mon[298913]: log_channel(audit) log [INF] : from='mgr.34411 ' entity='mgr.np0005559464.aomnqe' cmd={"prefix": "auth get-or-create", "entity": "client.alice bob", "caps": ["mds", "allow r path=/volumes/_nogroup/1f8ee972-fbad-4b31-8005-874e13dc9275/fbd4d206-5fcf-4579-a13d-8e4adc225411", "osd", "allow r pool=manila_data namespace=fsvolumens_1f8ee972-fbad-4b31-8005-874e13dc9275", "mon", "allow r"], "format": "json"} : dispatch Dec 15 05:16:31 localhost ceph-mon[298913]: log_channel(audit) log [INF] : from='mgr.34411 ' entity='mgr.np0005559464.aomnqe' cmd='[{"prefix": "auth get-or-create", "entity": "client.alice bob", "caps": ["mds", "allow r 
path=/volumes/_nogroup/1f8ee972-fbad-4b31-8005-874e13dc9275/fbd4d206-5fcf-4579-a13d-8e4adc225411", "osd", "allow r pool=manila_data namespace=fsvolumens_1f8ee972-fbad-4b31-8005-874e13dc9275", "mon", "allow r"], "format": "json"}]': finished Dec 15 05:16:31 localhost podman[243449]: time="2025-12-15T10:16:31Z" level=info msg="List containers: received `last` parameter - overwriting `limit`" Dec 15 05:16:31 localhost podman[243449]: @ - - [15/Dec/2025:10:16:31 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 156640 "" "Go-http-client/1.1" Dec 15 05:16:31 localhost podman[243449]: @ - - [15/Dec/2025:10:16:31 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 19280 "" "Go-http-client/1.1" Dec 15 05:16:32 localhost ceph-mon[298913]: mon.np0005559462@0(leader).osd e281 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104 Dec 15 05:16:32 localhost nova_compute[286344]: 2025-12-15 10:16:32.147 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:16:32 localhost nova_compute[286344]: 2025-12-15 10:16:32.270 286348 DEBUG oslo_service.periodic_task [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Dec 15 05:16:32 localhost nova_compute[286344]: 2025-12-15 10:16:32.287 286348 DEBUG oslo_concurrency.lockutils [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Dec 15 05:16:32 localhost nova_compute[286344]: 2025-12-15 10:16:32.287 286348 DEBUG 
oslo_concurrency.lockutils [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Dec 15 05:16:32 localhost nova_compute[286344]: 2025-12-15 10:16:32.288 286348 DEBUG oslo_concurrency.lockutils [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Dec 15 05:16:32 localhost nova_compute[286344]: 2025-12-15 10:16:32.288 286348 DEBUG nova.compute.resource_tracker [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Auditing locally available compute resources for np0005559462.localdomain (node: np0005559462.localdomain) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m Dec 15 05:16:32 localhost nova_compute[286344]: 2025-12-15 10:16:32.289 286348 DEBUG oslo_concurrency.processutils [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m Dec 15 05:16:32 localhost ceph-mon[298913]: from='mgr.34411 172.18.0.108:0/929118679' entity='mgr.np0005559464.aomnqe' cmd={"prefix": "auth get", "entity": "client.alice bob", "format": "json"} : dispatch Dec 15 05:16:32 localhost ceph-mon[298913]: from='mgr.34411 172.18.0.108:0/929118679' entity='mgr.np0005559464.aomnqe' cmd={"prefix": "auth get-or-create", "entity": "client.alice bob", "caps": ["mds", "allow r path=/volumes/_nogroup/1f8ee972-fbad-4b31-8005-874e13dc9275/fbd4d206-5fcf-4579-a13d-8e4adc225411", "osd", "allow r pool=manila_data 
namespace=fsvolumens_1f8ee972-fbad-4b31-8005-874e13dc9275", "mon", "allow r"], "format": "json"} : dispatch Dec 15 05:16:32 localhost ceph-mon[298913]: from='mgr.34411 ' entity='mgr.np0005559464.aomnqe' cmd={"prefix": "auth get-or-create", "entity": "client.alice bob", "caps": ["mds", "allow r path=/volumes/_nogroup/1f8ee972-fbad-4b31-8005-874e13dc9275/fbd4d206-5fcf-4579-a13d-8e4adc225411", "osd", "allow r pool=manila_data namespace=fsvolumens_1f8ee972-fbad-4b31-8005-874e13dc9275", "mon", "allow r"], "format": "json"} : dispatch Dec 15 05:16:32 localhost ceph-mon[298913]: from='mgr.34411 ' entity='mgr.np0005559464.aomnqe' cmd='[{"prefix": "auth get-or-create", "entity": "client.alice bob", "caps": ["mds", "allow r path=/volumes/_nogroup/1f8ee972-fbad-4b31-8005-874e13dc9275/fbd4d206-5fcf-4579-a13d-8e4adc225411", "osd", "allow r pool=manila_data namespace=fsvolumens_1f8ee972-fbad-4b31-8005-874e13dc9275", "mon", "allow r"], "format": "json"}]': finished Dec 15 05:16:32 localhost ceph-mon[298913]: mon.np0005559462@0(leader) e17 handle_command mon_command({"prefix": "df", "format": "json"} v 0) Dec 15 05:16:32 localhost ceph-mon[298913]: log_channel(audit) log [DBG] : from='client.? 
172.18.0.106:0/1278746250' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch Dec 15 05:16:32 localhost nova_compute[286344]: 2025-12-15 10:16:32.742 286348 DEBUG oslo_concurrency.processutils [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.453s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m Dec 15 05:16:32 localhost nova_compute[286344]: 2025-12-15 10:16:32.800 286348 DEBUG nova.virt.libvirt.driver [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m Dec 15 05:16:32 localhost nova_compute[286344]: 2025-12-15 10:16:32.801 286348 DEBUG nova.virt.libvirt.driver [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m Dec 15 05:16:33 localhost nova_compute[286344]: 2025-12-15 10:16:33.003 286348 WARNING nova.virt.libvirt.driver [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] This host appears to have multiple sockets per NUMA node. 
The `socket` PCI NUMA affinity will not be supported.#033[00m Dec 15 05:16:33 localhost nova_compute[286344]: 2025-12-15 10:16:33.004 286348 DEBUG nova.compute.resource_tracker [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Hypervisor/Node resource view: name=np0005559462.localdomain free_ram=11131MB free_disk=41.836978912353516GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", 
"product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m Dec 15 05:16:33 localhost nova_compute[286344]: 2025-12-15 10:16:33.005 286348 DEBUG oslo_concurrency.lockutils [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Dec 15 05:16:33 localhost nova_compute[286344]: 2025-12-15 10:16:33.005 286348 DEBUG oslo_concurrency.lockutils [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Dec 15 05:16:33 localhost nova_compute[286344]: 2025-12-15 10:16:33.072 286348 DEBUG nova.compute.resource_tracker [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Instance 39ff1bd9-6f6b-44c8-bbec-a1fd9d196359 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 2, 'MEMORY_MB': 512, 'VCPU': 1}}. 
_remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m Dec 15 05:16:33 localhost nova_compute[286344]: 2025-12-15 10:16:33.073 286348 DEBUG nova.compute.resource_tracker [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m Dec 15 05:16:33 localhost nova_compute[286344]: 2025-12-15 10:16:33.073 286348 DEBUG nova.compute.resource_tracker [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Final resource view: name=np0005559462.localdomain phys_ram=15738MB used_ram=1024MB phys_disk=41GB used_disk=2GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m Dec 15 05:16:33 localhost nova_compute[286344]: 2025-12-15 10:16:33.091 286348 DEBUG nova.scheduler.client.report [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Refreshing inventories for resource provider 26c8956b-6742-4951-b566-971b9bbe323b _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804#033[00m Dec 15 05:16:33 localhost nova_compute[286344]: 2025-12-15 10:16:33.108 286348 DEBUG nova.scheduler.client.report [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Updating ProviderTree inventory for provider 26c8956b-6742-4951-b566-971b9bbe323b from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768#033[00m Dec 15 
05:16:33 localhost nova_compute[286344]: 2025-12-15 10:16:33.109 286348 DEBUG nova.compute.provider_tree [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Updating inventory in ProviderTree for provider 26c8956b-6742-4951-b566-971b9bbe323b with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176#033[00m Dec 15 05:16:33 localhost nova_compute[286344]: 2025-12-15 10:16:33.128 286348 DEBUG nova.scheduler.client.report [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Refreshing aggregate associations for resource provider 26c8956b-6742-4951-b566-971b9bbe323b, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813#033[00m Dec 15 05:16:33 localhost nova_compute[286344]: 2025-12-15 10:16:33.151 286348 DEBUG nova.scheduler.client.report [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Refreshing trait associations for resource provider 26c8956b-6742-4951-b566-971b9bbe323b, traits: 
COMPUTE_NET_VIF_MODEL_RTL8139,COMPUTE_STORAGE_BUS_IDE,COMPUTE_DEVICE_TAGGING,COMPUTE_NET_VIF_MODEL_PCNET,COMPUTE_NODE,COMPUTE_STORAGE_BUS_USB,COMPUTE_IMAGE_TYPE_AKI,COMPUTE_NET_ATTACH_INTERFACE,HW_CPU_X86_CLMUL,HW_CPU_X86_ABM,HW_CPU_X86_SSE2,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_NET_VIF_MODEL_E1000,HW_CPU_X86_SSE41,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_NET_VIF_MODEL_VIRTIO,HW_CPU_X86_BMI2,HW_CPU_X86_SSE42,HW_CPU_X86_FMA3,COMPUTE_SECURITY_UEFI_SECURE_BOOT,COMPUTE_IMAGE_TYPE_RAW,HW_CPU_X86_BMI,COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_ACCELERATORS,COMPUTE_IMAGE_TYPE_ARI,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_STORAGE_BUS_FDC,COMPUTE_VIOMMU_MODEL_INTEL,COMPUTE_GRAPHICS_MODEL_NONE,HW_CPU_X86_AMD_SVM,COMPUTE_NET_VIF_MODEL_E1000E,HW_CPU_X86_SSE,HW_CPU_X86_AVX,HW_CPU_X86_SVM,COMPUTE_SECURITY_TPM_2_0,COMPUTE_TRUSTED_CERTS,COMPUTE_RESCUE_BFV,HW_CPU_X86_AVX2,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_VIOMMU_MODEL_AUTO,COMPUTE_VIOMMU_MODEL_VIRTIO,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_NET_VIF_MODEL_NE2K_PCI,HW_CPU_X86_SSSE3,COMPUTE_IMAGE_TYPE_AMI,HW_CPU_X86_AESNI,HW_CPU_X86_SSE4A,HW_CPU_X86_MMX,COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_VOLUME_EXTEND,HW_CPU_X86_F16C,HW_CPU_X86_SHA,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_STORAGE_BUS_VIRTIO,COMPUTE_SECURITY_TPM_1_2,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,COMPUTE_STORAGE_BUS_SATA _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825#033[00m Dec 15 05:16:33 localhost nova_compute[286344]: 2025-12-15 10:16:33.193 286348 DEBUG oslo_concurrency.processutils [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m Dec 15 05:16:33 localhost ceph-mon[298913]: mon.np0005559462@0(leader) e17 handle_command mon_command({"prefix": "df", "format": 
"json"} v 0) Dec 15 05:16:33 localhost ceph-mon[298913]: log_channel(audit) log [DBG] : from='client.? 172.18.0.106:0/314786558' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch Dec 15 05:16:33 localhost nova_compute[286344]: 2025-12-15 10:16:33.653 286348 DEBUG oslo_concurrency.processutils [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.460s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m Dec 15 05:16:33 localhost nova_compute[286344]: 2025-12-15 10:16:33.659 286348 DEBUG nova.compute.provider_tree [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Inventory has not changed in ProviderTree for provider: 26c8956b-6742-4951-b566-971b9bbe323b update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m Dec 15 05:16:33 localhost nova_compute[286344]: 2025-12-15 10:16:33.682 286348 DEBUG nova.scheduler.client.report [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Inventory has not changed for provider 26c8956b-6742-4951-b566-971b9bbe323b based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m Dec 15 05:16:33 localhost nova_compute[286344]: 2025-12-15 10:16:33.685 286348 DEBUG nova.compute.resource_tracker [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Compute_service record updated for np0005559462.localdomain:np0005559462.localdomain _update_available_resource 
/usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m Dec 15 05:16:33 localhost nova_compute[286344]: 2025-12-15 10:16:33.685 286348 DEBUG oslo_concurrency.lockutils [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.681s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Dec 15 05:16:34 localhost ceph-osd[31375]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [L] New memtable created with log file: #46. Immutable memtables: 3. Dec 15 05:16:34 localhost systemd[1]: Started /usr/bin/podman healthcheck run 4d46c406a3855fd1c322e5d86c048188ea5865daa07ba23aaebacc7879668923. Dec 15 05:16:34 localhost ceph-mon[298913]: mon.np0005559462@0(leader) e17 handle_command mon_command({"prefix": "auth rm", "entity": "client.alice bob"} v 0) Dec 15 05:16:34 localhost ceph-mon[298913]: log_channel(audit) log [INF] : from='mgr.34411 ' entity='mgr.np0005559464.aomnqe' cmd={"prefix": "auth rm", "entity": "client.alice bob"} : dispatch Dec 15 05:16:34 localhost ceph-mon[298913]: log_channel(audit) log [INF] : from='mgr.34411 ' entity='mgr.np0005559464.aomnqe' cmd='[{"prefix": "auth rm", "entity": "client.alice bob"}]': finished Dec 15 05:16:34 localhost podman[338037]: 2025-12-15 10:16:34.761267656 +0000 UTC m=+0.083398204 container health_status 4d46c406a3855fd1c322e5d86c048188ea5865daa07ba23aaebacc7879668923 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 
'EDPM_CONFIG_HASH': '07f3af15d79276418f54a8dde2fc251064527c3571430e14af77aaa2c01f1193-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true) Dec 15 05:16:34 localhost podman[338037]: 2025-12-15 10:16:34.790279665 +0000 UTC m=+0.112410213 container exec_died 4d46c406a3855fd1c322e5d86c048188ea5865daa07ba23aaebacc7879668923 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 
'07f3af15d79276418f54a8dde2fc251064527c3571430e14af77aaa2c01f1193-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, tcib_managed=true, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb) Dec 15 05:16:34 localhost systemd[1]: 4d46c406a3855fd1c322e5d86c048188ea5865daa07ba23aaebacc7879668923.service: Deactivated successfully. 
Dec 15 05:16:34 localhost openstack_network_exporter[246484]: ERROR 10:16:34 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd Dec 15 05:16:34 localhost openstack_network_exporter[246484]: ERROR 10:16:34 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd Dec 15 05:16:34 localhost openstack_network_exporter[246484]: ERROR 10:16:34 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server Dec 15 05:16:34 localhost openstack_network_exporter[246484]: ERROR 10:16:34 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath Dec 15 05:16:34 localhost openstack_network_exporter[246484]: Dec 15 05:16:34 localhost openstack_network_exporter[246484]: ERROR 10:16:34 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath Dec 15 05:16:34 localhost openstack_network_exporter[246484]: Dec 15 05:16:35 localhost nova_compute[286344]: 2025-12-15 10:16:35.304 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:16:35 localhost ceph-mon[298913]: from='mgr.34411 172.18.0.108:0/929118679' entity='mgr.np0005559464.aomnqe' cmd={"prefix": "auth get", "entity": "client.alice bob", "format": "json"} : dispatch Dec 15 05:16:35 localhost ceph-mon[298913]: from='mgr.34411 172.18.0.108:0/929118679' entity='mgr.np0005559464.aomnqe' cmd={"prefix": "auth rm", "entity": "client.alice bob"} : dispatch Dec 15 05:16:35 localhost ceph-mon[298913]: from='mgr.34411 ' entity='mgr.np0005559464.aomnqe' cmd={"prefix": "auth rm", "entity": "client.alice bob"} : dispatch Dec 15 05:16:35 localhost ceph-mon[298913]: from='mgr.34411 ' entity='mgr.np0005559464.aomnqe' cmd='[{"prefix": "auth rm", "entity": "client.alice bob"}]': finished Dec 15 05:16:35 localhost nova_compute[286344]: 2025-12-15 10:16:35.686 286348 DEBUG 
oslo_service.periodic_task [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Dec 15 05:16:37 localhost ceph-mon[298913]: mon.np0005559462@0(leader).osd e281 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104 Dec 15 05:16:37 localhost nova_compute[286344]: 2025-12-15 10:16:37.149 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:16:37 localhost ceph-mon[298913]: mon.np0005559462@0(leader) e17 handle_command mon_command({"prefix": "auth get-or-create", "entity": "client.alice", "caps": ["mds", "allow rw path=/volumes/_nogroup/1f8ee972-fbad-4b31-8005-874e13dc9275/fbd4d206-5fcf-4579-a13d-8e4adc225411", "osd", "allow rw pool=manila_data namespace=fsvolumens_1f8ee972-fbad-4b31-8005-874e13dc9275", "mon", "allow r"], "format": "json"} v 0) Dec 15 05:16:37 localhost ceph-mon[298913]: log_channel(audit) log [INF] : from='mgr.34411 ' entity='mgr.np0005559464.aomnqe' cmd={"prefix": "auth get-or-create", "entity": "client.alice", "caps": ["mds", "allow rw path=/volumes/_nogroup/1f8ee972-fbad-4b31-8005-874e13dc9275/fbd4d206-5fcf-4579-a13d-8e4adc225411", "osd", "allow rw pool=manila_data namespace=fsvolumens_1f8ee972-fbad-4b31-8005-874e13dc9275", "mon", "allow r"], "format": "json"} : dispatch Dec 15 05:16:37 localhost ceph-mon[298913]: log_channel(audit) log [INF] : from='mgr.34411 ' entity='mgr.np0005559464.aomnqe' cmd='[{"prefix": "auth get-or-create", "entity": "client.alice", "caps": ["mds", "allow rw path=/volumes/_nogroup/1f8ee972-fbad-4b31-8005-874e13dc9275/fbd4d206-5fcf-4579-a13d-8e4adc225411", "osd", "allow rw pool=manila_data namespace=fsvolumens_1f8ee972-fbad-4b31-8005-874e13dc9275", "mon", "allow r"], "format": "json"}]': finished Dec 15 05:16:38 
localhost ceph-mon[298913]: from='mgr.34411 172.18.0.108:0/929118679' entity='mgr.np0005559464.aomnqe' cmd={"prefix": "auth get", "entity": "client.alice", "format": "json"} : dispatch Dec 15 05:16:38 localhost ceph-mon[298913]: from='mgr.34411 172.18.0.108:0/929118679' entity='mgr.np0005559464.aomnqe' cmd={"prefix": "auth get-or-create", "entity": "client.alice", "caps": ["mds", "allow rw path=/volumes/_nogroup/1f8ee972-fbad-4b31-8005-874e13dc9275/fbd4d206-5fcf-4579-a13d-8e4adc225411", "osd", "allow rw pool=manila_data namespace=fsvolumens_1f8ee972-fbad-4b31-8005-874e13dc9275", "mon", "allow r"], "format": "json"} : dispatch Dec 15 05:16:38 localhost ceph-mon[298913]: from='mgr.34411 ' entity='mgr.np0005559464.aomnqe' cmd={"prefix": "auth get-or-create", "entity": "client.alice", "caps": ["mds", "allow rw path=/volumes/_nogroup/1f8ee972-fbad-4b31-8005-874e13dc9275/fbd4d206-5fcf-4579-a13d-8e4adc225411", "osd", "allow rw pool=manila_data namespace=fsvolumens_1f8ee972-fbad-4b31-8005-874e13dc9275", "mon", "allow r"], "format": "json"} : dispatch Dec 15 05:16:38 localhost ceph-mon[298913]: from='mgr.34411 ' entity='mgr.np0005559464.aomnqe' cmd='[{"prefix": "auth get-or-create", "entity": "client.alice", "caps": ["mds", "allow rw path=/volumes/_nogroup/1f8ee972-fbad-4b31-8005-874e13dc9275/fbd4d206-5fcf-4579-a13d-8e4adc225411", "osd", "allow rw pool=manila_data namespace=fsvolumens_1f8ee972-fbad-4b31-8005-874e13dc9275", "mon", "allow r"], "format": "json"}]': finished Dec 15 05:16:40 localhost nova_compute[286344]: 2025-12-15 10:16:40.326 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:16:41 localhost ceph-mon[298913]: mon.np0005559462@0(leader) e17 handle_command mon_command({"prefix": "auth rm", "entity": "client.alice"} v 0) Dec 15 05:16:41 localhost ceph-mon[298913]: log_channel(audit) log [INF] : from='mgr.34411 ' entity='mgr.np0005559464.aomnqe' cmd={"prefix": "auth 
rm", "entity": "client.alice"} : dispatch Dec 15 05:16:41 localhost ceph-mon[298913]: log_channel(audit) log [INF] : from='mgr.34411 ' entity='mgr.np0005559464.aomnqe' cmd='[{"prefix": "auth rm", "entity": "client.alice"}]': finished Dec 15 05:16:41 localhost ceph-mon[298913]: from='mgr.34411 172.18.0.108:0/929118679' entity='mgr.np0005559464.aomnqe' cmd={"prefix": "auth get", "entity": "client.alice", "format": "json"} : dispatch Dec 15 05:16:41 localhost ceph-mon[298913]: from='mgr.34411 172.18.0.108:0/929118679' entity='mgr.np0005559464.aomnqe' cmd={"prefix": "auth rm", "entity": "client.alice"} : dispatch Dec 15 05:16:41 localhost ceph-mon[298913]: from='mgr.34411 ' entity='mgr.np0005559464.aomnqe' cmd={"prefix": "auth rm", "entity": "client.alice"} : dispatch Dec 15 05:16:41 localhost ceph-mon[298913]: from='mgr.34411 ' entity='mgr.np0005559464.aomnqe' cmd='[{"prefix": "auth rm", "entity": "client.alice"}]': finished Dec 15 05:16:41 localhost systemd[1]: Started /usr/bin/podman healthcheck run a4e94bd7aaee6c174c601060fb5f34559019ccea93fc397263ee3735731cfc1e. 
Dec 15 05:16:41 localhost podman[338055]: 2025-12-15 10:16:41.759413421 +0000 UTC m=+0.088207974 container health_status a4e94bd7aaee6c174c601060fb5f34559019ccea93fc397263ee3735731cfc1e (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible) Dec 15 05:16:41 localhost podman[338055]: 2025-12-15 10:16:41.768099875 +0000 UTC m=+0.096894458 container exec_died a4e94bd7aaee6c174c601060fb5f34559019ccea93fc397263ee3735731cfc1e (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, managed_by=edpm_ansible, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', 
'/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi ) Dec 15 05:16:41 localhost systemd[1]: a4e94bd7aaee6c174c601060fb5f34559019ccea93fc397263ee3735731cfc1e.service: Deactivated successfully. Dec 15 05:16:42 localhost ceph-mon[298913]: mon.np0005559462@0(leader).osd e281 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104 Dec 15 05:16:42 localhost nova_compute[286344]: 2025-12-15 10:16:42.192 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:16:44 localhost ceph-mon[298913]: mon.np0005559462@0(leader) e17 handle_command mon_command({"prefix": "auth get-or-create", "entity": "client.alice", "caps": ["mds", "allow r path=/volumes/_nogroup/1f8ee972-fbad-4b31-8005-874e13dc9275/fbd4d206-5fcf-4579-a13d-8e4adc225411", "osd", "allow r pool=manila_data namespace=fsvolumens_1f8ee972-fbad-4b31-8005-874e13dc9275", "mon", "allow r"], "format": "json"} v 0) Dec 15 05:16:44 localhost ceph-mon[298913]: log_channel(audit) log [INF] : from='mgr.34411 ' entity='mgr.np0005559464.aomnqe' cmd={"prefix": "auth get-or-create", "entity": "client.alice", "caps": ["mds", "allow r path=/volumes/_nogroup/1f8ee972-fbad-4b31-8005-874e13dc9275/fbd4d206-5fcf-4579-a13d-8e4adc225411", "osd", "allow r pool=manila_data namespace=fsvolumens_1f8ee972-fbad-4b31-8005-874e13dc9275", "mon", "allow r"], "format": "json"} : dispatch Dec 15 05:16:44 localhost ceph-mon[298913]: log_channel(audit) log [INF] : from='mgr.34411 ' entity='mgr.np0005559464.aomnqe' cmd='[{"prefix": "auth get-or-create", "entity": "client.alice", "caps": ["mds", "allow r path=/volumes/_nogroup/1f8ee972-fbad-4b31-8005-874e13dc9275/fbd4d206-5fcf-4579-a13d-8e4adc225411", "osd", "allow r pool=manila_data namespace=fsvolumens_1f8ee972-fbad-4b31-8005-874e13dc9275", "mon", "allow r"], 
"format": "json"}]': finished Dec 15 05:16:45 localhost nova_compute[286344]: 2025-12-15 10:16:45.356 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:16:45 localhost ceph-mon[298913]: from='mgr.34411 172.18.0.108:0/929118679' entity='mgr.np0005559464.aomnqe' cmd={"prefix": "auth get", "entity": "client.alice", "format": "json"} : dispatch Dec 15 05:16:45 localhost ceph-mon[298913]: from='mgr.34411 172.18.0.108:0/929118679' entity='mgr.np0005559464.aomnqe' cmd={"prefix": "auth get-or-create", "entity": "client.alice", "caps": ["mds", "allow r path=/volumes/_nogroup/1f8ee972-fbad-4b31-8005-874e13dc9275/fbd4d206-5fcf-4579-a13d-8e4adc225411", "osd", "allow r pool=manila_data namespace=fsvolumens_1f8ee972-fbad-4b31-8005-874e13dc9275", "mon", "allow r"], "format": "json"} : dispatch Dec 15 05:16:45 localhost ceph-mon[298913]: from='mgr.34411 ' entity='mgr.np0005559464.aomnqe' cmd={"prefix": "auth get-or-create", "entity": "client.alice", "caps": ["mds", "allow r path=/volumes/_nogroup/1f8ee972-fbad-4b31-8005-874e13dc9275/fbd4d206-5fcf-4579-a13d-8e4adc225411", "osd", "allow r pool=manila_data namespace=fsvolumens_1f8ee972-fbad-4b31-8005-874e13dc9275", "mon", "allow r"], "format": "json"} : dispatch Dec 15 05:16:45 localhost ceph-mon[298913]: from='mgr.34411 ' entity='mgr.np0005559464.aomnqe' cmd='[{"prefix": "auth get-or-create", "entity": "client.alice", "caps": ["mds", "allow r path=/volumes/_nogroup/1f8ee972-fbad-4b31-8005-874e13dc9275/fbd4d206-5fcf-4579-a13d-8e4adc225411", "osd", "allow r pool=manila_data namespace=fsvolumens_1f8ee972-fbad-4b31-8005-874e13dc9275", "mon", "allow r"], "format": "json"}]': finished Dec 15 05:16:47 localhost ceph-mon[298913]: mon.np0005559462@0(leader).osd e281 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104 Dec 15 05:16:47 localhost nova_compute[286344]: 2025-12-15 10:16:47.195 
286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:16:47 localhost ceph-mon[298913]: mon.np0005559462@0(leader) e17 handle_command mon_command({"prefix": "auth rm", "entity": "client.alice"} v 0) Dec 15 05:16:47 localhost ceph-mon[298913]: log_channel(audit) log [INF] : from='mgr.34411 ' entity='mgr.np0005559464.aomnqe' cmd={"prefix": "auth rm", "entity": "client.alice"} : dispatch Dec 15 05:16:47 localhost ceph-mon[298913]: log_channel(audit) log [INF] : from='mgr.34411 ' entity='mgr.np0005559464.aomnqe' cmd='[{"prefix": "auth rm", "entity": "client.alice"}]': finished Dec 15 05:16:48 localhost ceph-mon[298913]: from='mgr.34411 172.18.0.108:0/929118679' entity='mgr.np0005559464.aomnqe' cmd={"prefix": "auth get", "entity": "client.alice", "format": "json"} : dispatch Dec 15 05:16:48 localhost ceph-mon[298913]: from='mgr.34411 172.18.0.108:0/929118679' entity='mgr.np0005559464.aomnqe' cmd={"prefix": "auth rm", "entity": "client.alice"} : dispatch Dec 15 05:16:48 localhost ceph-mon[298913]: from='mgr.34411 ' entity='mgr.np0005559464.aomnqe' cmd={"prefix": "auth rm", "entity": "client.alice"} : dispatch Dec 15 05:16:48 localhost ceph-mon[298913]: from='mgr.34411 ' entity='mgr.np0005559464.aomnqe' cmd='[{"prefix": "auth rm", "entity": "client.alice"}]': finished Dec 15 05:16:50 localhost nova_compute[286344]: 2025-12-15 10:16:50.392 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:16:51 localhost ceph-mon[298913]: mon.np0005559462@0(leader) e17 handle_command mon_command({"prefix": "auth get-or-create", "entity": "client.alice_bob", "caps": ["mds", "allow rw path=/volumes/_nogroup/1f8ee972-fbad-4b31-8005-874e13dc9275/fbd4d206-5fcf-4579-a13d-8e4adc225411", "osd", "allow rw pool=manila_data namespace=fsvolumens_1f8ee972-fbad-4b31-8005-874e13dc9275", "mon", "allow r"], 
"format": "json"} v 0) Dec 15 05:16:51 localhost ceph-mon[298913]: log_channel(audit) log [INF] : from='mgr.34411 ' entity='mgr.np0005559464.aomnqe' cmd={"prefix": "auth get-or-create", "entity": "client.alice_bob", "caps": ["mds", "allow rw path=/volumes/_nogroup/1f8ee972-fbad-4b31-8005-874e13dc9275/fbd4d206-5fcf-4579-a13d-8e4adc225411", "osd", "allow rw pool=manila_data namespace=fsvolumens_1f8ee972-fbad-4b31-8005-874e13dc9275", "mon", "allow r"], "format": "json"} : dispatch Dec 15 05:16:51 localhost ceph-mon[298913]: log_channel(audit) log [INF] : from='mgr.34411 ' entity='mgr.np0005559464.aomnqe' cmd='[{"prefix": "auth get-or-create", "entity": "client.alice_bob", "caps": ["mds", "allow rw path=/volumes/_nogroup/1f8ee972-fbad-4b31-8005-874e13dc9275/fbd4d206-5fcf-4579-a13d-8e4adc225411", "osd", "allow rw pool=manila_data namespace=fsvolumens_1f8ee972-fbad-4b31-8005-874e13dc9275", "mon", "allow r"], "format": "json"}]': finished Dec 15 05:16:51 localhost ceph-osd[32311]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [L] New memtable created with log file: #45. Immutable memtables: 2. 
Dec 15 05:16:51 localhost ovn_metadata_agent[160585]: 2025-12-15 10:16:51.491 160590 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Dec 15 05:16:51 localhost ovn_metadata_agent[160585]: 2025-12-15 10:16:51.492 160590 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Dec 15 05:16:51 localhost ovn_metadata_agent[160585]: 2025-12-15 10:16:51.492 160590 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Dec 15 05:16:51 localhost ceph-mon[298913]: from='mgr.34411 172.18.0.108:0/929118679' entity='mgr.np0005559464.aomnqe' cmd={"prefix": "auth get", "entity": "client.alice_bob", "format": "json"} : dispatch Dec 15 05:16:51 localhost ceph-mon[298913]: from='mgr.34411 172.18.0.108:0/929118679' entity='mgr.np0005559464.aomnqe' cmd={"prefix": "auth get-or-create", "entity": "client.alice_bob", "caps": ["mds", "allow rw path=/volumes/_nogroup/1f8ee972-fbad-4b31-8005-874e13dc9275/fbd4d206-5fcf-4579-a13d-8e4adc225411", "osd", "allow rw pool=manila_data namespace=fsvolumens_1f8ee972-fbad-4b31-8005-874e13dc9275", "mon", "allow r"], "format": "json"} : dispatch Dec 15 05:16:51 localhost ceph-mon[298913]: from='mgr.34411 ' entity='mgr.np0005559464.aomnqe' cmd={"prefix": "auth get-or-create", "entity": "client.alice_bob", "caps": ["mds", "allow rw path=/volumes/_nogroup/1f8ee972-fbad-4b31-8005-874e13dc9275/fbd4d206-5fcf-4579-a13d-8e4adc225411", "osd", "allow rw pool=manila_data 
namespace=fsvolumens_1f8ee972-fbad-4b31-8005-874e13dc9275", "mon", "allow r"], "format": "json"} : dispatch Dec 15 05:16:51 localhost ceph-mon[298913]: from='mgr.34411 ' entity='mgr.np0005559464.aomnqe' cmd='[{"prefix": "auth get-or-create", "entity": "client.alice_bob", "caps": ["mds", "allow rw path=/volumes/_nogroup/1f8ee972-fbad-4b31-8005-874e13dc9275/fbd4d206-5fcf-4579-a13d-8e4adc225411", "osd", "allow rw pool=manila_data namespace=fsvolumens_1f8ee972-fbad-4b31-8005-874e13dc9275", "mon", "allow r"], "format": "json"}]': finished Dec 15 05:16:52 localhost ceph-mon[298913]: mon.np0005559462@0(leader).osd e281 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104 Dec 15 05:16:52 localhost nova_compute[286344]: 2025-12-15 10:16:52.229 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:16:54 localhost ceph-mon[298913]: mon.np0005559462@0(leader) e17 handle_command mon_command({"prefix": "auth rm", "entity": "client.alice_bob"} v 0) Dec 15 05:16:54 localhost ceph-mon[298913]: log_channel(audit) log [INF] : from='mgr.34411 ' entity='mgr.np0005559464.aomnqe' cmd={"prefix": "auth rm", "entity": "client.alice_bob"} : dispatch Dec 15 05:16:54 localhost ceph-mon[298913]: log_channel(audit) log [INF] : from='mgr.34411 ' entity='mgr.np0005559464.aomnqe' cmd='[{"prefix": "auth rm", "entity": "client.alice_bob"}]': finished Dec 15 05:16:54 localhost ceph-mon[298913]: from='mgr.34411 172.18.0.108:0/929118679' entity='mgr.np0005559464.aomnqe' cmd={"prefix": "auth get", "entity": "client.alice_bob", "format": "json"} : dispatch Dec 15 05:16:54 localhost ceph-mon[298913]: from='mgr.34411 172.18.0.108:0/929118679' entity='mgr.np0005559464.aomnqe' cmd={"prefix": "auth rm", "entity": "client.alice_bob"} : dispatch Dec 15 05:16:54 localhost ceph-mon[298913]: from='mgr.34411 ' entity='mgr.np0005559464.aomnqe' cmd={"prefix": "auth rm", 
"entity": "client.alice_bob"} : dispatch Dec 15 05:16:54 localhost ceph-mon[298913]: from='mgr.34411 ' entity='mgr.np0005559464.aomnqe' cmd='[{"prefix": "auth rm", "entity": "client.alice_bob"}]': finished Dec 15 05:16:55 localhost nova_compute[286344]: 2025-12-15 10:16:55.424 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:16:57 localhost ceph-mon[298913]: mon.np0005559462@0(leader).osd e281 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104 Dec 15 05:16:57 localhost nova_compute[286344]: 2025-12-15 10:16:57.231 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:16:57 localhost ceph-mon[298913]: mon.np0005559462@0(leader) e17 handle_command mon_command({"prefix": "auth get-or-create", "entity": "client.alice_bob", "caps": ["mds", "allow r path=/volumes/_nogroup/1f8ee972-fbad-4b31-8005-874e13dc9275/fbd4d206-5fcf-4579-a13d-8e4adc225411", "osd", "allow r pool=manila_data namespace=fsvolumens_1f8ee972-fbad-4b31-8005-874e13dc9275", "mon", "allow r"], "format": "json"} v 0) Dec 15 05:16:57 localhost ceph-mon[298913]: log_channel(audit) log [INF] : from='mgr.34411 ' entity='mgr.np0005559464.aomnqe' cmd={"prefix": "auth get-or-create", "entity": "client.alice_bob", "caps": ["mds", "allow r path=/volumes/_nogroup/1f8ee972-fbad-4b31-8005-874e13dc9275/fbd4d206-5fcf-4579-a13d-8e4adc225411", "osd", "allow r pool=manila_data namespace=fsvolumens_1f8ee972-fbad-4b31-8005-874e13dc9275", "mon", "allow r"], "format": "json"} : dispatch Dec 15 05:16:57 localhost ceph-mon[298913]: log_channel(audit) log [INF] : from='mgr.34411 ' entity='mgr.np0005559464.aomnqe' cmd='[{"prefix": "auth get-or-create", "entity": "client.alice_bob", "caps": ["mds", "allow r 
path=/volumes/_nogroup/1f8ee972-fbad-4b31-8005-874e13dc9275/fbd4d206-5fcf-4579-a13d-8e4adc225411", "osd", "allow r pool=manila_data namespace=fsvolumens_1f8ee972-fbad-4b31-8005-874e13dc9275", "mon", "allow r"], "format": "json"}]': finished Dec 15 05:16:58 localhost ceph-mon[298913]: from='mgr.34411 172.18.0.108:0/929118679' entity='mgr.np0005559464.aomnqe' cmd={"prefix": "auth get", "entity": "client.alice_bob", "format": "json"} : dispatch Dec 15 05:16:58 localhost ceph-mon[298913]: from='mgr.34411 172.18.0.108:0/929118679' entity='mgr.np0005559464.aomnqe' cmd={"prefix": "auth get-or-create", "entity": "client.alice_bob", "caps": ["mds", "allow r path=/volumes/_nogroup/1f8ee972-fbad-4b31-8005-874e13dc9275/fbd4d206-5fcf-4579-a13d-8e4adc225411", "osd", "allow r pool=manila_data namespace=fsvolumens_1f8ee972-fbad-4b31-8005-874e13dc9275", "mon", "allow r"], "format": "json"} : dispatch Dec 15 05:16:58 localhost ceph-mon[298913]: from='mgr.34411 ' entity='mgr.np0005559464.aomnqe' cmd={"prefix": "auth get-or-create", "entity": "client.alice_bob", "caps": ["mds", "allow r path=/volumes/_nogroup/1f8ee972-fbad-4b31-8005-874e13dc9275/fbd4d206-5fcf-4579-a13d-8e4adc225411", "osd", "allow r pool=manila_data namespace=fsvolumens_1f8ee972-fbad-4b31-8005-874e13dc9275", "mon", "allow r"], "format": "json"} : dispatch Dec 15 05:16:58 localhost ceph-mon[298913]: from='mgr.34411 ' entity='mgr.np0005559464.aomnqe' cmd='[{"prefix": "auth get-or-create", "entity": "client.alice_bob", "caps": ["mds", "allow r path=/volumes/_nogroup/1f8ee972-fbad-4b31-8005-874e13dc9275/fbd4d206-5fcf-4579-a13d-8e4adc225411", "osd", "allow r pool=manila_data namespace=fsvolumens_1f8ee972-fbad-4b31-8005-874e13dc9275", "mon", "allow r"], "format": "json"}]': finished Dec 15 05:16:59 localhost systemd[1]: Started /usr/bin/podman healthcheck run 67ee72fcd99c896f79e8807764b1d0f04e7d45927d06e306c0986394e8ed42a0. 
Dec 15 05:16:59 localhost systemd[1]: Started /usr/bin/podman healthcheck run 730c50020a7d9e2da21d2462d40af20ac303e43f3491f692cbbd87f432ba3e09. Dec 15 05:16:59 localhost systemd[1]: Started /usr/bin/podman healthcheck run 9f0f481c937606298203797a71924b7614a5d9e3f4a8afd837cfa5b9602aedeb. Dec 15 05:16:59 localhost systemd[1]: Started /usr/bin/podman healthcheck run ac2eae30a2a7f971398af42ea67b3ed799d4b3341f7688ebcd73f7ca7f66c2a8. Dec 15 05:16:59 localhost systemd[1]: Started /usr/bin/podman healthcheck run b3d8483ade539500e228c2af9c0c3e647febb5ec7903610a83aea32631007d4a. Dec 15 05:16:59 localhost podman[338078]: 2025-12-15 10:16:59.76746821 +0000 UTC m=+0.097381362 container health_status 67ee72fcd99c896f79e8807764b1d0f04e7d45927d06e306c0986394e8ed42a0 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, 
config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible) Dec 15 05:16:59 localhost podman[338078]: 2025-12-15 10:16:59.798567792 +0000 UTC m=+0.128480924 container exec_died 67ee72fcd99c896f79e8807764b1d0f04e7d45927d06e306c0986394e8ed42a0 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible) Dec 15 05:16:59 localhost podman[338089]: 2025-12-15 10:16:59.828122967 +0000 UTC m=+0.141313409 container health_status ac2eae30a2a7f971398af42ea67b3ed799d4b3341f7688ebcd73f7ca7f66c2a8 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, config_data={'depends_on': ['openvswitch.service'], 
'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '07f3af15d79276418f54a8dde2fc251064527c3571430e14af77aaa2c01f1193'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ovn_controller, container_name=ovn_controller, tcib_managed=true) Dec 15 05:16:59 localhost podman[338089]: 2025-12-15 10:16:59.857816167 +0000 UTC m=+0.171006609 container exec_died ac2eae30a2a7f971398af42ea67b3ed799d4b3341f7688ebcd73f7ca7f66c2a8 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.vendor=CentOS, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '07f3af15d79276418f54a8dde2fc251064527c3571430e14af77aaa2c01f1193'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 
'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0) Dec 15 05:16:59 localhost podman[338092]: 2025-12-15 10:16:59.814699573 +0000 UTC m=+0.125533597 container health_status b3d8483ade539500e228c2af9c0c3e647febb5ec7903610a83aea32631007d4a (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, container_name=ceilometer_agent_compute, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '07f3af15d79276418f54a8dde2fc251064527c3571430e14af77aaa2c01f1193-cb1e9478634a00c8fb5ad9cd7fcd38c7b2a1a85187dfaa618bc715c24c2b15fb'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': 
['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ceilometer_agent_compute) Dec 15 05:16:59 localhost systemd[1]: ac2eae30a2a7f971398af42ea67b3ed799d4b3341f7688ebcd73f7ca7f66c2a8.service: Deactivated successfully. Dec 15 05:16:59 localhost podman[338080]: 2025-12-15 10:16:59.873194327 +0000 UTC m=+0.192793288 container health_status 9f0f481c937606298203797a71924b7614a5d9e3f4a8afd837cfa5b9602aedeb (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, container_name=multipathd, tcib_managed=true, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, config_id=multipathd, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible) Dec 15 05:16:59 localhost systemd[1]: 67ee72fcd99c896f79e8807764b1d0f04e7d45927d06e306c0986394e8ed42a0.service: Deactivated successfully. Dec 15 05:16:59 localhost podman[338079]: 2025-12-15 10:16:59.914890598 +0000 UTC m=+0.240808504 container health_status 730c50020a7d9e2da21d2462d40af20ac303e43f3491f692cbbd87f432ba3e09 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'cb1e9478634a00c8fb5ad9cd7fcd38c7b2a1a85187dfaa618bc715c24c2b15fb'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'net': 'host', 
'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, config_id=openstack_network_exporter, vcs-type=git, build-date=2025-08-20T13:12:41, architecture=x86_64, io.openshift.tags=minimal rhel9, name=ubi9-minimal, managed_by=edpm_ansible, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, distribution-scope=public, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=ubi9-minimal-container, io.openshift.expose-services=, version=9.6, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., container_name=openstack_network_exporter, io.buildah.version=1.33.7, maintainer=Red Hat, Inc., release=1755695350, url=https://catalog.redhat.com/en/search?searchType=containers, vendor=Red Hat, Inc.) 
Dec 15 05:16:59 localhost podman[338080]: 2025-12-15 10:16:59.939634503 +0000 UTC m=+0.259233464 container exec_died 9f0f481c937606298203797a71924b7614a5d9e3f4a8afd837cfa5b9602aedeb (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, config_id=multipathd, container_name=multipathd, io.buildah.version=1.41.3, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, tcib_managed=true) Dec 15 05:16:59 localhost systemd[1]: 9f0f481c937606298203797a71924b7614a5d9e3f4a8afd837cfa5b9602aedeb.service: Deactivated successfully. 
Dec 15 05:16:59 localhost podman[338079]: 2025-12-15 10:16:59.956433535 +0000 UTC m=+0.282351381 container exec_died 730c50020a7d9e2da21d2462d40af20ac303e43f3491f692cbbd87f432ba3e09 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, vcs-type=git, version=9.6, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=Red Hat, Inc., vendor=Red Hat, Inc., architecture=x86_64, config_id=openstack_network_exporter, com.redhat.component=ubi9-minimal-container, build-date=2025-08-20T13:12:41, release=1755695350, managed_by=edpm_ansible, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'cb1e9478634a00c8fb5ad9cd7fcd38c7b2a1a85187dfaa618bc715c24c2b15fb'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', 
'/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.tags=minimal rhel9, name=ubi9-minimal, url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.expose-services=, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, distribution-scope=public, container_name=openstack_network_exporter, io.buildah.version=1.33.7, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly.) Dec 15 05:16:59 localhost systemd[1]: 730c50020a7d9e2da21d2462d40af20ac303e43f3491f692cbbd87f432ba3e09.service: Deactivated successfully. Dec 15 05:16:59 localhost podman[338092]: 2025-12-15 10:16:59.996390235 +0000 UTC m=+0.307224299 container exec_died b3d8483ade539500e228c2af9c0c3e647febb5ec7903610a83aea32631007d4a (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, org.label-schema.build-date=20251202, config_id=ceilometer_agent_compute, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '07f3af15d79276418f54a8dde2fc251064527c3571430e14af77aaa2c01f1193-cb1e9478634a00c8fb5ad9cd7fcd38c7b2a1a85187dfaa618bc715c24c2b15fb'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', 
'/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ceilometer_agent_compute, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS) Dec 15 05:17:00 localhost systemd[1]: b3d8483ade539500e228c2af9c0c3e647febb5ec7903610a83aea32631007d4a.service: Deactivated successfully. 
Dec 15 05:17:00 localhost nova_compute[286344]: 2025-12-15 10:17:00.428 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:17:00 localhost ceph-mon[298913]: mon.np0005559462@0(leader) e17 handle_command mon_command({"prefix": "auth rm", "entity": "client.alice_bob"} v 0) Dec 15 05:17:00 localhost ceph-mon[298913]: log_channel(audit) log [INF] : from='mgr.34411 ' entity='mgr.np0005559464.aomnqe' cmd={"prefix": "auth rm", "entity": "client.alice_bob"} : dispatch Dec 15 05:17:00 localhost ceph-mon[298913]: log_channel(audit) log [INF] : from='mgr.34411 ' entity='mgr.np0005559464.aomnqe' cmd='[{"prefix": "auth rm", "entity": "client.alice_bob"}]': finished Dec 15 05:17:01 localhost ceph-mon[298913]: from='mgr.34411 172.18.0.108:0/929118679' entity='mgr.np0005559464.aomnqe' cmd={"prefix": "auth get", "entity": "client.alice_bob", "format": "json"} : dispatch Dec 15 05:17:01 localhost ceph-mon[298913]: from='mgr.34411 172.18.0.108:0/929118679' entity='mgr.np0005559464.aomnqe' cmd={"prefix": "auth rm", "entity": "client.alice_bob"} : dispatch Dec 15 05:17:01 localhost ceph-mon[298913]: from='mgr.34411 ' entity='mgr.np0005559464.aomnqe' cmd={"prefix": "auth rm", "entity": "client.alice_bob"} : dispatch Dec 15 05:17:01 localhost ceph-mon[298913]: from='mgr.34411 ' entity='mgr.np0005559464.aomnqe' cmd='[{"prefix": "auth rm", "entity": "client.alice_bob"}]': finished Dec 15 05:17:01 localhost podman[243449]: time="2025-12-15T10:17:01Z" level=info msg="List containers: received `last` parameter - overwriting `limit`" Dec 15 05:17:01 localhost podman[243449]: @ - - [15/Dec/2025:10:17:01 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 156640 "" "Go-http-client/1.1" Dec 15 05:17:01 localhost podman[243449]: @ - - [15/Dec/2025:10:17:01 +0000] "GET 
/v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 19277 "" "Go-http-client/1.1" Dec 15 05:17:02 localhost ceph-mon[298913]: mon.np0005559462@0(leader).osd e281 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104 Dec 15 05:17:02 localhost nova_compute[286344]: 2025-12-15 10:17:02.233 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:17:03 localhost ceph-mon[298913]: mon.np0005559462@0(leader) e17 handle_command mon_command({"prefix": "auth get-or-create", "entity": "client.alice bob", "caps": ["mds", "allow rw path=/volumes/_nogroup/1f8ee972-fbad-4b31-8005-874e13dc9275/fbd4d206-5fcf-4579-a13d-8e4adc225411", "osd", "allow rw pool=manila_data namespace=fsvolumens_1f8ee972-fbad-4b31-8005-874e13dc9275", "mon", "allow r"], "format": "json"} v 0) Dec 15 05:17:03 localhost ceph-mon[298913]: log_channel(audit) log [INF] : from='mgr.34411 ' entity='mgr.np0005559464.aomnqe' cmd={"prefix": "auth get-or-create", "entity": "client.alice bob", "caps": ["mds", "allow rw path=/volumes/_nogroup/1f8ee972-fbad-4b31-8005-874e13dc9275/fbd4d206-5fcf-4579-a13d-8e4adc225411", "osd", "allow rw pool=manila_data namespace=fsvolumens_1f8ee972-fbad-4b31-8005-874e13dc9275", "mon", "allow r"], "format": "json"} : dispatch Dec 15 05:17:03 localhost ceph-mon[298913]: log_channel(audit) log [INF] : from='mgr.34411 ' entity='mgr.np0005559464.aomnqe' cmd='[{"prefix": "auth get-or-create", "entity": "client.alice bob", "caps": ["mds", "allow rw path=/volumes/_nogroup/1f8ee972-fbad-4b31-8005-874e13dc9275/fbd4d206-5fcf-4579-a13d-8e4adc225411", "osd", "allow rw pool=manila_data namespace=fsvolumens_1f8ee972-fbad-4b31-8005-874e13dc9275", "mon", "allow r"], "format": "json"}]': finished Dec 15 05:17:04 localhost ceph-mon[298913]: from='mgr.34411 172.18.0.108:0/929118679' entity='mgr.np0005559464.aomnqe' cmd={"prefix": "auth 
get", "entity": "client.alice bob", "format": "json"} : dispatch Dec 15 05:17:04 localhost ceph-mon[298913]: from='mgr.34411 172.18.0.108:0/929118679' entity='mgr.np0005559464.aomnqe' cmd={"prefix": "auth get-or-create", "entity": "client.alice bob", "caps": ["mds", "allow rw path=/volumes/_nogroup/1f8ee972-fbad-4b31-8005-874e13dc9275/fbd4d206-5fcf-4579-a13d-8e4adc225411", "osd", "allow rw pool=manila_data namespace=fsvolumens_1f8ee972-fbad-4b31-8005-874e13dc9275", "mon", "allow r"], "format": "json"} : dispatch Dec 15 05:17:04 localhost ceph-mon[298913]: from='mgr.34411 ' entity='mgr.np0005559464.aomnqe' cmd={"prefix": "auth get-or-create", "entity": "client.alice bob", "caps": ["mds", "allow rw path=/volumes/_nogroup/1f8ee972-fbad-4b31-8005-874e13dc9275/fbd4d206-5fcf-4579-a13d-8e4adc225411", "osd", "allow rw pool=manila_data namespace=fsvolumens_1f8ee972-fbad-4b31-8005-874e13dc9275", "mon", "allow r"], "format": "json"} : dispatch Dec 15 05:17:04 localhost ceph-mon[298913]: from='mgr.34411 ' entity='mgr.np0005559464.aomnqe' cmd='[{"prefix": "auth get-or-create", "entity": "client.alice bob", "caps": ["mds", "allow rw path=/volumes/_nogroup/1f8ee972-fbad-4b31-8005-874e13dc9275/fbd4d206-5fcf-4579-a13d-8e4adc225411", "osd", "allow rw pool=manila_data namespace=fsvolumens_1f8ee972-fbad-4b31-8005-874e13dc9275", "mon", "allow r"], "format": "json"}]': finished Dec 15 05:17:04 localhost openstack_network_exporter[246484]: ERROR 10:17:04 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd Dec 15 05:17:04 localhost openstack_network_exporter[246484]: ERROR 10:17:04 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd Dec 15 05:17:04 localhost openstack_network_exporter[246484]: ERROR 10:17:04 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server Dec 15 05:17:04 localhost openstack_network_exporter[246484]: ERROR 10:17:04 appctl.go:174: 
call(dpif-netdev/pmd-rxq-show): please specify an existing datapath Dec 15 05:17:04 localhost openstack_network_exporter[246484]: Dec 15 05:17:04 localhost openstack_network_exporter[246484]: ERROR 10:17:04 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath Dec 15 05:17:04 localhost openstack_network_exporter[246484]: Dec 15 05:17:05 localhost nova_compute[286344]: 2025-12-15 10:17:05.473 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:17:05 localhost systemd[1]: Started /usr/bin/podman healthcheck run 4d46c406a3855fd1c322e5d86c048188ea5865daa07ba23aaebacc7879668923. Dec 15 05:17:05 localhost podman[338184]: 2025-12-15 10:17:05.74072349 +0000 UTC m=+0.075504202 container health_status 4d46c406a3855fd1c322e5d86c048188ea5865daa07ba23aaebacc7879668923 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=ovn_metadata_agent, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '07f3af15d79276418f54a8dde2fc251064527c3571430e14af77aaa2c01f1193-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 
'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent) Dec 15 05:17:05 localhost podman[338184]: 2025-12-15 10:17:05.751372481 +0000 UTC m=+0.086153193 container exec_died 4d46c406a3855fd1c322e5d86c048188ea5865daa07ba23aaebacc7879668923 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, config_id=ovn_metadata_agent, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '07f3af15d79276418f54a8dde2fc251064527c3571430e14af77aaa2c01f1193-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', 
'/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image) Dec 15 05:17:05 localhost systemd[1]: 4d46c406a3855fd1c322e5d86c048188ea5865daa07ba23aaebacc7879668923.service: Deactivated successfully. Dec 15 05:17:07 localhost ceph-mon[298913]: mon.np0005559462@0(leader).osd e281 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104 Dec 15 05:17:07 localhost nova_compute[286344]: 2025-12-15 10:17:07.252 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:17:07 localhost ceph-mon[298913]: mon.np0005559462@0(leader) e17 handle_command mon_command({"prefix": "auth rm", "entity": "client.alice bob"} v 0) Dec 15 05:17:07 localhost ceph-mon[298913]: log_channel(audit) log [INF] : from='mgr.34411 ' entity='mgr.np0005559464.aomnqe' cmd={"prefix": "auth rm", "entity": "client.alice bob"} : dispatch Dec 15 05:17:07 localhost ceph-mon[298913]: log_channel(audit) log [INF] : from='mgr.34411 ' entity='mgr.np0005559464.aomnqe' cmd='[{"prefix": "auth rm", "entity": "client.alice bob"}]': finished Dec 15 05:17:08 localhost ceph-mon[298913]: from='mgr.34411 172.18.0.108:0/929118679' entity='mgr.np0005559464.aomnqe' cmd={"prefix": "auth get", "entity": "client.alice bob", "format": "json"} : dispatch Dec 15 05:17:08 localhost ceph-mon[298913]: from='mgr.34411 172.18.0.108:0/929118679' 
entity='mgr.np0005559464.aomnqe' cmd={"prefix": "auth rm", "entity": "client.alice bob"} : dispatch Dec 15 05:17:08 localhost ceph-mon[298913]: from='mgr.34411 ' entity='mgr.np0005559464.aomnqe' cmd={"prefix": "auth rm", "entity": "client.alice bob"} : dispatch Dec 15 05:17:08 localhost ceph-mon[298913]: from='mgr.34411 ' entity='mgr.np0005559464.aomnqe' cmd='[{"prefix": "auth rm", "entity": "client.alice bob"}]': finished Dec 15 05:17:10 localhost ceph-mon[298913]: log_channel(cluster) log [DBG] : mgrmap e56: np0005559464.aomnqe(active, since 19m), standbys: np0005559462.fudvyx, np0005559463.daptkf Dec 15 05:17:10 localhost nova_compute[286344]: 2025-12-15 10:17:10.491 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:17:10 localhost ceph-mon[298913]: mon.np0005559462@0(leader) e17 handle_command mon_command({"prefix": "auth get-or-create", "entity": "client.alice bob", "caps": ["mds", "allow r path=/volumes/_nogroup/1f8ee972-fbad-4b31-8005-874e13dc9275/fbd4d206-5fcf-4579-a13d-8e4adc225411", "osd", "allow r pool=manila_data namespace=fsvolumens_1f8ee972-fbad-4b31-8005-874e13dc9275", "mon", "allow r"], "format": "json"} v 0) Dec 15 05:17:10 localhost ceph-mon[298913]: log_channel(audit) log [INF] : from='mgr.34411 ' entity='mgr.np0005559464.aomnqe' cmd={"prefix": "auth get-or-create", "entity": "client.alice bob", "caps": ["mds", "allow r path=/volumes/_nogroup/1f8ee972-fbad-4b31-8005-874e13dc9275/fbd4d206-5fcf-4579-a13d-8e4adc225411", "osd", "allow r pool=manila_data namespace=fsvolumens_1f8ee972-fbad-4b31-8005-874e13dc9275", "mon", "allow r"], "format": "json"} : dispatch Dec 15 05:17:10 localhost ceph-mon[298913]: log_channel(audit) log [INF] : from='mgr.34411 ' entity='mgr.np0005559464.aomnqe' cmd='[{"prefix": "auth get-or-create", "entity": "client.alice bob", "caps": ["mds", "allow r 
path=/volumes/_nogroup/1f8ee972-fbad-4b31-8005-874e13dc9275/fbd4d206-5fcf-4579-a13d-8e4adc225411", "osd", "allow r pool=manila_data namespace=fsvolumens_1f8ee972-fbad-4b31-8005-874e13dc9275", "mon", "allow r"], "format": "json"}]': finished Dec 15 05:17:11 localhost ceph-mon[298913]: from='mgr.34411 172.18.0.108:0/929118679' entity='mgr.np0005559464.aomnqe' cmd={"prefix": "auth get", "entity": "client.alice bob", "format": "json"} : dispatch Dec 15 05:17:11 localhost ceph-mon[298913]: from='mgr.34411 172.18.0.108:0/929118679' entity='mgr.np0005559464.aomnqe' cmd={"prefix": "auth get-or-create", "entity": "client.alice bob", "caps": ["mds", "allow r path=/volumes/_nogroup/1f8ee972-fbad-4b31-8005-874e13dc9275/fbd4d206-5fcf-4579-a13d-8e4adc225411", "osd", "allow r pool=manila_data namespace=fsvolumens_1f8ee972-fbad-4b31-8005-874e13dc9275", "mon", "allow r"], "format": "json"} : dispatch Dec 15 05:17:11 localhost ceph-mon[298913]: from='mgr.34411 ' entity='mgr.np0005559464.aomnqe' cmd={"prefix": "auth get-or-create", "entity": "client.alice bob", "caps": ["mds", "allow r path=/volumes/_nogroup/1f8ee972-fbad-4b31-8005-874e13dc9275/fbd4d206-5fcf-4579-a13d-8e4adc225411", "osd", "allow r pool=manila_data namespace=fsvolumens_1f8ee972-fbad-4b31-8005-874e13dc9275", "mon", "allow r"], "format": "json"} : dispatch Dec 15 05:17:11 localhost ceph-mon[298913]: from='mgr.34411 ' entity='mgr.np0005559464.aomnqe' cmd='[{"prefix": "auth get-or-create", "entity": "client.alice bob", "caps": ["mds", "allow r path=/volumes/_nogroup/1f8ee972-fbad-4b31-8005-874e13dc9275/fbd4d206-5fcf-4579-a13d-8e4adc225411", "osd", "allow r pool=manila_data namespace=fsvolumens_1f8ee972-fbad-4b31-8005-874e13dc9275", "mon", "allow r"], "format": "json"}]': finished Dec 15 05:17:12 localhost ceph-mon[298913]: mon.np0005559462@0(leader).osd e281 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104 Dec 15 05:17:12 localhost nova_compute[286344]: 2025-12-15 
10:17:12.283 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:17:12 localhost systemd[1]: Started /usr/bin/podman healthcheck run a4e94bd7aaee6c174c601060fb5f34559019ccea93fc397263ee3735731cfc1e. Dec 15 05:17:12 localhost systemd[1]: tmp-crun.R0H0gw.mount: Deactivated successfully. Dec 15 05:17:12 localhost podman[338202]: 2025-12-15 10:17:12.766667197 +0000 UTC m=+0.093200831 container health_status a4e94bd7aaee6c174c601060fb5f34559019ccea93fc397263ee3735731cfc1e (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible) Dec 15 05:17:12 localhost podman[338202]: 2025-12-15 10:17:12.775724732 +0000 UTC m=+0.102258396 container exec_died a4e94bd7aaee6c174c601060fb5f34559019ccea93fc397263ee3735731cfc1e (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': 
'/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter) Dec 15 05:17:12 localhost systemd[1]: a4e94bd7aaee6c174c601060fb5f34559019ccea93fc397263ee3735731cfc1e.service: Deactivated successfully. Dec 15 05:17:13 localhost ceph-mon[298913]: mon.np0005559462@0(leader) e17 handle_command mon_command({"prefix": "auth rm", "entity": "client.alice bob"} v 0) Dec 15 05:17:13 localhost ceph-mon[298913]: log_channel(audit) log [INF] : from='mgr.34411 ' entity='mgr.np0005559464.aomnqe' cmd={"prefix": "auth rm", "entity": "client.alice bob"} : dispatch Dec 15 05:17:13 localhost ceph-mon[298913]: log_channel(audit) log [INF] : from='mgr.34411 ' entity='mgr.np0005559464.aomnqe' cmd='[{"prefix": "auth rm", "entity": "client.alice bob"}]': finished Dec 15 05:17:14 localhost ceph-mon[298913]: from='mgr.34411 172.18.0.108:0/929118679' entity='mgr.np0005559464.aomnqe' cmd={"prefix": "auth get", "entity": "client.alice bob", "format": "json"} : dispatch Dec 15 05:17:14 localhost ceph-mon[298913]: from='mgr.34411 172.18.0.108:0/929118679' entity='mgr.np0005559464.aomnqe' cmd={"prefix": "auth rm", "entity": "client.alice bob"} : dispatch Dec 15 05:17:14 localhost ceph-mon[298913]: from='mgr.34411 ' entity='mgr.np0005559464.aomnqe' cmd={"prefix": "auth rm", "entity": "client.alice bob"} : dispatch Dec 15 05:17:14 localhost ceph-mon[298913]: from='mgr.34411 ' entity='mgr.np0005559464.aomnqe' cmd='[{"prefix": "auth rm", "entity": "client.alice bob"}]': finished Dec 15 05:17:15 localhost nova_compute[286344]: 
2025-12-15 10:17:15.530 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:17:17 localhost ceph-mon[298913]: mon.np0005559462@0(leader) e17 handle_command mon_command({"prefix": "auth get-or-create", "entity": "client.bob", "caps": ["mds", "allow rw path=/volumes/_nogroup/1f8ee972-fbad-4b31-8005-874e13dc9275/fbd4d206-5fcf-4579-a13d-8e4adc225411", "osd", "allow rw pool=manila_data namespace=fsvolumens_1f8ee972-fbad-4b31-8005-874e13dc9275", "mon", "allow r"], "format": "json"} v 0) Dec 15 05:17:17 localhost ceph-mon[298913]: log_channel(audit) log [INF] : from='mgr.34411 ' entity='mgr.np0005559464.aomnqe' cmd={"prefix": "auth get-or-create", "entity": "client.bob", "caps": ["mds", "allow rw path=/volumes/_nogroup/1f8ee972-fbad-4b31-8005-874e13dc9275/fbd4d206-5fcf-4579-a13d-8e4adc225411", "osd", "allow rw pool=manila_data namespace=fsvolumens_1f8ee972-fbad-4b31-8005-874e13dc9275", "mon", "allow r"], "format": "json"} : dispatch Dec 15 05:17:17 localhost ceph-mon[298913]: log_channel(audit) log [INF] : from='mgr.34411 ' entity='mgr.np0005559464.aomnqe' cmd='[{"prefix": "auth get-or-create", "entity": "client.bob", "caps": ["mds", "allow rw path=/volumes/_nogroup/1f8ee972-fbad-4b31-8005-874e13dc9275/fbd4d206-5fcf-4579-a13d-8e4adc225411", "osd", "allow rw pool=manila_data namespace=fsvolumens_1f8ee972-fbad-4b31-8005-874e13dc9275", "mon", "allow r"], "format": "json"}]': finished Dec 15 05:17:17 localhost ceph-mon[298913]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #82. Immutable memtables: 0. 
Dec 15 05:17:17 localhost ceph-mon[298913]: rocksdb: (Original Log Time 2025/12/15-10:17:17.066252) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0 Dec 15 05:17:17 localhost ceph-mon[298913]: rocksdb: [db/flush_job.cc:856] [default] [JOB 49] Flushing memtable with next log file: 82 Dec 15 05:17:17 localhost ceph-mon[298913]: rocksdb: EVENT_LOG_v1 {"time_micros": 1765793837066307, "job": 49, "event": "flush_started", "num_memtables": 1, "num_entries": 1248, "num_deletes": 251, "total_data_size": 1165932, "memory_usage": 1305456, "flush_reason": "Manual Compaction"} Dec 15 05:17:17 localhost ceph-mon[298913]: rocksdb: [db/flush_job.cc:885] [default] [JOB 49] Level-0 flush table #83: started Dec 15 05:17:17 localhost ceph-mon[298913]: rocksdb: EVENT_LOG_v1 {"time_micros": 1765793837077717, "cf_name": "default", "job": 49, "event": "table_file_creation", "file_number": 83, "file_size": 1143879, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 42492, "largest_seqno": 43739, "table_properties": {"data_size": 1138428, "index_size": 2730, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 1669, "raw_key_size": 13720, "raw_average_key_size": 21, "raw_value_size": 1126765, "raw_average_value_size": 1728, "num_data_blocks": 120, "num_entries": 652, "num_filter_entries": 652, "num_deletions": 251, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; 
max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1765793772, "oldest_key_time": 1765793772, "file_creation_time": 1765793837, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "603b24af-e2be-4214-bc56-9e652eb4af3d", "db_session_id": "0OJRM9SCUA16EXV0VQZ2", "orig_file_number": 83, "seqno_to_time_mapping": "N/A"}} Dec 15 05:17:17 localhost ceph-mon[298913]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 49] Flush lasted 11535 microseconds, and 3846 cpu microseconds. Dec 15 05:17:17 localhost ceph-mon[298913]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed. Dec 15 05:17:17 localhost ceph-mon[298913]: rocksdb: (Original Log Time 2025/12/15-10:17:17.077785) [db/flush_job.cc:967] [default] [JOB 49] Level-0 flush table #83: 1143879 bytes OK Dec 15 05:17:17 localhost ceph-mon[298913]: rocksdb: (Original Log Time 2025/12/15-10:17:17.077814) [db/memtable_list.cc:519] [default] Level-0 commit table #83 started Dec 15 05:17:17 localhost ceph-mon[298913]: rocksdb: (Original Log Time 2025/12/15-10:17:17.080142) [db/memtable_list.cc:722] [default] Level-0 commit table #83: memtable #1 done Dec 15 05:17:17 localhost ceph-mon[298913]: rocksdb: (Original Log Time 2025/12/15-10:17:17.080162) EVENT_LOG_v1 {"time_micros": 1765793837080156, "job": 49, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0} Dec 15 05:17:17 localhost ceph-mon[298913]: rocksdb: (Original Log Time 2025/12/15-10:17:17.080184) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25 Dec 15 05:17:17 localhost ceph-mon[298913]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 49] Try to delete WAL files size 1159957, prev total WAL file 
size 1159957, number of live WAL files 2. Dec 15 05:17:17 localhost ceph-mon[298913]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005559462/store.db/000079.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000 Dec 15 05:17:17 localhost ceph-mon[298913]: rocksdb: (Original Log Time 2025/12/15-10:17:17.080816) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '7061786F73003133333033' seq:72057594037927935, type:22 .. '7061786F73003133353535' seq:0, type:0; will stop at (end) Dec 15 05:17:17 localhost ceph-mon[298913]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 50] Compacting 1@0 + 1@6 files to L6, score -1.00 Dec 15 05:17:17 localhost ceph-mon[298913]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 49 Base level 0, inputs: [83(1117KB)], [81(18MB)] Dec 15 05:17:17 localhost ceph-mon[298913]: rocksdb: EVENT_LOG_v1 {"time_micros": 1765793837080859, "job": 50, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [83], "files_L6": [81], "score": -1, "input_data_size": 20319950, "oldest_snapshot_seqno": -1} Dec 15 05:17:17 localhost ceph-mon[298913]: mon.np0005559462@0(leader).osd e281 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104 Dec 15 05:17:17 localhost ceph-mon[298913]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 50] Generated table #84: 14603 keys, 18614237 bytes, temperature: kUnknown Dec 15 05:17:17 localhost ceph-mon[298913]: rocksdb: EVENT_LOG_v1 {"time_micros": 1765793837215083, "cf_name": "default", "job": 50, "event": "table_file_creation", "file_number": 84, "file_size": 18614237, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 18532278, "index_size": 44354, "index_partitions": 0, 
"top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 36549, "raw_key_size": 394594, "raw_average_key_size": 27, "raw_value_size": 18285387, "raw_average_value_size": 1252, "num_data_blocks": 1626, "num_entries": 14603, "num_filter_entries": 14603, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1765792320, "oldest_key_time": 0, "file_creation_time": 1765793837, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "603b24af-e2be-4214-bc56-9e652eb4af3d", "db_session_id": "0OJRM9SCUA16EXV0VQZ2", "orig_file_number": 84, "seqno_to_time_mapping": "N/A"}} Dec 15 05:17:17 localhost ceph-mon[298913]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed. 
Dec 15 05:17:17 localhost ceph-mon[298913]: rocksdb: (Original Log Time 2025/12/15-10:17:17.215356) [db/compaction/compaction_job.cc:1663] [default] [JOB 50] Compacted 1@0 + 1@6 files to L6 => 18614237 bytes Dec 15 05:17:17 localhost ceph-mon[298913]: rocksdb: (Original Log Time 2025/12/15-10:17:17.217434) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 151.3 rd, 138.6 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(1.1, 18.3 +0.0 blob) out(17.8 +0.0 blob), read-write-amplify(34.0) write-amplify(16.3) OK, records in: 15129, records dropped: 526 output_compression: NoCompression Dec 15 05:17:17 localhost ceph-mon[298913]: rocksdb: (Original Log Time 2025/12/15-10:17:17.217465) EVENT_LOG_v1 {"time_micros": 1765793837217451, "job": 50, "event": "compaction_finished", "compaction_time_micros": 134303, "compaction_time_cpu_micros": 51182, "output_level": 6, "num_output_files": 1, "total_output_size": 18614237, "num_input_records": 15129, "num_output_records": 14603, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]} Dec 15 05:17:17 localhost ceph-mon[298913]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005559462/store.db/000083.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000 Dec 15 05:17:17 localhost ceph-mon[298913]: rocksdb: EVENT_LOG_v1 {"time_micros": 1765793837217748, "job": 50, "event": "table_file_deletion", "file_number": 83} Dec 15 05:17:17 localhost ceph-mon[298913]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005559462/store.db/000081.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000 Dec 15 05:17:17 localhost ceph-mon[298913]: rocksdb: EVENT_LOG_v1 {"time_micros": 1765793837220641, 
"job": 50, "event": "table_file_deletion", "file_number": 81} Dec 15 05:17:17 localhost ceph-mon[298913]: rocksdb: (Original Log Time 2025/12/15-10:17:17.080737) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting Dec 15 05:17:17 localhost ceph-mon[298913]: rocksdb: (Original Log Time 2025/12/15-10:17:17.220738) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting Dec 15 05:17:17 localhost ceph-mon[298913]: rocksdb: (Original Log Time 2025/12/15-10:17:17.220743) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting Dec 15 05:17:17 localhost ceph-mon[298913]: rocksdb: (Original Log Time 2025/12/15-10:17:17.220745) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting Dec 15 05:17:17 localhost ceph-mon[298913]: rocksdb: (Original Log Time 2025/12/15-10:17:17.220746) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting Dec 15 05:17:17 localhost ceph-mon[298913]: rocksdb: (Original Log Time 2025/12/15-10:17:17.220748) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting Dec 15 05:17:17 localhost nova_compute[286344]: 2025-12-15 10:17:17.286 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:17:17 localhost ceph-mon[298913]: from='mgr.34411 172.18.0.108:0/929118679' entity='mgr.np0005559464.aomnqe' cmd={"prefix": "auth get", "entity": "client.bob", "format": "json"} : dispatch Dec 15 05:17:17 localhost ceph-mon[298913]: from='mgr.34411 172.18.0.108:0/929118679' entity='mgr.np0005559464.aomnqe' cmd={"prefix": "auth get-or-create", "entity": "client.bob", "caps": ["mds", "allow rw path=/volumes/_nogroup/1f8ee972-fbad-4b31-8005-874e13dc9275/fbd4d206-5fcf-4579-a13d-8e4adc225411", "osd", "allow rw pool=manila_data namespace=fsvolumens_1f8ee972-fbad-4b31-8005-874e13dc9275", "mon", "allow r"], "format": 
"json"} : dispatch Dec 15 05:17:17 localhost ceph-mon[298913]: from='mgr.34411 ' entity='mgr.np0005559464.aomnqe' cmd={"prefix": "auth get-or-create", "entity": "client.bob", "caps": ["mds", "allow rw path=/volumes/_nogroup/1f8ee972-fbad-4b31-8005-874e13dc9275/fbd4d206-5fcf-4579-a13d-8e4adc225411", "osd", "allow rw pool=manila_data namespace=fsvolumens_1f8ee972-fbad-4b31-8005-874e13dc9275", "mon", "allow r"], "format": "json"} : dispatch Dec 15 05:17:17 localhost ceph-mon[298913]: from='mgr.34411 ' entity='mgr.np0005559464.aomnqe' cmd='[{"prefix": "auth get-or-create", "entity": "client.bob", "caps": ["mds", "allow rw path=/volumes/_nogroup/1f8ee972-fbad-4b31-8005-874e13dc9275/fbd4d206-5fcf-4579-a13d-8e4adc225411", "osd", "allow rw pool=manila_data namespace=fsvolumens_1f8ee972-fbad-4b31-8005-874e13dc9275", "mon", "allow r"], "format": "json"}]': finished Dec 15 05:17:17 localhost podman[338369]: Dec 15 05:17:17 localhost podman[338369]: 2025-12-15 10:17:17.694263691 +0000 UTC m=+0.077686027 container create feeb1f6a78f60a6c8fb3d29db5c783f1bdf0758d088ca475c41d6390dfbd7c99 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=quirky_jennings, name=rhceph, architecture=x86_64, description=Red Hat Ceph Storage 7, io.openshift.expose-services=, vendor=Red Hat, Inc., RELEASE=main, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, distribution-scope=public, com.redhat.component=rhceph-container, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, GIT_REPO=https://github.com/ceph/ceph-container.git, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, CEPH_POINT_RELEASE=, GIT_CLEAN=True, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.k8s.description=Red Hat Ceph Storage 7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, maintainer=Guillaume Abrioux , release=1763362218, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, ceph=True, GIT_BRANCH=main, io.openshift.tags=rhceph ceph, io.buildah.version=1.41.4, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, build-date=2025-11-26T19:44:28Z, version=7) Dec 15 05:17:17 localhost systemd[1]: Started libpod-conmon-feeb1f6a78f60a6c8fb3d29db5c783f1bdf0758d088ca475c41d6390dfbd7c99.scope. Dec 15 05:17:17 localhost systemd[1]: Started libcrun container. Dec 15 05:17:17 localhost podman[338369]: 2025-12-15 10:17:17.662443718 +0000 UTC m=+0.045866044 image pull registry.redhat.io/rhceph/rhceph-7-rhel9:latest Dec 15 05:17:17 localhost podman[338369]: 2025-12-15 10:17:17.770903255 +0000 UTC m=+0.154325571 container init feeb1f6a78f60a6c8fb3d29db5c783f1bdf0758d088ca475c41d6390dfbd7c99 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=quirky_jennings, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=7, RELEASE=main, url=https://catalog.redhat.com/en/search?searchType=containers, architecture=x86_64, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.k8s.description=Red Hat Ceph Storage 7, vcs-type=git, name=rhceph, GIT_REPO=https://github.com/ceph/ceph-container.git, vendor=Red Hat, Inc., release=1763362218, GIT_CLEAN=True, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_BRANCH=main, distribution-scope=public, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, maintainer=Guillaume Abrioux , cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.openshift.expose-services=, build-date=2025-11-26T19:44:28Z, io.buildah.version=1.41.4, io.openshift.tags=rhceph ceph, com.redhat.component=rhceph-container, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, ceph=True, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, CEPH_POINT_RELEASE=, description=Red Hat Ceph Storage 7) Dec 15 05:17:17 localhost 
podman[338369]: 2025-12-15 10:17:17.781702761 +0000 UTC m=+0.165125067 container start feeb1f6a78f60a6c8fb3d29db5c783f1bdf0758d088ca475c41d6390dfbd7c99 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=quirky_jennings, build-date=2025-11-26T19:44:28Z, url=https://catalog.redhat.com/en/search?searchType=containers, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, release=1763362218, name=rhceph, ceph=True, GIT_BRANCH=main, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.openshift.expose-services=, CEPH_POINT_RELEASE=, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, io.openshift.tags=rhceph ceph, description=Red Hat Ceph Storage 7, version=7, io.k8s.description=Red Hat Ceph Storage 7, architecture=x86_64, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vcs-type=git, distribution-scope=public, com.redhat.component=rhceph-container, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., maintainer=Guillaume Abrioux , vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_CLEAN=True, RELEASE=main, io.buildah.version=1.41.4, GIT_REPO=https://github.com/ceph/ceph-container.git) Dec 15 05:17:17 localhost podman[338369]: 2025-12-15 10:17:17.78201564 +0000 UTC m=+0.165437996 container attach feeb1f6a78f60a6c8fb3d29db5c783f1bdf0758d088ca475c41d6390dfbd7c99 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=quirky_jennings, architecture=x86_64, maintainer=Guillaume Abrioux , distribution-scope=public, GIT_CLEAN=True, version=7, url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.description=Red Hat Ceph Storage 7, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, com.redhat.component=rhceph-container, 
vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, vendor=Red Hat, Inc., build-date=2025-11-26T19:44:28Z, name=rhceph, vcs-type=git, description=Red Hat Ceph Storage 7, io.buildah.version=1.41.4, ceph=True, io.openshift.tags=rhceph ceph, io.openshift.expose-services=, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_BRANCH=main, GIT_REPO=https://github.com/ceph/ceph-container.git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, CEPH_POINT_RELEASE=, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, RELEASE=main, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, release=1763362218) Dec 15 05:17:17 localhost quirky_jennings[338384]: 167 167 Dec 15 05:17:17 localhost systemd[1]: libpod-feeb1f6a78f60a6c8fb3d29db5c783f1bdf0758d088ca475c41d6390dfbd7c99.scope: Deactivated successfully. Dec 15 05:17:17 localhost podman[338369]: 2025-12-15 10:17:17.78815732 +0000 UTC m=+0.171579626 container died feeb1f6a78f60a6c8fb3d29db5c783f1bdf0758d088ca475c41d6390dfbd7c99 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=quirky_jennings, io.k8s.description=Red Hat Ceph Storage 7, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, io.openshift.tags=rhceph ceph, build-date=2025-11-26T19:44:28Z, io.openshift.expose-services=, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=7, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, GIT_REPO=https://github.com/ceph/ceph-container.git, GIT_CLEAN=True, CEPH_POINT_RELEASE=, GIT_BRANCH=main, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, ceph=True, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, release=1763362218, maintainer=Guillaume Abrioux , 
org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, distribution-scope=public, RELEASE=main, name=rhceph, io.buildah.version=1.41.4, architecture=x86_64, description=Red Hat Ceph Storage 7, com.redhat.component=rhceph-container) Dec 15 05:17:17 localhost podman[338389]: 2025-12-15 10:17:17.891426695 +0000 UTC m=+0.092157280 container remove feeb1f6a78f60a6c8fb3d29db5c783f1bdf0758d088ca475c41d6390dfbd7c99 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=quirky_jennings, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.buildah.version=1.41.4, GIT_REPO=https://github.com/ceph/ceph-container.git, CEPH_POINT_RELEASE=, vendor=Red Hat, Inc., io.openshift.expose-services=, build-date=2025-11-26T19:44:28Z, ceph=True, distribution-scope=public, GIT_BRANCH=main, io.openshift.tags=rhceph ceph, vcs-type=git, com.redhat.component=rhceph-container, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, url=https://catalog.redhat.com/en/search?searchType=containers, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., version=7, RELEASE=main, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=Guillaume Abrioux , GIT_CLEAN=True, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, description=Red Hat Ceph Storage 7, architecture=x86_64, name=rhceph, release=1763362218, io.k8s.description=Red Hat Ceph Storage 7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0) Dec 15 05:17:17 localhost systemd[1]: libpod-conmon-feeb1f6a78f60a6c8fb3d29db5c783f1bdf0758d088ca475c41d6390dfbd7c99.scope: Deactivated successfully. 
Dec 15 05:17:18 localhost podman[338410]: Dec 15 05:17:18 localhost podman[338410]: 2025-12-15 10:17:18.164455201 +0000 UTC m=+0.084697642 container create bae37d875b0520d5f9a33a2a2794916699667c769a8621540f5da39a0c099c46 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=bold_beaver, GIT_CLEAN=True, io.k8s.description=Red Hat Ceph Storage 7, com.redhat.component=rhceph-container, release=1763362218, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., version=7, RELEASE=main, distribution-scope=public, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vcs-type=git, architecture=x86_64, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, vendor=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, io.buildah.version=1.41.4, description=Red Hat Ceph Storage 7, ceph=True, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhceph, maintainer=Guillaume Abrioux , GIT_BRANCH=main, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, CEPH_POINT_RELEASE=, io.openshift.expose-services=, build-date=2025-11-26T19:44:28Z, io.openshift.tags=rhceph ceph, GIT_REPO=https://github.com/ceph/ceph-container.git, cpe=cpe:/a:redhat:enterprise_linux:9::appstream) Dec 15 05:17:18 localhost systemd[1]: Started libpod-conmon-bae37d875b0520d5f9a33a2a2794916699667c769a8621540f5da39a0c099c46.scope. Dec 15 05:17:18 localhost systemd[1]: Started libcrun container. 
Dec 15 05:17:18 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/242c8ba6c9b578f019cddb205e7c9d510f5acb689257d026b6abb1a44ff185bc/merged/rootfs supports timestamps until 2038 (0x7fffffff) Dec 15 05:17:18 localhost podman[338410]: 2025-12-15 10:17:18.129701643 +0000 UTC m=+0.049944124 image pull registry.redhat.io/rhceph/rhceph-7-rhel9:latest Dec 15 05:17:18 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/242c8ba6c9b578f019cddb205e7c9d510f5acb689257d026b6abb1a44ff185bc/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff) Dec 15 05:17:18 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/242c8ba6c9b578f019cddb205e7c9d510f5acb689257d026b6abb1a44ff185bc/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff) Dec 15 05:17:18 localhost kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/242c8ba6c9b578f019cddb205e7c9d510f5acb689257d026b6abb1a44ff185bc/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff) Dec 15 05:17:18 localhost podman[338410]: 2025-12-15 10:17:18.235326317 +0000 UTC m=+0.155568748 container init bae37d875b0520d5f9a33a2a2794916699667c769a8621540f5da39a0c099c46 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=bold_beaver, url=https://catalog.redhat.com/en/search?searchType=containers, GIT_CLEAN=True, architecture=x86_64, version=7, RELEASE=main, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, ceph=True, io.openshift.expose-services=, maintainer=Guillaume Abrioux , CEPH_POINT_RELEASE=, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., distribution-scope=public, io.k8s.description=Red Hat Ceph Storage 7, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, vcs-type=git, name=rhceph, 
GIT_REPO=https://github.com/ceph/ceph-container.git, GIT_BRANCH=main, com.redhat.component=rhceph-container, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, release=1763362218, description=Red Hat Ceph Storage 7, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vendor=Red Hat, Inc., build-date=2025-11-26T19:44:28Z, io.openshift.tags=rhceph ceph, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0) Dec 15 05:17:18 localhost podman[338410]: 2025-12-15 10:17:18.245752432 +0000 UTC m=+0.165994863 container start bae37d875b0520d5f9a33a2a2794916699667c769a8621540f5da39a0c099c46 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=bold_beaver, io.openshift.tags=rhceph ceph, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, vendor=Red Hat, Inc., name=rhceph, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, io.openshift.expose-services=, GIT_BRANCH=main, GIT_REPO=https://github.com/ceph/ceph-container.git, description=Red Hat Ceph Storage 7, distribution-scope=public, GIT_CLEAN=True, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, RELEASE=main, io.k8s.description=Red Hat Ceph Storage 7, build-date=2025-11-26T19:44:28Z, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, url=https://catalog.redhat.com/en/search?searchType=containers, maintainer=Guillaume Abrioux , CEPH_POINT_RELEASE=, architecture=x86_64, com.redhat.component=rhceph-container, version=7, ceph=True, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, release=1763362218, io.buildah.version=1.41.4) Dec 15 05:17:18 localhost podman[338410]: 2025-12-15 10:17:18.246208015 +0000 UTC m=+0.166450466 container attach bae37d875b0520d5f9a33a2a2794916699667c769a8621540f5da39a0c099c46 
(image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=bold_beaver, vcs-type=git, GIT_REPO=https://github.com/ceph/ceph-container.git, url=https://catalog.redhat.com/en/search?searchType=containers, CEPH_POINT_RELEASE=, release=1763362218, RELEASE=main, ceph=True, vendor=Red Hat, Inc., com.redhat.component=rhceph-container, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, distribution-scope=public, io.buildah.version=1.41.4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, description=Red Hat Ceph Storage 7, io.openshift.expose-services=, architecture=x86_64, name=rhceph, maintainer=Guillaume Abrioux , build-date=2025-11-26T19:44:28Z, io.openshift.tags=rhceph ceph, version=7, io.k8s.description=Red Hat Ceph Storage 7, GIT_BRANCH=main, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, GIT_CLEAN=True, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image.) Dec 15 05:17:18 localhost systemd[1]: var-lib-containers-storage-overlay-d4cf7b25f4b431edacbae5d21e85c6b75171d7075cd6bec22c2e32497970d547-merged.mount: Deactivated successfully. 
Dec 15 05:17:19 localhost bold_beaver[338425]: [ Dec 15 05:17:19 localhost bold_beaver[338425]: { Dec 15 05:17:19 localhost bold_beaver[338425]: "available": false, Dec 15 05:17:19 localhost bold_beaver[338425]: "ceph_device": false, Dec 15 05:17:19 localhost bold_beaver[338425]: "device_id": "QEMU_DVD-ROM_QM00001", Dec 15 05:17:19 localhost bold_beaver[338425]: "lsm_data": {}, Dec 15 05:17:19 localhost bold_beaver[338425]: "lvs": [], Dec 15 05:17:19 localhost bold_beaver[338425]: "path": "/dev/sr0", Dec 15 05:17:19 localhost bold_beaver[338425]: "rejected_reasons": [ Dec 15 05:17:19 localhost bold_beaver[338425]: "Has a FileSystem", Dec 15 05:17:19 localhost bold_beaver[338425]: "Insufficient space (<5GB)" Dec 15 05:17:19 localhost bold_beaver[338425]: ], Dec 15 05:17:19 localhost bold_beaver[338425]: "sys_api": { Dec 15 05:17:19 localhost bold_beaver[338425]: "actuators": null, Dec 15 05:17:19 localhost bold_beaver[338425]: "device_nodes": "sr0", Dec 15 05:17:19 localhost bold_beaver[338425]: "human_readable_size": "482.00 KB", Dec 15 05:17:19 localhost bold_beaver[338425]: "id_bus": "ata", Dec 15 05:17:19 localhost bold_beaver[338425]: "model": "QEMU DVD-ROM", Dec 15 05:17:19 localhost bold_beaver[338425]: "nr_requests": "2", Dec 15 05:17:19 localhost bold_beaver[338425]: "partitions": {}, Dec 15 05:17:19 localhost bold_beaver[338425]: "path": "/dev/sr0", Dec 15 05:17:19 localhost bold_beaver[338425]: "removable": "1", Dec 15 05:17:19 localhost bold_beaver[338425]: "rev": "2.5+", Dec 15 05:17:19 localhost bold_beaver[338425]: "ro": "0", Dec 15 05:17:19 localhost bold_beaver[338425]: "rotational": "1", Dec 15 05:17:19 localhost bold_beaver[338425]: "sas_address": "", Dec 15 05:17:19 localhost bold_beaver[338425]: "sas_device_handle": "", Dec 15 05:17:19 localhost bold_beaver[338425]: "scheduler_mode": "mq-deadline", Dec 15 05:17:19 localhost bold_beaver[338425]: "sectors": 0, Dec 15 05:17:19 localhost bold_beaver[338425]: "sectorsize": "2048", Dec 15 05:17:19 
localhost bold_beaver[338425]: "size": 493568.0, Dec 15 05:17:19 localhost bold_beaver[338425]: "support_discard": "0", Dec 15 05:17:19 localhost bold_beaver[338425]: "type": "disk", Dec 15 05:17:19 localhost bold_beaver[338425]: "vendor": "QEMU" Dec 15 05:17:19 localhost bold_beaver[338425]: } Dec 15 05:17:19 localhost bold_beaver[338425]: } Dec 15 05:17:19 localhost bold_beaver[338425]: ] Dec 15 05:17:19 localhost systemd[1]: libpod-bae37d875b0520d5f9a33a2a2794916699667c769a8621540f5da39a0c099c46.scope: Deactivated successfully. Dec 15 05:17:19 localhost systemd[1]: libpod-bae37d875b0520d5f9a33a2a2794916699667c769a8621540f5da39a0c099c46.scope: Consumed 1.014s CPU time. Dec 15 05:17:19 localhost podman[338410]: 2025-12-15 10:17:19.244610805 +0000 UTC m=+1.164853286 container died bae37d875b0520d5f9a33a2a2794916699667c769a8621540f5da39a0c099c46 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=bold_beaver, architecture=x86_64, GIT_CLEAN=True, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, ceph=True, io.k8s.description=Red Hat Ceph Storage 7, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, io.buildah.version=1.41.4, GIT_REPO=https://github.com/ceph/ceph-container.git, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, CEPH_POINT_RELEASE=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, GIT_BRANCH=main, vcs-type=git, maintainer=Guillaume Abrioux , RELEASE=main, build-date=2025-11-26T19:44:28Z, version=7, vendor=Red Hat, Inc., summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., distribution-scope=public, release=1763362218, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, name=rhceph, url=https://catalog.redhat.com/en/search?searchType=containers, description=Red Hat Ceph Storage 7, io.openshift.expose-services=, com.redhat.component=rhceph-container, 
io.openshift.tags=rhceph ceph) Dec 15 05:17:19 localhost systemd[1]: var-lib-containers-storage-overlay-242c8ba6c9b578f019cddb205e7c9d510f5acb689257d026b6abb1a44ff185bc-merged.mount: Deactivated successfully. Dec 15 05:17:19 localhost podman[340514]: 2025-12-15 10:17:19.330033647 +0000 UTC m=+0.077361057 container remove bae37d875b0520d5f9a33a2a2794916699667c769a8621540f5da39a0c099c46 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=bold_beaver, vendor=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, name=rhceph, description=Red Hat Ceph Storage 7, io.openshift.expose-services=, RELEASE=main, architecture=x86_64, release=1763362218, ceph=True, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_BRANCH=main, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, CEPH_POINT_RELEASE=, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, build-date=2025-11-26T19:44:28Z, io.openshift.tags=rhceph ceph, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., distribution-scope=public, version=7, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, vcs-type=git, maintainer=Guillaume Abrioux , GIT_CLEAN=True, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, io.k8s.description=Red Hat Ceph Storage 7, com.redhat.component=rhceph-container, GIT_REPO=https://github.com/ceph/ceph-container.git) Dec 15 05:17:19 localhost systemd[1]: libpod-conmon-bae37d875b0520d5f9a33a2a2794916699667c769a8621540f5da39a0c099c46.scope: Deactivated successfully. 
Dec 15 05:17:19 localhost ceph-mon[298913]: mon.np0005559462@0(leader) e17 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005559462.localdomain.devices.0}] v 0) Dec 15 05:17:19 localhost ceph-mon[298913]: mon.np0005559462@0(leader) e17 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005559464.localdomain.devices.0}] v 0) Dec 15 05:17:19 localhost ceph-mon[298913]: log_channel(audit) log [INF] : from='mgr.34411 ' entity='mgr.np0005559464.aomnqe' Dec 15 05:17:19 localhost ceph-mon[298913]: mon.np0005559462@0(leader) e17 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005559462.localdomain}] v 0) Dec 15 05:17:19 localhost ceph-mon[298913]: log_channel(audit) log [INF] : from='mgr.34411 ' entity='mgr.np0005559464.aomnqe' Dec 15 05:17:19 localhost ceph-mon[298913]: mon.np0005559462@0(leader) e17 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005559464.localdomain}] v 0) Dec 15 05:17:19 localhost ceph-mon[298913]: log_channel(audit) log [INF] : from='mgr.34411 ' entity='mgr.np0005559464.aomnqe' Dec 15 05:17:19 localhost ceph-mon[298913]: log_channel(audit) log [INF] : from='mgr.34411 ' entity='mgr.np0005559464.aomnqe' Dec 15 05:17:19 localhost ceph-mon[298913]: mon.np0005559462@0(leader) e17 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005559463.localdomain.devices.0}] v 0) Dec 15 05:17:19 localhost ceph-mon[298913]: log_channel(audit) log [INF] : from='mgr.34411 ' entity='mgr.np0005559464.aomnqe' Dec 15 05:17:19 localhost ceph-mon[298913]: mon.np0005559462@0(leader) e17 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005559463.localdomain}] v 0) Dec 15 05:17:19 localhost ceph-mon[298913]: log_channel(audit) log [INF] : from='mgr.34411 ' entity='mgr.np0005559464.aomnqe' Dec 15 05:17:19 localhost ceph-mon[298913]: mon.np0005559462@0(leader) e17 handle_command mon_command([{prefix=config-key set, 
key=mgr/cephadm/osd_remove_queue}] v 0) Dec 15 05:17:19 localhost ceph-mon[298913]: log_channel(audit) log [INF] : from='mgr.34411 ' entity='mgr.np0005559464.aomnqe' Dec 15 05:17:19 localhost ceph-mon[298913]: mon.np0005559462@0(leader) e17 handle_command mon_command([{prefix=config-key set, key=mgr/progress/completed}] v 0) Dec 15 05:17:19 localhost ceph-mon[298913]: log_channel(audit) log [INF] : from='mgr.34411 ' entity='mgr.np0005559464.aomnqe' Dec 15 05:17:20 localhost ceph-mon[298913]: from='mgr.34411 ' entity='mgr.np0005559464.aomnqe' Dec 15 05:17:20 localhost ceph-mon[298913]: from='mgr.34411 ' entity='mgr.np0005559464.aomnqe' Dec 15 05:17:20 localhost ceph-mon[298913]: from='mgr.34411 ' entity='mgr.np0005559464.aomnqe' Dec 15 05:17:20 localhost ceph-mon[298913]: from='mgr.34411 ' entity='mgr.np0005559464.aomnqe' Dec 15 05:17:20 localhost ceph-mon[298913]: from='mgr.34411 ' entity='mgr.np0005559464.aomnqe' Dec 15 05:17:20 localhost ceph-mon[298913]: from='mgr.34411 ' entity='mgr.np0005559464.aomnqe' Dec 15 05:17:20 localhost ceph-mon[298913]: from='mgr.34411 172.18.0.108:0/929118679' entity='mgr.np0005559464.aomnqe' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch Dec 15 05:17:20 localhost ceph-mon[298913]: from='mgr.34411 ' entity='mgr.np0005559464.aomnqe' Dec 15 05:17:20 localhost ceph-mon[298913]: from='mgr.34411 ' entity='mgr.np0005559464.aomnqe' Dec 15 05:17:20 localhost nova_compute[286344]: 2025-12-15 10:17:20.564 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:17:21 localhost nova_compute[286344]: 2025-12-15 10:17:21.270 286348 DEBUG oslo_service.periodic_task [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Dec 15 05:17:21 localhost ceph-mon[298913]: mon.np0005559462@0(leader) 
e17 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) Dec 15 05:17:21 localhost ceph-mon[298913]: log_channel(audit) log [DBG] : from='client.25625 172.18.0.34:0/382777224' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch Dec 15 05:17:22 localhost ceph-mon[298913]: mon.np0005559462@0(leader).osd e281 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104 Dec 15 05:17:22 localhost nova_compute[286344]: 2025-12-15 10:17:22.326 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:17:23 localhost nova_compute[286344]: 2025-12-15 10:17:23.267 286348 DEBUG oslo_service.periodic_task [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Dec 15 05:17:24 localhost nova_compute[286344]: 2025-12-15 10:17:24.266 286348 DEBUG oslo_service.periodic_task [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Dec 15 05:17:24 localhost nova_compute[286344]: 2025-12-15 10:17:24.293 286348 DEBUG oslo_service.periodic_task [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Dec 15 05:17:24 localhost ceph-mon[298913]: mon.np0005559462@0(leader) e17 handle_command mon_command({"prefix": "auth caps", "entity": "client.bob", "caps": ["mon", "allow r", "mds", "allow rw path=/volumes/_nogroup/1f8ee972-fbad-4b31-8005-874e13dc9275/fbd4d206-5fcf-4579-a13d-8e4adc225411,allow rw 
path=/volumes/_nogroup/57a5f1aa-767f-40a8-9645-d7c9ee6060fb/f9aa11a9-e3a4-41d2-aa98-0bb57897775e", "osd", "allow rw pool=manila_data namespace=fsvolumens_1f8ee972-fbad-4b31-8005-874e13dc9275,allow rw pool=manila_data namespace=fsvolumens_57a5f1aa-767f-40a8-9645-d7c9ee6060fb"]} v 0) Dec 15 05:17:24 localhost ceph-mon[298913]: log_channel(audit) log [INF] : from='mgr.34411 ' entity='mgr.np0005559464.aomnqe' cmd={"prefix": "auth caps", "entity": "client.bob", "caps": ["mon", "allow r", "mds", "allow rw path=/volumes/_nogroup/1f8ee972-fbad-4b31-8005-874e13dc9275/fbd4d206-5fcf-4579-a13d-8e4adc225411,allow rw path=/volumes/_nogroup/57a5f1aa-767f-40a8-9645-d7c9ee6060fb/f9aa11a9-e3a4-41d2-aa98-0bb57897775e", "osd", "allow rw pool=manila_data namespace=fsvolumens_1f8ee972-fbad-4b31-8005-874e13dc9275,allow rw pool=manila_data namespace=fsvolumens_57a5f1aa-767f-40a8-9645-d7c9ee6060fb"]} : dispatch Dec 15 05:17:24 localhost ceph-mon[298913]: log_channel(audit) log [INF] : from='mgr.34411 ' entity='mgr.np0005559464.aomnqe' cmd='[{"prefix": "auth caps", "entity": "client.bob", "caps": ["mon", "allow r", "mds", "allow rw path=/volumes/_nogroup/1f8ee972-fbad-4b31-8005-874e13dc9275/fbd4d206-5fcf-4579-a13d-8e4adc225411,allow rw path=/volumes/_nogroup/57a5f1aa-767f-40a8-9645-d7c9ee6060fb/f9aa11a9-e3a4-41d2-aa98-0bb57897775e", "osd", "allow rw pool=manila_data namespace=fsvolumens_1f8ee972-fbad-4b31-8005-874e13dc9275,allow rw pool=manila_data namespace=fsvolumens_57a5f1aa-767f-40a8-9645-d7c9ee6060fb"]}]': finished Dec 15 05:17:25 localhost ceph-mon[298913]: from='mgr.34411 172.18.0.108:0/929118679' entity='mgr.np0005559464.aomnqe' cmd={"prefix": "auth get", "entity": "client.bob", "format": "json"} : dispatch Dec 15 05:17:25 localhost ceph-mon[298913]: from='mgr.34411 172.18.0.108:0/929118679' entity='mgr.np0005559464.aomnqe' cmd={"prefix": "auth caps", "entity": "client.bob", "caps": ["mon", "allow r", "mds", "allow rw 
path=/volumes/_nogroup/1f8ee972-fbad-4b31-8005-874e13dc9275/fbd4d206-5fcf-4579-a13d-8e4adc225411,allow rw path=/volumes/_nogroup/57a5f1aa-767f-40a8-9645-d7c9ee6060fb/f9aa11a9-e3a4-41d2-aa98-0bb57897775e", "osd", "allow rw pool=manila_data namespace=fsvolumens_1f8ee972-fbad-4b31-8005-874e13dc9275,allow rw pool=manila_data namespace=fsvolumens_57a5f1aa-767f-40a8-9645-d7c9ee6060fb"]} : dispatch Dec 15 05:17:25 localhost ceph-mon[298913]: from='mgr.34411 ' entity='mgr.np0005559464.aomnqe' cmd={"prefix": "auth caps", "entity": "client.bob", "caps": ["mon", "allow r", "mds", "allow rw path=/volumes/_nogroup/1f8ee972-fbad-4b31-8005-874e13dc9275/fbd4d206-5fcf-4579-a13d-8e4adc225411,allow rw path=/volumes/_nogroup/57a5f1aa-767f-40a8-9645-d7c9ee6060fb/f9aa11a9-e3a4-41d2-aa98-0bb57897775e", "osd", "allow rw pool=manila_data namespace=fsvolumens_1f8ee972-fbad-4b31-8005-874e13dc9275,allow rw pool=manila_data namespace=fsvolumens_57a5f1aa-767f-40a8-9645-d7c9ee6060fb"]} : dispatch Dec 15 05:17:25 localhost ceph-mon[298913]: from='mgr.34411 172.18.0.108:0/929118679' entity='mgr.np0005559464.aomnqe' cmd={"prefix": "auth get", "entity": "client.bob", "format": "json"} : dispatch Dec 15 05:17:25 localhost ceph-mon[298913]: from='mgr.34411 ' entity='mgr.np0005559464.aomnqe' cmd='[{"prefix": "auth caps", "entity": "client.bob", "caps": ["mon", "allow r", "mds", "allow rw path=/volumes/_nogroup/1f8ee972-fbad-4b31-8005-874e13dc9275/fbd4d206-5fcf-4579-a13d-8e4adc225411,allow rw path=/volumes/_nogroup/57a5f1aa-767f-40a8-9645-d7c9ee6060fb/f9aa11a9-e3a4-41d2-aa98-0bb57897775e", "osd", "allow rw pool=manila_data namespace=fsvolumens_1f8ee972-fbad-4b31-8005-874e13dc9275,allow rw pool=manila_data namespace=fsvolumens_57a5f1aa-767f-40a8-9645-d7c9ee6060fb"]}]': finished Dec 15 05:17:25 localhost nova_compute[286344]: 2025-12-15 10:17:25.603 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:17:26 
localhost nova_compute[286344]: 2025-12-15 10:17:26.271 286348 DEBUG oslo_service.periodic_task [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Dec 15 05:17:26 localhost nova_compute[286344]: 2025-12-15 10:17:26.271 286348 DEBUG nova.compute.manager [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m Dec 15 05:17:26 localhost nova_compute[286344]: 2025-12-15 10:17:26.271 286348 DEBUG nova.compute.manager [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m Dec 15 05:17:26 localhost nova_compute[286344]: 2025-12-15 10:17:26.782 286348 DEBUG oslo_concurrency.lockutils [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Acquiring lock "refresh_cache-39ff1bd9-6f6b-44c8-bbec-a1fd9d196359" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m Dec 15 05:17:26 localhost nova_compute[286344]: 2025-12-15 10:17:26.783 286348 DEBUG oslo_concurrency.lockutils [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Acquired lock "refresh_cache-39ff1bd9-6f6b-44c8-bbec-a1fd9d196359" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m Dec 15 05:17:26 localhost nova_compute[286344]: 2025-12-15 10:17:26.783 286348 DEBUG nova.network.neutron [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] [instance: 39ff1bd9-6f6b-44c8-bbec-a1fd9d196359] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m Dec 15 05:17:26 localhost nova_compute[286344]: 2025-12-15 10:17:26.783 286348 DEBUG 
nova.objects.instance [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Lazy-loading 'info_cache' on Instance uuid 39ff1bd9-6f6b-44c8-bbec-a1fd9d196359 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m Dec 15 05:17:27 localhost ceph-mon[298913]: mon.np0005559462@0(leader).osd e281 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104 Dec 15 05:17:27 localhost nova_compute[286344]: 2025-12-15 10:17:27.186 286348 DEBUG nova.network.neutron [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] [instance: 39ff1bd9-6f6b-44c8-bbec-a1fd9d196359] Updating instance_info_cache with network_info: [{"id": "03ef8889-3216-43fb-8a52-4be17a956ce1", "address": "fa:16:3e:74:df:7c", "network": {"id": "befb7a72-17a9-4bcb-b561-84b8f626685a", "bridge": "br-int", "label": "private", "subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.201", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.20", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.0.3"}}], "meta": {"injected": false, "tenant_id": "c785bf23f53946bc99867d8832a50266", "mtu": 1292, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap03ef8889-32", "ovs_interfaceid": "03ef8889-3216-43fb-8a52-4be17a956ce1", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m Dec 15 05:17:27 localhost nova_compute[286344]: 2025-12-15 10:17:27.200 286348 DEBUG 
oslo_concurrency.lockutils [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Releasing lock "refresh_cache-39ff1bd9-6f6b-44c8-bbec-a1fd9d196359" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m Dec 15 05:17:27 localhost nova_compute[286344]: 2025-12-15 10:17:27.200 286348 DEBUG nova.compute.manager [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] [instance: 39ff1bd9-6f6b-44c8-bbec-a1fd9d196359] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m Dec 15 05:17:27 localhost nova_compute[286344]: 2025-12-15 10:17:27.269 286348 DEBUG oslo_service.periodic_task [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Dec 15 05:17:27 localhost nova_compute[286344]: 2025-12-15 10:17:27.329 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:17:28 localhost ceph-mon[298913]: mon.np0005559462@0(leader) e17 handle_command mon_command({"prefix": "auth caps", "entity": "client.bob", "caps": ["mds", "allow rw path=/volumes/_nogroup/1f8ee972-fbad-4b31-8005-874e13dc9275/fbd4d206-5fcf-4579-a13d-8e4adc225411", "osd", "allow rw pool=manila_data namespace=fsvolumens_1f8ee972-fbad-4b31-8005-874e13dc9275"]} v 0) Dec 15 05:17:28 localhost ceph-mon[298913]: log_channel(audit) log [INF] : from='mgr.34411 ' entity='mgr.np0005559464.aomnqe' cmd={"prefix": "auth caps", "entity": "client.bob", "caps": ["mds", "allow rw path=/volumes/_nogroup/1f8ee972-fbad-4b31-8005-874e13dc9275/fbd4d206-5fcf-4579-a13d-8e4adc225411", "osd", "allow rw pool=manila_data namespace=fsvolumens_1f8ee972-fbad-4b31-8005-874e13dc9275"]} : dispatch Dec 15 05:17:28 localhost ceph-mon[298913]: log_channel(audit) log [INF] : 
from='mgr.34411 ' entity='mgr.np0005559464.aomnqe' cmd='[{"prefix": "auth caps", "entity": "client.bob", "caps": ["mds", "allow rw path=/volumes/_nogroup/1f8ee972-fbad-4b31-8005-874e13dc9275/fbd4d206-5fcf-4579-a13d-8e4adc225411", "osd", "allow rw pool=manila_data namespace=fsvolumens_1f8ee972-fbad-4b31-8005-874e13dc9275"]}]': finished Dec 15 05:17:28 localhost nova_compute[286344]: 2025-12-15 10:17:28.271 286348 DEBUG oslo_service.periodic_task [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Dec 15 05:17:28 localhost ceph-mon[298913]: from='mgr.34411 172.18.0.108:0/929118679' entity='mgr.np0005559464.aomnqe' cmd={"prefix": "auth get", "entity": "client.bob", "format": "json"} : dispatch Dec 15 05:17:28 localhost ceph-mon[298913]: from='mgr.34411 172.18.0.108:0/929118679' entity='mgr.np0005559464.aomnqe' cmd={"prefix": "auth caps", "entity": "client.bob", "caps": ["mds", "allow rw path=/volumes/_nogroup/1f8ee972-fbad-4b31-8005-874e13dc9275/fbd4d206-5fcf-4579-a13d-8e4adc225411", "osd", "allow rw pool=manila_data namespace=fsvolumens_1f8ee972-fbad-4b31-8005-874e13dc9275"]} : dispatch Dec 15 05:17:28 localhost ceph-mon[298913]: from='mgr.34411 ' entity='mgr.np0005559464.aomnqe' cmd={"prefix": "auth caps", "entity": "client.bob", "caps": ["mds", "allow rw path=/volumes/_nogroup/1f8ee972-fbad-4b31-8005-874e13dc9275/fbd4d206-5fcf-4579-a13d-8e4adc225411", "osd", "allow rw pool=manila_data namespace=fsvolumens_1f8ee972-fbad-4b31-8005-874e13dc9275"]} : dispatch Dec 15 05:17:28 localhost ceph-mon[298913]: from='mgr.34411 ' entity='mgr.np0005559464.aomnqe' cmd='[{"prefix": "auth caps", "entity": "client.bob", "caps": ["mds", "allow rw path=/volumes/_nogroup/1f8ee972-fbad-4b31-8005-874e13dc9275/fbd4d206-5fcf-4579-a13d-8e4adc225411", "osd", "allow rw pool=manila_data 
namespace=fsvolumens_1f8ee972-fbad-4b31-8005-874e13dc9275"]}]': finished Dec 15 05:17:29 localhost nova_compute[286344]: 2025-12-15 10:17:29.270 286348 DEBUG oslo_service.periodic_task [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Dec 15 05:17:29 localhost nova_compute[286344]: 2025-12-15 10:17:29.270 286348 DEBUG nova.compute.manager [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m Dec 15 05:17:30 localhost nova_compute[286344]: 2025-12-15 10:17:30.605 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:17:30 localhost systemd[1]: Started /usr/bin/podman healthcheck run 67ee72fcd99c896f79e8807764b1d0f04e7d45927d06e306c0986394e8ed42a0. Dec 15 05:17:30 localhost systemd[1]: Started /usr/bin/podman healthcheck run 730c50020a7d9e2da21d2462d40af20ac303e43f3491f692cbbd87f432ba3e09. Dec 15 05:17:30 localhost systemd[1]: Started /usr/bin/podman healthcheck run 9f0f481c937606298203797a71924b7614a5d9e3f4a8afd837cfa5b9602aedeb. Dec 15 05:17:30 localhost systemd[1]: Started /usr/bin/podman healthcheck run ac2eae30a2a7f971398af42ea67b3ed799d4b3341f7688ebcd73f7ca7f66c2a8. Dec 15 05:17:30 localhost systemd[1]: Started /usr/bin/podman healthcheck run b3d8483ade539500e228c2af9c0c3e647febb5ec7903610a83aea32631007d4a. 
Dec 15 05:17:30 localhost podman[340547]: 2025-12-15 10:17:30.7799397 +0000 UTC m=+0.105387598 container health_status 67ee72fcd99c896f79e8807764b1d0f04e7d45927d06e306c0986394e8ed42a0 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible) Dec 15 05:17:30 localhost podman[340547]: 2025-12-15 10:17:30.792374843 +0000 UTC m=+0.117822801 container exec_died 67ee72fcd99c896f79e8807764b1d0f04e7d45927d06e306c0986394e8ed42a0 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', 
'--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible) Dec 15 05:17:30 localhost systemd[1]: 67ee72fcd99c896f79e8807764b1d0f04e7d45927d06e306c0986394e8ed42a0.service: Deactivated successfully. 
Dec 15 05:17:30 localhost podman[340561]: 2025-12-15 10:17:30.841780881 +0000 UTC m=+0.153368313 container health_status b3d8483ade539500e228c2af9c0c3e647febb5ec7903610a83aea32631007d4a (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '07f3af15d79276418f54a8dde2fc251064527c3571430e14af77aaa2c01f1193-cb1e9478634a00c8fb5ad9cd7fcd38c7b2a1a85187dfaa618bc715c24c2b15fb'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, container_name=ceilometer_agent_compute, org.label-schema.build-date=20251202, tcib_managed=true, config_id=ceilometer_agent_compute, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, 
tcib_build_tag=c3923531bcda0b0811b2d5053f189beb) Dec 15 05:17:30 localhost podman[340549]: 2025-12-15 10:17:30.886547572 +0000 UTC m=+0.204676325 container health_status 9f0f481c937606298203797a71924b7614a5d9e3f4a8afd837cfa5b9602aedeb (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, org.label-schema.license=GPLv2, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, managed_by=edpm_ansible, org.label-schema.build-date=20251202, container_name=multipathd, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image) Dec 15 05:17:30 localhost podman[340549]: 2025-12-15 10:17:30.92677721 +0000 UTC m=+0.244905953 container 
exec_died 9f0f481c937606298203797a71924b7614a5d9e3f4a8afd837cfa5b9602aedeb (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=multipathd, org.label-schema.build-date=20251202, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_managed=true, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team) Dec 15 05:17:30 localhost podman[340550]: 2025-12-15 10:17:30.937623598 +0000 UTC m=+0.252334662 container health_status ac2eae30a2a7f971398af42ea67b3ed799d4b3341f7688ebcd73f7ca7f66c2a8 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, 
health_status=healthy, managed_by=edpm_ansible, org.label-schema.build-date=20251202, tcib_managed=true, config_id=ovn_controller, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=ovn_controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '07f3af15d79276418f54a8dde2fc251064527c3571430e14af77aaa2c01f1193'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team) Dec 15 05:17:30 localhost systemd[1]: 9f0f481c937606298203797a71924b7614a5d9e3f4a8afd837cfa5b9602aedeb.service: Deactivated successfully. 
Dec 15 05:17:30 localhost podman[340561]: 2025-12-15 10:17:30.952434511 +0000 UTC m=+0.264021943 container exec_died b3d8483ade539500e228c2af9c0c3e647febb5ec7903610a83aea32631007d4a (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '07f3af15d79276418f54a8dde2fc251064527c3571430e14af77aaa2c01f1193-cb1e9478634a00c8fb5ad9cd7fcd38c7b2a1a85187dfaa618bc715c24c2b15fb'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}) Dec 
15 05:17:31 localhost systemd[1]: b3d8483ade539500e228c2af9c0c3e647febb5ec7903610a83aea32631007d4a.service: Deactivated successfully. Dec 15 05:17:31 localhost podman[340548]: 2025-12-15 10:17:31.042074826 +0000 UTC m=+0.364476485 container health_status 730c50020a7d9e2da21d2462d40af20ac303e43f3491f692cbbd87f432ba3e09 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, architecture=x86_64, url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., version=9.6, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, managed_by=edpm_ansible, io.openshift.expose-services=, distribution-scope=public, vcs-type=git, com.redhat.component=ubi9-minimal-container, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, maintainer=Red Hat, Inc., config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'cb1e9478634a00c8fb5ad9cd7fcd38c7b2a1a85187dfaa618bc715c24c2b15fb'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, build-date=2025-08-20T13:12:41, vendor=Red Hat, Inc., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.openshift.tags=minimal rhel9, release=1755695350, config_id=openstack_network_exporter, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=openstack_network_exporter, name=ubi9-minimal) Dec 15 05:17:31 localhost podman[340548]: 2025-12-15 10:17:31.05173813 +0000 UTC m=+0.374139789 container exec_died 730c50020a7d9e2da21d2462d40af20ac303e43f3491f692cbbd87f432ba3e09 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, maintainer=Red Hat, Inc., release=1755695350, description=The Universal Base Image Minimal is a stripped down image 
that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, architecture=x86_64, com.redhat.component=ubi9-minimal-container, io.openshift.tags=minimal rhel9, container_name=openstack_network_exporter, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, version=9.6, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, url=https://catalog.redhat.com/en/search?searchType=containers, managed_by=edpm_ansible, vendor=Red Hat, Inc., config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'cb1e9478634a00c8fb5ad9cd7fcd38c7b2a1a85187dfaa618bc715c24c2b15fb'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, distribution-scope=public, build-date=2025-08-20T13:12:41, io.buildah.version=1.33.7, name=ubi9-minimal, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package 
manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, config_id=openstack_network_exporter) Dec 15 05:17:31 localhost systemd[1]: 730c50020a7d9e2da21d2462d40af20ac303e43f3491f692cbbd87f432ba3e09.service: Deactivated successfully. Dec 15 05:17:31 localhost podman[340550]: 2025-12-15 10:17:31.072466716 +0000 UTC m=+0.387177780 container exec_died ac2eae30a2a7f971398af42ea67b3ed799d4b3341f7688ebcd73f7ca7f66c2a8 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ovn_controller, container_name=ovn_controller, managed_by=edpm_ansible, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '07f3af15d79276418f54a8dde2fc251064527c3571430e14af77aaa2c01f1193'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.license=GPLv2) Dec 15 05:17:31 localhost systemd[1]: 
ac2eae30a2a7f971398af42ea67b3ed799d4b3341f7688ebcd73f7ca7f66c2a8.service: Deactivated successfully. Dec 15 05:17:31 localhost ceph-mon[298913]: mon.np0005559462@0(leader) e17 handle_command mon_command({"prefix": "auth rm", "entity": "client.bob"} v 0) Dec 15 05:17:31 localhost ceph-mon[298913]: log_channel(audit) log [INF] : from='mgr.34411 ' entity='mgr.np0005559464.aomnqe' cmd={"prefix": "auth rm", "entity": "client.bob"} : dispatch Dec 15 05:17:31 localhost ceph-mon[298913]: log_channel(audit) log [INF] : from='mgr.34411 ' entity='mgr.np0005559464.aomnqe' cmd='[{"prefix": "auth rm", "entity": "client.bob"}]': finished Dec 15 05:17:31 localhost ceph-mon[298913]: from='mgr.34411 172.18.0.108:0/929118679' entity='mgr.np0005559464.aomnqe' cmd={"prefix": "auth get", "entity": "client.bob", "format": "json"} : dispatch Dec 15 05:17:31 localhost ceph-mon[298913]: from='mgr.34411 172.18.0.108:0/929118679' entity='mgr.np0005559464.aomnqe' cmd={"prefix": "auth rm", "entity": "client.bob"} : dispatch Dec 15 05:17:31 localhost ceph-mon[298913]: from='mgr.34411 ' entity='mgr.np0005559464.aomnqe' cmd={"prefix": "auth rm", "entity": "client.bob"} : dispatch Dec 15 05:17:31 localhost ceph-mon[298913]: from='mgr.34411 ' entity='mgr.np0005559464.aomnqe' cmd='[{"prefix": "auth rm", "entity": "client.bob"}]': finished Dec 15 05:17:31 localhost podman[243449]: time="2025-12-15T10:17:31Z" level=info msg="List containers: received `last` parameter - overwriting `limit`" Dec 15 05:17:31 localhost podman[243449]: @ - - [15/Dec/2025:10:17:31 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 156640 "" "Go-http-client/1.1" Dec 15 05:17:31 localhost podman[243449]: @ - - [15/Dec/2025:10:17:31 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 19277 "" "Go-http-client/1.1" Dec 15 05:17:32 localhost ceph-mon[298913]: mon.np0005559462@0(leader).osd e281 _set_new_cache_sizes 
cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104 Dec 15 05:17:32 localhost nova_compute[286344]: 2025-12-15 10:17:32.383 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:17:33 localhost nova_compute[286344]: 2025-12-15 10:17:33.270 286348 DEBUG oslo_service.periodic_task [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Dec 15 05:17:33 localhost nova_compute[286344]: 2025-12-15 10:17:33.289 286348 DEBUG oslo_concurrency.lockutils [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Dec 15 05:17:33 localhost nova_compute[286344]: 2025-12-15 10:17:33.290 286348 DEBUG oslo_concurrency.lockutils [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Dec 15 05:17:33 localhost nova_compute[286344]: 2025-12-15 10:17:33.290 286348 DEBUG oslo_concurrency.lockutils [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Dec 15 05:17:33 localhost nova_compute[286344]: 2025-12-15 10:17:33.290 286348 DEBUG nova.compute.resource_tracker [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Auditing locally available compute resources for np0005559462.localdomain 
(node: np0005559462.localdomain) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m Dec 15 05:17:33 localhost nova_compute[286344]: 2025-12-15 10:17:33.291 286348 DEBUG oslo_concurrency.processutils [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m Dec 15 05:17:33 localhost ceph-mon[298913]: mon.np0005559462@0(leader) e17 handle_command mon_command({"prefix": "df", "format": "json"} v 0) Dec 15 05:17:33 localhost ceph-mon[298913]: log_channel(audit) log [DBG] : from='client.? 172.18.0.106:0/3005870055' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch Dec 15 05:17:33 localhost nova_compute[286344]: 2025-12-15 10:17:33.738 286348 DEBUG oslo_concurrency.processutils [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.447s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m Dec 15 05:17:33 localhost nova_compute[286344]: 2025-12-15 10:17:33.799 286348 DEBUG nova.virt.libvirt.driver [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m Dec 15 05:17:33 localhost nova_compute[286344]: 2025-12-15 10:17:33.800 286348 DEBUG nova.virt.libvirt.driver [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m Dec 15 05:17:34 localhost nova_compute[286344]: 2025-12-15 10:17:34.012 286348 WARNING nova.virt.libvirt.driver [None 
req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m Dec 15 05:17:34 localhost nova_compute[286344]: 2025-12-15 10:17:34.014 286348 DEBUG nova.compute.resource_tracker [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Hypervisor/Node resource view: name=np0005559462.localdomain free_ram=11109MB free_disk=41.836978912353516GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": 
"label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m Dec 15 05:17:34 localhost nova_compute[286344]: 2025-12-15 10:17:34.015 286348 DEBUG oslo_concurrency.lockutils [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Dec 15 05:17:34 localhost nova_compute[286344]: 2025-12-15 10:17:34.015 286348 DEBUG oslo_concurrency.lockutils [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Dec 15 05:17:34 localhost nova_compute[286344]: 2025-12-15 10:17:34.078 286348 DEBUG nova.compute.resource_tracker [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Instance 39ff1bd9-6f6b-44c8-bbec-a1fd9d196359 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 2, 'MEMORY_MB': 512, 'VCPU': 1}}. 
_remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m Dec 15 05:17:34 localhost nova_compute[286344]: 2025-12-15 10:17:34.078 286348 DEBUG nova.compute.resource_tracker [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m Dec 15 05:17:34 localhost nova_compute[286344]: 2025-12-15 10:17:34.079 286348 DEBUG nova.compute.resource_tracker [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Final resource view: name=np0005559462.localdomain phys_ram=15738MB used_ram=1024MB phys_disk=41GB used_disk=2GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m Dec 15 05:17:34 localhost nova_compute[286344]: 2025-12-15 10:17:34.121 286348 DEBUG oslo_concurrency.processutils [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m Dec 15 05:17:34 localhost ceph-mon[298913]: mon.np0005559462@0(leader) e17 handle_command mon_command({"prefix": "df", "format": "json"} v 0) Dec 15 05:17:34 localhost ceph-mon[298913]: log_channel(audit) log [DBG] : from='client.? 
172.18.0.106:0/3694827700' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch Dec 15 05:17:34 localhost nova_compute[286344]: 2025-12-15 10:17:34.575 286348 DEBUG oslo_concurrency.processutils [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.454s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m Dec 15 05:17:34 localhost nova_compute[286344]: 2025-12-15 10:17:34.581 286348 DEBUG nova.compute.provider_tree [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Inventory has not changed in ProviderTree for provider: 26c8956b-6742-4951-b566-971b9bbe323b update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m Dec 15 05:17:34 localhost nova_compute[286344]: 2025-12-15 10:17:34.597 286348 DEBUG nova.scheduler.client.report [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Inventory has not changed for provider 26c8956b-6742-4951-b566-971b9bbe323b based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m Dec 15 05:17:34 localhost nova_compute[286344]: 2025-12-15 10:17:34.599 286348 DEBUG nova.compute.resource_tracker [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Compute_service record updated for np0005559462.localdomain:np0005559462.localdomain _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m Dec 15 05:17:34 localhost nova_compute[286344]: 2025-12-15 10:17:34.600 286348 DEBUG 
oslo_concurrency.lockutils [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.585s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Dec 15 05:17:34 localhost openstack_network_exporter[246484]: ERROR 10:17:34 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server Dec 15 05:17:34 localhost openstack_network_exporter[246484]: ERROR 10:17:34 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd Dec 15 05:17:34 localhost openstack_network_exporter[246484]: ERROR 10:17:34 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd Dec 15 05:17:34 localhost openstack_network_exporter[246484]: ERROR 10:17:34 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath Dec 15 05:17:34 localhost openstack_network_exporter[246484]: Dec 15 05:17:34 localhost openstack_network_exporter[246484]: ERROR 10:17:34 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath Dec 15 05:17:34 localhost openstack_network_exporter[246484]: Dec 15 05:17:35 localhost nova_compute[286344]: 2025-12-15 10:17:35.635 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:17:36 localhost ceph-mon[298913]: log_channel(cluster) log [DBG] : mgrmap e57: np0005559464.aomnqe(active, since 19m), standbys: np0005559462.fudvyx, np0005559463.daptkf Dec 15 05:17:36 localhost systemd[1]: Started /usr/bin/podman healthcheck run 4d46c406a3855fd1c322e5d86c048188ea5865daa07ba23aaebacc7879668923. 
Dec 15 05:17:36 localhost podman[340696]: 2025-12-15 10:17:36.753784947 +0000 UTC m=+0.077158811 container health_status 4d46c406a3855fd1c322e5d86c048188ea5865daa07ba23aaebacc7879668923 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_managed=true, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '07f3af15d79276418f54a8dde2fc251064527c3571430e14af77aaa2c01f1193-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3) Dec 15 05:17:36 localhost 
podman[340696]: 2025-12-15 10:17:36.79145786 +0000 UTC m=+0.114831674 container exec_died 4d46c406a3855fd1c322e5d86c048188ea5865daa07ba23aaebacc7879668923 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_managed=true, container_name=ovn_metadata_agent, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '07f3af15d79276418f54a8dde2fc251064527c3571430e14af77aaa2c01f1193-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb) Dec 15 05:17:36 localhost systemd[1]: 
4d46c406a3855fd1c322e5d86c048188ea5865daa07ba23aaebacc7879668923.service: Deactivated successfully. Dec 15 05:17:37 localhost ceph-mon[298913]: mon.np0005559462@0(leader).osd e281 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104 Dec 15 05:17:37 localhost nova_compute[286344]: 2025-12-15 10:17:37.386 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:17:37 localhost nova_compute[286344]: 2025-12-15 10:17:37.601 286348 DEBUG oslo_service.periodic_task [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Dec 15 05:17:40 localhost nova_compute[286344]: 2025-12-15 10:17:40.664 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:17:42 localhost ceph-mon[298913]: mon.np0005559462@0(leader).osd e281 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104 Dec 15 05:17:42 localhost ovn_metadata_agent[160585]: 2025-12-15 10:17:42.379 160590 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=24, ssl=[], options={'arp_ns_explicit_output': 'true', 'fdb_removal_limit': '0', 'ignore_lsp_down': 'false', 'mac_binding_removal_limit': '0', 'mac_prefix': 'fe:17:e3', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': 'fe:55:2b:86:15:b5'}, ipsec=False) old=SB_Global(nb_cfg=23) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m Dec 15 05:17:42 localhost ovn_metadata_agent[160585]: 2025-12-15 
10:17:42.381 160590 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 8 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m Dec 15 05:17:42 localhost nova_compute[286344]: 2025-12-15 10:17:42.423 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:17:43 localhost systemd[1]: Started /usr/bin/podman healthcheck run a4e94bd7aaee6c174c601060fb5f34559019ccea93fc397263ee3735731cfc1e. Dec 15 05:17:43 localhost podman[340713]: 2025-12-15 10:17:43.747541301 +0000 UTC m=+0.081611081 container health_status a4e94bd7aaee6c174c601060fb5f34559019ccea93fc397263ee3735731cfc1e (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible) Dec 15 05:17:43 localhost podman[340713]: 2025-12-15 10:17:43.784446382 +0000 UTC m=+0.118516162 container exec_died a4e94bd7aaee6c174c601060fb5f34559019ccea93fc397263ee3735731cfc1e (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, config_id=podman_exporter, 
container_name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}) Dec 15 05:17:43 localhost systemd[1]: a4e94bd7aaee6c174c601060fb5f34559019ccea93fc397263ee3735731cfc1e.service: Deactivated successfully. Dec 15 05:17:45 localhost nova_compute[286344]: 2025-12-15 10:17:45.704 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:17:47 localhost ceph-mon[298913]: mon.np0005559462@0(leader).osd e281 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104 Dec 15 05:17:47 localhost nova_compute[286344]: 2025-12-15 10:17:47.426 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.127 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359', 'name': 'test', 'flavor': {'id': '2da0e147-aaa7-4bb9-a176-5fe1b15a32a0', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'image': {'id': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-00000002', 'OS-EXT-SRV-ATTR:host': 
'np0005559462.localdomain', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': 'c785bf23f53946bc99867d8832a50266', 'user_id': '1ba5fce347b64bfebf995f187193f205', 'hostId': 'bf4d507aa00b3fcf643a7e0b883420632ac7fc96e8c7a756982e121d', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228 Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.127 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets in the context of pollsters Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.131 12 DEBUG ceilometer.compute.pollsters [-] 39ff1bd9-6f6b-44c8-bbec-a1fd9d196359/network.outgoing.packets volume: 114 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.133 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': 'b3ab9d6b-17ba-4be2-9dd1-ad80273c20f1', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 114, 'user_id': '1ba5fce347b64bfebf995f187193f205', 'user_name': None, 'project_id': 'c785bf23f53946bc99867d8832a50266', 'project_name': None, 'resource_id': 'instance-00000002-39ff1bd9-6f6b-44c8-bbec-a1fd9d196359-tap03ef8889-32', 'timestamp': '2025-12-15T10:17:48.128099', 'resource_metadata': {'display_name': 'test', 'name': 'tap03ef8889-32', 'instance_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359', 'instance_type': 'm1.small', 'host': 'bf4d507aa00b3fcf643a7e0b883420632ac7fc96e8c7a756982e121d', 'instance_host': 'np0005559462.localdomain', 'flavor': {'id': '2da0e147-aaa7-4bb9-a176-5fe1b15a32a0', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e'}, 'image_ref': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:74:df:7c', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap03ef8889-32'}, 'message_id': '4e121b3c-d99f-11f0-817e-fa163ebaca0f', 'monotonic_time': 12934.320773539, 'message_signature': '19516afd29e098709d44638282ae016d887a0bc47b49a5da852197de4198bdf1'}]}, 'timestamp': '2025-12-15 10:17:48.132521', '_unique_id': '645d8202fef045e6b5d6baa0bb986b57'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.133 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 15 05:17:48 localhost 
ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.133 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.133 12 ERROR oslo_messaging.notify.messaging yield Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.133 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.133 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.133 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.133 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.133 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.133 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.133 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.133 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.133 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.133 12 ERROR oslo_messaging.notify.messaging conn.connect() Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.133 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.133 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.133 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.133 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.133 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.133 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.133 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.133 12 ERROR oslo_messaging.notify.messaging Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.133 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.133 12 ERROR oslo_messaging.notify.messaging Dec 15 05:17:48 localhost 
ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.133 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.133 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.133 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.133 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.133 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.133 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.133 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.133 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.133 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.133 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection 
Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.133 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.133 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.133 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.133 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.133 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.133 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.133 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.133 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.133 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.133 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Dec 15 05:17:48 
localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.133 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.133 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.133 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.133 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.133 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.133 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.133 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.133 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.133 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.133 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.133 12 ERROR oslo_messaging.notify.messaging Dec 15 05:17:48 localhost 
ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.135 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.delta in the context of pollsters Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.135 12 DEBUG ceilometer.compute.pollsters [-] 39ff1bd9-6f6b-44c8-bbec-a1fd9d196359/network.incoming.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.136 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'edfb084e-f212-4e13-97c2-4cf921fe5ea4', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '1ba5fce347b64bfebf995f187193f205', 'user_name': None, 'project_id': 'c785bf23f53946bc99867d8832a50266', 'project_name': None, 'resource_id': 'instance-00000002-39ff1bd9-6f6b-44c8-bbec-a1fd9d196359-tap03ef8889-32', 'timestamp': '2025-12-15T10:17:48.135373', 'resource_metadata': {'display_name': 'test', 'name': 'tap03ef8889-32', 'instance_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359', 'instance_type': 'm1.small', 'host': 'bf4d507aa00b3fcf643a7e0b883420632ac7fc96e8c7a756982e121d', 'instance_host': 'np0005559462.localdomain', 'flavor': {'id': '2da0e147-aaa7-4bb9-a176-5fe1b15a32a0', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e'}, 'image_ref': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:74:df:7c', 'fref': None, 
'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap03ef8889-32'}, 'message_id': '4e129f76-d99f-11f0-817e-fa163ebaca0f', 'monotonic_time': 12934.320773539, 'message_signature': 'a5b2dcbce6b322d55fd02c12e112f90321920f7c61663ffcf7b2575cea81ca4c'}]}, 'timestamp': '2025-12-15 10:17:48.135860', '_unique_id': 'fdc2ba213a1046bcbb2c27e1c76a8bc9'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.136 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.136 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.136 12 ERROR oslo_messaging.notify.messaging yield Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.136 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.136 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.136 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.136 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.136 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.136 12 ERROR 
oslo_messaging.notify.messaging self._connection = self._establish_connection() Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.136 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.136 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.136 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.136 12 ERROR oslo_messaging.notify.messaging conn.connect() Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.136 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.136 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.136 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.136 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.136 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.136 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 
2025-12-15 10:17:48.136 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.136 12 ERROR oslo_messaging.notify.messaging Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.136 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.136 12 ERROR oslo_messaging.notify.messaging Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.136 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.136 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.136 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.136 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.136 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.136 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.136 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Dec 15 05:17:48 localhost 
ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.136 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.136 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.136 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.136 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.136 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.136 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.136 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.136 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.136 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.136 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Dec 15 
05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.136 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.136 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.136 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.136 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.136 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.136 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.136 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.136 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.136 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.136 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.136 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.136 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.136 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.136 12 ERROR oslo_messaging.notify.messaging Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.137 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.bytes in the context of pollsters Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.167 12 DEBUG ceilometer.compute.pollsters [-] 39ff1bd9-6f6b-44c8-bbec-a1fd9d196359/disk.device.read.bytes volume: 35560448 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.167 12 DEBUG ceilometer.compute.pollsters [-] 39ff1bd9-6f6b-44c8-bbec-a1fd9d196359/disk.device.read.bytes volume: 2154496 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.169 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': '5fcdf859-902f-4441-826a-8cce07487e78', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 35560448, 'user_id': '1ba5fce347b64bfebf995f187193f205', 'user_name': None, 'project_id': 'c785bf23f53946bc99867d8832a50266', 'project_name': None, 'resource_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359-vda', 'timestamp': '2025-12-15T10:17:48.138150', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359', 'instance_type': 'm1.small', 'host': 'bf4d507aa00b3fcf643a7e0b883420632ac7fc96e8c7a756982e121d', 'instance_host': 'np0005559462.localdomain', 'flavor': {'id': '2da0e147-aaa7-4bb9-a176-5fe1b15a32a0', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e'}, 'image_ref': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '4e177622-d99f-11f0-817e-fa163ebaca0f', 'monotonic_time': 12934.330824222, 'message_signature': '286010760759d5638040cfb636e046dcc1276dbe3d1c93f7885bb593093d7470'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 2154496, 'user_id': '1ba5fce347b64bfebf995f187193f205', 'user_name': None, 'project_id': 'c785bf23f53946bc99867d8832a50266', 'project_name': None, 'resource_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359-vdb', 'timestamp': '2025-12-15T10:17:48.138150', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 
'39ff1bd9-6f6b-44c8-bbec-a1fd9d196359', 'instance_type': 'm1.small', 'host': 'bf4d507aa00b3fcf643a7e0b883420632ac7fc96e8c7a756982e121d', 'instance_host': 'np0005559462.localdomain', 'flavor': {'id': '2da0e147-aaa7-4bb9-a176-5fe1b15a32a0', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e'}, 'image_ref': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '4e178720-d99f-11f0-817e-fa163ebaca0f', 'monotonic_time': 12934.330824222, 'message_signature': '15c7911c60bfebca11c77e75c006cdecec177df1f0952e9e1e71ce7ed2d2e688'}]}, 'timestamp': '2025-12-15 10:17:48.167966', '_unique_id': 'd462978241724943a0a7bb9fd213d025'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.169 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.169 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.169 12 ERROR oslo_messaging.notify.messaging yield Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.169 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.169 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.169 12 ERROR 
oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.169 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.169 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.169 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.169 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.169 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.169 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.169 12 ERROR oslo_messaging.notify.messaging conn.connect() Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.169 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.169 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.169 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Dec 15 
05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.169 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.169 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.169 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.169 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.169 12 ERROR oslo_messaging.notify.messaging Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.169 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.169 12 ERROR oslo_messaging.notify.messaging Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.169 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.169 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.169 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.169 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Dec 15 05:17:48 localhost 
ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.169 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.169 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.169 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.169 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.169 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.169 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.169 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.169 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.169 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.169 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, 
in get Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.169 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.169 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.169 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.169 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.169 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.169 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.169 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.169 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.169 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.169 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 15 05:17:48 localhost 
ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.169 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.169 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.169 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.169 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.169 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.169 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.169 12 ERROR oslo_messaging.notify.messaging Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.170 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.delta in the context of pollsters Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.170 12 DEBUG ceilometer.compute.pollsters [-] 39ff1bd9-6f6b-44c8-bbec-a1fd9d196359/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.172 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': '4f3f9fad-c59f-4381-be1b-9f4223efcc0b', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '1ba5fce347b64bfebf995f187193f205', 'user_name': None, 'project_id': 'c785bf23f53946bc99867d8832a50266', 'project_name': None, 'resource_id': 'instance-00000002-39ff1bd9-6f6b-44c8-bbec-a1fd9d196359-tap03ef8889-32', 'timestamp': '2025-12-15T10:17:48.170670', 'resource_metadata': {'display_name': 'test', 'name': 'tap03ef8889-32', 'instance_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359', 'instance_type': 'm1.small', 'host': 'bf4d507aa00b3fcf643a7e0b883420632ac7fc96e8c7a756982e121d', 'instance_host': 'np0005559462.localdomain', 'flavor': {'id': '2da0e147-aaa7-4bb9-a176-5fe1b15a32a0', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e'}, 'image_ref': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:74:df:7c', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap03ef8889-32'}, 'message_id': '4e1801d2-d99f-11f0-817e-fa163ebaca0f', 'monotonic_time': 12934.320773539, 'message_signature': 'b1e5fa3be4f1ae0ddd8642c42660cb2d6664fc6534182537ed0ced26ffe107cb'}]}, 'timestamp': '2025-12-15 10:17:48.171179', '_unique_id': 'bd9d47d5ac544b6ca180c653dd73056a'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.172 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 
2025-12-15 10:17:48.172 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.172 12 ERROR oslo_messaging.notify.messaging yield Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.172 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.172 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.172 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.172 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.172 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.172 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.172 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.172 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.172 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.172 12 ERROR oslo_messaging.notify.messaging conn.connect() Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.172 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.172 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.172 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.172 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.172 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.172 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.172 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.172 12 ERROR oslo_messaging.notify.messaging Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.172 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.172 12 ERROR oslo_messaging.notify.messaging Dec 15 05:17:48 localhost 
ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.172 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.172 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.172 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.172 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.172 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.172 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.172 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.172 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.172 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.172 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection 
Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.172 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.172 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.172 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.172 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.172 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.172 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.172 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.172 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.172 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.172 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Dec 15 05:17:48 
localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.172 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.172 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.172 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.172 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.172 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.172 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.172 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.172 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.172 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.172 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.172 12 ERROR oslo_messaging.notify.messaging Dec 15 05:17:48 localhost 
ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.173 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no new resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193 Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.173 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes in the context of pollsters Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.173 12 DEBUG ceilometer.compute.pollsters [-] 39ff1bd9-6f6b-44c8-bbec-a1fd9d196359/network.incoming.bytes volume: 6809 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.174 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '4b2650d9-6d44-44ff-98ab-da6dee8ae67f', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 6809, 'user_id': '1ba5fce347b64bfebf995f187193f205', 'user_name': None, 'project_id': 'c785bf23f53946bc99867d8832a50266', 'project_name': None, 'resource_id': 'instance-00000002-39ff1bd9-6f6b-44c8-bbec-a1fd9d196359-tap03ef8889-32', 'timestamp': '2025-12-15T10:17:48.173535', 'resource_metadata': {'display_name': 'test', 'name': 'tap03ef8889-32', 'instance_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359', 'instance_type': 'm1.small', 'host': 'bf4d507aa00b3fcf643a7e0b883420632ac7fc96e8c7a756982e121d', 'instance_host': 'np0005559462.localdomain', 'flavor': {'id': '2da0e147-aaa7-4bb9-a176-5fe1b15a32a0', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 
'7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e'}, 'image_ref': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:74:df:7c', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap03ef8889-32'}, 'message_id': '4e1871da-d99f-11f0-817e-fa163ebaca0f', 'monotonic_time': 12934.320773539, 'message_signature': 'dbeeb1932267db1c916d8e573549769013554553595ea989f918ca2be227ffd1'}]}, 'timestamp': '2025-12-15 10:17:48.174047', '_unique_id': '3e5640aecc2d409abdbefed2c9d8bfb8'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.174 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.174 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.174 12 ERROR oslo_messaging.notify.messaging yield Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.174 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.174 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.174 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.174 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Dec 15 05:17:48 localhost 
ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.174 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.174 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.174 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.174 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.174 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.174 12 ERROR oslo_messaging.notify.messaging conn.connect() Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.174 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.174 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.174 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.174 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.174 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.174 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.174 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.174 12 ERROR oslo_messaging.notify.messaging Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.174 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.174 12 ERROR oslo_messaging.notify.messaging Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.174 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.174 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.174 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.174 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.174 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.174 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.174 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.174 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.174 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.174 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.174 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.174 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.174 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.174 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.174 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.174 12 ERROR oslo_messaging.notify.messaging 
File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.174 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.174 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.174 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.174 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.174 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.174 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.174 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.174 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.174 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.174 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in 
__exit__ Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.174 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.174 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.174 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.174 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.174 12 ERROR oslo_messaging.notify.messaging Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.176 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.allocation in the context of pollsters Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.186 12 DEBUG ceilometer.compute.pollsters [-] 39ff1bd9-6f6b-44c8-bbec-a1fd9d196359/disk.device.allocation volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.187 12 DEBUG ceilometer.compute.pollsters [-] 39ff1bd9-6f6b-44c8-bbec-a1fd9d196359/disk.device.allocation volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.188 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': '0e7288eb-25ac-43c3-a0d0-e264c559119e', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '1ba5fce347b64bfebf995f187193f205', 'user_name': None, 'project_id': 'c785bf23f53946bc99867d8832a50266', 'project_name': None, 'resource_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359-vda', 'timestamp': '2025-12-15T10:17:48.176269', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359', 'instance_type': 'm1.small', 'host': 'bf4d507aa00b3fcf643a7e0b883420632ac7fc96e8c7a756982e121d', 'instance_host': 'np0005559462.localdomain', 'flavor': {'id': '2da0e147-aaa7-4bb9-a176-5fe1b15a32a0', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e'}, 'image_ref': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '4e1a7732-d99f-11f0-817e-fa163ebaca0f', 'monotonic_time': 12934.36897238, 'message_signature': '7b4bff2e882eab7d723dfe7c3d1f638710f4bf71a415a46403b2d975f7fd2a3c'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '1ba5fce347b64bfebf995f187193f205', 'user_name': None, 'project_id': 'c785bf23f53946bc99867d8832a50266', 'project_name': None, 'resource_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359-vdb', 'timestamp': '2025-12-15T10:17:48.176269', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359', 
'instance_type': 'm1.small', 'host': 'bf4d507aa00b3fcf643a7e0b883420632ac7fc96e8c7a756982e121d', 'instance_host': 'np0005559462.localdomain', 'flavor': {'id': '2da0e147-aaa7-4bb9-a176-5fe1b15a32a0', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e'}, 'image_ref': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '4e1a8830-d99f-11f0-817e-fa163ebaca0f', 'monotonic_time': 12934.36897238, 'message_signature': 'e68ed9e222412366825b31e2dbf5612a91e451bc1e9813eb9ac9cfc77279c8d2'}]}, 'timestamp': '2025-12-15 10:17:48.187654', '_unique_id': '77aa9fba6f524580a4f9c7b096195bc7'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.188 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.188 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.188 12 ERROR oslo_messaging.notify.messaging yield Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.188 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.188 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.188 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.188 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.188 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.188 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.188 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.188 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.188 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.188 12 ERROR oslo_messaging.notify.messaging conn.connect() Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.188 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.188 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.188 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Dec 15 05:17:48 localhost 
ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.188 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.188 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.188 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.188 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.188 12 ERROR oslo_messaging.notify.messaging Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.188 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.188 12 ERROR oslo_messaging.notify.messaging Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.188 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.188 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.188 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.188 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 
10:17:48.188 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.188 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.188 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.188 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.188 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.188 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.188 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.188 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.188 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.188 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Dec 15 05:17:48 localhost 
ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.188 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.188 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.188 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.188 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.188 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.188 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.188 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.188 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.188 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.188 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 
10:17:48.188 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.188 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.188 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.188 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.188 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.188 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.188 12 ERROR oslo_messaging.notify.messaging Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.189 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.latency in the context of pollsters Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.189 12 DEBUG ceilometer.compute.pollsters [-] 39ff1bd9-6f6b-44c8-bbec-a1fd9d196359/disk.device.read.latency volume: 1342134926 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.190 12 DEBUG ceilometer.compute.pollsters [-] 39ff1bd9-6f6b-44c8-bbec-a1fd9d196359/disk.device.read.latency volume: 123356132 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 
2025-12-15 10:17:48.191 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'fe7fbd89-8788-4635-b03d-c4f394944477', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 1342134926, 'user_id': '1ba5fce347b64bfebf995f187193f205', 'user_name': None, 'project_id': 'c785bf23f53946bc99867d8832a50266', 'project_name': None, 'resource_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359-vda', 'timestamp': '2025-12-15T10:17:48.189934', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359', 'instance_type': 'm1.small', 'host': 'bf4d507aa00b3fcf643a7e0b883420632ac7fc96e8c7a756982e121d', 'instance_host': 'np0005559462.localdomain', 'flavor': {'id': '2da0e147-aaa7-4bb9-a176-5fe1b15a32a0', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e'}, 'image_ref': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '4e1af3c4-d99f-11f0-817e-fa163ebaca0f', 'monotonic_time': 12934.330824222, 'message_signature': '5fbcda12501a4f572c641049218ac64b7896d1f4e552ca42e5729c4058ca8146'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 123356132, 'user_id': '1ba5fce347b64bfebf995f187193f205', 'user_name': None, 'project_id': 'c785bf23f53946bc99867d8832a50266', 'project_name': None, 'resource_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359-vdb', 'timestamp': '2025-12-15T10:17:48.189934', 
'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359', 'instance_type': 'm1.small', 'host': 'bf4d507aa00b3fcf643a7e0b883420632ac7fc96e8c7a756982e121d', 'instance_host': 'np0005559462.localdomain', 'flavor': {'id': '2da0e147-aaa7-4bb9-a176-5fe1b15a32a0', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e'}, 'image_ref': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '4e1b03fa-d99f-11f0-817e-fa163ebaca0f', 'monotonic_time': 12934.330824222, 'message_signature': 'bddaaf211aca674edf72ed773063cb6f0ac71920cbdb583a09fc07ad11d95309'}]}, 'timestamp': '2025-12-15 10:17:48.190823', '_unique_id': 'c4335f546eb445fa9f3f65a364aa2269'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.191 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.191 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.191 12 ERROR oslo_messaging.notify.messaging yield Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.191 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.191 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 15 05:17:48 localhost 
ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.191 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.191 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.191 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.191 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.191 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.191 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.191 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.191 12 ERROR oslo_messaging.notify.messaging conn.connect() Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.191 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.191 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.191 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.191 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.191 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.191 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.191 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.191 12 ERROR oslo_messaging.notify.messaging Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.191 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.191 12 ERROR oslo_messaging.notify.messaging Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.191 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.191 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.191 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.191 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 
134, in _send_notification Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.191 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.191 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.191 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.191 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.191 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.191 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.191 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.191 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.191 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.191 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.191 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.191 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.191 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.191 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.191 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.191 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.191 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.191 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.191 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.191 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", 
line 433, in _ensure_connection Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.191 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.191 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.191 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.191 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.191 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.191 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.191 12 ERROR oslo_messaging.notify.messaging Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.192 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no new resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193 Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.193 12 INFO ceilometer.polling.manager [-] Polling pollster cpu in the context of pollsters Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.241 12 DEBUG ceilometer.compute.pollsters [-] 39ff1bd9-6f6b-44c8-bbec-a1fd9d196359/cpu volume: 18870000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 15 05:17:48 
localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.243 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '568b993f-9689-4464-b42d-d7c5e43a05d8', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 18870000000, 'user_id': '1ba5fce347b64bfebf995f187193f205', 'user_name': None, 'project_id': 'c785bf23f53946bc99867d8832a50266', 'project_name': None, 'resource_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359', 'timestamp': '2025-12-15T10:17:48.193232', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359', 'instance_type': 'm1.small', 'host': 'bf4d507aa00b3fcf643a7e0b883420632ac7fc96e8c7a756982e121d', 'instance_host': 'np0005559462.localdomain', 'flavor': {'id': '2da0e147-aaa7-4bb9-a176-5fe1b15a32a0', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e'}, 'image_ref': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'cpu_number': 1}, 'message_id': '4e22e75a-d99f-11f0-817e-fa163ebaca0f', 'monotonic_time': 12934.434393736, 'message_signature': 'd9e280185586b65681645ba5689cede50f25bf32b8333a384ce205dc07521898'}]}, 'timestamp': '2025-12-15 10:17:48.242594', '_unique_id': 'c5ce334b3be84af68fe90997f8813e46'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.243 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 15 05:17:48 localhost 
ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.243 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.243 12 ERROR oslo_messaging.notify.messaging yield Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.243 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.243 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.243 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.243 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.243 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.243 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.243 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.243 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.243 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.243 12 ERROR oslo_messaging.notify.messaging conn.connect() Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.243 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.243 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.243 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.243 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.243 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.243 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.243 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.243 12 ERROR oslo_messaging.notify.messaging Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.243 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.243 12 ERROR oslo_messaging.notify.messaging Dec 15 05:17:48 localhost 
ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.243 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.243 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.243 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.243 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.243 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.243 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.243 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.243 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.243 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.243 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection 
Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.243 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.243 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.243 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.243 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.243 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.243 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.243 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.243 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.243 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.243 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Dec 15 05:17:48 
localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.243 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.243 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.243 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.243 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.243 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.243 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.243 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.243 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.243 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.243 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.243 12 ERROR oslo_messaging.notify.messaging Dec 15 05:17:48 localhost 
ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.245 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.latency in the context of pollsters Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.245 12 DEBUG ceilometer.compute.pollsters [-] 39ff1bd9-6f6b-44c8-bbec-a1fd9d196359/disk.device.write.latency volume: 1243487016 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.245 12 DEBUG ceilometer.compute.pollsters [-] 39ff1bd9-6f6b-44c8-bbec-a1fd9d196359/disk.device.write.latency volume: 24779175 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.247 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'f01c20a9-403c-4a25-92b3-f9cdc94d7977', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 1243487016, 'user_id': '1ba5fce347b64bfebf995f187193f205', 'user_name': None, 'project_id': 'c785bf23f53946bc99867d8832a50266', 'project_name': None, 'resource_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359-vda', 'timestamp': '2025-12-15T10:17:48.245351', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359', 'instance_type': 'm1.small', 'host': 'bf4d507aa00b3fcf643a7e0b883420632ac7fc96e8c7a756982e121d', 'instance_host': 'np0005559462.localdomain', 'flavor': {'id': '2da0e147-aaa7-4bb9-a176-5fe1b15a32a0', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': 
{'id': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e'}, 'image_ref': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '4e236a40-d99f-11f0-817e-fa163ebaca0f', 'monotonic_time': 12934.330824222, 'message_signature': '2fe542b1999fc3803692daf8c9319f920999db25e22c0ab88ac9aa26d567c941'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 24779175, 'user_id': '1ba5fce347b64bfebf995f187193f205', 'user_name': None, 'project_id': 'c785bf23f53946bc99867d8832a50266', 'project_name': None, 'resource_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359-vdb', 'timestamp': '2025-12-15T10:17:48.245351', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359', 'instance_type': 'm1.small', 'host': 'bf4d507aa00b3fcf643a7e0b883420632ac7fc96e8c7a756982e121d', 'instance_host': 'np0005559462.localdomain', 'flavor': {'id': '2da0e147-aaa7-4bb9-a176-5fe1b15a32a0', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e'}, 'image_ref': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '4e238048-d99f-11f0-817e-fa163ebaca0f', 'monotonic_time': 12934.330824222, 'message_signature': '439eb580275085670b986c5df4e927a52423eb02208e99a2046bdf08526638e3'}]}, 'timestamp': '2025-12-15 10:17:48.246453', '_unique_id': '67f2e5db00094c6685de411aefbe3dc6'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 
10:17:48.247 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.247 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.247 12 ERROR oslo_messaging.notify.messaging yield Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.247 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.247 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.247 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.247 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.247 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.247 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.247 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.247 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Dec 15 05:17:48 localhost 
ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.247 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.247 12 ERROR oslo_messaging.notify.messaging conn.connect() Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.247 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.247 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.247 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.247 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.247 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.247 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.247 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.247 12 ERROR oslo_messaging.notify.messaging Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.247 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 
2025-12-15 10:17:48.247 12 ERROR oslo_messaging.notify.messaging Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.247 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.247 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.247 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.247 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.247 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.247 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.247 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.247 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.247 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.247 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.247 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.247 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.247 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.247 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.247 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.247 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.247 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.247 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.247 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.247 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.247 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.247 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.247 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.247 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.247 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.247 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.247 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.247 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.247 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.247 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 15 05:17:48 localhost 
ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.247 12 ERROR oslo_messaging.notify.messaging Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.248 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no new resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193 Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.248 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.drop in the context of pollsters Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.249 12 DEBUG ceilometer.compute.pollsters [-] 39ff1bd9-6f6b-44c8-bbec-a1fd9d196359/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.250 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': 'b4bccfe0-d86e-4f68-8c2b-2c03d6984d90', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '1ba5fce347b64bfebf995f187193f205', 'user_name': None, 'project_id': 'c785bf23f53946bc99867d8832a50266', 'project_name': None, 'resource_id': 'instance-00000002-39ff1bd9-6f6b-44c8-bbec-a1fd9d196359-tap03ef8889-32', 'timestamp': '2025-12-15T10:17:48.249127', 'resource_metadata': {'display_name': 'test', 'name': 'tap03ef8889-32', 'instance_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359', 'instance_type': 'm1.small', 'host': 'bf4d507aa00b3fcf643a7e0b883420632ac7fc96e8c7a756982e121d', 'instance_host': 'np0005559462.localdomain', 'flavor': {'id': '2da0e147-aaa7-4bb9-a176-5fe1b15a32a0', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e'}, 'image_ref': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:74:df:7c', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap03ef8889-32'}, 'message_id': '4e23fea6-d99f-11f0-817e-fa163ebaca0f', 'monotonic_time': 12934.320773539, 'message_signature': 'b43020d1ccce4fa8ca25f397f40f4a2a9102655dad763bae51c1525494d8174b'}]}, 'timestamp': '2025-12-15 10:17:48.249705', '_unique_id': '45c01068727646198cf9ae7ad3d316f1'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.250 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 15 05:17:48 localhost 
ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.250 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.250 12 ERROR oslo_messaging.notify.messaging yield Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.250 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.250 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.250 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.250 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.250 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.250 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.250 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.250 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.250 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.250 12 ERROR oslo_messaging.notify.messaging conn.connect() Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.250 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.250 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.250 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.250 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.250 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.250 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.250 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.250 12 ERROR oslo_messaging.notify.messaging Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.250 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.250 12 ERROR oslo_messaging.notify.messaging Dec 15 05:17:48 localhost 
ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.250 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.250 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.250 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.250 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.250 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.250 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.250 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.250 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.250 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.250 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection 
Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.250 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.250 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.250 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.250 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.250 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.250 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.250 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.250 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.250 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.250 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Dec 15 05:17:48 
localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.250 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.250 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.250 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.250 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.250 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.250 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.250 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.250 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.250 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.250 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.250 12 ERROR oslo_messaging.notify.messaging Dec 15 05:17:48 localhost 
ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.251 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.requests in the context of pollsters Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.252 12 DEBUG ceilometer.compute.pollsters [-] 39ff1bd9-6f6b-44c8-bbec-a1fd9d196359/disk.device.read.requests volume: 1272 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.252 12 DEBUG ceilometer.compute.pollsters [-] 39ff1bd9-6f6b-44c8-bbec-a1fd9d196359/disk.device.read.requests volume: 124 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.254 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'c26cbb27-5abf-43f0-b5b8-c40a5861a7e3', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1272, 'user_id': '1ba5fce347b64bfebf995f187193f205', 'user_name': None, 'project_id': 'c785bf23f53946bc99867d8832a50266', 'project_name': None, 'resource_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359-vda', 'timestamp': '2025-12-15T10:17:48.251931', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359', 'instance_type': 'm1.small', 'host': 'bf4d507aa00b3fcf643a7e0b883420632ac7fc96e8c7a756982e121d', 'instance_host': 'np0005559462.localdomain', 'flavor': {'id': '2da0e147-aaa7-4bb9-a176-5fe1b15a32a0', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 
'7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e'}, 'image_ref': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '4e246d1e-d99f-11f0-817e-fa163ebaca0f', 'monotonic_time': 12934.330824222, 'message_signature': '4eb6a0724d2f2aab834315d079d0d796367c68973ad8b5a5e6fae2abeb14be43'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 124, 'user_id': '1ba5fce347b64bfebf995f187193f205', 'user_name': None, 'project_id': 'c785bf23f53946bc99867d8832a50266', 'project_name': None, 'resource_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359-vdb', 'timestamp': '2025-12-15T10:17:48.251931', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359', 'instance_type': 'm1.small', 'host': 'bf4d507aa00b3fcf643a7e0b883420632ac7fc96e8c7a756982e121d', 'instance_host': 'np0005559462.localdomain', 'flavor': {'id': '2da0e147-aaa7-4bb9-a176-5fe1b15a32a0', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e'}, 'image_ref': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '4e247fca-d99f-11f0-817e-fa163ebaca0f', 'monotonic_time': 12934.330824222, 'message_signature': '97e4b03ec2ec51fa5cae138c9ba2ca6d122f9d0958b62cacd88969fee392b5d9'}]}, 'timestamp': '2025-12-15 10:17:48.253083', '_unique_id': 'e62091865da64eaa9566097ae5738104'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.254 12 
ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.254 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.254 12 ERROR oslo_messaging.notify.messaging yield Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.254 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.254 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.254 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.254 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.254 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.254 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.254 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.254 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Dec 15 05:17:48 localhost 
ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.254 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.254 12 ERROR oslo_messaging.notify.messaging conn.connect() Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.254 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.254 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.254 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.254 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.254 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.254 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.254 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.254 12 ERROR oslo_messaging.notify.messaging Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.254 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 
2025-12-15 10:17:48.254 12 ERROR oslo_messaging.notify.messaging Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.254 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.254 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.254 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.254 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.254 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.254 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.254 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.254 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.254 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.254 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.254 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.254 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.254 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.254 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.254 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.254 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.254 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.254 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.254 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.254 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.254 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.254 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.254 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.254 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.254 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.254 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.254 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.254 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.254 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.254 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 15 05:17:48 localhost 
ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.254 12 ERROR oslo_messaging.notify.messaging Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.255 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no new resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193 Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.255 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.error in the context of pollsters Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.255 12 DEBUG ceilometer.compute.pollsters [-] 39ff1bd9-6f6b-44c8-bbec-a1fd9d196359/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.257 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': 'e5a569cc-b54a-40cd-b4e2-f08a218ff79b', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '1ba5fce347b64bfebf995f187193f205', 'user_name': None, 'project_id': 'c785bf23f53946bc99867d8832a50266', 'project_name': None, 'resource_id': 'instance-00000002-39ff1bd9-6f6b-44c8-bbec-a1fd9d196359-tap03ef8889-32', 'timestamp': '2025-12-15T10:17:48.255641', 'resource_metadata': {'display_name': 'test', 'name': 'tap03ef8889-32', 'instance_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359', 'instance_type': 'm1.small', 'host': 'bf4d507aa00b3fcf643a7e0b883420632ac7fc96e8c7a756982e121d', 'instance_host': 'np0005559462.localdomain', 'flavor': {'id': '2da0e147-aaa7-4bb9-a176-5fe1b15a32a0', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e'}, 'image_ref': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:74:df:7c', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap03ef8889-32'}, 'message_id': '4e24fcac-d99f-11f0-817e-fa163ebaca0f', 'monotonic_time': 12934.320773539, 'message_signature': 'dc257cc0ccb819cf4921c9baa0c3ebb64882932170c849521f2ff8be765b093f'}]}, 'timestamp': '2025-12-15 10:17:48.256264', '_unique_id': '333e83fff442472aa08ec73b2de7f095'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.257 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 15 05:17:48 localhost 
ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.257 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.257 12 ERROR oslo_messaging.notify.messaging yield Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.257 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.257 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.257 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.257 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.257 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.257 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.257 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.257 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.257 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.257 12 ERROR oslo_messaging.notify.messaging conn.connect() Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.257 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.257 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.257 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.257 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.257 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.257 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.257 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.257 12 ERROR oslo_messaging.notify.messaging Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.257 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.257 12 ERROR oslo_messaging.notify.messaging Dec 15 05:17:48 localhost 
ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.257 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.257 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.257 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.257 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.257 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.257 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.257 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.257 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.257 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.257 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection 
Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.257 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.257 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.257 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.257 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.257 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.257 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.257 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.257 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.257 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.257 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Dec 15 05:17:48 
localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.257 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.257 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.257 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.257 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.257 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.257 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.257 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.257 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.257 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.257 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.257 12 ERROR oslo_messaging.notify.messaging Dec 15 05:17:48 localhost 
ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.258 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.error in the context of pollsters Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.258 12 DEBUG ceilometer.compute.pollsters [-] 39ff1bd9-6f6b-44c8-bbec-a1fd9d196359/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.260 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '76c970f1-f7b6-4c99-9fff-3a65f0ea26af', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '1ba5fce347b64bfebf995f187193f205', 'user_name': None, 'project_id': 'c785bf23f53946bc99867d8832a50266', 'project_name': None, 'resource_id': 'instance-00000002-39ff1bd9-6f6b-44c8-bbec-a1fd9d196359-tap03ef8889-32', 'timestamp': '2025-12-15T10:17:48.258515', 'resource_metadata': {'display_name': 'test', 'name': 'tap03ef8889-32', 'instance_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359', 'instance_type': 'm1.small', 'host': 'bf4d507aa00b3fcf643a7e0b883420632ac7fc96e8c7a756982e121d', 'instance_host': 'np0005559462.localdomain', 'flavor': {'id': '2da0e147-aaa7-4bb9-a176-5fe1b15a32a0', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e'}, 'image_ref': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:74:df:7c', 
'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap03ef8889-32'}, 'message_id': '4e256c78-d99f-11f0-817e-fa163ebaca0f', 'monotonic_time': 12934.320773539, 'message_signature': '9adc4cdfce8900cef862df0452914f8a4586791594774e08ba5ee0ef3fbbbe8e'}]}, 'timestamp': '2025-12-15 10:17:48.259150', '_unique_id': 'dd5eaa8e5d11407e805d668827308229'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.260 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.260 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.260 12 ERROR oslo_messaging.notify.messaging yield Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.260 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.260 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.260 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.260 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.260 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 
10:17:48.260 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.260 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.260 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.260 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.260 12 ERROR oslo_messaging.notify.messaging conn.connect() Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.260 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.260 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.260 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.260 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.260 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.260 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Dec 15 05:17:48 localhost 
ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.260 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.260 12 ERROR oslo_messaging.notify.messaging Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.260 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.260 12 ERROR oslo_messaging.notify.messaging Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.260 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.260 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.260 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.260 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.260 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.260 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.260 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Dec 15 05:17:48 
localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.260 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.260 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.260 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.260 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.260 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.260 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.260 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.260 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.260 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.260 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) 
Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.260 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.260 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.260 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.260 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.260 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.260 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.260 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.260 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.260 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.260 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.260 12 ERROR 
oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.260 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.260 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.260 12 ERROR oslo_messaging.notify.messaging Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.261 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.bytes in the context of pollsters Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.261 12 DEBUG ceilometer.compute.pollsters [-] 39ff1bd9-6f6b-44c8-bbec-a1fd9d196359/disk.device.write.bytes volume: 389120 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.261 12 DEBUG ceilometer.compute.pollsters [-] 39ff1bd9-6f6b-44c8-bbec-a1fd9d196359/disk.device.write.bytes volume: 512 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.263 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': '218e04d9-0526-467a-9456-85cb49f65034', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 389120, 'user_id': '1ba5fce347b64bfebf995f187193f205', 'user_name': None, 'project_id': 'c785bf23f53946bc99867d8832a50266', 'project_name': None, 'resource_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359-vda', 'timestamp': '2025-12-15T10:17:48.261394', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359', 'instance_type': 'm1.small', 'host': 'bf4d507aa00b3fcf643a7e0b883420632ac7fc96e8c7a756982e121d', 'instance_host': 'np0005559462.localdomain', 'flavor': {'id': '2da0e147-aaa7-4bb9-a176-5fe1b15a32a0', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e'}, 'image_ref': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '4e25dcd0-d99f-11f0-817e-fa163ebaca0f', 'monotonic_time': 12934.330824222, 'message_signature': 'b54a55257e9305b22f15311ccd9a4760f7a953cc7d08b040b744eb9f4b44f142'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 512, 'user_id': '1ba5fce347b64bfebf995f187193f205', 'user_name': None, 'project_id': 'c785bf23f53946bc99867d8832a50266', 'project_name': None, 'resource_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359-vdb', 'timestamp': '2025-12-15T10:17:48.261394', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 
'39ff1bd9-6f6b-44c8-bbec-a1fd9d196359', 'instance_type': 'm1.small', 'host': 'bf4d507aa00b3fcf643a7e0b883420632ac7fc96e8c7a756982e121d', 'instance_host': 'np0005559462.localdomain', 'flavor': {'id': '2da0e147-aaa7-4bb9-a176-5fe1b15a32a0', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e'}, 'image_ref': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '4e25f1b6-d99f-11f0-817e-fa163ebaca0f', 'monotonic_time': 12934.330824222, 'message_signature': '7f65fd6f52d2072f2c5acae38cbb1de3086eb587f78c769cd272da11bbfb0997'}]}, 'timestamp': '2025-12-15 10:17:48.262461', '_unique_id': '5aeb477ac29e4828be0bd1359d88f041'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.263 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.263 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.263 12 ERROR oslo_messaging.notify.messaging yield Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.263 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.263 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.263 12 ERROR 
oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.263 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.263 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.263 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.263 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.263 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.263 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.263 12 ERROR oslo_messaging.notify.messaging conn.connect() Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.263 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.263 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.263 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Dec 15 
05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.263 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.263 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.263 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.263 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.263 12 ERROR oslo_messaging.notify.messaging Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.263 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.263 12 ERROR oslo_messaging.notify.messaging Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.263 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.263 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.263 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.263 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Dec 15 05:17:48 localhost 
ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.263 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.263 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.263 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.263 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.263 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.263 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.263 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.263 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.263 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.263 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, 
in get Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.263 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.263 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.263 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.263 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.263 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.263 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.263 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.263 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.263 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.263 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 15 05:17:48 localhost 
ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.263 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.263 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.263 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.263 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.263 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.263 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.263 12 ERROR oslo_messaging.notify.messaging
Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.264 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.usage in the context of pollsters
Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.264 12 DEBUG ceilometer.compute.pollsters [-] 39ff1bd9-6f6b-44c8-bbec-a1fd9d196359/disk.device.usage volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.265 12 DEBUG ceilometer.compute.pollsters [-] 39ff1bd9-6f6b-44c8-bbec-a1fd9d196359/disk.device.usage volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.266 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'fb450910-fdea-4cea-8814-4e3992a9a78b', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '1ba5fce347b64bfebf995f187193f205', 'user_name': None, 'project_id': 'c785bf23f53946bc99867d8832a50266', 'project_name': None, 'resource_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359-vda', 'timestamp': '2025-12-15T10:17:48.264737', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359', 'instance_type': 'm1.small', 'host': 'bf4d507aa00b3fcf643a7e0b883420632ac7fc96e8c7a756982e121d', 'instance_host': 'np0005559462.localdomain', 'flavor': {'id': '2da0e147-aaa7-4bb9-a176-5fe1b15a32a0', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e'}, 'image_ref': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '4e2661dc-d99f-11f0-817e-fa163ebaca0f', 'monotonic_time': 12934.36897238, 'message_signature': '293d50cec363ad4122cada4d466e118a5c0119393a3bd69467986e11de1085a6'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '1ba5fce347b64bfebf995f187193f205', 'user_name': None, 'project_id': 'c785bf23f53946bc99867d8832a50266', 'project_name': None, 'resource_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359-vdb', 'timestamp': '2025-12-15T10:17:48.264737', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359', 'instance_type': 'm1.small', 'host': 'bf4d507aa00b3fcf643a7e0b883420632ac7fc96e8c7a756982e121d', 'instance_host': 'np0005559462.localdomain', 'flavor': {'id': '2da0e147-aaa7-4bb9-a176-5fe1b15a32a0', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e'}, 'image_ref': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '4e267528-d99f-11f0-817e-fa163ebaca0f', 'monotonic_time': 12934.36897238, 'message_signature': '840934bb42facc5c6f47112f6588ccfbc9fd1c46d910f286bbdc34a104bc91b3'}]}, 'timestamp': '2025-12-15 10:17:48.265822', '_unique_id': '20cf1c87c0f4445ba2790aad734a7159'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.266 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.266 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.266 12 ERROR oslo_messaging.notify.messaging yield
Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.266 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.266 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.266 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.266 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.266 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.266 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.266 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.266 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.266 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.266 12 ERROR oslo_messaging.notify.messaging conn.connect()
Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.266 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.266 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.266 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.266 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.266 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.266 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.266 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.266 12 ERROR oslo_messaging.notify.messaging
Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.266 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.266 12 ERROR oslo_messaging.notify.messaging
Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.266 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.266 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.266 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.266 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.266 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.266 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.266 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.266 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.266 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.266 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.266 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.266 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.266 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.266 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.266 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.266 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.266 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.266 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.266 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.266 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.266 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.266 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.266 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.266 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.266 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.266 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.266 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.266 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.266 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.266 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.266 12 ERROR oslo_messaging.notify.messaging
Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.267 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes in the context of pollsters
Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.268 12 DEBUG ceilometer.compute.pollsters [-] 39ff1bd9-6f6b-44c8-bbec-a1fd9d196359/network.outgoing.bytes volume: 9770 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.269 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'c4bbb64d-8547-4a63-b6cc-4735553f0720', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 9770, 'user_id': '1ba5fce347b64bfebf995f187193f205', 'user_name': None, 'project_id': 'c785bf23f53946bc99867d8832a50266', 'project_name': None, 'resource_id': 'instance-00000002-39ff1bd9-6f6b-44c8-bbec-a1fd9d196359-tap03ef8889-32', 'timestamp': '2025-12-15T10:17:48.268131', 'resource_metadata': {'display_name': 'test', 'name': 'tap03ef8889-32', 'instance_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359', 'instance_type': 'm1.small', 'host': 'bf4d507aa00b3fcf643a7e0b883420632ac7fc96e8c7a756982e121d', 'instance_host': 'np0005559462.localdomain', 'flavor': {'id': '2da0e147-aaa7-4bb9-a176-5fe1b15a32a0', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e'}, 'image_ref': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:74:df:7c', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap03ef8889-32'}, 'message_id': '4e26e45e-d99f-11f0-817e-fa163ebaca0f', 'monotonic_time': 12934.320773539, 'message_signature': '5cf614fa05c1cea026bbafec6253f6c74d45200a61aa155a7680a1f5b1df3ac3'}]}, 'timestamp': '2025-12-15 10:17:48.268711', '_unique_id': 'ba5ea1d94ffd42ffa4f6e81c5fdb516a'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.269 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.269 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.269 12 ERROR oslo_messaging.notify.messaging yield
Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.269 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.269 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.269 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.269 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.269 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.269 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.269 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.269 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.269 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.269 12 ERROR oslo_messaging.notify.messaging conn.connect()
Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.269 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.269 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.269 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.269 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.269 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.269 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.269 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.269 12 ERROR oslo_messaging.notify.messaging
Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.269 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.269 12 ERROR oslo_messaging.notify.messaging
Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.269 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.269 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.269 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.269 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.269 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.269 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.269 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.269 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.269 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.269 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.269 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.269 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.269 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.269 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.269 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.269 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.269 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.269 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.269 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.269 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.269 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.269 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.269 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.269 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.269 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.269 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.269 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.269 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.269 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.269 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.269 12 ERROR oslo_messaging.notify.messaging
Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.270 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.drop in the context of pollsters
Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.271 12 DEBUG ceilometer.compute.pollsters [-] 39ff1bd9-6f6b-44c8-bbec-a1fd9d196359/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.272 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '6f26bb9f-36e6-4c80-b49e-68c84cbcaca7', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '1ba5fce347b64bfebf995f187193f205', 'user_name': None, 'project_id': 'c785bf23f53946bc99867d8832a50266', 'project_name': None, 'resource_id': 'instance-00000002-39ff1bd9-6f6b-44c8-bbec-a1fd9d196359-tap03ef8889-32', 'timestamp': '2025-12-15T10:17:48.270934', 'resource_metadata': {'display_name': 'test', 'name': 'tap03ef8889-32', 'instance_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359', 'instance_type': 'm1.small', 'host': 'bf4d507aa00b3fcf643a7e0b883420632ac7fc96e8c7a756982e121d', 'instance_host': 'np0005559462.localdomain', 'flavor': {'id': '2da0e147-aaa7-4bb9-a176-5fe1b15a32a0', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e'}, 'image_ref': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:74:df:7c', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap03ef8889-32'}, 'message_id': '4e2753b2-d99f-11f0-817e-fa163ebaca0f', 'monotonic_time': 12934.320773539, 'message_signature': 'e7b00e868fd8ba476294fc722903e9d02b2fe1e937298a17ce0e26435161f22a'}]}, 'timestamp': '2025-12-15 10:17:48.271556', '_unique_id': 'adddd8de4635431d943ca7fc9253f0ce'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.272 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.272 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.272 12 ERROR oslo_messaging.notify.messaging yield
Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.272 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.272 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.272 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.272 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs)
Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.272 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.272 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection()
Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.272 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.272 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection()
Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.272 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.272 12 ERROR oslo_messaging.notify.messaging conn.connect()
Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.272 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.272 12 ERROR oslo_messaging.notify.messaging self.transport.connect()
Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.272 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.272 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout)
Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.272 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.272 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa)
Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.272 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.272 12 ERROR oslo_messaging.notify.messaging
Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.272 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.272 12 ERROR oslo_messaging.notify.messaging
Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.272 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.272 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.272 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message,
Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.272 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.272 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version,
Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.272 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.272 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message,
Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.272 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.272 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.272 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.272 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool,
Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.272 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.272 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry)
Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.272 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.272 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry)
Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.272 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.272 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.272 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.272 12 ERROR oslo_messaging.notify.messaging self.ensure_connection()
Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.272 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.272 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection(
Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.272 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.272 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs)
Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.272 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.272 12 ERROR oslo_messaging.notify.messaging return retry_over_time(
Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.272 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.272 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback)
Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.272 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.272 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc
Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.272 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.272 12 ERROR oslo_messaging.notify.messaging
Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.273 12 INFO ceilometer.polling.manager [-] Polling pollster memory.usage in the context of pollsters
Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.273 12 DEBUG ceilometer.compute.pollsters [-] 39ff1bd9-6f6b-44c8-bbec-a1fd9d196359/memory.usage volume: 51.73828125 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.275 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications.
Payload={'message_id': '772225b3-6454-48f5-8ca2-4866fa4daaba', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'memory.usage', 'counter_type': 'gauge', 'counter_unit': 'MB', 'counter_volume': 51.73828125, 'user_id': '1ba5fce347b64bfebf995f187193f205', 'user_name': None, 'project_id': 'c785bf23f53946bc99867d8832a50266', 'project_name': None, 'resource_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359', 'timestamp': '2025-12-15T10:17:48.273810', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359', 'instance_type': 'm1.small', 'host': 'bf4d507aa00b3fcf643a7e0b883420632ac7fc96e8c7a756982e121d', 'instance_host': 'np0005559462.localdomain', 'flavor': {'id': '2da0e147-aaa7-4bb9-a176-5fe1b15a32a0', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e'}, 'image_ref': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0}, 'message_id': '4e27c342-d99f-11f0-817e-fa163ebaca0f', 'monotonic_time': 12934.434393736, 'message_signature': '20f8592a7b272d0a0016fa311919c9ba8e55d9978f18e0fdb4fcf18c56e37c75'}]}, 'timestamp': '2025-12-15 10:17:48.274409', '_unique_id': 'c65ed60cefbf462fb6401b975997b7ee'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.275 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.275 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in 
_reraise_as_library_errors Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.275 12 ERROR oslo_messaging.notify.messaging yield Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.275 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.275 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.275 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.275 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.275 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.275 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.275 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.275 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.275 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.275 12 
ERROR oslo_messaging.notify.messaging conn.connect() Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.275 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.275 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.275 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.275 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.275 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.275 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.275 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.275 12 ERROR oslo_messaging.notify.messaging Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.275 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.275 12 ERROR oslo_messaging.notify.messaging Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.275 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 
2025-12-15 10:17:48.275 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.275 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.275 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.275 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.275 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.275 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.275 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.275 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.275 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.275 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Dec 15 
05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.275 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.275 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.275 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.275 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.275 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.275 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.275 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.275 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.275 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.275 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Dec 15 05:17:48 localhost 
ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.275 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.275 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.275 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.275 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.275 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.275 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.275 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.275 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.275 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.275 12 ERROR oslo_messaging.notify.messaging Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.276 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets in the context of pollsters Dec 15 05:17:48 
localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.276 12 DEBUG ceilometer.compute.pollsters [-] 39ff1bd9-6f6b-44c8-bbec-a1fd9d196359/network.incoming.packets volume: 60 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.278 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '7f81f515-5e80-46a3-a1a1-f7ca36583b6d', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 60, 'user_id': '1ba5fce347b64bfebf995f187193f205', 'user_name': None, 'project_id': 'c785bf23f53946bc99867d8832a50266', 'project_name': None, 'resource_id': 'instance-00000002-39ff1bd9-6f6b-44c8-bbec-a1fd9d196359-tap03ef8889-32', 'timestamp': '2025-12-15T10:17:48.276627', 'resource_metadata': {'display_name': 'test', 'name': 'tap03ef8889-32', 'instance_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359', 'instance_type': 'm1.small', 'host': 'bf4d507aa00b3fcf643a7e0b883420632ac7fc96e8c7a756982e121d', 'instance_host': 'np0005559462.localdomain', 'flavor': {'id': '2da0e147-aaa7-4bb9-a176-5fe1b15a32a0', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e'}, 'image_ref': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:74:df:7c', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap03ef8889-32'}, 'message_id': '4e282ff8-d99f-11f0-817e-fa163ebaca0f', 'monotonic_time': 12934.320773539, 
'message_signature': 'cf5e9d54658b74e430f2141d9d95fa0a10b026ee9438c2fdfd747be680df178a'}]}, 'timestamp': '2025-12-15 10:17:48.277270', '_unique_id': 'ae65c2ea08a0457298b1a04814ee58b3'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.278 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.278 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.278 12 ERROR oslo_messaging.notify.messaging yield Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.278 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.278 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.278 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.278 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.278 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.278 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.278 12 ERROR 
oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.278 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.278 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.278 12 ERROR oslo_messaging.notify.messaging conn.connect() Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.278 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.278 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.278 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.278 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.278 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.278 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.278 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Dec 15 05:17:48 localhost 
ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.278 12 ERROR oslo_messaging.notify.messaging Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.278 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.278 12 ERROR oslo_messaging.notify.messaging Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.278 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.278 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.278 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.278 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.278 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.278 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.278 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.278 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", 
line 653, in _send Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.278 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.278 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.278 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.278 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.278 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.278 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.278 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.278 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.278 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.278 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.278 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.278 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.278 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.278 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.278 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.278 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.278 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.278 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.278 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.278 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Dec 15 05:17:48 localhost 
ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.278 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.278 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.278 12 ERROR oslo_messaging.notify.messaging Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.279 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.requests in the context of pollsters Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.279 12 DEBUG ceilometer.compute.pollsters [-] 39ff1bd9-6f6b-44c8-bbec-a1fd9d196359/disk.device.write.requests volume: 47 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.280 12 DEBUG ceilometer.compute.pollsters [-] 39ff1bd9-6f6b-44c8-bbec-a1fd9d196359/disk.device.write.requests volume: 1 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.281 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. 
Payload={'message_id': '71ffce91-1032-4493-9504-45f69c7cd04a', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 47, 'user_id': '1ba5fce347b64bfebf995f187193f205', 'user_name': None, 'project_id': 'c785bf23f53946bc99867d8832a50266', 'project_name': None, 'resource_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359-vda', 'timestamp': '2025-12-15T10:17:48.279485', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359', 'instance_type': 'm1.small', 'host': 'bf4d507aa00b3fcf643a7e0b883420632ac7fc96e8c7a756982e121d', 'instance_host': 'np0005559462.localdomain', 'flavor': {'id': '2da0e147-aaa7-4bb9-a176-5fe1b15a32a0', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e'}, 'image_ref': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '4e289f38-d99f-11f0-817e-fa163ebaca0f', 'monotonic_time': 12934.330824222, 'message_signature': 'a3d83e3b22aea9ffd40077cc069c89e7bea6f29e2303f58eebbd6422d78a0282'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1, 'user_id': '1ba5fce347b64bfebf995f187193f205', 'user_name': None, 'project_id': 'c785bf23f53946bc99867d8832a50266', 'project_name': None, 'resource_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359-vdb', 'timestamp': '2025-12-15T10:17:48.279485', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 
'39ff1bd9-6f6b-44c8-bbec-a1fd9d196359', 'instance_type': 'm1.small', 'host': 'bf4d507aa00b3fcf643a7e0b883420632ac7fc96e8c7a756982e121d', 'instance_host': 'np0005559462.localdomain', 'flavor': {'id': '2da0e147-aaa7-4bb9-a176-5fe1b15a32a0', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e'}, 'image_ref': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '4e28b4f0-d99f-11f0-817e-fa163ebaca0f', 'monotonic_time': 12934.330824222, 'message_signature': 'adab142c9990e8644454b0c93d5bbc6d8d801a92112d904f9224a7ef396869cc'}]}, 'timestamp': '2025-12-15 10:17:48.280565', '_unique_id': '32eeccb254074674b316ab3a30326950'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.281 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.281 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.281 12 ERROR oslo_messaging.notify.messaging yield Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.281 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.281 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.281 12 ERROR 
oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.281 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.281 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.281 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.281 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.281 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.281 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.281 12 ERROR oslo_messaging.notify.messaging conn.connect() Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.281 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.281 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.281 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Dec 15 
05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.281 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.281 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.281 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.281 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.281 12 ERROR oslo_messaging.notify.messaging Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.281 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.281 12 ERROR oslo_messaging.notify.messaging Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.281 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.281 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.281 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.281 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Dec 15 05:17:48 localhost 
ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.281 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.281 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.281 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.281 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.281 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.281 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.281 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.281 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.281 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.281 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, 
in get Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.281 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.281 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.281 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.281 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.281 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.281 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.281 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.281 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.281 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.281 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 15 05:17:48 localhost 
ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.281 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.281 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.281 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.281 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.281 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.281 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.281 12 ERROR oslo_messaging.notify.messaging Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.282 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.capacity in the context of pollsters Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.282 12 DEBUG ceilometer.compute.pollsters [-] 39ff1bd9-6f6b-44c8-bbec-a1fd9d196359/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.282 12 DEBUG ceilometer.compute.pollsters [-] 39ff1bd9-6f6b-44c8-bbec-a1fd9d196359/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108 Dec 15 05:17:48 localhost 
ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.283 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'e97ebb83-ecf9-4d3b-8d78-c7ae25d8633a', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '1ba5fce347b64bfebf995f187193f205', 'user_name': None, 'project_id': 'c785bf23f53946bc99867d8832a50266', 'project_name': None, 'resource_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359-vda', 'timestamp': '2025-12-15T10:17:48.282618', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359', 'instance_type': 'm1.small', 'host': 'bf4d507aa00b3fcf643a7e0b883420632ac7fc96e8c7a756982e121d', 'instance_host': 'np0005559462.localdomain', 'flavor': {'id': '2da0e147-aaa7-4bb9-a176-5fe1b15a32a0', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e'}, 'image_ref': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '4e29153a-d99f-11f0-817e-fa163ebaca0f', 'monotonic_time': 12934.36897238, 'message_signature': '3d8767ca3f8c1d31137e527dccc204a11d1b1da5bca1f3cae0e0faa7659b4279'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '1ba5fce347b64bfebf995f187193f205', 'user_name': None, 'project_id': 'c785bf23f53946bc99867d8832a50266', 'project_name': None, 'resource_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359-vdb', 'timestamp': 
'2025-12-15T10:17:48.282618', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': '39ff1bd9-6f6b-44c8-bbec-a1fd9d196359', 'instance_type': 'm1.small', 'host': 'bf4d507aa00b3fcf643a7e0b883420632ac7fc96e8c7a756982e121d', 'instance_host': 'np0005559462.localdomain', 'flavor': {'id': '2da0e147-aaa7-4bb9-a176-5fe1b15a32a0', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e'}, 'image_ref': '7d4ec78f-3ce6-45d1-ac49-0eb10cd6b17e', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '4e2922dc-d99f-11f0-817e-fa163ebaca0f', 'monotonic_time': 12934.36897238, 'message_signature': 'bea02111c0ce8fa04ca8be9a321447e83567a3b470132a13a64b429050fb0eb5'}]}, 'timestamp': '2025-12-15 10:17:48.283300', '_unique_id': '641253619b9344fd94afb5a9d403f2e3'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.283 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.283 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.283 12 ERROR oslo_messaging.notify.messaging yield Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.283 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.283 12 ERROR oslo_messaging.notify.messaging return retry_over_time( 
Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.283 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.283 12 ERROR oslo_messaging.notify.messaging return fun(*args, **kwargs) Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.283 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.283 12 ERROR oslo_messaging.notify.messaging self._connection = self._establish_connection() Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.283 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.283 12 ERROR oslo_messaging.notify.messaging conn = self.transport.establish_connection() Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.283 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.283 12 ERROR oslo_messaging.notify.messaging conn.connect() Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.283 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.283 12 ERROR oslo_messaging.notify.messaging self.transport.connect() Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.283 12 ERROR 
oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.283 12 ERROR oslo_messaging.notify.messaging self._connect(self.host, self.port, self.connect_timeout) Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.283 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.283 12 ERROR oslo_messaging.notify.messaging self.sock.connect(sa) Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.283 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.283 12 ERROR oslo_messaging.notify.messaging Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.283 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception: Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.283 12 ERROR oslo_messaging.notify.messaging Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.283 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last): Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.283 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.283 12 ERROR oslo_messaging.notify.messaging self.transport._send_notification(target, ctxt, message, Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.283 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.283 12 ERROR oslo_messaging.notify.messaging self._driver.send_notification(target, ctxt, message, version, Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.283 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.283 12 ERROR oslo_messaging.notify.messaging return self._send(target, ctxt, message, Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.283 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.283 12 ERROR oslo_messaging.notify.messaging with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn: Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.283 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.283 12 ERROR oslo_messaging.notify.messaging return rpc_common.ConnectionContext(self._connection_pool, Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.283 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__ Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.283 12 ERROR oslo_messaging.notify.messaging self.connection = connection_pool.get(retry=retry) Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.283 
12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.283 12 ERROR oslo_messaging.notify.messaging return self.create(retry=retry) Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.283 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.283 12 ERROR oslo_messaging.notify.messaging return self.connection_cls(self.conf, self.url, purpose, retry=retry) Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.283 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__ Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.283 12 ERROR oslo_messaging.notify.messaging self.ensure_connection() Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.283 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.283 12 ERROR oslo_messaging.notify.messaging self.connection.ensure_connection( Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.283 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.283 12 ERROR oslo_messaging.notify.messaging self._ensure_connection(*args, **kwargs) Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.283 12 ERROR oslo_messaging.notify.messaging File 
"/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.283 12 ERROR oslo_messaging.notify.messaging return retry_over_time( Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.283 12 ERROR oslo_messaging.notify.messaging File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__ Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.283 12 ERROR oslo_messaging.notify.messaging self.gen.throw(typ, value, traceback) Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.283 12 ERROR oslo_messaging.notify.messaging File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.283 12 ERROR oslo_messaging.notify.messaging raise ConnectionError(str(exc)) from exc Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.283 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused Dec 15 05:17:48 localhost ceilometer_agent_compute[238479]: 2025-12-15 10:17:48.283 12 ERROR oslo_messaging.notify.messaging Dec 15 05:17:50 localhost ovn_metadata_agent[160585]: 2025-12-15 10:17:50.382 160590 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=12d96d64-e862-4f68-81e5-8d9ec5d3a5e2, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '24'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m Dec 15 05:17:50 localhost nova_compute[286344]: 2025-12-15 10:17:50.707 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:17:51 localhost 
ovn_metadata_agent[160585]: 2025-12-15 10:17:51.492 160590 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Dec 15 05:17:51 localhost ovn_metadata_agent[160585]: 2025-12-15 10:17:51.492 160590 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Dec 15 05:17:51 localhost ovn_metadata_agent[160585]: 2025-12-15 10:17:51.493 160590 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Dec 15 05:17:51 localhost nova_compute[286344]: 2025-12-15 10:17:51.986 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:17:52 localhost ceph-mon[298913]: mon.np0005559462@0(leader).osd e281 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104 Dec 15 05:17:52 localhost nova_compute[286344]: 2025-12-15 10:17:52.455 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:17:54 localhost nova_compute[286344]: 2025-12-15 10:17:54.881 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:17:55 localhost nova_compute[286344]: 2025-12-15 10:17:55.710 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup 
/usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:17:57 localhost ceph-mon[298913]: mon.np0005559462@0(leader).osd e281 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104 Dec 15 05:17:57 localhost nova_compute[286344]: 2025-12-15 10:17:57.456 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:17:58 localhost ovn_controller[154603]: 2025-12-15T10:17:58Z|00527|binding|INFO|Releasing lport b35254ad-12eb-47bb-92be-44fefe0694f0 from this chassis (sb_readonly=0) Dec 15 05:17:58 localhost nova_compute[286344]: 2025-12-15 10:17:58.998 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:18:00 localhost nova_compute[286344]: 2025-12-15 10:18:00.715 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:18:01 localhost systemd[1]: Started /usr/bin/podman healthcheck run 67ee72fcd99c896f79e8807764b1d0f04e7d45927d06e306c0986394e8ed42a0. Dec 15 05:18:01 localhost systemd[1]: Started /usr/bin/podman healthcheck run 730c50020a7d9e2da21d2462d40af20ac303e43f3491f692cbbd87f432ba3e09. Dec 15 05:18:01 localhost systemd[1]: Started /usr/bin/podman healthcheck run 9f0f481c937606298203797a71924b7614a5d9e3f4a8afd837cfa5b9602aedeb. Dec 15 05:18:01 localhost systemd[1]: Started /usr/bin/podman healthcheck run ac2eae30a2a7f971398af42ea67b3ed799d4b3341f7688ebcd73f7ca7f66c2a8. Dec 15 05:18:01 localhost systemd[1]: Started /usr/bin/podman healthcheck run b3d8483ade539500e228c2af9c0c3e647febb5ec7903610a83aea32631007d4a. Dec 15 05:18:01 localhost systemd[1]: tmp-crun.WqHXmS.mount: Deactivated successfully. 
Dec 15 05:18:01 localhost podman[340739]: 2025-12-15 10:18:01.779591784 +0000 UTC m=+0.098655341 container health_status 9f0f481c937606298203797a71924b7614a5d9e3f4a8afd837cfa5b9602aedeb (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, container_name=multipathd, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb) Dec 15 05:18:01 localhost podman[340739]: 2025-12-15 10:18:01.823425177 +0000 UTC m=+0.142488774 container exec_died 
9f0f481c937606298203797a71924b7614a5d9e3f4a8afd837cfa5b9602aedeb (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, config_id=multipathd, io.buildah.version=1.41.3, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, tcib_managed=true) Dec 15 05:18:01 localhost podman[340737]: 2025-12-15 10:18:01.782300893 +0000 UTC m=+0.104389229 container health_status 67ee72fcd99c896f79e8807764b1d0f04e7d45927d06e306c0986394e8ed42a0 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, 
name=node_exporter, health_status=healthy, maintainer=The Prometheus Authors , managed_by=edpm_ansible, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter) Dec 15 05:18:01 localhost systemd[1]: 9f0f481c937606298203797a71924b7614a5d9e3f4a8afd837cfa5b9602aedeb.service: Deactivated successfully. 
Dec 15 05:18:01 localhost podman[340740]: 2025-12-15 10:18:01.838445798 +0000 UTC m=+0.149806269 container health_status ac2eae30a2a7f971398af42ea67b3ed799d4b3341f7688ebcd73f7ca7f66c2a8 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, container_name=ovn_controller, io.buildah.version=1.41.3, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '07f3af15d79276418f54a8dde2fc251064527c3571430e14af77aaa2c01f1193'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_id=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible) Dec 15 05:18:01 localhost podman[340737]: 2025-12-15 10:18:01.863945985 +0000 UTC m=+0.186034371 container exec_died 67ee72fcd99c896f79e8807764b1d0f04e7d45927d06e306c0986394e8ed42a0 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', 
'--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible) Dec 15 05:18:01 localhost podman[340738]: 2025-12-15 10:18:01.87302041 +0000 UTC m=+0.194015023 container health_status 730c50020a7d9e2da21d2462d40af20ac303e43f3491f692cbbd87f432ba3e09 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, io.openshift.tags=minimal rhel9, vendor=Red Hat, Inc., maintainer=Red Hat, Inc., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., distribution-scope=public, release=1755695350, name=ubi9-minimal, vcs-type=git, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': 
'/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'cb1e9478634a00c8fb5ad9cd7fcd38c7b2a1a85187dfaa618bc715c24c2b15fb'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., config_id=openstack_network_exporter, version=9.6, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, container_name=openstack_network_exporter, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, build-date=2025-08-20T13:12:41, url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, com.redhat.component=ubi9-minimal-container, io.buildah.version=1.33.7) Dec 15 05:18:01 localhost podman[243449]: time="2025-12-15T10:18:01Z" level=info msg="List containers: received `last` parameter - overwriting `limit`" Dec 15 05:18:01 localhost systemd[1]: 67ee72fcd99c896f79e8807764b1d0f04e7d45927d06e306c0986394e8ed42a0.service: Deactivated successfully. Dec 15 05:18:01 localhost podman[340740]: 2025-12-15 10:18:01.905422539 +0000 UTC m=+0.216783040 container exec_died ac2eae30a2a7f971398af42ea67b3ed799d4b3341f7688ebcd73f7ca7f66c2a8 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '07f3af15d79276418f54a8dde2fc251064527c3571430e14af77aaa2c01f1193'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.build-date=20251202) Dec 15 05:18:01 localhost systemd[1]: ac2eae30a2a7f971398af42ea67b3ed799d4b3341f7688ebcd73f7ca7f66c2a8.service: Deactivated successfully. Dec 15 05:18:01 localhost podman[340738]: 2025-12-15 10:18:01.943284068 +0000 UTC m=+0.264278641 container exec_died 730c50020a7d9e2da21d2462d40af20ac303e43f3491f692cbbd87f432ba3e09 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, container_name=openstack_network_exporter, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1755695350, distribution-scope=public, managed_by=edpm_ansible, vcs-type=git, version=9.6, io.buildah.version=1.33.7, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'cb1e9478634a00c8fb5ad9cd7fcd38c7b2a1a85187dfaa618bc715c24c2b15fb'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', 
'/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, architecture=x86_64, build-date=2025-08-20T13:12:41, config_id=openstack_network_exporter, io.openshift.tags=minimal rhel9, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vendor=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.openshift.expose-services=, name=ubi9-minimal, url=https://catalog.redhat.com/en/search?searchType=containers, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly.) Dec 15 05:18:01 localhost podman[243449]: @ - - [15/Dec/2025:10:18:01 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 156640 "" "Go-http-client/1.1" Dec 15 05:18:01 localhost systemd[1]: 730c50020a7d9e2da21d2462d40af20ac303e43f3491f692cbbd87f432ba3e09.service: Deactivated successfully. 
Dec 15 05:18:01 localhost podman[243449]: @ - - [15/Dec/2025:10:18:01 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 19276 "" "Go-http-client/1.1" Dec 15 05:18:02 localhost podman[340744]: 2025-12-15 10:18:02.034935132 +0000 UTC m=+0.346194230 container health_status b3d8483ade539500e228c2af9c0c3e647febb5ec7903610a83aea32631007d4a (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '07f3af15d79276418f54a8dde2fc251064527c3571430e14af77aaa2c01f1193-cb1e9478634a00c8fb5ad9cd7fcd38c7b2a1a85187dfaa618bc715c24c2b15fb'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ceilometer_agent_compute, 
container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, tcib_managed=true) Dec 15 05:18:02 localhost podman[340744]: 2025-12-15 10:18:02.048354575 +0000 UTC m=+0.359613703 container exec_died b3d8483ade539500e228c2af9c0c3e647febb5ec7903610a83aea32631007d4a (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, config_id=ceilometer_agent_compute, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '07f3af15d79276418f54a8dde2fc251064527c3571430e14af77aaa2c01f1193-cb1e9478634a00c8fb5ad9cd7fcd38c7b2a1a85187dfaa618bc715c24c2b15fb'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team) Dec 15 05:18:02 localhost systemd[1]: b3d8483ade539500e228c2af9c0c3e647febb5ec7903610a83aea32631007d4a.service: Deactivated successfully. Dec 15 05:18:02 localhost ceph-mon[298913]: mon.np0005559462@0(leader).osd e281 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104 Dec 15 05:18:02 localhost nova_compute[286344]: 2025-12-15 10:18:02.458 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:18:04 localhost openstack_network_exporter[246484]: ERROR 10:18:04 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd Dec 15 05:18:04 localhost openstack_network_exporter[246484]: ERROR 10:18:04 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd Dec 15 05:18:04 localhost openstack_network_exporter[246484]: ERROR 10:18:04 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server Dec 15 05:18:04 localhost openstack_network_exporter[246484]: ERROR 10:18:04 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath Dec 15 05:18:04 localhost openstack_network_exporter[246484]: Dec 15 05:18:04 localhost openstack_network_exporter[246484]: ERROR 10:18:04 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath Dec 15 05:18:04 localhost openstack_network_exporter[246484]: Dec 15 05:18:05 localhost nova_compute[286344]: 2025-12-15 10:18:05.748 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup 
/usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:18:07 localhost ceph-mon[298913]: mon.np0005559462@0(leader).osd e281 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104 Dec 15 05:18:07 localhost nova_compute[286344]: 2025-12-15 10:18:07.460 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:18:07 localhost systemd[1]: Started /usr/bin/podman healthcheck run 4d46c406a3855fd1c322e5d86c048188ea5865daa07ba23aaebacc7879668923. Dec 15 05:18:07 localhost podman[340841]: 2025-12-15 10:18:07.745654142 +0000 UTC m=+0.076783889 container health_status 4d46c406a3855fd1c322e5d86c048188ea5865daa07ba23aaebacc7879668923 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_metadata_agent, org.label-schema.build-date=20251202, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '07f3af15d79276418f54a8dde2fc251064527c3571430e14af77aaa2c01f1193-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', 
'/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3) Dec 15 05:18:07 localhost podman[340841]: 2025-12-15 10:18:07.749904887 +0000 UTC m=+0.081034674 container exec_died 4d46c406a3855fd1c322e5d86c048188ea5865daa07ba23aaebacc7879668923 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '07f3af15d79276418f54a8dde2fc251064527c3571430e14af77aaa2c01f1193-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', 
'/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}) Dec 15 05:18:07 localhost systemd[1]: 4d46c406a3855fd1c322e5d86c048188ea5865daa07ba23aaebacc7879668923.service: Deactivated successfully. Dec 15 05:18:07 localhost sshd[340859]: main: sshd: ssh-rsa algorithm is disabled Dec 15 05:18:10 localhost nova_compute[286344]: 2025-12-15 10:18:10.751 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:18:12 localhost ceph-mon[298913]: mon.np0005559462@0(leader).osd e281 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104 Dec 15 05:18:12 localhost nova_compute[286344]: 2025-12-15 10:18:12.489 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:18:14 localhost systemd[1]: Started /usr/bin/podman healthcheck run a4e94bd7aaee6c174c601060fb5f34559019ccea93fc397263ee3735731cfc1e. 
Dec 15 05:18:14 localhost podman[340861]: 2025-12-15 10:18:14.750181102 +0000 UTC m=+0.080477099 container health_status a4e94bd7aaee6c174c601060fb5f34559019ccea93fc397263ee3735731cfc1e (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}) Dec 15 05:18:14 localhost podman[340861]: 2025-12-15 10:18:14.761366799 +0000 UTC m=+0.091662776 container exec_died a4e94bd7aaee6c174c601060fb5f34559019ccea93fc397263ee3735731cfc1e (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': 
['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter) Dec 15 05:18:14 localhost systemd[1]: a4e94bd7aaee6c174c601060fb5f34559019ccea93fc397263ee3735731cfc1e.service: Deactivated successfully. Dec 15 05:18:15 localhost nova_compute[286344]: 2025-12-15 10:18:15.755 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:18:17 localhost ceph-mon[298913]: mon.np0005559462@0(leader).osd e281 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104 Dec 15 05:18:17 localhost nova_compute[286344]: 2025-12-15 10:18:17.491 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:18:20 localhost podman[340995]: 2025-12-15 10:18:20.704080204 +0000 UTC m=+0.089480722 container exec 8dcda56b365b42dc8758aab77a9ec80db304780e449052738f7e4e648ae1ecaf (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-bce17446-41b5-5408-a23e-0b011906b44a-crash-np0005559462, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.openshift.tags=rhceph ceph, description=Red Hat Ceph Storage 7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, maintainer=Guillaume Abrioux , vendor=Red Hat, Inc., com.redhat.component=rhceph-container, architecture=x86_64, build-date=2025-11-26T19:44:28Z, ceph=True, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, RELEASE=main, distribution-scope=public, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, GIT_CLEAN=True, CEPH_POINT_RELEASE=, name=rhceph, GIT_BRANCH=main, io.buildah.version=1.41.4, release=1763362218, io.k8s.description=Red Hat Ceph Storage 7, 
GIT_REPO=https://github.com/ceph/ceph-container.git, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., url=https://catalog.redhat.com/en/search?searchType=containers, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vcs-type=git, version=7) Dec 15 05:18:20 localhost nova_compute[286344]: 2025-12-15 10:18:20.757 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:18:20 localhost podman[340995]: 2025-12-15 10:18:20.828477036 +0000 UTC m=+0.213877514 container exec_died 8dcda56b365b42dc8758aab77a9ec80db304780e449052738f7e4e648ae1ecaf (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-bce17446-41b5-5408-a23e-0b011906b44a-crash-np0005559462, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, com.redhat.component=rhceph-container, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:enterprise_linux:9::appstream, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.openshift.tags=rhceph ceph, build-date=2025-11-26T19:44:28Z, GIT_BRANCH=main, version=7, ceph=True, io.buildah.version=1.41.4, io.openshift.expose-services=, vcs-type=git, release=1763362218, GIT_CLEAN=True, CEPH_POINT_RELEASE=, architecture=x86_64, io.k8s.description=Red Hat Ceph Storage 7, org.opencontainers.image.revision=09e5383fa24dada2ef392e4f10e9f5d0a9ef83f0, url=https://catalog.redhat.com/en/search?searchType=containers, GIT_REPO=https://github.com/ceph/ceph-container.git, RELEASE=main, distribution-scope=public, description=Red Hat Ceph Storage 7, maintainer=Guillaume Abrioux , name=rhceph) Dec 15 05:18:21 localhost ceph-mon[298913]: 
mon.np0005559462@0(leader) e17 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005559462.localdomain.devices.0}] v 0) Dec 15 05:18:21 localhost ceph-mon[298913]: log_channel(audit) log [INF] : from='mgr.34411 ' entity='mgr.np0005559464.aomnqe' Dec 15 05:18:21 localhost ceph-mon[298913]: mon.np0005559462@0(leader) e17 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005559462.localdomain}] v 0) Dec 15 05:18:21 localhost ceph-mon[298913]: log_channel(audit) log [INF] : from='mgr.34411 ' entity='mgr.np0005559464.aomnqe' Dec 15 05:18:21 localhost ceph-mon[298913]: mon.np0005559462@0(leader) e17 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005559463.localdomain.devices.0}] v 0) Dec 15 05:18:21 localhost ceph-mon[298913]: log_channel(audit) log [INF] : from='mgr.34411 ' entity='mgr.np0005559464.aomnqe' Dec 15 05:18:21 localhost ceph-mon[298913]: mon.np0005559462@0(leader) e17 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005559463.localdomain}] v 0) Dec 15 05:18:21 localhost ceph-mon[298913]: log_channel(audit) log [INF] : from='mgr.34411 ' entity='mgr.np0005559464.aomnqe' Dec 15 05:18:21 localhost ceph-mon[298913]: mon.np0005559462@0(leader) e17 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005559464.localdomain.devices.0}] v 0) Dec 15 05:18:21 localhost ceph-mon[298913]: log_channel(audit) log [INF] : from='mgr.34411 ' entity='mgr.np0005559464.aomnqe' Dec 15 05:18:21 localhost ceph-mon[298913]: mon.np0005559462@0(leader) e17 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005559464.localdomain}] v 0) Dec 15 05:18:21 localhost ceph-mon[298913]: log_channel(audit) log [INF] : from='mgr.34411 ' entity='mgr.np0005559464.aomnqe' Dec 15 05:18:22 localhost ceph-mon[298913]: mon.np0005559462@0(leader).osd e281 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 
318767104 Dec 15 05:18:22 localhost nova_compute[286344]: 2025-12-15 10:18:22.271 286348 DEBUG oslo_service.periodic_task [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Dec 15 05:18:22 localhost ceph-mon[298913]: mon.np0005559462@0(leader) e17 handle_command mon_command({"prefix": "config rm", "who": "osd.2", "name": "osd_memory_target"} v 0) Dec 15 05:18:22 localhost ceph-mon[298913]: log_channel(audit) log [INF] : from='mgr.34411 ' entity='mgr.np0005559464.aomnqe' cmd={"prefix": "config rm", "who": "osd.2", "name": "osd_memory_target"} : dispatch Dec 15 05:18:22 localhost ceph-mon[298913]: mon.np0005559462@0(leader) e17 handle_command mon_command({"prefix": "config rm", "who": "osd.0", "name": "osd_memory_target"} v 0) Dec 15 05:18:22 localhost ceph-mon[298913]: log_channel(audit) log [INF] : from='mgr.34411 ' entity='mgr.np0005559464.aomnqe' cmd={"prefix": "config rm", "who": "osd.0", "name": "osd_memory_target"} : dispatch Dec 15 05:18:22 localhost ceph-mon[298913]: mon.np0005559462@0(leader) e17 handle_command mon_command({"prefix": "config rm", "who": "osd.3", "name": "osd_memory_target"} v 0) Dec 15 05:18:22 localhost ceph-mon[298913]: log_channel(audit) log [INF] : from='mgr.34411 ' entity='mgr.np0005559464.aomnqe' cmd={"prefix": "config rm", "who": "osd.3", "name": "osd_memory_target"} : dispatch Dec 15 05:18:22 localhost ceph-mon[298913]: mon.np0005559462@0(leader) e17 handle_command mon_command({"prefix": "config rm", "who": "osd.5", "name": "osd_memory_target"} v 0) Dec 15 05:18:22 localhost ceph-mon[298913]: log_channel(audit) log [INF] : from='mgr.34411 ' entity='mgr.np0005559464.aomnqe' cmd={"prefix": "config rm", "who": "osd.5", "name": "osd_memory_target"} : dispatch Dec 15 05:18:22 localhost ceph-mon[298913]: mon.np0005559462@0(leader) e17 handle_command mon_command([{prefix=config set, 
name=osd_memory_target}] v 0) Dec 15 05:18:22 localhost ceph-mon[298913]: mon.np0005559462@0(leader) e17 handle_command mon_command([{prefix=config set, name=osd_memory_target}] v 0) Dec 15 05:18:22 localhost ceph-mon[298913]: mon.np0005559462@0(leader) e17 handle_command mon_command({"prefix": "config rm", "who": "osd.1", "name": "osd_memory_target"} v 0) Dec 15 05:18:22 localhost ceph-mon[298913]: log_channel(audit) log [INF] : from='mgr.34411 ' entity='mgr.np0005559464.aomnqe' cmd={"prefix": "config rm", "who": "osd.1", "name": "osd_memory_target"} : dispatch Dec 15 05:18:22 localhost ceph-mon[298913]: mon.np0005559462@0(leader) e17 handle_command mon_command({"prefix": "config rm", "who": "osd.4", "name": "osd_memory_target"} v 0) Dec 15 05:18:22 localhost ceph-mon[298913]: log_channel(audit) log [INF] : from='mgr.34411 ' entity='mgr.np0005559464.aomnqe' cmd={"prefix": "config rm", "who": "osd.4", "name": "osd_memory_target"} : dispatch Dec 15 05:18:22 localhost ceph-mon[298913]: mon.np0005559462@0(leader) e17 handle_command mon_command([{prefix=config set, name=osd_memory_target}] v 0) Dec 15 05:18:22 localhost ceph-mon[298913]: from='mgr.34411 ' entity='mgr.np0005559464.aomnqe' Dec 15 05:18:22 localhost ceph-mon[298913]: from='mgr.34411 ' entity='mgr.np0005559464.aomnqe' Dec 15 05:18:22 localhost ceph-mon[298913]: from='mgr.34411 ' entity='mgr.np0005559464.aomnqe' Dec 15 05:18:22 localhost ceph-mon[298913]: from='mgr.34411 ' entity='mgr.np0005559464.aomnqe' Dec 15 05:18:22 localhost ceph-mon[298913]: from='mgr.34411 ' entity='mgr.np0005559464.aomnqe' Dec 15 05:18:22 localhost ceph-mon[298913]: from='mgr.34411 ' entity='mgr.np0005559464.aomnqe' Dec 15 05:18:22 localhost ceph-mon[298913]: from='mgr.34411 172.18.0.108:0/929118679' entity='mgr.np0005559464.aomnqe' cmd={"prefix": "config rm", "who": "osd.2", "name": "osd_memory_target"} : dispatch Dec 15 05:18:22 localhost ceph-mon[298913]: from='mgr.34411 172.18.0.108:0/929118679' entity='mgr.np0005559464.aomnqe' 
cmd={"prefix": "config rm", "who": "osd.0", "name": "osd_memory_target"} : dispatch Dec 15 05:18:22 localhost ceph-mon[298913]: from='mgr.34411 172.18.0.108:0/929118679' entity='mgr.np0005559464.aomnqe' cmd={"prefix": "config rm", "who": "osd.3", "name": "osd_memory_target"} : dispatch Dec 15 05:18:22 localhost ceph-mon[298913]: from='mgr.34411 172.18.0.108:0/929118679' entity='mgr.np0005559464.aomnqe' cmd={"prefix": "config rm", "who": "osd.5", "name": "osd_memory_target"} : dispatch Dec 15 05:18:22 localhost ceph-mon[298913]: from='mgr.34411 ' entity='mgr.np0005559464.aomnqe' cmd={"prefix": "config rm", "who": "osd.2", "name": "osd_memory_target"} : dispatch Dec 15 05:18:22 localhost ceph-mon[298913]: from='mgr.34411 ' entity='mgr.np0005559464.aomnqe' cmd={"prefix": "config rm", "who": "osd.0", "name": "osd_memory_target"} : dispatch Dec 15 05:18:22 localhost ceph-mon[298913]: from='mgr.34411 ' entity='mgr.np0005559464.aomnqe' cmd={"prefix": "config rm", "who": "osd.3", "name": "osd_memory_target"} : dispatch Dec 15 05:18:22 localhost ceph-mon[298913]: from='mgr.34411 ' entity='mgr.np0005559464.aomnqe' cmd={"prefix": "config rm", "who": "osd.5", "name": "osd_memory_target"} : dispatch Dec 15 05:18:22 localhost ceph-mon[298913]: from='mgr.34411 172.18.0.108:0/929118679' entity='mgr.np0005559464.aomnqe' cmd={"prefix": "config rm", "who": "osd.1", "name": "osd_memory_target"} : dispatch Dec 15 05:18:22 localhost ceph-mon[298913]: from='mgr.34411 172.18.0.108:0/929118679' entity='mgr.np0005559464.aomnqe' cmd={"prefix": "config rm", "who": "osd.4", "name": "osd_memory_target"} : dispatch Dec 15 05:18:22 localhost ceph-mon[298913]: from='mgr.34411 ' entity='mgr.np0005559464.aomnqe' cmd={"prefix": "config rm", "who": "osd.1", "name": "osd_memory_target"} : dispatch Dec 15 05:18:22 localhost ceph-mon[298913]: from='mgr.34411 ' entity='mgr.np0005559464.aomnqe' cmd={"prefix": "config rm", "who": "osd.4", "name": "osd_memory_target"} : dispatch Dec 15 05:18:22 localhost 
ceph-mon[298913]: mon.np0005559462@0(leader) e17 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/osd_remove_queue}] v 0) Dec 15 05:18:22 localhost ceph-mon[298913]: log_channel(audit) log [INF] : from='mgr.34411 ' entity='mgr.np0005559464.aomnqe' Dec 15 05:18:22 localhost nova_compute[286344]: 2025-12-15 10:18:22.535 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:18:23 localhost nova_compute[286344]: 2025-12-15 10:18:23.267 286348 DEBUG oslo_service.periodic_task [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Dec 15 05:18:23 localhost ceph-mon[298913]: Adjusting osd_memory_target on np0005559462.localdomain to 836.6M Dec 15 05:18:23 localhost ceph-mon[298913]: Adjusting osd_memory_target on np0005559463.localdomain to 836.6M Dec 15 05:18:23 localhost ceph-mon[298913]: Unable to set osd_memory_target on np0005559462.localdomain to 877243801: error parsing value: Value '877243801' is below minimum 939524096 Dec 15 05:18:23 localhost ceph-mon[298913]: Unable to set osd_memory_target on np0005559463.localdomain to 877246668: error parsing value: Value '877246668' is below minimum 939524096 Dec 15 05:18:23 localhost ceph-mon[298913]: Adjusting osd_memory_target on np0005559464.localdomain to 836.6M Dec 15 05:18:23 localhost ceph-mon[298913]: Unable to set osd_memory_target on np0005559464.localdomain to 877246668: error parsing value: Value '877246668' is below minimum 939524096 Dec 15 05:18:23 localhost ceph-mon[298913]: from='mgr.34411 172.18.0.108:0/929118679' entity='mgr.np0005559464.aomnqe' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch Dec 15 05:18:23 localhost ceph-mon[298913]: from='mgr.34411 ' entity='mgr.np0005559464.aomnqe' Dec 15 05:18:24 localhost 
ceph-mon[298913]: mon.np0005559462@0(leader) e17 handle_command mon_command([{prefix=config-key set, key=mgr/progress/completed}] v 0) Dec 15 05:18:24 localhost ceph-mon[298913]: log_channel(audit) log [INF] : from='mgr.34411 ' entity='mgr.np0005559464.aomnqe' Dec 15 05:18:25 localhost ceph-mon[298913]: from='mgr.34411 ' entity='mgr.np0005559464.aomnqe' Dec 15 05:18:25 localhost nova_compute[286344]: 2025-12-15 10:18:25.759 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:18:26 localhost nova_compute[286344]: 2025-12-15 10:18:26.270 286348 DEBUG oslo_service.periodic_task [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Dec 15 05:18:26 localhost nova_compute[286344]: 2025-12-15 10:18:26.271 286348 DEBUG nova.compute.manager [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m Dec 15 05:18:26 localhost nova_compute[286344]: 2025-12-15 10:18:26.271 286348 DEBUG nova.compute.manager [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m Dec 15 05:18:26 localhost nova_compute[286344]: 2025-12-15 10:18:26.802 286348 DEBUG oslo_concurrency.lockutils [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Acquiring lock "refresh_cache-39ff1bd9-6f6b-44c8-bbec-a1fd9d196359" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m Dec 15 05:18:26 localhost nova_compute[286344]: 2025-12-15 10:18:26.803 286348 DEBUG oslo_concurrency.lockutils [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Acquired 
lock "refresh_cache-39ff1bd9-6f6b-44c8-bbec-a1fd9d196359" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m Dec 15 05:18:26 localhost nova_compute[286344]: 2025-12-15 10:18:26.803 286348 DEBUG nova.network.neutron [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] [instance: 39ff1bd9-6f6b-44c8-bbec-a1fd9d196359] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m Dec 15 05:18:26 localhost nova_compute[286344]: 2025-12-15 10:18:26.804 286348 DEBUG nova.objects.instance [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Lazy-loading 'info_cache' on Instance uuid 39ff1bd9-6f6b-44c8-bbec-a1fd9d196359 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m Dec 15 05:18:27 localhost ceph-mon[298913]: mon.np0005559462@0(leader).osd e281 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104 Dec 15 05:18:27 localhost nova_compute[286344]: 2025-12-15 10:18:27.201 286348 DEBUG nova.network.neutron [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] [instance: 39ff1bd9-6f6b-44c8-bbec-a1fd9d196359] Updating instance_info_cache with network_info: [{"id": "03ef8889-3216-43fb-8a52-4be17a956ce1", "address": "fa:16:3e:74:df:7c", "network": {"id": "befb7a72-17a9-4bcb-b561-84b8f626685a", "bridge": "br-int", "label": "private", "subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.201", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.20", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.0.3"}}], "meta": {"injected": false, "tenant_id": "c785bf23f53946bc99867d8832a50266", "mtu": 1292, "physical_network": null, "tunneled": true}}, 
"type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap03ef8889-32", "ovs_interfaceid": "03ef8889-3216-43fb-8a52-4be17a956ce1", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m Dec 15 05:18:27 localhost nova_compute[286344]: 2025-12-15 10:18:27.228 286348 DEBUG oslo_concurrency.lockutils [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Releasing lock "refresh_cache-39ff1bd9-6f6b-44c8-bbec-a1fd9d196359" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m Dec 15 05:18:27 localhost nova_compute[286344]: 2025-12-15 10:18:27.228 286348 DEBUG nova.compute.manager [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] [instance: 39ff1bd9-6f6b-44c8-bbec-a1fd9d196359] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m Dec 15 05:18:27 localhost nova_compute[286344]: 2025-12-15 10:18:27.229 286348 DEBUG oslo_service.periodic_task [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Dec 15 05:18:27 localhost nova_compute[286344]: 2025-12-15 10:18:27.538 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:18:28 localhost nova_compute[286344]: 2025-12-15 10:18:28.269 286348 DEBUG oslo_service.periodic_task [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks 
/usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Dec 15 05:18:30 localhost nova_compute[286344]: 2025-12-15 10:18:30.270 286348 DEBUG oslo_service.periodic_task [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Dec 15 05:18:30 localhost ovn_controller[154603]: 2025-12-15T10:18:30Z|00528|memory_trim|INFO|Detected inactivity (last active 30018 ms ago): trimming memory Dec 15 05:18:30 localhost sshd[341202]: main: sshd: ssh-rsa algorithm is disabled Dec 15 05:18:30 localhost nova_compute[286344]: 2025-12-15 10:18:30.762 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:18:30 localhost systemd-logind[763]: New session 75 of user zuul. Dec 15 05:18:30 localhost systemd[1]: Started Session 75 of User zuul. Dec 15 05:18:31 localhost python3[341224]: ansible-ansible.legacy.command Invoked with _raw_params=subscription-manager unregister#012 _uses_shell=True zuul_log_id=fa163efc-24cc-5c73-9448-00000000000c-1-overcloudnovacompute0 zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None Dec 15 05:18:31 localhost nova_compute[286344]: 2025-12-15 10:18:31.270 286348 DEBUG oslo_service.periodic_task [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Dec 15 05:18:31 localhost nova_compute[286344]: 2025-12-15 10:18:31.271 286348 DEBUG nova.compute.manager [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... 
_reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m Dec 15 05:18:31 localhost podman[243449]: time="2025-12-15T10:18:31Z" level=info msg="List containers: received `last` parameter - overwriting `limit`" Dec 15 05:18:31 localhost podman[243449]: @ - - [15/Dec/2025:10:18:31 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 156640 "" "Go-http-client/1.1" Dec 15 05:18:31 localhost podman[243449]: @ - - [15/Dec/2025:10:18:31 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 19280 "" "Go-http-client/1.1" Dec 15 05:18:32 localhost ceph-mon[298913]: mon.np0005559462@0(leader).osd e281 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104 Dec 15 05:18:32 localhost nova_compute[286344]: 2025-12-15 10:18:32.572 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:18:32 localhost systemd[1]: Started /usr/bin/podman healthcheck run 67ee72fcd99c896f79e8807764b1d0f04e7d45927d06e306c0986394e8ed42a0. Dec 15 05:18:32 localhost systemd[1]: Started /usr/bin/podman healthcheck run 730c50020a7d9e2da21d2462d40af20ac303e43f3491f692cbbd87f432ba3e09. Dec 15 05:18:32 localhost systemd[1]: Started /usr/bin/podman healthcheck run 9f0f481c937606298203797a71924b7614a5d9e3f4a8afd837cfa5b9602aedeb. Dec 15 05:18:32 localhost systemd[1]: Started /usr/bin/podman healthcheck run ac2eae30a2a7f971398af42ea67b3ed799d4b3341f7688ebcd73f7ca7f66c2a8. Dec 15 05:18:32 localhost systemd[1]: Started /usr/bin/podman healthcheck run b3d8483ade539500e228c2af9c0c3e647febb5ec7903610a83aea32631007d4a. 
Dec 15 05:18:32 localhost podman[341246]: 2025-12-15 10:18:32.78473897 +0000 UTC m=+0.097308420 container health_status b3d8483ade539500e228c2af9c0c3e647febb5ec7903610a83aea32631007d4a (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_id=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '07f3af15d79276418f54a8dde2fc251064527c3571430e14af77aaa2c01f1193-cb1e9478634a00c8fb5ad9cd7fcd38c7b2a1a85187dfaa618bc715c24c2b15fb'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, managed_by=edpm_ansible, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, container_name=ceilometer_agent_compute, 
org.label-schema.build-date=20251202, tcib_managed=true) Dec 15 05:18:32 localhost podman[341229]: 2025-12-15 10:18:32.754145974 +0000 UTC m=+0.080033874 container health_status 9f0f481c937606298203797a71924b7614a5d9e3f4a8afd837cfa5b9602aedeb (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, org.label-schema.name=CentOS Stream 9 Base Image, config_id=multipathd, org.label-schema.schema-version=1.0, tcib_managed=true, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, org.label-schema.build-date=20251202) Dec 15 05:18:32 localhost podman[341230]: 2025-12-15 10:18:32.826282197 +0000 UTC m=+0.141638058 
container health_status ac2eae30a2a7f971398af42ea67b3ed799d4b3341f7688ebcd73f7ca7f66c2a8 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '07f3af15d79276418f54a8dde2fc251064527c3571430e14af77aaa2c01f1193'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0) Dec 15 05:18:32 localhost podman[341230]: 2025-12-15 10:18:32.866327471 +0000 UTC m=+0.181683302 container exec_died ac2eae30a2a7f971398af42ea67b3ed799d4b3341f7688ebcd73f7ca7f66c2a8 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, container_name=ovn_controller, 
org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '07f3af15d79276418f54a8dde2fc251064527c3571430e14af77aaa2c01f1193'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3) Dec 15 05:18:32 localhost systemd[1]: ac2eae30a2a7f971398af42ea67b3ed799d4b3341f7688ebcd73f7ca7f66c2a8.service: Deactivated successfully. 
Dec 15 05:18:32 localhost podman[341227]: 2025-12-15 10:18:32.88748155 +0000 UTC m=+0.213327169 container health_status 67ee72fcd99c896f79e8807764b1d0f04e7d45927d06e306c0986394e8ed42a0 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible) Dec 15 05:18:32 localhost podman[341228]: 2025-12-15 10:18:32.918372754 +0000 UTC m=+0.243174633 container health_status 730c50020a7d9e2da21d2462d40af20ac303e43f3491f692cbbd87f432ba3e09 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, health_status=healthy, io.openshift.tags=minimal rhel9, com.redhat.component=ubi9-minimal-container, 
distribution-scope=public, io.buildah.version=1.33.7, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'cb1e9478634a00c8fb5ad9cd7fcd38c7b2a1a85187dfaa618bc715c24c2b15fb'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, architecture=x86_64, vcs-type=git, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://catalog.redhat.com/en/search?searchType=containers, vendor=Red Hat, Inc., container_name=openstack_network_exporter, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, maintainer=Red Hat, Inc., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, version=9.6, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., build-date=2025-08-20T13:12:41, config_id=openstack_network_exporter, name=ubi9-minimal, release=1755695350) Dec 15 05:18:32 localhost podman[341228]: 2025-12-15 10:18:32.928366038 +0000 UTC m=+0.253167907 container exec_died 730c50020a7d9e2da21d2462d40af20ac303e43f3491f692cbbd87f432ba3e09 (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7, name=openstack_network_exporter, vcs-type=git, com.redhat.component=ubi9-minimal-container, url=https://catalog.redhat.com/en/search?searchType=containers, io.buildah.version=1.33.7, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'cb1e9478634a00c8fb5ad9cd7fcd38c7b2a1a85187dfaa618bc715c24c2b15fb'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:ecd56e6733c475f2d441344fd98f288c3eac0261ba113695fec7520a954ccbc7', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', 
'/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., name=ubi9-minimal, architecture=x86_64, io.openshift.tags=minimal rhel9, release=1755695350, build-date=2025-08-20T13:12:41, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, container_name=openstack_network_exporter, maintainer=Red Hat, Inc., config_id=openstack_network_exporter, distribution-scope=public, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, version=9.6, managed_by=edpm_ansible, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vendor=Red Hat, Inc.) 
Dec 15 05:18:32 localhost podman[341227]: 2025-12-15 10:18:32.929464079 +0000 UTC m=+0.255309698 container exec_died 67ee72fcd99c896f79e8807764b1d0f04e7d45927d06e306c0986394e8ed42a0 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors , managed_by=edpm_ansible) Dec 15 05:18:32 localhost podman[341229]: 2025-12-15 10:18:32.940552554 +0000 UTC m=+0.266440444 container exec_died 9f0f481c937606298203797a71924b7614a5d9e3f4a8afd837cfa5b9602aedeb (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, org.label-schema.build-date=20251202, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': 
'/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, org.label-schema.schema-version=1.0, container_name=multipathd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true) Dec 15 05:18:32 localhost systemd[1]: 67ee72fcd99c896f79e8807764b1d0f04e7d45927d06e306c0986394e8ed42a0.service: Deactivated successfully. 
Dec 15 05:18:32 localhost podman[341246]: 2025-12-15 10:18:32.95816676 +0000 UTC m=+0.270736210 container exec_died b3d8483ade539500e228c2af9c0c3e647febb5ec7903610a83aea32631007d4a (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '07f3af15d79276418f54a8dde2fc251064527c3571430e14af77aaa2c01f1193-cb1e9478634a00c8fb5ad9cd7fcd38c7b2a1a85187dfaa618bc715c24c2b15fb'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb) Dec 
15 05:18:32 localhost systemd[1]: 9f0f481c937606298203797a71924b7614a5d9e3f4a8afd837cfa5b9602aedeb.service: Deactivated successfully. Dec 15 05:18:32 localhost systemd[1]: b3d8483ade539500e228c2af9c0c3e647febb5ec7903610a83aea32631007d4a.service: Deactivated successfully. Dec 15 05:18:32 localhost systemd[1]: 730c50020a7d9e2da21d2462d40af20ac303e43f3491f692cbbd87f432ba3e09.service: Deactivated successfully. Dec 15 05:18:33 localhost nova_compute[286344]: 2025-12-15 10:18:33.271 286348 DEBUG oslo_service.periodic_task [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Dec 15 05:18:33 localhost nova_compute[286344]: 2025-12-15 10:18:33.296 286348 DEBUG oslo_concurrency.lockutils [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Dec 15 05:18:33 localhost nova_compute[286344]: 2025-12-15 10:18:33.297 286348 DEBUG oslo_concurrency.lockutils [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Dec 15 05:18:33 localhost nova_compute[286344]: 2025-12-15 10:18:33.297 286348 DEBUG oslo_concurrency.lockutils [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Dec 15 05:18:33 localhost nova_compute[286344]: 2025-12-15 10:18:33.298 286348 DEBUG nova.compute.resource_tracker [None 
req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Auditing locally available compute resources for np0005559462.localdomain (node: np0005559462.localdomain) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m Dec 15 05:18:33 localhost nova_compute[286344]: 2025-12-15 10:18:33.298 286348 DEBUG oslo_concurrency.processutils [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m Dec 15 05:18:33 localhost ceph-mon[298913]: mon.np0005559462@0(leader) e17 handle_command mon_command({"prefix": "df", "format": "json"} v 0) Dec 15 05:18:33 localhost ceph-mon[298913]: log_channel(audit) log [DBG] : from='client.? 172.18.0.106:0/2951750200' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch Dec 15 05:18:33 localhost nova_compute[286344]: 2025-12-15 10:18:33.773 286348 DEBUG oslo_concurrency.processutils [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.475s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m Dec 15 05:18:33 localhost nova_compute[286344]: 2025-12-15 10:18:33.846 286348 DEBUG nova.virt.libvirt.driver [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m Dec 15 05:18:33 localhost nova_compute[286344]: 2025-12-15 10:18:33.847 286348 DEBUG nova.virt.libvirt.driver [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m Dec 15 
05:18:34 localhost nova_compute[286344]: 2025-12-15 10:18:34.072 286348 WARNING nova.virt.libvirt.driver [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m Dec 15 05:18:34 localhost nova_compute[286344]: 2025-12-15 10:18:34.074 286348 DEBUG nova.compute.resource_tracker [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Hypervisor/Node resource view: name=np0005559462.localdomain free_ram=11114MB free_disk=41.836978912353516GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": 
"pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m Dec 15 05:18:34 localhost nova_compute[286344]: 2025-12-15 10:18:34.075 286348 DEBUG oslo_concurrency.lockutils [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Dec 15 05:18:34 localhost nova_compute[286344]: 2025-12-15 10:18:34.075 286348 DEBUG oslo_concurrency.lockutils [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Dec 15 05:18:34 localhost nova_compute[286344]: 2025-12-15 10:18:34.190 286348 DEBUG nova.compute.resource_tracker [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Instance 39ff1bd9-6f6b-44c8-bbec-a1fd9d196359 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 2, 'MEMORY_MB': 512, 'VCPU': 1}}. 
_remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m Dec 15 05:18:34 localhost nova_compute[286344]: 2025-12-15 10:18:34.191 286348 DEBUG nova.compute.resource_tracker [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m Dec 15 05:18:34 localhost nova_compute[286344]: 2025-12-15 10:18:34.192 286348 DEBUG nova.compute.resource_tracker [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Final resource view: name=np0005559462.localdomain phys_ram=15738MB used_ram=1024MB phys_disk=41GB used_disk=2GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m Dec 15 05:18:34 localhost nova_compute[286344]: 2025-12-15 10:18:34.224 286348 DEBUG oslo_concurrency.processutils [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m Dec 15 05:18:34 localhost ceph-mon[298913]: mon.np0005559462@0(leader) e17 handle_command mon_command({"prefix": "df", "format": "json"} v 0) Dec 15 05:18:34 localhost ceph-mon[298913]: log_channel(audit) log [DBG] : from='client.? 
172.18.0.106:0/1145329410' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch Dec 15 05:18:34 localhost nova_compute[286344]: 2025-12-15 10:18:34.690 286348 DEBUG oslo_concurrency.processutils [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.466s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m Dec 15 05:18:34 localhost nova_compute[286344]: 2025-12-15 10:18:34.698 286348 DEBUG nova.compute.provider_tree [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Inventory has not changed in ProviderTree for provider: 26c8956b-6742-4951-b566-971b9bbe323b update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m Dec 15 05:18:34 localhost nova_compute[286344]: 2025-12-15 10:18:34.821 286348 DEBUG nova.scheduler.client.report [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Inventory has not changed for provider 26c8956b-6742-4951-b566-971b9bbe323b based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m Dec 15 05:18:34 localhost nova_compute[286344]: 2025-12-15 10:18:34.824 286348 DEBUG nova.compute.resource_tracker [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Compute_service record updated for np0005559462.localdomain:np0005559462.localdomain _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m Dec 15 05:18:34 localhost nova_compute[286344]: 2025-12-15 10:18:34.824 286348 DEBUG 
oslo_concurrency.lockutils [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.749s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Dec 15 05:18:34 localhost openstack_network_exporter[246484]: ERROR 10:18:34 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server Dec 15 05:18:34 localhost openstack_network_exporter[246484]: ERROR 10:18:34 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd Dec 15 05:18:34 localhost openstack_network_exporter[246484]: ERROR 10:18:34 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd Dec 15 05:18:34 localhost openstack_network_exporter[246484]: ERROR 10:18:34 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath Dec 15 05:18:34 localhost openstack_network_exporter[246484]: Dec 15 05:18:34 localhost openstack_network_exporter[246484]: ERROR 10:18:34 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath Dec 15 05:18:34 localhost openstack_network_exporter[246484]: Dec 15 05:18:35 localhost nova_compute[286344]: 2025-12-15 10:18:35.810 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:18:37 localhost systemd[1]: session-75.scope: Deactivated successfully. Dec 15 05:18:37 localhost systemd-logind[763]: Session 75 logged out. Waiting for processes to exit. Dec 15 05:18:37 localhost systemd-logind[763]: Removed session 75. 
Dec 15 05:18:37 localhost ceph-mon[298913]: mon.np0005559462@0(leader).osd e281 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104 Dec 15 05:18:37 localhost nova_compute[286344]: 2025-12-15 10:18:37.575 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:18:38 localhost systemd[1]: Started /usr/bin/podman healthcheck run 4d46c406a3855fd1c322e5d86c048188ea5865daa07ba23aaebacc7879668923. Dec 15 05:18:38 localhost systemd[1]: tmp-crun.vxZnM1.mount: Deactivated successfully. Dec 15 05:18:38 localhost podman[341375]: 2025-12-15 10:18:38.755141666 +0000 UTC m=+0.089218944 container health_status 4d46c406a3855fd1c322e5d86c048188ea5865daa07ba23aaebacc7879668923 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, org.label-schema.license=GPLv2, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '07f3af15d79276418f54a8dde2fc251064527c3571430e14af77aaa2c01f1193-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', 
'/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS) Dec 15 05:18:38 localhost podman[341375]: 2025-12-15 10:18:38.784824824 +0000 UTC m=+0.118902112 container exec_died 4d46c406a3855fd1c322e5d86c048188ea5865daa07ba23aaebacc7879668923 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent, config_id=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '07f3af15d79276418f54a8dde2fc251064527c3571430e14af77aaa2c01f1193-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', 
'/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, managed_by=edpm_ansible) Dec 15 05:18:38 localhost systemd[1]: 4d46c406a3855fd1c322e5d86c048188ea5865daa07ba23aaebacc7879668923.service: Deactivated successfully. Dec 15 05:18:38 localhost nova_compute[286344]: 2025-12-15 10:18:38.825 286348 DEBUG oslo_service.periodic_task [None req-bb659133-7b03-47d6-8b3b-c7b1b9491e62 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m Dec 15 05:18:40 localhost ceph-mon[298913]: log_channel(cluster) log [DBG] : mgrmap e58: np0005559464.aomnqe(active, since 21m), standbys: np0005559462.fudvyx, np0005559463.daptkf Dec 15 05:18:40 localhost nova_compute[286344]: 2025-12-15 10:18:40.811 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:18:42 localhost ceph-mon[298913]: mon.np0005559462@0(leader).osd e281 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104 Dec 15 05:18:42 localhost nova_compute[286344]: 2025-12-15 10:18:42.614 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:18:45 localhost systemd[1]: Started /usr/bin/podman healthcheck run a4e94bd7aaee6c174c601060fb5f34559019ccea93fc397263ee3735731cfc1e. 
Dec 15 05:18:45 localhost systemd[1]: tmp-crun.AG5y7q.mount: Deactivated successfully. Dec 15 05:18:45 localhost podman[341393]: 2025-12-15 10:18:45.755572656 +0000 UTC m=+0.079002765 container health_status a4e94bd7aaee6c174c601060fb5f34559019ccea93fc397263ee3735731cfc1e (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}) Dec 15 05:18:45 localhost podman[341393]: 2025-12-15 10:18:45.792443446 +0000 UTC m=+0.115873505 container exec_died a4e94bd7aaee6c174c601060fb5f34559019ccea93fc397263ee3735731cfc1e (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi , managed_by=edpm_ansible, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 
'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter) Dec 15 05:18:45 localhost systemd[1]: a4e94bd7aaee6c174c601060fb5f34559019ccea93fc397263ee3735731cfc1e.service: Deactivated successfully. Dec 15 05:18:45 localhost nova_compute[286344]: 2025-12-15 10:18:45.812 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:18:47 localhost ceph-mon[298913]: mon.np0005559462@0(leader).osd e281 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104 Dec 15 05:18:47 localhost nova_compute[286344]: 2025-12-15 10:18:47.617 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:18:50 localhost nova_compute[286344]: 2025-12-15 10:18:50.817 286348 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m Dec 15 05:18:51 localhost ovn_metadata_agent[160585]: 2025-12-15 10:18:51.493 160590 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m Dec 15 05:18:51 localhost ovn_metadata_agent[160585]: 2025-12-15 10:18:51.493 160590 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m Dec 15 05:18:51 localhost ovn_metadata_agent[160585]: 2025-12-15 10:18:51.494 160590 DEBUG oslo_concurrency.lockutils [-] Lock 
"_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m Dec 15 05:18:51 localhost sshd[341416]: main: sshd: ssh-rsa algorithm is disabled Dec 15 05:18:51 localhost systemd-logind[763]: New session 76 of user zuul. Dec 15 05:18:51 localhost systemd[1]: Started Session 76 of User zuul.